qq_43178462 · 2023-11-30 14:55 · acceptance rate: 30% · 6 views · closed

Model fusion in deep learning

I am using a deep learning model to classify time series of shape (10000, 1, 64), and I want to add the Dense outputs produced by training VGG19 and DenseNet121 into the model code. For the VGG19 and DenseNet121 branches, the data should first be reshaped to (10000, 64), then transformed with x_gasf_train = gasf.fit_transform(x_train) into (10000, 64, 64); these GASF images are used for training those two networks, and their Dense outputs are fed back into the main network for training. The TF version is tensorflow-gpu 2.2.0. Please help me complete the code:
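For reference, a minimal sketch of the GASF preprocessing described above, assuming pyts is installed and x_train is the (10000, 1, 64) training array; the names gasf and x_gasf_train follow the question, while image_size=64 and the 3-channel replication at the end are assumptions added only so the result fits VGG19/DenseNet121:

import numpy as np
from pyts.image import GramianAngularField

# (10000, 1, 64) -> (10000, 64): drop the singleton channel axis
x_train_2d = x_train.reshape(x_train.shape[0], -1)

# each 64-step series becomes a 64x64 Gramian Angular Summation Field image
gasf = GramianAngularField(image_size=64, method='summation')
x_gasf_train = gasf.fit_transform(x_train_2d)            # (10000, 64, 64)

# VGG19 / DenseNet121 expect 3-channel images, so replicate the single channel
x_gasf_train = np.repeat(x_gasf_train[..., np.newaxis], 3, axis=-1)   # (10000, 64, 64, 3)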

import os
import numpy as np
import pandas as pd
import matplotlib as mpl
import matplotlib.pyplot as plt
from sklearn.preprocessing import LabelEncoder
import warnings
warnings.simplefilter('ignore', category=DeprecationWarning)
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.callbacks import ModelCheckpoint, ReduceLROnPlateau
from tensorflow.keras import backend as K
from utils.generic_utils import load_dataset_at, calculate_dataset_metrics, cutoff_choice, \
                                cutoff_sequence
from utils.constants import MAX_NB_VARIABLES, MAX_TIMESTEPS_LIST
from tensorflow.keras.layers import Input, Dense, LSTM, multiply, concatenate, Activation, Masking, Reshape
from tensorflow.keras.layers import Conv1D, BatchNormalization, GlobalAveragePooling1D, Permute, Dropout
from tensorflow.keras.layers import Conv2D, MaxPool2D, Add, Flatten
from tensorflow.keras.applications import VGG19, DenseNet121
from pyts.image import GramianAngularField


def generate_model_2():
    ip = Input(shape=(1, 64))  # per-sample shape; the 10000 samples are the batch dimension and do not belong in Input()
    
    # stride = 10

    # x = Permute((2, 1))(ip)
    # x = Conv1D(MAX_NB_VARIABLES // stride, 8, strides=stride, padding='same', activation='relu', use_bias=False,
    #            kernel_initializer='he_uniform')(x)  # (None, variables / stride, timesteps)
    # x = Permute((2, 1))(x)

    #ip1 = K.reshape(ip,shape=(MAX_TIMESTEPS,MAX_NB_VARIABLES))
    #x = Permute((2, 1))(ip)
    x = Masking()(ip)
    # NOTE: this is the incomplete part the question asks about -- VGG19 cannot be applied
    # to a 1D tensor like this; it expects the (64, 64) GASF images (see the sketch after the function)
    x = VGG19(x)
    x = Dropout(0.8)(x)

    y = Permute((2, 1))(ip)
    y = Conv1D(128, 8, padding='same', kernel_initializer='he_uniform')(y)
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    # squeeze_excite_block is assumed to be defined elsewhere in the project (as in the MLSTM-FCN codebase)
    y = squeeze_excite_block(y)

    y = Conv1D(256, 5, padding='same', kernel_initializer='he_uniform')(y)
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    y = squeeze_excite_block(y)

    y = Conv1D(128, 3, padding='same', kernel_initializer='he_uniform')(y)
    y = BatchNormalization()(y)
    y = Activation('relu')(y)

    y = GlobalAveragePooling1D()(y)

    # NOTE: here DenseNet121 receives the VGG19 output instead of the GASF images;
    # this branch also still needs to be completed
    z  = DenseNet121(x)
    z  = Dropout(0.8)(z)

    x = concatenate([x, y])
    x = concatenate([x, z])

    out = Dense(NB_CLASS, activation='softmax')(x)  # NB_CLASS: number of classes, defined elsewhere in the project

    model = Model(ip, out)
    model.summary()

    # add load model code here to fine-tune

    return model
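
Below is a hedged sketch of one way the fusion itself could be wired under TensorFlow 2.2, using two model inputs (the raw series and the GASF images) and the tf.keras.applications versions of VGG19 and DenseNet121 as feature extractors. The function name, the NB_CLASS value, the 128-unit Dense heads, weights=None, and the omission of the squeeze-excite blocks are all assumptions for illustration, not the original code:

from tensorflow.keras.applications import VGG19, DenseNet121
from tensorflow.keras.layers import (Input, Dense, Conv1D, BatchNormalization,
                                     Activation, GlobalAveragePooling1D,
                                     GlobalAveragePooling2D, Permute, Dropout,
                                     concatenate)
from tensorflow.keras.models import Model

NB_CLASS = 2  # assumption: replace with the real number of classes


def generate_fusion_model_sketch():
    # 1D branch: raw series, per-sample shape (1, 64)
    ip_seq = Input(shape=(1, 64), name='series_input')

    y = Permute((2, 1))(ip_seq)                       # (64, 1)
    y = Conv1D(128, 8, padding='same', kernel_initializer='he_uniform')(y)
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    y = Conv1D(256, 5, padding='same', kernel_initializer='he_uniform')(y)
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    y = Conv1D(128, 3, padding='same', kernel_initializer='he_uniform')(y)
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    y = GlobalAveragePooling1D()(y)

    # 2D branch: GASF images, per-sample shape (64, 64, 3)
    ip_img = Input(shape=(64, 64, 3), name='gasf_input')

    vgg = VGG19(include_top=False, weights=None, input_shape=(64, 64, 3))
    dense_net = DenseNet121(include_top=False, weights=None, input_shape=(64, 64, 3))

    x = GlobalAveragePooling2D()(vgg(ip_img))
    x = Dense(128, activation='relu')(x)              # "Dense result" of the VGG19 branch
    x = Dropout(0.5)(x)

    z = GlobalAveragePooling2D()(dense_net(ip_img))
    z = Dense(128, activation='relu')(z)              # "Dense result" of the DenseNet121 branch
    z = Dropout(0.5)(z)

    # fuse the three branches and classify
    merged = concatenate([x, y, z])
    out = Dense(NB_CLASS, activation='softmax')(merged)

    model = Model([ip_seq, ip_img], out)
    model.compile(optimizer='adam', loss='categorical_crossentropy',
                  metrics=['accuracy'])
    return model

Training would then pass both inputs, e.g. model.fit([x_train, x_gasf_train], y_train_onehot, ...), with y_train_onehot produced by to_categorical; the squeeze-excite blocks and fine-tuning from the original code would still need to be re-added.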


6 replies

  • 叫兽-郭老师 (Rising Star Creator, Java) · 2023-11-30 15:23

    I completed the code for you, why the downvote~~~
    Never mind, I'll just fix it for you.

