qq_46034030 2020-05-31 21:39 · Acceptance rate: 0%
168 views

In a notebook on Kaggle I came across a network architecture that looks very complex. Is this a specific, named structure?

Model: "model_1"
____________________________________________________________________________________
Layer (type)                    Output Shape          Param #   Connected to
====================================================================================
input_1 (InputLayer)            (None, 256, 256, 4)   0
conv2d_1 (Conv2D)               (None, 126, 126, 16)  1616      input_1[0][0]
leaky_re_lu_1 (LeakyReLU)       (None, 126, 126, 16)  0         conv2d_1[0][0]
conv2d_2 (Conv2D)               (None, 126, 126, 16)  2320      leaky_re_lu_1[0][0]
conv2d_3 (Conv2D)               (None, 126, 126, 16)  6416      leaky_re_lu_1[0][0]
leaky_re_lu_2 (LeakyReLU)       (None, 126, 126, 16)  0         conv2d_2[0][0]
leaky_re_lu_3 (LeakyReLU)       (None, 126, 126, 16)  0         conv2d_3[0][0]
concatenate_1 (Concatenate)     (None, 126, 126, 32)  0         leaky_re_lu_2[0][0]
                                                                leaky_re_lu_3[0][0]
conv2d_4 (Conv2D)               (None, 61, 61, 32)    25632     concatenate_1[0][0]
leaky_re_lu_4 (LeakyReLU)       (None, 61, 61, 32)    0         conv2d_4[0][0]
conv2d_5 (Conv2D)               (None, 61, 61, 32)    9248      leaky_re_lu_4[0][0]
conv2d_6 (Conv2D)               (None, 61, 61, 32)    25632     leaky_re_lu_4[0][0]
leaky_re_lu_5 (LeakyReLU)       (None, 61, 61, 32)    0         conv2d_5[0][0]
leaky_re_lu_6 (LeakyReLU)       (None, 61, 61, 32)    0         conv2d_6[0][0]
concatenate_2 (Concatenate)     (None, 61, 61, 64)    0         leaky_re_lu_5[0][0]
                                                                leaky_re_lu_6[0][0]
conv2d_7 (Conv2D)               (None, 61, 61, 32)    2080      concatenate_2[0][0]
leaky_re_lu_7 (LeakyReLU)       (None, 61, 61, 32)    0         conv2d_7[0][0]
conv2d_8 (Conv2D)               (None, 61, 61, 32)    1056      leaky_re_lu_7[0][0]
conv2d_9 (Conv2D)               (None, 61, 61, 32)    9248      leaky_re_lu_7[0][0]
leaky_re_lu_8 (LeakyReLU)       (None, 61, 61, 32)    0         conv2d_8[0][0]
leaky_re_lu_9 (LeakyReLU)       (None, 61, 61, 32)    0         conv2d_9[0][0]
concatenate_3 (Concatenate)     (None, 61, 61, 64)    0         leaky_re_lu_8[0][0]
                                                                leaky_re_lu_9[0][0]
conv2d_10 (Conv2D)              (None, 29, 29, 64)    102464    concatenate_3[0][0]
leaky_re_lu_10 (LeakyReLU)      (None, 29, 29, 64)    0         conv2d_10[0][0]
conv2d_11 (Conv2D)              (None, 29, 29, 64)    36928     leaky_re_lu_10[0][0]
conv2d_12 (Conv2D)              (None, 29, 29, 64)    102464    leaky_re_lu_10[0][0]
leaky_re_lu_11 (LeakyReLU)      (None, 29, 29, 64)    0         conv2d_11[0][0]
leaky_re_lu_12 (LeakyReLU)      (None, 29, 29, 64)    0         conv2d_12[0][0]
concatenate_4 (Concatenate)     (None, 29, 29, 128)   0         leaky_re_lu_11[0][0]
                                                                leaky_re_lu_12[0][0]
conv2d_13 (Conv2D)              (None, 29, 29, 64)    8256      concatenate_4[0][0]
leaky_re_lu_13 (LeakyReLU)      (None, 29, 29, 64)    0         conv2d_13[0][0]
conv2d_14 (Conv2D)              (None, 29, 29, 64)    4160      leaky_re_lu_13[0][0]
conv2d_15 (Conv2D)              (None, 29, 29, 64)    36928     leaky_re_lu_13[0][0]
leaky_re_lu_14 (LeakyReLU)      (None, 29, 29, 64)    0         conv2d_14[0][0]
leaky_re_lu_15 (LeakyReLU)      (None, 29, 29, 64)    0         conv2d_15[0][0]
concatenate_5 (Concatenate)     (None, 29, 29, 128)   0         leaky_re_lu_14[0][0]
                                                                leaky_re_lu_15[0][0]
conv2d_16 (Conv2D)              (None, 13, 13, 64)    204864    concatenate_5[0][0]
leaky_re_lu_16 (LeakyReLU)      (None, 13, 13, 64)    0         conv2d_16[0][0]
conv2d_17 (Conv2D)              (None, 13, 13, 64)    36928     leaky_re_lu_16[0][0]
conv2d_18 (Conv2D)              (None, 13, 13, 64)    102464    leaky_re_lu_16[0][0]
leaky_re_lu_17 (LeakyReLU)      (None, 13, 13, 64)    0         conv2d_17[0][0]
leaky_re_lu_18 (LeakyReLU)      (None, 13, 13, 64)    0         conv2d_18[0][0]
concatenate_6 (Concatenate)     (None, 13, 13, 128)   0         leaky_re_lu_17[0][0]
                                                                leaky_re_lu_18[0][0]
conv2d_19 (Conv2D)              (None, 13, 13, 64)    8256      concatenate_6[0][0]
leaky_re_lu_19 (LeakyReLU)      (None, 13, 13, 64)    0         conv2d_19[0][0]
conv2d_20 (Conv2D)              (None, 13, 13, 64)    4160      leaky_re_lu_19[0][0]
conv2d_21 (Conv2D)              (None, 13, 13, 64)    36928     leaky_re_lu_19[0][0]
leaky_re_lu_20 (LeakyReLU)      (None, 13, 13, 64)    0         conv2d_20[0][0]
leaky_re_lu_21 (LeakyReLU)      (None, 13, 13, 64)    0         conv2d_21[0][0]
concatenate_7 (Concatenate)     (None, 13, 13, 128)   0         leaky_re_lu_20[0][0]
                                                                leaky_re_lu_21[0][0]
conv2d_22 (Conv2D)              (None, 11, 11, 64)    73792     concatenate_7[0][0]
leaky_re_lu_22 (LeakyReLU)      (None, 11, 11, 64)    0         conv2d_22[0][0]
flatten_1 (Flatten)             (None, 7744)          0         leaky_re_lu_22[0][0]
dense_1 (Dense)                 (None, 512)           3965440   flatten_1[0][0]
leaky_re_lu_23 (LeakyReLU)      (None, 512)           0         dense_1[0][0]
dropout_1 (Dropout)             (None, 512)           0         leaky_re_lu_23[0][0]
dense_2 (Dense)                 (None, 12)            6156      dropout_1[0][0]
====================================================================================
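The summary does not show kernel sizes, but they can be recovered from the Param # column: a Conv2D layer with a bias has kernel_h × kernel_w × in_channels × out_channels + out_channels trainable parameters. A quick check in plain Python (the kernel sizes below are inferred from the numbers, not stated in the notebook):

```python
def conv2d_params(kernel, in_ch, out_ch):
    """Trainable parameters of a biased Conv2D layer:
    kernel_h * kernel_w * in_channels * out_channels + out_channels."""
    kh, kw = kernel
    return kh * kw * in_ch * out_ch + out_ch

# conv2d_1: a 5x5 kernel over 4 input channels, 16 filters -> Param # 1616
assert conv2d_params((5, 5), 4, 16) == 1616
# conv2d_2 / conv2d_3: parallel 3x3 and 5x5 branches over the same 16-channel input
assert conv2d_params((3, 3), 16, 16) == 2320
assert conv2d_params((5, 5), 16, 16) == 6416
# conv2d_7: a 1x1 conv squeezing the concatenated 64 channels back to 32
assert conv2d_params((1, 1), 64, 32) == 2080
print("all Param # values consistent")
```

So each repeated unit appears to be a strided conv for downsampling, followed by parallel 3×3 and 5×5 branches whose outputs are concatenated, with 1×1 convolutions squeezing the channel count back down, which suggests an Inception-style multi-branch design rather than a plain sequential CNN.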

1 answer

  • 关竹 2020-06-01 04:48

    It isn't really that complicated, it just has a lot of layers. "conv" stands for convolution: this is a Convolutional Neural Network (CNN).
    Broadly there are three kinds of layers: convolutional layers (which split the image into patches and extract features; usually named conv*), pooling layers (which select features and filter information; usually named pool*), and fully connected layers (dense, which fit the extracted features to the final output). Dropout is a layer that randomly discards a given fraction of activations during training to prevent overfitting (a model that fits one particular input perfectly but performs badly on new input). Concatenate merges (fuses) features from parallel branches.
    A shape like (None, 13, 13, 64) is the layer's output tensor shape: None is the batch size (left unspecified), then height 13, width 13, and 64 channels (feature maps). re_lu refers to the activation function (there are many kinds); it makes the fitted function non-linear, which both classifies better and better matches reality. Flatten collapses a multi-dimensional array into one dimension ([[1,2,3],[4,5,6]] -> [1,2,3,4,5,6]).
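The Flatten and Concatenate behaviour described above can be sketched with NumPy (assuming the channels-last layout that the summary's shapes imply):

```python
import numpy as np

# Flatten: multi-dimensional array -> 1-D, as in the answer's example
x = np.array([[1, 2, 3], [4, 5, 6]])
print(x.flatten().tolist())  # [1, 2, 3, 4, 5, 6]

# Concatenate: Keras' Concatenate defaults to the last (channel) axis,
# so two (batch, H, W, 16) feature maps become one (batch, H, W, 32) map,
# matching concatenate_1 in the summary
a = np.zeros((1, 126, 126, 16))
b = np.zeros((1, 126, 126, 16))
merged = np.concatenate([a, b], axis=-1)
print(merged.shape)  # (1, 126, 126, 32)
```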

