陌上寒雪
2021-09-03 16:53

Keras: how do I remove the Embedding input layer and feed one-hot inputs directly?

Original Embedding input:


import numpy as np
import tensorflow as tf
from tensorflow.keras import layers
from sklearn.model_selection import train_test_split

seed = 7
np.random.seed(seed)
# train_data, df_train and tokenizer come from earlier preprocessing (not shown)
DATE_train, x_val, LABLES_train, y_val = train_test_split(train_data, df_train['标签_数字'], test_size=0.2, random_state=0, shuffle=True)
model = tf.keras.Sequential()
vocab_size = len(tokenizer.word_index) + 1
model.add(layers.Embedding(input_dim=vocab_size, output_dim=128, mask_zero=True, embeddings_initializer='uniform'))
model.add(layers.GRU(128, return_sequences=True, dropout=0.2, recurrent_dropout=0.2))
model.add(layers.GRU(128, return_sequences=False, dropout=0.2, recurrent_dropout=0.2))
model.add(layers.BatchNormalization())

Changes for one-hot input:

# (same imports and data split as above)
seed = 7
np.random.seed(seed)
DATE_train, x_val, LABLES_train, y_val = train_test_split(train_data, df_train['标签_数字'], test_size=0.2, random_state=0, shuffle=True)
model = tf.keras.Sequential()
vocab_size = len(tokenizer.word_index) + 1
model.add(layers.Dense(128, input_dim=vocab_size))
model.add(layers.GRU(128, return_sequences=True, dropout=0.2, recurrent_dropout=0.2))
model.add(layers.GRU(128, return_sequences=False, dropout=0.2, recurrent_dropout=0.2))
model.add(layers.BatchNormalization())

Error:

ValueError: Input 0 of layer gru is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: [None, 128]


1 answer

  • 程序媛一枚~ 2021-09-04 08:19
    Accepted answer

    Judging from the error, the first GRU layer expects a 3-dimensional input (batch, timesteps, features), but it received a 2-dimensional one: the Dense(128, input_dim=vocab_size) layer outputs a flat (None, 128) tensor, which is exactly the shape the error reports.
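
    A minimal sketch of one way to fix this, assuming train_data holds padded integer index sequences of shape (samples, timesteps): instead of a Dense layer, turn the indices into one-hot vectors inside the model with tf.one_hot, so the GRU receives the 3-D (batch, timesteps, vocab_size) tensor it expects.

    import tensorflow as tf
    from tensorflow.keras import layers

    vocab_size = len(tokenizer.word_index) + 1   # reuse the tokenizer from the question

    model = tf.keras.Sequential()
    # Map each integer index to a one-hot vector inside the model, so a
    # (batch, timesteps) integer input becomes (batch, timesteps, vocab_size).
    model.add(layers.Lambda(
        lambda x: tf.one_hot(tf.cast(x, tf.int32), depth=vocab_size),
        input_shape=(None,)))
    model.add(layers.GRU(128, return_sequences=True, dropout=0.2, recurrent_dropout=0.2))
    model.add(layers.GRU(128, return_sequences=False, dropout=0.2, recurrent_dropout=0.2))
    model.add(layers.BatchNormalization())

    Keep in mind that with a large vocabulary these one-hot tensors become very memory-hungry, which is precisely the cost the Embedding layer is there to avoid.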

