rubyarrow · 2022-05-03 18:40

An LSTM written with Keras produces loss: nan - val_loss: nan during training; how should I adjust it?

I downloaded someone else's LSTM time-series forecasting code (the code is at https://github.com/EthanChenYZ/Time-Series-Forecast/blob/master/LSTM/prophet.ipynb and the dataset is https://github.com/EthanChenYZ/Time-Series-Forecast/blob/master/LSTM/data3.xlsx).
I then swapped in my own dataset (my code and data are at https://github.com/dhj0506/Keras-LSTM), but partway through training the loss becomes loss: nan - val_loss: nan (see Epoch 69/250 in the log below). How can I fix this? Do I need to adjust some parameter?


Epoch 1/250
2/2 [==============================] - 0s 125ms/step - loss: 0.3848 - val_loss: 0.6035
Epoch 2/250
2/2 [==============================] - 0s 48ms/step - loss: 0.3803 - val_loss: 0.5985
Epoch 3/250
2/2 [==============================] - 0s 52ms/step - loss: 0.3779 - val_loss: 0.5936
Epoch 4/250
2/2 [==============================] - 0s 52ms/step - loss: 0.3750 - val_loss: 0.5886
Epoch 5/250
2/2 [==============================] - 0s 54ms/step - loss: 0.3700 - val_loss: 0.5835
Epoch 6/250
2/2 [==============================] - 0s 54ms/step - loss: 0.3673 - val_loss: 0.5784
Epoch 7/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3641 - val_loss: 0.5730
Epoch 8/250
2/2 [==============================] - 0s 52ms/step - loss: 0.3606 - val_loss: 0.5676
Epoch 9/250
2/2 [==============================] - 0s 51ms/step - loss: 0.3567 - val_loss: 0.5619
Epoch 10/250
2/2 [==============================] - 0s 51ms/step - loss: 0.3544 - val_loss: 0.5561
Epoch 11/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3515 - val_loss: 0.5500
Epoch 12/250
2/2 [==============================] - 0s 54ms/step - loss: 0.3468 - val_loss: 0.5437
Epoch 13/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3435 - val_loss: 0.5371
Epoch 14/250
2/2 [==============================] - 0s 51ms/step - loss: 0.3407 - val_loss: 0.5302
Epoch 15/250
2/2 [==============================] - 0s 52ms/step - loss: 0.3370 - val_loss: 0.5228
Epoch 16/250
2/2 [==============================] - 0s 49ms/step - loss: 0.3321 - val_loss: 0.5150
Epoch 17/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3310 - val_loss: 0.5066
Epoch 18/250
2/2 [==============================] - 0s 55ms/step - loss: 0.3258 - val_loss: 0.4976
Epoch 19/250
2/2 [==============================] - 0s 49ms/step - loss: 0.3238 - val_loss: 0.4879
Epoch 20/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3162 - val_loss: 0.4772
Epoch 21/250
2/2 [==============================] - 0s 54ms/step - loss: 0.3121 - val_loss: 0.4653
Epoch 22/250
2/2 [==============================] - 0s 51ms/step - loss: 0.3081 - val_loss: 0.4519
Epoch 23/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3040 - val_loss: 0.4366
Epoch 24/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2996 - val_loss: 0.4185
Epoch 25/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2939 - val_loss: 0.3967
Epoch 26/250
2/2 [==============================] - 0s 52ms/step - loss: 0.2917 - val_loss: 0.3799
Epoch 27/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2900 - val_loss: 0.3697
Epoch 28/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2866 - val_loss: 0.3654
Epoch 29/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2806 - val_loss: 0.3615
Epoch 30/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2780 - val_loss: 0.3569
Epoch 31/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2772 - val_loss: 0.3512
Epoch 32/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2716 - val_loss: 0.3444
Epoch 33/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2702 - val_loss: 0.3373
Epoch 34/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2640 - val_loss: 0.3299
Epoch 35/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2645 - val_loss: 0.3237
Epoch 36/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2584 - val_loss: 0.3208
Epoch 37/250
2/2 [==============================] - 0s 48ms/step - loss: 0.2529 - val_loss: 0.3208
Epoch 38/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2520 - val_loss: 0.3210
Epoch 39/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2479 - val_loss: 0.3189
Epoch 40/250
2/2 [==============================] - 0s 50ms/step - loss: 0.2494 - val_loss: 0.3135
Epoch 41/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2465 - val_loss: 0.3073
Epoch 42/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2446 - val_loss: 0.3054
Epoch 43/250
2/2 [==============================] - 0s 50ms/step - loss: 0.2434 - val_loss: 0.3099
Epoch 44/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2373 - val_loss: 0.3132
Epoch 45/250
2/2 [==============================] - 0s 52ms/step - loss: 0.2379 - val_loss: 0.3146
Epoch 46/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2346 - val_loss: 0.3119
Epoch 47/250
2/2 [==============================] - 0s 52ms/step - loss: 0.2310 - val_loss: 0.3076
Epoch 48/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2296 - val_loss: 0.3053
Epoch 49/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2273 - val_loss: 0.3117
Epoch 50/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2213 - val_loss: 0.3201
Epoch 51/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2202 - val_loss: 0.3205
Epoch 52/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2261 - val_loss: 0.3171
Epoch 53/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2180 - val_loss: 0.3133
Epoch 54/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2220 - val_loss: 0.3114
Epoch 55/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2160 - val_loss: 0.3135
Epoch 56/250
2/2 [==============================] - 0s 48ms/step - loss: 0.2160 - val_loss: 0.3158
Epoch 57/250
2/2 [==============================] - 0s 50ms/step - loss: 0.2144 - val_loss: 0.3186
Epoch 58/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2130 - val_loss: 0.3194
Epoch 59/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2156 - val_loss: 0.3176
Epoch 60/250
2/2 [==============================] - 0s 52ms/step - loss: 0.2085 - val_loss: 0.3155
Epoch 61/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2126 - val_loss: 0.3174
Epoch 62/250
2/2 [==============================] - 0s 48ms/step - loss: 0.2070 - val_loss: 0.3210
Epoch 63/250
2/2 [==============================] - 0s 52ms/step - loss: 0.2049 - val_loss: 0.3218
Epoch 64/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2034 - val_loss: 0.3188
Epoch 65/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2063 - val_loss: 0.3252
Epoch 66/250
2/2 [==============================] - 0s 50ms/step - loss: 0.2039 - val_loss: 0.3276
Epoch 67/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2039 - val_loss: 0.3265
Epoch 68/250
2/2 [==============================] - 0s 48ms/step - loss: 0.2026 - val_loss: 0.3220
Epoch 69/250
2/2 [==============================] - 0s 50ms/step - loss: 0.1936 - val_loss: 75128408.0000
Epoch 70/250
2/2 [==============================] - 0s 50ms/step - loss: 4084874813187817472.0000 - val_loss: nan
Epoch 71/250
2/2 [==============================] - 0s 51ms/step - loss: nan - val_loss: nan
Epoch 72/250
2/2 [==============================] - 0s 52ms/step - loss: nan - val_loss: nan
Epoch 73/250
2/2 [==============================] - 0s 50ms/step - loss: nan - val_loss: nan
Epoch 74/250
2/2 [==============================] - 0s 54ms/step - loss: nan - val_loss: nan
Epoch 75/250
2/2 [==============================] - 0s 52ms/step - loss: nan - val_loss: nan
…… (remaining epochs omitted; every one reports loss: nan - val_loss: nan)
Epoch 250/250
2/2 [==============================] - 0s 49ms/step - loss: nan - val_loss: nan
1/1 [==============================] - 0s 0s/step - loss: nan
nan
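
For reference, when a loss suddenly explodes to a huge value and then to nan, the usual parameter-level adjustments are a smaller learning rate and gradient clipping. The sketch below is illustrative only: the layer sizes, input shape, optimizer, and loss are assumptions, not the actual configuration of the notebook linked above.

    from tensorflow import keras
    from tensorflow.keras import layers

    # Illustrative model; the real notebook's architecture may differ.
    model = keras.Sequential([
        layers.LSTM(32, input_shape=(10, 1)),  # (timesteps, features) assumed
        layers.Dense(1),
    ])

    # Clip the gradient norm and lower the learning rate to guard against the
    # loss blowing up to inf/nan. These values are starting points, not tuned.
    optimizer = keras.optimizers.Adam(learning_rate=1e-4, clipnorm=1.0)
    model.compile(optimizer=optimizer, loss="mse")

That said, the accepted answer below traces the nan in this case to the data preprocessing rather than to the hyperparameters.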

2 answers; the accepted answer is shown below.


In the original data, the first column holds dates in the form 20220503, and after the data is read in the following filter is applied. Your data's first column is just plain numbers like 1, 2, 34, so the data may get dropped or turned into NaN somewhere in that processing.
Step through the pipeline and check what the data looks like after each step (see the sketch after the snippet below).

    data = data[data.date < '20201031']
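
To make the "check the data after each step" advice concrete, here is a minimal diagnostic sketch. The column name date is taken from the filter above; the file name and the preprocessing steps are assumptions about the asker's notebook, not its actual code.

    import numpy as np
    import pandas as pd

    data = pd.read_excel("data3.xlsx")  # replace with your own file

    # The filter above compares against the *string* '20201031'. That only
    # behaves as intended if the date column actually holds comparable values
    # such as string dates like '20220503'; with plain integers (1, 2, 34, ...)
    # the comparison can fail or keep/drop the wrong rows.
    print(data.dtypes)

    # After every preprocessing step, confirm the frame is not empty and that
    # nothing has turned into NaN or inf before it reaches the LSTM.
    print(data.shape)
    print(data.isna().sum())
    numeric = data.select_dtypes(include="number")
    print(np.isinf(numeric).sum())

If any of these checks shows an empty frame or NaN counts greater than zero, the nan loss comes from the data rather than from the model.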
    
