rubyarrow 2022-05-03 18:40 · acceptance rate: 100%
316 views
Resolved

An LSTM written in Keras shows loss: nan - val_loss: nan during training. What should I adjust?

I downloaded someone else's LSTM time-series forecasting code (code: https://github.com/EthanChenYZ/Time-Series-Forecast/blob/master/LSTM/prophet.ipynb, dataset: https://github.com/EthanChenYZ/Time-Series-Forecast/blob/master/LSTM/data3.xlsx).
I then swapped in my own dataset (my code and data are at https://github.com/dhj0506/Keras-LSTM), but partway through training the loss becomes loss: nan - val_loss: nan (see Epoch 69/250 below). How can I fix this? Is there a parameter I need to adjust?
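Before tuning the model, one common culprit worth ruling out is the input data itself: a single NaN or infinity in the new series, or values on a wildly different scale than the original dataset, can blow up training in exactly this way. A minimal sanity-check sketch (a hypothetical helper, not part of the linked notebook):

```python
import numpy as np

def check_series(values):
    """Report conditions in a raw series that commonly lead to NaN losses."""
    arr = np.asarray(values, dtype=np.float64)
    return {
        "has_nan": bool(np.isnan(arr).any()),   # missing values poison gradients
        "has_inf": bool(np.isinf(arr).any()),   # e.g. from a division upstream
        "min": float(np.nanmin(arr)),           # check the scale before training
        "max": float(np.nanmax(arr)),
    }

print(check_series([1.0, 2.5, float("nan"), 4.0]))
```

If `has_nan` or `has_inf` is true, clean or interpolate those rows; if min/max show an unscaled series, make sure the same MinMax/standard scaling step the original notebook used is applied to the new data.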


Epoch 1/250
2/2 [==============================] - 0s 125ms/step - loss: 0.3848 - val_loss: 0.6035
Epoch 2/250
2/2 [==============================] - 0s 48ms/step - loss: 0.3803 - val_loss: 0.5985
Epoch 3/250
2/2 [==============================] - 0s 52ms/step - loss: 0.3779 - val_loss: 0.5936
Epoch 4/250
2/2 [==============================] - 0s 52ms/step - loss: 0.3750 - val_loss: 0.5886
Epoch 5/250
2/2 [==============================] - 0s 54ms/step - loss: 0.3700 - val_loss: 0.5835
Epoch 6/250
2/2 [==============================] - 0s 54ms/step - loss: 0.3673 - val_loss: 0.5784
Epoch 7/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3641 - val_loss: 0.5730
Epoch 8/250
2/2 [==============================] - 0s 52ms/step - loss: 0.3606 - val_loss: 0.5676
Epoch 9/250
2/2 [==============================] - 0s 51ms/step - loss: 0.3567 - val_loss: 0.5619
Epoch 10/250
2/2 [==============================] - 0s 51ms/step - loss: 0.3544 - val_loss: 0.5561
Epoch 11/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3515 - val_loss: 0.5500
Epoch 12/250
2/2 [==============================] - 0s 54ms/step - loss: 0.3468 - val_loss: 0.5437
Epoch 13/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3435 - val_loss: 0.5371
Epoch 14/250
2/2 [==============================] - 0s 51ms/step - loss: 0.3407 - val_loss: 0.5302
Epoch 15/250
2/2 [==============================] - 0s 52ms/step - loss: 0.3370 - val_loss: 0.5228
Epoch 16/250
2/2 [==============================] - 0s 49ms/step - loss: 0.3321 - val_loss: 0.5150
Epoch 17/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3310 - val_loss: 0.5066
Epoch 18/250
2/2 [==============================] - 0s 55ms/step - loss: 0.3258 - val_loss: 0.4976
Epoch 19/250
2/2 [==============================] - 0s 49ms/step - loss: 0.3238 - val_loss: 0.4879
Epoch 20/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3162 - val_loss: 0.4772
Epoch 21/250
2/2 [==============================] - 0s 54ms/step - loss: 0.3121 - val_loss: 0.4653
Epoch 22/250
2/2 [==============================] - 0s 51ms/step - loss: 0.3081 - val_loss: 0.4519
Epoch 23/250
2/2 [==============================] - 0s 50ms/step - loss: 0.3040 - val_loss: 0.4366
Epoch 24/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2996 - val_loss: 0.4185
Epoch 25/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2939 - val_loss: 0.3967
Epoch 26/250
2/2 [==============================] - 0s 52ms/step - loss: 0.2917 - val_loss: 0.3799
Epoch 27/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2900 - val_loss: 0.3697
Epoch 28/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2866 - val_loss: 0.3654
Epoch 29/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2806 - val_loss: 0.3615
Epoch 30/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2780 - val_loss: 0.3569
Epoch 31/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2772 - val_loss: 0.3512
Epoch 32/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2716 - val_loss: 0.3444
Epoch 33/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2702 - val_loss: 0.3373
Epoch 34/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2640 - val_loss: 0.3299
Epoch 35/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2645 - val_loss: 0.3237
Epoch 36/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2584 - val_loss: 0.3208
Epoch 37/250
2/2 [==============================] - 0s 48ms/step - loss: 0.2529 - val_loss: 0.3208
Epoch 38/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2520 - val_loss: 0.3210
Epoch 39/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2479 - val_loss: 0.3189
Epoch 40/250
2/2 [==============================] - 0s 50ms/step - loss: 0.2494 - val_loss: 0.3135
Epoch 41/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2465 - val_loss: 0.3073
Epoch 42/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2446 - val_loss: 0.3054
Epoch 43/250
2/2 [==============================] - 0s 50ms/step - loss: 0.2434 - val_loss: 0.3099
Epoch 44/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2373 - val_loss: 0.3132
Epoch 45/250
2/2 [==============================] - 0s 52ms/step - loss: 0.2379 - val_loss: 0.3146
Epoch 46/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2346 - val_loss: 0.3119
Epoch 47/250
2/2 [==============================] - 0s 52ms/step - loss: 0.2310 - val_loss: 0.3076
Epoch 48/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2296 - val_loss: 0.3053
Epoch 49/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2273 - val_loss: 0.3117
Epoch 50/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2213 - val_loss: 0.3201
Epoch 51/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2202 - val_loss: 0.3205
Epoch 52/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2261 - val_loss: 0.3171
Epoch 53/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2180 - val_loss: 0.3133
Epoch 54/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2220 - val_loss: 0.3114
Epoch 55/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2160 - val_loss: 0.3135
Epoch 56/250
2/2 [==============================] - 0s 48ms/step - loss: 0.2160 - val_loss: 0.3158
Epoch 57/250
2/2 [==============================] - 0s 50ms/step - loss: 0.2144 - val_loss: 0.3186
Epoch 58/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2130 - val_loss: 0.3194
Epoch 59/250
2/2 [==============================] - 0s 49ms/step - loss: 0.2156 - val_loss: 0.3176
Epoch 60/250
2/2 [==============================] - 0s 52ms/step - loss: 0.2085 - val_loss: 0.3155
Epoch 61/250
2/2 [==============================] - 0s 51ms/step - loss: 0.2126 - val_loss: 0.3174
Epoch 62/250
2/2 [==============================] - 0s 48ms/step - loss: 0.2070 - val_loss: 0.3210
Epoch 63/250
2/2 [==============================] - 0s 52ms/step - loss: 0.2049 - val_loss: 0.3218
Epoch 64/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2034 - val_loss: 0.3188
Epoch 65/250
2/2 [==============================] - 0s 54ms/step - loss: 0.2063 - val_loss: 0.3252
Epoch 66/250
2/2 [==============================] - 0s 50ms/step - loss: 0.2039 - val_loss: 0.3276
Epoch 67/250
2/2 [==============================] - 0s 53ms/step - loss: 0.2039 - val_loss: 0.3265
Epoch 68/250
2/2 [==============================] - 0s 48ms/step - loss: 0.2026 - val_loss: 0.3220
Epoch 69/250
2/2 [==============================] - 0s 50ms/step - loss: 0.1936 - val_loss: 75128408.0000
Epoch 70/250
2/2 [==============================] - 0s 50ms/step - loss: 4084874813187817472.0000 - val_loss: nan
Epoch 71/250
2/2 [==============================] - 0s 51ms/step - loss: nan - val_loss: nan
Epoch 72/250
2/2 [==============================] - 0s 52ms/step - loss: nan - val_loss: nan
Epoch 73/250
2/2 [==============================] - 0s 50ms/step - loss: nan - val_loss: nan
Epoch 74/250
2/2 [==============================] - 0s 54ms/step - loss: nan - val_loss: nan
Epoch 75/250
2/2 [==============================] - 0s 52ms/step - loss: nan - val_loss: nan
……(omitted here: every remaining epoch is loss: nan - val_loss: nan)
Epoch 250/250
2/2 [==============================] - 0s 49ms/step - loss: nan - val_loss: nan
1/1 [==============================] - 0s 0s/step - loss: nan
nan
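The jump at Epoch 69 (val_loss exploding to ~7.5e7, then the training loss to ~4e18, then NaN) is the classic signature of exploding gradients rather than a coding bug. The usual Keras-side remedies are a smaller learning rate, gradient clipping (`clipnorm` on the optimizer), and the `keras.callbacks.TerminateOnNaN()` callback so a run stops as soon as it diverges. What `clipnorm` does can be sketched in plain NumPy (illustrative values, not taken from the linked repo):

```python
import numpy as np

def clip_by_norm(grad, max_norm):
    """Rescale a gradient so its L2 norm never exceeds max_norm.
    This mirrors what Keras's clipnorm option applies per weight tensor."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        return grad * (max_norm / norm)  # direction kept, magnitude capped
    return grad                          # under the cap: returned unchanged

g = np.array([3.0, 4.0])        # L2 norm is 5.0
print(clip_by_norm(g, 1.0))     # rescaled to norm 1.0: [0.6 0.8]
print(clip_by_norm(g, 10.0))    # already under the cap
```

In the model itself this would look like `model.compile(loss="mse", optimizer=keras.optimizers.Adam(learning_rate=1e-4, clipnorm=1.0))`, paired with `model.fit(..., callbacks=[keras.callbacks.TerminateOnNaN()])`. Both arguments exist in the standard tf.keras API, but the exact values (1e-4, 1.0) are starting-point guesses to tune.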

Question timeline

• Marked resolved by the system, May 13
• Answer accepted, May 5
• Question created, May 3
