weixin_38256708
2018-08-23 06:40

Deriving the LSTM (RNN) backward pass in the Sequence Models assignment from Andrew Ng's deeplearning.ai course

Why doesn't the result I compute match the expected output given in the assignment?
for t in reversed(range(T_x)):
    # Compute all gradients at time step t using lstm_cell_backward
    gradients = lstm_cell_backward(da[:, :, t] + da_prevt, dc_prevt, caches[t])
    # Store this step's recurrent gradients and accumulate the parameter gradients
    da_prevt = gradients['da_prev']
    dc_prevt = gradients['dc_prev']
    dx[:, :, t] = gradients['dxt']
    dWf = dWf + gradients['dWf']
    dWi = dWi + gradients['dWi']
    dWc = dWc + gradients['dWc']
    dWo = dWo + gradients['dWo']
    dbf = dbf + gradients['dbf']
    dbi = dbi + gradients['dbi']
    dbc = dbc + gradients['dbc']
    dbo = dbo + gradients['dbo']

Is this part correct: da[:, :, t] + da_prevt, dc_prevt?
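For context, here is a minimal runnable sketch of that accumulation loop. The `lstm_cell_backward_stub` below is hypothetical (it just returns correctly shaped arrays of ones, not the real cell backward pass from the assignment); it only serves to check that `da[:, :, t] + da_prevt` is shaped consistently and that the parameter gradients accumulate over all `T_x` steps as intended:

```python
import numpy as np

def lstm_cell_backward_stub(da_next, dc_next, cache):
    # Hypothetical stub: returns gradients of the right shapes (all ones)
    # so the accumulation pattern can be exercised end to end.
    n_a, m = da_next.shape
    n_x = cache  # in this sketch the cache only stores the input size
    W_shape, b_shape = (n_a, n_a + n_x), (n_a, 1)
    return {
        'da_prev': np.ones((n_a, m)), 'dc_prev': np.ones((n_a, m)),
        'dxt': np.ones((n_x, m)),
        'dWf': np.ones(W_shape), 'dWi': np.ones(W_shape),
        'dWc': np.ones(W_shape), 'dWo': np.ones(W_shape),
        'dbf': np.ones(b_shape), 'dbi': np.ones(b_shape),
        'dbc': np.ones(b_shape), 'dbo': np.ones(b_shape),
    }

def lstm_backward_sketch(da, caches, n_x):
    n_a, m, T_x = da.shape
    # Initialize every accumulator to zero before the backward time loop.
    dx = np.zeros((n_x, m, T_x))
    da_prevt = np.zeros((n_a, m))
    dc_prevt = np.zeros((n_a, m))
    dWf = np.zeros((n_a, n_a + n_x)); dWi = np.zeros_like(dWf)
    dWc = np.zeros_like(dWf); dWo = np.zeros_like(dWf)
    dbf = np.zeros((n_a, 1)); dbi = np.zeros_like(dbf)
    dbc = np.zeros_like(dbf); dbo = np.zeros_like(dbf)
    for t in reversed(range(T_x)):
        # The gradient flowing into a<t> is the sum of the gradient from the
        # loss at step t (da[:, :, t]) and the gradient propagated back from
        # step t+1 (da_prevt) -- the expression the question asks about.
        gradients = lstm_cell_backward_stub(da[:, :, t] + da_prevt,
                                            dc_prevt, caches[t])
        da_prevt = gradients['da_prev']
        dc_prevt = gradients['dc_prev']
        dx[:, :, t] = gradients['dxt']
        # Parameter gradients are summed over all time steps.
        dWf += gradients['dWf']; dWi += gradients['dWi']
        dWc += gradients['dWc']; dWo += gradients['dWo']
        dbf += gradients['dbf']; dbi += gradients['dbi']
        dbc += gradients['dbc']; dbo += gradients['dbo']
    return dx, dWf

n_a, n_x, m, T_x = 5, 3, 10, 4
da = np.random.randn(n_a, m, T_x)
dx, dWf = lstm_backward_sketch(da, [n_x] * T_x, n_x)
print(dx.shape, dWf.shape)  # (3, 10, 4) (5, 8)
```

With the stub returning ones, every entry of `dWf` ends up equal to `T_x`, which confirms the gradient is summed once per time step rather than overwritten.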

4 answers

  • devmiao 2018-08-23 15:45
    Best answer
