My network has an output layer with a ReLU activation function, but I want the output to be something like "ReLU + 1": that is, I want every output to be greater than 1 while keeping the same shape as the ReLU function.
How should I change my torch.nn network?
On defining a custom activation function in torch
1 answer (sorted: default, newest)
影醉阏轩窗 2021-07-09 09:33: The original ReLU outputs are all greater than or equal to 0; if you want values greater than 1, just add 1 to the output. As for the shape staying the same, an activation function does not change the shape. Is there anything else unclear?
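The answer above can be sketched as a small custom module; `ReLUPlusOne` is a hypothetical name for illustration, assuming you want ReLU's curve shifted up by 1 as the final activation:

```python
import torch
import torch.nn as nn

class ReLUPlusOne(nn.Module):
    """Custom activation: ReLU(x) + 1, so every output is >= 1."""
    def forward(self, x):
        # ReLU outputs >= 0; adding 1 shifts the whole curve up by 1
        # without changing the tensor's shape.
        return torch.relu(x) + 1

# Hypothetical example network using the custom activation at the output.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 2),
    ReLUPlusOne(),
)

out = model(torch.randn(3, 4))
print(out.shape)               # same shape as without the activation: (3, 2)
print(bool((out >= 1).all()))  # True: every output is at least 1
```

The same effect can be had inline with a `nn.Sequential`-free model by writing `torch.relu(x) + 1` in your `forward`; wrapping it in a module just lets it slot into `nn.Sequential` like any built-in activation.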
This answer was accepted as the best answer by the asker.