When calling optimizer = AdamW(optimizer_grouped_parameters, lr=Learning_rate, eps=1e-8), I get this warning:
FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning
warnings.warn(
Roughly translating it, it seems this way of writing it only worked in older versions, and the newer version I'm using wants me to switch. But when I searched for how to use the PyTorch implementation torch.optim.AdamW, I couldn't find anything, so I don't know how to fix this.
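As the warning suggests, one fix is to swap `transformers`' deprecated `AdamW` for `torch.optim.AdamW`, which accepts the same `params`, `lr`, and `eps` arguments. A minimal sketch follows; the tiny `Linear` model, the value of `Learning_rate`, and the contents of `optimizer_grouped_parameters` are placeholder assumptions standing in for whatever the original training script defines:

```python
import torch

# Placeholders for the variables from the original script, so this runs standalone.
model = torch.nn.Linear(4, 2)
Learning_rate = 2e-5
optimizer_grouped_parameters = [
    {"params": model.parameters(), "weight_decay": 0.01},
]

# Drop-in replacement: torch.optim.AdamW instead of transformers' AdamW.
# The (params, lr, eps) arguments keep the same meaning.
optimizer = torch.optim.AdamW(optimizer_grouped_parameters, lr=Learning_rate, eps=1e-8)
```

Note it is a `FutureWarning`, not an error, so training should still run for now; switching the import just silences it and avoids breakage when the deprecated class is removed.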