weixin_39620653
2020-12-09 05:10

Got RuntimeError: expected a Variable argument, but got NoneType

Hi,

I was trying to use Fusedmax() with a tensor and got the following error: *** RuntimeError: expected a Variable argument, but got NoneType

This was observed in both PyTorch 0.4.1 and PyTorch 1.1.0.

This question comes from the open-source project: vene/sparse-structured-attention


4 replies

  • weixin_39760206 5 months ago

    Hi, could you please provide a minimum code sample reproducing the issue?

  • weixin_39620653 5 months ago

    Thank you for your quick response. Below is the code that generates the RuntimeError for me. I am not sure whether I am using the function correctly, though.

    P.S. During my actual run, my tensor is 3-dimensional, but I was able to replicate the error with just a plain 1-D tensor.

    python
    import torch
    import torchsparseattn
    a = torch.tensor([1, 2, 2])
    fusedmax = torchsparseattn.Fusedmax()
    fusedmax(a)  # raises: RuntimeError: expected a Variable argument, but got NoneType
    
  • weixin_39760206 5 months ago

    You need a "lengths" argument too; this is implemented for masking purposes (i.e. when doing batched attention over variable-length sequences).

    Here is an example:

    
    In [1]: import torch
    In [2]: import torchsparseattn
    In [3]: a = torch.tensor([1, 2, 2], dtype=torch.double)
    In [4]: lengths = torch.tensor([3])
    In [5]: fusedmax = torchsparseattn.Fusedmax()
    In [6]: fusedmax(a, lengths)
    Out[6]: tensor([0.3333, 0.3333, 0.3333], dtype=torch.float64)
    
  • weixin_39760206 5 months ago

    (I agree this is unclear; let me add an example to the readme.)
