dqhr76378 2017-01-01 16:06
Viewed 644 times

Golang: vector addition on the GPU

I'm writing an application that requires additions of 5000-length float vectors many times a second. Is it possible to have the GPU perform the calculations, and how would that be done? I need it to run on both Windows and Linux (and later a Raspberry Pi), so CUDA is out of the question since I don't have an Nvidia graphics card.


1 answer

  • dongqing5575 2017-01-08 04:55

    You can't directly talk to Nvidia GPUs from Go. You'd need to use cgo to call a C library from Go. See slide #8 in this presentation for one example (also see the full talk).

    There are some Go packages that wrap the cgo part mentioned above into a Go library; mumax is one such package. A minimal sketch of the cgo pattern follows below.
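
    To make the cgo pattern concrete, here is a minimal, self-contained sketch. The C function `vec_add` in the cgo preamble is a hypothetical stand-in for the GPU call: in a real non-Nvidia setup you would instead link a library such as OpenCL and enqueue a kernel there, but the Go-side wrapping (pointer casts, length passing) looks the same. The names `vec_add` and `vecAdd` are illustrative, not from any particular library.

    ```go
    package main

    /*
    // Plain C stand-in for a GPU kernel launch. A real implementation
    // would hand the buffers to an OpenCL (or similar) runtime here.
    void vec_add(const float* a, const float* b, float* out, int n) {
        for (int i = 0; i < n; i++) {
            out[i] = a[i] + b[i];
        }
    }
    */
    import "C"

    import (
        "fmt"
        "unsafe"
    )

    // vecAdd wraps the C call: it passes pointers to the slices' backing
    // arrays and the element count across the cgo boundary.
    func vecAdd(a, b []float32) []float32 {
        n := len(a)
        out := make([]float32, n)
        C.vec_add(
            (*C.float)(unsafe.Pointer(&a[0])),
            (*C.float)(unsafe.Pointer(&b[0])),
            (*C.float)(unsafe.Pointer(&out[0])),
            C.int(n),
        )
        return out
    }

    func main() {
        // Two 5000-element vectors, matching the size in the question.
        a := make([]float32, 5000)
        b := make([]float32, 5000)
        for i := range a {
            a[i] = float32(i)
            b[i] = float32(2 * i)
        }
        out := vecAdd(a, b)
        fmt.Println(out[:5]) // [0 3 6 9 12]
    }
    ```

    Note that each cgo call has fixed overhead, so for vectors this small it pays to batch work per call rather than crossing the Go/C boundary once per element.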

