dqhr76378 2017-01-01 16:06

Golang vector addition on the GPU

I'm writing an application that requires additions of 5000-element float vectors many times per second. Is it possible to have the GPU perform the calculations, and how would that be done? I need it to run on both Windows and Linux (and later a Raspberry Pi), so CUDA is out of the question, as I don't have an Nvidia graphics card.


1 answer

  • dongqing5575 2017-01-08 04:55

    You can't directly talk to Nvidia GPUs from Go. You'd need to use cgo to call a C library from Go. See slide #8 in this presentation for one example (also see the full talk). A minimal sketch of the cgo mechanism is shown below.
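    To make the mechanism concrete, here is a minimal sketch of calling C from Go with cgo. The C function vec_add is invented for this example and just loops on the CPU; in a real program it would instead hand the buffers to a GPU runtime.

```go
package main

/*
// Stand-in for a GPU kernel launcher. A real implementation would
// pass these buffers to an OpenCL/CUDA runtime instead of looping here.
static void vec_add(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; i++) {
        out[i] = a[i] + b[i];
    }
}
*/
import "C"

import "fmt"

func main() {
	const n = 5000
	a := make([]float32, n)
	b := make([]float32, n)
	out := make([]float32, n)
	for i := range a {
		a[i] = float32(i)
		b[i] = float32(2 * i)
	}
	// A Go slice is passed to C as a pointer to its first element;
	// C.float and Go's float32 share the same memory layout.
	C.vec_add((*C.float)(&a[0]), (*C.float)(&b[0]), (*C.float)(&out[0]), C.int(n))
	fmt.Println(out[0], out[n-1]) // 0 and 14997
}
```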

    There are some Go packages that wrap the cgo part I mentioned above into a Go library; mumax is one such package.
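    Since CUDA is ruled out, OpenCL is the usual cross-platform C library to wrap this way (it runs on AMD and Intel GPUs too; Raspberry Pi support depends on third-party implementations). Below is an untested sketch of a complete OpenCL host driven from Go via cgo, assuming an OpenCL SDK is installed (headers plus -lOpenCL). The function gpu_vec_add and the kernel source are invented for this example, and cleanup/error handling is abbreviated.

```go
package main

/*
#cgo LDFLAGS: -lOpenCL
#include <CL/cl.h>

// Builds a trivial vector-add kernel and runs it on the first device found.
// Returns 0 on success, an OpenCL error code otherwise.
static int gpu_vec_add(const float *a, const float *b, float *out, int n) {
    const char *src =
        "__kernel void vec_add(__global const float *a,\n"
        "                      __global const float *b,\n"
        "                      __global float *out) {\n"
        "    int i = get_global_id(0);\n"
        "    out[i] = a[i] + b[i];\n"
        "}\n";
    cl_int err;
    cl_platform_id platform;
    cl_device_id device;
    if ((err = clGetPlatformIDs(1, &platform, NULL)) != CL_SUCCESS) return err;
    if ((err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, NULL)) != CL_SUCCESS) return err;
    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, &err);
    size_t bytes = (size_t)n * sizeof(float);
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, bytes, (void *)a, &err);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, bytes, (void *)b, &err);
    cl_mem dout = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, bytes, NULL, &err);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    if ((err = clBuildProgram(prog, 1, &device, NULL, NULL, NULL)) != CL_SUCCESS) return err;
    cl_kernel k = clCreateKernel(prog, "vec_add", &err);
    clSetKernelArg(k, 0, sizeof(cl_mem), &da);
    clSetKernelArg(k, 1, sizeof(cl_mem), &db);
    clSetKernelArg(k, 2, sizeof(cl_mem), &dout);
    size_t global = (size_t)n;
    if ((err = clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL)) != CL_SUCCESS) return err;
    // Blocking read copies the result back into the Go-owned buffer.
    return clEnqueueReadBuffer(q, dout, CL_TRUE, 0, bytes, out, 0, NULL, NULL);
}
*/
import "C"

import "fmt"

func main() {
	const n = 5000
	a := make([]float32, n)
	b := make([]float32, n)
	out := make([]float32, n)
	for i := range a {
		a[i] = float32(i)
		b[i] = float32(2 * i)
	}
	if rc := C.gpu_vec_add((*C.float)(&a[0]), (*C.float)(&b[0]), (*C.float)(&out[0]), C.int(n)); rc != 0 {
		fmt.Println("OpenCL error:", rc)
		return
	}
	fmt.Println(out[0], out[n-1]) // expect 0 and 14997
}
```

    One caveat: for vectors of only 5000 elements, the per-call cost of copying buffers to and from the device can easily exceed the cost of the addition itself, so benchmark this against a plain Go loop before committing to the GPU path.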

