dpqaaczn141761 2014-04-14 18:28
Viewed 21 times

Go garbage collection performance with huge memory sizes

I am considering implementing a memory caching daemon in Go. It could reach serious memory utilization (say, a terabyte). Fragmenting the data into separate heaps is not a good option; I want it all in one memory space. Does anyone have experience running Go with such huge memory sizes? Will the GC perform acceptably?


1 answer

  • duanan1228 2014-04-14 19:18

    I am trying to do the same, but the only project that gave me good performance for caching data was the binary tree at https://github.com/stathat/treap, which supported more than 1 million nodes in memory on one Ubuntu 12.04 LTS machine with 8 GB of memory. It was also fast at loading and searching data.

    Other projects I tested were LMDB (which did not support as many nodes in memory), kv, go-cache, and goleveldb, but none of them was as fast at retrieving data from memory as treap.
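The answer's recommendation is a treap: a binary search tree ordered by key, with a random per-node priority kept in heap order, which keeps the tree balanced in expectation. Below is a minimal self-contained sketch of the idea (not the stathat/treap library's actual API; `Insert`, `Get`, and the `node` type are names chosen for this example):

```go
package main

import (
	"fmt"
	"math/rand"
)

// node is a treap node: BST-ordered by key, max-heap-ordered by a
// random priority, which balances the tree in expectation.
type node struct {
	key         string
	val         int
	priority    int64
	left, right *node
}

func rotateRight(n *node) *node {
	l := n.left
	n.left = l.right
	l.right = n
	return l
}

func rotateLeft(n *node) *node {
	r := n.right
	n.right = r.left
	r.left = n
	return r
}

// Insert adds or updates key in the treap rooted at n and returns the
// new root, rotating to restore the heap property on priorities.
func Insert(n *node, key string, val int) *node {
	if n == nil {
		return &node{key: key, val: val, priority: rand.Int63()}
	}
	switch {
	case key < n.key:
		n.left = Insert(n.left, key, val)
		if n.left.priority > n.priority {
			n = rotateRight(n)
		}
	case key > n.key:
		n.right = Insert(n.right, key, val)
		if n.right.priority > n.priority {
			n = rotateLeft(n)
		}
	default:
		n.val = val // key already present: update in place
	}
	return n
}

// Get looks up key iteratively and reports whether it was found.
func Get(n *node, key string) (int, bool) {
	for n != nil {
		switch {
		case key < n.key:
			n = n.left
		case key > n.key:
			n = n.right
		default:
			return n.val, true
		}
	}
	return 0, false
}

func main() {
	var root *node
	for i := 0; i < 1000; i++ {
		root = Insert(root, fmt.Sprintf("key-%04d", i), i)
	}
	v, ok := Get(root, "key-0042")
	fmt.Println(v, ok) // 42 true
}
```

Because the priorities are random, inserts and lookups run in O(log n) expected time without the rebalancing bookkeeping of red-black or AVL trees, which is one plausible reason the treap-based cache performed well in the answer's tests.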


