dspows0637 2012-10-10 01:15
Accepted

Is Node.js setInterval bad for performance?

I have a beanstalkapp worker written in Node.js. There is a PHP application that does all the site work, and whenever there are errors, issues, notifications, or anything else of note, it adds them to the beanstalkapp. The Node.js worker then needs to run more or less constantly, checking the beanstalkapp for messages and doing something with them (emailing someone, adding them to a log, posting somewhere).

My question is: is this bad performance-wise, or is there a better way to do it? I would assume that setInterval doesn't let the process end and would therefore be bad?
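(As a quick check of that assumption: an active setInterval does keep a Node.js process alive until the timer is cleared or unref()'d. A minimal sketch, not part of my actual worker:)

    // An active interval keeps the event loop, and therefore the process, alive.
    const timer = setInterval(() => {
      console.log('tick');
    }, 1000);

    // clearInterval(timer) stops the timer entirely; timer.unref() would
    // instead keep it firing but tell Node.js not to stay alive just for it.
    setTimeout(() => clearInterval(timer), 5000); // let the process exit after ~5 s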


1 Answer

  • douji3426 2012-10-10 01:44

    It depends on your setInterval time. The JavaScript interpreter sleeps between events, so between setInterval callbacks your Node.js app consumes zero (or almost zero) CPU time.

    So how much load the app puts on the system as a whole depends on how often setInterval fires. Firing once every second would hardly consume any CPU at all. On the other hand, firing once every 1 ms (a setInterval time of 1 or 0) can bog down your system, especially if you're running on a resource-constrained machine or VM.

    In my experience, a good compromise is around 50 or 100 ms (20 or 10 times per second). That's responsive enough even for real-time applications (human perception is a lot slower than 20 Hz), yet infrequent enough to have little effect on the rest of the system.
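    For reference, a rough sketch of that polling pattern at a 100 ms interval. checkQueue() is a hypothetical stand-in for whatever beanstalkd client call actually reserves and handles a job; the busy flag just keeps ticks from overlapping if a check runs long:

        let busy = false;

        // Hypothetical: reserve a job from beanstalkd, email/log/post it,
        // then delete it from the queue.
        async function checkQueue() {}

        setInterval(async () => {
          if (busy) return;   // skip this tick if the last check is still running
          busy = true;
          try {
            await checkQueue();
          } finally {
            busy = false;
          }
        }, 100); // 10 checks per second: responsive, with near-zero idle CPU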


