doujiaochan7317 2012-08-27 16:07 Acceptance rate: 0%
Viewed 55 times
Accepted

Long-running Magento process

Does anyone have experience using a long-running Magento process to mitigate framework overhead? For example, a typical Magento API call against an order or customer resource can take 1 s or more, with potentially half of that time spent on Magento bootstrap overhead rather than on the API resource in question.

So, what if a Magento PHP process were spun up and kept in memory, waiting for API requests, so that it could handle them without having to load Magento each time?
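The idea above can be sketched as follows. This is a hypothetical illustration, not working Magento code: `bootstrap()` and `handleRequest()` are stand-ins for `require 'app/Mage.php'; Mage::app()` and a real model lookup, and the in-memory customer array is invented for the example.

```php
<?php
// Hypothetical sketch: pay the Magento bootstrap cost once, then keep
// handling requests in the same process. bootstrap() and handleRequest()
// are stand-ins; a real worker would require app/Mage.php and dispatch
// to Mage::getModel(...) instead of this in-memory array.

function bootstrap(): array
{
    // Stand-in for: require 'app/Mage.php'; Mage::app('admin');
    // This is the ~0.5 s cost paid once per process instead of per call.
    return ['customers' => [1 => ['name' => 'Alice'], 2 => ['name' => 'Bob']]];
}

function handleRequest(array $app, array $request): array
{
    // Stand-in for loading a real customer model by ID.
    $customer = $app['customers'][$request['customer_id']] ?? null;
    return ['found' => $customer !== null, 'customer' => $customer];
}

$app = bootstrap();   // once per worker lifetime

// A real worker would block on a socket or a job queue here; this loop
// just demonstrates that repeated requests reuse the bootstrapped state.
$responses = [];
foreach ([['customer_id' => 1], ['customer_id' => 2], ['customer_id' => 99]] as $req) {
    $responses[] = handleRequest($app, $req);
}

echo json_encode($responses), "\n";
```

In practice a long-running PHP worker also needs memory-leak management (old Magento versions were not written with process reuse in mind), so workers are typically recycled after N requests.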

Most of my searches for "long-running PHP scripts" turn up questions about troubleshooting scripts that take longer than expected because of the amount of data they process, so I'm finding it difficult to locate good resources on this kind of thing, if it's even possible.

UPDATE: To be a bit more specific with my needs:

  • I already have memcached in place for simple GETs that we can safely cache server-side.
  • It's write operations that I'd like to optimize now.
  • We're using the REST API, so WSDL loading is not a concern.

1 answer (sorted by: default, newest)

  • dongtang6718 2012-08-27 20:16

    You may want to look into proc_open(), but be aware that you'll need to do a lot of the process management that the OS normally handles for you.

    However, if the problem is speed, and not just wanting a way to pipe/fork to make use of the available hardware, I would first look for bottlenecks throughout the system and add caching before diving into that: WSDL caching, DB normalization, opcode caching, memcached, or reverse-proxy caching. Alan does have WSDL caching in his Mercury API product (http://store.pulsestorm.net/products/mercury-api).
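    For reference, two of the caching suggestions above map to stock PHP configuration: WSDL caching is controlled by the `soap.wsdl_cache_*` php.ini directives (relevant only to the SOAP API, not the asker's REST setup), and opcode caching in the PHP 5.3 era typically meant APC. The specific values below are illustrative assumptions, not recommendations:

    ```ini
    ; WSDL caching (SOAP API only) - built into PHP's soap extension
    soap.wsdl_cache_enabled = 1
    soap.wsdl_cache_dir = /tmp
    soap.wsdl_cache_ttl = 86400

    ; Opcode caching via APC (common on PHP 5.3-era Magento stacks)
    apc.enabled = 1
    apc.shm_size = 128M
    ```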

    I have used proc_open() before when importing over 500k customer records (through Magento's model stack, I might add) with addresses into Magento on a 32-core system in under 8 hours using this same approach. One PHP file acted as the main entry point, and new processes based on chunks of data were forked out to a secondary PHP file that did the actual importing.
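    The fan-out pattern described above can be sketched roughly as follows. This is an illustrative stand-in, not the original import script: the children here just sum their chunk via `php -r`, where a real worker would be a secondary PHP file that imports its rows through Magento's models.

    ```php
    <?php
    // Hypothetical sketch of the parent/child fan-out: split the work
    // into chunks and hand each chunk to a child PHP process via
    // proc_open(), then collect the results from the children's stdout.

    $chunks = array_chunk(range(1, 20), 5);   // stand-in for customer records
    $procs  = [];

    foreach ($chunks as $i => $chunk) {
        $descriptors = [
            0 => ['pipe', 'r'],   // child stdin
            1 => ['pipe', 'w'],   // child stdout
            2 => ['pipe', 'w'],   // child stderr
        ];
        // Each child sums its chunk; a real worker file would import rows.
        $proc = proc_open(
            'php -r "echo array_sum(json_decode(stream_get_contents(STDIN)));"',
            $descriptors,
            $pipes
        );
        fwrite($pipes[0], json_encode($chunk));
        fclose($pipes[0]);                 // EOF lets the child start working
        $procs[$i] = [$proc, $pipes];      // children now run concurrently
    }

    $total = 0;
    foreach ($procs as [$proc, $pipes]) {
        $total += (int) stream_get_contents($pipes[1]);   // collect result
        fclose($pipes[1]);
        fclose($pipes[2]);
        proc_close($proc);                 // reap the child
    }

    echo $total, "\n";   // sum of 1..20
    ```

    The key design point is that the parent writes each chunk and closes the pipe before blocking on any child's output, so all children run in parallel and the parent only serializes on collection.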

    I did leverage this small script for multi-threading on the import I mentioned. Although this isn't an exact answer to your question, since it isn't very technically specific, hopefully it offers some insight into what's possible:

    This answer was selected as the best answer by the asker.
