2014-03-19 09:40
Viewed 55 times


I was working on a migration script that selects data from one MySQL database and imports it through Doctrine into another MySQL database. The problem was that after every chunk of created entities, my script slowed down.

The first 100 articles take about 5 seconds to import, the next 100 take 7 seconds, the next 10 seconds, and so on. This is a really big problem, because I need to import about 1,500,000 articles.



1 answer

  • dongpu2727 2014-03-19 09:41

    I found out that PHP >= 5.3 has a garbage collector that can be triggered manually. So after importing each chunk of articles I call gc_collect_cycles(); to free the memory held by entities the script no longer needs. The script no longer slows down!

    If you are using a framework, check whether it has its own cache system. If you are using Doctrine, turn off the SQL logger:

    /** @var $em EntityManager */
    $em = $this->getContainer()->get('doctrine')->getEntityManager();
    $em->getConnection()->getConfiguration()->setSQLLogger(null);

    and then clear the Doctrine cache after every chunk is imported.
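    Put together, a chunked import loop might look like the following. This is a minimal sketch of the approach, not the asker's actual script: the `Article` entity, its `setTitle()` method, the batch size, and the `$rows` data source are all assumptions for illustration.

    ```php
    <?php
    // Assumed: $em is a Doctrine EntityManager obtained as above,
    // $rows is the data selected from the source database, and
    // Article is a hypothetical entity class.
    $batchSize = 100;

    foreach ($rows as $i => $row) {
        $article = new Article();
        $article->setTitle($row['title']);
        $em->persist($article);

        if (($i % $batchSize) === 0) {
            $em->flush();          // write the pending inserts to the database
            $em->clear();          // detach all managed entities so memory can be freed
            gc_collect_cycles();   // force PHP to collect reference cycles now
        }
    }
    $em->flush();  // flush the final partial batch
    $em->clear();
    ```

    Clearing the EntityManager is what keeps the identity map from growing without bound; without it, Doctrine keeps a reference to every persisted entity and each chunk gets slower than the last.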

