douhao1956 2014-02-08 14:18

How do I process and insert a huge amount of data coming through an API into a MySQL database?

On my live website, I need to process and insert a very large amount of data coming through the Edmunds API. I have three tables: makes, models, and trims. By connecting to the Edmunds API, I get the makes data. Under each make I get its models, and under each model I have various trims. I have no problem with makes and models, but the trims data is causing a lot of problems: more than 2000 rows need to be inserted into the database in one operation.
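
For reference, the trims step boils down to something like this (a simplified sketch; the table and column names are placeholders, and $models stands for the rows already decoded from the API response):

<?php
// Simplified sketch: for every model returned by the API,
// insert each of its trims. Column names are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=cars', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$stmt = $pdo->prepare('INSERT INTO trims (model_id, trim_name) VALUES (?, ?)');

foreach ($models as $model) {             // models fetched earlier from the API
    foreach ($model['trims'] as $trim) {  // 2000+ trims in total across all models
        $stmt->execute(array($model['id'], $trim['name']));
    }
}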

I can get the data for makes and models only when I disable the code for trims. If I enable the trims section of my code, then makes and models stop working as well.

Very surprisingly, all the code for makes, models, and trims works on my localhost. I have already used

ini_set('max_execution_time', 600);

in my script to avoid a script execution timeout.

So please help me figure out how I can process and insert all of this data into my database on the live site. Thanks in advance.


2 answers

  • duangejian6657 2014-02-08 14:23

    Inserting the data doesn't take the database very long. I suspect the script is dying because of the memory limit, not the execution time.

    Try:

    ini_set('memory_limit', '500M');
    

    This lets the script use up to 500 megabytes of memory, so you can rule out a memory issue. If it runs successfully, you can pare the limit back down as needed.
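
    To know how far you can pare it down, log the real peak usage at the end of the run (a minimal sketch; the 500M figure is just a starting guess):

    ini_set('memory_limit', '500M');
    ini_set('max_execution_time', 600);

    // ... fetch the makes/models/trims from the API and insert them here ...

    // memory_get_peak_usage(true) reports the real peak in bytes,
    // so you can see how much headroom the 500M actually leaves.
    error_log('Peak memory: ' . round(memory_get_peak_usage(true) / 1048576) . ' MB');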

