2018-06-07 14:46



I have built a PHP script for importing data from an Excel file (~40KB) into the DB, triggered via an Ajax request. It inserts 200 events, 500 registrations (registrations for the Events Calendar Pro plugin), around 500 users and pages, and creates 1000 PDFs.

It inserts/creates everything one step at a time (first events, then registrations, etc.), and when I upload a smaller batch, the entire script works just fine. So I can import all of the data, as long as I split it into smaller pieces.

However, when I upload the entire data set in one go, it always stops at the third step (the user inserts), consistently at around 200 users, and never returns anything. In the browser's Network inspector the request still shows as pending.

I have raised the relevant PHP settings to very high values, but the problem persists:

memory_limit = 8216M
max_execution_time = 3600
max_input_time = 3600
post_max_size = 1028M
upload_max_filesize = 1028M

Can there be any setting in PHP, the server, or WordPress that could cause this? I am lost after sitting on this for a couple of days. If anyone has a hint or idea, thanks a lot in advance.



  • duanfeng7756, 3 years ago

    Like all the other commenters are saying, the tasks you are trying to complete are a bit much for a single Ajax call.

    I'm guessing you regularly update the Excel file and need to push those updates to WP?

    Would it be OK to return from the Ajax call after each insert step and then start a new one?

    Sure, that's an option; I've built many Ajax batch loaders for collecting large data sets and sending mail. That way you also sidestep PHP script timeouts.
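    A minimal client-side sketch of that batch-loader idea: split the parsed rows into chunks and send one request per chunk, waiting for each to finish before starting the next, so no single PHP request runs long enough to hit a timeout. The batch size, the `sendBatch` callback, and the endpoint mentioned in the comment are assumptions to adapt to your setup.

    ```javascript
    // Split the parsed spreadsheet rows into fixed-size chunks.
    function chunk(rows, size) {
      const batches = [];
      for (let i = 0; i < rows.length; i += size) {
        batches.push(rows.slice(i, i + size));
      }
      return batches;
    }

    // Send one chunk per Ajax request, strictly in sequence, so each
    // PHP request stays short. sendBatch is whatever performs the POST,
    // e.g. a fetch() to /wp-admin/admin-ajax.php with your action name.
    async function runBatches(rows, size, sendBatch) {
      let done = 0;
      for (const batch of chunk(rows, size)) {
        await sendBatch(batch); // wait before starting the next request
        done += batch.length;
        console.log(`imported ${done}/${rows.length} rows`);
      }
    }
    ```

    The server side then only needs to handle one small chunk per request, which also makes it easy to report progress back to the user between steps.
    
    
    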
