douzi8112 2016-08-19 22:16
Viewed 26 times

How to optimize wp_insert_post() for a large database?

I have a PHP script that reads a text file and inserts each line as a WordPress post with wp_insert_post().

Everything was fine until the database grew large.

The database now exceeds 250,000 posts, and the process has become very slow: it inserts only about 2,000 lines of the file per hour.

Does anyone have tips to speed up the process?

Below is part of the code:


$pointer = fopen("file.txt", "r");
while (!feof($pointer)) {
    $line = fgets($pointer, 4096); // Here I parse the line into the variables I need for the post...
    $my_post = array();
    $my_post['post_title']    = $post_title;
    $my_post['post_content']  = $content;
    $my_post['post_status']   = 'publish';
    $my_post['post_author']   = 1;
    $my_post['post_category'] = $NewcatArray;
    $my_post['tags_input']    = $tags;
    $postid = wp_insert_post($my_post);
}
fclose($pointer);

1 answer

  • dsubq24666 2016-08-20 03:32

    You can try disabling autocommit before your loop:

    $wpdb->query( 'SET autocommit = 0;' );

    Then, after your loop, commit and re-enable autocommit:

    $wpdb->query( 'COMMIT; SET AUTOCOMMIT = 1;' );
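
    Putting the two queries together with the loop from the question, the import could be sketched like this. This is only a sketch: it assumes WordPress is loaded (so wp_insert_post() and the global $wpdb object are available), and the per-line parsing is elided just as in the original code.

    ```php
    <?php
    // Sketch: wrap all inserts in a single transaction so MySQL/InnoDB
    // does one commit at the end instead of one per wp_insert_post() call.
    global $wpdb;

    $wpdb->query('SET autocommit = 0;'); // defer commits during the loop

    $pointer = fopen('file.txt', 'r');
    while (!feof($pointer)) {
        $line = fgets($pointer, 4096);
        // ... parse $line into $post_title, $content, etc. as in the question ...
        $postid = wp_insert_post(array(
            'post_title'   => $post_title,
            'post_content' => $content,
            'post_status'  => 'publish',
            'post_author'  => 1,
        ));
    }
    fclose($pointer);

    $wpdb->query('COMMIT; SET autocommit = 1;'); // flush everything in one commit
    ```

    A single commit avoids the per-row flush overhead that dominates large imports; for very big files it is also common to COMMIT every few thousand rows instead of once at the very end, so a failure does not lose the whole batch.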

