dongshanfan1941
2015-08-16 19:48

Best practices for importing large CSV files into MySQL

  • csv
  • php

Looking for insight on the best approach for importing large CSV files into MySQL and managing the dataset. This is for an ecommerce storefront startup. All product data will be read from CSV files, which are downloaded via curl (server to server).

Each CSV file represents a different supplier/warehouse with up to 100,000 products. In total there are roughly 1.2 million products spread across 90-100 suppliers. At least 75% of the row data (51 columns) is redundant and will not be needed.

Would it be better to use mysqli with LOAD DATA LOCAL INFILE into a 'temp_products' staging table, make the needed data adjustments per row, and then insert into the live 'products' table, or simply use fgetcsv() and go row by row? The import will be handled by a cron job using the site's php.ini, which has a memory limit of 128M.
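For reference, a minimal sketch of the two approaches in PHP 5.4-era mysqli. The table names beyond those mentioned above, the column positions, and the price/qty adjustments are invented placeholders, and LOAD DATA LOCAL additionally requires `local_infile` to be enabled on both the MySQL server and the client:

```php
<?php
// Sketch only: column indices, connection details, and the SELECT
// transformation are hypothetical, not from the question.
$db = new mysqli('localhost', 'user', 'pass', 'shop');

// --- Approach 1: bulk-load into staging, transform in SQL ---
// LOAD DATA runs inside the server, so PHP memory use stays flat
// regardless of file size.
$db->query("TRUNCATE temp_products");
$db->query("
    LOAD DATA LOCAL INFILE '/tmp/supplier.csv'
    INTO TABLE temp_products
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
");
// Keep only the needed columns; do the per-row adjustments as one
// set-based statement instead of 100k individual round trips.
$db->query("
    INSERT INTO products (sku, price, qty)
    SELECT sku, price, qty
    FROM temp_products
    WHERE qty > 0
    ON DUPLICATE KEY UPDATE price = VALUES(price), qty = VALUES(qty)
");

// --- Approach 2: stream row by row with fgetcsv() ---
// Memory use is one row at a time, so 128M is plenty, but tens of
// thousands of single-row INSERTs are slow unless wrapped in a
// transaction or batched into multi-row statements.
$stmt = $db->prepare("INSERT INTO products (sku, price, qty) VALUES (?, ?, ?)");
$fh = fopen('/tmp/supplier.csv', 'r');
fgetcsv($fh); // skip the header row
while (($row = fgetcsv($fh)) !== false) {
    // pick the few needed columns out of the 51 (indices are made up)
    list($sku, $price, $qty) = array($row[0], (float)$row[7], (int)$row[12]);
    $stmt->bind_param('sdi', $sku, $price, $qty);
    $stmt->execute();
}
fclose($fh);
```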

  • Apache V2.2.29
  • PHP V5.4.43
  • MySQL V5.5.42-37.1-log
  • memory_limit 128M

I'm not looking for how-tos. I'm simply looking for the best approach from the community's perspective and experience.


1 answer
