doumie6223 2012-05-14 08:38
Viewed 94 times
Accepted

Inserting a large amount of data [PHP, MySQL]

I have a big data set in MySQL (users, companies, contacts), about 1 million records.

Now I need to import new users, companies, and contacts from a CSV import file with about 100,000 records. Each record in the file has the info for all three entities (user, company, contact). Moreover, on production I can't use LOAD DATA (I just don't have that many rights :( ).

So there are three steps which should be applied to that data set:

  • compare with existing DB data
  • update it (if we find something in the previous step)
  • insert the new records
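
If there is a unique key to match on (say, a UNIQUE email on users; the schema below is just an assumption, not from my real tables), the compare/update/insert steps can collapse into a single upsert per row, e.g.:

    <?php
    // Hypothetical schema: users(email UNIQUE, name, company).
    // INSERT ... ON DUPLICATE KEY UPDATE covers both "update existing"
    // and "insert new" in one statement:
    $upsertSql = "INSERT INTO users (email, name, company)
                  VALUES (?, ?, ?)
                  ON DUPLICATE KEY UPDATE
                      name    = VALUES(name),
                      company = VALUES(company)";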

I'm using PHP on the server for this. I can see two approaches:

  • read ALL the data from the file at once, then work with this BIG array and apply those steps to it
  • or read the file line by line and pass each line through the steps

Which approach is more efficient in terms of CPU, memory, and time usage?

Can I use transactions, or will they slow down the whole production system?

Thanks.


3 Answers

  • dongmao7195 2012-05-14 08:41

    In terms of CPU time there won't be much in it, although reading the whole file will be slightly faster. However, for such a large data set, the additional memory required to read all the records into memory will vastly outstrip the time advantage; I would definitely process one line at a time.
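
    A minimal sketch of that line-by-line approach, assuming PDO and the hypothetical users(email UNIQUE, name, company) schema from the question; committing in batches keeps transactions short so the production tables aren't locked for the whole import:

        <?php
        // Stream the CSV one row at a time: memory use stays flat no
        // matter how big the file is.
        $pdo = new PDO(
            'mysql:host=localhost;dbname=app;charset=utf8', // DSN is an assumption
            'user', 'pass',
            array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
        );

        $stmt = $pdo->prepare(
            'INSERT INTO users (email, name, company)
             VALUES (?, ?, ?)
             ON DUPLICATE KEY UPDATE name = VALUES(name), company = VALUES(company)'
        );

        $batchSize = 500; // commit every 500 rows: far cheaper than
                          // autocommit per row, far shorter locks than
                          // one giant 100k-row transaction
        $rows = 0;
        $fh = fopen('import.csv', 'r');
        $pdo->beginTransaction();
        while (($line = fgetcsv($fh)) !== false) {
            if ($line === array(null)) {
                continue; // fgetcsv returns array(null) for blank lines
            }
            $stmt->execute($line); // CSV column order is assumed to match
            if (++$rows % $batchSize === 0) {
                $pdo->commit();          // release locks periodically
                $pdo->beginTransaction();
            }
        }
        $pdo->commit();
        fclose($fh);

    On the transactions question: batching like this is usually safe on production. One transaction around all 100,000 rows would hold locks (and grow the undo log) for the entire import, while committing every single row pays the durability overhead 100,000 times.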

    Accepted as the best answer by the asker.
