dougu0824 2016-07-21 10:52
109 views
Accepted

How can I quickly search 15 million records?

I have an Excel file of 600 records that I need to compare against 15 million records in a database. For each Excel record I need to find the matching rows among those 15 million records. The process currently takes about 4 hours to complete; I want to cut that to at most 2 hours.


2 answers

  • dongzhan3937 2016-07-21 10:57
    1. Export the Excel file to CSV.
    2. Import the CSV into MySQL (LOAD DATA).
    3. Create an index on the key column (the one you want to compare against the 15M records).
    4. Make sure the 15M-record table also has a proper index on the comparison key!
    5. Write a simple query that joins the two tables on the criteria you need (see the sketch below).

    All these points are fairly trivial and left as an exercise to the reader.
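    For concreteness, here is a minimal MySQL sketch of steps 1–5. The table names (excel_records, big_table), the column name match_key, and the CSV path are placeholders assumed for illustration, not taken from the original question:

        -- Sketch only: excel_records, big_table, match_key and the file path
        -- are assumed names, not from the original question.

        -- Steps 1-2: load the exported CSV into a staging table.
        CREATE TABLE excel_records (
            match_key VARCHAR(100) NOT NULL
        );

        LOAD DATA LOCAL INFILE '/path/to/excel_export.csv'
        INTO TABLE excel_records
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\n'
        IGNORE 1 LINES;  -- skip the header row

        -- Steps 3-4: index the comparison key on both tables.
        CREATE INDEX idx_excel_key ON excel_records (match_key);
        CREATE INDEX idx_big_key   ON big_table (match_key);

        -- Step 5: join the 600-row table against the 15M-row table.
        -- With both indexes in place, each Excel row becomes an indexed
        -- lookup instead of a scan over 15 million rows.
        SELECT b.*
        FROM excel_records e
        JOIN big_table b ON b.match_key = e.match_key;

    With this setup the database does one index-driven join instead of 600 separate full-table comparisons, which is where the original 4-hour runtime goes.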

    Accepted by the asker as the best answer.

