I am using MySQL as the data source for indexing into Solr. Earlier the data set was small, so indexing finished quickly without any lag. Now my table contains almost 3 million rows, the MySQL query always times out, and because of that Solr can't index the data. Is there any way to make Solr index faster from MySQL, or any tweaking I can do in Solr? Please help.
1 answer

- dsiuz86842, answered 2014-11-20 20:07
If you're not using incremental / delta indexing, you should start doing that. That way only the rows that have changed since the last index run are indexed again, reducing both the load on MySQL and the number of rows transferred. A sketch of what that looks like is shown below.
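As a rough, minimal sketch of a DataImportHandler `db-data-config.xml` with delta queries — the table and column names (`items`, `last_modified`) and the connection details are placeholders for your own schema, not something from the question:

```xml
<!-- db-data-config.xml: sketch of a delta-import setup.
     Table/column names (items, last_modified) are placeholders. -->
<dataConfig>
  <dataSource type="JdbcDataSource"
              driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost:3306/mydb"
              user="solr" password="secret"/>
  <document>
    <entity name="item"
            query="SELECT id, title, body FROM items"
            deltaQuery="SELECT id FROM items
                        WHERE last_modified &gt; '${dataimporter.last_index_time}'"
            deltaImportQuery="SELECT id, title, body FROM items
                              WHERE id = '${dataimporter.delta.id}'">
      <field column="id"    name="id"/>
      <field column="title" name="title"/>
      <field column="body"  name="body"/>
    </entity>
  </document>
</dataConfig>
```

You then trigger it with `/dataimport?command=delta-import` (for example from cron) instead of running a `full-import` every time.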
In addition, the JdbcDataSource (which I guess you're using) supports a `batchSize` parameter, which sets the JDBC fetch size so the driver pulls rows in chunks instead of issuing one query that has to return everything at once. You should also take care to have usable indexes on your data if you're performing any sort of filtering in the SQL when retrieving it (such as for a delta import). See the sketch below.
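A hedged sketch of the `dataSource` element with `batchSize` set (URL and credentials are placeholders again). For MySQL specifically, `batchSize="-1"` makes Connector/J stream the result set row by row instead of buffering it all in memory, which is usually what stops the timeouts on multi-million-row tables:

```xml
<!-- batchSize maps to the JDBC fetch size; -1 is the special value
     that makes the MySQL driver stream the result set row by row. -->
<dataSource type="JdbcDataSource"
            driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://localhost:3306/mydb"
            user="solr" password="secret"
            batchSize="-1"/>
```

And if your `deltaQuery` filters on something like `last_modified`, make sure that column is indexed, e.g. `CREATE INDEX idx_items_last_modified ON items (last_modified);`, otherwise every delta run still does a full table scan.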
Marked as the accepted answer by the asker.