douzhe1264 2014-12-29 07:36
58 views

Putting crawled data and MySQL data together in Solr

I am using the `Nutch` and `Solr` integration process. I am able to crawl some website data and push it into Solr, which I can then query easily. Now I also want to import data from MySQL and keep the two data sets side by side, but in separate indexes. I already tried the MySQL import and it worked, but it replaced all of the indexes built from the crawled data, so I effectively lost my crawled data. Can anyone help me with this?


1 answer

  • duanjurong1347 2014-12-29 12:09

    You'll need to create a separate collection for the MySQL import task. Collections are defined in your Solr configuration directory - see example/solr in the distribution for a minimal setup. You can create as many collections as you need, and each collection is handled independently of the existing ones.
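    For example, with the legacy multi-core layout used by Solr releases of that era, the second collection could be declared in solr.xml roughly as follows. This is a minimal sketch, not the asker's actual setup: the core names `nutch` and `mysql` and their instance directories are made-up placeholders.

    ```xml
    <!-- solr.xml (Solr 4.x-style): two independent cores, sketch only -->
    <solr persistent="true">
      <cores adminPath="/admin/cores">
        <!-- existing core holding the Nutch-crawled documents -->
        <core name="nutch" instanceDir="nutch" />
        <!-- new core dedicated to the MySQL import -->
        <core name="mysql" instanceDir="mysql" />
      </cores>
    </solr>
    ```

    With two cores, Nutch keeps posting to its own core while the DataImportHandler full-import runs against the other, so neither can wipe out the other's documents.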

    There's also the possibility of keeping the different content in the same index, but that can cause an issue with Nutch: it requires you to change preImportDeleteQuery in the DataImportHandler and to handle deletes from the data set yourself.
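    The reason the crawled documents vanished is that a DataImportHandler full-import deletes the whole index (`*:*`) before it starts unless told otherwise; `preImportDeleteQuery` on the root entity narrows that delete. The sketch below shows the idea for a single shared index - the JDBC settings, the table, and the `source` field used to tag MySQL documents are assumptions for illustration, not taken from the question.

    ```xml
    <!-- data-config.xml: sketch of a MySQL import that only deletes its own documents -->
    <dataConfig>
      <dataSource driver="com.mysql.jdbc.Driver"
                  url="jdbc:mysql://localhost:3306/mydb"
                  user="solr" password="secret"/>
      <document>
        <!-- preImportDeleteQuery limits the pre-import delete to MySQL-tagged docs,
             so documents indexed by Nutch are left untouched -->
        <entity name="item"
                preImportDeleteQuery="source:mysql"
                query="SELECT id, title, body, 'mysql' AS source FROM item">
          <field column="id" name="id"/>
          <field column="title" name="title"/>
          <field column="body" name="content"/>
          <field column="source" name="source"/>
        </entity>
      </document>
    </dataConfig>
    ```

    Even then, rows removed from MySQL have to be deleted from Solr by your own process (for example via deletedPkQuery on a delta-import), which is the extra bookkeeping mentioned above.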

