What would be the best way to migrate Solr cores to Elasticsearch indices?
The solr-river-plugin (https://github.com/javanna/elasticsearch-river-solr) is deprecated.
There's a nice ad hoc Python tool, made with love by the nice folks at OpenSource Connections, that you can use to do this:
https://github.com/o19s/solr-to-es
Simply run:
./solr-to-es solr_url elasticsearch_url elasticsearch_index doc_type
For instance, the command below will page through all documents in the local Solr core named node and submit them to the local Elasticsearch server, indexing them into my_index with a document type of my_type.
./solr-to-es.py localhost:8983/solr/node localhost:9200 my_index my_type
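Under the hood, a migration like this just pages through Solr's /select endpoint and POSTs batches to Elasticsearch's _bulk API. Below is a minimal stdlib-only Python sketch of that idea; the URLs, core name, and batching details are illustrative assumptions, not taken from solr-to-es itself.

```python
import json


def to_bulk_body(docs, index_name, doc_type):
    """Build an NDJSON body for Elasticsearch's _bulk endpoint."""
    lines = []
    for doc in docs:
        action = {"index": {"_index": index_name, "_type": doc_type}}
        if doc.get("id") is not None:
            action["index"]["_id"] = doc["id"]
        lines.append(json.dumps(action))   # action line
        lines.append(json.dumps(doc))      # document line
    return "\n".join(lines) + "\n"         # _bulk requires a trailing newline


if __name__ == "__main__":
    # Page through Solr with cursorMark and POST each batch (needs live servers).
    import urllib.request

    solr_url = "http://localhost:8983/solr/node/select"  # hypothetical core
    cursor = "*"
    while True:
        query = f"{solr_url}?q=*:*&sort=id+asc&rows=500&cursorMark={cursor}&wt=json"
        with urllib.request.urlopen(query) as resp:
            data = json.load(resp)
        docs = data["response"]["docs"]
        if not docs:
            break
        req = urllib.request.Request(
            "http://localhost:9200/_bulk",
            data=to_bulk_body(docs, "my_index", "my_type").encode("utf-8"),
            headers={"Content-Type": "application/x-ndjson"},
        )
        urllib.request.urlopen(req)
        if data["nextCursorMark"] == cursor:  # same cursor twice = done
            break
        cursor = data["nextCursorMark"]
```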
Related
By default, Elasticsearch queries its own data store, in the indexes defined for the search.
Is it possible to have Elasticsearch query not its own store but my PostgreSQL database instead?
No.
Elasticsearch is a database in its own right; it is not an interface or middleman for other backends.
If you want to conditionally query different databases, you need to implement that logic at application level.
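That application-level logic can be as simple as a dispatch function in front of both backends. A hypothetical Python sketch (the heuristic, field names, and commented-out client calls are all made up for illustration):

```python
def route_query(query: str) -> str:
    """Decide which backend should handle a query (illustrative heuristic):
    multi-word, full-text-looking queries go to Elasticsearch,
    single exact terms go to PostgreSQL."""
    return "elasticsearch" if len(query.split()) > 1 else "postgresql"


def search(query: str) -> str:
    backend = route_query(query)
    if backend == "elasticsearch":
        # e.g. es.search(index="items", body={"query": {"match": {"text": query}}})
        pass
    else:
        # e.g. cursor.execute("SELECT * FROM items WHERE name = %s", (query,))
        pass
    return backend
```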
I am using MongoDB to store items for an auction site, and I want to enable fuzzy searching.
Should I query for 1000 results with no parameters and then filter them with a JS library like Fuse.js?
Or should I rely on MongoDB's $regex alone to do the query?
MongoDB isn't a great choice for a problem like this. There are lots of great text-search utilities available, the most prominent these days being Elasticsearch. You'd continue to store your data in MongoDB, but you'd keep an Elasticsearch instance synced to the MongoDB database and perform your searches against Elasticsearch. Mongoosastic is a good way to write to both concurrently, or Transporter can be used to shift the synchronization away from your database persistence flow.
Mongoosastic example:
https://blog.cloudboost.io/sync-mongo-with-elastic-and-save-months-of-development-time-and-cost-d281e0ca8fe4
Some other ways to sync including Transporter: https://code.likeagirl.io/5-different-ways-to-synchronize-data-from-mongodb-to-elasticsearch-d8456b83d44f
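Whichever tool you pick (Mongoosastic, Transporter, or a hand-rolled change-stream consumer), the core of the sync is the same: apply each insert/update/delete event from MongoDB to the Elasticsearch index. A minimal Python sketch of that applier, with a plain dict standing in for the index (the event shape mirrors MongoDB change-stream documents; real code would call the Elasticsearch client instead):

```python
def apply_change(index_mirror: dict, change: dict) -> dict:
    """Apply one MongoDB-style change event to an in-memory stand-in
    for the Elasticsearch index (replace dict ops with es.index/es.delete)."""
    op = change["operationType"]
    doc_id = str(change["documentKey"]["_id"])
    if op in ("insert", "update", "replace"):
        # Upsert the full document under the Mongo _id.
        index_mirror[doc_id] = change["fullDocument"]
    elif op == "delete":
        # Remove the document if present.
        index_mirror.pop(doc_id, None)
    return index_mirror
```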
Hi, I have started working with Neo4j and the seraph and seraph-model packages.
A problem popped up:
I cannot seem to find a way to create a node connected to another node in one query, for example:
CREATE (n:User)-[r:Has]->(p:UserImage)
I know I can do that using the native seraph.query, but then I lose some of the model features (like the timestamps).
Is there some other way to do that?
How expensive is it to do that query in 3 steps (create the user, create the image, then link them)?
seraph-model is an extension of seraph. seraph.query is for raw Cypher queries; the model features are accessible only when you use modelName.Method.
I am fairly new to both MongoDB and Node.js, but I recently got everything to work well for me, until I reached the point where I needed to add full-text search to my website. From my research I figured out that Elasticsearch would be a good fit, but I couldn't figure out exactly how to get it to work with Node.js and MongoDB. I am currently using Heroku and MongoLab to host my application. Here are my questions.
How do I host Elasticsearch?
How do I make all my Mongo data available to Elasticsearch? Do I use a river, or do I manually insert and delete all data?
I found this river, but I am not quite sure how to make the syncing happen automatically or where to host it.
How do I query Elasticsearch from node.js? Is there a package that allows for this?
Edit:
Question 2 is really what I am struggling with. I have also included question 1 and 3 to help people that are new to the topic and coming from google.
1) Either on your own server/VM, or with a hosted service such as https://searchbox.io/
2) You can create a script to index your existing data and then index new data once it's created, or use a river to index your current database.
3) Elasticsearch exposes a simple HTTP API, so you can make your own requests using the 'http' module, or simplify things with something like https://github.com/mikeal/request
You can also use a 3rd-party library like https://github.com/phillro/node-elasticsearch-client
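Because the API is plain HTTP, any language can build the same request; as an illustration, here is a stdlib-only Python sketch of constructing a _search call (the host, index, and field names are made-up examples):

```python
import json
import urllib.request


def build_search_request(host, index, field, text):
    """Build a POST request for Elasticsearch's _search endpoint
    using a simple match query on one field."""
    body = {"query": {"match": {field: text}}}
    return urllib.request.Request(
        f"http://{host}/{index}/_search",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


# Sending it requires a running Elasticsearch node, e.g.:
# with urllib.request.urlopen(
#         build_search_request("localhost:9200", "my_index", "title", "auction")) as r:
#     hits = json.load(r)["hits"]["hits"]
```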
Searchly.com (aka SearchBox.io) introduced a new crawlers feature, which includes a MongoDB crawler.
It fetches data from a given collection and syncs it periodically to Elasticsearch. Check http://www.searchly.com/documentation/crawler-beta/
You can host it on your own server, use the AWS Elasticsearch Service, or use the Elastic Cloud service provided by Elastic.
Try any of the following three solutions:
i) Use the mongoosastic NPM package.
ii) Use mongo-connector.
iii) Write a Python script to index the data into Elasticsearch.
For querying from Node.js, use elasticsearch-js, the JavaScript client library.
Using the CouchDB river, it is possible to index CouchDB databases.
Is it also possible to index a CouchDB view with Elastic Search?
Not yet. See https://github.com/elasticsearch/elasticsearch-river-couchdb/pull/2
BTW, you can check out my pull request, build it, and start querying views...