Elasticsearch, MongoDB, Node.js for live search - node.js

I am using MongoDB (3.0.3), Elasticsearch (1.7.1), and Node.js.
In our database we have multiple collections such as tasks, users, jobs, events, etc. We want full-text search across multiple collections, but MongoDB does not currently provide that, so we use Elasticsearch for full-text search across multiple collections.
We want everything inserted into these MongoDB collections (jobs, events, users, etc.) to be saved into an Elasticsearch index automatically, so that it is available for searching almost instantly. Is there an npm module that can help with this, or any other ideas?
While researching this I found https://github.com/richardwilly98/elasticsearch-river-mongodb and https://github.com/mongoosastic/mongoosastic
Which one is better and easier to use? Or is there another solution?

Rivers have been deprecated:
https://www.elastic.co/blog/deprecating-rivers
If you're already using Mongoose, then Mongoosastic seems like a good choice. You should check whether its search features fulfill your needs.
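The mirroring idea itself is simple: every document written to MongoDB is also pushed into an Elasticsearch index. As a rough illustration of that idea (this is not mongoosastic's actual internals, and the index name is an assumption), here is how a saved document could be turned into an Elasticsearch bulk "index" action:

```javascript
// Hypothetical sketch: turn a saved MongoDB document into the two-line
// body of an Elasticsearch bulk "index" action. The index name "mydb"
// is an assumption for illustration.
function toBulkIndexAction(collection, doc) {
  // Elasticsearch keeps the id in the action metadata, not the source.
  const { _id, ...source } = doc;
  return [
    { index: { _index: 'mydb', _type: collection, _id: String(_id) } },
    source,
  ];
}

const [action, source] = toBulkIndexAction('jobs', {
  _id: 42,
  title: 'Node.js developer',
});
// action.index._index → 'mydb', action.index._id → '42'
// source keeps only { title: 'Node.js developer' }
```

A sync tool then ships batches of such actions to Elasticsearch's bulk endpoint whenever MongoDB writes happen; mongoosastic hooks this into Mongoose's save middleware for you.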

Related

How can I find empty Alfresco folders using a Lucene query

Using a Lucene query, I want to retrieve the list of folders under a specific node whose list of children is empty.
I created this query:
+PATH:"/app:company_home/cm:contexts/cm:ctx_exploitation/cm:runs/cm:Run_322645//."+Children is empty.
but it does not return the expected results.
What is the right Lucene syntax to do this?
There is no way to find empty folders using a Lucene query.
However, Alfresco provides Java services and JavaScript APIs,
such as FileFolderService in Java and childByNamePath in JavaScript;
using them you can write your own logic to find empty folders.
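Since "has no children" cannot be expressed in Lucene, the filtering has to happen in code. The traversal logic looks roughly like the sketch below, written against a plain-object tree ({ name, isContainer, children }) as a simplified stand-in for Alfresco's ScriptNode API, not the real API:

```javascript
// Collect the names of empty folders in a node tree. The node shape is a
// simplified stand-in for Alfresco's ScriptNode, not the real API.
function findEmptyFolders(node) {
  let empty = [];
  for (const child of node.children || []) {
    if (!child.isContainer) continue; // skip documents
    if ((child.children || []).length === 0) {
      empty.push(child.name);
    } else {
      empty = empty.concat(findEmptyFolders(child));
    }
  }
  return empty;
}

// Example tree (names are illustrative only):
const run = {
  name: 'Run_322645',
  isContainer: true,
  children: [
    { name: 'empty1', isContainer: true, children: [] },
    { name: 'full', isContainer: true,
      children: [{ name: 'empty2', isContainer: true, children: [] }] },
    { name: 'report.txt', isContainer: false },
  ],
};
// findEmptyFolders(run) → ['empty1', 'empty2']
```

In a real Alfresco web script you would obtain the starting node with companyhome.childByNamePath(...) and walk its children the same way.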
You can find 0-byte files using the Lucene query below:
TYPE:"cm:content" AND #cm:content.size:0

Creating a MongoDB 'find' query to search between a specific number of matching documents

I am using the MongoDB Node.js driver and need to search the database for a specific set of posts. However, if the match count gets too big, I want to fetch only, say, the 50th through 100th matching posts and return them to the client. Would searching for so many documents and returning them cause a performance issue? If so, what would be the proper query?
Sample using skip/limit:
// first page: matches 1-10
db.dummy.find().limit(10)
// second page: matches 11-20
db.dummy.find().skip(10).limit(10)
Take a look at the mongodb documentation: http://docs.mongodb.org/manual/reference/method/db.collection.find/
limit: http://docs.mongodb.org/manual/reference/method/cursor.limit/#cursor.limit
skip:
http://docs.mongodb.org/manual/reference/method/cursor.skip/#cursor.skip
You can use skip and limit:
skip the first 49 matches, then limit the result to 51; that returns matches 50 through 100.
Note that skip still walks over the skipped documents, so very large offsets get slow; for deep pagination, a range query on an indexed field scales better.
I hope this helps.
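A small sketch of the same idea with the Node.js driver; the range-to-skip/limit arithmetic is the only real logic here, and the collection name "posts" is an assumption:

```javascript
// Convert a 1-based inclusive match range (e.g. matches 50 through 100)
// into the skip/limit values MongoDB expects.
function rangeToSkipLimit(first, last) {
  return { skip: first - 1, limit: last - first + 1 };
}

// Usage with the MongoDB Node.js driver (collection name is an assumption):
async function fetchRange(db, first, last) {
  const { skip, limit } = rangeToSkipLimit(first, last);
  return db.collection('posts').find({}).skip(skip).limit(limit).toArray();
}

// rangeToSkipLimit(50, 100) → { skip: 49, limit: 51 }
```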

PouchDB get documents by ID with certain string in them

I would like to get all documents whose IDs contain a certain string, but I can't seem to find a solution for it.
For example, I have the following doc IDs:
vw_10
vw_11
bmw_12
vw_13
bmw_14
volvo_15
vw_16
How can I get allDocs with the string vw_ in them?
Use the batch fetch API:
db.allDocs({startkey: "vw_", endkey: "vw_\ufff0"})
Note: \ufff0 is a very high Unicode code point, conventionally used as a sentinel to specify the upper end of a range over ordered strings.
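The range trick works because document IDs are ordered as strings: every ID starting with vw_ sorts between "vw_" and "vw_\ufff0". Illustrated in plain JavaScript with the IDs from the question:

```javascript
// Plain-JS illustration of the startkey/endkey range used by allDocs:
// lexicographic ordering puts every "vw_"-prefixed id inside the range.
const ids = ['vw_10', 'vw_11', 'bmw_12', 'vw_13', 'bmw_14', 'volvo_15', 'vw_16'];

const matches = ids
  .filter((id) => id >= 'vw_' && id <= 'vw_\ufff0')
  .sort();
// matches → ['vw_10', 'vw_11', 'vw_13', 'vw_16']
```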
You can use the PouchDB find plugin, whose API is, IMO, far more sophisticated than allDocs for querying. The plugin has a $regex operator which lets you do exactly this:
db.find({selector: {_id: {$regex: '^vw_'}}});
It was in beta at the time of writing, but we are about to ship a production app with it; that's how stable it has been so far. See https://github.com/nolanlawson/pouchdb-find for more on pouchdb-find.
You'd better have a view whose key is the field you want to search; this ensures the key is indexed. Otherwise, the search might be too slow.

Drupal 7 GeoField Proximity using Search API Index

I've created a view that uses Search API integration and allows searching node fields. I have content related to ol_locator_location (Location), which is comprised of Address and GeoField. I have indexed the GeoField (all possible iterations, including WKT). I'd like to perform proximity (distance) searches against the indexed nodes, based on the WKT data that is available. The problem is that GeoField: Proximity doesn't seem to relate well.
I'm able to add the GeoField of the related nodes, and I can see this on the OpenLayers map, but I'm not offered any option for proximity searching. How can I get this working?
You need to use the search_api_location module. It adds new abilities to your geopoint filter, namely being able to specify a point and search for nearby (proximity) places within a given radius.
Alternatively, if you like to program, you can query the Solr search yourself and build the view you want using PHP or JavaScript.
http://wiki.apache.org/solr/SpatialSearch
http://docs.lucidworks.com/display/solr/Spatial+Search
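If you go the do-it-yourself route, the Solr side is a {!geofilt} filter query. A sketch of building such a request URL by hand (host, core, and field names are assumptions):

```javascript
// Build a Solr spatial-search URL. {!geofilt} keeps documents whose
// indexed point lies within d kilometres of pt. Host, core, and field
// names below are assumptions for illustration.
function solrGeoQuery(base, field, lat, lon, km) {
  const fq = `{!geofilt sfield=${field} pt=${lat},${lon} d=${km}}`;
  const params = new URLSearchParams({ q: '*:*', fq, wt: 'json' });
  return `${base}/select?${params}`;
}

const url = solrGeoQuery('http://localhost:8983/solr/collection1',
                         'geo_point', 45.15, -93.85, 5);
```

Fetching that URL returns the matching documents as JSON, which you can then render in your own view.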

Search over multicore index without same schema solr

I use Solr, and I would like to know whether I can search over multiple indexes using multicore. I know there is Distributed Search, but I think it only works for indexes that share the same schema.
Thanks.
Yes, you can search across cores with different schemas, but it will only search on the fields common to them. See the Solr documentation on Distributed Search for more information.
For further manipulation you can use SolrJ; if you are working within Solr only, read about the Core Admin Handler.
Let me know if you need any further help.
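For reference, Distributed Search is driven by the shards request parameter, which lists the cores to query; Solr merges the per-shard results. A sketch of such a request URL (host and core names are assumptions):

```javascript
// Sketch of a distributed-search request spanning two cores on one host.
// Solr merges the per-shard results, so the queried field should exist
// in both schemas. Host and core names are assumptions.
const url = 'http://localhost:8983/solr/core1/select'
  + '?q=' + encodeURIComponent('common_field:search_term')
  + '&shards=localhost:8983/solr/core1,localhost:8983/solr/core2'
  + '&wt=json';
```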
