I am trying to add copy_to to an existing index. For example:
PUT /my_index
{
  "mappings": {
    "my_type": {
      "properties": {
        "user_name": {
          "type": "string",
          "copy_to": ["key"],
          "index": "not_analyzed",
          "include_in_all": false
        },
        "key": {
          "type": "string",
          "store": "yes"
        }
      }
    }
  }
}
I already have data in user_name. When I update the mapping, will that existing data get copied to the copy_to (key) field?
When I index new documents, will only those new values get copied to the copy_to (key) field?
Or do I need to reindex for the changes to be reflected?
How can I update copy_to without reindexing the whole index, or can I update only specific user_name documents to get the changes reflected?
The mapping for a given field is "frozen" when that field is added to the index. To change it, you need to reindex the data.
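A minimal sketch of that reindex, assuming a version that has the Reindex API (2.3+; on older versions you would scan/scroll and bulk-index instead). The target index name my_index_v2 is an assumption: create it first with the mapping above (including copy_to), then copy the data across:

POST /_reindex
{
  "source": { "index": "my_index" },
  "dest": { "index": "my_index_v2" }
}

Because copy_to runs at index time, the reindex re-processes every user_name value on the way in, so key gets populated for the old documents as well; new documents indexed afterwards get it automatically.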
Related
I'm new to Elasticsearch. I'm re-indexing an old index into a new index, but while re-indexing millions of records I sometimes get a mapper_parsing_exception. So my question is: is there a way to set the ignore_malformed flag to true on an already existing index?
Yes, it's possible to change the ignore_malformed setting dynamically simply by running this:
PUT logstash_june_2019/doc/_mapping
{
  "properties": {
    "createdAt": {
      "type": "date",
      "ignore_malformed": true   <--- add this
    }
  }
}
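To verify that the change was applied (index name from the question):

GET logstash_june_2019/_mapping

ignore_malformed is one of the few mapping parameters that can be updated on an existing field. Note that it only affects documents indexed after the change, so records that already failed have to be sent again.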
I am using MongoDB as the DB in my Node.js application. My question is:
consider I have two collections, A and B.
B's document structure is
{
  "key1": "value1",
  "key2": [
    _id(1) from collection A,
    _id(2) from collection A,
    ...and so on
  ]
}
So when I retrieve a document from collection B, I also have to retrieve the details from collection A. That means looping over key2 and looking each entry up in collection A, which is tedious.
My question is whether it is better to store the details along with the document, like below:
{
  "key1": "value1",
  "key2": [
    { "keyA": "valueA" },
    { "keyA": "valueA" },
    ...and so on
  ]
}
Now it's just a single retrieve, with no for loop. Also, in the above case the user can update key2 (remove or add entries) from the frontend. In that case, is it good practice to delete the document and create a new one with the updated array?
Please share your ideas. Thanks in advance.
If you want simple updating, deleting, and retrieving, maybe you should think about a collection that connects A and B.
Let's say this collection's name is AB:
{
  keyAB: "keyAB",
  keyA: "valueA",
  keyB: "keyB"
}
In this case, you would add a new document for every connection. Retrieving, deleting, and updating are easy here, as you only ever touch one small document.
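For example, with a recent Node.js driver (collection and variable names are illustrative):

// link one A document to one B document
db.collection('AB').insertOne({ keyA: aId, keyB: bId });
// unlinking is a single-document delete, with no array surgery
db.collection('AB').deleteOne({ keyA: aId, keyB: bId });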
If you really don't want a third collection, then you could do:
{
  "key1": "value1",
  "key2": {
    "keyA1": "valueA1",
    "keyA2": "valueA2",
    ...
  }
}
In that case, to retrieve a value you would just read key2["keyA1"], and to delete one you would remove key2["keyA1"].
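A sketch of those operations with the Node.js driver, using the document and key names from above (update operators change just the one entry, so there is no need to delete and recreate the document):

// add or change one entry in the embedded key2 map
db.collection('B').updateOne({ key1: 'value1' }, { $set: { 'key2.keyA1': 'valueA1' } });
// remove one entry in place
db.collection('B').updateOne({ key1: 'value1' }, { $unset: { 'key2.keyA1': '' } });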
How can I add a new row with the update operation?
I am using the following code:
statuscollection.update({
id: record.id
}, {
id: record.id,
ip: value
}, {
upsert: true
}, function (err, result) {
console.log(err);
if (!err) {
return context.sendJson([], 404);
}
});
On the first call this adds a row with id: record.id. On later calls (id: value, then id: ggh, and so on) I need another new row to be added.
How can I add a new row on every call of this function, one per document I need to insert?
From the structure of your code, you are probably missing a few concepts.
You are using update in a case where you probably do not need to.
You seem to be providing an id field, whereas the primary key in MongoDB is _id, if that is what you mean.
If you intend to add a new document on every call, then you should probably be using insert. Your use of update with upsert is meant for matching a document against the query criteria: if the document exists, the specified fields are updated; if not, a new document is inserted with those fields.
Unless that actually is your goal, insert is almost certainly what you need. In that case you would typically rely on the _id value being populated automatically, or supply your own unique value. Unless you specifically want another, non-unique field as an identifier, you will likely want to use the _id field as described.
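A minimal sketch of both variants in the callback style of the question (statuscollection, record, and value come from the question; the $set form is an assumption about the intended update):

// insert: one new document per call; MongoDB fills in _id automatically
statuscollection.insert({ id: record.id, ip: value }, function (err, result) {
  if (err) return console.log(err);
  // ... handle success ...
});

// or keep the upsert if you want exactly one document per id, updated in place
statuscollection.update({ _id: record.id }, { $set: { ip: value } }, { upsert: true }, function (err, result) {
  // ... handle result ...
});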
I want to query CouchDB and I have a specific need: my query should return the name field of documents matching this condition: the id is equal to or contained in a document field (a list).
For example, the output field looks like the following:
"output": [
"doc_s100",
"doc_s101",
"doc_s102",
"doc_s103",
],
I want to get all the documents having, for example, "doc_s102" in their output field.
I wrote a view in a design document:
"backward_by_docid": {
"map": "function(doc) {if(doc.output) emit(doc.output, doc.name)}"
}
but this view only works when the output field contains a single value.
How can I write this query?
Thanks!
You have to iterate over the array:
if (doc.output) {
  for (var i = 0; i < doc.output.length; i++) {
    emit(doc.output[i], doc.name);
  }
}
Make sure that output is always an array (at least []).
...and, of course, query with key="xxx" instead of key=["xxx"].
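For example, to fetch the names of all documents whose output contains "doc_s102" (the design document name mydesign is an assumption):

GET /db/_design/mydesign/_view/backward_by_docid?key="doc_s102"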
I have 10,000+ CouchDB documents, each with a (simplified) format like:
{
"First Name" : "John",
"Last Name" : "Doe"
}
I want to add another field to this document, e-mail, so that the document now looks like:
{
"First Name" : "John",
"Last Name" : "Doe",
"e-mail" : ""
}
I understand that I can easily update a single document by inserting new JSON in the new format.
But my question is: how can I add the new field automatically to all 10,000+ docs that already exist in the DB? Do I need to write my own script that reads each doc and updates them one by one, or is there a simpler way?
If you use views to access your data, you can modify the view without having to modify the documents. Just emit an email value with a default of "".
Assuming the above is no good, use a view to show you which documents need upgrading.
function(doc) {
  // views.email_upgrade.map
  if (!('e-mail' in doc)) {
    var key = [doc["Last Name"], doc["First Name"]];
    emit(key, {_id: doc._id, _rev: doc._rev});
  }
}
Query /db/_design/foo/_view/email_upgrade?include_docs=true. You can add a &limit=N parameter to process the documents in batches. The doc value in each row is a document that needs the upgrade; add the e-mail field and send them back with POST /db/_bulk_docs. Loop until the view returns 0 rows. Once you have 0 rows, add a check to your validate_doc_update function.
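A rough sketch of that loop in Node.js (18+, for the built-in fetch); the db database name and foo design document follow the paths above, and authentication is omitted:

const BASE = 'http://localhost:5984/db';

// Pull one batch of docs that still lack "e-mail" and write them back with the field added.
async function upgradeBatch(limit) {
  const res = await fetch(BASE + '/_design/foo/_view/email_upgrade?include_docs=true&limit=' + limit);
  const { rows } = await res.json();
  if (rows.length === 0) return 0;          // view is empty: every doc is upgraded
  const docs = rows.map(function (row) {
    row.doc['e-mail'] = '';                 // add the new field with its default
    return row.doc;                         // _id and _rev are already in the doc
  });
  await fetch(BASE + '/_bulk_docs', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ docs: docs })
  });
  return rows.length;
}

(async function () {
  while (await upgradeBatch(100) > 0) { /* repeat until the view returns 0 rows */ }
})();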