Unable to get new selected value in Jest

I have a test:
it('Can render FormComponent with select and select an option', () => {
  render(<FormComponent
    field={fields[0].fields[1]}
    currentItem={{}}
  />);
  const blankSelect = screen.getByText(/scegli/i);
  expect(blankSelect).toBeInTheDocument();
  userEvent.selectOptions(
    screen.getByLabelText("bar"), ["zulu"]
  );
  console.log(screen.getByRole('option', { name: /scegli/i }).selected);
  console.log(screen.getByRole('option', { name: /zulu/i }).selected);
  console.log(screen.getByRole('option', { name: /qwerty/i }).selected);
  //expect(screen.getByRole('option', { name: "zulu" }).selected).toBe(true);
  //expect(screen.getByRole('option', { name: /qwerty/i }).selected).toBe(false);
  //expect(screen.getByRole('option', { name: /scegli/i }).selected).toBe(false);
});
Basically, as far as Jest is concerned the selected option is still "scegli", the blank one.
I tried adding async / await without luck, and passing the value both with and without square brackets.
This is the field definition that generates the option keys / values:
{
  name: "bar",
  type: "select",
  options: [
    { key: "1", value: "zulu" },
    { key: "2", value: "qwerty" }
  ]
},
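For reference, a minimal sketch of how this interaction is usually written with @testing-library/user-event v14, where the interaction methods return promises and must be awaited. Passing the option element itself side-steps any ambiguity between the option's value ("1") and its text ("zulu"); the component props are copied from the snippet above:
it('Can select an option', async () => {
  const user = userEvent.setup(); // v14 API
  render(<FormComponent field={fields[0].fields[1]} currentItem={{}} />);
  // select via the <option> element rather than a value/text string
  await user.selectOptions(
    screen.getByLabelText("bar"),
    screen.getByRole('option', { name: /zulu/i })
  );
  expect(screen.getByRole('option', { name: /zulu/i }).selected).toBe(true);
});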


Sequelize - trying to make models dynamic

I've been trying to automate the creation of my Sequelize models by creating them from a generic model that I can pass definitions into, rather than creating a model file specifically for each one.
I have an array of model definitions which looks something like this:
const modelDefinitions = [
  {
    name: "User",
    fieldDefinitions: [
      { name: "first_name", label: "First Name", column_type: Sequelize.DataTypes.STRING },
      { name: "last_name", label: "Last Name", column_type: Sequelize.DataTypes.STRING },
      { name: "email", label: "Email", column_type: Sequelize.DataTypes.STRING },
      { name: "password", label: "Password", restricted: true, column_type: Sequelize.DataTypes.STRING },
    ],
  },
  {
    name: "Audit",
    fieldDefinitions: [
      { name: "ref", column_type: Sequelize.DataTypes.STRING, label: "Audit Ref" },
      { name: "result", column_type: Sequelize.DataTypes.STRING, label: "Result" },
      { name: "auditor_id", column_type: Sequelize.DataTypes.INTEGER, label: "Auditor" },
    ],
  },
];
When my array of models contains just one model it works perfectly fine, but when I have multiple, the GenericModel of the previously defined models is then "changed" to be the last one in the list that was initialised.
I'm new to Node, so I think I'm either missing something or there's some sort of model caching happening, meaning that all instances of GenericModel become whatever it was initialised as last.
Please see my example below (the commented-out code is what I used to use to define the models, and the reduce is my new way of defining them):
// {
//   User: User.init(sequelize, modelDef),
//   Article: Article.init(sequelize, modelDef),
//   Audit: Audit.init(sequelize, modelDef),
//   Form: Form.init(sequelize, modelDef),
// };
const models = modelDefinitions.reduce((acc, modelDef) => {
  return { ...acc, [modelDef.name]: GenericModel.init(sequelize, modelDef) };
}, {});
console.log({ models });
My console.log() returns the following (notice both are Group):
{
  models: {
    User: Group,
    Group: Group
  }
}
As you can see, whatever the last model is defined as, the previous ones inherit that instead of keeping what I defined them as originally.
But what I actually want is:
{
  models: {
    User: User,
    Group: Group
  }
}
If my list only had User in it, it works fine.
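A likely explanation (my reading of the Sequelize API, not something stated in the original post): Model.init() configures the class it is called on and returns that same class, so every GenericModel.init(...) call rewires the one shared GenericModel class, and every entry in the accumulator ends up referencing it. A minimal sketch of the effect, using Sequelize's standard init(attributes, options) signature with hypothetical attrsA / attrsB attribute objects:
const a = GenericModel.init(attrsA, { sequelize, modelName: "User" });
const b = GenericModel.init(attrsB, { sequelize, modelName: "Audit" });
console.log(a === b);            // true: init() returns the class it was called on
console.log(b === GenericModel); // true: both keys reference the same class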
I managed to get this working in the end.
I think my issue was that my GenericModel was treated as a singleton, so to get around this I changed GenericModel from extending Sequelize.Model and instead made a new class with a constructor to consume my arguments, then created a method on the new class to return the Sequelize model.
The main change is that instead of defining the models with GenericModel.init(), I define them by calling sequelize.define(modelName, attributes, options).
So my reduce now looks like this:
const models = modelDefinitions.reduce((acc, modelDef) => {
  return { ...acc, [modelDef.name]: new GenericModel(sequelize, modelDef).getDBModel() };
}, {});
and my class:
class GenericModel {
  constructor(sequelize, modelDef) {
    this.sequelize = sequelize;
    this.modelDef = modelDef;
    this.modelName = modelDef?.name;
    this.definitions = modelDef?.fieldDefinitions;
    this.restrictedFieldList = this.definitions
      .filter((field) => field?.restricted)
      .map((definition) => definition.name);
  }

  getDBModel() {
    // sequelize.define(modelName, attributes, options) creates a fresh model
    // class on each call, so every definition gets its own model.
    const model = this.sequelize.define(
      this.modelName,
      this.definitions.reduce((acc, definition) => {
        return { ...acc, [definition.name]: definition.column_type };
      }, {}),
      {
        defaultScope: {
          attributes: {
            exclude: this.restrictedFieldList,
          },
        },
      }
    );
    return model;
  }
}
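A quick usage sketch (hypothetical, assuming sequelize is an initialised Sequelize instance) to confirm the models now stay distinct:
const { User, Audit } = models;
console.log(User === Audit); // false: each define() call returned its own model class
await sequelize.sync();      // creates a table per model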

How to insert multiple JSON documents in Elasticsearch

Input Data
[{
  "_index": "abc",
  "_type": "_doc",
  "_id": "QAE",
  "_score": 6.514091,
  "_source": {
    "category": "fruits",
    "action": "eating",
    "metainfo": {
      "hash": "nzUZ1ONm0e167p"
    },
    "createddate": "2019-10-03T12:37:45.297Z"
  }
},
{
  "_index": "abc",
  "_type": "_doc",
  "_id": "PQR",
  "_score": 6.514091,
  "_source": {
    "category": "Vegetables",
    "action": "eating",
    "metainfo": {
      "hash": "nzUZ1ONm0e167p"
    },
    "createddate": "2019-10-03T12:37:45.297Z"
  }
},
...
]
I have around 30,000 records as input data. How can I insert this data in a single query? I tried:
var elasticsearch = require('elasticsearch');
var client = new elasticsearch.Client({
  host: '********',
  log: 'trace'
});
client.index({
  index: "abc",
  body: ****input data*****
}).then((res) => {
  console.log(res);
}, (err) => {
  console.log("err", err);
});
In this code I send the input data in the body, but it returns an error. Please suggest a fix.
This seems like what you are looking for:
'use strict'

require('array.prototype.flatmap').shim()
const { Client } = require('@elastic/elasticsearch')
const client = new Client({
  node: 'http://localhost:9200'
})

async function run () {
  await client.indices.create({
    index: 'tweets',
    body: {
      mappings: {
        properties: {
          id: { type: 'integer' },
          text: { type: 'text' },
          user: { type: 'keyword' },
          time: { type: 'date' }
        }
      }
    }
  }, { ignore: [400] })

  const dataset = [{
    id: 1,
    text: 'If I fall, don\'t bring me back.',
    user: 'jon',
    date: new Date()
  }, {
    id: 2,
    text: 'Winter is coming',
    user: 'ned',
    date: new Date()
  }, {
    id: 3,
    text: 'A Lannister always pays his debts.',
    user: 'tyrion',
    date: new Date()
  }, {
    id: 4,
    text: 'I am the blood of the dragon.',
    user: 'daenerys',
    date: new Date()
  }, {
    id: 5, // change this value to a string to see the bulk response with errors
    text: 'A girl is Arya Stark of Winterfell. And I\'m going home.',
    user: 'arya',
    date: new Date()
  }]

  // The major part is below:
  const body = dataset.flatMap(doc => [{ index: { _index: 'tweets' } }, doc])
  const { body: bulkResponse } = await client.bulk({ refresh: true, body })

  if (bulkResponse.errors) {
    const erroredDocuments = []
    // The items array has the same order as the dataset we just indexed.
    // The presence of the `error` key indicates that the operation
    // for that document has failed.
    bulkResponse.items.forEach((action, i) => {
      const operation = Object.keys(action)[0]
      if (action[operation].error) {
        erroredDocuments.push({
          // If the status is 429 you can retry the document; otherwise it's
          // very likely a mapping error, and you should fix the document
          // before trying again.
          status: action[operation].status,
          error: action[operation].error,
          operation: body[i * 2],
          document: body[i * 2 + 1]
        })
      }
    })
    console.log(erroredDocuments)
  }

  const { body: count } = await client.count({ index: 'tweets' })
  console.log(count)
}

run().catch(console.log)
Reference link: https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/bulk_examples.html
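As a side note (not part of the original answer): recent versions of the @elastic/elasticsearch client (7.7+) also ship a bulk helper that builds the action/document pairs and retries rejected items for you. A minimal sketch, assuming the same dataset shape as above:
const { Client } = require('@elastic/elasticsearch')
const client = new Client({ node: 'http://localhost:9200' })

async function bulkInsert (dataset) {
  // The helper batches the documents and automatically retries items
  // rejected with a 429 status.
  const result = await client.helpers.bulk({
    datasource: dataset,
    onDocument (doc) {
      // return the action line to pair with each document
      return { index: { _index: 'tweets' } }
    }
  })
  console.log(result) // stats: total, successful, failed, time, bytes
}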

Export To Excel filtered data with Free jqgrid 4.15.4 in MVC

I have a question regarding Export to Excel in free jqGrid 4.15.4. I want to know how to use this resultset {"groupOp":"AND","rules":[{"field":"FirstName","op":"eq","data":"Amit"}]} in my business logic method.
Just for more clarification: I'm using OfficeOpenXml, and if I don't use the filtered resultset (above) it works fine and I'm able to download a file with the full set of records in an Excel sheet. But I'm not sure what to do with, or how to utilise, the resultset {"groupOp":"AND","rules":[{"field":"FirstName","op":"eq","data":"Amit"}]}
If required I can share my controller and BL code.
I have added a fiddle which shows the implementation of the Export to Excel button in the jqGrid pager.
Before coming here, I read and tried to understand the following questions:
1] jqgrid, export to excel (with current filter post data) in an asp.net-mvc site
2] Export jqgrid filtered data as excel or CSV
Here is the code:
$(function () {
    "use strict";
    var mydata = [
        { id: "10", FirstName: "test", LastName: "TNT", Gender: "Male" },
        { id: "11", FirstName: "test2", LastName: "ADXC", Gender: "Male" },
        { id: "12", FirstName: "test3", LastName: "SDR", Gender: "Female" },
        { id: "13", FirstName: "test4", LastName: "234", Gender: "Male" },
        { id: "14", FirstName: "test5", LastName: "DAS", Gender: "Male" }
    ];
    var lastsel; // declared up front so onSelectRow can track the edited row

    $("#list").jqGrid({
        data: mydata,
        colNames: ['Id', 'First Name', 'Last Name', 'Gender'],
        colModel: [
            {
                label: "Id",
                name: 'Id',
                hidden: true,
                search: false
            },
            {
                label: "FirstName",
                name: 'FirstName',
                searchoptions: {
                    searchOperators: true,
                    sopt: ['eq', 'ne', 'lt', 'le', 'ni', 'ew', 'en', 'cn', 'nc']
                },
                search: true
            },
            {
                label: "LastName",
                name: 'LastName',
                searchoptions: {
                    searchOperators: true,
                    sopt: ['eq', 'ne', 'lt', 'ni', 'ew', 'en', 'cn', 'nc']
                },
                search: true
            },
            {
                label: "Gender",
                name: 'Gender',
                search: true,
                edittype: 'select',
                editoptions: { value: 'Male:Male;Female:Female' },
                stype: 'select'
            }
        ],
        onSelectRow: function (id) {
            if (id && id !== lastsel) {
                jQuery('#list').restoreRow(lastsel);
                jQuery('#list').editRow(id, true);
                lastsel = id;
            }
        },
        loadComplete: function (id) {
            if ($('#list').getGridParam('records') === 0) {
                //$('#grid tbody').html("<div style='padding:6px;background:#D8D8D8;'>No records found</div>");
            }
            else {
                var lastsel = 0;
                if (id && id !== lastsel) {
                    jQuery('#list').restoreRow(lastsel);
                    jQuery('#list').editRow(id, true);
                    lastsel = id;
                }
            }
        },
        loadonce: true,
        viewrecords: true,
        gridview: true,
        width: 'auto',
        height: '150px',
        emptyrecords: "No records to display",
        iconSet: 'fontAwesome',
        pager: true,
        jsonReader: {
            root: "rows",
            page: "page",
            total: "total",
            records: "records",
            repeatitems: false,
            Id: "Id"
        }
    });

    jQuery("#list").jqGrid("navButtonAdd", {
        caption: "",
        buttonicon: "fa-table",
        title: "Export To Excel",
        onClickButton: function (e) {
            var projectId = null;
            // the grid id is #list (the original snippet queried #grid here)
            var isFilterAreUsed = $('#list').jqGrid('getGridParam', 'search'),
                filters = $('#list').jqGrid('getGridParam', 'postData').filters,
                totalRecordsCount = $('#list').jqGrid('getGridParam', 'records');
            var Urls = "/UsersView/ExportToExcel_xlsxFormat?filters=" + encodeURIComponent(filters);
            if (totalRecordsCount > 0) {
                $.ajax({
                    url: Urls,
                    type: "POST",
                    //contentType: "application/json; charset=utf-8",
                    data: { "searchcriteria": filters, "projectId": projectId, "PageName": "MajorsView" },
                    //datatype: "json",
                    success: function (data) {
                        if (true) {
                            window.location = '/UsersView/SentFiletoClientMachine?file=' + data.filename;
                        }
                        else {
                            $("#resultDiv").html(data.errorMessage);
                            $("#resultDiv").addClass("text-danger");
                        }
                    },
                    error: function (ex) {
                        common.handleAjaxError(ex.status);
                    }
                });
            }
            else {
                bootbox.alert("There are no rows to export in the Participant List");
                if (dialog) {
                    dialog.modal('hide');
                }
            }
        }
    });
});
https://jsfiddle.net/ap43xecs/10/
There exist many options to solve the problem. The simplest one consists of sending the ids of the filtered rows to the server instead of sending the filters parameter. Free jqGrid supports the lastSelectedData parameter, and thus you can use $('#list').jqGrid('getGridParam', 'lastSelectedData') to get the array of items sorted and filtered according to the current filter and sorting criteria. Every item of the returned array should contain an Id (or id) property, which you can use on the server side to filter the data before exporting.
The second option would be to implement server-side filtering based on the filters parameter which you currently send to the server. The old answer (see FilterObjectSet) provides an example of filtering in the case of Entity Framework. By the way, that answer and another one contain code which I used for exporting grid data to Excel using the Open XML SDK. You can compare it with your existing code.
In some situations it could be interesting to export grid data to Excel without writing any server code. The corresponding demo can be found in the issue and the UPDATED part of the answer.
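For illustration, a minimal sketch of the first option applied to the grid above (the action URLs are taken from the question; the ids parameter name is hypothetical and must match your controller):
var filteredRows = $("#list").jqGrid("getGridParam", "lastSelectedData") || [];
var ids = filteredRows.map(function (row) { return row.Id; });
$.ajax({
    url: "/UsersView/ExportToExcel_xlsxFormat",
    type: "POST",
    traditional: true, // serialise the array as ids=..&ids=.. for MVC model binding
    data: { ids: ids },
    success: function (data) {
        window.location = '/UsersView/SentFiletoClientMachine?file=' + data.filename;
    }
});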

Add unique value to every element in array

I'm fairly new to MongoDB and I'm trying to merge an embedded array in a MongoDB collection. My schema for my Project collection is as follows:
Projects:
{
  _id: ObjectId(),
  client_id: String,
  description: String,
  samples: [
    {
      location: String, // Unique
      name: String,
    }
    ...
  ]
}
A user can upload a JSON file that is in the form of:
[
  {
    location: String, // Same location as in the schema above
    concentration: float
  }
  ...
]
The length of the samples array is the same as the length of the uploaded data array. I'm trying to figure out how to add the data fields into every element of my samples array, but I can't find out how to do it from the MongoDB documentation. I can load my JSON data in as "data", and I want to merge based on the common "location" field:
db.projects.update({ _id: myId }, { $set: { "samples.$[].data": data[location] } });
But I can't think of how to get the index into the JSON array in the update query, and I haven't been able to find any examples in the MongoDB documentation or in questions like this one.
Any help would be much appreciated!
MongoDB 3.6 Positional Filtered Updates
So you're actually in the right "ballpark" with the all positional $[] operator, but the problem is that it simply applies to "every" array element. Since what you want is "matched" entries, you actually want the positional filtered $[<identifier>] operator instead.
As you note, your "location" is going to be unique within the array. Using "index positions" is really not reliable for atomic updates, but matching the "unique" properties is. Basically you need to get from something like this:
let input = [
  { location: "A", concentration: 3, other: "c" },
  { location: "C", concentration: 4, other: "a" }
];
To this:
{
  "$set": {
    "samples.$[l0].concentration": 3,
    "samples.$[l0].other": "c",
    "samples.$[l1].concentration": 4,
    "samples.$[l1].other": "a"
  },
  "arrayFilters": [
    { "l0.location": "A" },
    { "l1.location": "C" }
  ]
}
And that really is just a matter of applying some basic functions to the provided input array:
let arrayFilters = input.map(({ location }, i) => ({ [`l${i}.location`]: location }));

let $set = input.reduce((o, { location, ...e }, i) =>
  ({
    ...o,
    ...Object.entries(e).reduce((oe, [k, v]) => ({ ...oe, [`samples.$[l${i}].${k}`]: v }), {})
  }),
  {}
);

log({ $set, arrayFilters });
The Array.map() simply takes the values of the input and creates a list of identifiers to match the location values within arrayFilters. The construction of the $set statement uses Array.reduce() at two levels: an outer pass merging keys for each array element processed, and an inner pass for each key present in that array element, after removing location from consideration since it is not being updated.
Alternatively, loop with for..of:
let arrayFilters = [];
let $set = {};

for (let [i, { location, ...e }] of Object.entries(input)) {
  arrayFilters.push({ [`l${i}.location`]: location });
  for (let [k, v] of Object.entries(e)) {
    $set[`samples.$[l${i}].${k}`] = v;
  }
}
Note we use Object.entries() here as well as the "object spread" ... in construction. If you find yourself in a JavaScript environment without this support, then Object.keys() and Object.assign() are basically drop-in replacements with little change.
Then those can actually be applied within an update as in:
Project.update({ client_id: 'ClientA' }, { $set }, { arrayFilters });
So the positional filtered $[<identifier>] is actually used here to create "matching pairs" of entries within the $set modifier and within the arrayFilters option of the update(). So for each "location" we create an identifier that matches that value within the arrayFilters and then use that same identifier within the actual $set statement in order to just update the array entry which matches the condition for the identifier.
The only real rule with "identifiers" is that they cannot start with a number; they "should" be unique, but it's not a rule and you simply get the first match anyway. The updates then only touch those entries which actually match the condition.
Earlier MongoDB fixed Indexes
Failing support for that, you are basically falling back to "index positions", and that's really not that reliable. More often than not you will actually need to read each document and determine what is in the array already before even updating. But with at least presumed "parity", where index positions are in place, then:
let input = [
  { location: "A", concentration: 3 },
  { location: "B", concentration: 5 },
  { location: "C", concentration: 4 }
];

let $set = input.reduce((o, e, i) =>
  ({ ...o, [`samples.${i}.concentration`]: e.concentration }), {}
);

log({ $set });
Producing an update statement like:
{
  "$set": {
    "samples.0.concentration": 3,
    "samples.1.concentration": 5,
    "samples.2.concentration": 4
  }
}
Or without the parity:
let input = [
  { location: "A", concentration: 3, other: "c" },
  { location: "C", concentration: 4, other: "a" }
];

// Need to get the document to compare without parity
let doc = await Project.findOne({ "client_id": "ClientA" });

let $set = input.reduce((o, e, i) =>
  ({
    ...o,
    ...Object.entries(e).filter(([k, v]) => k !== "location")
      .reduce((oe, [k, v]) =>
        ({
          ...oe,
          [`samples.${doc.samples.map(c => c.location).indexOf(e.location)}`
            + `.${k}`]: v
        }),
        {}
      )
  }),
  {}
);

log({ $set });

await Project.update({ client_id: 'ClientA' }, { $set });
await Project.update({ client_id: 'ClientA' },{ $set });
Producing the statement matching on the indexes (after you actually read the document):
{
  "$set": {
    "samples.0.concentration": 3,
    "samples.0.other": "c",
    "samples.2.concentration": 4,
    "samples.2.other": "a"
  }
}
Noting of course that for each "update set" you really don't have any other option than to read from the document first to determine which indexes you will update. This is generally not a good idea: aside from the overhead of needing to read each document before a write, there is no absolute guarantee that the array itself remains unchanged by other processes between the read and the write, so using a "hard index" presumes that everything is still the same, which may not actually be the case.
Earlier MongoDB positional matches
Where the data permits it, it's generally better to cycle standard positional matched $ updates instead. Here location is indeed unique, so it's a good candidate, and most importantly you do not need to read the existing documents to compare arrays for indexes:
let input = [
  { location: "A", concentration: 3, other: "c" },
  { location: "C", concentration: 4, other: "a" }
];

let batch = input.map(({ location, ...e }) =>
  ({
    updateOne: {
      filter: { client_id: "ClientA", 'samples.location': location },
      update: {
        $set: Object.entries(e)
          .reduce((oe, [k, v]) => ({ ...oe, [`samples.$.${k}`]: v }), {})
      }
    }
  })
);

log({ batch });

await Project.bulkWrite(batch);
A bulkWrite() sends multiple update operations, but it does so with a single request and response, just like any other update operation. Indeed, if you are processing a "list of changes", then returning the document for comparison of each and constructing one big bulkWrite() is the direction to go in instead of individual writes, and that actually applies to all the previous examples as well.
The big difference is "one update instruction per array element" in the change set. This is the safe way to do things in releases without "positional filtered" support, even if it means more write operations.
Demonstration
A full listing in demonstration follows. Note I'm using "mongoose" here for simplicity, but there is nothing really "mongoose specific" about the actual updates themselves. The same applies to any implementation, and in particular to the JavaScript examples of using Array.map() and Array.reduce() to process the list for construction.
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';

mongoose.Promise = global.Promise;
mongoose.set('debug', true);

const sampleSchema = new Schema({
  location: String,
  name: String,
  concentration: Number,
  other: String
});

const projectSchema = new Schema({
  client_id: String,
  description: String,
  samples: [sampleSchema]
});

const Project = mongoose.model('Project', projectSchema);

const log = data => console.log(JSON.stringify(data, undefined, 2));

(async function() {
  try {
    const conn = await mongoose.connect(uri);

    await Promise.all(Object.entries(conn.models).map(([k, m]) => m.remove()));

    await Project.create({
      client_id: "ClientA",
      description: "A Client",
      samples: [
        { location: "A", name: "Location A" },
        { location: "B", name: "Location B" },
        { location: "C", name: "Location C" }
      ]
    });

    let input = [
      { location: "A", concentration: 3, other: "c" },
      { location: "C", concentration: 4, other: "a" }
    ];

    let arrayFilters = input.map(({ location }, i) => ({ [`l${i}.location`]: location }));

    let $set = input.reduce((o, { location, ...e }, i) =>
      ({
        ...o,
        ...Object.entries(e).reduce((oe, [k, v]) => ({ ...oe, [`samples.$[l${i}].${k}`]: v }), {})
      }),
      {}
    );

    log({ $set, arrayFilters });

    await Project.update(
      { client_id: 'ClientA' },
      { $set },
      { arrayFilters }
    );

    let project = await Project.findOne();
    log(project);

    mongoose.disconnect();
  } catch (e) {
    console.error(e);
  } finally {
    process.exit();
  }
})()
And the output, for those who cannot be bothered to run it, shows the matching array elements updated:
Mongoose: projects.remove({}, {})
Mongoose: projects.insertOne({ _id: ObjectId("5b1778605c59470ecaf10fac"), client_id: 'ClientA', description: 'A Client', samples: [ { _id: ObjectId("5b1778605c59470ecaf10faf"), location: 'A', name: 'Location A' }, { _id: ObjectId("5b1778605c59470ecaf10fae"), location: 'B', name: 'Location B' }, { _id: ObjectId("5b1778605c59470ecaf10fad"), location: 'C', name: 'Location C' } ], __v: 0 })
{
  "$set": {
    "samples.$[l0].concentration": 3,
    "samples.$[l0].other": "c",
    "samples.$[l1].concentration": 4,
    "samples.$[l1].other": "a"
  },
  "arrayFilters": [
    { "l0.location": "A" },
    { "l1.location": "C" }
  ]
}
Mongoose: projects.update({ client_id: 'ClientA' }, { '$set': { 'samples.$[l0].concentration': 3, 'samples.$[l0].other': 'c', 'samples.$[l1].concentration': 4, 'samples.$[l1].other': 'a' } }, { arrayFilters: [ { 'l0.location': 'A' }, { 'l1.location': 'C' } ] })
Mongoose: projects.findOne({}, { fields: {} })
{
  "_id": "5b1778605c59470ecaf10fac",
  "client_id": "ClientA",
  "description": "A Client",
  "samples": [
    {
      "_id": "5b1778605c59470ecaf10faf",
      "location": "A",
      "name": "Location A",
      "concentration": 3,
      "other": "c"
    },
    {
      "_id": "5b1778605c59470ecaf10fae",
      "location": "B",
      "name": "Location B"
    },
    {
      "_id": "5b1778605c59470ecaf10fad",
      "location": "C",
      "name": "Location C",
      "concentration": 4,
      "other": "a"
    }
  ],
  "__v": 0
}
Or by hard index:
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';

mongoose.Promise = global.Promise;
mongoose.set('debug', true);

const sampleSchema = new Schema({
  location: String,
  name: String,
  concentration: Number,
  other: String
});

const projectSchema = new Schema({
  client_id: String,
  description: String,
  samples: [sampleSchema]
});

const Project = mongoose.model('Project', projectSchema);

const log = data => console.log(JSON.stringify(data, undefined, 2));

(async function() {
  try {
    const conn = await mongoose.connect(uri);

    await Promise.all(Object.entries(conn.models).map(([k, m]) => m.remove()));

    await Project.create({
      client_id: "ClientA",
      description: "A Client",
      samples: [
        { location: "A", name: "Location A" },
        { location: "B", name: "Location B" },
        { location: "C", name: "Location C" }
      ]
    });

    let input = [
      { location: "A", concentration: 3, other: "c" },
      { location: "C", concentration: 4, other: "a" }
    ];

    // Need to get the document to compare without parity
    let doc = await Project.findOne({ "client_id": "ClientA" });

    let $set = input.reduce((o, e, i) =>
      ({
        ...o,
        ...Object.entries(e).filter(([k, v]) => k !== "location")
          .reduce((oe, [k, v]) =>
            ({
              ...oe,
              [`samples.${doc.samples.map(c => c.location).indexOf(e.location)}`
                + `.${k}`]: v
            }),
            {}
          )
      }),
      {}
    );

    log({ $set });

    await Project.update(
      { client_id: 'ClientA' },
      { $set },
    );

    let project = await Project.findOne();
    log(project);

    mongoose.disconnect();
  } catch (e) {
    console.error(e);
  } finally {
    process.exit();
  }
})()
And the output:
Mongoose: projects.remove({}, {})
Mongoose: projects.insertOne({ _id: ObjectId("5b1778e0f7be250f2b7c3fc8"), client_id: 'ClientA', description: 'A Client', samples: [ { _id: ObjectId("5b1778e0f7be250f2b7c3fcb"), location: 'A', name: 'Location A' }, { _id: ObjectId("5b1778e0f7be250f2b7c3fca"), location: 'B', name: 'Location B' }, { _id: ObjectId("5b1778e0f7be250f2b7c3fc9"), location: 'C', name: 'Location C' } ], __v: 0 })
Mongoose: projects.findOne({ client_id: 'ClientA' }, { fields: {} })
{
  "$set": {
    "samples.0.concentration": 3,
    "samples.0.other": "c",
    "samples.2.concentration": 4,
    "samples.2.other": "a"
  }
}
Mongoose: projects.update({ client_id: 'ClientA' }, { '$set': { 'samples.0.concentration': 3, 'samples.0.other': 'c', 'samples.2.concentration': 4, 'samples.2.other': 'a' } }, {})
Mongoose: projects.findOne({}, { fields: {} })
{
  "_id": "5b1778e0f7be250f2b7c3fc8",
  "client_id": "ClientA",
  "description": "A Client",
  "samples": [
    {
      "_id": "5b1778e0f7be250f2b7c3fcb",
      "location": "A",
      "name": "Location A",
      "concentration": 3,
      "other": "c"
    },
    {
      "_id": "5b1778e0f7be250f2b7c3fca",
      "location": "B",
      "name": "Location B"
    },
    {
      "_id": "5b1778e0f7be250f2b7c3fc9",
      "location": "C",
      "name": "Location C",
      "concentration": 4,
      "other": "a"
    }
  ],
  "__v": 0
}
And of course with standard "positional" $ syntax and updates:
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';

mongoose.Promise = global.Promise;
mongoose.set('debug', true);

const sampleSchema = new Schema({
  location: String,
  name: String,
  concentration: Number,
  other: String
});

const projectSchema = new Schema({
  client_id: String,
  description: String,
  samples: [sampleSchema]
});

const Project = mongoose.model('Project', projectSchema);

const log = data => console.log(JSON.stringify(data, undefined, 2));

(async function() {
  try {
    const conn = await mongoose.connect(uri);

    await Promise.all(Object.entries(conn.models).map(([k, m]) => m.remove()));

    await Project.create({
      client_id: "ClientA",
      description: "A Client",
      samples: [
        { location: "A", name: "Location A" },
        { location: "B", name: "Location B" },
        { location: "C", name: "Location C" }
      ]
    });

    let input = [
      { location: "A", concentration: 3, other: "c" },
      { location: "C", concentration: 4, other: "a" }
    ];

    let batch = input.map(({ location, ...e }) =>
      ({
        updateOne: {
          filter: { client_id: "ClientA", 'samples.location': location },
          update: {
            $set: Object.entries(e)
              .reduce((oe, [k, v]) => ({ ...oe, [`samples.$.${k}`]: v }), {})
          }
        }
      })
    );

    log({ batch });

    await Project.bulkWrite(batch);

    let project = await Project.findOne();
    log(project);

    mongoose.disconnect();
  } catch (e) {
    console.error(e);
  } finally {
    process.exit();
  }
})()
And output:
Mongoose: projects.remove({}, {})
Mongoose: projects.insertOne({ _id: ObjectId("5b179142662616160853ba4a"), client_id: 'ClientA', description: 'A Client', samples: [ { _id: ObjectId("5b179142662616160853ba4d"), location: 'A', name: 'Location A' }, { _id: ObjectId("5b179142662616160853ba4c"), location: 'B', name: 'Location B' }, { _id: ObjectId("5b179142662616160853ba4b"), location: 'C', name: 'Location C' } ], __v: 0 })
{
  "batch": [
    {
      "updateOne": {
        "filter": {
          "client_id": "ClientA",
          "samples.location": "A"
        },
        "update": {
          "$set": {
            "samples.$.concentration": 3,
            "samples.$.other": "c"
          }
        }
      }
    },
    {
      "updateOne": {
        "filter": {
          "client_id": "ClientA",
          "samples.location": "C"
        },
        "update": {
          "$set": {
            "samples.$.concentration": 4,
            "samples.$.other": "a"
          }
        }
      }
    }
  ]
}
Mongoose: projects.bulkWrite([ { updateOne: { filter: { client_id: 'ClientA', 'samples.location': 'A' }, update: { '$set': { 'samples.$.concentration': 3, 'samples.$.other': 'c' } } } }, { updateOne: { filter: { client_id: 'ClientA', 'samples.location': 'C' }, update: { '$set': { 'samples.$.concentration': 4, 'samples.$.other': 'a' } } } } ], {})
Mongoose: projects.findOne({}, { fields: {} })
{
  "_id": "5b179142662616160853ba4a",
  "client_id": "ClientA",
  "description": "A Client",
  "samples": [
    {
      "_id": "5b179142662616160853ba4d",
      "location": "A",
      "name": "Location A",
      "concentration": 3,
      "other": "c"
    },
    {
      "_id": "5b179142662616160853ba4c",
      "location": "B",
      "name": "Location B"
    },
    {
      "_id": "5b179142662616160853ba4b",
      "location": "C",
      "name": "Location C",
      "concentration": 4,
      "other": "a"
    }
  ],
  "__v": 0
}

How can I restructure this MongoDB result into the right format?

I'm querying mongo with something like this:
const loadedLists = await ListValues.find( {} )
I get results back that look more or less like this:
result = [
  {
    listname: 'ListOne',
    values: [
      { label: 'OptOne', cost: '25' },
      { label: 'OptTwo', cost: '22' }
    ]
  },
  {
    listname: 'ListTwo',
    values: [ ... ] // and so on
  }
]
I need to turn this into an object that looks like this:
const newData = {
  listOne: [ { label: 'OptOne', cost: '25' }, { label: ... } ],
  listTwo: [ { label: 'OptTwo', cost: '25' }, { label: ... } ]
}
With the goal of eventually doing Object.assign(res.locals, ...newData) so res.locals has a property of listOne and listTwo.
I've messed around with map like this:
let hh = result.map(async list => {
  return { [list.listName]: list.values };
});
hh = await Promise.all(hh);
Which almost works, except hh looks like this:
[
  { listOne: [ ... ] }, // properly has the values
  { listTwo: [ ... ] }  // same...
]
I just need to cleanly move those objects up a level. I've found NPM modules that do this, and loops to reprocess it, but I can't help feeling there's a cleaner / tidier way to get it returned.
You need to use reduce:
const loadedLists = await ListValues.find({});
const hh = loadedLists.reduce(
  (acc, { listname, values }) => (acc[listname] = values, acc),
  {}
);
Also, you don't need to scatter the async keyword all over the place. :) It makes sense to use it only with async functions that use the await keyword.
Example from your code:
// why do you even need async here? there is nothing async about creating an object
hh = result.map(async list => {
  return { [list.listName]: list.values };
});
You can use Array.reduce() to accumulate the result.
'use strict';

let data = [{
  listname: 'ListOne',
  values: [{
    label: 'OptOne',
    cost: '25'
  }, {
    label: 'OptTwo',
    cost: '22'
  }]
}, {
  listname: 'ListTwo',
  values: [{
    label: 'OptOne',
    cost: '25'
  }, {
    label: 'OptTwo',
    cost: '22'
  }]
}];

let result = data.reduce(function (acc, { listname, values }) {
  if (acc[listname]) {
    acc[listname] = acc[listname].concat(values);
  } else {
    acc[listname] = values;
  }
  return acc;
}, {});

console.log('result', result);
This logs
result { ListOne:
   [ { label: 'OptOne', cost: '25' },
     { label: 'OptTwo', cost: '22' } ],
  ListTwo:
   [ { label: 'OptOne', cost: '25' },
     { label: 'OptTwo', cost: '22' } ] }
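One follow-up on the stated goal: spread syntax in a call position only works on iterables, so Object.assign(res.locals, ...newData) would throw once newData is a single plain object, which is exactly what the reduce above produces. At that point there is nothing left to spread; pass the object directly:
const newData = loadedLists.reduce(
  (acc, { listname, values }) => (acc[listname] = values, acc),
  {}
);
// newData is already one flat object, so no spread is needed:
Object.assign(res.locals, newData); // res.locals.ListOne and res.locals.ListTwo now exist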
