How do I write information into JSON in this format?
{
    name_data: {
        "1": {
            name: "A",
        },
        "2": {
            name: "B",
        },
        "3": {
            name: "C",
        },
        "4": {
            name: "D",
        }
    }
}
Currently, this is how I write information into JSON:
client.reqs[1] = {
    name: "A",
}
fs.writeFile("./Database/reqs.json", JSON.stringify(client.reqs, null, 4), err => {
    if (err) throw err;
})
And this is the resulting format in JSON:
{
    "1": {
        "name": "A"
    },
    "2": {
        "name": "A"
    },
    "3": {
        "name": "A"
    },
    "4": {
        "name": "A"
    }
}
Can somebody give answers, or point me to websites or documentation, that can help me with this?
Try using JSON.stringify(client.reqs, null, " ") for the indentation.
The other thing you want to do is not directly possible; it is more JavaScript than JSON. You cannot remove the quotes from the keys, because that would be invalid JSON.
To have name_data at the top level, do this (the indentation change is included in this example):
fs.writeFile("./Database/reqs.json", JSON.stringify({ name_data: client.reqs }, null, 4), err => {
if (err) throw err; // ^ this adds name_data
})
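With that wrapper, and assuming client.reqs holds the entries shown in the question, reqs.json would come out like this (the keys stay quoted, since JSON requires that):
{
    "name_data": {
        "1": {
            "name": "A"
        },
        "2": {
            "name": "B"
        }
    }
}
and so on for the remaining keys.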
For a custom object key name, all you have to do is add ["the name you want"]: value (a computed property name). So I assume it will be something like:
for (let i = 0; i <= 5; i++) {
    client.reqs[i] = {
        [i]: "A",
    }
}
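For illustration, a computed property name evaluates the expression in the brackets and uses the result as the key:
// Minimal illustration of a computed property name:
const key = "name";
const obj = { [key]: "A" };
console.log(obj); // { name: "A" }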
I have a parent-child array that looks like this:
const myTestArray = [
    {
        "id": "1",
        "parent": "root"
    },
    {
        "id": "2",
        "parent": "1"
    },
    {
        "id": "3",
        "parent": "1"
    },
    {
        "id": "4",
        "parent": "2"
    },
    {
        "id": "5",
        "parent": "4"
    }
];
From this I have been able to create a nested parent-child JSON tree, but now I need another data structure that looks like this:
const childesArray = {
    "1": ["1", "2", "3", "4", "5"],
    "2": ["2", "4", "5"],
    "3": ["3"],
    "4": ["4", "5"],
    "5": ["5"]
};
For each element in myTestArray I need an array that contains all children (at any depth) of that element.
I guess I can use reduce on this array here too, but I'm stuck right now on how I can achieve this. Can anyone point me in the right direction, or share a solution for this?
Br. Rune
There are several ways to achieve this, but I tried to keep it simple. Take a look at this example (note that it collects direct children only, not descendants at every depth):
const nesting = {};

myTestArray.forEach((item) => {
    const { parent, id } = item;
    if (!nesting[id]) {
        nesting[id] = [id];
    }
    if (!nesting[parent] && parent !== "root") {
        nesting[parent] = [parent];
    }
    if (parent !== "root") {
        nesting[parent].push(id);
    }
});

console.log(nesting);
//
// {
// "1":["1","2","3"],
// "2":["2","4"],
// "3":["3"],
// "4":["4","5"],
// "5":["5"]
// }
//
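If you need the deep structure from the question (every descendant, not only direct children), here is a minimal sketch along the same lines. It assumes parents always appear before their children in myTestArray, as they do in the example data:
// Build a child -> parent lookup first
const parentOf = {};
myTestArray.forEach(({ id, parent }) => { parentOf[id] = parent; });

// For each id, walk up the ancestor chain and register the id with every ancestor
const deep = {};
myTestArray.forEach(({ id }) => {
    deep[id] = [id];
    let p = parentOf[id];
    while (p && p !== "root") {
        deep[p].push(id);
        p = parentOf[p];
    }
});

console.log(deep);
// { "1": ["1","2","3","4","5"], "2": ["2","4","5"], "3": ["3"], "4": ["4","5"], "5": ["5"] }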
You could take the JSON object and iterate through it with a forEach loop, appending each element to the array of its parent. Like the answer above, this yields an object of arrays with direct children only, not the deep structure you specified:
var newArr = {};

myTestArray.forEach(e => {
    newArr[e.id] = [e.id];
    if (e.parent != "root") {
        newArr[e.parent].push(e.id);
    }
});
would produce
newArr = { '1': [ '1', '2', '3' ],
'2': [ '2', '4' ],
'3': [ '3' ],
'4': [ '4', '5' ],
'5': [ '5' ] }
I have spent my day on this :)
I think I have a solution now.
Suggestions for improvements are welcome.
function getNestedChildrenArray(inputArray: Array<Record<string, unknown>>, inputId: string) {
    let myA: Array<string> = inputArray.reduce((previousValue: Array<string>, currentValue: Record<string, unknown>) => {
        if (typeof currentValue.id === 'string' && inputId === currentValue.parent) previousValue.push(currentValue.id);
        return previousValue;
    }, []);
    if (myA.length > 0) {
        myA = myA.concat(myA.reduce((previousValue: Array<string>, currentValue: string) => {
            previousValue = previousValue.concat(getNestedChildrenArray(inputArray, currentValue));
            return previousValue;
        }, []));
    }
    return myA;
}

function getNestedChildrenArrayJsonObject(inputArray: Array<Record<string, unknown>>) {
    return inputArray.reduce((previousValue: Record<string, unknown>, currentValue: Record<string, unknown>, index, array) => {
        if (typeof currentValue.id === 'string') {
            previousValue[currentValue.id] = [currentValue.id].concat(getNestedChildrenArray(array, currentValue.id));
        }
        return previousValue;
    }, {});
}

console.log(getNestedChildrenArrayJsonObject(myTestArray));
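For the sample myTestArray this logs the structure asked for above:
// {
//   "1": [ "1", "2", "3", "4", "5" ],
//   "2": [ "2", "4", "5" ],
//   "3": [ "3" ],
//   "4": [ "4", "5" ],
//   "5": [ "5" ]
// }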
I want to remove items (objects) from an array on a document in Elasticsearch. However, whenever I try to run my update script using Painless, I receive an Array Index Out of Bounds exception.
I'm using the JavaScript elasticsearch npm package to search Elasticsearch for the relevant documents, which returns data like:
"_index": "centres",
"_type": "doc",
"_id": "51bc77d1-b514-4f4e-85fa-412def6829f5",
"_score": 1,
"_source": {
"id": "cbaa7daa-f1a2-4ac3-8d7c-fc981245d21c",
"name": "Five House",
"openDays": [
{
"title": "new open Day",
"endDate": "2022-03-22T00:00:00.000Z",
"id": "82be934b-eeb1-419c-96ed-a58808b30df7"
},
{
"title": "last open Day",
"endDate": "2020-12-24T00:00:00.000Z",
"id": "8cc339b9-d2f8-4252-b68a-ed0a49cbfabd"
}
]
}
I then want to go through and remove certain items from the openDays array. I've created an array of the items I want to remove, so for the above example:
[
    {
        id: '51bc77d1-b514-4f4e-85fa-412def6829f5',
        indexes: [
            {
                "title": "last open Day",
                "endDate": "2020-12-24T00:00:00.000Z",
                "id": "8cc339b9-d2f8-4252-b68a-ed0a49cbfabd"
            }
        ]
    }
]
I'm then trying to run an update via the elasticsearch node client like this:
for (const centre of updates) {
    if (centre.indexes.length) {
        await Promise.all(centre.indexes.map(async (theIndex) => {
            const updated = await client.update({
                index: 'centres',
                type: 'doc',
                id: centre.id,
                body: {
                    script: {
                        lang: 'painless',
                        source: "ctx._source.openDays.remove(ctx._source.openDays.indexOf('openDayID'))",
                        params: {
                            "openDayID": theIndex.id
                        }
                    }
                }
            }).catch((err) => { throw err; });
        }))
        .catch((err) => { throw err; });

        await client.indices.refresh({ index: 'centres' }).catch((err) => { throw err; });
    }
}
When I run this though, it returns a 400 with an "array_index_out_of_bounds_exception" error:
-> POST http://localhost:9200/centres/doc/51bc77d1-b514-4f4e-85fa-412def6829f5/_update
{
    "script": {
        "lang": "painless",
        "source": "ctx._source.openDays.remove(ctx._source.openDays.indexOf(\u0027openDayID\u0027))",
        "params": {
            "openDayID": "8cc339b9-d2f8-4252-b68a-ed0a49cbfabd"
        }
    }
}
<- 400
{
    "error": {
        "root_cause": [
            {
                "type": "remote_transport_exception",
                "reason": "[oSsa7mn][172.17.0.2:9300][indices:data/write/update[s]]"
            }
        ],
        "type": "illegal_argument_exception",
        "reason": "failed to execute script",
        "caused_by": {
            "type": "script_exception",
            "reason": "runtime error",
            "script_stack": [],
            "script": "ctx._source.openDays.remove(ctx._source.openDays.indexOf(\u0027openDayID\u0027))",
            "lang": "painless",
            "caused_by": {
                "type": "array_index_out_of_bounds_exception",
                "reason": null
            }
        }
    },
    "status": 400
}
I'm not quite sure where I'm going wrong with this. Am I using indexOf correctly in the Painless script? Does indexOf allow searching for properties of objects in arrays?
I stumbled across this question and answer: Elasticsearch: Get object index with Painless script
The body of the update script needs changing like so:
Promise.all(...
    const inline = `
        def openDayID = '${theIndex.id}';
        def openDays = ctx._source.openDays;
        def openDayIndex = -1;
        for (int i = 0; i < openDays.length; i++)
        {
            if (openDays[i].id == openDayID)
            {
                openDayIndex = i;
            }
        }
        if (openDayIndex != -1) {
            ctx._source.openDays.remove(openDayIndex);
        }
    `;

    const updated = await client.update({
        index: 'centres',
        type: 'doc',
        id: centre.id,
        body: {
            script: {
                lang: 'painless',
                inline: inline,
            },
        }
    }).catch((err) => { throw err; });

    await client.indices.refresh({ index: 'centres' }).catch((err) => { throw err; });
})).catch(... //end of Promise.all
I am not au fait with Painless scripting, so there are most likely better ways of writing this, e.g. breaking out of the loop once the index of the ID is found.
I have also had to move the refresh statement into the Promise.all, since if you're trying to remove more than one item from the array of objects, you'll be changing the document and changing the index. There is probably a better way of dealing with this too.
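For example, breaking out early would only need a small change to the loop in the inline script above:
for (int i = 0; i < openDays.length; i++) {
    if (openDays[i].id == openDayID) {
        openDayIndex = i;
        break;
    }
}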
In your original script, 'openDayID' is a string literal; it should be params.openDayID.
And use removeIf:
"ctx._source.openDays.removeIf(el -> (el.id == params.openDayID))"
I have converted many XML files to JSON using xmltodict and inserted them into ArangoDB.
Now I will loop over the collection and change some values in the database, like day, month and year, from string to int. The documents can be very nested, and the values that I will change can be in different places.
This is what I have of the code:
# Get the API wrapper for "FORM16" collection.
FORM16 = db.collection('FORM16')

def recursive_items(dictionary):
    for key, value in dictionary.items():
        if type(value) is dict:
            yield from recursive_items(value)
        else:
            yield (key, value)

search_key = 'LOW_VALUE'

for item in FORM16:
    for key, value in recursive_items(item):
        if search_key in list(key):
            item[search_key] = int(item[search_key])
        else:
            pass

FORM16.update(item)
{'_id': 'FORM16/2098312',
'_key': '2098312',
'_rev': '_blGxlRi---',
'_old_rev': '_blGvpVO---'}
The code runs, but it won't update the database, and the only changed document I receive back is the last document in the collection.
What do I have to change in the code to convert the values of keys like day, month and year to int?
EDIT:
This is one of the nested JSON documents that I will update:
{
"DOFFIN_ESENDERS": {
"DOFFIN_APPENDIX": {
"AUTHORITY_ORGANISATION_NR": "986 105 174",
"DOFFIN_FORM_TYPE": {
"NATIONAL": {
"EXPRESSION_OF_INTEREST_URL": "https://kgv.doffin.no/ctm/Supplier/Notice/260549",
"EXTERNAL_DOCUMENT_URL": "https://kgv.doffin.no/ctm/Supplier/Documents/Folder/124452",
"LOCATION": {
"NATIONWIDE": null
},
"PUBLISH_TO_TED": null
}
}
},
"FORM_SECTION": {
"PRIOR_INFORMATION_DEFENCE": {
"CATEGORY": "ORIGINAL",
"FD_PRIOR_INFORMATION_DEFENCE": {
"AUTHORITY_PRIOR_INFORMATION_DEFENCE": {
"NAME_ADDRESSES_CONTACT_PRIOR_INFORMATION": {
"CA_CE_CONCESSIONAIRE_PROFILE": {
"ADDRESS": "Postboks 800, Postmottak",
"ATTENTION": "Ole Jan Skoglund",
"CONTACT_POINT": "Forsvarets logistikkorganisasjon",
"COUNTRY": {
"VALUE": "NO"
},
"E_MAILS": {
"E_MAIL": "olskoglund#mil.no"
},
"FAX": "+47 67863799",
"ORGANISATION": {
"NATIONALID": "986105174",
"OFFICIALNAME": "Forsvarets logistikkorganisasjon"
},
"PHONE": "+47 67863787",
"POSTAL_CODE": "LILLEHAMMER",
"TOWN": "N-2617"
},
"FURTHER_INFORMATION": {
"IDEM": null
},
"INTERNET_ADDRESSES_PRIOR_INFORMATION": {
"URL_BUYER": "https://kgv.doffin.no/ctm/Supplier/CompanyInformation/Index/1127",
"URL_GENERAL": "http://www.forsvaret.no"
}
},
"TYPE_AND_ACTIVITIES_OR_CONTRACTING_ENTITY_AND_PURCHASING_ON_BEHALF": {
"PURCHASING_ON_BEHALF": {
"PURCHASING_ON_BEHALF_NO": null
},
"TYPE_AND_ACTIVITIES": {
"TYPE_OF_ACTIVITY": {
"VALUE": "DEFENCE"
},
"TYPE_OF_CONTRACTING_AUTHORITY": {
"VALUE": "MINISTRY"
}
}
}
},
"CTYPE": "SUPPLIES",
"LEFTI_PRIOR_INFORMATION": null,
"OBJECT_WORKS_SUPPLIES_SERVICES_PRIOR_INFORMATION": {
"ADDITIONAL_INFORMATION": {
"P": "Konkurransen vil bli utført som en forhandlet prosedyre etter en planlagt kunngjøring ultimo 2015 i henhold til “Forskrift 4. oktober 2013 nr. 1185 om forsvars og sikkerhetsanskaffelser“ basert på Eu direktiv 2009/81/EC fra Europa Parlamentet."
},
"CPV": {
"CPV_ADDITIONAL": [
{
"CPV_CODE": {
"CODE": "18900000"
}
},
{
"CPV_CODE": {
"CODE": "18930000"
}
},
{
"CPV_CODE": {
"CODE": "18937000"
}
},
{
"CPV_CODE": {
"CODE": "33000000"
}
},
{
"CPV_CODE": {
"CODE": "33120000"
}
},
{
"CPV_CODE": {
"CODE": "33124000"
}
},
{
"CPV_CODE": {
"CODE": "33140000"
}
},
{
"CPV_CODE": {
"CODE": "33141000"
}
},
{
"CPV_CODE": {
"CODE": "33141100"
}
},
{
"CPV_CODE": {
"CODE": "33141200"
}
},
{
"CPV_CODE": {
"CODE": "33141300"
}
},
{
"CPV_CODE": {
"CODE": "50400000"
}
}
],
"CPV_MAIN": {
"CPV_CODE": {
"CODE": "33100000"
}
}
},
"FRAMEWORK_AGREEMENT": {
"VALUE": "YES"
},
"QUANTITY_SCOPE_WORKS_DEFENCE": {
"COSTS_RANGE_AND_CURRENCY": {
"CURRENCY": "NOK",
"RANGE_VALUE_COST": {
"HIGH_VALUE": "200000000",
"LOW_VALUE": "150000000"
}
},
"F16_DIVISION_INTO_LOTS": {
"DIV_INTO_LOT_NO": null
},
"TOTAL_QUANTITY_OR_SCOPE": {
"P": "Forsvarets logistikkorganisasjon planlegger å skifte ut Forsvarets prehospitale sanitetssystem. Vi ser derfor etter en systemleverandør som kan levere test moduler, store initielle systemleveranser og ta ansvar for effektiv etterforsyning til Forsvaret på rammeavtaler med inntil syv års varighet."
}
},
"SCHEDULED_DATE_PERIOD": {
"PERIOD_WORK_DATE_STARTING": {
"MONTHS": "84"
}
},
"TITLE_CONTRACT": {
"P": "RFI P9346 -Nytt Prehospital Sanitetssystem til Forsvaret"
},
"TYPE_CONTRACT_PLACE_DELIVERY_DEFENCE": {
"SITE_OR_LOCATION": {
"LABEL": "N-2055 Nordkisa",
"NUTS": {
"CODE": "NO"
}
},
"TYPE_CONTRACT_PI_DEFENCE": {
"TYPE_CONTRACT": {
"VALUE": "SUPPLIES"
}
}
}
},
"OTH_INFO_PRIOR_INFORMATION": {
"ADDITIONAL_INFORMATION": {
"P": "Vi ønsker svar både fra Systemleverandører og Underleverandører på denne RFI."
},
"INFORMATION_REGULATORY_FRAMEWORK": {
"TAX_LEGISLATION": {
"TAX_LEGISLATION_VALUE": "www.lovdata.no"
}
},
"NOTICE_DISPATCH_DATE": {
"DAY": "28",
"MONTH": '11',
"YEAR": "2014"
},
"RELATES_TO_EU_PROJECT_NO": null
}
},
"FORM": "16",
"LG": "NB",
"VERSION": "R2.0.8.S02"
}
},
"VERSION": "V2.0.0",
"http://www.w3.org/2001/XMLSchema-instance:noNamespaceSchemaLocation": "DOFFIN_ESENDERS.xd",
"xmlns": {
"xsi": "http://www.w3.org/2001/XMLSchema-instance"
}
}
}
It looks like your code is correct, assuming the JSON blob at the bottom is a representation of item. Just make sure the data you're passing to .update() includes a valid _key and/or _id attribute.
However, it looks like your update statement is not indented properly and/or is out of order. I would put the update inline, when you make the change:
FORM16 = db.collection('FORM16')

for item in FORM16:
    for key, value in recursive_items(item):
        if search_key in list(key):
            item[search_key] = int(item[search_key])
            FORM16.update(item)
        else:
            pass
or in the top-level for loop:
FORM16 = db.collection('FORM16')

for item in FORM16:
    for key, value in recursive_items(item):
        if search_key in list(key):
            item[search_key] = int(item[search_key])
        else:
            pass
    FORM16.update(item)
I did find a function for converting strings to int and float in JSON files:
def _decode(o):
    # Note the "unicode" part is only for python2
    if isinstance(o, str):
        try:
            return int(o)
        except ValueError:
            try:
                return float(o)
            except ValueError:
                return o
    elif isinstance(o, dict):
        return {k: _decode(v) for k, v in o.items()}
    elif isinstance(o, list):
        return [_decode(v) for v in o]
    else:
        return o
import os
import json

path = 'C:/doffin/test/'

for filename in os.listdir(path):
    if not filename.endswith('.json'):
        continue
    fullname = os.path.join(path, filename)
    with open(fullname, 'rb') as f:
        jsonstr = f.read()
    json_obj = json.loads(jsonstr, object_hook=_decode)
    json_str2 = json.dumps(json_obj)
    # write the converted document back to the same file
    with open(fullname, 'w') as f:
        f.write(json_str2)
and after that I use arangoimport from the shell. It works better than the API.
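For reference, the import step looks roughly like this (the file name is hypothetical; check arangoimport --help for the exact flags in your version):
arangoimport --file "FORM16.json" --type json --collection "FORM16" --create-collection true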
This is my structure
[{
    "date": "2019-01-10T18:30:00.000Z",
    "time": "2019-01-11T04:37:49.587Z",
    "abc_Info": {
        "_id": "5c381da651f18d5040611eb2",
        "abc": 2.5,
        "guardian": "XYZ"
    }
}]
What I want is
[{
    "date": "2019-01-10T18:30:00.000Z",
    "time": "2019-01-11T04:37:49.587Z",
    "abc": 2.5,
    "guardian": "XYZ"
}]
Code
this._model.find(params, (err, docs) => {
    if (err) {
        var response = this.errorResponse("error", 500, null);
        res.send(response);
    } else {
        for (var i = 0; i < docs.length; i++) {
            const abc = {
                "date": docs[i].date,
                "time": docs[i].time,
                "abc_Info": docs[i].abc_Info // this is an object; I couldn't select values from it separately
            }
            if (docs[i].abc_Info != undefined) {
                abcArray.push(abc);
            }
        }
        res.send(abcArray);
    }
});
I am trying to select values from the object, like "abc_Info": docs[i].abc_Info.abc, but I couldn't do it; it throws an error.
There are two ways I can achieve this:
Selecting values directly from the object and storing them in variables. That is throwing an error for me.
Merging date and time with abc_Info. I don't know how to do that.
Change this:
const abc = {
    "date": docs[i].date,
    "time": docs[i].time,
    "abc": docs[i].abc_Info.abc,
    "guardian": docs[i].abc_Info.guardian
}
Hope it will be helpful.
var d = [{
    "date": "2019-01-10T18:30:00.000Z",
    "time": "2019-01-11T04:37:49.587Z",
    "abc_Info": {
        "_id": "5c381da651f18d5040611eb2",
        "abc": 2.5,
        "guardian": "XYZ"
    }
}]

d[0] = Object.assign(d[0], d[0].abc_Info);
delete d[0]['abc_Info'];
delete d[0]['_id'];
console.log(d);
let x = {
    "date": "2019-01-10T18:30:00.000Z",
    "time": "2019-01-11T04:37:49.587Z",
}

let abc_Info = {
    "_id": "5c381da651f18d5040611eb2",
    "abc": 2.5,
    "guardian": "XYZ"
}

// to concat
let z = { ...x, ...abc_Info }
console.log(z)
You can use the ES6 spread operator as above.
Object.assign also works (note that it mutates x): let z = Object.assign(x, abc_Info);
In your code it should be like this:
const abc = {
    ...{
        "date": docs[i].date,
        "time": docs[i].time,
    },
    ...docs[i].abc_Info
}
Use the spread operator, and rather than using a for loop with push, use Array.prototype.map:
let docs = [{
    "date": "2019-01-10T18:30:00.000Z",
    "time": "2019-01-11T04:37:49.587Z",
    "abc_Info": {
        "_id": "5c381da651f18d5040611eb2",
        "abc": 2.5,
        "guardian": "XYZ"
    }
}]

let abcArray = docs.map(({ date, time, abc_Info }) => {
    delete abc_Info._id;
    return { date, time, ...abc_Info }
});

console.log(abcArray)
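As a side note, delete mutates the original abc_Info objects inside docs; a rest pattern avoids that (a small variation on the same idea):
let abcArray = docs.map(({ date, time, abc_Info }) => {
    const { _id, ...rest } = abc_Info; // copy everything except _id
    return { date, time, ...rest };
});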
Sorry, it's my mistake. I should have declared the object inside the if statement. It's working now:
if (docs[i].abc_Info != undefined) {
    const abc = {
        "date": docs[i].date,
        "time": docs[i].time,
        "abc_Info": docs[i].abc_Info
    }
    abcArray.push(abc);
}
I'm trying to store a JSON object in DynamoDB using the code given below.
var AWS = require('aws-sdk'); // needed for AWS.DynamoDB.DocumentClient

exports.handler = function (event, context, callback) {
    var docClient = new AWS.DynamoDB.DocumentClient();
    var table = "Logs";
    var id_val = 1;

    var params = {
        TableName: table,
        Item: {
            "id": id_val,
            "message": event
        }
    };

    docClient.put(params, function (err, data) {
        if (err) {
            callback(null, JSON.stringify(err, null, 2));
            context.fail("Unable to add item. Error JSON: " + JSON.stringify(err, null, 2));
        }
    });
}
Input to event
[
    {
        "id": 1,
        "demographic": {
            "firstName": "John",
            "middleName": "w",
            "lastName": "Doe",
            "suffix": "jr",
            "birthDate": "1990-02-02",
            "gender": "M",
            "ssn": 123
        }
    }
]
What's stored in the table
{
    "id": {
        "N": "84.20420287568176"
    },
    "message": {
        "L": [
            {
                "M": {
                    "demographic": {
                        "M": {
                            "birthDate": {
                                "S": "1990-02-02"
                                ...
                                ...
                                ...
Why is the datatype being stored in the table? How can one break this down so that the attributes are stored separately?
That's the way DynamoDB stores data: each attribute is stored together with its datatype, so that when we look at the data we know the type of each field, much like a column type in a relational database. Although the data is stored and displayed like this in DynamoDB, when we get the data back through an actual application or API, the type descriptors are stripped and we receive the actual data only.
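For example (a minimal sketch; the table name and key are taken from the question's code), reading the item back through the DocumentClient already strips the type descriptors:
var docClient = new AWS.DynamoDB.DocumentClient();

docClient.get({ TableName: "Logs", Key: { id: 1 } }, function (err, data) {
    if (err) {
        console.error(err);
    } else {
        // data.Item is a plain JavaScript object, without the "N"/"L"/"M" wrappers
        console.log(data.Item);
    }
});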