JSON/Node.js: How do I get specific data from a JSON file?

I am trying to develop a Discord bot with Node.js. My question is: how do I get specific data out of a JSON file (for instance, the JSON shown below)? I want to specify a single uuid like "59d4c622c7cc459a98c2e947054e2b11" and get that member's data.

Assuming you have already parsed the JSON into an actual JavaScript object named data, and given the way the data is currently organized, you would have to search the data.guild.members array to find the object that has the desired uuid property value.
function findDataForUuid(uuid) {
  for (let obj of data.guild.members) {
    if (obj.uuid === uuid) {
      return obj;
    }
  }
  // no match found
  return null;
}

let item = findDataForUuid("59d4c622c7cc459a98c2e947054e2b11");
if (item) {
  console.log(item); // will show all the properties of the member object
}
Or, using .find() on the array:
function findDataForUuid(uuid) {
  return data.guild.members.find(item => {
    return item.uuid === uuid;
  });
}
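If the data is still sitting in a .json file on disk, you would read and parse it before searching it. A minimal sketch, assuming the file is called guild.json (the filename is a placeholder for wherever your file actually lives):

const fs = require('fs');

// read the file and parse it into a JavaScript object (the path is an assumption)
const data = JSON.parse(fs.readFileSync('./guild.json', 'utf8'));

const member = data.guild.members.find(
  m => m.uuid === "59d4c622c7cc459a98c2e947054e2b11"
);
console.log(member); // undefined if no member has that uuid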

For a simple solution you can use .filter(); .reduce() is another option (a sketch of that version follows the snippet below).
const data = {
  guild: {
    _id: "5b2906070cf29ddccd0f203c",
    name: "Dulcet",
    coins: 122010,
    coinsEver: 232010,
    created: 1529415175560,
    members: [
      {
        uuid: "59d4c622c7cc459a98c2e947054e2b11",
        rank: "MEMBER",
        joined: 1529683128302,
        questParticipation: 39,
        expHistory: {
          "2020-02-16": 0,
          "2020-02-15": 0,
          "2020-02-14": 0,
          "2020-02-13": 0,
          "2020-02-12": 0,
          "2020-02-11": 0,
          "2020-02-10": 0
        }
      }
    ]
  }
};
const members = data.guild.members.filter(
  member => member.uuid === "59d4c622c7cc459a98c2e947054e2b11"
);
const firstMember = members[0];
console.log(firstMember);

Related

Update an array value in nested document in ArangoDB

Given the following document inside a collection card, I have to update the whole data value for a particular id in staticCards:
{
  "staticCards": [
    {
      id: 123,
      search: "",
      data: []
    },
    {
      id: 456,
      search: "",
      data: []
    }
  ],
  "dynamicCards": [
    {
      id: 789,
      search: "",
      data: []
    },
    {
      id: 127,
      search: "",
      data: []
    },
    {}
  ]
}
You need to determine the index of the array element, which isn't straightforward if you want to match one of the object attributes instead of the whole object. POSITION() is not an option in this case, but you can solve it with a subquery. Then you can use REPLACE_NTH() to set a new value. Finally, you need to update the respective top-level attribute.
LET pos = FIRST(
  FOR i IN 0..LENGTH(doc.dynamicCards)-1
    FILTER doc.dynamicCards[i].id == 127
    LIMIT 1
    RETURN i
)
LET new = REPLACE_NTH(doc.dynamicCards, pos, { id: 128, search: "", data: [] })
UPDATE doc WITH { dynamicCards: new } IN coll
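If you are issuing this from Node.js, a sketch using the arangojs driver might look like the following. The collection name card, the document key, and the connection details are all assumptions, and the AQL simply wraps the snippet above in a FOR loop so that doc is bound:

const { Database, aql } = require('arangojs');

// connection details and names below are placeholders
const db = new Database({ url: 'http://localhost:8529' });
const card = db.collection('card');

async function replaceDynamicCard(docKey, matchId, newEntry) {
  // the subquery finds the array index of the element whose id matches,
  // then REPLACE_NTH swaps in the new value and UPDATE persists it
  const cursor = await db.query(aql`
    FOR doc IN ${card}
      FILTER doc._key == ${docKey}
      LET pos = FIRST(
        FOR i IN 0..LENGTH(doc.dynamicCards)-1
          FILTER doc.dynamicCards[i].id == ${matchId}
          LIMIT 1
          RETURN i
      )
      UPDATE doc WITH { dynamicCards: REPLACE_NTH(doc.dynamicCards, pos, ${newEntry}) } IN ${card}
      RETURN NEW
  `);
  return cursor.next();
}

// e.g. replaceDynamicCard('myDocKey', 127, { id: 128, search: "", data: [] })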

Get map of map field from Firestore

I have a field of type map that contains maps of data in Firestore.
I am trying to retrieve this data using a Cloud Function in Node.js. I can get the document and the data from the field, but I can't get it in a usable way. I have tried every solution I can find on SO and Google, but the code below is the only thing that gives me access to the data. I obviously need to be able to access each field within the map individually. In Swift I build an array of [String: Any], but I can't get that to work in Node.js.
const docRef = dbConst.collection('Comps').doc('XEVDk6e4AXZPkNprQRn5Imfcsah11598092006.724980');
return docRef.get().then(docSnap => {
  const targets = docSnap.get('targets');
  console.log(targets);
}).catch(result => { console.log(result); });
This is what I am getting back in the console.
In Swift I do the following and am so far not able to find an equivalent in TypeScript. (I don't need to build the custom object, just the ability to access the keys and values.)
let obj1 = doc.get("targets") as! [String: Any]
for objs in obj1 {
    let obs = objs.value as! [String: Any]
    let targObj = compUserDetails(IDString: objs.key, activTarg: obs["ActivTarget"] as! Double, stepTarg: obs["StepTarget"] as! Double, name: obs["FullName"] as! String)
}
UPDATE
After spending a whole day working on it, I thought I had a solution using the code below:
const docRef = dbConst.collection('Comps').doc('XEVDk6e4AXZPkNprQRn5Imfcsah11598092006.724980');
return docRef.get().then(docSnap => {
  const targets = docSnap.get('targets') as [[string, any]];
  const newDataMap = [];
  for (const [key, value] of Object.entries(targets)) {
    const tempMap = new Map<String, any>();
    console.log(key);
    const newreWorked = value;
    tempMap.set('uid', key);
    for (const [key1, value1] of Object.entries(newreWorked)) {
      tempMap.set(key1, value1);
      newDataMap.push(tempMap); // pushing inside the inner loop duplicates entries
    }
  }
  newDataMap.forEach(element => {
    const name = element.get('FullName');
    console.log(name);
  });
});
However, the new data map has 6 separate mapped objects: 3 of each of the original objects from the cloud. I can now iterate through and get the data for a given key, but I have 3 times as many objects.
So after two days of searching and getting very close, I finally worked out a solution. It is very similar to the code above, but this works. It may not be the "correct" way, but it works; feel free to make other suggestions.
return docRef.get().then(docSnap => {
  const targets = docSnap.get('targets') as [[string, any]];
  const newDatarray = [];
  for (const [key, value] of Object.entries(targets)) {
    const tempMap = new Map<String, any>();
    const newreWorked = value;
    tempMap.set('uid', key);
    for (const [key1, value1] of Object.entries(newreWorked)) {
      tempMap.set(key1, value1);
    }
    newDatarray.push(tempMap);
  }
  newDatarray.forEach(element => {
    const name = element.get('FullName');
    const steps = element.get('StepTarget');
    const avtiv = element.get('ActivTarget');
    const UID = element.get('uid');
    console.log(name);
    console.log(steps);
    console.log(avtiv);
    console.log(UID);
  });
}).catch(result => { console.log(result); });
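As a side note, because the targets field comes back as a plain JavaScript object keyed by uid, a shorter sketch (same assumptions about the document path and the FullName, StepTarget and ActivTarget field names) can skip the intermediate Map entirely:

return docRef.get().then(docSnap => {
  const targets = docSnap.get('targets');
  for (const [uid, target] of Object.entries(targets)) {
    // each value is itself a plain object, so its fields are directly accessible
    console.log(uid, target.FullName, target.StepTarget, target.ActivTarget);
  }
}).catch(err => console.log(err));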
I made this into a little function that gets the underlying object from a map:
function getMappedValues(map) {
  var tempMap = {};
  for (const [key, value] of Object.entries(map)) {
    tempMap[key] = value;
  }
  return tempMap;
}
For an object with an array of maps in firestore, you can get the value of the first of those maps like so:
let doc = { // Example firestore document data
  items: {
    0: {
      id: "1",
      sr: "A",
    },
    1: {
      id: "2",
      sr: "B",
    },
    2: {
      id: "3",
      sr: "B",
    },
  },
};
console.log(getMappedValues(doc.items[0]));
which would read { id: '1', sr: 'A' }

node js access value in json result and get the value

I have the following JSON object which I get from an API endpoint:
let myjson = { Team1: { SCORE: 10 } }
I want to access the SCORE inside Team1 but am not able to; I just need the result as 10.
I have tried the following code but am not able to get the result:
for (var attributename in JSON.parse(myjson)) {
  return console.log(attributename + ": " + body[attributename]);
}
I also used the code below:
const userStr = JSON.stringify(myjson);
JSON.parse(userStr, (key, value) => {
  if (typeof value === 'string') {
    return value.toUpperCase();
  }
  return value;
});
Not a node developer but why do you need to json.stringify it? Can't you just reach the value with dot notation like this:
myJson.Team1.SCORE
myjson is already an Object, you don't need to do JSON.parse nor JSON.stringify on it.
Just access the property directly:
console.log(myjson.Team1.SCORE)
If you have multiple teams, or want to access it dynamically:
const obj = { Team1: { SCORE: 10 }, Team2: { SCORE: 20 } }
for (const [team, value] of Object.entries(obj)) {
  console.log(`${team}: ${value.SCORE}`)
}
You can also use this if it fulfils your query. Here is the code:
let myjson = { Team1: { SCORE: 10 }, Team2: { SCORE: 20 } };
Object.keys(myjson).forEach(function(item) {
  console.log(myjson[item].SCORE);
});
Not sure if there can be more teams in that object, so here is a more general solution first and then the straightforward one.
const myjson = { Team1: { SCORE: 10 }, Team2: { SCORE: 20 } }
const result = Object.keys(myjson).map(key => myjson[key].SCORE);
console.log('For dynamic resolution', result);
console.log('If there is only Team1', myjson.Team1.SCORE);

Optional but non-nullable fields in GraphQL

In an update to our GraphQL API, only the model's _id field is required, hence the ! in the SDL below. Other fields such as name don't have to be included on an update, but they also cannot have a null value. Currently, excluding the ! from the name field allows the end user to omit name on an update, but it also allows them to pass a null value for name, which cannot be allowed.
A null value lets us know that a field needs to be removed from the database.
Below is an example of a model where this would cause a problem - the Name custom scalar doesn't allow null values but GraphQL still allows them through:
type language {
  _id: ObjectId
  iso: Language_ISO
  auto_translate: Boolean
  name: Name
  updated_at: Date_time
  created_at: Date_time
}

input language_create {
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name!
}

input language_update {
  _id: ObjectId!
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name
}
When a null value is passed in it bypasses our Scalars so we cannot throw a user input validation error if null isn't an allowed value.
I am aware that ! means non-nullable and that the lack of the ! means the field is nullable. However, it is frustrating that, as far as I can see, we cannot constrain the allowed values of a field when that field is not required / optional. This issue only occurs on updates.
Are there any ways to work around this issue through custom Scalars without having to start hardcoding logic into each update resolver which seems cumbersome?
EXAMPLE MUTATION THAT SHOULD FAIL
mutation tests_language_create($input: language_update!) {
  language_update(input: $input) {
    name
  }
}
Variables
input: {
  _id: "1234",
  name: null
}
UPDATE 9/11/18: for reference, I can't find a way around this as there are issues with using custom scalars, custom directives and validation rules. I've opened an issue on GitHub here: https://github.com/apollographql/apollo-server/issues/1942
What you're effectively looking for is custom validation logic. You can add any validation rules you want on top of the "default" set that is normally included when you build a schema. Here's a rough example of how to add a rule that checks for null values on specific types or scalars when they are used as arguments:
const { ApolloServer } = require('apollo-server')
const { specifiedRules } = require('graphql/validation')
const { GraphQLError } = require('graphql/error')

const typesToValidate = ['Foo', 'Bar']

// This returns a "Visitor" whose properties get called for
// each node in the document that matches the property's name
function CustomInputFieldsNonNull(context) {
  return {
    Argument(node) {
      const argDef = context.getArgument();
      const checkType = typesToValidate.includes(argDef.astNode.type.name.value)
      if (checkType && node.value.kind === 'NullValue') {
        context.reportError(
          new GraphQLError(
            `Type ${argDef.astNode.type.name.value} cannot be null`,
            node,
          ),
        )
      }
    },
  }
}

// We're going to override the validation rules, so we want to grab
// the existing set of rules and just add on to it
const validationRules = specifiedRules.concat(CustomInputFieldsNonNull)
const server = new ApolloServer({
  typeDefs,
  resolvers,
  validationRules,
})
EDIT: The above only works if you're not using variables, which isn't going to be very helpful in most cases. As a workaround, I was able to utilize a FIELD_DEFINITION directive to achieve the desired behavior. There's probably a number of ways you could approach this, but here's a basic example:
const _ = require('lodash')
const { defaultFieldResolver } = require('graphql')
const { SchemaDirectiveVisitor } = require('graphql-tools')

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const { args: { paths } } = this
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      for (const path of paths) {
        if (_.get(fieldArgs, path) === null) {
          throw new Error(`${path} cannot be null`)
        }
      }
      return resolve.apply(this, resolverArgs)
    }
  }
}
Then in your schema:
directive @nonNullInput(paths: [String!]!) on FIELD_DEFINITION

input FooInput {
  foo: String
  bar: String
}

type Query {
  foo(input: FooInput!): String @nonNullInput(paths: ["input.foo"])
}
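The directive class above still has to be registered with the server. With Apollo Server 2 that is typically done through the schemaDirectives option; a minimal sketch (typeDefs and resolvers are assumed to be defined elsewhere):

const server = new ApolloServer({
  typeDefs,
  resolvers,
  schemaDirectives: {
    nonNullInput: NonNullInputDirective,
  },
})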
Assuming that the "non null" input fields are the same each time the input is used in the schema, you could map each input's name to an array of field names that should be validated. So you could do something like this as well:
const nonNullFieldMap = {
  FooInput: ['foo'],
}

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const visitedTypeArgs = this.visitedType.args
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      visitedTypeArgs.forEach(arg => {
        const argType = arg.type.toString().replace("!", "")
        const nonNullFields = nonNullFieldMap[argType]
        nonNullFields.forEach(nonNullField => {
          const path = `${arg.name}.${nonNullField}`
          if (_.get(fieldArgs, path) === null) {
            throw new Error(`${path} cannot be null`)
          }
        })
      })
      return resolve.apply(this, resolverArgs)
    }
  }
}
And then in your schema:
directive @nonNullInput on FIELD_DEFINITION

type Query {
  foo(input: FooInput!): String @nonNullInput
}

array manipulation in node js and lodash

I have two arrays:
typeArr = [1010111, 23342344]
infoArr = { 'name': 'jon', 'age': 25 }
I am expecting the following:
[{ 'name': 'jon', 'age': 25, 'type': 1010111, 'default': 'ok' }, { 'name': 'jon', 'age': 25, 'type': 23342344, 'default': 'nok' }]
Code :
function updaterecord(infoArr, type) {
  infoArr.type = type;
  response = calculate(age);
  if (response)
    infoArr.default = 'ok';
  else
    infoArr.default = 'nok';
  return infoArr;
}

function createRecord(infoArr, typeArr) {
  var data = _.map(typeArr, type => {
    return updaterecord(infoArr, type);
  });
  return (data);
}

var myData = createRecord(infoArr, typeArr);
var myData = createRecord(infoArr,typeArr);
I am getting:
[{ 'name': 'jon', 'age': 25, 'type': 23342344, 'default': 'nok' }, { 'name': 'jon', 'age': 25, 'type': 23342344, 'default': 'nok' }]
For some reason the last record updates the previous one. I have tried generating the array using an index variable, but I'm not sure what's wrong; it keeps overriding the previous item.
How can I resolve this?
You are passing the entire infoArr array to your updaterecord() function, but updaterecord() looks like it's expecting a single object. As a result, it is adding those properties to the array rather than to its individual members.
It's not really clear what is supposed to happen, because typeArr has two elements and infoArr has one. Do you want to add another element to infoArr, or should infoArr have the same number of elements as typeArr?
Assuming it should have the same number, you would need to use the index _.map gives you to send one element of infoArr at a time:
function createRecord(infoArr, typeArr) {
  var data = _.map(typeArr, (type, i) => {
    // use infoArr[i] to send one element
    return updaterecord(infoArr[i], type);
  });
  return (data);
}
Edit:
I'm not sure how you are calculating default, since it's different in your expected output but seems to be based on a single number. To get an array of objects based on infoArr, you need to copy the object and add the additional properties that you want. Object.assign() is good for this:
let typeArr = [1010111, 23342344]
let infoArr = { 'name': 'jon', 'age': 25 }

function updaterecord(infoArr, type) {
  var obj = Object.assign({}, infoArr)
  return Object.assign(obj, {
    type: type,
    default: infoArr.age > 25 ? 'ok' : 'nok' // or however you're figuring this out
  })
}

function createRecord(infoArr, typeArr) {
  return _.map(typeArr, type => updaterecord(infoArr, type));
}
Result:
[ { name: 'jon', age: 25, type: 1010111, default: 'nok' },
{ name: 'jon', age: 25, type: 23342344, default: 'nok' } ]
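As a side note, the same copy can be expressed with object spread syntax instead of Object.assign(); a minimal sketch (the age > 25 rule is the same placeholder as above):

function updaterecord(info, type) {
  // the spread creates a shallow copy, so each record is independent of the others
  return { ...info, type, default: info.age > 25 ? 'ok' : 'nok' };
}

const myData = typeArr.map(type => updaterecord(infoArr, type));
console.log(myData);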
