I am trying to dynamically delete specific documents from Elasticsearch, and I would like to use a single function without hard-coding the name of the field I check the time range against. See the code below:
exports.cleanElastic = function(type, timeReference){
  return new Promise(function (resolve, reject){
    let dataOlderThan = "now-".concat(config.amountOfData);
    elastic.deleteByQuery({
      index: 'canary',
      type: type,
      body: {
        query: {
          range: {
            START_TS: {
              lte: dataOlderThan
            }
          }
        }
      }
    },
As you can see, 'START_TS' is the name of the date field I care about in this instance. That will not always be the case in this project, so I am trying to pass 'timeReference', or at least its value, in where the query reads 'START_TS'. Any suggestions would be much appreciated.
Thank you,
Ryan
I think what you are asking is really a JavaScript question: how to use a string as a dynamic object key. You don't need symbols for this (a Symbol key would be dropped when the query body is serialized to JSON anyway); an ES2015 computed property name, i.e. square brackets around the expression, does exactly that:
exports.cleanElastic = function(type, timeReference){
  return new Promise(function (resolve, reject){
    let dataOlderThan = "now-".concat(config.amountOfData);
    elastic.deleteByQuery({
      index: 'canary',
      type: type,
      body: {
        query: {
          range: {
            // Computed property name: the string value of
            // timeReference (e.g. 'START_TS') becomes the key
            [timeReference]: {
              lte: dataOlderThan
            }
          }
        }
      }
    }, function (error, response) {
      if (error) return reject(error);
      resolve(response);
    });
  });
};
In Rails we use something like the following for this:
key_name = 'age'
hash = {
  name: 'john',
  key_name.to_sym => '23'
}
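The direct JavaScript equivalent of that Rails idiom is the computed property key used above; for example (illustrative values):

const keyName = 'age';
const hash = {
  name: 'john',
  [keyName]: '23' // the string value of keyName becomes the key
};
console.log(hash); // { name: 'john', age: '23' }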
I have the following JSON object which I get from an API endpoint:
let myjson = { Team1: { SCORE: 10 } }
I want to access the SCORE inside Team1 but am not able to; I just need the result, 10.
I have tried the following code but am not able to get the result:
for (var attributename in JSON.parse(myjson)) {
  return console.log(attributename + ": " + body[attributename]);
}
I also used the code below:
const userStr = JSON.stringify(myjson);
JSON.parse(userStr, (key, value) => {
  if (typeof value === 'string') {
    return value.toUpperCase();
  }
  return value;
});
Not a Node developer, but why do you need to JSON.stringify it? Can't you just reach the value with dot notation, like this:
myJson.Team1.SCORE
myjson is already an object; you don't need JSON.parse or JSON.stringify on it.
Just access the property directly:
console.log(myjson.Team1.SCORE)
If you have multiple teams, or want to access it dynamically:
const obj = { Team1: { SCORE: 10 }, Team2: { SCORE: 20 } }
for (const [team, value] of Object.entries(obj)) {
  console.log(`${team}: ${value.SCORE}`)
}
You can also use this, if it fulfills your requirement. Here is the code:
let myjson = { Team1: { SCORE: 10 }, Team2: { SCORE: 20 } };
Object.keys(myjson).forEach(function(item) {
  console.log(myjson[item].SCORE);
});
Not sure if there can be more teams in that object, so here is a more general solution first, followed by the straightforward one.
const myjson = { Team1: { SCORE: 10 }, Team2: { SCORE: 20 } }
const result = Object.keys(myjson).map(key => myjson[key].SCORE);
console.log('For dynamic resolution', result);
console.log('If there is only Team1', myjson.Team1.SCORE);
In an update to our GraphQL API, only the model's _id field is required, hence the ! in the SDL below. Other fields, such as name, don't have to be included on an update, but they also cannot have a null value. Currently, excluding the ! from the name field allows the end user to omit the name on an update, but it also allows them to pass in a null value for the name, which cannot be allowed.
A null value lets us know that a field needs to be removed from the database.
Below is an example of a model where this would cause a problem - the Name custom scalar doesn't allow null values but GraphQL still allows them through:
type language {
  _id: ObjectId
  iso: Language_ISO
  auto_translate: Boolean
  name: Name
  updated_at: Date_time
  created_at: Date_time
}

input language_create {
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name!
}

input language_update {
  _id: ObjectId!
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name
}
When a null value is passed in, it bypasses our Scalars, so we cannot throw a user-input validation error when null isn't an allowed value.
I am aware that ! means non-nullable and that the lack of ! means the field is nullable; however, it is frustrating that, as far as I can see, we cannot specify the exact allowed values for a field that is optional. This issue only occurs on updates.
Are there any ways to work around this issue through custom Scalars, without having to start hardcoding logic into each update resolver, which seems cumbersome?
EXAMPLE MUTATION THAT SHOULD FAIL
mutation tests_language_create($input: language_update!) {
  language_update(input: $input) {
    name
  }
}
Variables
input: {
  _id: "1234",
  name: null
}
UPDATE 9/11/18: for reference, I can't find a way around this as there are issues with using custom scalars, custom directives and validation rules. I've opened an issue on GitHub here: https://github.com/apollographql/apollo-server/issues/1942
What you're effectively looking for is custom validation logic. You can add any validation rules you want on top of the "default" set that is normally included when you build a schema. Here's a rough example of how to add a rule that checks for null values on specific types or scalars when they are used as arguments:
const { ApolloServer } = require('apollo-server')
const { specifiedRules } = require('graphql/validation')
const { GraphQLError } = require('graphql/error')

const typesToValidate = ['Foo', 'Bar']

// This returns a "Visitor" whose properties get called for
// each node in the document that matches the property's name
function CustomInputFieldsNonNull(context) {
  return {
    Argument(node) {
      const argDef = context.getArgument();
      const checkType = typesToValidate.includes(argDef.astNode.type.name.value)
      if (checkType && node.value.kind === 'NullValue') {
        context.reportError(
          new GraphQLError(
            `Type ${argDef.astNode.type.name.value} cannot be null`,
            node,
          ),
        )
      }
    },
  }
}

// We're going to override the validation rules, so we want to grab
// the existing set of rules and just add on to it
const validationRules = specifiedRules.concat(CustomInputFieldsNonNull)
const server = new ApolloServer({
  typeDefs,
  resolvers,
  validationRules,
})
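With that rule in place, an operation that inlines a null literal for an argument of one of the listed types is rejected at validation time. For example (the field and argument names here are illustrative):

# Assuming someField(arg: Foo) in the schema, this now fails validation:
query {
  someField(arg: null) {
    id
  }
}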
EDIT: The above only works if you're not using variables, which isn't going to be very helpful in most cases. As a workaround, I was able to utilize a FIELD_DEFINITION directive to achieve the desired behavior. There's probably a number of ways you could approach this, but here's a basic example:
const { SchemaDirectiveVisitor } = require('graphql-tools')
const { defaultFieldResolver } = require('graphql')
const _ = require('lodash')

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const { args: { paths } } = this
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      for (const path of paths) {
        if (_.get(fieldArgs, path) === null) {
          throw new Error(`${path} cannot be null`)
        }
      }
      return resolve.apply(this, resolverArgs)
    }
  }
}
Then in your schema:
directive @nonNullInput(paths: [String!]!) on FIELD_DEFINITION

input FooInput {
  foo: String
  bar: String
}

type Query {
  foo(input: FooInput!): String @nonNullInput(paths: ["input.foo"])
}
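With that schema, an operation like the following would now throw "input.foo cannot be null" at resolve time, even though foo is nullable in the SDL:

query {
  foo(input: { foo: null, bar: "ok" })
}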
Assuming that the "non null" input fields are the same each time the input is used in the schema, you could map each input's name to an array of field names that should be validated. So you could do something like this as well:
const nonNullFieldMap = {
  FooInput: ['foo'],
}

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const visitedTypeArgs = this.visitedType.args
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      visitedTypeArgs.forEach(arg => {
        const argType = arg.type.toString().replace("!", "")
        // Guard against argument types that have no mapped fields
        const nonNullFields = nonNullFieldMap[argType] || []
        nonNullFields.forEach(nonNullField => {
          const path = `${arg.name}.${nonNullField}`
          if (_.get(fieldArgs, path) === null) {
            throw new Error(`${path} cannot be null`)
          }
        })
      })
      return resolve.apply(this, resolverArgs)
    }
  }
}
And then in your schema:
directive @nonNullInput on FIELD_DEFINITION

type Query {
  foo(input: FooInput!): String @nonNullInput
}
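Either version of the directive is registered the same way. A minimal wiring sketch, assuming Apollo Server 2 with graphql-tools-style schema directives:

const { ApolloServer } = require('apollo-server')

const server = new ApolloServer({
  typeDefs,
  resolvers,
  schemaDirectives: {
    // maps @nonNullInput in the SDL to the visitor class above
    nonNullInput: NonNullInputDirective,
  },
})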
I have two arrays
typeArr = [1010111, 23342344]
infoArr = { 'name': 'jon', 'age': 25 }
I am expecting the following:
[{ 'name': 'jone', 'age': 25, 'type': 1010111, 'default': 'ok' }, { 'name': 'jone', 'age': 25, 'type': 23342344, 'default': 'nok' }]
Code:
updaterecord(infoArr, type) {
  infoArr.type = type;
  response = calculate(age);
  if (response)
    infoArr.default = 'ok';
  else
    infoArr.default = 'nok';
  return infoArr;
}

createRecord(infoArr, typeArr) {
  var data = _.map(typeArr, type => {
    return updaterecord(infoArr, type);
  });
  return (data);
}
var myData = createRecord(infoArr,typeArr);
I am getting
[{ 'name': 'jone', 'age': 25, 'type': 23342344, 'default': 'nok' }, { 'name': 'jone', 'age': 25, 'type': 23342344, 'default': 'nok' }]
For some reason the last record updates the previous one. I have tried generating the array using an index variable, but I am not sure what is wrong; it keeps overriding the previous item.
How can I resolve this?
You are passing the entire infoArr array to your updaterecord() function, but updaterecord() looks like it's expecting a single object. As a result it is adding those properties to the array rather than individual members of the array.
It's not really clear what is supposed to happen, because typeArr has two elements and infoArr has one. Do you want to add another element to infoArr, or should infoArr have the same number of elements as typeArr?
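Incidentally, the reason both rows in your output are identical is that updaterecord() mutates and returns one shared object, so your result array holds two references to the same thing. A minimal illustration (values borrowed from the question):

const shared = { name: 'jon' };
const data = [1010111, 23342344].map(type => {
  shared.type = type; // mutates the one shared object
  return shared;      // returns a reference, not a copy
});
console.log(data[0].type, data[1].type); // 23342344 23342344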
Assuming it should have the same number, you would need to use the index that _.map gives you to send each item from infoArr:
function createRecord(infoArr, typeArr) {
  var data = _.map(typeArr, (type, i) => {
    // use infoArr[i] to send one element
    return updaterecord(infoArr[i], type);
  });
  return (data);
}
Edit:
I'm not sure how you are calculating default, since it differs between the two records in your expected output. To get an array of objects based on infoArr, you need to copy the object and then add the additional properties you want. Object.assign() is good for this:
let typeArr = [1010111, 23342344]
let infoArr = { 'name': 'jon', 'age': 25 }

function updaterecord(infoArr, type) {
  var obj = Object.assign({}, infoArr)
  return Object.assign(obj, {
    type: type,
    default: infoArr.age > 25 ? 'ok' : 'nok' // or however you're figuring this out
  })
}

function createRecord(infoArr, typeArr) {
  return _.map(typeArr, type => updaterecord(infoArr, type));
}
Result:
[ { name: 'jon', age: 25, type: 1010111, default: 'nok' },
{ name: 'jon', age: 25, type: 23342344, default: 'nok' } ]
This is my data saved in Elasticsearch:
{
  index: productName,
  type: 'users',
  body: {
    name: 'xyz',
    subject: {
      12: {
        id: 12,
        name: 'Maths',
        count: 3
      },
      13: {
        id: 13,
        name: 'Physics',
        count: 7
      }
    }
  }
}
Is there a way to search and get the total number of users whose count in Maths is greater than 0, where 12 will be in a variable, say subject_id?
I tried searching the docs but couldn't find a single example to use.
I am new to Elasticsearch; any help would be appreciated. Thanks.
Create an object first, like this:
var queryObj = {
  "query": {
    "range": {
    }
  }
};

// Build the dynamic path to match the data above: subject.<id>.count
queryObj.query.range['subject.' + data.subject_id + '.count'] = {
  "gte": 1
};
Then pass this object as the body of the Elasticsearch search, like this:
elasticClient.search({
  index: indexName,
  type: type,
  body: queryObj // note: body: { queryObj } would wrap the query in an extra key
}).then(promiseFunc)
I would like baffle.where({id: 1}).fetch() to always get the typeName attribute as part of the baffle model, without fetching it from baffleType explicitly each time.
The following works for me, but it seems that withRelated will only load the relation if the baffle model is fetched directly, not via another model's relation:
let baffle = bookshelf.Model.extend({
  constructor: function() {
    bookshelf.Model.apply(this, arguments);
    this.on('fetching', function(model, attrs, options) {
      options.withRelated = options.withRelated || [];
      options.withRelated.push('type');
    });
  },
  virtuals: {
    typeName: {
      get: function () {
        return this.related('type').attributes.typeName;
      }
    }
  },
  type: function () {
    return this.belongsTo(baffleType, 'type_id');
  }
});
let baffleType = bookshelf.Model.extend({});
What is the proper way to do that?
The issue on the repo relates to the fetched event; the fetching event, however, is working fine (v0.9.2).
So, just for example, if you have a third model like:
var Test = bookshelf.Model.extend({
  tableName: 'test',
  baffleField: function() {
    return this.belongsTo(baffle)
  }
})
and then do Test.forge().fetch({ withRelated: ['baffleField'] }), the fetching event on baffle will fire. However, the ORM will not include type (the nested related model) unless you specifically tell it to do so with:
Test.forge().fetch({ withRelated : ['baffleField.type']})
That said, I would try to avoid this if it ends up making N queries for N records.
UPDATE 1
I was talking about the same thing you are doing in the fetching event, like:
fetch: function fetch(options) {
  var options = options || {}
  options.withRelated = options.withRelated || [];
  options.withRelated.push('type');
  // Fetch uses all set attributes.
  return this._doFetch(this.attributes, options);
}
in Model.extend. However, as you can see, this relies on the internal _doFetch and might break across version changes.
This question is super old, but I'm answering anyway.
I solved this by just adding a new function, fetchFull, which keeps things pretty DRY.
let MyBaseModel = bookshelf.Model.extend({
  fetchFull: function() {
    let args;
    if (this.constructor.withRelated) {
      args = { withRelated: this.constructor.withRelated };
    }
    return this.fetch(args);
  },
});
let MyModel = MyBaseModel.extend({
  tableName: 'whatever',
}, {
  withRelated: [
    'relation1',
    'relation1.related2'
  ]
});
Then whenever you're querying, you can either call Model.fetchFull() to load everything, or in cases where you don't want to take a performance hit, you can still resort to Model.fetch().
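For instance, usage of the helper might look like this (model and relation names are the illustrative ones from the snippet above):

MyModel.where({ id: 1 }).fetchFull().then(model => {
  // relation1 (and its nested related2) are already loaded
  console.log(model.related('relation1').toJSON());
});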