I'm using the latest version of the elasticsearch npm package against Elasticsearch 6.4 and trying to put a new script.
According to their documentation, the putScript function takes id and body properties.
So when I call it, for instance:
client.putScript({
  id: 'date_formatter',
  body: {
    lang: "painless",
    source: `// Get each field value as string
String datetime = doc[params.field].value.toString();
// Create format object based on string
DateTimeFormatter formatter = DateTimeFormatter.ofPattern(params.format);
// cast datetime into ZonedDateTime to use format function
ZonedDateTime zdt = ZonedDateTime.parse(datetime);
// return formatted date
return zdt.format(formatter);`
  }
})
It returns { acknowledged: true } as expected, but when I check it through Kibana, it returns:
{
  "_id": "date_formatter",
  "found": true,
  "script": {
    "lang": "mustache",
    "source": """{"lang":"painless"}"""
  }
}
Is there any way to put a script into Elasticsearch through the Node client?
You need to wrap both lang and source in a script section, basically the same way as described here:
client.putScript({
  id: 'date_formatter',
  body: {
    script: {
      lang: "painless",
      source: `// Get each field value as string
String datetime = doc[params.field].value.toString();
// Create format object based on string
DateTimeFormatter formatter = DateTimeFormatter.ofPattern(params.format);
// cast datetime into ZonedDateTime to use format function
ZonedDateTime zdt = ZonedDateTime.parse(datetime);
// return formatted date
return zdt.format(formatter);`
    }
  }
})
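To double-check that the script was stored with the right language, you can read it back through the same client. A small sketch using getScript from the same legacy elasticsearch client (the response shape matches the Kibana output above):

client.getScript({ id: 'date_formatter' })
  .then(resp => {
    // Should now log "painless" and the full Painless source
    console.log(resp.script.lang, resp.script.source);
  })
  .catch(console.error);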
I define a schema like this:
const query = new GraphQLObjectType({
  name: 'Query',
  fields: {
    quote: {
      type: queryType,
      args: {
        id: { type: QueryID },
      },
    },
  },
});

const schema = new GraphQLSchema({
  query,
});
QueryID is a customised scalar type:
const QueryID = new GraphQLScalarType({
  name: 'QueryID',
  description: 'query id field',
  serialize(dt) {
    // value sent to the client
    return dt;
  },
  parseLiteral(ast) {
    if (ast.kind === 'IntValue') {
      return Number(ast.value);
    }
    return null;
  },
  parseValue(v) {
    // value from the client
    return v;
  },
});
Client query:
query {
  quote(id: 1)
}
I found that the parseValue method is not called when clients send a query to my server. I can see that parseLiteral is called correctly.
In most of the documentation I can find, people use gql to define the schema and put scalar QueryID in the schema definition. But in my case I am using a GraphQLSchema object for the schema. Is this the root cause? If so, what is the best way to make it work? I don't want to switch to the gql format because I need to construct my schema at runtime.
serialize is only called when sending the scalar back to the client in the response. The value it receives as a parameter is the value returned in the resolver (or if the resolver returned a Promise, the value the Promise resolved to).
parseLiteral is only called when parsing a literal value in a query. Literal values include strings ("foo"), numbers (42), booleans (true) and null. The value the method receives as a parameter is the AST representation of this literal value.
parseValue is only called when parsing a variable value in a query. In this case, the method receives as a parameter the relevant JSON value from the variables object submitted along with the query.
So, assuming a schema like this:
type Query {
someField(someArg: CustomScalar): String
someOtherField: CustomScalar
}
serialize:
query {
  someOtherField # returns a CustomScalar, so serialize runs
}
parseLiteral:
query {
  someField(someArg: "something")
}
parseValue:
query ($myVariable: CustomScalar) {
  someField(someArg: $myVariable)
}
I want to be able to pass a dynamic BSON variable to $match in MongoDB.
Here is what I have tried:
var query = "\"info.name\": \"ABC\"";
and
var query = {
  info: {
    name: "ABC"
  }
}
Neither of these works when passing the variable query to $match (like below):
$match: {
  query
}
but passing it explicitly like below does work:
$match: {
  "info.name": "ABC"
}
It works when you pass a query object like this:
var query = {
  "info.name": "ABC"
}
and pass it into the aggregation pipeline like so:
{ $match: query }
You can see the details in the MongoDB Node.js Driver Tutorials.
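Here is a minimal end-to-end sketch with the Node.js driver (the connection string, database, and collection names are placeholders):

const { MongoClient } = require('mongodb');

async function run() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const coll = client.db('test').collection('items');

  // Dot notation reaches into the embedded document
  const query = { 'info.name': 'ABC' };
  const docs = await coll.aggregate([{ $match: query }]).toArray();
  console.log(docs);

  await client.close();
}

run().catch(console.error);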
You cannot use a nested object to query embedded fields, like:
var query = {
  info: {
    name: "ABC"
  }
}
check here
unless info contains only the name field, in which case it can be used in such a manner (a nested object is matched as an exact subdocument). But again you have to pass it with { $match: query }, like here.
Here is what went wrong:
var query = "\"info.name\": \"ABC\"";
{ $match: "\"info.name\": \"ABC\"" } // this is a single string
Here $match receives a single string rather than a filter document, which is why nothing is filtered.
But when you explicitly pass "info.name": "ABC" as a key/value pair, it works.
For data like:
{
  _id: ObjectId(XXXXXXXXXX),
  info: { name: "ABC" }
}
you can use this aggregation query:
// Create an object and use it in the $match stage
const data = { "info.name": "ABC" };
// Use this object in the match stage
.aggregate([{ $match: data }])
If you have an array of objects, then you need to use $elemMatch.
For data like:
{
  _id: ObjectId(XXXXXXXXXX),
  info: [{ name: "ABC" }, { name: "DEF" }]
}
// use $elemMatch for an array of objects
.aggregate([{ $match: { info: { $elemMatch: { name: "ABC" } } } }])
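Worth noting: dot notation also reaches into arrays, so { "info.name": "ABC" } would match the array document above as well. $elemMatch becomes essential when several conditions must hold on the same array element, for example (the role field here is hypothetical):

// Both conditions must be satisfied by the SAME element of info;
// two separate dot-notation conditions could each match a different element
.aggregate([{ $match: { info: { $elemMatch: { name: "ABC", role: "admin" } } } }])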
As the MongoDB documentation says:
To specify a query condition on fields in an embedded/nested document, use dot notation.
The actual problem is that you have your object property name in a variable in your JS code. Please check how to add a property to a JavaScript object using a variable as the name.
Here is how you can do it:
var propertyName = "info.name"
var query = {}
query[propertyName] = "ABC"
...
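With ES2015 computed property names, the same thing can be written inline:

var propertyName = "info.name"
// The bracketed key is evaluated, so the property becomes "info.name"
var query = { [propertyName]: "ABC" }
// query is now { "info.name": "ABC" }, ready for { $match: query }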
When using the library mongoose-uuid, I am able to set up UUID types for my schemas, so when I read the data it is in string (UTF-8) format and when I save the data it is stored as a UUID (BSON Binary subtype 4). This works great with top-level or flat direct values and ref definitions in my schema. However, when I have UUIDs in an array of refs in a schema, the array saves to the database correctly, but when it is presented it is in its raw type. In the example below you can see scope_id is presented in the right format but the entitlements are not.
Here are the versions I am using:
mongoose-uuid - 2.3.0
mongoose - 5.5.11
I have tried modifying the library (mongoose-uuid) by changing the getter and converting the value; however, when I do so, it works when presenting but fails when saving to the database. This is most likely because the value is converted or cast before being saved.
Here is an example schema:
{
  "code": {
    "type": String,
    "required": true
  },
  "scope_id": {
    "type": mongoose.Types.UUID,
    "ref": "scopes"
  },
  "entitlements": [{
    "type": mongoose.Types.UUID,
    "ref": "entitlements"
  }]
}
Actual response:
{
  "entitlements": [
    "zMihi1BKRomM1Q41p7hgLA==",
    "ztOYL7n1RoGA6aoc0TcqoQ=="
  ],
  "code": "APPUSR",
  "scope_id": "b8f80c82-8325-4ffd-bfd7-e373a90e7c45",
  "id": "32e79061-e531-45ad-b934-56811e2ad713"
}
Expected response:
{
  "entitlements": [
    "ccc8a18b-504a-4689-8cd5-0e35a7b8602c",
    "ced3982f-b9f5-4681-80e9-aa1cd1372aa1"
  ],
  "code": "APPUSR",
  "scope_id": "b8f80c82-8325-4ffd-bfd7-e373a90e7c45",
  "id": "32e79061-e531-45ad-b934-56811e2ad713"
}
As mentioned above, the code does work but breaks another part of the code. I found a solution that corrects this:
It is a slight amendment to the code in the other answer below:
SchemaUUID.prototype.cast = function (value, doc, init) {
  console.log("cast", value, doc, init)
  if (value instanceof mongoose.Types.Buffer.Binary) {
    if (init && doc instanceof mongoose.Types.Embedded) {
      return getter(value);
    }
    return value;
  }
  if (typeof value === 'string') {
    var uuidBuffer = new mongoose.Types.Buffer(uuidParse.parse(value));
    uuidBuffer.subtype(bson.Binary.SUBTYPE_UUID);
    return uuidBuffer.toObject();
  }
  throw new Error('Could not cast ' + value + ' to UUID.');
};
This alternate version of the code allows for updates such as POST and PATCH to be applied.
As per my observation, if you change the function below in mongoose-uuid, it works fine:
SchemaUUID.prototype.cast = function (value, doc, init) {
  console.log("cast", value, doc, init)
  if (value instanceof mongoose.Types.Buffer.Binary) {
    if (init) {
      return getter(value);
    } else {
      return value;
    }
  }
  if (typeof value === 'string') {
    var uuidBuffer = new mongoose.Types.Buffer(uuidParse.parse(value));
    uuidBuffer.subtype(bson.Binary.SUBTYPE_UUID);
    return uuidBuffer.toObject();
  }
  throw new Error('Could not cast ' + value + ' to UUID.');
};
Basically, when you save objects init is false, and when a document is initialized (hydrated from the database) init is true.
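To see what the string branch of that cast produces on its own, here is a standalone sketch using the same uuidParse and bson modules referenced in the code above (the sample UUID comes from the question):

const uuidParse = require('uuid-parse');
const bson = require('bson');

// Convert a UUID string into the Binary (BSON subtype 4) form
// that ends up stored in MongoDB
const bytes = uuidParse.parse('b8f80c82-8325-4ffd-bfd7-e373a90e7c45');
const bin = new bson.Binary(Buffer.from(bytes), bson.Binary.SUBTYPE_UUID);

console.log(bin.sub_type); // 4
console.log(bin.length()); // 16 bytes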
I have encoded string data to Base64 and set the output on a custom field whose type is Long Text. In the record's user interface I can see the whole encoded value, but when I try to get the value with rec.getText({fieldId:'customfieldname'}), it somehow breaks the value and doesn't return the whole thing. Is there any size limit on custom field values?
UserEvent script to get the custom field value:
/**
 * @NApiVersion 2.x
 * @NScriptType UserEventScript
 */
define([], function () {
  function beforeSubmit(scriptContext) {
    try {
      var invrecord = scriptContext.newRecord;
      var encodedata = invrecord.getText({ fieldId: 'customfield' });
      log.debug({
        title: 'Custom field value',
        details: encodedata
      });
      return true;
    } catch (e) {
      log.error({
        title: e.name,
        details: e.message
      });
      return false;
    }
  }

  return {
    beforeSubmit: beforeSubmit
  };
});
To encode the field value I used the code below:
// 'encode' here is the N/encode module (loaded via define(['N/encode'], ...))
function encodeBase64Binary(strdata) {
  try {
    var base64EncodedString = encode.convert({
      string: strdata,
      inputEncoding: encode.Encoding.UTF_8,
      outputEncoding: encode.Encoding.BASE_64
    });
    return base64EncodedString;
  } catch (e) {
    log.error({
      title: e.name,
      details: e.message
    });
  }
}
The field does contain the value you're looking for; however, log.debug truncates its output to 3,999 characters. That's why you're not seeing the complete value.
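If you need to inspect the full value in the execution log, one workaround is to log it in slices. A sketch (logLongValue is a hypothetical helper; the chunk size mirrors the 3,999-character truncation limit mentioned above):

function logLongValue(title, value) {
  var CHUNK = 3999;
  for (var i = 0; i < value.length; i += CHUNK) {
    log.debug({
      title: title + ' (' + i + '-' + Math.min(i + CHUNK, value.length) + ')',
      details: value.substring(i, i + CHUNK)
    });
  }
}

// usage inside beforeSubmit:
logLongValue('Custom field value', encodedata);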
In an update to our GraphQL API, only the model's _id field is required, hence the ! in the SDL below. Other fields such as name don't have to be included in an update, but they also cannot be null. Currently, excluding the ! from the name field lets the end user omit the name in an update, but it also lets them pass a null value for the name, which cannot be allowed.
A null value lets us know that a field needs to be removed from the database.
Below is an example of a model where this would cause a problem - the Name custom scalar doesn't allow null values but GraphQL still allows them through:
type language {
  _id: ObjectId
  iso: Language_ISO
  auto_translate: Boolean
  name: Name
  updated_at: Date_time
  created_at: Date_time
}

input language_create {
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name!
}

input language_update {
  _id: ObjectId!
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name
}
When a null value is passed in, it bypasses our scalars, so we cannot throw a user input validation error when null isn't an allowed value.
I am aware that ! means non-nullable and that the lack of the ! means the field is nullable. However, it is frustrating that, as far as I can see, we cannot specify the exact allowed values for a field that is optional rather than required. This issue only occurs on updates.
Are there any ways to work around this issue through custom Scalars without having to start hardcoding logic into each update resolver which seems cumbersome?
EXAMPLE MUTATION THAT SHOULD FAIL
mutation tests_language_create($input: language_update!) {
  language_update(input: $input) {
    name
  }
}
Variables:
input: {
  _id: "1234",
  name: null
}
UPDATE 9/11/18: for reference, I can't find a way around this as there are issues with using custom scalars, custom directives and validation rules. I've opened an issue on GitHub here: https://github.com/apollographql/apollo-server/issues/1942
What you're effectively looking for is custom validation logic. You can add any validation rules you want on top of the "default" set that is normally included when you build a schema. Here's a rough example of how to add a rule that checks for null values on specific types or scalars when they are used as arguments:
const { specifiedRules } = require('graphql/validation')
const { GraphQLError } = require('graphql/error')

const typesToValidate = ['Foo', 'Bar']

// This returns a "Visitor" whose properties get called for
// each node in the document that matches the property's name
function CustomInputFieldsNonNull(context) {
  return {
    Argument(node) {
      const argDef = context.getArgument()
      const checkType = typesToValidate.includes(argDef.astNode.type.name.value)
      if (checkType && node.value.kind === 'NullValue') {
        context.reportError(
          new GraphQLError(
            `Type ${argDef.astNode.type.name.value} cannot be null`,
            node,
          ),
        )
      }
    },
  }
}
const { ApolloServer } = require('apollo-server')

// We're going to override the validation rules, so we want to grab
// the existing set of rules and just add on to it
const validationRules = specifiedRules.concat(CustomInputFieldsNonNull)

const server = new ApolloServer({
  typeDefs,
  resolvers,
  validationRules,
})
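With that rule in place, a document that uses a null literal for one of the listed types is rejected during validation, before any resolver runs. You can check this directly with graphql-js's validate() (a sketch assuming a schema where someField takes an argument whose type is Foo):

const { parse, validate } = require('graphql')

const doc = parse('query { someField(someArg: null) }')
const errors = validate(schema, doc, validationRules)
// errors[0].message -> 'Type Foo cannot be null'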
EDIT: The above only works if you're not using variables, which isn't going to be very helpful in most cases. As a workaround, I was able to utilize a FIELD_DEFINITION directive to achieve the desired behavior. There are probably a number of ways you could approach this, but here's a basic example:
const { SchemaDirectiveVisitor } = require('graphql-tools')
const { defaultFieldResolver } = require('graphql')
const _ = require('lodash')

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const { args: { paths } } = this
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      for (const path of paths) {
        if (_.get(fieldArgs, path) === null) {
          throw new Error(`${path} cannot be null`)
        }
      }
      return resolve.apply(this, resolverArgs)
    }
  }
}
Then in your schema:
directive @nonNullInput(paths: [String!]!) on FIELD_DEFINITION
input FooInput {
  foo: String
  bar: String
}
type Query {
  foo(input: FooInput!): String @nonNullInput(paths: ["input.foo"])
}
Assuming that the "non null" input fields are the same each time the input is used in the schema, you could map each input's name to an array of field names that should be validated. So you could do something like this as well:
const nonNullFieldMap = {
  FooInput: ['foo'],
}
class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const visitedTypeArgs = this.visitedType.args
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      visitedTypeArgs.forEach(arg => {
        const argType = arg.type.toString().replace("!", "")
        // Skip argument types that have no entry in the map
        const nonNullFields = nonNullFieldMap[argType] || []
        nonNullFields.forEach(nonNullField => {
          const path = `${arg.name}.${nonNullField}`
          if (_.get(fieldArgs, path) === null) {
            throw new Error(`${path} cannot be null`)
          }
        })
      })
      return resolve.apply(this, resolverArgs)
    }
  }
}
And then in your schema:
directive @nonNullInput on FIELD_DEFINITION

type Query {
  foo(input: FooInput!): String @nonNullInput
}
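For completeness, the directive class still has to be wired in when the server is built; with apollo-server 2.x that is done through the schemaDirectives option (a sketch reusing the NonNullInputDirective class above):

const server = new ApolloServer({
  typeDefs,
  resolvers,
  schemaDirectives: {
    nonNullInput: NonNullInputDirective,
  },
})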