How do I write an array of strings to an OPCUA Server? - node.js

This is how I'm currently trying to write an array of strings to an OPC UA variable on an OPC UA server (the second attached image shows its attributes). The method below takes in a string[] and tries to write this array to the variable. I can read the variable easily with a similar method.
async writeFeatureName(arrayToWrite: string[]): Promise<any> {
    console.log(arrayToWrite);
    let nodesToWrite = [{
        nodeId: "ns=3;s=\"DB_ScvsInterface01\".\"OUT\".\"FeatureName\"",
        attributeId: AttributeIds.Value,
        value: new DataValue({
            statusCode: StatusCodes.Good,
            value: new Variant({
                dataType: DataType.String,
                arrayType: VariantArrayType.Array,
                value: arrayToWrite
            })
        }),
    }];
    const dataValue = await this.session.write(nodesToWrite);
    winston.debug(`wrote Feature Name Array : ${dataValue.toString()}`);
    return dataValue;
}
When I try to write to the variable on the server, I get a type mismatch. The array is of type string[]. I've tried various recommendations, but I cannot find a clear example of writing an array to an array variable on the server. Is this even possible?
This image shows the error I'm getting, which is a type mismatch.
These are the server attributes for the variable I'm trying to write to.

You are using the correct technique to write an array of strings to the variable.
However, the ArrayDimensions attribute of the variable is [60], which specifies that the variable should contain 60 elements.
Does arrayToWrite actually contain 60 elements? Can you check?
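If it does not, one option is to pad the array to the declared length before writing. A minimal sketch, assuming the server really expects exactly 60 elements and that padding with empty strings is acceptable (padToLength is a hypothetical helper, not part of node-opcua):
// Hypothetical helper: pad (or trim) the array to the declared ArrayDimensions.
function padToLength(values: string[], length: number): string[] {
    const padded = values.slice(0, length);
    while (padded.length < length) {
        padded.push("");   // assumption: empty strings are acceptable padding
    }
    return padded;
}

// usage inside writeFeatureName, before building the Variant:
// value: padToLength(arrayToWrite, 60)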

Related

How to check for element not in a set in Amazon DynamoDB?

I'm trying to perform an update in Amazon DynamoDB, but only if a StringSet does not contain a specific string. Looking at the AWS docs, I saw that there is a contains() function, which can be used inside the ConditionExpression. Then I tried this code:
function addLecture(event) {
    const params = {
        TableName: 'lectures',
        Key: {
            'lecture_id': Number.parseInt(event.lecture_id)
        },
        UpdateExpression: 'SET numeric_attr = numeric_attr - :val ADD students :student',
        ConditionExpression: '(numeric_attr > :limit) AND (NOT contains(students, :student))',
        ExpressionAttributeValues: {
            ':val': 1,
            ':limit': 0,
            ':student': ddb.createSet(event.student)
        },
        ReturnValues: 'UPDATED_NEW'
    };
    return ddb.update(params).promise();
}
Anyway, if I try to perform the update, it is actually performed, even if the String is already in the StringSet.
How could I check for the absence of the String?
I think your condition is malformed:
:student resolves to ddb.createSet(event.student). You should pass a string value to contains(), not a string set. From the doc you pointed out:
The operand must be a String if the attribute specified by path is a String. If the attribute specified by path is a Set, the operand must be the set's element type.
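In other words, a sketch of the fix, assuming students is a StringSet and event.student is a plain string (the :studentStr and :studentSet placeholder names are illustrative):
function addLecture(event) {
    const params = {
        TableName: 'lectures',
        Key: {
            'lecture_id': Number.parseInt(event.lecture_id)
        },
        UpdateExpression: 'SET numeric_attr = numeric_attr - :val ADD students :studentSet',
        // contains() receives the plain string, not the set
        ConditionExpression: '(numeric_attr > :limit) AND (NOT contains(students, :studentStr))',
        ExpressionAttributeValues: {
            ':val': 1,
            ':limit': 0,
            ':studentStr': event.student,                    // plain string for contains()
            ':studentSet': ddb.createSet([event.student])    // string set for ADD
        },
        ReturnValues: 'UPDATED_NEW'
    };
    return ddb.update(params).promise();
}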

How can I mask json using json-masker for fields with "-" in it?

My requirement is to mask certain fields of a JSON object while logging it. I am working in node.js and have used the json-masker library. When I pass the JSON path of an attribute with "-" in its name in the "whitelist" parameter, I get a lexical error.
JSON
{
    "attribute1": "value1",
    "attribute2": "value2",
    "attribute-name": "value3"
}
Code
const masker = require('json-masker');
const mask = masker({
    whitelist: ['$.attribute1', '$.attribute-name']
});
Error
Error: Lexical error on line 1. Unrecognized text.
$.attribute-name
Also, is there a way to specify only the attributes that need to be masked rather than the ones that need not be masked (as with whitelist)?
Please suggest if there is a better approach using any other function/library.
Please note that I am receiving this JSON, so I cannot change the key names.
The correct syntax is '$["attribute-name"]' instead of '$.attribute-name'.
The $ paths are processed by jsonpath, a dependency of json-masker. This is discussed in one of its GitHub issues (#90), where this solution is presented.
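Applied to the snippet above, the whitelist would then look something like this (a minimal sketch):
const masker = require('json-masker');
const mask = masker({
    // bracket notation lets jsonpath handle keys containing "-"
    whitelist: ['$.attribute1', '$["attribute-name"]']
});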
Use the maskdata npm module: https://www.npmjs.com/package/maskdata
You can mask JSON fields containing '-' without any extra effort, and you can mask nested fields too.
Example:
const MaskData = require('maskdata');

const maskJSONOptions = {
    // Character to mask the data. Default value is '*'
    maskWith: "*",
    // Field names to mask. It should be an array; can give multiple fields.
    fields: ['level1.level2.level3.field3', 'level1.level2.field2', 'level1.field1', 'value1']
};

const nestedObject = {
    level1: {
        field1: "field1",
        level2: {
            field2: "field2",
            level3: {
                field3: "field3",
            }
        }
    },
    value1: "value"
};

const maskedObj = MaskData.maskJSONFields(nestedObject, maskJSONOptions);
// Output: {"level1":{"field1":"******","level2":{"field2":"******","level3":{"field3":"******"}}},"value1":"*****"}

Mongoose accepts null for Number field

I have a mongoose schema where I'm storing a port number. I also have a default value set for the field.
port: {
    type: Number,
    default: 1234
}
If I don't get any value via my API, it gets set to 1234.
However, if someone sends null, it accepts null and saves it to the database.
Shouldn't it convert null to 1234? null is not a number! Am I understanding this wrong?
I am considering the solution given here, but I don't want to add extra code for something that should work without it (unless I'm wrong and it's not supposed to convert null to 1234).
See the comments in this issue:
https://github.com/Automattic/mongoose/issues/2438
null is a valid value for a Date property, unless you specify required. Defaults only get set if the value is undefined, not if it's falsy.
(It's about dates, but it applies to numbers just as well.)
Your options are to either:
add required to the field
add a custom validator that would reject it
use hooks/middleware to fix the issue
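For the first two options, a minimal sketch (the schema variable, validator, and message text are illustrative, not taken from the question):
const mongoose = require('mongoose');

const schema = new mongoose.Schema({
    port: {
        type: Number,
        default: 1234,
        required: true,   // option 1: an explicit null now fails validation (a missing value still gets the default)
        // option 2: a custom validator that rejects null explicitly
        validate: {
            validator: (value) => value !== null,
            message: 'port must not be null'
        }
    }
});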
You might get away with a pre-save or post-validate (or some other) hook like this:
YourCollection.pre('save', function (next) {
    if (this.port === null) {
        this.port = undefined;
    }
    next();
});
but probably you'll have to use something like:
YourCollection.pre('save', function (next) {
    if (this.port === null) {
        this.port = 1234; // get it from the schema object instead of hardcoding
    }
    next();
});
See also this answer for some tricks on how to make null trigger default values in function invocation:
Passing in NULL as a parameter in ES6 does not use the default parameter when one is provided
It is unfortunate that Mongoose cannot be configured to treat null as undefined (with some "not-null" option or something like that), because you sometimes work with data that arrived in a request as JSON, and serialization can convert undefined to null:
> JSON.parse(JSON.stringify([ undefined ]));
[ null ]
or even add null values where there was no (explicit) undefined:
> JSON.parse(JSON.stringify([ 1,,2 ]));
[ 1, null, 2 ]
As explained in the official Mongoose docs here:
Number
To declare a path as a number, you may use either the Number global constructor or the string 'Number'.
const schema1 = new Schema({ age: Number }); // age will be cast to a Number
const schema2 = new Schema({ age: 'Number' }); // Equivalent
const Car = mongoose.model('Car', schema2);
There are several types of values that will be successfully cast to a Number.
new Car({ age: '15' }).age; // 15 as a Number
new Car({ age: true }).age; // 1 as a Number
new Car({ age: false }).age; // 0 as a Number
new Car({ age: { valueOf: () => 83 } }).age; // 83 as a Number
If you pass an object with a valueOf() function that returns a Number, Mongoose will call it and assign the returned value to the path.
The values null and undefined are not cast.
NaN, strings that cast to NaN, arrays, and objects that don't have a valueOf() function will all result in a CastError.

Cheerio Map Strange Behaviour

I'm using map over a list of Cheerio results to return an attribute value. What I want is a variable containing a list of attribute values (in this case IDs), but instead I'm getting the IDs plus extra data.
The following code prints a list of IDs:
let ids = $('[data-profileid]').map(function() {
    console.log($(this).attr('data-profileid'))
})
Result:
1012938412
493240324
123948532
423948234
...
But the following code returns the IDs in a different format:
let ids = $('[data-profileid]').map(function() {
    return $(this).attr('data-profileid')
})
console.log(ids)
Results:
...
'69': '234234234',
'70': '9328402397432',
'71': '1324235234',
options:
{ withDomLvl1: true,
normalizeWhitespace: false,
xmlMode: false,
decodeEntities: true },
_root:
{ '0':
{ type: 'root',
name: 'root',
attribs: {},
...
What is all this extra data? It certainly isn't required. I'd rather just have an ordinary array.
According to http://api.jquery.com/map/:
As the return value is a jQuery object, which contains an array, it's very common to call .get() on the result to work with a basic array.
So it looks like this should work:
let ids = $('[data-profileid]').map(function() {
    return $(this).attr('data-profileid')
}).get()
What is all this extra data? It certainly isn't required. I'd rather just have an ordinary array.
Cheerio has a fluent API, meaning most of its functions return an object on which additional functions can be chained. If map just returned an "ordinary array" then you wouldn't be able to call additional Cheerio functions on the result. There aren't a lot of ways you can chain additional function calls onto the result of your map call, which returns an array of strings, but Cheerio's developers (taking a cue from jQuery's developers) chose to keep a consistent API rather than pepper it with special cases.
If you want an ordinary array, though, Cheerio gives you a handy toArray function:
let ids = $('[data-profileid]').map(function() {
    return $(this).attr('data-profileid')
});
console.log(ids.toArray());
// => [ '1012938412', '493240324', '123948532', '423948234' ]

How to use multiple types in node-elasticsearch-client?

I'm a newbie working on an ES project (Express JS + ES + MongoDB). I'm using https://github.com/richardwilly98/elasticsearch-river-mongodb to do the indexing. The following code works fine for a single index and type, but I have another type with the same index name ("type": "file_info"). Is there any way to use multiple types with the same index name?
For example: var type = ["stu_info", "file_info"];
var index = "studentdb";
var type = "stu_info";
var elasticSearchClient = new ElasticSearchClient(serverOptions);
elasticSearchClient.search(index, type, qryObj).
on('data', function (data) {
console.log(data)
})
Simply concatenate the types with a comma:
var type = 'my_type,my_other_type,my_third_type';

elasticSearchClient.search('my_index_name', type, qryObj)
    .on('data', function(data) {
        console.log(JSON.parse(data))
    })
    .exec();
The Elasticsearch.js API also has a search method.
type: String, String[], Boolean — A comma-separated list of document types to search; leave empty to perform the operation on all types.
See the search API reference: https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/api-reference.html#api-search
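If you switch to the official elasticsearch.js client, a minimal sketch could look like this (the host and index/type names are illustrative, and qryObj is the query body from the question):
const elasticsearch = require('elasticsearch');
const client = new elasticsearch.Client({ host: 'localhost:9200' });

client.search({
    index: 'studentdb',
    type: ['stu_info', 'file_info'],   // a String[] searches both types
    body: qryObj
}).then(function (resp) {
    console.log(resp.hits.hits);
}, function (err) {
    console.error(err.message);
});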
