Google Cloud Vision annotateImage feature types don't exist - node.js

I'm trying to get more than 10 results in GC Vision with Node.js.
Since I cannot pass a custom request directly to webDetection(), I've tried to use annotateImage() instead:
const vision = require('@google-cloud/vision');
const client = new vision.ImageAnnotatorClient();

const webSearchRequest = {
  image: {
    source: {
      imageUri: `gs://${bucket.name}/${filePath}`
    }
  },
  features: [{
    maxResults: 50,
    type: vision.types.Feature.Type.WEB_DETECTION
  }]
};

return client.annotateImage(webSearchRequest).then(webResults => {
  console.log(webResults);
});
The output is "Cannot read property 'Feature' of undefined".

For visibility purposes I am posting my solution from the comments as an answer.
After doing some research and testing with this tool, I found that the type attribute should simply be the feature name as a plain string: type: 'WEB_DETECTION' instead of type: vision.types.Feature.Type.WEB_DETECTION.
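For reference, here is the request from the question with that change applied (a minimal sketch; bucket and filePath are assumed to be in scope as in the question):

const vision = require('@google-cloud/vision');
const client = new vision.ImageAnnotatorClient();

const webSearchRequest = {
  image: {
    source: {
      imageUri: `gs://${bucket.name}/${filePath}`
    }
  },
  features: [{
    maxResults: 50,
    // the feature type is given as the enum's string name
    type: 'WEB_DETECTION'
  }]
};

return client.annotateImage(webSearchRequest).then(webResults => {
  console.log(webResults);
});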


Contentful NodeJs SDK, Client.getEntries | Load nested ContentType fields

I have the following structure
User {
  image: Asset
  ...
}

Comment {
  author: User
  ...
}

BlogArticle {
  slug: Text
  author: User
  comments: Comment[]
}
When I pull entries with the following method
const articles = await client.getEntries({ content_type: "BlogArticle" })
console.log(articles.entries.fields.comments)
I only get the sys property for the author
[
  {
    author: {
      sys: {
        ...
      }
      fields ??????
    }
  }
]
PS: This is the case for all types at the second level of nesting.
I checked the docs and the APIs, but with no luck.
Any help?
I created a similar content model and was able to get the fields of the author successfully. One thing you can do is use the include parameter. With the include parameter, your code should look as follows:
const articles = await client.getEntries({ content_type: "BlogArticle", include: 2 })
console.log(articles.entries.fields.comments)
You can learn more about it here
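For completeness, a minimal sketch of what the resolved response looks like (field names are taken from the question's content model; note that entries live in the items array of the response):

const articles = await client.getEntries({ content_type: "BlogArticle", include: 2 });

// With include: 2, linked entries up to two levels deep are returned and the
// SDK resolves the links in place, so nested fields are available directly.
const firstArticle = articles.items[0];
for (const comment of firstArticle.fields.comments) {
  console.log(comment.fields.author.fields); // no longer just sys
}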

SNS SDK for NodeJS won't create FIFO topic

When I create a topic using sns.createTopic (like the code below), it won't accept the booleans and fails with 'InvalidParameterType: Expected params.Attributes['FifoTopic'] to be a string', even though the docs say to provide a boolean value. And when I provide it with a string of 'true' it still doesn't set the topic type to FIFO. Does anyone know why?
Here's the code:
const TOPIC = {
  Name: 'test.fifo',
  Attributes: {
    FifoTopic: true,
    ContentBasedDeduplication: true
  },
  Tags: [{
    Key: 'test-key',
    Value: 'test-value'
  }]
};

sns.createTopic(TOPIC).promise().then(console.log);
I used aws-sdk v2 and sent FifoTopic and ContentBasedDeduplication as strings.
The code below works fine for me:
const AWS = require('aws-sdk');

const TOPIC = {
  Name: 'test.fifo',
  Attributes: {
    FifoTopic: "true",
    ContentBasedDeduplication: "true"
  },
  Tags: [{
    Key: 'test-key',
    Value: 'test-value'
  }]
};

let sns = new AWS.SNS();
let response3 = await sns.createTopic(TOPIC).promise();
console.log(response3);
Note: Make sure your lambda has the correct permissions.
You will get attributes like FifoTopic and ContentBasedDeduplication back when calling getTopicAttributes:
let respo = await sns.getTopicAttributes({
  TopicArn: "arn:aws:sns:us-east-1:XXXXXXX:test.fifo"
}).promise();
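A quick way to confirm the topic type from that call (the Attributes map is the standard shape of the getTopicAttributes result):

console.log(respo.Attributes.FifoTopic);                 // "true"
console.log(respo.Attributes.ContentBasedDeduplication); // "true"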

how to convert JSON to XML with Nodejs?

I know that a JSON object can be turned into an XML file with the js2xml library, which is why I'm trying to convert the following JSON to XML.
How can I achieve this in NodeJS?
I can't find an answer or documentation that helps me.
Here is the model:
const UserSchema = new mongoose.Schema({
  email: {
    type: String,
    required: [true, "Please provide email address"],
    unique: true,
    match: [
      /^(([^<>()[\]\\.,;:\s@"]+(\.[^<>()[\]\\.,;:\s@"]+)*)|(".+"))@((\[[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\])|(([a-zA-Z\-0-9]+\.)+[a-zA-Z]{2,}))$/,
      "Please provide a valid email",
    ],
  },
  password: {
    type: String,
    required: [true, "Please add a password"],
    minlength: 6,
    select: false,
  },
});

const User = mongoose.model("User", UserSchema);
module.exports = User;
I used this example, but it didn't work for me:
function groupChildren(obj) {
  for (const prop in obj) {
    if (typeof obj[prop] === 'object') {
      groupChildren(obj[prop]);
    } else {
      // move leaf values into the '$' bag, which xml2js treats as attributes
      obj['$'] = obj['$'] || {};
      obj['$'][prop] = obj[prop];
      delete obj[prop];
    }
  }
  return obj;
}
const xml2js = require('xml2js');

const obj = {
  Level1: {
    attribute: 'value',
    Level2: {
      attribute1: '05/29/2020',
      attribute2: '10',
      attribute3: 'Pizza'
    }
  }
};

const builder = new xml2js.Builder();
const xml = builder.buildObject(groupChildren(obj));
console.log(xml);
When converting JSON to XML, one has to ask: what XML do you want to convert it to? Does it have to be a specific XML format, or will any old XML do?
If any old XML will do, then you can usually find some library to do the job, such as js2xml or js2xmlparser. The problem with these libraries is that they usually offer very little control over how the XML is generated, especially for example if there are JSON objects with keys that are not valid XML names (which doesn't apply in your case).
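For the "any old XML" route, a minimal sketch with js2xmlparser (one of the libraries mentioned above; the field names come from the question's User model):

const js2xmlparser = require("js2xmlparser");

const user = { email: "foo@bar.com", password: "secret" };

// parse(rootName, object) returns an XML string, along the lines of:
// <user>
//     <email>foo@bar.com</email>
//     <password>secret</password>
// </user>
console.log(js2xmlparser.parse("user", user));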
If you want a specific XML format then I would recommend using XSLT 3.0, which is available on node.js in the form of Saxon-JS. [Disclaimer: my company's product]. If you are interested in pursuing this approach, then tell us what you want the output to look like, and we can help you create it.
There are many different packages for XML serialization.
Most of them enforce a specific XML and JSON mapping convention.
Others require you to build the XML document in code.
Finally, there are solutions that do this with decorators. Those give you freedom in defining the structure without having to build the document entirely in code.
As an example, take the xml-decorators package: you define the XML mapping using a class, and add decorators on top of each field to describe how it should be mapped to XML.
import { XMLAttribute, xml } from 'xml-decorators';

const NS = 'ns';

export class User {

  @XMLAttribute({namespace: NS})
  private email: string;

  @XMLAttribute({namespace: NS})
  private password: string;

  constructor(email: string, password: string) {
    this.email = email;
    this.password = password;
  }
}
And finally, to actually serialize:
const user = new User('foo@bar.com', 'secret');
const userXml = xml.serialize(user);
Conceptually, this is certainly a robust solution, since it strongly resembles how Java XML binding (JAXB) and C# XML serialization work.

How do I select specific column data to be displayed in my bookshelf model belongsTo relationship in Nodejs?

This is a contrived example of what I would like to do:
Suppose I have a database of teams and players:
team:
  -> id
  -> color
  -> rank
  -> division

player:
  -> id
  -> team_id
  -> number
  -> SECRET
And the following bookshelf models:
var Base = require('./base');

const Player = Base.Model.extend(
  {
    tableName: "players",
    nonsecretdata: function() {
      return this.belongsTo('Team');
    }
  },
  {
    fields: {
      id: Base.Model.types.integer,
      team_id: Base.Model.types.integer,
      number: Base.Model.types.integer,
      SECRET: Base.Model.types.string,
    }
  }
);

module.exports = Base.model('Player', Player);
And
var Base = require('./base');

const Team = Base.Model.extend(
  {
    tableName: "teams",
  },
  {
    fields: {
      id: Base.Model.types.integer,
      color: Base.Model.types.string,
      rank: Base.Model.types.integer,
      division: Base.Model.types.string,
    }
  }
);

module.exports = Base.model('Team', Team);
My question is: how can I limit the scope of Player so that SECRET is not fetched by calls that join player and team through the nonsecretdata relation?
I am new to Bookshelf so if any other information is needed, please let me know. Thank you
Edit: Do I need to create a separate model?
The only way to do this using Bookshelf itself would be to delete the individual fields from the object after fetching the entire model.
A potentially better solution for this use case would be to define a custom Data Access Object (DAO) class that runs a SQL query selecting only the information you want, and then use that DAO instead of Bookshelf. That way the SQL code is still abstracted away from the code that is requesting the information, and SECRET (or any other sensitive column later added to the table) will never be included in the fetch.
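A minimal sketch of that DAO approach using knex directly (Bookshelf is built on knex; table and column names come from the question, the module name and the configured knex instance are assumed):

// players-dao.js -- hypothetical module name
module.exports = function makePlayersDao(knex) {
  return {
    // join players to teams, selecting every column except SECRET
    findWithTeam() {
      return knex('players')
        .join('teams', 'teams.id', 'players.team_id')
        .select(
          'players.id', 'players.team_id', 'players.number',
          'teams.color', 'teams.rank', 'teams.division'
        );
    }
  };
};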

GraphQL mutation that accepts an array of dynamic size in one request as input with NodeJs

I want to pass an object array like [{questionId1, value1}, {questionId2, value2}, {questionId3, value3}] of dynamic size in a GraphQL mutation with NodeJS.
.........
args: {
  input: {
    type: new GraphQLNonNull(new GraphQLInputObjectType({
      name: 'AssessmentStep3Input',
      fields: {
        questionId: {
          name: 'Question ID',
          type: new GraphQLNonNull(GraphQLID)
        },
        value: {
          name: 'Question Value',
          type: new GraphQLNonNull(GraphQLBoolean)
        }
      }
    }))
  }
},
.........
How can I do that with the given sample of code?
Thanks
If you want to pass an object array with a GraphQL mutation, you need to use GraphQLList, which allows you to pass an array of dynamic size for the given input type.
Here is an example:
........
........
args: {
  input: {
    type: new GraphQLNonNull(new GraphQLList(new GraphQLInputObjectType({
      name: 'AssessmentStep3Input',
      fields: {
        questionId: {
          name: 'Question ID',
          type: new GraphQLNonNull(GraphQLID)
        },
        value: {
          name: 'Question Value',
          type: new GraphQLNonNull(GraphQLBoolean)
        }
      }
    })))
  }
},
........
........
Hope it helps.
Thanks
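For reference, a minimal sketch of executing such a mutation with a variable-length array (the mutation name submitAssessmentStep3 and the schema variable are hypothetical; the input objects match the shape defined above, and the object-argument form of graphql() assumes graphql-js v15+):

const { graphql } = require('graphql');

const source = `
  mutation Submit($input: [AssessmentStep3Input]!) {
    submitAssessmentStep3(input: $input)
  }
`;

graphql({
  schema,  // assumed: your executable GraphQLSchema
  source,
  variableValues: {
    input: [
      { questionId: '1', value: true },
      { questionId: '2', value: false },
      { questionId: '3', value: true }
    ]
  }
}).then(result => console.log(result));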
I just published an article on this, so you can take a look if you would like more detail. This is the repository with the examples, where the createUsers mutation is implemented: https://github.com/atherosai/express-graphql-demo/blob/feature/5-modifiers/server/graphql/users/userMutations.js. You can take a look at how it is implemented, but in general the above answer is correct. You can pass as many objects as you like in the array (unless you implement some limit on the number of items; there is none by default).
