I'm using Google Cloud Vision to detect text on an image. This works about 80% of the time. The other 20%, I get this error:
Error: 3 INVALID_ARGUMENT: Request must specify image and features.
at Object.callErrorFromStatus (C:\Users\emily\workspace\bot\node_modules\@grpc\grpc-js\build\src\call.js:31:26)
at Object.onReceiveStatus (C:\Users\emily\workspace\bot\node_modules\@grpc\grpc-js\build\src\client.js:180:52)
at Object.onReceiveStatus (C:\Users\emily\workspace\bot\node_modules\@grpc\grpc-js\build\src\client-interceptors.js:336:141)
at Object.onReceiveStatus (C:\Users\emily\workspace\bot\node_modules\@grpc\grpc-js\build\src\client-interceptors.js:299:181)
at C:\Users\emily\workspace\bot\node_modules\@grpc\grpc-js\build\src\call-stream.js:160:78
at processTicksAndRejections (node:internal/process/task_queues:78:11) {
  code: 3,
  details: 'Request must specify image and features.',
  metadata: Metadata { internalRepr: Map(0) {}, options: {} },
  note: 'Exception occurred in retry method that was not classified as transient'
}
When I googled this issue, it seems I need to send specific headers with my request to resolve this, basically as specified here: https://cloud.google.com/vision/docs/ocr#specify_the_language_optional
However, I have no idea how to send these request parameters with the Node.js code I'm using, and I can't find any examples anywhere. Can someone please help me figure out how to use this? My current code is this:
// Performs text detection on the image file using GCV
// (setup assumed: `googleapis` is a Vision ImageAnnotatorClient,
// `attachment` comes from elsewhere, e.g. a chat message attachment)
const Jimp = require('jimp');
const vision = require('@google-cloud/vision');
const googleapis = new vision.ImageAnnotatorClient();

(async () => {
  await Jimp.read(attachment.url).then(image => {
    return image
      .invert()
      .contrast(0.5)
      .brightness(-0.25)
      .write('temp.png');
  });
  const [result] = await googleapis.textDetection('temp.png');
  const fullImageResults = result.textAnnotations;
})();
Thanks!
If you are using Node.js with the Vision API, you can refer to this sample quickstart code for using the Node.js client library for TEXT_DETECTION.
For the error that you are facing, you can refer to the code below to see how to add the request parameters:
index.js :
async function quickstart() {
  const vision = require('@google-cloud/vision');
  const client = new vision.ImageAnnotatorClient();

  const request = {
    "requests": [
      {
        "image": {
          "source": {
            "imageUri": "gs://bucket1/download.png"
          }
        },
        "features": [
          {
            "type": "TEXT_DETECTION"
          }
        ],
        "imageContext": {
          "languageHints": ["en"]
        }
      }
    ]
  };

  const [result] = await client.batchAnnotateImages(request);
  const detections = result.responses[0].fullTextAnnotation;
  console.log(detections.text);
}

quickstart().catch(console.error);
In the code above I have stored the image in GCS and used the path of that image in my request.
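As a side note, for a single image the client library's textDetection helper builds the image and features fields for you, so you don't have to construct the batch request by hand. A minimal sketch, assuming the same GCS object as above (I believe the helper accepts both gs:// URIs and local file paths):
const vision = require('@google-cloud/vision');
const client = new vision.ImageAnnotatorClient();

async function detect() {
  // The helper wraps batchAnnotateImages and fills in the TEXT_DETECTION feature.
  const [result] = await client.textDetection('gs://bucket1/download.png');
  console.log(result.fullTextAnnotation.text);
}

detect().catch(console.error);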
Image: (sample image containing the printed text shown below)
Output:
It was the best of
times, it was the worst
of times, it was the age
of wisdom, it was the
age of foolishness...
If you want to use an image file stored on the local system, you can refer to the code below.
Since your file is on the local system, you first need to convert it to a base64-encoded string and pass that in the request parameters.
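For example, you could produce that base64 string with Node's fs module; a minimal sketch (temp.png is a placeholder for your local file):
const fs = require('fs');

// Read the local file and base64-encode it for the "content" field below.
const imageContent = fs.readFileSync('temp.png').toString('base64');
You would then pass imageContent as the value of requests[0].image.content instead of the hard-coded string.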
index.js :
async function quickstart() {
  const vision = require('@google-cloud/vision');
  const client = new vision.ImageAnnotatorClient();

  const request = {
    "requests": [
      {
        "image": {
          "content": "/9j/7QBEUGhvdG9...image contents...eYxxxzj/Coa6Bax//Z"
        },
        "features": [
          {
            "type": "TEXT_DETECTION"
          }
        ],
        "imageContext": {
          "languageHints": ["en"]
        }
      }
    ]
  };

  const [result] = await client.batchAnnotateImages(request);
  const detections = result.responses[0].fullTextAnnotation;
  console.log(detections.text);
}

quickstart();
I'm using React Native and my backend is Node. I'm trying to implement Pusher in my app.
Object {
"error": "Unable to retrieve auth string from auth endpoint - received status: 0 from http://10.0.27.124:8070/pusher/auth. Clients must be authenticated to join private or presence channels. See: https://pusher.com/docs/authenticating_users",
"status": 0,
"type": "AuthError",
}
Here is my React Native code:
const pusher = new Pusher('73286f08a5b2aeeea398', {
  cluster: 'ap1',
  authEndpoint: 'http://10.0.27.124:8070/pusher/auth',
});
console.log(pusher)
const presenceChannel = pusher.subscribe('presence-channel');
Here is my Node.js code:
exports.authPusher = function (req, res) {
  const socketId = req.body.socket_id;
  const channel = req.body.channel_name;
  console.log(req.body);

  const presenceData = {
    user_id: 'unique_user_id',
    user_info: { name: 'Mr Channels', twitter_id: '@pusher' },
  };

  const auth = pusher.authenticate(socketId, channel, presenceData);
  res.send(auth);
};
Thank you for answering.
I think you just forgot to put quote marks around the authEndpoint parameter. Pusher is trying to call http instead of http://10.0.27.124:8070/pusher/auth.
const pusher = new Pusher('73286f08a5b2aeeea398', {
  cluster: 'ap1',
  authEndpoint: 'http://10.0.27.124:8070/pusher/auth',
});
If you open the Network tab in your browser's developer tools, you should be able to see the actual request and debug it from there.
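As another side note, the Node code in the question calls pusher.authenticate without showing where pusher is defined; it needs a server-side Pusher instance. A minimal sketch, with placeholder credentials:
const Pusher = require('pusher');

// Server-side Pusher instance; appId and secret are placeholders.
const pusher = new Pusher({
  appId: 'YOUR_APP_ID',
  key: '73286f08a5b2aeeea398',
  secret: 'YOUR_APP_SECRET',
  cluster: 'ap1',
});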
Update
I'm able to get my original code, as well as the suggestions, working when running it in isolation. However, what I need to do is call it from within a Firebase onRequest or onCall function. When this code gets wrapped by these, the malformed headers and request for authorization are still an issue. We use many other APIs this way, so it's puzzling why the Clarifai API is having these issues. Any suggestions on using it with Firebase?
Original
New to Clarifai and having some authentication issues while attempting to retrieve model outputs from the Food Model.
I've tried two different keys:
API key generated from an app I created in the Portal
API key - the Personal Access Token I generated for myself
In both cases I encounter an "Empty or malformed authorization header" response.
{
  "status": {
    "code": 11102,
    "description": "Invalid request",
    "details": "Empty or malformed authorization header. Please provide an API key or session token.",
    "req_id": "xyzreasdfasdfasdfasdfasf"
  },
  "outputs": []
}
I followed these articles to piece together this code. It's running in a Node 10 environment.
Initialization
Food Model
Prediction
const { ClarifaiStub } = require('clarifai-nodejs-grpc');
const grpc = require('@grpc/grpc-js');

const stub = ClarifaiStub.json();
const metadata = new grpc.Metadata();
metadata.set("authorization", "Key xyzKey");

return new Promise((resolve, reject) => {
  stub.PostModelOutputs(
    {
      model_id: 'bd367be194cf45149e75f01d59f77ba7',
      inputs: [{ data: { image: { url: 'https://samples.clarifai.com/metro-north.jpg' } } }],
    },
    metadata,
    (err, response) => {
      if (err) {
        return reject(`ERROR: ${err}`);
      }
      resolve(JSON.stringify(response));
    }
  );
});
}
Update: There was an issue in versions prior to 7.0.2 where, if you had another library using #grpc/grpc-js with a different version, the grpc.Metadata object wasn't necessarily constructed from the library version that clarifai-grpc-nodejs was using.
To fix the issue, update the clarifai-grpc-nodejs library, and require the grpc object like this:
const {ClarifaiStub, grpc} = require("clarifai-nodejs-grpc");
Previously, the grpc object was imported directly from @grpc/grpc-js, which was the source of the problem.
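Putting that together, a minimal sketch (the key value is a placeholder):
const { ClarifaiStub, grpc } = require('clarifai-nodejs-grpc');

const stub = ClarifaiStub.json();
// The Metadata object now comes from the same grpc instance the stub uses.
const metadata = new grpc.Metadata();
metadata.set('authorization', 'Key YOUR_API_KEY');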
There are two ways of authenticating to the Clarifai API:
with an API key, which is application-specific, meaning that an API key is attached to an application and can only do operations inside that application,
with a Personal Access Token (PAT), which is user-specific, meaning you can access / manipulate / do operations on all the applications the user owns or has access to (and also create/update/delete applications themselves).
When using a PAT, you have to specify, in your request data, which application you are targeting. With an API key this is not needed.
I've tested your example (using Node 12, though it should work in 10 as well) with a valid API key and it works fine (after putting it into an async function). Here's a full runnable example (replace YOUR_API_KEY with your valid API key).
function predict() {
  const { ClarifaiStub } = require('clarifai-nodejs-grpc');
  const grpc = require('@grpc/grpc-js');

  const stub = ClarifaiStub.json();
  const metadata = new grpc.Metadata();
  metadata.set("authorization", "Key YOUR_API_KEY");

  return new Promise((resolve, reject) => {
    stub.PostModelOutputs(
      {
        model_id: 'bd367be194cf45149e75f01d59f77ba7',
        inputs: [{ data: { image: { url: 'https://samples.clarifai.com/metro-north.jpg' } } }],
      },
      metadata,
      (err, response) => {
        if (err) {
          return reject(`ERROR: ${err}`);
        }
        resolve(JSON.stringify(response));
      }
    );
  });
}

async function main() {
  const response = await predict();
  console.log(response);
}

main();
If you want to use a PAT in the above example, two things must change. Firstly, replace the API key with a PAT:
...
metadata.set("authorization", "Key YOUR_PAT");
...
Secondly, add the application ID to the method's request object.
...
stub.PostModelOutputs(
  {
    user_app_id: {
      user_id: "me", // The literal "me" resolves to your user ID.
      app_id: "YOUR_APPLICATION_ID"
    },
    model_id: 'bd367be194cf45149e75f01d59f77ba7',
    inputs: [{ data: { image: { url: 'https://samples.clarifai.com/metro-north.jpg' } } }],
  },
...
Make sure that you have respected the format for passing the key in your code, as such:
const metadata = new grpc.Metadata();
metadata.set("authorization", "Key {YOUR_CLARIFAI_API_KEY}");
Make sure that "Key" is present.
Let me know.
EDIT: It looks like Firebase doesn't support custom headers, which is likely affecting the 'Authorization' header. At least that's my best guess. See the comments in the following ticket.
Firebase hosting custom headers not working
The following code works for me:
{
  const { ClarifaiStub } = require('clarifai-nodejs-grpc');
  const grpc = require('@grpc/grpc-js');

  const stub = ClarifaiStub.json();
  const metadata = new grpc.Metadata();
  metadata.set("authorization", "Key {APP API KEY}");

  return new Promise((resolve, reject) => {
    stub.PostModelOutputs(
      {
        model_id: 'bd367be194cf45149e75f01d59f77ba7',
        inputs: [{ data: { image: { url: 'https://samples.clarifai.com/metro-north.jpg' } } }],
      },
      metadata,
      (err, response) => {
        if (err) {
          return reject(`ERROR: ${err}`);
        }
        console.log(JSON.stringify(response));
        resolve(JSON.stringify(response));
      }
    );
  });
}
There was a missing {, although I'm not sure if that is what is reflected in the actual code you are running. I'm using an app API key in this case (when you create an app, there will be an API key on the Application Details page).
It sounds like you might be using a Personal Access Token instead, which can be used like this:
{
  const { ClarifaiStub } = require('clarifai-nodejs-grpc');
  const grpc = require('@grpc/grpc-js');

  const stub = ClarifaiStub.json();
  const metadata = new grpc.Metadata();
  // Sounds like you've made the personal access token correctly - go into
  // settings, then authentication, then create one. Make sure it has proper
  // permissions (I believe all by default).
  metadata.set("authorization", "Key {Personal Access Token}");

  return new Promise((resolve, reject) => {
    stub.PostModelOutputs(
      {
        user_app_id: {
          user_id: "{USER ID}", // I used my actual ID, I did not put 'me'. You can find this under your profile.
          app_id: "{APP NAME}" // The app ID found in the upper left corner of the app after it is created - not the API key. This is generally what you named the app when you created it.
        },
        model_id: 'bd367be194cf45149e75f01d59f77ba7',
        inputs: [{ data: { image: { url: 'https://samples.clarifai.com/metro-north.jpg' } } }],
      },
      metadata,
      (err, response) => {
        if (err) {
          return reject(`ERROR: ${err}`);
        }
        console.log(JSON.stringify(response));
        resolve(JSON.stringify(response));
      }
    );
  });
}
Make sure to fill out {Personal Access Token}, {USER ID}, and {APP NAME}. I used my actual user ID (found in the profile), and the app name is not the API key for the app, but the name in the upper left corner when you're on the Application Details page. This call worked for me.
I am following this tutorial here: Tutorial
Everything seems OK and it allows me to do everything in the tutorial, but when I run the function I get this error.
textPayload: "TypeError: Cannot read property 'charCodeAt' of undefined
at peg$parsetemplate (/workspace/node_modules/google-gax/build/src/pathTemplateParser.js:304:17)
at Object.peg$parse [as parse] (/workspace/node_modules/google-gax/build/src/pathTemplateParser.js:633:18)
at new PathTemplate (/workspace/node_modules/google-gax/build/src/pathTemplate.js:55:54)
at segments.forEach.segment (/workspace/node_modules/google-gax/build/src/pathTemplate.js:120:29)
at Array.forEach (<anonymous>)
at PathTemplate.render (/workspace/node_modules/google-gax/build/src/pathTemplate.js:114:23)
at FirestoreAdminClient.databasePath (/workspace/node_modules/@google-cloud/firestore/build/src/v1/firestore_admin_client.js:904:57)
at exports.scheduledFirestoreExport (/workspace/index.js:13:31)
at Promise.resolve.then (/layers/google.nodejs.functions-framework/functions-framework/node_modules/#google-cloud/functions-framework/build/src/invoker.js:330:28)
at process._tickCallback (internal/process/next_tick.js:68:7)
insertId: "000000-8410c5c7-8304-42b6-b2b6-dd55a54e8cab"
resource: {2}
timestamp: "2020-07-11T18:14:35.981Z"
severity: "ERROR"
labels: {1}
logName: "projects/b-b-b-app/logs/cloudfunctions.googleapis.com%2Fcloud-functions"
trace: "projects/b-b-b-app/traces/d7c07a715d0106225d9963ce2a046489"
receiveTimestamp: "2020-07-11T18:14:44.813410062Z"
}
I can't see what the problem may be.
I changed the buckets and the app IDs as instructed in the tutorial.
I am on the Blaze plan and can export the database to the bucket manually from the shell using
gcloud firestore export gs://bbbdata-backup
I am using the GCP console on the Firebase site with this code.
const firestore = require('@google-cloud/firestore');
const client = new firestore.v1.FirestoreAdminClient();
const bucket = 'gs://bbbdata-backup';

exports.scheduledFirestoreExport = (event, context) => {
  const databaseName = client.databasePath(
    process.env.GCLOUD_PROJECT,
    '(default)'
  );

  return client
    .exportDocuments({
      name: databaseName,
      outputUriPrefix: bucket,
      // Leave collectionIds empty to export all collections
      // or define a list of collection IDs:
      // collectionIds: ['users', 'posts']
      collectionIds: [],
    })
    .then(responses => {
      const response = responses[0];
      console.log(`Operation Name: ${response['name']}`);
      return response;
    })
    .catch(err => {
      console.error(err);
    });
};
Following the tutorial referenced by the OP, I ran into precisely the same error. Runtime used: Node.js 14.
Root cause of the issue: the value of process.env.GCLOUD_PROJECT is undefined.
Workaround: go to GCP console -> Home and note your Project ID. Replace process.env.GCLOUD_PROJECT with the Project ID string. The Cloud Function will then work as expected.
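For example, a minimal sketch (the project ID string is a placeholder):
// Replace the env variable with your literal project ID.
const databaseName = client.databasePath('your-project-id', '(default)');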
Note: it appears to be a known issue that the GCLOUD_PROJECT environment variable was missing in the Node.js 10 runtime. This bug report contains a lot of additional pointers: https://github.com/firebase/firebase-functions/issues/437
I had a similar issue last year; you are probably missing some permission. I would do it this way, hope this works for you:
import * as functions from 'firebase-functions'
import { auth } from 'google-auth-library'

export const generateBackup = async () => {
  const client = await auth.getClient({
    scopes: [
      'https://www.googleapis.com/auth/datastore',
      'https://www.googleapis.com/auth/cloud-platform'
    ]
  })

  const path = `YOUR_FOLDER_NAME_FOR_THE_BACKUP`
  const BUCKET_NAME = `YOUR_BUCKET_NAME_HERE`
  const projectId = await auth.getProjectId()
  const url = `https://firestore.googleapis.com/v1beta1/projects/${projectId}/databases/(default):exportDocuments`
  const backup_route = `gs://${BUCKET_NAME}/${path}`

  return client.request({
    url,
    method: 'POST',
    data: {
      outputUriPrefix: backup_route,
      // collectionIds: [] // if you want to specify which collections to export; none means all
    }
  })
  .catch(async (e) => {
    return Promise.reject({ message: e.message })
  })
}
You can then decide what the trigger for this function should be and execute it accordingly.
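For example, a scheduled trigger might look like this (a sketch; the schedule string and function name are placeholders):
// Hypothetical scheduled trigger that runs the backup once a day.
export const scheduledBackup = functions.pubsub
  .schedule('every 24 hours')
  .onRun(() => generateBackup())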
Note: go to the IAM section of your project and find the App Engine service account; you will need to add the role Cloud Datastore Import Export Admin to it, otherwise it will fail.
You can read more about it here. It's very detailed.
Cheers.
I am following this article from medium https://blog.bitsrc.io/serverless-backend-using-aws-lambda-hands-on-guide-31806ceb735e
Everything works except that when I attempt to add a record to DynamoDB, I get an error that says "this is not a function".
const AWS = require('aws-sdk');
const client = new AWS.DynamoDB.DocumentClient();
const uuid = require('uuid');

module.exports.myHero = async (event) => {
  const data = JSON.parse(event.body);
  const params = {
    TableName: "myHeros",
    Item: {
      id: uuid(),
      name: data.name,
      checked: false
    }
  };

  await client.put(params).promise();

  return {
    statusCode: 200,
    body: JSON.stringify(data)
  };
};
{
  "errorMessage": "client.put(...).promise is not a function",
  "errorType": "TypeError",
  "stackTrace": [
    "module.exports.myHero (/var/task/create.js:30:27)"
  ]
}
In almost all cases, when you call a method xyz() on an AWS client object and it fails with ‘xyz is not a function’, the problem is that you are using an old version of an SDK that does not actually support that method.
Upgrading to the latest AWS SDK version will fix this problem.
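A quick way to confirm which SDK version your function is actually running is to log it from the handler; a minimal sketch:
// Logs the aws-sdk version bundled with the deployment or runtime.
console.log(require('aws-sdk/package.json').version);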
When initializing the DynamoDB client with new AWS.DynamoDB.DocumentClient(), please pass options (at least the region parameter) to the DocumentClient constructor.
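A minimal sketch of that (the region value is a placeholder):
const AWS = require('aws-sdk');

// Pass an explicit region when constructing the DocumentClient.
const client = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' });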
I have been tasked with making a POST API call to an Elasticsearch endpoint:
https://search-test-search-fqa4l6ubylznt7is4d5yxlmbxy.us-west-2.es.amazonaws.com/klove-ddb/recipe/_search
I don't have any previous experience with making API calls to AWS services.
So I tried this:
axios.post('https://search-test-search-fqa4l6ubylznt7is4d5yxlmbxy.us-west-2.es.amazonaws.com/klove-ddb/recipe/_search')
  .then(res => res.data)
  .then(res => console.log(res));
But I was getting {"Message":"User: anonymous is not authorized to perform: es:ESHttpPost"}
I also experimented with some IAM roles and added the AWSESFullAccess policy to my profile.
Still I can't make anything work out.
Please help me.
The reason you're seeing the error User: anonymous is not authorized to perform: es:ESHttpPost is that you're requesting data without letting Elasticsearch know who you are - this is why it says 'anonymous'.
There are a couple of ways of authenticating, the easiest being the elasticsearch library. With this library you give it a set of credentials (access key, secret key) for the IAM role / user, which it uses to create signed requests. Signed requests let AWS know who's actually making the request, so it won't be received as anonymous, but rather as yourself.
Another way of getting this to work is to adjust your access policy to be IP-based:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "es:*",
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": [
            "AAA.BBB.CCC.DDD"
          ]
        }
      },
      "Resource": "YOUR_ELASTICSEARCH_CLUSTER_ARN"
    }
  ]
}
This particular policy will be wide open to anyone with the IP (range) that you provide here. It will spare you the hassle of having to sign your requests, though.
A library that helps set up elasticsearch-js with AWS ES is this one.
A working example is the following:
const AWS = require('aws-sdk')
const elasticsearch = require('elasticsearch')
const awsHttpClient = require('http-aws-es')

const client = new elasticsearch.Client({
  host: '<YOUR_ES_CLUSTER_ID>.<YOUR_ES_REGION>.es.amazonaws.com',
  connectionClass: awsHttpClient,
  amazonES: {
    region: '<YOUR_ES_REGION>',
    credentials: new AWS.Credentials('<YOUR_ACCESS_KEY>', '<YOUR_SECRET_KEY>')
  }
});

client.search({
  index: 'twitter',
  type: 'tweets',
  body: {
    query: {
      match: {
        body: 'elasticsearch'
      }
    }
  }
})
  .then(res => console.log(res));
The elasticsearch npm package is going to be deprecated soon; use @elastic/elasticsearch and @acuris/aws-es-connection so you don't have to provide IAM credentials to the function.
Here's the code I use:
'use strict';

const { Client } = require('@elastic/elasticsearch');
const { createAWSConnection, awsGetCredentials } = require('@acuris/aws-es-connection');

module.exports.get_es_interests = async event => {
  const awsCredentials = await awsGetCredentials();
  const AWSConnection = createAWSConnection(awsCredentials);

  const client = new Client({
    ...AWSConnection,
    node: 'your-endpoint',
  });

  let bodyObj = {};
  try {
    bodyObj = JSON.parse(event.body);
  } catch (jsonError) {
    console.log('There was an error parsing the JSON Object', jsonError);
    return {
      statusCode: 400
    };
  }

  let keyword = bodyObj.keyword;

  const { body } = await client.search({
    index: 'index-name',
    body: {
      query: {
        match: {
          name: {
            query: keyword,
            analyzer: "standard"
          }
        }
      }
    }
  });

  var result = body.hits.hits;
  return result;
};
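For a quick local test you could call the exported handler with a stubbed event; a sketch, assuming the handler above lives in index.js and the keyword is arbitrary:
// Hypothetical local invocation of the handler above.
const { get_es_interests } = require('./index');

get_es_interests({ body: JSON.stringify({ keyword: 'pasta' }) })
  .then(hits => console.log(hits))
  .catch(console.error);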
Now there's https://github.com/gosquared/aws-elasticsearch-js
Import the packages:
const AWS = require('aws-sdk');
const ElasticSearch = require('@elastic/elasticsearch');
const { createConnector } = require('aws-elasticsearch-js');
Configure the client using a named profile from ~/.aws/config. You can verify this by running cat ~/.aws/config, which should output something like:
[profile work]
region=ap-southeast-2
[default]
region = ap-southeast-1
const esClient = new ElasticSearch.Client({
  nodes: [
    '<aws elastic search domain here>'
  ],
  Connection: createConnector({
    region: '<region>',
    getCreds: callback =>
      callback(
        null,
        new AWS.SharedIniFileCredentials({ profile: '<target profile>' })
      )
  })
});
Then you can start using it like:
// this query will delete all documents in an index
await esClient.delete_by_query({
  index: '<your index here>',
  body: {
    query: {
      match_all: {}
    }
  }
});
References:
https://github.com/gosquared/aws-elasticsearch-js
https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-delete-by-query.html
https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/SharedIniFileCredentials.html