I have been tasked with making a POST API call to an Elasticsearch endpoint on AWS:
https://search-test-search-fqa4l6ubylznt7is4d5yxlmbxy.us-west-2.es.amazonaws.com/klove-ddb/recipe/_search
I don't have any previous experience with making API calls to AWS services, so I tried this:
axios.post('https://search-test-search-fqa4l6ubylznt7is4d5yxlmbxy.us-west-2.es.amazonaws.com/klove-ddb/recipe/_search')
.then(res => res.data)
.then(res => console.log(res));
But I was getting {"Message":"User: anonymous is not authorized to perform: es:ESHttpPost"}
I also experimented with IAM roles and attached the AmazonESFullAccess policy to my user, but I still can't get anything to work.
Please help me.
The reason you're seeing the error User: anonymous is not authorized to perform: es:ESHttpPost is that you're requesting data without telling Elasticsearch who you are, which is why it reports the user as anonymous.
There are a couple of ways to authenticate, the easiest being the elasticsearch client library. You give the library a set of credentials (access key, secret key) for an IAM role or user, and it uses them to create signed requests. A signed request lets AWS know who is actually making the call, so it is no longer received as anonymous, but rather as you.
Another way of getting this to work is to adjust your access policy to be IP-based:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "es:*",
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": [
            "AAA.BBB.CCC.DDD"
          ]
        }
      },
      "Resource": "YOUR_ELASTICSEARCH_CLUSTER_ARN"
    }
  ]
}
This particular policy leaves the cluster wide open to anyone coming from the IP (range) you provide here, but it spares you the hassle of signing your requests.
A library that helps set up elasticsearch-js with AWS ES is http-aws-es.
A working example is the following:
const AWS = require('aws-sdk');
const elasticsearch = require('elasticsearch');
const awsHttpClient = require('http-aws-es');

const client = new elasticsearch.Client({
  host: '<YOUR_ES_CLUSTER_ID>.<YOUR_ES_REGION>.es.amazonaws.com',
  connectionClass: awsHttpClient,
  amazonES: {
    region: '<YOUR_ES_REGION>',
    credentials: new AWS.Credentials('<YOUR_ACCESS_KEY>', '<YOUR_SECRET_KEY>')
  }
});
client.search({
  index: 'twitter',
  type: 'tweets',
  body: {
    query: {
      match: {
        body: 'elasticsearch'
      }
    }
  }
})
  .then(res => console.log(res));
The elasticsearch npm package is going to be deprecated soon; use @elastic/elasticsearch together with @acuris/aws-es-connection so you don't have to provide IAM credentials to the function yourself.
Here is the code I use:
'use strict';

const { Client } = require('@elastic/elasticsearch');
const { createAWSConnection, awsGetCredentials } = require('@acuris/aws-es-connection');

module.exports.get_es_interests = async event => {
  const awsCredentials = await awsGetCredentials();
  const AWSConnection = createAWSConnection(awsCredentials);
  const client = new Client({
    ...AWSConnection,
    node: 'your-endpoint',
  });
  let bodyObj = {};
  try {
    bodyObj = JSON.parse(event.body);
  } catch (jsonError) {
    console.log('There was an error parsing the JSON Object', jsonError);
    return {
      statusCode: 400
    };
  }

  const keyword = bodyObj.keyword;
  const { body } = await client.search({
    index: 'index-name',
    body: {
      query: {
        match: {
          name: {
            query: keyword,
            analyzer: "standard"
          }
        }
      }
    }
  });
  return body.hits.hits;
};
Now there's https://github.com/gosquared/aws-elasticsearch-js
Import the modules:
const AWS = require('aws-sdk');
const ElasticSearch = require('@elastic/elasticsearch');
const { createConnector } = require('aws-elasticsearch-js');
Configure the client using a named profile from ~/.aws/config. You can verify this by running cat ~/.aws/config, which should output something like:
[profile work]
region=ap-southeast-2
[default]
region = ap-southeast-1
const esClient = new ElasticSearch.Client({
  nodes: [
    '<aws elastic search domain here>'
  ],
  Connection: createConnector({
    region: '<region>',
    getCreds: callback =>
      callback(
        null,
        new AWS.SharedIniFileCredentials({ profile: '<target profile>' })
      )
  })
});
Then you can start using it like:
// this query will delete all documents in an index
await esClient.delete_by_query({
  index: '<your index here>',
  body: {
    query: {
      match_all: {}
    }
  }
});
References:
https://github.com/gosquared/aws-elasticsearch-js
https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-delete-by-query.html
https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/SharedIniFileCredentials.html
Related
I'm making a simple React app to access RDS data via DescribeDBInstances API. I want to allow public access, so I'm using Cognito with Unauthenticated access enabled.
I have the following policy attached to the provided UnAuth role, yet I'm still getting the following error when trying to use the RDS API from JavaScript (nodejs):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": "rds:DescribeDBInstances",
      "Resource": "*"
    }
  ]
}
AccessDenied: User: arn:aws:sts::(account):assumed-role/Cognito_TestUnauth_Role/CognitoIdentityCredentials is not authorized to perform: rds:DescribeDBInstances on resource: arn:aws:rds:us-east-1:(account):db:*
I redacted my account ID.
Also this default policy is attached:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "mobileanalytics:PutEvents",
        "cognito-sync:*"
      ],
      "Resource": "*"
    }
  ]
}
Here's my calling code:
import { RDSClient, DescribeDBInstancesCommand } from "@aws-sdk/client-rds";
import { CognitoIdentityClient } from "@aws-sdk/client-cognito-identity";
import { fromCognitoIdentityPool } from "@aws-sdk/credential-provider-cognito-identity";

// see https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-rds/index.html
export default async function getDbInstances() {
  const region = "us-east-1";
  const client = new RDSClient({
    region,
    credentials: fromCognitoIdentityPool({
      client: new CognitoIdentityClient({ region }),
      identityPoolId: "(my identity pool ID)",
    })
  });
  const command = new DescribeDBInstancesCommand({});
  return (await client.send(command)).DBInstances;
}
I'm going a bit crazy here, it seems everything is set up correctly. What is missing?
I found the answer inside the IAM Roles documentation for Cognito: https://docs.aws.amazon.com/cognito/latest/developerguide/iam-roles.html (see "Access Policies" section)
Enhanced authentication is recommended for Cognito and enabled by default, but since it uses the GetCredentialForIdentity API under the hood, access is limited to certain AWS services regardless of IAM policy (RDS isn't an allowed service). I didn't see any way to override this limitation.
The solution is to switch to basic authentication (you have to enable it first in the Cognito identity pool settings). Here's my working nodejs code to use basic auth and then fetch the RDS instances:
import { RDSClient, DescribeDBInstancesCommand } from "@aws-sdk/client-rds";
import {
  CognitoIdentityClient,
  GetIdCommand,
  GetOpenIdTokenCommand
} from "@aws-sdk/client-cognito-identity";
import { getDefaultRoleAssumerWithWebIdentity } from "@aws-sdk/client-sts";
import { fromWebToken } from "@aws-sdk/credential-provider-web-identity";

const region = "us-east-1";
const cognitoClient = new CognitoIdentityClient({ region });

// see https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-rds/index.html
export default async function getDbInstances() {
  const { Token, IdentityId } = await getTokenUsingBasicFlow();
  const client = new RDSClient({
    region,
    credentials: fromWebToken({
      roleArn: "arn:aws:iam::(account id):role/Cognito_RDSDataAppPoolUnauth_Role",
      webIdentityToken: Token,
      roleSessionName: IdentityId.substring(IdentityId.indexOf(":") + 1),
      roleAssumerWithWebIdentity: getDefaultRoleAssumerWithWebIdentity()
    })
  });
  const command = new DescribeDBInstancesCommand({});
  return (await client.send(command)).DBInstances;
}

async function getTokenUsingBasicFlow() {
  const getIdCommand = new GetIdCommand({ IdentityPoolId: "us-east-1:(identity pool id)" });
  const id = (await cognitoClient.send(getIdCommand)).IdentityId;
  const getOpenIdTokenCommand = new GetOpenIdTokenCommand({ IdentityId: id });
  return await cognitoClient.send(getOpenIdTokenCommand);
}
Here's the documentation for the basic authentication flow vs enhanced that I followed to write my implementation: https://docs.aws.amazon.com/cognito/latest/developerguide/authentication-flow.html
Update
I'm able to get my original code, and the suggestions as well, working when running it in isolation. However, what I need to do is call it from within a Firebase onRequest or onCall function. When this code gets wrapped by these, the malformed headers and request for authorization are still an issue. We use many other APIs this way, so it's puzzling why the Clarifai API is having these issues. Any suggestions on using it with Firebase?
Original
New to Clarifai and having some authentication issues while attempting to retrieve model outputs from the Food Model.
I've tried two different keys:
API key generated from an app I created in the Portal
API key - the Personal Access Token I generated for myself
In both cases I encounter an Empty or malformed authorization header response.
{
"status":{
"code":11102,
"description":"Invalid request",
"details":"Empty or malformed authorization header. Please provide an API key or session token.",
"req_id":"xyzreasdfasdfasdfasdfasf"
},
"outputs":[
]
}
I pieced together this code from the following articles. This is running in a Node 10 environment.
Initialization
Food Model
Prediction
const { ClarifaiStub } = require('clarifai-nodejs-grpc');
const grpc = require('@grpc/grpc-js');

const stub = ClarifaiStub.json();
const metadata = new grpc.Metadata();
metadata.set("authorization", "Key xyzKey");

return new Promise((resolve, reject) => {
  stub.PostModelOutputs(
    {
      model_id: 'bd367be194cf45149e75f01d59f77ba7',
      inputs: [{ data: { image: { url: 'https://samples.clarifai.com/metro-north.jpg' } } }],
    },
    metadata,
    (err, response) => {
      if (err) {
        return reject(`ERROR: ${err}`);
      }
      resolve(JSON.stringify(response));
    }
  );
});
}
Update: There was an issue in versions prior to 7.0.2 where, if you had another library using #grpc/grpc-js with a different version, the grpc.Metadata object wasn't necessarily constructed from the library version that clarifai-grpc-nodejs was using.
To fix the issue, update the clarifai-grpc-nodejs library, and require the grpc object like this:
const {ClarifaiStub, grpc} = require("clarifai-nodejs-grpc");
Previously, the grpc object was imported directly from @grpc/grpc-js, which was the source of the problem.
There are two ways of authenticating to the Clarifai API:
with an API key, which is application-specific, meaning that an API key is attached to an application and can only do operations inside that application,
with a Personal Access Token (PAT), which is user-specific, meaning you can access / manipulate / do operations on all the applications the user owns or has access to (and also create/update/delete applications themselves).
When using a PAT, you have to specify, in your request data, which application you are targeting. With an API key this is not needed.
I've tested your example (using Node 12, though it should work in 10 as well) with a valid API key and it works fine (after putting it into an async function). Here's a full runnable example (replace YOUR_API_KEY with your valid API key):
function predict() {
  const { ClarifaiStub } = require('clarifai-nodejs-grpc');
  const grpc = require('@grpc/grpc-js');

  const stub = ClarifaiStub.json();
  const metadata = new grpc.Metadata();
  metadata.set("authorization", "Key YOUR_API_KEY");

  return new Promise((resolve, reject) => {
    stub.PostModelOutputs(
      {
        model_id: 'bd367be194cf45149e75f01d59f77ba7',
        inputs: [{ data: { image: { url: 'https://samples.clarifai.com/metro-north.jpg' } } }],
      },
      metadata,
      (err, response) => {
        if (err) {
          return reject(`ERROR: ${err}`);
        }
        resolve(JSON.stringify(response));
      }
    );
  });
}

async function main() {
  const response = await predict();
  console.log(response);
}

main();
If you want to use a PAT in the above example, two things must change. Firstly, replace the API key with a PAT:
...
metadata.set("authorization", "Key YOUR_PAT");
...
Secondly, add the application ID to the method request object:
...
stub.PostModelOutputs(
  {
    user_app_id: {
      user_id: "me", // The literal "me" resolves to your user ID.
      app_id: "YOUR_APPLICATION_ID"
    },
    model_id: 'bd367be194cf45149e75f01d59f77ba7',
    inputs: [{ data: { image: { url: 'https://samples.clarifai.com/metro-north.jpg' } } }],
  },
...
Make sure you respect the format when passing the key in your code:
const metadata = new grpc.Metadata();
metadata.set("authorization", "Key {YOUR_CLARIFAI_API_KEY}");
Make sure the "Key" prefix is present.
Let me know.
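The crucial detail is that the header value must literally start with Key followed by a space. A tiny helper (the function name is mine, purely illustrative) makes the format hard to get wrong:

```javascript
// Build the Clarifai authorization metadata value from a raw API key or PAT.
// Throws early instead of letting a malformed header reach the API.
function authHeaderValue(key) {
  if (!key || typeof key !== 'string') {
    throw new Error('Missing Clarifai API key / PAT');
  }
  // Guard against the common mistake of pasting "Key xyz" into the variable itself.
  return key.startsWith('Key ') ? key : `Key ${key}`;
}
```

You would then call metadata.set("authorization", authHeaderValue(myKey)) instead of concatenating strings inline.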
EDIT: It looks like Firebase doesn't support custom headers, which is likely impacting the 'Authorization' header. At least this is my best guess; see the comments in the following ticket:
Firebase hosting custom headers not working
The following code works for me:
{
{
  const { ClarifaiStub } = require('clarifai-nodejs-grpc');
  const grpc = require('@grpc/grpc-js');

  const stub = ClarifaiStub.json();
  const metadata = new grpc.Metadata();
  metadata.set("authorization", "Key {APP API KEY}");

  return new Promise((resolve, reject) => {
    stub.PostModelOutputs(
      {
        model_id: 'bd367be194cf45149e75f01d59f77ba7',
        inputs: [{ data: { image: { url: 'https://samples.clarifai.com/metro-north.jpg' } } }],
      },
      metadata,
      (err, response) => {
        if (err) {
          return reject(`ERROR: ${err}`);
        }
        console.log(JSON.stringify(response));
        resolve(JSON.stringify(response));
      }
    );
  });
}
There was a missing {, although I'm not sure whether that reflects the actual code you are running. In this case I'm using an app API key (when you create an app, there will be an API key on the Application Details page).
It sounds like you might be using a Personal Access Token instead, which can be used like this:
{
  const { ClarifaiStub } = require('clarifai-nodejs-grpc');
  const grpc = require('@grpc/grpc-js');

  const stub = ClarifaiStub.json();
  const metadata = new grpc.Metadata();
  // Sounds like you've made the personal access token correctly: go into
  // Settings, then Authentication, then create one. Make sure it has proper
  // permissions (I believe all by default).
  metadata.set("authorization", "Key {Personal Access Token}");

  return new Promise((resolve, reject) => {
    stub.PostModelOutputs(
      {
        user_app_id: {
          user_id: "{USER ID}", // I used my actual ID, I did not put 'me'. You can find this under your profile.
          app_id: "{APP NAME}"  // The app ID in the upper left corner after the app is created - not the API key. Generally what you named the app.
        },
        model_id: 'bd367be194cf45149e75f01d59f77ba7',
        inputs: [{ data: { image: { url: 'https://samples.clarifai.com/metro-north.jpg' } } }],
      },
      metadata,
      (err, response) => {
        if (err) {
          return reject(`ERROR: ${err}`);
        }
        console.log(JSON.stringify(response));
        resolve(JSON.stringify(response));
      }
    );
  });
}
Make sure to fill out {Personal Access Token}, {USER ID} and {APP NAME}. I used my actual user ID (found in the profile), and the app name is not the API key for the app, but the name in the upper left corner of the Application Details page. This call worked for me.
I need help troubleshooting a CORS error I am having with Apollo, Node, and Next.js. I am not sure what change I made, but suddenly I am unable to fetch data from my Prisma database. I am currently running dev mode. My Yoga server, which pulls in my data from Prisma, runs at localhost:4444; my frontend runs at localhost:7777.
Here is my CORS setup:
import withApollo from "next-with-apollo";
import ApolloClient from "apollo-boost";
import { endpoint, prodEndpoint } from "../config";
import { LOCAL_STATE_QUERY } from "../components/Cart";

function createClient({ headers }) {
  return new ApolloClient({
    uri: process.env.NODE_ENV === "development" ? endpoint : prodEndpoint,
    request: (operation) => {
      operation.setContext({
        fetchOptions: {
          credentials: "include",
        },
        headers,
      });
    },
    // local data
    clientState: {
      resolvers: {
        Mutation: {
          toggleCart(_, variables, { cache }) {
            // read the cartOpen value from the cache
            const { cartOpen } = cache.readQuery({
              query: LOCAL_STATE_QUERY,
            });
            // write the cart state to the opposite value
            const data = {
              data: { cartOpen: !cartOpen },
            };
            cache.writeData(data);
            return data;
          },
        },
      },
      defaults: {
        cartOpen: false,
      },
    },
  });
}

export default withApollo(createClient);
variables.env
FRONTEND_URL="localhost:7777"
PRISMA_ENDPOINT="https://us1.prisma.sh/tim-smith-131869/vouch4vet_dev_backend/dev"
PRISMA_SECRET="..."
APP_SECRET="..."
STRIPE_SECRET="..."
PORT=4444
backend index.js
const server = createServer();

server.express.use(cookieParser());

// decode the JWT so we can get the user ID on each request
server.express.use((req, res, next) => {
  const { token } = req.cookies;
  if (token) {
    const { userId } = jwt.verify(token, process.env.APP_SECRET);
    // put the userId onto the req for future requests to access
    req.userId = userId;
  }
  next();
});
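For comparison, the piece that usually controls CORS in this stack is the options object passed to graphql-yoga's server.start, which forwards them to the cors Express middleware. This is a sketch under assumptions (that server.start is called elsewhere in your index.js, and that FRONTEND_URL includes the http:// protocol, which yours currently does not):

```
// Hypothetical sketch of the server.start call this setup relies on.
// credentials: true is required because the client sends requests with
// credentials: "include"; origin must exactly match the frontend URL.
server.start(
  {
    cors: {
      credentials: true,
      origin: process.env.FRONTEND_URL, // e.g. "http://localhost:7777"
    },
  },
  (details) => console.log(`Server is now running on port ${details.port}`)
);
```

If FRONTEND_URL is set to "localhost:7777" without the protocol, the origin check fails and the browser reports a CORS error, which would match the symptoms described.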
I have tried rolling back to previous commit and I have had no luck. I have not ruled out internet problems.
Let me know if you need to see the rest of my repo.
Thanks
I am trying to set up unit tests for a Strapi project. My code looks like below:
test_utils.js
const Strapi = require("strapi");
const http = require("http");

let instance; // singleton

jest.setTimeout(10000);

async function setupStrapi() {
  if (!instance) {
    instance = Strapi();
    await instance.load();
    // Run bootstrap function.
    await instance.runBootstrapFunctions();
    // Freeze object.
    await instance.freeze();
    instance.app.use(instance.router.routes()).use(instance.router.allowedMethods());
    instance.server = http.createServer(instance.app.callback());
  }
  return instance;
}

module.exports = { setupStrapi };
controllers.test.js
const request = require("supertest");
const { setupStrapi } = require("../../test_utils");

describe("chat-group controllers", () => {
  let strapi;

  beforeAll(async () => {
    strapi = await setupStrapi();
  });

  test("endpoint tasks", async (done) => {
    const app = strapi;
    app.server.listen(app.config.port, app.config.host);
    const resp = await request(app.server)
      .get("/testpublics")
      .expect(200);
    console.log(resp.body);
    done();
  });
});
When I run the test, I get a 403 error on "/testpublics". Note that "/testpublics" is a public API and I can access it from the browser.
I think the problem is with the setupStrapi function; I took the code from the node_modules/strapi/lib/strapi.js file.
What is a better way to set up unit tests for a Strapi project? I want to achieve the following:
start each test with a clean database
test public and authenticated API endpoints
I encountered the same problem. Go to ./api/name-of-your-api/config/routes.json and remove the config property from each of the endpoints.
It should be this:
{
  "routes": [
    {
      "method": "GET",
      "path": "/testpublics",
      "handler": "testpublics.index"
    }
  ]
}
as opposed to this:
{
  "routes": [
    {
      "method": "GET",
      "path": "/testpublics",
      "handler": "testpublics.index",
      "config": {
        "policies": []
      }
    }
  ]
}
If you want this route to be public by policy, the answer from @sama-bala resolves everything.
In Strapi, a custom route and controller that is not public (i.e. needs a JWT token in the request header) must be assigned to a role; otherwise, even with a valid token, the controller will throw a Forbidden 403 error. The whole process is described in the Authenticated request tutorial in the Strapi documentation. This information is saved in the database only; usually you do this in the admin panel, not from source code.
Have a look at the following snippet:
/**
 * Grants, in the database `permissions` table, that a role can access an endpoint/controller.
 *
 * @param {int} roleID, e.g. 1 Authenticated, 2 Public
 * @param {string} value, in the form of a dot string, e.g. `"permissions.users-permissions.controllers.auth.changepassword"`
 * @param {boolean} enabled, default true
 * @param {string} policy, default ''
 */
const grantPrivilage = async (
  roleID = 1,
  value,
  enabled = true,
  policy = ""
) => {
  const updateObj = value
    .split(".")
    .reduceRight((obj, next) => ({ [next]: obj }), { enabled, policy });
  return await strapi.plugins[
    "users-permissions"
  ].services.userspermissions.updateRole(roleID, updateObj);
};
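The interesting part is the reduceRight call, which turns the dot-separated permission path into the nested update object that updateRole expects. A standalone sketch of just that transformation (the function name is mine, for illustration):

```javascript
// Build a nested object from a dot path, with `leaf` as the innermost value.
// "a.b.c" + { enabled: true } -> { a: { b: { c: { enabled: true } } } }
function nestFromDotPath(path, leaf) {
  // reduceRight folds from the last segment inward, wrapping `leaf` layer by layer.
  return path.split('.').reduceRight((obj, key) => ({ [key]: obj }), leaf);
}

console.log(JSON.stringify(
  nestFromDotPath('controllers.hello.hi', { enabled: true, policy: '' })
));
```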
It allows you to assign a route to a role programmatically by updating the database. In the case of your code, the solution might look like this:
await grantPrivilage(2, "permissions.application.controllers.testpublics.index"); // 1 is the default role for an Authenticated user, 2 is the Public role
You can add this in beforeAll or in bootstrap.js, e.g.:
beforeAll(async (done) => {
  user = await userFactory.createUser(strapi);
  await grantPrivilage(1, "permissions.application.controllers.hello.hi");
  done();
});
I've tried to explore this topic in my blog post
I am following this article from medium https://blog.bitsrc.io/serverless-backend-using-aws-lambda-hands-on-guide-31806ceb735e
Everything works except that when I attempt to add a record to DynamoDB, I get an error that says "this is not a function":
const AWS = require("aws-sdk");
const client = new AWS.DynamoDB.DocumentClient();
const uuid = require("uuid");

module.exports.myHero = async (event) => {
  const data = JSON.parse(event.body);
  const params = {
    TableName: "myHeros",
    Item: {
      id: uuid(),
      name: data.name,
      checked: false
    }
  };

  await client.put(params).promise();

  return {
    statusCode: 200,
    body: JSON.stringify(data)
  };
};
{
  "errorMessage": "client.put(...).promise is not a function",
  "errorType": "TypeError",
  "stackTrace": [
    "module.exports.myHero (/var/task/create.js:30:27)"
  ]
}
In almost all cases, when you call a method xyz() on an AWS client object and it fails with ‘xyz is not a function’, the problem is that you are using an old version of an SDK that does not actually support that method.
Upgrading to the latest AWS SDK version will fix this problem.
When initializing the DynamoDB client with new AWS.DynamoDB.DocumentClient(), please pass options (at least the region parameter) to the DocumentClient constructor.
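For example (a sketch; the region value is a placeholder, and the aws-sdk package must be recent enough for .promise() to exist on request objects):

```
const AWS = require("aws-sdk");

// Passing the region explicitly. An outdated SDK and a missing region are
// the usual suspects behind "promise is not a function" and connection
// errors, respectively.
const client = new AWS.DynamoDB.DocumentClient({ region: "us-east-1" });
```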