AWS API GATEWAY <=> AWS Lambda CORS problem - node.js

This is my first time with AWS. I set up a DynamoDB table and created Lambda functions to do CRUD operations, with static values at first. Then I created API Gateway methods to call the Lambda functions dynamically.
The method I'm trying to call is a GET method. It should be callable from JS on a different domain, so I need to enable CORS. Unfortunately, even though I set the header in my Lambda function, I still get a CORS error:
'use strict';
const AWS = require('aws-sdk');
AWS.config.update({ region: 'us-east-2' });

exports.handler = async (event, context) => {
    const ddb = new AWS.DynamoDB({ apiVersion: '2012-10-08' });
    const documentClient = new AWS.DynamoDB.DocumentClient({ region: 'us-east-2' });
    let responseBody = "";
    let statusCode = 0;
    var params = {
        TableName: "Users",
    };
    // scan is a function that works like a SELECT .. WHERE .. clause in SQL
    try {
        const data = await documentClient.scan(params).promise();
        responseBody = JSON.stringify(data.Items);
        statusCode = 200;
    } catch (err) {
        responseBody = `Unable to Scan Users, error: ${err}`;
        statusCode = 403;
    }
    const response = {
        statusCode: statusCode,
        headers: {
            "access-control-allow-origin": "*"
        },
        body: {
            responseBody
        }
    };
    return response;
};
This is the code in my VueJS application that makes the request to the GET method in API Gateway:
try {
    const res = await axios.get("https://here-goes-my-url");
    console.log(res.data);
} catch (err) {
    console.log(err);
}
This is the error I'm getting:
Access to XMLHttpRequest at 'https://here-goes-the-url' from origin 'http://localhost:8080' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
The Lambda GET method should return all items in the table "Users".
Please help, thanks!

OK, you have your header set.
Have you enabled CORS on the method in API Gateway?
Also, when you see CORS errors with AWS Lambda, they are not always actually caused by CORS: sometimes the response is rejected by API Gateway, and the client still gets a CORS error. I think this is intentional, to make the server harder to attack, since detailed errors are useful to an attacker.
So I would strongly suggest you test your application first in the API Gateway console.
This will give you a more actionable error. I am guessing this might be related to permissions. Hope this helps.
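For reference, API Gateway's Lambda proxy integration expects `body` to be a JSON *string*, not an object. A minimal sketch of building a well-formed response (the helper name `buildProxyResponse` is made up for this example):

```javascript
// Build a response in the shape API Gateway's Lambda proxy
// integration expects: statusCode, headers, and a *string* body.
function buildProxyResponse(statusCode, data) {
    return {
        statusCode,
        headers: {
            'Access-Control-Allow-Origin': '*',
        },
        // body must be a string; returning an object here makes API
        // Gateway reject the response, and the browser then reports a
        // CORS error because the error response carries no CORS headers.
        body: JSON.stringify(data),
    };
}

// Example: a successful scan result
const ok = buildProxyResponse(200, [{ id: 1, name: 'Alice' }]);
console.log(typeof ok.body); // "string"
```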

Related

HTTP 502 Error in an AWS Amplify Project with an API Gateway and Node.js Lambda Function

I have an AWS Amplify project with an API Gateway and a Node.js Lambda function. Whenever I hit the API and it connects to the RDS PostgreSQL DB, I get back an HTTP 502 error. I'm not sure what to do next to resolve it. Can anyone suggest some potential causes of this error and how I can troubleshoot and fix it?
I've been trying to adjust things in the Lambda in the hope that it'll fix it, but the problem could lie elsewhere in the flow, such as API Gateway or the RDS DB.
NOTE: This is a sample project I'm working on; I know it isn't perfect. Feedback is always appreciated. Thanks!
/**
 * @type {import('@types/aws-lambda').APIGatewayProxyHandler}
 */
var pg = require('pg');

exports.handler = async (event) => {
    try {
        const rds_host = "";
        const name = "";
        const password = "";
        const db_name = "";
        const port = 5432;
        const connString = `postgres://${name}:${password}@${rds_host}:${port}/${db_name}`;
        const client = new pg.Client(connString);
        await client.connect();
        const query = {
            text: 'SELECT * FROM projects'
        };
        const res = await client.query(query);
        const data = res.rows;
        await client.end();
        const response = {
            statusCode: 200,
            headers: {
                "Access-Control-Allow-Origin": "*",
                "Access-Control-Allow-Headers": "*"
            },
            body: JSON.stringify(data),
        };
        return response;
    } catch (error) {
        console.log('error: ', error);
    }
};
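One common cause of a 502 from API Gateway is a Lambda that resolves to something other than a well-formed proxy response; in the handler above, the catch block logs the error but returns undefined. A minimal sketch of an error path that still returns a valid response (the 500 status and message shape are illustrative assumptions):

```javascript
// Sketch: always return a proxy-format response, even on failure.
// A handler that resolves to undefined makes API Gateway answer 502.
function errorResponse(error) {
    return {
        statusCode: 500,
        headers: { 'Access-Control-Allow-Origin': '*' },
        body: JSON.stringify({ error: String(error) }),
    };
}

const resp = errorResponse(new Error('connection refused'));
console.log(resp.statusCode); // 500
```

In the handler above, the catch block would `return errorResponse(error);` instead of only logging.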

How to upload files to AWS S3? I get an error when doing it (using private buckets)

I am trying to upload files to AWS S3 using getSignedUrlPromise() to obtain an access link, since the bucket is completely private and I want it to be accessible only through the links the server generates with getSignedUrlPromise().
The problem comes when I make a PUT request to the obtained link: I get the following error (I've also included the response I receive).
Here is the code that configures AWS in Node.js:
import AWS from 'aws-sdk';

const bucketName = 'atlasfitness-progress';
const region = process.env.AWS_REGION;
const accessKeyId = process.env.AWS_ACCESS_KEY;
const secretAccessKey = process.env.AWS_SECRET_KEY;
const URL_EXPIRATION_TIME = 60; // in seconds

const s3 = new AWS.S3({
    region,
    accessKeyId,
    secretAccessKey,
    signatureVersion: 'v4'
});

export const generatePreSignedPutUrl = async (fileName, fileType) => {
    const params = {
        Bucket: bucketName,
        Key: fileName,
        Expires: 60
    };
    const url = await s3.getSignedUrlPromise('putObject', params);
    return url;
};
And then I have an Express controller to send the link when it's requested:
routerProgress.post('/prepare_s3', verifyJWT, async (req, res) => {
    res.send({ url: await generatePreSignedPutUrl(req.body.fileName, req.body.fileType) });
});

export { routerProgress };
But the problem comes in the frontend. Here is the function that first asks for the link and then tries to upload the file to S3:
const upload = async (e) => {
    e.preventDefault();
    await JWT.checkJWT();
    const requestObject = {
        fileName: frontPhoto.name,
        fileType: frontPhoto.type,
        token: JWT.getToken()
    };
    const url = (await axiosReq.post(`${serverPath}/prepare_s3`, requestObject)).data.url;
    // The following call is the one that doesn't work
    const response = await fetch(url, {
        method: "PUT",
        headers: {
            "Content-Type": "multipart/form-data"
        },
        body: frontPhoto
    });
    console.log(response);
};
And that's everything. I'm a newbie to AWS, so it's quite possible I've made a fairly serious error without realizing it, but I've been blocked here for many days and I'm starting to get desperate. So if anyone spots the error or knows how I can make it work, I'd be very grateful for your help.
The first thing I note about your code is that you await on async operations but do not provide for exceptions. This is very bad practice as it hides possible failures. The rule of thumb is: whenever you need to await for a result, wrap your call in a try/catch block.
In your server-side code above, you have two awaits which can fail, and if they do, any error they generate is lost.
A better strategy would be:
export const generatePreSignedPutUrl = async (fileName, fileType) => {
    const params = {
        Bucket: bucketName,
        Key: fileName,
        Expires: 60
    };
    let url;
    try {
        url = await s3.getSignedUrlPromise('putObject', params);
    } catch (err) {
        // do something with the error here
        // and abort the operation.
        return;
    }
    return url;
};
And in your POST route:
routerProgress.post('/prepare_s3', verifyJWT, async (req, res) => {
    let url;
    try {
        url = await generatePreSignedPutUrl(req.body.fileName, req.body.fileType);
    } catch (err) {
        res.status(500).send({ ok: false, error: `failed to get url: ${err}` });
        return;
    }
    res.send({ url });
});
And in your client-side code, follow the same strategy. At the very least, this will give you a far better idea of where your code is failing.
Two things to keep in mind:
Functions declared with the async keyword do not return the expected value directly; they return a Promise of that value, and like all Promises, they can be chained with .then() and .catch() clauses.
When you call an async function, you must do something with any rejection it produces: either await it inside a try/catch, or attach a .catch() handler. If you do neither (for example, fire-and-forget without awaiting), the rejection escapes any surrounding try/catch and is lost.
So you can use either Promise "thenable" chaining or try/catch blocks within async functions to trap errors, but if you choose not to do either, you run the risk of losing any errors generated within your code.
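A small illustration of the two equivalent styles (the function names are made up for this example):

```javascript
// A made-up async operation that can fail.
async function fetchValue(shouldFail) {
    if (shouldFail) throw new Error('boom');
    return 42;
}

// Style 1: try/catch inside an async function.
async function withTryCatch() {
    try {
        return await fetchValue(true);
    } catch (err) {
        return `caught: ${err.message}`;
    }
}

// Style 2: Promise chaining with .then()/.catch().
function withChaining() {
    return fetchValue(true)
        .then((v) => v)
        .catch((err) => `caught: ${err.message}`);
}
```

Both styles trap the same rejection; which one to use is mostly a matter of the surrounding code's idiom.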
Here's an example of how to create a pre-signed URL that can be used to PUT an MP4 file.
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
    apiVersion: '2010-12-01',
    signatureVersion: 'v4',
    region: process.env.AWS_DEFAULT_REGION || 'us-east-1',
});

const params = {
    Bucket: 'mybucket',
    Key: 'videos/sample.mp4',
    Expires: 1000,
    ContentType: 'video/mp4',
};

const url = s3.getSignedUrl('putObject', params);
console.log(url);
The resulting URL will look something like this:
https://mybucket.s3.amazonaws.com/videos/sample.mp4?
Content-Type=video%2Fmp4&
X-Amz-Algorithm=AWS4-HMAC-SHA256&
X-Amz-Credential=AKIASAMPLESAMPLE%2F20200303%2Fus-east-1%2Fs3%2Faws4_request&
X-Amz-Date=20211011T090807Z&
X-Amz-Expires=1000&
X-Amz-Signature=long-sig-here&
X-Amz-SignedHeaders=host
You can test this URL by uploading sample.mp4 with curl as follows:
curl -X PUT -T sample.mp4 -H "Content-Type: video/mp4" "<signed url>"
A few notes:
hopefully you can use this code to work out where your problem lies.
pre-signed URLs are created locally by the SDK, so there's no need to go async.
I'd advise creating the pre-signed URL and then testing PUT with curl before testing your browser client, to ensure that curl works OK. That way you will know whether to focus your attention on the production of the pre-signed URL or on the use of the pre-signed URL within your client.
If your attempt to upload via curl fails with Access Denied then check that:
the pre-signed URL has not expired (they have time limits)
the AWS credentials you used to sign the URL allow PutObject to that S3 bucket
the S3 bucket policy does not explicitly deny your request
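On the client side, one more thing worth checking is the Content-Type: a pre-signed PUT should send the raw file bytes with the file's own MIME type, not multipart/form-data. A sketch of building the fetch options (this is a suggested fix, not code from the question):

```javascript
// Build the fetch options for PUTting a file to a pre-signed URL.
// Send the raw file as the body with its own MIME type; wrapping it in
// FormData or labelling it multipart/form-data is not appropriate for
// a raw-body PUT and can make the upload mismatch what was signed.
function putOptions(file) {
    return {
        method: 'PUT',
        headers: { 'Content-Type': file.type },
        body: file,
    };
}

const fakeFile = { type: 'image/png' }; // stand-in for a browser File
console.log(putOptions(fakeFile).headers['Content-Type']); // image/png
```

In the `upload` function above, this would replace the hard-coded "multipart/form-data" header: `await fetch(url, putOptions(frontPhoto))`.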

How can I invoke a Lambda function correctly? [closed]

Closed. This question needs debugging details. It is not currently accepting answers.
Edit the question to include desired behavior, a specific problem or error, and the shortest code necessary to reproduce the problem. This will help others answer the question.
Closed 1 year ago.
I'm trying to create my first Lambda function on AWS.
I'm following a YouTube tutorial and I wrote the following code in order to get some Users from the DynamoDB online service.
const aws = require('aws-sdk');
aws.config.update({
    region: "eu-west-3"
});

const dynamodb = new aws.DynamoDB.DocumentClient();
const dynamoUsersTable = "Users";
const usersPath = "/users";

exports.handler = async (event) => {
    console.log('Request event : ', event);
    let response;
    switch (true) {
        case event.httpMethod === 'GET' && event.path === usersPath:
            response = await getAllUsers();
            break;
        default:
            break;
    }
    // UPDATE - ADDED RETURN RESPONSE LINE! -> New error: Missing Authentication Token
    return response;
};

const buildResponse = (statusCode, body) => {
    return {
        statusCode: statusCode,
        headers: {
            "content-type": "application/json"
        },
        body: JSON.stringify(body)
    };
};

const scanDynamoRecords = async (scanParams, itemArray) => {
    try {
        const dynamoData = await dynamodb.scan(scanParams).promise();
        itemArray = itemArray.concat(dynamoData.Items);
        if (dynamoData.LastEvaluatedKey) {
            // note: the property is ExclusiveStartKey (capital K)
            scanParams.ExclusiveStartKey = dynamoData.LastEvaluatedKey;
            return await scanDynamoRecords(scanParams, itemArray);
        }
        return itemArray;
    } catch (error) {
        console.error('Do your custom error handling here. I am just gonna log it: ', error);
    }
};

const getAllUsers = async () => {
    const params = {
        TableName: dynamoUsersTable
    };
    const allUsers = await scanDynamoRecords(params, []);
    const body = {
        users: allUsers
    };
    return buildResponse(200, body);
};
But apparently I get an Internal Server Error when I make my GET request in Postman.
PS: The function has the DynamoDBFullAccess and CloudWatch policies, and I can't find the error message log in CloudWatch.
UPDATE: The error was a missing "return" statement after the switch. However, after that fix, a "Missing Authentication Token" message is returned as the response to a GET request in Postman. I suspect IAM has a say in this, since I'm using multiple AWS services.
When you create a Lambda function, you define an IAM role that is used to run it. If your Lambda function is going to invoke other AWS services, then that role needs permissions for those services. If you do not set up the IAM role's permissions correctly, your Lambda function will not succeed.
See this AWS tutorial. Although it is implemented in Java, it covers important information such as setting up an IAM role to invoke AWS services from a Lambda function:
Creating scheduled events to invoke Lambda functions
You also need to return the response variable in your handler method.
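Relatedly, it's worth making sure the handler returns a well-formed response on every path: in the switch above, the default branch leaves response undefined. A minimal sketch of routing with a fallback (the 404 shape is an assumption, not code from the question):

```javascript
// Sketch: give the default branch a response too, so the handler
// never resolves to undefined (which API Gateway treats as malformed).
const buildResponse = (statusCode, body) => ({
    statusCode,
    headers: { "content-type": "application/json" },
    body: JSON.stringify(body),
});

function route(httpMethod, path) {
    switch (true) {
        case httpMethod === 'GET' && path === '/users':
            return buildResponse(200, { users: [] });
        default:
            return buildResponse(404, { message: `Not found: ${httpMethod} ${path}` });
    }
}

console.log(route('DELETE', '/users').statusCode); // 404
```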

HTTP GET | POST from Serverless gives a timeout error and does not return anything

I am using the AWS serverless MongoDB template, which works fine with MongoDB queries and other local processing.
I want to make an HTTP request to get some data. For that I tried http/https/axios/request, but it just returns the following timeout error:
{
    "message": "Endpoint request timed out"
}
Following is my last tried code.
const util = require('../../lib/util');
// const UserModel = require('../../schema/User');
const fetch = require("node-fetch");

const url = "https://jsonplaceholder.typicode.com/posts/1";

module.exports = async (event) => {
    try {
        const response = await fetch(url);
        const json = await response.json();
        console.log(json);
        return util.bind(json);
    } catch (error) {
        return util.bind(error);
    }
};
Similarly, API SDKs like PayPal and Razorpay also become unresponsive while in use.
Is this a problem with my AWS settings or with the Node.js code?
Thanks in advance.
The above code works fine with serverless-offline.
If your Lambda is attached to a VPC, it only has connectivity inside that VPC by default; it cannot make outbound HTTP calls to the internet. For that you need a NAT Gateway.
With a NAT Gateway in place, your Lambda/serverless function gets internet access and can make outgoing HTTP calls.

Malformed Lambda proxy response

I am trying to access a Lambda function using a POST method. When I test the POST resource, I put {"article_url": "http://technewstt.com/bd1108/"} in the request body.
This makes API Gateway respond with Execution failed due to configuration error: Malformed Lambda proxy response. My code is below.
exports.handler = (event, context, callback) => {
    //event = {"article_url": "http://technewstt.com/bd1108/"};
    console.log(event.article_url);
    var http = require('http');
    var TextAPI = require('textapi');
    var textapi = new TextAPI({
        application_id: "random numbers",
        application_key: "random numbers"
    });
    textapi.summarize({
        url: event.article_url,
        sentences_number: 3
    }, function(error, response) {
        if (error === null) {
            response.sentences.forEach(function(s) {
                console.log(s);
                //var body = JSON.parse(s);
                // TODO implement
                //callback(null, s);
                callback(null, {"statusCode": 200, "body": JSON.stringify(s)});
            });
        }
    });
};
Please note that if I uncomment the second line the API gateway works fine.
Any help with this issue would be greatly appreciated.
You are using Lambda proxy integration, which always expects output in this format: http://docs.aws.amazon.com/apigateway/latest/developerguide/set-up-lambda-proxy-integrations.html#api-gateway-simple-proxy-for-lambda-output-format
"Malformed Lambda proxy response" is returned when the Lambda response format is unexpected. You can enable logging with full requests/responses, which should show you the response being returned from Lambda. It's likely there was an error in your Lambda function and it returned an error instead of a well-formed response.
