Right now what I'm trying to do is query the Redis service every time a request is made. The problem is that with a basic configuration it does not work. The error is the following:
INFO Redis Client Error Error: connect ECONNREFUSED 127.0.0.1:6379
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net) {
  address: '127.0.0.1',
  port: 6379
}
I have redis-server running as always, with its corresponding credentials, listening on 127.0.0.1:6379. I know that AWS SAM runs the function in a container, and the issue is probably due to the network configuration, but the only related option the AWS SAM CLI gives me is --host. How could I fix this?
My code is the following, although it is not very relevant:
import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';
import { createClient } from 'redis';
import processData from './src/lambda-data-dictionary-read/core/service/controllers/processData';
export async function lambdaHandler(event: APIGatewayProxyEvent): Promise<APIGatewayProxyResult> {
  const body: any = await processData(event.queryStringParameters);

  // A new Redis client is created on every invocation; REDIS_URL falls back to localhost.
  const url = process.env.REDIS_URL || 'redis://127.0.0.1:6379';
  const client = createClient({
    url,
  });
  client.on('error', (err) => console.log('Redis Client Error', err));
  await client.connect();

  // Simple round trip to verify the connection.
  await client.set('key', 'value');
  const value = await client.get('key');
  console.log('----', value, '----');

  const response: APIGatewayProxyResult = {
    statusCode: 200,
    body,
  };

  if (body.error) {
    return {
      statusCode: 404,
      body,
    };
  }

  return response;
}
My template.yaml:
Transform: AWS::Serverless-2016-10-31
Description: >
lambda-data-dictionary-read
Sample SAM Template for lambda-data-dictionary-read
Globals:
Function:
Timeout: 0
Resources:
IndexFunction:
Type: AWS::Serverless::Function
Properties:
CodeUri: app/
Handler: index.lambdaHandler
Runtime: nodejs16.x
Timeout: 10
Architectures:
- x86_64
Environment:
Variables:
          ENV: develop
          REDIS_URL: redis://127.0.0.1:6379
Events:
Index:
Type: Api
Properties:
Path: /api/lambda-data-dictionary-read
Method: get
Metadata:
BuildMethod: esbuild
BuildProperties:
Minify: true
Target: 'es2020'
Sourcemap: true
UseNpmCi: true
I'm using:
"scripts": {
"dev": "sam build --cached --beta-features && sam local start-api --port 8080 --host 127.0.0.1"
}
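A possible direction, sketched under assumptions rather than verified: inside the container that sam local starts, 127.0.0.1 refers to the container itself, not to the machine where redis-server is listening, so the client needs a hostname that resolves to the host. On Docker Desktop that is usually host.docker.internal; on Linux the usual route is to run Redis in a container and pass a shared network to sam local start-api via --docker-network. A minimal sketch of the client side, assuming AWS_SAM_LOCAL (which sam local sets) and that host.docker.internal resolves in your setup:

import { createClient } from 'redis';

// Sketch only: choose a Redis host that is reachable from inside the SAM Lambda container.
// host.docker.internal is a Docker Desktop convention and is an assumption here.
const redisUrl =
  process.env.REDIS_URL ??
  (process.env.AWS_SAM_LOCAL === 'true'
    ? 'redis://host.docker.internal:6379' // the host machine as seen from the container
    : 'redis://127.0.0.1:6379'); // plain local Node runs

const client = createClient({ url: redisUrl });
client.on('error', (err) => console.log('Redis Client Error', err));

Alternatively, REDIS_URL can be overridden at runtime with sam local start-api --env-vars env.json, which avoids hard-coding any hostname; both options are things to verify against your environment, not confirmed fixes.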
Related
I am trying to run a simple query locally in Node.js using Serverless, with the eventual goal of deploying an Apollo Server API to AWS Lambda.
However, I am not able to get anywhere near the deployment step, as Node appears unable to run a single instance of Apollo Server/Serverless locally in the first place, due to the errors explained below:
Steps I have taken:
git clone the example API and follow all instructions here: https://github.com/fullstack-hy2020/rate-repository-api (I ensured everything works perfectly)
Follow all instructions on the Apollo GraphQL docs up to "Running Server Locally": https://www.apollographql.com/docs/apollo-server/deployment/lambda/ - then run the following command: serverless invoke local -f graphql -p query.json
ERROR - "Cannot use import statement outside a module". Solution - add "type": "module" to package.json, then run: serverless invoke local -f graphql -p query.json
ERROR - Cannot find module 'C:\Users\Julius\Documents\Web Development\rate-repository-api\src\utils\authService' imported from C:\Users\Julius\Documents\Web Development\rate-repository-api\src\apolloServer.js. Solution - install webpack as per the solution in "Serverless does not recognise subdirectories in Node", then run: serverless invoke local -f graphql -p query.json
ERROR - Error [ERR_MODULE_NOT_FOUND]: Cannot find module 'C:\Users\Julius\Documents\Web Development\rate-repository-api\src\utils\authService' imported from C:\Users\Julius\Documents\Web Development\rate-repository-api\src\apolloServer.js
I do not know how to proceed from here, I am hoping that someone can point me in the right direction.
apolloServer.js:
import { ApolloServer, toApolloError, ApolloError } from '@apollo/server';
import { ValidationError } from 'yup';
import { startServerAndCreateLambdaHandler } from '@as-integrations/aws-lambda';
import AuthService from './utils/authService';
import createDataLoaders from './utils/createDataLoaders';
import logger from './utils/logger';
import { resolvers, typeDefs } from './graphql/schema';
const apolloErrorFormatter = (error) => {
logger.error(error);
const { originalError } = error;
const isGraphQLError = !(originalError instanceof Error);
let normalizedError = new ApolloError(
'Something went wrong',
'INTERNAL_SERVER_ERROR',
);
if (originalError instanceof ValidationError) {
normalizedError = toApolloError(error, 'BAD_USER_INPUT');
} else if (error.originalError instanceof ApolloError || isGraphQLError) {
normalizedError = error;
}
return normalizedError;
};
const createApolloServer = () => {
return new ApolloServer({
resolvers,
typeDefs,
formatError: apolloErrorFormatter,
context: ({ req }) => {
const authorization = req.headers.authorization;
const accessToken = authorization
? authorization.split(' ')[1]
: undefined;
const dataLoaders = createDataLoaders();
return {
authService: new AuthService({
accessToken,
dataLoaders,
}),
dataLoaders,
};
},
});
};
export const graphqlHandler = startServerAndCreateLambdaHandler(createApolloServer());
export default createApolloServer;
Serverless.yml:
service: apollo-lambda
provider:
name: aws
runtime: nodejs16.x
httpApi:
cors: true
functions:
graphql:
# Make sure your file path is correct!
# (e.g., if your file is in the root folder use server.graphqlHandler )
# The format is: <FILENAME>.<HANDLER>
handler: src/apolloServer.graphqlHandler
events:
- httpApi:
path: /
method: POST
- httpApi:
path: /
method: GET
custom:
webpack:
packager: 'npm'
webpackConfig: 'webpack.config.js' # Name of webpack configuration file
includeModules:
forceInclude:
- pg
Webpack.config.js
const path = require('path');
module.exports = {
mode: 'development',
entry: './src/index.js',
output: {
path: path.resolve(__dirname, 'build'),
filename: 'foo.bundle.js',
},
};
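One thing worth checking, offered as a hedged guess rather than a confirmed fix: with "type": "module" in package.json, Node's ESM resolver no longer appends file extensions, so the ERR_MODULE_NOT_FOUND for ./utils/authService can simply mean the relative imports in apolloServer.js need explicit .js suffixes, roughly:

// Sketch (assumption): explicit extensions on relative imports when the package runs as ESM.
import AuthService from './utils/authService.js';
import createDataLoaders from './utils/createDataLoaders.js';
import logger from './utils/logger.js';

If that resolves the module error, the webpack step may not be needed for local invocation at all.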
I am developing a REST API in Node.js using AWS Lambda and AWS API Gateway, with an AWS SAM template.
Below is my Node.js code. In it I am only trying to access a sample API over the internet and make a POST call.
const mysql = require('mysql2');
const errorCodes = require('source/error-codes');
const PropertiesReader = require('properties-reader');
const fetch = require('node-fetch');
const prop = PropertiesReader('properties.properties');
const con = mysql.createConnection({
host: prop.get('server.host'),
user: prop.get("server.username"),
password: prop.get("server.password"),
port: prop.get("server.port"),
database: prop.get("server.dbname")
});
exports.testApi = async (event, context) => {
context.callbackWaitsForEmptyEventLoop = false;
con.config.namedPlaceholders = true;
if (event.body == null && event.body == undefined) {
var response = errorCodes.missing_parameters;
return response;
}
let body = JSON.parse(event.body)
if (body.key == null ) {
console.log("fire 1");
var response = errorCodes.not_null_parameters;
return response;
}
try {
let key = body.key;
console.log("body", body);
var notificationMessage = {
"key": key
};
const notificationResponse = await fetch("https://reqbin.com/sample/post/json", {
method: 'post',
body: JSON.stringify(notificationMessage),
headers: {
'Content-Type': 'application/json'
}
});
const data = await notificationResponse.json();
//Return the response
var response = {
"statusCode": 200,
"headers": {
"Content-Type": "application/json"
},
"body": JSON.stringify({
"message": data
}),
"isBase64Encoded": false
};
return response;
} catch (error) {
console.log(error);
//Return the response
var response = {
"statusCode": 500,
"headers": {
"Content-Type": "application/json"
},
"body": JSON.stringify({
"error": error
}),
"isBase64Encoded": false
};
return response;
}
};
Below is my template.yaml file. It references a nested template.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
xxx-restapi
Sample SAM Template for xxx-restapi
# More info about Globals: https://github.com/awslabs/serverless-application-model/blob/master/docs/globals.rst
Globals:
Function:
Timeout: 5
VpcConfig:
SecurityGroupIds:
- sg-xxxxx
SubnetIds:
- subnet-xxxx
- subnet-aaaa
- subnet-bbbb
- subnet-cccc
- subnet-dddd
- subnet-eeee
Parameters:
FirebaseProjectId:
Type: String
#Don't create this domain in the AWS Console manually, or it will fail here
DomainName:
Type: String
Default: api2.someapp.com
Resources:
# Authentication required HTTP API
AuthGatewayHttpApi:
Type: AWS::Serverless::HttpApi
Properties:
Domain:
DomainName: !Ref DomainName
EndpointConfiguration: REGIONAL
CertificateArn: arn:aws:acm:us-east-1:xxxxxx:certificate/bac44716-xxxx-431b-xxxx-xxxx
Route53:
HostedZoneId: xxxxxxx
IpV6: true
Auth:
Authorizers:
FirebaseAuthorizer:
IdentitySource: $request.header.Authorization
JwtConfiguration:
audience:
- !Ref FirebaseProjectId
issuer: !Sub https://securetoken.google.com/${FirebaseProjectId}
DefaultAuthorizer: FirebaseAuthorizer
# Authentication NOT required HTTP API
NoAuthGatewayHttpApi:
Type: AWS::Serverless::HttpApi
Properties:
Domain:
BasePath: noauth
DomainName: !Ref DomainName
CertificateArn: arn:aws:acm:us-east-1:xxxx:certificate/xxx-420d-xxx-xxx-xxxx
Route53:
HostedZoneId: xxxxxx
# Lambda settings
LambdaRole:
Type: 'AWS::IAM::Role'
Properties:
AssumeRolePolicyDocument:
Version: "2012-10-17"
Statement:
- Effect: Allow
Principal:
Service:
- lambda.amazonaws.com
Action:
- 'sts:AssumeRole'
Path: /
ManagedPolicyArns:
- arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
Policies:
- PolicyName: root
PolicyDocument:
Version: "2012-10-17"
Statement:
- Effect: Allow
Action:
- ec2:DescribeNetworkInterfaces
- ec2:CreateNetworkInterface
- ec2:DeleteNetworkInterface
- ec2:DescribeInstances
- ec2:AttachNetworkInterface
Resource: '*'
Outputs:
# ServerlessRestApi is an implicit API created out of Events key under Serverless::Function
# Find out more about other implicit resources you can reference within SAM
# https://github.com/awslabs/serverless-application-model/blob/master/docs/internals/generated_resources.rst#api
SharedValueOutput:
Value: !Ref FirebaseProjectId
Description: You can refer to any resource from the template.
# HelloWorldApi:
# Description: "API Gateway endpoint URL for Prod stage for functions"
# Value: !Sub "https://${ServerlessRestApi}.execute-api.${AWS::Region}.amazonaws.com/Prod/"
Below is the nested template file
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
xxx-restapi
Sample SAM Template for xxx-restapi
Globals:
Function:
Timeout: 30
VpcConfig:
SecurityGroupIds:
- sg-xxxx
SubnetIds:
- subnet-xxx
- subnet-aaa
- subnet-ccc
- subnet-sss
- subnet-fff
- subnet-eee
Parameters:
FirebaseProjectId:
Type: String
DomainName:
Type: String
Resources:
NoAuthGatewayHttpApi2:
Type: AWS::Serverless::HttpApi
Properties:
StageName: Prod
MyApiMapping:
DependsOn: NoAuthGatewayHttpApi2
Type: AWS::ApiGatewayV2::ApiMapping
Properties:
ApiMappingKey: no-auth
DomainName: api2.xxx.com
ApiId: !Ref NoAuthGatewayHttpApi2
Stage: !Ref NoAuthGatewayHttpApi2.Stage
TestPostFunction:
Type: AWS::Serverless::Function
Properties:
CodeUri: xxx-restapi/
Handler: source/fcm/test-api.testApi
Runtime: nodejs14.x
Events:
GetRtcTokenAPIEvent:
Type: HttpApi
Properties:
Path: /fcm/test-api
Method: post
ApiId: !Ref NoAuthGatewayHttpApi2
When I execute the Lambda function in my local environment it works fine. But if I execute the same function after uploading to AWS, it gives me the following error with a 503 status code.
{
"message": "Service Unavailable"
}
Below is my CloudWatch log.
START RequestId: e84242ea-xxx-4aa0-xxx-xxx Version: $LATEST
2022-06-12T05:55:52.914Z e84242ea-xxx-4aa0-xxx-xxx INFO body { key: 'value' }
END RequestId: e84242ea-xxx-xxx-dd37f2a005c0
REPORT RequestId: e84242ea-xxx-4aa0xxx993e-xxx Duration: 30032.56 ms Billed Duration: 30000 ms Memory Size: 128 MB Max Memory Used: 72 MB Init Duration: 315.65 ms
2022-06-12T05:56:22.936Z xxx-b568-xxx-993e-xxx Task timed out after 30.03 seconds
The timeout error you see above just masks the real problem. I even tried this with the time limit set to 2 minutes, same result. In my local environment with Docker this works in seconds.
After googling I figured out I may not have enabled internet access from my Lambda functions. In my API, all the Lambda functions that do not require outbound internet access work fine.
How can I fix this error?
I would first try to debug the networking aspect. Can anything in those subnets connect to the database? Are the security groups correct?
You might find it easier to create an EC2 instance and debug the networking from there. It certainly seems like a connection timing out.
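If it helps to narrow that down, here is a rough probe sketch using only the Node standard library (the handler name and endpoint are made up for illustration); deployed with the same VpcConfig, a timeout here would suggest the subnets have no outbound route to the internet, which matches the symptom above:

// Hypothetical probe handler, not part of the original code: one small outbound HTTPS request
// with a short socket timeout, so a blocked route shows up well before the Lambda's own timeout.
import https from 'https';

export const probe = (): Promise<{ statusCode: number; body: string }> =>
  new Promise((resolve) => {
    const req = https.get('https://checkip.amazonaws.com', { timeout: 5000 }, (res) => {
      let data = '';
      res.on('data', (chunk) => (data += chunk));
      res.on('end', () => resolve({ statusCode: 200, body: data.trim() }));
    });
    req.on('timeout', () => req.destroy(new Error('socket timeout')));
    req.on('error', (err) => resolve({ statusCode: 500, body: 'outbound request failed: ' + err.message }));
  });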
I got the 503 error as well; it turned out to be the API Gateway timeout problem.
I am trying to save a JSON file from a Lambda function with Node, but the file is not showing up in my S3 bucket. From what I found, I have to make some change in my template.yaml file, but I don't know what it is. I'm getting a positive response from the API POST call, but nothing happens in the bucket.
How can I fix this?
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
webhook
Sample SAM Template for webhook
Globals:
Function:
Timeout: 3
Resources:
HelloWorldFunction:
Type: AWS::Serverless::Function
Properties:
CodeUri: webhook_nath/
Handler: app.lambdaHandler
Runtime: nodejs14.x
Architectures:
- x86_64
Events:
HelloWorld:
Type: Api
Properties:
Path: /webhook
Method: POST
Metadata: # Manage esbuild properties
BuildMethod: esbuild
BuildProperties:
Minify: true
Target: "es2020"
Sourcemap: true
EntryPoints:
- app.ts
import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';
import aws from 'aws-sdk';
const s3 = new aws.S3();
export const lambdaHandler = async (event: APIGatewayProxyEvent): Promise<APIGatewayProxyResult> => {
let response: APIGatewayProxyResult;
try {
s3.putObject({
Body: JSON.stringify(event.body),
Bucket: 'new-bucket-nath',
Key: 'file_name.json',
ContentType:'application/json'
}).promise();
response = {
statusCode: 200,
body: JSON.stringify({
message: 'worked',
}),
};
} catch (err) {
console.log(err);
response = {
statusCode: 500,
body: JSON.stringify({
message: 'some error happened',
}),
};
}
return response;
};
The call to s3.putObject needs an await; without it, the Lambda can return before the upload completes and the object never appears in the bucket.
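Applied to the handler in the question, the corrected call would look roughly like this (same bucket, key, and body as above):

// Awaiting the S3 promise keeps the handler alive until the upload actually finishes.
await s3
  .putObject({
    Body: JSON.stringify(event.body),
    Bucket: 'new-bucket-nath',
    Key: 'file_name.json',
    ContentType: 'application/json',
  })
  .promise();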
I'm currently testing out AWS SAM with DynamoDB Local using Docker.
Here are the steps that I followed (mostly found on the internet):
Create new docker network using docker network create local-dev.
Run DynamoDB Local: docker run -d -v "$PWD":/dynamodb_local_db -p 8000:8000 --network local-dev --name dynamodb amazon/dynamodb-local. Up to this point, I am able to create and list tables using the AWS CLI.
Then, I proceed with running AWS SAM sam local start-api --docker-network local-dev. Everything looks okay.
Invoked lambda.js, but it looks like there is no output from console.log(err) or console.log(data).
I'm not sure where it could be going wrong. Please help me. Thank you in advance!
lambda.js
const services = require('./services.js');
const AWS = require('aws-sdk');
let options = {
apiVersion: '2012-08-10',
region: 'ap-southeast-1',
}
if(process.env.AWS_SAM_LOCAL) {
options.endpoint = new AWS.Endpoint('http://localhost:8000')
}
const dynamoDB = new AWS.DynamoDB(options);
exports.getUser = async (event, context) => {
let params = {};
dynamoDB.listTables(params, (err, data) => {
if(err) console.log(err)
else console.log(data)
})
return true;
}
template.yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Serverless Resources
Parameters:
FunctionsCodeBucket:
Type: String
Description: CodeBucket
FunctionsCodeKey:
Type: String
Description: CodeKey
FunctionsCodeVersion:
Type: String
Description: CodeVersion
NodeEnv:
Type: String
Description: NodeEnv
Globals:
Api:
Cors:
AllowMethods: "'OPTIONS,POST,GET,DELETE,PUT'"
AllowHeaders: "'Content-Type,X-Amz-Date,Authorization,X-Api-Key,X-Amz-Security-Token,Api-Key,api-key'"
AllowOrigin: "'*'"
Function:
Timeout: 300
Runtime: nodejs10.x
MemorySize: 128
CodeUri: ./
Resources:
DevResources:
Type: AWS::Serverless::Function
Properties:
Handler: "index.routes"
Environment:
Variables:
NODE_ENV: !Ref NodeEnv
# REGION: !Ref "AWS::Region"
Policies:
- Version: '2012-10-17'
Statement:
- Action:
- dynamodb:*
Effect: Allow
Resource: "*"
Events:
GetUser:
Type: Api
Properties:
Path: /user
Method: get
Your lambda function does not wait for the dynamoDB.listTables operation. You can fix this issue by using the promisified version of dynamoDB.listTables as follows:
exports.getUser = async (event, context) => {
let params = {};
try {
const resp = await dynamoDB.listTables(params).promise();
console.log(resp);
} catch (err) {
console.log(err)
}
};
Another thing that you will likely need to do is assign a network alias to your dynamodb container (you can do that with the --network-alias=<alias> option); for example, let's set the alias to dynamodb:
docker run -d -v "$PWD":/dynamodb_local_db -p 8000:8000 --network local-dev --network-alias=dynamodb --name dynamodb amazon/dynamodb-local
After that you can use this network alias in your lambda function:
if(process.env.AWS_SAM_LOCAL) {
options.endpoint = new AWS.Endpoint('http://dynamodb:8000')
}
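As a quick end-to-end check (a sketch: the network name and the /user route come from the question, and SAM's default port 3000 is assumed), start the API on the same Docker network and hit the route so the dynamodb alias resolves from inside the Lambda container:

sam local start-api --docker-network local-dev
curl http://127.0.0.1:3000/user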
Not sure what is happening with my code.
It never executes my updated code locally. If I update my code and run sls invoke local, it still runs the old code. Also, it does not send out the SES email.
For some reason, it always executes the code that was already deployed to AWS rather than my local code. This is confusing.
Below is my serverless.yml:
service: lambda-test
provider:
name: aws
runtime: nodejs8.10
stage: dev
region: ap-southeast-1
iamRoleStatements:
- Effect: Allow
Action:
- lambda:InvokeFunction
- lambda:InvokeAsync
Resource: "*"
functions:
hello:
handler: handler.hello
environment:
events:
- http:
path: /
method: get
trulyYours:
handler: handler.trulyYours
environment:
events:
- http:
path: /trulyYours
method: get
sendRegistrationEmail:
handler: handler.sendRegistrationEmail
environment:
events:
- http:
path: /sendRegistrationEmail
method: get
plugins:
- serverless-offline
I am not sure if I should continue to edit code in the AWS web console itself or try setting up a local dev environment. I have been trying for the last two days, but it is turning out to be a waste of time.
'use strict';
var aws = require("aws-sdk");
var nodeMailer = require("nodemailer");
//aws.config.loadFromPath('aws_config.json');
var ses = new aws.SES();
var s3 = new aws.S3();
module.exports.hello = async (event, context) => {
console.log("executing lambda function 'hello' XX...");
// return {
// statusCode: 200,
// body: JSON.stringify({
// message: 'v1.0',
// }),
// };
// Use this code if you don't use the http event with the LAMBDA-PROXY integration
// return { message: 'Go Serverless v1.0! Your function executed successfully!', event };
};
//
exports.trulyYours = async (event, context, callback) => {
console.log('Lambda trulyYours Received event:', JSON.stringify(event, null, 2));
//context.succeed('Hello from' + event.name);
return {
statusCode: 200,
body: JSON.stringify({
message: 'hello from trulyYours' + event.name,
}),
};
}
/*function trulyYours (foo, bar) {
// MyLambdaFunction logic here
}*/
module.exports.sendRegistrationEmail = (event, context) => {
console.log("Executing sendRegistrationEmail...");
var lambda = new aws.Lambda({
region: 'ap-southeast-1' //change to your region
});
var params = {
FunctionName: 'lambda-test-dev-trulyYours', // the lambda function we are going to invoke
InvocationType: 'RequestResponse',
LogType: 'Tail',
Payload: '{ "name" : "Alexa" }'
};
lambda.invoke(params, function (err, data) {
if (err) {
context.fail(err);
} else if (data.Payload) {
context.succeed('Lambda trulyYours ' + data.Payload);
}
});
//
// return {
// statusCode: 200,
// body: JSON.stringify({
// message: 'sent email successfully',
// }),
// };
// Use this code if you don't use the http event with the LAMBDA-PROXY integration
// return { message: 'Go Serverless v1.0! Your function executed successfully!', event };
};
Try to create a serverless-local.yml, for example:
config:
region: eu-west-1
environment:
VAR: local
database:
hostname: localhost
port: 8000
username: root
password: toor
database: db
url: http://localhost:3000
and in your serverless.yml, under provider, add this line:
stage: ${opt:stage, 'dev'}
then in your terminal try this CLI command:
sls invoke local -f functionName -s local
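For the stage file to actually take effect, serverless.yml also needs to read values from it; a minimal sketch, assuming a serverless-<stage>.yml file sits next to serverless.yml for each stage you use:

# Sketch: pull region and environment from the per-stage file selected by -s/--stage
provider:
  name: aws
  runtime: nodejs8.10
  stage: ${opt:stage, 'dev'}
  region: ${file(./serverless-${opt:stage, 'dev'}.yml):config.region}
  environment: ${file(./serverless-${opt:stage, 'dev'}.yml):config.environment}

With that in place, sls invoke local -f hello -s local runs the local handler code with the values from serverless-local.yml.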