How to deploy NestJS services with GraphQL and Serverless Lambda on LocalStack? - node.js

Solution
I found the solution, thanks for your time. Stack Overflow doesn't let me answer my own question. >:(
Description
I would like to know what configuration I need in order to deploy a NestJS application that uses GraphQL (not a REST API) as a Lambda on LocalStack. This is my configuration.
Is this configuration correct?
How can I get the fake link to access the GraphQL playground through the LocalStack Lambda?
Implementation
Configuration of the NestJS Lambda function project
import { ValidationPipe } from '@nestjs/common';
import { NestFactory } from '@nestjs/core';
import { ExpressAdapter } from '@nestjs/platform-express';
import serverlessExpress from '@vendia/serverless-express';
import { APIGatewayProxyHandler, Handler } from 'aws-lambda';
import express from 'express';
import { AppModule } from './app.module';

let cachedServer: Handler;

const bootstrapServer = async (): Promise<Handler> => {
  const expressApp = express();
  const app = await NestFactory.create(
    AppModule,
    new ExpressAdapter(expressApp),
  );
  app.useGlobalPipes(new ValidationPipe());
  app.enableCors();
  await app.init();
  return serverlessExpress({
    app: expressApp,
  });
};

export const handler: APIGatewayProxyHandler = async (
  event,
  context,
  callback,
) => {
  if (!cachedServer) {
    cachedServer = await bootstrapServer();
  }
  return cachedServer(event, context, callback);
};
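Once the project is compiled, the handler above can be exercised directly with a hand-built event before deploying to LocalStack. A minimal sketch, assuming the build output lands in dist/index.js, the GraphQL endpoint is mounted at /graphql, and that @vendia/serverless-express accepts this trimmed-down API Gateway v1 proxy event (a real event carries more fields):

// smoke-test.ts — rough local check of the Lambda entry point.
import { handler } from './dist/index';

const event: any = {
  httpMethod: 'POST',
  path: '/graphql',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: '{ __typename }' }),
  isBase64Encoded: false,
  queryStringParameters: null,
  requestContext: { httpMethod: 'POST', path: '/graphql' },
};

(async () => {
  const result: any = await (handler as any)(event, {} as any, () => undefined);
  console.log(result?.statusCode, result?.body);
})();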
Serverless.yml Configuration
service: test
provider:
  name: aws
  runtime: nodejs14.x
  stage: ''
  # profile: local # Config your AWS Profile
  environment: # Service wide environment variables
    NODE_ENV: local
    GLOBAL_PREFIX: graphql
    PORT: 4000
plugins:
  # - serverless-plugin-typescript
  # - serverless-plugin-optimize
  # - serverless-offline
  - serverless-localstack
custom:
  localstack:
    debug: true
    stages:
      - local
      - dev
    endpointFile: localstack_endpoints.json
  individually: true
  # serverless-offline:
  #   httpPort: 3000
functions:
  main:
    handler: dis/index.handler
    # local:
    #   handler: dist/main.handler
    # events:
    #   - http:
    #       path: /
    #       method: any
    #       cors: true
Docker Compose Localstack
version: '3.8'
services:
  localstack:
    image: localstack/localstack:latest
    environment:
      - AWS_DEFAULT_REGION=us-east-1
      - EDGE_PORT=4566
      - SERVICES=lambda,s3,cloudformation,sts
    ports:
      - '4566-4597:4566-4597'
    volumes:
      - "${TMPDIR:-/tmp}/localstack:/tmp/localstack"
      - "/var/run/docker.sock:/var/run/docker.sock"
Services
{
  "CloudFormation": "http://localhost:4566",
  "CloudWatch": "http://localhost:4566",
  "Lambda": "http://localhost:4566",
  "S3": "http://localhost:4566"
}
Commands
serverless deploy --stage local
When executing this command, I get this error.
serverless info --stage local
When executing this command, I get this information.
serverless invoke local -f main -l
When executing this command

Not sure, but you probably have a typo here in serverless.yml:
handler: dis/index.handler
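For the second part of the question (getting the fake link to the playground), here is a possible sketch, not a confirmed recipe: it assumes the function is exposed through an http event (like the ones commented out above) so that an API Gateway REST API exists in LocalStack, and it assumes aws-sdk v2, the default edge port 4566, and the stage name local. The _user_request_ URL format is the one LocalStack's API Gateway uses (see the related question below).

// get-playground-url.ts — list the REST APIs registered in LocalStack and
// print the corresponding _user_request_ URLs for the /graphql path.
import { APIGateway } from 'aws-sdk';

const apigw = new APIGateway({
  endpoint: 'http://localhost:4566', // LocalStack edge
  region: 'us-east-1',
  accessKeyId: 'test', // dummy credentials are enough for LocalStack
  secretAccessKey: 'test',
});

(async () => {
  const { items = [] } = await apigw.getRestApis().promise();
  for (const api of items) {
    // 'local' is the assumed stage name; 'graphql' matches GLOBAL_PREFIX above
    console.log(`http://localhost:4566/restapis/${api.id}/local/_user_request_/graphql`);
  }
})();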

Related

Serverless Cannot Run Simple Example Query Using Node.JS

I am trying to run a simple query locally in Node.js using Serverless, with the eventual goal of deploying an Apollo Server API to AWS Lambda.
However, I am nowhere near the deployment step, because Node appears unable to run a single instance of Apollo Server/Serverless locally in the first place, due to the errors explained below.
Steps I have taken:
1. git clone the example API and follow all instructions here: https://github.com/fullstack-hy2020/rate-repository-api (I ensured everything works perfectly).
2. Follow all instructions on Apollo GraphQL up to "Running Server Locally" (https://www.apollographql.com/docs/apollo-server/deployment/lambda/), then run: serverless invoke local -f graphql -p query.json
3. ERROR: cannot use import statement outside module. Solution: add "type": "module" to package.json, then run: serverless invoke local -f graphql -p query.json
4. ERROR: Cannot find module 'C:\Users\Julius\Documents\Web Development\rate-repository-api\src\utils\authService' imported from C:\Users\Julius\Documents\Web Development\rate-repository-api\src\apolloServer.js. Solution: install webpack as per the answer in "Serverless does not recognise subdirectories in Node", then run: serverless invoke local -f graphql -p query.json
5. ERROR: Error [ERR_MODULE_NOT_FOUND]: Cannot find module 'C:\Users\Julius\Documents\Web Development\rate-repository-api\src\utils\authService' imported from C:\Users\Julius\Documents\Web Development\rate-repository-api\src\apolloServer.js
I do not know how to proceed from here; I am hoping that someone can point me in the right direction.
File Structure:
apolloServer.js:
import { ApolloServer, toApolloError, ApolloError } from '@apollo/server';
import { ValidationError } from 'yup';
import { startServerAndCreateLambdaHandler } from '@as-integrations/aws-lambda';
import AuthService from './utils/authService';
import createDataLoaders from './utils/createDataLoaders';
import logger from './utils/logger';
import { resolvers, typeDefs } from './graphql/schema';

const apolloErrorFormatter = (error) => {
  logger.error(error);
  const { originalError } = error;
  const isGraphQLError = !(originalError instanceof Error);
  let normalizedError = new ApolloError(
    'Something went wrong',
    'INTERNAL_SERVER_ERROR',
  );
  if (originalError instanceof ValidationError) {
    normalizedError = toApolloError(error, 'BAD_USER_INPUT');
  } else if (error.originalError instanceof ApolloError || isGraphQLError) {
    normalizedError = error;
  }
  return normalizedError;
};

const createApolloServer = () => {
  return new ApolloServer({
    resolvers,
    typeDefs,
    formatError: apolloErrorFormatter,
    context: ({ req }) => {
      const authorization = req.headers.authorization;
      const accessToken = authorization
        ? authorization.split(' ')[1]
        : undefined;
      const dataLoaders = createDataLoaders();
      return {
        authService: new AuthService({
          accessToken,
          dataLoaders,
        }),
        dataLoaders,
      };
    },
  });
};

export const graphqlHandler = startServerAndCreateLambdaHandler(createApolloServer());

export default createApolloServer;
Serverless.yml:
service: apollo-lambda
provider:
  name: aws
  runtime: nodejs16.x
  httpApi:
    cors: true
functions:
  graphql:
    # Make sure your file path is correct!
    # (e.g., if your file is in the root folder use server.graphqlHandler )
    # The format is: <FILENAME>.<HANDLER>
    handler: src/apolloServer.graphqlHandler
    events:
      - httpApi:
          path: /
          method: POST
      - httpApi:
          path: /
          method: GET
custom:
  webpack:
    packager: 'npm'
    webpackConfig: 'webpack.config.js' # Name of webpack configuration file
    includeModules:
      forceInclude:
        - pg
Webpack.config.js
const path = require('path');

module.exports = {
  mode: 'development',
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'build'),
    filename: 'foo.bundle.js',
  },
};
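One thing worth checking for the final ERR_MODULE_NOT_FOUND, as a guess rather than a confirmed fix: with "type": "module" in package.json, Node's ESM loader requires explicit file extensions on relative imports, so the extension-less imports in apolloServer.js fail to resolve even though the files exist. A sketch of the adjusted import lines:

// apolloServer.js (sketch) — relative ESM imports need their .js extension.
import AuthService from './utils/authService.js';
import createDataLoaders from './utils/createDataLoaders.js';
import logger from './utils/logger.js';
import { resolvers, typeDefs } from './graphql/schema.js';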

API Gateway refuses to connect to S3, DynamoDB using Localstack and serverless

I'm developing a LocalStack application where I need to connect an S3 bucket and DynamoDB to API Gateway. When I invoke my lambda from the command line using
serverless invoke local -f helloagain
it works just fine, but when I invoke it through Postman via API Gateway, e.g. http://localhost:4566/restapis/82wh1mpf11/local/_user_request_/helloagain, it gives an internal server error.
Read operations on S3 and DynamoDB work fine; it becomes a tedious task when it comes to writing data into these resources.
Sharing my code snippets below.
This is the docker-compose file:
version: "3.1"
services:
localstack:
image: localstack/localstack:latest
environment:
- AWS_DEFAULT_REGION=us-east-1
- EDGE_PORT=4566
- SERVICES=lambda,s3,cloudformation,sts,apigateway,iam,route53
ports:
- "4566-4597:4566-4597"
volumes:
- "${TEMPDIR:-/tmp/localstack}:/temp/localstack"
- "/var/run/docker.sock:/var/run/docker.sock"
This is the serverless.yml:
service: localstack-lambda
plugins:
  - serverless-localstack
custom:
  localstack:
    debug: true
    stages:
      - local
      - dev
    endpointFile: localstack_endpoints.json
# frameworkVersion: "2"
provider:
  name: aws
  runtime: nodejs12.x
#functions are pretty important
functions:
  hello:
    handler: handler.hello
    events:
      - http:
          path: hello
          method: any
  helloagain:
    handler: handler.helloagain
    events:
      - http:
          path: helloagain
          method: any
  createbucket:
    handler: s3handler.createbucket
    events:
      - http:
          path: createbucket
          method: any
  createDDB:
    handler: ddb.createDDB
    events:
      - http:
          path: createDDB
          method: any
  listTables:
    handler: ddb.listTables
    events:
      - http:
          path: listTables
          method: any
This is the helloagain lambda script:
const aws = require('aws-sdk');
const fs = require('fs');

const s3 = new aws.S3({
  apiVersion: '2006-03-01',
  endpoint: `http://localhost:4566`, // These two lines are
  s3ForcePathStyle: true,            // only needed for LocalStack.
});

module.exports.helloagain = async (event) => {
  const file_buffer = fs.readFileSync("demo2.txt");
  const params = {
    Bucket: "newbucket",
    Key: "demo2.txt",
    Body: file_buffer
  };
  const res = await s3.upload(params).promise();
  return {
    statusCode: 200,
    body: JSON.stringify({
      res
    })
  };
};
I've already verified, by invoking this lambda from the command line, that:
- the file demo2.txt exists
- the bucket exists
I'm getting this error in the console:
localstack.services.awslambda.lambda_executors.InvocationException: Lambda process returned with error. Result: {"errorType":"NetworkingError","errorMessage":"connect ECONNREFUSED 127.0.0.1:4566"}.
and this in Postman.
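Not a confirmed fix, but the connect ECONNREFUSED 127.0.0.1:4566 suggests the Lambda runs in its own container, where 127.0.0.1 is the Lambda container itself rather than LocalStack. LocalStack exposes its own address to Lambdas through the LOCALSTACK_HOSTNAME environment variable, which can replace the hard-coded localhost (the EDGE_PORT fallback below is an assumption):

// Sketch of the S3 client setup only; the rest of the handler stays unchanged.
import { S3 } from 'aws-sdk';

const host = process.env.LOCALSTACK_HOSTNAME || 'localhost';
const port = process.env.EDGE_PORT || '4566';

const s3 = new S3({
  apiVersion: '2006-03-01',
  endpoint: `http://${host}:${port}`,
  s3ForcePathStyle: true, // still needed for LocalStack
});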

Serverless is unable to find the handler in nestjs

I am trying to deploy my NestJS project to AWS Lambda with the Serverless Framework; I followed the official documentation.
I created a file serverless.ts in the src directory, alongside the main.ts file.
When I run sls offline, it runs successfully and gives the following output:
Running "serverless" from node_modules
Starting Offline at stage dev (us-east-1)
Offline [http for lambda] listening on http://localhost:3002
Function names exposed for local invocation by aws-sdk:
* main: projectName-dev-main
┌────────────────────────────────────────────────────────────────────────┐
│ │
│ ANY | http://localhost:4000/ │
│ POST | http://localhost:4000/2015-03-31/functions/main/invocations │
│ ANY | http://localhost:4000/{proxy*} │
│ POST | http://localhost:4000/2015-03-31/functions/main/invocations │
│ │
└────────────────────────────────────────────────────────────────────────┘
Server ready: http://localhost:4000 🚀
But when I open the localhost URL http://localhost:4000, I get a 502 Bad Gateway and the following error:
ANY / (λ: main)
✖ Unhandled exception in handler 'main'.
✖ Error: Cannot find module 'main'
Require stack:
- /home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/in-process-runner/aws-lambda-ric/UserFunction.js
✖ Runtime.ImportModuleError: Error: Cannot find module 'main'
Require stack:
- /home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/in-process-runner/aws-lambda-ric/UserFunction.js
at _loadUserApp (/home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/in-process-runner/aws-lambda-ric/UserFunction.js:310:15)
at async module.exports.load (/home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/in-process-runner/aws-lambda-ric/UserFunction.js:341:21)
at async InProcessRunner.run (file:///home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/in-process-runner/InProcessRunner.js:41:21)
at async MessagePort.<anonymous> (file:///home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/worker-thread-runner/workerThreadHelper.js:25:14)
Here is the code:
main.ts
import { NestFactory } from '@nestjs/core';
import { ValidationPipe } from '@nestjs/common';
import { AppModule } from './app.module';

async function bootstrap() {
  process.env.TZ = 'Asia/Calcutta';
  const app = await NestFactory.create(AppModule);
  const port = process.env.PORT || 3000;
  app.useGlobalPipes(new ValidationPipe({ whitelist: true }));
  await app.listen(port);
}
bootstrap();
serverless.ts
import { NestFactory } from '@nestjs/core';
import serverlessExpress from '@vendia/serverless-express';
import { Callback, Context, Handler } from 'aws-lambda';
import { AppModule } from './app.module';

let server: Handler;

async function bootstrap(): Promise<Handler> {
  const app = await NestFactory.create(AppModule);
  await app.init();
  const expressApp = app.getHttpAdapter().getInstance();
  return serverlessExpress({ app: expressApp });
}

export const handler: Handler = async (
  event: any,
  context: Context,
  callback: Callback,
) => {
  server = server ?? (await bootstrap());
  return server(event, context, callback);
};
serverless.yaml
service: myProject
useDotenv: true
plugins:
  - serverless-offline
  # - serverless-plugin-optimize
provider:
  name: aws
  runtime: nodejs16.x
  region: us-east-1
  profile: default
  memorySize: 128
  stage: dev
  environment:
    TZ: ${env:TZ}
functions:
  main:
    handler: dist/main.handler
    events:
      - http:
          method: ANY
          path: /
      - http:
          method: ANY
          path: '{proxy+}'
custom:
  serverless-offline:
    noPrependStageInUrl: true
    httpPort: 4000
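Two things worth checking here, as guesses rather than confirmed fixes: that dist/main.js actually exists after the build (the Cannot find module 'main' error points at the compiled file being missing or at a different path), and that the module the handler path names actually exports handler; in the code above the export lives in serverless.ts, not main.ts. A sketch of re-exporting it so that dist/main.handler resolves (pointing the handler at dist/serverless.handler would be the alternative):

// main.ts (sketch) — re-export the Lambda handler defined in serverless.ts
// so the serverless.yaml entry `handler: dist/main.handler` can find it.
export { handler } from './serverless';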

Use Redis with AWS SAM (Redis Client Error)

Right now, what I'm trying to do is query the Redis service every time a request is made. The problem is that with a basic configuration it does not work. The error is the following:
INFO Redis Client Error Error: connect ... at TCPConnectWrap.afterConnect [as oncomplete] (node:...) port: 6379, address: '127.0.0.1'
I have redis-server running as always, with its corresponding credentials, listening on 127.0.0.1:6379. I know that AWS SAM runs in a container, and the issue is probably due to network configuration, but the only option the AWS SAM CLI gives me is --host. How could I fix this?
My code is the following, although it is not very relevant:
import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';
import { createClient } from 'redis';
import processData from './src/lambda-data-dictionary-read/core/service/controllers/processData';

export async function lambdaHandler(event: APIGatewayProxyEvent): Promise<APIGatewayProxyResult> {
  const body: any = await processData(event.queryStringParameters);
  const url = process.env.REDIS_URL || 'redis://127.0.0.1:6379';
  const client = createClient({
    url,
  });
  client.on('error', (err) => console.log('Redis Client Error', err));
  await client.connect();
  await client.set('key', 'value');
  const value = await client.get('key');
  console.log('----', value, '----');
  const response: APIGatewayProxyResult = {
    statusCode: 200,
    body,
  };
  if (body.error) {
    return {
      statusCode: 404,
      body,
    };
  }
  return response;
}
My template.yaml:
Transform: AWS::Serverless-2016-10-31
Description: >
  lambda-data-dictionary-read
  Sample SAM Template for lambda-data-dictionary-read
Globals:
  Function:
    Timeout: 0
Resources:
  IndexFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: app/
      Handler: index.lambdaHandler
      Runtime: nodejs16.x
      Timeout: 10
      Architectures:
        - x86_64
      Environment:
        Variables:
          ENV: !Ref develope
          REDIS_URL: !Ref redis://127.0.0.1:6379
      Events:
        Index:
          Type: Api
          Properties:
            Path: /api/lambda-data-dictionary-read
            Method: get
    Metadata:
      BuildMethod: esbuild
      BuildProperties:
        Minify: true
        Target: 'es2020'
        Sourcemap: true
        UseNpmCi: true
I'm using:
"scripts": {
  "dev": "sam build --cached --beta-features && sam local start-api --port 8080 --host 127.0.0.1"
}
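A possible direction, not a confirmed fix: sam local start-api runs the function in its own Docker container, so 127.0.0.1 inside the handler is the Lambda container, not the machine where redis-server is listening. On Docker Desktop the host can usually be reached as host.docker.internal (an assumption; Linux needs extra setup such as --add-host), and the value can be fed in through REDIS_URL, which the handler above already reads:

// Sketch of the Redis client setup only.
import { createClient } from 'redis';

const url = process.env.REDIS_URL || 'redis://host.docker.internal:6379';
const client = createClient({ url });
client.on('error', (err) => console.log('Redis Client Error', err));

REDIS_URL could then be supplied through the Environment section of template.yaml or the --env-vars option of sam local start-api.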

Localstack lambda trying to reach AWS outside container

I have a LocalStack image:
version: '3'
services:
  localstack:
    image: localstack:latest
    ports:
      - 4566:4566
    environment:
      - SERVICES=ssm
      - DEBUG=1
      - DATA_DIR=/tmp/localstack/data
      - LOCALSTACK_HOSTNAME=localstack
      - LAMBDA_EXECUTOR=local
      - LAMBDA_REMOTE_DOCKER=true
    volumes:
      - './.localstack:/tmp/localstack'
      - '/var/run/docker.sock:/var/run/docker.sock'
I created a lambda which runs inside the Docker container:
service: service-trigger-action-runner
provider:
  name: aws
  region: eu-central-1
  runtime: nodejs12.x
plugins:
  - serverless-offline
custom:
  defaultStage: local
functions:
  trigger_action_runner:
    handler: ../src/trigger_action_runner.handler
    environment:
      URL_SSM: 'http://localhost:4566'
      AWS_ACCESS_KEY_ID: 'ABCDEFGH123456'
      AWS_SECRET_ACCESS_KEY: 'key'
Launching it all in a Node.js integration test, I'm sending an event in order to trigger my lambda; this part works successfully:
const lambda = new Lambda({
  apiVersion: '2031',
  endpoint: 'http://localhost:3000'
})

const params = {
  FunctionName: 'service-trigger-action-runner-dev-trigger_action_runner',
  InvocationType: 'RequestResponse',
  Payload: JSON.stringify({ eventTypeAliasName: "fake_event_alias" })
}

await lambda.invoke(params).promise()
Inside my handler script, I'm creating a lambda alias, and my issue is that this call does not stay inside my container; it's not mocked:
exports.handler = async (event) => {
  const params = {
    Description: 'alias name',
    FunctionName: event.eventTypeAlias,
    FunctionVersion: '007',
    Name: event.eventTypeAliasName
  }
  console.log(`Creating alias with params=${JSON.stringify(params)}`)
  return await lambdaAwsClient.createAlias(params).promise()
}
The error below is thrown:
UnrecognizedClientException: The security token included in the request is invalid.
How can I keep the createAlias call inside the container?
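Not a confirmed fix, but the UnrecognizedClientException usually means the SDK call left the container and hit real AWS, because the Lambda client has no endpoint override. A sketch of constructing the client against the LocalStack edge instead (LOCALSTACK_HOSTNAME comes from the docker-compose file above; the localhost fallback and port 4566 are assumptions):

// Sketch of the client construction only.
import { Lambda } from 'aws-sdk';

const lambdaAwsClient = new Lambda({
  region: 'eu-central-1',
  endpoint: `http://${process.env.LOCALSTACK_HOSTNAME || 'localhost'}:4566`,
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,       // dummy values from serverless.yml
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
});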
