Serverless is unable to find the handler in nestjs - node.js

I am trying to deploy my NestJS project to AWS Lambda with the Serverless Framework, following the official documentation.
I created a file serverless.ts in the src directory, alongside the main.ts file.
When I run sls offline, it starts successfully and gives the following output:
Running "serverless" from node_modules
Starting Offline at stage dev (us-east-1)
Offline [http for lambda] listening on http://localhost:3002
Function names exposed for local invocation by aws-sdk:
* main: projectName-dev-main
ANY  | http://localhost:4000/
POST | http://localhost:4000/2015-03-31/functions/main/invocations
ANY  | http://localhost:4000/{proxy*}
POST | http://localhost:4000/2015-03-31/functions/main/invocations
Server ready: http://localhost:4000 🚀
But when I open the localhost URL http://localhost:4000, I get a 502 Bad Gateway and the following error:
ANY / (λ: main)
✖ Unhandled exception in handler 'main'.
✖ Error: Cannot find module 'main'
Require stack:
- /home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/in-process-runner/aws-lambda-ric/UserFunction.js
✖ Runtime.ImportModuleError: Error: Cannot find module 'main'
Require stack:
- /home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/in-process-runner/aws-lambda-ric/UserFunction.js
at _loadUserApp (/home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/in-process-runner/aws-lambda-ric/UserFunction.js:310:15)
at async module.exports.load (/home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/in-process-runner/aws-lambda-ric/UserFunction.js:341:21)
at async InProcessRunner.run (file:///home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/in-process-runner/InProcessRunner.js:41:21)
at async MessagePort.<anonymous> (file:///home/my-PC/Desktop/projectName/node_modules/serverless-offline/src/lambda/handler-runner/worker-thread-runner/workerThreadHelper.js:25:14)
Here is the code:
main.ts
import { NestFactory } from '@nestjs/core';
import { ValidationPipe } from '@nestjs/common';
import { AppModule } from './app.module';
async function bootstrap() {
process.env.TZ = 'Asia/Calcutta';
const app = await NestFactory.create(AppModule);
const port = process.env.PORT || 3000;
app.useGlobalPipes(new ValidationPipe({ whitelist: true }));
await app.listen(port);
}
bootstrap();
serverless.ts
import { NestFactory } from '@nestjs/core';
import serverlessExpress from '@vendia/serverless-express';
import { Callback, Context, Handler } from 'aws-lambda';
import { AppModule } from './app.module';
let server: Handler;
async function bootstrap(): Promise<Handler> {
const app = await NestFactory.create(AppModule);
await app.init();
const expressApp = app.getHttpAdapter().getInstance();
return serverlessExpress({ app: expressApp });
}
export const handler: Handler = async (
event: any,
context: Context,
callback: Callback,
) => {
server = server ?? (await bootstrap());
return server(event, context, callback);
};
serverless.yaml
service: myProject
useDotenv: true
plugins:
- serverless-offline
# - serverless-plugin-optimize
provider:
name: aws
runtime: nodejs16.x
region: us-east-1
profile: default
memorySize: 128
stage: dev
environment:
TZ: ${env:TZ}
functions:
main:
handler: dist/main.handler
events:
- http:
method: ANY
path: /
- http:
method: ANY
path: '{proxy+}'
custom:
serverless-offline:
noPrependStageInUrl: true
httpPort: 4000
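A likely cause of the Cannot find module 'main' error above is a handler path that does not match the compiled output: the handler export lives in serverless.ts, which nest build would emit as dist/serverless.js, not dist/main.js. A hedged sketch of the corrected function entry, assuming the default dist output directory:

```yaml
functions:
  main:
    # The `handler` export is defined in serverless.ts, which compiles
    # to dist/serverless.js, so the path should point there:
    handler: dist/serverless.handler
```

If the path is already correct for your build setup, it is also worth checking that nest build actually ran before sls offline, so that the dist/ directory exists.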

Related

Serverless Cannot Run Simple Example Query Using Node.JS

I am trying to run a simple query locally in Node.js using Serverless, with the eventual goal of deploying an Apollo Server API to AWS Lambda.
However, I cannot get anywhere near the deployment step, because Node is unable to run a single instance of Apollo Server/Serverless locally due to the errors explained below.
Steps I have taken:
git clone the example API and follow all instructions here: https://github.com/fullstack-hy2020/rate-repository-api (I ensured everything works perfectly)
Follow all instructions on Apollographql up to "Running Server Locally": https://www.apollographql.com/docs/apollo-server/deployment/lambda/ - then run following command: serverless invoke local -f graphql -p query.json
ERROR - cannot use import statement outside module .... Solution - add "type": "module" to package.json - run command: serverless invoke local -f graphql -p query.json
ERROR - Cannot find module 'C:\Users\Julius\Documents\Web Development\rate-repository-api\src\utils\authService' imported from C:\Users\Julius\Documents\Web Development\rate-repository-api\src\apolloServer.js... Solution - install webpack as per solution here: Serverless does not recognise subdirectories in Node then run serverless invoke local -f graphql -p query.json
ERROR - Error [ERR_MODULE_NOT_FOUND]: Cannot find module 'C:\Users\Julius\Documents\Web Development\rate-repository-api\src\utils\authService' imported from C:\Users\Julius\Documents\Web Development\rate-repository-api\src\apolloServer.js
I do not know how to proceed from here, I am hoping that someone can point me in the right direction.
File Structure:
apolloServer.js:
import { ApolloServer, toApolloError, ApolloError } from '@apollo/server';
import { ValidationError } from 'yup';
import { startServerAndCreateLambdaHandler } from '@as-integrations/aws-lambda';
import AuthService from './utils/authService';
import createDataLoaders from './utils/createDataLoaders';
import logger from './utils/logger';
import { resolvers, typeDefs } from './graphql/schema';
const apolloErrorFormatter = (error) => {
logger.error(error);
const { originalError } = error;
const isGraphQLError = !(originalError instanceof Error);
let normalizedError = new ApolloError(
'Something went wrong',
'INTERNAL_SERVER_ERROR',
);
if (originalError instanceof ValidationError) {
normalizedError = toApolloError(error, 'BAD_USER_INPUT');
} else if (error.originalError instanceof ApolloError || isGraphQLError) {
normalizedError = error;
}
return normalizedError;
};
const createApolloServer = () => {
return new ApolloServer({
resolvers,
typeDefs,
formatError: apolloErrorFormatter,
context: ({ req }) => {
const authorization = req.headers.authorization;
const accessToken = authorization
? authorization.split(' ')[1]
: undefined;
const dataLoaders = createDataLoaders();
return {
authService: new AuthService({
accessToken,
dataLoaders,
}),
dataLoaders,
};
},
});
};
export const graphqlHandler = startServerAndCreateLambdaHandler(createApolloServer());
export default createApolloServer;
Serverless.yml:
service: apollo-lambda
provider:
name: aws
runtime: nodejs16.x
httpApi:
cors: true
functions:
graphql:
# Make sure your file path is correct!
# (e.g., if your file is in the root folder use server.graphqlHandler )
# The format is: <FILENAME>.<HANDLER>
handler: src/apolloServer.graphqlHandler
events:
- httpApi:
path: /
method: POST
- httpApi:
path: /
method: GET
custom:
webpack:
packager: 'npm'
webpackConfig: 'webpack.config.js' # Name of webpack configuration file
includeModules:
forceInclude:
- pg
Webpack.config.js
const path = require('path');
module.exports = {
mode: 'development',
entry: './src/index.js',
output: {
path: path.resolve(__dirname, 'build'),
filename: 'foo.bundle.js',
},
};

Use Redis with AWS SAM (Redis Client Error)

What I'm trying to do is query the Redis service every time a request is made. The problem is that even with a basic configuration it doesn't work. The error is the following:
INFO Redis Client Error Error: connect ECONNREFUSED 127.0.0.1:6379 at TCPConnectWrap.afterConnect [as oncomplete] (node:net) { port: 6379, address: '127.0.0.1' }
As usual, I have redis-server running with its corresponding credentials, listening on 127.0.0.1:6379. I know that AWS SAM runs inside a container, and the issue is probably due to network configuration, but the only relevant option the AWS SAM CLI gives me is --host. How could I fix this?
My code is the following, although it is probably not very relevant:
import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';
import { createClient } from 'redis';
import processData from './src/lambda-data-dictionary-read/core/service/controllers/processData';
export async function lambdaHandler(event: APIGatewayProxyEvent): Promise<APIGatewayProxyResult> {
const body: any = await processData(event.queryStringParameters);
const url = process.env.REDIS_URL || 'redis://127.0.0.1:6379';
const client = createClient({
url,
});
client.on('error', (err) => console.log('Redis Client Error', err));
await client.connect();
await client.set('key', 'value');
const value = await client.get('key');
console.log('----', value, '----');
const response: APIGatewayProxyResult = {
statusCode: 200,
body,
};
if (body.error) {
return {
statusCode: 404,
body,
};
}
return response;
}
My template.yaml:
Transform: AWS::Serverless-2016-10-31
Description: >
lambda-data-dictionary-read
Sample SAM Template for lambda-data-dictionary-read
Globals:
Function:
Timeout: 0
Resources:
IndexFunction:
Type: AWS::Serverless::Function
Properties:
CodeUri: app/
Handler: index.lambdaHandler
Runtime: nodejs16.x
Timeout: 10
Architectures:
- x86_64
Environment:
Variables:
ENV: !Ref develope
REDIS_URL: !Ref redis://127.0.0.1:6379
Events:
Index:
Type: Api
Properties:
Path: /api/lambda-data-dictionary-read
Method: get
Metadata:
BuildMethod: esbuild
BuildProperties:
Minify: true
Target: 'es2020'
Sourcemap: true
UseNpmCi: true
I'm using:
"scripts": {
"dev": "sam build --cached --beta-features && sam local start-api --port 8080 --host 127.0.0.1"
}
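A note on why this fails: sam local start-api runs each function in its own Docker container, so 127.0.0.1 inside the handler refers to that container, not to the machine running redis-server. One hedged way around this, assuming Docker Desktop where host.docker.internal resolves to the host machine, is to point REDIS_URL at the host:

```yaml
# template.yaml (fragment): reach the host machine from the Lambda container
Environment:
  Variables:
    REDIS_URL: redis://host.docker.internal:6379
```

Alternatively, run Redis as a container on a shared Docker network (e.g. docker network create local-dev) and start SAM with sam local start-api --docker-network local-dev, so the Redis container's name becomes its hostname from inside the function.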

Error: getaddrinfo EAI_AGAIN database at GetAddrInfoReqWrap.onlookup [as oncomplete]

I'm creating an api using docker, postgresql, and nodejs (typescript). I've had this error ever since creating an admin user and nothing seems to be able to fix it:
Error in docker terminal:
Error: getaddrinfo EAI_AGAIN database
at GetAddrInfoReqWrap.onlookup [as oncomplete] (node:dns:72:26)
[ERROR] 12:33:32 Error: getaddrinfo EAI_AGAIN database
Error in Insomnia:
{
"status": "error",
"message": "Internal server error - Cannot inject the dependency \"categoriesRepository\" at position #0 of \"ListCategoriesUseCase\" constructor. Reason:\n No repository for \"Category\" was found. Looks like this entity is not registered in current \"default\" connection?"
}
I'm following an online course and this same code seems to work for everyone else since there are no reports of this error in the course's forum.
From what I've gathered it seems to be some type of problem when connecting to my database, which is a docker container. Here is my docker-compose.yml file:
version: "3.9"
services:
database_ignite:
image: postgres
container_name: database_ignite
restart: always
ports:
- 5432:5432
environment:
- POSTGRES_USER=something
- POSTGRES_PASSWORD=something
- POSTGRES_DB=rentx
volumes:
- pgdata:/data/postgres
app:
build: .
container_name: rentx
restart: always
ports:
- 3333:3333
- 9229:9229
volumes:
- .:/usr/app
links:
- database_ignite
depends_on:
- database_ignite
volumes:
pgdata:
driver: local
My server.ts file:
import "reflect-metadata";
import express, { Request, Response, NextFunction } from "express";
import "express-async-errors";
import swaggerUi from "swagger-ui-express";
import { AppError } from "@shared/errors/AppError";
import createConnection from "@shared/infra/typeorm";
import swaggerFile from "../../../swagger.json";
import { router } from "./routes";
import "@shared/container";
createConnection();
const app = express();
app.use(express.json());
app.use("/api-docs", swaggerUi.serve, swaggerUi.setup(swaggerFile));
app.use(router);
app.use(
(err: Error, request: Request, response: Response, next: NextFunction) => {
if (err instanceof AppError) {
return response.status(err.statusCode).json({
message: err.message,
});
}
return response.status(500).json({
status: "error",
message: `Internal server error - ${err.message}`,
});
}
);
app.listen(3333, () => console.log("Server running"));
And this is my index.ts file, inside src>modules>shared>infra>http>server.ts:
import { Connection, createConnection, getConnectionOptions } from "typeorm";
export default async (host = "database"): Promise<Connection> => {
const defaultOptions = await getConnectionOptions();
return createConnection(
Object.assign(defaultOptions, {
host,
})
);
};
I've tried restarting my containers, remaking them, accessing my postgres container and checking the tables, I've switched every "database" to "localhost" but it's the same every time: the containers run, but the error persists. I've checked the course's repo and my code matches. I've flushed my DNS and that also did nothing.
Here's the admin.ts file that "started it all":
import { hash } from "bcryptjs";
import { v4 as uuidv4 } from "uuid";
import createConnection from "../index";
async function create() {
const connection = await createConnection("localhost");
const id = uuidv4();
const password = await hash("admin", 6);
await connection.query(`
INSERT INTO Users (id, name, email, password, "isAdmin", driver_license, created_at)
VALUES (
'${id}',
'admin',
'admin@rentx.com.br',
'${password}',
true,
'0123456789',
NOW()
)
`);
await connection.close;
}
create().then(() => console.log("Administrative user created"));
I would love to know what is causing this error.
It looks like you have a service named database_ignite in your docker-compose.yml file. Docker Compose creates a DNS entry for each service using the service's name, so try changing the host in your index.ts file from database to database_ignite:
import { Connection, createConnection, getConnectionOptions } from "typeorm";
export default async (host = "database_ignite"): Promise<Connection> => {
// Changed database to ^^ database_ignite ^^
const defaultOptions = await getConnectionOptions();
return createConnection(
Object.assign(defaultOptions, {
host,
})
);
};
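Since admin.ts connects from the host machine ("localhost") while the app container must use the compose service name, it can also help to resolve the host from an environment variable instead of hard-coding it. A small sketch; DB_HOST is a hypothetical variable name, not part of the course material:

```javascript
// Pick the database host from the environment, falling back to the
// docker-compose service name used inside the compose network.
function resolveDbHost(env = process.env) {
  return env.DB_HOST || "database_ignite";
}

console.log(resolveDbHost({ DB_HOST: "localhost" })); // "localhost" (host machine)
console.log(resolveDbHost({}));                       // "database_ignite" (inside compose)
```

The default export in index.ts could then pass resolveDbHost() to createConnection, with docker-compose.yml setting DB_HOST for the app service.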

How to deploy NestJS services with graphql, serverless lambda in the Localstack?

Solution
I found the solution, thanks for your time. Stack Overflow doesn't let me answer my own question yet. >:(
Description
I would like to know what configuration I need to deploy, as a Lambda on LocalStack, a NestJS application that uses GraphQL rather than a REST API. This is my configuration.
Is this configuration correct?
How can I get the local URL to access the playground through the LocalStack Lambda?
Implementation
Configuring the NestJS Lambda function project
import { ValidationPipe } from '@nestjs/common';
import { NestFactory } from '@nestjs/core';
import { ExpressAdapter } from '@nestjs/platform-express';
import serverlessExpress from '@vendia/serverless-express';
import { APIGatewayProxyHandler, Handler } from 'aws-lambda';
import express from 'express';
import { AppModule } from './app.module';
let cachedServer: Handler;
const bootstrapServer = async (): Promise<Handler> => {
const expressApp = express();
const app = await NestFactory.create(
AppModule,
new ExpressAdapter(expressApp),
);
app.useGlobalPipes(new ValidationPipe());
app.enableCors();
await app.init();
return serverlessExpress({
app: expressApp,
});
};
export const handler: APIGatewayProxyHandler = async (
event,
context,
callback,
) => {
if (!cachedServer) {
cachedServer = await bootstrapServer();
}
return cachedServer(event, context, callback);
};
Serverless.yml Configuration
service: test
provider:
name: aws
runtime: nodejs14.x
stage: ''
# profile: local # Config your AWS Profile
environment: # Service wide environment variables
NODE_ENV: local
GLOBAL_PREFIX: graphql
PORT: 4000
plugins:
# - serverless-plugin-typescript
# - serverless-plugin-optimize
# - serverless-offline
- serverless-localstack
custom:
localstack:
debug: true
stages:
- local
- dev
endpointFile: localstack_endpoints.json
individually: true
# serverless-offline:
# httpPort: 3000
functions:
main:
handler: dis/index.handler
# local:
# handler: dist/main.handler
# events:
# - http:
# path: /
# method: any
# cors: true
Docker Compose Localstack
version: '3.8'
services:
localstack:
image: localstack/localstack:latest
environment:
- AWS_DEFAULT_REGION=us-east-1
- EDGE_PORT=4566
- SERVICES=lambda,s3,cloudformation,sts
ports:
- '4566-4597:4566-4597'
volumes:
- "${TMPDIR:-/tmp}/localstack:/tmp/localstack"
- "/var/run/docker.sock:/var/run/docker.sock"
Services
{
"CloudFormation" : "http://localhost:4566",
"CloudWatch" : "http://localhost:4566",
"Lambda" : "http://localhost:4566",
"S3" : "http://localhost:4566"
}
Commands
serverless deploy --stage local
When executing this command I get this error
serverless info --stage local
When executing this command I get this information
serverless invoke local -f main -l
When executing this command
Not sure, but you probably have a typo here in serverless.yml:
handler: dis/index.handler
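For reference, assuming the build output goes to dist/ and the compiled entry file is index.js, the corrected line would read:

```yaml
functions:
  main:
    handler: dist/index.handler  # was: dis/index.handler
```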

nest js this.instance.use is not a function

I'm studying serverless NestJS with AWS Lambda.
When I run serverless offline and send a request, the following error occurs:
[Nest] 11048 - 2021-07-14 7:02:11 PM [NestFactory] Starting Nest application...
[Nest] 11048 - 2021-07-14 7:02:11 PM [InstanceLoader] AppModule dependencies initialized +16ms
[Nest] 11048 - 2021-07-14 7:02:11 PM [ExceptionHandler] this.instance.use is not a function +3ms
TypeError: this.instance.use is not a function
at ExpressAdapter.use (C:\study\nestjs-practice\serverless-nestjs\node_modules\@nestjs\core\adapters\http-adapter.js:20:30)
at NestApplication.use (C:\study\nestjs-practice\serverless-nestjs\node_modules\@nestjs\core\nest-application.js:140:26)
at C:\study\nestjs-practice\serverless-nestjs\node_modules\@nestjs\core\nest-factory.js:127:40
at Function.run (C:\study\nestjs-practice\serverless-nestjs\node_modules\@nestjs\core\errors\exceptions-zone.js:9:13)
at Proxy.<anonymous> (C:\study\nestjs-practice\serverless-nestjs\node_modules\@nestjs\core\nest-factory.js:126:46)
at Proxy.<anonymous> (C:\study\nestjs-practice\serverless-nestjs\node_modules\@nestjs\core\nest-factory.js:168:54)
at bootstrapServer (C:\study\nestjs-practice\serverless-nestjs\.build\src\lambda.js:16:17)
at processTicksAndRejections (internal/process/task_queues.js:95:5)
at async handler (C:\study\nestjs-practice\serverless-nestjs\.build\src\lambda.js:23:20)
at async InProcessRunner.run (C:\study\nestjs-practice\serverless-nestjs\node_modules\serverless-offline\dist\lambda\handler-runner\in-process-runner\InProcessRunner.js:211:24)
at async LambdaFunction.runHandler (C:\study\nestjs-practice\serverless-nestjs\node_modules\serverless-offline\dist\lambda\LambdaFunction.js:355:20)
at async hapiHandler (C:\study\nestjs-practice\serverless-nestjs\node_modules\serverless-offline\dist\events\http\HttpServer.js:601:18)
at async module.exports.internals.Manager.execute (C:\study\nestjs-practice\serverless-nestjs\node_modules\@hapi\hapi\lib\toolkit.js:45:28)
at async Object.internals.handler (C:\study\nestjs-practice\serverless-nestjs\node_modules\@hapi\hapi\lib\handler.js:46:20)
at async exports.execute (C:\study\nestjs-practice\serverless-nestjs\node_modules\@hapi\hapi\lib\handler.js:31:20)
at async Request._lifecycle (C:\study\nestjs-practice\serverless-nestjs\node_modules\@hapi\hapi\lib\request.js:312:32)
at async Request._execute (C:\study\nestjs-practice\serverless-nestjs\node_modules\#hapi\hapi\lib\request.js:221:9)
How can I fix it?
Except for the lambda.ts and serverless.yml files, the settings are the same as when the NestJS project was created.
I'll post the code; please let me know if you know a solution.
lambda.ts
import { NestFactory } from '@nestjs/core';
import { Handler, Context } from 'aws-lambda';
import { ExpressAdapter } from '@nestjs/platform-express';
import * as express from 'express';
import { Server } from 'http';
import { AppModule } from './app.module';
import { eventContext } from 'aws-serverless-express/middleware';
import { createServer, proxy } from 'aws-serverless-express';
const binaryMimeTypes: string[] = [];
let cachedServer: Server;
async function bootstrapServer(): Promise<Server> {
if (!cachedServer) {
const expressApp = express();
const nestApp = await NestFactory.create(
AppModule,
new ExpressAdapter(express),
);
nestApp.use(eventContext);
await nestApp.init();
cachedServer = createServer(expressApp, undefined, binaryMimeTypes);
}
return cachedServer;
}
export const handler: Handler = async (event: any, context: Context) => {
cachedServer = await bootstrapServer();
return proxy(cachedServer, event, context, 'PROMISE').promise;
};
serverless.yml
service:
name: serverless-nestjs
plugins:
- serverless-plugin-typescript
- serverless-plugin-optimize
- serverless-offline
provider:
name: aws
runtime: nodejs12.x
functions:
main: # The name of the lambda function
# The module 'handler' is exported in the file 'src/lambda'
handler: src/lambda.handler
events:
- http:
method: any
path: /{any+}
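A likely cause, judging from the stack trace: bootstrapServer passes the express module itself to new ExpressAdapter(express) instead of the created app (expressApp), so when Nest later calls this.instance.use(...), there is no use method on the module. A self-contained sketch of the mechanism, using stand-ins rather than the real NestJS classes:

```javascript
// Minimal stand-in (no NestJS needed): the real ExpressAdapter stores what
// it is given as `this.instance` and later calls `this.instance.use(...)`.
class FakeExpressAdapter {
  constructor(instance) {
    this.instance = instance;
  }
  use(...args) {
    // Throws "this.instance.use is not a function" when given the module.
    return this.instance.use(...args);
  }
}

// `expressModule` mimics the `express` export: a factory function.
// Only the object it *returns* (the app) actually has `.use`.
const expressModule = () => ({ use: () => "registered" });
const app = expressModule();

console.log(typeof expressModule.use); // "undefined" -> TypeError in adapter
console.log(typeof app.use);           // "function"  -> works
```

Under this assumption, the fix in lambda.ts is to pass the app instance: new ExpressAdapter(expressApp) instead of new ExpressAdapter(express).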
