AWS SAM fails to build layer - node.js

I have read that SAM now supports building layers and followed the directions mentioned here. However, I am getting a build error when I try to build the layer locally with sam build samDeployLayer:
Build Failed
Error: NodejsNpmBuilder:NpmPack - NPM Failed: npm ERR! code ENOLOCAL
npm ERR! Could not install from "E:\Development\sam-deploy\src\sam-deploy-layer" as it does not contain a package.json file.
Here is my template file:
AWSTemplateFormatVersion: 2010-09-09
Description: >-
  sam-deploy
Transform:
  - AWS::Serverless-2016-10-31
Resources:
  samDeploy:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/sam-deploy
      Handler: index.handler
      Runtime: nodejs12.x
      MemorySize: 128
      Timeout: 100
      Description: A Lambda function that returns a static string.
      FunctionName: "sam-deploy"
      Layers:
        - !Ref samDeployLayer
      Policies:
        # Give Lambda basic execution Permission to the helloFromLambda
        - AWSLambdaBasicExecutionRole
  samDeployLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      LayerName: sam-deploy-layer
      ContentUri: src/sam-deploy-layer
      CompatibleRuntimes:
        - nodejs12.x
    Metadata:
      BuildMethod: nodejs12.x
I have included both the CompatibleRuntimes and Metadata properties as per the requirements for building layers. package.json for this layer is located at src/sam-deploy-layer/nodejs, as required for Node.js runtimes, and I am using SAM CLI version 0.53.0. What am I doing wrong?

Place package.json under src/sam-deploy-layer/.
After running sam build, the directory .aws-sam/build/samDeployLayer/nodejs/node_modules/ should be created.

You should create a folder named nodejs in src/sam-deploy-layer. There, run npm init and npm install (with the packages you want in your layer).
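If you go the BuildMethod route, sam build expects a package.json at the root of ContentUri (here src/sam-deploy-layer/), not under nodejs/. A minimal sketch, with an illustrative dependency:

```json
{
  "name": "sam-deploy-layer",
  "version": "1.0.0",
  "dependencies": {
    "lodash": "^4.17.21"
  }
}
```

sam build then runs npm install against this file and produces the nodejs/node_modules/ folder in the build output for you.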


AWS lambda can't find lambda layer path

I am having a tough time setting up Lambda layers with my Lambda functions. I am using Node 14.
My folder structure for the Lambda layer:
layer/
  nodejs/
    node14/
      node_modules/
        hey.js
I have also tried having only the nodejs directory as below
layer/
  nodejs/
    hey.js
But in both cases I get a "Cannot find module" error in the Lambda function.
The paths I tried for accessing the layer in the Lambda function are below:
'/opt/nodejs/node14/node_modules/hey.js' (for first folder structure)
'/opt/nodejs/hey.js' (for folder structure with only the nodejs directory in the layer)
'hey.js' (trying to access the file directly)
But I had no luck. What am I doing wrong?
I am using AWS SAM to deploy the Lambda functions and layers. I can see the layer attached to the Lambda function in the console.
Here is my SAM template
layer1:
  Type: AWS::Serverless::LayerVersion
  Properties:
    ContentUri: ./src/lambda-layers/layer1
    CompatibleRuntimes:
      - nodejs14.x
  Metadata:
    BuildMethod: nodejs14.x
lambda1:
  Type: 'AWS::Serverless::Function'
  Properties:
    CodeUri: ./src/lambdas/lambda1
    Handler: index.handler
    Role: !GetAtt LambdaExecutionRole.Arn
    Layers:
      - !Ref layer1
    Events:
      AwsIoTMetadata:
        Type: Api
        Properties:
          RestApiId: !Ref CrApi
          Path: /user
          Method: GET
How do I access the layer in the Lambda function?
Please help. Thanks in advance.

Problem with fs-extra while deploying python using serverless

I'm not much of an expert with npm and Bitbucket Pipelines, but I want to create a pipeline on Bitbucket to deploy my Python (Flask) project to AWS Lambda using Serverless. It deploys fine locally, but when I run it through the Bitbucket pipeline, this happens:
Error: Cannot find module '/opt/atlassian/pipelines/agent/build/node_modules/fs-extra/lib/index.js'. Please verify that the package.json has a valid "main" entry
Here is my code:
bitbucket-pipelines.yml
image: node:14.13.1-alpine3.10
pipelines:
  branches:
    master:
      - step:
          caches:
            - node
          script:
            - apk add python3
            - npm install
            - npm install -g serverless
            - serverless config credentials --stage dev --provider aws --key ${AWS_DEV_LAMBDA_KEY} --secret ${AWS_DEV_LAMBDA_SECRET}
            - serverless deploy --stage dev
serverless.yml
service: serverless-flask
plugins:
  - serverless-python-requirements
  - serverless-wsgi
custom:
  wsgi:
    app: app.app
    packRequirements: false
  pythonRequirements:
    dockerizePip: non-linux
provider:
  name: aws
  runtime: python3.8
  stage: dev
  region: us-west-2
functions:
  app:
    handler: wsgi.handler
    events:
      - http: ANY /
      - http: 'ANY {proxy+}'
  alert:
    handler: alerts.run
    events:
      - schedule: rate(1 day)
package:
  exclude:
    - .venv/**
    - venv/**
    - node_modules/**
    - bitbucket-pipelines.yml
How can I fix this?
What helped me in the same situation was:
Delete the node_modules folder
Run npm install inside the service folder
Run serverless deploy
I had the same issue and resolved it by (re)installing fs-extra:
npm install fs-extra

FUNCTION_ERROR_INIT_FAILURE AWS lambda

I recently added the cool Lambda feature, provisioned concurrency.
After a few successful deployments, I now face this issue:
Serverless Error ---------------------------------------
ServerlessError: An error occurred:
GraphqlPrivateProvConcLambdaAlias - Provisioned Concurrency
configuration failed to be applied. Reason:
FUNCTION_ERROR_INIT_FAILURE.
at C:\Users\theod\AppData\Roaming\npm\node_modules\serverless\lib\plugins\aws\lib\monitorStack.js:125:33
From previous event:
at AwsDeploy.monitorStack (C:\Users\theod\AppData\Roaming\npm\node_modules\serverless\lib\plugins\aws\lib\monitorStack.js:28:12)
at C:\Users\theod\AppData\Roaming\npm\node_modules\serverless\lib\plugins\aws\lib\updateStack.js:107:28
From previous event:
at AwsDeploy.update
Here's my sample serverless.yml file:
service: backend-api
parameters:
  region: ap-southeast-2
  path: &path /
provider:
  name: aws
  runtime: nodejs12.x
  stage: ${env:STAGE, 'staging'}
  region: ap-southeast-2
  versionFunctions: true
plugins:
  - serverless-webpack
  - serverless-pseudo-parameters
  - serverless-prune-plugin
  # - serverless-offline-scheduler
  - serverless-offline
functions:
  # GRAPHQL APIs
  graphqlPrivate:
    handler: src/graphql/private/index.handler
    memorySize: 256
    timeout: 30
    name: ${self:service}-gqlPrivate-${self:provider.stage}
    vpc: ${file(./serverless/vpc.yml)}
    events:
      - http:
          path: /graphql/private
          method: ANY
          cors: true
          authorizer:
            arn: arn:aws:cognito-idp:#{AWS::Region}:#{AWS::AccountId}:userpool/${self:custom.cognitoArns.private.${self:provider.stage}}
    provisionedConcurrency: 10
package:
  individually: true
custom:
  webpack:
    keepOutputDirectory: true
    serializedCompile: true
    webpackConfig: 'webpack.config.js'
    packager: 'npm'
  stage: ${opt:stage, self:provider.stage}
  prune:
    automatic: true
    number: 1
Is anybody able to resolve this issue?
Your Environment Information ---------------------------
Operating System: win32
Node Version: 12.11.0
Framework Version: 1.61.3
Plugin Version: 3.2.7
SDK Version: 2.3.0
Components Core Version: 1.1.2
Components CLI Version: 1.4.0
FUNCTION_ERROR_INIT_FAILURE plainly means there's something wrong with the function's handler/code being deployed, which is why the provisioned Lambdas can't start up/initialize.
The way to resolve this is to test without the provisioned concurrency option first.
Once you are able to push your Lambda, any errors will flow into your CloudWatch logs.
The best way, though, is to test your Lambda locally (using the serverless-offline plugin or serverless invoke) to confirm it works properly.
You can also package your app and invoke it with the Serverless CLI to detect packaging issues.
In my case, there was a runtime error where my code bundle was looking for a require that was not part of the bundle.
This is undocumented for AWS Lambda as of now (Jan 29, 2020).

Lambda Layer with SAM CLI

I created a Lambda layer with Serverless successfully, but now I have to do the same with SAM CLI and I can't.
With Serverless I only used two files:
serverless.yml
awswrangler-layer-0.0.23-py3.7.zip
serverless.yml content below:
service: MyService
provider:
  name: aws
layers:
  awswrangler:
    package:
      artifact: awswrangler-layer-0.0.23-py3.7.zip
How can I do the same with SAM CLI? Please give an example of the template.yaml
I had to unzip the archive first; this was the solution:
AwswranglerLayer:
  Type: AWS::Serverless::LayerVersion
  Properties:
    LayerName: !Join ['-', [!Ref Project, !Ref Environment, 'AwswranglerLayer']]
    ContentUri: ../layers/awswrangler/
    Description: "ETL and wrangling utility belt to handle data on AWS. Pandas, PySpark"
    CompatibleRuntimes:
      - python3.7
      - python3.8
    RetentionPolicy: Retain
AwswranglerLayer:
  Type: AWS::Serverless::LayerVersion
  Properties:
    LayerName: !Sub '${EnvironmentKey}-AwswranglerLayer'
    CompatibleRuntimes:
      - nodejs12.x
    ContentUri: AwswranglerLayerPath/

Serverless not including my node_modules

I have a Node.js Serverless project with this structure:
-node_modules
-package.json
-serverless.yml
-functions
  -medium
    -mediumHandler.js
my serverless.yml:
service: googleAnalytic
provider:
  name: aws
  runtime: nodejs6.10
  stage: dev
  region: us-east-1
package:
  include:
    - node_modules/**
functions:
  mediumHandler:
    handler: functions/medium/mediumHandler.mediumHandler
    events:
      - schedule:
          name: MediumSourceData
          description: 'Captures data between set dates'
          rate: rate(2 minutes)
      - cloudwatchEvent:
          event:
            source:
              - "Lambda"
            detail-type:
              - ""
      - cloudwatchLog: '/aws/lambda/mediumHandler'
My sls info shows:
Service Information
service: googleAnalytic
stage: dev
region: us-east-1
stack: googleAnalytic-dev
api keys:
  None
endpoints:
  None
functions:
  mediumHandler: googleAnalytic-dev-mediumHandler
When I run:
serverless invoke local -f mediumHandler
it works, and my script, which uses googleapis and aws-sdk, runs fine. But when I deploy, those modules are skipped and no error is shown.
When debugging Serverless's packaging process, use sls package (or sls deploy --noDeploy for old versions). You'll get a .serverless directory you can inspect to see what's inside the deployment package.
From there, you can see whether node_modules is included and adjust your serverless.yml accordingly, without needing to deploy after every change.
Serverless will exclude development packages by default. Check your package.json and ensure the packages you require are listed in the dependencies object, as devDependencies are excluded.
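For example (package names and versions are illustrative), runtime packages must sit under dependencies; anything under devDependencies is stripped from the deployment package:

```json
{
  "dependencies": {
    "googleapis": "^39.2.0",
    "aws-sdk": "^2.580.0"
  },
  "devDependencies": {
    "serverless": "^1.61.0"
  }
}
```

Note that aws-sdk is also preinstalled in the Node.js Lambda runtime, so it can usually live in devDependencies to keep the package small.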
I was dumb enough to put this in my serverless.yml, which caused the same issue you're facing:
package:
  patterns:
    - '!node_modules/**'
