Deploy Lambda functions without binaries - node.js

I have a problem with serverless deploy: when I deploy my Lambda function, the Serverless Framework starts packing my node_modules, which takes a lot of time.
Why upload node_modules again if it hasn't been updated? Does anybody know how to deploy only the Lambda function code, without packing the binaries?

You need to add packaging configuration.
In your serverless.yml file, add:
package:
  exclude:
    - node_modules/**
It is useful to exclude the AWS SDK modules (if you don't upload them, Lambda will use the version AWS provides, which is better) and the dev modules (like testing frameworks). However, all other modules are runtime dependencies and must be uploaded for your function to work properly. So, configure the package settings to include/exclude exactly what you need.
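As a rough sketch (the tests and coverage folder names are placeholders; adjust the patterns to your project layout), a more complete package section along those lines might look like this:

package:
  exclude:
    - node_modules/aws-sdk/**   # already provided by the Lambda Node.js runtime
    - node_modules/.bin/**      # npm's executable stubs, not needed at runtime
    - tests/**                  # dev-only code
    - coverage/**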
Regarding your other question
why upload node_modules again if it hasn't been updated
It is not a limitation of the Serverless Framework; it is a limitation of the AWS Lambda service. You can't make a partial upload of a Lambda function. Lambda always requires that the uploaded zip package contain the updated code and all required module dependencies.
If your deploy is taking too long, maybe you should consider breaking this Lambda function into smaller units.

Related

AWS Serverless: minify/uglify the sam build lambda output

I have a few AWS Lambda functions behind API Gateway. I'm using the serverless approach, packaging and deploying the app with the SAM CLI. It outputs the separately built functions inside the .aws-sam directory in the project root. I'd like to minify & uglify that source code before the package is actually uploaded to the S3 bucket for deployment. I'm referring to the SAM CLI docs, but nothing related to customized packaging or using bundlers is mentioned. Is there a workaround to bundle the source code with minification/uglification?
You can create a custom build runtime, described here:
https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/building-custom-runtimes.html
or (if you only want minification) you can use esbuild:
https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-using-build-typescript.html
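As a minimal sketch of the esbuild route (the resource name, handler, and entry point below are placeholders), you tell sam build to bundle and minify a function through its Metadata section in template.yaml:

Resources:
  MyFunction:                      # hypothetical resource name
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/
      Handler: app.handler
      Runtime: nodejs18.x
    Metadata:
      BuildMethod: esbuild         # have sam build run esbuild on this function
      BuildProperties:
        Minify: true
        Target: es2020
        EntryPoints:
          - app.js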

How do I use shared code in lambdas in an AWS SAM template using layers in Node.js?

We have a very simple use case--we want to share code with all of our lambdas and we don't want to use webpack.
We can't put relative paths in our package.json files in the lambda folders because when you run sam build twice, it DELETES the shared code, and I have no idea why.
Answer requirements:
Be able to debug locally
Be able to run unit tests on business logic (without them having to be run in an AWS sandbox)
Be able to run tests in sam local start-api
Be able to debug the code in the container via sam local invoke
sam build works
sam deploy works
Runs in AWS Lambda in the cloud
TL;DR
Put your shared code in a layer
When referencing shared code in the lambda layer, use a ternary operator inside require(). Check an environment variable that is only set when running in the AWS environment. In this case, we added a short AWS variable in the SAM template (see the sketch after the code below); you could instead use one of the environment variables AWS defines automatically, but they are not as short. This lets you debug locally outside of the AWS stack, allowing very fast unit tests of the business logic.
// In AWS (and in sam local's container) the layer is mounted at /opt;
// locally, fall back to the relative path into the layer source.
let math = require(process.env.AWS ? '/opt/nodejs/common' : '../../layers/layer1/nodejs/common');
let tuc = require(process.env.AWS ? 'temp-units-conv' : '../../layers/layer1/nodejs/node_modules/temp-units-conv');
You shouldn't need the ternary operator like that except inside the lambda folder code itself.
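For reference, a hypothetical way to define that AWS flag in template.yaml (the variable name and value are just the convention described above):

Globals:
  Function:
    Environment:
      Variables:
        AWS: "1"   # any non-empty string; absent when running plain node/mocha locally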
Here's a working example that we thought we'd post so that others will have a much easier time of it than we did.
It is our opinion that AWS should make this much easier.
https://github.com/blmille1/aws-sam-layers-template.git
Gotchas
The following gotcha has been avoided in this solution. I mention it because it looked like a straightforward solution, and it took a lot of time before I finally abandoned it.
It is very tempting to add a folder reference in the lambda function's package.json.
//...
"dependencies": {
  "common": "file:../../layers/layer1/nodejs/common"
},
//...
If you do that, it will work the first time you run sam build. However, the second time you run sam build, your shared code folder and all of its subdirectories will be DELETED. This is because sam build creates an .aws-sam folder; if that folder already exists, it performs an npm cleanup, and I think that is what triggers the deletion of the shared code.

Online-Edit Amazon Lambda function with alexa-sdk

I am creating an Alexa skill and I am using AWS Lambda to handle the intents. I found several tutorials online and decided to use Node.js with the alexa-sdk. After installing the alexa-sdk with npm, the zipped archive occupied ~6MB of disk space. If I upload it to Amazon, it tells me:
The deployment package of your Lambda function "..." is too large to enable inline code editing. However, you can still invoke your function right now.
My index.js has a size of < 4KB, but the dependencies are large. If I want to change something, I have to zip everything together (index.js and the "node_modules" folder with the dependencies), upload it to Amazon and wait until it's processed, because online editing isn't available anymore. So every single change to index.js wastes > 1 minute of my time zipping and uploading. Is there a way to use the alexa-sdk dependency (and other dependencies) without uploading the same code every time I change something? Is there a way to use the online editing function even though I am using large dependencies? I just want to edit index.js.
If the size of your Lambda function's zipped deployment package exceeds 3MB, you will not be able to use the inline code editing feature in the Lambda console. You can still use the console to invoke your Lambda function.
It's mentioned here under AWS Lambda Deployment Limits.
ASK-CLI
The ASK Command Line Interface lets you manage your Alexa skills and related AWS Lambda functions from your local machine. Once you set it up, you can make the necessary changes in your Lambda code or skill and use the deploy command to deploy the skill. The optional target lets you deploy only the associated Lambda code.
ask deploy [--no-wait] [-t| --target <target>] [--force] [-p| --profile <profile>] [--debug]
More info about ASK CLI here and more about deploy command here
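For example, to redeploy only the Lambda code after editing index.js (using the --target option from the synopsis above):

ask deploy --target lambda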

Can lambda pull in npm dependencies on the fly

I want to build potentially hundreds of different node projects using lambda.
Is it possible for Lambda to perform an npm install to download all the node modules, or do I have to send Lambda all my dependencies along with my code, i.e. the node_modules folder?
This is not a feature that Lambda offers. The documentation for Lambda says that you need to pack everything your Lambda function needs into the zip file you upload. That means all source code, including node_modules.
However, having Node.js fetch code and run it at runtime is possible. You can make it work by using an HTTP client (request, axios, http) to pull the code and then combining that with require to load it into the process.
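A minimal sketch of that idea (the URL and the doWork export are placeholders; /tmp is the only writable path in the Lambda environment):

const https = require('https');
const fs = require('fs');
const path = require('path');

// Download a single JS file to a writable location.
function download(url, dest) {
  return new Promise((resolve, reject) => {
    const file = fs.createWriteStream(dest);
    https.get(url, (res) => {
      res.pipe(file);
      file.on('finish', () => file.close(resolve));
    }).on('error', reject);
  });
}

exports.handler = async () => {
  const dest = path.join('/tmp', 'my-module.js');
  await download('https://example.com/my-module.js', dest); // hypothetical URL
  const mod = require(dest); // load the fetched code into the process
  return mod.doWork();       // hypothetical export
};

Note this only works for self-contained files; pulling in a whole dependency tree this way would mean re-implementing much of what npm does.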

Serverless Node.js Project Structure

I am building a RESTful API with the Serverless Framework to run on AWS (API Gateway + Lambda + DynamoDB). It's my first Node project and my first serverless production project, and I have no idea how to structure it. So far I have the following:
|--Functions
|-----Function1
|--------InternalModule
|-----Function2
|-----Function3
|--------InternalModule
|-----Function4
|--Shared
|-----Module1
|-----Module2
|-----Module3
|--Tests
|-----Functions
|--------Function1
|-----------InternalModule
|--------Function2
|-----------InternalModule
|--------Function3
|-----------InternalModule
|--------Function4
|-----------InternalModule
|-----Modules
|--------Module1
|-----------InternalModule
|--------Module2
|-----------InternalModule
|--------Module3
|-----------InternalModule
I keep my API endpoints (Lambda handlers) in Functions. Some of them have internal modules that only they use, and some use modules from Shared. I want unit tests for all my modules, inner and shared, as well as API tests on the lambda functions. I am using mocha and chai and want to integrate everything into a pipeline which, on a git push, runs the linters and tests and, if they succeed, deploys the API to the appropriate stage. The problem is that in order to test each module I have to have chai as a local node module in every folder with a test file, and I have to reference the modules under test by relative path. In most cases that looks really ugly because of the nesting. If I want to test an internal module from
Tests/Functions/Function1/InternalModule
and I require it at the top of the test like so
require('../../../../Functions/Function1/InternalModule')
plus I have to install chai in every folder so it's reachable. The same goes for mocha and all the other dependencies needed for the tests, and I haven't even mentioned configuration. The main idea I am now considering is whether or not I should bring all modules into a folder called Modules and require them where needed - worst case
from Functions/Function1
require('../../Modules/Module1')
I could also keep the test files in each module's folder and run them there, but that would require the assertion library to be installed in every folder. I've read about npm link and symlinks, but I want to keep track of which dependencies each folder has so I can install them in the CI environment after the clean project is downloaded from GitHub, where I can't create links (or have I got the whole concept wrong?).
If anyone can suggest a better solution I would highly appreciate it!
It turns out the way Node resolves require is much more involved than I thought!
First, Node.js looks to see if the given module is a core module - Node.js comes with many modules compiled directly into the executable binary (ex. http, fs, sys, events, path, etc.). These core modules will always take precedence in the loading algorithm.
If the given module is not a core module, Node.js will then begin to search for a directory named, "node_modules". It will start in the current directory (relative to the currently-executing Javascript file in Node) and then work its way up the folder hierarchy, checking each level for a node_modules folder.
read more here
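You can see this lookup order for yourself with the standard require.resolve.paths API:

// Prints the node_modules directories Node would search for 'chai',
// from the current directory up to the filesystem root.
console.log(require.resolve.paths('chai'));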
I will try putting all modules in a separate Modules folder, each in its own subfolder prefixed with FunctionName_ so I know where each module is used, together with its test file and package.json. Then, if I need a module, I can require it from the functions with shallow nesting, which doesn't look so bad:
from Functions/Function1
require('module-1');
with package.json
"dependencies":{
"module-1":"file:../../Modules/Function1_Module1"
}
and have a separate Shared folder where I keep the shared modules.
I am still open to better ideas!
