Can lambda pull in npm dependencies on the fly - node.js

I want to build potentially hundreds of different node projects using lambda.
Is it possible for Lambda to perform an npm install to download all the node modules, or do I have to send Lambda all my dependencies along with my code, i.e. the node_modules folder?

This is not a feature that Lambda offers. The documentation for Lambda says that you need to pack everything your Lambda function needs into the zip file you later upload. That means all source code, including node_modules.
However, it is possible to have Node.js fetch code and run it at runtime. You can make it work by using an HTTP client (request, axios, http) to pull the code and then combining that with require to load it into the process.
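A minimal sketch of that approach, assuming the module is a single CommonJS file served over a placeholder HTTPS URL (fetchModule and my-module.js are made up for illustration; /tmp is used because it is the only writable path in Lambda):

const https = require('https');
const fs = require('fs');
const path = require('path');

// Download a single-file module and save it under /tmp
function fetchModule(url, filename) {
  return new Promise((resolve, reject) => {
    const dest = path.join('/tmp', filename);
    const file = fs.createWriteStream(dest);
    https.get(url, (res) => {
      res.pipe(file);
      file.on('finish', () => resolve(dest));
    }).on('error', reject);
  });
}

exports.handler = async () => {
  const modulePath = await fetchModule('https://example.com/my-module.js', 'my-module.js');
  const myModule = require(modulePath); // load the downloaded code into the process
  return myModule.doSomething();
};

Keep in mind this trades cold-start time and reliability for a smaller package, so bundling node_modules into the zip is usually still the better option.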

Related

Should I have two package.jsons for a node.js website using Firebase?

I'm building a simple website using Node.js and Express.js with a typical root-level package.json to handle my dependencies.
Now I'm learning to use Google Firebase for web hosting and functions to deploy the website. After executing the Firebase CLI command firebase init functions, I now have a functions directory that also has a package.json file. I moved my original /app.js into functions/index.js and it works fine when I run firebase serve for testing.
Here is my project directory structure:
functions
├─views
├─index.js
└─package.json  <- contains the firebase-admin and firebase-functions dependencies
public
├─images
├─styles
└─scripts
firebase.json
package.json  <- contains the express, handlebars, etc. dependencies I had originally created
Is it good practice to now have 2 package.json files for this project? One for Firebase functions, and the other for my app? Should I consolidate them into one, and if so, which one?
I'm still new to back end development, so any advice would be deeply appreciated!
You have to see your functions as your backend server. In the end, if you host your web app and call your functions via HTTP requests, your functions act as independent "components", even though they live in the same place in your local project. That's why they need their own package.json.
To help rookies like me: yes, you should have two package.json files. One for your front end, one for your Firebase Functions 'backend'.
To install packages into the functions package.json, navigate to the functions folder in the terminal before running npm install.
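A rough sketch of how the split might look (version numbers are illustrative):

package.json (project root, the Express front end):
{
  "dependencies": {
    "express": "^4.17.1",
    "handlebars": "^4.7.6"
  }
}

functions/package.json (the Firebase Functions backend):
{
  "dependencies": {
    "firebase-admin": "^9.2.0",
    "firebase-functions": "^3.11.0"
  }
}

Then run npm install once in the project root and once inside functions, so each part gets its own node_modules.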

Online-Edit Amazon Lambda function with alexa-sdk

I am creating an Alexa skill and I am using AWS Lambda to handle the intents. I found several tutorials online and decided to use Node.js with the alexa-sdk. After installing the alexa-sdk with npm, the zipped archive takes up about 6 MB. When I upload it to Amazon, it tells me:
The deployment package of your Lambda function "..." is too large to enable inline code editing. However, you can still invoke your function right now.
My index.js is smaller than 4 KB, but the dependencies are large. If I want to change something, I have to zip everything together (index.js and the node_modules folder with the dependencies), upload it to Amazon and wait until it's processed, because online editing isn't available anymore. So every single change to index.js wastes more than a minute of my time zipping and uploading. Is there a way to use the alexa-sdk dependency (and other dependencies) without uploading the same code every time I change something? Is there a way to use the online editing function even though I am using large dependencies? I just want to edit index.js.
If the size of your Lambda function's zipped deployment package exceeds 3 MB, you will not be able to use the inline code editing feature in the Lambda console. You can still use the console to invoke your Lambda function.
It's mentioned here under AWS Lambda Deployment Limits.
ASK-CLI
The ASK Command Line Interface lets you manage your Alexa skills and related AWS Lambda functions from your local machine. Once you set it up, you can make the necessary changes to your Lambda code or skill and use the deploy command to deploy them. The optional target lets you deploy only the associated Lambda code.
ask deploy [--no-wait] [-t| --target <target>] [--force] [-p| --profile <profile>] [--debug]
More info about the ASK CLI here and more about the deploy command here.

Serverless Node.js Project Structure

I am building a RESTful API with the serverless framework to run on AWS (API Gateway + Lambda + DynamoDB). It's my first Node project and my first serverless production project and I have no idea how to structure it. So far I have the following
|--Functions
|-----Function1
|--------InternalModule
|-----Function2
|-----Function3
|--------InternalModule
|-----Function4
|--Shared
|-----Module1
|-----Module2
|-----Module3
|--Tests
|-----Functions
|--------Function1
|-----------InternalModule
|--------Function2
|-----------InternalModule
|--------Function3
|-----------InternalModule
|--------Function4
|-----------InternalModule
|-----Modules
|--------Module1
|-----------InternalModule
|--------Module2
|-----------InternalModule
|--------Module3
|-----------InternalModule
I keep my API endpoints (Lambda handlers) in Functions. Some of them have internal modules that only they use, and some use modules from Shared. I want to have unit tests for all my modules, internal and shared, as well as API tests for the Lambda functions. I am using mocha and chai, and I want to integrate everything into a pipeline which, on a git push, runs the linters and tests and, if they pass, deploys the API to the appropriate stage.
The problem is that in order to test each module I have to have chai installed as a local node module in every folder where I have a test file, and I have to reference the modules under test by relative path. In most cases that looks really ugly because of the nesting. If I want to test an internal module from
Tests/Functions/Function1/InternalModule
and I require it on top of the test like so
require('../../../../Functions/Function1/InternalModule')
Plus I have to install chai in every folder so it's reachable. The same goes for mocha and all the other dependencies needed for the tests, and I haven't even mentioned configuration. The main idea I am now considering is whether or not I should bring all modules into a folder called Modules and require them when needed; worst case,
from Functions/Function1
require('../../Modules/Module1')
I would also keep the test files in the module folder and run them there, but that would still require the assertion library to be installed in every folder. I've read about npm link and symlinks, but I want to keep track of which dependencies each folder has so I can install them in the CI environment after the clean project is downloaded from GitHub, where I can't use links (or have I got the whole concept wrong?).
If anyone can suggest a better solution I would highly appreciate it!
It turns out the way Node uses require is much more powerful than I thought!
First, Node.js looks to see if the given module is a core module - Node.js comes with many modules compiled directly into the executable binary (ex. http, fs, sys, events, path, etc.). These core modules will always take precedence in the loading algorithm.
If the given module is not a core module, Node.js will then begin to search for a directory named, "node_modules". It will start in the current directory (relative to the currently-executing Javascript file in Node) and then work its way up the folder hierarchy, checking each level for a node_modules folder.
read more here
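A quick way to see that search order from inside one of the Functions is to print module.paths, which Node populates for every CommonJS module (the paths below are illustrative):

// e.g. in Functions/Function1/index.js
console.log(module.paths);
// [ '/project/Functions/Function1/node_modules',
//   '/project/Functions/node_modules',
//   '/project/node_modules',
//   ... ]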
I will try putting all modules into a separate Modules folder, each in its own folder prefixed with FunctionName_ so I know where each module is used, together with its test file and package.json. Then if I need a module I can require it from the functions with shallow nesting, which doesn't look so bad:
from Functions/Function1
require('module-1');
with package.json
"dependencies":{
"module-1":"file:../../Modules/Function1_Module1"
}
and have a separate Shared folder where I keep the shared modules.
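For illustration, a test file living inside such a module folder could look like this (a sketch; it assumes chai and mocha are listed in that folder's own package.json devDependencies):

// Modules/Function1_Module1/test.js
var expect = require('chai').expect;
var module1 = require('./index');

describe('Function1_Module1', function () {
  it('exposes its public API', function () {
    expect(module1).to.be.an('object');
  });
});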
I am still open for better ideas!

Deploy Lambda functions without binaries

I have some problems with serverless deploy: when I deploy my Lambda function, the Serverless Framework starts packing my node_modules, and that takes a lot of time.
I mean, why upload node_modules again if it has not been updated? Does anybody know how to deploy only the Lambda function code without packaging the binaries?
You need to add packaging configuration.
In your serverless.yml file, add:
package:
  exclude:
    - node_modules/**
It is useful to exclude the aws-sdk module (if you don't upload it, Lambda will use the version AWS provides, which is better) and to exclude dev modules (like testing frameworks). However, all other modules are dependencies and will need to be uploaded for your function to work properly. So configure the package settings to include/exclude exactly what you need.
Regarding your other question
why upload node_modules again if it has not been updated
It is not a limitation of the Serverless Framework; it is a limitation of the AWS Lambda service. You can't make a partial upload of a Lambda function. Lambda always requires that the uploaded zip package contains the updated code and all required module dependencies.
If your deploy is taking too long, maybe you should consider breaking this Lambda function into smaller units.

Including local dependencies in deployment to lambda

I have a repo which consists of several "micro-services" which I upload to AWS's Lambda. In addition I have a few shared libraries that I'd like to package up when sending to AWS.
Therefore my directory structure looks like:
/micro-service-1
/dist
package.json
index.js
/micro-service-2
/dist
package.json
index.js
/shared-component-1
/dist
package.json
component-name-1.js
/shared-component-2
/dist
package.json
component-name-2.js
The basic deployment leverages the handy node-lambda npm module but when I reference a local shared component with a statement like:
var sharedService = require('../../shared-component-1/dist/index');
This works just fine with the node-lambda run command, but node-lambda deploy drops this local dependency. That probably makes sense because I'm reaching outside the micro-service's "root" directory for the dependency, so I thought maybe I'd leverage gulp to make this work, but I'm pretty new to it, so I may be doing something dumb. My strategy was to:
- have gulp deploy depend on a local-deps task
- the local-deps task would:
  - npm build --production to a directory
  - then pipe this directory over to the micro-service under the /local directory
  - clean up the install in the shared component
I would then refer to all shared components like so:
var sharedService = require('local/component-name-1');
Hopefully this makes clear what I'm trying to achieve. Does this strategy make sense? Is there a simpler way I should be considering? Does anyone have any examples of anything like this in "gulp speak"?
I have an answer to this! :D
TL;DR - Use npm link to create a symbolic link between your common component and the dependent component.
So, I have a project with only two modules:
- main-module
- referenced-module
Each of these is a node module. If I cd into referenced-module and run npm link, then cd into main-module and run npm link referenced-module, npm will 'install' my referenced-module into my main-module and store it in the node_modules folder. NOTE: when running the second npm link, use the name of the project as it appears in its package.json, not the name of the directory (see the npm link documentation, linked previously).
Now, in my main-module all I need to do is var test = require('referenced-module') and I can use it to my heart's content. Be sure to module.exports your code from your referenced-module!
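For illustration, assuming referenced-module exposes a single function (greet is made up):

// referenced-module/index.js
module.exports = {
  greet: function (name) { return 'Hello, ' + name + '!'; }
};

// main-module/index.js, after running the two npm link commands
var referenced = require('referenced-module');
exports.handler = function (event, context, callback) {
  callback(null, referenced.greet('Lambda'));
};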
Now, when you zip up main-module to deploy it to AWS Lambda, the links are resolved and the real modules are put in their place! I've tested this and it works, though not with node-lambda yet; I don't see why that should be a problem (unless it does something different with the package restores).
What's nice about this approach as well is that any changes I make to my referenced-module are automatically picked up by my main-module during development, so I don't have to run any gulp tasks or anything to sync them.
I find this is quite a nice, clean solution and I was able to get it working within a few minutes. If anything I've described above doesn't make any sense (as I've only just discovered this solution myself!), please leave a comment and I'll try and clarify for you.
UPDATE FEB 2016
Depending on your requirements and how large your application is, there may be an interesting alternative that solves this problem even more elegantly than using symlinking. Take a look at Serverless. It's quite a neat way of structuring serverless applications and includes useful features like being able to assign API Gateway endpoints that trigger the Lambda function you are writing. It even allows you to script CloudFormation configurations, so if you have other resources to deploy then you could do so here. Need a 'beta' or 'prod' stage? This can do it for you too. I've been using it for just over a week and while there is a bit of setup to do and things aren't always as clear as you'd like, it is quite flexible and the support community is good!
While using Serverless we faced a similar issue: we needed to share code between AWS Lambdas. Initially we duplicated the code across each microservice, but as always, that became difficult to manage.
Since development was done in a Windows environment, using symbolic links was not an option for us.
So we came up with a solution: keep the local dependencies in a shared folder and use a custom-written gulp task to copy these dependencies into each of the microservice endpoints, so that a dependency can be required just like an npm package.
One decision we made was not to define the microservice dependencies in two places, so we used the same package.json to define the local shared dependencies; the gulp task parses this file, copies the shared dependencies accordingly, and also installs the npm dependencies with a single command.
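As a rough sketch of the idea (names and paths are illustrative, and this is simplified compared to the real tasks):

// gulpfile.js in a micro-service
var gulp = require('gulp');

// Copy a shared component into this micro-service's node_modules so it can be
// required like a normal npm package, e.g. require('shared-component-1')
gulp.task('install-local-deps', function () {
  return gulp.src('../shared-component-1/dist/**/*')
    .pipe(gulp.dest('node_modules/shared-component-1'));
});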
Later we made the code open source as npm modules serverless-dependency-install and gulp-dependency-install.
