My project's structure is as follows:
MyProject
    layers
        myLayer1
            nodejs
                node_modules
                    myLayer1
                        myExtension.js
    lambda1
        handler.js
    lambda2
        handler.js
    jsconfig.json
myExtension.js:

module.exports.myTest = () => {
  return 'My extension test';
};
handler.js:

const myext = require('myLayer1');

module.exports.handler = async (event) => {
  return {
    statusCode: 200,
    body: JSON.stringify({
      message: myext.myTest()
    })
  };
};
When I deploy to AWS, everything works.
But I'm unable to run/debug it on my local machine.
From what I found, a jsconfig.json file should help map paths in this case, but VSCode/Node.js ignores it no matter what I write there (I tried placing it in the MyProject root folder and within the lambda folders).
I can run these lambdas locally if I change the require within handler.js to:
const myext = require('./layers/myLayer1/nodejs/node_modules/myLayer1');
which obviously breaks the code when I deploy it to AWS.
Overview
This is similar to another question asking how to reference modules from multiple layers: How configure Visual Studio Code to resolve input paths for AWS Lambda Layers (javascript)
The simpler answer, for a lambda function with a single layer of node_modules, is to put a jsconfig.json file in the directory containing the lambda's dependency-free package.json (this is not the root of the entire project, but the root of the individual lambda function).
The example project in the question does attempt to use a jsconfig.json file, but it is located at the top level, which does not have the desired effect; it needs to be in the lambda1 and lambda2 directories.
Example jsconfig.json file for VS Code
This file should be placed at both of the following locations for the example project in the question:
MyProject/lambda1/jsconfig.json
MyProject/lambda2/jsconfig.json
// https://code.visualstudio.com/docs/languages/jsconfig
{
  "compilerOptions": {
    // baseUrl is the path used for pathless / naked node module includes
    "baseUrl": "../layers/myLayer1/nodejs/node_modules/",
    "moduleResolution": "node",
    "module": "commonjs"
  }
}
Complete Example Code / Project Structure
https://github.com/pwrdrvr/layers-js-demo
I have an HTTP cloud function that returns some dynamic HTML. I want to use Handlebars as the templating engine. The template is sufficiently big that it's not practical to keep it in a const variable at the top of my function.
I've tried something like:
const template = fs.readFileSync('./template.hbs', 'utf-8');
But when deploying the function I always get an error that the file does not exist:
Error: ENOENT: no such file or directory, open './template.hbs'
The template.hbs is in the same directory as my index.js file so I imagine the problem is that the Firebase CLI is not bundling this file along the rest of files.
According to the docs of Google Cloud Functions it is possible to bundle local modules with "mymodule": "file:mymodule". So I've tried creating a templates folder at the root of the project and added "templates": "file:./templates" to the package.json.
My file structure is something like this:

/my-function
    index.js        // this is the entry point
    /templates
        something.hbs
And then:
const template = fs.readFileSync('../node_modules/templates/something.hbs', 'utf-8');
But I'm getting the same not found error.
What is the proper way of including and requiring non-JS dependencies in a Firebase Cloud Function?
The Firebase CLI will package up all the files in your functions folder, except for node_modules, and send the entire archive to Cloud Functions. It will reconstitute node_modules by running npm install while building the docker image that runs your function.
If your something.hbs is in /templates under your functions folder, you should be able to refer to it as ./templates/something.hbs from the top-level index.js. If your JS is in another folder, you might have to work your way up first with ../templates/something.hbs. The files should all be there; just figure out the path. I wouldn't try to do anything fancy in your package.json. Just take advantage of the fact that the CLI deploys everything except node_modules.
This code works fine for me if I have a file called 'foo' at the root of my functions folder:
import * as functions from 'firebase-functions'
import * as fs from 'fs'

export const test = functions.https.onRequest((req, res) => {
  const foo = fs.readFileSync('./foo', 'utf-8')
  console.log(foo)
  res.send(foo)
})
The solution was to use path.join(__dirname, 'template.hbs').

const fs = require('fs');
const path = require('path');

const template = fs.readFileSync(path.join(__dirname, 'template.hbs'), 'utf-8');

As @doug-stevenson pointed out, all files are included in the final bundle, but for some reason using the relative path did not work. Forcing an absolute path with __dirname did the trick.
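To round out the answer, here is a minimal sketch of the full Handlebars flow under the same assumptions (the function name page and the template data are illustrative, not from the original question):

const functions = require('firebase-functions');
const fs = require('fs');
const path = require('path');
const handlebars = require('handlebars');

// Read and compile the template once at cold start; path.join(__dirname, ...)
// keeps the path correct regardless of the process's working directory.
const source = fs.readFileSync(path.join(__dirname, 'template.hbs'), 'utf-8');
const template = handlebars.compile(source);

exports.page = functions.https.onRequest((req, res) => {
  res.send(template({ message: 'Hello from Handlebars' }));
});

Compiling at module load time also means the file is read once per instance rather than on every request.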
I have a lambda written in Node.js. I used npm archiver to zip and deploy the files to AWS. Everything is okay except that the zip includes node_modules. I then used glob:
var srcFolder = './';
var ignores = ['node_modules/**/*', 'publish.js'];

archive.glob('**/*', {
  cwd: srcFolder,
  ignore: ignores,
  nodir: true,
  dot: true,
  follow: true
});
This works. The problem comes when I need to include some node_modules with my deployment. How can I include only the packages needed for production (not those in devDependencies), and ignore aws-sdk and archiver?
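One possible approach (a sketch, not from the original thread): read the production dependencies out of package.json and add them back to the archive one package at a time, keeping the blanket node_modules ignore for everything else. The excluded list below is illustrative:

// Archive source files, then add back only production dependencies.
// Assumes `archive` is the archiver instance from the question.
const pkg = require('./package.json');

// Packages that should never ship: provided by the Lambda runtime
// (aws-sdk) or only needed to build/publish (archiver).
const excluded = ['aws-sdk', 'archiver'];
const prodDeps = Object.keys(pkg.dependencies || {})
  .filter((name) => !excluded.includes(name));

// Everything except node_modules and the publish script...
archive.glob('**/*', {
  cwd: './',
  ignore: ['node_modules/**/*', 'publish.js'],
  nodir: true,
  dot: true
});

// ...then each production package individually.
for (const name of prodDeps) {
  archive.glob(`node_modules/${name}/**/*`, { cwd: './', nodir: true, dot: true });
}

Note that this does not pull in transitive dependencies; a more robust route is to run npm install --production into a clean staging folder and zip that instead.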
I'm trying to use the Serverless Framework to create a Lambda function that uses the open weather NPM module. However, I'm getting the following exception, even though my node_modules contains the specific library.
I managed to run the sample (https://github.com/serverless/examples/tree/master/aws-node-rest-api-with-dynamodb) successfully, and am now hacking on it to add a node module that integrates the open weather API.
Endpoint response body before transformations: {"errorMessage":"Cannot find module 'Openweather-Node'","errorType":"Error","stackTrace":["Module.require (module.js:353:17)","require (internal/module.js:12:17)","Object.<anonymous> (/var/task/todos/weather.js:4:17)","Module._compile (module.js:409:26)","Object.Module._extensions..js
My code:

'use strict';

const AWS = require('aws-sdk'); // eslint-disable-line import/no-extraneous-dependencies
var weather = require('Openweather-Node');

const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.weather = (event, context, callback) => {
  const params = {
    TableName: process.env.DYNAMODB_TABLE,
    Key: {
      id: event.pathParameters.id,
    },
  };

  weather.setAPPID("mykey");
  // set the culture
  weather.setCulture("fr");
  // set the forecast type
  weather.setForecastType("daily");

  const response = {
    statusCode: 200,
    body: "{test response}",
  };

  callback(null, response);
};
Did you run npm install in your working directory before doing your serverless deploy? The aws-sdk node module is available to all lambda functions, but all other node dependencies must be installed so they are packaged with your lambda when you deploy.
You may find this issue on the serverless repository helpful (https://github.com/serverless/serverless/issues/948).
I fixed this error by moving everything in package.json from devDependencies to dependencies.
Cheers
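For illustration (the package names and versions here are hypothetical), the fix amounts to making sure every package the handler requires at runtime sits under dependencies, since devDependencies are not installed in the deployed bundle:

{
  "dependencies": {
    "openweather-node": "^0.9.0"
  },
  "devDependencies": {
    "serverless-offline": "^8.0.0"
  }
}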
I don't know if it applies to this answer, but in case someone just needs a brain refresh: I forgot to export my handler, so the handler string pointed at the file and looked for an export that didn't exist...
I changed it from this:

handler: foldername/exports

to this:

handler: foldername/exports.handler
I had the same problem using the Serverless Framework to deploy multiple lambda functions. I fixed it with the following steps:

1. Whatever path you keep in the handler, e.g. handler: foldername/exports.handler
2. Name the file inside the folder exports.js (whatever you named in the handler)
3. Run serverless deploy

This should solve your problem; see the example below.
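For example (a sketch; the folder and function names are illustrative, not prescribed by the framework), the handler string in serverless.yml and the file on disk have to line up like this:

functions:
  hello:
    # resolves to foldername/exports.js, which must export `handler`
    handler: foldername/exports.handler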
I went to CloudWatch and looked for the missing packages, then ran npm i "missing package" and sls deploy.
The missing packages need to be listed under dependencies; in my case some were under devDependencies and another was missing entirely.
You need to create a deployment package if you have external dependencies. Please see this answer:
AWS Node JS with Request
Reference:
http://docs.aws.amazon.com/lambda/latest/dg/nodejs-create-deployment-pkg.html
In some cases, don't forget to check your global serverless installation.
Mine was solved by reinstalling:

npm install -g serverless
In my case, what worked was switching to Node 10 (via nvm). I was using a newer version of Node (v15.14.0) than the packages probably supported.
My case was configuring params for creating an AWS lambda function. The right string for the handler was (last row):

Resources:
  StringResourceName:
    Type: 'AWS::Serverless::Function'
    Properties:
      Handler: myFileName.handler

where myFileName is the name of the file that gets packaged into the zip.
You have an issue with your .ts files: since the serverless-offline plugin cannot find an equivalent .js file, it throws a module-not-found error.
There is a workaround: install serverless-plugin-typescript. The only issue with the plugin is that it creates a new .build/.dist folder with the transpiled .js files.
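A minimal sketch of that workaround in serverless.yml (note the plugin order: serverless-plugin-typescript must come before serverless-offline so the transpiled files are picked up):

plugins:
  # order matters: the typescript plugin must precede serverless-offline
  - serverless-plugin-typescript
  - serverless-offline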
I was doing something silly, but I still want to put it here so that any beginner like me doesn't struggle with it. I copied the serverless.yml from an example where the handler value was

handler: index.handler

but my index.js was in the src folder, hence the file-not-found error. It worked after I changed the handler value to

handler: src/index.handler
For anyone developing Python lambda functions with serverless-offline and using a virtual environment during local development: deactivate your environment, delete it entirely, and recreate it. Install all Python requirements and try again. This worked for me.
For me, the issue was that the handler file name contained a dot:
main-handler.graphql.js caused serverless to fail with "Error: Cannot find module 'main'".
When I changed the file to main-handler-graphql.js, everything worked.
Is library versioning supported in Node.js?
I have a folder like package/version/1.0/ and these files under that path:

test1.js
test2.js

script.js:

// access the folder package of version 1.0
var lib = require('require-all')(__dirname + '/package/version/1.0');
test1.js
========

function sum(a, b) {
  return a + b;
}
exports.sum = sum;

test2.js
========

function sub(a, b) {
  return a - b;
}
exports.sub = sub;
In the script.js file I can require the package/version/1.0 folder, but how can I access the functions sum() and sub() in my script file? Is library versioning supported in Node.js, and is the above code a sort of library versioning?
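For what it's worth, require-all returns an object keyed by file name (without the .js extension), so with the corrected test1.js/test2.js above the functions can be reached like this (a sketch):

// script.js
var lib = require('require-all')(__dirname + '/package/version/1.0');

console.log(lib.test1.sum(1, 2)); // 3
console.log(lib.test2.sub(5, 3)); // 2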
First of all, I haven't seen versions of libraries shipped inside one package. The most common way is to release new versions of the package and upload them online, defining the required version in the package.json dependencies; npm will take care of download and install.
If you want to deprecate a certain version of your library online, npm deprecate is the right command for that job.
When you create a new npm package you can define a main script which handles the loading of all files inside the package.
Usually it's called index.js or main.js, and it is used when someone calls require('<library>');
So you can try the following to achieve the "versioning":
index.js

var fs = require('fs');
var path = require('path');
var _packageJSON = require(__dirname + '/package.json');
var defaultVersion = _packageJSON.version;

module.exports = function (whichVersion) {
  whichVersion = whichVersion || defaultVersion;
  // resolve the requested version folder relative to this package
  var versionDir = path.join(__dirname, 'versions', whichVersion);
  // synchronous check so the loaded module can be returned directly
  if (!fs.existsSync(versionDir)) {
    throw new Error('Unable to load version : ' + whichVersion + ' : ' + _packageJSON.name);
  }
  // require 1.0/index.js, 2.0/index.js, ...
  return require(path.join(versionDir, 'index.js'));
};
Any script that has that package as a dependency can then load a specific version by simply calling require("<library name>")(<version>), e.g.

require("mylib")("1.0")
Under each version inside the package, you can have an index.js which loads/exports the variables and functions properly.
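A minimal sketch of such a per-version index.js, using the file names from the structure below (the vars key is renamed because var is a reserved identifier):

// versions/1.0/index.js
module.exports = {
  util: require('./util'),
  fn: require('./fn'),
  vars: require('./var')
};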
The final structure should look like:

my npm package (main module)
    index.js
    versions/
        1.0/
            index.js
            util.js
            fn.js
            var.js
        2.0/
            index.js
            util.js
            fn.js
            var.js
Hope it helps.