I am creating a Lambda layer, bundling up some dependencies including node_modules. The layer is created successfully, but when I try to require a module from my code, the console tells me that the module cannot be found. Here is the code:
var Promise = require('promise');

module.exports.handler = function(event, context, callback) {
    new Promise(function (resolve, reject) {
        setTimeout(function() {
            callback(null, "helloWorld2");
        }, 9000);
    });
};
How can I reference node modules from a layer?
How are you running your lambda? If via the SAM CLI, something like the template.yaml below has worked for me ...
example template
AWSTemplateFormatVersion: 2010-09-09
Transform: AWS::Serverless-2016-10-31
Description: example of node.js lambda function using layer for dependencies
Resources:
  ExampleFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: nodejs8.10
      CodeUri: nodejs/
      Handler: src/event.handler
      Layers:
        - !Ref NodeModulesLayer
  NodeModulesLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      Description: full set of function dependencies
      ContentUri: ./
      CompatibleRuntimes:
        - nodejs6.10
        - nodejs8.10
      LicenseInfo: 'Available under the MIT-0 license.'
      RetentionPolicy: Retain
pointing to local layer
The SAM developer guide includes a page on Working with Layers. At the time I'm writing this, they don't really get into how to reference layers at local file paths, and instead focus on references to remotely hosted layers.
The aspect I found tricky is that the directory structure of a node.js layer is expected to be ...
nodejs/
    node_modules/
... which means that in order for your locally installed node_modules directory to work as a layer, your package.json file must be nested inside a folder named nodejs.
Note the paths in the above example template.yaml:
ExampleFunction.Properties.CodeUri is set to nodejs/
ExampleFunction.Properties.Handler should be set to the path to your handler file, relative to nodejs/.
NodeModulesLayer.Properties.ContentUri is set to the folder that contains both the template.yaml file and the nodejs dir.
Which means my example is assuming the following structure ...
nodejs/
    node_modules/
    src/
        event.js
    package.json
template.yaml
preserve sam build support
One additional gotcha to be wary of ...
With respect to defining your function resource in template.yaml, there's some "flexibility" in terms of which parts of the path you put in CodeUri vs Handler. In some cases, doing ...
Properties:
  CodeUri: nodejs/src/
  Handler: event.handler
... works just as well as doing ...
Properties:
  CodeUri: nodejs/
  Handler: src/event.handler
BUT, if you're using the sam build command, the former will NOT work. That command expects to find package.json inside the CodeUri directory. So, stick with CodeUri: nodejs/ and use the Handler value to navigate through any additional folder hierarchy needed to reach your handler.
Try this simple example of how to set up a Lambda layer in Node.js:
https://medium.com/@anjanava.biswas/nodejs-runtime-environment-with-aws-lambda-layers-f3914613e20e
Dependencies in 2022...
It was quite hard for me to figure out whether this is still state-of-the-art or whether dependencies can be handled more simply now. It turns out it's now possible to include dependencies without hacks or a complicated setup.
For me, creating the Lambda with new NodejsFunction() (from the aws-lambda-nodejs module) instead of new lambda.Function() did the trick. Yet it was super hard for me to find a working sample, so I decided to share a sample repo with you.
Sample repository
https://github.com/AntoniusGolly/cdk-lambda-typescript-boilerplate
What it does:
it has a lot of stuff you don't need to demonstrate the purpose (e.g. it produces an API Gateway based on a config.ts, so ignore that part if you don't need it)
it allows you to use TS and ES6
it allows you to use a dependency: i.e. I use node-fetch here
I just do cdk deploy and no other bundling
according to the docs, by default all node modules are bundled except for aws-sdk.
I don't know exactly how it works, but it does ;)
I hope this helps someone as I would have appreciated it.
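To make the idea concrete, here is a minimal CDK sketch in JavaScript, not taken from the linked repo (which uses TypeScript): the construct is NodejsFunction from the aws-lambda-nodejs module, and the entry path and names below are illustrative assumptions.

// cdk-app.js -- minimal sketch, assuming CDK v2 is installed
const cdk = require('aws-cdk-lib');
const lambda = require('aws-cdk-lib/aws-lambda');
const { NodejsFunction } = require('aws-cdk-lib/aws-lambda-nodejs');

class SampleStack extends cdk.Stack {
  constructor(scope, id, props) {
    super(scope, id, props);

    // NodejsFunction bundles the handler and its node_modules dependencies
    // (e.g. node-fetch) with esbuild at synth time, so no layer is needed.
    new NodejsFunction(this, 'FetchFunction', {
      entry: 'lambda/fetch-handler.js', // hypothetical handler file
      handler: 'handler',               // exported function name
      runtime: lambda.Runtime.NODEJS_14_X,
    });
  }
}

const app = new cdk.App();
new SampleStack(app, 'SampleStack');

With this, cdk deploy takes care of the bundling, matching what the repo does.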
In order to address your module, you have to use a path with '/opt' as a prefix.
If it's your own file, packaged into myLib.zip with myLib.js inside, you should write:
const myModule = require('/opt/myLib');
If you are plugging in an installed dependency, you upload node_modules.zip containing a node_modules folder and reference it as:
const module = require('/opt/node_modules/lib_name');
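For completeness, a small hedged sketch of both resolution styles (the module names are just examples): when the layer zip follows the nodejs/node_modules layout described in the other answer, Lambda's NODE_PATH already includes /opt/nodejs/node_modules, so a bare module name also resolves.

// If the layer uses the nodejs/node_modules/... layout, a bare require works:
const promise = require('promise');   // resolved from /opt/nodejs/node_modules

// If the zip was built without the nodejs/ wrapper folder, fall back to the
// explicit /opt prefix shown above:
const myLib = require('/opt/myLib');   // myLib.js at the root of the layer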
Overview
I've been trying to get Datadog to work with my lambda functions. We're using the Serverless Framework. Our main objective is to send some custom metrics to Datadog from our lambdas. To do this, we're using the Serverless plugin for Datadog (serverless-plugin-datadog) along with the datadog-lambda-js library (references included at the end). One last thing to mention: we are using webpack to build our services.
Issues and Errors:
Running the serverless services locally works perfectly fine (we install datadog-lambda-js as a dev dependency). All our metrics are sent to Datadog.
However, when we push it to Lambda, we get errors from all services saying Error: Cannot find module 'datadog-lambda-js'. We see this in the Lambda logs.
We've used all the steps and methods available to fix this. We also know webpack has some issues with Datadog, but we've followed the steps to handle that too (check this). Also, before deploying to Lambda, we remove the datadog-lambda-js dependency from package.json, but still no luck.
I'll drop the code blocks below for reference. Please let me know if I've missed anything.
# serverless.yml
plugins:
  - serverless-plugin-warmup
  - serverless-webpack
  - serverless-offline
  - serverless-domain-manager
  - serverless-step-functions
  - serverless-plugin-resource-tagging
  - serverless-dynamo-stream-plugin
  - serverless-plugin-datadog

custom:
  serviceName: <my-service>
  datadog:
    addExtension: true
    addLayers: true
    apiKey: <my-dd-key>
    # flushMetricsToLogs: false
    enabled: true
  webpack:
    webpackConfig: /webpack.config.js
    includeModules: true
    forceExclude:
      - datadog-lambda-js
    packagerOptions:
      scripts:
        - rm -rf node_modules/datadog-lambda-js
    packager: npm
// file that sends distribution metrics
const { datadog, sendDistributionMetric, sendDistributionMetricWithDate } = require('datadog-lambda-js');

exports.hello = async function hello() {
  console.log("Running metrics");
  sendDistributionMetricWithDate(
    <my-metric-name>,     // Metric name
    1,                    // Metric value
    new Date(Date.now()), // Metric timestamp
    "tag1:T1",
    "tag2:T2"             // Associated tags
  );
  console.log("Closing metrics");
  return {
    statusCode: 200,
    body: "hello, dog!",
  };
}
Apart from this, we have also excluded datadog-lambda-js from webpack: externals: [nodeExternals(), "datadog-lambda-js"]
Would appreciate any help on this.
References:
Sending Custom Metric
Using Datadog with Node.js Serverless application
Webpack changes to run DD library
You need to add the Datadog Lambda layer to your Lambda function. This should happen automatically when you install the Serverless Framework plugin via serverless plugin install --name serverless-plugin-datadog.
Alternatively, find the layer ARN in the docs and add it to your Lambda function yourself. This blog post has more detailed instructions on adding a layer via the AWS console.
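Once the layer provides datadog-lambda-js at runtime, the handler can keep requiring it while webpack leaves it out of the bundle. A minimal sketch, assuming the plugin-managed layer is attached and using the wrapper API from the library docs (metric name and tags are illustrative):

// datadog-lambda-js is resolved from the Datadog layer at runtime,
// so it stays in webpack's externals / forceExclude lists.
const { datadog, sendDistributionMetric } = require('datadog-lambda-js');

exports.hello = datadog(async () => {
  sendDistributionMetric('my.metric.name', 1, 'tag1:T1', 'tag2:T2');
  return { statusCode: 200, body: 'hello, dog!' };
});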
My goal is to share library code among several of our lambda functions using layers and be able to debug locally and run tests.
npm is able to install dependencies from the local file system. When we change our library code, we want all users of that library to get the updated code without having to set up a dedicated npm server.
I'm able to debug locally just fine using the relative paths, but that's before I involve sam build.
sam build creates a hidden folder at the root of the repository, builds the folder out, and eventually runs npm install; however, at that point the folder structure is different, so the relative paths used in the package.json file are now broken. We can't use absolute paths because our repositories reside under our user folders, which of course differ from one developer to another.
Here's what I did:
I created a project using sam init and took the defaults (except the name of sam-app-2) for a nodejs 12.x project (options 1 and 1).
That command created a folder called sam-app-2 which is the reference for all of the following file names.
I created a dependencies/nodejs folder.
I added dep.js to that folder:
exports.x = 'It works!!';
I also added package.json to the same folder:
{
  "name": "dep",
  "version": "1.0.0",
  "description": "",
  "main": "dep.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC"
}
Under hello-world (the folder housing the lambda function), I added the following to the dependencies in package.json:
"dep": "file:../dependencies/nodejs"
I ran npm install under hello-world and it copied the dependencies under node_modules/dep.
Normally, you wouldn't do that here. This is purely to allow me to run locally without involving the SAM CLI. It's just pure Node.js code. I can run tests, and I can debug without having to wait twenty seconds or more while sam packages up everything and invokes my function. Developing in this state is awesome because it's very fast. However, it'll eventually need to run correctly in the wild.
I edited ./hello-world/app.js:
const dep = require('dep');

let response;

exports.lambdaHandler = async (event, context) => {
    try {
        // const ret = await axios(url);
        response = {
            'statusCode': 200,
            'dep.x': dep.x,
            'body': JSON.stringify({
                message: 'Hello, World!!',
                // location: ret.data.trim()
            })
        }
    } catch (err) {
        console.log(err);
        return err;
    }
    return response
};

if (require.main === module) {
    (async () => {
        var result = await exports.lambdaHandler(process.argv[1]);
        console.log(result);
    })();
}
I edited template.yml:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
  sam-app-2

  Sample SAM Template for sam-app-2

# More info about Globals: https://github.com/awslabs/serverless-application-model/blob/master/docs/globals.rst
Globals:
  Function:
    Timeout: 3

Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function # More info about Function Resource: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#awsserverlessfunction
    Properties:
      CodeUri: hello-world/
      Handler: app.lambdaHandler
      Runtime: nodejs12.x
      Layers:
        - !Ref DepLayer
      Events:
        HelloWorld:
          Type: Api # More info about API Event Source: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#api
          Properties:
            Path: /hello
            Method: get
  DepLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      LayerName: sam-app-dependencies-2
      Description: Dependencies for sam app [temp-units-conv]
      ContentUri: ./dependencies/
      CompatibleRuntimes:
        - nodejs12.x
      LicenseInfo: 'MIT'
      RetentionPolicy: Retain

Outputs:
  # ServerlessRestApi is an implicit API created out of Events key under Serverless::Function
  # Find out more about other implicit resources you can reference within SAM
  # https://github.com/awslabs/serverless-application-model/blob/master/docs/internals/generated_resources.rst#api
  HelloWorldApi:
    Description: "API Gateway endpoint URL for Prod stage for Hello World function"
    Value: !Sub "https://${ServerlessRestApi}.execute-api.${AWS::Region}.amazonaws.com/Prod/hello/"
  HelloWorldFunction:
    Description: "Hello World Lambda Function ARN"
    Value: !GetAtt HelloWorldFunction.Arn
  HelloWorldFunctionIamRole:
    Description: "Implicit IAM Role created for Hello World function"
    Value: !GetAtt HelloWorldFunctionRole.Arn
Running it straight from the command line works:
sam-app-2> node hello-world\app.js
{
  statusCode: 200,
  'dep.x': 'It works!!',
  body: '{"message":"Hello, World!!"}'
}
Even sam deploy works! Yes, it deploys the code to the cloud and when I invoke the lambda function in the cloud, it gives the same result as above.
However, when I run sam build, it fails with:
Building resource 'HelloWorldFunction'
Running NodejsNpmBuilder:NpmPack
Running NodejsNpmBuilder:CopyNpmrc
Running NodejsNpmBuilder:CopySource
Running NodejsNpmBuilder:NpmInstall
Build Failed
Error: NodejsNpmBuilder:NpmInstall - NPM Failed: npm ERR! code ENOLOCAL
npm ERR! Could not install from "..\dependencies\nodejs" as it does not contain a package.json file.
npm ERR! A complete log of this run can be found in:
npm ERR! C:\Users\Brandon\AppData\Roaming\npm-cache\_logs\2020-03-04T19_34_01_873Z-debug.log
When I try to invoke the lambda locally:
sam local invoke "HelloWorldFunction" -e events/event.json
Invoking app.lambdaHandler (nodejs12.x)
DepLayer is a local Layer in the template
Building image...
Requested to skip pulling images ...
Mounting C:\Users\Brandon\source\repos\sam-app-2\hello-world as /var/task:ro,delegated inside runtime container
2020-03-03T19:34:28.824Z undefined ERROR Uncaught Exception {"errorType":"Runtime.ImportModuleError","errorMessage":"Error: Cannot find module 'dep'\nRequire stack:\n- /var/task/app.js\n- /var/runtime/UserFunction.js\n- /var/runtime/index.js","stack":["Runtime.ImportModuleError: Error: Cannot find module 'dep'","Require stack:","- /var/task/app.js","- /var/runtime/UserFunction.js","- /var/runtime/index.js"," at _loadUserApp (/var/runtime/UserFunction.js:100:13)"," at Object.module.exports.load (/var/runtime/UserFunction.js:140:17)"," at Object.<anonymous> (/var/runtime/index.js:43:30)"," at Module._compile (internal/modules/cjs/loader.js:955:30)"," at Object.Module._extensions..js (internal/modules/cjs/loader.js:991:10)"," at Module.load (internal/modules/cjs/loader.js:811:32)"," at Function.Module._load (internal/modules/cjs/loader.js:723:14)"," at Function.Module.runMain (internal/modules/cjs/loader.js:1043:10)"," at internal/main/run_main_module.js:17:11"]}
START RequestId: b6f39717-746d-1597-9838-3b6472ec8843 Version: $LATEST
END RequestId: b6f39717-746d-1597-9838-3b6472ec8843
REPORT RequestId: b6f39717-746d-1597-9838-3b6472ec8843 Init Duration: 237.77 ms Duration: 3.67 ms Billed Duration: 100 ms Memory Size: 128 MB Max Memory Used: 38 MB
{"errorType":"Runtime.ImportModuleError","errorMessage":"Error: Cannot find module 'dep'\nRequire stack:\n- /var/task/app.js\n- /var/runtime/UserFunction.js\n- /var/runtime/index.js"}
When I try to start the API locally with sam local start-api, it fails with the same error as above.
I'm thinking that if it weren't for the relative file paths being off during the build phase, I'd be able to have my cake (debugging locally very quickly) and eat it too (run sam build, sam local start-api).
What should I do?
After much frustration and angst, this has been produced:
https://github.com/blmille1/aws-sam-layers-template
Enjoy!
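For reference, the approach in that template (as the answers below describe it) boils down to switching the require path based on an environment variable that is only set in the Lambda environment. A rough sketch, with illustrative paths:

// AWS is an env variable defined in template.yaml for deployed / sam-invoked runs;
// locally it is undefined, so the relative path to the layer source is used.
const common = require(
  process.env.AWS
    ? '/opt/nodejs/common'                  // path when running inside Lambda
    : '../../layers/layer1/nodejs/common'   // path when running plain node
);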
I also faced the same issue and eventually came up with an alternative solution to yours. It offers the developer experience you are looking for but avoids the slight inconvenience and maintenance overhead of your solution (i.e. using the ternary operator for every import that accesses the shared layer). My proposed solution uses a similar approach to yours but only requires a one-time initialization call per lambda function. Under the hood, it uses module-alias to resolve the dependencies at runtime.
Here's a link to a repository with an example template: https://github.com/dangpg/aws-sam-shared-layers-template
(tl;dr) When using the linked template you get:
Share common code or dependencies among multiple lambda functions using layers
Does not use any module bundlers (e.g., Webpack)
Instead, uses module-alias to resolve dependencies during runtime
Supports local debugging within VSCode via AWS Toolkit
Code can be executed outside of AWS sandbox (node app.js)
Supports unit testing with Jest
Intellisense/Autocomplete within VSCode
Compatible with SAM CLI (sam build, sam deploy, sam local invoke, sam local start-api, etc.).
Can be deployed and run in the cloud as generic lambdas
1. Folder structure
+ lambdas
| + func1
| - app.js
| - package.json
| + func2
| - app.js
| - package.json
+ layers
| + common // can be any name, I chose common
| + nodejs // needs to be nodejs due to how SAM handles layers
| - index.js // file for own custom code that you want to share
| - package.json // list any dependencies you want to share
- template.yaml
Here's the folder structure I ended up with. Keep in mind, though, that it is quite flexible; there are no hard rules as long as the relative file paths line up (you would just need to adapt some files if your structure differs).
If you want to share npm packages among lambda functions, just add them to the package.json within the layer folder:
{
  "name": "common",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "lorem-ipsum": "^2.0.4"
  }
}
For completeness, here's the content of index.js:
exports.ping = () => {
  return "pong";
};
2. Template.yaml
[...]
Globals:
  Function:
    Timeout: 3
    Runtime: nodejs14.x
    Environment:
      Variables:
        AWS: true

Resources:
  Func1:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: lambdas/func1/
      Handler: app.lambdaHandler
      Layers:
        - !Ref CommonLayer
      [...]
  Func2:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: lambdas/func2/
      Handler: app.lambdaHandler
      Layers:
        - !Ref CommonLayer
      [...]
  CommonLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      ContentUri: ./layers/common/
      CompatibleRuntimes:
        - nodejs14.x
      RetentionPolicy: Retain
Pretty straightforward; just follow the official example on how to include layers in your lambdas. Like in your solution, add a global env variable so we can tell whether we are running code within an AWS sandbox or not.
3. Lambda package.json
Add module-alias as a dependency and your local common folder as a devDependency to each lambda function:
...
"dependencies": {
  "module-alias": "^2.2.2"
},
"devDependencies": {
  "common__internal": "file:../../layers/common/nodejs" // adapt relative path according to your folder structure
},
...
We will need the local reference to our common folder later on (e.g. for testing). We add it as a devDependency since we only need it for local development, and that way we don't run into issues when running sam build (which ignores devDependencies). I chose common__internal as the package name, but you are free to choose whatever you like. Make sure to run npm install before doing any local development.
4. Lambda handler
Within your handler source code, before you import any packages from your shared layer, initialize module-alias to do the following:
const moduleAlias = require("module-alias");

moduleAlias.addAlias("#common", (fromPath, request) => {
  if (process.env.AWS && request.startsWith("#common/")) {
    // AWS sandbox and reference to dependency
    return "/opt/nodejs/node_modules";
  }
  if (process.env.AWS) {
    // AWS sandbox and reference to custom shared code
    return "/opt/nodejs";
  }
  if (request.startsWith("#common/")) {
    // Local development and reference to dependency
    return "../../layers/common/nodejs/node_modules";
  }
  // Local development and reference to custom shared code
  return "../../layers/common/nodejs";
});

const { ping } = require("#common"); // your custom shared code
const { loremIpsum } = require("#common/lorem-ipsum"); // shared dependency

exports.lambdaHandler = async (event, context) => {
  try {
    const response = {
      statusCode: 200,
      body: JSON.stringify({
        message: "hello world",
        ping: ping(),
        lorem: loremIpsum(),
      }),
    };
    return response;
  } catch (err) {
    console.log(err);
    return err;
  }
};

if (require.main === module) {
  (async () => {
    var result = await exports.lambdaHandler(process.argv[1]);
    console.log(result);
  })();
}
You can move the module-alias code to a separate file and just import that one at the beginning instead (or even publish your own custom package that you can then properly import; that way you can reference it from each of your lambda functions and don't have to duplicate code), as sketched below. Again, adjust the relative file paths according to your folder structure. Similar to your approach, it checks for the AWS environment variable and adjusts the import paths accordingly. However, this only has to be done once. After that, all successive imports can just use your defined alias: const { ping } = require("#common"); and const { loremIpsum } = require("#common/lorem-ipsum");. Also, feel free to define your very own custom logic on how to handle aliases; this is just the solution that worked for me.
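A hedged sketch of what that separate file could look like; the file name register-aliases.js is hypothetical, and the paths mirror the folder structure above:

// register-aliases.js -- each handler requires this once before importing from #common
const moduleAlias = require("module-alias");

moduleAlias.addAlias("#common", (fromPath, request) => {
  const inAws = Boolean(process.env.AWS);
  const wantsDependency = request.startsWith("#common/");
  if (inAws) {
    return wantsDependency ? "/opt/nodejs/node_modules" : "/opt/nodejs";
  }
  return wantsDependency
    ? "../../layers/common/nodejs/node_modules"
    : "../../layers/common/nodejs";
});

Then each handler starts with require("./register-aliases"); followed by the usual const { ping } = require("#common"); imports.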
From this point on, you should be able to execute your lambda code either locally through node app.js or through the SAM CLI (sam build, sam local invoke, etc.). However, if you want local testing and intellisense, there is some additional work left.
5. Intellisense
For VSCode, you can just add a jsconfig.json file with the respective path mappings for your defined alias. Point it to the internal devDependency from earlier:
{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "#common": ["./node_modules/common__internal/index.js"],
      "#common/*": ["./node_modules/common__internal/node_modules/*"]
    }
  }
}
6. Testing
For testing, I personally use Jest. Fortunately, Jest provides the option to provide path mappings too:
// within your lambda package.json
"jest": {
  "moduleNameMapper": {
    "#common/(.*)": "<rootDir>/node_modules/common__internal/node_modules/$1",
    "#common": "<rootDir>/node_modules/common__internal/"
  }
}
Final disclaimer
This solution currently only works with the CommonJS module system. I haven't been able to reproduce the same result with ES modules (mostly due to module-alias's lack of support for them). However, if somebody comes up with a solution using ES modules, I'm happy to hear it!
And that's it! Hopefully I didn't leave anything out. Overall, I'm pretty happy with the developer experience this solution offers. Feel free to look at the linked template repository for more details. I know it's been a while since your original question, but I'm leaving this here in the hope that it will help fellow developers too. Cheers
I followed up on your accepted answer and I believe I figured out the correct approach (I'm posting it here since I got here via Google and others may wander in too).
1. Organize your module in the following way (my-layers and my-module can be renamed, but nodejs/node_modules must remain):
+ my-layers
| + my-module
| + nodejs
| + node_modules
| + my-module
| + index.js
| + package.json
+ my-serverless
+ template.yaml
+ package.json
I don't know the ideal setup for package.json. Specifying "main": "index.js" was enough for me.
{
  "name": "my-module",
  "version": "1.0.0",
  "main": "index.js"
}
and this is the index.js
exports.ping = () => console.log('pong');
2. In the SAM template, point the Lambda layer to ../my-layers/my-module/
MyModuleLayer:
  Type: AWS::Serverless::LayerVersion
  Properties:
    LayerName: my-module
    ContentUri: ../my-layers/my-module/
    CompatibleRuntimes:
      - nodejs14.x
So far so good. But now, to get code completion and get rid of the crazy require(process.env.AWS ? '/opt/nodejs/common' : '../../layers/layer1/nodejs/common'); you had in func2internals.js:
3. Add the dependency to the my-serverless
a. either install from CLI:
npm i --save ../my-layers/my-module/nodejs/node_modules/my-module
b. or add in package.json dependencies
"dependencies": {
"my-module": "file:../my-layers/my-module/nodejs/node_modules/my-module",
}
4. Use my-module in your serverless function
var myModule = require('my-module');
exports.handler = async (event) => {
myModule.ping();
};
That's it. You have code completion, and it works in the local env, in sam local invoke, and in sam local start-api.
And don't forget to exclude my-layers/my-module/nodejs/node_modules from .gitignore :)
Below is my serverless.yml
service: serverless-typescript-example

provider:
  name: aws

package:
  individually: true

plugins:
  - serverless-plugin-typescript

functions:
  hello1:
    handler: hello1/src/index.handler
  hello2:
    handler: hello/src/index.handler
and my folder structure looks like below
hello1
  --index.ts
  --package.json
hello2
  --index.ts
  --package.json
package.json
serverless.yml
When I run sls package, it creates 2 zip archives in the .serverless folder, named hello1.zip and hello2.zip. Upon unzipping, both archives have identical contents, i.e. hello1 and hello2 with node_modules.
Is there any option to resolve this, and can we place the .zip file in the respective function folder, i.e. hello1.zip in hello1 and hello2.zip in hello2?
I haven't used serverless-plugin-typescript, but we use serverless-webpack and it does a pretty neat job. It reduced the lambda size a lot since it uses webpack to bundle.
There is also a serverless create template that uses the serverless-webpack plugin:
serverless create --template aws-nodejs-typescript
The zip files are placed by default in the .serverless folder. You can use the command below to specify a destination folder. I don't think there is a way to define a different destination for each entry point.
serverless package --package my-artifacts
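For orientation, here is a minimal webpack.config.js sketch for serverless-webpack; it assumes ts-loader and webpack-node-externals are installed, and the loader choice and output path are illustrative rather than the plugin's required setup.

// webpack.config.js -- minimal sketch for use with serverless-webpack
const path = require('path');
const slsw = require('serverless-webpack');
const nodeExternals = require('webpack-node-externals');

module.exports = {
  target: 'node',
  mode: slsw.lib.webpack.isLocal ? 'development' : 'production',
  entry: slsw.lib.entries,          // one entry per function handler
  externals: [nodeExternals()],     // keep node_modules out of each bundle
  module: {
    rules: [{ test: /\.ts$/, loader: 'ts-loader', exclude: /node_modules/ }],
  },
  resolve: { extensions: ['.ts', '.js'] },
  output: {
    libraryTarget: 'commonjs2',
    path: path.resolve(__dirname, '.webpack'),
    filename: '[name].js',
  },
};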
I'm trying to use the Serverless Framework to create a Lambda function that uses the open weather NPM module. However, I'm getting the following exception, even though my node_modules contains the specific library.
I have managed to run the sample (https://github.com/serverless/examples/tree/master/aws-node-rest-api-with-dynamodb) successfully, and am now hacking on it to add a node module that integrates the open weather API.
Endpoint response body before transformations: {"errorMessage":"Cannot find module 'Openweather-Node'","errorType":"Error","stackTrace":["Module.require (module.js:353:17)","require (internal/module.js:12:17)","Object.<anonymous> (/var/task/todos/weather.js:4:17)","Module._compile (module.js:409:26)","Object.Module._extensions..js
My code
'use strict';

const AWS = require('aws-sdk'); // eslint-disable-line import/no-extraneous-dependencies
var weather = require('Openweather-Node');

const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.weather = (event, context, callback) => {
  const params = {
    TableName: process.env.DYNAMODB_TABLE,
    Key: {
      id: event.pathParameters.id,
    },
  };

  weather.setAPPID("mykey");

  //set the culture
  weather.setCulture("fr");

  //set the forecast type
  weather.setForecastType("daily");

  const response = {
    statusCode: 200,
    body: "{test response}",
  };

  callback(null, response);
};
Did you run npm install in your working directory before doing your serverless deploy? The aws-sdk node module is available to all Lambda functions, but for all other node dependencies you must install them so they will be packaged with your Lambda when you deploy.
You may find this issue on the serverless repository helpful (https://github.com/serverless/serverless/issues/948).
I fixed this error by moving everything in package.json from devDependencies to dependencies.
Cheers
I don't know if it applies to this answer, but in case someone just needs a brain refresh: I forgot to export my handler, so the handler path was pointing at the file and looking for an export that didn't exist...
changed from this...
handler: foldername/exports
to this...
handler: foldername/exports.handler
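In other words, the handler string has to name both the file and the exported function. A minimal sketch of what "foldername/exports.handler" expects (the body is illustrative):

// foldername/exports.js
// The handler string "foldername/exports.handler" resolves to this file and
// looks for an exported function named `handler`.
exports.handler = async (event) => {
  return { statusCode: 200, body: 'ok' };
};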
I had the same problem with the Serverless Framework while deploying multiple Lambda functions. I fixed it with the following steps:
Whatever path you use for the handler, e.g. handler: foldername/exports.handler
Name the file inside the folder exports.js (i.e. whatever you named the handler file)
Run serverless deploy
This should solve your problem
I went to CloudWatch and looked for the missing packages, then ran npm i "missing package" and did sls deploy.
The missing packages need to be listed in dependencies; in my case some were in devDependencies and another was missing entirely.
You need to do the package deployment in case you have external dependencies.
Please see this answer
AWS Node JS with Request
Reference
http://docs.aws.amazon.com/lambda/latest/dg/nodejs-create-deployment-pkg.html
In some cases, don't forget to check your global serverless installation.
Mine was solved by reinstalling:
npm install -g serverless
In my case, what worked was switching to node 10 (via nvm). I was using a newer version of node (v15.14.0) than was probably supported by the packages.
My case was configuring params when creating an AWS Lambda function. The right string for the handler was (last row):
Resources:
  StringResourceName:
    Type: 'AWS::Serverless::Function'
    Properties:
      Handler: myFileName.handler
Where myFileName is the name of a file that I use as the zip file.
You have an issue with your .ts files: since the serverless-offline plugin cannot find an equivalent .js file, it throws a module-not-found error.
There is a workaround: install serverless-plugin-typescript. The only issue with that plugin is that it creates a new .build/.dist folder with the transpiled .js files.
I was doing something silly, but I still want to put it here so any beginner like me doesn't struggle with it. I copied the serverless.yml from an example where the handler value was
handler: index.handler
But my index.js was in the src folder, hence I was getting file not found. It worked after I changed the handler value to
handler: src/index.handler
For anyone developing python lambda functions with serverless-offline and using a virtual environment during local development, deactivate your environment, delete it entirely, and recreate it. Install all python requirements and try again. This worked for me.
For me, the issue was that the handler file name contained a dot.
main-handler.graphql.js caused serverless to fail with "Error: Cannot find module 'main'".
When I changed the file to main-handler-graphql.js, everything worked.
Architecture
I would like to share code between client and server side. I have defined aliases in the webpack config:
resolve: {
  // Absolute paths: https://github.com/webpack/webpack/issues/109
  alias: {
    server: absPath('/src/server/'),
    app:    absPath('/src/app/'),
    client: absPath('/src/client/'),
  }
},
Problem
Now on the server side I need to include webpack in order to resolve the correct paths when I require a file. For example,
require('app/somefile.js')
will fail in pure node.js because it can't find the app folder.
What I need (read the What I need updated section)
I need to be able to use the webpack aliases. I was thinking about making a bundle of all the server code without any files from node_modules. That way, when the server starts, it will use node_modules from the node_modules folder instead of a minified js file (Why? First: it doesn't work. Second: it's bad, because node_modules are compiled per platform, and I don't want my Windows files to end up on a Unix server).
Output:
A compiled server.js file without any node_modules included.
Let server.js use node_modules from the node_modules folder.
What I need updated
As I've noticed in https://github.com/webpack/webpack/issues/135, making a bundled server.js will mess up all the file paths used in IO operations.
A better idea would be to leave the node.js server files as they are, but replace the provided require method with a custom webpack require that takes configuration such as aliases (anything else?) into account. This could be done the way require.js did it to run on a node.js server.
What I've tried
By adding this plugin in webpack
new webpack.optimize.CommonsChunkPlugin(/* chunkName= */"ignore", /* filename= */"server.bundle.js")
Entries:
entry: {
  client: "./src/client/index.js",
  server: "./src/server/index.js",
  ignore: ['the_only_node_module'] // But I need to do that for every node_module
},
It will create a server.js file which contains only my server code, and then a server.bundle.js which is not used. The problem is that webpack includes the webpackJsonp function in the server.bundle.js file, so neither the client nor the server will work.
There should be a way to just disable node_modules on one entry.
What I've tried # 2
I've managed to exclude the path, but the requires don't work because they have already been minified: the source looks like require(3) instead of require('my-module'). Each require string has been converted to an integer, so it doesn't work.
In order for this to work I would also need to patch the require function that webpack exports to add the node.js native require function (this is easy to do manually, but should be done automatically).
What I've tried # 3
In the webpack configuration:
{target: "node"}
This only adds an exports variable (not sure what else it does; I've only diffed the output).
What I've tried # 4 (almost there)
Using
require.ensure('my_module')
and then replacing all occurrences of r(2).ensure with require. I don't know if the r(2) part is always the same, so this might not be automatable.
Solved
Thanks to ColCh for enlightening me on how to do it here.
require = require('enhanced-require')(module, require('../../webpack.config'));
By replacing the require method in node.js, all requires are passed through the webpack require function, which allows us to use aliases and other goodies. Thanks ColCh!
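After that reassignment, subsequent requires in the same module go through webpack's resolver, so the aliases defined in webpack.config apply in plain Node; a small sketch:

// With require replaced by enhanced-require, alias-based paths now resolve.
const somefile = require('app/somefile.js'); // resolved via the `app` alias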
Related
https://www.bountysource.com/issues/1660629-what-s-the-right-way-to-use-webpack-specific-functionality-in-node-js
https://github.com/webpack/webpack/issues/135
http://webpack.github.io/docs/configuration.html#target
https://github.com/webpack/webpack/issues/458
How to simultaneously create both 'web' and 'node' versions of a bundle with Webpack?
http://nerds.airbnb.com/isomorphic-javascript-future-web-apps/
My solution was:
{
  // make sure that webpack will externalize
  // modules using Node's module API (CommonJS 2)
  output: { ...output, libraryTarget: 'commonjs2' },

  // externalize all require() calls to non-relative modules.
  // Unless you do something funky, every time you import a module
  // from node_modules, it should match the regex below
  externals: /^[a-z0-9-]/,

  // Optional: use this if you want to be able to require() the
  // server bundles from Node.js later
  target: 'node'
}
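For context, a sketch of how that fragment might sit in a complete server-side config; the entry and output paths are illustrative, and absPath from the question is replaced with path.resolve:

// webpack.server.config.js -- minimal sketch
const path = require('path');

module.exports = {
  entry: { server: './src/server/index.js' },   // illustrative entry point
  target: 'node',                               // use Node built-ins, not browser shims
  output: {
    path: path.resolve(__dirname, 'build'),
    filename: '[name].js',
    libraryTarget: 'commonjs2',                 // export the bundle via module.exports
  },
  // bare module requests (node_modules) stay as runtime require() calls
  externals: /^[a-z0-9-]/,
  resolve: {
    alias: {
      server: path.resolve(__dirname, 'src/server/'),
      app: path.resolve(__dirname, 'src/app/'),
      client: path.resolve(__dirname, 'src/client/'),
    },
  },
};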