Serverless Framework with AWS Lambda error "Cannot find module" - node.js

I'm trying to use the Serverless Framework to create a Lambda function that uses the open weather npm module. However, I'm getting the following exception, even though my node_modules folder contains the library.
I have managed to run the sample (https://github.com/serverless/examples/tree/master/aws-node-rest-api-with-dynamodb) successfully, and I'm now trying to add a node module to integrate the open weather API.
Endpoint response body before transformations: {"errorMessage":"Cannot find module 'Openweather-Node'","errorType":"Error","stackTrace":["Module.require (module.js:353:17)","require (internal/module.js:12:17)","Object.<anonymous> (/var/task/todos/weather.js:4:17)","Module._compile (module.js:409:26)","Object.Module._extensions..js
My code
'use strict';

const AWS = require('aws-sdk'); // eslint-disable-line import/no-extraneous-dependencies
var weather = require('Openweather-Node');

const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.weather = (event, context, callback) => {
  const params = {
    TableName: process.env.DYNAMODB_TABLE,
    Key: {
      id: event.pathParameters.id,
    },
  };

  weather.setAPPID("mykey");

  //set the culture
  weather.setCulture("fr");

  //set the forecast type
  weather.setForecastType("daily");

  const response = {
    statusCode: 200,
    body: "{test response}",
  };

  callback(null, response);
};

Did you run npm install in your working directory before doing your serverless deploy? The aws-sdk node module is available to all Lambda functions, but all other node dependencies must be installed so they are packaged with your Lambda when you deploy.
You may find this issue on the serverless repository helpful (https://github.com/serverless/serverless/issues/948).

I fixed this error by moving everything from devDependencies to dependencies in package.json. By default the framework leaves devDependencies out of the deployment package, so anything your handler requires at runtime has to be listed under dependencies.
Cheers

I don't know if it applies to this question, but in case someone just needs a brain refresh: I forgot to export my handler and was pointing at the file itself, which was looking for a default export that didn't exist...
changed from this...
handler: foldername/exports
to this...
handler: foldername/exports.handler
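In other words, the file the handler string points at has to export a function with the matching name. A minimal sketch of foldername/exports.js (the names here just mirror the handler string above, and the response body is a placeholder):
'use strict';

// "foldername/exports.handler" resolves to the function exported as "handler" in foldername/exports.js
module.exports.handler = async (event) => {
  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'ok' }),
  };
};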

I had the same problem using the Serverless Framework to deploy multiple Lambda functions. I fixed it with the following steps:
Whatever path you set for the handler, e.g. handler: foldername/exports.handler
Name the file inside the folder exports.js (i.e. whatever name you used in the handler)
Run serverless deploy
This should solve your problem.

I went to CloudWatch and looked for the missing packages.
Then npm i "missing package" and sls deploy.
The missing packages need to be listed in dependencies; in my case some were under devDependencies and another was missing entirely.

You need to create a deployment package if you have external dependencies.
Please see this answer
AWS Node JS with Request
Reference
http://docs.aws.amazon.com/lambda/latest/dg/nodejs-create-deployment-pkg.html

In some cases, don't forget to check your global serverless installation.
Mine was solved by reinstalling:
npm install -g serverless

In my case, what worked was switching to node 10 (via nvm). I was using a newer version of node (v15.14.0) than was probably supported by the packages.

My case was configuring the params for creating an AWS Lambda function. The right string for the handler was (last row):
Resources:
  StringResourceName:
    Type: 'AWS::Serverless::Function'
    Properties:
      Handler: myFileName.handler
Where myFileName is the name of the file that goes into the deployment zip.

You have an issue with your .ts files: since the serverless-offline plugin cannot find the equivalent .js file, it throws a module-not-found error.
There is a workaround: install serverless-plugin-typescript. The only issue with the plugin is that it creates a new .build/.dist folder with the transpiled .js files.

I was doing something silly, but I still want to put it here so that any beginner like me doesn't struggle with it. I copied the serverless.yml from the example, where the handler value was
handler: index.handler
But my index.js was in the src folder, hence I was getting file not found. It worked after I changed the handler value to
handler: src/index.handler

For anyone developing python lambda functions with serverless-offline and using a virtual environment during local development, deactivate your environment, delete it entirely, and recreate it. Install all python requirements and try again. This worked for me.

For me, the issue was that the handler file name contained a dot.
main-handler.graphql.js caused serverless "Error: Cannot find module 'main'" (presumably because the extra dot confuses how the handler string is split into file name and exported function).
When I changed the file to main-handler-graphql.js everything worked.

Related

Vue Error - Can't resolve 'https' when importing package

I'm trying to make a Vue project and use an npm package for connecting to the retroachievements.org api to fetch some data, but I'm getting an error. Here's my process from start to finish to create the project and implement the package.
Navigate to my projects folder and use the vue cli to create the project: vue create test. For options, I usually chose not to include the linter, vue version 2, and put everything in package.json.
cd into the /test folder: cd test and install the retroachievements npm package: npm install --save raapijs
Modify App.vue to the following:
const RaApi = require('raapijs');

export default {
  name: 'App',
  data: () => ({
    api: null,
    user: '<USER_NAME>',
    apiKey: '<API_KEY>',
  }),
  created() {
    this.api = new RaApi(this.user, this.apiKey);
  },
}
Run npm run serve and get the error:
ERROR in ./node_modules/raapijs/index.js 2:14-30
Module not found: Error: Can't resolve 'https' in 'C:\Projects\Web\test\node_modules\raapijs'
I'm on Windows 10, Node 16.17.0, npm 8.15.0, vue 2.6.14, vue CLI 5.0.8, raapijs 0.1.2.
The first solution below says he can run it without error, but it looks like exactly the same code I'm trying. Can anyone see a difference and a reason for this error?
EDIT: I reworded this post to be more clear about my process and provide more info, like the versions.
This solution works for me. I installed raapijs with the npm install --save raapijs command. Then in my Vue version 2 component I used your code as follows:
const RaApi = require('raapijs');

export default {
  data: () => ({
    api: null,
    user: '<USER_NAME>',
    apiKey: '<API_KEY>',
  }),
  created() {
    this.api = new RaApi(this.user, this.apiKey);
  },
};
It seems the raapijs package was designed to be used in a Node environment rather than in Vue's browser-based environment, so that's the reason I was getting an error. The package itself was looking for the built-in https module in Node, but since it wasn't running in Node, it wasn't finding it.
So I solved my problem by looking at the package's GitHub repo, extracting the actual PHP API endpoints it was using, and calling those in my app directly rather than going through the package wrapper. Not quite as clean and tidy as I was hoping, but still a decent solution.
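For anyone wondering what that looks like, here is a rough sketch of calling an endpoint directly from the browser with fetch instead of going through raapijs. The endpoint path and query parameter names are placeholders, not the documented retroachievements.org API, so check the package's GitHub repo for the real ones:
export default {
  name: 'App',
  data: () => ({
    result: null,
    user: '<USER_NAME>',
    apiKey: '<API_KEY>',
  }),
  async created() {
    // fetch is built into the browser, so no Node-only 'https' module is needed
    const url = `https://retroachievements.org/API/SOME_ENDPOINT.php?z=${this.user}&y=${this.apiKey}`;
    const response = await fetch(url);
    this.result = await response.json();
  },
};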

How to fix "Error: /home/site/wwwroot/node_modules/canvas/build/Release/canvas.node: invalid ELF header" on NodeJs Azure Functions in Linux?

I am trying to deploy an Azure Function in Node.js but it doesn't work on Azure.
My application is a v3 function running on Linux.
When the deploy is completed, I get this 500 error:
Error:
/home/site/wwwroot/node_modules/canvas/build/Release/canvas.node:
invalid ELF header
It happens only when I have these imports:
import ChartDataLabels from 'chartjs-plugin-datalabels';

// chartCallback must be defined before it is passed to CanvasRenderService
const chartCallback = (ChartJS) => {
  ChartJS.register(require('chartjs-plugin-datalabels'));
};
const canvasRenderService = new CanvasRenderService(width, height, chartCallback);

const jsdom = require("jsdom");
const { JSDOM } = jsdom;
const { document } = (new JSDOM(`...`)).window;
Would someone help me please?
It works (only) on my machine :(
Edit: It works when I do the deploy via the Linux Subsystem.
I hope this will help somebody.
Azure Functions will not include node_modules when deploying to Azure, because the node_modules directory can be very large. You can include your package.json in your function directory and run npm install as you normally would for Node.js projects, using Kudu (https://<function_app_name>.scm.azurewebsites.net) or the Console in the Azure portal.
Check Dependency management for more information.
Refer here Link 1 & Link 2
Any updates on this topic?
Doesn't seem like a valid option for me to manually run npm install via Kudu or some other terminal in a Cloud Function App - especially with Continuous Deployment etc.
Got the same problem while using canvas for barcode generation...

Lambda layers node_modules

I am creating a Lambda layer, bundling up some dependencies, including node_modules. I am successfully creating the layer, but when I try to require a module from my code, the console tells me that the module cannot be found. Here is the code:
var Promise = require('promise');

module.exports.handler = function(event, context, callback) {
  new Promise(function (resolve, reject) {
    setTimeout(function() {
      callback(null, "helloWorld2");
    }, 9000);
  });
};
How can I reference node modules from a layer?
How are you running your lambda? If via sam cli, something like the below has worked for me as my template.yaml ...
example template
AWSTemplateFormatVersion: 2010-09-09
Transform: AWS::Serverless-2016-10-31
Description: example of node.js lambda function using layer for dependencies
Resources:
  ExampleFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: nodejs8.10
      CodeUri: nodejs/
      Handler: src/event.handler
      Layers:
        - !Ref NodeModulesLayer
  NodeModulesLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      Description: full set of function dependencies
      ContentUri: ./
      CompatibleRuntimes:
        - nodejs6.10
        - nodejs8.10
      LicenseInfo: 'Available under the MIT-0 license.'
      RetentionPolicy: Retain
pointing to local layer
The SAM developer guide includes a page on Working with Layers. At the time I'm writing this, they don't really get into how to reference layers at local file paths, and instead focus on references to remotely hosted layers.
The aspect I found tricky is that the directory structure of a node.js layer is expected to be ...
nodejs/
  node_modules/
... which means that in order for your locally installed node_modules directory to work as a layer, your package.json file must be nested inside a folder named nodejs.
Note the paths in the above example template.yaml:
ExampleFunction.Properties.CodeUri is set to nodejs/
ExampleFunction.Properties.Handler should be set to the path to your handler file, relative to nodejs/.
NodeModulesLayer.Properties.ContentUri is set to the folder that contains both the template.yaml file and the nodejs dir.
Which means my example is assuming the following structure ...
nodejs/
  node_modules/
  src/
    event.js
  package.json
template.yaml
preserve sam build support
One additional gotcha to be wary of ...
With respect to defining your function resource in template.yaml, there's some "flexibility" in terms of which parts of the path you put in CodeUri vs Handler. In some cases, doing ...
Properties:
  CodeUri: nodejs/src/
  Handler: event.handler
... works just as well as doing ...
Properties:
  CodeUri: nodejs/
  Handler: src/event.handler
BUT, if you're using the sam build command, the former will NOT work. That command expects to find package.json inside the CodeUri directory. So, stick with CodeUri: nodejs/ and use the Handler value to navigate through any additional folder hierarchy needed to reach your handler.
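For completeness, here is a minimal sketch of the src/event.js handler that Handler: src/event.handler points at. The body is illustrative; the point is that at runtime Lambda puts the layer's nodejs/node_modules on the require path, so a plain require() resolves from the layer:
'use strict';

// src/event.js - 'promise' is not packaged with the function; it resolves from the layer at runtime
const Promise = require('promise');

module.exports.handler = (event, context, callback) => {
  Promise.resolve('helloWorld2')
    .then((result) => callback(null, result))
    .catch((err) => callback(err));
};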
Try this simple example of how to set up a Lambda layer in Node.js:
https://medium.com/@anjanava.biswas/nodejs-runtime-environment-with-aws-lambda-layers-f3914613e20e
Dependencies in 2022...
It was quite hard for me to figure out whether this is still state of the art or whether dependencies can be handled more simply now. It turns out it is now possible to include dependencies without hacks or a complicated setup.
For me, creating the Lambda with new NodejsFunction() instead of new lambda.Function() did the trick. Yet it was super hard for me to find a working sample, so I decided to share a sample repo with you.
Sample repository
https://github.com/AntoniusGolly/cdk-lambda-typescript-boilerplate
What it does:
it has a lot of stuff you don't need to demonstrate the purpose (e.g. it produces an API Gateway based on a config.ts, so ignore that part if you don't need it)
it allows you to use TS and ES6
it allows you to use a dependency: i.e. I use node-fetch here
I just do cdk deploy and no other bundling
according to the docs, by default all node modules are bundled except for aws-sdk.
I don't know exactly how it works, but it does ;)
I hope this helps someone as I would have appreciated it.
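For reference, a rough sketch of what that looks like in a CDK stack, assuming aws-cdk-lib v2; the construct id and the entry path are placeholders, not taken from the repo above:
const path = require('path');
const { Stack } = require('aws-cdk-lib');
const { Runtime } = require('aws-cdk-lib/aws-lambda');
const { NodejsFunction } = require('aws-cdk-lib/aws-lambda-nodejs');

class ExampleStack extends Stack {
  constructor(scope, id, props) {
    super(scope, id, props);

    // NodejsFunction bundles the handler and its node_modules (e.g. node-fetch) with esbuild at deploy time,
    // so cdk deploy is enough - no separate layer or manual zipping required.
    new NodejsFunction(this, 'ExampleHandler', {
      runtime: Runtime.NODEJS_18_X,
      entry: path.join(__dirname, 'lambda', 'handler.js'), // placeholder path to your handler file
      handler: 'handler', // name of the exported function in that file
    });
  }
}

module.exports = { ExampleStack };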
In order to address your module you have to use a path with '/opt' as a prefix.
If it's your own file, which you packaged into myLib.zip with myLib.js inside, you should write:
const myModule = require('/opt/myLib');
If you are plugging in an installed dependency, then you upload node_modules.zip with the node_modules folder inside and address it as:
const lib = require('/opt/node_modules/lib_name');

azure-storage not working with web pack (azure-functions-pack) server side

I made an Azure Functions microservice using Node.js, and I'm using the npm module azure-storage to insert files into Blob Storage.
Locally it works fine, but when deploying to the development environment, a script is executed that runs azure-functions-pack and generates a bundle with the service code and all the required npm modules. Then, when making a request to the microservice, it returns a status code 500 and the logs show the following error:
System.Exception : Error: Cannot find module "."
at webpackMissingModule (D:\home\site\wwwroot\.funcpack\index.js:238044:68)
at Object.<anonymous> (D:\home\site\wwwroot\.funcpack\index.js:238044:147)
at __webpack_require__ (D:\home\site\wwwroot\.funcpack\index.js:21:30)
...
I only know that the problem is the azure-storage module, because if I comment out azureStorage = require('azure-storage'); then it starts working. I also tried the npm module fast-azure-storage without success, and so far I have not been able to find a workaround for this problem. The code that uses this module is the following:
const blobSvc = azureStorage.createBlobService(storageConnectionString);
const writeStream = blobSvc.createWriteStreamToBlockBlob('containerName', fileName);

return new Promise(function (resolve) {
  writeStream.write(svgString);
  writeStream.on('close', () => {
    resolve('https://' + storageAccount + '.blob.core.windows.net/containerName/' + fileName);
  });
  writeStream.end();
});
The version of azure-storage is 2.6.0. Thanks for any help.
Not a direct answer to your question - but you should use the output binding feature of Azure Functions to insert blobs instead of doing it manually with library calls.
If you do that, you won't have to import the package, so it will also solve your problem.
Read more about output bindings in the docs; there is a Node example there too.
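To make that concrete, here is a rough sketch using the Node.js programming model with a declarative blob output binding. The binding name outputBlob and the blob path are assumptions, and the matching entry has to be declared in the function's function.json:
// index.js - assumes function.json declares a blob output binding such as:
// { "type": "blob", "direction": "out", "name": "outputBlob",
//   "path": "containerName/{rand-guid}.svg", "connection": "AzureWebJobsStorage" }
module.exports = async function (context, req) {
  const svgString = req.body; // whatever content you want to persist
  context.bindings.outputBlob = svgString; // the runtime writes this to Blob Storage; no azure-storage import needed
  context.res = { status: 200, body: 'stored' };
};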
Actually the problem wasn't the azure-storage module but the node-chartist module, which for some reason was also causing problems in other modules. After removing node-chartist, all the modules started working perfectly.

node-canvas build for AWS Lambda

I'm a Linux & node noob. I'm trying to run FabricJS (which requires node-canvas) in AWS Lambda. I've been able to follow the instructions to get up and running on an AWS Linux EC2; however, Lambda has me at my wits' end. Anyone have any tips or pointers on how to get this compiled for AWS Lambda?
I found this issue in the Node Canvas GitHub site. The questioner was trying to run FabricJS in Lambda as well. Here is the relevant section with an answer:
Make sure you're compiling this on the same AMI that lambda currently uses:
http://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html
Lambda runs at /var/task (that's the path when you unzip), so something.so at the root of the zip file will be at /var/task/something.so.
We then want to build our libraries using an "rpath":
export LDFLAGS=-Wl,-rpath=/var/task/
Compile libraries according to: https://github.com/Automattic/node-canvas/wiki/Installation---Amazon-Linux-AMI-%28EC2%29
You may want to set the prefix= ~/canvas to have all the files in one place.
Install node-canvas with npm
cd node_modules/canvas; node-gyp rebuild
mkdir ~/pkg and cp the .so files (~/canvas/lib/*.so) there, using -L to dereference symlinks.
scp the pkg directory to the local lambda folder, possibly putting the files in the right places. (.so in root, node_modules/canvas with other libs). You'll probably want to rearrange this.
Here is a gulp plugin which can upload your files along with node-canvas and its dependency binaries built specifically for AWS Lambda.
NPM Package
'use strict';
// This is a sample gulp file that can be used.
// npm install --save gulp gulp-zip gulp-awslambda
const gulp = require('gulp');
const zip = require('gulp-zip');
const path = require('path');
const lambda = require('gulp-awslambda');
const aws_lambda_node_canvas = require('./');

let runtime = 'nodejs4.3'; // nodejs or nodejs4.3

const lambda_params = {
  FunctionName: 'NodeCanvas', // Lambda function name
  Description: 'Node canvas function in aws lambda', // Description for your lambda function
  Handler: 'main.lambda_handler', // Assuming you provide a main.js file exporting a function called lambda_handler
  MemorySize: 128,
  Runtime: runtime,
  Role: 'ROLE_STRING', // eg: 'arn:aws:iam::[Account]:role/lambda_basic_execution'
  Timeout: 50
};

var opts = {
  region: 'ap-southeast-2'
};

gulp.task('default', () => {
  return gulp.src(['main.js', '!node_modules/**/*', '!dist/**/*', '!node_modules/aws-lambda-node-canvas/**/*']) // Your src files to bundle into aws lambda
    .pipe(aws_lambda_node_canvas({runtime: runtime})) // Adds all the required files needed to run node-canvas in aws lambda
    .pipe(zip('archive.zip'))
    .pipe(lambda(lambda_params, opts))
    .pipe(gulp.dest('dist')); // Also place the uploaded file
});
