I am having trouble adding external NPM packages to my Serverless Framework project (the specific package is geopoint).
I went to the root folder of the Serverless project and ran npm install geopoint --save. package.json was updated with "dependencies": { "geopoint": "^1.0.1" } and a node_modules folder was created.
My folder structure looks like this:
root-project-folder
-functions
--geospatial
---handler.js
-node_modules
--geopoint
In functions/geospatial/handler.js I tried requiring the geopoint module with each of the following:
var geopoint = require('geopoint');
var geopoint = require('../../geopoint');
var geopoint = require('../../../geopoint');
The Lambda console returns this error:
{
  "errorMessage": "Cannot find module '../../geopoint'",
  "errorType": "Error",
  "stackTrace": []
}
How can I properly add external NPM modules to a Serverless Framework project?
I think what you are experiencing is the same as what I was experiencing recently. I could install npm packages in my application root directory, but nothing would get deployed to lambda.
My understanding is that serverless deploys everything under each component directory (subdirectory under the application root). In your case, under functions.
I could not find much about this in the Serverless documentation, but what I did was define a package.json file under my functions folder and then run npm install in that subdirectory. After deploying to Lambda, the node_modules under this directory got deployed too, meaning that my function code could require any of these npm modules.
So, your folder structure should now look like this:
root-project-folder
|-functions
|--package.json
|--node_modules
|---geopoint
|--geospatial
|---handler.js
|-package.json
|-node_modules
|--geopoint
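For example, a minimal functions/package.json could look like this (the name and version are placeholders; the geopoint range is the one from the question):

{
  "name": "functions",
  "version": "1.0.0",
  "dependencies": {
    "geopoint": "^1.0.1"
  }
}

Then install from inside that folder:

cd functions
npm install

After that, require('geopoint') in handler.js should resolve against functions/node_modules.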
A further benefit is that you deploy only the npm dependencies your functions need, without those that Serverless itself needs to deploy your resources.
Hopefully that helps. Once again, I am not sure this is best practice; it is just what I do, because this is not documented anywhere that I could find in the Serverless repository or in any example code.
For me, the best solution was the Serverless plugin serverless-plugin-include-dependencies.
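If it helps, a plugin like this is normally registered under the plugins section of serverless.yml once it has been installed with npm; a minimal sketch:

# serverless.yml
plugins:
  - serverless-plugin-include-dependencies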
You can do the following:
# serverless.yml
custom:
  webpack:
    includeModules:
      packagePath: '../package.json' # relative path to custom package.json file.
Reference document
If someone runs into this trouble and none of the answers above helps, try this one (it worked for me):
custom:
  webpack:
    webpackIncludeModules: true
I am new to Node.js, npm, and AWS Lambda. Suppose I have a custom sort algorithm that I want to use in several Lambda functions. This should be something very simple, but the only way I have found is to create a layer with a Node module published as an npm package. I do not want any of the code to be published on npm.
I tried downloading the layer I am currently using and creating a folder in node_modules alongside the other packages that are published on npm. In that folder I ran
npm init
filled in all the info for the package.json, and created the function code in an index.js:
'use strict'
exports.myfunc = myfunc;
function myfunc() {
  console.log("This is a message from the demo package");
}
Then I zipped the whole layer again, uploaded it as version 2 of the layer, picked the new version in a Lambda function, and called it as I would any other third-party node module, like this:
const mypack = require('mypack');
mypack.myfunc();
but it tells me:
"errorMessage": "Error: Cannot find module ... \nRequire stack:\n- /var/task/index.js\n- /var/runtime/UserFunction.js\n- /var/runtime/index.js",
I think it may be because the layer's nodejs folder contains a package-lock.json and my module is not listed there. I tried adding it there without the "resolved" and "integrity" fields that the npm-published packages have, but it did not work.
Is there a way to simply upload the code I want into a Node.js layer without involving npm at all?
I would like to try something like this (from the accepted answer to the question linked below):
var moduleName = require("path/to/example.js")
How to install a node.js module without using npm?
but I do not know the path of a layer's modules. The __dirname that Lambda shows is /var/task, but it looks like every Lambda has the same path. I am lost...
I was able to import JS from a folder in the following manner: download the zip of the layer, uncompress it, put any additional folder with your custom JS into node_modules, and upload it as a new version of the layer.
In the Lambda function, reference it with
moduleName = require("path/to/example.js")
Finding that path is the trick here. You can do it with a known library that is already in your layer and working; in my case I used base64-js, and I returned the path of that library like this:
require.resolve('base64-js')
That returned
'/opt/nodejs/node_modules/base64-js/index.js'
So I used
moduleName = require("/opt/nodejs/node_modules/MYCUSTOMFOLDER/index.js")
and that was it...
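Putting that together, here is a minimal sketch of a hand-rolled layer module (the folder and function names are made up for illustration). Inside the layer zip, at nodejs/node_modules/mycustomsort/index.js:

'use strict';

// Trivial stand-in for the custom sort algorithm.
exports.sortNumbers = function (items) {
  return items.slice().sort(function (a, b) { return a - b; });
};

and in the Lambda function that has the layer attached:

// index.js of the Lambda function
const mysort = require('/opt/nodejs/node_modules/mycustomsort/index.js');

exports.handler = async (event) => {
  return mysort.sortNumbers(event.numbers || []);
};

Since /opt/nodejs/node_modules is on the runtime's module search path, require('mycustomsort') by name should also work once the folder contains an index.js (or a package.json with a main entry).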
I am trying to deploy a function to Google Cloud Functions. I based it on their ImageMagick tutorial.
Every time, the function fails to deploy because it reaches an error. Looking at the log, the error is:
Provided module can't be loaded.
Did you list all required modules in the package.json dependencies?
Detailed stack trace:
Error: Cannot find module 'sharp'
I can't figure out why this is happening, because sharp is in my package.json dependencies. If I open the web editor for the function in the Google Cloud console, the package.json is there as one of the files and shows sharp as a dependency. I tried running npm install and npm install --save and re-deploying, and that hasn't fixed anything.
I'm including the package in the function with const sharp = require('sharp'); (this is the line where the log shows the error occurring), and this is my package.json:
{
  "name": "Resize images",
  "version": "0.0.1",
  "private": true,
  "author": "James Tyner",
  "engines": {
    "node": ">=10.0.0"
  },
  "dependencies": {
    "@google-cloud/storage": "^5.0.0",
    "sharp": "^0.25.4"
  }
}
Can you help me figure out what I'm doing wrong?
This has happened to me many times, because I was tricked into installing packages in the project directory. That works fine locally but causes an error when you try to deploy.
It worked for me when I changed directory into the functions folder, instead of the Firebase project folder, and installed the package there:
cd functions
npm install [your missing package] --save
I was running into this issue: various dependencies were causing my function deployment to fail. After a bit of digging, I found that the peer dependencies were not being included.
Adding this fixed my issue:
"scripts": {
...
"gcp-build": "npm i npm-install-peers"
},
Checking the docs: the gcp-build command allows us to perform a custom build step during the function build process.
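For reference, the scripts block sits alongside the dependencies in package.json; a sketch (the name and versions here are illustrative, not the poster's actual file):

{
  "name": "resize-images",
  "version": "0.0.1",
  "scripts": {
    "gcp-build": "npm i npm-install-peers"
  },
  "dependencies": {
    "@google-cloud/storage": "^5.0.0",
    "sharp": "^0.25.4"
  }
}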
Somehow I was able to address the issue, but I don't fully understand what I did differently. I found that the dependencies listed in package.json weren't being installed when I ran npm install, so I created a separate folder and copied my code there, ran npm install in the new folder, and it worked well from there. Since then, the dependencies have been working properly when I change them and re-deploy the function.
Using Node v12.13.1 and a serverless deployment with webpack to GCP Cloud Functions, I struggled with this issue too, though in my case it was a different module. The problem is that no module from node_modules can be required or imported. The reason becomes clear if you look at the webpack zip file in the .serverless directory: it seems that with GCP, nothing but the file denoted as "main" in package.json (typically index.js) is actually included.
The solution was to adapt webpack.config.js to explicitly include the missing files.
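As an illustration (a sketch with assumed entry and output paths, not necessarily the exact config used there), one way is to stop treating node_modules as external, so webpack bundles every require()d module into the single output file:

// webpack.config.js - minimal sketch; entry and output paths are assumptions.
const path = require('path');

module.exports = {
  target: 'node',
  mode: 'production',
  entry: './index.js',
  output: {
    // commonjs2 keeps module.exports intact so the platform can find the handler
    libraryTarget: 'commonjs2',
    path: path.resolve(__dirname, '.webpack'),
    filename: 'index.js',
  },
  // No "externals" entry: webpack pulls all required modules into the bundle.
};

Native modules (like sharp elsewhere on this page) generally cannot be bundled this way and still need to be shipped as real node_modules.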
I hope there is an IBM Bluemix wizard watching that can answer this.
I have an application, written in Meteor, which I am trying to deploy to Bluemix. The application contains this line:
var AdmZip = Npm.require('adm-zip');
which of course means that the application uses the adm-zip package to do stuff. When I try to deploy the application via DevOps Services, it fails with this error:
ERR Error: Cannot find module 'adm-zip'
in the logs. If I remove the Npm.require line, the application deploys fine, but of course doesn't work correctly because adm-zip is not there.
My package.json file contains, among other things, this entry:
"dependencies": {
"adm-zip": "*"
},
which I believe should be sufficient to load the adm-zip package. I've also tried specifying a Git URL for adm-zip but the results are the same.
Does anyone know what I have to do to get this application to deploy correctly?
Looking at the Meteor documentation, the following line...
// import a global NPM package
var Spooky = Npm.require('spooky');
...tries to import a global NPM package (installed with the -g flag).
There is a plugin for Meteor that handles NPM integration.
Install this module with the following command:
$ meteor add meteorhacks:npm
If you have correctly set up the package dependency in packages.json, you can use the following to import and use the spooky package:
// This method loads NPM modules you've specified in the packages.json file.
var Spooky = Meteor.require('spooky');
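Applied to the original adm-zip question, the root-level packages.json that meteorhacks:npm reads would list the package (the version number here is only an example):

{
  "adm-zip": "0.4.7"
}

and the application code would then use Meteor.require('adm-zip') instead of Npm.require('adm-zip').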
I'm using this API for a web app I'm making that uses Mailchimp:
Here's the Node.js API page
I'm also using this Git repo to understand how to use the API: example repo
I cloned the repo, ran npm install express in the express directory of the repo, and then ran node app.
When I did that, I got this error: Error: Cannot find module './node_modules/mailchimp-api/mailchimp'
The require statement that names this module (in index.js) is:
var mcapi = require('./node_modules/mailchimp-api/mailchimp');
I checked the path, and it should be correct. Is there something I'm missing?
You need to run npm install in the express directory of the repo so that all of the modules are installed. You are missing the mailchimp module.
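In other words, from the repo's express directory, something like:

cd express
npm install   # installs every dependency listed in package.json, including mailchimp-api
node app

After that install, the existing require('./node_modules/mailchimp-api/mailchimp') line should resolve (although requiring the package by name is the more conventional style).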
I have an application successfully working locally, so I know the code works. However, when I deploy to Nodejitsu, I get an error that it cannot find a local module. Here is what I have:
File Setup:
/index.js
/config/config.js
index.js
var cfg = require('./config/config.js');
When I try to deploy, Nodejitsu gives me an error:
Error: Cannot find module './config/config.js'
Since all this code works locally, I do not believe this is a coding issue. I am under the impression that local modules do not need to be included in package.json, but perhaps they do for Nodejitsu? I read their documentation but cannot find anything specific to local modules.
Thanks!
Local modules like this should work properly, so long as you don't have them in .gitignore or .npmignore.
Modules in the node_modules directory require that you add it to the bundledDependencies array in your package.json file.
An easy way to check whether the file is included in your deploy is to run tar -tf $(npm pack).
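For the node_modules case mentioned above, the relevant part of package.json would look something like this (the module name is a placeholder):

{
  "name": "my-app",
  "version": "0.1.0",
  "bundledDependencies": [
    "my-local-module"
  ]
}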
I had this exact same error on deploy, but caused by a different root cause. In case anybody stumbles into the same problem:
File Setup:
/public/Data/TargetData.js
app.js require statement:
var target = require('./public/data/TargetData.js');
My local Mac OS X environment tolerated the capitalization difference between /data/ and /Data/; the Nodejitsu server did not.
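In other words, making the require path match the on-disk casing fixes it:

var target = require('./public/Data/TargetData.js');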