I am new to Node.js, npm, and AWS Lambda. Suppose I have a custom sort algorithm that I want to use in several Lambda functions. This should be something very simple, but the only way I have found is to create a layer with a Node module published as an npm package. I do not want any of the code to be published to npm.
Here is what I tried: I downloaded the layer I am currently using, created a folder in node_modules alongside the other packages that are published on npm, and in it ran
npm init
filled in all the info for the package.json
and created the function code in an index.js:
'use strict';
exports.myfunc = myfunc;
function myfunc() {
  console.log("This is a message from the demo package");
}
zipped the whole layer again and uploaded it as version 2 of the layer
picked the new version in a Lambda function and called it as I would any other third-party node module, like this:
const mypack = require('mypack');
mypack.myfunc();
but it tells me:
"errorMessage": "Error: Cannot find module ... \nRequire stack:\n- /var/task/index.js\n- /var/runtime/UserFunction.js\n- /var/runtime/index.js",
I think it may be because, in the layer's nodejs folder, there is a package-lock.json and my module is not listed in it. I tried to add it there without the "resolved" and "integrity" fields that the packages published on npm have, but it did not work.
Is there a way to simply upload the code I want into a Node.js layer without involving npm at all?
I would like to try something like this, as in the accepted answer to the question below,
var moduleName = require("path/to/example.js")
How to install a node.js module without using npm?
but I don't know what the path to a layer's modules is. The path that __dirname shows inside a Lambda is /var/task, but it looks like every Lambda has the same path. I am lost...
I was able to import JS from a folder in the following manner:
download the zip of the layer, uncompress it, put any additional folder with your custom JS inside node_modules, and upload it as a new version of the layer.
In the Lambda function, reference it with
moduleName = require("path/to/example.js")
You can find that path, which is the trick here, by using a known library that is already in your layer and working. In my case I used base64-js, and I got the path of that library like this:
require.resolve('base64-js')
That returned
'/opt/nodejs/node_modules/base64-js/index.js'
So I used
moduleName = require("/opt/nodejs/node_modules/MYCUSTOMFOLDER/index.js")
and that was it...
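For reference, here is a sketch of how the pieces fit together (MYCUSTOMFOLDER is whatever you named your folder; the NODE_PATH remark is based on how the Lambda Node.js runtime resolves layer paths, so verify it on your runtime):
// Layer zip layout (sketch):
//   nodejs/
//     node_modules/
//       MYCUSTOMFOLDER/
//         index.js      <- plain JS, e.g. exports.myfunc = ...

// In the Lambda handler:
const mypack = require('/opt/nodejs/node_modules/MYCUSTOMFOLDER/index.js');
// Since /opt/nodejs/node_modules is on the runtime's NODE_PATH, a bare
// require('MYCUSTOMFOLDER') should also resolve without the absolute path.
exports.handler = async () => {
  mypack.myfunc();
  return 'done';
};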
Related
According to the AWS blog, use of ES modules in Lambda is supported as of the Node.js 14 runtime.
Announcement - https://aws.amazon.com/about-aws/whats-new/2022/01/aws-lambda-es-modules-top-level-await-node-js-14/
Example - https://aws.amazon.com/blogs/compute/using-node-js-es-modules-and-top-level-await-in-aws-lambda/
I have checked that the Lambda function runtime is Node 14 (and tried switching to Node 18, with no difference). I verified this via the Lambda console once I'd pushed the code, and confirmed that the version changes to 18 when the setting in the Amplify config is changed.
I won't go into the detail of how I got here, other than that I need to use an npm package that is written in ESM syntax.
As a sanity check and as a minimal reproducible example, I generated a new simple hello world function with the Amplify CLI, ran it with amplify mock function test --event src/event.json, and confirmed it runs OK. But when I change the package.json to "type":"module" I get:
stack: 'Error: Could not load lambda handler function due to Error [ERR_REQUIRE_ESM]: Must use import to load ES Module: /[redacted]/amplify/backend/function/test/src/index.js\n' +
'require() of ES modules is not supported.\n' +
'require() of /[redacted]/amplify/backend/function/test/src/index.js from /snapshot/repo/build/node_modules/amplify-nodejs-function-runtime-provider/lib/utils/execute.js is an ES module file as it is a .js file whose nearest parent package.json contains "type": "module" which defines all .js files in that package scope as ES modules.\n'
I get this same error whether I exercise the function from the amplify mock function CLI, the Lambda console, or by accessing the API gateway that links to the Lambda function.
Beyond the blog posts linked above, I can't find any other mention or examples of using ES modules with Lambda.
If you want to do this yourself:
Install the Amplify CLI
amplify init
amplify add function, name it test, and choose the Node.js Hello World template
amplify mock function test --event src/event.json and it will work
Change amplify/backend/function/test/src/package.json to include "type": "module"
amplify mock function test --event src/event.json and it will fail
Optionally, you can push the application to AWS and test the Lambda through the Lambda console; you should get the same results.
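For comparison, the ESM-style handler that the Lambda Node.js 14+ runtime itself accepts (per the AWS posts linked above) looks roughly like this; a sketch, not the exact Amplify template:
// amplify/backend/function/test/src/index.js, with "type": "module" set
export const handler = async (event) => {
  console.log(`EVENT: ${JSON.stringify(event)}`);
  return {
    statusCode: 200,
    body: JSON.stringify('Hello from Lambda!'),
  };
};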
Related issues:
https://github.com/aws-amplify/amplify-cli/issues/10437: Same issue in that the module being imported is ESM, resolved in that case by a version update to the module that added CJS support
https://github.com/aws-amplify/amplify-cli/issues/5691: This relates to the root project being ESM ("type": "module" in package.json) and monkey-patching the package.json via Amplify hooks, but doesn't address the issue of importing an ESM package. There is a comment at the bottom claiming it's fixed in Amplify CLI 10.2.3, but that may only address the root package issue, not the Lambda sub-project where I'm encountering it.
https://github.com/aws-amplify/amplify-cli/issues/10432: Relates to the use of 'mock function', but the problem applies to both mocked and deployed functions. The steps outlined there are about continuing to generate CJS output from TypeScript, converting the TS code's ESM-style import/export. Because the code that Amplify runs is CJS, if you try to import an ESM package it will fail.
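One general Node.js workaround, separate from the issues above: a handler kept in CommonJS can still load an ESM-only dependency with dynamic import(), which works inside CJS on Node 14+. A sketch (the package name is a placeholder, and I have not verified this under amplify mock's packaged runtime):
// index.js kept as CommonJS (no "type": "module"), loading an ESM-only package
exports.handler = async (event) => {
  // dynamic import() is allowed in CJS; 'some-esm-only-package' is hypothetical
  const { someExport } = await import('some-esm-only-package');
  return {
    statusCode: 200,
    body: JSON.stringify(someExport(event)),
  };
};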
I have made a very simple npm package to support my discord bot.
In my bot code, I am trying to use a function from my package. However, when I launch the bot, I receive this error:
Error: ENOENT: no such file or directory, open 'prefixes.json'
I have a prefixes.json file in the main dir of the npm package. This code in the package (which throws the error) is executed:
const contents = fs.readFileSync(`prefixes.json`);
const jsonPrefixes = JSON.parse(contents);
This code is executed when I turn on my Discord bot, which depends on this package. prefixes.json is in the same dir as the index.js of my npm package. I tried ./prefixes.json and prefixes.json, neither of which worked.
Is the error because the package tries to search in my bot's dir instead of its own? How do I overcome this?
Update: When I tried ./node_modules/kifo/prefixes.json it worked, but I don't want it like that - is there a way to provide a path relative to the package?
You need to use require() instead of fs.readFileSync():
const jsonPrefixes = require('./prefixes.json');
const contents = JSON.stringify(jsonPrefixes); // you don't actually need this
Why?
The reason fs.readFileSync() must behave the way it does is that file APIs in all programming languages behave that way: relative paths are resolved against the current working directory. Say, for example, you write a program called dump. The working directory must be the one the user is currently in, because otherwise if you do:
> cd /my/folder
> ls
test.txt
> dump test.txt
Error: cannot open /path/to/node_modules/dump/test.txt
Of course YOU DO NOT EXPECT THIS. Nor should you. You should not expect fs.readFileSync to use its own module directory to open files from.
On the other hand, require() was designed to load JavaScript modules, some of which are from your own project. So require() will open files relative to the directory the requiring code is in.
The require() function can load either javascript code or a JSON file. So in your case you can use require().
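A quick way to see the two different resolution bases (just a debugging sketch):
// fs resolves relative paths against the process's current working directory:
console.log('cwd      :', process.cwd());
// require resolves './...' against the directory of the file doing the requiring:
console.log('__dirname:', __dirname);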
What if it's not JSON?
If you cannot use require(), you can use the __dirname variable. It is a special variable that contains the directory of the currently executing module:
const contents = fs.readFileSync(`${__dirname}/prefixes.json`);
const jsonPrefixes = JSON.parse(contents);
However for JSON I still prefer to use require().
Currently I am playing around with Electron using vue-cli-plugin-electron-builder alongside a simple Vue project. This is the project: https://github.com/nklayman/vue-cli-plugin-electron-builder .
vue create my-project
cd my-project
vue add electron-builder
npm run electron:serve
My goal is to add a simple plugin-like architecture. The app serves only base functionality but can be extended with "plugins". Those plugins therefore are not included in the build, but are loaded at runtime by Electron. I would prefer those plugins to behave just like node modules ( module.exports = ... ) with their own dependencies ( probably with a package.json file inside ). I would locate those plugins at app.getPath('userData') + '/Plugins'.
I looked at a few approaches on how to tackle this problem :
1. Using the Node.js vm module
First, I tried using the Node.js vm module to read and execute a script from an external file, all at runtime. It works great so far, although I am not able to use external dependencies inside those loaded scripts. If I want to use external dependencies inside the plugin scripts, those dependencies must have been included in the Electron build beforehand. This somewhat defeats the whole purpose of having plugins ... only vanilla JS + Node.js built-in modules are possible.
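For illustration, a minimal sketch of that vm approach (the plugin layout and names are made up):
const fs = require('fs');
const path = require('path');
const vm = require('vm');
const { app } = require('electron');

function loadPluginWithVm(name) {
  const file = path.join(app.getPath('userData'), 'Plugins', name, 'index.js');
  const code = fs.readFileSync(file, 'utf8');
  // Give the script a tiny CommonJS-like shim. Passing the host `require`
  // only exposes modules already bundled into the app, which is exactly the
  // limitation described above.
  const sandbox = { module: { exports: {} }, exports: {}, console, require };
  vm.runInNewContext(code, sandbox, { filename: file });
  return sandbox.module.exports;
}
// usage: const myPlugin = loadPluginWithVm('my-plugin');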
2. Using global.require
I saw this solution in another SO answer.
Using node require with Electron and Webpack
Webpack/electron require dynamic module
They say to use global.require, but it throws an error saying global.require is not a function. The solution looked promising at first, but somehow I can't get it to work.
3. Simply using require
Of course I had to try it. When I try to require an external module from a non-project location, it won't find the module, even if the path is correct. Again, the path where I am trying to locate the module should be app.getPath("userData"), not the project's root directory. When I locate the plugins inside the root directory of the project, however, they get included in the build. This again defeats the purpose of having plugins.
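For reference, the plain-require attempt looks roughly like this (a sketch); note that webpack rewrites require calls at build time, which is why, inside a webpack bundle, an escape hatch such as webpack's __non_webpack_require__ is usually needed for truly dynamic paths:
const path = require('path');
const { app } = require('electron');

const pluginDir = path.join(app.getPath('userData'), 'Plugins', 'my-plugin');

// In a plain (non-bundled) Electron main process this works, provided pluginDir
// contains an index.js or a package.json with a "main" field:
const plugin = require(pluginDir);

// Inside a webpack bundle, the untouched runtime require is needed instead:
// const plugin = __non_webpack_require__(pluginDir);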
Goal
So far, I haven't found a viable solution to this. I simply want my Electron app to be extensible with basic node modules at runtime ( following a pre-defined schema to keep things simple ). Of course there is Atom, made with Electron, which uses its own apm manager to install and load plugins, but that seems way too overpowered. It's enough for me to have plugin files located locally; a public "marketplace" is not a goal. Also, it's OK if the app has to reload / restart to load plugins.
Any ideas ?
After more and more research I stumbled upon 2 packages:
https://www.npmjs.com/package/live-plugin-manager
https://github.com/getstation/electron-package-manager
both integrate npm to programmatically handle package installation at runtime. I settled on live-plugin-manager for now since it's better documented and even allows package installation from the local file system.
Pro
I was able to integrate the system out-of-the-box into a vanilla electron app. Works like a charm.
Cons
I was not able to use it inside a Vue Electron boilerplate (like the one I said I was using in the OP), since webpack interferes with the require environment. But there surely is a solution to this.
Update: I was able to get it to work eventually inside a webpack-bundled Electron Vue boilerplate. I had accidentally mixed import and require. The following code works for me using live-plugin-manager:
// plugin-loader.js
const path = require('path');
const { app } = require('electron');
const { PluginManager } = require('live-plugin-manager');

const pluginInstallFolder = path.resolve(app.getPath('userData'), '.plugins');
// point the manager at the install folder (live-plugin-manager's pluginsPath option)
const pluginManager = new PluginManager({ pluginsPath: pluginInstallFolder });

module.exports = async (pkg) => {
  // installs pkg from npm into pluginInstallFolder
  await pluginManager.install(pkg);
  // load it the way require() would and hand it back
  const plugin = pluginManager.require(pkg);
  return plugin;
};
// main.js
const pluginLoader = require('./plugin-loader');
pluginLoader("moment").then((moment) => {
console.log(moment().format());
})
This will install the "moment" package from npm at runtime into a local directory and load it into the app, without bundling it into the executable files.
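Since local file-system installation was part of the appeal, here is a sketch of that variant inside plugin-loader.js (the method name is taken from live-plugin-manager's API; verify it against the version you install):
// Sketch: install a plugin from a local folder instead of npm.
// 'my-plugin' must match the "name" field in that folder's package.json.
async function loadLocalPlugin() {
  const localPluginPath = path.join(app.getPath('userData'), '.plugins-src', 'my-plugin');
  await pluginManager.installFromPath(localPluginPath);
  return pluginManager.require('my-plugin');
}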
How do I include dependencies in J2V8? I would like to use certain dependencies in the JavaScript file, for instance the crypto package.
var crypto = require('crypto');
function foo(){ return crypto.createHash('md5').update('Apple').digest("hex");}
However, I got the following error saying the require keyword is undefined.
undefined:1: ReferenceError: require is not defined
var crypto = require('crypto');
^
ReferenceError: require is not defined at <anonymous>:1:14
com.eclipsesource.v8.V8ScriptExecutionException
at com.eclipsesource.v8.V8._executeVoidScript(Native Method)
Can anyone tell me how to import a package into J2V8?
Unless you're working with Node, require is not a built-in feature. Usually, you want to use a bundler like webpack to pack your structured source code into one large file so that it can be understood by browsers. This way you can use require and npm packages for your frontend code, which makes development easier, and the bundler turns it with every build (or live update) into a different format that's hard for humans to read but is valid JavaScript.
I have had success using node modules in J2V8; please check out this blog: http://eclipsesource.com/blogs/2016/07/20/running-node-js-on-the-jvm/
NodeJS nodeJS = NodeJS.createNodeJS();
After registering callbacks:
nodeJS.exec(theScriptToExecute); // theScriptToExecute is a java.io.File pointing at the script
Make sure you have the proper path to the node modules in the require() command.
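For example, the script handed to exec() can require Node built-ins directly, while third-party packages need to be reachable from the script's location or given as an explicit path (the paths below are illustrative):
// script.js - run on the embedded Node runtime via nodeJS.exec(new File("script.js"))
const crypto = require('crypto'); // built-in, resolves automatically

// A third-party package must live in a node_modules folder next to this file,
// or be required via an explicit (hypothetical) path:
const lodash = require('/absolute/path/to/your-project/node_modules/lodash');

console.log(crypto.createHash('md5').update('Apple').digest('hex'));
console.log(lodash.VERSION);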
You may have to make a Node.js package that pulls in the dependencies and exports what you need, and then run npm install manually.
Or you can just npm install what you need.
Create the Node.js runtime, and use require with your-package-dir/index.js or the exact location of the module that you need, like this:
V8Object module = nvm.require(file); // nvm is the NodeJS runtime created above, file is a java.io.File
Now you can call the function like this:
module.executeJSFunction("test");
To deliver the entire set of dependencies you have to bundle the module directory yourself.
If you have to support cross-platform builds, also refer to https://www.npmjs.com/package/node-pre-gyp.
My situation is that I am having a bit of trouble adding external npm packages to my Serverless Framework project (the specific package is geopoint).
I went to the root folder of the Serverless project and ran npm install geopoint --save. package.json got updated with "dependencies": { "geopoint": "^1.0.1" } and a node_modules folder was created.
My folder structure looks like this:
root-project-folder
-functions
--geospatial
---handler.js
-node_modules
--geopoint
In my functions/geospatial/handler.js I tried requiring the geopoint module with:
var geopoint = require('geopoint');
var geopoint = require('../../geopoint');
var geopoint = require('../../../geopoint');
The lambda console returns an error of:
{
"errorMessage": "Cannot find module '../../geopoint'",
"errorType": "Error",
"stackTrace": []
}
How can I properly add external NPM modules to a Serverless Framework project?
I think what you are experiencing is the same as what I was experiencing recently. I could install npm packages in my application root directory, but nothing would get deployed to lambda.
My understanding is that serverless deploys everything under each component directory (subdirectory under the application root). In your case, under functions.
I could not find much in the serverless documentation around this, but what I did was define a package.json file under my functions folder and then run npm install in that subdirectory. Then, after deploying to Lambda, the node_modules under this directory got deployed too, meaning that my function code could require any of these npm modules.
So, your folder structure should now look like this:
root-project-folder
|-functions
|--package.json
|--node_modules
|---geopoint
|--geospatial
|---handler.js
|-package.json
|-node_modules
|--geopoint
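As an illustration, the package.json under functions can be as small as this (a sketch; the version matches the one from the question):
{
  "name": "functions",
  "version": "1.0.0",
  "private": true,
  "dependencies": {
    "geopoint": "^1.0.1"
  }
}
Run npm install inside the functions folder and, because Node walks up parent directories to find node_modules, handler.js can then simply use var geopoint = require('geopoint');.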
The benefit here as well is that you deploy only the npm dependencies that your functions need, without those that serverless itself needs to deploy your resources.
Hopefully that helps - once again, I'm not sure this is best practice; it's just what I do, because this isn't documented anywhere that I could find in the Serverless repository or in any example code.
For me the best solution was the Serverless plugin serverless-plugin-include-dependencies:
serverless-plugin-include-dependencies
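After installing it (for example with npm install --save-dev serverless-plugin-include-dependencies), register it under plugins in serverless.yml:
# serverless.yml
plugins:
  - serverless-plugin-include-dependencies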
You can do the following:
# serverless.yml (serverless-webpack plugin configuration)
custom:
  webpack:
    includeModules:
      packagePath: '../package.json' # relative path to custom package.json file
Reference document
If someone runs into this trouble and none of the answers above are helping, try this one (it worked for me):
custom:
  webpack:
    webpackIncludeModules: true