NodeJS using 'archiver' to deploy in AWS

I have a Lambda function written in Node.js. I use the npm archiver package to zip the files and deploy them to AWS. Everything is okay except that the zip includes node_modules, so I switched to glob:
var srcFolder = './';
var ignores = ['node_modules/**/*', 'publish.js'];
archive.glob('**/*', {
    cwd: srcFolder,
    ignore: ignores,
    nodir: true,
    dot: true,
    follow: true
});
This works. The problem is that at some point I will need to include some of node_modules in my deployment. How can I include only the packages that I need in production (not those in devDependencies), while also ignoring aws-sdk and archiver?
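One way to approach this (a sketch, not a drop-in solution): read the production dependency names from package.json and glob only those folders back in. Two caveats: with npm 3+ each package's own dependencies are usually hoisted to the top level of node_modules, so a complete solution would also have to collect them recursively, and aws-sdk can be excluded because the Node.js Lambda runtime already provides it.

var fs = require('fs');
var archiver = require('archiver');

var archive = archiver('zip');
archive.pipe(fs.createWriteStream('lambda.zip'));

// Source files, excluding node_modules entirely (same globs as above).
archive.glob('**/*', {
    cwd: './',
    ignore: ['node_modules/**/*', 'publish.js'],
    nodir: true,
    dot: true
});

// Add back only the production dependencies, minus the packages that
// the Lambda runtime provides (aws-sdk) or that are build-time only (archiver).
var pkg = JSON.parse(fs.readFileSync('./package.json', 'utf-8'));
var excluded = ['aws-sdk', 'archiver'];
Object.keys(pkg.dependencies || {})
    .filter(function (name) { return excluded.indexOf(name) === -1; })
    .forEach(function (name) {
        archive.glob('node_modules/' + name + '/**/*', { cwd: './', nodir: true, dot: true });
    });

archive.finalize();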

Related

How to combine multiple Node.js files into a single bundle using webpack

I am trying to create a single bundle from multiple JavaScript files in a Node.js application.
The configuration I am using looks somewhat like this:
'use strict';
const path = require('path');
const nodeExternals = require('webpack-node-externals');

module.exports = {
    externals: [nodeExternals({})],
    entry: './lib/index.js',
    output: {
        iife: false,
        path: path.resolve(__dirname, 'lib'),
        filename: 'bundle.js', // <-- Important
    },
    target: 'node', // <-- Important
};
The problem is that when I run the bundle.js command, instead of it doing what the command says, I get the full source of the file streamed into the terminal.
It seems the file contains some sort of IIFE that gets executed immediately. I set iife: false in the webpack configuration, but that did not make any difference.
Any ideas what could be wrong?
Edit:
I am calling webpack by adding "bundle": "webpack --config webpack.config.js" to the "scripts" section in package.json and then running npm run bundle.
As an alternative, you could use the @vercel/ncc package (https://www.npmjs.com/package/@vercel/ncc) to bundle all your Node.js code into one file.
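For example (a sketch; the entry point is the one from the question, and dist is ncc's conventional output directory):

npx ncc build lib/index.js -o dist
node dist/index.js

ncc compiles the entry point and everything it requires into a single dist/index.js that can be run directly with node.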

How to bundle and require non-JS dependencies in Firebase Cloud Functions?

I have an HTTP cloud function that returns some dynamic HTML. I want to use Handlebars as the templating engine. The template is big enough that it's not practical to keep it in a const variable at the top of my function file.
I've tried something like:
const template = fs.readFileSync('./template.hbs', 'utf-8');
But when deploying the function I always get an error that the file does not exist:
Error: ENOENT: no such file or directory, open './template.hbs'
The template.hbs is in the same directory as my index.js file, so I imagine the problem is that the Firebase CLI is not bundling this file along with the rest of the files.
According to the docs of Google Cloud Functions it is possible to bundle local modules with "mymodule": "file:mymodule". So I've tried creating a templates folder at the root of the project and added "templates": "file:./templates" to the package.json.
My file structure is something like this:
/my-function
    index.js
    /templates
        something.hbs
        index.js // this is the entry point
And then:
const template = fs.readFileSync('../node_modules/templates/something.hbs', 'utf-8');
But I'm getting the same not found error.
What is the proper way of including and requiring non-JS dependencies in a Firebase Cloud Function?
The Firebase CLI will package up all the files in your functions folder, except for node_modules, and send the entire archive to Cloud Functions. It will reconstitute node_modules by running npm install while building the docker image that runs your function.
If your something.hbs is in /templates under your functions folder, you should be able to refer to it as ./templates/something.hbs from the top-level index.js. If your JS is in another folder, you might have to work your way up first with ../templates/something.hbs. The files should all be there; just figure out the path. I wouldn't try to do anything fancy in your package.json. Just take advantage of the fact that the CLI deploys everything but node_modules.
This code works fine for me if I have a file called 'foo' at the root of my functions folder:
import * as functions from 'firebase-functions' // import added here for completeness
import * as fs from 'fs'

export const test = functions.https.onRequest((req, res) => {
    const foo = fs.readFileSync('./foo', 'utf-8')
    console.log(foo)
    res.send(foo)
})
The solution was to use path.join(__dirname, 'template.hbs'):
const fs = require('fs');
const path = require('path');
const template = fs.readFileSync(path.join(__dirname, 'template.hbs'), 'utf-8');
As @doug-stevenson pointed out, all files are included in the final bundle, but for some reason using the relative path did not work. Forcing an absolute path with __dirname did the trick.
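From there the template can be compiled and rendered with the standard Handlebars API; a minimal sketch (the context field is a made-up template variable):

const Handlebars = require('handlebars');

// Compile once at cold start, render per request.
const render = Handlebars.compile(template);
const html = render({ title: 'Hello' }); // 'title' is hypothetical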

node-canvas build for AWS Lambda

I'm a Linux and Node noob. I'm trying to run FabricJS (which requires node-canvas) in AWS Lambda. I've been able to follow the instructions to get up and running on an Amazon Linux EC2 instance, but Lambda has me at my wits' end. Does anyone have any tips or pointers on how to get this compiled for AWS Lambda?
I found this issue on the node-canvas GitHub repository. The questioner was trying to run FabricJS in Lambda as well. Here is the relevant section with an answer:
Make sure you're compiling this on the same AMI that Lambda currently uses:
http://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html
Lambda runs at /var/task (that's the path when you unzip), so a something.so at the root of the zip file will be at /var/task/something.so. We therefore want to build our libraries using an "rpath":
export LDFLAGS=-Wl,-rpath=/var/task/
Then:
1. Compile the libraries according to https://github.com/Automattic/node-canvas/wiki/Installation---Amazon-Linux-AMI-%28EC2%29. You may want to set prefix=~/canvas to keep all the files in one place.
2. Install node-canvas with npm, then rebuild it in place: cd node_modules/canvas; node-gyp rebuild
3. mkdir ~/pkg and cp the .so files (~/canvas/lib/*.so) there, using -L to dereference symlinks.
4. scp the pkg directory to the local lambda folder, putting the files in the right places (.so files in the zip root, node_modules/canvas with the other libs). You'll probably want to rearrange this.
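Condensed into a rough shell session (a sketch only; the actual native-library builds follow the wiki linked above, and the scp target is hypothetical):

# on an EC2 instance running the same Amazon Linux AMI as Lambda
export LDFLAGS=-Wl,-rpath=/var/task/
# ...build the native libraries per the node-canvas wiki, with prefix=~/canvas...
npm install canvas
(cd node_modules/canvas && node-gyp rebuild)
mkdir ~/pkg
cp -L ~/canvas/lib/*.so ~/pkg/
scp -r ~/pkg user@workstation:/path/to/lambda/   # hypothetical host and path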
Here is a gulp plugin that can upload your files together with node-canvas and its dependency binaries built specifically for AWS Lambda:
npm package: aws-lambda-node-canvas
'use strict';
// This is a sample gulpfile that can be used.
// npm install --save gulp gulp-zip gulp-awslambda
const gulp = require('gulp');
const zip = require('gulp-zip');
const path = require('path');
const lambda = require('gulp-awslambda');
const aws_lambda_node_canvas = require('./');

let runtime = 'nodejs4.3'; // nodejs or nodejs4.3

const lambda_params = {
    FunctionName: 'NodeCanvas', // Lambda function name
    Description: 'Node canvas function in aws lambda', // Description for your lambda function
    Handler: 'main.lambda_handler', // Assuming you provide a main.js file exporting a function called lambda_handler
    MemorySize: 128,
    Runtime: runtime,
    Role: 'ROLE_STRING', // e.g. 'arn:aws:iam::[Account]:role/lambda_basic_execution'
    Timeout: 50
};

var opts = {
    region: 'ap-southeast-2'
};

gulp.task('default', () => {
    return gulp.src(['main.js', '!node_modules/**/*', '!dist/**/*', '!node_modules/aws-lambda-node-canvas/**/*']) // Your src files to bundle into aws lambda
        .pipe(aws_lambda_node_canvas({runtime: runtime})) // Adds all the required files needed to run node-canvas in aws lambda
        .pipe(zip('archive.zip'))
        .pipe(lambda(lambda_params, opts))
        .pipe(gulp.dest('dist')); // Also keep a local copy of the uploaded zip
});

Building with Grunt and Requirejs

I am creating a Grunt task for building a JavaScript project with RequireJS using grunt-contrib-requirejs:
https://github.com/gruntjs/grunt-contrib-requirejs
Here is the config:
requirejs:
  compile:
    options:
      #appDir: './'
      baseUrl: "client"
      mainConfigFile: "client/main.js"
      name: "main"
      out: "build/main.js"
      wrap:
        start: ""
        end: ""
The main.js file requires 2 other files inside subdirectories. Although this task does not throw errors, the resulting built file does not run in the browser. The files seem to be merely concatenated, since the require calls still exist in the built file. I expect the JS files pulled in by require to replace the require calls and then be optimized. How can I achieve that?
PS: The config above is written in CoffeeScript.
If you want your compiled JavaScript file to contain no require() or define() calls, you can use the amdclean npm package and simply add this to your options object:
onModuleBundleComplete: function (data) {
    var fs = require('fs'),
        amdclean = require('amdclean'),
        outputFile = data.path;
    fs.writeFileSync(outputFile, amdclean.clean({
        'filePath': outputFile
    }));
}
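Wired into the question's Grunt config, the options block would look roughly like this in plain JavaScript (a sketch; it assumes amdclean is installed locally via npm):

// Gruntfile.js (JavaScript equivalent of the CoffeeScript config above)
requirejs: {
    compile: {
        options: {
            baseUrl: 'client',
            mainConfigFile: 'client/main.js',
            name: 'main',
            out: 'build/main.js',
            onModuleBundleComplete: function (data) {
                var fs = require('fs'),
                    amdclean = require('amdclean');
                // Rewrite the optimizer's output in place, stripping the AMD wrappers.
                fs.writeFileSync(data.path, amdclean.clean({ filePath: data.path }));
            }
        }
    }
}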

How to share code between node.js apps?

I have several apps in Node that all share a few modules I've written. These modules are not available via npm. I would like to share them freely between apps, but I don't want to copy directories around or rely on Git to do so, and I'm not really big on using symlinks for this either.
I would like to arrange directories something like this:
app1
    server.js
    node_modules
        (public modules from npm needed for app1)
    lib
        (my own modules specific to app1)
app2
    server.js
    node_modules
        (public modules from npm needed for app2)
    lib
        (my own modules specific to app2)
shared_lib
    (my own modules that are used in both app1 and app2)
The problem I'm seeing is that the modules in shared_lib seem to get confused about where to find the modules that live in the node_modules directory of whichever app they are running in. At least I think that is the problem.
So... what is a good way to do this that avoids having duplicate files? (Note that I don't care about duplicates inside node_modules, since that isn't my code and I don't check it into Git.)
The npm documentation recommends using npm link to create your own Node.js packages locally and then make them available to other Node.js applications. It's a simple four-step process.
A typical procedure would be to first create a package with the following structure:
hello
| index.js
| package.json
A typical implementation of these files would be:
index.js
exports.world = function() {
    return 'Hello World';
};
package.json
{
    "name": "hello",
    "version": "0.0.1",
    "private": true,
    "main": "index.js",
    "dependencies": {
    },
    "engines": {
        "node": "v0.6.x"
    }
}
"private": true ensures that npm will refuse to publish the package. This is a way to prevent accidental publication of private packages.
Next, navigate to the root of your Node.js package folder and run npm link to link the package globally so it can be used in other applications.
To use this package in another application, e.g., "hello-world", with the following directory structure:
hello-world
| app.js
Navigate to the hello-world folder and run:
npm link hello
Now you can use it like any other npm package:
app.js
var http = require('http');
var hello = require('hello');

var server = http.createServer(function(req, res) {
    res.writeHead(200);
    res.end(hello.world());
});

server.listen(8080);
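Starting the app then serves the linked module's output (assuming the setup above):

node app.js
curl http://localhost:8080   # -> Hello World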
I've got this working by having node_modules folders at different levels - node then automatically traverses upwards until it finds the module.
Note you don't have to publish to npm to have a module inside of node_modules - just use:
"private": true
Inside each of your private package.json files - for your project I would have the following:
app1
server.js
node_modules
(public modules from npm needed for app1)
(private modules locally needed for app1)
app2
server.js
node_modules
(public modules from npm needed for app2)
(private modules locally needed for app2)
node_modules
(public modules from npm needed for app1 & app2)
(private modules locally for app1 & app2)
The point is node.js has a mechanism for dealing with this already and it's awesome. Just combine it with the 'private not on NPM' trick and you are good to go.
In short, a
require('somemodule')
from app A or B will cascade upwards until it finds the module, regardless of whether it lives lower down or higher up. Indeed, this lets you hot-swap the location without changing any of the require(...) calls.
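Concretely, Node's lookup for that call walks up the directory tree (see the module documentation linked below):

// Resolution order for require('somemodule') from app1/server.js:
//   app1/node_modules/somemodule      <- the app-specific copy wins if present
//   node_modules/somemodule           <- the shared top-level folder
//   ...and so on, up to the filesystem root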
node.js module documentation
Just use the correct path in your require call. For example, in server.js that would be:
var moduleName = require('../shared_lib/moduleName/module.js');
It's important to know that when your path is prefixed with './' or '../', it is resolved relative to the calling file (a leading '/' makes it an absolute path).
For further information about nodes module loading visit:
http://nodejs.org/docs/latest/api/modules.html
Yes, you can reference shared_lib from app1, but then you run into a problem if you want to package and deploy app1 to some other environment, such as a web server on AWS.
In this case, you're better off installing the shared_lib modules into app1 and app2 using "npm install shared_lib/module". That will also install all the dependencies of the shared_lib modules in app1 and app2 and deal with conflicts/duplicates.
See this:
How to install a private NPM module without my own registry?
If you check out the node.js docs, you'll see that Node.js understands the package.json file format as well, at least cursorily.
Basically, if you have a directory named foo, and in that directory is a package.json file with the key-value pair "main": "myCode.js", then when you require("foo") and Node finds this directory with a package.json file inside, it will use foo/myCode.js for the foo module.
So, with your directory structure, if each shared lib has its own directory with such a simple package.json file inside, then your apps can get the shared libs by:
var lib1 = require('../shared_lib/lib1');
var lib2 = require('../shared_lib/lib2');
And that should work for both of these apps.
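For example, shared_lib/lib1/package.json could be as small as this (the names are placeholders from the answer above):

{
    "name": "lib1",
    "version": "0.0.1",
    "private": true,
    "main": "myCode.js"
}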
Another solution is to clone files from the other places into this repo:
clone.js:
const path = require('path')
const fs = require('fs')

const shared = [
    {
        type: 'file',
        source: '../app1',
        files: [
            'src/file1',
            'src/file2',
            '...'
        ],
    },
]

function cloneFiles(source, files) {
    const Reset = '\x1b[0m'
    const FgGreen = '\x1b[32m'
    console.log(`---------- Cloning ${files.length} files from ${source} ----------`)
    for (const file of files) {
        const sourceFile = path.join(__dirname, '..', source, file)
        const targetFile = path.join(__dirname, '..', file)
        process.stdout.write(`📁 ${file} ... `)
        fs.copyFileSync(sourceFile, targetFile)
        console.log(`${FgGreen}Done!${Reset}`)
    }
    console.log(`---------- All done successfully ----------\n`)
}

;(() => {
    for (const item of shared) {
        switch (item.type) {
            case 'file':
                cloneFiles(item.source, item.files)
                break
        }
    }
})()
Then, in the package.json you can add this script and call it when you want to clone / sync files:
"clone": "node clone.js"
