Asynchronous folder zipping in Node.js

I need to zip multiple folders asynchronously in Node.js. Is there any way to do asynchronous zipping in Node.js? (This needs to work on both Linux and Windows machines.)
I tried the adm-zip library, but it appears to be synchronous.
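A minimal sketch of one way around this, assuming the archiver package from npm (not mentioned in the question): archiver streams the zip to disk, so zipping never blocks the event loop and several folders can be compressed concurrently.

const fs = require('fs');
const archiver = require('archiver'); // assumption: the npm "archiver" package

// Zip one folder asynchronously; resolves once the zip file is fully written.
function zipFolder(srcDir, outPath) {
  return new Promise((resolve, reject) => {
    const output = fs.createWriteStream(outPath);
    const archive = archiver('zip', { zlib: { level: 9 } });

    output.on('close', () => resolve(outPath));
    archive.on('error', reject);

    archive.pipe(output);
    archive.directory(srcDir, false); // false: folder contents at the archive root
    archive.finalize();
  });
}

// Zip several folders at once (names are hypothetical); works on Linux and Windows.
Promise.all([
  zipFolder('folder1', 'folder1.zip'),
  zipFolder('folder2', 'folder2.zip'),
]).then((files) => console.log('created', files));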

Related

Is node.js merely a functions folder that has files in it?

When I download node.js from the internet through bash shell commands, am I merely downloading a "functions" folder that has many files in it, or am I downloading something else besides that?
This question came from the shock I got when I downloaded the AngularJS framework and realized it was literally a one-page document and nothing more.
Node.js contains a compiled executable that can load and run JavaScript code.
This exposes quite a few built-in functions that run compiled code within the executable, as well as lots of plain JavaScript in plain *.js files that make up the standard library.
To run all that, Node.js integrates the V8 JavaScript engine, which is written in C++ and compiled for your operating system.
When you download Angular, it is meant to run in a browser, and that browser provides the execution environment. So all Angular must provide is its own code, which you can then leverage for your own projects. JavaScript libraries really are just JavaScript.
Think of Node.js more like your web browser: it's a program that can execute JavaScript and provides the basic functionality you need to write JavaScript programs.
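To make the distinction concrete, here is a minimal sketch: the http module below ships inside the Node.js executable itself, which is exactly the kind of runtime support a browser-only library like Angular cannot provide.

// 'http' is a core module compiled into / shipped with the node binary;
// nothing extra needs to be downloaded to use it.
const http = require('http');

http.createServer((req, res) => {
  res.end('Hello from the Node.js runtime\n');
}).listen(3000, () => console.log('listening on http://localhost:3000'));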

How do I package a node.js app into a single file (without bundling node.js itself)?

I'm looking for a way to deliver a Node.js server-side application (not browser code) as a single file that contains all my code and node_modules. Well, it can be a few files, but I'd like to avoid shipping the 10,000+ files that usually live in node_modules.
I've used solutions like pkg, but I don't need an executable that has Node.js bundled. I'd rather ship Node.js separately and only have a bundle with code. This would be especially useful as I need to ship a few applications and don't want each of them to contain a copy of Node.js.
I'd appreciate any suggestions.
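One way to get this (a sketch, assuming the esbuild bundler, which the question does not mention) is to bundle the app and everything it pulls from node_modules into one .js file targeting the Node platform, then ship that file and run it with a separately installed Node.js:

// build.js - a sketch; the entry point and Node version are assumptions.
const esbuild = require('esbuild');

esbuild.build({
  entryPoints: ['src/index.js'], // hypothetical entry point
  bundle: true,                  // inline all node_modules dependencies
  platform: 'node',              // keep core modules (fs, http, ...) external
  target: 'node18',              // assumption: the Node version shipped alongside
  outfile: 'dist/app.js',        // the single deliverable: node dist/app.js
}).catch(() => process.exit(1));

Note that native addons (.node files) cannot be inlined this way and would still need to ship next to the bundle.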

Serverless Node.js Project Structure

I am building a RESTful API with the Serverless Framework to run on AWS (API Gateway + Lambda + DynamoDB). It's my first Node project and my first serverless production project, and I have no idea how to structure it. So far I have the following:
|--Functions
|-----Function1
|--------InternalModule
|-----Function2
|-----Function3
|--------InternalModule
|-----Function4
|--Shared
|-----Module1
|-----Module2
|-----Module3
|--Tests
|-----Functions
|--------Function1
|-----------InternalModule
|--------Function2
|-----------InternalModule
|--------Function3
|-----------InternalModule
|--------Function4
|-----------InternalModule
|-----Modules
|--------Module1
|-----------InternalModule
|--------Module2
|-----------InternalModule
|--------Module3
|-----------InternalModule
I keep my API endpoints (Lambda handlers) in Functions. Some of them have internal modules that only they use, and some use modules from Shared. I want to have unit tests for all my modules, internal and shared, as well as API tests on the Lambda functions. I am using mocha and chai, and I want to integrate everything into a pipeline which, on a git push, runs the linters and tests and, if they are successful, deploys the API to the appropriate stage.
The problem is that in order to test each module I have to have chai as a local node module in every folder where I have a test file, and I have to reference the modules under test by relative paths. In most cases this looks really ugly because of the nesting. If I want to test an internal module from
Tests/Functions/Function1/InternalModule
and I require it on top of the test like so
require('../../../../Tests/Functions/Function1/InternalModule')
and on top of that I have to install chai in every folder so it's reachable. The same goes for mocha and all the other dependencies needed for the tests, and I haven't even mentioned configuration. The main idea I am now considering is whether or not I should bring all modules into a folder called Modules and require them when needed; worst case,
from Functions/Function1
require('../../Modules/Module1')
I would also keep the test files in each module's folder and run them there, but that would require the assertion library to be installed in every folder. I've read about npm link and symlinks, but I want to keep track of what dependencies each folder has so I can install them in the CI environment after the clean project is downloaded from GitHub, where I can't create links (or have I got the whole concept wrong?).
If anyone can suggest a better solution I would highly appreciate it!
The way Node uses require is so much more than I thought!
First, Node.js looks to see if the given module is a core module. Node.js comes with many modules compiled directly into its executable binary (e.g. http, fs, sys, events, path), and these core modules always take precedence in the loading algorithm.
If the given module is not a core module, Node.js will then begin to search for a directory named node_modules. It starts in the current directory (relative to the currently executing JavaScript file) and then works its way up the folder hierarchy, checking each level for a node_modules folder.
You can read more in the Node.js documentation on the module loading algorithm.
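That upward search is exactly why a single install at the repository root can serve every test file, however deeply nested. A sketch of the lookup order (paths are hypothetical):

// Run from /project/Tests/Functions/Function1/InternalModule/test.js,
// require('chai') is resolved in this order:
//   1. core modules                                                -> no match
//   2. /project/Tests/Functions/Function1/InternalModule/node_modules/chai
//   3. /project/Tests/Functions/Function1/node_modules/chai
//   4. /project/Tests/Functions/node_modules/chai
//   5. /project/Tests/node_modules/chai
//   6. /project/node_modules/chai   <- one root install covers every test
const { expect } = require('chai');

expect([1, 2, 3]).to.have.lengthOf(3); // chai's BDD assertion style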
I will try out putting all modules in a separate Modules folder, each in its own folder prefixed with FunctionName_ so I know where each module is used, together with its test file and package.json. Then if I need a module I can require it from the functions with shallow nesting, which doesn't look so bad:
from Functions/Function1
require('module-1');
with this in package.json:
"dependencies": {
  "module-1": "file:../../Modules/Function1_Module1"
}
and keep a separate Shared folder for the shared modules.
I am still open to better ideas!

fs.mkdir vs child_process.exec('mkdir'): which one is more efficient?

I need to create a directory in Node.js using the fs module, but at the same time the directory could be created using child_process.exec.
I tried looking at the fs.mkdir code and it went down to node_file.cc; I guess it creates a new V8 environment, but I am not sure how this works internally.
And looking at the Node.js docs, child_process.exec creates a subshell, which I guess means a fork + exec call.
I was not sure which one is more efficient?
You should always prefer the in-language version. I am certain that fs.mkdir does not create a shell just to run mkdir() on common platforms; it goes straight to the underlying system call.
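A sketch contrasting the two (directory names are hypothetical): the exec variant pays for a forked child process plus a shell, while fs.mkdir stays inside Node.

const fs = require('fs');
const { exec } = require('child_process');

// fs.mkdir: libuv performs the mkdir system call on a worker thread; no shell involved.
fs.mkdir('demo-fs', { recursive: true }, (err) => {
  if (err) throw err;
  console.log('created via fs.mkdir');
});

// child_process.exec: forks a child AND spawns a shell (/bin/sh or cmd.exe)
// just to interpret the command string before mkdir ever runs.
exec('mkdir demo-exec', (err) => {
  if (err) throw err;
  console.log('created via exec');
});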

Get a JSON response and save to file using require.js

I'm trying to make a plugin for require.js that allows me to call an external API, convert the JSON response, and save it to a file.
Problems:
I'm not sure if I am writing the plugin correctly.
I can't seem to use the node filesystem, even though I am using r.js.
I am hoping to do this at build time, so that the file is ready before the concat step happens (putting all files into one).
Is this even possible? Should I use a grunt task instead?
Any pointers, examples, or tutorials would be really useful.
In the end I used https://npmjs.org/package/grunt-curl.
It was a lot easier, and I just modified the file a bit to wrap the response in define();.
It allows the files to be downloaded at build time and required later in the app.
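A minimal sketch of that setup (the URL and paths are hypothetical, and the small wrapping task is an assumption, not part of grunt-curl): download the JSON at build time, then wrap it in define() so r.js can pick it up before the concat step.

// Gruntfile.js - a sketch, not the asker's exact build.
module.exports = function (grunt) {
  grunt.initConfig({
    curl: {
      // dest: src - fetch the API response at build time
      'src/data/config.json': 'https://api.example.com/config',
    },
  });

  grunt.loadNpmTasks('grunt-curl');

  // Wrap the raw JSON in define() so it can be required as an AMD module.
  grunt.registerTask('wrap-json', function () {
    const json = grunt.file.read('src/data/config.json');
    grunt.file.write('src/data/config.js', 'define(' + json + ');');
  });

  grunt.registerTask('fetch', ['curl', 'wrap-json']);
};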
