Importing a zip file and getting individual files in it, using Webpack - node.js

I have a standard webpack environment set up, and I am using ES6 imports with npm packages (the usual import name from 'package'). I'm using webpack-dev-server as my dev environment, and the standard webpack build for building the output directory.
I have a zip file in my source containing a large number of XML files. In my JS program, I want to read that zip file and extract individual files from it, so I can get their text and process the XML in them.
I am currently using the xmldoc package to transform XML text into a JSON object for processing, but I have not found a reliable package that can read a zip file (given its path) and expose the individual files in it. With the ones I have tried, I have run into trouble with Node's fs module, and if I add the target: 'node' property to my webpack config file, I receive an error stating that require is not defined.
Is there a reliable npm package compatible with ES6 imports that can read a zip file into my program? I realize that means the zip must be included in the build files to be sent to the browser, but that is better than including the hundreds of individual files without zipping them. Thanks!

I think you want browserfs. Here is example code from the readme. You will also need to do some config for webpack (check the readme):
fetch('mydata.zip').then(function(response) {
  return response.arrayBuffer();
}).then(function(zipData) {
  var Buffer = BrowserFS.BFSRequire('buffer').Buffer;

  BrowserFS.configure({
    fs: "MountableFileSystem",
    options: {
      "/zip": {
        fs: "ZipFS",
        options: {
          // Wrap as Buffer object.
          zipData: Buffer.from(zipData)
        }
      },
      "/tmp": { fs: "InMemory" },
      "/home": { fs: "IndexedDB" }
    }
  }, function(e) {
    if (e) {
      // An error occurred.
      throw e;
    }
    // Otherwise, BrowserFS is ready to use!
  });
});
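Once the zip is mounted, individual entries can be read through BrowserFS's fs shim and fed to xmldoc. A minimal sketch (the '/zip/example.xml' path is illustrative; your entry names will differ):

import { XmlDocument } from 'xmldoc';

// Inside the configure() callback, once BrowserFS is ready:
var fs = BrowserFS.BFSRequire('fs');

// List the entries at the root of the mounted zip.
fs.readdir('/zip', function (err, entries) {
  if (err) throw err;
  console.log('files in zip:', entries);
});

// Read one XML file as text and parse it with xmldoc.
fs.readFile('/zip/example.xml', 'utf8', function (err, xmlText) {
  if (err) throw err;
  var doc = new XmlDocument(xmlText);
  console.log(doc.name); // name of the root element
});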

Related

Interacting with the FileSystem using TypeScript Path Aliases

I'm working in a rather large monorepo, so I set up some path aliases in my tsconfig.json:
"paths": {
"#app/specialLib/*": ["libs/specialLib/src/lib/*"],
"#app/specialLib": ["libs/specialLib/src/index.ts"],
}
so that I can simplify my import statements, both in my apps, and in other places inside the library:
import { Foo } from "#app/specialLib"
import { SubItem } from "#app/specialLib/nested/library"
In one of my library files, I need to work with the file system, using writeFileSync:
// the path to the log file
const itemLog = "#app/specialLib/item-log.json";

// remove the necessary item from the log
let loggedItems: LogItem[] = require(itemLog);
loggedItems = loggedItems.filter(({ id }) => id != this.id);

// write the log back to the file system
writeFileSync(resolve(itemLog), JSON.stringify(loggedItems));
Now the compiler can resolve the require just fine, but when I try to run my script, I get Error: ENOENT: no such file or directory, open 'C:\Users\Chris\development\Big.Monorepo\#app\specialLib\item-log.json', because of course, there is no such folder on my machine. I've tried just using the variable itself, as well as wrapping it inside of path.resolve(), with the same effects.
So my question is this: is there a way to read and write to the file system using Node's built-in functionality in conjunction with TypeScript path aliases?
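For reference, tsconfig path aliases only affect how the compiler (and bundlers) resolve module specifiers; a string handed to fs at runtime is used literally. A minimal sketch of the alias-free approach, assuming item-log.json sits next to the compiled file and with idToRemove standing in for this.id:

const { readFileSync, writeFileSync } = require("fs");
const { resolve } = require("path");

// Resolve the log file relative to this source file rather than the alias;
// __dirname is a real on-disk directory at runtime, while the alias is not.
const itemLog = resolve(__dirname, "item-log.json");

// idToRemove stands in for this.id from the original snippet.
function removeLoggedItem(idToRemove) {
  const loggedItems = JSON.parse(readFileSync(itemLog, "utf8"));
  const remaining = loggedItems.filter(({ id }) => id !== idToRemove);
  writeFileSync(itemLog, JSON.stringify(remaining));
}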

How to use docx npm package in production to save a file

I am generating a doc file using the docx-template npm package in Node.js, and the file is getting saved successfully in my backend/controller folder on my local machine. Now I want to deploy to production on Heroku, but I don't know what path has to be set to save the file in production.
I have used the 'fs' module to read and write the file, as shown below:
fs.writeFileSync(
  path.resolve(__dirname, `${contractName} ${element.frequency}.docx`),
  buffer
);
You probably want to use Express's res.send(Buffer) feature.
See the following page:
https://expressjs.com/en/api.html#res.send
const contentTypes = {
  docx: "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
  pptx: "application/vnd.openxmlformats-officedocument.presentationml.presentation",
  xlsx: "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
};

res.set('Content-Type', contentTypes.docx);
res.send(buffer);
This is because the "fs" module writes data to the server's disk, but in production what you most likely want is to let the user download the file, and that is done with res.send(), which accepts a Buffer.
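Put together, a route might look roughly like this (a sketch; the /contract route, the generateDocxBuffer() stub, and the filename are placeholders, not part of the original answer):

const express = require('express');
const app = express();

// Stand-in for the docx-template code from the question that
// produces the document as a Buffer.
async function generateDocxBuffer() {
  return Buffer.from('...docx bytes...');
}

app.get('/contract', async (req, res) => {
  const buffer = await generateDocxBuffer();

  res.set(
    'Content-Type',
    'application/vnd.openxmlformats-officedocument.wordprocessingml.document'
  );
  // Suggest a filename so the browser saves the response as a download.
  res.set('Content-Disposition', 'attachment; filename="contract.docx"');
  res.send(buffer);
});

app.listen(process.env.PORT || 3000);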

How to import a node module inside an angular web worker?

I'm trying to import a Node module inside an Angular 8 web worker, but I get a compile error 'Cannot find module'. Does anyone know how to solve this?
I created a new worker inside my Electron project with ng generate web-worker app, as described in the ng documentation.
Everything works fine until I add an import like path or fs-extra, e.g.:
/// <reference lib="webworker" />
import * as path from 'path';

addEventListener('message', ({ data }) => {
  console.log(path.resolve('/'));
  const response = `worker response to ${data}`;
  postMessage(response);
});
This import works fine in any other TS component, but inside the web worker I get a compile error with this message:
Error: app/app.worker.ts:3:23 - error TS2307: Cannot find module 'path'.
How can I fix this? Maybe I need some additional parameter in the generated tsconfig.worker.json?
To reproduce the error, run:
$ git clone https://github.com/hoefling/stackoverflow-57774039
$ cd stackoverflow-57774039
$ yarn build
Or check out the project's build log on Travis.
Note:
1) I only found this as a similar problem, but the answer handles only custom modules.
2) I tested the same import with a minimal Electron seed which uses web workers, and it worked, but that example uses plain JavaScript without Angular.
1. TypeScript error
As you've noticed, the first error is a TypeScript error. Looking at tsconfig.worker.json, I've found that it sets types to an empty array:
{
  "compilerOptions": {
    "types": [],
    // ...
  }
  // ...
}
Specifying types turns off the automatic inclusion of @types packages, which is a problem in this case because path has its type definitions in @types/node.
So let's fix that by explicitly adding node to the types array:
{
  "compilerOptions": {
    "types": [
      "node"
    ],
    // ...
  }
  // ...
}
This fixes the TypeScript error; however, trying to build again, we're greeted with a very similar error, this time from Webpack directly.
2. Webpack error
ERROR in ./src/app/app.worker.ts (./node_modules/worker-plugin/dist/loader.js!./src/app/app.worker.ts)
Module build failed (from ./node_modules/worker-plugin/dist/loader.js):
ModuleNotFoundError: Module not found: Error: Can't resolve 'path' in './src/app'
To figure this one out we need to dig quite a lot deeper...
Why it works everywhere else
First, it's important to understand why importing path works in all the other modules. Webpack has the concept of targets (web, node, etc.), and it uses the target to decide which default options and plugins to use.
Ordinarily the target of an Angular application using @angular-devkit/build-angular:browser would be web. However, in your case the postinstall:electron script actually patches node_modules to change that:
postinstall.js (parts omitted for brevity)
const f_angular = 'node_modules/@angular-devkit/build-angular/src/angular-cli-files/models/webpack-configs/browser.js';

fs.readFile(f_angular, 'utf8', function (err, data) {
  var result = data.replace(/target: "electron-renderer",/g, '');
  var result = result.replace(/target: "web",/g, '');
  var result = result.replace(/return \{/g, 'return {target: "electron-renderer",');
  fs.writeFile(f_angular, result, 'utf8');
});
The target electron-renderer is treated by Webpack similarly to node. Especially interesting for us: it adds the NodeTargetPlugin by default.
What does that plugin do, you wonder? It adds all known built-in Node.js modules as externals. When building the application, Webpack will not attempt to bundle externals. Instead they are resolved using require at runtime. This is what makes importing path work, even though it's not installed as a module known to Webpack.
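Roughly speaking, an externalized module is left as a plain require call in the emitted bundle instead of being bundled. A simplified sketch of what the output contains for an external like path:

// Simplified shape of an externalized module in the output bundle:
// instead of bundled source, the module factory just defers to the
// host's require(), which Electron/Node resolves at runtime.
module.exports = require("path");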
Why it doesn't work for the worker
The worker is compiled separately using the WorkerPlugin. In their documentation they state:
By default, WorkerPlugin doesn't run any of your configured Webpack plugins when bundling worker code - this avoids running things like html-webpack-plugin twice. For cases where it's necessary to apply a plugin to Worker code, use the plugins option.
Looking at the usage of WorkerPlugin deep within @angular-devkit we see the following:
@angular-devkit/src/angular-cli-files/models/webpack-configs/worker.js (simplified)
new WorkerPlugin({
  globalObject: false,
  plugins: [
    getTypescriptWorkerPlugin(wco, workerTsConfigPath)
  ],
})
As we can see, it uses the plugins option, but only for a single plugin, which is responsible for the TypeScript compilation. This way the default plugins configured by Webpack, including NodeTargetPlugin, get lost and are not used for the worker.
Solution
To fix this we have to modify the Webpack config, and to do that we'll use @angular-builders/custom-webpack. Go ahead and install that package.
Next, open angular.json and update projects > angular-electron > architect > build:
"build": {
"builder": "#angular-builders/custom-webpack:browser",
"options": {
"customWebpackConfig": {
"path": "./extra-webpack.config.js"
}
// existing options
}
}
Repeat the same for serve.
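For serve, the builder analogously becomes the package's dev-server builder; a sketch, assuming the existing serve options stay in place:

"serve": {
  "builder": "@angular-builders/custom-webpack:dev-server",
  "options": {
    // existing options, e.g. "browserTarget": "angular-electron:build"
  }
}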
Now, create extra-webpack.config.js in the same directory as angular.json:
const WorkerPlugin = require('worker-plugin');
const NodeTargetPlugin = require('webpack/lib/node/NodeTargetPlugin');

module.exports = (config, options) => {
  let workerPlugin = config.plugins.find(p => p instanceof WorkerPlugin);
  if (workerPlugin) {
    workerPlugin.options.plugins.push(new NodeTargetPlugin());
  }
  return config;
};
The file exports a function which will be called by @angular-builders/custom-webpack with the existing Webpack config object. We then search the configured plugins for an instance of WorkerPlugin and patch its options, adding the NodeTargetPlugin.

Can I require jsx files in node?

I have a script that does some analysis on my source files, and part of that analysis is to require each file. Some of the files are in JSX format, however, and Node does not understand this by default.
Is it possible to make it so that a file that looks like this:
function MyModule () {
  return <div>hello</div>
}

module.exports = MyModule
can be required via require('./my-module')?
Use JSX as a template engine in Node.
NPM package: https://www.npmjs.com/package/jsx-node
To be able to simply require .jsx files, you need to tell Node what to do with them. Running the following code makes it possible to require('./SomeFile.jsx'):
require('jsx-node').install({
  replace: {
    preact: 'jsx-node',
  }
});
Warning:
This module is still in a very early phase. Any production use should be approached with caution.
For more detail, see the package page linked above.
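Once the install hook above has run, requiring the question's file might look like this (a sketch; how you then render the returned component depends on jsx-node's API):

// Register the .jsx handler first, then the JSX file from the
// question can be required directly by its extension.
require('jsx-node').install({
  replace: {
    preact: 'jsx-node',
  }
});

const MyModule = require('./my-module.jsx');
console.log(typeof MyModule); // "function"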

Why can node not find my module?

I am using Node v0.12.5 with NW.js, and I have defined my own custom module inside my project so that I can modularise it (obviously).
I am trying to call my module from another module in my project, but every time I attempt to require it I get the error could not find module 'uploader'.
My uploader module is currently very simple and looks like:
function ping_server(dns, cb) {
  require('dns').lookup(dns, function(err) {
    if (err && err.code == "ENOTFOUND") {
      cb(false);
    } else {
      cb(true);
    }
  })
}

function upload_files()
{
}

module.exports.ping_server = ping_server;
module.exports.upload_files = upload_files;
The idea is that it will be used to recursively push files to a given server, provided that server can be pinged when the test device has an internet connection.
I believe I have exported the methods correctly here using the module.exports syntax. I then try to include this module in my test.js file by using:
var uploader = require('uploader');
I also tried
var uploader = require('uploader.js');
But I believe node will automatically look for uploader.js if uploader is specified.
The file hierarchy for my app is as follows:
package.json
public
  |-> lib
    |-> test.js
    |-> uploader.js
  |-> css
  |-> img
The only thing I can think of is that I heard Node will try to source modules from the node_modules folder at the root directory of the application; could this be what is causing Node not to find it? If not, why can Node not see my file from test.js, given they exist in the same directory?
UPDATE Sorry for the confusion, I have also tried using require('./uploader') and I am still getting the error: Uncaught Error: Cannot find module './uploader'.
UPDATE 2 I am normally completely against using images to convey code problems on SO, but I think this will significantly help the question:
It's clear here that test.js and uploader.js reside in the same location
When you don't pass a path (relative or absolute) to require(), it does a module lookup for the name passed in.
Change your require('uploader') to require('./uploader') or require(__dirname + '/uploader').
To load a local module (i.e. not one from node_modules) you need to prefix the path with ./. See https://nodejs.org/api/modules.html#modules_modules
So in your case it would be var uploader = require('./uploader');
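For example, with both files in public/lib, test.js could look like this (the 'example.com' hostname is just a placeholder):

// public/lib/test.js -- same directory as uploader.js
var uploader = require('./uploader');

uploader.ping_server('example.com', function (reachable) {
  console.log('server reachable:', reachable);
});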
This problem stemmed from using Node-Webkit instead of straight Node: as stated in their documentation, all modules will be sourced from a node_modules directory at the root of the project.
Any internal non-C++ libraries should be placed in the node_modules directory in order for Node-Webkit to find them.
Therefore, to fix it, I simply added a node_modules directory at the root of my project (outside of app and public) and placed the uploader.js file in there. Now when I call require('uploader') it works as expected.
If you're developing on a Mac, check your file system's case sensitivity. It could be that the required filename is capitalized incorrectly.
