I'm working on a Node.js library that's already in production, and we are introducing Webpack so we can use babel-loader. I'm using webpack-node-externals to leave external requires unresolved until runtime.
The thing is, at some points the library needs information about the app that is using it, and so far it has been requiring it using the following:
const basepath = process.cwd();
const pkg = require( path.join( basepath, 'package.json' ) );
const packageVersion = pkg.version;
This has been working fine so far, since process.cwd() resolves to the working directory of the app that is running the library. But when webpack reaches this require, it tries to resolve it at build time and replaces it with a webpackMissingModule error.
Is there a way to leave this require as it is until runtime? I tried using the externals property with no luck.
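One direction I've been looking at (not sure if it's the right fix) is webpack's __non_webpack_require__ free variable, which tells webpack to emit a plain runtime require() instead of resolving the module at build time. A minimal sketch of what I mean:
const path = require('path');
const basepath = process.cwd();
// __non_webpack_require__ is left as a normal require() call in the bundle,
// so Node.js resolves it at runtime against the host app's working directory.
const pkg = __non_webpack_require__(path.join(basepath, 'package.json'));
const packageVersion = pkg.version;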
Related
I have two Node based typescript projects A and B. Can I dynamically import a ts file into project A that resides in project B?
It's straightforward to dynamically import a file if it's local to the project, e.g.
const myImport = await import('./my');
But as soon as I try to import a file that exists outside the Project A tsconfig rootDir I get an error:
const myImport = await import('c:/projectB/my.ts');
// Error: Cannot use import statement outside a module
If I don't specify the .ts extension I get the error:
Error: Cannot find module.
Using require instead of import results in the same errors:
const myImport = require('c://projectB/my.ts');
// Error: Cannot use import statement outside a module
The typescript code across both projects is commonjs.
I'm trying to create a simple plugin architecture where ProjectA imports a plugin.ts file from ProjectB (with types). In old posts, people suggest copying files or creating symlinks. However, I'd like to publish project A as an NPM package so I don't think this approach will work.
Changing the path to point to the transpiled .js file (instead of the .ts) worked. The class can be instantiated successfully. Intellisense works the same as if the class was a local import. Note that my transpiled .js files are in a different folder than my source .ts files (in case that makes a difference).
const myImport = await import('c:/projectB/my'); // Note that no file extension is specified.
const myImportClass = new myImport();
myImportClass.myMethod();
The example above uses a hard-coded absolute path for testing only. This might result in the error: File is not under 'rootDir'. Simply work around this by using a variable, e.g.
const myPath = 'c:/projectB/my';
const myImport = await import(myPath);
In my production code I'm using the following dynamic path:
const myPath = path.join(process.cwd(), myFolder, myFileName); // myFolder and myFileName are placeholders for your own values
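Putting it together, a small loader in the spirit of the snippets above might look like this (the folder and file names, and the assumption that the plugin file exports its class directly via module.exports, are mine):
const path = require('path');

async function loadPlugin(pluginFolder, pluginName) {
    // Point at the transpiled .js output; no file extension.
    const pluginPath = path.join(process.cwd(), pluginFolder, pluginName);
    const PluginClass = await import(pluginPath); // compiled to a require() under commonjs
    return new PluginClass();
}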
I have an http cloud function that returns some dynamic HTML. I want to use Handlebars as the templating engine. The template is sufficiently big that it's not practical to have it in a const variable on top of my function.
I've tried something like:
const template = fs.readFileSync('./template.hbs', 'utf-8');
But when deploying the function I always get an error that the file does not exist:
Error: ENOENT: no such file or directory, open './template.hbs'
The template.hbs is in the same directory as my index.js file, so I imagine the problem is that the Firebase CLI is not bundling this file along with the rest of the files.
According to the docs of Google Cloud Functions it is possible to bundle local modules with "mymodule": "file:mymodule". So I've tried creating a templates folder at the root of the project and adding "templates": "file:./templates" to the package.json.
My file structure being something like this:
/my-function
index.js
/templates
something.hbs
index.js //this is the entry point
And then:
const template = fs.readFileSync('../node_modules/templates/something.hbs', 'utf-8');
But I'm getting the same not found error.
What is the proper way of including and requiring non-JS dependencies in a Firebase Cloud Function?
The Firebase CLI will package up all the files in your functions folder, except for node_modules, and send the entire archive to Cloud Functions. It will reconstitute node_modules by running npm install while building the Docker image that runs your function.
If your something.hbs is in /templates under your functions folder, you should be able to refer to it as ./templates/something.hbs from the top-level index.js. If your JS is in another folder, you might have to work your way up first with ../templates/something.hbs. The files should all be there - just figure out the path. I wouldn't try to do anything fancy in your package.json. Just take advantage of the fact that the CLI deploys everything but node_modules.
This code works fine for me if I have a file called 'foo' at the root of my functions folder:
import * as functions from 'firebase-functions'
import * as fs from 'fs'

export const test = functions.https.onRequest((req, res) => {
    const foo = fs.readFileSync('./foo', 'utf-8')
    console.log(foo)
    res.send(foo)
})
The solution was to use path.join(__dirname,'template.hbs').
const fs = require('fs');
const path = require('path');
const template = fs.readFileSync(path.join(__dirname,'template.hbs'), 'utf-8');
As @doug-stevenson pointed out, all files are included in the final bundle, but for some reason using the relative path did not work. Forcing an absolute path with __dirname did the trick.
When I package the electron app on macOS I can never obfuscate the file with Nightmare because of its limitations. Would I need to re-write the whole library, or is there a way I can get around this?
There is a solution. Absolutely unstable and untested, try at your own risk. And of course, if you can find and fix the bugs that come as side effects, feel free to share them :)
We are going to use a module called pkg to bundle up the script with node. Also for simplicity, we will use npx.
There were some side effects, but that was the first time something worked with electron and nightmare.
Consider the following script,
const Nightmare = require("nightmare");
const nightmare = Nightmare({
    show: true
});

nightmare
    .goto("https://example.com")
    .title()
    .end()
    .then(console.log)
    .catch(error => {
        console.error(error);
    });
This is a simple script that says, go to example.com and give me the title.
Cool! Let's try running it through npx and pkg. The command for that is:
npx pkg app.js --target 'host'
However, we got some nasty errors,
> Warning Cannot include file %1 into executable.
The file must be distributed with executable as %2.
node_modules/nightmare/lib/runner.js
path-to-executable/nightmare/runner.js
> Warning Cannot include file %1 into executable.
The file must be distributed with executable as %2.
node_modules/nightmare/lib/frame-manager.js
...
etc etc. and the file wouldn't run.
Error: spawn /home/someone/Desktop/a/electron/dist/electron ENOENT
It cannot find the required files as they were not bundled. We will use process.cwd() to locate them in the relevant folders under node_modules at runtime.
const nodeDir = process.cwd() + "/node_modules/"; // <- Get node modules folder
const nightmareDir = nodeDir + "nightmare"; // <-- Get nightmarejs path
const electronDir = nodeDir + "electron"; // <-- Get electron path
const Nightmare = require(nightmareDir);
const electronPath = require(electronDir);
const nightmare = Nightmare({
    show: true,
    electronPath // <-- use the specific electron path
});

nightmare
    .goto("https://example.com")
    .title()
    .end()
    .then(console.log)
    .catch(error => {
        console.error(error);
    });
When I built it, pkg showed me some more warnings, but that's because I have not optimized the process.cwd() part yet. Then I ran the executable and voila!
➜ a npx pkg app.js --target 'host'
> pkg#4.3.1
> Warning Cannot resolve 'nightmareDir'
/home/someone/Desktop/a/app.js
Dynamic require may fail at run time, because the requested file
is unknown at compilation time and not included into executable.
Use a string literal as an argument for 'require', or leave it
as is and specify the resolved file name in 'scripts' option.
> Warning Cannot resolve 'electronDir'
/home/someone/Desktop/a/app.js
Dynamic require may fail at run time, because the requested file
is unknown at compilation time and not included into executable.
Use a string literal as an argument for 'require', or leave it
as is and specify the resolved file name in 'scripts' option.
➜ a ./app
Example Domain // <-- Our sweet result :D
➜ a
This can be improved and tweaked, but I'll leave that to you.
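As one small example of a tweak (my assumption, not something from the run above), the directory strings could be built with path.join instead of string concatenation:
const path = require("path");
// Same idea as before, just built with path.join for portability.
const nodeDir = path.join(process.cwd(), "node_modules");
const nightmareDir = path.join(nodeDir, "nightmare");
const electronDir = path.join(nodeDir, "electron");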
I'm on macOS. I am creating a simple electron app. When I run the app with electron . everything works perfectly with no errors. Now that my app is finished, I wanted to build and distribute it. So I set up electron-builder and I got that to work just fine. However, when I run the MyApp.app in the build folder, I get an error saying:
Uncaught Error: ENOENT: no such file or directory, scandir './img/'
I call scandir here:
const fs = require('fs');
var files = [];
fs.readdirSync("./img/").forEach(file => {
files.push(file);
})
Why is this working when I run it with node, but is not working in the build? How can I fix this issue?
Why is this working when I run it with node, but is not working in the build? How can I fix this issue?
It's difficult to tell without having more information about the whole app's structure; it may depend on how your code is actually called or required from the html file.
Anyway, using the __dirname global variable to build the directory path usually solves this kind of problem. Please try:
const fs = require('fs');
const path = require('path');
var files = [];
fs.readdirSync(path.join(__dirname, 'img')).forEach(file => {
    files.push(file);
});
Architecture
I would like to share code between client and server side. I have defined aliases in the webpack config:
resolve: {
    // Absolute paths: https://github.com/webpack/webpack/issues/109
    alias: {
        server: absPath('/src/server/'),
        app: absPath('/src/app/'),
        client: absPath('/src/client/'),
    }
},
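(absPath here is just a small helper that turns a project-relative path into an absolute one; the exact implementation below is my assumption:)
// Assumed helper: resolve a path relative to the project root.
const path = require('path');
function absPath(relativePath) {
    return path.join(__dirname, relativePath);
}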
Problem
Now on the server side I need webpack in order to resolve the correct paths when I require a file. For example
require('app/somefile.js')
will fail in pure node.js because it can't find the app folder.
What I need (read the What I need updated section)
I need to be able to use the webpack aliases. I was thinking about making a bundle of the whole server part without any files from node_modules. This way, when the server starts, it will load dependencies from the node_modules folder instead of from a minified js file (Why? 1st: bundling them doesn't work. 2nd: it's bad, because node_modules are compiled per platform, so I don't want my Windows binaries to end up on a Unix server).
Output:
A compiled server.js file without any node_modules included.
server.js should load node_modules from the node_modules folder at runtime.
What I need updated
As I've noticed in https://github.com/webpack/webpack/issues/135, making a bundled server.js will mess up all the file paths used in I/O operations.
A better idea would be to leave the node.js server files as they are, but replace the provided require method with a custom webpack-aware require which takes into account configuration such as aliases (anything else?)... This could be done the way require.js did it to run on a node.js server.
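For illustration, such a webpack-aware require could hook Node's module resolution and map the same aliases; this is just a rough sketch of the idea, not the solution I ended up with:
// Rough sketch: patch Node's resolver so the webpack aliases also work server-side.
const Module = require('module');
const path = require('path');

const aliases = {
    server: path.resolve(__dirname, 'src/server'),
    app: path.resolve(__dirname, 'src/app'),
    client: path.resolve(__dirname, 'src/client'),
};

const originalResolve = Module._resolveFilename;
Module._resolveFilename = function (request, ...args) {
    const [first, ...rest] = request.split('/');
    if (aliases[first]) {
        request = path.join(aliases[first], ...rest);
    }
    return originalResolve.call(this, request, ...args);
};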
What I've tried
By adding this plugin in webpack
new webpack.optimize.CommonsChunkPlugin(/* chunkName= */"ignore", /* filename= */"server.bundle.js")
Entries:
entry: {
    client: "./src/client/index.js",
    server: "./src/server/index.js",
    ignore: ['the_only_node_module'] // But I need to do that for every node_module
},
It will create a server.js file which contains only my server code, plus a server.bundle.js which is not used. The problem is that webpack includes the webpackJsonp function in the server.bundle.js file, so neither the client nor the server will work.
There should be a way to just disable node_modules for one entry.
What I've tried # 2
I've managed to exclude the path, but the requires don't work because they have already been minified: the source looks like require(3) instead of require('my-module'). Each require string has been converted to an integer, so it doesn't work.
To make this work I would also need to patch the require function that webpack exports so it falls back to node.js's native require (this is easy to do manually, but should be done automatically).
What I've tried # 3
In the webpack configuration:
{target: "node"}
This only adds an exports variable (I'm not sure what else it does beyond what I saw when diffing the output).
What I've tried # 4 (almost there)
Using
require.ensure('my_module')
and then replacing all occurrences of r(2).ensure with require. I don't know if the r(2) part is always the same, so this might not be automatable.
Solved
Thanks to ColCh for enlightening me on how to do it here.
require = require('enhanced-require')(module, require('../../webpack.config'));
By changing the require method in node.js, all requires are passed through the webpack require function, which allows us to use aliases and other gifts! Thanks ColCh!
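With that override in place, subsequent requires in the same file should be able to use the aliases from the config, e.g. (reusing the example path from above):
// After the override, aliased paths resolve through webpack's configuration.
const somefile = require('app/somefile.js'); // -> src/app/somefile.js via the 'app' alias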
Related
https://www.bountysource.com/issues/1660629-what-s-the-right-way-to-use-webpack-specific-functionality-in-node-js
https://github.com/webpack/webpack/issues/135
http://webpack.github.io/docs/configuration.html#target
https://github.com/webpack/webpack/issues/458
How to simultaneously create both 'web' and 'node' versions of a bundle with Webpack?
http://nerds.airbnb.com/isomorphic-javascript-future-web-apps/
My solution was:
{
    // make sure that webpack will externalize
    // modules using Node's module API (CommonJS 2)
    output: { ...output, libraryTarget: 'commonjs2' },

    // externalize all require() calls to non-relative modules.
    // Unless you do something funky, every time you import a module
    // from node_modules, it should match the regex below
    externals: /^[a-z0-9-]/,

    // Optional: use this if you want to be able to require() the
    // server bundles from Node.js later
    target: 'node'
}
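For context, a fuller config built around those three settings could look roughly like this; the entry and output paths here are placeholders, not taken from the original setup:
// webpack.config.js sketch; entry/output paths are placeholders.
const path = require('path');

module.exports = {
    target: 'node',                  // build for Node.js
    entry: './src/server/index.js',
    output: {
        path: path.resolve(__dirname, 'build'),
        filename: 'server.js',
        libraryTarget: 'commonjs2',  // export the bundle as a CommonJS module
    },
    // keep bare (non-relative) requires as runtime require() calls
    externals: /^[a-z0-9-]/,
};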