Detect whether an npm package can run in the browser, Node, or both - node.js

I'm building a Next.js application which uses a set of abstractions to enable code from both React (component logic etc.) and Node (API logic) to utilize the same types and classes. The motive is to make the development process feel seamless across the client side and the server side.
For example, a call to the User.create method of the User class will behave differently based on the runtime environment (i.e. browser or Node): in the browser it will call an API with a POST request, and on the server it will persist the data to a database.
So far this pattern has worked just fine, and I like how it all turned out in terms of code structure. However, for this to work, I have to import the modules responsible for the server and the browser into a single module (which in this case is the User class), and this leads to a critical error when trying to resolve dependencies in each module.
For example, I'm using firebase-admin on the server to persist data to Firebase Firestore, but it uses fs, which cannot be resolved when the same code runs in the browser.
As a workaround, I set the resolve alias of firebase-admin to false inside the webpack configuration when the build targets the browser; see the code below.
/** next.config.js **/
module.exports = {
  webpack: (config, { isServer }) => {
    if (!isServer) {
      // set alias of node-specific modules to false
      // eg: server-only service dependencies
      config.resolve.alias = {
        ...config.resolve.alias,
        'firebase-admin': false,
      }
    } else {
      // set alias of browser-only modules to false
      config.resolve.alias = {
        ...config.resolve.alias,
      }
    }
    return config
  },
}
While this does the trick, it won't be long before it gets really tedious to include every such dependency in the resolve aliases.
So, my approach is to write a script that runs prior to npm run dev (or manually), reads all the dependencies in package.json, SOMEHOW identifies the packages that will not run in a specific runtime environment, and adds them to the webpack config. In order to do this, there needs to be a way to identify this property of each dependency, which I don't think is something that comes right out of the box from npm or the package itself.
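For illustration, this is the kind of heuristic script I have in mind. It is purely a sketch: it treats any dependency whose own package.json lacks a browser field as potentially node-only, which I know is not a reliable signal on its own.
/** scripts/find-node-only-deps.js (sketch) **/
const fs = require('fs')
const path = require('path')

const pkg = JSON.parse(fs.readFileSync('package.json', 'utf8'))
const deps = Object.keys(pkg.dependencies || {})

const maybeNodeOnly = deps.filter((dep) => {
  try {
    const depPkgPath = path.join('node_modules', dep, 'package.json')
    const depPkg = JSON.parse(fs.readFileSync(depPkgPath, 'utf8'))
    // heuristic: no "browser" field -> possibly node-only
    return !depPkg.browser
  } catch (e) {
    // could not inspect this dependency; skip it
    return false
  }
})

console.log('Possibly node-only dependencies:', maybeNodeOnly)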
Any suggestion on how this can be achieved is really appreciated. Thanks!

Related

Gatsby Failed Build - error "window" is not available during server side rendering

I have been trying to build my Gatsby (React) site recently using an external package.
The link to this package is "https://github.com/transitive-bullshit/react-particle-animation".
As I only have the option to change the props from the component's details, I cannot read/write the package file where it all comes together in the end, as it is not included in the public folder produced by gatsby build.
What I have tried:
Editing the package file locally, which worked only on my machine. But when I push it to Netlify, which only receives the public folder and the corresponding package.json files and not the node_modules folder, I cannot make Netlify read the file that I changed myself, as it requests the package directly from the GitHub page.
As a solution I found in a comment on this question, we can use patch-package to save our fixes to the node module and then use it wherever we want.
This actually worked for me!
To explain how I fixed it (most of it is already covered in the patch-package docs, so I'll just mention the main points):
I first made changes to the local package files that were giving the error (for me they were in my node_modules folder).
Then I used the patch-package documentation to guide myself through the rest.
It worked after I pushed my changes to GitHub, and now patch-package always gives me my edited version of the node module.
When dealing with third-party modules that use window in Gatsby, you need to add a null loader to Gatsby's webpack configuration to avoid bundling them during SSR (server-side rendering). This is because gatsby develop occurs in the browser (where there is a window), while gatsby build occurs in a Node server, where obviously there isn't a window or other browser globals (because they are not even defined yet).
exports.onCreateWebpackConfig = ({ stage, loaders, actions }) => {
  if (stage === "build-html") {
    actions.setWebpackConfig({
      module: {
        rules: [
          {
            test: /react-particle-animation/,
            use: loaders.null(),
          },
        ],
      },
    })
  }
}
Keep in mind that the test value is a regular expression that will match a folder under node_modules, so ensure that /react-particle-animation/ is the right name.
Using patch-package may work, but keep in mind that you are adding an extra package, another bundled file that could potentially affect the performance of the site. The proposed snippet is a built-in solution that fires when you build your application.

In our library, how to tell webpack to skip dependencies?

We have a library that is traditionally client-side only. It uses HTTP Request (or one of several other dependency libraries) to make REST calls. When using the library, the user initializes it with a particular request provider and off they go.
We use webpack in our examples to utilize our library.
It has now been extended to use node-fetch, so if someone wants to use it from Node.js, that's supported too.
For people using webpack, webpack is now attempting to bundle node-fetch, and the require call fails in the browser. We can get around this by setting an external:
"externals" : {
"node-fetch": "{}"
}
Is there a way to define our library so that if the consumer is using webpack with target: web, it skips the require check for node-fetch? And similarly, if the consumer is using webpack with target: node, it needs to include the node-fetch component.
The project in question is https://github.com/OfficeDev/PnP-JS-Core
Thanks for reporting this. According to this commit and the conversation linked to it, the automatic module resolution field (known as a described-resolve to the webpack resolver instance) changes based on what your target is.
By default, when target is node in your webpack build, resolution via package.json defaults to the main field; otherwise, the browser field takes priority.
For more reference, see https://github.com/webpack/webpack/issues/151
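For example, a library's package.json might point bundlers at different entry files depending on the target (the file names here are just illustrative):
{
  "name": "my-isomorphic-lib",
  "main": "lib/index.node.js",
  "browser": "lib/index.browser.js"
}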
The links provided in the accepted answer and comment show how to do this, so +1 to those, but just to surface it directly here:
Is there a way to define our library so that if the consumer is using webpack target: web, it'd skip the require check for node-fetch
Yes. In the library's package.json, add a browser field with the following:
"browser": {
"node-fetch": false
}
This provides a hint to webpack and other bundlers that the node-fetch module should be ignored (i.e. do not even attempt to include it in the bundle) when the target is web. When the target is node, it will be included.
Note that the above relies on the code in the client bundle never using node-fetch. In that sense it can be considered unsafe, because there is no compile-time guarantee of this, and if it happens, it will just error and probably crash your client. If you're absolutely sure it can never be used client-side, though, this is the simplest way to get this done.
For more safety (i.e. if you want the client code to only warn if you try to use node-fetch), you also have the option of providing a shim module that the client bundle can include instead, for instance one that just logs a warning if it gets used. You do this in the same way, just by providing a path to the shim module instead of false:
"browser": {
"node-fetch": "./shims/node-fetch.js"
}
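For example, the shim could be as simple as this (a sketch; the path and exact behavior are up to you):
// ./shims/node-fetch.js - included in browser bundles instead of node-fetch
module.exports = function fetchShim() {
  console.warn('node-fetch is not available in the browser build; this call is a no-op.')
  return Promise.reject(new Error('node-fetch unavailable in the browser'))
}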

Webpack bundle dynamic client config

We have a node.js app bundled for production using Webpack.
Our problem: how do we add dynamic configuration after we already have a bundle, without the need to re-bundle?
On the server side, we can just use Node environment variables, but how can this be done for the client bundle? Specifically, we need to tell a browser module which API server address to connect to.
Having a JS/JSON file with the configuration causes the configuration values to be injected into the bundle, and therefore they can't be changed afterwards (at least not comfortably, without opening the bundle file and manually finding and replacing).
Using something like express-expose isn't something we want, since it causes another network request to get the data, and our server address is dynamic.
node-config etc. don't work on the client side.
You can make creative use of the externals option:
externals: [
  { appConfig: 'var appConfig' },
],
If you add that to your configuration, you can just let your web server add a script tag with var appConfig = {"config":"value"}; somewhere before your webpack bundle is loaded, and a simple require('appConfig') will pick it up.
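For example (a sketch; the appConfig keys are placeholders), the server would render something like this before the bundle's script tag, and any module inside the bundle can then require it:
// Rendered by the web server into the page, before the bundle:
//   <script>var appConfig = { apiServer: "https://api.example.com" };</script>
//   <script src="/bundle.js"></script>

// Anywhere inside the bundle, the externals mapping resolves this require
// to the global appConfig variable at runtime instead of a bundled module:
var appConfig = require('appConfig');
console.log('API server:', appConfig.apiServer);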

Meteor.js: How do you require or link one javascript file in another on the client and the server?

1) In Node on the backend, to link one JavaScript file to another we use require and module.exports.
This allows us to create modules of code and link them together.
How do I do the same thing in Meteor?
2) On the front end in Meteor, if I want to access code from another front-end JavaScript file, I have to use globals. Is there a better way to do this, so I can require one JavaScript file in another? I think something like Browserify does this, but I am not sure how to integrate it with Meteor.
Basically if on the client I have one file
browserifyTest.coffee
test = () ->
  alert 'Hello'
I want to be able to access this test function in another file
test.coffee
Template.profileEdit.rendered = ->
  $ ->
    setPaddingIfMenuOpen()
    test()
How can I do this in Meteor without using globals?
Meteor wraps the code in every file we create in a module: (function(){ ...your code... })(). If you want to export something out of your JS file (module), make it a global, i.e. don't use var with the variable name you want to export, and it'll be accessible in all files which get included after this module. Keep in mind the order in which Meteor includes JS files: http://docs.meteor.com/#structuringyourapp
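For example (a minimal sketch in plain JavaScript; the file names are illustrative):
// client/lib/helpers.js - no `var`, so `test` escapes Meteor's per-file wrapper
// and is visible to files loaded after this one
test = function () {
  alert('Hello');
};

// client/templates/profileEdit.js - loaded later, can call it directly
Template.profileEdit.rendered = function () {
  test();
};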
I don't think you can do this without using globals. Meteor wraps the code in JS files in self-executing function expressions, and the exports API is available for packages only. What problem exactly do you have with globals? I've worked on fairly large Meteor projects, and while using a global object to namespace my global helpers, I never had any issues with this approach of accessing functions/data from one file in other files.
You can use a local package, which is just like a normal Meteor package but used only in your app.
If the package proves to be useful in other apps, you may even publish it on atmosphere.
I suggest you read the WIP section "Writing Packages" of the Meteor docs, but expect breaking changes in coming weeks as Meteor 0.9 will include the final Package API, which is going to be slightly different.
http://docs.meteor.com/#writingpackages
Basically, you need to create a package directory (my-package) and put it under /packages.
Then you need a package description file which needs to be named package.js at the root of your package.
/packages/my-package/package.js
Package.describe({
  summary: "Provides test"
});

Package.on_use(function (api) {
  api.use(["underscore", "jquery"], "client");
  api.add_files("client/lib/test.js", "client");
  // api.export is what you've been looking for all along!
  api.export("Test", "client");
});
Usually I try to mimic the Meteor application structure in my package, so that's why I'd put test.js under my-package/client/lib/test.js: it's a utility function residing on the client.
/packages/my-package/client/lib/test.js
Test = {
  test: function () {
    alert("Hello !");
  }
};
Another package convention is to declare a package-global object containing everything public and then export this single object so the app can access it.
The variables you export NEED to be package-global, so don't forget to remove the var keyword when declaring them: package scope is just like regular Meteor app scope.
Last but not least, don't forget to meteor add your package:
meteor add my-package
And you will be able to use Test.test in the client without polluting the global namespace.
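For example, anywhere in the app's client code (after the meteor add above):
Test.test(); // alerts "Hello !"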
EDIT (due to a second question posted in the comments):
Suppose now you want to use NPM modules in your package.
I'll use momentjs as an example because it's simple yet interesting enough.
First you need to call Npm.depends in package.js; we'll depend on momentjs 2.7.0:
/packages/my-moment-package/package.js
Package.describe({
  summary: "Yet another moment packaged for Meteor"
});

Npm.depends({
  "moment": "2.7.0"
});

Package.on_use(function (api) {
  // load the wrapper only on the server, since Npm.require is server-only
  api.add_files("server/lib/moment.js", "server");
  api.export("moment", "server");
});
Then you can use Npm.require in your server-side code just like this:
/packages/my-moment-package/server/lib/moment.js
moment = Npm.require("moment");
A real moment package would also export moment on the client by loading the client-side version of momentjs.
You can also use the atmosphere npm package (http://atmospherejs.com/package/npm), which lets you use NPM packages directly in your server code without the need to wrap them in a Meteor package first.
Of course, if a specific NPM package has already been converted to Meteor and is well supported on Atmosphere, you should use it.

Grunt task to optionally include an AMD module for different environments

I'm developing a web app using Require.js for AMD and amplify.request to abstract away my AJAX calls. The other advantage to amplify.request is that I've defined an alternative module containing mocked versions of my requests that I can use for testing purposes. Currently, I'm switching between the two versions of my request module by simply commenting/un-commenting the module reference in my main.js file.
What I'd love to do is use Grunt to create different builds of my app depending on which module I want included. I could also use it to do things like turn my debug mode on or off. I'm picturing something similar to usemin, only for references inside JavaScript, not HTML.
Anyone know of a plugin that does this, or have a suggestion about how I could do it with Grunt?
On our current project we have a few different environments. For each of them, we can specify different configuration settings for the requirejs build.
To distinguish between these different environments, I've used a parameter target.
You can simply pass this to grunt by appending it to your call like
grunt --target=debug
And you can access this parameter in the Gruntfile, by using grunt.option, like
var target = (grunt.option('target') || 'debug').toLowerCase();
The line above will default to debug. You could then make use of the paths configuration setting of requirejs to point the build to the correct module. Example code below.
requirejs: {
  compile: {
    options: {
      paths: {
        "your/path/to/amplify/request": target === "debug" ? "path/to/mock" : "path/to/real",
      }
    }
  }
}
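With that in place, application code always requires the same module id and never needs to know which implementation is behind it (a sketch; the module id below mirrors the placeholder used above):
// main.js (sketch)
require(["your/path/to/amplify/request"], function (request) {
  // in a debug build this resolves to the mock module, otherwise the real one;
  // use it the same way in both cases
});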
