Gatsby Failed Build - error "window" is not available during server side rendering - node.js

I have been trying to build my gatsby (react) site recently using an external package.
The link to this package is "https://github.com/transitive-bullshit/react-particle-animation".
Since I can only change the component's props, I cannot read or write the package file where the error originates, because node_modules is not included in the public folder that gatsby build produces.
What I have tried:
Editing the package file locally. This worked on my machine, but Netlify only receives the public folder and the corresponding package.json files, not the node_modules folder, so it installs the package straight from GitHub and never sees the file I changed.
As a solution, found in a comment on this question, we can use patch-package to save our fixes to the node module and then use them wherever we want.
This actually worked for me!
To explain how I fixed it (most of it is already written in the patch-package docs), here are the main points:
I first made changes to the local package files that were causing the error (for me they were in my node_modules folder).
Then I followed the patch-package documentation to guide myself through the rest.
It worked after I pushed my changes to GitHub, so that patch-package now always gives me my edited version of the node module.
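For reference, the core of that workflow is roughly the following; the package name matches the one above, and the postinstall script is how the patch gets re-applied on Netlify (or any other CI) after npm install:
# edit the broken file inside node_modules/react-particle-animation, then:
npx patch-package react-particle-animation   # writes patches/react-particle-animation+<version>.patch

# commit the generated patches/ folder, and make sure the patch is re-applied on every install (package.json):
"scripts": {
  "postinstall": "patch-package"
}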

When dealing with third-party modules that use window in Gatsby, you need to add a null loader to Gatsby's webpack configuration to skip that module during server-side rendering (SSR). This is because gatsby develop runs in the browser (where there is a window), while gatsby build runs in Node, where window and the other browser globals are not defined.
exports.onCreateWebpackConfig = ({ stage, loaders, actions }) => {
  if (stage === "build-html") {
    actions.setWebpackConfig({
      module: {
        rules: [
          {
            test: /react-particle-animation/,
            use: loaders.null(),
          },
        ],
      },
    })
  }
}
Keep in mind that the test value is a regular expression that will be matched against a folder under node_modules, so ensure that /react-particle-animation/ is the right name.
Using patch-package may work, but keep in mind that you are adding an extra package, another bundled file, that could potentially affect the performance of the site. The snippet above is a built-in solution that runs when you build your application.

Related

Detect whether an npm package can run on browser, node or both

I'm building a NextJs application which uses a set of abstractions to enable code from both React (component logic etc.) and Node (API logic) to utilize the same types and classes. The motive here is to make the development process feel seamless across the client side and server side.
For example, a call to the User.create method of the User class will behave differently based on the runtime environment (i.e. browser or Node): in the browser it will call an API with a POST request, and on the server it will persist data to a database.
So far this pattern has worked just fine, and I like how it all turned out in terms of code structure. However, for this to work, I have to import the modules responsible for the server and browser work into a single module (which in this case is the User class), and this leads to a critical error when trying to resolve dependencies in each module.
For example, I'm using firebase-admin on the server to persist data to the Firebase Firestore. But it uses fs, which cannot be resolved when the same code runs in the browser.
As a workaround I set the resolve alias of firebase-admin to false inside the webpack configuration (given it runs in the browser); see the code below.
/** next.config.js **/
module.exports = {
  webpack: (config, { isServer }) => {
    if (!isServer) {
      // set alias of node specific modules to false
      // eg: service dependencies
      config.resolve.alias = {
        ...config.resolve.alias,
        'firebase-admin': false,
      }
    } else {
      // set alias of browser only modules to false.
      config.resolve.alias = {
        ...config.resolve.alias,
      }
    }
    return config
  },
}
While this does the trick, it won't be long before the process gets really tedious, since every such dependency has to be added to the resolve aliases.
So, my approach is to write a script that runs prior to npm run dev (or manually), reads all dependencies in package.json, SOMEHOW identifies packages that will not run in a specific runtime environment, and adds them to the webpack config. For this to work there should be a way to identify the nature of each dependency, which I don't think is something that comes right out of the box from npm or the packages themselves.
Any suggestion on how this can be achieved is really appreciated. Thanks!
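A minimal sketch of the kind of script described above, under a deliberately rough assumption: a dependency whose own package.json has no "browser" field is treated as server-only and becomes a candidate for aliasing to false in client builds. The script name and the generated server-only.json file are made up for illustration; the heuristic is not an npm feature and will misclassify some packages.
// scripts/find-server-only.js (hypothetical helper)
const fs = require('fs');
const path = require('path');

const pkg = JSON.parse(fs.readFileSync('package.json', 'utf8'));
const deps = Object.keys(pkg.dependencies || {});

const serverOnly = deps.filter((name) => {
  const depPkgPath = path.join('node_modules', name, 'package.json');
  if (!fs.existsSync(depPkgPath)) return false;
  const depPkg = JSON.parse(fs.readFileSync(depPkgPath, 'utf8'));
  // Assumption: packages without a "browser" field are likely Node-only.
  return !depPkg.browser;
});

// next.config.js could read this file and spread the entries into config.resolve.alias
fs.writeFileSync('server-only.json', JSON.stringify(serverOnly, null, 2));
console.log('Candidates to alias to false on the client:', serverOnly);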

AppImage from electron-builder with file system not working

I'm using Electron Builder to compile my Electron app to an .AppImage file, and I'm using the fs module to write to a .json file, but it's not working in the AppImage format (it works fine in the normal version not made with Electron Builder). I can still read from the file.
The code (preload):
setSettings: (value) => {fs.writeFileSync(path.join(__dirname, "settings.json"), JSON.stringify(value), "utf8")}
The code (on the website):
api.setSettings(settings);
The project: https://github.com/Nils75owo/crazyshit
That's not a problem with AppImage or Electron Builder, but with the way you're packaging your app. Since you didn't post your package.json*, I can only guess what's wrong, but you probably haven't changed Electron Builder's default behaviour regarding packing your application.
By default, Electron Builder compiles your application, including all resources, into a single archive file in the ASAR format (think of it like the TAR format). Electron includes a patched version of the fs module to be able to read from the ASAR file, but writing to it is obviously not supported.
You have two options to mitigate this problem: Either you store your settings somewhere in the user's directory (which is the way I'd go, see below) or you refrain from packing your application to an ASAR file, but that will leave all your JavaScript code outside the executable in a simple folder. (Note that ASAR is not capable of keeping your code confidential, because there are applications which can extract such archives, but it makes it at least a little harder for attackers or curious eyes to get a copy of your code.)
To disable packing to ASAR, simply tell Electron Builder that you don't want it to compile an archive. Thus, in your package.json, include the following:
{
  // ... other options
  "build": {
    // ... other build options
    "asar": false
  }
}
However, as I mentioned above, it's probably wiser to store settings in a common place where advanced users can actually find (and probably edit, mostly for troubleshooting) them. On Linux, one such folder would be ~/.config, where you could create a subdirectory for your application.
To get the specific application data path on a cross-platform basis, you can query Electron's app module from within the main process. You can do so like this:
const { app } = require("electron"),
      path = require("path");

var configPath;

try {
  configPath = path.join(app.getPath("appData"), "your-app-name");
} catch (error) {
  console.error(error);
  app.quit();
}
If you however have correctly set your application's name (by using app.setName ("...");), you can instead simply use app.getPath ("userData"); and omit the path joining. Take a look at the documentation!
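Applied to the setSettings example from the question, a sketch for the main process could look like the following; settings.json is kept from the question, and a preload script would need to receive the resolved path (or the whole function) via IPC or the context bridge, since the app module is only available in the main process:
const { app } = require("electron");
const fs = require("fs");
const path = require("path");

// Assumes app.setName("...") has been called, so userData points to your app's own folder.
const settingsPath = path.join(app.getPath("userData"), "settings.json");

function setSettings(value) {
  // Writes outside the read-only ASAR archive, so this also works in the AppImage.
  fs.writeFileSync(settingsPath, JSON.stringify(value), "utf8");
}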
For my Electron Applications, I typically choose to store settings in a common hidden directory (one example for a name could be the brand under which you plan to market the application) in the user's home directory and I then have separate directories for each application. But how you organise this is up to you completely.
* For the future, please refrain from directing us to a GitHub repository and instead include all information (and code is information too) needed to replicate/understand your problem in your question. That'd save us a lot of time and could potentially get you answers faster. Thanks!

How to setup StencilJS components on S3 and CloudFront

I have a few components and I want to deploy them to S3 and make them reachable via CloudFront.
My problem is that I don't know which file(s) I need to upload to S3 and which file CloudFront needs to point to as the entry point.
Here's my stencil.config.tsx:
import { Config } from '@stencil/core';

export const config: Config = {
  namespace: 'stencil-test',
  taskQueue: 'async',
  outputTargets: [
    {
      type: 'dist',
      esmLoaderPath: '../loader',
      dir: './build/dist'
    },
    {
      type: 'www',
      serviceWorker: null // disable service workers
    }
  ]
};
I tried executing npm run build, which generated a couple of folders: build/loader and build/dist. There's a lot of stuff within each folder, but I have no idea which folders and files are supposed to do what.
I was hoping the build command would generate a minified file that contained all the stuff needed (is this how it works?) so I could eventually do something like the following where I want to use my components:
<script type="module" src='https://cdn.jsdelivr.net/npm/my-name#0.0.1/dist/myname.js'></script>
Can anyone offer some guidance or point me towards any resources?
The www output target is meant for generating apps and is not really relevant for component libraries. To host your components, you should upload the whole generated dist folder. Only the files that the client needs are downloaded, which depends on the client and which components it accesses (lazy loading), so you don't need to worry about the number of files. See https://stenciljs.com/docs/distribution.
To start, Stencil was designed to lazy-load itself only when the component was actually used on a page. There are many benefits to this approach, such as simply adding a script tag to any page and the entire library is available for use, yet only the components actually used are downloaded.
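Concretely, with the dist output target above (namespace stencil-test, dir ./build/dist), uploading the contents of build/dist to S3 and pointing a script tag at the lazy loader entry should be enough; the CloudFront domain below is a placeholder:
<script type="module" src="https://YOUR_CLOUDFRONT_DOMAIN/stencil-test/stencil-test.esm.js"></script>
Any page that loads this script can then use the components directly as tags, and each component's chunk is fetched on demand.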
If you want to generate a single bundle containing all your components, there's an output target called dist-custom-elements-bundle. For the differences to dist you can have a look at the same docs link above.
One of the main differences is that loading the script doesn't automatically register the components for you; you'll have to either do it manually per component (using customElements.define()) or define them all using the defineCustomElements() export. The official documentation for that output target is https://stenciljs.com/docs/custom-elements.
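For illustration, registration with that output target could look roughly like this; the import path assumes the default output directory, and my-component / MyComponent are placeholders for one of your own components:
import { defineCustomElements } from 'stencil-test/dist/custom-elements';

// register every component at once ...
defineCustomElements();

// ... or register single components manually:
// import { MyComponent } from 'stencil-test/dist/custom-elements';
// customElements.define('my-component', MyComponent);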

Webpack bundle dynamic client config

We have a node.js app bundled for production using Webpack.
Our problem is how to add dynamic configuration after we already have a bundle, without the need to re-bundle.
On the server side we can just use Node env variables, but how can this be done for the client bundle? Specifically, we need to tell a browser module which API server address to connect to.
Having a js/json file with the configuration causes the configuration values to be injected into the bundle, so they can't be changed afterwards (at least not comfortably, without opening the bundle file and manually finding and replacing them).
Using something like express-expose isn't something we want, since it causes another network request to get the data, and our server address is dynamic.
node-config etc. don't work on the client side.
You can make creative use of the externals option:
externals: [
  { appConfig: 'var appConfig' },
],
If you add that to your configuration you can just let your web server add a script tag with var appConfig = {"config":"value"}; somewhere before the loading of your webpack bundle, and a simple require('appConfig') will pick it up.
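Put together, the moving parts could look roughly like this; the apiServer key and the URLs are made-up example values:
<!-- index.html, rendered by the server before the bundle is loaded -->
<script>var appConfig = { apiServer: "https://api.example.com" };</script>
<script src="/bundle.js"></script>

// anywhere inside the bundled client code
const appConfig = require('appConfig'); // resolved through the externals mapping above
fetch(appConfig.apiServer + '/status');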

Meteor.js: How do you require or link one javascript file in another on the client and the server?

1) In Node on the backend, to link one JavaScript file to another we use require and module.exports.
This allows us to create modules of code and link them together.
How do you do the same thing in Meteor?
2) On the front end in Meteor, if I want to access code from another front-end JavaScript file, I have to use globals. Is there a better way to do this, so I can require one JavaScript file in another file? I think something like Browserify does this, but I am not sure how to integrate it with Meteor.
Basically if on the client I have one file
browserifyTest.coffee
test = () ->
  alert 'Hello'
I want to be able to access this test function in another file
test.coffee
Template.profileEdit.rendered = ->
  $ ->
    setPaddingIfMenuOpen()
    test()
How can I do this in Meteor without using globals?
Meteor wraps the code of every file you create in a module: (function(){ ...your code... })(). If you want to export something out of your js file (module), make it a global, i.e. don't use var with the variable name you want to export, and it'll be accessible in all files which get included after this module. Keep in mind the order in which Meteor includes js files: http://docs.meteor.com/#structuringyourapp
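As a small illustration of that rule (the file names are arbitrary, and the load order relies on lib/ directories being loaded first):
// client/lib/helpers.js -- loaded early because it sits in a lib/ folder
// no `var`, so `sayHello` becomes an app-wide global
sayHello = function () {
  alert('Hello');
};

// client/templates/profile.js -- loaded later, can use the global directly
Template.profileEdit.rendered = function () {
  sayHello();
};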
I don't think you can do this without using globals. Meteor wraps the code in js files in SEF (self-executing function) expressions, and the exports API is available for packages only. What problem do you have with globals exactly? I've worked with fairly large Meteor projects, and while using a global object to keep my global helpers namespaced, I never had any issues with this approach of accessing functions/data from one file in other files.
You can use a local package, which is just like a normal Meteor package but used only in your app.
If the package proves to be useful in other apps, you may even publish it on atmosphere.
I suggest you read the WIP section "Writing Packages" of the Meteor docs, but expect breaking changes in coming weeks as Meteor 0.9 will include the final Package API, which is going to be slightly different.
http://docs.meteor.com/#writingpackages
Basically, you need to create a package directory (my-package) and put it under /packages.
Then you need a package description file which needs to be named package.js at the root of your package.
/packages/my-package/package.js
Package.describe({
  summary: "Provides test"
});

Package.on_use(function (api) {
  api.use(["underscore", "jquery"], "client");
  api.add_files("client/lib/test.js", "client");
  // api.export is what you've been looking for all along!
  api.export("Test", "client");
});
Usually I try to mimic the Meteor application structure in my package, which is why I'd put test.js under my-package/client/lib/test.js: it's a utility function residing on the client.
/packages/my-package/client/lib/test.js
Test = {
  test: function () {
    alert("Hello!");
  }
};
Another package convention is to declare a package-global object containing everything public and then exporting this single object so the app can access it.
The variables you export NEED to be package-global so don't forget to remove the var keyword when declaring them : package scope is just like regular meteor app scope.
Last but not least, don't forget to meteor add your package :
meteor add my-package
And you will be able to use Test.test in the client without polluting the global namespace.
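Tying this back to the question, the template code can then call the exported object directly; setPaddingIfMenuOpen is kept from the question and assumed to be defined elsewhere:
Template.profileEdit.rendered = function () {
  $(function () {
    setPaddingIfMenuOpen();
    Test.test(); // exported by the package, no global pollution in the app itself
  });
};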
EDIT due to second question posted in the comments.
Suppose now you want to use NPM modules in your package.
I'll use momentjs as an example because it's simple yet interesting enough.
First you need to call Npm.depends in package.js, we'll depend on the latest version of momentjs :
/packages/my-moment-package/package.js
Package.describe({
summary:"Yet another moment packaged for Meteor"
});
Npm.depends({
"moment":"2.7.0"
});
Package.on_use(function(api){
api.add_files("server/lib/moment.js");
api.export("moment","server");
});
Then you can use Npm.require in your server side code just like this :
/packages/my-moment-package/server/lib/moment.js
moment = Npm.require("moment");
A real moment package would also export moment in the client by loading the client side version of momentjs.
You can use the atmosphere npm package (http://atmospherejs.com/package/npm), which lets you use NPM packages directly in your server code without the need to wrap them in a Meteor package first.
Of course if a specific NPM package has been converted to Meteor and is well supported on atmosphere you should use it.
