Background
I've inherited a legacy build system based on Webpack 4 and driven through yarn, which runs both on local development environments and in the cloud. Recently, Microsoft began rolling out a breaking change whereby the build agents allocated to the pipeline that runs this build system are now sometimes provisioned with Node 18 instead of Node 16.
I've come to learn that a breaking change in Node 18 is that it ships with a new OpenSSL provider, which has dropped support for the old md4 cryptographic hashing algorithm that is used internally by Webpack 4.
For backwards compatibility, Node 18 has added a new command-line option, --openssl-legacy-provider (which can also be passed via NODE_OPTIONS), which can be used to force it to use a legacy OpenSSL provider that still supports the md4 algorithm. More recent versions of Node 16 have also back-ported support for this option, but only in the case that a newer version of the OpenSSL provider is being used than the one that ships with Node 16.
The Problem
Currently, we have two versions of the build system: the current one, which runs on Node 16 just fine, and a modified one, which runs on Node 18 using the --openssl-legacy-provider flag but fails on Node 16 because the option is not allowed there.
Ideally, I would like to get the build system into a state where it can run on either Node 16 or Node 18. I believe the key to this is to get both versions of Node to use the same legacy OpenSSL provider.
Potential Solutions
Our current interim solution has been to update our pipelines to force build agents to always use Node 16. Clearly this isn't a good long-term solution, though.
The only reason Node 18 is breaking for us is that Webpack uses the md4 algorithm. It is possible to set output.hashFunction in Webpack's configuration to tell it to use another algorithm like sha256, but there are also hard-coded uses of md4 in Webpack 4 that can't be configured, so this option hasn't solved the issue for us.
I've tried updating this legacy build system to Webpack 5, which has a built-in xxhash64 hashing algorithm that doesn't rely on OpenSSL and so should work on either version of Node. But unfortunately we rely on several third-party plugins that haven't been updated in years and don't have successors that support Webpack 5. So while it would of course be possible to upgrade, it's probably going to be a lot more work than any other solution, and as I mentioned this is a legacy build system.
Another potential solution, which we could do but would prefer not to, would be to restrict our build agents to versions of Node >= 18 and also update all our local development environments to use Node 18. I'd prefer to treat this option as a last resort.
I've also seen references to creating a file called openssl.cnf and setting an OPENSSL_CONF environment variable to the path to that file, as a way to tell any version of Node to use the same OpenSSL provider. But I haven't been able to find any instructions on how to configure an environment variable in local development environments. I'm familiar with using the dotenv library to access environment variables specified in a .env file through process.env, for example, but since this solution doesn't require me to access these variables through code, it doesn't feel applicable here.
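For reference, the commonly cited openssl.cnf for this workaround looks something like the sketch below (for OpenSSL 3, activating the legacy provider alongside the default one); it's worth verifying against your specific OpenSSL and Node versions. OPENSSL_CONF then has to be set before Node starts (exported in the shell, or in the pipeline's variable settings), which is exactly why dotenv, which only runs after Node has started, doesn't apply:

```ini
# openssl.cnf - activate both the default and legacy providers (OpenSSL 3).
# Node reads its OpenSSL config under the "nodejs_conf" application name,
# so both keys are listed here to be safe.
openssl_conf = openssl_init
nodejs_conf = openssl_init

[openssl_init]
providers = provider_sect

[provider_sect]
default = default_sect
legacy = legacy_sect

[default_sect]
activate = 1

[legacy_sect]
activate = 1
```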
I've also looked for ways to query the version of Node when running a yarn script, in order to determine whether or not it's safe to use the --openssl-legacy-provider option, but haven't found any information about how to do this, if it's even possible at all.
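One sketch of how this could look: rather than hard-coding the flag in the yarn script, point the script at a small Node launcher that checks process.allowedNodeEnvironmentFlags (a built-in set of the options the running binary honours in NODE_OPTIONS) and only adds the flag where it is accepted. The file name build.js is an assumption:

```javascript
// build.js (name is illustrative): compose NODE_OPTIONS for the build,
// adding --openssl-legacy-provider only when the current Node binary
// accepts it (Node 18, or a build whose OpenSSL supports the flag).
const LEGACY_FLAG = '--openssl-legacy-provider';

// process.allowedNodeEnvironmentFlags lists the options this binary
// honours in NODE_OPTIONS.
function withLegacyProvider(env) {
  const out = { ...env };
  if (process.allowedNodeEnvironmentFlags.has(LEGACY_FLAG)) {
    out.NODE_OPTIONS = `${out.NODE_OPTIONS || ''} ${LEGACY_FLAG}`.trim();
  }
  return out;
}

const buildEnv = withLegacyProvider(process.env);
console.log('NODE_OPTIONS =', buildEnv.NODE_OPTIONS || '(unset)');
```

The yarn script would then be something like "build": "node build.js", with the launcher going on to spawn webpack via child_process with buildEnv as its environment, so the same script should work on either Node version.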
tl;dr
What I'm looking to achieve is a way to run a yarn script using the legacy OpenSSL provider without knowing ahead of time if it will be running on Node 16 or Node 18.
Related
I am getting error:0308010C:digital envelope routines::unsupported for a serverless package command.
I have referred to multiple SO questions and other documents. Most of them suggest the following:
Downgrading the Node version to an LTS lower than 17:
Downgrading the node version creates a lot of library compatibility issues. For example, one of the many errors I get from the node-fetch library during webpacking is Can't resolve 'node:util' in <project-directory>\node_modules\node-fetch\src
Set Node to use the OpenSSL legacy provider: Newer versions of Node use the encryption algorithms supported by the newer OpenSSL, while the libraries in the project (I'm guessing) still only support the algorithms from OpenSSL 1.1, which is the main reason behind the ERR_OSSL_EVP_UNSUPPORTED issue. So we need to make Node fall back to the algorithms supported by OpenSSL 1.1 by setting NODE_OPTIONS to --openssl-legacy-provider. This solution works for React projects, and even for frontend frameworks like Ionic, where the Node options can easily be overridden in the package.json scripts, or where the framework's documentation explains how to set them so they are picked up during the build stage. But it does not work for backend framework libraries like Serverless. I am not sure how the Node options can be overridden for Serverless, because setting it as an environment variable is not working and I can't find any documentation.
We have a project built with NodeJS. With time, version upgrades become necessary, but while updating the version, if we do not have enough test cases, something might break and we may not find out until much later. Such a scenario occurred when the replaceAll method was used in some part of the code. But replaceAll is not supported until NodeJS 15 or later. So we ran into trouble after merging the code.
Can we check whether the NodeJS code works or not for a specific version?
Demonstration:
I've created a repository on GitHub for this with a workflow to demonstrate the problem. See this run https://github.com/kiranparajuli589/node-check/runs/7573211393?check_suite_focus=true
Here I've used Node 14 and properly configured the engines keyword in package.json, but still the linter does not report the usage of functions that are unavailable in that version.
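Short of a linter rule, one runtime-level guard is explicit feature detection at startup, so an unsupported Node version fails immediately with a clear message instead of crashing mid-run. A minimal sketch, where the feature list is an illustrative assumption:

```javascript
// Fail fast at startup if the runtime lacks features the code relies on.
// replaceAll landed in Node 15, so Node 14 would stop here with a clear
// message instead of crashing later.
const requiredFeatures = {
  'String.prototype.replaceAll': typeof ''.replaceAll === 'function',
  'Object.fromEntries': typeof Object.fromEntries === 'function',
};

const missing = Object.entries(requiredFeatures)
  .filter(([, ok]) => !ok)
  .map(([name]) => name);

if (missing.length > 0) {
  throw new Error(
    `Node ${process.version} is missing required features: ${missing.join(', ')}`
  );
}
```

On the linting side, eslint-plugin-n (formerly eslint-plugin-node) has no-unsupported-features rules that read the engines field, which may be closer to what the question is after.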
I am currently working in an environment where there are multiple node projects, some running on a 14.* node docker image, some on 16.*
The problem I faced was that I didn't realize the Dockerfile was obsolete and that the app would not run as it has been developed under 16.* while the Dockerfile specified a node environment in 14.*
I was wondering if there would be a way to reduce the amount of code that has to be modified if our organization decides to start implementing projects in versions other than the ones we currently use. After thinking about it, I ended up with two main axes of thought:
The environment is set (Dockerfile), the app should be developed under the environment specifications
The environment needs to be set according to the app
After some research I ran into this article about dynamic image specification. This would be pretty convenient, as we could dynamically pass the version of the Node image we want to install for our environment as an argument.
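Concretely, the dynamic-image approach comes down to a build argument in the Dockerfile. This is a sketch; the argument name NODE_VERSION and its default are assumptions:

```dockerfile
# Dockerfile: the base image version is injected at build time, e.g.
#   docker build --build-arg NODE_VERSION=16 .
ARG NODE_VERSION=16
FROM node:${NODE_VERSION}

WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
CMD ["node", "index.js"]
```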
This would require two things:
As a dev, I must define the node version in my package.json
A script must be able to read from that package.json and launch the docker build with the parameter it got and possibly catch the error if it is not defined.
Is this a recommended pattern? I believe it would reduce the amount of manual code changes in case of version updates, but at the same time, given the lack of documentation around this use case, I don't feel like this is a common thing.
My Node application needs to be deployed on Windows and Linux. The main deployment package is built on a Linux CI server.
When this package is deployed to Windows, it crashes immediately due to missing native bindings, such as those for sqlite. Only the bindings for the build platform (Linux) are restored.
With a deadline approaching, we just set up a Windows build configuration which outputs a Windows specific package that contains the appropriate bindings, and we choose the appropriate artifact to bundle in the installer.
This works but feels fragile, as we would need to keep the Node versions in sync between the two otherwise unrelated environments. I would like to be able to do this with a single build configuration.
I couldn't find any guidance on how this is done. I'm imagining a command-line option like --platform=windows for npm ci, or a modification to package.json, but I couldn't find any information about this. Presumably this is a reasonably rare requirement, and perhaps there is no tooling around it, which would be a shame.
Another requirement is that the application must be installed without an internet connection. We cannot run npm ci or npm install when we install it as some of our clients do not permit their servers to access the public internet.
Based on your requirements, it sounds like building a package on each required platform would be the safest bet, with the fewest moving parts to go wrong.
As the comments have suggested, most projects rely on an npm install on the required platform, so you are stepping into fairly uncommon territory.
This works but feels fragile, as we would need to keep the Node versions in sync between the two otherwise unrelated environments. I would like to be able to do this with a single build configuration.
Node uses NODE_MODULE_VERSION (displayed on the releases page) to track ABI compatibility for native modules. This only changes with a new major Node release number.
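That ABI number is also exposed at runtime as process.versions.modules, which is handy for logging in each CI environment to confirm the builds really are ABI-compatible; a small sketch:

```javascript
// NODE_MODULE_VERSION is exposed at runtime; log it in each build
// environment to confirm native-module (ABI) compatibility. Two Node
// installs with the same value can share compiled .node binaries.
console.log('Node version:       ', process.version);
console.log('NODE_MODULE_VERSION:', process.versions.modules);
```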
The CI builds would need to create app packages for each major version of Node you run on each platform. Keeping the Node.js major versions in sync for the application is a good thing in any case. Running Node N and N-1 builds until that can be achieved is good cover, and probably the best option given the air-gap requirements.
NPM Cache
If the air-gapped clients are largely on common networks, an NPM cache/proxy (Nexus/Verdaccio) may be of use. The cache will need a process to snapshot the repo after a production npm install on all required platforms, so it can be pushed out to your endpoints. Unfortunately, binary modules are often distributed out of band from NPM, so they won't be stored in regular NPM caches. Each client instance will then need a complete build environment to compile any native modules from source, which can sometimes present its own difficulties on Windows platforms.
Alternatives
Node.js is not a great platform for distributing packaged applications to many diverse clients, especially if you need to distribute Node itself. Any language with an external VM requirement presents difficulties, and Node's package-management choices and reliance on native modules exacerbate this.
I've given up in the past and converted clients (albeit thin) to Go, as it lends itself to cross platform distribution a lot better by removing the external runtime requirement and having less variables.
I have this strange issue where I am using version 10 of Node (10.9.0) on a server, but things that should work or be supported in that version are not.
For example, according to this table, this version supports Object.values(). On my local node installation - this indeed works, but on server, where I don't have much freedom about what software I am using, it does not.
Is there any way to truly verify the Node version in use (node -v shows 10.9.0, as written above)? Maybe it's only the version of the main binary, while all the libs it uses are from version 6 (that one is also installed on the server)?
The process object that Node.js exposes has a lot of information including the version.
console.log(process.version); // v10.9.0
You can find the Node.js process.version documentation here.
So within your application you can run that to see if it's truly what you expect.
You can also try running which node on your server. That should print the path it's using to find node. If you have multiple copies or installations of node, it might be using a path that is outdated. Making sure your path is up to date will solve that problem, and which node can help debug that.