set proxy in package.json to env variable - node.js

I need to set the proxy value in my package.json at runtime, like with an environment variable. How could this be done?
// package.json
{
  "name": "demo",
  "proxy": process.env.MY_PROXY_VAL, // <- how?
  "dependencies": {},
  "scripts": {}
}
Thanks.

It will automatically read from HTTPS_PROXY or HTTP_PROXY, so you don't need to do that.
From the docs:
A proxy to use for outgoing https requests. If the HTTPS_PROXY or https_proxy or HTTP_PROXY or http_proxy environment variables are set, proxy settings will be honored by the underlying request library.
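For example, you can set the variable when starting the dev server (a minimal sketch, assuming a Unix-like shell and an npm start script; the proxy URL is just a placeholder):
HTTPS_PROXY=http://proxy.example.com:8080 npm start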

So I am converting my comment into an answer, since I think it is important to point out an actual solution to this problem.
I was searching for that exact same answer and also tried setting the HTTP_PROXY and HTTPS_PROXY variables via an .env file, as well as directly from within a script. However, this does not solve the problem, since it overwrites the system proxy settings on the local machine, which I don't think was something the OP intended to do. The result can be that you cannot load npm packages anymore because of incorrect system proxy settings.
Instead, there is a way of manually configuring the proxy for a CRA in development, as pointed out by the official docs:
https://create-react-app.dev/docs/proxying-api-requests-in-development/#configuring-the-proxy-manually
With this approach you create a local setupProxy.js file under the /src folder of the project, which will then override the proxy set in package.json.
Of course you then have to make sure that all the paths are correctly set, but it works well and you get fine-grained control over which pages in your app will be proxied and which will not.
To target specifically your question about how to set the proxy via an environment variable, here is an example how you could do it with the setupProxy approach and the createProxyMiddleware:
// Sample of how to setup a proxy middleware with environment variables
//// file: <project_root>/src/setupProxy.js
const { createProxyMiddleware } = require('http-proxy-middleware');

module.exports = function(app) {
  app.use(
    '/rest',
    createProxyMiddleware({
      target: process.env.REACT_APP_PROXY_HOST,
      changeOrigin: true,
    })
  );
};
//// file: <project_root>/.env
REACT_APP_PROXY_HOST=https://localhost:6700
In this case I only wanted to proxy requests targeted at the /rest endpoint, for which I created a new endpoint. All other requests will still go to the default localhost:3000 URL, serving the React app.
The host is defined via the environment variable REACT_APP_PROXY_HOST. I defined the variable in the local .env file, but you could also directly set it inside a script in package.json if needed.
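For example, a minimal sketch of setting it directly in a package.json script instead of the .env file (this assumes a Unix-like shell; on Windows you would need something like the cross-env package):
//// file: <project_root>/package.json (excerpt, sketch only)
"scripts": {
  "start": "REACT_APP_PROXY_HOST=https://localhost:6700 react-scripts start"
}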
Update:
Even though the original question was already solved for me, I had an additional issue trying to forward requests to a server running on https.
The previous development setup worked fine with the static proxy set in package.json. However, when using createProxyMiddleware and targeting a server running on https with certificates, the path to the certificate in use has to be provided.
// Sample of how to setup a proxy middleware with environment variables
// targeting a server running on https
//// file: <project_root>/src/setupProxy.js
const { createProxyMiddleware } = require('http-proxy-middleware');
const fs = require('fs');

const protocol = JSON.parse(process.env.HTTPS) ? "https:" : "http:";
const host = process.env.REACT_APP_PROXY_HOST;
const port = process.env.REACT_APP_PROXY_PORT;

module.exports = function(app) {
  app.use(
    '/rest',
    createProxyMiddleware({
      target: {
        protocol: protocol,
        host: host,
        port: port,
        pfx: fs.readFileSync('src/root-cert.pem')
      },
      changeOrigin: true,
    })
  );
};
//// file: <project_root>/.env
HTTPS=true
REACT_APP_PROXY_HOST=localhost
REACT_APP_PROXY_PORT=6700
In this case, instead of providing the target as a string, it is given as an object containing protocol, host, port and an attribute pfx, which contains the certificate used to validate the server via https.
Here the certificate path is hardcoded within the project source directory, but it could also be set using an environment variable.
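As a small sketch of that variant, the path could be read from an environment variable (REACT_APP_PROXY_CERT is just an illustrative name, not something CRA defines):
//// file: <project_root>/src/setupProxy.js (excerpt, sketch only)
const fs = require('fs');

// fall back to the hardcoded path when the variable is not set
const certPath = process.env.REACT_APP_PROXY_CERT || 'src/root-cert.pem';
const pfx = fs.readFileSync(certPath);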
The setting HTTPS=true overwrites the default development setup and will by default start the development server at https://localhost:3000.
With this setting as well as providing the correct certificate the server running on https can be reached without issues.
For reference, this solution is officially linked in the documentation of http-proxy-middleware and the underlying node-http-proxy:
https://github.com/chimurai/http-proxy-middleware/blob/master/recipes/https.md
https://github.com/http-party/node-http-proxy#http---https-using-a-pkcs12-client-certificate
This question also got some attention at other places, e.g.
https://github.com/facebook/create-react-app/issues/3783
https://github.com/facebook/create-react-app/issues/4288
https://github.com/usc-isi-i2/kgtk-browser/issues/32
Hope this helps someone searching for the same problem. If there are any updates to this or things change, feel free to add suggestions in the comments.

Related

Deploying React Express and Node app to Heroku Proxy Error

I have successfully launched full stack applications to Heroku in the past by using the following within the client package.json file:
"proxy": "http://localhost:3001"
Now I am getting an "Invalid Host header" error. I did fix that error by removing the proxy and implementing a setupProxy.js file with the following code, but afterwards the app does not call the back end at all and errors out.
const { createProxyMiddleware } = require('http-proxy-middleware');

module.exports = function(app) {
  app.use(
    '/api',
    createProxyMiddleware({
      target: 'http://localhost:3001',
      changeOrigin: true,
    })
  );
};
I'm wondering how to fix this, or whether anything changed recently in Heroku that no longer allows a proxy within the client package.json file?
It turned out to be a seemingly unrelated fix. I had to enter some environment variables within Heroku to allow the server to run. Without the variables, I believe the server would stop with errors, therefore trickling down and causing many problems. So long story short, always remember your environment variables within Heroku.
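One way to make this kind of failure more obvious is to check the required variables when the server starts. A minimal sketch (MONGODB_URI and API_KEY are just hypothetical names, not variables from the question):
// server.js (sketch): fail fast with a clear message if required env vars are missing
const required = ['MONGODB_URI', 'API_KEY'];
const missing = required.filter((name) => !process.env[name]);

if (missing.length > 0) {
  console.error('Missing required environment variables: ' + missing.join(', '));
  process.exit(1);
}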

How to ignore devServer settings in vue.config.js when building for staging environment?

I'm building an app with Vue. I needed https working on my local environment for dev and testing purposes. I successfully created the necessary certificates following this tutorial. To get the dev server to use them I copied the server.key and server.crt files into the root of my project and left the rootCA.pem file in ~/.ssh. I added an entry in /etc/hosts to alias localhost for my domain like:
127.0.0.1 example.test
I then edited vue.config.js in the root of my project to look like:
const fs = require('fs')

module.exports = {
  devServer: {
    host: 'example.test',
    https: {
      key: fs.readFileSync('./server.key'),
      cert: fs.readFileSync('./server.crt'),
      ca: fs.readFileSync(process.env.HOME + '/.ssh/rootCA.pem')
    }
  }
}
This works great locally. My problem is on my staging server.
My staging server is running Ubuntu (16.04) and has an actual SSL cert (i.e. not self-signed) installed using certbot (Let's Encrypt). My project is a static frontend, so I am serving it with nginx configured to point to the /dist directory and make the project available at staging.example.com. This was all working fine before I added the devServer config in vue.config.js above.
What I expect to happen:
When I build the project for staging, i.e.
npm run build -- --mode staging
I expected it to ignore the devServer config section since NODE_ENV === 'production' (set in my .env.staging file).
What actually happens:
The build process fails, complaining that:
Error loading vue.config.js:
Error: ENOENT: no such file or directory, open './server.key'
Question: How do I get the build process to ignore the devServer section of vue.config.js when building for "staging" or "production?"
vue.config.js doesn't automatically detect the environment. As it says in the Vue CLI configuration docs:
If you need conditional behavior based on the environment, or want to directly mutate the config, use a function (which will be lazy evaluated after the env variables are set). The function receives the resolved config as the argument. Inside the function, you can either mutate the config directly, OR return an object which will be merged.
How to actually do this was not immediately clear to me. Here's what ended up working:
const fs = require('fs')

module.exports = {
  configureWebpack: config => {
    if (process.env.NODE_ENV !== 'production') {
      config.devServer = {
        host: 'qzuku.test',
        // https://medium.freecodecamp.org/how-to-get-https-working-on-your-local-development-environment-in-5-minutes-7af615770eec
        https: {
          key: fs.readFileSync('./qzukuDevServer.key'),
          cert: fs.readFileSync('./qzukuDevServer.crt'),
          ca: fs.readFileSync(process.env.HOME + '/.ssh/rootDevCA.pem')
        }
      }
    }
  }
}
Hope this helps!

Using dotenv with bundled client side code

I am creating a node js application as a Contentful UI Extension. The code is hosted here: https://github.com/doodybrains/media-tagging-extension
A lot of the gulp file is boilerplate, but in the end everything gets bundled into an index.html file. I know that env variables shouldn't be called or processed in client code, but I don't know how to get them in there before the project is built. When I run the repo in development and call process.env.NAME_OF_TOKEN from src/index.js, it returns undefined. I have tried importing dotenv, creating a gulp env pipeline, etc.
ANY ADVICE will be so helpful. The app is being deployed to Netlify and I already have the env variables set up there as well.
thank you
You can create another js file which uses NODE_ENV to select the correct variables.
I prefer calling a service to get all my properties on app start and storing them in a map. I then use the map to read values in different places in my code.
Some sample code...
const env = process.env.NODE_ENV || 'local';

const local = {
  URL: 'local url',
  HOST: 'local host',
  ENV: 'local'
};
const sit = {
  URL: 'sit url',
  HOST: 'sit host',
  ENV: 'sit'
};
const uat = {
  URL: 'uat url',
  HOST: 'uat host',
  ENV: 'uat'
};

// map each environment name to its settings so config[env] resolves
const config = { local, sit, uat };

var property_service_url = config[env].URL;
var property_service_host = config[env].HOST;
Before starting your app, you can set NODE_ENV to the desired environment. For example, on Linux:
export NODE_ENV=uat
This will make sure the environment is set correctly. Now in your app.js you can call the service to load your properties. If you don't want to call a service, you can set them in the same way the URL and HOST are set above.
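As a rough sketch of the "call a service on app start" idea (the /properties endpoint is made up, property_service_url comes from the sample above, and this assumes Node 18+ where fetch is built in):
// load properties from a service once at startup and cache them in a map
const properties = new Map();

async function loadProperties() {
  const res = await fetch(property_service_url + '/properties');
  const data = await res.json();
  Object.entries(data).forEach(([key, value]) => properties.set(key, value));
}

loadProperties().then(() => {
  // start the app here and read values with properties.get('SOME_KEY') where needed
});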

Should I use a NodeJS global to make config properties available to my application?

My question is relatively simple:
In NodeJS you have the ability to set globals using
GLOBAL.myConfigObject = {...}
My question for the developer community is whether or not this is best practice. If not, what would be a better way to relay config variables (such as api-url or Port or ip address) to the entire application.
Instead of setting a global for the config, use a global for the environment you're launching in. Setup a config directory that has each of your environments:
config/local.js
module.exports = {
  port: 3000,
  ipAddress: "127.0.0.1"
}
config/production.js
module.exports = {
  port: 443,
  ipAddress: "8.8.8.8"
}
in your main file where you spin up your server:
server.js
var config = require("./config/" + process.env.NODE_ENV)
// use your config however you need it
spinning up your server from the command line, you can then do:
NODE_ENV=local node server.js
This way, you have a single global variable indicating your environment, but you can use whatever configuration variables you need for that environment.
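For instance, a minimal sketch of how server.js might actually use that config (the HTTP handler is just a placeholder):
// server.js (sketch)
var http = require("http")
var config = require("./config/" + process.env.NODE_ENV)

http.createServer(function(req, res) {
  res.end("ok")
}).listen(config.port, config.ipAddress, function() {
  console.log("Listening on " + config.ipAddress + ":" + config.port)
})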
The most common way to supply config such as port and api_keys is environment variables.
bash
$ export PORT=8080
You can access an environment variable like this:
node.js
node> process.env.PORT
In my project's case, it's very useful for the connection config. I use it for all of my servers, and if I have to change my database, I can switch pretty easily.

How to setup gulp browser-sync for a node / react project that uses dynamic url routing

I am trying to add BrowserSync to my react.js node project. My problem is that my project manages the URL routing, listening port and mongoose connection through the server.js file, so obviously when I run a browser-sync task and check the localhost URL http://localhost:3000 I get a Cannot GET /.
Is there a way to force browser-sync to use my server.js file? Should I be using a secondary nodemon server or something (and if i do how can the cross-browser syncing work)? I am really lost and all the examples I have seen add more confusion. Help!!
gulp.task('browser-sync', function() {
  browserSync({
    server: {
      baseDir: "./"
    },
    files: [
      'static/**/*.*',
      '!static/js/bundle.js'
    ],
  });
});
We had a similar issue that we were able to fix by using proxy-middleware (https://www.npmjs.com/package/proxy-middleware). BrowserSync lets you add middleware so you can process each request. Here is a trimmed down example of what we were doing:
var proxy = require('proxy-middleware');
var url = require('url');

// the base url where to forward the requests
var proxyOptions = url.parse('https://appserver:8080/api');
// Which route browserSync should forward to the gateway
proxyOptions.route = '/api';

// so an ajax request to browserSync http://localhost:3000/api/users would be
// sent via proxy to http://appserver:8080/api/users while letting any requests
// that don't have /api at the beginning of the path fall back to the default behavior.
browserSync({
  // other browserSync options
  // ....
  server: {
    middleware: [
      // proxy /api requests to api gateway
      proxy(proxyOptions)
    ]
  }
});
The cool thing about this is that you can change where the proxy is pointed, so you can test against different environments. One thing to note is that all of our routes start with /api which makes this approach a lot easier. It would be a little more tricky to pick and choose which routes to proxy but hopefully the example above will give you a good starting point.
The other option would be to use CORS, but if you aren't dealing with that in production it may not be worth messing with for your dev environment.
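If you did go the CORS route instead, a minimal sketch might look like this on an Express backend (this assumes the cors npm package and that your dev server runs on http://localhost:3000; adjust as needed):
var express = require('express');
var cors = require('cors');

var app = express();

// allow the browser-sync / dev server origin to call the API directly
app.use(cors({ origin: 'http://localhost:3000' }));

app.get('/api/users', function(req, res) {
  res.json([]);
});

app.listen(8080);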
