I deployed my server on Heroku, but when I make any request it returns a "500 Internal Server Error". It runs fine locally, though. Could anyone help me figure out what's going on?
When I check my logs, this is what I'm getting:
2021-06-08T18:43:09.715406+00:00 app[web.1]: error: no pg_hba.conf entry for host "3.90.138.215", user "detmvsbueicsez", database "da9nlve42hcp91", SSL off
Repo Link: https://github.com/zcason/Restaurant-Review-Server
Live App: https://restaurant-review-phi.vercel.app/
As mentioned here on Heroku Help, this indicates that there was a failed authentication attempt against the database, so the connection couldn't be established. This can happen for different reasons.
In your case I suspect it's related to not using SSL.
After taking a look at the code in the GitHub repo, I noticed you are using Knex and getting the connection string from .env.
Try this:
Append ?ssl=true to the end of DATABASE_URL in your .env file.
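For example (a hypothetical connection string with placeholder credentials, not your real ones), the line in .env would look like:
DATABASE_URL=postgres://user:password@some-host.compute-1.amazonaws.com:5432/dbname?ssl=true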
Then edit your server.js (I didn't take a close look at the code, so you also need to add ssl: { rejectUnauthorized: false } to your connection config):
const db = knex({
  client: 'pg',
  connection: {
    connectionString: DATABASE_URL,
    ssl: { rejectUnauthorized: false }
  }
});
Also make sure you're using the right user, password, database name, etc.
Or, alternatively:
Run heroku config:set PGSSLMODE=no-verify in the terminal to set PGSSLMODE to no-verify; then you can omit the ssl configuration object.
I have deployed Consul using the hashicorp-consul-helm-chart.
Now I want to connect to Consul from my Node.js project.
Therefore, I created an object like this (using the 'consul' npm package):
import consul from 'consul';
var consulObj = new consul({
  host: 'xxx.xxx.xxx.xxx',
  promisify: true
});

var watch = consulObj.watch({
  method: consulObj.kv.get,
  options: { key: 'config' },
  backoffFactor: 1000,
});
I got the host value from kubectl get endpoints and used the value listed next to consul-server.
Still, I get consul: kv.get: connect ETIMEDOUT when I run the code.
What could be the reason?
Thanks in advance!
You should be accessing the Consul client agent running on the node where your app is located, instead of directly accessing the server.
Details can be found in the accepted answer for Hashicorp Consul, Agent/Client access.
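A minimal sketch of that approach, assuming the Helm chart runs client agents as a DaemonSet listening on port 8500 and that you expose the node's IP to your pod through the Kubernetes downward API (e.g. an env var HOST_IP populated from status.hostIP):
import consul from 'consul';

// Talk to the local Consul client agent running on this pod's node,
// instead of reaching the server pods directly.
var consulObj = new consul({
  host: process.env.HOST_IP, // node IP injected via the downward API
  port: 8500,
  promisify: true
});

var watch = consulObj.watch({
  method: consulObj.kv.get,
  options: { key: 'config' }
});

watch.on('change', (data) => console.log('config changed:', data));
watch.on('error', (err) => console.error('watch error:', err));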
I am using this package in a Node.js app: https://www.npmjs.com/package/scp2 and calling upload(src, dest) against a Unix server. It is not connecting to the Unix server, and the only error I see is Exit code 255 while establishing SFTP session. I believe it stopped connecting because something changed in the Unix server's OpenSSL package, but even after that change was rolled back, it is still not working as it was before.
Is there a way to display the debug lines for scp2.upload()? I am stepping through the scp2 package in vscode debug mode but not finding anything.
I added debug: console.log to the scp2 config, as such:
this.sshCreds = {
  host: this.host,
  username: this.username,
  password: this.password,
  debug: console.log,
};
this.sshClient = new SshClient(this.sshCreds);
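For what it's worth, here is a minimal sketch of how the debug output should surface once the client is constructed (localPath and remotePath are placeholders for your own paths; with debug: console.log set, ssh2 should print a line for each protocol-level step such as the banner exchange, key exchange, and authentication, which usually shows where the session dies):
// Placeholder paths; substitute your own source file and remote destination.
const localPath = './build/artifact.tar.gz';
const remotePath = '/home/deploy/artifact.tar.gz';

this.sshClient.upload(localPath, remotePath, (err) => {
  if (err) {
    // The ssh2 handshake trace from debug: console.log will already
    // have been printed above this error.
    console.error('scp2 upload failed:', err);
  }
});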
I'm using Node.js request.js to reach an API. I'm getting this error:
[Error: UNABLE_TO_VERIFY_LEAF_SIGNATURE]
All of my credentials are accurate and valid, and the server's fine. I made the same request successfully with Postman.
request({
  "url": domain + "/api/orders/originator/" + id,
  "method": "GET",
  "headers": {
    "X-API-VERSION": 1,
    "X-API-KEY": key
  },
}, function(err, response, body){
  console.log(err);
  console.log(response);
  console.log(body);
});
This code is just running in an executable script, e.g. node ./run_file.js. Is that why? Does it need to run on a server?
Note: the following is dangerous, and will allow API content to be intercepted and modified between the client and the server.
This also worked:
process.env['NODE_TLS_REJECT_UNAUTHORIZED'] = '0';
It's not an issue with the application, but with the certificate, which is signed by an intermediary CA.
If you accept that fact and still want to proceed, add the following to request options:
rejectUnauthorized: false
Full request:
request({
  "rejectUnauthorized": false,
  "url": domain + "/api/orders/originator/" + id,
  "method": "GET",
  "headers": {
    "X-API-VERSION": 1,
    "X-API-KEY": key
  },
}, function(err, response, body){
  console.log(err);
  console.log(response);
  console.log(body);
});
The Secure Solution
Rather than turning off security, you can add the necessary certificates to the chain. First, install the ssl-root-cas package from npm:
npm install ssl-root-cas
This package contains many intermediary certificates that browsers trust but node doesn't.
var sslRootCAs = require('ssl-root-cas/latest')
sslRootCAs.inject()
This will add the missing certificates. See here for more info:
https://git.coolaj86.com/coolaj86/ssl-root-cas.js
CoolAJ86's solution is correct and it does not compromise your security like disabling all checks using rejectUnauthorized or NODE_TLS_REJECT_UNAUTHORIZED. Still, you may need to inject an additional CA's certificate explicitly.
I tried first the root CAs included by the ssl-root-cas module:
require('ssl-root-cas/latest')
.inject();
I still ended up with the UNABLE_TO_VERIFY_LEAF_SIGNATURE error. Then I found out who issued the certificate for the web site I was connecting to using the COMODO SSL Analyzer, downloaded that authority's certificate, and tried to add only that one:
require('ssl-root-cas/latest')
.addFile(__dirname + '/comodohigh-assurancesecureserverca.crt');
I ended up with another error: CERT_UNTRUSTED. Finally, I injected the additional root CAs and included "my" (apparently intermediary) CA, which worked:
require('ssl-root-cas/latest')
.inject()
.addFile(__dirname + '/comodohigh-assurancesecureserverca.crt');
For Create React App (where this error occurs too, and this question is the #1 Google result), you are probably using HTTPS=true npm start together with a proxy (in package.json) that points at an HTTPS API which is itself self-signed in development.
If that's the case, consider changing proxy like this:
"proxy": {
"/api": {
"target": "https://localhost:5001",
"secure": false
}
}
secure decides whether the webpack proxy checks the certificate chain; disabling it means the API's self-signed certificate is not verified, so you still get your data.
It may be very tempting to do rejectUnauthorized: false or process.env['NODE_TLS_REJECT_UNAUTHORIZED'] = '0';, but don't do it! It exposes you to man-in-the-middle attacks.
The other answers are correct in that the issue lies in the fact that your cert is "signed by an intermediary CA." There is an easy solution to this, one which does not require a third party library like ssl-root-cas or injecting any additional CAs into node.
Most HTTPS clients in Node support options that allow you to specify a CA per request, which will resolve UNABLE_TO_VERIFY_LEAF_SIGNATURE. Here's a simple example using Node's built-in https module.
import { readFileSync } from 'fs';
import https from 'https';

const options = {
  host: '<your host>',
  defaultPort: 443,
  path: '<your path>',
  // assuming the bundle file is co-located with this file
  ca: readFileSync(__dirname + '/<your bundle file>.ca-bundle'),
  headers: {
    'content-type': 'application/json',
  }
};

https.get(options, res => {
  // do whatever you need to do
});
If, however, you can configure the SSL settings on your hosting server, the best solution is to add the intermediate certificates there. That way the client doesn't need to specify a CA, since the chain is served by the server itself. I personally use Namecheap + Heroku. The trick for me was to create one .crt file with cat yourcertificate.crt bundle.ca-bundle > server.crt. I then opened up this file and added a newline after the first certificate. You can read more at
https://www.namecheap.com/support/knowledgebase/article.aspx/10050/33/installing-an-ssl-certificate-on-heroku-ssl
You can also try setting strictSSL to false, like this:
{
  url: "https://...",
  method: "POST",
  headers: {
    "Content-Type": "application/json"
  },
  strictSSL: false
}
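For context, a minimal sketch of where that flag sits in a full call with the request package (reusing the domain, id, and key variables from the question; note that strictSSL: false disables certificate verification, so treat it as a development-only workaround):
const request = require('request');

request({
  url: domain + '/api/orders/originator/' + id,
  method: 'GET',
  headers: {
    'X-API-VERSION': 1,
    'X-API-KEY': key
  },
  strictSSL: false // skip TLS certificate validation (insecure)
}, function (err, response, body) {
  console.log(err, body);
});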
I had the same issue. I followed #ThomasReggi's and #CoolAJ86's solutions and they worked well, but I'm not satisfied with them, because the "UNABLE_TO_VERIFY_LEAF_SIGNATURE" issue happens at the certificate-configuration level.
I accept #thirdender's solution, but it's only a partial solution. As the official nginx documentation clearly states, the certificate should be a combination of the server certificate and the chained (intermediate) certificates.
Just putting this here in case it helps someone; my case was different and a bit of an odd mix. I was getting this on a request made via superagent. The problem had nothing to do with certificates (which were set up properly) and everything to do with the fact that I was passing the superagent result through the async module's waterfall callback. To fix it: instead of passing the entire result, just pass result.body through the waterfall's callback.
The following commands worked for me:
> npm config set strict-ssl false
> npm cache clean --force
The problem is that you are attempting to install a module from a repository with a bad or untrusted SSL (Secure Sockets Layer) certificate. Once you clean the cache, the problem will be resolved. You might need to set strict-ssl back to true later on.
Another approach to solving this securely is to use the following module.
node_extra_ca_certs_mozilla_bundle
This module can work without any code modification by generating a PEM file that includes all root and intermediate certificates trusted by Mozilla. You can use the following environment variable (works with Node.js v7.3+):
NODE_EXTRA_CA_CERTS
To generate the PEM file to use with the above environment variable, install the module using:
npm install --save node_extra_ca_certs_mozilla_bundle
and then launch your Node script with the environment variable set:
NODE_EXTRA_CA_CERTS=node_modules/node_extra_ca_certs_mozilla_bundle/ca_bundle/ca_intermediate_root_bundle.pem node your_script.js
Other ways to use the generated PEM file are available at:
https://github.com/arvind-agarwal/node_extra_ca_certs_mozilla_bundle
NOTE: I am the author of the above module.
I had an issue with my Apache configuration after installing a GoDaddy certificate on a subdomain. I originally thought it might be an issue with Node not sending a Server Name Indicator (SNI), but that wasn't the case. Analyzing the subdomain's SSL certificate with https://www.ssllabs.com/ssltest/ returned the error Chain issues: Incomplete.
After adding the GoDaddy provided gd_bundle-g2-g1.crt file via the SSLCertificateChainFile Apache directive, Node was able to connect over HTTPS and the error went away.
If you came to this thread because you're using the node-postgres (pg) module, there is a better solution than setting NODE_TLS_REJECT_UNAUTHORIZED or rejectUnauthorized, which would lead to insecure connections.
Instead, configure the "ssl" option to match the parameters for tls.connect:
{
  ca: fs.readFileSync('/path/to/server-ca.pem').toString(),
  cert: fs.readFileSync('/path/to/client-cert.pem').toString(),
  key: fs.readFileSync('/path/to/client-key.pem').toString(),
  servername: 'my-server-name' // e.g. my-project-id/my-sql-instance-id for Google SQL
}
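For context, a sketch of passing that ssl object to node-postgres; the host/user/database values and .pem paths are placeholders for your own setup:
const fs = require('fs');
const { Pool } = require('pg');

const pool = new Pool({
  host: 'my-db-host',   // placeholder
  user: 'my-db-user',   // placeholder
  database: 'my-db',    // placeholder
  ssl: {
    ca: fs.readFileSync('/path/to/server-ca.pem').toString(),
    cert: fs.readFileSync('/path/to/client-cert.pem').toString(),
    key: fs.readFileSync('/path/to/client-key.pem').toString(),
    servername: 'my-server-name'
  }
});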
I've written a module to help with parsing these options from environment variables like PGSSLROOTCERT, PGSSLCERT, and PGSSLKEY:
https://github.com/programmarchy/pg-ssl
Just a small addition to this subject, since in my case the
require('ssl-root-cas/latest')
.inject()
.addFile(__dirname + '/comodohigh-assurancesecureserverca.crt');
approach didn't work for me: it kept returning an error that the file could not be downloaded. I was a couple of hours into researching this particular error when I ran into this answer: https://stackoverflow.com/a/65442604
In my application we proxy some of our requests as a security requirement for some of our users. I found that if the API you are consulting has this issue, and you can access the API URL through your browser, proxying your request might fix the [Error: UNABLE_TO_VERIFY_LEAF_SIGNATURE] issue.
An example of how I use my proxy:
await axios.get(url, {
  timeout: TIME_OUT,
  headers: {
    'User-Agent': 'My app'
  },
  params: params,
  proxy: {
    protocol: _proxy.protocol,
    host: _proxy.hostname,
    port: _proxy.port,
    auth: {
      username: _proxy_username,
      password: _proxy_password
    }
  }
});
I had the same problem and was able to fix it the following way:
Use the full-chain or just the chain certificate instead of just the certificate.
That is all.
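As an illustration, a minimal sketch of what that looks like when Node itself terminates TLS (the paths are placeholders; fullchain.pem is assumed to contain the server certificate followed by the intermediate certificates):
const https = require('https');
const fs = require('fs');

const server = https.createServer({
  key: fs.readFileSync('/path/to/privkey.pem'),
  // Server certificate concatenated with the intermediate chain,
  // so clients can build the full path to a trusted root.
  cert: fs.readFileSync('/path/to/fullchain.pem')
}, (req, res) => res.end('ok'));

server.listen(443);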
This same error can be received when trying to install a local Git shared repo via npm.
The error will read: npm ERR! code UNABLE_TO_VERIFY_LEAF_SIGNATURE
Apparently there is an issue with the certificate; however, what worked for me was changing the link to my shared repo in the package.json file from:
"shared-frontend": "https://myreposerver"
to:
"shared-frontend": "git+https://myreposerver"
In short, just adding git+ to the link solved it.
Another reason Node could print that error is that a backend connection/service is misconfigured.
Unfortunately, the Node error doesn't say which certificate it was unable to verify (feature request!).
Your server may have a perfectly good certificate chain installed for clients to connect, and even show a nice padlock in the browser's URL bar, but when the server tries to connect to a backend database using a different, misconfigured certificate, it can raise an identical error.
I had this issue in some vendor code for some time. Changing a backend database connection from self-signed to an actual certificate resolved it.
You have to include the intermediate certificate on your server. This solves the [Error: UNABLE_TO_VERIFY_LEAF_SIGNATURE].
I am using the apn module (https://github.com/argon/node-apn) to send push notifications to iPhones from NodeJS.
My code works fine on my development machine (Mac OSX) and is successfully pushing notifications through the Apple sandbox gateway (gateway.sandbox.push.apple.com), but when I move it to the staging server (which is running Ubuntu) pushing notifications fails with the message:
Error: ENOENT, no such file or directory 'apns-dev-cert.pem'
I am setting up the NodeJS apn object as such:
var options = {
  cert: "apns-dev-cert.pem",
  key: "apns-key.pem",
  passphrase: null,
  gateway: "gateway.sandbox.push.apple.com",
  port: 2195,
  enhanced: true,
  errorCallback: undefined,
  cacheLength: 5
};
On my development Mac OSX machine, the cert is installed in the Keychain. From my limited understanding of Ubuntu, the equivalent would be to copy the cert file to /etc/ssl/certs. I tried doing this, and also changing the path to "/etc/ssl/certs/apn-dev-cert.pem" in my NodeJS code, but the same error message shows up.
Any ideas?
I struggled a lot with this issue until I realized that I hadn't understood how the fs module resolves paths. Relative paths are resolved against the directory where you started your node process (the current working directory), not against the file doing the reading. So the path to your .pem files should be relative to wherever you launch node.
You might also want to check out __dirname, which can make it easier to specify your paths, as in the sketch below.
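A minimal sketch of that idea, assuming the .pem files sit in the same directory as the script:
const path = require('path');

var options = {
  // Resolve relative to this file instead of the process's working directory.
  cert: path.join(__dirname, 'apns-dev-cert.pem'),
  key: path.join(__dirname, 'apns-key.pem'),
  passphrase: null,
  gateway: 'gateway.sandbox.push.apple.com',
  port: 2195,
  enhanced: true,
  cacheLength: 5
};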
Try setting it up as below:
var options = {
  //cert: "apns-dev-cert.pem",
  //key: "apns-key.pem",
  pfx: '<path>/apns-key.pem',
  passphrase: null,
  gateway: "gateway.sandbox.push.apple.com",
  port: 2195,
  enhanced: true,
  errorCallback: undefined,
  cacheLength: 5
};