Where are the default CA certs used in Node.js? - node.js

I'm connecting to a server whose cert is signed by my own CA; the CA's cert has been installed into the system's keychain.
Connecting with openssl s_client -connect some.where says Verify return code: 0 (ok),
but I can't connect with Node.js's tls/https module, which fails with
Error: SELF_SIGNED_CERT_IN_CHAIN
Connecting to a normal server (e.g. google.com:443) works fine, though.
It seems that Node.js's OpenSSL is not sharing the same keychain as the system's OpenSSL,
but I can't find where it is. I tried overriding it with SSL_CERT_DIR, but that didn't seem to work.
BTW: I can bypass server verification by setting NODE_TLS_REJECT_UNAUTHORIZED=0, but that's not pretty enough ;)
I'm using OS X 10.8.3 with OpenSSL 0.9.8r, Node v0.9.8.

The default root certificates are static and compiled into the node binary.
https://github.com/nodejs/node/blob/v4.2.0/src/node_root_certs.h
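On much newer Node versions (12.3.0+, so well after the v0.9/v4 releases discussed here) you can also list that compiled-in bundle at runtime via tls.rootCertificates; a minimal sketch:

// list-root-cas.js -- print the root CAs compiled into this Node binary
// (requires Node 12.3.0+; older versions have no tls.rootCertificates)
const tls = require('tls');

console.log('bundled root CAs:', tls.rootCertificates.length);
console.log(tls.rootCertificates[0].split('\n')[0]); // "-----BEGIN CERTIFICATE-----"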

You can make node use the system's OpenSSL certificates. This is done by starting node via:
node --use-openssl-ca
See the docs for further information.
See this answer on how system certificates are extended for Debian and Ubuntu

If you're using the tls module (and it seems like you are) with tls.connect, you can pass a ca param in the options: an array of strings or Buffers of the certificates you want to trust.
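A minimal sketch of that option (the CA path and host are placeholders for your own setup); note that when ca is supplied it replaces the built-in roots for that connection:

const fs = require('fs');
const tls = require('tls');

const socket = tls.connect(
  {
    host: 'some.where',
    port: 443,
    // trust only the private CA for this connection
    ca: [fs.readFileSync('/path/to/my-ca.pem')],
  },
  () => {
    console.log('authorized:', socket.authorized); // true when the chain verifies
    socket.end();
  }
);
socket.on('error', (err) => console.error(err.message));

The same ca option is accepted by https.request if you're using the https module instead.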

Related

.NET Core build in docker linux container fails due to SSL authentication to Nuget

I was given a .NET Core project to run in a Linux Docker container to do the build, everything seems to be okay on the docker configuration side, but when I run this command: dotnet publish -c Release -o out, I get the SSL authentication error below.
The SSL connection could not be established, see inner exception. Authentication failed because the remote party has closed the transport stream.
I did my research, and apparently I was missing:
the Kestrel environment variables for ASP.NET Core (as per https://github.com/aspnet/AspNetCore.Docs/issues/6199), which I added to my docker-compose, but I don't think that is the issue.
a developer .pfx certificate, so I updated my docker-compose with the Kestrel path to the cert file, as seen below.
version: '3'
services:
  netcore:
    container_name: test_alerting_comp
    tty: true
    stdin_open: true
    image: alerting_netcore
    environment:
      - http_proxy=http://someproxy:8080
      - https_proxy=http://someproxy:8080
      - ASPNETCORE_ENVIRONMENT=Development
      - ASPNETCORE_URLS=https://+;http://+
      - ASPNETCORE_HTTPS_PORT=443
      - ASPNETCORE_Kestrel__Certificates__Default__Password="ABC"
      - ASPNETCORE_Kestrel__Certificates__Default__Path=/root/.dotnet/corefx/cryptography/x509stores/my
    ports:
      - "8080:80"
      - "443:443"
    build: .
      #context: .
    security_opt:
      - seccomp:unconfined
    volumes:
      - "c:/FakePath/git/my_project/src:/app"
      - "c:/TEMP/nuget:/root"
    networks:
      - net
networks:
  net:
I re-ran the docker container and executed dotnet publish -c Release -o out, with the same results.
From my host I can do this to my local NuGet:
A) wget https://nuget.local.com/api/v2 without issues,
B) but from the container I can't.
C) However, from the container I can reach the official NuGet with wget https://api.nuget.org/v3/index.json, so my proxy is definitely working okay.
Debugging SSL issue:
The given .pfx certificate is a self-signed one, and it is working okay from Windows OS (at least I was told that).
strace shows me where the certs are being pulled from, as below:
root@9b98d5447904:/app# strace wget https://nuget.local.com/api/v2 |& grep certs
open("/etc/ssl/certs/ca-certificates.crt", O_RDONLY) = 3
I exported the .pfx as follows:
openssl pkcs12 -in ADPRootCertificate.pfx -out my_adp_dev.crt
Then I moved the output to /usr/local/share/ca-certificates/, removed the private part (leaving only the public -----BEGIN CERTIFICATE----- ... -----END CERTIFICATE----- block in the file), executed update-ca-certificates and saw 1 added, and double-checked /etc/ssl/certs/ca-certificates.crt to confirm the new cert was in there.
Executed wget https://nuget.local.com/api/v2 again and it still failed.
I used OpenSSL to get more info, and as you can see it is not working; the cert has a weird CN because they used a wildcard for the subject, which to me looks wrong, but they state that the .pfx works on Windows.
root@ce21098e9643:/usr/local/share/ca-certificates# openssl s_client -connect nuget.local.com:443 -CApath /etc/ssl/certs
CONNECTED(00000003)
depth=0 CN = *.local.com
verify error:num=20:unable to get local issuer certificate
verify return:1
depth=0 CN = *.local.com
verify error:num=21:unable to verify the first certificate
verify return:1
---
Certificate chain
0 s:/CN=\x00*\x00l\x00o\x00c\x00a\x00l\x00.\x00c\x00o\x00m
i:/C=ES/ST=SomeCity/L=SomeCity/OU=DEV/O=ASD/CN=Development CA
---
Server certificate
-----BEGIN CERTIFICATE-----
XXXXXXXXXXX
XXXXXXXXXXX
-----END CERTIFICATE-----
subject=s:/CN=\x00*\x00l\x00o\x00c\x00a\x00l\x00.\x00c\x00o\x00m
issuer=i:/C=ES/ST=SomeCity/L=SomeCity/OU=DEV/O=ASD/CN=Development CA
---
No client certificate CA names sent
Peer signing digest: SHA1
Server Temp Key: ECDH, P-256, 256 bits
---
SSL handshake has read 1284 bytes and written 358 bytes
Verification error: unable to verify the first certificate
---
New, TLSv1.2, Cipher is ECDHE-RSA-AES256-SHA384
Server public key is 1024 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
SSL-Session:
Protocol : TLSv1.2
Cipher : ECDHE-RSA-AES256-SHA384
Session-ID: 95410000753146AAE1D313E8538972244C7B79A60DAF3AA14206417490E703F3
Session-ID-ctx:
Master-Key: B09214XXXXXXX0007D126D24D306BB763673EC52XXXXXXB153D310B22C341200EF013BC991XXXXXXX888C08A954265623
PSK identity: None
PSK identity hint: None
SRP username: None
Start Time: 1558993408
Timeout : 7200 (sec)
Verify return code: 21 (unable to verify the first certificate)
Extended master secret: yes
---
I don't know what issue I'm facing, but it appears to be that:
A) the self-signed .pfx was wrongly configured, and now that it is being used in Linux it doesn't work as it should.
B) I need some more config in the container, which I'm not aware of.
What else should I do?
I'm thinking of probably creating another cert to use from Linux hosts.
Is it feasible to create another self-signed cert with OpenSSL for IIS 8 and import it into IIS?
Any ideas are welcome, cheers.
ANSWERING MYSELF
It was not a Linux container issue; it is a certificate issue on the web server (IIS), because we are using self-signed certificates, so the cert will always be treated as invalid. Self-signed certs work okay on the Windows side regardless of the invalid-certificate error. Of course, self-signed certs are only for a test environment or so.
On Linux, when you try to pull packages from NuGet you will get the error below, because:
1) the cert is indeed invalid, and
2) there is apparently no option to ignore an invalid certificate on the Linux side.
The SSL connection could not be established, see inner exception.
The remote certificate is invalid according to the validation procedure.
The solution, if you are working in a corporate environment, is to request a properly signed certificate from your system administrators: generate a CSR from your web server (in my case IIS) and pass it to them, and they will send you back a .cer file to install on that web server.
The other option, which I tried but couldn't complete due to the limitations of my corporate environment, is to create your own CA (with OpenSSL) and sign the CSRs yourself so you have valid certificates for your dev or test environment.
Apologies for answering this myself, but I believe my findings are worth sharing.
Hope it helps.
I had a similar problem. Docker build would not restore my nugets.
Unable to load the service index for source https://api.nuget.org/v3/index.json.
The SSL connection could not be established, see inner exception.
Authentication failed because the remote party has closed the transport stream.
(On a Mac running Catalina)
I turned off Fiddler and then it all worked again.

azure-cli login getting "self signed certificate in certificate chain"

I have installed azure-cli via npm install -g azure-cli but am getting self signed certificate in certificate chain:
Any hints appreciated.
You can use the troubleshooting guide for the Microsoft Azure Storage Explorer tool for the same issue: https://support.microsoft.com/en-us/help/4021389/storage-explorer-troubleshooting-guide.
It helped me, with only one tweak needed: the openssl command should check the connection to management.azure.com, not microsoft.com.
So, the command in OpenSSL console will be:
OpenSSL> s_client -showcerts -connect management.azure.com:443
In my case I found a self-signed certificate (subject = issuer) added by my company's IT department. After importing that certificate into ASE (so that ASE can trust that self-issued cert), the problem was solved.
As per these bug reports, it seems this issue is related to npm using a system/network-wide proxy which intercepts the SSL traffic. Update the npm config with the HTTPS proxy and this issue should be fixed. Some have reported that updating npm also fixed it; check it out.
https://github.com/npm/npm/issues/7519
https://github.com/npm/npm/issues/6916

Failed to install Gitlab - curl (60) ssl certificate

I was trying to install GitLab on my Linux server following this guide and got stuck on the second step, which says:
curl: (60) SSL certificate problem: self signed certificate
More details here: http://curl.haxx.se/docs/sslcerts.html
curl performs SSL certificate verification by default, using a "bundle"
of Certificate Authority (CA) public keys (CA certs). If the default
bundle file isn't adequate, you can specify an alternate file
using the --cacert option.
If this HTTPS server uses a certificate signed by a CA represented in
the bundle, the certificate verification probably failed due to a
problem with the certificate (it might be expired, or the name might
not match the domain name in the URL).
If you'd like to turn off curl's verification of the certificate, use
the -k (or --insecure) option.
Any idea how I can solve this?
ANSWER: be sure to have the http_proxy and https_proxy variables correctly set.
---- UPDATE ----
After setting the variables I got the following output from curl:
Detected operating system as Ubuntu/trusty.
Checking for curl...
Detected curl...
Running apt-get update... done.
Installing apt-transport-https... done.
Installing /etc/apt/sources.list.d/gitlab_gitlab-ce.list...curl: (60) SSL certificate problem: self signed certificate
More details here: http://curl.haxx.se/docs/sslcerts.html
curl performs SSL certificate verification by default, using a "bundle"
of Certificate Authority (CA) public keys (CA certs). If the default
bundle file isn't adequate, you can specify an alternate file
using the --cacert option.
If this HTTPS server uses a certificate signed by a CA represented in
the bundle, the certificate verification probably failed due to a
problem with the certificate (it might be expired, or the name might
not match the domain name in the URL).
If you'd like to turn off curl's verification of the certificate, use
the -k (or --insecure) option.
Unable to run:
curl https://packages.gitlab.com/install/repositories/gitlab/gitlab-ce/config_file.list?os=Ubuntu&dist=trusty&source=script
Double check your curl installation and try again.
Tell curl to ignore SSL warnings with -k/--insecure. Documented in man curl.
Edit: also check your proxy settings, as the host you're trying to curl to does, in fact, have a valid SSL certificate. See the --proxy option of curl.

npm install error - unable to get local issuer certificate

I am getting an unable to get local issuer certificate error when performing an npm install:
typings ERR! message Unable to read typings for "es6-shim". You should check the entry paths in "es6-shim.d.ts" are up to date
typings ERR! caused by Unable to connect to "https://raw.githubusercontent.com/DefinitelyTyped/DefinitelyTyped/7de6c3dd94feaeb21f20054b9f30d5dabc5efabd/es6-shim/es6-shim.d.ts"
typings ERR! caused by unable to get local issuer certificate
I have recently updated to Node 4 from a much earlier version, and it sounds like Node is much stricter when these kinds of problems arise.
There is an issue discussed here which talks about using ca files, but it's a bit beyond my understanding and I'm unsure what to do about it.
I am behind a corporate firewall, but I can get to the url fine in a browser without any restriction.
Does anyone have any further insight into this issue and what possible solutions there are?
I'm wondering about reverting to node 0.12 in the meantime :(
Try
npm config set strict-ssl false
This is an alternative shared in this issue: https://github.com/nodejs/node/issues/3742
There is an issue discussed here which talks about using ca files, but it's a bit beyond my understanding and I'm unsure what to do about it.
This isn't too difficult once you know how! For Windows:
Using Chrome, go to the root URL npm is complaining about (so https://raw.githubusercontent.com in your case).
Open up dev tools and go to Security -> View Certificate. Check the certification path and make sure you're at the top-level certificate; if not, open that one. Now go to "Details" and export the cert with "Copy to File...".
You need to convert this from DER to PEM. There are several ways to do this, but the easiest way I found was an online tool which should be easy to find with relevant keywords.
Now if you open the key with your favorite text editor you should see
-----BEGIN CERTIFICATE-----
yourkey
-----END CERTIFICATE-----
This is the format you need. You can do this for as many keys as you need, and combine them all into one file. I had to do github and the npm registry keys in my case.
Now just edit your .npmrc to point to the file containing your keys like so
cafile=C:\workspace\rootCerts.crt
I have personally found this to perform significantly better behind our corporate proxy as opposed to the strict-ssl option. YMMV.
This worked for me:
export NODE_TLS_REJECT_UNAUTHORIZED=0
Please refer to the NodeJS documentation for usage and warnings:
https://nodejs.org/api/cli.html#cli_node_tls_reject_unauthorized_value
You get this error when 'npm install' tries to fetch a package from an HTTPS server with a self-signed or invalid certificate.
Quick and insecure solution:
npm config set strict-ssl false
Why is this solution insecure?
The above command tells npm to connect to and fetch modules from the server even if it doesn't have a valid certificate and its identity is not verified. So if there is a proxy server between the npm client and the actual server, it gives an intruder the opportunity for a man-in-the-middle attack.
Secure solution:
If any module in your package.json is hosted on a server with a self-signed CA certificate, npm is unable to identify that server using the available system CA certificates.
So you need to provide the CA certificate for server validation with an explicit configuration in .npmrc.
In .npmrc you need to provide cafile; please refer to the cafile configuration documentation for more detail.
cafile=./ca-certs.pem
In the ca-certs file, you can add any number of (public) CA certificates that you need to identify servers. The certificates should be in "Base-64 encoded X.509 (.CER)" (PEM) format.
For example,
# cat ca-certs.pem
DigiCert Global Root CA
=======================
-----BEGIN CERTIFICATE-----
CAUw7C29C79Fv1C5qfPrmAE.....
-----END CERTIFICATE-----
VeriSign Class 3 Public Primary Certification Authority - G5
========================================
-----BEGIN CERTIFICATE-----
MIIE0zCCA7ugAwIBAgIQ......
-----END CERTIFICATE-----
Note: once you provide the cafile configuration in .npmrc, npm tries to identify all servers using only the CA certificate(s) provided in the cafile; it won't check the system CA certificate bundles.
Here's a well-known public CA authority certificate bundle.
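If a script of your own (rather than npm) needs to talk to that server, the same trust decision can be made in Node directly; a rough sketch, assuming the bundle sits at ./ca-certs.pem and the registry URL is a placeholder (the url-plus-options form of https.get needs Node 10.9+):

const fs = require('fs');
const https = require('https');

const ca = fs.readFileSync('./ca-certs.pem'); // same PEM bundle handed to npm via cafile

https.get('https://your.private.registry/index.json', { ca }, (res) => {
  console.log('status:', res.statusCode);
  res.resume();
}).on('error', (err) => console.error(err.code || err.message));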
One other situation where you get this error:
if you have a Git URL as a dependency in package.json and the Git server uses an invalid/self-signed certificate, npm throws a similar error.
You can fix it with the following configuration for the Git client:
git config --global http.sslVerify false
Typings can be configured with the ~/.typingsrc config file. (~ means your home directory)
After finding this issue on github: https://github.com/typings/typings/issues/120, I was able to hack around this issue by creating ~/.typingsrc and setting this configuration:
{
  "proxy": "http://<server>:<port>",
  "rejectUnauthorized": false
}
It also seemed to work without the proxy setting, so maybe it was able to pick that up from the environment somewhere.
This is not a true solution, but was enough for typings to ignore the corporate firewall issues so that I could continue working. I'm sure there is a better solution out there.
If you're on a corporate computer, it likely has custom certificates (note the plural on that). It took a while to figure out, but I've been using this little script to grab everything and configure Node, NPM, Yarn, AWS, and Git (turns out the solution is similar for most tools). Stuff this in your ~/.bashrc or ~/.zshrc or similar location:
function setup-certs() {
  # place to put the combined certs
  local cert_path="$HOME/.certs/all.pem"
  local cert_dir=$(dirname "${cert_path}")
  [[ -d "${cert_dir}" ]] || mkdir -p "${cert_dir}"
  # grab all the certs
  security find-certificate -a -p /System/Library/Keychains/SystemRootCertificates.keychain > "${cert_path}"
  security find-certificate -a -p /Library/Keychains/System.keychain >> "${cert_path}"
  # configure env vars for commonly used tools
  export GIT_SSL_CAINFO="${cert_path}"
  export AWS_CA_BUNDLE="${cert_path}"
  export NODE_EXTRA_CA_CERTS="${cert_path}"
  # add the certs for npm and yarn
  # and since we have certs, strict-ssl can be true
  npm config set -g cafile "${cert_path}"
  npm config set -g strict-ssl true
  yarn config set cafile "${cert_path}" -g
  yarn config set strict-ssl true -g
}
setup-certs
You can then, at any time, run setup-certs in your terminal. Note that if you're using Nvm to manage Node versions, you'll need to run this for each version of Node. I've noticed that some corporate certificates get rotated every so often. Simply re-running setup-certs fixes all that.
You'll notice that most answers suggest setting strict-ssl to false. Please don't do that. Instead use the setup-certs solution to use the actual certificates.
My problem was that my company proxy was getting in the way. The solution here was to identify the root CA / certificate chain of our proxy, export it (on Mac) from the keychain in .pem format, then export a variable for Node to use:
export NODE_EXTRA_CA_CERTS=/path/to/your/CA/cert.pem
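A quick way to confirm the variable is being honored is a bare https request from Node; a sketch (registry.npmjs.org stands in for whatever host your proxy re-signs):

// run as: NODE_EXTRA_CA_CERTS=/path/to/your/CA/cert.pem node check-tls.js
// (the variable is read at process startup, so set it before node runs)
const https = require('https');

https.get('https://registry.npmjs.org/', (res) => {
  console.log('status:', res.statusCode); // no issuer-certificate errors expected now
  res.resume();
}).on('error', (err) => console.error('still failing:', err.code || err.message));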
There are different reasons for this issue, and the workaround differs depending on the situation. Here are a few workarounds (note: they are insecure, so please check your organizational policies before trying them).
Step 1: Test and ensure the internet is working on the machine from a command prompt, and that the URL which fails in npm is accessible directly. There are many tools for this, like curl, wget etc. If you are using Windows, try telnet or curl for Windows.
Step 2: Set strict-ssl to false using the command below:
npm -g config set strict-ssl false
Step 3: Set NODE_TLS_REJECT_UNAUTHORIZED to 0 using the command below:
export NODE_TLS_REJECT_UNAUTHORIZED=0
On Windows (or use the environment variables screen to set it):
set NODE_TLS_REJECT_UNAUTHORIZED=0
Step 4: Add the --unsafe-perm param to the installation command, e.g.
npm i -g abc-package#1.0 --unsafe-perm true
In case you use yarn:
yarn config set strict-ssl false
Add:
process.env["NODE_TLS_REJECT_UNAUTHORIZED"] = 0;
Source: Ignore invalid self-signed ssl certificate in node.js with https.request?
I have encountered the same issue. This command didn't work for me either:
npm config set strict-ssl false
After digging deeper, I found out that this link was blocked by our IT admin:
http://registry.npmjs.org/npm
So if you are facing the same issue, make sure this link is accessible from your browser first.
For anyone coming to this from macOS:
Somehow, npm hadn't picked up the correct certificate file location, and I needed to point to it explicitly:
$ echo "cafile=$(brew --prefix)/share/ca-certificates/cacert.pem" >> ~/.npmrc
$ cat ~/.npmrc # for ARM macOS
cafile=/opt/homebrew/share/ca-certificates/cacert.pem
Well, this is not the right answer but can be considered a quick workaround. The right answer is to turn off strict SSL.
I was having the same error:
PhantomJS not found on PATH
Downloading https://github.com/Medium/phantomjs/releases/download/v2.1.1/phantomjs-2.1.1-windows.zip
Saving to C:\Users\Sam\AppData\Local\Temp\phantomjs\phantomjs-2.1.1-windows.zip
Receiving...
Error making request.
Error: unable to get local issuer certificate
at TLSSocket. (_tls_wrap.js:1105:38)
at emitNone (events.js:106:13)
at TLSSocket.emit (events.js:208:7)
at TLSSocket._finishInit (_tls_wrap.js:639:8)
at TLSWrap.ssl.onhandshakedone (_tls_wrap.js:469:38)
So, after reading the error, I just downloaded the file manually and placed it in the required path, i.e.
C:\Users\Sam\AppData\Local\Temp\phantomjs\
This solved my problem.
PhantomJS not found on PATH
Download already available at C:\Users\sam\AppData\Local\Temp\phantomjs\phantomjs-2.1.1-windows.zip
Verified checksum of previously downloaded file
Extracting zip contents
A disclaimer: This solution is less secure, bad practice, don't do this.
I had the same error message; I'm behind a corporate VPN/firewall. I was able to resolve this issue by adding a .typingsrc file to my user directory (C:\Users\MyUserName\.typingsrc in Windows). Of course, anytime you're circumventing SSL you should be yapping to your sys admins to fix the certificate issue.
Change the registry URL from https to http, and as seen in nfiles' answer above, set rejectUnauthorized to false.
.typingsrc (placed in project directory or in user root directory)
{
  "rejectUnauthorized": false,
  "registryURL": "http://api.typings.org/"
}
Optionally add your github token (I didn't find success until I had added this too.)
{
  "rejectUnauthorized": false,
  "registryURL": "http://api.typings.org/",
  "githubToken": "YourGitHubToken"
}
See instructions for setting up your github token at https://github.com/blog/1509-personal-api-tokens
Once you have your certificate (a .cer or .pem file), add it as a system environment variable.
This is the secure way of solving the problem, rather than disabling SSL. You have to tell npm, or whatever Node tool you're using, to use these certificates when establishing an SSL connection, via the environment variable NODE_EXTRA_CA_CERTS.
This is common when you're behind a corporate firewall or proxy. You can find the correct certificate by just inspecting the security tab in Chrome when visiting a page while on your company's VPN or proxy and exporting the certificate through the "Manage Computer Certificates" window in Windows.
On FreeBSD, this error can be produced because the cafile path is set to a symlink instead of the absolute path.

(60) Peer's certificate issuer has been marked as not trusted by the user: Linux/Apache

I am trying to find out why my HTTPS link is not working for my website:
So I ran this command to try:
curl https://localhost/
I am using a valid signed SSL certificate and my HTTP link is working fine. I am using a Multi Domain certificate that was exported from an IIS 6 server. My instance on AWS has the 443 port enabled.
Here is a picture of my CA certificates:
I have tried to change the Virtual Host in the http.conf file following the instructions here: http://ananthakrishnanravi.wordpress.com/2012/04/15/configuring-ssl-and-https-for-your-website-amazon-ec2/
Are there any suggestions on how to get my website working properly over HTTPS?
Let me know if you need anymore information.
Thanks,
If you're not sure of the certificate that your web server is serving, you can use this command to view the certificate:
openssl s_client -showcerts -connect hostname.domain.tld:443
Also, the hostname in the certificate must match the site that you are requesting. For example, if you request a page from localhost, but your certificate is for www.yourdomain.com, the certificate check will fail.
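If you'd rather check this from Node than from the openssl output, a small sketch (host and port are placeholders) that prints what the served certificate was actually issued for:

const tls = require('tls');

// rejectUnauthorized: false only so we can inspect a cert that would otherwise fail;
// don't leave this in production code.
const socket = tls.connect(
  { host: 'localhost', port: 443, rejectUnauthorized: false },
  () => {
    const cert = socket.getPeerCertificate();
    console.log('subject CN:', cert.subject && cert.subject.CN);
    console.log('altnames  :', cert.subjectaltname);
    console.log('authorized:', socket.authorized, socket.authorizationError || '');
    socket.end();
  }
);
socket.on('error', (err) => console.error(err.message));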
This means that you are using a self-signed certificate.
In order for this warning not to appear, you need to purchase a certificate from a Certificate Authority.
If you are using a self-signed SSL certificate, you will face this issue.
For this you can use the curl command with the -k option:
curl -k https://yourdomain.com/
And if you are trying with Postman, disable the SSL certificate verification option in settings.
I got the same error, though in a situation not quite like yours; summarizing here in the hope it's useful for others:
OS: CentOS 7
Running Python's pyspider gave this error:
File "/usr/local/lib64/python3.6/site-packages/tornado/concurrent.py", line 238, in result
raise_exc_info(self._exc_info)
File "", line 4, in raise_exc_info
Exception: HTTP 599: Peer's certificate issuer has been marked as not trusted by the user.
Root cause and steps to fix:
There was previously a soft link:
/usr/lib64/libcurl.so.4 -> /usr/lib64/libcurl.so.4.3.0_openssl
which was invalid, so I changed it to the valid target:
/usr/lib64/libcurl.so.4 -> /usr/lib64/libcurl.so.4.3.0
The two files are:
-rwxr-xr-x 1 root root 435192 Nov 5 2018 /usr/lib64/libcurl.so.4.3.0
-rwxr-xr-x 1 root root 399304 May 10 09:20 /usr/lib64/libcurl.so.4.3.0_openssl
Then, for pyspider, reinstall pycurl:
pip3 uninstall pycurl
export PYCURL_SSL_LIBRARY=nss
export LDFLAGS=-L/usr/local/opt/openssl/lib
export CPPFLAGS=-I/usr/local/opt/openssl/include
pip install pycurl --compile --no-cache-dir
in which PYCURL_SSL_LIBRARY is nss, because the current curl backend is NSS, according to:
# curl --version
curl 7.29.0 (x86_64-redhat-linux-gnu) libcurl/7.29.0 NSS/3.36 zlib/1.2.7 libidn/1.28 libssh2/1.4.3
...
This fixed my problem.
