Can't start httpd service due to tampered keystore or incorrect password - Linux

I am on Linux (Red Hat OS). I was trying to import a certificate using this command:
keytool -importcert -alias 3dspace ...
but it fails, stating that either the keystore was tampered with or the password was incorrect.
To make matters worse, now I can't start the httpd service either, even though it was running before this problem. What could be the cause, and how do I solve it?
Thank you in advance.
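A minimal diagnostic sketch to narrow this down (the keystore path is a placeholder): check whether the keystore password is really wrong with keytool -list, and check httpd's own error log, since the two failures may have separate causes.

keytool -list -keystore /path/to/keystore.jks   # prompts for the store password; confirms whether it is wrong
apachectl configtest                            # validates the Apache configuration
tail -50 /var/log/httpd/error_log               # shows the actual reason httpd refuses to start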

Related

Renewed my SSL certificate but getting UNABLE_TO_VERIFY_LEAF_SIGNATURE in nodejs on AWS EC2 server

I have a nodejs/express API on an AWS EC2 server, with an SSL certificate that is generated with Let's Encrypt every 3 months.
Auto-renewal isn't on and we let it expire before trying to renew, but after renewing it we are getting an error saying:
Unable to verify the first certificate
or
UNABLE_TO_VERIFY_LEAF_SIGNATURE
depending on what we are testing with.
We are using Certbot to renew, with the following command (and not $ certbot renew):
$ sudo certbot certonly --dns-route53 -d *.example.com -d example.com --server https://acme-v02.api.letsencrypt.org/directory
Certificates are generated as expected with an expiration date 3 months from now.
Any ideas on what's going on? I've tried most of the things I could find on SO and elsewhere, but nothing worked.
P.S. Servers and I don't get along very well :/ (I do mobile app dev), so assume that I don't know anything when replying :D
The solution was quite easy: just use the fullchain.pem file (and reboot your server if applicable).
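For reference: UNABLE_TO_VERIFY_LEAF_SIGNATURE is what clients typically report when the intermediate chain is missing, i.e. when cert.pem is served alone. With Certbot's default layout the renewed files live under /etc/letsencrypt/live/<domain>/, and the server configuration should point at fullchain.pem (certificate plus intermediate chain) rather than cert.pem. A minimal sketch, assuming nginx:

sudo ls /etc/letsencrypt/live/example.com/
# cert.pem  chain.pem  fullchain.pem  privkey.pem

# in the nginx vhost, reference the full chain:
# ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
# ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;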
Sidenote:
If someone on your team tells you that they've tested a solution and that it didn't work, don't just blindly trust them; test it yourself once all other possible solutions have failed. (I lost more than a day because someone thought they had tested with the fullchain.pem, or did it wrongly.)

How to test HTTPS REST from the command line

I'm having a problem I can't seem to solve on my own:
I have to call RESTful services, but they use the HTTPS protocol.
I've created the client and it is deployed in WebLogic.
I've downloaded the certificate using the browser and installed it into the JVM using the following command (on my Linux server):
keytool -import -alias myalias -keystore /jdk1.8.0_101/jre/lib/security/cacerts -file certificado.com.crt
I need to test that it works...
First, how can I test the RESTful services from the command line?
Second, do I need to install the certificate in WebLogic? If yes, how can I do it?
JAVA: jdk1.8.0_101
WebLogic: 12.1.3.0.0
The certificate must be installed on the system that's making the request. It's not about WebLogic, it's about the local certificate store of the machine. You don't need to install the certificate on WebLogic. You may want to take a look at this.
Then you can use wget or curl; just keep in mind that wget is mainly for GET requests, while curl supports all request types.
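For example (the URL, payload, and certificate file name are placeholders):

# simple GET against the HTTPS endpoint; -v shows the TLS handshake
curl -v https://myhost.example.com/rest/myservice

# POST with a JSON body
curl -X POST -H "Content-Type: application/json" -d '{"key": "value"}' https://myhost.example.com/rest/myservice

# if curl itself does not trust the certificate, point it at the downloaded file
curl --cacert certificado.com.crt https://myhost.example.com/rest/myservice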

How to secure the default Apache Karaf installation

Following Christian Schneider's blog post, How to hack into any default apache karaf installation, I checked to see if my default Karaf installation (4.0.5) is insecure:
Some simple steps to check if your karaf installation is open:
Check etc/org.apache.karaf.shell.cfg for the attribute sshPort and note the port number. By default it is 8101.
Do ssh -p 8101 karaf@localhost. As expected, it will ask for a password. This is also dangerous if you have not changed the default password, but that is quite obvious.
Now just do bin/client -a 8101. You will get a shell without supplying a password. If this works, your server is vulnerable.
As expected. It is vulnerable. So I tried to secure it following the instructions as described:
How to secure your server?
Simply remove the public key of the karaf user in the "etc/keys.properties". Unfortunately this will stop the bin/client command from working.
Also make sure you change the password of the karaf user in "etc/users.properties".
I shut down the Karaf server using the halt command, changed the karaf password in etc/users.properties, and deleted the file etc/keys.properties. I then started the server again with bin/karaf. In a new terminal I tested whether the installation was secure by trying to ssh into the server, and validated that ssh login now requires the newly configured password. Finally, I tried the bin/client -a 8101 command.
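In shell terms, the sequence was roughly this (paths relative to the Karaf home; the editor is interchangeable):

# in the running Karaf console, shut the server down:
halt

# harden the installation:
vi etc/users.properties        # change the karaf user's password
rm etc/keys.properties         # remove the default public key

# restart, then re-test from a second terminal:
bin/karaf
ssh -p 8101 karaf@localhost    # now prompts for the newly configured password
bin/client -a 8101             # per the blog post, expected to fail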
At this point, as explained in the blog post, I expected the command to fail:
Unfortunately this will stop the bin/client command from working.
I noticed that after running bin/client -a 8101 there is a new file, etc/host.key, that either bin/client or the container itself auto-generated. Rather than failing, the command succeeded and I was presented with the Karaf console.
Does this mean the container is still vulnerable to this attack vector?
No.
The modifications described in the OP (changing the default password in etc/users.properties and deleting etc/keys.properties) secure the container from that specific attack vector.
According to the discussion on the Karaf users mailing list concerning this Stack Overflow question, by default bin/client tries (in this order) to use:
etc/keys.properties
etc/users.properties
karaf/karaf
-u to prompt for the password
So after etc/keys.properties is removed, bin/client succeeds by reading the new password locally from etc/users.properties, which a remote attacker without filesystem access cannot do.
bin/client is an SSH client (written in Java). The host.key file is the same as for SSH and contains the trusted host keys (you also have .sshkaraf/known_hosts for that).
The section quoted from the blog in the OP is therefore outdated:
Unfortunately this will stop the bin/client command from working.

Passwordless SSH error while installing BigInsights

I am getting the below error while installing BigInsights on my Linux machine (Red Hat 6.6). Kindly help me resolve this.
[ERROR] Prerequisite check - Failed to use given credentials to access nodes.Either provide root password during add node or make sure BI admin user exists on new nodes and passwordless ssh is setup from management node to new nodes that are being added. Please revisit Secure Shell page from installer UI or SSH section in response file to make sure all prerequisites are satisfied, then re-run the command.
Execute the following as root on the server and rerun:
ssh-keygen -t rsa (leave all prompts blank)
cat /root/.ssh/*.pub >> /root/.ssh/authorized_keys
Then try ssh root@localhost; this should not ask you for a password.
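Since the check complains about passwordless SSH from the management node to the nodes being added, the key usually also needs to be copied to each remote node; a sketch with placeholder hostnames:

ssh-copy-id root@newnode1      # repeat for every node being added
ssh root@newnode1 hostname     # should print the hostname without asking for a password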

Where are the default CA certs used in Node.js?

I'm connecting to a server whose cert is signed by my own CA; the CA's cert has been installed into the system's keychain.
Connecting with openssl s_client -connect some.where says Verify return code: 0 (ok),
but I can't connect with Node.js's tls/https module, which fails with
Error: SELF_SIGNED_CERT_IN_CHAIN
Connecting to a normal server (e.g. google.com:443) works fine.
It seems that Node.js's OpenSSL is not sharing the same keychain as the system's openssl, but I can't find where its store is. I tried overriding it with SSL_CERT_DIR, but that didn't seem to work.
BTW: I can bypass the server verification by setting NODE_TLS_REJECT_UNAUTHORIZED=0, but that's not pretty enough ;)
I'm using OS X 10.8.3 with OpenSSL 0.9.8r, Node v0.9.8.
The default root certificates are static and compiled into the node binary.
https://github.com/nodejs/node/blob/v4.2.0/src/node_root_certs.h
You can make node use the system's OpenSSL certificates. This is done by starting node via:
node --use-openssl-ca
See the docs for further information.
See this answer on how system certificates are extended for Debian and Ubuntu.
If you're using the tls module (and it seems like you are) with tls.connect, you can pass a ca param in the options: an array of strings or Buffers containing the certificates you want to trust.
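Since the OP is on OS X, one hedged way to tie these together is to export the CA from the system keychain to a PEM file and let Node's OpenSSL pick it up (the certificate name "My CA" is a placeholder; with --use-openssl-ca, the SSL_CERT_FILE environment variable is honored):

# export the CA certificate from the system keychain to PEM (OS X)
security find-certificate -c "My CA" -p /Library/Keychains/System.keychain > myca.pem

# run node against that file instead of the compiled-in root list
SSL_CERT_FILE=$PWD/myca.pem node --use-openssl-ca app.js

Note that SSL_CERT_DIR expects hash-named symlinks (as produced by c_rehash), which may be why setting it alone appeared to do nothing.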