How to set up a cron job in cPanel if the connection is not secure? - cron

I am trying to set up a cron job with the following command:
wget -O - -q -t 1 https://myexample.com/check/test > /dev/null
but it is not working.
When I open the same URL in a web browser (https://myexample.com/check/test),
I see the message "Your connection is not private".

You will need an SSL certificate to get rid of that security warning. You could use one generated by Let's Encrypt (which is free). An alternative would be to get an SSL certificate through startssl.com (also free). If you just need your cron job to run, you can tell wget to skip certificate validation:
/usr/bin/wget --no-check-certificate -O - -q -t 1 https://myexample.com/check/test > /dev/null
Accessing the same link via a web browser without a valid SSL certificate will still result in a security warning. If you do not want to buy or use a real SSL certificate, you can add an exception for that website/certificate in Firefox.
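For completeness, the full crontab entry would look something like the line below. The five-minute schedule is an assumption; adjust it to your needs. Redirecting stderr as well stops cron from mailing you the certificate warning on every run.

```
# Run every 5 minutes; discard stdout and stderr so cron sends no mail
*/5 * * * * /usr/bin/wget --no-check-certificate -O - -q -t 1 https://myexample.com/check/test > /dev/null 2>&1
```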


cannot wget from uniprot, what does the sslv3 error mean?

I am fairly new to bioinformatics. I have a script to download sequence data from UniProt; when I use the wget command, however, I get an error message for some of the links:
OpenSSL error:14094410:SSL routines:ssl3_read_bytes:sslv3 alert handshake failure
Unable to establish SSL connection.
So some of the sequence data downloads (3 entries in total), and then I get this error message for each of the others.
I am using a VirtualBox machine with the Ubuntu OS, so this is a bash script.
I have tried installing an updated version of wget, but I get a message saying the latest version is already installed. Does anyone have any suggestions?
As requested, script is as follows:
VAR=$(cat uniprot_id.txt)                     # one UniProt ID per line
URL="https://www.uniprot.org/uniprot/"
for i in ${VAR}
do
    echo "(o) Downloading UniProt entry: ${i}"
    wget "${URL}${i}.fasta"
    echo "(o) Done downloading ${i}"
    cat "${i}.fasta" >> muscle_input.fasta    # collect all sequences
    rm "${i}.fasta"
done
To confirm: the links that this script creates are all valid, so they open the sequence data I require when I click them.
I also had trouble with this. Would you be able to share what changes you made to the SSL configuration, please?
In the meantime, because I trust the URL's domain, I was able to work around the issue with this line:
wget "${URL}${i}.fasta" --no-check-certificate
I think I need to update the domain certificates somewhere, but I'm still working on that.
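On the error itself: an "sslv3 alert handshake failure" from wget usually means the client-side TLS stack (the OpenSSL library wget links against), rather than wget itself, is too old for what the server will accept. A quick sanity check, assuming standard Ubuntu tooling, is:

```shell
# Show the TLS library version the system provides. If this reports an
# OpenSSL around 1.0.0 or older, upgrading the OS packages (not wget
# alone) is the usual fix, since wget delegates the handshake to it.
openssl version
```

If the library is current, the same check on the server side (it may have disabled old protocol versions) is the next thing to rule out.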

Ubuntu 18: proxy not working in the terminal but working in the browser

(a related and perhaps simpler problem to solve: proxy authentication by MSCHAPv2)
Summary: I am using Ubuntu 18; the proxy works with the web browser but not with terminal applications (wget, curl or apt update). Any clues? The problem seems to be interpreting the proxy's "PAC file"... Is it? How do I translate it into Linux's proxy variables? ... Or is the problem simpler: was my proxy configuration (see the step-by-step procedure below) wrong?
Details:
Running env | grep -i proxy in a terminal gives
https_proxy=http://user:pass#pac._ProxyDomain_/proxy.pac:8080
http_proxy=http://user:pass#pac._ProxyDomain_/proxy.pac:8080
no_proxy=localhost,127.0.0.0/8,::1
NO_PROXY=localhost,127.0.0.0/8,::1
ftp_proxy=http://user:pass#pac._ProxyDomain_/proxy.pac:8080
and the browser (Firefox) works fine for any URL, but:
wget http://google.com says:
Resolving pac._ProxyDomain_ (pac._ProxyDomain_)... etc.etc.0.26
Connecting to pac._ProxyDomain_ (pac._ProxyDomain_)|etc.etc.0.26|:80... connected.
Proxy request sent, awaiting response... 403 Forbidden
2019-07-25 12:52:19 ERROR 403: Forbidden.
curl http://google.com says "curl: (5) Could not resolve proxy: pac._ProxyDomain_/proxy.pac"
Notes
(recent news: purging the exported proxy variables changes something, and not everything has been retested...)
The proxy configuration procedures that I used are below. (Is there a plug-and-play PAC file generator? Do I need a PAC file?)
Config procedures used
The whole machine was running with a direct, non-proxy internet connection... Then the machine was moved to the LAN with the proxy.
Added "export *_proxy" lines (http, https and ftp) to my ~/.profile. The URL definitions are in the form http_proxy="http://user:pwd#etc" (assuming that is correct, because it was tested before with the user:pwd#http://pac.domain/proxy.pac syntax and Firefox prompted for the proxy login). (If the current proxy password uses the # character, does it need to be changed?)
Added the same "export *_proxy" lines to ~root/.profile. (Is this needed?)
(You can reboot and test with echo $http_proxy.)
The visudo procedure described here.
Rebooted and browsed with Firefox without needing to log in (good, it is working!). Testing env | grep -i proxy shows all the correct values as expected.
Testing wget and curl as at the beginning of this report: proxy bug.
Testing sudo apt update: bug.
... after that, one more step: supposing that no such file exists for apt, I created one with sudo nano /etc/apt/apt.conf.d/80proxy and added 3 lines of Acquire::*::proxy "value"; with value http://user:pass#pac._ProxyDomain_/proxy.pac:8080, where pass is etc%23etc, URL-encoded.
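Two observations on the configuration above. First, http://.../proxy.pac is the address of a PAC script, not of a proxy: wget, curl and apt cannot evaluate PAC files, so the *_proxy variables need to point at the actual proxy host and port that the PAC script would return. Second, special characters in the password portion of a proxy URL must be percent-encoded (# becomes %23, $ becomes %24); a raw # makes wget parse everything after it against the host and port, which matches the "Bad port number" error reported further down. A small bash sketch of the encoding (the urlencode helper name is mine, not a standard tool):

```shell
# Percent-encode a string for safe use in a proxy URL's password field.
urlencode() {
    local s="$1" out='' c i
    for (( i = 0; i < ${#s}; i++ )); do
        c="${s:i:1}"
        case "$c" in
            [A-Za-z0-9._~-]) out+="$c" ;;               # unreserved: keep as-is
            *) printf -v c '%%%02X' "'$c"; out+="$c" ;; # everything else: %XX
        esac
    done
    printf '%s\n' "$out"
}

urlencode 'p#ss$word'
# Then export the real proxy endpoint, not the PAC URL (host is illustrative):
# export http_proxy="http://user:p%23ss%24word@proxyhost:8080"
```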
Summary of tests performed
CONTEXT-1.1
(this was a problem, but I am now ignoring it to focus on the more relevant one)
After the (proxied) cable connection and the proxy configuration in the system (see the section "Config procedures used" above). Proxy password with a special character.
curl http://google.com says "curl: (5) Could not resolve proxy..."
When I change all the .profile entries from %23 to #, the wget error changes but the curl error does not. wget changes to "Error parsing proxy URL http://user:pass#pac._ProxyDomain_/proxy.pac:8080: Bad port number".
PS: when $ was used in the password, the system (something in the export http_proxy command, or in the use of http_proxy) confused it with a shell variable.
CONTEXT-1.2
Same as CONTEXT-1.1 above, but with a password with no special characters. A good, clean proxy password.
curl http://google.com say "curl: (5) Could not resolve proxy..."
CONTEXT-2
After the (proxied) cable connection, with no proxy configuration in the system (but confirmed that the connection works in the browser after logging in through the automatic popup form).
curl -x 192.168.0.1:8080 http://google.com says "curl: (7) Failed to connect..."
curl --verbose -x "http://user:pass#pac._proxyDomain_/proxy.pac" http://google.com says "curl: (5) Could not resolve proxy..."
Other configs in use
As @Roadowl suggested, I checked:
the files ~/.netrc and ~root/.netrc do not exist
the file /etc/wgetrc exists, but everything in it is commented out except passive_ftp = on

How do I export JIRA issues to xls?

As the title says: I'm trying to export JIRA issues to an .xls file through my command line.
This is what I'm entering:
wget -O export.xls https://myjiraurl.com/sr/jira.issueviews:searchrequest-excel-all-fields/10300/SearchRequest-10300.xls?tempMax=1000
But I keep getting this error
Resolving myjiraurl.com (myjiraurl.com)... 10.64.80.92
Connecting to myjiraurl.com (myjiraurl.com)|10.64.80.92|:443... connected.
ERROR: cannot verify myjiraurl.com's certificate, issued by ‘/C=US/O=MyCorporation/CN=Intel Intranet Basic Issuing CA 2B’:
Self-signed certificate encountered.
To connect to myjiraurl.com insecurely, use `--no-check-certificate'.
I've also tried adding
&os_username=<myuserid>&os_password=<mypassword>
but it still gives me the same thing. I've tried plugging the URL directly into the browser's address bar, and it exports an .xls file back to me just fine. Tips?
Try:
wget --user=userid --password=secret --no-check-certificate -O export.xls "https://myjiraurl.com/sr/jira.issueviews:searchrequest-excel-all-fields/10300/SearchRequest-10300.xls?tempMax=1000"
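One detail worth calling out in the command above: the quotes around the URL matter. Unquoted, the shell treats & as the background operator, so everything after the first & (including any os_username=... and os_password=... parameters you append) never reaches wget at all, which would explain why adding them appeared to change nothing. A quick illustration with a shortened, hypothetical URL:

```shell
# Quoted: the whole query string, & included, stays in one argument.
url="https://myjiraurl.com/sr/export.xls?tempMax=1000&os_username=me"
echo "$url"
# Unquoted, the shell would instead background everything before the &
# and treat os_username=me as a shell variable assignment.
```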

How to install an SSL certificate on Linux servers

I am trying to access an HTTPS WCF web service from my application using MonoDevelop on Linux. The web service call throws the following exception:
SendFailure (Error writing headers) at
System.Net.HttpWebRequest.EndGetRequestStream (IAsyncResult
asyncResult) [0x00043] in
/home/abuild/rpmbuild/BUILD/mono-3.4.0/mcs/class/System/System.Net/HttpWebRequest.cs:845
at
System.ServiceModel.Channels.HttpRequestChannel+c__AnonStorey1.<>m__0
(IAsyncResult r) [0x0001d] in
/home/abuild/rpmbuild/BUILD/mono-3.4.0/mcs/class/System.ServiceModel/System.ServiceModel.Channels/HttpRequestChannel.cs:219.
When I try to load the web service URL over https, the Firefox browser gives a warning:
"This connection is Untrusted"
The certificate (.cer) was generated using the Visual Studio 2008 makecert utility. I tried to install the certificate using the commands below:
certmgr -add -c -m My MyCert.cer
certmgr -ssl https://myserver.com:1200/Monik/TestSvc
But it looks like the certificates are not configured properly. Some forums say to move the certificate to /etc/httpd/conf, but there is no such folder on my system.
Please let me know what I am missing.
Mono handles certificates via mozroots, so the best thing in your case would probably be to run this:
sudo mozroots --import --machine --sync
sudo certmgr -ssl -m https://myserver.com:1200/Monik/TestSvc
The first command synchronises the public root certificates. The second will ask you whether you want to trust your server's certificate, which is displayed after you enter the command. Type "yes".

curl - Is data encrypted when using the --insecure option?

I have a situation where the client makes a call through curl to an https URL. The SSL certificate of the https URL is self-signed, so curl cannot validate the certificate and fails. curl provides the -k/--insecure option, which disables certificate validation.
My question: when using the --insecure option, is the data transferred between client and server still encrypted (as it should be for https URLs)? I understand the security risk of skipping certificate validation, but for this question I am only concerned with whether the data transfer is encrypted.
Yes, the transferred data is still sent encrypted. -k/--insecure only makes curl skip certificate validation; it does not turn off SSL altogether.
More information regarding the matter is available under the following link:
curl.haxx.se - Details on Server SSL Certificates
It will be encrypted but insecure. If you trust the certificate, you should add it to your certificate store instead of connecting insecurely.
macOS:
sudo security add-trusted-cert -d -r trustRoot -k /Library/Keychains/System.keychain ~/new-root-certificate.crt
Ubuntu, Debian:
sudo cp foo.crt /usr/local/share/ca-certificates/foo.crt
sudo update-ca-certificates
CentOS 6:
yum install ca-certificates
update-ca-trust force-enable
cp foo.crt /etc/pki/ca-trust/source/anchors/
update-ca-trust extract
CentOS 5:
cat foo.crt >>/etc/pki/tls/certs/ca-bundle.crt
Windows:
certutil -addstore -f "ROOT" new-root-certificate.crt
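The per-OS snippets above assume you already have the certificate as a file (foo.crt). If you only have the URL, the certificate can be pulled off the server with openssl s_client, as sketched in the comment below. So the commands can be shown end to end without a live server, the sketch generates a throwaway self-signed certificate locally instead; the host name myserver.example is hypothetical:

```shell
# In the real case, fetch the server's certificate like this:
#   openssl s_client -connect myserver.example:443 -servername myserver.example </dev/null \
#     | openssl x509 -outform PEM > foo.crt
# For illustration, create a self-signed certificate locally instead:
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
    -subj "/CN=myserver.example" \
    -keyout /tmp/foo.key -out /tmp/foo.crt

# Inspect the subject and expiry before trusting it:
openssl x509 -in /tmp/foo.crt -noout -subject -enddate
```

Checking the subject first is worthwhile: you want to be sure the file you are about to add to the system trust store is the certificate you expected.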
