Why doesn't Telegram send updates after setting the webhook? - node.js

I can get updates with the Telegram getUpdates API, but after setting a webhook with either a valid SSL certificate or a self-signed certificate it only responds with:
{"ok":true,"result":true,"description":"Webhook was set"}
However, it does not send any updates to my webhook URL (I checked the Nginx and node.js access log files). I have tried several curl commands for the setWebhook API, with and without the certificate, but still no result:
curl -s -X POST https://api.telegram.org/bot<TOKEN>/setWebhook -d url='https://www.example.tech/<TOKEN>/webhook' | jq .
curl -F "url=https://www.example.tech/<TOKEN>/webhook" -F "certificate=@./www_example_tech.crt" https://api.telegram.org/bot<TOKEN>/setWebhook

The problem may be with your certificate. Could you check it with an online SSL checker to confirm it is valid?
For example, this one: https://www.sslshopper.com/ssl-checker.html
Also, do you see the request in access.log if you call the webhook URL directly?
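Two quick checks worth running (a sketch; <TOKEN> and the domain are the placeholders from the question). getWebhookInfo reports what Telegram thinks of your webhook, including a last_error_message when its delivery attempts fail, and openssl shows the certificate your server actually serves:
curl -s https://api.telegram.org/bot<TOKEN>/getWebhookInfo | jq .
openssl s_client -connect www.example.tech:443 -servername www.example.tech </dev/null 2>/dev/null | openssl x509 -noout -subject -issuer -dates
A common cause with self-signed certificates is that the certificate's Common Name does not match the domain used in setWebhook, or that the public certificate was not uploaded in the certificate parameter.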

Related

Node.js headers not matching actual request

There is a problem with Node.js v7.9.0 where a request such as the following is made:
curl -i -H Accept:application/json -H range:bytes=1-8 -X GET http://localhost:8080/examples/text.txt
However, Node's request header does not match the actual request when it is logged:
console.log(req.headers.range)
The logged value varies for the exact same request
(some values logged from that request: bytes=1-2, bytes=1-3, bytes=1-4, bytes=1-5, bytes=1-6, bytes=1-7, bytes=1-8)
Is this a problem with Node.js or with something else in the computer's setup? And how does one fix it?
Note: the requests are being made with the "Rest Web service client" Chrome plugin, and the request above is the equivalent curl command.
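One way to tell whether the client or Node is at fault (a debugging sketch, not from the original post): replace the Node server with netcat so you can see the raw bytes of the request, then send the identical request and compare the Range header on the wire with what Node logs.
# terminal 1: listen on the port and dump the raw request
# (some netcat variants need `nc -l -p 8080` instead; curl will hang until you close nc)
nc -l 8080
# terminal 2: send the identical request
curl -i -H Accept:application/json -H range:bytes=1-8 -X GET http://localhost:8080/examples/text.txt
If the raw request always shows range: bytes=1-8, the client and network are fine and the variation is happening inside Node or whatever sits in front of it.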

How to set up a cron job in cPanel if the connection is not secure?

I am trying to set up a cron job with the following command:
wget -O - -q -t 1 https://myexample.com/check/test > /dev/null
but it is not working.
When I try to open https://myexample.com/check/test in a web browser,
I see the message "Your connection is not private".
You will need an SSL certificate to get rid of that security warning. You could use one generated by Let's Encrypt, which is free (see the certbot sketch at the end of this answer); an alternative is to get a free SSL certificate through startssl.com. If you just need your cron job to run, you can skip certificate verification like this:
/usr/bin/wget --no-check-certificate -O - -q -t 1 https://myexample.com/check/test > /dev/null
Accessing the same link in a web browser without a valid SSL certificate will still result in a security warning. If you do not want to buy or use a real SSL certificate, you could use the Firefox web browser and add an exception for that website/certificate.
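If you go the Let's Encrypt route, a minimal certbot sketch in webroot mode could look like the following (the webroot path is an assumption; on shared cPanel hosting you may not have root access and would use your host's Let's Encrypt/AutoSSL integration instead):
sudo certbot certonly --webroot -w /home/youruser/public_html -d myexample.com
# certbot writes the certificate and key to:
#   /etc/letsencrypt/live/myexample.com/fullchain.pem
#   /etc/letsencrypt/live/myexample.com/privkey.pem
# point your web server's SSL configuration at those files and the
# wget call no longer needs --no-check-certificate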

How to download Bing Static Map in Linux

I'm trying to download a static map using the Bing Maps API. It works when I load the URL in Chrome, but when I try curl or wget on Linux, I get an "Auth Failed" error.
The URLs are identical, so why does Bing appear to be blocking calls from Linux?
Here are the commands I tried:
wget -O map.png http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/...
curl -o map.png http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/...
Error:
Resolving dev.virtualearth.net (dev.virtualearth.net)... 131.253.14.8
Connecting to dev.virtualearth.net (dev.virtualearth.net)|131.253.14.8|:80... connected.
HTTP request sent, awaiting response... 401 Unauthorized
Username/Password Authentication Failed.
--2016-10-24 15:42:30-- http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/.../12?mapSize=340,500
Reusing existing connection to dev.virtualearth.net:80.
HTTP request sent, awaiting response... 401 Unauthorized
Username/Password Authentication Failed.
I'm not sure if it has anything to do with the key type; I've tried several, from Public Website to Dev/Test, but it still didn't work.
The URL needs to be wrapped in quotes, because the & symbol in the query string would otherwise be interpreted by the shell:
wget 'http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/...'
Examples
Via wget:
wget -O map.jpg 'http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/Bellevue%20Washington?mapLayer=TrafficFlow&key=<key>'
Via curl:
curl -o map.jpg 'http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/Bellevue%20Washington?mapLayer=TrafficFlow&key=<key>'
Both have been verified under Ubuntu 16.04.
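An alternative to quoting the whole URL is to let curl assemble the query string itself with -G and --data-urlencode, which keeps the shell away from the & characters (a sketch using the same placeholder key as above):
curl -G -o map.jpg 'http://dev.virtualearth.net/REST/V1/Imagery/Map/Road/Bellevue%20Washington' \
  --data-urlencode "mapLayer=TrafficFlow" \
  --data-urlencode "key=<key>"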

How to find Windows Azure Image IDs?

How can I find the publicly available Image IDs on Windows Azure?
I found this related question - Azure: List OS Images
But the answer requires Windows + PowerShell, while I need a way to get them on Linux or via REST.
Use the URL specified here:
http://msdn.microsoft.com/en-us/library/windowsazure/jj157191.aspx
You'll need to provide a client certificate when sending the request.
If you are using curl on Linux, add the --cert option pointing to a .pem file (you'll first need to upload the certificate to your subscription's management certificates as a .cer file).
Don't forget to add the x-ms-version header for it to work:
-H "x-ms-version: 2013-03-01"
Here is an example of using curl to get the auto-scale information for a cloud service:
curl -H "accept: application/json" -H "x-ms-version: 2013-10-01"
--cert azure-cert.pem $AUTOSCALEURL
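For the original question (listing the available OS images), the same pattern should work against the Service Management "List OS Images" operation; a sketch, assuming the azure-cert.pem from above and your own subscription ID:
curl -H "x-ms-version: 2013-03-01" --cert azure-cert.pem \
  "https://management.core.windows.net/<subscription-id>/services/images"
The response is XML; the Name element of each OSImage entry is the image ID you would use when creating a VM.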

curl - Is data encrypted when using the --insecure option?

I have a situation where a client makes a call through curl to an https URL. The SSL certificate of that URL is self-signed, so curl cannot validate the certificate and fails. curl provides the -k/--insecure option, which disables certificate validation.
My question is: when using the --insecure option, is the data transferred between client and server still encrypted (as it should be for https URLs)? I understand the security risk of skipping certificate validation, but for this question I am only concerned with whether the data transfer is encrypted or not.
Yes, the transferred data is still sent encrypted. -k/--insecure only makes curl skip certificate validation; it does not turn off SSL altogether.
More information regarding the matter is available under the following link:
curl.haxx.se - Details on Server SSL Certificates
It will be encrypted, but the connection is still insecure because the server's identity is not verified. If you trust the certificate, you should add it to your certificate store instead of connecting insecurely (per-OS commands below; a curl-only alternative follows the list).
macOS:
sudo security add-trusted-cert -d -r trustRoot -k /Library/Keychains/System.keychain ~/new-root-certificate.crt
Ubuntu, Debian:
sudo cp foo.crt /usr/local/share/ca-certificates/foo.crt
sudo update-ca-certificates
CentOS 6:
yum install ca-certificates
update-ca-trust force-enable
cp foo.crt /etc/pki/ca-trust/source/anchors/
update-ca-trust extract
CentOS 5:
cat foo.crt >>/etc/pki/tls/certs/ca-bundle.crt
Windows:
certutil -addstore -f "ROOT" new-root-certificate.crt
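If you would rather not modify the system-wide store, curl can also be told to trust a single certificate for a single call with --cacert (a sketch; foo.crt is the server's self-signed certificate and the URL is a placeholder):
curl --cacert ./foo.crt https://self-signed.example.com/
This keeps certificate validation enabled, unlike --insecure, but it only works if the certificate's CN/SAN matches the host name you connect to.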
