Curl SOAP request using VPN on Linux

I need to make a SOAP request using curl through a VPN. The endpoint only allows calls through the VPN.
I have set up the VPN and through it I can successfully reach the endpoint, but when I try to call it and send the XML it returns an error.
192.168.11.11 is my VPN host.
ssh root@192.168.11.11 curl \
--header "Content-Type:text/xml" \
--header "SOAPAction:GenericAPIRequest" \
--header "charset=UTF-8" \
--data "@call.xml" \
"http://10.xxx.xx.xx:8080/xxxxx/xxxx/xxxxxxxx"
call.xml is in the same directory from which I'm running the curl command.
It returns the error: Warning: Couldn't read data from file "call.xml", this makes an empty POST.

This curl warning means that it could not fopen the file supplied.
The SSH command may be executing from root's home in /root, so ensure the file is there and that it is readable. You may want to use an absolute path.
Just to be clear, that curl command executes on the remote side, so call.xml must be located on 192.168.11.11, not in the client-side directory from which the ssh command is run.
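Two ways to make that work, sketched using the question's placeholders (root@192.168.11.11 as the VPN host, /root/ assumed writable, and the masked endpoint URL from above): either copy call.xml to the VPN host and reference it by absolute path, or keep it local and stream it into the remote curl over the SSH pipe with --data @-. The separate charset header from the question is folded into Content-Type here, since charset is a Content-Type parameter rather than a header of its own.
# Option 1: copy the file to the VPN host, then point curl at an absolute path
scp call.xml root@192.168.11.11:/root/call.xml
ssh root@192.168.11.11 'curl \
  --header "Content-Type:text/xml;charset=UTF-8" \
  --header "SOAPAction:GenericAPIRequest" \
  --data "@/root/call.xml" \
  "http://10.xxx.xx.xx:8080/xxxxx/xxxx/xxxxxxxx"'

# Option 2: keep the file local and feed it to the remote curl via stdin
ssh root@192.168.11.11 'curl \
  --header "Content-Type:text/xml;charset=UTF-8" \
  --header "SOAPAction:GenericAPIRequest" \
  --data @- \
  "http://10.xxx.xx.xx:8080/xxxxx/xxxx/xxxxxxxx"' < call.xml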

Related

Does Github still allow private raw file access with PAT in URL

I used to be able to make the following call with a Personal Access Token (the classic token)
curl https://$PAT@raw.githubusercontent.com/$ORG/$REPO/master/$FILE
But now I just get the GitHub default 404 response.
If I follow a different approach, the file is accessible.
curl -H "Authorization: token $PAT" \
-H "Accept: application/vnd.github.v3.raw" \
-O -L "https://api.github.com/repos/$ORG/$REPO/contents/$FILE?ref=master"
I can't find anything in the docs that states the old curl URL request has been removed. Has this method now been removed?

Run query API in curl

I'm trying to use curl to connect to a query API and retrieve data, but when I run my GET command it finishes without any error shown in -vv and no data comes back; below is the last line of the output. May I know if this is normal, or do I need to add additional arguments or parameters to my command to retrieve the data? If yes, how do I do it?
Connection #1 to host myserver.com left intact
My command
curl -vv -k -X GET -H --user user123:password1 -H "Accept: application/json" https://myapplink/statement/EXE=sample
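For what it's worth, "Connection #1 to host myserver.com left intact" is only curl's note that it kept the connection open for reuse; it is not an error. A sketch of the command with the stray -H before --user removed and the response body written to a file (the URL and credentials are the placeholders from the question):
# -H expects a header value, so "-H --user" makes curl treat --user as a header;
# drop it, pass the credentials to --user, and save the JSON body with -o
curl -v -k --user user123:password1 \
  -H "Accept: application/json" \
  -o response.json \
  "https://myapplink/statement/EXE=sample"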

Encrypting credentials

I am using curl -u user:password -X POST in a shell script to trigger my Jenkins jobs externally. With this method I am providing my credentials to access Jenkins in plain text.
Is there any way to hide or encrypt the credentials?
curl's -u option does not support encrypting the username and password, but there are a couple of ways to keep them out of the command itself.
1. Create environment variables and use them in your curl command like below:
export USERNAME=""
export PASSWORD=""
and then:
curl -u $USERNAME:$PASSWORD -X POST ...
2. Use a .netrc file with the curl command.
The relevant curl options are:
-n, --netrc Must read .netrc for user name and password
--netrc-file <filename> Specify FILE for netrc
Steps to use .netrc:
Create a .netrc file in your home directory (~) with the content:
machine jenkins.url
login username
password jenkinsTokenOrPassword
Then invoke the curl command:
curl -n -X POST ....
Note: if you don't want to keep the .netrc file in your home directory (~), place it somewhere else and tell curl where it is, e.g. curl --netrc-file /path/to/.netrc -X POST ...
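Putting the .netrc route together, a minimal sketch; jenkins.example.com, my-job, myuser and myApiToken are placeholders, and depending on the Jenkins security settings you may also need a CSRF crumb or the buildWithParameters endpoint:
# one-time setup: create ~/.netrc and make it readable only by you
cat > ~/.netrc <<'EOF'
machine jenkins.example.com
login myuser
password myApiToken
EOF
chmod 600 ~/.netrc

# trigger the job; no credentials appear on the command line or in shell history
curl -n -X POST "https://jenkins.example.com/job/my-job/build"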

How does urllib.request differ from curl or httpx in behaviour? Getting a 401 in a request to the Google Container Registry

I am currently working on some code to interact with images on the Google Container Registry. I have working code both using plain curl and also httpx. I am trying to build a package without 3rd party dependencies. My curiosity is around a particular endpoint from which I get a successful response in curl and httpx but a 401 Unauthorized using urllib.request.
The bash script that demonstrates what I'm trying to achieve is the following. It retrieves an access token from the registry API, then uses that token to verify that the API indeed runs version 2 and tries to access a particular Docker image configuration. I'm afraid that in order to test this, you will need access to a private GCR image and a digest for one of the tags.
#!/usr/bin/env bash
set -eu
token=$(gcloud auth print-access-token)
image=...
digest=sha256:...
get_token() {
curl -sSL \
-G \
--http1.1 \
-H "Authorization: Bearer ${token}" \
-H "Accept: application/vnd.docker.distribution.manifest.v2+json" \
--data-urlencode "scope=repository:$1:pull" \
--data-urlencode "service=gcr.io" \
"https://gcr.io/v2/token" | jq -r '.token'
}
echo "---"
echo "Retrieving access token."
access_token=$(get_token ${image})
echo
echo "---"
echo "Testing version 2 capability with access token."
curl -sSL \
--http1.1 \
-o /dev/null \
-w "%{http_code}" \
-H "Authorization: Bearer ${access_token}" \
-H "Accept: application/vnd.docker.distribution.manifest.v2+json" \
https://gcr.io/v2/
echo
echo "---"
echo "Retrieving image configuration with access token."
curl -vL \
--http1.1 \
-o /dev/null \
-w "%{http_code}" \
-H "Authorization: Bearer ${access_token}" \
-H "Accept: application/vnd.docker.distribution.manifest.v2+json" \
"https://gcr.io/v2/${image}/blobs/${digest}"
I additionally created two Jupyter notebooks demonstrating my solutions in httpx and bare urllib.request. The httpx one works perfectly while somehow urllib fails on the image configuration request. I'm running out of ideas trying to spot the difference. If you run the notebook yourself, you will see that the called URL contains a token as a query parameter (is this a security issue?). When I open that link I can actually successfully download the data myself. Maybe urllib still passes along the Authorization header with the Bearer token making that last call fail with 401 Unauthorized?
Any insights are greatly appreciated.
I did some investigation and I believe the difference is that the last call to "https://gcr.io/v2/${image}/blobs/${digest}" actually involves a redirect. Inspecting the curl and httpx calls showed me that neither includes the Authorization header in the second, redirected request, whereas the way I set up urllib.request in the notebook, that header is always included. It's a bit odd that this leads to a 401, but now I know how to address it.
Edit: I can now confirm that by building a urllib.request.Request instance and, unlike in the linked notebook, adding the Authorization header with the request's add_unredirected_header method, everything works as expected.
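The same observation can be reproduced from the shell. This sketch (reusing the ${access_token}, ${image} and ${digest} variables from the script above) prints the request headers curl sends on each hop, so you can see the Authorization header on the first request to gcr.io but not on the redirected one:
# -v writes request headers to stderr prefixed with "> "; discard the body itself
curl -sSL -v \
  -o /dev/null \
  -H "Authorization: Bearer ${access_token}" \
  -H "Accept: application/vnd.docker.distribution.manifest.v2+json" \
  "https://gcr.io/v2/${image}/blobs/${digest}" 2>&1 | grep '^> '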

cURL not sending file data

I have a cURL call that I'm trying to use to send file data to a remote server.
curl -X POST -u username:password -d 'data=@/path/to/file.ext&version=2&action=Parse' http://fqdn.to.server.i.control/Parser.cgi
curl -X POST -u username:password -d 'data=@localFile.ext&version=2&action=Parse' http://fqdn.to.server.i.control/Parser.cgi
cat file.ext | curl -X POST -u username:password -d 'data=@-&version=2&action=Parse' http://fqdn.to.server.i.control/Parser.cgi
The file contents are URI encoded already. Using Perl and CGI on the server side.
My problem is that when the server reads that "data" parameter, the value it gets is only "file.ext" - the path is stripped out and the file's contents are not used ($cgi->param("data") is just "file.ext", "localFile.ext" or "-" respectively).
Any indication as to what I'm doing wrong?
@MattJacob was correct; my syntax was wrong: data=@... should have been just @..., and the data= portion should have been moved into the file. Boy am I thick.
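For completeness, two forms that should work against the same Parser.cgi endpoint, sketched from the answer above: either put the data= prefix (followed by the already URL-encoded payload) inside the file, or keep the payload un-encoded in the file and let --data-urlencode build the field:
# Option 1: file.ext starts with "data=" followed by the URL-encoded payload;
# the extra -d options are joined with "&" by curl
curl -X POST -u username:password \
  -d @/path/to/file.ext \
  -d "version=2" -d "action=Parse" \
  "http://fqdn.to.server.i.control/Parser.cgi"

# Option 2: file.ext holds the raw, un-encoded payload; --data-urlencode "name@file"
# URL-encodes the contents and sends it as data=<encoded contents>
curl -X POST -u username:password \
  --data-urlencode "data@/path/to/file.ext" \
  -d "version=2" -d "action=Parse" \
  "http://fqdn.to.server.i.control/Parser.cgi"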
