I want to transfer some of my data from a WHM server to SoftLayer Object Storage without using FTP or SCP. Can anyone suggest a way to do this?

I have tried this, but it only creates a single object in object storage:
$ curl -i -XPUT -H "X-Auth-Token: AUTH_tkb26239d441d6401d9482b004d45f7259" --data-binary "Created for testing REST client" https://dal05.objectstorage.softlayer.net/v1/AUTH_df0de35c-d00a-40aa-b697-2b7f1b9331a6/container2/folder3/file1.txt

The swift client is an easy way to list, upload, and download files on SoftLayer Object Storage:
https://swiftstack.com/docs/integration/python-swiftclient.html
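For example, a minimal sketch using the swift CLI with SoftLayer's legacy v1.0 auth (the username, API key, and local file path below are placeholders, not values from the question):
$ pip install python-swiftclient
$ swift -A https://dal05.objectstorage.softlayer.net/auth/v1.0 -U SLOS123456-2:username -K your_api_key upload container2 /backups/site-backup.tar.gz
$ swift -A https://dal05.objectstorage.softlayer.net/auth/v1.0 -U SLOS123456-2:username -K your_api_key list container2
Unlike the curl PUT above, which stores the literal request body as the object's contents, swift upload reads the actual file from disk and can recurse into directories.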

Related

cURL REST API via proxy

I want to get data from ServiceNow via cURL in PuTTY through a proxy server (the plan is to move it into a PySpark script later on) and then save the data onto the server.
My command looks like this:
curl -x <proxyaddress:port> -U proxyuser:proxypassword -u '<apiuser:apipassword>' -d status="message" "https://apiaddress" -H 'Accept: application/json'
I get the error message:
{"error":{"message":"Invalid content-type. Supported request media types for this service are: [application/json, application/xml, text/xml]","detail":null},"status":"failure"}
A few days ago I was able to get the data printed into the log, but I haven't managed to replicate the command ... what's wrong?
Thanks for your help.
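A likely cause, judging from the error message: curl's -d flag sends the body as application/x-www-form-urlencoded by default, which is not in the service's list of supported media types. A hedged sketch of the same call with an explicit JSON content type and a JSON body (all proxy, credential, and host values remain placeholders):
curl -x <proxyaddress:port> -U proxyuser:proxypassword -u '<apiuser:apipassword>' -H 'Content-Type: application/json' -H 'Accept: application/json' -d '{"status":"message"}' "https://apiaddress"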

How to upload files to SharePoint Online using curl on Linux

I am trying to automate file uploads to SharePoint Online. The problem is that I keep getting a 401 Unauthorized error when trying to upload the files. I created a script to retrieve the token as suggested in curl request to Microsoft Sharepoint API?, but uploads still fail, even though I do get a positive response when running curl -i -H "Authorization: Bearer $(./get_access_token.sh)" -H "Accept: application/json;odata=verbose" -s "https://YourTenant.sharepoint.com/_api/web". I have a feeling that I am just malforming my curl command. Any suggestions on the command format I should be using?
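One possible shape for the upload call, assuming the standard SharePoint REST Files/add endpoint and a token with write permission (the folder path and file name here are placeholders):
curl -X POST -H "Authorization: Bearer $(./get_access_token.sh)" -H "Accept: application/json;odata=verbose" --data-binary @myfile.txt "https://YourTenant.sharepoint.com/_api/web/GetFolderByServerRelativeUrl('/Shared%20Documents')/Files/add(url='myfile.txt',overwrite=true)"
Note that a token able to read /_api/web may still lack write permission, which can also surface as a 401 on upload.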

Making a single file of a private repo on Gitlab to be publicly accessible

I have a bash script file in my GitLab private repo. I wish to download the file on Linux with the wget command; however, this fails because the file is hosted in a private repo, so the request is redirected to the login page.
Is there a way to make this single file publicly accessible? If not, is there a way to include my credentials in the GET URL when attempting to open the file?
If you can use curl, you can use the GitLab API to get a raw file from the repository. You'd need to pass your private token as well to get this file.
For example:
curl --request GET --header 'PRIVATE-TOKEN: YOUR_PRIVATE_TOKEN' 'https://gitlab.example.com/api/v4/projects/PROJECT_ID/repository/files/FILE_NAME/raw?ref=BRANCH' --output FILE_NAME
As mentioned by @Revkoni, you can use the GitLab API for this:
$ wget --header="PRIVATE-TOKEN: XXXXXXXX" "https://gitlab.example.com/api/v4/projects/PROJECT_ID/repository/files/FILE_NAME/raw?ref=BRANCH"
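Note that FILE_NAME in these calls is the file's path within the repository and must be URL-encoded, so a nested path such as scripts/deploy.sh (a hypothetical path) becomes:
curl --header 'PRIVATE-TOKEN: YOUR_PRIVATE_TOKEN' 'https://gitlab.example.com/api/v4/projects/PROJECT_ID/repository/files/scripts%2Fdeploy.sh/raw?ref=BRANCH' --output deploy.sh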

How to find Windows Azure Image IDs?

How can I find the publicly available Image IDs on Windows Azure?
I found this related question: Azure: List OS Images
But that answer requires Windows and PowerShell, while I need a way to get it on Linux or via REST.
Use the URL specified here:
http://msdn.microsoft.com/en-us/library/windowsazure/jj157191.aspx
You'll need to provide a client certificate when sending the request.
If you are using curl on Linux, add the --cert option pointing to a .pem file (you'll need to upload the certificate to the subscription's management certificates as a .cer file first).
Don't forget to add the x-ms-version header for it to work:
-H "x-ms-version: 2013-03-01"
Here is an example of using curl to get the auto-scale information for a cloud service:
curl -H "accept: application/json" -H "x-ms-version: 2013-10-01" \
  --cert azure-cert.pem $AUTOSCALEURL

How to send a file to SharePoint from Linux, creating non-existent directories

I have a problem sending a file from Linux to SharePoint. Everything is fine if I am uploading to an existing directory; I use this method:
curl --ntlm --user username:password --upload-file myfile.xls https://sharepointserver.com/sites/mysite/myfile.xls
Unfortunately, a problem arises when I point the target at a non-existing directory, like:
curl --ntlm --user username:password --upload-file myfile.xls https://sharepointserver.com/sites/mysite/nonexist/myfile.xls
I would like it to create all necessary directories on the path. I've tried to use the "--create-dirs" curl option, but it doesn't work.
Any ideas how to achieve the goal? It doesn't have to be curl, actually; I can use a different method available on Linux.
As the name (Client URL) suggests, curl will not create new directories on a remote server over http/https while uploading files.
For downloads over http/https, the --create-dirs option only creates directories on the local machine (for instance, when you are downloading content onto your local Linux machine).
However, when using ftp/sftp to a server, curl can create new directories on the remote server, as shown in the sketch below.
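For the ftp/sftp case, the relevant flag is --ftp-create-dirs, which despite its name also applies to SFTP. A sketch, with host and path as placeholders:
curl --ftp-create-dirs -T myfile.xls sftp://username@sftpserver.example.com/sites/mysite/nonexist/myfile.xls
This will not help against an https SharePoint endpoint, though; there the folder has to exist (or be created through SharePoint itself) before the upload.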
