Is there a way to use cURL commands in Azure Data Factory?

So I am trying to create a pipeline in Azure Data Factory, and part of the process involves uploading a CSV to a temporary URL generated by an earlier REST API request in the pipeline. The API's documentation says to use a cURL command or "a similar application". I have gotten the cURL command to work in my local environment but have had no luck doing it in ADF. The cURL command I am currently using is:
curl --upload-file "<file location>" "<api URL>" --ssl-no-revoke -v
While ADF supports web requests, it does not seem to support cURL commands, at least not directly. Currently I am trying to automate the cURL command through an Automation Account which runs a PowerShell script, and then use a webhook to continue from there within the pipeline, but I have my doubts that this will work due to having to pass the temporary URL from the pipeline to the PowerShell script.
The questions can be summed up as follows:
Is it possible to put a cURL command in a web request? I have not found any good examples of this, as most cURL commands seem to be run from PowerShell or Command Prompt.
Is there some ADF functionality I am not aware of that runs cURL commands?
What are the alternatives to cURL that I could use for this process? Are they friendlier than cURL when it comes to ADF?
Any other potential advice I may need to know
I appreciate any input on this matter!

The only way to do that is by creating your own batch (*.bat) files which have the curl commands in them and executing them from the server side. You can save the results to text files, which you can then also read from the server side.
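For what it's worth, the webhook approach described in the question should also be workable: a runbook started by an Automation webhook receives a $WebhookData parameter, so the pipeline's Web activity can pass the temporary URL in the request body. A minimal sketch, assuming the pipeline posts a JSON body with uploadUrl and filePath properties (both names are placeholders):

# Minimal Automation runbook sketch: receive the temporary URL via the
# webhook body, then PUT the CSV (the equivalent of curl --upload-file).
param (
    [object] $WebhookData   # populated automatically when started by a webhook
)

# Assumed body shape: {"uploadUrl": "...", "filePath": "..."}
$body = $WebhookData.RequestBody | ConvertFrom-Json

# curl --upload-file performs an HTTP PUT; Invoke-RestMethod -InFile does the same
Invoke-RestMethod -Method Put -Uri $body.uploadUrl -InFile $body.filePath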

Related

Jenkins using PowerShell Azure Modules failing to run

I have a bit of a strange one here.
I have written a PowerShell script to pull Azure blob storage objects, which works absolutely fine when I run it manually via the console. However, if I run it from Jenkins, it calls the PowerShell function and starts running through it, but it fails when using the Az.Storage cmdlet Get-AzStorageBlobContent.
The errors seem to be things like "Couldn't connect" or "Retry count exceeded", etc. I am unable to use the -Debug switch as it says it is interactive.
I have tested by including the access token inside the script, eliminating Jenkins from handling the secrets, but I get the same issue.
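For reference, here is the shape of the failing call, reduced to a minimal sketch with an explicit storage context rather than a stored session (the account, key, container, and blob names are placeholders):

# Build an explicit storage context so nothing falls back to an interactive login
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey $env:STORAGE_KEY

# Download the blob; -Force overwrites any existing local file
Get-AzStorageBlobContent -Container "mycontainer" -Blob "myblob.json" -Destination "C:\temp\myblob.json" -Context $ctx -Force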
Jenkins is running the latest version, the latest PowerShell plugin, and the latest version of Java. It is also calling the 64-bit PowerShell session as expected.
I am also aware that there is a Jenkins blob storage plugin; however, due to the amount of additional work required, it makes a lot more sense to use the PowerShell modules inside a PowerShell script for this.
I would really appreciate it if anyone has any ideas about this. It has been driving me nuts for weeks.
Many Thanks

How do I retrieve files stored in Azure or in OneDrive with a LabVIEW application?

I want to build a LabVIEW application starter which can check the main application's version and, if needed, download and update the .exe file, which is stored in my OneDrive or even somehow in Azure.
I said Azure because I think there must be an option for it... but I can't find the correct keywords.
Does anyone have a tip to share?
The benefit of LabVIEW being a higher-level application is that it gives you access to a handy little VI: the System Exec VI, which allows you to run general command-line scripts from within LabVIEW. So if you can find a Windows command or a batch file to force the sync, then you can interface with external programs.
The example batch file there should work with OneDrive, but with a little hunting around you should be able to find something that works with Azure.
P.S. Have a look at the 'Command Line Execution.vi' example from LabVIEW to see more about how this VI works.
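For example, here is a single PowerShell command that System Exec could invoke to fetch a file over HTTP (the URL and target path are placeholders; a private Azure blob URL would also need a SAS token appended):

powershell -NoProfile -Command "Invoke-WebRequest -Uri 'https://example.blob.core.windows.net/releases/MyApp.exe' -OutFile 'C:\temp\MyApp.exe'"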
I wrote similar code that downloads and installs new releases from AWS S3. It is a real time-saver and worth the effort to get working.
I haven't interacted with OneDrive, but from the documentation it looks a bit easier than AWS S3, because Microsoft accepts OAuth2 tokens versus the more complex AWS Signature Version 4. If you are using LabVIEW, you'll need to use the HTTP client functions to interact with the remote service. I recommend starting with the JKI HTTP REST API client toolkit, which works around a flaw in the HTTPClient:OpenHandle function (it is globally blocking!) by maintaining a connection pool.
Microsoft's documentation looks pretty good; basically what you will want to do is: open an HTTP session to the authorization endpoint, request an access token, close the HTTP session, open an HTTP session to the OneDrive endpoint, format the token into an Authorization: Bearer header, add that header to the session, submit a list-files request, find your file, submit a download request, save the output to a file, and close the HTTP session.
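To make that sequence concrete, here is roughly what those calls look like at the HTTP level, sketched in PowerShell purely for illustration (the tenant, client, user, and item IDs are placeholders, and the client-credentials grant shown is only one of the OAuth2 options):

# Step 1: request an access token from the authorization endpoint
$token = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token" -Body @{
    client_id     = "<client-id>"
    client_secret = "<client-secret>"
    scope         = "https://graph.microsoft.com/.default"
    grant_type    = "client_credentials"
}

# Step 2: call the OneDrive (Microsoft Graph) endpoint with a bearer header
$headers = @{ Authorization = "Bearer $($token.access_token)" }

# List the files, find yours, then download its content to disk
Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/users/<user-id>/drive/root/children" -Headers $headers
Invoke-WebRequest -Uri "https://graph.microsoft.com/v1.0/users/<user-id>/drive/items/<item-id>/content" -Headers $headers -OutFile "C:\temp\MyApp.exe"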
Along the way you will need to parse the API's JSON responses. As NI's built-in JSON parser is rather inflexible, I recommend JKI JSON. Both of the JKI tools are most easily installed with VIPM, which is installed by default in recent editions of LabVIEW.

Django custom management commands as a Windows scheduled task

I'm trying to run a Django 2.1 custom management command from within a Python 3 virtual environment on a Windows server with the Task Scheduler. The command I've tried works as follows:
C:\Users\dev\Programs\Python3\Scripts\python.exe C:\Users\dev\Programs\Python3\Scripts\access-api\my_project\manage.py accessapi
The script runs just fine if I execute it as a .bat file, but when I try to create a scheduled task and run the .bat script, the Task Scheduler either fails or says it completed, yet the data that I'm looking to update doesn't get updated. I managed to find a Reddit post about this same issue, but it doesn't seem to work as described.
The script itself isn't a pretty one, but it works: it uses pyodbc drivers to run queries against an Access 2010 database, converts the results to JSON, then updates the required records in a PostgreSQL database using Django's API.
"Actions" Tab in task property use your commands to configure as:
Program script field:
C:\Python36\python.exe
Add Argumenrs(optional) field:
"C:\Users\dev\Programs\Python3\Scripts\access-api\my_project\manage.py" accessapi

Azure Chef Automate Starter Kit

I have a Chef Automate server in Azure that I am just starting to configure according to the Chef docs (https://docs.chef.io/azure_portal.html). I successfully set up my credentials and logged into the Chef server; however, I was not prompted to download the starter_kit.zip file. Is there a way to manually download this file, and if so, how/where can I do that?
I also ran the following command on the Chef Automate server and got a 404 Not Found error.
curl -X GET https://MY-AUTOMATE-SERVER.azure.com/biscotti/setup/starter-kit -k
I don't think the Starter Kit was carried over from the Manage product to Automate. You would normally just make your own knife config file these days.
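For reference, a minimal knife config file (saved as ~/.chef/knife.rb or, in newer releases, ~/.chef/config.rb) looks something like this; the node name, key path, and organization URL are placeholders:

# Minimal knife configuration (all values are placeholders)
node_name       "your-username"
client_key      "/path/to/your-username.pem"
chef_server_url "https://MY-AUTOMATE-SERVER.azure.com/organizations/your-org"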

CouchDB: Workflow for creating documents?

Beginner question.
My goal is to add documents to my CouchDB database on a remote server.
Is this a good way?
1. Create CouchDB documents using a code editor and store them on my hard drive.
2. Post them to my CouchDB database on the remote server.
If yes, how do I do #2 on Windows?
I suggest using your browser and the Futon web app, which all CouchDB servers support. Just go to /_utils/ in your browser and you can get started adding and modifying documents.
Once you are comfortable, I suggest the curl tool; however, keep in mind that Windows quoting is a little different from the examples you usually see.
On Linux and Mac, curl looks like this:
curl -X POST http://localhost:5984/db/ -d '{"_id":"something"}'
On Windows, you cannot use single quotes, so the inner double quotes must be escaped:
curl -X POST http://localhost:5984/db/ -d "{\"_id\":\"something\"}"
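Alternatively, PowerShell's Invoke-RestMethod sidesteps the Windows quoting issue entirely (same database and document as the example above):

Invoke-RestMethod -Method Post -Uri "http://localhost:5984/db/" -ContentType "application/json" -Body '{"_id":"something"}'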
