Get XML Data from API URL in Azure

I need to get XML data from an API, store the data in Azure Data Lake Store, and finally have a table created for it in Azure SQL DB/DWH. I know ADF can't handle XML data. How do I pull XML data into Azure? I have been checking some links on using Logic Apps. Any suggestions or ways to handle it?

As I'm not entirely clear on the details of your requirement, and you asked how to pull XML data into Azure, I'll post some suggestions for your reference.
If you want to get the XML data from your API, you can use the "HTTP" action in your logic app.
Then you can use the output of that action in the next steps of your logic app.
If you want to parse the XML data, you need to transform it to JSON: add a "Parse JSON" action, enter "json(xml(body('HTTP')))" in the Content box, and provide a schema.
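For reference, here is a minimal sketch of the same XML-to-JSON conversion done outside Logic Apps in Python; the endpoint URL is a placeholder and the third-party xmltodict package is an assumption, not part of Logic Apps:

```python
# Minimal sketch: fetch XML from an API and re-serialize it as JSON,
# mirroring json(xml(body('HTTP'))) in the logic app.
# Assumes: pip install requests xmltodict; placeholder endpoint URL.
import json
import requests
import xmltodict

API_URL = "https://example.com/api/data"  # placeholder endpoint

response = requests.get(API_URL)
response.raise_for_status()

# Parse the XML payload into a dict, then dump it as JSON before
# landing it in the data lake or loading it into SQL.
as_dict = xmltodict.parse(response.content)
as_json = json.dumps(as_dict, indent=2)
print(as_json)
```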

Related

Azure integration account - unable to upload JSON schemas

I tried uploading JSON-format data to an integration account, but it throws an error:
"Integration account The content of schema 'sd' of type 'Xml' must be a valid XML."
On an Azure logic app I am able to upload JSON schemas, but not on an Azure integration account.
If the Azure integration account doesn't support this, kindly help me with the official link.
I tried uploading the JSON file; it did not work.
As mentioned in the documentation, an integration account expects schema files to be in XSD format and maps to be in XSLT format.
If you want to work with JSON files, you need to use a Liquid template. Here is a reference link to generate Liquid templates.
Basically, an integration account uses maps to transform XML data between formats. A map is an XML document that defines how the data in one document should be transformed into another format.
Reference link

Import data from Salesforce Report using ADF not working

I am working on importing data from a Salesforce report using ADF. I am using a copy activity and have created a Salesforce linked service.
I followed the Microsoft documentation https://learn.microsoft.com/en-us/azure/data-factory/connector-salesforce to set up ADF. I am getting empty rows.
I used this tip: get Salesforce reports by specifying a query as {call "<report name>"}. An example is "query": "{call "TestReport"}".
I can see the header information, but all rows are empty. What am I missing here?

Azure LogicApp for files migration

I am trying to figure out if Azure LogicApp can be used for files/documents migration from Azure Blob Storage to a custom service, where we have REST API. Here is the shortlist of requirements I have right now:
Files/documents must be uploaded into Azure Storage weekly or daily, which means we need to migrate only new items. The number of files/documents per week is in the hundreds of thousands
The custom service REST API is secured and any interaction with endpoints should have JWT passed in the headers
I worked through the exercise according to the tutorials.
Everything seems fine, but the following two requirements make me worry:
Getting only new files and not migrating those that have already been moved
Getting a JWT to pass security checks in the REST API
For the first point, I think I can introduce a DB instance (for example Azure Table Storage) to track files that have already been moved, and for the second one I have an idea to use an Azure Function instead of the HTTP action. But everything looks quite complicated, and I believe there might be better and easier options.
Could you please advise what else I can use for my case?
For the first point, you can use the "When a blob is added or modified" trigger as the logic app's trigger. The logic app will then only operate on new blob items.
For the second point, here are some steps for your reference (see the sketch after this list):
1. First, use an "HTTP" action to request the token (I have done this in a logic app in the past).
2. Then use a "Parse JSON" action to parse the response body from the "HTTP" action above.
3. After that, you can call your REST API with the access token from "Parse JSON" above.
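As a minimal sketch of those three steps in Python, assuming a hypothetical OAuth2-style token endpoint and REST API (adjust the grant type and payload to whatever your custom service actually expects):

```python
# Sketch of: 1) request a token, 2) parse the JSON response,
# 3) call the secured REST API with the JWT as a bearer token.
# The URLs and credentials below are placeholders.
import requests

TOKEN_URL = "https://login.example.com/oauth2/token"  # placeholder
API_URL = "https://api.example.com/documents"         # placeholder

# 1. Request the token (the "HTTP" action in the logic app).
token_response = requests.post(TOKEN_URL, data={
    "grant_type": "client_credentials",
    "client_id": "<client-id>",
    "client_secret": "<client-secret>",
})
token_response.raise_for_status()

# 2. Parse the response body (the "Parse JSON" action).
access_token = token_response.json()["access_token"]

# 3. Call the REST API with the JWT in the Authorization header.
api_response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {access_token}"},
    data=b"<file bytes here>",
)
api_response.raise_for_status()
```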

How to dynamically add an HTTP endpoint to load data into Azure Data Lake using Azure Data Factory when the REST API is cookie-authenticated

I am trying to dynamically add/update a REST linked service based on certain triggers/events, in order to consume a REST API that is authenticated using a cookie and that provides telemetry data. This telemetry data will be stored in Data Lake Gen2 and then moved to secondary data storage/SQL Server using Databricks.
Has someone tried this? I am not able to find a cookie-based auth option while adding the REST linked service.
Also, how do I create data pipelines dynamically, or make the parameters of the REST API dynamic?
Currently, unfortunately, this is not possible using Azure Data Factory's native components/activities. For now at least, you cannot access the response cookies from a web request in Data Factory. Someone has raised a feature request for this, or something that might help; see here.
It might be possible to do this via an Azure Function that gets/saves the cookie and then sends it as part of a following request. I had a similar problem and resorted to using Azure Functions for all of it, but I guess you could do just the authentication part with a function! ;-)
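A minimal sketch of that cookie round trip in Python, assuming a hypothetical login endpoint and telemetry API (in practice this could live inside the Azure Function that ADF calls before the copy activity):

```python
# Sign in once, let the session capture the Set-Cookie value, and reuse
# it for the telemetry request. The URLs and credentials are placeholders.
import requests

LOGIN_URL = "https://telemetry.example.com/login"    # placeholder
DATA_URL = "https://telemetry.example.com/api/data"  # placeholder

session = requests.Session()

# The login response carries a Set-Cookie header; the Session object
# stores it and sends it with every subsequent request automatically.
login = session.post(LOGIN_URL, json={"username": "<user>", "password": "<password>"})
login.raise_for_status()
print("Cookie received:", session.cookies.get_dict())

# Reuse the cookie to pull the telemetry data that would then be landed
# in Data Lake Gen2.
data = session.get(DATA_URL)
data.raise_for_status()
print(data.json())
```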
EDIT: update
Actually, after I wrote this I went back to check whether this was still the case, and it looks like things have changed. The web activity response output now contains a property called "ADFWebActivityResponseHeaders" (I had never seen this before), and it includes a property for "Set-Cookie".

How to send a large data file from Blob Storage to SharePoint in an Azure Logic App?

I am getting an InvalidTemplate error when I try to send a file from Blob Storage to the SharePoint "Create file" action.
The file size is 109.29 MiB.
Error:
InvalidTemplate. Unable to process template language expressions in action 'Create_file_2' inputs at line '1' and column '3144': 'The template language function 'body' cannot be used when the referenced action outputs body has large aggregated partial content. Actions with large aggregated partial content can only be referenced by actions that support chunked transfer mode.'.
How do I send a huge .xlsx file from blob storage to SharePoint when creating the file?
Image of logic app: https://i.stack.imgur.com/lcQvH.jpg
Currently the SharePoint connector doesn't support transferring large files, even though the blob connector supports chunking for large file transfers. You can raise a feature request on this page.
As a workaround, you can use the Microsoft Graph API to upload the file from blob storage to SharePoint. Before calling this API, you need to get an access token for an app registration in Azure AD and then use it as the Authorization bearer token. Here is a post which uses another Graph API in a logic app for your reference.
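A rough sketch of that Graph workaround in Python, assuming an app registration with application permissions for SharePoint; the tenant, site, and file values are placeholders. The upload session lets the file go up in chunks, which is exactly what the connector error above is complaining about:

```python
# Get an Azure AD token, create a Graph upload session, and upload the
# file in chunks. Tenant, client, site, and file names are placeholders.
import os
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
SITE_ID = "<sharepoint-site-id>"
TARGET_PATH = "report.xlsx"   # destination in the site's default library
SOURCE_FILE = "report.xlsx"   # file already downloaded from blob storage

# 1. Get an access token from Azure AD (client credentials flow).
token = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://graph.microsoft.com/.default",
    },
).json()["access_token"]
headers = {"Authorization": f"Bearer {token}"}

# 2. Create an upload session so the file can be sent in chunks.
session = requests.post(
    f"https://graph.microsoft.com/v1.0/sites/{SITE_ID}/drive/root:/{TARGET_PATH}:/createUploadSession",
    headers=headers,
).json()
upload_url = session["uploadUrl"]

# 3. Upload the file in chunks (a multiple of 320 KiB, as Graph requires),
#    reading small byte arrays so memory stays low.
CHUNK = 320 * 1024 * 10
total = os.path.getsize(SOURCE_FILE)
with open(SOURCE_FILE, "rb") as f:
    start = 0
    while start < total:
        chunk = f.read(CHUNK)
        end = start + len(chunk) - 1
        resp = requests.put(
            upload_url,
            headers={"Content-Range": f"bytes {start}-{end}/{total}"},
            data=chunk,
        )
        resp.raise_for_status()
        start = end + 1
```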
You can also use the SharePoint REST API along with an Azure Function app for transferring large files. You should use small byte arrays so that you don't run out of memory. If you are using a trigger in the logic app, you will need to use a Do-Until step to make sure the file is completely uploaded, since the SharePoint trigger fires even before the file has been fully created.
