Azure integration account - unable to upload JSON schemas - azure

I tried uploading JSON-format data to an integration account, but it throws an error:
"Integration account The content of schema 'sd' of type 'Xml' must be a valid XML."
On an Azure Logic App I am able to upload JSON schemas, but not on the Azure integration account.
If Azure integration accounts don't support this, kindly help me with the official documentation link.
I tried uploading the JSON file; it did not work.

As mentioned in the documentation, schemas in an integration account are expected to be in XSD format and maps in XSLT format.
If you want to work with a JSON file, you need to use a Liquid template. Here is a reference link on generating Liquid templates.
Basically, an Integration Account uses maps to transform XML data between formats. A map is an XML document that defines how the data in one document should be transformed into another format.
Reference link
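For illustration, a minimal Liquid map might look like this (the field names are hypothetical; Logic Apps exposes the incoming JSON payload under content):

{
  "fullName": "{{content.firstName}} {{content.lastName}}",
  "email": "{{content.contact.email}}"
}

You upload this under Maps in the integration account with the map type set to Liquid, then invoke it from a Logic App with the Transform JSON to JSON action.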

Related

Find the type of Azure storage using the SAS URI of a blob

I was wondering how to tell the type of Azure storage based on the SAS URI of a blob, or more specifically, how to know whether it is a PageBlob or a BlockBlob.
There is a REST API that returns the type of a blob by just doing a HEAD request to the SAS URI; if the file exists, the response header x-ms-blob-type indicates the type of blob. However, if the file doesn't exist it returns 404. When we get a 404 we can upload a dummy file as a BlockBlob and, if that fails, we know it's a PageBlob. But I am wondering: is there a better, more straightforward way?
Example of SAS URI:
var sasUriStr = "https://storageaccountname.blob.core.windows.net/containername/file?sp=r&st=2021-08-10T00:34:00Z&se=2021-08-15T08:34:00Z&spr=https&sv=2020-08-04&sr=c&sig=ABCDEFGH/YJKLMNOP=";
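For reference, a minimal C# sketch of the HEAD-request probe described above (reusing the placeholder SAS URI from the question):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class BlobTypeProbe
{
    static async Task Main()
    {
        var sasUriStr = "https://storageaccountname.blob.core.windows.net/containername/file?sp=r&se=2021-08-15T08:34:00Z&sig=REDACTED";
        using var client = new HttpClient();

        // HEAD fetches only the headers, never the blob content.
        var request = new HttpRequestMessage(HttpMethod.Head, sasUriStr);
        var response = await client.SendAsync(request);

        if (response.IsSuccessStatusCode &&
            response.Headers.TryGetValues("x-ms-blob-type", out var blobType))
        {
            Console.WriteLine(string.Join("", blobType)); // BlockBlob, PageBlob or AppendBlob
        }
        else
        {
            // 404 means the blob doesn't exist, so the header never arrives.
            Console.WriteLine($"No blob type available (HTTP {(int)response.StatusCode})");
        }
    }
}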
There's a way to find that information; however, it requires you to bring your own logic, and it requires a different kind of SAS token.
What you have to do is create an Account SAS (currently you're using a Service SAS) and then invoke the Get Account Information REST API using that token. Next, extract the x-ms-sku-name and x-ms-account-kind response headers. Based on their values, you can work out which blob types are supported. For example:
If the value of x-ms-account-kind is BlobStorage, then the account only supports Block Blobs and Append Blobs.
If the value of x-ms-account-kind is not BlobStorage or BlockBlobStorage and the value of x-ms-sku-name is Premium_LRS, then the account only supports Page Blobs.
I wrote a blog post some time ago where I created a matrix of features supported by account kinds and skus. You can read that blog post here: https://www.ais.com/how-to-choose-the-right-kind-of-azure-storage-account/
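A rough C# sketch of that flow (the Account SAS query string is a placeholder; note it is appended to the account root URL, not to a blob URL):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class AccountInfoProbe
{
    static async Task Main()
    {
        // Placeholder Account SAS; a Service SAS like the one in the question won't work here.
        var accountSas = "sv=2020-08-04&ss=b&srt=s&sp=r&se=2021-08-15T08:34:00Z&sig=REDACTED";
        var url = "https://storageaccountname.blob.core.windows.net/?restype=account&comp=properties&" + accountSas;

        using var client = new HttpClient();
        var response = await client.GetAsync(url);
        response.EnsureSuccessStatusCode();

        // These two headers drive the supported-blob-type logic above.
        if (response.Headers.TryGetValues("x-ms-sku-name", out var sku) &&
            response.Headers.TryGetValues("x-ms-account-kind", out var kind))
        {
            Console.WriteLine($"SKU: {string.Join("", sku)}, Kind: {string.Join("", kind)}");
        }
    }
}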

Azure DevOps, how to get the blobIds

I am trying to download a zip of blobs for a repository via the Azure DevOps Server 2019 API, using the following documentation.
https://learn.microsoft.com/en-us/rest/api/azure/devops/git/blobs/get%20blobs%20zip?view=azure-devops-server-rest-5.0
The request body is supposed to contain:
REQUEST BODY
Name Type Description
body string[] Blob IDs (SHA1 hashes) to be returned in the zip file.
How can I obtain the blob ids?
How can I obtain the blob ids?
You can try the Items - List API to obtain the blob IDs:
GET https://{instance}/{collection}/{project}/_apis/git/repositories/{repositoryId}/items?recursionLevel=Full&api-version=5.0
The response:
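A truncated sketch of that response (hypothetical values); the objectId of every entry whose gitObjectType is blob is what you need:

{
  "count": 2,
  "value": [
    {
      "objectId": "61a86fdaa79e5c6f5fb6e4026508489feb6ed92c",
      "gitObjectType": "blob",
      "commitId": "23d0bc5b128a10056dc68afece360d8a0fabb014",
      "path": "/README.md",
      "url": "https://{instance}/{collection}/{project}/_apis/git/repositories/{repositoryId}/items?path=%2FREADME.md"
    },
    {
      "objectId": "2ea2ff095cf81b61b2878c26a05eca96fda34bdb",
      "gitObjectType": "tree",
      "commitId": "23d0bc5b128a10056dc68afece360d8a0fabb014",
      "path": "/src",
      "isFolder": true,
      "url": "https://{instance}/{collection}/{project}/_apis/git/repositories/{repositoryId}/items?path=%2Fsrc"
    }
  ]
}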
Also, if you're trying to get the IDs programmatically, you can use the GitHttpClientBase.GetItemsAsync method.
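A minimal C# sketch of that (assuming the Microsoft.TeamFoundationServer.Client NuGet package; URL, project, repository, and PAT are placeholders):

using System;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.TeamFoundation.SourceControl.WebApi;
using Microsoft.VisualStudio.Services.Common;
using Microsoft.VisualStudio.Services.WebApi;

class ListBlobIds
{
    static async Task Main()
    {
        var connection = new VssConnection(
            new Uri("https://{instance}/{collection}"),
            new VssBasicCredential(string.Empty, "<personal-access-token>"));
        var git = await connection.GetClientAsync<GitHttpClient>();

        // Same query as the Items-List call above: recurse the whole repository.
        var items = await git.GetItemsAsync(
            project: "MyProject",
            repositoryId: "MyRepository",
            recursionLevel: VersionControlRecursionType.Full);

        // Folders come back as trees; only blob entries carry the IDs the zip endpoint wants.
        var blobIds = items
            .Where(i => i.GitObjectType == GitObjectType.Blob)
            .Select(i => i.ObjectId)
            .ToList();

        // These IDs form the string[] body of the Get Blobs Zip request.
        blobIds.ForEach(Console.WriteLine);
    }
}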
PS: As Daniel commented above, it's recommended to use the git command to download the whole repository instead. So you can also call git-related APIs in your code if you want to do it programmatically. There are many discussions online about this topic, like this. (Since your original question was about blob IDs, I won't go into detail here.)

How to send a large data file from Blob storage to SharePoint in an Azure Logic App?

I am getting an InvalidTemplate error when sending a file from Blob storage to the SharePoint Create file action, where the file size is 109.29 MiB.
Error:
InvalidTemplate. Unable to process template language expressions in action 'Create_file_2' inputs at line '1' and column '3144': 'The template language function 'body' cannot be used when the referenced action outputs body has large aggregated partial content. Actions with large aggregated partial content can only be referenced by actions that support chunked transfer mode.'
How do I send a huge .xlsx file from Blob storage to SharePoint when creating the file?
Image of logic app: https://i.stack.imgur.com/lcQvH.jpg
Currently the SharePoint connector doesn't support transferring large files, even though the Blob connector supports chunking for large file transfers. You can raise a feature request on this page.
As a workaround, you can use the Microsoft Graph API to upload the file from Blob storage to SharePoint. Before calling that API, you need to get an access token for an app registered in Azure AD, and then use that access token as the Authorization bearer token. Here is a post which uses another Graph API in a Logic App for your reference.
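A rough C# sketch of that workaround (hypothetical site ID and file name; assumes you already have the Azure AD access token and the blob content as bytes). Because the file is far beyond Graph's 4 MB simple-upload limit, it goes through an upload session in chunks:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class GraphChunkedUpload
{
    static async Task Main()
    {
        var accessToken = "<token from Azure AD>";                          // placeholder
        var fileBytes = await System.IO.File.ReadAllBytesAsync("big.xlsx"); // stand-in for the blob content

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        // 1. Create an upload session on the SharePoint site's default drive.
        var session = await client.PostAsync(
            "https://graph.microsoft.com/v1.0/sites/{site-id}/drive/root:/big.xlsx:/createUploadSession",
            new StringContent("{}", Encoding.UTF8, "application/json"));
        session.EnsureSuccessStatusCode();
        using var doc = JsonDocument.Parse(await session.Content.ReadAsStringAsync());
        var uploadUrl = doc.RootElement.GetProperty("uploadUrl").GetString();

        // 2. PUT the file in chunks; Graph expects chunk sizes in multiples of 320 KiB.
        //    The uploadUrl is pre-authenticated, so no bearer header on this client.
        using var uploader = new HttpClient();
        const int chunkSize = 320 * 1024 * 10;
        for (var offset = 0; offset < fileBytes.Length; offset += chunkSize)
        {
            var length = Math.Min(chunkSize, fileBytes.Length - offset);
            var chunk = new ByteArrayContent(fileBytes, offset, length);
            chunk.Headers.ContentRange = new ContentRangeHeaderValue(offset, offset + length - 1, fileBytes.Length);
            (await uploader.PutAsync(uploadUrl, chunk)).EnsureSuccessStatusCode();
        }
    }
}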
You can also use the SharePoint REST API together with an Azure Function app to transfer large files. Use small byte arrays so that you don't run out of memory. If you are using a trigger in the Logic App, you will need a Do-Until step to make sure the file is completely uploaded, since the SharePoint trigger fires even before the file is fully created.
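For reference, the chunked-upload sequence in the SharePoint REST API looks roughly like this (site URL, library path, file name, and upload GUID are placeholders; each Continue/Finish call carries the next byte-array chunk in its body):

POST https://{site}/_api/web/GetFolderByServerRelativeUrl('/Shared Documents')/Files/Add(url='big.xlsx',overwrite=true)
POST https://{site}/_api/web/GetFileByServerRelativeUrl('/Shared Documents/big.xlsx')/StartUpload(uploadId=guid'{upload-guid}')
POST https://{site}/_api/web/GetFileByServerRelativeUrl('/Shared Documents/big.xlsx')/ContinueUpload(uploadId=guid'{upload-guid}',fileOffset={offset})
POST https://{site}/_api/web/GetFileByServerRelativeUrl('/Shared Documents/big.xlsx')/FinishUpload(uploadId=guid'{upload-guid}',fileOffset={offset})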

DocuSign API - Update Template PDF Document

I have an existing DocuSign Template set up and working well. I would like to be able to update the PDF file used for the Template via the API, using a locally stored PDF file that I have. Is there a way to update the PDF file used by the DocuSign Template via the API?
I can see that you can update a Template here:
https://developers.docusign.com/esign-rest-api/reference/Templates/Templates/update
but I can't work out what I need to do to replace the PDF file with a new one.
Is there a way to update the PDF file used by the DocuSign Template via the API?
Yes. You use the composite templates API feature to substitute a document for the existing one in the "server template" (a template stored on the server).
See DocuSign Rest API to replace single template document for a code example.
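The shape of that request body, posted to the Envelopes: create endpoint, is roughly as follows (template ID and document content are placeholders): the server template contributes the recipients and tabs, while the inline template's document replaces the template's own document.

{
  "status": "sent",
  "compositeTemplates": [
    {
      "serverTemplates": [
        { "sequence": "1", "templateId": "{template-id}" }
      ],
      "inlineTemplates": [
        {
          "sequence": "2",
          "documents": [
            {
              "documentId": "1",
              "name": "replacement.pdf",
              "fileExtension": "pdf",
              "documentBase64": "{base64-encoded local PDF}"
            }
          ]
        }
      ]
    }
  ]
}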
(From a comment)
How do I update the template's definition to use a different document?
See the TemplateDocuments::update API method.
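In the current v2.1 API that is a single call, with the new PDF supplied as documentBase64 in the request body:

PUT https://{server}/restapi/v2.1/accounts/{accountId}/templates/{templateId}/documents/{documentId}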

Get XML Data from API URL in Azure

I need to get XML data from an API, store the data in Azure Data Lake Store, and finally have a table created for it in Azure SQL DB/DWH. I know ADF can't handle XML data, so how do I pull the XML data into Azure? I have been checking some links on using Logic Apps. Any suggestions or ways to handle it?
As I'm not so clear about the details of your requirement, and you asked "how to pull XML data into Azure", I'll post some suggestions for your reference.
If you want to get the XML data from your API, you can use the "HTTP" action in your Logic App.
Then you can use the output of the API in the next steps of your Logic App.
If you want to parse the XML data, you need to transform it to JSON first: input json(xml(body('HTTP'))) as the content of a Parse JSON action and provide a schema.
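In the Logic App's code view, that pair of actions looks roughly like this (the URI, the Parse_JSON action name, and the empty schema are placeholders):

"HTTP": {
  "type": "Http",
  "inputs": {
    "method": "GET",
    "uri": "https://example.com/api/data.xml"
  },
  "runAfter": {}
},
"Parse_JSON": {
  "type": "ParseJson",
  "runAfter": { "HTTP": [ "Succeeded" ] },
  "inputs": {
    "content": "@json(xml(body('HTTP')))",
    "schema": {}
  }
}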
