Azure Synapse and Jira REST API Error 21155

I am trying to call the Jira REST API via the REST connector in Synapse.
I always get error 21155:
Error occurred when deserializing source JSON file.
Check if data is in valid JSON.
Unexpected character encountered while parsing value: <. Path '', line 0, position 0.
Does anybody know how to solve this problem?
I checked in Postman, where it worked and the result is valid JSON.

I tried to reproduce your scenario in my environment: I also get output in Postman but face the same error in ADF, as below.
Getting output in Postman
Facing a similar error in ADF
I believe the error message implies that the source data's non-standard JSON format prevents ADF from deserializing it; JSON that isn't valid cannot be copied with ADF. When I tried another API with correctly formatted JSON output, I was able to preview the data, as below:
The connector documentation states that ADF supports a Jira connector. Perhaps you could give that a go.
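If you want to rule out a transport-level cause before switching connectors, it can help to check the raw response outside ADF: a < at position 0 usually means the endpoint returned HTML or XML (for example a login or error page) rather than JSON. Here is a minimal sketch, assuming a hypothetical Jira Cloud site, user, and API token, that confirms the endpoint returns parseable JSON when an explicit Accept header and basic auth are supplied:

import requests

# Hypothetical site, user, and API token -- replace with your own values.
url = "https://your-domain.atlassian.net/rest/api/2/search"
auth = ("user@example.com", "your-api-token")

resp = requests.get(url, auth=auth, headers={"Accept": "application/json"})
print(resp.status_code, resp.headers.get("Content-Type"))

# If this raises a ValueError, the body is not valid JSON -- the same thing
# the Synapse REST connector is complaining about with error 21155.
data = resp.json()
print(str(data)[:200])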

Related

Azure Data Factory - Parameter interpolation

I'm trying to add a pipeline parameter into the body of a post request to an Azure Function app in Azure Data Factory. It appears that the string isn't getting replaced, but the Microsoft documentation suggests I'm doing it the right way. See here: https://learn.microsoft.com/en-gb/azure/data-factory/control-flow-web-activity#request-payload-schema
This is a screenshot of an error I'm getting alongside it:
I'm confused about how to properly interpolate pipeline parameters into this request.
Thanks!
As mentioned in this document, wrap your parameter inside braces: @{pipeline().parameters.ParameterName}.
Below is the example with my sample API post body:
{
    "tourist_name": "@{pipeline().parameters.name}",
    "tourist_email": "@{pipeline().parameters.email}",
    "tourist_location": "@{pipeline().parameters.location}"
}
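To make the runtime behaviour concrete: by the time the request reaches the function, ADF has already replaced each @{...} placeholder with the parameter's value, so the function just parses ordinary JSON. Below is a minimal sketch of a hypothetical receiving function (a Python HTTP trigger; the field names follow the sample body above, everything else is an assumption):

import json
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # ADF has already substituted @{pipeline().parameters.*} before sending,
    # so the body arrives as plain JSON.
    try:
        body = req.get_json()
    except ValueError:
        return func.HttpResponse("Request body is not valid JSON", status_code=400)

    tourist = {
        "name": body.get("tourist_name"),
        "email": body.get("tourist_email"),
        "location": body.get("tourist_location"),
    }
    return func.HttpResponse(json.dumps({"received": tourist}),
                             mimetype="application/json")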

XSLT throws an exception in Logic Apps

When I use XSLT to transform an XML source file, I get the exception below in the Logic App. The URL below is referenced in the XSLT file.
How can I resolve this issue in Logic Apps?
The XSD schema is valid and working fine. The issue occurs at the Transform XML step.
"Code": "InvalidXsltContent", "Message": "An error occurred while
processing map. 'Cannot find a script or an extension object
associated with namespace
'http://schemas.microsoft.com/BizTalk/2003/ScriptNS0'.'",
According to the error message, it seems the Logic App can't find the external assembly in the integration account. Could you please check whether you have uploaded the assembly to your integration account? You can refer to this tutorial to learn how to upload the assembly.
By the way, it seems you want to use a custom extension in your XSLT map. Here are two links for your reference:
https://learn.microsoft.com/en-us/biztalk/core/technical-reference/custom-extension-xml-grid-property?redirectedfrom=MSDN
https://blog.vertica.dk/2013/03/20/using-custom-xslt-in-biztalk/
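To illustrate in isolation what "a script or an extension object associated with a namespace" means, here is a small sketch outside Logic Apps/BizTalk using Python's lxml, chosen purely for illustration; in Logic Apps the equivalent binding comes from the assembly uploaded to the integration account, not from code like this, and the namespace URI below is made up. The stylesheet declares a namespace, calls a function from it, and the host registers a callable for that (namespace, name) pair:

from lxml import etree

# A stylesheet that calls ext:shout() from a custom namespace, much like the
# failing map calls into ScriptNS0.
XSLT = b"""<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:ext="urn:example:script-ns">
  <xsl:template match="/">
    <result><xsl:value-of select="ext:shout(string(/greeting))"/></result>
  </xsl:template>
</xsl:stylesheet>"""

def shout(context, text):
    # The "extension object": a callable bound to a (namespace, name) pair.
    return text.upper()

transform = etree.XSLT(
    etree.XML(XSLT),
    extensions={("urn:example:script-ns", "shout"): shout},
)
print(transform(etree.XML(b"<greeting>hello</greeting>")))

# Without the extensions mapping, the transform cannot resolve the function
# in that namespace -- the same class of error Logic Apps reports when the
# assembly behind ScriptNS0 is missing from the integration account.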

Error "BadRequest" when calling Azure Function in ADF

I am creating an extensive Data Factory workflow that will create and fill a data warehouse for multiple customers automatically; however, I'm running into an error. I'll post the questions first, since the remaining info is a bit long. Keep in mind I'm new to Data Factory and JSON.
Questions & comments
How do I correctly pass the parameter through to an Execute Pipeline activity?
How do I add said parameter to an Azure Function activity?
The issue may lie in correctly passing the parameter through, or it may lie in picking it up; I can't seem to determine which one. If you spot an error in the current setup, don't hesitate to let me know. All help is appreciated.
The Error
{
"errorCode": "BadRequest",
"message": "Operation on target FetchEntries failed: Call to provided Azure function
'' failed with status-'BadRequest' and message -
'{\"Message\":\"Please pass 'customerId' on the query string or in the request body\"}'.",
"failureType": "UserError",
"target": "ExecuteFullLoad"
}
The Setup:
The whole setup starts with a function call to get new customers from an online economic platform. It then writes them to a SQL table, from which they are processed and loaded into the final table, after which a new pipeline is executed. This process works perfectly. From there the following pipeline is executed:
As you can see, it all works well until the ForEach loop tries to execute another pipeline, which contains an Azure Function activity that calls a .NET scripted function that fills said warehouse (complex, I know). This Azure Function needs a customerId to retrieve tokens and load the data into the warehouse. I'm trying to pass those customer IDs from the InternalCustomerID lookup through the ForEach into the pipeline and into the function. The ForEach actually works, but fails with "Because an inner activity failed".
The Execute Pipeline activity contains the following settings, where I'm trying to pass through the parameter that comes from the ForEach loop. This part of the process also works, since it executes twice (as it should in this test phase):
I don't know whether it fails to pass the parameter through or fails when adding it to the body of the Azure Function.
The child pipeline (FullLoad) contains the following parameters. I'm not sure whether I should set a default value to be overwritten, or how that actually works. The guides I've looked at on the internet haven't had a default value.
Finally, there are the settings for the Azure Function activity. I'm not sure what I need to write to correctly capture the parameter, or whether the error message refers to the header or the body. I know a POST cannot be executed without a body.
If I run this specific function by hand (using the Function App section of portal.azure.com), it works fine with the following settings:
I reviewed your detailed question, and I think the key to the issue is the format of the Azure Function request body.
I'm afraid it is currently incorrect. Please see my steps below, based on your description:
Work flow:
Inside the ForEach activity, there is only one Azure Function activity:
The preview data of the Lookup activity:
Then the configuration of the ForEach activity: @activity('Lookup1').output.value
The configuration of the Azure Function activity: @json(concat('{"name":"',item().name,'"}'))
From the Azure Function, I simply output the input data. Sample output is below:
Tip: I saw that your setup executes the Azure Function in another pipeline through an Execute Pipeline activity (I don't know why you need those steps), but I think it doesn't matter, because you only need to focus on the body format: if the acceptable format is JSON, you can use @json(...); if the acceptable format is a string, you can use @concat(...). Besides, you can check the sample in the ADF UI portal, which uses pipeline().parameters.
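For reference, the error text ("Please pass 'customerId' on the query string or in the request body") matches the standard HTTP-trigger template check, so the body sent from ADF just needs to contain that key, e.g. @json(concat('{"customerId":"', item().InternalCustomerID, '"}')) if your lookup column is named InternalCustomerID (an assumption). The original function is .NET, so the following Python sketch only shows the shape of the check it is likely performing:

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Check the query string first, then fall back to the JSON body --
    # the same check the error message describes.
    customer_id = req.params.get("customerId")
    if not customer_id:
        try:
            body = req.get_json()
        except ValueError:
            body = {}
        customer_id = body.get("customerId")

    if not customer_id:
        return func.HttpResponse(
            "Please pass 'customerId' on the query string or in the request body",
            status_code=400,
        )

    # ... retrieve the customer's tokens and run the full load here ...
    return func.HttpResponse(f"Started full load for customer {customer_id}")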

What is the Azure API version

I'm trying to access the result of a GET request provided by Azure, as shown in the example: https://msdn.microsoft.com/sv-se/library/azure/dn820159.aspx
My problem is that api-version is a mandatory argument, but I have no idea what to put in it. I'm a bit lost with the Azure Batch documentation; it doesn't seem to be complete.
I found something on an Azure webpage: https://azure.microsoft.com/en-us/documentation/articles/search-api-versions/ where the api-version was api-version=2015-02-28. However, if I try it in my browser, I get this answer: "key":"Reason","value":"The specified api version string is invalid".
Any idea what I can put in the api-version parameter?
Have a look here.
At the time of this writing:
The version of the Batch API described here is '2016-07-01.3.1', and
using that version is recommended where possible.
Earlier versions include '2016-02-01.3.0', '2015-12-01.2.1',
'2015-11-01.2.1', '2015-06-01.2.0', '2015-03-01.1.1', and
'2014-10-01.1.0'.
So try specifying '2016-07-01.3.1'.
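To make the shape of the call concrete, api-version is just a query-string parameter on every Batch REST request. Here is a minimal sketch, assuming a hypothetical Batch account endpoint and leaving authentication as a placeholder (real calls need shared-key signing or an Azure AD token):

import requests

# Hypothetical Batch account endpoint -- replace with your own.
BATCH_ENDPOINT = "https://myaccount.westeurope.batch.azure.com"
API_VERSION = "2016-07-01.3.1"  # the version recommended in the quoted docs

resp = requests.get(
    f"{BATCH_ENDPOINT}/jobs",
    params={"api-version": API_VERSION},              # mandatory on every request
    headers={"Authorization": "Bearer <aad-token>"},  # placeholder auth only
)
print(resp.status_code)
print(resp.json())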

generateSharedAccessSignature not adding sv parameter?

I'm trying to generate a Shared Access Signature and am using the code here (http://blogs.msdn.com/b/brunoterkaly/archive/2014/06/13/how-to-provision-a-shared-access-signatures-that-allows-clients-to-upload-files-to-to-azure-storage-using-node-js-inside-of-azure-mobile-services.aspx) for a custom API to generate the SAS.
It seems to be missing the sv=2014-02-14 parameter when calling generateSharedAccessSignature().
The SAS URL doesn't seem to work when I try it (I get a 400 "XML not valid" error), but if I try a SAS generated from Azure Management Studio, the URL contains the sv parameter and works when I attempt to upload with it.
Any ideas?
Based on the Storage Service REST API documentation, the sv parameter in a Shared Access Signature was introduced in storage service version 2014-02-14. My guess is that Azure Mobile Services is using an older version of the storage service API, and this is the reason you don't see the sv parameter in your SAS token.
You could be getting the 400 error (invalid XML) because of this. In earlier versions of the storage service API, the XML syntax for committing a block list was different from what is used currently. I have had another user come to my blog post complaining about the same error. Please try the following XML syntax when performing a commit block list operation and see if the error goes away:
<?xml version="1.0" encoding="utf-8"?>
<BlockList>
<Block>[base64-encoded-block-id]</Block>
<Block>[base64-encoded-block-id]</Block>
...
<Block>[base64-encoded-block-id]</Block>
</BlockList>
Please note that we're not using the Latest node; instead, we're using the Block node.
Leaving the sv parameter out of the URL and instead setting the version as a PUT request header worked, using:
xhr.setRequestHeader('x-ms-version','2014-02-14');
You can check out this example of an Azure file-upload script: http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/
...which will work with the SAS generated from the question's original blog link: http://blogs.msdn.com/b/brunoterkaly/archive/2014/06/13/how-to-provision-a-shared-access-signatures-that-allows-clients-to-upload-files-to-to-azure-storage-using-node-js-inside-of-azure-mobile-services.aspx
Add the request header in the beforeSend like so:
beforeSend: function(xhr) {
xhr.setRequestHeader('x-ms-version','2014-02-14');
},
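The same idea works outside the browser too. Below is a rough sketch, assuming a hypothetical block-blob SAS URL produced by generateSharedAccessSignature(), of a single-shot Put Blob upload where the service version is supplied via the x-ms-version request header because the SAS itself carries no sv parameter:

import requests

# Hypothetical SAS URL produced by generateSharedAccessSignature() --
# note that it contains no sv= parameter.
sas_url = ("https://myaccount.blob.core.windows.net/uploads/photo.jpg"
           "?se=2014-07-01T00%3A00%3A00Z&sr=b&sp=w&sig=...")

with open("photo.jpg", "rb") as f:
    resp = requests.put(
        sas_url,
        data=f,
        headers={
            "x-ms-version": "2014-02-14",   # version supplied as a header, not sv
            "x-ms-blob-type": "BlockBlob",  # required by the Put Blob operation
        },
    )
print(resp.status_code, resp.text)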
