Azure APIM json-to-xml gives XML with BOM

I am having an issue with an Azure integration scenario: we receive a JSON message on Azure APIM from Azure Data Factory, and when APIM sends it on to an external REST receiver in XML format, a BOM is added. The question is how to remove this. Is there an APIM policy available for it?
Azure Data Factory
We pick up an XML file and transmit it to Azure APIM using a Copy activity.
Result (captured with RequestBin):
APIM policy used to transform to XML:
<json-to-xml apply="always" consider-accept-header="false" parse-date="false" namespace-separator=":" />
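There is no dedicated "remove BOM" policy; one common workaround, sketched here and untested, is to re-read the transformed body in a set-body policy expression and strip a leading U+FEFF character:

```xml
<inbound>
    <base />
    <json-to-xml apply="always" consider-accept-header="false" parse-date="false" namespace-separator=":" />
    <!-- Sketch: re-serialize the transformed body, dropping a leading BOM (U+FEFF) if present -->
    <set-body>@(context.Request.Body.As<string>().TrimStart('\uFEFF'))</set-body>
</inbound>
```

If the BOM is instead added on the response path, the same expression can be used in the outbound section with context.Response.Body.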

Related

Is there any way to fetch a Json file in Azure Blob Storage at APIM and forward to Backend as a json payload?

I am looking for a way to fetch a JSON file from Azure Blob Storage at the APIM layer, make it part of the payload, and then forward it to the backend engine (a Function App). I would appreciate it if you could share a sample C# snippet. Thanks for the help.
You can use the send-request policy to fetch the blob:
https://learn.microsoft.com/en-us/azure/api-management/api-management-sample-send-request#send-request
and then use a policy expression to append it to your request.
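A hedged sketch of those two steps (the URL, the variable name, and the blobData property are illustrative only, and the blob is assumed to allow anonymous reads; a private blob would additionally need a SAS token or an Authorization header):

```xml
<inbound>
    <base />
    <!-- Fetch the blob into a context variable -->
    <send-request mode="new" response-variable-name="blobResponse" timeout="20" ignore-error="false">
        <set-url>https://mystorage.blob.core.windows.net/config/settings.json</set-url>
        <set-method>GET</set-method>
    </send-request>
    <!-- Merge the fetched JSON into the incoming payload -->
    <set-body>@{
        var payload = context.Request.Body.As<JObject>();
        payload["blobData"] = ((IResponse)context.Variables["blobResponse"]).Body.As<JObject>();
        return payload.ToString();
    }</set-body>
</inbound>
```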
How to edit a JSON payload with set-body in Azure API Management

Can I set Azure ML output to real JSON using Azure API Management?

I have started using Azure ML to deploy an ML service, but it returns results as raw text. I see that Azure API Management can be used to set the outbound body. Can I use it to convert the raw text to JSON, and how?
This is an example result from Azure ML WebService.
"{\"transcript\": \"\\u0e27\\u0e31\\u0e19\"}"
Another question: can I decode UTF-8 in a set-body policy?
For now, I have fixed this by making score.py return a Python dict instead of the result of json.dumps().
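The escaped string above is JSON that has been serialized twice, so decoding it twice recovers the real object. A minimal Python illustration of the decoding (the raw body is copied from the example response above):

```python
import json

# Raw response body from the Azure ML web service, exactly as shown in
# the question: a JSON string whose content is itself a JSON document.
raw_body = r'"{\"transcript\": \"\\u0e27\\u0e31\\u0e19\"}"'

inner = json.loads(raw_body)   # first decode -> the inner JSON text
result = json.loads(inner)     # second decode -> the actual object

print(result["transcript"])   # prints the decoded Thai text: วัน
```

The same double-decode is what a set-body policy expression would have to perform on the response body before the fix in score.py made it unnecessary.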

Is it possible to force Azure IoT hub to save blobs as content type application/json

I have an IoT device which sends telemetry messages to an Azure IoT hub. The message does not have any attribute for content type. I am saving the received IoT telemetry messages to a blob, and found out that the hub is saving them with content type application/octet-stream, but I want them saved with content type application/json. Since I am unable to modify the device or the message structure, is there a way to set up the IoT hub, the route rule, or the blob storage itself to force the content type to application/json?
Screenshot of my hub's custom storage endpoint, with encoding set to JSON (screenshot omitted).
I reproduced this and found that your IoT Hub messages are most likely being written to blob storage in the default Avro format. Please use the Encoding section to specify the message format, such as JSON.
The picture below shows how to navigate to IoT Hub message routing, add a custom endpoint for storage, and pick a container for storing the IoT Hub messages.
Note: The data can be written to blob storage in either the Apache Avro format, which is the default, or JSON. The encoding format can only be set at the time the blob storage endpoint is configured; the format cannot be changed for an endpoint that has already been set up. When using JSON encoding, you must set the contentType to JSON and the contentEncoding to UTF-8 in the message system properties.
Reference: Tutorial - Configure message routing for Azure IoT Hub using Azure CLI | Microsoft Docs
Please go through the section on routing to a storage account in the above document.
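Since the encoding can only be chosen at creation time, the endpoint has to be (re)created with JSON encoding. A hedged Azure CLI sketch, following the tutorial referenced above (all resource names are placeholders, and the flag names should be verified against `az iot hub routing-endpoint create --help`):

```shell
az iot hub routing-endpoint create \
    --resource-group MyResourceGroup \
    --hub-name MyIotHub \
    --endpoint-name storage-json \
    --endpoint-type azurestoragecontainer \
    --endpoint-resource-group MyResourceGroup \
    --endpoint-subscription-id "<subscription-id>" \
    --connection-string "<storage-connection-string>" \
    --container-name telemetry \
    --encoding json
```

With `--encoding json`, IoT Hub sets contentType to JSON and contentEncoding to UTF-8 on the routed messages, which is what produces application/json blobs.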

How can we call rest api using ssl certificate in azure data factory?

I have been trying to configure and call a RESTful API through Azure Data Factory, where the API returns its response in XML format.
Using the REST linked service: it doesn't offer the certificate authentication type, so I cannot go with this.
Using the HTTP linked service: it does offer certificate authentication, and I was able to create it successfully, but when I try to create a dataset, XML is not available as a format to choose.
I have also read about the supported file formats in Azure Data Factory, which say the same.
Is there any other possibility I am missing in Azure Data Factory?
Could anyone help with this, please?
Otherwise I will go with Azure Logic Apps or Azure Databricks.
I would still like to know how to configure this in those two Azure resources, but I will try that later.
XML format is supported for the following connectors: Amazon S3, Azure Blob, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure File Storage, File System, FTP, Google Cloud Storage, HDFS, HTTP, and SFTP. It is supported as source but not sink.
Ref: XML format in Azure Data Factory
When we create a dataset, we can choose the XML format.
For example, click New --> Blob Storage --> XML.
Please check whether your source supports the XML format.
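As a sketch of what such a dataset looks like in JSON (the dataset name, linked service reference, and relative URL are placeholders; property names follow the XML format documentation cited above):

```json
{
    "name": "HttpXmlDataset",
    "properties": {
        "type": "Xml",
        "linkedServiceName": {
            "referenceName": "HttpLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "HttpServerLocation",
                "relativeUrl": "api/orders"
            },
            "encodingName": "UTF-8"
        }
    }
}
```

Because XML is supported as a source but not a sink, a Copy activity using this dataset would have to write the data out in another format, such as JSON or Parquet.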

Saving a JSON data from an Azure Function

I have integrated Azure Service Bus with an Azure Function that receives a message and then updates a SQL DB.
I now want to save a JSON document, created from a query against the same database, to an Azure Blob.
My questions are:
I can save the JSON by calling the Azure Blob REST API. Is calling one service from another service a cloud-native pattern?
Is sending the JSON to Azure Service Bus and having another Azure Function save the data to the blob an optimal approach?
Is there a resource other than Azure Blob for saving the JSON data from an Azure Function that would make the integration easier?
There are many ways of saving a file to Azure Blob Storage. If you want to save over HTTP, use the Azure Blob REST API. You can also use the Microsoft Azure Storage SDK, which you can integrate into your application; there are storage clients for many languages (.NET, Python, JavaScript, Go, etc.). If you are using an Azure Function, you can use an output binding.
It depends... Blob Storage is not the only location where you can save JSON; you can also save JSON straight into a SQL database, for instance.
The easiest way to save from an Azure Function is to use the Azure Blob Storage output binding for Azure Functions.
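For the output-binding approach, a minimal function.json sketch for a Service Bus-triggered function that writes a blob (the queue name, container name, and connection setting names are placeholders):

```json
{
  "bindings": [
    {
      "name": "msg",
      "type": "serviceBusTrigger",
      "direction": "in",
      "queueName": "myqueue",
      "connection": "ServiceBusConnection"
    },
    {
      "name": "outputBlob",
      "type": "blob",
      "direction": "out",
      "path": "json-output/{rand-guid}.json",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

The function code then only has to assign the serialized JSON string to the outputBlob binding; the Functions runtime handles the upload, so no explicit REST call or SDK client is needed.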
