How to dynamically change a GET request in an Azure Logic App

I've only started playing with Azure Logic Apps and have already bumped into a problem.
The app that I'm trying to develop has fairly simple logic: call an API, get the data, and save it to Blob storage.
It works perfectly fine for a single request (https://example.com/rest/businessObject/objectName/details?fields=abc,dde). However, I would like to make multiple GET requests to various objects, fetching different fields, i.e. change the objectName in the URI and the fields values in the query string. Is it possible to parameterize the call using something like a JSON object, iterate over it, and make the different requests in a single app instead of creating a separate app for each request? And if it's possible, how?
Update
I've used JSON parsing (big thanks to the author of the response in this thread). Now my app looks like this:
Seems to work fine:
The problem I have now, however, is different: I can't save the body of the response as a blob, because I can't access it in the Create blob action. The only variable available for the blob content is 'Current item', which, of course, is the chunk of JSON used in the for-each loop. Any ideas whether it's possible to save the response to storage, and how to get it done?

For the JSON parsing itself, use this example.
You can also nest constructs: use variables, call one Logic App from another, and combine that with loops or if-else conditions.
If you have a fixed (not unbounded) number of requests, create those Logic Apps and use variables together with nested Logic App calls. If you have an indefinite number of API calls, a better idea would be Azure Durable Functions, or perhaps an Azure Logic App as an orchestrator that makes nested calls to Azure Functions.
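The parameterization the question asks about can be sketched in plain Python for illustration. In the Logic App itself this would be a Parse JSON action feeding a For each loop, with the URL composed from expressions like items('For_each')?['objectName']; the object names below are made up:

```python
# Sketch of the parameterized-call idea in plain Python (illustration only).
# The object names and fields in params_json are assumptions, not real data.
import json

# JSON parameter object, as you might store it in a Logic App variable
params_json = """
[
  {"objectName": "customer", "fields": "abc,dde"},
  {"objectName": "invoice", "fields": "id,total"}
]
"""

def build_url(item: dict) -> str:
    """Compose the request URI for one parameter item."""
    return (
        "https://example.com/rest/businessObject/"
        f"{item['objectName']}/details?fields={item['fields']}"
    )

urls = [build_url(item) for item in json.loads(params_json)]
for url in urls:
    print(url)
```

Each iteration of the loop then issues one HTTP GET with the composed URL, so a single app covers all the objects.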

Related

CosmosDB: return data from an external API on read

I am attempting to write an Azure Cosmos DB integration (Core SQL API) that calls an external service to provide some of the query data. As an example, a query made on Cosmos DB should convert some of the data it returns (e.g. IDs) into real data by calling an external service via a REST API. This should only happen when querying certain columns.
I initially investigated using a JS stored procedure and/or a UDF to make this external call, but the JS environment seems to be extremely limited and doesn't provide any way to make external calls. I then tried this repository (https://github.com/Oblarg/cosmosdb-storedprocs-ts), which uses webpack to bundle all of Node.js into the stored procedure, allowing node modules to be used in stored procedures. Whilst this does allow some node modules to be used, whenever I try to use the "https", "fetch", or "axios" modules to make an HTTP GET request I get errors (the same code works fine in a normal Node environment, but I'm not a JS expert and can't seem to work past them). After a day of attempts it seems the stored procedure approach is not possible.
Is this the case, or is there some way of making HTTP GET requests from a JS stored procedure? If it's not possible with stored procedures, are there any other techniques to achieve the requirement of reading data from a remote API when querying Cosmos DB?
Thanks
There is no way to achieve this from Cosmos DB directly. For queries you also cannot use the change feed, as the documents don't change, so really your only option is to use a function or some preprocessor app to handle it. As you say, it's not ideal, but there is no other solution here. If it were an insert or an update, the change feed would let you do this, but for plain queries it's not possible.
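The preprocessor approach the answer suggests can be sketched as below. Everything here is illustrative: lookup_details is a hypothetical stand-in for the real REST call to the external service, and the document shape is invented.

```python
# Sketch of a preprocessor that enriches Cosmos DB query results by resolving
# IDs via an external service. lookup_details is a hypothetical stand-in for
# the real external API call (a real version would issue an HTTP GET).

def lookup_details(external_id: str) -> dict:
    """Hypothetical external API; returns the data behind an ID."""
    fake_remote_data = {"42": {"name": "widget"}, "43": {"name": "gadget"}}
    return fake_remote_data.get(external_id, {})

def enrich(documents: list, id_column: str) -> list:
    """Replace bare IDs in one column with real data from the external API."""
    for doc in documents:
        if id_column in doc:
            doc[id_column] = lookup_details(doc[id_column])
    return documents

# Documents as they might come back from a Cosmos DB query
rows = [{"id": "a", "partRef": "42"}, {"id": "b", "partRef": "43"}]
enriched = enrich(rows, "partRef")
```

In practice this code would live in an Azure Function or a small API layer that sits between the client and Cosmos DB, enriching only the columns that need it.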

Enqueueing a message to Azure Storage in an Azure function without changing the output

I have a custom handler written in Go running as an Azure Function. It has an endpoint with two methods:
POST /entities
PUT /entities
It was easy to make my application run as an Azure function: I added "enableForwardingHttpRequest": true to host.json, and it just works.
What I need to achieve: life happened and now I need to enqueue a message when my entities change, so it will trigger another function that uses a queueTrigger to perform some async stuff.
What I tried: the only way I found so far was to disable enableForwardingHttpRequest and change all my endpoints to accept the Azure Functions raw JSON input and output, and then emit a message in one of the output fields (as documented here).
It sounds like a huge change to perform something simple... Is there a way I can enqueue a message without having to change the way my application handles requests?
As per this GitHub issue, custom handlers for Go in Azure Functions currently have a bug that still needs to be fixed.

Azure Functions .Net 5 Is it possible to bind CosmosInput to parameters in HttpRequestData?

Actually, the title is self-descriptive. I have an Azure Function on .NET 5 with an HTTP trigger which has to send back the results of running a Cosmos DB query. Is it possible (even some weird hacky way would work) to pass parameters coming in HttpRequestData to the CosmosInput query? For certain reasons I want to stick with built-in Azure Functions functionality and don't want to implement a full-fledged Cosmos routine inside the function.
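One generally documented mechanism (not confirmed by this thread, so treat it as a pointer to verify): Cosmos DB input bindings support binding expressions that resolve against HTTP trigger data, such as {Query.id} for a query-string parameter. A hedged function.json sketch, with placeholder database, collection, and connection names:

```json
{
  "bindings": [
    { "type": "httpTrigger", "direction": "in", "name": "req",
      "methods": [ "get" ] },
    { "type": "cosmosDB", "direction": "in", "name": "items",
      "databaseName": "MyDb", "collectionName": "MyItems",
      "connectionStringSetting": "CosmosConnection",
      "sqlQuery": "SELECT * FROM c WHERE c.category = {Query.category}" },
    { "type": "http", "direction": "out", "name": "res" }
  ]
}
```

Whether the same expression works through the .NET 5 isolated-worker attributes should be checked against the Cosmos DB input binding documentation for the isolated model.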

Azure durable functions and synchronous read?

I'm trying to use Azure Durable Functions to implement a minimal server that accepts data via a POST and returns the same data via a GET - but I don't seem able to generate the GET response.
Is it simply not possible to return the response via a plain GET? What I do NOT want is:
GET issued
GET response returns a second URL
Client has to use the second URL to get the result.
I just want:
GET issued
GET response with the requested data.
Possible?
I'm not really sure durable functions are meant to be used as a 'server'. Durable functions are part of 'serverless', so I'm not sure it's possible (in a clean way).
To my knowledge, durable functions are used to orchestrate long-lasting processes, for example the orchestration of a batch job. Using durable functions it's possible to create an 'async HTTP API' to check on the status of processing a batch of items (durable function documentation). I've written a blog post about durable functions, feel free to read it (https://www.luminis.eu/blog/azure-functions-for-your-long-lasting-logic/).
But as for your use case: I think you can create two separate Azure Functions. One for posting your data, which can use an Azure Blob Storage output binding. Your second function can have an HTTP GET trigger and, depending on your data, use a blob input binding. No need for durable functions here :)!
Not really a direct answer to your question, but hopefully a solution to your problem!
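The two-function pattern suggested above can be sketched in plain Python, with a dict standing in for the blob container and the Azure bindings omitted; all names here are illustrative:

```python
# Sketch of the POST/GET pair: one function persists the posted payload
# (a dict stands in for the Azure Blob container), the other reads it back.
from typing import Optional, Tuple

_store = {}  # stand-in for an Azure Blob Storage container

def post_entity(key: str, body: bytes) -> int:
    """POST handler: write the payload (a blob output binding in Azure)."""
    _store[key] = body
    return 201  # Created

def get_entity(key: str) -> Tuple[int, Optional[bytes]]:
    """GET handler: read the payload back (a blob input binding in Azure)."""
    if key not in _store:
        return 404, None
    return 200, _store[key]

post_entity("report-1", b'{"total": 99}')
status, body = get_entity("report-1")
```

Because the GET simply reads what the POST wrote, the client gets the data in a single round trip, with no orchestration-status URL in between.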

Pass parameters from C# function app to another Javascript function app in Azure

I need to set up an application in Azure and make two functions communicate (one written in C# and one written in JavaScript).
The C# part analyzes an XML feed, extracts the data into objects, and finally sends them to the JavaScript function as parameters.
I read that we could establish communication between the two functions using HTTP calls, but is it possible to do it with parameters?
If not, would you have any suggestions on how to achieve something like this properly? I'm getting started with Azure and don't have enough visibility to know what is recommended in such a situation.
Thank you for your advice.
Yes, this is absolutely possible. How you do this is up to you. If you look at the default HTTP trigger templates, you can see that they take parameters (for example, as query string parameters). You can find more examples in the HTTP and webhook recipes documentation.
You can use other trigger types for cross-function communication as well. Take a look at this documentation for related best practices: https://learn.microsoft.com/en-us/azure/azure-functions/functions-best-practices#cross-function-communication
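The query-string approach can be sketched as below, in Python for illustration; the function URL and parameter names are assumptions, not real endpoints:

```python
# Sketch of calling one HTTP-triggered function from another by passing the
# extracted data as query-string parameters. The URL and parameter names
# are hypothetical; urlencode handles the escaping.
from urllib.parse import urlencode

def build_call(base_url: str, data: dict) -> str:
    """Compose the GET URL for the downstream (JavaScript) function."""
    return f"{base_url}?{urlencode(data)}"

url = build_call(
    "https://myapp.azurewebsites.net/api/ProcessFeed",  # hypothetical URL
    {"title": "Daily feed", "itemCount": 12},
)
# An actual call would then be, e.g.: urllib.request.urlopen(url)
```

For larger payloads (such as whole serialized objects), sending them in a POST body rather than in the query string is the better fit, but the cross-function HTTP mechanism is the same.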
