I need to make 6 HTTP requests from a single trigger in one Logic App resource. How can I make multiple HTTP requests from a Logic App? Also, the Azure Logic App is showing the error "This session has timed out. To see the latest run status, navigate to the runs history blade."
All the HTTP requests should be independent of each other.
If any documentation is available for this, please share.
Every logic app workflow starts with a trigger, which fires when a specific event happens, or when new available data meets specific criteria.
Each time that the trigger fires, the Logic Apps engine creates a logic app instance that runs the actions in the workflow.
For the HTTP trigger, follow the doc Call, trigger, or nest workflows with HTTP endpoints in Azure Logic Apps
YES. This is a very basic pattern.
How? Well, you just do. If it's exactly 6 calls, you can use 6 HTTP connectors. If the count is variable, loop over whatever incoming set you have and put the HTTP connector inside the loop.
All outbound requests are independent of each other.
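For comparison, the same fan-out pattern expressed in plain C# looks like this; a minimal sketch with made-up endpoint URLs. The point is that nothing awaits in between, so no request depends on another:

using System;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

// Six made-up endpoints; in the Logic App each would be its own HTTP action.
var urls = new[] { "https://example.com/1", "https://example.com/2", "https://example.com/3",
    "https://example.com/4", "https://example.com/5", "https://example.com/6" };
using var client = new HttpClient();
// Start all six requests up front, then await them together; none depends on another.
var responses = await Task.WhenAll(urls.Select(u => client.GetAsync(u)));
Console.WriteLine($"{responses.Count(r => r.IsSuccessStatusCode)} of 6 succeeded");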
I want to run this logic app once and it needs to hit the api endpoint 50 times. I want to implement this to check some caching policy in API Management.
"I want to run this logic app once and it needs to hit the api endpoint 50 times"
You can design the logic app with a variable that keeps a counter for an Until loop; inside the loop, each iteration runs an HTTP action to invoke the API.
For more information, you can also refer to this Microsoft documentation: Call service endpoints over HTTP or HTTPS from Azure Logic Apps
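If the goal is just to exercise the API Management caching policy, the equivalent loop in C# is also a quick sanity check; a minimal sketch, assuming a placeholder APIM endpoint:

using System;
using System.Net.Http;

using var client = new HttpClient();
for (var i = 0; i < 50; i++)
{
    // Placeholder endpoint; compare response times or cache headers to verify hits.
    var response = await client.GetAsync("https://myapim.azure-api.net/api/items");
    Console.WriteLine($"Call {i + 1}: {(int)response.StatusCode}");
}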
I'm migrating an extensive SharePoint application to Azure and I'm looking for a recommendation on replacing Timer Jobs with some Azure service. Which one would be most suitable: Azure Functions or Azure Logic Apps? The timer job needs to connect to a web service and make some GET / POST calls, and maybe send e-mails.
What would you recommend? Any pros / cons?
Thanks,
George
With that additional context, but without a full understanding of the logic you need to implement around the calls, I'd suggest you look into a combination of a Logic App and a Function. The Logic App will make the time trigger and any email jobs more configurable, but the Function will give you more control of your web requests and logic, especially if those are more complex than simply making a request and ensuring it succeeded.
Set up the Logic App on a time trigger, and use it to orchestrate the Function and any email jobs. You can use the Logic App HTTP connector to trigger an HTTP-based Function, or you can look at the Function App connector.
For the Function, write it in C# and reuse your code there. Add any other calls and handling logic you require, then return a response for the Logic App to consume.
Add Control and Email connectors (there are several Email connectors for different providers) to the Logic App to send out the relevant emails based on the Function's result.
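As a sketch of the Function piece, assuming an HTTP-triggered C# function (the function name and service URL are placeholders):

using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class TimerJobFunction
{
    private static readonly HttpClient Client = new HttpClient();

    [FunctionName("RunTimerJob")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req, ILogger log)
    {
        // Reuse your existing web service code here: GET / POST calls, retries, etc.
        var response = await Client.GetAsync("https://example.com/service"); // placeholder
        log.LogInformation($"Backend returned {(int)response.StatusCode}");

        // Return something the Logic App can branch on with a Condition action.
        return new OkObjectResult(new { succeeded = response.IsSuccessStatusCode });
    }
}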
TLDR:
Is it possible to have multiple inputs to an Azure Function?
Longer Explanation:
I'm new to Azure Functions and still don't have a good understanding of it.
I have an application which downloads HTML data through a proxy web request and I was considering moving it to Azure Functions.
However, the function would require two inputs: a string URL and a proxy object (which contains IP address, username and password properties).
I was thinking of having two queues, one for URLs and one for proxies.
URLs would be added to the queue by a client application, which would trigger the function.
The proxy queue would have a limited pool of proxy objects which would be added back into the queue by the consuming function after they had been used for the web request.
So, if there are no proxies in the proxy queue, the function will not be able to create a web request until one is added back into the queue.
This is all assuming that Azure Functions run in parallel, and that every trigger from the URL queue runs a function invocation on another thread.
So, is what I'm considering possible? If not, is there an alternative way that I could go about it?
There can be only one trigger for a given function, i.e. the function will run when there is a new message in one specified queue.
There is an input bindings feature, which can load additional data based on properties of the triggering request. E.g. if the incoming queue message contains a URL and a proxy ID, and the proxy settings are stored as Table Storage entities (or blobs), you could define an input binding to automatically load the proxy settings based on the ID from the message. See this example.
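A rough sketch of that setup, assuming the queue payload is JSON with a ProxyId property and the proxies live in a Table Storage table; all names here are made up:

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Table;

public class UrlMessage              // shape of the triggering queue message
{
    public string Url { get; set; }
    public string ProxyId { get; set; }
}

public class ProxyEntity : TableEntity
{
    public string IpAddress { get; set; }
    public string Username { get; set; }
    public string Password { get; set; }
}

public static class Downloader
{
    [FunctionName("DownloadPage")]
    public static void Run(
        [QueueTrigger("urls")] UrlMessage input,
        // {ProxyId} is resolved from the JSON queue message at runtime.
        [Table("Proxies", "proxy", "{ProxyId}")] ProxyEntity proxy,
        ILogger log)
    {
        log.LogInformation($"Downloading {input.Url} via proxy {proxy.IpAddress}");
        // ... make the web request through the proxy here ...
    }
}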
Of course, you could achieve the same without an input binding, just by loading the proxy settings manually in the function body based on your own logic.
There is no way to set up a Function so that it is not triggered until there are messages in two queues at the same time.
I have looked through documentation for WebJobs, Functions and Logic Apps in Azure but I cannot find a way to schedule a one-time execution of a process through code. My users need to be able to schedule notifications to go out at a specific time in the future (usually within a few hours or a day from being scheduled). Everything I am reading on those processes is using CRON expressions which is not designed for one-time executions. I realize that I could schedule the job to run on intervals and check the database to see if the rest of the job needs to run, but I would like to avoid running the jobs unnecessarily if possible. Any help is appreciated.
If it is relevant, I am using C#, ASP.NET MVC Core, App Services and a SQL database all hosted in Azure. My plan was to use Logic apps to check the database for a scheduled event and send notifications through Twilio, SendGrid, and iOS/Android push notifications.
One option is to create Azure Service Bus messages from your app using the ScheduledEnqueueTimeUtc property. The message is created in the queue immediately, but only becomes consumable at the scheduled time.
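A minimal sketch using the Microsoft.Azure.ServiceBus SDK; the queue name, connection string and payload are placeholders:

using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;

var client = new QueueClient(connectionString, "notifications"); // placeholders
var message = new Message(Encoding.UTF8.GetBytes(payloadJson))
{
    // Enqueued now, but not visible to consumers until this time.
    ScheduledEnqueueTimeUtc = DateTime.UtcNow.AddHours(3)
};
await client.SendAsync(message);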
Then a Logic App could be listening to that Service Bus Queue and doing the further processing, e.g. SendGrid, Twilio, etc...
HTH
You could use an Azure Storage queue trigger with deferred visibility. This keeps the message invisible for a specified timeout, which conveniently acts as a timer.
using Microsoft.WindowsAzure.Storage;        // WindowsAzure.Storage package
using Microsoft.WindowsAzure.Storage.Queue;
using Newtonsoft.Json;

// Same queue as the trigger listens on; connection string is a placeholder.
CloudQueue queueOutput = CloudStorageAccount.Parse(connectionString)
    .CreateCloudQueueClient().GetQueueReference("notifications");
var strjson = JsonConvert.SerializeObject(message); // message is your payload
var cloudMsg = new CloudQueueMessage(strjson);
// The message stays invisible to consumers until the delay elapses.
queueOutput.AddMessage(cloudMsg, initialVisibilityDelay: TimeSpan.FromHours(1));
See https://learn.microsoft.com/en-us/dotnet/api/microsoft.azure.storage.queue.cloudqueue.addmessage?view=azure-dotnet for more details on this overload of AddMessage.
You can use Azure Automation to schedule tasks programmatically via its REST API. Learn about it here.
You can also use Azure Event Grid. Based on this article, you can "Extend existing workflows by triggering a Logic App once there is a new record in your database".
Hope this helps.
The other answers are all valid options, but there are some others as well.
For Logic Apps you can build this behavior into the app, as described in the Scheduler migration guide. The solution described there is to create a logic app with an HTTP trigger and pass the desired execution time to that trigger (in the POST data or query parameters). The 'Delay Until' action can then be used to postpone execution of the subsequent steps until the time passed to the trigger.
You'd have to change the logic app to support this, but depending on the use case that may not be an issue.
For Azure Functions, a similar pattern can be achieved with Durable Functions, which support durable timers.
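A minimal orchestrator sketch with a durable timer (Durable Functions 2.x; the function and activity names are placeholders):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class ScheduledNotification
{
    [FunctionName("ScheduledNotification")]
    public static async Task Run([OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        // The desired send time is passed in when the orchestration is started.
        var fireAt = context.GetInput<DateTime>();

        // Durable timer: the orchestration sleeps (cheaply) until fireAt.
        await context.CreateTimer(fireAt, CancellationToken.None);

        await context.CallActivityAsync("SendNotification", null);
    }
}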
How can I correlate a single request across multiple Azure services in Application Insights?
Say we expose a "Create Case" API endpoint in API Management, using an API App.
The API App does some work, including triggering a Logic App.
How can I see the "flow" of the request throughout all the various Azure services to give a single "view" of the state of a particular case?
And I only say Application Insights because we sort of use it in Web Apps / API Apps already; any other Azure-based tool is fine.
What we do:
- In the API Management inbound policy we create a GUID (CorrelationId), which we then pass on either in HTTP headers to API backends or in message properties through Service Bus queues/topics.
- API Management logging (including the CorrelationId) is pushed to Event Hubs, from where an Azure Function logs it to AppInsights.
- All APIs and Functions that handle queue etc. messages also log to AppInsights.
In summary: all logic components that can log to AppInsights take the CorrelationId and put it into the customDimensions. This way we keep track of all steps a request takes.
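In code this boils down to stamping every telemetry item with the id; a rough sketch, assuming req is the incoming HttpRequest and the header name is whatever you standardized on:

using System.Collections.Generic;
using Microsoft.ApplicationInsights;

var telemetry = new TelemetryClient();
// The CorrelationId arrives via an HTTP header or a Service Bus message property.
var correlationId = req.Headers["CorrelationId"].ToString();
// Custom properties land in customDimensions, so you can query e.g.
//   traces | where customDimensions.CorrelationId == "<guid>"
telemetry.TrackTrace("Handling request",
    new Dictionary<string, string> { ["CorrelationId"] = correlationId });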
You can use Client Tracking Id for this.
The client tracking ID is a value that correlates events across a logic app run, including any nested workflows called as part of the run. This ID is auto-generated if not provided, but you can specify the client tracking ID manually from a trigger by passing an x-ms-client-tracking-id header with the ID value in the trigger request (request trigger, HTTP trigger, or webhook trigger).
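For example, when firing a request-triggered logic app from C# (the trigger URL is a placeholder):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

using var http = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Post, logicAppTriggerUrl); // placeholder
// Every event in this run (and any nested workflows) will carry this tracking id.
request.Headers.Add("x-ms-client-tracking-id", Guid.NewGuid().ToString());
request.Content = new StringContent("{}", Encoding.UTF8, "application/json");
var response = await http.SendAsync(request);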