How can I execute a second cloud function when the first has finished?
I'd like to call a new Cloud Function, not a normal JS function, in order to reduce each function's execution time. I also need serial execution, not parallel (the second function queries data updated by the first).
To solve this, you need an orchestrator: run a pipeline that calls your first function, then your second function (and then whatever else you need).
My preferred option is Cloud Workflows (https://cloud.google.com/workflows/docs/overview): serverless, cheap, and perfect for working with Cloud Functions.
Alternatively, you can use Cloud Composer, but it is heavier and more expensive.
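A minimal Workflows definition for this serial pipeline might look like the sketch below. The function URLs are placeholders for your own deployments; each step only starts once the previous step's HTTP call has returned, which gives you the serial ordering you need:

```yaml
main:
  steps:
    - call_first_function:
        call: http.post
        args:
          url: https://REGION-PROJECT.cloudfunctions.net/first-function   # placeholder URL
          auth:
            type: OIDC
        result: first_result
    - call_second_function:        # runs only after the first step completes
        call: http.post
        args:
          url: https://REGION-PROJECT.cloudfunctions.net/second-function  # placeholder URL
          auth:
            type: OIDC
        result: second_result
    - return_result:
        return: ${second_result.body}
```

The OIDC auth block lets the workflow call functions that require authentication; drop it if your functions allow unauthenticated invocations.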
I have a timer function that runs by itself once per minute.
Is it possible to invoke this function from another type of function if I want to call it at an arbitrary time (not on its cron schedule)?
From:
An orchestrator function?
An activity function?
Also, is it possible to call an orchestrator directly from an activity function? I have heard that you can do "sub-orchestrations" from an orchestrator, but what about directly from an activity function?
You cannot call the timer-triggered function directly, but one thing you can do is extract the logic into a class library and share it with an HTTP-triggered function running in the same Azure Function App as the timer-triggered one.
About the Durable part: it's been a while since I last worked with it, but as far as I know, the orchestrator can call sub-orchestrators and activities.
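The shared-logic idea can be sketched as follows. The answer above describes a C# class library, but the same pattern works in Node.js; all names here are illustrative, and the trigger signatures are simplified:

```javascript
// Hypothetical shared module holding the timer function's business
// logic, so an HTTP-triggered function can reuse it on demand.
function processData(input) {
  // Placeholder for the real work the timer-triggered function does.
  return `processed:${input}`;
}

// Timer-triggered entry point (signature simplified for illustration):
// runs on the cron schedule.
function timerTrigger(context, timer) {
  return processData("scheduled");
}

// HTTP-triggered entry point: runs the same logic at an arbitrary time.
function httpTrigger(context, req) {
  return processData(req.query.input || "manual");
}
```

Both entry points stay thin; only `processData` contains the real logic, so there is a single place to maintain it.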
I'm developing on AWS Lambda with Serverless for the first time.
I know that my Node.js code is non-blocking, so a Node.js server can handle several requests simultaneously.
My question: does Lambda create an instance for each call? For example, if there are 10 simultaneous connections, will Lambda create 10 instances of Node.js?
Currently, in my tests, I have the impression that Lambda creates an instance for each call, because on each call my code creates a new connection to my database, while locally my code keeps the database connection in memory.
Yes, this is a fundamental feature of AWS Lambda (and "serverless" functions in general): a separate instance handles each concurrent request.
If you have multiple parallel executions, each runs in its own instance (and thus each uses its own connection to the DB).
Now, if you invoke the Lambda function multiple times one after another, that's a bit different. Subsequent invocations may reuse the execution context, which means some things, like the DB connection, can be reused across calls.
There is no exact guarantee of how long a Lambda function keeps the previous context alive. Also, in order to reuse things like the DB connection, you must create the connection outside of your handler function; if you create it inside the handler, it will certainly not be reused.
When the context is reused, you get what is called a "warm" start, and the Lambda function starts faster. If enough time has passed that the context can no longer be reused, you get a "cold" start, meaning the Lambda function takes longer to begin executing (it needs to pull in all its dependencies first).
Because an HTTP-triggered Azure Function has a strict time limit of 230 seconds, I created an HTTP-triggered durable Azure Function. I find that when I trigger the durable function multiple times and the last run has not completed, the current run continues the last run until it is finished. This is a little confusing to me, because I only want each run to do the task of the current run, not replay the last unfinished run. So my questions are:
Is it by design for durable functions to make sure each run completes (succeeds or fails)?
Can a durable function focus only on the current run, just like a normal HTTP-triggered Azure Function?
If the answer to 2) is no, is there any way to mitigate the time-limit issue for a normal HTTP-triggered Azure Function?
Thanks a lot!
The orchestration runs until it reaches a result: either a successfully-completed or a failed status.
According to the Microsoft documentation:
Azure Functions times out after 230 seconds regardless of the functionTimeout setting you've configured in the settings.
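As far as I know, each request to the durable client's "start new" operation normally creates a separate orchestration instance with its own instance ID, so runs are independent unless you explicitly reuse an instance ID. The idea can be sketched conceptually (this is a plain simulation, not the durable-functions API):

```javascript
// Plain simulation (NOT the durable-functions API): each "start new"
// call without an explicit instance ID creates an independent
// orchestration instance with its own state.
const instances = new Map();
let nextId = 0;

function startNew(input) {
  const id = `instance-${++nextId}`;
  instances.set(id, { input, status: "Running" });
  return id;
}

const first = startNew({ task: "A" });
const second = startNew({ task: "B" }); // independent of the first run
```

If your runs appear to "continue" each other, it is worth checking whether the client is reusing a fixed instance ID instead of letting a new one be generated per request.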
I am using Azure Functions V1 with C#. I have a timer-triggered Azure Function that checks for some data in my database every second. If the data is found, I want to perform some operation on it. This operation can take anywhere from 30 seconds to 5 minutes, depending on what is being done.
When my timer-triggered function gets data and starts performing the operation on it, the function is not executed again until that operation is completed. So even though the function is scheduled to run every second, it is not executed for the next 30 seconds if the operation in the previous iteration took 30 seconds. How can I solve this?
Can I call some other Azure Function from the current timer-triggered function to take care of that 30-second operation, so that my timer-triggered function keeps running smoothly every second?
How can I call another Azure Function (a custom function) from the current timer-triggered function?
Thanks,
You may want to consider Logic Apps for this scenario. Logic Apps are a serverless workflow offering from Azure. Use the recurrence trigger to schedule the job (an HTTP call), and it will trigger the Azure Function regardless of whether the previous run has finished.
https://learn.microsoft.com/en-us/azure/connectors/connectors-native-recurrence
If you want to trigger an external function directly, you may use HttpClient:
Azure Functions call http post inside function
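In a Node.js function the same pattern can be sketched with the built-in `fetch` (Node 18+). The URL and route here are hypothetical; in a real app you would use the second function's endpoint, plus any function key it requires:

```javascript
// Sketch: invoke a second function over HTTP from inside the first.
// baseUrl and the /api/secondFunction route are placeholders.
async function callSecondFunction(baseUrl, payload) {
  const res = await fetch(`${baseUrl}/api/secondFunction`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  if (!res.ok) {
    throw new Error(`second function failed: ${res.status}`);
  }
  return res.json();
}
```

Note that the caller now waits on the callee's response, so the callee's latency counts against the caller's own timeout, which is exactly why fire-and-forget mechanisms (queues, Logic Apps) are often preferred.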
I have two HTTP-triggered functions in one Azure Function App project:
PUT
DELETE
Under certain conditions, I would like to call the DELETE function from the PUT function.
Is it possible to invoke the DELETE function directly, given that both reside in the same Function App project?
I wouldn't recommend trying to call the actual function directly, but you can certainly refactor the DELETE functionality into a normal method and then call that from both the DELETE and PUT functions.
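That refactoring can be sketched in Node.js as follows (all names are illustrative; a C# version would put the shared method in a class instead). The deletion logic lives in one plain function that both handlers call:

```javascript
// Shared logic, callable from any function in the app.
function deleteItem(store, id) {
  return store.delete(id); // true if the item existed
}

// DELETE handler: just delegates to the shared method.
function handleDelete(store, req) {
  const removed = deleteItem(store, req.id);
  return { status: removed ? 200 : 404 };
}

// PUT handler: under some condition, reuses the same shared method
// instead of invoking the DELETE function over HTTP.
function handlePut(store, req) {
  if (req.body === null) { // illustrative condition for "delete instead"
    const removed = deleteItem(store, req.id);
    return { status: removed ? 200 : 404 };
  }
  store.set(req.id, req.body);
  return { status: 200 };
}
```

This keeps one implementation of the deletion behavior while avoiding a self-call over HTTP, with its extra latency and failure modes.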
There are a few ways to call one function from another:
HTTP request - it's simple: execute a normal HTTP request to your second function. This is not recommended, because it extends the function's execution time and introduces additional failure modes, such as timeouts and service unavailability.
Storage queues - communicate through queues (recommended): e.g. the first function (in your situation, the PUT function) inserts a message into a queue, and the second function (the DELETE function) listens on this queue and processes each message.
Azure Durable Functions - this extension allows you to create rich, easy-to-understand workflows that are cheap and reliable. Another advantage is that they can retain their own internal state, which can be used for communication between functions.
Read more about cross function communication here.
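The storage-queue option above can be sketched as follows. The in-memory array stands in for an Azure Storage queue; in a real app a queue trigger binding would deliver messages to the second function automatically:

```javascript
// In-memory stand-in for an Azure Storage queue.
const queue = [];

// First function ("PUT"): enqueue a message instead of calling
// the second function directly.
function putFunction(itemId) {
  queue.push({ action: "delete", itemId });
  return { status: 202 }; // accepted; work happens asynchronously
}

// Second function ("DELETE"): in a real app this would be a
// queue-triggered function; here we drain the queue manually.
function processQueue(store) {
  while (queue.length > 0) {
    const msg = queue.shift();
    store.delete(msg.itemId);
  }
}
```

The PUT function returns immediately with 202 Accepted, and the queue absorbs bursts and retries, which is why this decoupled approach is the recommended one.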