I have two Azure Functions. The first one generates one big template file and stores it in the d:\local folder.
After that, this function sends about 10 POST requests to another Azure Function with some parameters.
The function that receives the POST request starts up, copies this template file, and processes it with the passed parameters.
Although the template file is generated successfully in the first function, I get a file-not-found error when I try to copy that template file in the second function. I assume I have to use another folder instead of d:\local.
What folder should I use to exchange files between Azure Functions?
d:\home, or something else?
This is likely happening because your function apps are spun up on different servers. Regardless of why, you should treat serverless cloud offerings such as Functions as stateless and exchange payloads through a service such as a storage account or (if the documents are small) Service Bus.
Have a look at the claim-check pattern, which sounds like an appropriate solution for your use case.
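A minimal sketch of that approach, assuming the template goes into Blob Storage (the "templates" container name, the connection string, and the helper names are mine, not from the original post): the first function uploads the file and sends only the blob name in each POST request, and the second function downloads it to its own local disk before processing.

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class TemplateClaimCheck
{
    // First function: upload the generated template and return the "claim check" (the blob name).
    public static async Task<string> UploadTemplateAsync(string connectionString, string localPath)
    {
        var container = new BlobContainerClient(connectionString, "templates");
        await container.CreateIfNotExistsAsync();

        var blobName = $"template-{Guid.NewGuid()}.tmpl";
        await container.GetBlobClient(blobName).UploadAsync(localPath, overwrite: true);
        return blobName; // send this in the POST body instead of a local file path
    }

    // Second function: fetch the template to its own local disk before processing it.
    public static async Task DownloadTemplateAsync(string connectionString, string blobName, string localPath)
    {
        var blob = new BlobContainerClient(connectionString, "templates").GetBlobClient(blobName);
        await blob.DownloadToAsync(localPath);
    }
}
```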
We have a Windows-based App Service that requires a large dataset to run (files stored in Azure Blob Storage, around 30 GB in total). This data is static per app version and should therefore be accessible to all instances across a given slot (a slot in our case represents a version).
Based on our initial research, Persistent Storage (%HOME%) seems like the ideal place for this, since data stored there is shared across instances but not across slots.
The next step is to load the required data as part of our DevOps deployment pipeline, since the App Service cannot operate without the underlying data. However, it seems the %HOME% directory is only accessible by the App Service itself, even though the underlying implementation uses Azure Storage.
At this point we're considering having the App Service download the data during startup, but then we hit a snag: we have two instances. We could implement a mutex (using a blob lease), but that seems too complicated a solution for a simple need.
Any thoughts about how to best implement this?
The problems I see with loading the file on container startup are the following:
It's going to be really slow, and you might hit one of the built-in App Service timeouts.
Every time your container restarts, or you add another instance, it will re-download all the data. It can also cause issues with blocked writes because of file handle locks, which can make files or directories on %HOME% completely inaccessible for reading and modifying (I just had this happen to me).
Instead, I would suggest connecting the app to Azure Files over SMB and, for example, having a directory per version. That way you can connect to Azure Files and write the data during your build pipeline, and save an environment variable or a file that tells each slot which directory to read the current version's data from.
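A minimal sketch of the lookup side, assuming the pipeline has already written the files to the mounted share and set two app settings (the DATA_MOUNT_PATH and DATA_VERSION names are illustrative, configured per slot):

```csharp
using System;
using System.IO;

public static class VersionedData
{
    // Resolve the directory that holds the current version's data.
    public static string GetDataDirectory()
    {
        // Mount point of the Azure Files share and the version written by the build pipeline
        // (both setting names are assumptions).
        var mountPath = Environment.GetEnvironmentVariable("DATA_MOUNT_PATH");
        var version = Environment.GetEnvironmentVariable("DATA_VERSION");

        var dir = Path.Combine(mountPath, version);
        if (!Directory.Exists(dir))
            throw new DirectoryNotFoundException($"No data found for version '{version}' at '{dir}'.");

        return dir;
    }
}
```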
I have a timer-triggered Azure Function deployed via the portal. It runs daily at 10:00 AM. However, there is now a requirement that the function should also be invoked and run at other times, dynamically.
I know how to set the trigger in the function.json file dynamically via the Kudu API using the steps in the answer mentioned here. Using those steps, I can set the trigger for the next minute and run the function.
But this isn't real time and seems like a workaround. Isn't there a direct way to invoke and manually run an Azure Function via an API?
Isn't there a direct way to invoke and manually run an Azure Function via an API?
We can trigger a deployed Azure Function with a REST API call. I tested it with a timer-triggered C# Azure Function on my side.
POST https://{FunctionAppName}.azurewebsites.net/admin/functions/{functionName}
Note: I traced this from the Azure portal and could not find any official documentation that mentions it, so if you want to use this API in a production environment, please take extra care.
We need x-functions-key as a header, and we can get the function key from the Function App.
We can also use a bearer token as authorization; for how to get the authorization for this REST API, please refer to another SO thread.
Updated:
Added the request body info.
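A minimal sketch of the call from C# (the key placeholder and the empty "input" body are assumptions based on how this admin endpoint is commonly invoked; verify against your runtime version):

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class TriggerFunction
{
    static async Task Main()
    {
        // Replace with your Function App name, function name and key (the master key is typically required).
        var url = "https://{FunctionAppName}.azurewebsites.net/admin/functions/{functionName}";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("x-functions-key", "<function-or-master-key>");

        // The endpoint expects a JSON body; "input" is handed to the function as its trigger input.
        var body = new StringContent("{ \"input\": \"\" }", Encoding.UTF8, "application/json");

        var response = await client.PostAsync(url, body);
        response.EnsureSuccessStatusCode();
    }
}
```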
For the requirement above, my recommendation would be to create two functions that share the same logic (if using CSX, either by importing the common implementation using #load, adding a reference to a common assembly, or having a common type).
You'd have one function using a timer trigger and another using a different trigger type that lets you invoke the function on demand without any dependency on the Kudu or Admin APIs (e.g. HTTP, queue, Service Bus, etc.). The function entry point (your Run method) would just invoke the common logic you bring in.
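A minimal sketch of that layout, using the in-process C# attribute model rather than CSX (the function names, schedule, and shared class are illustrative):

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

// The common implementation both entry points call into.
public static class SharedLogic
{
    public static void Process(ILogger log)
    {
        log.LogInformation("Running the shared processing logic...");
        // ... actual work ...
    }
}

public static class ScheduledRunner
{
    // Timer-triggered entry point: runs daily at 10:00.
    [FunctionName("ScheduledRunner")]
    public static void Run([TimerTrigger("0 0 10 * * *")] TimerInfo timer, ILogger log)
        => SharedLogic.Process(log);
}

public static class OnDemandRunner
{
    // HTTP-triggered entry point for on-demand runs (no Kudu/Admin APIs involved).
    [FunctionName("OnDemandRunner")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req, ILogger log)
    {
        SharedLogic.Process(log);
        return new OkResult();
    }
}
```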
I currently have a couple of WebApi projects that use a few class libraries such as address lookup, bank validation, image storage etc.
Currently they are all in a shared solution, but I'm planning to split them up. I thought about moving the libraries into NuGet packages so that they are separate from the API projects and properly shared.
However, if I make a change to one of these components, I will need to rebuild and redeploy the API service even though it's a separate component that changed.
I thought about putting these components into a separate service, but that seems like a lot of overhead for what it is.
I've been looking at Azure WebJobs and think I may be able to move these components there instead. I have two questions about this:
Are WebJobs suitable for calling on demand (not using a queue)? The request will be initiated by a user on a website, which calls my API service, which in turn calls the WebJob, so it needs to be quick.
Can a WebJob return data? I've seen examples where it does some processing and updates a database, but I need a response (ideally JSON) back to my API service.
Thanks
For your requirement, you could leverage Azure Functions by creating a function with an HTTP trigger, which can be invoked by calling the function URL with parameters and returns the response you expect. You could follow this tutorial for getting started with Azure Functions.
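A minimal sketch of such a function (the "AddressLookup" name, the query parameter, and the returned shape are hypothetical; the real function would call your existing lookup library):

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class AddressLookup
{
    [FunctionName("AddressLookup")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req)
    {
        string postcode = req.Query["postcode"];
        if (string.IsNullOrEmpty(postcode))
            return new BadRequestObjectResult("Pass a 'postcode' query parameter.");

        // ... call the existing address-lookup library here ...
        return new OkObjectResult(new { postcode, street = "Example Street" });
    }
}
```

Your API service then calls the function URL (with its function key) and receives JSON back, like any other HTTP endpoint.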
I have an app (.exe) that picks up a file and imports it into a database. I have to move this setup into Azure. I am familiar with Azure SQL and Azure File Storage. What I am not familiar with is how I execute an app within Azure.
My app reads rows out of my Azure database to determine where the file is (in Azure File Storage) and then dumps the data into a specified table. I'm unsure if this scenario is appropriate for Azure Scheduler or if I need an App Service to set up a WebJob.
Is there any possibility I can put my app directly in Azure File Storage and point a task to that location to execute it? (Then it might be easier to resolve the locations of the files to be imported.)
thanks.
This is a good scenario for Azure Functions, if you want to just run some code on a schedule in Azure.
Functions are like WebJobs (they share the same SDK, in fact), so you can trigger on a schedule or from a storage queue, etc., but you don't need an App Service to run your code in. There are some great intro videos in the Azure Functions documentation, and here is a link to a comparison of the hosting options between WebJobs, Functions, Flow and Logic Apps.
You can edit the function directly in the portal (paste or type your C# or Node.js code straight in), or use source control to manage it.
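As a minimal sketch of what the ported importer could look like as a timer-triggered function (the schedule, setting names, table, column, and share name are all assumptions):

```csharp
using System;
using System.Data.SqlClient;
using Azure.Storage.Files.Shares;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class FileImporter
{
    // Runs daily at 02:00 and imports each pending file from Azure File Storage.
    [FunctionName("FileImporter")]
    public static void Run([TimerTrigger("0 0 2 * * *")] TimerInfo timer, ILogger log)
    {
        var sqlConn = Environment.GetEnvironmentVariable("SqlConnectionString");
        var filesConn = Environment.GetEnvironmentVariable("FilesConnectionString");

        using var conn = new SqlConnection(sqlConn);
        conn.Open();

        using var cmd = new SqlCommand("SELECT FilePath FROM PendingImports", conn);
        using var reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            var path = reader.GetString(0);
            var file = new ShareClient(filesConn, "imports")
                .GetRootDirectoryClient()
                .GetFileClient(path);

            using var stream = file.OpenRead();
            // ... parse the stream and insert rows into the target table ...
            log.LogInformation($"Imported {path}");
        }
    }
}
```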
If you really want to keep your app as an .exe and run it like that, then you will need to use Azure Scheduler instead, which is a basic job runner.
Decisions, decisions...!
Looking at https://azure.microsoft.com/en-gb/documentation/articles/scheduler-intro/ it seems that the only actions that are supported are:
HTTP, HTTPS,
a storage queue,
a Service Bus queue,
a Service Bus topic
so running a self-contained .exe or script doesn't look to be possible.
Do you agree?
I am building an application in Functions (in PS/C#) that connects back into other functions via HTTP. Currently these other functions are looked up in a table and called. This table has been manually created.
https://{appname}.azurewebsites.net/api/Orchestrate?code={secret}
However, when the application is deployed (likely from GitHub), I would need some process that automatically populates that storage table.
How can I find the authorisation secrets at deploy/run time?
It seems my ${generic-Search-engine}foo was broken
According to Azure Functions HTTP and webhook bindings:
You can find API key values in the D:\home\data\Functions\secrets folder in the file system of the function app.
If the secrets folder contains a JSON file with the same name as a function, the key property in that file can also be used to trigger the function
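As a minimal sketch of reading a function's key from that folder at deploy time (the file-format handling below is an assumption, and newer Functions runtimes may store keys in Blob storage instead, in which case this path will not exist):

```csharp
using System;
using System.IO;
using System.Text.Json;

class SecretReader
{
    static void Main()
    {
        var functionName = "Orchestrate";
        var path = Path.Combine(@"D:\home\data\Functions\secrets", functionName + ".json");

        using var doc = JsonDocument.Parse(File.ReadAllText(path));

        // Older secret files expose a top-level "key"; newer file-based ones use a "keys" array.
        // This sketch assumes the values are stored in plain text (not encrypted).
        var key = doc.RootElement.TryGetProperty("key", out var k)
            ? k.GetString()
            : doc.RootElement.GetProperty("keys")[0].GetProperty("value").GetString();

        Console.WriteLine($"https://{{appname}}.azurewebsites.net/api/{functionName}?code={key}");
    }
}
```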