Finding Functions authorisation code at deploy time - Azure

I am building an application in Functions (in PS/C#) that connects back into other functions via HTTP. Currently these other functions are looked up in a table and called. This table has been manually created.
https://{appname}.azurewebsites.net/api/Orchestrate?code={secret}
However, when the application is deployed (likely from GitHub), I would need some process that automatically populates that storage table.
How can I find the authorisation secrets at deploy / run time?

It seems my ${generic-Search-engine}foo was broken
According to Azure Functions HTTP and webhook bindings:
You can find API key values in the D:\home\data\Functions\secrets folder in the file system of the function app.
If the secrets folder contains a JSON file with the same name as a function, the key property in that file can also be used to trigger the function.
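Building on that, a deployment step could enumerate those per-function JSON files and write the resulting URLs into the storage table. The sketch below is only a rough illustration: it assumes it runs somewhere with direct access to the function app's file system (for example from the Kudu console), that each secrets file is named after its function as the quoted docs describe, and that Newtonsoft.Json is available.

// Rough sketch (assumptions noted above): read each {FunctionName}.json secrets
// file and print the callable URL that could be written into the lookup table.
using System;
using System.IO;
using Newtonsoft.Json.Linq;

class DumpFunctionKeys
{
    static void Main()
    {
        // Path quoted from the Azure Functions HTTP and webhook bindings docs.
        var secretsDir = @"D:\home\data\Functions\secrets";

        foreach (var file in Directory.GetFiles(secretsDir, "*.json"))
        {
            var functionName = Path.GetFileNameWithoutExtension(file);
            var json = JObject.Parse(File.ReadAllText(file));
            var key = (string)json["key"]; // the "key" property mentioned in the docs

            // {appname} is a placeholder, as in the URL earlier in the question.
            Console.WriteLine($"https://{{appname}}.azurewebsites.net/api/{functionName}?code={key}");
        }
    }
}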

Related

Folder to exchange files between Azure Functions

I have two Azure Functions. The first one generates one big template file and stores it in d:\local folder.
After that this function sends about 10 POST requests to another Azure Function with some parameters.
The function that receives the POST request launches, copies this template file and processes it with the passed parameters.
Although the template file is generated successfully in the first function, I get a file-not-found error when I try to copy that template file in the second function. I assume I have to use another folder instead of d:\local.
What folder should I use to exchange files between Azure Functions?
d:\home or something else?
This is likely happening because your function apps are spun up on different servers. Regardless of the why, you should treat serverless offerings such as Functions as stateless and use a service such as a storage account or (if the documents are small) Service Bus to exchange payloads.
Have a look at the claim check pattern, which sounds like an appropriate solution for your use case.
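As a rough illustration of the claim check pattern here, the first function could upload the template to a blob container and pass only the blob name in its POST requests, and the second function could download it before processing. The "templates" container, the connection-string handling and the use of the Azure.Storage.Blobs SDK below are assumptions, not a prescribed implementation.

// Claim-check sketch shared by both functions (assumptions noted above).
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class TemplateExchange
{
    // First function: upload the generated template and return the blob name
    // (the "claim check") to include in the POST request instead of the file itself.
    public static async Task<string> UploadTemplateAsync(string connectionString, string localPath)
    {
        var container = new BlobContainerClient(connectionString, "templates");
        await container.CreateIfNotExistsAsync();

        var blobName = Path.GetFileName(localPath);
        await container.GetBlobClient(blobName).UploadAsync(localPath, overwrite: true);
        return blobName;
    }

    // Second function: redeem the claim check by downloading the template
    // to this instance's local temp folder before processing it.
    public static async Task<string> DownloadTemplateAsync(string connectionString, string blobName)
    {
        var container = new BlobContainerClient(connectionString, "templates");
        var localPath = Path.Combine(Path.GetTempPath(), blobName);
        await container.GetBlobClient(blobName).DownloadToAsync(localPath);
        return localPath;
    }
}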

Call Azure function from azure cloud service without API key

Good day,
we created an Azure function which fetches a secret from a key vault. The idea is not to have the secret in the code, as not every developer should be able to authenticate with the application.
However, now I had to include the Azure Function's API key in code, which (to me) seems like adding just another layer to the problem without actually preventing anyone from accessing the secret (the developer would just need to call the function).
I wonder: is there any way to call an Azure Function from a cloud service without an API key? I'd argue this should be possible, as both live in Azure itself. However, I already tried exactly this: calling the function (from the cloud service) without a key, but the call just returns Forbidden.

How to run azure function deployed on portal manually via Kudu Api dynamically?

I have a timer-triggered Azure Function deployed in the portal. It runs daily at 10:00 am. However, there is now a requirement that the function should also be invoked at other times, dynamically.
I know how to set the trigger in the function.json file dynamically via the Kudu API using the steps in the answer mentioned here. Using those steps, I can set the trigger for the next minute and run the function.
But this isn't real-time; it seems like a workaround. Isn't there any direct way to invoke and manually run an Azure Function directly via APIs?
Isn't there any direct way to invoke and manually run an Azure Function directly via APIs?
We can trigger a deployed Azure Function with a REST call. I tested it with a timer-triggered C# Azure Function on my side.
POST https://{FunctionAppName}.azurewebsites.net/admin/functions/{functionName}
Note: I traced this from the Azure portal and couldn't find any official documentation for it, so take extra care if you want to use this API in a production environment.
We need to send x-functions-key as a header, and we can get the function key from the function app.
We could also use a bearer token for authorization; for how to obtain the authorization for this REST API, please refer to another SO thread.
Updated:
Added the body info.
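For reference, a call along those lines could look like the sketch below. The { "input": "" } body and reading the key from an environment variable are assumptions on my side, not something confirmed by official documentation, so treat it as illustrative only.

// Illustrative HttpClient call to the admin endpoint quoted above (assumptions noted).
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class AdminInvoke
{
    static async Task Main()
    {
        // Placeholders match the POST URL above; the key is assumed to be supplied
        // via an environment variable rather than hard-coded.
        var functionAppName = "{FunctionAppName}";
        var functionName = "{functionName}";
        var functionsKey = Environment.GetEnvironmentVariable("FUNCTIONS_KEY");

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("x-functions-key", functionsKey);

        // Assumed body shape: an "input" property carrying the trigger payload
        // (empty for a timer-triggered function).
        var body = new StringContent("{ \"input\": \"\" }", Encoding.UTF8, "application/json");

        var response = await client.PostAsync(
            $"https://{functionAppName}.azurewebsites.net/admin/functions/{functionName}", body);

        Console.WriteLine(response.StatusCode);
    }
}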
For the requirement above, my recommendation would be to create two functions that share the same logic (if using CSX, either by importing the common implementation with #load or by referencing a common assembly, or by using a common type).
You'd have one function using a timer trigger and another using a different trigger type that lets you invoke the function on demand without depending on any of the Kudu or admin APIs (e.g. HTTP, queue, Service Bus, etc.). The function entry point (your Run method) would just invoke the common logic you bring in.
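A minimal sketch of that layout in precompiled C# might look like the following (a CSX version would #load the shared file instead). The function names, schedule and authorization level are illustrative only; both entry points simply call the same shared method.

// Shared logic invoked by both functions (names and bindings are illustrative).
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class SharedLogic
{
    public static Task ProcessAsync(ILogger log)
    {
        log.LogInformation("Running the common job logic.");
        return Task.CompletedTask;
    }
}

public static class ScheduledRun
{
    // Timer trigger: 10:00 every day, as in the question.
    [FunctionName("ScheduledRun")]
    public static Task Run([TimerTrigger("0 0 10 * * *")] TimerInfo timer, ILogger log)
        => SharedLogic.ProcessAsync(log);
}

public static class OnDemandRun
{
    // HTTP trigger: lets callers run the same logic on demand without the admin API.
    [FunctionName("OnDemandRun")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req, ILogger log)
    {
        await SharedLogic.ProcessAsync(log);
        return new OkResult();
    }
}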

Using Azure WebJobs for on demand responses

I currently have a couple of WebApi projects that use a few class libraries such as address lookup, bank validation, image storage etc.
Currently they are all in a shared solution, but I'm planning to split them up. I thought about moving the libraries into NuGet packages so that they are separate from the API projects and are properly shared.
However, if I make a change to one of these components I will need to build and redeploy the API service even though it's a separate component which has changed.
I thought about putting these components into a separate service but seems a bit of overhead for what it is.
I've been looking at Azure WebJobs and think I may be able to move these components into this instead. I have two questions related to this:
Are WebJobs suitable for calling on demand (not using a queue)? The request will be triggered by a user on a web site, which calls my API service, which then calls the WebJob, so it needs to be quick.
Can a WebJob return data? I've seen examples where it does some processing and updates a database but I need a response (ideally Json) back to my API service.
Thanks
Based on your requirements, I'd suggest leveraging Azure Functions: create a function with an HTTP trigger, which can be invoked by calling the function URL with parameters and returns the response you expect. You can follow this tutorial for getting started with Azure Functions.
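As a rough sketch of what that could look like, the function below accepts a parameter on the query string and returns JSON to the calling API service. The AddressLookup name, the query parameter and the response shape are all illustrative, not part of the linked tutorial.

// Illustrative HTTP-triggered function returning JSON to the caller.
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class AddressLookup
{
    [FunctionName("AddressLookup")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req)
    {
        string postcode = req.Query["postcode"];

        // The real lookup/validation logic from the shared class library goes here.
        var result = new { postcode, valid = !string.IsNullOrWhiteSpace(postcode) };

        // Serialised to JSON for the calling API service.
        return new OkObjectResult(result);
    }
}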

Running an exe in azure at a regular interval

I have an app (.exe) that picks up a file and imports it into a database. I have to move this setup into Azure. I am familiar with Azure SQL and Azure File Storage. What I am not familiar with is how I execute an app within Azure.
My app reads rows out of my Azure database to determine where the file is (in Azure File Storage) and then dumps the data into a specified table. I'm unsure if this scenario is appropriate for Azure Scheduler or if I need an App Service to set up a WebJob.
Is there any possibility I can put my app directly in Azure File Storage and point a task to that location to execute it (then it might be easier to resolve the locations of the files to be imported)?
thanks.
This is a good scenario for Azure Functions, if you want to just run some code on a schedule in Azure.
Functions are like WebJobs (they share the same SDK, in fact), so you can trigger on a schedule or from a storage queue, etc., but you don't need an App Service to run your code in. There are some great intro videos here: Azure Functions Documentation, and here is a link to a comparison of the hosting options between WebJobs, Functions, Flow and Logic Apps.
You can edit the function directly in the portal (paste/type your C# or Node.js code straight in), or use source control to manage it.
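If you go the Functions route, the import could be sketched roughly as below: a timer-triggered function that looks up the pending file in the database, reads it from Azure File Storage and writes the rows to the target table. The schedule, names and use of the Azure.Storage.Files.Shares SDK are assumptions for illustration, not a prescribed design.

// Illustrative timer-triggered import (assumptions noted above).
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Files.Shares;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class FileImport
{
    [FunctionName("FileImport")]
    public static async Task Run(
        [TimerTrigger("0 */15 * * * *")] TimerInfo timer, ILogger log) // every 15 minutes (illustrative)
    {
        // 1. Query the Azure SQL database for the next file to import
        //    (the same lookup the exe performs today; omitted here).
        var directoryName = "incoming";   // would come from the database row
        var fileName = "data.csv";        // would come from the database row

        // 2. Read the file from Azure File Storage.
        var share = new ShareClient(
            Environment.GetEnvironmentVariable("StorageConnection"), "imports");
        var file = share.GetDirectoryClient(directoryName).GetFileClient(fileName);

        using var stream = await file.OpenReadAsync();
        using var reader = new StreamReader(stream);
        var contents = await reader.ReadToEndAsync();

        // 3. Insert the parsed rows into the specified table
        //    (the existing import code slots in here).
        log.LogInformation("Read {Length} characters from {Dir}/{File}", contents.Length, directoryName, fileName);
    }
}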
If you really want to keep your app as an exe and run it like that, then you will need to use Azure Scheduler to do this instead, which is a basic job runner.
Decisions, decisions...!
Looking at https://azure.microsoft.com/en-gb/documentation/articles/scheduler-intro/ it seems that the only actions that are supported are:
HTTP, HTTPS,
a storage queue,
a service bus queue,
a service bus topic
so running a self-contained .exe or script doesn't look to be possible.
Do you agree?
