Call Azure Function from Azure Cloud Service without API key

Good day,
We created an Azure Function which fetches a secret from a Key Vault. The idea is not to have the secret in the code, as not every developer should be able to authenticate with the application.
However, I now had to include the Azure Function's API key in code, which (to me) seems like adding just another layer to the problem without actually preventing anyone from accessing the secret (the developer would just need to call the function).
I wonder: is there any possibility to allow calling an Azure Function from a cloud service without an API key? I'd argue this should be possible, as both live in Azure itself. However, I already tried exactly this: calling the function (from the cloud service) without a key, but it just returns Forbidden.

Related

Python Azure Timer Function with Durable Entity

I'm currently working on my first Azure Function, which acts as an intermediary between Defender for Endpoint and Azure Sentinel. It runs every 5 minutes, collects data matching specific filters from the Defender API, and forwards it as custom logs to Azure Sentinel. Due to the authentication measures in place on Defender, I've set my script up to use ADAL for a device code logon the first time, and then use the refresh token for its scheduled runs.
This is where I've run into the problem: since Azure Functions are serverless by design, holding this refresh token somewhere for the next invocation has proven troublesome. I'm trying to use Durable Functions, but documentation for such a use case seems non-existent.
Are there other appropriate methods to store a single variable across invocations of an Azure Function?
There is more than one way to solve the problem of holding the refresh token across Azure Function invocations.
One way is to use an Azure Functions timer trigger to request new access tokens and Azure Key Vault to store those tokens securely. We want to save them in Key Vault so that the next time the function runs and refreshes the tokens, it uses the updated values, and the next invocation can obtain them in turn (see the sketch below).
Check this document on how to access secrets from Key Vault.
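A minimal sketch of that pattern in Python, assuming the azure-identity and azure-keyvault-secrets packages and a managed identity with access to the vault; the vault URL and secret name are placeholders:

```python
# Minimal sketch: persisting the refresh token in Key Vault between invocations.
# Vault URL and secret name are placeholders; the function app's managed identity
# needs get/set permissions on secrets in this vault.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://<your-vault-name>.vault.azure.net"  # placeholder
SECRET_NAME = "defender-refresh-token"                   # placeholder

credential = DefaultAzureCredential()
client = SecretClient(vault_url=VAULT_URL, credential=credential)

def load_refresh_token() -> str:
    """Read the refresh token stored by the previous invocation."""
    return client.get_secret(SECRET_NAME).value

def save_refresh_token(token: str) -> None:
    """Persist the newly issued refresh token for the next invocation."""
    client.set_secret(SECRET_NAME, token)
```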
Another way is to enable the token store for the function app and keep the tokens in Blob storage. Check this document for more information.

How to handle a Custom Connector that uses header authentication in each API call?

I have an Azure Logic App that uses a Custom Connector I've made by importing a Postman Collection. The RESTful API that my connector calls requires three authentication headers in each request: UserName, Secret, ApiIntegrationCode. Because it takes three specifically named parameters, I don't believe any of the authentication presets will work for me.
I know that I can protect the inputs and outputs of various connectors. I have been entertaining the idea of storing the sensitive information in a SQL table that I query in each run and storing the values in variables that I pass to each of my custom connector's API calls.
Would this be a viable way of protecting sensitive data from being seen by people that may have access to my logic app? What is the most secure way I can pass these headers in each call?
There are not too many options within a (consumption) Logic App in this regard.
Your options with Logic Apps
A first step into the right direction is to put your sensitive information into an Azure Key Vault and use the corresponding connector in your Logic App to retrieve the data from there. This is easier to implement and more secure than querying a SQL table for this purpose.
The second thing you can do is to activate secure inputs for the connectors that make the API calls. This makes sure that the sensitive information passed to these connectors is obfuscated in the run history of your Logic App and in connected services like Azure Log Analytics.
The problem with this approach is that anyone who has more than just read permissions on your Logic App can simply deactivate the secure inputs setting or create a step that dumps the contents of your Key Vault. You can use RBAC to control access to your Logic App, but that of course means administrative overhead.
Alternative: API Management Service
If you absolutely want to allow other developers to change the Logic App without exposing API secrets to them, you might consider using some sort of middle tier to communicate with the API. Azure API Management (APIM) is one of the options here.
You would manage your sensitive information in a Key Vault and inject it via "Named Values" into your APIM instance. You can then add your API as a backend in APIM and expose it towards your Logic App.
The advantage here is that you can secure access to your API with APIM subscription keys that you can cycle frequently. You can also restrict access on the original API to only those calls that need to be available to the Logic App (see the sketch below).
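From the caller's side (the Logic App, or any other client), all that is needed is the APIM subscription key; the three backend headers are injected by APIM from Named Values. A minimal sketch, assuming a hypothetical APIM URL and key:

```python
# Minimal sketch: calling the API through a hypothetical APIM front end.
# URL and key are placeholders; the three backend auth headers are added by an
# APIM policy from Named Values, so the caller never sees them.
import requests

APIM_URL = "https://<your-apim-instance>.azure-api.net/backend-api/resource"  # placeholder
SUBSCRIPTION_KEY = "<apim-subscription-key>"  # rotate this regularly

response = requests.get(
    APIM_URL,
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
)
response.raise_for_status()
print(response.json())
```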
Whether APIM is something for you depends on your use case, as it comes at a price. Even the Developer tier costs about $50/month: https://azure.microsoft.com/en-us/pricing/details/api-management/
Alternative: Azure Function
You can use a simple Azure Function that serves as a middle tier between your Logic App and your API. This function can be configured to pull the sensitive data from a Key Vault and can also be secured via a function access key that you can renew on a regular basis (see the sketch below).
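A minimal sketch of such a middle-tier function in Python (v1 programming model); the vault URL, secret names, and backend URL are placeholders:

```python
# Minimal sketch of an HTTP-triggered middle-tier function. It pulls the three
# sensitive header values from Key Vault at call time and forwards the request
# to the backend API. Vault URL, secret names, and backend URL are placeholders.
import azure.functions as func
import requests
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://<your-vault-name>.vault.azure.net"  # placeholder
BACKEND_URL = "https://<backend-api>/resource"           # placeholder

_client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Fetch the three sensitive headers from Key Vault; they never appear
    # in the Logic App definition or run history.
    headers = {
        "UserName": _client.get_secret("api-username").value,
        "Secret": _client.get_secret("api-secret").value,
        "ApiIntegrationCode": _client.get_secret("api-integration-code").value,
    }
    backend_resp = requests.get(BACKEND_URL, headers=headers)
    return func.HttpResponse(
        backend_resp.text,
        status_code=backend_resp.status_code,
        mimetype="application/json",
    )
```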
This is a dirt-cheap option if you are running the function on a Consumption plan: https://azure.microsoft.com/en-us/pricing/details/functions/

How to get the resource groups list within an Azure Function App

I want to create an Azure Function that retrieves the resource groups list. I found a related question here, but I'm wondering if there could be another alternative, without making REST requests, as this function will be hosted in the same Azure subscription.
It is not possible to do this without a call. Having something in a resource group does not grant out-of-the-box access to list the other resources in that resource group; that would be a security issue.
There are a couple of different ways to get the desired information: a REST API call, PowerShell, the Azure CLI, etc.
There is a way to use PowerShell in Azure Functions, but it is currently in preview. I have not tried this before, but maybe you can leverage it to call the simple PowerShell command that gets the resource groups. That way you are not calling the REST API, at least not directly.
If you ask me, and if you really need an Azure Function to do this, going with a REST API call is the safest bet (see the sketch below).
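A minimal sketch of that REST call in Python, assuming the function app has a managed identity with Reader access on the subscription; the subscription ID is a placeholder:

```python
# Minimal sketch: listing resource groups by calling the Azure Resource Manager
# REST API from inside the function. Assumes a managed identity with Reader
# access on the subscription; the subscription ID is a placeholder.
import requests
from azure.identity import DefaultAzureCredential

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder

# Acquire an ARM token with the function app's managed identity.
credential = DefaultAzureCredential()
token = credential.get_token("https://management.azure.com/.default").token

resp = requests.get(
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
    "/resourcegroups?api-version=2021-04-01",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
for group in resp.json()["value"]:
    print(group["name"])
```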

How to run an Azure Function deployed on the portal manually via the Kudu API dynamically?

I have a timer-triggered Azure Function deployed on the portal. It runs daily at 10:00 am. However, there is now a requirement that the function should also be invoked and run at other times, dynamically.
I know how to set the trigger in the function.json file dynamically via the Kudu API using the steps in the answer mentioned here. Using those steps, I can set the trigger for the next minute and run the function.
But this isn't real time; it seems like a workaround. Isn't there any direct way to invoke and manually run an Azure Function via APIs?
Isn't there any direct way to invoke and manually run an Azure Function via APIs?
We can trigger a deployed Azure Function with a REST API. I tested it with a timer-triggered C# Azure Function on my side.
POST https://{FunctionAppName}.azurewebsites.net/admin/functions/{functionName}
Note: I traced this from the Azure portal; I haven't found any official document mentioning it, so if you want to use this API in a production environment, please pay extra attention to it.
We need x-functions-key as a header, and we can get the function key from the function app.
We could also use a bearer token as authorization; for how to get the authorization for this REST API, please refer to another SO thread.
Updated:
Added the body info (see the sketch below).
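A minimal sketch of that request in Python; the app name, function name, and key are placeholders, and the {"input": ""} body shape is an assumption based on how this admin endpoint is commonly called, not something confirmed by official docs:

```python
# Minimal sketch: invoking the timer function on demand via the (undocumented)
# admin endpoint. App name, function name, and key are placeholders; the
# {"input": ""} body shape is an assumption.
import requests

url = "https://<function-app-name>.azurewebsites.net/admin/functions/<function-name>"
headers = {
    "x-functions-key": "<function-key>",  # taken from the function app's keys
    "Content-Type": "application/json",
}

resp = requests.post(url, headers=headers, json={"input": ""})
print(resp.status_code)  # typically 202 Accepted when the invocation is queued
```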
For the requirement above, my recommendation would be to create two functions that share the same logic (if CSX, either by importing the common implementation using #load or adding a reference to a common assembly, or by having a common type).
You'd have one function using a timer trigger and another using a different trigger type that lets you invoke the function on demand without a dependency on any of the Kudu or admin APIs (e.g. HTTP, queue, Service Bus, etc.); the function entry point (your Run method) would just invoke the common logic you bring in (see the sketch below).
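A minimal sketch of the same pattern in Python (the answer above describes the C# script variant, but the structure is identical); in a real app each entry point lives in its own folder with its own function.json, and the names here are placeholders:

```python
# Minimal sketch: timer trigger and HTTP trigger sharing the same logic, so the
# work can be run on demand without the Kudu or admin APIs.
import azure.functions as func

def do_work() -> None:
    """Shared implementation; in practice this lives in a shared_code module."""
    ...

# TimerFunction entry point (runs on the schedule).
def timer_main(timer: func.TimerRequest) -> None:
    do_work()

# HttpFunction entry point (lets you run the same logic on demand).
def http_main(req: func.HttpRequest) -> func.HttpResponse:
    do_work()
    return func.HttpResponse("done", status_code=200)
```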

Finding Functions authorisation code at deploy time

I am building an application in Functions (in PS/C#) that connects back into other functions via HTTP. Currently these other functions are looked up in a table and called. This table has been manually created.
https://{appname}.azurewebsites.net/api/Orchestrate?code={secret}
However, when the application is deployed (likely from GitHub), I would need some process that automatically populates that storage table.
How can I find the authorisation secrets at deploy/run time?
It seems my ${generic-Search-engine}foo was broken
According to Azure Functions HTTP and webhook bindings:
You can find API key values in the D:\home\data\Functions\secrets folder in the file system of the function app.
If the secrets folder contains a JSON file with the same name as a function, the key property in that file can also be used to trigger the function
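A minimal sketch of reading such a key during deployment; this assumes the default file-based secret storage, and the function name is a placeholder (the JSON shape differs between runtime versions, so both forms are tried):

```python
# Minimal sketch: reading a function's key from the secrets folder.
# Assumes file-based secret storage; function name is a placeholder.
import json
from pathlib import Path

SECRETS_DIR = Path(r"D:\home\data\Functions\secrets")
FUNCTION_NAME = "Orchestrate"  # placeholder

with open(SECRETS_DIR / f"{FUNCTION_NAME}.json") as f:
    secrets = json.load(f)

# Older runtimes expose a single "key" property; newer ones a "keys" array.
key = secrets.get("key") or secrets["keys"][0]["value"]
print(f"https://<appname>.azurewebsites.net/api/{FUNCTION_NAME}?code={key}")
```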
