How do I set a secret in an Azure Logic App?

Is there a way to set a secret's value and expiration date from within a Logic App?
I don't see the option listed (I scrolled through the list to ensure that I wasn't missing it).

It is as you suspected: you cannot set or update a secret using the out-of-the-box Key Vault connector available with Logic Apps. You can, however, do it using the Key Vault REST API (see the Update Secret REST reference).
I personally write an Azure Function to perform various Key Vault operations, as I find some operations are lacking in the out-of-the-box connector in Logic Apps.
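If you go the function route, a minimal sketch using the azure-keyvault-secrets SDK might look like this (the vault URL and secret name are placeholders, and the calling identity needs secret "Set" permission on the vault); under the hood this calls the same Set Secret REST endpoint:

```python
import datetime

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Placeholder vault URL; the caller's identity needs "Set" permission on secrets.
client = SecretClient(
    vault_url="https://<your-vault-name>.vault.azure.net",
    credential=DefaultAzureCredential(),
)

# set_secret creates a new secret version; expires_on maps to the
# expiration date attribute you would otherwise set in the portal.
client.set_secret(
    "my-secret",
    "s3cr3t-value",
    expires_on=datetime.datetime(2025, 12, 31, tzinfo=datetime.timezone.utc),
)
```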

Related

Is it recommended to create and use Azure Key Vault when creating a linked service for a storage container in ADF, or to use Managed Identity?

I am relatively new to learning ADF; while creating a linked service for a 'blob' data store with the default settings ('using connection string' for the authentication type), at the end of the creation step I got the following recommendation:
Linked service will be published immediately
As Data Factory cannot store credentials in a Git repository, this change will be published immediately. This may cause issues on the master branch and on published resources that depend on this linked service. To avoid immediate publishing of linked services, we recommend using Azure Key Vault.
I have attached a screenshot of the recommendation to this post.
My concern is, what should be the ideal approach?
Further, if I publish the created linked service directly with connection string as the authentication type, how do I use it to run and test the pipeline? As of now, I haven't run a pipeline yet; everything I have created so far has been in the Git repository mode of ADF.
Would anyone please help guide me through the process and best practice?
Thank you for giving your valuable time and support.
In Azure it is best to leverage managed identity wherever possible rather than storing credentials in Key Vault, as stored credentials add another layer of security and maintenance overhead.
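ADF configures this in the linked service settings rather than in code, but as a sketch of what the managed identity pattern looks like in general (the account URL is a placeholder):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# No connection string or account key anywhere: DefaultAzureCredential
# resolves to the resource's managed identity when this runs in Azure.
client = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
containers = list(client.list_containers())
```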

Python Azure Timer Function with Durable Entity

I'm currently trying to build my first Azure Function to act as an intermediary between Defender for Endpoint and Azure Sentinel. It runs every 5 minutes and collects data matching specific filters from the Defender API, then forwards it as custom logs to Azure Sentinel. Due to the authentication measures in place on Defender, I've set my script up using ADAL to do a device code logon the first time, then use the refresh tokens for its scheduled runs.
This is where I've come across the problem: since Azure Functions are serverless by design, holding this refresh token somewhere for the next invocation has proven troublesome. I'm trying to use Durable Functions, but the documentation for such a use case seems non-existent.
Are there other appropriate methods to store a singular variable across invocations of an Azure Function?
There is more than one way to solve the problem of holding the refresh token across invocations of an Azure Function.
One way is to use an Azure Functions timer trigger to request new access tokens and Azure Key Vault to store these tokens securely. Saving them in Key Vault means that each run reads the current values and writes the updated ones back, so the next invocation can pick up where the last one left off.
Check this document on how to access secrets from Key Vault.
Another way is to enable the token store in Azure Functions and store the token in Blob Storage. Check this document for more information.
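A minimal sketch of the first approach, assuming the function's identity has get/set secret permissions on the vault; acquire_new_tokens is a hypothetical helper that redeems the refresh token against AAD:

```python
import logging

import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://<your-vault-name>.vault.azure.net"  # placeholder
SECRET_NAME = "defender-refresh-token"


def main(mytimer: func.TimerRequest) -> None:
    client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())

    # Read the refresh token persisted by the previous invocation.
    refresh_token = client.get_secret(SECRET_NAME).value

    # Hypothetical helper: redeem the refresh token for a new token pair.
    access_token, new_refresh_token = acquire_new_tokens(refresh_token)

    # Persist the rotated refresh token for the next invocation.
    client.set_secret(SECRET_NAME, new_refresh_token)
    logging.info("Refresh token rotated for next run.")
```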

How to handle a Custom Connector that uses header authentication in each API call?

I have an Azure Logic App that uses a Custom Connector that I've made by importing a Postman Collection. The RESTful API that my connector calls requires 3 authentication headers in each request: UserName, Secret, ApiIntegrationCode. Because it takes three specifically named parameters, I don't believe that any of the authentication presets will work for me.
I know that I can protect the inputs and outputs of various connectors. I have been entertaining the idea of storing the sensitive information in a SQL table that I query in each run, storing the values in variables that I pass to each of my custom connector's API calls.
Would this be a viable way of protecting sensitive data from being seen by people that may have access to my logic app? What is the most secure way I can pass these headers in each call?
There are not too many options within a (consumption) Logic App in this regard.
Your options with Logic Apps
A first step in the right direction is to put your sensitive information into an Azure Key Vault and use the corresponding connector in your Logic App to retrieve it from there. This is easier to implement and more secure than querying a SQL table for this purpose.
The second thing you can do is activate secure inputs for the connectors that make the API calls. This ensures that the sensitive information passed to these connectors is obfuscated in the run history of your Logic App and in connected services like Azure Log Analytics.
The problem with this approach is that anyone who has more than read permissions on your Logic App can simply deactivate the secure inputs setting or create a step that dumps the contents of your Key Vault. You can use RBAC to control access to your Logic App, but that of course means administrative overhead.
Alternative: API Management Service
If you absolutely want to allow other developers to change the Logic App without exposing API secrets to them, you might consider using some sort of middle tier to communicate with the API. Azure API Management (APIM) is one of the options here.
You would manage your sensitive information in a Key Vault and inject it via "Named Values" into your APIM instance. You can then add your API as a backend in APIM and expose it towards your Logic App.
The advantage here is that you can secure access to your API with APIM subscription keys that you can cycle frequently. You can also restrict the exposed surface of the original API to only those calls that need to be available to the Logic App.
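To illustrate (the hostname, path, and key are placeholders): the caller, such as the Logic App's HTTP action, only ever holds a subscription key, while the real UserName/Secret/ApiIntegrationCode headers are injected by an APIM policy from the Named Values:

```python
import requests

# Placeholder APIM facade URL; the subscription key is the only credential
# the caller ever sees, and it can be rotated independently of the API secrets.
response = requests.get(
    "https://<your-apim>.azure-api.net/backend/tickets",
    headers={"Ocp-Apim-Subscription-Key": "<subscription-key>"},
)
response.raise_for_status()
print(response.json())
```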
Whether APIM is something for you depends on your use case, as it comes at a price; even the Developer tier costs about $50/month: https://azure.microsoft.com/en-us/pricing/details/api-management/
Alternative: Azure Function
You can use a simple Azure Function that serves as a middle tier between your Logic App and your API. This function can be configured to pull the sensitive data from a Key Vault and can also be secured via a function access key that you can renew on a regular basis.
This is a dirt-cheap option if you are running the function on a Consumption plan: https://azure.microsoft.com/en-us/pricing/details/functions/
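A minimal sketch of such a proxy function (the backend URL and app setting names are placeholders; the settings can themselves be Key Vault references, so the secrets never appear in the Logic App or its run history):

```python
import os

import azure.functions as func
import requests

API_BASE = "https://api.example.com"  # placeholder backend API


def main(req: func.HttpRequest) -> func.HttpResponse:
    # Secrets live in app settings (optionally Key Vault references),
    # never in the Logic App definition or its run history.
    headers = {
        "UserName": os.environ["API_USERNAME"],
        "Secret": os.environ["API_SECRET"],
        "ApiIntegrationCode": os.environ["API_INTEGRATION_CODE"],
    }
    upstream = requests.get(API_BASE + "/tickets", headers=headers)
    return func.HttpResponse(upstream.text, status_code=upstream.status_code)
```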

Azure Table Storage for housing Application Configuration

I am exploring Azure functionality and am wondering if Azure Table Storage can be an easy way to hold application configuration for an entire environment. It would be easy to view and change (adding list values, etc.). Can someone please guide me on whether this is a good idea? I would expect this table to hold no more than 2,000 rows if all our applications were moved over to Azure.
Partition Key --> Project Name + Component Name (Azure Function/Logic App)
Row Key --> Parameter Key
Value column --> Parameter Value
For securing passwords/keys, I can use Azure Key Vault.
There are different ways of storing application configurations:
Key Vault (as you stated) for sensitive information, e.g. tokens, keys, connection strings. It can be standardized and extended to any type of resource for ease of storing and retrieving these values.
Application Settings, found under each App Service. This approach assumes you have an App Service for each of your apps.
Release pipeline variables, e.g. in Azure DevOps Services (AzDo). AzDo has variables that can be global to the release pipeline or specific to each stage.
I am exploring Azure functionality and am wondering if Azure Table Storage can be an easy way for holding application configuration for an entire environment. It would be easy to see and change (adding list values etc.). Can someone please guide me on whether this is a good idea?
Considering Azure Tables is a key/value store, it is certainly a good idea to store application configuration values there. The only thing I would recommend is incorporating some kind of caching layer between your application and Table Storage so that you don't end up making a call to Table Storage every time you need to fetch a setting.
I would expect this table to hold no more than 2000 rows if all our applications were moved over to Azure.
Considering the number of entities will be fewer than 2,000, querying should have no performance impact, and your design looks good. For best performance, ensure that you include both PartitionKey and RowKey when querying. At the very least, include PartitionKey in your query.
Please see this for more details: https://learn.microsoft.com/en-us/azure/cosmos-db/table-storage-design-guide.
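To make that concrete, here is a sketch using the azure-data-tables package against the schema proposed above (the connection string, table name, and entity names are placeholders), with a small in-process cache so repeat reads don't hit the table:

```python
from functools import lru_cache

from azure.data.tables import TableClient

# Placeholder connection string; "AppConfig" follows the schema proposed above.
client = TableClient.from_connection_string(
    "<storage-connection-string>", table_name="AppConfig"
)


@lru_cache(maxsize=None)
def get_setting(project_component: str, key: str) -> str:
    # Point query on PartitionKey + RowKey: the fastest lookup Table Storage
    # offers. lru_cache keeps repeat reads in memory instead of re-querying.
    entity = client.get_entity(partition_key=project_component, row_key=key)
    return entity["Value"]


# Example: timeout = get_setting("ProjectX-OrderFunction", "TimeoutSeconds")
```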
For securing password/keys, I can use the Azure Key Vault.
That's the way to go for storing sensitive data in Azure.
Have you looked at the App Configuration service?
There are client libraries in .NET, Java, TypeScript and Python to interact with the service that you can leverage in your application.
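For example, a minimal read with the Python client (the connection string and key name are placeholders):

```python
from azure.appconfiguration import AzureAppConfigurationClient

# Connection string comes from the App Configuration "Access keys" blade.
client = AzureAppConfigurationClient.from_connection_string("<connection-string>")

setting = client.get_configuration_setting(key="ProjectX:TimeoutSeconds")
print(setting.value)
```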

Azure Functions trigger for when a new key version is created or expired Key Vault

I'm trying to see if it is possible to set up a trigger within Azure Functions that will fire when Key Vault creates a new version of a key. It doesn't seem that there is a supported trigger at the moment, but I wanted to see if someone else has had this type of idea and might have a solution in mind.
The use case I had in mind was an on-premises cache of keys; we wanted an easy way to update/refresh the cache when key versions are created or expire so the data stays up to date.
Also, if this is a stupid idea, I'm open to suggestions of alternative ideas.
Azure Functions currently does not have support for Key Vault triggers. However, Key Vault has the ability to send its activity logs to an Event Hub, and there is support for Event Hub triggers in Azure Functions. I'm not sure if this would work for your use case, and I am not familiar with the SIEM pipeline, but here are some references that may help:
https://learn.microsoft.com/en-us/azure/security/security-azure-log-integration-keyvault-eventhub
https://mitra.computa.asia/articles/msdn-integrate-azure-logs-streamed-event-hubs-siem
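If you route Key Vault diagnostic/audit logs to an Event Hub, a function along these lines could react to new key versions (a sketch; the "KeyCreate" operation name follows the Key Vault AuditEvent log schema, and the cache refresh itself is left as a placeholder):

```python
import json
import logging

import azure.functions as func


def main(event: func.EventHubEvent) -> None:
    # Azure diagnostic logs arrive as a JSON payload with a "records" array.
    payload = json.loads(event.get_body().decode("utf-8"))
    for record in payload.get("records", []):
        # "KeyCreate" is logged when a new key version is created.
        if record.get("operationName") == "KeyCreate":
            logging.info("New key version created: %s", record.get("id"))
            # ... trigger the on-premises cache refresh here ...
```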
