Is there any Azure service that can simulate the concept of a global Azure 'variable' to hold a single value?

I am looking for an Azure service that can store a single value which I can then fetch from any other Azure service. It's basically storage, but extremely lightweight: it should let me define a variable for a given subscription whose value can be updated from any other Azure service. Azure Data Factory recently introduced global parameters at the data factory level; even this could serve the purpose to a limited extent if it were mutable, but it's a parameter, not a variable, so its value can't be updated. A solution that works only within Data Factory would be fine too. One could always store such a value in SQL or blob storage, but that sounds like overkill. Having a global Azure variable is a genuine requirement, so I'm wondering if there is anything like that.

Please consider Azure Key Vault. You can define a secret there to hold this value. However, I'm not sure what integration with other Azure services you need.
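A minimal sketch of that idea in Python, assuming a hypothetical vault named my-vault and a secret named global-counter; setting the secret again just creates a new version, so the value is effectively mutable:

```python
# pip install azure-identity azure-keyvault-secrets
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Hypothetical vault URL; any service with vault access can do the same.
client = SecretClient(
    vault_url="https://my-vault.vault.azure.net",
    credential=DefaultAzureCredential(),
)

# "Write" the global variable; each set_secret call creates a new version.
client.set_secret("global-counter", "42")

# "Read" it back from any other Azure service with access to the vault.
print(client.get_secret("global-counter").value)  # -> "42"
```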

You have several options:
Cosmos DB Table API
Azure Cache for Redis
Table Storage
ref: https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-overview#keyvalue-stores
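For the Table Storage route, here is a minimal sketch in Python, assuming a hypothetical table named globals and a placeholder connection string; a single entity acts as the "variable":

```python
# pip install azure-data-tables
from azure.data.tables import TableServiceClient

# Placeholder connection string; a real one comes from the storage account.
service = TableServiceClient.from_connection_string("<storage-connection-string>")
table = service.create_table_if_not_exists("globals")

# Upsert the single-row "variable"; PartitionKey/RowKey identify it.
table.upsert_entity({
    "PartitionKey": "config",
    "RowKey": "global-counter",
    "Value": "42",
})

# Any other service with access can read (or update) the same entity.
entity = table.get_entity(partition_key="config", row_key="global-counter")
print(entity["Value"])  # -> "42"
```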

Related

Parameterize Linked Services Azure Data Factory

I have a database username, server name, host, and other details stored in a table. I want to make a linked service that can read the connection details from this table and store them in parameters.
As of now I am hardcoding these details in parameters created in the linked service, but I want a generic linked service that can take its details from a table or from a pipeline parameter.
AFAIK, there is no feature in Azure Data Factory that allows you to parameterize a linked service or a pipeline with values stored in an outside table or file. You need to define the values in ADF itself.
The standard and only possible way is to parameterize a linked service and pass dynamic values at run time by defining the values in ADF. For example, if you want to connect to different databases on the same logical SQL server, you can parameterize the database name in the linked service definition. This saves you from having to create a linked service for each database on the logical SQL server.
You can use parameters to pass external values into pipelines, datasets, linked services, and data flows. Once the parameter has been passed into the resource, it cannot be changed. By parameterizing resources, you can reuse them with different values each time. Parameters can be used individually or as a part of expressions. JSON values in the definition can be literal or expressions that are evaluated at runtime.
The official document, Parameterize linked services in Azure Data Factory, covers the complete fundamentals.
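As a rough illustration of the same idea through the Python management SDK (all resource names here are hypothetical), a linked service can declare a DBName parameter and reference it in the connection string with the @{linkedService().DBName} expression:

```python
# pip install azure-identity azure-mgmt-datafactory
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureSqlDatabaseLinkedService,
    LinkedServiceResource,
    ParameterSpecification,
)

# Hypothetical subscription, resource group, and factory names.
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

linked_service = LinkedServiceResource(
    properties=AzureSqlDatabaseLinkedService(
        # The database name is resolved at run time from the DBName parameter.
        connection_string=(
            "Server=tcp:myserver.database.windows.net,1433;"
            "Database=@{linkedService().DBName};"
        ),
        parameters={"DBName": ParameterSpecification(type="String")},
    )
)

client.linked_services.create_or_update(
    "<resource-group>", "<factory-name>", "AzureSqlDatabase", linked_service
)
```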

How to remove special characters from XML stored in ADLS using Azure data factory or any other option?

I have a scenario where I need to remove some characters from XML tags stored in ADLS. I am looking for an option with ADF. Can someone help me with the approach I should follow?
This is not possible with ADF alone, but you can write a piece of code to do it in Azure Functions. Azure Data Factory does data movement and data transformation only; editing the tags inside an XML file does not fall under that.
You may use the Azure Function activity in a Data Factory pipeline to run Azure Functions. To launch an Azure Function, you must first set up a linked service connection and an activity that specifies the Azure Function you want to execute.
The Microsoft documentation on the Azure Function activity in ADF has deeper insights.
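A minimal sketch of the cleanup step such a Function could run, assuming hypothetical container and blob names and that "special characters" means characters invalid in XML 1.0; adjust the regex to whatever characters you actually need stripped:

```python
# pip install azure-storage-blob
import re

from azure.storage.blob import BlobClient

# Hypothetical connection string, container, and blob names.
blob = BlobClient.from_connection_string(
    conn_str="<adls-connection-string>",
    container_name="raw",
    blob_name="input.xml",
)

xml_text = blob.download_blob().readall().decode("utf-8")

# Strip characters that are not valid in XML 1.0 content.
cleaned = re.sub(r"[^\x09\x0A\x0D\x20-\uD7FF\uE000-\uFFFD]", "", xml_text)

blob.upload_blob(cleaned, overwrite=True)
```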

Is it possible to store variables in Azure Data Factory pipelines?

In my Azure Data Factory pipeline, I want to use a variable that gets updated on each run and is also read on each run. At the moment I am using a database to achieve that, but it would be much simpler if Azure Data Factory provided a way of storing variables. So my question is: is there any such facility in Azure Data Factory?
As @Joel Cochran says, ADF doesn't support persisting a variable across pipeline runs. We need to write the value to storage, e.g. a database or Azure Storage, and then use a Lookup activity to read it back from the blob storage file or the DB. :)
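As a minimal sketch of that pattern (hypothetical container and blob names), a tiny blob can act as the persisted variable: each run reads the previous value and writes the updated one back. The read side could equally be a Lookup activity in ADF pointed at this blob:

```python
# pip install azure-storage-blob
from azure.core.exceptions import ResourceNotFoundError
from azure.storage.blob import BlobClient

# Hypothetical names; this blob plays the role of the pipeline variable.
blob = BlobClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="state",
    blob_name="run-counter.txt",
)

# Read the previous value, defaulting to 0 on the very first run.
try:
    previous = int(blob.download_blob().readall())
except ResourceNotFoundError:
    previous = 0

# Update and write back, overwriting the old value.
blob.upload_blob(str(previous + 1), overwrite=True)
```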

Does Azure automatically replicate existing objects when the storage redundancy type is changed?

Does Azure automatically replicate existing blobs when we change the storage redundancy type from, say, LRS to ZRS/GRS, or do we have to replicate these objects manually? I know AWS does not auto-replicate, so I want to know if the same applies to Azure, as we need to implement this in one of our existing solutions.
It is not automatic: to move existing data to the new redundancy type, you need to either perform a manual migration or request that Microsoft perform a live migration.
For details see: https://learn.microsoft.com/en-us/azure/storage/common/redundancy-migration?tabs=portal
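Where the linked doc allows a customer-initiated conversion, the request itself is just an update of the account's SKU. A rough sketch with the Python management SDK, using hypothetical subscription, resource group, and account names:

```python
# pip install azure-identity azure-mgmt-storage
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountUpdateParameters

# Hypothetical subscription ID; the credential needs write access.
client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Request the redundancy change by updating the SKU, e.g. LRS -> GRS.
client.storage_accounts.update(
    "<resource-group>",
    "<account-name>",
    StorageAccountUpdateParameters(sku=Sku(name="Standard_GRS")),
)
```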

Azure Functions - Table Storage Trigger with Azure Functions

I need a way to trigger an Azure Function when an entity is added to Azure Table Storage. Is there a way to do this? When I tried to add a new Azure Function, I did not see any Azure Table Storage trigger; I only see Queue and Blob triggers available.
If there is no support for an Azure Table Storage trigger, should I use an HTTP trigger and have Azure Table Storage as an input binding?
Thanks
There is no trigger binding for Table Storage.
Here's a detailed view of what is supported by the different bindings available today:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings#overview
If there is no support for an Azure Table Storage trigger, should I use an HTTP trigger and have Azure Table Storage as an input binding?
Yes, this approach would work and would allow you to pass the table data as an input while relying on a separate trigger. Depending on the type of clients you are working with and your requirements, using a queue trigger is another good option.
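A minimal sketch of that combination using the Python v2 programming model, with hypothetical queue and table names; the queue message carries the RowKey of the entity to read:

```python
# requirements.txt: azure-functions
import json
import logging

import azure.functions as func

app = func.FunctionApp()

# Hypothetical queue and table names; "AzureWebJobsStorage" is the
# Function App's default storage connection setting.
@app.queue_trigger(arg_name="msg", queue_name="rowkeys",
                   connection="AzureWebJobsStorage")
@app.table_input(arg_name="row", table_name="globals",
                 partition_key="config", row_key="{queueTrigger}",
                 connection="AzureWebJobsStorage")
def process(msg: func.QueueMessage, row: str) -> None:
    # The table input binding delivers the matching entity as JSON text.
    entity = json.loads(row)
    logging.info("Entity for RowKey %s: %s", msg.get_body().decode(), entity)
```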
@venki What Fabio Cavalcante said is true: Azure Functions doesn't have a trigger option for Table Storage. But if your business needs to store the data in Table Storage and you, as a developer, decide to use Azure Functions in your architecture, you can configure your Function to take the data coming from Table Storage as an input. This works really well.
There is also another way to give your Function an "automagic" trigger: use a Storage Queue (for smaller workloads) or Service Bus (when the business needs a more robust mechanism).
