Azure Logic App SubscriptionNotFound error

With the intention of building a generic logic app solution via Azure Portal, I've supplied all config to the logic app connectors as parameters sourced from a database.
The only problem I've run into, though, is that the custom value supplied as the subscription for the ADF pipeline run does not work. The exact same value works when selected explicitly from the dropdown. The other values, such as the resource group, data factory name and data factory pipeline name, work fine when populated from the parameters.
Is there a specific way to do this? Or is this a bug in the Logic App ADF connector?

Do not put the subscription name into the Subscription input box; it will not execute successfully that way. Put the subscription ID into the Subscription input box instead.
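If you need to look up the subscription ID programmatically rather than copying it from the portal, here is a minimal sketch using the Azure Python SDK (assuming azure-identity and azure-mgmt-resource are installed and a credential is available, e.g. via az login):

```python
# Sketch: list subscription IDs so the ID (not the display name) can be
# supplied to the Logic App / ADF connector's Subscription field.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import SubscriptionClient

credential = DefaultAzureCredential()
client = SubscriptionClient(credential)

for sub in client.subscriptions.list():
    # Use sub.subscription_id (the GUID), not sub.display_name.
    print(sub.subscription_id, "->", sub.display_name)
```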

Related

Creating Multiple Environment Parameters for Azure Data Factory Linked Services

I have a requirement where I need to point our DEV Azure Data Factory to a Production Azure SQL database and also have the ability to switch the data source back to the Dev database should we need to.
I've been looking at creating parameters against the linked services but unsure of the best approach.
Should I create parameters as follows and choose the relevant parameters depending on the environment I want to pull data from?
DevFullyQualifiedDomainName
ProdFullyQualifiedDomainName
DevDatabaseName
ProdDatabaseName
DevUserName
ProdUserName
Thanks
Any sort of trigger can also have parameters attached to it. Check out the following example, assuming you have a custom event trigger and SQL server as a source:
Create a string parameter for the database name field when setting up the SQL Server linked service for the dataset.
Create a new parameter in the dataset and assign it to that same linked service parameter; this is what will carry the trigger data.
A custom event trigger can parse and deliver a custom data payload to your pipeline. You define the pipeline parameters and then populate the values on the Parameters page. To parse the data payload and pass values to the pipeline parameters, use the format @triggerBody().event.data.keyName.
See the Microsoft documentation for reference:
Reference trigger metadata in pipelines
System variables in custom event trigger
When you use the dataset in a pipeline activity's source, it will prompt you for the dataset parameter. There, use dynamic content and choose the pipeline parameter containing the trigger data.
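As a hedged illustration of the mapping described above, the snippet below mirrors, as plain Python dicts and strings, how the trigger-to-pipeline parameter mapping and the dynamic-content expression might look; the names CopyFromSql, sourceTableName and keyName are made up for illustration:

```python
import json

# Hypothetical trigger-to-pipeline mapping: the custom event trigger passes a
# key from the event payload into a pipeline parameter; the pipeline then
# feeds it to the dataset/linked-service parameter.
trigger_pipeline_reference = {
    "pipelineReference": {"referenceName": "CopyFromSql", "type": "PipelineReference"},
    "parameters": {
        # Pulls keyName out of the custom event's data payload.
        "sourceTableName": "@triggerBody().event.data.keyName"
    },
}

# Inside the pipeline, the source dataset's parameter is set via dynamic
# content to the pipeline parameter:
dataset_parameter_value = "@pipeline().parameters.sourceTableName"

print(json.dumps(trigger_pipeline_reference, indent=2))
print(dataset_parameter_value)
```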
I would suggest using Azure Key Vault for that.
Create an Azure Key Vault for each environment (dev, prod, etc.)
Create secrets inside both key vaults with the same name but different values.
For example, for the database server name, create the same secret "database-server" in both dev and prod key vaults but with the correct value representing the connection string of the dev and prod server respectively, in the following format:
integrated security=False;encrypt=True;connection timeout=30;data source=<serverName>.database.windows.net;initial catalog=<databaseName>;user id=<userName>;password=<loginPassword>
In your Azure Data Factory, create a Key Vault linked service pointing to your key vault.
In your Azure Data Factory, create a new Azure SQL Database linked service, selecting the Key Vault linked service created in the previous step and the secret created in step 2.
Now you can easily switch between dev and prod by simply adjusting your Key Vault linked service to point to the desired environment.
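Reading the per-environment secret from code follows the same idea; a minimal sketch with azure-identity and azure-keyvault-secrets (the vault URLs are placeholders):

```python
# Sketch: the same secret name ("database-server") exists in both vaults, so
# switching environments is just a matter of which vault URL you point at.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULTS = {
    "dev": "https://my-dev-vault.vault.azure.net",
    "prod": "https://my-prod-vault.vault.azure.net",
}

def get_connection_string(environment: str) -> str:
    client = SecretClient(vault_url=VAULTS[environment],
                          credential=DefaultAzureCredential())
    return client.get_secret("database-server").value

print(get_connection_string("dev"))
```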
Have fun ;)
Reference:
https://learn.microsoft.com/en-us/azure/data-factory/store-credentials-in-key-vault

How can I be notified if someone creates a new database in Azure?

I would like to set up an Azure alert for when someone on our team sets up an Azure database. Once alerted, I want to have an additional alert created if that resource is running for more than a certain amount of time.
My solution is to create an Alert Rule on the storage account and have it send an email. Where I'm running into trouble is how to monitor the database, since it just got created and I don't know the name yet for the second Alert rule that will monitor its uptime.
Is there some programmatic way to determine the database resource name?
If you don't want to invest time in the programmatic way, there is an option to configure an alert at the resource group level based on resource type. Configure the alert rule as follows:
Scope - Select the right subscription, filter by resource type (e.g. SQL Database) and, if required, filter by location.
Condition - In the Select condition / Signal type dropdown, select "Create/Update Azure SQL Database"; in the alert logic you can provide additional filtering.
Actions - Choose an existing action group or create a new one based on your requirements.
Details - Add the alert rule details such as rule name, description, etc.
Finally, create the alert rule.
Once the alert rule is created, you will be notified whenever a new Azure SQL Database is created, based on the configured alert.
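If you do want the programmatic route the question asks about (discovering the name of a newly created database), a minimal sketch with azure-mgmt-sql (the subscription ID is a placeholder):

```python
# Sketch: enumerate SQL servers and databases in a subscription to discover
# resource names for a follow-up alert rule.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

subscription_id = "<subscription-id>"
client = SqlManagementClient(DefaultAzureCredential(), subscription_id)

for server in client.servers.list():
    # The resource group name is embedded in the server's resource ID:
    # /subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Sql/servers/<name>
    resource_group = server.id.split("/")[4]
    for db in client.databases.list_by_server(resource_group, server.name):
        print(f"{server.name}/{db.name}")
```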
According to the official doc, you can use Event Grid to notify Azure Automation when a SQL database is created.
https://learn.microsoft.com/en-au/azure/event-grid/overview#ops-automation
Once you subscribe, you can use Logic Apps to send you an email for example.
About the second part, you'll need to query the metrics and figure out if it's running (is performing compute) or not.
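For that second part, one hedged option is to query the database's cpu_percent metric with azure-mgmt-monitor and treat sustained activity as "running"; the resource ID components and time window below are placeholders:

```python
# Sketch: pull hourly cpu_percent averages for a database and report any
# points that show compute activity.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

subscription_id = "<subscription-id>"
resource_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/<rg>"
    "/providers/Microsoft.Sql/servers/<server>/databases/<db>"
)

client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)
metrics = client.metrics.list(
    resource_id,
    timespan="2021-06-01T00:00:00Z/2021-06-02T00:00:00Z",  # placeholder window
    interval="PT1H",
    metricnames="cpu_percent",
    aggregation="Average",
)

for metric in metrics.value:
    for series in metric.timeseries:
        for point in series.data:
            if point.average and point.average > 0:
                print("Compute activity at", point.time_stamp, "avg CPU %:", point.average)
```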

How to find the value for aadSessionkey when deploying a Kubernetes template in Azure DevOps

I am trying to use a template to deploy a managed Kubernetes cluster (AKS). My problem is that the template has a parameter aadSessionKey that I seem to be unable to locate.
I assume the expanded name of the parameter is Azure AD SessionKey. When I look in the portal, I can see that my Azure AD has a Name, Application ID and Object ID, but nothing that looks like a session key, nor a way to generate such a thing.
I am using a free trial account if that matters.
Try entering any random value and deploying. This appears to be a system-generated value that is not meant to be filled in by clients; it is presumably present in the template for some other reason.
Ref - https://twitter.com/ashtonkj/status/1196384865672925184

Azure Data Factory: event not starting pipeline

I've set up an Azure Data Factory pipeline containing a copy activity. For testing purposes both source and sink are Azure Blob Storage accounts.
I want to execute the pipeline as soon as a new file is created in the source Azure Blob Storage.
I've created a trigger of type BlobEventsTrigger. Blob path begins with has been set to //
I use Cloud Storage Explorer to upload files but it doesn't trigger my pipeline. To get an idea of what is wrong, how can I check if the event is fired? Any idea what could be wrong?
Thanks
Reiterating what others have stated:
Must be using a V2 Storage Account
Trigger name must only contain letters, numbers and the '-' character (this restriction will soon be removed)
Must have registered subscription with Event Grid resource provider (this will be done for you via the UX soon)
The trigger makes the following properties available: @triggerBody().folderPath and @triggerBody().fileName. To use these in your pipeline you must map them to pipeline parameters and reference them as such: @pipeline().parameters.parametername.
Finally, based on your configuration, setting Blob path begins with to // will not match any blob event. The UX will actually show you an error message saying that value is not valid. Please refer to the Event Based Trigger documentation for examples of valid configuration.
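As a hedged illustration of a valid configuration, the same kind of trigger expressed through the azure-mgmt-datafactory Python SDK might look roughly like this; every name below is a placeholder, and the model/operation names are as I understand the SDK, so double-check them against your SDK version:

```python
# Sketch: a blob event trigger whose "blob path begins with" targets a real
# container/folder instead of //.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobEventsTrigger,
    PipelineReference,
    TriggerPipelineReference,
    TriggerResource,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

trigger = BlobEventsTrigger(
    blob_path_begins_with="/mycontainer/blobs/input/",  # /<container>/blobs/<folder>
    events=["Microsoft.Storage.BlobCreated"],
    scope="/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>",
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(reference_name="CopyNewFile"),
            parameters={"fileName": "@triggerBody().fileName"},
        )
    ],
)

client.triggers.create_or_update("<rg>", "<factory-name>", "CopyOnBlobCreated",
                                 TriggerResource(properties=trigger))
```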
Please refer to the link below. First, it needs to be a v2 storage account. Second, you need to register it with Event Grid.
https://social.msdn.microsoft.com/Forums/azure/en-US/db332ac9-2753-4a14-be5f-d23d60ff2164/azure-data-factorys-event-trigger-for-pipeline-not-working-for-blob-creation-deletion-most-of-the?forum=AzureDataFactory
There seems to be a bug with the blob storage trigger: if more than one trigger is allocated to the same blob container, none of the triggers will fire.
For some reason (another bug, this time in Data Factory?), if you edit your trigger several times in the Data Factory window, Data Factory seems to lose track of the triggers it creates, and your single trigger may end up creating multiple duplicate triggers on the blob storage. This condition activates the bug described above: the blob storage trigger no longer fires.
To fix this, delete the duplicate triggers. For that, navigate to your blob storage resource in the Azure portal. Go to the Events blade. From there you'll see all the triggers that the data factories added to your blob storage. Delete the duplicates.
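A hedged sketch of doing the same enumeration (and, if needed, the clean-up) with azure-mgmt-eventgrid, so the duplicates can be spotted without clicking through the portal; names are placeholders and the delete call is shown commented out:

```python
# Sketch: list the Event Grid subscriptions attached to a storage account so
# duplicate ADF-created triggers can be identified and removed.
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventgrid import EventGridManagementClient

subscription_id = "<subscription-id>"
client = EventGridManagementClient(DefaultAzureCredential(), subscription_id)

storage_account_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<account>"
)

for es in client.event_subscriptions.list_by_resource(
        "<rg>", "Microsoft.Storage", "storageAccounts", "<account>"):
    print(es.name)
    # To remove a duplicate (long-running operation):
    # client.event_subscriptions.begin_delete(storage_account_id, es.name).result()
```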
As of 20.06.2021, same for me: the event trigger is not firing, even though when editing its definition in ADF it lists all the files in the folder that match. But when I add a new file to that folder, nothing happens!
If you're creating your trigger via ARM template, be aware of this bug: the "runtimeState" (aka "Activated") property of the trigger can only be set to "Stopped" via ARM template. The trigger will need to be activated via PowerShell or the ADF portal.
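For reference, a hedged sketch of activating the trigger programmatically after the ARM deployment, using azure-mgmt-datafactory (PowerShell's Start-AzDataFactoryV2Trigger does the same job; on older SDK versions the method may be named start rather than begin_start):

```python
# Sketch: start (activate) a trigger after an ARM template deployment, since
# runtimeState can only be deployed as "Stopped". Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
client.triggers.begin_start("<rg>", "<factory-name>", "<trigger-name>").result()
```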
The Event Grid resource provider needs to have been registered within the specific Azure subscription.
Also, if you use Synapse Studio pipelines instead of Data Factory (like me), make sure the Data Factory resource provider is also registered.
Finally, the user should have both the 'Owner' and 'Storage Blob Data Contributor' roles on the storage account.
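Registering the providers mentioned above can also be scripted; a minimal sketch with azure-mgmt-resource (equivalent to az provider register):

```python
# Sketch: register the Event Grid (and, for Synapse users, Data Factory)
# resource providers in the target subscription.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")
for namespace in ("Microsoft.EventGrid", "Microsoft.DataFactory"):
    provider = client.providers.register(namespace)
    print(namespace, provider.registration_state)
```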

multiple key vault references in same ARM template

I'm trying to deploy an ARM template with conditional logic to use two different key vaults depending on the input. Each exists in a different subscription, which is the issue.
(I'm using one template for prod and dev and deploying to different subscriptions accordingly)
Master template variables:
1. Key vault 1: /subid1/xxxxx/keyvault
2. Key vault 2: /subid2/xxxx/keyvault
Nested template:
"[if(equals(x, y), '/subid1/xxxxx/keyvault', '/subid2/xxxx/keyvault')]"
So when deploying into subscription 2 (subid2), as an example, the error is:
Code=KeyVaultParameterReferenceNotInTheSameTenant; Message=The specified KeyVault /subid1/xxxxx/keyvault is not in current tenant.
So I get why the error message is flagging (I've declared a variable pointing at another sub), but how can I get validation to check only what's actually being deployed, as opposed to all the declared variables? Or is there another way to achieve the same goal?
Thanks,
More a workaround than an answer, but I just declared the key vault as a parameter and input a different value via the VSTS build definition. Not ideal, as I wanted selecting DEV to flow through all the pertinent settings, but it works, so closing.
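A hedged sketch of that workaround: generate a per-environment ARM parameters file from the build, pointing the Key Vault parameter reference at the right vault, so only the selected environment's vault is ever referenced during validation. The vault resource IDs, parameter name and secret name below are placeholders:

```python
# Sketch: emit an ARM template parameters file for the chosen environment.
import json
import sys

VAULT_IDS = {
    "dev": "/subscriptions/<subid1>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<dev-vault>",
    "prod": "/subscriptions/<subid2>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<prod-vault>",
}

def build_parameters(environment: str) -> dict:
    return {
        "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
        "contentVersion": "1.0.0.0",
        "parameters": {
            "sqlAdminPassword": {
                "reference": {
                    "keyVault": {"id": VAULT_IDS[environment]},
                    "secretName": "sql-admin-password",
                }
            }
        },
    }

if __name__ == "__main__":
    env = sys.argv[1] if len(sys.argv) > 1 else "dev"
    print(json.dumps(build_parameters(env), indent=2))
```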
