Empty error while executing SSIS package in Azure Data Factory

I have created a simple SSIS project, and in this project I have a package that deletes a particular file in the Downloads folder.
I deployed this project to Azure, but when I try to execute the package using Azure Data Factory, the pipeline fails with an empty error (I am attaching the screenshot here).
What I have done so far to try to fix this error:
Added a self-hosted IR to the Azure-SSIS IR as a proxy to access the on-premises data.
Set ConnectByProxy to True.
Converted the project to the Project Deployment Model.
Please help me fix this error, and if you need more details, just leave a comment.

Windows Authentication:
To access data stores such as on-premises SQL Servers/file shares or Azure Files, select the Windows authentication check box.
If this check box is selected, fill in the Domain, Username, and Password fields with the values for your package execution credentials. For example, to access Azure Files, the domain is Azure, the username is <storage account name>, and the password is <storage account key>.
Using the secrets stored in your Azure Key Vault
As an alternative, you can use secrets from your Azure Key Vault as those values. To do so, select the AZURE KEY VAULT check box next to them. Create a new Key Vault linked service or select or edit an existing one, then choose the secret name and version for your value. When creating or editing your Key Vault linked service, you can pick or edit an existing key vault or create a new one. If you haven't already done so, grant the Data Factory managed identity access to your key vault. You can also enter your secret directly in the format <key vault linked service name>/<secret name>/<secret version>.
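For example, here is a minimal Az PowerShell sketch of storing the storage account key as a Key Vault secret and granting the Data Factory managed identity read access; the resource names and the $adfObjectId variable are placeholders for this illustration:

# Placeholder names: adjust the resource group, storage account, and vault to your environment.
$key = (Get-AzStorageAccountKey -ResourceGroupName "rg-ssis-demo" -Name "mystorageacct")[0].Value

# Store the key as a secret that the package execution credentials can reference.
Set-AzKeyVaultSecret -VaultName "kv-ssis-demo" -Name "fileshare-password" `
    -SecretValue (ConvertTo-SecureString $key -AsPlainText -Force)

# Let the Data Factory managed identity (object id in $adfObjectId) read secrets from the vault.
Set-AzKeyVaultAccessPolicy -VaultName "kv-ssis-demo" -ObjectId $adfObjectId `
    -PermissionsToSecrets get,list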
Note: If you are using Windows authentication, there are four methods to access data stores with Windows authentication from SSIS packages running on your Azure-SSIS IR: Access data stores and file shares with Windows authentication from SSIS packages in Azure | Docs.
Make sure your scenario falls under one of these methods; otherwise the execution can fail at run time.

Related

SqlPackage.exe deploy DACPAC with always encrypted and Key vault

I am trying to deploy a DACPAC using an Azure release pipeline.
The following are the methods I used:
Using the Azure SQL DACPAC task - with this approach, the following parameters are passed as additional properties for the deployment: AzureKeyVaultAuthMethod:ClientIdSecret /ClientId:'$(SERVICEPRINCIPALID)' /Secret:'$(SERVICEPRINCIPALKEY)' (a rough sketch of the full SqlPackage call is shown after the second method below). With the diagnostics log enabled, I get the following error:
SqlPackage build version - 16.0.6161.0
Using a PowerShell script - when using a PowerShell script to do the DACPAC deployment, I get the following error:
Failed to decrypt a column encryption key. Invalid key store provider name: 'AZURE_KEY_VAULT'. A key store provider name must denote either a system key store provider or a registered custom key store provider. Valid system key store provider names are: 'MSSQL_CERTIFICATE_STORE', 'MSSQL_CNG_STORE', 'MSSQL_CSP_PROVIDER'. Valid (currently registered) custom key store provider names are: . Please verify key store provider information in column master key definitions in the database, and verify all custom key store providers used in your application are registered properly.
Logs attached here
SqlPackage build version - 15.0.5472.2
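For reference, here is a rough sketch of the full SqlPackage publish call with those Key Vault parameters as run from PowerShell; the server, database, file, and credential values are placeholders, and parameter support varies by SqlPackage version:

# Placeholder paths, names, and credentials; run where SqlPackage.exe is available on the agent.
& SqlPackage.exe /Action:Publish `
    /SourceFile:"MyDatabase.dacpac" `
    /TargetServerName:"myserver.database.windows.net" `
    /TargetDatabaseName:"MyDatabase" `
    /TargetUser:"sqladmin" /TargetPassword:"<sqlPassword>" `
    /AzureKeyVaultAuthMethod:ClientIdSecret `
    /ClientId:"<servicePrincipalAppId>" /Secret:"<servicePrincipalSecret>"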
As a prerequisite before deploying the DACPAC, I use a PowerShell script to create the keys and then insert them into the DB. The contents of the script are below.
All the deployments happen through a service principal, and it has admin-level access to all the resources in the Azure AD.
Am I missing any steps here before the deployment of the DACPAC through the release pipeline?
Thanks,
Nandan

Creating Multiple Environment Parameters for Azure Data Factory Linked Services

I have a requirement where I need to point our DEV Azure Data Factory to a Production Azure SQL database and also have the ability to switch the data source back to the Dev database should we need to.
I've been looking at creating parameters against the linked services but I'm unsure of the best approach.
Should I create parameters as follows and choose the relevant parameters depending on the environment I want to pull data from?
DevFullyQualifiedDomainName
ProdFullyQualifiedDomainName
DevDatabaseName
ProdDatabaseName
DevUserName
ProdUserName
Thanks
Any sort of trigger can also have parameters attached to it. Check out the following example, assuming you have a custom event trigger and SQL Server as a source:
Create a string parameter for the database name field when setting up the SQL Server linked service used by the dataset.
Create a new parameter in the dataset and assign that dataset parameter to the linked service parameter; this is what will carry the trigger data.
A custom event trigger can parse and deliver a custom data payload to your pipeline. You define the pipeline parameters and then populate the values on the Parameters page. To parse the data payload and pass values to the pipeline parameters, use the format @triggerBody().event.data.keyName.
Refer to the official Microsoft documentation:
Reference trigger metadata in pipelines
System variables in custom event trigger
When you use the dataset in a pipeline activity's source, it will ask you for the dataset parameter. There, use dynamic content and choose the pipeline parameter containing the trigger data.
I would suggest using Azure Key Vault for that.
Create an Azure Key Vault for each environment (dev, prod, etc.).
Create secrets inside both key vaults with the same name but different values (see the sketch after this list).
For example, for the database server name, create the same secret "database-server" in both the dev and prod key vaults, with each value holding the connection string of the dev and prod server respectively, in the following format:
integrated security=False;encrypt=True;connection timeout=30;data source=<serverName>.database.windows.net;initial catalog=<databaseName>;user id=<userName>;password=<loginPassword>
In your Azure Data Factory, create a Key Vault linked service pointing to your key vault.
In your Azure Data Factory, create a new Azure SQL Database linked service, selecting the Key Vault linked service created in the previous step and the secret created in step 2.
Now you can easily switch between dev and prod by simply adjusting your Key Vault linked service to point to the desired environment.
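A minimal Az PowerShell sketch of step 2, assuming the vaults are named kv-adf-dev and kv-adf-prod (the connection string values are placeholders):

# Same secret name in both vaults, different values per environment (all names are examples).
$devConn = "integrated security=False;encrypt=True;connection timeout=30;data source=dev-server.database.windows.net;initial catalog=MyDb;user id=devUser;password=<devPassword>"
$prodConn = "integrated security=False;encrypt=True;connection timeout=30;data source=prod-server.database.windows.net;initial catalog=MyDb;user id=prodUser;password=<prodPassword>"

Set-AzKeyVaultSecret -VaultName "kv-adf-dev" -Name "database-server" `
    -SecretValue (ConvertTo-SecureString $devConn -AsPlainText -Force)
Set-AzKeyVaultSecret -VaultName "kv-adf-prod" -Name "database-server" `
    -SecretValue (ConvertTo-SecureString $prodConn -AsPlainText -Force)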
Have fun ;)
Reference:
https://learn.microsoft.com/en-us/azure/data-factory/store-credentials-in-key-vault

Azure Key Vault connection to Azure SQL

I have successfully set up a Linked Service in Azure Data Factory that uses a Key Vault for the connection string, which includes the user/password, and connects to the Azure SQL DB as desired. However, I can only do this when I use the "admin" account. The string below works.
Server=tcp:database1.database.windows.net,1433;Initial Catalog=DB;Persist Security Info=False;User ID=Admin;Password=Pa$$w0rd;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
I created a new login/user and granted the necessary permissions. I know this because I can connect using the new login via remote SSMS or by adding the credentials directly in the linked service in Azure (e.g. hard-coding the user/password in the connection string in the linked service).
Unfortunately, when I switch to using the Key Vault connection string, I get the generic SQLErrorNumber 18456 for the newly created user. I know the credentials are correct, and I know I can connect via the Key Vault (when using the elevated admin account); I just cannot use the Key Vault connection string with the new user.
Server=tcp:database1.database.windows.net,1433;Initial Catalog=DB;Persist Security Info=False;User ID=Username;Password=Pa$$w0rd;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
If you are using Azure SQL with Data Factory, look at using Managed Service Identity. That way you add the name of the Data Factory directly to the SQL DB as a user, with no need for a username and password, and you can assign permissions directly on that user.
The only downside is that if the Data Factory gets wiped out and redeployed, the user will need to be dropped and recreated, since it uses a thumbprint to recognize the identity; this isn't the case with all resources and MSI auth.
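If you go the managed identity route, here is a rough sketch of creating the contained user for the Data Factory; the factory name, server, and database are placeholders, it must be run as the Azure AD admin of the server, and the -AccessToken parameter needs a recent SqlServer module:

# "MyDataFactory" must match the Data Factory's name (its system-assigned identity); placeholders throughout.
$query = @"
CREATE USER [MyDataFactory] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [MyDataFactory];
ALTER ROLE db_datawriter ADD MEMBER [MyDataFactory];
"@

# Get a token for Azure SQL as the signed-in Azure AD admin and run the statements.
$token = (Get-AzAccessToken -ResourceUrl "https://database.windows.net/").Token
Invoke-Sqlcmd -ServerInstance "database1.database.windows.net" -Database "DB" `
    -AccessToken $token -Query $query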
For your specific case, also check that the SQL server allows Azure services and resources to access the server: go to "Firewalls and virtual networks" and make sure the setting is turned on.

Azure Backup unable to delete backup items

I used the Azure Backup client (MARS) to back up a server. The server no longer exists. In the Azure portal I am unable to delete the vault because the resource group contains backup items.
I tried using PowerShell, but Az.RecoveryServices is not meant to be used for the MARS BackupManagementType. You can run Get-AzureRmRecoveryServicesBackupContainer, but Get-AzureRmRecoveryServicesBackupItem then fails because there is no WorkLoadType for MARS.
So I can't delete the backup items from the portal, I can't delete the backup items using PowerShell, and since the server no longer exists I can't use the MARS agent to delete the items.
You can't delete a Recovery Services vault that has servers registered in it, or that holds backup data.
To gracefully delete a vault, unregister servers it contains, remove vault data, and then delete the vault.
If you try to delete a vault that still has dependencies, an error message is issued, and you will need to manually remove the vault dependencies, including:
Backed up items
Protected servers
Backup management servers (Azure Backup Server, DPM)
Refer to this article for detailed info: https://learn.microsoft.com/en-us/azure/backup/backup-azure-delete-vault
Note: You can use the Cloud Shell available in the portal to do this. Please select PowerShell after you launch Cloud Shell.
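A rough Cloud Shell sketch using the Az.RecoveryServices module (the vault and resource group names are placeholders, and support for unregistering MARS/MAB containers depends on the module version):

# Placeholder vault and resource group names.
$vault = Get-AzRecoveryServicesVault -ResourceGroupName "myResourceGroup" -Name "myVault"
Set-AzRecoveryServicesVaultContext -Vault $vault

# List the MARS (MAB) registered servers and unregister them, which removes their backup items.
$containers = Get-AzRecoveryServicesBackupContainer -ContainerType Windows -BackupManagementType MAB -VaultId $vault.ID
foreach ($container in $containers) {
    Unregister-AzRecoveryServicesBackupContainer -Container $container -VaultId $vault.ID
}

# Once the vault no longer holds backup items or registered servers, it can be deleted.
Remove-AzRecoveryServicesVault -Vault $vault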
Kindly let us know if the above helps or you need further assistance on this issue.

Azure Function in-portal editor is unreachable, can't access through Kudu

I am unable to access my Azure Function, which I created in the portal, and I can't get to Kudu.
I created the function in the portal and I don't have a backup of the code I created there. I need to get access to the code.
I did change the Azure Storage keys that were associated with the function, as new keys were generated for some reason.
Double-check whether the WEBSITE_CONTENTAZUREFILECONNECTIONSTRING app setting has the right connection string, then restart the site. You can also go to Azure Files (using the Azure portal) to see/download your content.
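A small Az PowerShell sketch for checking and fixing that setting after a key rotation; the function app name, resource group, and storage account values are placeholders:

# Placeholder names; requires the Az.Functions module.
$settings = Get-AzFunctionAppSetting -Name "myFunctionApp" -ResourceGroupName "myResourceGroup"
$settings["WEBSITE_CONTENTAZUREFILECONNECTIONSTRING"]   # inspect the current value

# Point the setting at the rotated key, then restart so the new value takes effect.
$conn = "DefaultEndpointsProtocol=https;AccountName=<storageAccount>;AccountKey=<newKey>;EndpointSuffix=core.windows.net"
Update-AzFunctionAppSetting -Name "myFunctionApp" -ResourceGroupName "myResourceGroup" `
    -AppSetting @{ WEBSITE_CONTENTAZUREFILECONNECTIONSTRING = $conn }
Restart-AzFunctionApp -Name "myFunctionApp" -ResourceGroupName "myResourceGroup" -Force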
