Azure Feature Pack on Azure DevOps Hosted Agent?

I have an SSIS project that uses the Azure Storage Connection Manager from the SSIS Azure Feature Pack. When this connection is included in the project, the Azure Pipeline build fails on a vs2017 hosted agent with the following message:
System.ArgumentException: Value does not fall within the expected range.
   at Microsoft.SqlServer.Dts.Runtime.Interop.ProjectInterop.ReferencePackage(Package package, String packageLocation)
   at Microsoft.SqlServer.Dts.Runtime.PackageItem.Load(IDTSEvents events)
   at Microsoft.SqlServer.Dts.Runtime.PackageItem.get_Package()
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.IncrementalBuildThroughObj(IOutputWindow outputWindow)
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.BuildIncremental(IOutputWindow outputWindow)
When the Azure Storage Connection Manager is removed from the project, the Azure Pipeline build succeeds.
I have also tried the 2019 hosted agent, but it failed with a different error (Some errors occurred during migration. For more information, see the migration report).
Is the Azure Feature Pack for SSIS not installed on the hosted agents? I would like to resolve this without having to use a self-hosted agent or Docker.

The capabilities of Microsoft-Hosted agents can be checked here:
https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/hosted?view=azure-devops
They don't seem to include SSIS support. You can try installing the Feature Pack as part of the build definition. For example:
https://erwindekreuk.com/2019/02/azure-devops-and-azure-feature-pack-for-integration-services/

Thank you JukkaK. The solution you posted worked for me, with a slight change - I had to change all of the single quotes to double quotes, because it was not resolving the file variable correctly otherwise. Below is the final solution in my implementation:
Write-Information "Installing SSIS Azure Feature Pack 2017"
#Define Filename
$Filename = "SsisAzureFeaturePack_2017_x64.msi"
$Arguments="/qn"
Write-Host "Downloading " $Filename
#Define download link including filename and output directory with filename
Invoke-WebRequest -Uri "https://download.microsoft.com/download/E/E/0/EE0CB6A0-4105-
466D-A7CA-5E39FA9AB128/SsisAzureFeaturePack_2017_x64.msi" -OutFile
"$(Build.StagingDirectory)\$Filename"
Write-Host "Installing "$Filename
Invoke-Expression -Command "$(Build.StagingDirectory)\$Filename $Arguments"
Write-Host "Finished Installing " $Filename

Related

Azure DevOps - Multitenant database deployment

We have a web application with a single codebase and a multitenant database behind it, and we are using Azure DevOps CI/CD for code deployment. We can release the code to one database perfectly, but we want to deploy it to multiple databases without multiple release pipelines (the number of tenants is dynamic, and there are 100+ of them).
I am able to deploy the code to one database and then redeploy it to each of the others using sqlpackage.exe and a dacpac file publish from the command line on the target server. But I want to do this entirely in Azure DevOps, and there is no sqlpackage.exe on the agent machine in a command-line task.
Has anyone dealt with a similar issue with a multi-tenant DB approach? I am open to any solution using Azure release pipelines.
I was able to resolve this by running inline PowerShell and SqlPackage.exe on a Deployment Group. You just need one "template" DB, get a dacpac or bacpac file from it, and then publish it to all the other DBs.
Example for a reference:
# Publish the template dacpac to every database listed in the file
ForEach ($a in (Get-Content "E:\DatabaseSync\databaselist.txt"))
{
    SqlPackage.exe /a:Publish /sf:"E:\DatabaseSync\TemplateDB.dacpac" `
        /tsn:localhost /tu:username /tp:password /tdn:$a `
        /p:TreatVerificationErrorsAsWarnings=False /p:BlockOnPossibleDataLoss=True `
        /p:AllowDropBlockingAssemblies=True /p:DropObjectsNotInSource=True `
        /p:DropPermissionsNotInSource=False /p:IgnorePermissions=True `
        /p:IgnoreRoleMembership=True /p:IgnoreUserSettingsObjects=True `
        /p:DoNotDropObjectTypes=Users /p:ExcludeObjectTypes=Users
}
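Since the tenant count is dynamic, the database list itself could also be generated from the server rather than maintained by hand. A sketch, assuming the SqlServer PowerShell module is installed and tenant databases share a name prefix (both assumptions):
# Sketch: regenerate databaselist.txt from the server's tenant databases
Import-Module SqlServer
Invoke-Sqlcmd -ServerInstance "localhost" -Query "SELECT name FROM sys.databases WHERE name LIKE 'Tenant%'" |
    Select-Object -ExpandProperty name |
    Set-Content "E:\DatabaseSync\databaselist.txt"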

How to run a remote command (powershell/bash) against an existing Azure VM in Azure Data Factory V2?

I've been trying to find a way to run a simple command against one of my existing Azure VMs using Azure Data Factory V2.
Options so far:
Custom Activity/Azure Batch won't let me add existing VMs to the pool
Azure Functions - I have not played with this, and I have not found any documentation on doing this with Azure Functions.
Azure Cloud Shell - I've tried this using the browser UI and it works, however I cannot find a way of doing this via ADF V2
The use case is the following:
There are a few tasks running locally (on an Azure VM) in Task Scheduler that I'd like to orchestrate using ADF, since everything else is in ADF. These tasks are usually Python applications that restore a SQL backup and/or purge some folders.
e.g. sqldb-restore -r myDatabase
where sqldb-restore is a command that is recognized locally after installing my local Python library. Unfortunately, the Python app needs to live locally on the VM.
Any suggestions? Thanks.
Thanks to @martin-esteban-zurita; his answer helped me get to what I needed, and this was a beautiful and fun experiment.
It is important to understand that Azure Automation is used for many things regarding resource orchestration in Azure (VMs, services, DevOps); this automation can be done with PowerShell and/or Python.
In this particular case I did not need to modify/maintain/orchestrate any Azure resource, I needed to actually run a Bash/Powershell command remotely into one of my existing VMs where I have multiple Powershell/Bash commands running recurrently in "Task Scheduler".
"Task Scheduler" was adding unnecessary overhead to my data pipelines because it was unable to talk to ADF.
In addition, Azure Automation natively runs PowerShell/Python commands only in Azure sandboxes, which is very useful for orchestrating resources (turning Azure VMs on/off, adding/removing permissions from other Azure services, running maintenance or purge processes, etc.), but I was still unable to run commands locally in an existing VM. This is where the Hybrid Runbook Worker came into the picture: a Hybrid Worker group lets a runbook execute directly on a machine you register, such as my VM.
These are the steps to accomplish this use case.
1. Create an Azure Automation Account
2. Install the Windows Hybrid Worker on my existing VM. In my case it was tricky because my proxy was giving me some errors, so I ended up downloading the NuGet package and installing it manually.
.\New-OnPremiseHybridWorker.ps1 -AutomationAccountName <NameOfAutomationAccount> `
    -AAResourceGroupName <NameOfResourceGroup> -OMSResourceGroupName <NameOfOMSResourceGroup> `
    -HybridGroupName <NameOfHRWGroup> -SubscriptionId <AzureSubscriptionId> `
    -WorkspaceName <NameOfLogAnalyticsWorkspace>
Keep in mind that in the above code you will need to supply your own parameter values. The only parameter that does not have to be looked up, because it will be created, is HybridGroupName; this defines the name of the Hybrid Worker group.
3. Create a PowerShell Runbook
[CmdletBinding()]
Param (
    # This parameter needs to be called WebhookData, otherwise the webhook does not work as expected.
    [object]$WebhookData
)
$VerbosePreference = 'continue'

#region Verify the runbook was started from a webhook.
# If the runbook was called from a webhook, WebhookData will not be null.
if ($WebhookData) {
    # Collect properties of WebhookData
    $WebhookName = $WebhookData.WebhookName
    # $WebhookHeaders = $WebhookData.RequestHeader
    $WebhookBody = $WebhookData.RequestBody

    # Collect individual headers. Input converted from JSON.
    $Input = (ConvertFrom-Json -InputObject $WebhookBody)
    # Write-Verbose "WebhookBody: $Input"
    # Write-Output -InputObject ('Runbook started from webhook {0} by {1}.' -f $WebhookName, $From)
}
else {
    Write-Error -Message 'Runbook was not started from Webhook' -ErrorAction stop
}
#endregion

# This is where I run the commands that were in Task Scheduler

# This callback is extremely important for ADF
$callBackUri = $Input.callBackUri
Invoke-WebRequest -Uri $callBackUri -Method POST
4. Create a Runbook Webhook pointing to the Hybrid Worker's VM
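For reference, the webhook can also be created from PowerShell. A minimal sketch using the Az.Automation module, where all names are placeholders and -RunOn is what ties the webhook to the Hybrid Worker group:
# Sketch: create a webhook for the runbook that runs on the Hybrid Worker group
New-AzAutomationWebhook -Name "AdfWebhook" -RunbookName "MyRunbook" `
    -ResourceGroupName "MyResourceGroup" -AutomationAccountName "MyAutomationAccount" `
    -RunOn "MyHybridGroup" -IsEnabled $true -ExpiryTime (Get-Date).AddYears(1) -Force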
5. Create a webhook activity in ADF where the above PowerShell runbook will be called via a POST method
Important note: When I created the webhook activity it was timing out after 10 minutes (the default). I then noticed in the Azure Automation account that the runbook was actually receiving INPUT data (WEBHOOKDATA) containing a JSON structure with the following elements:
WebhookName
RequestBody (This one contains whatever you add in the Body plus a default element called callBackUri)
All I had to do was to invoke the callBackUri from Azure Automation. And this is why in the PowerShell runbook code I added Invoke-WebRequest -Uri $callBackUri -Method POST. With this, ADF was succeeding/failing instead of timing out.
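If you also want to hand results back to ADF rather than just complete the activity, the callback POST can carry a JSON body. A hedged sketch; the assumption here is that the Webhook activity reads an Output object from the callback body, so verify the exact schema against the ADF documentation before relying on it:
# Sketch: report output back to ADF in the callback (schema is an assumption; verify in the docs)
$Body = @{ Output = @{ message = "Runbook completed" } } | ConvertTo-Json
Invoke-WebRequest -Uri $callBackUri -Method POST -Body $Body -ContentType "application/json"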
There are many other details that I struggled with when installing the hybrid worker in my VM but those are more specific to your environment/company.
This looks like a use case that is supported with Azure Automation, using a hybrid worker. Try reading here: https://learn.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker
You can call runbooks with webhooks in ADFv2, using the web activity.
Hope this helped!

Does Azure Automation support Write-Information?

I want to write info logs into the Azure Automation job logs. I've created this simple PowerShell runbook:
$InformationPreference = "Continue"
Write-Information "Hello info"
Write-Verbose "Hello Verbose"
Write-Warning "Hello warning"
Write-Error "Hello error"
In the runbook execution logs ("All Logs") I see only the verbose, warning, and error messages.
If I disable the runbook's verbose logging, I see only warnings and errors. Locally it works fine, but not in Azure. I've also tried Write-Information "Hello info" -InformationAction Continue; that didn't help.
Write-Information appeared in PowerShell 5.0. I've checked the PowerShell version on the Azure Automation sandbox machine using $PSVersionTable; it's higher than 5, so it should work.
Do you know if they support it or not?
If you want to write info logs into the Azure Automation job logs, I suggest you use Write-Output.
For details, you can refer to this article.
I'm not sure if Write-Information is supported in a runbook. I tested it on my side, as well as the cmdlet Write-Host, which is a wrapper for Write-Information, and neither of them produced any message output.
A support ticket has been raised with Microsoft for confirmation.
Hope this helps.
Azure Automation does not fully support the Information stream at this point. PowerShell 5 support is not enough: your runbook will not fail, but Automation will not capture and store the Information stream content, and this is why you will not see it in the logs.
I do wish Write-Information were available in Azure Automation.
Using Write-Output in a function that should return something else (like a Boolean) is quite problematic, as the sketch below shows.
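To make that concrete, a minimal sketch (function names are made up):
# Anything sent to the output stream becomes part of the function's return value
function Test-Something {
    Write-Output "Starting check..."   # intended as a log line
    return $true                       # caller actually receives both items
}
$Result = Test-Something               # $Result is @("Starting check...", $True), not just $True

# One workaround: log through the verbose stream instead
function Test-SomethingVerbose {
    [CmdletBinding()]
    param()
    Write-Verbose "Starting check..."
    return $true
}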

Azure Point-to-Site VPN Resource Manager powershell

I found the link https://azure.microsoft.com/en-us/documentation/articles/vpn-gateway-howto-point-to-site-rm-ps/, which gives detailed instructions on how to create a Point-to-Site VPN connection using PowerShell with the new Azure Resource Manager.
While attempting to run this script, I get the error message: "The term 'Add-AzureRmVpnClientRootCertificate' is not recognized as the name of a cmdlet".
I am currently running Azure Powershell version 1.0.1 and this reference, https://msdn.microsoft.com/en-us/library/mt653593.aspx, indicates that it should be available in version 1.0.
What am I doing wrong?
It looks like you need at least Azure PowerShell 1.0.4 to get this cmdlet. If you look at the GitHub source for this cmdlet at https://github.com/Azure/azure-powershell/blob/master/src/ResourceManager/Network/Commands.Network/VirtualNetworkGateway/AddAzureVpnClientRootCertificateCommand.cs, it looks like it was added with the commit for 1.0.4: https://github.com/Azure/azure-powershell/commit/09b5f57ff798ca90aeb84e73fbd88f406d7edd7c.
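A quick way to check what is installed and whether the cmdlet is present after upgrading (a sketch; module names varied across early Azure PowerShell releases):
# Sketch: list installed Azure PowerShell module versions
Get-Module -ListAvailable Azure* | Select-Object Name, Version

# Verify the cmdlet exists after upgrading
Get-Command Add-AzureRmVpnClientRootCertificate -ErrorAction SilentlyContinue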

Referencing Microsoft.WindowsAzure.Storage.dll and creating Cloud Blob Client using CreateCloudBlobClient() method in Runbook under Azure Automation

I am trying to create a runbook in Azure Automation that will take a snapshot of a VM. I found "CreateBlobSnapshot.ps1" in the Script Center. It works fine in PowerShell, but when I try to use the same code in a runbook it throws a couple of exceptions.
Examples:
A. "Method invocation is not supported in a Windows PowerShell Workflow ....." and B. it was not able to locate Microsoft.WindowsAzure.Storage.dll. I tried using InlineScript too, without luck. Please advise.
# Loading the Windows Azure Storage Library for .NET.
Write-Verbose -Message "Loading Windows Azure Storage Library from $StorageLibraryPath"
[Reflection.Assembly]::LoadFile("$StorageLibraryPath") | Out-Null
$Creds = New-Object Microsoft.WindowsAzure.Storage.Auth.StorageCredentials("$StorageAccountName", "$StorageAccountKey")
$CloudStorageAccount = New-Object Microsoft.WindowsAzure.Storage.CloudStorageAccount($Creds, $true)
$CloudBlobClient = $CloudStorageAccount.CreateCloudBlobClient()
For issue B, did all the dependent DLLs get loaded as well? Try this to load the Windows Azure Storage DLL:
PM> Install-Package WindowsAzure.Storage
Then try loading the DLL in PowerShell using:
Add-Type -Path "<Path where package is present>\Microsoft.WindowsAzure.Storage.dll"
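For issue A, the usual fix in a PowerShell Workflow runbook is to keep all .NET method calls inside InlineScript and pass outer variables in with the $Using: scope modifier, which is a common missing piece. A sketch along those lines (the workflow and parameter names are made up):
# Sketch: .NET method calls must live inside InlineScript in a workflow runbook;
# $Using: passes workflow-level variables into the InlineScript block
workflow Get-BlobClient {
    param([string]$StorageLibraryPath, [string]$StorageAccountName, [string]$StorageAccountKey)
    InlineScript {
        [Reflection.Assembly]::LoadFile($Using:StorageLibraryPath) | Out-Null
        $Creds = New-Object Microsoft.WindowsAzure.Storage.Auth.StorageCredentials($Using:StorageAccountName, $Using:StorageAccountKey)
        $Account = New-Object Microsoft.WindowsAzure.Storage.CloudStorageAccount($Creds, $true)
        $Client = $Account.CreateCloudBlobClient()
    }
}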