U-SQL Job Failing in Data Factory - azure

I keep getting the following error from Data Factory whenever I run a U-SQL job:
Job submission failed, the user 'adla account name' does not have permissions to a subfolder in the /system/ path needed by Data Lake Analytics. Please run “Add User Wizard” from the Data Lake Analytics Azure Portal or use Azure PowerShell to grant access for the user to the /system/ and its children on the Data Lake Store.
And I am not using any Firewall as suggested in this post:
Run U-SQL Script from C# code with Azure Data Factory
I am using the Azure Data Lake Store service principal authentication. When I start the Job from Visual Studio, it also works fine.
I would be grateful for any ideas...
Thanks

If you are authorising the Azure Data Lake Analytics linked service from Azure Data Factory with a service principal, that could be your problem.
I have an outstanding support ticket with Microsoft because the service principal authentication method only works with simple Data Factory activities like 'Copy'. It does not work if you want to authenticate complex activities like 'DotNetActivity'.
My advice would be to change the linked service back to using session and token auth, then deploy your activities and try again.
Hope this helps.

It does sound like a permissions problem. You can run this PowerShell script to ensure you have applied the proper permissions to the service principal:
Login-AzureRmAccount
# Display name of the AAD application and the name of the target Data Lake Store account
$appname = "adla"
$dataLakeStoreName = "yourdatalakename"
$app = Get-AzureRmADApplication -DisplayName $appname
$servicePrincipal = Get-AzureRmADServicePrincipal -SearchString $appname
# Grant the service principal full permissions on the root and on /system
Set-AzureRmDataLakeStoreItemAclEntry -AccountName $dataLakeStoreName -Path / -AceType User -Id $servicePrincipal.Id -Permissions All
Set-AzureRmDataLakeStoreItemAclEntry -AccountName $dataLakeStoreName -Path /system -AceType User -Id $servicePrincipal.Id -Permissions All
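To double-check the result, you can read the ACL back afterwards; a quick sketch, assuming the Get-AzureRmDataLakeStoreItemAclEntry cmdlet is available in your AzureRM.DataLakeStore module version:
# Verify the service principal now appears in the ACL of /system
Get-AzureRmDataLakeStoreItemAclEntry -AccountName $dataLakeStoreName -Path /system |
    Where-Object { $_.Id -eq $servicePrincipal.Id }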
If you want to create everything from scratch using a PowerShell script, here is a blog that will help you:
http://dyor.com/setting-up-an-azure-data-lake-and-azure-data-factory-using-powershell/

Related

Scheduling Azure Virtual Machine (VM) Startup with Tags

I am trying to put an auto-start policy on my VM in Azure.
So, I used an Automation account and a PowerShell script to do this, following this link: https://adamtheautomator.com/azure-vm-schedule/
But on testing it gives me the error: Run Login-AzureRmAccount to login.
Please suggest how to fix this?
## Get the Azure Automation Account information
$azConn = Get-AutomationConnection -Name 'AzureRunAsConnection'
## Add the automation account context to the session
Add-AzureRMAccount -ServicePrincipal -Tenant $azConn.TenantID -ApplicationId $azConn.ApplicationId -CertificateThumbprint $azConn.CertificateThumbprint
## Get the Azure VMs with tags matching the value '10am'
$azVMs = Get-AzureRMVM | Where-Object {$_.Tags.StartTime -eq '10am'}
## Start VMs
$azVMS | Start-AzureRMVM
Regards
ESNGSRJ
This can happen when the Run As account isn't configured appropriately. You will need to create one to provide authentication for managing resources on Azure Resource Manager with Automation runbooks.
When you create a Run As account, it performs the following tasks:
Creates an Azure AD application with a self-signed certificate, creates a service principal account for the application in Azure AD, and assigns the Contributor role for the account in your current subscription.
Creates an Automation certificate asset named AzureRunAsCertificate in the specified Automation account.
Creates an Automation connection asset named AzureRunAsConnection in the specified Automation account.
Please note the following requirements from the referenced link:
You must have an Azure Automation Account with an Azure Run As account already prepared. If you don’t have this yet, learn how to create one when you go to Create a new Automation account in the Azure portal.
The Azure PowerShell module must be installed. If you don’t have this yet, please go to the Install the Azure PowerShell module page for more information.
Note: You can configure your Runbook to use managed identities as well and it has added benefits as compared to using Run As accounts. You can get started with this tutorial to use managed identity.
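For reference, here is a minimal sketch of what the runbook's login block could look like with a system-assigned managed identity instead of the Run As connection; it assumes the newer Az module is imported into the Automation account and replaces the Get-AutomationConnection/Add-AzureRMAccount lines above:
## Log in with the Automation account's system-assigned managed identity
Connect-AzAccount -Identity
## Same tag-based selection and start as the original script, using Az cmdlets
$azVMs = Get-AzVM | Where-Object {$_.Tags.StartTime -eq '10am'}
$azVMs | Start-AzVM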

Alternative to New-AzSqlDatabaseCopy on Hyperscale Database

I am working on Microsoft Azure, where I have a resource group for a test environment and one for a production environment; in both I have an Azure SQL Database server with its respective database.
I am creating an Automation account runbook in PowerShell in another Microsoft Azure account (important note) that is responsible for "copying" the production database to test. I know there is the New-AzSqlDatabaseCopy command; however, this command does not work with Hyperscale databases.
Is there an alternative to this command for Hyperscale? Or is it possible from this second account to create a .bacpac remotely with Azure PowerShell commands? All the examples I have seen work within the same account, but the database account is different from the automation account due to work rates.
Thank you in advance for your help and comments.
I already tried to use the New-AzureRmSqlDatabaseExport command, but it seems to work only within the same Azure account, and I can't specify an "Azure account for backup" and an "Azure account for storage". Am I right?
As Alberto Morillo says in his comment, New-AzSqlDatabaseCopy is currently not available for Azure SQL Hyperscale, at least at the moment of this answer.
So I tried to use New-AzureRmSqlDatabaseExport with two Azure accounts and it's totally possible: you need to log in with the Azure account of the origin database using Connect-AzureRmAccount, then call the New-AzureRmSqlDatabaseExport command with the following parameters.
# Splat the parameters so each one can be commented
$exportParams = @{
    ResourceGroupName          = $RGName             # Resource group of the source database
    ServerName                 = $Server             # Server name of the source database
    DatabaseName               = $Database           # Name of the source database
    AdministratorLogin         = $User               # Administrator user of the source database
    AdministratorLoginPassword = $Pwd               # Password of the source database
    StorageKeytype             = "StorageAccessKey" # Key type of the destination storage account (the one in the other Azure account)
    StorageKey                 = $StorageKey        # Key of the destination storage account (the one in the other Azure account)
    StorageUri                 = $StorageFileFullURI # Full file URI of the destination storage (the one in the other Azure account)
}
# The format of the URI file is the following:
# https://contosostorageaccount.blob.core.windows.net/backupscontainer/backupdatabasefile.bacpac
New-AzureRmSqlDatabaseExport @exportParams
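As a side note, the export runs asynchronously; a rough sketch of polling it to completion, assuming you capture the call's return value:
# Capture the operation handle returned by the export call above
$export = New-AzureRmSqlDatabaseExport @exportParams
# Poll the status link until the .bacpac export completes or fails
do {
    Start-Sleep -Seconds 10
    $status = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $export.OperationStatusLink
} while ($status.Status -eq 'InProgress')
$status.Status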
Unfortunately, this command is not enabled for Hyperscale, so I get the following error message:
New-AzureRmSqlDatabaseExport : 40822: This feature is not available for the selected database's edition (Hyperscale).
I used the same command with a database that was not Hyperscale and it worked perfectly.
Finally, I think I will have to perform the manual process for at least a few months, until Microsoft launches the update for Hyperscale.
Database copy is currently not available for Azure SQL Hyperscale but you may see it in public preview in a few months.

Get own Service Principal Name in an Azure DevOps Powershell pipeline task

When running an Azure PowerShell task in an Azure DevOps release pipeline with system.debug=true, you will get output similar to this:
# anonymized
...
2019-09-05T12:19:41.8983585Z ##[debug]INPUT_CONNECTEDSERVICENAMEARM: '7dd40b2a-1c37-4c0a-803e-9b0044a8b54e'
2019-09-05T12:19:41.9156487Z ##[debug]ENDPOINT_URL_7dd40b2a-1c37-4c0a-803e-9b0044a8b54e: 'https://management.azure.com/'
2019-09-05T12:19:41.9188051Z ##[debug]ENDPOINT_AUTH_7dd40b2a-1c37-4c0a-803e-9b0044a8b54e: '********'
2019-09-05T12:19:41.9221892Z ##[debug]ENDPOINT_DATA_7dd40b2a-1c37-4c0a-803e-9b0044a8b54e: '{"subscriptionId":"b855f753-d5b3-48f4-b7cd-5beb58fb5508","subscriptionName":"Entenhausen","environment":"AzureCloud","creationMode":"Automatic","azureSpnRoleAssignmentId":"5ddcc3fe-f93c-4771-8041-50b49f76b828","azureSpnPermissions":"[{\"roleAssignmentId\":\"5ddcc3fe-f93c-4771-8041-50b49f76b828\",\"resourceProvider\":\"Microsoft.RoleAssignment\",\"provisioned\":true}]","spnObjectId":"76055cb6-3b75-4191-9309-306b32dad443","appObjectId":"e4b90b9d-7a73-42a3-ae6e-4daec910def4","environmentUrl":"https://management.azure.com/","galleryUrl":"https://gallery.azure.com/","serviceManagementUrl":"https://management.core.windows.net/","resourceManagerUrl":"https://management.azure.com/","activeDirectoryAuthority":"https://login.microsoftonline.com/","environmentAuthorityUrl":"https://login.windows.net/","graphUrl":"https://graph.windows.net/","managementPortalUrl":"https://manage.windowsazure.com/","armManagementPortalUrl":"https://portal.azure.com/","activeDirectoryServiceEndpointResourceId":"https://management.core.windows.net/","sqlDatabaseDnsSuffix":".database.windows.net","AzureKeyVaultDnsSuffix":"vault.azure.net","AzureKeyVaultServiceEndpointResourceId":"https://vault.azure.net","StorageEndpointSuffix":"core.windows.net","EnableAdfsAuthentication":"false"}'
2019-09-05T12:19:41.9284444Z ##[debug]AuthScheme ServicePrincipal
...
I need to add the SPN of the Azure DevOps connection to a resource. When changing subscriptions or pipelines, the SPN also changes, and I do not want to hardcode the value.
As the value is printed in the system.debug=true output, I am wondering how to access my own SPN within a pipeline task. Is it possible to read out "spnObjectId":"76055cb6-3b75-4191-9309-306b32dad443" somehow using PowerShell?
Information about the service principal can be accessed using Get-AzureRmContext, but the information is limited and some is obfuscated in the logs, so you need to make a second call to Get-AzureRmADServicePrincipal to access the ObjectId:
# The pipeline's Azure context identifies the service connection's service principal
$Context = Get-AzureRmContext
# Resolve the service principal object from its application ID
$AzureDevOpsServicePrincipal = Get-AzureRmADServicePrincipal -ApplicationId $Context.Account.Id
$ObjectId = $AzureDevOpsServicePrincipal.Id
The Id exposed in $Context.Account.Id is the service principal's ApplicationId.
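As an illustration of using the resolved object ID to add the SPN to a resource, here is a hypothetical example granting it read access to a Key Vault (the vault name is made up):
# Grant the pipeline's own service principal read access to secrets in a vault
Set-AzureRmKeyVaultAccessPolicy -VaultName 'my-vault' -ObjectId $ObjectId -PermissionsToSecrets Get,List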
The SPN within a pipeline task is nothing but the Azure subscription you have passed to the task. You can click on manage connections, copy the details of the SPN under connections, and use them as you need. But I am not sure why you want to use the SPN directly, as you can always use an Azure PowerShell task and just select the subscription. Once you store the connection, you can always reuse it in different pipelines.

Azure Powershell - automating Login-AzureRmAccount AD Login - for Azure function

I have this Azure PowerShell script, which successfully backs up a SQL Azure DB to Azure Blob storage.
In its current form, it requires me to log in via AD.
I now need to implement this script to execute via an Azure Function at specific intervals.
The first snippet of the script:
$subscriptionId = "YOUR AZURE SUBSCRIPTION ID"
Login-AzureRmAccount
Set-AzureRmContext -SubscriptionId $subscriptionId
I thus need to not use Login-AzureRmAccount, but replace it with a method that does not require human input.
I have found this link:
https://cmatskas.com/automate-login-for-azure-powershell-scripts/
In short, the author:
Creates an Azure AD Application (with its own password)
Creates a Service Principal
Assigns Permissions to the Service Principal
This is a once-off manual creation - which is perfect.
The author then logs in with this newly created application:
# $azureAccountName is the application (client) ID; $azurePassword must be a SecureString
$psCred = New-Object System.Management.Automation.PSCredential($azureAccountName, $azurePassword)
Add-AzureRmAccount -Credential $psCred -TenantId e801a3ad-3690-4aa0-a142-1d77cb360b07 -ServicePrincipal
My questions:
Is this what I should do to be able to automate my application and prevent human login?
This Azure AD app created in step 1 - can I use this app as a starting point in any of my Azure Functions?
Yes, you can use that route, or use certificate auth, or use an Azure AD user; it can log in with user/password, but that is considered less secure than a service principal.
Yes, you can use one service principal for any number of Azure Functions you would like to.
To use Azure PowerShell in Azure Functions, you may refer to the following response in another SO thread. The example is an HTTP trigger, but you can modify it to use a timer trigger for your use case. Here's the link:
Azure Function role like permissions to Stop Azure Virtual Machines
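To tie it together, a minimal sketch of the non-interactive login for the function, assuming you store the service principal's application ID, key, and tenant ID in app settings (the setting names here are made up):
# Read the service principal details from app settings (names are illustrative)
$azureAccountName = $env:SP_APPLICATION_ID
$azurePassword = ConvertTo-SecureString $env:SP_KEY -AsPlainText -Force
# Log in as the service principal instead of calling Login-AzureRmAccount
$psCred = New-Object System.Management.Automation.PSCredential($azureAccountName, $azurePassword)
Add-AzureRmAccount -Credential $psCred -TenantId $env:SP_TENANT_ID -ServicePrincipal
Set-AzureRmContext -SubscriptionId $subscriptionId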
Run PowerShell as Administrator; you need to install the AzureRM module first.
Login to Azure
Login-AzureRmAccount
Enter your Azure credentials
To get your subscription(s) details, enter
Get-AzureRmSubscription
Use the subscription id to select the subscription.
Select-AzureRmSubscription -SubscriptionId xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Save the Azure profile using the command below:
Save-AzureRmProfile -Path "C:\AzureScripts\profile.json"
The JSON file can then be used to log in to Azure:
Select-AzureRmProfile -Path "C:\AzureScripts\profile.json"
Put this line at the top of your .ps1 file; it does not require human input.
Ref : http://www.smartcoding.in/blog/auto-login-azure-power-shell

Puzzled about Credential Details for Vertically Scaling SQL Azure using Azure Automation?

I am trying to upscale and downscale my SQL Azure instances using Azure Automation. I am using a gallery runbook called "Set-AzureSqlDatabaseEdition.ps1", which was created by Joseph Idziorek.
The link is: SQL Azure vertical scale Runbook
The parameter examples are:
.EXAMPLE
Set-AzureSqlDatabaseEdition -SqlServerName bzb98er9bp -DatabaseName myDatabase -Edition Premium -PerfLevel P1 -Credential myCredential
However, I am confused about what should go into "Credential". Is this the SQL Server admin username or something else? Is it something I create in Azure Automation assets?
Thanks.
One needs to create a credential asset using the SQL Server credentials (not AD), and then use the name of this asset as the Credential parameter value.
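For illustration, a sketch of creating that credential asset with PowerShell; the resource group, Automation account, and asset names are placeholders:
# Prompt for the SQL Server admin login and store it as an Automation credential asset
$sqlCred = Get-Credential
New-AzureRmAutomationCredential -ResourceGroupName 'my-rg' -AutomationAccountName 'my-automation' -Name 'myCredential' -Value $sqlCred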
