I'm wondering if there is a specific PowerShell command to create an Azure SQL Analytics resource. So far I've only found how to enable sending diagnostics logs to a Log Analytics workspace.
I assume I could use an ARM template, but it would be nice if there were a dedicated command, to avoid creating another JSON template.
You can use the following PowerShell:
Set-AzOperationalInsightsIntelligencePack -ResourceGroupName rgName -WorkspaceName workspaceName -IntelligencePackName AzureSQLAnalytics -Enabled $true
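If the Log Analytics workspace itself doesn't exist yet, a minimal sketch (the resource group name, workspace name, and location are placeholders to substitute) is to create the workspace first and then run the command above:
New-AzOperationalInsightsWorkspace -ResourceGroupName rgName -Name workspaceName -Location "East US" -Sku PerGB2018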
Reading: https://learn.microsoft.com/en-us/azure/azure-monitor/platform/powershell-workspace-configuration#create-and-configure-a-log-analytics-workspace
I have 100 dataflows, 50 pipelines, and their related datasets, variables, etc.
Now I want to use the Synapse service and move all my ADF pipelines and related objects into Synapse. My ADF is Git-configured.
Can we export them in one go?
To move objects from Data Factory to Synapse, you need the supported JSON definition files from ADF.
I tried migrating objects from ADF to a Synapse workspace using PowerShell, following the procedure below:
I connected to my Azure account using the command below:
Connect-AzAccount
I set my subscription using the command below:
Select-AzSubscription -SubscriptionName "<SubscriptionName>"
I created a linked service for source and sink using the command below:
Set-AzSynapseLinkedService -WorkspaceName <synapseworkspace> -Name <linkedservicename> -DefinitionFile "<json file path>"
I created datasets for source and sink in the Synapse workspace using the command below:
Set-AzSynapseDataset -WorkspaceName <synapseworkspace> -Name <datasetname> -DefinitionFile "<json file path>"
I created a pipeline in the Synapse workspace using the command below:
Set-AzSynapsePipeline -WorkspaceName <synapseworkspace> -Name <pipelinename> -DefinitionFile "<json file path>"
They were created successfully in the Synapse workspace:
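Since you have a large number of objects, you can avoid running these cmdlets by hand for each one by looping over the exported definition files. A rough sketch, assuming a local folder of pipeline JSON files whose file names match the pipeline names (the folder layout and names are my assumptions):
$ws = "<synapseworkspace>"
Get-ChildItem -Path .\pipelines -Filter *.json | ForEach-Object {
    Set-AzSynapsePipeline -WorkspaceName $ws -Name $_.BaseName -DefinitionFile $_.FullName
}
Linked services and datasets can be looped over the same way with Set-AzSynapseLinkedService and Set-AzSynapseDataset.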
In your case, since you have a large number of objects in your data factory, you can use the Azure Data Factory to Synapse Analytics Migration Tool.
You can migrate Azure Data Factory pipelines, datasets, linked services, integration runtimes, and triggers to a Synapse workspace with the PowerShell script below:
.\importADFtoSynapseTool.ps1 -sourceADFResourceId "/subscriptions/<SubscriptionID>/resourcegroups/<ADFResourceGroupName>/providers/Microsoft.DataFactory/factories/<ADFName>" -destSynapseResourceId "/subscriptions/<SubscriptionID>/resourcegroups/<SynapseResourceGroupName>/providers/Microsoft.Synapse/workspaces/<SynapseName>" -TenantId <tenantId>
You will need the following values:
Resource group name for the Azure Data Factory
Resource group name for the Azure Synapse workspace
Subscription ID
Tenant ID (you can find this by clicking Azure Active Directory in the Azure portal)
For more clarification you can visit this.
As far as I know, there is no way to do this out of the box.
There is an option to download each object's JSON and then use PowerShell to recreate the objects in the Synapse workspace.
I haven't tried that myself, but the process is explained here.
I'm a bit of an Azure & PowerShell newbie.
I'm trying to write PowerShell scripts to create an environment that can be published to from Azure DevOps.
As part of that, I'm creating a Service Bus with multiple topics. Each of the topics will have multiple Authorization Rules - one for publication and one for subscription.
I have the scripts for this working. However, I need to get the connection strings for these rules and save them to a key vault, to make them available to apps.
This is where I have become stuck.
This is similar to my existing code:
New-AzServiceBusTopic -ResourceGroupName myResourceGroup -Namespace myServiceBus -EnablePartitioning $false -Name myTopic
New-AzServiceBusAuthorizationRule `
    -ResourceGroupName myResourceGroup `
    -Namespace myServiceBus `
    -Topic myTopic `
    -Name myTopic.pub `
    -Rights @("Send")
In the Azure Portal, I would click into the Service bus and Topic, select Shared Access Policies and click on the policy. It would show me the SAS Policy with the Primary Connection String.
Is there any way in PowerShell to get the Primary Connection String?
Thanks
If you have the Azure PowerShell Az.ServiceBus module installed, you can use the Get-AzServiceBusKey command directly.
For example:
Get-AzServiceBusKey -ResourceGroup Default-ServiceBus-WestUS -Namespace SB-Example1 -Name AuthoRule1
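For a topic-level rule like the one in your script, the same cmdlet can be pointed at the topic, and the returned object exposes the connection strings, which you can then push into Key Vault. A rough sketch (the -Topic parameter and the vault/secret names are assumptions; check Get-Help Get-AzServiceBusKey for your installed module version):
$keys = Get-AzServiceBusKey -ResourceGroupName myResourceGroup -Namespace myServiceBus -Topic myTopic -Name myTopic.pub
$keys.PrimaryConnectionString
# Store the connection string in Key Vault so apps can read it
$secret = ConvertTo-SecureString $keys.PrimaryConnectionString -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName myKeyVault -Name "myTopic-pub-connection" -SecretValue $secret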
I need your help.
I have an Azure Analysis Services instance that I want to start and pause at different times of the day.
I want to use Powershell to execute a script to do this task.
Do you know how I can do it?
Thank you for your help
Here are the steps to start and pause Azure Analysis Services with on-premises PowerShell:
Step 1: Import the Az.AnalysisServices module
To manage a server in your subscription, you use the Az.AnalysisServices module. Load the Az.AnalysisServices module into your PowerShell session.
Import-Module Az.AnalysisServices
Step 2: Sign in to Azure
Sign in to your Azure subscription by using the Connect-AzAccount command. Follow the on-screen directions.
Connect-AzAccount
Step 3: Set the subscription context [only needed if you have multiple Azure subscriptions].
The Set-AzContext cmdlet sets authentication information for cmdlets that you run in the current session. The context includes tenant, subscription, and environment information.
Set-AzContext -SubscriptionId "xxxx-xxxx-xxxx-xxxx"
Step 4: Get the details of your Analysis Services server.
The Get-AzAnalysisServicesServer cmdlet gets the details of an Analysis Services server.
Get-AzAnalysisServicesServer -ResourceGroupName "ResourceGroup03" -Name "testserver"
To start Azure Analysis Services, use Resume-AzAnalysisServicesServer.
The Resume-AzAnalysisServicesServer cmdlet resumes an instance of an Analysis Services server.
Resume-AzAnalysisServicesServer -Name "testserver" -ResourceGroupName "testgroup"
To stop Azure Analysis Services, use Suspend-AzAnalysisServicesServer.
The Suspend-AzAnalysisServicesServer cmdlet suspends an instance of an Analysis Services server.
Suspend-AzAnalysisServicesServer -Name "testserver" -ResourceGroupName "testgroup"
Reference: Azure Analysis services – Azure PowerShell cmdlets
You can then use Azure Automation to schedule the start and stop of Azure Analysis Services at the required times of day, for example with a runbook like the sketch below.
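A minimal runbook sketch (it assumes the Automation account can authenticate with a managed identity that has rights on the server; resource names are placeholders):
param(
    [string] $ResourceGroupName = "testgroup",
    [string] $ServerName = "testserver",
    [ValidateSet("Start", "Stop")]
    [string] $Action = "Stop"
)
# Authenticate with the Automation account's managed identity
Connect-AzAccount -Identity | Out-Null
if ($Action -eq "Start") {
    Resume-AzAnalysisServicesServer -Name $ServerName -ResourceGroupName $ResourceGroupName
}
else {
    Suspend-AzAnalysisServicesServer -Name $ServerName -ResourceGroupName $ResourceGroupName
}
You would then link the runbook to two Automation schedules, one passing -Action Start and one passing -Action Stop.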
I have an Az script that sets up Advanced Data Security for my Azure SQL Databases/Servers.
Unfortunately, Az cannot run in Azure DevOps, so I translated the script to AzureRM. The script leaves Advanced Data Security in a "Partially Configured" state, due to the Azure SQL server's VULNERABILITY ASSESSMENT SETTINGS not being set.
What is the AzureRM equivalent of Update-AzSqlServerVulnerabilityAssessmentSetting?
I tried changing
Update-AzSqlServerVulnerabilityAssessmentSetting
to:
Update-AzureRmSqlDatabaseVulnerabilityAssessmentSettings
However, only the databases get configured, and this leaves the server unconfigured.
There is no exact equivalent; the AzureRM module has been deprecated and will no longer be updated.
Unfortunately, Az cannot run in Azure Devops, so I translated the script to AzureRM.
As far as I know, version 4.* (preview) of the Azure PowerShell task supports the Az module. I tried it here.
If you want to use the AzureRM module to update the Azure SQL vulnerability assessment settings, you can only use the "Update-AzureRmSqlDatabaseVulnerabilityAssessmentSettings" command to configure every database in a server; AzureRM does not provide a command to configure the Azure SQL server directly. For more details, please refer to the blog.
Get-AzureRmSqlDatabase -ResourceGroupName $params.rgname -ServerName $params.serverName `
    | where {$_.DatabaseName -ne "master"} `
    | Update-AzureRmSqlDatabaseVulnerabilityAssessmentSettings `
        -RecurringScansInterval Weekly `
        -NotificationEmail $scanNotificationEmail `
        -EmailAdmins $true
I keep getting the following error from Data Factory whenever I run a U-SQL job:
Job submission failed, the user 'adla account name' does not have permissions to a subfolder in the /system/ path needed by Data Lake Analytics. Please run “Add User Wizard” from the Data Lake Analytics Azure Portal or use Azure PowerShell to grant access for the user to the /system/ and its children on the Data Lake Store.
And I am not using any firewall, as was suggested in this post:
Run U-SQL Script from C# code with Azure Data Factory
I am using the Azure Data Lake Store service principal authentication. When I start the Job from Visual Studio, it also works fine.
I would be grateful for any idea...
thanks
If you are authorising the Azure Data Lake Analytics linked service from Azure Data Factory with a service principal, that could be your problem.
I have an outstanding support ticket with Microsoft because the service principal authentication method only works with simple data factory activities like 'copy'. It does not work if you want to authenticate complex activities like 'DotNetActivity'.
My advice would be to change the linked service back to using session and token auth, then deploy your activities and try again.
Hope this helps.
It does sound like a permissions problem. You can run this PowerShell script to ensure you have applied the proper permissions for the service principal:
# Sign in with an account that can manage the Data Lake Store ACLs
Login-AzureRmAccount
$appname = "adla"
$dataLakeStoreName = "yourdatalakename"
# Look up the AAD application and its service principal
$app = Get-AzureRmADApplication -DisplayName $appname
$servicePrincipal = Get-AzureRmADServicePrincipal -SearchString $appname
# Grant the service principal full permissions on / and /system
Set-AzureRmDataLakeStoreItemAclEntry -AccountName $dataLakeStoreName -Path / -AceType User -Id $servicePrincipal.Id -Permissions All
Set-AzureRmDataLakeStoreItemAclEntry -AccountName $dataLakeStoreName -Path /system -AceType User -Id $servicePrincipal.Id -Permissions All
If you want to create everything from scratch using a PowerShell script, here is a blog that will help you:
http://dyor.com/setting-up-an-azure-data-lake-and-azure-data-factory-using-powershell/