I use a PowerShell script in an Azure DevOps release pipeline to create a new report in the Power BI Service.
The script:
New-PowerBIReport -Path 'C:\Users\cbonier\Documents\REPORTFINANCEDEV.pbix' -Name 'REPORTFINANCEDEV' -Workspace (Get-PowerBIWorkspace -Name 'WKS - DTS [DEV]') -ConflictAction CreateOrOverwrite
But I get an error when I run it.
Now I want to run some PowerShell scripts such as:
New-AzAutomationRunbook -AutomationAccountName 'Testing' -Name 'Runbook02' -ResourceGroupName 'rg' -Type Python3
How can I do this with an ansible-playbook?
We can achieve this by following this blog to configure Ansible locally.
The same command can also be run with PowerShell to create a runbook for our Automation account.
For more information, please refer to the Microsoft documentation: Using Ansible with Azure.
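As a rough sketch, the approach boils down to invoking the Az cmdlet from a playbook task. This is a minimal illustration, not the blog's exact configuration; it assumes PowerShell (pwsh) and the Az module are installed on the control node and an authenticated Az session already exists:

```yaml
# playbook.yml — minimal sketch (assumptions: pwsh and the Az module are
# installed on the control node, and Connect-AzAccount has been run).
- hosts: localhost
  connection: local
  tasks:
    - name: Create an Automation runbook via the Az PowerShell module
      ansible.builtin.shell: >
        pwsh -Command "New-AzAutomationRunbook
        -AutomationAccountName 'Testing'
        -Name 'Runbook02'
        -ResourceGroupName 'rg'
        -Type Python3"
```

Run it with `ansible-playbook playbook.yml`.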
When trying to configure a release pipeline for ADF, I get the error below:
Trigger enabled cannot update : Cannot update enabled trigger; trigger needs to be disabled first.
But there are no triggers in ADF.
How can this be handled?
There are 3 steps in the release pipeline:
1. Disable triggers (Azure PowerShell script). Note that $triggersADF must be populated first:
$triggersADF = Get-AzureRmDataFactoryV2Trigger -ResourceGroupName <ResourceGroupName> -DataFactoryName <DataFactoryName>
$triggersADF | ForEach-Object { Stop-AzureRmDataFactoryV2Trigger -ResourceGroupName <ResourceGroupName> -DataFactoryName <DataFactoryName> -Name $_.name -Force }
2. ARM template deployment
3. Enable triggers (Azure PowerShell script, with $triggersADF populated the same way):
$triggersADF | ForEach-Object { Start-AzureRmDataFactoryV2Trigger -ResourceGroupName <ResourceGroupName> -DataFactoryName <DataFactoryName> -Name $_.name -Force }
It seems that you are already following the ADF deployment sequence from the MS docs.
I suspect there is something unexpected in the ARM template. I would recommend exporting the template manually and using it instead. Your scripts look fine.
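To check whether the template itself is the culprit, you could inspect it for trigger resources before deploying. A small sketch (the template file name is an assumption):

```powershell
# Sketch: list any trigger resources in the exported ARM template that could
# conflict with the deployment. 'arm_template.json' is a placeholder file name.
$template = Get-Content 'arm_template.json' -Raw | ConvertFrom-Json
$template.resources |
    Where-Object { $_.type -like '*/triggers' } |
    Select-Object name, type
```

If this prints any entries, the template does contain triggers even though none are visible in the ADF UI, which would explain the error.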
I'm trying to execute a PowerShell script on Azure VMs with Azure Automation, using the approach described at
https://stackoverflow.com/a/62258063/1014275. The script below copies files between Azure VMs. Both VMs are in the same subnet, and if we run the Copy-Item command from the VM's PowerShell it copies the files to the target VM folder. The same script executes successfully in an Azure Automation runbook, but without copying the files.
Script used:
Copy-Item -Path C:\folder\sample.txt -Destination \\VmHostname\C$\folder -Force
Update 1: The script throws the exception below.
Failed
VERBOSE: Performing the operation "Copy File" on target "Item: C:\folder\file.txt Destination:
\\vmhostname\C$\folder". Copy-Item : You can't connect to the file share because it's not secure. This share requires the obsolete SMB1
protocol, which is unsafe and could expose your system to attack.
Your system requires SMB2 or higher.
Update 2:
Running the PowerShell command below solved the Update 1 issue:
Enable-WindowsOptionalFeature -Online -FeatureName SMB1Protocol
After this we get the error below:
Failed
Copy-Item : Access is denied
At C:\Packages\Plugins\Microsoft.CPlat.Core.RunCommandWindows\1.1.5\Downloads\script17.ps1:1 char:1
+ Copy-Item -Path C:\folder\file.txt -Destination \\vmhostname ...
Any suggestion would be helpful!
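One plausible cause (an assumption, not confirmed above): scripts launched through Run Command or a runbook execute under an account with no network identity on the remote share, so the UNC copy is denied even though it works interactively. A sketch that maps the share with explicit credentials before copying; the account name and password are placeholders:

```powershell
# Sketch (assumption): supply credentials for an account that has rights on
# the target VM's administrative share. Values below are placeholders.
$user = 'VmHostname\localadmin'
$pass = ConvertTo-SecureString '<password>' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($user, $pass)

# Map the share with explicit credentials, copy the file, then clean up.
New-PSDrive -Name 'X' -PSProvider FileSystem -Root '\\VmHostname\C$\folder' -Credential $cred | Out-Null
Copy-Item -Path 'C:\folder\sample.txt' -Destination 'X:\' -Force
Remove-PSDrive -Name 'X'
```

In a real pipeline the password should come from a secret store (e.g. an Automation credential asset or Key Vault) rather than being hard-coded.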
I have created a Cognitive Services account. I want to get the Cognitive Services account key and store it in a variable using a PowerShell script.
I have used the script below:
$resourceGroup = "Demo"
$AccountName = "DemoCs"
$Key = Get-AzCognitiveServicesAccountKey -ResourceGroupName $resourceGroup -Name $AccountName
Write-Host "account key 1 = " $Key
After executing the script, the result is:
2020-05-20T08:30:31Z [Information] INFORMATION: account key 1 = Microsoft.Azure.Commands.Management.CognitiveServices.Models.PSCognitiveServicesAccountKeys
The above script is able to list keys in Cloud Shell but not in a PowerShell Function App.
You cannot simply call Import-Module Az.CognitiveServices as on your local machine, where you have previously installed Az.CognitiveServices. Instead, you have to copy all the files from that locally installed package into a specific folder inside your Function App in Azure.
1. Install Az.CognitiveServices locally and go to its module folder to get all of its contents.
2. Go to your Function App's Kudu console. Click CMD > site > wwwroot > yourFunctionName, then create a directory called modules.
3. Drag and drop all the files from your local PowerShell module location into the Function App folder created above (modules).
4. Import the Az.CognitiveServices PowerShell module in the run.ps1 file, as in the example below:
Import-Module "D:\home\site\wwwroot\HttpTrigger1\modules\Az.CognitiveServices.psd1"
5. Then run the script as below, using $Key.Key1 to select Key1:
$Key = Get-AzCognitiveServicesAccountKey -ResourceGroupName $resourceGroup -Name $AccountName
Write-Host "account key 1 = " $Key.Key1
For more details, you could refer to this tutorial and this one.
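As an aside (not part of the answer above): PowerShell Function Apps also support managed dependencies, so declaring the module in requirements.psd1 at the app root can avoid copying module files by hand. A sketch; the version range is an example:

```powershell
# requirements.psd1 — managed-dependency declaration (version range is illustrative).
@{
    'Az.CognitiveServices' = '1.*'
}
```

With managedDependency enabled in host.json, the runtime downloads the module automatically and Import-Module by name works in run.ps1.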
I want to download a file that is in a FileShare on my Azure File storage onto my release pipeline agent.
Inside the release pipeline I am using a PowerShell step and run the command:
Start-AzStorageFileCopy -SrcShareName "report.xml" -SrcFilePath "." -DestFilePath "$(System.DefaultWorkingDirectory)" -DestShareName "report.xml" -Context $(context)
It's now asking me for a parameter -Name:
2020-05-09T01:43:34.1007773Z ##[error]Cannot process command because of one or more missing mandatory parameters: Name.
Basically my plan is to use this file for a test report in a release pipeline. Therefore I need this file to be used in a Publish Test Result step.
Since you are trying to download a single report.xml file from the Azure File Share, use the Get-AzureStorageFileContent command directly.
Sample:
$ctx = New-AzureStorageContext -StorageAccountName [storageaccountname] -StorageAccountKey [storageaccountkey]
##sharename is your existing share name.
$s = Get-AzureStorageShare [sharename] -Context $ctx
##To download a file from the share to the local computer, use Get-AzureStorageFileContent.
Get-AzureStorageFileContent -Share $s -Path [path to file on the share] -Destination [path on local computer]
If you want to download multiple files with one command, you could use AzCopy.
For more details, please take a look at this blog.
You are looking to download the file locally from the release agent job, so I would stick to the command below:
Get-AzStorageFileContent -Context $Context -ShareName "acishare" -Path "report.xml" -Destination $(System.DefaultWorkingDirectory)
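Putting the pieces together, a minimal inline script for the pipeline step might look like this. The storage account name and the secret variable holding the key are assumptions, not values from the question:

```powershell
# Sketch: build a storage context, then download report.xml to the agent's
# working directory. 'mystorageacct' and STORAGE_KEY are placeholders; map
# the key in as a secret pipeline variable.
$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey $env:STORAGE_KEY
Get-AzStorageFileContent -Context $ctx -ShareName 'acishare' -Path 'report.xml' -Destination "$(System.DefaultWorkingDirectory)"
```

The downloaded report.xml can then be consumed by a Publish Test Results step in the same stage.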