I'm new to Azure DevOps (but not Azure or PowerShell). I've got two scripts as tasks in my pipeline; the first runs perfectly (note: no Az module commands). The second one fails (it has one Az call). The error I get in the pipeline is:
ParserError: /home/vsts/work/_temp/3f7b3ce1-6afb-46f3-b00d-1efe35fbac71.ps1:5
Line |
5 | } else {
| ~
| Unexpected token '}' in expression or statement.
Here is the thing: I don't have '} else {' in my script, or at least I removed it and still got the same error.
So whatever is causing this is more fundamental than my script. I assumed that '/home/vsts/work/_temp/3f7b3ce1-6afb-46f3-b00d-1efe35fbac71.ps1' was my script copied to the remote system, but that doesn't seem to be the case.
Is there any way I can find out what that file is?
The task is 'PowerShell@2' with 'pwsh: true' set.
Thanks!
YAML:
trigger:
- main
pool:
  vmImage: ubuntu-latest
steps:
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: 'Pay-As-You-Go(<guid>)'
    subscriptionId: '<guid>'
    action: 'Create Or Update Resource Group'
    resourceGroupName: 'project-Test'
    location: 'East US 2'
    templateLocation: 'Linked artifact'
    csmFile: 'deploy.template.json'
    csmParametersFile: 'deploy.parameters.json'
    deploymentMode: 'Incremental'
    deploymentOutputs: 'DeploymentOutput'
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: $(System.DefaultWorkingDirectory)/Get-ArmDeploymentOutput.ps1 -ArmOutputString '$(DeploymentOutput)' -MakeOutput -ErrorAction Stop
    pwsh: true
  displayName: Get-ArmDeploymentOutput
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: $(System.DefaultWorkingDirectory)/Set-AzWebAppIPRestriction.ps1 -Priority 100 -Action 'Allow' -WebAppId '$(WebAppId)' -PipId '$(gwpipId)'' -ErrorAction Stop
    pwsh: true
  displayName: Set-AzWebAppIPRestriction
I found there is an extra ' in -PipId '$(gwpipId)'' in the script of your second PowerShell task. Maybe that caused the error.
script: $(System.DefaultWorkingDirectory)/Set-AzWebAppIPRestriction.ps1 -Priority 100 -Action 'Allow' -WebAppId '$(WebAppId)' -PipId '$(gwpipId)'' -ErrorAction Stop
Since you are running your scripts from .ps1 files, you should change the targetType parameter of the PowerShell task to filePath and set the .ps1 file in the filePath parameter. See the example below:
steps:
- task: PowerShell@2
  displayName: 'PowerShell Script'
  inputs:
    targetType: filePath
    filePath: '$(System.DefaultWorkingDirectory)/Set-AzWebAppIPRestriction.ps1'
    arguments: '-Priority 100 -Action "Allow" -WebAppId "$(WebAppId)" -PipId "$(gwpipId)" -ErrorAction Stop'
    pwsh: true
If you are using the Azure PowerShell module in your scripts, you can use the Azure PowerShell task instead of the PowerShell task.
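For reference, a minimal sketch of that Azure PowerShell alternative. The service connection name here is a placeholder, not something from the original pipeline:

```yaml
- task: AzurePowerShell@5
  displayName: Set-AzWebAppIPRestriction
  inputs:
    # 'MyArmConnection' is a placeholder; use your ARM service connection name
    azureSubscription: 'MyArmConnection'
    ScriptType: 'FilePath'
    ScriptPath: '$(System.DefaultWorkingDirectory)/Set-AzWebAppIPRestriction.ps1'
    ScriptArguments: '-Priority 100 -Action "Allow" -WebAppId "$(WebAppId)" -PipId "$(gwpipId)"'
    azurePowerShellVersion: 'LatestVersion'
    pwsh: true
```

The task signs in with the Az module before your script runs, so the script's single Az call does not need its own Connect-AzAccount.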
Related
I am deploying a database to an Azure SQL Server with Azure Pipelines. The deployment is working really well, so I added the creation and deployment of a temporary database (one that is removed at the end of the pipeline) for unit testing of my USPs.
I created a unit test .dll with SSDT. I configured the app.config and tested it on my laptop in Visual Studio. On my laptop it works: the unit tests are run and all pass.
I tried to connect to the server with the credentials from the app.config in SSMS. The connection is established.
But I can't get the pipeline (or hosted agent) task to connect to the database from within the pipeline.
I tried adding the IP of the agent to the server whitelist, and I checked "allow Azure services to access the database". Neither solved the problem.
The error I get is this:
Initialization method BuilEnDeployTest_Unittest.SqlServerUnitTest001.TestInitialize threw exception. System.Data.SqlClient.SqlException: System.Data.SqlClient.SqlException: A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.) ---> System.ComponentModel.Win32Exception: An existing connection was forcibly closed by the remote host.
pipeline yml:
trigger:
- test

name: $(TeamProject)_$(Date:yyyyMMdd)$(Rev:_r)

variables:
- group: "DatabaseDeploy"
- name: databaseName
  value: sqldb-AzureDevOps_$(Build.BuildNumber)
#- name: ExecutionContext.__Database__
#  value: $(variables.databaseName)

stages:
### Building the code in the Pull-Request
- stage: Build
  displayName: Build the project
  jobs:
  - job: Build
    displayName: Build DB project
    pool:
      vmImage: windows-latest
    steps:
    - script: echo '$(Build.BuildNumber)'
      displayName: BuildName
    #- powershell:
    #    Get-ChildItem -Path $(Agent.WorkFolder)\1\s\BuilEnDeployTest_Unittest -recurse
    #  displayName: Folder structure
    #- script: echo '$(Agent.WorkFolder)\1\s\BuilEnDeployTest_Unittest\app.config'
    #  displayName: app.config
    - powershell:
        Get-Content -path $(Agent.WorkFolder)/1/s/BuilEnDeployTest_Unittest/app.config
      displayName: app.config
    - powershell: >
        (Get-Content -path $(Agent.WorkFolder)/1/s/BuilEnDeployTest_Unittest/app.config -Raw)
        -replace 'MyDatabase', '$(databaseName)'
        -replace 'MyServer', '$(ServerName_unittst).database.windows.net'
        -replace 'MyUser', '$(DatabaseUser_unittst)'
        -replace 'MyPassword', '$(DatabasePassword_unittst)' | Set-Content -Force -Path $(Agent.WorkFolder)/1/s/BuilEnDeployTest_Unittest/app.config
    - powershell:
        Get-Content -path $(Agent.WorkFolder)/1/s/BuilEnDeployTest_Unittest/app.config
      displayName: app.config
    - task: VSBuild@1
      displayName: Building the database project
      inputs:
        solution: '**\*.sln'
    - task: CopyFiles@2
      displayName: Copy Build Artifacts
      inputs:
        SourceFolder: '$(agent.builddirectory)\s'
        Contents: '**'
        TargetFolder: '$(build.artifactstagingdirectory)'
    ### In the commands below, 'pipeline' is used instead of 'Container', although
    ### 'Container' is often seen in the online documentation.
    ### I have noticed that changing 'pipeline' to 'Container' breaks my code.
    ### I haven't spent time figuring out why. It could be because of the path I use
    ### down the line.
    - task: PublishPipelineArtifact@1
      displayName: Publish Build Artifacts
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'database'
        publishLocation: 'pipeline'
#### Unit testing the Pull-Request
- stage: Unit_testing
  displayName: Database UnitTest
  condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest'))
  jobs:
  - job: Deploy_to_temporary_database
    displayName: Deploy to temporary database
    pool:
      vmImage: windows-latest
    steps:
    ############ Create the Azure resources as defined in AzureSQLARMTemplate.json
    ## When the Azure SQL server does not exist it is also created by the ARM deployment.
    ## NOTE: I experienced errors when deleting and recreating the Azure SQL server. This could be
    ## because of the short time between deletion and creation.
    ## Deployment through the ARM template can deal with an existing server. In that case
    ## the server creation is skipped. So it is best not to delete the Azure SQL server until the project
    ## is abandoned.
    - task: AzurePowerShell@5
      displayName: 'Create unittest DB'
      inputs:
        azureSubscription: $(connectedServiceNameARM)
        ScriptType: 'FilePath'
        ScriptPath: './ARMtemplate/CreateAzureSQLAndDB.ps1'
        ScriptArguments: -azuresqlserverName $(ServerName_unittst) -sqlserverAdminLogin $(DatabaseUser_unittst) -sqlserverAdminPassword $(DatabasePassword_unittst) -databaseName $(databaseName) -Location 'northeurope' -ResourceGroupName 'rg-temp' -TemplateFile ARMtemplate\AzureSQLARMTemplate.json
        azurePowerShellVersion: 'LatestVersion'
    ############ Download the build artifacts from the build stage
    - task: DownloadPipelineArtifact@2
      displayName: 'Download artifacts'
      inputs:
        buildType: 'current'
        artifactName: 'database'
        targetPath: '$(Pipeline.Workspace)/database'
    - powershell:
        Get-ChildItem -Path $(Pipeline.Workspace)/database -recurse
      displayName: Folder structure
    - powershell:
        Get-Content -path $(Pipeline.Workspace)/database/a/BuilEnDeployTest_Unittest/bin/Debug/BuilEnDeployTest_Unittest.dll.config
      displayName: BuilEnDeployTest_Unittest.dll.config
    ############ Deploy the database dacpac file to the newly created database
    - task: SqlAzureDacpacDeployment@1
      displayName: 'Deploy unittest DB'
      inputs:
        connectedServiceNameARM: $(connectedServiceNameARM)
        ServerName: $(ServerName_unittst).database.windows.net
        DatabaseName: $(databaseName)
        SqlUsername: $(DatabaseUser_unittst)
        SqlPassword: $(DatabasePassword_unittst)
        DacpacFile: $(Pipeline.Workspace)/database/a/$(dacpacfile_location)
        PublishProfile: $(Pipeline.Workspace)/database/a/$(publishprofile_location)
    ## Create a firewall rule for the IP address of the build agent
    - task: AzurePowerShell@5
      displayName: 'Add buildserver public ip'
      inputs:
        azureSubscription: $(connectedServiceNameARM)
        ScriptType: InlineScript
        Inline: |
          $ip = (Invoke-WebRequest -uri "http://ifconfig.me/ip").Content
          $AzureSQLFirewallRule = Get-AzSqlServerFirewallRule -ResourceGroupName 'rg-temp' -ServerName $(ServerName_unittst) -FirewallRuleName "azuredevops"
          if ($AzureSQLFirewallRule -eq $null) {New-AzSqlServerFirewallRule -ResourceGroupName 'rg-temp' -ServerName $(ServerName_unittst) -FirewallRuleName "azuredevops" -StartIpAddress $ip -EndIpAddress $ip}
        azurePowerShellVersion: 'LatestVersion'
    ############ Run the unittest
    - task: VSTest@2
      inputs:
        testAssemblyVer2: $(Pipeline.Workspace)/database/a/$(Unittestdll_location)
        testConfiguration: 'Debug'
        runOnlyImpactedTests: true
        runInParallel: false
    ############ Delete the newly created database
    - task: AzurePowerShell@5
      displayName: 'Delete unittest DB'
      inputs:
        azureSubscription: $(connectedServiceNameARM)
        ScriptType: 'FilePath'
        ScriptPath: './ARMtemplate/CleanupAzureSQLAndDB.ps1'
        ScriptArguments: -azuresqlserverName $(ServerName_unittst) -databaseName $(databaseName) -ResourceGroupName 'rg-temp'
        azurePowerShellVersion: 'LatestVersion'
    ## Delete the firewall rule for the IP address of the build agent
    - task: AzurePowerShell@5
      displayName: 'remove buildserver public ip'
      inputs:
        azureSubscription: $(connectedServiceNameARM)
        ScriptType: InlineScript
        Inline: |
          $ip = (Invoke-WebRequest -uri "http://ifconfig.me/ip").Content
          Remove-AzSqlServerFirewallRule -ResourceGroupName 'rg-temp' -ServerName $(ServerName_unittst) -FirewallRuleName "azuredevops"
        azurePowerShellVersion: 'LatestVersion'
## Deploying the merged result to the database
- stage: Deploy_to_tst
  displayName: Deploy
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
  jobs:
  - job: Deploy
    displayName: Deploy
    pool:
      vmImage: windows-latest
    steps:
    - task: DownloadPipelineArtifact@2
      inputs:
        buildType: 'current'
        artifactName: 'database'
        targetPath: '$(Pipeline.Workspace)/database'
    - powershell:
        #Get-ChildItem -Path $(Pipeline.Workspace)/database -recurse
        #Get-ChildItem -Path $(Agent.Workspace) -recurse
    - task: SqlAzureDacpacDeployment@1
      displayName: 'Deploying the database to Azure'
      inputs:
        connectedServiceNameARM: $(connectedServiceNameARM)
        ServerName: $(Server_tst)
        DatabaseName: $(DatabaseName_tst)
        SqlUsername: $(DatabaseUser_tst)
        SqlPassword: $(DatabasePassword_tst)
        DacpacFile: $(Pipeline.Workspace)/database/a/$(dacpacfile_location)
        PublishProfile: $(Pipeline.Workspace)/database/a/$(publishprofile_location)
For the hosted agent, check if you can create a local DB with an admin username and password. You can use these credentials later to connect to the database.
The normal approach is usually just to test whether your SQL project is deployable: you create a local DB and nuke it afterwards.
For your unit tests you can always keep a developer database in your release pipeline where you run your test cases.
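As a rough sketch of that local-DB idea on a windows-latest hosted agent (this assumes the SqlLocalDB tooling available on the Windows images; the instance name is arbitrary):

```yaml
- powershell: |
    # Create and start a throwaway LocalDB instance on the agent
    sqllocaldb create UnitTestDB
    sqllocaldb start UnitTestDB
    # Tests can then connect with: Server=(localdb)\UnitTestDB;Integrated Security=true
  displayName: Create local test database
```

Deploying the dacpac to an instance like this sidesteps the Azure SQL firewall entirely for the unit-test stage.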
parameters:
- name: AzureSubscription
  default: 'abc'
- name: BlobName
  type: string
  default: ""
stages:
- stage: MyStage
  displayName: 'My Stage'
  variables:
  - name: sas
  jobs:
  - job: ABC
    displayName: ABC
    steps:
    - task: AzureCLI@2
      displayName: 'XYZ'
      inputs:
        azureSubscription: ${{ parameters.AzureSubscription }}
        scriptType: pscore
        arguments:
        scriptLocation: inlineScript
        inlineScript: |
          $sas=az storage account generate-sas --account-key "mykey" --account-name "abc" --expiry (Get-Date).AddHours(100).ToString("yyyy-MM-dTH:mZ") --https-only --permissions rw --resource-types sco --services b
          Write-Host "My Token: " $sas
    - task: PowerShell@2
      inputs:
        targetType: 'filepath'
        filePath: $(System.DefaultWorkingDirectory)/psscript.ps1
        arguments: >
          -Token "????"
          -BlobName "${{parameters.BlobName}}"
      displayName: 'some work'
In this Azure DevOps YAML I have created two tasks: AzureCLI@2 and PowerShell@2.
In AzureCLI@2 I get a value in the $sas variable. Write-Host confirms that, but $sas does not get passed as a parameter to the PowerShell@2 script file.
"${{parameters.BlobName}}" is working fine; in PowerShell I am able to read that value.
How do I pass the sas variable's value?
Tried:
-Token $sas # did not work
-Token "${{sas}}" # did not work
Different tasks in an Azure Pipeline don't share a common runspace that would allow them to preserve or pass on variables.
For this reason Azure Pipelines offers special logging commands that allow you to take string output from a task and update an Azure Pipeline variable that can be used in subsequent tasks: Set variables in scripts (Microsoft Docs).
In your case you would use a logging command like this to make your sas token available to the next task:
Write-Host "##vso[task.setvariable variable=sas]$sas"
In the argument of your subsequent task (within the same job), use the variable syntax of Azure Pipelines:
-Token '$(sas)'
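Applied to the pipeline in the question, the two tasks would look roughly like this (a sketch; the generate-sas arguments are abbreviated and the expiry is a placeholder):

```yaml
- task: AzureCLI@2
  inputs:
    azureSubscription: ${{ parameters.AzureSubscription }}
    scriptType: pscore
    scriptLocation: inlineScript
    inlineScript: |
      $sas = az storage account generate-sas --account-name "abc" --permissions rw --services b --resource-types sco --expiry "2030-01-01T00:00Z" --https-only
      # Publish the value as the pipeline variable 'sas' for subsequent tasks
      Write-Host "##vso[task.setvariable variable=sas]$sas"
- task: PowerShell@2
  inputs:
    targetType: 'filepath'
    filePath: $(System.DefaultWorkingDirectory)/psscript.ps1
    arguments: >
      -Token '$(sas)'
      -BlobName "${{parameters.BlobName}}"
```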
In Azure DevOps we have the following YAML pipeline, which is applying Terraform configuration from a CmdLine task.
The output task should return the ObjectId of a Data Factory after it is deployed by Terraform.
I would like to take that ObjectId and pass it to the next Azure PowerShell task as a parameter, so I can add that Id as a member of an Azure AD group.
How can I use the output from the step called 'Terraform output' in the next PowerShell task?
- task: CmdLine@2
  displayName: Terraform Apply
  enabled: False
  inputs:
    script: terraform apply -auto-approve -input=false tfplan
    workingDirectory: infrastructure/tf_scripts/dev
- task: CmdLine@2
  displayName: Terraform output
  enabled: False
  inputs:
    script: |
      terraform output adf_objectid
    workingDirectory: infrastructure/tf_scripts/dev
- task: AzurePowerShell@4
  displayName: 'Azure PowerShell script: InlineScript'
  inputs:
    azureSubscription: 'a6cb1cd3-8d5e-4db6-8af5-bcb66492d5cc'
    ScriptType: 'InlineScript'
    Inline: |
      $spn=(terraform output adf_objectid)
      Connect-AzureAD -AadAccessToken $aadToken -AccountId $context.Account.Id -TenantId $context.tenant.id -MsAccessToken $graphToken
      Add-AzureADGroupMember -ObjectId xxxxx-xxxxx-xxxxx -RefObjectId $spn
    workingDirectory: wd/scripts/dev
    azurePowerShellVersion: 'LatestVersion'
Passing Terraform output to a PowerShell task in Azure DevOps
You could try to use the Logging Command to set the adf_objectid as an Azure DevOps pipeline variable (quoted, so that a Bash agent does not treat ## as the start of a comment):
echo "##vso[task.setvariable variable=spn]$(terraform output adf_objectid)"
Check the similar thread for some details.
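Put together (task names and paths taken from the question), the output step and the consuming step might look like this sketch. Note that on newer Terraform versions you may want terraform output -raw so the value is not wrapped in quotes:

```yaml
- task: CmdLine@2
  displayName: Terraform output
  inputs:
    script: |
      # Capture the Terraform output and publish it as the pipeline variable 'spn'
      echo "##vso[task.setvariable variable=spn]$(terraform output adf_objectid)"
    workingDirectory: infrastructure/tf_scripts/dev
- task: AzurePowerShell@4
  inputs:
    azureSubscription: 'a6cb1cd3-8d5e-4db6-8af5-bcb66492d5cc'
    ScriptType: 'InlineScript'
    Inline: |
      # $(spn) is expanded by the pipeline before the script runs
      Add-AzureADGroupMember -ObjectId xxxxx-xxxxx-xxxxx -RefObjectId '$(spn)'
    azurePowerShellVersion: 'LatestVersion'
```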
Here is an extract from a YAML pipeline in Azure DevOps:
- task: AzureCLI@2
  name: GetAppInsightsConnString
  displayName: 'Get AppInsights ConnectionString'
  inputs:
    azureSubscription: ${{ parameters.TelemetryAzureSubscription }}
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az extension add -n application-insights
      az feature register --name AIWorkspacePreview --namespace microsoft.insights
      $resourceInfo = az monitor app-insights component show --app ${{ parameters.AppInsightsResourceName }} --resource-group ${{ parameters.AppInsightsResourceGroupName }}
      $instrumentationKey = ($resourceInfo | ConvertFrom-Json).InstrumentationKey
      echo "##vso[task.setvariable variable=ApplicationInsightsInstrumentationKey]$instrumentationKey"
- task: FileTransform@2
  displayName: "Replace Parameters From Variables"
  inputs:
    folderPath: '$(Pipeline.Workspace)'
    xmlTransformationRules: ''
    jsonTargetFiles: '**/${{ parameters.ArmTemplateParameters }}'
- powershell: 'Get-Content $(Pipeline.Workspace)/${{ parameters.ArtifactName }}-provisioning/${{ parameters.ArmTemplateParameters }}'
  displayName: 'Preview Arm Template Parameters File'
- task: PowerShell@2
  displayName: "TEMP: Test new variable values"
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "ApplicationInsightsInstrumentationKey: $(ApplicationInsightsInstrumentationKey)"
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    deploymentScope: 'Resource Group'
    ConnectedServiceName: ${{ parameters.AzureSubscription }}
    action: 'Create Or Update Resource Group'
    resourceGroupName: ${{ parameters.ResourceGroupName }}
    location: $(locationLong)
    templateLocation: 'Linked artifact'
    csmFile: '$(Pipeline.Workspace)/${{ parameters.ArtifactName }}-provisioning/${{ parameters.ArmTemplate }}'
    csmParametersFile: '$(Pipeline.Workspace)/${{ parameters.ArtifactName }}-provisioning/${{ parameters.ArmTemplateParameters }}'
    overrideParameters: '–applicationInsightsInstrumentationKey "$(ApplicationInsightsInstrumentationKey)"'
    deploymentMode: 'Incremental'
This is connecting to an App Insights instance, getting the instrumentation key, then doing a variable replacement on an ARM parameters file before previewing it and deploying it.
The instrumentation key is written to an ApplicationInsightsInstrumentationKey pipeline variable, and you can see a later task which previews this in the pipeline logs, so I can confirm the variable is being set as expected.
On the final task I'm using the overrideParameters option to feed this key into the deployment as the value of the applicationInsightsInstrumentationKey parameter. This is where the pipeline fails, with the error:
##[error]One of the deployment parameters has an empty key. Please see https://aka.ms/resource-manager-parameter-files for details.
My web searching tells me this can occur when the value has spaces and isn't enclosed in double quotes, but neither of those is the case here. In fact I can even replace that line with a hard-coded value and I still get the same issue.
If I remove that overrideParameters line the deployment succeeds, but obviously the parameter I want isn't included.
Anyone know how to solve this?
As shown by the help dialog on the ARM template deployment ADO task, only parameter values consisting of multiple words need to be enclosed in double quotes.
Since applicationInsightsInstrumentationKey will not have multiple words, try changing the line as below:
overrideParameters: '–applicationInsightsInstrumentationKey $(ApplicationInsightsInstrumentationKey)'
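One more thing worth checking, since even a hard-coded value reproduced the error: the overrideParameters lines above start the parameter name with an en dash (–) rather than an ASCII hyphen (-). The task only recognizes the ASCII hyphen as a parameter prefix, and a non-ASCII dash can itself produce the 'empty key' error. A corrected sketch:

```yaml
overrideParameters: '-applicationInsightsInstrumentationKey $(ApplicationInsightsInstrumentationKey)'
```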
I'm trying to use an Azure pipeline to upload a certificate and bind it to the app service.
First I used a DEV stage and everything worked well. Then I had to create a new stage for the QUAL environment, so I cloned a new stage from the DEV stage and updated the variables, but when we run the pipeline it cannot find the certificate file I uploaded.
My download task is:
steps:
- task: DownloadSecureFile@1
  displayName: 'Download ***.**.com Certificate for API App'
  inputs:
    secureFile: dev.pfx
Then I use an Azure PowerShell task, but this error happens in my script:
Certificate does not exist at path D:\a\_temp/
It seems it cannot find the downloaded file on the agent.
Upload task:
steps:
- task: AzurePowerShell@3
  displayName: 'Upload Certificate to API app and Bind Domain'
  inputs:
    azureSubscription: 'Azure: CDA NextGen DEV'
    ScriptPath: '$(System.DefaultWorkingDirectory)/CdaApi-ArmTemplates/ArmTemplates/InstallSSLAndCustomDomain.ps1'
    ScriptArguments: '-ResourceGroupName $(ResourceGroupName) -AppServiceName $(ApiSiteName) -CustomDomains $(ApiHostName) -CertificatePassword $(Password) -CertificateFileName $(CertificateFileName)'
    azurePowerShellVersion: LatestVersion
PowerShell script:
$CertificateFilePath = $env:AGENT_TEMPDIRECTORY + "/" + $CertificateFileName
$ResourceGroupName -ResourceType Microsoft.Web/sites -ApiVersion 2014-11-01
if ([System.IO.File]::Exists($CertificateFilePath))
{
Write-Host ("Certificate found at {0}" -f $CertificateFilePath)
}
else
{
Write-Error ("Certificate does not exist at path {0}" -f $CertificateFilePath)
throw
}
How to check it?
Updated:
Based on your comment, the file does exist there. So consider your PowerShell script together with your error message.
Since you only shared part of your YAML, I cannot know how you define your variables. Please make sure your CertificateFileName variable has been stored and passed to PowerShell successfully.
The complete file name should be displayed in your PowerShell error message even if the file does not exist at that path; your message ends at the directory, which suggests the variable arrived empty.
In fact, it is very easy to cause issues like this after changing the agent environment used.
After the DownloadSecureFile task executes, it generates an output variable named secureFilePath. You just need to give the task a name and use the output variable directly in your PowerShell script.
Small changes to your YAML and PowerShell script:
YAML:
steps:
- task: DownloadSecureFile@1
  displayName: 'Download ***.**.com Certificate for API App'
  name: Path
  inputs:
    secureFile: dev.pfx
- task: AzurePowerShell@3
  displayName: 'Upload Certificate to API app and Bind Domain'
  inputs:
    azureSubscription: 'Azure: CDA NextGen DEV'
    ScriptPath: '$(System.DefaultWorkingDirectory)/CdaApi-ArmTemplates/ArmTemplates/InstallSSLAndCustomDomain.ps1'
    ScriptArguments: '-ResourceGroupName $(ResourceGroupName) -AppServiceName $(ApiSiteName) -CustomDomains $(ApiHostName) -CertificatePassword $(Password) -CertificateFileName $(CertificateFileName) -SecureFilePath $(Path.secureFilePath)'
    azurePowerShellVersion: LatestVersion
PowerShell (with a param() block matching the named arguments passed above):
param(
    [string]$ResourceGroupName,
    [string]$AppServiceName,
    [string]$CustomDomains,
    [string]$CertificatePassword,
    [string]$CertificateFileName,
    [string]$SecureFilePath
)
$CertificateFilePath = $SecureFilePath
$ResourceGroupName -ResourceType Microsoft.Web/sites -ApiVersion 2014-11-01
if ([System.IO.File]::Exists($CertificateFilePath))
{
Write-Host ("Certificate found at {0}" -f $CertificateFilePath)
}
else
{
Write-Error ("Certificate does not exist at path {0}" -f $CertificateFilePath)
throw
}