When trying to run a simple Azure CLI task in my Ubuntu-based Azure DevOps pipeline, I get the following error message:
##[error]Script failed with error: Error: Unable to locate executable file:
'/home/vsts/work/_temp/azureclitaskscript1637831708745.bat'.
Please verify either the file path exists or the file can be found within a directory specified by the PATH environment variable.
Also check the file mode to verify the file is executable.
If I'm reading this correctly, the inline script is not being found, right? What am I missing here? Here's the full YAML:
trigger:
- main
- dev

pool:
  vmImage: ubuntu-latest

steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: 'My Subscription Name'
    scriptType: 'batch'
    scriptLocation: 'inlineScript'
    inlineScript: 'az --version'
You're running your task on Linux, so you can't use batch there; you'll need to pick pscore or bash instead, or switch to a windows-latest vmImage.
Script Type*: Select the type of script to be executed on the agent. Task supports four types: Batch / Shell / PowerShell / PowerShell Core scripts, default selection being empty. Select Shell/PowerShell Core script when running on Linux agent or Batch/PowerShell/PowerShell Core script when running on Windows agent. PowerShell Core script can run on cross-platform agents (Linux, macOS, or Windows)
https://github.com/microsoft/azure-pipelines-tasks/tree/master/Tasks/AzureCLIV2#parameters-of-the-task
The actual strings to pass as arguments can be found in the task.json. It looks like the task doesn't do any validation on these fields, which is what causes this confusing error.
"options": {
"ps": "PowerShell",
"pscore": "PowerShell Core",
"batch": "Batch",
"bash": "Shell"
}
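For reference, a minimal corrected version of the task from the question, switching the script type to bash (pscore would also work); the subscription name is the placeholder from the question:

- task: AzureCLI@2
  inputs:
    azureSubscription: 'My Subscription Name'
    scriptType: 'bash'            # 'batch' only works on Windows agents
    scriptLocation: 'inlineScript'
    inlineScript: 'az --version'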
Related
I am using an Azure pipeline with self-hosted Windows agents to run a local script, init.cmd, which mainly sets up aliases with doskey. I am using the following YAML file to configure the pipeline:
trigger:
- master

pool: $(AGENT_POOL)

steps:
- script: |
    init.cmd
    build
  displayName: 'init.cmd'
  workingDirectory: $(CODE_BASE)
However, the alias setting does not seem to work:
'build' is not recognized as an internal or external command
I am trying to use AutoIt in my automated tests for the project. Locally I am able to register the COM library using regsvr32, but when I try to do the same from my Azure pipeline, the script runs indefinitely.
My Azure pipeline YAML is as follows:
- job: Tests
  displayName: Automated Tests
  pool:
    vmImage: "windows-latest"
  steps:
  - task: NuGetToolInstaller@1
  - task: DotNetCoreCLI@2
    displayName: Restore Packages
    inputs:
      command: 'restore'
      projects: 'Free.Automation/Free.Automation.csproj'
      feedsToUse: 'config'
      nugetConfigPath: 'Free.Automation/nuget.config'
  - task: BatchScript@1
    displayName: Register AutoIT
    inputs:
      filename: 'Free.Automation/autoit.bat'
  - task: MSBuild@1
    inputs:
      solution: "Free.Automation/Free.Automation.sln"
And this is the bat file I am using:
cd c:\windows\system32
regsvr32 C:\Users\%USERNAME%\.nuget\packages\autoitx.dotnet\3.3.14.5\build\AutoItX3.dll
I verified that the path of the Azure pipeline workspace is something like D:\1\a\s, but I'm not sure how the directory works.
Could anyone help me register the COM library in the Azure-hosted pipeline workspace?
With an Azure DevOps Microsoft-hosted agent, you can't access your local files directly. So if you want to use COM DLLs, you need to include them in your source repository.
I recommend that you keep a lib folder to store your DLLs in. Please make sure that your DLLs are referenced correctly, with a relative path, in the .csproj.
I verified that the path of the Azure pipeline workspace is something like D:\1\a\s, but I'm not sure how the directory works.
In Azure DevOps, you can use the predefined variable $(System.DefaultWorkingDirectory) to get the local path on the agent where your source code files are downloaded. That's the "Azure pipeline workspace" D:\1\a\s you mentioned.
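As a rough sketch of what that looks like in the pipeline (the lib folder path below is an assumption, not something from the question), the registration step could then point at the checked-in DLL inside the agent workspace; the /s switch suppresses regsvr32's result dialog, which would otherwise block a non-interactive agent:

# hypothetical step: register the DLL that is committed to the repository
- script: regsvr32 /s "$(System.DefaultWorkingDirectory)\Free.Automation\lib\AutoItX3.dll"
  displayName: 'Register AutoItX COM library'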
I have a normal, working release pipeline that, given a certain deployment group, performs some tasks:
Copies a script
Executes that PowerShell script (on the target machines defined in the Deployment Group)
Deletes the script
I know that YAML doesn't support deployment groups, but (lucky me!) so far my deployment group has only one machine, let's call it MyTestVM.
So what I am trying to achieve is simply executing a PowerShell script on that VM. Normally, what happens with the release pipeline is that you have a tentacle/release agent installed on the VM, your deployment target (which is inside the Deployment Group) is hooked up to it, and your release pipeline (thanks to the Deployment Group specification) is able to use that release agent on the machine and do whatever it wants on the VM itself.
I need the same... but through YAML! I know there is a PowerShellOnTargetMachines task available in YAML, but I don't want to use that. It uses PSSession, it requires SSL certificates and many other things. I just want to use the already existing agent on the VM!
What I have in place so far:
pool: 'Private Pool'

steps:
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'specific'
    project: 'blahblah'
    definition: 'blah'
    buildVersionToDownload: 'latest'
    targetPath: '$(Pipeline.Workspace)'
- task: CopyFiles@2
  displayName: 'Copy Files to: C:\TestScript'
  inputs:
    SourceFolder: '$(Pipeline.Workspace)/Scripts/'
    Contents: '**/*.ps1'
    TargetFolder: 'C:\TestScript'
    CleanTargetFolder: true
    OverWrite: true
The first part just downloads the artifact containing my script. And to be honest, I am not even sure I need the copy in the second part: first, because I don't think it copies the script to the target VM's workspace, but rather to the VM where the Azure Pipelines agent is installed; and second, because I think I can just reference the script from my artifact. But this is not the important part.
How can I make my YAML pipeline use the release agent installed on the VM in the same way that a normal release pipeline does?
I somehow reached a solution. First of all, it's worth mentioning that since deployment groups don't work with YAML pipelines, the way to proceed is to create an Environment and add your target VM to it as a resource.
So I didn't need to create my own hosted agent or anything special, since the problem was the target itself and not the agent running the pipeline.
By creating an Environment and adding a resource (in my case a VM) to that environment, we also create a new release agent on the target itself. So my target VM now has two release agents: the old one, which can be used by normal release pipelines, and the new one, attached to the Environment resource in Azure DevOps, which can be used by YAML pipelines.
Now I am finally able to hit my VM:
- stage: PerformScriptInVM
  jobs:
  - deployment: VMDeploy
    pool:
      vmImage: 'windows-latest'
    # watch out: this creates an environment if it doesn't exist
    environment:
      name: My Environment Name
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              buildType: 'specific'
              project: 'blahblahblah'
              definition: 'blah'
              buildVersionToDownload: 'latest'
              targetPath: '$(Pipeline.Workspace)'
          - task: PowerShell@2
            displayName: 'PowerShell Script'
            inputs:
              targetType: filePath
              filePath: '$(Pipeline.Workspace)/Scripts/TestScript.ps1'
              arguments: 'whatever your script needs..'
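A side note, not from the original answer: if the environment ever contains more than one resource, the deployment job can also be pinned to a single VM by resource name. A minimal sketch, assuming the VM was registered in the environment as MyTestVM (the name used in the question):

- deployment: VMDeploy
  environment:
    name: My Environment Name
    resourceName: MyTestVM        # assumed: the VM's resource name inside the environment
    resourceType: VirtualMachine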
To get the job to run on the specific release agent you want, you can do two things:
Create a pool and only put your release agent into it.
pool: 'My Pool with only one release agent'
Use an existing pool, and publish/demand a capability for your agent.
On the agent machine itself, add a system environment variable (for example, MyCustomCapability) and give it a value like 1.
Then your pipeline becomes:
pool:
  name: 'My pool with potentially more than one agent'
  demands: 'MyCustomCapability'
If only this agent has this environment variable set, then only this agent can execute the job.
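For completeness, a sketch of the list form of demands, which can also require the specific value set above (1) rather than just the variable's existence:

pool:
  name: 'My pool with potentially more than one agent'
  demands:
  - MyCustomCapability -equals 1   # require the capability to exist and to have the value 1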
As part of my YAML pipeline definition I have an AzurePowerShell@4 task; the following is an extract from my pipeline definition:
stages:
- stage: DeployDemoCluster
  jobs:
  - job: 'DeployAKSAndAll'
    pool:
      vmImage: 'windows-latest'
    steps:
    - task: AzurePowerShell@4
      displayName: Store AI instrumentation key for Inbound Processor in central KeyVault
      inputs:
        azureSubscription: 'service-connection'
        azurePowerShellVersion: LatestVersion
        pwsh: true
        ScriptType: 'FilePath'
        ScriptPath: 'AKS/ps/update_kv_firewall.ps1'
The issue is that, within my update_kv_firewall.ps1, all the PowerShell commands fail with errors like the following:
[error]Login-AzureRmAccount : The term 'Login-AzureRmAccount' is not recognized as the name of a cmdlet, function, script file, or operable program.
The script, when executed individually/standalone, works perfectly fine.
What am I missing here?
As per your comment: the command Get-AzKeyVault runs without any errors, while Get-AzureRmVirtualNetwork leads to errors.
That tells me you have the new Az module of Azure PowerShell installed, which is why a command like Get-AzKeyVault works.
Since you're using the Az module, please use commands from the Az module throughout. Almost every AzureRM command has an equivalent Az command, which you can find in the Az command list.
Note: commands like Get-AzureRmVirtualNetwork / Login-AzureRmAccount are from the AzureRM module, which will be retired later this year.
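As an illustration of the substitution (a sketch only, assuming the inline form of the task; the resource group name is a placeholder): the AzurePowerShell task already signs in through the service connection, so the script body normally only needs the Az equivalents of the old cmdlets.

- task: AzurePowerShell@4
  inputs:
    azureSubscription: 'service-connection'
    azurePowerShellVersion: LatestVersion
    pwsh: true
    ScriptType: 'InlineScript'
    Inline: |
      # Az equivalents of the failing AzureRM cmdlets:
      #   Login-AzureRmAccount      -> Connect-AzAccount (usually unnecessary here; the task signs in)
      #   Get-AzureRmVirtualNetwork -> Get-AzVirtualNetwork
      Get-AzVirtualNetwork -ResourceGroupName 'my-rg'   # 'my-rg' is a placeholder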
I have an Azure deployment pipeline that copies files to an Azure VM hosted on my account using this task:
- task: AzureFileCopy@3
  inputs:
    SourcePath: '$(Build.ArtifactStagingDirectory)'
    azureSubscription: 'qa-serverside-snapshots'
    Destination: 'AzureVMs'
    storage: 'serversidesnapshotsdiag'
    enableCopyPrerequisites: true
    resourceGroup: 'SERVERSIDE-SNAPSHOTS'
    vmsAdminUserName: $(username)
    vmsAdminPassword: $(password)
    TargetPath: 'C:\Git\serverside-snapshots'
The issue is that I run into errors about not being able to write to a file because it is in use by the service running on the machine. So what I want to do is run a bat script that stops the service on the Azure VM. Here was my attempt:
- task: BatchScript@1
  inputs:
    filename: '$(Build.SourcesDirectory)\stopPM2.bat'
    workingFolder: 'C:\Git\serverside-snapshots'
"stopPM2.bat" literally just contains pm2 stop myservice. The issue is, when the bat file gets ran, I get this error:
C:\Git\serverside-snapshots>pm2 stop myservice
'pm2' is not recognized as an internal or external command,
operable program or batch file.
##[error]Process completed with exit code 1.
I definitely have PM2 installed on the server I'm deploying to. It looks like the bat file isn't actually being run on the VM I'm deploying to, but on the temporary VM that the pipeline itself runs on. So my question is: how do I get a bat script to run on the VM I'm deploying to, so I can stop the service before copying files?
BatchScript runs on the build agent.
You need to use the PowerShell on Target Machines task or the SSH task to invoke something on a remote machine.
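A rough sketch of the first option, assuming WinRM/PowerShell remoting is reachable on the target VM and reusing the credential variables from the question; the machine address is a placeholder:

- task: PowerShellOnTargetMachines@3
  displayName: 'Stop the PM2-managed service before the copy'
  inputs:
    Machines: 'myvm.example.com'        # placeholder: the VM's DNS name or IP
    UserName: $(username)
    UserPassword: $(password)
    ScriptType: 'Inline'
    InlineScript: 'pm2 stop myservice'  # pm2 must be on the PATH of the remote session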