I need to use Java 14 on a hosted agent (macos-12) and I'm trying to install it through a script and then use the JavaToolInstaller task to make it available.
I have modified the script from https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/tool/java-tool-installer?view=azure-devops
jobs:
- job: RunApiTests
  pool:
    vmImage: macOS-12
  condition: ne(variables['Build.SourceBranch'], 'refs/heads/main')
  steps:
  - task: JavaToolInstaller@0
    displayName: Install Java 14
    inputs:
      versionSpec: '14'
      jdkArchitectureOption: 'x64'
      jdkSourceOption: AzureStorage
      azureResourceManagerEndpoint: {azureResourceManagerEndpoint}
      azureStorageAccountName: {azureStorageAccountName}
      azureContainerName: openjdk
      azureCommonVirtualFile: 'openjdk-14.0.2.zip'
      jdkDestinationDirectory: '$(agent.toolsDirectory)/jdk14'
      cleanDestinationDirectory: false
Below is the error I am getting:
##[error]JDK file is not valid. Verify if JDK file contains only one root folder with 'bin' inside.
I downloaded the 14.0.2 (build 14.0.2+12) JDK for Mac from https://jdk.java.net/archive/ and compressed only the Home folder into a zip file, because bin is inside it.
So, as per the requirement, the zip file contains only one root folder, 'Home', with bin inside.
I need help understanding why I'm getting this error.
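For reference, a quick way to sanity-check the archive layout (a sketch, not part of the original question; it builds a throwaway zip in memory rather than reading the real JDK archive) is Python's standard zipfile module:

```python
import io
import zipfile

# Build an in-memory zip mimicking the layout the task expects:
# exactly one root folder ('Home' here) with bin/ inside it.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("Home/bin/java", "stub")
    z.writestr("Home/lib/modules", "stub")

# List the distinct top-level entries; there should be exactly one.
with zipfile.ZipFile(buf) as z:
    roots = sorted({name.split("/")[0] for name in z.namelist()})
print(roots)  # ['Home']
```

Running the same listing against the real openjdk-14.0.2.zip would show whether the zip truly has a single root folder, or whether extra top-level entries (such as macOS resource-fork files like __MACOSX) slipped in during compression.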
I tried to test your YAML in my pipeline and it works. The following are my steps.
I downloaded this package:
In my zip, I have these files:
Uploaded to the storage account.
Yaml:
trigger:
- none

pool:
  vmImage: macos-12

steps:
- task: JavaToolInstaller@0
  inputs:
    versionSpec: '14'
    jdkArchitectureOption: 'x64'
    jdkSourceOption: 'AzureStorage'
    azureResourceManagerEndpoint: 'myEndpoint'
    azureStorageAccountName: 'teststorage'
    azureContainerName: 'test'
    azureCommonVirtualFile: 'homefolder.zip'
    jdkDestinationDirectory: '$(agent.toolsDirectory)/jdk14'
    cleanDestinationDirectory: false
Result:
I have an Azure pipeline that triggers a Python Selenium script that checks that my website works properly.
But there is a stage that keeps failing because I need Selenium to input a specific date, and since I'm not sure whether the date entered is in the wrong format (locally it works just fine), I would like to take a screenshot at that stage to fully understand what is happening there.
Locally, this is my configuration for saving the screenshot:
try:
    wait.until(EC.element_to_be_clickable((By.XPATH, '//*[@id="root"]/div[2]/main/div[4]/div/button[2]'))).click()
except:
    driver.save_screenshot('error.png')
This works just fine and outputs the PNG image in the local folder,
but running on the Azure pipeline does not save the PNG file.
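One thing worth checking (an illustrative sketch, not from the original question): on a hosted agent the script's working directory may differ from a local run, so anchoring the screenshot path to a pipeline variable makes it predictable for a later CopyFiles step. The helper below is hypothetical; BUILD_ARTIFACTSTAGINGDIRECTORY is the environment-variable form of $(Build.ArtifactStagingDirectory).

```python
import os

def screenshot_path(name="error.png", env=os.environ):
    """Resolve a save location a later pipeline step can find.

    Hypothetical helper: prefers the agent's staging directory when the
    script runs inside Azure Pipelines, and falls back to the current
    directory for local runs.
    """
    base = env.get("BUILD_ARTIFACTSTAGINGDIRECTORY", os.getcwd())
    return os.path.join(base, name)

# Inside the except block one would then call:
#     driver.save_screenshot(screenshot_path())
print(screenshot_path(env={"BUILD_ARTIFACTSTAGINGDIRECTORY": "/agent/_work/1/a"}))
```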
This is my pipeline configuration:
stages:
- stage:
  jobs:
  - job: Configuration
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '3.8'
        addToPath: true
    - script: |
        python -m pip install --upgrade pip
        pip install selenium
        printenv
    - task: PythonScript@0
      inputs:
        scriptSource: 'filePath'
        scriptPath: './script1.py'
      env:
        USERNAMEOT: $(usernameot)
        PASSWORDOT: $(passwordot)
  - job: Artifact
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(system.defaultworkingdirectory)'
        Contents: '**.png'
        TargetFolder: '$(build.artifactstagingdirectory)'
        flattenFolders: true
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: screenshots'
      inputs:
        PathtoPublish: '$(build.artifactstagingdirectory)'
        ArtifactName: screenshots
I do have a task to copy files and publish the artifact, but since that job runs in parallel, it completes before the previous job and returns nothing.
I was wondering how I can save the PNG file to the artifact folder even if the Configuration job fails.
Thank you so much for any help you can provide; I am really struggling with this.
I was wondering how I can save the PNG file to the artifact folder even if the Configuration job fails?
You could try setting dependencies and a condition for the Artifact job, like:
jobs:
- job: Artifact
  dependsOn: Configuration
  condition: succeededOrFailed()
With dependsOn: Configuration, the Artifact job will execute after the Configuration job, and condition: succeededOrFailed() will keep the Artifact job running even when the Configuration job fails.
You can check the documents Specify conditions and Dependencies for more details.
I have created an Azure Pipelines environment with a virtual machine agent, where I download builds to a folder. I have the following YAML file:
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/python

trigger:
  paths:
    include:
    - Flytteordre

pool:
  vmImage: 'ubuntu-latest'
  name: Azure Pipelines

variables:
  python.version: '3.6'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: build
    displayName: build
    steps:
    - task: UsePythonVersion@0
      displayName: 'Use Python $(python.version) copy'
      inputs:
        versionSpec: '$(python.version)'
    # We actually don't need to install these dependencies here. It needs to happen in the deploy yaml file.
    - task: CmdLine@2
      inputs:
        script: |
          python -m pip install --upgrade pip
          python -m pip install selenium
          python -m pip install pdfplumber
          python -m pip install pandas
      displayName: 'Install dependencies'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: dist'
      inputs:
        PathtoPublish: Flytteordre
        ArtifactName: dist
- stage: Deployment
  displayName: Deployment stage
  jobs:
  - deployment: deploymentJob
    displayName: Deploy
    environment:
      name: Production
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - download: none
          - downloadBuild: none
          - task: DownloadBuildArtifacts@0
            inputs:
              buildType: 'current'
              downloadType: 'single'
              artifactName: 'dist'
              downloadPath: 'C:/azagent/A1/_work/myfolder/'
My problem is that each time I run the pipeline, it creates a numbered folder (1, 2, etc.) inside the _work directory.
I would like to avoid this so that I am in full control of which folders are created. As you can see, I have indicated that my artifact should be downloaded to the folder path C:/azagent/A1/_work/myfolder/; however, this creates two folders: the one I indicated, and another whose name is a number. I know this is the default, but I would like to know if there is a way to turn off this default behavior, or at the very least to change the predefined path variables Pipeline.Workspace or Agent.BuildDirectory.
How to avoid creating new folder in Azure Agent for Pipeline in Environment
According to the document Agent variables, each build definition gets its own directory within the agent's working directory.
Whenever we create a new pipeline, the new work folder is named by incrementing a number. The advantage is that multiple running builds are guaranteed not to step on each other's copy of the repository.
I would prefer the option of being able to name the directory myself
in case I need to find a particular project and open it on the local
agent.
If you want to open it on the local agent, you can use the agent variables to resolve the path, for example:
Agent.BuildDirectory
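As an illustrative sketch (the helper name and fallback path are assumptions, not from the original answer): rather than hard-coding the numbered folder, a script running on the agent can derive the path from the AGENT_BUILDDIRECTORY environment variable, which already includes the per-definition number:

```python
import os

def artifact_path(subfolder, name, env=os.environ):
    """Join a run-relative path under Agent.BuildDirectory.

    Illustrative helper: the fallback value stands in for a typical
    agent layout when run outside a pipeline.
    """
    base = env.get("AGENT_BUILDDIRECTORY", "/agent/_work/1")
    return os.path.join(base, subfolder, name)

print(artifact_path("myfolder", "dist", env={}))  # /agent/_work/1/myfolder/dist
```

This keeps the path correct across pipelines even though the number itself is not configurable.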
I deploy Azure Functions using YAML pipelines.
For various reasons, I do not put all my package requirements in requirements.txt, as that file is used by some other processes.
Yet when deploying my web app through a YAML pipeline, I want to install additional Python packages. However, the deployed app crashes with errors saying that those packages are not installed.
my pipeline:
# Python to Linux Web App on Azure
# Build your Python project and deploy it to Azure as a Linux Web App.
# Change python version to one that's appropriate for your application.
# https://learn.microsoft.com/azure/devops/pipelines/languages/python

trigger:
- master

variables:
  # Azure Resource Manager connection created during pipeline creation
  azureServiceConnectionId: .........................
  # Web app name
  webAppName: .........................
  # Agent VM image name
  vmImageName: 'ubuntu-latest'
  # Environment name
  environmentName: .........................
  # Project root folder. Point to the folder containing manage.py file.
  projectRoot: $(System.DefaultWorkingDirectory)
  # Python version: 3.8
  pythonVersion: '3.8'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: BuildJob
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '$(pythonVersion)'
      displayName: 'Use Python $(pythonVersion)'
    - script: |
        python -m venv antenv
        source antenv/bin/activate
        python -m pip install --upgrade pip
        pip install setup
        pip install -r requirements.txt
        pip install pyodbc
      workingDirectory: $(projectRoot)
      displayName: "Install requirements"
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(projectRoot)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - upload: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      displayName: 'Upload package'
      artifact: drop
- stage: Deploy
  displayName: 'Deploy Web App'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: DeploymentJob
    pool:
      vmImage: $(vmImageName)
    environment: $(environmentName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: UsePythonVersion@0
            inputs:
              versionSpec: '$(pythonVersion)'
            displayName: 'Use Python version'
          - task: AzureWebApp@1
            displayName: 'Deploy Azure Web App : .........................'
            inputs:
              azureSubscription: $(azureServiceConnectionId)
              appName: $(webAppName)
              package: $(Pipeline.Workspace)/drop/$(Build.BuildId).zip
              startUpCommand: 'startup-nocontainer.sh'
Here, pyodbc is installed separately in the pipeline and is not present in requirements.txt.
Checking the pipeline logs, I can see that the installation completed successfully.
Yet when the app starts, it crashes at the first import of pyodbc, as if the deployed app relied only on requirements.txt.
Any idea?
I think pyodbc is not being installed in the venv you create; the venv contains only the packages you specify in requirements.txt.
Check the solutions proposed in this older post.
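A lightweight way to confirm what actually made it into the deployed environment (a sketch; the function name and the placeholder module name are assumptions) is to probe for the packages at startup, before anything imports them:

```python
import importlib.util

def missing_modules(names):
    """Return the subset of module names that cannot be resolved."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# 'json' ships with CPython; the second name stands in for a package
# (like pyodbc) that may be absent from the deployed environment.
print(missing_modules(["json", "surely_absent_pkg"]))  # ['surely_absent_pkg']
```

Logging the result of missing_modules(["pyodbc"]) in the startup script would show immediately whether the package survived deployment or was dropped when the app was rebuilt from requirements.txt.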
I get the following error:
[screenshot: error message]
It says that the POM file can't be found. I have my YAML file within my GitHub project like this:
[screenshot: project structure]
In my GitHub repo there is a POM file at dataops/poms/pom.xml, so I don't know why it can't be found. My YAML file looks like this:
# Talend CI/CD on Azure DevOps
# Build Pipeline for building and publishing on Talend Cloud
trigger: none

pool:
  vmImage: 'ubuntu-latest'

variables:
- group: Talend Configuration
- name: project_name
  value: 'DataOps'
- name: job_name
  value: 'Soccer_api_Call'
- name: job_version
  value: '0.1'

steps:
- task: DownloadSecureFile@1
  name: settings_xml
  inputs:
    secureFile: settings.xml
- task: DownloadSecureFile@1
  name: license_txt
  inputs:
    secureFile: license.txt
- task: Maven@3
  inputs:
    mavenPomFile: '$(project_name)/poms/pom.xml'
    mavenOptions: |
      -Dlicense.path=$(license_txt.secureFilePath)
      -Dupdatesite.path=$(UPDATESITE_PATH)
      -Dservice.url=$(CLOUD_URL)
      -Dcloud.publisher.screenshot=true
      -Xmx3096m -Xmx1024m
    options: '--settings $(settings_xml.secureFilePath) -Pcloud-publisher -pl jobs/process/$(job_name)_$(job_version) -am'
    goals: 'deploy'
Any help is much appreciated!!
We recommend trying mavenPomFile: 'DATAOPS/poms/pom.xml' instead of mavenPomFile: '$(project_name)/poms/pom.xml'.
In my test I created a demo like your YAML, and it works well with the full path:
Please ignore the test result; it is only used to check whether the task can find the pom.xml.
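One more thing worth ruling out (an illustrative sketch, not from the original answer): the ubuntu-latest agent uses case-sensitive paths, so 'dataops', 'DataOps', and 'DATAOPS' are three different directory names, and the mavenPomFile value must match the on-disk casing exactly. A quick check against the actual checkout:

```python
import os
import pathlib
import tempfile

# Simulate a checkout containing the folder exactly as named in the repo.
root = pathlib.Path(tempfile.mkdtemp())
(root / "dataops" / "poms").mkdir(parents=True)
(root / "dataops" / "poms" / "pom.xml").write_text("<project/>")

# os.listdir returns names exactly as stored, so a casing mismatch between
# the pipeline variable and the repo folder shows up immediately.
names = os.listdir(root)
print("dataops" in names, "DataOps" in names)  # True False
```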
I'm trying to deploy a react web app. Here is my current yaml file. I've been following this tutorial.
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

variables:
  azureSubscription: <myServiceConnection>
  appName: <myAppName>

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'
# so this calls the build object in package.json. Theoretically creates a /build/ directory
- script: |
    npm install
    npm run build
  displayName: 'npm install and build'
# this should copy contents of build into artifactstagingdirectory.
- task: CopyFiles@2
  inputs:
    Contents: 'build/**' # Pull the build directory (React)
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    pathtoPublish: $(Build.ArtifactStagingDirectory) # dist or build files
    ArtifactName: 'www' # output artifact named www
- task: AzureWebApp@1
  inputs:
    azureSubscription: <myServiceConnection>
    appName: <myAppName>
    appType: webAppLinux
    package: $(Build.ArtifactStagingDirectory)/**/www/build
    customWebConfig: '-handler iisnode -NodeStartFile server.js -appType node'
Basically, the issue is that I don't understand where the PublishBuildArtifacts task publishes the files, so I don't know where to point the package input of AzureWebApp.
So I turned on system.debug and got the following information:
1. npm install and npm run build use the directory /home/vsts/work/1/s.
2. CopyFiles copies the build folder over, from /home/vsts/work/1/s/build to /home/vsts/work/1/a/build. This means artifactstagingdirectory is /home/vsts/work/1/a.
3. PublishBuildArtifacts takes the artifactstagingdirectory and publishes it to some folder www/build. The following output is provided: Upload '/home/vsts/work/1/a' to file container: '#/8995596/www'
4. The current AzureWebApp task is looking in /home/vsts/work/1/a/**/www/build.
So I think it should be the right directory. However, step 3 does not show the whole path, so it looks like I'm wrong. Where is this www folder being created?
Also, since building/publishing the React app doesn't seem to create a zip file, do I just point the task at the build folder or the www folder?
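The steps above can be pulled into one sketch (the literal paths come from the debug log in the question; the comment about the artifact container is inferred from the Upload line in step 3):

```python
import os.path

# Directory layout reported by the debug log in the question.
work = "/home/vsts/work/1"
sources = os.path.join(work, "s")   # checkout; `npm run build` writes s/build
staging = os.path.join(work, "a")   # Build.ArtifactStagingDirectory

# CopyFiles preserves the 'build' folder name when copying s/build -> a/build.
copied = os.path.join(staging, "build")

# PublishBuildArtifacts then uploads the whole staging folder under the
# artifact name 'www' - that 'www' folder lives in the artifact container,
# not on the build agent's disk, which would explain why no
# /home/vsts/work/1/**/www path ever appears in the log.
print(copied)  # /home/vsts/work/1/a/build
```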
Try **/wwwroot/ instead of **/www/.
Check the following blog from Microsoft, which outlines the deployment details:
https://learn.microsoft.com/en-us/azure/app-service/faq-deployment