I am trying to get tfsec running in my Azure Pipeline. I have been able to get this to run with the following commands without any additional flags:
stages:
- stage: QualityCheckStage
  displayName: Quality Check Stage
  jobs:
  - job: QualityTestJob
    pool:
      vmImage: ubuntu-latest
    displayName: Run TFSec
    steps:
    - bash: |
        mkdir TFSecReport
        docker pull tfsec/tfsec:latest
        docker run --rm -v $(System.DefaultWorkingDirectory):/src tfsec/tfsec ./src --format JUnit > $(System.DefaultWorkingDirectory)/TFSecReport/junit.xml
      displayName: TFSec Static Code Analysis
    - task: PublishTestResults@2
      displayName: Publish TFSecReport Test Results
      condition: succeededOrFailed()
      inputs:
        testResultsFormat: 'JUnit'
        testResultsFiles: '**/junit.xml'
        searchFolder: '$(System.DefaultWorkingDirectory)/TFSecReport'
        mergeTestResults: false
        failTaskOnFailedTests: false
        publishRunAttachments: true
This operates as I expect and successfully completes the job, checking my Terraform code and showing the results during the pipeline run.
I would like to add additional tfsec flags such as --include-passed, --verbose, or --config-file to make the pipeline more robust. I am struggling to find a way to pass any of these tfsec flags into the docker run command. I've tried adding the flags to every part of the command, and Azure spits out the following error: ##[error]Bash exited with code '1'.
How can I pass flags into the tfsec image through the docker run command?
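For reference, a minimal sketch of where flags typically go: since the working command above already passes ./src --format JUnit after the image name, the image's entrypoint is tfsec itself, so any further flags belong in the same place, after the directory argument. In this sketch the tfsec-config.yml file name is a made-up example, and note that a config file has to be referenced by its in-container path (the repo is mounted at /src):

- bash: |
    mkdir TFSecReport
    docker pull tfsec/tfsec:latest
    # Everything after the image name is passed to tfsec itself,
    # so extra flags go after the directory argument.
    # The config file path must be the one inside the container.
    docker run --rm -v $(System.DefaultWorkingDirectory):/src tfsec/tfsec ./src \
      --include-passed \
      --config-file /src/tfsec-config.yml \
      --format JUnit > $(System.DefaultWorkingDirectory)/TFSecReport/junit.xml
  displayName: TFSec Static Code Analysis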
I am trying to build a pipeline in Azure DevOps. I built a basic Flask website and wrote a unittest script for it. It basically all works perfectly. When I commit to Azure Repos, the pipeline does its thing and the tests run. The thing I want is to see the test results; I see all these tutorials for pytest, but not for unittest.
trigger:
- Development

jobs:
- job: 'Test'
  pool:
    vmImage: 'ubuntu-latest' # other options: 'macOS-latest', 'windows-latest'
  strategy:
    matrix:
      Python37:
        python.version: '3.7'
  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '$(python.version)'
  - script: |
      python -m pip install --upgrade pip
      python -m pip install -e .
    displayName: 'Install dependencies'
  - script: |
      python -m unittest discover -p "*.py" > results.txt
    displayName: unittesting
This is my YAML file that runs the pipeline.
This is how my results look when running the pipeline: [screenshot: pipeline results]
Is there a way to publish these results with unittest and have them show up in the Azure pipeline?
You'll first need to make your test script generate results in a format that DevOps can understand, i.e. JUnit XML.
There's an example in MS docs (which includes coverage as well):
- script: |
    pip install pytest pytest-azurepipelines
    pip install pytest-cov
    pytest --doctest-modules --junitxml=junit/test-results.xml --cov=. --cov-report=xml
  displayName: 'pytest'
Alternatively, using unittest-xml-reporting should give you results in JUnit XML format as well.
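For instance, a sketch of such a step, assuming unittest-xml-reporting's module-level CLI (which mirrors python -m unittest and writes JUnit XML into the directory given by -o); if your version's CLI differs, the xmlrunner.XMLTestRunner(output=...) Python API achieves the same thing:

- script: |
    pip install unittest-xml-reporting
    # Discover tests as before, but emit JUnit XML files instead of plain text.
    python -m xmlrunner discover -p "*.py" -o test-results
  displayName: 'Run unittest with JUnit XML output'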
Once you have that, you can use the Publish Test Results task to upload the results and make them visible in the DevOps UI, e.g.:
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/test-*.xml'
If your unit tests are written using the unittest module, don't worry: pytest can run unittest tests as well!
Add the YAML snippet below to your Azure pipeline. It will:
Run your unittest tests with pytest
Generate JUnit XML output
Publish the output back to the Azure pipeline
- script: |
    cd $(Build.Repository.LocalPath)
    python -m pytest $(Build.Repository.LocalPath)/<unit_tests_path>/*.py --junitxml=test-unit.xml
  displayName: 'Run Unit Tests'
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: 'test-unit.xml'
Environment
Azure DevOps (code repo and pipeline trigger)
AWS ECR/ECS (target deployment platform)
Docker
.NET Core Web application (v5.0)
Current Situation
Presently I am building the application using dotnet build (via a PowerShell script) and pushing the zip file to Azure DevOps Artifacts using azurepipeline.yml; this works out fine. I have added another task for the ECR push, which also pushes a generated Docker image to ECR using a Dockerfile in the source code.
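For context, the artifact-publishing step implied here would look something like this sketch (the author's actual task isn't shown; the staging path and artifact name are assumptions):

- task: PublishBuildArtifacts@1
  inputs:
    # Assumed location of the built zip; adjust to the real build output path.
    pathToPublish: '$(Build.ArtifactStagingDirectory)'
    # 'app' matches the artifact name consumed by DownloadBuildArtifacts below.
    artifactName: 'app'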
Business Problem
We want to be able to choose a specific build (e.g. 0.1.24) in Azure Artifacts (using a variable to provide the version number) and generate a Docker build using the corresponding binaries and the Dockerfile. I am unable to find a way to do so. The specific workflow is as follows:
The deploy user updates the variable "versionNoToDeploy" with the artifact id or name
The deploy user runs a specific pipeline
The pipeline finds the artifact (assuming it's valid, else raises an error) and unzips the package at a temp location (need help on this)
The pipeline runs the Dockerfile to build the image (known & working)
The pipeline pushes this image to ECR (known & working)
The purpose is to keep on building the branch till we get stable builds. This build is deployed on a test server manually and tested. Once the build gets certified, it needs to be pushed to Production ECR/ECS instances.
Our pipeline (specific code only)
- pwsh: ./build.ps1 --target Clean Protolint Compile --runtime $(runtime)
  displayName: ⚙️ Compile
- task: Docker@2
  displayName: Build
  inputs:
    command: build
    repository: appRepo
    tags: |
      $(Build.BuildId)
      deploy
    addPipelineData: true
    Dockerfile: src\DockerfileNew
- task: ECRPushImage@1
  inputs:
    awsCredentials: 'AWS ECR Connection'
    regionName: 'ap-south-1'
    imageSource: 'imagename'
    sourceImageName: 'myApplication'
    sourceImageTag: 'deploy'
    repositoryName: 'devops-demo'
    outputVariable: 'imageTagOutputVar'
- pwsh: ./build.ps1 --target Test Coverage --skip
  displayName: 🚦 Test
- pwsh: ./build.ps1 --target BuildImage Pack --runtime $(runtime) --skip
  displayName: 📦 Pack
- pwsh: ./build.ps1 --target Publish --runtime $(runtime) --skip
  displayName: 🚚 Publish
Artifact details: [screenshot omitted]
Any specific aspects needed can be provided.
Finally, after playing with the pipeline a lot and custom-tweaking the individual steps, I came up with the following (excerpted YAML).
This involves storing the build version in a variable, which is referenced in each of the steps of the pipeline.
The admin has to decide whether they want a general build, producing an artifact, or to deploy a specific build to AWS. The variable holding the build id is evaluated in each step's condition, and based on that the steps are executed or bypassed.
- pwsh: ./build.ps1 --target Clean Protolint Compile --runtime $(runtime)
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: ⚙️ Compile
- task: DownloadBuildArtifacts@0
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    buildType: 'specific'
    project: 'NitinProj'
    pipeline: 'NitinProj'
    buildVersionToDownload: specific
    buildId: $(artifactVersionToPush)
    downloadType: 'single'
    artifactName: 'app'
    downloadPath: '$(System.ArtifactsDirectory)' # needs to be set explicitly; the default is the Build directory
- task: ExtractFiles@1
  displayName: Extract Artifact to temp location
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    archiveFilePatterns: '$(System.ArtifactsDirectory)/app/*.zip' # path may need updating
    cleanDestinationFolder: false
    overwriteExistingFiles: true
    destinationFolder: src
- task: Docker@2
  displayName: Docker Build image with compiled code in artifact
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    command: build
    repository: myApp
    tags: |
      $(Build.BuildId)
      deploy
    addPipelineData: true
    Dockerfile: src\DockerfileNew
- task: ECRPushImage@1
  displayName: Push built image to AWS ECR
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    awsCredentials: 'AWS ECR Connection'
    regionName: 'ap-south-1'
    imageSource: 'imagename'
    sourceImageName: 'myApp'
    sourceImageTag: 'deploy'
    pushTag: '$(Build.BuildId)'
    repositoryName: 'devops-demo'
    outputVariable: 'imageTagOutputVar'
- pwsh: ./build.ps1 --target Test Coverage --skip
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: 🚦 Test
- pwsh: ./build.ps1 --target BuildImage Pack --runtime $(runtime) --skip
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: 📦 Pack
- pwsh: ./build.ps1 --target Publish --runtime $(runtime) --skip
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: 🚚 Publish
I will update this YAML to organize the steps into jobs, but that's an optimization for later. :)
Since this involves manual intervention, you may consider splitting the workflow into several jobs, like this:
jobs:
- job: BuildAndDeployToTest
  steps:
  - bash: echo "A"
- job: waitForValidation
  displayName: Wait for external validation
  pool: server
  timeoutInMinutes: 4320 # job times out in 3 days
  steps:
  - task: ManualValidation@0
    timeoutInMinutes: 1440 # task times out in 1 day
    inputs:
      notifyUsers: |
        test@test.com
        example@example.com
      instructions: 'Please validate the build configuration and resume'
      onTimeout: 'resume'
- job: DeployToProd
  steps:
  - bash: echo "B"
This is not exactly what you want in terms of involving variables, but it will let you achieve your goal: wait for validation, and deploy to prod only validated builds.
It relies on the ManualValidation task.
Another approach could be using a deployment job and approvals:
jobs:
- job: BuildAndDeployToTest
  steps:
  - bash: echo "A"

jobs:
# Track deployments on the environment.
- deployment: DeployToProd
  displayName: deploy Web App
  pool:
    vmImage: 'Ubuntu-16.04'
  # Creates an environment if it doesn't exist.
  environment: 'PROD'
  strategy:
    # Default deployment strategy, more coming...
    runOnce:
      deploy:
        steps:
        - checkout: self
        - script: echo my first deployment
For this you need to define an environment and configure an approval on it.
Either way, you will get a clear picture of what was delivered to prod, plus information on who approved the PROD deployment.
Azure DevOps Pipelines only supports the JaCoCo and Cobertura coverage report formats.
PHPUnit only supports the Clover, Crap4j, PHP, (custom) XML, HTML and TXT coverage report formats.
How can I publish the coverage results of my PHPUnit tests in my pipeline?
As of this time, however, publishing PHPUnit code coverage results in Pipelines is not directly supported.
PHPUnit 9.4 added support for Cobertura coverage output. However, the default Ubuntu build agents that Azure Pipelines provides at the moment only ship with PHPUnit 8.5. You can still get coverage reports by running PHPUnit 9.4+ inside a Docker container instead. Here is a snippet of my current Azure build pipeline that does exactly that:
trigger:
- master

pool:
  vmImage: ubuntu-latest

variables:
  phpVersion: 7.4
  phpunitImage: jitesoft/phpunit:7.4-9

steps:
- script: |
    sudo update-alternatives --set php /usr/bin/php$(phpVersion)
    sudo update-alternatives --set phar /usr/bin/phar$(phpVersion)
    sudo update-alternatives --set phpdbg /usr/bin/phpdbg$(phpVersion)
    sudo update-alternatives --set php-cgi /usr/bin/php-cgi$(phpVersion)
    sudo update-alternatives --set phar.phar /usr/bin/phar.phar$(phpVersion)
    php -version
  displayName: 'Use PHP version $(phpVersion)'

# Do a composer install to get an autoloader that phpunit can use
- script: composer install --no-interaction --prefer-dist
  displayName: 'composer install'

# Run the tests using the jitesoft phpunit docker image to get support
# for phpunit 9+ and that way cobertura reports for code coverage.
- script: |
    docker run --rm -v $(pwd):/app ${{ variables.phpunitImage }} phpunit --log-junit .junit/TEST-phpunit-junit.xml --coverage-cobertura=.coverage/COVERAGE-phpunit-cobertura.xml
  displayName: 'Run tests with phpunit docker container'

- task: PublishTestResults@2
  displayName: 'Publish test report'
  condition: always()
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/TEST-phpunit-*.xml'
    searchFolder: '$(System.DefaultWorkingDirectory)/.junit'
    failTaskOnFailedTests: true

- task: PublishCodeCoverageResults@1
  displayName: 'Publish coverage report'
  condition: always()
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: '$(System.DefaultWorkingDirectory)/.coverage/COVERAGE-phpunit-*.xml'
    pathToSources: '$(System.DefaultWorkingDirectory)/src'
    failIfCoverageEmpty: true
Note the always() condition on the Publish* tasks. This is needed because if a test fails, the docker run step fails with bash exit code 1, which in turn would prevent the report publishing unless those steps are forced to run. There might be a way to handle the exit code more cleanly, but I haven't figured it out yet.
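One option (a sketch only, borrowing the continueOnError approach shown in a later answer on this page) is to mark just the test step as non-fatal, so the Publish* tasks run under their default condition; the trade-off is that a failing run then shows as "partially succeeded" rather than "failed":

- script: |
    docker run --rm -v $(pwd):/app ${{ variables.phpunitImage }} phpunit --log-junit .junit/TEST-phpunit-junit.xml --coverage-cobertura=.coverage/COVERAGE-phpunit-cobertura.xml
  displayName: 'Run tests with phpunit docker container'
  # Let the pipeline continue past the non-zero exit code from failing tests.
  continueOnError: true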
PS: ideally you'd add some caching as well so the Docker image is not downloaded on every run, but I skipped that part to keep the example focused on actually running the unit tests and coverage reports.
I'm trying to push a built docker image in a release pipeline.
My docker build task yaml is:
steps:
- task: Docker@2
  displayName: Build
  inputs:
    containerRegistry: MyRegistry
    repository: myrepo/containername
    command: build
    Dockerfile: '$(System.DefaultWorkingDirectory)/My.dockerfile'
    buildContext: '$(System.DefaultWorkingDirectory)'
    arguments: '--build-arg FILE_NAME=myfile.zip'
My docker push task yaml is:
steps:
- task: Docker@2
  displayName: Push
  inputs:
    containerRegistry: MyRegistry
    repository: myrepo/containername
    command: push
The log says it runs this command:
/usr/bin/docker push ***/myrepo/containername:tag
The task reports success, but I can't see the resulting image in Docker Hub.
I wonder if the *** has anything to do with this?
I ended up not using the Azure DevOps tasks and running the commands from a Python script instead. It was important to run docker login inside that script as well.
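For illustration, the same idea as a plain script step (a sketch under assumptions: dockerUsername and dockerPassword are hypothetical pipeline secret variables, and the image name/tag must match what the build step produced):

- script: |
    # Log in explicitly so the push goes to the intended registry, then push.
    echo "$DOCKER_PASSWORD" | docker login --username "$(dockerUsername)" --password-stdin
    docker push myrepo/containername:$(Build.BuildId)
  displayName: 'Explicit docker login and push'
  env:
    # Secret variables must be mapped into the environment explicitly.
    DOCKER_PASSWORD: $(dockerPassword)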
I am running a pytest-based suite of tests during my Azure DevOps build process. I have two jobs arranged to run these tests against two different environments.
In each job, I run the pytest tests using a script task and generate a JUnit-style XML output file, then have a PublishTestResults task publish that XML file. This works great, and I'm able to peruse my test results in the Azure build Tests report UI, but only if all the tests pass. If any tests fail, the publish task is skipped, and the tests aren't reported in the UI.
YML extract:
- job: 'RunTestsQA'
  continueOnError: True
  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.6'
      architecture: 'x64'
  - task: DownloadSecureFile@1
    inputs:
      secureFile: 'ConfigFile'
  - script: pip install -r requirements.txt
    displayName: 'Install Requirements'
  - script: |
      pytest -m smoke --ENV=qa --log-file $SYSTEM_ARTIFACTSDIRECTORY/smoke-qa.log --junitxml="TEST-qa-smoke.xml"
    displayName: 'Test with pytest'
  # PUBLISH JUNIT RESULTS
  - task: PublishTestResults@2
    inputs:
      condition: succeededOrFailed()
      testResultsFormat: 'JUnit' # Options: JUnit, NUnit, VSTest, xUnit
      testResultsFiles: '**/TEST-*.xml'
      #searchFolder: '$(System.DefaultWorkingDirectory)' # Optional
      mergeTestResults: false # Optional
      testRunTitle: 'API_CHECK QA'
      #buildPlatform: # Optional
      #buildConfiguration: # Optional
      publishRunAttachments: true # Optional
Through some experimentation, I've been able to confirm the XML file is always created. What do I need to fix here? A test report isn't super helpful if it only shows up when the tests pass.
In your task definition, the condition is listed as a task input, and hence won't be taken into account at all.
You had:
# PUBLISH JUNIT RESULTS
- task: PublishTestResults@2
  inputs:
    condition: succeededOrFailed()
    testResultsFormat: 'JUnit' # Options: JUnit, NUnit, VSTest, xUnit
    testResultsFiles: '**/TEST-*.xml'
The correct setup is:
# PUBLISH JUNIT RESULTS
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit' # Options: JUnit, NUnit, VSTest, xUnit
    testResultsFiles: '**/TEST-*.xml'
  condition: succeededOrFailed()
The full list of things you can do with conditions is here.
I'm using Ruby and Minitest, but I have found that the following setting allows the PublishTestResults task to run:
- script: |
    pytest -m smoke --ENV=qa --log-file $SYSTEM_ARTIFACTSDIRECTORY/smoke-qa.log --junitxml="TEST-qa-smoke.xml"
  displayName: 'Test with pytest'
  continueOnError: true
The only issue I have found with this setting is that if the build fails, it reports as "Partially Succeeded" rather than "Failed".
Edit: of course, if your build process has any deploy tasks after the test task, you may not want to use this setting.