Concourse: Use a semver resource to control which artifact to use from S3

My pipeline contains a task with the following prerequisites:
- get: version
  trigger: true
  params: { bump: patch }
  passed: ["trigger_job [CI]"]
- get: sdk-package
  passed: ["package_generation_job"]
  params:
    version: {path: "artifact_[I want to put the version here]"}
version is a semver stored in git; sdk-package is a build artifact stored in S3, where each run of the pipeline uploads a new artifact with the version number as part of its name.
What I would like to do is use the version input to determine which version of the artifact is pulled from S3. I suspect that Concourse doesn't allow this, but I couldn't find a definitive answer.

This is not currently possible; you will have to download the artifact you want in a task script. You can pass the version into that task.
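As a rough sketch, such a task could read the version file produced by the semver resource and fetch the matching object itself. The bucket name, artifact naming pattern, and aws-cli image here are assumptions, not part of the original pipeline:

- task: fetch-sdk-package
  config:
    platform: linux
    image_resource:
      type: docker-image
      source: {repository: amazon/aws-cli}
    inputs:
    - name: version
    outputs:
    - name: sdk-package
    run:
      path: sh
      args:
      - -ec
      - |
        # "version" is the semver resource's input directory; it contains
        # a file named "version" with the bumped number, e.g. 1.2.3
        V=$(cat version/version)
        # bucket and object naming are placeholders for your own layout
        aws s3 cp "s3://my-bucket/artifact_${V}.tar.gz" sdk-package/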

Related

Call AWS CodeBuild project from CodePipeline with different parameters

Let's imagine that we have one CodePipeline with 2 stages in the following fashion:
new codepipeline.Pipeline(this, name + "Pipeline", {
  pipelineName: this.projectName + "-" + name,
  crossAccountKeys: false,
  stages: [{
    stageName: 'Source',
    actions: [codeCommitSourceAction]
  }, {
    stageName: 'Build',
    actions: [buildAction]
  }]
});
Here the Source stage is where we pull the changes from the repository and the Build one is a CodeBuild project which has the following actions in the buildspec file:
Install the dependencies (npm i).
Run the tests (npm run test).
Pack the project (npm run pack).
Update/deploy lambda function (aws lambda update-function-code).
In general it does what it's supposed to do; however, if the build fails, the only way to find out which part failed is to look at the logs. I would like this to be visible directly in CodePipeline. That means CodePipeline would need more stages, each correlating with one action from CodeBuild. In my experience I can only do that by providing a different CodeBuild project for every stage.
Question: can I provide same CodeBuild project to the different CodePipeline stages so, that it will execute only part of buildspec file (for example, only running the tests)?
You can have your buildspec.yml perform different actions based on environment variables. You can then pass different environment variables to CodeBuildAction with environmentVariables.
new codepipeline_actions.CodeBuildAction({
  actionName: 'Build',
  project: buildProject,
  input: sourceInput,
  runOrder: 1,
  environmentVariables: {
    STEP: { value: 'test' }
  }
}),
And then check STEP environment variable in buildspec.yml.
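A buildspec.yml along these lines could branch on that variable. The phase layout and the deploy command's arguments are illustrative, based on the actions the question lists:

version: 0.2
phases:
  install:
    commands:
      - npm i
  build:
    commands:
      # Run only the part selected by the STEP variable set on the action
      - |
        case "$STEP" in
          test)   npm run test ;;
          pack)   npm run pack ;;
          deploy) aws lambda update-function-code --function-name "$FUNCTION_NAME" --zip-file fileb://dist/bundle.zip ;;
        esac

Here FUNCTION_NAME and the zip path are placeholders you would set per project.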
Question: can I provide same CodeBuild project to the different CodePipeline stages so, that it will execute only part of buildspec file (for example, only running the tests)?
No, I don't think that is possible.
What you can do, however, is have different buildspec .yml files called at different stages of your pipeline.
For example, you could have a CodePipeline stage called Init which calls the buildspec_init.yml of your project. If that succeeds, you could have a following stage Apply calling the buildspec_apply.yml file of your project.

Azure DevOps: release version

I am going to create my CI/CD pipeline in Azure DevOps, but I have a problem with the release version number. With this CI/CD a .NET app is built and a Docker image is created, and I want the Docker image release number to look like V1.2.0, but currently I get numbers like 10, 11, ... or only the latest tag!
Can anybody help me get my own release version number?
Thanks
You could set the release version number in Release Pipelines -> Options -> General -> Release name format.
The $(rev:r) is an incrementing variable. So you could add it in the Release version.
For example: V1.2.$(rev:r)
Note: the $(rev:r) counts from 1 (1,2,3...).
From your requirement, you are using a CI and CD process and it seems that you need to count from 0. You could also try the $(Build.BuildNumber) variable.
Here are the steps:
Step 1: In the Build Pipeline (CI), set the counter variable (e.g. BuildRevision: $[counter('', 0)]).
Step 2: Use the variable in the build number (Build Pipeline -> Options -> Build number format).
Step 3: Set the build artifacts as the release source. Use $(Build.BuildNumber) in the release pipeline version.
In this situation, the release version could start from v1.2.0.
Update:
When I change the release version, for example from V0.0 to V1.0, how is the counter restarted?
You could try the following steps:
Create 2 variables:
1. major-minor = 0.0
2. revision = $[counter(variables['major-minor'], 0)]
The build number: $(major-minor).$(revision)
In this case, when major-minor changes to 1.0, the counter will reset.
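In YAML form, the same idea could look like the sketch below; the major-minor value is whatever you choose:

variables:
  major-minor: '1.0'
  # counter() resets whenever its prefix (here, the major-minor value) changes
  revision: $[counter(variables['major-minor'], 0)]

name: $(major-minor).$(revision)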
In that case you can use GitVersion and the Semantic Versioning pattern.
For that you will need this extension: https://marketplace.visualstudio.com/items?itemName=gittools.gitversion
After that, add this step before compiling/building your project:
steps:
- task: GitVersion@5
  inputs:
    runtime: 'core'
After that you can use variable:
$(GitVersion.FullSemVer)
That variable will store the current build version - it's based on git.

How do I upload Artifacts depending on Release or Prerelease to Artifactory using GitVersion on Azure DevOps?

I prefer to organize my artifacts in Artifactory in a hierarchy: Repo [dev|test|prod] -> Artifact Name -> release artifacts go here -> pre-releases go into a sub-folder.
Why? So when I am navigating the Artifactory Repository Browser I don't have an exceedingly long tree. I can expand a repository and see the first level by artifact name without yet seeing any artifacts, then expand the artifact name leaf and see my released artifacts. The top item underneath will be a sub-directory called "prerelease". This is done so I can easily delete all my pre-releases manually in one action if I wish, or schedule a cleanup.
[My Repo]
|
+-\prerelease\
| |--artifact-1.2.3-ci0004.nupkg
| |--artifact-1.0.1-ci0002.nupkg
|--artifact-1.0.0.nupkg
|--artifact-1.0.1.nupkg
I know how to use the Artifactory filespec to upload the package to my repository:
For Pre-Release:
{
  "files": [
    {
      "pattern": "$(build.artifactstagingdirectory)\*.nupkg",
      "target": "myrepo-nuget-dev-local/$(PackageName)/prerelease/"
    }
  ]
}
For Release:
{
  "files": [
    {
      "pattern": "$(build.artifactstagingdirectory)\*.nupkg",
      "target": "myrepo-nuget-dev-local/$(PackageName)/"
    }
  ]
}
What I need to do is put each file spec into its own build step and then add conditions that will execute EITHER one build step OR the other, but never both. Why? Because a build artifact will only ever be a pre-release or a release artifact, never both. I am using GitVersion and Git tags along with Azure DevOps.
So the question: What does the Custom Condition need to be to get this working?
This logic should work for any CI system, but this syntax will work for Azure DevOps.
How to create these can be found here: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops
Here is what it looks like:
For Pre-Release:
and(succeeded(), not(startsWith(variables['GitVersion.BranchName'], 'tags/')), or(ne(variables['GitVersion.PreReleaseLabel'], ''),ne(variables['GitVersion.BuildMetaData'], '')))
This is saying all 3 conditions MUST be met:
if succeeding
GitVersion.BranchName does not start with 'tags/' (this makes sure this build event was not triggered by a tag), and,
GitVersion.PreReleaseLabel is not empty OR GitVersion.BuildMetaData is not empty
For Release:
and(succeeded(), or(and(eq(variables['GitVersion.PreReleaseLabel'], ''), eq(variables['GitVersion.BuildMetaData'], ''), eq(variables['GitVersion.BranchName'], 'master')), startsWith(variables['GitVersion.BranchName'], 'tags/')), ne(variables['Build.Reason'], 'PullRequest'))
This is saying all 3 conditions MUST be met:
if succeeding
(GitVersion.PreReleaseLabel is empty AND GitVersion.BuildMetaData is empty AND GitVersion.BranchName is 'master') OR (GitVersion.BranchName starts with 'tags/')
Build.Reason is not 'PullRequest'
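Wired into a pipeline, the two upload steps could look like the sketch below, using the JFrog CLI rather than any particular extension task; the repository path and package-name variable are the ones from the file specs above, and the conditions are the ones just described:

steps:
- script: jfrog rt u "$(Build.ArtifactStagingDirectory)/*.nupkg" "myrepo-nuget-dev-local/$(PackageName)/prerelease/"
  displayName: Upload pre-release
  condition: and(succeeded(), not(startsWith(variables['GitVersion.BranchName'], 'tags/')), or(ne(variables['GitVersion.PreReleaseLabel'], ''), ne(variables['GitVersion.BuildMetaData'], '')))

- script: jfrog rt u "$(Build.ArtifactStagingDirectory)/*.nupkg" "myrepo-nuget-dev-local/$(PackageName)/"
  displayName: Upload release
  condition: and(succeeded(), or(and(eq(variables['GitVersion.PreReleaseLabel'], ''), eq(variables['GitVersion.BuildMetaData'], ''), eq(variables['GitVersion.BranchName'], 'master')), startsWith(variables['GitVersion.BranchName'], 'tags/')), ne(variables['Build.Reason'], 'PullRequest'))

Because the two conditions are mutually exclusive, only one of the steps runs per build.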

VSTS / TFS Build only run NPM task when Git tag is committed and pushed

The Microsoft docs state that you can set conditions for a specific build task, for example to run it only for a specific Git branch.
But is it also possible to make a condition so that VSTS builds the NPM package only if a commit contains a git tag?
Update 1: The git describe command (source) seems like a possible part of the solution. This command takes the last commit hash and matches it against annotated tags. However, it doesn't yield the boolean value we need in the VSTS custom condition.
Update 2: Using the npm package git-describe you can determine whether or not the latest commit has annotated tags. Example of a gulp task:
/** Example of a gulp task using git describe */
var gulp = require('gulp');
var {gitDescribeSync} = require('git-describe');

/**
 * @function
 * @name checkGitTag
 * @description Returns whether or not the latest commit has a tag
 * @returns {(String|null)} Git tag, or null if no git tag exists on the commit
 */
var checkGitTag = function() {
    var gitInfo = gitDescribeSync();
    // output the result to debug
    console.log(gitInfo.tag);
    // gitInfo.tag is null when the latest commit carries no tag
    return gitInfo.tag;
};

gulp.task('checkGitTag', checkGitTag);
module.exports = checkGitTag;
Maybe by installing the NPM package on the build server and using a similar function would work. Going to test it out.
Yes, it's possible to conditionally run a task when the build commit contains tag(s).
But since there is no predefined variable that records the tags for the build commit $(BUILD.SOURCEVERSION), you should add steps (a variable and a PowerShell task) to check whether the build commit has tags. Detailed steps below:
1. Add a variable (such as result) with the default value 0. This variable records whether the build commit contains tag(s).
2. Then add a PowerShell task before the task you want to run conditionally, with the following script:
$tag=$(git tag --contains $(BUILD.SOURCEVERSION))
if($tag)
{
Write-Host "##vso[task.setvariable variable=result]1"
echo "The build version $(BUILD.SOURCEVERSION) contains tag(s)"
}
else
{
Write-Host "##vso[task.setvariable variable=result]0"
echo "The build version $(BUILD.SOURCEVERSION) does not contain any tags"
}
3. Finally, set the custom condition for the task you want to run only when the commit contains a tag:
and(succeeded(), eq(variables['result'], '1'))
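The same check can be sketched in plain shell for agents where Bash is handier than PowerShell; the logging command sets the result variable exactly as the PowerShell step above does:

```shell
#!/bin/sh
# Print 1 if the given commit is contained in at least one tag, else 0.
commit_has_tag() {
  if [ -n "$(git tag --contains "$1" 2>/dev/null)" ]; then
    echo 1
  else
    echo 0
  fi
}

# BUILD_SOURCEVERSION is the environment form of $(BUILD.SOURCEVERSION)
result=$(commit_has_tag "$BUILD_SOURCEVERSION")
# VSTS logging command: sets the pipeline variable "result" for later tasks
echo "##vso[task.setvariable variable=result]$result"
```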

Concourse CI - Build Artifacts inside source, pass all to next task

I want to set up a build pipeline in Concourse for my web application. The application is built using Node.
The plan is to do something like this:
,-> build style guide -> dockerize
source code -> npm install -> npm test -|
`-> build website -> dockerize
The problem is that after npm install a new container is created, so the node_modules directory is lost. I want to pass node_modules into the later tasks, but because it is "inside" the source code, Concourse doesn't like it and gives me:
invalid task configuration:
you may not have more than one input or output when one of them has a path of '.'
Here's my job setup:
jobs:
- name: test
  serial: true
  disable_manual_trigger: false
  plan:
  - get: source-code
    trigger: true
  - task: npm-install
    config:
      platform: linux
      image_resource:
        type: docker-image
        source: {repository: node, tag: "6"}
      inputs:
      - name: source-code
        path: .
      outputs:
      - name: node_modules
      run:
        path: npm
        args: [install]
  - task: npm-test
    config:
      platform: linux
      image_resource:
        type: docker-image
        source: {repository: node, tag: "6"}
      inputs:
      - name: source-code
        path: .
      - name: node_modules
      run:
        path: npm
        args: [test]
Update 2016-06-14
Inputs and outputs are just directories. You put whatever you want to pass along into an output directory, and you can then feed it to another task in the same job. Inputs and outputs cannot overlap, so in order to do this with npm you'd have to copy either node_modules or the entire source folder from the input folder to an output folder, then use that in the next task.
This doesn't work between jobs though. Best suggestion I've seen so far is to use a temporary git repository or bucket to push everything up. There has to be a better way of doing this since part of what I'm trying to do is avoid huge amounts of network IO.
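As a sketch of that copy approach within one job, using the directory names from the pipeline above (the output name built-source is my own choice):

- task: npm-install
  config:
    platform: linux
    image_resource:
      type: docker-image
      source: {repository: node, tag: "6"}
    inputs:
    - name: source-code
    outputs:
    - name: built-source
    run:
      path: sh
      args:
      - -ec
      - |
        # install inside the input, then copy everything (including
        # node_modules) into the output directory for the next task
        cd source-code
        npm install
        cp -R . ../built-source

The npm-test task would then take built-source as its input instead of source-code.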
There is a resource specifically designed for this use case of npm between jobs. I have been using it for a couple of weeks now:
https://github.com/ymedlop/npm-cache-resource
It basically allows you to cache the first npm install and inject it as a folder into the next job of your pipeline. You could quite easily set up your own caching resources by reading the source of that one as well, if you want to cache more than node_modules.
I am actually using this npm-cache-resource in combination with a Nexus proxy to speed up the initial npm install further.
Be aware that some npm packages have native bindings that need to be built against standard libraries matching the container's Linux version, so if you move between different types of containers a lot you may run into issues with musl etc. In that case I recommend either standardizing on the same container type throughout the pipeline, or rebuilding the node_modules in question.
There is a similar one for Gradle (on which the npm one is based):
https://github.com/projectfalcon/gradle-cache-resource
This doesn't work between jobs though.
This is by design. Each step (get, task, put) in a Job is run in an isolated container. Inputs and outputs are only valid inside a single job.
What connects Jobs is Resources. Pushing to git is one way. It'd almost certainly be faster and easier to use a blob store (eg S3) or file store (eg FTP).
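For example, a minimal sketch of handing a tarball between jobs via the s3 resource; the bucket, regexp, and credential names are placeholders:

resources:
- name: node-modules-tarball
  type: s3
  source:
    bucket: my-ci-cache            # placeholder bucket name
    regexp: node_modules-(.*).tgz
    access_key_id: ((aws_key))
    secret_access_key: ((aws_secret))

One job ends with put: node-modules-tarball (params: {file: tarball/node_modules-*.tgz}); the next job begins with get: node-modules-tarball with passed: pointing at the first job.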
