I have an Azure DevOps pipeline whose resources section is given below:
resources:
  repositories:
  - repository: test
    type: git
    name: Hackfest/template
  pipelines:
  - pipeline: Build
    source: mybuild
    branch: main
    # version: # Latest by default
    trigger:
      branches:
        include:
        - main
I'm trying to invoke the pipeline using a REST API call. The body of the REST API call is given below:
$body='{
    "definition": { "id": "3321" },
    "resources": {
        "pipelines": {
            "Build": {
                "version": "20220304.15",
                "source": "mybuild"
            }
        }
    },
    "sourceBranch": "main"
}'
With the above JSON string I'm able to invoke the pipeline build, but it is not picking up the artifacts from version 20220304.15 of the build "mybuild"; instead it takes the latest artifact version of mybuild and starts the build.
How should I modify the body above so that it picks the correct version of "mybuild"?
Using the Runs - Run Pipeline endpoint (rather than the older Build Queue endpoint your body targets), this worked for me:
"resources": {
"repositories": {
"self": {
"refName": "refs/heads/dev"
}
},
"pipelines": {
"Build": {
"version": "Build_202203040100.1"
}
}
}
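For completeness, here is a minimal sketch of the full call against that endpoint, assuming the pipeline ID 3321 from the question; the {organization}/{project} placeholders and the AZDO_PAT environment variable are my own stand-ins, not from the original post:
# Sketch: call Runs - Run Pipeline with a pinned resource pipeline version.
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$env:AZDO_PAT"))
$body = '{
    "resources": {
        "pipelines": {
            "Build": { "version": "Build_202203040100.1" }
        }
    }
}'
Invoke-RestMethod `
    -Uri "https://dev.azure.com/{organization}/{project}/_apis/pipelines/3321/runs?api-version=6.1-preview.1" `
    -Method Post `
    -Headers @{ Authorization = "Basic $token" } `
    -ContentType "application/json" `
    -Body $body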
I have created a pipeline in Azure DevOps to perform the following three steps:
Retrieve the job definition from one Databricks workspace and save it as a JSON file (Databricks CLI configuration is omitted)
databricks jobs get --job-id $(job_id) > workflow.json
Use this JSON to update the workflow in a second (separate) Databricks workspace (the Databricks CLI is first reconfigured to point to the new workspace)
databricks jobs reset --job-id $(job_id) --json-file workflow.json
Run the updated job in the second Databricks workspace
databricks jobs run-now --job-id $(job_id)
However, my pipeline fails at step 2 with the following error, even though the existing_cluster_id is already defined inside the workflow.json. Any idea?
Error: b'{"error_code":"INVALID_PARAMETER_VALUE","message":"One of job_cluster_key, new_cluster, or existing_cluster_id must be specified."}'
Here is what my workflow.json looks like (hiding some of the details):
{
    "job_id": 123,
    "creator_user_name": "user1",
    "run_as_user_name": "user1",
    "run_as_owner": true,
    "settings": {
        "name": "my-workflow",
        "existing_cluster_id": "abc-def-123-xyz",
        "email_notifications": {
            "no_alert_for_skipped_runs": false
        },
        "webhook_notifications": {},
        "timeout_seconds": 0,
        "notebook_task": {
            "notebook_path": "notebooks/my-notebook",
            "base_parameters": {
                "environment": "production"
            },
            "source": "GIT"
        },
        "max_concurrent_runs": 1,
        "git_source": {
            "git_url": "https://my-org@dev.azure.com/my-project/_git/my-repo",
            "git_provider": "azureDevOpsServices",
            "git_branch": "master"
        },
        "format": "SINGLE_TASK"
    },
    "created_time": 1676477563075
}
I figured out that you don't need to pass the entire job definition JSON retrieved in step 1, but only its "settings" part: jobs reset expects the settings object itself (with fields like existing_cluster_id at the top level), whereas jobs get returns them nested under a "settings" key, which is why the cluster spec was not found. Modifying step 1 to the following solved my issue:
databricks jobs get --job-id $(job_id) | jq .settings > workflow.json
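For reference, the corrected three-step sequence then looks like this (a sketch; $(job_id) is the Azure DevOps variable from the original pipeline, and the CLI reconfiguration between workspaces is still elided):
# 1. Pull only the settings object from the source workspace
databricks jobs get --job-id $(job_id) | jq .settings > workflow.json
# (reconfigure the Databricks CLI to point at the second workspace here)
# 2. Overwrite the job definition in the target workspace
databricks jobs reset --job-id $(job_id) --json-file workflow.json
# 3. Trigger the updated job
databricks jobs run-now --job-id $(job_id)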
I have GitLab secret detection set up, and I wanted to check that it works. I have a Spring project and the job configured. What kind of secret pattern would it pick up?
Does anyone know how I can check that it actually picks something up?
I have tried adding the following to the code (it's made up), but it doesn't get flagged:
aws_secret=AKIAIMNOJVGFDXXXE4OA
If the secrets detector finds a secret, it doesn't fail the job (i.e., it doesn't have a non-zero exit code). In the analyzer output, it will show how many leaks were found, but not what they were. The full details are written to a file called gl-secret-detection-report.json. You can either cat the file in the job so you can see the results in the job output, or upload it as an artifact so it gets recognized as a SAST report.
Here's the secrets-detection job from one of my pipelines, which both cats the file and uploads it as a SAST report artifact. Note: for my purposes, I wasn't able to use the template directly, so I run the analyzer manually:
Secrets Detector:
  stage: sast
  image:
    name: "registry.gitlab.com/gitlab-org/security-products/analyzers/secrets"
  needs: []
  only:
    - branches
  except:
    - main
  before_script:
    - apk add jq
  script:
    - /analyzer run
    - cat gl-secret-detection-report.json | jq '.'
  artifacts:
    reports:
      sast: gl-secret-detection-report.json
The gl-secret-detection-report.json file looks like this for a test repository I set up, where I added a GitLab Runner registration token to a file called TESTING:
{
  "version": "14.0.4",
  "vulnerabilities": [
    {
      "id": "138bf52be327e2fc3d1934e45c93a83436c267e45aa84f5b55f2db87085cb205",
      "category": "secret_detection",
      "name": "GitLab Runner Registration Token",
      "message": "GitLab Runner Registration Token detected; please remove and revoke it if this is a leak.",
      "description": "Historic GitLab Runner Registration Token secret has been found in commit 0a4623336ac54174647e151186c796cf7987702a.",
      "cve": "TESTING:5432b14f2bdaa01f041f6eeadc53fe68c96ef12231b168d86c71b95aca838f3c:gitlab_runner_registration_token",
      "severity": "Critical",
      "confidence": "Unknown",
      "raw_source_code_extract": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
      "scanner": {
        "id": "gitleaks",
        "name": "Gitleaks"
      },
      "location": {
        "file": "TESTING",
        "commit": {
          "author": "author",
          "date": "2022-09-12T17:30:33Z",
          "message": "a commit message",
          "sha": "0a4623336ac54174647e151186c796cf7987702a"
        },
        "start_line": 1
      },
      "identifiers": [
        {
          "type": "gitleaks_rule_id",
          "name": "Gitleaks rule ID gitlab_runner_registration_token",
          "value": "gitlab_runner_registration_token"
        }
      ]
    }
  ],
  "scan": {
    "analyzer": {
      "id": "secrets",
      "name": "secrets",
      "url": "https://gitlab.com/gitlab-org/security-products/analyzers/secrets",
      "vendor": {
        "name": "GitLab"
      },
      "version": "4.3.2"
    },
    "scanner": {
      "id": "gitleaks",
      "name": "Gitleaks",
      "url": "https://github.com/zricethezav/gitleaks",
      "vendor": {
        "name": "GitLab"
      },
      "version": "8.10.3"
    },
    "type": "secret_detection",
    "start_time": "2022-09-12T17:30:54",
    "end_time": "2022-09-12T17:30:55",
    "status": "success"
  }
}
This includes the type of secret found, what file it was in and what line(s), and information from the commit where the secret was added.
If you want to force the job to fail when any secrets are found, you can do that with jq (note: I install jq in the before_script of this job; it's not available in the image by default):
Secrets Detector:
  stage: sast
  image:
    name: "registry.gitlab.com/gitlab-org/security-products/analyzers/secrets"
  needs: []
  only:
    - branches
  except:
    - main
  before_script:
    - apk add jq
  script:
    - /analyzer run
    - cat gl-secret-detection-report.json | jq '.'
    - if [[ "$(jq '.vulnerabilities | length > 0' gl-secret-detection-report.json)" == "true" ]]; then echo "secrets found" && exit 1; fi
  artifacts:
    reports:
      sast: gl-secret-detection-report.json
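As an alternative to the string comparison, jq's -e flag sets the command's exit status from the last expression, so a single script line can fail the job (a sketch, not from my original pipeline):
# Exits non-zero (failing the job) when the vulnerabilities array is non-empty
- jq -e '.vulnerabilities | length == 0' gl-secret-detection-report.json > /dev/null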
I am developing a handful of WordPress projects on GitLab and I would like to use semantic-release to automatically manage releases. To that end I'm trying to accomplish a few additional things:
Update and commit applicable version strings in the codebase via ${nextRelease.version}.
Similarly update version strings in the files generated for the release (which are zipped for convenience).
I'm pretty sure I'm close: I've got the first item (via Google's semantic-release-replace-plugin) but not the second. Up to this point I've tried to do most things via semantic-release's plugin ecosystem, but if need be I can venture into script territory.
My .releaserc looks like:
{
  "branches": [ "main" ],
  "plugins": [
    "@semantic-release/commit-analyzer",
    "@semantic-release/release-notes-generator",
    [
      "@google/semantic-release-replace-plugin",
      {
        "replacements": [
          {
            "files": ["style.css"],
            "from": "Version: .*",
            "to": "Version: ${nextRelease.version}",
            "results": [
              {
                "file": "style.css",
                "hasChanged": true,
                "numMatches": 1,
                "numReplacements": 1
              }
            ],
            "countMatches": true
          }
        ]
      }
    ],
    [
      "@semantic-release/git",
      {
        "assets": ["style.css"]
      }
    ],
    [
      "@semantic-release/gitlab",
      {
        "assets": [
          {"path": "experiments.zip", "label": "zip"}
        ]
      }
    ]
  ]
}
And the .gitlab-ci.yml looks like:
variables:
  GL_TOKEN: $GL_TOKEN
stages:
  - release
before_script:
  - npm install
publish:
  image: cimg/php:7.4-node
  stage: release
  script:
    - npm run build
    - npm run zip
    - npx semantic-release
  only:
    refs:
      - main
Where npm run build compiles some assets and npm run zip is a JavaScript-based script that zips up the desired production-ready files, in this case to generate the experiments.zip.
Any suggestions would be appreciated!
So the main issue here was that compilation was just not occurring at the right time, and I needed to slip a
[
  "@semantic-release/exec",
  {
    "prepareCmd": "node bin/makezip.js"
  }
]
between "@semantic-release/git" and "@semantic-release/gitlab".
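With that in place, the tail of the plugins array in .releaserc reads (excerpt):
[
  "@semantic-release/git",
  { "assets": ["style.css"] }
],
[
  "@semantic-release/exec",
  { "prepareCmd": "node bin/makezip.js" }
],
[
  "@semantic-release/gitlab",
  { "assets": [{ "path": "experiments.zip", "label": "zip" }] }
]
Since @semantic-release/exec's prepareCmd runs during the prepare step, the zip is regenerated after the version strings have been bumped and committed, and before @semantic-release/gitlab uploads it during publish.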
I have an Azure DevOps pipeline and want to reference another pipeline that my pipeline will fetch artefacts from. I am struggling to find a way to actually do this over the REST API.
https://learn.microsoft.com/en-us/rest/api/azure/devops/pipelines/runs/run%20pipeline?view=azure-devops-rest-6.1 specifies there is a BuildResourceParameters or a PipelineResourceParameters, but I cannot find a way to get either to work.
For example:
Source pipeline A produces an artefact B in run C. I want to tell the API to reference artefact B from run C of pipeline A, rather than the latest run.
Anyone?
In your situation, you can use the request body below to select the referenced pipeline version:
{
  "stagesToSkip": [],
  "resources": {
    "repositories": {
      "self": {
        "refName": "refs/heads/master"
      }
    },
    "pipelines": {
      "myresourcevars": {
        "version": "1313"
      }
    }
  },
  "variables": {}
}
Note: the name 'myresourcevars' is the alias of the pipeline resource you defined in your YAML file.
How do I store (in the Archive stage) artifacts to Artifactory after a Jenkins build, with the SHA-1 checksum in the filename?
E.g. the filename I want stored in Artifactory: sha__2340ursoddpkjfsodfj0429trjw0fjosdfkjsao90024r.h
Artifactory checksum SHA-1: 2340ursoddpkjfsodfj0429trjw0fjosdfkjsao90024r
def server = Artifactory.server 'my-jfrog-artifactoryserver'
def uploadSp = """{
    "files": [
        {
            "pattern": "*.h",
            "target": "builds/myhfiles/"
        }
    ]
}"""

node('h-builder') {
    stage('Archive') {
        // Archive the headers in Jenkins, then upload them to Artifactory
        archiveArtifacts artifacts: '**/*.h', fingerprint: true
        server.upload spec: uploadSp, failNoOp: true
    }
}
You can try using Placeholders for this:
{
    "files": [
        {
            "pattern": "(*).h",
            "target": "builds/myhfiles/{1}"
        }
    ]
}
Read more about File Specs in the JFrog File Specs documentation, and see the Jenkins Artifactory plugin documentation for the pipeline upload syntax.