Bitbucket Pipeline Call via API with parameters - bitbucket-pipelines

I have a bitbucket pipeline that will be triggered from outside:
custom:
  # This pipeline will be triggered automatically when the setup succeeds
  test-deployment:
    - variables:
        - name: build_number
    - step:
        name: "Test"
        image: atlassian/default-image:latest
        script:
          - echo $build_number
          - export
I have to pass the parameter "build_number" to this pipeline, which I tried with the following call:
curl -X POST -is -u user:pass \
  -H 'Content-Type: application/json' \
  https://api.bitbucket.org/2.0/repositories/user/repo/pipelines/ \
  -d '{
    "target": {
      "ref_type": "branch",
      "type": "pipeline_ref_target",
      "ref_name": "feature/pipeline-tests",
      "selector": {
        "type": "custom",
        "pattern": "test-deployment"
      }
    },
    "variables": [
      {
        "key": "build_number",
        "value": "202"
      }
    ]
  }'
But the "build_number" was not set. If I call the pipeline via the Bitbucket UI it works. What's wrong here?

I think you could try adding the missing parameter "secured" (note that JSON booleans are lowercase):
{
  "key": "build_number",
  "value": "202",
  "secured": false
}
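For completeness, the full corrected request with valid JSON and the secured flag might look like this (user, repo, and branch are the placeholders from the question):
curl -X POST -is -u user:pass \
  -H 'Content-Type: application/json' \
  https://api.bitbucket.org/2.0/repositories/user/repo/pipelines/ \
  -d '{
    "target": {
      "ref_type": "branch",
      "type": "pipeline_ref_target",
      "ref_name": "feature/pipeline-tests",
      "selector": {
        "type": "custom",
        "pattern": "test-deployment"
      }
    },
    "variables": [
      {
        "key": "build_number",
        "value": "202",
        "secured": false
      }
    ]
  }'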

Related

How to use secret variables from an Azure variable group in the script part of a pipeline?

We have the following variables defined in a variable group in Azure:
"SMDB_DATABASE": {
"isSecret": null,
"value": "smdb_all"
},
"SMDB_HOSTNAME": {
"isSecret": null,
"value": "localhost"
},
"SMDB_PASSWORD": {
"isSecret": true,
"value": <some_password>
},
"SMDB_ROOT_PASSWORD": {
"isSecret": true,
"value": <some_root_password>
},
"SMDB_USER": {
"isSecret": null,
"value": "IntegrationTest"
}
}
We need to use them when running Python integration tests via a pipeline. The pipeline looks as follows:
jobs:
  - job: IntegrationTests
    variables:
      - group: <the_group_name>
    steps:
      - script: |
          pdm run pytest \
            --variables "$VARIABLE_FILE" \
            --test-run-title="$TEST_TITLE" \
            --napoleon-docstrings \
            --doctest-modules \
            --color=yes \
            --junitxml=junit/test-results.xml \
            integration
        displayName: 'Tests'
        env:
          DB_USER: $(SMDB_USER)
          DB_PASSWORD: $(SMDB_PASSWORD)
          DB_HOST: $(SMDB_HOST)
          DB_DATABASE: $(SMDB_DATABASE)
In the Python code, the variables are read like this:
host = os.environ.get("DB_HOST")
It works correctly except for SMDB_PASSWORD, because it is a secret value. When I look at the DB_PASSWORD value, it is "***".
What should I do in order to correctly read and use this secret variable in my Python code?

GitLab secret detection, how to test it works

I have GitLab secret detection set up, and I wanted to check that it works. I have a Spring project and the job configured. What kind of secret pattern would it pick up?
Does anyone know how I can check that it actually detects something?
I have tried adding the following to the code (it's made up), but it doesn't get flagged:
aws_secret=AKIAIMNOJVGFDXXXE4OA
If the secrets detector finds a secret, it doesn't fail the job (i.e., it still exits with code 0). The analyzer output shows how many leaks were found, but not what they were. The full details are written to a file called gl-secret-detection-report.json. You can either cat the file in the job so the results appear in the job output, or upload it as an artifact so it gets recognized as a SAST report.
Here's the secrets detection job from one of my pipelines, which both cats the file and uploads it as a SAST report artifact. Note: for my purposes, I wasn't able to use the template directly, so I run the analyzer manually:
Secrets Detector:
  stage: sast
  image:
    name: "registry.gitlab.com/gitlab-org/security-products/analyzers/secrets"
  needs: []
  only:
    - branches
  except:
    - main
  before_script:
    - apk add jq
  script:
    - /analyzer run
    - cat gl-secret-detection-report.json | jq '.'
  artifacts:
    reports:
      sast: gl-secret-detection-report.json
The gl-secret-detection-report.json file looks like this for a test repository I set up, where I added a GitLab Runner registration token to a file called TESTING:
{
  "version": "14.0.4",
  "vulnerabilities": [
    {
      "id": "138bf52be327e2fc3d1934e45c93a83436c267e45aa84f5b55f2db87085cb205",
      "category": "secret_detection",
      "name": "GitLab Runner Registration Token",
      "message": "GitLab Runner Registration Token detected; please remove and revoke it if this is a leak.",
      "description": "Historic GitLab Runner Registration Token secret has been found in commit 0a4623336ac54174647e151186c796cf7987702a.",
      "cve": "TESTING:5432b14f2bdaa01f041f6eeadc53fe68c96ef12231b168d86c71b95aca838f3c:gitlab_runner_registration_token",
      "severity": "Critical",
      "confidence": "Unknown",
      "raw_source_code_extract": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
      "scanner": {
        "id": "gitleaks",
        "name": "Gitleaks"
      },
      "location": {
        "file": "TESTING",
        "commit": {
          "author": "author",
          "date": "2022-09-12T17:30:33Z",
          "message": "a commit message",
          "sha": "0a4623336ac54174647e151186c796cf7987702a"
        },
        "start_line": 1
      },
      "identifiers": [
        {
          "type": "gitleaks_rule_id",
          "name": "Gitleaks rule ID gitlab_runner_registration_token",
          "value": "gitlab_runner_registration_token"
        }
      ]
    }
  ],
  "scan": {
    "analyzer": {
      "id": "secrets",
      "name": "secrets",
      "url": "https://gitlab.com/gitlab-org/security-products/analyzers/secrets",
      "vendor": {
        "name": "GitLab"
      },
      "version": "4.3.2"
    },
    "scanner": {
      "id": "gitleaks",
      "name": "Gitleaks",
      "url": "https://github.com/zricethezav/gitleaks",
      "vendor": {
        "name": "GitLab"
      },
      "version": "8.10.3"
    },
    "type": "secret_detection",
    "start_time": "2022-09-12T17:30:54",
    "end_time": "2022-09-12T17:30:55",
    "status": "success"
  }
}
This includes the type of secret found, what file it was in and what line(s), and information from the commit where the secret was added.
If you wanted to force the job to fail if any secrets were found, you can do that with jq (note: jq is installed in the before_script of this job; it's not available in the image by default):
Secrets Detector:
  stage: sast
  image:
    name: "registry.gitlab.com/gitlab-org/security-products/analyzers/secrets"
  needs: []
  only:
    - branches
  except:
    - main
  before_script:
    - apk add jq
  script:
    - /analyzer run
    - cat gl-secret-detection-report.json | jq '.'
    - if [[ "$(jq '.vulnerabilities | length > 0' gl-secret-detection-report.json)" == "true" ]]; then echo "secrets found" && exit 1; fi
  artifacts:
    reports:
      sast: gl-secret-detection-report.json
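Alternatively, jq's -e flag sets a non-zero exit status when the expression evaluates to false or null, which avoids the string comparison entirely. A minimal sketch of the same check as a single script line:
script:
  - /analyzer run
  # jq -e exits non-zero when the result is false, failing the job if any vulnerabilities exist.
  - jq -e '.vulnerabilities | length == 0' gl-secret-detection-report.json > /dev/null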

Add Databricks API to configure init script in existing bash script

I would like to call the Databricks init script API from my existing bash script. How can I do this? Here is the API call provided by Databricks:
curl -n -X POST -H 'Content-Type: application/json' -d '{
  "cluster_id": "",
  "num_workers": 1,
  "spark_version": "8.4.x-scala2.12",
  "node_type_id": "$node_type",
  "cluster_log_conf": {
    "dbfs": {
      "destination": "dbfs:/cluster-logs"
    }
  },
  "init_scripts": [ {
    "dbfs": {
      "destination": "dbfs:/FileStore/shared_uploads/kafka_keytabs/CopyKrbFiles.sh"
    }
  } ]
}' https://<databricks-instance>/api/2.0/clusters/edit
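One way to embed this in an existing bash script is to build the JSON payload in a double-quoted heredoc, so that shell variables such as $cluster_id and $node_type actually expand (inside the single-quoted -d '...' string above they would not). A minimal sketch, with hypothetical values and the placeholder instance URL kept from the question:
#!/bin/bash
# Hypothetical values; adjust to your workspace.
cluster_id="1234-567890-abcde123"
node_type="Standard_DS3_v2"

# Unquoted heredoc delimiter, so $cluster_id and $node_type are substituted.
payload=$(cat <<EOF
{
  "cluster_id": "$cluster_id",
  "num_workers": 1,
  "spark_version": "8.4.x-scala2.12",
  "node_type_id": "$node_type",
  "cluster_log_conf": {
    "dbfs": { "destination": "dbfs:/cluster-logs" }
  },
  "init_scripts": [ {
    "dbfs": { "destination": "dbfs:/FileStore/shared_uploads/kafka_keytabs/CopyKrbFiles.sh" }
  } ]
}
EOF
)

# -n reads credentials from ~/.netrc, as in the original example.
curl -n -X POST -H 'Content-Type: application/json' \
  -d "$payload" "https://<databricks-instance>/api/2.0/clusters/edit"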

How can I pass pipeline variable to parameters file for blueprint assignment

I'm trying to create an Azure DevOps pipeline for deploying an Azure Blueprint. There are some fields in the parameters file (JSON) that I want to be configurable. How can I pass these values as pipeline variables and use them in the parameters file?
I tried defining a pipeline variable and referencing it in the parameters file like this: "$(var-name)", but it didn't work. Is there a way to solve this?
Below is my pipeline definition; I'm using the AzureBlueprint extension for creating and assigning the blueprint:
steps:
  - task: CreateBlueprint@1
    inputs:
      azureSubscription: $(serviceConnection)
      BlueprintName: $(blueprintName)
      BlueprintPath: '$(blueprintPath)'
      AlternateLocation: false
      PublishBlueprint: true
  - task: AssignBlueprint@1
    inputs:
      azureSubscription: $(serviceConnection)
      AssignmentName: '$(blueprintName)-assignment'
      BlueprintName: $(blueprintName)
      ParametersFile: '$(blueprintPath)/assign.json'
      SubscriptionID: $(subscriptionId)
      Wait: true
      Timeout: 500
and my parameters file:
"parameters":{
"organization" : {
"value": "xxxx"
},
"active-directory-domain-services_ad-domain-admin-password" : {
"reference": {
"keyVault": {
"id": "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.KeyVault/vaults/xxxx"
},
"secretName": "xxxx"
}
},
"jumpbox_jumpbox-local-admin-password" : {
"reference": {
"keyVault": {
"id": "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.KeyVault/vaults/xxxx"
},
"secretName": "xxxx"
}
},
"keyvault_ad-domain-admin-user-password" : {
"value" : "xxxx"
},
"keyvault_deployment-user-object-id" : {
"value" : "xxxx"
},
"keyvault_jumpbox-local-admin-user-password" : {
"value" : "xxxx"
}
}
Since the tasks you are using (CreateBlueprint and AssignBlueprint) don't support overriding parameters, you have two options:
Use the Azure CLI az blueprint command to directly create and assign blueprints.
Change the parameters file by either using JSON variable substitution or a small PowerShell script (see below).
Sample:
$paramFile = Get-Content ./azuredeploy.parameters.json | ConvertFrom-Json
$paramFile.parameters.organization.value = "your-org-name"
# -Depth is needed: ConvertTo-Json truncates nested objects (such as the keyVault references) at depth 2 by default.
$paramFile | ConvertTo-Json -Depth 10 | Set-Content ./azuredeploy.parameters.json
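If the substitution step runs in bash instead of PowerShell, a similar patch can be sketched with jq (ORG_NAME here is a hypothetical pipeline variable):
# Patch the organization value with a pipeline variable, then overwrite the file.
jq --arg org "$ORG_NAME" '.parameters.organization.value = $org' \
  assign.json > assign.tmp.json && mv assign.tmp.json assign.json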
Be aware that the Task you are using hasn't received an update within the last 17 months (here is the GitHub repository).
AssignBlueprint@1 doesn't natively support this. However, you can modify assign.json using JSON variable substitution.
It comes down to having Azure Pipelines variables whose names match the path to the leaf in the JSON file that you want to replace.
Here is an example:
variables:
  Data.DebugMode: disabled
  Data.DefaultConnection.ConnectionString: 'Data Source=(prodDB)\MSDB;AttachDbFilename=prod.mdf;'
  Data.DBAccess.Users.0: Admin-3
  Data.FeatureFlags.Preview.1.NewWelcomeMessage: AllAccounts

# Update appsettings.json via the FileTransform task.
- task: FileTransform@1
  displayName: 'File transformation: appsettings.json'
  inputs:
    folderPath: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    targetFiles: '**/appsettings.json'
    fileType: json
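For illustration, those variable names would line up with an appsettings.json shaped roughly like this (a hypothetical file: dots in the variable names address nested objects, and numeric segments address array indices):
{
  "Data": {
    "DebugMode": "enabled",
    "DefaultConnection": {
      "ConnectionString": "Data Source=(devDB)\\MSDB;AttachDbFilename=dev.mdf;"
    },
    "DBAccess": {
      "Users": ["Admin-1", "Admin-2"]
    },
    "FeatureFlags": {
      "Preview": [
        { "NewWelcomeMessage": "NoAccounts" },
        { "NewWelcomeMessage": "SomeAccounts" }
      ]
    }
  }
}
After the transform, DebugMode becomes disabled, Users[0] becomes Admin-3, and the second Preview entry's NewWelcomeMessage becomes AllAccounts.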

Getting "The body you sent contains an unknown key." when trying to create a Contentful entry

I have created a Contentful model called "User" with two fields:
id - text, unique, required
email - text, optional
When I try to create a new entry via the Content Management API using these parameters:
Headers:
  X-Contentful-Content-Type: user
  Content-Type: application/vnd.contentful.management.v1+json
Method:
  PUT
URL:
  https://api.contentful.com/spaces/qilo7tiaixh8/environments/entries/
Body:
{
  "id": "whatever",
  "email": "peter@petervukovic.com"
}
I get the following error:
{
  "requestId": "2849bbcd7ee0486bb36b47927071f37b",
  "sys": {
    "type": "Error",
    "id": "UnknownKey"
  },
  "message": "The body you sent contains an unknown key.",
  "details": {
    "errors": [
      {
        "keys": [
          "id",
          "email"
        ]
      }
    ]
  }
}
I have no idea what I'm doing wrong, as the examples in the official documentation aren't helpful (they assume multilingual content I'm not using) and there are no debugging hints.
Contentful DevRel here. 👋
I just tried it, and the following curl command works for me on a user content type that defines a title field.
curl --include \
  --request PUT \
  --header 'Authorization: Bearer ...' \
  --header 'Content-Type: application/vnd.contentful.management.v1+json' \
  --header 'X-Contentful-Content-Type: user' \
  --data-binary '{
    "fields": {
      "title": {
        "en-US": "Hello Test"
      }
    }
  }' \
  https://api.contentful.com/spaces/.../environments/master/entries/test-1
It looks like you were missing the fields property in your payload. As for the localization part, I think you're required to provide the locale for your field values; in my example, en-US is the default locale and it is required.
When using PUT, you have to define or come up with an entry id yourself.
To create an entry without defining an id, have a look at the docs on the Entry collection endpoint:
curl --include \
  --request POST \
  --header 'Authorization: Bearer ...' \
  --header 'Content-Type: application/vnd.contentful.management.v1+json' \
  --header 'X-Contentful-Content-Type: user' \
  --data-binary '{
    "fields": {
      "title": {
        "en-US": "Hello Test 2"
      }
    }
  }' \
  https://api.contentful.com/spaces/.../environments/master/entries/
