How to use global credentials in a Python script invoked by a Jenkins pipeline - python-3.x

I'm very new to working with Jenkins. So far I was able to run a simple pipeline with a simple pip install, but now I need to pass global credentials from Jenkins into a Python script, test.py, invoked by the Jenkinsfile.
pipeline {
    options {
        timeout(time: 30, unit: 'MINUTES')
        buildDiscarder(logRotator(numToKeepStr: '30', artifactNumToKeepStr: '30'))
    }
    agent { label 'ops_slave' }
    stages {
        stage('Environment Build') {
            steps {
                echo "Hello World!"
                sh "echo Hello from the shell"
                sh "hostname"
                sh "uptime"
                sh "python3 -m venv test_env"
                sh "source ./test_env/bin/activate"
                sh "pip3 install pandas psycopg2"
                sh """echo the script is working"""
                withCredentials([[
                    $class: 'UsernamePasswordMultiBinding',
                    credentialsId: '98',
                    usernameVariable: 'user',
                    passwordVariable: 'pw',
                ]]) {
                    sh """python3 bartek-jenkins-testing/python/test.py"""
                }
            }
        }
    }
}
I've seen implementations that use argparse, but that's above my level at this point, and I believe there is a way to reference the credentials from the Python script or from Jenkins directly and pass them to the Python script. I've been googling for some time now, but I'm not sure the questions I'm asking are correct.
My Python script should be able to get the username and password from Jenkins global credentials ID 98:
print('Hello World this is python')
import pandas as pd
print(pd.__version__)
import pyodbc
import psycopg2

# can pass environment variables
connection = psycopg2.connect(
    host="saturn-dv",
    database="saturn_dv",
    port='8080',
    user='saturn_user_bartek_malysz',
    password='')
connection.set_session(readonly=True)

query = """
SELECT table_name FROM information_schema.tables
WHERE table_schema = 'public'
ORDER BY table_schema, table_name;"""
data = pd.read_sql(query, connection)
print(data)

A straightforward way is to leverage environment variables, as follows:
// Jenkinsfile
withCredentials([[
    $class: 'UsernamePasswordMultiBinding',
    credentialsId: '98',
    usernameVariable: 'user',
    passwordVariable: 'pw',
]]) {
    sh """
    export DB_USERNAME="${user}"
    export DB_PASSWORD="${pw}"
    python3 bartek-jenkins-testing/python/test.py
    """
}
// test.py
import os
import psycopg2

connection = psycopg2.connect(
    host="saturn-dv",
    database="saturn_dv",
    port='8080',
    user=os.getenv('DB_USERNAME'),
    password=os.getenv('DB_PASSWORD'))
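Note that ${user} and ${pw} are interpolated by Groovy inside the double-quoted sh string, which triggers Jenkins' insecure-interpolation warning; since withCredentials also exports user and pw as environment variables, a single-quoted sh string (letting the shell expand $user and $pw) avoids that. On the Python side, a minimal defensive sketch (reusing the DB_USERNAME/DB_PASSWORD names from the snippet above) fails fast when the variables were not passed through:

# test.py (sketch): fail fast if Jenkins did not export the credentials
import os
import psycopg2

user = os.environ.get('DB_USERNAME')
password = os.environ.get('DB_PASSWORD')
if not user or not password:
    raise RuntimeError('DB_USERNAME/DB_PASSWORD are not set - '
                       'is this running inside withCredentials?')

connection = psycopg2.connect(
    host="saturn-dv",
    database="saturn_dv",
    port='8080',
    user=user,
    password=password)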

Related

how to write the correct pipeline jenkins docker groovy node

I am rewriting my pipeline as a scripted (node) pipeline. I need to understand how to perform the post step in node; right now an error is coming from stage('Deploy'):
node {
    checkout scm
    def customImage = docker.build("python-web-tests:${env.BUILD_ID}")
    customImage.inside {
        sh "python ${env.CMD_PARAMS}"
    }
    stage('Deploy') {
        post {
            always {
                allure([
                    includeProperties: false,
                    jdk: '',
                    properties: [],
                    reportBuildPolicy: 'ALWAYS',
                    results: [[path: 'report']]
                ])
                cleanWs()
            }
        }
    }
}
and this is the old pipeline
pipeline {
    agent { label "slave_first" }
    stages {
        stage("Creating the container image") {
            steps {
                catchError {
                    script {
                        docker.build("python-web-tests:${env.BUILD_ID}", "-f Dockerfile .")
                    }
                }
            }
        }
        stage("Running and debugging the test") {
            steps {
                sh 'ls'
                sh 'docker run --rm -e REGION=${REGION} -e DATA=${DATA} -e BUILD_DESCRIPTION=${BUILD_URL} -v ${WORKSPACE}:/tmp python-web-tests:${BUILD_ID} /bin/bash -c "python ${CMD_PARAMS} || exit_code=$?; chmod -R 777 /tmp; exit $exit_code"'
            }
        }
    }
    post {
        always {
            allure([
                includeProperties: false,
                jdk: '',
                properties: [],
                reportBuildPolicy: 'ALWAYS',
                results: [[path: 'report']]
            ])
            cleanWs()
        }
    }
}
I tried to carry over the way the Allure report is created, but nothing worked. With the version above almost everything works out, but I still need to add environment variables to the build, for example the ones specified with -e DATA=${DATA}. How do I add those?
I don't recommend switching from a declarative to a scripted pipeline.
You lose the ability to use the tooling that comes with the declarative approach, such as syntax checkers.
If you still want to use the scripted approach, try this:
node('slave_first') {
    stage('Build') {
        checkout scm
        def customImage = docker.build("python-web-tests:${env.BUILD_ID}")
        customImage.inside {
            sh "python ${env.CMD_PARAMS}"
        }
    }
    stage('Deploy') {
        allure([
            includeProperties: false,
            jdk: '',
            properties: [],
            reportBuildPolicy: 'ALWAYS',
            results: [[path: 'report']]])
        cleanWs()
    }
}
There are no post and always directives in scripted pipelines; it is on you to catch all exceptions and set the status of the job yourself. I guess you were using this page: https://www.jenkins.io/doc/book/pipeline/syntax/, but that's a mistake: that page only covers the declarative approach, and only in a few cases does it show scripted code in its examples.
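In a scripted pipeline the usual way to emulate post { always { ... } } is a try/finally wrapped around the stages; a minimal sketch of that idea:

node('slave_first') {
    try {
        stage('Build') {
            // checkout, build and test steps go here
        }
    } finally {
        // runs whether the stages passed or failed, like post/always
        allure([
            includeProperties: false,
            jdk: '',
            properties: [],
            reportBuildPolicy: 'ALWAYS',
            results: [[path: 'report']]])
        cleanWs()
    }
}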
Also, I don't know if you have a default agent label set in your Jenkins config, but looking at your declarative pipeline I think you missed the 'slave_first' argument in the node step.
the ones specified with -e DATA=${DATA}, how do I add those?
That's a Docker question, not a Jenkins one. If you want to launch a Docker image and then also have access to reports located in that container, you should mount the workspace/folder where those output files land, and also pass the location of those files to allure.
I suggest you try this (see the sketch after the list):
mount some subfolder of the workspace into the docker container
cat the test report file to check that it is visible
add the allure report step, passing that file's location to it
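For the -e DATA=${DATA} part, environment variables are added on the docker run command itself, just as in the old pipeline; a sketch (the report path and the DATA/CMD_PARAMS parameters are assumptions carried over from the question):

sh '''
docker run --rm \
    -e DATA="${DATA}" \
    -v "${WORKSPACE}/report:/tmp/report" \
    python-web-tests:${BUILD_ID} \
    python ${CMD_PARAMS}
'''

Because the Groovy string is single-quoted, the shell expands ${DATA}, ${WORKSPACE}, ${BUILD_ID} and ${CMD_PARAMS} from the build's environment; the allure step can then be pointed at the mounted report folder.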

How to get the passed parameter inside a Python container in an AWS Batch job?

I have 2 job definitions (job-1, job-2) and I'm executing Job1 first. Job1 then submits Job2, which starts executing. I need to pass some parameters to Job2 when submitting the job. Below is my Python 3 code:
# job1
import boto3
import os

env = os.environ.get('environment')
batch = boto3.client('batch')

def submit_job():
    return batch.submit_job(
        jobName='Job2',
        jobQueue='job2-queue-dev',
        jobDefinition='job-2',
        containerOverrides={
            'environment': [
                {
                    'name': 'environment',
                    'value': env
                },
            ]
        },
        parameters={
            'opco': '123',
            'app': 'app1'
        },
    )

submit_job()
In Job2 I can easily get the environment variable with the code below.
# job2
import os

env = os.environ.get('environment')

def get_index_name(env):
    return 'liberty-' + env
....
So my question is: how can we get those parameters (opco, app) inside Job2?
FYI, I could pass them as environment variables, but I want to know how parameter retrieval is done here.
Thanks in advance.
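For context, AWS Batch does not expose submit-time parameters as environment variables; it substitutes them into Ref::key placeholders in the job definition's command. A sketch of that mechanism (the job definition command shown is an assumption, not taken from the question):

# job-2's job definition would specify a command such as:
#   ["python3", "job2.py", "Ref::opco", "Ref::app"]
# Batch replaces Ref::opco / Ref::app with the values passed in
# submit_job(parameters={...}), so job2 reads them as plain arguments.
import sys

opco = sys.argv[1]  # '123'
app = sys.argv[2]   # 'app1'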

Jenkins ImportError module not found

I have a file structure like this:
Main/
|----unit_tests
| |---test_main.py
| |---__init__.py
|--main.py
|--__init__.py
|--requirements.txt
test_main.py:
import unittest.mock
from main import MainFunction
from nose.tools import assert_is_not_none, assert_is_none, assert_equal
class TestMainLogic(unittest.TestCase):
~~~ code here ~~~~
When running my unit tests in test_main.py from PyCharm, all is well. However, when I try to run them in a Jenkins setup, I get:
+ python3 unit_tests/test_main.py
Traceback (most recent call last):
File "unit_tests/test_main.py", line 4, in <module>
from main import MainFunction
ModuleNotFoundError: No module named 'main'
Here's my Jenkinsfile:
pipeline
{
    environment
    {
        PATH = "/home/jenkins/.local/bin:$PATH"
    }
    stages
    {
        stage('build')
        {
            steps
            {
                sh 'python3 --version'
                sh 'wget https://bootstrap.pypa.io/get-pip.py'
                sh 'python3 get-pip.py'
                sh 'pip3 install -r requirements.txt'
                echo 'build phase has finished'
            }
        }
        stage('Lint')
        {
            steps
            {
                sh 'pylint unit_tests/test_main.py'
            }
        }
        stage('test')
        {
            steps
            {
                sh 'python3 unit_tests/test_main.py'
            }
            post
            {
                always
                {
                    cleanWs()
                }
            }
        }
    }
}
I've followed several posts and I'm just not getting an answer that works. I tried Python module import failure in Jenkins and its linked answers. I do not have access to SSH into the Jenkins server and find the path, so I need a way to set it at execution time.
You need to use an absolute import instead of a relative import in your test_main.py file.
In test_main.py, replace:
from main import MainFunction
# SOME CODE
with:
from Main.main import MainFunction
# SOME CODE
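Note that this only works if the directory containing Main/ is on sys.path when Jenkins runs the test. An alternative sketch that does not depend on the working directory (an assumption based on the layout shown above) is to put the project root on sys.path at the top of the test file:

# unit_tests/test_main.py - make the project root importable regardless of
# where the Jenkins agent launches the test from
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from main import MainFunction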

Unable to interpolate sensitive environment variables

I have a piece of code that runs like this:
package core.jenkins

class Utils implements Serializable {
    def script

    Utils(script) {
        this.script = script
    }

    def func() {
        script.withCredentials([script.usernamePassword(credentialsId: 'chartmuseum-basic-auth', usernameVariable: 'USER', passwordVariable: 'PASSWORD')]) {
            script.sh "helm repo add --username ${script.USER} --password ${script.PASSWORD} chartmuseum \"http://${chartmuseumHostname}:8080\""
        }
    }
}
The above works perfectly fine, but I do get a warning:
Warning: A secret was passed to "sh" using Groovy String interpolation, which is insecure.
Affected argument(s) used the following variable(s): [PASSWORD, USER]
See https://jenkins.io/redirect/groovy-string-interpolation for details.
+ helm repo add --username **** --password **** chartmuseum http://apps-chartmuseum.apps.svc.cluster.local:8080
So, following the guide, I'm doing the following:
script.withCredentials([script.usernamePassword(credentialsId: 'chartmuseum-basic-auth', usernameVariable: 'USER', passwordVariable: 'PASSWORD')]) {
    script.sh 'helm repo add --username $script.USER --password $script.PASSWORD chartmuseum "http://$chartmuseumHostname:8080"'
}
But when it runs, the variable values are not properly substituted and I get:
+ helm repo add --username .USER --password .PASSWORD chartmuseum http://:8080
Error: Looks like "http://:8080" is not a valid chart repository or cannot be reached: Get http://:8080/index.yaml: dial tcp :8080: connect: connection refused
So neither the credentials nor the value of the chartmuseumHostname variable is being substituted correctly. What am I missing here?
Actually, withCredentials() creates environment variables which you can access from shell scripts.
See here: https://www.jenkins.io/doc/pipeline/steps/credentials-binding/
Try using the shell variables directly:
script.sh 'helm repo add --username $USER --password $PASSWORD chartmuseum "http://$chartmuseumHostname:8080"'
Just tying together the answers already on this post: withCredentials makes it so that you should be able to use the variables directly (answer by @catalin), the single quotes make Jenkins stop complaining about security, and if you want to be extra careful, you can double-quote the variable values as suggested in the docs for withCredentials.
This should give you something like this:
script.withCredentials([script.usernamePassword(credentialsId: 'chartmuseum-basic-auth',
                                                usernameVariable: 'USER',
                                                passwordVariable: 'PASSWORD')]) {
    script.sh 'helm repo add --username "$USER" --password "$PASSWORD" chartmuseum "http://$chartmuseumHostname:8080"'
}
which still leaves us with the question of why you are calling things with the script. prefix, as mentioned in the comments by @matt-schuchard.
I tried the suggestion by @Catalin (https://www.jenkins.io/doc/pipeline/steps/credentials-binding/, using the shell variables directly), but for me adding double quotes inside single quotes doesn't work.
The only solution I found is taking the non-secret variables out of the single quotes, like:
'myscript $secretvariable' + notsecretvariable
Examples:
Test1: Try using the recommended solution (Jenkins docs/@catalin)
Code:
sh label: 'Test1', script: 'echo this is a secret $docker_pwd this is not "$dockerRegistry"'
Result: variable dockerRegistry is not interpolated/resolved
15:50:43 [Pipeline] sh (Test1)
15:50:43 + echo this is a secret **** this is not ''
15:50:43 this is a secret **** this is not
Test2: Take non-sensitive variable out of the single quotes:
sh label: 'Test2', script: 'echo this is a secret $docker_pwd this is not' + dockerRegistry
Result: variable dockerRegistry is properly resolved
15:50:44 [Pipeline] sh (Test2)
15:50:44 + echo this is a secret **** this is not my.repositories.xx
15:50:44 this is a secret **** this is not my.repositories.xx
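A way to keep the whole command in single quotes is to promote the Groovy variable to an environment variable first; a sketch using withEnv (CHART_HOST is an illustrative name, not from the question):

script.withEnv(["CHART_HOST=${chartmuseumHostname}"]) {
    script.withCredentials([script.usernamePassword(credentialsId: 'chartmuseum-basic-auth',
                                                    usernameVariable: 'USER',
                                                    passwordVariable: 'PASSWORD')]) {
        // Single quotes: the shell resolves $USER, $PASSWORD and $CHART_HOST,
        // so no secret passes through Groovy string interpolation.
        script.sh 'helm repo add --username "$USER" --password "$PASSWORD" chartmuseum "http://$CHART_HOST:8080"'
    }
}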

replacing file variables with envsubst in a jenkins pipeline

I want to replace some variables of the form $variablename in a file, at runtime, from a Jenkins pipeline script. It seems envsubst is the best fit for my use case. When I execute it on the command line on a Linux server it works fine, but when I execute it through the Jenkins pipeline in an sh step, nothing happens.
sonar-scanner.properties:
sonar.projectKey=Project:MavenTest$BRANCHNAME
sonar.projectName=MavenTest$BRANCHNAME
Example from the command line on the Linux box:
$ export BRANCHNAME=develop
$ envsubst '$BRANCHNAME'
Output:
sonar.projectKey=Project:MavenTestdevelop
sonar.projectName=MavenTestdevelop
But when I execute it through the Jenkins file as a script, nothing is changed in the file.
Jenkins script:
node {
    stage('checkout') {
        checkout([$class: 'GitSCM', branches: [[name: ':^(?!origin/master$|origin/develop$).*']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'c0ce73db-3864-4360-9c17-d87caf8a9ea5', url: 'http://172.16.4.158:17990/scm/ctoo/testmaven.git']]])
    }
    stage('initialize variables') {
        // Configuring BRANCH_NAME variable
        sh 'git name-rev --name-only HEAD > GIT_BRANCH'
        sh label: '', script: 'cut -d \'/\' -f 3 GIT_BRANCH > BRANCH'
        branchname = readFile('BRANCH').trim()
        env.BRANCHNAME = branchname
    }
    stage('build & SonarQube analysis') {
        withSonarQubeEnv('Sonar') {
            sh "envsubst '$BRANCHNAME' <sonar-scanner.properties"
        }
    }
}
Output:
[Pipeline] sh
envsubst repotest
sonar.projectKey=Project:MavenTest$BRANCHNAME
sonar.projectName=MavenTest$BRANCHNAME
Can someone please help me?
Hi, I have no idea about envsubst, but this can be achieved by passing the sonar parameters via the command line to the scanner; see the example below:
withSonarQubeEnv('Sonar') {
    sh "<sonarscanner path> -Dsonar.projectKey=Project:MavenTest$BRANCHNAME"
}
I had this problem and solved it by using the escape character \. Inside a double-quoted Groovy string, Groovy interpolates $BRANCHNAME itself before the shell ever sees it (which is why your log shows envsubst repotest), so the dollar sign has to be escaped to reach the shell.
For example:
sh "envsubst '\${SERVER_NAME}' < ./config/nginx/nginx.conf.template > ./config/nginx/nginx.conf"
