How to extend a file in Groovy?

I have a YAML file ('config.yaml') with this content:
egress:
  hostName: example.com
  tls:
    - hosts:
        - example.com
How can I extend this file in groovy?
Result must be like this:
egress:
  hostName: example.com
  tls:
    - hosts:
        - example.com
tag: 1.1.1
Thanks!

It depends.
If you want to treat config.yaml just as an ordinary text file, then you can just use common Groovy ways to work with IO:
String path = 'path/to/file'
File cfgFile = new File(path, 'config.yaml')
cfgFile.withWriterAppend { writer ->
    writer.writeLine('\ntag: 1.1.1')
}
or just
new File(path, 'config.yaml') << '\ntag: 1.1.1'
But if you want to build something more sophisticated and aware of the YAML format of the file, then you can use the SnakeYAML library:
@Grapes([
    @Grab(group='org.yaml', module='snakeyaml', version='1.25')
])
import org.yaml.snakeyaml.Yaml

String path = '../data/'
File cfgFile = new File(path, 'config.yaml')

Yaml yaml = new Yaml()
Map content = [:]
cfgFile.withReader { reader ->
    content = yaml.load(reader)
}
content.put('tag', '1.1.1')
cfgFile.withWriter { writer ->
    yaml.dump(content, writer)
}
If you use Groovy 3.0+, you can use the built-in YamlSlurper and YamlBuilder instead, as sketched below.
(Groovy 3.0 is not released yet at the time of writing this answer.)
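A minimal sketch of that approach, assuming Groovy 3.0+ and the same config.yaml as in the question (the groovy.yaml API may still change before the final release):
import groovy.yaml.YamlBuilder
import groovy.yaml.YamlSlurper

File cfgFile = new File('path/to/config.yaml')

// parse the existing YAML into plain maps and lists
def content = new YamlSlurper().parseText(cfgFile.text)

// add the new top-level entry (building a new map rather than mutating the parsed one)
def updated = content + [tag: '1.1.1']

// serialize back to YAML and overwrite the file
def builder = new YamlBuilder()
builder(updated)
cfgFile.text = builder.toString()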
One disadvantage of this approach is that parsing and re-writing the YAML file will discard comments and reformat the entire file.

Related

nodejs templating yaml files

I'm looking for a Node module (or something else) that can inject runtime parameters from my program into YAML files.
For example, in Kubernetes YAMLs:
metadata:
  name: $PROJECT_NAME
  labels:
    service: $SERVICE_NAME
    system: $SYSTEM_ID
    app_version: $PROJECT_VERSION
    tier: app
Is there a nice way to build a new YAML file, or change the existing one, so that it contains all my parameter values?
I decided to use the Handlebars module.
Just give the function a template with the parameters that I want to substitute, and it will create a new file containing all my changes:
const Handlebars = require('handlebars');
const source = fs.readFileSync(`${cwd}/${file}`).toString();
const template = Handlebars.compile(source);
const contents = template({ PROJECT_NAME: `${name}`, PROJECT_VERSION: `${version}`, DOCKER_IMAGE: `${image}` });
fs.writeFileSync(`${cwd}/target/${file}`, contents);
console.log(`${file} -- Finish parsing YAML.`);
and the template looks like:
spec:
  containers:
    - name: {{PROJECT_NAME}}:{{PROJECT_VERSION}}
      resources:
        limits:
          memory: "1Gi"
          cpu: "1"
      image: {{DOCKER_IMAGE}}
YAML doesn't always need a template as it is structured data. As long as you don't need formatting/comments, objects can be read or dumped with js-yaml.
const yaml = require('js-yaml')
const fs = require('fs')

const kyaml = {
  metadata: {
    name: project_name,
    service: service_name,
    system: system_id,
    app_version: project_version,
    tier: 'app',
  }
}

fs.writeFile('new.yaml', yaml.safeDump(kyaml), 'utf8', err => {
  if (err) console.log(err)
})
Also you could possibly be doing things that helm can already do for you with templates.
Any template engine should work but the templated values should be escaped appropriately if there is a chance the values will produce encoding errors. Because YAML is a superset of JSON, JSON.stringify can be safely used as a valid YAML escape function.
Using Mustache templates, we ensure all templated values are escaped by setting the escape function:
const mustache = require('mustache')
mustache.escape = JSON.stringify
mustache.render(template, vars)
Using Handlebars we can produce valid YAML by disabling the default escaping and providing a "json" helper:
const handlebars = require('handlebars')
const render = handlebars.compile(template, { noEscape: true })
render(vars, { helpers: { json: JSON.stringify } })
The json helper needs to be used in the YAML template any time a value may require escaping:
metadata:
  name: {{json projectName}}
  labels:
    version: {{buildId}}
    tier: app
Without appropriate YAML escaping there could be encoding errors in the generated YAML document if values contain newlines or YAML special characters like "|". Some templating engines (e.g. Mustache and Handlebars) HTML-escape values by default, which will also produce encoding errors if HTML special characters are present in the values (e.g. a single quote, which is escaped as &#39;).

SnakeYAML by example

I am trying to read and parse a YAML file with SnakeYAML and turn it into a config POJO for my Java app:
// Groovy pseudo-code
class MyAppConfig {
    List<Widget> widgets
    String uuid
    boolean isActive

    // Ex: MyAppConfig cfg = new MyAppConfig('/opt/myapp/config.yaml')
    MyAppConfig(String configFileUri) {
        this(loadConfig(configFileUri))
    }

    private static HashMap<String,HashMap<String,String>> loadConfig(String configFileUri) {
        Yaml yaml = new Yaml();
        HashMap<String,HashMap<String,String>> values
        try {
            File configFile = Paths.get(ClassLoader.getSystemResource(configFileUri).toURI()).toFile();
            values = (HashMap<String,HashMap<String,String>>) yaml.load(new FileInputStream(configFile));
        } catch (FileNotFoundException | URISyntaxException ex) {
            throw new MyAppException(ex.getMessage(), ex);
        }
        values
    }

    MyAppConfig(HashMap<String,HashMap<String,String>> yamlMap) {
        super()
        // Here I want to extract keys from 'yamlMap' and use their values
        // to populate MyAppConfig's properties (widgets, uuid, isActive, etc.).
    }
}
Example YAML:
widgets:
  - widget1:
      name: blah
      age: 3000
      isSilly: true
  - widget2:
      name: blah meh
      age: 13939
      isSilly: false
uuid: 1938484
isActive: false
Since it appears that SnakeYAML only gives me a HashMap<String,HashMap<String,String>> to represent my config data, it seems as though it only supports two levels of nested mapped properties (the outer map and the inner map of type <String,String>)...
But what if widgets contains a list/sequence (say, fizzes) which contained a list of, say, buzzes, which contained yet another list, etc? Is this simply a limitation of SnakeYAML or am I using the API incorrectly?
To extract values out of this map, I need to iterate its keys/values and (seemingly) need to apply my own custom validation. Does SnakeYAML provide any APIs for doing this extraction + validation? For instance, instead of hand-rolling my own code to check to see if uuid is a property defined inside the map, it would be great if I could do something like yaml.extract('uuid'), etc. And then ditto for the subsequent validation of uuid (and any other property).
YAML itself contains a lot of powerful concepts, such as anchors and references. Does SnakeYAML handle these concepts? What if an end user uses them in the config file - how am I supposed to detect/validate/enforce them?!? Does SnakeYAML provide an API for doing this?
Do you mean like this:
@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.*
import org.yaml.snakeyaml.constructor.*
import groovy.transform.*

String exampleYaml = '''widgets:
                       |  - name: blah
                       |    age: 3000
                       |    silly: true
                       |  - name: blah meh
                       |    age: 13939
                       |    silly: false
                       |uuid: 1938484
                       |isActive: false'''.stripMargin()

@ToString(includeNames=true)
class Widget {
    String name
    Integer age
    boolean silly
}

@ToString(includeNames=true)
class MyConfig {
    List<Widget> widgets
    String uuid
    boolean isActive

    static MyConfig fromYaml(yaml) {
        Constructor c = new Constructor(MyConfig)
        TypeDescription t = new TypeDescription(MyConfig)
        t.putListPropertyType('widgets', Widget)
        c.addTypeDescription(t)
        new Yaml(c).load(yaml)
    }
}

println MyConfig.fromYaml(exampleYaml)
Obviously, that's a script to run in the Groovy console; you wouldn't need the @Grab line, as you probably already have the library in your classpath ;-)

Puppet: unable to get hiera variable

I've been using Hiera for several weeks now and everything was working fine until a few days ago, when I started to get this kind of message:
Error: Could not retrieve catalog from remote server: Error 400 on SERVER: Could not find data item nom in any Hiera data file and no default supplied on node d0puppetclient.victor-buck.com
Warning: Not using cache on failed catalog
Error: Could not retrieve catalog; skipping run
So I made a very simple test to check whether the problem came from my last code changes, and I'm still getting this message. I can't get Hiera variables anymore.
Below is the test I made:
hiera.yaml:
---
:backends:
  - yaml
:yaml:
  :datadir: /etc/puppet/hieradata
:hierarchy:
  - common
site.pp:
# /etc/puppet/manifests/site.pp
case $operatingsystem {
  'Solaris':           { include role::solaris }
  'RedHat', 'CentOS':  { include redhat::roles::common }
  /^(Debian|Ubuntu)$/: { include role::debian }
  # default:           { include role::generic }
}

case $hostname {
  /^d0puppetclient/: { include test }
}
test.pp:
class test {
  $nom = hiera('nom')
  file { "/root/test.txt":
    ensure => file,
    source => "/etc/puppet/test.txt.erb",
  }
}
test.txt.erb:
<%= nom %>
Any idea how to fix this? I thought this could be a file access rights issue, so I tried granting access on some files (755), but it's not working...
You need to define nom in your common.yaml in order for it to hold a value. You can set a default value and conditionally create the file if you don't plan on setting it.
class test {
  $nom = hiera('nom', false)
  if $nom {
    file { '/root/test.txt':
      ensure  => file,
      content => template('test/test.txt.erb')
    }
  }
}
Notice how I used content instead of source. When using ERB templates you need to specify the content using the template() function.
Using Templates
If you use source it is expecting a file rather than an erb template.
Hope this helps.

Reading file from Workspace in Jenkins with Groovy script

I want to add a Build step with the Groovy plugin to read a file and trigger a build fail depending on the content of the file.
How can I inject the workspace file path in the groovy plugin ?
myFileDirectory = // Get workspace filepath here ???
myFileName = "output.log"
myFile = new File(myFileDirectory + myFileName)
lastLine = myFile.readLines().get(myFile.readLines().size().toInteger() - 1)

if (lastLine ==~ /.*Fatal Error.*/) {
    println "Fatal error found"
    System.exit(1)
} else {
    println "nothing to see here"
}
I realize this question was about creating a plugin, but since the new Jenkins 2 Pipeline builds use Groovy, I found myself here while trying to figure out how to read a file from a workspace in a Pipeline build. So maybe I can help someone like me out in the future.
Turns out it's very easy: there is a readFile step, and I should have RTFM:
env.WORKSPACE = pwd()
def version = readFile "${env.WORKSPACE}/version.txt"
If you are trying to read a file from the workspace during a pipeline build step, there's a method for that:
readFile('name-of-file.groovy')
For reference, see https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#readfile-read-file-from-workspace.
Based on your comments, you would be better off with Text-finder plugin.
It allows you to search files, as well as the console output, for a regular expression and then set the build to either unstable or failed if a match is found.
As for the Groovy, you can use the following to access ${WORKSPACE} environment variable:
def workspace = manager.build.getEnvVars()["WORKSPACE"]
Although this question is only about finding the directory path ($WORKSPACE), I had a requirement to read a file from the workspace and parse it into a JSON object in order to read Sonar issues (ignoring minor/notes issues).
It might help someone; this is how I did it, using readFile:
jsonParse(readFile('xyz.json'))
and the jsonParse method:
@NonCPS
def jsonParse(text) {
    return new groovy.json.JsonSlurperClassic().parseText(text)
}
This will also require script approval under Manage Jenkins -> In-process Script Approval.
This may help someone with the same requirement.
It reads a file that contains Jenkins job names and runs them iteratively from one single job.
Please adjust the code below for your Jenkins setup.
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                script {
                    git branch: 'Your Branch name', credentialsId: 'Your credentials', url: 'Your BitBucket Repo URL'
                    // Read the file from the workspace that contains the Jenkins job names
                    def filePath = readFile "${WORKSPACE}/ Your File Location"
                    // Read the file line by line
                    def lines = filePath.readLines()
                    // Iterate and run the Jenkins jobs one by one
                    for (line in lines) {
                        build(job: "$line/branchName",
                            parameters: [
                                string(name: 'vertical', value: "${params.vert}"),
                                string(name: 'environment', value: "${params.env}"),
                                string(name: 'branch', value: "${params.branch}"),
                                string(name: 'project', value: "${params.project}")
                            ]
                        )
                    }
                }
            }
        }
    }
}
If you already have the Groovy (Postbuild) plugin installed, I think it's a valid desire to get this done with (generic) Groovy instead of installing a (specialized) plugin.
That said, you can get the workspace using manager.build.workspace.getRemote(). Don't forget to add File.separator between path and file name.
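For illustration, a rough sketch of the question's last-line check written as a Groovy Postbuild script; the manager binding comes from the plugin, and output.log and the regex are taken from the question:
def workspacePath = manager.build.workspace.getRemote()
// note: new File only works if the build ran on the master node
def logFile = new File(workspacePath + File.separator + 'output.log')

def lastLine = logFile.readLines().last()
if (lastLine ==~ /.*Fatal Error.*/) {
    manager.addShortText('Fatal error found')
    // mark the build as failed instead of calling System.exit
    manager.buildFailure()
}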
As mentioned in a different post, Read .txt file from workspace groovy script in Jenkins, I was struggling to make this work for the POM modules of a file in the workspace, in the Extended Choice Parameter. Here is my solution, with the printlns:
import groovy.util.XmlSlurper
import java.util.Map
import jenkins.*
import jenkins.model.*
import hudson.*
import hudson.model.*

try {
    // get Jenkins instance
    def jenkins = Jenkins.instance
    // get job item
    def item = jenkins.getItemByFullName("The_JOB_NAME")
    println item
    // get workspacePath for the job item
    def workspacePath = jenkins.getWorkspaceFor(item)
    println workspacePath

    def file = new File(workspacePath.toString() + "\\pom.xml")
    def pomFile = new XmlSlurper().parse(file)
    def pomModules = pomFile.modules.children().join(",")
    return pomModules
} catch (Exception ex) {
    println ex.message
}

Can groovy heredocs be internationalized?

I have some multiline strings that are presented to the user and stored as heredocs. So rather than a 'normal' (Java) properties file, a Groovy-based one (see here) to be consumed by ConfigSlurper was used, and it works great. Sorry if this is a dumb question, but can that be easily internationalized? If so, can you outline how that is accomplished?
My solution: in your ConfigSlurper config you should store keys to the internationalized strings. Inject messageSource and localeResolver in your controller/service, get the key from your ConfigSlurper and find the localized string in your i18n messages.properties file. Example (not sure the code is correct, but it's the main idea):
def config = new ConfigSlurper().parse(new File('src/Config.groovy').toURL())
//localized string1 value
def msg = messageSource.getMessage(config.data1.string1, null, localeResolver.defaultLocale)
As far as I know, ConfigSlurper does not have special support for i18n.
You may achieve it by leveraging its support for environments and creating an environment closure per locale. For example:
environments {
    english {
        sample {
            hello = "hello"
        }
    }
    spanish {
        sample {
            hello = "hola"
        }
    }
}
When creating the ConfigSlurper you will need to pass the desired language:
def config = new ConfigSlurper("spanish")
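Putting the two pieces together, a small usage sketch (assuming the environments block above is saved as Config.groovy):
def configUrl = new File('Config.groovy').toURI().toURL()

def spanish = new ConfigSlurper('spanish').parse(configUrl)
assert spanish.sample.hello == 'hola'

// switching locale is just a matter of constructing the slurper with a different environment name
def english = new ConfigSlurper('english').parse(configUrl)
assert english.sample.hello == 'hello'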
