Is there a way to check in manifest files if a given class exists?
I want to do something like this:
class foo {
if exists( Class["foo::${lsbdistcodename}"] ) {
include foo::${lsbdistcodename}
}
}
So I can easily add distribution/version-specific classes which are then automatically included.
You should use the defined function instead of exists.
The following snippet works for me:
class foo {
if defined( "foo::${lsbdistcodename}") {
notify {'defined':}
include "foo::${lsbdistcodename}"
}
}
class foo::precise {
notify{'precise':}
}
[assuming you're running puppet version > 2.6.0]
How to overwrite a defined type in nodes.pp? I want to be able to set a custom domain using nodes.pp. Case default isn't an option.
I'm using Puppet 6.0.
The following method doesn't work. It says Could not find declared class resolv::resolv_config.
It looks like it used to work in 3.0 according to this answer.
nodes.pp
node "test001" {
class { 'resolv::resolv_config':
domain => "something.local",
}
}
modules/resolv/manifests/init.pp
class resolv {
case $hostname {
/^[Abc][Xyz]/: {
resolv::resolv_config { 'US':
domain => "mydomain.local",
}
}
}
}
define resolv::resolv_config($domain){
file { '/etc/resolv.conf':
content => template("resolv/resolv.conf.erb"),
}
}
resolv.conf.erb
domain <%= @domain %>
There are a couple of problems here, but the one causing the "could not find declared class" error is that you are using the wrong syntax for declaring a defined type. Your code should be something like this:
node "test001" {
resolv::resolv_config { 'something.local':
domain => "something.local",
}
}
There are examples of declaring defined types in the documentation: https://puppet.com/docs/puppet/latest/lang_defined_types.html.
Once you get that working, you'll find another problem, in that this definition
define resolv::resolv_config($domain){
file { '/etc/resolv.conf':
content => template("resolv/resolv.conf.erb"),
}
}
will cause an error if you try to declare more than one resolv::resolv_config, because both would try to declare the /etc/resolv.conf file resource. You almost certainly wanted to use a file_line resource.
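For illustration, here's a sketch of the same define rewritten with file_line (this assumes the puppetlabs-stdlib module, which provides file_line, is installed; the match pattern is an assumption about your resolv.conf layout):
define resolv::resolv_config($domain) {
  file_line { "resolv domain ${title}":
    path  => '/etc/resolv.conf',
    line  => "domain ${domain}",
    match => '^domain ',
  }
}
Unlike a file resource, multiple file_line resources may manage the same file, so declaring more than one resolv::resolv_config no longer causes a duplicate declaration error.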
I know the issue is something to do with the catalog; I just can't figure out how to work around it.
I have the following code and I get the following error:
class test1 {
file { '/tmp/test.txt':
ensure => present,
content => 'name=joe',
}
}
class test2 {
$test = file('/tmp/test.txt')
notify { $test: }
}
class test3 {
class { 'test1': } ->
class { 'test2': }
}
puppet apply -e "include test3"
Error: Could not find any files from test.txt at ../modules/test2/manifests/init.pp
So essentially, I am trying to read a file before it exists, and the ordering doesn't appear to be working. Any ideas how I can work around this?
Based on the description of the function you are trying to use, it will never operate in the fashion you are expecting:
file:
Loads a file from a module and returns its contents as a string.
Note that file() is a Puppet function, so it is evaluated on the master while the catalog is being compiled, before any resources are applied; resource ordering therefore cannot make the file exist first. Effectively, what this means is that the file would need to exist in the module at
test1/files/test.txt
and would be loaded using:
file('test1/test.txt') i.e. file(<MODULENAME>/<FILENAME>)
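For example, a minimal sketch, assuming the file is shipped inside the test1 module rather than created during the run:
# modules/test1/files/test.txt must exist in the module before compilation
class test2 {
  # Resolves to modules/test1/files/test.txt at compile time
  $test = file('test1/test.txt')
  notify { $test: }
}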
I would like feedback on the best practices for defining plugin tasks that depend on external state (i.e. defined in the build.gradle that referenced the plugin). I'm using extension objects and closures to defer accessing those settings until they're needed and available. I'm also interested in sharing state between tasks, e.g. configuring the outputs of one task to be the inputs of another.
The code uses "project.afterEvaluate" to define the tasks once the required settings have been configured through the extension object. This seems more complex than it should need to be. If I move the code out of the "afterEvaluate", compileFlag comes back null rather than the externally configured value. If the code is changed to use the << or doLast syntax, then it does get the external flag... but then it fails to work with type: Exec and other similarly helpful task types.
I feel that I'm fighting Gradle in some ways, which means I don't understand better how to work well with it. The following is a simplified pseudo-code of what I'm using. This works but I'm looking to see if this can be simplified, or indeed what the best practices are. Also, the exception shouldn't be thrown unless the tasks are being executed.
apply plugin: MyPlugin
class MyPluginExtension {
String compileFlag = null
}
class MyPlugin implements Plugin<Project> {
void apply(Project project) {
project.extensions.create("myPluginConfig", MyPluginExtension)
project.afterEvaluate {
// Closure delays getting and checking flag until strictly needed
def compileFlag = {
if (project.myPluginConfig.compileFlag == null) {
throw new InvalidUserDataException(
"Must set compileFlag: myPluginConfig { compileFlag = '-flag' }")
}
return project.myPluginConfig.compileFlag
}
// Inputs for translateTask
def javaInputs = {
project.files(project.fileTree(
dir: project.projectDir, includes: ['**/*.java']))
}
// This is the output of the first task and input to the second
def translatedOutputs = {
project.files(javaInputs().collect { file ->
return file.path.replace('src/', 'build/dir/')
})
}
// Translates all java files into 'translatedOutputs'
project.tasks.create(name: 'translateTask', type:Exec) {
inputs.files javaInputs()
outputs.files translatedOutputs()
executable '/bin/echo'
inputs.files.each { file ->
args file.path
}
}
// Compiles 'translatedOutputs' to binary
project.tasks.create(name: 'compileTask', type:Exec, dependsOn: 'translateTask') {
inputs.files translatedOutputs()
outputs.file project.file(project.buildDir.path + '/compiledBinary')
executable '/bin/echo'
args compileFlag()
translatedOutputs().each { file ->
args file.path
}
}
}
}
}
I'd look at this problem another way. It seems like what you want to put in your extension is really owned by each of your tasks. If you had something that was a "global" plugin configuration option, would it be treated as an input necessarily?
Another way of doing this would have been to use your own SourceSets and wire those into your custom tasks. That's not quite easy enough yet, IMO. We're still pulling together the JVM and native representations of sources.
I'd recommend extracting your Exec tasks as custom tasks with a @TaskAction that does the heavy lifting (even if it just calls project.exec {}). You can then annotate your inputs with @Input, @InputFiles, etc. and your outputs with @OutputFiles, @OutputDirectory, etc. Those annotations will help auto-wire your dependencies and inputs/outputs (I think that's where some of the fighting is coming from).
Another thing that you're missing is that if the compileFlag affects the final output, you'd want to detect changes to it and force a rebuild (but not a re-translate).
I simplified the body of the plugin class by using the Groovy .with method.
I'm not completely happy with this (I think the translatedFiles could be done differently), but I hope it shows you some of the best practices. I made this a working example (as long as you have a src/something.java) by implementing the translate as a copy/rename and the compile as something that just creates an 'executable' file (its contents are just the list of the inputs). I've also left your extension class in place to demonstrate the "global" plugin config. Also take a look at what happens when compileFlag is not set (I wish the error was a little better).
The translateTask isn't going to be incremental (although, I think you could probably figure out a way to do that). So you'd probably need to delete the output directory each time. I wouldn't mix other output into that directory if you want to keep that simple.
HTH
apply plugin: 'base'
apply plugin: MyPlugin
class MyTranslateTask extends DefaultTask {
@InputFiles FileCollection srcFiles
@OutputDirectory File translatedDir
@TaskAction
public void translate() {
// println "toolhome is ${project.myPluginConfig.toolHome}"
// translate java files by renaming them
project.copy {
includeEmptyDirs = false
from(srcFiles)
into(translatedDir)
rename '(.+).java', '$1.m'
}
}
}
class MyCompileTask extends DefaultTask {
@Input String compileFlag
@InputFiles FileCollection translatedFiles
@OutputDirectory File outputDir
@TaskAction
public void compile() {
// write inputs to the executable file
project.file("$outputDir/executable") << "${project.myPluginConfig.toolHome} $compileFlag ${translatedFiles.collect { it.path }}"
}
}
class MyPluginExtension {
File toolHome = new File("/some/sane/default")
}
class MyPlugin implements Plugin<Project> {
void apply(Project project) {
project.with {
extensions.create("myPluginConfig", MyPluginExtension)
tasks.create(name: 'translateTask', type: MyTranslateTask) {
description = "Translates all java files into translatedDir"
srcFiles = fileTree(dir: projectDir, includes: [ '**/*.java' ])
translatedDir = file("${buildDir}/dir")
}
tasks.create(name: 'compileTask', type: MyCompileTask) {
description = "Compiles translated files into outputDir"
translatedFiles = fileTree(tasks.translateTask.outputs.files.singleFile) {
include '**/*.m'
builtBy tasks.translateTask
}
outputDir = file("${buildDir}/compiledBinary")
}
}
}
}
myPluginConfig {
toolHome = file("/some/custom/path")
}
compileTask {
compileFlag = '-flag'
}
I have an application which can run scripts to automate certain tasks. I'd like to use metaprogramming in these scripts to optimize code size and readability. So instead of:
def res
try {
    res = factory.allocate()
    ... do something with res ...
} finally {
    res?.close()
}
I'd like to
Factory.metaClass.withResource = { Closure c ->
    def res
    try {
        res = delegate.allocate()
        c(res)
    } finally {
        res?.close()
    }
}
so the script writers can write:
factory.withResource { res ->
... do something with res ...
}
(and I could do proper error handling, etc).
Now I wonder when and how I can/should implement this. I could attach the metaclass manipulation in a header which I prepend to every script, but I'm worried about what would happen if two users ran scripts at the same time (concurrent access to the metaclass).
What is the scope of the metaclass? The compiler? The script environment? The Java VM? The classloader which loaded Groovy?
My reasoning is that if Groovy metaclasses have VM scope, then I could run a setup script once during startup.
Metaclasses exist per classloader [citation needed]:
File /tmp/Qwop.groovy:
class Qwop { }
File /tmp/Loader.groovy:
Qwop.metaClass.bar = { }
qwop1 = new Qwop()
assert qwop1.respondsTo('bar')
loader = new GroovyClassLoader()
clazz = loader.parseClass new File("/tmp/Qwop.groovy")
clazz.metaClass.brap = { 'oh my' }
qwop = clazz.newInstance()
assert !qwop.respondsTo('bar')
assert qwop1.respondsTo('bar')
assert qwop.brap() == "oh my"
assert !qwop1.respondsTo('brap')
// here be custom classloaders
new GroovyShell(loader).evaluate new File('/tmp/QwopTest.groovy')
And a script to test the scoped metaclass (/tmp/QwopTest.groovy):
assert !new Qwop().respondsTo('bar')
assert new Qwop().respondsTo('brap')
Execution:
$ groovy Loader.groovy
$
If you have a set of well defined classes you could apply metaprogramming on top of the classes loaded by your classloader, as per the brap method added.
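Given that scoping, running a setup script once during startup should work, as long as every user script is evaluated through the same classloader. A minimal sketch (file names are illustrative):
// Startup: patch the metaclass once, in the loader shared by all scripts
def loader = new GroovyClassLoader(Factory.classLoader)
new GroovyShell(loader).evaluate(new File('setup.groovy')) // adds withResource
// Later: user scripts evaluated against the same loader see the patch
new GroovyShell(loader).evaluate(new File('userScript.groovy'))
Because the patched metaclass hangs off the Factory class itself, concurrent scripts share it; patching once at startup avoids repeatedly modifying the metaclass from concurrently running scripts.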
Another option for this sort of thing, which is better for a lot of scenarios, is to use an extension module.
package demo
class FactoryExtension {
static withResource(Factory instance, Closure c) {
def res
try {
res = instance.allocate()
c(res)
} finally {
res?.close()
}
}
}
Compile that and put it in a jar file which contains a file at META-INF/services/org.codehaus.groovy.runtime.ExtensionModule that contains something like this...
moduleName=factory-extension-module
moduleVersion=1.0
extensionClasses=demo.FactoryExtension
Then in order for someone to use your extension they just need to put that jar file on their CLASSPATH. With all of that in place, a user could do something like this...
factoryInstance.withResource { res ->
... do something with res ...
}
More information on extension modules is available at http://docs.groovy-lang.org/docs/groovy-2.3.6/html/documentation/#_extension_modules.
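If you happen to build the jar with Gradle, a minimal sketch of the build script (the Groovy version and source layout here are assumptions):
// build.gradle for the extension module
apply plugin: 'groovy'
repositories { mavenCentral() }
dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.3.6'
}
// Source layout:
//   src/main/groovy/demo/FactoryExtension.groovy
//   src/main/resources/META-INF/services/org.codehaus.groovy.runtime.ExtensionModule
// The jar task packages the descriptor automatically.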
I want to override parameters of base nodes. What I want to get is a pattern like this:
# File manifests/nodes.pp
node myDefault {
class { 'my::common::puppet_setup':
service => 'enable',
pushable => 'disable',
}
# Do lots of default things ...
}
node 'myFirstNode' inherits myDefault {
# Do something ...
}
node 'mySecondNode' inherits myDefault {
class { 'my::common::puppet_setup::params':
service => 'disable',
pushable => 'enable',
}
}
From my understanding of the Puppet documentation, I could do this by writing my module like this:
# File modules/my/manifests/common/puppet_setup.pp
class my::common::puppet_setup (
$pushable = $my::common::puppet_setup::params::pushable,
$service = $my::common::puppet_setup::params::service,
) inherits my::common::puppet_setup::params {
# package that configures puppet node
# input value validation
validate_re($pushable, ['^enable$', '^disable$', '^ignore$', ])
validate_re($service, ['^enable$', '^disable$', '^ignore$', '^cron$', ])
# setup puppet, start or disable agent, put ssh keys for push ...
}
class my::common::puppet_setup::params {
$pushable = 'enable'
$service = 'enable'
$puppetserver = 'puppet.my.site.de'
case $::osfamily {
'Debian': {
}
default: {
fail("not implemented yet for {::operatingsystem}")
}
}
}
The documentation on the Puppet website says:
When a derived class is declared, its base class is automatically declared first (if it wasn’t already declared elsewhere).
But i get this error (some indentation added):
mySecondNode# puppet agent --test --environment dev_my
Error: Could not retrieve catalog from remote server:
Error 400 on SERVER: Duplicate declaration:
Class[My::Common::Puppet_setup::Params] is already declared;
cannot redeclare at /.../puppet/manifests/nodes.pp:16 on node mySecondNode
Warning: Not using cache on failed catalog
Error: Could not retrieve catalog; skipping run
I've been reading up on this for a week and I guess my understanding is totally wrong somewhere, although I used the puppetlabs ntp module as an example.
What am I missing?
You should check the Inheritance section of http://docs.puppetlabs.com/puppet/latest/reference/lang_node_definitions.html
Puppet treats node definitions like classes. It does not mash the two together and then compile the mix; instead, it compiles the base class, then compiles the derived class, which gets a parent scope and special permission to modify resource attributes from the base class.
In your case, my::common::puppet_setup::params is already declared by the time mySecondNode is compiled (the base node declares my::common::puppet_setup, which inherits the params class), so the resource-like class declaration in mySecondNode is a duplicate.
One good solution is to use roles and profiles; there's a great blog post about it:
http://garylarizza.com/blog/2014/02/17/puppet-workflow-part-2/
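For illustration, a minimal sketch of that pattern applied to your module (the profile and role class names are made up):
# modules/profile/manifests/puppet_setup.pp
class profile::puppet_setup (
  $service  = 'enable',
  $pushable = 'disable',
) {
  class { 'my::common::puppet_setup':
    service  => $service,
    pushable => $pushable,
  }
}
# modules/role/manifests/base.pp
class role::base {
  include profile::puppet_setup
}
Nodes then include a role instead of inheriting from a base node, and per-node differences become profile parameters (typically supplied via Hiera).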
You can use virtual resources:
http://docs.puppetlabs.com/guides/virtual_resources.html
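As a minimal sketch of that approach (the user resource is only illustrative):
# Declare the resource virtually; nothing is managed yet
@user { 'deployer':
  ensure => present,
}
# Realize it only on nodes that actually need it
realize User['deployer']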