I have a pipeline Groovy script, which I load from a different script:
load("path/to/my/script/pipeline.groovy")
Now, in this script, I want to load another Groovy script, but I do not know the full path/to/my/script prefix. I tried:
load("./subfolder/subscript.groovy")
But it cannot find the script this way. Can I somehow load a Groovy script relative to the current script file?
You may want to consider using the shared library plugin if you are loading multiple remote scripts.
If the Groovy file exists in a subfolder, you can use the findFiles step (from the Pipeline Utility Steps plugin) to locate it:
def subscript = findFiles(glob: '**/subscript.groovy')
load(subscript[0].path)
One way would be to fetch the second script into your working directory (with curl, for example); from there the first script can find and load it.
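For example, a rough sketch (the URL and the subscript.groovy name are placeholders, and curl is only one way to fetch the file):
node {
    // fetch the helper script into the workspace first
    sh 'curl -fsSL -o subscript.groovy https://example.com/scripts/subscript.groovy'
    // a plain relative load now works, because the load step resolves the path against the current workspace directory
    def helper = load 'subscript.groovy'
}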
Related
In Cucumber, after a new project directory is made and a .feature file is created and saved in an editor, how does the file get found by Cucumber when Cucumber is run? Is the file imported into Cucumber manually, or does the tool scan the whole system automatically and map itself to the file?
By default Cucumber will load all files in the 'features' folder in the root directory (recursively).
If you want to use a different location you can run Cucumber with the command 'cucumber myfolder' which will look for features in a folder called myfolder in the project root.
It gets a bit more complicated when using subdirectories for features. From this site (copied here for the record): http://makandracards.com/makandra/4971-how-to-organize-and-execute-cucumber-features-e-g-in-subdirectories
By default, cucumber loads all *.rb files it can find (recursively) within the directory you pass as an argument to cucumber.
$ cucumber # defaults to directory "features"
$ cucumber features
$ cucumber my/custom/features/dir
So, if you would like to organize features in subdirectories, you won't have any problems when running the whole test-suite. Cucumber will automatically load and run features in subdirectories, too.
However, running only the features in a subdirectory does not work out of the box. The reason for this is that Cucumber will then look for your step definitions and support files within that subdirectory.
What you can do is either provide all the needed support files and step definitions within the subdirectories as well (not practical), OR use the -r command line argument when running the subdirectory features:
cucumber -r features
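For example, to run only the features in one subdirectory while still picking up the shared step definitions and support code from features/, something like this should work (the subdirectory name is just an example):
$ cucumber -r features features/some/subdir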
In your test runner class you can specify the path to where your feature files are located.
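For example, a minimal JUnit runner could look like this sketch (the package names match a recent cucumber-junit version, and the features path and glue package are placeholders):
import org.junit.runner.RunWith;
import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;

@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/test/resources/features",  // where the .feature files live
        glue = "com.example.steps")                // package containing the step definitions
public class RunCucumberTest {
}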
I have a library that contains multiple bash scripts on Linux. The main application, say myApp.sh, will call different bash scripts to perform its tasks.
I have to compile all of these scripts into ONE binary file. With the SHC tool I can only compile one script at a time, and if I remove the other files, the references inside myApp.sh are broken.
So how can I compile all of the bash scripts into one file and then run SHC on it?
I have a .jar file that reads two files from its current folder and produces as output a .txt file plus a separate folder with multiple other .txt files. This works perfectly on Windows, using this line to get the working directory:
static String dir = System.getProperty("user.dir");
I used the instructions here: https://askubuntu.com/questions/192914/how-run-a-jar-file-with-a-double-click to set up my .jar file to run on a simple double-click, but right now it does nothing when double-clicked. My guess is that the above line of code does not translate well to Linux. Does anybody know how to resolve this?
First, try running it on the command-line, with
java -jar <file.jar>
The user.dir property is cross-platform, so it should not be the problem. However, are you using the correct file separators? Remember it's '/' on UNIX and '\' on Windows.
Try java -jar Jarname.jar and pass the other files as arguments after this command.
The line of code you gave works fine on Linux.
My best guess is that you're then trying to use this directory path by appending a Windows-specific path separator (like path + "\subdir"), which isn't appropriate for Linux (you should build a new File object instead).
Either that, or your jar file isn't being executed at all. Have you tried doing something very simple in your jar file to see if anything is being run? Have you tried running your jar with java -jar myapp.jar to see if any exceptions are thrown or error messages displayed?
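For instance, instead of concatenating with a hard-coded backslash, building paths with File objects is portable (a sketch; the "output" and "report.txt" names are only illustrative):
import java.io.File;

public class PathDemo {
    public static void main(String[] args) {
        // user.dir is the working directory the JVM was started in
        File baseDir = new File(System.getProperty("user.dir"));
        // File inserts the correct separator for the platform, so no '\' vs '/' issues
        File outDir = new File(baseDir, "output");
        outDir.mkdirs();                              // create the folder if it is missing
        File report = new File(outDir, "report.txt");
        System.out.println("Writing to " + report.getAbsolutePath());
    }
}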
You will need to manually tweak your build process to get the jar file marked as executable. In your build.xml file there is a hook target, "-post-jar", that is called after the jar is built. You'll need to create that target and use Ant's chmod task to modify your jar. Once you do that, the chmod will run every time you build a jar in that project.
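A rough sketch of such a target (this assumes a NetBeans-style build where the ${dist.jar} property points at the built jar; adjust the property name for your project):
<!-- in build.xml: hook target invoked after the jar has been built -->
<target name="-post-jar">
    <!-- mark the freshly built jar as executable -->
    <chmod file="${dist.jar}" perm="ugo+x"/>
</target>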
It will run fine as long as you have a JRE installed.
I want to use the "This build is parameterized" option in Jenkins (Hudson), but instead of defining String parameters one by one, I want to load the settings from an external file that contains all the parameters (name=value lines).
I found the Parameterized Trigger plugin, which can pass a parameters file, but only to a build it triggers after the current one; I need the file to be read in the first build itself.
Thank you.
I would suggest that you specify the path to the file as a String parameter; call it PARAMS_FILE.
The file should look like this:
VAR1=someValue
VAR2=someOtherValue
If you use bash in your build steps, then you could do:
. ${PARAMS_FILE}
Put that at the beginning of an Execute shell build step, and the parameters will be set for that shell.
That would solve the problem for shell build steps at least.
If you use other kinds of build steps, you would need a different solution.
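To make the shell approach concrete, an Execute shell build step along these lines would do it (VAR1 and VAR2 are just the names from the sample file above):
#!/bin/bash
# source the properties file whose path was passed in the PARAMS_FILE string parameter
. "${PARAMS_FILE}"
# the variables defined in that file are now available in this shell
echo "Building with VAR1=${VAR1} and VAR2=${VAR2}"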
Here is what I need to do in SCons; at present I'm not able to get this to work correctly.
Firstly, I need to run Perl script 1. This generates a series of .cpp files.
Then I need to run Perl script 2. This generates another series of .cpp files.
Then I need to take the .cpp files that have been created by executing the two Perl scripts and build a static library from them.
I use a custom builder to execute the Perl scripts. I don't want to define the target list manually, as it can change depending on the file the Perl scripts use to generate the source files.
Any help would be much appreciated.
Thanks,
D
For running the Perl scripts you just need to use standard Python code:
import subprocess
subprocess.call(['perl', ...args...])
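Concretely, that might look like this (the script name and arguments are hypothetical):
import subprocess

# run the generator; check_call raises an exception if the script exits non-zero,
# so the build fails instead of silently continuing
subprocess.check_call(['perl', 'generate_sources.pl', '--out', 'gen'])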
For building the static lib, try something like this:
env = Environment()
env.StaticLibrary('example', Glob('*.cpp'))
where Glob('*.cpp') generates a list of all .cpp files. If you already have a customized environment, just use it instead of env in my sample.
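Putting the two parts together, a minimal SConstruct sketch could look like the following (it assumes the generated file names are known up front; gen1.pl, gen2.pl and the listed .cpp targets are placeholders, not names from your project):
env = Environment()

# declare each Perl script as a command that produces its .cpp files;
# $SOURCE expands to the script itself
gen1 = env.Command(['foo_a.cpp', 'foo_b.cpp'], 'gen1.pl', 'perl $SOURCE')
gen2 = env.Command(['bar_a.cpp', 'bar_b.cpp'], 'gen2.pl', 'perl $SOURCE')

# the library depends on the generated sources, so SCons runs the scripts first
env.StaticLibrary('example', gen1 + gen2)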