gradle get relative resource path - groovy

When I iterate over the source repository, I do it like this:
def resourceDir = proj.sourceSets.main.output.resourcesDir
resourceDir.eachFileRecurse(groovy.io.FileType.FILES) { file -> // only files will be recognized
    def path = FilenameUtils.separatorsToUnix(file.toString())
    if (FilenameUtils.getExtension(file.toString()) in supportedResourceExt) {
        proj.logger.lifecycle("Reading file {}.", file)
        //.....
    }
}
In the log it writes this:
Reading file D:\PROJECT_FOLDER\project\subproject\subsubproject\build\resources\main\com\package\something\file.txt
How can I get only the part starting with com\package\something\file.txt, without explicitly extracting it with something like file.substring(file.indexOf(...))?
Maybe it's possible to relativize it against the project path somehow?

It seems that:
proj.logger.lifecycle("Reading file {}.", file.absolutePath - resourceDir.absolutePath)
should work. I can't check it right now.
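If it doesn't, relativizing through java.nio should give the same result without any string arithmetic (an untested sketch, reusing the resourceDir and file variables from the question):
def relative = resourceDir.toPath().relativize(file.toPath())
proj.logger.lifecycle("Reading file {}.", relative) // e.g. com\package\something\file.txt on Windows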

Related

Need to check if folder exists in workspace in groovy script for jenkins pipeline

I need to check if a specific folder exists; I cannot give the full path, as some of the folder names will be different each time.
I used the code below:
echo "checking if folder exists"
def files = findFiles glob: '**/*example*'
echo """${files[0].name} ${files[0].path} ${files[0].directory} ${files[0].length} ${files[0].lastModified}"""
example is a folder which is inside -
java-maven-app/src/main/java/com/example
The error I am getting in the pipeline is -
You are getting that error because your files list doesn't have any content. AFAIK, findFiles is not capable of finding directories recursively. If you have any known files within the directory you are looking for, you may be able to get the full path to that file using findFiles and determine whether your directory exists, but that will not work if the directory is empty. As a better solution, you can use the following script to get a list of all directories recursively.
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                script {
                    def directories = getDirectories("$WORKSPACE")
                    echo "$directories"
                }
            }
        }
    }
}
@NonCPS
def getDirectories(path) {
    def dir = new File(path)
    def dirs = []
    dir.traverse(type: groovy.io.FileType.DIRECTORIES, maxDepth: -1) { d ->
        dirs.add(d)
    }
    return dirs
}

Compile time check if file at path exists? like include_str!(..)

I like how include_str!(..) works. Is there a macro that simply checks if the file exists instead of loading the contents of the file?
Use case? I want to make sure that all the file paths are valid before I release, to prevent runtime errors.
So the file path has to be checked even if the macro isn't called during runtime.
OR should I be using tests here?
This will do for now.
#[macro_export]
macro_rules! find_file {
    ($arg1:literal) => {
        {
            // opportunity for improvement
            let _ = include_bytes!($arg1);
            let r = $arg1;
            r
        }
    };
}
@PitaJ, thanks.

Flatten first directory of a FileTree in Gradle

I'm writing a task to extract a tarball into a directory. I don't control this tarball's contents.
The tarball contains a single directory which contains all the files I actually care about. I want to pull everything out of that directory and copy that into my destination.
Example:
/root/subdir
/root/subdir/file1
/root/file2
Desired:
/subdir
/subdir/file1
/file2
Here's what I tried so far, but this seems like a really goofy way of doing it:
copy {
    eachFile {
        def segments = it.getRelativePath().getSegments() as List
        it.setPath(segments.tail().join("/"))
        return it
    }
    from tarTree(resources.gzip('mytarfile.tar.gz'))
    into destinationDir
}
For each file, I get the elements of its path, remove the first, join that with /, then set that as the file's path. And this works...sort of. The problem is that this creates the following structure as a result:
/root/subdir
/root/subdir/file1
/root/file2
/subdir
/subdir/file1
/file2
I'm fine with just removing the root directory myself as a final action of the task, but I feel like there should be a much simpler way of doing this.
AFAIK, the only way is to unpack the zip, tar, or tgz file :(
There is an open issue here
Please go vote for it!
Until then, the solution isn't very pretty, but not that hard either. In the example below, I am assuming that you want to remove the 'apache-tomcat-XYZ' root-level directory from a 'tomcat' configuration that only includes the apache-tomcat zip file.
def unpackDir = "$buildDir/tmp/apache.tomcat.unpack"

task unpack(type: Copy) {
    from configurations.tomcat.collect {
        zipTree(it).matching {
            // these would be global items I might want to exclude
            exclude '**/EMPTY.txt'
            exclude '**/examples/**', '**/work/**'
        }
    }
    into unpackDir
}

def mainFiles = copySpec {
    from {
        // use of a closure here defers evaluation until execution time
        // It might not be clear, but this next line "moves down"
        // one directory and makes everything work
        "${unpackDir}/apache-tomcat-7.0.59"
    }
    // these excludes are only made up for an example
    // you would only use/need these here if you were going to have
    // multiple such copySpec's. Otherwise, define everything in the
    // global unpack above.
    exclude '**/webapps/**'
    exclude '**/lib/**'
}

task createBetterPackage(type: Zip) {
    baseName 'apache-tomcat'
    with mainFiles
}

createBetterPackage.dependsOn(unpack)
Using Groovy's syntax, we can use a regex to eliminate the first path segment:
task myCopyTask(type: Copy) {
    eachFile {
        path -= ~/^.+?\//
    }
    from tarTree(resources.gzip('mytarfile.tar.gz'))
    into destinationDir
    includeEmptyDirs = false // ignore empty directories
}

Variable project configuration is bound to in Groovy Axis plugin for Jenkins

I have a multi-configuration build for which I'd like essentially one build to be run for each file matching foo/*/bar/*.xml. I figured the GroovyAxis Plugin would be a nice fit, but I cannot find any documentation on how the build configuration can be accessed from within the script, so I cannot read the workspace directory from anywhere.
Running something like return new File('.').listFiles().collect{it.toString()} returns all files within the root directory of the server.
Can anyone point me in the right direction?
It took a while to figure this out, but here is a solution. Note that since the Groovy script runs on the master, you must use FilePath to access the files on the slave.
import hudson.FilePath

def workspace = context?.build?.workspace
if (null == workspace) {
    return ['noworkspace'] // avoid returning 'default' so the user has a chance of figuring out what went wrong
}

def configDir = workspace.toString() + '/openpower/configs/'
def dir = new FilePath(workspace.channel, configDir)
def files = []
dir.list().each {
    def name = it.getName()
    if (name.endsWith('_defconfig')) {
        files << name.replace('_defconfig', '')
    }
}
return files
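To adapt this to the foo/*/bar/*.xml layout from the question, FilePath also accepts Ant-style globs. A sketch, under the assumption that the full remote paths are acceptable as axis values:
import hudson.FilePath

def workspace = context?.build?.workspace
if (null == workspace) {
    return ['noworkspace']
}
// list(String) evaluates the Ant-style glob on the agent over the same channel
def matches = workspace.list('foo/*/bar/*.xml')
return matches ? matches.collect { it.getRemote() } : ['nomatches']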

How to read a file from a custom path in C++?

My file is located at C:\\Input\\pStep.p21. I want to open that file in my C++ program. How can I do it? I am using char* inputPath = "C:\\Input\\pStep.p21"; but it's not finding my file in the program. How do I get the current working directory in VC++? It works if I try to save a p21 file but fails if I read it.
My code in CAA:
#include <CATSDM_Services>
#include <SdaiModel.h>
#include <CATIUniCodeString>
---
---
main()
{
    CATIUniCodeString inputPath("C:\\Input\\pStep.p21");
    HRESULT hr = S_OK;
    SdaiModel edxModel = Null;
    // this method reads express schema name, input p21 file and sdaimodel
    hr = CreateModelFromFile(inputPath, "parts", edxModel);
    if (FAILED(hr))
    {
        cout << "Model created succesfully";
    }
    else
    {
        cout << "Failed";
    }
}
Sorry for not seeing the question earlier.
From what I am seeing in your code, the test is wrong. The FAILED() macro denotes incorrect execution. Thus, from your code, if you see "Failed" on the console, it actually means that the execution of the method ran OK.
Change your code to something like:
if (SUCCEEDED(hr))
{
    cout << "Model created succesfully";
}
else
{
    cout << "Failed";
}
