Unable to recognize WORKSPACE as directory using Jenkinsfile/pipeline plugin - groovy

I am trying to search for a file recursively inside a directory, hence I cannot use findFiles.
I have seen the directories by manually logging in to the slave, but they are not recognized in the code below. isDirectory() returns false, so dir.listFiles() later returns null.
Below is the code:
def recursiveFileSearch(File dir, filename, filesPath) {
    File[] files = dir.listFiles() // It returns null here as it cannot recognize it as a directory
    echo "$files"
    for (int i = 0; i < files.size(); i++) {
        if (files[i].isDirectory()) {
            recursiveFileSearch(files[i], filename, filesPath)
        } else {
            if (files[i].getAbsolutePath().contains(filename)) {
                filesPath.add(files[i].getAbsolutePath())
                return filesPath
            }
        }
    }
    return filesPath
}
node('maven') {
    git 'https://github.com/rupalibehera/t3d.git'
    sh 'mvn clean install'
    File currentDir = new File(pwd())
    def isdir = currentDir.isDirectory()
    println "isdir:${isdir}"      // The output here is false
    def isexist = currentDir.exists()
    println "isexist:${isexist}"  // The output here is false
    def canread = currentDir.canRead()
    println "canread:${canread}"  // The output here is false
    def filesPath = []
    def openshiftYaml = recursiveFileSearch(currentDir, "openshift.yml", filesPath)
}
I am not sure what is going wrong here, but below are some observations:
When I do File currentDir = new File("."), it resolves to / and starts reading the entire root directory, which I don't want, and even then it does not recognize WORKSPACE as a directory.
It executes fine if I run it on the master node, but in my use case it will always run on a slave.
I have also checked the permissions of the directory; the user has read/write/execute permissions.
Any pointers/help is appreciated.

Generally, run a sh step to do whatever work you need. You may not use java.io.File or the like from a Pipeline script: it does not run on the agent, and it is also insecure, which is why any such attempt will be rejected when sandbox mode is left on (the default).
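For instance, the recursive search from the question can be delegated to the shell on the agent (a minimal sketch; the node label and filename are taken from the question):
node('maven') {
    // the shell runs on the agent, inside the workspace checked out there
    def output = sh(script: "find . -name openshift.yml", returnStdout: true).trim()
    echo "Found: ${output}"
}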

You are running into the "Using java.io.File in a Pipeline description" problem. I know it all too well. File objects and NIO work fine for breaking up paths, but their isDirectory(), exists() and other methods run on the master as part of the Jenkinsfile, not on the node. So everything looks great when the job runs on the master, because the files are in its workspace, and fails as soon as it runs on a node.
In short, don't do that. Use fileExists(), pwd(), findFiles(), etc.
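A minimal sketch using only node-aware pipeline steps (the file name below is just a placeholder):
node('maven') {
    // pwd() returns the workspace path on the node running this block
    def ws = pwd()
    // fileExists() is evaluated on that node, unlike java.io.File
    if (fileExists("${ws}/openshift.yml")) {
        echo "openshift.yml found in ${ws}"
    }
}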
If you created a shared library and want to unit test the code outside of Jenkins, you can create a facade which relies on the script object ('this' from a pipeline).
Class for shared lib
import java.nio.file.Path
import java.nio.file.Paths

class PipelineUtils implements Serializable {
    static def pipelineScript = null

    /**
     * Set up this facade with access to pipeline script methods
     * @param jenkinsPipelineScript
     * @return
     */
    static initialize(def jenkinsPipelineScript) {
        pipelineScript = jenkinsPipelineScript
    }

    /**
     * Use the pipelineScript object ('this' from the pipeline) to access fileExists().
     * We cannot use Java File objects for detection as the pipeline script runs on the master and uses
     * delegation/serialization to get to the node. So File.exists() will be false if the file was generated
     * on the node and that node isn't the master.
     * https://support.cloudbees.com/hc/en-us/articles/230922128-Pipeline-Using-java-io-File-in-a-Pipeline-description
     * @param target
     * @return true if path exists
     */
    static boolean exists(Path target) {
        if (!pipelineScript) {
            throw new Exception("PipelineUtils.initialize with pipeline script not called - access to pipeline 'this' required for access to file detection routines")
        }
        if (!target.parent) {
            throw new Exception('Please use absolute paths with ${env.WORKSPACE}/path-to-file')
        }
        return pipelineScript.fileExists(target.toAbsolutePath().toString())
    }

    /**
     * Convert a workspace-relative path to an absolute path
     * @param path relative path
     * @return node-specific absolute path
     */
    static def relativeWorkspaceToAbsolutePath(String path) {
        Path pwd = Paths.get(pipelineScript.pwd())
        return pwd.resolve(path).toAbsolutePath().toString()
    }

    static void echo(def message) {
        pipelineScript.echo(message)
    }
}
Class for tests
class JenkinsStep {
    static boolean fileExists(def path) {
        return new File(path).exists()
    }

    static def pwd() {
        return System.getProperty("user.dir")
    }

    static def echo(def message) {
        println "${message}"
    }
}
Usage in Jenkins
PipelineUtils.initialize(this)
println PipelineUtils.exists(java.nio.file.Paths.get("${env.WORKSPACE}/path-to-file")) // calls the Jenkins fileExists() step
Usage in unit tests
PipelineUtils.initialize(new JenkinsStep())
println PipelineUtils.exists(java.nio.file.Paths.get(JenkinsStep.pwd(), "path-to-file")) // calls File.exists()

I found the answer.
For searching for any file in your workspace from a Jenkinsfile you can use the findFiles step.
I did try this before, but I was passing an incorrect glob. Now I just do:
def files = findFiles(glob: '**/openshift.yml') // it returns the path(s) of the matching file(s)
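For reference, a short sketch of how the result can be used (findFiles returns an array of file wrappers whose path property is relative to the workspace):
node('maven') {
    def files = findFiles(glob: '**/openshift.yml')
    for (int i = 0; i < files.length; i++) {
        echo "Found ${files[i].path}"
    }
}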

Related

Shopware 6 : how to route shopware unit test output from fixture to original folder?

I'm building a unit test for a feature that gets files from the downloads folder under the public directory. In the unit test I'm using a fixture for the test files path. How can I mock the directory in my common function so it works with both the fixture and the original pub directory?
Have a look at this core test: \Shopware\CI\Test\Service\ReleasePrepareServiceTest::setUp
They use the following code to simulate folder contents:
public function setUp(): void
{
    $this->artifactsFilesystem = new Filesystem(new MemoryAdapter());
    [...]
    $this->artifactsFilesystem->put('install.zip', random_bytes(1024 * 1024 * 2 + 11));
    $this->artifactsFilesystem->put('install.tar.xz', random_bytes(1024 + 11));
    $this->artifactsFilesystem->put('update.zip', random_bytes(1024 * 1024 + 13));
}
The unit test creates an in-memory filesystem here and fills it with files containing random data.
I found this by checking the usages of the Filesystem class in the test folders of Shopware.
This filesystem can be injected into the service or code you are testing.
For example:
public function setUp(): void
{
    $this->myMockFileSystem = new Filesystem(new MemoryAdapter());
    $this->myMockFileSystem->put('file_we_need_in_the_test.pdf', random_bytes(1024 * 1024 * 2 + 11));
}

public function testSomething(): void
{
    $service = new MyService($this->myMockFileSystem);
    $this->assertEquals('some result', $service->doSomething());
}

Gradle: How to upload release APK file over FTP after build

I want to upload the APK file to the server automatically when building the release version.
To do so, I'm going to use the FTP protocol.
I'm new to Gradle scripting. I used these two questions (this and this) as a base, but something is not working out.
Could anyone point out what it is?
This is the code (in build.gradle):
gradle.buildFinished {
    println("---- Build finished. This message appears ----")
    task ftp << {
        project.logger.lifecycle('-- This message does not appear --')
        ant {
            taskdef(name: 'ftp',
                    classname: 'org.apache.tools.ant.taskdefs.optional.net.FTP',
                    classpath: configurations.ftpAntTask.asPath)
            def destination = "ftp://xxxxxxxxxx#xxx.surftown.com/xxxxx/"
            def source = null
            android.applicationVariants.all { variant ->
                if ((variant.name).equals("release")) {
                    variant.outputs.each { output ->
                        source = output.outputFile
                    }
                }
            }
            def user = 'xxxxxxxxx'
            def pass = 'xxxxxxxxx'
            ftp(server: source, userid: user, password: pass, remoteDir: destination)
        }
    }
}
gradle.buildFinished registers a hook executed at the end of the build. In your case it only creates the ftp task at that point, so the task is never run.
Use this if the build task is involved:
build.finalizedBy(ftp)
Otherwise, to make sure it works whatever task is invoked:
tasks.all*.finalizedBy(ftp)
By the way, this was explained in the comment section of the first answer in your first link.
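A possible rearrangement of your snippet along those lines (a sketch, not tested; it assumes the ftpAntTask configuration and the credential placeholders from your build.gradle, and moves the work into the task's doLast action instead of the deprecated << syntax):
task ftp {
    doLast {
        def source = null
        android.applicationVariants.all { variant ->
            if (variant.name == "release") {
                variant.outputs.each { output ->
                    source = output.outputFile
                }
            }
        }
        ant.taskdef(name: 'ftp',
                classname: 'org.apache.tools.ant.taskdefs.optional.net.FTP',
                classpath: configurations.ftpAntTask.asPath)
        // server is the FTP host, not the APK file; remoteDir is the target directory on that host
        ant.ftp(server: 'xxx.surftown.com', userid: 'xxxxxxxxx', password: 'xxxxxxxxx', remoteDir: '/xxxxx/') {
            fileset(file: source.absolutePath)
        }
    }
}
build.finalizedBy(ftp)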

How to use createTempFile in groovy/Jenkins to create a file in non-default directory?

What I am trying to achieve is to create a temporary file in Groovy in the workspace directory, but as an example /tmp/foo is good enough.
So, here is perfectly working Java code:
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.Files;

class foo {
    public static void main(String[] args) {
        try {
            String s = "/tmp/foo";
            Path p = Paths.get(s);
            Path tmp = Files.createTempFile(p, "pref", ".suf");
            System.out.println(tmp.toString());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
However, when used in the context of a Jenkins pipeline, it simply does not work:
def mktemp() {
    //String s = pwd(tmp: true)
    String s = "/tmp/foo"
    Path p = Paths.get(s)
    Path tmp = Files.createTempFile(p, "pref", ".suf")
    return tmp
}
The result is an array element type mismatch message with nothing helpful in the pipeline log:
java.lang.IllegalArgumentException: array element type mismatch
    at java.lang.reflect.Array.set(Native Method)
    at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovyCallSiteSelector.parametersForVarargs(GroovyCallSiteSelector.java:104)
    at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovyCallSiteSelector.matches(GroovyCallSiteSelector.java:51)
    at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovyCallSiteSelector.findMatchingMethod(GroovyCallSiteSelector.java:197)
    at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovyCallSiteSelector.staticMethod(GroovyCallSiteSelector.java:191)
    at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onStaticCall(SandboxInterceptor.java:153)
    at org.kohsuke.groovy.sandbox.impl.Checker$2.call(Checker.java:184)
    at org.kohsuke.groovy.sandbox.impl.Checker.checkedStaticCall(Checker.java:188)
    at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:95)
    at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.methodCall(SandboxInvoker.java:17)
    at WorkflowScript.mktemp(WorkflowScript:16)
java.io.File.createTempFile() is no better: in plain Java code it works perfectly, but in the pipeline it throws java.io.IOException: No such file or directory.
BTW, the /tmp/foo directory exists, and the methods have been added on the script approval screen.
From the IOException I suspect you're calling mktemp from within a node {} block and expecting the temporary file to be created on that node. Pipeline scripts run entirely on the Jenkins master. Pipeline steps that interact with the filesystem (e.g. writeFile) are aware of node {} blocks and will be sent over to the node to be executed there, but pure-Java methods know nothing about remote nodes and will interact with the master's filesystem.
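If the goal is simply a temporary file next to the workspace on the node, here is a sketch using only pipeline steps (the prefix/suffix and the empty content are placeholders):
node {
    // pwd(tmp: true) returns a temporary directory associated with the workspace on this node
    def tmpDir = pwd(tmp: true)
    def tmpFile = "${tmpDir}/pref${System.currentTimeMillis()}.suf"
    // writeFile executes on the node, so the file ends up where the build runs
    writeFile file: tmpFile, text: ''
    echo "Created ${tmpFile}"
}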

Running SoapUI test cases using testRunner

I am working on a SoapUI project where I need to run my test suite using testrunner. I am using external Groovy scripting for an environment variable. The problem I am facing is that whenever I run a test case from testrunner, it returns the workspace as null, and the workspace is used in the external Groovy script. So in the external Groovy script I am getting the workspace as null, causing the error [getProjectByName() cannot be invoked on null]. Below is the constructor of the global script where the workspace is used:
AvengerAPITestManager(String TestProject, String TestSuite, String TestCase, String TestStep)
{
    TestName = "AvengerAPITests";
    testProject = SoapUI.getWorkspace().getProjectByName(TestProject);
    tSuite = testProject.getTestSuiteByName(TestSuite);
    tCase = testProject.getTestSuiteByName(TestSuite).getTestCaseByName(TestCase);
    tStepName = TestStep.toString();
    tStep = testProject.getTestSuiteByName(TestSuite).getTestCaseByName(TestCase).getTestStepByName(TestStep);
}
Above we have used SoapUI.getWorkspace(), which works fine when running from SoapUI, but whenever I try to run from testrunner, SoapUI.getWorkspace() comes out as null. I even tried passing the workspace the same way I pass the test project name, but it still didn't work.
I also tried something like this:
AvengerAPITestManager(Object workspace, String TestProject, String TestSuite, String TestCase, String TestStep)
{
    TestName = "AvengerAPITests";
    testProject = workspace.getProjectByName(TestProject);
    tSuite = testProject.getTestSuiteByName(TestSuite);
    tCase = testProject.getTestSuiteByName(TestSuite).getTestCaseByName(TestCase);
    tStepName = TestStep.toString();
    tStep = testProject.getTestSuiteByName(TestSuite).getTestCaseByName(TestCase).getTestStepByName(TestStep);
}
In the above code I tried passing the workspace object from the test case the same way I pass the test case name and the rest, but I am still getting null for the workspace. Please tell me how to deal with this problem.
Here is a useful working example: https://github.com/stokito/soapui-junit
You should place your sample-soapui-project.xml in the /src/test/resources folder, which will expose it to the classpath.
If you want to use SoapUI from external code, try to directly create a new test runner with a specific project file:
SoapUITestCaseRunner runner = new SoapUITestCaseRunner();
runner.setProjectFile( "src/dist/sample-soapui-project.xml" );
runner.run();
Or if you want to define test execution more precisely, you can use something like this:
WsdlProject project = new WsdlProject( "src/dist/sample-soapui-project.xml" );
TestSuite testSuite = project.getTestSuiteByName( "Test Suite" );
TestCase testCase = testSuite.getTestCaseByName( "Test Conversions" );
// create empty properties and run synchronously
TestRunner runner = testCase.run( new PropertiesMap(), false );
PS: don't forget to import the SoapUI classes that you use in your code and put them on the classpath.
PPS: If you just need to run test cases outside SoapUI and/or automate the process, why not use testrunner.sh/.bat for the same thing? (Here is a description of that approach: http://www.soapui.org/Test-Automation/functional-tests.html)
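For completeness, a typical command-line invocation looks roughly like this (using the suite/case names and project path from the snippet above; -s selects the test suite and -c the test case):
testrunner.sh -s"Test Suite" -c"Test Conversions" src/dist/sample-soapui-project.xml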
I am not sure if this is going to help anyone out there, but here is what I did to fix the problem I was having with the workspace being null (causing the error [getProjectByName() cannot be invoked on null]) when running from the command line.
Try this:
import com.eviware.soapui.model.project.ProjectFactoryRegistry
import com.eviware.soapui.impl.wsdl.WsdlProjectFactory
import com.eviware.soapui.impl.wsdl.WsdlProject

//get the Util project
def project = null
def workspace = testRunner.testCase.testSuite.project.getWorkspace();
//if running in SoapUI
if (workspace != null) {
    project = workspace.getProjectByName("Your Project")
}
//if running in Jenkins/Hudson
else {
    project = new WsdlProject("C:\\...\\....\\....\\-soapui-project.xml");
}
if (project.open && project.name == "Your Project") {
    def properties = new com.eviware.soapui.support.types.StringToObjectMap()
    def testCase = project.getTestSuiteByName("TestSuite 1").getTestCaseByName("TestCase");
    if (testCase == null) {
        throw new RuntimeException("Could not locate testcase 'TestCase'!")
    } else {
        // This will run everything in the selected project
        runner = testCase.run(new com.eviware.soapui.support.types.StringToObjectMap(), false)
    }
}
else {
    throw new RuntimeException("Could not find project ' Order Id....' !")
}
The above code will run everything in the selected project.

Write to file via jenkins post-groovy script on slave

I'd like to do something very simple: create/write to a file located in the remote workspace of a slave via the Jenkins Groovy post-build script plug-in.
def props_file = new File(manager.build.workspace.getRemote() + "/temp/module.properties")
def build_num = manager.build.buildVariables.get("MODULE_BUILD_NUMBER").toInteger()
def build_props = new Properties()
build_props["build.number"] = build_num
props_file.withOutputStream { p ->
    build_props.store(p, null)
}
The last line fails, as the file doesn't exist. I'm thinking it has something to do with the output stream pointing to the master executor rather than the remote workspace, but I'm not sure:
Groovy script failed:
java.io.FileNotFoundException: /views/build_view/temp/module.properties (No such file or directory)
Am I not writing to the file correctly?
When writing to a file on a slave you need to check the channel first; then you can successfully create a file handle and start reading or writing to that file:
if (manager.build.workspace.isRemote()) {
    channel = manager.build.workspace.channel;
}
fp = new hudson.FilePath(channel, manager.build.workspace.toString() + "\\test.properties")
if (fp != null) {
    String str = "test";
    fp.write(str, null); //writing to file
    versionString = fp.readToString(); //reading from file
}
hope this helps!
Search for the words "The post build plugin runs on the manager and doing it as you say will fail if you are working with slaves!" on the plugin page (the link to which you've provided) and see if the workaround there helps.
Does the folder /views/build_view/temp exist?
If not, you will need to do new File( "${manager.build.workspace.remote}/temp" ).mkdirs()
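If the build runs on a slave, the hudson.FilePath approach from the answer above can create that directory on the node instead (a small sketch):
def tempDir = manager.build.workspace.child("temp")
tempDir.mkdirs() // FilePath.mkdirs() runs on whichever node owns the workspace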
