Concatenate files on Linux with Gradle

Here's my current Gradle task:
task concat << {
    println "cat $localWebapp/sc*.js > $buildDir/js/sc.concat.js"
    exec {
        commandLine "bash", "-c", 'cat', "$localWebapp/sc*.js", ">", "$buildDir/js/sc.concat.js"
    }
}
Even though the command I print with println is correct (it works if I paste it into a console in the project directory), the task doesn't build the sc.concat.js file.
What's happening and how can I fix it?

sh -c takes a single parameter containing the script/commands to execute:
commandLine "/bin/sh","-c","cat $localWebapp/sc*.js > $buildDir/js/sc.concat.js"
Otherwise, the parameters after cat are passed to the shell as extra positional parameters ($0, $1, ...) rather than as part of the script, which is why they are "misinterpreted".
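A quick way to see that behaviour in a shell (the echo here is just a stand-in script):

sh -c 'echo script sees: $0 $1' foo bar
# prints "script sees: foo bar" - the words after the script string become $0 and $1

In the original task, cat alone was the whole script, so the glob, the > and the target file name were handed to it as positional parameters and the redirection never happened.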

Instead of commandLine, it seems executable works:
task concat << {
    println "cat $localWebapp/sc*.js > $buildDir/js/sc.concat.js"
    exec {
        executable "sh"
        args "-c", "cat $localWebapp/sc*.js > $buildDir/js/sc.concat.js"
    }
}
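As a rough sanity check, a run along these lines should produce the concatenated file (assuming the default build directory layout and that build/js already exists):

gradle concat
ls build/js/sc.concat.js    # the concatenated output should show up here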

Related

Try catch in Nextflow processes

How can I perform a try/catch in Nextflow?
I am currently writing a pipeline where it is possible that the bash command I am executing exits with exit code 1 under certain conditions. This brings my pipeline to a grinding halt. I would now like to use a try/catch clause to define some alternative behavior in case this happens.
I have tried doing this in groovy fashion which does not seem to work:
process align_kallisto {
    publishDir "${params.outdir}/kallisto", mode: 'copy', saveAs: { filename -> "${name}_abundance.tsv" }

    input:
    tuple val(name), file(fastq) from fq_kallisto.dump(tag: 'kallisto fq')
    file(index) from kallisto_index.collect().dump(tag: 'kallisto index')

    output:
    file("output/abundance.tsv") into kallisto_quant

    // this can throw an exit 1 status
    try {
        """
        kallisto quant -i ${index} --bias --single --fr-stranded -o output --plaintext \
            --fragment-length ${params.frag_length} --sd ${params.frag_deviation} ${fastq}
        """
    }
    // if this happens catch and do something else
    catch (Exception e) {
        println("Exception: ${e} for $name")
        """
        // execute some alternative command
        """
    }
}
Any advice?
I could tell Nextflow to just ignore this error and continue, but I would rather learn how to do a proper try/catch.
AFAIK there's no way to handle errors in your process definition using a try/catch block. Rather than trying to catch all of the scenarios that result in an exit status 1, could you better define those conditions and handle them before trying to execute your process? For example, if an empty FASTQ file (or a FASTQ file with an insufficient number of reads as required by your process) was supplied as input and this resulted in an exit status 1, a pre-processing command that filtered out those files could be useful here.
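As a minimal sketch of such a pre-processing command (the data/ directory and the -size threshold are placeholders for your own layout), you could list only the non-empty FASTQ files and feed just those into the pipeline:

# keep only FASTQ files with at least one byte; raise the threshold if you need a minimum amount of data
find data -name '*.fastq' -size +0c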
But if it's not possible to better define the condition(s) under which your command produces exit status 1 (or any non-zero exit status), you can ignore them as you have suggested, by adding errorStrategy 'ignore' to your process definition. Below is an example of how you could get the 'success' and 'failed' outputs, so they can be handled appropriately:
nextflow.enable.dsl=2

process test {
    errorStrategy 'ignore'

    input:
    tuple val(name), path(fastq)

    output:
    tuple val(name), path("output/abundance.tsv")

    """
    if [ "${fastq.baseName}" == "empty" ]; then
        exit 1
    fi
    mkdir output
    touch output/abundance.tsv
    """
}
workflow {
    fastqs = Channel.fromFilePairs( './data/*.fastq', size: 1 )

    test(fastqs) \
        .join(fastqs, remainder: true) \
        .branch { name, abundance, fastq_tuple ->
            failed: abundance == null
                return tuple( name, *fastq_tuple )
            succeeded: true
                return tuple( name, abundance )
        } \
        .set { results }

    results.failed.view { "failed: $it" }
    results.succeeded.view { "success: $it" }
}
Run with:
mkdir data
touch data/nonempty.fastq
touch data/empty.fastq
nextflow run -ansi-log false test.nf
Results:
N E X T F L O W ~ version 20.10.0
Launching `test.nf` [suspicious_newton] - revision: b883179718
[08/60a99f] Submitted process > test (1)
[42/358d60] Submitted process > test (2)
[08/60a99f] NOTE: Process `test (1)` terminated with an error exit status (1) -- Error is ignored
success: [nonempty, /home/user/working/stackoverflow/66119818/work/42/358d60bd7ac2cd8ed4dd7aef665d62/output/abundance.tsv]
failed: [empty, /home/user/working/stackoverflow/66119818/data/empty.fastq]

Check Whether Directory Exists In Remote Server For Perl

I wish to check whether a given path exists or is a directory on another site's server in Perl. My code is below.
my $destination_path = "<path>";
my $ssh = "/usr/bin/ssh";
my $user_id = getpwuid( $< );
my $site = "<site_name>";
my $host = "rsync.$site.com";
if ($ssh $user_id\@$host [-d $destination_path]){
    print "Is a directory.\n";
}
else{
    print "Is not a directory.\n";
}
I am sure my code is wrong, as I modified it based on a bash example I saw in another question, but I have no clue how to fix it. Thanks to everyone who helps here.
[ is the name of a command, and it must be separated from other words on the command line. So just use more spaces:
$ssh $user\@$host [ -d $destination_path ]
To actually execute this command, you'll want to use the built-in system function. system returns 0 when the command it executes is successful (see the documentation for system).
if (0 == system("$ssh $user\@$host [ -d $destination_path ]")) {
    print "Is a directory.\n";
} else {
    print "Is not a directory.\n";
}
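Since ssh propagates the exit status of the remote command, and system reports that status back to Perl, you can try the same check by hand first (the user, host and path below are placeholders):

ssh user@rsync.example.com [ -d /some/path ]
echo $?    # 0 if the remote path is a directory, non-zero otherwise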
Accessing the remote file system through SFTP:
use Net::SFTP::Foreign;

my $sftp = Net::SFTP::Foreign->new("rsync.$site.com");
if ($sftp->test_d($destination_path)) {
    print "$destination_path is a directory!\n";
}

awk: how to use a variable value

I want to declare a variable called variableToUse which holds the output file path.
I want to append today's date to the file name.
The code below is in myAWK.awk:
$ cat myAWK.awk
BEGIN{
    today="date +%Y%m%d";
    variableToUse=/MainDir/MainDir1/MainDir2/OutputFile_today.xml
}
/<record / { i=1 }
i { a[i++]=$0 }
/<\/record>/ {
    if (found) {
        print a[i] >> variableToUse
    }
}
I am getting a syntax error at OutputFile_today.xml.
How do I use the variable value?
You should quote the variables properly
Example
$ awk 'BEGIN{variableToUse="/MainDir/MainDir1/MainDir2/OutputFile_today.xml"; print variableToUse}'
/MainDir/MainDir1/MainDir2/OutputFile_today.xml
To get the current date you can use strftime
Example
$ awk 'BEGIN{today="date +%Y%m%d";variableToUse="/MainDir/MainDir1/MainDir2/OutputFile_"strftime("%Y%m%d")".xml"; print variableToUse}'
/MainDir/MainDir1/MainDir2/OutputFile_20160205.xml
Have your awk script like this:
BEGIN {
    today="date +%Y%m%d";
    variableToUse="/MainDir/MainDir1/MainDir2/OutputFile_" today ".xml"
}
/<record / { i=1 }
i { a[i++]=$0 }
/<\/record>/ {
    if (found) {
        print a[i] >> variableToUse
    }
}
By the way, there are a couple of other issues:
- I don't see found getting set anywhere in this script.
- today="date +%Y%m%d" will not execute the date command. It just assigns the literal string date +%Y%m%d to the today variable. If you want to execute the date command, then use:
awk -v today="$(date '+%Y%m%d')" -f myAWK.awk
and remove the today= line from the BEGIN block.
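Put together, the invocation could look like this (input.xml is just a placeholder for your actual input file); the shell expands date once and the -v value is visible in every block of the script, including BEGIN:

today=$(date '+%Y%m%d')
awk -v today="$today" -f myAWK.awk input.xml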

CliBuilder Arguments Are Empty

Here's a working example of my problem:
def cli = new CliBuilder(usage: 'cli-test -d <argument>')
cli.with {
    h(longOpt: 'help', 'usage information')
    d(longOpt: 'do-something', required: true, args: 1, 'Do Something')
}

OptionAccessor options = cli.parse(args)
if (!options) {
    return
}

// print usage if -h, --help, or no argument is given
if (options.h || options.arguments().isEmpty()) {
    println options.arguments().size()
    cli.usage()
    return
} else if (options.d) {
    println options.d
}
When I execute the script with the following:
groovy cli-test.groovy -d hello
I get this output:
0
usage: cli-test -d <argument>
-d,--do-something <arg> Do Something
-h,--help usage information
The 0 is the arguments length from my println. I can't get any options other than h to work. I'm not sure if I'm doing something wrong.
The reason is that there are no arguments! You've swallowed them all in options.
If you call
groovy cli-test.groovy -d hello foo
then the arguments() list is [foo]
The -d arg is automatically checked for because you made it required, so there's no need to test for it later on.
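A couple of runs make the distinction clearer (assuming the script above is saved as cli-test.groovy):

groovy cli-test.groovy -d hello        # options.d is 'hello', options.arguments() is empty
groovy cli-test.groovy -d hello foo    # options.arguments() is [foo]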
Not sure why this works this way, but removing:
|| options.arguments().isEmpty()
from the initial if check makes everything work.

How to check if folder has any content in puppet

What I want to do is check whether a folder has any content before copying it, since Puppet throws an error if you try to copy the content of an empty folder. This is what I have tried, but it doesn't work :(
exec { "Copying_patches_$setupnode-$number":
path => '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/opt/java/bin/',
command => "cp -r ${params::script_base_dir}/libs/patches/* ${params::deployment_target}/$setup/repository/patches/",
onlyif => "test -f ${params::script_base_dir}/libs/patches/*",
notify => Notify['${params::script_base_dir}/libs/patches/* found'],
require => File["${params::deployment_target}/$setupnode"],
}
params::script_base_dir will give the path up to the script location.
Try this approach:
package { 'rsync': ensure => 'installed' }

$from = "${params::script_base_dir}/libs/patches/"
$to   = "${params::deployment_target}/$setup/repository/patches/"

file { "$from/.sync_marker": ensure => file }

exec { "Copying_patches_$setupnode-$number":
    path    => '/usr/bin:/bin',
    command => "rsync -r $from $to",
    require => [
        File["${params::deployment_target}/$setupnode"],
        Package['rsync'],
        File["$from/.sync_marker"],
    ],
    creates => "$to/.sync_marker",
}
Some remarks:
- I shortened your path - there is no need for java or the things in /sbin.
- Notifying a notify resource is usually not sensible - those always produce their message anyway.
- The trailing slash on the target does not matter to rsync, but the one on the sources does!
- The file in the source directory is created just to make it possible to build a simple creates clause.
- The creates parameter makes sure that the command is run only once, and not during every run.
If you need Puppet to wait until the source directory is populated, you do have to use onlyif. Try this condition:
onlyif => "find $from | wc -l | grep -v '^2\$'",
The two lines of output would represent the directory itself and the marker file. The $ sign is escaped so that Puppet includes it in the command string literally.
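You can check the two-line baseline by hand (demo is a throwaway directory standing in for $from):

mkdir -p demo && touch demo/.sync_marker
find demo | wc -l    # prints 2: just the directory and the marker, so the grep exits non-zero and the exec is skipped
touch demo/patch.jar
find demo | wc -l    # prints 3: grep -v '^2$' now lets the line through and exits 0, so the copy runs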
I spent a little bit of time finding this, so it might help someone else as well:
unless => "find /directory/path/ -mindepth 1 | read",
So for my use case:
exec { "Install wordpress in ${docroot}":
command => "git clone https://github.com/WordPress/WordPress.git ${docroot}",
unless => "find ${docroot} -mindepth 1 | read", # only cloen if directory is empty
cwd => $docroot,
path => ['/bin', '/usr/bin', '/usr/sbin'],
}
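The find ... | read trick works because read succeeds only when it receives at least one line of input, which maps neatly onto unless (the directory and file names below are throwaway examples):

mkdir emptydir
find emptydir -mindepth 1 | read; echo $?    # 1: no content, so unless fails and the clone runs
touch emptydir/somefile
find emptydir -mindepth 1 | read; echo $?    # 0: content present, so unless succeeds and the exec is skipped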
