Compiling multiple bash scripts into one binary file - linux

I have a library that contains multiple bash scripts on a Linux system. The main application, say myApp.sh, calls the other scripts to perform its tasks.
I need to compile all of these scripts into ONE binary file. With the SHC tool I can only compile one file at a time, and if I remove the other files, the references inside myApp.sh break.
So how can I combine all of the bash scripts into one file and then run SHC on it?
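One approach, sketched below with made-up file names: since shc compiles only a single script, concatenate the helper scripts in front of the main script (so every function is defined before it is called) and feed the combined file to shc. This assumes the helpers are sourced for their functions, not executed as separate processes.

```shell
#!/bin/sh
# Sketch with placeholder file names: build one combined script, then
# (optionally) compile it with shc. Helpers go first so their
# functions exist before the main script uses them.
set -e
mkdir -p demo/helpers
printf 'greet() { echo "hello $1"; }\n' > demo/helpers/lib.sh   # stand-in helper
printf 'greet world\n' > demo/myApp.sh                          # stand-in main script
cat demo/helpers/*.sh demo/myApp.sh > demo/combined.sh
chmod +x demo/combined.sh
sh demo/combined.sh            # prints: hello world
# shc -f demo/combined.sh      # would then produce the compiled binary
```

If myApp.sh currently sources the helpers with `.`/`source`, remove those lines after concatenating, since the functions are already inlined above.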


How to launch executable embedded within Eclipse plug-in?

I am building an Eclipse plug-in that has to parse the output of an executable (on Linux) to display information to the user. The executable should be embedded in the plug-in, not installed separately.
I first made a small prototype in which I embedded a fake executable: before launching it, I extract it to a temporary file, build my command line, and launch it. That worked fine for me.
I've just received the real executable and realised it is not a standalone executable but a bunch of libraries, config files and such. It also comes with a script that must be executed to set environment variables.
The only option I am seeing now is to extract the whole bunch to a temp directory, set the environment variables according to the script, and then call my executable.
Is my solution valid? Can you think of a better way to do it?
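For what it's worth, the extract-and-run plan is workable. Here is a minimal shell sketch of it, with a tiny fake bundle standing in for the real one (every file name here is a placeholder):

```shell
#!/bin/sh
# Sketch of the plan: extract the bundle to a temp dir, source its
# env-setting script, then run the executable. The "bundle" is faked
# with two generated files; all names are made up.
set -e
WORK=$(mktemp -d)
printf 'MYTOOL_HOME="%s"; export MYTOOL_HOME\n' "$WORK" > "$WORK/set_env.sh"
printf '#!/bin/sh\necho "running from $MYTOOL_HOME"\n' > "$WORK/mytool"
chmod +x "$WORK/mytool"
. "$WORK/set_env.sh"           # set the env vars the tool needs
OUT=$("$WORK/mytool")          # in the plug-in, this is the output to parse
echo "$OUT"
rm -rf "$WORK"                 # clean up the temp directory when done
```

The plug-in side would do the same extraction to `java.io.tmpdir` and pass the environment to the process it launches.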
Don't package the plugin as a jar; use a directory instead, so you don't have to do any unpacking.
You specify this using
Eclipse-BundleShape: dir
in the plugin MANIFEST.MF.
Note: if you package your plugins in a feature then this setting is overridden by the unpack="true" attribute of the plugin element in the feature.xml file.

Jenkins pipeline/groovy: Load script relative to current script

I have a pipeline groovy script, which I load from a different script:
load("path/to/my/script/pipeline.groovy")
Now, in this script, I want to load another groovy script. But I do not know the full path/to/my/script path. I tried:
load("./subfolder/subscript.groovy")
But it cannot find it this way. Can I somehow load a groovy script relative to the current script file?
You may want to consider using the shared library plugin if you are loading multiple remote scripts.
If the groovy file exists in a subfolder, you can use the findFiles step:
def subscript = findFiles(glob: '**/subscript.groovy')
load(subscript[0].path)
Another option is to fetch the second script into your working directory first (e.g. with curl); from there the first script can find and load it.

Run build script in Code::Blocks instead of building a target

Background:
I recently joined a software development company as an intern and am getting used to a new build system. The software is for an embedded system, and let's just say that all building and compiling is done on a build box. The build uses code generation from XML files, and also relies on makefiles, spec files, and version files.
We develop on our own computers (Linux, Mandriva distro) and build using the following steps:
ssh buildserver
use mount to mount a drive from the personal computer on the buildserver
set the environment using . ./set_env (may not be exactly that)
cd app_dir/obj (where makefile is)
make spec_clean
make spec_all
make clean
make
The Question:
I am a newbie to Code::Blocks and Linux, and was wondering how to set up a project file so that it simply runs a script executing these commands, instead of invoking the build process on my own machine. Sort of like a pre-build script. I want to bind the execution of this script to Ctrl-F9 (build) and capture any output from the above commands in the build log window.
In other words, there is no build configuration or target that the project needs to worry about. I don't even need a compiler installed on my computer! I wish to set this up so that I can have the full features of an IDE.
Appreciate any suggestions!
Put your commands in a shell script file, e.g.:
#!/bin/sh
mount ... /mnt/path/buildserver
. ./set_env
cd app_dir/obj
make spec_clean
make spec_all
make clean
make
Say you name it /path/to/my_build_script; then chmod 755 /path/to/my_build_script and invoke the following from your SSH client machine (note that script -c takes a single command string, so the ssh invocation must be quoted):
script -c 'ssh buildserver "/path/to/my_build_script"'
When it finishes, check the file typescript in the current directory for the captured output.
HTH

Compiling coffee binaries

If I'm writing my entire project in CoffeeScript, how do I write my "binary" files?
#!/usr/bin/env coffee
# My program sits here
Then, once compiled, I lose the shebang:
// Generated by CoffeeScript 1.4.0
// My program sits here
I was hoping it would turn into something like:
#!/usr/bin/env node
// My program sits here
Is it possible? Or do I need to rethink how I'm working?
As you surmise, you probably want a script to help you add the necessary shebang line. I usually create a Cakefile task to do the necessary compilation and add the appropriate first line.
The trick is to NOT put a .coffee extension on your "binary" file and to not compile it.
I also recommend that you not place any significant logic in the binary. Rather, just have the binary kick off the full source.
In general, every one of my binaries sits in a /bin directory off the root of my project and it has these two lines only (like my CoffeeDocTest project on GitHub here):
#!/usr/bin/env coffee
require(__dirname + '/../src/coffeedoctest')
You'll also want to run chmod 755 <filename> on it to make it executable.
Look here for an example of how the main coffeedoctest.coffee starts off and handles command line options, etc.
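If you do want to ship the compiled JavaScript itself, another option is to prepend the shebang yourself after compiling. A sketch, where a generated file stands in for what `coffee -c` would actually emit:

```shell
#!/bin/sh
# Prepend-a-shebang sketch. lib/myprog.js stands in for the
# CoffeeScript compiler's output; the concatenation is the point.
set -e
mkdir -p lib bin
printf '// Generated by CoffeeScript\nconsole.log("hi");\n' > lib/myprog.js
{ printf '#!/usr/bin/env node\n'; cat lib/myprog.js; } > bin/myprog
chmod 755 bin/myprog
head -n 1 bin/myprog           # prints: #!/usr/bin/env node
```

A Cakefile task can run these two steps after every compile so the executables in bin/ are always regenerated.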

How can I build a static library from files autogenerated by running a Perl script within the SConscript

Here is what I need to do in SCons; at present I'm not able to get it to work correctly.
Firstly I need to run perl script 1. This generates a series of cpp files.
Then I need to run perl script 2. This generates another series of cpp files.
Then I need to take the cpp files that have been created as a result of executing the 2 perl scripts and build a static library from them.
I use a custom builder to execute the perl scripts. I don't want to manually define the target list, as this can change depending on the file that the perl scripts uses to generate the source files.
Any help would be much appreciated.
Thanks,
D
For running the Perl scripts you can just use standard Python code:
import subprocess
subprocess.call(['perl', ...args...])
For building static lib, try something like this:
env = Environment()
env.StaticLibrary('example', Glob('*.cpp'))
where Glob('*.cpp') generates a list of all .cpp files. If you already have some customized environment, just use it instead of env in my sample.
