scons not respecting variant_dir with object files

For simplicity's sake, here is a minimal example that reproduces the problem I'm having.
I currently have two files, a SConstruct and a SConscript.
My directory tree is as follows:
- .
  - SConstruct
  - build (dir)
  - dir1 (dir)
    - mysrc.cpp
    - proj (dir)
      - SConscript
Here is the contents of my SConstruct:
SConscript('dir1/proj/SConscript', variant_dir='build/out', duplicate=0)
Here is the contents of dir1/proj/SConscript:
src = Dir('.').srcnode().abspath + '/../mysrc.cpp'
StaticLibrary('mylib', src)
When I run scons from my root directory, I see the following output:
g++ -o dir1/mysrc.o -c dir1/mysrc.cpp
ar rc build/out/libmylib.a
The ar command looks great, it outputs to the variant_dir, which is build/out. The problem here is the output location of mysrc.o. It goes to dir1 instead of build/out. Why does this happen?

You need to specify paths to source files as though they are sitting in the variant directory path.
Change your top level SConstruct to this...
SConscript('dir1/proj/SConscript', variant_dir='build/out', src_dir='dir1', duplicate=0)
And change your dir1/proj/SConscript to this...
src = Dir('.').srcnode().abspath + '/../../build/out/mysrc.cpp'
StaticLibrary('mylib', src)
Then when you run scons, you will see the following.
>> scons --version
SCons by Steven Knight et al.:
script: v2.3.6.rel_2.3.5:3347:d31d5a4e74b6[MODIFIED], 2015/07/31 14:36:10, by bdbaddog on hpmicrodog
engine: v2.3.6.rel_2.3.5:3347:d31d5a4e74b6[MODIFIED], 2015/07/31 14:36:10, by bdbaddog on hpmicrodog
engine path: ['/usr/lib/scons/SCons']
Copyright (c) 2001 - 2015 The SCons Foundation
>> tree
.
├── dir1
│   ├── mysrc.cpp
│   └── proj
│       └── SConscript
└── SConstruct
2 directories, 3 files
>> scons
scons: Reading SConscript files ...
scons: done reading SConscript files.
scons: Building targets ...
g++ -o build/out/mysrc.o -c dir1/mysrc.cpp
ar rc build/out/proj/libmylib.a build/out/mysrc.o
ranlib build/out/proj/libmylib.a
scons: done building targets.
>> tree
.
├── build
│   └── out
│       ├── mysrc.o
│       └── proj
│           └── libmylib.a
├── dir1
│   ├── mysrc.cpp
│   └── proj
│       └── SConscript
└── SConstruct
5 directories, 5 files
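As an aside, the same result can usually be had without building an absolute path by hand: once src_dir='dir1' is set, SCons maps paths given relative to the SConscript's variant location back into the source tree. The following is a hedged sketch of that simplification (untested against this exact tree; verify against your SCons version):

```python
# dir1/proj/SConscript -- hypothetical simplification (untested):
# with variant_dir='build/out' and src_dir='dir1', '.' here refers to
# build/out/proj, so '../mysrc.cpp' resolves to build/out/mysrc.cpp,
# which SCons maps back to dir1/mysrc.cpp for compilation.
StaticLibrary('mylib', ['../mysrc.cpp'])
```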

Related

How do you force automake to copy shared objects to /usr/local/lib?

I currently have a C++ project which depends on some external shared objects (.so). My current directory tree looks like this:
├── src
│   └── .cpp files
├── include
│   ├── glad
│   │   └── .h files
│   └── fmod
│       ├── core
│       │   └── .h files
│       └── studio
│           └── .h files
├── lib
│   └── fmod
│       ├── core
│       │   └── .so files
│       └── studio
│           └── .so files
├── Makefile.am
├── configure.ac
I want to compile this project while simultaneously copying those .so files to /usr/lib or /usr/local/lib, but I can't seem to manage it!
The following is my configure.ac file
AC_INIT([autoGL], 1.0)
AM_INIT_AUTOMAKE
AC_PROG_CC
AC_PROG_CXX
AC_CONFIG_FILES(Makefile)
AC_OUTPUT
And my Makefile.am
bin_PROGRAMS = autogl
autogl_SOURCES = src/Source.cpp
autogl_SOURCES+= src/glad.c
autogl_LDADD = -lglfw -ldl
autogl_LDADD+= -L lib/fmod/core -lfmod
autogl_LDADD+= -L lib/fmod/studio -lfmodstudio
autogl_LDFLAGS = -Wl,--no-as-needed,-rpath,lib/fmod/core,-rpath,lib/fmod/studio
autogl_CPPFLAGS = -I include
autogl_CPPFLAGS+= -I include/fmod/core
autogl_CPPFLAGS+= -I include/fmod/studio
autogl_CPPFLAGS+= -I include/fmod/fsbank
You can see that I'm linking every library using link flags of the form -L lib/fmod/… -l<library>. Initially, the seventh line of my Makefile.am was only
autogl_LDFLAGS = -Wl,--no-as-needed
resulting in the following g++ command, which was successful and gave me an executable file:
g++ -g -O2 -Wl,--no-as-needed -o autogl autogl-Source.o autogl-glad.o -lglfw -ldl -L lib/fmod/core -lfmod -L lib/fmod/studio/ -lfmodstudio
However, when I tried to run it, I would get the following error:
./autogl: error while loading shared libraries: libfmod.so.12: cannot open shared object file: No such file or directory
My shared objects are not being copied to /usr/lib or /usr/local/lib.
With the addition of
autogl_LDFLAGS = -Wl,--no-as-needed,-rpath,lib/fmod/core,-rpath,lib/fmod/studio
since the rpath now points at our in-tree lib directories, the program has no problem running. However, if I run make install, the rpaths being linked resolve to /usr/bin/lib/fmod/core and /usr/bin/lib/fmod/studio, which clearly don't contain the needed files. My .so files are still not being copied anywhere. I want to copy my .so files directly to /usr/local/lib so that my program can run without me having to link them by hand.
How can I force automake to copy these .so files directly to a folder of my choice? (preferable /usr/local/lib).
Found a solution!
Automake also offers the possibility to install data files. I added the following to my Makefile.am:
flashdir = $(prefix)/lib
flash_DATA = lib/fmod/core/libfmodL.so \
             lib/fmod/core/libfmodL.so.12 \
             lib/fmod/core/libfmodL.so.12.10
.....
This adds all of my .so files to $(prefix)/lib, which usually is /usr/local/lib.
However, there is a problem, particularly on Ubuntu, where /usr/local/lib is not listed by default under /etc/ld.so.conf.d, so libraries in /usr/local/lib are not picked up.
To solve this I added the following lines to my Makefile.am (note that the recipe line must be indented with a tab):
install-data-hook:
	ldconfig $(prefix)/lib
This creates a hook which runs AFTER the lib files have been installed into $(prefix)/lib; running ldconfig on that directory refreshes the dynamic linker's cache so the new libraries are found. Now, after make install, everything runs smoothly.
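Putting the pieces together, the relevant Makefile.am additions look like this (a consolidated sketch of the fragments above; flashdir/flash_DATA are just the names chosen earlier, and the ldconfig recipe line must be indented with a hard tab):

```makefile
# Install the bundled shared objects into $(prefix)/lib
flashdir = $(prefix)/lib
flash_DATA = lib/fmod/core/libfmodL.so \
             lib/fmod/core/libfmodL.so.12 \
             lib/fmod/core/libfmodL.so.12.10

# Refresh the dynamic linker cache once the libraries are installed
install-data-hook:
	ldconfig $(prefix)/lib
```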

rsync --exclude and --exclude-from not working

I've created a simple test directory:
├── backup.tgz
├── Dummy-dir
│   ├── dummy-file-a.txt
│   ├── dummy-file-b.txt
│   ├── Images
│   │   ├── imgA.jpg
│   │   └── imgB.jpg
│   ├── Music
│   │   ├── Una Mattina - ludovico Einaudi (Jimmy Sax Impro live).mp3
│   │   └── Worakls - Blue ( Jimmy Sax live).mp3
│   └── Videos
│       ├── IMG_0001.MOV
│       └── IMG_5377.mov
├── Dummy-target
├── excludes.txt
To test, I rsynced Dummy-dir to Dummy-target.
I ran multiple tests:
rsync -avz --exclude ./Dummy-dir/Images ./Dummy-dir/ ./Dummy-target/
rsync -avz --exclude=./Dummy-dir/Images ./Dummy-dir/ ./Dummy-target/
rsync -avz --exclude-from=./excludes.txt ./Dummy-dir/ ./Dummy-target/
I tested with both relative and absolute paths.
No matter what I try, it doesn't seem to work.
What is going on?
This should exclude the Images folder:
rsync -avz --exclude 'Images' ./Dummy-dir/ ./Dummy-target/
The pattern should be given relative to the source directory, without the source path itself.
and if you want to exclude multiple files / directories:
rsync -avz --exclude={'Images','Music','dummy-file-a.txt'} src-directory/ dest-directory/
There must be no spaces in the list! It will not work if there are!
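The same rule applies to --exclude-from: each line in the file is a pattern matched relative to the transfer root, not a filesystem path. A hypothetical excludes.txt for the tree above would therefore contain bare names (the trailing slash restricts a pattern to directories):

```
Images/
Music/
dummy-file-a.txt
```

used as: rsync -avz --exclude-from=excludes.txt ./Dummy-dir/ ./Dummy-target/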

Updating the script present in Poky Source code

meta/recipes-core/initrdscripts/files/init-install-efi.sh is used for formatting and creating partitions.
I have modified this file to create one more partition for software update.
Can I copy the newly updated script into my own custom layer as recipes-core/initrdscripts/files/init-install-efi.sh?
Will that override init-install-efi.sh? If not, how can I achieve this? I don't want to touch the Poky source code, as that is fetched using the repo utility.
$ tree meta-ncr/
meta-ncr/
├── conf
│   ├── bblayers.conf
│   ├── layer.conf
│   └── machine
│       └── panther2.conf
├── recipes-core
│   └── initrdscripts
│       ├── files
│       │   └── init-install-efi.sh
│       └── initramfs-live-install-efi_1.0.bbappend
└── scripts
    └── setup-environment
$ cat meta-ncr/recipes-core/initrdscripts/initramfs-live-install-efi_1.0.bbappend
FILESEXTRAPATHS_prepend := "${THISDIR}/files:"
SRC_URI = "file://init-install-efi.sh"
After debugging, I found that it is copying the script present in the meta-intel layer and not of my layer.
This is from the output of bitbake-layers show-appends
initramfs-live-install-efi_1.0.bb:
/home/jamal/repo_test/sources/meta-intel/recipes-core/initrdscripts/initramfs-live-install-efi_%.bbappend
/home/jamal/repo_test/sources/meta-ncr/recipes-core/initrdscripts/initramfs-live-install-efi_1.0.bbappend
Can you please tell me what changes are required for my bbappend to take effect instead of meta-intel's?
Yocto provides the bbappend mechanism to achieve this without touching the metadata from Poky. Please follow these few steps:
Create a new layer, or use your existing one.
In this layer, create a bbappend file for initramfs-module-install-efi_1.0.bb or initramfs-live-install-efi_1.0.bb (I found that these recipes are based on this script), with the content:
$ cat meta-test/recipes-core/initrdscripts/initramfs-live-install-efi_1.0.bbappend
FILESEXTRAPATHS_prepend := "${THISDIR}/files:"
SRC_URI = "file://init-install-efi.sh"
Move the modified script file under the files directory. Your meta layer structure should look like this:
$ tree meta-test/
meta-test/
├── conf
│   └── layer.conf
├── COPYING.MIT
├── README
└── recipes-core
    └── initrdscripts
        ├── files
        │   └── init-install-efi.sh
        └── initramfs-live-install-efi_1.0.bbappend
4 directories, 5 files
Then finally, after running the do_unpack task on the initramfs-live-install-efi recipe, you will find your modified file in the recipe's workspace:
$ bitbake -c unpack initramfs-live-install-efi
Test:
$ cat tmp/work/i586-poky-linux/initramfs-live-install-efi/1.0-r1/init-install-efi.sh
#!/bin/bash
echo "hello"
FILESEXTRAPATHS is used to extend the search path for the do_fetch and do_patch tasks.
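As for why the meta-intel copy won in the question: both bbappends prepend their own files directory to FILESEXTRAPATHS, so which copy of init-install-efi.sh is found first depends on the order in which the layers' appends are applied, which follows layer priority. It is worth comparing priorities with bitbake-layers show-layers; raising your own layer's priority in its conf/layer.conf should make its path win (a hedged sketch; the value 10 is only an illustrative choice, pick one higher than meta-intel's):

```
# meta-ncr/conf/layer.conf -- illustrative priority bump
BBFILE_PRIORITY_meta-ncr = "10"
```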

data_files differences between pip and setuptools

I have a Python application that comes with a setup.py script and can be installed via Pip or setuptools. However, I'm finding some annoying differences between the two methods and I want to know the correct way of distributing data-files.
import glob
import setuptools
long_description = ''
setuptools.setup(
    name='creator-build',
    version='0.0.3-dev',
    description='Meta Build System for Ninja',
    long_description=long_description,
    author='Niklas Rosenstein',
    author_email='rosensteinniklas@gmail.com',
    url='https://github.com/creator-build/creator',
    py_modules=['creator'],
    packages=setuptools.find_packages('.'),
    package_dir={'': '.'},
    data_files=[
        ('creator', glob.glob('creator/builtins/*.crunit')),
    ],
    scripts=['scripts/creator'],
    classifiers=[
        "Development Status :: 5 - Production/Stable",
        "Programming Language :: Python",
        "Intended Audience :: Developers",
        "Topic :: Utilities",
        "Topic :: Software Development :: Libraries",
        "Topic :: Software Development :: Libraries :: Python Modules",
    ],
    license="MIT",
)
Using Pip, the files specified in data_files end up in sys.prefix + '/creator'.
Using setuptools (that is, running setup.py directly), the files end up in lib/python3.4/site-packages/creator_build-0.0.3.dev0-py3.4.egg/creator.
Ideally, I would like the files to always end up in the same location, independent from the installation method. I would also prefer the files to be put into the module directory (the way setuptools does it), but that could lead to problems if the package is installed as a zipped Python Egg.
How can I make sure the data_files end up in the same location with both installation methods? Also, how would I know if my module was installed as a zipped Python Egg and how can I load the data files then?
I've been asking around and the general consensus including the official docs is that:
Warning data_files is deprecated. It does not work with wheels, so it should be avoided.
Instead, everyone appears to be pointing towards include_package_data.
There's a drawback here in that it doesn't allow for including things outside of your src root. Which means that if creator sits outside creator-build, it won't be included. Even package_data has this limitation.
The only workaround, if your data files live outside of your source files (for instance, I'm trying to include examples/*.py for a lot of reasons we don't need to discuss), is to hot-swap them in, run the setup, and then remove them.
import setuptools, glob, shutil

with open("README.md", "r") as fh:
    long_description = fh.read()

shutil.copytree('examples', 'archinstall/examples')

setuptools.setup(
    name="archinstall",
    version="2.0.3rc4",
    author="Anton Hvornum",
    author_email="anton@hvornum.se",
    description="Arch Linux installer - guided, templates etc.",
    long_description=long_description,
    long_description_content_type="text/markdown",
    url="https://github.com/Torxed/archinstall",
    packages=setuptools.find_packages(),
    classifiers=[
        "Programming Language :: Python :: 3.8",
        "License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
        "Operating System :: POSIX :: Linux",
    ],
    python_requires='>=3.8',
    package_data={'archinstall': glob.glob('examples/*.py')},
)

shutil.rmtree('archinstall/examples')
This is at best ugly, but works.
My folder structure for reference is (in the git repo):
.
├── archinstall
│   ├── __init__.py
│   ├── lib
│   │   ├── disk.py
│   │   └── exceptions.py
│   └── __main__.py
├── docs
│   ├── logo.png
├── examples
│   ├── guided.py
│   └── minimal.py
├── LICENSE
├── profiles
│   ├── applications
│   │   ├── awesome.json
│   │   ├── gnome.json
│   │   ├── kde.json
│   │   └── postgresql.json
│   ├── desktop.py
│   ├── router.json
│   ├── webserver.json
│   └── workstation.json
├── README.md
└── setup.py
And this is the only way I can see to include, for instance, my profiles as well as my examples without moving them out of the root of the repository (which I'd prefer not to do, as I want users to easily find them when navigating to the repo on GitHub).
And one final note: if you don't mind polluting the src directory (in my case that's just archinstall), you could symlink in whatever you need to include instead of copying it:
cd archinstall
ln -s ../examples ./examples
ln -s ../profiles ./profiles
That way, when setup.py or pip installs the package, the files end up under the package directory as its root.
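The original question also asked how to load the data files once installed, including from a zipped egg. The stdlib importlib.resources API (Python 3.9+ for files()) reads resources through the import system rather than via filesystem paths, so it works the same for unpacked and zipped installs. A minimal sketch, using the stdlib collections package as a stand-in because your own package name is project-specific:

```python
from importlib import resources

# List and read resources shipped inside a package without assuming
# the package is an unpacked directory on disk.
pkg = resources.files("collections")            # stand-in for your package
names = sorted(entry.name for entry in pkg.iterdir())
print("__init__.py" in names)                   # → True

# For a real data file you would use something like (hypothetical names):
# text = resources.files("archinstall").joinpath("examples/guided.py").read_text()
```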

Gradle 1.3: build.gradle not building classes

A newbie question here: I have a build.gradle file with apply plugin: 'java' in it, associated with a Java project/package. When I run gradle build from the command line I get:
:compileJava UP-TO-DATE
:processResources UP-TO-DATE
:classes UP-TO-DATE
:jar UP-TO-DATE
:assemble UP-TO-DATE
:compileTestJava UP-TO-DATE
:processTestResources UP-TO-DATE
:testClasses UP-TO-DATE
:test UP-TO-DATE
:check UP-TO-DATE
:build UP-TO-DATE
BUILD SUCCESSFUL
Total time: 4.312 secs
but when I check the build folder, there are no classes. What am I doing wrong?
I have:
libs, reports, test-results & tmp under the folder, but no classes. The source code is a class with a main method that prints 'Hello world' to the console...
My tree looks like this:
.
├── build
│   ├── libs
│   │   └── gradle-sandbox.jar
│   ├── reports
│   │   └── tests
│   │       ├── base-style.css
│   │       ├── css3-pie-1.0beta3.htc
│   │       ├── index.html
│   │       ├── report.js
│   │       └── style.css
│   ├── test-results
│   └── tmp
│       └── jar
│           └── MANIFEST.MF
├── build.gradle
└── src
    └── org
        └── gradle
            └── example
                └── simple
                    ├── HelloWorld.java
                    └── package-info.java
Without more information, it's hard to say. Maybe you put the source file into the wrong directory (default is src/main/java). Or the internal caches got corrupted (shouldn't happen, but try to delete the .gradle directory).
Do you get class files when you do gradle clean build? Note that a clean build is required whenever you switch between Gradle versions.
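For reference, under the java plugin's default conventions (no custom sourceSets), the hello-world class from the question would be expected at the following layout rather than directly under src:

```
.
├── build.gradle
└── src
    └── main
        └── java
            └── org
                └── gradle
                    └── example
                        └── simple
                            ├── HelloWorld.java
                            └── package-info.java
```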
I had a similar issue where all of my classes were in src/main/java but none of them were showing up in the jar.
The issue was that I was using Groovy. Moving the sources to src/main/groovy resolved it.
I had a similar issue where I was already following the above hierarchy but had also put build.gradle inside src/main/java/{{my project}}. Moving build.gradle to the project's root directory worked for me.
Let me know if it works for anyone else.
