Location of libtensorflow.so and headers after building TensorFlow r1.12 with Bazel on Linux

After having a lot of trouble building earlier versions of TensorFlow using CMake, I decided to give Bazel a go, since it is supposedly able to create a shared library. As per the official recommendation I downloaded and built Bazel 0.15 and then used
bazel build //tensorflow:libtensorflow.so
in the hope of building a shared library. After almost two hours Bazel claimed that it had built libtensorflow.so; however, I cannot find it anywhere. This is especially strange since the whole directory is only about 650 MB. Earlier I built TensorFlow r1.10 using CMake, which generated a libtensorflow.so (one that does not work in my test project, for unrelated reasons), and that alone was over 800 MB; the whole CMake directory was over 11 GB in size.
Furthermore, my test project (which actually works under Windows with an earlier version of TensorFlow) requires some headers, such as
tensorflow/core/protobuf/meta_graph.pb.h
but this file does not seem to have been generated either, because I cannot find it.
Can someone please tell me the correct way to get a shared library and the necessary headers, or where to find them after the supposedly successful Bazel build?
Cheers

Alright, so I have now found out that the find command doesn't follow symlinks by default, and so I was able to locate libtensorflow.so (albeit a much smaller one, about 100 MB in size) and some headers in one of the symlinked directories that Bazel creates in your working path, i.e. bazel-bin, bazel-out, etc.
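This is exactly the default behaviour of find: without -L it does not descend into symlinked directories (the -P default), which is why the library under bazel-bin was invisible. A self-contained illustration with made-up paths (not the real Bazel layout):

```shell
# A real directory holding the artifact, reached only via a symlink:
mkdir -p /tmp/find_demo/out /tmp/find_demo/src
touch /tmp/find_demo/out/libtensorflow.so
ln -sfn /tmp/find_demo/out /tmp/find_demo/src/bazel-bin

# Default behaviour: the symlinked directory is not entered, so
# this prints nothing:
find /tmp/find_demo/src -name 'libtensorflow.so'

# With -L, find follows the symlink and locates the file:
find -L /tmp/find_demo/src -name 'libtensorflow.so'
```

On a real checkout the equivalent would be running `find -L . -name 'libtensorflow.so'` from the workspace root.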
However, I am now stuck with another problem. As I mentioned above, there were some headers, but not all. For instance, I cannot find
google/protobuf/stubs/common.h
Does anyone know how I can get all the rest of the headers, like the one mentioned above, as well as Eigen, Tensor, and so on? What Bazel target do I need to specify, or how do I get them otherwise?
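Third-party headers such as protobuf's typically live under the external repository trees that Bazel symlinks into the workspace (for TF of that era, something like bazel-tensorflow/external/protobuf_archive/src; the exact repository names vary by version, so treat the path as an assumption to verify with `find -L`). One way to assemble a single include tree is to copy headers out while preserving their relative paths. A sketch using a mock layout so it is self-contained:

```shell
# Mock stand-in for bazel-<workspace>/external/<repo>/src; the
# directory names are illustrative, not the guaranteed TF layout:
src=/tmp/hdr_demo/external/protobuf_archive/src
mkdir -p "$src/google/protobuf/stubs"
touch "$src/google/protobuf/stubs/common.h"

inc=/tmp/hdr_demo/include
mkdir -p "$inc"

# Copy every header, preserving its relative path under the include
# root (cp --parents is GNU coreutils, fine on Linux):
(cd "$src" && find . -name '*.h' -exec cp --parents {} "$inc" \;)

ls "$inc/google/protobuf/stubs/common.h"
```

Repeating that copy for each external repository you need (protobuf, Eigen, etc.) yields one -I directory for your test project.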

Related

log installed files when compiling

I'm currently building LFS and looking for a package management solution,
specifically a program that keeps track of what files got installed when you compiled
something from source, and that also has a method for removing those files in case make uninstall isn't present.
I have looked into programs like install-log and checkinstall but couldn't get either to compile.
Any help is appreciated, thanks!
Personally, I have always used the User Based Management (aka "Package Users"), as described in section 8.2. Package Management. It doesn't get much love, even from the LFS community, despite it (still) being in the book and being the only method "unique to LFS".
It's not great for the first timer as it will really force you to dig deep at times to solve issues and make important decisions. I suggest you complete your first LFS system build, then consider Package Users your next time around.
But once you get used to it, it works great.
Another, simpler, method is the Timestamp Based technique (described in the same link).
For example, when it comes time to copy a package's files to your system, you can do something like this:
touch timestamp_start
make install
# do other stuff as instructed
touch timestamp_stop
find / -newer timestamp_start -not -newer timestamp_stop > list_of_files_affected
And I do use this approach when installing the Nvidia proprietary drivers, because successfully installing them with a non-root account is a real pain.

Handling autoconf with Android after NDK16

I'm trying to update an existing configuration we have for cross compiling to a number of targets; the question here is specifically about Android. More specifically, we are building code using CMake and the Hunter package manager. However, we are building ICU using a link that uses autoconf/configure, called from CMake. I'm not sure that is specifically important, except that we have less control over the use of configure than is generally the case.
OK: we have a version that builds against an old NDK, but I am updating and have hit a problem identified by https://android.googlesource.com/platform/ndk/+/master/docs/UnifiedHeaders.md: with NDK16 and later, the value of the sysroot parameter needs to vary between compilation and linkage. As it stands, the configure script tries to build a small program, conftest.c, and the program fails to link. Manually I can compile the code in two stages using -c and then linking the resulting .o, but that is not what configure is trying to do.
Now the reality is that when I build this code, I don't actually need to link the code - I am generating a library which is used elsewhere. However that is not currently the way that configure sees it.
I may look to redo the configuration script to just check that the code can be compiled when cross compiling. However, I am curious to know if anybody has managed to handle this sort of thing by keeping the existing config files and just changing the parameters with which the scripts are called.
When r19 releases to stable this problem will go away on its own (https://github.com/android-ndk/ndk/issues/780), but since that's still in beta it's not a good solution just yet.
Prior to r19 (this isn't really unique to r16+, this has always been the case and it was just asymptomatic previously), autoconf builds should be done using a standalone toolchain.
However, you should not use a standalone toolchain for CMake, so odds are something about your configuration will need to change until r19 is released. Depending on the effort involved, it may make sense to stay on r15 until r19 is available.

How to use theos extra framework?

The Theos documentation lists the following as supported:
Third party frameworks can be placed inside $THEOS/lib, and utilised with instance_EXTRA_FRAMEWORKS. (kirb)
But I am not sure how to make it work or how to troubleshoot it. Can someone explain what this is for and how to use it? If I have already built a binary and the binary needs some frameworks, how do I do that?
I tried to follow the samples, putting the frameworks under $THEOS/lib and adding the flag, but at run time (for example, when adding AWSCore.framework and AWSS3.framework) it reports "library not loaded, image not found".
I need to understand how the framework is added to my binary, what the run path is, and how to debug where it goes wrong. Does the binary already contain the framework, or should I copy it somewhere? Thank you.
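"Library not loaded ... image not found" generally means the binary does not contain the framework: it only records load paths, and the dynamic loader could not find the framework at any of them on the device (on Mach-O, otool -L shows the recorded paths and install_name_tool rewrites them). The same run-path idea can be demonstrated on Linux ELF as an analogy; everything below is a generic stand-in, not Theos-specific:

```shell
mkdir -p /tmp/rpath_demo/lib
cd /tmp/rpath_demo

cat > libhello.c <<'EOF'
int hello(void) { return 42; }
EOF
cat > main.c <<'EOF'
int hello(void);
int main(void) { return hello() == 42 ? 0 : 1; }
EOF

# Build the shared library our binary will depend on:
cc -shared -fPIC libhello.c -o lib/libhello.so

# Linking succeeds (-L is enough at link time), but no run path is
# recorded, so the loader fails at startup: the equivalent of
# "image not found".
cc main.c -L lib -lhello -o main_norpath
./main_norpath 2>&1 || true

# Embedding a run path tells the loader where to look:
cc main.c -L lib -lhello -Wl,-rpath,"$PWD/lib" -o main_rpath
./main_rpath && echo "loaded"
```

So the framework itself must end up on the device at a location the binary's recorded load paths (or run-path entries) actually point to.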

Building libharu from scratch

Recently I have been trying to build and use the libharu library in order to create PDFs from bitmaps.
I've done some research through its site: http://libharu.org/.
There are instructions showing how to build it, but it doesn't build because it has dependencies on two other libraries (which I don't understand how to integrate into the build process): zlib and libpng.
I can't clearly understand the entire process, so my last hope is that someone who has built it from scratch could explain it to me or provide some details about the building process.
LibHaru was forked after 2.0.8. The later versions use a build system whose code seems to have changed; the first of the new variants was 2.10.0. The old version is on SourceForge.
I couldn't get the later version to compile, but 2.0.8 (dated 2006) worked. In the past I have seen comments suggesting I am not alone. You are correct that there are no instructions about the dependencies. If you can, you should use the pre-built version, which is mentioned.
From your message I assume you have little software-building experience. Outlining it in a few words is not feasible, but here is a little. The dependent libraries have to be available, either as source for compiling or, occasionally, as pre-built libraries specifically for the compiler/OS you are using; you have to go and get them. Then the compiler system you are using to build libharu has to be able to "see" the dependent libraries, in this case their *.h files. After compiling, the whole lot has to be linked together. None of this is rocket science, but it is a major source of frustration: everything has to be just right, usually with nothing to tell you what is wrong.
And that is why some people favor using a third party "build" tool. If it works.
libharu has two major dependencies, zlib and libpng, both widely used libraries which usually compile easily. I think there are ways to omit them at the cost of some functionality, since they handle the import of bitmaps.
So you have three sets of sources and essentially three libraries, which as a final step are linked together from the libharu source code.
Alternatively you could find a pre-built version.

What is the target of sphinxbase?

I downloaded the AndroidPocketSphinx package along with Pocketsphinx and Sphinxbase and built it per the instructions here.
It builds and runs fine, but now I am trying to understand how all the components fit together to make up this impressive system.
I examined the Sphinxbase directory tree and could not find any binary target that Pocketsphinx and/or AndroidPocketSphinx could reference. That is, I was expecting a .so, .a, .dll, a binary, or something like that, but all I could find were source files in various languages.
I am wondering, how can the system build and run so wonderfully if the very base package doesn't produce any library or similar binary target?
What am I missing?
