Dealing with dependencies of cargo crates - rust

I am new to Rust, so excuse me if I'm just doing things horribly wrong.
While learning the language I wanted to try out bindings for libraries I had already used in other languages, among them SDL2, SFML2, and Gtk3.
To my surprise, nothing seemed to work out of the box. They all depend on C libraries, and those don't come with the Cargo crate. I managed to get SFML2 to work after following the readme and manually copying .lib and .dll files to the right places. I tried to make the Rust linker look in my vcpkg directory for .lib files, sadly with no success.
The whole point of a package manager is, in large part, to automate these things for you. Other package managers like NuGet for C# don't require you to manually fiddle the dependencies for their packages together.
Getting rid of the third-party library management hell of C/C++ was one of the reasons I took a closer look at Rust.
Am I doing something wrong, or is this just how things are with Rust/Cargo?

Cargo is a build management and source package management tool for Rust code - it is not a tool for managing binaries or compiling other languages such as C or C++.
Having said that, it is a very flexible tool so it is possible for crates that provide bindings to libraries written in other languages to "bundle" the libraries they depend on.
The Rust-SDL2 crate, for example, does offer such a feature - as it says in their README:
Since 0.31, this crate supports a feature named "bundled" which
downloads SDL2 from source, compiles it and links it automatically.
To use this, you would add it to your Cargo.toml like this:
[dependencies]
sdl2 = { version = "0.34.0", features=["bundled"] }
Not all such binding crates support bundling, especially if the libraries they bind to are large, complex, have lots of their own dependencies and/or have lots of compile time configuration options.
In those cases you will either need to install a pre-compiled binary, or compile them from source yourself.
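As for the vcpkg attempt: Cargo itself will not search a vcpkg directory, but a build script in your own crate can tell rustc where to look for native libraries. A minimal sketch, assuming a hypothetical vcpkg install path and library name (adjust both to your setup):

// build.rs - minimal sketch; the path and library name below are placeholders
fn main() {
    // Add a directory to the native library search path (.lib files on Windows).
    println!("cargo:rustc-link-search=native=C:/vcpkg/installed/x64-windows/lib");
    // Link a native library by name.
    println!("cargo:rustc-link-lib=SDL2");
}

There is also a vcpkg crate on crates.io, meant to be called from build scripts, which locates libraries in a vcpkg tree so you don't have to hard-code paths like this.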

Related

Cargo: How could I build a single rlib with all dependencies

I build my rlib with cargo build --lib. However, when I use it with rustc main.rs --extern mylib=mylib.rlib, I get a compile error: can't find crate for xxx, where xxx is a crate mylib depends on.
How could I get a rlib with all dependencies included?
How could I get a rlib with all dependencies included?
You can't. An .rlib file is the result of compiling one crate.
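That said, the rustc invocation itself can usually be made to work: rustc also has to find the rlibs of mylib's own dependencies, which Cargo places in target/debug/deps. A sketch, assuming a Cargo-built debug profile:

rustc main.rs --extern mylib=target/debug/libmylib.rlib -L dependency=target/debug/deps

This doesn't give you a single self-contained rlib, but it resolves the "can't find crate" error.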
What is the correct way to share a compiled library across projects, then?
The Rust toolchain is not designed to support this in general.
One of the main reasons for this is that libraries have Cargo features, which define whether certain conditionally-compiled code is enabled, and libraries of the same major version are expected not to be duplicated. Put together, these properties mean you can't expect that compiling a library and its dependencies separately will produce a correct result: some of the included dependencies might be missing features required by the separately compiled dependent, yet must not be duplicated. Compilation needs to look at the entire graph of library crate dependencies.
You can share the cost of compiling one library to be used by another set of packages by putting all those packages in one workspace, but workspaces are designed for compiling a set of closely related packages and are not necessarily suitable for combining arbitrary ones.
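For reference, a workspace is just a directory whose top-level Cargo.toml lists the member packages. A minimal sketch, with hypothetical package names:

[workspace]
members = ["mylib", "app-one", "app-two"]

All members then share a single Cargo.lock and target directory, so one cargo build compiles mylib once for every dependent in the workspace.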

What is the exact difference between a Crate and a Package?

I come from a Java background and have recently started with Rust.
The official Rust doc is pretty self-explanatory except the chapter that explains Crates and Packages.
The official doc complicates it with so many ORs and ANDs while explaining the two.
This reddit post explains it a little better, but is not thorough.
What is the exact difference between a Crate and Package in Rust? Where/When do we use them?
Much thanks!
Crates
From the perspective of the Rust compiler, "crate" is the name of the compilation unit. A crate consists of a hierarchy of modules in one or multiple files. This is in contrast to most "traditional" compiled languages like Java, C or C++, where the compilation unit is a single file.
From the perspective of a user, this definition isn't really helpful. Indeed, in most cases, you will need to distinguish between two types of crates:
binary crates can be compiled to executables by the Rust compiler. For example, Cargo, the Rust package manager, is a binary crate translated by the Rust compiler to the executable that you use to manage your project.
library crates are what you'd simply call libraries in other languages. A binary crate can depend on library crates to use functionality supplied by the libraries.
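A minimal sketch of the two, assuming a package named my_package: the library crate exposes a function, and the binary crate in the same package calls it through the library's crate name.

// src/lib.rs - the package's library crate
pub fn greet(name: &str) -> String {
    format!("Hello, {}!", name)
}

// src/main.rs - the package's binary crate, which depends on the library
fn main() {
    println!("{}", my_package::greet("Rust"));
}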
Packages
The concept of packages does not originate in the Rust compiler, but in Cargo, the Rust package manager. At least for simple projects, a package is also what you will check into version control.
A package consists of one or multiple crates, but no more than one library crate.
Creating packages
to create a new package consisting of one binary crate, you can run cargo new
to create a new package consisting of one library crate, you can run cargo new --lib
to create a package consisting of a library as well as one or multiple binaries, you can run either cargo new or cargo new --lib and then modify the package directory structure to add the other crates (see the layout sketch below)
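A sketch of the resulting layout for a package with one library and two binaries (names hypothetical); Cargo picks up src/main.rs, src/lib.rs and src/bin/*.rs by convention:

my_package/
├── Cargo.toml
└── src/
    ├── lib.rs        (the package's single library crate)
    ├── main.rs       (a binary crate named after the package)
    └── bin/
        └── helper.rs (an additional binary crate named "helper")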
When should you use crates, and when should you use packages?
As you can see now, this question doesn't really make sense – you should and must always use both. A package can't exist without at least one crate, and a crate is (at least if you are using Cargo) always part of a package.
Therefore, a better question is this:
When should you put multiple crates into one package?
There are multiple reasons to have more than one crate in a package. For example:
If you have a binary crate, it is idiomatic to have the "business logic" in a library in the same package. This has multiple advantages:
Libraries can be integration tested while binaries can't
If you later decide that the business logic needs to also be used in another binary, it is trivial to add this second binary to the package and also use the library
If you have a library crate that generates some files (a database engine or something like that), you may want to have a helper binary to inspect those files
Note that if you have a very big project, you may instead want to use the workspace feature of Cargo in these cases.

Avoid dynamic linking in dependencies

I am developing a project against a custom Linux and I am having trouble with dynamic libraries that are referenced by dependencies.
Is there a way to know beforehand whether a dependency pulls in dynamically linked libraries? Is it possible to somehow avoid those libraries? I want to have a static binary (musl didn't work for me, as one dependency doesn't compile with it).
Thanks
If you're compiling against glibc, you'll need to have at least some dynamic linking. While it is possible to statically link glibc, that isn't a supported configuration since the name service switch won't work in such a case.
In general, you should expect a build-dependency on cc or pkg-config to be an indicator of the use of a C or C++ library. That isn't a guarantee either way, but it is probably going to be the case the vast majority of the time. Some of those libraries will be able to be linked statically, but of course if you do that you must recompile your code every time any of your dependencies has a security update or you'll have a vulnerability. There's unfortunately no clear way to tell whether static linking is an option in such a case other than looking at the build.rs or the documentation of the crate.
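For reference, the tell-tale section in a dependency's Cargo.toml looks something like this (the version numbers are illustrative):

[build-dependencies]
cc = "1.0"
pkg-config = "0.3"

Seeing either of these in a crate's manifest is a strong hint that a C or C++ library gets compiled or located at build time.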

Building libharu from scratch

Recently I have been trying to build and use the libharu library in order to create PDFs from bitmaps.
I've done some research through its site: http://libharu.org/.
There are instructions showing how to build it, but it doesn't build because it depends on two other libraries - zlib and libpng - which I don't understand how to integrate into the build process.
I can't clearly understand the entire process, so my last hope is that someone who has built it from scratch can explain it to me or provide some details about the build process.
LibHaru was forked after 2.0.8. The later versions use a build system whose code seems to have changed; the first of the new variants was 2.10.0. The old version is on SourceForge.
I couldn't get a later version to compile, but 2.0.8 (dated 2006) worked. In the past I have seen comments suggesting I am not alone. You are correct that there are no instructions about the dependencies. If you can, you should use the pre-built version, which is mentioned.
From your message I assume you have little software-building experience. It isn't feasible to outline it all in a few words, but here is a little. Dependent libraries have to be available, either as source for compiling, or occasionally as pre-built libraries specifically for the compiler/OS you are using. You have to go and get them. Then the compiler system you are using to build libharu has to be able to "see" the dependent libraries, in this case the *.h files. After compiling, the whole lot has to be linked together. None of this is rocket science, but it is a major source of frustration: everything has to be just right, usually with nothing to tell you what is wrong.
And that is why some people favor using a third-party "build" tool. If it works.
libharu has two major dependencies: zlib and libpng, both widely used libraries which usually compile easily. I think there are ways to omit them at the cost of some functionality - they handle the import of bitmaps.
So you have three sets of sources and essentially three libraries, which as a final step are linked together from the libharu source code.
Alternatively you could find a pre-built version.

Differences between SCons and Shake

I'm working on a Python/Haskell project and I'm looking for alternatives to Makefile. Obvious choices are Python SCons and Haskell Shake. Since I have no experience with either of them, I'd like to ask if there is any comparison of their drawbacks and advantages.
Update: The project has somewhat complex requirements for building:
Let the user configure the build - like options to enable/disable, paths to tools etc.
There are both Haskell and Python files generated at compile time. Their dependencies should work properly.
There are multiple Haskell programs that share most of the source files. I'd like it to work so that:
it's possible to build each one individually, not building the sources that aren't needed;
source files aren't built multiple times when compiling multiple programs;
yet achieve parallelism during compilation, if possible.
Check for several installed programs on target systems and their paths (like python, flock etc.)
Check for dependencies on target systems, both Python and Haskell.
Parametrize the build according to the dependencies - if the dependencies for testing are missing, it should still be possible to build the project, skipping the tests (and informing the user about it).
There is a Why Shake? document that gives reasons to choose Shake over other build systems, but it does not focus on a comparison with SCons.
Update: All of your requirements seem easy enough to express in Shake (ask on StackOverflow if you get stuck with any of them). As to Shake vs SCons:
Shake is particularly good at dealing with generated files with dependencies that cannot be statically predicted, particularly if you are generating the files from programs you compile.
Building the Haskell parts of your project is likely to be harder than building the Python (since Haskell has a richer structure and more complex compiler). Using Shake makes it easier to tap into existing examples of compiling Haskell and use libraries for parsing Haskell if you need it.
There is a SCons wiki page that compares it to other build tools, unfortunately there is no comparison there with Haskell/Shake.
Also, this question may help.
SCons really shines compared to other tools (especially make and cmake) with its Python syntax and an implicit dependency system that is very accurate and easy to use.
