Re-export an optional Cargo feature that can be turned off

With the following directory structure:
tree hello_dep
.
├── Cargo.lock
├── Cargo.toml
├── dep_a
│   ├── Cargo.toml
│   └── src
│       └── main.rs
├── dep_b
│   ├── Cargo.toml
│   └── src
│       └── main.rs
└── src
    └── main.rs
And the following dependency chain: hello_dep -> dep_a -> dep_b -> (optional feature) rustc-serialize,
I would like to create a feature in dep_a that re-exports the optional rustc-serialize feature in dep_b.
At the bottom, I have dep_b, which has rustc-serialize as an optional default feature:
# dep_b/Cargo.toml
[package]
name = "dep_b"
version = "0.1.0"
[dependencies]
rustc-serialize = { version = "0.3.19", optional = true }
[features]
default = ["rustc-serialize"]
I would like to create a feature in dep_a to optionally re-export "rustc-serialize". Here is my attempt:
# dep_a/Cargo.toml
[package]
name = "dep_a"
version = "0.1.0"
[dependencies]
dep_b = { version = "0.1.0", path = "../dep_b" }
[features]
rustc-serialize = ["dep_b/rustc-serialize"]
default = ["rustc-serialize"]
However, when I try to add this as a dependency with default features off, using the following Cargo.toml:
# hello_dep/Cargo.toml
[package]
name = "hello_dep"
version = "0.1.0"
[dependencies]
dep_a = { version = "0.1.0", path = "dep_a", default-features = false, optional = true }
cargo build still records rustc-serialize in Cargo.lock. But depending on dep_b directly does correctly avoid pulling in rustc-serialize, with the following line:
dep_b = { version = "0.1.0", path = "dep_b", default-features = false }
Is this a bug in Cargo, or am I doing something wrong? Here is a related question.

In dep_a/Cargo.toml, you did not specify default-features = false on the dep_b dependency. Therefore, the rustc-serialize feature in dep_b is enabled by default. The fact that you included a feature in dep_a to enable dep_b's rustc-serialize doesn't change the fact that it's still enabled when dep_a's feature is not enabled.
Thus, in dep_a/Cargo.toml, you should have:
[dependencies]
dep_b = { version = "0.1.0", path = "../dep_b", default-features = false }
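Putting the answer's fix together with the feature declarations from the question, dep_a/Cargo.toml should end up looking like this (only the default-features = false part is new):

```toml
# dep_a/Cargo.toml
[package]
name = "dep_a"
version = "0.1.0"

[dependencies]
# Opt out of dep_b's default features so rustc-serialize is only pulled in
# when dep_a's own rustc-serialize feature below is enabled.
dep_b = { version = "0.1.0", path = "../dep_b", default-features = false }

[features]
rustc-serialize = ["dep_b/rustc-serialize"]
default = ["rustc-serialize"]
```

With this manifest, `default-features = false` on the dep_a dependency in hello_dep disables the whole chain.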


Bindgen failing because of relative paths in header file

I'm trying to build out some bindings for vmaf, but I've been running into some issues. The header files and the .c files in the vmaf repository live in separate folders. I'm having an issue where the main vmaf.h file references other .h files in the same directory:
#ifndef __VMAF_H__
#define __VMAF_H__
#include <stdint.h>
#include <stdio.h>
#include "libvmaf/compute_vmaf.h"
#include "libvmaf/model.h"
#include "libvmaf/picture.h"
#include "libvmaf/feature.h"
...
This results in the following build error:
vmaf/libvmaf/include/libvmaf/libvmaf.h:25:10: fatal error: 'libvmaf/compute_vmaf.h' file not found
It looks to me like rust-bindgen is looking in the current working directory for the next header file, when in reality it lives in a subdirectory of my project, as a git submodule pointing to the vmaf git repository.
Here's the folder structure:
.
├── src           # lib.rs lives here
├── target
│   └── debug
└── vmaf
    ├── libvmaf
    │   ├── doc
    │   ├── include
    │   │   └── libvmaf   # vmaf.h lives here along with other .h files
    │   ├── src           # various other .h files live here
    │   │   ├── arm
    │   │   ├── compat
    │   │   ├── ext
    │   │   ├── feature
    │   │   └── x86
    │   ├── test
    │   └── tools
and here's my build.rs:
extern crate meson;

use std::env;
use std::path::PathBuf;

fn main() {
    //env::set_var("RUST_BACKTRACE", "1");
    let build_path = PathBuf::from(env::var("OUT_DIR").unwrap());
    _ = build_path.join("build");
    let build_path = build_path.to_str().unwrap();

    println!("cargo:rustc-link-lib=libvmaf");
    println!("cargo:rustc-link-search=native={build_path}");

    meson::build("vmaf/libvmaf", build_path);

    let bindings = bindgen::Builder::default()
        .header("vmaf/libvmaf/include/libvmaf/libvmaf.h")
        .parse_callbacks(Box::new(bindgen::CargoCallbacks))
        .generate()
        .expect("Unable to generate bindings");

    let out_path = PathBuf::from(env::var("OUT_DIR").unwrap());
    bindings.write_to_file(out_path).expect("Couldn't write bindings!")
}
How can I get rust-bindgen to look in that directory rather than in my current working directory?
Should I use cargo:rustc-link-search and point it to the correct directory? Will that mess up linking to the library itself, since I've already used that statement earlier to compile the meson project?
The answer ended up being to pass the -I flag to clang via clang_arg:
// Path to vendored header files
use std::fs::canonicalize;

let headers_dir = PathBuf::from("vmaf/libvmaf/include");
let headers_dir_canonical = canonicalize(headers_dir).unwrap();
let include_path = headers_dir_canonical.to_str().unwrap();

// Generate bindings to libvmaf using rust-bindgen
let bindings = bindgen::Builder::default()
    .header("vmaf/libvmaf/include/libvmaf/libvmaf.h")
    .clang_arg(format!("-I{include_path}")) // New stuff!
    .parse_callbacks(Box::new(bindgen::CargoCallbacks))
    .generate()
    .expect("Unable to generate bindings");

Unable to access lib.rs file from binary

I am unable to access a library from my binary.
Here is what my Cargo.toml looks like:
[package]
name = "app"
version = "0.1.0"
edition = "2021"
[dependencies]
<--snip-->
[lib]
path = "src/lib.rs"
[[bin]]
path = "src/main.rs"
name = "realio"
The application root:
.
├── Cargo.toml
├── src
│   ├── lib.rs
│   └── main.rs
└── test
    └── integration.rs
and my main.rs:
#![crate_name = "realio"]

use env_logger::Env;
use realio::run;
use std::net::TcpListener;

#[tokio::main]
async fn main() -> std::io::Result<()> {
    env_logger::Builder::from_env(Env::default().default_filter_or("info")).init();
    let listener = TcpListener::bind("127.0.0.1:8000").expect("failed to bind port");
    run(listener)?.await
}
However, I get the following error:
error[E0432]: unresolved import `realio`
 --> app/src/main.rs:4:5
  |
4 | use realio::run;
  |     ^^^^^^^^^^^ use of undeclared crate or module `realio`
I would appreciate pointers on this.
You're missing the name for the lib.
This would be right:
[dependencies]
[lib]
name = "realio"
path = "src/lib.rs"
[[bin]]
name = "realio"
path = "src/main.rs"
But you don't need to declare these targets manually if you stick with the main.rs and lib.rs naming convention. Also remember to change the package name (line 2 in your Cargo.toml) to "realio", so your code still works.
You can find more info on that in the Cargo Book: https://doc.rust-lang.org/cargo/guide/project-layout.html
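For comparison, a minimal sketch of the convention-based setup the answer describes (assumption: the package is renamed to realio so the inferred targets match the import in main.rs):

```toml
[package]
name = "realio"
version = "0.1.0"
edition = "2021"

# With src/lib.rs and src/main.rs present, Cargo infers a library target
# and a binary target, both named "realio"; no [lib] or [[bin]] sections
# are needed.
```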

Rust: how to define an internal dependency?

I am trying to achieve the following. I have this project structure:
.
├── Cargo.lock
├── Cargo.toml
└── src
    ├── lib.rs
    ├── mesh
    │   ├── gltf_loader
    │   │   ├── Cargo.toml
    │   │   ├── pluralize
    │   │   └── src
    │   └── mesh.rs
    └── vulkan
        ├── core.rs
        ├── hardware_interface.rs
        ├── image.rs
        ├── memory.rs
        ├── mod.rs
        ├── pipeline.rs
        ├── renderer.rs
        ├── shader_program.rs
        ├── swapchain.rs
        └── tools.rs
Here, pluralize defines a procedural macro that is needed to compile gltf_loader but that must be kept hidden from all other dependencies. So I want to compile gltf_loader as a "subcrate", then use that crate as a dependency for mesh.rs, ideally without changing the directory structure.
To that effect I added this to the root toml:
[dependencies]
gltf_loader = { path = "src/mesh/gltf_loader" }
However, if I do this and try to run an example, I get this error:
error: failed to get `gltf_loader` as a dependency of package `vulkan_bindings v0.1.0 (/home/makogan/rust_never_engine)`

Caused by:
  failed to load source for dependency `gltf_loader`

Caused by:
  Unable to update /home/makogan/rust_never_engine/src/mesh/gltf_loader

Caused by:
  failed to parse manifest at `/home/makogan/rust_never_engine/src/mesh/gltf_loader/Cargo.toml`

Caused by:
  can't find `proc-macros` bin at `src/bin/proc-macros.rs` or `src/bin/proc-macros/main.rs`. Please specify bin.path if you want to use a non-default path.
Why does this happen?
This is the full Cargo.toml:
[package]
name = "vulkan_bindings"
version = "0.1.0"
edition = "2021"
[lib]
name = "vulkan_bindings"
path = "src/lib.rs"
[dependencies]
# Rendering
ash = { version = "0.37.0" }
glfw = { version = "0.45.0", features = ["vulkan"] }
gpu-allocator = "0.18.0"
spirv_cross = { version = "0.23.1", features = ["glsl"] }
shaderc = "0.8.0"
# Math
nalgebra = "*"
nalgebra-glm = "0.3"
nalgebra-sparse = "0.1"
# json
serde_json = "1.0"
serde = { version = "1.0", features = ["derive"] }
data-url = "0.1.1"
# misc
paste = "1.0.8"
termcolor = "1.1.3"
regex = "1.6.0"
add_getters_setters = "1.1.2"
# internal
gltf_loader = { path = "src/mesh/gltf_loader" }

The package import path is different for dynamic codegen and static codegen

Here is the structure of the src directory of my project:
.
├── config.ts
├── protos
│   ├── index.proto
│   ├── index.ts
│   ├── share
│   │   ├── topic.proto
│   │   ├── topic_pb.d.ts
│   │   ├── user.proto
│   │   └── user_pb.d.ts
│   ├── topic
│   │   ├── service.proto
│   │   ├── service_grpc_pb.d.ts
│   │   ├── service_pb.d.ts
│   │   ├── topic.integration.test.ts
│   │   ├── topic.proto
│   │   ├── topicServiceImpl.ts
│   │   ├── topicServiceImplDynamic.ts
│   │   └── topic_pb.d.ts
│   └── user
│       ├── service.proto
│       ├── service_grpc_pb.d.ts
│       ├── service_pb.d.ts
│       ├── user.proto
│       ├── userServiceImpl.ts
│       └── user_pb.d.ts
└── server.ts
share/user.proto:
syntax = "proto3";

package share;

message UserBase {
  string loginname = 1;
  string avatar_url = 2;
}
topic/topic.proto:
syntax = "proto3";

package topic;

import "share/user.proto";

enum Tab {
  share = 0;
  ask = 1;
  good = 2;
  job = 3;
}

message Topic {
  string id = 1;
  string author_id = 2;
  Tab tab = 3;
  string title = 4;
  string content = 5;
  share.UserBase author = 6;
  bool good = 7;
  bool top = 8;
  int32 reply_count = 9;
  int32 visit_count = 10;
  string create_at = 11;
  string last_reply_at = 12;
}
As you can see, I try to import the share package and use the UserBase message type in the Topic message type. When I try to start the server, I get this error:
no such Type or Enum 'share.UserBase' in Type .topic.Topic
But when I change the package import path to a relative path, import "../share/user.proto";, it works fine and the server logs: Server is listening on http://localhost:3000.
Above is the usage of dynamic codegen.
Now I switch to using static codegen. Here is the shell script for generating the code:
protoc \
  --plugin=protoc-gen-ts=./node_modules/.bin/protoc-gen-ts \
  --ts_out=./src/protos \
  -I ./src/protos \
  ./src/protos/**/*.proto
It seems the protocol buffer compiler doesn't support relative paths; I got this error:
../share/user.proto: Backslashes, consecutive slashes, ".", or ".." are not allowed in the virtual path
So I changed the package import path back to import "share/user.proto";. It generated code correctly, but when I try to start my server, I get the same error:
no such Type or Enum 'share.UserBase' in Type .topic.Topic
It's weird.
Package versions:
"grpc-tools": "^1.6.6",
"grpc_tools_node_protoc_ts": "^4.1.3",
protoc --version
libprotoc 3.10.0
UPDATE:
repo: https://github.com/mrdulin/nodejs-grpc/tree/master/src
Your dynamic codegen is failing because you are not specifying the paths to search for imported .proto files. You can do this using the includeDirs option when calling protoLoader.loadSync, which works in a very similar way to the -I option you pass to protoc. In this case, you are loading the proto files from the src/protos directory, so it should be sufficient to pass the option includeDirs: [__dirname]. Then the import paths in your .proto files should be relative to that directory, just like when you use protoc.
You are probably seeing the same error when you try to use the static code generation because it is actually the dynamic codegen error; you don't appear to be removing the dynamic codegen code when trying to use the statically generated code.
However, the main problem you will face with the statically generated code is that you are only generating the TypeScript type definition files. You also need to generate JavaScript files to actually run it. The official Node gRPC plugin for protoc is distributed in the grpc-tools package. It comes with a binary called grpc_tools_node_protoc, which should be used in place of protoc and automatically includes the plugin. You will still need to pass a --js_out flag to generate that code.

Configure repository field on package.json on monorepo

Situation
I have a monorepo created with Lerna containing 40-50 projects. Each has a package.json like this:
{
  "name": "@base-repo/add-class-methods",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT"
}
The folder structure is like this:
packages
├── absolute-url
│   ├── index.js
│   └── package.json
├── add-class-methods
│   ├── index.js
│   └── package.json
├── check-set-relative
│   ├── index.js
│   └── package.json
├── crypto
│   ├── index.js
│   └── package.json
If I push the repo to GitHub, it will have a single GitHub URL. However, I saw that Babel has 142 packages, each with a custom repository field in its package.json:
"repository": "https://github.com/babel/babel/tree/master/packages/babel-types"
I hope they are not setting this value manually for 142 packages; same with my 40 small packages.
I understand I could set them manually in the 3-4 minutes it takes to write this question, but this will get overwhelming when I try to do the same with a 150-package monorepo in the future.
Problem
How can I set/update the repository field without opening each package.json file manually for 40 packages?
What I tried
I set each one manually as far as possible, but things quickly got boring and repetitive, considering I am a programmer. Then I googled for a solution for around an hour. Finally, I wrote the following script:
const glob = require('glob');
const fs = require('fs');
const path = require('path');

const gitUrl = 'https://github.com/user';
const author = `Mr. Github User <user@example.com> (${gitUrl})`;
const basePath = '/utility-scripts/tree/master';
const baseRepo = gitUrl + basePath;

glob('packages/*/package.json', (err, files) => {
  for (const filePath of files) {
    const [parent, pkg] = filePath.split('/');
    const newData = {
      author,
      license: 'MIT',
      repository: `${baseRepo}/${parent}/${pkg}`,
    };
    const data = Object.assign(
      {},
      JSON.parse(fs.readFileSync(path.resolve(filePath), 'utf-8')),
      newData,
    );
    fs.writeFileSync(path.resolve(filePath), JSON.stringify(data, null, 2));
  }
});
Is there an easy way to deal with this? With any kind of shell, git, yarn or npm command?
