So I just started using Rust, and started using the bellman crate.
I used the MiMC example that was added to the bellman repository, and it seems like it calculates the parameters for the circuit each time you run the example. I want to use the example as a base for my code, and it seems redundant to recalculate them every run for the same circuit, so I wanted to try writing the params to disk and checking each time whether they already exist for a specific circuit (if they were already calculated, read them instead of recalculating).
Assuming params is a structure, I tried using serde and serde_json, but I keep getting the following error:
^^^^^^^ the trait serde::ser::Serialize is not implemented for bellman::groth16::Parameters<pairing::bls12_381::Bls12>
Any thoughts on how I can write it and read it back later efficiently?
Thanks!
serde has Serialize/Deserialize traits which have to be derived or implemented in the crate where the types are defined. So it's usually a good idea to look at the crate's Cargo.toml (or documentation) for a serde feature; it's pretty common practice to offer one (and sometimes you need to enable it manually). For the bellman crate that doesn't seem to be implemented, so you need the workaround for "external" types (explanation). Serde in particular has fairly good support for that, take a look at their docs: you provide a duplicate definition that mirrors the original type and point serde at it with #[serde(with = "...")].
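A minimal sketch of that pattern, using a hypothetical external type other_crate::Duration (the crate and field names are made up for illustration; the remote type's fields have to be accessible to the mirror definition):

use serde::{Deserialize, Serialize};

// Mirror definition of the external type, field for field.
#[derive(Serialize, Deserialize)]
#[serde(remote = "other_crate::Duration")]
struct DurationDef {
    secs: i64,
    nanos: i32,
}

#[derive(Serialize, Deserialize)]
struct Record {
    // Route (de)serialization of this field through the mirror definition.
    #[serde(with = "DurationDef")]
    elapsed: other_crate::Duration,
}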
I have been dabbling in experimental features lately and have been using them for a library I'm building. I am trying to reduce the size of an enum by using ThinBox<[T]> to store the contents in a fixed-length array without the whole const-generics monomorphization business happening in my code (since I need to store this in an enum later and don't want a const generic at the level of the enum).
The closest thing I got to a solution is to ThinBox a fixed-size array (it coerces to a slice). Though that technically does fix the problem of const generics at the type level, I want to find a solution that doesn't require me to pass const generics into a function (since that is a lot less flexible). I also don't want to end up with a ThinBox<&[T]>, since that is two levels of indirection.
Is there a method, safe or unsafe, that can initialize a ThinBox<[T]> without directly hacking the compiler?
You can use ThinBox::new_unsize like this:
ThinBox::<[T]>::new_unsize([/* your array */])
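For example, a minimal sketch (nightly only, behind the unstable thin_box feature):

#![feature(thin_box)]
use std::boxed::ThinBox;

fn main() {
    // The fixed-size array unsizes to [i32], so this is a ThinBox<[i32]>.
    let xs: ThinBox<[i32]> = ThinBox::new_unsize([1, 2, 3]);
    assert_eq!(&*xs, &[1, 2, 3]);
}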
I am working on a Rust program that uses serde_json, and I really like the #[derive(Serialize, Deserialize)] macros that it provides for use with custom structs and enums. The macros work just fine with my own types. However, I would like to be able to call the macros on types from other libraries that I am using.
I would implement the Serialize and Deserialize traits on those types myself, but the code for Deserialize is especially convoluted, and it would be a pain to write for every single library type that I use in a struct.
Is there a way for me to use #[derive] on a struct or enum from a library without editing the actual library's source code?
No, there is not.
See also:
How do I implement a trait I don't own for a type I don't own?
For the specific case of Serde, you can use "remote deriving", but you have to provide a duplicate definition of the type, essentially rewriting the original structure.
Many crates provide a feature flag to enable optional functionality, so you may want to check whether the crate you're using has one for Serde. If it doesn't, you could submit a pull request adding one to the library.
I'm having trouble understanding the usefulness of Rust enums after reading The Rust Programming Language.
In section 17.3, Implementing an Object-Oriented Design Pattern, we have this paragraph:
If we were to create an alternative implementation that didn’t use the state pattern, we might instead use match expressions in the methods on Post or even in the main code that checks the state of the post and changes behavior in those places. That would mean we would have to look in several places to understand all the implications of a post being in the published state! This would only increase the more states we added: each of those match expressions would need another arm.
I agree completely. It would be very bad to use enums in this case because of the reasons outlined. Yet, using enums was my first thought of a more idiomatic implementation. Later in the same section, the book introduces the concept of encoding the state of the objects using types, via variable shadowing.
It's my understanding that Rust enums can contain complex data structures, and different variants of the same enum can contain different types.
What is a real life example of a design in which enums are the better option? I can only find fake or very simple examples in other sources.
I understand that Rust uses enums for things like Result and Option, but those are very simple uses. I was thinking of some functionality with a more complex behavior.
This turned out to be a somewhat open-ended question, but I could not find a useful answer by searching Google. I'm happy to change this question to a more closed version if someone could be so kind as to help me rephrase it.
A fundamental trade-off between these choices in a broad sense has a name: "the expression problem". You should find plenty on Google under that name, both in general and in the context of Rust.
In the context of the question, the "problem" is to write the code in such a way that both adding a new state and adding a new operation on states does not involve modifying existing implementations.
When using a trait object, it is easy to add a state, but not an operation. To add a state, one defines a new type and implements the trait. To add an operation, naively, one adds a method to the trait but has to intrusively update the trait implementations for all states.
When using an enum for state, it is easy to add a new operation, but not a new state. To add an operation, one defines a new function. To add a new state, naively, one must intrusively modify all the existing operations to handle the new state.
If I explained this well enough, hopefully it should be clear that both will have a place. They are in a way dual to one another.
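A minimal sketch of the two shapes, using a made-up post-state example:

// Enum version: a new operation is just one new function with a match,
// but a new state means touching every existing match.
enum State {
    Draft,
    Published,
}

fn can_view(state: &State) -> bool {
    match state {
        State::Draft => false,
        State::Published => true,
    }
}

// Trait-object version: a new state is one new type plus an impl,
// but a new operation means extending the trait and every impl.
trait PostState {
    fn can_view(&self) -> bool;
}

struct Draft;
impl PostState for Draft {
    fn can_view(&self) -> bool { false }
}

struct Published;
impl PostState for Published {
    fn can_view(&self) -> bool { true }
}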
With this lens, an enum would be a better fit when the operations on the enum are expected to change more often than the set of variants. For example, suppose you were trying to represent an abstract syntax tree for C++, which changes every three years. The set of AST node types may not change frequently, relative to the set of operations you may want to perform on AST nodes.
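For instance, a toy version of the AST case, where each new operation over the node enum is just another function with its own match:

enum Expr {
    Literal(i64),
    Add(Box<Expr>, Box<Expr>),
}

fn eval(e: &Expr) -> i64 {
    match e {
        Expr::Literal(n) => *n,
        Expr::Add(a, b) => eval(a) + eval(b),
    }
}

fn pretty(e: &Expr) -> String {
    match e {
        Expr::Literal(n) => n.to_string(),
        Expr::Add(a, b) => format!("({} + {})", pretty(a), pretty(b)),
    }
}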
With that said, there are solutions to the more difficult options in both cases, but they remain somewhat more difficult. And what code must be modified may not be the primary concern.
At the beginning of my program, I read data from a file:
let file = std::fs::File::open("data/games.json").unwrap();
let data: Games = serde_json::from_reader(file).unwrap();
I would like to know how it would be possible to do this at compile time for the following reasons:
Performance: no need to deserialize at runtime
Portability: the program can be run on any machine without needing to ship the JSON data file alongside it.
It might also be useful to mention that the data is read-only, which means the solution can store it as a static.
This is doable, but leads to some potential issues. First we need to decide something: do we want to generate the tree of objects at compile time, or embed the raw file and parse it at startup?
99% of the time, parsing on boot into a static ref is enough for people, so that's the solution I'll give you here; I will point you to the "other" version at the end, but that requires a lot more work and is domain-specific.
The macro you are looking for (because it has to be a macro) to include a file at compile time is in the standard library: std::include_str!. As the name suggests, it takes your file at compile time and generates a &'static str from it for you to use. You are then free to do whatever you like with it (such as parsing it).
From there, it is a simple matter to then use lazy_static! to generate a static ref to our JSON Value (or whatever it may be that you decide to go for) for every part of the program to use. In your case, for instance, it could look like this:
use lazy_static::lazy_static;
use serde::{Deserialize, Serialize};

const GAME_JSON: &str = include_str!("my/file.json");

#[derive(Serialize, Deserialize, Debug)]
struct Game {
    name: String,
}

lazy_static! {
    static ref GAMES: Vec<Game> = serde_json::from_str(GAME_JSON).unwrap();
}
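Accessing it later is then just like using any other static; the Vec is parsed lazily on first use:

fn main() {
    println!("loaded {} games", GAMES.len());
}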
You need to be aware of two things when doing this:
This will massively bloat your binary size, as the &str isn't compressed in any way. Consider gzip if that matters (see the sketch after these two points).
You'll need to worry about the usual concerns around multiple threads accessing the same static ref, but since it isn't mutable you only really need to worry about a portion of them.
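If binary size matters, one option (a sketch assuming the flate2 crate, a pre-compressed my/file.json.gz next to the source file, and the Game type from above) is to embed the compressed bytes and decompress once at startup:

use std::io::Read;
use flate2::read::GzDecoder;

static GAME_JSON_GZ: &[u8] = include_bytes!("my/file.json.gz");

fn load_games() -> Vec<Game> {
    let mut json = String::new();
    // Decompress the embedded bytes back into the JSON text...
    GzDecoder::new(GAME_JSON_GZ)
        .read_to_string(&mut json)
        .expect("embedded data should be valid gzip");
    // ...and parse it as before.
    serde_json::from_str(&json).expect("embedded data should be valid JSON")
}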
The other way requires dynamically generating your objects at compile time using a procedural macro. As stated, I wouldn't recommend it unless you really do have an expensive startup cost when parsing that JSON; most people will not, and the last time I needed this was when dealing with deeply nested, multi-GB JSON files.
The crates you want to look out for are proc_macro2 and syn for the code generation; the rest is very similar to how you would write a normal method.
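For a flavor of that route, here is a hypothetical sketch only; it additionally assumes the quote crate, and a real build-time generator would iterate over the parsed JSON rather than hard-code values:

use quote::quote;

// Emit the tokens `Game { name: "...".to_string() }` for one entry.
fn generate_game_expr(name: &str) -> proc_macro2::TokenStream {
    quote! {
        Game { name: #name.to_string() }
    }
}

fn main() {
    println!("{}", generate_game_expr("chess"));
}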
When you are deserializing something at runtime, you're essentially building some representation in program memory from another representation on disk. But at compile time there's no notion of "program memory" yet, so where would this data deserialize to?
However, what you're trying to achieve is, in fact, possible. The main idea is the following: to create something in program memory, you must write some code which creates that data. What if you could generate that code automatically, based on the serialized data? That's what the uneval crate does (disclaimer: I'm the author, so you're encouraged to look through the source to see if you can do better).
To use this approach, you'll have to create build.rs with approximately the following content:
// somehow include the Games struct with its Serialize and Deserialize implementations
fn main() {
    // Parse the JSON while the crate is being built...
    let games: Games = serde_json::from_str(include_str!("data/games.json")).unwrap();
    // ...and emit Rust code that reconstructs it into $OUT_DIR/games.rs.
    uneval::to_out_dir(games, "games.rs");
}
And in your initialization code you'll have the following:
let data: Games = include!(concat!(env!("OUT_DIR"), "/games.rs"));
Note, however, that this might be fairly hard to do in an ergonomic way, since the necessary struct definitions now must be shared between build.rs and the crate itself, as I mentioned in the comment. It might be a little easier if you split your crate in two, keeping the struct definitions (and only them) in one crate and the logic which uses them in another. There are other ways - with include! trickery, or by using the fact that the build script is an ordinary Rust binary and can include other modules as well - but these would complicate things even more.
I am starting out learning Rust macros, but the documentation is somewhat limited. Which is fine; they're an expert feature, I guess. While I can do basic code generation, implementing traits, and so on, some of the built-in macros seem to go well beyond that, such as the various print macros, which examine a string literal and use it to drive code expansion.
I looked at the source for print! and it calls another macro called format_args!. Unfortunately this doesn't seem to be written in "pure Rust"; the comment just says "compiler built-in".
Is it possible to write something as complex as print! in a pure Rust macro? If so, how would it be done?
I'm actually interested in building a "compile-time trie": basically recognizing certain fixed strings as "keywords" fixed at compile time. This would probably be performant, but mostly I'm just interested in the code generation.
format_args! is implemented in the compiler itself, in the libsyntax_ext crate. The name is registered in the register_builtins function, and the code that processes it has its entry point in the expand_format_args function.
Macros that do such detailed syntax processing cannot be defined using the macro_rules! construct. They can be defined with a procedural macro; however, this feature is currently unstable (can only be used with the nightly compiler and is subject to sudden and unannounced changes) and rather sparsely documented.
Rust macros cannot parse the contents of a string literal, so it's not possible to create a direct Rust equivalent of format_args!.
What you could do is use a macro to transform the function-call-like syntax into something that represents the variadic argument list in the Rust type system in some way (say, as a heterogeneous singly linked list, or a builder type). This can then be passed to a regular Rust function, along with the format string. But you will not be able to implement compile-time type checking of the format string this way.
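A minimal sketch of that idea, with made-up names: a macro_rules! macro packs the arguments into nested tuples, and a plain function walks them at runtime (nothing about the format string is checked at compile time):

// Pack a variadic-looking call into nested tuples: ("a", (1, (2.5, ()))).
macro_rules! my_args {
    () => { () };
    ($head:expr $(, $tail:expr)*) => {
        ($head, my_args!($($tail),*))
    };
}

// A "heterogeneous list" that the runtime side can traverse.
trait ArgList {
    fn len(&self) -> usize;
}

impl ArgList for () {
    fn len(&self) -> usize { 0 }
}

impl<H, T: ArgList> ArgList for (H, T) {
    fn len(&self) -> usize { 1 + self.1.len() }
}

// A regular function takes the format string plus the packed arguments.
fn count_args<T: ArgList>(_fmt: &str, args: T) -> usize {
    args.len()
}

fn main() {
    let n = count_args("{} {} {}", my_args!("a", 1, 2.5));
    assert_eq!(n, 3);
}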