If I have:
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash)]
struct Bytes(u64);
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash)]
struct Seconds(f64);
#[derive(Debug, Clone, Copy, PartialEq, PartialOrd)]
struct BytesPerSecond(f64);
is there a simple way to get something like:
let b = Bytes(1024);
let s = Seconds(2.0);
let bytes = b / b; // yields Bytes
let bps = b / s; // yields BytesPerSecond
The uom crate lets you do things like:
let l1 = Length::new::<meter>(15.0);
let t1 = Time::new::<second>(50.0);
let l2 = l1 / l1;
let v1 = l1 / t1;
Since each trait (Div, in this case) can only have one method, it seems like I would have to use generics to create multiple impls. I'm not sure how that would work in practice, though (and I found uom's source hard to decipher).
Simple: Div is generic over the right-hand operand type, and that type parameter defaults to Self if left unspecified; you pick a different operand by writing Div<RhsType>. The output type must be supplied in either case.
So implementing Bytes / Bytes = Bytes and Bytes / Seconds = BytesPerSecond would look like this:
use std::ops::Div;

impl Div for Bytes {
type Output = Bytes;
fn div(self, rhs: Bytes) -> Bytes {
todo!()
}
}
impl Div<Seconds> for Bytes {
type Output = BytesPerSecond;
fn div(self, rhs: Seconds) -> BytesPerSecond {
todo!()
}
}
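For completeness, here is a minimal usage sketch (mine, not from the answer), assuming the todo!() bodies are filled with the obvious field-wise arithmetic (Bytes(self.0 / rhs.0) and BytesPerSecond(self.0 as f64 / rhs.0)):
fn main() {
    let b = Bytes(1024);
    let s = Seconds(2.0);
    let ratio = b / b; // uses `impl Div for Bytes`, yields Bytes
    let bps = b / s;   // uses `impl Div<Seconds> for Bytes`, yields BytesPerSecond
    assert_eq!(ratio, Bytes(1));
    assert_eq!(bps, BytesPerSecond(512.0));
}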
Using tonic_build I can add .type_attribute(".", "#[derive(serde::Serialize, serde::Deserialize)]") to annotate my types with e.g. serde.
However, is there a way to annotate only the enums, which live in the same package as the structs?
Example:
syntax = "proto3";
package foo.bar;
message A {
Enu b = 1;
}
enum Enu {
FOO = 0;
BAR = 1;
}
In the end I want the generated struct to have the serde derives (which I can already do), but I also want the enum to get one more annotation (strum in this case):
#[derive(serde::Serialize, serde::Deserialize)]
#[allow(clippy::derive_partial_eq_without_eq)]
#[derive(Clone, PartialEq, ::prost::Message)]
pub struct A {
#[prost(enumeration = "Enu", tag = "1")]
pub b: i32,
}
#[derive(serde::Serialize, serde::Deserialize)]
#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash, PartialOrd, Ord, ::prost::Enumeration)]
#[repr(i32)]
#[derive(strum::EnumString)] // <-- This is what I want added only here
pub enum Enu {
FOO = 0,
BAR = 1,
}
Note this was hand-crafted, so I might have missed a character somewhere
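No answer is attached above, but for reference, one approach that should work with the existing type_attribute API is to scope the attribute to the enum's fully qualified path instead of ".". A hypothetical build.rs sketch follows (the proto paths are placeholders; the enum_attribute method mentioned in the comment only exists in newer prost-build/tonic-build releases, so verify it before relying on it):
fn main() -> Result<(), Box<dyn std::error::Error>> {
    tonic_build::configure()
        // serde on every generated type in the package
        .type_attribute(".", "#[derive(serde::Serialize, serde::Deserialize)]")
        // strum only on the one enum, addressed by its fully qualified path
        .type_attribute(".foo.bar.Enu", "#[derive(strum::EnumString)]")
        // newer versions also offer `.enum_attribute(".", "...")`, which
        // applies an attribute to every matched enum but not to messages
        .compile(&["proto/foo.proto"], &["proto"])?;
    Ok(())
}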
I have implemented a Cons list using a Box pointer to the next element in the list as follows.
#[derive(Debug, PartialEq, Eq)]
enum List<T> {
Cons(T, Box<List<T>>),
Nil,
}
impl<T: Copy> List<T> {
pub fn from_slice(v: &[T]) -> List<T> {
let mut inner = List::Nil;
for &next_elem in v.iter().rev() {
inner = List::Cons(next_elem, Box::new(inner));
}
inner
}
}
struct ListIter<'a, T>(&'a List<T>);
impl<'a, T> Iterator for ListIter<'a, T> {
type Item = &'a T;
fn next(&mut self) -> Option<Self::Item> {
match self.0 {
List::Cons(inner, next) => {
self.0 = next;
Some(inner)
},
List::Nil => None
}
}
}
This implementation works the way I want and has all the features I want. But is it possible to achieve the same functionality using a plain shared reference (&List) to represent a pointer to the next node in the List? For example, such that List might be defined as
#[derive(Debug, PartialEq, Eq)]
enum List<'a, T> {
Cons(T, &'a List<'a, T>),
Nil,
}
And if so, what are the costs and benefits of using a plain shared reference as opposed to a Box?
I tried my hand at it but I couldn't find a straightforward way to implement from_slice for List<'a, T> without violating the borrowing rules.
There are two ways to use references: on the stack or with arenas.
On the stack means something like:
let nil = List::Nil;
let cons1 = List::Cons(1, &nil);
let cons2 = List::Cons(0, &cons1);
The advantage is that it involves only stack allocations and therefore should be blazing fast; the disadvantage is that you cannot return the list from the function, and that implies creating from_slice() is impossible.
Another way is to use arenas, a memory allocator without deallocation. For example, using the bumpalo crate:
use bumpalo::Bump;
#[derive(Debug, PartialEq, Eq)]
enum List<'a, T> {
Cons(T, &'a List<'a, T>),
Nil,
}
impl<'a, T: Copy> List<'a, T> {
pub fn from_slice(v: &[T], arena: &'a Bump) -> List<'a, T> {
let mut inner = List::Nil;
for &next_elem in v.iter().rev() {
inner = List::Cons(next_elem, arena.alloc(inner));
}
inner
}
}
struct ListIter<'a, T>(&'a List<'a, T>);
impl<'a, T> Iterator for ListIter<'a, T> {
type Item = &'a T;
fn next(&mut self) -> Option<Self::Item> {
match self.0 {
List::Cons(inner, next) => {
self.0 = next;
Some(inner)
},
List::Nil => None
}
}
}
The advantages are that this is very fast (arena allocation is just some pointer manipulation) and that it allows sharing the tail of similar lists. The disadvantages are that you cannot free only some elements, so don't use it if you are not going to free all (or at least most) elements at once, and that a reference to the arena has to stay around (you cannot store the arena alongside the list, because that would create a self-referential struct).
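A short usage sketch (mine, not from the answer), assuming the List, from_slice and ListIter definitions above and bumpalo as a dependency:
fn main() {
    let arena = Bump::new();
    let list = List::from_slice(&[1, 2, 3], &arena);
    // The iterator borrows the list; the arena must outlive both.
    let collected: Vec<i32> = ListIter(&list).copied().collect();
    assert_eq!(collected, vec![1, 2, 3]);
}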
If I make a list List::Cons(1, &List::Cons(2, &List::Nil)), I have a reference to that inner list &List::Cons(2, &List::Nil), right? That reference is to a List on the stack.
That is to say, List::Cons(1, &List::Cons(2, &List::Nil)) is extremely similar to
let x = List::Nil;
let y = List::Cons(2, &x);
let z = List::Cons(1, &y);
When you return from the function, nothing owns the resulting stack values: x and y are destroyed. Thus you can't return z; it references data that will no longer exist.
I am trying to implement the following trait and struct:
pub trait Funct {
fn arity(&self) -> u32;
}
#[derive(Debug, Hash, Eq, PartialEq, Clone)]
pub struct FunctionLiteral<T: Funct> {
pub function: T,
pub args: Vec< FunctionLiteral<T> >
}
pub enum Foo {
Foo
}
impl Funct for Foo {
fn arity(&self) -> u32 {0}
}
pub enum Bar {
Bar
}
impl Funct for Bar {
fn arity(&self) -> u32 {0}
}
fn main() {
let baz = FunctionLiteral{
function: Foo::Foo,
args: vec![FunctionLiteral{
function: Bar::Bar,
args: vec![]
}]
};
}
I can set it up, as above, so that the generic type T is bound by the Funct trait, but I don't necessarily want function and the elements of args to be the same concrete type.
Here, compiling the code gives the following error:
error[E0308]: mismatched types
--> foo.rs:31:23
|
31 | function: Bar::Bar,
| ^^^^^^^^ expected enum `Foo`, found enum `Bar`
error: aborting due to previous error
Is it possible to set up FunctionLiteral such that I can have different types for function and the items of args, while forcing both of them to be of type Funct?
The problem
When you do:
struct Structure<T: Trait> {
inner: T,
many: Vec<T>
}
You are telling the compiler to create a specialized instance for each different T. So if you have Foo and Bar both implementing Trait, then the compiler will generate two different representations, with two different sizes:
struct Foo(u8);
impl Trait for Foo{
// impl goes here
}
struct Bar(u64);
impl Trait for Bar{
// impl goes here
}
Then the compiler will generate something like:
Structure<Foo>{
inner: Foo,
many: Vec<Foo>
}
// and
Structure<Bar>{
inner: Bar,
many: Vec<Bar>
}
Obviously you cannot put Foo instances into a Structure<Bar> (or vice versa), as they are different types and have different sizes.
The solution
You need to Box<> your Funct types in order to make them the same size (i.e. pointer sized). By putting them behind a (smart) pointer you are essentially erasing their types:
let a: Box<dyn Trait> = Box::new(Foo(0));
let b: Box<dyn Trait> = Box::new(Bar(0));
Now both a and b have the same size (the size of a pointer) and have the same type - Box<dyn Trait>. So now you can do:
struct Structure{ // no longer generic!
inner: Box<dyn Trait>, // can hold any `Box<dyn Trait>`
many: Vec<Box<dyn Trait>> // can hold any `Box<dyn Trait>`
}
The downside of this approach is that it requires heap allocation and that it loses the exact type of a and b. You no longer know whether a is a Foo, a Bar, or something else.
Instead of Box you can use any other smart pointer, such as Rc or Arc if you need its functionality.
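Applied to the FunctionLiteral from the question, a minimal sketch could look like this (note that the derives from the original struct are dropped, since Box<dyn Funct> does not implement Hash, Eq, etc.):
pub trait Funct {
    fn arity(&self) -> u32;
}

pub struct FunctionLiteral {
    pub function: Box<dyn Funct>,
    pub args: Vec<FunctionLiteral>,
}

pub enum Foo { Foo }
impl Funct for Foo {
    fn arity(&self) -> u32 { 0 }
}

pub enum Bar { Bar }
impl Funct for Bar {
    fn arity(&self) -> u32 { 0 }
}

fn main() {
    // `function` and the nested literals can now hold different concrete types
    let baz = FunctionLiteral {
        function: Box::new(Foo::Foo),
        args: vec![FunctionLiteral {
            function: Box::new(Bar::Bar),
            args: vec![],
        }],
    };
    assert_eq!(baz.function.arity(), 0);
}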
Another option
Another option is to make Foo and Bar the same size and type. This can be done by wrapping them in an enum:
enum Holder{
Foo(Foo),
Bar(Bar), // any other types go here in their own variants
}
Then your structure will look like:
struct Structure{ // no longer generic!
inner: Holder, // can hold any Holder variant
many: Vec<Holder> // can hold any Holder variant
}
The downside is that you have to either implement a delegation like:
impl Trait for Holder{
fn some_method(&self){
match self{
Holder::Foo(foo) => foo.some_method(),
Holder::Bar(bar) => bar.some_method(),
}
}
}
Or match everywhere you want to use the object. Also now your Holder enum will be the size of max(sizeof(Foo), sizeof(Bar))
On the plus side:
you still know the actual type - it's not erased
no heap allocation
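Putting the pieces together for the question's types, a sketch might look like this (it reuses the Funct trait and the Foo and Bar enums from the question; this FunctionLiteral replaces the Box-based one from the previous sketch):
enum Holder {
    Foo(Foo),
    Bar(Bar),
}

impl Funct for Holder {
    fn arity(&self) -> u32 {
        match self {
            Holder::Foo(f) => f.arity(),
            Holder::Bar(b) => b.arity(),
        }
    }
}

struct FunctionLiteral {
    function: Holder,
    args: Vec<FunctionLiteral>,
}

fn main() {
    let lit = FunctionLiteral {
        function: Holder::Foo(Foo::Foo),
        args: vec![FunctionLiteral {
            function: Holder::Bar(Bar::Bar),
            args: vec![],
        }],
    };
    assert_eq!(lit.function.arity(), 0);
}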
I tried to create FFI bindings to libmodbus, written in C.
There I stumbled upon this function:
modbus_set_error_recovery(ctx,
MODBUS_ERROR_RECOVERY_LINK |
MODBUS_ERROR_RECOVERY_PROTOCOL);
The second parameter is defined as
typedef enum
{
MODBUS_ERROR_RECOVERY_NONE = 0,
MODBUS_ERROR_RECOVERY_LINK = (1<<1),
MODBUS_ERROR_RECOVERY_PROTOCOL = (1<<2)
} modbus_error_recovery_mode;
My bindgen-generated bindings are these:
#[repr(u32)]
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
pub enum modbus_error_recovery_mode {
MODBUS_ERROR_RECOVERY_NONE = 0,
MODBUS_ERROR_RECOVERY_LINK = 2,
MODBUS_ERROR_RECOVERY_PROTOCOL = 4,
}
and
extern "C" {
pub fn modbus_set_error_recovery(ctx: *mut modbus_t,
error_recovery:
modbus_error_recovery_mode)
-> ::std::os::raw::c_int;
}
My safe interface looks like this, so far:
pub fn set_error_recovery(&mut self, error_recovery_mode: ErrorRecoveryMode) -> Result<()> {
unsafe {
match ffi::modbus_set_error_recovery(self.ctx, error_recovery_mode.to_c()) {
-1 => bail!(Error::last_os_error()),
0 => Ok(()),
_ => panic!("libmodbus API incompatible response"),
}
}
}
and
use std::ops::BitOr;
#[derive(Clone, Copy, PartialEq, Eq)]
pub enum ErrorRecoveryMode {
NONE = 0,
Link = 2,
Protocol = 4,
}
impl ErrorRecoveryMode {
pub fn to_c(self) -> ffi::modbus_error_recovery_mode {
match self {
NONE => ffi::modbus_error_recovery_mode::MODBUS_ERROR_RECOVERY_NONE,
Link => ffi::modbus_error_recovery_mode::MODBUS_ERROR_RECOVERY_LINK,
Protocol => ffi::modbus_error_recovery_mode::MODBUS_ERROR_RECOVERY_PROTOCOL,
}
}
}
impl BitOr for ErrorRecoveryMode {
type Output = Self;
fn bitor(self, rhs: ErrorRecoveryMode) -> ErrorRecoveryMode {
self | rhs
}
}
This triggers a stack overflow when I call set_error_recovery like this:
assert!(modbus.set_error_recovery(ErrorRecoveryMode::Link | ErrorRecoveryMode::Protocol).is_ok())
The error is
thread 'set_error_recovery' has overflowed its stack
fatal runtime error: stack overflow
As DK. mentioned:
C's enum and Rust's enum have different restrictions.
It's not valid for a Rust enum value to be anything other than one of its declared variants.
What you have is called "bitflags"
Luckily, bindgen understands bitflags. If you generate your bindings while passing the --bitfield-enum flag or by using Builder::bitfield_enum:
bindgen --bitfield-enum modbus_error_recovery_mode fake-modbus.h
Bindgen will generate constants for each C enum value, a newtype wrapper, and implementations of the Bit* traits:
// Many implementation details removed
pub struct modbus_error_recovery_mode(pub ::std::os::raw::c_uint);
pub const modbus_error_recovery_mode_MODBUS_ERROR_RECOVERY_NONE: modbus_error_recovery_mode =
modbus_error_recovery_mode(0);
pub const modbus_error_recovery_mode_MODBUS_ERROR_RECOVERY_LINK: modbus_error_recovery_mode =
modbus_error_recovery_mode(2);
pub const modbus_error_recovery_mode_MODBUS_ERROR_RECOVERY_PROTOCOL: modbus_error_recovery_mode =
modbus_error_recovery_mode(4);
impl ::std::ops::BitOr<modbus_error_recovery_mode> for modbus_error_recovery_mode {}
impl ::std::ops::BitOrAssign for modbus_error_recovery_mode {}
impl ::std::ops::BitAnd<modbus_error_recovery_mode> for modbus_error_recovery_mode {}
impl ::std::ops::BitAndAssign for modbus_error_recovery_mode {}
extern "C" {
pub fn modbus_set_error_recovery(
ctx: *mut modbus_t,
error_recovery: modbus_error_recovery_mode,
) -> ::std::os::raw::c_int;
}
How do I expose the bindgen-generated constants to the public?
Of course, creating an idiomatic Rust API to non-Rust code is the hard part. I might try something like this:
#[derive(Debug)]
struct Modbus(*mut raw::modbus_t);
#[derive(Debug)]
struct Error;
#[derive(Debug, Copy, Clone)]
enum ErrorRecovery {
Link,
Protocol,
}
impl ErrorRecovery {
fn as_raw(&self) -> raw::modbus_error_recovery_mode {
use ErrorRecovery::*;
match *self {
Link => raw::modbus_error_recovery_mode_MODBUS_ERROR_RECOVERY_LINK,
Protocol => raw::modbus_error_recovery_mode_MODBUS_ERROR_RECOVERY_PROTOCOL,
}
}
}
impl Modbus {
fn set_error_recovery(&mut self, flags: Option<&[ErrorRecovery]>) -> Result<(), Error> {
let flag = flags.unwrap_or(&[]).iter().fold(
raw::modbus_error_recovery_mode_MODBUS_ERROR_RECOVERY_NONE,
|acc, v| acc | v.as_raw(),
);
let res = unsafe { raw::modbus_set_error_recovery(self.0, flag) };
Ok(()) // real error checking
}
}
The problem is that C's enum and Rust's enum are very different things. In particular, C allows an enum to hold absolutely any value whatsoever, whether or not that value corresponds to a variant.
Rust does not. Rust relies on an enum value always being one of its defined variants; otherwise you run the risk of undefined behaviour.
What you have is not an enumeration (in the Rust sense); what you have are bit flags, for which you want the bitflags crate.
As for the stack overflow, that's just because you defined the BitOr implementation in terms of itself; that code is unconditionally recursive.
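For reference, a minimal sketch of the bitflags approach (mine, not from the answer; it uses the bitflags 1.x macro syntax, so adjust for your version). The resulting bits() value is what would be passed across the FFI boundary:
use bitflags::bitflags;

bitflags! {
    pub struct ErrorRecoveryMode: u32 {
        const LINK = 1 << 1;
        const PROTOCOL = 1 << 2;
    }
}

fn main() {
    // `ErrorRecoveryMode::empty()` plays the role of MODBUS_ERROR_RECOVERY_NONE
    let mode = ErrorRecoveryMode::LINK | ErrorRecoveryMode::PROTOCOL;
    assert_eq!(mode.bits(), (1 << 1) | (1 << 2));
    assert!(mode.contains(ErrorRecoveryMode::PROTOCOL));
}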
In theory, Dynamically-Sized Types (DST) have landed and we should now be able to use dynamically sized type instances. Practically speaking, I can neither make it work, nor understand the tests around it.
Everything seems to revolve around the Sized? keyword... but how exactly do you use it?
I can put some types together:
// Note that this code example predates Rust 1.0
// and is no longer syntactically valid
trait Foo for Sized? {
fn foo(&self) -> u32;
}
struct Bar;
struct Bar2;
impl Foo for Bar { fn foo(&self) -> u32 { return 9u32; }}
impl Foo for Bar2 { fn foo(&self) -> u32 { return 10u32; }}
struct HasFoo<Sized? X> {
pub f:X
}
...but how do I create an instance of HasFoo, which is a DST, holding either a Bar or a Bar2?
Attempting to do so always seems to result in:
<anon>:28:17: 30:4 error: trying to initialise a dynamically sized struct
<anon>:28 let has_foo = &HasFoo {
I understand broadly speaking that you can't have a bare dynamically sized type; you can only interface with one through a pointer, but I can't figure out how to do that.
Disclaimer: these are just the results of a few experiments I did, combined with reading Niko Matsakis's blog.
DSTs are types where the size is not necessarily known at compile time.
Before DSTs
A slice like [i32] or a bare trait like IntoIterator was not a valid type for a variable, field, or function argument, because it does not have a known size.
A struct could look like this:
// [i32; 2] is a fixed-size array of 2 i32 elements
struct Foo {
f: [i32; 2],
}
or like this:
// & is basically a pointer.
// The compiler always knows the size of a
// pointer on a specific architecture, so whatever
// size the [i32] has, its address (the pointer) is
// a statically-sized type too
struct Foo2<'a> {
f: &'a [i32],
}
but not like this:
// f is (statically) unsized, so Foo is unsized too
struct Foo {
f: [i32],
}
This was true for enums and tuples too.
With DSTs
You can declare a struct (or enum or tuple) like Foo above, containing an unsized type. A type containing an unsized type will be unsized too.
While defining Foo was easy, creating an instance of Foo is still hard and subject to change. Since, by definition, you can't create a value of an unsized type directly, you have to create a sized counterpart of Foo first. For example, Foo { f: [1, 2, 3] } (a Foo<[i32; 3]>, which has a statically known size), plus some plumbing to let the compiler know how it can coerce this into its statically unsized counterpart Foo<[i32]>. The way to do this in safe and stable Rust is still being worked on as of Rust 1.5 (see the RFC for DST coercions for more info).
Luckily, defining a new DST is not something you will be likely to do, unless you are creating a new type of smart pointer (like Rc), which should be a rare enough occurrence.
Imagine Rc is defined like our Foo above. Since it has all the plumbing to do the coercion from sized to unsized, it can be used to do this:
use std::rc::Rc;
trait Foo {
fn foo(&self) {
println!("foo")
}
}
struct Bar;
impl Foo for Bar {}
fn main() {
let data: Rc<Foo> = Rc::new(Bar);
// we're creating a statically typed version of Bar
// and coercing it (via the `: Rc<Foo>` on the left-hand side)
// to its unsized bare trait counterpart.
// The Foo inside Rc<Foo> is a trait object,
// so it has no statically known size
data.foo();
}
playground example
?Sized bound
Since you're unlikely to create a new DST, what are DSTs useful for in your everyday Rust coding? Most frequently, they let you write generic code that works both on sized types and on their existing unsized counterparts. Most often these will be Vec<T> and [T] slices, or String and str.
The way you express this is through the ?Sized "bound". ?Sized is in some ways the opposite of a bound; it actually says that T can be either sized or unsized, so it widens the possible types we can use, instead of restricting them the way bounds typically do.
Contrived example time! Let's say that we have a FooSized struct that just wraps a reference and a simple Print trait that we want to implement for it.
struct FooSized<'a, T>(&'a T)
where
T: 'a;
trait Print {
fn print(&self);
}
We want to define a blanket impl for all the wrapped T's that implement Display.
impl<'a, T> Print for FooSized<'a, T>
where
T: 'a + fmt::Display,
{
fn print(&self) {
println!("{}", self.0)
}
}
Let's try to make it work:
// Does not compile: "hello" is a &'static str, so T would be str
// (which is not Sized)
let h_s = FooSized("hello");
h_s.print();
// to make it work we need a &&str or a &String
let s = "hello"; // &'static str
let h_s = FooSized(&s); // T is now &str, which is Sized
h_s.print();
Eh... this is awkward... Luckily we have a way to generalize the struct to work directly with str (and unsized types in general): ?Sized
// same as before (renamed to Foo), only with the ?Sized bound added
struct Foo<'a, T: ?Sized>(&'a T)
where
T: 'a;
impl<'a, T: ?Sized> Print for Foo<'a, T>
where
T: 'a + fmt::Display,
{
fn print(&self) {
println!("{}", self.0)
}
}
now this works:
let h = Foo("hello");
h.print();
playground
For a less contrived (but simple) actual example, you can look at the Borrow trait in the standard library.
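As a quick illustration (my own sketch, not part of the original answer): Borrow is declared with a ?Sized parameter, and with a ?Sized bound on our own type parameter a single generic function can accept both a String and a bare str:
use std::borrow::Borrow;

// T may be unsized (e.g. str), so it must live behind a reference
fn print_borrowed<T: Borrow<str> + ?Sized>(value: &T) {
    println!("{}", value.borrow());
}

fn main() {
    print_borrowed("hello");             // T = str (unsized)
    print_borrowed(&String::from("hi")); // T = String (sized)
}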
Back to your question
trait Foo for ?Sized {
fn foo(&self) -> i32;
}
the for ?Sized syntax is now obsolete. It used to refer to the type of Self, declaring that Foo can be implemented by an unsized type, but this is now the default. Any trait can now be implemented for an unsized type, i.e. you can now have:
trait Foo {
fn foo(&self) -> i32;
}
//[i32] is unsized, but the compiler does not complain for this impl
impl Foo for [i32] {
fn foo(&self) -> i32 {
5
}
}
If you don't want your trait to be implementable for unsized types, you can use the Sized bound:
// now the impl Foo for [i32] is illegal
trait Foo: Sized {
fn foo(&self) -> i32;
}
To amend the example that Paolo Falabella has given, here is a different way of looking at it, using a named field instead of a tuple struct.
struct Foo<'a, T>
where
T: 'a + ?Sized,
{
printable_object: &'a T,
}
impl<'a, T> Print for Foo<'a, T>
where
T: 'a + ?Sized + fmt::Display,
{
fn print(&self) {
println!("{}", self.printable_object);
}
}
fn main() {
let h = Foo {
printable_object: "hello",
};
h.print();
}
At the moment, to create a HasFoo storing a type-erased Foo you need to first create one with a fixed concrete type and then coerce a pointer to it to the DST form, that is:
let has_foo: &HasFoo<Foo> = &HasFoo { f: Bar };
Calling has_foo.f.foo() then does what you expect.
In future these DST casts will almost certainly be possible with as, but for the moment coercion via an explicit type hint is required.
Here is a complete example based on huon's answer. The important trick is to make the type that you want to contain the DST a generic type where the generic need not be sized (via ?Sized). You can then construct a concrete value using Bar1 or Bar2 and then immediately convert it.
struct HasFoo<F: ?Sized = dyn Foo>(F);
impl HasFoo<dyn Foo> {
fn use_it(&self) {
println!("{}", self.0.foo())
}
}
fn main() {
// Could likewise use `&HasFoo` or `Rc<HasFoo>`, etc.
let ex1: Box<HasFoo> = Box::new(HasFoo(Bar1));
let ex2: Box<HasFoo> = Box::new(HasFoo(Bar2));
ex1.use_it();
ex2.use_it();
}
trait Foo {
fn foo(&self) -> u32;
}
struct Bar1;
impl Foo for Bar1 {
fn foo(&self) -> u32 {
9
}
}
struct Bar2;
impl Foo for Bar2 {
fn foo(&self) -> u32 {
10
}
}