Is there any way to make the following code work without boxing?
fn some_func(my_type: MyType, some_str: &str) -> bool {
    let mut hmac = match my_type {
        MyType::MyType1 => create_hmac(Sha256::new(), some_str),
        MyType::MyType2 => create_hmac(Sha384::new(), some_str),
        MyType::MyType3 => create_hmac(Sha512::new(), some_str),
        _ => panic!()
    };
    // some calculations go HERE, NOT in the create_hmac function...
    hmac.input("fdsfdsfdsfd".to_string().as_bytes());
    // something else....
    true
}

fn create_hmac<D: Digest>(digest: D, some_str: &str) -> Hmac<D> {
    Hmac::new(digest, some_str.to_string().as_bytes())
}
The library in use is https://github.com/DaGenix/rust-crypto
You need to either box the value or use a reference, as a "trait object" can only be used behind a pointer.
Here's a very simplified version of your code. You have three different structs that implement the same trait (Digest):
struct Sha256;
struct Sha384;
struct Sha512;

trait Digest {}
impl Digest for Sha256 {}
impl Digest for Sha384 {}
impl Digest for Sha512 {}

struct HMac<D: Digest> { d: D }

fn main() {
    let a = 1;

    // what you're trying to do
    // (does not work, Sha256, Sha384 and Sha512 are different types)
    //let _ = match a {
    //    1 => Sha256,
    //    2 => Sha384,
    //    3 => Sha512,
    //    _ => unreachable!()
    //};
}
Note that in the real case, not only are all the ShaXXX types different as far as the type system is concerned, they also have different memory layouts (compare Engine256State with Engine512State, for instance), so this rules out unsafe tricks with transmute.
So, as said, you can use Box or references (but you have to pre-create a concrete instance before the match if you want to use references):
fn main() {
    let a = 1;

    let _: Box<Digest> = match a {
        1 => Box::new(Sha256),
        2 => Box::new(Sha384),
        3 => Box::new(Sha512),
        _ => unreachable!()
    };

    // to use references we need a pre-existing instance of all ShaXXX
    let (sha256, sha384, sha512) = (Sha256, Sha384, Sha512);
    let _: &Digest = match a {
        1 => &sha256, //... otherwise the reference wouldn't outlive the match
        2 => &sha384,
        3 => &sha512,
        _ => unreachable!()
    };
}
Note that a Box does the equivalent of what most garbage-collected languages do for you under the hood when you only use an object through its interface: some memory is dynamically allocated for the concrete object, but you only ever pass around a pointer to that memory.
In your case (but I haven't tested the code below) you should be able to do:
// Hmac implements the Mac trait, so we can return a Box<Mac>
// (I'm assuming you only want to use the Hmac through its Mac trait)
fn create_hmac<'a, D: Digest>(digest: D, some_str: &'a str) -> Box<Mac + 'a> {
    Box::new(Hmac::new(digest, some_str.to_string().as_bytes()))
}
and use it as:
let mut hmac: Box<Mac> = match my_type {
    MyType::MyType1 => create_hmac(Sha256::new(), some_str),
    MyType::MyType2 => create_hmac(Sha384::new(), some_str),
    MyType::MyType3 => create_hmac(Sha512::new(), some_str),
    _ => unreachable!()
};
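With that in place, the rest of some_func can keep working with the value through the Mac trait behind the box. A minimal sketch of the follow-up calls, assuming rust-crypto's Mac::input() and Mac::result() methods (input() is already used in the question; result() is my assumption about the next step):

// continuing right after the match above
hmac.input("fdsfdsfdsfd".as_bytes()); // feed data through the trait object
let _mac = hmac.result();             // finalize and obtain the MAC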
One addition and one clarification to Paolo's good answer. First, you could make your enum incorporate the appropriate Sha* struct and then implement Digest by delegating to it as appropriate. That doesn't fit every situation, but if that's conceptually what you are doing, it can work well:
struct Sha256;
struct Sha384;
struct Sha512;

trait Digest { fn digest(&self); }
impl Digest for Sha256 { fn digest(&self) { println!("256") } }
impl Digest for Sha384 { fn digest(&self) { println!("384") } }
impl Digest for Sha512 { fn digest(&self) { println!("512") } }

enum MyType {
    One(Sha256),
    Two(Sha384),
    Three(Sha512),
}

impl Digest for MyType {
    fn digest(&self) {
        use MyType::*;
        match *self {
            One(ref sha) => sha.digest(),
            Two(ref sha) => sha.digest(),
            Three(ref sha) => sha.digest(),
        }
    }
}

fn main() {
    let a = MyType::Two(Sha384);
    a.digest()
}
Also, you don't have to actually instantiate all of the types if you want to use references; you just have to ensure that the one you use is available, and that there is a place where the reference can live beyond the match expression:
#![feature(std_misc)]
#![feature(io)]

use std::time::duration::Duration;
use std::old_io::timer::sleep;

struct Sha256(u8);
struct Sha384(u8);
struct Sha512(u8);

impl Sha256 { fn new() -> Sha256 { sleep(Duration::seconds(1)); Sha256(1) } }
impl Sha384 { fn new() -> Sha384 { sleep(Duration::seconds(2)); Sha384(2) } }
impl Sha512 { fn new() -> Sha512 { sleep(Duration::seconds(3)); Sha512(3) } }

trait Digest {}
impl Digest for Sha256 {}
impl Digest for Sha384 {}
impl Digest for Sha512 {}

fn main() {
    let a = 1;

    let sha256: Sha256;
    let sha384: Sha384;
    let sha512: Sha512;

    let _: &Digest = match a {
        1 => {
            sha256 = Sha256::new();
            &sha256
        },
        2 => {
            sha384 = Sha384::new();
            &sha384
        },
        3 => {
            sha512 = Sha512::new();
            &sha512
        },
        _ => unreachable!()
    };
}
I'm making my own Serializable trait, in the context of a client/server system.
My idea is that the messages sent by the system are an enum defined by the user of the system, so it can be customized as needed.
To ease implementing the trait on the enum, I would like to use #[derive(Serializable)], as the implementation is always the same.
Here is the trait:
pub trait NetworkSerializable {
    fn id(&self) -> usize;
    fn size(&self) -> usize;
    fn serialize(self) -> Vec<u8>;
    fn deserialize(id: usize, data: Vec<u8>) -> Self;
}
Now, I've tried to look at the book (this one too) and this example to try to wrap my head around derive macros, but I'm really struggling to understand them and how to implement them. I've read about token streams and abstract syntax trees, and I think I understand the basics.
Let's take the example of the id() method: it should give a unique id for each variant of the enum, so that message headers can tell which message is incoming.
Let's say I have this enum as a message system:
enum NetworkMessages {
    ErrorMessage,
    SpawnPlayer(usize, bool, Transform), // player id, is_mine, position
    MovePlayer(usize, Transform),        // player id, new_position
    DestroyPlayer(usize),                // player_id
}
Then, the id() function should look like this:
fn id(&self) -> usize {
    match self {
        NetworkMessages::ErrorMessage => 0,
        NetworkMessages::SpawnPlayer(..) => 1,
        NetworkMessages::MovePlayer(..) => 2,
        NetworkMessages::DestroyPlayer(..) => 3,
    }
}
Here was my attempt at writing this with a derive macro:
#[proc_macro_derive(NetworkSerializable)]
pub fn network_serializable_derive(input: TokenStream) -> TokenStream {
    // Construct a representation of Rust code as a syntax tree
    // that we can manipulate
    let ast = syn::parse(input).unwrap();

    // Build the trait implementation
    impl_network_serializable_macro(&ast)
}

fn impl_network_serializable_macro(ast: &syn::DeriveInput) -> TokenStream {
    // get enum name
    let ref name = ast.ident;
    let ref data = ast.data;

    let (id_func, size_func, serialize_func, deserialize_func) = match data {
        // Only if data is an enum, we do parsing
        Data::Enum(data_enum) => {
            // Iterate over enum variants
            let mut id_func_internal = TokenStream2::new();
            let mut variant_id: usize = 0;

            for variant in &data_enum.variants {
                // add the branch for the variant
                id_func_internal.extend(quote_spanned! {
                    variant.span() => &variant_id,
                });
                variant_id += 1;
            }

            (id_func_internal, (), (), ())
        }
        _ => (TokenStream2::new(), (), (), ()),
    };

    let expanded = quote! {
        impl NetworkSerializable for #name {
            // variant_checker_functions gets replaced by all the functions
            // that were constructed above
            fn size(&self) -> usize {
                match &self {
                    #id_func
                }
            }
            /*
            #size_func
            #serialize_func
            #deserialize_func
            */
        }
    };

    expanded.into()
}
So this is generating quite a lot of errors, with "proc macro NetworkSerializable not expanded: no proc macro dylib present" being the first. So I'm guessing there is a lot of misunderstanding on my part here.
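One thing worth double-checking for the "no proc macro dylib present" part, independently of the macro body, is the crate setup: a derive macro has to live in its own crate that is built as a proc-macro library. A minimal sketch of the relevant Cargo.toml entries for that crate (dependency versions are assumptions, not taken from the question):

[lib]
proc-macro = true

[dependencies]
proc-macro2 = "1"
quote = "1"
syn = "1"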
My goal is to wrap an Iterator<Item = rusb::Device<_>> into an Iterator<Item = LitraDevice>, where the latter carries some device-specific implementation.
To make this work I tried the following code:
use std::iter::Filter;
use rusb;

const VENDOR: u16 = 0x046d;
const PRODUCT: u16 = 0xc900;

struct LitraDevice {
    dev: rusb::Device<rusb::GlobalContext>,
}

pub struct LitraDevices {
    unfiltered: rusb::DeviceList<rusb::GlobalContext>,
}

struct LitraDeviceIterator<'a> {
    it: Filter<rusb::Devices<'a, rusb::GlobalContext>, for<'r> fn(&'r rusb::Device<rusb::GlobalContext>) -> bool>,
}

impl LitraDevices {
    pub fn new() -> Self {
        let unfiltered = rusb::devices().unwrap();
        LitraDevices { unfiltered }
    }

    fn can_not_handle<'r>(dev: &'r rusb::Device<rusb::GlobalContext>) -> bool {
        let desc = dev.device_descriptor().unwrap();
        match (desc.vendor_id(), desc.product_id()) {
            (VENDOR, PRODUCT) => (),
            _ => return true,
        }
        match desc.class_code() {
            LIBUSB_CLASS_HID => return true, // Skip HID devices, they are handled directly by OS libraries
            _ => return false,
        }
    }

    pub fn iter<'a>(self) -> LitraDeviceIterator<'a> {
        let it = self.unfiltered.iter().filter(Self::can_not_handle);
        LitraDeviceIterator {
            it,
        }
    }
}

impl<'a> Iterator for LitraDeviceIterator<'a> {
    type Item = LitraDevice;

    fn next(&mut self) -> Option<Self::Item> {
        let n = self.it.next();
        match n {
            Some(Device) => return Some(LitraDevice { dev: n.unwrap() }),
            None => return None,
        }
    }
}
Now I really cannot figure out how to code LitraDeviceIterator so that it wraps the filtered iterator.
All code iterations I have tried so far turn into a generic nightmare very quickly.
I rewrote your iter() to yield LitraDevice; you can surely take it from there to wherever you wanted to go.
The first underlying issue is that filter() only hands its closure references, but in cases like these you actually mean to move the yielded items while filtering. That's what filter_map() is for. That way, you can scrap the references, greatly simplifying your code.
(This code does not work yet, read on)
pub fn iter(self) -> impl Iterator<Item = LitraDevice> {
    self.unfiltered.iter().filter_map(|dev| {
        (!Self::can_not_handle(&dev))
            .then_some(dev)
            .map(|dev| LitraDevice { dev })
    })
}
Now, there's a second little issue at play here: rusb::DeviceList<T: UsbContext>::iter(&self) returns rusb::Devices<'_, T>, '_ being the anonymous lifetime inferred from &self. Meaning, while you can drive rusb::Devices<'_, T> to yield Device<T>s, you cannot keep it around longer than self.unfiltered. More specifically, as iter() consumes self, you cannot return an iterator referencing that rusb::Devices<'_, T> from iter(). One solution is to collect immediately, then move into a fresh iterator.
pub fn iter(self) -> impl Iterator<Item = LitraDevice> {
    let devices = self.unfiltered.iter().collect::<Vec<_>>();
    devices.into_iter().filter_map(|dev| {
        (!Self::can_not_handle(&dev))
            .then_some(dev)
            .map(|dev| LitraDevice { dev })
    })
}
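For completeness, a hypothetical call site (assuming everything above lives in the same crate) would then be:

fn main() {
    // iterate over the matching devices found on the bus
    for litra in LitraDevices::new().iter() {
        // each item is an owned LitraDevice wrapping its rusb::Device
        let _ = litra;
    }
}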
I am implementing a derive macro to reduce the amount of boilerplate I have to write for similar types.
I want the macro to operate on structs which have the following format:
#[derive(MyTrait)]
struct SomeStruct {
    records: HashMap<Id, Record>
}
Calling the macro should generate an implementation like so:
impl MyTrait for SomeStruct {
    fn foo(&self, id: Id) -> Record { ... }
}
So I understand how to generate the code using quote:
#[proc_macro_derive(MyTrait)]
pub fn derive_answer_fn(item: TokenStream) -> TokenStream {
    ...
    let generated = quote! {
        impl MyTrait for #struct_name {
            fn foo(&self, id: #id_type) -> #record_type { ... }
        }
    };
    ...
}
But what is the best way to get #struct_name, #id_type and #record_type from the input token stream?
One way is to use the venial crate to parse the TokenStream.
use proc_macro2;
use quote::quote;
use venial;

#[proc_macro_derive(MyTrait)]
pub fn derive_answer_fn(item: proc_macro::TokenStream) -> proc_macro::TokenStream {
    // Ensure it's deriving for a struct.
    let s = match venial::parse_declaration(proc_macro2::TokenStream::from(item)) {
        Ok(venial::Declaration::Struct(s)) => s,
        Ok(_) => panic!("Can only derive this trait on a struct"),
        Err(_) => panic!("Error parsing into valid Rust"),
    };
    let struct_name = s.name;

    // Get the struct's first field.
    let fields = s.fields;
    let named_fields = match fields {
        venial::StructFields::Named(named_fields) => named_fields,
        _ => panic!("Expected a named field"),
    };
    let inners: Vec<(venial::NamedField, proc_macro2::Punct)> = named_fields.fields.inner;
    if inners.len() != 1 {
        panic!("Expected exactly one named field");
    }

    // Get the name and type of the first field.
    let first_field_name = &inners[0].0.name;
    let first_field_type = &inners[0].0.ty;

    // Extract Id and Record from the type HashMap<Id, Record>
    if first_field_type.tokens.len() != 6 {
        panic!("Expected type T<R, S> for first named field");
    }
    let id = first_field_type.tokens[2].clone();
    let record = first_field_type.tokens[4].clone();

    // Implement MyTrait.
    let generated = quote! {
        impl MyTrait for #struct_name {
            fn foo(&self, id: #id) -> #record { *self.#first_field_name.get(&id).unwrap() }
        }
    };
    proc_macro::TokenStream::from(generated)
}
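For reference, a hypothetical call site for the derive could look like the sketch below (the concrete Id and Record types, the trait definition, and the values are assumptions; note that the generated foo() dereferences the stored value, so Record has to be Copy for that exact body):

use std::collections::HashMap;

type Id = u32;
type Record = i64;

// the trait being derived; its signature is assumed to match the generated impl
trait MyTrait {
    fn foo(&self, id: Id) -> Record;
}

#[derive(MyTrait)]
struct SomeStruct {
    records: HashMap<Id, Record>,
}

fn main() {
    let mut s = SomeStruct { records: HashMap::new() };
    s.records.insert(7, 42);
    assert_eq!(s.foo(7), 42);
}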
I'm trying to implement this pattern:
use std::any::Any;
use std::fmt::Debug;

trait CommandHandler<TCommand> {
    fn execute(&self, data: TCommand);
}

#[derive(Debug)]
struct FooCommand {}

struct FooCommandHandler {}

impl CommandHandler<FooCommand> for FooCommandHandler {
    fn execute(&self, data: FooCommand) {
        println!("Foo");
    }
}

#[derive(Debug)]
struct BarCommand {}

struct BarCommandHandler {}

impl CommandHandler<BarCommand> for BarCommandHandler {
    fn execute(&self, data: BarCommand) {
        println!("Bar");
    }
}

fn execute<T>(command: T)
where
    T: Any + Debug,
{
    println!("Command: {:?}", command);
    match (&command as &Any).downcast_ref::<FooCommand>() {
        Some(c) => (FooCommandHandler {}).execute(c),
        None => {}
    };
    match (&command as &Any).downcast_ref::<BarCommand>() {
        Some(c) => (BarCommandHandler {}).execute(c),
        None => {}
    };
}

fn main() {
    (FooCommandHandler {}).execute(FooCommand {});
    (BarCommandHandler {}).execute(BarCommand {});
    execute(FooCommand {});
    execute(BarCommand {});
}
This doesn't work:
error[E0308]: mismatched types
  --> src/main.rs:37:51
   |
37 |         Some(c) => (FooCommandHandler {}).execute(c),
   |                                                   ^ expected struct `FooCommand`, found &FooCommand
   |
   = note: expected type `FooCommand`
              found type `&FooCommand`

error[E0308]: mismatched types
  --> src/main.rs:41:51
   |
41 |         Some(c) => (BarCommandHandler {}).execute(c),
   |                                                   ^ expected struct `BarCommand`, found &BarCommand
   |
   = note: expected type `BarCommand`
              found type `&BarCommand`
How can I implement the execute() method in a way that preserves the following requirements:
The type XCommand should be totally unaware of the XCommandHandlers that execute it.
Multiple implementations of CommandHandler<X> may exist.
The command handler receives (and consumes) the concrete command instance, not a reference to it (making duplicate dispatch of commands impossible).
In essence, I have a generic function fn foo<T>(v: T) and I wish to dispatch to a number of concrete functions fn foo1(v: Foo), fn foo2(v: Bar); how do I do that?
Is transmute the only option?
Note that this is distinct from what Any::downcast_ref does, which is to return an &Foo rather than a Foo from the generic value v.
You need to go via Box, like so:
fn execute<T>(command: T)
where
    T: Any + Debug,
{
    println!("Command: {:?}", command);

    let any: Box<Any> = Box::new(command);

    let any = match any.downcast() {
        Ok(c) => return (FooCommandHandler {}).execute(*c),
        Err(any) => any,
    };

    let any = match any.downcast() {
        Ok(c) => return (BarCommandHandler {}).execute(*c),
        Err(any) => any,
    };

    let _ = any; // avoid unused variable error
    panic!("could not downcast command");
}
"But I don't wanna use a Box!"
Just use Box.
"But it's an allocation! I've measured the above code and proven beyond a shadow of a doubt that it's a bottleneck!"
What? Really?
"You can't prove otherwise."
Oh fine. But I do not guarantee that this will work in all cases. This is treading into "blow yourself up" territory. Do not do this unless you know you need to:
fn execute<T>(command: T)
where
    T: Any + Debug,
{
    use std::any::TypeId;
    use std::mem;

    println!("Command: {:?}", command);

    macro_rules! do_cast {
        ($t:ty, $h:expr) => {
            if TypeId::of::<T>() == TypeId::of::<$t>() {
                let casted: $t = mem::transmute_copy(&command);
                mem::forget(command); // we CANNOT let command drop.
                $h.execute(casted);
                return;
            }
        };
    }

    unsafe {
        do_cast!(FooCommand, FooCommandHandler {});
        do_cast!(BarCommand, BarCommandHandler {});
    }

    panic!("could not downcast command");
}
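For reference, with either version of execute() above, the question's main() should print the following (read off the println! calls, so treat it as an expectation rather than captured output):

Foo
Bar
Command: FooCommand
Foo
Command: BarCommand
Bar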
Just as a quick summary of the accepted answer:
Where &Any only has:
pub fn downcast_ref<T>(&self) -> Option<&T> where T: Any
Box<Any> implements:
pub fn downcast<T>(self) -> Result<Box<T>, Box<Any + 'static>> where T: Any
However, for complicated reasons, the documentation is on Box not on Any.
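A minimal, self-contained sketch of the consuming downcast (written with the same pre-2018 Box<Any> spelling as the rest of this thread; on current editions it would be Box<dyn Any>):

use std::any::Any;

fn main() {
    let any: Box<Any> = Box::new(42i32);
    // downcast() consumes the box and hands back the concrete value on success
    match any.downcast::<i32>() {
        Ok(n) => println!("got an i32: {}", n),
        Err(_) => println!("not an i32"),
    }
}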
Say we want to have an object's implementation switched at runtime; we'd do something like this:
pub trait Methods {
    fn func(&self);
}

pub struct Methods_0;
impl Methods for Methods_0 {
    fn func(&self) {
        println!("foo");
    }
}

pub struct Methods_1;
impl Methods for Methods_1 {
    fn func(&self) {
        println!("bar");
    }
}

pub struct Object<'a> {
    methods: &'a (Methods + 'a),
}

fn main() {
    let methods: [&Methods; 2] = [&Methods_0, &Methods_1];

    let mut obj = Object { methods: methods[0] };
    obj.methods.func();

    obj.methods = methods[1];
    obj.methods.func();
}
Now, what if there are hundreds of such implementations? E.g. imagine implementations of cards for a collectible card game, where every card does something completely different and is hard to generalize; or imagine implementations of opcodes for a huge state machine. Sure, you can argue that a different design pattern could be used -- but that's not the point of this question...
I wonder if there is any way for these impl structs to somehow "register" themselves so they can be looked up later by a factory method? I would be happy to end up with a magical macro or even a compiler plugin to accomplish that.
Say, in D you can use templates to register the implementations -- and if you can't for some reason, you can always inspect modules at compile time and generate new code via mixins; there are also user-defined attributes that can help with this. In Python, you would normally use a metaclass, so that every time a new child class is created, a reference to it is stored in the metaclass's registry, which allows you to look up implementations by name or parameter; this can also be done via decorators if the implementations are simple functions.
Ideally, in the example above you would be able to create Object as
Object::new(0)
where the value 0 is only known at runtime, and it would magically return an Object { methods: &Methods_0 }. The body of new() would not have the implementations hard-coded like "methods: [&Methods; 2] = [&Methods_0, &Methods_1]"; instead, the list should somehow be inferred automatically.
So, this is probably extremely buggy, but it works as a proof of concept.
It is possible to use Cargo's code generation support (a build script) to do the introspection at compile time: scan (not exactly parse in this case, but you get the idea) the existing implementations and generate the boilerplate necessary to make Object::new() work.
The code is pretty convoluted and has no error handling whatsoever, but it works.
Tested on rustc 1.0.0-dev (2c0535421 2015-02-05 15:22:48 +0000)
(See on github)
src/main.rs:
pub mod implementations;

mod generated_glue {
    include!(concat!(env!("OUT_DIR"), "/generated_glue.rs"));
}

use generated_glue::Object;

pub trait Methods {
    fn func(&self);
}

pub struct Methods_2;
impl Methods for Methods_2 {
    fn func(&self) {
        println!("baz");
    }
}

fn main() {
    Object::new(2).func();
}
src/implementations.rs:
use super::Methods;

pub struct Methods_0;
impl Methods for Methods_0 {
    fn func(&self) {
        println!("foo");
    }
}

pub struct Methods_1;
impl Methods for Methods_1 {
    fn func(&self) {
        println!("bar");
    }
}
build.rs:
#![feature(core, unicode, path, io, env)]

use std::env;
use std::old_io::{fs, File, BufferedReader};
use std::collections::HashMap;

fn main() {
    let target_dir = Path::new(env::var_string("OUT_DIR").unwrap());
    let mut target_file = File::create(&target_dir.join("generated_glue.rs")).unwrap();

    let source_code_path = Path::new(file!()).join_many(&["..", "src/"]);
    let source_files = fs::readdir(&source_code_path).unwrap().into_iter()
        .filter(|path| {
            match path.str_components().last() {
                Some(Some(filename)) => filename.split('.').last() == Some("rs"),
                _ => false
            }
        });

    let mut implementations = HashMap::new();
    for source_file_path in source_files {
        let relative_path = source_file_path.path_relative_from(&source_code_path).unwrap();
        let source_file_name = relative_path.as_str().unwrap();
        implementations.insert(source_file_name.to_string(), vec![]);
        let mut file_implementations = &mut implementations[*source_file_name];

        let mut source_file = BufferedReader::new(File::open(&source_file_path).unwrap());
        for line in source_file.lines() {
            let line_str = match line {
                Ok(line_str) => line_str,
                Err(_) => break,
            };

            if line_str.starts_with("impl Methods for Methods_") {
                const PREFIX_LEN: usize = 25;
                let number_len = line_str[PREFIX_LEN..].chars().take_while(|chr| {
                    chr.is_digit(10)
                }).count();
                let number: i32 = line_str[PREFIX_LEN..(PREFIX_LEN + number_len)].parse().unwrap();
                file_implementations.push(number);
            }
        }
    }

    writeln!(&mut target_file, "use super::Methods;").unwrap();
    for (source_file_name, impls) in &implementations {
        let module_name = match source_file_name.split('.').next() {
            Some("main") => "super",
            Some(name) => name,
            None => panic!(),
        };

        for impl_number in impls {
            writeln!(&mut target_file, "use {}::Methods_{};", module_name, impl_number).unwrap();
        }
    }

    let all_impls = implementations.values().flat_map(|impls| impls.iter());

    writeln!(&mut target_file, "
pub struct Object;

impl Object {{
    pub fn new(impl_number: i32) -> Box<Methods + 'static> {{
        match impl_number {{").unwrap();

    for impl_number in all_impls {
        writeln!(&mut target_file,
                 "            {} => Box::new(Methods_{}),", impl_number, impl_number).unwrap();
    }

    writeln!(&mut target_file, "
            _ => panic!(\"Unknown impl number: {{}}\", impl_number),
        }}
    }}
}}").unwrap();
}
The generated code:
use super::Methods;
use super::Methods_2;
use implementations::Methods_0;
use implementations::Methods_1;

pub struct Object;

impl Object {
    pub fn new(impl_number: i32) -> Box<Methods + 'static> {
        match impl_number {
            2 => Box::new(Methods_2),
            0 => Box::new(Methods_0),
            1 => Box::new(Methods_1),
            _ => panic!("Unknown impl number: {}", impl_number),
        }
    }
}