Consider this simple protocol implementation:
#[derive(PartialEq, Debug)]
enum Req<'a> {
InputData(&'a [u8]),
Stop,
}
impl<'a> Req<'a> {
fn decode(packet: &'a [u8]) -> Result<Req<'a>, String> {
match packet.first() {
Some(&0x01) => Ok(Req::InputData(&packet[1..])),
Some(&0x02) => Ok(Req::Stop),
_ => Err(format!("invalid request: {:?}", packet)),
}
}
}
#[derive(PartialEq, Debug)]
enum Rep<'a> {
OutputData(&'a [u8]),
StopAck,
}
impl<'a> Rep<'a> {
fn decode(packet: &'a [u8]) -> Result<Rep<'a>, String> {
match packet.first() {
Some(&0x01) => Ok(Rep::OutputData(&packet[1..])),
Some(&0x02) => Ok(Rep::StopAck),
_ => Err(format!("invalid reply: {:?}", packet)),
}
}
}
fn assert_req(packet: Vec<u8>, sample: Req) {
assert_eq!(Req::decode(&packet), Ok(sample));
}
fn assert_rep(packet: Vec<u8>, sample: Rep) {
assert_eq!(Rep::decode(&packet), Ok(sample));
}
fn main() {
assert_req(vec![1, 2, 3], Req::InputData(&[2, 3]));
assert_req(vec![2], Req::Stop);
assert_rep(vec![1, 2, 3], Rep::OutputData(&[2, 3]));
assert_rep(vec![2], Rep::StopAck);
}
This works, but the two functions assert_req and assert_rep are identical except for the types involved. It seems like a good idea to write a single generic assert_packet:
trait Decode<'a>: Sized {
fn decode(packet: &'a [u8]) -> Result<Self, String>;
}
#[derive(PartialEq, Debug)]
enum Req<'a> {
InputData(&'a [u8]),
Stop,
}
impl<'a> Decode<'a> for Req<'a> {
fn decode(packet: &'a [u8]) -> Result<Req<'a>, String> {
match packet.first() {
Some(&0x01) => Ok(Req::InputData(&packet[1..])),
Some(&0x02) => Ok(Req::Stop),
_ => Err(format!("invalid request: {:?}", packet)),
}
}
}
#[derive(PartialEq, Debug)]
enum Rep<'a> {
OutputData(&'a [u8]),
StopAck,
}
impl<'a> Decode<'a> for Rep<'a> {
fn decode(packet: &'a [u8]) -> Result<Rep<'a>, String> {
match packet.first() {
Some(&0x01) => Ok(Rep::OutputData(&packet[1..])),
Some(&0x02) => Ok(Rep::StopAck),
_ => Err(format!("invalid reply: {:?}", packet)),
}
}
}
fn assert_packet<'a, T>(packet: Vec<u8>, sample: T)
where
T: Decode<'a> + PartialEq + std::fmt::Debug,
{
assert_eq!(T::decode(&packet), Ok(sample));
}
fn main() {
assert_packet(vec![1, 2, 3], Req::InputData(&[2, 3]));
assert_packet(vec![2], Req::Stop);
assert_packet(vec![1, 2, 3], Rep::OutputData(&[2, 3]));
assert_packet(vec![2], Rep::StopAck);
}
However, this triggers a "does not live long enough" error:
error[E0597]: `packet` does not live long enough
--> src/main.rs:41:27
|
41 | assert_eq!(T::decode(&packet), Ok(sample));
| ^^^^^^ does not live long enough
42 | }
| - borrowed value only lives until here
|
note: borrowed value must be valid for the lifetime 'a as defined on the function body at 37:1...
--> src/main.rs:37:1
|
37 | / fn assert_packet<'a, T>(packet: Vec<u8>, sample: T)
38 | | where
39 | | T: Decode<'a> + PartialEq + std::fmt::Debug,
40 | | {
41 | | assert_eq!(T::decode(&packet), Ok(sample));
42 | | }
| |_^
If I understand correctly, the problem is in the function signature:
fn assert_packet<'a, T>(packet: Vec<u8>, sample: T)
where
T: Decode<'a>
Here, packet is destroyed when the function returns, but the caller-provided 'a lifetime parameter says that the borrow should outlive the assert_packet call. Is there a right solution? What should the signature look like? Could higher-ranked trait bounds help here?
Why does this compile:
fn assert_req(packet: Vec<u8>, sample: Req) {
assert_eq!(Req::decode(&packet), Ok(sample));
}
while this doesn't?
fn assert_packet<'a, T>(packet: Vec<u8>, sample: T) where T: Decode<'a> + PartialEq + std::fmt::Debug {
assert_eq!(T::decode(&packet), Ok(sample));
}
The difference is that in the first version, the two textual occurrences of Req name two different instantiations of the Req<'a> enum, with two different lifetimes. The first occurrence, on the sample parameter, is instantiated with a lifetime parameter received by the assert_req function. The second occurrence, used to invoke decode, is instantiated with the lifetime of the packet parameter itself (which ceases to exist as soon as the function returns). This means that the two arguments to assert_eq! don't have the same type; yet it compiles, because a Req<'a> can be coerced to a Req<'b> where 'b is a shorter lifetime than 'a (Req<'a> is a subtype of Req<'b>).
On the other hand, in the second version, both occurrences of T must represent the exact same type. Lifetime parameters are always assumed to represent lifetimes that are longer than the function call, so it's an error to invoke T::decode with a lifetime that is shorter.
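That coercion can be demonstrated in isolation with a small sketch (reusing the Req enum from above; the function itself is not part of the original code):
fn shorten<'long: 'short, 'short>(r: Req<'long>) -> Req<'short> {
    // Accepted because Req<'a> is covariant in 'a: Req<'long> is a subtype of Req<'short>.
    r
}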
Here's a shorter function that exhibits the same problem:
fn decode_packet<'a, T>(packet: Vec<u8>) where T: Decode<'a> {
T::decode(&packet);
}
This function fails to compile:
<anon>:38:16: 38:22 error: `packet` does not live long enough
<anon>:38 T::decode(&packet);
^~~~~~
It's possible to use higher-ranked trait bounds on this function to make it compile:
fn decode_packet<T>(packet: Vec<u8>) where for<'a> T: Decode<'a> {
T::decode(&packet);
}
However, now we have another problem: we can't invoke this function! If we try to invoke it like this:
fn main() {
decode_packet(vec![1, 2, 3]);
}
We get this error:
<anon>:55:5: 55:18 error: unable to infer enough type information about `_`; type annotations or generic parameter binding required [E0282]
<anon>:55 decode_packet(vec![1, 2, 3]);
^~~~~~~~~~~~~
That's because we didn't specify which implementation of Decode we want to use, and there are no parameters the compiler can use to infer this information.
What if we specify an implementation?
fn main() {
decode_packet::<Req>(vec![1, 2, 3]);
}
We get this error:
<anon>:55:5: 55:25 error: the trait `for<'a> Decode<'a>` is not implemented for the type `Req<'_>` [E0277]
<anon>:55 decode_packet::<Req>(vec![1, 2, 3]);
^~~~~~~~~~~~~~~~~~~~
That doesn't work, because the Req we wrote is actually interpreted as Req<'_>, where '_ is a lifetime inferred by the compiler. This doesn't implement Decode for all possible lifetimes, only for one particular lifetime.
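By contrast, a type that does not borrow from the packet can implement Decode<'a> for every lifetime, which does satisfy the higher-ranked bound. A hypothetical owned variant (not part of the original code) illustrates this:
struct OwnedReq(Vec<u8>);

impl<'a> Decode<'a> for OwnedReq {
    fn decode(packet: &'a [u8]) -> Result<Self, String> {
        // Copies the payload instead of borrowing it, so nothing ties OwnedReq to 'a.
        Ok(OwnedReq(packet.to_vec()))
    }
}

fn main() {
    decode_packet::<OwnedReq>(vec![1, 2, 3]); // OwnedReq: Decode<'a> for every 'a
}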
In order to implement your desired function, Rust would have to support higher-kinded types. It would then be possible to define a type constructor parameter (instead of a type parameter) on the function. For instance, you'd be able to pass Req or Rep as a type constructor that requires a lifetime parameter to produce a specific Req<'a> type.
It works correctly if you link the lifetimes involved to the input which you'll be referencing, e.g.
fn assert_packet<'a, T>(packet: &'a [u8], sample: T) where T: Decode<'a> ..
(You'll also need to update the tests to pass borrows rather than owned Vecs.)
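Putting that suggestion together, a sketch of the linked-lifetime version with the tests passing slices:
fn assert_packet<'a, T>(packet: &'a [u8], sample: T)
where
    T: Decode<'a> + PartialEq + std::fmt::Debug,
{
    assert_eq!(T::decode(packet), Ok(sample));
}

fn main() {
    assert_packet(&[1, 2, 3], Req::InputData(&[2, 3]));
    assert_packet(&[2], Req::Stop);
    assert_packet(&[1, 2, 3], Rep::OutputData(&[2, 3]));
    assert_packet(&[2], Rep::StopAck);
}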
I have a trait Atom that has many associated types, of which one is an owned version OP and the other is a borrowed view (P<'a> in the example below) of essentially the same data. I have a function to_pow_view that creates a view from an owned version, and I have an equality operator.
Below is an attempt:
pub trait Atom: PartialEq {
// variants truncated for this example
type P<'a>: Pow<'a, R = Self>;
type OP: OwnedPow<R = Self>;
}
pub trait Pow<'a>: Clone + PartialEq {
type R: Atom;
}
#[derive(Debug, Copy, Clone)]
pub enum AtomView<'a, R: Atom> {
Pow(R::P<'a>),
}
#[derive(Debug, Copy, Clone, PartialEq)]
pub enum OwnedAtom<R: Atom> {
Pow(R::OP),
}
pub trait OwnedPow {
type R: Atom;
fn some_mutable_fn(&mut self);
fn to_pow_view<'a>(&'a self) -> <Self::R as Atom>::P<'a>;
// compiler said I should add 'a: 'b
fn test<'a: 'b, 'b>(&'a mut self, other: <Self::R as Atom>::P<'b>) {
if self.to_pow_view().eq(&other) {
self.some_mutable_fn();
}
}
}
impl<R: Atom> OwnedAtom<R> {
// compiler said I should add 'a: 'b, why?
pub fn eq<'a: 'b, 'b>(&'a self, other: AtomView<'b, R>) -> bool {
let a: AtomView<'_, R> = match self {
OwnedAtom::Pow(p) => {
let pp = p.to_pow_view();
AtomView::Pow(pp)
}
};
match (&a, &other) {
(AtomView::Pow(l0), AtomView::Pow(r0)) => l0 == r0,
}
}
}
// implementation
#[derive(Debug, Copy, Clone, PartialEq)]
struct Rep {}
impl Atom for Rep {
type P<'a> = PowD<'a>;
type OP = OwnedPowD;
}
#[derive(Debug, Copy, Clone, PartialEq)]
struct PowD<'a> {
data: &'a [u8],
}
impl<'a> Pow<'a> for PowD<'a> {
type R = Rep;
}
struct OwnedPowD {
data: Vec<u8>,
}
impl OwnedPow for OwnedPowD {
type R = Rep;
fn some_mutable_fn(&mut self) {
todo!()
}
fn to_pow_view<'a>(&'a self) -> <Self::R as Atom>::P<'a> {
PowD { data: &self.data }
}
}
fn main() {}
This code gives the error:
27 | fn test<'a: 'b, 'b>(&'a mut self, other: <Self::R as Atom>::P<'b>) {
| -- lifetime `'b` defined here
28 | if self.to_pow_view().eq(&other) {
| ------------------
| |
| immutable borrow occurs here
| argument requires that `*self` is borrowed for `'b`
29 | self.some_mutable_fn();
| ^^^^^^^^^^^^^^^^^^^^^^ mutable borrow occurs here
I expect this to work, since the immutable borrow should be dropped right after the eq function evaluates.
Something about the lifetimes in this code is wrong, already in the equality function eq: I would expect no relation between 'a and 'b; they should just live long enough to do the comparison. However, the compiler tells me I should add 'a: 'b and I do not understand why. The same thing happened for the function test.
These problems lead me to believe that the lifetimes in to_pow_view are wrong, but no modification I tried made it work (except for removing the 'a lifetime on &'a self, but then the OwnedPowD does not compile anymore).
Can someone help understand what is going on?
Here's the point: you constrained Pow to be PartialEq. However, PartialEq is PartialEq<Self>. In other words, Pow<'a> only implements PartialEq<Pow<'a>> for the same 'a.
This is the case for pretty much any type with a lifetime and PartialEq, so why does it usually work, but not here?
It usually works because if we compare T<'a> == T<'b>, the compiler can shrink the lifetimes to the shortest of the two and compare that.
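For instance (a standalone sketch, not taken from the question), comparing two values of an ordinary covariant struct with different lifetimes compiles without complaint:
#[derive(PartialEq)]
struct Wrapper<'a>(&'a [u8]);

fn compare<'long, 'short>(a: Wrapper<'long>, b: Wrapper<'short>) -> bool {
    // Both sides are shrunk to a common (shorter) lifetime, so the derived
    // PartialEq, which only relates Wrapper<'x> to Wrapper<'x>, applies.
    a == b
}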
However, Pow is a trait, and traits are invariant over their lifetime parameters; in other words, the lifetime must stay exactly as it is, neither longer nor shorter. This is because they may be implemented for invariant types, for example ones containing Cell<&'a i32>. Here's an example of how it could be exploited if shrinking were allowed:
use std::cell::Cell;
struct Evil<'a> {
v: Cell<&'a i32>,
}
impl PartialEq for Evil<'_> {
fn eq(&self, other: &Self) -> bool {
// We asserted the lifetimes are the same, so we can do that.
self.v.set(other.v.get());
false
}
}
fn main() {
let foo = Evil { v: Cell::new(&123) };
{
let has_short_lifetime = 456;
_ = foo == Evil { v: Cell::new(&has_short_lifetime) };
}
// Now `foo` contains a dangling reference to `has_short_lifetime`!
dbg!(foo.v.get());
}
The above code does not compile because Evil is invariant over 'a, but if it did compile, it would exhibit undefined behavior in safe code. For that reason traits, which may be implemented for types such as Evil, are also invariant over their lifetimes.
Because of that, the compiler cannot shrink the lifetime of other. It can shrink the lifetime of self.to_pow_view() (in test(); eq() is similar), because that isn't really shrinking anything: it just picks a shorter lifetime for to_pow_view()'s self. But since PartialEq is only implemented for types with the same lifetime, the Pow resulting from self.to_pow_view() must have the same lifetime as other. It follows that (a) 'a must be at least as long as 'b, so that a borrow of 'b can be picked out of it, and (b) the comparison borrows self for 'b, potentially the whole of 'a (since 'a may equal 'b), so self is still borrowed immutably when we try to borrow it mutably for some_mutable_fn().
Now that we understand the problem, we can think about a solution: either require that Pow be covariant over 'a (so it can be shrunk), or require that it implement PartialEq<Pow<'b>> for any lifetime 'b. The first is impossible to express in Rust, but the second is possible:
pub trait Pow<'a>: Clone + for<'b> PartialEq<<Self::R as Atom>::P<'b>> {
type R: Atom;
}
This triggers an error, because the automatically-derived PartialEq does not satisfy this requirement:
error: implementation of `PartialEq` is not general enough
--> src/main.rs:73:10
|
73 | impl<'a> Pow<'a> for PowD<'a> {
| ^^^^^^^ implementation of `PartialEq` is not general enough
|
= note: `PartialEq<PowD<'0>>` would have to be implemented for the type `PowD<'a>`, for any lifetime `'0`...
= note: ...but `PartialEq` is actually implemented for the type `PowD<'1>`, for some specific lifetime `'1`
So we need to implement PartialEq manually:
impl<'a, 'b> PartialEq<PowD<'b>> for PowD<'a> {
fn eq(&self, other: &PowD<'b>) -> bool {
self.data == other.data
}
}
And now it works.
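To see what the for<'b> bound buys us, here is a hypothetical helper (not part of the question's code): two views of the same atom type can now be compared even though their lifetimes are unrelated, which is exactly what lets test() release the borrow of self before calling some_mutable_fn().
fn views_equal<'x, 'y, R: Atom>(a: R::P<'x>, b: R::P<'y>) -> bool {
    // R::P<'x> implements PartialEq<R::P<'b>> for *every* 'b, so 'x and 'y
    // do not have to be related in any way.
    a == b
}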
struct Response {}
struct PlayResponse(Response);
struct DescribeResponse(Response);
impl From<Response> for PlayResponse {
fn from(response: Response) -> Self {
PlayResponse(response)
}
}
enum RtspState {
Init,
Playing,
}
struct RtspMachine {
state: RtspState
}
pub trait OnEvent<T> {
fn on_event(&mut self, event: &T) -> std::result::Result<(), ()>;
}
impl OnEvent<PlayResponse> for RtspMachine {
fn on_event(&mut self, event: &PlayResponse) -> std::result::Result<(), ()> {
self.state = RtspState::Playing;
Ok(())
}
}
fn do_something<T: OnEvent<T>>() where RtspMachine: OnEvent<T>, T: From<Response>{
let mut rtsp_machine = RtspMachine{state: RtspState::Init};
rtsp_machine.on_event(&T::from(Response{}));
rtsp_machine.on_event(&PlayResponse::from(Response{}));
}
On the do_something above, we require where RtspMachine: OnEvent<T>, T: From<Response>.
Note that RtspMachine: OnEvent<PlayResponse> and PlayResponse: From<Response>, so I should be able to do rtsp_machine.on_event(&PlayResponse::from(Response{}));, but it only works for the version with T:
Compiling playground v0.0.1 (/playground)
error[E0308]: mismatched types
--> src/lib.rs:36:27
|
33 | fn do_something<T: OnEvent<T>>() where RtspMachine: OnEvent<T>, T: From<Response>{
| - this type parameter
...
36 | rtsp_machine.on_event(&PlayResponse::from(Response{}));
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ expected type parameter `T`, found struct `PlayResponse`
|
= note: expected reference `&T`
found reference `&PlayResponse`
I know that
fn do_something<T>() where
RtspMachine: OnEvent<T> + OnEvent<PlayResponse>,
T: From<Response>
would work, but I have lots of types for which I want to use the specific type instead of the generic T, so I can't just list them all in the where clause like that.
This is a known problem with the compiler's method resolution in the presence of trait bounds (#24066, #38071). Changing your method call
rtsp_machine.on_event(&PlayResponse::from(Response{}));
to the explicit function call form
OnEvent::<PlayResponse>::on_event(
&mut rtsp_machine, &PlayResponse::from(Response{}));
will allow the code to compile. Apparently, in the version that doesn't work, method resolution is looking only at the OnEvent<T> trait that's mentioned in where, even though OnEvent<PlayResponse> also exists.
I don't know if there's a more elegant solution, but perhaps the above will be adequate for your problem — at least it means the extra syntax is local to the call site.
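For reference, a sketch of do_something with the question's signature kept as-is and only the second call rewritten in fully-qualified form:
fn do_something<T: OnEvent<T>>()
where
    RtspMachine: OnEvent<T>,
    T: From<Response>,
{
    let mut rtsp_machine = RtspMachine { state: RtspState::Init };
    rtsp_machine.on_event(&T::from(Response {}));
    // Fully-qualified call: names the OnEvent<PlayResponse> impl explicitly,
    // so method resolution no longer stops at the OnEvent<T> bound.
    OnEvent::<PlayResponse>::on_event(&mut rtsp_machine, &PlayResponse::from(Response {}));
}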
I have a struct other_struct that has a bunch of methods that I need to call depending on certain situations (in this example there is only foo()). I'd like to have a field in other_struct called fmap that stores a HashMap of other_struct methods.
use std::collections::HashMap;
pub struct fn_struct {
pub func: Option<fn(&other_struct) -> ()>,
}
pub struct other_struct<'a> {
fmap: HashMap<String, fn_struct>,
some_str: &'a str,
}
impl<'a> other_struct<'a> {
fn new(some_str: &str) -> other_struct {
let mut new_struct = other_struct {
fmap: HashMap::new(),
some_str: some_str,
};
new_struct.fmap.insert(
String::from("foo"),
fn_struct {
func: Some(other_struct::foo),
},
);
new_struct
}
pub fn foo(&self) {
println!("Do some stuff foo");
}
}
fn main() {
let test_str = "test";
let mut new_o = other_struct::new(test_str);
new_o.fmap.get("foo").unwrap().func.unwrap()(&new_o);
}
I'm struggling with dealing with the lifetimes, as I get the following error:
error[E0308]: mismatched types
--> src/main.rs:22:28
|
22 | func: Some(other_struct::foo),
| ^^^^^^^^^^^^^^^^^ one type is more general than the other
|
= note: expected fn pointer `for<'r, 's> fn(&'r other_struct<'s>)`
found fn pointer `for<'r> fn(&'r other_struct<'_>)`
I've been reading the high ranked trait bound documentation but it's unclear to me what's happening here. Does this mean the compiler wants me to specify the lifetime of the instance of other_struct somehow in relation to the lifetime of the pointer to the method?
The fn(&other_struct) -> () part of fn_struct is a function pointer that must be able to accept an other_struct of any lifetime: for<'r, 's> fn(&'r other_struct<'s>) -> (). However, other_struct::foo only accepts a specific lifetime for other_struct<'_>.
You can fix it by pinning down that lifetime in some manner, here essentially saying the stored function will always be passed an other_struct with the same 'a as the one holding it:
pub struct fn_struct<'a> {
pub func: Option<fn(&other_struct<'a>) -> ()>,
}
pub struct other_struct<'a> {
fmap: HashMap<String, fn_struct<'a>>,
some_str: &'a str,
}
Or by making foo more generic by unbinding it from 'a.
pub fn foo(_self: &other_struct) {
println!("Do some stuff foo");
}
The difference is subtle, but the original is other_struct<'a>::foo<'b>(&'b other_struct<'a>) -> () and the fixed version is other_struct<'a>::foo<'b, 'c>(&'b other_struct<'c>) -> ().
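The coercion difference can be checked directly with a hypothetical helper (not part of the original code) that demands the fully general pointer type:
fn takes_general(_f: for<'r, 's> fn(&'r other_struct<'s>)) {}

fn demo_coercion() {
    // Compiles with the unbound `foo(_self: &other_struct)`, but fails with the
    // original method `foo(&self)`, which is only general in the outer reference.
    takes_general(other_struct::foo);
}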
I've got three examples: one using Vec, one using SmallVec, and one with my own implementation of SmallVec. The ones using Vec and my own SmallVec compile, but the one using the real SmallVec does not.
Working example using Vec
use std::borrow::Cow;
use std::collections::HashMap;
pub trait MyTrait {
fn get_by_id(&self, id: usize) -> &ItemTraitReturns;
}
/// IMPORTANT PART IS HERE: `Vec<Cow<'a, str>>`
pub struct ItemTraitReturns<'a>(Vec<Cow<'a, str>>);
/// this implementation only takes items with static lifetime (but other implementations also might have different lifetimes)
pub struct MyTraitStruct {
map: HashMap<usize, ItemTraitReturns<'static>>,
}
impl MyTrait for MyTraitStruct {
fn get_by_id(&self, id: usize) -> &ItemTraitReturns {
let temp: &ItemTraitReturns<'static> = self.map.get(&id).unwrap();
// Works as expected: I expect that I can return `&ItemTraitReturns<'_>`
// when I have `&ItemTraitReturns<'static>` (since 'static outlives everything).
temp
// Will return `&ItemTraitReturns<'_>`
}
}
Failing example with SmallVec
Uses SmallVec instead of Vec with no other changes.
use smallvec::SmallVec;
use std::borrow::Cow;
use std::collections::HashMap;
pub trait MyTrait {
fn get_by_id(&self, id: usize) -> &ItemTraitReturns;
}
/// IMPORTANT PART IS HERE: Uses SmallVec instead of Vec
pub struct ItemTraitReturns<'a>(SmallVec<[Cow<'a, str>; 2]>);
/// this implementation only takes items with static lifetime (but other implementations also might have different lifetimes)
pub struct MyTraitStruct {
map: HashMap<usize, ItemTraitReturns<'static>>,
}
impl MyTrait for MyTraitStruct {
fn get_by_id(&self, id: usize) -> &ItemTraitReturns {
let temp: &ItemTraitReturns<'static> = self.map.get(&id).unwrap();
temp
}
}
error[E0308]: mismatched types
--> src/lib.rs:23:9
|
23 | temp
| ^^^^ lifetime mismatch
|
= note: expected type `&ItemTraitReturns<'_>`
found type `&ItemTraitReturns<'static>`
note: the anonymous lifetime #1 defined on the method body at 18:5...
--> src/lib.rs:18:5
|
18 | / fn get_by_id(&self, id: usize) -> &ItemTraitReturns {
19 | | let temp: &ItemTraitReturns<'static> = self.map.get(&id).unwrap();
20 | | // Error:
21 | | // = note: expected type `&demo2::ItemTraitReturns<'_>`
22 | | // found type `&demo2::ItemTraitReturns<'static>`
23 | | temp
24 | | }
| |_____^
= note: ...does not necessarily outlive the static lifetime
Working example with my own SmallVec
When I implement my own (very naive) SmallVec<[T; 2]> (called NaiveSmallVec2<T>) the code also compiles... Very strange!
use std::borrow::Cow;
use std::collections::HashMap;
/// This is a very naive implementation of a SmallVec<[T; 2]>
pub struct NaiveSmallVec2<T> {
item1: Option<T>,
item2: Option<T>,
more: Vec<T>,
}
impl<T> NaiveSmallVec2<T> {
pub fn push(&mut self, item: T) {
if self.item1.is_none() {
self.item1 = Some(item);
} else if self.item2.is_none() {
self.item2 = Some(item);
} else {
self.more.push(item);
}
}
pub fn element_by_index(&self, index: usize) -> Option<&T> {
match index {
0 => self.item1.as_ref(),
1 => self.item2.as_ref(),
_ => self.more.get(index - 2),
}
}
}
pub trait MyTrait {
fn get_by_id(&self, id: usize) -> &ItemTraitReturns;
}
/// IMPORTANT PART IS HERE: Uses NaiveSmallVec2
pub struct ItemTraitReturns<'a>(NaiveSmallVec2<Cow<'a, str>>);
/// only takes items with static lifetime
pub struct MyTraitStruct {
map: HashMap<usize, ItemTraitReturns<'static>>,
}
impl MyTrait for MyTraitStruct {
fn get_by_id(&self, id: usize) -> &ItemTraitReturns {
let temp: &ItemTraitReturns<'static> = self.map.get(&id).unwrap();
// astonishingly this works!
temp
}
}
I expect the SmallVec version to compile like the Vec version does. I don't understand why in some cases (in the case of Vec) &ItemTraitReturns<'static> can be converted to &ItemTraitReturns<'_> and in some cases (SmallVec) it's not possible (I don't see the influence of Vec / SmallVec).
I don't want to change lifetimes of this trait:
pub trait MyTrait {
fn get_by_id(&self, id: usize) -> &ItemTraitReturns;
}
... since when using the trait I don't care about lifetimes (this should be an implementation detail) ... but still would like to use SmallVec.
This difference appears to be caused by a difference in variance between Vec and SmallVec. While Vec<T> is covariant in T, SmallVec appears to be invariant. As a result, SmallVec<[&'a T; 1]> can't be converted to SmallVec<[&'b T; 1]> even when lifetime 'a outlives 'b and the conversion should be safe. Here is a minimal example triggering the compiler error:
fn foo<'a, T>(x: SmallVec<[&'static T; 1]>) -> SmallVec<[&'a T; 1]> {
x // Compiler error here: lifetime mismatch
}
fn bar<'a, T>(x: Vec<&'static T>) -> Vec<&'a T> {
x // Compiles fine
}
This appears to be a shortcoming of SmallVec. Variance is automatically determined by the compiler, and it is sometimes cumbersome to convince the compiler that it is safe to assume that a type is covariant.
There is an open GitHub issue for this problem.
Unfortunately, I don't know a way of figuring out the variance of a type based on its public interface. Variance is not included in the documentation, and depends on the private implementation details of a type. You can read more on variance in the language reference or in the Nomicon.
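To illustrate how a single field determines variance (a standalone sketch, not SmallVec's actual internals): a wrapper around Vec<T> stays covariant, while adding anything invariant over T, such as interior mutability, makes the whole type invariant.
use std::cell::Cell;
use std::marker::PhantomData;

struct CovariantVec<T>(Vec<T>);
struct InvariantVec<T>(Vec<T>, PhantomData<Cell<T>>);

fn shrink_covariant<'a, T>(v: CovariantVec<&'static T>) -> CovariantVec<&'a T> {
    v // fine: CovariantVec<T> is covariant in T
}

// fn shrink_invariant<'a, T>(v: InvariantVec<&'static T>) -> InvariantVec<&'a T> {
//     v // error: lifetime mismatch, because InvariantVec<T> is invariant in T
// }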
I'm aware of Where did the 'static lifetime come from and Cannot infer an appropriate lifetime for autoref due to conflicting requirements.
But I still do not understand the problem I've encountered:
use std::ops::Index;
trait Stack<T> {
fn as_slice(&self) -> &[T];
}
impl<T> Index<usize> for Stack<T> {
type Output = T;
fn index(&self, i: usize) -> &T {
&self.as_slice()[i]
}
}
trait Core {
fn stack(&self) -> &Stack<usize>;
fn bad(&mut self) -> usize {
self.stack()[0]
}
fn good(&mut self) -> usize {
self.stack().as_slice()[0]
}
}
fn main() {}
In the code above, good() gives no error, but bad() complains with:
error[E0495]: cannot infer an appropriate lifetime for autoref due to conflicting requirements
--> src/main.rs:18:14
|
18 | self.stack()[0]
| ^^^^^
|
note: first, the lifetime cannot outlive the anonymous lifetime #1 defined on the method body at 17:5...
--> src/main.rs:17:5
|
17 | / fn bad(&mut self) -> usize {
18 | | self.stack()[0]
19 | | }
| |_____^
note: ...so that reference does not outlive borrowed content
--> src/main.rs:18:9
|
18 | self.stack()[0]
| ^^^^
= note: but, the lifetime must be valid for the static lifetime...
= note: ...so that the types are compatible:
expected std::ops::Index<usize>
found std::ops::Index<usize>
There is no Box in this code, and I do not know where the static lifetime comes from.
Edit: through trial and error I found that the compiler assumes Stack + 'static. The following code compiles. But why? Please point me to some documentation.
impl<'b, T> Index<usize> for Stack<T> + 'b {
type Output = T;
fn index<'a>(&'a self, i: usize) -> &'a T {
&self.as_slice()[i]
}
}
You are implementing a trait (Index) for a trait object (Stack<T>).
The Rust reference states:
Since a trait object can contain references, the lifetimes of those references need to be expressed as part of the trait object. This lifetime is written as Trait + 'a. There are defaults that allow this lifetime to usually be inferred with a sensible choice.
If you don't specify a lifetime, the compiler assumes a default; in this case it assumes 'static (the default trait object lifetime rules explain this in detail).
Your code is equivalent to:
impl<T> Index<usize> for Stack<T> + 'static {
type Output = T;
fn index(&self, i: usize) -> &T {
&self.as_slice()[i]
}
}
To resolve the "cannot infer an appropriate lifetime for autoref due to conflicting requirements" compilation error, you can declare that the stack() method returns a trait object with a 'static lifetime:
trait Core {
fn stack(&self) -> &'static Stack<usize>;
fn bad(&mut self) -> usize {
self.stack()[0]
}
fn good(&mut self) -> usize {
self.stack().as_slice()[0]
}
}
Otherwise, declare a generic lifetime for the Stack<T> trait object that implements Index:
impl<'a, T> Index<usize> for Stack<T> + 'a {
type Output = T;
fn index(&self, i: usize) -> &T {
&self.as_slice()[i]
}
}
trait Core {
fn stack(&self) -> &Stack<usize>;
fn bad(&mut self) -> usize {
self.stack()[0]
}
fn good(&mut self) -> usize {
self.stack().as_slice()[0]
}
}
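On the 2018 and later editions, bare trait object syntax is deprecated, so the same fix is usually spelled with dyn; a sketch (the rest of the code is unchanged apart from writing &dyn Stack<usize> in Core::stack):
// Edition 2018/2021 spelling of the generic-lifetime impl.
impl<'a, T> Index<usize> for dyn Stack<T> + 'a {
    type Output = T;
    fn index(&self, i: usize) -> &T {
        &self.as_slice()[i]
    }
}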
At this point you might ask: why does using as_slice() in good() work, while using index() in bad() does not?
To make some sense of it, read the comments embedded in the MVCE below.
use std::ops::Index;
trait Stack<T> {
fn as_slice(&self) -> &[T];
}
// equivalent to: impl<T> Index<usize> for Stack<T>
// just to expose the conflicting requirements error
// the right declaration is:
// impl<'a, T> Index<usize> for Stack<T> + 'a
impl<T> Index<usize> for Stack<T> + 'static {
type Output = T;
fn index(&self, i: usize) -> &T {
&self.as_slice()[i]
}
}
trait Core {
fn stack(&self) -> &Stack<usize>;
fn bad<'a>(&'a mut self) -> usize {
//self.stack()[0] syntactic sugar for:
*self.stack().index(0)
// self.stack() returns a trait object with a lifetime bound != 'static
// but Stack impl for Index requires a 'static lifetime bound:
// COMPILE ERROR: cannot infer an appropriate lifetime for
// autoref due to conflicting requirements
}
fn good<'a>(&'a mut self) -> usize {
// self.stack() returns a trait object with 'a lifetime bound
// this is congruent with as_slice() lifetime requirements
// see Catasta::as_slice() impl below
// NO COMPILE ERROR
self.stack().as_slice()[0]
}
}
struct Catasta<T> {
pila: [T;4]
}
impl<T> Stack<T> for Catasta<T> {
fn as_slice<'a>(&'a self) -> &'a [T] {
&self.pila
}
}
struct RealCore {
stk: Catasta<usize>
}
impl Core for RealCore {
fn stack(&self) -> &Stack<usize> {
&self.stk
}
}
fn main() {
let mut core = RealCore {stk: Catasta {pila: [100, 2, 3, 4]} };
println!("pos [0] item: {}", core.good());
}