I'm struggling with this error rustc gives me:
error: method `create_shader_explicit` has an incompatible type for trait: expected bound lifetime parameter 'a, found concrete lifetime
My trait declaration is pretty much this:
pub trait GraphicsContext<R: Resources> {
    /// Creates a shader object
    fn create_shader<'a>(&'a self, shader::Stage, source: &str) ->
        Result<handle::Shader<R>, shader::CreateError>;
}
Here's my implementation,
pub struct OpenGLResources<'a> {
    phantom: PhantomData<&'a u32>
}

impl<'a> Resources for OpenGLResources<'a> {
    type Shader = Shader<'a>;
}
impl<'z> GraphicsContext<OpenGLResources<'z>> for OpenGLGraphicsContext {
    /// Creates a shader object
    fn create_shader<'a>(&'a self, stage: shader::Stage, source: &str) ->
        Result<handle::Shader<OpenGLResources>, shader::CreateError> {
        let shader = Shader::new(self, stage);
        try!(shader.compile_from_source(source));
        Ok(shader)
    }
}
In other questions on Stack Overflow, the problem was a missing <'a> between create_shader and (...), but when I compare the fn definitions in my trait and my impl, they look identical.
EDIT:
Changing the definition inside the impl to the following (note the explicit <'z> on OpenGLResources) fixes that issue:
fn create_shader<'a>(&'a self, stage: shader::Stage, source: &str) ->
    Result<handle::Shader<OpenGLResources<'z>>, shader::CreateError>
But then the issue is that 'a and 'z need to be the same lifetime. If I change it to this (taking &'z self directly):
fn create_shader(&'z self, stage: shader::Stage, source: &str) ->
    Result<handle::Shader<OpenGLResources<'z>>, shader::CreateError>
The impl block works, but then I need a way of specifying the 'z lifetime in the trait definition. I tried the following:
pub trait<'z> GraphicsContext<R: Resources<'z>>
But it didn't work.
When comparing things like this, you need to remember to expand all the generics so that you can actually compare it all. In this case, you haven’t expanded R. If you do, the answer becomes obvious: R is OpenGLResources<'z>, linking the OpenGLResources to the impl block, whereas your method definition has elided the lifetime on OpenGLResources, causing it to be inferred as self’s lifetime, which is 'a.
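Concretely, after substituting R = OpenGLResources<'z>, the two signatures being compared look roughly like this (a sketch using the names from the question, not actual compiler output):
// What the trait requires of the impl:
fn create_shader<'a>(&'a self, stage: shader::Stage, source: &str) ->
    Result<handle::Shader<OpenGLResources<'z>>, shader::CreateError>;

// What the impl actually wrote: the elided lifetime on OpenGLResources
// is filled in from &'a self, so the return type talks about 'a, not 'z.
fn create_shader<'a>(&'a self, stage: shader::Stage, source: &str) ->
    Result<handle::Shader<OpenGLResources<'a>>, shader::CreateError>;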
Thanks to the hints from @Chris Morgan I managed to implement this functionality and it's now working fine.
If we start with the base trait with the 'a lifetime included:
trait Resources<'a> {
    type Shader: Shader;
    type ShaderProgram: ShaderProgram;
}
Then implement it for OpenGL (note the PhantomData field):
struct OpenGLResources<'a> {
    phantom: PhantomData<&'a u32> // 'a is the lifetime of the context reference
}

impl<'a> Resources<'a> for OpenGLResources<'a> {
    type Shader = Shader<'a>;
    type ShaderProgram = ShaderProgram<'a>;
    // (the real trait also has CommandBuffer and CommandBufferBuilder
    // associated types, elided from the simplified Resources trait above)
}
It's a bit verbose, but the GraphicsContext trait works fine now too. The 'a lifetime goes in the type-parameter list.
trait GraphicsContext<'a, R: Resources<'a>> {
    fn create_shader(&'a self, ty: Type, source: &str) -> Result<R::Shader, ShaderCreationError>;
}
Finally this is the required code in the graphics context implementation.
It is extremely verbose with the 'a lifetimes sprinkled everywhere but at least it works!
impl<'a> GraphicsContext<'a, OpenGLResources<'a>> for OpenGLGraphicsContext
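Filling in the body, the method ends up looking roughly like this (a sketch based on the code from the question; Type, ShaderCreationError, Shader::new and compile_from_source are whatever your crate actually defines):
impl<'a> GraphicsContext<'a, OpenGLResources<'a>> for OpenGLGraphicsContext {
    fn create_shader(&'a self, ty: Type, source: &str)
        -> Result<Shader<'a>, ShaderCreationError> {
        // R::Shader resolves to Shader<'a> via the Resources impl above
        let shader = Shader::new(self, ty);
        try!(shader.compile_from_source(source));
        Ok(shader)
    }
}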
Related
The impl Trait syntax for return types seems to cause the compiler to incorrectly assume that the lifetime of the input argument must match the output in some situations. Consider the function
fn take_by_trait<T: InTrait>(_: T) -> impl OutTrait {}
If the input type contains a lifetime, the compiler complains when the output outlives it, even though the two are completely independent. This does not happen if the input type is not generic, or if the output is a Box<dyn OutTrait>.
Full code:
trait InTrait {}

struct InStruct<'a> {
    _x: &'a bool, // Just a field with some reference
}

impl<'a> InTrait for InStruct<'a> {}

trait OutTrait {}
impl OutTrait for () {}

fn take_by_type(_: InStruct) -> impl OutTrait {}

fn take_by_trait<T: InTrait>(_: T) -> impl OutTrait {}

fn take_by_trait_output_dyn<T: InTrait>(_: T) -> Box<dyn OutTrait> {
    Box::new(())
}

fn main() {
    let _ = {
        let x = true;
        take_by_trait(InStruct { _x: &x }) // DOES NOT WORK
        // take_by_type(InStruct { _x: &x }) // WORKS
        // take_by_trait_output_dyn(InStruct { _x: &x }) // WORKS
    };
}
Is there some elided lifetime here that I could qualify to make this work, or do I need to do heap allocation?
The semantics of impl Trait mean that the function returns some type that implements the Trait, but the caller cannot make any assumptions about which type that is or what lifetimes it uses.
For all the compiler knows, the take_by_trait function could be called from many different modules, or even from other crates. Your current implementation would work fine in all of those use cases; it could be rewritten as
fn take_by_trait<T: InTrait>(_: T) -> () {}
This is a perfectly ordinary function and works everywhere. But at some point you might want to add another implementation of OutTrait and change your take_by_trait function a little:
trait OutTrait { fn use_me(&self) {} }
impl<T: InTrait> OutTrait for T {}
fn take_by_trait<T: InTrait>(v: T) -> impl OutTrait {v}
If we expand the generic parameter and impl definition, we get this code:
fn take_by_trait<'a>(v: InStruct<'a>) -> InStruct<'a> {v}
fn main() {
    let v = {
        let x = true;
        take_by_trait(InStruct { _x: &x })
    };
    v.use_me();
}
This obviously cannot work, because x is dropped before v.use_me() is called on the value that borrows from it. So by adding a new implementation of OutTrait you broke code that uses this function, potentially somewhere in a crate that depends on yours. That's why the compiler is reluctant to let you define such things.
So, again, the issue with impl OutTrait is just that the compiler cannot make any assumptions about the returned type and its lifetime, so it uses the maximum possible bound, which produces the borrow checker error you see.
EDIT: I've modified the code a little so that the signature of the function would not change and the code actually compiles and produces the same lifetime error: playground
impl Trait in return position implicitly captures any lifetime appearing in the generic parameters. This is not something you can express in normal Rust syntax; it only happens with impl Trait.
On stable as far as I know there is no way to avoid that. On nightly you can use type_alias_impl_trait:
#![feature(type_alias_impl_trait)]
type Ret = impl OutTrait;
fn take_by_trait<T: InTrait>(v: T) -> Ret {}
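With that alias, the call site from the question should be accepted, because Ret is a free type alias with no generic parameters and therefore captures no lifetime from T (a sketch under the same nightly feature):
fn main() {
    let _ = {
        let x = true;
        take_by_trait(InStruct { _x: &x }) // now fine: Ret does not borrow from x
    };
}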
See issues Unclear compiler error when impl Trait return value captures non-'static argument (#82171), impl Trait capturing lifetime of type parameter (#79415), False-positive "temporary dropped while borrowed" involving return-position impl Trait (#98997), impl Trait + 'static is not static if returned from generic function (#76882).
TL;DR: I would like to inherit a trait from From like this: pub trait VectorLike<'a, T: 'a + Clone>: From<&'a [T]> {} but in such a way that VectorLike doesn't have a lifetime parameter.
Longer version. I'm writing generic code that supports various vector kinds (e.g. the standard Vec and SmallVec from the smallvec crate). In order for my functions to work with different vectors, I'm making a trait that encompasses everything that is common to them. It goes like this:
pub trait VectorLike<T: Clone>:
    ops::Deref<Target = [T]> +
    IntoIterator<Item = T> +
    FromIterator<T> +
{
    fn pop(&mut self) -> Option<T>;
    fn push(&mut self, value: T);
}
Everything above works fine.
However, I run into a problem when I try to support creating a vector from a slice, as in Vec::from(&[1, 2, 3][..]). To allow this, Vec implements the From<&'_ [T]> trait, so naturally I tried to add it as a parent trait for VectorLike:
pub trait VectorLike<T: Clone>:
    // ...
    From<&[T]> +
{
    // ...
}
... and I get an “expected named lifetime parameter” error. The compiler suggests fixing the code like this:
pub trait VectorLike<'a, T: 'a + Clone>:
    // ...
    From<&'a [T]> +
{
    // ...
}
Now I have to specify an explicit lifetime wherever VectorLike is used, which seems completely unnecessary:
This lifetime contains no valuable information: the vector does not inherit the lifetime, it copies all elements.
Lifetime specification of this sort is not required when Vec is used directly, e.g. this works: fn make_vector(elements: &[i32]) -> Vec<i32> { Vec::<i32>::from(elements) }.
I can work around this limitation by adding a new function instead of implementing From: pub trait VectorLike<T: Clone>: /* ... */ { fn from_slice(s: &[T]) -> Self; }. This way my trait is functionally equivalent and can be used without lifetime specifiers.
So is there a way to remove the superfluous lifetime specifier here?
P.S. For now I'm using the “new function” option as a workaround, but it has drawbacks. For example, I find it confusing when the function has a different name, but giving it the same name leads to ambiguity which needs to be resolved with verbose constructs like this: <SmallVec::<[T; 8]> as From<&[T]>>::from(slice).
Every reference in Rust must have an associated lifetime.
The reason you usually don't have to write them out is that the compiler is very good at lifetime elision.
Whenever that doesn't work, you need to carefully consider where the lifetime you need is actually coming from.
In this case, the constraint you're trying to impose can be thought of as
"for any lifetime 'a that you give me, VectorLike<T> implements From<&'a [T]>".
This is a higher-ranked trait bound that can be expressed using for<'a> syntax:
pub trait VectorLike<T: Clone>:
    ...
    for<'a> From<&'a [T]>
{
    ...
}
Playground
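For illustration, here is a self-contained sketch of the trait with that bound, plus an implementation for Vec and a lifetime-free use site (the Vec impl and the make_vector helper are mine, not part of the original answer):
pub trait VectorLike<T: Clone>:
    std::ops::Deref<Target = [T]> +
    IntoIterator<Item = T> +
    std::iter::FromIterator<T> +
    for<'a> From<&'a [T]>
{
    fn pop(&mut self) -> Option<T>;
    fn push(&mut self, value: T);
}

impl<T: Clone> VectorLike<T> for Vec<T> {
    fn pop(&mut self) -> Option<T> { Vec::pop(self) }
    fn push(&mut self, value: T) { Vec::push(self, value) }
}

// No lifetime parameter appears at the use site:
fn make_vector<V: VectorLike<i32>>(elements: &[i32]) -> V {
    V::from(elements)
}

fn main() {
    let v: Vec<i32> = make_vector(&[1, 2, 3]);
    assert_eq!(v.len(), 3);
}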
I have a trait for which I want to require that implementing types are iterable by borrow. I have managed to do this using a for<'x> higher-ranked trait bound (HRTB) on &'x Self.
However, I also want to require that the IntoIter associated type implements ExactSizeIterator, but every way I try to describe this in the type system causes compilation issues which seem to stem from the use of the HRTB (which I am not fully confident I'm using correctly).
Here is the (simplified) code:
struct Thing<'thing>(&'thing ());

trait Trait<'thing>
where
    for<'x> &'x Self: IntoIterator<Item = &'x Thing<'thing>>,
    // Compiles fine until uncommenting this line:
    //for<'x> <&'x Self as IntoIterator>::IntoIter: ExactSizeIterator
{ }

struct Bucket<'things> {
    things: Vec<Thing<'things>>,
}

struct BucketRef<'a, 'things: 'a> {
    bucket: &'a Bucket<'things>,
}

impl<'x, 'a, 'things: 'a> IntoIterator for &'x BucketRef<'a, 'things> {
    type Item = &'x Thing<'things>;
    type IntoIter = std::slice::Iter<'x, Thing<'things>>;

    fn into_iter(self) -> Self::IntoIter {
        self.bucket.things.iter()
    }
}

impl<'a, 'things: 'a> Trait<'things> for BucketRef<'a, 'things> { }

fn foo<'a, 'things>(anchor: &BucketRef<'a, 'things>) {
    println!("{}", ExactSizeIterator::len(&anchor.into_iter()));
}
As written this compiles fine, but when I try to further restrict the bounds on Trait via the commented line, I get the following compiler error:
error[E0277]: the trait bound `for<'x> <&'x anchor::BucketRef<'a, 'things> as std::iter::IntoIterator>::IntoIter: std::iter::ExactSizeIterator` is not satisfied
It seems to my not-a-compiler-writer mind that, given rustc is apparently able to determine inside the function foo that &BucketRef's iterator is an ExactSizeIterator, it should be able to do the same for the trait bound, but this is not borne out in reality.
Can anyone explain to me why this doesn't work and if there is a better way to express either the bound itself or the intent behind the bound?
active toolchain
----------------
stable-x86_64-unknown-linux-gnu (default)
rustc 1.43.0 (4fb7144ed 2020-04-20)
This may be related to the following compiler bug and may be fixed by the linked PR: https://github.com/rust-lang/rust/issues/56556
https://github.com/rust-lang/rust/pull/85499
workarounds:
https://github.com/Lucretiel/joinery/blob/master/src/join.rs#L174-L188
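Until that is fixed, one pragmatic workaround (a sketch of the general idea, not the exact code behind the links above) is to ask for the length directly on the trait instead of through a bound on the borrowed iterator:
trait Trait<'thing>
where
    for<'x> &'x Self: IntoIterator<Item = &'x Thing<'thing>>,
{
    // Each implementor supplies this from its concrete storage,
    // e.g. self.bucket.things.len() for BucketRef.
    fn len(&self) -> usize;
}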
This used to work:
struct Foo<'a, T> {
    parent: &'a (Array<T> + 'a)
}

impl<'a, T> Foo<'a, T> {
    pub fn new<T>(parent: &Array<T>) -> Foo<T> {
        return Foo {
            parent: parent
        };
    }
}
trait Array<T> {
    fn as_foo(&self) -> Foo<T> {
        return Foo::new(self);
    }
}

fn main() {
}
Now it errors:
<anon>:15:21: 15:25 error: the trait `core::kinds::Sized` is not implemented for the type `Self`
<anon>:15 return Foo::new(self);
I can kind of guess what's wrong; it's saying that my impl of Foo<'a, T> is for T, not Sized? T, but I'm not trying to store a Sized? element in it; I'm storing a reference to a Sized element in it. That should be a pointer, fixed size.
I don't see what's wrong with what I'm doing, or why it's wrong?
For example, I should (I think...) be able to store a &Array in my Foo, no problem. I can't see any reason this would force my Foo instance to be unsized.
playpen link: http://is.gd/eZSZYv
There are two things going on here: trait object coercions (the error), and object safety (fixing it).
The error
As suggested by the error message, the difficult part of the code is the Foo::new(self), and this is because pub fn new<T>(parent: &Array<T>) -> ..., that is, self is being coerced to an &Array<T> trait object. I'll simplify the code to:
trait Array {
    fn as_foo(&self) {
        let _ = self as &Array; // coerce to a trait object
    }
}

fn main() {}
which gives the same thing:
<anon>:3:13: 3:27 error: the trait `core::kinds::Sized` is not implemented for the type `Self`
<anon>:3 let _ = self as &Array; // coerce to a trait object
                 ^~~~~~~~~~~~~~
Self is the stand-in name for the type that implements the trait. Unlike most generic parameters, Self is possibly-unsized (?Sized) by default, since RFC 546 and #20341 for the purposes of allowing e.g. impl Array<T> for Array<T> to work by default more often (we'll come to this later).
The variable self has type &Self. If Self is a sized type, then this is a normal reference: a single pointer. If Self is an unsized type (like [T] or a trait), then &Self (&[T] or &Trait) is a slice/trait object: a fat pointer.
The error appears because the only references &T that can be cast to a trait object are when T is sized: Rust doesn't support making fat pointers fatter, only thin pointer → fat pointer is valid. Hence, since the compiler doesn't know that Self will always be Sized (remember, it's special and ?Sized by default) it has to assume the worst: that the coercion is not legal, and so it's disallowed.
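As an aside, the thin/fat distinction is easy to observe directly (a small illustration in current Rust syntax, not part of the original answer):
use std::mem::size_of;

fn main() {
    // A reference to a sized type is a single pointer...
    assert_eq!(size_of::<&u32>(), size_of::<usize>());
    // ...while a reference to a trait object is a data pointer plus a vtable pointer.
    assert_eq!(size_of::<&dyn std::fmt::Debug>(), 2 * size_of::<usize>());
}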
Fixing it
It seems logical that the fix we're looking for is to ensure that Self: Sized when we want to do a coercion. The obvious way to do this would be to make Self always Sized, that is, override the default ?Sized bound as follows:
trait Array: Sized {
    fn as_foo(&self) {
        let _ = self as &Array; // coerce to a trait object
    }
}

fn main() {}
Looks good!
Except there's the small point that it doesn't work; but at least it's for a different reason, so we're making progress! Trait objects can only be made out of traits that are "object safe" (i.e. safe to be made into a trait object), and having a Sized Self is one of the things that breaks object safety:
<anon>:3:13: 3:17 error: cannot convert to a trait object because trait `Array` is not object-safe [E0038]
<anon>:3 let _ = self as &Array; // coerce to a trait object
                 ^~~~
<anon>:3:13: 3:17 note: the trait cannot require that `Self : Sized`
<anon>:3 let _ = self as &Array; // coerce to a trait object
                 ^~~~
<anon>:3:13: 3:17 note: the trait cannot require that `Self : Sized`
<anon>:3 let _ = self as &Array; // coerce to a trait object
                 ^~~~
(I filed the double printing of the note as #20692.)
Back to the drawing board. There are a few other "easy" possibilities for a solution:
define an extension trait trait ArrayExt: Sized + Array { fn as_foo(&self) { ... } } and implement it for all Sized + Array types (sketched below)
just use a free function fn array_as_foo<A: Array>(x: &A) { ... }
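For concreteness, a minimal sketch of the extension-trait option (using the simplified Array trait from above, with the default method moved out of it):
trait Array {}

trait ArrayExt: Sized + Array {
    fn as_foo(&self) {
        let _ = self as &Array; // fine: Self is known to be Sized here
    }
}

impl<T: Array> ArrayExt for T {}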
However, these don't necessarily work for every use case, e.g. specific types can't customise the behaviour by overriding the default method. Fortunately, there is a fix!
The Turon Trick
(Named for Aaron Turon, who discovered it.)
Using generalised where clauses we can be highly specific about when Self should implement Sized, restricting it to just the method(s) where it is required, without infecting the rest of the trait:
trait Array {
    fn as_foo(&self) where Self: Sized {
        let _ = self as &Array; // coerce to a trait object
    }
}

fn main() {}
This compiles just fine! By using the where clause like this, the compiler understands that (a) the coercion is legal because Self is Sized so self is a thin pointer, and (b) that the method is illegal to call on a trait object anyway, and so doesn't break object safety. To see it being disallowed, changing the body of as_foo to
let x = self as &Array; // coerce to a trait object
x.as_foo();
gives
<anon>:4:7: 4:15 error: the trait `core::kinds::Sized` is not implemented for the type `Array`
<anon>:4 x.as_foo();
          ^~~~~~~~
as expected.
Wrapping it all up
Making this change to the original unsimplified code is as simple as adding that where clause to the as_foo method:
struct Foo<'a, T> {
    parent: &'a (Array<T> + 'a)
}

impl<'a, T> Foo<'a, T> {
    pub fn new(parent: &Array<T>) -> Foo<T> {
        return Foo {
            parent: parent
        };
    }
}

trait Array<T> {
    fn as_foo(&self) -> Foo<T> where Self: Sized {
        return Foo::new(self);
    }
}

fn main() {
}
which compiles without error. (NB. I had to remove the unnecessary <T> in pub fn new<T> because that was causing inference failures.)
(I have some in-progress blog posts that go into trait objects, object safety and the Turon trick, they will appear on /r/rust in the near future: first one.)
I posted a similar question (Rust lifetime error expected concrete lifetime but found bound lifetime) last night, but I still can't figure out how to apply it to this case. Once again, a simplified example below:
struct Ref;

struct Container<'a> {
    r: &'a Ref
}

struct ContainerB<'a> {
    c: Container<'a>
}

trait ToC {
    fn from_c<'a>(r: &'a Ref, c: Container<'a>) -> Self;
}

impl<'b> ToC for ContainerB<'b> {
    fn from_c<'a>(r: &'a Ref, c: Container<'a>) -> ContainerB<'a> {
        ContainerB { c: c }
    }
}
With the error message:
test.rs:16:3: 18:4 error: method `from_c` has an incompatible type for trait: expected concrete lifetime, but found bound lifetime parameter 'a
test.rs:16 fn from_c<'a>(r : &'a Ref, c : Container<'a>) -> ContainerB<'a> {
test.rs:17 ContainerB{c:c}
test.rs:18 }
test.rs:16:67: 18:4 note: expected concrete lifetime is the lifetime 'b as defined on the block at 16:66
test.rs:16 fn from_c<'a>(r : &'a Ref, c : Container<'a>) -> ContainerB<'a> {
test.rs:17 ContainerB{c:c}
test.rs:18 }
error: aborting due to previous error
What I think needs to happen is that I need some way to equate / sub-type lifetime 'a and lifetime 'b. Unlike the previous example, there is no &self to use. I am guessing I can do this by adding a lifetime parameter to my trait (trait ToC<'a> ...), but I would prefer not to, as it adds an extra <'a> everywhere I want to use the trait as a type bound.
If anybody is curious (you can ignore this), where this might actually come up: I am using it in a library to convert between Rust and Python types. The trait is here. Everything works fine, but I am trying to implement a wrapper around the PyObject type (such as a numpy ndarray) and be able to convert it to and from a PyObject with this.
Thanks again!
This boils down to much the same problem as in your previous question.
Self refers to the type you are implementing the trait for. In this case it is ContainerB<'b>, and so the whole point about its not being the same type applies; this time there is also nothing to tie 'b and 'a together: the lifetimes are, and must be assumed by the compiler to be, potentially disjoint. (This is distinct from the &'a ContainerB<'b> case, which guaranteed 'b ≥ 'a.)
Once you are using a lifetime defined on the method, tying that in with a lifetime on Self is not possible. The solution that is probably best is to shift the lifetime parameter from the method onto the trait:
trait ToC<'a> {
    fn from_c(r: &'a Ref, c: Container<'a>) -> Self;
}

impl<'a> ToC<'a> for ContainerB<'a> {
    fn from_c(r: &'a Ref, c: Container<'a>) -> ContainerB<'a> {
        ContainerB { c: c }
    }
}
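A small usage sketch (mine, not part of the original answer), showing that the lifetime on the trait now ties the borrowed Ref and Container to the produced value:
fn convert<'a, C: ToC<'a>>(r: &'a Ref, c: Container<'a>) -> C {
    C::from_c(r, c)
}

fn main() {
    let r = Ref;
    let _b: ContainerB = convert(&r, Container { r: &r });
}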