This snippet does not compile because of a compiler bug:
struct Theory<'a, T: 'a> {
left: &'a T,
}
pub struct Contain<'a, T: 'a, U>
where
&'a T: IntoIterator,
for<'x> <(&'a T) as IntoIterator>::Item: PartialEq<&'x U>,
{
theory: Theory<'a, T>,
right: U,
}
impl<'a, T: 'a, U> Drop for Contain<'a, T, U>
where
&'a T: IntoIterator,
for<'x> <(&'a T) as IntoIterator>::Item: PartialEq<&'x U>,
{
fn drop(&mut self) {
//TODO
}
}
fn main() {}
I need this because I must compare the iterator's Items with U; Item is a reference type because I call into_iter() on a borrowed collection.
I then tried something like this as a workaround:
struct Theory<'a, T: 'a> {
left: &'a T,
}
pub struct Contain<'a, 'b: 'a, T: 'a, U: 'b>
where
&'a T: IntoIterator,
<(&'a T) as IntoIterator>::Item: PartialEq<&'b U>,
{
theory: Theory<'a, T>,
right: U,
_marker: ::std::marker::PhantomData<&'b ()>,
}
impl<'a, 'b, T: 'a, U> Drop for Contain<'a, 'b, T, U>
where
&'a T: IntoIterator,
<(&'a T) as IntoIterator>::Item: PartialEq<&'b U>,
{
fn drop(&mut self) {
for left in self.theory.left.into_iter() {
if left == &self.right {
return;
}
}
//handle case where all lefts are different of right
}
}
fn main() {}
But I got a:
cannot infer an appropriate lifetime for borrow expression due to conflicting requirements
--> src/main.rs:22:24
|
22 | if left == &self.right {
| ^^^^^^^^^^^
|
How can I iterate over left, then compare each element with right?
You can simply require the trait bound PartialEq<U>. The methods eq and ne from the trait take all their arguments by reference, so there is no reason to require PartialEq for a reference to the type.
So this works:
impl<'a, 'b, T: 'a, U> Drop for Contain<'a, 'b, T, U>
where
&'a T: IntoIterator,
<(&'a T) as IntoIterator>::Item: PartialEq<U>, // <-- change 1
{
fn drop(&mut self) {
for left in self.theory.left.into_iter() {
if left == self.right { // <-- change 2
return;
}
}
//handle case where all lefts are different of right
}
}
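For completeness, here is a minimal, self-contained sketch of the whole program with that change applied. The struct's where clause is updated to match the impl, and the concrete types (a borrowed Vec<i32> on the left, a borrowed i32 on the right) are chosen purely for the demo, since &i32 implements PartialEq<&i32>:
struct Theory<'a, T: 'a> {
    left: &'a T,
}

pub struct Contain<'a, T: 'a, U>
where
    &'a T: IntoIterator,
    <&'a T as IntoIterator>::Item: PartialEq<U>,
{
    theory: Theory<'a, T>,
    right: U,
}

impl<'a, T: 'a, U> Drop for Contain<'a, T, U>
where
    &'a T: IntoIterator,
    <&'a T as IntoIterator>::Item: PartialEq<U>,
{
    fn drop(&mut self) {
        for left in self.theory.left.into_iter() {
            if left == self.right {
                return;
            }
        }
        // handle case where all lefts are different from right
    }
}

fn main() {
    let data = vec![1, 2, 3];
    let needle = 2;
    // Dropping this value runs the comparison loop above.
    let _c = Contain {
        theory: Theory { left: &data },
        right: &needle,
    };
}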
Related
Given a struct and trait:
// Minimal version of the actual data structure and trait
trait MyTrait {
fn blub(&mut self);
}
struct MyStruct;
impl MyTrait for MyStruct {
fn blub(&mut self) {
println!("Blub!");
}
}
I would like to create a struct that can hold an object that implements MyTrait:
struct Foo<T> {
t: T,
}
impl<T> Foo<T>
where
T: MyTrait,
{
fn new(t: T) -> Self {
Self { t }
}
fn run(&mut self) {
// Execute `blub` of `t`.
// Something like:
self.t.blub();
}
}
So far, that's easy. Now comes the crux: I want to accept both owned and mutably referenced types, like this:
fn main() {
let t0 = MyStruct;
let mut f0 = Foo::new(t0);
f0.run();
let mut t1 = MyStruct;
let mut f1 = Foo::new(&mut t1);
f1.run();
}
The code here of course doesn't work, because &mut MyStruct does not implement MyTrait.
In theory, this should be possible though, because MyTrait::blub takes &mut self, which is compatible with both owned and mutably borrowed types.
This is how far I've come. It works, but has two problems:
It has a pointless second generic
It requires PhantomData
use std::{borrow::BorrowMut, marker::PhantomData};
// Minimal version of the actual data structure and trait
trait MyTrait {
fn blub(&mut self);
}
struct MyStruct;
impl MyTrait for MyStruct {
fn blub(&mut self) {
println!("Blub!");
}
}
// Object that shall carry objects OR mutable references of type `MyTrait`
struct Foo<T, U> {
t: T,
_p: PhantomData<U>,
}
impl<T, U> Foo<T, U>
where
T: BorrowMut<U>,
U: MyTrait,
{
fn new(t: T) -> Self {
Self { t, _p: PhantomData }
}
fn run(&mut self) {
self.t.borrow_mut().blub();
}
}
fn main() {
let t0 = MyStruct;
let mut f0 = Foo::new(t0);
f0.run();
let mut t1 = MyStruct;
let mut f1: Foo<_, MyStruct> = Foo::new(&mut t1);
f1.run();
}
Blub!
Blub!
Is there a way to implement this more elegantly?
The only other elegant-ish way I have seen so far is to impl MyTrait for &mut MyStruct. Sadly, I do not own the trait or the type, so I cannot do that. Please do tell me if my attempts here are misguided and this whole thing is an XY problem, and whether what I should actually do is report this to said library so they can add that impl.
The simplest way is probably to add another implementation of MyTrait for &mut MyStruct.
impl MyTrait for &mut MyStruct {
fn blub(&mut self) {
println!("Blub!");
}
}
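If you do control the trait, a blanket implementation that forwards to the underlying type covers every implementor at once. This is a sketch of that idea, not part of the original answer, and it would replace the concrete impl above rather than coexist with it (the two would overlap):
// Forward the call through the mutable reference to the underlying impl,
// so any &mut T works wherever T: MyTrait is required.
impl<T: MyTrait> MyTrait for &mut T {
    fn blub(&mut self) {
        (**self).blub();
    }
}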
If you don't have access to the struct or trait, you can use an enum to manage the Owned and Borrowed versions and implement Deref/DerefMut to keep the usage of t the same.
use std::ops::{Deref, DerefMut};
enum Container<'a, T> {
Owned(T),
Borrowed(&'a mut T)
}
impl<'a, T: MyTrait> From<T> for Container<'a, T> {
fn from(t: T) -> Self {
Self::Owned(t)
}
}
impl<'a, T: MyTrait> From<&'a mut T> for Container<'a, T> {
fn from(t: &'a mut T) -> Self {
Self::Borrowed(t)
}
}
impl<'a, T> Deref for Container<'a, T> {
type Target = T;
fn deref(&self) -> &Self::Target {
match self {
Self::Owned(o) => o,
Self::Borrowed(o) => o
}
}
}
impl<'a, T> DerefMut for Container<'a, T> {
fn deref_mut(&mut self) -> &mut Self::Target {
match self {
Self::Owned(o) => o,
Self::Borrowed(o) => o
}
}
}
struct Foo<'a, T> {
t: Container<'a, T>
}
impl<'a, T> Foo<'a, T>
where
T: MyTrait,
{
fn new(t: impl Into<Container<'a, T>>) -> Self {
Self{ t: t.into() }
}
fn run(&mut self) {
self.t.blub();
}
}
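Assuming the MyTrait/MyStruct definitions from the question plus use std::ops::{Deref, DerefMut};, the original main should then work essentially unchanged. Type annotations are added here only to make the inferred T explicit:
fn main() {
    let t0 = MyStruct;
    let mut f0: Foo<'_, MyStruct> = Foo::new(t0);      // owned
    f0.run();

    let mut t1 = MyStruct;
    let mut f1: Foo<'_, MyStruct> = Foo::new(&mut t1); // mutably borrowed
    f1.run();
}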
Other solutions I found:
This one doesn't require matching on an enum at runtime; it resolves the types at compile time:
use std::marker::PhantomData;
// Minimal version of the actual data structure and trait
trait MyTrait {
fn blub(&mut self);
}
struct MyStruct;
impl MyTrait for MyStruct {
fn blub(&mut self) {
println!("Blub!");
}
}
trait MyTraitFrom<'a, T> {
fn mytrait_from(value: &'a mut T) -> Self;
}
impl<'a, T: MyTrait> MyTraitFrom<'a, T> for &'a mut T {
fn mytrait_from(value: &'a mut T) -> Self {
value
}
}
impl<'a, T: MyTrait> MyTraitFrom<'a, &'a mut T> for &'a mut T {
fn mytrait_from(value: &'a mut &'a mut T) -> Self {
let value: &'a mut T = value;
value
}
}
struct Foo<T, U> {
t: T,
_p: PhantomData<U>,
}
impl<'a, T: 'a, U: 'a> Foo<T, U>
where
&'a mut U: MyTraitFrom<'a, T>,
U: MyTrait,
{
fn new(t: T) -> Self {
Self { t, _p: PhantomData }
}
fn run(&'a mut self) {
let u: &'a mut U = MyTraitFrom::mytrait_from(&mut self.t);
u.blub();
}
}
fn main() {
let t0 = MyStruct;
let mut f0 = Foo::new(t0);
f0.run();
let mut t1 = MyStruct;
let mut f1 = Foo::new(&mut t1);
f1.run();
}
Blub!
Blub!
I've reduced my problem to the following code:
struct Struct<'a, 'b, T> {
a: &'a T,
b: &'b T,
}
trait Trait<'a, 'b, T> {
fn a(&self) -> &'a T;
fn b(&self) -> &'b T;
}
impl<'a, 'b, T> Trait<'a, 'b, T> for Struct<'a, 'b, T> {
fn a(&self) -> &'a T {
self.a
}
fn b(&self) -> &'b T {
self.b
}
}
struct Confused<T> {
field: T,
}
impl<T> Confused<T> {
fn foo<'a, 'b>(&'a self, param: &Struct<'a, 'b, T>) -> &'a T {
param.b();
param.a()
}
fn bar<'a, 'b, U: Trait<'a, 'b, T>>(&'a self, param: &U) -> &'a T {
param.b();
param.a()
}
}
The function foo is okay, but when I replace the concrete type Struct<'a, 'b, T> with a generic type U: Trait<'a, 'b, T>, I get the following error:
error[E0309]: the parameter type `T` may not live long enough
--> src/lib.rs:31:15
|
24 | impl<T> Confused<T> {
| - help: consider adding an explicit lifetime bound `T: 'b`...
...
31 | param.b();
| ^
|
note: ...so that the reference type `&'b T` does not outlive the data it points at
--> src/lib.rs:31:15
|
31 | param.b();
| ^
The suggestion to add the bound T: 'b doesn't make sense to me, since 'b is a parameter to bar(). How can I fix bar() to accept any implementation of Trait<'a, 'b, T> as a parameter?
When you write a generic type such as:
struct Foo<'a, T> {
a: &'a T,
}
Rust automatically adds an implicit bound T: 'a, because a reference to T cannot outlive the data it points to. This is automatic because your type would not work without it.
But when you do something like:
impl<T> Foo {
fn bar<'a, 'b>() -> &'a T {/*...*/}
}
there is an automatic T: 'a but not a T: 'b because there is no &'b T anywhere.
The solution is to add those constraints by yourself. In your code it would be something like this:
impl<T> Confused<T> {
fn bar<'a, 'b, U: Trait<'a, 'b, T>>(&'a self, param: &U) -> &'a T
where
T: 'b, //<--- here!
{
param.b();
param.a()
}
}
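A hypothetical call site (not in the original question) that exercises the fixed bar, just to show the bound is satisfiable in practice:
fn main() {
    let x = 1;
    let y = 2;
    let s = Struct { a: &x, b: &y };
    let c = Confused { field: 0 };
    // U = Struct<'_, '_, i32>, and the `T: 'b` bound is met because i32: 'static.
    println!("{}", c.bar(&s));
}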
I have a function that takes a collection of &T (represented by IntoIterator) with the requirement that every element is unique.
fn foo<'a, 'b, T: std::fmt::Debug, I>(elements: &'b I)
where
&'b I: IntoIterator<Item = &'a T>,
T: 'a,
'b: 'a,
I would like to also write a wrapper function which can work even if the elements are not unique, by using a HashSet to remove the duplicate elements first.
I tried the following implementation:
use std::collections::HashSet;
fn wrap<'a, 'b, T: std::fmt::Debug + Eq + std::hash::Hash, J>(elements: &'b J)
where
&'b J: IntoIterator<Item = &'a T>,
T: 'a,
'b: 'a,
{
let hashset: HashSet<&T> = elements.into_iter().into_iter().collect();
foo(&hashset);
}
However, the compiler doesn't seem happy with my assumption that HashSet<&T> implements IntoIterator<Item = &'a T>:
error[E0308]: mismatched types
--> src/lib.rs:10:9
|
10 | foo(&hashset);
| ^^^^^^^^ expected type parameter, found struct `std::collections::HashSet`
|
= note: expected type `&J`
found type `&std::collections::HashSet<&T>`
= help: type parameters must be constrained to match other types
= note: for more information, visit https://doc.rust-lang.org/book/ch10-02-traits.html#traits-as-parameters
I know I could use a HashSet<T> by cloning all the input elements, but I want to avoid unnecessary copying and memory use.
If you have a &HashSet<&T> and need an iterator of &T (not &&T) that you can process multiple times, then you can use Iterator::copied to convert the iterator's &&T to a &T:
use std::{collections::HashSet, fmt::Debug, hash::Hash, marker::PhantomData};
struct Collection<T> {
item: PhantomData<T>,
}
impl<T> Collection<T>
where
T: Debug,
{
fn foo<'a, I>(elements: I) -> Self
where
I: IntoIterator<Item = &'a T> + Clone,
T: 'a,
{
for element in elements.clone() {
println!("{:?}", element);
}
for element in elements {
println!("{:?}", element);
}
Self { item: PhantomData }
}
}
impl<T> Collection<T>
where
T: Debug + Eq + Hash,
{
fn wrap<'a, I>(elements: I) -> Self
where
I: IntoIterator<Item = &'a T>,
T: 'a,
{
let set: HashSet<_> = elements.into_iter().collect();
Self::foo(set.iter().copied())
}
}
#[derive(Debug, Hash, PartialEq, Eq)]
struct Foo(i32);
fn main() {
let v = vec![Foo(1), Foo(2), Foo(4)];
Collection::<Foo>::wrap(&v);
}
See also:
Using the same iterator multiple times in Rust
Does cloning an iterator copy the entire underlying vector?
Why does cloning my custom type result in &T instead of T?
Note that the rest of this answer made the assumption that a struct named Collection<T> was a collection of values of type T. OP has clarified that this is not true.
That's not your actual problem, as shown by your later examples. It can be boiled down to this:
struct Collection<T>(T);
impl<T> Collection<T> {
fn new(value: &T) -> Self {
Collection(value)
}
}
You are taking a reference to a type (&T) and trying to store it where a T is required; these are different types and will generate an error. You are using PhantomData for some reason and accepting references via the iterator, but the problem is the same.
In fact, PhantomData makes the problem harder to see, as you can just make up type parameters that don't correspond to any value. For example, we never have any kind of string here, but we "successfully" created the struct:
use std::marker::PhantomData;
struct Collection<T>(PhantomData<T>);
impl Collection<String> {
fn new<T>(value: &T) -> Self {
Collection(PhantomData)
}
}
Ultimately, your wrap function doesn't make sense, either:
impl<T: Eq + Hash> Collection<T> {
fn wrap<I>(elements: I) -> Self
where
I: IntoIterator<Item = T>,
This is equivalent to
impl<T: Eq + Hash> Collection<T> {
fn wrap<I>(elements: I) -> Collection<T>
where
I: IntoIterator<Item = T>,
This says that, given an iterator of elements T, you will return a collection of those elements. However, you put them in a HashSet and iterate over a reference to it, which yields &T. Thus this function signature cannot be right.
It seems most likely that you want to accept an iterator of owned values instead:
use std::{collections::HashSet, fmt::Debug, hash::Hash};
struct Collection<T> {
item: T,
}
impl<T> Collection<T> {
fn foo<I>(elements: I) -> Self
where
I: IntoIterator<Item = T>,
for<'a> &'a I: IntoIterator<Item = &'a T>,
T: Debug,
{
for element in &elements {
println!("{:?}", element);
}
for element in &elements {
println!("{:?}", element);
}
Self {
item: elements.into_iter().next().unwrap(),
}
}
}
impl<T> Collection<T>
where
T: Eq + Hash,
{
fn wrap<I>(elements: I) -> Self
where
I: IntoIterator<Item = T>,
T: Debug,
{
let s: HashSet<_> = elements.into_iter().collect();
Self::foo(s)
}
}
#[derive(Debug, Hash, PartialEq, Eq)]
struct Foo(i32);
fn main() {
let v = vec![Foo(1), Foo(2), Foo(4)];
let c = Collection::wrap(v);
println!("{:?}", c.item)
}
Here we place a trait bound on the generic iterator type directly and a second higher-ranked trait bound on a reference to the iterator. This allows us to use a reference to the iterator as an iterator itself.
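The same pair of bounds, distilled into a small free function (a hypothetical helper, not from the answer), may make the pattern easier to see in isolation:
use std::fmt::Debug;

fn print_twice<T: Debug, I>(elements: I)
where
    I: IntoIterator<Item = T>,
    for<'a> &'a I: IntoIterator<Item = &'a T>,
{
    // First pass: iterate by reference, leaving `elements` intact.
    for element in &elements {
        println!("{:?}", element);
    }
    // Second pass: consume `elements`.
    for element in elements {
        println!("{:?}", element);
    }
}

fn main() {
    print_twice(vec![1, 2, 3]);
}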
See also:
How does one generically duplicate a value in Rust?
Is there any way to return a reference to a variable created in a function?
How do I write the lifetimes for references in a type constraint when one of them is a local reference?
There were a number of orthogonal issues with my code that Shepmaster pointed out, but for the specific issue of using a HashSet<&T> as an IntoIterator<Item = &T>, I found that one way to solve it is with a wrapper struct:
struct Helper<T, D: Deref<Target = T>>(HashSet<D>);
struct HelperIter<'a, T, D: Deref<Target = T>>(std::collections::hash_set::Iter<'a, D>);
impl<'a, T, D: Deref<Target = T>> Iterator for HelperIter<'a, T, D>
where
T: 'a,
{
type Item = &'a T;
fn next(&mut self) -> Option<Self::Item> {
self.0.next().map(|x| x.deref())
}
}
impl<'a, T, D: Deref<Target = T>> IntoIterator for &'a Helper<T, D> {
type Item = &'a T;
type IntoIter = HelperIter<'a, T, D>;
fn into_iter(self) -> Self::IntoIter {
HelperIter((&self.0).into_iter())
}
}
Which is used as follows:
struct Collection<T> {
item: PhantomData<T>,
}
impl<T: Debug> Collection<T> {
fn foo<I>(elements: I) -> Self
where
I: IntoIterator + Copy,
I::Item: Deref<Target = T>,
{
for element in elements {
println!("{:?}", *element);
}
for element in elements {
println!("{:?}", *element);
}
return Self { item: PhantomData };
}
}
impl<T: Debug + Eq + Hash> Collection<T> {
fn wrap<I>(elements: I) -> Self
where
I: IntoIterator + Copy,
I::Item: Deref<Target = T> + Eq + Hash,
{
let helper = Helper(elements.into_iter().collect());
Self::foo(&helper);
return Self { item: PhantomData };
}
}
fn main() {
let v = vec![Foo(1), Foo(2), Foo(4)];
Collection::<Foo>::wrap(&v);
}
I'm guessing that some of this may be more complicated than it needs to be, but I'm not sure how.
I'm trying to wrap a HashMap, as defined below, so that indexing returns a mutable reference into the map:
use std::{collections::HashMap, marker::PhantomData};
struct Id<T>(usize, PhantomData<T>);
pub struct IdCollection<T>(HashMap<Id<T>, T>);
impl<'a, T> std::ops::Index<Id<T>> for &'a mut IdCollection<T> {
type Output = &'a mut T;
fn index(&mut self, id: &'a Id<T>) -> Self::Output {
self.0.get_mut(id).unwrap()
}
}
And the resulting error:
note: first, the lifetime cannot outlive the anonymous lifetime #1 defined on the method body at 54:5...
--> src/id_container.rs:54:5
|
54 | / fn index(&mut self, id: &'a Id<T>) -> Self::Output {
55 | | self.0.get_mut(id).unwrap()
56 | | }
| |_____^
note: ...so that reference does not outlive borrowed content
--> src/id_container.rs:55:9
|
55 | self.0.get_mut(id).unwrap()
| ^^^^^^
note: but, the lifetime must be valid for the lifetime 'a as defined on the impl at 52:6...
--> src/id_container.rs:52:6
|
52 | impl<'a, T> std::ops::Index<Id<T>> for &'a mut IdCollection<T> {
| ^^
= note: ...so that the types are compatible:
expected std::ops::Index<id_container::Id<T>>
found std::ops::Index<id_container::Id<T>>
Why can't the compiler extend the lifetime of the borrow returned by get_mut? The IdCollection would then be borrowed mutably.
Note that I tried using a std::collections::HashSet<IdWrapper<T>> instead of a HashMap:
struct IdWrapper<T> {
id: Id<T>,
t: T,
}
implementing the proper Borrow impls etc. so I can use the Id<T> as a key.
However, HashSet doesn't offer a mutable getter (which makes sense, since you don't want to mutate what's used for your hash), but in my case only part of the object needs to be immutable. Casting away constness is UB, so that is out of the question.
Can I achieve what I want? Do I have to use some wrapper such as a Box? Although I'd rather avoid any indirection...
EDIT
OK, I'm an idiot. First, I used Index where I needed IndexMut, and I got the & wrong when specifying Self::Output in the signature.
Here's my full code below:
use std::collections::HashMap;
use std::hash::{Hash, Hasher};
use std::marker::PhantomData;
pub struct Id<T>(usize, PhantomData<T>);
impl<T> std::fmt::Display for Id<T> {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(f, "{}", self.0)
}
}
impl<T> Hash for Id<T> {
fn hash<H: Hasher>(&self, state: &mut H) {
self.0.hash(state);
}
}
impl<T> PartialEq for Id<T> {
fn eq(&self, o: &Self) -> bool {
self.0 == o.0
}
}
impl<T> Eq for Id<T> {}
pub struct IdCollection<T>(HashMap<Id<T>, T>);
impl<'a, T> IntoIterator for &'a IdCollection<T> {
type Item = (&'a Id<T>, &'a T);
type IntoIter = std::collections::hash_map::Iter<'a, Id<T>, T>;
fn into_iter(self) -> Self::IntoIter {
self.0.iter()
}
}
impl<'a, T> IntoIterator for &'a mut IdCollection<T> {
type Item = (&'a Id<T>, &'a mut T);
type IntoIter = std::collections::hash_map::IterMut<'a, Id<T>, T>;
fn into_iter(self) -> Self::IntoIter {
self.0.iter_mut()
}
}
impl<T> std::ops::Index<Id<T>> for IdCollection<T> {
type Output = T;
fn index(&self, id: Id<T>) -> &Self::Output {
self.0.get(&id).unwrap()
}
}
impl<T> std::ops::IndexMut<Id<T>> for IdCollection<T> {
fn index_mut(&mut self, id: Id<T>) -> &mut Self::Output {
self.0.get_mut(&id).unwrap()
}
}
impl<T> std::ops::Index<&Id<T>> for IdCollection<T> {
type Output = T;
fn index(&self, id: &Id<T>) -> &Self::Output {
self.0.get(id).unwrap()
}
}
impl<T> std::ops::IndexMut<&Id<T>> for IdCollection<T> {
fn index_mut(&mut self, id: &Id<T>) -> &mut Self::Output {
self.0.get_mut(id).unwrap()
}
}
If I understand correctly what you're trying to achieve, then I have to tell you that it is a bit more complex than you originally thought.
First of all, you have to realise that if you want to use a HashMap, then the key type is required to be hashable and comparable. Therefore the generic type parameter T in Id<T> has to be bound by those traits in order to make Id<T> hashable and comparable.
The second thing you need to understand is that there are two different traits that deal with the indexing operator: Index for immutable data access, and IndexMut for mutable access.
use std::{
marker::PhantomData,
collections::HashMap,
cmp::{
Eq,
PartialEq,
},
ops::{
Index,
IndexMut,
},
hash::Hash,
};
#[derive(PartialEq, Hash)]
struct Id<T>(usize, PhantomData<T>)
where T: PartialEq + Hash;
impl<T> Eq for Id<T>
where T: PartialEq + Hash
{}
struct IdCollection<T>(HashMap<Id<T>, T>)
where T: PartialEq + Hash;
impl<T> Index<Id<T>> for IdCollection<T>
where T: PartialEq + Hash
{
type Output = T;
fn index(&self, id: Id<T>) -> &Self::Output
{
self.0.get(&id).unwrap()
}
}
impl<T> IndexMut<Id<T>> for IdCollection<T>
where T: PartialEq + Hash
{
fn index_mut(&mut self, id: Id<T>) -> &mut Self::Output
{
self.0.get_mut(&id).unwrap()
}
}
fn main()
{
let mut i = IdCollection(HashMap::new());
i.0.insert(Id(12, PhantomData), 99i32);
println!("{:?}", i[Id(12, PhantomData)]);
i[Id(12, PhantomData)] = 54i32;
println!("{:?}", i[Id(12, PhantomData)]);
}
It may seem a bit surprising, but IndexMut is not designed to insert an element into the collection, only to modify an existing one. That's the main reason why HashMap does not implement IndexMut, and that's also the reason why the above example uses the HashMap::insert method to place the data initially. As you can see, later on, once the value is already there, we can modify it via IdCollection::index_mut.
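A small variation on the main above makes that point concrete. Because index and index_mut unwrap the result of get/get_mut, indexing an Id that has never been inserted panics, while modifying an existing entry through the index operator is fine:
fn main() {
    let mut i = IdCollection(HashMap::new());
    // i[Id(12, PhantomData)] += 1;   // would panic: no value for this Id yet
    i.0.insert(Id(12, PhantomData), 99i32);
    i[Id(12, PhantomData)] += 1;      // modifies the existing value in place
    println!("{:?}", i[Id(12, PhantomData)]);
}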
I have a binary trait Resolve.
pub trait Resolve<RHS = Self> {
type Output;
fn resolve(self, rhs: RHS) -> Self::Output;
}
I implemented the trait for something trivial where both arguments are taken by reference (self is &'a Foo and rhs is &'b Foo):
struct Foo;
impl <'a, 'b> Resolve<&'b Foo> for &'a Foo {
type Output = Foo;
fn resolve(self, rhs: &'b Foo) -> Self::Output {
unimplemented!()
}
}
If I now write
fn main() {
let a: &Foo = &Foo;
let b = Foo;
a.resolve(&b);
}
it will compile just fine, but if I try to implement it on my struct Signal, it will not work.
pub struct Signal<'a, T> {
ps: Vec<&'a T>,
}
impl<'a, T: Resolve<&'a T, Output = T> + 'a> Signal<'a, T> {
pub fn foo(&mut self) {
let a: &T = &self.ps[0];
let b = &self.ps[1];
a.resolve(b);
}
}
error[E0507]: cannot move out of borrowed content
--> src/main.rs:25:9
|
25 | a.resolve(b);
| ^ cannot move out of borrowed content
How do I get this example working?
The trait bound on foo only says that T implements Resolve, but you try to call .resolve() on a value of type &T.
To say, instead, that references to T must implement Resolve, you need a higher-ranked trait bound:
impl<'a, T> Signal<'a, T>
where
for<'b> &'b T: Resolve<&'a T, Output = T>,
{
pub fn foo(&mut self) { ... }
}
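For reference, a sketch of what the body can look like under that bound, assuming the Signal and Resolve definitions from the question; the element references are copied out of the Vec (shared references are Copy), so a is a short-lived &T while b keeps the full &'a T lifetime the bound asks for:
impl<'a, T> Signal<'a, T>
where
    for<'b> &'b T: Resolve<&'a T, Output = T>,
{
    pub fn foo(&mut self) {
        // Copy the stored references out of the Vec instead of re-borrowing them.
        let a: &T = self.ps[0];
        let b: &'a T = self.ps[1];
        let _result: T = a.resolve(b);
    }
}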
After thinking about this I came up with a simpler solution that does not rely on HRTB.
impl<'a, T> Signal<'a, T>
where
&'a T: Resolve<&'a T, Output = T> + 'a,
{
pub fn foo(&mut self) {
let a: &T = &self.ps[0];
let b = &self.ps[1];
a.resolve(b);
}
}
This does the same thing, namely it states that &T implements Resolve, but without the need for HRTB.
You have to use a where clause for this, but apart from that it is a nice and easy solution.