While following the rustbyexample.com tutorial, I typed the following code:
impl fmt::Display for Structure {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        let x = format!("{}", "something");
        write!(f, "OMG! {}", self.0);
    }
}
And got the following error from the compiler:
error[E0308]: mismatched types
--> src/main.rs:5:58
|
5 | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
| __________________________________________________________^
6 | | let x = format!("{}", "something");
7 | | write!(f, "OMG! {}", self.0);
8 | | }
| |_____^ expected enum `std::result::Result`, found ()
|
= note: expected type `std::result::Result<(), std::fmt::Error>`
found type `()`
help: consider removing this semicolon:
--> src/main.rs:7:37
|
7 | write!(f, "OMG! {}", self.0);
| ^
Why is the semicolon relevant (or not) here?
The return value of a Rust function is its final expression, i.e. the last expression not followed by a semicolon. With the semicolon, your method returns () instead of the fmt::Result it declares. Without that last semicolon, it returns the value of write!(f, "OMG! {}", self.0).
You can read more about that in The Rust Programming Language chapter about functions; look for the part starting with "What about returning a value?".
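For illustration, here is the snippet with that one semicolon removed, filled out so it compiles on its own (Structure is assumed to be the tuple struct from the Rust by Example chapter, and the incidental format! binding is kept only to mirror the question):
use std::fmt;

struct Structure(i32);

impl fmt::Display for Structure {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        let _x = format!("{}", "something");
        // No trailing semicolon: the fmt::Result produced by write!
        // becomes the return value of fmt.
        write!(f, "OMG! {}", self.0)
    }
}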
I'm trying to implement parts of the Minecraft protocol (https://wiki.vg/).
I've successfully implemented a decoder for packets, but I'm stuck on the encoding part. The Minecraft protocol uses "VarInts" (https://wiki.vg/Protocol#VarInt_and_VarLong), and I want to have methods that write data as varints.
So my goal is to have a trait named Encoder with these methods:
fn write_var_int(&mut self, value: i32) -> Result<(), error::EncodeError>;
fn write_var_long(&mut self, value: i64) -> Result<(), error::EncodeError>;
fn write_string(&mut self, value: &str) -> Result<(), error::EncodeError>;
At the moment, I've only written the code for the first method:
fn write_var_int(&mut self, mut value: i32) -> Result<(), error::EncodeError> {
    loop {
        let mut byte = (value & 0b01111111) as u8;
        if byte == 0 {
            self.write_u8(byte).unwrap();
            break;
        }
        self.write_u8(byte | 0b10000000).unwrap();
        value = value >> 7;
    }
    Ok(())
}
In main.rs I import the Encoder trait and try to use it on a cursor:
let test = [0; 17];
let mut wrt = Cursor::new(test);
wrt.write_var_int(packet.id);
But I get these compilation errors:
error[E0599]: the method `write_var_int` exists for struct `std::io::Cursor<[{integer}; 17]>`, but its trait bounds were not satisfied
--> src/main.rs:57:37
|
57 | ... wrt.write_var_int(packet.id);
| ^^^^^^^^^^^^^ method cannot be called on `std::io::Cursor<[{integer}; 17]>` due to unsatisfied trait bounds
|
::: /home/clement/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/io/cursor.rs:75:1
|
75 | pub struct Cursor<T> {
| --------------------
| |
| doesn't satisfy `std::io::Cursor<[{integer}; 17]>: Encoder`
| doesn't satisfy `std::io::Cursor<[{integer}; 17]>: std::io::Write`
|
note: the following trait bounds were not satisfied because of the requirements of the implementation of `Encoder` for `_`:
`std::io::Cursor<[{integer}; 17]>: std::io::Write`
--> src/protocol/encoder.rs:12:16
|
12 | impl<W: Write> Encoder for W {
| ^^^^^^^ ^
warning: unused import: `protocol::encoder::Encoder`
--> src/main.rs:11:5
|
11 | use protocol::encoder::Encoder;
| ^^^^^^^^^^^^^^^^^^^^^^^^^^
For more information about this error, try `rustc --explain E0599`.
warning: `eupim` (bin "eupim") generated 2 warnings
error: could not compile `eupim` due to previous error; 2 warnings emitted
I don't understand why the import of the Encoder module is marked as unused, and how to fix these errors.
I'd appreciate any hints on how to fix this.
Thanks!
I tried to turn the code in your question into a minimal reproducible example:
use std::io::{Cursor, Write};
use std::slice;

mod error {
    #[derive(Debug)]
    pub enum EncodeError { }
}

trait Encoder {
    fn write_u8(&mut self, value: u8) -> Result<(), error::EncodeError>;
    fn write_var_int(&mut self, value: i32) -> Result<(), error::EncodeError>;
}

impl<W: Write> Encoder for W {
    fn write_u8(&mut self, value: u8) -> Result<(), error::EncodeError> {
        self.write(slice::from_ref(&value)).unwrap();
        Ok(())
    }

    fn write_var_int(&mut self, mut value: i32) -> Result<(), error::EncodeError> {
        loop {
            let mut byte = (value & 0b01111111) as u8;
            if byte == 0 {
                self.write_u8(byte).unwrap();
                break;
            }
            self.write_u8(byte | 0b10000000).unwrap();
            value = value >> 7;
        }
        Ok(())
    }
}

fn main() {
    let test = [0; 17];
    let mut wrt = Cursor::new(test);
    wrt.write_var_int(3);
}
This produces the error:
Compiling playground v0.0.1 (/playground)
error[E0599]: the method `write_var_int` exists for struct `std::io::Cursor<[{integer}; 17]>`, but its trait bounds were not satisfied
--> src/main.rs:39:9
|
39 | wrt.write_var_int(3);
| ^^^^^^^^^^^^^ method cannot be called on `std::io::Cursor<[{integer}; 17]>` due to unsatisfied trait bounds
|
note: the following trait bounds were not satisfied because of the requirements of the implementation of `Encoder` for `_`:
`std::io::Cursor<[{integer}; 17]>: std::io::Write`
--> src/main.rs:14:16
|
14 | impl<W: Write> Encoder for W {
| ^^^^^^^ ^
For more information about this error, try `rustc --explain E0599`.
error: could not compile `playground` due to previous error
The problem here is that we've got a Cursor<[{integer}; 17]>, but Write is not implemented for a Cursor over an owned array; it is implemented for Cursor<&mut [u8]> (among a few other byte-buffer types), where &mut [u8] is a mutable reference to a slice of u8.
This makes sense if you think about it: Cursor wraps an in-memory buffer, but there's no need for it to take ownership of it.
So let's make sure we're passing a mutable slice to our Cursor:
fn main() {
    let mut test = [0u8; 17];
    let mut wrt = Cursor::new(&mut test[..]);
    wrt.write_var_int(3);
}
This compiles as expected.
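If you want to double-check what actually ended up in the buffer (this is just a sanity check, not part of the fix), the cursor's position tells you how many bytes were written; dropping this main into the example above works:
fn main() {
    let mut test = [0u8; 17];
    let mut wrt = Cursor::new(&mut test[..]);
    wrt.write_var_int(3).unwrap();
    // The cursor's position is the number of bytes written so far.
    let written = wrt.position() as usize;
    println!("wrote {} bytes: {:?}", written, &test[..written]);
}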
I have the following trait and generic implementation for Fn:
trait Provider<'a> {
    type Out;
    fn get(&'a self, state: &State) -> Self::Out;
}

impl<'a, F, T> Provider<'a> for F
where
    F: Fn(&State) -> T,
{
    type Out = T;
    fn get(&'a self, state: &State) -> T {
        self(state)
    }
}
Now, I have some code that wants a for<'a> Provider<'a, Out = usize>. However, even the simplest closure, |_| 1, does not qualify; instead it produces this error message, which I don't understand:
fn assert_usize_provider<P>(_: P)
where
    P: for<'a> Provider<'a, Out = usize>,
{
}

fn main() {
    assert_usize_provider(|_| 1);
}
error[E0308]: mismatched types
--> src/main.rs:27:5
|
27 | assert_usize_provider(|_| 1);
| ^^^^^^^^^^^^^^^^^^^^^ lifetime mismatch
|
= note: expected type `FnOnce<(&State,)>`
found type `FnOnce<(&State,)>`
note: this closure does not fulfill the lifetime requirements
--> src/main.rs:27:27
|
27 | assert_usize_provider(|_| 1);
| ^^^^^
note: the lifetime requirement is introduced here
--> src/main.rs:22:29
|
22 | P: for<'a> Provider<'a, Out = usize>,
| ^^^^^^^^^^^
Playground link
Can someone explain what that error message means and how to get this code working?
I don't know why inference does not work in this case, but you can add a type annotation to the closure argument to get the code working.
assert_usize_provider(|_ : &State| 1);
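For completeness, here is the question's example with just that annotation applied; it compiles as-is (State is filled in as a placeholder unit struct here, since its real definition only appears in the playground):
struct State;

trait Provider<'a> {
    type Out;
    fn get(&'a self, state: &State) -> Self::Out;
}

impl<'a, F, T> Provider<'a> for F
where
    F: Fn(&State) -> T,
{
    type Out = T;
    fn get(&'a self, state: &State) -> T {
        self(state)
    }
}

fn assert_usize_provider<P>(_: P)
where
    P: for<'a> Provider<'a, Out = usize>,
{
}

fn main() {
    // Annotating the argument type is what lets the closure satisfy
    // the higher-ranked `for<'a>` bound.
    assert_usize_provider(|_: &State| 1);
}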
Here is a minimal reproducible sample:
use std::fmt::{self, Debug, Formatter};
pub trait Apple {}
impl Debug for dyn Apple {
    fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
        f.write_str("Apple")
    }
}

struct Basket<'a> {
    apples: Vec<Box<dyn Apple + 'a>>,
}

impl<'a> Basket<'a> {
    fn f(&self) {
        let x = format!("{:?}", self.apples);
    }
}
error[E0495]: cannot infer an appropriate lifetime due to conflicting requirements
--> src/lib.rs:17:17
|
17 | let x = format!("{:?}", self.apples);
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
note: first, the lifetime cannot outlive the lifetime `'a` as defined on the impl at 15:6...
--> src/lib.rs:15:6
|
15 | impl<'a> Basket<'a> {
| ^^
note: ...so that the types are compatible
--> src/lib.rs:17:17
|
17 | let x = format!("{:?}", self.apples);
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
= note: expected `(&std::vec::Vec<std::boxed::Box<dyn Apple>>,)`
found `(&std::vec::Vec<std::boxed::Box<(dyn Apple + 'a)>>,)`
= note: but, the lifetime must be valid for the static lifetime...
note: ...so that the types are compatible
--> src/lib.rs:17:33
|
17 | let x = format!("{:?}", self.apples);
| ^^^^^^^^^^^
= note: expected `std::fmt::Debug`
found `std::fmt::Debug`
= note: this error originates in a macro (in Nightly builds, run with -Z macro-backtrace for more info)
format! should only need to borrow self.apples for a very short time (just for the duration of the formatting call). However, it apparently requires self.apples to have the 'static lifetime.
EDIT: Now I see why that is, after kind help from the comments and the other answers. Following #Shepmaster's suggestion, I've moved this part of the content into an answer.
Because of default trait object lifetimes, impl Debug for dyn Apple is shorthand for impl Debug for dyn Apple + 'static.
To write an implementation for non-'static trait objects, you can override this default lifetime with an explicit one:
impl<'a> Debug for dyn Apple + 'a
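In the question's example, that looks like this (same body as before; only the lifetime on the trait object changes):
impl<'a> Debug for dyn Apple + 'a {
    fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
        f.write_str("Apple")
    }
}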
I see the reason after #chuigda_whitegive's "non-reproducible" sample:
This is wrong:
pub trait Apple {}

impl Debug for dyn Apple {
    fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
        f.write_str("Apple")
    }
}
but one should instead do this:
pub trait Apple: Debug {}
I still do not understand why I should not impl Debug for dyn Apple. It compiles in other cases, and it sounds reasonable ("when debugging, we should output the Apple string").
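For reference, a minimal sketch of that supertrait alternative in context; the Fuji type is just a made-up implementor for illustration:
use std::fmt::Debug;

// Every implementor of Apple must also implement Debug,
// so `dyn Apple` can be formatted through its supertrait.
pub trait Apple: Debug {}

#[derive(Debug)]
struct Fuji;

impl Apple for Fuji {}

struct Basket<'a> {
    apples: Vec<Box<dyn Apple + 'a>>,
}

impl<'a> Basket<'a> {
    fn f(&self) {
        let x = format!("{:?}", self.apples);
        println!("{}", x);
    }
}

fn main() {
    let basket = Basket { apples: vec![Box::new(Fuji)] };
    basket.f(); // prints [Fuji]
}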
EDIT
Now I see how to fix the original question with a two-character modification: one more anonymous lifetime ('_) and this compiles:
use std::fmt::{Debug, Formatter};
pub trait Apple {}
impl Debug for dyn Apple + '_ {
    fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
        f.write_str("Apple")
    }
}

struct Basket<'a> {
    apples: Vec<Box<dyn Apple + 'a>>,
}

impl<'a> Basket<'a> {
    fn f(&self) {
        let x = format!("{:?}", self.apples);
    }
}
For why this is the case, please see #Matt Brubeck's answer.
I would like to convert my Option<&String> to Option<&str> with as_deref() but the compiler seems to misunderstand my intent. If I replace it with .map(|e| e.as_str()) it works.
use std::collections::HashMap;
trait SipField {
    fn from(&self) -> Option<&str>;
}

impl SipField for HashMap<String, Vec<String>> {
    fn from(&self) -> Option<&str> {
        self.get("from")
            .or_else(|| self.get("f"))
            .and_then(|i| i.first())
            .as_deref()
    }
}
error[E0308]: mismatched types
--> src/lib.rs:9:9
|
9 | / self.get("from")
10 | | .or_else(|| self.get("f"))
11 | | .and_then(|i| i.first())
12 | | .as_deref()
| |_______________________^ expected `str`, found struct `std::string::String`
|
= note: expected enum `std::option::Option<&str>`
found enum `std::option::Option<&std::string::String>`
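For reference, here is the variant the question mentions as working, with the explicit map in place of as_deref:
use std::collections::HashMap;

trait SipField {
    fn from(&self) -> Option<&str>;
}

impl SipField for HashMap<String, Vec<String>> {
    fn from(&self) -> Option<&str> {
        self.get("from")
            .or_else(|| self.get("f"))
            .and_then(|i| i.first())
            // Converting the &String to &str explicitly compiles fine.
            .map(|e| e.as_str())
    }
}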
For this code (already trimmed down some, sorry I couldn't make it smaller), I get a lifetime problem:
fn main() {
    println!("Hello, world!");
}

#[derive(Debug)]
pub struct Token<'a> {
    pub line: usize,
    // Col in code points.
    pub col: usize,
    // Index in bytes.
    pub index: usize,
    pub state: TokenState,
    pub text: &'a str,
}

#[derive(Clone, Copy, Debug, PartialEq)]
pub enum TokenState {
    VSpace,
}

#[derive(Clone, Copy, Debug, PartialEq)]
pub enum ParseState {
    Expr,
}

pub struct Node<'a> {
    kids: Vec<Node<'a>>,
    state: ParseState,
    token: Option<&'a Token<'a>>,
}

impl<'a> Node<'a> {
    fn new(state: ParseState) -> Node<'a> {
        Node {
            kids: vec![],
            state,
            token: None,
        }
    }

    fn new_token(token: &'a Token<'a>) -> Node<'a> {
        // TODO Control state? Some token state?
        Node {
            kids: vec![],
            state: ParseState::Expr,
            token: Some(&token),
        }
    }

    fn push_if(&mut self, node: Node<'a>) {
        if !node.kids.is_empty() {
            self.kids.push(node);
        }
    }
}

pub fn parse<'a>(tokens: &'a Vec<Token<'a>>) -> Node<'a> {
    let mut root = Node::new(ParseState::Expr);
    let mut parser = Parser {
        index: 0,
        tokens: tokens,
    };
    parser.parse_block(&mut root);
    root
}

struct Parser<'a> {
    index: usize,
    tokens: &'a Vec<Token<'a>>,
}

impl<'a> Parser<'a> {
    fn parse_block(&mut self, parent: &mut Node) {
        loop {
            let mut row = Node::new(ParseState::Expr);
            match self.peek() {
                Some(_) => {
                    self.parse_row(&mut row);
                }
                None => {
                    break;
                }
            }
            parent.push_if(row);
        }
    }

    fn parse_row(&mut self, parent: &mut Node) {
        loop {
            match self.next() {
                Some(ref token) => match token.state {
                    TokenState::VSpace => break,
                    _ => {
                        parent.kids.push(Node::new_token(&token));
                    }
                },
                None => break,
            }
        }
    }

    fn next(&mut self) -> Option<&Token> {
        let index = self.index;
        if index < self.tokens.len() {
            self.index += 1;
        }
        self.tokens.get(index)
    }

    fn peek(&mut self) -> Option<&Token> {
        self.tokens.get(self.index)
    }
}
(playground)
This is the error message:
error[E0495]: cannot infer an appropriate lifetime for autoref due to conflicting requirements
--> src/main.rs:90:24
|
90 | match self.next() {
| ^^^^
|
note: first, the lifetime cannot outlive the lifetime 'a as defined on the impl at 72:1...
--> src/main.rs:72:1
|
72 | / impl<'a> Parser<'a> {
73 | | fn parse_block(&mut self, parent: &mut Node) {
74 | | loop {
75 | | let mut row = Node::new(ParseState::Expr);
... |
112| | }
113| | }
| |_^
note: ...so that the type `Parser<'a>` is not borrowed for too long
--> src/main.rs:90:19
|
90 | match self.next() {
| ^^^^
note: but, the lifetime must be valid for the anonymous lifetime #3 defined on the method body at 88:5...
--> src/main.rs:88:5
|
88 | / fn parse_row(&mut self, parent: &mut Node) {
89 | | loop {
90 | | match self.next() {
91 | | Some(ref token) => match token.state {
... |
99 | | }
100| | }
| |_____^
note: ...so that expression is assignable (expected Node<'_>, found Node<'_>)
--> src/main.rs:94:42
|
94 | parent.kids.push(Node::new_token(&token));
| ^^^^^^^^^^^^^^^^^^^^^^^
All the references should be tied to the same outside lifetime. In my full code (of which I just have an excerpt here), I expect to hang onto the original parsed source, and I'm trying to tie everything to that.
I know the error messages are trying to be helpful, but I'm really unsure what the conflict is, and I'm unsure which of the other lifetime questions here relate to the same issue.
Let's take a look at the signature of Parser::next:
fn next(&mut self) -> Option<&Token>
This function promises to return an Option<&Token>. There are elided lifetimes here; let's rewrite the signature to make them explicit:
fn next<'b>(&'b mut self) -> Option<&'b Token<'b>>
We can now see that next is generic over the lifetime 'b. Notice how the return type uses 'b, not 'a. This is valid in itself, because the compiler can infer that 'b is shorter than 'a, and mutable references (&'a mut T) are covariant over 'a ("covariant" in this context means that we can substitute the lifetime 'a with a shorter lifetime). But what the function ends up promising is only that the result lives as long as the borrow of self, while it could in fact promise that the result lives at least as long as 'a.
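As a tiny, standalone illustration of that covariance (not taken from the original code), a reference with a longer lifetime can be used where a shorter one is expected:
// 'long outlives 'short, so a &'long mut String coerces to a
// &'short mut String: mutable references are covariant in their lifetime.
pub fn shorten<'short, 'long: 'short>(r: &'long mut String) -> &'short mut String {
    r
}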
In Parser::parse_row, you're trying to take the result of Parser::next and insert it into parent. Let's look at Parser::parse_row's signature:
fn parse_row(&mut self, parent: &mut Node)
We have some omitted lifetimes here again. Let's spell them out:
fn parse_row<'b, 'c, 'd>(&'b mut self, parent: &'c mut Node<'d>)
'c is not going to be important, so we can ignore it.
If we try to compile now, the last two notes are different:
note: but, the lifetime must be valid for the lifetime 'd as defined on the method body at 88:5...
--> src/main.rs:88:5
|
88 | / fn parse_row<'b, 'c, 'd>(&'b mut self, parent: &'c mut Node<'d>) {
89 | | loop {
90 | | match self.next() {
91 | | Some(ref token) => match token.state {
... |
99 | | }
100| | }
| |_____^
note: ...so that expression is assignable (expected Node<'d>, found Node<'_>)
--> src/main.rs:94:42
|
94 | parent.kids.push(Node::new_token(&token));
| ^^^^^^^^^^^^^^^^^^^^^^^
Now, one of the anonymous lifetimes is identified as 'd. The other is still an anonymous lifetime, and that's an artifact of how the compiler manipulates lifetimes, but we can think of it as being 'b here.
The problem should be a bit clearer now: we're trying to push a Node<'b> into a collection of Node<'d> objects. It's important that the type be exactly Node<'d>, because mutable references (&'a mut T) are invariant over T ("invariant" means it can't change).
Let's make the lifetimes match. First, we'll change next's signature to match what we can actually return:
fn next(&mut self) -> Option<&'a Token<'a>>
This means that now, when we call self.next() in parse_row, we'll be able to construct a Node<'a>. A Node<'x> can only store Node<'x> objects (per your definition of Node), so the parent parameter's referent must also be of type Node<'a>.
fn parse_row(&mut self, parent: &mut Node<'a>)
If we try to compile now, we'll get an error in Parser::parse_block on the call to parse_row. The problem is similar to what we just saw. parse_block's signature is:
fn parse_block(&mut self, parent: &mut Node)
which expands to:
fn parse_block<'b, 'c, 'd>(&'b mut self, parent: &'c mut Node<'d>)
Here's the error the compiler gives with this elaborated signature:
error[E0495]: cannot infer an appropriate lifetime for lifetime parameter `'a` due to conflicting requirements
--> src/main.rs:78:26
|
78 | self.parse_row(&mut row);
| ^^^^^^^^^
|
note: first, the lifetime cannot outlive the lifetime 'a as defined on the impl at 72:1...
--> src/main.rs:72:1
|
72 | / impl<'a> Parser<'a> {
73 | | fn parse_block<'b, 'c, 'd>(&'b mut self, parent: &'c mut Node<'d>) {
74 | | loop {
75 | | let mut row = Node::new(ParseState::Expr);
... |
112| | }
113| | }
| |_^
note: ...so that types are compatible (expected &mut Parser<'_>, found &mut Parser<'a>)
--> src/main.rs:78:26
|
78 | self.parse_row(&mut row);
| ^^^^^^^^^
note: but, the lifetime must be valid for the lifetime 'd as defined on the method body at 73:5...
--> src/main.rs:73:5
|
73 | / fn parse_block<'b, 'c, 'd>(&'b mut self, parent: &'c mut Node<'d>) {
74 | | loop {
75 | | let mut row = Node::new(ParseState::Expr);
76 | | match self.peek() {
... |
85 | | }
86 | | }
| |_____^
note: ...so that types are compatible (expected &mut Node<'_>, found &mut Node<'d>)
--> src/main.rs:84:20
|
84 | parent.push_if(row);
| ^^^^^^^
The compiler is unable to infer the type of row (specifically, the lifetime in its type Node<'x>). On one hand, the call to parse_row means it should be Node<'a>, but the call to push_if means it should be Node<'d>. 'a and 'd are unrelated, so the compiler doesn't know how to unify them.
The solution is easy, and it's the same as above: just make parent have type &mut Node<'a>.
fn parse_block(&mut self, parent: &mut Node<'a>)
Now your code compiles!
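For reference, here is the Parser impl with all three adjusted signatures in one place (the method bodies are unchanged from the question):
impl<'a> Parser<'a> {
    fn parse_block(&mut self, parent: &mut Node<'a>) {
        loop {
            let mut row = Node::new(ParseState::Expr);
            match self.peek() {
                Some(_) => {
                    self.parse_row(&mut row);
                }
                None => {
                    break;
                }
            }
            parent.push_if(row);
        }
    }

    fn parse_row(&mut self, parent: &mut Node<'a>) {
        loop {
            match self.next() {
                Some(ref token) => match token.state {
                    TokenState::VSpace => break,
                    _ => {
                        parent.kids.push(Node::new_token(&token));
                    }
                },
                None => break,
            }
        }
    }

    // The return type now names 'a explicitly instead of borrowing from self.
    fn next(&mut self) -> Option<&'a Token<'a>> {
        let index = self.index;
        if index < self.tokens.len() {
            self.index += 1;
        }
        self.tokens.get(index)
    }

    fn peek(&mut self) -> Option<&Token> {
        self.tokens.get(self.index)
    }
}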