Implement From trait on string slice and borrowed string - rust

In Rust, it's common practice to have functions which prefer string slices, &str, to borrowed strings, &String, because the String type implements Deref<Target = str>.
I am wondering if there is an idiomatic way to implement the From<&str> trait for a struct such that the struct can be created from either a string slice or a borrowed string. The best example I can come up with is below:
pub struct Base64 {
    bytes: Vec<u8>,
}

impl From<&str> for Base64 {
    fn from(s: &str) -> Self {
        Base64 { bytes: base64::decode(s).unwrap() }
    }
}

impl From<&String> for Base64 {
    fn from(s: &String) -> Self {
        Base64::from(s.as_str())
    }
}
Without the last impl, a usage like Base64::from(&String::from("...")) generates the following error:
the trait bound `Base64: From<&String>` is not satisfied
the following other types implement trait `From<T>`:
<Base64 as From<&str>>

There's indeed a way to make this more concise:
use std::ops::Deref;

impl<T: Deref<Target = str>> From<&T> for Base64 {
    fn from(s: &T) -> Self {
        Base64 { bytes: base64::decode(s.deref()).unwrap() }
    }
}
and get rid of the two separate trait impls. This version uses generics with a trait bound: now anything that can be dereferenced to a str can be used with your struct.
EDIT: Alternatively, keep the non-generic version for &str, ditch the &String version, and instead call it like so:
Base64::from(String::from("...").as_ref())
which alleviates the concern pointed out by @kmdreko.
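As a runnable sketch of the generic approach (using a hypothetical Raw struct with a plain byte copy standing in for the base64 crate, so it's dependency-free):

```rust
use std::ops::Deref;

// Hypothetical stand-in for Base64: `decode` is replaced by a plain
// byte copy so the sketch runs without the base64 crate.
pub struct Raw {
    bytes: Vec<u8>,
}

impl<T: Deref<Target = str>> From<&T> for Raw {
    fn from(s: &T) -> Self {
        Raw { bytes: s.as_bytes().to_vec() }
    }
}

fn main() {
    let owned = String::from("abc");
    let raw = Raw::from(&owned); // &String works via its Deref<Target = str> impl
    assert_eq!(raw.bytes, b"abc");
    // Note: a bare &str does not match this generic impl, because str
    // itself does not implement Deref<Target = str>.
}
```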

Related

Lifetime declaration errors on struct with &str

I have a struct like this -
pub struct A {
    pub f: &str,
}
When I later try to assign a &str value to the struct using this -
fn make(&self, t: &str) {
    let a = A {
        f: t,
    };
}
I get an error saying -
pub f: &str,
| ^ expected named lifetime parameter
I then changed my struct to use named lifetime parameters -
pub struct A<'i> {
pub f: &'i str,
}
But it now gives error -
implicit elided lifetime not allowed here
note: assuming a `'static` lifetime...
help: indicate the anonymous lifetime
Can someone explain the second error and how can I fix it?
Use String instead of &str:
pub struct A {
    pub f: String,
}

// use .to_string() to convert &str to String
fn make(&self, t: &str) {
    let a = A {
        f: t.to_string(),
    };
}
Without seeing a more complete picture of your code it's hard to say what exactly is wrong, but you are probably missing a lifetime specifier on your impl. For example, this code will compile:
pub struct A<'a> {
    pub f: &'a str,
}

impl<'a> A<'a> {
    fn new(t: &'a str) -> Self {
        A {
            f: t,
        }
    }
}

fn main() {
    A::new("hi");
}
You need to implement A for the generic lifetime 'a; otherwise the compiler assumes the 'static lifetime, which conflicts with your struct definition.
In most cases this isn't very practical, however, and you probably want your struct to hold an owned String rather than a reference.

How do you create a generic function in Rust with a trait requiring a lifetime?

I am trying to write a trait which works with a database and represents something which can be stored. To do this, the trait inherits from others, which includes the serde::Deserialize trait.
trait Storable<'de>: Serialize + Deserialize<'de> {
    fn global_id() -> &'static [u8];
    fn instance_id(&self) -> Vec<u8>;
}

#[derive(Serialize, Deserialize)]
struct Example {
    a: u8,
    b: u8,
}

impl<'de> Storable<'de> for Example {
    fn global_id() -> &'static [u8] { b"p" }
    fn instance_id(&self) -> Vec<u8> { vec![self.a, self.b] }
}
Next, I am trying to write this data using a generic function:
pub fn put<'de, S: Storable>(&mut self, obj: &'de S) -> Result<(), String> {
    ...
    let value = bincode::serialize(obj, bincode::Infinite);
    ...
    db.put(key, value).map_err(|e| e.to_string())
}
However, I am getting the following error:
error[E0106]: missing lifetime specifier
--> src/database.rs:180:24
|
180 | pub fn put<'de, S: Storable>(&mut self, obj: &'de S) -> Result<(), String> {
| ^^^^^^^^ expected lifetime parameter
Minimal example on the playground.
How would I resolve this, possibly avoid it altogether?
You have defined Storable with a generic parameter, in this case a lifetime. That means that the generic parameter has to be propagated throughout the entire application:
fn put<'de, S: Storable<'de>>(obj: &'de S) -> Result<(), String> { /* ... */ }
You can also decide to make the generic specific. That can be done with a concrete type or lifetime (e.g. 'static), or by putting it behind a trait object.
Serde also has a comprehensive page about deserializer lifetimes. It mentions that you can choose to use DeserializeOwned as well.
trait Storable: Serialize + DeserializeOwned { /* ... */ }
You can use the same concept as DeserializeOwned for your own trait as well:
trait StorableOwned: for<'de> Storable<'de> {}

fn put<S: StorableOwned>(obj: &S) -> Result<(), String> { /* ... */ }
You have the 'de lifetime in the wrong place -- you need it to specify the argument to Storable, not the lifetime of the reference obj.
Instead of
fn to_json<'de, S: Storable>(obj: &'de S) -> String {
use
fn to_json<'de, S: Storable<'de>>(obj: &S) -> String {
Playground.
The lifetime of obj doesn't actually matter here, because you're not returning any values derived from it. All you need to prove is that S implements Storable<'de> for some lifetime 'de.
If you want to eliminate the 'de altogether, you should use DeserializeOwned, as the other answer describes.
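The higher-ranked trait bound pattern above can be shown in a dependency-free sketch (a toy Storable trait stands in for the serde-based one, with illustrative names only):

```rust
// Toy stand-in for the serde-backed Storable<'de> trait, to show the
// for<'de> (higher-ranked trait bound) pattern without external crates.
trait Storable<'de> {
    fn instance_id(&self) -> Vec<u8>;
}

struct Example {
    a: u8,
    b: u8,
}

impl<'de> Storable<'de> for Example {
    fn instance_id(&self) -> Vec<u8> { vec![self.a, self.b] }
}

// S must implement Storable<'de> for *every* lifetime 'de, so the
// caller needs no lifetime parameter of its own.
fn put<S: for<'de> Storable<'de>>(obj: &S) -> Vec<u8> {
    obj.instance_id()
}

fn main() {
    let e = Example { a: 1, b: 2 };
    assert_eq!(put(&e), vec![1, 2]);
}
```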

the From<&String> trait is not implemented for the type String

I'm going off of this article in an attempt to write a function that accepts both a String and a &str, but I'm running into a problem. I have the following function:
pub fn new<S>(t_num: S) -> BigNum where S: Into<String> {
    let t_value = t_num.into();
    let t_digits = t_value.len();
    BigNum { value: t_value, digits: t_digits }
}
BigNum is a simple struct. The problem, however, is that when I attempt to call this with a &collections::string::String, I get an error:
let line = "123456".to_string();
let big = bignum::BigNum::new(&line);
main.rs:23:15: 23:34 error: the trait `core::convert::From<&collections::string::String>` is not implemented for the type `collections::string::String` [E0277]
main.rs:23 let big = bignum::BigNum::new(&line);
I was under the impression that a &String will be implicitly broken down into a &str no? And in that case the Into trait would convert the &str into a String I could then use. What am I doing wrong?
You're conflating two different processes.
First, there's coercion; in particular, Deref coercion. This happens when the compiler sees that you have a &U, but you want a &T. Provided there is an impl Deref<Target=T> for U, it will do the coercion for you. This is why a &String will coerce to a &str.
However, this does not come into play when the compiler is substituting generic type parameters. When you say BigNum::new(&line), what the compiler sees is that you're trying to pass a &String where it expects an S; thus, S must be &String, thus S must implement Into<String> and... oh no! It doesn't! BOOM! Coercion is never triggered because the compiler never needs to coerce anything; unfulfilled type constraints are a different problem.
In this particular case, what you should do depends on your circumstances:
You can just pass a String; use line or line.clone(). This is the most efficient in that you can always pass in an owned String you no longer need and avoid an extra allocation.
You can instead take an &S with S: ?Sized + AsRef<str>, which doesn't allow you to pass an owned string, but if you're always going to allocate anyway, this may be more ergonomic.
Here's an example of both in action:
use std::convert::AsRef;

fn main() {
    take_a_string(String::from("abc"));
    // take_a_string(&String::from("abc")); // Boom!
    take_a_string("def");

    // take_a_string_ref(String::from("abc")); // Boom!
    take_a_string_ref(&String::from("abc"));
    take_a_string_ref("def");
}

fn take_a_string<S>(s: S)
    where S: Into<String>
{
    let s: String = s.into();
    println!("{:?}", s);
}

fn take_a_string_ref<S: ?Sized>(s: &S)
    where S: AsRef<str>
{
    let s: String = s.as_ref().into();
    println!("{:?}", s);
}
As mentioned by DK., this is not possible with the Rust Into trait, because of the missing implementation of Into<String> for &String. I couldn't find the reasons behind this, but you can create your own Trait to solve this:
pub trait IntoString {
    fn into(self) -> String;
}

impl IntoString for &String {
    fn into(self) -> String {
        self.to_string()
    }
}

impl IntoString for &str {
    fn into(self) -> String {
        self.to_string()
    }
}

impl IntoString for String {
    fn into(self) -> String {
        self
    }
}

pub fn new<S>(t_num: S) -> BigNum where S: IntoString {
    let t_value = t_num.into();
    let t_digits = t_value.len();
    BigNum { value: t_value, digits: t_digits }
}
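A self-contained demonstration of the custom-trait approach (repeating the trait, and assuming a minimal BigNum with value: String and digits: usize fields):

```rust
pub struct BigNum {
    value: String,
    digits: usize,
}

// Custom conversion trait: unlike std's Into<String>, we can freely
// implement it for &String and &str.
pub trait IntoString {
    fn into(self) -> String;
}

impl IntoString for &String {
    fn into(self) -> String { self.to_string() }
}

impl IntoString for &str {
    fn into(self) -> String { self.to_string() }
}

impl IntoString for String {
    fn into(self) -> String { self }
}

pub fn new<S: IntoString>(t_num: S) -> BigNum {
    let t_value = t_num.into();
    let t_digits = t_value.len();
    BigNum { value: t_value, digits: t_digits }
}

fn main() {
    let line = "123456".to_string();
    assert_eq!(new(&line).digits, 6); // &String now works
    assert_eq!(new("123").digits, 3); // &str works
    assert_eq!(new(line).digits, 6);  // owned String works
}
```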

Implementing FromStr for a custom &[u8] type

This is a two-parter.
Ideally I'd like to implement the FromStr trait, but with or without that, I need to implement from_str().
A CqlString consists of a u16 (two u8s) followed by the raw bytes of the original string.
The version below generates "error: 'bytes' does not live long enough", so that's problem #1.
If I make it "impl FromStr for CqlString", then I get an earlier error of:
error: method from_str has an incompatible type for trait: expected concrete lifetime, found bound lifetime parameter [E0053]
So given the structure of CqlString, how can I implement a FromStr fn properly?
#[repr(C, packed)]
pub struct CqlString<'a>(&'a [u8]);

impl<'a> CqlString<'a> {
    fn from_str(s: &str) -> Option<CqlString> {
        let mut bytes = Vec::<u8>::new();
        // convert the string length to a two-byte short and start building our &[u8]
        bytes.extend_from_slice(unsafe { Utils::raw_byte_repr(&(s.len() as u16)) });
        bytes.extend_from_slice(s.as_bytes());
        let cqls = CqlString(&bytes[..]); // error: `bytes` does not live long enough
        Some(cqls)
    }
}
The short answer is that you can't. CqlString contains a reference to other data, but FromStr expects to create a fully-owned object that no longer needs to reference the &str. These two concepts are incompatible.
The closest I can see is that you could create an OwnedCqlString:
struct OwnedCqlString {
    data: Vec<u8>,
}

impl OwnedCqlString {
    fn as_cql_string(&self) -> CqlString { CqlString(&self.data) }
}

impl FromStr for OwnedCqlString {
    type Err = ();

    fn from_str(s: &str) -> Result<OwnedCqlString, ()> {
        // logic here
    }
}

fn main() {
    let ocs: OwnedCqlString = "hello".parse().unwrap();
    let cs = ocs.as_cql_string();
}
Ultimately, this comes down to two questions:
Where are you going to store the bytes that represent the size?
How do you ensure that those bytes immediately precede the string data in memory?
An alternate idea
If you didn't need to store the slice of bytes, but instead could have a "streaming" interface, then you could implement that directly on &str:
trait WriteCqlStr {
    fn write_to<W>(&self, w: &mut W)
    where
        W: Write;
}

impl WriteCqlStr for CqlString<'_> {
    // Straightforward impl, write the bytes we refer to
}

impl WriteCqlStr for &str {
    // Write the length, then write the bytes of the str
}
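The &str arm of that streaming interface can be filled in concretely; this is a sketch assuming the wire format described in the question (a two-byte length prefix followed by the raw bytes, big-endian assumed here) and std::io::Write in place of the pre-1.0 Writer trait:

```rust
use std::io::{self, Write};

// Sketch of the streaming interface: length-prefixed encoding with a
// big-endian u16 length followed by the raw string bytes (assumed format).
trait WriteCqlStr {
    fn write_to<W: Write>(&self, w: &mut W) -> io::Result<()>;
}

impl WriteCqlStr for &str {
    fn write_to<W: Write>(&self, w: &mut W) -> io::Result<()> {
        let len = self.len() as u16;
        w.write_all(&len.to_be_bytes())?; // two-byte length prefix
        w.write_all(self.as_bytes())      // then the string data
    }
}

fn main() -> io::Result<()> {
    let mut buf = Vec::new();
    "hello".write_to(&mut buf)?;
    assert_eq!(buf, [0, 5, b'h', b'e', b'l', b'l', b'o']);
    Ok(())
}
```

This sidesteps the lifetime problem entirely: no intermediate buffer needs to outlive the call, because the bytes go straight to the writer.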

Returning a string from a method in Rust

Editor's note: The syntax in this question predates Rust 1.0 and the 1.0-updated syntax generates different errors, but the overall concepts are still the same in Rust 1.0.
I have a struct T with a name field, I'd like to return that string from the name function. I don't want to copy the whole string, just the pointer:
struct T {
    name: ~str,
}

impl Node for T {
    fn name(&self) -> &str { self.name }
    // this doesn't work either (lifetime error)
    // fn name(&self) -> &str { let s: &str = self.name; s }
}

trait Node {
    fn name(&self) -> &str;
}
trait Node {
fn name(&self) -> &str;
}
Why is this wrong? What is the best way to return an &str from a function?
You have to take a reference to self.name and ensure that the lifetimes of &self and &str are the same:
struct T {
    name: String,
}

impl Node for T {
    fn name<'a>(&'a self) -> &'a str {
        &self.name
    }
}

trait Node {
    fn name(&self) -> &str;
}
You can also use lifetime elision, as shown in the trait definition. In this case, it automatically ties together the lifetimes of self and the returned &str.
Take a look at these book chapters:
References and borrowing
Lifetime syntax
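Putting the answer together as a runnable example, using the elided form throughout:

```rust
struct T {
    name: String,
}

trait Node {
    fn name(&self) -> &str;
}

impl Node for T {
    // Elision ties the returned &str's lifetime to the &self borrow.
    fn name(&self) -> &str {
        &self.name
    }
}

fn main() {
    let t = T { name: String::from("node") };
    assert_eq!(t.name(), "node");
}
```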
