I want to convert multiple environment variables into a static struct.
I can do it manually:
Env {
    is_development: env::var("IS_DEVELOPMENT")
        .unwrap()
        .parse::<bool>()
        .unwrap(),
    server: Server {
        host: env::var("HOST").unwrap(),
        port: env::var("PORT")
            .unwrap()
            .parse::<u16>()
            .unwrap(),
    },
}
But with multiple values this becomes bloated. Is there a way to write a generic helper function that returns a value of the type I specify, or panics? Something like this (or another solution):
fn get_env_var<T>(env_var_name: String) -> T {
    // panic is ok here
    let var = env::var(env_var_name).unwrap();
    T::from(var)
}
get_env_var::<u16>("PORT") // here I get a u16
get_env_var::<bool>("IS_DEVELOPMENT") // here is my boolean
Full example
use crate::server::logger::log_raw;
use dotenv::dotenv;
use lazy_static::lazy_static;
use serde::Deserialize;
use std::env;

#[derive(Deserialize, Debug, Clone)]
pub struct Server {
    pub host: String,
    pub port: u16,
}

#[derive(Deserialize, Debug, Clone)]
pub struct Env {
    pub is_development: bool,
    pub server: Server,
}

impl Env {
    pub fn init() -> Self {
        dotenv().expect(".env loading failed");
        // how can I specify what type I expect?
        fn get_env_var<T>(env_var_name: String) -> T {
            // panic is ok here
            let var = env::var(env_var_name).unwrap();
            T::from(var)
        }
        // instead of this
        Env {
            is_development: env::var("IS_DEVELOPMENT")
                .unwrap()
                .parse::<bool>()
                .unwrap(),
            server: Server {
                host: env::var("HOST").unwrap(),
                port: env::var("PORT")
                    .unwrap()
                    .parse::<u16>()
                    .unwrap(),
            },
        }
        // do something like this
        /*
        Env {
            is_development: get_env_var::<bool>("IS_DEVELOPMENT"),
            server: Server {
                host: get_env_var::<String>("HOST"),
                port: get_env_var::<u16>("PORT"),
            },
        }
        */
    }
}

lazy_static! {
    pub static ref ENV: Env = Env::init();
}
As in your manual version where you use str::parse, you can require the same thing str::parse requires, which is FromStr. So if you include a T: FromStr bound, you'll be able to call var.parse::<T>().
use std::env;
use std::fmt::Debug;
use std::str::FromStr;

fn get_env_var<T>(env_var_name: &str) -> T
where
    T: FromStr,
    T::Err: Debug,
{
    let var = env::var(env_var_name).unwrap();
    var.parse::<T>().unwrap()
}
If you then run the following with PORT=1234 IS_DEVELOPMENT=true cargo run:
fn main() {
    println!("{}", get_env_var::<u16>("PORT"));
    println!("{}", get_env_var::<bool>("IS_DEVELOPMENT"));
}
it will output:
1234
true
Alternatively, you might want to be able to handle VarError::NotPresent and fall back to a default.
use std::env::{self, VarError};
use std::fmt::Debug;
use std::str::FromStr;

fn get_env_var<T>(env_var_name: &str) -> Result<T, VarError>
where
    T: FromStr,
    T::Err: Debug,
{
    let var = env::var(env_var_name)?;
    Ok(var.parse().unwrap())
}
Now, if you only execute PORT=1234 cargo run, this makes it easy to fall back when a variable is missing:
let is_dev = get_env_var::<bool>("IS_DEVELOPMENT")
    .or_else(|err| match err {
        VarError::NotPresent => Ok(false),
        err => Err(err),
    })
    .unwrap();
println!("{:?}", is_dev);
If you want to fall back to Default on VarError::NotPresent:
fn get_env_var<T>(env_var_name: &str) -> T
where
    T: FromStr,
    T::Err: Debug,
    T: Default,
{
    let var = match env::var(env_var_name) {
        Err(VarError::NotPresent) => return T::default(),
        res => res.unwrap(),
    };
    var.parse().unwrap()
}
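For example, running with only PORT=1234 cargo run, a minimal usage sketch of the Default fallback (bool::default() is false):
fn main() {
    let port: u16 = get_env_var("PORT"); // 1234
    let is_dev: bool = get_env_var("IS_DEVELOPMENT"); // not set: bool::default() == false
    println!("{} {}", port, is_dev);
}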
Rust genericity, inspired by Haskell's, works through traits, and specifically trait bounds. This means when you write
fn get_env_var<T>(env_var_name: String) -> T
since there is no trait bound on T, there are essentially no capabilities for it (this is rather unlike C++).
Therefore, as far as rustc is concerned, pretty much the only thing it can do with a T is take one as a parameter and return it as-is.
Thus, to do anything useful with a T (including creating one, whether from something else or de novo), you need to use the correct trait and provide the correct trait bounds.
The From trait is entirely the wrong trait to involve here: it specifies total (never-failing) conversions, e.g. converting a u16 to a u32, which can never fail.
Whether it's converting a String to a bool or to a u16, the conversion is quite obviously less than total: there is an infinity of string values which are not sequences of decimal digits describing a number below 2^16.
In Rust, the signifier of fallibility tends to be Try. There is a TryFrom trait; however, for historical reasons, and as its signature documents, the str::parse method is hooked on the FromStr trait.
This means that in order to declare that your T can be created from a string (and to use the parse method to create one), you need to bound T by FromStr, and of course indicate that the conversion may fail, returning whatever error T produces when it can't be parsed from a string:
fn get_env_var<T: FromStr>(env_var_name: String) -> Result<T, T::Err> {
    let var = env::var(env_var_name).unwrap();
    var.parse()
}
Incidentally, taking a String as input is usually avoided unless you really need one[0]. Usually you'd take an &str, which is a lot more flexible, as it can be used e.g. with string literals (which are of type &'static str).
So:
fn get_env_var<T: FromStr>(env_var_name: &str) -> Result<T, T::Err> {
    let var = env::var(env_var_name).unwrap();
    var.parse()
}
[0] or for efficiency purposes sometimes
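For completeness, a minimal sketch of calling this last version; note that a missing variable still panics inside get_env_var (the unwrap), while a parse failure surfaces as the Result:
fn main() {
    // A non-numeric PORT surfaces here as Err(ParseIntError).
    let port: u16 = get_env_var("PORT").expect("PORT must parse as a u16");
    println!("{}", port);
}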
Related
I am trying to understand the following enum from this repo:
#[repr(C)]
#[derive(BorshSerialize, BorshDeserialize, Debug, Clone)]
pub struct InitEscrowArgs {
    pub data: EscrowReceive,
}

#[repr(C)]
#[derive(BorshSerialize, BorshDeserialize, Debug, Clone)]
pub struct ExchangeArgs {
    pub data: EscrowReceive,
}

#[derive(BorshSerialize, BorshDeserialize, Clone)]
pub enum EscrowInstruction {
    InitEscrow(InitEscrowArgs),
    Exchange(ExchangeArgs),
    CancelEscrow(),
}
and its use in this match from this repo:
pub fn process(
    program_id: &Pubkey,
    accounts: &[AccountInfo],
    instruction_data: &[u8],
) -> ProgramResult {
    let instruction = EscrowInstruction::try_from_slice(instruction_data)?;
    match instruction {
        EscrowInstruction::InitEscrow(args) => {
            msg!("Instruction: Init Escrow");
            Self::process_init_escrow(program_id, accounts, args.data.amount)
        }
        EscrowInstruction::Exchange(args) => {
            msg!("Instruction: Exchange Escrow");
            Self::process_exchange(program_id, accounts, args.data.amount)
        }
        EscrowInstruction::CancelEscrow() => {
            msg!("Instruction: Cancel Escrow");
            Self::process_cancel(program_id, accounts)
        }
    }
}
I understand that this try_from_slice method takes some sort of byte array and deserializes it.
I do not understand how it determines which enum variant to use.
The enum has 3 choices, InitEscrow / Exchange / CancelEscrow, but what determines which one the match should select?
It seems to me that InitEscrowArgs and ExchangeArgs both take the same struct, both containing data of the EscrowReceive type.
The try_from_slice method is part of the BorshDeserialize trait, which is derived on the enum in question. So the choice between enum variants is made by the deserializer's implementation.
To see what is really going on, I've built the simplest possible example:
use borsh::BorshDeserialize;

#[derive(BorshDeserialize)]
enum Enum {
    Variant1(u8),
    Variant2,
}
By using cargo expand and a little manual cleanup, we can get the following equivalent code:
impl borsh::de::BorshDeserialize for Enum {
    fn deserialize(buf: &mut &[u8]) -> Result<Self, std::io::Error> {
        let variant_idx: u8 = borsh::BorshDeserialize::deserialize(buf)?;
        let return_value = match variant_idx {
            0u8 => Enum::Variant1(borsh::BorshDeserialize::deserialize(buf)?),
            1u8 => Enum::Variant2,
            _ => {
                let msg = format!("Unexpected variant index: {}", variant_idx);
                return Err(std::io::Error::new(
                    std::io::ErrorKind::InvalidInput,
                    msg,
                ));
            }
        };
        Ok(return_value)
    }
}
where the inner deserialize calls refer to the impl BorshDeserialize for u8:
fn deserialize(buf: &mut &[u8]) -> Result<Self> {
    if buf.is_empty() {
        return Err(Error::new(
            ErrorKind::InvalidInput,
            ERROR_UNEXPECTED_LENGTH_OF_INPUT,
        ));
    }
    let res = buf[0];
    *buf = &buf[1..];
    Ok(res)
}
So it works the following way:
The deserializer tries to pull one byte from the input; if there is none, this is an error.
This byte is interpreted as the index of an enum variant; if it doesn't match one of the variants, this is an error.
If the variant contains any data, the deserializer tries to pull this data from the input; if it fails (according to the inner type's implementation), this is an error.
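To make the variant index visible, here is a minimal round-trip sketch (assuming a borsh version that provides try_to_vec on BorshSerialize):
use borsh::{BorshDeserialize, BorshSerialize};

#[derive(BorshSerialize, BorshDeserialize, Debug, PartialEq)]
enum Enum {
    Variant1(u8),
    Variant2,
}

fn main() {
    // Serializing writes the variant index first, then the payload.
    let bytes = Enum::Variant1(42).try_to_vec().unwrap();
    assert_eq!(bytes, vec![0, 42]);
    // Deserializing reads that leading byte back to pick the variant.
    assert_eq!(Enum::try_from_slice(&bytes).unwrap(), Enum::Variant1(42));
}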
I'm wondering if there's a "proper" way of converting an enum to a &str and back.
The problem I'm trying to solve:
In the clap crate, args/subcommands are defined and identified by &strs (which, I'm assuming, don't fully take advantage of the type checker). I'd like to pass a Command enum around my application instead of a &str, which would be verified by the type checker and also save me from typing (typo-ing?) strings all over the place.
This is what I came up with from searching StackOverflow and std:
use std::str::FromStr;

#[derive(Debug, Clone, Copy, PartialEq)]
pub enum Command {
    EatCake,
    MakeCake,
}

impl FromStr for Command {
    type Err = ();

    fn from_str(s: &str) -> std::result::Result<Self, Self::Err> {
        match s.to_ascii_lowercase().as_str() {
            "eat-cake" => Ok(Self::EatCake),
            "make-cake" => Ok(Self::MakeCake),
            _ => Err(()),
        }
    }
}

impl<'a> From<Command> for &'a str {
    fn from(c: Command) -> Self {
        match c {
            Command::EatCake => "eat-cake",
            Command::MakeCake => "make-cake",
        }
    }
}

fn main() {
    let command_from_str: Command = "eat-cake".to_owned().parse().unwrap();
    let str_from_command: &str = command_from_str.into();
    assert_eq!(command_from_str, Command::EatCake);
    assert_eq!(str_from_command, "eat-cake");
}
And here's a working playground:
https://play.rust-lang.org/?version=stable&mode=debug&edition=2018&gist=b5e9ac450fd6a79b855306e96d4707fa
Here's an abridged version of what I'm running in clap:
let matches = App::new("cake")
    .setting(AppSettings::SubcommandRequiredElseHelp)
    // ...
    .subcommand(
        SubCommand::with_name(Command::MakeCake.into())
        // ...
    )
    .subcommand(
        SubCommand::with_name(Command::EatCake.into())
        // ...
    )
    .get_matches();
It seems to work, but I'm not sure if I'm missing something / a bigger picture.
Related:
How to use an internal library Enum for Clap Args
How do I return an error within match statement while implementing from_str in rust?
Thanks!
The strum crate may save you some work. Using strum, I was able to get your simple main() to work without any additional From implementations.
use strum_macros::{Display, EnumString, IntoStaticStr};

#[derive(Debug, Clone, Copy, PartialEq)]
#[derive(Display, EnumString, IntoStaticStr)] // strum macros.
pub enum Command {
    #[strum(serialize = "eat-cake")]
    EatCake,
    #[strum(serialize = "make-cake")]
    MakeCake,
}

fn main() {
    let command_from_str: Command = "eat-cake".to_owned().parse().unwrap();
    let str_from_command: &str = command_from_str.into();
    assert_eq!(command_from_str, Command::EatCake);
    assert_eq!(str_from_command, "eat-cake");
}
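With IntoStaticStr derived, the clap registration from the question works unchanged, and the matched subcommand name can be parsed back into the enum. A minimal sketch against the clap 2.x API shown in the question:
use clap::{App, AppSettings, SubCommand};

fn run() {
    let matches = App::new("cake")
        .setting(AppSettings::SubcommandRequiredElseHelp)
        .subcommand(SubCommand::with_name(Command::MakeCake.into()))
        .subcommand(SubCommand::with_name(Command::EatCake.into()))
        .get_matches();
    // SubcommandRequiredElseHelp guarantees a subcommand name is present,
    // and clap only accepts registered names, so the parse cannot fail.
    let cmd: Command = matches.subcommand_name().unwrap().parse().unwrap();
    match cmd {
        Command::EatCake => println!("eating cake"),
        Command::MakeCake => println!("making cake"),
    }
}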
So I have this function:
fn render_i32(n: &dyn Typeable, echo: &dyn Fn(&String)) {
    let x: &i32 = unsafe { transmute(n) };
    echo(&x.to_string());
}
It does not compile, with the error "cannot transmute between types of different sizes".
What I want with this code is the following: I have a HashMap which contains rendering functions for different types. Every type that may be rendered must implement my trait Typeable, which basically just returns a constant type_id for the type (I've just come across TypeId in std and wonder if I could use that instead...). Using that type_id I can then look up the correct render function in my HashMap. So my code ensures that render_i32 is only called for an i32. This works fine.
Now all of this would be really easy in C, where I'd just cast the value behind the pointer. But in Rust it does not appear to be so easy: I don't get at the i32 value. How would I get it?
Edit: alternative solutions to my own approach that are less type-unsafe and satisfy the following requirement are also welcome: clients (who use this library) should be able to add their own rendering functions for their own types.
Note that the rendering functions are not supposed to be statically defined once and for all: different rendering functions might be used for the same type, depending for example on a language setting.
I still don't get why you didn't use the conventional trait-impl approach; it seems to do what you want, except that the function pointers don't have a common data structure holding them (which is probably less cache-friendly than the HashMap approach):
Playground
use std::iter;

// lib
fn echo_windows(s: &String) {
    println!("C:/Users> {}", s)
}

fn echo_linux(s: &String) {
    println!("$ {}", s)
}

trait Renderable {
    fn render(&self, echo: &dyn Fn(&String));
}

// client
struct ClientType {
    ch: char,
    len: usize,
}

impl Renderable for ClientType {
    fn render(&self, echo: &dyn Fn(&String)) {
        let to_echo: String = iter::repeat(self.ch)
            .take(self.len)
            .collect();
        echo(&to_echo);
    }
}

fn main() {
    ClientType { ch: '#', len: 5 }.render(&echo_windows); // output: C:/Users> #####
    ClientType { ch: '!', len: 3 }.render(&echo_linux); // output: $ !!!
}
Maybe you can use the Any trait for your purpose:
use std::any::Any;

pub trait Typeable {
    ...
    fn as_any(&self) -> &dyn Any;
}

fn render_i32(n: &dyn Typeable, echo: &dyn Fn(&String)) {
    let x: &i32 = n.as_any().downcast_ref::<i32>().unwrap();
    echo(&x.to_string());
}
The downcast_ref::<i32>() method returns an Option<&i32>, so you can also check whether the downcast is valid. You can even do this in a generic way:
fn render<T: 'static + std::fmt::Display>(n: &dyn Typeable, echo: &dyn Fn(&String)) {
    let x: &T = n.as_any().downcast_ref::<T>().unwrap();
    echo(&x.to_string());
}
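Combining this with the HashMap of rendering functions the question describes, here is a minimal sketch of a runtime registry keyed by std::any::TypeId; the Typeable impl for i32 is an assumption about how your trait would be wired up:
use std::any::{Any, TypeId};
use std::collections::HashMap;

pub trait Typeable {
    fn as_any(&self) -> &dyn Any;
}

// Hypothetical client impl: each renderable type exposes itself as &dyn Any.
impl Typeable for i32 {
    fn as_any(&self) -> &dyn Any {
        self
    }
}

fn render<T: 'static + std::fmt::Display>(n: &dyn Typeable, echo: &dyn Fn(&String)) {
    let x: &T = n.as_any().downcast_ref::<T>().unwrap();
    echo(&x.to_string());
}

fn main() {
    // Clients can register their own render functions at runtime.
    let mut renderers: HashMap<TypeId, fn(&dyn Typeable, &dyn Fn(&String))> = HashMap::new();
    renderers.insert(TypeId::of::<i32>(), render::<i32>);

    let value: i32 = 42;
    renderers[&TypeId::of::<i32>()](&value, &|s: &String| println!("{}", s));
}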
I have a type Instruction and would like to use std::convert::TryFrom to convert from a string.
Should I implement it over String or &str? If I use &str, I am obliged to use the &* pattern or as_ref().
I have something like this: Rust Playground permalink
use std::convert::TryFrom;

enum Instruction {
    Forward, /* others removed for brevity */
}

#[derive(Debug)]
struct InstructionParseError(char);

impl std::convert::TryFrom<&str> for Instruction {
    type Error = InstructionParseError;

    fn try_from(input: &str) -> Result<Self, Self::Error> {
        match input {
            "F" => Ok(Instruction::Forward),
            _ => unimplemented!(), // for brevity
        }
    }
}

fn main() {
    // I use a String because this input can come from stdin.
    let instr = String::from("F");
    let instr = Instruction::try_from(&*instr);
}
I read this answer: Should Rust implementations of From/TryFrom target references or values? But I am wondering what the best option is: implement both? Use advanced impl typing?
One solution, I think after reading SirDarius' comment, is to also implement it for String and use as_ref() or &* inside:
use std::convert::TryFrom;

enum Instruction {
    Forward, /* others removed for brevity */
}

#[derive(Debug)]
struct InstructionParseError(char);

impl std::convert::TryFrom<String> for Instruction {
    type Error = InstructionParseError;

    fn try_from(input: String) -> Result<Self, Self::Error> {
        Instruction::try_from(input.as_ref())
    }
}
As described in Should Rust implementations of From/TryFrom target references or values?, maybe in the future, if some change is made to the blanket implementations, AsRef will be usable.
*** This doesn't actually work, as SirDarius points out below.
Use T: AsRef<str>.
impl<T: AsRef<str>> std::convert::TryFrom<T> for Instruction {
    type Error = InstructionParseError;

    fn try_from(input: T) -> Result<Self, Self::Error> {
        let input: &str = input.as_ref();
        match input {
            "F" => Ok(Instruction::Forward),
            _ => unimplemented!(), // for brevity
        }
    }
}
I want my getv() function to return a value from a HashMap. I get errors no matter how I modify my code:
enum Type {
    TyInt,
    TyBool,
}

struct TypeEnv {
    Env: HashMap<char, Type>,
}

impl TypeEnv {
    fn set(&mut self, name: char, ty: Type) {
        self.Env.insert(name, ty);
    }

    fn getv(self, name: char) -> Type {
        match self.Env.get(&name) {
            Some(Type) => Type, // <------- Always error here
            None => Type::TyInt,
        }
    }
}
HashMap::get returns an Option<&Type>, not an Option<Type>, which is why just returning the matched value fails to compile.
get provides a reference because in Rust you cannot simply return the actual value that is in the hash table - you need to either clone it (make a copy, potentially expensive), remove it from the container and transfer it to the caller, or return a reference to the value inside. HashMap::get chooses the last option because it is cheap while providing the greatest flexibility to the caller, which can choose to either inspect the value or clone it.
For your enum, which consists of a single machine word, copying the value would be the optimal approach. (For more complex enums, a better approach is to return a reference, as shown in Simon's answer.) Copying requires:
making Type copyable, by marking it with the Copy trait using the #[derive(Copy, Clone)] attribute at the type definition;
returning the copy from getv by dereferencing the matched value with *matched_value, or by using & in the pattern.
Finally, your getv method consumes the object, which means that you can call it only once. This is almost certainly not what was intended - it should accept &self instead. With those changes, the resulting code looks like this:
use std::collections::HashMap;

#[derive(Copy, Clone)]
enum Type {
    TyInt,
    TyBool,
}

struct TypeEnv {
    env: HashMap<char, Type>,
}

impl TypeEnv {
    fn set(&mut self, name: char, ty: Type) {
        self.env.insert(name, ty);
    }

    fn getv(&self, name: char) -> Type {
        match self.env.get(&name) {
            Some(&v) => v,
            None => Type::TyInt,
        }
    }
}
If the type needs to contain non-Copy data, then you can make it only Clone instead, and return v.clone().
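For instance, a minimal sketch of that Clone-based variant, with a hypothetical non-Copy TyName(String) variant added:
use std::collections::HashMap;

#[derive(Clone)]
enum Type {
    TyInt,
    TyBool,
    TyName(String), // hypothetical non-Copy payload
}

struct TypeEnv {
    env: HashMap<char, Type>,
}

impl TypeEnv {
    fn getv(&self, name: char) -> Type {
        match self.env.get(&name) {
            Some(v) => v.clone(), // clone instead of copying
            None => Type::TyInt,
        }
    }
}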
You don't have to make your type Copy or Clone ... if you're happy working with the reference.
When things like this pop up while writing Rust, it's generally best to re-evaluate how you're calling the code. Can it be made simpler?
You can either have the caller handle the Option<&V>, or just force something out... perhaps like this:
fn getv(&self, name: char) -> &Type {
    self.env.get(&name).unwrap_or(&Type::TyInt)
}
This simplifies your code and makes it pretty clear what you're after. Here it is in action:
use std::collections::HashMap;

#[derive(Debug)]
enum Type {
    TyInt,
    TyBool,
}

struct TypeEnv {
    env: HashMap<char, Type>,
}

impl TypeEnv {
    fn set(&mut self, name: char, ty: Type) {
        self.env.insert(name, ty);
    }

    fn getv(&self, name: char) -> &Type {
        self.env.get(&name).unwrap_or(&Type::TyInt)
    }
}

fn main() {
    let mut te = TypeEnv {
        env: HashMap::new(),
    };
    {
        let arg = te.getv('c');
        println!("{:?}", arg); // "TyInt" - because nothing was found
    }
    te.set('c', Type::TyBool);
    let arg = te.getv('c'); // "TyBool" - because we stored it in the line above.
    println!("{:?}", arg);
}
Also, your existing implementation of getv takes ownership of self ... notice how I added the reference in getv(&self). Without this, the type (as far as I can tell) becomes basically useless.