Is there syntax for declaring character literals in hexadecimal notation? - rust

Something like
const X: char = '0x10FFFC';

Yes, use \u{..}:
const X: char = '\u{10FFFC}';
Playground
One trick for cases like this is to play with the compiler. For example, the following code produces an error with a helpful hint about what to do:
const X: char = 0x10FFFC as char;
error: only `u8` can be cast into `char`
--> src/lib.rs:1:17
|
1 | const X: char = 0x10FFFC as char;
| ^^^^^^^^^^^^^^^^ help: use a `char` literal instead: `'\u{10FFFC}'`
|
= note: `#[deny(overflowing_literals)]` on by default
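If the code point is only known at runtime, the same conversion can be done with `char::from_u32`, which returns `None` for surrogates and out-of-range values. A minimal sketch:

```rust
fn main() {
    // Checked conversion from a runtime u32 code point to a char.
    let c = char::from_u32(0x10FFFC).expect("valid Unicode scalar value");
    assert_eq!(c, '\u{10FFFC}');

    // Surrogate code points (and values above 0x10FFFF) yield None:
    assert!(char::from_u32(0xD800).is_none());
    println!("{}", c.escape_unicode());
}
```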

Related

Rust std::fmt formating parameters. Is [fill] from variable possible? [duplicate]


Compare DateTime with fixed offset to DateTime with time zone

How does one convert between DateTime<FixedOffset> and DateTime<Tz>, in order to subtract to get a duration, compare inequality, or reassign?
use chrono::DateTime;
use chrono_tz::America::New_York;

fn main() {
    let mut a = DateTime::parse_from_rfc3339("2022-06-01T10:00:00").unwrap();
    let b = a.with_timezone(&New_York);
    a = b;
}
An attempt to do this directly yields the error:
error[E0308]: mismatched types
--> src/main.rs:13:9
|
11 | let mut a = DateTime::parse_from_rfc3339("2022-06-01T10:00:00").unwrap();
| ------------------------------------------------------------ expected due to this value
12 | let b = a.with_timezone(&New_York);
13 | a = b;
| ^ expected struct `FixedOffset`, found enum `Tz`
|
= note: expected struct `DateTime<FixedOffset>`
found struct `DateTime<Tz>`
Playground
Convert the timezone of b into the timezone of a before assigning it:
a = b.with_timezone(&a.timezone());

Does the rust format! macro provide for user-specified fill characters

With Rust I can have a user-specified width in my call to format!,
format!("{:>width$}", result.clone(), width=v as usize )
I can even specify the fill character (like the - below)
format!("{:->width$}", result.clone(), width=v as usize )
Or, with a 0,
format!("{:0>width$}", result.clone(), width=v as usize )
But is there a way to have that user-specified? I've tried the following but it doesn't work,
format!("{:fill$>width$}", result.clone(), fill='0', width=v as usize )
I get the following error,
error: invalid format string: expected `'}'`, found `'>'`
--> src/sequence/renderer.rs:124:28
|
124 | |v| Ok(format!("{:fill$>width$}", result.clone(), fill='0', width=v as usize ))
| - ^ expected `}` in format string
| |
| because of this opening brace
|
= note: if you intended to print `{`, you can escape it using `{{`
error: could not compile `letter-sequence` due to previous error
warning: build failed, waiting for other jobs to finish...
error: build failed
It's not possible at the moment. Looking at the grammar of format strings, there is no way to pass the fill parameter as an argument.
format := '{' [ argument ] [ ':' format_spec ] '}'
argument := integer | identifier
format_spec := [[fill]align][sign]['#']['0'][width]['.' precision]type
fill := character
align := '<' | '^' | '>'
sign := '+' | '-'
width := count
precision := count | '*'
type := '' | '?' | 'x?' | 'X?' | identifier
count := parameter | integer
parameter := argument '$'
As you can see, the fill specifier maps directly to a character token, whereas width can be substituted with an identifier (width -> count -> parameter -> argument '$' -> identifier '$').
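As a standard-library workaround, you can build the padding yourself and skip the fill specifier entirely. A sketch (the `pad_left` helper below is hypothetical, not part of `std`):

```rust
// Left-pads `s` to `width` columns using an arbitrary fill character.
// (Hypothetical helper; std's format! cannot take the fill as an argument.)
fn pad_left(s: &str, fill: char, width: usize) -> String {
    let pad = width.saturating_sub(s.chars().count());
    let mut out = String::with_capacity(s.len() + pad);
    for _ in 0..pad {
        out.push(fill);
    }
    out.push_str(s);
    out
}

fn main() {
    assert_eq!(pad_left("42", '0', 5), "00042");
    assert_eq!(pad_left("hello", '-', 3), "hello"); // never truncates
    println!("{}", pad_left("42", '*', 6));
}
```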
You can use runtime-fmt (https://crates.io/crates/runtime-fmt). The idea is to do something like this:
!!! not tested !!!
#[macro_use] extern crate runtime_fmt;

fn main() {
    let result = 400;
    let v = 45;
    // Build the format string at runtime, then hand it to rt_format!.
    let f = format!("{{:{fill}>{width}}}", width = v as usize, fill = '0');
    let s = rt_format!(f, result).unwrap();
    println!("{}", s);
}
Some interesting links:
https://github.com/rust-lang/rfcs/issues/543
https://fmt.dev/latest/syntax.html

Proper bounds on associated types to allow default methods on traits [duplicate]

I wanted to implement a function computing the number of digits within any generic type of integer. Here is the code I came up with:
extern crate num;
use num::Integer;

fn int_length<T: Integer>(mut x: T) -> u8 {
    if x == 0 {
        return 1;
    }
    let mut length = 0u8;
    if x < 0 {
        length += 1;
        x = -x;
    }
    while x > 0 {
        x /= 10;
        length += 1;
    }
    length
}

fn main() {
    println!("{}", int_length(45));
    println!("{}", int_length(-45));
}
And here is the compiler output
error[E0308]: mismatched types
--> src/main.rs:5:13
|
5 | if x == 0 {
| ^ expected type parameter, found integral variable
|
= note: expected type `T`
found type `{integer}`
error[E0308]: mismatched types
--> src/main.rs:10:12
|
10 | if x < 0 {
| ^ expected type parameter, found integral variable
|
= note: expected type `T`
found type `{integer}`
error: cannot apply unary operator `-` to type `T`
--> src/main.rs:12:13
|
12 | x = -x;
| ^^
error[E0308]: mismatched types
--> src/main.rs:15:15
|
15 | while x > 0 {
| ^ expected type parameter, found integral variable
|
= note: expected type `T`
found type `{integer}`
error[E0368]: binary assignment operation `/=` cannot be applied to type `T`
--> src/main.rs:16:9
|
16 | x /= 10;
| ^ cannot use `/=` on type `T`
I understand that the problem comes from my use of constants within the function, but I don't understand why the trait specification as Integer doesn't solve this.
The documentation for Integer says it implements the PartialOrd, etc. traits with Self (which I assume refers to Integer). By using integer constants which also implement the Integer trait, aren't the operations defined, and shouldn't the compiler compile without errors?
I tried suffixing my constants with i32, but the error message is the same, replacing _ with i32.
Many things are going wrong here:
As Shepmaster says, 0 and 1 cannot be converted to everything implementing Integer. Use Zero::zero and One::one instead.
10 can definitely not be converted to anything implementing Integer; you need to use NumCast for that.
a /= b is not sugar for a = a / b but a separate trait (DivAssign) that Integer does not require.
-x is a unary operation which is not part of Integer but requires the Neg trait (since it only makes sense for signed types).
Here's an implementation. Note that you need a bound of Neg<Output = T> to make sure that negation results in the same type as T:
extern crate num;

use num::{Integer, NumCast};
use std::ops::Neg;

fn int_length<T>(mut x: T) -> u8
where
    T: Integer + Neg<Output = T> + NumCast,
{
    if x == T::zero() {
        return 1;
    }
    let mut length = 0;
    if x < T::zero() {
        length += 1;
        x = -x;
    }
    while x > T::zero() {
        x = x / NumCast::from(10).unwrap();
        length += 1;
    }
    length
}

fn main() {
    println!("{}", int_length(45));
    println!("{}", int_length(-45));
}
The problem is that the Integer trait can be implemented by anything. For example, you could choose to implement it on your own struct! There wouldn't be a way to convert the literal 0 or 1 to your struct. I'm too lazy to show an example of implementing it, because there's 10 or so methods. ^_^
num::Zero and num::One
This is why Zero::zero and One::one exist. You can (very annoyingly) create all the other constants from repeated calls to those.
use num::{One, Zero}; // 0.4.0

fn three<T>() -> T
where
    T: Zero + One,
{
    let mut three = Zero::zero();
    for _ in 0..3 {
        three = three + One::one();
    }
    three
}
From and Into
You can also use the From and Into traits to convert to your generic type:
use num::Integer; // 0.4.0
use std::ops::{DivAssign, Neg};

fn int_length<T>(mut x: T) -> u8
where
    T: Integer + Neg<Output = T> + DivAssign,
    u8: Into<T>,
{
    let zero = 0.into();
    if x == zero {
        return 1;
    }
    let mut length = 0u8;
    if x < zero {
        length += 1;
        x = -x;
    }
    while x > zero {
        x /= 10.into();
        length += 1;
    }
    length
}

fn main() {
    println!("{}", int_length(45));
    println!("{}", int_length(-45));
}
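For what it's worth, the same pattern also works without the num crate at all, by sourcing the constants from `From<u8>` and using only std trait bounds. A sketch (it excludes i8, which does not implement `From<u8>`):

```rust
use std::ops::{DivAssign, Neg};

// Std-only variant: the constants 0 and 10 come from `From<u8>`,
// so no num crate is needed. (Excludes i8, which lacks `From<u8>`.)
fn int_length<T>(mut x: T) -> u8
where
    T: PartialOrd + DivAssign + Neg<Output = T> + From<u8>,
{
    let zero = T::from(0u8);
    if x == zero {
        return 1;
    }
    let mut length = 0u8;
    if x < zero {
        length += 1;
        x = -x;
    }
    while x > zero {
        x /= T::from(10u8);
        length += 1;
    }
    length
}

fn main() {
    assert_eq!(int_length(45i32), 2);
    assert_eq!(int_length(-45i64), 3);
    assert_eq!(int_length(0i32), 1);
    println!("{}", int_length(-45i64));
}
```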
See also:
How do I use floating point number literals when using generic types?

Why do I get the error "expected integral variable, found Option" when matching on an integer?

I am trying to use match in Rust. I wrote a function:
fn main() {
    let try = 3;
    let x = match try {
        Some(number) => number,
        None => 0,
    };
}
But I'm getting the error:
error[E0308]: mismatched types
--> src/main.rs:4:9
|
4 | Some(number) => number,
| ^^^^^^^^^^^^ expected integral variable, found enum `std::option::Option`
|
= note: expected type `{integer}`
found type `std::option::Option<_>`
error[E0308]: mismatched types
--> src/main.rs:5:9
|
5 | None => 0,
| ^^^^ expected integral variable, found enum `std::option::Option`
|
= note: expected type `{integer}`
found type `std::option::Option<_>`
I tried something like let try: i32 = 3; to make sure that try is an integral value, but I still get the same error.
I think you want this:
fn main() {
let try = Some(3);
let x = match try {
Some(number) => number,
None => 0,
};
}
The issue is that you're trying to match an integer against Some(...) and None, which are Options. This doesn't really make sense... an integer can never be None.
Instead, I think you want to use the type Option<i32> and convert it to an i32 by using a default value. The above code should accomplish that. Note that if that's all you're trying to do, this is an easier way:
let x = try.unwrap_or(0);
In match expressions the type of the value you are matching on must correspond to the variants in the block following it; in your case this means that try either needs to be an Option or the match block needs to have integral variants.
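To illustrate the second option, here is a sketch where the match arms are integral patterns, so the types of the scrutinee and the arms line up:

```rust
fn main() {
    let value = 3;
    // Integer patterns in the arms match the integer scrutinee directly.
    let x = match value {
        3 => 30,
        _ => 0, // integer matches must be exhaustive, hence the catch-all
    };
    assert_eq!(x, 30);
    println!("{}", x);
}
```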
I highly recommend reading The Rust Book; Rust is strongly typed and this is one of the most basic concepts you will need to familiarize yourself with.
