Comparing chars in a string to a specific character

As part of Advent of Code 2020 day 3, I'm trying to compare characters in a string to a specific character.
fn main() {
    let str_foo = "...##..#
##.#..#.";
    for char in str_foo.chars() {
        println!("{}", char == "#");
    }
}
The error I get is expected char, found &str.
I'm struggling to find a clean way to cast either the left or right side of the equality check so that they can be compared.

In Rust, double quotes create string literals (&str) and single quotes create character literals (char), so use single quotes for the comparison. Fixed:
fn main() {
    let str_foo = "...##..#
##.#..#.";
    for char in str_foo.chars() {
        println!("{}", char == '#');
    }
}
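For the Advent of Code use case, once each character compares cleanly against a char literal, counting the # cells in a row follows directly. A minimal sketch (the helper name and the sample row are illustrative, not from the original question):

```rust
fn count_trees(row: &str) -> usize {
    // Count how many characters in the row equal '#'.
    row.chars().filter(|&c| c == '#').count()
}

fn main() {
    println!("{}", count_trees("...##..#")); // prints 3
}
```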

Related

How to convert ascii char to int like c/c++ in rust

In C/C++, an ASCII char can be converted to a number:
int('a')
But how do you do this in Rust?
You can simply convert a character to u32 this way: let code = 'a' as u32;. This gives you the Unicode scalar value of the character.
fn main() {
    let s = "0123";
    for c in s.chars() {
        println!("{c} -> {}", c as u32);
    }
}
But if you are strictly required to work with ASCII rather than Unicode, you can also check out the ascii crate.
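The cast also round-trips: char::from_u32 converts a scalar value back to a char, returning None for values that are not valid scalar values (such as surrogates). A small sketch:

```rust
fn main() {
    let code = 'a' as u32; // 97
    assert_eq!(code, 97);
    // Going back is fallible: not every u32 is a valid Unicode scalar value.
    assert_eq!(char::from_u32(code), Some('a'));
    assert_eq!(char::from_u32(0xD800), None); // surrogate range is invalid
}
```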

Can a BigInteger be truncated to an i32 in Rust?

In Java, intValue() gives back a truncated portion of the BigInteger instance. I wrote a similar program in Rust but it appears not to truncate:
extern crate num;

use num::bigint::{BigInt, RandBigInt};
use num::ToPrimitive;

fn main() {
    println!("Hello, world!");
    truncate_num(
        BigInt::parse_bytes(b"423445324324324324234324", 10).unwrap(),
        BigInt::parse_bytes(b"22447", 10).unwrap(),
    );
}

fn truncate_num(num1: BigInt, num2: BigInt) -> i32 {
    println!("Truncation of {} is {:?}.", num1, num1.to_i32());
    println!("Truncation of {} is {:?}.", num2, num2.to_i32());
    return 0;
}
The output I get from this is
Hello, world!
Truncation of 423445324324324324234324 is None.
Truncation of 22447 is Some(22447).
How can I achieve this in Rust? Should I try a conversion to String and then truncate manually? This would be my last resort.
Java's intValue() returns the lowest 32 bits of the integer. This could be done by a bitwise-AND operation x & 0xffffffff. A BigInt in Rust doesn't support bitwise manipulation, but you could first convert it to a BigUint which supports such operations.
fn truncate_biguint_to_u32(a: &BigUint) -> u32 {
    use std::u32;

    let mask = BigUint::from(u32::MAX);
    (a & mask).to_u32().unwrap()
}
Converting BigInt to BigUint will be successful only when it is not negative. If the BigInt is negative (-x), we could find the lowest 32 bits of its absolute value (x), then negate the result.
fn truncate_bigint_to_u32(a: &BigInt) -> u32 {
    use num_traits::Signed;

    let was_negative = a.is_negative();
    let abs = a.abs().to_biguint().unwrap();
    let truncated = truncate_biguint_to_u32(&abs);
    if was_negative {
        truncated.wrapping_neg()
    } else {
        truncated
    }
}
You may use truncate_bigint_to_u32(a) as i32 if you need a signed number.
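These masking semantics match a plain as cast between primitive integers, which keeps only the lowest 32 bits, just like Java's intValue(). A quick sanity check using stdlib types only (no BigInt involved):

```rust
fn main() {
    // `as i32` truncates an i64 to its low 32 bits.
    let x: i64 = 0x1_2345_6789;
    assert_eq!(x as i32, 0x2345_6789);

    // Negative values wrap the same way as the wrapping_neg approach:
    // the low 32 bits of -0x1_0000_0001 are 0xFFFF_FFFF, i.e. -1 as i32.
    let y: i64 = -0x1_0000_0001;
    assert_eq!(y as i32, -1);
}
```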
There is also a to_signed_bytes_le() method with which you could extract the bytes and decode that into a primitive integer directly:
fn truncate_bigint_to_u32_slow(a: &BigInt) -> u32 {
    let mut bytes = a.to_signed_bytes_le();
    bytes.resize(4, 0);
    bytes[0] as u32 | (bytes[1] as u32) << 8 | (bytes[2] as u32) << 16 | (bytes[3] as u32) << 24
}
This method is extremely slow compared to the above methods and I don't recommend using it.
There's no natural truncation of a big integer into a smaller one. Either it fits or you have to decide what value you want.
You could do this:
println!("Truncation of {} is {:?}.", num1, num1.to_i32().unwrap_or(-1));
or
println!("Truncation of {} is {:?}.", num1, num1.to_i32().unwrap_or(std::i32::MAX));
but your application logic should probably dictate what's the desired behavior when the returned option contains no value.

How to find the last occurrence of a char in a string?

I want to find the index of the last forward slash / in a string. For example, I have the string /test1/test2/test3 and I want to find the location of the slash before test3. How can I achieve this?
In Python, I would use rfind but I can't find anything like that in Rust.
You need str::rfind. Note that it returns an Option<usize>, so you will need to account for that when checking its result:
fn main() {
    let s = "/test1/test2/test3";
    let pos = s.rfind('/');
    println!("{:?}", pos); // prints "Some(12)"
}
#ljedrz's solution returns a byte index, not a character index, so for strings containing non-ASCII characters it will not match what Python's rfind reports. Here is a slower solution that gives you the character index:
let s = "/test1/test2/test3";
let pos = s.chars().count() - s.chars().rev().position(|c| c == '/').unwrap() - 1;
Or you can use this as a function:
fn rfind_utf8(s: &str, chr: char) -> Option<usize> {
    if let Some(rev_pos) = s.chars().rev().position(|c| c == chr) {
        Some(s.chars().count() - rev_pos - 1)
    } else {
        None
    }
}
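To make the byte-index versus character-index distinction concrete, here is a small illustration with a non-ASCII string (the string and helper are only for demonstration). The byte index from rfind is the one you can slice with:

```rust
fn rfind_utf8(s: &str, chr: char) -> Option<usize> {
    // Character offset of the last occurrence, counted from the front.
    s.chars()
        .rev()
        .position(|c| c == chr)
        .map(|rev_pos| s.chars().count() - rev_pos - 1)
}

fn main() {
    let s = "é/a"; // 'é' is 2 bytes in UTF-8
    assert_eq!(s.rfind('/'), Some(2)); // byte index
    assert_eq!(rfind_utf8(s, '/'), Some(1)); // character index
    // Only the byte index is valid for slicing:
    assert_eq!(&s[..s.rfind('/').unwrap()], "é");
}
```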

How to convert a Rust char to an integer so that '1' becomes 1?

I am trying to find the sum of the digits of a given number. For example, 134 will give 8.
My plan is to convert the number into a string using .to_string() and then use .chars() to iterate over the digits as characters. Then I want to convert every char in the iteration into an integer and add it to a variable. I want to get the final value of this variable.
I tried using the code below to convert a char into an integer:
fn main() {
    let x = "123";
    for y in x.chars() {
        let z = y.parse::<i32>().unwrap();
        println!("{}", z + 1);
    }
}
But it results in this error:
error[E0599]: no method named `parse` found for type `char` in the current scope
--> src/main.rs:4:19
|
4 | let z = y.parse::<i32>().unwrap();
| ^^^^^
This code does exactly what I want, but it first has to convert each char into a string and then into an integer before adding it to sum.
fn main() {
    let mut sum = 0;
    let x = 123;
    let x = x.to_string();
    for y in x.chars() {
        // converting `y` to a string and then to an integer
        let z = y.to_string().parse::<i32>().unwrap();
        // incrementing `sum` by `z`
        sum += z;
    }
    println!("{}", sum);
}
The method you need is char::to_digit. It converts a char to the number it represents in the given radix.
You can also use Iterator::sum to conveniently calculate the sum of a sequence:
fn main() {
    const RADIX: u32 = 10;
    let x = "134";
    println!("{}", x.chars().map(|c| c.to_digit(RADIX).unwrap()).sum::<u32>());
}
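Because to_digit returns an Option, non-digits (or digits outside the radix) come back as None rather than panicking, and filter_map can skip them instead of unwrapping. A brief sketch:

```rust
fn main() {
    assert_eq!('7'.to_digit(10), Some(7));
    assert_eq!('a'.to_digit(10), None); // not a decimal digit
    assert_eq!('a'.to_digit(16), Some(10)); // but valid in hex

    // filter_map drops the Nones instead of unwrapping them:
    let sum: u32 = "1x3".chars().filter_map(|c| c.to_digit(10)).sum();
    assert_eq!(sum, 4);
}
```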
my_char as u32 - '0' as u32
Now, there's a lot more to unpack about this answer.
It works because the ASCII (and thus UTF-8) encodings have the Arabic numerals 0-9 ordered in ascending order. You can get the scalar values and subtract them.
However, what should it do for values outside this range? What happens if you provide 'p'? It returns 64. What about '.'? This will panic. And '♥' will return 9781.
Strings are not just bags of bytes. They are UTF-8 encoded and you cannot just ignore that fact. Every char can hold any Unicode scalar value.
That's why strings are the wrong abstraction for the problem.
From an efficiency perspective, allocating a string seems inefficient. Rosetta Code has an example of using an iterator which only does numeric operations:
struct DigitIter(usize, usize);

impl Iterator for DigitIter {
    type Item = usize;
    fn next(&mut self) -> Option<Self::Item> {
        if self.0 == 0 {
            None
        } else {
            let ret = self.0 % self.1;
            self.0 /= self.1;
            Some(ret)
        }
    }
}

fn main() {
    println!("{}", DigitIter(1234, 10).sum::<usize>());
}
If c is your character you can just write:
c as i32 - 0x30
Test with:
let c: char = '2';
let n: i32 = c as i32 - 0x30;
println!("{}", n);
output:
2
NB: 0x30 is '0' in ASCII table, easy enough to remember!
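As the earlier answer points out, the subtraction trick is only safe for characters already known to be ASCII digits. A guarded variant (the helper name is illustrative), assuming you want None for everything else:

```rust
fn ascii_digit_value(c: char) -> Option<u32> {
    // Only subtract when c is in '0'..='9'; otherwise the result is
    // meaningless (e.g. 'p' gives 64) or underflows for code points below '0'.
    if c.is_ascii_digit() {
        Some(c as u32 - '0' as u32)
    } else {
        None
    }
}

fn main() {
    assert_eq!(ascii_digit_value('2'), Some(2));
    assert_eq!(ascii_digit_value('p'), None);
    assert_eq!(ascii_digit_value('♥'), None);
}
```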
Another way is to iterate over the characters of your string and convert and add them using fold.
fn sum_of_string(s: &str) -> u32 {
    s.chars().fold(0, |acc, c| c.to_digit(10).unwrap_or(0) + acc)
}

fn main() {
    let x = "123";
    println!("{}", sum_of_string(x));
}

Concise and safe way to replace certain ASCII chars with other ASCII chars in a string

I want to write a function which takes a mutable string and checks if the first and last character are the " character. If so, those two characters should be replaced with the backtick character `. I've come up with this solution:
fn replace_wrapping_char(s: &mut String) {
    if s.len() > 1 && s.starts_with('"') && s.ends_with('"') {
        unsafe {
            let v = s.as_mut_vec();
            v[0] = '`' as u8;
            *v.last_mut().unwrap() = '`' as u8;
        }
    }
}
This seems to work (yes, '`'.is_ascii() returns true), but it uses unsafe and looks a bit ugly to me.
Is there a safe and concise way to achieve what I want?
Here is a safe, shortened version of that function, although it won't be as memory efficient: it creates a copy and reassigns the given string, so a pure function returning a new String would probably be a better fit here. It also relies on the fact that the double quote character is one byte long in UTF-8.
fn replace_wrapping_char(s: &mut String) {
    if s.len() > 1 && s.starts_with('"') && s.ends_with('"') {
        *s = format!("`{}`", &s[1..s.len() - 1])
    }
}
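Another safe option that edits in place without rebuilding the whole middle is String::replace_range, relying on the same precondition that both " and ` are one byte in UTF-8 (so the ranges stay on char boundaries and the length is unchanged). A sketch:

```rust
fn replace_wrapping_char(s: &mut String) {
    if s.len() > 1 && s.starts_with('"') && s.ends_with('"') {
        let last = s.len() - 1;
        // Replace the first and last bytes; both old and new chars are
        // one byte, so no reallocation of the middle is needed.
        s.replace_range(0..1, "`");
        s.replace_range(last.., "`");
    }
}

fn main() {
    let mut s = String::from("\"hello\"");
    replace_wrapping_char(&mut s);
    assert_eq!(s, "`hello`");
}
```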
