Casting &i32 as usize - rust

I have a function that is meant to make a move in Connect 4; it takes three parameters. The main issue is that the `square` variable is out of scope in the `for` loop, so it must be borrowed; however, I cannot cast an `&i32` to `usize`.
fn make_move<'playing>(board: &'playing Vec<&str>, column: i32, turn: &'playing i32) -> &'playing Vec<&'playing str> {
    let mut square = column;
    let new_board = board;
    for i in 0..6 {
        if new_board[(&square) as usize] == " " {
            square = &square + 7;
        } else {
            if turn % 2 == 0 {
                new_board[(&square - 7) as usize] = "●";
            } else {
                new_board[(&square - 7) as usize] = "○";
            }
        }
    }
    return new_board;
}
error[E0606]: casting `&i32` as `usize` is invalid
--> src/lib.rs:6:22
|
6 | if new_board[(&square) as usize] == " " {
| ---------^^^^^^^^^
| |
| cannot cast `&i32` as `usize`
| help: dereference the expression: `*(&square)`
I also just started using Rust, so I am sure my logic is flawed; maybe this is a dumb mistake, but help would be much appreciated.

In this case, you don't need the ampersands at all before square. square is an i32, which is an integer type, and you want to turn it into another integer type, so you don't need to borrow it at all. Since i32 is Copy, you can make cheap copies of square without a problem.
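For example, because `i32` is `Copy`, an integer can be passed and cast by value with no borrowing at all. A minimal standalone sketch (the `cell_index` helper is hypothetical, not part of the game code):

```rust
// `square` is passed by value; the cast copies it, so no borrow or
// dereference is needed, and the caller's variable remains usable.
fn cell_index(square: i32) -> usize {
    square as usize
}
```

Calling `cell_index(square)` leaves `square` fully usable afterwards, because the copy is what gets cast.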
Additionally, new_board is making a copy of your reference to board, not copying the actual board. Vec is not Copy, since copies are not cheap, so you'll need to use the clone method to create a new board and then return the actual Vec, not a reference. With the above change and this one, the result would look like this:
fn make_move<'playing>(
    board: &'playing Vec<&str>,
    column: i32,
    turn: &'playing i32,
) -> Vec<&'playing str> {
    let mut square = column;
    let mut new_board = board.clone();
    for i in 0..6 {
        if new_board[square as usize] == " " {
            square = square + 7;
        } else {
            if turn % 2 == 0 {
                new_board[(square - 7) as usize] = "●";
            } else {
                new_board[(square - 7) as usize] = "○";
            }
        }
    }
    return new_board;
}
Alternatively, you can make board a mutable reference, in which case you will modify both the old and new board, but you can continue to return a reference. That would look like this:
fn make_move<'playing>(
    board: &'playing mut Vec<&str>,
    column: i32,
    turn: &'playing i32,
) -> &'playing Vec<&'playing str> {
    let mut square = column;
    let new_board = board;
    for i in 0..6 {
        if new_board[square as usize] == " " {
            square = square + 7;
        } else {
            if turn % 2 == 0 {
                new_board[(square - 7) as usize] = "●";
            } else {
                new_board[(square - 7) as usize] = "○";
            }
        }
    }
    return new_board;
}

Related

Borrow inside a loop

I'm trying to learn Rust after many years of C++. I have a situation where the compiler is complaining about a borrow, and it doesn't seem to matter whether it is mutable or immutable. I don't seem to be able to use self as a parameter inside a loop that starts with: for item in self.func.drain(..). I've tried calling appropriate() as a function:
Self::appropriate(&self,&item,index)
and I have tried it as a method:
self.appropriate(&item,index)
but I get the same message in either case.
The function or method appropriate() is intended simply to examine the relationship among its parameters and return a bool without modifying anything. How can I call either a function or a method on self without violating borrowing rules? This program is a learning exercise from exercism.org and doesn't include a main(), so it won't run, but it should almost compile except for the error in question. Here's the code I have:
use std::collections::HashMap;

pub type Value = i32;
pub type Result = std::result::Result<(), Error>;

pub struct Forth {
    v: Vec<Value>,
    f: HashMap<String, usize>,
    s: Vec<Vec<String>>,
    func: Vec<String>,
}

#[derive(Debug, PartialEq)]
pub enum Error {
    DivisionByZero,
    StackUnderflow,
    UnknownWord,
    InvalidWord,
}

impl Forth {
    pub fn new() -> Forth {
        let mut temp: Vec<Vec<String>> = Vec::new();
        temp.push(Vec::new());
        Forth { v: Vec::<Value>::new(), f: HashMap::new(), s: temp, func: Vec::new() }
    }
    pub fn stack(&self) -> &[Value] {
        &self.v
    }
    pub fn eval(&mut self, input: &str) -> Result {
        self.v.clear();
        self.s[0].clear();
        let mut count = 0;
        {
            let temp: Vec<&str> = input.split(' ').collect();
            let n = temp.len() as i32;
            for x in 0..n as usize {
                self.s[0].push(String::from(temp[x]));
            }
        }
        let mut collecting = false;
        let mut xlist: Vec<(usize, usize)> = Vec::new();
        let mut sx: usize = 0;
        let mut z: i32 = -1;
        let mut x: usize;
        let mut n: usize = self.s[0].len();
        loop {
            count += 1;
            if count > 20 { break; }
            z += 1;
            x = z as usize;
            if x >= n { break; }
            z = x as i32;
            let word = &self.s[sx][x];
            if word == ";" {
                if collecting {
                    collecting = false;
                    let index: usize = self.s.len();
                    self.s.push(Vec::<String>::new());
                    for item in self.func.drain(..) {
                        if self.s[index].len() > 0 &&
                            Self::appropriate(&self, &item, index)
                        {
                            let sx = *self.f.get(&self.s[index][0]).unwrap();
                            let n = self.s[sx].len();
                            for x in 1..n as usize {
                                let symbol = self.s[sx][x].clone();
                                self.s[index].push(symbol);
                            }
                        } else {
                            self.s[index].push(item);
                        }
                    }
                    self.f.insert(self.s[index][0].clone(), index);
                    self.func.clear();
                    continue;
                }
                if 0 < xlist.len() {
                    (x, n) = xlist.pop().unwrap();
                    continue;
                }
                return Err(Error::InvalidWord);
            }
            if collecting {
                self.func.push(String::from(word));
                continue;
            }
            if Self::is_op(word) {
                if self.v.len() < 2 {
                    return Err(Error::StackUnderflow);
                }
                let b = self.v.pop().unwrap();
                let a = self.v.pop().unwrap();
                let c = match word.as_str() {
                    "+" => a + b,
                    "-" => a - b,
                    "*" => a * b,
                    "/" => { if b == 0 { return Err(Error::DivisionByZero); } a / b },
                    _ => 0,
                };
                self.v.push(c);
                continue;
            }
            match word.parse::<Value>() {
                Ok(value) => { self.v.push(value); continue; },
                _ => {}
            }
            if word == ":" {
                collecting = true;
                self.func.clear();
                continue;
            }
            if word == "drop" {
                if self.v.len() < 1 {
                    return Err(Error::StackUnderflow);
                }
                self.v.pop();
                continue;
            }
            if word == "dup" {
                if self.v.len() < 1 {
                    return Err(Error::StackUnderflow);
                }
                let temp = self.v[self.v.len() - 1];
                self.v.push(temp);
                continue;
            }
            if !self.f.contains_key(word) {
                return Err(Error::UnknownWord);
            }
            xlist.push((sx, n));
            sx = *self.f.get(word).unwrap();
            n = self.s[sx].len();
            z = 0;
        }
        Ok(())
    }
    fn is_op(input: &str) -> bool {
        match input { "+" | "-" | "*" | "/" => true, _ => false }
    }
    fn appropriate(&self, item: &str, index: usize) -> bool {
        false
    }
    fn prev_def_is_short(&self, index: usize) -> bool {
        if index >= self.s.len() {
            false
        } else {
            if let Some(&sx) = self.f.get(&self.func[0]) {
                self.s[sx].len() == 2
            } else {
                false
            }
        }
    }
}
The error message relates to the call to appropriate(). I haven't even written the body of that function yet; I'd like to get the parameters right first. The compiler's complaint is:
As a function call:
error[E0502]: cannot borrow `self` as immutable because it is also borrowed as mutable
--> src/lib.rs:85:47
|
81 | for item in self.func.drain(..) {
| -------------------
| |
| mutable borrow occurs here
| mutable borrow later used here
...
85 | Self::appropriate(&self,&item,index)
| ^^^^^ immutable borrow occurs here
For more information about this error, try `rustc --explain E0502`.
As a method call:
error[E0502]: cannot borrow `*self` as immutable because it is also borrowed as mutable
--> src/lib.rs:85:29
|
81 | for item in self.func.drain(..) {
| -------------------
| |
| mutable borrow occurs here
| mutable borrow later used here
...
85 | self.appropriate(&item,index)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ immutable borrow occurs here
For more information about this error, try `rustc --explain E0502`.
Is there any canonical way to deal with this situation?
The problem is that self.func.drain() consumes the elements contained in self.func, so exclusive (&mut) access to self.func is needed for the entire for loop.
If, during the iteration, you pass a reference to self as a whole, then its func member is reachable through that reference while the loop still holds exclusive access to it: Rust forbids that.
Since you use drain() to consume all the elements inside self.func, I suggest you swap this vector with an empty one just before the loop, then iterate over this other vector, which is no longer part of self.
No copy of the vector's contents is involved here; swap() only deals with pointers.
Here is an over-simplified version of your code, adapted accordingly.
struct Forth {
    func: Vec<String>,
}

impl Forth {
    fn eval(&mut self) {
        /*
        for item in self.func.drain(..) {
            self.appropriate(&self);
        }
        */
        let mut func = Vec::new();
        std::mem::swap(&mut self.func, &mut func);
        for item in func.drain(..) {
            let b = self.appropriate();
            println!("{:?} {:?}", item, b);
        }
    }
    fn appropriate(&self) -> bool {
        false
    }
}

fn main() {
    let mut f = Forth {
        func: vec!["aaa".into(), "bbb".into()],
    };
    f.eval();
}
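A closely related alternative, sketched here against the same simplified struct: std::mem::take replaces the field with its Default value (an empty Vec) and hands back the old contents, folding the swap into a single call.

```rust
struct Forth {
    func: Vec<String>,
}

impl Forth {
    fn eval(&mut self) {
        // take() leaves self.func empty and returns its former contents,
        // so the loop below no longer borrows from self at all.
        let func = std::mem::take(&mut self.func);
        for item in func {
            let b = self.appropriate(&item); // `self` is free to borrow again
            println!("{item} {b}");
        }
    }
    fn appropriate(&self, _item: &str) -> bool {
        false
    }
}
```

This is equivalent to the swap version, just slightly more concise.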

How do I return the sum of the three largest elements in an array?

I'm trying to return the sum of the 3 largest numbers in an array like this:
fn max_tri_sum(arr: &[i32]) -> i32 {
    arr.sort();
    arr[0] + arr[1] + arr[2]
}
but I keep getting this error:
error[E0596]: cannot borrow `*arr` as mutable, as it is behind a `&` reference
fn max_tri_sum(arr: &[i32]) -> i32 {
------ help: consider changing this to be a mutable reference: `&mut [i32]`
arr.sort();
^^^ `arr` is a `&` reference, so the data it refers to cannot be borrowed as mutable
I shouldn't change arr: &[i32] to arr: &mut [i32] because of some restrictions. So what can I do about it?
P.S: I tried to clone arr to a mutable variable but got other errors:
fn max_tri_sum(arr: &[i32]) -> i32 {
    let a: &mut [i32] = *arr.clone();
    a.sort();
    a[0] + a[1] + a[2]
}
You could also use a BinaryHeap to store the three largest values so far and replace the smallest value while looping through the array:
use std::collections::BinaryHeap;

fn max_tri_sum(arr: &[i32]) -> i32 {
    let mut heap = BinaryHeap::new();
    heap.push(-arr[0]);
    heap.push(-arr[1]);
    heap.push(-arr[2]);
    for e in arr[3..].iter() {
        if -e < *heap.peek().unwrap() {
            heap.pop();
            heap.push(-e);
        }
    }
    -heap.drain().sum::<i32>()
}
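If the negation trick feels opaque, std::cmp::Reverse achieves the same min-heap behavior without flipping signs. A sketch, still assuming arr has at least three elements:

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

// Keep the three largest values seen so far in a min-heap:
// Reverse inverts the ordering, so peek() yields the smallest of the three.
fn max_tri_sum(arr: &[i32]) -> i32 {
    let mut heap: BinaryHeap<Reverse<i32>> =
        arr[..3].iter().map(|&x| Reverse(x)).collect();
    for &e in &arr[3..] {
        if e > heap.peek().unwrap().0 {
            heap.pop(); // evict the current smallest of the three
            heap.push(Reverse(e));
        }
    }
    heap.drain().map(|Reverse(x)| x).sum()
}
```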
Or if you prefer the sort option, you can convert the slice to a vector:
fn max_tri_sum(arr: &[i32]) -> i32 {
    let mut arr1 = arr.to_vec();
    arr1.sort_by(|a, b| b.cmp(a));
    arr1[0] + arr1[1] + arr1[2]
}
In Rust, you can't hold both a shared reference and a mutable reference to the same value in the same scope.
You have several choices:
You could simply loop over the slice to collect the values you want. Something like this (there's probably a more elegant way with iterators):
fn max_tri_sum(arr: &[i32]) -> i32 {
    // Invariant: maxes[0] <= maxes[1] <= maxes[2] holds after every iteration.
    // Starting from i32::MIN also handles arrays of negative numbers.
    let mut maxes = [i32::MIN; 3];
    for &el in arr {
        if el > maxes[0] {
            maxes[0] = el;
            if maxes[0] > maxes[1] {
                maxes.swap(0, 1);
            }
            if maxes[1] > maxes[2] {
                maxes.swap(1, 2);
            }
        }
    }
    maxes[0] + maxes[1] + maxes[2]
}
You can create a new Vec from the slice, and then do all the operations on it (which requires allocation, but should be fine for small Vecs).
fn max_tri_sum(arr: &[i32]) -> i32 {
    let mut arr = Vec::from(arr);
    arr.sort();
    arr[0] + arr[1] + arr[2]
}
I would also like to point out that sort sorts from smallest to biggest, so indices 0, 1, 2, ... hold the smallest values in the array, which I don't think is what you want!
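To keep the sort-based shape while actually summing the three largest, you can sort ascending and take the last three elements instead. A sketch, assuming the slice has at least three elements:

```rust
fn max_tri_sum(arr: &[i32]) -> i32 {
    let mut v = arr.to_vec();
    v.sort_unstable(); // ascending, so the three largest end up at the back
    v[v.len() - 3..].iter().sum()
}
```

sort_unstable is the usual choice for primitive types: it avoids the allocation that the stable sort makes, and stability is irrelevant for plain integers.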

Why does wasm-opt fail in wasm-pack builds when generating a function returning a string?

I'm working through the Rust WASM tutorial for Conway's game of life.
One of the simplest functions in the file is called Universe.render (it's the one for rendering a string representing game state). It's causing an error when I run wasm-pack build:
Fatal: error in validating input
Error: failed to execute `wasm-opt`: exited with exit code: 1
full command: "/home/vaer/.cache/.wasm-pack/wasm-opt-4d7a65327e9363b7/wasm-opt" "/home/vaer/src/learn-rust/wasm-game-of-life/pkg/wasm_game_of_life_bg.wasm" "-o" "/home/vaer/src/learn-rust/wasm-game-of-life/pkg/wasm_game_of_life_bg.wasm-opt.wasm" "-O"
To disable `wasm-opt`, add `wasm-opt = false` to your package metadata in your `Cargo.toml`.
If I remove that function, the code builds without errors. If I replace it with the following function, the build fails with the same error:
pub fn wtf() -> String {
    String::from("wtf")
}
It seems like any function that returns a String causes this error. Why?
Following is the entirety of my code:
mod utils;

use wasm_bindgen::prelude::*;

// When the `wee_alloc` feature is enabled, use `wee_alloc` as the global
// allocator.
#[cfg(feature = "wee_alloc")]
#[global_allocator]
static ALLOC: wee_alloc::WeeAlloc = wee_alloc::WeeAlloc::INIT;

// Begin game of life impl
use std::fmt;

#[wasm_bindgen]
#[repr(u8)]
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum Cell {
    Dead = 0,
    Alive = 1,
}

#[wasm_bindgen]
pub struct Universe {
    width: u32,
    height: u32,
    cells: Vec<Cell>,
}

impl fmt::Display for Universe {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        for line in self.cells.as_slice().chunks(self.width as usize) {
            for &cell in line {
                let symbol = if cell == Cell::Dead { '◻' } else { '◼' };
                write!(f, "{}", symbol)?;
            }
            write!(f, "\n")?;
        }
        Ok(())
    }
}

impl Universe {
    fn get_index(&self, row: u32, column: u32) -> usize {
        (row * self.width + column) as usize
    }
    fn live_neighbor_count(&self, row: u32, column: u32) -> u8 {
        let mut count = 0;
        for delta_row in [self.height - 1, 0, 1].iter().cloned() {
            for delta_col in [self.width - 1, 0, 1].iter().cloned() {
                if delta_row == 0 && delta_col == 0 {
                    continue;
                }
                let neighbor_row = (row + delta_row) % self.height;
                let neighbor_col = (column + delta_col) % self.width;
                let idx = self.get_index(neighbor_row, neighbor_col);
                count += self.cells[idx] as u8;
            }
        }
        count
    }
}

/// Public methods, exported to JavaScript.
#[wasm_bindgen]
impl Universe {
    pub fn tick(&mut self) {
        let mut next = self.cells.clone();
        for row in 0..self.height {
            for col in 0..self.width {
                let idx = self.get_index(row, col);
                let cell = self.cells[idx];
                let live_neighbors = self.live_neighbor_count(row, col);
                let next_cell = match (cell, live_neighbors) {
                    // Rule 1: Any live cell with fewer than two live neighbours
                    // dies, as if caused by underpopulation.
                    (Cell::Alive, x) if x < 2 => Cell::Dead,
                    // Rule 2: Any live cell with two or three live neighbours
                    // lives on to the next generation.
                    (Cell::Alive, 2) | (Cell::Alive, 3) => Cell::Alive,
                    // Rule 3: Any live cell with more than three live
                    // neighbours dies, as if by overpopulation.
                    (Cell::Alive, x) if x > 3 => Cell::Dead,
                    // Rule 4: Any dead cell with exactly three live neighbours
                    // becomes a live cell, as if by reproduction.
                    (Cell::Dead, 3) => Cell::Alive,
                    // All other cells remain in the same state.
                    (otherwise, _) => otherwise,
                };
                next[idx] = next_cell;
            }
        }
        self.cells = next;
    }
    pub fn new() -> Universe {
        let width = 64;
        let height = 64;
        let cells = (0..width * height)
            .map(|i| {
                if i % 2 == 0 || i % 7 == 0 {
                    Cell::Alive
                } else {
                    Cell::Dead
                }
            })
            .collect();
        Universe {
            width,
            height,
            cells,
        }
    }
    pub fn render(&self) -> String {
        self.to_string()
    }
}
Simply removing the render function at the bottom of this file causes the build to succeed. Replacing the render function with any function returning a String causes the build to fail. Why?
It turns out that this is not expected behavior; instead it is a bug with wasm-pack.
The issue can be resolved for now by adding the following to the project's Cargo.toml:
[package.metadata.wasm-pack.profile.release]
wasm-opt = ["-Oz", "--enable-mutable-globals"]
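Alternatively, as the build error itself suggests, you can disable wasm-opt entirely. This gives up the size optimization pass but also unblocks the build:

```toml
# Skip the wasm-opt post-processing step altogether
[package.metadata.wasm-pack.profile.release]
wasm-opt = false
```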

Build tree from formatted input

I have been having a problem trying to come up with a solution to read a binary tree from formatted input and build said tree in Rust. The borrow checker has been driving me crazy, so I decided to take it to the community.
Basically, the input looks like this:
1 2 3 4 5 6 N N 7 8
and it represents a tree that looks like this:
            1
          /   \
         2     3
        / \   / \
       4   5 6   N
      / \ /
     N  7 8
with N meaning NULL.
To read this in C++, I would usually read the tree level by level, doing a kind of breadth-first construction using a queue.
I was attempting the same approach in Rust, but that is where hell broke loose on me. I am a beginner in Rust and of course I am being scolded by the borrow checker.
I am using the following TreeNode structure:
#[derive(Debug, PartialEq, Eq)]
pub struct TreeNode {
    pub val: i32,
    pub left: Option<Rc<RefCell<TreeNode>>>,
    pub right: Option<Rc<RefCell<TreeNode>>>,
}

impl TreeNode {
    #[inline]
    pub fn new(val: i32) -> Self {
        TreeNode {
            val,
            left: None,
            right: None,
        }
    }
}
And here is the piece of code that is doing the reading of the tree from STDIN:
fn read_b_tree<T: io::BufRead>(scan: &mut Scanner<T>, size: usize)
    -> Result<Option<Rc<RefCell<TreeNode>>>, Box<dyn Error>> {
    if size == 0 {
        Ok(None)
    } else {
        let r = scan.token::<String>();
        if r == "N" {
            Ok(None)
        } else {
            let mut q = VecDeque::new();
            let root = Rc::new(RefCell::new(TreeNode::new(r.parse::<i32>()?)));
            q.push_back(&root);
            let mut cnt: usize = 1;
            while cnt < size && !q.is_empty() {
                let node = match q.pop_front() {
                    Some(node) => Ok(node),
                    _ => Err("Queue should not be empty"),
                }?;
                let v = Rc::clone(node);
                let left = scan.token::<String>();
                let right = scan.token::<String>();
                if left != "N" {
                    let left_n = Rc::new(RefCell::new(TreeNode::new(left.parse::<i32>()?)));
                    v.borrow_mut().left = Some(Rc::clone(&left_n));
                    q.push_back(&left_n);
                }
                cnt += 1;
                if right != "N" {
                    let right_n = Rc::new(RefCell::new(TreeNode::new(right.parse::<i32>()?)));
                    v.borrow_mut().right = Some(Rc::clone(&right_n));
                    q.push_back(&right_n);
                }
                cnt += 1;
            }
            Ok(Some(root))
        }
    }
}
As you can imagine, I ran into lifetime issues with this approach, such as:
error[E0597]: `right_n` does not live long enough
--> src/main.rs:146:33
|
125 | while cnt < size && !q.is_empty() {
| - borrow later used here
...
146 | q.push_back(&right_n);
| ^^^^^^^^ borrowed value does not live long enough
147 | }
| - `right_n` dropped here while still borrowed
I would be highly thankful to anyone who could give me some pointers as to how to work my way out of this situation.
The following code is shorter and demonstrates your problem:
fn read_b_tree() -> Result<Option<Rc<RefCell<TreeNode>>>, Box<dyn Error>> {
    let r = String::new();
    let mut q = VecDeque::new();
    let root = Rc::new(RefCell::new(TreeNode::new(r.parse::<i32>()?)));
    q.push_back(&root);
    while !q.is_empty() {
        if r != "N" {
            let left_n = Rc::new(RefCell::new(TreeNode::new(r.parse::<i32>()?)));
            q.push_back(&left_n);
        }
    }
    Ok(Some(root))
}
The inferred type of q is VecDeque<&Rc<RefCell<TreeNode>>>, yet there is no reason to choose this over VecDeque<Rc<RefCell<TreeNode>>> (Rc instead of &Rc): Rc is already a reference, so there is no need for a second level of indirection.
I think you inserted the & because without it there was an error use of moved value .... That error is correct: you use root after moving it into q. But that is not a problem, because you wanted to move a Rc into q, and you can easily get a new one by cloning it: root.clone() or Rc::clone(&root).
The fixed example is:
fn read_b_tree() -> Result<Option<Rc<RefCell<TreeNode>>>, Box<dyn Error>> {
    let r = String::new();
    let mut q = VecDeque::new();
    let root = Rc::new(RefCell::new(TreeNode::new(r.parse::<i32>()?)));
    q.push_back(root.clone());
    while !q.is_empty() {
        if r != "N" {
            let left_n = Rc::new(RefCell::new(TreeNode::new(r.parse::<i32>()?)));
            q.push_back(left_n); // this works in the example, but you will have to clone here too
        }
    }
    Ok(Some(root))
}
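Applying the same fix to the full breadth-first build means storing owned Rc clones in the queue. Here is a minimal self-contained sketch under simplified assumptions (tokens already split into a slice instead of coming from the Scanner, and a hypothetical build_tree helper in place of read_b_tree):

```rust
use std::cell::RefCell;
use std::collections::VecDeque;
use std::rc::Rc;

#[derive(Debug)]
struct TreeNode {
    val: i32,
    left: Option<Rc<RefCell<TreeNode>>>,
    right: Option<Rc<RefCell<TreeNode>>>,
}

impl TreeNode {
    fn new(val: i32) -> Self {
        TreeNode { val, left: None, right: None }
    }
}

// Build a tree breadth-first from level-order tokens; "N" marks a missing node.
fn build_tree(tokens: &[&str]) -> Option<Rc<RefCell<TreeNode>>> {
    let mut iter = tokens.iter();
    let first = iter.next()?;
    if *first == "N" {
        return None;
    }
    let root = Rc::new(RefCell::new(TreeNode::new(first.parse().ok()?)));
    let mut q = VecDeque::new();
    q.push_back(Rc::clone(&root)); // owned Rc clones, not &Rc references
    while let Some(node) = q.pop_front() {
        for side in 0..2 {
            match iter.next() {
                Some(&tok) if tok != "N" => {
                    let child = Rc::new(RefCell::new(TreeNode::new(tok.parse().ok()?)));
                    q.push_back(Rc::clone(&child));
                    if side == 0 {
                        node.borrow_mut().left = Some(child);
                    } else {
                        node.borrow_mut().right = Some(child);
                    }
                }
                _ => {} // "N" token or end of input: leave the child as None
            }
        }
    }
    Some(root)
}
```

Because every queue entry owns its Rc, nothing in the queue borrows from a local that goes out of scope, which is exactly the lifetime error the original &right_n version hit.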

Type mismatch for std::ops trait "Not"

I am implementing a general matrix solver. In doing so I make use of the "Not" operator to get around another issue which I'll explain below. However, when invoking the function in my tests, I get the following error:
error[E0271]: type mismatch resolving `<i32 as std::ops::Not>::Output == bool`
--> src/matrix.rs:223:15
|
90 | pub fn reduce<T>(mat: &mut Matrix<T>) -> Result<Matrix<T>, &'static str>
| ------
...
97 | + Not<Output = bool>
| ------------- required by this bound in `matrix::reduce`
...
223 | let res = reduce(&mut mat).unwrap();
| ^^^^^^ expected i32, found bool
error: aborting due to previous error
This is particularly confusing because I am not sure how else I would implement the Not trait and have it function properly. When bool is the output type, it compiles just fine but seems to bark during execution.
Here's my code:
/// Performs a reduction operation on a given matrix, giving the reduced row echelon form
pub fn reduce<T>(mat: &mut Matrix<T>) -> Result<Matrix<T>, &'static str>
where
    T: num_traits::Zero
        + num_traits::One
        + Mul<T, Output = T>
        + Add<T, Output = T>
        + Sub<T, Output = T>
        + Not<Output = bool>
        + Neg<Output = T>
        + Div<T, Output = T>
        + Copy,
{
    let exchange = |matrix: &mut Matrix<T>, i: usize, j: usize| {
        matrix.data.swap(i, j);
    };
    let scale = |matrix: &mut Matrix<T>, row: usize, factor: T| {
        for i in 0..matrix.data[row].len() {
            matrix.data[row][i] = matrix.data[row][i] * factor;
        }
    };
    let row_replace = |matrix: &mut Matrix<T>, i: usize, j: usize, factor: T| {
        for k in 0..matrix.data[j].len() {
            matrix.data[j][k] = matrix.data[j][k] + (matrix.data[i][k] * factor);
        }
    };
    // Reduction steps
    let n = mat.data.len();
    for i in 0..n {
        // Find a pivot point
        for j in i..n {
            if !mat.data[j][i] { // <------- Error Here *********
                if i != j {
                    exchange(mat, i, j);
                    break;
                }
            }
            if j == n - 1 {
                return Err("No pivot found");
            }
        }
        // Put zeros below diagonal
        for j in i + 1..n {
            row_replace(mat, i, j, -mat.data[j][i] / mat.data[i][i]);
        }
    }
    // Back substitution (bottom up)
    for i in (0..n - 1).rev() {
        for j in 0..i {
            row_replace(mat, i, j, -mat.data[j][i] / mat.data[i][i]);
        }
    }
    // Add 1's to the diagonal
    for i in 0..n {
        scale(mat, i, T::one() / mat.data[i][i]);
    }
    Ok(mat.clone())
}

#[test]
fn it_row_reduces() {
    let mat = Matrix {
        data: vec![vec![2, 1, 4], vec![1, 2, 5]],
        nrows: 2,
        ncols: 3,
    };
    let comp = Matrix {
        data: vec![vec![1, 0, 1], vec![0, 1, 2]],
        nrows: 2,
        ncols: 3,
    };
    let res = reduce(&mut mat).unwrap();
    assert_eq!(res.data, comp.data);
}
Originally, the code looked like the following:
if mat.data[j][i] != T::zero() {
    if i != j {
        exchange(mat, i, j);
        break;
    }
}
But it seems that even with the Not trait added to the function signature, this operation would never work, giving the following error:
binary operation `!=` cannot be applied to type `T`: T
I'm looking to figure out where I'm going wrong with this code, and whether my use of generics for this comparison is the most idiomatic way to do it in Rust. Any additional feedback is appreciated. I can provide the struct as well; I just wanted to keep the question as brief as possible.
In Rust, ! serves as both logical NOT and bitwise NOT, depending on the argument type. It performs a logical NOT when the argument is a bool and a bitwise NOT when the argument is an integer type. The only built-in type that implements Not<Output = bool> is bool.
You should stick to the mat.data[j][i] != T::zero() comparison. The != operator is provided by the PartialEq trait, so instead of the T: Not<Output = bool> bound, you'll want T: PartialEq<T> or simply T: Eq.
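To see the PartialEq version in isolation: the pivot test only needs equality comparison, not Not. A minimal sketch with a hypothetical find_pivot helper (the zero value is passed in explicitly here just to keep the sketch free of the num_traits dependency):

```rust
// Returns the index of the first nonzero entry in a column, if any.
// Only PartialEq and Copy are required for the comparison.
fn find_pivot<T: PartialEq + Copy>(column: &[T], zero: T) -> Option<usize> {
    column.iter().position(|&x| x != zero)
}
```

In the real reduce function, the zero argument would simply be T::zero() from the existing num_traits::Zero bound.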
