Create Static Slice from another Static Slice? - rust

I have two static slices that look something like this:
static ITEMS_1: &'static [&str] = &[
    "abc",
    "def",
];
static ITEMS_2: &'static [&str] = &[
    "abc",
    "def",
    "ghi",
];
The first two entries of ITEMS_2 should match those of ITEMS_1. So, I want to define ITEMS_2 as ITEMS_1 appended with another item (or more) to avoid duplication. Is there any sensible way of doing this?

If you really only have 3 items, I wouldn't unpack the macro hammer.
While you can't do
static ITEMS_2: &'static [&str] = &[
    "abc",
    "def",
    "ghi",
];
static ITEMS_1: &'static [&str] = &ITEMS_2[0..2];
you can do
static ITEMS_2: &'static [&str] = &["abc", "def", "ghi"];
static ITEMS_1: &'static [&str] = match ITEMS_2.split_last() {
    None => &[],
    Some((_, i)) => i,
};
Alternatively, you could also reference the elements one by one
static ITEMS_1: &'static [&str] = &[
    "abc",
    "def",
];
static ITEMS_2: &'static [&str] = &[
    ITEMS_1[0],
    ITEMS_1[1],
    "ghi",
];

This can be done by encapsulating the common part in a macro:
macro_rules! with_common_strings {
    ($id:ident [ $($item:expr),* ]) => {
        static $id: &'static [&str] = &[
            "abc",
            "def",
            $($item),*
        ];
    }
}

with_common_strings!(ITEMS_1 []);
with_common_strings!(ITEMS_2 [ "ghi" ]);

fn main() {
    println!("Items 1: {:?}", ITEMS_1);
    println!("Items 2: {:?}", ITEMS_2);
}
Playground
Or with a more complex but more generic macro that doesn't hardcode the common part of the arrays:
macro_rules! make_cumul_arrays {
    (#rec ($($accum:expr),*)) => {};
    (#rec ($($accum:expr),*) $id:ident [ $($item:expr),* ] $($tail:tt)*) => {
        static $id: &'static [&str] = &[
            $($accum),*,
            $($item),*
        ];
        make_cumul_arrays!{ #rec ($($accum),*, $($item),*) $($tail)* }
    };
    ($id:ident [ $($item:expr),* ] $($tail:tt)*) => {
        static $id: &'static [&str] = &[ $($item),* ];
        make_cumul_arrays!{ #rec ($($item),*) $($tail)* }
    };
}

make_cumul_arrays!{
    ITEMS_1 [ "abc", "def" ]
    ITEMS_2 [ "ghi" ]
}
Playground

Related

Actix Web 4 always adds an "Ok" field

I recently changed actix_web from version 3 to 4.0.0-rc.1, and now every response gets wrapped in an "Ok"/"Err" field, as below:
{
    "Ok": [
        "item1",
        "item2"
    ]
}
It should return:
[
    "item1",
    "item2"
]
This is handler for the API
pub async fn get_data(db: web::Data<Pool>) -> HttpResponse {
    let res = web::block(move || db_get_data(db)).await;
    match res {
        Ok(data_vec) => HttpResponse::Ok().json(data_vec),
        Err(_) => HttpResponse::BadRequest().finish()
    }
}

fn db_get_data(db: web::Data<Pool>) -> Result<Vec<String>, ()> {
    let items = vec!["item1".to_string(), "item2".to_string()];
    Ok(items)
}
How could I solve this issue?
There are two layers of Result here: web::block wraps the outcome of the blocking closure in its own Result, and db_get_data returns another Result inside it. Your match only unwraps the outer layer, so .json() serializes the inner Result, which serde renders as an externally tagged {"Ok": ...} object. Match both layers instead:
let res = web::block(move || db_get_data(db)).await;
match res {
    Ok(Ok(data_vec)) => HttpResponse::Ok().json(data_vec),
    _ => HttpResponse::BadRequest().finish(),
}
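For readers without an actix setup handy, the same shape can be reproduced with plain std types. The outer Result below stands in for what web::block(...).await yields; only matching Ok(Ok(..)) unwraps both layers:

```rust
// Stand-in for the handler's blocking call.
fn db_get_data() -> Result<Vec<String>, ()> {
    Ok(vec!["item1".to_string(), "item2".to_string()])
}

// Stand-in for the HTTP response: "200 <body>" or "400".
fn respond(outer: Result<Result<Vec<String>, ()>, ()>) -> String {
    match outer {
        // Both layers must be Ok before we touch the data.
        Ok(Ok(data_vec)) => format!("200 {:?}", data_vec),
        _ => "400".to_string(),
    }
}

fn main() {
    // What `web::block(...).await` would hand back on success.
    let res: Result<_, ()> = Ok(db_get_data());
    println!("{}", respond(res));
}
```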

Is it possible to reduce the number of match arms caused by generics?

I am "serializing" and "deserializing" a generic struct to and from an SQLite database. The struct has two members whose values are of generic types, V and T, both constrained to the DataType trait. When I want to reconstruct these from the information in the database, I haven't been able to find a way around specifying match arms for every combination of V and T. Given that I will eventually have around 20 data types, that means 20 * 20 = 400 match arms. Is there any way around this? An unsafe solution is also acceptable.
Here is a MWE with two data types:
// A somewhat boilerplaty, but working solution to
// store and retrieve a generically typed struct
// in a SQLite database.
use rusqlite::{params, Connection, Statement};
use rusqlite::types::{ToSql, FromSql, ValueRef};

// This trait needs to be implemented for every type
// the GenericStruct will hold as a value.
trait DataType: ToSql + FromSql {
    type TargetType;
    fn convert(value: &ValueRef) -> Self::TargetType;
    fn data_type(&self) -> &'static str;
}

impl DataType for String {
    type TargetType = String;
    fn data_type(&self) -> &'static str {
        "String"
    }
    fn convert(value: &ValueRef) -> Self::TargetType {
        String::from(value.as_str().unwrap())
    }
}

impl DataType for usize {
    type TargetType = usize;
    fn data_type(&self) -> &'static str {
        "usize"
    }
    fn convert(value: &ValueRef) -> Self::TargetType {
        usize::try_from(value.as_i64().unwrap()).unwrap()
    }
}

// This is the generic struct that is persisted in SQLite.
#[derive(Debug)]
struct GenericStruct<V: DataType, T: DataType> {
    value: V,
    time: T
}

// This is just to keep the database stuff together.
struct Database<'db> {
    pub add_struct: Statement<'db>,
    pub get_struct: Statement<'db>
}

impl<'db> Database<'db> {
    pub fn new<'con>(connection: &'con Connection) -> Database<'con> {
        // the table will hold both the value and its Rust type
        connection.execute_batch("
            create table if not exists GenericStruct (
                value any not null,
                value_type text not null,
                time any not null,
                time_type text not null
            )
        ").unwrap();
        Database {
            add_struct: connection.prepare("
                insert into GenericStruct
                    (value, value_type, time, time_type)
                values
                    (?, ?, ?, ?)
            ").unwrap(),
            get_struct: connection.prepare("
                select
                    value, value_type, time, time_type
                from GenericStruct
            ").unwrap()
        }
    }
}

pub fn main() {
    let sqlite = Connection::open("generic.db").unwrap();
    let mut database = Database::new(&sqlite);
    let g1 = GenericStruct {
        value: String::from("Hello World"),
        time: 20090921
    };
    let g2 = GenericStruct {
        value: 42,
        time: String::from("now")
    };
    // Add the two structs to the sqlite database
    database.add_struct.execute(
        params![&g1.value, &g1.value.data_type(), &g1.time, &g1.time.data_type()]
    ).unwrap();
    database.add_struct.execute(
        params![&g2.value, &g2.value.data_type(), &g2.time, &g2.time.data_type()]
    ).unwrap();
    // Now there are two different types in the database.
    // Retrieve the two structs again.
    let mut rows = database.get_struct.query([]).unwrap();
    while let Some(row) = rows.next().unwrap() {
        let data_type = row.get_unwrap::<_, String>(1);
        let time_type = row.get_unwrap::<_, String>(3);
        // I want to lookup the converter instead
        // of explicitly listing alternatives...
        match (data_type.as_str(), time_type.as_str()) {
            ("String", "usize") => {
                println!("{:?}", GenericStruct {
                    value: String::convert(&row.get_ref_unwrap(0)),
                    time: usize::convert(&row.get_ref_unwrap(2))
                });
            },
            ("usize", "String") => {
                println!("{:?}", GenericStruct {
                    value: usize::convert(&row.get_ref_unwrap(0)),
                    time: String::convert(&row.get_ref_unwrap(2))
                });
            },
            _ => ()
        }
    }
}
I have also set it up in a playground here: https://play.rust-lang.org/?version=stable&mode=debug&edition=2021&gist=7bb2403d88c3318348ba50d90763c225
You can do that with the following (pretty complex) macro:
macro_rules! generate_match {
    // First, we generate a table of permutations.
    // Suppose we have the tuple (String, usize, ()).
    // The table we generate will be the following:
    // [
    //     [ String, usize, () ]
    //     [ usize, (), String ]
    //     [ (), String, usize ]
    // ]

    // Empty case
    { #generate_permutations_table
        $row:ident
        match ($e:expr)
        table = [ $($table:tt)* ]
        rest = [ ]
        transformed = [ $($transformed:ty,)* ]
    } => {
        generate_match! { #permutate_entry
            $row
            match ($e) { }
            table = [ $($table)* ]
        }
    };
    { #generate_permutations_table
        $row:ident
        match ($e:expr)
        table = [ $($table:tt)* ]
        rest = [ $current:ty, $($rest:ty,)* ]
        transformed = [ $($transformed:ty,)* ]
    } => {
        generate_match! { #generate_permutations_table
            $row
            match ($e)
            table = [
                $($table)*
                [ $current, $($rest,)* $($transformed,)* ]
            ]
            rest = [ $($rest,)* ]
            transformed = [ $($transformed,)* $current, ]
        }
    };

    // For each entry in the table, we generate all combinations of the first type with the others.
    // For example, for the entry [ String, usize, () ] we'll generate the following permutations:
    // [
    //     (String, usize)
    //     (String, ())
    // ]

    // Empty case
    { #permutate_entry
        $row:ident
        match ($e:expr) { $($match_tt:tt)* }
        table = [ ]
    } => {
        match $e {
            $($match_tt)*
            _ => {}
        }
    };
    { #permutate_entry
        $row:ident
        match ($e:expr) { $($match_tt:tt)* }
        table = [
            [ $current:ty, $($others:ty,)* ]
            $($table:tt)*
        ]
    } => {
        generate_match! { #generate_arm
            $row
            match ($e) { $($match_tt)* }
            table = [ $($table)* ]
            current = [ $current ]
            others = [ $($others,)* ]
        }
    };

    // Finally, we generate `match` arms from each pair.
    // For example, for the pair (String, usize):
    // ("String", "usize") => {
    //     let value = GenericStruct {
    //         value: <String as DataType>::convert(&row.get_ref_unwrap(0)),
    //         time: <usize as DataType>::convert(&row.get_ref_unwrap(2)),
    //     };
    //     // Process `value`...
    // }

    // Empty case: permutate the next table entry.
    { #generate_arm
        $row:ident
        match ($e:expr) { $($match_tt:tt)* }
        table = [ $($table:tt)* ]
        current = [ $current:ty ]
        others = [ ]
    } => {
        generate_match! { #permutate_entry
            $row
            match ($e) { $($match_tt)* }
            table = [ $($table)* ]
        }
    };
    { #generate_arm
        $row:ident
        match ($e:expr) { $($match_tt:tt)* }
        table = [ $($table:tt)* ]
        current = [ $current:ty ]
        others = [ $first_other:ty, $($others:ty,)* ]
    } => {
        generate_match! { #generate_arm
            $row
            match ($e) {
                $($match_tt)*
                (stringify!($current), stringify!($first_other)) => {
                    let value = GenericStruct {
                        value: <$current as DataType>::convert(&$row.get_ref_unwrap(0)),
                        time: <$first_other as DataType>::convert(&$row.get_ref_unwrap(2)),
                    };
                    // Here you actually do whatever you want with the value. Adjust for your needs.
                    println!("{:?}", value);
                }
            }
            table = [ $($table)* ]
            current = [ $current ]
            others = [ $($others,)* ]
        }
    };

    // Entry
    (
        match ($e:expr) from ($($ty:ty),+) in $row:ident
    ) => {
        generate_match! { #generate_permutations_table
            $row
            match ($e)
            table = [ ]
            rest = [ $($ty,)+ ]
            transformed = [ ]
        }
    };
}
Invoke with:
generate_match!(
    match ((data_type.as_str(), time_type.as_str()))
    from (String, usize /* more types... */)
    in row
);
Playground.
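If the macro feels too heavy, the quadratic blow-up can also be avoided without macros by dispatching each type parameter separately: one match picks V and calls a generic helper, and the helper's own match picks T, so N types cost roughly 2N handwritten arms instead of N². A simplified, database-free sketch (the string-based convert here is my stand-in, not the question's rusqlite API):

```rust
use std::fmt::Debug;

// Simplified stand-in for the question's DataType trait.
trait DataType: Debug {
    fn convert(raw: &str) -> Self
    where
        Self: Sized;
}

impl DataType for String {
    fn convert(raw: &str) -> Self {
        raw.to_string()
    }
}

impl DataType for usize {
    fn convert(raw: &str) -> Self {
        raw.parse().unwrap()
    }
}

#[derive(Debug)]
struct GenericStruct<V: DataType, T: DataType> {
    value: V,
    time: T,
}

// Second dispatch: V is already fixed, so there is one arm per possible T.
fn with_value<V: DataType>(raw_value: &str, time_type: &str, raw_time: &str) -> String {
    match time_type {
        "String" => format!(
            "{:?}",
            GenericStruct { value: V::convert(raw_value), time: String::convert(raw_time) }
        ),
        "usize" => format!(
            "{:?}",
            GenericStruct { value: V::convert(raw_value), time: usize::convert(raw_time) }
        ),
        other => panic!("unknown type {other}"),
    }
}

// First dispatch: one arm per possible V.
fn build(value_type: &str, raw_value: &str, time_type: &str, raw_time: &str) -> String {
    match value_type {
        "String" => with_value::<String>(raw_value, time_type, raw_time),
        "usize" => with_value::<usize>(raw_value, time_type, raw_time),
        other => panic!("unknown type {other}"),
    }
}

fn main() {
    println!("{}", build("usize", "42", "String", "now"));
}
```

The arm count is linear per dispatch level, though each new type still has to be registered in both matches.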

How to effectively build a byte array from calculated parts?

I need to build a byte array that represents commands to a device. It may look something like this:
let cmds = [
    0x01,             // cmd 1
    0x02,             // cmd 2
    0x03, 0xaa, 0xbb, // cmd 3
    0x04,             // cmd 4
    0x05, 0xaa,       // cmd 5
];
Some commands take parameters, some don't. Some parameters require calculations. Each command is fixed in size, so it's known at compile time how big the array needs to be.
It'd be nice to construct it like this, where I abstract groups of bytes into commands:
let cmds = [
    cmd1(),
    cmd2(),
    cmd3(0, true, [3, 4]),
    cmd4(),
    cmd5(0xaa),
];
I haven't found any way to do this with functions or macros. I am in no_std, so I am not using collections.
How can I achieve something resembling this in Rust?
You can have each command function return an array or Vec of bytes:
fn cmd1() -> [u8; 1] { [0x01] }
fn cmd2() -> [u8; 1] { [0x02] }
fn cmd3(_a: u8, _b: bool, _c: [u8; 2]) -> [u8; 3] { [0x03, 0xaa, 0xbb] }
fn cmd4() -> [u8; 1] { [0x04] }
fn cmd5(a: u8) -> Vec<u8> { vec![0x05, a] }
And then build your commands like so:
let cmds = [
    &cmd1() as &[u8],
    &cmd2(),
    &cmd3(0, true, [3, 4]),
    &cmd4(),
    &cmd5(0xaa),
];
This builds an array of slices of bytes. To get the full stream of bytes, use flatten:
println!("{:?}", cmds);
println!("{:?}", cmds.iter().copied().flatten().collect::<Vec<_>>());
[[1], [2], [3, 170, 187], [4], [5, 170]]
[1, 2, 3, 170, 187, 4, 5, 170]
You can make this more elaborate by returning some types that implement a Command trait and collecting them into an array of trait objects, but I'll leave that up to OP.
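A minimal sketch of that trait-object variation (the Command trait and the command types are my own names, not from the question; like the snippet above, it uses Vec, so it assumes std is available):

```rust
// Each command knows how to serialize itself into the byte stream.
trait Command {
    fn bytes(&self) -> Vec<u8>;
}

struct Cmd1;
struct Cmd5(u8);

impl Command for Cmd1 {
    fn bytes(&self) -> Vec<u8> {
        vec![0x01]
    }
}

impl Command for Cmd5 {
    fn bytes(&self) -> Vec<u8> {
        vec![0x05, self.0]
    }
}

fn main() {
    // An array of trait objects; flattening their bytes yields the stream.
    let cmds: [&dyn Command; 2] = [&Cmd1, &Cmd5(0xaa)];
    let stream: Vec<u8> = cmds.iter().flat_map(|c| c.bytes()).collect();
    println!("{:?}", stream);
}
```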
Edit: Here's a macro that can build the array directly, using the arrayvec crate:
use arrayvec::ArrayVec;
fn cmd1() -> [u8; 1] { [0x01] }
fn cmd2() -> [u8; 1] { [0x02] }
fn cmd3(_a: u8, _b: bool, _c: [u8; 2]) -> [u8; 3] { [0x03, 0xaa, 0xbb] }
fn cmd4() -> [u8; 1] { [0x04] }
fn cmd5(a: u8) -> [u8; 2] { [0x05, a] }
macro_rules! combine {
    ($($cmd:expr),+ $(,)?) => {
        {
            let mut vec = ArrayVec::new();
            $(vec.try_extend_from_slice(&$cmd).unwrap();)*
            vec.into_inner().unwrap()
        }
    }
}

fn main() {
    let cmds: [u8; 8] = combine![
        cmd1(),
        cmd2(),
        cmd3(0, true, [3, 4]),
        cmd4(),
        cmd5(0xaa),
    ];
    println!("{:?}", cmds);
}
If you're worried about performance, this example compiles the array into a single instruction:
movabs rax, -6195540508320529919 // equal to [0x01, 0x02, 0x03, 0xAA, 0xBB, 0x04, 0x05, 0xAA]
See it on the playground. It's limited to types that are Copy. The length of the array must be supplied, and it will panic at runtime if the array size doesn't match the combined size of the results.
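If the arrayvec dependency is unwanted, a similar effect is possible with a caller-provided buffer and a manual offset (my own sketch, not from the answer). It only touches core functionality, so the approach also fits no_std, and like the macro it checks sizes at runtime:

```rust
// Copy one command's bytes into the buffer at `at`, returning the new offset.
// Panics if the buffer is too small, mirroring the macro's runtime check.
fn push(buf: &mut [u8], at: usize, cmd: &[u8]) -> usize {
    buf[at..at + cmd.len()].copy_from_slice(cmd);
    at + cmd.len()
}

fn cmd1() -> [u8; 1] { [0x01] }
fn cmd5(a: u8) -> [u8; 2] { [0x05, a] }

fn main() {
    let mut cmds = [0u8; 3];
    let mut at = 0;
    at = push(&mut cmds, at, &cmd1());
    at = push(&mut cmds, at, &cmd5(0xaa));
    assert_eq!(at, cmds.len()); // every byte was written
    println!("{:?}", cmds);
}
```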
You can do it with no external dependencies if you do it as a macro:
macro_rules! cmd_array {
    (# [ $($acc:tt)* ]) => { [ $($acc)* ] };
    (# [ $($acc:tt)* ] cmd1(), $($tail:tt)*) => { cmd_array!{ # [ $($acc)* 0x01, ] $($tail)* } };
    (# [ $($acc:tt)* ] cmd2(), $($tail:tt)*) => { cmd_array!{ # [ $($acc)* 0x02, ] $($tail)* } };
    (# [ $($acc:tt)* ] cmd3($a:expr, $b:expr, $c:expr), $($tail:tt)*) => { cmd_array!{ # [ $($acc)* 0x03, 0xaa, 0xbb, ] $($tail)* } };
    (# [ $($acc:tt)* ] cmd4(), $($tail:tt)*) => { cmd_array!{ # [ $($acc)* 0x04, ] $($tail)* } };
    (# [ $($acc:tt)* ] cmd5($a:expr), $($tail:tt)*) => { cmd_array!{ # [ $($acc)* 0x05, $a, ] $($tail)* } };
    ($($tail:tt)*) => {
        cmd_array!(# [] $($tail)*)
    }
}

fn main() {
    let cmds: [u8; 8] = cmd_array![
        cmd1(),
        cmd2(),
        cmd3(0, true, [3, 4]),
        cmd4(),
        cmd5(0xaa),
    ];
    println!("{:?}", cmds);
}
This macro is built using an incremental TT muncher to parse the commands, combined with push-down accumulation to build the final array.
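To make those two terms concrete, here is a tiny standalone macro (unrelated to the byte-array problem) using the same pattern: each recursive step munches one token off the front and pushes onto an accumulator that is threaded through the calls:

```rust
macro_rules! count_idents {
    // Done: no tokens left, emit the accumulated expression.
    (@acc ($acc:expr)) => { $acc };
    // Munch one identifier and push `+ 1` onto the accumulator.
    (@acc ($acc:expr) $head:ident $($tail:tt)*) => {
        count_idents!(@acc ($acc + 1) $($tail)*)
    };
    // Entry point: start the accumulator at 0.
    ($($tokens:tt)*) => { count_idents!(@acc (0) $($tokens)*) };
}

fn main() {
    assert_eq!(count_idents!(a b c), 3);
    assert_eq!(count_idents!(), 0);
    println!("ok");
}
```

The `@acc` marker (spelled `#` in the answer above; the spelling is arbitrary) keeps the internal recursion rules from colliding with the public entry rule.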

Why does my nom parser not consume the entire input, leaving the last piece unparsed?

I'm trying to split a log line on spaces and commas in order to create a vector of Field and Separator tokens, as shown in the code below.
My problem is that nom doesn't seem to consume the entire log line, it leaves the last part unparsed - in this case 08:33:58).
main.rs
#![feature(rust_2018_preview)]
#[macro_use] extern crate nom;

#[derive(Debug, PartialEq)]
pub enum Token<'a> {
    Separator(&'a [u8]),
    Field(&'a [u8]),
}

named!(separator, is_a!(" ,"));
named!(not_sep, is_not!(" ,"));

named!(
    token<Token>,
    alt_complete!(
        separator => { |s| Token::Separator(s) } |
        not_sep => { |n| Token::Field(n) }
    )
);

named!(sequence<Vec<Token>>, many1!(token));

pub fn scan(input: &[u8]) -> Vec<Token> {
    let (_, seq) = sequence(input).unwrap();
    seq
}

fn main() {
}

#[cfg(test)]
mod tests {
    use std::str;
    use crate::Token;
    use crate::scan;

    #[test]
    fn parse_stuff() {
        let log = &b"docker INFO 2019-10-01 08:33:58,878 [1] schedule:run Running job Every 1 hour do _precache_systems_streaks() (last run: 2018-09-21 07:33:58, next run: 2018-09-21 08:33:58)";
        let seq = scan(&log[..]);
        for t in seq {
            let text = match t {
                Token::Field(data) => format!("f[{}]", str::from_utf8(data).unwrap()),
                Token::Separator(data) => format!("s[{}]", str::from_utf8(data).unwrap()),
            };
            println!("{}", text);
        }
    }
}
Cargo.toml
[dependencies]
nom = "4.0"
output
f[docker]
s[ ]
f[INFO]
s[ ]
f[2019-10-01]
s[ ]
f[08:33:58]
s[,]
f[878]
s[ ]
f[[1]]
s[ ]
f[schedule:run]
s[ ]
f[Running]
s[ ]
f[job]
s[ ]
f[Every]
s[ ]
f[1]
s[ ]
f[hour]
s[ ]
f[do]
s[ ]
f[_precache_systems_streaks()]
s[ ]
f[(last]
s[ ]
f[run:]
s[ ]
f[2018-09-21]
s[ ]
f[07:33:58]
s[, ]
f[next]
s[ ]
f[run:]
s[ ]
f[2018-09-21]
s[ ]
The issue you're running into is that nom is designed to always assume there may be more input, unless you tell it otherwise. Since you know your input here is complete, you need to feed the parsers your literal wrapped in a CompleteByteSlice (or, if you used a &str, a CompleteStr). These types are thin wrappers that nom uses to indicate that no more input is coming. A parser that fails to make a complete match then returns an Error instead of an Incomplete, and in this case it instructs the parser to consume that final token rather than ask for more characters.
For completeness: I implemented the following changes according to @Zarenor's answer, and the parser now consumes the entire input.
changes to main.rs
use nom::types::CompleteByteSlice;
use nom::IResult;

named!(separator<CompleteByteSlice, CompleteByteSlice>, is_a!(" ,"));
named!(not_separator<CompleteByteSlice, CompleteByteSlice>, is_not!(" ,"));

fn token<'a>(input: CompleteByteSlice<'a>) -> IResult<CompleteByteSlice<'a>, Token<'a>> {
    alt!(input,
        separator     => { |s: CompleteByteSlice<'a>| Token::Separator(s.0) } |
        not_separator => { |n: CompleteByteSlice<'a>| Token::Field(n.0) }
    )
}

named!(sequence<CompleteByteSlice, Vec<Token>>, many1!(token));

pub fn scan(input: &[u8]) -> Vec<Token> {
    let (_, seq) = sequence(CompleteByteSlice(input)).unwrap();
    seq
}
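As a cross-check, the same tokenization can be written with just the standard library (this sketch works on &str rather than &[u8] for brevity, and is my own code, not part of the fix above). It never withholds a trailing token, since it sees the whole input up front:

```rust
#[derive(Debug, PartialEq)]
enum Token<'a> {
    Separator(&'a str),
    Field(&'a str),
}

fn scan(input: &str) -> Vec<Token<'_>> {
    let is_sep = |c: char| c == ' ' || c == ',';
    let mut tokens = Vec::new();
    let mut rest = input;
    while !rest.is_empty() {
        let first_is_sep = rest.chars().next().map(is_sep).unwrap();
        // The current run ends at the first char whose separator-ness differs.
        let end = rest
            .find(|c: char| is_sep(c) != first_is_sep)
            .unwrap_or(rest.len());
        let (run, tail) = rest.split_at(end);
        tokens.push(if first_is_sep { Token::Separator(run) } else { Token::Field(run) });
        rest = tail;
    }
    tokens
}

fn main() {
    println!("{:?}", scan("a, b"));
}
```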

Why does this rust HashMap macro no longer work?

I previously used:
#[macro_export]
macro_rules! map(
    { T:ident, $($key:expr => $value:expr),+ } => {
        {
            let mut m = $T::new();
            $(
                m.insert($key, $value);
            )+
            m
        }
    };
)
To create objects, like this:
let mut hm = map! {HashMap, "a string" => 21, "another string" => 33};
However, this no longer seems to work. The compiler reports:
- Failed:
macros.rs:136:24: 136:31 error: no rules expected the token `HashMap`
macros.rs:136 let mut hm = map! {HashMap, "a string" => 21, "another string" => 33};
^~~~~~~
What's changed with macro definitions that makes this no longer work?
The basic example below works fine:
macro_rules! foo(
    {$T:ident} => { $T; };
)

struct Blah;

#[test]
fn test_map_create() {
    let mut bar = foo!{Blah};
}
So this seems to be some change to how the {T:ident, $(...),+} expansion is processed?
What's going on here?
You're missing the $ symbol before T: the matcher must be $T:ident, otherwise the macro has no metavariable to bind the identifier to.
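With the $ restored, the original macro works again. A minimal check (assuming std's HashMap):

```rust
use std::collections::HashMap;

// The question's macro, with `$T:ident` instead of `T:ident`.
macro_rules! map {
    { $T:ident, $($key:expr => $value:expr),+ } => {
        {
            let mut m = $T::new();
            $(
                m.insert($key, $value);
            )+
            m
        }
    };
}

fn main() {
    let hm = map! {HashMap, "a string" => 21, "another string" => 33};
    println!("{:?}", hm.len());
}
```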
How about this version (runnable on play.rust-lang.org)?
// nightly rust
#![feature(type_name_of_val)]
use std::collections::{BTreeMap, HashMap};

macro_rules! map {
    ($T:ty; $( $key:literal : $val:expr ),* ,) => {
        {
            // `$T` is the whole type, e.g. HashMap<String, i32>; a `ty`
            // fragment must be written as `<$T>` to call `new()` on it
            let mut m: $T = <$T>::new();
            $(
                m.insert($key.into(), $val);
            )*
            m
        }
    };
    // handle no trailing comma
    ($T:ty; $( $key:literal : $val:expr ),*) => {
        map!{$T; $( $key : $val ,)*}
    }
}

fn main() {
    let hm = map! {HashMap<String, i32>;
        "a": 1,
        "b": 2,
        "c": 3,
    };
    let bm = map! {BTreeMap<String, i32>;
        "1": 1,
        "2": 2,
        "3": 3
    };
    println!("typeof hm = {}", std::any::type_name_of_val(&hm));
    println!("typeof bm = {}", std::any::type_name_of_val(&bm));
    dbg!(hm, bm);
}
One more note: the macro here is defined with braces (macro_rules! map { ... }) rather than parentheses, so the definition is an item on its own and doesn't need a trailing semicolon.
