Deserializing a String with into_serde makes the app panic - rust

With a friend of mine, we're trying to use the serde_json crate to deserialize some message sent by a WebSocket.
We are having a specific error, and we managed to recreate it with the following snippet of code:
use serde::{Deserialize, Serialize};
use wasm_bindgen::JsValue;

#[derive(Deserialize, Debug)]
struct EffetSer {
    test: String
}

fn main() {
    let test_value = JsValue::from_str("{\"test\": \"value\"}");
    let test_value: EffetSer = test_value.into_serde().unwrap();
    log::error!("WOW : {:?}", test_value);
}
Our Cargo.toml has the following dependencies:
wasm-bindgen = { version = '0.2.63', features = ['serde-serialize'] }
serde = { version = '1.0', features = ["derive"] }
serde_json = '1.0.55'
js-sys = '0.3.40'
The error is the following:
app.js:310 panicked at 'called `Result::unwrap()` on an `Err` value: Error("invalid type: string \"{\\\"test\\\": \\\"value\\\"}\", expected struct EffetSer", line: 1, column: 23)'
Any help would be very appreciated, as we're still struggling to understand what we're doing wrong and why we cannot deserialize our String.

The problem is likely a misunderstanding of into_serde's semantics.
According to the documentation, it works like this:
Invokes JSON.stringify on this value and then parses the resulting JSON into an arbitrary Rust value.
In other words, its semantics are as follows:
- convert each component of the JsValue into the corresponding Serde internal element;
- deserialize the required type from the resulting tree of components.
Now, what does this mean in our case? Well, you created the JsValue using JsValue::from_str, which, again according to the documentation,
Creates a new JS value which is a string.
So the JsValue here is not an object, as you are likely assuming; it is a primitive - a string which simply happens to have the shape of an object's JSON representation. Then, when you invoke into_serde, Serde sees that string not as input to be parsed but as an already-deserialized value, and a string cannot be transformed into a struct.
Now, what to do? There are several ways to fix this code:
1. First and most obvious: don't use JsValue at all; deserialize from the &str directly with serde_json::from_str.
2. Use js_sys::JSON::parse to get an object-like JsValue from the string, and then convert it to EffetSer with into_serde. This is likely to be less efficient, since it requires a round-trip through JSON.parse and JSON.stringify to turn the string into an object and then back into a string.
3. Write your own method to convert JsValue to EffetSer directly. I'm not sure this is possible, however, since I wasn't able to find a way to extract a single field from a JS object.

Related

AWS Rust - convert DynamoDB result JSON to (web) JSON

Currently I have an S3 static website with a JavaScript request that hits a Lambda, which returns an Item from my DynamoDB database. I think I am very close to success on this. It seems like all I need to do is convert the DynamoDB flavor of JSON to the normal JSON that gets passed over the internet.
This is some of what I have within my Rust Lambda.
use aws_config::meta::region::RegionProviderChain;
use aws_sdk_dynamodb::model::AttributeValue;
use aws_sdk_dynamodb::Client;
use lambda_runtime::{service_fn, Error as LambdaError, LambdaEvent};
use serde_json::{json, Value};
...
let item = client
    .get_item()
    .table_name("example_table")
    .key("example_key", AttributeValue::S(key_value.to_string()))
    .send()
    .await?;
let mapped_value = item.item().unwrap();
let json_value = json!({ "statusCode": 200, "body": format!("{:?}", mapped_value) });
Ok(json_value)
It returns a correct response, but formatted in the DynamoDB version of JSON. Here is a brief example of a piece of it.
{"items": L([M({"load": N("2"), "name": S("Superlaser"), "item_type": S("Weapon")})])}
So when my JavaScript on the frontend receives this response, it errors:
Error SyntaxError: Unexpected token 'N', ..."apon_lr": N("10"), ""... is not valid JSON
I have done some Googling and come across Rusoto and serde_dynamo, but I have a lot of troubles trying to mix and match these crates... and it doesn't feel right? Is there not a conversion within aws_sdk_dynamodb?
Quite similar to this StackExchange question, but for Rust rather than Node.js or Python: Formatting DynamoDB data to normal JSON in AWS Lambda
What I ended up doing was using a combination of serde_dynamo::from_item and serde_json.
I set up a struct:
use serde_derive::{Deserialize, Serialize};
use serde_dynamo::from_item;
use serde_json::{json, Value};

#[derive(Serialize, Deserialize)]
struct S { // just for example
    txt: String,
}

let struc: S = from_item(mapped_value.clone()).expect("Should convert Item to S");
let json_value = serde_json::to_value(struc).expect("Should serialize to JSON");
The struct needs to match the structure of the object you're getting from DynamoDB. If you are returning a list of items, serde_dynamo has a corresponding from_items call.

Best practice for optional in proto or default value

We are setting up our Rust services and using prost-build to bridge between proto <-> Rust land. Our proto definitions are in proto3.
Let's take the following proto message:
message Test {
    string id = 1;
    string body = 2;
    string maybe_nullable_thing = 3;
}
This generates a struct like so:
pub struct Test {
    #[prost(string, tag="1")]
    pub id: ::prost::alloc::string::String,
    #[prost(string, tag="2")]
    pub body: ::prost::alloc::string::String,
    #[prost(string, tag="3")]
    pub maybe_nullable_thing: ::prost::alloc::string::String,
}
In other languages where we have tried this, the fields of a proto message are optional by design and can be left out. In our example there can be cases where maybe_nullable_thing is not set.
I can work around this issue by using the optional keyword, although I remember that it was not considered best practice to do so (maybe I am mistaken?).
In terms of best practice with proto3 and Rust in general, is it okay to use the optional keyword? If I use serde along with my Test struct, I can see the default values of all the fields being set to "".to_owned() (i.e. the empty string).
So I am not sure what the best practice is here. I would love to get some pointers on the best way forward.
Looking at the readme for Tokio's PROST! tool, it appears their advice is to wrap any non-repeated, non-scalar field, and any optional field, in Option<T>. This may or may not be different for prost-build, but it should give you a good reference for what's expected when using proto3 and Rust.
In general, however, you should wrap any value you want to be optional in Option<T>. This is not a bad practice, this is the default, standard way to represent "maybe nullable things" in Rust.
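To make the difference concrete, here is a sketch of the struct shape prost generates once the field is declared `optional string maybe_nullable_thing = 3;`. The struct below is hand-written to stand in for generated code (the real output also carries #[prost(...)] attributes), but the Option handling is the same:

```rust
// Hand-written stand-in for what prost-build generates from proto3's
//   optional string maybe_nullable_thing = 3;
#[derive(Debug, Default)]
struct Test {
    id: String,
    body: String,
    maybe_nullable_thing: Option<String>,
}

fn main() {
    let msg = Test {
        id: "1".to_string(),
        body: "hello".to_string(),
        maybe_nullable_thing: None, // the field was left out on the wire
    };
    // Option lets you distinguish "unset" from "set to the empty string":
    match &msg.maybe_nullable_thing {
        Some(v) => println!("set to {v}"),
        None => println!("not set"),
    }
}
```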

How to avoid clones when using postgres_types::Json?

I'm currently writing a Rust app which uses tokio-postgres, and I need to make a SQL request to fetch some data based on a jsonb column. The problem is that tokio-postgres uses a particular type (postgres_types::Json), which is used like this: &Json::<Struct>(struct_var).
The struct var can't be a reference, so the Json takes ownership, which raises a problem as I need to use one of the struct's fields afterwards.
I could solve the problem using clone, but I wanted to know first whether there is another solution which would not hurt performance.
Here is the function :
pub async fn user_exists_ipv4(
    pool: &Pool,
    ip: IpAddr,
    device: &Device,
) -> Result<Option<Uuid>, String> {
    // Get a connection from the pool
    let conn = get_connection(pool).await?;
    let country = &device.country[..];
    // Get the user id from the database
    let result = conn
        .query(
            FETCH_USER_QUERY_FOR_V4,
            &[
                &ip.to_string(),
                &Json::<Device>(device.clone()),
                &country.to_string(),
            ],
        )
        .await?;
    ...
You can use references with Json; it is simply a wrapper that implements ToSql for types that are Serialize-able, and that includes &T where T: Serialize. So you can use it with device directly, as it is:
&Json::<&Device>(device)
You also don't need to annotate the type of Json explicitly since it can be inferred directly from what you pass to it. The code above could be more succinctly written as:
&Json(device)

Enum attribute in lit/lit-element

We are trying to build a component with a property variant that should only be set to "primary" or "secondary" (enum). Currently, we are just declaring the attribute as a String, but we were wondering if there is a better way for handling enums? For example, should we validate somehow that the current value is part of the enum? Should we throw an error if not?
I asked this question on Slack and the answers I got lean towards declaring the property as String and use hasChanged() to display a warning in the console if the property value is invalid.
Standard HTML elements accept any string as attribute values and don't throw exceptions, so web components should probably behave the same way.
This all sounds reasonable to me.
If you're using TypeScript I'd recommend just using strings. You can use export type MyEnum = 'primary' | 'secondary' to declare it and then use @property() fooBar: MyEnum to get build-time checking. You can use @ts-check to do this in plain JS with @type {MyEnum} too.
This works well if the enums are for component options or that map to server-side enums that will get validated again.
However, if you want to validate user input against the enum, or loop over its values, this works less well: at runtime the JS has no visibility of the type. You need an object dictionary, something like:
const MyEnum = Object.freeze({
    primary: 'primary',
    secondary: 'secondary'
});

// Enforce the type in TS
let value: keyof typeof MyEnum;

// Validate
const validated = MyEnum[input.toLowerCase()];

// Loop
for (const enumVal of Object.keys(MyEnum)) ...

// Or convert to a different value type
const MyEnum = Object.freeze({
    primary: 1,
    secondary: 2
});
These are somewhat idiosyncratic. Again, if you're using TypeScript it has an enum keyword that compiles to something like this and I'd use that rather than rolling your own. Strings are the better option unless you need to validate, loop or convert the values.

Declaring a map in a separate file and reading its contents

I'm trying to declare a map in a separate file, and then access it from my main function.
I want Rust's equivalent (or whatever comes closest) to this C++ map:
static const std::map<std::string, std::vector<std::string>> table = {
    { "a", { "foo" } },
    { "e", { "bar", "baz" } }
};
This is my attempt in Rust.
table.rs
use std::container::Map;

pub static table: &'static Map<~str, ~[~str]> = (~[
    (~"a", ~[~"foo"]),
    (~"e", ~[~"bar", ~"baz"])
]).move_iter().collect();
main.rs
mod table;

fn main() {
    println(fmt!("%?", table::table));
}
The above gives two compiler errors in table.rs, saying "constant contains unimplemented expression type".
I also have the feeling that the map declaration is less than optimal for the purpose.
Finally, I'm using Rust 0.8.
As Chris Morgan noted, Rust doesn't allow you to run user code to initialize global variables before main is entered, unlike C++. So you are mostly limited to primitive types that you can initialize with literal expressions. This is, as far as I know, part of the design and unlikely to change, even though the particular error message is probably not final.
Depending on your use case, you might want to:
- change your code so you're manually passing your map as an argument to all the functions that want to use it (ugh!);
- use task-local storage to initialize a TLS slot with your map early on, and then refer to it later in the same task (ugh?);
- or use unsafe code and a static mut variable, with your map wrapped in an Option so it can start its life as None (ugh!).
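The question targets Rust 0.8, but for anyone landing here on modern Rust: the standard library now covers this use case directly with std::sync::OnceLock (or the lazy_static/once_cell crates on older toolchains). A sketch of the equivalent of the C++ table above:

```rust
use std::collections::HashMap;
use std::sync::OnceLock;

// Lazily initialized global map: the closure runs once, on first access,
// and the resulting map lives for the rest of the program.
fn table() -> &'static HashMap<&'static str, Vec<&'static str>> {
    static TABLE: OnceLock<HashMap<&'static str, Vec<&'static str>>> = OnceLock::new();
    TABLE.get_or_init(|| {
        let mut m = HashMap::new();
        m.insert("a", vec!["foo"]);
        m.insert("e", vec!["bar", "baz"]);
        m
    })
}

fn main() {
    assert_eq!(table()["a"], vec!["foo"]);
    println!("{:?}", table()["e"]);
}
```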
