AWS Rust - convert DynamoDB result JSON to (web) JSON - rust

Currently I have an S3 static website with a JavaScript request that hits a Lambda, which returns an Item from my DynamoDB table. I think I am very close to success on this. It seems like all I need to do is convert the DynamoDB flavour of JSON to the plain JSON that normally gets passed over the internet.
This is some of what I have within my Rust Lambda.
use aws_config::meta::region::RegionProviderChain;
use aws_sdk_dynamodb::model::AttributeValue;
use aws_sdk_dynamodb::Client;
use lambda_runtime::{service_fn, Error as LambdaError, LambdaEvent};
use serde_json::{json, Value};
...
...
let item = client
    .get_item()
    .table_name("example_table")
    .key("example_key", AttributeValue::S(key_value.to_string()))
    .send()
    .await?;
let mapped_value = item.item().unwrap();
let json_value = json!({ "statusCode": 200, "body": format!("{:?}", mapped_value) });
Ok(json_value)
It returns a correct response, but formatted in the DynamoDB version of JSON. Here is a brief example of a piece of it.
{"items": L([M({"load": N("2"), "name": S("Superlaser"), "item_type": S("Weapon")})])}
So when the JavaScript on my frontend receives this response, it errors:
Error SyntaxError: Unexpected token 'N', ..."apon_lr": N("10"), ""... is not valid JSON
I have done some Googling and come across Rusoto and serde_dynamo, but I had a lot of trouble trying to mix and match these crates... and it doesn't feel right. Is there not a conversion within aws_sdk_dynamodb itself?
Quite similar to this Stack Exchange question, but for Rust rather than Node.js or Python: Formatting DynamoDB data to normal JSON in AWS Lambda.

What I ended up doing was using a combination of serde_dynamo::from_item and serde_json.
I set up a struct:
use serde_derive::{Deserialize, Serialize};
use serde_dynamo::from_item;
use serde_json::{json, Value};

#[derive(Serialize, Deserialize)]
struct S { // just for example
    txt: String,
}

let struc: S = from_item(mapped_value.clone()).expect("Should convert Item to S");
let json_value = serde_json::to_value(struc).expect("Should serialize to JSON");
The struct needs to match the structure of the item you're getting back from DynamoDB. If you are returning a list of items, there is a corresponding serde_dynamo::from_items call.
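For intuition about what that conversion is doing, here is a std-only toy sketch. The Attr enum and the function below are hypothetical stand-ins, not the real aws_sdk_dynamodb types: DynamoDB tags every value with its type (S, N, L, M, ...), and "converting to normal JSON" essentially means dropping those tags.

```rust
// Toy, std-only illustration of the unwrapping that serde_dynamo performs.
// `Attr` is a hypothetical stand-in for the SDK's AttributeValue.

#[derive(Debug, Clone)]
enum Attr {
    S(String),              // string
    N(String),              // number (DynamoDB transports numbers as strings)
    L(Vec<Attr>),           // list
    M(Vec<(String, Attr)>), // map (the real SDK uses a HashMap)
}

fn to_plain_json(v: &Attr) -> String {
    match v {
        Attr::S(s) => format!("\"{}\"", s),
        Attr::N(n) => n.clone(), // numbers become unquoted JSON numbers
        Attr::L(items) => {
            let inner: Vec<String> = items.iter().map(to_plain_json).collect();
            format!("[{}]", inner.join(","))
        }
        Attr::M(fields) => {
            let inner: Vec<String> = fields
                .iter()
                .map(|(k, v)| format!("\"{}\":{}", k, to_plain_json(v)))
                .collect();
            format!("{{{}}}", inner.join(","))
        }
    }
}

fn main() {
    // Mirrors the M({"load": N("2"), "name": S("Superlaser"), ...}) shape above.
    let item = Attr::M(vec![
        ("load".to_string(), Attr::N("2".to_string())),
        ("name".to_string(), Attr::S("Superlaser".to_string())),
    ]);
    println!("{}", to_plain_json(&item));
    // → {"load":2,"name":"Superlaser"}
}
```

serde_dynamo does the same walk generically via Serde, which is why a matching struct plus from_item is all that's needed.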

Related

How to bulk update MongoDB in Rust by inserting the full doc if an entry doesn't exist and updating select fields if it exists?

In short, I am interested in finding the most optimal, minimal call amount way of executing this pseudocode logic:
match find(doc) {
    Some(x) => x.update(select_fields),
    None => collection.insert(all_fields),
}
but in bulk, for the entire local DB, without iterating one by one. Is there such a method? What's the most minimal one currently available?
My use case:
I have a HashMap<T,MyStruct>. I've packed both key and value into the doc!{}. Is that okay?
For some reason I was getting the error "the trait From<u64> is not implemented for Bson" for key3, so I changed my code to use f64:
let dmp_op = my_database.lock().unwrap().clone();
let mut dmp_db = vec![];
for (k, v) in dmp_op {
    dmp_db.push(doc! { "key": value, "key2": value2, "key3": value3 as f64 });
}
match collection.insert_many(dmp_db, None).await {
    Ok(x) => x,
    Err(x) => {
        println!("{:?}", x);
        continue // inside an outer loop
    }
};
This part works, but it isn't repeatable. Instead of doing this, I'd love to execute the aforementioned logic in the most optimal way from scratch. I can't find any information on whether the individual methods I would use in an implementation of find_one_and_update() + upsert can also be used in bulk.
PS: On second thought... maybe my infra logic is flawed? I'm just starting with MongoDB; which is more preferable:
Inserting/updating one by one into MongoDB inside worker threads, instead of using a local HashMap, or
Creating a separate thread that periodically inserts the local HashMap into MongoDB and then cleanses it to keep resource usage low?
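The per-document semantics being asked for ("update select fields if present, insert the whole doc otherwise") have a direct in-memory analogue in HashMap::entry. This std-only sketch involves no MongoDB at all; MyStruct and its fields are hypothetical, and server-side an upsert would perform the same decision per document:

```rust
use std::collections::HashMap;

// Std-only sketch of the desired upsert semantics, applied to an in-memory
// map rather than MongoDB. `MyStruct` and its fields are illustrative only.

#[derive(Debug, Clone, PartialEq)]
struct MyStruct {
    count: u64,
    label: String,
}

fn upsert(map: &mut HashMap<String, MyStruct>, key: &str, incoming: MyStruct) {
    map.entry(key.to_string())
        .and_modify(|existing| {
            // entry exists: update only the selected field
            existing.count = incoming.count;
        })
        .or_insert(incoming); // entry missing: insert the full document
}

fn main() {
    let mut map = HashMap::new();
    upsert(&mut map, "a", MyStruct { count: 1, label: "first".into() });
    upsert(&mut map, "a", MyStruct { count: 5, label: "ignored".into() });
    // count was updated to 5, label kept from the first insert
    println!("{:?}", map["a"]);
}
```

Pushing this decision to the server (one update with upsert enabled per document, rather than a find followed by an insert) is what avoids the non-repeatable insert_many above.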

Error handling for applications: how to return a public message error instead of all the chain of errors and tracing it at the same time?

PROLOGUE
I'm using async-graphql, I have hundreds of resolvers, and for each resolver I would like to trace all the possible errors. In each method of my app I'm using anyhow::Error.
Right now I have code similar to this for each resolver:
#[Object]
impl MutationRoot {
    async fn player_create(&self, ctx: &Context<'_>, input: PlayerInput) -> Result<Player> {
        let services = ctx.data_unchecked::<Services>();
        let player = services
            .player_create(input)
            .await?;
        Ok(player)
    }
}
So I thought about using the below code (note the added line with .map_err()):
#[Object]
impl MutationRoot {
    async fn player_create(&self, ctx: &Context<'_>, input: PlayerInput) -> Result<Player> {
        let services = ctx.data_unchecked::<Services>();
        let player = services
            .player_create(input)
            .await
            .map_err(errorify)?;
        Ok(player)
    }
}

fn errorify(err: anyhow::Error) -> async_graphql::Error {
    tracing::error!("{:?}", err);
    err.into()
}
Now the error is traced along with all the error chain:
ERROR example::main:10: I'm the error number 4
Caused by:
0: I'm the error number 3
1: I'm the error number 2
2: I'm the error number 1
in example::async_graphql
QUESTION 1
Is there a way to avoid the .map_err() on each resolver?
I would like to use the ? alone.
Should I use a custom error?
Can we have a global "hook"/callback/fn to call on each error?
QUESTION 2
As you can see above, the error chain is in reverse order. In my GraphQL response I'm getting "I'm the error number 4" as the message, but I need to get "I'm the error number 2" instead.
The error chain is created using anyhow like this:
main.rs: returns error with .with_context(|| "I'm the error number 4")?
call fn player_create() in graphql.rs: returns with .with_context(|| "I'm the error number 3")?
call fn new_player() in domain.rs: returns with .with_context(|| "I'm the error number 2")?
call fn save_player() in database.rs: returns with .with_context(|| "I'm the error number 1")?
How can I accomplish this?
I'm really new to Rust. I come from Golang where I was using a struct like:
type Error struct {
    error          error
    public_message string
}
chaining it easily with:
return fmt.Errorf("this function is called but the error was: %w", previousError)
How to do it in Rust?
Do I necessarily have to use anyhow?
Can you point me to a good handling error tutorial/book for applications?
Thank you very much.
I would suggest you define your own error type for your library and handle errors properly using the thiserror crate. It's like defining var ErrNotFound = errors.New(...) in Go and using fmt.Errorf(..., err) to add context. With Rust's powerful enums you can handle every error properly via match arms, and thiserror provides really convenient derive attributes like #[from] or #[error(transparent)] to make error conversion/forwarding easy.
Edit 1:
If you want to separate the public message from the tracing log, you may consider defining your custom error struct like this:
#[derive(Error, Debug)]
pub struct MyError {
    msg: String,
    #[source] // optional if the field is named `source`
    source: anyhow::Error,
}
and implementing the Display trait to format its inner msg field.
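The same shape can be shown with std alone (no thiserror derive), which also mirrors the Go struct from the question: Display carries only the public-facing message, while source() preserves the underlying cause for tracing. All names here are illustrative, not a prescribed API:

```rust
use std::error::Error;
use std::fmt;

// Std-only sketch of a "public message + private cause" error.

#[derive(Debug)]
struct PublicError {
    public_message: String,
    source: Box<dyn Error>,
}

impl fmt::Display for PublicError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // only this text reaches the client (e.g. the GraphQL response)
        write!(f, "{}", self.public_message)
    }
}

impl Error for PublicError {
    fn source(&self) -> Option<&(dyn Error + 'static)> {
        // the underlying cause stays available for logging
        Some(self.source.as_ref())
    }
}

fn main() {
    let err = PublicError {
        public_message: "player could not be created".into(),
        source: "db timeout in database.rs".into(),
    };
    println!("public: {}", err);            // public: player could not be created
    println!("cause:  {:?}", err.source()); // full detail for the logs
}
```

thiserror generates essentially this boilerplate for you; the derive version above is the more convenient spelling.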
Finally, you could use the #[instrument] macro from the tracing-attributes crate:
#[instrument(err(Debug))]
fn my_function(arg: usize) -> Result<(), std::io::Error> {
    Ok(())
}
Is there a way to avoid the .map_err() on each resolver?
Yes, you should be able to remove it unless you really need to convert to async_graphql::Error.
Do I necessarily have to use anyhow?
No, but it makes this easier when using ? on different error types.
I'm really new to Rust. I come from Golang where I was using a struct like:
Take a look at thiserror, which lets you build your own enum of error variants.
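Regarding Question 2 (getting "I'm the error number 2" rather than the outermost message): the outermost Display is always the most recently added context, but you can walk the source() chain and pick the link you want; anyhow exposes the same walk as err.chain(). A std-only sketch, where Ctx is an illustrative stand-in for errors built with .with_context(...):

```rust
use std::error::Error;
use std::fmt;

// `Ctx` is a toy context-wrapping error: a message plus an optional cause.
#[derive(Debug)]
struct Ctx(&'static str, Option<Box<dyn Error>>);

impl fmt::Display for Ctx {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{}", self.0)
    }
}

impl Error for Ctx {
    fn source(&self) -> Option<&(dyn Error + 'static)> {
        self.1.as_deref()
    }
}

/// Message of the error `n` hops down from the outermost one.
fn nth_cause(err: &(dyn Error + 'static), n: usize) -> Option<String> {
    let mut current: Option<&(dyn Error + 'static)> = Some(err);
    for _ in 0..n {
        current = current.and_then(|e| e.source());
    }
    current.map(|e| e.to_string())
}

fn main() {
    // Same chain as in the question, built innermost-first.
    let e1 = Ctx("I'm the error number 1", None);
    let e2 = Ctx("I'm the error number 2", Some(Box::new(e1)));
    let e3 = Ctx("I'm the error number 3", Some(Box::new(e2)));
    let e4 = Ctx("I'm the error number 4", Some(Box::new(e3)));
    println!("{:?}", nth_cause(&e4, 2)); // Some("I'm the error number 2")
}
```

With anyhow, the equivalent is err.chain().nth(2); picking a fixed depth is brittle, though, which is why a dedicated "public message" field (as in the answer above) is usually the sturdier design.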

How to avoid clones when using postgres_types::Json?

I'm currently writing a Rust app which uses tokio-postgres, and I need to make an SQL request to fetch some data based on a jsonb column. The problem is that tokio-postgres uses a particular type (postgres_types::Json) which can be used like this: &Json::<Struct>(struct_var).
The struct var can't be a reference, so the Json takes ownership, which raises a problem as I need to use one of the struct's fields afterwards.
I could solve the problem using clone, but I wanted to know first if there is another solution that would not lower performance.
Here is the function :
pub async fn user_exists_ipv4(
    pool: &Pool,
    ip: IpAddr,
    device: &Device,
) -> Result<Option<Uuid>, String> {
    // Get a connection from the pool
    let conn = get_connection(pool).await?;
    let country = &device.country[..];
    // Get the user id from the database
    let result = conn
        .query(
            FETCH_USER_QUERY_FOR_V4,
            &[
                &ip.to_string(),
                &Json::<Device>(device.clone()),
                &country.to_string(),
            ],
        )
        .await?
    ...
You can use references with Json; it is simply a wrapper that implements ToSql for types that are Serialize-able. That includes &T where T: Serialize, so you can use it with device directly as it is:
&Json::<&Device>(device)
You also don't need to annotate the type of Json explicitly, since it can be inferred from what you pass to it. The code above can be written more succinctly as:
&Json(device)
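Why this avoids the clone can be shown with a std-only toy. The Json below is a hypothetical stand-in for postgres_types::Json (which is a similar plain newtype), not the real type: a generic newtype over &T only borrows, so the original value stays usable afterwards.

```rust
// Std-only toy: a newtype wrapper over a reference borrows, it doesn't own.

struct Json<T>(T); // stand-in for postgres_types::Json

#[derive(Debug)]
struct Device {
    country: String,
}

// stands in for handing the parameter to conn.query(...)
fn send<T: std::fmt::Debug>(param: &Json<T>) {
    println!("sending {:?}", param.0);
}

fn main() {
    let device = Device { country: "FR".into() };
    let wrapped = Json(&device); // Json<&Device>: borrows, no clone
    send(&wrapped);
    // `device` was only borrowed, so using its fields afterwards still compiles:
    println!("{}", device.country);
}
```

In the real code the bound is &T: Serialize where T: Serialize, which is exactly what lets ToSql work through the reference.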

Deserializing a String with into_serde makes the app panic

With a friend of mine, we're trying to use the serde_json crate to deserialize some message sent by a WebSocket.
We are having a specific error, and we managed to recreate it with the following snippet of code:
use serde::{Deserialize, Serialize};
use wasm_bindgen::JsValue;

#[derive(Deserialize, Debug)]
struct EffetSer {
    test: String,
}

fn main() {
    let test_value = JsValue::from_str("{\"test\": \"value\"}");
    let test_value: EffetSer = test_value.into_serde().unwrap();
    log::error!("WOW : {:?}", test_value);
}
Our TOML has the following dependencies:
wasm-bindgen = { version = '0.2.63', features = ['serde-serialize'] }
serde = { version = '1.0', features = ["derive"] }
serde_json = '1.0.55'
js-sys = '0.3.40'
The error is the following:
app.js:310 panicked at 'called `Result::unwrap()` on an `Err` value: Error("invalid type: string \"{\\\"test\\\": \\\"value\\\"}\", expected struct EffetSer", line: 1, column: 23)'
Any help would be very appreciated, as we're still struggling to understand what we're doing wrong and why we cannot deserialize our String.
The problem is likely a misunderstanding of into_serde's semantics.
According to documentation, it works like this:
Invokes JSON.stringify on this value and then parses the resulting JSON into an arbitrary Rust value.
In other words, its semantics are as follows:
convert each component of the JsValue to the corresponding serde internal element;
deserialize the required type from the given tree of components.
Now, what does this mean in our case? Well, you created JsValue using JsValue::from_str, which, again according to documentation,
Creates a new JS value which is a string.
So, the JsValue here is not an object, as you are likely assuming; it is a primitive - a string, which simply happens to have the shape of an object's JSON representation. Then, when you invoke into_serde, Serde sees the string not as input, but as the internal representation, which cannot be transformed into an object.
Now, what to do? There are several ways to fix this code:
First and the most obvious: don't use JsValue at all, deserialize from &str directly with serde_json::from_str.
Use js_sys::JSON::parse to get the object-like JsValue from the string, and then convert it to the EffetSer with into_serde. This is likely to be less efficient, since JSON.parse converts the string to an object only for into_serde to stringify it right back again.
Write your own method to convert JsValue to EffetSer directly. I'm not sure if this is possible, however, since I wasn't able to find a way to extract a single field from JS object.

How to build a GraphQL mutation with existing variables

This might seem like an odd question, or something really straightforward, but honestly I am struggling to figure out how to do this. I am working in Node.js and I want to set data I have saved on a Node object into my GraphQL mutation.
I'm working with a vendor's GraphQL API, so this isn't something I have created myself, nor do I have a schema file for it. I'm building a mutation that will insert a record into their application, and I can write everything out manually and use a tool like Postman to create a new record... the structure of the mutation is not my problem.
What I'm struggling to figure out is how to build the mutation with variables from my Node object without just catting a bunch of strings together.
For example, this is what I'm trying to avoid:
class MyClass {
  constructor() {
    this.username = "my_username"
    this.title = "Some Title"
  }
}

const obj = new MyClass()

let query = "mutation { " +
  "createEntry( input: { " +
  "author: { username: \"" + obj.username + "\" } " +
  "title: \"" + obj.title + "\" " +
  "}) }"
I've noticed that there are a number of different node packages out there for working with Graphql, but none of their documentation that I've seen really addresses the above situation. I've been completely unsuccessful in my Googling attempts, can someone please point me in the right direction? Is there a package out there that's useful for just building queries without requiring a schema or trying to send them at the same time?
GraphQL services typically implement this spec when using HTTP as a transport. That means you can construct a POST request with four parameters:
query - A Document containing GraphQL Operations and Fragments to execute.
operationName - (Optional): The name of the Operation in the Document to execute.
variables - (Optional): Values for any Variables defined by the Operation.
extensions - (Optional): This entry is reserved for implementors to extend the protocol however they see fit.
You can use a Node-friendly version of fetch like cross-fetch, axios, request or any other library of your choice to make the actual HTTP request.
If you have dynamic values you want to substitute inside the query, you should utilize variables to do so. Variables are defined as part of your operation definition at the top of the document:
const query = `
  mutation ($input: SomeInputObjectType!) {
    createEntry(input: $input) {
      # whatever other fields, assuming createEntry
      # returns an object and not a scalar
    }
  }
`
Note that the type you use will depend on the type specified by the input argument -- replace SomeInputObjectType with the appropriate type name. If the vendor did not provide adequate documentation for their service, you should at least have access to a GraphiQL or GraphQL Playground instance where you can look up the argument's type. Otherwise, you can use any generic GraphQL client like Altair and view the schema that way.
Once you've constructed your query, make the request like this:
const variables = {
  input: {
    title: obj.title,
    ...
  }
}

const response = await fetch(YOUR_GRAPHQL_ENDPOINT, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query, variables }),
})
const { data, errors } = await response.json()
