HMAC-SHA512 in Rust, can't get expected result - rust

Trying to get HMAC-SHA512 working in Rust; the test case is taken from the Kraken API documentation, but I just can't get it working after a few days now.
Can anybody spot what I am missing?
I tried different HMAC libraries and they all yield the same result, so it seems the problem is in how I concatenate/combine the strings before feeding them to the HMAC implementation.
Cargo.toml:
[dependencies]
urlencoding = "2.1.0"
base64 = "0.13.0"
ring = "0.16.20"
sha256 = "1.0.3"
use ring::hmac;
use sha256;
use urlencoding::encode;
pub fn api_sign(
private_key: Option<String>,
nonse: u64,
params: Option<String>,
uri: String,
) -> hmac::Tag {
let private_key = match private_key {
Some(p) => p,
None => panic!("Private key is not provided"),
};
let encoded_params = match params {
Some(p) => encode(&p[..]).into_owned(),
// Some(p) => p, <= tried this one too
None => "".to_string(),
};
let nonse = nonse.to_string();
let hmac_data = [nonse, encoded_params].concat();
let hmac_data = sha256::digest(hmac_data);
let hmac_data = [uri, hmac_data].concat();
let key = base64::decode(private_key).unwrap();
let key = hmac::Key::new(hmac::HMAC_SHA512, &key);
let mut s_ctx = hmac::Context::with_key(&key);
s_ctx.update(hmac_data.as_bytes());
s_ctx.sign()
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_api_sign_0() {
assert_eq!(
base64::encode(api_sign(
Some("kQH5HW/8p1uGOVjbgWA7FunAmGO8lsSUXNsu3eow76sz84Q18fWxnyRzBHCd3pd5nE9qa99HAZtuZuj6F1huXg==".to_string()),
1616492376594,
Some("nonce=1616492376594&ordertype=limit&pair=XBTUSD&price=37500&type=buy&volume=1.25".to_string()),
"/0/private/AddOrder".to_string()
).as_ref()),
"4/dpxb3iT4tp/ZCVEwSnEsLxx0bqyhLpdfOpc6fn7OR8+UClSV5n9E6aSS8MPtnRfp32bAb0nmbRn6H8ndwLUQ=="
)
}
}

The primary issue is that the APIs you're using are not equivalent. I'll be comparing to the Python implementation as that's what I'm most fluent in.
misunderstanding of the urlencoding API
Your Rust code takes a string of params and urlencodes it, but that's not what the Python code does: urllib.parse.urlencode takes a map of params and creates the query string. The value of postdata in the Python code is
nonce=1616492376594&ordertype=limit&pair=XBTUSD&price=37500&type=buy&volume=1.25
but the value of encoded_params in your code is:
nonce%3D1616492376594%26ordertype%3Dlimit%26pair%3DXBTUSD%26price%3D37500%26type%3Dbuy%26volume%3D1.25
It's double-urlencoded. That is because you start from the pre-built query string and urlencode it, while the Python code starts from the params and creates the query string (properly encoded).
I think serde_urlencoded would be a better pick as a dependency: it is used a lot more, by pretty big projects (e.g. reqwest and pretty much every high-level web framework), and it can encode a data struct (thanks to serde), which better matches the Python behaviour.
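To see the double-encoding concretely without pulling in either crate, here is a std-only sketch; percent_encode is a hand-rolled stand-in that only handles the two separator characters, not the real urlencoding API:

```rust
// Stand-in for urlencoding::encode, restricted to the separators that matter here.
fn percent_encode(s: &str) -> String {
    s.chars()
        .map(|c| match c {
            '=' => "%3D".to_string(),
            '&' => "%26".to_string(),
            c => c.to_string(),
        })
        .collect()
}

fn main() {
    let query = "nonce=1616492376594&ordertype=limit";
    // Encoding the finished query string corrupts its separators:
    assert_eq!(
        percent_encode(query),
        "nonce%3D1616492376594%26ordertype%3Dlimit"
    );
    // Building the string from key/value pairs keeps them intact:
    let pairs = [("nonce", "1616492376594"), ("ordertype", "limit")];
    let built: Vec<String> = pairs.iter().map(|(k, v)| format!("{k}={v}")).collect();
    assert_eq!(built.join("&"), query);
    println!("ok");
}
```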
different sha256 API
sha256::digest and hashlib.sha256 have completely different behaviour:
sha256::digest(hmac_data)
returns a hex-encoded string, while
hashlib.sha256(encoded).digest()
returns the "raw" binary hash value: https://docs.python.org/3/library/hashlib.html#hashlib.hash.digest
That's why the Python code encodes the urlpath before the concatenation: message is bytes, not str.
It seems this sha256 crate only outputs hex strings, and it seems pretty low-use, so I'd suggest you're also using the wrong crate here; you likely want Rust Crypto's sha2.
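The size difference alone shows why this matters: the hex string is a completely different byte sequence from the raw digest, so concatenating it changes the HMAC input. A std-only sketch, where to_hex stands in for the hex encoding that sha256::digest performs:

```rust
// Hex-encode a byte slice, as the sha256 crate's digest() does internally.
fn to_hex(bytes: &[u8]) -> String {
    bytes.iter().map(|b| format!("{b:02x}")).collect()
}

fn main() {
    // Stand-in for a raw digest (a real SHA-256 digest is 32 bytes).
    let raw = [0xde_u8, 0xad, 0xbe, 0xef];
    let hex = to_hex(&raw);
    assert_eq!(hex, "deadbeef");
    // Raw digest: 4 bytes. Hex string: 8 bytes of ASCII.
    assert_eq!(raw.len(), 4);
    assert_eq!(hex.as_bytes().len(), 8);
    println!("ok");
}
```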
Recommendation: porting
For this sort of situation, where reference implementations are available, I would suggest:
* dumping the intermediate values of your implementation and the reference implementation to check that they match; that would have made both the postdata and the digest issues obvious at first glance
* sticking to the "reference" implementation as closely as you can until you have something working; once your version works you can make it more rustic or fix the edge cases (e.g. make parameters optional, fix the API, use API conveniences, ...)
Here's a relatively direct conversion of the Python version, I kept your return type of an hmac::Tag but I used ring::hmac's other shortcuts to simplify that bit:
use ring::hmac;
use serde::Serialize;
use sha2::Digest;
pub fn api_sign(uri: String, data: Data, secret: String) -> hmac::Tag {
let postdata = serde_urlencoded::to_string(&data).unwrap();
let encoded = (data.nonce + &postdata).into_bytes();
let mut message = uri.into_bytes();
message.extend(sha2::Sha256::digest(encoded));
let key = hmac::Key::new(hmac::HMAC_SHA512, &base64::decode(secret).unwrap());
hmac::sign(&key, &message)
}
#[derive(Serialize)]
pub struct Data {
nonce: String,
ordertype: String,
pair: String,
price: u32,
r#type: String,
volume: f32,
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_api_sign_0() {
let sig = api_sign(
"/0/private/AddOrder".to_string(),
Data {
nonce: "1616492376594".into(),
ordertype: "limit".into(),
pair: "XBTUSD".into(),
price: 37500,
r#type: "buy".into(),
volume: 1.25,
},
"kQH5HW/8p1uGOVjbgWA7FunAmGO8lsSUXNsu3eow76sz84Q18fWxnyRzBHCd3pd5nE9qa99HAZtuZuj6F1huXg==".into(),
);
assert_eq!(
base64::encode(&sig),
"4/dpxb3iT4tp/ZCVEwSnEsLxx0bqyhLpdfOpc6fn7OR8+UClSV5n9E6aSS8MPtnRfp32bAb0nmbRn6H8ndwLUQ==",
)
}
}
you can follow and match the signature function pretty much line by line to the Python code.

Metaprogramming name to function and type lookup in Rust?

I am working on a system which produces and consumes large numbers of "events"; each is a name with a small payload of data and an attached function which is used as a kind of fold-left over the data, something like a reducer.
I receive from upstream something like {t: 'fieldUpdated', p: {new: 'new field value'}}, and must in my program associate the fieldUpdated "callback" function with the incoming event and apply it. There is a confirmation command I must echo back (which follows a programmatic naming convention), and each type is custom.
I tried using simple macros to do codegen for the structs and callbacks, and with the paste::paste! macro crate and the stringify macro I made quite good progress.
Regrettably, however, I did not find a good way to metaprogram these into a list or map using macros. Extending an enum through macros doesn't seem to be possible, and solutions such as the use of ctors seem extremely hacky.
My ideal case is something like this:
type evPayload = {
new: String
}
let evHandler = fn(evPayload: )-> Result<(), Error> { Ok(()) }
// ...
let data = r#"{"t": 'fieldUpdated', "p": {"new": 'new field value'}}"#;
let v: Value = serde_json::from_str(data)?;
Given only knowledge of data, how can I use macros (the boilerplate is actually 2-3 types, 3 functions, and some factory and helper functions) in a way that lets me do a name-to-function lookup?
It seems like Serde's adjacently or internally tagged enum representations would get me there, if I could modify an enum in a macro: https://serde.rs/enum-representations.html#internally-tagged
It almost feels like I need a macro which can either maintain an enum, or I can "cheat" and use module-scoped ctors to do a quasi-static initialization of the names and types into a map.
My program would have on the order of 40-100 of these, with anything from 3-10 in a module. I don't think ctors are necessarily a problem here, but the fact that they're a little grey-area handshake, and that ctors might preclude one day being able to cross-compile to wasm, puts me off a little.
I actually had need of something similar today; the enum macro part specifically. But beware of my method: here be dragons!
Someone more experienced than me — and less mad — should probably vet this. Please do not assume my SAFETY comments to be correct.
Also, if you don't have variants that collide with Rust keywords, you might want to tear out the '_' prefix hack entirely. I used a static mut byte array for that purpose, as manipulating strings was an order of magnitude slower, but that was benchmarked in a simplified function. There are likely better ways of doing this.
Finally, I am using it where failing to parse must cause a panic, so error handling is somewhat limited.
With that being said, here's my current solution:
/// NOTE: It is **imperative** that the length of this array is longer than the longest variant name + 1
static mut CHECK_BUFF: [u8; 32] = [b'_'; 32];
macro_rules! str_enums {
($enum:ident: $($variant:ident),* $(,)?) => {
#[allow(non_camel_case_types)]
#[derive(Debug, Default, Hash, Clone, PartialEq, Eq, PartialOrd, Ord)]
enum $enum {
#[default]
UNINIT,
$($variant),*,
UNKNOWN
}
impl FromStr for $enum {
type Err = String;
fn from_str(s: &str) -> Result<Self, Self::Err> {
unsafe {
// SAFETY: Currently only single threaded
let len = s.len() + 1;
assert!(CHECK_BUFF.len() >= len);
CHECK_BUFF[1..len].copy_from_slice(s.as_bytes());
// SAFETY: Safe as long as CHECK_BUFF.len() >= s.len() + 1
match from_utf8_unchecked(&CHECK_BUFF[..len]) {
$(stringify!($variant) => Ok(Self::$variant),)*
_ => Err(format!(
"{} variant not accounted for: {s} ({},)",
stringify!($enum),
from_utf8_unchecked(&CHECK_BUFF[..len])
))
}
}
}
}
impl From<&$enum> for &'static str {
fn from(variant: &$enum) -> Self {
unsafe {
match variant {
// SAFETY: The first byte is always '_', and stripping it off should be safe.
$($enum::$variant => from_utf8_unchecked(&stringify!($variant).as_bytes()[1..]),)*
$enum::UNINIT => {
eprintln!("uninitialized {}!", stringify!($enum));
""
}
$enum::UNKNOWN => {
eprintln!("unknown {}!", stringify!($enum));
""
}
}
}
}
}
impl Display for $enum {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", Into::<&str>::into(self))
}
}
};
}
And then I call it like so:
str_enums!(
AttributeKind:
_alias,
_allowduplicate,
_altlen,
_api,
...
_enum,
_type,
_struct,
);
str_enums!(
MarkupKind:
_alias,
_apientry,
_command,
_commands,
...
);
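If the static mut buffer exists only for speed, a safe variant that allocates per call may still be fast enough in practice. A minimal sketch of the same '_'-prefix idea without unsafe, using a hypothetical two-variant enum for illustration:

```rust
use std::str::FromStr;

#[allow(non_camel_case_types)]
#[derive(Debug, PartialEq)]
enum AttributeKind {
    _alias,
    _type,
}

impl FromStr for AttributeKind {
    type Err = String;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        // One small heap allocation per parse, but no unsafe and no
        // shared mutable state, so it is also thread-safe.
        match format!("_{s}").as_str() {
            "_alias" => Ok(Self::_alias),
            "_type" => Ok(Self::_type),
            other => Err(format!("unknown variant: {other}")),
        }
    }
}

fn main() {
    assert_eq!("alias".parse::<AttributeKind>(), Ok(AttributeKind::_alias));
    assert!("bogus".parse::<AttributeKind>().is_err());
    println!("ok");
}
```

In a macro-generated version, the match arms would come from the same `$(stringify!($variant) => ...)*` expansion as above; only the buffer trick changes.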

Using HMAC as a generic

I am having trouble updating the hmac and digest crates. I have a function that takes a generic HMAC type and computes the HMAC over a given input. The function works with hmac 0.7 and digest 0.8; however, I'm getting blocked when trying to get the same logic running with the latest versions, 0.10 and 0.9 respectively.
On my machine, I use rustc 1.48.0 (7eac88abb 2020-11-16).
The working example has the following Cargo.toml dependencies
[dependencies]
hmac = "0.7"
sha2 = "0.8"
digest = "0.8"
The minimal working example is the following:
use sha2::{Sha256};
use hmac::{Mac, Hmac};
type HmacSha256 = Hmac<Sha256>;
use digest::generic_array::typenum::{U32};
pub struct Key([u8; 2]);
impl Key {
pub fn print_hmac<D>(&self, message: &[u8])
where
D: Mac<OutputSize = U32>,
{
let mut mac = D::new_varkey(self.0.as_ref()).unwrap();
mac.input(message);
let result = mac.result();
let code_bytes = result.code();
println!("{:?}", code_bytes)
}
}
pub fn main() {
let verif_key = Key([12u8, 33u8]);
verif_key.print_hmac::<HmacSha256>(&[83u8, 123u8]);
}
The above code works well and compiles. However, when I try to upgrade the dependencies to the latest versions, everything breaks.
Updated Cargo.toml:
[dependencies]
hmac = "0.10"
sha2 = "0.9"
digest = "0.9"
With the updates, we have some changes in the nomenclature:
.input() -> .update()
.result() -> .finalize()
.code() -> .into_bytes()
When I try to run it, I get the following error:
no function or associated item named `new_varkey` found for type parameter `D` in the current scope
So I tried to define the generic type to be NewMac (for that I needed to change the second line to use hmac::{Mac, Hmac, NewMac};). However, now the errors are in the functions .update() and .finalize().
I've also tried to pass the Digest generic type, rather than the Hmac, as follows:
pub fn print_hmac<D>(&self, message: &[u8])
where
D: Digest,
{
let mut mac = Hmac::<D>::new_varkey(self.0.as_ref()).unwrap();
mac.update(message);
let result = mac.finalise();
let code_bytes = result.into_bytes();
println!("{:?}", code_bytes)
}
But it is still not working.
How should I handle the generic Hmac function with the updated crates?
Sorry for the long post, and I hope I made my problem clear. Thanks community!
It's helpful to look at example code that has been updated across versions, and luckily hmac has some tests in its repository.
Those tests use the new_test macro defined here in the crypto-mac crate. In particular, there is one line similar to yours...
let mut mac = <$mac as NewMac>::new_varkey(key).unwrap();
...which suggests that D should be cast to a NewMac implementor in your code as well.
After implementing the nomenclature updates you've already identified, your code works with the additional as NewMac cast from above, as well as the corresponding + NewMac trait bound on D:
use sha2::{Sha256};
use hmac::{NewMac, Mac, Hmac};
type HmacSha256 = Hmac<Sha256>;
use digest::generic_array::typenum::{U32};
pub struct Key([u8; 2]);
impl Key {
pub fn print_hmac<D>(&self, message: &[u8])
where
D: Mac<OutputSize = U32> + NewMac, // `+ NewMac` input trait bound
{
let mut mac = <D as NewMac>::new_varkey(self.0.as_ref()).unwrap(); // `as NewMac` cast
mac.update(message);
let result = mac.finalize();
let code_bytes = result.into_bytes();
println!("{:?}", code_bytes)
}
}
pub fn main() {
let verif_key = Key([12u8, 33u8]);
verif_key.print_hmac::<HmacSha256>(&[83u8, 123u8]);
}

How do I get the value and type of a Literal in a procedural macro?

I am implementing a function-like procedural macro which takes a single string literal as an argument, but I don't know how to get the value of that string literal.
If I print the variable, it shows a bunch of fields, which include both the type and the value. They are clearly in there somewhere. How do I get them?
extern crate proc_macro;
use proc_macro::{TokenStream,TokenTree};
#[proc_macro]
pub fn my_macro(input: TokenStream) -> TokenStream {
let input: Vec<TokenTree> = input.into_iter().collect();
let literal = match &input.get(0) {
Some(TokenTree::Literal(literal)) => literal,
_ => panic!()
};
// can't do anything with "literal"
// println!("{:?}", literal.lit.symbol); says "unknown field"
format!("{:?}", format!("{:?}", literal)).parse().unwrap()
}
#![feature(proc_macro_hygiene)]
extern crate macros;
fn main() {
let value = macros::my_macro!("hahaha");
println!("it is {}", value);
// prints "it is Literal { lit: Lit { kind: Str, symbol: "hahaha", suffix: None }, span: Span { lo: BytePos(100), hi: BytePos(108), ctxt: #0 } }"
}
After running into the same problem countless times already, I finally wrote a library to help with this: litrs on crates.io. It compiles faster than syn and lets you inspect your literals.
use std::convert::TryFrom;
use litrs::StringLit;
use proc_macro::TokenStream;
use quote::quote;
#[proc_macro]
pub fn my_macro(input: TokenStream) -> TokenStream {
let input = input.into_iter().collect::<Vec<_>>();
if input.len() != 1 {
let msg = format!("expected exactly one input token, got {}", input.len());
return quote! { compile_error!(#msg) }.into();
}
let string_lit = match StringLit::try_from(&input[0]) {
// Error if the token is not a string literal
Err(e) => return e.to_compile_error(),
Ok(lit) => lit,
};
// `StringLit::value` returns the actual string value represented by the
// literal. Quotes are removed and escape sequences replaced with the
// corresponding value.
let v = string_lit.value();
// TODO: implement your logic here
}
See the documentation of litrs for more information.
To obtain more information about a literal, litrs uses the Display impl of Literal to obtain a string representation (as it would be written in source code) and then parses that string. For example, if the string starts with 0x one knows it has to be an integer literal, if it starts with r#" one knows it is a raw string literal. The crate syn does exactly the same.
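The classification trick described above can be illustrated with a std-only sketch; literal_kind is a toy that only looks at the first characters of the source form, nowhere near the real litrs or syn parsers:

```rust
// Classify a literal from its source representation, i.e. the string
// produced by Literal's Display impl.
fn literal_kind(src: &str) -> &'static str {
    if src.starts_with("r#\"") || src.starts_with("r\"") || src.starts_with('"') {
        "string"
    } else if src.starts_with("0x") {
        "integer (hex)"
    } else if src.starts_with('\'') {
        "char"
    } else {
        "other"
    }
}

fn main() {
    assert_eq!(literal_kind(r#""hahaha""#), "string");
    assert_eq!(literal_kind(r##"r#"raw"#"##), "string");
    assert_eq!(literal_kind("0xff"), "integer (hex)");
    assert_eq!(literal_kind("'a'"), "char");
    println!("ok");
}
```

The real crates go on to fully parse the body (unescaping, suffix handling, etc.), which is exactly the part worth delegating rather than reimplementing.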
Of course, it seems a bit wasteful to write and run a second parser given that rustc has already parsed the literal. Yes, that's unfortunate, and a better API in proc_macro would be preferable. But right now, I think litrs (or syn if you are using syn anyway) is the best solution.
(PS: I'm usually not a fan of promoting one's own libraries on Stack Overflow, but I am very familiar with the problem OP is having and I very much think litrs is the best tool for the job right now.)
If you're writing procedural macros, I'd recommend that you look into the crates syn (for parsing) and quote (for code generation) instead of using proc-macro directly, since those are generally easier to deal with.
In this case, you can use syn::parse_macro_input to parse a token stream into any syntactic element of Rust (such as literals, expressions, or functions); it will also take care of error messages in case parsing fails.
You can use LitStr to represent a string literal, if that's exactly what you need. The .value() function will give you a String with the contents of that literal.
You can use quote::quote to generate the output of the macro, and use # to insert the contents of a variable into the generated code.
use proc_macro::TokenStream;
use syn::{parse_macro_input, LitStr};
use quote::quote;
#[proc_macro]
pub fn my_macro(input: TokenStream) -> TokenStream {
// macro input must be `LitStr`, which is a string literal.
// if not, a relevant error message will be generated.
let input = parse_macro_input!(input as LitStr);
// get value of the string literal.
let str_value = input.value();
// do something with value...
let str_value = str_value.to_uppercase();
// generate code, include `str_value` variable (automatically encodes
// `String` as a string literal in the generated code)
(quote!{
#str_value
}).into()
}
I always want a string literal, so I found this solution, which is good enough. Literal implements ToString, which I can then use with .parse().
#[proc_macro]
pub fn my_macro(input: TokenStream) -> TokenStream {
let input: Vec<TokenTree> = input.into_iter().collect();
let value = match &input.get(0) {
Some(TokenTree::Literal(literal)) => literal.to_string(),
_ => panic!()
};
let str_value: String = value.parse().unwrap();
// do whatever
format!("{:?}", str_value).parse().unwrap()
}
I had a similar problem parsing the doc attribute, which is also represented as a TokenStream. This is not an exact answer, but maybe it will guide you in the proper direction:
fn from(value: &Vec<Attribute>) -> Vec<String> {
let mut lines = Vec::new();
for attr in value {
if !attr.path.is_ident("doc") {
continue;
}
if let Ok(Meta::NameValue(nv)) = attr.parse_meta() {
if let Lit::Str(lit) = nv.lit {
lines.push(lit.value());
}
}
}
lines
}

Map C-like packed data structure to Rust struct

I'm fairly new to Rust and have spent most of my time writing code in C/C++. I have a Flask webserver that returns a packed data structure in the form of a length + null-terminated string:
test_data = "Hello there bob!" + "\x00"
test_data = test_data.encode("utf-8")
data = struct.pack("<I", len(test_data))
data += test_data
return data
In my Rust code, I'm using the easy_http_request crate and can successfully get the response back by calling get_from_url_str. What I'm trying to do is map the returned response onto the Test data structure (if possible). I've attempted to use align_to to map the string data onto the structure, without success.
extern crate easy_http_request;
extern crate libc;
use easy_http_request::DefaultHttpRequest;
use libc::c_char;
#[repr(C, packed)]
#[derive(Debug, Clone, Copy)]
struct Test {
a: u32,
b: *const c_char // TODO: What do I put here???
}
fn main() {
let response = DefaultHttpRequest::get_from_url_str("http://localhost:5000/").unwrap().send().unwrap();
let (head, body, _tail) = unsafe { response.body.align_to::<Test>() };
let my_test: Test = body[0];
println!("{}", my_test.a); // Correctly prints '17'
println!("{:?}", my_test.b); // Fails
}
I'm not sure this is possible in Rust. In response.body I can correctly see the null-terminated string, so I know the data is there; I'm just unsure if there's a way to map it to a string in the Test structure. There's no reason I need to use a null-terminated string; ultimately, I'm just trying to map a data structure of a size and a string onto a Rust struct with similar types.
It looks like you are confused by two different meanings of pack:
* In Python, pack is a protocol of sorts to serialize data into an array of bytes.
* In Rust, pack is a directive added to a struct to remove padding between members and disable other weirdness.
While they can be used together to make a protocol work, that is not the case here, because your packed message has a variable-length member. And trying to serialize/deserialize a pointer value directly is a very bad idea.
Your packed Flask message is basically:
* 4 bytes: a little-endian value with the number of bytes in the string,
* that many bytes for the string, encoded in UTF-8.
For that you do not need a packed struct. The easiest way is to just read the fields manually, one by one. Something like this (error checking omitted):
use std::convert::TryInto;
let a = u32::from_le_bytes(response.body[0..4].try_into().unwrap());
let b = std::str::from_utf8(&response.body[4..4 + a as usize]).unwrap();
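The same idea fleshed out into a runnable, std-only sketch with basic error checking restored; parse is a hypothetical helper operating on an in-memory buffer rather than the HTTP response:

```rust
use std::convert::TryInto;

/// Parse a little-endian u32 length prefix followed by that many bytes
/// of UTF-8. Returns None on short input or invalid UTF-8.
fn parse(buf: &[u8]) -> Option<(u32, &str)> {
    let len_bytes: [u8; 4] = buf.get(0..4)?.try_into().ok()?;
    let len = u32::from_le_bytes(len_bytes);
    let body = buf.get(4..4 + len as usize)?;
    // Drop a trailing NUL if the sender appended one, as the Flask code does.
    let body = body.strip_suffix(&[0]).unwrap_or(body);
    std::str::from_utf8(body).ok().map(|s| (len, s))
}

fn main() {
    // Build the same wire format the Python side produces.
    let msg = b"Hello there bob!\0";
    let mut wire = (msg.len() as u32).to_le_bytes().to_vec();
    wire.extend_from_slice(msg);

    let (len, s) = parse(&wire).unwrap();
    assert_eq!(len, 17);
    assert_eq!(s, "Hello there bob!");
    println!("ok");
}
```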
Don't use raw pointers: they are unsafe to use and recommended only when there are strong reasons to get around Rust's safety guarantees.
At minimum, a struct that fits your requirement is something like:
struct Test<'a> {
value: &'a str
}
or a String owned value for avoiding lifetime dependencies.
A &str reference comprises a length and a pointer (it is not a C-like char * pointer).
By the way, the hard part is not parsing the wire protocol but correctly managing all the possible decoding errors, avoiding unexpected runtime failures in case of buggy or malicious clients.
In order not to reinvent the wheel, here is an example with the parser combinator library nom:
use nom::{
number::complete::le_u32,
bytes::complete::take,
error::ErrorKind,
IResult
};
use easy_http_request::DefaultHttpRequest;
use std::str::from_utf8;
#[derive(Debug, Clone)]
struct Test {
value: String
}
fn decode_len_value(bytes: &[u8]) -> IResult<&[u8], Test> {
let (buffer, len) = le_u32(bytes)?;
// take len-1 bytes because null char (\0) is accounted into len
let (remaining, val) = take(len-1)(buffer)?;
match from_utf8(val) {
Ok(strval) => Ok((remaining, Test {value: strval.to_owned()})),
Err(_) => Err(nom::Err::Error((remaining, ErrorKind::Char)))
}
}
fn main() {
let response = DefaultHttpRequest::get_from_url_str("http://localhost:5000/").unwrap().send().unwrap();
let result = decode_len_value(&response.body);
println!("{:?}", result);
}

Calling Rust from NodeJS

I was trying to build a simple Rust RSS 'harvester' for my soup.io blog and then post those entries to Diaspora with Node.js (since there is an npm package for that).
I want to learn how to use Rust from Node, which is why I'm building this project.
My problem is that I don't know how to call the FFI function with the right types.
var lib = ffi.Library('target/debug/libmain', {
'get_soup': ['Vec<Post>', ['String']]
});
The 'Vec<Post>' doesn't work.
I gather that I have to use ref for that, but I don't really know how, or what it actually does.
I understand that I have to translate the Rust types to JavaScript, but how can I use Vec<Post> in my ffi function?
my github project for that: Realtin/suppe
and here the relevant code:
Rust Code:
extern crate rss;
extern crate hyper;
use rss::Rss;
use std::io::prelude::*;
#[derive(Debug)]
pub struct Post {
title: String,
link: String,
description: String,
}
fn main() {
let user = "realtin".to_string();
let vec = get_soup(&user);
println!("{:?}", vec[vec.len()-1]);
}
#[no_mangle]
pub extern fn get_soup(user: &str) ->Vec<Post>{
let url = format!("http://{}.soup.io/rss", user);
let mut vec = Vec::new();
let client = hyper::Client::new();
let mut response = client.get(&url).send().unwrap();
let mut suppe = String::new();
let _= response.read_to_string(&mut suppe);
let rss::Rss(channel) = suppe.parse::<rss::Rss>().unwrap();
for item in channel.items.into_iter().rev() {
let item_object = Post {
title: item.title.unwrap(),
link: item.link.unwrap(),
description: item.description.unwrap(),
};
vec.push(item_object);
}
return vec;
}
NodeJS code:
var ref = require('ref');
var StructType = require("ref-struct");
var ffi = require('ffi');
var Post = StructType({
title: String,
link: String,
description: String,
});
// var vecPost = ref.refType(ref.types.Object);
var lib = ffi.Library('target/debug/libmain', {
'get_soup': ['Vec<Post>', ['String']]
});
var posts = lib.get_soup("realtin");
The short answer: you cannot export arbitrary Rust functions for FFI bindings; you need to specifically export Rust functions that are compatible with C.
Specifically, this means that you need to expose only C-struct compatible objects OR expose opaque pointers (which can only be manipulated through Rust functions).
In your case, Vec<Post> is not compatible with usage in FFI, because Vec is not.
You can find more information in the FFI Guide.
Some data types can be passed directly using your approach. See this guide on exposing FFI from a Rust library:
https://svartalf.info/posts/2019-03-01-exposing-ffi-from-the-rust-library/
and then call it from Node.js using something like https://github.com/node-ffi/node-ffi
It may not support the exact structure you require, but you can convert the data on one or both ends.
As a last resort you can use JSON; there will be some overhead, but not a lot:
https://stackoverflow.com/a/42498913/3232611
You can use protocol buffers if JSON performance becomes a bottleneck, but in most cases it's not worth the complexity.
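A minimal sketch of the JSON-over-C-strings idea: expose a C-compatible function returning a JSON string. The JSON is hand-assembled here to keep the sketch dependency-free, and get_soup_json / free_string are hypothetical names, not part of the original project:

```rust
use std::ffi::{CStr, CString};
use std::os::raw::c_char;

#[no_mangle]
pub extern "C" fn get_soup_json(user: *const c_char) -> *mut c_char {
    let user = unsafe { CStr::from_ptr(user) }.to_string_lossy();
    // Real code would fetch the RSS feed and serialize the posts here.
    let json = format!(r#"[{{"title":"post by {}"}}]"#, user);
    CString::new(json).unwrap().into_raw()
}

#[no_mangle]
pub extern "C" fn free_string(s: *mut c_char) {
    // JS must hand the pointer back so Rust can free its own allocation.
    if !s.is_null() {
        unsafe { drop(CString::from_raw(s)) };
    }
}

fn main() {
    // Exercise the FFI surface from Rust itself.
    let user = CString::new("realtin").unwrap();
    let p = get_soup_json(user.as_ptr());
    let out = unsafe { CStr::from_ptr(p) }.to_string_lossy().into_owned();
    assert_eq!(out, r#"[{"title":"post by realtin"}]"#);
    free_string(p);
    println!("ok");
}
```

On the Node side the binding would then be declared roughly as ['string', ['string']] with node-ffi (or return a 'pointer' and read it with ref.readCString, so the pointer can be passed back to free_string), and the JSON parsed with JSON.parse.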