I was trying to build a simple Rust RSS "harvester" for my soup.io blog and then post those entries to Diaspora with Node.js (since there is an npm package for that).
I want to learn how to call Rust from Node, which is why I'm building this project.
My problem is that I don't know how to call the FFI function with the right types.
var lib = ffi.Library('target/debug/libmain', {
  'get_soup': ['Vec<Post>', ['String']]
});
The 'Vec<Post>' doesn't work.
I get that I have to use ref for that, but I don't really know how, or what it actually does.
I understand that I have to translate the Rust types to JavaScript?
How can I use Vec<Post> in my FFI function?
My GitHub project for this: Realtin/suppe.
Here is the relevant code:
Rust Code:
extern crate rss;
extern crate hyper;

use rss::Rss;
use std::io::prelude::*;

#[derive(Debug)]
pub struct Post {
    title: String,
    link: String,
    description: String,
}

fn main() {
    let user = "realtin".to_string();
    let vec = get_soup(&user);
    println!("{:?}", vec[vec.len() - 1]);
}

#[no_mangle]
pub extern fn get_soup(user: &str) -> Vec<Post> {
    let url = format!("http://{}.soup.io/rss", user);
    let mut vec = Vec::new();
    let client = hyper::Client::new();
    let mut response = client.get(&url).send().unwrap();
    let mut suppe = String::new();
    let _ = response.read_to_string(&mut suppe);
    let rss::Rss(channel) = suppe.parse::<rss::Rss>().unwrap();
    for item in channel.items.into_iter().rev() {
        let item_object = Post {
            title: item.title.unwrap(),
            link: item.link.unwrap(),
            description: item.description.unwrap(),
        };
        vec.push(item_object);
    }
    return vec;
}
NodeJS code:
var ref = require('ref');
var StructType = require("ref-struct");
var ffi = require('ffi');

var Post = StructType({
  title: String,
  link: String,
  description: String,
});

// var vecPost = ref.refType(ref.types.Object);
var lib = ffi.Library('target/debug/libmain', {
  'get_soup': ['Vec<Post>', ['String']]
});

var posts = lib.get_soup("realtin");
The short answer: you cannot export just any Rust function for FFI bindings; you need to specifically export Rust functions that are compatible with C.
Specifically, this means that you need to expose only C-struct-compatible objects OR expose opaque pointers (which can then only be manipulated through Rust functions).
In your case, Vec<Post> is not compatible with FFI usage, because Vec itself is not.
You can find more information in the FFI Guide.
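For illustration, here is a rough sketch of the opaque-pointer approach. It reuses the question's Post struct, replaces the real RSS fetch with a stub, and the exported names (posts_fetch, posts_len, posts_title, posts_free) are made up for this example:

use std::ffi::{CStr, CString};
use std::os::raw::c_char;

pub struct Post {
    title: String,
    link: String,
    description: String,
}

// Stub standing in for the question's get_soup; the real version builds the Vec from the RSS feed.
fn get_soup(user: &str) -> Vec<Post> {
    vec![Post {
        title: format!("example post for {}", user),
        link: "http://example.soup.io/post/1".to_string(),
        description: "placeholder".to_string(),
    }]
}

// Build the Vec on the Rust side and hand Node an opaque pointer to it.
#[no_mangle]
pub extern "C" fn posts_fetch(user: *const c_char) -> *mut Vec<Post> {
    let user = unsafe { CStr::from_ptr(user) }.to_str().unwrap();
    Box::into_raw(Box::new(get_soup(user)))
}

// Accessors that Node calls with the opaque pointer.
#[no_mangle]
pub extern "C" fn posts_len(posts: *const Vec<Post>) -> usize {
    unsafe { &*posts }.len()
}

#[no_mangle]
pub extern "C" fn posts_title(posts: *const Vec<Post>, index: usize) -> *mut c_char {
    let title = unsafe { &*posts }[index].title.clone();
    // Caller owns the returned C string; a matching free function is omitted here.
    CString::new(title).unwrap().into_raw()
}

// Give the Vec back to Rust so it can be dropped.
#[no_mangle]
pub extern "C" fn posts_free(posts: *mut Vec<Post>) {
    unsafe { drop(Box::from_raw(posts)) }
}

On the Node side the opaque handle is then declared simply as 'pointer', and the accessors use plain C types such as 'string' and 'size_t'.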
Some data types can be passed directly using your approach: export them from Rust using FFI
https://svartalf.info/posts/2019-03-01-exposing-ffi-from-the-rust-library/
and then call them from Node.js using something like https://github.com/node-ffi/node-ffi.
This may not support the exact structure you require, but you can convert the data on one or both ends.
As a last resort you can use JSON. There will be some overhead, but not a lot:
https://stackoverflow.com/a/42498913/3232611
You can also use protocol buffers if JSON performance becomes a bottleneck, but in most cases it's not worth the complexity.
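As a rough sketch of the JSON route (the exported name get_soup_json is made up here, and the Vec is built inline instead of from the real feed), the posts are serialized with serde_json and returned as a C string, which node-ffi can read as 'string' and JavaScript can JSON.parse:

use std::ffi::CString;
use std::os::raw::c_char;

use serde::Serialize;

#[derive(Serialize)]
pub struct Post {
    title: String,
    link: String,
    description: String,
}

#[no_mangle]
pub extern "C" fn get_soup_json() -> *mut c_char {
    // In the real code this Vec would come from the question's get_soup.
    let posts = vec![Post {
        title: "example".to_string(),
        link: "http://example.soup.io/post/1".to_string(),
        description: "placeholder".to_string(),
    }];
    let json = serde_json::to_string(&posts).unwrap();
    // Ownership of the C string passes to the caller; a matching free function is omitted here.
    CString::new(json).unwrap().into_raw()
}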
I am trying to get HMAC-SHA512 working in Rust; the test case is taken from the Kraken API, but I just can't get it working after a few days.
Can anybody spot what I am missing?
I tried different HMAC libraries and they all seem to yield the same result, so it seems it's something about how I concatenate/combine the strings before feeding them to the HMAC implementation.
Cargo.toml:
[dependencies]
urlencoding = "2.1.0"
base64 = "0.13.0"
ring = "0.16.20"
sha256 = "1.0.3"
use ring::hmac;
use sha256;
use urlencoding::encode;

pub fn api_sign(
    private_key: Option<String>,
    nonse: u64,
    params: Option<String>,
    uri: String,
) -> hmac::Tag {
    let private_key = match private_key {
        Some(p) => p,
        None => panic!("Private key is not provided"),
    };
    let encoded_params = match params {
        Some(p) => encode(&p[..]).into_owned(),
        // Some(p) => p, <= tried this one too
        None => "".to_string(),
    };
    let nonse = nonse.to_string();
    let hmac_data = [nonse, encoded_params].concat();
    let hmac_data = sha256::digest(hmac_data);
    let hmac_data = [uri, hmac_data].concat();
    let key = base64::decode(private_key).unwrap();
    let key = hmac::Key::new(hmac::HMAC_SHA512, &key);
    let mut s_ctx = hmac::Context::with_key(&key);
    s_ctx.update(hmac_data.as_bytes());
    s_ctx.sign()
}
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_api_sign_0() {
        assert_eq!(
            base64::encode(api_sign(
                Some("kQH5HW/8p1uGOVjbgWA7FunAmGO8lsSUXNsu3eow76sz84Q18fWxnyRzBHCd3pd5nE9qa99HAZtuZuj6F1huXg==".to_string()),
                1616492376594,
                Some("nonce=1616492376594&ordertype=limit&pair=XBTUSD&price=37500&type=buy&volume=1.25".to_string()),
                "/0/private/AddOrder".to_string()
            ).as_ref()),
            "4/dpxb3iT4tp/ZCVEwSnEsLxx0bqyhLpdfOpc6fn7OR8+UClSV5n9E6aSS8MPtnRfp32bAb0nmbRn6H8ndwLUQ=="
        )
    }
}
The primary issue is that the APIs you're using are not equivalent. I'll be comparing to the Python implementation as that's what I'm most fluent in.
Misunderstanding of the urlencoding API
Your Rust code takes a string of params and urlencodes it, but that's not what the Python code does: urllib.parse.urlencode takes a map of params and creates the query string. The value of postdata in the Python code is
nonce=1616492376594&ordertype=limit&pair=XBTUSD&price=37500&type=buy&volume=1.25
but the value of encoded_params in your code is:
nonce%3D1616492376594%26ordertype%3Dlimit%26pair%3DXBTUSD%26price%3D37500%26type%3Dbuy%26volume%3D1.25
It's double-urlencoded. That is because you start from the pre-built query string and urlencode it, while the Python code starts from the params and creates the query string (properly encoded).
I think serde_urlencoded would be a better pick as a dependency: it is used a lot more, by pretty big projects (e.g. reqwest and pretty much every high-level web framework), and it can encode a data struct (thanks to serde), which better matches the Python behaviour.
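A small sketch of the difference, using illustrative values (this assumes serde_urlencoded as a dependency next to the urlencoding crate you already have):

use serde::Serialize;

#[derive(Serialize)]
struct Params {
    nonce: u64,
    pair: String,
}

fn main() {
    let params = Params { nonce: 1616492376594, pair: "XBTUSD".to_string() };

    // serde_urlencoded builds the query string from the struct, escaping each value:
    // prints "nonce=1616492376594&pair=XBTUSD"
    println!("{}", serde_urlencoded::to_string(&params).unwrap());

    // urlencoding::encode escapes an already-built query string, so '=' and '&'
    // themselves become %3D and %26: prints "nonce%3D1616492376594%26pair%3DXBTUSD"
    println!("{}", urlencoding::encode("nonce=1616492376594&pair=XBTUSD"));
}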
Different sha256 API
sha256::digest and hashlib.sha256 have completely different behaviour:
sha256::digest(hmac_data)
returns a hex-encoded string, while
hashlib.sha256(encoded).digest()
returns the "raw" binary hash value: https://docs.python.org/3/library/hashlib.html#hashlib.hash.digest
That's why the Python code encodes the urlpath before the concatenation: message is bytes, not str.
It seems this sha256 crate outputs only hex strings, and it seems pretty low-use, so I'd suggest you're also using the wrong crate here; you likely want Rust Crypto's SHA2.
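A small sketch of the difference (assuming the sha256 crate from your Cargo.toml and sha2 0.10):

use sha2::{Digest, Sha256};

fn main() {
    // sha256::digest returns a hex-encoded String (64 ASCII characters)...
    let hex = sha256::digest("abc".to_string());
    assert_eq!(hex.len(), 64);

    // ...while sha2's Sha256::digest returns the raw 32-byte hash,
    // matching Python's hashlib.sha256(...).digest().
    let raw = Sha256::digest("abc".as_bytes());
    assert_eq!(raw.len(), 32);
}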
Recommendation: porting
For this sort of situation, where reference implementations are available, I would suggest:
dumping the intermediate values of your implementation and of the reference implementation to check that they match; that would have made both the postdata and the digest issues obvious at first glance;
sticking to the "reference" implementation as closely as you can until you have something working; once your version works, you can make it more rustic or fix the edge cases (e.g. make parameters optional, fix the API, use API conveniences, ...).
Here's a relatively direct conversion of the Python version. I kept your return type of hmac::Tag, but I used ring::hmac's other shortcuts to simplify that bit:
use ring::hmac;
use serde::Serialize;
use sha2::Digest;

pub fn api_sign(uri: String, data: Data, secret: String) -> hmac::Tag {
    let postdata = serde_urlencoded::to_string(&data).unwrap();
    let encoded = (data.nonce + &postdata).into_bytes();
    let mut message = uri.into_bytes();
    message.extend(sha2::Sha256::digest(encoded));

    let key = hmac::Key::new(hmac::HMAC_SHA512, &base64::decode(secret).unwrap());
    hmac::sign(&key, &message)
}

#[derive(Serialize)]
pub struct Data {
    nonce: String,
    ordertype: String,
    pair: String,
    price: u32,
    r#type: String,
    volume: f32,
}
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_api_sign_0() {
        let sig = api_sign(
            "/0/private/AddOrder".to_string(),
            Data {
                nonce: "1616492376594".into(),
                ordertype: "limit".into(),
                pair: "XBTUSD".into(),
                price: 37500,
                r#type: "buy".into(),
                volume: 1.25,
            },
            "kQH5HW/8p1uGOVjbgWA7FunAmGO8lsSUXNsu3eow76sz84Q18fWxnyRzBHCd3pd5nE9qa99HAZtuZuj6F1huXg==".into(),
        );
        assert_eq!(
            base64::encode(&sig),
            "4/dpxb3iT4tp/ZCVEwSnEsLxx0bqyhLpdfOpc6fn7OR8+UClSV5n9E6aSS8MPtnRfp32bAb0nmbRn6H8ndwLUQ==",
        )
    }
}
You can follow and match the signature function pretty much line by line against the Python code.
I'd like to write a custom attribute for a const item, which will later be accessed throughout my entire library.
Example
// default declaration in `my_lib`...
pub const INITIAL_VEC_CAPACITY: usize = 10;

//...but can be overridden by dependent crates...
#[mylib_initial_vec_capacity]
pub const INITIAL_VEC_CAPACITY: usize = 5;

//...then can be accessed within my crate:
pub fn do_something() {
    let mut vec = Vec::with_capacity(macros::INITIAL_VEC_CAPACITY);
    /* do stuff with vec */
}
How would I go about achieving this?
You can use Cargo features to let the user influence how your library is compiled.
https://doc.rust-lang.org/cargo/reference/features-examples.html
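A minimal sketch of that idea, assuming a made-up feature named small-initial-capacity declared under [features] in my_lib's Cargo.toml:

// In my_lib: the default value...
#[cfg(not(feature = "small-initial-capacity"))]
pub const INITIAL_VEC_CAPACITY: usize = 10;

// ...and the value used when a dependent crate enables the feature.
#[cfg(feature = "small-initial-capacity")]
pub const INITIAL_VEC_CAPACITY: usize = 5;

pub fn do_something() {
    let mut vec: Vec<u8> = Vec::with_capacity(INITIAL_VEC_CAPACITY);
    vec.push(1);
    /* do stuff with vec */
}

A dependent crate would then enable it with something like my_lib = { version = "...", features = ["small-initial-capacity"] } in its own Cargo.toml.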
Other than that, I think you should use a configuration object that the user passes into your code, or that can be configured from outside using one of the following crates:
https://crates.io/crates/config
https://crates.io/crates/config-derive/0.1.6
https://crates.io/crates/envconfig
I'm creating a Rust library and want to expose my Rust functions through C bindings to Dart. This question is just about the setup of actually exposing the Rust function through C bindings, not about how to call it in Dart.
This is my function which I want to expose through FFI:
pub fn create_channel(credential: String) -> Result<String, iota_streams::core::Error> {
    let seed = create_seed::new();

    // Create the Transport Client
    let client = Client::new_from_url(&dotenv::var("URL").unwrap());
    let mut author = Author::new(&seed, ChannelType::SingleBranch, client.clone());

    // Create the channel with an announcement message. Make sure to save the resulting link somewhere,
    let announcement_link = author.send_announce()?;
    // This link acts as a root for the channel itself
    let ann_link_string = announcement_link.to_string();

    // Author will now send signed encrypted messages in a chain
    let msg_inputs = vec![credential];
    let mut prev_msg_link = announcement_link;
    for input in &msg_inputs {
        let (msg_link, _seq_link) = author.send_signed_packet(
            &prev_msg_link,
            &Bytes::default(),
            &Bytes(input.as_bytes().to_vec()),
        )?;
        println!("Sent msg: {}", msg_link);
        prev_msg_link = msg_link;
    }
    Ok(ann_link_string)
}
The credential String is supposed to be just a stringified JSON object, which I want to provide from Dart through C bindings into Rust and then use inside my create_channel function. But I don't know how to define the type of my credential parameter, because it will come in as a C type and then needs to be converted to Rust.
#[no_mangle]
pub extern "C" fn create_channel(credential: *const raw::c_char) -> String {
    streams::create_channel(credential).unwrap()
}
Right now I'm just defining the parameter of my extern function to be of type c_char, but I would then need to convert this C type to a Rust String or &str so that I can use it inside of my actual create_channel function written in Rust.
As what type should I define the credential parameter, and how would I convert the c_char into a String or &str?
Rust has the convenient CStr and CString types. You can use CStr::from_ptr to wrap the raw string and then call to_str on it. Of course you need to do some error handling here for the case where the string isn't valid UTF-8.
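A minimal sketch of what that could look like, assuming the streams::create_channel function from the question; error handling is reduced to unwrap/expect, and free_channel_string is a made-up helper name for this example:

use std::ffi::{CStr, CString};
use std::os::raw::c_char;

#[no_mangle]
pub extern "C" fn create_channel(credential: *const c_char) -> *mut c_char {
    // Wrap the raw C string and convert it into an owned Rust String (must be valid UTF-8).
    let credential = unsafe { CStr::from_ptr(credential) }
        .to_str()
        .expect("credential is not valid UTF-8")
        .to_owned();

    // Call the plain Rust implementation from the question.
    let link = streams::create_channel(credential).unwrap();

    // Hand ownership of the resulting string to the caller as a C string.
    CString::new(link).unwrap().into_raw()
}

// The caller passes the returned string back here so Rust can free it.
#[no_mangle]
pub extern "C" fn free_channel_string(s: *mut c_char) {
    if !s.is_null() {
        unsafe { drop(CString::from_raw(s)) };
    }
}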
After reading the Rust book, I've decided to give it a try with WebAssembly. I'm creating a simple tracker script to practice and learn more about it. There are a couple of methods that need to access the window, navigator or cookie API. Every time I have to access any of those there is a lot of boilerplate code involved:
pub fn start() {
    let window = web_sys::window().unwrap();
    let document = window.document().unwrap();
    let html_document = document.dyn_into::<web_sys::HtmlDocument>().unwrap();
    let cookie = html_document.cookie().unwrap();
}
That's impractical and bothers me. Is there a smarter way to solve this? I've in fact tried to use lazy_static to have all of this in a global.rs file:
#[macro_use]
extern crate lazy_static;
use web_sys::*;

lazy_static! {
    static ref WINDOW: Window = {
        web_sys::window().unwrap()
    };
}
But the compilation fails with: *mut u8 cannot be shared between threads safely.
You could use the ? operator instead of unwrapping.
Instead of writing
pub fn start() {
    let window = web_sys::window().unwrap();
    let document = window.document().unwrap();
    let html_document = document.dyn_into::<web_sys::HtmlDocument>().unwrap();
    let cookie = html_document.cookie().unwrap();
}
You can write
use wasm_bindgen::{JsCast, JsValue};

pub fn start() -> Result<(), JsValue> {
    let cookie = web_sys::window()
        .ok_or("no window")?
        .document()
        .ok_or("no document")?
        .dyn_into::<web_sys::HtmlDocument>()?
        .cookie()?;
    Ok(())
}
It's about the same amount of code, but with less boilerplate, and for simpler cases it can become a one-liner.
If you really don't want to return a Result, you can wrap the whole thing in a closure (or a try block if you're happy using unstable features):
pub fn start() {
    let cookie = (|| -> Result<String, JsValue> {
        web_sys::window()
            .ok_or("no window")?
            .document()
            .ok_or("no document")?
            .dyn_into::<web_sys::HtmlDocument>()?
            .cookie()
    })()
    .unwrap();
}
If you find you don't like repeating this frequently, you can use helper functions:
fn document() -> Result<web_sys::Document, JsValue> {
    web_sys::window()
        .ok_or("no window")?
        .document()
        .ok_or_else(|| JsValue::from("no document"))
}

fn html() -> Result<web_sys::HtmlDocument, JsValue> {
    Ok(document()?.dyn_into::<web_sys::HtmlDocument>()?)
}

fn cookie() -> Result<String, JsValue> {
    html()?.cookie()
}

pub fn start() -> Result<(), JsValue> {
    let cookie = cookie()?;
    Ok(())
}
That's impractical and bothers me.
I'm not sure what your exact issue is here, but if you access the same cookie again and again in your application, perhaps you can save it in a struct and just use that struct? In my recent WebAssembly project I saved some of the elements I used in a struct and passed that struct around.
I also think that explaining your specific use case might lead to more specific answers :)
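As a sketch of that idea (the struct name Ctx and its methods are made up for this example), the lookups happen once and the handles are kept around afterwards:

use wasm_bindgen::JsCast;
use web_sys::{Document, HtmlDocument, Window};

pub struct Ctx {
    window: Window,
    document: Document,
    html_document: HtmlDocument,
}

impl Ctx {
    pub fn new() -> Option<Ctx> {
        let window = web_sys::window()?;
        let document = window.document()?;
        let html_document = document.clone().dyn_into::<HtmlDocument>().ok()?;
        Some(Ctx { window, document, html_document })
    }

    pub fn cookie(&self) -> Option<String> {
        self.html_document.cookie().ok()
    }
}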
I am trying to use node-ffi with a dynamic library generated from Rust. This is the Rust repository: https://github.com/petrachor/pairing-ariel. How can I get JavaScript to properly call the Rust function and return the expected result?
To compile the Rust code, first change crate-type in Cargo.toml to ["dylib"], then run cargo build --release.
#[repr(C)]
#[derive(Debug)]
pub struct ArrayStruct<T> {
    d: *mut T,
    len: usize,
}

#[no_mangle]
pub extern "C" fn g2_get_one(g: ArrayStruct<u8>) -> bool {
    return panic::catch_unwind(|| {
        g2_to_raw(G2Affine::get_generator(), g);
    }).is_ok();
}
My Node code to call Rust via FFI:
var ref = require('ref');
var FFI = require('ffi');
var Struct = require('ref-struct');
//var IArrayType = require('ref-array');

var ArrayStruct8 = Struct({
  'd': "uchar*",
  'len': "int32"
});

var lib = new FFI.Library('target/release/libpairing', { 'g2_get_zero': [ ref.types.bool, [ ArrayStruct8 ] ] });

var buf = new Buffer.alloc(192);
var a8 = new ArrayStruct8({ d: buf, len: 192 });
lib.g2_get_zero(a8);
console.dir(a8);
I was expecting a8.d to contain the unsigned char* data. When I did console.log(a8.d), I got "#". There is something here I have not fixed yet; please help me.
This will get edited once I am at home and have my full set of tools available to get the actual, struct-based example to work.
The symptoms across the FFI boundary with the implementation given in the question are as follows:
Pointer integrity is kept (the same pointer surfaces on both sides of the FFI boundary)
Buffer integrity is not kept or updated. Coming from Rust to NodeJS, the buffer should have been updated, but is not.
The quick solution when dealing with low-field-count modification structures like this is to pass a fat pointer instead (pointer + length) as follows, for instance:
#[no_mangle]
pub extern "C" fn do_something_with_array(buf: *mut u8, len: u32) -> u32 {
    unsafe {
        buf.write_bytes(1, 3);
    }
    3
}
With the corresponding FFI definition across on the nodeJS front:
var lib = new FFI.Library('target/debug/libtest2', {
  'do_something_with_array': [ 'int', ['pointer', 'int'] ]
});

var buf = new Buffer.alloc(192);
var new_len = lib.do_something_with_array(buf, 192);
var new_buf = buf.slice(0, new_len);
It seems like ref-struct requires a sync of some sort, as the underlying memory has the right content.
While trying to figure out how to get buffer integrity kept or updated, I noticed that I do not really need ref-struct or anything complex. I think I am making use of the C language's internal struct layout: just a pointer and its length. For example:
#[repr(C)]
#[derive(Debug)]
pub struct ArrayStruct<T> {
    d: *mut T,
    len: usize,
}

#[no_mangle]
pub extern "C" fn g2_get_one(g: ArrayStruct<u8>) -> bool {
    return panic::catch_unwind(|| {
        g2_to_raw(G2Affine::get_generator(), g);
    }).is_ok();
}
To call g2_get_one from Node.js, do the following:
var lib = new FFI.Library('target/release/libpairing', {
  'g2_get_one': [ 'bool', ['pointer', 'int'] ]
});

var buf = new Buffer.alloc(192);
var ok = lib.g2_get_one(buf, 192);
As long as one follows this simple rule, you can have multiple such structs as parameters in Rust and the same layout rule will be obeyed.
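For instance, a sketch with an illustrative function name, passing two buffers as two (pointer, length) pairs:

#[no_mangle]
pub extern "C" fn copy_buffer(src: *const u8, src_len: u32, dst: *mut u8, dst_len: u32) -> u32 {
    let n = src_len.min(dst_len) as usize;
    unsafe {
        // Copy min(src_len, dst_len) bytes from src into dst.
        std::ptr::copy_nonoverlapping(src, dst, n);
    }
    n as u32
}

On the Node side this would be declared as [ 'int', ['pointer', 'int', 'pointer', 'int'] ], mirroring the example above.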