NEAR protocol sandbox Promise::new().transfer() doesn't work - Rust

I am developing a NEAR protocol smart contract that has a function which sends $NEAR from the contract account to another account.
The function works fine on testnet, but when I try to run it inside the sandbox the $NEAR transfer never occurs, causing my tests to fail.
Even though the contract works on testnet, this problem is slowing down my entire testing pipeline and making the team lose confidence in many of our contract assertions.
The function code is basically:
pub fn retrieve_dev_funds(&mut self) -> Promise {
    let dev_account_id = self.owner_id.clone();
    let withdrawal_dev_balance = self.dev_balance.clone();
    self.dev_balance = 0;
    Promise::new(dev_account_id).transfer(withdrawal_dev_balance)
}
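(For context, a minimal skeleton of the contract this snippet appears to live in might look like the sketch below; the struct name and field types are assumptions based on how the fields are used and on near-sdk 4.x conventions, not code from the original project.)
use near_sdk::borsh::{self, BorshDeserialize, BorshSerialize};
use near_sdk::{near_bindgen, AccountId, Balance, PanicOnDefault, Promise};

#[near_bindgen]
#[derive(BorshDeserialize, BorshSerialize, PanicOnDefault)]
pub struct Contract {
    owner_id: AccountId,   // assumed: the dev/owner account that receives the withdrawal
    dev_balance: Balance,  // assumed: accumulated yoctoNEAR owed to the owner
}
// ...with retrieve_dev_funds defined inside a #[near_bindgen] impl Contract { ... } block.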
And the test code is:
let devBalance1 = contractState.dev_balance;
let devNearBalance1 = await devUser.getAccountBalance();
let promiseTransfer = await hodler1CoinUser.retrieve_dev_funds({ args: {} });
console.log(promiseTransfer);
await hodler1CoinUser.play({ args: { _bet_type: true, bet_size: betSize } });
let contractState2 = await hodler1CoinUser.get_contract_state({ args: {} });
let devBalance2 = contractState2.dev_balance;
let devNearBalance2 = await devUser.getAccountBalance();
assert.equal(devBalance2, "0");
assert.equal(BigNumber(devNearBalance1.total).plus(BigNumber(devBalance1)).comparedTo(BigNumber(devNearBalance2.total)), 0);
The test fails at the last assertion as devNearBalance2 is actually equal to devNearBalance1, since the Promise transfer never actually happens.

Related

Turning tokio_postgres client into a variable for reuse

I am trying to figure out a way to make my tokio_postgres client a variable that I can reuse in different parts of my app. Ideally, I'm trying to achieve something similar to the Prisma ORM in the Node world:
const prisma = new PrismaClient()
...
const user = await prisma.user.create({
  data: {
    name: 'Alice',
    email: 'alice@prisma.io',
  },
})
The code I have so far is:
async fn connect() -> Result<P::Client, PgError> {
    // Connect to the database.
    let (client, connection) =
        tokio_postgres::connect("host=localhost user=postgres", NoTls).await?;

    // The connection object performs the actual communication with the database,
    // so spawn it off to run on its own.
    tokio::spawn(async move {
        if let Err(e) = connection.await {
            eprintln!("connection error: {}", e);
        }
    });

    // Now we can execute a simple statement that just returns its parameter.
    let rows = client
        .query("SELECT $1::TEXT", &[&"hello world"])
        .await?;

    // And then check that we got back the same string we sent over.
    let value: &str = rows[0].get(0);
    assert_eq!(value, "hello world");

    return client;
}
However, I am getting the error:
expected type Result<tokio_postgres::Client, tokio_postgres::Error>
found struct tokio_postgres::Client
Any idea what I could be doing wrong here? I'm new to Rust and maybe I'm just bringing baggage from Node, but I haven't found any documentation on this and figured it would be good to have.
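(As the error message indicates, the declared return type is a Result while the function hands back a bare Client. Below is a minimal sketch of a version that should compile, assuming the goal is simply to return the Client for reuse; the concrete tokio_postgres::Client and tokio_postgres::Error types stand in for the undefined P::Client and PgError aliases, and the connection string is the one from the question.)
use tokio_postgres::{Client, Error, NoTls};

async fn connect() -> Result<Client, Error> {
    // Connect to the database.
    let (client, connection) =
        tokio_postgres::connect("host=localhost user=postgres", NoTls).await?;

    // The connection drives the actual I/O, so spawn it onto its own task.
    tokio::spawn(async move {
        if let Err(e) = connection.await {
            eprintln!("connection error: {}", e);
        }
    });

    // `?` already unwrapped the Result above; the success value must be wrapped
    // back into Ok because the function's return type is Result<Client, Error>.
    Ok(client)
}
The caller can then write let client = connect().await?; and pass that client around (for example behind an Arc) instead of reconnecting in every function.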

Hyperledger node.js failed to parse null

Can somebody help with an error that appears in HL Composer?
Error content: Error: SyntaxError: Failed to parse null: Unexpected token (377:39)
Line 377 is: let exists = await accounts.exists(element.destinationAcc)
let accounts = await getAssetRegistry(ns + '.Account');
let transactions = await getAssetRegistry(ns + '.Transactions');
let allTransactions = await query('pendingTransactions');
let allAccounts = await accounts.getAll();
if (allTransactions.length() > 0) {
    allTransactions.forEach(element => {
        if (element.status == 'PENDING') {
            let exists = await accounts.exists(element.destinationAcc);
            if (exists) {
                let destAcc = await allAccounts.find(element.destinationAcc);
This is a pretty standard mistake that JavaScript developers make and isn't related to Hyperledger Composer at all.
You are trying to perform an await inside a function that hasn't been declared async. However, even if you add the async keyword to the anonymous function you pass to forEach, it still won't work, because forEach does not wait for the promises those callbacks return.
So the solution is: don't use an array's forEach method to run an anonymous function with an await in it. Use an alternative way to iterate the allTransactions array, such as a plain for loop, as sketched below.
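(A minimal sketch of that approach, assuming the surrounding transaction processor function is itself declared async; the ns prefix, registry, query, and field names come from the question, and the function name is a placeholder.)
async function processPendingTransactions() {
    const accounts = await getAssetRegistry(ns + '.Account');
    const allTransactions = await query('pendingTransactions');
    // A for...of loop lets each await finish before the next iteration starts
    for (const element of allTransactions) {
        if (element.status == 'PENDING') {
            const exists = await accounts.exists(element.destinationAcc);
            if (exists) {
                // handle the confirmed transaction here
            }
        }
    }
}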

History of asset in Hyperledger Fabric

I am using Node.js to write chaincode and I want to get the history of a drug in a pharmaceutical supply chain. I deployed the chaincode and invoked the manufacture and buy contracts, which move the drug's current state from one owner to another. In fact, I just modified the commercial paper chaincode for this. The change in owner is reflected in the CouchDB database, but when I try to get the history of a drug by its key it doesn't work as expected.
Code I used:
const promiseOfIterator = this.ctx.stub.getHistoryForKey(drugKey);
const results = [];
for await (const keyMod of promiseOfIterator) {
    const resp = {
        timestamp: keyMod.timestamp,
        txid: keyMod.tx_id
    };
    if (keyMod.is_delete) {
        resp.data = 'KEY DELETED';
    } else {
        resp.data = keyMod.value.toString('utf8');
    }
    results.push(resp);
}
return results;
When I printed the results, it gives: []
And when I do this: Drug.fromBuffer(getDrugHistoryResponse); and print it, it gives Drug { class: 'org.medicochainnet.drug', key: ':', currentState: null }
How to make this work? What am I doing wrong here? Please help me.
The function
ctx.stub.getHistoryForKey(drugKey);
is asynchronous, so you need to await it:
const promiseOfIterator = await this.ctx.stub.getHistoryForKey(drugKey);
Then you can iterate over the result.
I have done it in a demo like this:
const promiseOfIterator = await this.ctx.stub.getHistoryForKey(drugKey);
const results = [];
while (true) {
    let res = await promiseOfIterator.next();
    // In the loop you have to check whether the iterator has a value or is done
    if (res.value) {
        // do your actions here, e.g. collect the history entry
        results.push(res.value);
    }
    if (res.done) {
        // close the iterator
        await promiseOfIterator.close();
        // exit the loop
        return results;
    }
}
Check the Mozilla documentation for more information about iterators in JavaScript: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Iterators_and_Generators

Fastest way to send many groups of HTTP requests using new async/await syntax and control the amount of workers

Most of the recent threads I have read say async is the better way to perform lots of I/O-bound work, such as sending HTTP requests and the like. I have tried to pick up async recently but am struggling to understand how to send many groups of requests in parallel, for example:
let client = reqwest::Client::new();
let mut requests = 0;

let get = client.get("https://somesite.com").send().await?;
let response = get.text().await?;

if response.contains("some stuff") {
    let get = client.get("https://somesite.com/something").send().await?;
    let response = get.text().await?;

    if response.contains("some new stuff") {
        requests += 1;
        println!("Got response {}", requests)
    }
}
This does what I want, but how can I run it in parallel and control the number of "worker threads", or whatever the async equivalent of a thread pool is?
I understand this is similar to this question, but mine is strictly about the nightly Rust async/await syntax and a more specific use case where groups of requests/tasks need to be done. I also find using combinators for these situations a bit confusing, and was hoping the newer style would make it a bit more readable.
Not sure if this is the fastest way, as I am just experimenting myself, but here is my solution:
let client = reqwest::Client::new();
let links = vec![ // A vec of strings representing links
    "example.net/a".to_owned(),
    "example.net/b".to_owned(),
    "example.net/c".to_owned(),
    "example.net/d".to_owned(),
];
let ref_client = &client; // Need this to prevent client from being moved into the first map

futures::stream::iter(links)
    .map(async move |link: String| {
        let res = ref_client.get(&link).send().await;
        // res.map(|res| res.text().await.unwrap().to_vec())
        match res { // This is where I would usually use `map`, but not sure how to await a future inside a Result
            Ok(res) => Ok(res.text().await.unwrap()),
            Err(err) => Err(err),
        }
    })
    .buffer_unordered(10) // Number of connections at the same time
    .filter_map(|c| future::ready(c.ok())) // Throw errors out, do your own error handling here
    .filter_map(|item| {
        if item.contains("abc") {
            future::ready(Some(item))
        } else {
            future::ready(None)
        }
    })
    .map(async move |sec_link| {
        let res = ref_client.get(&sec_link).send().await;
        match res {
            Ok(res) => Ok(res.text().await.unwrap()),
            Err(err) => Err(err),
        }
    })
    .buffer_unordered(10) // Number of connections for the secondary requests (so at most 20 connections concurrently)
    .filter_map(|c| future::ready(c.ok()))
    .for_each(|item| {
        println!("File received: {}", item);
        future::ready(())
    })
    .await;
This requires the #![feature(async_closure)] feature.
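(As a side note, the unstable async-closure syntax isn't strictly required for this pattern: an ordinary closure can return an async block instead. Below is a minimal, self-contained sketch of the same bounded-concurrency idea on stable Rust; the URLs and the limit of 10 are placeholders, and error handling is reduced to simply dropping failed requests.)
use futures::{stream, StreamExt};

#[tokio::main]
async fn main() {
    let client = reqwest::Client::new();
    let links = vec![
        "https://example.net/a".to_owned(),
        "https://example.net/b".to_owned(),
    ];

    let bodies: Vec<String> = stream::iter(links)
        .map(|link| {
            // Re-borrow the client so the async block can move a plain reference.
            let client = &client;
            async move {
                let res = client.get(&link).send().await?;
                res.text().await
            }
        })
        .buffer_unordered(10) // at most 10 requests in flight at once
        .filter_map(|result| futures::future::ready(result.ok())) // drop failures
        .collect()
        .await;

    println!("Fetched {} responses", bodies.len());
}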

Using ElementArrayFinder.filter() with async/await

I've been using the following function to filter element arrays for the past few years with Webdriver's Control Flow enabled:
filterElementsByText (elemList, comparator, locator) {
    return elemList.filter((elem) => {
        let searchTarget = locator ? elem.element(locator) : elem
        return searchTarget.getText().then((text) => text === comparator)
    })
}
I am now trying to migrate my repo to using async/await which requires turning off the Control Flow.
This transition has been mostly successful, but I'm having trouble with the function above. Intermittently, I am seeing this error:
Failed: java.net.ConnectException: Connection refused: connect
I am able to reproduce this issue with a test case I've written against https://angularjs.org, although it happens with much higher frequency against my own app.
let todoList = element.all(by.repeater('todo in todoList.todos'))
let todoText = element(by.model('todoList.todoText'))
let todoSubmit = element(by.css('[value="add"]'))

let addItem = async (itemLabel = 'write first protractor test') => {
    await todoText.sendKeys(itemLabel)
    return todoSubmit.click()
}

let filterElementsByText = (elemList, comparator, locator) => {
    return elemList.filter((elem) => {
        let searchTarget = locator ? elem.element(locator) : elem
        return searchTarget.getText().then((text) => {
            console.log(`Element text is: ${text}`)
            return text === comparator
        })
    })
}

describe('filter should', () => {
    beforeAll(async () => {
        browser.ignoreSynchronization = true
        await browser.get('https://angularjs.org')
        for (let i = 0; i < 10; i++) {
            await addItem(`item${i}`)
        }
        return addItem()
    })

    it('work', async () => {
        let filteredElements = await filterElementsByText(todoList, 'write first protractor test')
        return expect(filteredElements.length).toEqual(1)
    })
})
This is being run with the following set in Protractor's conf file:
SELENIUM_PROMISE_MANAGER: false
With the simplified test case it seems to occur on 5-10% of executions (although, anecdotally, it does seem to occur more frequently once it has occurred the first time).
My problem is that this feels like a bug in Webdriver, but I'm not sure what conditions would cause that error, so I'm not sure how to proceed.
For anyone reading and wondering, the problem with my own app was two-fold.
First, as described in the comments to the original question, ElementArrayFinder.filter() causes this error because it runs parallel requests for each element in the array.
Secondly (and not apparent in the original question), rather than passing an ElementArrayFinder as described in this test case, I was actually passing in a chained child of each element in the array such as:
element.all(by.repeater('todo in todoList.todos')).$$('span')
Looking at the Webdriver output as this happens, I noticed that this causes all of those locators to be retrieved in parallel, leading to the same error.
I was able to work around both issues by filtering this way:
let filterElementsByText = async (elemList, comparator, locator) => {
    let filteredElements = []
    let elems = await elemList
    for (let i = 0; i < elems.length; i++) {
        let elem = await elems[i]
        let searchTarget = locator ? elem.element(locator) : elem
        let text = await searchTarget.getText()
        if (text === comparator) {
            filteredElements.push(elem)
        }
    }
    return filteredElements
}
This unblocks me, but it still feels like an issue that these functions are just unusable with async/await.
