I'm developing an Android application where I connect to a server to get data.
I'm using the Volley library to perform my requests in a separate class from the controller class, and I put the data from the server into a global list that I read from the UI thread.
My problem is that when I call the function that initiates the call to the server, it starts the AsyncTask for the connection, doesn't wait for the data, and immediately calls the function that adds the data to the UI.
So can I wrap that initiating call in an AsyncTask of its own and have nested AsyncTasks, or will it behave the same way?
new AsyncTaskName().execute().get();
Try using get() when you execute the async task.
It will block until the async task has finished executing.
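For illustration, a minimal sketch of that pattern (AsyncTaskName, Item, serverUrl and addDataToUi are assumed names, not from the original project). Note that get() blocks the calling thread, so calling it on the UI thread freezes the app until the request returns:

try {
    // get() blocks until doInBackground() has returned its result
    List<Item> data = new AsyncTaskName().execute(serverUrl).get();
    addDataToUi(data); // runs only after the background work has completed
} catch (InterruptedException | ExecutionException e) {
    e.printStackTrace();
}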
In a web environment with Actix-Web, I want to write data to a database in the background (write-behind), asynchronously, so the request is not held up. This could also be calling a webhook or calling an API to send an email.
With Scala I would create a queue and use a thread pool (e.g. with ForkJoin) to fire and forget a task.
How would I do this in Rust with Actix-Web? (Actix actors?)
You would use actix_web::rt::spawn to execute an async function that runs independently.
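A minimal sketch of that approach (the Order type and write_behind function are made-up placeholders for the actual database write, webhook, or email call):

use actix_web::{post, web, HttpResponse, Responder};
use serde::Deserialize;

#[derive(Deserialize)]
struct Order { id: u64 }

// Hypothetical stand-in for the slow background work.
async fn write_behind(order: Order) {
    println!("persisting order {}", order.id);
}

#[post("/orders")]
async fn create_order(payload: web::Json<Order>) -> impl Responder {
    let order = payload.into_inner();
    // Fire and forget: the spawned future keeps running on the Actix runtime
    // after this handler returns, so the response is not held up.
    actix_web::rt::spawn(async move {
        write_behind(order).await;
    });
    HttpResponse::Accepted().finish()
}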
I have an AWS Lambda application built on an external library that contains an EventEmitter. On a certain event, I need to make an HTTP request. So I was using this code (simplified):
myEmitter.on("myEvent", async () => {
    setup();
    await doRequest();
    finishingWork();
});
What I understand happens is this:
My handler is called, but as soon as the doRequest function is called, a Promise is returned and the EventEmitter continues with the next handlers. When all that is done, the work of the handler can continue (finishingWork).
This works locally, because my Node.js process keeps running and any remaining events on the event loop are handled. The strange thing is that this doesn't seem to work on AWS Lambda, even if context.callbackWaitsForEmptyEventLoop is set to true.
In my logging I can see my handler enter the doRequest function, but nothing after the call into the library that makes the HTTP call (request-promise, which uses request). And the code doesn't continue when I make another request (which I would expect if callbackWaitsForEmptyEventLoop were set to false, which it isn't).
Has anyone experienced something similar and knows how to perform an asynchronous HTTP request in the handler of a Node.js event emitter, on AWS Lambda?
I had a similar issue as well: my event emitter logs all events normally until it runs into an async function. It works fine in ECS but not in Lambda, since the event emitter runs its listeners synchronously while Lambda exits once the response is returned.
In the end, I used await-event-emitter to solve the problem:
await emitter.emit('onUpdate', ...);
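Roughly, the shape of that (hedged: the exact require/import form varies between await-event-emitter versions, and doRequest is the questioner's HTTP call):

const AwaitEventEmitter = require('await-event-emitter'); // import shape may differ per version

const emitter = new AwaitEventEmitter();

emitter.on('onUpdate', async (payload) => {
    await doRequest(payload); // the HTTP request from the question
});

// Inside the async Lambda handler: emit() returns a promise that resolves
// only after all async listeners have finished, so await it before returning.
await emitter.emit('onUpdate', payload);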
If you know how to solve this, feel free to add another answer. But for now, the "solution" for us was to put the event handler code elsewhere in our codebase, so that it is executed asynchronously.
We were able to do that because there is only one place where the event is emitted, but the event handler approach would have been a cleaner solution. Unfortunately, it doesn't seem to be possible.
In Meteor JS code, I am using the HTTP.get method to call a server inside a method. I must return the result to the client, so I am wrapping this function with Meteor.wrapAsync to get a synchronous function:
var httpSync = Meteor.wrapAsync(HTTP.get, this);
var result = httpSync(myUrl);
My question is: will Meteor.wrapAsync(AsyncFunction) block other requests? Will it affect parallel execution of multiple requests?
It won't block the entire server. Meteor uses the fibers package to provide "synchronous looking" functions which don't block the entire server.
However, it will block other methods from the same user. If you want other methods from that user to run simultaneously, call this.unblock() inside the method:
On the server, methods from a given client run one at a time. The N+1th invocation from a client won't start until the Nth invocation returns. However, you can change this by calling this.unblock. This will allow the N+1th invocation to start running in a new fiber.
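For example, a minimal sketch (the method name fetchRemoteData and the myUrl argument are made up):

Meteor.methods({
    fetchRemoteData: function (myUrl) {
        // Let other methods from this client start while we wait on the HTTP call.
        this.unblock();
        return HTTP.get(myUrl);
    }
});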
By the way, you don't need to Meteor.wrapAsync HTTP.get, since it can already be used synchronously. wrapAsync is intended to be used with external libraries that are not designed for Meteor.
I am using Node.js along with Express. I have a REST API endpoint, which is:
app.get('/pushtoqueue/:id', function (req, res) {
    // do something
    callFunction(data, function () {
        // do sequential execution
    });
});
When the REST endpoint gets a request, it calls callFunction, and this function needs to be executed sequentially. That means when the endpoint is called the first time, the request should be processed; if another request arrives while callFunction is still executing, it should be queued until the first execution completes, and so on for future requests.
How can I achieve this in Node.js? Is there any way to queue requests?
You need a queue that auto-executes its elements. If the queue is empty, the first task you push is executed immediately. While that task is being executed, another task might be pushed; instead of running right away, it is just enqueued.
I have a module that does exactly this: deferred-queue
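A bare-bones sketch of that idea (this is not the deferred-queue API, just an illustration; callFunction stands in for the questioner's work):

const express = require('express');
const app = express();

// Placeholder for the questioner's callFunction
function callFunction(id, cb) { setTimeout(cb, 1000); }

const tasks = [];
let running = false;

function enqueue(task) {
    tasks.push(task);
    if (!running) runNext();   // queue was empty: start executing immediately
}

function runNext() {
    const task = tasks.shift();
    if (!task) { running = false; return; }
    running = true;
    task(runNext);             // each task calls the callback when it is done
}

app.get('/pushtoqueue/:id', function (req, res) {
    enqueue(function (done) {
        callFunction(req.params.id, function () {
            res.send('processed ' + req.params.id);
            done();
        });
    });
});

app.listen(3000);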
I am implementing FreeSWITCH ESL using Node.js, for which I am using the modesl module. It works fine and I can call dialplan tools using the execute function.
However, the execute function is asynchronous in the modesl module.
What I need is a synchronous call, so that when I call the execute function, execution waits until FreeSWITCH finishes executing that application.
In the code sample below, I get the output "ivr finished" before the playback has finished.
exports.process_ivr = function (conn, id) {
    conn.execute('answer');
    conn.execute('playback', '/root/before.wav');
    console.log('ivr finished');
};
As per modesl, there is no synchronous way of calling FreeSWITCH commands. Is there any other way to implement this using Node.js?
Try chaining the calls through execute's completion callback, so 'playback' only starts after 'answer' completes and the log only runs after the playback finishes:

conn.execute('answer', function () {
    conn.execute('playback', '/root/before.wav', function () {
        console.log('ivr finished');
    });
});