RxJava - timeout while using retryWhen - retrofit2

I need to send requests in 10 s intervals. That is, the next request should be sent 10 s after the response to the previous request has been received. The last request should start within 60 seconds.
For example:
0:00: 1st request: sent
0:05: 1st request: response
0:15: 2nd request: sent
0:45: 2nd request: response
0:55: 3rd request: sent
1:10: 3rd request: response
No more requests
This is my code:
mainRepository.getFlightSearchResults(uuid)
    .repeatWhen { it.delay(10, TimeUnit.SECONDS) }
    .timeout(60, TimeUnit.SECONDS)
    .observeOn(schedulerProvider.ui)
where getFlightSearchResults returns an Observable. Requests are sent as described above; however, they do not stop being sent after 60 s. How can I stop sending requests (rather than just stop receiving responses) after 60 s?

Solved using takeWhile (the timeout operator restarts its timer on every emission, so it never fires as long as responses keep arriving within 60 s of each other):
val startTimeMillis = System.currentTimeMillis()
mainRepository.getFlightSearchResults(uuid)
    .repeatWhen { it.delay(10, TimeUnit.SECONDS) }
    .takeWhile { System.currentTimeMillis() - startTimeMillis < 60_000 }
    .observeOn(schedulerProvider.ui)

Related

Node.js multiple requests - no responses

I am trying to get data from a web API with axios (Node.js). I need to execute approximately 200 requests with different URLs to fetch data for further analysis. I tried several HTTP libraries, but in every case I have the same issue: I never receive a success or error callback. The request just gets stuck somewhere.
async function sendRequest(url) {
    let resp = await axios.get(url);
    return resp.data;
}
I am calling this function in a for loop (for...of, so that url is the URL itself rather than an array index):
for (const url of urls) {
    try {
        setData(url)
    } catch (e) {
        console.log(e);
    }
}
async function setData(url) {
    var data = await sendRequest(url);
    // Set this data in global variable.
    globalData[url] = data;
}
I often receive this error:
Error: read ECONNRESET
I think this is all caused by firing too many requests in a short interval.
What should I do to get responses to all requests? My temporary fix is to periodically send 20 requests every 20 seconds (still not OK, but I do receive more responses), yet this is slow and takes too much time.
However, I need all the data from the 200 requests in one variable for further analysis, and awaiting each request sequentially takes far too long.
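A common middle ground between firing all 200 at once and going fully sequential is a small worker pool: a fixed number of workers pull URLs from a shared list, so at most N requests are in flight at a time while results still come back in input order. Below is a minimal sketch; mapWithConcurrency is an illustrative name, and the axios call is shown only in the commented usage line.

```javascript
// Run `fn` over `items` with at most `limit` calls in flight at once (a sketch).
async function mapWithConcurrency(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0;                          // index of the next item to claim
  async function worker() {
    while (next < items.length) {
      const i = next++;                  // safe: single-threaded, no await between read and increment
      results[i] = await fn(items[i], i);
    }
  }
  // start `limit` workers; each keeps pulling items until the list is drained
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, worker));
  return results;                        // same order as `items`
}

// usage sketch (assumes axios is available):
// mapWithConcurrency(urls, 20, async (url) => (await axios.get(url)).data)
//   .then((globalData) => { /* all 200 results in one array */ });
```

With limit around 10–20 this usually avoids ECONNRESET from connection exhaustion while finishing far faster than one-at-a-time.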

handle 102 Status (processing) in Node.js

I'm issuing a request from my server (Node 8.9.0 LTS) to another service, and I receive a 102 (Processing) status code:
const firstRes = await ajaxService.post('/data/search', params)
<-- issue a request with a timeout -->
ctx.data = secondRes.response.data //return the response to the client
The ajaxService returns an async function that uses axios to issue the request.
How can I retry the same request at an interval of 1 second, limited to 5 seconds in total (after which I return a timeout to the client), using async/await?
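One way to express this with async/await is a loop that re-issues the request, sleeps between attempts, and throws once the deadline passes. A sketch; `doRequest` stands in for the ajaxService/axios call, and only a `status` field on its result is assumed:

```javascript
// Promise-based sleep helper
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Re-issue `doRequest` every `intervalMs` until it returns a non-102 result,
// throwing once `timeoutMs` has elapsed overall.
async function pollUntilDone(doRequest, { intervalMs = 1000, timeoutMs = 5000 } = {}) {
  const deadline = Date.now() + timeoutMs;
  while (true) {
    const res = await doRequest();
    if (res.status !== 102) return res;          // real answer arrived
    if (Date.now() + intervalMs > deadline) {
      throw new Error('timeout');                // give up: report timeout to the client
    }
    await sleep(intervalMs);                     // wait before retrying
  }
}
```

In the handler this might be wrapped in try/catch, e.g. `ctx.data = (await pollUntilDone(() => ajaxService.post('/data/search', params))).response.data`, with the catch branch returning the timeout to the client.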

Node http.createServer how to buffer incoming requests

I'm building a small Node server that generates PDF files (using Nightmare.js). Each request calls createPage to generate one PDF.
The incoming requests tend to arrive around the same time, overloading the PC this runs on.
I need to buffer the incoming requests so that execution of some is delayed until some of the current requests have completed. How do I do this?
function createPage(o, final) {
    // generate pdf files
}
http.createServer(function (request, response) {
    var body = [];
    request.on('data', function (chunk) {
        body.push(chunk);
    }).on('end', function () {
        body = Buffer.concat(body).toString();
        var json = JSON.parse(body);
        createPage(json, function (status) {
            if (status === true) {
                response.writeHead(200, { 'Content-Length': 0 });
                console.log('status good');
            } else {
                response.writeHead(500, { 'Content-Type': 'text/html' });
                response.write(' ' + status);
            }
            response.end('\nEnd of Request \n');
        });
    });
}).listen(8007);
If I understand correctly, you want to keep accepting HTTP requests but throttle the rate at which createPage is invoked. If so, you probably need a slightly different design: as it stands, each subsequent client has to wait longer than the previous one to find out whether its request succeeded or failed.
Approach 1:
use a queue (rabbitmq, aws sqs, zeromq, kafka, etc).
Here's the basic workflow:
receive the request
generate a unique id
put a message on the queue that includes the data and the unique id
return the unique id to the client
the client periodically checks for the completion of the task using the unique id
Approach 2:
Use a queue with message duplexing.
receive the request
generate a correlation id and relate it to the http transaction
send message on queue to worker with correlation id
when worker completes, it sends the response back with the correlation id
server uses correlation id to find the http transaction and send the appropriate response to the client
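If a full message broker is more than this setup needs, the same throttling can be done in-process with a tiny FIFO queue that runs at most `limit` jobs at a time. A sketch; `createJobQueue` is an illustrative name, and each job is assumed to be a function returning a promise (e.g. a promisified createPage):

```javascript
// In-memory FIFO job queue with bounded concurrency (a sketch).
function createJobQueue(limit) {
  const waiting = [];
  let active = 0;

  function runNext() {
    if (active >= limit || waiting.length === 0) return;
    active++;
    const { job, done } = waiting.shift();         // strict FIFO order
    job().then(
      (result) => { active--; done(null, result); runNext(); },
      (err)    => { active--; done(err); runNext(); }
    );
  }

  return {
    // enqueue a job; `done(err, result)` fires whenever it eventually runs
    push(job, done) {
      waiting.push({ job, done });
      runNext();
    }
  };
}
```

In the server above, the request handler would call `queue.push(() => createPageAsPromise(json), (err, status) => ...)` instead of invoking createPage directly, so only `limit` PDFs render at once. Note the caveat from the answer still applies: clients late in the queue wait correspondingly longer for their response.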

How to send a response two or more times from one request

I'm using Express, and the POST handler looks like this:
router.post('/', function(req, res, next){
    var data = req.body;
    getRandom(data, function(value){
        res.json({value: value});
    });
});
The POST is sent through AJAX, and the textarea is then updated with the new data.
$.ajax({
    type: "POST",
    url: "/",
    data: JSON.stringify(datareq),
    dataType: 'json',
    contentType: 'application/json',
    success: function(x){
        $.each(x, function(index, value) {
            $('.textarea').append(value + '\n');
        });
    },
    error: function(x) {
        console.log(x + 'error');
    }
});
How can I send one POST and receive several responses, so the user sees one piece of data in the textarea when the callback finishes, then another, and so on until the end?
<textarea>
data 1 - 1sec
data 2 - 2sec leater
data 3 - 3 second later
...
</textarea>
I added the times (1 sec, ...) only to show that the callback has a lot of work to do between pieces of data.
Of course this doesn't work, because res.send() closes the connection, and I get an error.
So how can I achieve this: start sending data right after the POST request, giving the user each piece as soon as it is ready, instead of waiting for everything before responding?
You can't
Reason:
HTTP closes the connection after the response is sent. You cannot keep it open and send multiple separate responses to the client; the HTTP request/response model doesn't support it (although a single response body can be streamed to the client in chunks).
Solution 1:
Simply put a timer on the client side and request updates periodically.
Solution 2 (Recommended):
Use a socket and pass data through it. socket.io is a popular socket library for Node.js applications and is very easy to use: set up a connection, keep sending data from the server, and receive it on the client side.
Just to add on the answer. This answer explains why res.send closes the connection.

Node.js - How can I wait for something to be POSTed before I reply to a GET

I have 2 clients and one Node.js server URL: localhost:8888/ServerRequest. The first client GETs from this URL and waits up to 20 seconds to see whether the second client POSTs some data for it within that timeout period. If the second client POSTed before the timeout, that value is returned to the GET request; otherwise a default value is returned. I am not sure of the best way to implement this. I am trying something like this, but it is not working as desired:
function ServerRequest(response, postData, request)
{
    var id;
    if (request.method == "GET")
    {
        id = setTimeout(function( )
        {
            // handle timeout here
            console.log("Got a timeout, sending default value");
            cmd = "DefaultVal";
            response.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?><list id=\"20101001\"><com type=\"" + cmd + "\"></com></list>")
            response.end()
        }, 20000);
    }
    else if (request.method == "POST")
    {
        console.log("Received POST, sending POSTed value");
        cmd = postData;
        // Cancel timeout
        clearTimeout(id);
        console.log(" \n Received POST")
        response.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?><list id=\"20101001\"><com type=\"" + cmd + "\"></com></list>")
        response.end()
    }
}
Another approach I had in mind was to use two separate URLs: one for the GET request (/ServerRequest) and one for the POST request (/PostData). But then how would I pass the POSTed data from one URL handler to the other if it is received before the timeout?
EDIT: I think I now know exactly what I need: a long poll. The client sends a GET request and waits out a timeout period (the data might not be immediately available to consume, so it waits up to 20 seconds for some other client to POST data for the first client to consume). If the timeout occurs, a default value is returned in response to the GET request. I'm working from the longpoll implementation I found here; I'll update if I succeed. If someone can point me to or provide a better example, that would be helpful.
Edit: removed my original code after a more careful reading of the question.
The best solution would probably be WebSockets, since with a plain long-running GET the browser will appear to hang while it waits for up to 20 seconds.
Using a library like socket.io you can do this
var io = require('socket.io').listen(8888);

function postHandler(req, data, res){
    io.sockets.emit("response", data)
}
then client side
<script src="/socket.io/socket.io.js"></script>
<script>
    var socket = io.connect('http://localhost:8888');
    socket.on('response', function (data) {
        console.log(data);
    });
</script>
