We recently decided to play some video in the browser at my company, and we want to support Safari, Firefox and Chrome. To stream video, Safari requires that we implement HTTP range requests in ServiceStack. Our server advertises support for range requests, as indicated by the 'Accept-Ranges: bytes' header returned in the response.
Looking at previous questions, it seems we want to add a pre-request filter, but I don't understand the details of doing so. Adding this to our AppHost.cs's Configure method does do something:
PreRequestFilters.Add((req, res) => {
    if (req.GetHeader(HttpHeaders.Range) != null) {
        var rangeHeader = req.GetHeader(HttpHeaders.Range);
        rangeHeader.ExtractHttpRanges(req.ContentLength, out var rangeStart, out var rangeEnd);
        if (rangeEnd > req.ContentLength - 1) {
            rangeEnd = req.ContentLength - 1;
        }
        res.AddHttpRangeResponseHeaders(rangeStart, rangeEnd, req.ContentLength);
    }
});
Setting a breakpoint, I can see that this code is hit. However, rangeEnd always equals -1 and ContentLength always equals 0. A rangeEnd of -1 is invalid per the spec, so something is wrong. Most importantly, adding this code breaks video playback in Chrome as well, although it does not break the loading of pictures. I'm not sure I'm on the right track.
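For reference, here is my understanding of the arithmetic a valid 206 response needs, sketched in plain JavaScript rather than ServiceStack code (purely illustrative): the length that matters should be the size of the file being served, not the request's ContentLength, which is 0 for a GET request.
// Illustrative sketch, not ServiceStack: given the Range request header and the
// size of the file on disk, compute the headers a 206 response should carry.
// (Suffix ranges like "bytes=-500" are not handled here.)
function buildRangeHeaders(rangeHeader, fileSize) {
    const [startStr, endStr] = rangeHeader.replace("bytes=", "").split("-");
    const start = startStr ? parseInt(startStr, 10) : 0;
    const end = endStr ? Math.min(parseInt(endStr, 10), fileSize - 1) : fileSize - 1;

    return {
        status: 206,
        "Accept-Ranges": "bytes",
        "Content-Range": `bytes ${start}-${end}/${fileSize}`, // byte positions, not a length
        "Content-Length": end - start + 1
    };
}

// buildRangeHeaders("bytes=0-", 1000) ->
//   { status: 206, "Content-Range": "bytes 0-999/1000", "Content-Length": 1000, ... }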
If you would like the details of the request/response headers from the network tab, let me know.
For context, I am developing a synthetic monitoring tool using Node.js and Puppeteer.
For each step of a defined scenario, I capture a screenshot, a waterfall and performance metrics.
My problem is with the waterfall: I previously used puppeteer-har, but this package is not able to capture requests outside of a navigation.
Therefore I use this piece of code to capture all the interesting requests:
const {harFromMessages} = require('chrome-har');
// Event types to observe for waterfall saving (probably overkill, I just set all events of Page and Network)
const observe = [
'Page.domContentEventFired',
'Page.fileChooserOpened',
'Page.frameAttached',
'Page.frameDetached',
'Page.frameNavigated',
'Page.interstitialHidden',
'Page.interstitialShown',
'Page.javascriptDialogClosed',
'Page.javascriptDialogOpening',
'Page.lifecycleEvent',
'Page.loadEventFired',
'Page.windowOpen',
'Page.frameClearedScheduledNavigation',
'Page.frameScheduledNavigation',
'Page.compilationCacheProduced',
'Page.downloadProgress',
'Page.downloadWillBegin',
'Page.frameRequestedNavigation',
'Page.frameResized',
'Page.frameStartedLoading',
'Page.frameStoppedLoading',
'Page.navigatedWithinDocument',
'Page.screencastFrame',
'Page.screencastVisibilityChanged',
'Network.dataReceived',
'Network.eventSourceMessageReceived',
'Network.loadingFailed',
'Network.loadingFinished',
'Network.requestServedFromCache',
'Network.requestWillBeSent',
'Network.responseReceived',
'Network.webSocketClosed',
'Network.webSocketCreated',
'Network.webSocketFrameError',
'Network.webSocketFrameReceived',
'Network.webSocketFrameSent',
'Network.webSocketHandshakeResponseReceived',
'Network.webSocketWillSendHandshakeRequest',
'Network.requestWillBeSentExtraInfo',
'Network.resourceChangedPriority',
'Network.responseReceivedExtraInfo',
'Network.signedExchangeReceived',
'Network.requestIntercepted'
];
At the start of the step:
// list of events for converting to HAR
const events = [];

client = await page.target().createCDPSession();
await client.send('Page.enable');
await client.send('Network.enable');

observe.forEach(method => {
    client.on(method, params => {
        events.push({ method, params });
    });
});
At the end of the step:
waterfall = await harFromMessages(events);
It works well for navigation events, and also for navigation inside a web application.
However, the web application I am trying to monitor has iframes holding the main content.
I would like to see the iframes' requests in my waterfall.
So, a few questions:
Why doesn't Network.responseReceived (or any other event) capture these requests?
Is it possible to capture such requests?
So far I've read the DevTools Protocol documentation and found nothing I could use.
The closest thing to my problem I have found is this question:
How can I receive events for an embedded iframe using Chrome Devtools Protocol?
My guess is that I have to enable the Network domain for each iframe I encounter.
I haven't found any way to do this. If there is a way to do it with the DevTools Protocol, I should have no problem implementing it with Node.js and Puppeteer.
Thanks for your insights!
EDIT 18/08:
After more searching on the subject, mostly around out-of-process iframes, many people on the internet point to this response:
https://bugs.chromium.org/p/chromium/issues/detail?id=924937#c13
The answer in question states:
Note that the easiest workaround is the --disable-features flag.
That said, to work with out-of-process iframes over DevTools protocol,
you need to use Target [1] domain:
Call Target.setAutoAttach with flatten=true;
You'll receive Target.attachedToTarget event with a sessionId for the iframe;
Treat that session as a separate "page" in chrome-remote-interface. Send separate protocol messages with additional sessionId field:
{id: 3, sessionId: "", method: "Runtime.enable", params: {}}
You'll get responses and events with the same "sessionId" field which means they are coming from that frame. For example:
{sessionId: "", method: "Runtime.consoleAPICalled", params: {...}}
However, I'm still not able to implement it.
I'm trying this, mostly based on Puppeteer:
const events = [];
const targets = await browser.targets();
const nbTargets = targets.length;

for (var i = 0; i < nbTargets; i++) {
    console.log(targets[i].type());
    if (targets[i].type() === 'page') {
        client = await targets[i].createCDPSession();
        await client.send("Target.setAutoAttach", {
            autoAttach: true,
            flatten: true,
            windowOpen: true,
            waitForDebuggerOnStart: false // is set to false in pptr
        });
        await client.send('Page.enable');
        await client.send('Network.enable');
        observe.forEach(method => {
            client.on(method, params => {
                events.push({ method, params });
            });
        });
    }
}
But I still don't get the expected output for navigation inside a web application within an iframe.
However, I am able to capture all the requests during the step where the iframe is loaded.
What I am missing are the requests that happen outside of a proper navigation.
Does anyone have an idea how to integrate the Chromium response above into Puppeteer? Thanks!
I was looking on the wrong side all this time.
The Chrome network events are correctly captured, as I would have seen sooner if I had checked the "events" variable earlier.
The problem comes from the "chrome-har" package that I use in:
waterfall = await harFromMessages(events);
The package expects the page and iframe main events to be present in the same batch of events as the requests; otherwise the request "can't be mapped to any page at the moment".
Since some steps of my scenario are navigations within the same web application (i.e. no navigation event), I didn't have these events, so chrome-har couldn't map the requests and returned an empty .har.
Hope this can help someone else; I messed up the debugging on this one...
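For anyone hitting the same wall, here is a rough sketch of the workaround I would try (my own illustration, not something from chrome-har's documentation): keep the events array alive across steps so the page-level events from the original navigation stay in the batch, and filter the resulting HAR entries per step if needed. The stepStartedAt variable is an assumption, recorded when the step begins.
// Illustrative sketch: one long-lived events array so chrome-har always sees the
// Page.* events it needs to map requests to a page, even for steps with no navigation.
const { harFromMessages } = require('chrome-har');

const events = [];   // filled for the whole scenario, never cleared between steps

observe.forEach(method => {
    client.on(method, params => events.push({ method, params }));
});

// At the end of a step: build the HAR from everything captured so far, then keep
// only the entries that started after the step began (stepStartedAt is a Date
// recorded at the start of the step).
const har = harFromMessages(events);
har.log.entries = har.log.entries.filter(
    entry => new Date(entry.startedDateTime) >= stepStartedAt
);
const waterfall = har;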
On one page I have to get information from 8 different endpoints. 2 of them are outside of my application and sometimes they delay the display of data; the web browser waits until all the data is processed. Since they're outside of my app, I can't refactor them to make them faster, but I need to show the information they provide. In addition, sometimes one of them returns nothing; if so, I use default data to show to the user. The waiting time hurts the user experience.
I'm using promises to call these endpoints. Below is part of the code snippet that I am using.
The code is working fine; the issue is the delay.
First, here is the array that contains all the services I need to call:
var requests = [{
    // 0
    url: urlLocalApi + '/endpointURL_1/',
    headers: {
        'headers': 'apitoken'
    },
}, {
    // 1
    url: urlLocalApi + '/endpointURL_2/',
    headers: {
        'headers': 'apitoken'
    },
}];
The creation of this array is encapsulated in this method:
const requests = homePageFunctions.createRequest();
Now, here is how the data is processed. I am using both 'request-promise' and 'bluebird', plus a personal logger to check that everything goes fine.
const Promise = require("bluebird");
const request = require('request-promise');

var viewsHelper = {
    getPageData: function (requests) {
        return Promise.map(requests, function (obj) {
            return request(obj).then(function (body) {
                AppLogger.log(`Endpoint parsed`, statusLogger.infodate);
                return JSON.parse(body);
            });
        });
    }
};

module.exports = viewsHelper;
Here is how I call it:
viewsHelper.getPageData(requests)
    .then(results => {
        var output = [];
        for (var i = 0; i < results.length; i++) {
            output.push(results[i]);
        }
        // render data
        res.render('homepage/index', output);
        AppLogger.log(`PageData is rendered`, statusLogger.infodate);
    })
    .catch(err => {
        console.log(err);
    });
Note that each index of the "output" array contains the data returned by one endpoint.
The problem here is:
If any of the endpoints takes long, the entire chain is held up, even though the others have already been processed. The web page waits on a blank screen.
How can I prevent this behavior?
That is an interesting question, but I have some questions of my own in order to answer it effectively.
You have a Node server and a client (HTML/JS).
You have 8 endpoints, 2 of which are slow because you don't have control over them.
Is the client (page) aware of the 8 endpoints, i.e. do you make 8 calls every time you reload the page?
OR
Does the page make one request to your Node.js server, which then synchronously calls the 8 endpoints?
If it is 1, then lazy loading will work easily for you, since the page is making the requests.
If it is 2, lazy loading will only work on the server side; the client will still be blocked, because it doesn't know (or care) how you load your data. The page made one request and is blocked waiting for that request.
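For case 1, a rough sketch of client-side lazy loading (the endpoint paths, renderSection and defaultData below are placeholders I made up): render the page from the six fast endpoints first, then fetch the two slow external ones and patch their sections in when, or if, the data arrives.
// Client-side sketch: fill in the slow sections after the initial render.
// "/api/external-1", renderSection and defaultData are invented placeholders.
async function loadSlowSections() {
    const slowEndpoints = ["/api/external-1", "/api/external-2"];

    await Promise.all(slowEndpoints.map(async (url, i) => {
        try {
            const res = await fetch(url);
            const data = res.ok ? await res.json() : null;
            renderSection(i, data || defaultData[i]);   // fall back to defaults, as in the question
        } catch (err) {
            renderSection(i, defaultData[i]);           // network error -> defaults too
        }
    }));
}

document.addEventListener("DOMContentLoaded", loadSlowSections);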
Obviously, each method has pros and cons.
One way you can solve this is to asynchronously call those endpoints on the Node side and cache the results, so that when the page makes its one request, you already have the data ready.
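A rough sketch of that caching idea, assuming the same request-promise objects from the question (the refresh interval, the slowRequests array and the defaults are assumptions on my part):
// Server-side sketch: refresh the two slow external endpoints in the background
// and serve their last known (or default) data when the page is requested.
const request = require('request-promise');

const cache = {};   // url -> last successfully parsed body

function refreshSlowEndpoints(slowRequests) {
    slowRequests.forEach(function (obj) {
        request(obj)
            .then(function (body) { cache[obj.url] = JSON.parse(body); })
            .catch(function () { /* keep the previous value on failure */ });
    });
}

// e.g. refresh once a minute, independently of incoming page requests
setInterval(function () { refreshSlowEndpoints(slowRequests); }, 60 * 1000);

// In the route handler, read the slow endpoints from the cache instead of awaiting them:
function getSlowData(obj, defaults) {
    return cache[obj.url] || defaults;
}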
Again, we know very little about the situation and there are many ways to solve this.
Hope this helps.
[Context] - this turned out to be irrelevant; the issue at hand is a client-side thing.
I'm experiencing some strange response times from my Node.js/Express app.
According to the logs, the requests complete in 180-220 ms.
But from the web client's perspective, I'm seeing these numbers, which are very strange; some are in the range of seconds, and the payload is not that big, around 1.5k.
I've disabled every feature I could suspect: sessions, AD authentication, etc.
To add to the confusion, the overhead is only there sometimes, for maybe 30% of the requests; the others complete in the same time range as listed in the first picture.
I wish I could give more context, but it's a really simple React frontend using fetch to HTTP GET data from the /contacts endpoint.
Nothing more.
[EDIT 1]
This seems to be a client side thing.
The whole process is part of a virtual/infinite scroll React component.
If I scroll down using the down arrow key, response times stay normal.
If I scroll really fast using the scrollbar or touch, the response time goes up.
Important to know: the services do not return more data; it is always 20 rows at a time, yet the response time increases.
So, does scrolling in a browser somehow prevent promises or other, more native constructs from completing?
[EDIT 2]
Same behavior across Chrome, Safari, Firefox.
Scrolling fast seems to make the requests unable to complete in a timely manner.
Upgrading to React 16 seems to have improved the issue, or maybe those were just false positives.
[EDIT 3]
Even when replacing the call to the backend with a static JSON service online, the behavior still persists, so this has nothing to do with Express or Node.js.
There is something odd about scrolling and React+Fetch.
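One way to narrow this down (my own suggestion, not part of the original edits) is to timestamp the network call and the state update separately, to see whether the time is spent waiting on the response or waiting for the main thread to apply it. A variant of the loadMore method shown in EDIT 4 below:
// Instrumented variant of loadMore: logs network time vs. time until the state
// update (and re-render) actually lands while scrolling.
async loadMore() {
    const t0 = performance.now();
    const items = await this.props.fetchData(this.search, this.state.items.length);
    const t1 = performance.now();            // response received, promise resolved

    this.setState(
        { loading: false, items: [...this.state.items, ...items] },
        () => {
            const t2 = performance.now();    // state applied, component re-rendered
            console.log(`fetch: ${(t1 - t0).toFixed(0)} ms, apply/render: ${(t2 - t1).toFixed(0)} ms`);
        }
    );
}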
[EDIT 4]
Client side code snippets:
Event listeners
this.scrollHandler = this.checkWindowScroll;
this.resizeHandler = this.checkWindowScroll;
window.addEventListener("scroll", this.scrollHandler, { passive: true });
window.addEventListener("resize", this.resizeHandler, { passive: true });
ScrollHandler
//check if we have scrolled to the bottom of the screen
checkWindowScroll = () => {
if (this.state.loading) {
return;
}
const trigger = 800;
const pageBottom =
window.document.body.getBoundingClientRect().height - trigger;
const scrollBottom = window.pageYOffset + window.innerHeight;
if (scrollBottom > pageBottom) {
this.loadMore();
}
};
Fetch Next Data
//responsible for fetching another chunk of data from a backend
async loadMore() {
this.setState({ loading: true, error: undefined }); // begin load
const items = await this.props.fetchData(
this.search,
this.state.items.length
);
this.setState(
{
loading: false,
error: undefined,
items: [...this.state.items, ...items] // clone
},
() => this.checkWindowScroll() // once state is updated, check again if we need more data
);
}
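If the fast-scroll jank turns out to be main-thread contention, one thing worth trying (again my suggestion, not from the original post) is batching the scroll check into requestAnimationFrame, since getBoundingClientRect() forces layout and the handler currently runs on every scroll event:
// Throttled scroll handler: run checkWindowScroll at most once per animation frame.
this.scrollScheduled = false;

this.scrollHandler = () => {
    if (this.scrollScheduled) return;        // a check is already queued for this frame
    this.scrollScheduled = true;
    window.requestAnimationFrame(() => {
        this.scrollScheduled = false;
        this.checkWindowScroll();
    });
};

window.addEventListener("scroll", this.scrollHandler, { passive: true });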
This question may be a little overwhelming, but I feel close to understanding the way video seeking works in Google Chrome; it's still very confusing to me, though, and support is difficult to find.
If I am not mistaken, Chrome initially sends a request with the header Range: bytes=0- to test whether the server understands partial content requests, and expects the server to respond with status code 206.
I have read the following answers to get a better understanding:
Need more rep to link them; their topics are:
can't seek html5 video or audio in chrome
HTML5 video will not loop
HTTP Range header
My server is powered by Node.js and I am having trouble getting continuous range requests out of Chrome during playback. When a video is requested, the server receives bytes=0-, the server then responds with status code 206, and then the media player breaks.
My confusion is with the response headers, because I am not sure how to construct them and handle subsequent range requests:
Do I respond with a status code 200 or 206 initially?
When I respond with 206 I only receive bytes=0-, but when I respond with 200 I receive bytes=0- and, after that, bytes=355856504-.
If I subtract 355856504 from the total Content-Length of the video file, the result is 58, and bytes=0-58 seems like a valid Content-Range?
But after those two requests, I receive no more range requests from Chrome.
I am also unsure whether the Content-Range in the response header should look like "bytes=0-58" or like "bytes=0-58/355856562", for example.
Here is the code:
if (req.headers.range) console.info(req.headers.range); // prints bytes=0-

const type = rc.sync(media, 0, 32);          // determines mime type
const size = fs.statSync(media)["size"];     // determines content length

// String range, initially "bytes=0-" according to Chrome
var Strange = req.headers.range;

res.set({
    "Accept-Ranges": "bytes",
    "Content-Type": ft(type).mime,
    "Content-Length": size - Strange.replace(/bytes=/, "").split("-")[0],
    "Content-Range": Strange + size + "/" + size
});

//res.status(206); // one request from chrome, then breaks
res.status(200); // two requests from chrome, then breaks

// this prints 35585604-58, whereas I expect something like 0-58
console.log("should serve range: " +
    parseInt(Strange.replace(/bytes=/, "").split("-")[0]) + "-" +
    parseInt(size - Strange.replace(/bytes=/, "").split("-")[0])
);

// this function reads some bytes from 'media', and then streams it:
fs.createReadStream(media, {
    start: 0,
    end: parseInt(size - Strange.replace(/bytes=/, "").split("-")[0]) // 58
}).pipe(res);
Screenshots of the request and response headers when status code is 200:
first response and request headers
second response and request headers
Screenshot of the request and response header when status code is 206:
Need more rep to show another screenshot.
Essentially the request is:
"Range: bytes=0-"
and the Content-Range response is:
"bytes=0-355856562/355856562"
One apparent error is that you are returning an invalid value in the Content-Range header. See the specification: it should be bytes 0-355856561/355856562, since the second value after the dash is the last byte position, not the length.
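To make that concrete, here is a minimal sketch of a range handler (assuming Express, and reusing media and the ft(type).mime lookup from the question; this is not the poster's original code):
const fs = require("fs");

// Minimal illustrative range handler: honours "bytes=start-" and "bytes=start-end".
app.get("/video", (req, res) => {
    const size = fs.statSync(media).size;
    const range = req.headers.range;                 // e.g. "bytes=0-" or "bytes=355856504-"

    if (!range) {
        // No Range header: plain 200 with the whole file.
        res.set({ "Accept-Ranges": "bytes", "Content-Type": ft(type).mime, "Content-Length": size });
        return fs.createReadStream(media).pipe(res);
    }

    const [startStr, endStr] = range.replace(/bytes=/, "").split("-");
    const start = parseInt(startStr, 10);
    const end = endStr ? parseInt(endStr, 10) : size - 1;   // open-ended range -> last byte

    res.status(206).set({
        "Accept-Ranges": "bytes",
        "Content-Type": ft(type).mime,
        "Content-Range": `bytes ${start}-${end}/${size}`,    // last byte position, not length
        "Content-Length": end - start + 1
    });
    fs.createReadStream(media, { start, end }).pipe(res);
});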
I am writing a webapp using Express.js.
My webapp does the following:
The user posts 100 JSON objects.
Each JSON object is processed via a service call.
Once the service call is completed, a session variable is incremented.
On incrementing the session variable, a server-sent event must be sent to the client to update a progress bar.
How do I listen for a change to a session variable in order to trigger a server-sent event?
Listening for a variable change is not the only solution I am after;
I just need to send a server-sent event once each JSON object is processed.
Any appropriate suggestion is welcome.
Edit (based on Alberto Zaccagni's comment)
My code looks like this:
function processRecords(cmRecords, requestObject, responseObject)
{
    for (var index = 0; index < cmRecords.length; index++)
    {
        post_options.body = cmRecords[index];
        request.post(post_options, function (err, res, body)
        {
            if (requestObject.session.processedcount)
                requestObject.session.processedcount = requestObject.session.processedcount + 1;
            else
                requestObject.session.processedcount = 1;

            if (err)
            {
                appLog.error('Error Occured %j', err);
            }
            else
            {
                appLog.debug('CMResponse: %j', body);
            }

            var percentage = (requestObject.session.processedcount / requestObject.session.totalCount) * 100;
            responseObject.set('Content-Type', 'text/event-stream');
            responseObject.json({ 'event': 'progress', 'data': percentage });
        });
    }
}
When the first record is updated, a server-sent event is triggered using the responseObject (the Express response object).
When the second record is updated and I try triggering a server-sent event using the same responseObject, I get an error saying I cannot set headers on a response that has already been sent.
It's hard to know exactly what the situation is without seeing the routes/actions you have in your main application...
However, I believe the issue you are running into is that you are trying to send two sets of headers to the client (browser), which is not allowed. The reason this is not allowed is because the browser does not allow you to change the content type of a response after you have sent the initial response...as it uses that as an indicator of how to process the response you are sending it. You can't change either of these (or any other headers) after you have sent them to a client once (one request -> one response -> one set of headers back to the client). This prevents your server from appearing schizophrenic (by switching from a "200 Ok" response to a "400 Bad Request," for example).
In this case, on the initial request, you are telling the client "Hey, this was a valid request and here is my response (via the status of 200 which is either set elsewhere or being assumed by ExpressJS), and please keep the communication channel open so I can send you updates (by setting your content type to text/event-stream)".
As far as how to "fix" this, there are many options. When I've done this, I've used the pub/sub feature of redis to act as the "pipe" that connects everything up. So, the flow has been like this:
Some client sends a request to /your-event-stream-url
In this request, you set up your Redis subscriber. Anything that comes in on this subscription can be handled however you want. In your case, you want to "send some data down the pipe to the client in a JSON object with at least a data attribute." After you have set up this client, you just return a response of "200 Ok" and set the content type to "text/event-stream." Redis will take care of the rest.
Then, another request is made to another URL endpoint which accomplishes the task of "posting a JSON object" by hitting /your-endpoint-that-processes-json. (Note: obviously this request may be made by the same user/browser...but the application doesn't know/care about that)
In this action, you do the processing of their JSON data, increment your counters, or do whatever...and return a 200 response. However, one of the things you'd do in this action is "publish" a message on the Redis channel your subscribers from step #1 are listening to so the clients get the updates. Technically, this action does not need to return anything to the client, assuming the user will have some type of feedback based on the 200-status code or on the server-sent event that is sent down the pipe...
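To make that flow concrete, here is a rough sketch using the classic node_redis (v3) API; the channel name, routes and payload are all made up for illustration:
const express = require('express');
const redis = require('redis');

const app = express();

// 1. SSE endpoint: each connected client gets its own Redis subscriber.
app.get('/events', function (req, res) {
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    });

    const subscriber = redis.createClient();
    subscriber.subscribe('progress');
    subscriber.on('message', function (channel, message) {
        res.write('event: progress\n');
        res.write('data: ' + message + '\n\n');
    });

    // Drop the subscriber when the browser closes the EventSource.
    req.on('close', function () { subscriber.quit(); });
});

// 2. Processing endpoint: publish an update after each JSON object is handled.
const publisher = redis.createClient();
app.post('/process', express.json(), function (req, res) {
    // ... process req.body here, then publish the new percentage ...
    publisher.publish('progress', JSON.stringify({ percentage: 42 }));
    res.sendStatus(200);
});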
A tangible example I can give you is this gist, which is part of this article. Note that the article is a couple years old at this point so some of the code may have to be tweaked a bit. Also note this is not guaranteed to be anything more than an example (ie: it has not been "load tested" or anything like that). However, it may help you get started.
I came up with a solution; please let me know if this is the right way to do it.
Will this solution work across sessions?
Server side Code
var events = require('events');
var progressEmitter = new events.EventEmitter();

exports.cleanseMatch = function (req, res)
{
    console.log('cleanseMatch Invoked');

    var progressTrigger = new events.EventEmitter;
    var id = '';
    var i = 1;

    id = setInterval(function () {
        req.session.percentage = (i / 10) * 100;
        i++;
        console.log('PCT is: ' + req.session.percentage);
        progressEmitter.emit('progress', req.session.percentage);
        if (i == 11) {
            req.session.percentage = 100;
            clearInterval(id);
            res.json({ 'data': 'test' });
        }
    }, 1000);
}

exports.progress = function (req, res)
{
    console.log('progress Invoked');
    // console.log('PCT is: ' + req.session.percentage);
    res.writeHead(200, { 'Content-Type': 'text/event-stream' });
    progressEmitter.on('progress', function (percentage) {
        console.log('progress event fired for: ' + percentage);
        res.write("event: progress\n");
        res.write("data: " + percentage + "\n\n");
    });
}
Client Side Code
var source = new EventSource('progress');
source.addEventListener('progress', function (e) {
    var percentage = JSON.parse(e.data);
    // update the progress bar in the client
    App.updateProgressBar(percentage);
}, false);
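One refinement worth noting on the server code above (my addition, not part of the original answer): every request to the progress endpoint registers a listener on progressEmitter that is never removed, so a long-running server will accumulate listeners. Removing the listener when the client disconnects avoids that:
exports.progress = function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/event-stream' });

    // Named handler so it can be removed again on disconnect.
    var onProgress = function (percentage) {
        res.write("event: progress\n");
        res.write("data: " + percentage + "\n\n");
    };
    progressEmitter.on('progress', onProgress);

    // The browser closed the EventSource: stop writing and drop the listener.
    req.on('close', function () {
        progressEmitter.removeListener('progress', onProgress);
    });
};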