According to MDN, the correct approach is to send the header:
Link: </images/big.jpeg>; rel=prefetch
So my express syntax is:
res.header('Link', '</images/big.jpeg>; rel=prefetch');
and I see it land in my browser as:
Link:</images/big.jpeg>; rel=prefetch
But Chrome never attempts to download the image. The meta and link tag approaches work fine.
Am I setting the header wrong or is the browser failing to process the header value?
UPDATE: Okay, so it looks like I'm doing things right, but Chrome 43 and Chromium 43 on Linux/Ubuntu don't support this yet. This is working fine in Firefox 38.
Could it be that Chromium is just not showing the prefetching in the Network tab?
UPDATE 2: So it does look like Chrome/Chromium is hiding the file transfers from the Network tab. If someone can confirm this, I'd appreciate it.
To set prefetch for one file in Express 4+:
res.set('Link', '<static/js/file1.js>; rel=prefetch');
For multiple files:
res.set('Link', '<static/js/file1.js>; rel=prefetch, <static/js/file2.js>; rel=prefetch');
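As a fuller sketch (assuming Express 4; the route and asset paths are placeholders), set the header on the route that serves the page, before sending the body:

// Sketch: serve a page and tell the browser which assets to prefetch.
var express = require('express');
var app = express();

app.get('/', function (req, res) {
  res.set('Link', '<static/js/file1.js>; rel=prefetch, <static/js/file2.js>; rel=prefetch');
  res.send('<html><body>Hello</body></html>');
});

app.listen(3000);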
Do NOT attempt to test this in Chrome; Chrome will lie to you and show it as not working if you investigate under the Network tab. Always test with Firefox.
You can see me implementing this in a larger project in context here.
I am learning Node and wrote a simple app which returns just JSON data as a response.
There is no error, but I don't understand why there are five different resources (content.min.css, favicon.ico, jsonview-core.css, options.png, etc.) showing in my Network tab. Can anyone help me understand this?
If possible, how can I avoid sending these extra resources and just send the JSON data?
It's because of the Chrome extension that I am using.
I found something interesting about the favicon.ico file:
Modern browsers will show an icon to the left of the URL. This is known as the 'favicon.ico' and is typically fetched from website.com/favicon.ico. Your browser will automatically request it when browsing to different sites. If your browser receives a valid favicon.ico file, it will display this icon. If it fails, it will not display a special icon.
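You can't stop an extension from injecting its own resources from the server side, but for the favicon request itself a minimal Express sketch (the route paths are illustrative) is to answer /favicon.ico with 204 No Content so the browser's automatic request doesn't add a 404, and serve only JSON from the data route:

// Sketch: short-circuit the automatic favicon request and send JSON only.
var express = require('express');
var app = express();

app.get('/favicon.ico', function (req, res) {
  res.status(204).end();
});

app.get('/data', function (req, res) {
  res.json({ message: 'hello' });
});

app.listen(3000);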
According to this article:
Debugging React performance with React 16 and Chrome Devtools.
I want to inspect some performance aspects of my own website built with React as well.
But I cannot see the User Timing section in my DevTools.
My Chrome version is 62.0.3202.89.
Am I missing something?
I was able to make this work by appending ?react_perf to my URL. Not sure why this was so difficult to find... hope it works for you.
https://reactjs.org/blog/2016/11/16/react-v15.4.0.html#profiling-components-with-chrome-timeline
The User Timing tab was removed from Chrome in favour of the "Timings" tab, which combines some well-known timings, such as LCP and the OnLoad event, with custom User Timing API marks/measures.
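For reference, the custom marks/measures that the Timings track picks up come from the standard User Timing API; a minimal sketch (the names and the profiled function are placeholders):

// Sketch: mark the start and end of some work, then record a measure between them.
performance.mark('my-work-start');
doSomeWork(); // placeholder for the code being profiled
performance.mark('my-work-end');
performance.measure('my-work', 'my-work-start', 'my-work-end');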
You need to add ?react_perf to the URL.
It's required if you're using React 15:
https://reactjs.org/blog/2016/11/16/react-v15.4.0.html#profiling-components-with-chrome-timeline
It is not required for React 16, though:
https://reactjs.org/docs/optimizing-performance.html#profiling-components-with-the-chrome-performance-tab
The Heroku app I'm trying to get to work (code here):
https://github.com/heroku/facebook-template-nodejs
"Unsafe Javascript attempt to access frame with URL" errors occur when the page is loaded in chrome.
The login button takes you to facebook but does not actually log you into the app and gives the same errors.
Has anyone got this app to work on Chrome or can anyone advise as to how to patch it up?
P.S. It seems to work fine in Firefox.
Almost certain this is a cross domain policy issue, as stated above. Generally speaking, you just need to add the correct header info to the response.
Access-Control-Allow-Origin: *
In Node, I think it is just a matter of adding it as another header in the response, using
response.writeHead
See http://nodejs.org/api/http.html#http_response_writehead_statuscode_reasonphrase_headers
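A plain Node sketch (the handler body is just illustrative) of adding the header via writeHead:

// Sketch: include the CORS header among the response headers passed to writeHead.
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {
    'Access-Control-Allow-Origin': '*',
    'Content-Type': 'application/json'
  });
  res.end(JSON.stringify({ ok: true }));
}).listen(3000);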
Oh, and there are explicit instructions on how to do it if you're using Express. I see no reason why it can't work using plain old Node, then.
http://enable-cors.org/server_expressjs.html
So I looked at your link; in your case I think you just have to set the header info prior to using any other Express app methods.
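Based on the enable-cors page linked above, the Express version is a small middleware registered before any routes (sketch; the sample route is a placeholder):

// Sketch: add the CORS headers on every response, before the route handlers.
var express = require('express');
var app = express();

app.use(function (req, res, next) {
  res.header('Access-Control-Allow-Origin', '*');
  res.header('Access-Control-Allow-Headers', 'Origin, X-Requested-With, Content-Type, Accept');
  next();
});

app.get('/', function (req, res) {
  res.json({ ok: true });
});

app.listen(3000);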
As to why it works in Firefox and not Chrome, I'm not sure. Both have supported CORS for many versions. Maybe you have some Chrome extension that's interfering.
I'm wondering, is it even possible to handle the request for the XUL browser component to open a new window? I tried changing the window.open function, but it looks like it's never called.
Links that open in a new window are not opening in my application.
I found this page on the subject, but the provided solution shows no different behavior.
Any hint on this?
(By the way, I'm developing a standalone application, not a Firefox extension.)
I'm assuming you are in a XULRunner application, and that you are trying to load a chrome URL from a non-chrome source in a browser (e.g. HTTP or local file). While enabling UniversalXPConnect and UniversalBrowserWrite can be helpful, they are also a security risk (since any arbitrary script on the web could use them), so they tend to be disabled in browsers (for example, running that line in Firebug will give you an exception):
>>> netscape.security.PrivilegeManager.enablePrivilege("UniversalXPConnect UniversalBrowserWrite");
Error: A script from "http://stackoverflow.com" was denied UniversalXPConnect UniversalBrowserWrite privileges.
How about you try using codebase security principals and see if that makes a difference? (http://www.mozilla.org/projects/security/components/signed-scripts.html#codebase). For me in Firebug, it does allow me to get the additional permissions after I OK it with a big, nasty-looking dialog, but it still doesn't allow me to open a chrome URL with window.open. The next step is probably to try changing your conf file to use contentaccessible so that the relevant parts of your content are accessible (see https://developer.mozilla.org/en/Chrome_Registration#contentaccessible).
To avoid the nasty message when elevating permissions, you could try setting permissions for the right files automatically as described at http://forums.mozillazine.org/viewtopic.php?f=38&t=1769555.
Also, make sure you check the browser type (https://developer.mozilla.org/en/XUL/Attribute/browser.type). If the browser type is not chrome, then it might be worth making it chrome and seeing if that makes a difference.
If any of my assumptions are wrong get back to me and I will try something else.
Does normal JS not work?
window.open(url, windowname, flags);
There are two ways that I know of.
The first is to set the browser.chromeURL preference to a chrome URL that contains a <browser type="content-primary">. The page that the content window tried to open will load into the given browser.
The second is to set the property window.browserDOMWindow with an object that you define to implement the nsIBrowserDOMWindow interface. This allows you to divert the open call into a tab, if you are using a tabbed interface. Note: the tabbed browsing preferences must be set to allow windows to be diverted into tabs, otherwise XULRunner will fall back on browser.chromeURL.
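For the first approach, a sketch of the default preference you would put in the application's defaults/preferences/prefs.js (the chrome URL is a placeholder for your own package); the XUL file it points at should contain the <browser type="content-primary"> element that receives the new content:

// Sketch: divert content-opened windows into our own chrome window.
// popup.xul is assumed to contain <browser type="content-primary" flex="1"/>.
pref("browser.chromeURL", "chrome://myapp/content/popup.xul");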
I find that when I am doing web development there are a few browser plugins that are very useful to me.
For Firefox I am using:
Firebug - Great for inspecting the HTML elements and working with CSS.
YSlow for Firebug - Developed by Yahoo! and gives timing and tips about page resources.
Live HTTP headers - Lets you inspect the headers that are sent to your browser.
For IE I am using:
Fiddler - "a Web Debugging Proxy which logs all HTTP(S) traffic between your computer and the Internet"
I am always looking for other great tools to use. So what is everyone else using?
In addition to what you have:
Web Developer toolbar adds a lot of extra functionality (cookie, form, and image inspection, viewing the generated DOM, etc.).
HTML Validator - great for a quick check to make sure your pages are valid. Also good when there are display errors; you can quickly see whether they come from improperly generated HTML.
ColorZilla - I use this a lot to pull exact colors from a page to the clipboard.
Fireshot - takes screenshots and annotates them conveniently; helpful.
Extended Statusbar - modifies the status bar to show speed, percentage, time, and loaded size (useful for seeing how many images are being loaded, page weight, etc.).
ShowIP - displays the IP address of the current page in the status bar.
external IP - displays your external IP address in the status bar.
On a side note, I also find it useful to run these extensions in FirefoxPortable, so that I've got a browser setup specifically for development work with the relevant extensions installed, and to avoid slowing down or destabilizing my primary browser (eg. Firebug used to crash my browser all the time when accessing Gmail).
URL Params (Firefox extension) to view the POST and GET parameters of a webpage. Useful for checking your forms.
HttpFox
The one that prevents you from accessing StackOverflow is pretty useful.
All of these are Firefox plugins.
Firebug for JavaScript and CSS debugging. Firebug allows you, for example, to examine the DOM tree while JavaScript modifies it. Firebug is my main tool.
Live HTTP Headers for looking at what data actually is inside requests and responses.
Web Developer toolbar contains smaller utilities. For example, it can validate HTML and CSS.
Dust Me Selectors finds which pieces of CSS are unused.
IE Developer Toolbar
Venkman debugger for Firefox
Firecookie and console 2
How about TwitterFox, to help you use Twitter with developer colleagues and friends.
MeasureIt
For getting the exact size of items rendered on a page in FF.
Firebug - Also lets me see the JS requests being sent from one page to another and which data is being sent.
- I can see the data inside the JS variables.
- Replaces the Error Console. It also indicates in the status bar if it has found an error, so I can inspect it.
- Good for seeing the structure of the HTML when developing AJAX applications.