Started fullstackopen for the first time and getting a "favicon.ico" request in the network tab - frontend

I just started the fullstackopen web dev course, and one of the steps says that once you open the Developer Console and refresh the tab, two requests should appear: one HTTP GET request for the page and another for an image retrieved from the server. But whenever I refresh, I also get an extra request called "favicon.ico" every time, and the same thing happens in the following steps. The course explains all the other requests, but it never mentions this one, because it does not appear in their examples. So I just wanted to know why I am getting this extra request. The first image shows what the network tab should have displayed;
the second image shows what I get when I follow the instructions of the course.

As Akki mentioned, this is the icon you would see on a tab in the browser, like the StackOverflow icon for this page.
You are getting a status 200 response, so the browser does find it in your root.

Modern browsers show an icon to the left of the URL. This is known as the favicon and is typically fetched from website.com/favicon.ico. Your browser automatically requests it when you browse to different sites. If the browser receives a valid favicon.ico file, it displays that icon; if the request fails, it does not display a special icon.
In either case, browsing is not affected.
It would seem that you don't have a favicon in the root directory of your project, hence the unknown request.
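If you want that request to succeed, you can serve a favicon yourself. Here is a minimal sketch for an Express backend (the public/ directory and the port are assumptions, not part of the course material):

const express = require('express');
const app = express();

// express.static will answer GET /favicon.ico automatically
// once a favicon.ico file exists inside ./public
app.use(express.static('public'));

app.listen(3001);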

Related

Checking other users are viewing this page in Node Express

I have a NodeJs Express app, and when a particular user goes to a page/route I would like to detect and display whether another user is currently viewing that page as well. So, for example, it would say "Jerry is currently viewing this page" when someone else goes there.
Is there any easy/lightweight way to do this?
First off, with a regular web page, the server only knows who requested a page and when. It doesn't, all by itself, know whether that user is still viewing the page or not. The user could have closed the browser, typed something else in the URL bar, the computer could have gone to sleep, etc...
Second off, even if the page is still being displayed in the browser, you can't know whether someone is actually there at the computer or not. The best you could do is try to track activity in that web page (last mouse click, recent mouse movements over the page, etc...).
Then, to even have any idea whether the web page is still open in the browser, you need some way of tracking that. There are two possibilities I can think of:
You can have some Javascript in the web page that regularly (say, once every few minutes) sends a small ajax call to your server that basically just says "I'm still here". This won't detect the moment they leave the page, but if the server notices that the usual every-few-minutes ajax call didn't arrive, it can change the status of that user on that page to not there any more.
If the web page makes a webSocket or socket.io connection to the server and keeps that connection alive, then whenever the browser closes, the user closes that tab, or navigates to another page, that webSocket or socket.io connection is automatically closed and the server is notified that the socket closed. Using this technique, the server can know pretty much right away when the user leaves the page (see the sketch after this list).
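A minimal sketch of the second option, assuming socket.io is installed and each page identifies itself with a page query parameter (the event names and data shapes here are illustrative, not prescribed by the question):

const http = require('http');
const express = require('express');
const { Server } = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

// page path -> set of socket ids currently viewing it
const viewers = new Map();

// client side would connect with: io({ query: { page: location.pathname } })
io.on('connection', (socket) => {
  const page = socket.handshake.query.page;
  if (!viewers.has(page)) viewers.set(page, new Set());
  viewers.get(page).add(socket.id);
  io.emit('viewers', { page, count: viewers.get(page).size });

  // fires when the tab is closed or the user navigates away, so the
  // server learns almost immediately that this viewer is gone
  socket.on('disconnect', () => {
    viewers.get(page).delete(socket.id);
    io.emit('viewers', { page, count: viewers.get(page).size });
  });
});

server.listen(3000);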

High response times with Ajax TruClient script

We are facing an issue with an Ajax TruClient script when recording and replaying it. One transaction took more than 60 seconds to load the page, and the same behaviour was observed after executing the scenario in the Controller as well. But if we perform the same transaction manually, it takes only 8 seconds. There is a huge gap between the response times. Can anyone suggest a fix?
This happens because of external resource download attempts by the script, which you don't notice when you browse the page manually.
For example, if the page requests data from Google Analytics or Facebook and cannot access these sites (due to company restrictions, a firewall, etc.), the response time jumps up to 60 seconds (a timeout); when you browse manually you will not experience the timeout, since the browser behaves differently.
To resolve this issue, you should first find out which sites the script is attempting to download data from. You can do this using a browser's developer tools (such as the F12 tools in Google Chrome) and looking at the "Network" tab. Browse to the web page with this tab open and you should see the external HTTP requests. Make a list of these sites.
Once you know which external sites the page goes to, you can then use the Utils.removeAutoFilter JS command in your TruClient script:
From the TruClient Toolbox, choose "Misc" > "Evaluate Javascript Code" and add it as the first line of your script.
Then set the JS code in this action to:
Utils.removeAutoFilter(url, isIncluded);
For example, to prevent the script from downloading data from Facebook:
Utils.removeAutoFilter('http://facebook.com', true);
Utils.removeAutoFilter('https://facebook.com', true);
Utils.removeAutoFilter('http://www.facebook.com', true);
Utils.removeAutoFilter('https://www.facebook.com', true);

comet.c does not work with more than one page open in a browser

It works well when comet.c is opened in different browsers simultaneously, one page per browser.
But when I open two pages of comet.c in the same browser, whether Firefox or Chrome, only the first page receives and displays data.
The second page hangs until the first page is closed.
From the user's point of view, this is abnormal.
Can anyone tell what's wrong: the browser, push_list_add(), or comet.js?
All pages requested a frequency of one update per second.
It works well when comet.c is opened in different browsers simultaneously, one page per browser... but with several pages in a single browser only the first page works.
It looks like a client issue: if it were the server, then comet.c would not work with different client programs used simultaneously.
[solved]
On the client side, add a timestamp to the end of the URL in the form action. Each page then requests a unique URL, so the browser no longer holds the second page's identical GET back behind the first one:
var url = "/?comet.c&feed=livestock&delay="+escape(delay)+"&"+(new Date().getTime());
Done.
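For context, here is a minimal sketch of what the client-side polling loop might look like with the cache-buster applied (display() and the rest of the scaffolding are assumptions; only the URL line comes from the answer above):

function poll(delay) {
  // the timestamp makes every URL unique, so a second tab's request
  // cannot get queued behind an identical in-flight request
  var url = "/?comet.c&feed=livestock&delay=" + escape(delay) +
            "&" + (new Date().getTime());
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url, true);
  xhr.onload = function () {
    display(xhr.responseText); // display() is assumed to render the update
    poll(delay);               // immediately issue the next long poll
  };
  xhr.send();
}
poll(1);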

Pop-up message without opening a page

Well, this is the thing: I was browsing the internet, just some page, and suddenly a pop-up window from my ISP appeared on screen (to remind me I hadn't paid that day)... So, do you know how to send that kind of message without the user opening a particular webpage or installing anything on their PC?
I was browsing the internet, just some page...
Since your internet access goes through the ISP, they have the ability to manipulate the traffic passing through their hardware. This includes replacing the content of a site, appending content, or inserting content.
Obviously they can't alter the actual site, but if you request "foo.com" and you haven't paid your bill, they can return whatever they want in response to your request.
They could return <script>window.open()</script> + the HTML content of "foo.com". Invalid, but browsers will render it. They could return a warning page. They could return an HTTP status code. You get the idea.
If you didn't have your web browser open, then something is installed in the background (whether or not you know it). Most hardware comes with dozens of background services that try to be "helpful".

Google Chrome Extension - prevent cookie on jQuery ajax request or use a chrome.extension

I have a great working Chrome extension now.
It basically loops over a list of HTML of a web auction site; if a user has not paid to have their image shown in the main list, a default image is shown.
My plugin uses a jQuery Ajax request to load the auction page and find the main image, which it displays as a thumbnail for any missing images. WORKS GREAT.
The plugin finds the correct image URL, updates the HTML DOM to the new image, and sets a new width.
The issue is that the auction site tracks all page views and saves them to a "recently viewed" section of the site ("users can see any auctions they have clicked on").
ISSUE
- My plugin uses ajax and the cookies are sent with the jQuery ajax request. I am pretty sure I cannot modify the cookies in this request, so the auction site tracks the request, and any listing with a missing image now shows up in my "recently viewed" even though I never actually navigated to it.
Can I remove cookies for the ajax request? (I don't think I can.)
Can Chrome remove the cookie (only for the ajax requests)?
Could I get Chrome to make the request (e.g. curl, with no cookie)?
Just for the curious, here is a page with missing images on this auction site:
http://www.trademe.co.nz/Browse/SearchResults.aspx?searchType=all&searchString=toaster&type=Search&generalSearch_keypresses=9&generalSearch_suggested=0
Thanks for any input, John.
You can use the webRequest API to intercept and modify requests (including blanking headers). It cannot be used to modify requests created within the context of a Chrome extension, though. If you want to use this API for cookie-blanking purposes, you have to load the page in a non-extension context: either by creating a new tab, or by using an off-screen tab (via the experimental offscreenTabs API).
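A minimal sketch of the header-blanking idea with the (Manifest V2) webRequest API, assuming "webRequest", "webRequestBlocking" and matching host permissions in manifest.json, and that the request is made from a normal tab rather than from the extension context:

chrome.webRequest.onBeforeSendHeaders.addListener(
  function (details) {
    // drop the Cookie header so the site cannot tie the request to the user
    var headers = details.requestHeaders.filter(function (h) {
      return h.name.toLowerCase() !== 'cookie';
    });
    return { requestHeaders: headers };
  },
  { urls: ['*://www.trademe.co.nz/*'] },
  ['blocking', 'requestHeaders']
);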
Another option is to use the chrome.cookies API and bind an onChanged event. Then you can intercept cookie modifications and revert the changes using chrome.cookies.set (a simplified sketch follows).
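A sketch of that idea, simplified to delete (rather than restore) whatever cookie the auction site sets while the extension's ajax runs; the domain filter is an assumption for illustration:

chrome.cookies.onChanged.addListener(function (changeInfo) {
  var c = changeInfo.cookie;
  // react only to cookies being set (not removed) for the auction site
  if (changeInfo.removed || c.domain.indexOf('trademe.co.nz') === -1) return;
  var url = 'http' + (c.secure ? 's' : '') + '://' +
            c.domain.replace(/^\./, '') + c.path;
  chrome.cookies.remove({ url: url, name: c.name });
});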
The last option is to create a new window+tab in Incognito mode. This method is not reliable and should not be used:
The user can disallow extension access to Incognito mode.
The user could already have navigated to the page in Incognito mode, causing cookie fields to be populated.
It's disruptive: a new window is created.
Presumably this AJAX interaction is being run from a content script? Could you run it from the background page instead and pass the data to the content script? I believe the background page operates in a different context and shouldn't send the normal cookies.
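A sketch of the background-page messaging this suggests, for a Manifest V2 extension (auctionUrl is a placeholder, and whether this actually omits the site's cookies is the answerer's claim, not verified here):

// content script: ask the background page to fetch the auction page
chrome.runtime.sendMessage({ url: auctionUrl }, function (response) {
  // use response.html to find the main image and update the DOM
});

// background page: perform the fetch outside the page's context
chrome.runtime.onMessage.addListener(function (msg, sender, sendResponse) {
  fetch(msg.url)
    .then(function (r) { return r.text(); })
    .then(function (html) { sendResponse({ html: html }); });
  return true; // keep sendResponse alive for the async fetch
});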
