How to 100% turn off Google personalization for search?

I have a problem: my colleague and I (both in Europe) are getting different results for the same Google search query, even though:
we added &pws=0 to the query
we use the browser in incognito mode.
Is there any way to turn off personalization completely?

It may be personalizing based on your location. You can get around this by going through a proxy (dreaded and unacceptable answer, I know). Google doesn't provide an easy option to disable it.
Try these Yoast plugins:
http://yoast.com/tools/seo/disable-personalized-search-plugin/
Or this Chrome extension:
http://www.redflymarketing.com/internet-marketing-tools/google-global/
Google allows you to set your search area as large as a country. By default, I set mine for the United States to remove any local bias that creeps into my results.
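The two query-parameter tricks mentioned in this thread (&pws=0 and a country-level bias) can be combined into one URL. A minimal TypeScript sketch; pws and gl are the real Google parameters discussed above, but note that Google may still vary results by IP address or datacenter, so this is not a guaranteed off switch:

```typescript
// Build a Google search URL with personalization-related parameters.
// pws=0 disables personalized web search; gl pins the country-level
// bias (here "us", as suggested in the answer above).
function buildSearchUrl(query: string, country: string): string {
  const params = new URLSearchParams({
    q: query,
    pws: "0",    // turn off personalized web search
    gl: country, // geolocation bias: two-letter country code
  });
  return `https://www.google.com/search?${params.toString()}`;
}

console.log(buildSearchUrl("typescript tutorial", "us"));
```

This only shapes the request; server-side factors (IP geolocation, A/B tests) can still make two people's results differ.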

Try going to the preferences page and see whether you and your colleague have different settings.
http://www.google.com/preferences

You may try using a different browser to get an "unpersonalized" search. There are some browsers that already claim to do this, but I haven't tested them myself...
There may be the problem of the search not being as complete as Google's can sometimes be. Google is famous for its search engine because of how differently it worked from other engines when it launched. That may be more or less true today, or other companies may have caught up - I'm not sure.
One such engine goes by the ridiculous name "DuckDuckGo." It is the first result to show up "for me" when searching "unfiltered web search engines".
Other than that, you may try contacting Google representatives to get answers more directly.

Related

Google servers see website differently

I Googled one of our sites today (gamestyling.com) and saw that the results were in Chinese. It looks like our site was hacked, but I see no traces of that. When opening the site, everything looks normal (no Chinese).
On further inspection it seems that Google doesn't see the website correctly:
I cannot verify in Google search console. When I use the meta tag it shows me it detected a completely different tag.
When running pagespeed insight the preview does show Chinese: https://developers.google.com/speed/pagespeed/insights/?url=gamestyling.com
Also, when running the site through a proxy it looks completely normal.
Any idea how I can get Google to see my site correctly or what is causing this issue?
UPDATE
I now have access to Google search console and found that someone already had access to the property (2nd user):
I cannot remove the user because the verification uses a meta tag that Google thinks is still in the header but which doesn't appear in my code. So I'm still not sure whether someone is playing tricks on Google or we've actually been hacked. Note: nothing has changed on the server itself.
UPDATE2
This article describes exactly what's going on: https://blog.sucuri.net/2015/09/malicious-google-search-console-verifications.html. I must say that's an amazing security flaw on Google's part...
I experienced this issue on one of my sites and resubmitted the website for review in Google Webmaster Tools. The search results in Google were corrected within a couple of days.

Chrome Extension won't appear by filtering in the Web Store, only by search

This has bothered me for a long time, and there is no way to contact Google for support, and their documentation doesn't cover this.
I think I must be missing something, but I can't for the life of me figure it out.
If you search for my extension manually, it shows up:
https://chrome.google.com/webstore/search/mortality
but if you just go to the store and apply the relevant filters
https://chrome.google.com/webstore/category/extensions
Extensions, Productivity, Runs Offline
It doesn't show up.
I have had it in the store for a few months now. Initially I thought it just took some time to show up, but I'm now fairly confident I'm missing something.
The manifest is correct (Compared to other apps that do show)
Region and language is correct
Has anybody seen this before, and know what the problem is?
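For reference, the "Runs Offline" filter is tied to the manifest's offline_enabled flag. A minimal manifest sketch (the name and version are placeholders; only offline_enabled is the field relevant to that filter):

```json
{
  "manifest_version": 2,
  "name": "Example Extension",
  "version": "1.0",
  "offline_enabled": true
}
```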
I've run into this issue before and contacted the CWS dev support team. It's because the Chrome Web Store doesn't list an item under its category if the user has already installed it. I can see your extension under the "Extensions, Productivity, Runs Offline" category properly, since I didn't install it. I think the Google CWS team is considering showing items under their categories in the future regardless of whether the user has installed them.
CWS works strangely. I think some extensions and apps Google doesn't favor are blocked and not shown at the top of categories. You can check this by scrolling to the bottom of a category to see them. Simply set the filter to 4 stars and you can find these apps in the place where the promoted apps and extensions end.

Confusion on google api query limit for chrome extension

I'm super new to making chrome extensions, but I really wanted to make one that let me highlight text and just do a simple same-page google image search of that text by clicking the extension button and opening a popup of the returned images from the query. So I made it and tested it using the deprecated google image search api. I want to put it live but I'm genuinely confused about the query limits. I have no intention to make money off of it in any way, considering the primary content of the extension is just a google image search. I just always hated having to open a new tab to search for images of a word I see on a website when surfing the web.
Also, is it even possible to upload it to the store when it's using the deprecated Google Image Search API, which still works for some reason even without a key? Or would I need to update it to use the Custom Search API, which has only 100 free queries per day? And can someone explain that? If it's an extension, and an end user clicks on the extension button and it queries the Google Custom Search API, I'd only have 99 queries left for that day? So only about 2-5 people could actively use it during the day before the limit is reached? I spent hours reading stuff but I still don't quite understand.
Don't use the Image Search API. It was deprecated in May 2011, with a best effort to keep it running for three years. It's now well past that best-effort timeframe, so it can disappear without notice, leaving your extension broken.
The Custom Search JSON/Atom API free tier is 100 searches per day for your entire application. That could be 100 people making one query each, or one person making 100 queries.
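To make the per-application quota concrete, here is a TypeScript sketch. The endpoint and the key/cx/searchType parameters are real Custom Search JSON API parameters; the key and engine ID values are placeholders you'd get from the Google API console:

```typescript
// The free quota is per API key (i.e., per application), not per user.
// Every end user's click consumes a query from the same daily pool.
const DAILY_FREE_QUOTA = 100;

function buildCustomSearchUrl(query: string): string {
  const params = new URLSearchParams({
    key: "YOUR_API_KEY",    // placeholder: your application's key
    cx: "YOUR_ENGINE_ID",   // placeholder: your search engine ID
    q: query,
    searchType: "image",    // request image results, as in the extension
  });
  return `https://www.googleapis.com/customsearch/v1?${params}`;
}

// 25 users doing 4 image searches each exhausts the free tier:
const usersPerDay = 25, searchesPerUser = 4;
console.log(usersPerDay * searchesPerUser >= DAILY_FREE_QUOTA); // true
```

So yes: with the free tier, the whole user base shares those 100 queries per day.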

Why do user agents / browsers lie

I have read several articles on feature detection and that it is more reliable than browser detection because browsers lie.
I couldn't find any information on why they lie. Does anyone know the reason why they would do that?
As far as I understand it, webmasters do browser sniffing to find the capabilities of a browser and limit what they send to it. If a browser lies about its capabilities, it will receive more from the webmaster. You can read more:
http://farukat.es/journal/2011/02/499-lest-we-forget-or-how-i-learned-whats-so-bad-about-browser-sniffing
http://webaim.org/blog/user-agent-string-history/
The reason is simple:
Because web sites look at the user agent string and make assumptions about the browser, which are then invalid when the browser is updated to a new version.
This has been going on almost since the beginning of the web. Browser vendors don't want their new versions to break the web, so they tweak the UA string to fool the code on existing sites.
Ultimately, if everyone used the UA string responsibly and updated their sites whenever new browser versions come out, then browsers wouldn't need to lie. But you have to admit, that's asking quite a lot.
Feature detection works better because when a new browser version comes out with that feature, the detection picks it up automatically, without either the browser or the site owner needing to do anything special.
Of course, there are times when feature detection doesn't work perfectly -- eg maybe if a feature exists but has bugs in a particular browser. In that case, yes, you may want to do browser detection as a fall-back. But in most cases, feature detection is a much better option.
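The feature-detection pattern described above can be sketched in TypeScript. This is a runtime-agnostic illustration; in a real page the object under test would be window or navigator:

```typescript
// Feature detection: test for the capability itself instead of
// parsing the user-agent string.
function hasFeature(host: object, name: string): boolean {
  return name in host && typeof (host as any)[name] === "function";
}

// Browser-style usage (would run in a page, not shown executing here):
//   if (hasFeature(window, "fetch")) { /* use fetch */ }
//   else { /* fall back to XMLHttpRequest */ }

// The same check works on any object:
console.log(hasFeature(JSON, "parse"));    // true
console.log(hasFeature(JSON, "teleport")); // false
```

Because the check asks "does this capability exist right now?", it keeps working when a new browser version ships the feature, which is exactly why it beats UA sniffing.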
Another, more modern reason is simply to avoid demands to install mobile apps (where product owners control what I can and can't do with the content. No thanks!).
Today Reddit started blocking the viewing of subreddits when it detects a mobile browser in the User-Agent, so I had to change mine just to be able to view content.

How to profile browser page load using Javascript (Library)?

I've been doing a lot of research on this, but I figure I could crowd-source what I have and see if anyone can offer additions. I want to be able to determine page load time using JS - not just page load as a single number, but as a breakdown.
First what I found was a new W3C Specification (Draft):
https://dvcs.w3.org/hg/webperf/raw-file/tip/specs/NavigationTiming/Overview.html
This would be perfect; however, it's limited to Chrome and IE, and it's still inconsistent between the browsers.
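The kind of breakdown the Navigation Timing draft enables can be sketched in TypeScript. The field names below match the W3C Navigation Timing (Level 1) marks; the sample millisecond values are made up for illustration - in a browser you would pass performance.timing instead:

```typescript
// Break a page load into phases from Navigation Timing marks.
interface NavTiming {
  navigationStart: number;
  domainLookupStart: number;
  domainLookupEnd: number;
  connectStart: number;
  connectEnd: number;
  requestStart: number;
  responseStart: number;
  responseEnd: number;
  loadEventEnd: number;
}

function breakdown(t: NavTiming) {
  return {
    dns: t.domainLookupEnd - t.domainLookupStart,
    tcp: t.connectEnd - t.connectStart,
    ttfb: t.responseStart - t.requestStart,    // time to first byte
    download: t.responseEnd - t.responseStart,
    total: t.loadEventEnd - t.navigationStart, // full page load
  };
}

// Sample data standing in for a browser's performance.timing:
const sample: NavTiming = {
  navigationStart: 0, domainLookupStart: 5, domainLookupEnd: 30,
  connectStart: 30, connectEnd: 80, requestStart: 80,
  responseStart: 230, responseEnd: 300, loadEventEnd: 1200,
};
console.log(breakdown(sample));
// { dns: 25, tcp: 50, ttfb: 150, download: 70, total: 1200 }
```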
But now I have found Real User Monitoring (RUM) by New Relic, which is based on a JavaScript library by Steve Souders. From what I can tell, they can determine the same data that I saw in the new W3C draft.
It seems that they are using HTTP Archive: http://code.google.com/p/httparchive/
However, I cannot seem to find any information on page performance or load, so I wasn't sure if I was looking at the correct library.
Now of course, if there is anything else out there that could provide more information on page profiling, I'd welcome the information.
Have a look at Boomerang.js (https://github.com/yahoo/boomerang) by Yahoo.
It should allow you to roll your own RUM, and it degrades gracefully, so you should still get some information from browsers without navigation.timing.
Also, if you've got access to Windows, have a play with dynaTrace's tools - they give quite a good insight into what is going on during page load (in IE and FF).