I have created a new website, www.bucketshowers.com, and I tried to index it using Google Webmaster Tools. Fetch as Google for desktop worked just fine, but doing the same for mobile shows the error "Temporarily unreachable". It's been a few days and the website REALLY is not available on mobile. It's driving me nuts. Here is some information and the things I have already tried:
The website is made with WordPress (WP).
I have disabled all SEO/meta-tag plugins and added a very basic robots.txt: http://bucketshowers.com/robots.txt
I tried waiting 15 minutes between mobile fetches of the root page.
I have checked the homepage source code to make sure there are no meta tags with nofollow or noindex values.
I'm baffled by this issue and would gladly take any advice/pointers on what else can be done. Thank you.
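In case it helps others reproduce this, the mobile fetch can be approximated outside Search Console with curl (a sketch; the user-agent string below is the smartphone Googlebot UA Google documented, so treat it as an assumption that may have changed):

UA="Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
# Headers only: look for a non-200 status, redirect loops, or rate limiting
curl -sIL -A "$UA" http://bucketshowers.com/
# Body as well: check whether a block page is served instead of the real homepage
curl -sL -A "$UA" http://bucketshowers.com/ | head -n 40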
The crazy thing is that it was caused by the WP Statistics plugin, which is probably the most popular of its kind (500k downloads). When I deactivated it, everything worked: Google fetches the mobile version and the website is available. Incredible! I'm still searching for the actual problem within that plugin.
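If anyone wants to bisect plugins the same way from the command line, WP-CLI makes it quick (a sketch assuming WP-CLI is installed and that wp-statistics is the plugin slug):

# Deactivate the suspect plugin, re-run the fetch test, then reactivate if it wasn't the cause
wp plugin deactivate wp-statistics
wp plugin activate wp-statistics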
We have a curriculum site hosted on new Google Sites and shared publicly. Anyone who visits the site gets the Google "We're Sorry" page and can't access the website without refreshing the page multiple times. It seems that after you finally get each page to show, future visits are fine. But as they begin to roll this site out to teachers, they need the link to work. This happens both via direct link access and when clicking the link in an email, and so far in both Chrome and Firefox in testing.
I've never seen this happen with Google Sites. There is nothing on the page that is unsafe, and there are no insecure embeds (just images and links to Google Drive docs).
I used https://transparencyreport.google.com/safe-browsing/search to test and it comes back safe.
Per request, I am including screenshots from the Network tab. However, I can no longer replicate this issue on my network or machines, while many teachers are still reporting it, so I am trying to get screenshots from them. In this first one, logimpressions is blocked for them but the site loaded; that is most likely caused by having uBlock enabled.
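For anyone else debugging this, capturing the HTTP status code behind the "We're Sorry" page should give Google support something concrete (a sketch; the URL below is a placeholder for the real site):

# Follow redirects, discard the body, and print only the final status code
curl -sL -o /dev/null -w "%{http_code}\n" "https://sites.google.com/view/your-curriculum-site"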
I have created a multilingual Umbraco website with three domain names pointing to it, one for each language. The site has gone live and people are starting to share links to it on LinkedIn and other social media. I have metadata in the website which should be picked up when these links are shared. When a link is shared on LinkedIn, it shows 'coming soon' as the strap-line, which is what was on the holding page months ago, suggesting the site isn't being re-scraped.
I used the Facebook link debugging tool, and it returned a runtime error with a 500 response code.
My co-worker insists that there is nothing wrong with the DNS and that there aren't any errors in the website's code, so I am wondering if anyone has any ideas why the website cannot be scraped?
It also has another issue, which may be related: one of the domains sometimes doesn't redirect to its www. version despite having a redirect configured in DNS.
Is there some specific Umbraco configuration that I may have missed? Or a bug within Umbraco that may cause this?
Aside from this issue the website is working fine; it is just that these scrapers seem unable to hit the website successfully.
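Two checks that might narrow this down, sketched with example.com standing in for one of the three domains (the user-agent strings are approximations of what Facebook's and LinkedIn's scrapers send, so treat them as assumptions):

# Does the server return 500 for the scrapers' user agents?
curl -sIL -A "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)" "https://example.com/"
curl -sIL -A "LinkedInBot/1.0 (compatible; Mozilla/5.0; +http://www.linkedin.com/)" "https://example.com/"
# Does the naked domain actually redirect to the www. version?
curl -sI "http://example.com/" | grep -iE '^(HTTP|location)'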
Do you have metadata set for the encoding? See https://www.w3.org/International/questions/qa-html-language-declarations. Probably a long shot.
I Googled one of our sites today (gamestyling.com) and saw that the results were in Chinese. It looks like our site was hacked, but I see no traces of that. When opening the site, everything looks normal (no Chinese).
On further inspection it seems that Google doesn't see the website correctly:
I cannot verify the site in Google Search Console. When I use the meta tag method, it tells me it detected a completely different tag.
When running PageSpeed Insights, the preview does show Chinese: https://developers.google.com/speed/pagespeed/insights/?url=gamestyling.com
Also, when running the site through a proxy it looks completely normal.
Any idea how I can get Google to see my site correctly or what is causing this issue?
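One way to see what Google is being served, versus a normal visitor, is to compare fetches under different user agents (a sketch; any diff output, or an unexpected verification tag, would point at cloaking):

# Fetch as a regular browser and as Googlebot, then compare
curl -sL "http://gamestyling.com/" -o browser.html
curl -sL -A "Googlebot/2.1 (+http://www.google.com/bot.html)" "http://gamestyling.com/" -o googlebot.html
diff browser.html googlebot.html | head -n 40
# Check which verification tags are actually served to Googlebot
grep -o 'google-site-verification[^>]*' googlebot.html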
UPDATE
I now have access to Google Search Console and found that someone else already had access to the property (a 2nd user):
I cannot remove the user, because their access relies on a meta tag that Google thinks is still in the header but which doesn't appear in my code. So I'm still not sure whether someone is playing tricks on Google or we've actually been hacked. Note: nothing has changed on the server itself.
UPDATE2
This article describes exactly what's going on: https://blog.sucuri.net/2015/09/malicious-google-search-console-verifications.html. I must say that's an amazing security flaw on Google's part...
I experienced this issue on one of my sites and resubmitted the website for review in Google Webmaster Tools. The Google search results were corrected within a couple of days.
We've discovered today that our Joomla website has been hacked by a pharmacy trojan.
It was difficult to discover because most users don't see it when visiting our website.
One user reported about 2 weeks ago that our site contains viagra/pharmacy spam.
We looked into it but found nothing. The conclusion was that the user's computer was infected.
Yesterday another user reported this problem, so I started to investigate again.
One hour later I discovered that the site is indeed infected.
When I visit this webpage with my web browser, all is fine:
http://www.outertech.com/en/bookmark-manager
But, if I do a google translate of this webpage I see the infection (viagra and cialis links):
http://translate.google.com/translate?sl=en&tl=de&js=n&prev=_t&hl=de&ie=UTF-8&u=http%3A%2F%2Fwww.outertech.com%2Fen%2Fbookmark-manager
The same happens if I use curl:
curl -L -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://www.outertech.com/en/bookmark-manager
As a next step I made a backup (Akeeba) of the website and transferred it to a local XAMPP installation for further investigation.
The local XAMPP installation of the website has the same problem, so the Joomla installation itself is indeed infected.
A visit to
http://localhost/en/bookmark-manager
shows no problems, but
curl -L -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://localhost/en/bookmark-manager
returns the viagra links.
I've looked for hours at the (mostly PHP) files, did a lot of greps, etc., but I cannot find anything suspicious.
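For reference, these are the kinds of greps I mean (a sketch; the patterns cover common PHP obfuscation primitives and user-agent checks, and certainly won't catch everything):

cd /path/to/joomla   # placeholder for the site root
# Common obfuscation primitives used to hide injected code
grep -rn --include='*.php' -E 'eval\(|base64_decode|gzinflate|str_rot13' .
# Code that branches on the visiting user agent, which is how the cloaking is keyed
grep -rn --include='*.php' 'HTTP_USER_AGENT' .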
VirusTotal and Google Webmaster Tools report the site as clean.
I did an audit on myjoomla.com, but no malware was found.
I would be really grateful if someone could point me in the right direction.
Where to look inside my Joomla installation for this hack?
I restored an older backup that was not infected to a local XAMPP installation, made a backup of the current site, and installed it into another local XAMPP instance. I then made a diff of all files between the two installations and found the hack in the application.php file (it was only one line). I removed the line and the hack died. I still don't know how the site got infected (all add-ons are the latest versions). I've changed the passwords as a security measure and am monitoring for this hack once a week.
edit: the myJoomla.com report did actually find the hack; I didn't read the report carefully enough.
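For anyone repeating this, the comparison boils down to a recursive diff (a sketch; the directory names and the file path are placeholders for wherever you unpack the two copies):

# -r recurse, -q only list which files differ; inspect each hit by hand
diff -rq ./clean-backup ./current-site
# Then view the actual change in a suspect file, e.g.:
diff ./clean-backup/some/path/application.php ./current-site/some/path/application.php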
We recently recovered and migrated a Joomla 1.5 site to 2.5, and the hack was found in the template files (index.php and various override files in the template's html/ directory).
The surprising thing was that about 1 in 10 of the articles had also been infected, i.e. when we searched the jos_content table, we found JavaScript embedded in the fulltext column. So I would suggest also looking there.
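A query along these lines will list the affected articles (a sketch; the user, database name, and the default jos_ table prefix should be adjusted to your install):

# `fulltext` is a reserved word in MySQL, hence the backticks
mysql -u user -p joomla_db -e \
  "SELECT id, title FROM jos_content WHERE \`fulltext\` LIKE '%<script%' OR introtext LIKE '%<script%';"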
Your best bet is to use a tool like myJoomla, as it was created specifically for this sort of thing on Joomla sites.
I also had this problem: if I visited a sub-page, the home page would load instead and show a lot of pharmacy gibberish, though only when I had Firefox's Firebug open. It turned out that in my template under /html there was a mysql.php file that shouldn't have been there. Luckily, I had created this template myself, so I deleted the template on the server and uploaded my original version, and the problem went away. Hope this helps.
Our website (Joomla 1.5, hosted on a VPS) was recently hacked. The attacker added a few PHP scripts that redirected to some ad sites. We have cleaned everything (or at least we think we have), and now everything works as it should.
However, links on Google (or Yahoo) that point to our website still try to include these PHP scripts (and return 404, as they are deleted now). Direct links from the browser work as they should.
We cleaned the site 10 days ago, so I do not think something is still cached on Google's servers; re-indexing should be done by now.
To reproduce this behavior:
Go to www.google.com
type in "anitex socks"
click any php link that starts with "anitexsocks.com"
You will get "The requested URL /wp-includes/client.php was not found on this server" and a 404 error
Refresh the page and everything works without issues
Why are only Google links causing trouble?
Any help is welcome. Thanks!
As for the reason why this is happening: I installed a Firefox add-on that blocks my browser's Referer header and then followed a Google link to your site, and it worked fine. Then I disabled the add-on and the problem started occurring again.
This shows that there is still some malicious code running on your website, checking all HTTP requests to see whether they come from Google (based on the HTTP Referer header) and redirecting them to /wp-includes/client.php if they do.
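You can reproduce this without a browser add-on by sending the same request with and without a Google Referer header, using curl's -e/--referer option:

# With a Google referrer: should trigger the redirect to /wp-includes/client.php
curl -sIL -e "https://www.google.com/" "http://anitexsocks.com/"
# Without a referrer: should load normally
curl -sIL "http://anitexsocks.com/"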
To try to determine where this code may lie, perform a recursive grep through all your www files on your server as well as your www configuration files; somewhere in there, there must still be a reference to that client.php script. Hopefully you can find and eliminate it.
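Concretely, something like this (a sketch; /var/www and /etc/apache2 are assumptions about your server layout):

# Search the web root and the server configuration for any mention of the injected script
grep -rn "client.php" /var/www /etc/apache2
# Referrer-based redirects also commonly hide in .htaccess rewrite rules
find /var/www -name ".htaccess" -exec grep -Hn "HTTP_REFERER" {} \;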
That said, if it were my site and I knew a hacker had had free rein over my server to do whatever they wanted, I would not mess around with trying to undo the damage; I would instead restore the most recent backup from before the site was hacked. You only have to miss one back door the hacker left in place and they can re-enter your site. After restoring backups, you should also upgrade or reconfigure the software they used to gain access in the first place, so they can't simply re-hack the site in the same manner.