I just discovered that one of my sites is being flagged by Chrome as malware.
Here's what Google Chrome is showing:
The website at ___ contains elements from the site ___, which appears to host malware – software that can hurt your computer or otherwise operate without your consent. Just visiting a site that contains malware can infect your computer.
My site uses Joomla 1.5 as a CMS and had a security issue: one of the templates that ships with Joomla ("beez", I think) contained a virus. I have since updated Joomla and removed the template, and I thought that fixed the problem.
Chrome, however, still flags my site as malware. Any ideas on how I can fix this?
Thanks!
Google is using the database of this organization for malware reports.
http://www.stopbadware.org/home/reportsearch
Check whether your site is listed there; if it is, go to Google Webmaster Tools, where you should be able to request a review.
http://www.stopbadware.org/home/reviewinfo
You might still have a problem with content that contains malware. When a WordPress site of mine got hacked, there was a stray piece of malware injected into the static HTML. You need to check every file the site serves :(
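One way to speed up that check is to scan the site's files for the kinds of injected markup attackers commonly leave behind. This is only a rough sketch, not a real malware scanner: the patterns are illustrative assumptions and the site root path is hypothetical.

```python
import re
from pathlib import Path

# Illustrative patterns only -- real infections vary widely.
SUSPICIOUS = [
    re.compile(rb"eval\s*\(\s*base64_decode", re.I),        # common PHP payload wrapper
    re.compile(rb"<iframe[^>]+display\s*:\s*none", re.I),   # hidden iframes
    re.compile(rb"document\.write\s*\(\s*unescape", re.I),  # obfuscated JS injection
]

SITE_ROOT = Path("/var/www/mysite")  # hypothetical path to the Joomla install

for path in SITE_ROOT.rglob("*"):
    if not path.is_file() or path.suffix.lower() not in {".php", ".html", ".htm", ".js"}:
        continue
    data = path.read_bytes()
    for pattern in SUSPICIOUS:
        if pattern.search(data):
            print(f"Suspicious content in {path}: {pattern.pattern!r}")
            break
```

Anything it flags still needs a manual look; the goal is only to narrow down which files to inspect.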
Good luck.
My website was hacked and my homepage was changed again and again. Are there any tools, or any ASP techniques, that can protect it from being edited?
I set the hidden, read-only, and system attributes on the index.asp files, but the hacker changed them again anyway.
Notes:
My site is hosted on a shared server
My website was compromised with a China Chopper web shell before
I have cleaned the hidden ASP files from the server
To put it bluntly: secure your server, and that will stop the hackers from editing your pages :)
It sounds like your server has been compromised at a higher level. If this is a hosted solution (by a 3rd-party company), they need to fix their servers. Unfortunately, I've seen smaller hosting companies never fix the problem; they just put the files back and blame "poor coding" when the problem is actually "stupid system admins that don't know what they are doing". If that is the case, move to a different host. If this is your own machine and you are hosting it yourself, rebuild the entire machine, because it sounds like it has been compromised.
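While the root cause is being fixed, it can at least help to detect unauthorised edits quickly. Below is a minimal sketch of a file-integrity check in Python; it is not something from the answers above, and the watched directory and the scan interval are assumptions.

```python
import hashlib
import time
from pathlib import Path

WATCHED_DIR = Path(r"C:\inetpub\wwwroot")  # hypothetical web root
CHECK_INTERVAL = 60                        # seconds between scans (assumption)

def snapshot(root: Path) -> dict[Path, str]:
    """Map every ASP/HTML file under root to its SHA-256 hash."""
    hashes = {}
    for path in root.rglob("*"):
        if path.is_file() and path.suffix.lower() in {".asp", ".aspx", ".html", ".htm"}:
            hashes[path] = hashlib.sha256(path.read_bytes()).hexdigest()
    return hashes

baseline = snapshot(WATCHED_DIR)
while True:
    time.sleep(CHECK_INTERVAL)
    current = snapshot(WATCHED_DIR)
    for path, digest in current.items():
        if path not in baseline:
            print(f"New file appeared: {path}")
        elif baseline[path] != digest:
            print(f"File modified: {path}")
    for path in baseline.keys() - current.keys():
        print(f"File deleted: {path}")
    baseline = current
```

This only detects tampering after the fact; it is no substitute for removing the web shell and closing the underlying hole.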
First, check your site for common vulnerabilities: there is a good chance it is exposed to attacks such as SQL injection, blind SQL injection, XSS, the Oracle padding attack, DotNetNuke exploits, etc.
Second, as @silver said, your host may be part of the problem. On IIS there is a well-known risk: if another site hosted on the same server has vulnerabilities, an attacker can gain access to the whole server through PHP, ASP, or ASPX web shells. So you also need to choose a good host :)
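Of the attacks listed above, SQL injection is the one most often fixed in application code: never build SQL by concatenating user input, always use parameterized queries. A minimal sketch using Python's built-in sqlite3 module (the table and values are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

user_input = "alice' OR '1'='1"  # a classic injection attempt

# Vulnerable: user input is pasted straight into the SQL string.
vulnerable = f"SELECT * FROM users WHERE name = '{user_input}'"
print(conn.execute(vulnerable).fetchall())   # returns rows it should not

# Safe: the driver passes the value as a parameter, never as SQL text.
safe = "SELECT * FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # returns nothing
```

The same principle applies in classic ASP: pass user input through parameterized ADO commands rather than concatenating it into the query string.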
For Google Analytics, I had to prove that I owned my domain. I added a TXT record to do this. I also had to prove to Microsoft that I owned my domain by uploading a file (BingSiteAuth.xml) to my site.
Now that I'm up and running with Analytics and Webmaster tools for Google and Bing, can I remove these verification records, or will that break analytics? Does leaving the record and file there pose any kind of security risk?
No, you shouldn't remove any of the verification files or DNS records. Google will periodically recheck your site and if it doesn't succeed you will lose access to WMT, for example. See this WMT support page:
Removing the record from your server can cause your site to become unverified, and you will need to go through the verification process again.
I'm not 100% sure, but I think Bing does the same. It makes sense: the domain owner or administrator roles might change, and you don't want anyone who ever had access to your site's data to keep that access forever.
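If you ever want to confirm that the verification TXT record is still in place, you can query it yourself. A small sketch using the third-party dnspython package (`pip install dnspython`); the domain name is a placeholder:

```python
import dns.resolver  # third-party: pip install dnspython

DOMAIN = "example.com"  # placeholder -- use your own domain

# Fetch all TXT records for the domain and look for the verification token.
answers = dns.resolver.resolve(DOMAIN, "TXT")
for rdata in answers:
    text = b"".join(rdata.strings).decode("utf-8", errors="replace")
    print(text)
    if text.startswith("google-site-verification="):
        print("  -> Google site verification record is present")
```

A similar HTTP request for /BingSiteAuth.xml would confirm that the Bing verification file is still being served.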
I am building a question-and-answer site by myself.
I want this site to be indexed by Google as a Q&A site or forum, so that it can be retrieved with the "Discussions" filter in Google Search. In my experience, Google's Discussion search is a pretty useful feature when I want other people's real opinions or experiences.
However, I have no idea how Google determines that a site is a Q&A site or forum, or that a page is a Q&A/forum page. I searched a lot on Google, but there is little information discussing this. Do you have any ideas or references on that?
Thanks!
Best,
Jing
Use rich snippets, and let Google see your traffic through Webmaster Tools or Analytics. Use a sitemap.xml to invite revisits and speed up indexing, and disable archiving (e.g. Google Cache) with a meta robots noarchive tag. If you have high traffic and your content grows quickly, search engines will then recognize the site on their own.
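For the sitemap.xml part, generating the file is straightforward. A minimal sketch in Python using only the standard library; the URLs are placeholders for your own question pages:

```python
import xml.etree.ElementTree as ET

# Placeholder URLs -- in practice you would pull these from your database.
question_urls = [
    "https://example.com/questions/1",
    "https://example.com/questions/2",
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for url in question_urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url
    ET.SubElement(entry, "changefreq").text = "daily"  # a hint, not a command

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Submitting the generated file through Webmaster Tools helps Google pick it up quickly.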
I went to a web page and it would not load. Both Chrome and Firefox said "Waiting…" and never showed an error or a "page can't load" message; they just kept saying "Waiting…". In fact, Chrome stayed like that for over thirty minutes and the page never did load. I could load other websites with no problem, so I thought there was something wrong with the website on the server.
Unfortunately I did not think to try it in IE at that time.
I had been clearing my Firefox cache frequently, almost daily (localhost development, nothing major). When I posted the original question, a very bright, unselfish, and polite Stack Overflow member suggested I clear my cache (remove the cookies, etc.). I did, and the page loaded. Same with Chrome. After all that, I tried it in IE and it loaded too.
The web page was a simple single page with one 950x648-pixel image.
Why would that happen? I want to rule out the server side, but I have never experienced this before (at least I don't think so). Could it be my internet connection, my router, my computer, some setting? I'm leaning towards my computer, but where do I start diagnosing this, if that's the case?
Is this the right section of the site to ask this question, or is there another site I should consider?
UPDATE
Given the excellent resources listed below, I am ruling out the website and server. I will focus on the browsers and watch for this anomaly to repeat. Any thoughts, or course of action, would be greatly appreciated. Thx.
Perhaps try this site:
http://www.downforeveryoneorjustme.com/
It loads perfectly for me, so it must be something in your browser or network setup. Try opening Terminal (or Command Prompt) and executing ping eastsidepropertysolutions.com. If you get a reply, then the connection from your network to the website is fine, which most likely points to a browser issue.
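If you want a bit more than ping, the same check can be scripted: resolve the name, open a TCP connection to port 80, and issue a plain HTTP request with a timeout. A rough sketch using the hostname from the answer above; everything else is an assumption:

```python
import socket
import urllib.request

HOST = "eastsidepropertysolutions.com"  # hostname from the answer above
TIMEOUT = 10                            # seconds

# 1. DNS: can we resolve the name at all?
ip = socket.gethostbyname(HOST)
print(f"DNS OK: {HOST} -> {ip}")

# 2. TCP: can we reach port 80?
with socket.create_connection((HOST, 80), timeout=TIMEOUT):
    print("TCP connection to port 80 OK")

# 3. HTTP: does the server answer a request within the timeout?
with urllib.request.urlopen(f"http://{HOST}/", timeout=TIMEOUT) as resp:
    print(f"HTTP OK: status {resp.status}, {len(resp.read())} bytes received")
```

If all three steps pass from your machine while the browser still hangs, the problem is almost certainly in the browser (cache, cookies, extensions, proxy settings) rather than the server.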
Say I register a domain and develop it into a complete website. From where, and how, does Googlebot know that the new domain is up? Does it always start with the domain registry?
If it starts with the registry, does that mean that anyone can have complete access to the registry's database? Thanks for any insight.
Google will find your website on its own if some existing website has a link to it.
You can jump-start the process: http://www.google.com/addurl/.
You may also be interested in Google's Webmaster Tools.
Google needs to find you. That is, if there is no link to your site from another web site, it'll never find it.
Google finds pages to crawl as links from other pages. If no site links to your site, Google will likely never find it.
You should look at the Google website. They have some good information here. They even have a link for adding your site to their crawl list.
Google works purely (sort of) on PageRank, which accounts for its success. If nobody cares about your site, neither does Google.
Usually I submit a new domain directly to Google; I've found it takes less time for the Google crawler to find out about it than waiting for Googlebot to come across a link to the new domain on some other website: http://www.google.com/addurl/
Another way is to create a free Webmaster Tools account on Google and add the new domain/website there (ideally with a sitemap too); that way it takes Google even less time to crawl your domain and get it indexed.
As far as I know, the registry information for most domains is public; you can read it using a simple whois service.
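For completeness, a whois lookup is just a plain-text query over TCP port 43. A minimal sketch in Python; the whois server shown handles .com/.net domains, and the domain itself is a placeholder:

```python
import socket

WHOIS_SERVER = "whois.verisign-grs.com"  # whois server for .com/.net
DOMAIN = "example.com"                   # placeholder domain

# Whois is a simple protocol: connect, send the query, read until EOF.
with socket.create_connection((WHOIS_SERVER, 43), timeout=10) as sock:
    sock.sendall(f"{DOMAIN}\r\n".encode("ascii"))
    chunks = []
    while True:
        data = sock.recv(4096)
        if not data:
            break
        chunks.append(data)

print(b"".join(chunks).decode("utf-8", errors="replace"))
```

This is essentially the same information the command-line whois tool prints, including registration dates and name servers.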