Google

It seems that Google has decided that Longrider and Felix Domesticus are both distributing badware. If you search on Google, this is what you see. These sites are not distributing badware. Felix Domesticus suffered a corrupted database following a failed upgrade of WordPress. Otherwise, nothing has changed with the sites.

According to StopBadware.org, the reasons for this warning are:

1. Badware available for download on your site

2. Badware available on sites that you link to

3. Badware distributed through ads running on your site

4. Badware links posted in user-generated areas of your site

5. Hacking attacks on your site

I don’t offer any downloads. I link to other blogs and news sites, and I don’t have ads. I always check out links posted by people who comment, so I can confirm that they do not lead to badware sites. That last one sent a chill down my spine, but a check reveals no evidence of hacking. Indeed, the only recent change in circumstances was the failed WordPress upgrade. Generally, when looking for a fault, the last thing that went wrong is where it is to be found. A check with AVG confirms what I had already established – there are no threats on my sites. Google is wrong. Not only is it wrong, it is verging on the paranoid.

Seriously, I’m pissed off with Google – if their bots can’t tell the difference between a corrupted database and malware, and consequently tarnish the whole site, then I am not impressed. I’ve requested a review, so we will see what happens. Interestingly, other search engines are fine. It’s Google that sees badware where none exists and blackens my name for no good reason. If they don’t fix this quickly, I’ll simply block their bot from the sites completely. I’ll lose nothing by doing so.

Bastards!

———————————

Update: Google have done the decent thing and removed the warning. Thanks, chaps. I withdraw the bastards comment. However, there is a caveat: you still got it wrong; there was never any risk to people’s PCs from my sites.

———————————

Bloody hell! It’s happened again – after they gave me the all-clear. I’ve asked my host to double-check, but they won’t find anything. In the meantime, I’ve had it with Google. If they can’t program their bots properly, the buggers can fuck off from my site. robots.txt is now blocking Googlebot.
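
For anyone wanting to do the same, the block is just two lines in a `robots.txt` file at the site root – a minimal sketch, assuming you want to shut out Google’s crawler entirely while leaving other search engines untouched:

```
# Block Google's crawler from the entire site;
# other bots are unaffected unless listed separately.
User-agent: Googlebot
Disallow: /
```

Note that this only asks well-behaved crawlers to stay away; it relies on Googlebot honouring the convention, which in fairness it does.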