Disavow tool: can black hat means speed up recovery?
I’m writing this post because I was recently asked why a site is still penalized a long time after submitting a disavow file, and I thought some public considerations would be better than a private answer.
First of all, however, let me say I’m not the highest authority on algorithmic link-based penalties or filters (well, I’m not the highest authority on anything, to be honest), and everything I’m going to say is simply based on common sense.
But let’s get back to the problem: most webmasters who have used the disavow tool report not seeing any visible effect after several weeks or months. Assuming they haven’t made any mistakes, their problem is that the tool doesn’t work as an on/off switch. When a file is uploaded, it simply tells Google to ignore certain links in the incoming link graph, treating them as if they carried a “nofollow” tag. Each of those links still counts as “followed” until Google recrawls the page containing it. This is clearly stated in the Disavow Tool documentation:
It may take some time for Google to process the information you’ve uploaded. In particular, this information will be incorporated into our index as we recrawl the web and reprocess the pages that we see, which can take a number of weeks. These links will continue to be shown in the Webmaster Tools inbound links section.
Google’s John Mueller has often confirmed it during Webmaster Central Hangouts.
So this is a fact: you have to wait for Google to recrawl the pages. But if your links are really bad ones (guestbooks, dead profiles, comments on paginated URLs, and similar toxic stuff), chances are it can take months to get an algorithmic penalty lifted.
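If you want to work on the problem yourself, the first step is knowing exactly which pages need recrawling. Here’s a minimal sketch that pulls the single URLs out of a disavow file, assuming the standard format Google documents (one entry per line, “#” for comments, “domain:” for whole-domain entries); the file name is just a placeholder:

```python
# Extract the individual URLs from a disavow file so we can try to
# get them recrawled. Per Google's documented format: one entry per
# line, "#" starts a comment, and "domain:example.com" disavows a
# whole domain (no single URL to recrawl, so we skip those here).

def parse_disavow(path: str) -> list[str]:
    urls = []
    with open(path, encoding="utf-8") as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#"):
                continue  # blank line or comment
            if line.lower().startswith("domain:"):
                continue  # whole-domain entry, nothing to fetch directly
            urls.append(line)
    return urls

if __name__ == "__main__":
    # "disavow.txt" is a placeholder for your own exported file
    for url in parse_disavow("disavow.txt"):
        print(url)
```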
Unless… unless you force Googlebot to visit those URLs sooner, and getting the bots onto a page quickly is a typical black hat goal. Usually the aim is to get new links discovered; here the aim is to get old links recrawled, but the ways to achieve the result are obviously the same.
I’ll list a few (a rough sketch of the first two follows the list):
– Pinging
– RSS syndication
– Social bookmarking
– Xrumer/Scrapebox/(add BH tool) blasts
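As an illustration of the first two, here’s a minimal sketch: it builds a bare-bones RSS 2.0 feed listing the disavowed URLs, then pings a ping service with the classic weblogUpdates XML-RPC call. The feed URL, the “blog” name, and the Ping-O-Matic endpoint are assumptions (check the endpoint is still live), and the generated feed has to be uploaded somewhere publicly reachable for any of this to do anything:

```python
import xml.etree.ElementTree as ET
import xmlrpc.client

# URLs extracted from the disavow file (see the earlier sketch);
# this one is obviously a made-up example.
URLS = ["http://spammy-guestbook.example/page?p=42"]

# --- RSS syndication: a bare-bones RSS 2.0 feed listing the URLs ---
# FEED_HOME is a placeholder; publish recrawl.xml at a public URL.
FEED_HOME = "http://example.com/recrawl/"

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "URLs to recrawl"
ET.SubElement(channel, "link").text = FEED_HOME
ET.SubElement(channel, "description").text = "Disavowed pages"
for url in URLS:
    item = ET.SubElement(channel, "item")
    ET.SubElement(item, "title").text = url
    ET.SubElement(item, "link").text = url
ET.ElementTree(rss).write("recrawl.xml", encoding="utf-8",
                          xml_declaration=True)

# --- Pinging: the standard weblogUpdates.ping XML-RPC call ---
# rpc.pingomatic.com is one well-known ping endpoint; the two
# arguments are the "blog" name and the page hosting the feed.
server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
response = server.weblogUpdates.ping("Recrawl feed", FEED_HOME)
print(response)  # a struct like {'flerror': False, 'message': '...'}
```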
Do you remember being told to be careful with these because too many links too quickly could get a site sandboxed? In this case, speed is exactly what you need…
Now, I’m not suggesting you pollute the web with more crap, but if you’ve built really bad links, chances are you’ll need black hat means to make your disavow file work.
Thoughts?