Another link network has been thoroughly de-indexed by Google’s
anti-spam team. This time it was the blackhat mainstay Anglo Rank, one
of the larger “ranking services” that had, until now, managed to stay
under the radar. Matt Cutts has since felt it safe to tweet about a few
of the particulars, which gives some insight into Google’s approach.
Another One Bites the Dust
Anglo Rank was a link network, so named for its high percentage of
English-language content. It relied on a number of sites with high rank
and low OBL, or outbound link count. Theoretically, this made its sites
ideal sources of backlinks for the webmasters investing in the service.
Sites with high OBL values are generally regarded as less valuable by
Google’s algorithm and are more likely to trigger manual investigation
when linking behavior looks suspicious. Sites with primarily English
content also look better to Google’s spiders: a link from an
English-language page to an English-language site looks more like
natural discussion of, and reference to, the linked page. Low OBL and
primarily English content were major advantages over the competing
networks of the day.
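To put the OBL idea in concrete terms, here is a minimal sketch, my own illustration rather than anything Anglo Rank or Google published, of how one might count a page’s external outbound links. It assumes the requests and beautifulsoup4 packages are installed, and the example URL is hypothetical.

import requests
from urllib.parse import urlparse
from bs4 import BeautifulSoup

def outbound_link_count(url: str) -> int:
    """Count anchors on the page that point to a different domain."""
    page_domain = urlparse(url).netloc
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    external = 0
    for anchor in soup.find_all("a", href=True):
        target = urlparse(anchor["href"]).netloc
        if target and target != page_domain:
            external += 1
    return external

# Hypothetical usage: a link seller would advertise pages where this number stays low.
print(outbound_link_count("https://example.com/some-article"))

A page carrying only one or two external links concentrates whatever value it passes along and draws far less scrutiny than one stuffed with dozens.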
In addition to low OBL and primarily English content, Anglo Rank
relied on full-fledged “authority websites” for its links. This
distinguished it from many of its competitors, which relied primarily on
blogs. Blog networks had previously been preferred because blogs are
generally easier to automate and queue up with dubious “content” to link
from. They were also much easier to root out, for two big reasons.
Reason #1
The first was OBL. Blog networks tended to carry two to three links per blog post. When Google caught on, they started reducing the link counts, but by the time high OBL was being fully detected and penalized, as it is under the current Google update, the damage was done and the blogs were beyond salvaging.
Reason #2
The second reason was structure. Blog networks tended to use the same few designs over and over again. There are only so many page variants that can be produced on the same blog architecture before it becomes painfully obvious, even to a blind algorithm, what’s going on.
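To make that concrete, here is a toy sketch, entirely my own and not a description of Google’s actual classifier, of how reused templates give themselves away: reducing a page to the ordered sequence of its HTML tags yields a crude layout fingerprint, and dozens of supposedly unrelated domains sharing one fingerprint is exactly the kind of pattern an algorithm can latch onto.

import hashlib
from bs4 import BeautifulSoup

def template_fingerprint(html: str) -> str:
    """Hash the ordered sequence of tag names as a crude layout signature."""
    soup = BeautifulSoup(html, "html.parser")
    tag_sequence = ",".join(tag.name for tag in soup.find_all(True))
    return hashlib.sha1(tag_sequence.encode("utf-8")).hexdigest()

# Pages built from the same blog template tend to collapse to the same
# fingerprint even when their text, images, and domains all differ.

Grouping crawled pages by a signature like this, or a fuzzier variant of it, makes a network of cookie-cutter blogs stand out very quickly.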
Sidestepping the weaknesses of blog networks and the OBL crackdown gave
Anglo Rank an edge at launch and allowed for very quick results:
reviewers given review copies often reported rank jumps within 24-48
hours. Occasionally they even offered up sites as sacrificial lambs to
verify these results to their peers. That is not something you do with a
site you can’t afford to lose; handing fellow blackhats the URL of a
functional, successful site owned by a potential competitor can end
badly. The willingness of Anglo Rank users to put their sites at risk to
provide testimonials is a testament to the network’s short-lived success.

Anglo Rank boasted “absolutely no footprints linking the websites together”.
A footprint is the crossover of link sources within a network: if many
sites all have links from the same sources, it’s a natural red flag for
search engine algorithms that value inbound links. Between 2010 and 2012,
network footprinting was Google’s primary heuristic for rooting out link
networks; the more successful backlinking networks were those that
suppressed their footprints, often by restricting information to their
subscribers to keep them from accidentally outing the network.
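Footprint can also be expressed numerically. The toy sketch below is my own illustration, not Google’s method: it measures how much two target sites’ referring domains overlap, and the sample domain sets are made up. Many “unrelated” sites sharing the same handful of link sources is precisely the red flag described above.

def footprint_overlap(sources_a: set[str], sources_b: set[str]) -> float:
    """Jaccard similarity of two sites' referring-domain sets."""
    if not sources_a and not sources_b:
        return 0.0
    return len(sources_a & sources_b) / len(sources_a | sources_b)

# Hypothetical referring-domain sets for two target sites.
site_a = {"blog1.example", "blog2.example", "authority1.example"}
site_b = {"blog1.example", "blog2.example", "unrelated.example"}
print(footprint_overlap(site_a, site_b))  # 0.5 -- suspiciously high for supposedly unrelated sites

Scaled up across thousands of penalized and suspect sites, overlap scores like this let an algorithm trace an entire network outward from a handful of confirmed members.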
New Techniques?
If Anglo Rank had nothing linking their sites together, did Google
need to develop a new technique to determine which links stemmed from
their blackhat workings? Not remotely. Matt Cutts is the head of
Google’s anti-spam team, and this is far from his first rodeo dealing
with high-secrecy “underground” networks. Cutts is the public face of
Google’s anti-spam efforts, and his name is known among blackhats and
marketers with equal parts reverence and resentment. He felt no
shame about calling Anglo Rank out directly on their strongest
marketing claim.

On December 6th, Matt Cutts tweeted:
“There are absolutely NO footprints linking the websites together” Oh, Anglo Rank.
Footprints are exactly what allowed Google to spot Anglo Rank’s illicit
links and begin the de-indexing process. Few, if any, of the links in
their network remain valuable. Google’s spam team has a history of
identifying link sources by host and clearing them out with extreme
prejudice. Once a server has been marked for blackhat spam, there’s no
redemption unless hosting changes hands entirely, and the same applies
to domains. This makes the hit to Anglo Rank devastating
as it takes the teeth out of the infrastructure they’ve spent their
time developing. It’s the same pattern we’ve seen in the past with
similar services like BuildMyRank.
The habit of explicit, self-styled blackhats is generally to scurry
back into the woodwork and lick their wounds after a major hit like
this. The operators of Anglo Rank have yet to do so, maintaining a
presence on the forum through which they’ve pushed the brunt of their
marketing. Their response to complaints has been less than cordial and
borderline mocking. Given the blackhat subculture, it is likely they’ll
get away with this cleanly. Users are reporting their penalty notices
with the predictable measure of profanity, only to be met with:
“Thank you for the update but what was you expecting that you gonna be still ranking for 10 years ehhh?”
Outside the blackhat subculture this would leave you with a lot of
chargebacks and a bad Better Business Bureau hit. There’s very little
their customers can do about it in this context.
What Happens Now?

Google’s last PageRank update hit a lot of spam-reliant marketers harder
than ever. We’re beginning to see a change in how blackhat solutions are
marketed; they no longer pretend by omission that their services are
for long-term use. The penalties being reported and complained about
seem to be harsher as well.
Link networks are going out of style for a lot of reasons, and there are
few ways to adapt outside of the box they’ve trapped themselves in.
Through 2012, blackhat link networks could play “cat and mouse” with
Cutts and his team, but there is now little to prevent Google from
locking onto their footprints and doling out the necessary penalties.
Because these networks are underground and as distributed as they are,
eliminating the tainted links to a site is all but impossible even when
the proprietors of a service are willing to assist. This makes
recovering a site that has been penalized for blackhat linking a
difficult proposition, since demonstrating a restoration of compliance
is, in effect, impossible. As more people come to realize that using
these services effectively ties the lifespan of their site to the
lifespan of the link network, their popularity will only decrease.
This consistent history of even the most underground networks getting
nailed is causing many would-be subscribers to shy away from putting
link network services to use even on their rapid-fire fly-by-night
projects. Link sales as a whole are not yet dead, of course, but this
style of tiered network can’t last under Google’s withering fire of
penalties and manual spam actions. As time wears on, we’ll likely see an
increase in individualized links placed on authority pages with
legitimately useful content. Marketers aren’t being boxed into
exclusively whitehat tactics yet, but the momentum is there.