Thursday, January 23, 2014

Building High Quality Backlinks with Dofollow Relationship

Had a request to write an article specifically about this topic: how to build high quality links with a “dofollow” relationship. Of course, “dofollow” is a figurative label we apply to any link that a search engine is likely to follow and allow to pass value. There remains, to this day, no “rel=’dofollow’” link attribute in the standards. I think that still has to be said because we are starting to mentally blot out the basic facts of link architecture.
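To make that concrete, here is a minimal sketch of how a link audit script might classify links; the markup, URLs, and class name are invented for illustration. The point is simply that a link is followable by default and only an explicit rel="nofollow" opts it out of passing value. There is no "dofollow" value to look for.

```python
# Minimal sketch: classify links the way a crawler or audit script might.
# A link is followable by default; only an explicit rel="nofollow" opts it out.
# There is no rel="dofollow" value to detect. Markup and URLs are illustrative.
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followable = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel_values = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel_values:
            self.nofollowed.append(href)
        else:
            self.followable.append(href)

sample = ('<a href="http://example.com/a">A</a> '
          '<a rel="nofollow" href="http://example.com/b">B</a>')
parser = LinkClassifier()
parser.feed(sample)
print("Followable:", parser.followable)   # ['http://example.com/a']
print("Nofollowed:", parser.nofollowed)   # ['http://example.com/b']
```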
A link is just a reference to a document. It neither conveys sentiment nor measures quality. Larry Page and Sergey Brin originally tested their citation-analysis link strategy against a very small set of documents, found only on the Stanford University Website, which were anything but representative of the Web in general. Manipulative links already existed in volume before Page and Brin typed up their little white paper on PageRank. Paid links were already rampant on the Web. When Google stepped into the search space, it was easy fodder for link spammers and their SERPs made that plain and clear.
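For anyone who never skimmed the white paper, the citation analysis boils down to a simple iterative computation. Below is a toy sketch of one common, normalized form of the PageRank iteration on a made-up three-page graph; only the 0.85 damping factor comes from the original paper, everything else is illustrative.

```python
# Toy sketch of a normalized PageRank iteration. The three-page graph and
# iteration count are invented; only the 0.85 damping factor comes from the
# original paper. Dangling pages (no outbound links) are not handled here.
damping = 0.85
links = {          # page -> pages it links out to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):
    new_rank = {}
    for page in links:
        inbound = sum(rank[src] / len(targets)
                      for src, targets in links.items() if page in targets)
        new_rank[page] = (1 - damping) / len(links) + damping * inbound
    rank = new_rank

print({page: round(score, 3) for page, score in rank.items()})
```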
Nonetheless, despite the many spankings that spammers handed to the Google engineers over the past 15 years, they have persisted with their little link experiment — expanding it into new directions, dressing it up with filters, and otherwise looking for ways to make links work the way Page and Brin thought they should.
Frankly, I wouldn’t mind if Google succeeded with this noble goal, but they do seem to be going about it the wrong way. I say that because people are still trying to figure out how to manipulate the search results with links. That remains easily doable, and many people are still doing it. Sooner or later the search engines seem to catch up with the latest link manipulation techniques, but Google has yet to de-incentivize manipulative linking. I don’t think this linking arms race will ever end, not unless some profound philosophical change occurs on one side of the conversation or the other.
So how important should the “dofollow relationship” be to search engine optimization? After all, it’s not optimal if the link only survives or passes value for a short time.
Of course, this must be the point where people add the “high quality” (or, as most folks just say, “quality”) to the “(dofollow) links”. I have never in my life found a “(high) quality link”. I keep wondering what people mean by that. I’ve been reading about “(high) quality links” for years but to this day no one has managed to present one.
Let’s back up here and revisit the basics.
Leaving aside the fact that Larry and Sergey had it all wrong to begin with, anyone who has worked with large data sets can tell the rest of us that often the SET OF DATA says much more interesting stuff than any specific piece of data. I concede that aggregate link data has been very revealing about all sorts of things from the start.
Where Larry and Sergey were wrong was in treating links as “votes”. Links have never been “votes”. But the aggregate data analysis really doesn’t need to use that metaphor. Or, rather, I should say it’s a neutral metaphor at the aggregate level — it doesn’t matter if you say “links are votes” or “links are NOT votes” when you look at the SET OF LINKS.
Linking behavior always reveals a pattern. And most people who create links don’t even think about patterns or how the links might influence search results. So when you go looking for links that are intentionally influencing search results sooner or later you will find a pattern. It’s not a matter of “if” but “when”, as pundits like to say.
When you step back and look at the kinds of patterns that search engineers have paid special attention to, they are mostly patterns created through manipulative linking. Take reciprocal linking, for example. That’s a perfectly fine practice found across the Web “in the wild”. Millions of sites reciprocate links every day. The links sometimes help and they don’t hurt.
But ask someone to set up a reciprocal linking plan “to help with SEO” and suddenly you’re in trouble. Of course, it’s the “for SEO/help with SEO” part that gets you into trouble. Reciprocal linking when done without thought of search engines looks very different from reciprocal linking done for SEO.
There are two classes of patterns: linking (techniques) done FOR SEO and linking (techniques) NOT “done for SEO”. Every linking practice falls into both of those buckets, but the patterns look very different.
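To make “pattern” a little more concrete, here is a deliberately naive sketch of the kind of trivial check anyone with a pile of link data could run: find domains that link back to each other. The sample edges are invented, and real pattern analysis looks at far richer signals, but even a pass this crude surfaces reciprocal pairs.

```python
# Deliberately naive sketch: find reciprocal pairs in a set of
# (source_domain, target_domain) link edges. Sample data is invented;
# real pattern analysis uses far richer signals than this.
edges = {
    ("site-a.com", "site-b.com"),
    ("site-b.com", "site-a.com"),
    ("site-a.com", "site-c.com"),
    ("site-d.com", "site-a.com"),
}

reciprocal_pairs = {
    tuple(sorted(pair)) for pair in edges if (pair[1], pair[0]) in edges
}
print(reciprocal_pairs)   # {('site-a.com', 'site-b.com')}
```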
Yes, Virginia, you’re allowed to “build links” for your Website. Heck, you’re even allowed to build them “for SEO” — up to a certain point. That is, the search engines tell people to get links to their sites so that the sites can be found.
Search engine optimization begins with publishing content, but search indexing begins with crawling, and crawling only occurs when the search engine knows about a URL. You can submit a sitemap containing a list of freshly published URLs to the search engine and many if not all of them will be fetched; but most Websites don’t publish sitemaps. So indexing for most sites is built on crawl.
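For what it’s worth, a sitemap does not have to be anything fancy. Here is a minimal sketch that writes a bare-bones XML sitemap following the sitemaps.org protocol; the URLs are placeholders, so substitute your own freshly published pages.

```python
# Minimal sketch: write a bare-bones XML sitemap per the sitemaps.org protocol.
# The URLs are placeholders; swap in your own freshly published pages.
from xml.sax.saxutils import escape

urls = [
    "http://www.example.com/",
    "http://www.example.com/new-article",
]

entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```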
The real question is how many links are you allowed to build? What is acceptable for a site? It’s a pity there is no universal number to answer that question. SEO link building would be much easier if we could all agree that “you’re allowed to build (place by yourself) 100 links (or 10 links)”.
While it’s okay to build links, what is not okay is building too many links. I don’t know how many links are too many, but I know you eventually cross that threshold when you just keep blindly building links.
Search engines never asked you to build thousands of links to a Website. You decided to build [Whatever My Competitor Has Plus X More] so that you could have a competitive advantage. And then you lied and said that the search algorithms were all about links.
So if the search engines encourage you to build the barest minimum of helpful links (to get your crawl/indexing started), then it follows that you are permitted to build “helpful” links that influence search results.
You just have to curtail your compulsive desire to place as many “helpful” links as you possibly can. That’s not what SEO is all about — that’s what “self-penalization” is all about.
In other words, Horatio, I submit that whenever you cross that invisible threshold all of your “helpful” links become “UNhelpful”.
And there’s the rub. A perfectly good link may “go bad” not because it is a bad link but because it is one link too many, or part of a set of links that are one too many, or part of a set of links that create a pattern that reveals manipulation.
Search engines don’t have to see a complete pattern of manipulation; they only have to see enough manipulation to know that — if they dig deeper — they will see a pattern.
If it’s not this link or that link that gets you into trouble, but rather all these links, then the fact that you didn’t drop links in a thousand blog comments and forum profiles doesn’t matter. You still created too many “helpful” links.
1 infographic is probably good. 10 infographics is a pattern. 50 infographics is most likely abusive.
But there are always exceptions, and so we excuse or pardon those exceptions by glibly concluding that “it comes down to intent” (I have said that myself). Intent is not yet the killer buzzword that it could be, but I suppose we’re working on it.
Every self-placed link has some sort of self-serving intention behind it. The intention itself may be good or bad but I think a lot of self-serving link placements are essentially neutral (in and of themselves). It’s when you have “a lot of self-serving link placements” that your intentions (to manipulate search results) become more clear.
This is why every attempt to define “(high) quality links” fails. Because when you get enough of these types of links you create a pattern that reveals your intention to tilt the search results in your favor. You’re not playing fairly.
I could argue that the Website with the most manipulative keyword-rich anchor text is Wikipedia. I don’t have the data to back that up but people generally link to Wikipedia articles with keyword-rich anchor text. So why doesn’t Google slap down Wikipedia for all that manipulative linking?
Because Wikipedia’s keyword-rich anchors are natural. People naturally link to articles with their titles. I want to laugh out loud every time I read some SEO blog or opinion column that advises people to vary their anchor text. Talk about overthinking a problem: you’re just replacing one spammy pattern with another one.
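If you want to know whether your own anchor text looks natural, the simplest check is a distribution tally of a backlink export. Here is a minimal sketch; the file name and column name are assumptions, since every backlink tool exports something different. What matters is not the exact numbers but whether one exact-match phrase dominates everything else.

```python
# Minimal sketch: tally anchor text from a backlink export. The file name and
# the "anchor_text" column are assumptions; adjust them to whatever your
# backlink tool actually exports.
import csv
from collections import Counter

counts = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchor = (row.get("anchor_text") or "").strip().lower()
        if anchor:
            counts[anchor] += 1

total = sum(counts.values()) or 1
for anchor, n in counts.most_common(10):
    print(f"{anchor!r}: {n} ({n / total:.1%})")
```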
People run their backlink reports, check nonsense metrics like “domain authority”, “page authority”, “pagerank”, and “someotherrank”, and conclude that since Competitor Joe gets X links with values greater than THIS, all they have to do is get X+Y links with similar (or higher) values.
Using the other guy’s (partial) backlink profile as an excuse for not playing fairly doesn’t sway the search engines when it comes to detecting and blocking manipulative linking patterns. Just because you see the links doesn’t mean they are helping (or hurting).
Take links in press releases, for example. Earlier this year many people thought Matt Cutts was lying (or just plain ignorant) when a badly designed press release link test showed that the link passed anchor text. Even Danny Sullivan made the mistake of concluding that if an individual link passes anchor text it must be influencing search results.
Anyone (including Danny Sullivan) who has written an article about Google’s Panda algorithm, suggesting that it downgrades a site’s performance in search results, should understand that no matter how many “helpful” links you point at a site, its rankings are going to be determined by MORE than just the links.
We knew this to be true before Panda, but everyone winked, nudged each other, and whispered, “yeah, right”. But when Panda started knocking Websites down right and left and no one could tie backlink profiles to the Panda downgrades, everyone should have realized that it doesn’t matter whether a link is capable of passing anchor text.
There were so many comments posted on that Search Engine Land article; let me pull out just two:
Danny Sullivan: [Daniel Tan's press release experiment] proves that with a few links, you can rank something for unusual terms. Agreed, that’s not necessarily new. But it comes when there’s renewed attention on press release links and when Matt’s freshly said they don’t count. Clearly, they do.
Michael Martinez: First of all, Matt did NOT say that press release links won’t pass anchor text. So for Danny and Barry to carry on as if he did is simply outrageous. He DID say that the original poster in the Google Web forum discussion (and perhaps by extension everyone) should not expect the links to help with rankings. There is a huge flaw in the assumption that “passing anchor text” = “helps with rankings”. A LOT of other factors are taken into consideration and Danny Sullivan of all people knows that very well — who has made that point more often than him (outside the search engines)?
I am not picking on Danny because I want to embarrass him; I am picking on Danny because few people in our industry are right more often than he is on any technical detail. Danny Sullivan has asked every question, investigated every scenario, and hosted numerous conferences where (nearly) all the angles were discussed. I don’t believe you can say anyone is better informed than Danny Sullivan when it comes to SEO, but like all of us his opinions are sometimes influenced by chaotic circumstances.
Several months before Barry Schwartz published that Search Engine Land article about the press release link experiment, Danny wrote “Why Google Panda Is More A Ranking Factor Than Algorithm Update”.
If you want to get metaphorical about this, in a showdown between the Panda and the Links, THE PANDA ALWAYS WINS. Panda downgrades short-circuit the generally helpful effects of targeted link anchor text.
So what if your press release links pass anchor text? That doesn’t mean they will help with your search rankings. And that was the point Matt Cutts originally made in the Google Webmaster Support Forums.
I’ve been saying this for years and will have to say it again, so let’s move on.
If you knew that Google would ignore any link on the front page of a PR 9 Website (and let’s not get into whether Google will ever update the Toolbar PageRank again) — that the link would not pass PageRank or Anchor Text — would YOU accept a link from that front page?
The classic example people cite when it comes to “sculpting PageRank” is Google’s use of “rel=’nofollow’” on the front page of YouTube (they no longer do this, by the way). If you could have gotten one of those NoFollowed links, would you have turned it down?
The majority of Websites can never be assigned a Toolbar PR value of 9 (the distribution is governed by a power law). You’re not going to find nearly as many PR 6 sites that will link to you as PR 1 sites. A lot of PR 1 sites will eventually earn more PageRank, but not all of them.
A link from a PR 1 Website can be nearly as helpful as a link from a PR 9 Website, if only because once a linking document has enough (internal) PageRank its anchor text will be passed to (some or all of) its link destinations. So you can have ten times or 100 times the internal PageRank, but you’re only going to pass that anchor text once.
If a link passes anchor text and IF the anchor text helps with rankings, then why should anyone care if the link is on a PR 1 or a PR 10 page? You got the anchor text. Move on.
If link spammers who use automated software could drop their links only on PR 8 and above sites, they would do that. In fact, I have seen advertisements for link dropping services that claim to do just that. If you believe the high Toolbar PageRank will somehow protect those links from being detected through a pattern of manipulation then congratulations. You have earned your Most Naive SEO Idiot card for 2013 and beyond. It’s a lifetime membership.
There is no such thing as “a high quality link”. There never has been. There may never be such a thing.
Yes, Googlers talk about “high quality links” but I don’t think they mean what you think they mean. Ask a Googler if a high-PR Website can fail to pass anchor text and I’m pretty sure they’ll be able to say, “Yes, under certain circumstances, those links won’t pass anchor text (or PageRank)”. In fact, we know that Google has publicly penalized high-PR Websites for selling manipulative links.
Your “high quality links” don’t exist. Google’s “high quality links” exist at Google’s whim and discretion. This is why I have been saying for years that if your SEO plan is built on links you need a new SEO plan.
Let’s go back to the original request. Can you build these kinds of links?
Sure — just as long as you have a clear and precise definition for what you mean by “high quality links”. I’ll assume that you do and it’s not my responsibility to knock down your illusions if you don’t. Here are ways you can build “high quality links” (in no particular order):
Reciprocal links
Blog comment links
Forum profile links
Guest blog links
Blog network links
Blogroll links
Social media links
Infographic links
Remotely hosted widget links
Asking people to link to your site
Paying for links
I will even go further and say that you can do all these things without violating search engine guidelines.
Sure, I am leaving out various disclaimers and qualifying details in making such a definitive statement. A typical link spammer dropping 10,000 links in a day isn’t going to worry over the details — he just wants to create his spammy little pattern and congratulate himself on his imagined cleverness.
Anyone engaged in serious link building should be creating links that are intended to last a long time. How you define quality is up to you. How you measure success is up to you.
Where you don’t have any discretion is in the facts about links. The facts are neither inconvenient truths nor excuses for doing really dumb stuff.
Links don’t kill Website rankings in search results.
Link building kills Website rankings in search results.
You’re allowed to build links; you’re just not allowed to use links to create an unfair advantage for yourself (or your clients).
We can all agree that where the line is drawn is anyone’s guess. My guess is that it varies by Website.
In other words, too much of a good thing ruins it. But that’s not to say “you can do each of these things in moderation and you’ll be safe”. Rather, it’s to say that “if you create a pattern of manipulative linking you won’t be fooling any algorithms for very long”.
It’s not about creating “the right pattern” or “no pattern”. It’s about putting the user experience ahead of the rankings.
Google has reset the linking environment for many Websites. The last 18 months of Google link-slaughter have given everyone in the Internet marketing field an opportunity to stop, take a deep breath, and say, “Okay — we still need links and we can still place links, but we need to stop pretending that it’s all about links.”
As long as you don’t gorge on links again, chances are pretty good you’ll be okay. So if today you’re still wondering where you can get enough links to change the search results, you’re doing it wrong.
Search engine optimization is not all about links. It’s all about improving the search experience for all three members of the Searchable Web Ecosystem: Publishers, Indexers, and Searchers. Links help in that process but they are not enough by themselves.
All your banned and penalized Websites should have made that clear by now.