28th August 2007

Paid links: A scalable solution

posted in Paid Links


Google has always been smart with respect to building solutions based on scalability. From the outset they have asked what would happen if they had to grow the solution at hand tenfold or more. Scalability is so entrenched in their philosophy that they even openly admit that sites submitted via a spam report are not removed or penalized; instead, they use that data as information to judge their algorithm against.

What amazes me regarding their battle with paid links is how non-scalable the solution is (the first two options are sketched just after the list):

  • Adding a rel="nofollow" attribute to the <a> tag
  • Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file
  • [it used to say something about javascript but they took that out]
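
For the sake of concreteness, here is roughly what those first two recommendations look like in practice; the advertiser URL and the /out/ redirect path are made up purely for illustration:

    <!-- Option 1: add rel="nofollow" so the paid link passes no endorsement -->
    <a href="http://advertiser.example.com/" rel="nofollow">Our sponsor</a>

    <!-- Option 2: send the paid link through a local redirect page instead -->
    <a href="/out/sponsor">Our sponsor</a>

    # ...and block that redirect directory from crawlers in robots.txt
    User-agent: *
    Disallow: /out/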

They even take it a step further, which is obviously also a step back in their fight against spam, when they ask people to submit sites that sell links, supposedly for some hand-to-hand combat.

So what’s the more scalable solution: Google tweaking its system to identify paid links on its own, or having millions and millions of webmasters modify the billions and billions of pages available on the web? Obviously it’s much easier for Google if we all just bend over and do their job for them, but then again, how serious are they about this? Sure, it’s in the guidelines, at conferences, and on Matt Cutts’ blog if you read it, but that probably reaches a very small percentage of the real content creators out there. All the pros will know about it, but the VAST majority of indexed content managers out there are going to miss the message.

Let’s go back for a second to review why they think paid links are bad. What set Google apart from the rest of the “search engines” at the time was that they not only looked at the content on the page but also used the academic model of references in literature to vouch for how much authority a page has on the subject. At the time of that original theory the web was young and innocent and not nearly as exploited. So Google’s rankings are based largely on the links to a page or site, and since most people want to rank higher to get more traffic, the obvious optimization procedure is to get more links. Had they ranked sites based on the use of purple text, all websites would be using purple text today.

Back when the original ideas for Google came about, most of the links out there were actually votes for other sites. That was when “surfing” the web actually meant bouncing around from site to site based on the links those sites carried. You didn’t Google something, you surfed for it. Back in the late 1990s, when Google came up with this idea, the barriers to getting online were much higher than they are now. From registration and hosting to easy content generation, it’s gotten much easier to get your site online today; back then it was mostly academic institutions, geek squads, and corporations that had the resources to publish sites.

Well, the times they are a-changing. Now you can buy a domain for pennies, hosting is next to free, and writing content has never been easier. There are so many millions of new links created every day that they have lost their value due to the sheer volume of links available. HOWEVER, there are some sites that have value, traffic, authority, and PageRank, and links from those sites tend to be worth something, and BOOM, an economy of link selling is born.

Not straying too far from its founders, who borrowed the reference system used in academic papers as a judge of quality, Google wants to borrow from the older, established media sources that must disclose paid endorsements. What’s different, however, is that most of those media outlets are regulated by authorities. Being that Google is the only game in town when it comes to actual search traffic, they are the de facto authority to regulate the masses.

So how can Google get everyone on board? Let me repeat that: EVERYONE, not just the 0.0001% of publishers that read Matt’s blog, or the 10,000 subscribers to seoMOZ, but EVERYONE. If Google wants to regulate the web then they need to start regulating it and not just observing it. It’s going to be painful, but if they really want to monitor all the links on the web, it will have to be done.

  1. The first thing to do is throw out all of the links known up to this point. They are polluted; we have no way of knowing the intention of any of them since they exist pre-regulation.
  2. In order for the links to count, they have to be registered, verified, and monitored by Google, so all websites will have to be removed from the index.
  3. After verifying ownership in your webmaster tools account, Google will crawl the site. They can then show you a list of all the external links on your site. You then select what type of link it is: regular voting link, paid link, non-endorsed user-generated link, etc. After selecting the link attribute you will have to digitally sign an agreement attesting to the authenticity of your claim, enter the captcha, and submit. Repeat for the rest of the links on your site (a toy sketch of what such an attestation might look like follows this list).
  4. After the links are verified and attested to, Google can then add them as votes or non-votes in the index.
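
Nothing like this exists in Webmaster Tools, of course, but if you want to picture step 3, a per-site link attestation might look something like the following made-up XML, where every element and attribute name is pure invention for illustration:

    <!-- Hypothetical attestation file; no such format actually exists -->
    <link-attestations site="http://www.example.com/">
      <link href="http://some-blog.example.org/" type="regular-vote"/>
      <link href="http://advertiser.example.com/" type="paid"/>
      <link href="http://forum.example.net/profile/123" type="user-generated"/>
      <!-- digitally signed statement attesting that the classifications above are truthful -->
      <signature owner="verified-site-owner">...</signature>
    </link-attestations>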

Now we’ve got something with some teeth in it. In order to be included in Google’s index you have to have agreed to their terms and signed a legally binding contract that they can hold you to.

  • We no longer have to worry about hidden links as they won’t be verified.
  • Links will only be bought and sold for traffic.
  • You can code your links any way you’d like.
  • User submitted link directories are all but dead.
  • Sitewide links will probably disappear due to the sheer labor required to insert them.
  • Sneaky little plug-in and theme developers that drop links all over the place will be wasting their time as the site owners probably won’t vouch for them.
  • Automatic text link building systems will grind to a halt, because whenever the links on a page change, the page will drop out of the index waiting to be verified.
  • As the publisher has to be verified by state-issued credentials, large false link networks built up by SEOs will have little value, as Google will be able to see that they are all owned by the same person.
  • Comment spamming will disappear as people will just turn off their comments.

Now, until that is instituted, and since you’ve read to the end of this story and know about Google’s stance on paid links, you are morally bound to nofollow all of your paid links and only buy nofollowed links. Granted, your competitors who didn’t go to SES San Jose or read Matt Cutts’ blog probably aren’t doing that, but that’s your problem, not Google’s.

The only flaw in the system is that some people may actually LIE and say that a link they got paid for is actually a regular link. Oh my. Well, at least that’s a sin of commission and not a sin of omission, like the millions of people currently not nofollowing their paid links.

This entry was posted on Tuesday, August 28th, 2007 at 3:02 am and is filed under Paid Links. You can follow any responses to this entry through the RSS 2.0 feed. All comments are subject to my NoFollow policy. Both comments and pings are currently closed.

There are currently 2 responses to “Paid links: A scalable solution”

Why not let me know what you think by adding your own comment! All the cool kids are doing it.

  1. On August 28th, 2007, dockarl said:

    But that’s not an algorithm - that requires human intervention and honesty (exactly like nofollow) - and that’s not scalable, especially honesty.

    No matter how you change the system, if people know about it, they’ll manipulate it to their own ends. Patches like nofollow acknowledge that and acknowledge to the world that the system is under strain.

    Do you folks have that silly television program ‘numbers’ over there? Ever seen how they depict every single complex real world problem as having a neat little mathematical solution?

    Unfortunately, if you know anything about math, the majority of stuff they say is CRAP - when the real world meets theory, the two often don’t make good bedfellows.

    There is no such thing as a free market, other than in the textbooks. Human nature is corrupting.

    Personally I think the answer is that they need to quit just looking at onsite factors and start looking seriously at user behavior.. who goes where, where they come from, how long they stayed, where they went.

    It’s a pity they can’t get that sort of information from websites.. hmm.. they COULD buy a few key switches around the place.. or you could buy a little webstats company like Urchin and rebrand it as a free service so that every man and his dog signed up.. What a wealth of information that would bring.

    That would help bring Google back to where it was in the early days - a real model of popularity on the net instead of a model of itself.

    Still wouldn’t fix the paid links thing though - but who cares - if the site being linked to sucks, no-one would stay long anyway.

    I think there are some battles ahead.

  2. On August 28th, 2007, John Honeck "JLH" said:

    The scalability factor was kind of tongue-in-cheek on my part; it’s scalable on Google’s part because they don’t have to do anything other than shut it down for a while as everyone updates their account!

    I’m trying to make the point that nofollow is a non-answer, just as crazy as the one I suggested.

    Here’s the other issue they are combating: as the web ages, the older sites naturally have more links, but not necessarily the better ones. By trying to keep the index fresh they need to keep an eye on sites that start to garner links fast as being more important than the old, stale ones that already have links. This makes the system vulnerable to manipulation by artificial link campaigns. If they just relied on the sheer volume of links then old sites would dominate, but it would be a lot harder to buy your way to the top.

    User behavior would be pretty easy to mimic, however, with click-bots running 24/7 on millions of random IPs. You think we have a computer virus problem now? It would explode if that ever were a factor in ranking.
