30th October 2007

GWHG month in review

For the month of October in the Google Webmaster Help Group, Susan Moskwa has really picked up the pace and helped webmasters at a clip I've never seen before. She has over 60 posts already this month (though those numbers may be skewed by Google Groups bugs). It's very encouraging to see, as world traveler and Susan's fellow Googler John Mueller pointed out earlier that their participation was less than stellar in September. (And people thought I was a paranoid nut! It's nice to see some numbers to back up my concerns.)

I haven't really noticed any other appreciable increases from Googlers, other than perhaps Wysz, who is up to 17 posts this month. JohnMu is getting up to speed and posting more frequently. Unfortunately for us small-time webmasters, the assimilation of John into the Google collective will always be a net loss, as he will never be able to help as many people as before. I get the feeling that Adam Lasnik has moved on to other projects within the Google organization, as his public contributions have tapered off to nearly nothing.

Their Popular Picks series was largely ignored by the blogging crowd, because they don't get to increase their reader numbers and ad revenue by pointing out official comments by Googlers. The search engine bloggers are all about getting the big Googler scoop, and that will not change until Google starts to release important information through official channels and not through hidden back-door, unattributed, anonymous Googler quotes on A-lister blogs. As a whole, however, the answers were excellent and very helpful; I have personally used the pages as references dozens if not hundreds of times since they were introduced. Thankfully, Wysz added a link to the series in their FAQ section, including a very rare appearance by search engine rock star Matt Cutts.

The official Webmaster Blog is still averaging just about a post a week, so no real development there to speak of.

All in all, I'd say they are improving their communication, led mostly by the efforts of Susan Moskwa, who appears to have taken the lead in this overdue effort. If this continues, I may have to consider rejoining the group, with the one caveat that I will not do the job that an overly successful corporation should do with its own people. Helping webmasters with site reviews and opinions should be left to the mere mortal members of the group; Googlers, however, should be accountable for minding the store when it comes to official declarations, clarifications of their cryptic guidelines, and those most annoying and benign too-frequently-asked questions.

posted in GWHG | 3 Comments

29th October 2007

Bob Dylan as my Spokesman


No Bob Dylans were harmed in the filming of this episode. Bob has not been compensated for his appearance.

posted in Site News | 0 Comments

26th October 2007

Googleblog linking to bad neighborhoods

In what many are calling just a warning shot across the bow of link-selling sites, Google has initiated a penalty of sorts by reducing some sites' visible PageRank score. One such site is Search Engine Roundtable, which provides us proof that its rankings and traffic have not been affected.

Clearly, a reduction in PageRank for a site selling links is a signal that Google feels the site is breaking the rules as written in the webmaster guidelines. Regarding link selling, the guidelines state:

Links purchased for advertising should be designated as such. This can be done in several ways, such as:

  • Adding a rel="nofollow" attribute to the <a> tag
  • Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file
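In practice, either approach boils down to something like this (a minimal sketch; the advertiser URL and the /out/ path are invented for illustration):

    <!-- Option 1: mark the paid link itself -->
    <a href="http://advertiser.example.com/" rel="nofollow">Advertiser</a>

    <!-- Option 2: point the paid link at an intermediate page instead -->
    <a href="/out/advertiser">Advertiser</a>

The second option only works if the intermediate page is blocked in robots.txt so search engines never follow it through:

    User-agent: *
    Disallow: /out/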

That is a practice Search Engine Roundtable is not ready to adopt, as made evident by Barry Schwartz (rustybrick) in the article linked above:

On a personal note, I trust my sponsors, I value their sponsorships and I couldn’t do what I do without their financial support. Some sponsors can’t afford huge sponsorships, so they sponsor in their ways. It is what enables this site and many other sites to function and operate on a daily basis. I turn down sponsors all the time because they are simply not relevant or useful to my reader. I hand select them and for them to be on my site, means I trust them. Why nofollow someone you trust and want to thank? Is that a slap in their face? Will I have to and will they continue to sponsor? Time will tell.

So we have a site that openly sells links, does not want to conform to the webmaster guidelines by marking paid links in the manner Google desires, and has been hit with a PageRank reduction. That is clearly a signal that the site could be considered not only a rule breaker but a bad neighborhood to be associated with.

Google's webmaster guidelines clearly state:

In particular, avoid links to web spammers or “bad neighborhoods” on the web, as your own ranking may be affected adversely by those links.

I've written before on the difficulty of discerning what a bad neighborhood is (complete with Matt Cutts' email address!) and have even asked for clarification on the matter.

I find it quite ironic, then, that Google's very own official blog links to said site, which has admittedly broken the rules and been publicly admonished (screen shot). Are they not taking their own advice by linking to a bad neighborhood? This is not the first time Google has talked out of both sides of its mouth. Even during this last wave of anti-link-selling assaults, the largest link seller in the land has gotten off scot-free without any sort of PageRank deduction. Matt Cutts has even come out and said that they are allowed to sell links because they review the links before they publish them, and not every one makes it into the directory. If that is the standard to be followed, perhaps Barry Schwartz and others like him who accept advertising dollars should just charge for the chance of being listed, adding an element of uncertainty to the equation. Maybe then he would get his PageRank back while making more money, since he could oversell the advertisements ten-fold. To me that seems highly unethical, but for some reason it is the only method of paid linking Google endorses without requiring a machine-readable declaration of a paid link.

If reducing a site's PageRank for selling links is really a penalty, I would expect Google to do the right thing for their own blog's ranking and not link to such terribly bad neighborhoods (t.i.c.), and I will be watching the site as an indicator.

Before anyone gets upset at me: I'm not calling Barry Schwartz a spammer, nor do I personally think he did anything wrong. The PageRank downgrade has been widely publicized, and I am just drawing the connection between what Google says and what they do. I follow Barry on Twitter, his own Cartoon Barry, Search Engine Roundtable, and his contributions on Search Engine Land. He is considered a leader in the industry, and I value his opinions and expertise. There are at least four links in this article to his properties that are genuine, followed, editorial links of endorsement, unlike the link to the Yahoo! directory, which is nofollowed due to its blatant breaking of the rules.

posted in Google, Paid Links | 1 Comment

24th October 2007

Digital Point Members put on Suicide Watch

[Image: noose.jpg]

THIS IS AN EMERGENCY SEO/WEBMASTER BROADCAST.

Due to Google's apparent assault on paid links, resulting in some sites' PageRank being reduced, Digital Point forum members will have to be guarded 24 hours a day for the foreseeable future. If you are near one, please remove their belt, shoelaces, and anything that could be fashioned into a sharp weapon. They should also be moved to the lowest floor of the building, and all windows should be boarded up.

This is not a drill. I repeat. This is not a drill.

posted in Paid Links | 2 Comments

24th October 2007

Rants: paid links and penalties

It’s my party and I’ll cry if I want to.  I’ve been reading a lot of ranting lately on Sphinn and the blogs, which got me into a ranting mood.  Let the games begin.

Even though Google has a ton of official blogs, discussion groups, webmaster guidelines, and press releases available, they decided they would be best served by sending Deep Throat down to the parking ramp, with Danny Sullivan doing his best Carl Bernstein impression, to break the news that penalties are now given to link sellers. I can only guess they chose this chicken-shit, cowardly approach because it has some plausible deniability if it really hit the fan. Beyond their poor choice of using unnamed sources, I have a couple of other issues bugging me.

As soon as one is assimilated into the collective, the first thing they teach the new drones is the gospel of "Don't Worry About PageRank." You'll see it spewed from every orifice of any Googler giving a speech, writing a blog, answering a question in a forum, or just plain pontificating from on high. It's the canned response for any and all questions regarding the green bar: its effect, its acquisition, its retention, its loss, its very existence. They are all told to say things like, "worry less about PageRank and more about creating unique and compelling content [and tools]." So if this PageRank is nothing to worry about, then why would docking some college newspaper's PageRank be a suitable punishment? If PageRank is no big deal and not to be worried about as much as content, why would they choose this as their punitive reaction? Maybe one should worry about PageRank just a little bit. You can't have it both ways: either it's not worth worrying about, or it is worth worrying about and something we should all fear losing. This leads me to another thought on the matter, that it's not punishment but rather an adjustment; more on that later.

It's clear that to battle the evildoers who sell and purchase links, some sort of punishment must be doled out. Google can't have an outright ban on every site that sells a link, as Google would soon become a joke. If someone is searching for Stanford's newspaper, they'd better find it. If not, Google loses its relevancy. Sure, it wouldn't matter much if they just destroyed some nice lady in Colorado who's buying baby food with the money she makes from her site, but there would be plenty of high-profile cases that would just make them look silly. We as webmasters, marketers, SEOs, or just plain anyone who has any idea how the inner workings of search work have to step back from the scene for a moment. The VAST majority of Google's users, customers, and shareholders don't give a lick about paid links, hidden text, or cloaking. They just know that when they search for something, they expect to see it. If Google banned Stanford for selling links and someone who wasn't in the know was told that was the reason, the response would be a great big, "So what?" The point is that while selling links goes against Google's webmaster guidelines, not listing the site selling the links goes against Google's core principle of returning the most relevant results.

That principle has its limits. In the case of a site known for facilitating the selling of links, it's so well known that when you use the Google toolbar to search for its name, you'll get it listed as a suggestion as soon as you type [text-l] in the field. If you continue the query and type out the whole [text-link-ads], you will not find the site listed. In this case, Google has decided that returning the most relevant result is not quite worth as much as punishing the offender. So I am quite confused about where that distinction is made. Is it just academic institutions that get this exemption? Or if Matt Drudge started selling links, would he too get to be listed for his name and his site? I find it utterly priceless that Google is taking the moral high ground on this text-link-ad-selling problem by not returning the site for its own name, yet they are more than happy to take their money to show their ads in the results. In this case, the most relevant result is required to pay for its position. This reminds me of a little story I read on the Stanford web site: "For example, we noticed a major search engine would not return a large airline's homepage when the airline's name was given as a query. It so happened that the airline had placed an expensive ad, linked to the query that was its name. A better search engine would not have required this ad, and possibly resulted in the loss of the revenue from the airline to the search engine. In general, it could be argued from the consumer point of view that the better the search engine is, the fewer advertisements will be needed for the consumer to find what they want. This of course erodes the advertising supported business model of the existing search engines. However, there will always be money from advertisers who want a customer to switch products, or have something that is genuinely new. But we believe the issue of advertising causes enough mixed incentives that it is crucial to have a competitive search engine that is transparent and in the academic realm." No, it's not an exact comparison or even a particularly close metaphor, but the idea that the most relevant result has to pay for its position is true in both cases. Larry and Sergey knew that was wrong way back then; oh my, how their little project has strayed.

I'm going to go out on a limb here and postulate that Google cannot detect 100% of paid links 100% of the time. I've deduced this solely from their behavior: 1) they still encourage people to tattle on their competitors and do Google's job for them; 2) they have to manually penalize sites by removing PageRank or knocking them down a few hundred notches in the results; and 3) if my buddy calls me tonight and tells me he'll buy me a shot and a beer if I link to him tomorrow, nowhere in that process is Google involved. If they were able to detect paid links, there wouldn't be a need for penalties of any kind; they would just re-rank the index as if said links didn't exist. The fact that Deep Throat and Danny had to have that clandestine meeting is proof enough to me that Google's ability to detect paid links is deeply flawed. By admitting that penalties for selling links exist, they are admitting that they cannot handle them algorithmically, and manual enforcement, as you've heard before, just isn't scalable. Sure, there isn't a shortage of third-world countries with people willing to work for $1 a day hand-checking sites, but at some point the web will become so large that even that isn't scalable for a company with billions and billions to spend.

I don't want to just hammer on Google; I give them a lot of credit, and they are still the best option available for sending free traffic to a site that isn't going to go viral on YouTube. Perhaps Google's inability to detect and devalue paid links isn't all that flawed, because not all paid endorsements are irrelevant. That is what we are after: relevancy. If you want Bill Clinton to speak at your college's commencement, be prepared to pay him handsomely for the honor. That does not make his speech to the leaders of tomorrow any less relevant. Getting Jeff Gordon to use your motor oil and put a little sticker on his car is going to cost you millions, but his endorsement would mean a lot more than the man on the street telling you what to buy. Then again, if Bill Clinton told us what oil to buy and Jeff Gordon wanted to tell us how to work in the global economy, no one should listen to them either. The point is that both of these men are experts in their fields who command a high amount of compensation for their limited time. The fact that they are paid does not render their opinions any less relevant. The same could be said of links. If Stanford links to an academic, the link should carry a lot of weight; then again, if they link to britney-spears-mesothelioma-nude-lawyer.info, it shouldn't be considered an endorsement on that subject either.

In that sense, the drones saying "Don't worry about PageRank" are right. PageRank in its purest form, the sum of the weighted links to a page, shouldn't be worried about. The relevancy of those links should be, whether they are paid endorsements or pure out-of-the-goodness-of-their-hearts editorial links.
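For reference, that purest form is exactly what Larry and Sergey published in the original Stanford paper, where d is a damping factor (usually set to 0.85), T1 through Tn are the pages linking to page A, and C(T) is the number of links going out of page T:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )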

One final note on paid links, which also covers some other webmaster guideline no-nos, like hidden text. We can all easily prove that Google's ability to detect either a paid link or hidden text is limited: create a new page, buy a link, and see if it gets indexed; or create a page with some obscure hidden text and see if you can find it on Google. Even if they could detect 100% of the hidden text and paid links within a month of publishing, that month would be plenty of time for some people to make use of it. With domains costing pennies nowadays, the true black-hatter doesn't even care if a domain is banned, penalized, or blown up completely. By the time that month is over, they've moved on to a hundred or a thousand other sites. Who's really getting caught up in this dragnet is the "honest" webmasters who think they are acting the way they should. They are trying to build a site for the long haul and really want to produce a good product, but they are fed so much bad information that they truly think they are doing the right thing. That is the center of all the anger I have with Google right now: an utter lack of communication with the real webmaster. Daily, many webmasters approach the Google Webmaster Help Group saying things like, "I've exchanged tons of links, bought links, and yet I lost my ranking," not because they are trying to be sneaky but because they feel that is what they SHOULD do. They don't read Search Engine Land or watch Rand's YouTube video of the week because they are busy running their sites. It's the actual honest webmasters who don't have the right information in front of them who are getting hurt, while the black hats slip through the cracks, only to have Google help them by removing legitimate sites by the thousands.
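As an aside, if you want to run the hidden-text half of that experiment yourself, a minimal sketch is a single line of markup on a page you control (the marker phrase below is made up; use any unique string you like):

    <div style="display:none">flamingo-wrench-asparagus-1987</div>

Once the page has been crawled, search Google for the marker phrase and see whether the page turns up.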

I want to rant about PageRank funneling and Google's Green attitude, but that will have to wait for another post. I'm tired.

posted in Google, SEO | 2 Comments

24th October 2007

Happy Birthday To Me!

[Image: cake2.png]

posted in Personal | 2 Comments

23rd October 2007

My Baby’s Got Blue Eyes

[Image: copy-of-blue_eyes.JPG]

posted in Personal | 0 Comments

17th October 2007

“But the emperor has no clothes!”

Much has been said about Google's love affair with the Wikipedia, and I am going to say some more.

The Wikipedia apologists will point to the fact that it has garnered so many natural links that it deserves the rankings. They will also point out the extensive interlinking that helps the site boost its own rankings based on its own authority. This concept is foreign to me, but it must be true in the world of Google*.

Imagine, if you will, a new way of analyzing data developed by two bright college students in their dorm room. Their method was so revolutionary that the dream quickly (in corporate time) grew into a multi-billion-dollar company touching the lives and pocketbooks of everyone involved with the internet. This data analysis method IS their brand; it's what separated them from the rest of the also-rans in the great technology race. They invented it, they own the rights to it, and only they know exactly how it works. Before they wrote about it, it didn't exist. You get my point, I hope: they are THE authority on the subject. In their words, it is "The heart of our software."

Fast forward to today. If you are interested in learning about this concept, which built arguably the web's strongest-performing property, you would probably Google the trademarked name: PageRank.

Who would you expect to be #1 for that query? The founders and inventors of the term? The corporate web site of the trademarked name, with possibly millions of links to it? I would. Apparently, however, Google thinks that the Wikipedia is more of an authority on what PageRank is than Google itself.

[Image: pagerank.jpg]

If this isn't a serious indictment of Google's unnatural propensity to return the Wikipedia in its results, I don't know what is.

The one thing the Wikipedia is good for is as a reference for benign facts and a cursory overview, definitely not as an authority on a subject. So, using it for what it is good for, I found this citation:

Many years ago, there lived an emperor who was quite an average fairy tale ruler, with one exception: he cared much about his clothes. One day he heard from two swindlers named Guido and Luigi Farabutto that they could make the finest suit of clothes from the most beautiful cloth. This cloth, they said, also had the special capability that it was invisible to anyone who was either stupid or not fit for his position.

Being a bit nervous about whether he himself would be able to see the cloth, the emperor first sent two of his trusted men to see it. Of course, neither would admit that they could not see the cloth and so praised it. All the townspeople had also heard of the cloth and were interested to learn how stupid their neighbors were.

The emperor then allowed himself to be dressed in the clothes for a procession through town, never admitting that he was too unfit and stupid to see what he was wearing. He was afraid that the other people would think that he was stupid.

Of course, all the townspeople wildly praised the magnificent clothes of the emperor, afraid to admit that they could not see them, until a small child said:

“But he has nothing on!”

This was whispered from person to person until everyone in the crowd was shouting that the emperor had nothing on. The emperor heard it and felt that they were correct, but held his head high and finished the procession.

Denial is not just a river in Egypt; it's time they realized that the search results are LESS authoritative with the Wikipedia at the top.


*It reminds me of a class I had sophomore year in college. I took an elective called "The Philosophy of God." Don't let the name fool you; it wasn't a religious class but rather an exploration of the human psyche's need to define a being higher than itself. One afternoon's discussion led to Aristotle's concept of the unmoved mover, which we had to try to reconcile with the big bang. When asked to list possible problems with the big bang, I of course fell back on my physics training and cited Newton's third law (for every action there is an equal and opposite reaction). When pressed further by the never-shaved, rarely bathed, and haphazardly dressed doctor of philosophy for a less engineering-based illustration, I challenged him to reach around, grab his pants around the waist, lift himself up, and fly around the room. Being the odd fella that he was, he attempted it, to the amusement of the class and the further proof of my point. That moment earned me my philosophical handshake, an odd moment when the professor got down on bent knee at the head of the class and extended his hand for a handshake. They were given out for simple and brief explanations of complex situations. This story would not earn me such a handshake.

posted in Google, PageRank | 2 Comments

16th October 2007

Faking Googlebombing for fun

Google uses the links pointing to a page to help decide what the page is all about, so much so that it will return a page for a search term even if the term is not on the page. This has been termed the Google bomb.

An actual Google bomb may take many well-placed links to fuse, but you can fake one if you'd like to scare some unsuspecting webmasters into believing they've got some bad anchor text out there.

Let's say we'd like to pull a fast one on our favorite black-hatted red crab and make him think that some people may consider him a black-hatted spammer. You could edit the cache link to include the search terms of your liking and get a cache with the anchor text highlighted (for example):

[Image: sebastians-pamphlets_black_hat.png]

Note that Google inserts the following copy near the bottom of the cache header: "These terms only appear in links pointing to this page: black hat spammer"
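The edit itself is trivial: take the cache link and append your chosen terms to the q=cache: parameter. The result looks something like this (a made-up example; the exact URL format may vary or change over time):

    http://www.google.com/search?q=cache:sebastians-pamphlets.com+black+hat+spammer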

Sebastian is way too smart for this to work on him, but I'm sure you can have fun with it on some other sites. As an added bonus, the page view should show up in the server stats with the search term, an excellent idea if you know someone who watches their logs religiously.

posted in Google, humor | 0 Comments

16th October 2007

Minty Fresh Updating

A while back Matt Cutts pointed out the minty fresh indexing Google is now exhibiting.  Having been lucky enough to be a part of this minty freshness myself, I’ve been watching it a bit. Today I got another fine example of it.

I made a post with the semi-unique title of cuttlet-block, and within a couple of hours I was seeing search traffic to it. So I decided to check it out.

Using regular Google Search:

[Image: cuttlet-block-google-search.png]

And using Google Blog Search:

[Image: cuttlet-block-google-blog-search.png]

Notice the difference? The blog search has indexed the actual post, whereas the regular index is showing the main index page with the title as content on it. Minty fresh indexing is really a misnomer in this case; it's actually minty fresh updating of already-indexed pages (the home page). The new page isn't actually indexed yet, but the search term does show up in the index within hours, on pages that are already regularly indexed.

Note also that the "cache" link isn't present yet in the regular index, and a quick check of the cache shows an older version crawled yesterday.

What is in the cache does not necessarily equal what the page will be returned for in search results.

posted in Google, Matt Cutts | 0 Comments

16th October 2007

Cuttlet-Block

If you are a regular reader of Google's Matt Cutts' blog, you'll notice that more than a few regulars think of it as their personal soapbox. Some are even a little borderline insane.

Enter Web 2.0. Download and install this quick and easy Firefox add-on and those annoying commenters are no more! You can sort through all the fluff in one click to see just what Matt has said, show or hide individual comments, or even block users.

While you are at it, sidle on over and Sphinn the article I found this on.

posted in Matt Cutts | 2 Comments

15th October 2007

Empty Redirect

Back in September, one of my few remaining friends from the Google Webmaster Help Group and I had a bit of a conversation in the comments on my post Googlebot Gave Up.

Somewhere around comment #3, I posted a link to a page I created that returns a 301 status code but doesn't actually redirect anywhere. Because I forgot my overly responsible form of dofollow blogging, the link went live without the link condom approximately seven days after I posted it.
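For anyone who wants to reproduce the behavior, here is a minimal sketch (in modern Python, which is not necessarily what I used; the port is arbitrary) of a server that answers every request with a 301 status and no Location header:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class EmptyRedirect(BaseHTTPRequestHandler):
        def do_GET(self):
            # Send the 301 "Moved Permanently" status line...
            self.send_response(301)
            # ...then end the headers without ever sending a Location,
            # so the redirect points nowhere.
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8080), EmptyRedirect).serve_forever()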

Google promptly crawled that live link, and to my surprise I received an error I'd never seen before in Google Webmaster Tools.

[Image: empty_redirect.png]

The detail declares it an "empty redirect," and clicking the barely discernible question mark leads you to this page, which lets you know that "Google found a redirect on this page, but it didn't point to anything, so the Googlebot couldn't follow it. Make sure that all of your redirects are valid and not empty."

I'm taken aback by the fact that even though I struggled to create a page exhibiting this diseased behavior, it is common enough that Google has a default error message for it. Unbelievable.

I still believe it is not normal behavior to lose your homepage when doing a canonical redirect, even if only for a while, and I find it odd that Google wants to keep this "empty redirect" help page hidden from its own crawlers with the use of "noindex,nofollow". Oh hell, perhaps they are just funneling PageRank.

posted in Google | 0 Comments
