Today I’ve been inundated with “Best of 2007” SEO, SEM, and search blog and site lists. Amazingly, the only mention Google got was for its Webmaster Central Blog, a quality one at that, but there is so much more information out there that Google offers us lowly webmasters. One of the best-kept secrets in the webmastering and SEO community is the Google Webmaster Help Group, part of Google’s thriving and growing Google Webmaster Central. Unlike some much lesser but more popular forums, site-specific help is available, and it is almost required to get the most information. The discussion on the group is not a matter of theoretical debate but actual practical application. Almost daily (sometimes more, sometimes less) you will see input from actual Google employees, not mere speculation, on all aspects of webmasters’ concerns and Google. Google employees can be easily spotted in the discussion by the little blue icon by their name.
With that being said, most people don’t have enough time to religiously follow the discussion group for the most important nuggets of knowledge, and I could not find a central location that catalogued those contributions. The following is a list of the Googlers that regularly post on the help group, with a link to each profile so you can find their latest posts. I’m sorry if I missed anyone; if I did, please let me know.
posted in GWHG, Google |
I’ve always subscribed to the theory that links are part of the content. In other words, a page is also judged on the quality and topic of the pages it links to. It follows from the academic realm where Google was born: a paper with no references isn’t going to be taken seriously, and neither should a web page be. Obviously linking to the #1 result for the term you are targeting is also going to help them, but the trick has always been to link out to highly authoritative sites for terms that surround the keyword you’re targeting.
Anyway, I may be right and I may be wrong, but assuming for a brief moment that I am right: would a rel=”nofollow” link still count as content and help you rank?
As anecdotal evidence we could use Wikipedia, which isn’t an authority on any subject but does use copious amounts of nofollowed links to actual authorities, and usually outranks them.
Thoughts, anyone? Bueller, Bueller? Is this thing on?
posted in Google |
Adam Lasnik started a contest to predict when Webado would go over 10,000 posts in the Google Webmaster Help Group. The following is the calendar tracking her progress and the contestants’ entries.
posted in GWHG |
I came across an interview with Matt Cutts on Sphinn that I found quite revealing. Normally these types of interviews are very bland and boring: the interviewer gets to build up some sort of street cred by getting a big-name search rock star to talk, and the interviewee gets to spew the normal company spin. This one was a bit different, at least in that I took from it some pearls that can be used in forums to dispel frequently raised concerns.
Leaving the obligatory social interaction out of it, the highlights of the interview as I see them are as follows:
- Syndicating Content - When syndicating content to be published on multiple sources, be sure to include a link within the content to the original source. This will help transfer whatever PageRank the syndicated copies get from external links back to the original source. When Google is deciding which story to return in the results when there are many copies, apparently “a lot of the times it helps to know which one came first; which one has higher PageRank.” On the other side of the story, if you are stealing (“syndicating”) content and want it to rank higher than the original, then don’t include a link to the original source, and get more PageRank to your copy than theirs.
- Supplemental Results - There are a couple of undocumented methods for finding your supplemental page count that are still working. At least one data center is now actually searching the supplemental black hole for 100% of its queries, with “hopefully more in the future.”
- Link Quality - “A link is a link, is a link; wherever that link’s worth is, that is the worth that we give it.” A .edu link does not count more than a .com link based on any specific weighting in the ranking algorithm; it just happens that many .edu links are naturally better than your average easy-to-get .com link. Additionally, social bookmarking links follow the same guidelines and are not devalued based on their social-network status; if they are weak, they are weak on their own, without any help from a calculation.
- Link Count - The old adage of 100 links per page is a bit outdated; a good example is DHTML flyout menus, where many links can appear on one page. Matt notes that a page with 5,000 links would have its PageRank so diluted when it came to distribution that the links wouldn’t pass much. The question I think wasn’t answered here concerns that DHTML menu structure: navigation like that tends to be site-wide, and Google is quite good at determining which part of a page is the template or site-wide stuff versus the actual content. My question would then be: is PageRank flow just a simple division of PageRank by link count, or does more weight go to links in the actual content and less to site-wide navigation? Obviously a page with 1,000 links (like a sitemap) isn’t user friendly; the designer should have provided a logical tree for the user to find the information, rather than making them read 1,000 links and figure it out for themselves. Bottom line: 100 links per page isn’t a hard and fast number, but keeping it reasonable still applies.
- NOFOLLOW Passing Anchor Text - In its early days there were some rare, bug-like instances where the anchor text of a nofollowed link was used in the search results. Those bugs have been killed. Right now, “At least for Google, we have taken a very clear stance that those links are not even used for discovery; they are not used for PageRank; they are not used for anchor text in any way. Anybody can go and do various experiments to verify that.”
- Predatory Link Buying - Buying links for your competitor in hopes of hurting them is more than likely going to help them, as Google is mostly attacking the link sellers. Kind of goes without saying, but I bring it up only because the original Webmaster Guidelines on the issue addressed only buyers, not sellers.
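To make the dilution in the link-count item concrete: under the classic simplified PageRank model, a page's passable PageRank is split evenly among its outlinks, so the per-link value falls off directly with link count. This is my own back-of-the-envelope sketch, not anything Matt confirmed, and it ignores my open question about content links being weighted differently from navigation:

```python
# Naive illustration of PageRank dilution: in the original simplified
# model, a page splits its passable PageRank evenly among its outlinks.
DAMPING = 0.85  # classic damping factor from the original PageRank paper

def pagerank_per_link(page_rank: float, outlink_count: int) -> float:
    """PageRank passed through each individual link on a page."""
    if outlink_count == 0:
        return 0.0
    return (DAMPING * page_rank) / outlink_count

# A page with 100 links passes 50x more per link than one with 5,000.
per_link_100 = pagerank_per_link(1.0, 100)    # ~0.0085
per_link_5000 = pagerank_per_link(1.0, 5000)  # ~0.00017
```

Whether Google still computes it this crudely is anyone's guess, but the arithmetic shows why a 5,000-link page passes next to nothing per link.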
There’s a lot more in the interview.
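On the nofollow point, any of the “various experiments” Matt mentions starts with knowing which of a page’s links actually carry rel=”nofollow”. Here’s a minimal audit using Python’s standard-library HTML parser; this is my own illustration, not a tool from the interview:

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Collect (href, is_nofollowed) pairs from the <a> tags in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        self.links.append((attrs.get("href"), "nofollow" in rel))

html = '<a href="/about">About</a> <a rel="nofollow" href="http://example.com">Ad</a>'
audit = NofollowAudit()
audit.feed(html)
# audit.links -> [('/about', False), ('http://example.com', True)]
```

Feed it your rendered page source and you know exactly which links Google says it ignores for discovery, PageRank, and anchor text.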
posted in Google, SEO |
Earlier tonight I left a comment on Sphinn:
Of or relating to an endless and ineffective task.
This one comes straight out of Greek myth. Sisyphus was a king of Corinth, a son of Aeolus (the ruler of the winds, hence our word aeolian for something produced by or borne on the wind). In later legend he was the father of Odysseus or Ulysses. His name actually meant “crafty” in Greek: he was noted for his deception and he’s the equivalent in Greek folklore of the master trickster who turns up in many folk beliefs, such as Coyote in American Indian mythology. He even managed to cheat Death the first time around, surviving the experience to live to a ripe old age. In Greek legend Sisyphus was punished in Hades for his misdeeds in life by being condemned eternally to roll a heavy stone up a hill. As he neared the top, the stone rolled down again, so that his labor was everlasting and futile. The word first appeared in English in the middle of the seventeenth century. It isn’t used much these days because so few people understand the reference to classical literature.
The parallels one can draw from that story are uncanny. It also answers the question, beyond the obvious not-helping-the-spammers angle, of why Google doesn’t notify everyone of their penalties: their cause, their existence, or their cure. The announcements about paid links have accomplished the same task, paralyzing the community in an endless and ineffective task of one-sided debate.
posted in Paid Links |
This was found today on the Google Webmaster Help Group. A site selling its SEO services was banned/removed from Google’s index. When pushed, the poster admits that he doesn’t really know any SEO and is just “outsourcing” the services. Apparently he’s got a few of these sites, or at least has scraped some, all with the fine keyword-stuffed bottom navigation (classic), clip-art images, no external links, you name it.
I don’t want to out the guy as he’s got plenty of troubles already, but this is just classic. From the FAQ of the site:
Why is [site name] not ranked high on the search engines?
[site name] web site is intentionally not optimized for search engines because our services are for companies needing high traffic exposure and awareness. The less traffic we receive the better because we focus on qualified and selected clients that will actually benefit from High Rank optimization.
You just can’t make this stuff up.
posted in GWHG, SEO |
I am thinking about doing a small study of Matt Cutts’ blog. There seems to be a point in every one of his controversial popular posts where Matt just bows out of the conversation and it devolves into bickering, unsubstantiated claims, personal attacks, and even more confusion than before the post was made. Perhaps it’s as simple as Matt having said all he wants to say on the matter, as he doesn’t throw these things out there for debate but rather for information. On the other hand, maybe he just gets sick of the inane conversation and moves on to bigger and better things. In his leadership position at Google I’m sure decisions are made daily and acted upon without endless debate, so his blog may reflect that aspect of his style. Either way, I’d suggest he invest in a free WordPress plug-in that automatically closes comments after a certain time period, or just manually shut them down when he’s no longer interested in responding. There are many examples of what started out as a good discussion decaying into a mess that only leads to more confusion. I’d like to use his blog as a point of reference for many things, but oftentimes the actual post is so polluted with unmoderated gibberish that sending someone there to read it would only open up a whole new set of issues.
Just take a look at this quagmire that used to be an insightful and intelligent conversation. Perhaps there is some correlation to be drawn between the post date, Matt’s last comment, and the point when the conversation turns just plain silly. If I could put together such a relationship, there may be a way to modify the CuttletBlock script to not only block regular troublemakers and lemmings but also block out the impending noise.
Then again it’s his site and he’s free to do with it as he pleases, and I’m only a reader paying nothing to view it. Maybe there is a lesson in there somewhere for people who would like to debate the moral grounds of Google’s paid-link policy. Google is just a website, after all, and how they choose to run it is their business.
On second thought, Matt, you can operate your site as you see fit.
(Intentional 70’s SNL reference just for us old geezers)
posted in Google, Matt Cutts |
Do you use Bad Behavior on your WordPress blog? If so, be sure to update it RIGHT NOW, before you are blocked from your own blog.
From the site:
All users should update to Bad Behavior 2.0.11 immediately to prevent being blocked from your own site.
Within the past two days users have found themselves blocked from their own sites while using recent versions of Bad Behavior. A third party blacklist which Bad Behavior queries recently began sending false positives for any IP address queried, causing everyone using Bad Behavior to be blocked. This issue is fixed in Bad Behavior 2.0.11.
Please Sphinn it if you will (Sphinn’s actual site, not mine).
posted in Site News, Wordpress |
Working on an Excel spreadsheet and forgot (nudge, nudge, wink, wink) the passwords, so that you cannot get to the hidden sheets, modify the VBA, or unlock cells?
Then you need the Excel Password Remover. It’s free, it works, and it’s very stable.
~This is NOT a paid or sponsored review, just an actual I-used-it-and-I-recommend-it endorsement.
posted in engineering |