25th January 2008

Google’s “Scalable” Solution



I’m no stranger to Google’s reconsideration request. I’ve helped dozens if not hundreds of people scour their sites, identify possible violations, implement changes, and compose the reconsideration request. I don’t do this professionally but as an extension of my efforts helping webmasters in Google’s Webmaster Help Group. Perhaps it’s because I choose the sites I want to work with and only cater to the ones that I believe are acting in ignorance rather than with more devious intentions, but my success rate is quite high. There’s never been a case I couldn’t solve, though that’s probably due to my selective choices and not my mad Google skills. Either way, I know of what I speak.

Which brings me to an interesting situation that I was alerted to on Twitter, saw on Sphinn, and then saw unfold on Dazzlin’ Donna’s SEO Scoop blog, her take on SEO news, tips, and theories. If you take the time to read Donna’s post you’ll see that she was caught up in the paid-links dragnet and lost some of her visible PageRank. After a while she decided to demonetize her blog and set it up to comply with Google’s guidelines regarding paid links. She’s not Yahoo!, so her time and opinion in choosing which sites to review are not worth compensating if the reviews contain an active link (Google’s opinion, not mine). After cleaning up the site she submitted a reconsideration request to Google. Time passed and yet her PageRank penalty persisted. Five weeks later she finally found some resolution, though not through Google’s reconsideration request, but through the only solution that will actually work.

From my outsider’s point of view, and without any inside knowledge, the situation unfolded like this:

  1. Sometime in late December a reconsideration request was filed.
  2. Five weeks passed…
  3. Donna posts her plight to her blog.
  4. A tweet is sent out.
  5. The post is Sphunn.
  6. 20 people sphinn it.
  7. The Sphinn goes hot 2 hours later.
  8. Matt Cutts comments on her blog, scolding her for her non-scalable method of approaching the situation, but offers to help.
  9. Matt offers to look into another commenter’s site.
  10. Matt says that her disclosure policy could be the problem.
  11. Donna changes her policy and responds that she did so.
  12. Matt emails the Google employee charged with reviewing Donna’s request. Apparently there was another paid post still passing PageRank.
  13. Donna fixes the post and comments that she did so.
  14. Matt points out another violation.
  15. Donna fixes that violation.
  16. Matt praises his team and says that they will get to it soon.

I would not have realized how convoluted this whole process was had it not been for Matt saying, “In general you want to go with the reconsideration request approach rather than invoking me (that’s not scalable :)” [my emphasis]. Obviously this process is not scalable at all. Here we have someone who worked on fixing her site, made some substantial changes, submitted a request for review, and apparently missed some things. What she missed was exactly the same problem she had already admitted to in the reconsideration request, but rather than offering any help, Google filed the request in the circular file and ignored the problem.

Since the majority of site owners don’t know Matt Cutts, don’t know how to use social sites to draw attention to their blogs, don’t have blogs for that matter, and even if they did probably wouldn’t get Matt to write six comments and send an email on their behalf, this is not a scalable solution.

A scalable solution would be the following:

  1. Site owner fixes site and submits a reconsideration request.
  2. Google reviews the site and finds some outstanding violations.
  3. Google sends a message back to the site owner’s Webmaster Tools message center saying, “We have received and reviewed your reconsideration request. Unfortunately, at this time we are unable to act on your request due to continued possible violations of our Webmaster Guidelines. Please feel free to review the Webmaster Guidelines, make any changes that you find appropriate, and resubmit your reconsideration request.”
  4. Site owner digs deeper and sends in another request.
  5. Google responds with another note: “We have received and reviewed your reconsideration request. It appears that your site is now within our guidelines.”

Notice that I didn’t even say that Google had to specify what the violation was. I didn’t even specify whether a penalty has ever existed or been lifted. What I did do is “COMMUNICATE”, letting the site owner at least know that they are being heard. Google’s response can be an automated one with only two possibilities. I’m sure there is a radio button on a computer somewhere that a Google employee clicks when they review a reconsideration request. It wouldn’t be too much to program one of two auto-responses depending on the status of that button, as in the sketch below. That would be a scalable solution.
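
To make the idea concrete, here is a minimal sketch (in Python) of that two-response setup. Every name in it is hypothetical; the auto_respond function, the within_guidelines flag, and the delivery step are my own illustration of how little machinery the idea would take, not a description of Google’s actual systems.

    # A minimal, hypothetical sketch of the two-message auto-responder
    # described above. None of these names reflect Google's real systems;
    # the point is that one stored yes/no flag is enough to drive a reply.

    STILL_IN_VIOLATION = (
        "We have received and reviewed your reconsideration request. "
        "Unfortunately, at this time we are unable to act on your request "
        "due to continued possible violations of our Webmaster Guidelines. "
        "Please review the guidelines, make any changes you find "
        "appropriate, and resubmit your reconsideration request."
    )

    NOW_WITHIN_GUIDELINES = (
        "We have received and reviewed your reconsideration request. "
        "It appears that your site is now within our guidelines."
    )

    def auto_respond(site: str, within_guidelines: bool) -> str:
        """Pick one of two canned replies based on the reviewer's radio button."""
        reply = NOW_WITHIN_GUIDELINES if within_guidelines else STILL_IN_VIOLATION
        # A real system would deliver this to the site owner's message
        # center; here we just label and return the text.
        return f"To {site}: {reply}"

    # Example: the reviewer marked the site as still in violation.
    print(auto_respond("example.com", within_guidelines=False))

Either message could go out the moment the reviewer clicks the button; no extra human effort, and no extra information disclosed.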

Their communication efforts in the help groups and on their webmaster blog have been quite admirable lately, but there is still a disconnect between your average webmasters and those who know how to get to Matt Cutts. I’ve heard many people say and write that one thing you should look for on an SEO’s resume is whether or not they know any search engine engineers, and this situation just adds credence to that, and that is just not right. Not right at all.

Having Matt Cutts be the voice of Google out there, writing on his own blog, commenting on people’s sites, and occasionally penning something on the official webmaster blog, is great and wonderful for the community that watches that sort of thing. I just believe that those people are a small subset of the actual webmaster population, and the majority should not be at a disadvantage because they don’t subscribe to the right feeds.

This entry was posted on Friday, January 25th, 2008 at 2:14 am and is filed under Google, Matt Cutts, Paid Links, reconsideration request. You can follow any responses to this entry through the RSS 2.0 feed. All comments are subject to my NoFollow policy. Both comments and pings are currently closed.

There are currently 12 responses to “Google’s “Scalable” Solution”

Why not let me know what you think by adding your own comment! All the cool kids are doing it.

  1. On January 25th, 2008, Matt Cutts said:

    We’ve certainly debated giving that sort of status back, but the problem is that hardcore spammers would probe to see just exactly how much they could “get away with” and still get reincluded.

    But I should have added in my comments on Donna’s situation something like “The webmaster help group is more scalable for identifying problems than I am.”

  2. On January 25th, 2008, John Honeck "JLH" said:

    Once again I agree with you.

    Feedback has to be given in such a way that it can’t be used in an iterative process to decipher where the lines are. Obviously there is some human element to it as well, as I’ve seen people who’ve gotten very specific messages telling them where hidden text was located, which surprised me.

    I don’t have the exact answer, but I just feel like there is more that could be done. I’m trying to step away from the forums, blogs, and SEM social sites and look at the issue through the eyes of the owner of a single site, and for them the current process is akin to sending a message in a bottle: you don’t know if it got anywhere.

    On the other side of it, I know the troubles of trying to deal with spammers. In my own personal experience I’ve learned that you cannot judge a site solely on the information given on the site. Often people post with problems for a given site that on the surface seems quite benign, but before helping them out I’ll check the history. With my limited tools I’ll often find that the site owner has A LOT of experience pushing the envelope on other properties, and at that point I have to make a judgment call of my own. Do I help them get this one particular site back in, or do I resist so as to not further their shadier tendencies? False link profiles, dozens of thin affiliate sites, scraper aggregators, etc. are just some of the rather obvious stuff I can find, and I don’t have the wonderful data mining tools that Google has.

    I was hesitant to post this at first, as I don’t want to give the impression that I don’t want Matt out there helping people, because I do, but it’s an issue I feel strongly about. For me, helping the silent majority, who aren’t quite as vocal and don’t have a voice in the blogosphere or the search engine circles because, frankly, they don’t know those circles exist, is a cause worth forwarding.

    In the end, hopefully Matt continues his public efforts, but I also hope I at least registered a vote for the cause of finding a way to help the non-hardcore spammers and those without many connections. Great strides have been made in that direction, but I don’t think we are quite there yet.

    Thanks Matt for adding to my discussion.

  3. On January 25th, 2008, DazzlinDonna said:

    John, I agree with you 100%. Heck, I agree with you 500%. And I hope that my post eventually leads to that sort of solution.

  4. On January 25th, 2008, Doug Heil said:

    My, oh my. I don’t know what to say. When has any search engine in history had back-and-forths with people who filed a re-inclusion request of any kind, even though they had not totally fixed all violations?

    Hasn’t it always been the case that if a site owner knows they did something against the guidelines, the site owner is responsible for a “total” cleanup?

    I just don’t know what to say. How does our SEO industry expect Google or any employee to email back and forth with any type of owner who did not follow the guidelines?

    I’m at a loss with this. I really am. It shouldn’t matter who did what and when; if you violate guidelines you should fix things first.

    Second, third, and even fourth and fifth chances?

    My, oh my.

  5. On January 25th, 2008, John Honeck "JLH" said:

    Doug, where did I say that Google was to email the owner back and forth?

  6. On January 25th, 2008, Doug Heil said:

    Hi John, I thought you said that Google would reply back saying all was fine or all was still not fine? It was also Matt Cutts going post by post and telling the owner what the issues still were, etc.

    I just don’t see how things would scale at all in any type of situation. I’m also not understanding the idea of giving a blackhat all the tools they need in order to get re-included… (NOT talking about this particular case at all, but about this issue in general of Google giving correspondence like this to everyone.)

  7. On January 26th, 2008, Dave L said:

    First, Google needs to indicate the area causing the site to be penalized if a site owner requests it. They don’t have to share specifics, but they should give an overview, ESPECIALLY when Google has changed its policy and something that was okay yesterday is not ok today. There should be a generic report of some kind available through webmaster tools.

    Second, no one with a clear-cut issue should be left guessing. Did they simply overlook something? Google should make that information available. If it’s a case of borderline violations, or borderline detection methods, Google can be more obscure so that spammers don’t abuse the feedback.

    Third, there are Google AdWords certified professionals; why not Google SERPs reinclusion certified professionals? Google could provide them more advanced tools or have the relevant Google employees interact with them. Part of ongoing certification could be to help an assigned case a few times per year pro bono.

    And when Google changes their policy, they need to provide a single, clear document on the policy, along with the tools to allow compliance, all in ADVANCE. No one should be left interpreting an interpretation of something Matt Cutts said on video in order to determine the details of Google’s policies.

    And last, Google should not provide methods or tools that CAUSE sites to be penalized. Explanation:

    My least-read blog (on Blogspot) recently lost all Google traffic when I changed the Blogspot template. Almost certainly a duplicate content penalty, as I didn’t have any posts sorted by topic before and now do (and I’m not using any suspicious page elements, etc.). Google traffic dropped to zero within a couple of hours of changing the template and has remained at zero for weeks since. So Google killed my Google results. Pages are still “indexed,” so reinclusion is not an issue, but Google is sending 0-0-0 for traffic.

  8. On January 26th, 2008, dockarl said:

    I think that if Google really put their nose to the grindstone they could find a solution to this problem. I’ve personally thought about possible solutions from just about every angle, and (with brain cells sufficiently knotted) I still can’t find one that doesn’t potentially open the algorithm to the kind of abuse Matt refers to above.

    I do, however, think that obvious and well-known issues such as simple HTML hidden text could be notified without threatening the integrity of the spam detection algorithm, but I guess even that would possibly give the blackhats info about the approximate time it takes an infraction to be detected… sigh.

    This is a particularly tricky problem for Google.

  9. On January 28th, 2008, Joe Hunkins said:

    This is an *excellent* set of observations, and with all due respect to my pal Matt, I’ve always been totally unmoved by Google’s suggestion that making the reinclusion and webmaster information process more transparent would somehow jeopardize Google’s ability to kill spammers.

    In fact, from my observations over the years, I think the lack of transparency, along with initially vague webmaster guidelines (now fixed), has caused many if not most of the spam problems, as both spammers and regular web folks vie to push the limits of the rules while staying in Google’s good graces. The big problem now is the profound inconsistency in the way sites are indexed, and the fact that it’s very difficult for webmasters to get much feedback from Google. Google would be well advised to consider better automated or customer-pays routines to examine websites for problems and allow reinclusion, because the frustration is building more than they realize in the webmaster and small business community.

  10. On January 29th, 2008, Forrest said:

    I agree what you described isn’t very scalable … but I honestly don’t see anything wrong with it. I get the sense you don’t, either?

    Your proposed, more-scalable solution says when a reinclusion request is denied, Google should say “no,” and refer the user to their guidelines … but you aren’t actually suggesting that the onus is on Google to research the site page by page to find each violation of their rules for inclusion. It sounds like the main difference between that and what actually happened is that Google assumes anyone asking to be reincluded probably already knows about their guidelines…?

    Also, Google is a private, for-profit company, not a public utility. PageRank (especially the toolbar variety) isn’t something we’re owed for having web sites. Google would be within its legal rights to reset a site’s PR to zero because one of their employees doesn’t like its template; their stance on paid links and such hasn’t been a secret for the past several years.

    It’s unfortunate it took several weeks and a public spectacle for your friend to get her ad value back. But does Google really owe a site that’s broken their terms of service repeatedly more haste than they would have for any other web site?

  11. On February 10th, 2008, Stumped said:

    “We’ve certainly debated giving that sort of status back, but the problem is that hardcore spammers would probe to see just exactly how much they could “get away with” and still get reincluded.”

    No disrespect, Matt, but that’s a bit of a cop-out; there’s nothing stopping a blackhat or spammer from setting up a network of sites, adding a dash of cloaking, .edu spam, etc., and probing how much they could “get away with” before being shot down.

    It makes no difference, and by not having some sort of reinclusion feedback it affects legitimate people trying to do the right thing more than anyone.

    I just submitted a reinclusion request after my site vanished 2 weeks ago for an unknown reason. It went from xx,xxx visitors per day, mainly from Google-powered search, to about 10 a day in a matter of 5 minutes.

    The worst part is, I finished the last day of my offline job on the Friday to become a full-time webmaster, solely because my site had just started breaking the $400-per-day mark via Google AdSense. On the Sunday my traffic was killed off completely, leaving me with no source of income whatsoever.

    Now I’m left sitting here praying the reinclusion message in a bottle I sent out gets to someone before I can no longer afford to eat.

    I don’t sell or exchange text links, I couldn’t give a hoot about PageRank, and nobody can find anything wrong with my site; it’s believed to be a glitch.

    So I feel sick after reading the AdSense “Success Stories” and how it changes people’s lives… Well, I can definitely say it changed my life just 48 hours after becoming a full-time publisher.

    Anyhow, back to the topic… Please, Matt, something needs to be done about the reinclusion process; put yourself in my shoes. Something coming of this reinclusion request means the difference between losing things like my house and internet connection and being able to live my life as it was before all this happened.

    It’s breaking me, scouring the net all day long trying to work out “what else can I do”.

  12. On February 13th, 2008, Craig said:

    I like the “Web search certified engineer” thingy; where do I sign up? :-)

    Actually, here is an idea. It’s maybe not any more scalable than anything else, but here goes anyway…

    A webmaster’s site tanks, and they come to the Google Webmaster Help forum and pose the problem. If, after everyone diligently works on the problem, no one finds anything obvious (and I think most of us would find things with spammers’ or black-hat sites), the problem is escalated to someone who can peer into Google’s algo crystal ball and give ideas as to what areas are problematic.

    An example of this that I think worked well, although it maybe took longer than it should have, is Sam I Am’s website. We all racked our brains looking for anything and everything, and the real problem turned out to be something that almost no one had imagined.

    A Googler then steps in and gives a general overview of the problems, and Sam I Am and his brother go off and do the work that is needed. In the trade-off we end up with a new and valuable member of the GWMH forum in Sam I Am, who may not be the forum whore that some of us are but often seems to help out at just the right time.

    Could something like this work?
