I want to start out by saying that I haven’t wasted my time reading patents, nor do I have any inside knowledge of whether this is fact or not. It’s just pure theory, conjecture, hypothesis, observation, opinion…it’s a blog post.
We all know that one of the factors in a site’s ranking possibilities is the popularity of the site/page. The de facto measurement of this popularity is the number of links the site has. Links drive entire economic segments of internet marketing, from buying and selling them to simply creating them. At the center of the link popularity firestorm is Google’s PageRank, a measurement of a page’s importance, which is purely a calculation of the quantity and quality of the links pointing to a page.
Douglas Fairbanks was a really popular guy; he had millions of fans who paid their last nickel to see his movies. Unfortunately, Douglas is dead, and hasn’t made a movie since the 1930s. His popularity didn’t wane, and his devoted fans were still fans, but new things came along in the movies (like sound) and they became fans of those as well. The point being, as in life, being popular is a continuous effort. You cannot reach a certain amount of fame, or links, and then sit back and enjoy the ride.
Google keeps track of your links. We check for them using the link: operator, log into our Webmaster Tools account to check the links, go to Yahoo and use their Site Explorer, and wait for that quarterly PageRank update. By logging the links to a site, they also collect another crucial metric that is rarely discussed: time. Somewhere in Google is a database logging something like: link X, with a given PR, pointing to page Y, found on DD/MM/YY at HH:MM:SS. All of the online tools available track the quantity and, to a lesser extent, the quality of the links, but the time factor is never mentioned. Given the time factor, a whole host of calculation possibilities arises; I’m going to go over some of the implications.
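Just to make the idea concrete, here’s a toy sketch of what such a time-stamped link record might look like. The field names and structure are entirely made up by me; I obviously have no idea what Google’s actual tables look like:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical record of a discovered link -- invented fields, for illustration only.
@dataclass
class LinkRecord:
    source_url: str       # page the link was found on
    target_url: str       # page the link points to
    source_pagerank: int  # rough quality signal of the linking page
    first_seen: datetime  # when the crawler first logged the link

record = LinkRecord(
    source_url="http://example.com/some-post",
    target_url="http://mysite.com/",
    source_pagerank=4,
    first_seen=datetime(2008, 5, 14, 13, 42, 7),
)
print(record)
```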
Displacement is the total number of links a site has; it’s the distance from zero along a straight line to the total. It’s the one factor we can gather some data on ourselves by using online tools. When evaluating a site’s performance problems, it’s usually the first place any forum observer goes, saying things like, “You don’t have enough links, get more to rank for anything” or conversely, “I don’t know why you don’t rank, you’ve got 8,000 links.” The displacement of your site’s links is the sum total of all the links you’ve received, less the ones you’ve lost, giving a snapshot of the site’s health. Older sites tend to have more links, since they’ve been around a while to gain them, as do popular or trendy sites, as they tend to get them quickly. Not-so-good sites, or sites about obscure subjects that nobody is interested in, tend to have fewer; new sites may have none.
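Spelled out with made-up numbers (this is just my simplification, not anyone’s published formula):

```python
# Displacement: total links gained over the site's life, less links lost.
links_gained = 8_500
links_lost = 500

displacement = links_gained - links_lost
print(displacement)  # 8000 -- the snapshot number the link: operator hints at
```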
Using the time data of link acquisition, another variable can be calculated: link velocity. Velocity is defined as the rate of change of displacement, given in units of displacement per time (MPH, m/s, ft/min), or for site popularity let’s say links/week, links/day, links/year, or links/site-age. Velocity is the rate at which your site is gaining or losing links. It’s not easily viewed in any of the online tools or data given. Positive velocity is anything above gaining zero links per time period. If you’ve gotten one link in the last week and not lost any, you’ve got positive link velocity. However, if you haven’t gotten any links this week but lost 3, you’ve got negative velocity. Velocity is a much better indication of how the site is currently doing than displacement. For example, if you’ve got a site with 10,000 links to it, normally we’d say that site is fairly popular, but if it’s only gaining 2 links a week at the moment, it really isn’t that popular any more. Sure, you still get your credit for having 10,000 links, but some consideration has to be given to what you’re doing today.
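As a sketch, assuming you could snapshot your own link total at two points in time (the numbers are invented to match the example above):

```python
# Link velocity: change in link total per unit time.
links_last_week = 10_000
links_this_week = 10_002
weeks_elapsed = 1

velocity = (links_this_week - links_last_week) / weeks_elapsed
print(velocity)  # 2.0 links/week -- positive, but hardly a hot site
```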
Another calculation is overall velocity, or velocity calculated over the time frame of the entire event. Let’s consider two marathon runners. Both have run the entire distance of 26+ miles. The first runner completed the journey in 3:00 hours, the second took 7:00 hours. Our first runner has an overall velocity of 26/3, or a little over 8½ MPH, while the second averaged about 3¾ MPH. In the web world they’d both have the same PageRank (26+ miles), but entirely different link velocities: one may have taken 3 years to get to a PageRank of 6 while the other has done it in 9 months.
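Here’s the marathon math spelled out, along with the link analogue using hypothetical totals:

```python
# Overall velocity: total distance (or links) divided by total elapsed time.
marathon_miles = 26.2
print(marathon_miles / 3)  # ~8.7 MPH for the 3-hour runner
print(marathon_miles / 7)  # ~3.7 MPH for the 7-hour runner

# Same idea for links: two sites at the same PageRank, very different pace.
print(10_000 / 36)         # ~278 links/month, accumulated over 3 years
print(10_000 / 9)          # ~1111 links/month, accumulated over 9 months
```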
Doing a further calculation on the links and time data, we can come up with the link acceleration, defined as the rate of change of velocity. Knowing the acceleration of an object tells us the trend that object is taking: is it gaining or losing velocity? So now for a web site we’ve got three parameters to look at: the total links, the velocity of gaining or losing links, and the rate at which that gain or loss is changing. For a given site that has 10,000 links to it, last week it may have gained 50 new links for a velocity of 50 links/week. The week prior the site gained 40 links, so the site is accelerating; its velocity has gone up by 10 links per week per week, showing an upward trend in popularity. On the other hand, let’s take the same site with 10,000 links that has gained 50 links this week, but last week it gained 80 links. The acceleration of the site is negative 30 links/week/week. It’s still fairly popular, and it’s still gaining in popularity, but the rate at which it’s gaining popularity has slowed down. This isn’t something that’s normally noticeable when doing a site evaluation, as the data just isn’t there for us to gather, unless it’s been gathered and logged along with the time dimension.
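The two scenarios above, as a quick sketch:

```python
# Link acceleration: change in velocity per unit time (links/week/week).
def acceleration(velocity_last_week, velocity_this_week, weeks=1):
    return (velocity_this_week - velocity_last_week) / weeks

print(acceleration(40, 50))  # +10 links/week/week -- popularity trending up
print(acceleration(80, 50))  # -30 links/week/week -- still gaining, but slowing
```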
I’ve discussed how a site’s link profile could be used to evaluate its current popularity and trends, but there is another consideration. Since Google has this information on all the web sites in its index, one would have to assume that they use it on a comparison basis. For a given search term, Google whirls and buzzes and comes up with a ranking based on its 11 secret herbs and spices, then comes to link evaluation. The first factor is raw popularity: how many links point to the sites relevant to the search. The second is velocity: how fast or slow has each site been gaining links. The third is acceleration: has that gaining or losing of links been trending up or down. Comparison is important because link popularity, velocity, and acceleration do not have the same weight in the ranking algorithm for all sectors. If you are searching for the history of WWII, one would assume older, more popular sites would rank higher, because the history of WWII hasn’t changed, the interest in the subject is pretty steady, and velocity and acceleration should follow the web as an aggregate (as the number of all available links grows, so should a site’s share). This is where an old established authority site would probably be unbeatable in the ranks. Other topics, however, may not have a history to consider; they are new, so velocity and acceleration would have to be weighted more heavily.
Some implications and observations
So now that I’ve hopefully got you thinking about something other than just how many links you’ve got, let’s consider the implications of such ideas in regular search behaviors.
Google has yet to celebrate its 10th anniversary and the internet does not seem to be going anywhere soon. If the algorithm were purely based on PageRank or total links, eventually all the web results would settle down to a select small group. Older established sites with their millions of links would continue to get millions of links and eventually be unbeatable. Well, this is obviously not the case, as new sites and trends pop up in popularity all the time. It is conceivable that in 20 years’ time, when we’ve got a real history to look at (30 years of Google), there will be sites that have millions (if not billions by then) of links but don’t rank for anything at all. Sites that may be popular today will still have their links, but will not gain them as before and will be filtered to the bottom. Just like our friend Douglas Fairbanks: his fans didn’t stop loving him, they just started liking other things more. I’m looking forward to the day when I can look back at the internet with my grandkids and tell them about the days when every search turned up a wiki result. And I can show them the old and busted wiki site sitting there with 10 billion pages of content and 20 billion links, not showing up at all in search results because no one has linked to it in the last 10 years….ahh, one can dream.
It’s been observed by many that PageRank isn’t everything, and the primary proof of this is the search results page. It’s been pointed out in a million different places that a PageRank 2 page can outrank a PageRank 6 page. Beyond on-page factors such as content, the explanation may be link velocity and acceleration. The PageRank 2 page may not have as many total links at the moment, but it’s been getting them at a quicker pace than the PageRank 6 page.
Another phenomenon that is discussed often is the newness factor. Fresh sites and pages tend to get a bump in the SERPs, then settle down into a lower rank. A new page has no history, so when an acceleration calculation is done, its acceleration is huge. If its displacement and velocity were zero last week, but this week it has 10 links, it has accelerated tremendously. For an established page that already has 100 links to match our new page’s relative growth, it would need to get 1,000 links in the same week.
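One way to read that arithmetic (my interpretation only) is to treat growth as relative to the links a page already has, with a floor of one so the brand-new page doesn’t divide by zero:

```python
# Relative growth: new links this week divided by the existing link count
# (floored at 1 so a brand-new page with zero links still gets a number).
def relative_growth(existing_links, new_links):
    return new_links / max(existing_links, 1)

print(relative_growth(0, 10))      # 10.0 -- the brand-new page
print(relative_growth(100, 10))    # 0.1  -- the same 10 links barely move an old page
print(relative_growth(100, 1000))  # 10.0 -- what it takes to match the new page
```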
I’ve read in some forums the theory of don’t get “too many links too fast.” I’ve always thought that was an odd theory, as it’s a natural phenomenon: when Apple announced the iPhone, I’d imagine it got a few links that day. However, where there may be a grain of truth to it is in unnatural linking. Let’s say you decide your site is lagging, so you take a break from content generation and go on a week-long link building campaign. You write to hundreds of sites asking them to take a look at your content, suggesting where they could benefit from linking to a page on your site. You also submit to a few hundred directories, and then go buy a couple hundred link ads. Initially you’ll probably see a substantial boost in traffic and probably rankings; you may even see some more green pixels in the toolbar on the next update, so you’re happy. You go back to business as usual, and then a month later you’re in WebmasterWorld whining that you’ve got the too-many-links-too-fast penalty. I’d suggest that there is no such penalty, just that you’ve made your site look like it’s losing popularity rather than gaining it. Sure, the site gets some credit for having more links than it used to, but when put into context with the temporal data, it looks like it had a big gain in popularity one week and then the popularity waned the next. Sure, you want to outpace your competitors in link building rate, but remember that slowing down in link building is also a sign. A link building campaign unnaturally sets the bar higher for a site; when that campaign stops, you can no longer maintain the false popularity acceleration that it portrayed. Once our webmaster quits whining in WebmasterWorld and moves on to other things, the site will eventually settle back into its natural link growth and probably regain its original rankings.
Spend any time watching webmastering forums and one recurring theme you’ll see is “I’ve done nothing and all of a sudden my rankings dropped,” also known as the -950 or whatever penalty. At this point many will head on over to Site Explorer and check out the links, and the site owner will point to a bunch of great links they’ve got on Microsoft’s home page, etc. What isn’t considered is acceleration. Remember there is negative acceleration as well, where the velocity is slowing down, and there is even negative velocity, where your link total is dwindling. If your competitors are gaining links at a regular pace but you’ve just lost some, your site may appear like it’s penalized. Once again the problem lies in only looking at the link total and not knowing the link trend. If the site has 10,000 links and gains 100, it’s probably not going to be noticed by observing link: commands; 10,100 looks a lot like 10,000. On the other hand, if the site was normally getting 100 links a month but then in one day lost 500 links, an interesting thing happens. The popularity will appear pretty much the same; 9,500 links looks just as good in Site Explorer as 10,000 links. BUT the velocity will be negative, and the acceleration will be hugely negative, because the links were lost in a short period of time. Now, 500 people rarely get together and decide to remove some links, but Google does it all the time. In Google’s never-ending quest to improve its ranking algorithm, they are always re-evaluating which links count and which don’t. If they’ve recently discovered that 500 of your links are footer links on sites you bought them from and simply discount those links, it may appear as a penalty because of the negative acceleration and velocity. No amount of writing reconsideration requests is going to get the site back into the rankings, because the effect of the negative link trend will still be there. This also explains why some sites that suffer a sudden ranking drop come back into the rankings slowly. As time goes by, that negative spike in acceleration slowly fades into the site’s average. The site’s natural positive acceleration will slowly show that it’s again gaining in popularity, and the effects of losing the links all at once will be eliminated.
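A sketch of that scenario, with invented numbers: steady gains of 100 links a month, then 500 links discounted at once. The running total barely moves, but the month-over-month change tells a very different story:

```python
# Monthly change in counted links: steady gains, then 500 links discounted at once.
monthly_deltas = [100, 100, 100, -500, 100, 100]

totals = []
running = 10_000
for delta in monthly_deltas:
    running += delta
    totals.append(running)

print(totals)  # [10100, 10200, 10300, 9800, 9900, 10000] -- barely looks different
print(monthly_deltas[3] - monthly_deltas[2])  # -600 links/month/month at the spike
```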
Blogs tend to get a bad rap for being able to rank fresh posts quite fast and then fading into obscurity. I think this has to do with a blog’s infrastructure and the link velocity and acceleration factors. When a new blog post is published, it is shown on the front page of the blog, in a couple of categories, and probably in the archives. If the blog is remotely popular, many sites aggregate the feed and also publish the story on their front page, categories, etc. Unlike adding a new product under an existing category on your ecommerce site, a new blog post gets tons of link pop right out of the gate. Its link acceleration is huge. After some time goes by, acceleration stops, velocity goes to zero, and displacement stagnates. The blog post then fades into a ranking position much lower than at initial publication.
The circles I travel in tend to bring me to a lot of professional SEO sites and people. One of the main tenets of being an SEO is that SEO is not a one-time thing; you cannot just SEO a site and let it ride, you need continued SEO. Many have very good proof of this, being able to document sites that they used to work on where the client stopped using their services and the site slowly dropped into the abyss. Part of an SEO process is a link building campaign. The campaign can be as white-hat as possible, generating only natural links, yet it is unnatural in that it outpaces the site’s natural abilities. Stopping this campaign will be seen as a negative acceleration and thus a slowdown in link velocity for the site. The key to a good link building campaign is not to outpace the natural link building of the site by too much. When link building is stopped, it cannot stop all at once. The site needs to wean itself off link building, slowly reducing its link building activities until its normal velocity is within the deadband of the unnatural link building. At that point the site can live on its own without a negative rankings drop.
In conclusion, I’d just like to sum up by saying that a site’s link total is not the only indication of its health. There may be factors in a site’s upswing or downswing in rank other than just the total links. Any unnatural link building, whether by-the-rules or not, can be seen in time-trended algorithms.
Coming up next: How you can monitor your own link velocity and acceleration, kind-of.