In the early days of the Internet, search engine optimisation was a fairly simple affair. Webmasters (the SEO professionals of their day) had a straightforward job optimising web sites. To rank for a keyword, all that was required was to place it in the title and the keyword meta tag and mention it freely in the site content, and hey presto, you were done. This, however, led to 'keyword stuffing': repeating the keyword in the meta tag and hiding keywords behind images in order to manipulate the search engines.
The search engines caught up with the tricks these webmasters were using and introduced another factor to influence a site's position on the results pages: the number of incoming links to the site. This was around the same time that Google's PageRank algorithm came about. PageRank essentially treats each link to a page as a 'vote' for that page, so the more links a page has, the more authority it carries.
The focus on the number of inbound links led to another form of spamming. People started buying or swapping links simply to inflate their link counts. Clearly, a web page with 1,000 links is not necessarily more authoritative than one with 100 links when those links can simply be bought.
So once again the search engines had to re-evaluate their criteria for gauging the importance of a web site. This time they focused on relevance: how closely the subject of the linking site matched that of the site being linked to. If the subjects were related, the search engines deemed the link valuable; if not, it counted for little and could even be harmful.
Search engines are constantly tweaking their algorithms, so SEO professionals have to keep their wits about them to stay ahead of the game. Google's Panda update penalised web sites that relied on huge numbers of links from sites known collectively as link farms. These are web sites with many links and very little content, useful to no one except for SEO purposes. It would appear that search engines have been following the same path since the dawn of the Internet: ranking web sites higher based on their usefulness to people surfing the web, not to SEO practitioners. This suggests that the future of good search engine optimisation will focus more on what the user wants and less on how search engines can be manipulated.