The story of the search engine begins with 'Archie', created in 1990 by Alan Emtage, a student at McGill University in Montreal. At the time the World Wide Web and its protocols did not yet exist. However the Internet did, with many files scattered over a vast network of computers.
The main way people shared data was via File Transfer Protocol (FTP). If you had a file you wanted to share you would set up an FTP server. If someone was interested in retrieving the data, then they could access it using an FTP client. Even with archive sites, many important files were still scattered on small FTP servers. This information could only be located by the Internet equivalent of word of mouth - with somebody posting an email to a message list or a discussion forum announcing the availability of a file.
Archie changed all that. Archie's gatherer scoured FTP sites across the Internet and indexed all of the files it found. Its regular expression matcher provided users with access to its database. And there it was - the world's first search engine.
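The mechanism can be pictured in miniature: a gatherer builds an index of FTP hosts and file paths, and a regular expression matcher answers queries against it. The sketch below is a hypothetical toy, not Archie's actual code, and the index entries are invented for illustration.

```python
import re

# Hypothetical miniature of Archie's model: the "gatherer" has already
# collected (host, path) entries from FTP listings into an index.
index = [
    ("ftp.mcgill.ca", "/pub/archie/README"),
    ("ftp.uu.net", "/graphics/gif87a.txt"),
    ("ftp.funet.fi", "/pub/unix/editors/emacs-18.59.tar.Z"),
]

def search(pattern):
    """Return index entries whose file path matches the regular expression."""
    rx = re.compile(pattern)
    return [(host, path) for host, path in index if rx.search(path)]

print(search(r"emacs"))
# → [('ftp.funet.fi', '/pub/unix/editors/emacs-18.59.tar.Z')]
```

A user who had heard of a file but not its location could query by name fragment and get back every server known to hold it.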
The first website was built at http://info.cern.ch/ and put online on August 6th 1991. It explained what the World Wide Web was, how to get a browser and how to set up a web server. It was also the world's first web directory, since its creator, Tim Berners-Lee, maintained a list of other websites.
By the end of 1994, the web had 10,000 servers, of which 2,000 were commercial, and 10 million users. Traffic was equivalent to shipping the entire collected works of Shakespeare every second - minuscule by today's standards.
Primitive web protocols were established and the technology evolved. As the web grew, it became more and more difficult to sort through all of the new pages added each day. Web robots were devised that crawled the net, following links from site to site and capturing and indexing website URLs in giant search databases. Search engine brands we are familiar with today, such as Excite, Lycos, Infoseek and Yahoo!, started to appear around the mid-1990s. Results varied enormously from engine to engine depending on the underlying technology, and so MetaCrawler was developed, combining the output from several engines onto one concise results page.
It wasn't long before advertisers noticed the massive popularity of search engines compared to other types of site. Receiving daily hits in the millions, the search engines had stumbled across a search-driven advertising gold mine. The rewards for websites placed on a search engine's first page through high search ranking started to grow as visitors clicked through to the site and followed the call to action.
Clicks turned into cash as the Internet became financially viable through advertising revenues, e-commerce and other commercial opportunities. Webmasters sought ever more inventive ways to get their sites to the top of search returns. In so doing they created what has since become the multimillion-dollar Search Engine Optimisation (SEO) industry.
Over the last decade the SEO techniques used to secure top positions in search results have changed repeatedly, as the search engines battle to preserve their integrity by keeping search relevance the number one priority when generating results. All search engines apply highly complex and closely guarded algorithmic formulas to assess queries and match them to (hopefully) the most relevant returns. These formulas are the core differentiators between search companies - their currency - and the means by which they claim competitive advantage over each other. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, allowing those results to be gamed would drive users to other search sources.
The optimisation/algorithm dance is a game of cat and mouse, with the search companies working hard to stay one step ahead of webmasters and SEO marketers. In the early days of search engine optimisation, getting listed was straightforward: descriptive file names, page titles and meta descriptions with keywords in sufficient density would normally do the trick. Often the returns weren't particularly relevant, but people's expectations weren't that high. The 'add URL' function was king, and software that made automated submissions was a large part of SEO strategy.
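"Keyword density" simply meant the share of a page's words that were the target keyword, and early optimisers tuned pages toward whatever ratio seemed to rank well. A minimal sketch of the metric, with an invented example page for illustration:

```python
import re

def keyword_density(text, keyword):
    """Fraction of the words in `text` that equal `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Hypothetical keyword-stuffed page text: "cheap" is 3 of 12 words.
page = "Cheap flights! Book cheap flights to cheap destinations with our flights site."
print(keyword_density(page, "cheap"))  # → 0.25
```

Engines of the era rewarded a high ratio, which is exactly why webmasters stuffed pages with repeated keywords until the algorithms were changed to penalise it.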