
Search Engine Optimization: The Past, Present and Future


The period from 1997 to 1999 was the formative era of search engines. SEO specialists were simply the people who submitted sites to search engines. An echo of that past can still be heard today, when old-fashioned SEO companies offer to register your site in thousands of search engines.

Search engines used indexing programs (robots, spiders) that crawled through the HTML code of pages and applied ranking algorithms: PageRank, TrustRank and others that are kept secret.

Those days were bliss for spammers, and it was much easier to achieve high rankings. You could repeat keywords any number of times in the body content, in META tags, in comments, and so on. It was also possible to hide all of this from visitors by making the text invisible with special HTML tricks.

Search engines did not yet have the technology to recognize this spam, so such sites easily reached the top positions. Such primitive optimization is still encountered today; however, these projects are now most often "banned" (removed from the index).

The exception has always been Yahoo, where human editors indexed pages and excluded spam. Over time, search engines became able to distinguish spam from legitimate pages. Search engine optimizers, however, have always ended up one step behind the search engines when it comes to discovering new ways to cheat the indexing algorithms.

That is why search engines can deliver relevant results, largely free of spammers' tricks. Some engines also began to use different indexing methods.

In mid-1999, search portals began to use the logic of the ordinary Internet surfer to improve search results. The search engine DirectHit introduced technology that tracked which sites users actually chose.

If visitors frequently choose the same site for a particular keyword, that site will most likely rise in the rankings (this effect could be seen in the Rambler search engine, where the first positions were occupied by the sites ranked highest in its Top 100).

Another way to make indexing algorithms more effective is to track the number of sites linking to a given page. This principle of search engine optimization originally came from library and archive citation systems, and it still plays an important role in driving visitors to a site.

This method was called link popularity, and it remains one of the main ranking factors. Both of the parameters described above are known as external (off-page) factors, because they do not depend on the content of the page itself and are therefore much harder for spammers to influence.
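To make the idea of link popularity concrete, here is a minimal sketch of the kind of calculation the published PageRank formula describes, assuming a made-up toy link graph and the commonly cited damping factor of 0.85. It is only an illustration of the principle, not Google's actual implementation.

    # Minimal PageRank sketch: rank pages by their incoming links.
    # Toy graph (hypothetical): each page maps to the pages it links to.
    links = {
        "a.html": ["b.html", "c.html"],
        "b.html": ["c.html"],
        "c.html": ["a.html"],
        "d.html": ["c.html"],
    }

    damping = 0.85                                # commonly cited damping factor
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}   # start with equal ranks

    for _ in range(50):                           # iterate until ranks settle
        new_rank = {}
        for page in pages:
            # Sum the rank passed on by every page that links here,
            # split evenly among that page's outgoing links.
            incoming = sum(
                rank[src] / len(outs)
                for src, outs in links.items()
                if page in outs
            )
            new_rank[page] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank

    for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")

Pages with more (and better-ranked) incoming links end up with higher scores, which is exactly the property that link farms, described next, tried to exploit.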

The SEO industry found a way around this with the creation of so-called link farms. The idea is simple: if search engines count incoming links to a site, then you can create a dedicated link page aimed at crawlers. On this page you post many links to your own page and to other similar pages, and those links do their work when the robot begins indexing them.

Such link pages link to one another, so a link farm is essentially a community united by links, with the goal of achieving high rankings for its members' sites.

While link farms enjoyed their popularity and even developed link-exchange software, search engine crawlers already knew about all of this. Today, search engines no longer give these types of links any significant advantage.

Yahoo was the first search engine and still remains one of the most popular. Since Yahoo's directories are created by people, the company has faced a shortage of human resources, and its editors are still striving to cover as many pages as possible.

Community-published directories first appeared in 1999. This model allows thousands of editors, connected over the Internet, to update the catalog continuously. The first of these networks was the Netscape Open Directory, and the www.go.com directory became another leader.


The Netscape Open Directory, in addition to being published by a community of editors, is also openly accessible: anyone who wants to improve their own search portal can draw on it.

One telling fact is that by 2000, listings from the Netscape Open Directory began to appear in many search engines. When community-edited directories build not only a large but also a high-quality network of links, they become weighty players in the search space.

Human-edited directories act as a kind of advocate for search engines, because they provide highly relevant results. As such directories grew in importance around 2000, marketers began to concentrate on focused, targeted, high-quality website optimization.

Google

The Google search engine began its journey to the title of King of Search Engines in 2000, and by 2002 it could rightfully claim that title, since about 70% of searches in almost every country (with the exception of Russia, China and Estonia) went through Google.

In Russia, the Yandex search engine holds first place in popularity; in China it is Baidu.com, and in Estonia, Neti.ee. In these countries Google ranks a solid second.

While other search engines try to be as universal as possible, Google remains simple and distinctive, with a clean interface and relevant results. Google is also the only search engine that can search and index SWF (Shockwave Flash) files.

In addition, thanks to its off-page ranking technologies, Google is resistant to spam. Google gained a further foothold when Yahoo switched to it from Inktomi in 2000 as its secondary search engine.

Today Yahoo uses a combination of Overture results and its own search software, so Google does not dominate there. Google itself now offers many tools and services worth mentioning.

Here are the ones we decided to highlight: Sitemap files, Google Analytics, Google Webmaster Tools, and more.

□ Sitemaps. Google introduced this feature to speed up the crawling of large sites. Sitemap technology is another way to inform Google about your site and to give its crawlers basic information about it (the number of pages, how often they are updated, and which pages to index).

The technology also provides statistical data about the site. All you need to do is create an XML sitemap file, place it in the site's root directory, and submit it at http://google.com/webmasters/sitemaps/siteview (a minimal sketch of such a file appears after this list). Full information about this technology can be found at http://www.google.com/webmasters/.

□ Google Analytics (analytics.google.com). This tool can provide important business information: you can determine where your visitors come from, which links bring in the most visitors, how long visitors spend on the pages of your site, which links they click, which keywords they search for on the site, and so on.

Google Analytics can also be used directly from the AdWords interface, where you can receive the relevant data and reports. To sign up for the service, visit http://www.google.com/analytics/sign_up.html.

□ Google Webmaster Tools (GWT) is an excellent resource for any website owner. It was created to show how the Googlebot interacts with the webmaster's site. The tool's website is http://www.google.ru/intl/ru/webmasters/.
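As promised in the Sitemaps item above, here is a minimal sketch of what a sitemap file contains, generated with a short Python script. The page URLs, dates and update frequencies are made-up placeholders for illustration; the actual protocol is documented at sitemaps.org.

    # Minimal sketch: write a sitemap.xml listing a few pages of a site.
    # The URLs, dates and frequencies below are hypothetical examples.
    pages = [
        ("https://www.example.com/",           "2022-01-15", "weekly"),
        ("https://www.example.com/about.html", "2021-11-02", "monthly"),
        ("https://www.example.com/blog/",      "2022-01-20", "daily"),
    ]

    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{url}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        f"    <changefreq>{freq}</changefreq>\n"
        "  </url>"
        for url, lastmod, freq in pages
    )

    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

    # Place the resulting file in the site's root directory, e.g. /sitemap.xml,
    # then submit its location to the search engine.
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(sitemap)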

This content is accurate and true to the best of the author’s knowledge and is not meant to substitute for formal and individualized advice from a qualified professional.

© 2022 Temoor Dar
