Natural referencing, as you know, is the discipline that focuses on improving the positioning of websites and web pages in search engine results. It has become essential as the Internet has expanded: it is practiced daily by millions of people around the world and plays a major role in the digital visibility strategy of a very large number of companies. But where does SEO come from? How was it born? How did it become so important over the years?

We invite you to discover the history of SEO and its intimate links with search engines.

What is the birth certificate of natural referencing?

If we consider SEO (Search Engine Optimization) as a discipline in its own right, its birth certificate does not coincide with the emergence of search engines, contrary to what one might think.

In reality, natural referencing truly came into being when webmasters realized it was possible to manipulate search engine results to generate traffic and, ultimately, derive significant revenue from the Internet. To prevent them from “cheating” the algorithms, and thus to keep offering Internet users relevant, high-quality content, the engines raised safeguards that webmasters, in turn, tried to circumvent.

SEO was born from this permanent tension between the desire of some to position their pages as high as possible in the SERP (the search engine results page), and that of others to ensure the quality of the answers offered.

Let’s go back three decades to take a closer look.

Primitive SEO: the 1990s and the rise of search engines

In November 1990, Tim Berners-Lee launched the very first web page. From that date, more and more websites emerged, with the common ambition of offering information to users, whose numbers were slowly increasing. The volume of resources soon grew so large that it became necessary to structure this data to keep it accessible: it is to this need that search engines responded.

Starting in 1993, the Excite platform revolutionized the way digital information was indexed, categorized and ranked, based on the terms used in the pages that webmasters put online. In 1994, the World Wide Web Worm (one of the oldest search engines) had an index of 110,000 pages and documents online. Three years later, the main engines indexed between 2 and 100 million resources. At the same time, the number of queries grew exponentially: in March 1994, the WWWW recorded an average of 1,500 daily queries. In November 1997, AltaVista handled 20 million a day.

Throughout the 1990s, several search engines arrived on the market: Yahoo and Lycos in 1994, AltaVista in 1995, AskJeeves (the future Ask.com) and Google in 1997. All of them made improvements in indexing and ranking results. In parallel, the notion of “Search Engine Optimization” was born, probably in 1997, to designate the practice of manipulating nascent algorithms to boost the positioning of web pages… by taking advantage of the lack of controls put in place by the engines.


The search results of the time had one thing in common: they were of poor quality. To rank results, the engines simply matched the terms typed by users against the available pages. To assess the relevance of those pages, they relied in particular on the number of occurrences of key expressions.

This observation opened the door to the natural referencing techniques known as “black hat”: unofficial methods exploiting the engines’ weaknesses, such as “keyword stuffing” (the abusive repetition of an expression in content at the expense of readability), manipulative HTML tagging, and the multiplication of poor-quality incoming links (backlinks). Algorithm updates took months to roll out, letting webmasters and proto-SEOs profit from these techniques for long stretches. It was a kind of “Wild West” of natural referencing, a period of lawlessness in the history of SEO.
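To see how crude this kind of ranking was, and why keyword stuffing worked so well against it, here is a minimal Python sketch: a deliberately naive toy scorer in the spirit of the 1990s engines, not any real engine’s code.

```python
# Deliberately naive toy scorer: relevance is simply the number of
# times the query terms occur in the page text. Illustrative only.

def occurrence_score(query, page_text):
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

genuine = "Our agency offers SEO audits and honest advice on SEO strategy."
stuffed = "cheap SEO SEO SEO best SEO cheap SEO SEO agency SEO SEO SEO"

# The stuffed page wins, even though it is unreadable.
print(occurrence_score("SEO agency", genuine))  # 3
print(occurrence_score("SEO agency", stuffed))  # 10
```

Black-hat operators simply fed the scorer more of what it measured; readability for the human visitor never entered the equation.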

SEO takes shape (and Google sets the pace)

Things changed when Google entered the game. By developing their own tool, Larry Page and Sergey Brin aimed to answer the question that plagued every search engine: how to improve the quality of the information Internet users can access?

In a seminal 1998 article, Page and Brin wrote that their “main goal is to improve the quality of web search engines.” They drew up a gloomy inventory: “Any user can testify to the fact that the completeness of the index is not the only factor that influences the quality of the results. [This was the original ambition of the engines: to index as much data as possible and let the ranking take care of itself.] Too often, junk results are mixed with the results that interest the user.”

The problem stemmed from the exponential increase in the volume of available resources while user behavior stayed the same. “Internet users focus on the first ten results [of the SERP]. To absorb the rapid increase in the number of pages, we need tools that can bring the most relevant results up into those top ten places.” Google’s entire philosophy is contained in this manifesto (and it is clear that users’ habits regarding the number of links they consult have not fundamentally changed since).

From the beginning of the 2000s, this vision of a SERP focused on the quality of results caught on. To give substance to its approach, Google began to formalize “good practices” of natural referencing, called “white hat” because they comply with the official guidelines, as opposed to the “black hat” levers.


In short, SEO took shape by molding itself to the rules laid down by Google, and therefore to the rhythm of that engine’s algorithm updates. This was the starting point of a hegemony that Microsoft’s launch of Bing in 2009 would not manage to challenge. It is therefore not surprising that the history of SEO is so intimately linked to that of Google.

SEO becomes more professional and the engines start to hit offenders hard

However, in the early 2000s, Google’s guidelines still had little impact on the natural referencing of sites. In fact, few webmasters applied them, owing to a glaring lack of enforcement tools. For example, PageRank (Google’s ranking algorithm) counted the inbound links pointing to a page without providing any way to measure their authenticity. For webmasters (and for the SEOs who were starting to multiply, SEO becoming a profession in its own right), it was more lucrative to continue the old practices than to follow the guidelines.
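To make the loophole concrete, here is a minimal PageRank sketch in Python: power iteration over a toy link graph, an illustrative reconstruction of the published algorithm rather than Google’s production code. Nothing in it can tell an authentic editorial link from a fabricated one.

```python
# Minimal PageRank sketch (power iteration) over a toy link graph.
# Illustrative reconstruction only; assumes every page has at least
# one outbound link (no dangling nodes), true of the graph below.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {}
        for page in pages:
            # Each page shares its rank equally among its outbound links;
            # a page's new rank sums what its in-links send it.
            incoming = sum(
                rank[src] / len(targets)
                for src, targets in links.items()
                if page in targets
            )
            new_rank[page] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

# A crude "link farm" (c, d, e all pointing at a) inflates a's score
# exactly as genuine editorial links would: the algorithm counts and
# weighs links, but cannot judge their authenticity.
toy_web = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a"], "e": ["a"]}
print(pagerank(toy_web))
```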

But Google was quietly working on retaliatory measures: algorithm updates that would force SEO players to review their tactics. In November 2003, Florida became the first major update to Google’s algorithm.

Florida, whose objective was reportedly to target the quality of backlinks (the conditional is de rigueur: Google never comments on the content of its updates, even after the fact), wreaked havoc until the dawn of 2004. It was a tidal wave that swept away many websites (including innocent pages that the company called “false positives”), with catastrophic business consequences, especially with the holiday season approaching.

For the first time, websites were penalized for using techniques considered “bad practices”. It also marked the arrival of the notion of “backlink quality”, reinforced in 2005 when Google, Yahoo and MSN jointly introduced the nofollow attribute to curb link spam.
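As a concrete illustration of what nofollow changed, here is a hedged Python sketch of how a crawler might exclude such links when building its link graph. The LinkExtractor class and its field names are hypothetical; only the rel="nofollow" attribute itself comes from the engines’ specification.

```python
# Hedged sketch of a crawler honoring rel="nofollow" when extracting
# links. The attribute itself is plain HTML, e.g.:
#   <a href="https://spam.example" rel="nofollow">comment spam</a>
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):  # hypothetical helper for illustration
    def __init__(self):
        super().__init__()
        self.counted = []  # links that may pass ranking signal
        self.ignored = []  # nofollow links, left out of the link graph

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated values, e.g. "ugc nofollow".
        rel_values = (attrs.get("rel") or "").lower().split()
        (self.ignored if "nofollow" in rel_values else self.counted).append(href)

parser = LinkExtractor()
parser.feed('<a href="https://example.com/a">editorial link</a>'
            '<a href="https://spam.example" rel="nofollow">comment spam</a>')
print(parser.counted)  # ['https://example.com/a']
print(parser.ignored)  # ['https://spam.example']
```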

Search engines focused more and more on the user and on offering them relevant results. In June 2005, Google launched personalized search, which takes users’ browsing and search history into account to improve the quality of the suggested pages. A few years later came the inclusion of social posts in the SERP. All these changes profoundly shifted the engines’ perspective: from then on, every web actor (journalists, marketers, editors, community managers…) was concerned by SEO, not just webmasters.

At the same time, natural referencing became more professional by taking note of these changes. In November 2005, the launch of Google Analytics allowed practitioners to measure the results of their SEO campaigns and optimize their actions in real time.


The new pillars of natural referencing

From 2010 onward, we enter the contemporary era of SEO history. This period is characterized by the search engines’ desire to force professionals to earn their SERP positions through user-centered content, that is, relevant and high-quality content. With, in case of rebellion, a raft of new sanctions in store.

Indeed, successive updates to Google’s algorithm imposed rules concerning content quality, keyword relevance, and technical optimization. In this respect, the Panda (2011) and Penguin (2012) filters marked major turning points. The first tackled low-quality content whose only purpose was to generate traffic. The second targeted sites with inbound links of poor value (unrelated to the content, published on directories, etc.). These developments gave the actors of the web (and of natural referencing) a clear direction to follow, while pointing the finger at the SEO levers that were no longer to be used.

At the same time, search engines developed innovative tools to further improve the relevance of results, taking into account users’ changing habits. On Google, this meant the gradual integration of social posts (and the creation of Google+), the appearance of the Knowledge Graph (which gives the user an immediate answer, without their having to click a link), the rise of local search (the Pigeon update, improving searches via Google Maps), the priority given to mobile indexing (the Mobile-First Index, tested from 2015), etc.

You know this part of the history of SEO: it’s the one we’re experiencing today.

Why take an interest in the history of natural referencing? Because knowing how SEO was born and grew helps us better understand how it works and, even more, where it is heading.

By following the line Google has drawn since its creation, we can, for example, observe a vast movement toward ever greater readability, more relevant content, and a higher-quality user experience.

This helps us grasp the ambition of search engines: to establish themselves, in the long term, as solutions in their own right that will (almost) no longer require links to click, as we already see with the voice assistants integrated into smartphones and smart speakers.

Finally, it gives us an idea of what it will take tomorrow to keep winning over not the engines, but their users. That is now the essence of natural referencing.
