Searching on Google, the engine used by around 90% of users, is now a daily activity for almost all of us. Yet lately it doesn't seem so easy to find the information we're looking for.
If you remember immediately finding what you needed in the past and now notice a difference, you are not imagining it. Google's search results have gotten noticeably worse, and there are at least three reasons why.
The problem of criteria, principles and guidelines
Search engines have to sift through an enormous number of web pages every day. To rank them, Google relies on many criteria, which are kept secret: no one knows exactly which elements, words or structures make a text rise to the top positions. If they were public, gaming them would be all too easy.
At the same time, Google publishes general guidelines, a sort of list of principles that web content should follow. Most of them recommend producing quality content, written with the user in mind rather than with tricks aimed at the crawler, and offering genuinely useful information.
Some of these principles, however, do not always work in the end user's favor. One example is the age and reputation of a site: for content of the same “quality”, the website that has been online the longest and receives the most links from other pages is rewarded, because Google considers it more trustworthy.
On paper this is useful, since it filters out shady practices attempted by obviously brand-new websites. In practice, though, the same sites often end up occupying the first results, even when their articles are not that useful, and at the expense of better content.
Another example is how narrow some of these principles are. If Google suggests a certain way of structuring a text, or if it emerges that certain practices push results up, the pages at the top end up looking too much alike. You have surely landed on pages that seemed photocopied from one another: this is why.
This, in short, is the first reason for the decline: when one of Google's principles has a flaw, the results suffer. Sometimes Google fixes it by changing the algorithm; other times it does not, because it considers the principle correct.
Too much SEO, the art of optimizing texts
Usefulness, quality and good writing are not objective criteria. To decide that a page is good, Google still relies on parameters: they are secret, but they exist, and experts try every way to reverse-engineer them by running lots of tests.
This is the field of SEO, short for Search Engine Optimization. There are professionals who take care of how a web page is shaped: the structure of the text, the links, the tags and the images within it, so that Google reads and rates it better.
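To give a concrete idea of what gets tuned, here is a minimal Python sketch (using the requests and BeautifulSoup libraries, with example.com as a placeholder URL) that collects a few of the on-page elements an SEO specialist typically checks: title, meta description, headings, links and image alt text. It only illustrates what is optimized, not how Google actually weighs these signals.

```python
# Illustrative only: gathers a handful of common on-page SEO elements.
# Requires the third-party packages "requests" and "beautifulsoup4".
import requests
from bs4 import BeautifulSoup


def on_page_snapshot(url: str) -> dict:
    """Return a small summary of on-page elements SEO work usually touches."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    description = soup.find("meta", attrs={"name": "description"})
    return {
        "title": soup.title.string.strip() if soup.title and soup.title.string else None,
        "meta_description": description.get("content") if description else None,
        "h1_count": len(soup.find_all("h1")),          # main headline(s)
        "h2_count": len(soup.find_all("h2")),          # section structure
        "link_count": len(soup.find_all("a", href=True)),
        "images_missing_alt": sum(1 for img in soup.find_all("img") if not img.get("alt")),
    }


if __name__ == "__main__":
    # Placeholder page; swap in any URL you want to inspect.
    print(on_page_snapshot("https://example.com"))
```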
In general, SEO is a perfectly good practice: it makes pages more readable and helps users find their way around. Even today, SEO marketing is essential for companies. But when someone discovers one trick too many, they can make almost anything look good to Google.
So well-executed SEO applied to poor content can push a page into the first results even when it has nothing to say.
Google will show various pages that seem to cover what you searched for, but once inside you just wade through a lot of useless text, only to discover at the end that there is nothing new. The second reason for the decline is SEO that is too good for too little content: only the form was polished, to please Google, while the substance was ignored.
Machine translations and AI content rank too high
The latest frontier is made up of very poor results, with little usefulness or logical coherence, and sometimes outright incomprehensible. They fall into two similar categories: AI-generated content and bad machine translations.
The first type remixes information found elsewhere on the web, thanks to the power of generative Artificial Intelligence. And since an AI does the work, not only does it add no new knowledge, it also risks making mistakes and presenting incorrect information: one of the many dangers of Artificial Intelligence.
The second type is as banal as it is harmful: content produced in another language is run through an automatic translator that spits it out in Italian. Since no person reviews it, it is full of translation errors and hard to understand, if not downright unreadable.
But if this content is so poor, how does it end up so high? First of all, the two causes described above come into play: it often rides on an already well-ranked site, a format Google likes, and good SEO.
On top of that, it has the power of quantity over quality. Since these pages are not written by hand, it is possible to churn out huge numbers of them in a short time and flood Google. Even if the engine recognizes most of them as bad, some will always manage to pass as good, and that is how the poor user ends up on yet another useless site.