Google told us about its algorithm of the future

With Mum (an acronym for Multitask Unified Model), Mountain View takes a decisive step in the evolution from search engine to an artificial intelligence we can communicate with. An interview with Pandu Nayak, head of Google Search

(photo: Solen Feyissa / Unsplash)

When it comes to anti-Covid vaccines there has been, and still is, a certain amount of confusion. Doses, age groups, side effects: even the names failed to find agreement. After all, it seems there are over 800 different ways of referring to just a dozen vaccines distributed at scale. The estimate comes from the party that, alongside doctors, found itself at the foot of this tower of Babel more than anyone else: Google. With an urgency unprecedented in the history of the internet, the pandemic created the need to identify every possible word and phrase referring to the coronavirus in order to promptly provide useful, reliable information. An unprecedented challenge, which allowed the search engine to test and improve itself.

“Those 800 names attributed to the vaccines and searched online by millions of users were identified in seconds, sifting through more than 50 languages, and then linked to answers from health and institutional sources.” This is how Pandu Nayak, head of Google Search, introduces us to his new creature: the Multitask Unified Model (Mum), the latest stage reached by the Californian company in understanding language. Nayak is cautious, but Mum perhaps gives us a first concrete signal of how Google is working towards what it has long imagined as its goal: a search engine that is not a text field to fill in, but an artificial intelligence to talk with in multiple ways.

Presented at Google’s I/O conference last June, “Mum is an algorithm a thousand times more powerful than its predecessor Bert (Bidirectional Encoder Representations from Transformers)”, Nayak continues, and it brings three fundamental innovations. As we saw in the case of vaccines, it overcomes language barriers, searching for results in 75 languages and translating them according to the user’s query. It introduces multimodality of formats, meaning it understands how to relate images, text and audio. Finally, it will be able to understand complex questions and provide articulated, even predictive, answers. How? Well, “Mum doesn’t just understand language, it generates it”. A sentence Nayak leaves between the lines, yet one of the most important he uttered during our meeting.

To get an idea: if we photograph a pair of shoes and ask whether they are suitable for a certain purpose, Mum will contextualize what we have shown it with respect to the question and show us the right results. With a query about a distant place, say Mount Fuji, pages in our language may be scarce, and Mum will search Japanese sources directly and transmit the requested information, translated. Again, faced with a complex question, such as a comparison between two objects, Mum will understand it and give pertinent indications and advice. When will it be released, Mr. Nayak? “To be defined”. A smile.

However it arrives, and beyond the details we have already covered on Wired, the new algorithm, as its creator tells us, represents an almost definitive transition: no longer answers based on the exact correspondence between the words users type and those found on web pages, but answers guided by an interpretation of the question and by a processing that not only returns sources but draws on them. A semantic, intelligent and, sooner or later, conversational search engine. This, in perspective, is the meaning of Pandu Nayak’s sentence. As short and cryptic as the answer he gives when we ask whether an audio version of Mum is planned: “We have several projects related to Google Assistant”.
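The shift Nayak describes, from literal word matching to interpretation, can be illustrated with a deliberately toy sketch. The concept map below is an invented stand-in for what a large model like Mum learns at scale; nothing here reflects Google's actual implementation.

```python
# Toy contrast between old-style keyword matching and meaning-based matching.
# CONCEPTS is a hand-made, illustrative stand-in for learned semantics.
CONCEPTS = {
    "hike": "walk_outdoors", "trek": "walk_outdoors",
    "boots": "footwear", "shoes": "footwear", "sneakers": "footwear",
}

def keyword_score(query: str, page: str) -> int:
    """Old-style matching: count words shared verbatim."""
    return len(set(query.lower().split()) & set(page.lower().split()))

def semantic_score(query: str, page: str) -> int:
    """Meaning-based matching: map words to concepts first, then compare."""
    def concepts(text: str) -> set:
        return {CONCEPTS.get(w, w) for w in text.lower().split()}
    return len(concepts(query) & concepts(page))

query = "best boots to hike"
page = "sneakers and shoes to trek comfortably"

print(keyword_score(query, page))   # 1: only "to" matches verbatim
print(semantic_score(query, page))  # 3: "to", footwear, walk_outdoors
```

On pure keyword overlap the page barely matches the query; once words are mapped to what they mean, the page is clearly relevant, which is the kind of gap semantic search is meant to close.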

Mum, in fact, is shaping up as the system that for the first time connects several tools Google has activated in recent years. It is the latest evolution of the search engine and of the Serp, the Google results page. It will include Google Assistant, launched in 2016 and anticipated by voice search (Nayak’s word). It supports the features of a revolutionary and little-known tool, Lens, capable of providing answers based not on what we type in the Google bar but on what we frame with the smartphone camera. Mum, in short, is an unprecedented integration, with computing capabilities already vastly enhanced compared to the past. An embryo of artificial intelligence that we will talk to, message via chat and show objects to, ready to provide the information or advice we need.

All this, tomorrow. Today the search engine cannot grasp the meaning of everything we write, and Google Assistant often fails. Without looking too far ahead, for the moment we can wonder how Mum will affect the criteria that establish page ranking and regulate the visibility of the results that appear in the Serp. According to John Mueller, Google’s Search Advocate, Seo (Search Engine Optimization) experts will still have their work cut out: “Things simply evolve”. One wonders, however, whether Mum’s ability to “generate language” won’t take traffic away from websites. In part this is already happening, and it is reasonable to think its abilities could make things worse. Think about it: if the answers to queries are extracted from a page, such as Wikipedia’s, or written directly by Google and displayed prominently, why would a user click through to get that same information, or keep scrolling?

“It is good to clarify that there are many ways of generating language,” Pandu Nayak explains. “Mum will not give answers automatically, unless they are very concise and related to simple questions, such as today’s weather or the name of a country’s capital. For more complex questions such simple answers are insufficient. Our goal is to offer the best user experience and help people find what they are looking for according to their needs.” It is not in Google’s interest to cut websites off, at least for the moment. In the future, however, with a conversational, intelligent search engine, things could change, and alongside the point of view of those who create content there is that of those who consume it. The countless answers that arrive in writing cannot all be delivered together via audio or chat. So just as sites would see their probability of being ‘seen’ by users decrease, users would see their margin of choice shrink. Both would rely, even more than now, on the decisions of the algorithm. Which could also surface results from sites that are better positioned in the ranking not because they are the most deserving, but simply because they pay.

Perhaps in the future, indeed. But with Mum, to hear Nayak tell it, things will turn out differently. He doesn’t have much to say about advertising; it’s not his field: “We deal with language, not ads. I can only assume that the division in charge will do what it can and must to monetize this new product”. And on the question of the so-called one right answer, whereby a search engine brought into the conversational realm takes traffic away from sites and autonomy away from users? “We are not interested in replacing those who create content. On the contrary, the aim is to connect them with people, to whom we want to keep offering multiple possibilities, including choice. It is about showing the part of the web that is relevant to the infinite possible questions and the nuances of language, to be understood ever more faithfully.” See you at Mum’s release. And at the moment when, in a few years, the search engine will have definitively left our screens.
