The algorithm examines the page, evaluating its most important elements first, and then indexes it according to how relevant those elements are to the user's search query. It is therefore easy to imagine that Google will more readily reward long, well-argued texts. It is estimated that this new system, once rolled out globally, will affect a significant share of search queries.

Welcome, MUM

While still in the testing phase, Google search has seen the arrival of MUM (Multitask Unified Model). With the launch of BERT (Bidirectional Encoder Representations from Transformers), Google has become increasingly capable of understanding users' search intent. BERT also powered voice search and spearheaded the rise of voice assistants.
It is thanks to BERT that people are typing less and talking much more. But BERT was only the beginning of a larger revolution. During the Google I/O conference, the company announced a new model called the Multitask Unified Model, or MUM. According to Prabhakar Raghavan, head of Google Search, MUM is a thousand times more powerful than BERT and can analyze videos, images, and text in as many as 75 languages, answering even complex search queries effectively. Put simply, MUM can understand a user's feelings, context, and intentions, and provide relevant answers to their query.
With the help of MUM, users can do less searching to find what they want: Google delivers the results they are looking for from a single query, or at most two. MUM is also multitasking: users can combine text, images, and voice to get relevant results for their search queries. An example? Imagine taking a picture of your hiking boots and asking Google whether you can use them for a hike up Mt. Fuji. MUM will analyze the image and decide on its own whether or not the boots are suitable for that kind of activity. If the answer is no, it could even show you an eCommerce page with a list of recommended equipment.