
Google fixes 2 annoying oddities in its voice assistant

“Nowadays, when people want to talk to any digital assistant, they’re thinking about two things: what I want to get done and how I should phrase my command to get that done,” says Subramanya. “I think that’s very natural. There’s a lot of cognitive burden when people are talking to digital assistants; natural conversation is one way to eliminate that burden.”

Making conversations with the Assistant more natural involves improving its reference resolution, the ability to link a phrase to a specific entity. For example, if you say “Set the timer for 10 minutes” and then say “Switch to 12 minutes,” the voice assistant should understand that the second request refers to the timer you just set and adjust it accordingly.
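The Assistant's actual reference resolution is handled by neural models, but as a toy illustration of what the task requires, here is a minimal rule-based sketch. All class and variable names here are hypothetical, invented for this example:

```python
# Toy reference resolution: link a follow-up utterance back to the entity
# it implicitly refers to. Hypothetical rule-based sketch; the real
# Assistant uses neural (BERT-based) models for this.

import re

class TimerAssistant:
    def __init__(self):
        self.timer_minutes = None  # most recent entity a follow-up can refer to

    def handle(self, utterance):
        text = utterance.lower()
        m = re.search(r"(\d+)\s*minutes?", text)
        if "set" in text and "timer" in text and m:
            self.timer_minutes = int(m.group(1))
            return f"Timer set for {self.timer_minutes} minutes"
        # A follow-up like "switch to 12 minutes" never names the timer;
        # resolving it means linking back to the timer set earlier.
        if ("switch" in text or "change" in text) and m:
            if self.timer_minutes is None:
                return "Which timer do you mean?"
            self.timer_minutes = int(m.group(1))
            return f"Timer changed to {self.timer_minutes} minutes"
        return "Sorry, I didn't understand."

a = TimerAssistant()
print(a.handle("Set the timer for 10 minutes"))  # Timer set for 10 minutes
print(a.handle("Switch to 12 minutes"))          # Timer changed to 12 minutes
```

The key point is the state carried between turns: without remembering the earlier entity, the second utterance is unresolvable.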

The new NLU models are powered by a machine learning technique called Bidirectional Encoder Representations from Transformers, or BERT. Google introduced the technique in 2018 and applied it to Google Search first. Earlier language-understanding technology processed each word in a sentence on its own, but BERT processes the relationships between all the words in the sentence, greatly improving its ability to identify context.
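To see why bidirectional context matters, consider an ambiguous word whose meaning is only settled by words that appear *after* it, which a strictly left-to-right model never sees. The sketch below is a deliberately simple, non-neural illustration of that idea, not BERT itself; the cue lists and function name are invented for the example:

```python
# Toy word-sense disambiguation showing the value of bidirectional context:
# the deciding cue for "bank" may come after the word, so we score cues on
# BOTH sides. Purely illustrative; BERT learns this from data instead.

def disambiguate(tokens, target="bank"):
    i = tokens.index(target)
    left, right = tokens[:i], tokens[i + 1:]
    money_cues = {"deposit", "loan", "account", "money"}
    river_cues = {"river", "water", "fishing", "shore"}
    # Bidirectional: the context is the union of both sides of the target.
    context = set(left) | set(right)
    if context & money_cues:
        return "financial institution"
    if context & river_cues:
        return "riverbank"
    return "unknown"

print(disambiguate("she sat on the bank of the river".split()))
# → riverbank: the deciding cue ("river") appears after "bank"
print(disambiguate("he opened a bank account yesterday".split()))
# → financial institution
```

A model reading only left-to-right would reach "bank" in the first sentence with no disambiguating evidence at all.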

One example of how BERT has improved Search: a query about parking on a hill with no curb. Previously, the results still covered hills that had curbs. After BERT was enabled, Google Search surfaced a website advising drivers to point their wheels to the side of the road. BERT has not been without problems, though. Research by Google researchers has shown that the model associated sentences referring to disabilities with negative language, a reminder that companies should be careful with natural language processing projects.

But with the BERT models now used for timers and alarms, Subramanya says the Assistant can respond to related queries, such as the adjustments mentioned above, with almost 100 percent accuracy. This understanding of context does not yet work everywhere, however; Google says it is slowly working to bring the updated models to more tasks, such as reminders and smart home device controls.

William Wang, director of UC Santa Barbara’s Natural Language Processing group, says Google’s improvements are radical, especially since applying the BERT model to spoken language understanding “is not a very easy thing to do.”

“In the whole field of natural language processing, starting in 2018, when Google introduced this BERT model, everything changed,” says Wang. “BERT really understands what flows naturally from one sentence to another and what the relationship between sentences is. You’re learning the representation of the word, the sentences, and the context of the sentences, so compared to work before 2018, it’s much stronger.”

Most of these improvements may be limited to timers and alarms for now, but you will see a general improvement in the Assistant’s ability to understand context more broadly. For example, if you ask for the weather in New York and follow up with questions like “What’s the tallest building there?” and “Who built it?” the Assistant will keep providing answers, knowing which city you are referring to. This isn’t entirely new, but the update makes the Assistant even more adept at solving these contextual puzzles.
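The follow-up behavior above amounts to carrying slots of context across turns. As a minimal sketch under that assumption, here is a hypothetical rule-based dialog loop; the class, the fact table, and its contents are all invented for illustration, not the Assistant's real data or code:

```python
# Toy multi-turn context carryover: "there" and "it" in follow-up
# questions resolve to entities remembered from earlier turns.
# Hypothetical sketch; the real Assistant uses learned models.

class Dialog:
    def __init__(self):
        self.context = {}  # slots carried across turns (e.g. the city)

    def ask(self, question, facts):
        q = question.lower()
        for city in facts:            # a named city updates the context
            if city.lower() in q:
                self.context["city"] = city
        city = self.context.get("city")
        if city is None:
            return "Which city do you mean?"
        if "weather" in q:
            return facts[city]["weather"]
        if "tallest building" in q:   # "there" resolves to the remembered city
            return facts[city]["tallest"]
        if "who built" in q:          # "it" resolves to that building
            return facts[city]["builder"]
        return "Sorry, I didn't understand."

# Invented example data, not real answers.
facts = {"New York": {"weather": "Sunny, 21C",
                      "tallest": "Building A",
                      "builder": "Builder B"}}
d = Dialog()
print(d.ask("What's the weather in New York?", facts))
print(d.ask("What's the tallest building there?", facts))
print(d.ask("Who built it?", facts))
```

Only the first turn names the city; the later turns work purely off the remembered context.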

Teaching the Assistant names

Video: Google

The Assistant is now better able to understand unique names. If you have ever tried to call or text someone with an uncommon name, there is a good chance it took several attempts or failed entirely because Google Assistant did not know the correct pronunciation.


