One day your voice will control gadgets and you

The sound of our own voice is something we take for granted as just a part of who we are, but what if our voice were to change drastically or be taken away altogether?

You certainly can’t afford to lose it; that would be a nightmare. That’s what this year’s CES, the world’s largest annual gadget bonanza, has made abundantly clear: in the future, everything we own, be it our fridge, front door or even toilet, will be controlled by our voice.

We can keep debating whether anyone really needs an Alexa-powered toilet that lifts its own lid, but we can’t overlook the ubiquity of voice interfaces as a seemingly endless series of hardware companies jumps on the bandwagon.

Until recently, Amazon and Google competed for presence in our homes through their own gadgets: smart speakers like the Amazon Echo and Google Home, TV add-ons like Chromecast and Fire TV, and even home security systems like Nest and Ring.

But at CES 2019 that competition came to a boil, as a wave of voice-enabled products underscored the scope of each company’s ambitions. Partners of both companies launched dozens of products that will bring Amazon Alexa and Google Assistant into nearly every aspect of our lives.

The tech giants are planning to put their assistants into our TV, our car, and even our bathroom.

It all revolves around an idea that leading AI expert Kai-Fu Lee calls OMO, or online-merge-offline.

According to Mr Lee, OMO refers to combining our digital and physical worlds so that every object in our surrounding environment becomes both an interaction point for the internet and a sensor that collects data about our lives.

A clever scheme

Our shopping cart needs to know what’s in our fridge so it can recommend the optimal shopping list.

It requires our front door to know about our online purchases and whether we’re waiting for an in-home delivery.

That’s where voice interfaces come in: installing Alexa into our fridge, our door, and all our other disparate possessions neatly ties them to one software ecosystem.

By selling us the powerful, seamless convenience of voice assistants, Google and Amazon have slowly inched their way into becoming the central platform for all our data and the core engine for algorithmically streamlining our lives. And since Alexa and Google Assistant learn from what we ask of them, it means that, whether we want it or not, Amazon and Google will be able to learn more than ever about us and our habits.

Limitations

Everything will depend on how well the voice assistants actually understand us. Compared with other subfields of AI, progress in natural-language processing and generation has lagged behind.

Advancements

Last year several research teams used new machine-learning techniques to make impressive breakthroughs in language comprehension.

Research nonprofit OpenAI demonstrated an unsupervised learning technique in June 2018.

Its systems were trained on unstructured text rather than on data that had been cleaned and labeled, which dramatically lowered the cost of acquiring more training data and, in turn, boosted the systems’ performance.
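To see why raw, unlabeled text can serve as training data on its own, here is a toy sketch in Python (not OpenAI’s actual technique) of next-word prediction: the “label” for each word is simply the word that follows it in the text, so no human annotation is needed.

```python
from collections import Counter, defaultdict

# Toy "self-supervised" training: the only input is raw, unlabeled text.
# The training signal comes for free, because each word's "label" is just
# the word that follows it in the corpus.
corpus = (
    "the fridge is empty so the list gets milk "
    "the door is locked so the app gets a ping"
).split()

next_word_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_word_counts[prev][nxt] += 1

# Ask the toy model which words tend to follow "the" in this corpus.
print(next_word_counts["the"].most_common(3))
```

Scaling this same idea up, with far larger models and far more text, is what lets systems improve simply by reading more of the web instead of waiting for labeled datasets.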

Google also released an even better unsupervised model, one that is as good as humans at completing sentences when given multiple-choice answers.
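The article doesn’t name the model, but the description matches masked-language systems such as Google’s BERT. As an illustrative sketch only, assuming the publicly released bert-base-uncased checkpoint and the Hugging Face transformers library (neither of which the article mentions), such a model can complete a sentence by predicting a blanked-out word:

```python
# Requires: pip install transformers torch
from transformers import pipeline

# Load a publicly available masked-language model (assumed checkpoint).
fill = pipeline("fill-mask", model="bert-base-uncased")

# Ask the model to complete the sentence by predicting the [MASK] token.
for candidate in fill("The front door unlocks when it hears my [MASK]."):
    print(f'{candidate["token_str"]:>10}  {candidate["score"]:.3f}')
```

The scores are the model’s confidence in each candidate word; picking the highest-scoring option is essentially the judgment that multiple-choice sentence-completion benchmarks measure.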

All these advances are bringing us closer to the day when machines that truly understand what we mean could render physical and visual interfaces obsolete. It’s up to us to decide whether that’s for better or worse.