Artificial intelligence in speech: When Alexa and Siri talk

How often do you talk to voice assistants powered by artificial intelligence?

Do you remember when you had to watch the evening news weather report to find out whether it would rain tomorrow? Those days are long gone; a single tap on your smartphone is enough: “What will the weather be like at my home tomorrow?”

Of course, we use this charming technology for numerous tasks: planning appointments, quickly pulling information from the Internet, or being treated to a (more or less good) joke from time to time.

The history of computational linguistics

The artificial intelligence behind it is anything but simple. The science involved, computational linguistics – also known as natural language processing (NLP) – began in the 1950s. While early attempts were rather mechanical and characterized by statistical methods, the technology has developed significantly in recent years. Among other things, it now relies on deep neural networks, which usually require a great deal of computing power, memory and, above all, data.

The results are sometimes amazing. The new systems can not only understand language; they can translate into other languages, analyze morphological elements, reliably identify proper names, find semantic similarities, classify texts as “positive” or “negative”, “write” their own texts and, above all, find answers to questions. Artificial intelligence and voice assistants are speaking better and better.
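To make one of these capabilities concrete: classifying a text as “positive” or “negative” can, in its simplest form, be done by counting words from small sentiment lexicons. The sketch below is a deliberately tiny illustration with made-up word lists; real systems use deep neural networks trained on large datasets.

```python
# Minimal sketch of lexicon-based sentiment scoring.
# The word lists are tiny illustrative assumptions, not a real lexicon.

POSITIVE = {"good", "great", "amazing", "helpful", "charming"}
NEGATIVE = {"bad", "poor", "useless", "wrong", "slow"}

def sentiment(text: str) -> str:
    """Classify a text by counting positive vs. negative words."""
    words = [w.strip(",.?!") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("The assistant gave a great and helpful answer"))   # positive
print(sentiment("The answer was wrong and the response was slow"))  # negative
```

Such word counting already hints at why nuance is hard: it sees only individual words, never their arrangement.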

Artificial intelligence in speech: Where are we today?

Despite all the euphoria, it must be said that in many areas we still have a long way to go. It will be a while before a computer gives us in-depth legal advice, prepares our tax return completely on its own, or examines and advises us like a doctor. After all, language is one of the skills that we humans need a long time to “master” – and artificial intelligence is no different.

Tell a search engine of your choice: “I want to buy a sweater, and my wife a car.” What do you get? Exactly: advertisements for sweaters. It would be more lucrative for the search engine to advertise cars as well, but it does not understand the clause “my wife a car”, because the predicate “buy” has simply been left out.
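The stumbling block here is predicate ellipsis. A hypothetical, deliberately naive intent matcher shows why: if each product is paired with the most recent explicit verb, the elided “buy” before “car” is simply invisible. The verb and product lists below are illustrative assumptions, not how any real search engine works.

```python
# Sketch: why predicate ellipsis trips up naive query parsing.
# A simplistic matcher pairs each product word with the nearest
# preceding verb -- so the elided "buy" before "car" is never seen.

VERBS = {"buy"}
PRODUCTS = {"sweater", "car"}

def naive_intents(query: str) -> list[tuple[str, str]]:
    """Pair each product with the most recent explicit verb, if any."""
    intents, last_verb = [], None
    for word in (w.strip(",.?!") for w in query.lower().split()):
        if word in VERBS:
            last_verb = word
        elif word in PRODUCTS and last_verb:
            intents.append((last_verb, word))
            last_verb = None  # verb "used up"; the elliptical clause has none
    return intents

print(naive_intents("I want to buy a sweater and my wife a car"))
# [('buy', 'sweater')] -- the car intent is missed
```

Recovering the second intent would require the parser to notice that “my wife a car” shares the verb of the first clause, which is exactly the kind of structural understanding that still challenges these systems.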

Often, nuances in the language determine the meaning: “What, do you want again?” has an obviously different meaning than “What do you want again?”.

Despite all these obstacles, the systems have become better and better at understanding over the past ten years. Chained questions are now handled well: “What time is it in Sydney?” … “It is 10 o’clock” … “And in New York?” Try it!
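One simple way such a chained question can be resolved is to remember the intent of the previous turn and fill in only the part that changed. The sketch below is a hypothetical toy dialog manager; the city list and fixed UTC offsets are illustrative assumptions (real assistants resolve time zones and daylight saving properly).

```python
# Sketch of how a chained follow-up like "and in New York?" can be
# resolved: keep the previous intent and slot-fill only what changed.
# The city table is a hard-coded illustrative assumption (ignores DST).

UTC_OFFSETS = {"Sydney": 10, "New York": -5}

class Dialog:
    def __init__(self):
        self.last_intent = None  # remembered across turns

    def ask(self, utterance: str) -> str:
        text = utterance.lower()
        for city in UTC_OFFSETS:
            if city.lower() in text:
                if text.startswith("and"):
                    # elliptical follow-up: reuse the remembered intent
                    intent = self.last_intent
                else:
                    intent = "time"  # naive intent detection
                self.last_intent = intent
                if intent == "time":
                    return f"UTC{UTC_OFFSETS[city]:+d} in {city}"
        return "Sorry, I did not understand."

d = Dialog()
print(d.ask("What time is it in Sydney?"))  # UTC+10 in Sydney
print(d.ask("and in New York?"))            # UTC-5 in New York
```

The design point is the remembered state: the second utterance contains no verb and no word “time”, yet it is answered correctly because the previous intent is carried over.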


The future of voice assistants

It is clear that as the performance of voice assistants steadily improves, so does their application potential. It is only a matter of time before these clever helpers can provide us with valuable services even for complex problems. Incidentally, this will have massive consequences for the world of work.

While we have been used to using robots in production for years, we still largely lack this experience in the service sector. Voice assistants will change that. It is foreseeable that in a few years many jobs in this segment will no longer have to be filled. Then some academics with many years of training as lawyers, doctors or tax advisors will have to rethink their approach.


The better machines understand language, the easier it will be for us humans to network, communicate and interact around the world. Our world will grow closer together as a result, which is why we should see artificial intelligence in speech as an opportunity.
