Artificial Conversations – How AI might influence your life

One way you may have noticed that supposedly artificial intelligence has entered your life is through those strange text suggestions in Gmail, Facebook, or LinkedIn. I’m talking about the tiny text bubbles that suggest short replies for your conversation or email.

Why are more and more services introducing these suggested replies? To spark a conversation? To save you the time it takes to type a response? To make our lives easier?

On several occasions I have used these quick replies myself. Telling your appointment that you’ll be a few minutes late simply by tapping on your calendar notification? So easy!

But what might be a great help in some circumstances can be harmful in others. I find it scary to imagine an electronic device suggesting what we should write to our friends or business partners. The more we get used to this new technology, the more we will use it. The more we use it, the less we have to think about what to reply. And the less we think about how to structure our conversations ourselves, the more an AI is guiding our human communication.

If we look beyond small and supposedly unimportant quick replies, we find other tools: Chrome extensions that help you write better, services that minimize human effort in translating between two languages, and smartphone keyboards that finish your sentences for you.

It might look benign at first, but what is the ultimate effect on human-to-human interaction? Is the reply the AI suggests actually what you intended to communicate? Do smart reply suggestions influence your conversations with other human beings?

In the past, I have caught myself using the replies SwiftKey suggested. For example: when I type “Good” in my chat app, SwiftKey suggests three words for how the sentence might continue. It suggests “night” – which I accept. Then it suggests “my” – which I accept. Then “honey” – which I accept. Then “I love you” – which I accept. By now you might get what I mean. Or you might have caught yourself doing something similar.

The thing is: was it me intentionally wishing my significant other a good night, or was it SwiftKey suggesting that I wish her good night in this specific way?

If we find ourselves using such technology to augment our human-to-human communication, maybe we should pay attention to whether an AI is secretly influencing what we actually communicate.
