Siri, the virtual assistant that answers questions, makes recommendations, and performs internet-enabled actions, knows how to read your mind.
If you tell her you have a headache, she’ll respond by saying that she’s found three drugstores not far from you, where you can buy aspirin.
Actually, my favorite conspiracy theory claims the assistant intentionally selects drugstores a few miles down the road, knowing that driving will clear your head. Siri’s plan is to get you in your car, so that by the time you reach the drugstore, you may no longer need an aspirin.
Pretty clever, huh?
She searches for the question behind the question. She fulfills your unexpressed desires.
This technology is astounding the first time you engage with it. What a luxury to have artificial intelligence that can anticipate human needs.
However, while computers are bringing mind reading rapidly closer to reality, we should be careful not to expect the same ability of each other.
Because mind reading isn’t a love language. Nobody knows our inner experience except us. And the biggest mistake we can make is assuming that everyone else experiences the world the same way that we do, having the same needs that we have.
Imagine you walk into work one morning and see that one of your coworkers is in a foul mood. Out of the blue, she snaps at you, scowling as if you were the one who set her off. And you’re just standing there with your hands up in the air as if to say, whoa, what the hell, I just got here!
If you’ve ever been in this scenario before, it can be confusing and frustrating. Because people believe you should just know why they’re upset. You should intuit from their body language and energy signature that they didn’t sleep well last night, had a fight with their spouse over breakfast, got cut off in traffic, spilled their coffee in the elevator and are anxious about their monthly performance review.
Excuse me, but no, you shouldn’t. You are not a piece of advanced artificial intelligence that uses brain wave technology to decode linguistic patterns into emotional needs. This is the fundamental human error. People assume that we will just use our common sense to know what they want. They believe that we will anticipate what they need and supply it to them, without them ever having to ask for it.
Wrong. Our species is not that advanced yet. We’re still primitive enough that every interpersonal conflict goes back to the question, why can’t you be more like me?
And so, next time somebody wrongly expects you to read their mind, stop for a moment and tell them this.
It would help me a lot if you could tell me more about what you want.
If all else fails, you can always drive to the drugstore and buy some aspirin.
LET ME ASK YA THIS…
Are you assuming that others are having the same experience as you are?