Life Hacks: The intimate lives of algorithms, by Charles Assisi
They can guess what we’re thinking, feel pain, and show signs of empathy. Where does that leave us?
Once upon a time, friends turned to each other for solace. But poke through the nooks and crannies of Google and, it turns out, the search engine has replaced friends. It remembers all of the 3.5 billion questions people ask it every day, and the 1.2 trillion things looked up each year. That is why it can now auto-complete, or ‘predict’, what may be on your mind.

For instance, when “What does my” is punched into Google, it suggests: “name mean”, “dream mean”, “bad mean” and “future hold” as options. Does the algorithm know that those around me want to understand their heritage better, have had a bad night, are trying to understand American slang, or are uncertain of what lies ahead? On keying in “When do I”, auto-complete offers “ovulate”, “get married”, “get pregnant”, and “die”. Does this suggest there are many exasperated women worrying themselves to death about having a baby?
Marriages, it seems, are wretched affairs. Men want to know “Is my wife…” “pregnant”, “cheating on me”, “interested in another man”. And when women ask “Is my husband…”, Google suggests “attracted to me”, “having an affair” or “asexual”.
How did algorithms get to be this prophetic? The first pointer for me surfaced a while ago in a conversation with my brother, a computational neurobiologist. People like him work at the intersection of neuroscience, mathematics, psychology and data to push the boundaries of artificial intelligence. “If an algorithm responds incorrectly to a request, it can be punished. The algorithms we now have can feel pain almost like a human does, and respond to pain almost like a human would.” If an algorithm can feel pain, it stands to reason that it will learn from experience, getting better in order to avoid being penalised.
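For readers who want to see the idea in code: what my brother calls “punishment” resembles what computer scientists call a penalty, or negative reward, in reinforcement learning. The sketch below is purely illustrative (the response names and numbers are invented, not any real assistant’s internals); it shows how repeatedly penalising a wrong answer makes a simple learner stop choosing it.

```python
import random

# Illustrative sketch of penalty-driven learning, loosely in the spirit
# of reinforcement learning. The "algorithm" picks a response; a wrong
# pick earns a penalty (-1) that lowers that choice's value, so it gets
# picked less often over time. All names here are made up for the example.

responses = ["greet", "joke", "poem"]
values = {r: 0.0 for r in responses}  # learned preference per response
LEARNING_RATE = 0.1

def choose(epsilon=0.1):
    # Mostly pick the best-valued response; occasionally explore at random.
    if random.random() < epsilon:
        return random.choice(responses)
    return max(responses, key=lambda r: values[r])

def learn(response, reward):
    # reward is +1 for a good answer, -1 (the "punishment") for a bad one.
    # Nudge the stored value toward the reward just received.
    values[response] += LEARNING_RATE * (reward - values[response])

# Suppose the user only ever wants a poem: punish everything else.
random.seed(0)
for _ in range(200):
    r = choose()
    learn(r, 1.0 if r == "poem" else -1.0)

best = max(responses, key=lambda r: values[r])  # ends up as "poem"
```

After a couple of hundred rounds the penalised options carry negative values and the learner settles on the one answer that was never “punished”, which is the whole point of the carrot-and-stick framing.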
The second pointer to the rise of algorithms was not evident until very recently. The default voice on Google’s Home speaker, Amazon’s Alexa and Apple’s Siri, all of which are powered by algorithms, is set to be that of a woman who can be “commanded”. Research suggests that consumers prefer docile female voices. The UN articulated this in a report last week titled ‘I’d Blush if I Could’.
To test how much merit there may be in this theory, I put on the firmest tone I could and “commanded” my Google Home to “speak to me nicely”. The response was prompt and, indeed, docile: “I like chatting with you. Let me know if you’d like to listen to a joke, a fact or a poem.”
Stumped, and in a very stern voice, I said, “Hey Google, I want to hear a poem in your voice.” A gentle woman’s voice, of the kind I could have fallen in love with, calmed me down with: “Here’s a haiku by Kobayashi Issa. Everything I touch with tenderness, alas, pricks like a bramble.” I shut up.
My daughters were watching the spectacle. Docile and subservient isn’t what I want them to grow up to be. I fiddled with the device so it would sound like a man. The joy didn’t last long. A little later, the younger one wanted me to tell her a story, but a thriller had my attention and I was oblivious to her pleas. She took the next best option. “Okay Google, tell me a story about a fox.”
A deep baritone took over my paternal duties. “Once upon a time…”
(The writer is co-founder at Founding Fuel & co-author of The Aadhaar Effect)