Technology is a beautiful thing and tremendously useful. But, a bit like a knife, it can hurt us if we are not extremely careful. I had dinner with an old friend last night who also happens to be a machine learning and AI expert. And, what I heard made me fall off my chair. It also made me think more about technology and responsibility.
It is not news that our online digital footprints are used to market and sell things to us based on our revealed preferences. What is more troubling, however, is how self-learning algorithms can reinforce and even strengthen our existing biases and tendencies when we spend time online.
For example, your social media newsfeed is generally curated by an algorithm. Its goal (and that of its masters) is to keep you spending time on the platform. So, when it finds things that you might like, be it cats, gym memes or alt-right ideas, it will feed you these. The more confirmatory feedback the algorithm receives, the more of such material it feeds you. And the material tends to become more and more extreme with every confirmatory interaction. You start with pictures of kittens; you end up with a tiger in your living room.
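The escalation dynamic described above can be sketched as a toy feedback loop. This is a deliberately crude caricature, not how any real platform's recommender works: the labels, the intensity scale and the fixed step size are all made up for illustration. The only point it demonstrates is that when every confirmatory interaction nudges the next recommendation slightly stronger, mild content drifts toward extreme content.

```python
def serve(intensity):
    """Map a content-intensity score in [0, 1] to an example post.
    The labels are invented purely for illustration."""
    labels = ["kitten pictures", "big-cat memes",
              "exotic-pet forums", "tiger in your living room"]
    return labels[min(int(intensity * len(labels)), len(labels) - 1)]

def feed_loop(steps, step_up=0.1):
    """Toy escalation loop: each confirmatory interaction (a like,
    a long watch) bumps the intensity of the next recommendation."""
    intensity, history = 0.0, []
    for _ in range(steps):
        history.append(serve(intensity))
        intensity = min(1.0, intensity + step_up)  # confirmatory feedback
    return history
```

Running `feed_loop(12)` starts the history at "kitten pictures" and ends it at "tiger in your living room", with the intermediate posts escalating monotonically in between.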
This can be very dangerous, depending on what your particular tendencies and leanings are. You can end up spending time in an ever-deepening echo chamber that is moulding you day by day, unnoticed.
The psychologists Daniel Kahneman and Amos Tversky showed through their seminal experiments that human beings just aren't wired to account for what lies outside the realm of the immediate. Kahneman aptly calls this psychological bias "What You See Is All There Is".
Not only are we biologically inclined to give more value to what is in front of us, but we are now also increasingly exposed to autonomous algorithms that keep us stuck in a maze, a maze designed using the very tendencies we display. The effort required on our part has just gone up. Drastically. So, if you are not trying to cultivate a deep awareness of yourself and of how you relate to the world, it is probably fair to say that you are extremely open to manipulation. In other words, f****d.
Also, when people talk about the whole Cambridge Analytica and Facebook debacle, what they fail to see is that the algorithms don't care about your preferences. The algorithms' behaviour is not subject-matter-specific. They will feed you whatever it is that you seem to like, whether that is Donald, Hillary or Garfield the cat.
Clearly, this is a complex area that needs attention from legislators, tech experts and public representatives. But what we can do as individuals is start by taking responsibility ourselves: by becoming more aware of our own habits, behaviours and tendencies, and of how they affect us and others.
It starts with you and me. The problem, in a way, is not technology. It is the common denominator: we, the humans. Technology often complicates matters. We just have to acknowledge that fact and learn to use it better.
Find out more about Harsha’s work