Prof. Liran Razinski from the Program for Interpretation and Culture at Bar-Ilan University claims that the algorithmic view - the way in which computer systems analyze us - affects our self-perception and our behavior in the world. What is worrying, he says, is that most people are unaware of the scope and depth of this data collection.
In recent years it can seem as if our cell phones are eavesdropping on us and following us. We leave digital footprints everywhere, online and in the physical world alike: from a purchase in a store to a walk in the park, our every activity becomes data that is collected and analyzed.
The algorithms see us in a fundamentally different way from how we perceive ourselves. While we focus on what seems significant to us, the algorithms also attach importance to seemingly marginal details. "Imagine a situation where the decision regarding your hiring is based not on the content of the interview, but on your body language or the clothes you chose to wear," says Prof. Razinski.
According to him, the effect of the algorithmic gaze runs deeper than how others perceive us; it also changes the way we see ourselves. "People are starting to perceive themselves as a collection of data. Take, for example, the use of fitness apps. Instead of focusing on the overall experience of running, people become obsessed with metrics like heart rate and oxygen saturation. We are starting to see ourselves through the eyes of the algorithm."
This phenomenon, which Prof. Razinski calls the "datafication of the world", is part of a wider trend in which more and more aspects of our lives are becoming digital data. "The goal is to make human behavior more predictable," he explains. "When every aspect of life becomes measurable, the illusion is created that human behavior can be accurately predicted and controlled."
But this trend raises complex ethical and philosophical questions. Do we lose something essential when we reduce human complexity to a set of data points? Are there aspects of the human experience that cannot be quantified? And what are the consequences of giving algorithms so much power in shaping our lives?
Prof. Razinski warns against "hyper-individualization" - a situation where everyone lives in a personalized information bubble, which undermines our common space. "When everyone is only exposed to information that the algorithms 'think' is of interest to them, we lose the common basis for discourse and mutual understanding," he explains.
Despite the troubling implications, it is hard to resist this trend. Companies present the use of algorithms as something that benefits us, offering convenience and efficiency, but users are not sufficiently aware of the heavy price they pay. "Our challenge," in Razinski's opinion, "is to find a way to take advantage of technology without losing what makes us human."