The Death of Privacy in the Digital Landscape

April 23, 2022

Have you ever had the bizarre and creepy feeling that your phone might be listening to your conversations and scanning your text messages? Perhaps you were talking about potted plants, or a good exhibition in town, and suddenly, while browsing Instagram, you came across advertisements for exactly those topics?



Doesn’t it feel like living in an episode of the Netflix series Black Mirror? These are not unique experiences. Online advertisements and social media feeds today reflect what we discuss in real life; such interactions between the internet and our lives have become the norm rather than the anomaly.

The algorithms, and the laws that regulate them, have been designed to control who has access to data; however, they do not govern what happens after it is collected. An algorithm can collect seemingly benign data, identify hidden correlations, and derive new information, then use that information in subtle ways, such as excluding specific users from seeing certain products or offers.

Sophisticated ways of surveilling users and tracking their identities are becoming so ubiquitous that attempting to quantify them is like measuring the air around us. The websites we browse store snapshots of our visits and track our activity via cookies (trackers), and this data is passed on to other vendors and retailers for a variety of purposes. Facial recognition technology is embedded in our phones. Google Maps, WhatsApp and numerous other apps demand access to our location and track our movements. Conversations with voice assistants like Siri, Cortana and Alexa are routinely recorded and stored.


As scary as it sounds, our digital footprints are not merely tracked; they have been used to infer our deepest personality traits and other characteristics. In a 2013 study of 58,000 Facebook users, researchers were able to predict participants’ gender, sexual orientation, race, religious and political views, age, level of intelligence, substance use, and even whether their parents had separated. They could also predict, to some extent, personality traits such as conscientiousness, openness to experience, extraversion, emotional stability, and agreeableness.

If that is where we are, and that is how technology is invading our lives, it is not hard to imagine a future in which Uber ratings are used to gauge our emotional intelligence or likability, and Spotify and Netflix histories to infer our religious and political leanings. The potential for trading this data is by no means limited to marketing. It is actively used by recruiters, insurers, dating portals, and financial services to monetise users’ personal data: the price we pay for anything that is free.

Human beings want a sense of agency and choice in their lives, and we experience distress when we see it being taken away. The constant feeling of being watched, surveilled and tracked can be deeply worrisome and anxiety-provoking. This lack of privacy induces a crisis of liberty and autonomy and leaves users feeling out of control. Constant surveillance can also be cognitively draining, since it pushes the brain into a state of fight or flight. A perpetual lack of privacy in the digital landscape invades people’s personal space and sense of security; some users even report feeling paranoid at the idea of interacting with technology.

In the book “The Psychology of Online Persuasion,” the author, Nathalie Nahai, rightly notes that targeted data may have a positive impact on our purchasing intentions, but it can come with a hidden cost: more invasive practices. That cost is ‘psychological reactance’, an aversive emotional state that buyers experience in response to perceived threats to their autonomy and freedom. It kicks in, for example, when we receive an ill-targeted advertisement from a brand we have never heard of, never bought from, or perhaps don’t even trust. Vendors and apps must therefore not overuse customers’ data, and must constrain what they do with it; if Amazon or Google overexploit the customer data they hold, they may undermine consumer loyalty.


In one research study, researchers examined the impact of continuous computerised surveillance on individuals, using video cameras, smartphones, TVs, wireless networks, logging software on personal computers, and microphones to monitor the participants. Participants were asked to report their stress levels at six and twelve months. Strikingly, 90% of them reported feelings of anxiety, annoyance and anger, and a few dropped out at six months because of the significant discomfort they experienced while being surveilled.

Needless to say, the digital landscape poses many dangers and ethical issues associated with the expansion of digital profiling, from hacking to data leaks. There is a huge difference between what companies could know and what they should know.

This almost feels like an ultimatum. In today’s era, using and interacting with technology is as basic as breathing. It is tied not only to entertainment but, for many, to livelihood, and the lack of privacy and security in this space threatens not just our mental health but our basic sustenance as well. The question remains: at what cost are we making our personal lives and data public?


It is not as if we can do nothing to protect our data and our mental health. The first step is not surrendering in the face of what might feel like a lost battle. “We need to start putting our money where our mouth is when it comes to privacy. Appreciate it as a human right worth defending, and avoid the defeatist trap that suggests nothing can be done. Think of your mental health: Giving up control of the details that make us who we are is disempowering and psychologically destabilizing,” says psychiatrist Elias Aboujaoude.

Web 3.0 and Its Vision of End-to-End Privacy and Protection

Over the last few years, data privacy has become a far greater concern in response to these consequences. Users are increasingly aware of, and concerned about, how the data they share is utilised by companies. The fallout is slowly creeping into the fabric of society through issues like misinformation, polarisation and confirmation bias.

Some privacy-focused browsers attempt to add an additional layer of protection; however, they simply don’t go far enough. Much more work needs to be done, and the internet will need to change fundamentally to support a healthier relationship in the digital landscape. Today, people refer to this fundamental change in the internet ecosystem as Web 3.0. Unlike Web 2.0, this new era would enable users to own and retain complete authority over their data.

So, where do we go from here? How can Web 3.0 deliver privacy? The answer is simple: blockchain. Imagine a future where our data belongs to us rather than to another entity, and we can clearly see who has access to it and what type of access they have.


Sounds interesting, right?

Blockchain is, by design, a decentralised technology: it does not permit any single person or group to have full control over the ecosystem. Rather, all users collectively maintain it. While there is no single governing entity, the technology keeps a shared database of records whose integrity is enforced algorithmically, ensuring transparency and security.
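The “collective control” described above rests on a simple mechanism: each block commits to the cryptographic hash of the block before it, so no record can be altered without breaking every later link. A minimal toy sketch in Python illustrates the idea (no network, no consensus, purely illustrative):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 fingerprint of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain: list, data: str) -> None:
    """Append a block that commits to the hash of the previous block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    """Every block must reference the true hash of its predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
add_block(chain, "genesis")
add_block(chain, "alice pays bob")
add_block(chain, "bob pays carol")
print(is_valid(chain))               # True

chain[1]["data"] = "alice pays eve"  # tamper with history
print(is_valid(chain))               # False
```

Because block 2 still stores the hash of the original block 1, the tampered record is detected immediately; in a real network, every participant holds a copy of the chain and can run this check independently.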

Web 2.0 harvests users’ content and data to generate revenue, taking privacy and control away from users. Web 3.0, the new internet, aims to shift that power back into users’ hands and make the web an open and secure space.

Privacy and authorship of data are key aspects of Web 3.0. The original authors of digital assets such as blogs, videos and pictures can claim and establish their ownership on the web. This is done by hashing the content cryptographically: every piece of information gets its own hash, a unique identifier. Additionally, today’s blockchains are ‘pseudonymous’ in nature: users are identified by an alphanumeric string of characters known as a public key. A content’s identifier can then be linked to its author’s public key, allowing ownership to be verified.
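The hash-then-link scheme above can be sketched in a few lines of Python. The `registry` dict here is a hypothetical stand-in for on-chain storage, and `"pubkey-alice"` is a placeholder public key, not a real one; production systems additionally bind the hash to the key with a digital signature, omitted here for brevity:

```python
import hashlib

def content_id(content: bytes) -> str:
    """A SHA-256 hash gives each distinct piece of content a unique identifier."""
    return hashlib.sha256(content).hexdigest()

# Hypothetical registry standing in for on-chain storage: it maps a content's
# hash to the public key of the first user who claimed it.
registry: dict = {}

def claim(content: bytes, public_key: str) -> str:
    """Record ownership of a piece of content under a public key."""
    cid = content_id(content)
    registry.setdefault(cid, public_key)  # first claimant wins
    return cid

post = b"My original blog post"
cid = claim(post, "pubkey-alice")  # placeholder public key
print(registry[cid])               # pubkey-alice

# Any modification, however small, yields a different identifier:
print(content_id(post) == content_id(post + b"!"))  # False
```

Since the identifier is derived entirely from the content, anyone can recompute it and check the registry; no central authority is needed to vouch for who published what first.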


Additionally, blockchain technology eliminates the need for trusted third parties, the ‘middlemen’ of centralised web identity management, and creates a direct relationship between users and service providers. Users instead interact directly with the technology using self-sovereign identities. Blockchain fundamentally reconfigures the balance of power in data ownership.


In conclusion, Web 2.0 has driven incredible technological progress and enhanced the user experience in the digital landscape. As a natural evolution of the internet, and as an answer to the alarming issues surrounding user privacy and data breaches, Web 3.0 could bring substantial privacy benefits. We are gradually entering a new internet age, and blockchain will likely be its foundational technology. Implemented effectively, it would not only let us own and retain full autonomy over our data but could also revolutionise internet security.

Myraah is transforming India’s digital space by introducing Web 3.0 products for the first time. This will enable you to truly own and safeguard your digital assets. For more information, click here.

Do follow our LinkedIn page for updates: [ Myraah IO on LinkedIn ]