79% of EU citizens access the internet every day. Using their computer or mobile device to surf the web, users generate vast amounts of data that could give outsiders a peek into their most personal preferences, habits, political views and more.
We live in the information age fueled by data. While harnessing this data can allow for enormous innovation and service improvements for businesses and consumers alike, it also harbors immense risk for the anonymity, privacy and security of individuals and their data.
In the first part of this series, we will take a closer look at the meaning of privacy in the digital age and its importance.
The Cambridge Dictionary defines privacy as a person's right "to keep their personal matters and relationships secret". Privacy is a long-held value in Western democracies and is based on the tenet that certain private matters of individuals should be free and independent of government or outside influence.
Researchers suggest that privacy is a multi-faceted construct. Rosenberg (1992) proposed a widely cited concept that distinguishes between territorial privacy, privacy of the person, and informational privacy.
Another much-regarded multi-dimensional conceptualization of privacy was put forth by BusinessWeek (1998) and Osorio (2001), who define the term as a user's control over property, autonomy and seclusion.
In the digital age and online context, it is primarily the aforementioned informational/data privacy that comes into the spotlight. Data privacy entails that users themselves have full control over their personally identifiable information (PII).
Nowadays, internet usage, mobile devices and IoT-enabled devices with sensors allow for data generation and collection at previously unimaginable levels. Social media platforms have established the model of monetizing user data in exchange for offering users a "free" service.
As a result, most user data is stored not with users but with large social media and online service platforms. Pooled in data lakes, this data is gathered, structured and analyzed by companies with the help of artificial intelligence (AI) algorithms in order to derive new insights for cost optimization, marketing opportunities and more.
Privacy becomes a major user concern, as most of their data is stored with profit-oriented companies that may not have their users' best interests at heart. Data privacy depends on the intersection of control and trust. Users therefore need to trust that these companies will safely store and manage their data within its designated use, and will neither sell it to third parties nor allow it to be compromised by malicious actors. However, a recent surge in data regulation has started to replace mere trust with clear laws and guidelines for companies managing user data.
As regulators realize the importance and sensitive nature of user data, they have enacted strict frameworks such as the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the US. These frameworks seek to guarantee user privacy by forcing companies to disclose what data they collect, why and how they collect it, for what purpose they use it, and how they store it.
Companies may only gather the data required for a specific, granted purpose and are not allowed to reuse it for purposes other than those authorized by users. The GDPR goes one step further and dictates that, upon request, a company must provide users with the entirety of their data on record ("right of access") and delete all of these records ("right to be forgotten").
When students enter university, they know their university will track their class performance, which could give insight into traits such as their ambition, intelligence and reliability. Yet, because students know that the university would face enormous legal liability and sanctions for selling or transferring such data to third parties, they can trust the university to keep the data within its designated use. A penalty system can thus be an effective mechanism for enforcing trust by aligning the interests of users and the data-controlling entity.
Users have become increasingly concerned about data privacy in recent months. While they may willingly share personal information with their friends and contacts on social media, they strongly oppose companies using or sharing their data. When Facebook announced in January 2021 that it would force WhatsApp users to share their data with Facebook (or have their WhatsApp account deactivated), a public outcry followed.
When asked about privacy, people commonly respond with "I have nothing to hide. Why should I care?", arguing that they and their data are not important or relevant enough for anyone to track and analyze. This "I've got nothing to hide" argument is a dangerous fallacy for various reasons: