No need to tell – How data silence can speak volumes

Photo by Matthew Henry on Unsplash

Data privacy is a hot topic affecting numerous people around the globe – if not every single individual. While the public debate often revolves around the unethical retrieval and use of personal data, I am going to shed some light on the societal ramifications of people deliberately sharing their data.

In 2009, Meglena Kuneva, European Commissioner for Consumer Protection at that time, said that “personal data is the new oil of the Internet and the currency of the digital world”. Although personal data has become its own asset class and markets for personal data have been developed, it is often traded in grey zones or used in exchange for free services, making its precise valuation complicated.

These days, companies utilize personal data for a variety of purposes: reducing search costs for products via personalized and collaborative filtering of offerings, lowering transaction costs for themselves and for consumers, increasing advertising returns through better targeting of advertisements, and conducting risk analysis on customers.

Let’s focus on the last aspect, conducting risk analysis on customers, and illustrate its application in the financial industry. Accurately predicting the default risk of a borrower or an insurance policyholder’s risk of having a car accident, for instance, can be a competitive advantage and save money. But how does this development look from a customer’s perspective? So-called usage-based insurance policies (e.g. Drivewise from Allstate) use driving behavior to calculate premiums. Customers who are not willing to share their driving behavior are obviously not amongst the clientele of these insurers, and for now that does not pose a problem. But this only holds as long as there are enough alternative insurance companies that do not require customers to share their driving behavior. However, usage-based insurance is expected to reach a global market size of $115 billion by 2026. Things could change tremendously once insurers and customers realize how much money they can save by using and sharing data. At that point, not sharing your data becomes costly, and the mere fact that data is withheld already conveys information that could make companies suspicious. What does he or she have to hide?
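To make the cost of staying silent concrete, here is a minimal sketch of how a usage-based insurer might translate shared driving data into a premium. The numbers, the field names (harsh_brakes_per_100km, night_driving_share), the scoring rule and the fallback surcharge for non-sharers are all my own illustrative assumptions, not Drivewise’s or any real insurer’s pricing model.

```python
# Hypothetical sketch of usage-based pricing: how an insurer might turn shared
# driving data into a premium, and why staying silent gets priced as risk.
# All numbers, field names and the scoring rule are illustrative assumptions.

BASE_PREMIUM = 1000.0  # yearly premium before any behaviour-based adjustment


def risk_multiplier(telematics):
    """Return a premium multiplier from shared driving data (or None)."""
    if telematics is None:
        # Without data the insurer cannot tell the customer apart from the
        # riskier part of the pool, so it assumes a conservative default.
        return 1.3
    multiplier = 1.0
    multiplier += 0.02 * telematics.get("harsh_brakes_per_100km", 0)
    multiplier += 0.10 * telematics.get("night_driving_share", 0.0)
    return max(0.7, min(multiplier, 1.5))  # keep the adjustment in a plausible band


careful_driver = {"harsh_brakes_per_100km": 1, "night_driving_share": 0.05}
print("shares data, drives carefully:", BASE_PREMIUM * risk_multiplier(careful_driver))
print("shares nothing:               ", BASE_PREMIUM * risk_multiplier(None))
```

In this toy setup the careful driver who shares pays roughly 1,025 while the non-sharer pays 1,300: silence is read as risk and priced accordingly.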

Going back in history: Germany passed the “General Act of Equal Treatment” in 2006, which aims to prevent discrimination based on race, ethnicity, gender, age, religion, disability, and sexual identity. An example is the information disclosed in German CVs: applicants do not have to provide any information on the aspects covered by the General Act of Equal Treatment. However, equality is only ensured if all applicants follow the recommendations and withhold this information in their application. Therein lies the rub: people who can expect favorable treatment by a system (positive discrimination) could be more forthcoming and willing to share their data, whereas people who have to fear negative treatment (negative discrimination) could be more likely to withhold it.

But if a critical mass is sharing its data, privacy-sensitive people might be caught between a rock and a hard place because of a phenomenon called information unraveling: the disclosure decisions of others push you towards disclosing your own information if you want to avoid negative discrimination.

The following is an example of information unraveling told by Prof. Ben Polak in his lecture on game theory at Yale University. He describes how restaurant hygiene in Los Angeles in the 1990s had become so alarmingly bad that the government introduced a new quality control scheme that inspected restaurants and issued health certificates graded from A to D. Even though restaurants were not obliged to display their certificate to the public, those receiving an A started to put it in the window. What did this do to the other restaurants? Well, those who received a B started hanging up their certificates because they did not want to be suspected of holding only a C or D. Guess what C-certified restaurants did? They followed the logic of the B-certified places and hung up their certificates as well. Only those receiving a D did not showcase their certificate. From a customer’s perspective, the interpretation is clear: if you do not display your certificate, you most likely belong to the lowest category and are therefore not a good place to dine. By the way, information unraveling only works if the receivers know about it. Tourists usually did not, which made displayed certificates ineffective in touristy areas.
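The cascade can be written down in a few lines. The sketch below follows the same logic with made-up numbers: grades are mapped to numeric quality, customers treat a silent restaurant as an average member of the still-silent pool, and in each round the best still-silent grade decides whether displaying beats being pooled. The grade values and the equal-share assumption are purely illustrative.

```python
# A minimal sketch of the unraveling cascade from Prof. Polak's restaurant
# story. Grades are mapped to numeric quality (A=4 ... D=1) and each grade is
# assumed to be equally common; both assumptions are mine, for illustration.

GRADES = {"A": 4, "B": 3, "C": 2, "D": 1}

hidden = ["A", "B", "C", "D"]  # grades that do not display a certificate yet

while True:
    # Customers' belief: a silent restaurant is an average member of the
    # still-silent pool.
    pooled_belief = sum(GRADES[g] for g in hidden) / len(hidden)
    best_hidden = hidden[0]
    if GRADES[best_hidden] <= pooled_belief:
        break  # displaying no longer pays off; the cascade stops
    print(f"{best_hidden} displays its certificate "
          f"(its grade {GRADES[best_hidden]} beats the pooled belief {pooled_belief:.2f})")
    hidden.remove(best_hidden)

print("still silent:", hidden)  # only the D restaurants stay silent
```

Running it reproduces the story step by step: A displays, then B, then C, and only D stays silent, at which point silence itself has become the signal.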

So where does this leave us? The bottom line is that when people deliberately share their data, it can start cascades of information disclosure that make markets extremely efficient. However, it also holds the potential to discriminate against people who are not willing to share their data. So, while the public debate has been revolving around protecting customers from companies harvesting and utilizing personal data against their will, the debate on which data companies should not be allowed to use even with the customers’ consent deserves more attention. Evidently, that is a very industry- and service-specific discussion, but one that has to keep pace with current developments.

by Jonas Röttger, FINDER ESR
