PhD Defense – Dawei Chen

Date/Time/Venue: July 27 (Thu), 2023 @ 10:00AM @ Meeting Room 3 (COM2-02-26)

Dissertation Title: Privacy Protection, PETs by Individuals, PDPs by Firms

Abstract:

Advances in data collection and mining techniques have made privacy protection a pressing necessity. Beyond privacy regulations, individuals and firms also play considerable roles in the process of privacy protection. To combat the threat of privacy invasion, individuals are proactively adopting privacy enhancing technologies (PETs) to protect their personal information. Firms, in turn, devote considerable effort and resources, including privacy dark pattern (PDP) practices, to "wisely" comply with privacy regulations. This dissertation seeks to understand the roles individuals and firms play in the process of privacy protection through two studies.

The first study examines the impact of end-user PETs on firms' analytics capabilities. After a comprehensive review of end-user PETs, we propose an inductively derived framework which qualitatively shows that end-user PETs induce measurement error and/or missing values with regard to attributes, entities, and relationships in firms' customer databases, though the impact of specific end-user PETs varies by analytics use case. We then propose a value-oriented framework through which firms can study and quantify the impact of end-user PETs. We illustrate the value of this framework by applying it in simulation experiments in the context of product recommendations, finding quantitatively that consumers' adoption characteristics (i.e., adoption rate and pattern) and PETs' protection characteristics (i.e., protection mechanism and intensity) significantly affect the performance of recommender systems. In addition, our results reveal spillover effects: when end-user PETs are adopted, not only PET users but also non-users become worse off, and PET users suffer more in terms of recommendation accuracy. Even though observations from PET users are problematic, we find that removing them can further deteriorate recommendation accuracy.
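To make the simulation idea concrete, the following is a minimal, hypothetical sketch, not the dissertation's actual experiment: a PET is modeled as noise injected into adopters' observed ratings, and a naive user-based collaborative filter is evaluated separately for PET users and non-users. All names and parameters here (adoption_rate, intensity, top1_hit_rate) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: true user-item preferences, observed with baseline noise.
n_users, n_items = 200, 50
true_prefs = rng.normal(size=(n_users, n_items))

adoption_rate = 0.3   # fraction of users adopting a PET (assumed)
intensity = 1.5       # scale of the PET's obfuscation noise (assumed)
pet_users = rng.random(n_users) < adoption_rate

# PET adoption injects measurement error into adopters' observed ratings.
observed = true_prefs + rng.normal(scale=0.1, size=true_prefs.shape)
observed[pet_users] += rng.normal(scale=intensity,
                                  size=(pet_users.sum(), n_items))

def top1_hit_rate(obs, truth, mask):
    """Share of users (in mask) whose top recommended item is their true
    favorite, under a naive user-based collaborative filter."""
    unit = obs / np.linalg.norm(obs, axis=1, keepdims=True)
    sim = unit @ unit.T                 # cosine similarity between users
    np.fill_diagonal(sim, 0.0)          # ignore self-similarity
    scores = sim @ obs                  # similarity-weighted predicted ratings
    hits = scores[mask].argmax(axis=1) == truth[mask].argmax(axis=1)
    return hits.mean()

print("PET users :", top1_hit_rate(observed, true_prefs, pet_users))
print("non-users :", top1_hit_rate(observed, true_prefs, ~pet_users))

# Dropping PET users' noisy observations entirely and re-evaluating non-users
# lets one probe whether removal helps or, as the study finds, can hurt.
kept = ~pet_users
print("non-users, PET rows removed:",
      top1_hit_rate(observed[kept], true_prefs[kept],
                    np.ones(kept.sum(), dtype=bool)))
```

Varying adoption_rate and intensity in such a sketch is the rough analogue of the adoption and protection characteristics the study manipulates; comparing the three printed accuracies mirrors the spillover and removal analyses.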

The second study investigates the economic implications of privacy dark patterns (PDPs), through which firms can "wisely" play privacy protection games. It is commonly believed that PDPs benefit firms by deceiving consumers and collecting more of their information. Nevertheless, PDPs could also damage firms' credibility, leading consumers to stop sharing information with firms and purchasing products from them. Thus, the second study first examines whether PDPs always benefit firms and hurt consumers. We also ask whether market forces are sufficient to keep PDPs at low levels. Our results show that the presence of PDPs indeed makes users weakly worse off and the seller weakly better off. Nevertheless, the seller has an incentive to forgo PDPs entirely when users' privacy cost is high and the ratio of privacy concern to the reduced search cost of opt-in is either very high or very low. This is because the market shrinkage effect dominates the market division effect under these conditions: the gain from inducing more users to opt in is outweighed by the loss from total market shrinkage when the seller raises its level of PDPs. Finally, we show that a welfare-maximizing social planner would allow PDPs when users' privacy cost is sufficiently low.
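The intuition behind the shrinkage-versus-division tradeoff can be sketched numerically. The toy calculation below uses assumed functional forms (seller_profit, a linear shrinkage term, and a simple opt-in share), purely for illustration and not the dissertation's model, yet it reproduces the qualitative finding that a high privacy cost can make a zero PDP level profit-maximizing.

```python
import numpy as np

# Stylized illustration (assumed functional forms, not the paper's model):
# raising the PDP level d nudges more remaining users to opt in (market
# division effect) but drives privacy-sensitive users away entirely
# (market shrinkage effect).
def seller_profit(d, privacy_cost, margin_opt_in=1.0, margin_opt_out=0.4):
    market_size = max(0.0, 1.0 - privacy_cost * d)   # shrinkage in d
    opt_in_share = d / (1.0 + d)                     # division toward opt-in
    return market_size * (opt_in_share * margin_opt_in
                          + (1.0 - opt_in_share) * margin_opt_out)

levels = np.linspace(0.0, 1.0, 101)
for privacy_cost in (0.3, 2.0):   # low vs. high privacy cost (assumed values)
    profits = [seller_profit(d, privacy_cost) for d in levels]
    best = levels[int(np.argmax(profits))]
    print(f"privacy_cost={privacy_cost}: profit-maximizing PDP level = {best:.2f}")
```

Under these assumed parameters, the low privacy cost yields an interior optimal PDP level, while the high privacy cost makes shrinkage dominate and pushes the optimum to zero, matching the abstract's qualitative claim.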

This dissertation contributes to the privacy protection literature from two perspectives. First, we propose a framework, both qualitative and quantitative, for understanding the data impact of end-user privacy enhancing technologies (PETs) adopted by individuals. Second, we uncover the economic implications of privacy dark patterns (PDPs) and their regulation.