PhD Thesis Proposal — Dawei Chen

Time: 10:00-11:30AM / Venue: Executive Classroom (EC) - COM2-04-02
20 Oct 2022

Title: Two Essays on Privacy Protection


Advances in data collection and mining techniques have made privacy protection a necessity. Beyond privacy regulations, individuals and firms also play considerable roles in privacy protection. For example, to combat the threat of privacy invasion, individuals are proactively adopting privacy-enhancing technologies (PETs) to protect their personal information, while firms must expend considerable effort and resources to “wisely” comply with privacy regulations. This dissertation seeks to understand, through two studies, the roles that individuals and firms play in privacy protection.

The first study examines the impact of end-user privacy-enhancing technologies (PETs) on firms’ analytics capabilities. After a comprehensive review of end-user PETs, we propose an inductively derived framework which qualitatively shows that end-user PETs induce measurement error and/or missing values with regard to the attributes, entities, and relationships in firms’ customer databases, and that the impact of a specific end-user PET may vary by analytics use case. We then propose a value-oriented framework through which firms can study and quantify the impact of end-user PETs. We illustrate the value of this framework by applying it in simulation experiments in the context of product recommendations, which quantitatively show that consumers’ adoption characteristics (i.e., adoption rate and pattern) and PETs’ protection characteristics (i.e., protection mechanism and intensity) significantly affect the performance of recommender systems. In addition, our results reveal spillover effects: in the presence of PET adoption, not only PET users but also non-users become worse off, and PET users suffer more in terms of recommendation accuracy. Even though observations from PET users are problematic, we find that removing them can further degrade recommendation accuracy.
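To make the data-impact idea concrete, the following is a minimal illustrative sketch (not the study's actual simulation design) of how two assumed PET mechanisms alter a firm's customer database: an "obfuscate" mechanism that injects zero-mean noise into adopters' entries (measurement error) and a "block" mechanism that hides a fraction of adopters' observed entries (missing values). The mechanism names, adoption rate, and intensities are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n_users, n_items = 200, 50

# Synthetic customer database: a partially observed user-item ratings matrix.
ratings = rng.normal(loc=3.0, scale=1.0, size=(n_users, n_items))
mask = rng.random((n_users, n_items)) < 0.6   # 60% of entries observed
adopters = np.zeros(n_users, dtype=bool)
adopters[:60] = True                          # assumed 30% adoption rate

def apply_pet(ratings, mask, adopters, mechanism, intensity, rng):
    """Return (ratings, mask) after simulating a PET's data impact.

    'obfuscate' adds zero-mean Gaussian noise of scale `intensity` to
    adopters' entries (measurement error); 'block' hides a fraction
    `intensity` of adopters' observed entries (missing values).
    """
    r, m = ratings.copy(), mask.copy()
    if mechanism == "obfuscate":
        noise = rng.normal(scale=intensity, size=ratings.shape)
        r[adopters] += noise[adopters]
    elif mechanism == "block":
        drop = rng.random(ratings.shape) < intensity
        m[adopters] &= ~drop[adopters]
    else:
        raise ValueError(f"unknown mechanism: {mechanism}")
    return r, m

# Measurement error: only adopters' rows change; the observation pattern stays intact.
r_obf, m_obf = apply_pet(ratings, mask, adopters, "obfuscate", 1.0, rng)

# Missing values: adopters' observed entries shrink; non-adopters are untouched.
r_blk, m_blk = apply_pet(ratings, mask, adopters, "block", 0.5, rng)
```

Any downstream recommender trained on `(r_obf, m_obf)` or `(r_blk, m_blk)` instead of the clean matrix then inherits these distortions, which is the channel through which adoption rate and protection intensity feed into recommendation accuracy in the experiments.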

The second study investigates the economic implications of privacy dark pattern (PDP) practices, through which firms can “wisely” play privacy-protection games. It is commonly believed that PDPs advantage firms by deceiving consumers and collecting more of their information. Nevertheless, PDPs could also damage firms’ credibility, leading consumers to stop sharing information with, and purchasing products from, those firms. The second study therefore first examines whether PDPs always benefit firms and hurt consumers. We also ask whether market forces (i.e., competition) are sufficient to keep PDP practices at relatively low levels. Preliminary results from our game-theoretic model show that a monopolistic seller can benefit from PDP practices when all consumers are naive, and that social welfare does not necessarily decrease.

This dissertation contributes to the privacy protection literature in two ways. First, we propose a framework, both qualitative and quantitative, for understanding the data impact of PETs adopted by individuals. Second, we uncover the economic implications of privacy dark patterns and their regulation.