2024
Title: Beyond the Accept Button: How Information and Control Shape Data Sharing and AI Engagement
Sponsor: Meta
Presented by: Ibrahim Chhya, Yash Maurya, Zoufei Hong, Limin Ge
Advisor: Prof. Lorrie Cranor
This study examines how consent flow design impacts user engagement with AI features and data sharing in digital platforms. Through a simulated social media app onboarding experience embedded in a survey of 500 participants, we investigated the effects of content length, information presentation, and granularity of data sharing controls. Our findings reveal three key insights: (1) while longer content improved user comprehension and medium-length content increased data sharing consent, users preferred 2-3 settings per page without showing significant sentiment differences; (2) although the information provided in the consent flow did not impact user behavior, participants strongly desired more transparency about third-party data sharing and AI training practices; and (3) simple binary control options led to higher engagement with AI features compared to more complex controls. We conclude with design recommendations for creating more effective consent flows that balance user experience with privacy requirements in modern digital platforms.
Title: Designing Privacy-Conscious Consent Interfaces: A User Study of Account Aggregator Onboarding Flows
Sponsor: Silence Laboratories
Presented by: Aman Priyanshu, Jingxin Shi, Suriya Ganesh
Advisors: Dr. Sarah Scheffler and Dr. Omer Akgul
This paper examines privacy consent mechanisms for financial data sharing, with a focus on Account Aggregator onboarding processes. Drawing from existing research and industry practices in privacy consent mechanisms, we created a pool of research-informed UI variations of privacy notices and privacy choice interfaces. Through user studies involving 264 participants, we evaluated a selected subset of these variations: six distinct information presentation interfaces and four consent option interfaces. The study combined quantitative metrics (statistical analysis using Chi-Square tests and Spearman correlations), technical measurements (interaction logging within the application), and qualitative insights (thematic coding of user feedback) to assess interface effectiveness. We provide actionable recommendations for implementing privacy-conscious interfaces in open finance systems. We establish a replicable approach for evaluating privacy consent mechanisms across different domains.
Title: CMU Information Security Office: Transparency Report and FAQ Project Overview
Sponsor: CMU Information Security Office
Presented by: Joanne Peca
Advisor: Lujo Bauer
The ISO at CMU is responsible for securing the university’s technology resources (including data, computing, and network environments) and works in partnership with the CMU community (including students, faculty, staff, and visitors) towards this goal. At the same time, the CMU community wants to have a better understanding of their privacy (of their data, usage, behaviors, etc.) when using CMU technology resources. The goal of this project is to assist the ISO with the implementation of key deliverables (outlined below) which will help to improve transparency and understanding between the ISO and the CMU community, thereby strengthening the security posture of the university.
Title: Understanding the current landscape of AI Governance Processes and Frameworks
Presented by: Jayson Jin
Advisor: Prof. Hana Habib
2023
Title: Evaluating Potential Future States of Data Sharing within the City of Boston
Sponsor: City of Boston
Presented by: Katie Earlies, Andrew Berry and Jatan Loya
Advisor: Prof. Hana Habib
Data sharing within city governments can cultivate improved service delivery, transparency, and trust. The City of Boston aims to foster data sharing with Boston Public Schools (BPS) to improve educational outcomes and budgeting efficiency. Partnering with the Department of Innovation and Technology (DoIT), we assessed the technical and regulatory possibilities of various future states of increased data sharing. After speaking with a multitude of stakeholders within the City, we assessed the regulatory and technical requirements to ingest and utilize data from BPS and DoIT to conduct future analyses which could impact educational outcomes and the city’s budget. We evaluated regulatory requirements of future states of data sharing, analyzed necessary legal and procedural requirements, and assessed technical possibilities of sharing data in a privacy-preserving way. Finally, we defined four future states that incorporated both impact evaluation value as well as privacy and security. Moving forward, we highly recommend expanding collaboration with BPS. Even a small step in the direction of increasing data sharing will help promote trust and transparency between BPS and the City. Additionally, we recommend strong technical privacy controls for data shared, including encryption in transit and at rest as well as data manipulation techniques, such as aggregation, k-anonymity, tokenization, and differential privacy. DoIT has the technical abilities and stakeholder support to facilitate secure data sharing with BPS that will simultaneously protect the privacy and security of student data while providing critical information to the City.
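One of the data manipulation techniques named above, k-anonymity, can be illustrated with a toy sketch. This is a minimal, hypothetical example (the records and field names are illustrative, not the actual BPS/DoIT data): it checks whether every combination of quasi-identifier values appears in at least k records, and shows how generalizing age and ZIP code coarsens unique records into indistinguishable groups.

```python
from collections import Counter

# Hypothetical toy records; fields ("age", "zip", "score") are illustrative only.
students = [
    {"age": 12, "zip": "02118", "score": 88},
    {"age": 13, "zip": "02118", "score": 75},
    {"age": 17, "zip": "02120", "score": 91},
]

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every quasi-identifier value combination occurs in >= k records."""
    combos = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in combos.values())

def generalize(record, age_bucket=10, zip_digits=3):
    """Coarsen an exact age into a decade bucket and a ZIP into a prefix."""
    lo = (record["age"] // age_bucket) * age_bucket
    return {
        **record,
        "age": f"{lo}-{lo + age_bucket - 1}",
        "zip": record["zip"][:zip_digits] + "*" * (len(record["zip"]) - zip_digits),
    }

# Exact ages and ZIPs make each record unique, so even k=2 fails.
raw_ok = is_k_anonymous(students, ["age", "zip"], 2)
# After generalization all three records share ("10-19", "021**"), so k=3 holds.
gen_ok = is_k_anonymous([generalize(s) for s in students], ["age", "zip"], 3)
```

In practice, generalization like this trades analytic precision for privacy, which is why the report pairs it with complementary controls such as aggregation and differential privacy.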
Sponsor: KIPP NYC
Presented by: Kexin Jiang, Hongtao Yao, Harish Balaji
Advisor: Prof. Hana Habib
KIPP NYC Public Schools operates a network of 18 K-12 public charter schools, fostering an environment that encourages educators to embrace innovative practices for family engagement and instructional delivery. Despite implementing staff and student policies along with technical controls, the inherent challenges of the Internet often limit the effectiveness of these interventions in ensuring authentic student privacy protection.
We, the privacy engineering team at CMU, bring experience in the privacy and software engineering domains and expertise in implementing privacy-enhancing products. We collaborated with KIPP NYC to address this challenge. Our collaboration yielded a comprehensive online privacy guide aimed at enhancing the staff's understanding of existing privacy regulations for minors in the United States. This dynamic guide serves as a user-friendly and engaging resource, translating legal complexities into actionable 'rules of engagement' for staff handling student information online.
As a pivotal component of our final deliverable, we designed and conducted a survey to assess educators' usage of various online tools and services in daily classroom activities, management, and parent communication. This survey, covering social media, approved district applications, and non-approved district applications, not only provided valuable insights but also served as a foundational resource for the development of the comprehensive privacy guide.
Title: Privacy Threat Modeling Framework for ‘User Notice & Choice’
Sponsor: PwC
Presented by: Geetika Gopi, Seo Young Ko, Alexandra Li, Yanhao Qiu, Ziping Song
Advisor: Prof. Lorrie Cranor and Prof. Norman Sadeh
For our capstone project, we drafted a privacy threat modeling framework for User Notice and Choice. The framework is designed to be practical and user-oriented, addressing the limitations of existing high-level guidance. It builds on the concept of Privacy-by-Design and aims to provide a systematic approach to identifying and mitigating risks associated with user-oriented privacy notice and choice, particularly in the context of AI. This is a critical need in today’s rapidly evolving technological landscape, where new technologies like AI are outpacing the ability of regulation to address privacy risks directly. The framework aims to move beyond a compliance-by-design approach to a more proactive and practical approach to privacy.
Title: Evaluating Privacy Risks of Text-to-Image Generative Models
Sponsor: Meta
Presented by: Aadyaa Maddi, Swadhin Routray, Yifeng Zeng, Zili Zhou
Advisor: Prof. Norman Sadeh
The development of Generative Artificial Intelligence (Gen AI) has led to its widespread use across various digital industries by businesses and consumers. Consequently, a growing need exists to educate users, especially minors and their guardians, about the risks and adverse effects of using Gen AI software. This project addresses privacy concerns about data collection and usage in Gen AI, particularly in scenarios where teenagers generate and share photo-realistic images. Our main goal is to map out information flows from the prompter of the Gen AI tool to the end-user and identify potential privacy risks associated with each of these flows. We will draw ideas from existing regulations, policies, and frameworks and propose mitigation strategies to improve privacy protections. Our study will also include user feedback, obtained by conducting a thorough user study, to prioritize risks and select effective mitigation strategies. We hope that the work presented herein will help inform the development and deployment of Gen AI tools by platform operators and other providers of Gen AI technologies.
Title: Survey Of Alternative AdTech Solutions In The Face Of Third Party Cookie Deprecation
Sponsor: ZenData
Presented by: Sriram Viswanathan, Pauline Mevs, David Mberingabo
Advisor: Prof. Hana Habib
With the deprecation of third-party cookies, publishers are looking for ways to protect their revenues, which often leads them to alternative solutions that enable audience addressability and ad campaign effectiveness reporting. Publishers are considering ways of leveraging first-party data that also protect users’ data and comply with regulations. Deprecation is also an opportunity for brands and publishers to build increased trust with their customers and provide users a richer value exchange for the data collected.
In our study, we looked at the current state of these alternative solutions and which AdTech use cases they solve (viz., audience addressability; data sharing and enrichment; or measuring campaign effectiveness). At a high level, these solutions can be classified into alternative ID solutions, Data Clean Rooms (DCRs), Seller Defined Audiences (SDAs), and Privacy Enhancing Technologies (PETs). In addition, Google is introducing its Privacy Sandbox, a suite of APIs and features built into Chrome (the largest browser by market share) to enable more privacy-preserving interest-based advertising (IBA) and ad attribution reporting. These solutions vary in maturity, adoption, ease of implementation, privacy guarantees, and, not least, utility for the intended use case in terms of measurable ad metrics, all of which can inform the choice of the right solution for each use case.
2022
Title: Privacy in Live Streaming
Sponsor: Twitch
Presented by: Andrew Berry, Jacob Gursky, Zixuan Li, Rachna Sasheendran, David Zagardo
Advisor: Aleecia McDonald
This project is an assessment of de-identification and anonymization best practices for live streaming services. The live nature, engagement model of viewers, and types of data being collected can require different approaches than other user generated content (UGC) sites. Students looked at different layers of the live streaming experience (from UX to back-end data stores) and assessed the best de-identification/pseudonymization and anonymization techniques for each layer.
Title: Inclusive & Engaging Privacy Education Formats
Sponsor: Meta
Students: Chris Choy, Jack Gardner, Soha Jiwani, and Jasmine Tefsa
Advisor: Prof. Lorrie Cranor
Providing people with the tools and agency to control their data is essential to improving digital privacy, and a key first step is to provide effective privacy education. The students propose a data-driven exploration of how social media users can receive effective privacy education through engaging formats that facilitate new learning and reinforce existing knowledge.
Title: User Privacy Expectations Concerning Artificial Intelligence Usage in Twitch
Students: Navid Kagalwalla, Rakshit Naidu, Mingjie Chen, Tresna Mulatua
Advisor: Prof. Norman Sadeh
This project examines end users' expectations from technology companies regarding their use of AI. Possible privacy questions include what end users expect for their rights to deletion, access, and transparent use of their data in the AI realm. How can information about AI data use be communicated to users to enhance trust and understanding?
Title: Anonymous Identification and Verification: Privacy & Security Research of Biometric Information
Sponsor: Mastercard
Presented by: Adhishree Abhyankar, Xin Gu, Zhi Lin, Yixuan Wang
Advisor: Prof. Michael Shamos
Because there is no standard, biometric service providers (BSPs) use proprietary methods to capture, analyze, and process facial recognition data, for example at point of sale. Students will investigate best practices to author recommendations on appropriate privacy and security protections for facial recognition data.
Title: An Ispettore in Training: Developing a Usable, Semi-Automated Enforcement Tool For GDPR Violations
Sponsor: noyb
Students: Jacob Gursky, Rachna Sasheendran, Zixuan Li, David Zagardo
Advisor: Aleecia McDonald
2021
Title: Best Practices for Global Audio Streaming Platforms
Sponsor: Spoon Radio
Presented by: Kang Wang, Annette Stawsky, Ye In Kim, Dong Hyuk Shin
Advisor: Norman Sadeh
Spoon Radio is a rapidly growing global audio streaming platform which currently operates in South Korea, the United States, and Japan, as well as the Middle East and North Africa. The platform believes that its commitment to user privacy is an important competitive factor. As such, it aims not just to comply with existing privacy regulations in regions where it operates today but also to anticipate the likely evolution of these regulations and of user expectations. In doing so, Spoon Radio wants to ensure it is well prepared to continue its expansion into new markets. As part of an effort to inform the evolution of its data practices, Spoon Radio reached out to the Privacy Engineering Program at CMU and sponsored a capstone project in which master’s students in the Program worked with Spoon Radio personnel over the course of the 2021 Fall semester. The present report summarizes best practice recommendations that have emerged from this collaboration. These best practices are a combination of practices that are already implemented or in the process of being implemented by Spoon Radio today as well as more aspirational recommendations, which are expected to help inform Spoon Radio’s practices in the future. In this report, best practice recommendations are organized around four stages of the data life cycle: data collection, data storage, data usage, and finally data destruction. A separate section is devoted to content moderation, an area where platforms such as Spoon Radio need to reconcile considerations such as promoting freedom of expression with the need to create a safe and respectful environment that complies with applicable laws and respects relevant cultural values.
Index Terms—audio recordings, social network, user-generated content, voice, moderation.
2020
Title: Consent and Authorization Interfaces for Voice Interfaces
Sponsor: Highmark
Presented by: Chenxiao Guan, Chenghao Ye, Yigeng Wang, Fangxian Mi
Advisor: Timothy Libert
People are accustomed to letting voice assistants complete daily tasks for convenience, and the purpose of our design is to provide healthcare services to people via voice assistants as well. However, news and studies show that many devices have been found to record and misuse users’ conversations with intelligent voice assistants without consent. Thus, we created a consent and authorization voice interface, built as an Alexa Skill and based on the HIPAA Privacy Rule, to ensure users can control their protected health information (PHI). We introduce a new way of agreeing to privacy policies and notices via a voice interface, maintaining usability from the user's perspective in a privacy-friendly manner while keeping the company side compliant with regulations such as HIPAA. The implementation is meant to strike a balance between usability and privacy-friendliness: we provide a way of configuring fine-grained privacy preferences while making sure the interface is not bloated with so many explanations that it annoys users.
Title: Developing Privacy Labels for Medical Devices
Sponsor: Elektra Labs
Presented by: Dev N. Patel, Yash D. Mehta, Ziyuan Zhu
Advisor: Yuvraj Agarwal
IoT is a rapidly growing industry, and it is becoming increasingly difficult to keep up with its advances. These advances have also led to a large increase in the number of devices available on the market, which makes it difficult for consumers to compare devices and decide which to purchase. Elektra Labs is a startup focused on providing comprehensive information about medical IoT devices, such as research evidence and security, through an interface that makes it easy to find information and compare similar devices; the target audience includes clinicians, medical researchers who might purchase a device for their study, their research subjects, and consumers interested in a device for personal use. In this presentation, we describe our efforts to select the information that should be displayed to the target audience through a multi-layered label. We also present a questionnaire that would be given to device manufacturers to collect the information displayed on the label. We tested the questionnaire with device manufacturers and the label with medical researchers, and we include our findings from these user tests in this presentation.
Title: Designing Inclusive Privacy Education
Sponsor: Facebook
Presented by: Clare Hsu, Lewei Li, Daniel Mo
Advisor: Lorrie Cranor
Low digital literacy users, who are generally unfamiliar with digital skills and Internet technologies, have been forced to join the current trend of online payment and social commerce by the COVID-19 situation. However, compared to general users, their lower awareness, fewer concerns, and more misconceptions can make them more vulnerable in terms of privacy. In this talk, we will present a design of privacy education for low digital literacy users in the context of the data practice policies of Facebook Pay, addressing their potential privacy concerns, misconceptions about existing data practice policies, and lack of privacy self-efficacy. We will also discuss the evaluation results of our design, based on qualitative responses collected in interviews, and propose several recommendations for current education design.
2019
Title:
Sponsor: Ethyca
Presented by: Meihan Li, Lulan Yu, Amanda Zhu
Advisor: Timothy Libert
Title: The Internet of Things Privacy Infrastructure: Applicability, Usability, and Market Entry Strategy
Sponsor: CMU Mobile Commerce Lab
Presented by: James Arps, Ziheng Ni, Yiding Ou, Jingyi Zhu
Advisors: Norman Sadeh, Yuanyuan Feng, Justin Donnell, Gabriela Zanfir-Fortuna
Students studied a small number of IoT deployment scenarios that are particularly ripe for adoption of the IoT Privacy Infrastructure in light of legal requirements associated with GDPR and possibly CCPA. The project further identified opt-in/opt-out requirements associated with these scenarios and how to best support these requirements using the IoT Privacy Infrastructure, as well as the information that would be most important to show users in their IoT Assistant app to support informed opt-in/opt-out decisions in these scenarios.
Previous Years
Year | Project Name | Project Sponsor | Presented by | Advisor
2018 | Evaluation of Mobile Privacy and Security: Mobile App Privacy Score | Redmorph Inc. | Fan Yang, Jianlan Zhu | Timothy Libert
2018 | | Netflix, Inc. | Ao Chen, Jeremy Thomas | Nicolas Christin
2018 | | | Yama Ahmadullah, Zhuo Chen, David Edelstein | Lorrie Cranor
2018 | | Phillips | Sharada Boda | Norman Sadeh
2017 | | Microsoft | Dhanuja Shaji and Javed Ramjohn | Lorrie Cranor
2017 | Data Processing Inventory: A solution for compliance with GDPR Article 30 | Citi | Lidong Wei, Quan (Bill) Quan, and Jun Ma |
2017 | | HERE | Tong Liu, Yuankun Li, Yuru Liu | Lorrie Cranor
2017 | | Alibaba | Anuj Shah and Yunfan Wang | Norman Sadeh
2017 | | UnifyID | Siddharth Nair, Preethi Josephina Mudialba, and Dan Calderon | Lujo Bauer
2016 | Prevalence of PII data in the clear on personal computers | Intersections, Inc. | Lieyong Zhou & Xi Zheng |
2016 | Data Subject Notice and Consent under the EU GDPR | PrivacyCheq | Jonathan Liao, Vijay Kalani, Arnab Kumar |
2015 | How Transparency Affects User Preferences and Behavior | Lufthansa | Scott Stevenson, Chunyue Du, and Rahul Yadav | Lorrie Cranor
2015 | Tales from the Crypton: An Evaluation of SpiderOak's Novel Authentication Mechanisms | SpiderOak | Cameron Boozarjomehri and Hsin Miao |
2014 | A Taxonomy of Privacy Notices | | Adam Durity and Aditya Marella | Lorrie Cranor
2014 | Mobile Location Analytics Opt-out | The Future of Privacy Forum | Pranshu Kalvani, Chao Pan, and Weisi Dai |
2014 | Privacy Center | American Express | Sakshi Garg, Zhipeng Tian, Ziewi Hu |