Carnegie Mellon University

2021 Capstone Project


Capstone Report: Best Privacy Practice Recommendations for Global Audio Streaming Platforms

2020 Capstone Projects

Sponsor:           Highmark
Presented by:  Chenxiao Guan, Chenghao Ye, Yigeng Wang, Fangxian Mi
Advisor:           Timothy Libert

People have grown accustomed to letting voice assistants complete daily tasks for convenience, and the purpose of our design is to provide healthcare services to people via voice assistants as well. However, news reports and studies show that many devices have been found to record and misuse users’ conversations with intelligent voice assistants without consent. Thus, we created a consent and authorization voice interface, built as an Alexa Skill and based on the HIPAA Privacy Rule, to guarantee users control over their protected health information (PHI). We developed a new way of agreeing to privacy policies and notices via a voice interface that maintains usability from the user’s perspective in a privacy-friendly manner while keeping the company in compliance with regulations such as HIPAA. The implementation is meant to strike a balance between usability and privacy-friendliness, so we provide a way of configuring fine-grained privacy preferences while ensuring the interface is not bloated with so many explanations that it annoys users.

Sponsor:           Elektra Labs
Presented by:  Dev N. Patel, Yash D. Mehta, Ziyuan Zhu
Advisor:           Yuvraj Agarwal

IoT is a rapidly growing industry, and it is becoming increasingly difficult to keep up with its advances. These advances have also led to a large increase in the number of devices available on the market, which makes it difficult for consumers to compare devices and decide which one to purchase. Elektra Labs is a startup focused on providing comprehensive information about medical IoT devices, such as research evidence and security, through an interface that makes it possible to find information easily and compare it with other similar devices. The target audience includes clinicians, medical researchers who might be interested in purchasing a device for their study, their research subjects, and even consumers interested in a device for personal use. In this presentation, we describe our efforts to select the information that should be displayed to the target audience through a multi-layered label. We also present a questionnaire that would be given to device manufacturers to collect the information displayed on the label. We tested the questionnaire with device manufacturers and the label with medical researchers, and we include our findings from these user tests in this presentation.

Sponsor:           Facebook
Presented by:  Clare Hsu, Lewei Li, Daniel Mo
Advisor:           Lorrie Cranor

Users with low digital literacy, who are generally unfamiliar with digital skills and Internet technologies, have been forced to join the current trend of online payment and social commerce under the COVID-19 situation. However, compared to general users, their lower awareness, fewer concerns, and more misconceptions can make them more vulnerable in terms of privacy. In this talk, we will present a design of privacy education for low-digital-literacy users in the context of the data practice policies of Facebook Pay, addressing their potential privacy concerns, misconceptions about existing data practice policies, and lack of privacy self-efficacy. We will also discuss the evaluation results of our design, based on qualitative responses collected in interviews, and propose several recommendations for current education design.

2018 Capstone Projects

Evaluation of Mobile Privacy and Security:  Mobile App Privacy Score

Sponsor:           Redmorph Inc.
Presented by:   Fan Yang, Jianlan Zhu
Advisor:            Timothy Libert

Nowadays, mobile applications suffer from tremendous user-privacy issues. With the popularity of mobile apps, users’ data is shared among a large variety of entities. Problems like targeted advertising and information leakage are common privacy violations among mobile applications. What makes it even worse is that, according to research, few users understand these complicated issues. Even when users are aware that some of their personal data has been improperly used, they have no idea how bad the situation is and no choice but to accept it. Related studies have done plenty of work to identify privacy issues in current mobile applications. Researchers have considered many metrics, such as privacy policies, permissions requested, code analysis, and data transmission, to evaluate mobile apps’ privacy performance. But average users still have a difficult time understanding the issues, because few studies have put forward a straightforward scoring system.
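
A straightforward scoring system of the kind described could, purely as an illustration, combine per-metric subscores into one weighted number. The metric names and weights below are hypothetical assumptions, not the project's actual model:

```python
# Hypothetical sketch of a simple mobile app privacy score: combine
# normalized subscores for several evaluation metrics into a 0-100 total.
# Metric names and weights are illustrative assumptions only.
WEIGHTS = {
    "privacy_policy": 0.25,    # policy completeness/readability
    "permissions": 0.30,       # share of requested permissions justified
    "code_analysis": 0.25,     # static-analysis findings
    "data_transmission": 0.20, # encrypted vs. plaintext traffic
}

def privacy_score(subscores: dict) -> float:
    """Each subscore is in [0, 1]; returns a weighted 0-100 score."""
    if set(subscores) != set(WEIGHTS):
        raise ValueError("subscores must cover exactly the defined metrics")
    return round(100 * sum(WEIGHTS[m] * s for m, s in subscores.items()), 1)

example = {"privacy_policy": 0.8, "permissions": 0.5,
           "code_analysis": 0.9, "data_transmission": 0.6}
print(privacy_score(example))  # 69.5
```

A single bounded number like this is what makes the result easy for average users to compare across apps, at the cost of hiding which metric drove the score.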

Executive Summary:  Evaluation of Mobile Privacy and Security:  Mobile App Privacy Score

Evaluating Privacy Enhancing Technologies for Organizations

Sponsor:           Netflix, Inc.
Presented by:   Ao Chen, Jeremy Thomas
Advisor:            Nicolas Christin

Given the changing perspective on privacy in today’s society, organizations must adapt to a growing set of requirements and constraints on practices that affect privacy. Individual choices about privacy, exposure of broken privacy practices, and expanding regulations push organizations to extend accepted privacy principles throughout their business operations. All of these factors encourage the use of technologies that provide stronger privacy guarantees and directly, effectively, and efficiently address the privacy challenges these requirements create.  To better understand the use of privacy enhancing technologies (PETs) by organizations, we interviewed a set of privacy experts working across various industries and in multiple disciplines. We conducted 20 interviews of privacy experts from September 2018 to November 2018, most over the phone or through video conferencing applications; interviews typically lasted 30 to 60 minutes.

Executive Summary: Evaluating Privacy Enhancing Technologies for Organizations

View the final report here.

Designing Privacy Controls for Older Facebook Users

Sponsor:           Facebook
Presented by:  Yama Ahmadullah, Zhuo Chen, David Edelstein
Advisor:            Lorrie Cranor

At Facebook’s request, we investigated the particular privacy concerns of seniors (age 65+) regarding Facebook and developed proposed remedies.

Information Gathering
We built on existing research and prior attempts to develop privacy aids for older users, constructing interviews and surveys to find places for improvement in Facebook’s existing experience. We conducted 15 semi-structured interviews to identify privacy concerns senior users have on Facebook. Then, from our results, we developed a survey to find which concerns were most important and to find promising ways to provide senior users with better control of their security and privacy. Ultimately, we received 79 completed surveys. Seniors were eager to share their concerns with us, and were pleased that a company like Facebook was interested in learning how they felt.

Executive Summary: Designing Privacy Controls for Older Facebook Users

Developing a Process to Test Privacy in Mobile Apps

Sponsor:  Philips
Presented by:  Sharada Boda

This project developed a process specification for Philips to test privacy in mobile apps, and tested two apps using the developed process. The process addresses several aspects of the GDPR: purpose identification; third-party library identification and those libraries’ access to data; consent; and portability.

The process involved a three-step methodology that combined automated tools with manual analysis. The first phase involved identifying personal data, the third-party libraries in the app, and those libraries’ access to personal data. The second phase involved determining the behavior of personal data: identifying the purpose of processing, consent, and portability. These aspects were manually verified by analyzing the code, interacting with the application, inspecting the network traffic, and learning more about the various third-party libraries used in the app. The data was also analyzed to determine how it was stored and transmitted, and checked for encryption. The last phase involved analyzing the privacy policy text to identify possible violations (contradictions between how personal data is processed in the app and the privacy policy text) and omissions (information missing from the privacy policy).

Four automated tools were identified for various parts of the process and combined with manual analysis for comprehensive privacy testing. The entire process involved both static and dynamic analysis. The process was implemented on two Philips apps, and a few potential issues were found. While the process is capable of finding certain potential issues, it primarily aims to raise pertinent questions that must be addressed through collaboration with developers and privacy officers. The findings suggest there is scope for further improvement to address more aspects of the GDPR and better secure client data in mobile applications.
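
The third-party-library identification step described above can be sketched as a simple package-prefix match against a list of known libraries. The library names and categories below are a small illustrative sample, not the actual tooling or list used in the project:

```python
# Minimal sketch: flag known third-party libraries in an app by matching
# package prefixes found in its (decompiled) code. The library list and
# categories here are illustrative assumptions only.
KNOWN_THIRD_PARTY = {
    "com.google.ads": "Advertising",
    "com.facebook": "Social/Analytics",
    "com.crashlytics": "Crash reporting",
}

def flag_third_party(package_names):
    """Map each package seen in the app to a category if it matches a known prefix."""
    findings = {}
    for pkg in package_names:
        for prefix, category in KNOWN_THIRD_PARTY.items():
            if pkg == prefix or pkg.startswith(prefix + "."):
                findings[pkg] = category
    return findings

packages = ["com.example.app.ui", "com.google.ads.mediation", "com.facebook.login"]
print(flag_third_party(packages))
# {'com.google.ads.mediation': 'Advertising', 'com.facebook.login': 'Social/Analytics'}
```

In practice a step like this only surfaces candidates; determining what data each flagged library can actually access still requires the manual code and traffic analysis the process describes.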

Developing a Process to Test Privacy in Mobile Apps - Executive Summary

2017 Capstone Projects

Developing a Windows Privacy Walkthrough Experience

Sponsor:  Microsoft
Presented by: Dhanuja Shaji and Javed Ramjohn

This project provides data to help inform Microsoft on how to deliver a state-of-the-art privacy experience for the Windows 10 desktop that empowers the user and represents Microsoft’s dedication to privacy. We conducted a competitive analysis of the privacy experiences offered by major platforms and operating systems and identified a lack of proactive privacy experiences that use elements like nudges or notifications to encourage informed privacy decision making. Microsoft also offered greater privacy controls than most of the other platforms studied. An MTurk study (N = 364) was conducted to gain insight into the habits and preferences of consumers with regard to operating system privacy. The results show that, despite Microsoft’s extensive privacy controls, Windows users were more concerned about their privacy than other OS users (p < 0.05). We also found that most participants (80%) want a proactive privacy walkthrough for their OS privacy settings. Finally, a series of prototype drafts for a privacy walkthrough experience in Windows 10 were designed as a starting point for future work. Our findings suggest that Microsoft can further assert itself as a privacy leader by focusing on a seamless privacy experience throughout its ecosystem of products, and that Microsoft should devote resources to the user testing and development of a proactive privacy walkthrough experience.

Developing a Windows Privacy Walkthrough Experience - Executive Summary

Data Processing Inventory:  A solution for compliance with GDPR Article 30

Sponsor:  Citi
Presented by: Lidong Wei, Quan (Bill) Quan, and Jun Ma

This project was designed to organize the data processing activities of Citi and to assist the organization in becoming compliant with Article 30 of the GDPR, which comes into effect on May 25, 2018. A two-step approach was taken: analysis, then prototype development. The analysis phase entailed detailed research into publicly available third-party descriptions of Article 30 and into vendor tools designed to meet Article 30 compliance objectives. In the second phase, a prototype was designed and developed. The work ensured that the design and development would meet the requirements of GDPR Article 30, covering data processing, data collection, data sorting, and the presentation of the data. The work includes data management, data analytics, and data visualization components, along with original, innovative thinking about how to set up and maintain a compliant Article 30 inventory.

Data Processing Inventory - Executive Summary

Getting Consent From Drivers on the Go: a Privacy Consent Collection Framework for in-vehicle Systems

Sponsor:  HERE
Presented by: Tong Liu, Yuankun Li, Yuru Liu

HERE Technologies, a digital location technology company providing location-based services to various verticals, including automotive, has been considering different means of acquiring consent in an in-vehicle context. We therefore designed a framework for collecting consent in that context.  The framework consists of four methods that each have their pros and cons: using the vehicle’s screen, using the vehicle’s audio interface, using an app on the user’s smartphone, or using the vehicle’s audio interface to send information to the user’s smartphone for viewing.
We conducted a two-phase user study to evaluate each method with drivers. Overall, we found the in-vehicle screen to be the most favored method: people think it is the most natural and convenient way to provide consent. However, some people still think the consent process takes too much time and want to bypass it. The other approaches we tested were also promising.

Getting Consent from Drivers on the Go - Executive Summary

Supporting Data Portability in the Cloud Under the GDPR

Sponsor:  Alibaba
Presenters: Anuj Shah and Yunfan Wang

The right to data portability under the European Union’s General Data Protection Regulation (GDPR) extends beyond existing privacy frameworks and empowers individuals to transfer their personal data between data controllers. While the Working Party for Article 29 of the Data Protection Directive has issued guidance on how to respond to portability requests, the European Commission has expressed a different interpretation of this right. Data portability therefore brings new and significant challenges to data-driven enterprises, especially those with systems that are distributed across cloud infrastructure. We attempt to clarify how this right translates to the operations of cloud service providers in their roles as either data controllers or data processors. Specifically, we outline the various technical methods available for porting data in the cloud, and then consider how the recipient of data from a portability request and the cloud service level govern which compliance solution a cloud provider can put forward. The solutions we describe here are simple extensions of existing services and do not prescribe a specific legal interpretation. We encourage cloud providers to take a competitive stance on GDPR compliance by offering these solutions to their customers.

Supporting Data Portability in the Cloud Under the GDPR - Executive Summary

Researchers develop data portability recommendations for the cloud.

Metrics and Adversary Models for Implicit Authentication

Sponsor:  UnifyID
Presenters:  Siddharth Nair, Preethi Josephina Mudialba, and Dan Calderon

Abstract:
Implicit authentication is a prominent emerging field concerned with the usably secure authentication of users. Many approaches have been proposed toward the goals of this field, but it remains unclear how to evaluate across systems, since there is no agreed-upon set of performance evaluation metrics. Compounding the problem, not all systems consider the same adversarial threats, if any, that could compromise their security or usability. In this project, we review the literature on performance evaluation, as well as the broader computer security authentication literature, to determine a set of important criteria that a metric and a threat model should satisfy to be valuable for evaluating an implicit authentication system. We present taxonomies of performance metrics and threat models with respect to these criteria, and recommend a subset that all future proposed systems should use.
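
As one concrete example of the kind of performance metric such a taxonomy covers, two widely used measures for authentication systems are the false accept rate (FAR) and false reject rate (FRR). These are standard biometric-style metrics, not necessarily the subset the project recommends:

```python
# Sketch of two standard authentication performance metrics:
# FAR = fraction of impostor attempts wrongly accepted,
# FRR = fraction of genuine attempts wrongly rejected.
def far_frr(attempts):
    """attempts: list of (is_legitimate_user, was_accepted) pairs."""
    impostor = [accepted for legit, accepted in attempts if not legit]
    genuine = [accepted for legit, accepted in attempts if legit]
    far = sum(impostor) / len(impostor)                # impostors accepted
    frr = sum(1 - a for a in genuine) / len(genuine)   # genuine users rejected
    return far, frr

attempts = [(True, True), (True, False), (True, True), (True, True),
            (False, False), (False, True), (False, False), (False, False)]
far, frr = far_frr(attempts)
print(far, frr)  # 0.25 0.25
```

A lower FAR favors security while a lower FRR favors usability, which is exactly the tension between security and usability that implicit authentication metrics must capture.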

Metrics and Adversary Models for Implicit Authentication - Executive Summary

2016 Capstone Projects

Prevalence of PII data in the clear on personal computers

Sponsor:  Intersections, Inc.
Presented by: Lieyong Zhou & Xi Zheng

Project Description:
Problem:  People commonly store personal data (SSNs, credit card numbers, user names and passwords, etc.) in the clear on their devices, including mobile phones, laptops, desktops, and tablets.

Goals:

  • Develop methods to scan personal computers (with the users' consent), search for personal data in the clear, and categorize the type of data found.
  • The resulting data will be used to determine how prevalent the issue is, determine what type of data is most commonly at risk, and identify ways of increasing user awareness.
  • The project will focus on Windows machines.
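
The scanning method in the first goal can be sketched as regular-expression matching over file contents, with a Luhn check to cut false positives on candidate credit card numbers. The patterns and categories below are illustrative assumptions, not the project's actual detection rules:

```python
# Minimal sketch of scanning text for personal data in the clear:
# regex patterns per PII category, plus a Luhn checksum to filter
# card-number false positives. Patterns are illustrative only.
import re

SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum over the digits of a candidate card number."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = sum(d if i % 2 == 0 else (d * 2 - 9 if d * 2 > 9 else d * 2)
                for i, d in enumerate(digits))
    return total % 10 == 0

def scan_text(text):
    """Return (category, match) pairs for PII found in the clear."""
    findings = [("SSN", m) for m in SSN_RE.findall(text)]
    findings += [("Credit card", m.strip()) for m in CARD_RE.findall(text)
                 if luhn_valid(m)]
    return findings

sample = "ssn 123-45-6789 and card 4111 1111 1111 1111"
print(scan_text(sample))
```

A real scanner would walk the filesystem and handle many file formats; categorizing matches as above is what supports the second goal of measuring which data types are most commonly at risk.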

Data Subject Notice and Consent under the EU GDPR

Sponsor:  PrivacyCheq
Presented by: Jonathan Liao, Vijay Kalani, Arnab Kumar

Project Description:
Problem:  The EU GDPR (General Data Protection Regulation) is going to change requirements around notice and consent when collecting and processing data associated with EU residents.  This capstone will take a close look at these changes and work with PrivacyCheq to identify opportunities to refine and extend its current product portfolio, in particular its recently launched ConsentCheck GDPR "Compliance Development Kit."

Goals: The objective of this capstone is to investigate what effect GDPR-grade notice and consent might have on user experience if/when it is applied to Americans (arguably accustomed to a less rigorous notice/consent experience).  What would be the level of acceptance of, or delight with, enhanced notice, more granular consent, and added benefits such as the rights of access, erasure, and portability?

2015 Capstone Projects

How Transparency Affects User Preferences and Behavior

Sponsor:  Lufthansa
Presented by Scott Stevenson, Chunyue Du, and Rahul Yadav

The Lufthansa Team: Chunyue Du, Scott Stevenson, Bettina Kraemer (Lufthansa), Lorrie Cranor, and Rahul Yadav

Tales from the Crypton: An Evaluation of SpiderOak's Novel Authentication Mechanisms

Sponsor:  SpiderOak
Presented by: Cameron Boozarjomehri and Hsin Miao

The SpiderOak Team: Hsin Miao, Norman Sadeh, Cameron Boozarjomehri

2014 Capstone Projects

A Taxonomy of Privacy Notices

Sponsor:  Facebook
Presented by Adam Durity and Aditya Marella

Mobile Location Analytics Opt-out

Sponsor:  The Future of Privacy Forum
Presented by: Pranshu Kalvani, Chao Pan, and Weisi Dai

Privacy Center

Sponsor:  American Express
Presented by: Sakshi Garg, Zhipeng Tian, Ziewi Hu
