Privacy

Privacy in Online Dating

How do you manage your privacy in online dating? Chances are that if you use online dating or have considered using it, this is an issue you’ve given some thought. And you wouldn’t be alone, as privacy issues in online dating have appeared in the media—two summers ago, during the Rio Olympics, privacy in online dating made headlines when a Tinder user posted screenshots of Olympians’ profiles on social media, and a journalist collected identifying information about closeted gay Olympians through Grindr. In September, a journalist requested her personal data from Tinder and received 800+ pages, including information about her Facebook and Instagram activity. And more recently, researchers have revealed security vulnerabilities in a number of online dating apps, including ways that users may be vulnerable due to sensitive information they disclose on the site.

These events and others show that individual users can’t control all privacy-related risks when using online dating. To understand how users reason about privacy risks they can potentially control through decision making, Lab Ph.D. student Camille Cobb and Lab Faculty Co-Director Yoshi Kohno studied online dating users’ perceptions about and actions governing their privacy in “How Public is My Private Life? Privacy in Online Dating.” The researchers surveyed 100 participants about how they handle their own and other users’ privacy; then, based on themes raised in survey responses, they conducted follow-up interviews and analyzed a sample of Tinder profiles.


Based on the survey of 100 online dating users and interviews with 14 of those participants, the researchers found that when choosing profile content, looking people up, and taking screenshots of messages or profiles, users may face complex tradeoffs between preserving their own or others’ privacy and other goals. Users described weighing privacy considerations—including the risk of feeling awkward, screenshots and data breaches, stalking, and their profile being seen by a friend or co-worker—against goals like getting successful matches, preserving information that may be sentimental if the match is successful, safety, and avoiding scams.

These tradeoffs are complex, and involve users’ privacy decisions beyond just the dating app. For example, a user concerned about privacy on social media like Facebook might change their name to something unusual, unique, and hard to guess, but if a dating service pulls that name into a user’s profile, that unique name makes the user easier to find outside of the dating app. Beyond the name they used, users also experienced tradeoffs around the amount of information to include in their profile. Including more information could make users more easily searchable outside of the dating app, while not including any identifiable information could run the risk, in one user’s case, of being mistaken for a bot.

Focusing on users’ concerns about “searchability,” or the risk of being identified elsewhere online, the researchers analyzed 400 Tinder profiles. Using techniques readily available to any Tinder user, even without technical knowledge, the researchers were able to find 47% of the users. And having an account directly linked to another account, or mentioning a username for another account in the profile, increased the chance of being found to 80%. These results support concerns suggested in the survey; the researchers were able to find a larger portion of people with unique names, echoing a survey respondent’s concern that having a unique name would make her more identifiable.

Discussing the privacy considerations and tradeoffs that users described experiencing, and in light of their analysis of profiles’ searchability, the researchers suggest a number of avenues to explore that could help online dating users make decisions around privacy. These could include restricting the number of screenshots a user can take per day, allowing users to remove matches, and, more broadly, implementing privacy awareness campaigns for users.

This paper was presented at the 26th International World Wide Web Conference and is available here.

Exploring ADINT: Using Ad Targeting for Surveillance on a Budget

New research by former CSE Ph.D. student Paul Vines, Lab Faculty Associate Franzi Roesner, and Faculty Co-Director Yoshi Kohno demonstrates how targeted advertising can be used for personal surveillance.

From “Exploring ADINT: Using Ad Targeting for Surveillance on a Budget – or – How Alice Can Buy Ads to Track Bob”:

The online advertising ecosystem is built upon the ability of advertising networks to know properties about users (e.g., their interests or physical locations) and deliver targeted ads based on those properties. Much of the privacy debate around online advertising has focused on the harvesting of these properties by the advertising networks. In this work, we explore the following question: can third-parties use the purchasing of ads to extract private information about individuals? We find that the answer is yes. For example, in a case study with an archetypal advertising network, we find that — for $1000 USD — we can track the location of individuals who are using apps served by that advertising network, as well as infer whether they are using potentially sensitive applications (e.g., certain religious or sexuality-related apps). We also conduct a broad survey of other ad networks and assess their risks to similar attacks. We then step back and explore the implications of our findings.

The Tech Policy Lab plans to work with the ADINT research team to explore the policy implications of this research, examining potential recommendations for issues raised by this new form of personal surveillance.

More information can be found on the team’s website, and the UW News and UW CSE releases. The paper will be presented at ACM’s Workshop on Privacy in the Electronic Society later this month and can be found here.

Spring Distinguished Lecture with Kate Crawford: AI Now

Please join the Tech Policy Lab for a lecture on the social and political questions for artificial intelligence with Kate Crawford on Tuesday, March 6 at 7:00 pm in Kane Hall.

Privacy’s Past & Future: Discussion with Chris Hoofnagle

12:30-1:20 pm.
William H. Gates Hall Room 117 (room changed).
Lunch provided.
Please RSVP to emcr@uw.edu.

Toys That Listen – CHI 2017

What do teddy bears, My Friend Cayla, and Barbie have in common? They are all toys connected to the internet that can listen, overhearing what goes on in the home. Security breaches and the privacy challenges of these devices are regularly in the news. During the holiday season of 2015, Hello Barbie faced significant pushback from privacy advocates, and the companies involved, Mattel and ToyTalk, were responsive to concerns. This past holiday season, a complaint was filed with the Federal Trade Commission over My Friend Cayla’s privacy failures, and recently the doll was banned in Germany. Just this week it was revealed that CloudPets teddy bears had left millions of recordings of parents’ and children’s conversations easily accessible online.

Describing them as Toys That Listen, our team at the Tech Policy Lab sought to better understand their privacy and security implications. We began with a hackathon investigating the security of toys like My Friend Cayla, Hello Barbie, CogniToys Dino, and others. We also sought to understand how parents and children viewed their privacy around these toys. We conducted interviews with parent-child pairs in which they interacted with Hello Barbie and CogniToys Dino, shedding light on children’s expectations of the toys’ “intelligence” and parents’ privacy concerns and expectations for parental controls. We found that children were often unaware that others might be able to hear what was said to the toy, and that some parents draw connections between the toys and similar tools not intended as toys (e.g., Siri, Alexa) with which their children already interact. Our findings illuminate people’s mental models and experiences with these emerging technologies and provide a foundation for recommendations to toy designers and policy makers. Read the paper (forthcoming in CHI 2017).

More information about our work is available through conferences we have participated in. In February we led a discussion on privacy and the connected home at Start With Privacy, a conference organized by the Washington State Office of Privacy and Data Protection. We also joined a panel hosted by the Future of Privacy Forum and Family Online Safety Institute on Kids & the Connected Home, and highlighted how the portability of toys can lead to children bringing new privacy concerns to their friends’ houses.

For questions about this project email emcr@uw.edu.