Blog

Toys That Listen – CHI 2017


What do teddy bears, My Friend Cayla, and Barbie have in common? They are all internet-connected toys that can listen, overhearing what goes on in the home. Security breaches and the privacy challenges of these devices are regularly in the news. During the 2015 holiday season, Hello Barbie faced significant pushback from privacy advocates, and the companies involved, Mattel and ToyTalk, were responsive to concerns. This past holiday season, a complaint was filed with the Federal Trade Commission over My Friend Cayla’s privacy failures, and the doll was recently banned in Germany. Just this week it was revealed that millions of recordings of parents’ and children’s conversations made by CloudPets’ teddy bears had been easily accessible online.

Describing them as Toys That Listen, our team at the Tech Policy Lab sought to better understand their privacy and security implications. We began with a hackathon investigating the security of toys like My Friend Cayla, Hello Barbie, CogniToys Dino, and others. We also sought to understand how parents and children viewed their privacy around these toys. We conducted interviews with parent-child pairs in which they interacted with Hello Barbie and CogniToys Dino, shedding light on children’s expectations of the toys’ “intelligence” and parents’ privacy concerns and expectations for parental controls. We found that children were often unaware that others might be able to hear what was said to the toy, and that some parents drew connections between the toys and similar tools not intended as toys (e.g., Siri, Alexa) with which their children already interact. Our findings illuminate people’s mental models of and experiences with these emerging technologies and provide a foundation for recommendations to toy designers and policy makers. Read the paper (forthcoming in CHI 2017).

More information about our work is available through conferences we have participated in. In February we led a discussion on privacy and the connected home at Start With Privacy, a conference organized by the Washington State Office of Privacy and Data Protection. We also joined a panel on Kids & the Connected Home, hosted by the Future of Privacy Forum and the Family Online Safety Institute, where we highlighted how the portability of toys leads children to bring new privacy concerns to their friends’ houses.

For questions about this project, email emcr@uw.edu.

Director Calo Testifies on Augmented Reality before U.S. Senate

calo-testimony-senate-11-16-16

Lab Faculty Co-Director Ryan Calo testified before the U.S. Senate Committee on Commerce, Science, and Transportation at a hearing exploring augmented reality. Watch the hearing here and read his testimony below.

“Chairman Thune, Ranking Member Nelson, and Members of the Committee, thank you for the opportunity to discuss the promise and perils of augmented reality.

Augmented reality (AR) refers to a mobile or embedded technology that senses, processes, and outputs data in real time, recognizes and tracks real-world objects, and provides contextual information by supplementing—or in some cases, replacing—human senses. AR differs from so-called virtual reality in that AR users continue to experience most of their physical environment. AR has many positive applications, from training tomorrow’s workforce, to empowering people with disabilities. But the technology also raises novel or acute policy concerns that companies and policymakers must address if AR is to be widely adopted and positively affect American society.

The UW Tech Policy Lab is a unique, interdisciplinary research unit at the University of Washington that aims to help policymakers develop wise and inclusive technology policy. We have studied AR and its impact on diverse populations and discuss our findings in detail in the appended whitepaper Augmented Reality: A Technology and Policy Primer.

Our research suggests that AR raises a variety of questions of law and policy, including around privacy, free speech, and novel forms of distraction and discrimination. For example: Will the constant recording of a user’s environment give hackers, companies, and government unparalleled access to the bedroom, the boardroom, and other private spaces? Could the superimposition of information over reality render the AR user vulnerable or unsafe? And are there situations—such as job interviews—where knowing everything about an individual could result in discrimination or subject the AR user to legal liability? Industry must design AR products with these and many other questions in mind.

Thank you again for the interest in our research and the opportunity to appear before the Committee. I look forward to your questions.”

Knight Foundation Demo Day – Peter Ney

This past July, Lab member Peter Ney presented his work on SeaGlass, a cell-site simulator detection system, at the Knight Foundation Prototype Demo Day in Miami. In the post below, Peter discusses his experience at the event.

In February, the Knight Foundation awarded the Prototype Grant to the SeaGlass team (co-led by Ian Smith and me). This grant is designed to fund early-stage projects over a six-month sprint, and the Demo Day was a chance to showcase our work to the Knight Foundation and the other grantees.

Cell phone surveillance became a hot topic after it was reported that law enforcement had been using cell phone tracking devices, called cell-site simulators, to locate suspects for years with little judicial oversight. To provide independent information on when, where, and how often cell-site simulators are used, we wanted to develop a measurement system that could detect them. We used the grant from Knight to build a monitoring system called SeaGlass that can detect cell-site simulators across a city and run for long periods of time. After it was built, we tested SeaGlass by deploying it for two months in Seattle and Milwaukee. More details on SeaGlass will come in a future write-up, where we will also cover the technical details of cell-site simulators and their legal implications. The talk received lots of great questions and feedback on future work.
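The full technical details will be in that write-up, but the intuition behind this kind of detection can be sketched in a few lines. The hypothetical Python below (the scan format, field names, and drift threshold are my illustrative assumptions, not SeaGlass’s actual implementation) flags cell IDs that broadcast far from every location where they were previously observed, one anomaly a cell-site simulator impersonating a real tower can produce:

```python
# Hypothetical sketch of baseline-vs-anomaly detection for cell towers.
# Data format and thresholds are illustrative assumptions, not SeaGlass's design.
from collections import defaultdict
from math import hypot

def build_baseline(scans):
    """Record every location at which each cell ID has been observed."""
    baseline = defaultdict(list)
    for s in scans:  # each scan: {"cell_id": ..., "lat": ..., "lon": ...}
        baseline[s["cell_id"]].append((s["lat"], s["lon"]))
    return baseline

def flag_anomalies(scans, baseline, max_drift=0.05):
    """Flag scans whose cell ID appears far from everywhere it was seen before.

    A legitimate tower broadcasts from a fixed site, so a known cell ID
    suddenly appearing kilometers away is suspicious.
    """
    flagged = []
    for s in scans:
        known = baseline.get(s["cell_id"])
        if not known:
            continue  # brand-new cell ID: needs more observations to judge
        drift = min(hypot(s["lat"] - lat, s["lon"] - lon) for lat, lon in known)
        if drift > max_drift:  # roughly 5 km of latitude; an assumed cutoff
            flagged.append(s)
    return flagged
```

A real deployment needs much more than this (signal-strength baselines, tolerance for legitimate network changes, sensor noise), which is part of what made the two-month city deployments valuable.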

In addition to presenting, it was also fun to see the progress that the other teams made in just six months. There were lots of exciting projects, all under the broad theme of data. Here are a few of my favorite talks.

Michael Skirpan, of the University of Colorado, designed an immersive theater performance on the future of technology and the ethics of personal data collection called Quantified Self, and he described the technical details and lessons learned from the show. Before the performance, attendees grant Quantified Self access to data stored in their online profiles. During the show, attendees wear RFID bracelets so that their data can follow them through the many exhibits showcasing how that data can be used. One fun example was a mock job interview in which the interviewer researches you on social media. By using attendees’ real data, the show turned something as abstract as “data” into something personally understood and salient.
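As a rough illustration of that mechanism, the hypothetical sketch below keys an attendee’s consented profile data to their bracelet’s tag ID so an exhibit can personalize itself on scan (every name and structure here is my assumption, not the show’s actual system):

```python
# Hypothetical sketch: an RFID tag ID keys into the data an attendee
# agreed to share at check-in, so each exhibit can look it up on scan.
profiles = {}  # tag_id -> consented profile data

def check_in(tag_id, social_data):
    """Associate a bracelet with the data the attendee agreed to share."""
    profiles[tag_id] = social_data

def mock_interview(tag_id):
    """The mock interviewer 'researches' the attendee from their own posts."""
    posts = profiles.get(tag_id, {}).get("posts", [])
    return [f"I noticed you posted {post!r}. Care to comment?" for post in posts]

check_in("tag-042", {"posts": ["never working a 9-to-5 again!!"]})
print(mock_interview("tag-042"))
```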

Another exciting project was led by Surya Mattu of ProPublica. Surya has been doing a lot of work reporting on machine bias and how black-box algorithms affect our lives. He demoed a tool his team has been developing to help people learn about data discrimination and data bias. The tool lets people analyze their own data to determine whether it is biased along sensitive dimensions like race or gender. Surya hoped that by using the tool, people could gain a better understanding of their data and learn how to avoid using it in harmful or discriminatory ways.
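I don’t know the tool’s internals, but a minimal sketch of the kind of check such a tool might perform is a selection-rate comparison across groups (the column names and 0.8 cutoff below are my assumptions, borrowed from the common “four-fifths” rule of thumb, not Surya’s actual code):

```python
# Minimal sketch of a disparate-impact check on a dataset of records.
# Column names and the 0.8 threshold are illustrative assumptions.
from collections import defaultdict

def selection_rates(records, group_key="gender", outcome_key="approved"):
    """Compute the rate of positive outcomes for each group."""
    totals, positives = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[group_key]] += 1
        positives[r[group_key]] += bool(r[outcome_key])
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact_ratio(records, **keys):
    """Lowest group rate over highest; values below 0.8 are a common red flag."""
    rates = selection_rates(records, **keys)
    return min(rates.values()) / max(rates.values())

records = [
    {"gender": "F", "approved": 1}, {"gender": "F", "approved": 0},
    {"gender": "M", "approved": 1}, {"gender": "M", "approved": 1},
]
print(disparate_impact_ratio(records))  # 0.5 here, well under 0.8
```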

Personalized data has also been used by political campaigns to create targeted political ads. To better understand this targeting, Young Mie Kim, a professor in the School of Journalism at the University of Wisconsin, discussed Project DATA, an effort to collect and analyze advertising data sent to real voters. To collect this data, Professor Kim worked with volunteers who recorded which political ads were displayed to them as they surfed the web. Using this ad data, she hopes we can better understand how campaigns disseminate information and why particular voters are targeted.
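At its simplest, that kind of collection reduces to logging one row per ad impression per consenting volunteer; a hypothetical sketch (the fields are my assumptions, not Project DATA’s actual schema):

```python
# Hypothetical sketch of per-volunteer ad-impression logging.
# Field names are illustrative assumptions, not Project DATA's schema.
import csv
import time

def log_impression(writer, volunteer_id, page_url, sponsor, ad_text):
    """Append one observed political ad to a shared log for later analysis."""
    writer.writerow([time.time(), volunteer_id, page_url, sponsor, ad_text])

with open("ad_impressions.csv", "a", newline="") as f:
    w = csv.writer(f)
    log_impression(w, "vol-017", "https://example.com/news",
                   "Citizens for Example", "Vote yes on Measure 1")
```

With rows like these aggregated across volunteers, one can ask which sponsors reach which kinds of voters, and how often.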

Kids & Connected Toys


This week Emily McReynolds will be speaking at the Future of Privacy Forum event Kids & the Connected Home. One of the Tech Policy Lab’s current projects focuses on the privacy and security implications of connected toys, Toys That Listen. Follow the discussion on Twitter at #InternetofToys.

Hello Barbie, the Amazon Echo, and the home robot Jibo are part of a new wave of connected toys and gadgets for the home that listen. Unlike the smartphone, these devices are always on, blending into the background until needed by the adult or child user. We do not yet know all the information our new toys are collecting, storing, or disclosing. With an intended audience of designers and regulators, this project brings an interdisciplinary group of experts together to build a set of consumer protection best practices for the design and user control of connected devices in the home.

The potential benefits of household intelligent devices may be real: these technologies claim to increase convenience and cleanliness, and even to improve health. In the lab setting, at-home robots have been tested as aids for individuals with dementia or in rehabilitation. But just as the benefits may be game-changing and exciting, the threats of harm will be novel and non-trivial. Attacks on consumer privacy via the Internet are pervasive, and these issues intensify when devices record information from inside the home.

Our goal is to preempt privacy problems before they occur. Consumer privacy protection laws have often been reactive, drafted or amended only after privacy was breached and individuals were harmed. The Video Privacy Protection Act, for example, was enacted after the disclosure of an individual’s video rental history demonstrated the dangers of distributing such records. The recent Netflix settlement under the same Act shows that these issues are alive and well today. The Children’s Online Privacy Protection Act (COPPA) responds to adults’ fears about children being online, and the new internet-connected toys revive those fears. And while legislation like California’s Online Privacy Protection Act has been found to extend from its initial web page privacy policy requirement to apps on devices, delivering privacy notices on toys such as Hello Barbie is more difficult to design. With household devices able to collect increasingly detailed information about what we watch, listen to, talk about, or purchase from the comfort of home, now is the time to identify and implement best practices.

PokemonGO and Policy for Augmented Reality Applications


With widespread adoption, PokemonGO has brought the novel policy considerations of augmented reality to a wide audience. Over the last week, members of the Lab have highlighted some of these issues. Co-Director Calo noted the novel nature of a game that requires players to physically travel, and the potentially actionable nuisance created by the developers (Verge). In an article in New Scientist, Emily McReynolds highlighted the benefits of including a diverse set of stakeholders in the design of these applications.

In the Tech Policy Lab’s Augmented Reality Law and Policy Primer, we provide a preview of the policy implications of this developing technology and make conditional recommendations. Our key findings included:

1. AR exists in a variety of configurations, but in general, AR is a mobile or embedded technology that senses, processes, and outputs data in real-time, recognizes and tracks real-world objects, and provides contextual information by supplementing or replacing human senses.
2. AR systems will raise legal and policy issues in roughly two categories: collection and display. Issues tend to include privacy, free speech, and intellectual property as well as novel forms of distraction and discrimination.
3. We recommend that policymakers—broadly defined—engage in diverse stakeholder analysis, threat modeling, and risk assessment processes. We recommend that they pay particular attention to a) the fact that adversaries succeed when systems fail to anticipate behaviors, and b) the fact that not all stakeholders experience AR the same way.
4. Architectural/design decisions—such as whether AR systems are open or closed, whether data is ephemeral or stored, where data is processed, and so on—will each have policy consequences that vary by stakeholder.