Blog

Toys That Listen and the Internet of Things

Hello Barbie, Amazon Echo, and the home robot Jibo are part of a new wave of connected toys and gadgets for the home that listen. Unlike the smartphone, these devices are always on, blending into the background until needed by the adult or child user. We do not yet know all the information our new toys are collecting, storing, or disclosing. With an intended audience of designers and regulators, this project brings an interdisciplinary group of experts together to build a set of consumer protection best practices for design and user control of connected devices in the home. We are grateful to the Rose Foundation Consumer Privacy Rights Fund for funding this work.

Our study Toys That Listen: A Study of Parents, Children, and Internet-Connected Toys, forthcoming at CHI 2017, explores people's mental models of and experiences with these emerging technologies to help inform the future design of interactive, connected toys and gadgets.

Our goal is to preempt privacy problems before they occur. Consumer privacy protection laws have often been reactive, drafted or amended only after privacy was breached and individuals were harmed. The Video Privacy Protection Act, for example, grew out of the disclosure of an individual's video rental history. The recent Netflix settlement under the same Act shows that these issues are alive and well today. The Children's Online Privacy Protection Act (COPPA) responds to fears adults have about children being online, and new internet-connected toys like Hello Barbie raise these fears anew. While legislation like California's Online Privacy Protection Act has been found to extend from its initial web page privacy policy requirement to apps on devices, delivering privacy notices on toys such as Hello Barbie is harder to design. With household devices able to collect increasingly detailed information about what we watch, listen to, talk about, or purchase from the comfort of home, now is the time to identify and implement best practices.

Tech Policy Lab Distinguished Lecture with Prof. Latanya Sweeney: How Technology Impacts Humans

On December 1, 2015, the Tech Policy Lab hosted Prof. Latanya Sweeney for our Fall Distinguished Lecture. Prof. Sweeney gave a talk titled "How Technology Impacts Humans," illustrating how technology designers have become the new policy makers through the decisions they make when producing the latest gadgets and online innovations.

As a professor at Harvard University, Latanya Sweeney creates and uses technology to assess and solve societal, political, and governance problems, and teaches others how to do the same. One focus area is the scientific study of technology's impact on humankind, and she is the Editor-in-Chief of the newly formed journal Technology Science. She was formerly the Chief Technology Officer at the Federal Trade Commission and is an elected fellow of the American College of Medical Informatics, with almost 100 academic publications, three patents, explicit citations in two government regulations, and three company spin-offs. She has received numerous professional and academic awards and has testified before federal and international government bodies. Professor Sweeney earned her PhD in computer science from the Massachusetts Institute of Technology, the first black woman to do so. She completed her undergraduate degree in computer science at Harvard University. latanyasweeney.org.

Tech Policy Lab Faculty Directors discuss "Responsible innovation: A cross-disciplinary lens on privacy and security challenges"

The Tech Policy Lab's Faculty Co-Directors were the featured speakers for the November installment of the 2015 Engineering Lecture Series, where they discussed what it means to innovate responsibly, particularly with respect to privacy and security.

How do you interact and socialize? How do you conduct business? How do you raise your kids and care for the elderly? All these basic activities are being directly impacted by new technologies that are emerging at an incredible rate. While each new technology brings its own benefits and risks, regulations struggle to catch up.

Augmented Reality – Technology & Policy Primer


This whitepaper identifies some of the major legal and policy issues that augmented reality (AR) may present as a novel technology and outlines some conditional recommendations to help address those issues. Our key findings include:

1. AR exists in a variety of configurations, but in general, AR is a mobile or embedded technology that senses, processes, and outputs data in real-time, recognizes and tracks real-world objects, and provides contextual information by supplementing or replacing human senses.

2. AR systems will raise legal and policy issues in roughly two categories: collection and display. Issues tend to include privacy, free speech, and intellectual property as well as novel forms of distraction and discrimination.

3. We recommend that policymakers—broadly defined—engage in diverse stakeholder analysis, threat modeling, and risk assessment processes. We recommend that they pay particular attention to: a) the fact that adversaries succeed when systems fail to anticipate behaviors; and that, b) not all stakeholders experience AR the same way.

4. Architectural/design decisions—such as whether AR systems are open or closed, whether data is ephemeral or stored, where data is processed, and so on—will each have policy consequences that vary by stakeholder.

How Information Asymmetry Helped Find Abducted Kids

Lab Co-Director Ryan Calo is featured in a Washington Post article describing how police used Spotify and other streaming services to locate abducted kids in Mexico. Calo explains:

“This is a classic case of ‘information asymmetry,’ said University of Washington law professor Ryan Calo, meaning when companies, government agencies or police departments have more information about your online habits than you even realize is out there.

‘There’s an enormous underestimation of your digital footprint,’ Calo said. ‘You might not realize how much your data is being stored, but you also might not realize how many parties have access to it. Think about all the uses to which this information can be put.'”