Blog

Tech Policy Lab Distinguished Lecture: Responsible Innovation in the Age of Robots & Smart Machines

Jeroen van den Hoven

Many of the things we do to each other in the 21st century – both good and bad – we do by means of smart technology. Drones, robots, cars, and computers are cases in point. Military drones can help protect vulnerable, displaced civilians; at the same time, drones that operate without clear accountability give rise to serious moral questions when unintended deaths and harms occur. More generally, the social benefits of our smart machines are manifold; the potential drawbacks and moral quandaries are extremely challenging. In this talk, I take up the question of responsible innovation, drawing on the European Union experience and reconsidering the relations between ethics and design. I shall introduce ‘Value Sensitive Design’, one of the most promising approaches, and provide illustrations from robotics, AI, and drone technology to show how moral values can be used as requirements in technical design. By doing so we may overcome problems of moral overload and conflicting values by design.

Jeroen van den Hoven is a professor of Ethics and Technology at Delft University of Technology. He was the first scientific director of 3TU.Ethics (2007–2013) and is currently editor-in-chief of Ethics and Information Technology. In 2009 he won both the World Technology Award for Ethics and the IFIP prize for ICT and Society for his work on ethics and ICT.

Cory Doctorow: Alice, Bob and Clapper: What Snowden Taught us About Privacy

Cory Doctorow

It’s the 21st century and the Internet is the nervous system of the information age. Treating it as a platform for jihad recruitment that incidentally does some ecommerce and video on demand around the edges is blinkered, depraved indifference.

The news that the world’s spies have been industriously converting every wire, fiber and chip into part of a surveillance apparatus actually pales in comparison to the news that the NSA spends $250,000,000 every year to undermine the security of the devices we trust our lives to — literally.

Can technology give us privacy, or only take it away? Are we headed for Orwell’s future? Huxley’s? Kafka’s? Do we have to choose, or do we get all three (if we’re not careful)?

The Center for Digital Arts and Experimental Media (DXARTS), Henry Art Gallery, and the UW Tech Policy Lab recently sponsored a talk by author and activist Cory Doctorow: “Alice, Bob and Clapper: What Snowden taught us about privacy.”

Spotlight on Tech Policy Lab Scholar Adam Lerner

Adam Lerner

The Tech Policy Lab is looking forward to new projects with the arrival of the 2014-2015 academic year. This year we have Adam Lerner, a Ph.D. student in Computer Science & Engineering at the University of Washington, working on privacy technologies. Based in Lab Director Tadayoshi Kohno’s UW Security and Privacy Research Lab, Adam studies censorship, surveillance and privacy in the context of the global Internet and emerging technologies.

Adam spent the spring in Berkeley, California, developing a new system, Rangzen, with the De Novo Group (http://denovogroup.org/). Rangzen is a collaboration with Yahel Ben-David (De Novo Group, Berkeley EECS), Barath Raghavan (De Novo Group, ICSI), Giulia Fanti (Berkeley EECS), and Eric Brewer (Berkeley EECS).

Rangzen is a smartphone app that lets people communicate when there are no cell networks and no Internet, such as during heavy governmental censorship or a natural disaster. It’s a mesh networking platform, which means it allows phones to propagate messages through gossip, passing along every message they’ve heard to other nearby phones over Bluetooth or Wi-Fi. It fights spam and propaganda by prioritizing messages based on social relationships: when a message arrives at a phone, Rangzen decides how much to trust that message based on how many friends the owners of the two phones have in common. In anti-censorship mode, it preserves users’ and authors’ anonymity, using cryptography to check how many friends two users have in common without revealing who those friends are.
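To make the prioritization idea concrete, here is a minimal sketch in Python. It is not Rangzen’s actual code: the names, the scoring rule, and the cap on expected mutual friends are invented for illustration, and the overlap is computed in the clear here, whereas the real system counts mutual friends cryptographically so that neither phone learns who the shared friends are, only how many.

```python
# Hypothetical sketch of trust-based message prioritization in a gossip network.
from dataclasses import dataclass


@dataclass
class Message:
    text: str
    trust: float = 0.0  # trust score the message has accumulated so far


def mutual_friend_count(my_friends: set, peer_friends: set) -> int:
    # Illustration only: the real protocol reveals just the size of this
    # intersection, never the names in it.
    return len(my_friends & peer_friends)


def score_incoming(messages, my_friends, peer_friends, max_expected_mutuals=20):
    """Score messages relayed by one nearby peer and sort them so that
    messages vouched for by well-connected friends are kept and forwarded
    first, while spam from strangers sinks to the bottom."""
    overlap = mutual_friend_count(my_friends, peer_friends)
    peer_trust = min(overlap / max_expected_mutuals, 1.0)
    for m in messages:
        # A message keeps the best trust it has earned on any hop so far.
        m.trust = max(m.trust, peer_trust)
    return sorted(messages, key=lambda m: m.trust, reverse=True)


if __name__ == "__main__":
    mine = {"alice", "bob", "carol", "dave"}
    peer = {"bob", "carol", "erin"}          # two friends in common
    inbox = [Message("protest at the square, 5pm", trust=0.6),
             Message("buy cheap watches!!!", trust=0.0)]
    for msg in score_incoming(inbox, mine, peer):
        print(f"{msg.trust:.2f}  {msg.text}")
```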

We asked Adam how he became interested in working on anti-censorship programs:

“Anti-censorship systems are one of those areas where technical solutions can be really significant. They’re not the whole pie – civil liberties don’t magically emerge from an app – but they’re definitely a piece of it. The key is to get the threat model right. If you build a circumvention system that defeats censorship which isn’t practiced anywhere, you’re probably not helping anyone. What I liked about working with De Novo Group is that they want to build systems that are innovative research, and actually apply those systems in the real world.”

Privacy and Security Concerns for the Smart Watch Age

Smart watch (photo credit: Kārlis Dambrāns)

The Internet of Things (IoT) is quickly expanding to include the next big product in its interconnected family – the smart watch. While these high-tech watches are not necessarily new, recent releases from companies like Samsung, LG, and Apple have given them more mainstream public appeal and market share. In welcoming the watches to the Internet of Things, customers are also introduced to the various privacy and security questions that researchers and governments have scrutinized in IoT. This past month, Tech Policy Lab members participated in the Workshop on Usable Privacy & Security for wearable and domestic ubIquitous DEvices (UPSIDE), which brought together academics from around the country to discuss various IoT privacy issues. The FTC addressed these issues on a larger stage last fall, hosting an IoT-focused workshop to identify and address privacy and security problems. As the smart watch maneuvers its way onto the wrists of customers, wearers should take note of these problems.

The health and fitness craze that inspired products like FitBit and Garmin’s Connect has strongly influenced the design of recent smart watches. Products like the Samsung Gear Fit and the Apple smart watch have incorporated pedometers and heart rate monitors that let users measure their daily activity. With the increasing prevalence of health monitoring in everyday technology, it was no surprise that the FTC devoted one of its four workshop panels to the topic last fall. The “Connected Health and Fitness” panel brought in business and academic experts to discuss the benefits as well as the data privacy and security risks involved. As the smart watch’s popularity increases, preventing these risks will become imperative. Data theft could expose location data and activity patterns extracted from pedometer or heart rate readings. Risks could even come from the company itself. What would happen if a health insurance provider were sold health data collected from a prospective applicant’s smart watch? Protecting this data from both unwanted collection and unwanted use will be necessary to ensure privacy in the age of the internet-enabled watch.

One of the papers presented at UPSIDE hunted for privacy flaws in IoT products like Google Glass and the smart watch. Titled “When Everyone’s A Cyborg: Musings on Privacy and Security in The Age of Wearable Computing,” the paper by Serge Egelman highlighted one of the major issues – “the continuous capture of audio and video.” While UPSIDE only provides the abstract for the paper, it is easy to see where audio and video recording could breach a smart watch user’s privacy. A user wants their watch to listen while they are giving it instructions, but unwarranted recording would present a security risk. The same is true for the camera. Unchecked recording devices could leak the extremely personal data of an unknowing user. As with the health and fitness risks, this data could be misused by a thief or a third-party company for reconnaissance, blackmail, or harassment.
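One way to picture the line between wanted and unwanted capture is a simple gate that only lets the microphone run while the user is deliberately invoking it. The sketch below is purely hypothetical – the class and method names are invented and no real smart watch API is implied – and just illustrates the design point that listening should be tied to an explicit user action and time-limited.

```python
# Hypothetical sketch: audio capture allowed only during an explicit,
# short-lived, user-initiated session (e.g., press-and-hold on the watch).
import time


class RecordingGate:
    MAX_SESSION_SECONDS = 15.0  # hypothetical cap on any single capture

    def __init__(self):
        self._session_started_at = None

    def user_pressed_talk_button(self):
        # Capture is tied to a deliberate physical gesture.
        self._session_started_at = time.monotonic()

    def user_released_talk_button(self):
        self._session_started_at = None

    def may_capture_audio(self):
        if self._session_started_at is None:
            return False  # no background or "always listening" capture
        if time.monotonic() - self._session_started_at > self.MAX_SESSION_SECONDS:
            self._session_started_at = None  # session expired
            return False
        return True


if __name__ == "__main__":
    gate = RecordingGate()
    print(gate.may_capture_audio())   # False: an idle watch stays silent
    gate.user_pressed_talk_button()
    print(gate.may_capture_audio())   # True: user is actively dictating
    gate.user_released_talk_button()
    print(gate.may_capture_audio())   # False again
```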

Privacy in the land of smart watches is not entirely hopeless, however, as the watch may fare better than other IoT devices in some respects. Specifically, updates and patches to address security issues could be much more common on a watch than on a product like an electrical grid monitor. The issue is something the FTC raised in its questions to the public ahead of its conference. Two of those questions reflect the FTC’s attention to the issue: “How can companies update device software for security purposes or patch security vulnerabilities in connected devices, particularly if they do not have an ongoing relationship with the consumer?” and “Do companies have adequate incentives to provide updates or patches over products’ lifecycles?”

In both cases the smart watch suggests an optimistic answer. On the first question, the smart watch is likely to be exempt, as it engages its user enough to form an ongoing relationship. By delivering notifications, sounding morning alarms, and even telling the user the time, it directly interacts with a user’s life on a daily or even hourly basis. Adequate incentives to provide updates are present as well. The smart watch is one of the newest hardware endeavors among competing companies like Apple, Samsung, and Google. For one of these companies, or even a smaller one, to have its product succeed, it must convince the customer that the product is better than the rest and worth buying in the first place. A product with unpatched security flaws would do neither.

The smart watch fits into the privacy and security discussions of the IoT just as well as it fits onto a wrist. Prevalent issues such as health monitoring and audio or video capture could cause serious risks to consumers. Eliminating these risks and providing a safe product will be vital to the product’s success, bringing a great new addition to the Internet of Things family.