Eliza Newman-Saul
 
 

Designing for Privacy and Security in Home IoT

 

Team:
Eliza Newman-Saul and Olivia Harold

Timeline:  19 weeks

 

Sponsored by: Substantial Digital Studio

 

Tools:
Illustrator, Keynote, Sketch, InVision, Draw.io and Skype (for remote testing), After Effects, and Premiere Pro

 

 

Deliverable:
UI Specs, Prototypes, Presentation and Research Report

 

 

Features

Devicable is a web application for privacy-conscious users. It is designed to help people quickly understand which settings they might want to disable on home IoT devices and which should concern them. Users can receive step-by-step instructions for modifying each device. The site also has an optional sign-in feature for receiving alerts about your devices.

 
 
 

Understand Tradeoffs

The privacy manager shows what is gained and lost when you adjust a privacy setting on a home device. The information changes and reorders based on your concerns.
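The reordering described above can be sketched in code. This is a hypothetical illustration, not Devicable's actual implementation: the card names, concern tags, and weights are all invented for the example.

```python
# Hypothetical sketch: reorder tradeoff cards by a user's stated concerns.
# Card settings and concern categories are illustrative, not the real data model.

def rank_tradeoffs(cards, concern_weights):
    """Sort tradeoff cards so those touching the user's top concerns come first."""
    def score(card):
        return sum(concern_weights.get(tag, 0) for tag in card["concerns"])
    return sorted(cards, key=score, reverse=True)

cards = [
    {"setting": "Disable voice history", "concerns": ["audio_recording", "data_retention"]},
    {"setting": "Turn off usage analytics", "concerns": ["behavior_profiling"]},
    {"setting": "Block remote access", "concerns": ["hacking", "home_safety"]},
]

# A user worried most about being recorded, then about profiling:
weights = {"audio_recording": 3, "behavior_profiling": 2, "hacking": 1}

for card in rank_tradeoffs(cards, weights):
    print(card["setting"])
```

The key design point the sketch captures is that the cards themselves never change; only their ordering responds to the user's concerns.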

 

Set Privacy Levels

By selecting their IoT devices, users can receive step-by-step instructions for modifying each device to meet a privacy level.


Personalize Recommendations

A user can complete a questionnaire to find the privacy level that is right for her, especially when she is unsure of her privacy concerns. She receives a description of what users like her might care about before receiving step-by-step instructions.

 

Background

 

Connected home devices, or “home IoT,” appeal to passionate technology enthusiasts as well as a growing number of casual users. Voice assistants and smart TVs are becoming more common, yet users struggle to understand how to protect themselves, and whether protection is even necessary. Insecurity in home IoT is prevalent, putting both devices and people at risk. We believe that for the market to continue to grow sustainably, systems must enable concerned users to develop a deeper understanding of connected homes. The goal is to help them make informed decisions that protect their privacy and security.

 
 

How might we create a home IoT interface that addresses users’ concerns about privacy and security?


 

IoT Concerns

 

01 

An in-depth picture of our behaviors and habits. 

02 

Threats to home safety

03 

A connected network vulnerable to hacks and spying

8 Expert Interviews

 
“(IoT) has been designed so that there’s minimal intelligence in the device itself and they are continuing to go out to the cloud or some set of servers. They could have been designed to be much more self contained.”
— Batya Friedman

We researched consent, privacy, and security in home IoT. Using the articles we read, I contacted leading authors in the field, including Batya Friedman (U. of Washington), Gilad Rosner (founder of the Internet of Things Privacy Forum), and Peter Behr (consultant and founder of the Trustmark for IoT and of Thingscon), and we connected with Michelle Change (Electronic Frontier Foundation). Since IoT is an emerging technology, we also talked to engineers who work on privacy teams at major companies and an expert in edge computing. Finally, we talked to a researcher at Google who came from an advertising background and was comfortable with data mining.

 

Below is a diagram I made to understand some of the engineering decisions behind designing IoT:

 
 
[Image: smart home ecosystem diagram]

Primary Research

 

We recruited 12 participants using the following criteria: age 27+, own at least 1 smart device, and self-reported as concerned about privacy and security. We alternated between conducting interview sessions and taking notes for each other.

Research Questions

 

01 How do people evaluate and define privacy risks currently and do their concerns match proven risks?
02 How do people derive value from connected devices?
03 What do people understand about data collection from devices and how does it inform their decisions?
04 What are the complications of protecting digital identity?
05 When is personal privacy violated?

Research Activities

Our sessions combined semi-structured interviews with a matrix card sort and a privacy policy think-aloud.

 

Personas

 
 
[Image: persona matrix]

* Our personas were created by placing our users on a 2 x 2 matrix, graphing them on spectrums from casual user to techie and from troubled to untroubled.
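The quadrant assignment behind the matrix can be sketched in a few lines. This is purely illustrative: the 0-to-1 scores and the midpoint threshold are assumptions, since participants were placed by judgment rather than by a numeric score.

```python
# Illustrative sketch of the 2x2 persona matrix: each participant sits on two
# spectrums (casual <-> techie, untroubled <-> troubled) and falls in a quadrant.
# The 0..1 scores and 0.5 midpoint are invented assumptions for the example.

def persona_quadrant(techie_score, concern_score, midpoint=0.5):
    """Map a participant's two spectrum scores (0..1) to one of four personas."""
    tech = "Techie" if techie_score >= midpoint else "Casual"
    mood = "Troubled" if concern_score >= midpoint else "Untroubled"
    return f"{tech} {mood}"

print(persona_quadrant(0.9, 0.8))  # Techie Troubled
print(persona_quadrant(0.2, 0.7))  # Casual Troubled
```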

 

Techie Troubled

We spoke with several users who are highly concerned about privacy at both an individual and societal level.

Techie Untroubled

This user typically works for a large tech company and feels there is not a lot to worry about. They believe companies do a good job managing confidential information and hacks.

Casual Troubled

Owns a few smart devices and is concerned about privacy. They feel overwhelmed and confused as to what actionable steps they can take.

Casual Untroubled

Delegates privacy to other family members and does not want to deal with technology management.

 

 

 

Understanding the Data

 
[Image: affinity map]
 

The interviews were coded into small chunks and externalized as a collection of sticky notes. Themes were drawn out in blue, and the pink notes were used to brainstorm insights. The notes were moved around and arranged to look for patterns and relationships between participants. The matrix card sorts were compared across users and grouped into three categories: high, medium, and low levels of concern.
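The three-way grouping of card sorts could be sketched as follows. The 1-3 rating scale and the bucket cutoffs are invented for illustration; they are not the actual analysis values.

```python
# Hypothetical sketch of comparing matrix card sorts across participants by
# bucketing each person's average concern rating into high / medium / low.
# The 1..3 ratings and the cutoff values are assumptions for illustration.

def concern_level(ratings, high=2.5, low=1.5):
    """Bucket a participant's 1-3 card-sort ratings by their mean."""
    mean = sum(ratings) / len(ratings)
    if mean >= high:
        return "high"
    if mean >= low:
        return "medium"
    return "low"

participants = {"P1": [3, 3, 2], "P2": [2, 2, 1], "P3": [1, 1, 2]}
levels = {pid: concern_level(r) for pid, r in participants.items()}
```

Grouping by a single summary score like this trades nuance for comparability, which fits the goal of spotting patterns across participants rather than within one person's sort.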

This research was supplemented by a survey that received 628 responses, distributed through the U. of Washington DUB mailing list; the bulk of responses, however, came from users on reddit.com/r/homeautomation and reddit.com/r/googlehome.

Insights

 

01 

CONCERNS ARE LARGELY ON A SOCIETAL LEVEL RATHER THAN AN INDIVIDUAL LEVEL. 

People justified their inaction by saying that they don’t feel like targets. They described themselves as law-abiding citizens, and stated that they would be more concerned in a different situation, such as being a reporter in Russia. They also took comfort in the fact that their data is aggregated. Still, 11/12 participants stated that they were very worried about the effects of data collection on society at large.

“Metadata like the time I sent a text seems harmless, but at a societal level it’s creepy as hell.” (P6)
“Data harvesting can swing elections and encode inequalities in society.” (P8)

02 

IOT TRENDS TOWARDS OPAQUE INTERFACES.

Devices are geared towards completing tasks with minimal interactivity, and are not informative about themselves. Our competitive analysis found that devices such as routers and physical security products almost exclusively have arbitrary, screen-less forms. Established HCI practice holds that limiting feedback diminishes usability, weakens users’ mental models, and widens the gulfs of evaluation and execution.

Opaque form factors are probably not accidental design choices, as evidenced by the quote from P2 below. The ultimate goal of ubiquitous computing is screen-lessness, but the integration and functioning of devices are lagging. There is a mismatch between opaque design trends and the need for easy troubleshooting and for building user trust through transparency.

“I applaud people who go to extremes to research devices and what they can do.” (P9)
“As I become more informed I become more suspicious.” (P2)
 

 

03 

PEOPLE HAVE VAGUE AND MASSIVE IDEAS ABOUT DATA COLLECTION.

Participants struggled to describe what data about them is 'out there', and most simply said “everything.” They also believe that data is never deleted, and every part of their digital identity remains outside their control.

This exact phenomenon was forecasted by our expert, Batya Friedman, who predicted that people would draw large nondescript bubbles if instructed to illustrate their digital identity. Some people spoke about specific hacks that had affected them. Although they knew what information had been leaked, they concluded that now everything was compromised. It was impossible for them to get a clear picture because they were never sure about the extent of the hack.

“I try to keep some things on more private settings, but I don’t fool myself, it’s out there.” (P1)
“Data being out there is a reality of modern America.” (P9)

04 

SOCIAL AND NEWS EVENTS MAKE TECHNOLOGY UNDERSTANDABLE AND MOTIVATE PEOPLE TO CHANGE BEHAVIORS.

We saw that many people took action and changed behavior after a news event, or after conversations with friends. When a company suggested a change of action, like an update, users were slower to act. This indicates that people are looking for outside sources of information, rather than the company that produces the device. We hypothesize that part of the explanation is that mainstream media, as well as peers, explain complicated concepts in understandable terms. Almost every participant had heard about the Facebook Cambridge Analytica scandal and had an opinion.

These social events often serve as a motivator for taking action to protect their privacy, such as deleting Facebook or checking their records.

“Turns out Alexa is storing audio info…my jaw dropped, wait what?” (P2)
“I subscribed to a VPN, but sometimes I forget to pay and it runs out, then something happens that reminds me of why I wanted it in the first place.” (P3)

Low Fidelity Concepts

 
[Image: storyboard sketch]

Narrowing

Based on our insights and design principles, we generated 35 general design concepts and chose the 5 strongest ideas to develop further. Each of these ideas was built into an open-ended storyboard, which we tested with 5 users (ages 20-67). After the tests, we chose the "Privacy Made Easy" concept because users were excited to take a quiz that helps them understand what to consider with their devices.

 
[Image: user testing session]

User Testing

3 Rounds of Testing

+ Concept Testing

We tested concepts with participants to learn which ideas met users’ needs. We also sought feedback from experts such as the Electronic Frontier Foundation. We found that users did not want another single-use device.

Insight Examples:

01. People were not interested in another screen in their home

02. A news aggregator offered too many opportunities for trolling

 

+ Feature Testing

Used InVision to create a basic prototype and a card sort (pictured left) to understand users’ flow preferences.

Insights Examples:

01. People want to handle their privacy needs in a single session, not on an ongoing basis

02. Understanding tradeoffs is very valuable to users.

 

+ Usability Testing

Used a clickable Sketch prototype to test predictive expectations, locate buttons, and ensure users were learning what they wanted to learn.

01. People desire exploration, and the system supports it.

02. Most were inclined to fill out the questionnaire, and found some value in the results.

 Designing with Feedback 

 

The product that became Devicable was shaped by the feedback of our users. 

  • Create a platform to share information so we all can have a better experience with technology and build trust.
     

  • Make all information accessible without an account so users never feel pressured to share their personal information and data.
     

  • Support with evidence. We are here to help people make informed decisions, not direct them. 
     

  • Allow for differences in people, values, and concerns. Do not tell people what to do; offer thoughtful options.

Takeaways

This project raised important questions about how to make a viable product that people trust. In the end we chose to make the site a non-profit rather than a commercial product. Our users were clear that this is not something they would want to use frequently, but they would be very glad to have it when needed. Throughout our process we explored gamification, an app, and a tool that would more directly modify devices. While these ideas all made sense, our users were clear in wanting something that quickly showed the tradeoffs. I hope to continue to work on meaty problems like this one in the future.