Eliza Newman-Saul
 
 

Designing for Privacy and Security in Home IoT

 

Ongoing Project

Team: Eliza Newman-Saul and Olivia Harold
Tools: Illustrator, Keynote, Google Suite, Remote Testing (Draw.IO; Skype)
Deliverable: UI Specs, Prototypes, Presentation and Research Report

 

Sponsored by: Substantial

 
 
 

Background

 
 

Connected home devices, or “home IoT,” appeal to passionate technology enthusiasts as well as a growing number of casual users. Voice assistants and smart TVs are becoming more common, yet users struggle to understand how to protect themselves, or whether protection is even necessary. Security vulnerabilities in home IoT are prevalent, putting both devices and people at risk. We believe that for the market to continue to grow and sustain itself, systems must enable concerned users to develop a deeper understanding of connected homes. The goal is to help them make informed decisions that protect their privacy and security. 

 
 

How might we create a home IoT interface that addresses users’ concerns about privacy and security?


 
[Diagram: home IoT landscape]

IoT Concerns

 
 

01. An in-depth picture of our behaviors and habits.

02. A threat to home safety.

03. A connected network vulnerable to hacks and spying.

8 Expert Interviews

[Slide: expert interviewees]
(IoT) has been designed so that there’s minimal intelligence in the device itself and they are continuing to go out to the cloud or some set of servers. They could have been designed to be much more self contained.
— Batya Friedman
 

We researched consent, privacy and security, and IoT activism. Based on the articles we read, I contacted leading authors in the field, including Batya Friedman (U. of Washington), Gilad Rosner (founder of the Internet of Things Privacy Forum), and Peter Behr (consultant, founder of Trustmark for IoT, and founder of Thingscon), and we connected with Michelle Change (Electronic Frontier Foundation). Since IoT is an emerging technology, we also talked to engineers who work on privacy teams at major companies and to an expert in edge computing. Finally, we talked to a researcher at Google who came from an advertising background working with big data. 

 

Below is a diagram I made to understand some of the engineering decisions behind designing IoT:

 
[Diagram: smart home ecosystem]
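The engineering decision Friedman describes, cloud-first versus self-contained processing, can be sketched in a few lines of Python. This is a minimal illustrative sketch with invented names and logic; no real device or vendor API works exactly this way.

```python
# Hypothetical sketch: most home IoT devices forward requests (and their
# metadata) to a vendor cloud, but many of the same requests could be
# served locally on the device itself ("edge" processing).

def handle_command(command: str, local_first: bool = True) -> str:
    # Invented set of skills a self-contained device could handle locally.
    local_skills = {"lights on", "lights off", "set thermostat"}
    if local_first and command in local_skills:
        # Self-contained path: no audio or metadata leaves the home network.
        return f"handled on device: {command}"
    # Cloud path: the request travels to remote servers for processing.
    return f"sent to cloud: {command}"
```

The tradeoff in the diagram is exactly this branch: routing everything through the cloud simplifies the device but expands what leaves the home.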

Primary Research

 
 

We recruited 12 participants using the following criteria: age 27+, owns at least 1 smart device, and self-reported as concerned about privacy and security. We alternated between leading the interview sessions and taking notes for each other. 

Research Questions

 

01. How do people currently evaluate and define privacy risks, and do their concerns match proven risks?
02. How do people derive value from connected devices?
03. What do people understand about data collection from devices and how does it inform their decisions?
04. What are the complications of protecting digital identity?
05. When is personal privacy violated?

Research Activities

Our sessions combined semi-structured interviews with a matrix card sort and a privacy-policy think-aloud. 

 

Personas

 
 
[Diagram: persona 2 x 2 matrix]

Our personas were created by placing our users on a 2 x 2 matrix, graphing them on spectrums from casual user to techie and from troubled to untroubled.

 
 

Techie Troubled

We spoke with several users who are highly concerned about privacy at both an individual and societal level.


Techie Untroubled

This user typically works for a large tech company and feels there is not much to worry about. They believe companies do a good job managing confidential information and handling hacks.


Casual Troubled

Owns a few smart devices and is concerned about privacy, but feels overwhelmed and confused about what actionable steps they can take.

Casual Untroubled

Delegates privacy to other family members and does not want to deal with technology management.

Understanding the Data

 
 

The interviews were coded into small chunks and externalized as a collection of sticky notes. Themes were drawn out on blue notes, and pink notes were used to brainstorm insights. The notes were moved around and arranged to look for patterns and relationships between participants. The matrix card sorts were compared across users in three categories: high, medium, and low level of concern. This research was supplemented by a survey with 628 responses, distributed through the U. of Washington DUB mailing list, though the bulk of responses came from users on reddit.com/r/homeautomation and reddit.com/r/googlehome.

 
 
[Photo: affinity map]

We mapped all 12 participants on a 2 x 2 matrix, from troubled to untroubled and from casual tech user to techie. Using the results of that exercise, we generated our 4 personas.
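The quadrant logic of the 2 x 2 mapping can be sketched in code. The scores and thresholds below are hypothetical assumptions for illustration; our actual mapping was done by hand on the matrix.

```python
# Illustrative sketch of the persona quadrants: each participant gets two
# hypothetical scores in [0, 1] (tech savviness and level of concern), and
# the quadrant they land in names the persona. The 0.5 cutoffs are invented.

def persona(techie_score: float, concern_score: float) -> str:
    savvy = "Techie" if techie_score >= 0.5 else "Casual"
    mood = "Troubled" if concern_score >= 0.5 else "Untroubled"
    return f"{savvy} {mood}"
```

For example, a participant scored high on both axes falls into the "Techie Troubled" quadrant, while low scores on both place them in "Casual Untroubled".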

Insights

 
 

01. CONCERNS ARE LARGELY ON A SOCIETAL LEVEL RATHER THAN AN INDIVIDUAL LEVEL. 

People justified their inaction by saying that they don’t feel like targets. They described themselves as law-abiding citizens, noted that their data is aggregated with everyone else’s, and said they would be more concerned in a different situation, such as being a reporter in Russia. However, 11 of 12 participants stated that they were very worried about the effects of data collection on society at large.

“Metadata like the time I sent a text seems harmless, but at a societal level it’s creepy as hell.” (P6)
“Data harvesting can swing elections and encode inequalities in society.” (P8)

02. IOT TRENDS TOWARDS OPAQUE INTERFACES.

Devices are geared towards completing tasks with minimal interactivity, and are not informative about themselves. Our competitive analysis found that devices such as routers and physical security products almost exclusively take arbitrary, screen-less forms. Common HCI practice recognizes that limiting feedback diminishes usability, weakens users’ mental models, and widens the gulfs of evaluation and execution.

Opaque form factors are probably not accidental design choices, as the quote from P2 below suggests. The ultimate goal of ubiquitous computing is screen-lessness, but the integration and functioning of devices is lagging behind. There is a mismatch between opaque design trends and the need for easy troubleshooting and for building user trust through transparency.

“I applaud people who go to extremes to research devices and what they can do.” (P9)
“As I become more informed I become more suspicious.” (P2)
 

 

03. PEOPLE HAVE VAGUE AND MASSIVE IDEAS ABOUT DATA COLLECTION.

Participants struggled to describe what data about them is ‘out there’; most simply said “everything.” They also believe that data is never deleted, and that every part of their digital identity remains outside their control.

This exact phenomenon was predicted by our expert Batya Friedman, who anticipated that people would draw large, nondescript bubbles if asked to illustrate their digital identity. Some people spoke about specific hacks that had affected them. Although they knew what information had been leaked, they concluded that everything was now compromised; it was impossible for them to get a clear picture because they were never sure of the extent of the hack.

“I try to keep some things on more private settings, but I don’t fool myself, it’s out there.” (P1)
“Data being out there is a reality of modern America.” (P9)

04. SOCIAL AND NEWS EVENTS MAKE TECHNOLOGY UNDERSTANDABLE AND MOTIVATE PEOPLE TO CHANGE BEHAVIORS.

We saw that many people took action and changed behavior after a news event or after conversations with friends. When a company suggested a change, such as installing an update, users were slower to act. This indicates that people look to outside sources of information rather than to the company that produces the device. We hypothesize that this is partly because mainstream media, as well as peers, explain complicated concepts in understandable terms. Almost every participant had heard about the Facebook Cambridge Analytica scandal and had an opinion.

These social events often serve as a motivator for taking action to protect their privacy, such as deleting Facebook or checking their records.

“Turns out Alexa is storing audio info…my jaw dropped, wait what?” (P2)
“I subscribed to a VPN, but sometimes I forget to pay and it runs out, then something happens that reminds me of why I wanted it in the first place.” (P3)

Low Fidelity Concepts

 
 

Narrowing

Based on our insights and design principles, we generated 35 general design concepts. Using feedback, we chose the 5 strongest ideas to develop further, and built each into an open-ended storyboard. We then tested the storyboards with 5 users (ages 20-67). After completing the 5 user tests, we chose the "Privacy Made Easy" concept.

 
 
 
 
[Storyboard: concept idea 1]

[Storyboard: living room smart hub]

[Storyboard: news event]