When software engineer Batya Friedman started out in the 1980s, her goal was to build technologies that were beneficial to people. Back then, however, she couldn't find methodologies to help her achieve that goal. So she developed value-sensitive design (VSD), an approach that technologists can use to bring human values to the forefront of the technical design process. Today, technologists in academic and commercial settings are increasingly using VSD, which advocates for designers and developers to consider the values of both direct and indirect stakeholders. Friedman is now a professor in the Information School at the University of Washington as well as an adjunct professor in the university's Department of Computer Science and the Department of Human-Centered Design and Engineering. She also directs the university's Value Sensitive Design Research Lab.
What's lacking in the traditional design process?
Batya Friedman: If you look at a lot of our design processes, they focus on the technology, not the people. And one of the things about technology is you build it and it either works or it doesn't work. You ask questions like: Is it reliable? Is it correct? Is it efficient? You don't ask whether it meets the values of the people who are going to be using it, whether it helps people feel better about themselves, or whether it builds the relationships that people want to build. You're not given a set of tools to think about how the thing you build is going to affect people, so you don't bring that into the way you build your system.
Why is that so critical?
Friedman: Let me give you an example. Let's say I'm building a system that's going to keep track of people who are willing to be organ donors, and I build a great big database for that. A sensible-seeming thing is to put people in alphabetical order: when organs become available, you go down the list of people who want them. But an algorithm like that biases toward people whose names begin with A. If you have a sense of fairness, or of not discriminating based on someone's last name, you need a different way of choosing than last name -- by survivability or something like that.
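Friedman's point can be made concrete with a minimal sketch. The names, survivability scores, and function names below are invented for illustration; real allocation systems weigh many more medical and ethical criteria. The sketch contrasts the naive alphabetical pick with a selection based on a medically relevant criterion:

```python
# Hypothetical sketch: two ways to choose a recipient from a waiting list.
# All data here is made up for illustration.

candidates = [
    {"name": "Abbott", "survivability": 0.62},
    {"name": "Nguyen", "survivability": 0.91},
    {"name": "Zhang",  "survivability": 0.78},
]

def pick_alphabetical(waitlist):
    """Naive design: take the first match in alphabetical order.
    This systematically favors surnames early in the alphabet."""
    return sorted(waitlist, key=lambda c: c["name"])[0]

def pick_by_survivability(waitlist):
    """Alternative in the spirit of VSD: rank by a medically
    relevant criterion instead of an arbitrary one."""
    return max(waitlist, key=lambda c: c["survivability"])

print(pick_alphabetical(candidates)["name"])      # Abbott
print(pick_by_survivability(candidates)["name"])  # Nguyen
```

The bias in the first function is invisible if you only ask "does it work?" -- both functions run correctly and efficiently; only the second reflects a deliberate choice about fairness.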
Isn't that obvious?
Friedman: Actually it's not so obvious. It's very challenging to write algorithms that don't have biases in them. These issues pervade all the algorithms being made.
Is value-sensitive design specific to information technology?
Friedman: There's nothing in value-sensitive design that's about a specific technology. It's about how do we foreground what's important to people in the tools and technologies and infrastructure we build. Most of my work has focused on information technology, but other people have applied it to wind turbines, to designing processes for customs in major ports, for transportation systems.
How does this differ from UX and designing for the user experience?
Friedman: When you're designing a system, who do you focus on? The language in the field is to talk about users and user-centered design. So when people design, they think about who is going to use the technology. We have methodologies for doing user testing, but we know that others are stakeholders, too. So one of the key changes is to bring those other stakeholders in, to make sure they're considered along with the users.
Can you give me an example that illustrates this?
Friedman: Take a cell phone. The user is the person who touches the technology, but a lot of people who never use it are still affected by it. All the people around you who hear an intimate conversation on the bus, or have a meeting interrupted by a ringing phone, are touched by the technology; they're affected by hearing your conversations.
How would considering those indirect stakeholders have changed things?
Friedman: It would have changed the design process. When we were first thinking about and designing cell phones, if you were running focus groups or doing evaluations, you wouldn't have done it just with the people using the cell phones; you would have included others as well. And ultimately that would have reshaped the design.
Can you give an example from your own work?
Friedman: A decade or so ago we were looking at how to bring a real-time image of outdoor scenes into the interior office. We brought plasma screens in and had great nature scenes from outside, but our indirect stakeholders found that their images were captured. Women particularly were much more concerned and uncomfortable with this technology than men were; they didn't want their images collected without their knowledge and shown to people without their permission. Normally, if you were developing that technology you wouldn't even ask this question about the people outside who were being photographed.
How did you resolve this?
Friedman: In that case we weren't building the technology out further, so we didn't actually get to a resolution. The contribution was to bring this question into the public sphere so people could begin to talk about it. [Also, at the time] there was almost no reporting of different responses based on gender, so it was as if issues around gender and privacy didn't exist. One of the things that work did was bring the issues of gender and privacy to the fore, so now a lot of researchers take the time to look at them. In an industry setting, you want to make sure you involve both men and women in your studies.
Does value-sensitive design actually save money then?
Friedman: While we don't have a lot of research on that, I am fairly confident in saying yes. You don't want to have snafus after the fact; you want to find them beforehand. If you can avoid a snafu [earlier in the design process], then the money you spend researching it up front is less than what you'd spend fixing it after the fact.