Customer experience -- how people interact with the organizations they do business with and how they feel about...
those interactions -- was the overarching focus of a tech conference I attended in Boston earlier this spring. But it wasn't a gathering of CRM users or product people. It was for Cloud Foundry, the open source cloud application platform that underlies many companies' business operations.
I met Geri Jennings, software engineer at Massachusetts-based CyberArk, at the conference. The security software company swears by "experience-driven design." Before working on any change to a product, she and her colleagues create an "XDD" document that compiles details on whatever new feature they're working on, who uses the product and what the pending tweaks will mean for users.
"Before we do any engineering work, everybody has to sit and think about this document," Jennings said. "The work that we do at our company is very driven by that end-user experience from the start of any project."
But just because companies are using technology to retain customers and win new ones doesn't mean they're doing what's right for those customers as human beings. As Jennings put it, "You certainly could listen to what they're saying and still do unethical things."
As headlines carry news of the havoc technology platforms can wreak -- the Cambridge Analytica scandal, in which the political consulting firm manipulated Facebook user data to influence voters; fake news spreading like a virus on the social media site; and a pervasive sense that too much screen time is a bad thing for everyone -- the notion of tech ethics has taken hold, with the purveyors of technology getting called out for not adequately protecting people's privacy or for peddling addictive products. Bravo, I say. The power tech companies wield is massive, making a discussion about how the products they produce are used -- and what they're used for -- all the more critical.
A Hippocratic oath for the digital age
Mariesa Dale discussed the idea that tech companies need to build their products with human health in mind during a panel discussion at Cloud Foundry Summit. Dale, who was a product design manager at Pivotal Labs, the consulting division of Pivotal Software, at the time of the gathering, wrote "Technologist's Hippocratic Oath," a tech version of the original contract that required physicians to swear they'd do no harm to patients. Dale's oath makes software developers and engineers promise to respect users as people, strive to influence lives in a positive way and think about the consequences of whatever they build.
Dale wrote the pledge after working on government projects that had "big, consequential outcomes for people and the world -- life-and-death outcomes."
For example, she created an app that tracked the progress of immigrants becoming U.S. citizens. A glitch in the system could jeopardize someone's application status. Another app ingested flight data that got presented in military airplane cockpits. A mistaken input there could result in "pilots crashing or analysts going and having to testify in front of Congress."
She also saw her own son getting hooked on technology and started to worry. "He's very vulnerable to those things," she said.
So is everyone else, and tech companies work hard to make it that way. They build persuasive technology -- tools designed to change people's behavior and attitudes -- to keep them glued to their gadgets, sharing photos and watching funny animal videos.
Take social media. At the MIT Sloan CIO Symposium, Jason Jackson, assistant professor of political economy and urban planning at MIT, questioned whether these powerful platforms are creating social value at all or just furthering an agenda.
"Our first thought is that social media will help us to connect with others," Jackson said during a panel discussion about the fate of human workers in a world teeming with smart machines. "But also the entities that have control over the data and the way the data is used also may have different incentives for how it's used."
Tech ethics in action
Incentives like getting rich quick, greed or destroying the Western alliance -- what do we really know about the intent of the powers pulling the levers of tech? This much is clear: There are people working to head off a dystopian future of antisocial, tech-reliant zombies and to instill tech ethics in our present. Tristan Harris, a former design ethicist at Google, is a vocal opponent of Silicon Valley mind games, speaking out about the dangers of being constantly distracted by a barrage of emails, tweets and social media posts.
His Center for Humane Technology advocates for more ethical technology "to protect our minds from constant distractions, minimize screen time, protect our time in relationships and replace the App Store marketplace of apps competing for usage with a marketplace of tools competing to benefit our lives and society."
Venice Beach, Calif., startup Boundless Mind is trying to build that kind of marketplace, as reported by Haley Sweetland Edwards in Time. The company, run by techies trained in neuroscience, doesn't abandon persuasive technology but uses it to help build applications they believe can help do good for people and society. For example, one app helps patients manage acute pain with the help of a virtual reality game.
"We already know how to engineer your brain to be a good little social media user," said Ramsay Brown, one of the co-founders, in the article. "Why can't we engineer your brain to be who you want to be?"
Actions and consequences
At CyberArk, Jennings feels good about the Conjur software she works on to help developers secure the development process, "because we're trying to create things that make it less likely for there to be a breach of data from an application."
"Do other places encourage their employees to think about that larger picture also, or do people learn in school that that's something that they are going to have to themselves reflect on wherever they are? I don't know, but it probably should be something that people are reflecting on," she said.
Jennings wasn't sure about the future of tech ethics or whether legislation could ever be crafted to enforce some type of standards -- technology is developing too fast to know yet. But she's hopeful. And hearing that from a technologist is a good start.
"We don't really understand well enough what the consequences of it all are going to be," Jennings said. "But I think that it's encouraging that there are more people who are thinking about that, that there's a larger conversation that's happening around whether it is a good, healthy thing for us to have these apps like we do."