The Google Memo Proves Tech Needs More Empathy, Not Less Emotion

By Ash Huang, August 10, 2017

This story reflects the views of this author, but not necessarily the editorial position of Fast Company.


By now you’ve probably heard all about the memo that got erstwhile software engineer James Damore fired from Google earlier this week, even if you haven’t read it. While it’s hardly his only questionable claim, this suggestion of Damore’s is particularly bizarre: “Relying on affective empathy–feeling another’s pain–causes us to focus on anecdotes, favor individuals similar to us, and harbor other irrational and dangerous biases. Being emotionally unengaged helps us better reason about the facts.”

Who’s really being emotional here? Ironically enough, this man has written 10 pages against empathy–and yet this is exactly what he seeks from his coworkers. He implores them to acknowledge his frustration and respect his point of view in the same language used by diversity advocates whose tactics he objects to, and whose foundational assumptions–about coding aptitude and cognitive development, among other things–he rejects:

We need psychological safety and shared values to gain the benefits of diversity.

Treat people as individuals, not as just another member of their group.

These two remarks are nothing if not calls for empathy–for more, not less, “emotional engagement.” Seemingly without meaning to, Damore has offered up an important reminder: Try as we might to keep empathy and emotion out of our work–to look at things dispassionately, in a fact-based way, and with limited biases–we all have visceral, emotional reactions, some of which we aren’t even fully aware of. That reality makes for complicated relationships with other human beings, to say the least.

And downplaying it has caused some serious damage already.

Clickbait Culture’s Emotional Paradox

I’m old enough to remember the rise of metric-driven strategy in Silicon Valley. Around a decade ago, web companies routinely tested hundreds of home pages against each other to eke out marginal gains in conversion rates, the dark side of which designer Doug Bowman epitomized in his famous “41 shades of blue” departure letter:

I had a recent debate over whether a border should be 3, 4 or 5 pixels wide, and was asked to prove my case. I can’t operate in an environment like that. I’ve grown tired of debating such minuscule design decisions. There are more exciting design problems in this world to tackle.

But excitement–or, really, any emotional reaction–wasn’t part of Bowman’s job description. So the mind-set Damore calls for has existed for quite some time; it’s been tried out, and it has largely failed. Yes, this way of thinking has been a boon to objective decision-making in some quarters of the tech world. But it’s also become an easy way to distance ourselves from the foibles of humanity, to rely on numbers instead of intuition, and to focus on petty tasks and petty concerns.

Not incidentally, it’s also the reason why your social feeds are filled with uncles and cousins internet-yelling about whether or not President Obama is an American citizen. Approaching technology in an “emotionally unengaged” way has a paradoxical consequence: It opens the door for users to engage with it as emotionally, and as un-empathetically, as possible.

Even though we as users might not feel good witnessing online fights and tirades, it’s hard to stop scrolling and refreshing to see what happens next. What began inside Twitter or Facebook as a well-meaning data-driven initiative–to keep you immersed in the feed for as long as possible–translates in the real world into a stressful and divisive experience, whether you’re actively commenting or just passing by. When you reduce a complex, individual person to a number, user engagement becomes a game.

In this sense, Damore’s suggestion to treat people as individuals is dead-on; it’s his proposal for how to do that that’s ass-backwards. With advertising now a major revenue stream for the biggest tech firms, those of us who work in the sector are still doing just about anything to keep users on our platforms longer, but we don’t always stop to ask whether users are happy or productive or kind to each other while they’re there. Engagement is engagement, which means being “emotionally unengaged” is already something tech workers are taught (and paid) to do all too well.

Raw Emotion Pays, Until It Doesn’t


Twitter cofounders Ev Williams and Biz Stone have both expressed regret over the service’s shortfalls. “The trouble with the internet, Mr. Williams says, is that it rewards extremes,” David Streitfeld wrote in the Times last May. “Say you’re driving down the road and see a car crash. Of course you look. Everyone looks. The internet interprets behavior like this to mean everyone is asking for car crashes, so it tries to supply them.”

“If I learn that every time I drive down this road I’m going to see more and more car crashes,” Williams told Streitfeld, “I’m going to take a different road.”

Twitter’s ambivalence toward salacious behavior and its focus on short-term growth goals have together proved unsustainable. Its user growth has long since stalled, and the company’s stock is collapsing. Countless public figures are announcing “Twitter diets” and moving off the service. And it’s not just Twitter, of course; Williams is right to call out “the internet” as a whole. The biggest internet companies, from Facebook and Google to Uber and Airbnb, all saw massive returns by relying on data. But lately they’ve all had to deal with increasingly serious, and all-too-human, problems and pressures.

Contrary to Damore’s appeal, it’s actually dangerous to pretend we can divorce emotion from our work. That’s where empathy comes in. Instead of ignoring what’s human, we need to account for emotions first, and use them from the very beginning to direct everyone’s collective energies toward good.

That won’t be easy to do. But if you can set aside the context in which he offers them, Damore actually has two good pieces of advice for accomplishing that. Creating more psychological safety and treating people as individuals make great sense as ways to temper maximally emotional yet minimally empathetic experiences–both of using technology and of working someplace that builds it. What if we designed and built products that prioritized people having meaningful and productive interactions with each other, instead of focusing solely on metric gains? What if we found sustainable ways of earning revenue that benefit both the company and the consumer, instead of resorting to ones that just make the numbers go up?

As some have pointed out already, Damore likely enjoyed considerable emotional “safe space” to feel comfortable sending his memo to coworkers in the first place. But his preferred approach would actually remove the few safe spaces tech companies have managed to carve out for underprivileged populations. Damore imagines this would make things more “fair.” Perhaps had he been a little more emotionally engaged, and more empathetic, he’d see it for the spiteful illogic that it is.


Ash Huang is a designer and writer in San Francisco. Visit her at ashsmash.com and follow her on Twitter at @ashsmash.


