Software engineers must think deeply about ethics


After my sophomore year at UC Berkeley studying computer science, I briefly dropped out to join a medical tech startup in San Francisco. It was a great match: I was going to work on what I was interested in, I had strong personal motivations, and I emotionally connected with the chief technology officer (who would become my boss). A few days after one short interview, I received an offer.

As it happened, when I got the call, I was about to board a plane. I was in line for a flight to Georgia, where my best friend from high school (let’s call him Keith Adamson) and his mom would pick me up. This was Thanksgiving break, and I was going to Keith’s place in Summerville, Georgia, a small landlocked rural village with about 4,000 residents.

I excitedly accepted the offer, got on the plane and was soon reunited with Keith and Mrs. Adamson. Keith and I hopped in the back of her car and started chatting it up, bringing up memories of old antics and catching up on each other’s lives. At some point I mentioned the startup I was going to work at. I explained that there was this job in the medical industry that was very time-consuming and that we were using machine learning to automate it. He asked me what exactly this job was, and I told him the name of the job. Then he said, “Dude, my mom does that.”

I thought I had heard him wrong.

“What?” I replied. I didn’t know what to think. Keith just gave me a look and told his mom, perversely amused, that I was going to automate her job. I looked out the car window. Below the night sky, there was an illuminated billboard referencing President Trump: “God Has Blessed America Again!!!”

The week passed, I returned to San Francisco, and I soon started my job at the startup. At first, it was exciting. I was thrilled to feel like each line of code I was writing would have a real impact on the world. After all, this was my first job, and I was, by all accounts, living the dream: dropping out of school to join an exciting startup in “Silicon Valley,” with the fancy title “software engineer,” “changing the world,” “disrupting an industry,” “making the world a better place.”

I rationalized: Certainly, I was “solving a problem.” The process I was automating was a bottleneck in the health care industry, and it could benefit millions of people. Besides, Mrs. Adamson’s job was a menial one, a job I couldn’t imagine anyone enjoying; if we got rid of all these “inefficiencies,” if we got rid of all these unfulfilling jobs, then somehow, sometime, Universal Basic Income would be a thing, and maybe people like Mrs. Adamson could enjoy a more creative, satisfying life.

I repeated these rationalizations to myself on the overcrowded BART commute each morning, but thoughts of Mrs. Adamson kept returning. At one point, I heard the CEO talking over the phone to someone I presume was a potential investor: “This is a multi-billion-dollar industry … 300,000 people in the United States are working on it … It’s an industry riddled with inefficiencies …” During a team social at a hip ping-pong bar, I gathered the courage to bring up my dilemma to the CTO. I felt compelled to act as nonchalant as possible: “You know, my friend’s mom does this job.”

He grunted, and he said, matter-of-factly, “Lotsa people do it.”

And that was the end of the conversation.

Weeks passed, and each time I overheard the CEO or CTO talk about these workers — which was not often — it struck me that they did not consider the workers full human beings. Though I do not want to make assumptions, it is hard to imagine that they, who had attended prestigious private universities and met each other in a prestigious private MBA program, personally knew anyone who held the job or had any idea what it was like to hold it. Increasingly, it seemed that to them, the 300,000 workers with this lower-middle-class job were just some “inefficiencies,” a financial burden of $40,000 per head that, if computed away, would enable some hospital to make more money and perhaps offer a $1,000,000 bonus to hire a more prestigious doctor, or whatever. I slowly realized that the proposition I was working toward was not “make the world a better place,” “move fast and break things,” or “solve big problems,” but “make the rich richer, and the poor poorer.”

Two months after I joined the startup, I resigned. I started Philosophy of Computation at Berkeley, a student organization dedicated to bringing more self-awareness to UC Berkeley’s computer science culture. I don’t mean to say that Silicon Valley is an evil force that is destroying the world. Far from it: I believe technology is immensely constructive and, like any power, if wielded correctly, can in fact make the world a better place. I still believe most jobs will be automated, and, in the long run, humanity will be better off from it.

But great power must be accompanied by great responsibility, which remains largely absent in Silicon Valley. The Valley’s fantastical vision of a techno-utopian future where robots do all the menial jobs, Universal Basic Income is firmly in place, and everyone can freely pursue their creative endeavors, is just that: a fantasy. Between that fantasy of tomorrow and the reality of today is a gap into which real families, with real mortgages to pay and real mouths to feed, are falling.

In the car on the way to Summerville, when Keith told his mom that I was going to automate her job, Mrs. Adamson said, in her comforting Southern drawl, “Whatever you’re working on, I’m sure you’re getting paid way more than I am,” and just kept driving.

I have since tried to understand her reaction. Did she just not understand what was going on? Had she resigned herself to the fate of her job? Or was there a note of hidden contempt? There’s some personal comfort in the fact that Keith happens to be a computer science major at a top university, and thus his family will weather the AI revolution with relative grace. But Keith is a very special case: few rural families are lucky enough to have a child with his privilege. How will they deal with irresponsible automation, and where will they go?

Jongmin Jerome Baek is a UC Berkeley student and lead facilitator of Philosophy of Computation at Berkeley.