The Moral Failure of Computer Scientists by Kaveh Waddell.
From the post:
Computer scientists and cryptographers occupy some of the ivory tower’s highest floors. Among academics, their work is prestigious and celebrated. To the average observer, much of it is too technical to comprehend. The field’s problems can sometimes seem remote from reality.
But computer science has quite a bit to do with reality. Its practitioners devise the surveillance systems that watch over nearly every space, public or otherwise—and they design the tools that allow for privacy in the digital realm. Computer science is political, by its very nature.
That’s at least according to Phillip Rogaway, a professor of computer science at the University of California, Davis, who has helped create some of the most important tools that secure the Internet today. Last week, Rogaway took his case directly to a roomful of cryptographers at a conference in Auckland, New Zealand. He accused them of a moral failure: By allowing the government to construct a massive surveillance apparatus, the field had abused the public trust. Rogaway said the scientists had a duty to pursue social good in their work.
He likened the danger posed by modern governments’ growing surveillance capabilities to the threat of nuclear warfare in the 1950s, and called upon scientists to step up and speak out today, as they did then.
I spoke to Rogaway about why cryptographers fail to see their work in moral terms, and the emerging link between encryption and terrorism in the national conversation. A transcript of our conversation appears below, lightly edited for concision and clarity.
…
I don’t disagree with Rogaway that all science and technology are political. I might use the term “social” instead, but I agree: there are no neutral choices.
Having said that, I do disagree that Rogaway has the standing to pre-package a political stance colored as “morals” and denounce others as “immoral” if they disagree.
It is one of the oldest tricks in rhetoric, but it is quite often effective, which is why people keep using it.
If Rogaway is correct that CS and technology are political, then his own particular take on government, surveillance, and cryptography is equally political.
Not that I disagree with his stance, but I don’t consider it to be a moral choice.
Anything you can do to impede, disrupt, or interfere with government surveillance is fine by me; I won’t complain. But that’s because government surveillance, the high-tech kind, is a waste of time and effort.
Rogaway uses the scientists who spoke out in the 1950s about the threat of nuclear warfare as an example. Some example.
The Federation of American Scientists estimates that, as of September 2015, there are approximately 15,800 nuclear weapons in the world.
Hmmm, doesn’t sound like their moral outrage was very effective, does it?
There will be sessions, presentations, and conferences, along with comped travel and lodging, publications for tenure, etc., but the sum of the discussion of morality in computer science will remain largely the same.
The reason for that sameness of result is that discussions, papers, resolutions, and the rest aren’t nearly as important as the ethical/moral choices you make in your day-to-day practice as a computer scientist.
Choices in the practice of computer science make a difference; discussions of fictional choices don’t. It’s really that simple.*
*That’s not entirely fair. The industry of discussing moral choices without making any of them is quite lucrative, and it depletes the bank accounts of those snared by it. So in that sense it does make a difference.