Inadvertent Algorithmic Cruelty by Eric A. Meyer.
From the post:
I didn’t go looking for grief this afternoon, but it found me anyway, and I have designers and programmers to thank for it. In this case, the designers and programmers are somewhere at Facebook.
I know they’re probably pretty proud of the work that went into the “Year in Review” app they designed and developed. Knowing what kind of year I’d had, though, I avoided making one of my own. I kept seeing them pop up in my feed, created by others, almost all of them with the default caption, “It’s been a great year! Thanks for being a part of it.” Which was, by itself, jarring enough, the idea that any year I was part of could be described as great.
…
Suffice it to say that Eric suffered a tragic loss this year and the algorithms behind “Year in Review” didn’t take that into account.
While I think Eric is right in saying users should have the ability to opt out of “Year in Review,” I am less confident about his broader suggestion:
If I could fix one thing about our industry, just one thing, it would be that: to increase awareness of and consideration for the failure modes, the edge cases, the worst-case scenarios. And so I will try.
That might be helpful, but uncovering edge cases and worst-case scenarios takes time and resources, to say nothing of accommodating them. Once an edge case surfaces, it can be accommodated, as I am sure Facebook will do next year with “Year in Review.” But it has to happen and be noticed first.
Keep Eric’s point that algorithms are “thoughtless” in mind when using machine learning techniques. Algorithms aren’t confirming your classification; they are confirming that the conditions they have been taught to recognize are present. Not the same thing. Recall that deep learning algorithms can be fooled into recognizing noise as objects.
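To make that distinction concrete, here is a minimal Python sketch (illustrative only, nothing from Facebook or Eric’s post) using scikit-learn. A classifier trained on handwritten digits will confidently assign a digit label to pure random noise, because it only reports which learned pattern best matches the input, not whether the input is meaningful at all:

```python
# Minimal sketch: a trained classifier confidently labels random noise.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

digits = load_digits()
clf = LogisticRegression(max_iter=1000)
clf.fit(digits.data, digits.target)

# Random pixel noise, same shape as an 8x8 digit image.
rng = np.random.default_rng(0)
noise = rng.uniform(0, 16, size=(1, 64))

probs = clf.predict_proba(noise)[0]
print("predicted digit:", probs.argmax())
print("confidence:", probs.max())
# The model names a digit anyway; it has no concept of
# "this doesn't look like a digit at all."
```

The model never says “none of the above.” It was taught to recognize ten classes, so it finds one of those ten in whatever you hand it, the same way “Year in Review” found a “great year” in whatever photos got the most engagement.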