There is ample evidence to suggest that digital technologies are being designed and deployed not only to surveil and nudge us toward certain consumer preferences, but to train us to act like predictable machines. In the absence of an established framework for assessing these effects, we need a new test of humanity lost.
PHILADELPHIA – A few years ago, my (Brett’s) inbox was flooded with Facebook notifications indicating that friends had wished me a happy birthday. Annoyed, I logged in and changed my listed birthday to a random date in mid-summer. Six months later, my inbox was flooded again. At first, I didn’t think much of it. Most of those Facebook friends didn’t know my actual birthday and were responding automatically to a prompt. But there were also two messages from close relatives who did know my real birthday. Their responses resembled the others, raising the question of whether automation bias had trumped their human judgment. Was this just a one-off mistake, or was it indicative of something larger and more pernicious?