Two stories that broke this week illustrate the hazards of our ever-increasing reliance on technology. The first is about an experiment conducted at Georgia Tech in which a majority of students disregarded their common sense and followed the path indicated by a robot bearing a sign that read “EMERGENCY GUIDE ROBOT”:
A university student is holed up in a small office with a robot, completing an academic survey. Suddenly, an alarm rings and smoke fills the hall outside the door. The student is forced to make a quick choice: escape via the clearly marked exit that they entered through, or head in the direction the robot is pointing, along an unknown path and through an obscure door.
The vast majority of students (26 of the 30 included in the experiment) went where the robot was pointing. As it turned out, there was no exit in that direction. The remaining four students either stayed in the room or were unable to complete the experiment. No student, it seems, simply went out the way they came in.
Many of the students attributed their decision to disregard the correct exit to the “Emergency Guide Robot” sign, which suggested that the robot was specifically designed to tell them where to go in emergency situations. According to the Georgia Tech researchers, these results suggest that people will “automatically trust” a robot that “is designed to do a particular task.” The lead researcher analogized this trust “to the way in which drivers sometimes follow the odd routes mapped by their GPS devices,” saying that “[a]s long as a robot can communicate its intentions in some way, people will probably trust it in most situations.”
As if on cue, this happened the very same day that the study was released: