Editor’s Note (4/12/23): On Tuesday, New York City Mayor Eric Adams announced that the New York City Police Department (NYPD) will be deploying a new fleet of robots and other devices, including the controversial “Digidog”. The NYPD first tried this bot in 2020, but a public backlash to its use led to its removal in 2021.
Last year, the New York City Police Department (NYPD) began leasing a dog-like robot, a Boston Dynamics Spot model that the department dubbed Digidog. Officers deployed the robot in only a handful of cases, including a hostage situation in the Bronx and an incident at a public housing building in Manhattan. As the news spread (along with photographs and video), a backlash from the public – and, eventually, elected officials – quickly gained momentum. Some objected to the money spent on the robot. Others worried that its use would threaten civil liberties. Many simply found it scary.
The NYPD abruptly terminated its lease and stopped using the robot last month. Other U.S. police departments, however, have tested Spot models of their own. “Spot has been particularly useful for tackling dull, dirty and dangerous tasks,” a Boston Dynamics spokesperson tells Scientific American. “Public safety agencies, including police departments, often face dangerous tasks, such as inspecting a bomb, surveying the aftermath of an explosion or fire, or even de-escalating a potentially dangerous situation.”
Complex social and historical factors influenced the NYPD’s decision to take Digidog out of service. “It’s just not the right time for [the NYPD] to be trying that,” says David J. Gunkel, a professor of communication at Northern Illinois University. He notes that the department made its decision “at a time when we as a public are beginning to question what the police do, how they are funded and what that money is used for.” Scientific American spoke with Gunkel about why people accept some machines while rejecting others — and whether the public could ever fully embrace the idea of robotic police officers.
[An edited transcript of the interview follows.]
What influences how we humans feel about robots? People like PARO, the cuddly robotic seal, for example, while having a strong negative reaction to Digidog.
A combination of factors comes into play: the design of the robot, the contexts in which it is deployed and what users do with it. The PARO robot is designed to engage humans in more social activity. Boston Dynamics robots aren’t made to look like that. They have no face. They aren’t furry and cuddly. So the design can affect how people react.
But the context of use is also really important. The same Boston Dynamics robots you saw stirring up trouble for the New York [City] Police Department received a great deal of human sympathy just [a few] years earlier. Videos circulated online showing Boston Dynamics engineers kicking the robot, and there was this surge of sympathy for the robot. Because of the context in which it was used, that robot elicited a very different emotional reaction than the police Digidog did.
And then, finally, there’s what users do with these things. You can design the best robot in the world, but if users don’t use it the way you intended, that robot could turn into something very different.
Is there anything about robots in particular that makes humans nervous?
The really important thing about robots is that they move. And movement raises a lot of expectations in us human beings about what the object is. As early as the 1940s, [psychologists] Fritz Heider and Marianne Simmel did studies with very simple animated shapes in a short film. When they showed it to human test subjects, the subjects ascribed personalities to [a] triangle versus [a] circle. The difference was not that the shapes actually had personalities; the difference was how they moved. Because movement conveys so much information about social positioning and expectations, the movement of robots through physical space really matters to us.
Going back to the public backlash against the NYPD, why were people so attached to this specific robot?
Again, it’s a number of factors. One is the design itself. The robot, if you’ve seen any footage of it, is a rather imposing presence. It’s a bit smaller than the robots you see in science fiction, but the way it moves through space gives it a very imposing profile that many human observers can find frightening.
There is also the context of use. The NYPD used this robot, now famously, [at] a public housing project. Deploying the robot there, I think, was a really bad choice on the NYPD’s part – because you’re already talking about police going into public housing, now with this massive piece of technology, and that [exacerbates the] very big power imbalance that is already there.
Third, there’s simply the timing. All of this is happening amid increased public scrutiny of policing practices – particularly the militarization of the police – and [how] police have responded to minority populations very differently than to white populations.
Some people have used sci-fi to criticize Digidog, referring to an episode of the TV show Black Mirror in which robotic dogs hunt humans. How do stories shape our reaction to technology?
The question of science fiction is really crucial. We get the word robot from the Czech word robota, which comes to us from a 1920 play by Karel Čapek. So our very idea of “robot” is absolutely science fiction, and you can’t really separate the two, because that’s where it all started.
Also, what the public knows about robots is already prefigured in science fiction, because we see things in science fiction before we actually see them in social reality. This is called “science fiction prototyping.” Roboticists benefit from it because they can often use science fiction to explain what they are building and why. But they also struggle against [this prototyping], because science fiction stories create expectations that shape how people will react to these things before they become a social reality. It is therefore a double-edged sword: it offers possibilities for explanation, but it also keeps us from fully understanding what the realities are.
Could the public possibly accept the use of robots in policing?
I think it’s an evolving scenario. And the decisions police departments make about how these things are or aren’t integrated will be crucial. I think you would have seen a very different response if the Digidog robot had been used to rescue someone from a fire rather than brought into a housing project in support of a police action. I think you would have seen a very different result if it had been used as a bomb disposal robot. So a lot will depend not only on the design of the robot but also on the timing of its use, the context of use and the positioning of this device in terms of how the police interact with the communities they serve.