By S. H. | Friday, August 30, 2019

Robots engender human sympathy. Seen in the wild, they appear to have agency, feelings, and desires. Think of R2-D2, of C-3PO’s intelligence, of WALL-E’s charm. When delivery bots get stuck on the sidewalk, good Samaritans help them get unstuck.

Knightscope, a company that builds patrolbots, supplied one of its models, the K5, to the city of Hayward, California.

The Knightscope K5

K5, an autonomous security robot, rolls around by itself, taking video and reading license plates. K5 seemed to be doing a good job until the night of August 3, when a stranger came up to it, knocked it down, and kicked it repeatedly.

According to Bilge Mutlu of the University of Wisconsin’s HCI Lab:

“People want to explore them, and they don’t know how to do that.”

Rarely do these interventions cause damage. Sure, sometimes people do get in the way. They’re curious: What’s this thing for, anyway? They’ll follow the robots to see what they do or tap their buttons to see what happens.

Humanization comes naturally. When one K5 patrolling Washington, DC, rolled itself into a fountain in the summer of 2017, a writer at The Verge affirmed its choice: “I wouldn’t want your job either, K5. Live your truth.”

The incident on August 3, though, was not a case of poking around. Though the identity of the assailant remains unknown, video captured just before K5 crashed to the concrete shows a blurry image of a young person with dark hair running past the camera.

Stacey Stephens, Knightscope’s executive vice president, wouldn’t say how many of its robots have been seriously damaged. “I don’t want to challenge people,” he says, afraid that any number will inspire, or even compel, more miscreants to seek out K5s.

Humans are mean to robots. The question is: Do we care? With K5, perhaps we shouldn’t, because the robot itself is perceived as an erosion of our civil liberties and our right to privacy.

Says Kate Darling, a ‘robot ethicist’ at MIT:

“We wouldn’t be having this conversation if people didn’t clearly view or treat robots differently than other devices. If people were going around smashing security cameras, you wouldn’t have called me.”

K5 is not a friendly robot, even if the cute blue lights are meant to show that it is. It exists to collect data—data about people’s daily habits and routines.

While Knightscope owns the robots and leases them to clients, the clients own the data K5 collects. They can store it as long as they want and analyze it however they want. K5 is an unregulated security camera on wheels, a 21st-century panopticon.

The true power of K5 isn’t to watch you; it’s to make you police yourself. It’s designed to be at eye level, to catch your attention. Stephens likens it to a police car sitting on the side of the road: it makes everyone hyperaware of their surroundings. Even if you aren’t speeding, you brake, turn down the radio, and put your hands at 10 and 2.

Even Darling, who believes that the way we treat robots mirrors our ideas about empathy and kindness, agrees the ethics aren’t always clear:

“Even though it’s clearly wrong to punch a person, you get into ethical questions very quickly where it’s not always so clear what the answer is.”

While K5 can’t threaten bodily harm, the data it collects can cause real problems, and the social position it puts us all in—ever suspicious, ever watched, ever worried about what might be seen—is just as scary.

At the very least, it’s a relationship we should question, not blithely accept as K5 rolls by.

So, yes, you should be allowed to kick robots. Test the strength of your sociopolitical convictions. The K5 is a sham, an ersatz show of power, and it should be pushed to its limits: right down onto the hard concrete of the parking lot.

