No, you read that right.
This particular story begins with the historically Democratic San Francisco Board of Supervisors.
On Tuesday, November 29, the board voted 8-3 in favor of a contentious policy allowing police to deploy remote-controlled, armed robots that could employ deadly force, albeit only in the direst of situations.
The San Francisco Police Department has since insisted it has no pre-armed robots and no intention of arming its robots with guns, RoboCop style. The SFPD did, however, seek to equip them with explosive charges “to contact, incapacitate, or disorient violent, armed, or dangerous suspect[s],” according to SFPD spokesperson Allison Maxie.
City police officials argue such an option is meant to be nothing more than a last resort, reserved for circumstances in which typical de-escalation tactics and alternative uses of force have proven unsuccessful and lives are at stake.
However, lethal force delivered by robot would not be without precedent. On July 7, 2016, Micah Johnson shot and killed five police officers in Dallas, Texas, and injured nine others in a surprise attack. That day, Dallas became the first city in the United States to use a robot to deliver and detonate a lethal bomb in order to take down a suspect.
Although only high-ranking officers could sign off on a robot’s use of deadly force, the approval did not go over well with San Francisco community members and political leaders, most of whom were deeply uncomfortable with the decision.
Others, like those at the San Francisco Public Defender’s office, issued a letter to the board the day before the vote, calling the decision to grant police the “ability to kill community members remotely” misaligned with the city’s traditional ethics and principles. Over the following week, disapproving messages such as these fueled fast-growing public opposition to the board’s decision.
With so many opposed to the over-militarization of automated technologies in police work, the board responded by reversing course. Robots capable of lethal force were banned entirely, though the machines retain some purpose as ground-based surveillance units that can scope out situations too dangerous for police officers.
The events preceding and following San Francisco’s controversial policy raise many questions about the intersection of emerging technology and humanity.
After all, with more and more companies turning to automation to maximize efficiency instead of employing human workers, who is to say robots couldn’t eventually replace human officers entirely?
The answer, as you might have guessed, is complicated.
As it currently stands, many experts argue robots are more than capable of performing police work as well as, if not better than, their human counterparts.
David Clark, a trial attorney with more than 35 years of experience, maintains that robots could be particularly useful in minimizing casualties and harm to police officers and civilians alike. He believes it is their lack of human emotion and of any preexisting biases that lowers a person’s chances of being injured on the job.
Clark says, “They don't have human emotions that can lead to irrational judgments or unfair biases; thus, they remove the possibility of someone getting hurt physically.”
Despite the major benefits robots could bring to traditional police work, Clark believes there are still plenty of legal and ethical problems to address before RoboCops are given the all-clear.
Although these robots may come bias-free, there remains a chance for people of color to be misidentified, reportedly because the group is “‘under-sampled,’ meaning there isn't much data that can represent them or be used for validating info,” according to Clark.
This issue would be particularly consequential in situations where autonomous officers must accurately identify a suspect. If the data on file is inaccurate, or there isn’t enough of it for the robot to fall back on, a person of color might do time for a crime they did not commit.
That same lack of emotion, however, could work against a robot officer’s ability to relate to and effectively serve the community members it would be expected to protect.
Conrad Golly, a medically retired motorcycle officer who worked in various cities across California, is less confident that robots could effectively enforce the law without a human partner, as they “lack the ability to understand and empathize with human emotions, which is necessary for building trust and relationships with the community.”
As a former officer, he also points out that our society’s current legal structure does not support rights for robots; until laws are created allowing it, robots would not be able to legally dispense justice. Only once appropriate legislation has passed will people even have to listen to RoboCops, let alone respect their authority to enforce the law.
One of Golly’s most prominent RoboCop concerns, one that San Francisco board members also discussed prior to their vote, is the potential misuse of authority and inappropriate use of force. “Because robots do not have the same moral compass or accountability as human police officers,” he finds, “there is a risk that they could use excessive force or discriminate against certain groups of people by the programmer of their algorithms.”
As society inches closer to the future, a steadily increasing number of autonomous machines are being integrated into our daily lives, and several countries have already been experimenting with incorporating robots into local police forces.
In September 2021, Singapore began trialing its Xavier patrol robots, which roamed a public housing estate to flag undesirable behaviors such as smoking in prohibited areas.
Closer to home, New York City had its very own robotic police dog: Digidog.
During its time with the NYPD, Digidog accompanied officers on duty and assisted in various high-risk situations, including a home invasion in the Bronx and a domestic dispute in a Manhattan public housing complex. The mechanical K9 was ultimately scrapped and its contract canceled, however, amid growing public outcry over privacy and increased police militarization.