MARCH OF THE ROBOT TIN SOLDIERS
MONOLOGUE WRITTEN BY CLYDE LEWIS
This holiday season, I challenged myself to find typical Ground Zero stories and fit them into a Christmas theme. I don’t know if I felt the urge to compete with the countless radio stations that play Christmas music from October to January, or if it is my way to convey important stories while still feeling some sort of Christmas spirit in the process.
Writers and reporters do it all of the time; every sitcom and drama on television throws in a Christmas theme and even the military gets into the act by tracking Santa from NORAD.
During the Obama administration, there was even a signed order instructing the military to track Santa and, if necessary, provide him with a military escort to protect the sleigh from any and all possible terrorist attacks.
The push of an armed escort into Santa’s itinerary seemed over-the-top and darkly comical. It actually sounded like a programming mechanism taken right out of the psychological operations manual which gives instructions on how soldiers and armed personnel can befriend children in order to recruit them for duty at a later time.
I can imagine an impressionable child thinking that one day he too could be an Air Force pilot protecting Santa at Christmas time.
We can all be glib about the military surveillance state looking out for a mythological being like Santa; however, the reality is a lot less magical when we know that the NSA and other alphabet agencies are watching you, knowing when you are sleeping or awake and whether you are being good or bad – or maybe even suspicious.
Surveillance software and data monitoring have advanced into a digital panopticon that began with email and phone monitoring but now includes keeping track of web-browsing patterns, text messages, screenshots, keystrokes, social media posts, private messaging apps like WhatsApp and even face-to-face interactions with co-workers.
The majority of surveillance tech providers focus their attention on the financial sector, where companies are legally required to track staff communications to prevent insider trading. But they are increasingly selling their tech to a broader range of companies to monitor staff productivity, data leaks, and Human Resources violations, like sexual harassment and inappropriate behavior.
American companies generally aren’t required by law to disclose how they monitor employees using company-issued devices, although they tend to include a catch-all clause in employment contracts authorizing such monitoring.
Even if you’re not an employee, you may still be subject to surveillance, thanks to technology used to screen potential job candidates. Connecting the dots between a person’s work life and personal life can lead to uncomfortable territory, and it is territory that most businesses are willing to enter in order to protect themselves rather than protect anyone’s right to privacy.
There are many people who have been conditioned not to care; after all, many Americans have already invited office assistants like Amazon Echo and Google Home into their lives.
We now know that authorities have made it clear that they can use any of the data from these services for criminal investigations.
If the authorities wish to make your life a living hell, they can acquire forensically meaningful native artifacts from Alexa, such as registered user accounts, Alexa-enabled devices, saved WiFi settings (including unencrypted passwords), linked Google calendars, and installed skill lists that may be used to interact with other cloud services.
This may provide sources of evidence that allow reconstruction of user activities with a time zone identified by device-preference API, which is once again an amazing way to track you and what you have been doing.
They also found data that includes a URL pointing to user voice recordings stored in the cloud, making it possible to download those voice files using the utterance API. But tracking and spying are the least of your worries: this surveillance and so-called command-performance AI can do all sorts of dirty tricks to annoy you.
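To make the tracking concern concrete, here is a minimal sketch of how an investigator might reconstruct a local-time activity timeline from cloud artifacts of the kind described above. The record fields and the timezone offset are hypothetical stand-ins for illustration, not Amazon’s actual API schema.

```python
from datetime import datetime, timezone, timedelta

# Hypothetical cloud artifacts of the kind the passage describes;
# the field names here are illustrative, not Amazon's real schema.
artifacts = [
    {"type": "utterance", "utc": "2018-12-24T23:55:10", "text": "play holiday music"},
    {"type": "skill_invocation", "utc": "2018-12-25T00:02:41", "skill": "calendar"},
]
# A device-preference record supplies the user's time zone (here, US Pacific).
device_prefs = {"timezone_offset_hours": -8}

def to_local(utc_string, offset_hours):
    """Shift a naive UTC timestamp into the device's local time zone."""
    utc = datetime.fromisoformat(utc_string).replace(tzinfo=timezone.utc)
    return utc.astimezone(timezone(timedelta(hours=offset_hours)))

# Reconstruct a local-time activity timeline -- the core of the tracking concern.
timeline = sorted(
    (to_local(a["utc"], device_prefs["timezone_offset_hours"]), a["type"])
    for a in artifacts
)
for when, what in timeline:
    print(when.isoformat(), what)
```

Even this toy version shows why the device-preference timezone matters: without it, the raw UTC timestamps would place the user’s activity at the wrong local hour.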
Cynics of every age suspect their virtual assistants of eavesdropping, and not without reason. Smart speakers are yet another way for companies to keep tabs on our searches and purchases. Their microphones listen even when you’re not interacting with them because they have to be able to hear their “wake word,” the command that snaps them to attention and puts them at your service.
The speakers’ manufacturers promise that only speech that follows the wake word is archived in the cloud, and Amazon and Google, at least, make deleting those exchanges easy enough.
Nonetheless, every so often weird glitches occur, like the time Alexa recorded a family’s private conversation without their having said the wake word and emailed the recording to an acquaintance on their contacts list. Amazon explained that Alexa must have been awakened by a word that sounded like Alexa, and then misconstrued elements of the ensuing conversation as a series of commands.
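A toy sketch of threshold-based wake-word gating shows how a near-miss like this can happen. Real detectors run acoustic models on audio, not string similarity, so the scoring function and threshold below are purely illustrative assumptions.

```python
import difflib

WAKE_WORD = "alexa"
THRESHOLD = 0.7  # illustrative; real detectors tune this against acoustic models

def sounds_like_wake_word(heard: str) -> bool:
    """Toy stand-in for an acoustic wake-word detector: the microphone is
    always listening, but audio is only acted on when the similarity score
    clears the threshold."""
    score = difflib.SequenceMatcher(None, WAKE_WORD, heard.lower()).ratio()
    return score >= THRESHOLD

print(sounds_like_wake_word("alexa"))   # the intended trigger
print(sounds_like_wake_word("alexis"))  # a near-miss that can false-trigger
print(sounds_like_wake_word("kettle"))  # clearly below threshold
```

The design tradeoff is the point: set the threshold high and the device ignores its owner; set it low and ordinary conversation can wake it, which is exactly the failure mode in the Alexa story above.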
Amazon was hoping that the explanation would make people feel better.
I suppose it worked because according to a 2018 report by National Public Radio and Edison Research, 8 million Americans own three or more smart speakers, suggesting that they feel the need to always have one within earshot. By 2021, according to another research firm, Ovum, there will be almost as many voice-activated assistants on the planet as people.
It took about 30 years for mobile phones to outnumber humans. Alexa and its smart-speaker ilk may get there in less than half that time.
One reason is that Amazon and Google are pushing these devices hard, discounting them so heavily during last year’s holiday season that industry observers suspect that the companies lost money on each unit sold.
They are so desperate to get them into households that they are willing to lose money. They want full spectrum dominance and they will start with your home, office, and your car.
In the near future, everything from your lighting to your air-conditioning to your refrigerator, your coffee maker, and even your toilet could be wired to a system controlled by voice.
In the future, the office assistants will be modified into robots, and they will be used to carry out tasks by voice as well.
If you are okay with that, the next step is to use modified versions of these innocent assistants to aid in security, law enforcement, and assisting in decisions in warfare scenarios.
You heard right, Alexa may eventually have command over her own army of tin soldiers.
Talk about your dark Christmas metaphors.
In fact, the Nutcracker character in the famed story is a mechanical soldier, the product of a clockmaker and tinkerer who often gave children wonderfully unique toys that moved on their own.
In the age-old tale set during the Christmas season, the clock strikes midnight, and the Nutcracker and the other toys come to life to go to war against the Mouse King.
At first glance, this story doesn’t seem to really tie into the Christmas season other than its setting. However, for the sake of the topic at hand, animated tin soldiers fighting a war could easily be updated into a future setting where a smart computer system animates robots to launch an attack against an enemy.
You may have heard of this story; it is called The Terminator.
In the film The Terminator, a super surveillance system called Skynet becomes self-aware and sees humanity as a threat to its existence. It deploys an army of Terminators, anthropomorphic cyborgs, against humanity.
It may seem a stretch to compare an army of animated tin nutcrackers to hulking mechanical Terminators – but there is an important message to be gleaned from the comparison, and that is that Alexa wants to order the march of the robotic tin soldiers of the future.
The robotic systems now rolling out in prototype stage are far more capable, intelligent, and autonomous than we realize.
Amazon is seeking to build a global “brain” for the Pentagon called JEDI, a weapon of unprecedented surveillance and killing power, a profoundly aggressive weapon that should not be allowed to be created.
Amazon has built a vast, globally distributed data storage capacity and sophisticated Artificial Intelligence programs to propel its retail business, and now it hopes to use this capability to win a $10 billion Pentagon contract to create an artificial “brain” that goes by the project name Joint Enterprise Defense Infrastructure, or JEDI, a moniker that obviously tips its hat to the movie Star Wars.
Amazon is the betting favorite for the contract, which will go to just one bidder, in spite of protests by competitors, chief among them Microsoft and IBM. The Pentagon appears likely to select a winner for the contract in 2019.
In discussing JEDI, Pentagon officials refer to the need for efficiencies in sharing data among its military branches, with the goal of increasing the “lethality” of the US war machine.
JEDI is intended to not only improve information sharing but to dramatically increase the US military’s ability to collect and sort through huge amounts of surveillance information from many, many sources on individuals and groups – governmental and non-governmental – around the world. This will be part of a process of using Artificial Intelligence and algorithms to identify probable targets for killing.
JEDI will do what is being done now on a less coordinated scale and with a smaller volume of surveillance data, but JEDI promises to find those enemies faster – even if all it takes to be considered an enemy is exhibiting a pattern of behavior that a classified machine-learning model associates with hostile activity.
Call it death by big data.
In addition, JEDI will very likely be at the heart of managing the first generation of operational US robotic land, sea, air, and space weaponry.
Decisions on whom to kill, and when, based on human judgment and JEDI’s massive, fallible Artificial Intelligence will almost certainly be significantly guided by the concept of pre-emptive killing; that is, assassination or a larger attack based on a suspected threat.
Thus, JEDI will generate new levels of global fear, violence, repression, and conformity in the service of US and Western corporations, in line with current US policies.
It appears very possible that if Amazon gets the Pentagon contract, the personal profiles of its customers around the world, developed to stimulate retail sales, will become, either individually or as aggregated, instruments of these customers’ intimidation and control. In a real way, the acquisitive impulses of hundreds of millions of people may well become the stuff of their imprisonment and, in some cases, their deaths.
Further, JEDI as a global presence represents the creation of a weapon that dramatically ups the level of global military rivalry and ensures more global human conflict. JEDI demonstrates a new level of US determination for global domination that can only be described as disastrous.
The US public’s acceptance of rampant militarism, of which JEDI is a manifestation, is clearly driven by fear of “terrorism” and racial, ethnic and religious differences.
The difficulties for the existing laws of war that this robotics revolution will provoke are barely beginning to be understood. Technology generally evolves much more quickly than the laws of war.
According to numerous reports, the JEDI project stems from the conclusion of Pentagon officials that the US military needs a “brain” operating on cloud technology. The Pentagon’s Request for Proposals to create JEDI says the system must be able to provide information simultaneously to “all Military Departments, Army, Navy, Air Force, Department of Defense Components, the Defense intelligence community, and the Office of the Secretary of Defense; the US Coast Guard; the Intelligence Community; international allies and partners; and contractors supporting defense business and mission operations.”
The Pentagon leaders have decided that the Pentagon is way behind in developing cloud capability. So, rather than having the military build its own “brain,” the Pentagon will pay to use and to expand the power of an already existent corporate cloud.
To understand the implications of JEDI, we must realize that the information being gathered and sorted will inevitably be used for the targeting and killing of not only opposing government-based military forces but also nongovernmental individuals and groups who are viewed as political or potential military threats by the US. And this information will be gathered on a global scale.
JEDI’s brain will almost certainly be fed a set of criteria commonly used by governments for identifying potential “threats,” such as political ideology, intellectual and religious pursuits, ethnic and racial origin, sexual orientation, and social relationships.
We will not know specifically all of the criteria that will be used to define an “enemy” of the US, nor do we know specifically who decides on these criteria. This type of information-gathering and target identification is already a Pentagon activity used to target US drone assassinations.
JEDI would obviously “benefit” from having access to the millions of personal profiles developed by Amazon and other major cloud operators and any other personal information it can capture.
The transfer of a massive amount of military information into a privately owned and built cloud, as will happen with the creation of JEDI, raises the possibility that the owner or owners of that cloud will, because of their knowledge of the cloud’s structure, capabilities, and content, become more powerful than military and elected officials. Such a transfer has never happened before, and the RFP for JEDI does not speak to the question of how JEDI will remain in governmental control.
The degree to which the JEDI cloud provider may be willing to assert control over US politics and military decisions in the absence of specific controls may depend solely on that provider’s ambitions.
We all know of Amazon owner Jeff Bezos’s ambitions.
Jeff Bezos is a person who has shown a determination to be a very big player in Washington. He owns The Washington Post and has become a formidable lobbying force. Amazon has increased spending on lobbying from less than $1 million in 2000 to $13 million in 2017; so far, $10.6 million has been reported in 2018.
This is what we get when we trade our humanity for convenience; it all begins when an innocent office assistant grows into a tool for war.
As military robots gain more and more autonomy, the ethical questions involved will become even more complex. The U.S. military bends over backwards to figure out when it is appropriate to engage the enemy and how to limit civilian casualties. Autonomous robots could, in theory, follow the rules of engagement; they could be programmed with a list of criteria for determining appropriate targets and when shooting is permissible.
Even if a robot has software that follows all the various rules of engagement, and even if it were somehow absolutely free of software bugs and hardware failures (a big assumption), the very question of figuring out who an enemy is in the first place—that is, whether a target should even be considered for the list of screening questions—is extremely complicated in modern war.
It essentially is a judgment call. It becomes further complicated as the enemy adapts, changes his conduct, and even hides among civilians. If an enemy is hiding behind a child, is it okay to shoot or not? Or what if an enemy is plotting an attack but has not yet carried it out? Politicians, pundits, and lawyers can fill pages arguing these points. It is unreasonable to expect robots to find them any easier.
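The passage’s point can be made concrete with a deliberately naive sketch of “rules of engagement as code.” The rule names and target fields below are invented for illustration; the takeaway is that the hard cases are exactly the ones a rule list cannot settle and must hand back to a human.

```python
# Deliberately naive "rules of engagement as code": each rule is a named
# predicate over a hypothetical target record.
RULES = [
    ("confirmed hostile", lambda t: t["hostile_confirmed"]),
    ("no civilians in blast radius", lambda t: t["civilians_nearby"] == 0),
    ("attack already underway", lambda t: t["attack_in_progress"]),
]

def engagement_decision(target: dict) -> str:
    """Return 'engage' only when every rule passes; otherwise name the
    failed rules and defer -- the judgment call the machine cannot make."""
    failed = [name for name, rule in RULES if not rule(target)]
    if not failed:
        return "engage"
    return "defer to human: " + "; ".join(failed)

# The scenario from the text: a confirmed enemy hiding behind a child,
# with the plotted attack not yet carried out.
print(engagement_decision({
    "hostile_confirmed": True,
    "civilians_nearby": 1,
    "attack_in_progress": False,
}))
```

Note what the sketch cannot express: whether one civilian shield outweighs a confirmed hostile, or whether a plot that has not yet been carried out counts as an attack. Those are the pages of argument the text describes, and no rule list resolves them.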
This is the main reason why military lawyers are so concerned about robots being armed and autonomous. As long as “man is in the loop,” traditional accountability can be ensured. Breaking this restriction opens up all sorts of new and seemingly irresolvable legal questions about accountability.
Most people say that all of these decisions should be left to the technocrats. However, everyone should be aware of what is happening.
It shouldn’t just be up to the scientists to make decisions, but everyone from theologians who helped create the first laws of war to the human rights and arms control communities must start looking at where this technological revolution is taking both our weapons and laws.
These discussions and debates also need to be global, as the issues of robotics cross national lines. Forty-three countries now have military robotics programs. Over time, some sort of consensus might emerge: if not a ban on all autonomous robots with lethal weapons, then perhaps a ban on certain types, such as robots not made of metal, which would be hard to detect and thus of most benefit to terrorist groups.
With JEDI we will have a weapon that will dramatically increase the gathering and managing of information to identify people to target for killing and to increase the speed of killing.
It gives me the creeps to think that all it takes is someone saying “Alexa, drop a bomb” and the whole world meets its judgment day.