MONOLOGUE WRITTEN BY CLYDE LEWIS
There have been stories that have been called ‘conspiracy paranoia’ regarding the stockpiling and deployment of new technological terror weapons, and other tech weapons that can and will be used against the infrastructure of many countries including the United States.
Everything from directed-energy weapons to lightning bolts from above has always been dismissed as rumor, but now we have learned that robotic death drones are being used in warfare, and with the latest debacle in Afghanistan, the subject of using drones and robots to do the job of the combat soldier is back in the news. This time the stories and events are becoming far bolder and more chilling, as governments treat the collateral damage of such attacks with a pause and a yawn.
Reporters can ask questions about the ruthless attacks against civilians, but unfortunately the biggest struggle for any reporter is to report these facts and then watch the American people simply ignore them out of political convenience, instead of heeding what is all but a guarantee of world war: massive technical breakthroughs that can not only kill but render an area helpless if its infrastructure grids are hacked and made useless.
What happened in Afghanistan has now been discussed as a war crime, with President Biden being called a war criminal.
The New York Times reported that an innocent family was killed when the US military conducted what officials called a “righteous strike” on Aug. 29 against a vehicle that American officials thought was an ISIS bomb posing an imminent threat to thousands of people at the Kabul airport.
The drone strike killed innocent civilians, and the mainstream media in the United States has decidedly moved on, treating the killing of innocents under Biden’s watch as a non-story, unworthy of further investigation.
The Biden administration lied about who it killed with its drone strike. They had no idea who they hit. The media mindlessly repeated the false claim that they killed “terrorists” when, in fact, they just killed innocent people.
The victim was an aid worker returning home to his family. Many people were skeptical that Biden had the capability to effectively retaliate against ISIS, which claimed responsibility for the Kabul bombing that killed 13 American service members.
An entire family in Afghanistan was extinguished to prevent Joe Biden from having to endure a news cycle accusing him of weakness in the face of the attack at Kabul Airport, accompanied by lies from his Pentagon/CIA and his media about who was killed.
This attack will unfortunately be a continuation of the use of robots and autonomous drones that are set to take out a target based on programming rather than in-the-moment human judgment.
Meanwhile, another story was reported about the use of robots to kill perceived enemies — this time from Israel.
Reports by Iran’s state media that their top nuclear scientist had been gunned down by a remote-controlled robotic assassin were initially viewed with skepticism. However, the New York Times reported Saturday that Mohsen Fakhrizadeh was indeed assassinated last year by a killer robot.
The Times revealed new details of the killing gathered from interviews with Israeli, American, and Iranian officials, including “two intelligence officials familiar with the details of the planning and execution of the operation.”
According to the report, Israel’s Mossad carried out the operation using a special model of the Belgian-made FN MAG 7.62mm machine gun attached to an advanced robotic apparatus.
The high-tech killing machine, composed of approximately one ton of equipment including artificial intelligence and multiple-camera eyes, was smuggled into Iran piece by piece.
It was designed to fit into the bed of a Nissan Zamyad pickup truck. The truck was packed with enough explosives to self-destruct after the mission.
The weapon was operated via satellite from an undisclosed location over 1,000 miles away.
On November 27 of last year, Fakhrizadeh and his wife were traveling in their Nissan Teana sedan from their vacation home on the Caspian Sea to their country house in Absard, east of Tehran, accompanied by a security team in three additional vehicles.
The convoy first passed a fake disabled car about three quarters of a mile from the killer robot. The car contained a hidden camera which allowed the Mossad to identify Fakhrizadeh in the driver’s seat.
As Fakhrizadeh approached a blue Nissan Zamyad pickup truck parked on the side of the road, machine gun fire ripped into his vehicle.
The scientist stopped his car, and another burst of gunfire hit the windshield, wounding him in the shoulder.
Fakhrizadeh then exited the vehicle and took cover behind the open door, where three more bullets tore into his spine. A confused bodyguard looked around for the shooter while Fakhrizadeh’s wife comforted her dying husband.
Then the pickup truck exploded.
According to the report, the explosion was the only part of the operation that did not go as planned. Instead of completely destroying the killer robot, the bomb hurled most of the system into the air.
The robot fell to the ground, damaged beyond repair but still identifiable.
Science fiction writer Isaac Asimov wrote incessantly about robots, even creating the Three Laws of Robotics in 1942, still referenced today by almost any science fiction story dedicated to the topic.
The first law is that a robot shall not harm a human, or by inaction allow a human to come to harm. The second law is that a robot shall obey any instruction given to it by a human, except where that would conflict with the first law. The third law is that a robot shall protect its own existence, so long as doing so does not conflict with the first two laws.
The Three Laws are ingrained in our collective psyche as known facts about robots. Yet, like the word “robot” itself, they are a fiction created by an ingenious science fiction writer. Today’s rudimentary killing machines are bound by no such rules; their AI skirts them entirely in order to achieve the mission, and wiping out humans has become their standard purpose. That, I believe, is a terrifying reality in our future.
Movies like RoboCop, I, Robot and the Terminator franchise treat armed robots as a novelty. However, as with any novelty, there is always the moment when it wears off and the reality of sharing our world with robots becomes all too common.
Advanced technologies used by the military industrial complex have always been depicted in science fiction as ruthless and amoral, but the first robot ever to kill someone was an industrial robot that helped manufacture cars. It should be noted this was an accident, not a murder.
On January 25, 1979, Robert Williams, a 25-year-old employee of Ford, became the first human to die at the hands of a robot. The tragic events took place when Williams, who worked at a car plant in Michigan, tried to retrieve some stored parts and was struck by the robot arm and killed.
Eventually, a court came to the same conclusion that Williams’s family had argued – that the robot did not have a sufficient safety mechanism – and hit the manufacturer with $10 million in damages.
Then in 2015, another robot at a Volkswagen plant in Germany grabbed a worker and crushed his body against a wall.
Robots of course have no intent — they just do their jobs and when they kill someone it is because of human negligence.
This is why the prospect of autonomous killer drones is a topic worth discussing because they act with no morality– they carry out programmed orders and innocent people are killed in the process.
The New York Times reported that U.S. military officials have insisted since the last American troops withdrew from Afghanistan last month that they would be able to detect and attack Islamic State or Al Qaeda threats in the country from afar.
But an errant drone strike that killed 10 civilians, including seven children, in Kabul on Aug. 29 calls into question the reliability of the intelligence that will be used to conduct the operations.
U.S. commanders concede that the missions will be more difficult without a military presence in the country. But new details about the drone strike, which the Pentagon initially said was necessary to prevent an attack on American troops, show the limitations of such counterterrorism missions and cast doubt on the accuracy of these robotic purveyors of mechanized compliance.
John Sifton, the Asia advocacy director at Human Rights Watch, said in an email about the incident: “The U.S. has a terrible record in this regard, and after decades of failed accountability, in the context of the end of the war in Afghanistan, the U.S. should acknowledge that their processes have failed, and that vital reforms and more independent outside scrutiny is vital.”
Thirty-six hours before the strike, intelligence analysts and drone operators at a base in Qatar were sifting through more than 60 specific pieces of intelligence — some conflicting, some mutually reinforcing — related to an imminent ISIS attack, according to Gen. Kenneth F. McKenzie Jr., the commander of the military’s Central Command.
The group, called the Over-the-Horizon Strike Cell, was created in early July to track and disrupt plots in Afghanistan by Al Qaeda or the Islamic State that threatened the U.S. homeland. After the sudden Taliban takeover of the country, the cell began focusing on ISIS threats against the thousands of American troops at Hamid Karzai International Airport in Kabul who were helping tens of thousands of Afghans flee the country.
On the morning of Aug. 29, the military was on high alert, looking out for a white Toyota Corolla as six Reaper drones monitored what General McKenzie called a suspected Islamic State compound, or safe house, both believed to be linked to the plot.
The strike cell commander kept in close contact with Maj. Gen. Christopher Donahue, the head of the 82nd Airborne Division and the ground force commander at the airport. General McKenzie was also kept apprised of the developments during the day.
Just before 9 o’clock, Zemari Ahmadi, a longtime worker for a U.S. aid group, wheeled his white 1996 Toyota Corolla in front of the safe house the Americans were watching. Two men got out of the car, met with another man at the safe house, took a bag from him and returned to the car. Mr. Ahmadi drove off.
Eight hours later, a Hellfire missile slammed into the sedan, killing Mr. Ahmadi and nine other people in what American officials now acknowledge was a tragic case of mistaken identity.
For a short time it was reported that this was a major intelligence failure and that the buck stopped with the president, Joe Biden.
For me, it was a case of déjà vu, as President Barack Obama was also drone-happy, using autonomous robot drones to kill his enemies.
The 542 drone strikes that Obama authorized killed an estimated 3,797 people, including 324 civilians.
As he reportedly told senior aides in 2011: “Turns out I’m really good at killing people. Didn’t know that was going to be a strong suit of mine.”
Civilian casualties are already a huge problem with drone strikes, which by some estimates kill their intended target only 10 percent of the time. Drones, an early form of killer robot, offer minimal sensory input for the operator, making it difficult to distinguish combatants from non-combatants.
Soldiers controlling infantry-bots from afar will have even less visibility, since the machines are stuck at ground level, and the operators’ physical distance from the action means shooting first and asking questions later becomes an act no more significant than pulling the trigger in a first-person-shooter video game.
Any US military lives saved by using robot troops will thus be more than compensated for by a spike in civilian casualties on the other side. This will be ignored by the media, as “collateral damage” often is—but many Americans will not be informed about the dangers ahead as many of the technologies we use will be weaponized, used for advanced surveillance and biometric identification.
The Pentagon is in the Death Machine business and many military and tech contractors are selling advanced A.I. weapons to the highest bidder.
People may justify the use of killer drones, but I have spoken out against them since the Obama administration, because anyone can become a target at any time, and if you are not in the good graces of the deep state you can be turned into a pretty pink mist at a park or even at a party.
It is not just the hellfire drones I am talking about either.
That drone – you know, the one you see advertised for great aerial photography – can be made to fly without a pilot, carrying a weapon, and be thrust headfirst into a target.
Keep in mind that weaponized versions of this technology are being made to kill you. These technologies are now trending because of the latest incident in the Middle East and it appears that both sides will be using highly advanced Artificial Intelligence and robotics that will change the way wars will be waged.
Drones can be beneficial for simple tasks, but even then they can become a nuisance, interfering with air traffic and causing near misses and aircraft collisions.
A helicopter belonging to U.S. Customs and Border Protection, or CBP, encountered what was described as a “highly modified drone” over Tucson, Arizona earlier this year as it flew through controlled airspace near Davis-Monthan Air Force Base.
A Tucson Police Department Air Support helicopter was called in to aid the CBP crew, at which point both aircraft pursued the remarkable, seemingly “thinking” drone as it easily evaded them and eventually outran them to the northwest.
According to the website “The War Zone” the aircrew was able to identify the craft as a drone with quadcopter-like rotors.
The aircrews claimed the craft that they chased through the Tucson skies was “not like any other” unmanned aerial vehicle they had previously encountered. In particular, the speed, awareness, maneuverability, altitude performance, and endurance of the UAV made some of them wonder if, in fact, the craft pursued by the law enforcement helicopters was a drone at all.
Again, the speed and agility of these drones are reminiscent of reports of the aerial Tic Tac-like UAPs that have eluded our military.
However, it was later confirmed that the crew encountered a “quadcopter-like” small unmanned aerial system and was able to observe “propellers reflecting the city light” on the drone.
That is about as advanced as most “harmless” drones get.
But autonomous killer robots – that is, bots that select and kill their own targets using Artificial Intelligence – are the logical endpoint of the mission to dehumanize war completely, filling the ranks with soldiers that won’t ask questions, won’t talk back, and won’t hesitate to shoot whoever they’re told to shoot.
The pitfalls to such a technology are obvious. If AI can’t be trained to distinguish between sarcasm and normal speech, or between a convicted felon and a congressman, how can it be trusted to reliably distinguish between civilian and soldier? Or even between friend and foe?
A recent Pentagon study (focusing on robotically-enhanced humans but applicable to robotic soldiers as well) warns that it’s better to “anticipate” and “prepare” for the impact of these technologies by crafting a regulatory framework in advance than to hastily impose one later, presumably in reaction to some catastrophic robot mishap.
In order to have their hands free to develop the technology as they see fit, military leaders should make an effort to “reverse the negative cultural narratives of enhancement technologies,” lest the dystopian narratives civilians carry in their heads spoil all the fun.
Meanwhile, the Campaign to Stop Killer Robots, a coalition of anti-war groups, scientists, academics, and politicians who would rather not take a “wait and see” approach to a technology that could destroy the human race, is calling on the United Nations to adopt an international ban on autonomous killing machines.
According to the Campaign to Stop Killer Robots, the stage is being set for a potentially destabilizing “robotic arms race” that could see countries worldwide working to gain the upper-hand in building their autonomous warfighting capabilities.
The militaries of the U.S., Russia, China, Israel, South Korea, and the United Kingdom have already developed advanced systems that enjoy significant autonomy in their ability to select and attack targets, the campaign notes.
And while countries across the Global South have urged the UN to impose a ban on killer robots, states who possess these technologies have opposed such a ban at every turn — signaling that they are unwilling to let go of their revolutionary new implements of death.
The first human soldier to die by a robot comrade’s “friendly fire” will no doubt be framed as an accident, but who will be held responsible in the event of a full-on robot mutiny?
This would make The Matrix and the Terminator stories accidental prophecies.