AI Control of the US Military


From Greg Reese

In the military, a “kill chain” refers to the sequential process of finding, fixing, tracking, targeting, engaging, and assessing a target. And the US military war machine has found that humans are not as willing to kill as it would like them to be.

After World War II, studies conducted by Brigadier General S.L.A. Marshall showed that only 20% of U.S. infantrymen fired their weapons at the enemy, even when under direct threat. He attributed this to an innate human resistance to killing. He wrote in his book, Men Against Fire, that “the average and healthy individual has such an inner and usually unrealized resistance towards killing a fellow man that…at the vital point he becomes a conscientious objector.” This led to changes in U.S. military training, such as human-shaped silhouette targets and the dehumanization of the enemy.

In 1996, On Killing: The Psychological Cost of Learning to Kill in War and Society by Lt. Col. Dave Grossman showed that increasing the fire rate of US infantrymen came with psychological costs, in the guilt and trauma of killing.

The National Longitudinal Study of Adolescent Health showed in 2011 that soldiers who believed they had killed someone in combat were at higher risk of PTSD, depression, and suicidal thoughts.

It is for these reasons that the IDF, with the help of the US military and Palantir, has been utilizing Artificial Intelligence on the battlefield. Destroying an entire building to kill one person may seem extreme to a human, but not to AI. It is admitted that the AI targeting system murders innocent civilians 10% of the time, and AI is not bothered one bit.

The Golden Dome missile-defense shield that President Trump announced will rely upon AI-driven tracking and interceptors. And lethal autonomous weapons are next.

The Bullfrog M2 is an autonomous .50 caliber machine gun developed for the US military. It detects, tracks, identifies, and acquires targets autonomously.

Last year, DARPA released footage of Artificial Intelligence autonomously flying the X-62A, a modified F-16, in a dogfight against a human-piloted F-16.

Known as the founder of Oculus VR, Palmer Luckey is the latest harmless-looking front man for the military-industrial complex. In 2017, Luckey founded Anduril Industries, a military technology company focused on autonomous weapons systems.

The US Department of Defense has the goal of “always having robots, not soldiers, make first contact with the enemy.” Robotic systems are being developed to perform everything from surveillance to killing.

With an army of autonomous slaughterbots, there is no longer any concern for ethics. If a government wanted to unleash them against its own people, they would have no problem following those orders.
