AI: Dressed to Kill

AI weapons will save the Global Military Complex billions of dollars. But that does not mean they will stop making hundreds of billions selling traditional arms.

Photo: In the Donbas region of Ukraine, a Ukrainian soldier prepares a drone to carry a hand grenade for an attack in March 2023. Credit: Aris Messinis/AFP/Getty


24 April 2024 | James Porteous | Clipper Media News

It has been my contention that the future will consist of two ‘streams’ of AI: the play-acting one we will use as we continue to amuse ourselves to death, and the other, more viable, version that is not prone to ‘hallucinations.’ Or, as we used to call it, lies.

Think about the history of the internet. It started out as a web of information, as vast and as thorough as anything anyone had ever seen: from the USENET groups that began in 1980 to an abundance of music, film, the arts, MySpace, personal webpages, the world wide web, newspapers, torrents…

Read Also: Report Sounds Alarm Over Growing Role of Big Tech in US Military-Industrial Complex. The five largest military contracts to major tech firms between 2018 and 2022 “had contract ceilings totaling at least $53 billion combined.”

We watched as it morphed into crappy Facebook and other corporate entities before finally being overtaken by money, advertisers and the US government.

Today, no one gives a shit. It was fun while it lasted. (And fyi, if you don’t want those corporations and governments tracking your every move, comment and hissy fit, and then charging you with asinine offences retroactively, STOP USING SOCIAL MEDIA.)

So the point is that this is likely the same trajectory we are seeing, and will see, with AI. For example, we now know, if we dare to look, that the two current conflicts are being used as proving grounds for both traditional and AI weapons.

Articles such as ‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza and ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets document, rather nonchalantly, its successful use in one current conflict.

There are many other examples, along with an equal number talking about the military experiments taking place in Ukraine.

One such article is posted below. In Lethal AI weapons are here: how can we control them? we learn that ‘the development of lethal autonomous weapons (LAWs), including AI-equipped drones, is on the rise. The US Department of Defense, for example, has earmarked US$1 billion so far for its Replicator programme, which aims to build a fleet of small, weaponized autonomous vehicles.’

You see, they already have an acronym for something we did not even know existed!

And lo and behold: warfare is a relatively simple application for AI. “The technical capability for a system to find a human being and kill them is much easier than to develop a self-driving car. It’s a graduate-student project,” says Stuart Russell, a computer scientist at the University of California, Berkeley, and a prominent campaigner against AI weapons. He helped to produce a viral 2017 video called Slaughterbots that highlighted the possible risks.

This is clearly a win-win proposition. Not only will the Global Military Complex save billions using AI weapons, but they can (and will) continue to make hundreds of billions of dollars in profit manufacturing and selling (to themselves) more traditional weapons.

Call this a win-win-win proposition.

James Porteous | Clipper Media News


The US Air Force’s X-62A VISTA aircraft has been used to test the ability of autonomous agents to carry out advanced aerial manoeuvres. Credit: U.S. Air Force photo/Kyle Brasier

Lethal AI weapons are here: how can we control them?

Autonomous weapons guided by artificial intelligence are already in use. Researchers, legal experts and ethicists are struggling with what should be allowed on the battlefield.

23 April 2024 | By David Adam | Nature

In the conflict between Russia and Ukraine, video footage has shown drones penetrating deep into Russian territory, more than 1,000 kilometres from the border, and destroying oil and gas infrastructure. It’s likely, experts say, that artificial intelligence (AI) is helping to direct the drones to their targets. For such weapons, no person needs to hold the trigger or make the final decision to detonate.

The development of lethal autonomous weapons (LAWs), including AI-equipped drones, is on the rise. The US Department of Defense, for example, has earmarked US$1 billion so far for its Replicator programme, which aims to build a fleet of small, weaponized autonomous vehicles.

Experimental submarines, tanks and ships have been made that use AI to pilot themselves and shoot. Commercially available drones can use AI image recognition to zero in on targets and blow them up. LAWs do not need AI to operate, but the technology adds speed, specificity and the ability to evade defences. Some observers fear a future in which swarms of cheap AI drones could be dispatched by any faction to take out a specific person, using facial recognition.

Read the full article at Nature.

