
Robots on the Battlefield: AI Arms Race Escalates in Ukraine


A recent Time cover story on the “First AI War” unfolding in Ukraine has become a hot topic. The article centers on how giant tech companies are using Ukraine as a testing ground for AI warfare.

At the same time, a prediction by Geoffrey Hinton, professor emeritus at the University of Toronto and widely known as the “godfather of AI,” is also drawing attention. He stated, “I foresee the emergence of autonomous robot weapons that can kill humans within the next decade.”

The war between Russia and Ukraine is reportedly tilting in Russia’s favor. In a desperate situation, Ukraine did not refuse outside help, and the first company it turned to was Palantir Technologies, led by CEO Alex Karp.

Palantir is known for providing data analysis software to the U.S. Immigration and Customs Enforcement, the Federal Bureau of Investigation, the Pentagon, and intelligence agencies abroad, and it has earned the title of 21st-century AI arms dealer.

It is reported that Palantir has been involved in the Russia-Ukraine war in an unprecedented manner.


Palantir’s software, built around AI, analyzes satellite images, open-source data, drone footage, and reports collected on the ground to present commanders with military options, and it is said to influence the majority of target selections in Ukraine.

The same technology also aids mine clearance in Ukraine, one of the most heavily mined countries in the world. In collaboration with Palantir, the Ukrainian Ministry of Defense is developing a model to prioritize areas for demining.

Other companies are also assisting Ukraine. Clearview AI, described as Ukraine’s secret weapon, has reportedly been used to identify more than 230,000 Russian soldiers involved in the invasion, an effort seen as building evidence to support war crime allegations.


Clearview reportedly identifies hundreds of Russians entering Ukraine each day, and the company is said to have approached Ukraine first, viewing the war as a testing ground for its technology.

Clearview has been deemed illegal in Greece, Italy, France, and Australia over privacy violations, underscoring concerns about its potential for misuse in criminal activity.

The company, however, points to public-interest uses such as tracking child abusers, identifying the rioters who attacked the U.S. Capitol, and rescuing victims of human trafficking. In Ukraine, the technology is also used to help families confirm the identities of soldiers fighting in the war.

The war between Russia and Ukraine, described as the first inter-state conflict of the 21st century, is also the first to be facilitated by satellite internet. Companies are supporting Ukraine with software and a range of AI technologies, creating a digital battlefield and an electronic kill chain and marking the advent of a new era of AI warfare.

Palantir’s software, combined with the ubiquitous unmanned sensors deployed across the battlefield, effectively forms a working kill chain.

Drones stand out as a defining feature of this war. Once Palantir’s software identifies a target, armed drones attack the Russian target, damage is assessed, and the results are fed back into the system, a cycle that represents an evolution in warfare.

Using drones in war has the advantage of sharply reducing the manpower and resources required, and some suggest that large-scale ground battles could eventually be replaced by drone warfare. The Ukraine war is seen as having opened the door to drone warfare by bringing commercial AI and drones onto the battlefield.

More and more companies are entering the drone warfare space, and AI firms are developing drones that can lock onto and destroy targets even under electronic jamming, raising the risks of unmanned warfare.

In most countries, the testing needed to develop such drones, particularly battlefield drones, is banned at the national level because of the dangers involved.

Experts view the current war between Russia and Ukraine as a testing ground for developing world-class systems: companies get to deploy their products in actual combat and upgrade them based on the information collected.

Russia is also reported to be developing battlefield drones in response to Ukraine’s drone strategy.

Critics say the AI software used in the Russia-Ukraine conflict serves less to save lives than as an experiment for testing and upgrades, with the focus on building systems for more effective AI wars in the future.

Hinton strongly criticizes this situation. He argues that we must seriously consider the risks of AI combat robots as lethal drones are being commercialized and used in actual combat.

He further argues for stricter regulations on AI robot weapons, adding, “AI will surpass humans, and AIs will compete for resources such as data centers and evolve through competition.”

Hinton said, “Just as chemical weapons were banned worldwide after World War I, regulations will soon be needed for AI robotic weapons.” He stressed, however, that the timing of such regulation matters.

He argued that if regulation arrives only after the devastation caused by AI weapons becomes apparent, it will amount to damage control after countless victims have already been lost, and he strongly advocated pushing ahead with regulation now.
