Thinking hard on AI 


News: Recently, two Indian researchers wrote an unpublished paper “Artificial Intelligence and the armed forces: Legal and ethical concerns”.

This has led to a resurgence of the debate on Artificial Intelligence (AI)-based arms and weapon systems.

In 1950, Alan Turing, in a paper titled “Computing Machinery and Intelligence”, considered the question: “Can machines think?”. Later, in 1956, John McCarthy coined the term artificial intelligence.
What is Artificial Intelligence? 

AI is a field of computer science. It allows computers and machines to perform intelligent tasks by mimicking human behaviour and actions. AI can be broadly classified into two types:

(1) Narrow AI: It performs specific tasks such as music and shopping recommendations or medical diagnosis. For example, music-streaming services, speech recognition, and personal assistants such as Siri or Alexa come under this category, and

(2) General AI: It is a system that exhibits intelligent behaviour at least as advanced as a person's, across the full range of cognitive tasks. General AI is still believed to be decades away.

What are the advantages of using AI in military operations? 

AI-based arms and weapon systems can help obtain tactical advantages in military operations. During a war, big data analytics can support human decision-making.

Development of autonomous weapon systems: Such systems derive conclusions from gathered information and pre-programmed parameters and models. Thus, they independently select, engage and attack (i.e., use force against, neutralise, damage or destroy) targets without human intervention.

Usage in remote areas where deploying soldiers is difficult.

Reduction of casualties among soldiers and non-combatants. For India, AI-based weapon systems can help tackle hostile neighbours and the peculiar problem of Naxalism.

What are the recent developments in the area? 

At present, global powers like China, Russia, the US, and India are competing to develop AI-based weapon systems. For example, the US is developing intelligent weapon systems.

In the case of India, an AI task force (AITF) was set up in 2017. It was supposed to “explore possibilities to leverage AI for development across various fields”.

Further, in 2018, the Indian Ministry of Defence (MoD) set up a task force to study the use and feasibility of AI in India's military.

Israel has developed the Harpy drone, an autonomous weapon. It flies to a particular area to hunt for specific targets and then destroys them with a high-explosive warhead; the system is nicknamed “Fire and Forget”.

What are the issues in AI-based weapon systems?

There is no formal definition of AI, given that the word “intelligence” is, in itself, difficult to define.

Threats due to “Lethal Autonomous Weapons Systems” (LAWS): Also known as “killer robots”, they are designed not to require any human involvement once activated. They would effectively take the decision to kill or engage targets on their own. Such systems could pose significant security threats as well as legal and ethical challenges.

Autonomous weapon systems can be used by countries for warmongering, and their use can cause civilian casualties and collateral damage.

Way Forward 

“Intelligence” should be clearly defined before attempting its regulation.  

Various researchers have warned about the dangers of an AI arms race and called for a “ban on offensive autonomous weapons beyond meaningful human control”. This was advocated at the International Joint Conference on Artificial Intelligence (IJCAI) held in 2015.

India advocates that: (1) AI-based weapon systems should meet the standards of international humanitarian law; (2) there should be systemic controls on the use of AI-based weapon systems in international armed conflict, which would prevent the widening of the technology gap between countries; and (3) AI-weapon use should be insulated from the influence of public conscience.

Countries should avoid deploying Lethal Autonomous Weapon Systems (LAWS) to forestall a plethora of legal and ethical issues.

Source: The post is based on an article “Thinking hard on AI” published in the Business Standard on 31st Mar 22. 
