G7 nations agree on ‘risk-based’ artificial intelligence regulation

Source: This post is based on the article "G7 nations agree on 'risk-based' artificial intelligence regulation" published in The Hindu on 1st May 2023.

What is the News?

The G7 Digital and Tech Ministers meeting was held in Takasaki, Japan.


What are the key highlights from the G7 Digital and Tech Ministers meeting?

G7 members have reaffirmed their commitment to the adoption of “risk-based” regulations for artificial intelligence (AI).

The development is significant because developed nations have repeatedly emphasized the need for regulation, given the growing popularity of AI platforms such as ChatGPT.

Italy, a G7 member, had recently banned ChatGPT over privacy concerns, though the ban has since been lifted.

Moreover, lawmakers in the European Union (EU) are planning to bring out a revised version of the upcoming AI Act. The revised draft incorporates clauses aimed at safeguarding copyright with respect to generative AI.

What are the concerns with AI-based chatbot platforms like ChatGPT?

Security of chatbot systems: Chatbots are connected to the internet, which makes them vulnerable to hacking and cyberattacks. If a chatbot system is hacked, personal information can be stolen, and the chatbot itself can be used to spread malware or launch further attacks.

Lack of transparency around how chatbot data is collected, stored, and accessed: Many chatbot developers do not clearly explain how they collect and use user data, leaving users in the dark about how their personal information is handled. This lack of transparency can breed mistrust of chatbots and reluctance to use them.

Sharing of data with third parties: Chatbot data may be shared with third parties without the user's knowledge or consent. This could lead to the data being used for targeted advertising or other purposes the user may not be comfortable with.
