
Australia takes new step towards AI regulation, EU inches closer to AI Act

Australia has taken another step towards regulating artificial intelligence, establishing a new AI Expert Group on Wednesday that brings together scientists alongside legal, ethics and governance experts.

The main task of the group will be to advise the Department of Industry, Science and Resources on setting up guardrails for high-risk AI systems and ensuring AI systems are tested, transparent and accountable.

“It’s imperative sophisticated models underpinning high-risk AI systems are transparent and well tested,” Minister for Industry and Science Ed Husic said in a statement. “This Artificial Intelligence Expert Group brings the right mix of skills to steer the formation of mandatory guardrails for high-risk AI settings.”

The founding of the group marks Canberra’s latest step towards potentially introducing mandatory guardrails for AI applications.

The initiative was first announced in September by Australian Finance and Public Service Minister Katy Gallagher. It was followed in January by the publication of the government’s interim response to the Safe and Responsible AI in Australia consultation. Aside from establishing the AI Expert Group, the response committed the government to working with industry on a voluntary AI Safety Standard and on voluntary labeling and watermarking of AI-generated materials.

The AI Expert Group began meeting in early February and is expected to work until 30 June 2024. The Australian government said it is currently considering longer-term responses to AI regulation.

Among the 12 appointees to the group are Commonwealth Scientific and Industrial Research Organisation (CSIRO) Chief Scientist Bronwyn Fox, Chair of Australia’s national AI standards committee Aurélie Jacquet, Director of the Australian Institute for Machine Learning at the University of Adelaide Angus Lang and Jeannie Paterson, co-director of the Centre for AI and Digital Ethics.

EU AI Act takes another step towards adoption

Australia still has catching up to do with the European Union, where the AI Act moved another step closer to reality this week. On Tuesday, members of the European Parliament (MEPs) endorsed the provisional agreement on the Act at committee level.

The deal on the technical details of the AI Act, reached at the beginning of February, won the approval of the EU Internal Market and Civil Liberties Committees by 71 votes to eight, with seven abstentions. The move opens the door to formal adoption by the European Parliament and final endorsement by the Council. These final steps are expected to be completed in April, and the legislation will become fully applicable 24 months after its entry into force.

According to the agreement, certain AI applications that threaten citizens’ rights will be banned, including biometric categorization systems based on sensitive characteristics and the untargeted scraping of facial images from the internet or CCTV footage.

The AI Act, however, leaves the door open for some AI applications that could affect civil rights, drawing criticism from civil rights groups and some lawmakers. Among them is emotion recognition, a controversial technology that uses face biometrics to assess a person’s emotional state. The practice has been banned in workplaces and schools, but critics argue that the AI Act leaves a pathway for its use in law enforcement, particularly in migration control.

The AI Act also secured other exemptions for law enforcement: while the use of biometric identification systems by law enforcement is prohibited in principle, there are exceptions for certain situations. Real-time facial recognition systems, for instance, can be employed in cases of terrorist attacks or searches for missing persons, subject to court authorization.

The landmark legislation also promises to establish regulatory sandboxes and real-world testing at the national level, allowing companies to train AI before placing it on the market.

