Report Uncovers AI 'Kill List' Errors and Civilian Toll in Gaza Strikes
April 3, 2024
An investigation by Israeli outlet +972 Magazine exposes high error rates in the military's AI-based 'Lavender' program, used to identify bombing targets in Gaza.
The Lavender system has reportedly been used to compile a 'kill list' of at least 37,000 individuals in Gaza, with minimal human oversight.
Israel's military actions, guided by the AI system, have resulted in thousands of Palestinian casualties, including women and children.
The Israeli military has allegedly relaxed its rules regarding acceptable levels of civilian casualties, heightening ethical concerns.
Hamas is demanding a permanent ceasefire and other conditions, while the US is engaged in peace mediation efforts.
International tensions rise as Poland addresses the death of a Polish humanitarian worker, and violence continues in Gaza and Yemen.
Israeli families are protesting for the release of relatives kidnapped by Hamas amid the ongoing conflict.
A panel of journalists is set to discuss the crisis on April 30, after the Israeli military denied using AI to identify terrorists and asserted its compliance with international law.
Summary based on 9 sources
Sources

+972 Magazine • Apr 3, 2024
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza
The Guardian • Apr 3, 2024
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
Yahoo News • Apr 3, 2024
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets