
US federal judges increasingly adopt AI for rulings - study

RT

A Northwestern University study finds 60% of surveyed federal judges use AI tools to draft rulings and prepare for hearings, though experts warn of misinformation risks.

Over half of the US federal judges surveyed use artificial intelligence tools in their judicial work, according to a recent Northwestern University study. Based on responses from 112 randomly selected federal judges, the research found that 60% occasionally employ at least one AI tool for tasks such as reviewing documents, conducting legal research, and drafting or editing text. Approximately 22% use AI daily or weekly, with legal research the most common application (30%), followed by document review (16%).

The use of AI in courtrooms has raised concerns due to instances of fabricated citations and other errors that have undermined confidence in some judicial filings. While one in three judges permits or encourages AI use in their chambers, 20% formally prohibit it. More than 45% reported not receiving AI training from court administration, highlighting a gap in preparedness for safely integrating these technologies.

Legal experts warn that AI's unreliability could compromise judicial authority. Eric Posner, a law professor at the University of Chicago, emphasizes that judges make critical decisions and cannot gamble with a technology prone to hallucinations. Meanwhile, proponents like Florida chief judge Christopher Patterson argue that AI can improve efficiency and help manage heavy caseloads, though they stress the need to assess its accuracy and suitability.

Global concerns about AI's impact on work and health are intensifying, especially following recent sanctions against attorneys over AI-generated content. In March, New York judges urged lawyers to verify AI citations after several briefs cited fabricated cases, part of a growing misinformation problem in the legal field that raises questions about safety and accountability.
