Conflict over AI hallucinations in court documents sparks legal debate
    Artificial Intelligence (AI)

    Paloma Firgaira
    2026-05-04
    5 min read
    The rise of artificial intelligence in the legal field is creating unprecedented challenges for lawyers and firms. Recently, several attorneys have begun publicly naming the AI programs they use after becoming embroiled in controversies over errors in court documents.

    A few weeks ago, Ross LeBlanc, a partner at the Dudley DeBosier law firm in Louisiana, apologized to Judge William Jorden for submitting filings that cited a real ruling but included non-existent excerpts from it. LeBlanc explained that he had started using the AI software Eve to draft pleadings and, after verifying its accuracy early on, stopped thoroughly reviewing its citations. "I never thought this could happen to me," he admitted in a private letter to the judge.

    Jay Madheswaran, CEO of Eve, told Business Insider that an internal audit confirmed the software did not generate false citations in this case. Even so, the episode has highlighted the shared responsibility between lawyers and AI developers. Courts have sanctioned professionals for submitting documents containing AI "hallucinations," and prestigious firms such as Sullivan & Cromwell have also had to apologize for similar errors.

    Transparency about the use of AI remains limited: according to researcher Damien Charlotin, the software involved is identified in less than 10% of cases. Many lawyers prefer not to disclose whether they use tools like ChatGPT or Anthropic's Claude, which have also been implicated in errors in legal documents.

    Eve, valued at one billion dollars after a recent funding round, processes over 200,000 documents monthly and promises safeguards against errors. Trust in such tools is nonetheless shaken when failures come to light, as happened in a recent case involving a fall in a Lowe's store, where new "hallucinations" were detected in legal filings. Dudley DeBosier maintains that lawyers must carefully review AI-generated output and take final responsibility for the content they submit.
LeBlanc, for his part, acknowledges that the pressure for efficiency led him to rely too much on technology and hopes his experience serves as a warning to other professionals. Although he does not blame Eve, he has decided to stop using the tool temporarily: "It's fair to have a cooling-off period," he concluded.
    Paloma Firgaira

    CEO

    With more than 20 years of experience, Paloma is a flexible, agile executive who excels at implementing strategies tailored to each situation. Her MBA in Business Administration and her background as an AI and Automation Expert strengthen her leadership and strategic thinking. Her efficiency in task planning and rapid adaptation to change contribute positively to her work. With strong leadership and interpersonal skills, she has a proven track record in financial management, strategic planning, and team development.