How artificial intelligence is reshaping California's judicial system | Opinion
Imagine you’re in court for a traffic ticket or a child custody dispute. You expect a judge to weigh your case with impartial wisdom and a thorough understanding of the law. But what if, behind the scenes, parts of your ruling were drafted by artificial intelligence?
This month, the California Judicial Council, which oversees the largest court system in the country, approved groundbreaking rules regulating generative AI use by judges, clerks and court staff. By September 1, every courthouse from San Diego to Siskiyou must follow policies that require human oversight, protect confidentiality and guard against AI bias.
California deserves credit for leading the nation in facing this challenge. But these rules reveal a deeper truth: AI is already reshaping the courts, and society is only beginning to understand the consequences.
The council’s new guidelines are prudent: They forbid court personnel from allowing AI to draft legal documents or make decisions without meaningful human review. They warn against inputting sensitive case details into public AI platforms, to prevent data leaks. They recognize the danger of bias baked into AI systems trained on flawed or discriminatory case law.
In an overstretched judicial system, these safeguards are essential. But safeguards are not barriers. And the AI genie is out of the bottle. California courts already rely on algorithmic tools. Judges use AI-powered risk assessments, like COMPAS, to predict defendants’ likelihood of reoffending, guiding bail and sentencing decisions. These tools have sparked fierce controversy over documented racial bias, yet they remain widespread.
Lawyers now draft motions with AI assistance. Paralegals use generative AI to summarize depositions. Self-represented litigants increasingly turn to AI-powered chatbots to navigate complex legal procedures. While AI quietly permeates the judicial system, it often does so without disclosure, oversight or accountability. The convenience makes it easy to silently outsource judgment itself.
A judge rushing through a packed docket might lean on AI to draft rulings. A clerk juggling hundreds of motions might ask ChatGPT for quick summaries. These shortcuts chip away at the core of justice: human discernment, context, empathy and constitutional fairness. Then there’s the issue of bias: AI learns from data steeped in racial, gender and socioeconomic inequities embedded in past rulings.
Left unchecked, these systems can turbocharge injustice under the guise of impartial algorithms. Unlike a biased judge, whose decisions can be openly appealed and scrutinized, AI bias often lurks behind inscrutable code and secret training data. More troubling still, the current rules govern only court employees.
What about private attorneys, overworked public defenders or Californians relying on AI chatbots for legal advice? These actors operate beyond the reach of court policy, yet shape justice daily. California needs more than cautious policy; it needs a bold, statewide vision. The state should create a Judicial AI Commission — an independent panel of judges, technologists, ethicists and civil rights advocates charged with designing transparent, enforceable AI standards for all courts.
This commission would mandate disclosure whenever AI assists legal filings, establish regular audits for bias and promote open-source legal AI tools designed for public good, not black-box systems controlled by private tech giants. We also need laws protecting Californians directly.
An “AI Miranda” law would require clear, upfront disclosure whenever AI influences legal advice, filings or court decisions. Justice should never be automated in stealth. The stakes could not be higher. Courts are not tech startups. When social media algorithms fail, you get a bad recommendation. When courts fail, people lose freedom, homes and families.
Californians deserve to know who or what is deciding their fate. AI can be a powerful ally in expanding access to justice, reducing costs and speeding routine processes. But it must be tightly governed, rigorously tested and always subordinate to human judgment.
California must build a justice system that is tech-aware, and, more importantly, truly just. Anything less risks turning the rule of law into the rule of code.
Joshua Chronicles is a writer focused on politics, law and the intersection of technology and public life.
This story was originally published August 3, 2025 at 5:00 AM.