Pre-Programmed Professionalism: The Ethics of Artificial Intelligence in Criminal Prosecution
Contributing Author: Wendy L. Patrick, Deputy District Attorney, San Diego County (CA)
Artificial Intelligence (AI) has become a household term and a hot topic. Cutting-edge and controversial, it delivers an increasing number of services across a variety of industries through efficient automation. One of the things that distinguishes AI from other types of automation is its ability to think and learn, which is both sensational and scary. But can AI prosecute crime? The current answer seems to be “no.” Although AI has proven to be a useful tool in some aspects of predictive policing, the art of individualized case analysis, including the exercise of prosecutorial discretion, requires human experience, judgment, and expertise.
Legal Ethics and AI
Although AI can enhance the speed and accuracy of tasks such as legal research, it cannot replace judgment, morality, or chemistry with the court, counsel, or colleagues. And in front of a jury, a prosecutor’s silver tongue remains a unique, individualized feature of skilled advocacy. Yet even the most talented trial lawyers may improve the speed and efficiency of some of the more mundane aspects of legal work by automating them. But which tasks, and at what cost?
The use of AI in the practice of law implicates several important ethics rules. Not surprisingly, the first deals with the obligation to know how to use AI in the first place: the duty of competence.
ABA rule 1.1, Competence, requires that a lawyer “provide competent representation to a client.” Competent representation is defined as requiring “the legal knowledge, skill, thoroughness and preparation reasonably necessary for the representation.” Yet competence goes further, requiring an ongoing awareness of changes and developments in the law. Rule 1.1 Comment [8] explains that to maintain the requisite knowledge and skill associated with the duty of competence, a lawyer should “keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology,” among other ongoing legal education requirements.
AI may also help prosecutors, especially in jurisdictions currently tackling heavy post-pandemic caseloads, work faster in some aspects of legal practice. This can facilitate compliance with rule 1.3, Diligence, which states that a lawyer shall “act with reasonable diligence and promptness in representing a client.”
Using AI in the courtroom implicates ABA rule 3.3, Candor Toward the Tribunal, which states in subdivision (a) that a lawyer shall not knowingly “make a false statement of fact or law to a tribunal or fail to correct a false statement of material fact or law previously made to the tribunal by the lawyer” or “offer evidence that the lawyer knows to be false.”
Complying with the duty of candor (and competence) requires prosecutors to know how much they should augment AI-related research with good old-fashioned cite-checking. “Robot briefs” are not (yet) reliable enough to submit to a court without verifying the citations and crafting individualized legal analysis. And if evidence or argument that is later discovered to be false inadvertently slips into the record, rule 3.3(a)(3) provides that “If a lawyer, the lawyer’s client, or a witness called by the lawyer, has offered material evidence and the lawyer comes to know of its falsity, the lawyer shall take reasonable remedial measures, including, if necessary, disclosure to the tribunal.”
AI Is Not a Mentor or Role Model
Beyond compliance with ethics rules, one of the hallmarks of prosecution work is the benefit of working within a larger office. Supervisors, peers, and subordinates offer guidance, support, and ready-made roundtables within which to discuss individual cases. AI cannot replicate this dynamic. It has no capacity to serve as a mentor or role model, will not remember your prior cases the way your supervisor and peers will, and has no relevant life experience to offer when you are selecting a jury.
Lacking sentience, AI cannot (yet) engage in the type of higher-level reasoning that people can. While it can learn, artificial intelligence is not emotional intelligence: it cannot replicate the feelings, sympathy, empathy, and other factors that facilitate human decision-making, whether picking a jury or charging a case, both of which require discernment and discretion. And although AI is capable of creativity, artificial ingenuity is not the same as human innovation; AI’s “originality” often depends on programming preferences.
Prosecution Work Is Personal
AI can make decisions by weighing many of the same factors people would, including both logic and practicality, without the interference of emotion. But it does not possess the human concept of morality. It can learn about ethics and professionalism through proactive programming, but it will never view humanity in the same way people do. And intelligence alone does not confer human judgment or the ability to always distinguish right from wrong, especially in situations requiring the exercise of discretion or the consideration of extenuating circumstances.
Within the practice of law, the goal is to decide what type of AI to use for particular tasks, how to craft the right kinds of legal research queries, and how to supply AI with the correct data. As with any other computerized task, output depends on input. Within prosecution work specifically, important decisions should be made by prosecutors themselves, not artificial assistants, to ensure accurate, efficient, and ethical decision-making in the pursuit of justice for all.
Wendy Patrick, J.D., Ph.D., is a career trial attorney, former Chair of the California State Bar Ethics Committee (COPRAC), and former Chair of the San Diego County Bar Association Ethics Committee. She also serves as a trial consultant and expert witness. The opinions in this piece are her own and not attributable to her employer.