Ethics & Practice
How Florida Lawyers Can Use AI While Complying with Opinion 24-1
Florida Bar Ethics Opinion 24-1 allows lawyers to use generative AI, but only with safeguards that preserve confidentiality, competence, and client consent. Here’s a plain‑English guide to using AI in Florida practice—what’s allowed, what requires informed consent, and practical steps to stay compliant.

Petrus Systems
Nov 11, 2025
This article summarizes Florida Bar Ethics Opinion 24‑1 (Jan. 19, 2024) for informational purposes only and is not legal advice. Always apply your professional judgment and the Rules Regulating The Florida Bar to your specific matter.
What Opinion 24‑1 Says—in Brief
- AI is permitted with safeguards. Lawyers may use generative AI consistent with duties of confidentiality, competence, and supervision.
- Confidentiality remains paramount. The duty covers all information related to the representation, regardless of whether it is public elsewhere.
- Informed consent may be required. If using a third‑party AI system requires disclosing client information to that third party, obtain the client’s informed consent first.
- “In‑house/private” AI mitigates risk. Using an internal system that does not expose client data to any third party generally does not require informed consent.
- Competence and verification are mandatory. Lawyers are responsible for their work product, must understand the tools they use, and must verify AI outputs (watch for “hallucinations”).
When You Need Informed Consent
You should obtain informed consent before using AI if:
- You will input client‑identifiable or matter‑specific information into a system where the provider can access, store, or use it to train models.
- The tool’s terms of service, data retention, or human review policies could expose the information to anyone outside your firm.
- You cannot configure the tool to disable logging, training, or third‑party processing for your inputs.
You typically do not need informed consent when:
- You use a truly private, in‑house deployment that keeps all inputs and outputs within your firm’s controlled environment, with no third‑party access.
- You use AI for generic, non‑client tasks (e.g., drafting a marketing blurb) that rely on no client or matter information.
Practical Safeguards Checklist
- Prefer private deployments. Use an in‑house model or a vendor offering customer‑owned encryption keys, no training on your data, and disabled logging by default.
- Contract for confidentiality. Ensure your vendor agreement prohibits training on your data, restricts human review, defines retention windows, and provides audit rights.
- Minimize and anonymize. Use hypotheticals, scrub or tokenize identifiers, and share the least information necessary.
- Configure defensively. Turn off chat history, logging, and model training; restrict who can enable integrations that transmit data to external services.
- Verify outputs. Treat AI like a junior research assistant—cite‑check, Shepardize/KeyCite, and confirm every quotation and authority.
- Document your process. Keep a short note in the file: tool used, settings (no‑train/no‑log), what was shared, and the verification performed.
- Train your team. Provide a one‑page SOP on permitted tools, redaction rules, consent steps, and review requirements.
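The "minimize and anonymize" step can be sketched as a simple redaction pass run before any text reaches an AI prompt. This is a minimal illustration using hypothetical patterns (the SSN, email, and case‑number formats below are assumptions for the example); real matters need far more robust PII detection and attorney review:

```python
import re

# Hypothetical, illustrative patterns only -- production redaction needs
# far more robust PII detection (names, addresses, account numbers, etc.).
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "CASE_NO": re.compile(r"\b\d{4}-[A-Z]{2}-\d{6}\b"),  # assumed docket format
}

def redact(text: str) -> str:
    """Replace matched identifiers with labeled tokens before any AI prompt."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Client jane.doe@example.com, SSN 123-45-6789, case 2024-CA-001234."
print(redact(prompt))
# -> Client [EMAIL], SSN [SSN], case [CASE_NO].
```

Tokenizing (rather than deleting) identifiers preserves the document's structure, so an attorney can still map the AI's output back to the real facts inside the firm's controlled environment.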
Suggested Engagement Letter Language
You can add language like:
“We may use securely configured artificial intelligence tools to improve efficiency. We will not disclose your confidential information to any third‑party AI provider without your informed consent. Any AI‑assisted work is supervised and verified by an attorney, and you will not be charged for any time required to correct AI‑generated errors.”
Example Workflows (Compliant by Default)
- Research and memo drafting (private model): Use an in‑house model to outline arguments without client identifiers. Verify every citation; attach a source list to your memo.
- Document review and summarization: Upload to a firm‑controlled repository; run AI summarization locally with no external callbacks. Escalate anything ambiguous to attorney review.
- Email and template drafting: Start from non‑client templates. Only insert client facts in a private environment or after consent if using an external tool.
If You Must Use a Public AI Tool
- Explain benefits and risks to the client (storage, training, human review, and potential disclosure).
- Offer alternatives (private workflow; manual drafting).
- Obtain written informed consent naming the provider and the categories of information to be used.
- Configure “no‑train/no‑log” settings and restrict access to the smallest necessary group.
- Verify and cite‑check all outputs before use.
Quick Decision Flow
Does this require client information?
├─ No → Proceed with safeguards (verification still required).
└─ Yes → Will any third party access/store it?
   ├─ No (in‑house/private) → Proceed; document configuration.
   └─ Yes (public/vendor) → Obtain informed consent OR redesign to avoid disclosure.
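The flow above can be expressed as a small helper function. This is a sketch that reduces the analysis to two yes/no flags for illustration; the actual determination always requires professional judgment:

```python
def ai_use_decision(needs_client_info: bool, third_party_access: bool) -> str:
    """Mirror the quick decision flow: when is informed consent needed?

    Inputs are deliberately simplified yes/no flags -- real-world analysis
    (terms of service, retention, human review) requires attorney judgment.
    """
    if not needs_client_info:
        # Generic, non-client task: no consent issue, but still verify outputs.
        return "Proceed with safeguards (verification still required)."
    if not third_party_access:
        # In-house/private deployment: no disclosure to a third party.
        return "Proceed; document configuration."
    # Public or vendor tool that can access/store client information.
    return "Obtain informed consent OR redesign to avoid disclosure."

print(ai_use_decision(needs_client_info=True, third_party_access=True))
# -> Obtain informed consent OR redesign to avoid disclosure.
```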
Bottom Line
Opinion 24‑1 enables Florida lawyers to responsibly use AI. Keep client data confidential, obtain consent when disclosure to a third party is involved, use private deployments where possible, and rigorously verify outputs. Done this way, AI can enhance quality and efficiency—without compromising your ethical obligations.