Why Your Law Firm Isn't Getting Cited by AI (How to Fix It)
AI search tools like ChatGPT, Perplexity, and Google AI Overviews are now the first stop for people with legal questions — but most law firm websites are invisible to them. Here's what's actually happening, and how to become the source AI cites.

Something quietly shifted in legal search over the last 18 months. A potential client with a family law question doesn't type "family solicitor London" into Google and scroll through blue links anymore. They open ChatGPT or Perplexity, type their question in plain English, and read the answer the AI gives them.
If your firm is mentioned in that answer, you exist. If you're not, you don't.
This is the core problem that Answer Engine Optimization (AEO) solves for law firms. And it's not the same problem as traditional SEO.
What AEO Actually Means for Law Firms
AEO — Answer Engine Optimization — is the practice of structuring your content so that AI systems like ChatGPT, Perplexity, Google AI Overviews, and Gemini can extract, trust, and cite your pages when answering questions about your practice areas.
It's different from SEO in a specific and important way. SEO gets you ranked on a results page. AEO gets you cited inside an AI-generated answer. The person may never visit your website at all — they just receive the answer, with your firm's name attached to it as the source.
That citation is the new first impression.
The shift is already happening at scale. Zero-click searches — queries where the user gets their answer directly on the results page without clicking through to a site — now account for a significant portion of legal queries. When someone asks "how long does a divorce take in England?" and Google AI Overviews answers that question directly, the law firm whose content was used as the source gets named. The other twenty firms that ranked on page one get nothing.
Why Most Law Firm Websites Fail the AI Citation Test
The honest answer is that most legal content online was written for humans to read, not for machines to extract. That's not a criticism — it's just the reality of how legal marketing evolved. Blog posts written to rank typically start with three paragraphs of context before getting to the actual answer. Service pages prioritise persuasion over precision. FAQ sections bury answers in caveats.
AI engines work differently. When a language model retrieves your page and tries to formulate a citation, it's looking for a specific thing: a clean, declarative, confident answer to a specific question. It wants to extract 40 to 60 words that it can trust.
If your content leads with "at our firm, we understand how stressful..." — the AI skips you.
If your content opens with "Under the Matrimonial Causes Act 1973, either party can apply for divorce after 1 year of marriage. The process typically takes 6 to 12 months depending on whether both parties agree on financial matters and childcare arrangements" — the AI cites you.
This is what practitioners in the r/LegalMarketingTalk community mean when they say traditional rank tracking doesn't capture how LLMs actually evaluate your firm's content. You can rank number one for a term and still be invisible to AI systems, because the signals that drive rankings and the signals that drive citations are not the same.
The Four Signals AI Engines Use to Choose Legal Sources
Understanding what AI systems actually look for when deciding who to cite is the foundation of everything else. Based on how retrieval-augmented generation works across the main engines, there are four signals that matter most for legal content.
1. Answer Structure
The answer has to come first. Not after context, not after credentials — first. The AI's extraction process is essentially looking for the most direct, confident statement that addresses the query. Legal content that opens with the direct answer, then provides context and nuance, gets extracted. Content that buries the answer gets passed over for a source that doesn't.
This feels counterintuitive for lawyers trained to qualify everything before stating a conclusion. The fix isn't to stop qualifying — it's to state the direct answer first, then add the jurisdictional nuances, caveats, and disclaimers. The AI will typically use the opening declarative statement and note the source. The caveats are still there for the reader who clicks through.
2. Entity Clarity
AI systems build a model of who you are and what you're authoritative about. This is called entity recognition. For a law firm, that means your content — and your structured data — needs to make clear: who you are, which jurisdiction you operate in, which practice areas you cover, and what specific expertise your solicitors have.
Vague content produces vague entity signals. A page that discusses "legal services" in general terms doesn't train the AI to associate your firm with anything specific. A page that discusses "contested divorce proceedings under Scottish family law, specifically the Divorce (Scotland) Act 1976" gives the AI a precise entity-topic signal it can use.
3. Corroboration
AI systems are risk-averse about legal content. They're aware that recommending the wrong solicitor or giving incorrect legal information could cause real harm. So they apply a higher trust threshold to legal citations than they do to, say, a recipe recommendation.
Corroboration means your answer is supported by references to statute, case law, authoritative bodies, or verifiable data. Pages that cite specific legislation, name regulatory bodies, or reference court decisions are treated as more reliable than pages that offer unattributed assertions. You don't need to turn your blog into a law review article, but grounding your answers in specific legal references dramatically increases citation eligibility.
4. Disclaimer Placement
This one is counterintuitive and critically important. Legal content almost always includes disclaimers — "this is not legal advice," "consult a qualified solicitor," and so on. These disclaimers are appropriate and often mandatory under professional conduct rules.
But where you place them matters enormously for AI extraction. Disclaimers that appear before the answer — in the first paragraph, above the content — interfere with the AI's ability to extract a clean answer unit. The engine reads the disclaimer first and downgrades the confidence of everything that follows.
The fix is structural: state your direct answer, then provide context and corroborating details, then include the disclaimer. The disclaimer at the end of a content block doesn't undermine the extractability of the answer above it. The disclaimer at the beginning kills the whole thing.
What "AI-Optimised" Legal Content Actually Looks Like
Here's a concrete illustration. Take a high-intent query: "Do I need a solicitor for an uncontested divorce in England?"
Content that won't get cited:
"Divorce is one of the most emotionally difficult experiences a person can go through. At [Firm Name], our experienced family law team has helped hundreds of clients navigate the complexities of separation. If you're considering divorce, it's important to seek professional legal advice. Contact us today for a free consultation."
That content answers nothing. The AI skips it entirely.
Content that will get cited:
"You don't legally need a solicitor to apply for an uncontested divorce in England, but most people use one to ensure the financial settlement and any child arrangements are properly documented. Since April 2022, couples can also apply online jointly via the HMCTS portal. A solicitor typically costs £500–£2,000 for an uncontested case, depending on complexity. Note: this is general guidance only — speak to a qualified family solicitor for advice specific to your circumstances."
That second block is about 75 words. It answers the question directly, names the specific jurisdiction, references a verifiable government process, includes a price range, and places the disclaimer at the end. A language model can extract that cleanly and cite your firm as the source.
This is the "Answer Unit" model — writing each content block as a standalone, extractable answer to a specific question.
Schema Markup: The Technical Layer Most Firms Skip
Content structure alone isn't enough. The technical signals your pages send to crawlers and AI retrievers also matter, and most law firm websites fail at this layer almost completely.
The two highest-leverage schema types for law firms are FAQPage and LegalService (a sub-type of LocalBusiness in the Schema.org vocabulary).
FAQPage schema is the single most powerful AEO technical element available. When you mark up a Q&A section with FAQPage JSON-LD, you're explicitly telling AI retrievers: "This content is structured as questions and answers." You're making extraction trivially easy. Sites with complete schema markup are measurably more likely to be cited in AI-generated answers. One community-reported benchmark put the advantage at approximately 2.4 times more likely to be recommended by AI systems compared to equivalent content without markup.
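As a sketch, here is what FAQPage markup could look like for the uncontested divorce question used earlier in this article. The question and answer text are lifted from that example; on a live page, this JSON-LD sits inside a `<script type="application/ld+json">` tag in the page's HTML:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Do I need a solicitor for an uncontested divorce in England?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "You don't legally need a solicitor to apply for an uncontested divorce in England, but most people use one to ensure the financial settlement and any child arrangements are properly documented. Since April 2022, couples can also apply online jointly via the HMCTS portal."
      }
    }
  ]
}
```

Each Question and acceptedAnswer pair should mirror a Q&A block that's actually visible on the page — markup that doesn't match the rendered content is generally ignored by crawlers.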
LegalService schema establishes entity clarity at the machine layer. It tells crawlers exactly what type of legal service you offer, in which jurisdiction, at what price range, with which solicitors. Combined with your content-level entity signals, this creates a strong, consistent identity that AI systems can anchor citations to.
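A minimal LegalService sketch might look like the following. The firm name, address, and URL are placeholders; areaServed, knowsAbout, and priceRange are standard Schema.org properties you'd populate with your real details:

```json
{
  "@context": "https://schema.org",
  "@type": "LegalService",
  "name": "Example Family Law LLP",
  "url": "https://www.example.com",
  "areaServed": {
    "@type": "AdministrativeArea",
    "name": "England and Wales"
  },
  "knowsAbout": ["Divorce", "Child arrangements", "Financial settlements"],
  "priceRange": "££",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "London",
    "addressCountry": "GB"
  }
}
```

The point of each field is to corroborate, in machine-readable form, the jurisdiction and practice-area signals your page copy already makes.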
If your site has no structured data at all — which is the case for a significant number of law firm websites — implementing FAQPage and LegalService schema on your key service pages is the highest-ROI technical action you can take right now.
How AI Citation Behaviour Differs Across Platforms
Not all AI engines work the same way, and treating them as identical is a common mistake in AEO strategy.
Google AI Overviews, which appear at the top of standard Google search results, tend to favour sources that already rank well organically. The overlap between traditional SEO authority and AI Overview inclusion is meaningful here — you can't entirely ignore rankings.
Perplexity operates differently. It runs a real-time web search for every query and synthesises answers from the results it retrieves. This means newer content and more specific, answer-dense pages can outperform established sites that rely on domain authority alone. A solo practice firm with precisely structured content on a specific local query can beat a national directory on Perplexity.
ChatGPT with browsing enabled behaves similarly to Perplexity when it performs a search. Without browsing, it draws on training data — which means longer-term authority signals matter more, and newer content has less immediate impact.
Gemini sits between these poles: it indexes heavily from Google's own corpus and applies E-E-A-T frameworks similar to Google Search.
The practical takeaway for law firms is that a pure SEO strategy gets you some visibility on Google AI Overviews. But for Perplexity and ChatGPT citations — which now drive a meaningful share of legal queries, particularly from younger clients — you need content that's structured for extraction independent of your domain authority.
Measuring Whether It's Actually Working
The biggest unanswered question in the AEO space for law firms right now is measurement. Practitioners in communities like r/LegalMarketingTalk are asking it directly: "How are you measuring success in AEO or GEO? Traditional rank tracking doesn't capture how LLMs actually summarise your firm's expertise."
They're right. Google Search Console doesn't track AI Overview citations. There's no native dashboard that tells you how many times Perplexity cited your criminal defence page last month.
The current measurement approach involves three layers:
The first is manual citation auditing. Run a set of high-intent queries related to your practice areas directly in ChatGPT, Perplexity, and Google. Note whether your firm is cited, whether a competitor is, and what content type is being used. Do this monthly. It's imperfect but it gives you real signal.
The second is traffic pattern analysis. AI citations that include a direct link generate referral traffic from Perplexity and from Google AI Overview sources. Monitor your referral sources in Analytics for these. An increase in direct and referral traffic from AI platforms alongside stable or declining organic click-through rates is a signature of growing AI visibility.
The third is brand mention monitoring. Tools like Brand24 or Google Alerts won't capture AI citations directly, but tracking brand mentions across platforms can surface indirect evidence of citation activity — clients who found you via AI often describe the process in reviews and enquiry forms.
This measurement gap is real and no agency offering AEO services should pretend otherwise. What matters is tracking consistently so you have a baseline and can observe change over time.
Common Mistakes Law Firms Make With AEO
Publishing AI-generated content without AEO structuring. Many firms have started using AI tools to produce more blog content. That's fine, but AI-generated content that isn't structured for AI extraction is just noise. Volume without structure doesn't build citation authority.
Treating AEO as SEO rebranding. Some agencies are selling "AI SEO" that's just traditional keyword content dressed up with new terminology. The actual technical requirements — answer-first structure, Answer Units, schema markup, entity clarity, citation-friendly disclaimer placement — are specific and different from traditional SEO tactics.
Multi-jurisdiction content without jurisdiction tagging. For firms operating across multiple regions, untagged content is harder for AI systems to associate with specific locations. A page about personal injury law that doesn't specify whether it's discussing English, Scottish, or Welsh law will perform worse on jurisdiction-specific queries than a page that makes this explicit.
Deleting "low-traffic" content without an AEO audit. Some content that looks thin from an SEO perspective is actually performing citation work — it's a clean answer to a specific question that Perplexity is citing regularly. Before running a content pruning exercise, manually audit your target queries in AI engines to identify what's getting cited before you remove it.
Frequently Asked Questions
What is AEO for law firms?
Answer Engine Optimization (AEO) for law firms is the practice of structuring legal content so that AI systems — including ChatGPT, Perplexity, Google AI Overviews, and Gemini — extract and cite your firm when answering legal questions. It focuses on answer-first writing, entity clarity, schema markup, and building verifiable authority signals that AI engines trust.
Does AEO replace SEO for law firms?
No. AEO works alongside SEO, not in place of it. Google AI Overviews still draw partly on organic ranking signals, so traditional SEO authority matters. But Perplexity and ChatGPT citations are driven more by content structure and extraction quality than by domain authority — which means AEO creates a separate citation pathway that SEO alone won't capture.
How do AI engines decide which law firm to cite?
AI engines select legal sources based on four primary signals: answer structure (direct answers stated first), entity clarity (specific jurisdiction and practice area signals), corroboration (references to statute, case law, or regulatory bodies), and disclaimer placement (disclaimers stated after the answer, reinforced by trust signals such as schema markup and E-E-A-T indicators).
Where should legal disclaimers go in AEO-optimised content?
After the answer, not before it. Disclaimers placed at the start of a content block reduce the AI's confidence in the answer that follows and reduce citation eligibility. State the direct answer first, add context and jurisdictional nuance, then include the disclaimer at the end of the content unit.
Can small law firms compete with national directories for AI citations?
Yes, particularly on Perplexity and ChatGPT, where content extraction quality matters more than domain authority. A solo practice with precisely structured content on a specific local legal query — with FAQPage schema, jurisdiction-specific language, and statute references — can outperform national directories like FindLaw on that specific query.
What schema markup should law firms use for AEO?
Prioritise FAQPage schema for all Q&A content (the highest single-impact AEO technical element), LegalService or LocalBusiness schema for your service pages, and Article schema for blog content. Validate all structured data at validator.schema.org before publishing.
How do I know if my law firm is being cited by AI?
Run manual citation audits monthly: search your highest-intent practice area queries directly in ChatGPT, Perplexity, and Google AI Overviews. Track referral traffic from Perplexity in Google Analytics. Monitor enquiry forms for clients who mention finding you via AI search. There's no native dashboard for this yet — consistent manual tracking is the current best practice.
The Practical Takeaway
The firms that will dominate AI-generated legal search over the next three years are the ones that restructure their content now, before their competitors do. The window where this work is a competitive advantage rather than a table stake is open — but not indefinitely.
You don't need to rebuild your entire website. Start with your five highest-value practice area pages. Restructure each one with answer-first paragraphs, Answer Units of 40–60 words per key question, jurisdiction-explicit language, statute references, and FAQPage schema. Run manual citation audits before and after. Track the change.
That's the playbook. Everything else is iteration.