The Mentions page shows you exactly what AI engines are saying about your brand. While the Visibility page shows aggregated numbers, the Mentions page shows the actual words: every AI response, every sentiment, every citation. This is your brand reputation monitoring center for the AI era.

Documentation Index
Fetch the complete documentation index at: https://docs.anymorph.ai/llms.txt
Use this file to discover all available pages before exploring further.
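If you script against the index, here is a minimal sketch of fetching and parsing it. The link-per-line markdown layout is an assumption about how this llms.txt is formatted, not a documented guarantee:

```python
import re
import urllib.request

INDEX_URL = "https://docs.anymorph.ai/llms.txt"

def parse_llms_index(text: str) -> list[tuple[str, str]]:
    """Extract (title, url) pairs from markdown-style links in an llms.txt file."""
    return re.findall(r"\[([^\]]+)\]\((https?://[^)]+)\)", text)

def fetch_index(url: str = INDEX_URL) -> list[tuple[str, str]]:
    """Download the index and return the discovered pages."""
    with urllib.request.urlopen(url) as resp:
        return parse_llms_index(resp.read().decode("utf-8"))

# Parsing works offline too:
sample = "- [Mentions](https://docs.anymorph.ai/mentions): what AI engines say"
print(parse_llms_index(sample))
```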

Filtering Your Mentions
The top of the page provides powerful filters to narrow down exactly what you want to see.

Date Range
Choose from 7 days, 30 days, 60 days, or 90 days. Start with 7 days for recent activity, or expand to 90 days for a comprehensive view of how AI responses have evolved.
Engine Filter
Select one or multiple AI engines (ChatGPT, Perplexity, Gemini, etc.) to see only their responses. This is essential for diagnosing engine-specific issues — if ChatGPT loves you but Perplexity ignores you, filter by Perplexity to understand why.
Sentiment Filter
Filter by Positive, Neutral, or Negative sentiment. Use this to quickly find problem areas (negative mentions) or success stories (positive mentions) you can amplify.
Has Mention Filter
Toggle between responses that mention your brand and those that do not. Viewing “no mention” responses reveals where competitors appear but you are absent — these are your biggest opportunities.
Product Filter
If you track multiple products, filter mentions by specific product. Available on Commerce plans.
Language Filter
Filter by the language of the AI response. Useful for brands operating in multiple markets.
Search Box
Free-text search across query text and response content. Use this to find mentions of specific topics, features, or competitor names.
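Active filters presumably combine with logical AND. If you export your response data, the equivalent filtering can be sketched like this; record field names such as `has_mention` are assumptions for illustration, not a documented export format:

```python
# Hypothetical shape of exported response records; field names are assumptions.
responses = [
    {"engine": "Perplexity", "sentiment": "negative", "has_mention": True,
     "query": "best crm tools", "response": "...", "date": "2024-05-01"},
    {"engine": "ChatGPT", "sentiment": "positive", "has_mention": True,
     "query": "crm comparison", "response": "...", "date": "2024-05-02"},
]

def apply_filters(records, engines=None, sentiments=None, has_mention=None, search=None):
    """Keep records matching every active filter (logical AND, like the UI)."""
    out = []
    for r in records:
        if engines and r["engine"] not in engines:
            continue
        if sentiments and r["sentiment"] not in sentiments:
            continue
        if has_mention is not None and r["has_mention"] != has_mention:
            continue
        if search and search.lower() not in (r["query"] + " " + r["response"]).lower():
            continue
        out.append(r)
    return out

print(len(apply_filters(responses, engines={"Perplexity"}, sentiments={"negative"})))
```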
Mentions Summary Section
At the top of the page, a visual summary gives you a quick read on your mention landscape.

Mentions Donut Chart
A donut chart (140px wide, with an 11px ring) showing the split between responses with mentions and responses without. The center displays the total response count. Below the donut, a breakdown shows the count of positive, neutral, and negative mentions.

How to read it:
- Large “with mentions” segment — AI engines frequently include your brand. Good.
- Large “without mentions” segment — Many tracked prompts return AI responses that do not mention you. Investigate which queries these are.
- Sentiment breakdown — At a glance, see if your mentions skew positive or negative.
Quick Stats
Three cards next to the donut chart:
- Mention Rate — A percentage badge showing what portion of all tracked responses mention you
- Total Responses — The total number of AI responses analyzed in the selected period
- With Mentions — How many of those responses included your brand
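The Mention Rate badge is simple arithmetic over the other two cards; a quick sketch:

```python
def mention_rate(with_mentions: int, total_responses: int) -> float:
    """Mention rate = responses that mention the brand / all tracked responses, as a percentage."""
    if total_responses == 0:
        return 0.0
    return 100 * with_mentions / total_responses

# 42 mentioning responses out of 120 tracked responses:
print(f"{mention_rate(42, 120):.1f}%")  # 35.0%
```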
Performance by Platform Card
A table breaking down your mentions across each AI engine:

| Column | What it tells you |
|---|---|
| Engine | The AI engine, shown with its icon |
| Mention Rate | The percentage of that engine’s responses that mention you |
| Mentions Count | How many total mentions from that engine |
| Response Count | How many total responses were tracked from that engine |
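The same per-engine breakdown can be computed from exported response records; this is an illustrative sketch, and the record fields (`engine`, `has_mention`) are assumptions:

```python
from collections import defaultdict

def platform_breakdown(records):
    """Aggregate response count, mention count, and mention rate per engine,
    mirroring the Performance by Platform card."""
    stats = defaultdict(lambda: {"responses": 0, "mentions": 0})
    for r in records:
        s = stats[r["engine"]]
        s["responses"] += 1
        s["mentions"] += int(r["has_mention"])
    return {
        engine: {**s, "mention_rate": round(100 * s["mentions"] / s["responses"], 1)}
        for engine, s in stats.items()
    }

sample = [
    {"engine": "ChatGPT", "has_mention": True},
    {"engine": "ChatGPT", "has_mention": False},
    {"engine": "Perplexity", "has_mention": True},
]
print(platform_breakdown(sample))
```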
Queries Table
The main section of the page — a detailed table of every tracked query and its AI response:

| Column | What it shows |
|---|---|
| Engine icon | Which AI engine generated this response |
| Prompt/Query | The exact question or prompt sent to the AI engine |
| Sentiment badge | Color-coded: green (positive), gray (neutral), red (negative) |
| Mention status | Whether your brand was mentioned in the response |
| Response preview | A truncated snippet of the AI’s response |
How to use this table:
- Sort by sentiment to find negative mentions first — these need the most urgent attention
- Filter for “no mention” to find gaps where you should be appearing
- Search for competitor names to see how you are compared side-by-side
Response Detail Panel
Click any row in the Queries Table to open a slide-out panel showing the complete AI response. The detail panel includes:
- Full response text with your brand name highlighted wherever it appears
- The original prompt that triggered this response
- Engine name and model version
- Sentiment analysis with confidence score
- Citations — Any URLs the AI engine linked to in its response
- Timestamps — When the query was sent and the response was received
The highlighted brand mentions in the full response are your “moment of truth.” Read the context around each highlight. Is the AI recommending you? Comparing you unfavorably? Listing you as one option among many? The context matters as much as the mention itself.
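Taken together, the fields listed above suggest a record shape like the following sketch. The class and field names are illustrative assumptions, not a documented API:

```python
from dataclasses import dataclass, field

@dataclass
class ResponseDetail:
    """Hypothetical shape of the data shown in the Response Detail Panel."""
    prompt: str                  # the original query sent to the engine
    engine: str                  # e.g. "ChatGPT", "Perplexity"
    model_version: str
    response_text: str           # full text, brand mentions highlighted in the UI
    sentiment: str               # "positive" | "neutral" | "negative"
    sentiment_confidence: float  # 0.0-1.0 confidence score
    citations: list[str] = field(default_factory=list)  # URLs linked in the response
    sent_at: str = ""            # when the query was sent
    received_at: str = ""        # when the response was received

detail = ResponseDetail(
    prompt="best crm tools?", engine="ChatGPT", model_version="gpt-4o",
    response_text="...", sentiment="neutral", sentiment_confidence=0.82,
)
print(detail.sentiment, detail.citations)
```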
Understanding Sentiment
Anymorph analyzes the context around each brand mention to determine sentiment — not just whether your name appears, but how you are portrayed:

| Sentiment | What it looks like | Example |
|---|---|---|
| Positive | Brand recommended, praised, or highlighted as a top choice | “Brand X is widely regarded as the best option for…” |
| Neutral | Brand mentioned factually without opinion or recommendation | “Options in this space include Brand X, Brand Y, and Brand Z.” |
| Negative | Brand mentioned with caveats, criticism, or unfavorable comparison | “While Brand X offers this feature, users often report issues with…” |
What to Look For
Recurring positive patterns
Which queries consistently surface your brand positively? These are your strengths. Identify the pages or content that AI engines are drawing from and keep them updated and strong.
Negative sentiment clusters
Multiple negative mentions on similar topics may indicate outdated information, a known product issue, or competitor content that frames you unfavorably. Address these by adding accurate, positive content to your Knowledge Base.
Missing mentions (your biggest opportunity)
Queries where competitors appear but you do not are your highest-ROI targets. Each one is a potential GEO Page that could capture visibility you are currently losing.
Citation quality
When AI engines do cite you, which pages are they linking to? If they link to outdated blog posts instead of your main product pages, you need to improve the citability of your key pages.
Pro Tips for Marketers
Do a weekly 'negative mention audit'
Every week, filter for negative sentiment and read through the responses. Create a list of recurring themes (pricing complaints, feature gaps, comparison losses) and address each one in your Knowledge Base. Within 2–4 weeks, you should see sentiment improve.
Use 'no mention' responses as a content brief
Filter for responses with no mention, then read what the AI said instead. The brands and content it recommended instead of you reveal exactly what kind of content you need to create. Use these as briefs for your next GEO pages.
Track sentiment over time
If you are running a brand campaign or have launched a new product, check the Mentions page weekly to see if AI engines have picked up the new narrative. It typically takes 2–6 weeks for AI training data to reflect new content.
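One way to watch that shift, assuming you export mention records with a date and sentiment label (the field names here are assumptions), is a per-week sentiment tally:

```python
from collections import Counter, defaultdict
from datetime import date

def weekly_sentiment(records):
    """Count sentiment labels per ISO week so a narrative shift shows up as a trend."""
    weeks = defaultdict(Counter)
    for r in records:
        year, week, _ = date.fromisoformat(r["date"]).isocalendar()
        weeks[(year, week)][r["sentiment"]] += 1
    return dict(weeks)

recs = [
    {"date": "2024-05-01", "sentiment": "positive"},
    {"date": "2024-05-02", "sentiment": "negative"},
    {"date": "2024-05-09", "sentiment": "positive"},
]
print(weekly_sentiment(recs))
```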
Share positive mentions with your team

When an AI engine recommends or praises your brand, capture the response and share it. Positive mentions make strong proof points for sales conversations, social posts, and internal reporting.
Related
Visibility
See aggregated visibility trends across all engines.
Knowledge Base
Upload content to improve how AI engines describe your brand.
GEO Strategy
Learn the optimization strategies that improve mention quality and frequency.
Prompts
Manage the prompts that generate these AI responses.