I've been running intelligenttools.co for a few months now, writing honest reviews of AI coding tools from a developer's perspective. I'm proud of the content. It's authentic, technical, and based on real usage. But last week, I realized something that should have been obvious: when someone asks ChatGPT, "What's the best AI coding tool?" my site probably doesn't show up.
That's a problem. Because an increasing number of developers aren't Googling anymore; they're asking ChatGPT, Claude, or Perplexity.
This post is about what I learned when I went down the rabbit hole of "AI visibility" or "getting cited by ChatGPT." It's not an expert guide. I'm figuring this out alongside you. But the data I found is fascinating and changes how we should think about content strategy going forward.
The Numbers That Made Me Pay Attention
AI traffic isn't some future trend. It's happening right now. Here are some numbers that show where we are:
- AI-referred website traffic jumped 527% between January and May 2025
- ChatGPT now has 400 million weekly active users
- Some SaaS sites are seeing more than 1% of all sessions coming from AI citations
- Traffic from AI converts at higher rates than traditional organic search: people arrive already convinced, because they trust an AI's recommendation in a way they never trusted a page of Google results
But here's the thing: if these AI systems are not citing you, you're invisible to all that traffic.
How I Realized I'm Probably Not Getting Cited
I did a simple test. I opened ChatGPT and asked:
"What are the best AI coding tools for full-stack developers?"

ChatGPT gave me a solid answer mentioning GitHub Copilot, Cursor, and a few others. It cited TechCrunch, a couple of official tool websites, and... not intelligenttools.co.
I tried Perplexity: "Compare ChatGPT vs Claude Code for developers."
Same thing. My comparison post didn't show up, even though I spent hours writing it based on real usage.
The reality hit me: I've been optimizing for Google (which is still essential), but I hadn't considered optimizing for the AI systems that are increasingly the first stop for developers seeking tool recommendations.
What Actually Gets Cited by AI (According to Data)
I spent a week researching how ChatGPT, Perplexity, and Claude actually decide what to cite. Here's what the data shows:
The 76% Rule: 76% of AI citations come from pages already ranking in Google's top 10. So traditional SEO isn't dead; it has become the foundation. But the remaining 24% is where things get interesting.
Answer Capsules Are King: The single strongest predictor of getting cited is having a clear, direct answer at the top of your post. Not fluff. Not intro paragraphs. A 65-75-word answer that directly addresses the question. Think of it like a Stack Overflow answer: get to the point immediately.
Original Data Wins Big: Pages with original data or statistics get cited 4.1x more than generic content. Even small stats count. "Based on my testing of 5 tools..." is better than "Many developers prefer..."
Freshness Matters: Content updated within the last 30 days gets 3.2x more citations. AI systems favor recent information, especially for "best of 2025" type queries.
Reddit Is Complicated: Reddit used to be the #1 cited source (40.1% of all AI citations), but ChatGPT cut its Reddit citations by 50% in September 2025. Perplexity, however, still cites Reddit regularly (6.3% of its citations). The takeaway: authentic community discussions still matter, but the landscape is volatile.
Different AI Platforms Have Different Preferences
This surprised me: each AI system cites different sources.
ChatGPT (73.8% market share):
- Heavily favors Wikipedia and Reddit
- Prefers depth over recency
- Cites mainstream tech media (TechCrunch, TechRadar)
- Best for broad visibility
Perplexity (fastest growing):
- Shows explicit citations with links
- Favors fresh content (under 30 days old)
- Still cites Reddit regularly (6.3% of citations)
- Better for niche technical content
Claude:
- Highest session value ($4.56 per visit)
- Prioritizes technical accuracy
- Pulls from academic and technical sources
- Lowest traffic volume but highest quality visitors
For a technical blog like mine, this means I should optimize for Perplexity first (technical audience, visible citations), then ChatGPT (volume), then Claude (quality). I won't go as far as full blog-post automation: I write most of the posts myself, with Claude helping on research, and I test and validate the results.
The Wikipedia Lesson
Wikipedia shows up in 47.9% of ChatGPT citations. Why? Because it follows a specific format:
- Neutral, encyclopedic tone (no sales pitch)
- Clear structure with headers and sections
- Citations to reliable sources
- Comprehensive coverage of a topic
- Regular updates
The lesson for blogs: write like you're creating a reference guide, not a promotional article. Neutral, detailed, well-structured content gets cited more often. And if you have genuine experience and expertise, you can still beat the AI slop. It does, of course, require real-world knowledge of the subject.
Tools I'm Using:
- Manual testing: I'll ask ChatGPT, Perplexity, and Claude the same questions monthly and track if I show up
- Otterly AI ($29/mo): Once I have more content, I'll use this to track brand mentions across AI platforms
- Google Analytics: Already tracking whether I get referral traffic from chatgpt.com or perplexity.ai
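To make that referral tracking concrete, here's a minimal sketch of how I bucket referrers by AI platform. The hostnames in the table are my assumptions; verify them against your own analytics, since the exact referrer domains AI platforms send change over time.

```python
from urllib.parse import urlparse

# Referrer hostnames I assume AI platforms use today (assumption:
# check your own analytics, these change over time).
AI_REFERRERS = {
    "chatgpt.com": "ChatGPT",
    "chat.openai.com": "ChatGPT",
    "perplexity.ai": "Perplexity",
    "www.perplexity.ai": "Perplexity",
    "claude.ai": "Claude",
}

def classify_referrer(referrer_url: str) -> str:
    """Return the AI platform name for a referrer URL, or 'other'."""
    host = urlparse(referrer_url).netloc.lower()
    return AI_REFERRERS.get(host, "other")

# Example: tally AI-referred sessions from a list of raw referrers
referrers = [
    "https://chatgpt.com/",
    "https://www.perplexity.ai/search?q=best+ai+coding+tools",
    "https://www.google.com/",
]
counts: dict[str, int] = {}
for r in referrers:
    source = classify_referrer(r)
    counts[source] = counts.get(source, 0) + 1

print(counts)  # {'ChatGPT': 1, 'Perplexity': 1, 'other': 1}
```

The same mapping works as a custom channel group in Google Analytics; the script is just the self-hosted version of that idea.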
What I'm NOT Doing:
- Abandoning traditional SEO (Google still drives 80-90% of traffic)
- Writing AI-optimized garbage (quality over citations)
- Buying citations or fake engagement
- Spending more than 20% of my time on this (it's experimental)
The Long Tail Opportunity for Small Blogs
Here's the encouraging part: you don't need to be TechCrunch to get cited.
The data shows that over 50% of AI citations come from the "long tail" of smaller blogs, niche sites, company documentation, and forums. If your content directly answers a specific question, AI systems will cite you even if you're not a household name.
For example, an obscure blog post by a machine learning engineer might be cited for "How to fine-tune GPT models" even though TensorFlow's official site has greater authority. Why? Because the blog post has a clear, direct answer.
This is good news for developers writing technical blogs. We don't need a massive Domain Authority. We need specificity, accuracy, and straightforward answers.
Why This Matters for Developer Blogs
Think about how you discover new tools. Ten years ago, you Googled it. Five years ago, you asked on Reddit or Stack Overflow. Today, you might ask ChatGPT:
"I need a CLI tool for AI-assisted coding that works with Neovim. What are my options?"
If AI systems don't know about your tool, your blog, or your company, you don't exist in that conversation.
Writing intelligenttools.co is partly about building an audience, but it's also about becoming a trusted source. If I can get ChatGPT to cite my Claude Code review when someone asks about terminal-based AI coding tools, that's a win.
The Reality Check
Let me be honest about a few things:
This is still experimental. AI citation patterns change constantly. Reddit citations dropped 50% in one month. What works today might not work next month.
The size of the opportunity is unclear. Some sites report 1%+ of traffic from AI; others see nothing. It's too early to know whether this is worth the effort for small blogs.
Traditional SEO still matters more. Google still handles 80-90% of queries. Don't abandon what works.
It takes time. The average cited Reddit post is over a year old. AI systems favor evergreen content that has proven useful over time.
I'll write a follow-up post with honest results, whether they're good, bad, or nonexistent.
What You Can Try Right Now
If you run a technical blog or write content, here are the lowest-effort, highest-impact things you can test:
5-Minute Test: Open ChatGPT and ask the questions your blog posts answer. Do you show up? If not, why? Look at what gets cited and analyze the structure.
30-Minute Fix: Add an "answer capsule" to your best-performing post. 65-75 words, direct answer, no fluff. Put it right after the introduction.
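If you want a sanity check on that word target, a plain word count is enough. This is a tiny sketch; the 65-75 window is just the range cited above, not a hard rule.

```python
def capsule_word_count(text: str) -> int:
    """Count whitespace-separated words in an answer capsule."""
    return len(text.split())

def in_capsule_range(text: str, low: int = 65, high: int = 75) -> bool:
    """Check whether the capsule fits the 65-75 word window."""
    return low <= capsule_word_count(text) <= high

capsule = " ".join(["word"] * 70)  # stand-in for a real 70-word answer
print(capsule_word_count(capsule))     # 70
print(in_capsule_range(capsule))       # True
print(in_capsule_range("Too short."))  # False
```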
Weekend Project: Create a comparison table with original data. Even if it's just "I tested these three tools and here's what I found." Original data = 4.1x more citations.
Longer-Term: Post authentically on Reddit in relevant subreddits. Not promotional. Just helpful answers to real questions. The average cited Reddit post is 1+ year old, so it's a long game.
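To keep the monthly manual test honest, I log every check in a simple CSV rather than trusting memory. Here's a minimal sketch; the file name and example question are mine, so swap in your own.

```python
import csv
from datetime import date
from pathlib import Path

LOG = Path("ai_citation_log.csv")  # hypothetical log file name

def log_result(question: str, platform: str, cited: bool) -> None:
    """Append one manual-test result to the CSV log."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "question", "platform", "cited"])
        writer.writerow([date.today().isoformat(), question, platform, cited])

# Example: record one round of manual checks
log_result("best AI coding tools for full-stack developers", "ChatGPT", False)
log_result("best AI coding tools for full-stack developers", "Perplexity", False)

with LOG.open() as f:
    rows = list(csv.reader(f))
print(len(rows))  # 3 (header plus two results)
```

A spreadsheet works just as well; the point is a dated record, so that in three months the "am I getting cited yet?" question has an answer backed by data.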
Final Thoughts
AI visibility feels like SEO in 2010. It's new, the rules aren't clear, and most people aren't paying attention yet. That's precisely why it's worth experimenting with now.
I'm not betting the farm on this. But I am dedicating 10-20% of my content efforts to understanding how AI systems discover and cite content. If it works, I'll be ahead of the curve. If it doesn't, I've learned something and still have strong traditional SEO.
The bigger shift is this: we're moving from "how do I rank on Google?" to "how do I become the answer?" That requires a different mindset: less about keywords and backlinks, more about being the clearest, most authoritative source for specific questions. And we can agree that's a good thing. Finally, something new and different, a step in the right direction.
I'll report back in 3 months with real data.