How to Track and Optimize AEO With Vercel: Salespeak LLM Analytics Setup

Salespeak Team
7 min read
March 9, 2026

70% of sites show under 2% AI traffic in their analytics. That number is wrong. It's not that AI crawlers aren't visiting. It's that Google Analytics can't see them. GA runs client-side JavaScript. AI crawlers don't execute JavaScript. Every visit from ChatGPT-User, ClaudeBot, or PerplexityBot hits your server, grabs your content, and leaves without a trace in your dashboard.

Server-side detection tells a different story. When you actually inspect incoming requests at the middleware layer, AI traffic is significantly higher than what any client-side tool reports. Cloudflare Radar pegged 66% of all internet traffic as bots in 2025. AI crawling specifically grew 15x that same year. Your site isn't exempt from that trend — you just can't see it yet.

If you're running Next.js on Vercel, Salespeak's LLM Analytics integration fixes this with two files. No CDN API tokens. No Lambda functions. No Lua modules. Just Next.js middleware and an Edge Route Handler — patterns you already know.

Why middleware is the right layer for AI detection

Next.js middleware runs on every incoming request before your page renders. It sits at the edge, inspects the request, and can rewrite, redirect, or modify it, all before your React components even load. That's exactly where AI crawler detection belongs.

Client-side approaches fail because AI crawlers don't run your JavaScript. Server-side approaches work but often require infrastructure changes: custom server configs, reverse proxies, or Lambda@Edge functions. Next.js middleware is neither. It's application-level edge logic that deploys with your code. No infrastructure to manage. No separate service to monitor.

Salespeak's integration uses this position to do three things automatically:

  • Detect AI agents by user agent: ChatGPT-User, BingPreview, PerplexityBot, Claude-User, Claude-Web, ClaudeBot, and others, identified at the edge in milliseconds
  • Track every AI visit to your Salespeak dashboard: which models visited, which pages, how often, and when
  • Inject AI-optimized content when available: AI visitors get AEO-tuned pages; if no optimized version exists, they get your original page as-is. Your site always loads normally.

Human visitors never see a difference. Your site looks and performs exactly the same. The middleware only activates its rewrite logic for recognized AI user agents.

Two files. That's the entire integration.

The architecture is minimal by design. Two files power everything:

middleware.ts (project root)

This file inspects the User-Agent header on every incoming request. It maintains a list of known AI crawler signatures — ChatGPT-User, ClaudeBot, PerplexityBot, BingPreview, and others. When a match is found, it rewrites the request to your Salespeak proxy route at /api/ai-proxy. When there's no match, the request passes through untouched.

One config value lives here: your Salespeak organization ID. That's the only thing you need to set.
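The check itself is just a substring match on the User-Agent header. A minimal sketch, where the signature list and function name are illustrative assumptions (the actual file in the Salespeak boilerplate may differ):

```typescript
// Hypothetical sketch of the user-agent check in middleware.ts.
// The signature list is illustrative, not exhaustive.
const AI_CRAWLER_SIGNATURES = [
  "ChatGPT-User",
  "ClaudeBot",
  "Claude-User",
  "Claude-Web",
  "PerplexityBot",
  "BingPreview",
];

function isAiCrawler(userAgent: string | null): boolean {
  if (!userAgent) return false;
  return AI_CRAWLER_SIGNATURES.some((sig) => userAgent.includes(sig));
}

// Inside middleware.ts, a match triggers a rewrite to the proxy route:
//
//   import { NextResponse, type NextRequest } from "next/server";
//
//   export function middleware(req: NextRequest) {
//     if (isAiCrawler(req.headers.get("user-agent"))) {
//       return NextResponse.rewrite(new URL("/api/ai-proxy", req.url));
//     }
//     return NextResponse.next();
//   }
```

Because the match runs against a short static list, the cost per request is negligible, which is why it's safe to run on every request at the edge.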

app/api/ai-proxy/route.ts (Edge route handler)

This route handles the actual work when an AI crawler is detected. It runs on Vercel's Edge runtime, the same globally distributed infrastructure behind Vercel's Edge Functions, so latency stays low worldwide. Here's what it does in sequence:

  1. Logs the AI visit to Salespeak — model name, page URL, timestamp, all pushed to your dashboard
  2. Fetches your original page — the same content your human visitors see
  3. Checks for AI-optimized content — pulls from your Salespeak alternate origin if an optimized version exists for that URL
  4. Injects optimized content when available — if Salespeak has an AEO-tuned version, the crawler receives it. If not, the original page is returned unchanged.

The fallback behavior matters. If your Salespeak account doesn't have optimized content for a given page, or if anything goes wrong with the proxy, the route returns your original page. Your site always works. There's no failure mode where visitors (human or AI) see a broken page.
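The fetch-and-fallback sequence can be sketched as a small helper. The alternate-origin URL shape and function name below are assumptions for illustration, and the real route also logs the visit to the Salespeak dashboard, which is omitted here:

```typescript
// Sketch of the proxy route's fallback logic. OPTIMIZED_ORIGIN is a
// placeholder, not a real Salespeak endpoint.
const OPTIMIZED_ORIGIN = "https://optimized.example.invalid";

async function resolveAiContent(
  pageUrl: string,
  fetchFn: typeof fetch = fetch, // injectable so the logic can be tested
): Promise<string> {
  try {
    // Check the alternate origin for an AEO-tuned version of this URL.
    const optimized = await fetchFn(
      `${OPTIMIZED_ORIGIN}/?page=${encodeURIComponent(pageUrl)}`,
    );
    // Serve it only when an optimized version actually exists.
    if (optimized.ok) return optimized.text();
  } catch {
    // Any proxy failure falls through to the original page below.
  }
  // Fallback: the same content human visitors see, unchanged.
  const original = await fetchFn(pageUrl);
  return original.text();
}
```

The try/catch is the important part: a down or misconfigured alternate origin degrades to serving the original page, never to an error page.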

Caching: intentionally disabled for AI

AI responses are never cached. The route returns Cache-Control: private, no-store, max-age=0 and Vary: User-Agent on every AI response. This ensures crawlers always receive the freshest version of your optimized content. Human visitors continue using Vercel's normal caching — ISR, SSG, whatever you've configured. The integration doesn't touch it.
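In code, that amounts to attaching two headers to every AI response. A sketch using the standard Response API available in the Edge runtime (the helper name is an assumption):

```typescript
// Sketch: no-store headers on every AI response. Helper name is hypothetical.
function withAiCacheHeaders(html: string): Response {
  return new Response(html, {
    headers: {
      // Never cache responses served to AI crawlers, anywhere in the chain.
      "Cache-Control": "private, no-store, max-age=0",
      // Key any intermediary cache on User-Agent so human traffic is unaffected.
      "Vary": "User-Agent",
      "Content-Type": "text/html; charset=utf-8",
    },
  });
}
```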

Setup: four steps, under five minutes

If you've deployed a Next.js app to Vercel before, this will feel familiar. There's nothing new to learn.

Step 1: Get the files

Two options. Clone the Salespeak Next.js boilerplate if you're starting fresh. Or copy the two required files (middleware.ts and app/api/ai-proxy/route.ts) into your existing Next.js project. Both approaches work identically.

Step 2: Configure your org ID

Open middleware.ts. Replace the placeholder organization ID with your Salespeak org ID. That's the only configuration change. One string, one file.

Step 3: Deploy

Commit and push to your deployment branch. Vercel auto-builds. The proxy route uses Edge runtime by default — no additional Vercel configuration, no environment variables to set, no build settings to adjust.

Step 4: Test

Send a request with an AI user agent string (e.g., ChatGPT-User) and confirm it routes through /api/ai-proxy. Check the response headers for Vary: User-Agent. If it's there, the integration is live.
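The check above can be scripted, for example with `npx tsx`. The function name and URL here are illustrative, and the fetch is injectable so nothing assumes your actual deployment:

```typescript
// Smoke test: send an AI user agent and verify the response carries
// Vary: User-Agent. fetchFn is injectable for testing.
async function verifyAiProxy(
  siteUrl: string,
  fetchFn: typeof fetch = fetch,
): Promise<boolean> {
  const res = await fetchFn(siteUrl, {
    headers: { "User-Agent": "ChatGPT-User/1.0" }, // any known AI crawler UA
  });
  const vary = res.headers.get("vary") ?? "";
  return vary.toLowerCase().includes("user-agent");
}

// Usage against a hypothetical deployment:
//   verifyAiProxy("https://your-site.vercel.app/").then((ok) =>
//     console.log(ok ? "integration live" : "Vary header missing"),
//   );
```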

Removal is equally simple. Delete the two files, redeploy. Traffic returns to normal immediately. No orphaned infrastructure, no dangling configs.

Full deployment guide: DEPLOYMENT.md
Support docs: LLM Analytics for Vercel

What you'll see in the dashboard

Once deployed, your Salespeak dashboard starts collecting data that GA physically cannot:

  • AI crawler breakdown: which models visit your site and how often. ChatGPT-User vs. ClaudeBot vs. PerplexityBot, with daily and weekly trends
  • Page-level crawl data: which URLs get the most AI attention, and which are being skipped entirely
  • Crawl frequency patterns: when AI models recrawl your pages, which ties directly to the AEO metrics that actually matter
  • Optimization status: which pages have AEO-optimized content active, and which are still serving standard pages to AI visitors

This data answers questions your current stack can't. Is Perplexity crawling your docs but ignoring your product pages? Did your latest content update trigger a recrawl from GPTBot? How does your AI crawl volume compare week over week? You'll know.

Why this fits the Vercel philosophy

Vercel's whole pitch is "develop, preview, ship." The Salespeak integration matches that energy. Two files to add. One value to configure. Push to deploy. No infrastructure to provision, no separate services to run, no API tokens to manage beyond your org ID.

Compare that to other hosting environments. Cloudflare requires API token creation and Worker deployment. WordPress needs a plugin plus server configuration. Both work well, but they involve steps outside your application code. The Vercel integration lives entirely inside your Next.js project. It deploys with your app, versions with your code, and removes with a git rm.

The middleware pattern isn't something Salespeak invented — it's how Next.js expects you to handle request-level logic. Auth checks, geo-redirects, A/B testing — they all use the same pattern. AI crawler detection is just another entry in that list. If you've written a Next.js middleware before, you already understand how this works.

Tracking is the baseline. Optimization is the upside.

Knowing that ClaudeBot visited your pricing page is useful. Knowing it visited and received a page structured with clear definitions, entity-rich descriptions, and citable statements? That's where AEO compounds.

Publishers who blocked AI crawlers entirely saw a 23% average traffic decline. The opposite approach works better: welcome AI crawlers, give them structured content, and make your pages easy to cite. The Salespeak integration does this automatically when you've configured optimized content for your key pages.

Start with tracking. See which pages AI models actually visit. Then prioritize optimization for those pages first, structured for AI search and built around the kind of definitive statements that get cited in AI-generated answers.

Two files. One config value. A git push. That's the full distance between being blind to AI traffic and controlling what AI models see when they visit your site.
