Speed & Performance
TL;DR
Speed carries a modest direct weight (5%) but can trigger a score floor at extreme levels. AI crawlers deprioritize slow pages, and extremely slow sites risk being skipped entirely during crawl cycles.
Last updated: 2026-03-09
What It Measures
The Speed factor evaluates how quickly your pages load and respond to requests. It measures server response time (Time to First Byte), full page load time, and Core Web Vitals metrics including Largest Contentful Paint and Cumulative Layout Shift. The evaluation focuses on what an AI crawler experiences, which is different from what a human visitor sees. AI crawlers do not execute JavaScript in most cases, so client-side rendering performance matters less than server response speed and initial HTML delivery. A page that loads fast for JavaScript-enabled browsers but returns a slow or empty initial response will score poorly on this factor. The factor uses performance data from multiple measurement points to account for geographic variation in server response times.
Why It Matters for AI
AI crawlers operate at scale, processing millions of pages within limited time windows. When a page is slow to respond, the crawler may time out, receive partial content, or simply move on to faster sites. Extremely slow sites risk being dropped from crawl queues entirely. The 5% weight reflects the fact that speed is more of a threshold than a gradient for AI purposes. Once your site is reasonably fast (server response under a few seconds), additional speed improvements have diminishing returns for AI visibility. But falling below the threshold can trigger a score floor that caps your entire score. Think of speed as a pass-fail gate rather than a linear scale. Good enough is fine. Extremely slow is disqualifying. Review How Scoring Works for the complete factor breakdown.
How to Check Yours
Use tools like Google PageSpeed Insights, WebPageTest, or GTmetrix to measure your page performance. Focus on server-side metrics: Time to First Byte and initial HTML delivery speed. These matter most for AI crawlers. Test from multiple geographic locations, especially if your audience or AI crawler traffic comes from different regions. Your AgentReady™ scan measures performance from the crawler's perspective and reports specific timing data. Look for pages where TTFB exceeds 3 seconds, as these are at risk of being deprioritized by AI crawlers. Also check for pages that return empty content initially and rely entirely on client-side rendering, since AI crawlers typically do not execute JavaScript.
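The TTFB check described above can be sketched with Python's standard library. This is an illustrative helper, not part of any AgentReady tool; the 3-second at-risk threshold mirrors the guidance in this section, and the function names are assumptions made for the example.

```python
import http.client
import time
from urllib.parse import urlparse

TTFB_AT_RISK_SECONDS = 3.0  # pages above this are at risk of being deprioritized

def measure_ttfb(url: str, timeout: float = 10.0) -> float:
    """Return seconds from sending the request until response headers arrive.

    This approximates Time to First Byte as a crawler would experience it:
    no JavaScript execution, just the raw server response.
    """
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=timeout)
    try:
        start = time.monotonic()
        conn.request("GET", parts.path or "/", headers={"User-Agent": "ttfb-check"})
        conn.getresponse()  # returns once the status line and headers are read
        return time.monotonic() - start
    finally:
        conn.close()

def classify_ttfb(seconds: float) -> str:
    """Label a measurement against the at-risk threshold discussed above."""
    return "at risk" if seconds > TTFB_AT_RISK_SECONDS else "ok"

if __name__ == "__main__":
    t = measure_ttfb("https://example.com/")
    print(f"TTFB: {t:.2f}s ({classify_ttfb(t)})")
```

Running this from several regions (for example, cloud instances in different data centers) surfaces the geographic variation mentioned above.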
How to Improve
Optimize server response time first. Use caching (both server-side and CDN), optimize database queries, and ensure your hosting infrastructure can handle concurrent requests without degradation. If you use a CMS, install server-side caching plugins and ensure your hosting plan has adequate resources. For sites with client-side rendering (React, Vue, Angular), implement server-side rendering (SSR) or static site generation (SSG) to ensure AI crawlers receive complete HTML on the initial request. Reduce page weight by optimizing images, minifying CSS and JavaScript, and eliminating unnecessary third-party scripts. Use a CDN to reduce geographic latency. If your site is on shared hosting and consistently slow, consider upgrading to a VPS or managed hosting solution. For related technical improvements, see Crawl Health and Bot Access.
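The server-side caching step above can be illustrated with a minimal in-process TTL cache for rendered HTML. This is a sketch only: production setups would typically use a CDN, a caching plugin, or a shared store such as Redis, and the class and method names here are invented for the example.

```python
import time
from typing import Callable, Dict, Optional, Tuple

class PageCache:
    """Minimal TTL cache: serve rendered HTML from memory so repeat requests
    (including crawler hits) skip the expensive render/database work."""

    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self._store: Dict[str, Tuple[float, str]] = {}  # path -> (stored_at, html)

    def get_or_render(self, path: str, render: Callable[[], str],
                      now: Optional[float] = None) -> str:
        """Return cached HTML for `path` if fresh, otherwise render and cache it."""
        now = time.monotonic() if now is None else now
        hit = self._store.get(path)
        if hit is not None and now - hit[0] < self.ttl:
            return hit[1]  # cache hit: fast response, no render cost
        html = render()  # cache miss or stale: do the slow work once
        self._store[path] = (now, html)
        return html
```

The design choice worth noting is that the cache stores the final HTML, which is exactly what benefits AI crawlers: the initial response already contains complete content, with no client-side rendering required.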
Frequently Asked Questions
Why is speed only 5% of the total score?
Speed functions more as a threshold than a spectrum for AI crawlers. Once you meet a reasonable performance baseline, further speed improvements have minimal impact on AI visibility. The 5% weight reflects this threshold nature, but remember that extremely slow speeds trigger a floor that caps your entire score.
Do Core Web Vitals affect my AgentReady™ score directly?
Core Web Vitals contribute to the Speed factor evaluation, but the factor emphasizes server-side performance metrics over client-side rendering metrics. AI crawlers care most about Time to First Byte and initial HTML content delivery.
My site uses a single-page application framework. Is that a problem?
It can be. If your SPA relies entirely on client-side rendering, AI crawlers may see empty pages. Implement server-side rendering or pre-rendering to ensure AI crawlers receive complete HTML content on the initial request.