Improve Page Speed
TL;DR
Slow pages frustrate AI crawlers just like they frustrate human visitors. Pages that load slowly are less likely to be fully crawled and indexed by AI systems. Improving your Core Web Vitals, optimizing images, and speeding up server responses boost both AI readiness and user experience.
Last updated: 2026-03-09
Understanding Core Web Vitals
Core Web Vitals are Google's metrics for measuring real-world page performance. They include Largest Contentful Paint (LCP, how fast the main content loads), Interaction to Next Paint (INP, how responsive the page is), and Cumulative Layout Shift (CLS, how stable the layout is).
These metrics matter for AI readiness because they correlate with how quickly and reliably crawlers can access your content. A page with a 6-second LCP is not just slow for humans — it is slow for bots too. AI crawlers allocate limited time per page. If your page takes too long to load, the crawler may time out before parsing all your content.
Check your Core Web Vitals using Google PageSpeed Insights, Lighthouse (built into Chrome DevTools), or Google Search Console's Core Web Vitals report. These tools give you specific scores and actionable recommendations for each metric.
Aim for LCP under 2.5 seconds, INP under 200 milliseconds, and CLS under 0.1. Pages meeting these thresholds pass the Core Web Vitals assessment and provide a fast, reliable experience for both humans and AI crawlers.
Optimizing Images
Images are typically the largest files on any web page and the biggest contributor to slow load times. Unoptimized images can add megabytes to your page weight, dramatically slowing First Contentful Paint and LCP.
Start by converting images to modern formats. WebP and AVIF are 25-50% smaller than JPEG and PNG at the same visual quality. Most modern browsers support WebP, and AVIF support is growing. Use the <picture> element with fallbacks for older browsers.
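As a sketch, a <picture> element that offers AVIF and WebP with a JPEG fallback (the filenames and dimensions are illustrative):

```html
<!-- The browser picks the first <source> whose format it supports, top to bottom. -->
<picture>
  <source srcset="hero.avif" type="image/avif">
  <source srcset="hero.webp" type="image/webp">
  <!-- JPEG fallback for browsers that support neither AVIF nor WebP -->
  <img src="hero.jpg" alt="Product hero image" width="1200" height="800">
</picture>
```

Order matters: list the most efficient format first, since the browser stops at the first match.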
Resize images to the dimensions they are actually displayed at. A 4000x3000 pixel image displayed in a 400x300 container wastes 99% of its data. Generate multiple sizes and use srcset to let the browser pick the right one.
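For example, a responsive image with three generated sizes (filenames and breakpoints are illustrative; adapt them to your layout):

```html
<!-- srcset lists the candidates; sizes tells the browser the rendered width
     so it can download the smallest file that still looks sharp. -->
<img
  src="photo-400.jpg"
  srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 400px"
  alt="Team photo" width="400" height="300">
```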
Add loading="lazy" to images below the fold. This tells the browser to delay loading off-screen images until the user scrolls near them. Do not lazy-load the hero image or any images in the first viewport — those should load immediately to keep LCP fast.
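A minimal sketch of the split between above-the-fold and below-the-fold images (paths are placeholders; fetchpriority is an optional hint supported in modern browsers):

```html
<!-- Hero image: loads immediately (eager is the default) and is the likely LCP element. -->
<img src="hero.webp" alt="Hero banner" width="1200" height="600" fetchpriority="high">

<!-- Below the fold: deferred until the user scrolls near it. -->
<img src="chart.webp" alt="Quarterly results chart" width="800" height="500" loading="lazy">
```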
Always include width and height attributes on image tags. This prevents layout shifts (CLS) by reserving space before the image loads.

Minimizing JavaScript Impact
Heavy JavaScript bundles are the second most common cause of slow pages. Every kilobyte of JavaScript must be downloaded, parsed, compiled, and executed before the page is fully interactive. This affects both human users and AI crawlers.
Audit your JavaScript bundles using Chrome DevTools Coverage panel. It shows how much of your downloaded JavaScript is actually used on the current page. Many sites ship 500KB or more of JavaScript but use less than half of it on any given page.
Remove unused JavaScript first. This includes analytics scripts you no longer use, third-party widgets that are not visible, and legacy polyfills for browsers you no longer support. Each removed script reduces both download time and execution time.
For JavaScript that must remain, use code splitting to load only what the current page needs. Modern bundlers (Webpack, Vite, esbuild) support dynamic imports that load code on demand. Defer non-critical scripts using the defer or async attributes so they do not block the initial page render.
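The loading strategies above can be sketched as follows (script paths and the module name are hypothetical):

```html
<!-- defer: downloads in parallel, executes in document order after HTML parsing finishes. -->
<script src="/js/app.js" defer></script>

<!-- async: executes as soon as it arrives; suitable only for independent scripts like analytics. -->
<script src="/js/analytics.js" async></script>

<!-- Dynamic import: heavy code loads only when it is actually needed. -->
<script type="module">
  document.querySelector('#open-chart')?.addEventListener('click', async () => {
    const { renderChart } = await import('/js/chart.js'); // hypothetical module
    renderChart();
  });
</script>
```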
AI crawlers that render JavaScript (like Google's) benefit directly from faster execution. Crawlers that do not render JavaScript benefit indirectly because a lighter page loads its HTML and structured data faster.

Improving Server Response Time
Time to First Byte (TTFB) measures how long your server takes to begin sending the page back after receiving a request. A TTFB over 600ms is slow. Under 200ms is good. This metric directly impacts how quickly crawlers can access your content.
If your TTFB is consistently slow, diagnose the cause. Common culprits include slow database queries, unoptimized server-side code, missing server-level caching, and under-provisioned hosting. Each of these has a different fix.
Enable server-side caching for pages that do not change frequently. Most content pages can be cached for minutes or hours. Use cache-control headers to tell browsers and CDNs how long to cache each response. For dynamic pages, consider edge-side rendering or incremental static regeneration.
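As an illustration, response headers for a content page cached for one hour at the CDN and five minutes in the browser (the exact values are assumptions to adapt to your content):

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Cache-Control: public, max-age=300, s-maxage=3600, stale-while-revalidate=60
```

Here max-age governs browsers, s-maxage overrides it for shared caches like CDNs, and stale-while-revalidate lets the cache serve a slightly stale copy while it refetches in the background.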
Upgrade your hosting if you are on a shared server. AI crawlers add load to your server, and a slow server that struggles with human traffic will perform even worse when crawlers visit. A modern hosting platform with adequate resources is a foundational investment in speed and performance.
CDN and Compression
A Content Delivery Network (CDN) caches your pages on servers around the world, serving each visitor from the nearest location. This dramatically reduces latency for visitors far from your origin server and improves TTFB globally.
If you are not using a CDN, you should be. Services like Cloudflare, Fastly, and AWS CloudFront offer free or low-cost tiers that handle most small to medium sites. Setup is typically straightforward: point your DNS to the CDN and configure caching rules.
Enable compression on your server. Gzip reduces HTML, CSS, and JavaScript file sizes by 60-80%. Brotli compression is even more efficient, reducing sizes by an additional 15-20% compared to gzip. Most modern web servers and CDNs support both.
Combine CDN and compression for maximum impact. A compressed HTML file served from a CDN edge location reaches the crawler in a fraction of the time compared to an uncompressed file served from a distant origin server. This efficiency matters when AI crawlers process hundreds of your pages in a single crawl session.
Test your compression using the command curl -I -H "Accept-Encoding: gzip, br" https://yourdomain.com. The response should include a Content-Encoding: gzip or Content-Encoding: br header.
Frequently Asked Questions
Do AI crawlers actually care about page speed?
Yes. AI crawlers allocate limited time per page during a crawl session. Slow pages may time out before the crawler finishes parsing all content and structured data. Faster pages are more reliably crawled, which means more of your content gets into AI systems.
What is the most impactful speed improvement I can make?
For most sites, image optimization gives the biggest improvement with the least effort. Converting to WebP, resizing to display dimensions, and adding lazy loading can cut page weight by 50% or more. After images, removing unused JavaScript is typically the next biggest win.
Does page speed affect my AgentReady™ score?
Yes. Page speed is part of the speed and performance factor in your AgentReady™ scan. Sites with fast load times score higher on this factor, which contributes to your overall AI readiness score. However, speed is one of several factors, so a perfect speed score alone will not compensate for missing schema or blocked AI crawlers.