Google Clarifies Its 15MB Googlebot Limit Applies To Each Individual Subresource
Google has updated the Googlebot help document on crawling to clarify that the 15MB fetch size limit applies individually to each subresource referenced in the HTML, such as JavaScript and CSS files, not just to the HTML file itself.
Google first documented this 15MB limit several months ago, which caused a lot of concern in the SEO industry at the time, so make sure to read that story.
The help document now reads:
Googlebot can crawl the first 15MB of an HTML file or supported text-based file. Each resource referenced in the HTML such as CSS and JavaScript is fetched separately, and each fetch is bound by the same file size limit. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of the file for indexing. The file size limit is applied on the uncompressed data. Other Google crawlers, for example Googlebot Video and Googlebot Image, may have different limits.

Previously, the document read:

Googlebot can crawl the first 15MB of an HTML file or supported text-based file. Any resources referenced in the HTML such as images, videos, CSS, and JavaScript are fetched separately. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of the file for indexing. The file size limit is applied on the uncompressed data. Other Google crawlers may have different limits.

— johnmu is not a chatbot yet 🐀 (@JohnMu) March 19, 2023
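To make the clarified behavior concrete, here is a minimal sketch of how the limit works: each resource is measured on its own uncompressed size against the 15MB cap, and sizes are never summed across the HTML and its subresources. The file names and byte counts below are hypothetical examples, not anything from Google's documentation.

```python
# Sketch of Googlebot's per-fetch size rule as described in the help document:
# the 15MB limit applies to the uncompressed bytes of EACH fetched resource
# independently, not to the combined size of a page and its subresources.

GOOGLEBOT_LIMIT = 15 * 1024 * 1024  # 15MB, measured on uncompressed data


def within_googlebot_limit(uncompressed_size: int) -> bool:
    """True if a single fetched resource fits entirely within the 15MB cap."""
    return uncompressed_size <= GOOGLEBOT_LIMIT


# Hypothetical resources for one page; each is checked separately.
resources = {
    "index.html": 2 * 1024 * 1024,    # 2MB  -> fully crawled
    "app.js": 16 * 1024 * 1024,       # 16MB -> truncated after the first 15MB
    "styles.css": 500 * 1024,         # 0.5MB -> fully crawled
}

for name, size in resources.items():
    if within_googlebot_limit(size):
        print(f"{name}: fully crawled")
    else:
        print(f"{name}: only the first 15MB is considered")
```

Note that the HTML plus its subresources here total well over 15MB, yet only `app.js` is affected, because each fetch is bound by the limit on its own.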