Google Search Console’s Index Coverage report is receiving 4 updates to keep site owners better informed about indexing issues.
The Index Coverage report is newer than most reports Google offers, having been first introduced when the revamped version of Search Console launched in 2018.
Since the launch of the Index Coverage report, site owners have been sharing feedback with Google about improvements they’d like to see.
Changes to the Index Coverage report, rolling out today, are based on the feedback provided by the webmaster community.
Google’s announcement states:

“Based on the feedback we got from the community, today we are rolling out significant improvements to this report so you’re better informed on issues that might prevent Google from crawling and indexing your pages. The change is focused on providing a more accurate state to existing issues, which should help you solve them more easily.”
Changes to Search Console Index Coverage Report
The list of changes to the Index Coverage report in Search Console includes:
- Removal of the generic “crawl anomaly” issue type – all crawl errors should now be mapped to an issue with a finer resolution.
- Pages that were submitted but blocked by robots.txt and got indexed are now reported as “indexed but blocked” (warning) instead of “submitted but blocked” (error).
- Addition of a new issue: “indexed without content” (warning).
- Soft 404 reporting is now more accurate (see the soft 404 illustration after this list).
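For context, a soft 404 is a page that tells visitors the content doesn’t exist but still returns an HTTP 200 status, so Google may treat it as a normal page. The sketch below uses Flask purely as an illustrative framework, and the route and content store are hypothetical; the point is simply that missing content should return a real 404 status.

```python
# Hypothetical Flask route illustrating a real 404 versus a "soft 404"
# (a not-found page mistakenly served with HTTP 200).
from flask import Flask

app = Flask(__name__)

ARTICLES = {"index-coverage-update": "Full article text..."}  # stand-in content store

@app.route("/articles/<slug>")
def article(slug):
    if slug not in ARTICLES:
        # Returning 200 here with "not found" text would be a soft 404;
        # the explicit 404 status tells Google the page genuinely doesn't exist.
        return "Article not found", 404
    return ARTICLES[slug]
```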
The overarching theme of these updates appears to be data accuracy.
There’s no more guesswork involved when it comes to crawl errors, as the generic “crawl anomaly” issue is being replaced with specific issues and resolutions.
Site owners will know with certainty if a page indexed by Google is blocked by robots.txt because the report will state “indexed but blocked” rather than “submitted but blocked.” Submitting a URL is not the same as having it indexed, and the report is now updated to reflect that.
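To make the distinction concrete, here’s a minimal sketch using Python’s standard-library urllib.robotparser to check whether a URL is disallowed for Googlebot; the domain and path are hypothetical placeholders. Remember that a disallow rule only blocks crawling, not indexing, so a blocked URL can still end up in Google’s index via links from other pages.

```python
# Minimal sketch: check whether a URL is disallowed for Googlebot.
# The domain and path below are hypothetical placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://www.example.com/robots.txt")
robots.read()  # fetch and parse the live robots.txt file

url = "https://www.example.com/private/page"
if not robots.can_fetch("Googlebot", url):
    # Crawling is blocked, but the URL can still be indexed if other
    # pages link to it -- the scenario the "indexed but blocked"
    # warning now describes.
    print(f"{url} is disallowed for Googlebot")
```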
Soft 404 reporting is said to be more accurate, and there’s the addition of a brand-new issue called “indexed without content.” Let’s take a closer look at that issue in case it comes up in one of your reports.
Here’s what the Search Console Help page says about indexed without content:
“This page appears in the Google index, but for some reason Google could not read the content. Possible reasons are that the page might be cloaked to Google or the page might be in a format that Google can’t index. This is not a case of robots.txt blocking.”
If you come across the “indexed without content” issue, it means the URL is in Google’s index but its web crawlers cannot view the content.
That could mean you’ve accidentally published a blank page, or that there’s an error on the page that is preventing Google from rendering the content.
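If you want a quick first check before opening Search Console, the rough sketch below (standard-library Python; the URL and size threshold are placeholders) fetches the raw HTML and flags a near-empty response. Keep in mind this only catches server-side blank pages; it won’t reproduce rendering failures, which Google may hit even when the raw HTML looks fine.

```python
# Rough first-pass check for a blank page: fetch the raw HTML and flag
# a suspiciously small response body. The URL and threshold are
# placeholders; this won't catch JavaScript rendering failures.
from urllib.request import Request, urlopen

url = "https://www.example.com/suspect-page"
req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
html = urlopen(req, timeout=10).read().decode("utf-8", errors="replace")

if len(html.strip()) < 500:  # arbitrary cutoff for "nearly empty"
    print("Response body is nearly empty; the page may have been published blank")
```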
For further guidance on resolving an “indexed without content” error, I advise site owners to run the affected page through Google’s URL Inspection tool.
The URL Inspection tool renders the page as Google sees it, which may help with understanding why the content is not viewable to Google’s web crawlers.
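URL inspection can also be done programmatically. The sketch below assumes access to the URL Inspection endpoint of the Search Console API via the google-api-python-client library, with a service account already granted access to the property; the credentials file, site URL, and page URL are placeholders, and the endpoint and field names should be verified against Google’s current API documentation.

```python
# Sketch: inspect a URL programmatically via the Search Console API.
# Assumes a service account with access to the property; the credentials
# file, site URL, and page URL below are hypothetical placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "inspectionUrl": "https://www.example.com/suspect-page",
    "siteUrl": "https://www.example.com/",
}
result = service.urlInspection().index().inspect(body=body).execute()

# coverageState summarizes how the page is indexed, e.g. whether it
# was crawled and whether its content could be read.
print(result["inspectionResult"]["indexStatusResult"]["coverageState"])
```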
These changes are now reflected in the Index Coverage report. Site owners may see new types of issues or changes in issue counts.
For more information, see Google’s official blog post.