The Diagnostics reports specialize in pointing out errors and suggestions
for improvement, along with raw crawling data that helps you identify server problems.
The Malware section has content only if Google has detected that your site has
been infected with malware. If so, follow the instructions shown to clean your
site, and then ask Google to re-check it. Check this report once a week even
if you are sure your site is clean.
The Crawl Errors report is one of the most helpful reports and one of the
biggest reasons GWT is so helpful. For both web and mobile content,
Google shows you errors of the following type:
HTTP: Generally a 400 or 403 error.
In sitemaps: Shows errors of various types for URLs listed in
your XML sitemap file.
Not followed: Shows links that Google chose not to follow, usually
because of excessive redirects or endless looping.
Restricted by robots.txt: Check this to double-check whether your
robots file is working the way you want it to.
Not found: Traditional “404” errors.
Soft 404s: “Page not found” errors that don’t return a true 404 HTTP
header code. These pages can lead to a lot of junk clogging up the
index.
Timed out: Usually due to the server being too busy to respond to
the request.
Unreachable: Usually due to a server error.
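The bucketing above can be sketched as a small classifier for your own log analysis. This is a hypothetical helper, not anything Google exposes, and the soft-404 heuristic (scanning the body text for “not found” wording) is deliberately crude:

```python
def classify_response(status: int, body: str = "") -> str:
    """Map an HTTP status (and optionally the body) to a GWT-style error bucket."""
    if status == 404:
        return "Not found"
    if status in (400, 403):
        return "HTTP"
    if status >= 500:
        return "Unreachable"
    if status == 200 and "page not found" in body.lower():
        # A 200 response whose body says the page is missing is a soft 404.
        return "Soft 404"
    return "OK"
```

Running your server logs or a crawl of your own site through a check like this approximates what the Crawl Errors report shows, without waiting for Google's next visit.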
Why is repairing these errors important?
First, your users might be seeing the same errors as Google is.
Second, there is a lot of spare PageRank and authority swimming around out there.
If Google can’t read your page, it can’t see the architecture you built.
NOTE This applies especially to the “Not found” category. This report
shows URLs on your site that other sites are actually linking to, but the
404 error is keeping your site from receiving credit for them. Repair the
404 errors (such as by redirecting the URL to an appropriate page on
your site) and recoup those links.
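Repairing a 404 with a redirect is easy to get wrong: a temporary 302 instead of a permanent 301, or a chain of hops long enough that Google gives up (“Not followed”). A minimal sketch of verifying the fix; `responses` is a hypothetical map of status and Location values that you would populate by issuing HEAD requests against your server:

```python
from urllib.parse import urljoin

def final_status(url, responses, max_hops=5):
    """Follow Location headers until a non-redirect status is reached."""
    for _ in range(max_hops):
        status, location = responses[url]
        if status not in (301, 302) or location is None:
            return status, url
        url = urljoin(url, location)  # Location headers may be relative
    return None, url  # gave up: excessive redirects ("Not followed")

# Hypothetical (status, Location) pairs gathered from HEAD requests:
responses = {
    "https://example.com/old-page": (301, "/new-page"),
    "https://example.com/new-page": (200, None),
}
```

If the chain ends on a 200 after a single 301 hop, the repaired URL should pass the link credit along.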
The Crawl Stats report shows three graphs:
Pages crawled per day: The number of distinct URLs that Google
crawls by day. Expect spikes when you introduce a lot of new
content, accrue strong links to your site, and submit new XML
sitemaps. If this graph bottoms out consistently, it’s likely due to crawl
obstacles or penalties.
Kilobytes downloaded per day: This graph frequently tracks the
pages-crawled graph closely.
Spikes and valleys can occur, however, if the pages downloaded by
Google are particularly large or small.
Time spent downloading a page (in milliseconds): This report
reflects page-load time, and smaller numbers are better. In theory, this
graph need not correspond to the two preceding ones.
Relatively large spikes can suggest server problems or abnormally
large file sizes.
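Reading the three graphs together is often more telling than any one alone. A back-of-the-envelope combination (the helper and the numbers are hypothetical, not anything GWT exports):

```python
def crawl_stats_summary(pages_per_day, kb_per_day, ms_per_page):
    """Derive average page size (KB) and implied transfer rate (KB/s)."""
    avg_kb = kb_per_day / pages_per_day
    kb_per_sec = avg_kb / (ms_per_page / 1000.0)
    return avg_kb, kb_per_sec

# e.g. 2,000 pages and 60,000 KB crawled in a day at 300 ms per page
# gives 30 KB pages fetched at roughly 100 KB/s.
```

If kilobytes per day spikes while pages per day stays flat, your average page size grew; if download time spikes while page size is steady, suspect the server rather than the content.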
The HTML Suggestions section highlights pages for which Google has detected
potential issues with your site’s meta content, including:
Meta descriptions: Highlights duplicates and descriptions that are
too long or short.
Title tags: Highlights URLs for which titles are missing, duplicated,
too long, too short, and uninformative.
Non-indexable content: Highlights content that Google can’t read
or interpret correctly.
It’s worthwhile to look at this section with a critical eye toward your
content. Google won’t point out issues unless it feels they’re giving users a
poor experience, and its algorithm is all about enriching the user
experience. It may be entirely appropriate for five URLs on your site to share
the same title, but this report nearly always highlights several areas to improve.
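The presence-and-length checks behind these suggestions can be sketched with the standard-library HTML parser. The thresholds below are common SEO rules of thumb, not Google's published numbers, and the real report also flags duplicates and too-short text:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the <title> text and the description <meta> tag from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if (a.get("name") or "").lower() == "description":
                self.description = a.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def issues(html):
    """Return a list of GWT-style complaints about title and description."""
    p = MetaAudit()
    p.feed(html)
    out = []
    if not p.title.strip():
        out.append("missing title")
    elif len(p.title) > 65:          # rule of thumb, not Google's limit
        out.append("title too long")
    if not p.description:
        out.append("missing meta description")
    elif len(p.description) > 160:   # likewise a rule of thumb
        out.append("description too long")
    return out
```

Running a check like this over your own templates catches the most common complaints before Google does.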
The Labs section of webmaster reports is where Google tests reporting
structures before it considers them ready for prime time. But that doesn’t
mean their data is unhelpful. In fact, some Labs reports are as helpful in
diagnosing site problems as reports in the other areas of GWT. Following
are the Labs reports and a brief description of each.
Fetch as Googlebot
This is Google’s version of a “header checker,” and it’s quite similar to a
long-time favorite tool of SEOs, Rex Swain’s HTTP Viewer.
Insert a URL from your site, and select whether you want it checked by
Google’s main crawler (“Web”) or by its mobile crawler (“XHTML” or
“cHTML”). Check back in a minute or two, and if Google has crawled the
page, there will be a link called “Success” that you can click to see the
code Google crawled.
This tool is very helpful for ensuring that your pages are showing the
correct HTTP header code (200, 302, 301, and so on), and it’s especially
helpful for testing your site’s mobile device detection and redirection. For
example, testing your desktop site as Google’s mobile crawler will help
you know whether mobile devices are being redirected correctly to your mobile pages.
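A minimal header-checker in the spirit of Fetch as Googlebot: request a page with a crawler User-Agent and inspect the raw status code and headers without following redirects. This is a sketch for testing your own site, not a reproduction of Google's tool:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Surface 301/302 responses instead of silently following them."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise HTTPError for redirects

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_headers(url, user_agent=GOOGLEBOT_UA):
    """Return (status_code, headers) for url, without following redirects."""
    opener = urllib.request.build_opener(NoRedirect)
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with opener.open(req, timeout=10) as resp:
            return resp.status, dict(resp.headers)
    except urllib.error.HTTPError as e:
        return e.code, dict(e.headers)
```

Calling `fetch_headers` on a desktop URL with a mobile crawler User-Agent shows whether your device detection issues the redirect you expect.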
The Site Performance report shows a graph of random page-load times
from your site over the last several months. Google has arbitrarily defined
“slow” as the slowest 80 percent of sites on the Internet, and “fast” as the
fastest 20 percent. This means your site could be faster than nearly 80
percent of sites on the Internet and still be considered “slow” by Google’s standards.
Consequently, I recommend that you don’t pay a lot of attention to those
labels. Instead, pay attention to spikes that relate to your pages loading
more slowly, and try to determine whether it’s feasible to trim your load times.
Further, the report offers suggestions about how to cut the load time of
your pages, including specific predictions about how changes such as
minimizing DNS lookups will affect the size of your pages.
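You can sample your own load times rather than relying on the graph's monthly granularity. A rough sketch: fetch a page a few times and average the elapsed milliseconds (wall-clock time as seen by the client, which is what matters here):

```python
import time
import urllib.request

def load_time_ms(url, samples=3):
    """Average wall-clock time (ms) to fetch url and read the full body."""
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as resp:
            resp.read()
        total += (time.perf_counter() - start) * 1000.0
    return total / samples
```

Run this against your slowest template pages before and after a change to see whether the trimming actually helped.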
The Video Sitemaps report is similar to the Sitemaps report in the Site Configuration
section of GWT, except its purpose is to diagnose and report on video
content found in XML sitemaps. Currently the report shows very little
information other than listing all of your existing XML feeds.
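For reference, a video sitemap entry follows the ordinary sitemap format plus a video namespace. A minimal example with placeholder URLs; check Google's current documentation for the full set of required and optional tags:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/video-landing-page.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Example video title</video:title>
      <video:description>A short description of the video.</video:description>
      <video:content_loc>http://www.example.com/video123.flv</video:content_loc>
    </video:video>
  </url>
</urlset>
```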