How does Google Webmaster Tools work?

Check the Advanced Index Status report and examine the total pages indexed, the number of pages removed, and the number of pages blocked by robots.txt. For more information on crawling and indexation metrics, read this.

The 404 error occurs whenever there is no page for the URL requested. Common causes of 404s include typos in the destination URL of a link and failure to redirect the URL of a page that was moved or deleted.

Both causes of 404s can be detrimental to both the user experience and your SEO efforts. A few 404s may not represent any significant inconvenience to your users or waste of link juice, but many 404s will be problematic.

Resolve problem 404s by redirecting to the appropriate page, by changing the destination URL of the inbound link, or by restoring content to the 404ing URL, depending on what is most practical and most beneficial to your users. This post explains the right perspective on 404s. Error responses such as 404s are all non-crawlable, and Google Webmaster Tools reports on all of them. Soft 404s are a user-experience and SEO issue, and GWT can be the best way to find them without manual checking, though some of the pages it flags might not actually be soft 404s.
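
To make the hard-404 versus soft-404 distinction concrete, here is a minimal sketch of how you might classify fetched pages yourself. The phrases checked and the classification rules are illustrative assumptions, not how GWT actually detects soft 404s.

```python
# Hypothetical sketch: classify a fetched page as OK, hard 404, or soft 404.
# A "soft 404" returns HTTP 200 but shows not-found content; the phrases
# below are illustrative guesses, not an exhaustive or official list.

NOT_FOUND_PHRASES = ("page not found", "no longer available", "does not exist")

def classify_response(status_code: int, body_text: str) -> str:
    """Return 'hard-404', 'soft-404', or 'ok' for a fetched page."""
    if status_code == 404:
        return "hard-404"
    if status_code == 200:
        lowered = body_text.lower()
        if any(phrase in lowered for phrase in NOT_FOUND_PHRASES):
            return "soft-404"  # 200 status, but the content says "not found"
    return "ok"

print(classify_response(404, ""))                        # hard-404
print(classify_response(200, "Sorry, page not found."))  # soft-404
print(classify_response(200, "Welcome to our shop."))    # ok
```

In practice you would feed this the status codes and bodies from a site crawl; GWT remains the easier source for soft-404 candidates.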

However, GWT does not report on crawl issues like misplaced meta robots tags or faulty redirects. The Crawl Stats graphs are pretty volatile, but do look for big, unusual spikes and distinct trends; the GWT help articles cover these reports in more detail. Fetch as Google is an essential tool for making sure your pages are SEO-friendly, or at least Google-friendly.

I recommend requesting a Fetch and Render on every template you have and on every critical SEO landing page. Using Fetch as Google on many sites has shown me just how often pages render differently for Google than expected.

Incidentally, Pierre Far also told conference attendees that the biggest SEO error he sees is accidentally blocking Google from crawling your entire website.
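
The block-everything mistake Far describes usually comes down to a single character in robots.txt. A minimal illustration (the admin path is a hypothetical example):

```txt
# BAD: this blocks every crawler from the entire site
User-agent: *
Disallow: /

# GOOD: block only what you mean to block, e.g. an admin area
User-agent: *
Disallow: /admin/
```

A bare `Disallow: /` often sneaks into production when a staging-site robots.txt is deployed by accident.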

However, I prefer checking the robots.txt file directly. Fetch as Google is not a replacement for crawl-friendliness best practices like a well-configured robots.txt file, but Fetch should be used for site upgrades, URL migrations, breaking important news, and launching batches of new content. An XML Sitemap is an opportunity to tell Google and the other search engines which pages on your site you want crawled and indexed. For large sites or sites with frequently updated content, a Sitemap is pretty important.

Sitemaps can get tricky, especially when you have a large site or when you use special Sitemaps for images, video, news, mobile, or source code. It is recommended that you always validate your Sitemaps before going live, and what better way to validate than through the eyes of Google? Check in regularly to see whether there are any errors or warnings.

Often, a Sitemap error will reveal a larger problem with your site; GWT provides the list of possible Sitemap errors. In addition, pay attention to the number of URLs (or images, videos, etc.) submitted versus the number indexed. It is not uncommon for there to be a discrepancy here, but one of your SEO goals is to get the search engines to index everything you want indexed.
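
For reference, a minimal valid XML Sitemap looks like this (the URLs and date are placeholders); many of the errors GWT reports come from deviating from this structure or from malformed `<lastmod>` dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-02-09</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/widget</loc>
  </url>
</urlset>
```

Every `<url>` entry needs a `<loc>`; the other child elements are optional.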

The tricky part is seeing which pages are not indexed (in fact, this topic could warrant its own article), but this may be possible with a Google site: search and the Analytics landing-page reports. If the pages that are not indexed are important to you, there are a few things you can do to improve indexation.
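
One rough way to approximate that comparison is a set difference between your Sitemap URLs and the landing pages that appear in your Analytics organic-traffic report. The URLs below are made-up placeholders; in practice you would export both lists.

```python
# Sketch: treat sitemap URLs that never appear as organic landing pages
# as candidates for "not indexed". Getting no traffic does not prove a
# page is unindexed, so the result is a list to investigate, not a verdict.

sitemap_urls = {
    "https://www.example.com/",
    "https://www.example.com/products/widget",
    "https://www.example.com/blog/old-post",
}

organic_landing_pages = {
    "https://www.example.com/",
    "https://www.example.com/products/widget",
}

candidates = sorted(sitemap_urls - organic_landing_pages)
print(candidates)  # ['https://www.example.com/blog/old-post']
```

Pair the candidate list with spot-check site: searches to confirm which pages are genuinely missing from the index.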

Also, unindexed pages may be a red flag that those pages lack inbound links or lack content the engines perceive as unique.

On a related note, I wrote about building Sitemaps here. Incorrectly excluding URLs could result in many pages disappearing from search. One method is a Google site: search for your domain, looking for parameterized URLs in the results; this will give you an indication of which parameters are getting indexed.

The other method is to look for the parameters in the results of a site crawl. You can also configure parameter handling in GWT, but this should typically be a band-aid rather than a permanent fix and, as noted, should always be done with great caution.

This guide was last updated on February 9 and was originally published in March. I really hope this Webmaster Tools guide is useful to you.
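
To make the crawl-based method concrete, here is a sketch that tallies which query-string parameters appear across a list of crawled URLs (the URLs are invented for illustration):

```python
# Sketch: count query-string parameters across crawled URLs to see which
# ones are common enough to need attention. Example URLs are placeholders.
from collections import Counter
from urllib.parse import urlparse, parse_qs

crawled_urls = [
    "https://www.example.com/shop?color=red&sort=price",
    "https://www.example.com/shop?color=blue",
    "https://www.example.com/shop?sessionid=abc123",
]

param_counts = Counter()
for url in crawled_urls:
    for param in parse_qs(urlparse(url).query):
        param_counts[param] += 1

# Parameters like sessionid are prime candidates for exclusion, while
# parameters like color may carry content you want indexed.
print(param_counts.most_common())  # [('color', 2), ('sort', 1), ('sessionid', 1)]
```

Run the same tally against the URLs from your site: search results and compare the two lists before touching the GWT parameter settings.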

The Coverage report tells you which pages on your website are broken. I like to use this report to determine whether there is index bloat. By pairing data from Google Search Console with Google Analytics, webmasters can find out whether they are the unfortunate recipients of index bloat: you want to see whether the number of pages in the Coverage report matches the number of landing pages receiving organic traffic in Google Analytics. If you notice problems, you can report an indexing issue directly to Google.
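
A crude version of that comparison can be done with two numbers. The counts below are placeholders for the figures you would read out of the Coverage report and Google Analytics, and the threshold is an arbitrary illustration, not an industry standard.

```python
# Sketch: a rough index-bloat check. Placeholder numbers stand in for the
# Coverage report's valid-page count and GA's organic landing-page count.

indexed_pages = 5400         # valid pages in the Coverage report
organic_landing_pages = 900  # landing pages with organic sessions in GA

ratio = indexed_pages / organic_landing_pages
print(f"{ratio:.1f} indexed pages per organic landing page")

# A ratio far above 1 means many indexed pages earn no search traffic,
# which is one symptom of index bloat (thin, duplicate, or parameter pages).
if ratio > 3:  # arbitrary illustrative threshold
    print("Possible index bloat -- review what is being indexed.")
```

A high ratio is a prompt to investigate, not proof of a problem; some sites legitimately have many long-tail pages with little traffic each.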

After a long month of page-speed enhancements and keyword mapping, the easiest quick fix you can make to your website is cleaning up your crawl errors. With just one redirect, you can transform a broken link into a magical nest of backlink unicorns.
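
That "one redirect" is usually a server-side 301. On an Apache server, for example, a single mod_alias line in .htaccess does it (the paths here are hypothetical):

```txt
# 301-redirect a broken URL to its current home, preserving link equity
Redirect 301 /old-page/ https://www.example.com/new-page/
```

Other servers have equivalents (e.g. a `return 301` rule in nginx); the key is using a permanent 301 rather than a temporary 302.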

So much of what you include in your sitemap, from excluding tags to removing categories, can affect your website. The Sitemaps report in Google Search Console shares insights into what is happening on your website.

Yes, these are errors found in your sitemap, but so much more can be discovered from this error report. Fear not: your best defense against sitemap errors is digging deep into why they occurred in the first place. Say a large website comes to me for an SEO audit. Awesome, right? After taking five seconds to review their sitemap errors, I noticed that only one URL was being indexed compared to 16, URLs from this sitemap.

The Removals section in Google Search Console can be, in a word, complicated. If you want to temporarily hide something from Google Search, like massive amounts of thin or duplicate content, you can add the URL as a Temporary Removal in the Removals section. You can also use the Removals section to clear a cached URL and completely remove the page-description snippet. Google dives into how you can hide content using the Removals section as well.
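
Temporary removals are, as the name says, temporary. A lasting fix is to keep the page out of the index with a meta robots noindex tag, a standard technique shown here as a generic example:

```html
<!-- In the page's <head>: keeps the URL out of the index even after
     a temporary removal in Search Console expires -->
<meta name="robots" content="noindex">
```

For non-HTML files, the same signal can be sent with an `X-Robots-Tag: noindex` HTTP response header.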

In the Outdated Content section, you can view removal requests sent by the public, so any person can suggest an update to your search results if the correct information is not available. In the SafeSearch Filtering section, you can see what content has been reported as adult content through the SafeSearch suggestion tool.

You want the Enhancements reports to tell you the secrets to getting your structured data working and whether your AMP pages are active, and you love it when they say you have no errors. Just tap the Enhancements tab in the left menu bar to view all the reports and tools. Let the nerding out begin! The Core Web Vitals report is meant to show you the quality of the experience a user has on your site.

The Data Highlighter is very user-friendly and can be used to tag at least nine types of data, and every tag corresponds to a schema.org type. Whenever feasible, I prefer adding schema.org markup directly in the page code; if hard-coding schema is not practical, take the Data Highlighter for a spin. The HTML Improvements section can not only help you improve the appearance of your SERP listings, but also help you find opportunities to address keyword optimization and duplicate-content issues.
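
Hand-coding schema.org markup is typically done with a JSON-LD block in the page. Here is a generic example for an article; every value is a placeholder:

```html
<!-- JSON-LD schema.org markup for an article; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Title",
  "datePublished": "2024-02-09",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```

Unlike the Data Highlighter, markup in the page itself is visible to every search engine, not just Google.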

Sniff out duplicate content. As you likely know, it is generally a bad practice to have pages that do not contain content unique to that page, so use caution. Google your brand, then do it again in private browsing. Assuming you have sitelinks, do you like them? Occasionally, sitelinks point to pages that convert poorly or offer suboptimal UX. If you have a lot of branded traffic and a crappy sitelink or two, fixing it is a big and easy win.

The most common big-win scenario I see is when a site has been getting a lot of traffic to a page that has suddenly become dated (for example, a seasonal or out-of-stock product). Just make sure demoting that sitelink is the right thing to do.

Also, if the vast majority of Google traffic to a page comes through a sitelink (which you can determine by analyzing the page in the Search Queries report and noting how many clicks come from branded queries), then you can estimate conversion and engagement for the sitelink in a Google Analytics landing-page report filtered to only Google organic traffic.

Links to Your Site gives you data on who links to which pages on your site. We use Open Site Explorer by Moz. Majestic SEO is a third option and has the largest database among the premium link tools.

For all of the above metrics, GWT offers 16 months of data and allows you to compare two date ranges. More importantly, it highlights any indexing errors or warnings and gives you information for analysis.

This allows you to check for any errors or inconsistencies, and you can also see the date that the page was last crawled.


