Saturday, 27 December 2014
Japanese WMC Blog launched
We just launched a new Webmaster Central Blog in Japanese. For those of you who feel more comfortable reading Japanese, are interested in webmaster-related information from Google, and want to learn about issues specific to our region and language, we hope you enjoy our Japanese version of the Webmaster Central Blog :D
Thursday, 25 December 2014
Feliz Navidad from the Spanish Webmaster Central team!
It's been both a pleasure and a great opportunity for us to share our knowledge and hear your feedback. A few of this year's highlights:
For the blog, we had:
- Matt Cutts talking to us in a 3 part interview (see part 1, part 2, and part 3).
- A series of videos explaining how to use Webmaster Tools (See all parts: 1, 2, 3, 4, 5, and 6)
- The links series and the 404 series.
- Google's SEO Starter Guide in Spanish.
- SMX Madrid 2008.
- Google Search Masters 2008 in Mexico. We even have some video footage from this conference, including a session about the Help Group :)
- Congreso de Webmasters 2008.
This is us, several members of the Spanish Webmaster Central team:
From left to right: Cristina, Alvar, Rebecca, and Esperanza in Google's Dublin office, with a holiday touch :)
Written by Alvar López, Search Quality Team
Wednesday, 24 December 2014
Helping webmasters from user to user
With thousands of webmasters visiting the English Help Forum every day, some questions naturally pop up more often than others. To help catch these common issues, the Bionic Posters have also helped to create and maintain a comprehensive list of frequently asked questions and their answers. These FAQs cover everything from "Why isn't my site indexed?" to diagnosing difficult issues with the help of Google Webmaster Tools, often referring to our Webmaster Help Center for specific topics. Before you post in the forum, make sure you've read through these resources and do a quick search in the forum; chances are high that your question has been answered there already.
Besides the Bionic Posters, we're lucky to have a number of very active and helpful users in the forum, such as: squibble, Lysis, yasir, Steven Lockey, seo101, RickyD, MartinJ and many more. Thank you all for making this community so captivating and—most of the time—friendly.
Here are just a few (well, a little more than a few) of the many comments that we've seen posted in the forum:
- "Thank you for this forum... Thank you to those that take the time to answer and care!"
- "I've only posted one question here, but have received a wealth of knowledge by reading tons of posts and answers. The time you experts put into helping people with their problems is very inspiring and my hat's off to each of you. Anyway, I just wanted to let you know that your services aren't going unnoticed and I truly appreciate the lessons."
- "Thank you very much cristina, what you told me has done the trick. I really appriciate the help as this has been bugging me for a while now and I didn't know what was wrong."
- "thank you ssssssssssoooo much kaleh. "
- "OK, Phil Payne big thanks to You! I have made changes and maybe people are starting to find me in G! Thanks to Ashley, I've started to make exclusive and relevant content for people."
- "If anything, it has helped me reflect on the sites and projects of days gone by so as to see what I could have done better - so that I can deliver that much more and better results going forward. I've learned that some things I had done right, were spot on, and other issues could have been handled differently, as well as a host of technical information that I've stored away for future use. Bottom Line: this forum rocks and is incredibly helpful."
- "I asked a handful of questions, got GREAT help while doing a whole lot of lurking, and now I've got a site that rocks!! (...) Huge thanks to all the Top Contributors, and a very special mention to WEBADO, who helped me a TON with my .htaccess file."
- "Over the years of reading (and sometimes contributing) to this forum I think it has helped to remove many false assumptions and doubts over Google's ranking systems. Contrary to what many have said I verily believe Google can benefit small businesses. Keep up the good work. "
- "The forum members are awesome and are a most impressive bunch. Their contribution is immeasurable as it is huge. Not only have they helped Google in their success as a profitable business entity, but also helped webmasters both aspiring and experienced. There is also an engender(ment) of "family" or "belonging" in the group that has transcended the best and worst of times (Current forum change still TBD :-) ). We can agree, disagree and agree to disagree but remain respectful and civil (Usually :-) )."
- "Hi Redleg, Thank you very much for all of the information. Without your help, I don't think I would ever have known how to find the problem. "
- "What an amazing board. Over the last few days I have asked 1 question and recieved a ton of advice mainly from Autocrat. "
- "A big thank you to the forum and the contributors that helped me get my site on Google . After some hassle with my web hosters and their naff submission service, issues over adding pages Google can see, issues over Sitemaps, I can now say that when I put my site name into the search and when i put in [custom made watch box], for instance, my site now comes up."
- "Thank you Autocrat! You are MAGNIFICENT! (...) I am your biggest fan today. : ) Imagine Joe Cocker singing With a Little Help from My Friends...that's my theme song today."
- "I've done a lot of reading since then and I've learned more in the last year than I learned in the previous 10. When I stumbled into this forum I had no idea what I was getting into but finding this forum was a gift from God! Words cannot express the amount of gratitude I feel for the help you have given me and I wish I could repay you some how.... I don't mean to sound so mushy, but I write this with tears in my eyes and I am truly, truly grateful..."
Are you new to the Webmaster Help Forum? Tell us a little bit about yourself and then join us to learn more and help others!
Posted by John Mueller, Webmaster Trends Analyst, Google Zürich
Tuesday, 23 December 2014
Wishing you and your site a happy holiday!
Every day we see new people commenting and joining the discussion. This holiday season we'll try to update our blog to accommodate your growing needs. Always feel free to let us know how we're doing (especially if we publish a typo! :), because first and foremost and everywhere in the middle, we're trying to improve for you.
Happy holidays from all of us at Webmaster Central.
Written by Maile Ohye, Developer Programs Tech Lead
Monday, 22 December 2014
Quick and easy tips for the holiday rush
Verify that your site is indexed by Google (and is returned in search results)
Check your snippet content and page titles with the site: command [site:example.com] -- do they look accurate and descriptive for users? Ideally, each title and snippet should be unique in order to reflect that each URL contains unique content. If anything is missing or you want more details, you can also use the Content Analysis tool in Webmaster Tools. There you can see which URLs on your site show duplicate titles or meta descriptions.
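As a sketch, a unique, descriptive title and meta description for one product page might look like this (the wording and page are purely illustrative):

```html
<!-- Each URL gets its own specific title and description,
     rather than a boilerplate one shared across the site. -->
<head>
  <title>Hand-crafted Walnut Watch Boxes - Example Store</title>
  <meta name="description"
        content="Custom watch boxes made from walnut and oak, engraved to order and shipped within 3 days.">
</head>
```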
Label your images accurately
Don't miss out on potential customers! Because good 'alt' text and descriptive filenames help us better understand images, make sure you change non-descriptive file names [001.jpg] to something more accurate [NintendoWii.jpg]. Image Search is one of our largest search properties, so you should take advantage of it.
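For example, turning a non-descriptive image into a descriptive one is a one-line change (filename and alt text are illustrative):

```html
<!-- Before: tells search engines nothing about the image -->
<!-- <img src="001.jpg"> -->

<!-- After: descriptive filename plus 'alt' text -->
<img src="NintendoWii.jpg" alt="Nintendo Wii console with one controller">
```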
Know what Google knows (about your site)
Check for crawl errors and learn the top queries that bring traffic to your site through Webmaster Tools. See our diagnostics checklist.
Have a plan for expiring and temporary pages
Make sure to serve accurate HTTP status codes. If you no longer sell a product, serve a 404. If you have changed a product page to a new URL, serve a 301 to redirect the old page to the new one. Keeping your site up-to-date can help bring more targeted traffic your way.
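On an Apache server, for instance, a permanent move can be configured with a one-line redirect (assuming mod_alias is available; the paths here are illustrative):

```apache
# Product page moved: send visitors and crawlers to the new URL
# with a 301 (permanent) redirect.
Redirect 301 /products/old-widget.html http://www.example.com/products/new-widget.html
```

A product page that is simply removed will return a 404 on its own, so no extra configuration is needed for that case.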
Increase foot traffic too
If your website directs customers to a brick-and-mortar location, make sure you claim and double check your business listing in Google Local.
Usability 101
Test the usability of your checkout process with various browsers. Ask yourself if a user can get from product page to checkout without assistance. Is your checkout button easy to find?
Tell us where to find all of your web pages
If you upload new products faster than Google crawls your site, make sure to submit a Sitemap and include 'last modification' and 'change frequency' information. A Sitemap can point Googlebot to your new or hard-to-find content.
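A minimal Sitemap entry with both pieces of information might look like this (URL and values are illustrative; see sitemaps.org for the full protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/products/new-widget.html</loc>
    <lastmod>2008-12-22</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```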
Manage your sitelinks
Your site may be triggering Sitelinks in the search results, so check the links and make sure the destination pages are fully functional. Remember: in Webmaster Tools you can remove any sitelinks that you don't think users will find useful.
Don't forget to check out these additional resources:
- Read our recently released SEO Starter Guide.
- Watch our Tutorials for Webmasters.
- Find out what information Google has about your website in Webmaster Tools.
- Get your other questions answered in our Webmaster Help Center.
- Ask your last-minute questions in the Webmaster Help Forum.
- Countdown to 2009!
Sunday, 21 December 2014
A Festivus for our webmasterus
If it's good enough for the Costanzas, it's good enough for Webmaster Central: it's time for a Festivus for the rest of us (webmasterus)!
Our special celebration begins not with carols and eggnog, but by remembering some of the popular Webmaster Tools features -- make that Feats of Strength -- for 2007. This year, you gained the ability to chickity-check out your backlinks (<-- that's Festivus-inspired anchor text) and tell Google you want out with URL Removal. And let's not forget Message Center and IDNA support, perfect for those times when [a-zA-Z0-9\-] just doesn't cut it.
Feel the power! Festivus Feats of Strength!
Now comes our webmaster family's traditional Airing of Grievances. You can air your woes and "awww man!"s in the comments below. Just remember that bots may crawl this blog, but we humans review the comments, so please keep your grievances constructive. :) Let us know about features you'd like implemented in Webmaster Tools, articles you'd like written in our blog or Help Center, and stuff you'd like to see in the discussion group. Bonus points if you also explain how your suggestion helps the whole Internet—not just your site's individual rankings. (But of course, we understand that your site ranking number one for all queries in all regions is truly, objectively good for everyone.)
Last, there are so many Festivus Miracles to share! Such as the many helpful members of the discussion group from all around the world, the new friendships formed between Susan Moskwa, JohnMu, Wysz, Matt D, Bergy, Patrick, Nathanj and so many webmasters, and the fun of chatting with our video watchers, fellow conference attendees, and those in the blogosphere keepin' it real.
On behalf of the entire Webmaster Central team, here's to you, Festivus Miracle and Time Magazine's Person of the Year in 2006 -- happy holidays. See you in 2008. :)
Thursday, 18 December 2014
The Ultimate Fate of Supplemental Results
In 2003, Google introduced a "supplemental index" as a way of showing more documents to users. Most webmasters will probably snicker about that statement, since supplemental docs were famous for refreshing less often and showing up in search results less often. But the supplemental index served an important purpose: it stored unusual documents that we would search in more depth for harder or more esoteric queries. For a long time, the alternative was to simply not show those documents at all, but this was always unsatisfying—ideally, we would search all of the documents all of the time, to give users the experience they expect.
This led to a major effort to rethink the entire supplemental index. We improved the crawl frequency and decoupled it from which index a document was stored in, and once these "supplementalization effects" were gone, the "supplemental result" tag itself—which only served to suggest that otherwise good documents were somehow suspect—was eliminated a few months ago. Now we're coming to the next major milestone in the elimination of the artificial difference between indices: rather than searching some part of our index in more depth for obscure queries, we're now searching the whole index for every query.
From a user perspective, this means that you'll be seeing more relevant documents and a much deeper slice of the web, especially for non-English queries. For webmasters, this means that good-quality pages that were less visible in our index are more likely to come up for queries.
Hidden behind this are some truly amazing technical feats; serving an index this much larger doesn't happen easily, and it took several fundamental innovations to make it possible. At this point it's safe to say that the Google search engine works like nothing else in the world. If you want to know how it actually works, you'll have to come join Google Engineering; as usual, it's all triple-hush-hush secrets.*
* Originally, I was going to give the stock Google answer, "If I told you, I'd have to kill you." However, I've been informed by management that killing people violates our "Don't be evil" policy, so I'm forced to replace that with sounding mysterious and suggesting that good engineers come and join us. Which I'm dead serious about; if you've got the technical chops and want to work on some of the most complex and advanced large-scale software infrastructure in the world, we want you here.
Sitemap Submission Made Simple

Sitemap file formats supported by Google
Part of what makes the web so interesting is that there are so many different kinds of content out there. Do you use videos on your website? If so, send us a Video Sitemap file so that we can send you visitors to those videos! Do you host source-code samples? Submit a Code Search Sitemap! Here are the various kinds of Sitemap files that Google supports at the moment:
- XML Sitemap files for web pages - Use these files to submit all of your web pages (this is the preferred format for web pages). While not all search engines may support the Sitemap types listed below, the XML Sitemap for web pages is supported by all search engines participating in sitemaps.org.
- RSS 2.0 and Atom 1.0 feeds for web pages - Many blogs create these automatically.
- Text files with web page URLs - If you can't automatically create one of the above formats, you can create a text file with your URLs in it.
- XML Sitemap files for Video Search - Videos on your website can be indexed and made available for Google Video Search.
- Media-RSS feeds for Video Search - mRSS feeds are used by various other systems; we can use these for Google Video Search as well.
- XML Sitemap files for Google Code Search - If you make programming samples or code available to your users, you can submit these for Google Code Search.
- XML Sitemap files for mobile web pages - Using this kind of format allows us to recognize content that has been optimized for mobile devices (please note that there was recently a small change in the format).
- XML Sitemap files for geo-data - If you have geographic data on your website in the form of KML or GeoRSS files, please let us know about these files.
- XML Sitemap files for News - News websites can submit their news content in this special Sitemap format (please note that you must first register with Google News before these files are processed).
If you have multiple Sitemap files that you wish to submit to Google, you can include up to 1,000 of these in an XML Sitemap Index file. If you have more than 1,000 Sitemap files, you can just submit multiple Sitemap Index files - we'd love to take them all!
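A Sitemap Index file simply lists your individual Sitemap files; a minimal sketch (with illustrative file names) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
    <lastmod>2008-12-20</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-videos.xml</loc>
  </sitemap>
</sitemapindex>
```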
Submitting your Sitemap files to Google
Once you have your Sitemap files ready and available on your server, all that's left is making sure that the search engines can find them. Google supports three simple ways to submit Sitemap files:
- Using Google Webmaster Tools
Submitting your Sitemap files through Google Webmaster Tools is the preferred way of letting us know about them. The main advantage of doing it this way is that you'll always have direct feedback about how your Sitemap files were downloaded (were we able to reach your server?), how they were recognized (were they in the right format?) and what happened to the web pages listed in them (how many were indexed?). To submit your Sitemap files, make sure that your website is verified in Webmaster Tools, then go to "Sitemaps" in Webmaster Tools and enter the file name of your Sitemap(s).
Sometimes it makes sense to keep your Sitemap file on a different server / domain name. To submit Sitemap files like that, you must verify ownership of both sites in Webmaster Tools and submit the Sitemap on the appropriate site. For instance, if your Sitemap file for http://www.example.com is kept on http://sitemap-files.example.com/ then you need to verify ownership of both sites and then submit the Sitemap file under http://sitemap-files.example.com (even though the URLs listed in it are for http://www.example.com). For more information, please see our Help Center topic on submitting Sitemap files for multiple sites.
- Listing Sitemap files in the robots.txt file
Another way of submitting a Sitemap file is to specify the URL in your robots.txt file. If you use this method of submitting a Sitemap file, it will be found by all search engines that support the Sitemaps protocol (although not all of them support the extensions listed above). Since you can specify the full URL of your Sitemap file in the robots.txt file, this method also allows you to store your Sitemap file on a different domain. Keep in mind that while Sitemap files submitted this way are processed on our side, they will not be automatically listed in your Webmaster Tools account. In order to receive feedback on your files, we recommend adding them manually to your account as well.
- Using an HTTP "ping"
If your Sitemap files are generated automatically, a convenient way to submit (and re-submit) them is to access the "ping" URL for Google Sitemaps. This URL includes the URL of your Sitemap file. For more information on the "ping" URL for your website, please see the Help Center article on Updating a Sitemap. Feel free to "ping" this URL whenever you update your Sitemap file - we'll know to pick it up and process it again. If you also have your Sitemap file registered in Webmaster Tools, we'll update the status there as well. This method is also valid if your Sitemap file is kept on a different server, but you must still verify both sites in Webmaster Tools as previously mentioned.
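For reference, the last two methods might look like this (the file names are illustrative, and you should check the Help Center article mentioned above for the exact ping URL for your site):

```
# In robots.txt - the Sitemap line takes a full URL,
# which may even point to a different domain:
Sitemap: http://sitemap-files.example.com/sitemap-for-www.xml

# An HTTP "ping" is a single GET request; note that the
# Sitemap URL passed as the parameter must be URL-encoded:
http://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml
```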
Search engines that are members of sitemaps.org support a similar way of submitting general web Sitemap files.
We hope these simplifications make it even easier for you to send us your Sitemap files!
Posted by John Mueller, Webmaster Trends Analyst, Google Zürich
Taking feeds out of our web search results
As a webmaster, you may have been concerned about your RSS/Atom feeds crowding out their associated HTML pages in Google's search results. By serving feeds, we could cause a poor user experience:
- Feeds increase the likelihood that users see duplicate search results.
- Users clicking on a feed may miss valuable content available only in the HTML page.
As a user, you may ask yourself whether Google has a way to search for feeds. The answer is yes; both Google Reader and iGoogle allow searching for feeds to subscribe to.
We're aware that there are a few non-podcast feeds out there with no associated HTML pages, and thus removing these feeds for now from the search results might be less than ideal. We remain open to other feedback on how to improve the handling of feeds, and especially welcome your comments and questions in the Crawling, Indexing and Ranking subtopic of our Webmaster Help Group.
For the German version of this post, go to "Wir entfernen Feeds aus unseren Suchergebnissen."
Wednesday, 17 December 2014
Introducing Video Sitemaps
In our effort to help users search all the world's public videos, the Google Video team joined the Sitemaps folks to introduce Video Sitemaps—an extension of the Sitemap Protocol that helps make your videos more searchable via Google Video Search. By submitting this video-specific Sitemap in addition to your standard Sitemap, you can specify all the video files on your site, along with relevant metadata. Here's an example:
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
xmlns:video="http://www.google.com/schemas/sitemap-video/1.0">
<url>
<loc>http://www.example.com/videos/some_video_landing_page.html</loc>
<video:video>
<video:content_loc>http://www.example.com/video123.flv</video:content_loc>
<video:player_loc allow_embed="yes">http://www.example.com/videoplayer.swf?video=123</video:player_loc>
<video:title>My funny video</video:title>
<video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
</video:video>
</url>
<url>
<loc>http://www.example.com/videos/some_other_video_landing_page.html</loc>
<video:video>
<video:content_loc>http://www.example.com/videos/video1.mpg</video:content_loc>
<video:description>A really awesome video</video:description>
</video:video>
</url>
</urlset>
To get started, create a Video Sitemap, sign into Google Webmaster Tools, and add the Video Sitemap to your account.
Tuesday, 16 December 2014
Webmaster Tools in 40 languages!
In our recent Webmaster Tools launch, we went live in 14 new languages, bringing our total language support count to 40! With the launch of Bulgarian, Catalan, Croatian, Filipino, Greek, Indonesian, Lithuanian, Latvian, Portuguese (Portugal), Slovak, Slovenian, Serbian, Ukrainian and Vietnamese, Webmaster Tools joins Google products such as Google.com, AdWords, Gmail and Toolbar to reach the 40 Language Initiative (Google's company-wide initiative to make sure Google products are available in the 40 languages read by more than 98% of Internet users).
Our team is very excited to reach so many of you by offering our tools in 40 languages. At the same time, both the Google Localization and Webmaster Tools teams know that there's more room for improvements in the features and quality of our service. We hope to hear your input in the comments below, especially on the linguistic quality of our new languages.
Written by Kidus Asfaw, Google Localization
Monday, 15 December 2014
Handling legitimate cross-domain content duplication
We've recently discussed several ways of handling duplicate content on a single website; today we'll look at ways of handling similar duplication across different websites and domains. For some sites, there are legitimate reasons to duplicate content across different websites — for instance, to migrate to a new domain name using a web server that cannot create server-side redirects. To help with issues that arise on such sites, we're announcing our support of the cross-domain rel="canonical" link element.

Ways of handling cross-domain content duplication:
- Choose your preferred domain
When confronted with duplicate content, search engines will generally take one version and filter the others out. This can also happen when multiple domain names are involved, so while search engines are generally pretty good at choosing something reasonable, many webmasters prefer to make that decision themselves.
- Reduce in-site duplication
Before starting on cross-site duplicate content questions, make sure to handle duplication within your site first.
- Enable crawling and use 301 (permanent) redirects where possible
Where possible, the most important step is often to use appropriate 301 redirects. These redirects send visitors and search engine crawlers to your preferred domain and make it very clear which URL should be indexed. This is generally the preferred method as it gives clear guidance to everyone who accesses the content. Keep in mind that in order for search engine crawlers to discover these redirects, none of the URLs in the redirect chain can be disallowed via a robots.txt file. Don't forget to handle your www / non-www preference with appropriate redirects and in Webmaster Tools.
- Use the cross-domain rel="canonical" link element
There are situations where it's not easily possible to set up redirects. This could be the case when you need to move your website from a server that does not feature server-side redirects. In a situation like this, you can use the rel="canonical" link element across domains to specify the exact URL of whichever domain is preferred for indexing. While the rel="canonical" link element is seen as a hint and not an absolute directive, we do try to follow it where possible.
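As a sketch, a page on the old domain would include the link element in its head, pointing at the equivalent URL on the preferred domain (the URLs here are illustrative):

```html
<!-- In the <head> of http://www.old-example.com/product.html,
     naming the preferred cross-domain URL for indexing: -->
<link rel="canonical" href="http://www.example.com/product.html">
```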
Still have questions?
Q: Do the pages have to be identical?
A: No, but they should be similar. Slight differences are fine.
Q: For technical reasons I can't include a 1:1 mapping for the URLs on my sites. Can I just point the rel="canonical" at the homepage of my preferred site?
A: No; this could result in problems. A mapping from old URL to new URL for each URL on the old site is the best way to use rel="canonical".
Q: I'm offering my content / product descriptions for syndication. Do my publishers need to use rel="canonical"?
A: We leave this up to you and your publishers. If the content is similar enough, it might make sense to use rel="canonical", if both parties agree.
Q: My server can't do a 301 (permanent) redirect. Can I use rel="canonical" to move my site?
A: If it's at all possible, you should work with your webhost or web server to do a 301 redirect. Keep in mind that we treat rel="canonical" as a hint, and other search engines may handle it differently. But if a 301 redirect is impossible for some reason, then a rel="canonical" may work for you. For more information, see our guidelines on moving your site.
Q: Should I use a noindex robots meta tag on pages with a rel="canonical" link element?
A: No, since those pages would not be equivalent with regards to indexing - one would be allowed while the other would be blocked. Additionally, it's important that these pages are not disallowed from crawling through a robots.txt file, otherwise search engine crawlers will not be able to discover the rel="canonical" link element.
We hope this makes it easier for you to handle duplicate content in a user-friendly way. Are there still places where you feel that duplicate content is causing your sites problems? Let us know in the Webmaster Help Forum!
Posted by John Mueller, Webmaster Trends Analyst, Google Zürich
Google Public DNS and Location-Sensitive DNS Responses
Webmaster level: advanced
Recently the Google Public DNS team, in collaboration with Akamai, reached an important milestone: Google Public DNS now propagates client location information to Akamai nameservers. This effort significantly improves the accuracy of approximately 30% of the location-sensitive DNS responses returned by Google Public DNS. In other words, client requests to Akamai hosted content can be routed to closer servers with lower latency and greater data transfer throughput. Overall, Google Public DNS resolvers serve 400 billion responses per day and more than 50% of them are location-sensitive.
DNS is often used by Content Distribution Networks (CDNs) such as Akamai to achieve location-based load balancing by constructing responses based on clients’ IP addresses. However, CDNs usually see the DNS resolvers’ IP address instead of the actual clients’ and are therefore forced to assume that the resolvers are close to the clients. Unfortunately, the assumption is not always true. Many resolvers, especially those open to the Internet at large, are not deployed at every single local network.
To solve this issue, a group of DNS and content providers, including Google, proposed an approach to allow resolvers to forward the client’s subnet to CDN nameservers in an extension field in the DNS request. The subnet is a portion of the client’s IP address, truncated to preserve privacy. The approach is officially named edns-client-subnet or ECS.
This solution requires that both resolvers and CDNs adopt the new DNS extension. Google Public DNS resolvers automatically probe to discover ECS-aware nameservers and have observed the footprint of ECS support from CDNs expanding steadily over the past years. By now, more than 4000 nameservers from approximately 300 content providers support ECS. The Google-Akamai collaboration marks a significant milestone in our ongoing efforts to ensure DNS contributes to keeping the Internet fast. We encourage more CDNs to join us by supporting the ECS option.
For more information about Google Public DNS, please visit our website. For CDN operators, please also visit “A Faster Internet” for more technical details.
Posted by Yunhong Gu, Tech Lead, Google Public DNS
Sunday, 14 December 2014
FYI on Google Toolbar's latest features
The latest version of Google Toolbar for Internet Explorer (beta) just added a neat feature to help users arrive at your website, or at least see your content, even when things go awry.
It's frustrating for your users to mistype your URL and receive a generic "404 - Not Found" or try to access a part of your site that might be down.
Even when your site is useful and information-rich, most users simply move on to something else when these issues arise. The latest release of Google Toolbar, however, helps users by detecting site issues and providing alternatives.
Website Optimizer or Website Optimiser? The Toolbar can help you find it even if you try "google.cmo" instead of "google.com".

3 site issues detected by Google Toolbar
- 404 errors with default error pages
When a visitor tries to reach your content with an invalid URL and your server returns a short, default error message (less than 512 bytes), the Toolbar will suggest an alternate URL to the visitor. If this is a general problem in your website, you will see these URLs also listed in the crawl errors section of your Webmaster Tools account.
If you choose to set up a custom error page, make sure it returns result code 404. The content of the 404 page can help your visitors understand that they tried to reach a missing page, and can suggest how to find the content they were looking for. When a site displays a custom error page, the Toolbar will no longer provide suggestions for that site. You can check the behavior of the Toolbar by visiting an invalid URL on your site with the Google Toolbar installed.
- DNS errors
When a URL contains a non-existent domain name (like www.google.cmo), the Toolbar will suggest an alternate, similar-looking URL with a valid domain name.
- Connection failures
When your server is unreachable, the Google Toolbar will automatically display a link to the cached version of your page. This feature is only available when Google is not explicitly forbidden from caching your pages through use of a robots meta tag or crawling is blocked on the page through the robots.txt file. If your server is regularly unreachable, you will probably want to fix that first; but it may also be a good idea to check the Google cache for your pages by looking at the search results for your site.
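A custom error page only helps if it still returns the 404 status code. On an Apache server, for instance, it might be configured like this (the file name is illustrative):

```apache
# Serve /not-found.html as the body of 404 responses.
# Note: use a local path here, not a full URL - a full URL
# would cause a redirect and a 200 status code instead of a 404.
ErrorDocument 404 /not-found.html
```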
Suggestions provided by the Google Toolbar
When one of the above situations is found, the Toolbar will try to find the most helpful links for the user. That may include:- A link to the corrected URL
When the Toolbar can find the most probable, active URL to match the user's input (or link they clicked on), it will display it right on top as a suggestion. The correction can be somewhere in the domain name, the path or the file name (the Toolbar does not look at any parameters in the URL).
- A link to the cached version of the URL
When the Toolbar recognizes the URL in the Google cache, it will display a link to the cached version. This is particularly useful when the user can't access your pages for some reason. As mentioned above, Google may cache your URLs provided you're not explicitly forbidding this through use of a robots meta tag or the robots.txt file.
- A link to the homepage or HTML site map of your site
Sometimes going to the homepage or a site map page is the best way to find the page that a user is really looking for. Site map pages (these are not XML Sitemap files) are generally recognized based on the file name; if the Toolbar can find something called "sitemap.html" or similar, this page will probably be recognized as the site map page. Don't worry if your site map page is called something else; if a user decides to go to your homepage, they'll probably find it right away even if the Toolbar doesn't spot it.
- A link to a higher level folder
Sometimes the homepage or site map page is too far removed, and the user would be better off just going one step up in the hierarchy. When the Toolbar can recognize that your site's structure is based on folders and sub-folders, it may suggest a page one level up.
- A search within your site for keywords found in the URL
It's a good practice to use descriptive URLs. If the Toolbar can recognize keywords within the URL which the user tried to access, it will link to a site-search with those keywords. Even if the URL has changed significantly in the meantime, the search may be able to find similar content based on those keywords. For instance, if the URL was http://example.com/party-gifts/holidays/ it will suggest a search for the words "party", "gifts" and "holidays" within the site example.com.
- An open Google search box
If all else fails, there's always a chance that similar content already exists elsewhere on the web. The Google web search can help your users to find it - the Toolbar will help you by adding the keywords found in the URL to the search box.
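The keyword extraction described above is easy to approximate yourself, which can be handy when deciding how descriptive your URLs really are. The sketch below is only a guess at the behaviour: it splits the URL path on slashes, hyphens and underscores and ignores query parameters, matching the party-gifts example in the post.

```java
// Illustrative approximation of keyword extraction from a URL path.
// This is not the Toolbar's actual code, just a sketch of the idea.
public class UrlKeywords {
    public static java.util.List<String> keywordsFromUrl(String url) {
        java.util.List<String> words = new java.util.ArrayList<>();
        try {
            java.net.URI uri = new java.net.URI(url);
            String path = uri.getPath() == null ? "" : uri.getPath();
            // Split the path on slashes, hyphens and underscores;
            // the query string is deliberately ignored.
            for (String token : path.split("[/\\-_]+")) {
                if (!token.isEmpty()) {
                    words.add(token.toLowerCase());
                }
            }
        } catch (java.net.URISyntaxException e) {
            // Malformed URL: nothing useful to extract.
        }
        return words;
    }
}
```

For the post's example, `keywordsFromUrl("http://example.com/party-gifts/holidays/")` yields "party", "gifts" and "holidays", the same words the Toolbar would feed into a site search.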
Are you curious already? Download the Google Toolbar for your browser and give it a try on your site!
To discuss how this feature can help visitors to your site, jump in to our Google Webmaster Help Group; or for general Google Toolbar questions, try the Toolbar group for Internet Explorer or the Toolbar group for Firefox.
Saturday, 13 December 2014
New: Content analysis and Sitemap details, plus more languages
We're always striving to help webmasters build outstanding websites, and in our latest release we have two new features: Content analysis and Sitemap details. We hope these features help you to build a site you could compare to a fine wine -- getting better and better over time.
Content analysis
To help you improve the quality of your site, our new content analysis feature should be a helpful addition to the crawl error diagnostics already provided in Webmaster Tools. Content analysis contains feedback about issues that may impact the user experience or that may make it difficult for Google to crawl and index pages on your site. By reviewing the areas we've highlighted, you can help eliminate potential issues that could affect your site's ability to be crawled and indexed. This results in better indexing of your site by Google and other search engines.
The Content analysis summary page within the Diagnostics section of Webmaster Tools features three main categories. Click on a particular issue type for more details:
- Title tag issues
- Meta description issues
- Non-indexable content issues

Selecting "Duplicate title tags" displays a list of repeated page titles along with a count of how many pages contain that title. We currently present up to thirty duplicated page titles on the details page. If the duplicate title issues shown are corrected, we'll update the list to reflect any other pages that share duplicate titles the next time your website is crawled.
Also, in the Title tag issues category, we show "Long title tags" and "Short title tags." For these issue types we will identify title tags that are way too short (for example "IT" isn't generally a good title tag) or way too long (title tag was never intended to mean <insert epic novel here>). A similar algorithm identifies potentially problematic meta description tags. While these pointers won't directly help you rank better (i.e. pages with <title> length x aren't moved to the top of the search results), they may help your site display better titles and snippets in search results, and this can increase visitor traffic.
In the "Non-indexable content issues" category, we give you a heads-up about areas that aren't as friendly to our more text-based crawler. Be sure to check out our posts on Flash and images to learn how to make these items more search-engine friendly.
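If you'd like to run the same kind of title checks on your own pages before the next crawl, the checks above are straightforward to sketch. The length thresholds below are illustrative guesses, not Google's actual limits, and the class is hypothetical.

```java
// Rough sketch of the title-tag checks described above: duplicate
// detection plus "way too short / way too long" flags.
import java.util.*;

public class TitleAudit {
    // Returns each title that appears on more than one page, with its count.
    public static Map<String, Integer> duplicateTitles(List<String> titles) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String t : titles) {
            counts.merge(t, 1, Integer::sum);
        }
        // Keep only titles shared by two or more pages.
        counts.values().removeIf(c -> c < 2);
        return counts;
    }

    // Flag titles that are suspiciously short or long.
    // 5 and 70 are assumed thresholds for illustration only.
    public static boolean isProblemLength(String title) {
        return title.length() < 5 || title.length() > 70;
    }
}
```

Running `duplicateTitles` over your crawl's titles mirrors the "Duplicate title tags" report, and `isProblemLength` flags titles like "IT" the same way the "Short title tags" check would.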

Sitemap details page
If you've submitted a Sitemap, you'll be happy to see the additional information in Webmaster Tools revealing how your Sitemap was processed. You can find this information on the newly available Sitemap Details page which (along with information that was previously provided for each of your Sitemaps) shows you the number of pages from your Sitemap that were indexed. Keep in mind the number of pages indexed from your Sitemap may not be 100% accurate because the indexed number is updated periodically, but it's more accurate than running a "site:example.com" query on Google.
The new Sitemap Details page also lists any errors or warnings that were encountered when specific pages from your Sitemap were crawled. So the time you might have previously spent on crafting custom Google queries to determine how many pages from your Sitemap were indexed, can now be spent on improving your site. If your site is already the crème de la crème, you might prefer to spend the extra free time mastering your ice-carving skills or blending the perfect eggnog.
Here's a view of the new Sitemap details page:

Sitemaps are an excellent way to tell Google about your site's most important pages, especially if you have new or updated content that we may not know about. If you haven't yet submitted a Sitemap or have questions about the process, visit our Webmaster Help Center to learn more.
Webmaster Tools now available in Czech & Hungarian
We love expanding our product to help more people in their language of choice. We recently added Czech and Hungarian to the 20 other languages Webmaster Tools already supports. And we won't be stopping here: if your language of choice isn't currently supported, stay tuned; there'll be even more supported languages to come.
We always love to hear what you think. Please visit our Webmaster Help Group to share comments or ask questions.
Wednesday, 10 December 2014
Message Center info through our API
What can I do?
The Message Center GData API lets you retrieve all messages, mark the messages as read or unread, and delete messages. You can do these tasks using the provided Java client libraries, or you can create your own client code based on the protocol information.
- Retrieve messages: The messages feed contains all the messages sent to your account. These messages have important information about your verified sites. Examples of messages include infinite spaces warnings and crawl rate change notifications.
- Mark messages as read or unread: In order to keep track of new communications from Google, you can mark your messages as read or unread, the same way that you would manage your inbox. If you retrieve a single message, this message will be automatically marked as read.
- Delete messages: It's possible to delete messages using the GData API. However, be careful because if you delete a message through the API it will also be deleted in your Webmaster Tools account, as both interfaces share the same data.
You can download code samples in Java for all these new features. These samples provide simple ways to use the messages feed. The following snippet shows how to retrieve the messages feed in a supported language and print all the messages:
// Connect with the service and authenticate
WebmasterToolsService service =
    new WebmasterToolsService("exampleCo-exampleApp-1");
try {
  service.setUserCredentials(USERNAME, PASSWORD);
} catch (AuthenticationException e) {
  System.out.println("Username or password invalid");
  return;
}

// Retrieve the messages feed
MessagesFeed messages;
try {
  URL feedUrl;
  if (USER_LANGUAGE == null) {
    feedUrl = new URL(MESSAGES_FEED_URI);
  } else {
    feedUrl = new URL(MESSAGES_FEED_URI + "?hl=" + USER_LANGUAGE);
  }
  messages = service.getFeed(feedUrl, MessagesFeed.class);
} catch (IOException e) {
  System.out.println("There was a network error.");
  return;
} catch (ServiceException e) {
  System.out.println("The service is not available.");
  return;
}

// Print the messages feed
System.out.println(messages.getTitle().getPlainText());
for (MessageEntry entry : messages.getEntries()) {
  if (entry.getRead()) {
    System.out.print("   \t");
  } else {
    System.out.print("new\t");
  }
  System.out.print(entry.getDate().toUiString() + "\t");
  System.out.println(entry.getSubject());
}
Where do I get it?
If you want to know more about GData, you may want to start by checking out the GData website. The homepage of the Webmaster Tools GData API contains a section on the messages feed, with details about the protocol. You can also download the sample Message Center client from the GData download site. It will show you how to use all the Message Center GData API features.
Written by Javier Tordable, Software Engineer
Tuesday, 9 December 2014
The four steps to appiness
Webmaster Level: intermediate to advanced
App deep links are the new kid on the block in organic search, and they’re picking up speed faster than you can say “schema.org ViewAction”! For signed-in users, 15% of Google searches on Android now return deep links to apps through App Indexing. And over just the past quarter, we've seen the number of clicks on app deep links jump by 10x.
We’ve gotten a lot of feedback from developers since we opened up App Indexing back in June, and we’ve seen many implementations done right, as well as others that were good learning experiences. We’d like to share with you four key steps to monitor app performance and drive user engagement:
1. Give your app developer access to Webmaster Tools
App indexing is a team effort between you (as a webmaster) and your app development team. We show information in Webmaster Tools that is key for your app developers to do their job well. Here’s what’s available right now:
- Errors in indexed pages within apps
- Weekly clicks and impressions from app deep links via Google search
- Stats on your sitemap (if that’s how you implemented the app deep links)
We’ve noticed that very few developers have access to Webmaster Tools. So if you want your app development team to get all of the information they need to fix app-related issues, it’s essential for them to have access to Webmaster Tools.
Any verified site owner can add a new user. Pick restricted or full permissions, depending on the level of access you’d like to give:
2. Understand how your app is doing in search results
How are users engaging with your app from search results? We’ve introduced two new ways for you to track performance for your app deep links:
- We now send a weekly clicks and impressions update to the Message center in your Webmaster Tools account.
- You can now track how much traffic app deep links drive to your app using referrer information - specifically, the referrer extra in the ACTION_VIEW intent. We're working to integrate this information with Google Analytics for even easier access. Learn how to track referrer information on our Developer site.
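Once your activity has the referrer string, the check itself is simple. The sketch below is plain Java so the logic is easy to test outside Android; in an Activity you would read the referrer extra from the incoming ACTION_VIEW intent instead. The `android-app://` scheme and the Google Search package name shown here reflect the documented deep-link referrer format, but treat the exact values as assumptions to verify against the Developer site.

```java
// Sketch: detect whether a referrer string indicates the visit came
// from Google Search app deep links (android-app:// scheme).
public class ReferrerCheck {
    // Assumed package name of the Google Search app, for illustration.
    static final String GOOGLE_SEARCH_APP = "com.google.android.googlequicksearchbox";

    public static boolean isFromGoogleSearch(String referrer) {
        if (referrer == null) {
            return false;
        }
        try {
            java.net.URI uri = new java.net.URI(referrer);
            return "android-app".equals(uri.getScheme())
                && GOOGLE_SEARCH_APP.equals(uri.getHost());
        } catch (java.net.URISyntaxException e) {
            return false; // malformed referrer
        }
    }
}
```

Feeding this flag into your analytics events is one way to separate deep-link traffic from other entry points while the Google Analytics integration is in the works.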
3. Make sure key app resources can be crawled
Blocked resources are one of the top reasons for the “content mismatch” errors you see in Webmaster Tools’ Crawl Errors report. We need access to all the resources necessary to render your app page. This allows us to assess whether your associated web page has the same content as your app page.
To help you find and fix these issues, we now show you the specific resources we can’t access that are critical for rendering your app page. If you see a content mismatch error for your app, look out for the list of blocked resources in “Step 5” of the details dialog:
4. Watch out for Android App errors
To help you identify errors when indexing your app, we’ll send you messages for all app errors we detect, and will also display most of them in the “Android apps” tab of the Crawl errors report.
In addition to the currently available “Content mismatch” and “Intent URI not supported” error alerts, we’re introducing three new error types:
- APK not found: we can’t find the package corresponding to the app.
- No first-click free: the link to your app does not lead directly to the content, but requires login to access.
- Back button violation: after following the link to your app, the back button did not return to search results.
In our experience, the majority of errors are usually caused by a general setting in your app (e.g. a blocked resource, or a region picker that pops up when the user tries to open the app from search). Taking care of that generally resolves it for all involved URIs.
Good luck in the pursuit of appiness! As always, if you have questions, feel free to drop by our Webmaster help forum.
Posted by Mariya Moeva, Webmaster Trends Analyst
Sunday, 7 December 2014
Reintroducing your English Webmaster Help Google Guides
Also in Mountain View: Evan, Jessica, and Nate.
Guides in Zürich, Switzerland: John Mueller and Balázs.
Written by Reid Yokoyama, Search Quality






