Friday, 31 October 2014

Spookier than malware


hotdog

lion king
...and infinitely more fun: webmasters and their pets incognito! Happy Halloween, everyone! If you've seen any costumes that would pass the SafeSearch filter :), feel like sharing a gripe, or have a good story to tell, please join the chat!

Take care, and don't forget to brush your teeth.
 Yours scarily,
  The Webmaster Central Team


Our glasses-wearing, no vampire-teeth vampire (Ryan), zombie Mur, Holiday Fail (Tiffany Lane), Colbert Hipster (Dan Vanderkam), Rick Astley Cutts, Homeboy Ben D'Angelo, Me -- pinker & poofier, Investment Bank CEO Shyam Jayaraman (though you can't see the golden parachute in his backpack)



Chark as Juno, Wysz as Beah Burger (our co-worker), Adi and Matt Dougherty as yellow ninja, red ninja!


Heroes come in all shapes and sizes...

Powdered toast man, Mike Leotta

Adam Lasnik as, let me see if I get this right, a "secret service agent masquerading as a backstage tech" :)

Happy Halloween to our spooktacular webmasters!



With apologies to Vic Mizzy, we've written a short verse to the tune of the "Addams Family" theme (please use your imagination):

We may be hobbyists or just geeky,
Building websites and acting cheeky,
Javascript redirects we won't make sneaky,
Our webmaster fam-i-ly!

Happy Halloween everyone! Feel free to join the discussion and share your Halloween stories and costumes.


Magnum P.I., Punk Rocker, Rubik's Cube, Mr. T., and Rainbow Brite
a.k.a. Several members of our Webmaster Tools team: Dennis Geels, Jonathan Simon, Sean Harding, Nish Thakkar, and Amanda Camp


Panda and Lolcat
Or just Evan Tang and Matt Cutts?


7 Indexing Engineers and 1 Burrito


Cheese Wysz, Internet Repairman, Community Chest, Internet Pirate (don't tell the RIAA)
Helpful members of the Webmaster Help Group: Wysz, MattD, Nathan Johns (nathanj), and Bergy


Count++
Webspam Engineer Shashi Thakur (in the same outfit he wore to Searchnomics)


Hawaiian Surfer Dude and Firefox
Members of Webmaster Central's communications team: Reid Yokoyama and Mariya Moeva


Napoleon Dynamite and Raiderfan
Shyam Jayaraman (speaking at SES Chicago, hopefully doing the dance) and me

Better geographic choices for webmasters

Written by Amanda Camp, Webmaster Tools, and Trystan Upstill, International Search Quality Team

Starting today, Google Webmaster Tools helps you better control the country association of your content on a per-domain, per-subdomain, or per-directory level. The information you give us helps us determine how your site appears in our country-specific search results, and also improves our search results for geographic queries.

We currently only allow you to associate your site with a single country and location. If your site is relevant to an even more specific area, such as a particular state or region, feel free to tell us that. Or let us know if your site isn't relevant to any particular geographic location at all. If no information is entered in Webmaster Tools, we'll continue to make geographic associations largely based on the top-level domain (e.g. .co.uk or .ca) and the IP address of the webserver from which the content is served.

For example, if we wanted to associate www.google.com with Hungary:


But you don't want www.google.com/webmasters/tools associated with any country...


This feature isn't available for sites with a country-code top-level domain, as we'll always associate such a site with its country. (For example, google.ru will always be the version of Google associated with Russia.)


Note that, in the same way that Google may show your business address if you register your brick-and-mortar business with the Google Local Business Center, we may publicly show the geographic information that you give us.

This feature was largely initiated by your feedback, so thanks for the great suggestion. Google is committed to helping more sites and users get better, more relevant results. This is a new step as we continue to think about how to improve searches around the world.

We encourage you to tell us what you think in the Webmaster Tools section of our discussion group.

Wednesday, 29 October 2014

Using RSS/Atom feeds to discover new URLs

Webmaster Level: Intermediate

Google uses numerous sources to find new webpages, from links we find on the web to submitted URLs. We aim to discover new pages quickly so that users can find new content in Google search results soon after they go live. We recently launched a feature that uses RSS and Atom feeds for the discovery of new webpages.

RSS/Atom feeds have been very popular in recent years as a mechanism for content publication. They allow readers to check for new content from publishers. Using feeds for discovery allows us to get these new pages into our index more quickly than traditional crawling methods. We may use many potential sources to access updates from feeds including Reader, notification services, or direct crawls of feeds. Going forward, we might also explore mechanisms such as PubSubHubbub to identify updated items.

In order for us to use your RSS/Atom feeds for discovery, it's important that crawling these files is not disallowed by your robots.txt. To find out if Googlebot can crawl your feeds and find your pages as fast as possible, test your feed URLs with the robots.txt tester in Google Webmaster Tools.
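If you'd like a quick local sanity check as well, here's a minimal sketch in Python (the hostname and feed URL are placeholders, and Python's robots.txt parser is only an approximation of Googlebot's own handling) that asks whether Googlebot is allowed to fetch a feed according to your live robots.txt:

    import urllib.robotparser

    # Download and parse the site's robots.txt, then test a feed URL.
    # Replace the hostname and feed path with your own.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()

    feed_url = "http://www.example.com/feed.xml"
    if rp.can_fetch("Googlebot", feed_url):
        print("Googlebot may crawl", feed_url)
    else:
        print("robots.txt disallows Googlebot from crawling", feed_url)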

Reflections on the "Tricks and Treats" webmaster event

What featured over 750 webmasters and a large number of Googlers from around the world, hundreds of questions, and over one hundred answers over the course of nearly two hours?  If you guessed "the Tricks and Treats webmaster event from earlier this month!" well, you're either absolutely brilliant, you read the title of this post, or both!

How did it go?
It was an exhilarating, exhausting, and educational event, if we may say so ourselves, even though there were a few snafus.  We're aware that the sound quality wasn't great for some folks, and we've also appreciated quite-helpful constructive criticisms in this feedback thread.  Last but not least, we are bummed to admit that someone (whose name starts with 'A' and ends with 'M') uncharacteristically forgot to hit the record button (really!), so there's unfortunately no audio recording to share :-(.

But on more positive notes, we're delighted that so many of you enjoyed our presentations (embedded below), our many answers, and even some of our bad jokes (mercifully not to be repeated).

What next?
Well, for starters, all of us Webmaster Central Googlers will be spending quite some time taking in your feedback.  Some of you have requested sessions exclusively covering particular (pre-announced) topics or tailored to specific experience levels, and we've also heard from many webmasters outside of the U.S. who would love online events in other languages and at more convenient times.  No promises, but you can bet we're eager to please!  Stay tuned on this blog (and, as a hint and hallo to our German-speaking webmasters, do make sure to follow our German webmaster blog  ;-).  

And finally, a big thank you!
A heartfelt thank you to my fellow Googlers, many of whom got up at the crack of dawn to get to the office early for the chat and previous day's runthrough or stayed at work late in Europe.  But more importantly, major props to all of you (from New Delhi, New York, New Zealand and older places) who asked great questions and hung out with us online for up to two hours.  You webmasters are the reason we love coming to work each day, and we look forward to our next chat!

*  *  *

The presentations...
We had presentations from John, Jonathan, Maile, and Wysz.  Presentations from the first three are embedded below (Wysz didn't have a written presentation this time).


John's slides on "Frightening Webmastering Myths"


Jonathan's slides on "Using the Not Found errors report in Webmaster Tools"


Maile's slides on "Where We're Coming From"


Edited on Wednesday, October 29 at 6:00pm to update number of participants

Tracking mobile usability in Webmaster Tools

Webmaster Level: intermediate

Mobile is growing at a fantastic pace - in usage, not just in screen size. To keep you informed of issues mobile users might be seeing across your website, we've added the Mobile Usability feature to Webmaster Tools.

The new feature shows mobile usability issues we’ve identified across your website, complete with graphs over time so that you see the progress that you've made.

A mobile-friendly site is one that you can easily read & use on a smartphone, by only having to scroll up or down. Having to swipe left or right to find content, zoom in to read text or use UI elements, or not being able to see the content at all makes a site harder to use on a mobile phone. To help, the Mobile Usability reports show the following issues: Flash content, missing viewport (a critical meta-tag for mobile pages), tiny fonts, fixed-width viewports, content not sized to viewport, and clickable links/buttons too close to each other.
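As a rough illustration of two of those checks (this is not how the Mobile Usability reports actually work), here's a small Python sketch that looks at a single page for a viewport meta tag and for embedded Flash; the URL is a placeholder:

    from html.parser import HTMLParser
    import urllib.request

    # A typical mobile-friendly page declares a viewport in its <head>, e.g.:
    # <meta name="viewport" content="width=device-width, initial-scale=1">
    class MobileHintsParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.has_viewport = False
            self.has_flash = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "viewport":
                self.has_viewport = True
            if tag in ("embed", "object") and "shockwave-flash" in (attrs.get("type") or "").lower():
                self.has_flash = True

    html = urllib.request.urlopen("http://www.example.com/").read().decode("utf-8", "replace")
    parser = MobileHintsParser()
    parser.feed(html)
    print("Viewport meta tag found:", parser.has_viewport)
    print("Flash content found:", parser.has_flash)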

We strongly recommend you take a look at these issues in Webmaster Tools, and think about how they might be resolved; sometimes it's just a matter of tweaking your site's template! More information on how to make a great mobile-friendly website can be found on our Web Fundamentals website (with more information to come soon).

If you have any questions, feel free to join us in our webmaster help forums (on your phone too)!

Tuesday, 28 October 2014

G11i Pro (Dual SIM with 3G support)

Introduction

If you have read my G11i review and became interested, you'll definitely want a G11i Pro. It was released approximately one month ago, and the main difference compared to the G11i is the chipset... it's based on the MT6573, which means 3G support! That's right, the G11i Pro is a Dual SIM Dual Standby smartphone with 3G support.

Specifications

Chipset

Name: MediaTek MT6573
CPU: 650 MHz ARM11™
GPU: PowerVR™ SGX 531
Instruction set: ARMv6

Software environment

Embedded OS: Android 2.3.4 (Gingerbread)

Body

Dimensions (width x height x depth): 120 x 64 x 11.7 millimetres
Weight: 140 grams
Color: Black

Battery

Capacity: 1450 mAh

Memory

RAM capacity: 512 MB
ROM capacity: 512 MB
Expansion slot: microSD memory card, supporting up to 32 GB

Network support

Primary phone: GSM850, GSM900, GSM1800, GSM1900, UMTS900, UMTS2100
Secondary phone: GSM850, GSM900, GSM1800, GSM1900
Data links: GPRS, EDGE, HSDPA, HSUPA

Display

Type: Sharp LCD capacitive touchscreen
Size: 4.0 inches, WVGA resolution (480 x 800 pixels)

Camera

Main (rear): 8 megapixels (interpolated) with autofocus and dual LED flash
Secondary (front): 1.3 megapixels

Interfaces

Bluetooth (802.15): Bluetooth 2.1 + Enhanced Data Rate
Wireless LAN / Wi-Fi (802.11): IEEE 802.11b, IEEE 802.11g
USB: USB 2.0 Client, Hi-Speed (480 Mbit/s), USB Series Micro-B (Micro-USB) connector

Satellite navigation

Built-in GPS module: MT6620 chipset
GPS antenna: Internal
Complementary GPS services: A-GPS (Assisted GPS), MediaTek EPO (Extended Prediction Orbit)

Additional features

Sensors: Gravity, proximity and light sensors
Analog radio: FM radio (87.5-108 MHz) with RDS radio receiver

Design and construction

In terms of quality of construction, there are no differences between the G11i and the G11i Pro. It is still a very well-built copy of the HTC Incredible S.


As a quick side note, the original battery from HTC won't fit in this phone. I'm just mentioning it because some people asked me if the original battery would fit. In fact, that's the only original accessory that you can't use on the G11i / G11i Pro.


There's only one subtle difference in the Pro version when compared to the normal one: the microSD slot has been moved a little bit, and the battery now obstructs the removal of the memory card (it isn't hot-swappable any more).


Display

There's not much to say about the display, as the manufacturer is still using the same good old Sharp LCD screen...



Features

Just in case you've missed my G11i review, please take some time and read it now because it doesn't make sense to detail all the features again (jump directly to G11i functionality review). Everything is pretty much the same and the only new feature that is worth mentioning is the 3G support.


Under the dual SIM management menu, the user could already set a default card for all outgoing voice calls. In addition, since the G11i Pro supports 3G networks, a video call option is now available.


The user can choose which SIM establishes the data connection; note, however, that while SIM1 supports data connections up to HSPA, SIM2 is limited to EDGE / GPRS.

Final thoughts

Well, if you were looking for a good dual SIM phone with 3G support, this is a great choice. It offers good voice call quality, stability, and a fast, smooth user experience.

Just let me finish this review with a little advertisement... in order to avoid questions regarding where to buy it from, you can always visit my partner shop (etotalk.com). G11i Pro is now available for 229 USD.

    Monday, 27 October 2014

    Updating our technical Webmaster Guidelines

    Webmaster level: All

    We recently announced that our indexing system has been rendering web pages more like a typical modern browser, with CSS and JavaScript turned on. Today, we're updating one of our technical Webmaster Guidelines in light of this announcement.

    For optimal rendering and indexing, our new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use. Allowing this access provides optimal rendering and indexing for your site. Disallowing crawling of JavaScript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.
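    As a small offline sketch of what this means in practice (the paths and rules below are hypothetical, and Python's robots.txt parser is only an approximation of Googlebot's own handling), here is how a blanket Disallow rule keeps Googlebot away from your CSS, and how explicit Allow rules restore access:

        import urllib.robotparser

        # A robots.txt that blocks the whole /assets/ directory, and a variant
        # that allows the CSS/JS subdirectories needed for rendering.
        blocked = ["User-agent: *", "Disallow: /assets/"]
        fixed = ["User-agent: *",
                 "Allow: /assets/css/",
                 "Allow: /assets/js/",
                 "Disallow: /assets/"]

        for name, rules in (("blocked", blocked), ("fixed", fixed)):
            rp = urllib.robotparser.RobotFileParser()
            rp.parse(rules)
            ok = rp.can_fetch("Googlebot", "http://www.example.com/assets/css/site.css")
            print(name, "-> Googlebot can fetch site.css:", ok)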

    Updated advice for optimal indexing

    Historically, Google indexing systems resembled old text-only browsers, such as Lynx, and that’s what our Webmaster Guidelines said. Now, with indexing based on page rendering, it's no longer accurate to see our indexing systems as text-only browsers. Instead, a more accurate approximation is a modern web browser. With that new perspective, keep the following in mind:

    • Just like modern browsers, our rendering engine might not support all of the technologies a page uses. Make sure your web design adheres to the principles of progressive enhancement as this helps our systems (and a wider range of browsers) see usable content and basic functionality when certain web design features are not yet supported.
    • Pages that render quickly not only help users get to your content more easily, but also make indexing of those pages more efficient. We advise you to follow the best practices for page performance optimization.
    • Make sure your server can handle the additional load for serving of JavaScript and CSS files to Googlebot.

    Testing and troubleshooting

    In conjunction with the launch of our rendering-based indexing, we also updated the Fetch and Render as Google feature in Webmaster Tools so webmasters could see how our systems render the page. With it, you'll be able to identify a number of indexing issues: improper robots.txt restrictions, redirects that Googlebot cannot follow, and more.

    And, as always, if you have any comments or questions, please ask in our Webmaster Help forum.

    Sunday, 26 October 2014

    Help us make the web better: An update on Rich Snippets

    Webmaster Level: All

    In May this year we announced Rich Snippets, which make it possible to show structured data from your pages in Google's search results.


    We're convinced that structured data makes the web better, and we've worked hard to expand Rich Snippets to more search results and collect your feedback along the way. If you have review or people/social networking content on your site, it's easier than ever to mark up your content using microformats or RDFa so that Google can better understand it to generate useful Rich Snippets. Here are a few helpful improvements on our end to enable you to mark up your content:

    Testing tool. See what Google is able to extract, and preview how microformats or RDFa marked-up pages would look on Google search results. Test your URLs on the Rich Snippets Testing Tool.


    Google Custom Search users can also use the Rich Snippets Testing Tool to test markup usable in their Custom Search engine.

    Better documentation. We've extended our documentation to include a new section containing Tips & Tricks and Frequently Asked Questions. Here we have responded to common points of confusion and provided instructions on how to maximize the chances of getting Rich Snippets for your site.

    Extended RDFa support. In addition to the Person RDFa format, we have added support for the corresponding fields from the FOAF and vCard vocabularies for all those of you who asked for it.

    Videos. If you have videos on your page, you can now mark up your content to help Google find those videos.

    As before, marking up your content does not guarantee that Rich Snippets will be shown for your site. We will continue to expand this feature gradually to ensure a great user experience whenever Rich Snippets are shown in search results.

    Saturday, 25 October 2014

    Dealing with Sitemap cross-submissions



    Since the launch of Sitemaps, webmasters have been asking if they could submit their Sitemaps for multiple hosts on a single dedicated host. A fair question -- and now you can!

    Why would someone want to do this? Let's say that you own www.example.com and mysite.google.com and you have Sitemaps for both hosts, e.g. sitemap-example.xml and sitemap-mysite.xml. Until today, you would have to store each Sitemap on its respective host. If you tried to place sitemap-mysite.xml on www.example.com, you would get an error because, for security reasons, a Sitemap on www.example.com can only contain URLs from www.example.com. So how do we solve this? Well, if you can "prove" that you own or control both of these hosts, then either one can host a Sitemap containing URLs for the other. Just follow the normal verification process in Google Webmaster Tools and any verified site in your account will be able to host Sitemaps for any other verified site in the same account.

    Here is an example showing both sites verified:

    And now, from a single host, you can submit Sitemaps for both sites without any errors. sitemap-example.xml contains URLs from www.example.com and sitemap-mysite.xml contains URLs from mysite.google.com, but both now reside on www.example.com.
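    If you build your Sitemaps with a script, the cross-submitted file is just an ordinary Sitemap that happens to live on the other verified host. Here's a minimal sketch in Python (the URLs and filename are illustrative):

        import xml.etree.ElementTree as ET

        # Build sitemap-mysite.xml: it lists URLs from mysite.google.com but will be
        # uploaded to, and served from, the verified www.example.com host.
        urlset = ET.Element("urlset",
                            xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
        for page in ("http://mysite.google.com/",
                     "http://mysite.google.com/about.html"):
            url = ET.SubElement(urlset, "url")
            ET.SubElement(url, "loc").text = page

        ET.ElementTree(urlset).write("sitemap-mysite.xml",
                                     encoding="utf-8", xml_declaration=True)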
    We've also added more information on handling cross-submits in our Webmaster Help Center.
    For those of you wondering how this affects the other search engines that support the Sitemap Protocol, rest assured that we're talking to them about how to make cross-submissions work seamlessly across all of them. Until then, this specific solution will work only for users of Google Webmaster Tools.

    Friday, 24 October 2014

    Malware? We don't need no stinking malware!

    (Cross-posted from the Google Online Security Blog.)

    "This site may harm your computer"
    You may have seen those words in Google search results — but what do they mean? If you click the search result link you get another warning page instead of the website you were expecting. But if the web page was your grandmother's baking blog, you're still confused. Surely your grandmother hasn't been secretly honing her l33t computer hacking skills at night school. Google must have made a mistake and your grandmother's web page is just fine...

    I work with the team that helps put the warning in Google's search results, so let me try to explain. The good news is that your grandmother is still kind and loves turtles. She isn't trying to start a botnet or steal credit card numbers. The bad news is that her website or the server that it runs on probably has a security vulnerability, most likely from some out-of-date software. That vulnerability has been exploited and malicious code has been added to your grandmother's website. It's most likely an invisible script or iframe that pulls content from another website that tries to attack any computer that views the page. If the attack succeeds, then viruses, spyware, key loggers, botnets, and other nasty stuff will get installed.

    If you see the warning on a site in Google's search results, it's a good idea to pay attention to it. Google has automatic scanners that are constantly looking for these sorts of web pages. I help build the scanners and continue to be surprised by how accurate they are. There is almost certainly something wrong with the website even if it is run by someone you trust. The automatic scanners make unbiased decisions based on the malicious content of the pages, not the reputation of the webmaster.

    Servers are just like your home computer and need constant updating. There are lots of tools that make building a website easy, but each one adds some risk of being exploited. Even if you're diligent and keep all your website components updated, your web host may not be. They control your website's server and may not have installed the most recent OS patches. And it's not just innocent grandmothers that this happens to. There have been warnings on the websites of banks, sports teams, and corporate and government websites.

    Uh-oh... I need help!
    Now that we understand what the malware label means in search results, what do you do if you're a webmaster and Google's scanners have found malware on your site?

    There are some resources to help clean things up. The Google Webmaster Central blog has some tips and a quick security checklist for webmasters. Stopbadware.org has great information, and their forums have a number of helpful and knowledgeable volunteers who may be able to help (sometimes I'm one of them). You can also use the Google SafeBrowsing diagnostics page for your site (http://www.google.com/safebrowsing/diagnostic?site=<site-name-here>) to see specific information about what Google's automatic scanners have found. If your site has been flagged, Google's Webmaster Tools lists some of the URLs that were scanned and found to be infected.
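    If you prefer to script that lookup, the sketch below simply fetches the diagnostic page using the URL pattern from the paragraph above (replace the site value with your own hostname; the response is plain HTML):

        import urllib.request

        site = "example.com"  # your hostname here
        url = "http://www.google.com/safebrowsing/diagnostic?site=" + site
        with urllib.request.urlopen(url) as response:
            page = response.read().decode("utf-8", "replace")
        print(page[:500])  # print the beginning of the diagnostic page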

    Once you've cleaned up your website, use Google's Webmaster Tools to request a malware review. The automatic systems will rescan your website and the warning will be removed if the malware is gone.

    Advance warning
    I often hear webmasters asking Google for advance warning before a malware label is put on their website. When the label is applied, Google usually emails the website owners and then posts a warning in Google's Webmaster Tools. But no warning is given ahead of time, so a webmaster can't quickly clean up the site before the label is applied.

    But, look at the situation from the user's point of view. As a user, I'd be pretty annoyed if Google sent me to a site it knew was dangerous. Even a short delay would expose some users to that risk, and it doesn't seem justified. I know it's frustrating for a webmaster to see a malware label on their website. But, ultimately, protecting users against malware makes the internet a safer place and everyone benefits, both webmasters and users.

    Google's Webmaster Tools has started a test to provide warnings to webmasters that their server software may be vulnerable. Responding to that warning and updating server software can prevent your website from being compromised with malware. The best way to avoid a malware label is to never have any malware on the site!

    Reviews
    You can request a review via Google's Webmaster Tools and you can see the status of the review there. If you think the review is taking too long, make sure to check the status. Finding all the malware on a site is difficult and the automated scanners are far more accurate than humans. The scanners may have found something you've missed and the review may have failed. If your site has a malware label, Google's Webmaster Tools will also list some sample URLs that have problems. This is not a full list of all of the problem URLs (because that's often very, very long), but it should get you started.

    Finally, don't confuse a malware review with a request for reconsideration. If Google's automated scanners find malware on your website, the site will usually not be removed from search results. There is also a different process that removes spammy websites from Google search results. If that's happened and you disagree with Google, you should submit a reconsideration request. But if your site has a malware label, a reconsideration request won't do any good — for malware you need to file a malware review from the Overview page.

    How long will a review take?
    Webmasters are eager to have a Google malware label removed from their site and often ask how long a review of the site will take. Both the original scanning and the review process are fully automated. The systems analyze large portions of the internet, which is a big place, so the review may not happen immediately. Ideally, the label will be removed within a few hours. At its longest, the process should take a day or so.

    Wednesday, 22 October 2014

    Verifying a Blogger blog in Webmaster Tools

    Webmaster Level: All

    You may have seen our recent announcement of changes to the verification system in Webmaster Tools. One side effect of this change is that blogs hosted on Blogger (that haven't yet been verified) will have to use the meta tag verification method rather than the "one-click" integration from the Blogger dashboard. The "Webmaster Tools" auto-verification link from the Blogger dashboard is no longer working and will soon be removed. We're working to reinstate an automated verification approach for Blogger hosted blogs in the future, but for the time being we wanted you to be aware of the steps required to verify your Blogger blog in Webmaster Tools.

    Step-By-Step Instructions:

    In Webmaster Tools
    1. Click the "Add a site" button on the Webmaster Tools Home page
    2. Enter your blog's URL (for example, googlewebmastercentral.blogspot.com) and click the "Continue" button to go to the Manage verification page
    3. Select the "Meta tag" verification method and copy the meta tag provided

    In Blogger
    4. Go to your blog and sign in
    5. From the Blogger dashboard click the "Layout" link for the blog you're verifying
    6. Click the "Edit HTML" link under the "Layout" tab, which will allow you to edit the HTML for your blog's template
    7. Paste the meta tag (copied in step 3) immediately after the opening <head> tag in the template HTML and click the "SAVE TEMPLATE" button




    In Webmaster Tools
    8. On the Manage Verification page, confirm that "Meta tag" is selected as the verification method and click the "Verify" button

    Your blog should now be verified. You're ready to start using Webmaster Tools!

    Tuesday, 21 October 2014

    One million YouTube views!

    Earlier this year, we launched our very own Webmaster Central channel on YouTube. Just today, we saw our total video views exceed one million! On the road to this milestone, we uploaded 154 videos, for a total of nearly 11 hours of webmaster-focused media. These videos have brought you conference presentations, updates on tools for webmasters, general tips, and of course answers to your "Grab bag" questions for Matt Cutts.

    To celebrate our one million views, we're sharing a fun video with you in which Matt Cutts shows us what happened when he lost a bet with his team:



    We're also pleased to announce that we've added captions to all of our videos and plan to do so for our future videos as well. Thank you to everyone who has watched, shared, and commented on our videos. We look forward to the next million views!

    Webmaster chat event: Vote early and often!


    No matter where in the world you are, you can vote right now on webmaster-oriented questions by registering for our free Webmaster chat ("Tricks and Treats"), which is scheduled for tomorrow at 9am PDT (5pm GMT).  Even better: you can suggest your own questions that you'd like Webmaster Central Googlers to answer.


    We're using the new Google Moderator tool, so posting questions and voting on your favorites is fun and easy; you'll receive an e-mail with a link to the webmaster chat questions right after you register.  Click on the check mark next to questions you find particularly interesting and important. Click on the X next to questions that seem less relevant or useful.  From your votes, Google Moderator will surface the best questions, helping us spend more time in the chat on issues you really care about.

    Feel free to review our post from yesterday for more details on this event.

    See you there!


    P.S. - Speaking of voting:  If you're an American citizen, we hope you're also participating in the upcoming presidential election! Our friends in Google Maps have even prepared a handy lookup tool to help you find your voting place -- check it out!



    Monday, 20 October 2014

    Join us for our third live online webmaster chat!


    You know how some myths just won't die?  Well, do we have some great news for you!  A not-so-scary bunch of Gooooooooooooglers will be on hand to drive a stake through the most ghoulish webmastering myths and misconceptions in our live online "Tricks and Treats" chat this coming Wednesday.

    That's right!  You'll be treated to some brief presentations and then have the chance to ask lots of questions to Googlers ranging from Matt Cutts in Mountain View to John Mueller in Zurich to Kaspar Szymanski in Dublin (and many more folks as well).


    Here's what you'll need
    • About an hour of free time
    • A computer with audio capabilities that is connected to the Internet and has these additional specifications
      (We'll be broadcasting via the Internet tubes this time rather than over the phone lines)
    • A URL for the chat, which you can only get when you register for the event (don't worry -- it's fast and painless!)
    • Costumes: optional

    What will our Tricks and Treats chat include?
    • INTRO:  A quick hello from some of your favorite Help Group Guides
    • PRESO:  A 15 minute presentation on "Frightening Myths and Misconceptions" by John Mueller
    • FAQs:  A return of our popular "Three for Three," in which we'll have three different Googlers tackling three different issues we've seen come up in the Group recently... in under three minutes each!
    • And lots of Q&A!  You'll have a chance to type questions during the entire session (actually, starting an hour prior!) using our hunky-dory new Google Moderator tool.  Ask, then vote!  With this tool and your insights, we expect the most interesting questions to quickly float to the top.

    When and how can you join in?
    1. Mark the date on your calendar now:  Wednesday, October 22, at 9am PDT, noon EDT, and 5pm GMT
    2. Register right now for this event.  Please note that you'll need to click on the "register" link on the bottom lefthand side.
    3. Optionally post questions via Google Moderator one hour prior to the start of the event.  The link will be mailed to all registrants.
    4. Log in 5-10 minutes prior to the start of the chat, using the link e-mailed to you by WebEx (the service hosting the event).
    5. Interact!  During the event, you'll be able to chat (by typing) with your fellow attendees, and also post questions and vote on your favorite questions via Google Moderator.

    We look forward to seeing you online!  In the meantime, if you have any questions, feel free to post a note in this thread of our friendly Webmaster Help Group.

    Edited on October 21st at 12:15pm and 12:29pm PDT to add:
    We've decided to open up the Google Moderator page early.  Everyone who registered for this event previously and everyone registering from this moment on will receive the link in e-mail.  Also, the event is scheduled for *5pm* GMT (correctly listed on the registration page and in the followup e-mails).

    Sunday, 19 October 2014

    Where's my data?

    Today we're going back to basics. We'll be answering the question: What is a website?

    ...Okay, not exactly. But we will be looking into what a "website" means in the context of Webmaster Tools, what kind of sites you can add to your Webmaster Tools account, and what data you can get from different types of sites.

    Why should you care? Well, the following are all questions that we've gotten from webmasters recently:
    • "I know my site has lots of incoming links; why don't I see any in my Webmaster Tools account?"
    • "I see sitelinks for my site in Google's search results, but when I look in Webmaster Tools it says 'No sitelinks have been generated for your site.'"
    • "Why does my Top search queries report still say 'Data is not available at this time'? My site has been verified for months."
    In each of these cases, the answer was the same: the data was there, but the webmaster was looking at the wrong "version" of their domain in Webmaster Tools.


    A little background
    The majority of tools and settings in Webmaster Tools operate on a per-site basis. This means that when you're looking at, say, the Top search queries report, you're only seeing the top search queries for a particular site. Looking at the top queries for www.example.com will show you different data than looking at the top queries for www.example.org. Makes sense, right?

    Not all websites have URLs in the form www.example.com, though. Your root URL may not include the www subdomain (example.com); it may include a custom subdomain (rollergirl.example.com); or your site may live in a subfolder, for example if it's hosted on a free hosting site (www.example.com/rollergirl/). Since we want webmasters to be able to access our tools regardless of how their site is hosted, you can add any combination of domain, subdomain(s), and/or subfolder(s) as a "site" on your Webmaster Tools dashboard. Once you've verified your ownership of that site, we'll show you the information we have for that particular piece of the web, however big or small it may be. If you've verified your domain at the root level, we'll show you data for that whole domain; if you've only verified a particular subfolder or subdomain, we'll only show you data for that subfolder or subdomain. Take Blogger as an example—someone who blogs with Blogger should only be able to have access to the data for their own subdomain (googlewebmastercentral.blogspot.com), not the entire blogspot.com domain.

    What some people overlook is the fact that www is actually a subdomain. It's a very, very common subdomain, and many sites serve the same content whether you access them with or without the www; but the fact remains that example.com and www.example.com are two different URLs and have the potential to serve different content. For this reason, they're considered different sites in Webmaster Tools. Since they're different sites (just like www.example.com and www.example.org), they can have different data. When you're looking at the data for www.example.com (with the www subdomain) you're not seeing the data for example.com (without the subdomain), and vice versa.

    What can I do to make sure I'm seeing all my data?
    • If you feel like you're missing some data, add both the www and the non-www version of your domain to your Webmaster Tools account. Take a look at the data for both sites.
    • Do a site: search for your domain without the www (e.g. [site:example.com]). This should return pages from your domain and any of your indexed subdomains (www.example.com, rollergirl.example.com, etc.). You should be able to tell from the results whether your site is mainly indexed with or without the www subdomain. The version that's indexed is likely to be the version that shows the most data in your Webmaster Tools account.
    • Tell us whether you prefer for your site to be indexed with or without the www by setting your preferred domain.
    • Let everyone else know which version you prefer by doing a site-wide 301 redirect (see the sketch below).
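    Most people set up that redirect in their web server configuration, but here's a framework-free sketch of what a site-wide 301 from the non-www host to the www host looks like as a tiny Python WSGI app (the hostnames are illustrative, and this assumes the www version is your preferred domain):

        from wsgiref.simple_server import make_server

        def app(environ, start_response):
            host = environ.get("HTTP_HOST", "")
            path = environ.get("PATH_INFO", "")
            query = environ.get("QUERY_STRING", "")
            if host == "example.com":  # non-www request: redirect permanently
                location = "http://www.example.com" + path
                if query:
                    location += "?" + query
                start_response("301 Moved Permanently", [("Location", location)])
                return [b""]
            start_response("200 OK", [("Content-Type", "text/plain")])
            return [b"Served from the preferred www host"]

        if __name__ == "__main__":
            make_server("", 8000, app).serve_forever()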
    Even though example.com and www.example.com may look like identical twins, any twins will be quick to tell you that they're not actually the same person. :-) Now that you know, we urge you to give both your www and non-www sites some love in Webmaster Tools, and—as usual—to post any follow-up questions in our Webmaster Help Group.

    Saturday, 18 October 2014

    Blast from the past

    Written by Sahala Swenson, Webmaster Tools Team

    As you know, the queries used to find your website in search results can change over time. Your website content changes, as do the needs of all the busy searchers out there. Whether the queries associated with your site change subtly or dramatically, it's pretty useful to see how they transform over time.

    Recognizing this, Top Search Queries in Webmaster Tools now presents historical data and other enhancements. Let's take a closer look:


    Up to 6 months of historical data:
    Previously we only showed query stats for the last 7 days. Now you can jump between 9 query stats snapshots ranging from now to 6 months ago. Note that the time interval for each of these snapshots is different. For the 7 day, 2 week, and 3 week snapshots, we report the top queries for the previous week. For the 1 to 6 month snapshots, we report statistics for the previous month. And still others of you who log in may notice that you don't have query stats data going back to 6 months ago. We hope to improve that experience in the future. :)

    Top query percentages:
    You might have noticed a new column in the top query listings. Previously we just ranked your query results and clicks. While useful, this didn't really tell you to what extent one query was ranked higher than another. Now we show what percentage each query result or click represents out of the top 20 queries. This should help you see how well the result or click volume is distributed in the top 20.

    Downloads:

    Since we're now showing historical data on the Top Search Queries screen, we figured it would be rude to not let you download it all and play with the data yourself (spreadsheet masochists, I'm looking at you). We added a “Download data” link that lets you download all the stats in CSV format. Note that this exports all query stats historical data across all snapshots as well as search types and languages, so you can slice and dice to your satisfaction. The “Download all stats (including subfolders)” link, however, will still only show query stats for your site and sub-folders for the last 7 days.
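    Once you have the CSV on disk, Python's csv module is enough to start slicing it; the filename and column layout below are assumptions, so check the header row of your own export:

        import csv

        with open("top_search_queries.csv", newline="", encoding="utf-8") as f:
            reader = csv.reader(f)
            header = next(reader)
            print("Columns:", header)
            for i, row in enumerate(reader):
                print(row)
                if i >= 9:  # just peek at the first 10 data rows
                    break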

    Freshness:

    We've improved data freshness in Webmaster Tools a couple of times in the past, and we've done it again with the new Top Search Queries. Statistics are now updated constantly. Top query results and clicks may visibly change rank a lot more often now, sometimes daily.


    So enough talk. Sign in and play around with the new improvements for yourself. As always we welcome feedback (especially in the form of beer), so feel free to drop us a note in the Webmaster Help Group and let us know what you think.

    Friday, 17 October 2014

    First Click Free for Web Search

    While working on our mission to organize the world's information and make it universally accessible and useful, we sometimes run into situations where important content is not publicly available. In order to help users find and access content that may require registration or a subscription, Google offers an option to web and news publishers called "First Click Free." First Click Free has two main goals:
    1. To include highly relevant content in Google's search index. This provides a better experience for Google users who may not have known that content existed.
    2. To provide a promotion and discovery opportunity for publishers with restricted content.

    First Click Free is designed to protect your content while allowing you to include it in Google's search index. To implement First Click Free, you must allow all users who find your page through Google search to see the full text of the document they found in Google's search results and that Google's crawler found on the web, without requiring them to register or subscribe to see that content. The user's first click to your content is free and does not require logging in. You may, however, block the user with a login, payment, or registration request when they try to click away from that page to another section of your content site.

    Guidelines
    Webmasters wishing to implement First Click Free should follow these guidelines:
    • All users who click a Google search result to arrive at your site should be allowed to see the full text of the content they're trying to access.
    • The page displayed to all users who visit from Google must be identical to the content that is shown to Googlebot.
    • If a user clicks to a multi-page article, the user must be able to view the entire article. To allow this, you could display all of the content on a single page—you would need to do this for both Googlebot and for users. Alternately, you could use cookies to make sure that a user can visit each page of a multi-page article before being asked for registration or payment.

    Implementation Suggestions
    To include your restricted content in Google's search index, our crawler needs to be able to access that content on your site. Keep in mind that Googlebot cannot access pages behind registration or login forms. You need to configure your website to serve the full text of each document when the request is identified as coming from Googlebot via the user-agent and IP-address. It's equally important that your robots.txt file allows access of these URLs by Googlebot.

    When users click a Google search result to access your content, your web server will need to check the "Referer" HTTP request-header field. When the referring URL is on a Google domain, like www.google.com or www.google.de, your site will need to display the full text version of the page instead of the protected version of the page that is otherwise shown. Most web servers have instructions for implementing this type of behavior.
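    As a rough sketch of that Referer check (how you decide which Google hostnames qualify, and how you serve the full versus restricted version of a page, is up to your site; this only illustrates the decision, and the Googlebot detection mentioned earlier would be handled separately):

        from urllib.parse import urlparse

        def should_serve_full_text(referer_header):
            """Return True if the request appears to come from a Google search page."""
            if not referer_header:
                return False
            host = urlparse(referer_header).netloc.lower()
            # www.google.com, www.google.de, and other Google search domains qualify.
            return host.startswith("www.google.")

        print(should_serve_full_text("http://www.google.com/search?q=example"))  # True
        print(should_serve_full_text("http://www.example.org/links.html"))       # False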

    Frequently Asked Questions
    Q: Can I allow Googlebot to access some restricted content pages but not others?
    A: Yes.

    Q: Can I limit the number of restricted content pages that an individual user can access on my site via First Click Free?
    A: No. Any user arriving at your site from a Google search results page should be shown the full text of the requested page.

    Q: Can First Click Free URLs be submitted using Sitemap files?
    A: Yes. Simply create and submit your Sitemap file as usual.

    Q: Is First Click Free content guaranteed inclusion in the Google Index?
    A: No. Google does not guarantee inclusion in the web index.


    Do you have any more questions or comments? Come on over to the Google Webmaster Help forum and join the discussion!