Monday, 29 June 2015

Sitemaps: One file, many content types

Webmaster Level: All

Have you ever wanted to submit your various content types (video, images, etc.) in one Sitemap? Now you can! If your site contains videos, images, mobile URLs, code or geo information, you can now create—and submit—a Sitemap with all the information.

Site owners have been leveraging Sitemaps to let Google know about their sites’ content since Sitemaps were first introduced in 2005. Since that time additional specialized Sitemap formats have been introduced to better accommodate video, images, mobile, code or geographic content. With the increasing number of specialized formats, we’d like to make it easier for you by supporting Sitemaps that can include multiple content types in the same file.

The structure of a Sitemap with multiple content types is similar to a standard Sitemap, with the additional ability to contain URLs referencing different content types. Here's an example of a Sitemap that contains a reference to a standard web page for Web search, image content for Image search and a video reference to be included in Video search:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://example.com/foo.html</loc>
    <image:image>
      <image:loc>http://example.com/image.jpg</image:loc>
    </image:image>
    <video:video>
      <video:content_loc>http://example.com/videoABC.flv</video:content_loc>
      <video:title>Grilling tofu for summer</video:title>
    </video:video>
  </url>
</urlset>

Here's an example of what you'll see in Webmaster Tools when a Sitemap containing multiple content types is submitted:



We hope the capability to include multiple content types in one Sitemap simplifies your Sitemap submission. The rest of the Sitemap rules, like the maximum of 50,000 URLs in one file and the 10MB uncompressed file size limit, still apply.
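If you run up against those limits, the sitemaps.org protocol lets you split your URLs across several Sitemap files and tie them together with a Sitemap index file. A minimal sketch, with illustrative file names:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sitemap-videos.xml</loc>
  </sitemap>
</sitemapindex>

If you have questions or other feedback, please visit the Webmaster Help Forum.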

Sunday, 28 June 2015

Adding associates to manage your YouTube presence

Webmaster level: All

Update: This functionality is no longer supported.

Many organizations have multiple presences on the web. For example, Webmaster Tools lives at www.google.com/webmasters, but it also has a Twitter account and a YouTube channel. It's important that visitors to these other properties have confidence that they are actually associated with the Webmaster Tools site. However, to date it has been challenging for webmasters to manage which users can take actions on behalf of their site in different services.

Today we're happy to announce a new feature in Webmaster Tools that allows webmasters to add "associates" -- trusted users who can act on behalf of your site in other Google products. Unlike site owners and users, associates can't view site data or take any site actions in Webmaster Tools, but they are authorized to perform specific tasks in other products.

For this initial launch, members of YouTube's partner program who have created a YouTube channel for their site can now link the two together. Once linked, your YouTube channel will be displayed as the "official channel" for your website.


Management within Webmaster Tools

To add or change associates:

  1. On the Webmaster Tools home page, click the site you want.
  2. Under Configuration, click Associates.
  3. Click Add a new associate.
  4. In the text box, type the email address of the person you want to add.
  5. Select the type of association you want.
  6. Click Add.

Management within YouTube

It’s also possible for users to request association from a site’s webmaster.
  1. Log in to your YouTube partner account.
  2. Click on the user menu and choose Settings > Associated Website.
  3. Fill in the page you would like to associate your channel with.
  4. Click Add. If you’re a verified owner of the site, you’re done. But if someone else in your organization manages the website, the association will be marked Pending. The owner receives a notification with an option to approve or deny the request.
  5. After approval is granted, navigate back to this page and click Refresh to complete the association.
Through associates, webmasters can easily and safely allow others to associate their website with YouTube channels. We plan to support integration with additional Google products in the future.

If you have more questions, please see the Help with Associates article or visit our webmaster help forum.

Saturday, 27 June 2015

Introducing website satisfaction by Google Consumer Surveys


Webmaster level: all

We're now offering webmasters an easy and free way to collect feedback from your website visitors with website satisfaction surveys. All you have to do is paste a small snippet of code in the HTML for your website and this will load a discreet satisfaction survey in the lower right hand corner of your website. Google automatically aggregates and analyzes responses, providing the data back to you through a simple online interface.


Users will be asked to complete a four-question satisfaction survey. Surveys will run until they have received 500 responses and will start again after 30 days so you can track responses over time. This is currently limited to US English visitors on non-mobile devices. The default questions are free and you can customize questions for just $0.01 per response or $5.00 for 500 responses.


Survey Setup and Code Placement Tips

To set up the survey code, you'll need to have access to the source code for your website.
  1. Sign into Google Consumer Surveys for website satisfaction to find the code snippet.
  2. You have the option to enter the website name and URL, survey timing, and survey frequency.
  3. Click on the “Activate survey” button when ready.
  4. Once you find the code snippet at the top of the setup page, copy and paste it into your web page, just before the closing </head> tag, as illustrated below. If your website uses templates to generate pages, enter it just before the closing </head> tag in the file that contains the <head> section.
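For illustration, here's where the snippet goes (the actual code comes from your setup page; the comment below is just a placeholder for it):

<html>
  <head>
    <title>Example page</title>
    <!-- paste the survey code snippet from the setup page here,
         just before the closing </head> tag -->
  </head>
  <body>
    ...
  </body>
</html>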
If you have any questions, please read our Help Center article to learn more.

Friday, 26 June 2015

SEO essentials for startups in under 10 minutes

Webmaster Level: Beginner to Intermediate

Wondering how to be search-friendly but lacking time for SEO research? We’d like to help! Meta keywords tag? Google Search ignores it. Meta description? Good to include.
If you:
  • Work on a company website that’s under 50ish pages.
  • Hope to rank well for your company name and a handful of related terms (not lots of terms like a news agency or e-commerce site).
  • Want to be smart about search engines and attracting searchers, but haven’t kept up with the latest search news.
Then perhaps set aside ten minutes for this video (or just the slides) and gain SEO peace of mind.


Everything I’d tell a startup if I had ten minutes as their SEO consultant.

More tips at developers.google.com/startups. Best of luck!

Sunday, 21 June 2015

Quality links to your site

A popular question on our Webmaster Help Forum is in regard to best practices for organic link building. There seems to be some confusion, especially among less experienced webmasters, on how to approach the topic. Different perspectives have been shared, and we would also like to explain our viewpoint on earning quality links.

If your site is rather new and still unknown, a good marketing technique is to get involved in the community around your topic. Interact and contribute on forums and blogs. Just keep in mind to contribute in a positive way, rather than spamming or soliciting for your site. Simply building a reputation can drive people to your site, and they will keep on visiting it and linking to it. If you offer long-lasting, unique and compelling content -- something that lets your expertise shine -- people will want to recommend it to others. Great content can serve this purpose as much as providing useful tools.

A promising way to create value for your target group and earn great links is to think of issues or problems your users might encounter. Visitors are likely to appreciate your site and link to it if you publish a short tutorial or a video providing a solution, or a practical tool. Survey or original research results can serve the same purpose, if they turn out to be useful for the target audience. Both methods grow your credibility in the community and increase visibility. This can help you gain lasting, merit-based links and loyal followers who generate direct traffic and "spread the word." Offering a number of solutions for different problems could evolve into a blog which can continuously affect the site's reputation in a positive way.

Humor can be another way to gain both great links and get people to talk about your site. With Google Buzz and other social media services constantly growing, entertaining content is being shared now more than ever. We've seen all kinds of amusing content, from ASCII art embedded in a site's source code to funny downtime messages used as a viral marketing technique to increase the visibility of a site. However, we do not recommend counting only on short-lived link-bait tactics. Their appeal wears off quickly and as powerful as marketing stunts can be, you shouldn't rely on them as a long-term strategy or as your only marketing effort.

It's important to clarify that any legitimate link building strategy is a long-term effort. There are those who advocate for short-lived, often spammy methods, but these are not advisable if you care for your site's reputation. Buying PageRank-passing links or randomly exchanging links are the worst ways of attempting to gather links and they're likely to have no positive impact on your site's performance over time. If your site's visibility in the Google index is important to you it's best to avoid them.

Directory entries are often mentioned as another way to promote young sites in the Google index. There are great, topical directories that add value to the Internet. But there are not many of them in proportion to those of lower quality. If you decide to submit your site to a directory, make sure it's on topic, moderated, and well structured. Mass submissions, which are sometimes offered as a quick work-around SEO method, are mostly useless and not likely to serve your purposes.

It can be a good idea to take a look at similar sites in other markets and identify the elements of those sites that might work well for yours, too. However, it's important not to just copy success stories but to adapt them, so that they provide unique value for your visitors.


Social bookmarks on YouTube enable users to share content easily


Finally, consider making linking to your site easier for less tech savvy users. Similar to the way we do it on YouTube, offering bookmarking services for social sites like Twitter or Facebook can help spread the word about the great content on your site and draw users' attention.

As usual, we'd like to hear your opinion. You're welcome to comment here in the blog, or join our Webmaster Help Forum community.

Backlinks and reconsideration requests

Webmaster level: advanced

When talking to site owners on Google Webmaster Forums we come across questions on reconsideration requests and how to handle backlink-related issues. Here are some common questions, along with our recommendations.

When should I file a reconsideration request?

If your site violates our Google Quality Guidelines or did in the past, a manual spam action may be applied to your site to prevent spam in our search results. You may learn about this violation from a notification in Google Webmaster Tools, or perhaps from someone else such as a previous owner or SEO of the site. To get this manual action revoked, first make sure that your site no longer violates the quality guidelines. After you've done that, it's time to file a reconsideration request.

Should I file a reconsideration request if I think my site is affected by an algorithmic change?

Reconsideration requests are intended for sites with manual spam actions. If your site’s visibility has been solely affected by an algorithmic change, there's no manual action to be revoked, and therefore no need to file a reconsideration request. If you're unsure if it's an algorithmic change or a manual action, and have found issues that you have resolved, then submitting a reconsideration request is fine.

How can I assess the quality of a site’s backlinks?

The links to your site section of Google Webmaster Tools is a great starting point for an investigation, as it shows a significant amount of your site’s inbound links. If you know that you ran an SEO campaign during a particular period of time, downloading the latest links can come in handy for isolating links created at that time. Using the links found in Google Webmaster Tools, we recommend looking for patterns that point to general issues that are worth resolving. For example, spammy blog comments, auto-generated forum posts or text advertisements with links that pass PageRank are likely to be seen as unnatural links and would violate Google’s quality guidelines. For individual examples and hands-on advice we recommend getting help from peers and expert webmasters on the Google Webmaster Forum.

How do I clean a bad backlink profile?

Make sure to identify poor links first, then make a strong effort to get them either removed or nofollowed. Then use the Disavow Links Tool to deal with the remaining unnatural backlinks. We recommend using the domain-wide operator for sites with a complicated URL structure, for very obvious spam sites such as gibberish-content sites, and for low-quality sites with content that shows no editorial value. See our video on common mistakes when using the disavow tool for more information.
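For reference, a disavow file is plain text with one URL or domain per line; lines beginning with # are comments. A short sketch with made-up entries:

# Spam directory; owner did not respond to our removal request
domain:spamdirectory.example.com
# Individual paid link we could not get removed
http://www.example.org/paid-links/page.html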

How much information do I need to provide?

Detailed documentation submitted along with a reconsideration request can contribute to its success, as it demonstrates the efforts made by the webmaster and helps Googlers with their investigation. If you are including a link to a shared document, make sure that it’s accessible to anyone with the link.

How long does it take to process reconsideration requests?

Reconsideration requests for sites affected by a manual spam action are investigated by a Googler. We strive to respond in a timely manner, normally within just a few days. However, the volume of incoming reconsideration requests can vary considerably, hence we don't provide a guaranteed turnaround time.

What are the possible outcomes of a reconsideration request?

Upon submitting a reconsideration request, you will first receive an automated confirmation in Google Webmaster Tools. After your request is processed, we'll send you another message to let you know the outcome of the request. In most cases, this message will either inform you that the manual action has been revoked or that your site still violates our quality guidelines.

Where can I get more guidance?

For more information on reconsideration requests, please visit our Help Center. And as always, the Google Webmaster Forum is a great place for further discussions as well as seeking more advice from experienced webmasters and Google guides.


Saturday, 20 June 2015


We're back from SES Milan!

...with a couple of clarifications

Ciao everybody! We just got back from Italy—great weather there, I must say! We attended SES in Milan on the 29th and 30th of May. The conference was a great opportunity to talk to many of you. We really had a good time and want to thank all the people who stopped by to simply say "hi" or to talk to us in more detail about search engine strategies. This gave us a chance to talk to many participants and many of the major Italian players in the SEO and web marketing worlds. We discussed recent developments in the Italian internet market, SEO strategies and evangelizing.

A number of you have raised interesting questions, and we'd like to go through two of these in more detail.

This is a situation a webmaster might find himself/herself in: I optimized this site using some sneaky techniques that are not in accordance with Google's Webmaster Guidelines. I got away with it for a while, and it helped me rank in second position for certain keywords. Then, suddenly, I got an email from Google saying my site had been banned from the index because of those techniques (in these emails there is always an example of one of the infractions found). I then cleaned up the site, and after a few days it was back in the index.
Why on earth doesn't my site rank in second position anymore, even though I've removed all the sneaky techniques we used?

OK, before answering let me ask you a couple of questions:

  • Didn't you optimize your site with those techniques in order to artificially boost the ranking?
  • Didn't you think those techniques were working (at least in the short term)?

So, if there has been spamming going on, we encourage a site that has gotten an email from Google to take this notification seriously. Many people clean up their sites after receiving a notification from us. But we must also take into account that besides the shady SEO techniques used on a particular site (for instance hidden text, redirecting doorway pages, etc.), there are often off-site SEO techniques used, such as creating artificial link popularity, in order to gain a high position in Google's SERPs.

So, to make it straightforward: once those manipulations to make a site rank unnaturally high are removed, the site gains the position it merits based on its content and its natural link popularity. Note that of course the ranking of your site also depends on other sites related to the same topic, and these sites might have been optimized in accordance with our guidelines, which might affect the ranking of your site.

Note that a site does not keep a stain or any residual negative effect from a prior breach of our webmaster guidelines, after it has been cleaned up.

That is why we first and foremost recommend working hard on the content made for the audience of your site, as content is a decisive factor in building natural link popularity. We all know how powerful a strong natural link popularity can be.

Search quality, content quality and your visitor's experience.

During our conversations about search-related issues, another topic that came up frequently was landing pages and writing for search engines, which are often related when we consider organic search results.

So, think of your visitors who have searched for something with Google and have found your page. Now, what kind of welcome are you offering? A good search experience consists of finding a page that contains enough information to satisfy your original query.

A common mistake in writing optimized content for search engines is to forget about the user and focus only on that particular query. One might say, "that's how the user landed on my page!"

At the end of the day, taking this attitude to the extreme might lead to creating pages made only to satisfy that query, with no actual content on them. Such pages often rely on techniques such as mere repetition of keywords and duplicated content, and offer overall very little value. In general, they might be in line with the keywords of the query, but for your visitor they're useless. In other words, you have written pages solely for the search engine and forgotten about the user. As a result, your visitor finds a page that is apparently on topic but totally meaningless.

These “meaningless” pages, artificially made to generate search engine traffic, do not represent a good search experience. Even if they don't employ other questionable techniques, such as hidden text or links, they are made solely for the purpose of ranking for particular keywords or sets of keywords, and don't offer a satisfying search result in themselves.

A first step in identifying whether you are causing a bad search experience is to check that the pages your visitors find are actually useful. Useful pages have topical content that satisfies the query through which your visitor found them, and are overall meaningful and relevant. You might want to start with the pages that are most frequently found and extend the check to your entire site. To sum up, as general advice: even if you want to make a page that is easily found via search engines, remember that the users are your audience, and that a page optimized for the search engine does not necessarily meet the user's expectations in terms of quality and content. So if you find yourself writing content for a search engine, you should ask yourself what the value is for the user!

Google's email communication with webmasters

Posted by Ríona MacNamara, Webmaster Tools Team


We've noticed that someone is again trying to spoof the emails that Google sends to webmasters to alert them to issues with their site. These emails are not coming from Google, and in fact several weeks ago we temporarily discontinued sending these emails to webmasters while we explore different, secure ways of communicating with webmasters. Watch this space for more news - but in the meantime, you can safely assume that any such email message you receive is not, in fact, from us.

Friday, 19 June 2015

Download to Google Spreadsheet from Webmaster Tools

Webmaster level: All

Webmaster Tools now has a new download option for exporting your data directly to a Google Spreadsheet. The download option is available for most of our data-heavy features, such as Crawl errors, Search queries, and Links to your site. If you enjoy digging into the data from Webmaster Tools but don’t want to use Python scripts or the API, we’ve added new functionality just for you. Now when you click a download button from a Webmaster Tools feature like Search queries, you'll be presented with the "Select Download Format" option where you can choose to download the data as "CSV" or "Google Docs."


Choosing "CSV" initiates a download of the data in CSV format which has long been available in Webmaster Tools and can be imported into other spreadsheet tools like Excel. If you select the new “Google Docs” option then your data will be saved into a Google Spreadsheet and the newly created spreadsheet will be opened in a new browser tab.

We hope the ability to easily download your data to a Google Spreadsheet helps you to get crunching on your site's Webmaster Tools data even faster than you could before. Using only a web browser you can instantly dive right into slicing and dicing your data to create customized charts for detecting significant changes and tracking longer term trends impacting your site. If you've got questions or feedback please share it in the Webmaster Help Forum.

Thursday, 18 June 2015

Revamping the Webmaster Tools Help Center



A while ago, I posted in the Webmaster user group, looking for feedback on our Help Center and how we can improve the assistance we provide to our webmasters. And wow, did we get a lot of feedback - both in the group and in the blogosphere. I'm amazed at the webmaster community and your willingness to share your thoughts with us: thank you!

Here's a selection of what we're hearing:

You want Help to be more discoverable

  • It's not as easy as it should be to find the information you're looking for. You'd like Google to do a better job of surfacing the answers to the most common questions. The browse structure doesn't make it easy for users to find help, and sometimes search depends on users knowing exactly the right term to search for.
  • You like the idea of context-sensitive help - on-the-spot assistance (often shown in a tooltip that appears when you hover over an item) that doesn't require you to click to a different Help page.
  • Right now, it's not clear when new Help information - or new features - are added, and you'd like Google to look at calling these out.

You want Help to be more useful

  • You'd like Google to look at adding videos and graphics.
  • You'd like us to provide the kind of information that's relevant to the average webmaster, who may not have a deep knowledge of SEO techniques. You're looking for good and understandable answers to common questions.
  • You'd like us to expand the actual content, and do a much better job of explaining potential reasons why sites may have dropped in the rankings.

What's next?

Well, over the next several weeks, we'll be working on lots of changes to the Help Center, both in its content and its organization. We'll be looking at all the feedback we've gotten, and we're taking it very seriously: believe me, I have a long task list for this area. But it can always grow: if you have some great thoughts or ideas, jump into the discussion, or just leave a comment right here.

Wednesday, 17 June 2015

Verify your site in Webmaster Tools using Google Tag Manager

Webmaster level: Intermediate


If you use Google Tag Manager to add and update your site tags, now you can quickly and easily verify ownership of your site in Webmaster Tools using the container snippet code.

Here’s how it’s done:

1. On the Webmaster Tools home page, click Manage site for the site you’d like to verify, then select Verify this site. If you haven’t added the site yet, you can click the Add a site button in the top right corner.



To do this, you must have "View, Edit, and Manage" account level permissions in Google Tag Manager.

2. On the Verification page, select Google Tag Manager as the verification method and follow the steps on your screen.



3. Click Verify.

And you’re done!

If you’ve got any questions about this verification method, drop by the Webmaster Help Forum.


Saturday, 13 June 2015

Expanding the webmaster central team

You've probably already figured this out if you use webmaster tools, the webmaster help center, or our webmaster discussion forum, but the webmaster central team is a fantastic group of people. You have seen some of them helping out in the discussion forums, and you may have met a few more at conferences, but there are lots of others behind the scenes who you don't see, working on expanding webmaster tools, writing content, and generally doing all they can for you, the webmaster. Even the team members you don't see are paying close attention to your feedback: reading our discussion forum, as well as blogs and message boards. We introduced you to a few of the team before SES NY and Danny Sullivan told you about a few Googler alternatives before SES Chicago. We also have several interns working with us right now, including Marcel, who seems to have been the hit of the party at SMX Advanced.

I am truly pleased to welcome a new addition to the team, although she'll be a familiar face to many of you already. Susan Moskwa is joining Jonathan Simon as a webmaster trends analyst! She's already started posting on the forums and is doing lots of work behind the scenes. Jonathan does a wonderful job answering your questions and investigating issues that come up and he and Susan will make a great team. Susan is a bit of a linguistic genius, so she'll also be helping out in some of the international forums, where Dublin Googlers have started reading and replying to your questions. Want to know more about Susan? You just never know what you find when you do a Google search.

Duplicate content summit at SMX Advanced

Last week, I participated in the duplicate content summit at SMX Advanced. I couldn't resist the opportunity to show how Buffy is applicable to the everyday search marketing world, but mostly I was there to get input from you on the duplicate content issues you face and to brainstorm how search engines can help.

A few months ago, Adam wrote a great post on dealing with duplicate content. The most important things to know about duplicate content are:
  • Google wants to serve up unique results and does a great job of picking a version of your content to show if your site includes duplication. If you don't want to worry about sorting through duplication on your site, you can let us worry about it instead.
  • Duplicate content doesn't cause your site to be penalized. If duplicate pages are detected, one version will be returned in the search results to ensure variety for searchers.
  • Duplicate content doesn't cause your site to be placed in the supplemental index. Duplication may indirectly influence this, however, if links to your pages are split among the various versions, causing lower per-page PageRank.
At the summit at SMX Advanced, we asked what duplicate content issues were most worrisome. Those in the audience were concerned about scraper sites, syndication, and internal duplication. We discussed lots of potential solutions to these issues and we'll definitely consider these options along with others as we continue to evolve our toolset. Here's the list of some of the potential solutions we discussed so that those of you who couldn't attend can get in on the conversation.

Specifying the preferred version of a URL in the site's Sitemap file
One thing we discussed was the possibility of specifying the preferred version of a URL in a Sitemap file, with the suggestion that if we encountered multiple URLs that point to the same content, we could consolidate links to that page and could index the preferred version.

Providing a method for indicating parameters that should be stripped from a URL during indexing
We discussed providing this either in an interface such as webmaster tools or in the site's robots.txt file. For instance, if a URL contains session IDs, the webmaster could indicate the variable for the session ID, which would help search engines index the clean version of the URL and consolidate links to it. The audience leaned towards an addition in robots.txt for this.

Providing a way to authenticate ownership of content
This would provide search engines with extra information to help ensure we index the original version of an article, rather than a scraped or syndicated version. Note that we do a pretty good job of this now, and not many people in the audience mentioned this as a primary issue. However, the audience was interested in a way of authenticating content as an extra protection. Some suggested using the page with the earliest date, but creation dates aren't always reliable. Someone also suggested allowing site owners to register content, although that could raise issues as well, as non-savvy site owners wouldn't know to register content and someone else could take the content and register it instead. We currently rely on a number of factors such as the site's authority and the number of links to the page. If you syndicate content, we suggest that you ask the sites that are using your content to block their version with a robots.txt file as part of the syndication arrangement, to help ensure your version is served in results.

Making a duplicate content report available for site owners
There was great support for the idea of a duplicate content report that would list pages within a site that search engines see as duplicate, as well as pages that are seen as duplicates of pages on other sites. In addition, we discussed the possibility of adding an alert system to this report so site owners could be notified via email or RSS of new duplication issues (particularly external duplication).

Working with blogging software and content management systems to address duplicate content issues
Some duplicate content issues within a site are due to how the software powering the site structures URLs. For instance, a blog may have the same content on the home page, a permalink page, a category page, and an archive page. We are definitely open to talking with software makers about the best way to provide easy solutions for content creators.

In addition to discussing potential solutions to duplicate content issues, the audience had a few questions.

Q: If I nofollow a substantial number of my internal links to reduce duplicate content issues, will this raise a red flag with the search engines?
The number of nofollow links on a site won't raise any red flags, but that is probably not the best method of blocking the search engines from crawling duplicate pages, as other sites may link to those pages. A better method may be to block pages you don't want crawled with a robots.txt file.
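For example, if the duplicate versions live under predictable paths (the paths below are hypothetical), a robots.txt file like this keeps them from being crawled:

User-agent: *
Disallow: /print/
Disallow: /archive/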

Q: Are the search engines continuing the Sitemaps alliance?
We launched sitemaps.org in November of last year and have continued to meet regularly since then. In April, we added the ability for you to let us know about your Sitemap in your robots.txt file. We plan to continue to work together on initiatives such as this to make the lives of webmasters easier.
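That robots.txt declaration is a single line pointing to your Sitemap, for example:

Sitemap: http://www.example.com/sitemap.xml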

Q: Many pages on my site primarily consist of graphs. Although the graphs are different on each page, how can I ensure that search engines don't see these pages as duplicate since they don't read images?
To ensure that search engines see these pages as unique, include unique text on each page (for instance, a different title, caption, and description for each graph) and include unique alt text for each image (for instance, rather than alt="graph", use something like alt="graph that shows Willow's evil trending over time").

Q: I've syndicated my content to many affiliates and now some of those sites are ranking for this content rather than my site. What can I do?
If you've freely distributed your content, you may need to enhance and expand the content on your site to make it unique.

Q: As a searcher, I want to see duplicates in search results. Can you add this as an option?
We've found that most searchers prefer not to have duplicate results. The audience member in particular commented that she may not want to get information from one site and would like other choices, but for that case, other sites will likely not have identical information and therefore will show up in the results. Bear in mind that you can add the "&filter=0" parameter to the end of a Google web search URL to see additional results which might be similar.

I've brought all the issues and potential solutions that we discussed at the summit back to my team and others within Google, and we'll continue to work on providing the best search results and expanding our partnership with you, the webmaster. If you have additional thoughts, we'd love to hear about them!

Friday, 12 June 2015

More ways for you to give us input

At Google, we are always working hard to provide searchers with the best possible results. We've found that our spam reporting form is a great way to get your input as we continue to improve our results. Some of you have asked for a way to report paid links as well.

Links are an important signal in our PageRank calculations, as they tend to indicate when someone has found a page useful. Links that are purchased are great for advertising and traffic purposes, but aren't useful for PageRank calculations. Buying or selling links to manipulate results and deceive search engines violates our guidelines.

Today, in response to your request, we're providing a paid links reporting form within Webmaster Tools. To use the form, simply log in and provide information on the sites buying and selling links for purposes of search engine manipulation. We'll review each report we get and use this feedback to improve our algorithms and our search results. In some cases we may also take individual action on sites.

If you are selling links for advertising purposes, there are many ways you can designate this, including:
  • Adding a rel="nofollow" attribute to the link's <a> tag
  • Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file
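For the first option, the rel="nofollow" attribute is added to the anchor tag of the paid link, along these lines (the URL is illustrative):

<a href="http://advertiser.example.com/" rel="nofollow">Visit our sponsor</a>

Either approach keeps the link useful to visitors while ensuring it doesn't pass PageRank.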
We value your input and look forward to continuing to improve our great partnership with you.

For webmasters: Google+ and the +1 button 101

Webmaster Level: Beginner to Intermediate

Here’s a video that covers the basics of Google+, the +1 button, getting started on Google+, and how social information can make products, like Search, more relevant. This video is for a range of webmasters (from personal bloggers to SEOs of corporations). So, if you’re interested in learning about Google+, we hope that with 20 minutes and this video on YouTube (we have our own Webmaster Support Channel!), you can feel more up to speed with Google’s social opportunities.


Video about the basics of Google+ and how to get started if you're an interested webmaster.


Speaking of Google+, if you join, please say hello! We're often posting and hosting Hangouts.


Thursday, 11 June 2015

Google Videos best practices

Webmaster Level: All

We'd like to highlight three best practices that address some of the most common problems found when crawling and indexing video content: ensuring your video URLs are crawlable, stating what countries your videos may be played in, and clearly indicating when videos have been removed.

  • Best Practice 1: Verify your video URLs are crawlable: check your robots.txt
    • Sometimes publishers unknowingly include video URLs in their Sitemap that are disallowed by robots.txt. Please make sure your robots.txt file isn't blocking any of the URLs specified in your Sitemap. This includes URLs for the:
      • Playpage
      • Content and player
      • Thumbnail
      More information about robots.txt.

  • Best Practice 2: Tell us what countries the video may be played in
    • Is your video only available in some locales? The optional “restriction” attribute has recently been added (documentation at http://www.google.com/support/webmasters/bin/answer.py?answer=80472), which you can use to tell us whether the video can only be played in certain territories. Using this tag, you have the option of either including a list of all countries where it can be played, or just telling us the countries where it can't be played. If your videos can be played everywhere, then you don't need to include this. Both this attribute and the expiration date from Best Practice 3 are sketched in the example after this list.

  • Best Practice 3: Indicate clearly when videos are removed -- protect the user experience
    • Sometimes publishers take videos down but don't signal to search engines that they've done so. This can result in the search engine's index not accurately reflecting the content of the web. When users then click on a search result, they're taken to a page that either indicates the video doesn't exist, or to a different video. Users find this experience dissatisfying. Although we have mechanisms to detect when search results are no longer available, we strongly encourage following community standards.

      To signal that a video has been removed,
      1. Return a 404 (Not Found) HTTP response code; you can still return a helpful page to be displayed to your users. Check out these guidelines for creating useful 404 pages.
      2. Indicate expiration dates for each video listed in a Video Sitemap (use the <video:expiration_date> element) or mRSS feed (<dcterms:valid> tag) submitted to Google.
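As a sketch of how the “restriction” attribute from Best Practice 2 and the expiration date from Best Practice 3 fit into a Video Sitemap entry (URLs, country codes, and dates are illustrative):

<url>
  <loc>http://example.com/videos/video_page.html</loc>
  <video:video>
    <video:content_loc>http://example.com/video123.flv</video:content_loc>
    <video:title>Example video</video:title>
    <video:restriction relationship="allow">IE GB US CA</video:restriction>
    <video:expiration_date>2015-12-31T23:59:59+00:00</video:expiration_date>
  </video:video>
</url>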
For more information on Google Videos please visit our Help Center, and to post questions and search answers check out our Help Forum.


Changes in rankings of smartphone search results

Webmaster level: Intermediate

Smartphone users are a significant and fast growing segment of Internet users, and at Google we want them to experience the full richness of the web. As part of our efforts to improve the mobile web, we published our recommendations and the most common configuration mistakes.

Avoiding these mistakes helps your smartphone users engage with your site fully and helps searchers find what they're looking for faster. To improve the search experience for smartphone users and address their pain points, we plan to roll out several ranking changes in the near future that address sites that are misconfigured for smartphone users.

Let's now look at two of the most common mistakes and how to fix them.

Faulty redirects

Some websites use separate URLs to serve desktop and smartphone users. A faulty redirect is when a desktop page redirects smartphone users to an irrelevant page on the smartphone-optimized website. A typical example is when all pages on the desktop site redirect smartphone users to the homepage of the smartphone-optimized site. For example, in the figure below, the redirects shown as red arrows are considered faulty:

This kind of redirect disrupts a user's workflow and may lead them to stop using the site and go elsewhere. Even if the user doesn't abandon the site, irrelevant redirects add more work for them to handle, which is particularly troublesome when they're on slow mobile networks. These faulty redirects frustrate users whether they're looking for a webpage, video, or something else, and our ranking changes will affect many types of searches.

Avoiding irrelevant redirects is very easy: Simply redirect smartphone users from a desktop page to its equivalent smartphone-optimized page. If the content doesn't exist in a smartphone-friendly format, showing the desktop content is better than redirecting to an irrelevant page.
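For instance, on an Apache server you might map each desktop URL to its smartphone equivalent rather than to the homepage. A rough sketch (the user-agent pattern and paths are simplified, not production-ready):

RewriteEngine On
# Send smartphone user agents to the equivalent page on the mobile site
RewriteCond %{HTTP_USER_AGENT} "iPhone|Android.*Mobile" [NC]
RewriteRule ^products/([0-9]+)$ http://m.example.com/products/$1 [R=302,L]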

We have more tips about redirects, and be sure to read our recommendations for having separate URLs for desktop and smartphone users.

Smartphone-only errors

Some sites serve content to desktop users accessing a URL but show an error page to smartphone users instead. There are many scenarios where smartphone-only errors are seen; here are some common ones, along with how to handle them:

  • If you recognize a user is visiting a desktop page from a mobile device and you have an equivalent smartphone-friendly page at a different URL, redirect them to that URL instead of serving a 404 or a soft 404 page.

  • Make sure that the smartphone-friendly page itself is not an error page. If your content is not available in a smartphone-friendly format, serve the desktop page instead. Showing the content the user was looking for is a much better experience than showing an error page.

  • Incorrectly handling Googlebot-Mobile. A typical mistake is when Googlebot-Mobile for smartphones is incorrectly redirected to the website optimized for feature phones which, in turn, redirects Googlebot-Mobile for smartphones back to the desktop site. This results in an infinite redirect loop, which we recognize as an error.

    Avoiding this mistake is easy: All Googlebot-Mobile user-agents identify themselves as specific mobile devices, and you should treat these Googlebot user-agents exactly like you would treat these devices. For example, Googlebot-Mobile for smartphones currently identifies itself as an iPhone and you should serve it the same response an iPhone user would get.

  • Unplayable videos on smartphone devices. Many websites embed videos in a way that works well on desktops but is unplayable on smartphone devices. For example, if content requires Adobe Flash, it won't be playable on an iPhone or on Android versions 4.1 and higher.

Although we covered only two types of mistakes here, it's important for webmasters to focus on avoiding all of the common smartphone website misconfigurations. Try to test your site on as many different mobile devices and operating systems, or their emulators, as possible, including testing the videos included on your site. Doing so will improve the mobile web, make your users happy, and allow searchers to experience your content fully.

As always, please ask in our forums if you have any questions.

Monday, 8 June 2015

Our new search index: Caffeine

(Cross-posted on the Official Google Blog)

Today, we're announcing the completion of a new web indexing system called Caffeine. Caffeine provides 50 percent fresher results for web searches than our last index, and it's the largest collection of web content we've offered. Whether it's a news story, a blog or a forum post, you can now find links to relevant content much sooner after it is published than was ever possible before.

Some background for those of you who don't build search engines for a living like us: when you search Google, you're not searching the live web. Instead you're searching Google's index of the web which, like the list in the back of a book, helps you pinpoint exactly the information you need. (Here's a good explanation of how it all works.)

So why did we build a new search indexing system? Content on the web is blossoming. It's growing not just in size and numbers; with the advent of video, images, news and real-time updates, the average webpage is also richer and more complex. In addition, people's expectations for search are higher than they used to be. Searchers want to find the latest relevant content and publishers expect to be found the instant they publish.

To keep up with the evolution of the web and to meet rising user expectations, we've built Caffeine. The image below illustrates how our old indexing system worked compared to Caffeine:
Our old index had several layers, some of which were refreshed at a faster rate than others; the main layer would update every couple of weeks. To refresh a layer of the old index, we would analyze the entire web, which meant there was a significant delay between when we found a page and made it available to you.

With Caffeine, we analyze the web in small portions and update our search index on a continuous basis, globally. As we find new pages, or new information on existing pages, we can add these straight to the index. That means you can find fresher information than ever before — no matter when or where it was published.

Caffeine lets us index web pages on an enormous scale. In fact, every second Caffeine processes hundreds of thousands of pages in parallel. If this were a pile of paper it would grow three miles taller every second. Caffeine takes up nearly 100 million gigabytes of storage in one database and adds new information at a rate of hundreds of thousands of gigabytes per day. You would need 625,000 of the largest iPods to store that much information; if these were stacked end-to-end they would go for more than 40 miles.

We've built Caffeine with the future in mind. Not only is it fresher, it's a robust foundation that makes it possible for us to build an even faster and more comprehensive search engine that scales with the growth of information online, and delivers even more relevant search results to you. So stay tuned, and look for more improvements in the months to come.

Sunday, 7 June 2015

More details about our webmaster guidelines

At SMX Advanced on Monday, Matt Cutts talked about our webmaster guidelines. Later, during Q&A, someone asked about adding more detail to the guidelines: more explanation about violations and more actionable help on how to improve sites. You ask -- we deliver! On Tuesday, Matt told the SMX crowd that we'd updated the guidelines overnight to include exactly those things! We work fast around here. (OK, maybe we had been working on some of it already.)

So, what's new? Well, the guidelines themselves haven't changed. But the specific quality guidelines now link to expanded information to help you better understand how to spot and fix any issues. That section is below so you can click through to explore these new details.

Quality guidelines - specific guidelines

As Ríona MacNamara recently posted in our discussion forum, we are working to expand our webmaster help content even further and want your input. If you have suggestions, please post them in either the thread or as a comment to this post. We would love to hear from you!

Crawl Errors now reports soft 404s

Webmaster Level: All

Today we’re releasing a feature to help you discover if your site serves undesirable “soft” or “crypto” 404s. A “soft 404” occurs when a webserver responds with a 200 OK HTTP response code for a page that doesn't exist rather than the appropriate 404 Not Found. Soft 404s can limit a site's crawl coverage by search engines because these duplicate URLs may be crawled instead of pages with unique content.

The web is infinite, but the time search engines spend crawling your site is limited. Properly reporting non-existent pages with a 404 or 410 response code can improve the crawl coverage of your site’s best content. Additionally, soft 404s can potentially be confusing for your site's visitors as described in our past blog post, Farewell to Soft 404s.    
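As a quick illustration (status lines only), a request for a page that no longer exists should get the second response below, not the first:

HTTP/1.1 200 OK            (soft 404: "not found" content served with a success code)
HTTP/1.1 404 Not Found     (correct: crawlers learn the page is truly gone)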

You can find the new soft 404s reporting feature under the Crawl errors section in Webmaster Tools.



Here’s a list of steps to correct soft 404s to help both Google and your users:
  1. Check whether you have soft 404s listed in Webmaster Tools
  2. For the soft 404s, determine whether the URL:
    1. Contains the correct content and properly returns a 200 response (not actually a soft 404)
    2. Should 301 redirect to a more accurate URL
    3. Doesn’t exist and should return a 404 or 410 response
  3. Confirm that you’ve configured the proper HTTP Response by using Fetch as Googlebot in Webmaster Tools
  4. If you now return 404s, you may want to customize your 404 page to aid your users. Our custom 404 widget can help.

We hope that you’re now better enabled to find and correct soft 404s on your site. If you have feedback or questions about the new "soft 404s" reporting feature or any other Webmaster Tools feature, please share your thoughts with us in the Webmaster Help Forum.

Saturday, 6 June 2015

Recommendations for building smartphone-optimized websites

Webmaster level: All
Every day more and more smartphones get activated and more websites are producing smartphone-optimized content. Since we last talked about how to build mobile-friendly websites, we’ve been working hard on improving Google’s support for smartphone-optimized content. As part of this effort, we launched Googlebot-Mobile for smartphones back in December 2011, which is specifically tasked with identifying such content.
Today we’d like to give you Google’s recommendations for building smartphone-optimized websites and explain how to do so in a way that gives both your desktop- and smartphone-optimized sites the best chance of performing well in Google’s search results.

Recommendations for smartphone-optimized sites

The full details of our recommendation can be found on our new help site, which we summarize here.
When building a website that targets smartphones, Google supports three different configurations:
  1. Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device. This is Google’s recommended configuration.
  2. Sites that dynamically serve all devices on the same set of URLs, but each URL serves different HTML (and CSS) depending on whether the user agent is a desktop or a mobile device.
  3. Sites that have separate mobile and desktop sites.

Responsive web design

Responsive web design is a technique for building web pages that alter how they look using CSS3 media queries. That is, there is one set of HTML for the page regardless of the device accessing it, and its presentation changes through CSS media queries that specify which CSS rules apply in the browser displaying the page. You can learn more about responsive web design from this blog post by Google's webmasters and in our recommendations.
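As a minimal sketch of the technique (the class name and breakpoint are illustrative):

/* Default, desktop-oriented layout */
.sidebar { float: right; width: 300px; }

/* On narrow screens, stack the sidebar under the main content */
@media only screen and (max-width: 640px) {
  .sidebar { float: none; width: auto; }
}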
Using responsive web design has multiple advantages, including:
  • It keeps your desktop and mobile content on a single URL, which is easier for your users to interact with, share, and link to, and easier for Google’s algorithms to assign indexing properties to your content.
  • Google can discover your content more efficiently as we wouldn't need to crawl a page with the different Googlebot user agents to retrieve and index all the content.

Device-specific HTML

However, we appreciate that for many situations it may not be possible or appropriate to use responsive web design. That’s why we support having websites serve equivalent content using different, device-specific, HTML. The device-specific HTML can be served on the same URL (a configuration called dynamic serving) or different URLs (such as www.example.com and m.example.com).
If your website uses a dynamic serving configuration, we strongly recommend using the Vary HTTP header to communicate to caching servers and our algorithms that the content may change for different user agents requesting the page. We also use this as a crawling signal for Googlebot-Mobile. More details are here.
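The relevant response headers look like this (a sketch of the relevant lines only):

HTTP/1.1 200 OK
Content-Type: text/html
Vary: User-Agent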
As for the separate mobile site configuration, since there are many ways to do this, our recommendation introduces annotations that communicate to our algorithms that your desktop and mobile pages are equivalent in purpose; that is, the new annotations describe the relationship between the desktop and mobile content as alternatives of each other and should be treated as a single entity with each alternative targeting a specific class of device.
These annotations will help us discover your smartphone-optimized content and help our algorithms understand the structure of your content, giving it the best chance of performing well in our search results.
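Concretely, these annotations are a pair of link elements, sketched here with a made-up URL. On the desktop page (www.example.com/page-1):

<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page-1">

And on the corresponding smartphone page:

<link rel="canonical" href="http://www.example.com/page-1">

Together these tell our algorithms that the two URLs serve the same content for different device classes.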

Conclusion

This blog post is only a brief summary of our recommendation for building smartphone-optimized websites. Please read the full recommendation and see which supported implementation is most suitable for your site and users. And, as always, please ask on our Webmaster Help forums if you have more questions.

Thursday, 4 June 2015

Easier domain verification

Webmaster level: All

Update 2014 Jan: We've expanded our support for this easier domain verification to many more domain name providers:
  • Afrihost
  • Domainmonster.com
  • Gandi.net
  • GMO
  • Heart Internet
  • Hover
  • LCN.com
  • Name.com
  • TransIP.eu
  • Wix


Today we’re announcing a new initiative that makes it easier for users to verify domains for Google services like Webmaster Tools and Google Apps.

First, some background on this initiative. To use certain Google services with your website or domain, you currently have to verify that you own the site or domain, since these services can share sensitive data (like search queries) or operate Internet-facing services (like hosted email) on your behalf.

One of our supported verification methods is domain verification. Currently this method requires a user to manually create a DNS TXT record to prove their ownership. For many users, this can be challenging and difficult to do.
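For reference, such a record looks roughly like this in a DNS zone file (the verification token shown is a placeholder; Google issues the real one):

example.com.   IN   TXT   "google-site-verification=your-token-from-google"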

So now, in collaboration with Go Daddy and eNom, we’re introducing a simple, automated solution for domain verification that guides you through the process in a few easy steps.

If your domain name records are managed by eNom or Go Daddy, in the Google site verification interface you will see a new, easier verification method as shown below.

   

Selecting this method launches a pop-up window that asks you to log in to the provider using your existing account with them.

  

The first time you log in, you’ll be asked to authorize the provider to access the Google site verification service on your behalf.

 

Next you’ll be asked to confirm that you wish to verify the domain.

   

And that’s it! After a few seconds, your domain should be automatically verified and a confirmation message displayed.

 

Now eNom and Go Daddy customers can more quickly and easily verify their domains to use with Google services like Webmaster Tools and Google Apps.

We’re also happy to share that Bluehost customers will be able to enjoy the same capability in the near future. And we look forward to working with more partners to bring easier domain verification to even more users. (Interested parties can contact us via this form.)

If you have any questions or feedback, as always please let us know via our webmaster help forum.