Archive for the ‘webmaster tools’ Category

Tracking mobile usability in Webmaster Tools

Webmaster Level: intermediate

Mobile is growing at a fantastic pace – in usage, not just in screen size. To keep you informed of issues mobile users might be seeing across your website, we’ve added the Mobile Usability feature to Webmaster Tools.

The new feature shows mobile usability issues we’ve identified across your website, complete with graphs over time so that you see the progress that you’ve made.

A mobile-friendly site is one that you can easily read and use on a smartphone, by only having to scroll up or down. Having to swipe left or right to find content, zoom in to read text or tap UI elements, or being unable to see the content at all makes a site harder to use on a mobile phone. To help, the Mobile Usability reports show the following issues: Flash content, missing viewport (a critical meta tag for mobile pages), tiny fonts, fixed-width viewports, content not sized to the viewport, and clickable links/buttons too close to each other.
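
As a rough illustration of what two of these checks involve, here is a minimal, hypothetical Python sketch that fetches a page and looks for a missing viewport meta tag and Flash content. It is an illustrative approximation only, not the logic Webmaster Tools itself uses, and the URL is a made-up example.

```python
# Hypothetical sketch: flag two of the issues above (missing viewport meta
# tag, Flash content) for one page. Not the actual Webmaster Tools checks.
from html.parser import HTMLParser
from urllib.request import urlopen

class MobileChecks(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_viewport = False
        self.has_flash = False

    def handle_starttag(self, tag, attrs):
        attrs = {k: (v or "") for k, v in attrs}
        if tag == "meta" and attrs.get("name", "").lower() == "viewport":
            self.has_viewport = True
        if tag in ("object", "embed") and "flash" in attrs.get("type", "").lower():
            self.has_flash = True

html = urlopen("http://example.com/").read().decode("utf-8", errors="replace")
checker = MobileChecks()
checker.feed(html)
if not checker.has_viewport:
    print("Missing viewport meta tag")
if checker.has_flash:
    print("Flash content found")
```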

We strongly recommend you take a look at these issues in Webmaster Tools and think about how they might be resolved; sometimes it’s just a matter of tweaking your site’s template! More information on how to make a great mobile-friendly website can be found on our Web Fundamentals site (with more information to come soon).

If you have any questions, feel free to join us in our webmaster help forums (on your phone too)!

Posted by John Mueller, Webmaster Trends Analyst, Zurich

An update to the Webmaster Tools API

Webmaster level: advanced

Over the summer the Webmaster Tools team has been cooking up an update to the Webmaster Tools API. The new API is consistent with other Google APIs, makes it easier to authenticate for apps or web-services, and provides access to some of the main features of Webmaster Tools.

If you’ve used other Google APIs, getting started with the new Webmaster Tools API will be easy! We have examples for Python, Java, as well as OACurl (for fans of command lines).

This API allows you to:

  • list, add, or remove sites from your account (you can currently have up to 500 sites in your account)
  • list, add, or remove sitemaps for your websites
  • get warning, error, and indexed counts for individual sitemaps
  • get a time-series of all kinds of crawl errors for your site
  • list crawl error samples for specific types of errors
  • mark individual crawl errors as “fixed” (this doesn’t change how they’re processed, but can help simplify the UI for you)
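
If you’re curious what the Python flavor looks like, here’s a minimal sketch using the google-api-python-client library. The `creds` parameter stands in for OAuth credentials you have already obtained, and the site URL is a hypothetical example.

```python
# Minimal sketch with google-api-python-client. "creds" stands in for OAuth
# credentials obtained beforehand; the site URL is hypothetical.
from googleapiclient.discovery import build

def show_account_overview(creds, site_url="http://www.example.com/"):
    """List sites, sitemaps, and crawl-error counts for one account."""
    service = build("webmasters", "v3", credentials=creds)

    # List the sites in your account (up to 500).
    for site in service.sites().list().execute().get("siteEntry", []):
        print(site["siteUrl"], site["permissionLevel"])

    # List sitemaps for one site, including warning and error counts.
    sitemaps = service.sitemaps().list(siteUrl=site_url).execute()
    for sm in sitemaps.get("sitemap", []):
        print(sm["path"], "warnings:", sm.get("warnings"), "errors:", sm.get("errors"))

    # Fetch the time series of crawl-error counts for the site.
    print(service.urlcrawlerrorscounts().query(siteUrl=site_url).execute())
```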

We’d love to see what you’re building with our APIs! Feel free to link to your projects in the comments below. Should you have any questions about the usage of the API, feel free to post in our help forum as well.

Posted by John Mueller, fan of long command lines, Google Zürich

Introducing the Google News Publisher Center

(Cross-posted on the Google News Blog)

Webmaster level: All

UPDATE: Great News — The Publisher Center is now available for publishers in France, Italy, Germany and Spain as well as in all 21 countries where a Google News edition is available in English.

If you’re a news publisher, your website has probably evolved and changed over time — just like your stories. But in the past, when you made changes to the structure of your site, we might not have discovered your new content. That meant a lost opportunity for your readers, and for you. Unless you regularly checked Webmaster Tools, you might not even have realized that your new content wasn’t showing up in Google News. To prevent this from happening, we are letting you make changes to our record of your news site using the just-launched Google News Publisher Center.

With the Publisher Center, your potential readers can be more informed about the articles they’re clicking on and you benefit from better discovery and classification of your news content. After verifying ownership of your site using Google Webmaster Tools, you can use the Publisher Center to directly make the following changes:

  • Update your news site details, including changing your site name and labeling your publication with any relevant source labels (e.g., “Blog”, “Satire” or “Opinion”)
  • Update your section URLs when you change your site structure (e.g., when you add a new section such as http://example.com/2014commonwealthgames or http://example.com/elections2014)
  • Label your sections with a specific topic (e.g., “Technology” or “Politics”)

Whenever you make changes to your site, we’d recommend also checking our record of it in the Publisher Center and updating it if necessary.

Try it out, or learn more about how to get started.

At the moment the tool is available only to publishers in the U.S., but we plan to introduce it in other countries soon and add more features. In the meantime, we’d love to hear from you about what works well and what doesn’t. Ultimately, our goal is to make this a platform where news publishers and Google News can work together to provide readers with the best, most diverse news on the web.

Posted by Eric Weigle, Software Engineer

Testing robots.txt files made easier

Webmaster level: intermediate-advanced

To crawl, or not to crawl, that is the robots.txt question.

Making and maintaining correct robots.txt files can sometimes be difficult. While most sites have it easy (tip: they often don’t even need a robots.txt file!), finding the directives within a large robots.txt file that are or were blocking individual URLs can be quite tricky. To make that easier, we’re now announcing an updated robots.txt testing tool in Webmaster Tools.

You can find the updated testing tool in Webmaster Tools within the Crawl section:

Here you’ll see the current robots.txt file, and can test new URLs to see whether they’re disallowed for crawling. To guide your way through complicated directives, the tool highlights the specific one that led to the final decision. You can make changes in the file and test those too; you’ll just need to upload the new version of the file to your server afterwards for the changes to take effect. Our developers site has more information about robots.txt directives and how the files are processed.
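
For quick local spot-checks outside of Webmaster Tools, Python’s standard library also ships a robots.txt parser; here’s a minimal sketch (hostname and paths are hypothetical). Keep in mind its matching rules are simpler than Googlebot’s, so the tester above remains the authoritative answer.

```python
# Quick local spot-check with the standard library robots.txt parser. Its
# matching rules are simpler than Googlebot's, so use the Webmaster Tools
# tester for the authoritative verdict. Hostname and paths are hypothetical.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

for url in ["http://www.example.com/", "http://www.example.com/private/page"]:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "disallowed"
    print(url, "->", verdict)
```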

Additionally, you’ll be able to review older versions of your robots.txt file, and see when access issues block us from crawling. For example, if Googlebot sees a 500 server error for the robots.txt file, we’ll generally pause further crawling of the website.

Since there may be some errors or warnings shown for your existing sites, we recommend double-checking their robots.txt files. You can also combine it with other parts of Webmaster Tools: for example, you might use the updated Fetch as Google tool to render important pages on your website. If any blocked URLs are reported, you can use this robots.txt tester to find the directive that’s blocking them, and, of course, then improve that. A common problem we’ve seen comes from old robots.txt files that block CSS, JavaScript, or mobile content — fixing that is often trivial once you’ve seen it.

We hope this updated tool makes it easier for you to test and maintain your robots.txt file. Should you have any questions, or need help with crafting a good set of directives, feel free to drop by our webmaster help forum!

Posted by Asaph Arnon, Webmaster Tools team

Android app indexing is now open for everyone!

Webmaster level: All

Do you have an Android app in addition to your website? You can now connect the two so that users searching from their smartphones and tablets can easily find and reach your app content.

App deep links in search results help your users find your content more easily and re-engage with your app after they’ve installed it. As a site owner, you can show your users the right content at the right time: by connecting pages of your website to the relevant parts of your app, you control when your users are directed to your app and when they go to your website.

Hundreds of apps have already implemented app indexing. This week at Google I/O, we’re announcing a set of new features that will make it even easier to set up deep links in your app, connect your site to your app, and keep track of performance and potential errors.

Getting started is easy

We’ve greatly simplified the process to get your app deep links indexed. If your app supports HTTP deep linking schemes, here’s what you need to do:

  1. Add deep link support to your app
  2. Connect your site and your app
  3. There is no step 3 :)

As we index your URLs, we’ll discover and index the app/site connections and may begin to surface app deep links in search results.

We can discover and index your app deep links on our own, but we recommend that you publish them; this also applies if your app only supports a custom deep link scheme. You can publish them on the corresponding web pages themselves or in your sitemap, as sketched below.
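
For illustration, here is a hypothetical sketch of the rel="alternate" annotation format for app deep links (the package name and URLs are made-up examples); the same android-app:// URI can be placed on the web page itself or in a sitemap entry for that page.

```python
# Hypothetical sketch of the rel="alternate" annotation for app deep links.
# Package name and URLs are made-up examples; the documented URI format is
# android-app://{package}/{scheme}/{host_path}.
PACKAGE = "com.example.android"

def app_link_tag(scheme: str, host_and_path: str) -> str:
    """Build the link element announcing the app equivalent of a web page."""
    uri = f"android-app://{PACKAGE}/{scheme}/{host_and_path}"
    return f'<link rel="alternate" href="{uri}" />'

print(app_link_tag("http", "example.com/gizmos"))
# <link rel="alternate" href="android-app://com.example.android/http/example.com/gizmos" />
```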

There’s one more thing: we’ve added a new feature in Webmaster Tools to help you debug any issues that might arise during app indexing. It will show you what type of errors we’ve detected for each app page and web page pair, together with example app URIs so you can debug:

We’ll also give you detailed instructions on how to debug each issue, including a QR code for the app deep links, so you can easily open them on your phone or tablet. We’ll send you Webmaster Tools error notifications as well, so you can keep up to date.

Give app indexing a spin, and as always, if you need more help ask questions on the Webmaster help forum.

Posted by Mariya Moeva, Webmaster Trends Analyst

Directing smartphone users to the page they actually wanted

Webmaster level: all

Have you ever used Google Search on your smartphone and clicked on a promising-looking result, only to end up on the mobile site’s homepage, with no idea why the page you were hoping to see vanished? This is such a common annoyance that we’ve even seen comics about it. Usually this happens because the website is not properly set up to handle requests from smartphones and sends you to its smartphone homepage—we call this a “faulty redirect”.

We’d like to spare users the frustration of landing on irrelevant pages and help webmasters fix the faulty redirects. Starting today in our English search results in the US, whenever we detect that smartphone users are redirected to a homepage instead of the page they asked for, we may note it below the result. If you still wish to proceed to the page, you can click “Try anyway”:

And we’re providing advice and resources to help you direct your audience to the pages they want. Here’s a quick rundown:

1. Do a few searches on your own phone (or with a browser set up to act like a smartphone) and see how your site behaves. Simple but effective. 🙂

2. Check out Webmaster Tools—we’ll send you a message if we detect that any of your site’s pages are redirecting smartphone users to the homepage. We’ll also show you any faulty redirects we detect in the Smartphone Crawl Errors section of Webmaster Tools:

3. Investigate any faulty redirects and fix them. Here’s what you can do:

  • Use the example URLs we provide in Webmaster Tools as a starting point to debug exactly where the problem is with your server configuration.
  • Set up your server so that it redirects smartphone users to the equivalent URL on your smartphone site (a minimal sketch of this logic follows this list).
  • If a page on your site doesn’t have a smartphone equivalent, keep users on the desktop page, rather than redirecting them to the smartphone site’s homepage. Doing nothing is better than doing something wrong in this case.
  • Try using responsive web design, which serves the same content for desktop and smartphone users.
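
To make the second and third points concrete, here is a minimal, hypothetical sketch of that redirect logic (the mobile hostname and the URL map are made up): redirect smartphone users to the equivalent mobile URL when one exists, and otherwise leave them on the desktop page.

```python
import re

# Hypothetical separate mobile site and desktop-to-mobile URL map.
MOBILE_HOST = "m.example.com"
MOBILE_EQUIVALENTS = {
    "/products/gizmo": "/products/gizmo",
    "/about": "/about",
}

def smartphone_redirect(path: str, user_agent: str):
    """Return the mobile URL to redirect to, or None to serve the page as-is."""
    # Real user-agent detection is more nuanced; this regex is illustrative.
    is_smartphone = bool(re.search(r"Mobile|Android|iPhone", user_agent))
    if is_smartphone and path in MOBILE_EQUIVALENTS:
        return "http://%s%s" % (MOBILE_HOST, MOBILE_EQUIVALENTS[path])
    return None  # no mobile equivalent: keep the user on the desktop page

print(smartphone_redirect("/about", "Mozilla/5.0 (iPhone ...) Mobile Safari"))
print(smartphone_redirect("/legal", "Mozilla/5.0 (iPhone ...) Mobile Safari"))
```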

If you’d like to know more about building smartphone-friendly sites, read our full recommendations. And, as always, if you need more help you can ask a question in our webmaster forum.

Posted by , Webmaster Trends Analyst

Rendering pages with Fetch as Google

Webmaster level: all

The Fetch as Google feature in Webmaster Tools provides webmasters with the results of Googlebot attempting to fetch their pages. The server headers and HTML shown are useful to diagnose technical problems and hacking side-effects, but sometimes make double-checking the response hard: Help! What do all of these codes mean? Is this really the same page as I see it in my browser? Where shall we have lunch? We can’t help with that last one, but for the rest, we’ve recently expanded this tool to also show how Googlebot would be able to render the page.

Viewing the rendered page

In order to render the page, Googlebot will try to find all the external files involved, and fetch them as well. Those files frequently include images, CSS and JavaScript files, as well as other files that might be indirectly embedded through the CSS or JavaScript. These are then used to render a preview image that shows Googlebot’s view of the page.
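
As a rough illustration of that first step, here’s a hypothetical Python sketch that collects the resources a page references directly (the URL is made up). It ignores files embedded indirectly through CSS or JavaScript, which Googlebot does account for.

```python
# Rough sketch: collect a page's directly referenced resources (images, CSS,
# JavaScript). Files embedded indirectly via CSS or JavaScript, which
# Googlebot also fetches, are not handled here. The URL is hypothetical.
from html.parser import HTMLParser
from urllib.request import urlopen

class ResourceCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = {k: (v or "") for k, v in attrs}
        if tag in ("img", "script") and attrs.get("src"):
            self.resources.append(attrs["src"])
        elif tag == "link" and "stylesheet" in attrs.get("rel", "") and attrs.get("href"):
            self.resources.append(attrs["href"])

html = urlopen("http://example.com/").read().decode("utf-8", errors="replace")
collector = ResourceCollector()
collector.feed(html)
print("\n".join(collector.resources))
```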

You can find the Fetch as Google feature in the Crawl section of Google Webmaster Tools. After submitting a URL with “Fetch and render,” wait for it to be processed (this might take a moment for some pages). Once it’s ready, just click on the response row to see the results.

Fetch as Google

Handling resources blocked by robots.txt

Googlebot follows the robots.txt directives for all files that it fetches. If you are disallowing crawling of some of these files (or if they are embedded from a third-party server that’s disallowing Googlebot’s crawling of them), we won’t be able to show them to you in the rendered view. Similarly, if the server fails to respond or returns errors, then we won’t be able to use those either (you can find similar issues in the Crawl Errors section of Webmaster Tools). If we run across either of these issues, we’ll show them below the preview image.

We recommend making sure Googlebot can access any embedded resource that meaningfully contributes to your site’s visible content, or to its layout. That will make Fetch as Google easier for you to use, and will make it possible for Googlebot to find and index that content as well. Some types of content – such as social media buttons, fonts or website-analytics scripts – tend not to meaningfully contribute to the visible content or layout, and can be left disallowed from crawling. For more information, please see our previous blog post on how Google is working to understand the web better.

We hope this update makes it easier for you to diagnose these kinds of issues, and to discover content that’s accidentally blocked from crawling. If you have any comments or questions, let us know here or drop by the webmaster help forum.

Posted by Shimi Salant, Webmaster Tools team

More Precise Index Status Data for Your Site Variations

Webmaster Level: Intermediate

The Google Webmaster Tools Index Status feature reports how many pages on your site are indexed by Google. In the past, we didn’t show index status data for HTTPS websites independently; everything was included in the HTTP site’s report. Over the last few months, we’ve heard from you that you’d like to use Webmaster Tools to track your indexed URLs for sections of your website, including the parts that use HTTPS.

We’ve seen that nearly 10% of all URLs already use a secure connection to transfer data via HTTPS, and we hope to see more webmasters move their websites from HTTP to HTTPS in the future. We’re happy to announce a refinement in the way your site’s index status data is displayed in Webmaster Tools: the Index Status feature now tracks your site’s indexed URLs for each protocol (HTTP and HTTPS) as well as for verified subdirectories.

This makes it easy for you to monitor different sections of your site. For example, the HTTP and HTTPS variants of a site, or a verified subdirectory such as https://example.com/folder/, each show their own data in the Webmaster Tools Index Status report, provided they are verified separately.

The refined data will be visible for webmasters whose site’s URLs are on HTTPS or who have subdirectories verified, such as https://example.com/folder/. Data for subdirectories will be included in the higher-level verified sites on the same hostname and protocol.

If you have a website on HTTPS or if some of your content is indexed under different subdomains, you will see a change in the corresponding Index Status reports. The screenshots below illustrate the changes that you may see on your HTTP and HTTPS sites’ Index Status graphs:

HTTP site’s Index Status showing drop

HTTPS site’s Index Status showing increase

An “Update” annotation has been added to the Index Status graph for March 9th, showing when we started collecting this data. This change does not affect the way we index your URLs, nor does it have an impact on the overall number of URLs indexed on your domain; it only affects the reporting of data in the Webmaster Tools user interface.

In order to see your data correctly, you will need to verify all existing variants of your site (www, non-www, HTTPS, subdirectories, subdomains) in Google Webmaster Tools. We recommend that your preferred domains and canonical URLs are configured accordingly.

Note that if you wish to submit a Sitemap, you will need to do so for the preferred variant of your website, using the corresponding URLs. Robots.txt files are also read separately for each protocol and hostname.
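
Since robots.txt files are read separately per protocol and hostname, one quick sanity check is to fetch the file for every variant you have verified. Here is a minimal sketch (the hostnames are hypothetical).

```python
# Fetch robots.txt for each protocol/hostname variant (hostnames are
# hypothetical) to confirm every variant serves the file you expect.
from urllib.error import HTTPError, URLError
from urllib.request import urlopen

VARIANTS = [
    "http://example.com",
    "http://www.example.com",
    "https://example.com",
    "https://www.example.com",
]

for base in VARIANTS:
    try:
        with urlopen(base + "/robots.txt", timeout=10) as resp:
            print(base + "/robots.txt", "-> HTTP", resp.status)
    except (HTTPError, URLError) as err:
        print(base + "/robots.txt", "->", err)
```
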
We hope that you’ll find this update useful, and that it’ll help you monitor, identify and fix indexing problems with your website. You can find additional details in our Index Status Help Center article. As usual, if you have any questions, don’t hesitate to ask in our webmaster Help Forum.

Posted by Zineb Ait Bahajji, WTA, thanks to the Webmaster Tools team.

Changes in crawl error reporting for redirects

Webmaster level: intermediate-advanced

In the past, we have seen occasional confusion among webmasters regarding how crawl errors on redirecting pages were shown in Webmaster Tools. It’s time to make this a bit clearer and easier to diagnose! While it used…

Google Publisher Plugin beta: Bringing our publisher products to WordPress

Cross-posted from the Inside AdSense blog.

We’ve heard from many publishers using WordPress that they’re looking for an easier way to work with Google products within the platform. Today, we’re excited to share the beta release of our official Google Publisher Plugin, which adds new functionality to publishers’ WordPress websites. If you own your own domain and power it with WordPress, this new plugin will give you access to a few Google services — and all within WordPress.

Please keep in mind that because this is a beta release, we’re still fine-tuning the plugin to make sure it works well on the many WordPress sites out there. We’d love for you to try it now and share your feedback on how it works for your site.

This first version of the Google Publisher Plugin currently supports two Google products:

  • Google AdSense: Earn money by placing ads on your website. The plugin links your WordPress site to your AdSense account and makes it easier to place ads on your site — without needing to manually modify any HTML code.
  • Google Webmaster Tools: Webmaster Tools provides you with detailed reports about your pages’ visibility on Google. The plugin allows you to verify your site on Webmaster Tools with just one click.

Visit the WordPress.org plugin directory to download the new plugin and give it a try. For more information about the plugin and how to use it, please visit our Help Center. We look forward to hearing your feedback!

Posted by Michael Smith – Product Manager