Connect to job seekers with Google Search

At Google I/O this year, we announced Google for Jobs, a new company-wide initiative focused on helping both job seekers and employers through collaboration with the job matching industry. One major part of this effort is an improved experience for job seekers on Google Search. We’re happy to announce that this new experience is now open to all developers and site owners.

For queries with clear intent like [head of catering jobs in nyc] or [entry level jobs in DC], we’ll show a job listings preview, and each job can expand to display comprehensive details about the listing.

For employers or site owners with job content, this feature brings many benefits:

- Prominent place in Search results: your postings are eligible to be displayed in the new job search feature on Google, featuring your logo, reviews, ratings, and job details.
- More, motivated applicants: job seekers can filter by various criteria like location or job title, meaning you’re more likely to get applicants who are looking for exactly that job.
- Increased chances of discovery and conversion: job seekers will have a new avenue to interact with your postings and click through to your site.

Get your job listings on Google

Implementation involves two steps:

1. Mark up your job listings with JobPosting structured data (a minimal markup sketch follows this post).
2. Submit a sitemap (or an RSS or Atom feed) with a <lastmod> date for each listing.

If you have more than 100,000 job postings or more than 10,000 changes per day, you can express interest in using the High Change Rate feature.

If you already publish your job openings on another site like LinkedIn, Monster, DirectEmployers, CareerBuilder, Glassdoor, or Facebook, those postings are eligible to appear in the feature as well.

Job search is an enriched search experience. We’ve created a dedicated guide to help you understand how Google ranking works for enriched search and which practices can improve your presence.

Keep track of how you’re doing and fix issues

There’s a suite of tools to help you with the implementation:

- Validate your markup with the Structured Data Testing Tool.
- Preview your listing in the Structured Data Testing Tool.
- Keep track of your sitemap status in Search Console.
- See aggregate stats and markup error examples in Search Console.

In the coming weeks, we’ll add new job listings filters in the Search Analytics report in Search Console, so you can track clicks and impressions for your listings.

As always, if you have questions, ask in the forums or find us on Twitter!

Posted by Nick Zakrasek, Product Manager
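Editor’s note: as a minimal sketch of step 1, the snippet below builds a JobPosting object and emits it as JSON-LD for embedding in a listing page. The job details, organization name, and URL are hypothetical placeholders; see the JobPosting structured data documentation for the full set of required and recommended properties. Step 2 is then a sitemap whose entries carry a <lastmod> date for each listing page.

```python
import json

# A minimal, illustrative JobPosting object (all values are placeholders).
job_posting = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Head of Catering",
    "description": "<p>Lead our catering team in New York City.</p>",
    "datePosted": "2017-06-20",
    "validThrough": "2017-09-20T00:00",
    "employmentType": "FULL_TIME",
    "hiringOrganization": {
        "@type": "Organization",
        "name": "Example Restaurants",
        "sameAs": "https://www.example.com",
    },
    "jobLocation": {
        "@type": "Place",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Example Ave",
            "addressLocality": "New York",
            "addressRegion": "NY",
            "postalCode": "10011",
            "addressCountry": "US",
        },
    },
}

# Emit the <script> block to embed in the job listing page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(job_posting, indent=2))
print("</script>")
```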


Source: google webmaster

Making the Internet safer and faster: Introducing reCAPTCHA Android API

When we launched reCAPTCHA ten years ago, we had a simple goal: enable users to visit the sites they love without worrying about spam and abuse. Over the years, reCAPTCHA has changed quite a bit. It evolved from distorted text to street numbers and names, then No CAPTCHA reCAPTCHA in 2014 and Invisible reCAPTCHA in March this year. By now, more than a billion users have benefited from reCAPTCHA, and we continue to work to refine our protections.

reCAPTCHA protects users wherever they may be online. As the use of mobile devices has grown rapidly, it’s important to keep mobile applications and data safe. Today, on reCAPTCHA’s tenth birthday, we’re glad to announce the first reCAPTCHA Android API as part of Google Play Services.

With this API, reCAPTCHA can better tell humans and bots apart to provide a streamlined user experience on mobile. It will use our newest Invisible reCAPTCHA technology, which runs risk analysis behind the scenes and has enabled millions of human users to pass through with zero clicks every day. Now mobile users can enjoy their apps without being interrupted, while still staying protected from spam and abuse.

The reCAPTCHA Android API is included with Google SafetyNet, which provides services like device attestation and Safe Browsing to protect mobile apps. Mobile developers can do both device and user attestation through the same API to mitigate security risks in their apps more efficiently. This adds to the diversity of security protections on Android: Google Play Protect to monitor for potentially harmful applications, device encryption, and regular security updates.

Please visit our site to learn more about how to integrate with the reCAPTCHA Android API, and keep an eye out for our iOS library. The journey of reCAPTCHA continues: we’ll make the Internet safer and easier to use for everyone (except bots).

Posted by Wei Liu, Product Manager, reCAPTCHA
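Editor’s note: the Android client integration itself lives in Google Play Services, and the post above links to the integration docs. As a hedged, minimal sketch of the companion server-side step, the snippet below verifies a reCAPTCHA response token against the standard siteverify endpoint, assuming your app forwards the token it receives from the client API to your backend. The secret key and token values are placeholders; confirm the exact flow for the Android API in the reCAPTCHA documentation.

```python
import json
import urllib.parse
import urllib.request

SITEVERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def verify_recaptcha_token(secret_key: str, token: str) -> bool:
    """Ask reCAPTCHA whether a response token sent by the client is valid."""
    data = urllib.parse.urlencode({
        "secret": secret_key,   # your site's secret key
        "response": token,      # token produced by the client-side reCAPTCHA API
    }).encode("utf-8")
    with urllib.request.urlopen(SITEVERIFY_URL, data=data) as resp:
        result = json.loads(resp.read().decode("utf-8"))
    return bool(result.get("success"))

if __name__ == "__main__":
    # Placeholder values; use your real secret key and a token sent by your app.
    print(verify_recaptcha_token("YOUR_SECRET_KEY", "token-from-client"))
```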


Source: google webmaster

Better Snippets for your Users

Before buying a book, people like to get a snapshot of how they’re about to spend a few hours reading. They’ll take a look at the synopsis, the preface, or even the prologue just to get a sense of whether they’ll like the book. Search result snippets are much the same; they help people decide whether or not it makes sense to invest the time reading the page the snippet belongs to. The more descriptive and relevant a search result snippet is, the more likely people are to click through and be satisfied with the page they land on.

Historically, snippets came from three places:

- The content of the page
- The meta description
- DMOZ listings

The content of the page is an obvious choice for result snippets, and the content that can be extracted is often the most relevant to people’s queries. However, there are times when the content itself isn’t the best source for a snippet. For instance, when someone searches for a publishing company for their book, the relevant homepages in the result set may contain only a few images describing the businesses, a logo, and maybe some links, none of which are particularly useful for a snippet.

The logical fallback when the content of a page doesn’t have much textual content for a search result snippet is the meta description. These should be short blurbs that describe the content accurately and precisely in a few words.

Finally, when a page doesn’t have much textual content for snippet generation and the meta description is missing, unrelated to the page, or low quality, our fallback was DMOZ, also known as the Open Directory Project. For over ten years, we relied on DMOZ for snippets because its listings were often of much higher quality than the meta descriptions provided by webmasters, or were more descriptive than what the page provided. With DMOZ now closed, we’ve stopped using its listings for snippeting, so it’s a lot more important that webmasters provide good meta descriptions, if adding more content to the page is not an option.

What makes a good meta description?
Good meta descriptions are short blurbs that accurately describe the content of the page. They are like a pitch that convinces the user that the page is exactly what they’re looking for. For more tips, we have a handy help center article on the topic. Remember to make sure that both your desktop and your mobile pages include both a title and a meta description.

What are the most common problems with meta descriptions?
Because meta descriptions are usually visible only to search engines and other software, webmasters sometimes forget about them, leaving them completely empty. It’s also common, for the same reason, for the same meta description to be used across multiple (and sometimes many) pages. On the flip side, it’s also relatively common for the description to be completely off-topic, low quality, or outright spammy. These issues tarnish our users’ search experience, so we prefer to ignore such meta descriptions. (A small audit sketch that flags these cases appears after this post.)

Is there a character limit for meta descriptions?
There’s no limit on how long a meta description can be, but search result snippets are truncated as needed, typically to fit the device width.

What will happen with the “NOODP” robots directive?
With DMOZ (ODP) closed, we stopped relying on its data, and thus the NOODP directive is already a no-op.

Can I prevent Google from using the page contents as a snippet?
You can prevent Google from generating snippets altogether by specifying the “nosnippet” robots directive. There’s no way to prevent the page contents from being used as a snippet while allowing other sources.

As always, if you have questions, ask in the forums or find us on Twitter!

Posted by Gary, Search Team
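Editor’s note: as a practical aside, here is a minimal sketch of how a site owner might audit a handful of URLs for the problems described above (missing, empty, or duplicated meta descriptions). It uses only the Python standard library; the URL list is a placeholder, and a real audit would need error handling, crawl politeness, and handling for pages whose markup is rendered with JavaScript.

```python
from collections import defaultdict
from html.parser import HTMLParser
import urllib.request

class MetaDescriptionParser(HTMLParser):
    """Collects the content of <meta name="description"> tags."""
    def __init__(self):
        super().__init__()
        self.descriptions = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.descriptions.append((attrs.get("content") or "").strip())

def fetch_description(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.descriptions[0] if parser.descriptions else None

if __name__ == "__main__":
    urls = ["https://www.example.com/", "https://www.example.com/about"]  # placeholders
    seen = defaultdict(list)
    for url in urls:
        desc = fetch_description(url)
        if not desc:
            print(f"MISSING or EMPTY meta description: {url}")
        else:
            seen[desc].append(url)
    for desc, pages in seen.items():
        if len(pages) > 1:
            print(f"DUPLICATE description used on {len(pages)} pages: {desc[:60]!r}")
```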


Source: google webmaster

A reminder about links in large-scale article campaigns

Lately we’ve seen an increase in spammy links contained in articles referred to as contributor posts, guest posts, partner posts, or syndicated posts. These articles are generally written by or in the name of one website, and published on a different one.

Google does not discourage these types of articles when they inform users, educate another site’s audience, or bring awareness to your cause or company. However, what does violate Google’s guidelines on link schemes is when the main intent is to build links in a large-scale way back to the author’s site. Below are factors that, when taken to an extreme, can indicate when an article is in violation of these guidelines:

- Stuffing keyword-rich links to your site in your articles
- Having the articles published across many different sites; alternatively, having a large number of articles on a few large, different sites
- Using or hiring article writers that aren’t knowledgeable about the topics they’re writing on
- Using the same or similar content across these articles; alternatively, duplicating the full content of articles found on your own site (in which case use of rel="canonical", in addition to rel="nofollow", is advised)

When Google detects that a website is publishing articles that contain spammy links, this may change Google’s perception of the quality of the site and could affect its ranking. Sites accepting and publishing such articles should carefully vet them, asking questions like: Do I know this person? Does this person’s message fit with my site’s audience? Does the article contain useful content? If there are links of questionable intent in the article, has the author used rel="nofollow" on them?

For websites creating articles made for links, Google takes action on this behavior because it’s bad for the Web as a whole. When link building comes first, the quality of the articles can suffer and create a bad experience for users. Also, webmasters generally prefer not to receive aggressive or repeated “Post my article!” requests, and we encourage such cases to be reported to our spam report form. And lastly, if a link is a form of endorsement, and you’re the one creating most of the endorsements for your own site, is this putting forth the best impression of your site?

Our best advice in relation to link building is to focus on improving your site’s content, and everything, including links, will follow (no pun intended).

Posted by the Google Webspam Team


Source: google webmaster

How we fought webspam – Webspam Report 2016

With 2017 well underway, we wanted to take a moment and share some of the insights we gathered in 2016 in our fight against webspam. Over the past year, we continued to find new ways of keeping spam from creating a poor quality search experience, and worked with webmasters around the world to make the web better.

We do a lot behind the scenes to make sure that users can make full use of what today’s web has to offer, bringing relevant results to everyone around the globe, while fighting webspam that could potentially harm or simply annoy users.

Webspam trends in 2016
Website security continues to be a major source of concern. Last year we saw more hacked sites than ever – a 32% increase compared to 2015. Because of this, we continued to invest in improving and creating more resources to help webmasters know what to do when their sites get hacked. We continued to see that sites are compromised not just to host webspam. We saw a lot of webmasters affected by social engineering, unwanted software, and unwanted ad injectors. We took a stronger stance in Safe Browsing to protect users from deceptive download buttons, made a strong effort to protect users from repeatedly dangerous sites, and launched more detailed help text within the Search Console Security Issues report.

Since more people are searching on Google using a mobile device, we saw a significant increase in spam targeting mobile users. In particular, we saw a rise in spam that redirects users, without the webmaster’s knowledge, to other sites or pages, inserted into webmaster pages using widgets or via ad units from various advertising networks.

How we fought spam in 2016
We continued to refine our algorithms to tackle webspam. We made multiple improvements to how we rank sites, including making Penguin (one of our core ranking algorithms) work in real time.

The spam that we didn’t identify algorithmically was handled manually. We sent over 9 million messages to webmasters to notify them of webspam issues on their sites. We also started providing more security notifications via Google Analytics.

We performed algorithmic and manual quality checks to ensure that websites with structured data markup meet quality standards. We took manual action on more than 10,000 sites that did not meet the quality guidelines for inclusion in search features powered by structured data.

Working with users and webmasters for a better web
In 2016 we received over 180,000 user-submitted spam reports from around the world. After carefully checking their validity, we considered 52% of those reported sites to be spam. Thanks to all who submitted reports and contributed towards a cleaner and safer web ecosystem!

We conducted more than 170 online office hours and live events around the world to audiences totaling over 150,000 website owners, webmasters, and digital marketers.

We continued to provide support to website owners around the world through our Webmaster Help Forums in 15 languages. Through these forums we saw over 67,000 questions, with a majority of them identified as having a Best Response by our community of Top Contributors, Rising Stars, and Googlers. We had 119 volunteer Webmaster Top Contributors and Rising Stars, whom we invited to join us at local Top Contributor Meetups in 11 different locations across 4 continents (Asia, Europe, North America, South America).

We think everybody deserves high quality, spam-free search results. We hope that this report provides a glimpse of what we do to make that happen.

Posted by Michal Wicinski, Search Quality Strategist, and Kiyotaka Tanaka, User Education & Outreach Specialist


Source: google webmaster

Similar items: Rich products feature on Google Image Search

Image Search recently launched “Similar items” on mobile web and the Android Search app. The “Similar items” feature is designed to help users find products they love in photos that inspire them on Google Image Search. Using machine vision technology, the Similar items feature identifies products in lifestyle images and displays matching products to the user. Similar items supports handbags, sunglasses, and shoes, and will cover other apparel and home & garden categories in the next few months.

The Similar items feature enables users to browse and shop inspirational fashion photography and find product info about items they’re interested in. Try it out by opening results from queries like [designer handbags]. Finding price and availability information was one of the top Image Search feature requests from our users. The Similar items carousel gets millions of impressions and clicks daily from all over the world.

To make your products eligible for Similar items, make sure to add and maintain schema.org product metadata on your pages. The schema.org/Product markup helps Google find product offerings on the web and give users an at-a-glance summary of product info. To ensure that your products are eligible to appear in Similar items:

- Ensure that the product offerings on your pages have schema.org Product markup, including an image reference. Products with name, image, price & currency, and availability metadata on their host page are eligible for Similar items (a minimal markup sketch follows this post).
- Test your pages with Google’s Structured Data Testing Tool to verify that the product markup is formatted correctly.
- See your images on Image Search by issuing the query “site:yourdomain.com”. For results with valid product markup, you may see product information appear once you tap on the images from your site. It can take up to a week for Googlebot to recrawl your website.

Right now, Similar items is available on mobile browsers and the Android Google Search App globally, and we plan to expand to more platforms in 2017. If you have questions, find us in the dedicated Structured data section of our forum, on Twitter, or on Google+. To prevent your images from showing in Similar items, webmasters can opt out of Google Image Search.

We’re excited to help users find your products on the web by showcasing buyable items. Thanks for partnering with us to make the web more shoppable!

Posted by Julia E, Product Manager on Image Search
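Editor’s note: as a minimal sketch of the markup the first bullet describes, the snippet below emits schema.org/Product JSON-LD carrying the fields the post names (name, image, price and currency, availability). The product details and URLs are placeholders; the Product documentation lists further recommended properties.

```python
import json

# Illustrative product data; all values are placeholders.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Leather Handbag",
    "image": "https://www.example.com/images/handbag.jpg",
    "description": "A handmade leather handbag.",
    "offers": {
        "@type": "Offer",
        "price": "199.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed this block in the HTML of the product's host page.
print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")
```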


Source: google webmaster

Updates to Google Safe Browsing’s Site Status Tool

(Cross-posted from the Google Security Blog)

Google Safe Browsing gives users tools to help protect themselves from web-based threats like malware, unwanted software, and social engineering. We are best known for our warnings, which users see when they attempt to navigate to dangerous sites or download dangerous files. We also provide other tools, like the Site Status Tool, where people can check the current safety status of a web page (without having to visit it).

We host this tool within Google’s Safe Browsing Transparency Report. As with other sections in Google’s Transparency Report, we make this data available to give the public more visibility into the security and health of the online ecosystem. Users of the Site Status Tool input a webpage (as a URL, website, or domain) into the tool, and the most recent results of the Safe Browsing analysis for that webpage are returned, plus references to troubleshooting help and educational materials.

We’ve just launched a new version of the Site Status Tool that provides simpler, clearer results and is better designed for the primary users of the page: people who are visiting the tool from a Safe Browsing warning they’ve received, or doing casual research on Google’s malware and phishing detection. The tool now features a cleaner UI, easier-to-interpret language, and more precise results. We’ve also moved some of the more technical data on associated ASes (autonomous systems) over to the malware dashboard section of the report.

While the interface has been streamlined, the additional diagnostic information is not gone: researchers who wish to find more details can drill down elsewhere in Safe Browsing’s Transparency Report, while site owners can find additional diagnostic information in Search Console.

One of the goals of the Transparency Report is to shed light on complex policy and security issues, so we hope the design adjustments will indeed provide our users with additional clarity.

Posted by Deeksha Padma Prasad and Allison Miller, Safe Browsing
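Editor’s note: the Site Status Tool described above is a web UI. As a related aside not covered in the post, site owners who want to run similar checks programmatically can use the separate Safe Browsing Lookup API; the sketch below assumes the v4 threatMatches:find endpoint and an API key, and the client identifiers and URL are placeholders.

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = f"https://safebrowsing.googleapis.com/v4/threatMatches:find?key={API_KEY}"

def check_url(url: str) -> list:
    """Return any Safe Browsing threat matches for the given URL."""
    body = {
        "client": {"clientId": "example-site-check", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": url}],
        },
    }
    request = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as resp:
        result = json.loads(resp.read().decode("utf-8"))
    # An empty response means no current matches were found for the URL.
    return result.get("matches", [])

if __name__ == "__main__":
    print(check_url("https://www.example.com/"))
```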


Source: google webmaster

#NoHacked: A year in review

We hope your year started out safe and secure! We wanted to share with you a summary of our 2016 work as we continue our #NoHacked campaign. Let’s start with some trends on hacked sites from the past year.

State of Website Security in 2016
First off, some unfortunate news. We’ve seen an increase in the number of hacked sites of approximately 32% in 2016 compared to 2015. We don’t expect this trend to slow down. As hackers get more aggressive and more sites become outdated, hackers will continue to capitalize by infecting more sites.

On the bright side, 84% of webmasters who do apply for reconsideration are successful in cleaning their sites. However, 61% of webmasters who were hacked never received a notification from Google that their site was infected because their sites weren’t verified in Search Console. Remember to register for Search Console if you own or manage a site. It’s the primary channel that Google uses to communicate site health alerts.

More Help for Hacked Webmasters
We’ve been listening to your feedback to better understand how we can help webmasters with security issues. One of the top requests was easier-to-understand documentation about hacked sites. As a result, we’ve been hard at work to make our documentation more useful.

First, we created new documentation to give webmasters more context when their site has been compromised. Here is a list of the new help documentation:

- Top ways websites get hacked by spammers
- Glossary for Hacked Sites
- FAQs for Hacked Sites
- How do I know if my site is hacked?

Next, we created clean-up guides for sites affected by known hacks. We’ve noticed that sites often get affected in similar ways when hacked. By investigating the similarities, we were able to create clean-up guides for specific known types of hacks. Below is a short description of each of the guides we created:

- Gibberish Hack: The gibberish hack automatically creates many pages with nonsensical sentences filled with keywords on the target site. Hackers do this so the hacked pages show up in Google Search. Then, when people try to visit these pages, they’ll be redirected to an unrelated page, like a porn site. Learn more on how to fix this type of hack.
- Japanese Keywords Hack: The Japanese keywords hack typically creates new pages with Japanese text on the target site in randomly generated directory names. These pages are monetized using affiliate links to stores selling fake brand merchandise and then shown in Google Search. Sometimes the accounts of the hackers get added in Search Console as site owners. Learn more on how to fix this type of hack.
- Cloaked Keywords Hack: The cloaked keywords and link hack automatically creates many pages with nonsensical sentences, links, and images. These pages sometimes contain basic template elements from the original site, so at first glance the pages might look like normal parts of the target site until you read the content. In this type of attack, hackers usually use cloaking techniques to hide the malicious content and make the injected page appear as part of the original site or a 404 error page. Learn more on how to fix this type of hack. (A small sketch of a cloaking check appears after this post.)

Prevention is Key
As always, it’s best to take a preventative approach and secure your site rather than dealing with the aftermath. Remember, a chain is only as strong as its weakest link. You can read more about how to identify vulnerabilities on your site in our hacked help guide. We also recommend staying up to date on releases and announcements from your Content Management System (CMS) providers and software/hardware vendors.

Looking Forward
Hacking behavior is constantly evolving, and research allows us to stay up to date on and combat the latest trends. You can learn about our latest research publications on the information security research site. Highlighted below are a few studies specific to website compromises:

- Cloak of Visibility: Detecting When Machines Browse a Different Web
- Investigating Commercial Pay-Per-Install and the Distribution of Unwanted Software
- Users Really Do Plug in USB Drives They Find
- Ad Injection at Scale: Assessing Deceptive Advertisement Modifications

If you have feedback or specific questions about compromised sites, the Webmaster Help Forums have an active group of Googlers and technical contributors who can address your questions and provide additional technical support.

Posted by Wafa Alnasayan, Trust & Safety Analyst, and Eric Kuan, Webmaster Relations
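Editor’s note: because the cloaked keywords hack serves different content to crawlers than to regular visitors, one rough way to spot it is to fetch a suspect page twice, once with a browser User-Agent and once with a Googlebot User-Agent, and compare the responses. The sketch below is a heuristic only: legitimate sites can also vary content per client, the URL and similarity threshold are placeholders, and a real investigation should also use Search Console tools such as Fetch as Google.

```python
import difflib
import urllib.request

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_as(url: str, user_agent: str) -> str:
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def cloaking_suspected(url: str, threshold: float = 0.7) -> bool:
    """Flag pages whose crawler-facing HTML differs sharply from the browser-facing HTML."""
    browser_html = fetch_as(url, BROWSER_UA)
    crawler_html = fetch_as(url, GOOGLEBOT_UA)
    similarity = difflib.SequenceMatcher(None, browser_html, crawler_html).ratio()
    return similarity < threshold

if __name__ == "__main__":
    # Placeholder URL; point this at pages you suspect were injected.
    print(cloaking_suspected("https://www.example.com/some-page"))
```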


Source: google webmaster

Closing down for a day

Even in today’s “always-on” world, sometimes businesses want to take a break. There are times when even their online presence needs to be paused. This blog post covers some of the available options so that a site’s search presence isn’t affected.

Option: Block cart functionality
If a site only needs to block users from buying things, the simplest approach is to disable that specific functionality. In most cases, shopping cart pages can either be blocked from crawling through the robots.txt file, or blocked from indexing with a robots meta tag. Since search engines either won’t see or won’t index that content, you can communicate this to users in an appropriate way. For example, you may disable the link to the cart, add a relevant message, or display an informational page instead of the cart.

Option: Always show an interstitial or pop-up
If you need to block the whole site from users, be it with a “temporarily unavailable” message, informational page, or popup, the server should return a 503 HTTP result code (“Service Unavailable”). The 503 result code makes sure that Google doesn’t index the temporary content that’s shown to users. Without the 503 result code, the interstitial would be indexed as your website’s content. Googlebot will retry pages that return 503 for up to about a week before treating it as a permanent error that can result in those pages being dropped from the search results. You can also include a Retry-After header to indicate how long the site will be unavailable (a minimal 503 server sketch follows this post). Blocking a site for longer than a week can have negative effects on the site’s search results regardless of the method that you use.

Option: Switch the whole website off
Turning the server off completely is another option. You might also do this if you’re physically moving your server to a different data center. For this, have a temporary server available to serve a 503 HTTP result code for all URLs (with an appropriate informational page for users), and switch your DNS to point to that server during that time:

1. Set your DNS TTL to a low time (such as 5 minutes) a few days in advance.
2. Change the DNS to the temporary server’s IP address.
3. Take your main server offline once all requests go to the temporary server.
4. … your server is now offline …
5. When ready, bring your main server online again.
6. Switch DNS back to the main server’s IP address.
7. Change the DNS TTL back to normal.

We hope these options cover the common situations where you’d need to disable your website temporarily. If you have any questions, feel free to drop by our webmaster help forums!

PS: If your business is active locally, make sure to reflect these closures in the opening hours for your local listings too!

Posted by John Mueller, Webmaster Trends Analyst, Switzerland
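Editor’s note: as a minimal sketch of the “temporarily unavailable” setup, here is a tiny Python standard-library server that answers every request with a 503 and a Retry-After header. It is illustrative only: the retry interval, port, and message are placeholders, and a production setup would more likely configure the 503 responses in the existing web server or load balancer.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

MESSAGE = b"<html><body><p>We are temporarily closed. Please check back soon.</p></body></html>"

class UnavailableHandler(BaseHTTPRequestHandler):
    def _send_503(self, include_body=True):
        self.send_response(503)                   # Service Unavailable
        self.send_header("Retry-After", "86400")  # suggest retrying in one day (seconds)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(MESSAGE)))
        self.end_headers()
        if include_body:
            self.wfile.write(MESSAGE)

    def do_GET(self):
        self._send_503()

    def do_POST(self):
        self._send_503()

    def do_HEAD(self):
        self._send_503(include_body=False)

if __name__ == "__main__":
    # Serve the 503 page for all URLs on the temporary server.
    HTTPServer(("0.0.0.0", 8080), UnavailableHandler).serve_forever()
```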


Source: google webmaster

Introducing the Mobile-Friendly Test API

With so many users on mobile devices, having a mobile-friendly web is important to us all. The Mobile-Friendly Test is a great way to check individual pages manually. We’re happy to announce that this test is now available via API as well.

The Mobile-Friendly Test API lets you test URLs using automated tools. For example, you could use it to monitor important pages in your website in order to prevent accidental regressions in templates that you use. The API method runs all tests, and returns the same information, including a list of the blocked URLs, as the manual test. The documentation includes simple samples to help get you started quickly (a short sketch also follows this post).

We hope this API makes it easier to check your pages for mobile-friendliness and to get any such issues resolved faster. We’d love to hear how you use the API: leave us a comment here, and feel free to link to any code or implementation that you’ve set up! As always, if you have any questions, feel free to drop by our webmaster help forum.

Posted by John Mueller, Webmaster Trends Analyst, Google Switzerland
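Editor’s note: here is a minimal sketch of calling the API over plain HTTPS from Python. It assumes the Search Console URL Testing Tools endpoint (mobileFriendlyTest:run) and an API key; check the API documentation for the current endpoint, quotas, and response fields before relying on this.

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = (
    "https://searchconsole.googleapis.com/v1/"
    f"urlTestingTools/mobileFriendlyTest:run?key={API_KEY}"
)

def run_mobile_friendly_test(url: str) -> dict:
    """Submit a URL to the Mobile-Friendly Test API and return the parsed JSON result."""
    body = json.dumps({"url": url}).encode("utf-8")
    request = urllib.request.Request(
        ENDPOINT, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    result = run_mobile_friendly_test("https://www.example.com/")
    # The response carries an overall verdict plus any detected issues.
    print(result.get("mobileFriendliness"))
    for issue in result.get("mobileFriendlyIssues", []):
        print("issue:", issue.get("rule"))
```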


Source: google webmaster