
Many of you create great content on the web, and we work hard to make that content discoverable on Google. Today, we will start highlighting the people creating this content in Google.com search results.

Why? Because authorship is a great way to identify and highlight high-quality content. Plus, the web is centered around people. People discovering content on the web often want to learn more about its author, see other content by that author, and even interact with the author.

We’re piloting this new search experience with a small sample of authors who have linked their Google profile with their content. If you are a content creator and would like to participate in our pilot program, please follow the instructions detailed in our Help Center.

As you can see below, certain results will display an author’s picture and name -- derived from and linked to their Google Profile -- next to their content on the Google Search results page.


In addition, as we discover new content that pilot participants have marked up as theirs, we can automatically add links to this content within their Google profiles.

This feature is powered by the authorship markup we announced two weeks ago. We hope that as more authors link to their content, it will improve your search experience and the quality of content being created on the web.

Today we introduced initial design changes across our different products like Google Maps, Gmail and the search results page. Over the next few months, you’ll continue to see updates to the look of Google Search, where our core principle is to make it as easy as possible for you to find what you’re looking for. Design is an important part of achieving that goal, so we’ll be making changes to simplify our look and make it easier to focus your attention on what’s important on the page.

On the results page, you’ll notice a new gray bar and a blue search button to highlight the search box at the top of the page.



A few other design changes to help you focus on finding what you’re searching for include:
  • An updated design for the left-hand panel of tools, where we’ve muted the color of the tools and reserved the use of bolder colors to highlight key action buttons, tools and filters.
  • The URL relocated directly beneath the headline for each search result.
  • Links on the homepage moved to the top and bottom edges of your browser, making the notably clean Google homepage even cleaner.
We will continue to update and improve the design based on feedback from users. The changes rolling out this week are the first steps toward a more focused, simplified look across all Google properties.

Sometimes when you’re searching, you’re not just looking for one specific result; you may be looking for a list to start a series of searches. For example, if you search for [greek philosophers], many search results mention well-known philosophers like Plato or Aristotle. Typically, searches like these are the beginning of a research task, where you follow up by searching to learn more about each item in the list, in this case each philosopher.

Starting today, for many of your list-seeking searches, you’ll see a collection of the top referenced items from the topic of your search.


If you click one of these links, the collection of links moves to the top of the results page, and results for the philosopher you clicked are shown below. Since the top references block stays anchored on top of your search results, it's easy to explore and learn about each of the philosophers. You can see top references for a variety of topics -- try out [american authors], [seattle neighborhoods], [famous basketball players], or [cruciferous vegetables].

Sometimes, a list of related searches is helpful even if you don't ask for the list directly. If you search for [van gogh], you may also be interested in learning more about his paintings. For many searches for artists, you can now see a list of famous paintings at the bottom of the search results page.



Similarly, search for a movie like [inception] or TV show like [30 rock], and you'll see a list of starring cast members. You can also see these sorts of results for other types of searches -- for example [u2], [stephen king], or [tom hanks]. These related searches currently appear in English only on google.com.

To better understand and answer your searches for a list, we use a variety of signals to assess what the web collectively thinks are the most significant items associated with your search keywords. Since Plato is discussed so frequently in pages about Greek philosophers, our algorithms can infer that he is an important Greek philosopher. Much of this work is based on common search patterns and the Google Squared technology we introduced in Google Labs in June 2009.

This improvement to related searches reflects our continued efforts to help our algorithms better understand content on the web in the complex ways that humans do, and use that understanding to help you as you search. We see a lot more potential in this exciting area of research and we're looking forward to introducing more improvements in the future.

During our Inside Search press event in San Francisco today, we discussed how mobile search has inspired new ways to remove barriers to knowledge on the desktop and help you get to your results faster. On a mobile phone, you’re not limited to typing: you can search using your voice or a picture. Now you can do the same on your computer with Voice Search and Search by Image on desktop. You can also get your results faster with Google Instant on Images and Instant Pages.

Voice Search on desktop

You’ve been able to use Voice Search on mobile devices since 2008, but you’re probably so used to typing your searches that you don’t immediately think to use your voice. With Voice Search now available on desktop, searching by voice is becoming more ubiquitous and the idea of being able to speak your search will be more familiar, no matter what you’re searching for.

Voice Search can be especially useful for long queries such as [pictures of big wave surfing in Waimea Bay] or words that are hard to spell like [Schenectady, New York]. It’s also a helpful option if you’re in a conversational mood and you’d rather ask your question out loud than think of the keywords to type. For example, maybe you want to find “a recipe for spaghetti with bolognese sauce.” Just click the microphone icon in the search box and ask out loud.


Voice Search on desktop takes advantage of Chrome’s Speech API and will be available to everyone using Chrome 11+ in English after it has rolled out over the next week. You'll also need to make sure you have a microphone that works, whether it's built into your computer or an external mic that you plug in.
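For context, here is a minimal sketch of the speech input mechanism Chrome exposed to web pages around this release (the x-webkit-speech attribute on a text field); it illustrates the browser capability in general rather than the exact markup of the Google search box:

  <!-- A text field that shows a microphone icon in speech-enabled versions of Chrome -->
  <input type="text" x-webkit-speech>

Clicking the microphone lets you dictate into the field instead of typing.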

Search by Image on desktop

You’ve also been able to search by image on your phone since 2009 with Google Goggles. But sometimes when you’re on your computer, you may not have the words to describe exactly what you’re looking for. You might have an old vacation photo but have forgotten the name of that beautiful beach. Typing [guy on a rocky path on a cliff with an island behind him] isn’t exactly specific enough to help find your answer. So when words aren’t as descriptive as the image, you can now search using the image itself.

To search using an image, go to images.google.com and just put your picture in the search box. There are many ways to do this. You can click the camera icon in the search box and upload a photo from your computer or paste the URL of an image from the web. You can also drag and drop pictures from webpages or your computer into the search box. To search images on the web even faster with just one click, you can download the Chrome or Firefox extensions.


Search by Image returns the best results for images that have related content already on the web, so you’re more likely to get relevant results for distinctive landmarks or paintings than for one-of-a-kind photos like your toddler’s latest finger painting. In addition to getting relevant results about your image, you can also find visually similar images or the same image in different sizes or resolutions.

Search by Image starts with the computer vision technology underlying Google Goggles, and adds new techniques and functionality that optimize the experience for desktop. The technology behind Search by Image analyzes your image to find its most distinctive points, lines and textures and creates a mathematical model. We match that model against billions of images in our index, and page analysis helps us derive a best guess text description of your image. Search by Image technology also includes the ability to match against images on the web so that we can show you similar images and webpages that contain your image.

Search by Image is rolling out and will be available in most countries over the next couple of days. The Chrome and Firefox extensions are available for download now, but until Search by Image is rolled out to you, you won’t see the extension active in your browser.

Google Images with Instant

Another theme of our event was speed. Last fall when we introduced Google Instant, we sped up searching by showing you results as you type. In our event today we showed a sneak preview of Google Images with Instant.

Google Images with Instant will be available over the next couple of months to all domains and languages where Instant is already available. If you want to try it sooner, opt in at google.com/experimental.

Instant Pages

Whether you’re typing, speaking, or using an image, entering your search is only part of the process. You’re not really done searching until you have the answer you’re looking for. But waiting for webpages to load adds time to this process: the average webpage takes about five seconds to load.

With Instant Pages in Chrome, you can skip the extra seconds waiting for a page to load and get to the answers you’re looking for faster with webpages that load instantly.

For searches when we can predict with reasonable confidence that you’ll click on the first result, Instant Pages technology will begin loading that webpage early so that by the time you click on the result, the entire webpage appears fully loaded instantly. Take a look at this side-by-side comparison:


Learn more about the technology behind Instant Pages on the Chromium blog; if you’re a webmaster, you can find additional details on our Webmaster Central blog. Instant Pages is currently available in the developer version of Chrome, and will be included in the next beta version. It will be available to all Chrome users later this summer.
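For background, Instant Pages builds on Chrome’s prerendering support, which the Chromium blog describes in more detail. A page can hint that another page is likely to be visited next with a link element roughly like the following (the URL is hypothetical; Google’s results page adds this hint automatically when it is confident about the top result):

  <!-- Hint to Chrome that this page is likely to be visited next -->
  <link rel="prerender" href="http://example.com/likely-next-result.html">

Chrome can then fetch and render that page in the background, so it appears immediately if you click through.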

In April we introduced A Google a Day, a daily puzzle that helps you practice both your trivia knowledge and your search skills. We have a team that works on coming up with new questions, but we want to make sure the questions continue to be challenging and diverse. As part of the new Guest Author program for A Google a Day, leading experts in various science, art, and literature fields will be writing questions on topics they’re passionate about to make sure you get compelling trivia questions on a wide range of topics.

The Guest Author program kicked off last week on May 31 with a question from our own Chief Internet Evangelist, Vint Cerf, who is recognized as one of the fathers of the Internet. He is also well-known for his vision with the InterPlaNetary Internet Project, a proposal on how to extend IP protocols to Mars and other spacecraft, and an intergalactic passion that inspired his A Google a Day question.


Each week, there will be one question written by a guest author. Today’s question is from renowned adventurer and explorer David de Rothschild in celebration of World Oceans Day, a worldwide initiative to appreciate the ocean and promote ocean conservation.


To see upcoming questions from future guest authors such as the Barefoot Contessa Ina Garten, Paulo Coelho, author of the best-selling novel The Alchemist, Boeing test pilot Mike Carriker, animation historian Jerry Beck and more, keep visiting www.agoogleaday.com.

(Cross-posted on the Webmaster Central Blog)

Today we're beginning to support authorship markup -- a way to connect authors with their content on the web. We are experimenting with using this data to help people find content from great authors in our search results.

We now support markup that enables websites to publicly link within their site from content to author pages. For example, if an author at The New York Times has written dozens of articles, using this markup, the webmaster can connect these articles with a New York Times author page. An author page describes and identifies the author, and can include things like the author’s bio, photo, articles and other links.

If you run a website with authored content, you’ll want to learn about authorship markup in our help center. The markup uses existing standards such as HTML5 (rel=”author”) and XFN (rel=”me”) to enable search engines and other web services to identify works by the same author across the web. If you're already doing structured data markup using microdata from schema.org, we'll interpret that authorship information as well.
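To make this concrete, here is a minimal sketch of the linking pattern using hypothetical URLs: an article page links to the site’s author page with rel=”author”, and the author page links to the author’s Google profile with rel=”me”:

  <!-- On an article page: link the content to the site's author page -->
  <a rel="author" href="http://example.com/authors/jane-doe">Jane Doe</a>

  <!-- On the author page: link to the author's Google profile -->
  <a rel="me" href="https://profiles.google.com/janedoe">My Google profile</a>

The help center article covers the full set of supported configurations.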

We wanted to make sure the markup was as easy to implement as possible. To that end, we’ve already worked with several sites to mark up their pages, including The New York Times, The Washington Post, CNET, Entertainment Weekly, The New Yorker and others. In addition, we’ve taken the extra step to add this markup to everything hosted by YouTube and Blogger. In the future, both platforms will automatically include this markup when you publish content.

We know that great content comes from great authors, and we’re looking closely at ways this markup could help us highlight authors and rank search results.

Posted by Othar Hansson, Software Engineer

Starting this week we’re making it easier to quickly find great images right in your Google search results. Drawing from last year’s broader update to Google Images, we’ve integrated many of the features we introduced at that time into our main search results. Images will now appear in a tiled layout, with hover previews that give you a larger thumbnail and more information about a particular image:


Additionally, if we detect that your query has “high image intent” (meaning, we’re pretty sure you’re looking for images) we’ll start showing more images on the page. If you add words like “photos”, “pictures”, and “images” to a query, that means you’re probably not looking for a blog post or video. Showing more images on the main search results page makes it just that much faster to find the image you’re looking for. For example, in a search for [nebula pictures], instead of just three or four pictures at the top of the results, now you’ll find more than a dozen beautiful pictures filling up most of the page.


For more great looking examples, try out queries like [pictures of rainforests] and [monet photos]. This is currently available on google.com in English and will be available globally over the next month.

(Cross-posted on the Official Google Blog)

Today we’re announcing schema.org, a new initiative from Google, Bing and Yahoo! to create and support a common vocabulary for structured data markup on web pages. With schema.org, site owners and developers can learn about structured data and improve how their sites appear in major search engines. The site aims to be a one stop resource for webmasters looking to add markup to their pages.

Search engines have been working independently to support structured markup for a few years now. We introduced rich snippets to Google search in 2009 to help people find better summaries of reviews and people, and since that time we’ve expanded to new kinds of rich snippets, including recipes and events. We’ve been thrilled to see content creators across the web—from stubhub.com to allrecipes.com—add markup to their pages, and today we’re able to show rich snippets in search results more than 10 times as often as when we started two years ago.

We want to continue making the open web richer and more useful. We know that it takes time and effort for webmasters to add this markup to their pages, and adding markup is much harder if every search engine asks for data in a different way. That’s why we’ve come together with other search engines to support a common set of schemas, just as we came together to support a common standard for sitemaps in 2006. With schema.org, site owners can improve how their sites appear in search results not only on Google, but on Bing, Yahoo! and potentially other search engines as well in the future.

In addition to consolidating the schemas for the categories we already support, schema.org also introduces schemas for more than a hundred new categories, including movies, music, organizations, TV shows, products, places and more. As webmasters add this markup to their sites, search engines can develop richer search experiences. With webmaster feedback, we’ll be able to regularly publish new schemas for sites to use and, in turn, expand the list of queries with rich results. Webmasters who have already added microformats or RDFa currently supported by rich snippets will still see their sites appear with rich snippets on Google. You can learn more on our Webmaster Central Blog, Help Center and on schema.org.
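As a rough illustration, here is a short sketch of what microdata using a schema.org type can look like, in this case the Movie type (the values are made up; schema.org documents the properties each type supports):

  <div itemscope itemtype="http://schema.org/Movie">
    <h1 itemprop="name">Avatar</h1>
    <span>Director: <span itemprop="director">James Cameron</span></span>
    <span itemprop="genre">Science fiction</span>
  </div>

Search engines that understand this vocabulary can then recognize the page as describing a movie with a name, director and genre.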

Schema.org provides a wide variety of vocabularies webmasters can use to mark up their pages.

While this collaborative initiative is new, we draw heavily from decades of work in the database and knowledge representation communities, from projects such as Jim Gray’s SDSS SkyServer and Cyc, and from ongoing efforts such as dbpedia.org and Linked Data. We feel privileged to build upon this great work.

We look forward to seeing structured markup continue to grow on the web, powering richer search results and new kinds of applications.