
Our updated Privacy Policy takes effect today, March 1. As you use our products one thing will be clear: it’s the same Google experience that you’re used to, with the same controls.

And because we’re making these changes, over time we’ll be able to improve our products in ways that help our users get the most from the web.

While we’ve undertaken the most extensive user education campaign in our history to explain the coming changes, we know there has been a fair amount of chatter and confusion.

Here are a few important points to bear in mind:

Our Privacy Policy is now much easier to understand.

We’ve incorporated the key parts from more than 60 product-specific notices into our main Google Privacy Policy—so there’s no longer any need to be your own mini search engine if you want to work out what’s going on. Our Privacy Policy now explains, for the vast majority of our services, what data we’re collecting and how we may use it, in plain language.

Our Privacy Policy will enable us to build a better, more intuitive user experience across Google for signed-in users.

If you’re signed in to Google, you expect our products to work really beautifully together. For example, if you’re working on a document in Google Docs and you want to share it with someone on Gmail, you want their email address right there ready to use. Our privacy policies have always allowed us to combine information from different products with your account—effectively using your data to provide you with a better service. However, we’ve been restricted in our ability to combine your YouTube and Search histories with other information in your account. Our new Privacy Policy gets rid of those inconsistencies so we can make more of your information available to you when using Google.

So in the future, if you do frequent searches for Jamie Oliver, we could recommend Jamie Oliver videos when you’re looking for recipes on YouTube—or we might suggest ads for his cookbooks when you’re on other Google properties.

Our privacy controls aren’t changing.

The new policy doesn’t change any existing privacy settings or how any personal information is shared outside of Google. We aren’t collecting any new or additional information about users. We won’t be selling your personal data. And we will continue to employ industry-leading security to keep your information safe.

If you don’t think information sharing will improve your experience, you can use our privacy tools to do things like edit or turn off your search history and YouTube history, control the way Google tailors ads to your interests and browse the web “incognito” using Chrome. You can use services like Search, Maps and YouTube if you are not signed in. You can even separate your information into different accounts, since we don’t combine personal information across them. And we’re committed to data liberation, so if you want to take your information elsewhere you can.

We’ll continue to look for ways to make it simpler for you to understand and control how we use the information you entrust to us. We build Google for you, and we think these changes will make our services even better.

In just over a month we will make some changes to our privacy policies and Google Terms of Service. This stuff matters, so we wanted to explain what’s changing, why and what these changes mean for users.

First, our privacy policies. Despite trimming our policies in 2010, we still have more than 70 (yes, you read right … 70) privacy documents covering all of our different products. This approach is somewhat complicated. It’s also at odds with our efforts to integrate our different products more closely so that we can create a beautifully simple, intuitive user experience across Google.

So we’re rolling out a new main privacy policy that covers the majority of our products and explains what information we collect, and how we use it, in a much more readable way. While we’ve had to keep a handful of separate privacy notices for legal and other reasons, we’re consolidating more than 60 into our main Privacy Policy.

Regulators globally have been calling for shorter, simpler privacy policies—and having one policy covering many different products is now fairly standard across the web.

These changes will take effect on March 1, and we’re starting to notify users today, including via email and a notice on our homepage.



What does this mean in practice? The main change is for users with Google Accounts. Our new Privacy Policy makes clear that, if you’re signed in, we may combine information you've provided from one service with information from other services. In short, we’ll treat you as a single user across all our products, which will mean a simpler, more intuitive Google experience.

Our recently launched personal search feature is a good example of the cool things Google can do when we combine information across products. Our search box now gives you great answers not just from the web, but your personal stuff too. So if I search for restaurants in Munich, I might see Google+ posts or photos that people have shared with me, or that are in my albums. Today we can also do things like make it easy for you to read a memo from Google Docs right in your Gmail, or add someone from your Gmail contacts to a meeting in Google Calendar.

But there’s so much more that Google can do to help you by sharing more of your information with … well, you. We can make search better—figuring out what you really mean when you type in Apple, Jaguar or Pink. We can provide more relevant ads too. For example, it’s January, but maybe you’re not a gym person, so fitness ads aren’t that useful to you. We can provide reminders that you’re going to be late for a meeting based on your location, your calendar and an understanding of what the traffic is like that day. Or ensure that our spelling suggestions, even for your friends’ names, are accurate because you’ve typed them before. People still have to do way too much heavy lifting, and we want to do a better job of helping them out.

Second, the Google Terms of Service—terms you agree to when you use our products. As with our privacy policies, we’ve rewritten them so they’re easier to read. We’ve also cut down the total number, so many of our products are now covered by our new main Google Terms of Service. Visit the Google Terms of Service page to find the revised terms.

Finally, what we’re not changing. We remain committed to data liberation, so if you want to take your information elsewhere you can. We don’t sell your personal information, nor do we share it externally without your permission except in very limited circumstances like a valid court order. We try hard to be transparent about the information we collect, and to give you meaningful choices about how it is used—for example our Ads Preferences Manager enables you to edit the interest categories we advertise against or turn off certain Google ads altogether. And we continue to design privacy controls, like Google+’s circles, into our products from the ground up.

We believe this new, simpler policy will make it easier for people to understand our privacy practices as well as enable Google to improve the services we offer. Whether you’re a new Google user or an old hand, please do take the time to read our new privacy policy and terms, learn more about the changes we’re making and understand the controls we offer.

Does this person sound familiar? He can’t be bothered to type a password into his phone every time he wants to play a game of Angry Birds. When he does need a password, maybe for his email or bank website, he chooses one that’s easy to remember like his sister’s name—and he uses the same one for each website he visits. For him, cookies come from the bakery, IP addresses are the locations of Intellectual Property and a correct Google search result is basically magic.

Most of us know someone like this. Technology can be confusing, and the industry often fails to explain clearly enough why digital literacy matters. So today in the U.S. we’re kicking off Good to Know, our biggest-ever consumer education campaign focused on making the web a safer, more comfortable place. Our ad campaign, which we introduced in the U.K. and Germany last fall, offers privacy and security tips: Use 2-step verification! Remember to lock your computer when you step away! Make sure your connection to a website is secure! It also explains some of the building blocks of the web like cookies and IP addresses. Keep an eye out for the ads in newspapers and magazines, online and in New York and Washington, D.C. subway stations.



The campaign and Good to Know website build on our commitment to keeping people safe online. We’ve created resources like privacy videos, the Google Security Center, the Family Safety Center and Teach Parents Tech to help you develop strong privacy and security habits. We design for privacy, building tools like Google Dashboard, Me on the Web, the Ads Preferences Manager and Google+ Circles—with more on the way.

We encourage you to take a few minutes to check out the Good to Know site, watch some of the videos, and be on the lookout for ads in your favorite newspaper or website. We hope you’ll learn something new about how to protect yourself online—tips that are always good to know!

Update Jan 17: Updated to include more background about Good to Know.



In May, we held our first Big Tent conference near London, where we debated some of the hot issues relating to the Internet and society with policy-makers, academics and NGOs. The term "big tent" not only described the marquee venue but also our aim to include diverse points of view.

After the U.K. success, we decided to export the concept. Yesterday we welcomed more than 200 guests in Berlin, Germany to the second Big Tent event, entitled DatenDialog.

This dialogue about data tackled the issue of online privacy from a variety of angles. It was appropriate to hold it in Germany, which is a pacesetter both in its concern about privacy and its ideas for safeguarding personal data. During the one-day event, we debated questions such as: what does responsible collaboration between the tech industry and the data protection authorities look like? Do we need new regulation to manage the Internet and the large amount of data produced in the online world? Who is responsible for educating users and how does the tech industry make sure it builds privacy controls into its products?

Speakers included the German State Secretary for the Interior Cornelia Rogall-Grothe and the Federal Data Protection Commissioner Peter Schaar, alongside international authors and bloggers Cory Doctorow and Jeff Jarvis who appeared via live video chat from the U.S.



The debate was always lively, sometimes polarised—Cory likened amalgamated data to nuclear waste while Jeff appealed to governments not to regulate for the worst case—but all seemed to agree that it was a worthwhile and timely exercise to explore these important issues.

You can watch the highlights soon on our Big Tent YouTube channel, and stay tuned for more Big Tents on a range of topics around the world in the coming months.



(Cross-posted from the European Public Policy Blog)

From tagging a post with your location, to checking in to a restaurant, to simply finding out where you are, location-based services have become some of the most popular features of today’s Internet. One of the key ways technology companies are able to determine a location for these services is through a location database, which matches publicly broadcast information about local wireless networks with their approximate geographic location. By looking for wireless access points that are close to a user’s phone, location providers can return the approximate location you need. In addition, this method is a good alternative to other approaches, like GPS, because it’s faster, it works indoors, and it’s more battery-efficient.
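The matching step described above can be illustrated with a toy sketch. This is not Google’s actual schema or algorithm—the database entries, field names and the simple centroid averaging here are all hypothetical—but it shows the basic idea: a client reports the identifiers of nearby access points, and the provider returns an approximate position derived from the known locations of those access points.

```python
# Toy illustration of a WiFi location database lookup (hypothetical
# data and method, not Google's implementation). Access points are
# keyed by their publicly broadcast hardware address (BSSID/MAC);
# each maps to an approximate latitude/longitude.
AP_DB = {
    "00:11:22:33:44:55": (48.137, 11.575),  # hypothetical entries
    "66:77:88:99:aa:bb": (48.138, 11.576),
}

def estimate_location(visible_aps):
    """Estimate position as the centroid of known visible access points.

    Returns None when no reported access point is in the database.
    """
    matches = [AP_DB[mac] for mac in visible_aps if mac in AP_DB]
    if not matches:
        return None
    lat = sum(p[0] for p in matches) / len(matches)
    lng = sum(p[1] for p in matches) / len(matches)
    return (lat, lng)

print(estimate_location(["00:11:22:33:44:55", "66:77:88:99:aa:bb"]))
```

Real systems weight access points by signal strength and combine this with GPS and cell-tower data, but the database lookup is the core of why the approach is fast and works indoors.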

The wireless access point information we use in our location database, the Google Location Server, doesn’t identify people. But as first mentioned in September, we can do more to address privacy concerns.

We’re introducing a method that lets you opt out of having your wireless access point included in the Google Location Server. To opt out, visit your access point’s settings and change the wireless network name (or SSID) so that it ends with “_nomap”. For example, if your SSID is “Network”, you’d need to change it to “Network_nomap”.
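From a location provider’s side, honoring this opt-out amounts to a suffix check on the broadcast SSID. The sketch below is illustrative only (the function name is our own, not part of any published API):

```python
def should_index(ssid: str) -> bool:
    """Return True if an access point may be included in a location
    database, honoring the "_nomap" opt-out suffix convention."""
    return not ssid.endswith("_nomap")

print(should_index("Network"))        # True: indexable
print(should_index("Network_nomap"))  # False: owner has opted out
```

Because the opt-out travels with the network name itself, any provider that observes the access point can apply the same check, which is what makes universal adoption of the “_nomap” string feasible.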

To get started, visit this Help Center article to learn more about the process and to find links with specific instructions on how to change an access point’s SSID for various wireless access point manufacturers.

As we explored different approaches for opting access points out of the Google Location Server, we found that a method based on wireless network names provides the right balance of simplicity as well as protection against abuse. Specifically, this approach helps protect against others opting out your access point without your permission.

Finally, because other location providers will also be able to observe these opt-outs, we hope that over time the “_nomap” string will be adopted universally. This would help benefit all users by providing everyone with a unified opt-out process regardless of location provider.

Update Nov 21: Edited punctuation to clarify the "_nomap" tag.




(Cross-posted on the European Public Policy Blog and Public Policy Blog)

Update June 14, 7:40pm: After we published this post, the Kazakhstan authorities issued new guidance stating that the order no longer applies to previously registered domains. In practice this means we can re-launch google.kz. While we’re pleased that we can once again offer our users in Kazakhstan customized search results, we encourage the Government of Kazakhstan to rescind this requirement for all future .kz domains as well.

The genius of the Internet has always been its open infrastructure, which allows anyone with a connection to communicate with anyone else on the network. It’s not limited by national boundaries, and it facilitates free expression, commerce and innovation in ways that we could never have imagined even 20 or 30 years ago.

Some governments, however, are attempting to create borders on the web without full consideration of the consequences their actions may have on their own citizens and the economy. Last month, the Kazakhstan Network Information Centre notified us of an order issued by the Ministry of Communications and Information in Kazakhstan that requires all .kz domain names, such as google.kz, to operate on physical servers within the borders of that country. This requirement means that Google would have to route all searches on google.kz to servers located inside Kazakhstan. (Currently, when users search on any of our domains, our systems automatically handle those requests the fastest way possible, regardless of national boundaries.)

We find ourselves in a difficult situation: creating borders on the web raises important questions for us not only about network efficiency but also about user privacy and free expression. If we were to operate google.kz only via servers located inside Kazakhstan, we would be helping to create a fractured Internet. So we have decided to redirect users that visit google.kz to google.com in Kazakh. Unfortunately, this means that Kazakhstani users will experience a reduction in search quality as results will no longer be customized for Kazakhstan.

Measures that force Internet companies to choose between taking actions that harm the open web, or reducing the quality of their services, hurt users. We encourage governments and other stakeholders to work together to preserve an open Internet, which empowers local users, boosts local economies and encourages innovation around the globe.

(Cross-posted from the European Public Policy Blog)

At our European Zeitgeist event, held annually near London, we traditionally erect a large marquee for a partner dinner and entertainment. This year we wondered if there was anything else we could do with the space once Zeitgeist was over. In that instant, the Big Tent was born.

Canvas aside, the term "big tent" has, of course, a political connotation. Wikipedia defines it as "seeking to attract people with diverse viewpoints...does not require adherence to some ideology as a criterion for membership." That just about sums up the idea behind last week’s Big Tent conference, which focused on debating some of the hot issues relating to the internet and society.

We invited the advocacy groups Privacy International and Index on Censorship—both of whom have criticised Google in the past—to partner with us in staging the debates, and sought diverse viewpoints among the speakers and the delegates.

Topics on the agenda included: what was the role of technology in the revolutions in the Middle East? What are the limits of free speech online? Do we need tougher privacy laws or are we in danger of stifling innovation? Can technology and access to information be used to help prevent conflict?

The result was a stimulating day of debate featuring the likes of Big Brother television producer Peter Bazalgette, Mumsnet founder Justine Roberts and the U.K. Culture Secretary Jeremy Hunt alongside Googlers including Eric Schmidt, Google Ideas’ Jared Cohen and the Egyptian activist Wael Ghonim, and a highly engaged and knowledgeable audience of NGOs, policy advisers, tech businesses and journalists.



You can watch highlights on YouTube and see event feedback on Twitter. We hope to bring the Big Tent to other regions over the coming year.

User trust really matters to Google. That’s why we try to be clear about what data we collect and how we use it—and to give people real control over the information they share with us. For example, Google Dashboard lets you view the data that’s stored in your Google Account and manage your privacy settings for different services. With our Ads Preferences Manager, you can see and edit the data Google uses to tailor ads on our partner websites—or opt out of them entirely. And the Data Liberation Front makes it easy to move your data in and out of Google products. We also recently improved our internal privacy and security procedures.

That said, we don’t always get everything right. The launch of Google Buzz fell short of our usual standards for transparency and user control—letting our users and Google down. While we worked quickly to make improvements, regulators—including the U.S. Federal Trade Commission—unsurprisingly wanted more detail about what went wrong and how we could prevent it from happening again. Today, we’ve reached an agreement with the FTC to address their concerns. We’ll receive an independent review of our privacy procedures once every two years, and we’ll ask users to give us affirmative consent before we change how we share their personal information.

We’d like to apologize again for the mistakes we made with Buzz. While today’s announcement thankfully puts this incident behind us, we are 100 percent focused on ensuring that our new privacy procedures effectively protect the interests of all our users going forward.

(Cross-posted on the Public Policy Blog)

It’s become a welcome tradition: Today is the fourth annual Data Privacy Day. Dozens of countries have been celebrating with events throughout the week to inform and educate us all about our personal data rights and protections.

This is the first year I’ve marked this day as director of privacy across both engineering and product management at Google. I’ve chosen to spend the day in Washington, D.C., where there’s been a lot of robust and productive discussion lately. People from Congress, the Federal Trade Commission, the Department of Commerce, and industry and consumer groups have been contributing to these important conversations about how to best protect people’s data, and we’re happy to be participating too. I’m doing my part by bringing my geek sensibilities into a public discussion that we’re hosting today. In fact, that’s what we’re calling it: “The Technology of Privacy: When Geeks Meet Wonks.” I’ll be joined on the panel by technologists from the Electronic Frontier Foundation, the Federal Trade Commission and the National Institute of Standards and Technology. If you can’t attend in person, don’t worry—we’ll be uploading a video of the event later in the day on our Public Policy blog and you’ll also be able to see it on the Google Privacy Channel on YouTube.

On this Data Privacy Day, a major focus for Google is on creating ways for people to manage and protect their data. We’ve built tools like the Google Dashboard, the Ads Preferences Manager and encrypted search, and we’re always working on further ideas for providing transparency, control and security to empower our users. For example, earlier this week we launched an extension for Chrome users called Keep My Opt-Outs, which enables you to opt out permanently from ad tracking cookies. And pretty soon we’ll be extending the availability of 2-step verification, an advanced account security solution that is now helping protect more than 1,000 new accounts a day from common problems like phishing and password compromise. Right now it’s available to Google Apps Accounts; we’ll be offering it to all users in the next few weeks.

Data Privacy Day 2011 reminds us that as industry and society are busy moving forward, we face new challenges that together we can tackle through conversation and innovation. We’re eager to be part of the solution.

(Cross-posted on the Public Policy and European Public Policy Blogs)

In May we announced that we had mistakenly collected unencrypted WiFi payload data (information sent over networks) using our Street View cars. We work hard at Google to earn your trust, and we’re acutely aware that we failed badly here. So we’ve spent the past several months looking at how to strengthen our internal privacy and security practices, as well as talking to external regulators globally about possible improvements to our policies. Here’s a summary of the changes we’re now making.
  • First, people: we have appointed Alma Whitten as our director of privacy across both engineering and product management. Her focus will be to ensure that we build effective privacy controls into our products and internal practices. Alma is an internationally recognized expert in the computer science field of privacy and security. She has been our engineering lead on privacy for the last two years, and we will significantly increase the number of engineers and product managers working with her in this new role.

  • Second, training: All our employees already receive orientation training on Google’s privacy principles and are required to sign Google’s Code of Conduct, which includes sections on privacy and the protection of user data. However, to ensure we do an even better job, we’re enhancing our core training for engineers and other important groups (such as product management and legal) with a particular focus on the responsible collection, use and handling of data. In addition, starting in December, all our employees will also be required to undertake a new information security awareness program, which will include clear guidance on both security and privacy.

  • Third, compliance: While we’ve made important changes to our internal compliance procedures in the last few years, we need to make further changes to reflect the fact that we are now a larger company. So we’re adding a new process to our existing review system, in which every engineering project leader will be required to maintain a privacy design document for each initiative they are working on. This document will record how user data is handled and will be reviewed regularly by managers, as well as by an independent internal audit team.
We believe these changes will significantly improve our internal practices (though no system can of course entirely eliminate human error), and we look forward to seeing the innovative new security and privacy features that Alma and her team develop. That said, we’ll be constantly on the lookout for additional improvements to our procedures as Google grows, and as we branch out into new fields of computer science.

Finally, I would like to take this opportunity to update one point in my May blog post. When I wrote it, no one inside Google had analyzed in detail the data we had mistakenly collected, so we did not know for sure what the disks contained. Since then a number of external regulators have inspected the data as part of their investigations (seven of which have now been concluded). It’s clear from those inspections that while most of the data is fragmentary, in some instances entire emails and URLs were captured, as well as passwords. We want to delete this data as soon as possible, and I would like to apologize again for the fact that we collected it in the first place. We are mortified by what happened, but confident that these changes to our processes and structure will significantly improve our internal privacy and security practices for the benefit of all our users.

Long, complicated and lawyerly—that's what most people think about privacy policies, and for good reason. Even taking into account that they’re legal documents, most privacy policies are still too hard to understand.

So we’re simplifying and updating Google’s privacy policies. To be clear, we aren’t changing any of our privacy practices; we want to make our policies more transparent and understandable. As a first step, we’re making two types of improvements:
  1. Most of our products and services are covered by our main Google Privacy Policy. Some, however, also have their own supplementary individual policies. Since there is a lot of repetition, we are deleting 12 of these product-specific policies. These changes are also in line with the way information is used between certain products—for example, since contacts are shared between services like Gmail, Talk, Calendar and Docs, it makes sense for those services to be governed by one privacy policy as well.
  2. We’re also simplifying our main Google Privacy Policy to make it more user-friendly by cutting down the parts that are redundant and rewriting the more legalistic bits so people can understand them more easily. For example, we’re deleting a sentence that reads, “The affiliated sites through which our services are offered may have different privacy practices and we encourage you to read their privacy policies,” since it seems obvious that sites not owned by Google might have their own privacy policies.
In addition, we’re adding:
  • More content to some of our product Help Centers so people will be able to find information about protecting their privacy more easily; and
  • A new privacy tools page to the Google Privacy Center. This will mean that our most popular privacy tools are now all in one place.
These privacy policy updates will take effect in a month, on October 3. You can see the new main Google Privacy Policy here, and if you have questions this FAQ should be helpful.

Our updated privacy policies still might not be your top choice for beach reading (I am, after all, still a lawyer), but hopefully you’ll find the improvements to be a step in the right direction.

Update July 9:
We are very pleased that the government has renewed our ICP license and we look forward to continuing to provide web search and local products to our users in China.

(original post)
Ever since we launched Google.cn, our search engine for mainland Chinese users, we have done our best to increase access to information while abiding by Chinese law. This has not always been an easy balance to strike, especially since our January announcement that we were no longer willing to censor results on Google.cn.

We currently automatically redirect everyone using Google.cn to Google.com.hk, our Hong Kong search engine. This redirect, which offers unfiltered search in simplified Chinese, has been working well for our users and for Google. However, it’s clear from conversations we have had with Chinese government officials that they find the redirect unacceptable—and that if we continue redirecting users our Internet Content Provider license will not be renewed (it’s up for renewal on June 30). Without an ICP license, we can’t operate a commercial website like Google.cn—so Google would effectively go dark in China.

That’s a prospect dreaded by many of our Chinese users, who have been vocal about their desire to keep Google.cn alive. We have therefore been looking at possible alternatives, and instead of automatically redirecting all our users, we have started taking a small percentage of them to a landing page on Google.cn that links to Google.com.hk—where users can conduct web search or continue to use Google.cn services like music and text translate, which we can provide locally without filtering. This approach ensures we stay true to our commitment not to censor our results on Google.cn and gives users access to all of our services from one page.

Over the next few days we’ll end the redirect entirely, taking all our Chinese users to our new landing page—and today we re-submitted our ICP license renewal application based on this approach.

As a company we aspire to make information available to users everywhere, including China. It’s why we have worked so hard to keep Google.cn alive, as well as to continue our research and development work in China. This new approach is consistent with our commitment not to self censor and, we believe, with local law. We are therefore hopeful that our license will be renewed on this basis so we can continue to offer our Chinese users services via Google.cn.

Update June 9, 2010: 

When we announced three weeks ago that we had mistakenly included code in our software that collected samples of payload data from WiFi networks, we said we would ask a third party to review the software at issue, how it worked, and what data it gathered. That report, by the security consulting firm Stroz Friedberg, is now complete and was sent to the interested data protection authorities today. In short, it confirms that Google did indeed collect and store payload data from unencrypted WiFi networks, but not from networks that were encrypted. You can read the report here. We are continuing to work with the relevant authorities to respond to their questions and concerns.

Update May 17, 2010:

On Friday May 14 the Irish Data Protection Authority asked us to delete the payload data we collected in error in Ireland. We can confirm that all data identified as being from Ireland was deleted over the weekend in the presence of an independent third party. We are reaching out to Data Protection Authorities in the other relevant countries about how to dispose of the remaining data as quickly as possible.


You can read the letter from the independent third party, confirming deletion, here.


[original post]
Nine days ago the data protection authority (DPA) in Hamburg, Germany asked to audit the WiFi data that our Street View cars collect for use in location-based products like Google Maps for mobile, which enables people to find local restaurants or get directions. The DPA’s request prompted us to re-examine everything we have been collecting, and during our review we discovered that a statement made in a blog post on April 27 was incorrect.

In that blog post, and in a technical note sent to data protection authorities the same day, we said that while Google did collect publicly broadcast SSID information (the WiFi network name) and MAC addresses (the unique number given to a device like a WiFi router) using Street View cars, we did not collect payload data (information sent over the network). But it’s now clear that we have been mistakenly collecting samples of payload data from open (i.e. non-password-protected) WiFi networks, even though we never used that data in any Google products.

In practice, we typically collected only fragments of payload data: our cars were on the move; someone would have needed to be using the network as a car passed by; and our in-car WiFi equipment automatically changed channels roughly five times a second. In addition, we did not collect information traveling over secure, password-protected WiFi networks.
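To see why only fragments would be captured, consider a rough back-of-envelope estimate. The sketch below is purely illustrative: the car speed, radio range, and channel count are assumptions for the sake of arithmetic, not figures from the incident report; only the five-hops-per-second rate comes from the text above.

```python
# Illustrative estimate of how little airtime a passing car could sample
# from any single open WiFi network. All numbers except the channel-hop
# rate are assumptions, not figures from the report.

def seconds_in_range(speed_mph: float, range_m: float) -> float:
    """Time a car moving at speed_mph spends within range_m of an access point."""
    speed_m_per_s = speed_mph * 1609.344 / 3600  # mph -> m/s
    return (2 * range_m) / speed_m_per_s         # diameter of coverage / speed

def capture_window(speed_mph: float = 25, range_m: float = 50,
                   channels: int = 11) -> float:
    """Seconds actually spent listening on one network's channel.

    The radio hops across channels roughly five times a second, so on
    average only ~1/channels of the in-range time is spent on the channel
    a given network uses.
    """
    return seconds_in_range(speed_mph, range_m) / channels

print(round(capture_window(), 2))
```

Under these assumed numbers, the car listens on any one network's channel for well under a second of its drive-by, which is why only fragments of traffic could have been recorded.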

So how did this happen? Quite simply, it was a mistake. In 2006 an engineer working on an experimental WiFi project wrote a piece of code that sampled all categories of publicly broadcast WiFi data. A year later, when our mobile team started a project to collect basic WiFi network data like SSID information and MAC addresses using Google’s Street View cars, they included that code in their software—although the project leaders did not want, and had no intention of using, payload data.

As soon as we became aware of this problem, we grounded our Street View cars and segregated the data on our network, which we then disconnected to make it inaccessible. We want to delete this data as soon as possible, and are currently reaching out to regulators in the relevant countries about how to quickly dispose of it.

Maintaining people’s trust is crucial to everything we do, and in this case we fell short. So we will be:
  • Asking a third party to review the software at issue, how it worked and what data it gathered, as well as to confirm that we deleted the data appropriately; and
  • Internally reviewing our procedures to ensure that our controls are sufficiently robust to address these kinds of problems in the future.
In addition, given the concerns raised, we have decided that it’s best to stop our Street View cars collecting WiFi network data entirely.

This incident highlights just how publicly accessible open, non-password-protected WiFi networks are today. Earlier this year, we encrypted Gmail for all our users, and next week we will start offering an encrypted version of Google Search. For other services users can check that pages are encrypted by looking to see whether the URL begins with “https”, rather than just “http”; browsers will generally show a lock icon when the connection is secure. For more information about how to password-protect your network, read this.
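The scheme check described above is simple enough to automate. As a minimal sketch (the example URLs are illustrative), a URL's scheme tells you whether the page is served over an encrypted connection:

```python
from urllib.parse import urlparse

def is_encrypted(url: str) -> bool:
    """Return True when the URL uses HTTPS, i.e. the connection is encrypted."""
    return urlparse(url).scheme == "https"

print(is_encrypted("https://mail.google.com/"))  # True
print(is_encrypted("http://example.com/"))       # False
```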

The engineering team at Google works hard to earn your trust—and we are acutely aware that we failed badly here. We are profoundly sorry for this error and are determined to learn all the lessons we can from our mistake.

Six months ago, we launched the Google Dashboard to help you view and control information stored in your Google Account. It’s organized according to the products you use (like Gmail, Docs or YouTube), listing data stored in your account and providing direct links to control your personal settings.

Since we’re celebrating our very first half-birthday, we thought it was the ideal time to update you on how things are going. On average, around 100,000 unique visitors a day check out their Dashboard, 85 percent for the first time. Since launch, we’ve worked to grow Dashboard, adding a number of other Google products including Sites, Maps, Books, Webmaster Tools, Buzz, Goggles, Sidewiki and Analytics. We’re still working on adding other products to the tool and are talking with users about new ways to improve the functionality moving forward.

We launched the Dashboard to provide you with greater transparency and control. We’re proud of its success so far and look forward to what’s next. If you haven’t looked at your own Dashboard yet, check it out!



Article 19 of the Universal Declaration of Human Rights states that "everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers." Written in 1948, the principle applies aptly to today's Internet -- one of the most important means of free expression in the world. Yet government censorship of the web is growing rapidly: from the outright blocking and filtering of sites, to court orders limiting access to information and legislation forcing companies to self-censor content.

So it's no surprise that Google, like other technology and telecommunications companies, regularly receives demands from government agencies to remove content from our services. Of course many of these requests are entirely legitimate, such as requests for the removal of child pornography. We also regularly receive requests from law enforcement agencies to hand over private user data. Again, the vast majority of these requests are valid and the information needed is for legitimate criminal investigations. However, data about these activities historically has not been broadly available. We believe that greater transparency will lead to less censorship.

We are today launching a new Government Requests tool to give people information about the requests for user data or content removal we receive from government agencies around the world. For this launch, we are using data from July through December 2009, and we plan to update the data in six-month increments. Read this post to learn more about our principles surrounding free expression and controversial content on the web.

We already try to be as transparent as legally possible with respect to requests. Whenever we can, we notify users about requests that may affect them personally. If we remove content in search results, we display a message to users. The numbers we are sharing today take this transparency a step further and reflect the total number of requests we have received broken down by jurisdiction. We are also sharing the number of these content removal requests that we do not comply with, and while we cannot yet provide more detail about our compliance with user data requests in a useful way, we intend to do so in the future.

As part of our commitment to the Global Network Initiative, we have already agreed to principles and practices that govern privacy and free expression. In the spirit of these principles, we hope this tool will shine some light on the scale and scope of government requests for censorship and data around the globe. We also hope that this is just the first step toward increased transparency about these actions across the technology and communications industries.

The year was 1986. A gallon of gas cost 89 cents, Paul Simon’s Graceland won the Grammy for album of the year, and the federal Electronic Communications Privacy Act (ECPA), which governs how law enforcement can access electronic data, was signed into law.

A lot has changed since 1986. Gas is now measured in dollars and Taylor Swift (born 1989) won album of the year. All the while, technology has moved at record pace. But ECPA has stayed the same. Originally designed to protect us from unwarranted government intrusion while ensuring that law enforcement had the tools necessary to protect public safety, it was written long before most people had heard of email, cell phones or the “cloud” — the term used for programs helping people store personal data like photos and documents online. As a result, ECPA has become outdated.

This is why we’re proud to help establish Digital Due Process, a coalition of technology companies, civil rights organizations and academics seeking to update ECPA to provide privacy protections to new and emerging technologies.

Specifically, we want to modernize ECPA in four ways:
  • Better protect your data stored online: The government must first get a search warrant before obtaining any private communications or documents stored online;
  • Better protect your location privacy: The government must first get a search warrant before it can track the location of your cell phone or other mobile communications device;
  • Better protect against monitoring of when and with whom you communicate: The government must demonstrate to a court that the data it seeks is relevant and material to a criminal investigation before monitoring when and with whom you communicate using email, instant messaging, text messaging, the telephone, etc.; and
  • Better protect against bulk data requests: The government must demonstrate to a court that the information it seeks is needed for a criminal investigation before it can obtain data about an entire class of users.
We also created this video to help explain ECPA and why it needs updating:



You can read more about our proposal at our coalition website. In the coming months, we’ll meet with lawmakers, law enforcement officials and others to help build support for modernizing the law.

1986 was a good year, but it’s time our laws catch up with how we live our lives today.

Thursday, January 28th marks International Data Privacy Day. We're recognizing the day by publishing our guiding Privacy Principles:
  • Use information to provide our users with valuable products and services.
  • Develop products that reflect strong privacy standards and practices.
  • Make the collection of personal information transparent.
  • Give users meaningful choices to protect their privacy.
  • Be a responsible steward of the information we hold.


We've always operated with these principles in mind. Now, we're just putting them in writing so you have a better understanding of how we think about these issues from a product perspective. Like our design and software guidelines, these privacy principles are designed to guide the decisions we make when we create new technologies. They are one of the key reasons our engineers have worked on new privacy-enhancing initiatives and features like the Google Dashboard, the Ads Preferences Manager and the Data Liberation Front. And there is more in store for 2010.

You can find out more about our efforts at the Google Privacy Center and on our YouTube channel.

Like many other well-known organizations, we face cyber attacks of varying degrees on a regular basis. In mid-December, we detected a highly sophisticated and targeted attack on our corporate infrastructure originating from China that resulted in the theft of intellectual property from Google. However, it soon became clear that what at first appeared to be solely a security incident--albeit a significant one--was something quite different.

First, this attack was not just on Google. As part of our investigation we have discovered that at least twenty other large companies from a wide range of businesses--including the Internet, finance, technology, media and chemical sectors--have been similarly targeted. We are currently in the process of notifying those companies, and we are also working with the relevant U.S. authorities.

Second, we have evidence to suggest that a primary goal of the attackers was accessing the Gmail accounts of Chinese human rights activists. Based on our investigation to date we believe their attack did not achieve that objective. Only two Gmail accounts appear to have been accessed, and that activity was limited to account information (such as the date the account was created) and subject line, rather than the content of emails themselves.

Third, as part of this investigation but independent of the attack on Google, we have discovered that the accounts of dozens of U.S.-, China- and Europe-based Gmail users who are advocates of human rights in China appear to have been routinely accessed by third parties. These accounts have not been accessed through any security breach at Google, but most likely via phishing scams or malware placed on the users' computers.

We have already used information gained from this attack to make infrastructure and architectural improvements that enhance security for Google and for our users. In terms of individual users, we would advise people to deploy reputable anti-virus and anti-spyware programs on their computers, to install patches for their operating systems and to update their web browsers. Always be cautious when clicking on links appearing in instant messages and emails, or when asked to share personal information like passwords online. You can read more here about our cyber-security recommendations. People wanting to learn more about these kinds of attacks can read this Report to Congress (PDF) by the U.S.-China Economic and Security Review Commission (see p. 163-), as well as a related analysis (PDF) prepared for the Commission, Nart Villeneuve's blog and this presentation on the GhostNet spying incident.

We have taken the unusual step of sharing information about these attacks with a broad audience not just because of the security and human rights implications of what we have unearthed, but also because this information goes to the heart of a much bigger global debate about freedom of speech. In the last two decades, China's economic reform programs and its citizens' entrepreneurial flair have lifted hundreds of millions of Chinese people out of poverty. Indeed, this great nation is at the heart of much economic progress and development in the world today.

We launched Google.cn in January 2006 in the belief that the benefits of increased access to information for people in China and a more open Internet outweighed our discomfort in agreeing to censor some results. At the time we made clear that "we will carefully monitor conditions in China, including new laws and other restrictions on our services. If we determine that we are unable to achieve the objectives outlined we will not hesitate to reconsider our approach to China."

These attacks and the surveillance they have uncovered--combined with the attempts over the past year to further limit free speech on the web--have led us to conclude that we should review the feasibility of our business operations in China. We have decided we are no longer willing to continue censoring our results on Google.cn, and so over the next few weeks we will be discussing with the Chinese government the basis on which we could operate an unfiltered search engine within the law, if at all. We recognize that this may well mean having to shut down Google.cn, and potentially our offices in China.

The decision to review our business operations in China has been incredibly hard, and we know that it will have potentially far-reaching consequences. We want to make clear that this move was driven by our executives in the United States, without the knowledge or involvement of our employees in China who have worked incredibly hard to make Google.cn the success it is today. We are committed to working responsibly to resolve the very difficult issues raised.

Update: Added a link to another referenced report in paragraph 5.

Today, we are excited to announce the launch of Google Dashboard. Have you ever wondered what data is stored with your Google Account? The Google Dashboard offers a simple view into the data associated with your account — easily and concisely in one location.

Over the past 11 years, Google has focused on building innovative products for our users. Today, with hundreds of millions of people using those products around the world, we are very aware of the trust that you have placed in us, and our responsibility to protect your privacy and data. In the past, we've taken numerous steps in this area, investing in educating our users with our Privacy Center, making it easier to move data in and out of Google with our Data Liberation Front, and allowing you to control the ads you see with interest-based advertising. Transparency, choice and control have become a key part of Google's philosophy, and today, we're happy to announce that we're doing even more.

In an effort to provide you with greater transparency and control over your own data, we've built the Google Dashboard. Designed to be simple and useful, the Dashboard summarizes data for each product that you use (when signed in to your account) and provides you direct links to control your personal settings. Today, the Dashboard covers more than 20 products and services, including Gmail, Calendar, Docs, Web History, Orkut, YouTube, Picasa, Talk, Reader, Alerts, Latitude and many more. The scale and level of detail of the Dashboard is unprecedented, and we're delighted to be the first Internet company to offer this — and we hope it will become the standard. Watch this quick video to learn more and then try it out for yourself at www.google.com/dashboard.



Your friends and contacts are a key part of your life online. Most people on the web today make social connections and publish web content in many different ways, including blogs, status updates and tweets. This translates to a public social web of content that has special relevance to each person. Unfortunately, that information isn't always very easy to find in one simple place. That's why today we're rolling out a new experiment on Google Labs called Google Social Search that helps you find more relevant public content from your broader social circle. It should be available for everyone to try by the end of the day, so be sure to check back.

A lot of people write about New York, so if I do a search for [new york] on Google, my best friend's New York blog probably isn't going to show up on the first page of my results. Probably what I'll find are some well-known and official sites. We've taken steps to improve the relevance of our search results with personalization, but today's launch takes that one step further. With Social Search, Google finds relevant public content from your friends and contacts and highlights it for you at the bottom of your search results. When I do a simple query for [new york], Google Social Search includes my friend's blog on the results page under the heading "Results from people in your social circle for New York." I can also filter my results to see only content from my social circle by clicking "Show options" on the results page and clicking "Social." Check out this video for a demo:

All the information that appears as part of Google Social Search is published publicly on the web — you can find it without Social Search if you really want to. What we've done is surface that content together in one single place to make your results more relevant. The way we do it is by building a social circle of your friends and contacts using the connections linked from your public Google profile, such as the people you're following on Twitter or FriendFeed. The results are specific to you, so you need to be signed in to your Google Account to use Social Search. If you use Gmail, we'll also include your chat buddies and contacts in your friends, family, and coworkers groups. And if you use Google Reader, we'll include some websites from your subscriptions as part of your social search results. To learn more about how Social Search works behind the scenes, including the choices and control you have over the content you see and share, read our help center article or watch this video.

This feature is an experiment, but we've been using it at Google and the results have been exciting. We'd love to hear your feedback. Oh, and don't forget to create a public Google profile to expand your social circle and more easily find the information you're looking for (including that New York blog).
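The social-circle idea described above can be sketched in a few lines. This is a hypothetical illustration of the concept, not Google's actual implementation: the function names, data structures, and sample authors are all invented for the example. It merges the connection sources the post mentions (public profile links, Gmail chat buddies, and the friends/family/coworkers contact groups) into one set, then keeps only the results authored by someone in that set.

```python
# Hypothetical sketch of the Social Search concept: merge several
# connection sources into a "social circle", then surface results
# authored by people in it. Names and structures are illustrative only.

def build_social_circle(profile_links, chat_buddies, contact_groups):
    """Union of connections from a public profile, chat, and contact groups."""
    circle = set(profile_links)           # e.g. Twitter/FriendFeed connections
    circle.update(chat_buddies)           # Gmail chat buddies
    for group in ("friends", "family", "coworkers"):
        circle.update(contact_groups.get(group, []))
    return circle

def social_results(results, circle):
    """Keep only results written by someone in the social circle."""
    return [r for r in results if r["author"] in circle]

circle = build_social_circle(
    profile_links={"alice"},
    chat_buddies={"bob"},
    contact_groups={"friends": ["carol"]},
)
results = [
    {"title": "Official NYC guide", "author": "nycgov"},
    {"title": "My New York blog", "author": "alice"},
]
print(social_results(results, circle))
```

In the real product these highlighted results appear in a separate section at the bottom of the page rather than replacing the main results, but the filtering principle is the same.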