Google Factory Tour of Search


Uploaded by Google on 20.05.2008

Transcript:

MALE SPEAKER: Ladies and gentlemen, good morning.
Would you please welcome Vice President of Search Products
and User Experience, Marissa Mayer.

MARISSA MAYER: Good morning.
And welcome to Google's factory tour of Search.
I'm really excited to have all of you here today because we
wanted to give you an inside glance at what
we're doing in Search.
Search today is broader than it's ever been before.
It's more relevant than it's ever been before.
And it's more innovative.
So I've asked members of my team today to share with you
what some of the most interesting things they're
working on are.
First up we have RJ Pittman.
RJ is our Director of Search Properties.
RJ is a serial entrepreneur who came to us last fall.
You may know him from his work at Groxis, where he developed
the search technology Grokker.
And as an expert in search interface design, he heads
twelve of our search properties, including news,
image search, trends, finance, and more.
So we're going to hear what is really exciting on those
properties from RJ this morning.
Welcome RJ.

RJ PITTMAN: Thanks, Marissa.
Good to see you.
Good morning.
Well there's a lot of things that we're excited to talk
about today, so let's get to it.
As Marissa mentioned, I am the manager of our search
properties.
And there are a lot of them.
And the reason for that is pretty simple.
Users today are using and working with content in a wide
variety of ways.
Sure, a lot of people are doing web searches.
That's kind of a given with Google.
But beyond web search, people are analyzing stock charts,
reading news articles, browsing through
pictures on the web.
You get the idea.
And it's my job at Google to try and make sure that all of
our users are directed to the best user experience possible.
So it's not enough to just provide a great content
experience, but it's also important that we manage the
influx, the overload of information, that our users
are being hit with today.
It's no surprise that the world is getting
more and more digital.
I used to say that these generations of kids coming up
today are born digital.
Well today, content is born digital.
And with that fact, we're seeing just tremendous scale
and tremendous growth of content online.
So managing that is a really big challenge.
Another part of my responsibility is managing the
user experience and providing a seamless experience from one
property to the next, whether it's researching a company on
the stock market or getting information on the fate of our
polar bears.
These are very different experiences.
And this is why search properties are so important.
Today I'd like to talk about three of our search properties
and share with you some of the interesting innovations were
working on to help our users navigate this growing
information, this sea of landscape of content that's
all around us.
Image search is one of our fastest growing properties.
And if we zoom the camera lens out a little bit and look at
the industry as a whole, I think you can see why.
You might be interested to find that the market is bigger
than it's ever been before for imagery and photos.
Over 300 million photos, digital photos, are taken
every single day.
That's over a hundred billion new digital images added to
cyberspace every year.
And if you multiply that by the number of camera phones
and digital cameras that are being picked up every year,
you can see how this exponential growth gets us,
not to billions, but to potentially trillions of
images online.
That's the kind of number that I like.
I'd like to see us, one day, be the search index that has
over one trillion images to search from.
Now supposing that happens, how are we going
to make this work?
How are we're going to sort this out for our users?
Well, with this supply of all of these images that are
coming online, it's no surprise that the
demand is there too.
More and more users are turning to image search than
ever before.
You can see that just in the past two years the popularity
has grown a couple orders of magnitude.
This is pretty significant for such a short period of time.
Literally hundreds of millions of searches every single day
are being done on Google image search.
So as our users come to image search, it's important for us
to help them navigate and do interesting
things with image search.
So innovation is hugely, hugely important today.
So let's suppose that the user finds an image that they
really like on image search.
Well it should be within Googles grasp to actually help
them find more like it.
The likeness of an image is actually a really difficult
thing to figure out.
Sure you can find pictures that are of cars, or you can
find pictures that all have a nice shade of blue in them.
But to actually find a picture such as this scene of palm
trees and a beach at sunset is much more difficult.
But using our image similarity technology that we have today,
with some of the best computer researchers in computer vision
and image processing, it's one click away.

In addition to that, let's suppose now that a user wants
to find any picture that has the Eiffel Tower in it.
Now this is a different challenge because identifying
an object in a picture is different than also
identifying the similarity of a picture.
At Google, we have some of the first and the best technology
in object recognition.
And we too believe that with this technology and the
millions and billions and trillions, potentially, images
that are coming online, adding this kind of technology to
help our users find more of what they're looking for is
going to be hugely important.
Now another example, people love to search for people.
That's a fact.
Everybody does it.
And people love to look for celebrities, look for their
friends, etc.
online.
So this is where our technology really comes into
play and is particularly important.
For example, face detection is important to, in this example,
be able to disambiguate between a searcher that's
looking for the DeLorean, the car, or the man behind the
car, John DeLorean himself.
Now we'd also like to be able to take that one step further
for our users, supposing that you were looking for John
DeLorean and you'd like to see any picture that
John DeLorean is in.
Well, using our facial recognition technology, that
is really a reality.
And that's something we're working to make, again, just
one click away.
As a matter of fact, some of this technology, you may not
even realize is available in image search today.
If you go into our advanced search panel, you'll find this
capability available.
And we're working on a lot of exciting ways to make it even
more powerful.

Now in addition to the interesting things you can do
with sorting and navigating these images, we also care
about the content itself.
As you can imagine, at Google, we like things big.
And imagery is one area that we think there's a lot of
opportunity for growth.
We're all used to our digital cameras snapping two
megapixel, five megapixel, and now ten megapixel photos.
But suppose you had images that were 1,000 megapixels or
a gigapixel.
This is where the new frontier is.
And at Google were working on a number of interesting
projects, one that you may have heard of, called the
gigapan project, where we're very involved in making images
available at up to three gigapixels.
Using advanced image feeds we're able to get the best of
these high quality images, whether it's a gigapixel or a
mega pixel from the best content publishers, and
provide a rich image search experience.
This is fairly new for us and it's a way to really enrich
the user experience around images.

In addition to that, you will probably notice that your
camera phone, particularly if it's GPS enabled, is able to
in code your photo, not just with all of the information
about the exposure, the aperture, the f stop of that
picture, the mega pixels, what the resolution is, but it can
also include a with a geo location of precisely where
that camera phone was when you snapped the picture.
Now this gets very interesting when you want to start
searching pictures from a totally different dimension.
And when Carter comes up and talk to us about geotagging,
and geospatial imagery, and search in geospatial images,
you get a sense of exactly where we're headed with this.
And finally, talking about all of these innovations is great,
but what we really want to do is surfeit
these for our users.
And we'll be talking later about when and where you're
going to see some of these new technologies come to light.
Now many users of Google do things beyond search.
It can be as quirky as using Google for spell check.
It can be as quirky as using Google to use it as a
calculator.
And why are people doing that?
Well because Google is so fast and it's so easy, and you can
just type in your computation, you can type in the word that
you're not sure of how it's spelled, and it gives it right
back to you.
Well similarly, we're seeing the same behavior emerge on
image search, whether it's shopping and looking for
inspiration for preparing a meal or a fine dessert,
planning the next great vacation, managing your
family's health, and perhaps most importantly, most fun, is
shopping for a birthday present.
As the use cases for image search broaden, so too does
the relevance of commerce related activities.
This is exactly what we saw in web search several years ago.
There's a direct connection between
these uses and commerce.
Now we've experimented with this before on image search.
And we have looked at ways to create a better user
experience, not only for our users but also for our
advertisers.
And they haven't always been successful.
And this is important to note because advertising at Google
is driven by user experience.
And the user always comes first. So we're not ashamed to
go back to the drawing board and continue to work on this
until we reach a point where we are satisfied and our users
indicate to us that an advertising experience has
come together.
Now what we're announcing today is a new suite of image
related experiments.
And here we're pairing images with images for the very first
time, or display ads with image search.
And we believe that if we can align the nature of the images
from the advertisers with the nature of the images in image
search, we can really help our users find more of what
they're looking for.
And frankly, we may also help them achieve some of these new
use cases that they're conducting on image search,
such as shopping and recipe hunting.
So you see that these renderings here show a fairly
seamless and more pleasing user experience than just
running a stack of ads down the side or across the top.
We are really looking for a true
user experience alignment.
And we've kind of overstated this for the purpose is the
camera and the a projector so that you can see clearly.
And it won't be quite this prominently.
It will be a little bit more seamlessly integrated in the
actual product.
But we're really excited to get these out there.
And most importantly, we're excited to hear what our users
think about this.
And you can be sure that we'll keep you
posted on how this goes.
Now I'd like to shift gears and talk about news.
As it happens, news publishers happen to be one of our
greatest sources for images.
But they are even better source for Google News.
At the core of Google News, as you all probably know, is the
story cluster.
This clustering technology is pivotal to the user experience
and our unique approach to news.
And story cluster is essentially a collection of
articles about a specific storyline over a specified
period of time.
This really is at the heart of how we think about news.
Now if you pair that with our powerful search technology, we
have some pretty amazing capabilities.
We can sort, organize, cluster, classify news into
section pages, and localize that information for you.
I mention localize because we recently
launched local search.
And this was a pretty big effort.
And why was this an effort?
Well local information and local news is a really hard
problem to solve because it's not just about zip codes and
city names.
There is a great deal of complexity in a story, and
really taking it down into its pieces, word by word, and
really understanding what that story is about and where that
story is taking place, is a bit elusive.
So it goes beyond the zip code.
It goes beyond the city name.
We have to disambiguate everything from Paris, France,
to Paris Hilton, Paris, Michigan, and Paris, Texas.
And the way that we are approaching this is actually
quite novel.
We're leveraging the power of our story clusters.
So to be sure that a story is about a particular location or
is happening in an area near your hometown, we're able to
look across hundreds, and even thousands of stories that are
tied to that story cluster to reaffirm the approximations
that, indeed, this new story is about this town.
If we didn't have the story cluster, we would not have the
ability to be that accurate.
This is exciting stuff.
I like to see how we are reusing and leveraging some of
our earliest innovations and taking them
forward for our users.
Now you can see here in this particular example, local
news, it's not just about Mountain View, California.
We also think about proximity in a very fluid sense.
Local to me may mean Mountain View.
Local to me may mean the Bay area.
Being able to identify this, not only with the content
that's in the new story, but also by recognizing the
location of the sources of that information and
incorporating that signal into the local news experience is
critical to really getting the best experience for our users.

Another really cool feature that we've launched recently
is quote search.
I think this is a pretty good example.
Suppose you're in line and ready to vote for the next
American Idol.
It's coming up this week.
The finals are on Wednesday.
And if you wanted to get a preview and recap and do some
research before weighing in your vote, certainly one thing
I would do is go back and look at everything as Simon Cowell
said about the finalists.
And now you can do that today on Google News.
By doing a quote search for Simon Cowell, I
get all of the searches.
And then from there, I can actually search into both
Davids and find out what he thinks about each of them, and
get a sense for where this thing may turn out.
Now the other cool thing with this is we are also using the
very same clustering technology to really perfect
these quotes.
It's very difficult to actually make sure that the
Simon Cowell we're talking about is, in fact, the one
that's on American idol, and is the one that's quoting
specifically the performance of these particular
contestants.
So using the clustering technology, again, across
thousands of articles where these quotes appear, we can
reaffirm the accuracy of our engine.

So I'd like to also talk about another kind of quote.
This time it's a stock quote.
And Google Finance launched about a year and a half ago.
And Google Finance was Google Finance.
And we started with a very humble and very focused
mission to build a very powerful and simple stock
market tool, a window into the market.
And we led that introduction with a very powerful
visualization tool.
Our charting tools were recognized as some of the most
intuitive and easy to understand in the industry.
And this is important, because the goal of Google Finance is
to help all users navigate the complexities of
the financial markets.
We recently launched a new version of our home page,
bearing this in mind, because our users were asking us for
more context.
They wanted more than just stock quotes.
They wanted the rich, market driving content that is
affecting the stock markets and affecting their specific
portfolios.
So leveraging the power of news and those news clusters,
we were able to integrate this right into the Google Finance
user experience and give users a fantastic news a capability
that is not available anywhere on the web.

In addition to that, we realized that hey, we've been
innovating, and we were known for some of this great
visualization technology, we've got to
keep moving it forward.
And a very mysterious, and elusive, and, frankly, complex
feature in most online finance sites is a stock screener,
being able to pick stocks based on your criteria that
meet your investment needs and your risk preferences.
Well those tend to be pretty cumbersome airplane
dashboards.
And we basically took that apart and put it together
again in a very simple, intuitive interface.
And you can see here, with this particular layout, the
user can get a bird's eye view of virtually every stock
that's traded in the public markets.
And all of those stocks can be filtered by any criteria,
based on the size of the company, their financial
performance, their PE ratios, you name it.
And this has become hugely popular and we're really happy
with the success of this product.
I think we're setting the standard again for
demystifying some of these complexities
that we see the markets.
Another key element of who will finance, you may have
realized when we launched this product, only launched it in
the United States.
And part of the reason there was to really focus
on getting it right.
And as soon as we were sure of that, we
launched the new homepage.
We also launched something else, the Google Global
Finance platform.
This is an important innovation for us, because we
realize that in our mission to organize all the world's
information, this has to hold true for finance as well.
And as we launched the new product, I'm excited to say
that we were able to simultaneously launch in four
countries, not just the US, but China, the UK, and Canada,
with many more on deck.
So in many ways, we're just getting started with Google
Finance, and the world is just getting a taste
of what's to come.

Now no factory tour would be complete without a stroll
through the Google factory floor.
And for that, I would like to invite up our product manager
from Google Labs, Michael Cohen, to share with us some
of the exciting things happening there.
MICHAEL COHEN: Thanks RJ.
RJ PITTMAN: Hey Michael.
How's it going?
Good to have you.
So Michael, tell us a little bit about labs.
I've been looking at labs for a long time, and sort of
staring at it in hopes that we're going to see some great
new innovations coming out of Google.
And in the old days, it used to be like sort of a new thing
every month coming out of labs.
And lately, it seems like there's a big pile of stuff.
It's like a parking lot.
What was happening?
MICHAEL COHEN: Sure.
I really wanted to come up here and
talk about lab strategy.
I didn't realize that we'd have all
these people observing.
But this is as good a time as any, I guess.
RJ PITTMAN: Absolutely.
I'm going to put you on the spot.
We can have a product strategy discussion here.
I'm sure they won't mind.
Love to hear what you have in store.
MICHAEL COHEN: Sure.
Let's go.
So labs is our chance to showcase some of our most
innovative and creative experiments.
And we realize that they're not all going to be home runs.
We've had some pretty high profile graduates, if you
think about things like Maps and iGoogle.
We realize we also have our share of fifth year seniors on
the list as well.
But that's exactly our point with labs.
We're trying to be innovative and experiment a lot.
I want to talk about two particular examples,
Experimental Search and Google Trends, which are pointing out
the directions we're going to be going with Labs.
Experimental Search, which we launched last year, is the
first in a line of lab sections dedicated to our
existing products.
What this lets you do is actually bolt on experiments
to your existing Google search, so that when you
conduct searches on Google, these experiments will follow
you along and you won't have to visit specific labs pages.
For example, if you were to join this alternate search
view experiment, every time you did a search on Google,
you would have the option of seeing a timeline or a map
view for your search results.
We've got a lot more experiments of this form
coming through the pipeline.

While our existing products are doing some great
innovation, the heart and soul of labs is really around our
new product development.
And Google Trends is a great example.
And it's really still developing.
Our vision behind Trends is to allow users to understand
consumer and market trends, by seeing what people are
searching for in Google.
You can see from this chart--
the top chart shows our search volume for different queries
on Google, while the bottom chart shows how these
different queries showed up in the news.
This particular example is probably the ultimate seasonal
effect, looking at shorts and sweaters.

In this example, we're actually looking at
superheroes, Spiderman, Batman, Superman, seeing how
query volume for them develops.
You can actually see that the spikes might very well
correlate to different movies that are out on these
different superheroes.
You may even wonder if the relative height of the spikes
suggest how successful the different movies have been.

RJ PITTMAN: Now, Michael, let me stop you
right there for a minute.
That's kind of interesting.
So it seems to me that with these types of trends, they
can be correlated to interesting economic
indicators.
Might we see something like this in Google Finance?
I could almost imagine sort of a trends overlay with the
stock chart that might show help the user see some of
these correlations as they as they map back.
MICHAEL COHEN: RJ, that's a really cool idea.
I don't really want to still the beans or anything.
But we're always thinking about interesting ways of
applying this technology to our existing products.
In this particular example, which is very timely, you can
see our different presidential candidate and how search
queries for them are varied over time.
One particular thing you might note is, as candidates left
the race, you can see how queries for them sort of fell
off the map.
Also if you, say, had a theory that search queries were
predictive of success in elections, you might also be
interested in mining this data to see how it varied across
geographies.
So in summary, labs is really at the core of
innovation at Google.
We have a lot of great stuff in the pipeline, coming
through for labs.
While I can't guarantee that every single one of them are
going to be home runs, I can certainly guarantee there's
going to be some really exciting stuff so you better
come back and check it out.
RJ PITTMAN: Thanks Michael.
MICHAEL COHEN: Sure, RJ.
RJ PITTMAN: Appreciate it.

Well just to restate what Michael said, innovation is
absolutely at the core of Google.
And the engineering power at the company
has never been stronger.
And I'm excited to see what Michael has in store for us
and I think Labs is going to be an exciting place to watch
over the next year as we continue to renovate the
factory and bring out some of these
innovations in great form.
So that's it for my segment of the tour.
I'd like to invite Marissa back up onto the stage.
But before I do, I'd also invite you guys to stop by our
little station and round table at the lunch break, where
we're actually going to be able to give you a sneak peek
into some of these new innovations that we're working
on here at Google.
Thank you.

MARISSA MAYER: Thanks RJ.
One of our big home runs from Labs was Google Maps, and has
continued to be Google Maps.
And Carter Maslan, who you'll hear from next is the Director
of Local Search Quality.
So he owns a big part of Google Maps.
How do we find things, literally find things, in
physical space Better His span of responsibility includes
local search quality, our maps wiki projects, and more.
So with that, welcome Carter Maslan.

CARTER MASLAN: So good morning.
My name is Carter Maslan.
And I'm just thrilled to be here to lead a
tour of local search.
And joining me on stage is Dana Nguyen.
She's going to be our maestro on the demo keyboard.
And as much as I'm happy to be here, if there were a real
factory tour, actually seeing engineers that are doing the
heavy lifting and pushing the bits, we'd have to take around
the world trip.
So instead of that, you'll see them in inset pictures, as we
talk about each of the challenges that were
addressing in local search.
So RJ mentioned the fact that he's optimizing search for
different content types, like images, news, and finance.
And for local search, we do something similar, for things
like addresses and business listings.
But local search as an added challenge, because local
search is really a geographic lens onto any content type
that has some correlation to place.
And in the process of looking through that geographic lens,
we are fulfilling kind of our ambition to be an organized
collection of all the world's information, just organized
geographically, so truly, building a map that contains
the world's information.
So what we're going to do on the tour
is cover three things.
One, we're going to find out why it's hard.
Two, we'll take a look at the state of the art in addressing
some of the complications.
And then three, we're going to go through some innovative I
work that's springing up on top of the work to date.
And by the end, I hope you have some new insights in
local search, but also kind of share my secret delight that
every time I imagine this room full of hardworking engineers
working away to fulfill my casual query of pizza SF.
So let's get started by just asking why it's hard.
If your ambition is to provide the best information about any
place on the planet, the first challenge is how do you even
identify and describe a place?
So just to go to the basics, let's start with the basic
example that's from my hometown.
These are two of a handful of engineers that work on one of
the most computationally complex
problems in local search.
And that's clustering, and clustering it
with related web pages.
So Julian Bosch and [? Natan ?], they work on
those two problems, as applied to this scene.
Here's a typical scene where we've got a playground in the
upper left, we've got a soccer field,
we've got a golf center.
And if you're trying to have all the information that best
describes these places, you may start by saying, well
let's start by trying to identify it by street address.
But unfortunately, they all share the same street address
in the entrance.
Then you say, well how about, maybe phone number will
provide a clue.
But both the golf center and the restaurant within the golf
center both share the same phone number.
You go on and try hint after hint, whether it's geocode or
other category hints, title hits.
And it becomes this problem where you've got lots of good
hints, but none of which are precise enough to actually
uniquely identify the place you are talking about.
The second problem is whose reality is it?
And reality, believe it or not, is a
question in a lot of cases.
If you take a map of this region of the world, and you
say, let me just find a place on the planet.
You think that that's a pretty objective question.
But in fact, when you start adding in disputed political
boundaries and borders, and you had in a variety of
alternate names and alternate languages that all vary by the
country of the user that's issuing the query, it becomes
more and more complex.
So we have Alan and Andrew that are working on searching
the planet for things and taking into account where the
person is when they issued the search, what country, what
language, and how to deal with all of these different
complexities in interpreting reality.

Kind of as a third problem, a lot of times, people's
questions have different answers.
Like if someone were to say, from a local search
perspective, find New York, New York.
You'd think that that is kind of completely unambiguous,
that they're talking about New York, New York.
And we do in fact return New York, New York.
But if they are over Las Vegas, they may be looking for
the hotel called New York, New York.
And so we have Vlad, in the upper right there, who's
working on interpreting queries based on the view port
from which they issue that local query.

Kind of the fourth challenge is that local search has an
extremely long tail.
If you take a query for something like rope swing,
there's one kind of long tail that that represents.
But then if you slice it by all the places in the world
where there might be a rope swing, it's a
very long, thin tail.
So one of the biggest challenges is addressing a
personal challenge I have, driving three hours to a rural
part of Maine that's absolutely beautiful, wanting
to find a rope swing.
How many people can actually tell me the precise location.
The green marker represents one of the best rope swings on
one of the best lily ponds of Deer Isle, Maine.
There's not a lot of people.
So we've got two people here, Steve Stern and Ramesh, who
are working on two aspects of this long tail development of
local search content.
Steve works on the instant indexing, so that you've got
hundreds of millions of places being viewed by millions of
people and having the updates go live immediately.
That's a hard technical challenge that he's
addressing.
And Ramesh is working on the whole user experience of
editing and modifying places on the map.
And one thing that might be fun is to go to a demo of
maps.google.com recent edits, just as a sampling.
Maybe Dana, you could go to recent edits.
Can you go to the demo screen, please?
Thank you.
So every eight seconds or so, this is just cycling through a
random sample of the modifications that end users
are making to make the maps more accurate.
It's a fun app to watch the world improving in its
accuracy, as we speak.
Maybe we can go back to slides, please.

So as you work through these problems in identifying a
place, clustering, the long tail of specialized queries,
the interpretations that vary by geography, you quickly
realize that perfect local search requires a near perfect
mirror world, where you've got everything replicated in a
virtual place that is searchable.

That brings us to kind of one of the main state of the art
challenges, which is creating this canvas for people to
annotate on planet.
How do we create that base substrate that a lot of
content can grow up on and be situated in.
So it starts with satellite collection.
So this is the digital globe satellite that creates high
res imagery.
This is the SPOT-5, from France, that creates medium
resolution that covers much of the planet.
And this is the a high res aerial imagery over a typical
spot that you might need the detail to know exactly where
the equipment goes in a baseball field, or where the
rendezvous point is for a swim meet.
But in addition to those, I thought we'd bring the imagery
collection in the context of search.
How does a global blanket of good imagery relate to search?
So what I thought we do is assume we're in San Francisco,
if we can go to the demo.

And let's say that we're at the Moscone Convention Center
and we'd like to go to the Giants game.
We can simply say from Moscone to Giants.
And Search helps disambiguate what we meant.
Yes, we meant Moscone Center for the starting point, and
yes, we meant the San Fransisco Giants stadium for
the ending point.
We can jump to step four in the driving directions and
actually get an up close view of the sequence.
We click on the arrow, and it will take you turn by turn
within street view, blending the search with the imagery
that lets you see where you're navigating.
So great imagery collection there, but even street view is
not where we wanted to stop.
We also want to look at 3D space.
You've got multistory downtown buildings.
Let's go to the Embarcadero Center in San
Francisco as well.

Here, as we zoom in, you've got photo realistic 3D
buildings being streamed.
You've got literally terabytes of data that are being
accessible to people that are--
go ahead and pan and tilt, just to show the 3D photo
realistic buildings.
That level of detail is what we need to have people
precisely annotate the planet with all kinds of local
information you may want to find.

So if we go back to the slides, please.

So this is a summary of what it entails to create a base
canvas for the planet for local search.
We've got everything from the high and medium res satellite
collection, to aerial imagery, to street view imagery.
And as impressive an effort it is to coordinate all this and
bring it together, I think our engineers are most impressed
by what users are doing on top of it.
So now that we have the base canvas, how do we enable
people to annotate that canvas, paint, and tell
stories on that canvas?
So that brings us to annotating the planet, how the
geoweb is expanding.
If you picture that you've got 350 million unique user
activations, more than that, unique users coming to Maps
and Earth, doing searches, local searches, more than a
thousand lifetimes spent looking at the planet, it's
amazing what accomplishments arise from that, one of which
is one of the most unbelievably comprehensive
inventories of Bigfoot sightings in North America,
complete with an animated timeline, that apparently
shows a peak in the 70s.
Or people that, interestingly, are finding undiscovered Roman
villas in their backyard--
this fellow was looking at satellite imagery, and found,
literally, a buried Roman villa in his hometown.
But then they range to useful things like a transit map for
Santiago, or a road map for Buenos Aires, or an ecotourism
guide to Costa Rica, or a complete 3D model of the big
Buddha on Lantau Island in Hong Kong, or an exquisitely
detailed 3D model of Berlin city.
But then there's also mundane things.
If you're having a soccer meet, and you want people to
actually know how to get to the field, we have an editable
maps and editable categories that are a new introduction in
to local search.
So we have Doug and Matt that are working on two parts of
the problem.
Doug's working on the problem of having a flexible
categorization system that represent the long tail of
businesses.
Matt's working on the problem of local business center, of
enabling people to get on the map instantaneously and
classify and represent themselves how they'd like.
So now, things that are usually underrepresented, like
a soccer field or baseball field, can be added to the map
by any user.

There's also compelling narratives that really have
much greater strength through the context of seeing the
story in relation to the places on earth.
This is an effort from the Holocaust Memorial Museum that
really accentuated the crisis in Darfur by telling a story
with that geographic context.

Just to get an overview of how much this geoweb has been
exploding over the past year, this is the picture, a heat
map, of May 2007, of all of the annotations that people
have put on the planet for searchability.
Let's just step through the sequence over this past year
to see how that's expanded.
And you can see by this May, 2008, we've had a huge amount
of growth of people telling their stories, painting this
blank canvas, with the geographic context that makes
local search easy to understand.
But if you assume that you've got all of this information
now searchable, the next round of challenges is what's the
user experience that enables you to find things and see
them in context.
You've got hundreds of millions of places with
millions of users trying to look for those places.
What kinds of innovations do you need in the user
experience to enable that?
So I thought we would do a demo of going to the place
where some of this work is being done.
Our Zurich team works on the server side rendering of map
tiles, in response to search results that we'll
take a look at now.
So let's say we're going to go to Zurich and do a google.com
query for Hotel Zurich.
And thanks to Johanna, who you're going to hear from
later on search quality, we get a nice universal result of
a hotel listings.
But we actually want to see the geography around the
hotels, because I'm interested in staying in a rural section,
outside of Zurich.
Now this may be unique to the Swiss, but they actually have
another form of user contributed content.
He used 256 colors to represent the transit times on
their public transit system, within a 30 minute radius.
So I think Swiss may be among the few that need 256 colors
to describe 30 minutes of travel time.
But an amazing visualization based on the public transit
data of where you can stay within Zurich within a 30
minute commute--

There's this Web 2.0 misperception that user
contributed content is about social preening, or kind of
frivolous sharing, but there's actually a lot of really
useful content that's annotating the planet.
Take an example here.
We can stay in Baden.
So let's look for hotels in Baden and get transit straight
to Zurich from that hotel in Baden.
So one of the things that searches have, Dana can say
look, I want to go from Baden to Zurich, and specify it in
the search box with that kind of simplicity.
And you get not only the driving directions, but you
can also click on public transit and get an aggregated
view across three different transit agencies on what your
options are right at the moment in planning
your travel to Zurich.
So the search results for the travel route are even better
understood when they've got the context of photos from
around the place.
So if Dana can turn on the photo layer, we can see an
example of this high performance tile rendering
that's happening in the background to have millions of
photos accessible to millions of people as they pan and zoom
for their particular map.
So the kind of performance that you're seeing as Dana
drags and zooms the map is made possible by this advanced
technology.
You might take a look at that river scene there.
So an amazing set of user experience challenges that all
deal with kind of a high performance rendering of not
only the search result, but the ability to see the search
result in context of other information.
So if we could go back to the slides.

So we've covered some of the hard problems on getting the
base canvas, how to get people to be able to annotate that
canvas, and some of the search user experience and
visualizations that make it easier to understand the
results in the context of geographic location--
but there are lot of hard queries, some of which we're
doing well on, but there's many more that have got room
for improvement.
Take for example Greg Donaker and Howard Trickey, they are
two of our engineers that focus
obsessively on search quality.
And there's some that I'm really proud of, like a beach
front hotels, Los Angeles, where, by virtue of the work
that [? Natan ?]
and Julian we're doing to correlate a business with
associated web content, we can find a good set of results for
that query.
Maybe we can go to demo to just walk through some of
these hard queries.

Could we switch to demo?
So let's start with the beach front hotels, Los Angeles.
The interesting thing, and the reason why this query is
somewhat hard, is that the match for beach front hotels
is a descriptive based on the clustered web
pages with that listing.
So by virtue of knowing the reference to that business,
we're able to get a good set of results through that
clustering.
The second example, if we could do the second query,
this is a very simple query.
There's nothing algorithmically complex about
matching a soccer field in Burlingame.
The challenge is even having the soccer field in
the index at all.
And that goes back to the editable maps, with that in
real-time updated index, based on the millions of user
additions that are being made.
The third example is kind of the flexibility of
interpreting your intent from your location.
So let's start at San Francisco.
And you were to say look, I'm trying to go from SFO airport
to One Market Street, you can just type in, from SFO to One
Market Street, and out of the hundreds of Market Streets in
the country, the proximity of your first location is used to
influence the selection of your destination.
So really, trying to make it easy from a local search
perspective to specify your query--

Some of the other queries, we have room for improvement.
If you take a query like Marin Headlands hiking, what we try
to do is blend the best content.
But much like universal search has the challenge of picking
the right blend of content, this is not optimal.
We've got some hotels listings at the front.
We've got some good user contributed
content in the middle.
At the bottom we have the visitor information bureau, if
you scroll down to the left on the left Nav search results.
This we would like to improve.
There's a lot of great web content that is geotagged for
Marin Headlands.
And a lot of the work that we're doing right now is in
trying to determine when to blend which content.
Another example of this is where we're a little too
literal in our interpretation.
If you type in Kansas State, you're thinking probably of
the university.
We literally give you a map of Kansas state.
So this is another classic problem where we're working on
algorithms to give a variety of choices and interpretations
of your local query.
But even as we tackle some of these hard query problems,
there's going to be a whole other round of query problems.
If you take a world where we can do a search for
Embarcadero Center, for example, this I love.
And we'll end on this demo, where we basically go from the
3D space perspective of these photo realistic buildings,
down into the search results for street view.
So if you could turn on street view, and then zoom into a
particular spot, it kind of illustrates the next round of
search challenges, even if we've
address the current round.
So Dana is going to fly into the Ferry Building.
And you'll see that the transition from 3D to street
level lets you be immersed in the scene.
And once you're merged in a scene I think you can see a
whole class of problems related to search that's going
to crop up, especially when you look at
the transient nature.
Here we went straight from 3D, straight to street view.
Something as transient as who's typically staffing a
farmer's market, what are the parking restrictions on the
street, what are the opening hours of places.
All of this is going to open up a whole other class of
search challenges that I think we're all eager to tackle.
So if we could go back to slides--

You've seen some of the difficult problems, some of
the advanced technologies to address those problems, and
some of the content that's cropping up
from the work to date.
I hope that you all have some insights into search and see
that the end result is going to have all the world's
information geographically organized, universally
accessible, and useful.
And you've gotten some preview of what motivates the team
that's working on this.
So thank you very much.
And I'll hand it back to Marissa.

MARISSA MAYER: You know we used to joke in the early days
of Google about Google helping to find your keys.
And with Carter's team and their work, I think that's
becoming more and more of a reality every day.
You've heard a lot this morning about all of our
different search properties, maps, images, news, finance.
And of course there's web search, which really aims to
wrap all of those together, especially with universal
search, which we launched last year.
Johanna Wright is our director of search quality.
And I would like her to come and give you an update on
universal search, what we've achieved in the last year, and
what is new and exciting in the area of search.
So I'll hand it off to Johanna.

JOHANNA WRIGHT: Thank you, Marissa.

So I'm Johanna Wright.
And this is how I spell broccoli.
Thank goodness Google understands what I meant when
I type in, brocalli.

I chose this example, although it's an early innovation at
Google, I think it highlights our theme for today.
And that is deeply
understanding the user's intent.
Here's what I said, that's what's we get in the search
team, just what the user types into the search box.
But now it's our job within search to give you what it is
that you really want and what it is that you really meant by
this request to Google.
I'm here today with two colleagues, engineering
leaders in the search quality team--
they're sitting in the back right now--
Pandu Nayak and Trystan Upstill.
And we're going to be talking about user intent and the
variety of efforts in search quality in which we uncover
what it is that the user is really meaning when
they come to Google.
First, I'm going to talk about universal search.
And then Tristan is going to come out and he's going to
describe the challenges that we have across the world in
understanding queries.
And then Pandu is going to show off some clever stuff,
where it's very hard, given the context that the user is--
it's hard for us at Google to understand what the context is
that the user is in when they type in their queries.
So let's start with universal search.
Universal search--
you heard from RJ and you heard from Carter.
So Carter talked about the geoweb.
RJ talked about our properties,
images, news, finance.
Universal search is the effort to bring all of these together
into the search results page.
Users should not need to know if the best result for their
query sits in a book or if it sits in an image.
We at Google just need to bring this to them in the
search results page in an easy way for them to understand.

And this graphic here represents what we
launched last year.
So you can see across the x access the
properties that we launched--
video, maps, news, books, and images.
And this graphic shows where we are today.
I forgot to mention the y axis is the relative frequency with
which theses show today.
So if you go back one slide this is where
we were last year.

And here's where we are today, in terms of the frequency with
which you will see these universal results.

Can you go back one slide.
And today we've added in products and blogs.
So as you can see, you'll be seeing a lot more of universal
search this year, and currently when
you're using Google.
And why is this hard?
And why are we talking about it today?
Basically there are three challenges that you see in
universal search.
There's infrastructure, there's ranking, and then
there's user experience.
Let's start with infrastructure.
The easiest way to think about the infrastructure challenge
is that if we need to give you the best answers from image
search, the way that we know how to do this is to look in
image search index and see what result there are.
But with a simple calculation, this takes about twice the
computational power of web search right now.
So we have to send all of those queries back to the
image search index.
And now if we add in seven properties, well, that take
seven times the computational power, seven times the number
of machines.
And so you can imagine, our engineers had to come up with
something a bit smarter than this so that we could serve
these results.

The next challenge is ranking.
And so that's just the issue of
comparing apples to oranges.
How can you compare an image to a news result?
How can you compare a book to a video when
both have great content?
And again, this graph represents the hard work of
the past year to improve our ranking algorithm so that you
could see universal results when they're relevant in many
more cases.
And the final challenges user experience.
How do we bring us all together in a simple fashion,
such that seems familiar to users, when this very rich
content that we know about it because we have more detail
about what type of content it is?

And you all got a sneak peek at this slide.
But the other thing that we've been hard at work doing this
year is launching worldwide.
So universal search is now available to our
users around the globe.
But I think the easiest way to understand universal search is
just to go through a handful of examples to see what we've
been at work with in the past year.

So Red Bones Barbecue is a restaurant in the town in
which I grew up in.
And last year we included this result in the
search results page.
But this year, we've made a slight change in that we've
added the review content.
So now we can see that Red Bones is, in fact, as good as
I remember it.
It's got awesome food.
And it's got big portions.

But this next query, another local query, yoga in Kansas
City highlights the kind of challenge that
we have within UI.
Not all of the local results should be
displayed the same way.
And in fact for broad queries, yoga, restaurants, pizza, what
we what we found is that the local results that we have and
the local interests that users have are much more varied.
And here what is important to the users is seeing a group
and understanding that there's more content that
they can dig into.
So this is the change that we made over the past year, where
we used to show the bare minimum.
But now in these broad queries, we'll show you ten
results and highlight the fact that this is a group.

This query may be interesting for those of you who like
gadgets within the room.
So this highlights the search qualities focus on what we
call freshness.
And that is looking at when did this concept occur.
So in this case, there's a release around
the BlackBerry Bold.
And you can see that our focus on recency really helps
universal search.
So you can see the news result.
And then you can also see we've added in a blogs group
this past year.
And our focus on understanding freshness helps us to
introduce blogs and universal search.

This query highlights some of the infrastructure work we've
been doing in the past year.
So basically, within video, we've launched improvements to
the video crawl.
So now we see many, many more videos from
many different sites.
And the other thing we've launched is video site maps.
And so it's a way for businesses that post videos to
tell Google that they have those
videos is in their sights.
In this first query, you see the how to bake naan results
from Videojug.
And this is because they are telling us at Google hey, we
have great videos at this site.
I picked this query actually because, for another reason,
and that is because somebody else told me about the query.
And I have this problem working on Search at Google
and that nobody understands what I do.
And it's a big problem with my parents.
They say things to me like Johanna, you work on search?
Search hasn't changed in ten years?
What do you do?
And so someone gave me this example at a conference.
They said, oh, you work on universal search.
I did the query for naan.
And actually my nanny did the query for naan, and she came
up with a recipe for how to make naan.
And then she actually watched the video online and then made
us some naan.
So I thought it was pretty cool because a user could
articulate to us what they were seeing
in our search results.

This query, Lisa Leinbaugh photographs, highlights the
work of universal search.
And I think this is one of the biggest improvements, and what
you saw with the chart, is that we're increasing coverage
over the past year.
And that's our term for how frequent these results show.
Last year when we launched, you imagine a query like Ansel
Adams, or Annie Leibowitz, of course, Google should
understand for famous photographers that we should
show images.
But now we're really focusing on the long tail and less
popular photographers will show up in the search results.

And this really gets to one of the user intent challenges
that we see at Google.
So we talk about, and the topic of this speech here, is
understanding user intent.
But we don't want to just understand user intent
broadly, we want to understand the user intent of each and
every user, and on each and every query.
And in fact, this query here, Leo Voolich Wright is a query
that I think only one person in the world issues.
And that's me.
This is my son's name.
And last year, we didn't do so well on this query.
But this year, you can see that Google has done much
better on these deep queries.
And here, we have videos and blogs blended right into the
results page.

And while that past query for my son was an indication of
seeing multiple types of content on the page, here you
can see that, in these two cases, there's many more
results from different corpuses showing together.
So on the left, with kangaroos jumping,
you can see the pictures.
But also you see a video, because if you type in
kangaroo jumping, you probably want to see what it's like
when the kangaroo actually jumps.
And this also has a books result
incorporated at the bottom.
For Mexico City, you see maps, news, videos, all together in
the same results page in a very simple to read fashion.

And my final picture and set of examples is of our work
launching worldwide.
And this has actually been a lot of work for the
engineering team.
And we're really proud of our efforts in this area.

And I wanted to leave with that.
It's a good time to introduce Trystan.
Trystan is an engineering leader here in search quality,
and he runs our international search quality efforts.
And so it is really amazing to me the breadth of the impact
of the work of Trystan's team.
So he's going to talk to you about understanding user's
intent across the world.
TRYSTAN UPSTILL: Thank you.
As Johanna mentioned, I'm an engineer here on the
International Search Quality Team.
Our team consist of engineers who have come together from
all over the world with one singular purpose in mind--
to build the best search experience for everyone,
everywhere, no matter what languages they speak.
Most of what we do happens under the hood here at Google,
refining, researching, developing, and deploying new
algorithms to improve our international search
experience.
And in many ways, give our users more, even when they
tell us less.
So what I'd like to do today is just step through some
examples of our search localization and take a quick
dip into some of the new things we've
been looking at recently.
So a little bit of background on me might help before we go
through these slides here.
I have a somewhat varied background, although it's not
that varied compared to a lot of people in my team.
I was born in France.
I lived in Australia for 20 years--
or more than 20 years--
25 years.
And I really want to go into my age.
And then I spent a year in the UK and then came out
to the US for work.
So I've learnt that saying that there is death in taxes
is at least true in three countries, the US, the UK--
well, I haven't learned about death, but I know that the tax
bit is true--
in the US, Australia, and in the UK.
So if you're going to pay your tax in the US, which most
people do, the three letter acronym that you need to know
before you kick it off is the IRS.
And so if you search for tax on google.com, you get the IRS
result at position one.
Then this is followed up with some forms and publications
from the IRS, which you need to submit with your tax
return, and finally a nice page which has a history of
taxation around the world.
By contrast, if you searched for tax in the UK, you get Her
Majesty's Revenue and Customs Service.
So you make your checks out to Her Majesty in the UK.
Then you get the Wikipedia results, which, once again,
has informational resources about the history of tax.
And finally some information about how to
pay tax in the UK.
And if we go further down the pages in these cases, we'll
see that the localized results continue with services which
will help you pay your tax in these regions, etc.
By contrast, if we search in Australia, we no longer have
to pay tax to Her Majesty directly.
In Australia, instead, we pay taxes to the
Australian Tax Office.
And the ATO in Australia has a special e-filing service,
which we can see here in the second result.
Once again in the third result, we have our
informational history of taxation.
So this not only works for queries that are in the head
of our query stream, like tax.
But it also works for queries in the tail.
So sticking with the taxation theme, if you were to search
for tax calendar, tax year, tax dates, in all of these
cases, you would get local results.
And it's very important that they are local, because
otherwise, all your dates will be wrong.
Outside of tax, this holds for all sorts of of things, like
whether you want to find out about schools in your country,
the school system, or even things like what time you
should plant your tomatoes, because if you use the US
system in Australia, you are completely hosed.
So moving beyond country localization to language
localization--
so when we deal with language localization we have this
tricky problem of counter balancing
readability with relevance.
And you can see that illustrated here with the
query, Enrique Iglesias in Russia.
I guess he's hot there right now.
So as we can see, Enrique is a good looking bloke, which is
good, so we lead with photos of Enrique.
Then we have his navigational result at the second position.
So this is his official website.
It has news about Enrique, his blog postings, probably when
he's touring all the countries around the world, and if
you're lucky, a few more photos.
Then we have some Russian information results at
positions two, three, and four.
And then we're lucky enough to see a web
video in there as well.
This illustrates something that is very important, which
is that we're not only localizing our web corpus, but
we're localizing all of our corpuses, such that when we do
want to do universal search, we can do it and understand
readability, or in the case of videos, watchability, across
all the languages.
So I'll step back for second.
I've shown you three countries and two languages.
That's kind of cute.
But what we do hear Google is over 100
languages in 150 countries.
So if you're in Vanuatu, or Trinidad and Tobago, or
Botswana, and you want to search in Indonesian or
Swahili, then we're giving you the best localized information
we possibly can.

So another part of our strategy here at Google is
thinking about ways in which we can unlock the content from
larger languages on the web for the smaller languages.
So for example, we know that the number of Arabic users on
the internet is growing faster than the amount of Arabic
content on the web.
So part of our strategy is to work out ways in which we can
make the much larger languages on the web readable to
speakers of the much smaller languages.
And Udi introduced a key part of this effort last year,
which was the cross language search system.
So going a little further on this, in the past year we've
taken this right into the search results.
So if you're searching for Bermuda Triangle in Arabic,
which of course I can't do, so this query at the top says
Bermuda Triangle in Latin, for our benefit.
Our algorithms will go through the search results, and detect
that there's results in Arabic, are OK, but maybe
there are some better results in English, and seamlessly
insert a link at the bottom of the page which will take you
directly to the cross language search results, which is this
link down here at the bottom.
It has the translation as well as the Arabic text.
And then we pop straight to the cross language search
results page.
And here, we hope that Arabic users can extract more from
more useful information about the Bermuda Triangle.

So something that I'm super excited about that we've been
working on this year is thinking about localization
beyond country and beyond language.
So I guess something which I hadn't realized when I first
moved to the US was just how big, and how culturally
distinct, the areas of this country are.
And if I'm going to go on a driving holiday, it's probably
just as likely that I'll go to Canada or Mexico as it is that
I'll go to New York, or somewhere else on the east
coast. So we're moving in the direction right now of
something that I refer to as sub-country localization, but
is more easily referred to as metro-localization.
So say our user is in the Los Angeles metro area, and puts
in a somewhat vague query.
They ask for a zoo.
Here we'll lead with the San Diego Zoo, which is an
extremely well known, world famous zoo.
And it's only a couple hours down the coast from LA, so
it's likely that our users will be willing to go there
for a day trip.
We then have the LA Zoo, which is also a good zoo in LA.
And then we have others zoos from around the country.
By comparison, if you were to search for zoo in the Seattle
metropolitan area, we would lead straight off the bat with
the Woodland Park Zoo in Seattle, which is a great zoo
and isn't going to require three days of
driving to get to.
But if you're up for a longer road trip, we'll stick again
with the San Diego Zoo, the world famous zoo, and then
have other important zoos around the country.
But this works well for vague queries like zoo, where maybe
you're willing to go on a day trip.
It doesn't work so well if you're after a pizza.
So if you're in LA, and you ask for a pizza, then you
don't really want to have to drive to San Diego to pick up
your pizza.
I should mention that the zoo localization I spoke about in
the last example is something we'll be rolling out over the
next couple of months, and you'll hear
more about it then.
This, by comparison, went live, I think, last week.
So if you're searching for pizza in LA, it's unlikely
that you're going to be willing to drive to San Diego
to pick up your pizza.
So what we have now is a local universal, which doesn't
require a location every single time you search.
So the first time you search for something that we detect,
or our algorithms detect is a very local query, we'll prompt
you for a location or a zip code.
And if you go ahead and enter that, we'll give you results
for the zip code that you've entered.
And so I've gone along here and entered where I live,
which is Palo Alto.
And I now have a list of all the pizza joints around Palo
Alto, and my personal favorite, the California Pizza
Kitchen, which is very cool.
Now what we'll do from now on is actually store this zip
code, and every time you search for something that's
very, very local, say coffee, or schools, or libraries,
we'll give you all the local results around your zip code.
So I hope, in talking to you briefly today, about some of
these examples and dipping into some of the new things
we're looking at, I've managed to convince you to some extent
of how much we believe in providing the very best search
experience for everyone, everywhere, and we want to
break down all the language boundaries, such as it doesn't
matter what languages that our users speak.
And now I'd like to hand off to Pandu, who is both a
gentleman and a scholar, and is going to tell you about
queries that are hard to understand.
PANDU NAYAK: Thank you, Trystan.
So as Trystan said, I'm going to tell you about these
queries that are much harder to understand than the normal.
Now that's not actually quite true.
As we'll see, most of these committees are actually quite
easy to understand if you happen to be a human being.
If, on the other hand, you happen to be a computer, then
it's significantly more challenging.
And our group, as a whole, is looking very hard to take some
of the understanding that human seem to display with
such ease, and translate them into computer algorithms that
would allow for us to understand such queries much
more effectively, and provide users great search results.
So let me illustrate the kinds of things we've been doing
with a series of examples.
So here is a query from a user asking for dr zhivago.
And I think we're all quite sure of that when the user
typed dr, they really meant doctor.
And apparently the search results got the right
interpretation.
And we do, in fact, get the results for Dr. Zhivago.
You're probably thinking, oh yeah, this is fairly
straightforward, dr is always doctor.
And you'd be right, except of course when it's not.
So in this particular case, the user is probably not
looking for Dr. Burton Green in Rodeo.
In this case, they're probably looking for the Burton Green
who was the investor, who at the turn of the last century,
invested in that property of land, and actually named Rodeo
Drive what it is.
So in this case, dr actually means drive, not doctor.
So now you're probably thinking, oh yeah, I forgot
about that one.
It's clear that dr either drive or doctor.
And that would also be right, except of
course when it's not.
And in this case, when you're looking for the best beaches
in dr, it's neither drive nor doctor.
In fact, you mean the Dominican Republic.
Now the tricky thing here, I guess the point of these
examples, is that as users come to expect more and more
from Google, they really talk to us in a more conversational
manner and feel free to use language the way it's meant to
be, which means that words in the queries can mean lots of
different things.
And if we are to support them effectively and give them
great search results, we need to understand all the
different things that words might mean.
And of all these things, many of which are not going to be
in dictionaries, like this meaning here, and all the
various things that it might mean, we need to pick out the
right meaning in a given context to figure out exactly
what the user wants.
So that's one key piece of understanding, understanding
the vocabulary that users use.
Here's a second example.

Search queries that come to us are not just sets of words,
collections of words.
In fact, the're actually collections of concepts.
So this query here that you see is really asking for
information about Times Square in New York.
So this is like a two concept query, if you will.
And those are the two concepts there.
And as you can see, the search results correctly get that
interpretation that it's Time Square in New York.
The interesting point here is that none of us really thought
that the user was looking for information about squares in
the New York Times.
And that would be right, except of
course when it's not.
So in this particular case, presumably the user is looking
for information about squaring the circle in
the New York Times.
And in fact, all of the results sort of take that
interpretation.
You can see that the first result, for example, is about
this age old problem started by the Greeks on how you can
find the circle that has the same area as a square, or
rather the other way around, a square of having the same area
as a circle, using just a straight edge and a compass.
And there's an article in the New York Times,
talking about this.
That are a couple of other interpretations that the
results also get, including the Circle in the Square
Theater, and even an old movie called the Square Circle.
The point here that is that when we get queries, we can't
do the naive thing of just thinking of them as
collections of words.
We really need to understand the concepts that are embedded
in those queries.
And the concepts that are embedded in those queries are
very context dependent, and we need to figure out the right
sort of breakdown into concepts.
And if we can do that, we can provide search results that
are really great for users.
So a third kind of thing is illustrated
by this query here.
So looking at this query, it's pretty clear to all of us
what's going on here.
The user is planning a trip, a vacation probably, to Carmel.
They want to stay at the Normandy Inn, and they're
looking to see whether the Inn is available on the days of
their vacation.
It seems fairly straightforward.
The only problem with this query is that the Normandy Inn
does not have online reservations.
And so there are no good pages out there that provide the
user the information they want about the availability.
If we were to do the naive thing, and simply return to
the user results that had all of the words in the query,
then we would get terrible search results like this.
We would in fact give them results about the Carmel
Mission Inn, instead of the Normandy Inn, which the user
really wanted.
If instead, we understand the user intent, if we understand
what the user really wants, which is to make a reservation
at the Normandy Inn, then these are the results which we
actually do return, that actually drive the user to the
Normandy Inn homepage.
The user can then go to the homepage, see that there is no
online reservations, see a phone number, and call them,
and presumably, go on to have a great vacation in Carmel.
So this is another example where really understanding
what the user wants is more important than the literal
words that they provide us.
And by being careful about that, you can provide them
great results.
And for my final example on this team, is perhaps my most
favorite example of this set, which is this query.
The user is asking for the Wild Wolf Water Resort in
Niagara Falls, Ontario.
It sounds pretty clear what they want, except there's a
small catch here.
There is no Wild Wolf Water Resort in Niagra Falls.
In fact, the result that they're looking for is called
The Great Wolf Lodge Water Park.
It's not the Wild Wolf Water Resort.
So we happen to realize that, in fact, that is no Wild Wolf
Water Resort, and, in fact, there is this other water
park, called The Great Wolf Lodge Water Park, right in
Niagara Falls.
And so we ensure that there are enough results in the
results set about The Great Wolf Lodge.
Presumably the user sees this discrepancy, notices aha,
that's actually what I was looking for, and clicks on
these results, and presumably has a happy day at the water
part with their kids.
So the long and the short of this is that understanding
what the user really wants is very, very crucial to
providing great search results.
This is something that we're working very hard at.
It's a very challenging problem, but it's what makes
it fun to come to work everyday.
And with that, I'm going to hand it back to
Johanna to wrap up.
Thank you.
JOHANNA WRIGHT: Thank you, Pandu.

I'm really glad that you guys got to hear from Pandu today.
His work on these queries is some of the stuff that I think
is the cleverest that we see in search quality.
And when I started at Google-- here's a story--
I met Pandu.
And I was so excited to be meeting these engineers in
Search at Google.
And he was, then, also working on something very interesting
around solving the user's intent.
I went home to tell my husband about it.
Here we were getting queries and then, for similar queries,
putting those in the results set.
And my husband said, oh yeah, I saw that.
That's when I type in goose liver, I see results for foie
gras in the middle of the page.
And I thought that was pretty cool.
And it was just a glimpse of the really neat stuff that
goes on with the engineering team in Search Quality.
And to close, I wanted to end with another story from when I
started at Google, which is much more about why I like
working in this team, and really around the theme of
today, which is understanding our users intent.
Part of the reason I really like working in Search is that
all of the engineers here are very, very dedicated to
serving our users needs, and which is a very, very user
focused team.
So when I got to Google, it was after Google Book Search
had launched.
And Google Book Search was just starting to get a lot of
really good content.
Now it was before Universal Search had launched, so we
didn't have a way to show this continent in the search
results page.
So what we did is we just added a link at the bottom of
the page that said try your query on Google Book Search.
Now, while a lot of the time, this link was great, but for
the millions and millions of queries that we got to web
search, it wasn't always true that there was content in
Google Book Search.
And seeing a link on the page that doesn't lead you to a
good result is the kind of thing that just drives a
search quality engineer at Google batty.
And so one engineer, who wasn't on the project, but was
working in search quality, saw this, was really upset about
it, and quickly worked up an algorithm and launched it in
such that we would only shows link when there were relevant
content within Google Book Search.
And I like that story, because it was my first glimpse at
what a strong user orientation the Search Quality
group had at Google.
And it's just one thing I really like about coming to
work every day, is that, no matter what disagreements we
have, it's clear that those of us who are working on search
serve one master, and that's our millions of users.

And with that, I'll hand it back to Marissa.

MARISSA MAYER: Thanks.
I want to thank all of our speakers from this morning.
I think this gives you a glimpse into why search is
hard, but that's one of the things that makes it really
interesting.

As part of the core search experience for Google, we've
always thought of four components--
comprehensiveness, do we have all of the information;
freshness, do we have the newest information that's
being generated on a topic; relevance, can we produce
exactly what you want out of that whole set of information;
and overall user experience.
And you heard all four of these themes echoed throughout
the three presentations this morning.
So for example, when you hear from RJ, if we can go back to
the slides.

I think are we reset.
You heard from RJ that we are producing 100
billion images per year.
That's a huge amount of data, and that brings to bear all
kinds of questions about
comprehensiveness and freshness.
How do we get all those images as they appear onto web and
searchable?
And then how do we apply new relevance technologies, like
facial recognition, to those images, such that your results
get more and more accurate for what you are searching for?
And then also looking at things like the overall user
experience an image search, it's one of our biggest
properties.
It's one that we would like to monetize.
We want to do so in a way that actually, beyond leaving the
user experience neutral, actually improves it.
How can we introduce advertising in a way that
ultimately improves your image search experience?
So this is the 100 billion images per year, facial
recognition, and ads on image search--
these are some of the things that we're beginning to
experiment with and get started on, to try to
understand how best to improve user experience.
And then taking some of those much more elaborate notions
comprehensiveness, freshness, and relevance, and improving
user experience by introducing them into things like news,
and allowing people to search our news database for quotes
from notable people--
these are all things that are particularly exciting.
Carter covered a lot of the same topics.
Comprehensiveness--
how we get all of the world's information?--
as well as better understanding of relevance--
when I type New York, New York, and I'm outside of Las
Vegas, I mean something different than when I type it
in Las Vegas, and I'm actually referring to the hotel.
These are basically notions of relevance and how we can
ultimately improve our local searches.
Our user generated content that's occurring on our
geoproducts, of course, brings up a new element of freshness.
So as we've had more and more users contribute over time to
our body of data, in ultimately means we can have
fresher results.
So should an intersection be closed, should a road change,
should a building change, we have that information as close
to real time as possible, really allowing us to have
that mirror of the world.
And then of course there is the user experience editions
we've seen of late on our geo products, in the way of the
three-dimensional buildings, as well as our 3D, two street
view, transition, which Carter demonstrated for the first
time this morning--
are things that we think can really enhance the user
experience.
And when you think about what street view can do, in terms
of driving directions, if you are
wondering how to get somewhere.
You might not just need to look at a list of written
instructions.
You might actually be able to drive the road first on your
computer, so you know exactly what you're looking for.
It could really change the way that these directions,
overall, that these work.
Then we heard from Johanna about how all of that content
is coming to bear inside of web search.
And you can see where we were last year versus
where we are this year.
So you can see that last year we did a complete overhaul of
the search engine, changing out the infrastructure, the
relevance algorithms, and the user experience, to
incorporate these different types of media into our search
results page.
And here you can see how much we've
actually expanded on that.
This is in terms of frequency of these types of results
appearing in your search results.
So this is happening more and more over time as our
engineers are getting more and more accurate with their
relevance notions.
How can we build in a new set of ideas around image search?
How can we build in a new set of ideas around news?
And this type of expansion will continue to occur as we
understand the information better.
And then, in terms of user experience, there's the ever
elusive user intent.
Here's what I said, now give me what I meant.
You heard from Trystan and from Pandu how hard and
interesting a problem that this.
It's something that changes all the time, every day, which
really brings us to the future of search.
And I think when you look at the future of search, there's
a couple of things that are really notable that we've
touched on today.
The first is that we imagine search in the future will be
experienced through many new modes, cell phones, cars.
It's interesting because, if anyone in the world is a
search addict, who searches as much they possibly can, I have
to be pretty close to it.
I search all the time, every day.
Yet I would guess that I probably only do searches on
about 20% of what I'm interested in.
Why?
Because a lot of the searches that I think of happen when
I'm on the go, or in the car.
And as we think about how we can actually take our search
technology and make it viable by cell phone and by car, it
becomes particularly interesting.
It also adds new layers of complexity.
So you think about Carter's work on geo, it actually gets
even more complex if someone's moving, or if someone is on
the go, how can you take that context of where they are and
make that into a better experience on our geo products
and on web search, overall.
There's also, of course, the concept of media.
How do we pull in all the different media that RJ and
his teams work with--
news, images, trends--
and display those as best we can on web search, really
creating these kinds of
interconnections between products.
You've seen how, in just one short year, we've expanded the
different types of content and the frequency with which they
occur in web search, and we anticipate being able to do
that even more in the future.
And finally, there's the element of personalization.
We think that, in the future, one way we'll get the answer
of did you mean Dominican Republic, did you mean
doctors, or did you mean drive, is by knowing a little
bit more about you, and being able to build that into the
search engine to ultimately tailor the relevance of your
results, yielding better results and a better user
experience for you overall.
So we're very excited about where we are in search today
and what the future holds.
But from our standpoint, as you can tell, we're just
getting started.
And on the topic of just getting started, as well as on
the topic of comprehensiveness and personalizatoin, I want a
transition from Search to Google Health.
So you've heard us talk a lot about Google Health over the
last while.
So Google Health is our product that attempts to take
our users medical records and bring them online, where users
can see them, and control them, and put them to good use
in getting better health care.
This is not an easy initiative.
It makes sense for Google take this on, because we've seen
over time that the majority of internet users, when they look
for health information online, start with search, when they
try and learn something about their health,
they start with search.
And the lion's share of that 66% percent comes to Google.
So this is one reason we've been interested in Health, but
is a big project.
And it's a big project that you've heard about from us for
a long time.
You can see these are all the different times we've talked
about Health, and they can span now almost a year
buildup, and trying to talk about what
are we doing in Health.
Why do we think that it's important?
Why do we think this is something that's worthy, not
only for Google, but for our users?
And I'm happy to say today that it's no more just talk.
We actually have the product.
It is live at www.google.com/health.
You can sign up today.
It is open to the public.
And it's really exciting day for us.
We're very happy to finally be able to offer this
service to our users.
And we'd like to demonstrate it for all of you. so I'm
going to invite Roni Zeiger, up to the stage.
Roni is our product manager for Google Health.
He's also a doctor, Stanford educated, UCSF resident, who
we plucked from his practice to come and help us build out
Google Health.
He still practices medicine occasionally on the weekends,
in his 20% time.
And we're going to go ahead and show you Google Health.
So Roni is going to tell us today about our friend Diana.
RONI ZEIGER: Thank you, Marissa.
It's really an honor to be here, talking about this.
So here we are with our friend Diana.
She is a pretty healthy person, but she has several
health problems that many of us can relate to.
And she just got started using Google Health to try to
organize her health information.
MARISSA MAYER: So what happens if Diana gets something new.
RONI ZEIGER: Perfect question.
Let's say Diana just got sinusitis.
And let's do exactly what Diana could do, by entering it
herself into Google Health.
And Marissa, help me out, what's a good antibiotic for
sinusitis, assuming, of course, that it's caused by a
bacterial infection.
MARISSA MAYER: Amoxicillin.

RONI ZEIGER: I'm pretty impressed.
That's actually the same thing that I would have said, partly
because I also was not aware of her allergy to penicillin.
Fortunately, that had previously been put into her
GOogle Health profile.
So we can actually give her a tip here that says she might
have a bad reaction to Amoxicillin.

Let's go back to her home page.
Now I know one of the things that you're thinking is this
is really nice to be able to enter information yourself,
but does that scale?
Will users really want to enter all their conditions,
when we know that they already exist electronically many
places, such as their pharmacies.
So if we go to the details of her medications, for example,
we see in fact that some of them were imported from Beth
Israel Deaconess Medical Center in Boston, some from Rx
America, and some from Walgreen's pharmacy.
MARISSA MAYER: That's great.
Does that mean that, as of today, I can import all of
these different records into my Google Health profile, if I
have prescriptions at Walgreen's or other partners?
RONI ZEIGER: It's absolutely true.
And it's quite remarkable.
As of right now, if you have records at any of these
providers, you can sign up for Google Health and start the
process of importing you records.

Now one thing that we learned, that was actually a little bit
of a surprise to us, is that users wanted help from us
directly, finding out more about common problems that
they have, in this case, such as chicken pox.
So if I pop open this page that we've built for them
about chicken pox--

come on internet tubes--
we see that we have useful overview information, in my
opinion, beautiful photographs--
I'm going to spare you popping them up, because I'll get some
hollers of gross, but if you're on your own time, if
you're like me, please do so--
relevant results from Google properties, such as Google
Scholar for research articles, discussion groups, and Google
News, and a convenient shortcut to add this directly
to Google Health profile, if you happen to find this page,
for example, in some search results.
MARISSA MAYER: So it really puts the user's records right
in their hands, where they can see them, they can control
them, and have access to them.
And it seems like this will ultimately make a big
difference, especially for people who travel or move
around, because they can keep their records with them.
RONI ZEIGER: That's right.
And before we go further, one thing I do want to comment on
is to make a quick point about privacy, which I know is a
common question that a lot of have. So Google Health cares
about privacy and puts it in the control of each user.
The user decides who, if anyone, should have access to
her records, and can revoke that access at any time.
We will not sell any user's data and we won't share it
with anyone, unless Diana, in this case,
specifically asks us to.
Now the most exciting part in my opinion about Google Health
is this notion of online health services.
So Diana, for example, is worried about her risk of a
heart attack.
So she has connected, in the same way that she connected to
Beth Israel and Walgreen's, she has connected to the heart
attack risk calculator from the American Heart
Association.
Now because she decided to securely connect with them,
they have already filled out their questionnaire with data
from her Google Health profile.
So when she goes over to their application, notice that she's
already on step eight of ten, because she didn't have to
fill out her weight, her height, her age, her LDL
cholesterol, her HDL cholesterol, etc.
And she can go straight to where the action is.
For example, here, she can see that she, according to the
American Heart Association, has about a 27% chance of
having a heart attack in the next year.
And she can check, for example, if she were to
improve her cholesterol a little bit further, big drop.
Also some benefit from working harder on her blood pressure.
And no surprise, perhaps the most important thing she can
do is quit smoking.

MARISSA MAYER: This is great.
This is all available today.
RONI ZEIGER: This is all available today.
We have an amazing start of services
that you can see here.
In addition to the heart attack risk calculator,
there's a virtual pill box that does neat things like
send you SMS alerts, if you want them, when it's time to
take a pill; a personalized news service that can give you
news tailored precisely to your health profile, if you
wish; a immunization dashboard, that can pull the
immunizations that you, or your daughter, or son have
had, and remind you which CDC recommended medications
haven't yet been taken; and a couple of services actually
that can take your paper medical records that are in
your filing cabinets right now and convert them into
electronic format and allow you to import them into your
Google Health record.
MARISSA MAYER: That's great.
This is a really exciting start.

RONI ZEIGER: And I have to say there are some amazing
features there.
There are some amazing, amazing services, but I truly
believe that the most interesting and innovative
ones are the ones that we haven't seen or
even thought of yet.
MARISSA MAYER: That's right.
Great.
Thanks Roni.
Can we go back to the slides?

So, as Ronnie demonstrated, we have a lot of partners already
participating in Google Health.
On the slide you'll see the logos of some of the partners
that we have participating today.
And we wouldn't have gotten this far without our partner
participation.
And many of those people are in the room.
So we wanted to invite a few of them to the stage to give
us their views on what Google Health can do, in terms of
this partnership.
So first, I would like to invite Dr. Stephen Suffin, the
Chief Corporate Medical Officer from Quest to come up.
It's interesting, Quest does a large majority of lab tests
nation wide.
Until I started working on Google Health, I didn't
realize that my own lab results were in Quest, and are
now importable into Google Health.
Welcome.
STEPHEN SUFFIN: Pleasure, thank you.
I've noticed that the floor here gets a great deal of work
out from speakers who have been very dynamic, unless
they're tied to, in fact, the computer.
I'm not one of those.
I have the opportunity today to deliver a short-time
message to you, and so I've taken the opportunity to make
sure that I get it down by writing it down for you.
And you now all know that I'm Dr. Stephen Suffin, and I'm
the Corporate Chief Medical Officer for Quest Diagnostics.
But what you don't know is how proud I am to be here today to
help mark this pivotal moment in the history of health care,
which is the launch of Google Health.
With the advent of Google Health, physicians and
patients have acquired a powerful new channel for
communicating health care information, including
critical laboratory diagnostic data.
It's diagnostic laboratory information that is one of the
foundational elements of any medical record.
And it's this laboratory information that allows
physicians to make targeted, effective decisions that
intervene in health care.
Without accurate diagnostic laboratory data, a health
record would be a tracker of prescriptions, an itinerary of
office visits, a diary of hospital stays.
But for personal health record to serve as a verification of
wellness, a tool for documenting and supporting
accuracy, and specificness of medical diagnosis, then an
important asset in tracking therapy, these diagnostic
laboratory data must be an essential portion of that
medical record.
At Quest Diagnostics, we are proud to play an integral role
in this forward leap in the ability to communicate
diagnostic laboratory data that the launch of Google
Health represents.
Through Google Health, more than 100,000 physicians who
use Quest Diagnostics Care360 connectivity products can now
share and explain diagnostic laboratory data with their
patients in a secure online setting.
By understanding these data, patients that use Google
Health will have a better opportunity to communicate
with their physicians about the laboratory test results
and more effectively participate in decisions
involving their health care.
Google Health represents a vision that Google, and Quest
Diagnostics, and the other health care organizations here
today all share.
And that is that an effort to empower patients to make well
informed health care decisions needs to provide them with
timely and accurate information about the
diagnostic and therapeutic circumstances they all face.
As a physician who's been involved with various spots
within the Quest organization within the last decade, I
believe that no other company is as well prepared to provide
the foundational diagnostic laboratory data that will make
our joint vision of an improved and enhanced
patient-physician interaction a reality.
Quest Diagnostics, for those of you who don't know is the
world's largest diagnostic testing services company.
We serve 50% percent of all the physicians
in the United States.
And we perform more than a half a million specimen
determinations for patients every single day.
Our laboratories were the first in the industry to take
on the quality improving principles of
Six Sigma and Lean.
And our 900 physicians and doctoral level scientists have
world renown in fields as diverse as cardiovascular
disease, metabolic disorders, and cancer.
And we are leading developers of new technology in fields of
diagnostic testing.
Google and Quest Diagnostics, working together, will be able
to help physicians and patients collaborate as they
have never before been able in making health care decisions.
A better informed patient is, in my experience, more likely
to take the steps necessary to secure more individualized and
effective health care.
And Google Health, ultimately, is about helping each of us
live a healthier life.
We at Quest Diagnostics are proud to be a founding partner
in this effort.
Thank you.
MARISSA MAYER: Thanks.
And we're very excited to be working with Quest.
In addition to being able to get your lab results online,
we've also formed partnerships with various pharmacies, so
you can see your prescriptions online.
And I'd like to welcome Casey Kozlowski from Walgreens, who
is their lead on all health care IT initiatives to talk
about our partnership.
Thanks.
CASEY KOZLOWSKI: Well, yes, I'm Casey Kozlowski.
I'm with Walgreens.
I'm a pharmacists.
And I've been with Walgreens for the past 11 years.
The first eight years of my career were spent behind the
counter, filling prescriptions and counseling patients.
And currently, I work in our health care innovations area,
managing our health information technology
projects, including personal records, and the like.
So I'd like to start by saying how very excited we are to be
a part of this launch.
I really think Google Health is going to be a great thing.
And we're so very proud to be there in the beginning and a
part of this.
As a pharmacist, having an incomplete or inaccurate drug
list for a patient makes it much harder to do your job.
You can't perform a drug interaction check on a drug
that you don't know exists.
And this is actually happened close to home recently.
A friend of mine's mother was actually hospitalized due to a
drug interaction.
She's a Walgreens patient and ended up getting a
prescription filled at another pharmacy, which probably was
her first mistake.
And that pharmacy didn't have her complete drug profile.
And they didn't realize that one of her maintenance
medications had a very severe drug interaction.
So she ended up getting the script filled, taking it, and
now is having some severe liver failure, and might
possibly need a transplant.
So had that pharmacy had the complete profile, and had all
the medications that she had been taking, this could have
been avoided.
Obviously this is an extreme case, but it's just a way to
show how we can make a difference by having all this
data available to all health care practitioners, doctors,
pharmacies, and the like.
It's only going to help create better care for everyone.
For pharmacists, patient care is really
the number one priority.
I don't know many pharmacists who went into this field
because they really liked putting pills from a big
bottle into a little bottle.
So it's all about taking care of the patient.
And with Google Health, I think what we're getting is a
more engaged patient, a patient that's taking an
active role in their health care and that really wants to
see themselves get better, manage their disease states,
look up their medications that they're taking and make sure
there aren't any drug interactions, or with over the
counter medications.
So that's really what I think one of the greatest impact, is
really getting patients involved in
their own health care.

The new Google Health effort will lead to patients and
their health care practitioners getting better
access to patient medical information.
So more informed patient and more informed doctors will
lead to fewer harmful drug interactions, and patients
getting the most benefit from their drug therapy.
And in closing, Walgreens has had a long history of pharmacy
innovation.
We were the first pharmacy to install a satellite connected
pharmacy computer system, where all of our locations
were linked, in the early 1980s, to inventing the
electronic prescription in the early 1990s, to having the
first online website integrated with retail firms
to locations.
And today we're taking another innovative
step with Google Health.
And we're very excited about it.
Thank you.
MARISSA MAYER: Thanks.

What Google Health is really about is pulling together
information from your doctor's office, from labs, from
pharmacies, all together, to get a holistic picture.
And we've been really lucky to have a great health advisory
board, headed up by one of the holistic medicine specialists
in the world, Dr. Dean Ornish, founder of The preventive
Medical Research Institute.
And I'd like to ask Dean to come up and talk a little bit
about his collaboration through the advisory board.
DEAN ORNISH: Thank you.
It's great to see you.
I'm Dean Ornish.
I'm a clinical professor of medicine at UCSF, and the
founder and president of the nonprofit Preventive Medicine
Research Institute.
And for the past 18 years, I've had the privilege of
working with Marissa and colleagues, chairing the
Google Health advisory council, which includes 23 of
the top health and medical experts from around the
country, people like Dr. Toby Cosgrove, who's the CEO of the
Cleveland Clinic, represented here by Dr. Martin Harris,
their chief information officer, and Dr. Mike Roizen,
a longtime friend, who is also their chief wellness officer,
people like Dr. John Halamka, who's the Chief Information
Officer at Harvard Medical School and Beth Israel
Deaconess Medical Center, as well as representatives, the
heads of the American Medical Association, the AARP, and
other privacy experts.
Many of these people, along with the partners listed up on
this screen, will be available during breakout sessions
during lunch, if you want to talk with them.
I'm excited by this because it really empowers people with
getting more control over their own information, as well
as having better access to it.
And as Ronni Zeiger mentioned, we're proud of what we've done
so far, but it's just the beginning, and look forward to
more in the future.
Thank you.
MARISSA MAYER: Thanks.

So we didn't have time today to, obviously, bring all of
our partners up here on the stage.
But I wanted to thank all of them, because this has been a
really profound effort.
We've come a long way.
And the service that we're offering our users today, with
these integrations, is already very powerful.
That said, we realize this is just the beginning.
There are literally thousands of additional partnerships
that need to be formed, petabytes of information that
need to be brought online, digitally, and put in the
hands of the patient.
Right?
It's crazy that that Seinfeld clip that we showed on the
break still applies.
How many people here have actually seen or touched their
medical record?
A few, mostly the doctors in the room.
But that really shouldn't be the case in this interim
information age.
You really should be able to have that information.
It's so relevant to you and it can make such a big difference
in your life and in your health care.
And that's what Google Health is really about, happier,
healthier users, with access to the information that's most
important to them to get good health care.
And on that note, related, but separate from Google Health,
we have one final announcement today, which is something that
came out of our collaboration with Cleveland Clinic and is
also at healthier users, which is our Walk for Good Campaign.
In the spirit of wellness, we have worked with the Cleveland
Clinic to develop a Google Gadgets, that sits and becomes
part of your iGoogle page.
But it's actually part of a broader wellness campaign to
get our users walking, get them out, get them taking
advantage of the warmer and happier weather, and making a
difference in their health every day.
So you can sign up, as of today, for our Walking for
Good Campaign.
You can add Walking to iGoogle.
And you keep track of your walking
over the next 15 weeks.
And if you achieve a certain point in the program, we'll
allow you to vote for a charity, where we'll make
$100,000 donation, come the Fall.
And so by signing up for this, you'll see it drop into your
iGoogle page, like this.
And we have Dr. Michael Roizen, from the Cleveland
clinic, the Chief Wellness Officer for Cleveland that
helped us develop this gadget, here to talk little
bit more about it.
Welcome.
MICHAEL ROIZEN: Thank you.
I'm the Chief Wellness Officer of the Cleveland Clinic.
And I think I'm the only chief wellness officer, the first,
at any major medical center.
I focused my medical career, or my medical practice, on
prevention and wellness for 37 years.
So the Cleveland Clinic and I are excited to be partnering
with Google.
If anyone can demystify what wellness is, and if anyone can
make it fun, enjoyable, and give the user what they want,
which is an easy way of staying well, getting well,
and having fun while doing it, Google can.
So we're really proud to be with you.
The Cleveland Clinic has done a lot for wellness.
It really believes in wellness.
It's more than than a mission.
It's a passion.
For the last three years, we've, in fact, stopped hiring
smokers, made the campuses smoke free, and don't allow
trans fats to sneak into patients' or employees' food.
And now we've started investing heavily in our
employee wellness activities.
So as a health care organization, we believe we
have to practice what we preach.
We believe that, in fact, the big idea is that 70% of how
long and well you live are your choices, not your genes.
You get to nudge your genes-- and Dean's done a great deal
of work on this--
you get to nudge your genes so that you can live much
healthier, with much more vitality, and much longer than
you ever thought possible.
And the most important thing you can do is
daily physical activity.
Whether it is in the long-lived populations of
Okinawa, or Crete, or Hawaii--
and they all have different diets and different
activities--
but the one activity they all share in common is vigorous,
physical, daily activity.
And everyone can do that.
The best physical activity--
and we've said it on TV, on Oprah, and we say it in the
Ubooks, and we say it to patients daily at the
Cleveland Clinic--
the best physical activity you can do is daily walking.
It's pretty easy.
It is without fear of injury.
And virtually everyone can do it.
If you walk two hours a week, just two hours a week, you
decrease the risk of all cause mortality compared your peers
by 39% percent.
Three to four hours, and you decrease that risk by 54%.
So it's really a huge benefit from just walking.
But not all walking is created equal.
That is, you've got to have a reasonable pace and you've got
to spice it up, or you'll stop doing it.
And that's where Google comes in, because Google, in fact,
is the one that can make walking fun.
The Walk for Good is the tool, or the gadget, that, in fact,
allows everyone to walk, to set targets, and from beginner
to advanced person, everyone can set targets and go to
achieving their results.
And setting targets is also critical.
Now walking is just one of those targets.
There are many of them.
But it is the most important target.
So Cleveland Clinic and Google are happy that we can, in
fact, offer this to everyone.
It's a tool that will allow people to set targets, to hit
them, and to understand how much fun it is to get well.
So I hope all of you will, in fact, join me and encourage
others to go for good.
Thank you.

MARISSA MAYER: So that brings us to close
of the factory tour.
We're going to bring up the speakers for a Q&A. So if the
speakers can come up towards the front, that will be great.
And in summary, you can see that the factory tour has
really been about, for engineers, the hard problems.
What are we working on in order to make our users more
well understood, provide more relevant results?
And for our users, how do we make them happier and
healthier with our overall user experiences and these new
products that we've brought out?
So we're really excited with the start we have today on
Google Health, as well as all the progress we've made on the
other search properties we've talked about.
So why don't you guys come up.
And with that, we can go ahead and take questions.
We have a mic here in the middle of the room, if you
wouldn't mind using it, so the webcast can hear the question,
that would be great.
There is also roving mics.
[SIDE CONVERSATION]
AUDIENCE: Lisa Krieger, San Jose Mercury News.
Can you bring us up to date on the status of partnerships, or
potential partnerships, with local health care providers,
Kaiser, Palo Alto Medical Foundation, also insurance
companies, which continue to be the biggest industry, I
think, to most of us.
MARISSA MAYER: Sure, we've made excellent progress on all
these partnerships.
And Roni can talk about that.
RONI ZEIGER: We don't do exclusive partnerships.
We're excited to work with everyone.
We're talking to folks locally and nationally.
And you can imagine, many of those that you mentioned,
we're in talks with.
And if you are a patient somewhere, and we don't yet
have, and you want access to, your records, and we can help,
go ahead and let them know.

AUDIENCE: Hi.
I'm John Pallatto, with Eweek.
And the question I have regarding Google Health is to
what degree are the personal health records totally
separated from the rest of the system, totally private, or is
there any kind of sort of anonymous informational
aggregating that is going on within Google,
within Google Health.

RONI ZEIGER: So let me give an example of what we are doing,
or what we are interested in doing in terms of aggregated
information.
If, for example, six months from now, there are a lot of
Google Health users, and we'd be able to publish a statement
that was true according to data from Google Health users
along the lines of 10% of diabetics last year, who are
Google Health users, had a flu shot.
That would be aggregated information that couldn't be
tied back to any individual that we would
feel comfortable sharing.
In addition, in terms of connection between different
Google properties, to two quick comments I'll make, and
hopefully that will help answer your question.
One is--
right now there's a connection, for example, with
the contacts list that you see in Gmail.
So with one feature of Google Health, you can search for
doctors, import them into your Google Health contacts list.
That automatically creates a list, a group, within your
Gmail contacts.
So if you, for example, have five doctors and a
chiropractor, or what have you, in your Google Health
contacts list, you'll be able to find those in your Gmail
contacts as well.
Last comment all make is that what absolutely will not be
the case is that no Google Health user should expect, or
will ever find, their health information as search results
anywhere on Google.
That information is yours, and only you have access to it.

AUDIENCE Tom Forenski, Silicon Valley Watcher.
We spend so much money on health care as a country.
Why am I supposed to take more responsibility for my health
care by integrating all this information?
Shouldn't my doctor have this information on hand?
After all, they can make better decisions than I can.

MICHAEL ROIZEN: I'm going to just give you get that 75% of
all health care costs are due to chronic illness, that you
can prevent.
We can't do it.
You've got to do it.
We, as physicians, can't do it.
We, as Google, can't do it.
In other words, we can't take the walk.
You have to.
You've got to record it in some place.
And that's a way in making it fun and enjoyable, it's
something to inspire you and keep you motivated.
But you're the only who can prevent chronic disease.
The United States has twice the health care costs of Great
Britain and Canada.
Why?
We have twice the chronic disease on an
age weighted basis.
And you get to change that if you want to.
And that's why Google Health is so important, and the
Google Go for Good Campaign is so important.
DEAN ORNISH: And just to build on what Mike said, you can not
only prevent diseases, you can often reverse these diseases
simply by changing lifestyle.
We've shown in our studies that even severe heart disease
can be reversed by making changes in diet and lifestyle.
Progression of prostate cancer, and by extension,
breast cancer, can be stopped or perhaps even reversed,
especially at early stages-- diabetes, hypertension,
obesity, elevated cholesterol levels.
And empowerment is not the same as blame.
If you're a victim, then there is not much you
can do about it.
We're moving away from the model where the doctor is the
all-knowing and tells the patient what to do, to saying
we're a partner, we can empower you with information.
And we've shown this in our studies.
But now we can make this information available to the
people who most need it.
Finally it's coming at a time when health care costs, which
are really disease care costs, are reaching a tipping point.
It's going to become a major focus of the
presidential debates.
And so the timing couldn't be better in terms of getting
information out to people that we know can be beneficial at a
time when it's most needed-- and at a time when the high
tech interventions, like for example, angioplasties have
been proven very clearly not to prolong life, not to
prevent heart attacks, unless you're in the middle of having
one, which 95% of people who get angioplasties are not.
And we spent $30 billion in this country on that one
procedure alone last year.
That's why I'm working with Google.
It's an exciting time to be doing it.
MARISSA MAYER: And I would also raise the point that,
even for healthy individuals, this is really an element if
convenience, and also, in some cases, cost reduction.
So for example, I'm going on a long trip this summer to two
different places.
And I wanted to find out if I was up to date on my
vaccinations.
Right now, that's very hard to find out.
When was my last vaccination at Stanford, was it here from
the Google doctor, was it from the Palo Alto Medical
Foundation.
All those records aren't in one place.
And having them knitted together, just so you
understand do I need this shot or not, and can you actually
save the time of having to look all of that information
up across all those offices, helps.
We've also seen interesting cases from the Cleveland
Clinic pilot that we did.
So for example, the Cleveland Clinic, because it's the
nation's leading institution on cardiac work, a lot of
people travel there to have their cardiac surgeries.
They arrive, and the first thing that Cleveland
immediately has to do is repeat all the tests, tests
that probably your local doctor just did.
So if you actually have this as part of a unified health
record, this could be transitioned to Cleveland and
we could avoid all the extra expense and inconvenience to
you, having to rerun on those tests.

AUDIENCE: Two questions.
First of all, how are going to make money from this, and can
you talk about that in light of privacy, and [INAUDIBLE].
MARISSA MAYER: No, it's not.
AUDIENCE: It's not.
So how does this work for Google's platform?
MARISSA MAYER: Well, for us, this is really tied to our
core mission.
It's organizing the worlds information.
It's a lot of information to bring online.
And we think it ultimately makes Google, as a tool, more
useful to users overall.
RJ PITTMAN: I might add that, as well, that there are a lot
of things that we do in trying to build user loyalty.
And bringing value to the users, we were talking about
earlier in search properties, you'll notice that some of our
other properties, like Google News and even Image search,
were very resistant to leaping out and making it immediately
relevant go Google's bottom line, because it's really the
user that comes first.
RONI ZEIGER: I'll add also that, you'll notice in the
product, there's a Google Search box on every page.
And we expect that people who find Google Health useful will
spend more time there and do more web search.
And they'll get taken to google.com search results,
which, of course, have our standard advertising on the
side there.
AUDIENCE: And you said earlier that you expect thousands of
new applications, best applications [UNINTELLIGIBLE],
which implies that there's some sort of open
platform for this.
Can you discuss how people will plug into this Google
Health product.
RONI ZEIGER: Absolutely.
So, today we're also publishing our APIs, and in
lingo, which means our instructions for how
programmers can connect to Google Health.
And that's public and we look forward to working with a
whole slew of developers, some of whom we already know about,
and I'm sure many of whom we don't yet.
And they'll have ideas, I'm sure, that we
haven't thought of yet.
They'll say you know what, what people really want is a
widget that does xyz that's personalized to their health
information.
They'll build that product using our public platform.
And we will review their work and work with them and get
them integrated.
And then users will have the choice whether or not they
wish to connect securely with that partner.

AUDIENCE: Larry Maggot, CBS News.
To some extent, you answered this.
But today happens, by coincidence, to be the US
debut of Nintendo Wii Fit, which is one of hundreds of
gadgets that theoretically could add data to something
like Google Health.
And I'm thinking of blood pressure monitors, the fit
link system that they have my local YMCA and all sorts of
gadgets out there.
Are you going to be developing some kind of, you mentioned
APIs, but is there going to be an attempt to integrate a lot
of these other electronics into your health records for
tracking your exercise and your health.
RONI ZEIGER: So absolutely.
We're a surprisingly small team, so we focused on the
most important places to start with.
But no question that a very high priority is, for example,
integrating with blood pressure monitors, basically
any devices or any context where the user has health
information that they would wish to pull into their Google
Health account, we're interested in building those
bridges so that they can do that, if they so choose.
AUDIENCE: And exercise equipment as well?
RONI ZEIGER: Anything is possible.
Like I said, the programming interface is public.
So if you have an exercise machine and you know how to
hack it, you could do it.

AUDIENCE: Wendy Tanaka with forbes.com.
An unrelated question, sort of related though--
does Google have response to yesterday's Microsoft and
Yahoo announcements?
MARISSA MAYER: Not at this time.
AUDIENCE: Thanks.

AUDIENCE: Marissa, Greg Sterling, Search Engine Land.
I want to come back to the privacy issue a bit.
I think that the underlying service is a great
value to end users.
There's no question about that.
More information sharing will provide a lot of benefits.
But I'll never forget a line that somebody from Blue Cross
said to me on the phone and telling me why my wife had
been denied for coverage, which is, this is a verbatim
quote, "Blue Cross makes no apologies for being a for
profit enterprise." And that line will
never leave my brain.
And I'm wondering about the efforts that employers and
insurance carriers might try to undertake to get access to
this information.
Do you foresee any problems or challenges there.
For instance, let me have access to Google Health
profile as a condition of applying for this job or
accepting this job, because people know that this kind of
information exists.
MARISSA MAYER: Well we certainly have put in place
the firmest privacy policy that we could construct.
And we are putting the sharing in the hands of the users.
So it's at the user's discretion as to whether or
not they want to share their profile and their information
with anyone, should it be an insurance company or should be
an employer.
And certainly, that privacy policy has been designed to
try and make it straightforward for users who
want to abstain from certain sharing to do so, because it
is entirely within their control and their discretion.

AUDIENCE: James Niccolai, with IDG News Service.
I assume this is limited at the moment to North
America or to the US.
I wonder if you can talk about whether you've even started
signing up partners in Europe and Asia, give us any kind of
idea when the service might be available there.
RONI ZEIGER: Sure.
I can speak to that.
So this is indeed a US launch.
And we are well aware that the world is a much broader place
than just the US.
And we're really looking forward to providing the
service, assuming that it turns out to be as useful as
we hope it is, worldwide.
We're talking to folks in places
such as that you mentioned.
There's nothing specific that we have to
announce at this time.
I think it's worth noting that health care is probably a
little bit more complex than some of the other products
that Google launches.
You can imagine that even at the level of, for example, the
important area of privacy and regulation around there, that
we have a lot of homework to do and a lot of learning.
So a lot of discussions going on.
We look forward to getting there as quickly as possible.
It's absolutely part of our mission, but
this is where we started.
AUDIENCE: Verne Kopytoff, San Francisco Chronicle.
How is Google Health differentiated from other
electronic file systems?

RONI ZEIGER: So first of all, I think it's a really
important and good thing that there are many smart and
resourced organizations working on this problem.
This is a big area.
I think that the whole notion of personal health records and
related is an era that's really just beginning.
We understand a fair amount about them, but we're
certainly learning as we go.
The fact that really only a few percent of people in this
country are using those kinds of tools at this point really
means that we have, the royal we, we haven't
gotten it right yet.
So what we have focused on this is an excellent user
experience, an open platform, so that users can connect to
places that might have some of their health care information,
like their pharmacy and their hospital, and that same open
platform, so that third parties, who have great ideas
of what those users could do, and therefore make it a much
more meaningful experience to have their health care in one
place so that they can really act on it.
Those are the things that we're
focusing on most closely.
And we'll certainly leave it up to you to give us feedback
about which parts we're doing great, and which parts we
could do better.
MARISSA MAYER: And I would even summarize more concisely.
It's about the three Ps and the U--
privacy, platform, portability, and users.
Those are really the thing that set Google Health apart.
It makes your data more portable.
It's on a platform, so anyone can play with it.
It's done with ultimate privacy.
And it's done with a big user focus.
We think that the user interface is one of the nicest
parts of the application.
It's extremely user friendly.
It's easy to interact with or organize.
I know myself, whey I thought about it, well, what format
does a doctor's chart even have. And if I could look at
it, would I even be able to understand it?
Taking that apart, breaking it down, and displaying it in a
way that makes it really easy to interact with is one of the
key differentiating features of Google Health.
DEAN ORNISH: And just to build on that, besides the medical
benefits of having access to this information, to avoid the
kind of drug interactions, or to have your EKG available at
2:00 in the morning, if you show up in
the emergency room--
it's been estimated that 30% of health care costs are due
to waste and inefficiency.
A lot of it is spent chasing down information.
And so, if we're going to try to make health care affordable
to the people who most need it, the 47 million Americans,
for example, that would be part of a universal health
care plan, we have to find ways of making
the system more efficient.
And this is really good start.

AUDIENCE: David Louie with ABC7.
Given the issues of ID theft, and hacking that go on in the
internet space now, what risks are there to medical
facilities and to doctors in terms of potential
modification of records.
And what legal exposure do you have after you've distributed
or have shared it with the user, with the patient, that
if something is modified, if there is a drug interaction
because somebody didn't change their records, or somebody had
the information hacked.
What's sort of protection do we have that you're not going
to open up a legal mess as a result of this?
RONI ZEIGER: I'll take a first stab at that.
So whenever I get asked hard questions, I was think choice
and transparency.
Those are really the key tenets that drive everything
we're doing here.
So transparency means that we make it as clear as we know
how to, to the user, what they're doing, what they're
deciding to do, what they're deciding not to do.
And choice means that it's completely up to them.
Like we said, the user decides who, if anyone, has access to
their information.
They can also revoke such access if they do grant it.
With respect to the question about the modification of
records, it's a really important one and we've
learned a lot a long way from our advisory council, and the
many partners were working with on how to best do that.
So you saw in the demonstration that I was able
to enter a condition and/or a medication myself.
It was clearly indicated there as user entered.
I was also able to import some data, for example, from
Walgreens pharmacy.
Now I could annotate the record that I imported from
Walgreens pharmacy with a note.
For example, I could say Roni, remember to take this at
night, because it makes a little bit sleepy.
However, I can't change the dose.
I can't increase the number of refills.
I can only add a separately kept note.
So if I then import something from Walgreens and then share
it with my doctor at the Cleveland Clinic, my doctor
there will see the attribution.
My doctor there, she'll see what is user entered and what
had been imported from Walgreens.

AUDIENCE: Steven Shanklin from cnetnews.com.
I'm having trouble figuring out where
exactly data is stored.
I understand I can import
information into Google Health.
But it seems like there are also a lot of other databases
that it perhaps ties into, that might be stored
elsewhere, at my doctor's office, at Walgreens, perhaps,
or maybe it's updated automatically.
I'm wondering about the situation where I want to
figure out do I want to share information that's at this
medical facility with that doctor over there, or with
this pharmacy.
And it sort of seems like a combinatorics nightmare.
And I'm also wondering, specifically, if there's a
granularity setting.
Do I share all, or just some?
How do I control that?
RONI ZEIGER: Great question.
So the first question, I think the best way to think about it
is that Google, on your behalf, is storing a copy of
your records.
You can enter data into that into that account.
You can import a copy from Walgreens, or Longs, or Beth
Israel in Boston.
And then you have a copy.
You haven't taken away the data from your account at
Walgreens, you've just made a copy of it.
If you then, in turn, decide to share some of that data
with your local hospital, and they've integrated with Google
Health, then you're giving them the right to pull a copy,
or to read from yours.
But this is a user controlled database that, of course,
Google is hosting.
Depending on how the connections are set up, the
user may have the ability to regularly pull data from, say,
their Cleveland Clinic account.
Similarly, depending on what privileges they give to the
Cleveland Clinic, the Cleveland Clinic may have the
right to regularly pull data from the
Google Health account.
MARISSA MAYER: And this is explained as you set up each
set of sharing links.
So the way this would work, for example, for me is I would
go, I would pick Quest, because they do my
diagnostics, I would pick Walgreens, because they handle
my pharmacy, and probably Palo Alto Medical Foundation.
So it's as easy as picking out your medical providers out of
the list. You go through the agreement to set up the
sharing link with each one.
And information is regularly pulled in accordance with
those sharing procedures.
That means then, should I want to start, say, seeing the
Google doctor here, and wanted to share the information from
those three sources with her, it's easy, because it's
already in my copy of the data.
And I would just set up a sharing link with the Google
doctor, who would then see all this information.
But it's really as easy as choosing your providers out of
a list.
MALE SPEAKER: You can always change it too.
MARISSA MAYER: You can always revoke.
RONI ZEIGER: Your question about granularity is an
important one.
As of the version that we launched today, it is all or
none sharing of your profile.
And we absolutely recognize that granularity is something
that people want.
And we're working on it.
But we thought that you all had been waiting enough.
RJ PITTMAN: I think it's also important to point out that,
in that sharing model, it's all or none for your Google
copy of that information.
We are not opening up sharing access
directly from your provider.
MARISSA MAYER: And I think we have time for one more
question, right here.
AUDIENCE: John Pallatto, from Eweek.
A more fundamental question, to address my colleague's
question, is how secure is the database in Google?
How secure would my records be from hacking, from
intrusion, by anyone?
How do you protect that?
It's the most personal thing you own?
MARISSA MAYER: It is at our highest level of security, so
the computers that it's stored on are actually more secure
than, say, our search servers, or anything else, because we
understand how private that is.
So we've constructed a special infrastructure with an
additional layer of security on it in
order to achieve that.
DEAN ORNISH: And for what it's worth, at the Google Health
advisory council meetings, this has been the number one
topic that we discuss.
We've brought in the top privacy experts.
We completely get it.
We completely share your concerns.
And we've been really happy with how Google has gone about
addressing it.
MARISSA MAYER: I want to thank everyone for being here today.
Thanks.