One-On-One With Google CEO Sundar Pichai: AI, Hardware, Monetization And The Future Of Search
Sundar Pichai. (Credit: Christian Peacock/Forbes)
Sundar Pichai became CEO of Google last year, when co-founder Larry Page created its parent company, Alphabet. Ahead of this week’s Google I/O, the company’s annual conference for developers, Pichai sat down with FORBES to discuss new products and his plan to lead Google in an AI-first world. We reported in depth on his vision for Google and the future of search earlier this week. Our conversation went into more detail on some subjects and touched on a few other topics. Below is a transcript of it, edited for length and clarity.
You’re unveiling a new way to search, which you are calling the Google assistant. How does it fit into the larger evolution of the company?
We’ve been building these incredible capabilities, be it Search, the Knowledge Graph, our understanding of natural language, image recognition, voice recognition, translation. Particularly over the last three years, we have felt that with machine learning and artificial intelligence, we can do these things better than ever before. They are progressing at an incredible rate. So how do we take that and get Google to be a more intelligent assistant there for you, when you need it? I think of it almost like saying, “How can I help?” at any point in your day. How do we do that? It’s kind of a journey.
Building an assistant is a constant evolution for us. We see it as a conversational assistant, an ongoing conversation between users and Google that helps them get things done in the context of their world. We think of it as your own individual Google. That’s the big journey we are on.
That represents a shift away from the search box. Of course, search has evolved since the original ten blue links. How significant or momentous is this shift, compared to others, like Universal Search or the Knowledge Graph?
I do think it’s a profound change. It’s a big evolution of where we are going. It’s also more ambitious and difficult.
Why is it more difficult?
Search historically used to be queries. Ten years ago, we could point you to a link and be done with it. Today, to help you get things done, if we want to get you movie tickets or maybe a restaurant reservation, you have to complete that step further. The transition to a conversational assistant is harder from a computer science standpoint. In every dimension, it’s more ambitious.
A few years ago, Larry Page and other Googlers talked about search evolving to a point where Google would answer questions before you asked. This feels different.
A big part of the conversational assistant is that. Sometimes the assistant tells you something before you ask. If I can talk about an aspirational example, I think it would be good for the assistant to say, “Mother’s Day is coming up, and here are some things.” You haven’t asked it yet. Think of it as a two-way conversation; sometimes the assistant will be telling you something before you ask it.
The shift from PC-first to mobile-first that happened in recent years is pretty self-explanatory. Now you’re talking about moving from a mobile-first to an AI-first world. What does that mean?
Historically, we’ve largely been device-centric. To me, over time, it makes sense that computing will be there in the context of what you are doing. You are trying to go about your day, and in an ambient way, things are there to help you. You’ll be able to do that because there are more intelligent devices. We will accomplish it because we can do AI. We are in the beginning stages of AI. We are at an incredibly exciting stage of AI. That’s the context in which I talk about it.
Will AI spread across all of Google’s products?
You’re seeing that. This has been a big focus for me. Our core strength and core mission for our users was to be able to organize their information. You can do that increasingly better because of machine learning and artificial intelligence. Google Photos was a great example of that. But in the last 15 months, we also launched Smart Reply in Inbox. That’s an example of where we are assistive to you. The Google Now cards are a great example of that. We launched Goals in Calendar a few weeks ago, in which you can tell your calendar, “Here are my goals,” and it helps you accomplish them. That’s an example of being assistive in what we do. Another example, which we launched last week, is that you can translate from within any Android app.
In general, we are using machine learning and artificial intelligence to be assistive across all our products. But I want to distinguish between that and actually building a conversational assistant, which in the context of Google Home and Allo or search, you actually will be able to engage directly. Both are important.
What’s the role of some pure AI experiments, like AlphaGo, vis-à-vis more practical AI applications? How do they fit in?
There are things you can do now, some in two to three years and some that are deeper and will take more time to do. With DeepMind (the artificial intelligence startup Google acquired) and even some of our internal teams, we are focused on long-term AI. There are several examples of that. AlphaGo is a great example of it. Internally, just with a team of six or seven engineers, we’ve been working with a set of doctors on a condition called diabetic retinopathy. If you detect it early, it’s curable. If you detect it late, people become blind. Using tools based on machine learning, doctors can diagnose it much, much better. It will take time; it has to be reviewed. But it shows the promise of applying AI and machine learning to these kinds of diverse things. The power of these technologies is incredible. It is foundational to how I think we at Google can do better for users across a set of things.
Sundar Pichai. (Credit: Christian Peacock/Forbes)
There’s a debate out there as to whether AI is something we should fear. Should we?
These are all good questions. Personally, I think we are in very early stages of it. We have a long way to go. I also think of AI as being there to help people, and if you approach it that way, I think it is incredibly useful. We all need to be thoughtful about it. I’m very optimistic about it.
From a product perspective, many of the things you’re introducing are things your rivals already have or do or have announced. Microsoft and Facebook both talked about AI-powered chatbots. Amazon, of course, has Echo. How is Google’s approach to these things different, and why do you think you will succeed?
We are not doing bots or something like that. For us this is an evolution of Google itself. People have been asking Google stuff all the time, so the question is how do we do it better. That’s the main goal. We do think we come to it with unique strengths — Search, the Knowledge Graph, machine learning and AI. It’s important to be able to do this at scale across situations, across the entire world. I think that’s where we are uniquely differentiated.
You said you are not doing bots. What’s the distinction between a chat bot and the Google assistant?
Today a chat bot is a specific version of what others are doing with many interactions. We are bringing a version of Google to be available in the context of your phone, your watch, your car, your home or when you are using another product. This is about Google being available to you.
Where else do you think this assistive technology will make sense? The car? The TV?
We are excited about getting machine learning adopted more widely, engaging with the academic community. But over time, we are also excited about exposing machine learning capabilities and APIs as part of our cloud solutions as well. Imagine all the kinds of problems that many many teams around the world are working on being able to tap and apply machine learning and AI to their problems. That scale is what really excites me, be it in fields like healthcare, financial services, education, how you teach people better, climate modeling. I think that’s a huge opportunity.
All of this – the conversational assistant, the messaging product – suggests a shift toward some form of communications tool becoming the new computing platform. Facebook and Microsoft seem to think so, which is why they’re building their chat bots, and in China WeChat has pushed in that direction. Is it a big shift in computing?
It’s too early to tell. It’s an exciting thing. Already 20% of queries on Android are through voice. That’s an aspect of conversational search. It is an exciting shift already under way. But to do it well, it is going to take many years of hard work. In the context of narrow examples, it’s easy to do it. But the vision we are describing, it’s a journey over many years. But it is a big shift. I think so.
Take us four or five years out. How will things be different?
The world is so tough to predict that far out. I would love to be in a position where people feel we are helping them in the context of their lives when they want it, at moments that are appropriate to them. I think it is important for us to actually meaningfully impact people’s lives, not just make one small interaction a little bit faster.
Google tends to think about product first and monetization later. But there must be some thought on how advertising – or some other form of monetization – will fit into these conversations between users and the Google assistant.
I think things will evolve quite substantially. We have always thought that if you solve users’ information needs, a lot of those needs are commercial in nature. Inherently, when information needs are commercial, you are connecting users with people who provide services, so I think there are natural opportunities.
Do you worry that it’s further moving away from the search paradigm, which has been the bread and butter of Google’s economic engine?
We always had this question asked throughout the years. The shift to mobile has worked out well. I think of this as a new opportunity to bring in more users. It feels very far from a zero sum game to me.
I presume Larry Page has seen the new products you are unveiling? What has been his reaction, and what sort of direction does he give you?
Larry and I have worked closely for many many years. We always have conversations about all the stuff we are working on, both ways. I think Larry has always been consistent as someone who wants to set a very high bar and always wants to make sure that we are pushing the boundaries, aiming very very high and being very user-centric. The specific feedback may vary, but it’s always in that context.
In many ways the creation of Alphabet makes a lot of sense. It allows Googlers to focus on Google and others to focus on moonshots or ancillary products, initiatives or investments. But the result is that many of the far-out innovations – the sexiest stuff, so to speak – are no longer inside Google. I get that there’s still plenty of interesting stuff going on inside Google, but for employee satisfaction, retention and to attract top talent, do you feel you need to recreate that sense of huge ambition inside Google?
We haven’t had any such issues at all. The most exciting area where we are seeing innovation is machine learning and AI, and a lot of that is happening within Google, and we’ll use that to help Alphabet companies and companies outside. I don’t feel that tension internally at all.
What’s the thinking behind the dedicated hardware group that you created under Rick Osterloh, the former CEO of Motorola?
We’ve always been doing hardware. We have projects across the company. There are a lot of synergies in doing hardware projects together. So we are bringing it all together.
Does that mean there will be more Google hardware products in the future?
It’s a much more thoughtful, coordinated way to do them. Google Home is a great example of the things we will do. We’ve always felt that if there’s an area where we can innovate, we do it.
As you can see with VR, we are working with an ecosystem. We realize if you want to push what’s ahead in the next two to three years, we need to push where display technology is going, push where silicon is going.
Google Glass is part of that group now. Do you expect it to come back into the market as a product?
It’s not that we will specifically do Glass again, but we are looking at where computing needs to be for users, and approach it that way.
How is Sundar’s Google different from Google a year ago?
Personally for me, there is a renewed sense of focus on our mission and transforming the company using machine learning and AI. I think we have this vision of a shift from mobile first to AI first over many years and we want to be able to do that for users. And we want to do that by being deep about computer science and we brought a lot of focus back on to that. That’s important.
I would add that we’ve always cared to do this work for everyone at scale. That’s very near and dear to me.
Third, I do want to engage thoughtfully externally and internally. I feel a sense of responsibility and commitment to engage well with the outside world. These are all areas I’m focusing on.
Source: Forbes Tech