Google CEO Sundar Pichai was recently interviewed, and the conversation covered many topics in the tech world. One of the areas that was touched upon was artificial intelligence (AI) and what the future will hold for the technology. In that regard, Pichai likens AI to fire.
Pichai said in the interview, "AI is one of the most important things that humanity is working on. It’s more profound than, I don’t know, electricity or fire."
"[While fire is good] it kills people, too. We learned to harness fire for the benefit of humanity, but we had to overcome its downsides, too."
The Google exec thinks that AI will do great things for humanity, like helping to find a cure for cancer. However, he does admit that it is fair to be concerned about negative impacts on humanity as well. He says we must find a balance between the utility of AI and what defines us as humans.
AI is a big concern for many of the top minds in technology today. Some fear that the rise of AI has the potential to be harmful to humankind, something akin to Skynet. And it's not just "out there" types who have a healthy fear of AI; Tesla CEO Elon Musk has long said that AI poses a significant threat to humanity.
Musk is also putting money into research efforts focused on AI; in 2015 he donated $10 million to AI research. Famous scientist and thinker Stephen Hawking also fears AI run amok. Hawking says that AI has the potential to be the worst event in human history. Hawking fears more than just a "Skynet scenario"; he also fears AI taking over jobs that humans once performed, putting people out of work and harming the economy.
Pichai also talked about the termination of former employee James Damore after Damore criticized Google's diversity policy in a memo he published. Pichai says that he doesn't regret firing Damore and that the decision wasn't political. Pichai said, "The last thing we do when we make decisions like this is look at it with a political lens." The full interview will air on MSNBC on January 26th.