Chinese search engine company Baidu is building a massive computing cluster for deep learning that will be 100 times larger than the cat-recognizing system Google famously built in 2012, and that should be complete in six months, Baidu Chief Scientist and machine learning expert Andrew Ng told Bloomberg News in an article published on Wednesday. The size Ng is referring to is measured in neural connections, not sheer server or node count, and will be achieved via heavy use of graphics processing units, or GPUs.
That Baidu is at work on such a system is hardly surprising: Ng helped build Google's system (as part of a project dubbed Google Brain) and has been one of the leading voices in the deep learning community for years. He joined Baidu in May, working out of the company's Silicon Valley office, to help advance its capabilities in artificial intelligence.
However, the comparison to Google Brain might not be entirely apt. Google's system was part of a research project, while Baidu's 100-billion-neural-connection system will handle live search traffic for Baidu's hundreds of millions of users in China. Google hasn't publicly disclosed how many servers it runs overall, much less the number dedicated to the deep-learning models that currently power applications such as speech recognition and image search.
Still, Baidu is attempting some impressive things that will certainly benefit from the work being done by Ng and his team, as well as by Baidu's Beijing-based artificial intelligence lab. For starters, they're trying to keep up with the trends in mobile search. Baidu CEO Robin Li told Bloomberg that 10 percent of the company's search queries are currently done by voice, and that voice and image search will surpass text queries within five years.
And at its Baidu World conference this week, the company also announced a lensless version of Google Glass, called Baidu Eye (pictured above and below), which appears to rely heavily on deep learning. It sounds like an interesting device: a camera on one side of the headset analyzes the wearer's surroundings and sends audio information to an earpiece on the other side, as well as to a smartphone. Users can control Eye via voice commands or gestures.
Baidu and Google are just two of many technology companies now investing heavily in deep learning and other forms of artificial intelligence to improve their capabilities in fields such as computer vision, speech recognition and text analysis. Others include Facebook, Microsoft, Twitter, Amazon, Apple, Netflix, Spotify and Yahoo. Google is taking things a step further by designing its own quantum computing processors as part of its Quantum Artificial Intelligence lab.
We’ll have a collection of leading researchers from these companies (including Ng) at our Sept. 17 artificial intelligence meetup in San Francisco. The event is currently sold out, but it will be recorded and (if the technology cooperates) possibly live-streamed, as well.