Google’s tensor processing units (TPUs) are interesting, but Nvidia is essential

Google revealed at its I/O 2016 conference that it has developed and manufactured its own chip, the Tensor Processing Unit (TPU), a custom-designed processor built to work with its open-source machine learning framework, TensorFlow.
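For a sense of what that framework looks like in practice, here is a minimal TensorFlow sketch using the public 1.x-era API. The device string, layer shapes and names are illustrative only; TPU targets were available solely inside Google's own infrastructure at the time, so this example pins the graph to a CPU.

```python
import numpy as np
import tensorflow as tf

# Pin a tiny graph to a specific device; inside Google's data centres the same
# framework can target TPUs, while public releases ran on CPUs and GPUs.
with tf.device("/cpu:0"):
    x = tf.placeholder(tf.float32, shape=[None, 784], name="pixels")
    w = tf.Variable(tf.zeros([784, 10]), name="weights")
    b = tf.Variable(tf.zeros([10]), name="bias")
    logits = tf.matmul(x, w) + b  # dense matrix math, the workload TPUs accelerate

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    out = sess.run(logits, feed_dict={x: np.zeros((1, 784), np.float32)})
    print(out.shape)  # (1, 10)
```

The point of the sketch is that TensorFlow separates the description of the computation from the hardware that executes it, which is what lets Google slot a custom chip in underneath existing models.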

The TPUs were used to power DeepMind's AlphaGo, the system that defeated international Go champion Lee Sedol. They are already in use for Street View, RankBrain and Inbox Smart Reply, and are expected to be deployed for Google Home and Google Assistant as well.

Neural networks demand a very different kind of processing from conventional workloads, and they benefit from different hardware approaches. That makes custom architecture viable: it uses fewer resources and runs faster. Google, for example, optimised its design around performance per watt, and says the result is computing power an order of magnitude better than any setup it could build from commercially available products.
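The comparison Google is drawing is performance per watt, i.e. throughput divided by power draw. The short sketch below illustrates the metric with purely hypothetical figures, not any published numbers.

```python
# Toy illustration of the performance-per-watt comparison.
# All figures are hypothetical placeholders, not measured or published values.
def perf_per_watt(ops_per_second: float, watts: float) -> float:
    """Useful work delivered for each watt of power drawn."""
    return ops_per_second / watts

gpu_ppw = perf_per_watt(ops_per_second=5e12, watts=250)   # hypothetical GPU card
tpu_ppw = perf_per_watt(ops_per_second=1.5e13, watts=75)  # hypothetical custom ASIC

print(f"custom chip advantage: {tpu_ppw / gpu_ppw:.0f}x per watt")  # -> 10x
```

An "order of magnitude" advantage on this metric means roughly a ten-fold gain, as in the made-up numbers above.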

Barron’s has posted commentary on Google’s announcement from Morgan Stanley semiconductor analyst Joseph Moore. According to the analyst, Google’s reveal shows how important investment in hardware built specifically for machine learning has become; even conceptualising architectures for such applications is significant. A wide variety of chips will be needed for these future applications, and in Moore’s view Nvidia is poised to dominate the space: AI processing still relies heavily on GPUs, and Nvidia’s experience there could let it penetrate the market more deeply, the analyst said.
