

Google built a processor just for AI

Tensor Processing Units are designed to speed up machine learning.

Jon Fingas
May 18th, 2016
Facebook is reportedly developing custom server chips | DeviceDaily.com


Google is no stranger to building hardware for its data centers, but it’s now going so far as to design its own processors. The internet giant has revealed the Tensor Processing Unit, a custom chip built expressly for machine learning. As Google doesn’t need high precision for artificial intelligence tasks, the TPU is focused more on raw operations per second than anything else: It’s an “order of magnitude” faster in AI than conventional processors at similar energy levels. It’s space-efficient too, fitting into the hard drive bays in data center racks.
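The precision trade-off described above — accepting lower numeric precision in exchange for more operations per second and less energy per operation — can be sketched with a toy example. This is not Google's TPU code (which has never been published in this form); it is a hypothetical illustration of why 8-bit arithmetic is often good enough for machine learning, using simple linear quantization in NumPy:

```python
import numpy as np

# Illustrative sketch only, not TPU internals: neural-net weights stored
# as 8-bit integers instead of 32-bit floats take a quarter of the memory,
# and hardware can do far more 8-bit operations per joule.
rng = np.random.default_rng(0)
weights = rng.standard_normal(1000).astype(np.float32)

# Linear quantization to int8: map the float range onto [-127, 127].
scale = np.abs(weights).max() / 127.0
q = np.round(weights / scale).astype(np.int8)

# Dequantize and measure how little precision was actually lost.
restored = q.astype(np.float32) * scale
max_err = np.abs(weights - restored).max()
print(f"max error: {max_err:.4f} (quantization step: {scale:.4f})")
```

The maximum error is bounded by half the quantization step, which for typical weight distributions is small relative to the noise a trained network already tolerates — the intuition behind prioritizing raw operation counts over precision.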

The fun part? You’ve already seen what TPUs can do. Google has been quietly using them for over a year, and they’ve handled everything from improving map quality to securing AlphaGo’s victory over the human Go champion. The AI could both move faster and predict further ahead thanks to the chip, Google says. You won’t get to buy the chip yourself, alas, but you might just notice its impact as AI becomes an ever more important part of Google’s services.


Engadget is a web magazine with obsessive daily coverage of everything new in gadgets and consumer electronics.