Nvidia this week unveiled a new, more powerful Titan X graphics card.
Boasting 12 billion transistors, 3,584 CUDA cores at 1.53GHz (up from 3,072 cores at 1.08GHz in the previous Titan X), 12GB of GDDR5X memory, and more than 10 teraflops of computing performance, the $1,200 GPU began with a bet.
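The 10-teraflop figure follows directly from the published specs. As a rough sketch (assuming the conventional accounting of one fused multiply-add as two floating-point operations per CUDA core per clock cycle):

```python
# Back-of-the-envelope check of the ">10 teraflops" claim, assuming
# peak FP32 throughput = CUDA cores x boost clock x 2 FLOPs per cycle
# (a fused multiply-add counted as two operations).
cuda_cores = 3584
boost_clock_ghz = 1.53

peak_tflops = cuda_cores * boost_clock_ghz * 2 / 1000
print(f"Estimated peak FP32 throughput: {peak_tflops:.2f} TFLOPS")
# Roughly 10.97 TFLOPS -- comfortably over the 10-teraflop mark.
```

By the same arithmetic, the previous Titan X (3,072 cores at 1.08GHz) peaks at about 6.6 teraflops, which is why a single-chip 10-teraflop target sounded like a long shot.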
“Brian Kelleher, our top hardware engineer, bet our CEO, Jen-Hsun Huang, we could get more than 10 teraflops of computing performance from a single chip,” Nvidia said in a blog post. “Jen-Hsun thought it was crazy.
“Well, we did it,” the company announced. “The result is crazy. And, as of today, Jen-Hsun now owes Brian a dollar.”
In a Wednesday meeting of deep-learning experts at Stanford University, Nvidia’s CEO presented the Titan X to Baidu Chief Scientist Andrew Ng. Four years ago, Ng helped jumpstart the field of artificial intelligence by using GPUs to build a network of artificial neurons.
“If you’re a machine learning researcher, having access to a machine that is [two times] as fast means that you are [two times] as productive as a researcher,” Ng, an associate professor at Stanford, told the crowd this week.
Some participants are already getting a head start on that research, after Huang gave away a handful of Titan X cards to audience members. Everyone else will have to wait a week: The new Titan X will be available Aug. 2 for $1,200 from Nvidia.com in North America and Europe; it is coming soon to Asia.
Earlier this month, Nvidia announced the third and cheapest card in its Pascal family of GPUs. The $249 GTX 1060 follows the GTX 1070 ($379) and top-of-the-line GTX 1080 ($600)—the latter of which PCMag’s Matthew Murray called a “game-changing video card.”