Supercomputing speeds up deep learning training

Researchers from UC Berkeeley, UC Davis, and the Texas Advanced Computing Center (TACC) used the Stampede2 supercomputer to complete a 100-epoch ImageNet deep neural network training run in 11 minutes -- the fastest time reported to date. Using 1,600 Skylake processors, they also bested Facebook's previous record of roughly one hour by finishing a 90-epoch ImageNet training run with ResNet-50 in 32 minutes. Given TACC's large user base and capacity, the researchers expect this capability to have a major impact across many fields of science.
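Results at this scale rest on synchronous data-parallel training: each worker processes a shard of every batch and gradients are averaged across all workers before each update, so the effective batch size grows with the worker count. The sketch below illustrates that general pattern in PyTorch; it is a hypothetical illustration, not the team's code (their setup reportedly used Intel Caffe with a layer-wise adaptive learning-rate scheme, LARS, to keep very large batches stable), and the model, dataset, and hyperparameters here are placeholders.

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler
import torchvision

def main():
    # One process per node/GPU; a launcher (e.g. torchrun) sets RANK/WORLD_SIZE.
    dist.init_process_group(backend="gloo")  # "nccl" on GPU clusters
    rank = dist.get_rank()

    model = torchvision.models.resnet50()
    model = DDP(model)  # gradients are all-reduced across workers each step

    # Each worker sees a disjoint shard, so the effective batch size is
    # per_worker_batch * world_size -- the "large batch" in large-batch training.
    dataset = torchvision.datasets.FakeData(
        size=512, transform=torchvision.transforms.ToTensor())
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    loss_fn = torch.nn.CrossEntropyLoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for images, labels in loader:
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()   # DDP averages gradients across all workers here
            opt.step()
        if rank == 0:
            print(f"epoch {epoch} done, last loss {loss.item():.3f}")

    dist.destroy_process_group()

if __name__ == "__main__":
    main()

A script like this would be launched with one process per worker, e.g. torchrun --nproc_per_node=4 train.py; doubling the worker count doubles the effective batch size, which is why learning-rate scaling schemes such as LARS matter at the scale reported above.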

Mon 13 Nov 17 from TechXplore

Supercomputing speeds up deep learning training, Mon 13 Nov 17 from Eurekalert
