Henry Copeland: Moore's Law, version 15.21 pllqt.it/LrOsdG by @CadeMetz
Facebook Open Sources Its AI Hardware as It Races Google | WIRED

Big Sur includes eight GPU boards, each loaded with dozens of chips while consuming only about 300 watts of power. Although GPUs were originally designed to render images for computer games and other highly graphical applications, they've proven remarkably adept at deep learning. […]

Traditional processors help drive these machines, but big companies like Facebook and Google and Baidu have found that their neural networks are far more efficient if they shift much of the computation onto GPUs. […]

After 18 months of development, Big Sur is twice as fast as the previous system Facebook used to train its neural networks.