We are all beginning to see the foundations being laid for the future of artificial intelligence (AI), especially in deep learning, which is key to recognizing speech and identifying images. The most visible implementations today are voice assistants and driverless cars.
Many tech companies have pledged support for further research and development in deep learning. The first that comes to mind is Google, but the company is no longer alone on the front lines. Facebook Inc., operator of the world's largest social network, has now come out with its own solution and is lending a hand to push forward the foundations of machine learning and AI.
Facebook is making a beast of a computer, called Big Sur, available to companies interested in furthering voice and facial recognition technology.
"The recent advances in [machine learning and artificial intelligence] have been enabled by two trends: larger publicly available research data sets and the availability of more powerful computers - specifically ones powered by GPUs," explain Facebook's Kevin Lee and Serkan Piantino. "Most of the major advances in these areas move forward in lockstep with our computational ability, as faster hardware and software allow us to explore deeper and more complex systems."
To be more precise, Big Sur is Open Rack-compatible hardware designed for large-scale AI computing. Big Sur incorporates eight high-performance GPUs (Nvidia Tesla M40) that consume up to 300 watts each, and it is compatible with multiple PCI-e layouts.
The whole setup was made possible through Facebook's collaboration with other tech companies, particularly Nvidia, whose Tesla Accelerated Computing Platform it leverages. Training is distributed across the eight GPUs, which makes the process twice as fast as Facebook's previous hardware, and neural networks that are twice as big can also be explored twice as quickly.
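The speedup from spreading training over multiple GPUs comes from data parallelism: each GPU computes gradients on its own slice of a batch, and the results are averaged before the shared weights are updated. The toy sketch below illustrates the idea in plain Python with simulated workers; it is not Facebook's actual software stack, and the loss, learning rate, and worker count are made up for illustration.

```python
# Illustrative sketch of data-parallel training (NOT Facebook's actual code):
# each "GPU" gets a shard of the batch, gradients are computed per shard,
# then averaged (the all-reduce step) before one synchronized weight update.

def compute_gradient(weight, batch):
    """Toy gradient of a 1-D least-squares loss: d/dw mean((w*x - y)^2)."""
    return sum(2 * (weight * x - y) * x for x, y in batch) / len(batch)

def data_parallel_step(weight, batch, num_workers=8, lr=0.005):
    """Split the batch across `num_workers` simulated GPUs, average grads."""
    shards = [batch[i::num_workers] for i in range(num_workers)]
    grads = [compute_gradient(weight, s) for s in shards if s]
    avg_grad = sum(grads) / len(grads)   # all-reduce: average across workers
    return weight - lr * avg_grad        # single synchronized update

# Example: learn w so that w*x approximates 3*x.
data = [(x, 3.0 * x) for x in range(1, 17)]
w = 0.0
for _ in range(200):
    w = data_parallel_step(w, data)
```

Because the shards are processed independently, the per-step wall-clock cost shrinks roughly in proportion to the number of GPUs, which is where the "twice as fast" figure comes from.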
With that many GPUs and that much power consumption, it's easy to imagine Big Sur steaming when pushed to its computing limits. However, Facebook claims its newly developed server doesn't need any special cooling devices, as it is designed for thermal and power efficiency. Moreover, it can be air-cooled.
"Because of the enormous amount of heat and power drawn by the graphics processing chips used in building the machines, the team that melted its first enclosure would have gotten a steak dinner," Fortune's Stacey Higginbotham writes, retelling Piantino's explanation in a conference call. "But the challenge in building the hardware wasn't reducing the heat from the semiconductors, but rather the amount of power they consumed."
Note that Facebook took approximately 18 months to finish Big Sur. The company's AI laboratory was established back in 2013.
A likely immediate implementation is on the company's social network, where people upload and share enormous numbers of videos and photos. The new hardware can certainly breeze through facial recognition, and automated recognition and tagging don't feel too far away now.
With its commitment to open-sourcing its code and academic papers, Facebook is contributing the design to the Open Compute Foundation for others to use and improve on.
Below is a demonstration by Yann LeCun, Facebook's Director of AI Research, showing how smart its AI is.