Facebook releases design for souped-up artificial intelligence server, ‘Big Sur’
Facebook is releasing the hardware design for a server it uses to train artificial intelligence software, allowing other companies exploring AI to build similar systems.
Code-named Big Sur, the server runs Facebook's machine learning programs, a type of AI software that "learns" and gets better at tasks over time. Facebook is contributing Big Sur to the Open Compute Project, which it set up to let companies share designs for new hardware.
One common use for machine learning is image recognition, where a software program studies a photo or video to identify the objects it contains. But it's being applied to all kinds of large data sets, to spot things like email spam and credit card fraud.
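To illustrate the idea of software that "learns" from data (this is a toy sketch, not Facebook's software; the training examples are invented), a minimal spam filter can count word frequencies in labeled messages and use them to score new ones:

```python
from collections import Counter
import math

# Toy labeled training data (hypothetical examples, not a real corpus).
train = [
    ("win free money now", "spam"),
    ("free prize claim now", "spam"),
    ("meeting agenda for monday", "ham"),
    ("lunch with the team monday", "ham"),
]

# "Learning" step: count how often each word appears in each class.
counts = {"spam": Counter(), "ham": Counter()}
totals = {"spam": 0, "ham": 0}
for text, label in train:
    for word in text.split():
        counts[label][word] += 1
        totals[label] += 1

def score(text, label):
    """Log-probability of the text under a naive bag-of-words model,
    with add-one smoothing so unseen words don't zero out the score."""
    vocab = set(counts["spam"]) | set(counts["ham"])
    logp = math.log(0.5)  # equal class priors
    for word in text.split():
        logp += math.log((counts[label][word] + 1) / (totals[label] + len(vocab)))
    return logp

def classify(text):
    return "spam" if score(text, "spam") > score(text, "ham") else "ham"

print(classify("claim your free money"))  # words seen in spam examples win out
print(classify("monday team meeting"))    # words seen in ham examples win out
```

Systems like Big Sur train vastly larger models of the same general flavor, but the principle is identical: statistics extracted from labeled examples drive predictions on new inputs.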
Facebook, Google and Microsoft are all pushing hard at AI, which helps them build smarter online services. Facebook has released some open-source AI software in the past, but this is the first time it has released AI hardware.
Big Sur relies heavily on GPUs, which are often more efficient than CPUs for machine learning tasks. The server can have as many as eight high-performance GPUs that each consume up to 300 watts, and can be configured in a variety of ways via PCIe.
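Those figures imply a substantial power budget, which is why the cooling discussion below matters. A back-of-envelope calculation from the numbers in the article (the electricity price is a hypothetical illustration, not from the article):

```python
# Figures from the article: up to eight GPUs, each drawing up to 300 watts.
num_gpus = 8
watts_per_gpu = 300

gpu_power_watts = num_gpus * watts_per_gpu
print(gpu_power_watts)  # 2400 W for the GPUs alone, before CPUs, drives, fans

# Hypothetical electricity price, for illustration only.
price_per_kwh = 0.10
hours_per_year = 24 * 365
annual_kwh = gpu_power_watts / 1000 * hours_per_year
annual_cost = annual_kwh * price_per_kwh
print(round(annual_kwh), round(annual_cost))  # ~21024 kWh, ~$2102 per server-year
```

At data-center scale, with thousands of such servers, shaving cooling overhead clearly adds up.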
Facebook said the GPU-based system is twice as fast as its previous generation of hardware. "And distributing training across eight GPUs allows us to scale the size and speed of our networks by another factor of two," it said in a blog post Thursday.
One notable thing about Big Sur is that it doesn't require special cooling or other "unique infrastructure," Facebook said. High-performance computers generate a lot of heat, and keeping them cool can be costly. Some are even immersed in exotic liquids to stop them from overheating.
An image of Big Sur shows a large airflow unit inside the server
Big Sur doesn't need any of that, according to Facebook. It hasn't released the hardware specs yet, but images show a large airflow unit inside the server that presumably contains fans that blow cool air across the components. Facebook says it can use the servers in its air-cooled data centers, which avoid industrial cooling systems to keep costs down.
Like a lot of other Open Compute hardware, it's designed to be as simple as possible. OCP members are fond of talking about the "gratuitous differentiation" that server vendors put in their products, which can drive up costs and make it harder to manage equipment from different vendors.
"We've removed the components that don't get used much, and components that fail relatively frequently — such as hard drives and DIMMs — can now be removed and replaced in a few seconds," Facebook said. All the handles and levers that technicians are supposed to touch are colored green, so the machines can be serviced quickly, and even the motherboard can be removed within a minute. "In fact, Big Sur is almost entirely tool-less — the CPU heat sinks are the only things you need a screwdriver for," Facebook says.
It's not sharing the design purely to be altruistic: Facebook hopes others will try out the hardware and suggest improvements. And if other big companies ask server makers to build their own Big Sur systems, the economies of scale should help drive costs down for Facebook.
Machine learning has come to the fore lately for a few reasons. One is that the large data sets used to train the systems have become publicly available. The other is that powerful computers have gotten cheap enough to do some impressive AI work.
Facebook pointed to software it has already developed that can read stories, answer questions about an image, play games, and learn tasks by observing examples. "But we realized that truly tackling these problems at scale would require us to design our own systems," it said.
Big Sur, named after a stretch of picturesque California coastline, uses GPUs from Nvidia, including its Tesla Accelerated Computing Platform.
Facebook said it plans to triple its investment in GPUs so that it can add machine learning to more of its services.
"Big Sur is twice as fast as our previous generation, which means we can train twice as fast and explore networks twice as large," it said. "And distributing training across eight GPUs allows us to scale the size and speed of our networks by another factor of two."
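Distributing training across GPUs, as the quote describes, typically means data parallelism: the batch of training examples is split into shards, each GPU computes a gradient on its shard, and the results are averaged. A minimal sketch of that averaging step in plain Python (the model and worker setup here are illustrative, not Facebook's code):

```python
# Simulate data-parallel training: split a batch of examples across
# eight workers, let each compute a gradient on its shard, then average.
# Toy model: fit y = w * x by squared error; grad = 2 * x * (w*x - y).

def gradient(w, shard):
    """Mean gradient of squared error over one shard of (x, y) pairs."""
    return sum(2 * x * (w * x - y) for x, y in shard) / len(shard)

batch = [(float(x), 3.0 * x) for x in range(1, 17)]  # 16 examples of y = 3x
w = 0.0
num_workers = 8

# Each "GPU" gets an equal-sized shard of the batch.
shards = [batch[i::num_workers] for i in range(num_workers)]
avg_grad = sum(gradient(w, s) for s in shards) / num_workers

# Averaging equal-sized per-shard gradients reproduces the gradient over
# the whole batch, which is why adding GPUs lets you grow the batch (or
# shrink wall-clock time) without changing the math being computed.
full_grad = gradient(w, batch)
print(abs(avg_grad - full_grad) < 1e-9)  # the two gradients agree
```

Real systems add inter-GPU communication (e.g. over PCIe, as the article notes Big Sur supports) to share those averaged gradients, but the arithmetic is this simple at its core.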
Google is also rolling out machine learning across more of its services. "Machine learning is a core, transformative way by which we're rethinking everything we're doing," Google CEO Sundar Pichai said in October.
Facebook didn't say when it would release the specifications for Big Sur. The next OCP Summit meeting in the U.S. takes place in March, so it might say more about the system then.
Source: https://www.pcworld.com/article/418743/facebook-makes-its-big-sur-ai-server-design-available-to-anyone.html
Posted by: petehaske1995.blogspot.com
