While most PS3 owners never took advantage of the system's Linux capabilities, "Other OS" functionality is critical to the universities and institutions that have deployed PS3 clusters as high-performance compute farms. We talked with several project leads about the impact of Sony's decision and what it means for low-cost supercomputing programs.
A cluster of PS3s: the U.S. Air Force's 2,000-console supercomputer. Image courtesy: U.S. Air Force
Flashback to early 2007. Sony's $599 PlayStation 3 had limped to market just a few months earlier and was then the least-focused and most expensive system on the market. That started to change in February, as researchers began reporting on the PS3's real-world performance in high-performance computing (HPC) applications. The Cell processor at the heart of the PlayStation 3 was a monster on paper, and the results that came back from even the earliest clusters showed that the CPU, jointly developed by IBM, Toshiba, and Sony, had a massive set of real-world teeth as well.
Once word got out that Sony's PS3 offered the performance of a high-end compute cluster for $600, interest spiked. A Folding@Home client appeared in March of 2007 and numerous announcements followed. Today, the PS3 is used in calculating financial risk, scientific applications, HPC clusters, and even by the military. In late 2009, the Air Force Research Laboratory put in a request for an additional 2,200 PS3s to expand a cluster of 336 systems that were already on site. The military branch built the initial cluster around Cell's image-processing capabilities and justified the request by noting: "a single 1U server configured with two 3.2-GHz cell processors can cost up to $8k, while two Sony PS3s cost approximately $600...the approximately tenfold cost difference per GFLOP makes the Sony PS3 the only viable technology for HPC applications."
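The arithmetic behind the Air Force's "tenfold" figure is easy to reconstruct from the dollar amounts in the quote. One plausible sketch: the SPE counts below are our assumption, not part of the request — a server-grade Cell exposes all 8 SPEs, while a PS3 running Linux exposes only 6 (one disabled for manufacturing yield, one reserved by the hypervisor).

```python
# Dollar figures from the Air Force request quoted above
server_cost = 8000    # 1U server with two 3.2 GHz Cell processors
ps3_pair_cost = 600   # two Sony PS3s

# Assumed usable SPE counts (not in the quote): 8 per server Cell,
# 6 per PS3 Cell under Linux
server_spes = 2 * 8
ps3_spes = 2 * 6

cost_per_spe_server = server_cost / server_spes  # $500 per usable SPE
cost_per_spe_ps3 = ps3_pair_cost / ps3_spes      # $50 per usable SPE

print(cost_per_spe_server / cost_per_spe_ps3)    # prints 10.0
```

Since every SPE in both machines runs at the same 3.2 GHz, cost per usable SPE tracks cost per GFLOP, and the ratio lands almost exactly on the quoted figure.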
"We propose the above theorem to explain the scientific relationship between black holes, the PS3, and objects of tremendous mass from which our eyes cannot escape..."
Consider this Exhibit B. Notice how much sexier Lara is on the far right? Supercomputing at work.
Game development and scientific number-crunching have quite a bit more in common than might be apparent at a glance. Both types of programs commonly require low-latency/high-bandwidth processing, and both rely on a processor's ability to compute complex interactions between multiple objects moving at different velocities along different vectors. According to Dr. Gaurav Khanna, one of the first scientists to build a PS3 cluster back in 2007, there's definitely a potential link between gaming and more lofty-minded pursuits.
"There is huge potential for interaction between game developers and computational scientists," Dr. Khanna said. "Certainly at the level of squeezing performance from computer hardware like you suggest, and—in my view—also in the context of interacting with running simulations, which of course the gaming community has figured out extremely well."
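That shared workload — many bodies interacting along different vectors — has a familiar shape in code. Here's a minimal, illustrative sketch of a naive O(n²) gravitational update, the same pairwise-interaction pattern a game physics engine and an astrophysics simulation both chew through (the function name and unit normalization are our own; this is not code from any of the projects mentioned):

```python
import math

def nbody_step(positions, velocities, masses, dt, g=1.0):
    """Advance a set of point masses one timestep: compute every
    pairwise gravitational pull, then integrate velocity and position
    (semi-implicit Euler, with a small softening term to avoid
    division by zero when bodies get close)."""
    n = len(positions)
    accels = []
    for i in range(n):
        ax = ay = az = 0.0
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - positions[i][0]
            dy = positions[j][1] - positions[i][1]
            dz = positions[j][2] - positions[i][2]
            r = math.sqrt(dx * dx + dy * dy + dz * dz) + 1e-9
            f = g * masses[j] / (r * r * r)
            ax += f * dx
            ay += f * dy
            az += f * dz
        accels.append((ax, ay, az))
    velocities = [(vx + ax * dt, vy + ay * dt, vz + az * dt)
                  for (vx, vy, vz), (ax, ay, az) in zip(velocities, accels)]
    positions = [(px + vx * dt, py + vy * dt, pz + vz * dt)
                 for (px, py, pz), (vx, vy, vz) in zip(positions, velocities)]
    return positions, velocities
```

The inner loop is exactly the kind of dense, regular floating-point work the Cell's SPEs were built for — which is why the same silicon serves both Lara Croft and black-hole research.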