NVIDIA's RealityServer Aims to Deliver Real-Time Rendering On Your Next Netbook—or iPhone
It's hard to pin down exactly what cloud computing is—the term is as vaporous as its proverbial namesake—but to date, the majority of cloud computing applications have emphasized storage, group collaboration, or the ability to share significant amounts of information with specific groups of people. In business IT, the concept of renting server power from IBM or Sun could be seen as a type of cloud computing. So far, though, there's been no push to make GPU power available in a cloud computing environment—something NVIDIA hopes to change. The company announced version 3.0 of its RealityServer today; the new revision sports hardware-level 3D acceleration and a new rendering engine (iray), and can create "images of photorealistic scenes at rates approaching an interactive gaming experience," according to the company.
Here's the eye-opener: NVIDIA claims that the combination of RealityServer and its Tesla hardware can deliver those photorealistic scenes on your workstation or your cell phone, with no difference in speed or quality. Instead of relying on a client PC to handle the task of 3D rendering, NVIDIA wants to move the capability into the cloud, where the task of rendering an image or scene is handed off to a specialized Tesla server. Said server performs the necessary calculations and fires back the finished product, which theoretically allows for the aforementioned frame rates that approach an "interactive gaming experience."
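The round trip described above—client ships a tiny parameter payload upstream, the GPU server does all the rendering, and only a finished image comes back—can be sketched in a few lines. This is a hypothetical simulation of the pattern, not the actual RealityServer API; `render_on_server`, `client_frame`, and every parameter name here are inventions for illustration, with a stub standing in for the Tesla-backed renderer and the network hop:

```python
# Sketch of the server-side rendering round trip: the client sends only
# scene parameters; the server returns a finished (compressed) image.
# All names are hypothetical -- this is NOT the real RealityServer API.
import json
import zlib

def render_on_server(request: bytes) -> bytes:
    """Stand-in for the Tesla-backed renderer: turn a small parameter
    payload into an image payload many times its size. A real server
    would ray-trace the scene with iray; we fabricate pixels."""
    params = json.loads(request)
    w, h = params["width"], params["height"]
    pixels = bytes((x * y) % 256 for y in range(h) for x in range(w))
    return zlib.compress(pixels)

def client_frame(camera: dict, width: int = 64, height: int = 48) -> bytes:
    """Client side: ship a tiny request upstream, get pixels back.
    In real life the two calls below would cross the network."""
    request = json.dumps(
        {"camera": camera, "width": width, "height": height}
    ).encode()
    compressed = render_on_server(request)   # upstream: a few hundred bytes
    return zlib.decompress(compressed)       # downstream: the whole frame

frame = client_frame({"eye": [0, 0, 5], "look_at": [0, 0, 0]})
print(len(frame))  # 64 * 48 = 3072 bytes of pixel data
```

The asymmetry is the point: the request is a few hundred bytes of JSON, while the response carries the entire frame—which is why the scheme's viability hinges on downstream bandwidth, not on the client's GPU.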
NVIDIA has designed a series of rackable Tesla servers, as shown above. The categories above are meant to represent various size possibilities—based on NVIDIA's comments, one could host a RealityServer 3.0 project on a single GPU and possibly scale up past 100 GPUs. A key part of the new software architecture is Mental Images' new iray rendering engine, which was built to take advantage of the tremendous parallel processing capabilities of a modern GPU. According to the FAQ (PDF), iray is an "interactive, consistent, high-performance global illumination rendering technology that generates photorealistic imagery by simulating the physical behavior of light...iray does not depend on complex renderer-specific shaders and settings to approximate global illumination. iray generates photorealistic imagery without introducing rendering algorithm-specific artifacts."
If the system works as advertised, businesses and companies of all sorts could create interactive, 3D web applications that update in real time and offer far more detail than a netbook or smartphone could possibly render at all, much less in real time. Since only the final product is transferred to the client, there's no need for the user to worry about battery life, heat generation, or turning their system into a tortoise in exchange for using one particular application.
Cloudy, With a Chance of
NVIDIA demonstrated RealityServer 3 at its web event, with decidedly mixed results. There's no doubt that the Tesla server off in the atmosphere shot back gorgeous visuals, at speeds that would take weeks to match on the small netbook the company used for its demonstration, but the speed of the updates didn't remotely come close to "approaching an interactive gaming experience," unless said experience involved attempting to run Doom on your 16MHz 386 with the screen size set at maximum. Update times varied from 10 to 20 seconds, and that's a significant lag when discussing online usage patterns.
To be fair, NVIDIA's backend Tesla server was built with just 16 GPUs, but said server presumably had just one client. Photorealistic levels of detail were a fine way to display the capabilities of the iray rendering engine, but probably had a negative impact on the actual service demo—a bit less detail and a lot more speed would've made the rendering demonstration more impressive. Ideally, visitors to a company's website would be able to choose their own trade-off between speed and quality for a given program—NVIDIA didn't address whether this is the case, or whether the device-agnostic nature of the platform means it defaults to a one-size-fits-all setting.
Setting aside questions of commercial viability and corporate/consumer interest, there's a question as to whether or not the United States' broadband providers can handle the load without service degradation. This is most obvious when considering the current state of cellular data networks—AT&T's is already buckling under the iPhone's strain—but the hypothetical widespread use of RealityServer could raise questions over the real-world capabilities of ADSL and cable connections.
We bet AT&T just loves this demo.
In theory, wired broadband connections are more than capable of handling the additional load of such services, but service quality and reliability in the real world depend on an enormous number of variables, including what geographical area you live in, which side of town you're on, the condition of the local phone lines, and whether or not the ISP has adequately provisioned its network for peak usage times, just to name a few. What works on a whiteboard, in this case, may have no relation to what works in real life.
NVIDIA fielded a question on this topic during the Q&A session and insisted that RealityServer applications will have a bandwidth footprint equal to or less than that of a YouTube video stream, with the company erring on the side of "less." Assuming this is true, the new services should have little to no effect on current-generation networks.
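The YouTube comparison invites some back-of-the-envelope math: a bandwidth budget divided by an update rate tells you how much compressed image data each screen update may carry. The 500 kbit/s figure below for a standard-definition YouTube stream is our assumption for illustration, not a number NVIDIA supplied:

```python
# Back-of-the-envelope check on the bandwidth claim. The SD-stream
# bitrate is an assumed figure, not NVIDIA's.
YOUTUBE_KBPS = 500  # assumed standard-definition stream bitrate, kbit/s

def bytes_per_update(updates_per_second: float,
                     budget_kbps: float = YOUTUBE_KBPS) -> float:
    """Compressed bytes each screen update may carry while staying
    inside the stated bandwidth budget."""
    return budget_kbps * 1000 / 8 / updates_per_second

# At the roughly one-update-per-10-seconds pace seen in the demo,
# each update could be a generous 625 KB:
print(round(bytes_per_update(0.1)))  # 625000 bytes per update
# At a 30 fps "interactive gaming experience," the budget collapses:
print(round(bytes_per_update(30)))   # 2083 bytes per update
```

The arithmetic cuts both ways: at demo-like update rates the claim is easy to believe, but delivering anything close to interactive frame rates inside the same budget would demand aggressive video-style compression rather than per-frame stills.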
There's no doubt that RealityServer could be the sort of technological innovation that defines future generations of multimedia-rich devices—a so-called Web 3.0, though I loathe the term. There's a definite appeal to the idea of moving 3D rendering off the client, particularly in mobile devices, where this feature would extend battery life. Extend the concept a bit, and we might one day see high-quality games take advantage of offsite 3D rendering capabilities while simultaneously utilizing the system's or phone's GPU for immediate visual updates. Again, this could extend rich multimedia capabilities to devices that either couldn't otherwise handle them or could do so only at a significant cost in battery life.
Whether or not any of this happens, of course, depends on whether or not NVIDIA's RealityServer 3 grabs the eyes of potential customers. The capabilities are interesting, but RS3 is aiming at a market that doesn't exist yet. Whether or not NVIDIA can create that market and remain a dominant player within it remains to be seen.