NVIDIA's RealityServer Aims to Deliver Real-Time Rendering On Your Next Netbook—or iPhone


News Posted: Tue, Oct 20 2009 6:28 PM

It's hard to pin down exactly what cloud computing is—the term is as vaporous as its proverbial namesake—but to date, the majority of cloud computing applications have emphasized storage, group collaboration, or the ability to share significant amounts of information with specific groups of people. In business IT, the concept of renting server power from IBM or Sun could be seen as a type of cloud computing. So far, there's been no push to make GPU power available in a cloud computing environment—but that's something NVIDIA hopes to change. The company announced version 3.0 of its RealityServer today; the new revision sports hardware-level 3D acceleration and a new rendering engine (iray), and can create "images of photorealistic scenes at rates approaching an interactive gaming experience," according to the company.

Here's the eye-opener: NVIDIA claims that the combination of RealityServer and its Tesla hardware can deliver those photorealistic scenes on your workstation or your cell phone, with no difference in speed or quality. Instead of relying on a client PC to handle the task of 3D rendering, NVIDIA wants to move the capability into the cloud, where the task of rendering an image or scene is handed off to a specialized Tesla server. Said server performs the necessary calculations and fires back the finished product, which theoretically allows for the aforementioned frame rates that approach an "interactive gaming experience."
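To make that division of labor concrete, here's a rough sketch of what the round trip might look like from the client's side. The endpoint, payload format, and function names are hypothetical stand-ins of my own, not NVIDIA's actual RealityServer interface:

# Minimal sketch of a "render in the cloud" round trip, assuming a hypothetical
# HTTP endpoint (not NVIDIA's published API). The client only ships lightweight
# scene state (camera position, target); the heavy ray tracing happens
# server-side, and a finished JPEG frame comes back.
import json
import urllib.request

RENDER_URL = "http://render.example.com/render"  # hypothetical Tesla-backed service

def request_frame(camera_pos, camera_target, width=1280, height=720):
    """Send camera state, receive a fully rendered frame as JPEG bytes."""
    payload = json.dumps({
        "camera": {"position": camera_pos, "target": camera_target},
        "resolution": [width, height],
    }).encode("utf-8")
    req = urllib.request.Request(
        RENDER_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()  # finished image; nothing was rendered locally

# A netbook or phone would simply decode and display these bytes:
# frame = request_frame([0.0, 1.5, 4.0], [0.0, 1.0, 0.0])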

NVIDIA has designed a series of rackable Tesla servers, as shown above. The categories above are meant to represent various size possibilities—based on NVIDIA's comments, one could host a RealityServer 3.0 project on a single GPU and could possibly scale up past 100 GPUs. A key part of the new software architecture is Mental Images' new iray rendering engine, which has been designed to take advantage of GPU acceleration. Iray was built to exploit the tremendous parallel processing capabilities of a modern GPU—according to the FAQ (PDF), iray is an "interactive, consistent, high-performance global illumination rendering technology that generates photorealistic imagery by simulating the physical behavior of light...iray does not depend on complex renderer-specific shaders and settings to approximate global illumination. iray generates photorealistic imagery without introducing rendering algorithm-specific artifacts."

If the system works as advertised, businesses of all sorts could create interactive 3D web applications that update in real time and offer far more detail than a netbook or smartphone could possibly render at all, much less render in real time. Since only the final product is transferred to the client, there's no need for the user to worry about battery life, heat generation, or turning their system into a tortoise in exchange for using one particular application.

Cloudy, With a Chance of Meatballs Adoption

NVIDIA demonstrated RealityServer 3 at its web event, with relatively mixed results. There's no doubt that the Tesla server off in the atmosphere shot back gorgeous visuals, scenes that would have taken weeks to render on the small netbook the company used for its demonstration, but the speed of the updates didn't come remotely close to "approaching an interactive gaming experience," unless said experience involved attempting to run Doom on your 16MHz 386 with the screen size set at maximum. Update times varied from 10 to 20 seconds, and that's a significant lag when discussing online usage patterns.

To be fair, NVIDIA's backend Tesla server was built with just 16 GPUs, but said server presumably had just one client. Photorealistic levels of detail were a fine way to display the capabilities of the iray rendering engine, but probably had a negative impact on the actual service demo—a bit less detail and a lot more speed would've made the rendering demonstration more impressive. Ideally, visitors to a company's website would be able to select their own trade-off between speed and quality for a given program—NVIDIA didn't address whether this is the case, or whether the device-agnostic nature of the platform means it defaults to a one-size-fits-all approach.
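For illustration, a quality dial of that sort could be as simple as a handful of presets keyed to what the user is doing. The parameters below are hypothetical; NVIDIA hasn't published anything resembling them:

# Hypothetical speed/quality presets for a cloud-rendered web app. While the
# user is dragging the camera, ask for small, noisy frames; once input stops,
# let the server converge on a full-quality render.
QUALITY_PRESETS = {
    "interactive": {"resolution": (480, 270),   "samples_per_pixel": 8},
    "preview":     {"resolution": (960, 540),   "samples_per_pixel": 64},
    "final":       {"resolution": (1920, 1080), "samples_per_pixel": 1024},
}

def choose_preset(user_is_interacting, seconds_idle):
    if user_is_interacting:
        return QUALITY_PRESETS["interactive"]   # favor latency over fidelity
    if seconds_idle < 2.0:
        return QUALITY_PRESETS["preview"]
    return QUALITY_PRESETS["final"]             # nobody's moving; take your time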

Bandwidth Burn
Setting aside questions of commercial viability and corporate/consumer interest, there's a question as to whether the United States' broadband providers can handle the load without service degradation. This is most obvious when considering the current state of cellular data networks—AT&T's is already buckling under the iPhone's strain—but the hypothetical widespread use of RealityServer could raise questions over the real-world capabilities of ADSL and cable connections.


We bet AT&T just loves this demo.

In theory, wired broadband connections are more than capable of handling the additional load of such services, but service quality and reliability in the real world are dependent on an enormous number of variables, including what geographic area you live in, which side of town you're on, the condition of the local phone lines, and whether or not the ISP has adequately provisioned its network for peak usage times, just to name a few. What works on a whiteboard, in this case, may have no relation to what works in real life, depending on the variables mentioned above.

NVIDIA fielded a question on this topic during the Q&A session, and insisted that RealityServer applications will have a bandwidth footprint equal to or less than that of a YouTube video stream, erring on the side of "less." Assuming this is true, the new services should have little to no effect on current-generation networks.
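A quick back-of-envelope check makes the claim plausible, at least at demo-like update rates. The numbers below are my own assumptions, not figures NVIDIA provided:

# Rough bandwidth estimate for streaming server-rendered frames, using assumed
# values for frame size and update rate rather than anything NVIDIA published.
frame_kb = 150            # assumed size of one compressed 720p JPEG frame, in kilobytes
updates_per_second = 0.5  # roughly the demo's pace: one new frame every ~2 seconds

stream_kbps = frame_kb * 8 * updates_per_second   # kilobits per second
youtube_kbps = 800        # ballpark for a 2009-era standard-definition stream

print(f"RealityServer-style stream: ~{stream_kbps:.0f} kbit/s")
print(f"Typical SD video stream:    ~{youtube_kbps} kbit/s")
# At demo-like update rates the claim holds easily; it gets harder if the
# service ever reaches genuinely game-like frame rates.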

There's no doubt that RealityServer could be the sort of technological innovation that defines future generations of multimedia-rich devices—a so-called Web 3.0, although I loathe the term. There's a definite appeal to the idea of moving 3D rendering off the client, particularly in mobile devices, where doing so would extend battery life. Extend the concept a bit, and we might one day see high-quality games take advantage of offsite 3D rendering capabilities while simultaneously utilizing the system or phone's GPU for immediate visual updates. Again, this could extend rich multimedia capabilities to devices that either couldn't otherwise handle them or could only do so at a significant cost to battery life.
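In rough terms, such a hybrid viewer would draw a cheap local approximation immediately and swap in the photorealistic server frame whenever one arrives. The sketch below is purely speculative; no such mode exists in RealityServer 3 as announced, and every name in it is a placeholder:

# Speculative sketch of hybrid rendering: the device's own GPU provides a fast,
# low-detail stand-in, and the cloud-rendered frame replaces it when it arrives.
class HybridViewport:
    def __init__(self):
        self.latest_server_frame = None   # bytes of the last cloud-rendered image

    def on_server_frame(self, frame_bytes):
        """Called whenever a finished frame arrives from the render service."""
        self.latest_server_frame = frame_bytes

    def frame_to_display(self, camera_state):
        """Prefer the high-quality cloud frame; fall back to a quick local render."""
        if self.latest_server_frame is not None:
            return self.latest_server_frame
        return self.render_locally(camera_state)

    def render_locally(self, camera_state):
        """Stand-in for a cheap, low-detail render on the device's own GPU."""
        return ("low-detail frame for %s" % (camera_state,)).encode("utf-8")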

Whether any of this happens, of course, depends on whether NVIDIA's RealityServer 3 catches the eye of potential customers. The capabilities are interesting, but RS3 is aiming at a market that doesn't exist yet. Whether NVIDIA can create that market and remain a dominant player within it remains to be seen.

3vi1 replied on Wed, Oct 21 2009 8:25 AM

In addition to the bandwidth and poor fps, that sounds like a high server load to service one user. It doesn't seem practical for mass consumption at this point.

What part of "Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn" don't you understand?

++++++++++++[>++++>+++++++++>+++>+<<<<-]>+++.>++++++++++.-------------.+++.>---.>--.

gibbersome replied on Wed, Oct 21 2009 12:45 PM

Yes, but it's an intriguing idea. Imagine when our internet infrastructure becomes as fast as Korea's or Japan's: we'd be able to play games on our PCs or laptops without needing a decent video card.

Don't have a GTX 295? No problem, just connect to one of Nvidia's game servers and enjoy the show.

Of course, the lag time wouldn't make high-end gaming practical.

An unranked member (joined Oct 2009) replied:

You could use an open source project like http://cloudi.org with a GPU library (e.g., http://www.culatools.com/) to develop efficient computational solutions on a cloud (so you don't necessarily need Nvidia's software).

realneil replied on Wed, Oct 21 2009 6:25 PM

So maybe someday it will translate into something usable for the mainstream guy.

All it is, is an idea at this point, without any real legs.

I have faith in progress though.

Dogs are great judges of character, and if your dog doesn't like somebody being around, you shouldn't trust them.

Joel H replied on Wed, Oct 21 2009 9:31 PM

One correction. NVIDIA designed software for Tesla servers, and it has tools in place to help cloud computing providers who are interested in offering such services, but NV itself will *not* be hosting any of these applications.

acarzt replied on Fri, Oct 23 2009 8:05 PM

If it could work properly... it would be sweet to have that setup in your home and stream games, videos, music, images, etc. to cheap "hubs," basically. It would be like having one central, incredibly powerful computer... oh wait... it already exists... VMware... :-P

It's like they are trying to push the same technology in different ways under different names.

Seriously, it is a cool concept... if it actually worked.
