Supercomputing Power Hits the Desktop, Minus the Software
The PC industry's two largest graphics companies released new top-of-the-line models this week. The new graphics processors will not only bring better videogame performance but also turn ordinary desktop PCs into the equivalent of supercomputers -- if programmers can figure out how to take advantage of the chips' massively parallel architectures.
'We're talking about every man, woman and child basically having a supercomputer on their desk,' says Jon Peddie, a graphics-industry veteran and president of Jon Peddie Research.
AMD, which acquired graphics maker ATI in 2006, released two new chips, the Radeon HD 4850 and the Radeon HD 4870. Nvidia, the other dominant player in the space, unveiled its new GeForce GTX 260 and GeForce GTX 280 processors.
According to both companies, the new chips feature performance measured in teraflops (a teraflop is a trillion floating-point operations per second), billions of transistors, hundreds of cores and new architectures that, industry analysts say, could have a staggering effect not only on Crysis frame rates, but also on how and what we use our computers for.
Indeed, cheap access to such formidable computing power could mean that, over the next few years, we will see an explosion of new independent research along with profound new discoveries, analysts say. Additionally, new consumer applications will be able to draw on the graphics processing unit (GPU) for even more eye-watering special effects and, occasionally, for genuinely useful visual information.
'We'll start to get things like real-time mapping from Google that incorporates all manner of real world information,' says Bob O'Donnell, an analyst at IDC. 'All of this is going to bubble up more and more.'
As Peddie observes, it was only 11 years ago that the U.S. government spent approximately $33 million to build ASCI Red, one of the first supercomputers to achieve 1 teraflop. The new graphics chips offer similar power to the 1997-era supercomputer for a fraction of the cost.
'Now we can go down to Fry's or Best Buy and buy a graphics board that has 1 teraflop of processing power for $600 or less,' says Peddie.
Getting that processing power to work for the average computer user, however, remains a challenge.
With the exception of a few games, most applications still aren't made to take advantage of the GPU's power. That's because GPUs are made for parallel processing (crunching lots of bits of data at the same time, then assembling the results all at once), whereas most current software programs are written to be executed serially (operating on one piece of data at a time, then proceeding to the next step).
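To make the difference concrete, consider a hypothetical task: scaling every element of a large array. (The function and variable names below are illustrative, not taken from any shipping product.) A serial C program walks the data one element at a time; the equivalent GPU kernel, written here in OpenCL C -- the kind of language discussed below -- is a tiny function that the hardware applies to every element simultaneously.

    /* Serial version: plain C on the CPU, one element per loop step. */
    void scale_serial(float *data, int n, float factor) {
        for (int i = 0; i < n; i++)
            data[i] *= factor;    /* step i finishes before step i+1 begins */
    }

    /* Parallel version: an OpenCL C kernel. The runtime launches one
       copy of this function per array element; thousands of copies run
       at once across the GPU's cores, so there is no loop at all. */
    __kernel void scale_parallel(__global float *data, float factor) {
        int i = get_global_id(0);    /* "which element am I?" */
        data[i] *= factor;
    }

Rewriting a loop this way is trivial; the hard part is restructuring whole applications so their work divides into thousands of independent pieces, which is why most software still leaves the GPU idle.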
That is starting to change, albeit slowly, thanks to new initiatives designed to spur parallel processing.
Just last week, Khronos, the industry consortium behind the OpenGL standard, announced what it calls Open Computing Language, or OpenCL. With this new heterogeneous computing initiative, the group hopes to come up with a standardized (and universal) way of programming parallel computing tasks.
In many ways, it's the Holy Grail developers have been waiting for: a hardware-agnostic standard that unleashes the power of multi-core CPUs and GPUs using a familiar language.
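The specification was still being drafted as this article went to press, so any code is necessarily speculative, but the API Khronos ultimately ratified as OpenCL 1.0 (finalized in December 2008) gives a feel for the programming model: the host program discovers whatever compute device is present, compiles the kernel source on the fly, and queues the work. Below is a minimal sketch in C that runs the scale kernel from the earlier example; the kernel and variable names are made up, and error handling is omitted for brevity.

    #include <stdio.h>
    #include <CL/cl.h>    /* OpenCL headers; link with -lOpenCL */

    /* The kernel, stored as a string and compiled at runtime for
       whatever device the host machine happens to have. */
    static const char *src =
        "__kernel void scale(__global float *d, float f) {"
        "    d[get_global_id(0)] *= f;"
        "}";

    int main(void) {
        float data[1024];
        for (int i = 0; i < 1024; i++) data[i] = (float)i;

        /* 1. Find a platform and a device -- GPU, CPU, whatever exists. */
        cl_platform_id plat;
        cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

        /* 2. Create a context and a command queue on that device. */
        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* 3. Compile the kernel from source, on the fly. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "scale", NULL);

        /* 4. Ship the data to the device, set the kernel arguments,
           and launch one work-item per array element. */
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                    sizeof(data), data, NULL);
        float factor = 2.0f;
        clSetKernelArg(k, 0, sizeof(buf), &buf);
        clSetKernelArg(k, 1, sizeof(factor), &factor);
        size_t global = 1024;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);

        /* 5. Read the results back and spot-check one value. */
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);
        printf("data[10] = %f\n", data[10]);    /* expect 20.0 */
        return 0;
    }

The design's whole point is in step 1: the same host program runs unchanged whether the device underneath is an AMD GPU, an Nvidia GPU or a multi-core CPU -- exactly the hardware-agnosticism developers have been waiting for.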
Apple is throwing its weight behind parallel processing too, and last week committed to using the OpenCL specification as part of its next operating system release, Snow Leopard.
Other companies -- including AMD, Nvidia, ARM, Freescale, IBM, Imagination, Nokia, Motorola, Qualcomm, Samsung and Texas Instruments -- have joined the OpenCL working group.
If initiatives like OpenCL gain momentum, the days of researchers applying for grants and traveling across the country to use a given university or research facility's supercomputer may well be at an end. Similarly, distributed computing projects like Folding@home and SETI@home may see a huge boost in performance from hundreds of thousands of computers equipped with these powerful new processors.
Of course, if curing cancer or looking for aliens isn't your thing, we can also be fairly certain that Crysis will really scream on any system equipped with these new GPUs.
"