
Ray tracing for PCs: a bad idea whose time has come

Intel is promoting CPU-based ray tracing as an alternative to traditional polygon-order rendering on PCs. Glaskowsky explains why this is a bad idea.

Peter Glaskowsky
Peter N. Glaskowsky is a computer architect in Silicon Valley and a technology analyst for the Envisioneering Group. He has designed chip- and board-level products in the defense and computer industries, managed design teams, and served as editor in chief of the industry newsletter "Microprocessor Report." He is a member of the CNET Blog Network and is not an employee of CNET.

Dean Takahashi sent me an e-mail pointing to a piece he wrote on VentureBeat describing statements Wednesday by Intel Chief Technology Officer Justin Rattner targeted at Nvidia. CNET's own Brooke Crothers covered the same story and provides additional background here.

Intel Chief Technology Officer Justin R. Rattner (Photo: Intel)

The technology at issue relates to 3D graphics for PCs. All current PC graphics chips use what's called polygon-order rendering. All of the polygons that make up the objects to be displayed are processed one at a time. The graphics chip figures out where each polygon should appear on the screen and how much of it will be visible or obstructed by other polygons.
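To make that concrete, here's a minimal software sketch of polygon-order rendering (my own illustration of the approach, not how any particular graphics chip is actually wired). The outer loop walks the triangle list; a depth buffer decides which triangle is visible at each pixel it covers:

    #include <algorithm>
    #include <cmath>
    #include <limits>
    #include <vector>

    struct Vert { float x, y, z; };       // vertex already projected to screen space
    struct Tri  { Vert v[3]; unsigned color; };

    // Edge function: sign tells which side of edge a->b the point (px, py) is on.
    static float edge(const Vert& a, const Vert& b, float px, float py) {
        return (px - a.x) * (b.y - a.y) - (py - a.y) * (b.x - a.x);
    }

    // Polygon-order rendering: process one triangle at a time; a z-buffer
    // records the nearest surface seen so far at each pixel.
    void render(const std::vector<Tri>& tris, std::vector<unsigned>& frame,
                std::vector<float>& zbuf, int W, int H) {
        std::fill(zbuf.begin(), zbuf.end(), std::numeric_limits<float>::max());
        for (const Tri& t : tris) {                          // outer loop: polygons
            const Vert &a = t.v[0], &b = t.v[1], &c = t.v[2];
            float area = edge(a, b, c.x, c.y);               // "triangle setup"...
            if (area <= 0) continue;   // degenerate, or back-facing under the assumed winding
            int x0 = std::max(0,     (int)std::floor(std::min({a.x, b.x, c.x})));
            int x1 = std::min(W - 1, (int)std::ceil (std::max({a.x, b.x, c.x})));
            int y0 = std::max(0,     (int)std::floor(std::min({a.y, b.y, c.y})));
            int y1 = std::min(H - 1, (int)std::ceil (std::max({a.y, b.y, c.y})));
            for (int y = y0; y <= y1; ++y)                   // ...then visit only the
                for (int x = x0; x <= x1; ++x) {             // pixels it might cover
                    float w0 = edge(b, c, x + 0.5f, y + 0.5f);
                    float w1 = edge(c, a, x + 0.5f, y + 0.5f);
                    float w2 = edge(a, b, x + 0.5f, y + 0.5f);
                    if (w0 < 0 || w1 < 0 || w2 < 0) continue;  // outside the triangle
                    float z = (w0 * a.z + w1 * b.z + w2 * c.z) / area;
                    if (z < zbuf[y * W + x]) {               // nearest surface wins
                        zbuf[y * W + x] = z;
                        frame[y * W + x] = t.color;
                    }
                }
        }
    }

Note how cheap the per-triangle work is: a little setup arithmetic, then a simple test and compare at each covered pixel.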

Ray tracing achieves similar results by working through each pixel on the screen, firing off a "ray" (like a backward ray of light) that bounces off the polygons until it reaches a light source in the scene. Ray tracing produces natural lighting effects but takes a lot more work.

(That's the short version, anyway. For more details, you could dig up a copy of my 1997 book Beyond Conventional 3D. Alas, the book is long since out of print.)
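Here's that short version in code, again my own illustrative sketch rather than any shipping renderer. The outer loop now walks the screen, and every ray has to search the entire polygon list for its nearest hit; a real tracer would then recurse from that hit toward the lights, and would use an acceleration structure instead of this brute-force scan:

    #include <cmath>
    #include <limits>
    #include <vector>

    struct V3 { float x, y, z; };
    static V3    sub(V3 a, V3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static V3    cross(V3 a, V3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z,
                                             a.x*b.y - a.y*b.x}; }
    static float dot(V3 a, V3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

    struct Tri { V3 a, b, c; unsigned color; };

    // Moller-Trumbore ray/triangle test: distance to the hit, or -1 for a miss.
    static float intersect(V3 orig, V3 dir, const Tri& t) {
        V3 e1 = sub(t.b, t.a), e2 = sub(t.c, t.a), p = cross(dir, e2);
        float det = dot(e1, p);
        if (std::fabs(det) < 1e-8f) return -1;       // ray parallel to triangle
        V3 s = sub(orig, t.a);
        float u = dot(s, p) / det;
        if (u < 0 || u > 1) return -1;
        V3 q = cross(s, e1);
        float v = dot(dir, q) / det;
        if (v < 0 || u + v > 1) return -1;
        float d = dot(e2, q) / det;
        return d > 0 ? d : -1;
    }

    // Ray tracing: the outer loop is over pixels, and each pixel's ray
    // must search the scene for the nearest polygon it hits.
    void trace(const std::vector<Tri>& tris, std::vector<unsigned>& frame,
               int W, int H) {
        V3 eye = {0, 0, 0};
        for (int y = 0; y < H; ++y)
            for (int x = 0; x < W; ++x) {            // outer loop: pixels
                // One primary ray through the pixel center (pinhole camera).
                V3 dir = {(x + 0.5f) / W - 0.5f, (y + 0.5f) / H - 0.5f, 1.0f};
                float nearest = std::numeric_limits<float>::max();
                unsigned color = 0;                  // background
                for (const Tri& t : tris) {          // the search: every polygon
                    float d = intersect(eye, dir, t);
                    if (d > 0 && d < nearest) { nearest = d; color = t.color; }
                }
                frame[y * W + x] = color;
                // A real tracer would now fire secondary rays from the hit point
                // toward the lights (shadows, reflections); this sketch stops here.
            }
    }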

Ray tracing is easily implemented in software on a general-purpose CPU, and indeed, most of the computer graphics you see in movies and TV commercials are generated this way, using rooms full of PCs or blade-server systems.

Naturally, Intel loves ray tracing, and there are people at Intel working to make ray tracing work better on Intel hardware.

The occasion for Rattner's remarks Wednesday was a meeting for industry analysts at the Computer History Museum. At the meeting, according to Takahashi, Intel showed how a four-chip, 16-core demo system could play "Enemy Territory: Quake Wars" at 16 frames per second.

Honestly, that's pretty pathetic: you can get higher frame rates from a dual-core CPU plus one good graphics chip, at a tenth the price and power consumption of the Intel demo system.

Rattner implied that Nvidia must actually agree with Intel that ray tracing is a good idea: the company recently bought ray-tracing firm RayScale, and, he says, it is trying to hire away Intel's ray-tracing people.

Takahashi compared this conflict with the "Phoney War" of 1939-1940 and said the real fighting will begin when Intel introduces Larrabee, a CPU-based graphics chip, at SIGGRAPH in August.

But I don't think there's going to be much of a fight there.

Intel is trying to defend a crazy idea: that CPU-based ray tracing is a practical alternative to GPU-based polygon-order rendering.

We can guess why Intel decided to push this alternative: it's a CPU company, and its people are CPU-centric. But the numbers don't work out; ray tracing takes more work than polygon-order rendering. Going from pixels to polygons requires searching (tracing rays), whereas going from polygons to pixels merely requires a relatively simple set of calculations known as "triangle setup."
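A back-of-envelope comparison makes the asymmetry visible. Every number below is an assumption I've picked for illustration (resolution, triangle count, overdraw, ray count), not a benchmark, but the shape of the result survives any reasonable substitutions:

    #include <cmath>
    #include <cstdio>

    // Back-of-envelope cost comparison. All numbers are illustrative assumptions.
    int main() {
        const double pixels = 1280.0 * 720;   // one 720p frame
        const double tris   = 500e3;          // triangles in view
        const double fps    = 60;

        // Polygon-order: one setup per triangle, then one cheap
        // test-and-compare per covered pixel (assume ~4x overdraw).
        double raster = (tris + 4 * pixels) * fps;

        // Ray tracing: each pixel's ray searches an acceleration structure,
        // roughly log2(tris) node visits plus a few triangle tests, and even
        // basic lighting needs at least a primary and a shadow ray per pixel.
        double rays   = 2;
        double traced = pixels * (std::log2(tris) + 4) * rays * fps;

        std::printf("raster: %4.0f M work units/s\n", raster / 1e6);
        std::printf("traced: %4.0f M work units/s\n", traced / 1e6);
        return 0;
    }

On these assumptions, the ray tracer does roughly 10 times as many work units per second, and each of its units (a tree-node visit or ray-triangle test) costs more than a z-buffer compare, which only widens the gap.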

Ray tracing's advantages for lighting effects are pretty minor; current graphics chips can be programmed to get good results there too, with less work.
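Hard shadows, a classic ray-tracing selling point, are a good example. Today's raster hardware fakes them with a shadow map: render depth once from the light's point of view, then replace every per-pixel shadow ray with a single depth comparison. A sketch of the idea (the function and parameter names are mine, and real engines write this in HLSL or GLSL rather than C++):

    #include <algorithm>

    struct V3 { float x, y, z; };
    static float dot(V3 a, V3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // Per-pixel lighting with a shadow map: one depth comparison stands in
    // for the shadow ray a ray tracer would fire into the scene.
    float shadePixel(float distToLight,      // this pixel's distance to the light
                     float shadowMapDepth,   // depth the light "saw" along that path
                     V3 normal, V3 lightDir) {
        bool lit = distToLight <= shadowMapDepth + 0.001f;  // small bias vs. acne
        float diffuse = std::max(0.0f, dot(normal, lightDir));
        return lit ? diffuse : 0.0f;   // hard shadow; filtering would soften it
    }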

I imagine Intel noticed that ray tracing could be another way to use the many cores in Larrabee, and figured it could be the basis of some competitive differentiation. But what should have been a minor point in a future marketing campaign has grown into an overblown strategic initiative.

On the hardware side, Larrabee isn't even optimized for ray tracing. On the software side, there's no support for ray tracing in Microsoft's Direct3D middleware, and no way any version of Direct3D in the foreseeable future will rely on ray tracing.

Larrabee will certainly support ray tracing (every CPU does), and some future version of Direct3D may support ray tracing as an option, but it could be 10 years or more before ray tracing becomes a required feature for any real-world software.

And to whatever extent ray tracing can be useful, Nvidia can write efficient ray-tracing code for its GPUs faster than Intel can tape out more capable versions of Larrabee. Nvidia is looking for ways to use ray tracing for lighting and other purposes, but this effort is minor compared to the work it's putting into polygon-order rendering.

Rattner is very smart--too smart not to know the situation. I think he's just doing his job, supporting his company's position whether he fully agrees with it or not.

And once Intel starts selling Larrabee, it's only going to get a day or two to talk about ray tracing before the focus will turn, properly, to Larrabee's performance on the technology that matters: good old polygon-order rendering. And at that point, I don't think Intel's going to have much to say.