Intel's Daniel Pohl is once again making news with his ray-traced versions of id Software classics. This time, it's a ray-traced version of Wolfenstein, running on a four-server cluster and using a version of the company's erstwhile discrete GPU. The framerate is playable and the visuals are nice, but this little science fair project isn't really about the games.
After repeated delays forced Intel to cancel the commercial launch of its x86-based discrete GPU, codenamed Larrabee, the company announced a plan to bring the same part to market as an HPC testbed. The chip got a new codename, Knights Ferry, and a vastly diminished set of graphical expectations.
Sure, the ray-tracing demo is impressive for what it is, but the video that Tech Report posted is a bit like that video of an elephant painting a picture of an elephant: it's amazing that this one thing can do this other thing at all, but really, you'd still rather have a painting done by a human. In other words, the overall scene quality doesn't stack up to what you can get out of a high-end GPU today, despite the fact that a high-end GPU can't ray trace in real time. But then again, out-GPUing the GPU isn't the point of this demo.
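For a sense of why real-time ray tracing is such a heavy lift, consider the core operation the hardware has to repeat for every pixel: testing whether a ray fired from the camera hits a piece of scene geometry. The Python below is a minimal, illustrative sketch of a single ray-sphere intersection test, not anything from Intel's or Pohl's actual renderer.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    which reduces to a quadratic a*t^2 + b*t + c = 0.
    """
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer of the two roots
    return t if t > 0 else None

# A ray down the z-axis hits a unit sphere centered at z=5
# at distance 4 (the sphere's near surface).
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

Even this toy test runs once per pixel per object per bounce: a modest 1280x720 frame means roughly a million primary rays, before any reflections or shadows, which is why the demo needs a four-server cluster to stay playable.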
The point of the demo is to show that multiple Knights Ferry parts can be used in a cluster to solve a really hard math problem extremely quickly. Everyone who knows how hard this problem is—i.e., Intel's potential customers in high-performance computing—will know what they're looking at when they see this thing go. Everyone else will just see a game.
Source: Ars Technica