
Probably the most eagerly anticipated game in years is upon us. Four years in development, Doom 3 is the latest game from Mesquite-based id Software, and one that id says is the best it has ever made. While the reception to its gameplay is mixed (as is always the case with such a subjective topic), most agree that it is the best-looking game ever released on the PC platform.

Some of the following information has already been provided by this interviewer in Beyond3D's forums, scattered as it is across the individual boards devoted to discussing technologies and games. The hope is that this "interview" will serve as a single source of information on the matters this interviewer raised with John. Let's not waste any more time.

Timedemo doesn't do any game logic.
Demos are always recorded at exactly 30Hz.
Okay, so timedemo only tests graphics rendering and ignores AI and physics. Even on a system with a high-end CPU, I have found that timedemo is still very CPU-dependent. In demos with many monsters and/or large, complex monsters, I assume this CPU dependency is a result of CPU skinning (since AI and physics are ignored in timedemo). Correct?
CPU skinning, shadow generation, and tangent space reconstruction are the parts of Doom that take up significant time in timedemos, but there is a lot of driver overhead as well.
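To put that answer in concrete terms: skinning on the CPU means that, every frame, each vertex of every animated mesh is rebuilt by blending its joint-relative weights through the current joint transforms (and the tangent space then has to be recomputed from the result). The following is a minimal linear-blend skinning sketch in C; the structures and names are hypothetical illustrations, not id's code.

    /* Hypothetical linear-blend skinning sketch -- not id's actual code. */
    typedef struct { float m[12]; } JointMat;    /* 3x4 joint transform */
    typedef struct { float pos[3]; int joint; float w; } VertWeight;

    /* Transform a joint-relative position by a 3x4 matrix. */
    static void transform_point(const JointMat *j, const float in[3], float out[3]) {
        out[0] = j->m[0]*in[0] + j->m[1]*in[1] + j->m[2] *in[2] + j->m[3];
        out[1] = j->m[4]*in[0] + j->m[5]*in[1] + j->m[6] *in[2] + j->m[7];
        out[2] = j->m[8]*in[0] + j->m[9]*in[1] + j->m[10]*in[2] + j->m[11];
    }

    /* Runs every frame for every vertex of every visible animated mesh. */
    void skin_vertices(const JointMat *joints, const VertWeight *weights,
                       const int *firstWeight, const int *numWeights,
                       float (*outPos)[3], int numVerts) {
        for (int v = 0; v < numVerts; v++) {
            float acc[3] = { 0.0f, 0.0f, 0.0f };
            for (int w = 0; w < numWeights[v]; w++) {
                const VertWeight *vw = &weights[firstWeight[v] + w];
                float p[3];
                transform_point(&joints[vw->joint], vw->pos, p);
                acc[0] += p[0] * vw->w;
                acc[1] += p[1] * vw->w;
                acc[2] += p[2] * vw->w;
            }
            outPos[v][0] = acc[0]; outPos[v][1] = acc[1]; outPos[v][2] = acc[2];
        }
    }

With thousands of vertices per monster and several monsters on screen, this inner loop (plus shadow-volume generation and tangent reconstruction over the same data) is why timedemo remains CPU-bound even though AI and physics are skipped.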
"r_skipRenderContext 1" will unbind the gl context, so all driver calls become null functions. You can have some timings activated, then enable this, then disable it so things draw again, and compare to see how much time is spent in the driver, assuming the scene is not hardware limited.
What's the situation with regard to the depth bounds test implementation in the game? It doesn't appear to have any effect on an NV35 or NV40 when toggled via the cvar.
Nvidia claims some improvement, but it might require unreleased drivers. It's not a big deal one way or another.
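For reference, the depth bounds test is exposed through the EXT_depth_bounds_test OpenGL extension: when enabled, fragments whose stored depth falls outside a given [zmin, zmax] window are rejected before the stencil update, letting a stencil-shadow renderer skip shadow-volume fill at depths the light cannot affect. A minimal usage sketch follows; the pointer setup and depth values are illustrative:

    /* Minimal EXT_depth_bounds_test sketch; the [zmin, zmax] range would be
       computed per light from its extents in window-space depth. */
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* Fetched at startup via wglGetProcAddress / glXGetProcAddress. */
    extern PFNGLDEPTHBOUNDSEXTPROC qglDepthBoundsEXT;

    void draw_shadow_volume_with_depth_bounds(double zmin, double zmax) {
        glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
        qglDepthBoundsEXT(zmin, zmax);

        /* ... render the shadow volume, updating stencil ... */

        glDisable(GL_DEPTH_BOUNDS_TEST_EXT);
    }

The saving is pure stencil fill rate, which fits Carmack's assessment: it helps only where shadow volumes are the bottleneck, and only on hardware and drivers that expose the extension.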