This one is different in that it seems to use a single screen for both eyes. That has the upside of allowing simultaneous rendering of both eyes, meaning no left/right flicker. Refresh is still an issue: Carmack was saying that most of the allotted 20 milliseconds needed to keep the effect convincing gets used up just scanning out the frame (and that doesn't account for render time, the time necessary to poll input devices like head trackers, or the time wasted by the OS). I've actually hit issues like that in my electronics projects, where data sits in a buffer for several milliseconds waiting to be used, causing serious lag in the various control systems. He said something about 120 Hz being better for something like this, which would halve the scanout time from roughly 16 ms to roughly 8 ms. It's fun to watch John break down how much time everything takes.
Rendering looks like it draws both eyes into the same framebuffer. Looks like you could **** with the projection matrix to render one side, then set it up again and render the other side, maybe using some stencil buffer magic there (I'm going to play with my game engine and see if it's possible). I'm not sure if this is something that will fly with deferred lighting and such, but idk. Carmack mentions that besides doing the split-screen thing, he had to use a pixel shader to handle the warp correction necessary to really sell the effect. That video was a good watch; you learn **** about VR that you never knew.