This has little to do with programmers. DX9 already supports the tech, and the programmers don't have to do anything. The *only* thing that has to be done is for the "camera point" to be split into two, offset a few inches apart (roughly eye separation) - this is all done by enabling 3D in the API, not by programmers writing new code. Then two frames are rendered from these two cameras: the "left" and "right" eyes. The shutter glasses are sync'd to the monitor, which alternates between the two cameras on each refresh. That's why a 120Hz monitor gives 60 fps per eye with shutter glasses.
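Here's a conceptual sketch of what "splitting the camera point" means - this is not real driver or D3D9 code, and all the names (Camera, splitCamera, renderScene, the separation value) are just illustrative. The stereo driver does the equivalent of this transparently, offsetting the game's one camera along its side axis and rendering each frame twice:

[code]
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 add(Vec3 a, Vec3 b)    { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
Vec3 scale(Vec3 v, float s) { return { v.x * s, v.y * s, v.z * s }; }

struct Camera {
    Vec3 position;   // the game's single "camera point"
    Vec3 right;      // unit vector pointing to the camera's right
};

// Derive the two eye positions. 'separation' is the virtual interocular
// distance in world units (tunable in real stereo drivers).
void splitCamera(const Camera& center, float separation,
                 Vec3& leftEye, Vec3& rightEye) {
    Vec3 half = scale(center.right, separation * 0.5f);
    leftEye  = add(center.position, scale(half, -1.0f));
    rightEye = add(center.position, half);
}

// Stand-in for the game's normal render path, called once per eye.
void renderScene(const Vec3& eye) {
    std::printf("render from eye (%.2f, %.2f, %.2f)\n", eye.x, eye.y, eye.z);
}

int main() {
    Camera cam = { { 0.0f, 1.7f, 0.0f }, { 1.0f, 0.0f, 0.0f } };
    Vec3 left, right;
    splitCamera(cam, 0.065f, left, right);  // ~6.5 cm in meters (assumed units)

    // Frame-sequential presentation: on a 120Hz monitor the left and right
    // views are shown on alternate refreshes, so each eye sees 60 fps. The
    // shutter glasses' emitter is sync'd to this flip.
    for (int refresh = 0; refresh < 4; ++refresh) {
        renderScene((refresh % 2 == 0) ? left : right);
    }
}
[/code]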
We're writing programs that are 3D-enabled at work.. nothing "new" needs to be done on our side to fully enjoy this technology. About the only thing we need to do is avoid sprite-based graphics (the billboarding you mentioned), which is outdated anyway. It helps to have everything rendered in 3D - GUI, HUD, etc. We're using particle systems and shader-based fire/explosion effects, and they look awesome in 3D. A sprite-based HUD doesn't look bad, though - it looks like a picture frame hung 3 feet from your face, and the 3D effects happen behind it, through it, and in front of it.
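To make the "picture frame" idea concrete, here's a rough sketch (assumed names and units, not code from our project): instead of compositing the HUD as a 2D overlay at screen depth, you anchor a quad a fixed distance along the camera's forward axis and draw it through the normal 3D pipeline, so both eyes see it at a consistent depth:

[code]
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 add(Vec3 a, Vec3 b)    { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
Vec3 scale(Vec3 v, float s) { return { v.x * s, v.y * s, v.z * s }; }

// Anchor the HUD quad a fixed distance along the camera's forward axis.
// At ~3 feet (about 0.9 m, assuming meters for world units) it reads as a
// picture frame: scene geometry can pop out in front of it or sit behind it.
Vec3 hudAnchor(Vec3 camPos, Vec3 camForward, float hudDistance) {
    return add(camPos, scale(camForward, hudDistance));
}

int main() {
    Vec3 cam = { 0.0f, 1.7f, 0.0f };
    Vec3 fwd = { 0.0f, 0.0f, 1.0f };   // unit forward vector
    Vec3 hud = hudAnchor(cam, fwd, 0.9f);
    std::printf("HUD quad centered at (%.2f, %.2f, %.2f)\n",
                hud.x, hud.y, hud.z);
    // The quad would then be oriented to face the camera and rendered with
    // the rest of the scene, so it gets correct stereo parallax.
}
[/code]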
Also, LCDs don't "dim" like CRTs do.. the Hz on an LCD is how many times per second the screen refreshes, but there's no dimming after each refresh. That's why 60Hz LCDs look clean and clear with no flickering compared to 60Hz CRTs. Using multiple monitors with this technology would be VERY difficult, because the shutter glasses can only be sync'd to one monitor, and getting the monitors sync'd to each other is close to impossible. Until monitors have an option to sync with one another, I don't see multi-monitor 3D being supported any time soon.