Some reading for you:
http://www.100fps.com/how_many_frame...humans_see.htm
Nice read :-)
Something very important about movies and television is that they use the magic of motion blur between frames, to put it simply, which prevents them from looking 'choppy'. Video games simply are not capable of that.Quote:
Originally Posted by 'Zub'
E-Peeeeeeen
Yep, not everyone registers at the same rate... those new taillights drive me nuts when driving at night; they flicker very quickly to me.Quote:
Originally Posted by 'Nisch'
Ahh, the old "how many FPS can the human eye see" debate. I remember fighting this argument about 6 years ago. After a lot of research on the issue, it comes down to this.
Games and FPS. Anything over the refresh rate of your monitor is wasted, and if your frame rate is not a factor of the refresh rate, then, if you're perceptive enough, you're likely to see flicker, tearing, and such. Thus the 240 Hz / 120 Hz / 60 Hz HDTV debates in regard to source-media ratios... 1:1, 3:2 pulldown, blah blah blah.
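A quick back-of-the-envelope sketch in Python (my own illustration, not from the linked article) of why a frame rate that isn't a factor of the refresh rate looks uneven: count how many refreshes each source frame stays on screen.

    # when the source rate doesn't divide the refresh rate evenly, some frames
    # are held for more refreshes than others, which reads as judder
    def refreshes_per_frame(source_fps, refresh_hz, n_frames=10):
        starts = [int(f * refresh_hz / source_fps) for f in range(n_frames + 1)]
        return [b - a for a, b in zip(starts, starts[1:])]

    print(refreshes_per_frame(30, 60))   # [2, 2, 2, ...]     even cadence
    print(refreshes_per_frame(24, 60))   # [2, 3, 2, 3, ...]  uneven: 3:2 pulldown judder
    print(refreshes_per_frame(24, 120))  # [5, 5, 5, ...]     even again, hence 120 Hz sets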
TV, Movies, and Motion Blur. That whole "shake your hand in front of your eyes and it breaks up into 9 fingers" thing is full of crap; that's all a trick of the crappy lighting wherever you are. Go outside on a sunny day and do it again if you have doubts. This gets even goofier when you play with motion blur. With the right blurring in place, a screen running at 10 fps will look just as good as 60 fps. Motion blur is a trick of the eye, but it's a camera trick as well. If you watched a movie at 24 fps with no captured motion blur, it would look like shuddering, choppy crap. Even video games put in motion-blur-type effects to make things look smoother (a rough sketch of the idea follows below). There are already many links explaining how motion blur works in regard to detail, how the eyes work, etc.
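To make the "captured motion blur" point concrete, here is a toy sketch (mine, not from any engine or the linked article): averaging several sub-frames into one output frame is roughly what a film camera's open shutter does.

    # pure numpy toy: a bright one-pixel dot sweeping across a 20-pixel strip
    import numpy as np

    def blurred_frame(render, t0, t1, subframes=8):
        # average `subframes` renders taken between times t0 and t1
        times = np.linspace(t0, t1, subframes)
        return np.mean([render(t) for t in times], axis=0)

    def render(t):
        img = np.zeros(20)
        img[int(t * 19) % 20] = 1.0
        return img

    sharp = render(0.5)                        # crisp dot: strobes at low fps
    smeared = blurred_frame(render, 0.4, 0.6)  # dot smeared along its path: reads as smooth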
Real Life vs. Computer Specs. You can't compare apples to MacBooks equally. Take a moment and do the following. Don't move your head at all, just your eyes. Look at something close, look at something farther away, then up, down, left, and right. Look at the detail of the things you see. The texture of things. If there's something moving, watch it for a while. Now, what resolution and FPS is real life playing at? Light is not entering your eye at 24 pulses/frames per second, and you're not looking through eyes that have 1024x768 lines of resolution. Real life streams to your eye at an FPS/resolution that no computer can reproduce.
Conclusion. It's all tricks.
There is a notable difference for me between, say, 30 and 100 fps, and it's that 100 gives a much smoother look, specifically when turning fast. Seriously, try it: put it on 30 and turn reaaaal fast, then put it much higher and turn reaaaaaal fast. The eye sees more than you can imagine!
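To put rough numbers on the fast-turn case (my arithmetic, not the poster's): a 180-degree turn over half a second jumps much farther between consecutive frames at 30 fps than at 100.

    for fps in (30, 100):
        frames_in_turn = fps * 0.5
        print(fps, "fps ->", 180 / frames_in_turn, "degrees per frame")
    # 30 fps -> 12.0 degrees per frame
    # 100 fps -> 3.6 degrees per frame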
That said, though, I multibox with maxfps at 20, and it's pretty solid on 20; it never really drops, and it's fine too. :p
OK, now I have to agree: with the new card and the improved FPS, WoW looks awesome. I wish I had had a great card sooner.
Well, to the OP, not sure if this is exactly what you were asking, but:
FPS is typically measured during a relatively static time in the game, when not a lot is happening on screen (barring the lagfest known as Dalaran). If you are sitting at 100 FPS at that point, then when you get to a place where the load on your graphics card is much higher, your FPS will drop. You have a lot farther to fall from a 100 FPS baseline than from a 35 FPS baseline before things start looking very choppy, so to me it is about the ability to handle temporarily increased load at certain times (hello, Wintergrasp).
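The same headroom argument in frame-time terms (illustrative numbers only, my own sketch): a busy scene that adds a fixed 10 ms of work per frame hurts far more when the baseline frame time is already long.

    spike_ms = 10.0
    for baseline_fps in (100, 35):
        frame_ms = 1000.0 / baseline_fps
        loaded_fps = 1000.0 / (frame_ms + spike_ms)
        print(f"{baseline_fps} fps baseline -> {loaded_fps:.0f} fps under load")
    # 100 fps baseline -> 50 fps under load  (still smooth)
    # 35 fps baseline -> 26 fps under load   (visibly choppy)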
for any that may care...
24 fps - movies
30 fps - standard-def TV
60 fps - HDTV
120 fps - super HDTV
:!: pay attention to the bit below
as for your game screens... you want to balance your fps with your cpu/gpu load. if your system is doing 40 fps but the cpu is at 100%, then you're maxed out, and if anything happens that requires more cpu, you're going to lose fps (you should target an average of < 80%... that way, if anything comes up, your system can handle it). the best fix is to lower the fps, or better still... lower the number of pixels that need to be rendered.
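here's a rough sketch of that rule of thumb in python (hypothetical numbers and a made-up helper... keyclone exposes no such api, this is just the arithmetic):

    # assume cpu cost is roughly proportional to frames rendered
    def suggested_fps_cap(current_fps, cpu_load, target_load=0.80):
        return int(current_fps * target_load / cpu_load)

    print(suggested_fps_cap(40, 1.00))  # 32 -> maxed out at 40 fps; cap near 32
    print(suggested_fps_cap(60, 0.50))  # 96 -> plenty of headroom already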
as such... if you are using keyclone's maximizer, you can tweak this fairly easily. the in-game resolution for each region is just below the PiP hotkey field. it defaults to 800x600, but you can type anything into it as long as you keep the format. this is where you'll pick up rendering savings... and here is how.
imagine your main area is 1600x1200 and you have 4 alt screens, each at 800x600, with every region's in-game resolution set to 1600x1200. wow renders each region at the in-game resolution you specified, so in this scenario you are rendering 9,600,000 pixels if each screen renders 1 frame. you get some savings by adjusting maxfpsbk and maxfps... but we can go further still.
the pc can do a 200% zoom in hardware, and it's very quick. we'll take advantage of this.
now set every region's in-game resolution to 800x600. this results in your wows rendering 2,400,000 pixels per frame... 25% of the original number. that is a RADICAL reduction in computational requirements and should lower both cpu and gpu load.
to adjust your existing setup, look at the dimensions of your main area... divide the width and height each by 2 (ie: 1000x800 -> in-game res: 500x400). do this for all regions. this will not only improve performance, but also ensure that the aspect ratio of each area stays correct.
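the arithmetic above, redone in a few lines of python (my own sketch... nothing keyclone-specific, just the numbers):

    # one 1600x1200 main region plus four 800x600 alts = 5 regions
    regions = 5

    all_full = regions * 1600 * 1200   # every in-game res at 1600x1200
    all_half = regions * 800 * 600     # every in-game res at 800x600
    print(all_full, all_half, all_half / all_full)   # 9600000 2400000 0.25

    def in_game_res(width, height):
        # halve each dimension: keeps the aspect ratio, quarters the pixels
        return width // 2, height // 2

    print(in_game_res(1000, 800))      # (500, 400)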
i hope that helps.
Rob