Maximum FPS
Quote from Honey :no but, the retina works like a low-pass filter and the frames are the source digital signal; for those who know the Shannon theorem, this is similar... In a few words: images persist on the retina for a few moments, and if another image arrives before the previous one has "disappeared" from the retina, then the eye sees them continuously, and the closer together the two images come, the smoother it looks.
Science says that the human eye has its threshold at 50Hz (50 fps), but also says that this is not a crisp limit: the more fps you get, the more stable and fluid things look. I'd say that above 100fps the difference becomes unnoticeable, but having 100fps makes it smoother... That's the same reason why running CRT monitors at 85Hz or above is good for eye health...

Yup, you're right... But there's more to it than that...
At the back of the eyeball there are a lot of nerve cells that pick up light.
Some of these (in the center, directly behind the pupil) are very good at picking up color: the cones. Others only detect light/dark (black and white, if you will): the rods. These black/white receptors are very fast at what they do and can change at a rate of about 60 times/sec. The color ones in the middle don't do much better than 30 times/sec. Since the two are not completely in sync, your b/w and color receptors aren't constantly seeing the same image, so the faster you can switch the actual image, the more fluid it seems.

For television there's a so-called 25fps system in place (or 29.97fps in NTSC areas). In fact you're shown 50 frames per second (or 60 in NTSC areas, since the mains power there is 60Hz). 25 is enough for the color receptors to see fairly fluid movement, but the b/w ones would see screen flips, so TV is interlaced: it shows you one half of the image first and then the other half, so that the perceived refresh rate is actually 50. In cinemas you're being fooled as well... 25fps movies (Europe) are actually shown at 50fps (each frame is shown twice, slightly offset, which is why there is always a hint of blurring).
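To make the interlacing idea concrete, here's a rough Python sketch (hypothetical frame data, just to show how one 25fps frame becomes two 50Hz fields):

    # Split one full frame into two interlaced fields (PAL-style).
    # 'frame' is a list of scanlines: even lines make the first field,
    # odd lines the second, so a 25fps source becomes 50 fields/sec.
    def split_into_fields(frame):
        even_field = frame[0::2]  # lines 0, 2, 4, ...
        odd_field = frame[1::2]   # lines 1, 3, 5, ...
        return even_field, odd_field

    frame = ["line %d" % i for i in range(576)]  # 576 visible PAL lines
    f1, f2 = split_into_fields(frame)
    print(len(f1), len(f2))  # 288 288 -> two half-height fields per frame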

To test this you can set your screen to a 60Hz refresh rate and open Word with a fully white page. Stare at the center of the screen and you'll notice that it doesn't flicker there. Now, while still staring at the center, pay attention to the sides of the screen, and you'll notice that they flicker. Up to a refresh rate of about 75Hz you can notice some flickering when you turn your eyes away from the screen and have it at the edge of your vision... Also notice that when the screen is at the edge of your vision it becomes impossible to discern any color, and the image turns to b/w.
Yep, you're right!
That's quite a complete description...

Let's just say that we are not paid by graphics card manufacturers to justify their expensive cards...
Even though your eye might only see at 60fps, and your monitor might only refresh at that speed too, the point of high fps is to give you a buffer for high-intensity sections. For example, when I play COD2 and someone throws a smoke grenade, my fps dips from 50 to about 15... whereas my mate, who runs at a seemingly pointless 150, only sees it drop to 120ish. I find it playable at about 30ish (same for LFS), so 15 is a bit annoying! lol

Basically your maximum fps isn't really what's important, as long as it's playable... it's your minimum, because if that is unplayable then the maximum isn't really all that helpful.

EDIT: lol Honey so true :P
@Tagforce
except some TVs can run at 100Hz

I always prefer to have high FPS because it means that when there is "traffic" on screen it doesn't dip really low. I spent quite a long time with a really low-spec PC, and it was a pain to get decent times when the FPS dropped to something like 15 in online racing. If the FPS drops below 60 in fast-moving games I immediately notice it, and it's really annoying because the picture no longer looks smooth.
Quote from Honey :no but, the retina works like a low-pass filter and the frames are the source digital signal; for those who know the Shannon theorem, this is similar... In a few words: images persist on the retina for a few moments, and if another image arrives before the previous one has "disappeared" from the retina, then the eye sees them continuously, and the closer together the two images come, the smoother it looks.
Science says that the human eye has its threshold at 50Hz (50 fps), but also says that this is not a crisp limit: the more fps you get, the more stable and fluid things look. I'd say that above 100fps the difference becomes unnoticeable, but having 100fps makes it smoother... That's the same reason why running CRT monitors at 85Hz or above is good for eye health...

I know nothing about this subject except what I've read. And what I've read is that recent research has shown it makes a difference what kind of frames you're seeing per second. The human eye can see a noticeable difference on a CRT between 60 and 120fps (and some people can even detect differences up to 200fps) because of the way a CRT works. The article I read on this was pretty in-depth, but it basically pointed out that 60fps is a minimum for the average human, and that some can detect over 200fps. So more is still better. Btw, I average a steady 100fps no matter what happens on the track etc., or whether I'm encoding DVDs or not.
Quote from bw_krupp : I average a steady 100fps no matter what happens on the track etc., or whether I'm encoding DVDs or not.

Also at the start of the race, being last on the grid?? I doubt it.
Quote from DEVIL 007 :@Tagforce
except some TVs can run at 100Hz

And you still only get 50fps... They just refresh the screen twice as often.
I have no idea what they do in the US, but I'd imagine they'd be 120Hz there?
(Unlike a movie theatre, where a central projector beams the entire image at once, a CRT lights only 3 points at any one time, so to keep the image visible there must be a high refresh rate. 100Hz is just a means of keeping a steadier image; it has nothing to do with framerates.)
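In other words, a 100Hz set just repeats each of the 50 incoming fields. A tiny Python sketch of that idea (assuming the video is simply a list of fields):

    # A "100Hz" TV doesn't create new pictures: it repeats each of the
    # 50 incoming fields, so the screen refreshes 100 times a second.
    def double_refresh(fields_50hz):
        doubled = []
        for field in fields_50hz:
            doubled.append(field)  # first display
            doubled.append(field)  # same field again, no new picture data
        return doubled

    fields = ["field %d" % i for i in range(50)]  # one second of 50Hz video
    print(len(double_refresh(fields)))            # 100 refreshes, still 50fps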

Quote from bw_krupp :I know nothing about this subject except what I've read. And what I've read is that recent research has shown it makes a difference what kind of frames you're seeing per second. The human eye can see a noticeable difference on a CRT between 60 and 120fps (and some people can even detect differences up to 200fps) because of the way a CRT works. The article I read on this was pretty in-depth, but it basically pointed out that 60fps is a minimum for the average human, and that some can detect over 200fps. So more is still better. Btw, I average a steady 100fps no matter what happens on the track etc., or whether I'm encoding DVDs or not.

Yes, for the exact reason I just said. Also, did you know that people who can spot flickering at higher refresh rates are also more likely to be colorblind? But there is a difference between refresh rates and frames per second. If someone tells you they can spot the difference between 100 and 200fps on a 100Hz screen, they are lying. The screen only displays 100 frames per second, so there is absolutely no difference at all between a steady 100fps and a 200fps output (every other frame the computer draws never appears on screen at all). The only difference they should be able to spot is tearing, which would make them want to go to 100fps because the images look better that way.
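To put rough numbers on that, here's a little Python sketch (made-up timings): a 100Hz screen sampling a 200fps card only ever shows about half the frames.

    # Made-up timings: a card finishing 200 frames/sec, a 100Hz screen
    # always scanning out whatever frame is newest in the buffer.
    render_fps, refresh_hz = 200.0, 100.0

    rendered = [i / render_fps for i in range(200)]   # frame finish times
    displayed = set()
    for r in range(100):                              # one second of refreshes
        t = r / refresh_hz
        newest = max(i for i, ft in enumerate(rendered) if ft <= t)
        displayed.add(newest)

    print(len(rendered), len(displayed))  # 200 rendered, only 100 ever shown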
@tag
So if I am correct, your eye only sees what the refresh rate of the monitor is (as long as your FPS is over the refresh rate)?

So... if a game says you have 100fps, and your refresh rate is 60Hz for example, you only see 60fps?
...basically?

Also, if that is so, would running a higher or lower refresh rate put any more or less stress on your computer (since it has to 'update' the screen at whatever rate), or is that just way out of line?
Quote from XCNuse :@tag
So if I am correct, your eye only sees what the refresh rate of the monitor is (as long as your FPS is over the refresh rate)?

So... if a game says you have 100fps, and your refresh rate is 60Hz for example, you only see 60fps?
...basically?

Also, if that is so, would running a higher or lower refresh rate put any more or less stress on your computer (since it has to 'update' the screen at whatever rate), or is that just way out of line?

Your monitor does not put any strain on the computer. It just displays whatever data is in the screen buffer of the video card, at whatever refresh rate it is set to. If your video card supports a resolution at a certain refresh rate, it simply means the card is fast enough to provide those pixels at that speed (actually, to copy a screenbuffer's worth of data from its working memory into the screen buffer and out to the monitor). That is the reason the maximum refresh rate goes down the higher the resolution you want to display: your card may support 1280x1024x32bpp at 100Hz, but only go to 85Hz at 2048x1024x32bpp, because it simply cannot provide 2048*1024*4 bytes 100 times per second.
(The same goes for monitors, btw: they need to be able to read that amount of data, which they sometimes simply can't... You can force a monitor to try, though; usually you'll end up with a messed-up image (or a message saying "sync out of range"), but in rare cases you'll find you can overclock your monitor and it does work. Mine supports 2048x1536@60Hz even though its specs say 1600x1200@60Hz.)
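The arithmetic behind that limit, as a quick Python sketch (using the example resolutions from above):

    # Bytes the card must push to the monitor per second:
    # width * height * bytes per pixel * refresh rate.
    def scanout_bandwidth(width, height, bytes_per_pixel, refresh_hz):
        return width * height * bytes_per_pixel * refresh_hz

    mb = 1024 * 1024
    print(scanout_bandwidth(1280, 1024, 4, 100) / mb)  # 500 MB/s: fine at 100Hz
    print(scanout_bandwidth(2048, 1024, 4, 100) / mb)  # 800 MB/s: too much, hence 85Hz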

Basically you are correct... However (there's always a 'but')...
What your screen shows is actually a buffer on your video card. Whatever data is in there is what is shown on the screen. The monitor reads this buffer from top-left to bottom-right, line by line. If you have, say, 120fps, that means 120 images per second are pumped into the buffer. If your monitor has a 60Hz refresh rate, the buffer gets overwritten halfway through each scan. Your screen doesn't care about that, and happily reads the new values and displays them. So what you get is the top half of the first image, and the lower half is the new image. This causes "tearing".
Effectively your screen still only gets updated (visually) 60 times per second, so you still see 60fps, even though your video card is pumping out twice that (each of those 60 frames is two half-images stuck together, split about halfway down the screen).
Since the video card is what ultimately tells the game how many images it pumps out, that's the fps you see in-game (120 in this case).
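You can even work out roughly where the tear sits. A sketch with made-up numbers (120fps into a 60Hz scanout of 1024 lines): the second frame arrives halfway down the screen.

    # Made-up numbers: where does the tear land? The scanout walks down
    # the screen at a fixed rate; a new frame arriving mid-scan splits
    # the picture at whatever line the beam has reached.
    lines, refresh_hz, render_fps = 1024, 60.0, 120.0

    scan_time_per_line = 1.0 / (refresh_hz * lines)  # time to draw one line
    frame_interval = 1.0 / render_fps                # new frame every 1/120s

    # The second frame of each refresh arrives one render interval into the scan:
    tear_line = frame_interval / scan_time_per_line
    print(tear_line)  # 512.0 -> the tear sits halfway down the screen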

To solve the tearing, there is the option of "waiting for retrace". This tells the video card that it is only allowed to overwrite the screen buffer when the cathode ray cannon (which actually blasts your pixels onto the screen) reports that it has finished drawing a screen and is moving back to the top-left corner to start the next image. That move takes a short amount of time, and no pixels are displayed during the "retrace", so it is safe to dump the buffer then. Now your video card will pump out a maximum fps equal to or lower than the refresh rate of the screen, so your in-game fps will sync with your monitor's refresh rate (at 60, 70, 75, 100, etc.).
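As a rough sketch of what "wait for retrace" amounts to in a render loop (the function names are hypothetical, not a real API; a timer stands in for the real vblank signal):

    import time

    REFRESH_HZ = 60.0
    start = time.time()

    def wait_for_retrace():
        # Sleep until the next 1/60s boundary -- a timer standing in for
        # the real vblank signal the card gets from the monitor.
        now = time.time() - start
        next_vblank = (int(now * REFRESH_HZ) + 1) / REFRESH_HZ
        time.sleep(next_vblank - now)

    def render_loop(frames):
        for f in range(frames):
            # ... draw the next frame into the back buffer here ...
            wait_for_retrace()  # only swap buffers during retrace: no tearing

    render_loop(3)  # with v-sync on, fps can never exceed REFRESH_HZ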

Am I clear enough, or would you like a more detailed explanation?

EDIT:
Just to add that even though my example is static, not a single game actually produces those exact values on a 60Hz monitor. In any game the fps fluctuates, and turning off v-sync (wait for retrace) ensures that the screen buffer always holds the most recent image (even if it's only partially displayed). That is the reason some people notice stuttering when they turn v-sync on: suddenly there can be a (very, very small) delay between the data the game provides and the image you see. Moreover, this delay constantly changes (at one point the new image is dumped right at a retrace; the next one may have to wait almost 1/60th of a second), which may well cause the feeling of irregular framerates. The fps is in fact a constant 60, but the in-game time that passes between any two frames can differ. Without v-sync you may have poorer image quality, but the image data behaves more like a stream and seems more fluent.
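You can see that varying delay with a few made-up frame times: each frame is held back to the next vblank, and the added wait is different every time.

    # Hypothetical frame finish times: with v-sync each frame is held
    # back to the next 1/60s boundary, so the added delay keeps changing.
    REFRESH = 1.0 / 60.0
    finish_times = [0.001, 0.018, 0.035, 0.049]

    for t in finish_times:
        shown_at = (int(t / REFRESH) + 1) * REFRESH
        print("finished %.3fs, shown %.3fs, waited %4.1fms"
              % (t, shown_at, (shown_at - t) * 1000))
    # waits swing between ~1ms and ~16ms even though fps stays at 60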
No that's... awesome, thanks for sharing that
Quite welcome...
I was thinking of adding a couple of "frames" as images to show what it would look like in reality, but after re-reading my post I think it's pretty clear.
If anybody wants a visual idea, just let me know, I'll post them.
Quote from TagForce :To solve the tearing, there is the option of "waiting for retrace". This tells the video card that it is only allowed to overwrite the screen buffer when the cathode ray cannon (which actually blasts your pixels onto the screen) reports that it has finished drawing a screen and is moving back to the top-left corner to start the next image.

So, if my monitor has a refresh rate of 75Hz, then instead of using vsync I can just cap the framerate below 75? From what I understand, at 30FPS the video card writes a frame to the framebuffer every 1/30th of a second, and my monitor reads the image from the framebuffer every 1/75th of a second, so I might see one frame twice on my monitor. But if my framerate is above 75, then without vsync the monitor would keep reading from the framebuffer wherever it was on the last frame, because the monitor does not know that a new frame has arrived.
Quote from wheel4hummer :So, if my monitor has a refresh rate of 75Hz, then instead of using vsync I can just cap the framerate below 75? From what I understand, at 30FPS the video card writes a frame to the framebuffer every 1/30th of a second, and my monitor reads the image from the framebuffer every 1/75th of a second, so I might see one frame twice on my monitor. But if my framerate is above 75, then without vsync the monitor would keep reading from the framebuffer wherever it was on the last frame, because the monitor does not know that a new frame has arrived.

Depending on how LFS handles a max FPS setting... Yes...
But I doubt LFS would sync to your monitor's retrace just because you set the max FPS to your refresh rate... So in essence, other than creating a consistent tearing point on your screen at maximum FPS, I don't see much use for an fps limiter, monitor-wise.
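For what it's worth, an fps limiter is typically just a sleep in the render loop; nothing ties it to the monitor's retrace, which is the point above. A rough Python sketch (hypothetical loop, not how LFS actually does it):

    import time

    def limited_loop(max_fps, frames):
        # Cap the framerate by sleeping out the rest of each frame's time
        # budget. Nothing here is aligned to the monitor's retrace, so
        # tearing can still happen -- it just happens at a steadier spot.
        budget = 1.0 / max_fps
        for f in range(frames):
            frame_start = time.time()
            # ... render the frame here ...
            elapsed = time.time() - frame_start
            if elapsed < budget:
                time.sleep(budget - elapsed)

    limited_loop(74, 10)  # cap just under a 75Hz refresh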
Piece of shit Dell...
I run 65-95, is that nice?