The online racing simulator
LFS Benchmark Poll - Max Resolution
(89 posts, closed, started )

Poll: What is the maximum resolution your monitor can display?

1680*1050 - 106
1280*1024 - 96
More/Not listed - 90
1440*900 - 37
1024*768 - 24
1280*800 - 11
1280*768 - 5
1280*960 - 5
1280*720 - 2
1380*768 - 2
1152*864 - 1
1360*1024 - 1
1440x900 @ 60Hz
1280x800 @ 60Hz
Toshiba laptop
1440x900. Not interested in a benchmark though, I have constant 60fps (vsync power :nana:)
2560x1600 here
Quote from DeadWolfBones :Yeah, they just have tearing. :P

CRTs have tearing too. Tearing has nothing to do with the monitor type.
Running on 1680x1050.
Quote from geeman1 :CRTs have tearing too. Tearing has nothing to do with the monitor type.

it's more noticeable on an LCD though
Quote from Shotglass :it's more noticeable on an LCD though

Looks the same to me. Anyway, just turn on vertical sync and you won't see any tearing at all, no matter what monitor you have.
it is... lower refresh rate, and images that have more time to burn themselves into your retina
and great, so the solution is delayed frames
Quote from Shotglass :it is... lower refresh rate, and images that have more time to burn themselves into your retina

True. It shows up more easily because the fps only has to be higher than 60, but it looks just as ugly.
Quote :and great so the solution is delayed frames

I'd much rather have a small delay than a distorted image. It's not like it's that big anyway. Maybe if you play CS or Quake for money it makes a difference.
Quote from geeman1 :True. It shows up more easily because the fps only has to be higher than 60, but it looks just as ugly.

it's also that the image is on screen for longer: the pixels are on for a whole 1/60 of a second instead of a quick flash of phosphor lighting up
also I really don't mind that much

Quote :I'd much rather have a small delay than a distorted image. It's not like it's that big anyway. Maybe if you play CS or Quake for money it makes a difference.

add the frame delay caused by the crappy LCDs themselves and the 3-frame prerendering that at least Nvidias do by default, and you're rapidly approaching 100ms+, which, if you're as thick as me when it comes to noticing oversteer, is an issue
Quote from Shotglass :add the frame delay caused by the crappy LCDs themselves and the 3-frame prerendering that at least Nvidias do by default, and you're rapidly approaching 100ms+, which, if you're as thick as me when it comes to noticing oversteer, is an issue

Are you sure you got that right? The prerender setting in nvidia drivers doesn't mean that the card renders frames with a 3-frame delay. From what googling revealed, it does some kind of framerate smoothing with extra buffers or something like that. At least it's not anything silly like just adding a useless delay. There might be a delay, but 100ms+ is just you being silly.
The input lag of crappy LCDs is a real problem though. Fortunately the solution is easy: just don't buy crappy LCDs.
any buffer adds delays
and the only TFTs without input delay are gaming TNs, which are even more rubbish than the ones with delay
so you'll have a triple buffer for the vsync, which adds a 2-frame delay, a TFT which does another 2, and a prerendering buffer of length 3, which will give you roughly 100ms
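Shotglass's tally can be sanity-checked with quick arithmetic, assuming a 60Hz display and taking his per-stage frame counts at face value (they are his claims in this thread, not measurements):

```python
# Rough latency tally under Shotglass's assumptions (claimed figures from the
# post above, not measurements): each stage is costed in whole 60 Hz frames.
REFRESH_HZ = 60
FRAME_MS = 1000 / REFRESH_HZ  # ~16.7 ms per frame

stages = {
    "vsync triple buffer": 2,  # frames, as claimed
    "LCD processing":      2,
    "driver prerender":    3,
}

total_frames = sum(stages.values())
total_ms = total_frames * FRAME_MS
print(f"{total_frames} frames ~ {total_ms:.0f} ms")  # 7 frames ~ 117 ms
```

Seven frames at 60Hz is about 117ms, which is where the "roughly 100ms" figure comes from if you accept the per-stage counts.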
TBH I don't care about any delay. I play with vsync and triple buffer and every game looks superb. And yeah, I have tried a CRT before.
1920*1080 here on my main monitor (19" LCD). I only use it at 1280*720 though, t'is a wee bit small for that huge resolution... 60Hz @ all resolutions except 1920*1080, where it goes down to 30Hz interlaced. Just for the FPS war going on, it runs at 1280*720 locked to 100FPS and only dips with large grids.

Secondary monitor (17" CRT) running 1152*864 @ 100Hz, but goes up to 1280*1024 @ 75Hz.

I think the benchmark should stay at 1024*768 though, as everybody can run that res to compare. I presume that using maths one can work out what it would run at at a higher resolution? Or does it not scale proportionately?
Quote from Shotglass :
add the frame delay caused by the crappy LCDs themselves and the 3-frame prerendering that at least Nvidias do by default, and you're rapidly approaching 100ms+, which, if you're as thick as me when it comes to noticing oversteer, is an issue

+
the standard 125Hz @ 8ms USB lag (if you use a USB wheel) on top of the 5ms LCD latency, frame buffering, frame drops hurting the CPU's response time, and vsync's ultra 'uck' lag, 40 or 50ms can probably be added on top of your ping time.
Yet, the USB latency can be tweaked down to 500Hz/2ms and 1000Hz/1ms with polling scripts.

Thank god the devs aren't from SimBin. Talk about device to screen latency.

12x10/75Hz
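The USB polling figures quoted above follow directly from interval = 1000 / rate; the rates are from the post, the rest is just arithmetic:

```python
# Worst-case added input latency of USB polling is one polling interval.
# Rates are the ones mentioned in the post above (125/500/1000 Hz).
intervals = {rate: 1000 / rate for rate in (125, 500, 1000)}
for rate, ms in intervals.items():
    print(f"{rate} Hz polling -> up to {ms:g} ms added latency")
# 125 Hz -> 8 ms, 500 Hz -> 2 ms, 1000 Hz -> 1 ms
```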
#67 - vari
Quote from dougie-lampkin :
I think the benchmark should stay at 1024*768 though, as everybody can run that res to compare. I presume that using maths one can work out what it would run at at a higher resolution? Or does it not go proportionately?

It doesn't. There's a point at which the GPU becomes the bottleneck. It seems to depend on:

-The number of frames the CPU can throw at it
-The card itself
-Resolution
-AA/AF

With the current (max) benchmark settings the CPU is always the bottleneck, meaning an old video card seems to be just as fast as a new one, which gives a false impression.

With the new bench I planned, my avg fps went from 130 to 50-ish. This was with a full GTR grid (20) on SO Long Rev, 1280*960 and 4x/16x. Without AA/AF I gained 10fps, so at least my old card (850XT) had some work to do this time.

Why 4x you might ask...Well some Nvidias seem to jump from 4x to 8x and older ATIs do 6x max.

Thanks to everyone for voting
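vari's bottleneck point can be sketched as a toy model: effective fps is the minimum of a CPU limit (roughly independent of resolution) and a GPU limit (which shrinks as resolution grows). All numbers below are made up for illustration, not benchmark data:

```python
# Toy bottleneck model (hypothetical figures, not benchmark results):
# effective fps = min(CPU-limited fps, GPU-limited fps), where the GPU
# limit scales inversely with the number of pixels per frame.
def effective_fps(cpu_fps, gpu_fill_mpix_s, width, height):
    mpix_per_frame = width * height / 1e6   # megapixels per frame
    gpu_fps = gpu_fill_mpix_s / mpix_per_frame
    return min(cpu_fps, gpu_fps)

low_res = effective_fps(130, 200, 1024, 768)    # CPU is the bottleneck
high_res = effective_fps(130, 200, 1600, 1200)  # GPU takes over
print(low_res, high_res)
```

At the low resolution the made-up CPU cap (130fps) wins, so a faster card changes nothing; at the high resolution the GPU limit drops below it, which is why a harder benchmark separates cards.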
Quote from Shotglass :any buffer adds delays

No it doesn't. Buffers are generally used to provide smoothness, not delays.
Quote :so you'll have a triple buffer for the vsync, which adds a 2-frame delay

Triple buffering with vsync never adds a two-frame delay. Without vsync, double buffering is used: there are two surfaces the image can be on, the screen where the end result is shown and one backbuffer where the image is rendered. When the image is ready in the backbuffer, it is transferred to the screen. The image is not rendered straight onto the screen because that would generally look ugly and would be more difficult to code too.

With vsync, the backbuffer waits for the monitor and transfers the image to the screen in sync with the screen refreshes. If you used only double buffering with vsync, the GPU would have to wait for the monitor sync before it could start rendering the next frame, which would cause a high delay. That is why triple buffering was invented: it adds another backbuffer so that the GPU can start rendering the next frame in one backbuffer while the other is waiting for the screen sync. When the monitor syncs, the newest ready frame is transferred to the screen.

So the longest delay triple buffering + vsync can cause is the time between monitor syncs (for example, on a 60Hz monitor the max delay is 17ms). Without vsync there would of course be no delay, but then the screen would show half of one frame and half of another (which is called tearing). So you can't see more frames than the monitor's refresh rate allows, only parts of them.
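geeman1's worst-case claim (the displayed frame is at most one refresh interval old) can be illustrated with a tiny simulation. This is a sketch under simplified assumptions (constant render time, a free back buffer always available), not a model of any real driver:

```python
# Sketch of triple buffering + vsync: at each 60 Hz refresh, the newest
# finished frame is flipped to the screen. Simplified: constant render time,
# the GPU never stalls because a free back buffer always exists.
REFRESH = 1 / 60   # seconds between vsyncs (60 Hz display)
RENDER = 1 / 90    # GPU finishes a frame every ~11.1 ms

# Times at which the GPU finishes successive frames.
finish_times = [(i + 1) * RENDER for i in range(30)]

max_age = 0.0
for k in range(1, 15):
    vsync = k * REFRESH
    ready = [t for t in finish_times if t <= vsync]
    newest = ready[-1]                     # newest finished frame is flipped
    max_age = max(max_age, vsync - newest)

# The frame on screen is never older than one refresh interval (~16.7 ms).
assert max_age <= REFRESH + 1e-9
print(f"worst-case frame age: {max_age * 1000:.1f} ms")
```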
Quote :and the only tfts without input delay are gaming tns which are even more rubbish than the ones with delay

The input lag comes from the monitor's electronics, not from the panel type. On some monitors the electronics do tricks to improve contrast or reduce ghosting or something like that, which of course takes time and in turn causes the image to lag. So it doesn't really depend on what the panel type is; it depends on how much the monitor fiddles with the image before showing it.
On the ugliness argument: yes, cheap panels are not very good in some respects. Contrast is still an issue for LCDs, and ghosting to some extent (not very much on modern monitors). On the other hand, LCDs have much better sharpness, and the image shape is perfect too since every pixel is always the right shape on an LCD. And the no-flickering thing is great too.
Quote :and a prerendering buffer of length 3 which will give you roughly 100ms

I don't know what exactly that nvidia setting does or how it does it. But it seems a bit odd if they would have added a 3-frame delay just to piss everyone off. It makes no sense to have extra buffers just to make the image lag. Maybe it tries to smooth the framerate or use the GPU more efficiently. Anyway, I am pretty sure it does not add 3 frames of lag like you think it does.

In conclusion, I can really see why someone would want a CRT, especially for gaming. In gaming, sharpness or non-flickering is not so important, but contrast, no ghosting and a high refresh rate are. But on the other hand you can play on an LCD just fine; modern LCDs are good enough and they are getting better all the time. It's really about what compromises you are willing to accept.
I'll have to check when I get back home. I know my card (Gainward 8600GT 1GB) can knock out a hefty resolution, but I think the limit for me is my monitor (19" widescreen). I run the game with everything maxed out and my framerate is capped at 100fps. I'll take the limit off and see what it'll run at too.

One question though, why the increase in resolution? If you want to make the game look prettier, surely it would be better to work on texture details or track/car detail in general? Not to say that LFS doesn't look good, but it does pale a bit in comparison to something like Forza Motorsport 2...



Quote from geeman1 :No it doesn't. Buffers are generally used to provide smoothness not to provide delays.

yes it does... do you even have any idea what a buffer is?

Quote :Triple buffering with vsync never adds a two-frame delay.

of course it does, that's how the whole concept of a 3-stage shift register (which is what the triple buffer essentially is) works

Quote :The input lag comes from the monitors electronics, not from the panel type.

do you even read my posts? I said gaming TFTs, which are built for quick switching and low input lag, all of which are TNs

Quote :I don't know what exactly that nvidia setting does or how it does it. But it seems a bit odd if they would have added a 3-frame delay just to piss everyone off.

and yet they have... it's an MS DirectX thing though, and it is very very noticeably laggy if you set it to anything higher than 0
#71 - vari
Quote from Mikey Monkfish :
One question though, why the increase in resolution? If you want to make the game look prettier, surely it would be better to work on texture details or track/car detail in general? Not to say that LFS doesn't look good, but it does pale a bit in comparison to something like Forza Motorsport 2...

Err...what? This poll is for the benchmark site Henrik and I have: http://lfsbench.iron.eu.org/

This has nothing to do with LFS development nor am I a developer.
Quote from Shotglass :yes it does... do you even have any idea what a buffer is?
of course it does, that's how the whole concept of a 3-stage shift register (which is what the triple buffer essentially is) works

I just explained how double and triple buffering work in computers. And I have also coded simple double buffering myself. So I think I know how buffers in computer graphics work. Feel free to post your explanation if you think I am wrong.
Quote :do you even read my posts? I said gaming TFTs, which are built for quick switching and low input lag, all of which are TNs

Yes. But do you read mine? I just said that the TN panel itself does not cause the lag, nor is input lag a problem specific to any panel type.
Quote :and yet they have... it's an MS DirectX thing though, and it is very very noticeably laggy if you set it to anything higher than 0

Ok. I can't really argue about that any longer because I don't know how it works.
Quote from geeman1 :I just explained how double and triple buffering work in computers.

I know how it works, so thanks, but no need for the explanation... how is it that you can't understand that adding a buffer of length 3 will add a 2-frame delay, plus whatever partial frames you'll lose whenever your card can render faster than your monitor's refresh rate?

Quote :And I have also coded a simple double buffering my self. So I think I know how buffers in computer graphics work.

so? I've implemented (admittedly much simpler) buffers myself in VHDL and am able to calculate systemic delays

Quote :I just said that TN panel itself does not cause the lag nor is input lag a specific problem with any panel type.

for god's sake, I never claimed it was a problem of the panel type. What I said was that the only monitors you will find which have been built with REDUCING input lag in mind are gamer LCDs, all of which are TNs and thus rubbish for lots of other reasons
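For reference, Shotglass's mental model is a shift register: in a 3-stage chain, data written at cycle t leaves at cycle t+2. A minimal sketch of that model (it illustrates his claim, not necessarily how GPU triple buffering actually behaves):

```python
from collections import deque

# Shotglass's model: the buffer chain is a 3-stage shift register, so a frame
# pushed in at cycle t only reaches the output at cycle t+2 (a 2-frame delay).
stages = deque([None, None, None], maxlen=3)  # 3-stage pipeline, empty

outputs = []
for frame in range(6):
    stages.appendleft(frame)    # shift the new frame into the first stage
    outputs.append(stages[-1])  # whatever falls out of the last stage shows

print(outputs)  # [None, None, 0, 1, 2, 3]: frame 0 appears at cycle 2
```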
For what it's worth I did a few tests with the lfsbench replay:

1024x768 8xAA 16xAF Avg: 162.779 - Min: 113 - Max: 232
1280x1024 8xAA 16xAF Avg: 163.285 - Min: 114 - Max: 232
1280x1024 16xAA 16xAF GC-AA TAF-SS Avg: 157.085 - Min: 115 - Max: 213
1600x1200 8xAA 16xAF Avg: 161.560 - Min: 114 - Max: 231
1600x1200 16xAA 16xAF Avg: 161.248 - Min: 114 - Max: 232
1600x1200 16xQ AA 16xAF Avg: 147.820 - Min: 112 - Max: 185
1600x1200 16xQ AA 16xAF GC-AA TAF-SS Avg: 99.949 - Min: 56 - Max: 168

PC spec:
E8500 4.1 GHz
DDR3 1526 MHz 7-7-7-20
X48 chipset
9800 GTX OC

So only when turning on 16x AA Q(uality) mode did the framerate start to drop, even at 1600x1200; gamma-corrected AA and Transparency AF have even more effect.
Quote from Shotglass :I know how it works, so thanks, but no need for the explanation... how is it that you can't understand that adding a buffer of length 3 will add a 2-frame delay, plus whatever partial frames you'll lose whenever your card can render faster than your monitor's refresh rate?

Because with triple buffering the buffer chain's length is not two; a frame only ever sits in one buffer, just like with regular double buffering. The image is buffered in one of the buffers before it is sent to the screen; it does not go through both. The image does not go to buffer A, then to B, and then to the screen. The image goes to buffer A OR buffer B, then to the screen. The whole point of triple buffering is to reduce the delay and keep the GPU from idling so much. In an ideal situation the delay with triple buffering can be 0ms, if the refresh rate matches the rendering speed of the GPU perfectly. Besides, there is only one more buffer compared to regular double buffering, not two.
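geeman1's description, that a frame lands in buffer A or buffer B and is flipped straight to the screen rather than queued through a chain, can be sketched like this (idealized lockstep timing, purely illustrative):

```python
# geeman1's model: the GPU alternates between two back buffers; at each tick
# the newest *finished* frame is flipped to the screen. No frame ever passes
# through a chain of buffers. Lockstep timing, for illustration only.
back = {"A": None, "B": None}
target = "A"

shown = []
for frame in range(6):
    back[target] = frame                     # render into the free buffer
    target = "B" if target == "A" else "A"   # next frame uses the other one
    finished = [f for f in back.values() if f is not None]
    shown.append(max(finished))              # flip the newest finished frame

print(shown)  # [0, 1, 2, 3, 4, 5]: each frame shows the tick it finishes
```

Contrast this with the shift-register model earlier in the thread: here a frame is displayed the same tick it finishes, with no built-in multi-frame delay.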
Quote :for gods sake i never claimed it was a problem of the panel type what i said was the only monitors that you will find which have been built with REDUCING input lag in mind are gamer lcds all of which are tns and thus rubbish for lots of other reasons

I haven't personally inspected each and every monitor in existence. What I was saying is that there is no technical reason why the same input-lag-reducing stuff could not be applied to other panel types besides TN. And not all TN panels are built with gaming in mind. In fact, some monitors which are TN suffer from a big input lag because the monitor tries to increase the contrast by applying tricks to the image. Also, there are many other panel types besides TN that are good for gaming too. If you want fast response and great picture quality you have to pay a bit more though.
This thread is closed
