The online racing simulator
TEST PATCH 0.6F2 (minor update)
(181 posts, closed)
Hello there Scawen. I am Spark, one of the TC City Driving daily cruisers. Firstly, I would like to say I appreciate all the work you are doing now with Eric to make LFS better and more popular. Secondly, as you know there are a lot of cruise servers, and it's very nice that you are making all of these maps etc., but is there a way you could add working traffic lights on tracks for better realism? What I mean by that is they change every 20 seconds from red to yellow to green, and then the other way around. It would add so much realism to cruise servers and it would be very much appreciated.
Here we have an unusual problem. I selected the Mouse XY look function while in 3D (with more options already on-screen), which brought up 2 more settings that no longer fit on the screen.
EDIT: for those who don't get it, look at the bottom of my attachment. The button look setting is off the screen.
Attached images
settingsout.png
Quote from THE WIZARD DK :so how about that gearbox issue as well, as it seems to get worse for me, going from 5th or other to reverse by itself... I'm sure I'm not the only one having this issue. I spoke to a guy yesterday with a wheel who has that problem as well..

can it be fixed soon?

Sounds like the infamous "Crap Logitech Hardware Bug" that's been causing problems with G25s and G27s for years now.


In all seriousness, what you describe sounds exactly like the problems the G25/G27 shifter has, caused by worn and/or dirty components (some parts are very poorly designed).
I personally have never heard of any shifting issues unique to LFS.
The G25/7 uses that sort of "joystick" shifter with axes instead of a button system, doesn't it? I've been using my button H-shifter contraption for ~4 years now and never had a problem with it, aside from hardware (alignment) errors that could be fixed with a hammer.
It's a bug in LFS, not a hardware problem with H-shifters; in fact it will never happen while using an H-shifter. And THE WIZARD DK is a keyboard driver and therefore cannot have any problems with that joystick-style G25/G27 shifter.

It is very easily reproducible in single player. Turn autogears on, and while autogears are shifting, for example from 4th to 5th, press PAUSE so that you catch the neutral gear on the dashboard. Hold shift up (or down), resume gameplay, and it will shift to 1st.
The problem is that LFS should ignore user input while autoshifting gears, or it should shift directly from 4th to 5th. Right now it shifts 4th->neutral->5th, so basically if you press either shift up or shift down during neutral it will end up in an unwanted gear.
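The sequence described above can be modelled with a tiny sketch (a toy model, not LFS code; all names here are invented for illustration):

```cpp
#include <cassert>

// Toy model of the reported behaviour: the autobox shifts 4 -> N -> 5,
// and a user shift request arriving while the box sits in neutral is
// applied relative to neutral, landing in 1st instead of 5th.
// This is an invented sketch, not LFS internals.
struct Gearbox {
    int gear = 4;        // current gear (0 = neutral)
    int autoTarget = 0;  // gear the autobox is heading for, 0 = none

    void beginAutoShift(int target) { autoTarget = target; gear = 0; }

    // Buggy behaviour: a user shift during the neutral phase overrides
    // the pending auto target and shifts from neutral (0) instead.
    void userShiftUp() { gear = gear + 1; autoTarget = 0; }

    void finishAutoShift() { if (autoTarget) { gear = autoTarget; autoTarget = 0; } }
};
```

With no user input the auto shift completes normally into 5th; holding shift up during the neutral phase lands the box in 1st, as reported.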
#82 - kdo
Quote from Scawen :Anything's possible but I don't yet know why I would or wouldn't.

I chose shader model 2 as it seemed to offer what I needed. I have not researched properly, what shader model 3 could offer or what negative effects that may have on compatibility with a certain age of supported graphics hardware.

I can already imagine the response "Scawen is insisting that LFS needs to run on a casio wristwatch from the early eighties". No, I am not. However there is a balance between features and compatibility, and what I have said here is "I don't yet know where that level is".

Personally, I prefer a game that runs very smoothly on a Casio over a game I can't play on a low-to-medium PC!

Keep up the good work! And thanks again for the info!
Quote from ACCAkut :The G25/7 uses that sort of "joystick" shifter with axes instead of a button system, doesn't it? I've been using my button H-shifter contraption for ~4 years now and never had a problem with it, aside from hardware (alignment) errors that could be fixed with a hammer.

It uses the positioning of a couple of potentiometers to emulate button presses, yes (although I can't remember how it distinguishes between 6th and R).
The problems are generally caused by:
1. worn/dirty POTs producing noisy/wrong values (more obvious in the spiky pedal issue).
2. a small, soft-ish metal part scraping along a much larger, harder metal part every shift into, or out of, gears 1/2 or 5/6/R. Said soft metal part is vital for the alignment and eventually gets sanded down after a fair amount of use, causing it to lose alignment and randomly jump out of gear.
In some earlier G25s, some people had wires break because they were too short, but I haven't heard of that problem for a long time, so they probably fixed that one at least.

Quote from DANIEL-CRO :It's a bug in LFS, not a hardware problem with H-shifters; in fact it will never happen while using an H-shifter. And THE WIZARD DK is a keyboard driver and therefore cannot have any problems with that joystick-style G25/G27 shifter.

It is very easily reproducible in single player. Turn autogears on, and while autogears are shifting, for example from 4th to 5th, press PAUSE so that you catch the neutral gear on the dashboard. Hold shift up (or down), resume gameplay, and it will shift to 1st.
The problem is that LFS should ignore user input while autoshifting gears, or it should shift directly from 4th to 5th. Right now it shifts 4th->neutral->5th, so basically if you press either shift up or shift down during neutral it will end up in an unwanted gear.

It might have helped if he'd mentioned the auto gearbox and described how to reproduce it, then.
"going from 5th or other to reverse by itself" sounds just like a *very* common hardware issue, and nothing like the actual problem.


AFAIK, shifting 4->5 isn't really possible currently, because it's operating the same manual gearbox simulation, which *must* go into N to actually change gear.

Ignoring the user's shift requests while it's in the process of operating the gearbox (I think that was your other suggestion) may well be the best way to deal with it. Alternatively, in the case where the autobox was shifting one way and the user requested a shift the other way, abort the current shift operation and shift in the direction the user requested (e.g. if auto tries to shift 3->2 and the user requested an upshift, it should shift 3->4 instead). This may be more intuitive than just ignoring the input regardless.
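The two options above (ignore, or redirect an opposing request) could be sketched like this. It is a toy model with invented names, not LFS internals:

```cpp
#include <cassert>

// Toy gearbox where a user request during an auto shift is either
// ignored, or redirects the shift when it opposes the auto direction.
// Names and structure are invented for illustration, not LFS code.
struct Gearbox {
    int gear = 3;
    int fromGear = 0;
    int autoTarget = 0;  // 0 = no auto shift in progress

    void beginAutoShift(int target) { fromGear = gear; autoTarget = target; gear = 0; }

    void userShift(int dir) {  // dir: +1 = up, -1 = down
        if (autoTarget == 0) { gear += dir; return; }  // normal manual shift
        int autoDir = (autoTarget > fromGear) ? +1 : -1;
        if (dir != autoDir)                // opposing request: redirect
            autoTarget = fromGear + dir;   // e.g. auto 3->2, user up => 3->4
        // same-direction requests are simply ignored
    }

    void finishAutoShift() { gear = autoTarget; autoTarget = 0; }
};
```

An opposing request during an auto 3->2 shift ends up in 4th, while a same-direction request during an auto 3->4 shift is ignored and the box still lands in 4th.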
Quote from bogdani.cojocaru :This is actually very accurate.

An SM 4.0 video card will get more FPS in a 3.0 environment than it does in a 2.0 one using the same graphical test. FutureMark apps can be used for reference.

DX 9.0c supports only SM 3.0 =(
Quote from Scawen :what shader model 3 could offer or what negative effects that may have on compatibility with a certain age of supported graphics hardware.

Quote :
• DirectX 8.0 - Shader Model 1.0 & 1.1
• DirectX 8.0a - Shader Model 1.3
• DirectX 8.1 - Shader Model 1.4
• DirectX 9.0 - Shader Model 2.0
• DirectX 9.0a - Shader Model 2.0a
• DirectX 9.0b - Shader Model 2.0b
• DirectX 9.0c - Shader Model 3.0
• DirectX 10.0* - Shader Model 4.0
• DirectX 10.1* - Shader Model 4.1
• DirectX 11.0† - Shader Model 5.0
• DirectX 11.1† - Shader Model 5.0
• DirectX 11.2‡ - Shader Model 5.0

DirectX 9.0c was the last release that could also be used on Windows 98 and, as stated before, went public in 2004, so it was already there when the first versions of LFS came to life.

If you still have a GPU/computer which predates that time, it can safely be called a Casio wristwatch, although since it's probably as big, ugly and inefficient as one, a better name would be: junk.

But of course I understand there have to be benefits, and if GPUs of the last years can handle SM3.0 better, then I guess it's a big benefit. At least I understand that's the quest now: Westhill has more detail, and combined with 3D it probably demands a lot more graphical calculation.

2.0 also has a lot of limitations compared to 3.0. I don't know what everything means in these tables:
http://en.wikipedia.org/wiki/High-level_shader_language

But the difference looks huge.

Added fact:
Quote :Geometry Instancing

Another feature, geometry instancing, could show itself as a noticeable performance increase. With geometry instancing a game can loop information from within the vertex buffer to create the same object. A good example of this would be an RPG where you are controlling a large army, and in that scene the characters all look the same. Instead of having to draw each one separately, with geometry instancing the video card can simply draw one of them and copy the rest, therefore providing a performance increase. No image quality differences would be seen with this feature; it is purely a performance feature.

Useful for the cars and various objects which look the same around the track?
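As a toy illustration of the draw-call arithmetic behind the quoted text (bookkeeping only, with invented names; real D3D9 instancing is set up through IDirect3DDevice9::SetStreamSourceFreq):

```cpp
#include <cassert>
#include <map>
#include <vector>

// Toy model of the idea in the quote: with instancing, identical
// objects share one draw call, so the number of draw calls equals the
// number of *distinct* meshes, not the number of objects on track.
// This is only bookkeeping, not real Direct3D code.
int drawCallsInstanced(const std::vector<int>& meshIds) {
    std::map<int, int> batches;            // mesh id -> instance count
    for (int id : meshIds) batches[id]++;  // group identical objects
    return (int)batches.size();            // one call per distinct mesh
}
```

A scene with 20 identical cars and 100 identical barriers would need 120 draw calls one by one, but only 2 under this scheme.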

Maybe it's something to discuss in this new, I think quite good, topic.
Speaking technically, as a sysadmin I still (in a business sense) support XP+. (Well, XP not for business, and don't poke Vista with a stick.) 7 actually works really well; 8(.1) causes issues. (16-bit databases don't run under 8 64-bit, and yes, people still run these. I just charged a client for getting a 16-bit database running on an 8.1 Surface tablet. Yes, WTF, but hey, a chargeable job.)

Why the * are people still running 98?

Let's at least settle on XP+ as the minimum spec; after all, 14 years old could well be a cut-off without offending too many people...

"WHAT, my Dos 3.0 AT is no longer running LFS ?????, Y U NO SUPPORT THIS TECH !!!!!!!!!!!"
OK, some interesting info there. But something is not answered... I don't think it is so simple that if you got your PC in 2004 or later (or you have XP) then everything is necessarily fine, for an upgrade from Shader Model 2 to Shader Model 3. The graphics cards had to be developed to support Shader Model 3, and these new cards had to be installed in everyone's computer.

I don't yet know when it was that all graphics cards, onboard graphics and so on, supported SM3. I read there was a time when the market was full of SM2 hardware that people had just bought, but they couldn't run the newest games that came out, from big developers pressurised by sinister capitalist forces into requiring the latest graphics cards that supported SM3.

I tried yesterday, I can change two lines "ps_2_0" and "vs_2_0" into "ps_3_0" and "vs_3_0" and it still works. But I don't want to do that if it breaks LFS for thousands of users.
Isn't the fact that a GPU supports DirectX 9.0 enough, as it should automatically support the given shader model, no? And what GPU doesn't support DX9 in 2014...

BTW for Linux users I would suggest trying Wine 1.7.x.
No, DirectX used to provide software emulation for things that were not supported by the graphics card. Obviously this is terribly slow. And anyway it doesn't work that way. The capabilities depend on the card, not the software. DirectX is just an interface, it does not magically bestow abilities onto hardware.

P.S. Please don't join the conversation by asking questions. It wastes my time. Please only contribute if you know the answers.

DirectX 9.0 came out supporting Shader Model 2. Later, 9.0c came out supporting Shader Model 3. There was already loads of hardware out there, as I said in my last post, supporting Shader Model 2 but not Shader Model 3. That doesn't prevent anyone installing DirectX 9.0c on their computer, but that doesn't magically make their hardware support Shader Model 3.
Quote from Scawen :...
I tried yesterday, I can change two lines "ps_2_0" and "vs_2_0" into "ps_3_0" and "vs_3_0" and it still works. But I don't want to do that if it breaks LFS for thousands of users.

this can be a test patch ...
or even an option, no ?
As far as I know, SM 2.0 was launched with the GeForce FX 5000 series, which were utter crap exactly as you were saying, and I highly doubt that people still use them for LFS (OR ANYTHING ELSE) because the performance is so low it makes you want to cry.

SM 3.0 came with the 6000 series, which were properly made and haven't failed like that again.

If only you could integrate some "thing" into LFS that would send you a full report of the PC config of everyone who goes online in-game at least once. I don't know if that's invading privacy or not, but I think it would be helpful to know.

Quote from Flotch :this can be a test patch ...
or even an option, no ?

Or maybe this. The Source engine by Valve is a good example. You can pick the way you want the game rendered, from DX 7 up to DX 9.0.
You can query the device capabilities reported by your card using the IDirect3D9::GetDeviceCaps method. This will give you a structure containing a lot of interesting information about what the hardware supports, including fields called VertexShaderVersion and PixelShaderVersion.

You could add an option in the graphics settings:
Shader Model: [2.0] [3.0]
(disable 3.0 if the device doesn't support it, like the z-buffer option)

P.S. You could also read the shader version and send it to your server to keep track of statistics.
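The caps check suggested above could look roughly like this. The version-encoding macros below are reproduced from d3d9caps.h; the caps struct here is faked since there is no device to query (in real code IDirect3D9::GetDeviceCaps fills a D3DCAPS9 with these fields):

```cpp
#include <cassert>

// Version macros as defined in d3d9caps.h: a shader version is packed
// into a DWORD as 0xFFFF0000 (ps) / 0xFFFE0000 (vs) | major<<8 | minor.
#define D3DPS_VERSION(major, minor) (0xFFFF0000u | ((major) << 8) | (minor))
#define D3DVS_VERSION(major, minor) (0xFFFE0000u | ((major) << 8) | (minor))

// Stand-in for D3DCAPS9, which IDirect3D9::GetDeviceCaps would fill in;
// faked here so the check can be shown without a real device.
struct FakeCaps { unsigned PixelShaderVersion, VertexShaderVersion; };

bool supportsSM3(const FakeCaps& caps) {
    return caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0)
        && caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
}
```

A card reporting ps/vs 3.0 or higher passes the check; an SM 2.0 card fails it, which is when LFS could fall back to (or require) the 2.0 shaders.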
Quote from Flotch :this can be a test patch ...
or even an option, no ?

Maybe, but I'm just asking the question here, in case someone might happen to know the answer. For example, someone could tell me that certain very common graphics cards were built to support SM2 but don't support SM3 and many of those cards are still in use, that kind of thing. Or someone could tell me that certain games still provide an SM2 option or something. I really don't know and that's why I asked; maybe someone could provide an answer that would be a reason to stick with SM2.

Obviously the usual capitalists around here will be telling me I should support the latest and greatest, don't worry about people who have ancient hardware from two years ago that Tyrannosaurus Rex used to use.
I'm actually more concerned about the people who still can't run the latest version on Linux. Which shader models are supported properly on Linux / Wine? Is it easy for our users who have problems at the moment, to update their D3DCompiler_43.dll and solve their problems?

So far if I remember correctly, we have heard from two Linux users with a shader problem, and one who runs it fine.

Quote from majod :BTW for Linux users I would suggest trying Wine 1.7.x.

Possible solution?
Scawen, I think that in the years 2005-2008 the nVidia GeForce 6600 was the most popular video card, because all my friends and I had that video card.

NVIDIA has ceased driver support for the GeForce FX series (SM 2.0 only):
Windows 9x & Windows Me: 81.98, released on December 21, 2005
Windows 2000, 32-bit Windows XP & Media Center Edition: 175.19, released on July 9, 2008
Windows Vista & 7: 97.34, released on November 21, 2006

The GeForce 6 series supports SM 3.0 and DX9.0c;
it started in 2004 and stopped in 2006.

The GeForce 7 series supports SM 3.0 and DX9.0c;
it started in 2005 and stopped in 2007.

The ATI Radeon R520 (X1300-1950) started in Oct. 2005 and supports SM 3.0 and DX9.0c.

ATI does not provide official support for any X1000 series cards on Windows 8;
the last AMD Catalyst for this generation is 10.2 from 2010, supporting up to Windows 7.


I think you can safely turn on shader version 3 without any problems.
Quote from THE WIZARD DK :but it also happens to me in manual

You sure you're on manual?

More than a year ago, when I wasn't used to manual shifting, I was doing the same thing and I had no idea wtf was going on. People kept asking me if I was on manual and I was like yeah yeah sure, but it looked like I wasn't really.

Once I switched to manual it never happened again.

https://www.youtube.com/watch?v=V3EarMlWfbY
Quote from Scawen :Mine just says "New value is too low" or "New value is too high".

Well, it seems your original LFS is better than my copy, then. I thought maybe it was a translation thing, but no, my copy does the same when set to English (enter a value out of range in the 3D settings and LFS disappears).

You could perhaps ask Eric to try it on his copy?
#98 - troy
According to nvidia, shader model 3.0 was first introduced with the 6000 series cards, which seem to have been released in 2004. They go on eBay for 10-20 euros.

http://en.wikipedia.org/wiki/G ... 6_series#Shader_Model_3.0
http://www.nvidia.com/object/powerof3.html

According to Steam, 0.85% of its users still use a shader 2.0 card, 1.54% use shader 3.0, and the rest are on higher shader models.

http://store.steampowered.com/hwsurvey/videocard/

edit: Also interesting to see there are no nvidia cards with SM 2.0 in use on Steam (scroll to the bottom). It looks like AMD was slow updating to SM 3.0 with some of their cards; the rest looks to be mostly internal chipsets, which have horrible gaming performance and are probably not used for gaming.

edit2: about 70% of those 0.85% of overall users with SM 2.0 cards seem to be on internal chipsets; from what I can see, all of them have pretty much zero gaming performance.

http://www.videocardbenchmark. ... u=Intel+G33%2FG31+Express
http://www.videocardbenchmark. ... obile+Intel+945GM+Express

edit3: sorry I must have scrolled over denis-takumi's post. The bit about steamusers is still interesting though.
Would be interesting to see how SM3 affects us Wine users too.
Quote from hackerx :Well, it seems your original LFS is better than my copy, then. I thought maybe it was a translation thing, but no, my copy does the same when set to English (enter a value out of range in the 3D settings and LFS disappears).

You could perhaps ask Eric to try it on his copy?

Try entering an out-of-range value while the selected 3D mode is already active (so not in the first small window; OK that one, but on the main settings - view screen). If I enter an out-of-range value in that first window, LFS crashes.
Attached images
crash.png
This thread is closed
