MadCatX
S3 licensed
Quote from sinanju :I have a nVIDIA badged GeForce GTX 580, which is about 4 years old now.

Looks like Ubuntu 15.04 is newest desktop version, which I'm currently downloading.

If you can't get even Ubuntu 15.04 to boot, you might want to use this fix: http://ubuntuforums.org/showthread.php?t=1613132 The section about the "nomodeset" parameter is what you're after. The nouveau driver is almost completely reverse-engineered and at least the part that handles desktop GPUs is completely unsupported by nVidia. Adding the "nomodeset" parameter will prevent nouveau from loading and you'll get basic but working video output. Once you get to the desktop, you can install the proprietary driver developed by nVidia through the "Additional device drivers" tool.
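
In case the description in the linked thread is confusing: highlight the Ubuntu entry in the GRUB menu, press "e" and append the parameter to the end of the line that starts with "linux". A rough sketch of how that line might look (the kernel version and root= will obviously differ on your machine):

linux /boot/vmlinuz-... root=UUID=... ro quiet splash nomodeset

Press Ctrl+X or F10 to boot with it; the edit is one-time only and isn't saved anywhere.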

A few extra hints:
- You shouldn't have to add the "nomodeset" parameter permanently; the nVidia driver installer will take care of that.
- Do not add any extra kernel parameters unless you have a good reason to think that they will fix a particular problem.
- Always try to use 3rd-party drivers provided by the Ubuntu tool. Do not download and install them manually.
- Do not install 3rd-party drivers for a device that works. A lot of devices have decent open source drivers integrated into the kernel. 3rd-party drivers are usually pretty messy and don't play nicely with the rest of the kernel infrastructure (yes, Broadcom, I'm looking at you!).
MadCatX
S3 licensed
Quote from Matrixi :V-sync doesn't seem to be working on wine, can anyone else confirm? Testing H5 with OSX 10.11 and if I switch v-sync on, LFS still runs at 100 FPS.

Works for me on Linux, but only in fullscreen.
MadCatX
S3 licensed
Since I have no intention of using Windows anywhere, I thought I'd stay out of the stream of mostly worthless babble that floods the web every time a new version of Windows is released. However, after I looked over some of the statements in the EULA and some of the default settings related to "privacy", I can't help but wonder if this is some sort of elaborate joke. I realize that a good portion of it is hyped up a lot, but I still find it concerning that people apparently have no problem agreeing to some of the terms mentioned in the EULA.

- I get that Cortana needs to gather some information about the user's habits to work efficiently, but why is this an opt-out setting? I suppose that Cortana should also be covered by a separate EULA.

- Backups of the BitLocker recovery key to OneDrive. Come on, really? Besides my natural distrust of an encryption scheme that even supports something like a "recovery key", can one possibly do anything more stupid than uploading a decryption key to cloud storage?

- The crown jewel of all the "we don't give two shits about your privacy" gems has got to be this:

Quote :
We will access, disclose and preserve personal data, including your content (such as the content of your emails, other private communications or files in private folders), when we have a good faith belief that doing so is necessary to protect our customers or enforce the terms governing the use of the services.

This whole paragraph can basically translate to anything because the phrasing is so vague. I always thought that my computer was supposed to do what I tell it to, not the other way around.

All of the above plus the fact that one has to jump through all sorts of hoops to tune at least some of the snooping services down is a huge letdown for me.
MadCatX
S3 licensed
Quote from matijapkc :I have some temperature issues after applying the new patch. My laptop is normally running on pretty high temps, but this is extreme. My GPU usually goes up to 75-80°C, but with new patch I got a max temperature of 127.5°C which concerns me a bit. I will try the old patch later and report back with the max temp.

New optimizations lifted some of the CPU bottleneck so your GPU can now spend more time rendering frames. That's why it's getting hotter. Do what Dave said and fix your heatsink before the GPU burns a hole through your computer.

I noticed a pretty nice FPS increase even under WINE, especially at places where the framerate used to drop to low 30's.
MadCatX
S3 licensed
Quote from matze54564 :
Quote from dawesdust_12 :
Without vsync, you're always getting the newest frame possible because there's a surplus of frames, without any having to wait.

That's not true, because your monitor has a refresh rate of maybe 70 Hz or 60 Hz (50 Hz is only for freaks using German TV on a PC), and when you limit the frame rate to 100 Hz, your monitor won't show you 30 of the frames. 1 skipped frame is like 10 ms delay.

That's not how it works. Without VSync, and with a framerate that is not an integer fraction or multiple of the screen's refresh rate, you get tearing: one part of the screen displays the Nth frame while the other part shows the (N+1)th frame. For example, at 100 FPS on a 60 Hz screen a new frame is ready every 10 ms but the screen refreshes only every ~16.7 ms, so most refreshes end up stitched together from two different frames. This is annoying to look at, but it doesn't introduce any delays.
MadCatX
S3 licensed
A small heads-up: kernel 4.1 is starting to hit the repositories of rolling-release distributions. The issue with some newer revisions of wheels not being picked up by the driver should be resolved there.
MadCatX
S3 licensed
Can you post your config file? The parsing function for the config isn't exactly brilliant but there is no obvious reason why this wouldn't work. It definitely did work in older versions of the mod.
MadCatX
S3 licensed
Are you editing the correct value? It looks like you are changing the first value but what you want is controlled by the second value.
MadCatX
S3 licensed
Works perfectly fine here. The good news is that the introduction of D3D9Ex doesn't seem to cause any trouble with WINE.
MadCatX
S3 licensed
Quote from LFS Marius LFS :Now in LFS_External Initialise the InSim events are:
InSim.BTC_Received += new LFS_External.InSim.InSimInterface.BTC_EventHandler(BTC_ButtonClicked);

I need to know that for InSimDotNet

Read the section "Receiving Packets" in the tutorial. Seriously, everything you've asked so far is explained there. Take your time and read it; at the end of the day it'll be easier than asking other people for everything.
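
For InSimDotNet you don't wire up BTC_Received-style events like in LFS_External; you bind a handler for the packet type you're interested in. A rough sketch from memory (names may be slightly off, the tutorial has the exact code):

using System;
using InSimDotNet;
using InSimDotNet.Packets;

class Example {
    static void Main() {
        InSim insim = new InSim();

        // Bind a handler for the packet type you care about (IS_BTC = button click)
        insim.Bind<IS_BTC>(ButtonClicked);

        insim.Initialize(new InSimSettings {
            Host = "127.0.0.1",
            Port = 29999,
            Admin = "",
        });

        Console.ReadLine();
    }

    // Called every time LFS sends an IS_BTC packet
    static void ButtonClicked(InSim insim, IS_BTC btc) {
        Console.WriteLine("Button {0} clicked by connection {1}", btc.ClickID, btc.UCID);
    }
}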
MadCatX
S3 licensed
You obviously need to send an IS_BTN packet. It's all in the simple tutorial.


IS_BTN b = new IS_BTN { /* Whatever button parameters you wish */ };
inSim.Send(b);
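
To illustrate what the "button parameters" bit means, a filled-in button could look something like this (just a sketch with made-up values; the exact field and flag names depend on the InSim library you use, so check its docs):

IS_BTN b = new IS_BTN {
    ReqI = 1,                        // must be non-zero for the button to be drawn
    UCID = 0,                        // 0 = local connection / host
    ClickID = 1,                     // your own ID, echoed back in the IS_BTC click packet
    L = 80, T = 20, W = 40, H = 10,  // position and size in the 0-200 button coordinate space
    BStyle = ButtonStyles.ISB_DARK | ButtonStyles.ISB_CLICK,
    Text = "Click me",
};
inSim.Send(b);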

MadCatX
S3 licensed
http://rog.asus.com/forum/showthread.php?14887-Problems-with-new-Xonar-Phoebus

It seems like this sound card can be a pain in the ass to use...
MadCatX
S3 licensed
CSMT is an out-of-tree patch that Fedora includes in its WINE packages. It can obviously be used anywhere as long as you're willing to patch and build WINE yourself.
MadCatX
S3 licensed
If you're on an up-to-date Fedora, you can get some FPS boost by enabling CSMT in winecfg.
MadCatX
S3 licensed
Quote from LFS Marius LFS :
Quote from MadCatX :Are you using something based on LFS_External? This library hasn't worked properly since LFS 0.6B, I believe.

LFS_External is my library.

Herein lies your problem. Consider using InSim.NET instead.
MadCatX
S3 licensed
Are you using something based on LFS_External? This library hasn't worked properly since LFS 0.6B, I believe.
MadCatX
S3 licensed
Quote from Racer X NZ :It's so ridiculous calling it an IGC just because that's what Intel call it.

No, but claiming that an IGC is not a GPU is the essence of nonsense. By the same logic you should not consider a "Civic" a car because Honda calls it a "Civic" and not a "Car". It's the function that matters, and by definition a "Graphics Processing Unit" is a piece of computer hardware specifically designed to generate an image and display it on a screen. Intel IGCs do exactly that, hence it is perfectly correct to call them GPUs.
MadCatX
S3 licensed
Quote from Racer X NZ :That's cus Intel onboard is an IGC, rather than a GPU.

For a little while I considered explaining why this statement is so ridiculous... then I realized that this sort of argument is consistent with most of your posts, so I just chuckled and went about my day.
MadCatX
S3 licensed
Quote from dawesdust_12 :
Quote from MadCatX :
Quote from dawesdust_12 :
Quote from MadCatX :DX12 or Vulkan make sense only for applications that tax the GPU so heavily that the API overhead and the inability to have fine-grained control over the GPU become a problem. I don't believe that LFS is anywhere near that limit. DX12 and Vulkan also require a fairly new GPU because - as far as I gather - both of these APIs make some assumptions about how a GPU works inside. All this "3D revolution" stuff is just marketing talk. Games won't suddenly get awesome just because their 3D engine is written in Vulkan.

Well, Valve was implementing Vulkan things on existing Intel GPUs (if I recall their presentation correctly), which would make me think that a lot of functionality can be implemented at the driver level, not requiring support at the hardware level (like some OpenGL features require).

I could be wrong, but Vulkan seems to give the greatest "hope" of being backwards compatible. DX12 could be similar, as the last few DX versions haven't seen the hardware changes like some of the DX9 GPU features required.

According to this, Vulkan assumes an OpenGL 4.3 or OpenGL ES 3.1 compliant GPU.

Which is basically "Any GPU in the last 5 years" after some quick Googling.

Intel GPUs seem to be the exception, since official support for OpenGL 4.3 in their Windows drivers has been available only since the Haswell chips.
MadCatX
S3 licensed
The WINE log suggests that you have the CSMT Direct3D offloading enabled. Does the problem go away when you switch it off?
MadCatX
S3 licensed
Quote from dawesdust_12 :
Quote from MadCatX :DX12 or Vulkan make sense only for applications that tax the GPU so heavily that the API overhead and the inability to have fine-grained control over the GPU become a problem. I don't believe that LFS is anywhere near that limit. DX12 and Vulkan also require a fairly new GPU because - as far as I gather - both of these APIs make some assumptions about how a GPU works inside. All this "3D revolution" stuff is just marketing talk. Games won't suddenly get awesome just because their 3D engine is written in Vulkan.

Well, Valve was implementing Vulkan things on existing Intel GPUs (if I recall their presentation correctly), which would make me think that a lot of functionality can be implemented at the driver level, not requiring support at the hardware level (like some OpenGL features require).

I could be wrong, but Vulkan seems to give the greatest "hope" of being backwards compatible. DX12 could be similar, as the last few DX versions haven't seen the hardware changes like some of the DX9 GPU features required.

According to this, Vulkan assumes an OpenGL 4.3 or OpenGL ES 3.1 compliant GPU.
MadCatX
S3 licensed
DX12 or Vulkan make sense only for applications that tax the GPU so heavily that the API overhead and the inability to have fine-grained control over the GPU become a problem. I don't believe that LFS is anywhere near that limit. DX12 and Vulkan also require a fairly new GPU because - as far as I gather - both of these APIs make some assumptions about how a GPU works inside. All this "3D revolution" stuff is just marketing talk. Games won't suddenly get awesome just because their 3D engine is written in Vulkan.
MadCatX
S3 licensed
Having a game run twice on different GPUs simultaneously might not give you really relevant results because of how the Optimus technology works.

It might be interesting to see if there is any difference in CPU usage with the game running on different GPUs.
MadCatX
S3 licensed
The last log captured with the wheel disconnected doesn't seem to show any obvious pattern. It'd probably be a good idea to run some tests with games other than LFS; at the very least we need to figure out whether this problem is specific to LFS. Is there anything suspicious in your "dmesg" log?