The online racing simulator
Upgrade 8800GTS (512MB/G92)
(47 posts, started )
What screen resolution do you want to run at?

I run everything on High plus 8x MSAA and 16x AF at 1280x1024 (the native resolution of my monitor) in Team Fortress 2 on my Sapphire HD3850 512MB GDDR3 AGP, and everything the same in Bioshock and every other game I play. Hell, even Insurgency doesn't lag, and it has some seriously large open spaces.

I never go below 30fps; it usually sits around 35-50 in heavy action scenes.

I'm running a 3.2GHz P4 Prescott with HT on an 800MHz FSB and 2GB of DDR @ 400MHz.
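The FPS figures above are averages, and averages can hide stutter. A minimal sketch (not from the thread; the frame times are made-up numbers) showing how an average FPS and a worst-frame FPS are derived from per-frame render times:

```python
# A minimal sketch of why average FPS can look fine while play still
# stutters: the mean hides the occasional slow frame.
def fps_stats(frame_times_ms):
    """Return (average fps, worst-frame fps) from per-frame render times in ms."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_ms = max(frame_times_ms)
    return 1000.0 / avg_ms, 1000.0 / worst_ms

# Hypothetical frame times: mostly ~20 ms (50 fps) with one 50 ms spike.
times = [20.0] * 9 + [50.0]
avg_fps, min_fps = fps_stats(times)  # average still looks like ~43 fps
```

With these made-up numbers the average comes out around 43 fps, yet the slowest frame corresponds to only 20 fps, which is the dip a player actually feels.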
I run at 1680x1050 (22"), with a clone output via component to a 42" HDTV. Not sure if the latter affects the framerate; I suspect not. In truth I can run most things pretty well, but I'm not happy with performance in Crysis or Age of Conan. The Bioshock demo runs well.

I've never been able to run anything with super high AA settings. Typically 2x, or sometimes 4x. LFS is the exception: I max out every single setting in the nVidia control panel (texture quality, AA, AF, LOD, blah blah) and LFS still runs at silly frame rates, usually around 200.

@dadge - I've been an AMD supporter since my first 386 machine, and I think they've made some awesome stuff. It also seems that they're at the end of the line for the time being. I was stubborn about it last year and figured a dual core clocked at 3.2GHz (Windsor) would be great, but I didn't look into it enough. Everything I'm reading right now indicates that Intel's product makes mine look like it's running backwards in the gaming world.
I think the only reason I can run things at such high settings is that both my monitors are limited to 1280x1024; the CRT's max is the same as the native resolution of my TFT.

I've played Crysis on this computer once, and that was the demo running on an HD2600 Pro.

Oddly enough it ran decently well at all-high settings, not great but decent. No AA or AF, though, and at 1024x768 on my old CRT.

With my current setup I think I could get it on high at native, but not ultra high, and probably very little if any AA and AF.

Crysis has great graphics, though. I don't understand how Age of Conan can be asking for more than what you've got, considering the graphics it generates. It must be a real resource hog.
AoC is DX10 and is probably one of the best-looking games out there right now, methinks!
I doubt that you'll ever be happy with your Crysis performance.
Looking at that HardOCP article I linked to earlier... they're running the second-highest graphics settings on an i7 machine with more graphics power than you currently have.

So I'm afraid that with any machine available in the next year or so that doesn't require you to sell your soul to justify the expense, you're going to have to live with the Germans from Crytek showing you that your e-penis is in fact rather small.
Quote from Shotglass :
So I'm afraid that with any machine available in the next year or so that doesn't require you to sell your soul to justify the expense, you're going to have to live with the Germans from Crytek showing you that your e-penis is in fact rather small.

ROFL!

There's some sig material right there

Fair enough; but those maniacs doing that test you linked were running some crazyass resolution (2560x14*10^32 or so I think). The 4 ATi GPUs had a hard time on ultra everything at that res; I bet that at earthly resolutions I could run it reasonably well. Either that or Germans just aren't as efficient as you think.
Quote from Ball Bearing Turbo :Fair enough; but those maniacs doing that test you linked were running some crazyass resolution (2560x14*10^32 or so I think).

They tend to go a bit overboard over there.
Crysis was running at a mere 1920x1200, though, and still wouldn't go beyond a sustained 30 frames.

Quote :Either that or Germans just aren't as efficient as you think.

Judging by the names in the credits, the Crytek guys are actually from Turkey (the country, not the bird), so I object to any implications of a lack of efficiency in German engineering.
To be honest, Ball Bearing Turbo, if your motherboard supports it (or you can get a BIOS update), you would be best to save up and get an AM2+ Phenom II quad core.

They blow the old Phenom I out of the water, and with an aftermarket fan they overclock just as well as the Intel quad series, so you can squeeze some more performance out if needed.

Benchmarks show that while Intel is still slightly quicker, the Phenom II series is right behind it, and by that I mean within a few percent.
Quote from mclarenmatt :Benchmarks show that while Intel is still slightly quicker, the Phenom II series is right behind it, and by that I mean within a few percent.

It has to be said, though, that this is purely on a performance-per-money basis, not a performance-per-clock basis.
Quote from Shotglass :It has to be said, though, that this is purely on a performance-per-money basis, not a performance-per-clock basis.

For example, if they were both running at 3GHz, yes, the Intel would win performance-wise by a few percent.

But if you take the cost into account as well, they are about level.

It depends whether Ball Bearing Turbo wants that extra few percent of performance or whether he would like to save some money and still get a top-notch CPU.

Edit:// Link to AMD Phenom II X4 940 3GHz. I plan on upgrading to this soon
Quote from mclarenmatt :For example, if they were both running at 3GHz, yes, the Intel would win performance-wise by a few percent.

But if you take the cost into account as well, they are about level.

It depends whether Ball Bearing Turbo wants that extra few percent of performance or whether he would like to save some money and still get a top-notch CPU.

Edit:// Link to AMD Phenom II X4 940 3GHz. I plan on upgrading to this soon

@ I've noticed you have a Gigabyte GA-MA770-DS3; I have a similar board. My board supports AM3 also. If yours does, you might want to buy one of those, as they use less power and would clock a lot higher for you.
#37 - Jakg
Quote from mclarenmatt :For example if they were both running at 3GHZ, yes the Intel would win performance wise by a few %.

I wouldn't be so sure - my main PC (running at 2.13 GHz) is faster than the 920 (2.8 GHz) in PC#2, and the 920 costs around £25 more than the Q8200, which should be similar...
Quote from Jakg :I wouldn't be so sure - my main PC (running at 2.13 GHz) is faster than the 920 (2.8 GHz) in PC#2, and the 920 costs around £25 more than the Q8200, which should be similar...

Hmmm, dunno then.

Different benchmarks report different things, so I suppose it comes down to what's better for what you will be using.
Yeah from my understanding there is a discrepancy between what benchmarks report and what actually happens in gaming.
Quote from Ball Bearing Turbo :Yeah from my understanding there is a discrepancy between what benchmarks report and what actually happens in gaming.

Sometimes there is a HUGE discrepancy.

Benchmarks are bull most of the time, I never look at them. I look at the characteristics of the hardware.

I've seen cases where a graphics card that benchmarks as being much better is beaten by the card it was matched against.

For instance, my really old Number Nine Savage3 8MB AGP was matched against an nVidia card with 16MB on AGP (can't remember which nVidia card, though). The benchmark said I would get 10fps more at 800x600 resolutions with the nVidia card, whereas I actually got 20fps more in all situations with the Savage3.

Unfortunately Number Nine shut down; they could have been a real contender with the cards they made.

Since then, I don't even look at benchmarks let alone take what they say as fact.
Hate to burst your bubble, DragonCommando, but benchmarking is the only real way to compare the performance of two cards: you have them all run exactly the same test and compare the results. You can't compare clock speeds and automatically know which card is better; the true performance of the card is the combination of all the specs. But we don't know how they are combined, so we cannot tell from the core speed + memory speed + amount of memory + all the other stuff in there how one card compares to another.

And sometimes you cannot compare specs directly at all. ATI and Nvidia, for example, have different ways of counting shaders, so an ATI 4870 has 800 shaders and a GeForce GTX 280 has 240 shaders (or something, dunno exactly). The 4870 also has a faster core clock speed than the GTX 280. Does that make the 4870 faster? No, it doesn't. In fact it's slower. But how do I know that? After all, the specs tell me the Radeon should be faster, right? Benchmarks tell the real story; even if they don't match exactly from site to site, they all follow a similar pattern.
There are so many factors that can affect a benchmark, or the actual real-world performance, that the benchmarks aren't ever going to be reliable.

Nine times out of ten, one card is going to be good for one thing and another is going to be good for something else.

It's always a trade-off. If you can look at the specs and consider what you are going to use it for, knowing that ATI and Nvidia count things differently, you can factor that in and figure out which one is actually better for your task.

I've looked at a lot of graphics cards that boast big numbers and passed them up for cards with lower numbers that are actually better, simply because they are geared for what I do.

Like you said, the numbers mean very little when they aren't measured on the same scale. But benchmark numbers mean nothing if you aren't using the exact same computer system they are.

Basically I'm saying: if the benchmark somehow covers exactly what you are going to be using it for, then fine, it might be accurate. But if it doesn't cover what you are going to use it for, there is no way it will be of any use to you.
Ok, well I decided to go the motherboard / CPU route as well... in for a penny/in for a pound etc.

Just thought I'd mention something that I did NOT expect - I backed up all my important crap and proceeded to swap out the boards. What really shocked me was that, switching from an AMD to an Intel system, I didn't have to reinstall the OS (Vista). I did the swap, and the bloody system just booted right up on the existing OS... I was able to install all the new drivers and whatnot with no issue. I'm wondering if that is improved functionality in Vista, since that really was not possible on XP.

Just thought I'd share that, I thought it was cool and saved me a lot of time. One of these days I probably should do a fresh install anyway, but kind of waiting for Win7 first.
As your motherboard supports AM3, why did you go the route of spending more money? Can you still cancel the order?

What about the AMD Phenom II X3 720BE? It has an unlocked multiplier, so you don't need to raise the FSB, and it costs around 110-115 euros currently. What performance for the money.
Nah, I picked the parts up at the shop here in town that I generally buy from. I wanted something that I knew I would be happy with and wouldn't need to upgrade for a while, so I went with the Core i7 920. I simply bumped my bclk up and have it sitting at 3.5GHz on the stock cooling - it still doesn't get hotter than 54C, even with the GPU at 91C heating up the case... so I would imagine I still have quite some headroom if I wanted to push it.
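The arithmetic behind a bclk overclock like that is simple: core clock = base clock x multiplier. A quick sketch, assuming the i7 920's stock x20 multiplier and 133MHz base clock (actual multiplier and turbo behaviour may differ):

```python
# Rough arithmetic behind a bclk overclock on an i7 920.
# Assumes the stock x20 multiplier and 133 MHz base clock.
STOCK_MULTIPLIER = 20
STOCK_BCLK_MHZ = 133

def core_clock_ghz(bclk_mhz, multiplier=STOCK_MULTIPLIER):
    """Core clock in GHz from base clock (MHz) and multiplier."""
    return bclk_mhz * multiplier / 1000.0

stock = core_clock_ghz(STOCK_BCLK_MHZ)  # ~2.66 GHz at stock
oc = core_clock_ghz(175)                # raising bclk to 175 MHz gives 3.5 GHz
```

So the 3.5GHz figure mentioned above corresponds to pushing the base clock from 133MHz to roughly 175MHz while leaving the multiplier alone.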

Pretty large difference in games as well. I can run Crysis on max settings, but if I want it perfectly smooth all the time I have to turn one setting down to the second-highest: the post-processing.

Only question now is whether or not to add a second GTX260. (Never had an SLI board before; seeing the other PCIe slot in there puts ideas in the head...)
Np.
It is of course your money and decision. Congratulations on the new system, and enjoy it. BTW, if you had waited 1.5 months you could have had the Core i5. Just as fast, much less money, but don't kill me for saying that now.

LFS will not benefit from SLI at all, but other games will for sure. The question is whether you have a good enough PSU to support it. I would buy a decent (not cheap) 750W power supply to be really on the safe side. Maybe you will need even more; I don't know everything your current system consists of. For example, if you have a lot of hard drives in your rig, they can stress the system at power-on, because drives draw a lot from the 12V rail when spinning up. Some people might be surprised how much that can be. Otherwise, hard drives do not consume much power.
Quote from DEVIL 007 :Np.
It is of course your money and decision. Congratulations on the new system, and enjoy it. BTW, if you had waited 1.5 months you could have had the Core i5. Just as fast, much less money, but don't kill me for saying that now.

Yeah, I checked into that extensively but still decided to get the i7. The i5 is actually quite different, and the i7 still has a pretty major advantage in SLI systems. I'm still pretty confident about being ahead of the game with the headroom I have on the i7 chip... some of the clocks out there are insane.

Quote :LFS will not benefit from SLI at all, but other games will for sure. The question is whether you have a good enough PSU to support it. I would buy a decent (not cheap) 750W power supply to be really on the safe side. Maybe you will need even more; I don't know everything your current system consists of. For example, if you have a lot of hard drives in your rig, they can stress the system at power-on, because drives draw a lot from the 12V rail when spinning up. Some people might be surprised how much that can be. Otherwise, hard drives do not consume much power.

I would for sure get a new PSU as well. I only run one little 300GB drive (I just don't have that much stuff, lol), so I would think a good 750W would be sufficient.

I'm liking Enermax units a lot.

Anyone here have experience with SLI? Is it a pretty solid technology now?