#11
Avatar
Join Date: June 3, 2001
Location: Melbourne, Australia
Age: 65
Posts: 569
I'd love to get a new card, but at this point in time I can't really justify spending $$$ just to play one game, when my current card is good enough to play everything else.
My PC already cost me big $$$ last month when my motherboard failed. I then had to replace the CPU as my old one didn't fit the new socket ....
__________________
Not enough time to care ....
#12
The Magister
Join Date: February 6, 2002
Location: south Texas
Posts: 131
Actually, an FX 5600, like the 5200, the 5500, and the sub-Ultra 5700, is merely a bad joke if anyone hopes to use it for games with intensive graphics. It was bad when new, three years ago, compared to the GF4s; the numbering was shuffled like crazy. The least powerful of the previous Ti series, the 4200, equated roughly to the 5700 Ultra. The 4200 was readily OC-able; the FXs were not. If someone has an FX 5600 ULTRA, that would "almost" match a vanilla Radeon 9600's capability.
The low 3D speeds of the FXs have been well known literally since they were new, and only with gigantic two-slot HSFs and hard-pushed GPUs could nVidia manage to make an FX 5950 Ultra beat a Radeon 9700 Pro, let alone a newer 9800. There do seem to be plenty of happy gamers running the game on GeForce 6600 GTs, and that card is on sale somewhere (Best Buy?) this week at $99 after rebate, which is quite good for a mid-range VGA. Surprisingly enough, some folks are finding they can use their GeForce 6200s in Oblivion, although I'm presuming those gamers all have the early 128-bit card, from before the later, cheaper 64-bit ones took all of the brick-and-mortar shelf space.

When I saw nVidia making claims for the entire FX family versus Oblivion, I KNEW that was hokum, whether or not I prefer their older and newer products to ATI's. Be that as it may, I do have an FX 5900, which I didn't expect much of anything from, but it happens to be in the only PC around here currently running XP. The game didn't work at first, but I also didn't have the special beta drivers, 84.25, at first. That test was run on a PC here in the computer room: an XP machine with an Asus A7N8X motherboard, an OC'd XP 3000 CPU, only 512 MB of Kingston HyperX DDR500, just a Creative SB 5.1 for audio, and that 5900. The game itself dialed everything down pretty far, and I haven't tested at higher resolutions, but indoors it's certainly not sluggish at all.

Compared to nVidia's bloated claims, I noted that ATI let no similar publicists' daydreams and hot air into their suggestions for the lower end of the Radeon line. The least capable VGA they suggested was the 9500 Pro, which actually was faster than the 9600 Pro (and I have forgotten now whether ATI included ANY 9600 at all in their Oblivion list).

There is a spare PC in my bedroom with an OC'd XP-M 2500 running about as fast as the 3000 does, and that room had been my more usual gaming area, but nothing since KOTOR has piqued my interest. The spare system presently has only Win98SE and a Radeon 9600 Pro, so I will be adding Win2000 and swapping in a Radeon 9800 XT. The several Radeon cards I recently picked off by judicious lowball sniping on eBay represent my first look at ATI hardware in several years; the drivers made me crazy in the past.

I only paid about $80, a year and a half ago, for the FX 5900 I have. There is no way I would've paid the MSRP. My GF3s and single GF4 were still good enough not to invest more than pocket change in an FX less potent than the 5700 Ultra (which I never saw selling for close enough to what I thought it was worth). I did help a couple of friends and one relative out with budget-level PC builds that had no better VGA than an FX 5500 (but at a cost of about $45, with shipping and taxes). I warned them they weren't getting much!
__________________
Kiwi *
#13
Elite Waterdeep Guard
Join Date: January 22, 2003
Location: Toronto, Canada
Age: 53
Posts: 36
I hear it, and at least understand why people get upset. Working in the software industry myself, I can say that posting specs that are just plain unusable causes headaches for everyone.
The problem is that the sales team gets their say on what goes onto the box to sell more product. Some companies (I hope Bethsoft's sales force is not like this) actually practice "shelf syndrome": they want a percentage of buyers to just buy the product and shelve it because it does not work, or does not work well. Sad, but it's a truth in the industry. This behaviour happens in other industries as well; you just have to look at the elderly population and how they are taken advantage of.
#14
Hathor
Join Date: April 6, 2001
Location: the desert
Posts: 2,296
i feel so bad for all nvidia card users who are experiencing the problems cited.
i wonder why their game testing didn't reveal any of that?
V***V
__________________
my best friend is a junkie. what does your best friend do?
#15
Dungeon Master
Join Date: June 24, 2002
Location: The Edge of Reality
Posts: 69
I'm running it on:
P4 2.8 w/ Hyperthreading
1 GB Dual Channel DDR400
GeForce FX 5900 128 MB
80 GB SATA 7200 RPM HD

When I first installed the game, it played like ass even on the lowest settings. I bumped the textures up to medium and the resolution up to 800x600, and it was fairly playable after the character creation sequence. The starter dungeon ran okay, as long as there wasn't much going on. I got outside and noticed that the framerate was much better, even in the city with lots of NPCs. I entered a shop and it again took a nosedive. Not sure what the deal is, but it definitely runs better outside than in on my system.

After perusing the forums for a bit, I installed the latest nVidia drivers that were supposed to be optimized for Oblivion. I then installed Coolbits 2.0, set "Max frames to render ahead" to 0, and turned vsync off in Oblivion's settings. With those tweaks I can now turn the resolution up to 1024x768 with large textures and maxed draw distance, grass and shadows off. It runs much better, but the framerate still tanks in some interiors and in certain places outside (the horse stables outside the Imperial City are one). It seems that whenever there are lights involved, I have problems: I can decrease my framerate just by equipping a torch, so I'm not sure what's going on there.

At this point, I doubt I'll mess with any more tweaking, as I ordered a new card and it should be here tomorrow. Hopefully a PNY GeForce 6800 256MB will be enough to run it with no problems.

Glycerine
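For reference, the vsync toggle Glycerine mentions can also be set directly in Oblivion.ini, which is handy if the launcher keeps resetting it. A minimal sketch, assuming the default XP-era file location and that the [Display] section name matches the shipping ini:

```
; My Documents\My Games\Oblivion\Oblivion.ini
; (back the file up first; the launcher regenerates defaults if it goes missing)
[Display]
; 1 = vsync on (default), 0 = vsync off, trading possible tearing for responsiveness
iPresentInterval=0
```

The "Max frames to render ahead" control, by contrast, is a driver setting rather than a game one; on ForceWare drivers of that era it only shows up in the nVidia control panel after the Coolbits registry unlock, so it can't be set from the ini.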