Briareos H on 1/10/2009 at 13:07
I was responding to the perceived general nvidia ball-licking of this thread and not to the OP, you big meanie.
Re-reading the thread, though, I only see two posts that made me react, and I admit I haven't read the other ones. Fair.
falco216 on 1/10/2009 at 13:25
The people in this thread seem to forget that the in-game AA works perfectly on ATI cards; the game just disables it when it detects an ATI card.
Volitions Advocate on 1/10/2009 at 13:56
Valve did the same thing to Nvidia users with HL2. When it came out, they disabled the DX9 path for anything lower than a GeForce 6 series card. I had an FX 5950 Ultra at the time, which could do DX9 just fine, and the DX9 option was greyed out because Valve and ATI were in bed with each other back then. You had to hack your install to get it to work.
So I say whatever, this is nothing new.
EvaUnit02 on 1/10/2009 at 14:14
Quote Posted by falco216
The people in this thread seem to forget that the in-game AA works perfectly on ATI cards; the game just disables it when it detects an ATI card.
No shit, we can all read. This post from another forum sums up the argument nicely (a rough sketch of the DX9 limitation it describes follows the quote).
Quote:
I'm not sure if you get it. AA in DX9 doesn't work AT ALL on UT3 -- so there would be no AA at all in BAA (a DX9 game).
I've been a gamer for almost two decades, a PC gamer since the mid-90s, and I have some experience as a modder. I have worked with the UT3 engine and know at least some of the limitations involved here. Look up deferred rendering for the technical background on why AA just won't work in DX9.
nVidia recognized the lack of AA in this game as an issue, so they approached Rocksteady and helped them code in an AA routine that would work on nVidia hardware in DX9. ATI didn't do this, so they're stuck with only the engine's default AA support (i.e. none at all).
I completely agree that standards should be used and that they should be open, but this really is not a case of vendor lockout but rather vendor lock-in. Who do you blame?
Epic, for not making AA possible on DX9 with the engine,
Rocksteady/Eidos, for not delaying the game by possibly weeks or months to write their own engine extensions that would work equally well for both ATI and nVidia, or
ATI, for not actively approaching Rocksteady to help them get this desired feature to work?
All the blame I'm seeing is directed at nVidia, while they are the only party that actually did something for their customers. Seems rather backwards to me.
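For anyone wondering what the deferred rendering point above actually means at the API level, here's a minimal sketch of the D3D9 situation. It's my own illustration, not anything from Rocksteady's or Epic's code, and it assumes a valid device plus width/height values. The short version: the render targets a deferred engine samples in its lighting pass have to be textures, D3D9 textures can't be multisampled, and the only multisampled surfaces you can create have to be resolved (averaged) before their contents are readable, which destroys the per-sample data the lighting pass would need.
Code:
// Minimal D3D9 sketch (assumes a valid IDirect3DDevice9* dev and width/height;
// error checks omitted; not actual UE3 or Rocksteady code).
#include <d3d9.h>

void GBufferVsMsaa(IDirect3DDevice9* dev, UINT width, UINT height)
{
    // A deferred renderer writes normals/depth/albedo into render targets that
    // it later samples in the lighting pass, so they must be textures.
    // CreateTexture has no multisample parameter at all: G-buffer textures are
    // single-sample by definition in D3D9.
    IDirect3DTexture9* gbufNormals = NULL;
    dev->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                       D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &gbufNormals, NULL);

    // Multisampling is only available on plain render-target surfaces...
    IDirect3DSurface9* msaaRT = NULL;
    dev->CreateRenderTarget(width, height, D3DFMT_A8R8G8B8,
                            D3DMULTISAMPLE_4_SAMPLES, 0, FALSE, &msaaRT, NULL);

    // ...and the only way to get their contents somewhere sampleable is
    // StretchRect, which resolves (averages) the samples on the way across.
    // Averaged normals and depth are useless for lighting at geometry edges,
    // so the deferred lighting pass never sees proper MSAA data.
    IDirect3DSurface9* gbufSurface = NULL;
    gbufNormals->GetSurfaceLevel(0, &gbufSurface);
    dev->StretchRect(msaaRT, NULL, gbufSurface, NULL, D3DTEXF_NONE);

    gbufSurface->Release();
    msaaRT->Release();
    gbufNormals->Release();
}
Whatever Nvidia's routine actually does under the hood is their business, but it has to route around this limitation somehow, which is presumably why it isn't something ATI cards simply inherit for free.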
One hardware vendor works with the developer, spending its own money to develop hacks to get AA going in an engine where it's natively impossible. Why should it let its competitors take advantage of that? AMD could've worked with the developer to implement their own hacks for the benefit of their customers, but they didn't. If Nvidia hadn't implemented the algorithms, then the game likely wouldn't have had native AA at all, just like so many other DX9 UT3-engine titles on PC (Gears of War, Mass Effect, Wolverine Origins, Turok, etc.). A big Western publisher spending more time and money than necessary these days on the shrinking PC market? That rarely happens. Be thankful that we got as good a port as we did.
One of the advantages AMD had with their HD3xxx/HD4xxx series was that they supported DX10.1 when Nvidia didn't. DX10.1 offered better performance and more efficient AA implementations than DX10.0. They had this advantage for two hardware generations, but they just sat there twiddling their thumbs. If AMD had put more effort into actively pursuing devrel, we'd probably have seen more DX10.1 games.
(http://game.amd.com/us-en/unlock_directx.aspx) What were the big DX10.1 games? Stalker: Clear Sky, a failed MMO that went F2P, and ports of some mediocre console games that no one cared about.
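To make the DX10.1 point a bit more concrete, here's a hypothetical sketch (mine, not from any shipping game) of the sort of thing a 10.1 device permits that, as I understand it, a 10.0 device refuses: creating an MSAA depth buffer that can also be bound as a shader resource, so an AA or lighting pass can read the individual depth samples directly instead of needing an extra depth pass or a resolve first.
Code:
// Hypothetical D3D10.1 sketch: a 4x MSAA depth buffer that is also readable
// as a shader resource. Names and parameters are my own, not from any game.
#include <d3d10_1.h>

HRESULT CreateReadableMsaaDepth(ID3D10Device1* dev, UINT width, UINT height,
                                ID3D10Texture2D** tex,
                                ID3D10DepthStencilView** dsv,
                                ID3D10ShaderResourceView** srv)
{
    D3D10_TEXTURE2D_DESC td = {};
    td.Width            = width;
    td.Height           = height;
    td.MipLevels        = 1;
    td.ArraySize        = 1;
    td.Format           = DXGI_FORMAT_R24G8_TYPELESS;   // typeless, so both views below are legal
    td.SampleDesc.Count = 4;                            // 4x MSAA
    td.Usage            = D3D10_USAGE_DEFAULT;
    td.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    HRESULT hr = dev->CreateTexture2D(&td, NULL, tex);
    if (FAILED(hr)) return hr;

    // Depth-stencil view for the geometry pass.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = dev->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    // Shader resource view for the AA/lighting pass: the pixel shader can then
    // declare the buffer as Texture2DMS and load each of the four samples by
    // index instead of working from a resolved copy.
    D3D10_SHADER_RESOURCE_VIEW_DESC sd = {};
    sd.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return dev->CreateShaderResourceView(*tex, &sd, srv);
}
That kind of saved pass is reportedly where the Assassin's Creed DX10.1 gains came from (see the techreport link below).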
EDIT: Far Cry 2 had DX10.1 support, was made by Ubisoft Montreal (like Assassin's Creed), and was part of the Nvidia TWIMTBP programme. There goes the Assassin's Creed conspiracy theory (http://techreport.com/discussions.x/14707), not that it ever held any water in the first place.
Nameless Voice on 4/10/2009 at 22:38
Somewhat unrelated, but I find the nVIDIA logo with the spoken "nVIDIA" incredibly annoying, and I delete the file from all my games with a vengeance. (The same generally applies to all short intro/logo videos, but the nVIDIA one is particularly grating.)
mothra on 4/10/2009 at 23:13
It's the first thing I do with any game: seek out and disable the tedious self-advertisement intros.
Matthew on 5/10/2009 at 09:46
I liked the Bloodlines version of the nVidia logo screen. It's the EA one that annoys me.