Why DirectX 11 will save the video card industry and why you don't care - by clearing
Koki on 23/7/2009 at 15:47
Quote Posted by Renzatic
See, I look at games like Crysis and RE5 and think we could stay where we're at for a good while and be pretty well satisfied.
I actually said that myself back when I played FFX...
Renzatic on 23/7/2009 at 16:22
I started thinking what I'm saying now back in the PS2 days.
Here, I'll give you all an example of what I'm talking about. Take (http://users.chartertn.net/greymatt/RE1.jpg) Resident Evil 1. It's kinda blocky looking: prerendered backgrounds, iffy textures. Compare it to what came out a generation later: (http://users.chartertn.net/greymatt/RE4.jpg) Resident Evil 4. That's a huge amount of difference right there. It's still low res and grainy, but it's full 3D and the characters actually look like characters. Now compare it to (http://users.chartertn.net/greymatt/RE5.jpg) Resident Evil 5. High res, characters are even better defined, and the textures go so far as to show skin imperfections. There's a goodly bit of improvement from 4 to 5, but it's nowhere near as huge as what we saw from RE1 to 4. From here on out, the jumps in quality are going to be less and less noticeable. In another 10 years, we'll probably be (http://users.chartertn.net/greymatt/example_thing.jpg) here, with true realtime multibounce radiosity and all that good stuff. Once again, a big jump from RE5, but in terms of image quality it's an even smaller leap than the one from RE4 to 5.
Of course I'm only talking about image quality here, not scene complexity or any of the other advantages we'll be seeing in the near future. But as is, I don't see a reason to continue upgrading constantly like we have in the past when what we already have gives us more room than we've ever had to play around with. We should slow down, tweak what we got, then upgrade when the truly next big thing hits.
Malleus on 23/7/2009 at 16:49
Quote Posted by Renzatic
Yeah, animation is the one area where we really could see a huge amount of improvement. I don't think there are any hardware limitations preventing us from doing better, though. Just that developers are lazy when it comes to this one area of design.
Agreed, animations could use improvement, but they have very little to do with graphics processing power / image quality.
Iroquois on 23/7/2009 at 16:51
I actually believe that gamers' reliance on graphics is declining at the moment; it had reached its peak back in the Playstation and PS2 eras, when everything had to be 3D and look as sharp and close to 'realism' as possible.
What we're experiencing now feels more like a fad. How many fratboys who buy every single "pretty" FPS on their Xbox are going to stick around, and for how long? The industry sadly rushes to indulge them instead of the traditional gamers, and that might come back to bite it in the ass, but those who've been at this for some time now realize that maybe sharp graphics aren't all that matters and the so-called "realistic" presentation is bs.
Granted, that's a somewhat hesitant assessment on my part, purely because it's money that will decide the direction of the industry. If it wasn't as successful as it is now, it wouldn't be as bloated and misdirected. But all these things are known to die down, eventually. Already multiplatform games are generally a disappointment for PC users and various developers have expressed concern over this "new generation" of hardware and graphics.
I admit I only skimmed the article, but here's the thing: remember those Flight Simulator pictures a few years ago that compared DX9.0c and DX10? They looked great, sure. But did anybody see those promises come to fruition? Did anybody give a shit then? At the end of the day, Halo 2 PC is still going to be blown away by the original Half-Life in all its OpenGL/DX6 glory, because the games that have something to offer are the ones that hold up when fads collapse. And I can't help but feel this entire period we're experiencing is just a fad, because the endless copycat titles that just look pretty will only lead to a rut. And then the smaller, poorer, smarter developers will come back out of their holes and deliver what matters.
Or maybe I'm just a bit too optimistic. Whatever the case, there is truly no reason for me to give a rat's ass about DX11.
Silkworm on 23/7/2009 at 17:24
Quote Posted by Malleus
Agreed, animations could use improvement, but they have very little to do with graphics processing power / image quality.
lolwut? How is animation not classified under graphics? Video cards handle the vast majority of its processing, and it's what gamers see on screen. Seriously, one of the silliest things I've read on here.
Quote Posted by Renzatic
See, I look at games like Crysis and RE5 and think we could stay where we're at for a good while and be pretty well satisfied. After all, we're already at the point where the difference between a game with good graphics and a game with bad has more to do with its artistic direction than how many shaders and polygons are being thrown around.
So what? Just because gamers are satisfied with what we have now does not mean we've reached a plateau in technical ability. (And it's arguable that in this age of "realism" and procedural generation, artistic direction matters more than it did before, to say the least.) But I don't think you're right about that, either. This sounds to me like a sort of market-based fallacy - i.e. high-end PCs are not being made or sold, so the gaming public must not want them. Like I implied before, if the economy was doing better, we wouldn't be having this discussion.
Traditionally, home computer gaming has always gone hand in hand with technological improvement - not just the strength of the market for games, but the innovation and quality within it. People hoped that stagnation in graphics development would force developers to work on gameplay - well, that's what we're seeing. Be careful what you wish for.
steo on 23/7/2009 at 17:32
Problem is, gamers generally expect new games to have graphics comparable to other new games. If a game comes out with shitty graphics, it gets knocked in reviews. So developers need to invest considerable time and money making sure their game has good graphics, then convince investors they can make that money back, which means appealing to as large a market as possible - and that's why they make all these mass-market casual games that we all despise so very very much.
Now, if graphics and hardware advancements were to stagnate (and I don't believe they will), the need to continually modify engines and make a new one every few years disappears, development costs come down, games don't need to sell millions of copies to be profitable and they can make more of the games we all love.
But I don't think it'll ever happen...
Silkworm, what Malleus means is that animations in games are currently limited by the programmers' ability to create realistic and lifelike movement, rather than a lack of power on the graphics side of things.
WingedKagouti on 23/7/2009 at 18:17
Quote Posted by steo
Silkworm, what Malleus means is that animations in games are currently limited by the art department's ability to create realistic and lifelike movement, rather than a lack of power on the graphics side of things.
Fixed that for you.
Programmers rarely have much, if anything at all, to do with animation quality.
Malleus on 23/7/2009 at 19:12
Yeah, steo, that was what I meant, but still...
Quote Posted by Silkworm
lolwut? How is animation not classified under graphics? Video cards handle the vast majority of its processing, and it involves what gamers see, on screen.
Are we talking about the same thing? I mean character animations/movement. Aren't those either pre-recorded sets of motions, in which case the game just plays them back, or some dynamic movement system, in which case the CPU handles the calculations? Either way, I'm pretty sure video processing power is not what's limiting the creation/usage of realistic animation. And that's what I meant by "little to do with graphics processing power".
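For what it's worth, the "game just plays them back" case really is trivial CPU work - a minimal sketch of keyframe playback, where every name and number is invented for illustration:

```python
# Minimal sketch of pre-recorded animation playback: the CPU just
# interpolates between stored keyframes each frame, and the GPU only
# renders the resulting pose. All names here are illustrative.

def sample_animation(keyframes, t):
    """Linearly interpolate a joint angle from (time, angle) keyframes."""
    # Clamp to the clip's time range
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    # Find the surrounding keyframe pair and blend between them
    for (t0, a0), (t1, a1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)
            return a0 + u * (a1 - a0)

# A tiny "elbow bend" clip: straight at t=0, bent 90 degrees at t=1
clip = [(0.0, 0.0), (1.0, 90.0)]
print(sample_animation(clip, 0.5))  # → 45.0, halfway through the blend
```

A per-frame lerp like this is a rounding error next to what the video card spends on rendering, which is the point: the bottleneck is authoring good clips, not playing them.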
Volitions Advocate on 23/7/2009 at 19:29
Quote Posted by Malleus
Agreed, animations could use improvement, but they have very little to do with graphics processing power / image quality.
Quote Posted by Silkworm
lolwut? How is animation not classified under graphics? Video cards handle the vast majority of its processing, and it involves what gamers see, on screen. Seriously, one of the silliest things I've read on here.
Quote Posted by steo
Silkworm, what Malleus means is that animations in games are currently limited by the programmers' ability to create realistic and lifelike movement, rather than a lack of power on the graphics side of things.
However you put it, Malleus is actually right. A few weeks ago there was a discussion about that zombie UT2k4 mod and how people were complaining that the animations looked crappy. It's difficult to get a walking animation to work in 3D games because entities/actors/whatever do not walk, they glide along the surface. So yes, it's a big thing for the art team and the animation team to "get it right", but they're still making the character appear to do something it really isn't.
People harp on Carmack a lot, but when I think of him I think of the genius behind what his work really is. He's trying to find new ways to make things work, not ways to make them look better.
Rather than try to make hardware so awesome that everything looks real, we could do the same thing on current hardware by finding a different way to do it. What if, instead of animating something to walk, we taught it to walk? Rather than gliding along a surface, give its joints proper boundaries and write an AI smart enough to figure out how to use its own limbs to really walk - each foot in front of the other, rather than animating it doing so while the entire thing just glides. None of that has much to do with current limits of processing power; it's a way of thinking.
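That "teach it to walk" idea is roughly what physics-based character control does: drive each joint toward a target with a controller and clamp it to anatomical limits, so the pose comes out of simulation instead of a canned clip. A toy, single-joint sketch (gains, limits, and names all made up):

```python
import math

# Hypothetical sketch of simulated rather than pre-baked movement:
# one joint driven toward a target angle by a PD controller (spring
# plus damper) and clamped to its anatomical limits each timestep.

def pd_step(angle, velocity, target, kp=40.0, kd=8.0, dt=0.01,
            lo=-math.pi / 2, hi=math.pi / 2):
    """Advance one joint one timestep toward its target angle."""
    torque = kp * (target - angle) - kd * velocity  # spring toward target, damp motion
    velocity += torque * dt
    angle += velocity * dt
    angle = max(lo, min(hi, angle))                 # respect the joint's boundaries
    return angle, velocity

# Drive a knee from straight (0 rad) toward a bent stance (0.8 rad)
angle, vel = 0.0, 0.0
for _ in range(1000):
    angle, vel = pd_step(angle, vel, target=0.8)
print(round(angle, 2))  # settles near the target
```

A real walker is many such joints plus balance logic, but the principle is the same - and it's cheap arithmetic, so the limit is the control design, not the hardware.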
I also agree very much with what the author of the article said about the Wii. There are a few PC games coming out on the market that were built on older tech. Look at Killing Floor: it has a modest following that would probably be a lot bigger were it not for L4D, and it's just UE2, doesn't really look super great. And people are still doing things with GoldSrc and even id Tech 1 that are huge leaps from what they originally were, but that are fun and playable today. Look at Dead Space: that game is not perfect graphically, but it does look damn good on a dated engine, and it loads up faster than any game on the Wii.
If PopCap is taking all the business away from the indie developers making leisure games, then with all the open source content out on the net, the indie guys are going to start taking over the serious projects from the big guys like Epic, id, and Crytek, with smaller budgets and shorter development times. And they'll do it with older engines. I'd buy a new Thief-clone or Deus Ex-clone type of game based on older tech over the new UE3 big-industry buzz-circlejerk, even if it cost more. I paid full price for Dead Space, and I swear I could've gotten it working on my Pentium III with appreciable framerates. And I ENJOYED it.
which reminds me... I have to go see if I can pick up a cheap 2nd hand copy of MGS4 to play, I'll proofread this and fix all my idiotic errors later.
nicked on 23/7/2009 at 19:48
Quote Posted by steo
But I don't think it'll ever happen...
Ever is a long time. Sooner or later we could reach the stage where game engines are created the same way the real world is. Designers will build molecules with material properties and physics programming, and based on the element a molecule represents, different hardnesses, textures, and behaviours of materials will be mass-producible. So a brick wall would be created by programming a fired-stone molecule, defining its shape, and shaping a brick out of those molecules. The engine will handle how those molecules interact with their environment, and all the artist has to do is define a volume and density of them. We'll have engines where the player could use an electron microscope anywhere within the game world and see the building blocks of reality, then take a hammer to that wall and watch the bricks powder as they would in real life, collapsing in random, unpredictable ways and filling the air with individually modelled dust particles.
When that happens, we'll have reached the stage of an industry-standard engine, because there will be no way to improve upon the possibilities offered. Designing will be made easier, because once you've constructed the template molecular building blocks, walls, objects, and people are easy to procedurally generate from 3D shapes not much more complex than the ones we're seeing now.
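Purely as a sketch of how that material-template idea might look in code - one material definition stamped into a wall, with an impact "powdering" cells until its energy is spent. Every name and number here is invented:

```python
from dataclasses import dataclass

# Speculative sketch: a material template with physical properties,
# a wall built as a grid of cells sharing that template, and a blow
# that destroys cells radiating out from the impact point.

@dataclass
class Material:
    name: str
    hardness: float  # impact energy one cell absorbs before failing
    density: float

FIRED_BRICK = Material("fired brick", hardness=5.0, density=1.9)

def build_wall(width, height, material):
    """A wall is just a grid of cells sharing one material template."""
    return [[material for _ in range(width)] for _ in range(height)]

def hammer_blow(wall, row, col, energy):
    """Powder cells in expanding rings around the impact until energy runs out."""
    destroyed = []
    r = 0
    while energy > 0 and r < len(wall):
        # Cells on the square ring at Chebyshev distance r from the impact
        ring = [(row + dr, col + dc) for dr in range(-r, r + 1)
                for dc in range(-r, r + 1) if max(abs(dr), abs(dc)) == r]
        for (i, j) in ring:
            if 0 <= i < len(wall) and 0 <= j < len(wall[0]) and wall[i][j]:
                energy -= wall[i][j].hardness
                wall[i][j] = None          # this cell powders
                destroyed.append((i, j))
                if energy <= 0:
                    break
        r += 1
    return destroyed

wall = build_wall(10, 10, FIRED_BRICK)
hole = hammer_blow(wall, 5, 5, energy=30.0)
print(len(hole))  # → 6 cells powdered (30 energy / 5 hardness)
```

The appeal of the idea is visible even at toy scale: destruction falls out of the material definition, with no artist hand-placing a damage state.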
At that stage, the improvements can only go outwards, not up, and AI behaviour, gameplay, and new ways for players to control and interact with the world become the challenges. And that's just a few steps away from the Matrix... :p
However, what's more likely is that the game industry will collapse under the weight of artists' expectations long before then. Games will continue to cost more to create until, eventually, massively expensive games are no longer financially viable, and there won't be any graphically astonishing games because no-one can afford to make them. Reviewers will no longer mark games down for poor graphics, because very few games will be able to reach higher than their competition and still stay in budget. Even if they did, no-one would notice, because there's only so much that's visible to the casual observer.
We'll have reached a plateau, not of possibility, but of commercial viability. Games will become cheaper to make as designers become experienced with tools that improve far less frequently, and risks in story and gameplay will become acceptable again.
Either way, I believe the games industry will change for the better at some point in the future.