SubJeff on 20/7/2012 at 21:22
Check this out.
(http://www.youtube.com/watch?v=gbjW57zlVfc)
I feel that game gfx quality, as in the technical abilities rather than the art or style, has been at a relative plateau since the current console generation arrived. I was playing LA Noire on PC with all the settings turned up and although it looks nice it's not that much of a step up from GTAIV on my PS3.
In fact nothing has really wowed me in terms of technical power for a while now. I think the last thing that looked significantly better than its forerunners was Crysis. Even Crysis 2 doesn't look that much better.
We're just about done with the New Physics now, aren't we? After HL2 everyone seems to be doing it to a lesser or greater extent.
So what next?
I think it's really high quality lighting, textures and depth of field like in that video I linked to. And forget 3D - it's a side issue.
Roll on the next gen of consoles, but I doubt the leap will be as impressive as the last one.
DDL on 20/7/2012 at 21:38
Wow. It did look basically like a De Beers commercial until the physics kicked in, yes.
Probably the fact that the lighting, reflections and faked DOF effects were so good helped make the physics look crappy by comparison, admittedly.
Still though: pretty as that was, unless the next generation of games are very VERY pretty bejewelled clones, I still think the uncanny valley is the next big hurdle to conquer. Subsurface scattering and so on can make skin look great, but if it (and the face it's attached to) doesn't move right, it's still creepy as all hell.
Jason Moyer on 20/7/2012 at 22:26
Quote Posted by Subjective Effect
I feel that game gfx quality, as in the technical abilities rather than the art or style, has been at a relative plateau since the current console generation arrived.
Developers have had basically no incentive to make games that push next-gen hardware. As long as the X360/PS3 are the level of tech that people consider acceptable, you'll get games at that level. PC ports give you higher resolution/better textures (if you're lucky) but it's not like anyone is going to create entirely new models, animations, or physics systems between releasing something on a console and putting it on PC. Perhaps we should expect more out of PC-only software, but typically those sorts of games aren't really pushing the graphical envelope by their nature.
I dunno what it is with people's obsession with depth-of-field effects or trying to replicate the limitations of cameras using shaders and whatever, but the blur in that video is giving me a headache. I hope blur isn't the next-gen equivalent of bloom or lens flare. I don't want games to look like they were shot with a goddamn camera; give me something that looks like what I see when I look out my window. My eyes are more than capable of handling depth of field, peripheral blurring, and adjusting to lighting changes on their own.
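For reference, here's a rough Python sketch of the thin-lens circle-of-confusion formula that these depth-of-field shaders typically approximate; the function name and example numbers are just illustrative, not taken from any particular engine.

```python
def circle_of_confusion(obj_dist, focus_dist, focal_length, aperture_diam):
    """Thin-lens circle-of-confusion diameter (same units as the inputs).

    obj_dist:      distance from lens to the point being shaded
    focus_dist:    distance the lens is focused at
    focal_length:  lens focal length
    aperture_diam: physical aperture diameter (focal_length / f-number)
    """
    return (aperture_diam * focal_length * abs(obj_dist - focus_dist)
            / (obj_dist * (focus_dist - focal_length)))

# Example: a 50 mm lens at f/2 focused at 3 m. A point at 10 m gets a
# much larger blur disc than one at 3.2 m, which a shader would then
# turn into a per-pixel blur radius.
coc_far  = circle_of_confusion(10.0, 3.0, 0.05, 0.05 / 2.0)   # ~0.30 mm
coc_near = circle_of_confusion(3.2, 3.0, 0.05, 0.05 / 2.0)    # ~0.03 mm
print(coc_far, coc_near)
```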
catbarf on 20/7/2012 at 23:51
I think game graphics were going to plateau anyway, regardless of the fact that the current console generation's limitations have been holding back development.
High-detail content is a lot more expensive to make than low-detail content. What would take one person a couple of minutes in Doom (making a single room) now takes several people days of collaborative work to produce that single environment. The more graphical and physics detail that is put in, the more content is required to make it work, and both the cost and the time required start to creep up.
It's a cost/reward relationship at a very basic level. Better graphics means more money invested in the project. If it won't guarantee a significantly greater return, it's not worth it. I imagine we'll see a few games like Crysis every so often that really push the limits and sell at least partly on the basis of how pretty they are, but I wouldn't be surprised if gaming graphics stay roughly where they're at for some time. Any radical improvements will come, I think, from the use of better-looking but more processing-intensive techniques like raytracing that don't necessarily cost more to develop but are currently not feasible technologically.
demagogue on 21/7/2012 at 03:48
I think real-time radiosity is going to push the envelope to its limit. When I look at a room with really good radiosity, it looks like a real room... And if that could be done in real time, with the light moving around and realistic soft shadows, then I think it's seriously diminishing returns from there.
Also maybe voxel geometry where you can just zoom in or out of a scene to infinity and have countless bits of tree branches or rocky crags that all look uniquely shaped without slowing anything down.
People can already model faces that have gotten over the uncanny valley hurdle; it's maybe just a matter of having a game that can push that many polys in realtime. So that's something else.
Sulphur on 21/7/2012 at 05:09
Pretty much everything catbarf said. We've reached a level where adding all that detail costs a hell of a lot of money; and unlike movies, where an animation studio knows exactly how the camera's going to track a shot and thus only has to put all that expenditure into stuff that's in front of the camera, with video games the player's free to poke about almost anywhere in any given environment.
In terms of radical improvements, we're not going to see any. Games have progressed very far since the 90s: Doom's levels looked so abstract because engine limitations forced them to be. We've moved from abstract to blocky cartoon to semi-realistic to quite damn good in the last two decades, and the 'quite damn good' has lasted us the last 8 years or so.
With each leap, the quality gap between what artists wanted to display and what they ended up with due to engine limitations has been closing, and I think it's safe to say now that, barring the usual poly/texture limitations of the generation, the end result is close enough to how the model was originally conceived that there's not much more work to be done except for throwing more computing power at them to lift the limits.
The easiest way to see this is to compare: back in the 90s we had primitive polygon marionettes dancing in awkward rhythm to some keyframer's odd idea of motion; now we have fully mo-capped/keyframed characters with skeletons and realtime IK and physics, articulated down to the tips of their fingers. We have fantastic lip-sync (all Source engine games). The things left to do now are getting the details right, like realtime sub-surface scattering for skin so it doesn't look waxy or made of cardboard; I think Metro 2033 tried this but wasn't terribly successful, because sub-surface scattering needs lots of clock cycles thrown its way.
So to sum up: you're going to see gradual advances from here on out. This isn't as good as it gets, but the years where you could see dramatic jumps in graphics quality are gone, simply because the quality we have right now? It's pretty damn good.
As a side-ramble based on demagogue's post: Radiosity is another feature that's absolutely ruinous on CPU time. dema's right in that it's one of the things that adds realism to an environment, because it models light in far more accurate fashion than just slapping some lights in a scene and fiddling around with the ambient light value to make it look realistic, and the results are generally more pleasing to the eye. But an accurate simulation of radiosity is a very expensive thing to add, simply because it gobbles up all the processing power you can throw at it. That's the reason why it's always been baked into environments in games (Max Payne 1/2, Mirror's Edge).
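To make that cost concrete, the classic patch radiosity formulation boils down to B_i = E_i + rho_i * sum_j F_ij * B_j, iterated until the light bounces settle. Here's a toy Python sketch, assuming the form factors have already been computed — and precomputing them is exactly the O(n^2) geometry/visibility step that gobbles up all the processing power.

```python
import numpy as np

def solve_radiosity(emission, reflectance, form_factors, iterations=50):
    """Jacobi-style gathering: B_i = E_i + rho_i * sum_j F_ij * B_j.

    emission:      (n,) emitted radiosity per patch (the light sources)
    reflectance:   (n,) diffuse reflectance per patch, each < 1
    form_factors:  (n, n) F[i, j] = fraction of energy leaving patch i
                   that arrives at patch j (precomputed; this is the
                   expensive visibility/geometry step)
    """
    radiosity = emission.copy()
    for _ in range(iterations):
        gathered = form_factors @ radiosity          # light arriving at each patch
        radiosity = emission + reflectance * gathered
    return radiosity
```

Baking, as in Max Payne or Mirror's Edge, just means running something like this offline and storing the result in lightmaps instead of iterating it every frame.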
That recently released UE4 demo, though, seems to be doing global illumination of some sort in real-time - not in terribly accurate fashion, but in a relatively pleasing approximation of it. That's the smart way of doing it: games have always used clever tricks to approximate what an offline renderer would do, viz. using fake reflection maps instead of calculating reflections from environment geometry. So you never know, real-time radiosity might not actually be that far off.
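As a rough idea of how those fake reflection maps work: mirror the view direction about the surface normal and use the result to index a set of pre-rendered cube faces. This is only an illustrative Python sketch; the face/UV sign conventions here are one arbitrary choice, not any particular API's.

```python
import numpy as np

def reflect(incident, normal):
    """Mirror the view direction about the surface normal: R = I - 2(N.I)N."""
    return incident - 2.0 * np.dot(normal, incident) * normal

def sample_cubemap(faces, direction):
    """Pick a cubemap face and texel from a direction vector.

    faces: dict mapping '+x','-x','+y','-y','+z','-z' to HxWx3 arrays --
           pre-rendered images of the surroundings, i.e. the 'fake' part.
    Sign conventions for u/v vary between APIs; these are illustrative.
    """
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:                  # major axis picks the face
        face, u, v, m = ('+x' if x > 0 else '-x'), (-z if x > 0 else z), -y, ax
    elif ay >= az:
        face, u, v, m = ('+y' if y > 0 else '-y'), x, (z if y > 0 else -z), ay
    else:
        face, u, v, m = ('+z' if z > 0 else '-z'), (x if z > 0 else -x), -y, az
    img = faces[face]
    h, w = img.shape[:2]
    # remap [-1, 1] face coordinates to pixel indices
    px = min(int((u / m * 0.5 + 0.5) * (w - 1)), w - 1)
    py = min(int((v / m * 0.5 + 0.5) * (h - 1)), h - 1)
    return img[py, px]

# Usage with dummy faces: a view ray hitting an upward-facing floor.
faces = {f: np.zeros((4, 4, 3)) for f in ('+x', '-x', '+y', '-y', '+z', '-z')}
view = np.array([0.0, -1.0, 1.0]) / np.sqrt(2.0)
floor_normal = np.array([0.0, 1.0, 0.0])
colour = sample_cubemap(faces, reflect(view, floor_normal))
```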
zajazd on 21/7/2012 at 07:09
I liked how AAA games looked up until the mid-2000s more; nowadays they have too much detail that only serves as expensive decoration most people don't even notice or appreciate. E.g. right now I'm playing Max Payne 3 and I read these reviews saying how great its graphics are, but to me Max Payne 2 was prettier; it was more... streamlined, and added to the drive of the game.
Melan on 21/7/2012 at 08:37
Quote Posted by Sulphur
Pretty much everything catbarf said.
And what Sulphur said.
At this stage, AAA games are incredibly capital- and labour-intensive affairs. To me, the saddest consequence of all this has been a decline in amateur mapping. People still make levels in their free time, but I am noticing on places like Mapcore (http://www.mapcore.org/) that an increasing number of them are people trying to break into the industry and polishing their resumes.
Maybe some day there will be a way to drive down development costs and complexity by automating some segments of the workflow, or through relatively cheap asset databases, although those can only be a partial solution.
Ulukai on 21/7/2012 at 08:46
I certainly noticed a marked increase in the time it took to produce levels for Unreal Tournament 2004 over the original. I think I made over a dozen for the original UT; only two for 2003/2004. And for that reason, it was considerably more like work and less like fun.
june gloom on 21/7/2012 at 09:26
Can we at least fix the thread title? That period should be a colon, and that typo at least should be fixed. Just looking at it is driving me mad.