fett on 27/5/2009 at 22:34
Memory Cards also suck. Make the machine 1/2 inch thicker and quit making me deal with that shit. :mad:
ZylonBane on 27/5/2009 at 22:42
Quote Posted by CCCToad
He means there are three consoles, so playing all the good console games is gonna put you out at least 1000 bucks for hardware.
Three? Please. A real console gamer is juggling a good dozen different systems.
fett on 28/5/2009 at 01:47
Quote Posted by Abysmal
This is very true and something I've often considered. Today it gives consoles the edge on content for the money, but in the fast-approaching digital distribution age I think this will soon even out, with the PC finally getting back in the game with downloadable rentals and streaming services.
Can't argue it now though; consoles win here. (unless you're pirate scum of course)
Which means nothing if the content typically sucks, the controllers handle like a wet rag, and I can't get the titles I want without buying three different machines. Did I mention the graphics also suck? (Yeah, I'm either trolling or trying to get the thread back on topic, not sure which...) :erg:
EvaUnit02 on 28/5/2009 at 06:37
Quote Posted by fett
Memory Cards also suck. Make the machine 1/2 inch thicker and quit making me deal with that shit. :mad:
What rock have you been living under? None of the current gen consoles require memory cards.
PS3 requires an HDD for caching/installations (Blu-ray read speeds are slower than DVD's), the 360 has an optional HDD (a stupid design decision, especially since it wasn't optional with the first Xbox), and the Wii has internal flash memory.
242 on 28/5/2009 at 11:42
Bought myself a red PS2... couldn't resist, it looked too sexy.
Thirith on 28/5/2009 at 12:25
Quote Posted by Koki
Yes. They will also all have five-year-old graphics, since they're running on five-year-old hardware. Very cutting-edge, that.
In practice I'd say this isn't strictly true. At release, many X360 games looked better than many PC games, so the console is ahead at that point. On top of that, the stability of the platform (no constantly changing hardware setups, no escalating hardware and driver wars) means programmers learn to use the hardware better and better. Compare early PS2 games with the ones that came out in the last, say, two years.
PC programmers seem lazier about pushing the hardware in clever, effective ways. Why should they bother, if players can just throw in a new GeForce or ATI card?
Obviously five years after release an originally cutting-edge console will be behind a cutting-edge PC, but it'll still be able to keep up reasonably (and surprisingly) well with a 2-3 year old PC.
Jason Moyer on 28/5/2009 at 12:58
It takes a pretty hefty PC to play something like Bioshock or Mass Effect at 1920×1080.
Volitions Advocate on 28/5/2009 at 13:05
Quote Posted by Jason Moyer
Yes. 10 years ago. With a mouse and keyboard, too!
Complete with inability to download plugins and have them work properly. Also no tabbed browsing or multiple windows.
EvaUnit02 on 28/5/2009 at 13:28
Quote Posted by Jason Moyer
It takes a pretty hefty PC to play something like Bioshock or Mass Effect at 1920×1080.
Also keep in mind that something like 90% of 360 games run at 1280×720.
The supposed "free 4xMSAA" of the 360's Xenos GPU is a big lie. Most of the games that I've played are still jaggy as fuck, 4xMSAA my arse.
Wormrat on 28/5/2009 at 15:40
HD, even 720p, is quite often a lie, too. Many games are upscaled from lower resolutions.
I doubt anyone cares about this, but if you scroll about halfway down this page you can see some of the actual resolutions these games are rendered at:
(http://forum.beyond3d.com/showthread.php?t=46241)
Considering that most console games also run at 30 fps, it's not very expensive to build a computer that does just as well.
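A rough back-of-envelope sketch of that comparison (using the 1280×720 at 30 fps figures mentioned above for the 360, and assuming a 1920×1080 at 60 fps target for the PC, which is my own illustrative choice):

```python
# Pixel-throughput comparison: typical 360 game vs. an assumed PC target.
console_pixels_per_sec = 1280 * 720 * 30    # 720p at 30 fps
pc_pixels_per_sec = 1920 * 1080 * 60        # 1080p at 60 fps

ratio = pc_pixels_per_sec / console_pixels_per_sec
print(ratio)  # 4.5 -- the PC is pushing 4.5x the pixels per second
```

So matching the console's actual output is a much lower bar than "1080p at high framerates" makes it sound.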