CliffyB says "no GoW2 on PC ever, because of piracy and Dell-esque shitboxes" - by EvaUnit02
Aja on 1/10/2008 at 04:11
Also, I'm not convinced that the pirates are people who wouldn't buy the game anyway. The thing I've noticed is that once these people realize they don't have to pay for games, suddenly those games lose all value, and it's only natural that the pirates wouldn't buy 'em. But if they never had the option to pirate in the first place, the games would certainly seem more valuable, and therefore more desirable.
june gloom on 1/10/2008 at 04:25
First of all I'm not. I'm simply poking holes in the idea that console gaming is somehow free of piracy.
As to your second point, you seem to be basing your stance on people you know personally. I'd hardly consider that an accurate cross section of the average pirate. Secondly, you're assuming that it's possible to stop piracy forever, which it's not. Case in point: Spore.
CaptSyn on 1/10/2008 at 04:27
I don't game nearly as much as I used to, so I assumed onboard video would be fine, as it once was. Besides, my rig was a budget build, so a vidcard was not part of the immediate plan.
Yeah, maybe I could have done a little more research, but I didn't feel the need, as my original intention was to get an 8800GT a few months after my build was complete, but the prices at the time were way too high. I refuse to pay over $100 for any vidcard.
As it is, I'm happy with my cheap 8400GS. I don't need loads of eye candy in my games.
Most games these days seem to rely on nothing more than eye candy, which seriously pisses me off. I've been playing pc games for nearly 30 years and I've never seen such shit games look so good.
Back in the MS-DOS days, games actually had story and substance. Most games don't these days. Now it's all flashy stuff, no story, and usually crap gameplay. Much like Las Vegas.
And many gamers seem to eat it up, like some sort of brainwashed horde.
As for Nvidia and ATi onboard video, I never bothered looking at those options as I prefer Intel chipsets over any other. No particular reason really, it's just what I've always used. Old dog, new tricks sort of thing I guess.
Also, the early Nvidia chipsets were garbage so I haven't bothered looking into them further. I know they've been greatly improved though. Nearly every Nvidia mobo has SLI and no onboard video, and I'm not going to pay for a feature that I won't use (and let's face it, SLI sucks anyway), nor limit myself to a handful of choices.
As for ATi graphics, I prefer Nvidia over ATi any day. I wasn't impressed with my Radeon and ATi's consistent failure to produce quality drivers. It seemed that what they fixed in one Catalyst version, they tended to break again in the next.
Knights of the Old Republic 1 & 2 are perfect examples. There's a map in each with lots of grass. No matter what video settings you use, an ATi card simply couldn't handle it. Severe stuttering, unplayable framerates, even crashes were not just common, but impossible to avoid.
And no, it wasn't just my card. It was all ATi cards, according to the Catalyst changelog.
Maybe Nvidia has had similar issues with particular games, but I'm not aware of them and they've never affected me.
*Where did the post I was responding to run off to? WTF?
Yakoob on 1/10/2008 at 04:30
Quote Posted by CaptSyn
This Cliff guy is a real out of touch moron.
...
From a purely business standpoint, as long as those n00bs buy his crappy games, that's all he should ultimately care about. Code your crap games to run on the most common hardware and problem solved.
As for the game in question, who really cares? GoW sucks ass anyway. So do 3rd person shooters for that matter.
...
My onboard video, Intel GMA3100, exceeds the capabilities of that Radeon 9600 in every way, yet Oblivion will not run on it. Almost nothing will.
It's not because the onboard is weak. It's simply because games are no longer written for the most common hardware. I blame the game companies entirely.
It's funny to see one so completely misguided and delusional calling someone a "moron."
CaptSyn on 1/10/2008 at 04:33
And why am I suddenly a "fuckwit"?
You are entitled to your opinion of course, but so am I. If you don't like it, you can ignore it.
Yakoob on 1/10/2008 at 04:40
I edited my post (before seeing yours) since that was unnecessary. You are still horribly misguided, though.
Also, it's not your "opinion," it's your horribly warped and erroneous perception of how business or computers work.
CaptSyn on 1/10/2008 at 04:48
I may not be a business expert, but I do know that it all comes down to the bottom line, which is profit. Make a product that sells well enough, and you make money. It really is that simple.
If games were coded to run on the most common hardware, which is onboard video (almost entirely Intel based), all those n00bs who buy high-end games and return them when they don't work would actually keep them and buy more. This is good for business and it would expand the customer base bigtime.
FYI, I've been making a good living for many years building and maintaining computers, so I think I know a thing or two about them.
Intel GMA3100 does indeed exceed the capabilities of the Radeon 9600, especially on a rig with a Q6600 and 2 gigs dual channel ram. Do the research and see.
If my old mobo with the Radeon was still working, I'd show you screenshots of the features and benchmark results.
EvaUnit02 on 1/10/2008 at 10:47
Quote Posted by catbarf
A bit OT, but I haven't gotten to try Warhead- is it really significantly more optimized than the original?
Hell yes. Warhead is beautifully optimised.
Koki on 1/10/2008 at 11:03
Quote Posted by EvaUnit02
Hell yes. Warhead is beautifully optimised.
[citation needed]
EvaUnit02 on 1/10/2008 at 13:39
Quote Posted by Koki
[citation needed]
Buy and play the game, you cheap bastard. It's only $30 USD. Also Warhead itself is immensely better than the original's campaign.