WingedKagouti on 11/8/2008 at 16:24
Quote Posted by ZylonBane
That's right, people certainly don't replace their video cards every 3-4 years, now do they?
That's right, a large majority of people don't. They are, however, not the primary target audience for this kind of game.
Volca on 19/8/2008 at 06:30
I find this page ( http://s08.idav.ucdavis.edu/ ) quite interesting, particularly the talk about Larrabee programming. It seems that Intel's plan is to offload even more of the CPU's processing onto the GPU (if it can still be called that). Ray tracing is likely not the best example use of that new technology.
What caught my attention is the mention of visibility culling, scene graph updates, and traversal all being done on the Larrabee as a pre-step to rendering.
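To make that concrete, here is roughly what such a culling pre-step looks like; everything below (the types, the names, the slicing of work across cores) is just my own illustration, not anything from the Larrabee material:

// Hypothetical sketch of a visibility-culling pre-pass run across many
// cores before rendering. All types and names are invented for
// illustration; this is not actual Larrabee code.
#include <vector>
#include <cstddef>

struct Plane  { float nx, ny, nz, d; };       // inward-facing frustum plane
struct Sphere { float cx, cy, cz, radius; };  // bounding sphere of a scene node

// A sphere is visible unless it lies fully behind some frustum plane.
bool visible(const Sphere& s, const Plane (&frustum)[6]) {
    for (const Plane& p : frustum) {
        float dist = p.nx * s.cx + p.ny * s.cy + p.nz * s.cz + p.d;
        if (dist < -s.radius)
            return false;   // fully outside this plane -> culled
    }
    return true;
}

// Each core takes a slice of the scene's bounding volumes and emits the
// indices of the visible ones; the render pass then only touches those.
void cullSlice(const std::vector<Sphere>& bounds,
               const Plane (&frustum)[6],
               std::size_t begin, std::size_t end,
               std::vector<std::size_t>& visibleOut) {
    for (std::size_t i = begin; i < end; ++i)
        if (visible(bounds[i], frustum))
            visibleOut.push_back(i);
}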
addink on 19/8/2008 at 08:31
If the Larrabee doesn't outperform the conventional cards, Intel is gonna have a hell of a time marketing the thing. And given their attention to preprocessing the scene before actually rendering it across all cores, I assume it doesn't, or at least not by much.
I do really like the concept of it, though. I much preferred optimizing for the classic in-order pipeline over the near-impossible-to-schedule out-of-order one, but I assume the only reason to use the Pentium 1 based cores is the simpler pipeline, and that once Larrabee takes off, out-of-order processing will be reintroduced. Which, to be honest, saves a lot of work; given that AMD (ATI) and NVidia won't sit idly by and wait for Intel to come up with a standard, it would be a nightmare to optimize with 3 types of cores in mind.
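For what I mean by scheduling for an in-order pipeline, a toy example (entirely my own, nothing Larrabee-specific): an in-order core stalls on dependent chains, so you break them up by hand, which is exactly the Pentium-era work an out-of-order core does for you.

// The first loop serializes on one accumulator, so each add waits for
// the previous one; the second keeps four independent chains in flight
// so an in-order pipeline can stay busy.
float sum_naive(const float* a, int n) {
    float s = 0.0f;
    for (int i = 0; i < n; ++i)
        s += a[i];                      // every add depends on the last
    return s;
}

float sum_scheduled(const float* a, int n) {
    float s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    int i = 0;
    for (; i + 4 <= n; i += 4) {        // four independent dependency chains
        s0 += a[i];
        s1 += a[i + 1];
        s2 += a[i + 2];
        s3 += a[i + 3];
    }
    for (; i < n; ++i) s0 += a[i];      // remainder
    return (s0 + s1) + (s2 + s3);
}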
Volca on 19/8/2008 at 08:53
Performance is the interesting question here. I suppose doing OpenGL/Direct3D in software on such a thing could be a nightmare to optimize, but I see them taking a big chunk of time to do it right. I read somewhere that memory chip manufacturers have already received Larrabee prototypes, and it's more than a year until release, so maybe Intel will have enough time to polish the product.
What I see here is a good opportunity to revise all the current GPU limitations, although I agree that, for example, deferring some of the scene graph processing to the GPU would mean some hard work for engine writers to account for both possibilities (but it can be done, as the sketch below suggests).
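By "account for both possibilities" I mean something along these lines; all class and function names are made up for the sake of the sketch:

// Hide *where* scene graph processing happens behind an interface and
// pick the implementation at startup. Hypothetical names throughout.
#include <memory>
#include <vector>
#include <cstdio>

struct DrawList { std::vector<int> nodeIds; };    // stand-in for real draw data

class SceneProcessor {                            // the seam between the two paths
public:
    virtual ~SceneProcessor() = default;
    virtual DrawList cullAndSort() = 0;
};

class CpuSceneProcessor : public SceneProcessor { // classic host-side traversal
public:
    DrawList cullAndSort() override {
        std::puts("traversing scene graph on the CPU");
        return {};
    }
};

class DeviceSceneProcessor : public SceneProcessor { // traversal pushed to the chip
public:
    DrawList cullAndSort() override {
        std::puts("submitting traversal job to the accelerator");
        return {};
    }
};

std::unique_ptr<SceneProcessor> makeProcessor(bool deviceCanTraverse) {
    if (deviceCanTraverse)
        return std::make_unique<DeviceSceneProcessor>();
    return std::make_unique<CpuSceneProcessor>();
}

The rest of the engine only ever talks to SceneProcessor, so the same code runs whether the hardware can take over traversal or not.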
addink on 19/8/2008 at 09:33
Key for all manufacturers is having a solid, backward-compatible API and/or driver set. They can (and should) develop that themselves. In the case of Intel's Larrabee, they should incorporate their custom scene render pipeline in such a way that it's transparent to developers, i.e. 'simply' make DirectX and OpenGL compatible drivers.
However, neither DirectX nor OpenGL is set up to be a generic parallel processor interface. All the real benefits would still remain hidden behind their APIs. So without a specific standard for parallel processing, we're back to the days before DirectX: want to use the cool features? Write hardware-specific code.
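And "hardware-specific code" in practice usually means detect-and-dispatch, something like this (the vendor ID and function names are invented for illustration):

// Detect the part at startup and dispatch to a tuned path, falling
// back to a generic one. Purely a sketch of the pattern.
#include <cstdio>

void blitGeneric(const void*, void*, int bytes) {
    std::printf("generic copy path, %d bytes\n", bytes);
}

void blitVendorFast(const void*, void*, int bytes) {
    std::printf("vendor-tuned path, %d bytes\n", bytes);
}

using BlitFn = void (*)(const void*, void*, int);

BlitFn pickBlit(unsigned vendorId) {
    const unsigned kHypotheticalVendor = 0x1234;   // placeholder ID
    return (vendorId == kHypotheticalVendor) ? blitVendorFast : blitGeneric;
}

Every new chip means another branch in pickBlit, which is exactly the maintenance burden a common API is supposed to make go away.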
And developers will only write hardware-specific code if the user base is big enough, or if adding it is sponsored.
What could help is if the hardware were included in the next version of a console, like the Cell processor in the PS3.
If the Larrabee could be part of the next Xbox or something, the benefits of the processor, combined with specifically developed render/physics/AI engines, could streamline the development of a standard API and, in turn, its introduction into other platforms like the PC.