Yakoob on 2/8/2008 at 19:08
There's also CUDA, a new tech by Nvidia (iirc) that lets developers program directly on the GPU and do pretty much whatever they want with it. I bet there's gonna be some GPU-based ray tracers coming up soon.
My view on ray tracing? It's pretty. But it also has its problems. I recommend this read: (http://www.beyond3d.com/content/articles/94/1). I bet a rasterizer/ray tracer combo will be the way of the future.
IndieInIndy on 4/8/2008 at 17:08
Quote Posted by catbarf
So, in theory at least, graphics capability becomes more dependent on the processor than the GPU?
Actually, I was referring to software raytracing, with no GPU (aside from providing a frame buffer). In theory, the number of rays you can cast per frame scales linearly with the number of CPU cores being used. In practice, you're going to run into resource contention between cores, and you're likely to saturate the memory bus trying to keep all of those cores fed with data.
Because it is so easy to divvy up rays across all of the CPU cores, ray tracing ends up being a very popular example of what you can do with massively multi-core CPUs.
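To make that concrete, here's a minimal sketch of what "divvying up rays" looks like (not anyone's actual renderer; trace() is a made-up stand-in for whatever really intersects a ray with the scene). Each core grabs its own set of scanlines, and since no two rays write to the same pixel, there's no locking at all:
Code:
#include <cstdint>
#include <thread>
#include <vector>

// Made-up stand-in for the real work: cast a primary ray through
// pixel (x, y) and return a packed RGB color.
static uint32_t trace(int x, int y)
{
    return (uint32_t)((x ^ y) & 0xFF); // placeholder "scene"
}

// Split scanlines across however many cores the machine has.
// Every ray is independent, so the workers never need a lock.
void render(uint32_t* framebuffer, int width, int height)
{
    int cores = (int)std::thread::hardware_concurrency();
    if (cores < 1) cores = 1;

    std::vector<std::thread> workers;
    for (int c = 0; c < cores; ++c) {
        workers.emplace_back([=] {
            // Interleaving rows keeps the load balanced even when
            // some parts of the image are more expensive to trace.
            for (int y = c; y < height; y += cores)
                for (int x = 0; x < width; ++x)
                    framebuffer[y * width + x] = trace(x, y);
        });
    }
    for (auto& w : workers)
        w.join();
}
Of course, like I said, in practice all those cores hammering the same scene data is exactly where the memory bus starts to choke.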
pakmannen on 5/8/2008 at 09:43
I thought this was already "possible" with today's hardware? I read this article: (http://www.tgdaily.com/content/view/38145/135/) some time ago, which claims the Transformers trailer was being rendered in real time using ray tracing and the latest ATI card.
The_Raven on 5/8/2008 at 14:10
That isn't real-time rendering, though.
catbarf on 5/8/2008 at 18:34
Quote Posted by pakmannen
I thought this was already "possible" with today's hardware? I read this article: (http://www.tgdaily.com/content/view/38145/135/) some time ago, which claims the Transformers trailer was being rendered in real time using ray tracing and the latest ATI card.
It's not actually real-time; it could take a few seconds to draw each frame, but since it's just being recorded there's no need for a high framerate.
The_Raven on 5/8/2008 at 21:30
Did you even bother to read the post before yours?
june gloom on 5/8/2008 at 22:07
My arguments with catbarf in the past have revealed that he in fact does not actually read posts.
pakmannen on 5/8/2008 at 22:35
Ah, maybe I misunderstood something? Here's the quote I'm referring to:
Quote:
(http://uk.youtube.com/watch?v=aT37b2QjkZc) Previous video
In terms of performance, the Radeon 2900XT 1GB rendered Transformers scenes in 20-30 frames per second, in 720p resolution and no Anti-Aliasing. With the Radeon 3870, the test scene jumped to 60 fps, with a drop to 20 fps when the proprietary Anti-Aliasing algorithm was applied. Urbach mentioned that the Radeon 4870 hits the same 60 fps – and stays at that level with Anti-Aliasing.
The_Raven on 6/8/2008 at 00:55
That sounds pretty iffy. Industrial Light and Magic did the effects for Transformers, and I remember hearing that it pushed their equipment pretty hard. If this were true, then why the hell would ILM need their huge render farms? They also don't mention what the computer is actually rendering; if it's just regular video editing and compositing, that would make more sense.
EDIT: I'm now going to attach some excerpts from the Transformers Wikipedia article in order to back up my point.
Quote:
According to Bay, "The visual effects were so complex it took a staggering 38 hours for ILM to render just one frame of movement;"[2] that meant ILM had to increase their processing facilities. ... Photographs were taken of each set. These were used as a reference for the lighting environment, which was reproduced within a computer, so the robots would look like they were convincingly moving there. Bay, who has directed numerous car commercials, understood ray tracing was the key to making the robots look real; the CG models would look realistic based on how much of the environment was reflecting on their bodies.[22]
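For what it's worth, that "environment reflecting on their bodies" bit isn't magic: at its core it's just bouncing a reflection ray off each surface point and looking up what it hits. Here's a minimal sketch of the idea (my own illustration, nothing to do with ILM's actual pipeline; Vec3 and sample_environment are made up for the example):
Code:
// Minimal 3-component vector for the sketch.
struct Vec3 {
    float x, y, z;
};

static Vec3 sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }
static float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Classic mirror reflection: r = d - 2(d.n)n, with d the incoming
// ray direction and n the surface normal (both unit length).
static Vec3 reflect(Vec3 d, Vec3 n)
{
    return sub(d, scale(n, 2.0f * dot(d, n)));
}

// Hypothetical environment lookup: with image-based lighting this
// would sample the photographed set in direction r. Stubbed here.
static Vec3 sample_environment(Vec3 r)
{
    float sky = 0.5f * (r.y + 1.0f); // crude ground-to-sky fade
    return {sky, sky, 1.0f};
}

// A perfectly shiny point on a robot is lit by whatever part of
// the environment its reflection ray points at.
static Vec3 shade_mirror(Vec3 view_dir, Vec3 normal)
{
    return sample_environment(reflect(view_dir, normal));
}
The photographed sets mentioned in the article would stand in for that environment lookup, which is why the robots pick up the lighting of whatever location they were composited into.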