The [[Official]] ATI vs NVIDIA thread

3DS Max uses DirectX over OpenGL by default, so it can't be that bad.
Microsoft threatened to cripple OpenGL in Vista with a "legacy" mode. That caused a lot of backlash from CAD developers and prompted others to switch. Microsoft reversed the decision at the last minute, after losing Vista support from many of those developers.

Before DX11, DirectX didn't even support multithreaded rendering. It's a total joke. Can you imagine rendering Wall-E on a single thread?
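
For anyone wondering what that multithreading actually means: DX11 added deferred contexts, so worker threads can record command lists that the main thread replays on the immediate context. Rough sketch below (Windows only, link against d3d11.lib, error handling mostly stripped - an illustration, not production code):

```cpp
// Sketch only: DX11 deferred contexts. A worker thread records commands
// into a deferred context; the immediate context replays the command list.
// Before DX11, the device context was effectively single-threaded.
#include <windows.h>
#include <d3d11.h>
#include <thread>

#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device*        device       = nullptr;
    ID3D11DeviceContext* immediateCtx = nullptr;
    D3D_FEATURE_LEVEL    level;

    // Create a device without a swap chain -- enough to demonstrate contexts.
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, &level, &immediateCtx)))
        return 1;

    ID3D11DeviceContext* deferredCtx = nullptr;
    device->CreateDeferredContext(0, &deferredCtx);

    ID3D11CommandList* commandList = nullptr;

    // Worker thread: record commands into the deferred context.
    std::thread worker([&] {
        deferredCtx->ClearState();                       // stand-in for real draw calls
        deferredCtx->FinishCommandList(FALSE, &commandList);
    });
    worker.join();

    // Main thread: replay the recorded commands on the immediate context.
    if (commandList) {
        immediateCtx->ExecuteCommandList(commandList, FALSE);
        commandList->Release();
    }

    deferredCtx->Release();
    immediateCtx->Release();
    device->Release();
    return 0;
}
```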

Even the films listed on Wikipedia as having used 3ds Max were rendered in OpenGL (2012, I, Robot, etc.).
 

You're confusing the 3D viewport with "rendering".

The actual rendering, as in the final image produced, isn't OpenGL or DirectX.

Scenes are rendered by a separate render engine that doesn't use DirectX or OpenGL at all.

In fact, those scenes are rendered on CPUs, so how many threads DirectX supports is irrelevant, as it isn't part of the rendering chain.

For example, I can make a model in 3DS Max and use the DirectX driver for the 3D viewport, but when it comes to rendering, I can use all 4 of my CPU cores, or I can network render and use 8 cores in total if I add my fileserver into it.
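
To make that concrete, here's roughly what "throw more cores at it" looks like - a toy sketch, not lifted from any actual renderer, and shadePixel is just a made-up stand-in for the real work:

```cpp
// Toy "final render": split the image's scanlines across however many CPU
// cores are available. No DirectX or OpenGL anywhere - it's all plain CPU
// work, which is also why it spreads across networked machines so easily.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

static float shadePixel(int x, int y)   // hypothetical per-pixel work
{
    return 0.5f + 0.5f * std::sin(x * 0.05f) * std::cos(y * 0.05f);
}

int main()
{
    const int width = 640, height = 480;
    std::vector<float> image(width * height);

    const int cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> workers;

    for (int t = 0; t < cores; ++t)
        workers.emplace_back([&, t] {
            // Each worker takes every cores-th scanline (interleaved split).
            for (int y = t; y < height; y += cores)
                for (int x = 0; x < width; ++x)
                    image[y * width + x] = shadePixel(x, y);
        });

    for (auto& w : workers) w.join();
    std::printf("rendered %dx%d on %d threads\n", width, height, cores);
    return 0;
}
```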

As for I, Robot and 2012, their wiki pages mention nothing about 3DS Max, never mind OpenGL. :confused:

In fact, the 3DS Max wiki page doesn't even mention OpenGL; it does, however, list the most common render engine plugins used with it.
 
3DS Max is often used in movie production - but not for the final rendering.

As above, this is usually done on a render farm consisting of hundreds of CPUs - though it is something that could benefit from CUDA.

While it doesn't use OpenGL or DX, I think the software rendering engines are often loosely based on OpenGL - but you have all kinds of things you won't see in normal 3D rendering: very advanced lighting (raytracing, etc.), high-definition materials, and so on.
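
To give a flavour of that: the heart of a raytracer is just geometry tests plus physically-motivated shading, and none of it touches a graphics API. A bare-bones, completely made-up example (one sphere, one light):

```cpp
// Bare-bones raytracing ingredient: intersect a ray with a sphere, then do
// Lambertian shading from a point light. Real renderers add bounces, BRDFs,
// textures, acceleration structures, etc. - but none of it needs a GPU API.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static Vec    sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec    norm(Vec v)       { double l = std::sqrt(dot(v, v)); return {v.x / l, v.y / l, v.z / l}; }

// Returns the distance along the ray to the sphere, or -1 if it misses.
// Assumes dir is unit length, so the quadratic's 'a' term is 1.
static double hitSphere(Vec orig, Vec dir, Vec centre, double radius)
{
    Vec    oc   = sub(orig, centre);
    double b    = 2.0 * dot(oc, dir);
    double c    = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * c;
    return disc < 0 ? -1.0 : (-b - std::sqrt(disc)) / 2.0;
}

int main()
{
    Vec camera{0, 0, 0}, sphere{0, 0, -5}, light{5, 5, 0};
    Vec dir = norm({0, 0, -1});                      // one primary ray, straight ahead

    double t = hitSphere(camera, dir, sphere, 1.0);
    if (t > 0) {
        Vec hit{camera.x + dir.x * t, camera.y + dir.y * t, camera.z + dir.z * t};
        Vec n = norm(sub(hit, sphere));
        Vec l = norm(sub(light, hit));
        double diffuse = std::fmax(0.0, dot(n, l)); // Lambert's cosine law
        std::printf("hit at t=%.2f, diffuse=%.3f\n", t, diffuse);
    }
    return 0;
}
```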
 
Doing a quick search reveals I falsely believed DX supported offline rendering. It does not. OpenGL, however, does. As you can probably tell, I'm not a DX guy.

Take something like RenderMan (which Pixar use). It is extremely similar to offline OpenGL rendering: the shader language and even the APIs are pretty close. The main difference is that OpenGL is more focused on realtime rendering while RenderMan is purely offline. You can technically use either for the task.
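
To put the "shader language is pretty close" bit in concrete terms: the classic matte surface is just ambient plus Lambert, and that expression looks nearly identical whether you write it in RenderMan's shading language, in GLSL, or (as below) in plain C++. The function name and parameters here are just illustrative:

```cpp
// The classic "matte" surface: ambient plus Lambertian diffuse. In RSL this
// is roughly  Ci = Cs * (Ka*ambient() + Kd*diffuse(Nf));  in a GLSL fragment
// shader it's roughly  colour * (Ka*ambient + Kd*max(dot(N, L), 0.0)).
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Ka/Kd are the usual ambient/diffuse weights; N and L must be unit vectors.
static double matte(double Ka, double Kd, double ambient, Vec3 N, Vec3 L)
{
    return Ka * ambient + Kd * std::max(0.0, dot(N, L));
}

int main()
{
    Vec3 N{0, 0, 1};                                 // surface normal
    Vec3 L{0, std::sqrt(0.5), std::sqrt(0.5)};       // light direction, unit length
    std::printf("shaded value: %.3f\n", matte(1.0, 1.0, 0.1, N, L));
    return 0;
}
```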

As for I, Robot and 2012, their wiki pages mention nothing about 3DS Max, never mind OpenGL. :confused:
Check the 3ds Max wiki page for the films produced using it. It lacks any details though - it's just a list.
 

Rather than "CUDA" I'd just broadly generalise that all forms of rendering that's usually done on the CPU will benefit massively from running on GPUs.
 

Even if it does use OpenGL in some form, its usage isn't that relevant, as the GPU does absolutely nothing and it'll be completely baseline usage, probably just to generate geometry; everything else will be custom programmed by the developers using physically accurate algorithms for light and so on.
 
Even if it does use OpenGL in some form, its usage isn't that relevant, as the GPU does absolutely nothing
OpenGL was designed from the ground up to not require any hardware acceleration. This is a large point of its design. You can accelerate it with a GPU. You can also get it to render every frame on a different CPU. It has massive scalability, but none of it is required.
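
If anyone wants proof, Mesa's off-screen renderer (OSMesa) will happily run GL entirely on the CPU into a plain memory buffer. Quick sketch - assumes Mesa with the OSMesa headers installed, link with -lOSMesa (and possibly -lGL depending on the build):

```cpp
// Pure-CPU OpenGL: Mesa's OSMesa renders into a plain block of memory, no
// window system and no graphics card required. (OSMesa is Mesa-specific,
// but it shows that nothing in GL itself demands hardware acceleration.)
#include <GL/osmesa.h>
#include <GL/gl.h>
#include <cstdio>
#include <vector>

int main()
{
    const int width = 256, height = 256;
    std::vector<unsigned char> buffer(width * height * 4);   // RGBA framebuffer in RAM

    OSMesaContext ctx = OSMesaCreateContext(OSMESA_RGBA, nullptr);
    if (!ctx || !OSMesaMakeCurrent(ctx, buffer.data(), GL_UNSIGNED_BYTE, width, height)) {
        std::fprintf(stderr, "could not create a software GL context\n");
        return 1;
    }

    // Ordinary GL calls from here on -- executed entirely on the CPU.
    glClearColor(0.2f, 0.4f, 0.6f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glFinish();

    std::printf("first pixel: %u %u %u\n",
                (unsigned)buffer[0], (unsigned)buffer[1], (unsigned)buffer[2]);
    OSMesaDestroyContext(ctx);
    return 0;
}
```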
 
OpenGL was designed from the ground up to not require any hardware acceleration. This is a large point of its design.

Maybe so, but the usage of it in rendering applications wouldn't allow MS to cripple anything with a legacy mode. :)

Rendering on the CPU with rendering engines such as Maxwell Render wouldn't have anything to do with the type of crippling that MS could potentially apply.

The type of OpenGL you were initially talking about is to do with realtime, interactive usage of it.

I'm not even entirely sure that render engines use OpenGL, but I don't know enough about the mechanics of them to say much else.
 
They don't usually use OpenGL as such - but many of the engines started life working from a framework based on the OpenGL design.
 
Maybe so, but the usage of it in rendering applications wouldn't allow MS to cripple anything with a legacy mode. :)
I was first explaining the (likely) cause of the switch in 3ds Max. I could be wrong on this.
The type of OpenGL you were initially talking about is to do with realtime, interactive usage of it.
Kind of, but not really; I'll explain. OpenGL will use any acceleration available (which it may have for some parts of a render) and do the rest completely in software. This is what is so brilliant about its design, and why it will be extremely exciting when APUs finally start pushing through (bring on Larrabee / Bulldozer). In the not-so-distant future, accelerated ray tracing will be possible.

I'm not even entirely sure that render engines use OpenGL, but I don't know enough about the mechanics of them to say much else.

Some do, some don't. Almost all are based on the OpenGL design.
 

Raytracing is so old tech. Unbiased is where it's at now. :p

There are already videos of Fermi doing "real time" raytracing, actually. There's also an application in beta that uses GPUs to accelerate an unbiased rendering engine; it currently runs on CUDA, but an OpenCL implementation is being worked on and will be released when they feel it's mature enough.
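
For anyone wondering what "unbiased" actually means: the render is a Monte Carlo estimate whose expected value is the exact answer, so it just converges as you throw more samples at it - which is exactly what GPUs are good at. Toy illustration of the idea, nothing to do with any particular engine:

```cpp
// What "unbiased" means in rendering terms: the estimator's expected value
// is the exact answer, and the error shrinks as the sample count grows.
// Here the "integral" is the area of a quarter disc (i.e. estimating pi),
// but path tracers pull the same trick on the rendering equation.
#include <cstdio>
#include <random>

int main()
{
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uniform(0.0, 1.0);

    const long long targets[] = {1000, 100000, 10000000};
    long long inside = 0, done = 0;

    for (long long target : targets) {
        for (; done < target; ++done) {
            double x = uniform(rng), y = uniform(rng);
            if (x * x + y * y <= 1.0) ++inside;   // random point landed inside the quarter disc
        }
        // 4 * (hits / samples) is an unbiased estimate of pi; more samples = less noise.
        std::printf("%lld samples -> pi ~= %.5f\n", target, 4.0 * inside / target);
    }
    return 0;
}
```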
 
Raytracing is so old tech. Unbiased is where it's at now. :p

There are already videos of Fermi doing "real time" raytracing, actually.
Haven't seen that! I'm checking that out now. There was a Larrabee one last year too.. before they cancelled it :(

There's also an application in beta that uses GPUs to accelerate an unbiased rendering engine; it currently runs on CUDA, but an OpenCL implementation is being worked on and will be released when they feel it's mature enough.
I've seen absolutely nothing about that :o
 
Haven't seen that! I'm checking that out now. There was a Larrabee one last year too.. before they cancelled it :(

With nVidia just buying mental ray, they're pouring some R&D into it, which is good for everyone.


I've seen absolutely nothing about that :o

There was a thread on it here a few weeks back *looks up*.
 
I have an nVidia edition CM690 and am running a 5770, soon to be CrossFired :D

Good man! I have an Athlon 64 sticker (found in an old box in the spare room) on the inside door of my Antec Mini P180... Am running a C2D E8400. Could be a new trend in the making... Now where are those Pentium MMX stickers I had ;)

On the GPU front, let's not hurt the fanboys' feelings. Just buy whatever is best for your budget and needs, ffs.
 