yet another Titan - Titan V CEO Edition

Please report away. Was it you that asked about the lies? Rroff has all the dirt on Jen-Hsun. You're barking up the wrong tree :p
:confused: By report I thought you meant you were going to write and post a link to an actual report on that subject. You did repeat the word, so I thought that was just excitement at doing such a task.
 
I wish someone like DM would post in this thread; the thing about his posts is they tend to stick to the subject of the OP and add real substance to the debate.
 
No, I'm right on both counts. Nvidia never invented the GPU. They may have coined a phrase first. And Jen-Hsun has told a lot of lies :p


No, they didn't coin the phrase - 'GPU' was in general use way before the GeForce 256. What they did was popularise the term when they brought to market the first true GPU, in the sense that it could do the full set of primitive graphics operations that underpin all other operations in hardware. No one else had done that before them, though as per my earlier quote several other vendors were also trying.

Most of your posts lately are nonsense or bandying around playground-level insolence and insults... such as "Rroff has all the dirt on Jen-Hsun", which is just childish nonsense.

I wish someone like DM would post in this thread; the thing about his posts is they tend to stick to the subject of the OP and add real substance to the debate.

https://forums.overclockers.co.uk/posts/31904706

He is right in so far as GPUs have rapidly increased in feature set, etc., but the big deal with the GeForce 256 was how much of the underlying functionality that underpins more advanced features was brought into actual hardware processing. That is what distinguishes it as a processing unit and makes for an arguable claim to "inventing" the GPU - a somewhat inflated claim, but certainly not an outright lie.
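To make concrete what "brought into actual hardware" means, here is a rough C sketch (the structs and helpers are made up purely for illustration, not any vendor's actual code) of the per-vertex transform and lighting work that older cards left to the CPU and the GeForce 256 pulled into fixed-function silicon:

[CODE]
/* Per-vertex transform and lighting (T&L): the sort of work a CPU
   had to loop over for every vertex each frame before the GeForce 256
   did it in fixed-function hardware. Types and values are invented
   for illustration. */
typedef struct { float x, y, z; } Vec3;

/* Multiply a vertex position by a 4x4 transform matrix (w assumed 1). */
Vec3 transform(const float m[4][4], Vec3 v) {
    Vec3 r;
    r.x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3];
    r.y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3];
    r.z = m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3];
    return r;
}

/* Simple diffuse lighting term: dot(normal, light) clamped at zero.
   A few multiply-adds per vertex, times tens of thousands of vertices,
   sixty times a second - exactly the load hardware T&L took off the CPU. */
float diffuse(Vec3 n, Vec3 l) {
    float d = n.x*l.x + n.y*l.y + n.z*l.z;
    return d > 0.0f ? d : 0.0f;
}
[/CODE]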

Larrabee is an interesting aside here as it depends a lot on software but is reprogrammable hardware heh.
 
I wish someone like DM would post in this thread; the thing about his posts is they tend to stick to the subject of the OP and add real substance to the debate.

Nvidia announced another Titan card. Not sure what more we have to talk about. Erm... it's got a lot of memory.

Hyperseven wants to know about Jen-Hsun's lies as a side topic, although it seems like a diversionary question.

Edit: A quick search of Nvidia/Jen-Hsun Huang/lies seems to reveal enough.
 
Nvidia announced another Titan card. Not sure what more we have to talk about. Erm... it's got a lot of memory.

Hyperseven wants to know about Jen-Hsun's lies as a side topic, although it seems like a diversionary question.

Edit: A quick search of Nvidia/Jen-Hsun Huang/lies seems to reveal enough.

If you have a problem with keeping to the topic of the thread, please start a new one on a topic that you do wish to debate.:)

He already did. :confused:

Sorry, I missed that one as it got lost in all the other stuff.:)

@Rroff thanks for posting the link to DM's post. I do find it interesting reading other people's posts (yours included) when they are talking about the technical side of things to do with GPUs; over the years I have learned a great deal by doing this.:)
 
So kaapstaad, are you remortgaging your gaff to acquire one of these when they inevitably get listed for about 6 grand? :p
 
I always assumed the Pixie chip was the original GPU, in that it was a separate unit that processed graphics.

Most of those early chips would more accurately be called something like a video [output] processing unit - they did little in the way of graphics processing (pixel blending, scaling, etc.), just converting the data into something that could be displayed.
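For anyone wondering what the difference amounts to, the sort of per-pixel arithmetic I mean by graphics processing looks roughly like this (an illustrative C sketch, not any chip's actual logic):

[CODE]
#include <stdint.h>

/* Classic source-over alpha blend on an 8-bit channel: the kind of
   per-pixel maths a real graphics processor does in hardware, and a
   plain video output chip has no logic for at all. */
uint8_t blend_channel(uint8_t src, uint8_t dst, uint8_t alpha) {
    /* result = src*alpha + dst*(255 - alpha), with alpha in 0..255 */
    return (uint8_t)((src * alpha + dst * (255 - alpha)) / 255);
}
[/CODE]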
 
Most of those early chips would more accurately be called something like a video [output] processing unit - they did little in the way of graphics processing (pixel blending, scaling, etc.), just converting the data into something that could be displayed.
Yeah, I guess it really comes down to how you define graphics. You had big players like SGI, whose RealityEngine did a lot of what is now the modern graphics pipeline in hardware - but not all of it. You also had Evans & Sutherland, working for the US military, producing incredible 3D visuals (for the time) with their Freedom Graphics hardware, but again not the entire pipeline done in hardware.
 
So kaapstaad, are you remortgaging your gaff to acquire one of these when they inevitably get listed for about 6 grand? :p

I will be sitting this one out.:)

What is interesting is that if the CEO edition ever makes it to market, it could actually be slower than the regular Titan V for gaming due to having to drive 32GB of memory.
 
I will be sitting this one out.:)

What is interesting is that if the CEO edition ever makes it to market, it could actually be slower than the regular Titan V for gaming due to having to drive 32GB of memory.
How will you be able to sleep knowing your collection has something missing? Won’t it eat you inside bit by bit? :p
 
Yeah, I guess it really comes down to how you define graphics. You had big players like SGI, whose RealityEngine did a lot of what is now the modern graphics pipeline in hardware - but not all of it. You also had Evans & Sutherland, working for the US military, producing incredible 3D visuals (for the time) with their Freedom Graphics hardware, but again not the entire pipeline done in hardware.

Yeah, it is a very debatable point - what people often refer to as a GPU on older systems is often a glorified video controller with little more than display buffers, H/V sync and a video DAC.
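To put that in code terms, the whole job of one of those controllers boils down to something like this (illustrative C only - dac_write() and the framebuffer layout are invented for the sketch):

[CODE]
#include <stdint.h>

/* Everything a 'glorified video controller' really does: walk the
   display buffer in step with the H/V sync timing and push each
   value at the video DAC. No blending, no scaling, no transforms -
   just readout. */
void scan_out(const uint8_t *framebuffer, int width, int height,
              void (*dac_write)(uint8_t)) {
    for (int line = 0; line < height; line++) {   /* one scanline per H sync */
        for (int x = 0; x < width; x++)
            dac_write(framebuffer[line * width + x]);
        /* horizontal blanking interval here */
    }
    /* vertical blanking interval, then repeat every frame */
}
[/CODE]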
 
Yup, very often the CPU would do most of the calculations and then the graphics card would just handle the output side of things - hence why Nvidia claim the "first GPU" title, because the GeForce 256 didn't need the CPU to do half the work.
 
How will you be able to sleep knowing your collection has something missing? Won’t it eat you inside bit by bit? :p

The trouble with Titans is they are like buses - you wait ages for one, then three come along at once.:)

I think by the end of the year Nvidia will launch another Titan based on 12nm purely for gaming, using the same 5120 SP cores as the Titan V but without any of the DP or Tensor cores. It won't be that much faster than a Titan V, but it will be an interesting step forward.:)
 
The trouble with Titans is they are like buses - you wait ages for one, then three come along at once.:)

I think by the end of the year Nvidia will launch another Titan based on 12nm purely for gaming, using the same 5120 SP cores as the Titan V but without any of the DP or Tensor cores. It won't be that much faster than a Titan V, but it will be an interesting step forward.:)
Yeah. I personally feel that if I am to get a Titan, it is going to be the 7nm one, which I'll watercool and be done with needing an upgrade for a good 3 years. Either that or a watercooled 7nm xx80Ti.

I have the feeling that once we hit 7nm progress will start slowing down, and unless Nvidia spend a nice chunk of their revenue on R&D for a very good architecture, we will start getting smaller and smaller gains after that.
 