
They just demoed that same Star Wars demo on a single Turing Quadro at SIGGRAPH, and it was done using DLAA as well.

Let's see them do it side by side with the same amount of memory on both cards lol.

Also, checking out the specs and performance of the top Turing card is interesting: it is essentially a cut-down GV100 chip (around 10% fewer of every core type), overclocked a bit and using GDDR6 memory.

[Image: spec comparison between the top Turing card and the GV100]

I notice power consumption (TDP) has gone down a bit with the switch from HBM2 to GDDR6.
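
For anyone who wants to sanity-check the "cut-down GV100" point, here's a quick back-of-envelope sketch in Python using the usual peak-FP32 formula (cores x 2 FLOPs per clock x boost clock). The clock figures are my own guesses purely for illustration, not confirmed specs.

[code]
# Rough FP32 comparison based on the "cut-down GV100, ~10% fewer cores,
# overclocked a bit" description above. Clocks are assumed, not confirmed.

def fp32_tflops(cuda_cores, boost_clock_ghz):
    # Peak FP32 = cores x 2 FLOPs per clock (one FMA) x clock, in TFLOPS
    return cuda_cores * 2 * boost_clock_ghz / 1000.0

gv100_cores  = 5120                       # full GV100 shader count
turing_cores = round(gv100_cores * 0.9)   # ~10% fewer -> 4608

print(f"GV100 @ ~1.45 GHz : {fp32_tflops(gv100_cores, 1.45):.1f} TFLOPS")
print(f"Turing @ ~1.73 GHz: {fp32_tflops(turing_cores, 1.73):.1f} TFLOPS")
[/code]

With those assumed clocks the cut-down chip still comes out slightly ahead on paper, which fits the "overclocked a bit" point.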
 
To be honest, it boggles me that NV are too cheap to put an AIO cooler on all of their GPUs over, say, £1500 as a default option. I mean, come on, really... do they really add a significant amount to the cost when you are talking £1.5k+??

An AIO cooler is the last thing you want on a professional card, as they are used in mGPU setups. :)
 
An AIO cooler is the last thing you want on a professional card, as they are used in mGPU setups. :)
Indeed, which is why I said as an option...

I meant the GeForce series however (including the Titan), and not the Quadro... I won't be considering buying one of those. :)

(The truly professional, stupidly powerful cards/systems don't even always have video out on them, do they? They are just number-crunching beasts. I think we have a few at our place, and I am almost certain one of our IT guys said there's no video out on them.)
 
Also the Turing 'Tensor Compute' seems like it got a very big bump @Kaapstad


They made big changes to the Tensor cores, so int8 is now twice as fast as FP16, which is reasonable.
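
Roughly what that doubling means in throughput terms, as a quick Python sketch; the tensor core count and clock here are assumptions on my part, not published specs:

[code]
# Sketch of the "int8 twice as fast as FP16" tensor-core point.
# Core count and clock below are assumed for illustration only.

def tensor_fp16_tflops(tensor_cores, boost_clock_ghz):
    # Each Volta/Turing tensor core does a 4x4x4 FP16 matrix FMA per clock
    # = 64 FMAs = 128 FLOPs per clock
    return tensor_cores * 128 * boost_clock_ghz / 1000.0

tensor_cores = 576    # assumed: ~10% fewer than GV100's 640
clock_ghz    = 1.73   # assumed boost clock

fp16 = tensor_fp16_tflops(tensor_cores, clock_ghz)
int8 = fp16 * 2       # half-width operands -> double the ops per clock

print(f"FP16 tensor: ~{fp16:.0f} TFLOPS")
print(f"INT8 tensor: ~{int8:.0f} TOPS")
[/code]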

Turing is absolutely not the same as Volta; it's obviously an evolution, delivering greater performance at lower power than Volta.

AMD seemed to be about a generation behind with Vega vs Pascal (the 1080 was competing with the Vega 64); now Nvidia have released two generations, albeit with smaller jumps each time.
 
As a GTX 1080 owner happy at 1440p, I'm going to give the 20xx series a miss, I think. It also worries me that AMD are nowhere to be seen. Hopefully AMD are back in the GPU game when the next series comes around and I can get better perf/£.
Then I will move to 4K.
 
I'm also hoping in a couple of years they'll be back in the high end; I'm guessing next year will be the mid-range focus, then a bigger chip a year or so later :) The 1080 will last you at 1440p, no doubt :cool:
Just got myself a Vega64 not long ago, and it's really doing well at 1440p.

Given how little difference in price there is now between the 1080 and the Vega 64, it would be silly to get a 1080 at this point in time, as you just know it's going to age far worse and faster than the Vega 64 towards EOL if history is anything to go by. The Vega 64 will still get driver performance optimisation and support for new games up to (and likely beyond) the launch of AMD's next-gen card; as for the 1080 getting driver optimisation from Nvidia after the 2080 launch (not that long away)... well, good luck with that; people are going to need it.
 
There's a fundamental gap between what a company achieves or is capable of technologically and what they decide to offer (or not offer) in their consumer products for business reasons, though.

I miss the days when the GTX x80 cards were "real" 80 cards, not these recent-gen 60 Ti-class chips wearing an 80 skin and being launched at a price point close to the original Titan.

If only AMD could up their game on the graphics card side like they have done with Ryzen 2 on the CPU side of things...
 
I've got a 2016 Titan Xp now. For me, at least, it's all about performance, not so much about money. If it's going to be 50% more GFLOPS than what I have now, I'll get it if it comes in under 2,500 pounds or thereabouts. I do hope they will improve the cooler with at least two fans, or better yet put an AIO on it.
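
If you want to put a number on that 50% bar, here's a quick Python check; the baseline is the commonly quoted ~12 TFLOPS peak FP32 for a Titan Xp, treated as an approximation rather than a measured figure.

[code]
# Rough check of the "+50% GFLOPS" upgrade threshold.
# Baseline: commonly quoted ~12 TFLOPS peak FP32 for a Titan Xp (approximate).

current_tflops = 12.1
target_uplift  = 1.5   # "50% more than what I have now"

required = current_tflops * target_uplift
print(f"A replacement needs roughly {required:.1f} TFLOPS FP32 to clear the bar.")
[/code]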

It's always about the performance; I find having the money makes you more aware of when you're just spending for the sake of spending. I've had 16 Titans since the brand's introduction, and I'm sick of paying through the nose when you don't have to. Even saying things like "if it comes in under 2,500".

Christ, just stop lol

To be honest, it boggles me that NV are too cheap to put an AIO cooler on all of their GPUs over, say, £1500 as a default option. I mean, come on, really... do they really add a significant amount to the cost when you are talking £1.5k+??

AIO coolers on GPUs are just a big no. I think they take away from the premium feel of a GPU rather than adding to it.
 
There's a fundamental gap between what a company achieves or is capable of technologically and what they decide to offer (or not offer) in their consumer products for business reasons, though.

I miss the days when the GTX x80 cards were "real" 80 cards, not these recent-gen 60 Ti-class chips wearing an 80 skin and being launched at a price point close to the original Titan.

If only AMD could up their game on the graphics card side like they have done with Ryzen 2 on the CPU side of things...


That is just naming; you could call the new xx80 cards an xx60 Ti and they would sell at the same price and have the same specs. It is just a name.

As technology has changed, node costs have increased, R&D costs have exploded and the market has changed with a new luxury end, so of course model numbers will change.

On the AMD side, it used to be that if you purchased an X90 card you got a top-of-the-range offering; now they don't even exist.
 
As technology has changed, node costs have increased, R&D costs have exploded and the market has changed with a new luxury end, so of course model numbers will change.
Indeed! The costs are so high that Nvidia's full-year revenue has gone up by 41% compared to the year before... no wait...

Seriously, if people are willing to pay the price for it, that's one thing, so be it, and bravo to Nvidia for pulling it off as a business; but trying to sugar-coat the inflated prices of these mid-range-chip cards by blaming it on "increased costs" is just ridiculous, considering Nvidia is making a higher margin on each of these mid-range-chip cards than it ever did on the flagship cards it traditionally made and sold in the past.

Much like trying to make justifications on behalf of EA about high development costs for games, when they are posting record-breaking years one after another and can afford to pay executives $48 million and $35 million just for last year alone.
 