
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
Welcome back zeed, it's been a while, hope you're feeling better
It's a topic no one should touch. Let's say they put me on suicide watch, off for more antidepressants at 17:15. That's how good I feel because of this falling-apart body. Lockdown killed me.

The new GPU got me on the forums again, to have something to look forward to besides the coffin
 
I don't disagree, but IIRC the 1080 was sold as the top card, i.e. customers weren't expecting to have to trade up in (less than?) a year because their card wasn't the flagship of that generation. If I'm not remembering this correctly, I'm happy to be corrected :)

The 1080 stopped being the top card less than a month after release; Nvidia announced the Titan card about 3 weeks after the 1080 launched. Looking around at these forums and others, most people hate that there is such a long wait between upgrades. So I don't think too many people were upset at being given the choice to upgrade to a better GPU 10 months later.

They did the same thing with Maxwell, so I am guessing that nobody was surprised or upset when they released the 1080 Ti 10 months after the 1080.
 
Then they can use the CTR to get a further % performance for some % power savings! :)

I don't even know the CPU is there..... there is no change to my £45 AIO when I put load on it. I've put a 140mm fan above it to draw the heat out from the GPU because the rad fan isn't doing it; it barely changes speed from idle when I load up the CPU.

My gaming mate @pete910 folds on his 3900X while we are gaming...
-------


Nvidia do not want the Titan name beaten.

I'll just leave that there ^^^^
 

If I mine on my 3600 it gets toasty on the AIO, running at 68°C. The folding software will dial back if you're using the cores.
 

MCM for GPU would be equivalent to chiplets for CPU, so using smaller "gpulets" would make sense.

I think this is where they'll aim. Higher yields, more profitable and more potential customers at a more affordable price. Maybe even use the consoles as the benchmark rather than Nvidia. "Buy this card for $XXX and get performance equal or better than on console" whilst not costing the earth.... maybe.

Also potentially better supply?? Hopefully AMD can cross-market Zen 3 CPUs and RDNA2 GPUs together.
 
[Attached image: Nvidia's slide showing the FE cooler airflow]

I don't think the cooler works like that. The top fan sucks in and the back fan pushes out the hot air.
 
MCM for GPU would be equivalent to chiplets for CPU, so using smaller "gpulets" would make sense.

Can't see GPUlets TBH. The block diagram has a "Fabric" between the shaders and the IMC/IO, so an external IMC/IO die like Zen 2's is certainly a good educated guess, and that's the sort of thing they have already proven in Zen 2. AMD themselves have said splitting the shaders up into chiplets presents some very difficult challenges, so they have looked at it and I'm sure they are putting R&D into it; I just can't see it this generation.
 
@shankly1985 the fan on the top draws air through the HS from underneath it and out the top. Just like the picture shows, which is actually one of Nvidia's slides.
 
I don't think the cooler works like that. The top fan sucks in and the back fan pushes out the hot air.

So it will probably be sucking in warm air from the CPU and potentially heating up the GPU instead, since both fans will likely be sucking in warm air. Seems like a strange design, but time will tell.
 
All you got to do is swap some fans:
make the rear fan an intake
take one of the bottom fans to the top and make it push air out

Maybe, but apparently you can't really dismantle the stock cooler without damaging it. That's at least what JayzTwoCents said he was told by Nvidia in his 3080 unboxing video
 
I don't think the cooler works like that. The top fan sucks in and the back fan pushes out the hot air.
That's how it works. This is NV's slide, if you missed it. That's why, if I was on air, I would not touch the FE card. Why would I want to heat up my CPU?? :D
 
@shankly1985 the fan on the top draws air through the HS from underneath it and out the top. Just like the picture shows, which is actually one of Nvidia's slides.
Aaa, you answered first. I think it's designed to produce an audio counter-wave to the fan; that could be why NV is claiming it's much quieter, i.e. in person it will sound quiet to you even though a dB meter won't pick it up. Like an active noise cancelling kind of solution, similar to the headphones I use that require 2 batteries to power the noise cancelling system. I can't hear my washing machine spinning in the kitchen at 1600 rpm when I turn the system on, even though it sounds like it's about to take off!!
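The counter-wave idea above is just destructive interference: emit a phase-inverted copy of the noise and the two waveforms sum to (near) silence. A minimal Python sketch of the principle, nothing to do with how Nvidia's cooler actually works; the 1600 rpm washing-machine fundamental is taken from the post as an illustrative tone:

```python
import math

SAMPLE_RATE = 8000   # samples per second
FREQ = 1600 / 60     # 1600 rpm drum -> ~26.7 Hz fundamental (illustrative)

def tone(n, amplitude=1.0, phase=0.0):
    """Generate n samples of a sine tone at FREQ."""
    return [amplitude * math.sin(2 * math.pi * FREQ * t / SAMPLE_RATE + phase)
            for t in range(n)]

noise = tone(8000)                      # the unwanted sound
anti_noise = tone(8000, phase=math.pi)  # same tone, phase-inverted

# Superposition: the two waves cancel sample by sample.
residual = [a + b for a, b in zip(noise, anti_noise)]

# Only floating-point rounding error remains.
print(max(abs(x) for x in residual))
```

Real ANC headphones do this adaptively with microphones and a feedback filter, which is why they need their own power; this sketch only shows why a perfectly inverted wave cancels.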
 
Maybe, but apparently you can't really dismantle the stock cooler without damaging it. That's at least what JayzTwoCents said he was told by Nvidia in his 3080 unboxing video

I was talking about case fans, not modding the card:

All you got to do is swap some fans:
make the rear fan an intake
take one of the bottom fans to the top and make it push air out
 
Can't see GPUlets TBH. The block diagram has a "Fabric" between the shaders and the IMC/IO, so an external IMC/IO die like Zen 2's is certainly a good educated guess, and that's the sort of thing they have already proven in Zen 2. AMD themselves have said splitting the shaders up into chiplets presents some very difficult challenges, so they have looked at it and I'm sure they are putting R&D into it; I just can't see it this generation.

I think it's going to be a normal monolithic GPU this generation, but it'll be interesting if AMD does make something like that for RDNA3!
 
If a card has better performance than the 2080 Ti at a sensible price, it will be a nice upgrade from my Vega 56. The 5700 XT was not a big enough jump, so I am confident they will deliver on this. The bonus is that the green brigade's prices are sensible, so AMD's cards will pretty much cost less too (win-win). Thanks Jensen!

Exactly. I would prefer to be able to support AMD, but if they aren't giving me proper price/performance GPUs, or something seriously problematic is up, I'll just turn to the Leather Jacket Man.
 