
AMD RDNA3 unveiling event

New video from Derbauer where they do a literal teardown of the XTX cooler. The conclusion at the end, after talking to some cooler experts, is that there's not enough fluid in some of the coolers, so (IMO) it's looking more like a manufacturing fault from whoever made the coolers for AMD.

 
New video from Derbauer where they do a literal teardown of the XTX cooler. The conclusion at the end, after talking to some cooler experts, is that there's not enough fluid in some of the coolers, so (IMO) it's looking more like a manufacturing fault from whoever made the coolers for AMD.


The design of the vapour chamber probably needs a computer to chunter away for hours predicting performance, and isn't something that can be checked by opening one up. It seems to me that the root problem must be manufacturing rather than design, and opening the darn thing was never going to show which. But, anyway, it is interesting to see what's inside one!
 
Give it a second; we're gonna have five repeats from various channels insisting there's definitely not enough liquid in there -> link -> link, eventually tracing back to Derbauer sawing open a card and saying he has no idea how much was in there.
 
New video from Derbauer where they do a literal teardown of the XTX cooler. The conclusion at the end, after talking to some cooler experts, is that there's not enough fluid in some of the coolers, so (IMO) it's looking more like a manufacturing fault from whoever made the coolers for AMD.

Man, Derbauer is a bright guy who knows his limits. He's not jumping to conclusions; he's getting advice and staying within the bounds of his expertise.

The actual design of the vapour chamber is beautiful, and would cost some £££ to manufacture. Super interesting that they're likely using water as the working fluid, then pulling a vacuum on the chamber to drop the boiling point. I've made some heat pipes before but used ether - great until it leaks and you get a bit :p

So there could be two problems here:
1. Not enough water, so at higher powers there literally isn't enough fluid, meaning all your fluid stays evaporated - the temperature will trend towards the raw thermal performance of the copper.
2. The chamber pressure wasn't set correctly (or it's at atmospheric), so we're working at the boiling point of water plus thermal gradients - this would give you ~110C... Not saying this is the answer, but it's kind of interesting.
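The pressure/boiling-point relationship behind point 2 can be sketched numerically. A minimal Python sketch using the Antoine equation for water - the coefficients are standard published values for roughly 1-100 C, and applying them to this cooler is my own illustration, not anything from the video:

```python
import math

# Antoine equation coefficients for water (T in Celsius, P in mmHg),
# valid roughly over the 1-100 C range.
A, B, C = 8.07131, 1730.63, 233.426

def boiling_point_c(p_mmhg: float) -> float:
    """Temperature at which water boils at the given pressure,
    from the Antoine equation solved for T."""
    return B / (A - math.log10(p_mmhg)) - C

print(boiling_point_c(760))  # ~100 C at 1 atm: the "pressure not set" failure mode
print(boiling_point_c(76))   # ~46 C at 0.1 atm: why the chamber is evacuated
```

If the vacuum pull failed in manufacturing, the working fluid only starts doing useful phase-change work near 100 C instead of well below GPU operating temperature, which lines up with the ~110C figure above.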
 
Now see how many melted PCIe connectors you find from the adapters we are discussing.

These adapters have only one 12VHPWR connector and 3 or 4 PCIe connectors. The individuals who "can't plug in a connector" apparently can plug in "a" connector... because they plugged in 3 or 4 of them right around the time they plugged in the 12VHPWR connector.

The PCIe connectors on my adapter insert as they should and click when fully seated. The 12VHPWR connector? No. It does neither.


I get what you're saying. But I'm also saying that PCIe connectors have melted on the GPU side, and I imagine plenty have also melted on the PSU side from not being inserted properly. As the 3 or 4 PCIe ends of the adapter are essentially open, as are the PSU cables you plug into them, it's much easier to push two open connectors together. I agree 12VHPWR is a poor design. But when inserting power cables, click or not, the proper check for correct seating is a visual one: make sure both faces are flush and fully mated. How would you ensure it's fully inserted in a noisier environment if you rely on hearing a click?
I wonder if all the guys in the OCUK build dept say, "Hang on guys, quiet please, radio off, I'm inserting the 12VHPWR - I need to hear the click."
 
A 1080 Ti to a 3080 will be a massive uplift in performance, and a 4090 is double the 3080 at 4K. The Witcher 3 RT version is poor on release, but at least it's free. Some games still cripple the best cards, particularly with RT on. RT is a nice-to-have, but I think most people still weigh up their buying decisions around pure raster performance. Then there are technologies you haven't experienced, like DLSS: it was good on the 3000-series cards with DLSS 2, and now DLSS 3 on the current gen. FSR works well on the AMD cards.

I think you're right not to go for either the 4080 or the XTX. A 6900XT for £750 (they were £650 before Xmas), or a second-hand 3080 12GB, 3080 Ti, 3090 or 3090 Ti, will all do 4K comfortably. No card can do RT well in W3, CP2077 etc. - maybe the RT is set too high or poorly implemented. I wouldn't base a buying decision on one game's RT performance. Understandably this may be your favourite game, but its RT implementation is poor currently, and CD Projekt Red, who make both W3 and CP2077, released both games in a buggy state. W3's saving grace is that it's free.

My horror with a 4090 and Elden Ring, where the game is LOCKED to 60fps! In this day and age! Last-gen top cards will give you a massive boost, well within your budget, in most games. RT is a nice-to-have, but in some games it still brings any card to its knees. DLSS works well with a bit of fettling of IQ.
It's nothing to do with Cyberpunk being my favourite game, more to do with future-proofing: if neither card can perform that well now, neither will age well in regards to ray tracing.

Yeah, a bit more regret at not jumping on the 699 6900XT deal; it would have been an excellent uplift from my 1080 Ti.

This generation truly truly sucks though for consumers.

DLSS and FSR I consider fake frames. I understand why people like them, but my personal opinion is that I'd expect a new GPU to have a playable framerate without frame generation on day one, and to stay that way for at least six months. I see frame generation as a useful tool to lengthen the longevity of a GPU: 4K max settings for six months, then after a year DLSS needs to be turned on to hold 4K max settings, etc. The biggest issue is that some games rely on frame generation on day one... That's not a good thing for future performance in my eyes.
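The day-one worry can be put in rough numbers. A sketch of my own (assuming one generated frame per rendered frame, as DLSS 3 frame generation does; the figures are illustrative, not benchmarks):

```python
def base_fps(displayed_fps: float, gen_per_rendered: int = 1) -> float:
    """Rendered framerate implied by a displayed framerate when frame
    generation inserts `gen_per_rendered` frames per rendered frame."""
    return displayed_fps / (1 + gen_per_rendered)

# A game that only hits "60 fps" with frame generation switched on
# is really rendering 30 fps - and that lower number is the baseline
# the card will actually age from.
print(base_fps(60))   # 30.0
print(base_fps(120))  # 60.0
```

In other words, a day-one title "needing" frame generation means the underlying render rate is already half the headline figure before the game even gets more demanding.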

As someone who was super excited for this generation, as it was to be my upgrade year, I can't help but feel all the energy has been sucked out of me, with AMD and Nvidia becoming corporate dementors sucking the life out of the industry.
 
This generation truly truly sucks though for consumers.

DLSS and FSR I consider fake frames

As someone who was super excited for this generation, as it was to be my upgrade year, I can't help but feel all the energy has been sucked out of me, with AMD and Nvidia becoming corporate dementors sucking the life out of the industry.
I do tend to agree.

The new DLSS, well, I don't really see the point at all.

 
Give it a second; we're gonna have five repeats from various channels insisting there's definitely not enough liquid in there -> link -> link, eventually tracing back to Derbauer sawing open a card and saying he has no idea how much was in there.

The whole video can be summed up as: "I have a lot of assumptions, but know nothing. Here's a look inside a vapour chamber - cool, isn't it?"
 
It's nothing to do with Cyberpunk being my favourite game, more to do with future-proofing: if neither card can perform that well now, neither will age well in regards to ray tracing.

Yeah, a bit more regret at not jumping on the 699 6900XT deal; it would have been an excellent uplift from my 1080 Ti.

This generation truly truly sucks though for consumers.

DLSS and FSR I consider fake frames. I understand why people like them, but my personal opinion is that I'd expect a new GPU to have a playable framerate without frame generation on day one, and to stay that way for at least six months. I see frame generation as a useful tool to lengthen the longevity of a GPU: 4K max settings for six months, then after a year DLSS needs to be turned on to hold 4K max settings, etc. The biggest issue is that some games rely on frame generation on day one... That's not a good thing for future performance in my eyes.

As someone who was super excited for this generation, as it was to be my upgrade year, I can't help but feel all the energy has been sucked out of me, with AMD and Nvidia becoming corporate dementors sucking the life out of the industry.

You'll never 'future-proof' when it comes to PC gaming - there will always be one game that makes cards fall over, and it's more to do with programming. AAA games are released in beta form at best; people pay full MSRP for 'the way it's not meant to be played'. It's the games that are poor, not the cards. I think people should stop buying games at MSRP if they don't work properly. MSRP for the worst experience? Madness. Wait six months: half price and fixed, for a decent experience.

People on here berate others for buying expensive cards - stop buying the 'broken on release' games! That's where the madness started.

DLSS and FSR you may consider fake frames, but in these games that chug, it's better to have a smoother experience than a janky one at native. You must be used to that with a 1080 Ti at 4K? That's what DLSS and FSR are for. DLSS in some games (under DLSS 2.0) performed very well, and you'd have to pixel-peep to notice it. But again, it comes down to the implementation: some devs do it well, some don't - same as RT, same as all the new technologies.

Regarding 4K, a 4090 will do 98% of games at native with 100+ fps. FC6 with RT on: 120fps! RDR2: 100+ fps. Plenty of games released after CP2077 etc. run very well.

As this was your upgrade year - you saved yourself from the 20 series (which was awful value over the 1080 Ti), missed the 3000 series, and haven't yet experienced how good DLSS can be. The prices are eye-watering, but the 1080 Ti was £700 on release. If you want to play at 4K, then it's 4090 time. :D
 
We are where we are because things like DLSS are now factored into the pricing. I ####'ing said this would happen.

It's $1600 for today's 1080 Ti equivalent "because RT and DLSS are worth it". A 4080 is $1200 for the same reason, and a 4070 Ti will be $800 if not $900 for the same reason - because too many of you bought into it as the defining thing about a GPU. So now that's exactly what you've got, and those of you who did deserve it.
 
Proud DLSS user here :cool:

If it weren't for DLSS/FSR/XeSS and now frame generation, RT wouldn't have been usable for the past two years. Inb4 "RT is a gimmick".


Guess AMD must also be charging for FSR etc., given they're pricing their competing cards in the same bracket...

Honestly some people should just go console and be done with pc :cry:
 
Dear God, never say that. I swear Jensen has a default search for "seems cheap" on his browser. Every time it gets a hit he increases the price of the next gen by $10.
Well, if we go by the price increase of the last gen (approximately double):

We can expect the 5080 to be $2,400 and the 5090 to be $3,200.

Imagine the 5090ti will be the first $4,000 card!

Wonder what AMD will charge lol
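The doubling maths above as a sketch - purely the tongue-in-cheek assumption of ~2x per generation, not a prediction:

```python
def extrapolate_price(current: float, ratio: float = 2.0, gens: int = 1) -> float:
    """Project a price forward `gens` generations at a fixed ratio."""
    return current * ratio ** gens

# 4080 at $1200 and 4090 at $1600 today, doubling each generation:
print(extrapolate_price(1200))          # hypothetical 5080 -> 2400.0
print(extrapolate_price(1600))          # hypothetical 5090 -> 3200.0
print(extrapolate_price(1600, gens=2))  # two gens out -> 6400.0
```

Which is exactly why geometric extrapolation of one data point makes for good forum jokes and bad forecasts.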
 
Well, if we go by the price increase of the last gen (approximately double):

We can expect the 5080 to be $2,400 and the 5090 to be $3,200.

Imagine the 5090ti will be the first $4,000 card!

Wonder what AMD will charge lol

And I bet they'll still fly off the shelves! Or if not, they're making so much profit on each unit that they can afford to sell in low numbers, as it cuts down on manufacturing costs. It's a win-win!
 