
*** The AMD RDNA 4 Rumour Mill ***

I think someone replied to that with something I hadn't thought of: what's the point of the passthrough if you're then gonna block over the fins with whatever that is to stop the cables getting damaged? I realise there are some holes, but you'll also have the cables in there. I'm torn on whether I like that idea or not, and I also worry about the mess of cables exiting the card on the motherboard (tray) side.

You don't see the cables at all; they come out of the PCIe side of the GPU and go straight into the motherboard tray passthrough to the cable management side of the case.
 
Like this....


How much is it really gonna affect thermals? Not much at all, I think.

Cables should be hidden by the size of the card. Also, the 12VHPWR connector splits into 8-pins, so only the connector should be going into the tray back; it's not as thick as 3 x 8-pin.

You are not gonna see the cables once the cover is on
 
Yeah, I feel like it's just copying Nvidia, which is a shame. AMD has the potential, but innovation is the opposite of imitation.

Sapphire's design might be the one thing where it's right for the job. You couldn't do that with the old PCIe connectors.
 
I think someone replied to that with something I hadn't thought of: what's the point of the passthrough if you're then gonna block over the fins with whatever that is to stop the cables getting damaged? I realise there are some holes, but you'll also have the cables in there. I'm torn on whether I like that idea or not, and I also worry about the mess of cables exiting the card on the motherboard (tray) side.

Most modern cards have a backplate which obstructs the air passing through anyway. All the hot air is usually directed towards the top, where the LED strip is, and towards the PCIe slot.
 
AMD backed DisplayPort as a better alternative to HDMI; are Nvidia copying AMD by using it?
 
Sapphire's design might be the one thing where it's right for the job. You couldn't do that with the old PCIe connectors.

Technically you could, as PCIe power can be mounted on that end of the card, but yeah, it's good for sales.
 
Wonder if retailers are in Nvidia's pocket and will just ignore MSRP and overcharge to hurt AMD?

/s

If there's going to be a price hike in April, then I'm more concerned about the bots... as a price hike is an opportunity for the scalper scumbags.



Lads - Just remember to use the supplied connector to avoid any warranty loopholes that the manufacturer may throw at you in the event of any meltdowns
 
That's very sensible; however, the 12GB really bothers me. Even though it's fine in most games, there are exceptions, e.g. I can't run Indiana Jones at max settings because of this. Its performance is great, but it's held back by VRAM.
I have a 4070 Ti too and have similar frustrations.
 
Yeah, I feel like it's just copying Nvidia, which is a shame. AMD has the potential, but innovation is the opposite of imitation.
That's a very silly post.

E: Personally I'll be looking at the Pulse though, for the 8-pin connections.
 
That's very sensible; however, the 12GB really bothers me. Even though it's fine in most games, there are exceptions, e.g. I can't run Indiana Jones at max settings because of this. Its performance is great, but it's held back by VRAM.

This is why I'm leaning towards keeping my 7900 XTX. I have seen upwards of 18GB used in various games, and just under 20GB in Indiana Jones; 16GB is cutting it way too close for me IMO.
 
Wasn't it bending the power cord like that that led to some of the melting?

Bending is not the issue, since my 3080 FE connector is very bent but still hasn't melted. The real problem is that Nvidia's power design in the 4090/5090 doesn't detect current imbalance, thus allowing higher current in some wires, which causes the overheating/melting. The older cards had shunt resistors per pair of wires, which allowed the VRMs to adjust the current, afaik.
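To see why an undetected imbalance matters, here's a minimal sketch of how current divides between the parallel 12V wires of a connector: each wire's share is inversely proportional to its resistance, so one pin with lower contact resistance (a tighter or worn-in contact) quietly carries far more than its rated share. The resistance and current figures are illustrative assumptions, not measured values for any real card.

```python
# Sketch: current sharing across parallel wires in a power connector.
# Assumption: six 12V wires carry the load together; current through each
# wire divides inversely with its resistance (Ohm's law, parallel branches).

def per_wire_currents(total_current_a, resistances_ohm):
    """Return the current through each parallel wire for a given total."""
    conductance = sum(1.0 / r for r in resistances_ohm)
    v_drop = total_current_a / conductance  # identical voltage drop across all wires
    return [v_drop / r for r in resistances_ohm]

# ~600 W at 12 V is about 50 A total. Assume five wires at 10 mOhm and
# one pin making better contact at 5 mOhm (all hypothetical numbers).
wires = [0.010] * 5 + [0.005]
currents = per_wire_currents(50.0, wires)
print([round(i, 2) for i in currents])
# -> [7.14, 7.14, 7.14, 7.14, 7.14, 14.29]
```

Halving one wire's resistance doubles its current: the last wire carries ~14.3 A while the others carry ~7.1 A. Without per-pair shunt sensing, nothing on the card notices that one wire is running far hotter than its neighbours.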
 