Hostile_18
Only one card has a gem sticking out of its rear end. The other AIBs have already lost.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
I would imagine because they haven't implemented it correctly with the use of onboard fuses and resistors.
Makes you wonder why everyone has a go at Nvidia for any issues with it.
Makes you wonder why everyone has a go at Nvidia for any issues with it.
That's very sensible; however, the 12GB really bothers me. Even though it's fine in most games, there are exceptions, e.g. I can't run Indiana Jones at max settings because of this. Its performance is great, but it's held back by VRAM.
Wonder when we'll see the new drivers drop. Guessing on the 5th - they'd have to come out before people purchase the cards?
What's the recommended PSU wattage if I get one of these cards?
How so?
That's a very silly post.
E: Personally I'll be looking at the Pulse though, for the 8-pin connections
Fair enough, I just meant I feel like they're only using it because Nvidia is. If there's no power requirement to do so, is there another reason? Genuinely asking.
Nvidia didn't come up with 12VHPWR; it was made by PCI-SIG. Nvidia were just the first ones to use it.
I obviously didn't describe my concerns very well, I'm not bothered in the slightest by being able to see a bit of cable. My concern was having to deal with 3 x 8-pin connectors. In the example picture it's fine, but what if your case doesn't have a gap to pass through right at the exit point? Also in my experience of cable management that section is usually one of the busier points behind the motherboard tray to suddenly add 3 x 8-pin connectors to.
Yeah, I feel like it's just copying Nvidia, which is a shame. AMD has the potential, but innovation is the opposite of imitation.
Fair enough, I just meant I feel like they're only using it because Nvidia is. If there's no power requirement to do so, is there another reason? Genuinely asking
Reducing PCB space is likely one of the reasons.
If that's the minimum, what would people recommend these days if I want to future proof?
750 watts for the XT according to AMD's page.
Good points, I was under the impression it was mostly for power requirementsIn the case of sapphire it seems to have been done to hide the power cables, which would be harder to do with conventional connectors.
I think some AIBs have recommended 900W for their variants.
750 watts for the XT according to AMD's page.
If that's the minimum, what would people recommend these days if I want to future proof?
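For what it's worth, the usual back-of-the-envelope approach is to add up the GPU and rest-of-system draw, then leave headroom for transient spikes and PSU ageing. This is only a sketch: the 250 W rest-of-system figure, the 1.3x headroom factor, and the ~300 W GPU board power are illustrative assumptions, not official numbers; only the 750 W recommendation comes from AMD's page.

```python
# Back-of-the-envelope PSU sizing: a sketch, not an official method.
# All wattage inputs below are assumptions for illustration.

def recommended_psu_watts(gpu_board_power: float,
                          rest_of_system: float = 250.0,
                          headroom: float = 1.3) -> int:
    """Sum GPU + rest-of-system draw, multiply by a headroom factor
    for transient spikes/ageing, and round up to the next 50 W size."""
    total = (gpu_board_power + rest_of_system) * headroom
    return int(-(-total // 50) * 50)  # ceiling to a multiple of 50

# e.g. a ~300 W card with an assumed 250 W for CPU/board/drives:
print(recommended_psu_watts(300))  # 750
```

Plugging in a ~300 W card happens to land on the same 750 W figure AMD quotes, which is reassuring, but a beefier CPU or more drives would push you towards the 900W some AIBs suggest.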
They were talking about the RX 9070.
£1 = $1.26.
$600 / 1.26 = £476.19; + 20% VAT = £571.43.
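The conversion above can be sketched like this (the $1.26 exchange rate and 20% VAT are the figures quoted in the thread; the function name is just for illustration):

```python
# Rough UK price estimate from a USD MSRP: a sketch assuming the
# thread's $1.26/£ exchange rate and 20% UK VAT.
USD_PER_GBP = 1.26
VAT_RATE = 0.20

def uk_price_estimate(usd_msrp: float) -> float:
    """Convert a pre-tax USD price to GBP, then add UK VAT."""
    pre_vat_gbp = usd_msrp / USD_PER_GBP
    return round(pre_vat_gbp * (1 + VAT_RATE), 2)

print(uk_price_estimate(600))  # 571.43
```

Of course, actual street prices also bake in importer margins and demand, so treat this as a floor rather than a prediction.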
Just buy an Nvidia card, mate.
I obviously didn't describe my concerns very well, I'm not bothered in the slightest by being able to see a bit of cable. My concern was having to deal with 3 x 8-pin connectors. In the example picture it's fine, but what if your case doesn't have a gap to pass through right at the exit point? Also in my experience of cable management that section is usually one of the busier points behind the motherboard tray to suddenly add 3 x 8-pin connectors to.
Makes you wonder why everyone has a go at Nvidia for any issues with it.
I think it's fine if AMD do it.
But yes, it does seem to go against the advice we got with these connectors. You'd think Sapphire would have tested it, but then you'd have thought Nvidia and its board partners also did some testing, and that didn't prevent issues with the connector.