Didn't IGN test the 9070 none XT?
Non, not none...
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
OK, so I've tried testing BO6 with that in-game benchmark tool at 4K with the Extreme preset. I noticed DLSS was on by default with that preset, so I had to manually disable it, and I'm not sure if there's anything else I need to disable.
I got 97 fps, so I guess the 9070 XT is faster than a 4090 in this benchmark, unless the settings don't quite match.
Found this. Honestly, I don't know how much stock we can put in the IGN article, as it almost seems too good to be true, but it looks like performance is right around the 7900 XTX level. With CoD liking AMD cards, I certainly don't think this means we're going to see anything like 4090 performance in other games, but 4080 level certainly seems possible.
Again, though, something isn't sitting right with that IGN article: for as much as people have been begging to see performance numbers, no one else seems to have noticed it.
Noooooooooo
I think there were 7900 ASRock cards using the 12VHPWR connector, though I may be misremembering.
Underhanded tactics from Nvidia to get their rival's GPUs to explode in flames as well, now that feature is no longer exclusive to Nvidia.

"Radeon using 12V-2x6, a first for team red."
Insert Thisfine.jpg and framechasersguypullingaface.gif

I think most will stick with the old 8-pin connectors. I have two on mine and I don't see what the problem is with them, or why the mad push for this technology, which has if anything proven to be a bit dodgy, all in the name of subjective aesthetics. But if it's only pulling 300 to 350 watts it shouldn't be a problem. I don't trust them beyond that, and it looks like the 5090 will push them right up against their insane 600-watt limit. That should be fun...
Honestly, if the 9070 is as fast as the XTX in CoD, there's no reason it couldn't be in other titles. That said, it's a somewhat changed architecture, so there could be some variation (some titles better, some worse) depending on what the architecture turns out to be good at.
The INTERESTING part of this is that, if it's accurate and the 9070 is matching or slightly beating the 7900 XTX in this title, then the 9070 XT is likely to be another 10-20% ahead. That could get very interesting if it extrapolates to other titles as well, whether they've killed some of RDNA3's weaknesses in certain engines or just refined the architecture to generally improve performance, even IF it has also been made cheaper to manufacture. A rough sketch of that maths is below.
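A minimal back-of-envelope sketch of that extrapolation (the 100 fps baseline is a placeholder for whatever the 9070 actually scores, not IGN's figure, and the 10-20% uplift is just the guess above):
[CODE]
# Back-of-envelope extrapolation for the 9070 XT guess above.
# Assumptions, not measurements: a placeholder 9070 / 7900 XTX-class
# baseline of 100 fps, and the speculated 10-20% 9070 XT uplift.
baseline_fps = 100.0

for uplift in (0.10, 0.20):
    estimate = baseline_fps * (1 + uplift)
    print(f"9070 XT at +{uplift:.0%} over the 9070: ~{estimate:.0f} fps")
[/CODE]
So a 10-20% uplift over a 7900 XTX-class result lands at roughly 110-120% of whatever the 9070 actually scores; the interesting question is whether that ratio holds outside CoD.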
Honestly, I'm not counting the 9070 XT out just yet; it all depends on how real-world performance turns out, and the price.
If it's faster than a 7900 XTX on average, with considerably better RT, at around £500, it could be a barnstormer, as that'd potentially put it in range of a 5070 Ti (a £750 card).
AMD have a quandary here: they can try to take more profit IF the card performs well, or they can reduce margins somewhat and actually try to move volume and win market share (and goodwill ahead of UDNA's launch next gen), like they said they would previously.
If that's the case, I'd place a bet that AMD didn't see Nvidia's shrinkflation coming and thought the GB203-200-A1 would be the 5070. They've now delayed the RDNA4 reveal while they rebrand and reprice the 9070 accordingly (so the $599 9070 releases as the 9070 XT at $699, and the $799 9070 XT releases as the 9080 (XT?) at $899?).
Even if they don't change the prices (yeah, right), I can't see them sticking with the 9070 name if performance is far ahead of the 5070. The whole point of the new naming is to suggest equivalence.
"This also indicates that AMD is likely to reveal the full specifications of these cards by then."
Not seen this posted here yet. Not really any new information, but it does seem like a rebrand isn't happening at this point.