
The RTX 3080 TI isn't going to be- EDIT - No We Were All Wrong!

Soldato
Joined
20 Aug 2019
Posts
3,033
Location
SW Florida
Do most of us need more than a 50% improvement over the 5700 XT? Although, I can see why people with a RTX 2080 / RTX 2080 TI might end up being a bit disappointed, if looking for a substantial upgrade.

I'm on a 1080Ti running a Reverb in VR. I could use an upgrade. I'm just waiting to see who offers what. (and for how much)
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
The XSX is running the entire APU at 200W. You'd need to rip out those 52 CUs and run them alone at 225W to compare against the 5700 XT to see if the 50% ppw boost has occurred.

That's interesting. Does that mean that the Xbox Series X CPU is consuming part of that 200W?

I noticed that the Ryzen 7 4700U 8-core, 4.1GHz mobile CPU only requires up to a 25W TDP. Perhaps the lower clocked Series X CPU will require a similar amount of power?

So, if the console RDNA 2 GPU is only using ~175W, that's quite impressive, and would make performance improvements over 50% more likely for a desktop RDNA 2 GPU.

Ryzen 4000 series link:
https://en.wikipedia.org/wiki/Ryzen#Mobile_3
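Quick back-of-the-envelope in Python of what that could mean (all figures here are the thread's speculation plus AMD's claimed +50% perf/watt, nothing official):

```python
# Relative performance estimate: performance = power budget x perf-per-watt.
# Baseline: 5700 XT at 225 W board power, performance normalised to 1.0.
BASELINE_POWER_W = 225.0
BASELINE_PERF = 1.0
PPW_UPLIFT = 1.5  # AMD's stated +50% perf/watt target for RDNA 2

def relative_perf(power_w: float, ppw_uplift: float = PPW_UPLIFT) -> float:
    """Performance relative to the 5700 XT, assuming perf scales linearly with power."""
    ppw_baseline = BASELINE_PERF / BASELINE_POWER_W
    return power_w * ppw_baseline * ppw_uplift

# Speculated ~175 W console GPU vs a desktop card at the 5700 XT's 225 W:
print(round(relative_perf(175), 2))  # ~1.17x a 5700 XT
print(round(relative_perf(225), 2))  # 1.5x a 5700 XT at the same board power
```

Linear perf-to-power scaling is a simplification (real cards fall off at the top of the voltage curve), so treat these as loose upper bounds.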
 
Last edited:
Associate
Joined
12 Jul 2020
Posts
288
It's maybe not going to be a mammoth boost for 2080 Ti users, but for everyone else it's likely to be a nice jump in performance. As @Twinz says above - we just want to see at what cost.
Don't you just miss the days when a high end GPU 'merely' cost £200-300? No! Nvidia.
 
Last edited by a moderator:
Soldato
Joined
26 Sep 2010
Posts
7,164
Location
Stoke-on-Trent
Much more than 50% is gonna be tough for AMD.
50% performance per watt increase is the target for this generation. RDNA 3 is intended to be another 50% ppw increase.
The Xbox Series X GPU already has 52 CUs and is slower than the RTX 2080 TI, so the higher end desktop GPUs will definitely need more than the Series X...
That's interesting. Does that mean that the Xbox Series X CPU is consuming part of that 200W?
No, the entire console uses 200W (or so we can estimate based on the power jack in the pictures), that's not just the APU, thats all the I/O, the SSD, the motherboard, the memory, everything. How that power budget is split we don't know, but the point is the XSX looks to have RDNA CUs that offer about 2080 Super performance level at a chunk less than 200W. So imagine then 52 RDNA 2 CUs on a discrete card where they are not confined by the power envelope and thermals of an APU and a console as a whole.
I noticed that the Ryzen 7 4700U 8-core, 4.1GHz mobile CPU only requires up to a 25W TDP. Perhaps the lower clocked Series X CPU will require a similar amount of power?
You're forgetting we're talking about APU, so that 4700U is 8 Zen 2 CPU cores and 8 Vega CUs in a 25W package. So those CPU cores are using less than 25W.
So, if the console RDNA 2 GPU is only using ~175W, that's quite impressive, and would make performance improvements over 50% more likely for a desktop RDNA 2 GPU.
Exactly. This is why I used a hypothetical 6700 XT matching the CU count and board power of the 5700 XT to illustrate what a 50% uplift in performance per watt could mean.
 
Soldato
Joined
6 Feb 2019
Posts
17,648
That's interesting. Does that mean that the Xbox Series X CPU is consuming part of that 200W?

I noticed that the Ryzen 7 4700U 8-core, 4.1GHz mobile CPU only requires up to a 25W TDP. Perhaps the lower clocked Series X CPU will require a similar amount of power?

So, if the console RDNA 2 GPU is only using ~175W, that's quite impressive, and would make performance improvements over 50% more likely for a desktop RDNA 2 GPU.

Ryzen 4000 series link:
https://en.wikipedia.org/wiki/Ryzen#Mobile_3

Who said the entire system is 200w, LePhuronn? lol that's not the case. It's only the GPU that is rumoured to be 200w in the Xbox. The CPU is, as you mentioned, quite efficient - most likely between 25w and 40w at all-core clocks. And then another 30w is for the system memory.

We know for a fact the PSU inside the new Xbox is a 350w unit. Consoles have historically always overbuilt the PSU, but it's never far out, and it's been consistent across Sony and MS - given the PSU is 350w, it's a very safe assumption that the console will draw 300w from the wall.
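That PSU-sizing logic is easy to sketch (the 350w rating is the claim above; the headroom fraction is my assumption, not an official figure):

```python
# Estimate sustained wall draw from a console's PSU rating.
# Consoles overbuild the PSU slightly; assume ~15% of the rating is left as margin.
PSU_RATING_W = 350.0
HEADROOM = 0.15  # assumed safety margin, not from Microsoft

def estimated_wall_draw(psu_rating_w: float, headroom: float = HEADROOM) -> float:
    """Rough sustained wall draw, assuming the PSU isn't run at 100% of its rating."""
    return psu_rating_w * (1.0 - headroom)

print(estimated_wall_draw(PSU_RATING_W))  # 297.5 W -> roughly the 300w guessed above
```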
 
Soldato
Joined
21 Jul 2005
Posts
20,093
Location
Officially least sunny location -Ronskistats
Don't you just miss the days when a high end GPU 'merely' cost £200-300? Nvidia.

Absolutely! Some on here either aren't old enough to have experienced the discrete cards of the 90s, or they're happy to pay the asking price - which is what scorn is being poured on, thanks to the "it's their money, let them spend it" movement.

I used to upgrade often but have family and other things to prioritise - but even still, when I was young and free I would regularly buy a GPU for £100-200, and if I was doing a big one the high end would have cost £300.

Somewhere along the line (I wasn't paying attention) there has been a creep upwards. There's more choice, but really it's down to using binned hardware that can't run at full spec, so it's locked down and sold as a weaker unit. I'm not against that, it's standard practice, but the prices seem to have got out of hand - lots of refreshes and pointless extra flavours. So you have a stupid flagship like the Titan, then the ear-bleeding chief gamer card like the 2080 Ti. Why they seem to command near or over a grand is anyone's guess, but if people are buying them then there's not much you can do about it.
 
Last edited by a moderator:
Soldato
Joined
26 Sep 2010
Posts
7,164
Location
Stoke-on-Trent
Who said the entire system is 200w, LePhuronn? lol thats not the case. It's only the GPU that is rumoured to be 200w in the Xbox.
It's been discussed that the barrel jack on the console indicates a max power brick of 240W. It's speculation of course because the pictures were never confirmed, nor have we seen real hardware. Now granted that could be a 19V jack, but have you seen the size of 19V 20A power bricks? And even in millions of units they're going to be a little pricey. So yeah, it's possible to feed the XSX 380W, but is that actually practical? 200W-240W for the entire console seems more realistic.
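The barrel jack arithmetic works out like this (the voltage and current figures are the speculation above, not confirmed hardware specs):

```python
# Max deliverable power for a DC barrel jack is just P = V x I.
def brick_power_w(volts: float, amps: float) -> float:
    """Maximum power a DC power brick can deliver at a given voltage and current."""
    return volts * amps

print(brick_power_w(12, 20))  # 240 W - the 'max power brick' estimate for a 12V jack
print(brick_power_w(19, 20))  # 380 W - the big, pricey 19V option discussed above
```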
 
Soldato
Joined
6 Feb 2019
Posts
17,648
It's been discussed that the barrel jack on the console indicates a max power brick of 240W. It's speculation of course because the pictures were never confirmed, nor have we seen real hardware. Now granted that could be a 19V jack, but have you seen the size of 19V 20A power bricks? And even in millions of units they're going to be a little pricey. So yeah, it's possible to feed the XSX 380W, but is that actually practical? 200W-240W for the entire console seems more realistic.

240w psu wouldn't make any sense.

That's the same as the current Xbox, and that has a small fan and is cooled easily - yet they've made the new Xbox much bigger with a massive fan. No reason to do that if the system isn't drawing much more power.
 
Soldato
Joined
26 Sep 2010
Posts
7,164
Location
Stoke-on-Trent
no reason to do that if the system isn't drawing much more power
Thermal density of a 7nm APU plus that screaming fast SSD that is going to be at significant load pretty much all the time? That'll need some decent cooling (hence the big-ass vapour chamber).

Now I didn't know this, but there's a YouTuber called Austin Evans who got his hands on a prototype at the end of June, and he says the PSU will be a 300W unit. I doubt Microsoft will push that PSU to 100% all the time, so assuming this prototype is representative of the final unit, we're looking at 250W-ish at sustained full load? But still, that's largely academic to my point to g67575: if the entire console is using X amount of power and performs similarly to the 2080 Super, what would these same CUs be capable of when solely given the same power budget on a discrete card?
 
Soldato
Joined
6 Feb 2019
Posts
17,648
Thermal density of a 7nm APU plus that screaming fast SSD that is going to be at significant load pretty much all the time? That'll need some decent cooling (hence the big-ass vapour chamber).

My PCIe 4 SSD is already faster than the one in the Series X and it doesn't get very hot - about 40c under load compared to a PCIe 3 SSD which gets to 30c. Sure there's extra heat and a few extra watts, but it's not that significant I would have thought.
 
Soldato
Joined
6 Feb 2019
Posts
17,648
But is it under full load all of the time?

No, but neither will the consoles be - data isn't streaming all the time, games stream data in bursts. That's why the consoles need a custom controller to improve the random 4K reads - constant data transfer isn't important.
 
Associate
Joined
12 Jul 2020
Posts
288
Absolutely! Some on here either aren't old enough to have experienced the discrete cards of the 90s, or they're happy to pay the asking price - which is what scorn is being poured on, thanks to the "it's their money, let them spend it" movement.

I used to upgrade often but have family and other things to prioritise - but even still, when I was young and free I would regularly buy a GPU for £100-200, and if I was doing a big one the high end would have cost £300.

Somewhere along the line (I wasn't paying attention) there has been a creep upwards. There's more choice, but really it's down to using binned hardware that can't run at full spec, so it's locked down and sold as a weaker unit. I'm not against that, it's standard practice, but the prices seem to have got out of hand - lots of refreshes and pointless extra flavours. So you have a stupid flagship like the Titan, then the ear-bleeding chief gamer card like the 2080 Ti. Why they seem to command near or over a grand is anyone's guess, but if people are buying them then there's not much you can do about it.
My first GPU was the Geforce 4 Ti 4800 SE - only cost me £100. That was then followed by the 8800 GTS (£200), AMD 6950 (£220), GTX 970 (£200), and then... the RTX 2080 (£600). That for me is the absolute limit and I won't ever spend a penny more, regardless of how 'fancy' said GPU might be.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
The big price increases happened around the time of the release of the R9 290 and R9 390 cards, so around the end of 2013. I remember thinking how much the prices had gone up, and feeling like £300 was the limit for me (previously ~£200 was more normal). Of course, things change...

One way of looking at it is, you should at least be getting a similar amount of GPU transistors per $/£, and improved cooling in most cases :D

It does seem ridiculous that a single GPU chip can often cost more than all the other parts of your PC added together.
 
Last edited:
Soldato
Joined
21 Jul 2005
Posts
20,093
Location
Officially least sunny location -Ronskistats
My first GPU was the Geforce 4 Ti 4800 SE - only cost me £100. That was then followed by the 8800 GTS (£200), AMD 6950 (£220), GTX 970 (£200), and then... the RTX 2080 (£600). That for me is the absolute limit and I won't ever spend a penny more, regardless of how 'fancy' said GPU might be.

I can't recall every one I got, but I do remember using SLI/Crossfire a few times, i.e. buying a card then a year later picking up the same one for much cheaper - shame they think this is not worth it these days. I went Voodoo 3 3000 > GeForce2 MX > FX 5600 for the earlier ones.

You can blame inflation and the weak pound etc., but it's not normal to have jumped so much just for one component.
 
Soldato
Joined
26 Sep 2010
Posts
7,164
Location
Stoke-on-Trent
No, but neither shall the consoles be - data isn't streaming all the time, games' stream data in bursts - that's why the consoles need a custom controller to improve the random 4k reads - constant data transfer isn't important.
Yeah, but that's the thing, isn't it? OK, maybe more the PS5 than the XSX, but the big hype about these consoles is utilising the speed of the SSD to continually stream assets. They're going to be worked harder than in a PC, and those controllers are going to heat up.
 
Soldato
Joined
21 Jul 2005
Posts
20,093
Location
Officially least sunny location -Ronskistats
Yeah, but that's the thing, isn't it? OK, maybe more the PS5 than the XSX, but the big hype about these consoles is utilising the speed of the SSD to continually stream assets. They're going to be worked harder than in a PC, and those controllers are going to heat up.

From what I see of Ryzen, it works by micro-ramping the core speed up and then sleeping - which saves power and stops heat building up so fast. I can imagine, as the consoles are all AMD hardware, they would have this mindset; otherwise you will get toasty boxes and a refresh of the classic RROD we saw from Microsoft back in the day.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
There's some more info about the X series RDNA2 GPU here:
https://www.techpowerup.com/271115/...rdna2-architecture-and-zen-2-cpu-enhancements

It has 15.3 billion transistors, but I think this includes the other parts of the SoC, like the CPU cores, and the ray tracing hardware called an 'intersection engine'.
It is built with the TSMC N7 Enhanced fabrication process - I think it's very likely the rest of the RDNA 2 GPUs will be built with the same 7nm process.

Considering it is 'only' 15.3 billion transistors, and at least 3-4 billion of these will probably be used for the CPU cores and ray tracing HW, I think this makes it a bit more difficult for AMD to double the transistor count of Navi 10 (RX 5700 XT) with RDNA 2 GPUs.

Also, still no official word about how much power the GPU will consume.
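As a quick sanity check on that transistor budget (the CPU/RT share is the guess above; Navi 10's 10.3 billion transistor count is AMD's published figure):

```python
# Transistor budget sketch for the Series X SoC.
SOC_TRANSISTORS_B = 15.3      # whole SoC, billions (TechPowerUp figure)
CPU_RT_ESTIMATE_B = 3.5       # assumed share for Zen 2 cores + RT HW (midpoint of 3-4B guess)
NAVI10_TRANSISTORS_B = 10.3   # RX 5700 XT die, for comparison

gpu_share = SOC_TRANSISTORS_B - CPU_RT_ESTIMATE_B
print(gpu_share)                         # ~11.8B left for the GPU portion
print(gpu_share / NAVI10_TRANSISTORS_B)  # ~1.15x Navi 10, well short of double
```

So on these (very rough) numbers, the Series X GPU portion is nowhere near twice a Navi 10, which is the point about doubling being difficult.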
 
Last edited:
Soldato
Joined
6 Feb 2019
Posts
17,648
There's some more info about the X series RDNA2 GPU here:
https://www.techpowerup.com/271115/...rdna2-architecture-and-zen-2-cpu-enhancements

It has 15.3 billion transistors, but I think this includes the other parts of the SoC, like the CPU cores, and the ray tracing hardware called an 'intersection engine'.
It is built with the TSMC N7 Enhanced fabrication process - I think it's very likely the rest of the RDNA 2 GPUs will be built with the same 7nm process.

Considering it is 'only' 15.3 billion transistors, and at least 3-4 billion of these will probably be used for the CPU cores and ray tracing HW, I think this makes it a bit more difficult for AMD to double the transistor count of Navi 10 (RX 5700 XT) with RDNA 2 GPUs.

Also, still no official word about how much power the GPU will consume.

175w is the gpu
 