
The thread which sometimes talks about RDNA2

Status
Not open for further replies.
Soldato
Joined
7 Jul 2004
Posts
7,091
Location
Gloucestershire
I don't see many people talking about power consumption, but how about the power difference between the 6800 XT and the 3080?

Average Gaming Power Consumption [chart]

Peak Gaming Power Consumption [chart]

Oh no, it will cost 1p per hour more to play games with the 3080!!! :eek:
 
Caporegime
Joined
12 Jul 2007
Posts
37,323
Location
United Kingdom
I suggest RT on consoles is going to be extremely minimal. Like shadows only or a couple of puddles.

You can use your sliders all day long to add more rays or whatever, but if the consoles aren't going to make much use of RT (if) then don't expect your 3090 to transform the game into a fully ray-traced showcase.

Whatever (minimal) RT gets added to console games, the 6000 series will be more than capable of dealing with it.

I just expect the amount of RT added to games to remain minimal this console gen. We'll need the PS6 (or beyond) to get a really extensive amount of RT in games.

Your 3090 might look a tiny bit better, but unless nVidia invest heavily - basically paying the devs to fuss over RT for PC gamers - then I wouldn't expect much in the way of RT this console generation.

Like I said, a few shadows here, a few puddles there, and that'll be about it.
This is one for all the dads out there, but I am personally looking forward to watching Peppa Pig sing about jumping up and down in ray-traced muddy puddles.
 
Soldato
Joined
8 Jun 2018
Posts
2,643
Ray tracing will always be minimal in games for this gen of gaming, imo.

It is clear that RT offers very little in image quality over rasterisation, unless you take still shots, stand still and look at a particular glass surface or shadow, etc.

There is no obvious, fundamental improvement in image quality with ray tracing enabled, except that you take a performance penalty, or have to reduce the resolution and upscale it.
 
Associate
Joined
9 May 2007
Posts
1,283
I don't see many people talking about power consumption, but how about the power difference between the 6800 XT and the 3080?

Average Gaming Power Consumption [chart]

Peak Gaming Power Consumption [chart]

320 watts, or the same as the 3080 stock. The RTX 3080 FE (https://www.techpowerup.com/gpu-specs/geforce-rtx-3080.c3621) has a stock BIOS TDP of 320 watts; the 6800 XT's stock BIOS is 300 watts TDP.

[chart: maximum power draw]

Even so, the RTX 3080 will pull more than 320 watts.

[chart: maximum power draw]

A difference of 52 watts, or about that of a light bulb.
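As a sanity check on the "1p per hour" quip earlier in the thread, the extra running cost of a ~52 W gap is easy to work out. A quick sketch; the electricity rate here is an assumption (roughly a typical 2020 UK tariff), not a figure from the thread:

```python
# Rough running-cost difference for a ~52 W gap in average gaming power draw.
# The 19p/kWh rate is an assumed, typical 2020 UK tariff.
extra_watts = 52
rate_pence_per_kwh = 19.0

# 52 W for one hour is 0.052 kWh.
extra_pence_per_hour = (extra_watts / 1000.0) * rate_pence_per_kwh
print(f"~{extra_pence_per_hour:.2f}p per hour")  # roughly 1p/hour
```

So at a plausible tariff the difference really does land at about a penny an hour of gaming.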
 
Soldato
Joined
17 Aug 2009
Posts
9,225
Nowhere in my statement did I mention that having ray tracing/DLSS support directly correlates to the enjoyment of the game; I was merely referencing the performance of said game without it. You can still enjoy a game even if it runs like a bag of rocks. However, if I was to drop 600-700 quid on a brand new graphics card I would want it to run games with all the bells and whistles.

You mentioned hyperbole? See your previous post.

Huh. That doesn't seem to be what I was replying to.

I suspect come release day for Cyberpunk 2077 there will be a few tears in here and much back-pedalling from the "I don't care about RT/DLSS" squad.

But at least it's clear you personally value RT and DLSS very much even if it's questionable to project that on others.
 
Associate
Joined
9 May 2007
Posts
1,283
I suggest RT on consoles is going to be extremely minimal. Like shadows only or a couple of puddles.

You can use your sliders all day long to add more rays or whatever, but if the consoles aren't going to make much use of RT (if) then don't expect your 3090 to transform the game into a fully ray-traced showcase.

Whatever (minimal) RT gets added to console games, the 6000 series will be more than capable of dealing with it.

I just expect the amount of RT added to games to remain minimal this console gen. We'll need the PS6 (or beyond) to get a really extensive amount of RT in games.

Your 3090 might look a tiny bit better, but unless nVidia invest heavily - basically paying the devs to fuss over RT for PC gamers - then I wouldn't expect much in the way of RT this console generation.

Like I said, a few shadows here, a few puddles there, and that'll be about it.

Spider-Man is like Control, with tweaks to stay within the frame budget. Tweaks that can be changed for better hardware, making the game look much better and require much more performance.

 
Caporegime
Joined
17 Feb 2006
Posts
28,680
Location
Cornwall
Ray tracing will always be minimal in games for this gen of gaming, imo.

It is clear that RT offers very little in image quality over rasterisation, unless you take still shots, stand still and look at a particular glass surface or shadow, etc.

There is no obvious, fundamental improvement in image quality with ray tracing enabled, except that you take a performance penalty, or have to reduce the resolution and upscale it.
We've also seen that it's very possible to make your games look *worse* with a shoddy RT implementation.

Like that screenshot with a mirror-finish TV perfectly reflecting the other side of the room, and people looking at it thinking, "What the? That's not how a TV behaves in real life. Isn't RT supposed to be more realistic?"
 
Soldato
Joined
12 May 2014
Posts
3,253
The whole Nvidia YouTube channel has more than two. Why don't you go there yourself and fix your own ignorance? Or maybe do a Google search and go to the long list on the Nvidia website.

https://www.nvidia.com/en-gb/geforce/rtx/
You made a claim; it is your job to back it up. Don't throw a hissy fit.

I don't see any reason to differentiate between RT that was added pre- or post-release. Ray tracing is one of the simplest methods of calculating rendering information, and with current real-time APIs trivialising many of the complexities behind it (fast ray intersections, interop with the traditional pipeline) it's perfectly suited to being "tacked on", considering the various independent stages in a typical rendering pipeline anyway. CP2077 is using RT (specular?) reflections, ambient occlusion, and shadows: all of those things could be added to most games post-release without too much development cost for a typical AAA studio, especially the latter two.
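On the "ray tracing is conceptually simple" point: the core operation is just solving a quadratic per ray. A minimal, illustrative Python sketch (all names here are made up for the example, not from any real API):

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    `direction` is assumed to be normalised (so the quadratic's a == 1).
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0.0 else None

# A ray fired down +z from the origin hits a unit sphere 5 units away at t = 4.
print(ray_sphere_intersect((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

The hard engineering is in doing billions of these against real scene geometry per second (which is what the BVH hardware in RT cores accelerates), not in the maths itself.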

Are you going to replay a game because it has RT? Are you going to buy a game you skipped because of RT? Do you think devs are going to take away resources from future games to assign it to an old game that has no future monetary value to the company? And implement it in a way that is better than turning it off?

It is not an easy "tack on". You need to go through and check how all the shaders look and respond to this new system, as well as debugging and performance-testing the entire game.
 
Soldato
Joined
13 Jun 2009
Posts
4,014
Location
My own head
Have to say, and I'm disappointed to say this: I've shopped with OcUK for something like 15 years, and today marked the day I stop shopping here, as well as directing others here.

Don't screw customers over so badly, guys; it's a really short-term view. Supply/demand etc., but if that's the case, how come some retailers are selling for £600? Really disappointed.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
@PrincessFrosty That's basically what the video shows. The developer just has to re-tweak the game for the new hardware. Reflections, for example, can have more objects and can be a higher resolution. You just have to have the basic feature in place, then you tweak it to meet the frame budget. The faster the hardware, the more you can render and still meet the performance target. If the hardware is not as good, then you can scale back each feature so you can still meet the frame budget.

Exactly. A feature set like RTX/DXR is implemented into the game/engine and typically is configured with a bunch of different parameters. So the hard work is the implementation, but telling it to run with reflections at 1/2 resolution or 1/4 resolution is just one setting. So the consoles will get RT support; it will just be low fidelity. Increasing that fidelity for the PC, by casting more rays at higher resolution for example, will be trivial.
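The "implement once, scale per tier" idea above can be sketched in a few lines. Everything here is hypothetical (the tier names, the preset values, and the crude cost model are illustrations, not from any real engine):

```python
# Hypothetical sketch: one RT implementation, per-tier parameters that
# scale its cost to fit each platform's frame budget.
RT_PRESETS = {
    # tier:         (reflection resolution scale, rays per pixel)
    "console":      (0.25, 1),   # quarter-res reflections, minimal rays
    "midrange_pc":  (0.5,  2),
    "highend_pc":   (1.0,  4),   # full-res reflections, more rays
}

def reflection_cost_ms(scale, rays_per_pixel, base_cost_ms=8.0):
    """Crude relative cost model: ray count grows with the square of the
    resolution scale times the rays traced per pixel."""
    return base_cost_ms * (scale ** 2) * rays_per_pixel

for tier, (scale, rays) in RT_PRESETS.items():
    print(f"{tier}: ~{reflection_cost_ms(scale, rays):.1f} ms per frame")
```

The point is just that the same code path serves every tier; only the numbers in the preset table change.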

Yes. I remember debating this with you a couple of weeks ago.

What is telling is that at 4K, rasterisation-wise, the 6800 XT is not superseding the 3080, so it's unlikely, even if a game does demand 16GB of VRAM, that the 6800 XT is going to pull off anything tangible and spectacular over the Nvidia GPUs.

Exactly, it won't. It'll reach a GPU bottleneck before the 3080 does. Which makes me think that, without even seeing benchmarks for things like FS2020, Avengers and Crysis all at 4K Ultra, it's reasonable to predict it'll be crawling along at unplayable frame rates before it's even close to 10GB of VRAM use (I'll qualify that by saying 10GB of VRAM used, not allocated, of course :) )
 
Soldato
Joined
7 Jul 2004
Posts
7,091
Location
Gloucestershire
I can see by looking at your choice of components (sig) that you are not in the least bit concerned by high power draw haha. :p

I would like to swap to Ryzen but can't justify it from the 9900k. I'll wait until AMD release their new socket and make the switch at the start of their new socket cycle instead of the end.

As for the 3080, what can I say? Got it 2 days after launch for £670 and am very happy with it. When I spend that amount of money on a GPU I like it to be able to use all functions properly, including decent RT performance. Would be a bit bummed out if I bought a new card now for nearly £700 but couldn't use RT without dropping the resolution right down :(
 
Soldato
Joined
1 Apr 2014
Posts
14,731
Location
Aberdeen
I would like to swap to Ryzen but can't justify it from the 9900k.

SAM is not exclusive to Ryzen. It's just changing the BAR size, and that's in the PCIe spec. So a BIOS update should sort you out as soon as Intel can be bothered. Plus, right now it's unstable, even on Ryzen.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
Oh no, it will cost 1p per hour more to play games with the 3080!!! :eek:

The main thing to focus on is that a card that only uses 210W will be easier to cool and will potentially have more overclocking headroom when tweaked and played with, or put under water. More importantly, at 300W, Nvidia making a newer, faster, bigger card on the same node seems less likely, while 50-80W less power usage on average is genuine room to make, say, a 30% bigger die on the same node and still have a reasonable power budget.
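A quick back-of-envelope check of that headroom argument, under the crude assumption that power scales roughly linearly with die size on the same node (the numbers are the thread's illustrative figures, not measurements):

```python
# Back-of-envelope: does a 30% bigger die still fit a ~300 W budget,
# assuming (crudely) power scales linearly with die size on the same node?
current_power_w = 210    # the ~210 W average draw mentioned above
die_scale = 1.30         # hypothetical 30% bigger die

scaled_power_w = current_power_w * die_scale
print(f"~{scaled_power_w:.0f} W")  # comfortably under a 300-320 W flagship budget
```

Linear scaling is optimistic (bigger dies often clock lower or need more voltage), but it shows why the lower average draw leaves room for a larger same-node part.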
 