AMD RDNA3 unveiling event

I think you are doing Nvidia customers a disservice by saying they are uninformed. The 3080 Ti cost well over £1,200 through its lifecycle. People who will be buying a card like that, or a prebuilt with it (likely to have an astronomical price), will be doing their due diligence and know what the alternatives are. I can agree with you if we are talking about something which is £500 or lower, but a customer spending a grand on something knows what he is doing.
Never said all Nvidia customers are uninformed. I said I know people who only buy Nvidia based on what someone told them years ago. Never attacked or said anything about "all customers".

Also, having money does not necessitate knowing what you are talking about. If a football player decided he wanted a PC, do you think he would care what it cost or what it was? I have known people who have in the past gone for SLI Titan setups as soon as they were available for no reason other than that they could.
 
£500 and 30-40% faster than the RTX 3070 would be a winner for any 3070/Ti owner.
RX 7700 series?

There are many more 1440p gamers than 4K gamers on the Steam HW survey, so yes I agree, me being one of them.

 
They can't force AMD out with RT.

For it to matter, seriously matter, RT needs to be everywhere, cheaply accessible and game breaking if not working.

Instead it's niche, horribly expensive and typically optional. Which means the games are designed to be played without it. Which should absolutely be the case for a graphics toggle. But the point is RT pressure is trivial.

By the time it's more widely adopted AMD will have chugged along gradually improving its RT and chances are it's going to be just fine when RT becomes a meaningful factor.
Excellent argument, I agree.
 
You seem to think RT is such a big difference; IMO it isn't in the mid and lower tiers. Look at the poll on this forum: pricing matters in the lower tiers.
But you don't know the situation with the lower and mid tiers for both AMD and Nvidia with this generation. IMO both AMD and Nvidia are not going to have a strong mid and lower tier lineup this time around so pricing isn't going to look good on either side. People keep talking about the 4080 but if you look at the 7900XTX and 7900XT, what is the point of the 7900XT? Anyone who has the budget for the 7900XT will obviously go the whole hog for the XTX. If this is how the tiers are shaping up, performance is going to be almost in line with last gen once we get down to £500 territory and we are already getting that performance with the lower prices of the older generation.

My point even with the mid and lower tiers was that if two cards in the mid range are in the same price category in rasterisation, going for the card with RT and DLSS would make more sense, as those are features AMD is not as strong in, and something is better than nothing. Last time AMD at least had the VRAM argument going for it, but that's not the case this time.
 
They can't force AMD out with RT.

For it to matter, seriously matter, RT needs to be everywhere, cheaply accessible and game breaking if not working.

Instead it's niche, horribly expensive and typically optional. Which means the games are designed to be played without it. Which should absolutely be the case for a graphics toggle. But the point is RT pressure is trivial.

By the time it's more widely adopted AMD will have chugged along gradually improving its RT and chances are it's going to be just fine when RT becomes a meaningful factor.

This is by far the best summation of the state of RT without the gaslighting. Spot on.
 
But you don't know the situation with the lower and mid tiers for both AMD and Nvidia with this generation. IMO both AMD and Nvidia are not going to have a strong mid and lower tier lineup this time around so pricing isn't going to look good on either side. People keep talking about the 4080 but if you look at the 7900XTX and 7900XT, what is the point of the 7900XT? Anyone who has the budget for the 7900XT will obviously go the whole hog for the XTX. If this is how the tiers are shaping up, performance is going to be almost in line with last gen once we get down to £500 territory and we are already getting that performance with the lower prices of the older generation.

My point even with the mid and lower tiers was that if two cards in the mid range are in the same price category in rasterisation, going for the card with RT and DLSS would make more sense, as those are features AMD is not as strong in, and something is better than nothing. Last time AMD at least had the VRAM argument going for it, but that's not the case this time.

Think we should wait till the lower tiers' performance and pricing are out before carrying on with this topic :)
 
As per the Steam survey, the majority of gamers are on a 1080p 60Hz monitor. So should developers stop implementing 4K textures and higher refresh rates in games, and not support HDR, the latter of which needs at least a £1,000 investment to work properly? Just because the mid-range cannot run something efficiently does not mean we should hinder progress.

On the console side, Sony put the time and resources into implementing VRR, 120Hz and 1440p support, and the vast majority of that audience do not have an expensive TV/monitor which supports any of them. Tech advancement cannot be stopped just because something does not work on the majority of hardware.
 
As per the Steam survey, the majority of gamers are on a 1080p 60Hz monitor. So should developers stop implementing 4K textures and higher refresh rates in games, and not support HDR, the latter of which needs at least a £1,000 investment to work properly? Just because the mid-range cannot run something efficiently does not mean we should hinder progress.

On the console side, Sony put the time and resources into implementing VRR, 120Hz and 1440p support, and the vast majority of that audience do not have an expensive TV/monitor which supports any of them. Tech advancement cannot be stopped just because something does not work on the majority of hardware.
No one said stop developing; it's just that most people are not at the fringes of technology on either side, lowest or highest.
It makes zero sense to buy the very best when you're after a static experience across a number of years, and by static I mean a reasonably long lifetime thanks to an intelligent choice of hardware and screen.

This is the reason forums like this exist, and places like Tom's, LinusTechTips, Hexus... the list goes on, so that people get the best for their money.

And as much as we like to LARP about being the most in the know or the most intelligent, the majority of people are intelligent enough to get what suits them within reason, and for this I think we are in a better place as PC gamers.
 
AMD used most of its last-gen chips to make consoles, not GPUs, so that's hardly a shocker.
Which they did because they were not gaining market share in the PC space. They need a unique selling point for their cards in the PC market if they want to change that. I am upgrading to one of their 3D V-Cache chips in the Ryzen 7000 series once they come out because these CPUs genuinely wipe the floor with Intel in gaming. The 5800X3D is hanging with DDR5 setups in gaming. They need something like that to counter Nvidia. Market share will come then.
 
AMD need to leak some performance figures before the 4080 drops, especially if NV buyers are on the fence about a future 4080 Ti release devaluing the 4080.

I would also like to see 7900XTX RT performance at 1440p, as I have a feeling AMD could do well at 1440p.

Yeah, I gotta agree with you on that one. Sandbagging the performance is a good move by AMD, up to a point.

I think we have reached the point where a couple of leaks would help AMD out.
 
No one said stop developing; it's just that most people are not at the fringes of technology on either side, lowest or highest.
It makes zero sense to buy the very best when you're after a static experience across a number of years, and by static I mean a reasonably long lifetime thanks to an intelligent choice of hardware and screen.

This is the reason forums like this exist, and places like Tom's, LinusTechTips, Hexus... the list goes on, so that people get the best for their money.

And as much as we like to LARP about being the most in the know or the most intelligent, the majority of people are intelligent enough to get what suits them within reason, and for this I think we are in a better place as PC gamers.
I was just pointing out how people seem to have double standards about these technologies. HDR is very similar to ray tracing in that it only works properly on top-end displays, which the majority don't have even in the console space, but no one calls it irrelevant. 1440p, 120Hz and VRR are irrelevant on the PS5 if we go by the number of TVs supporting those features, and yet we have AAA titles like God of War Ragnarok, Forbidden West and Rift Apart supporting them. No one calls those technologies irrelevant. Ray tracing is just one of those things which works better the more money you pay, just like HDR. HDR on the PG32UQX works much better than on the AW3423DW and the Odyssey Neo G8, but it costs more than both combined. So if something is irrelevant because the 4060/7700XT does not support it, then we can also apply that logic to the other technologies.

If AMD were competing with Nvidia effectively in RT, we would be having none of these discussions. But I don't get why people crucify Nvidia for charging more when they have something their competitor does not (talking about the 4090 here).
 
As per the Steam survey, the majority of gamers are on a 1080p 60Hz monitor. So should developers stop implementing 4K textures and higher refresh rates in games, and not support HDR, the latter of which needs at least a £1,000 investment to work properly? Just because the mid-range cannot run something efficiently does not mean we should hinder progress.

On the console side, Sony put the time and resources into implementing VRR, 120Hz and 1440p support, and the vast majority of that audience do not have an expensive TV/monitor which supports any of them. Tech advancement cannot be stopped just because something does not work on the majority of hardware.
Rather suspect that for textures any graphic devs/visual artists will make their assets at 4K or higher and any big studio will have processes in place to scale those down. Ergo, the extra work is very little.
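
A rough sketch of what such a scale-down step looks like (my illustration, not any particular studio's asset pipeline): straightforward 2x2 box-filter averaging of the kind an automated texture build step performs.

```cpp
// Illustrative only: generate successively smaller versions of a texture by
// averaging 2x2 blocks. Real pipelines add filtering options, sRGB handling,
// compression, etc., but the extra authoring cost of "make the 4K asset,
// ship smaller ones too" is essentially this.
#include <cstdio>
#include <vector>

struct Image {
    int w, h;
    std::vector<float> px;  // single channel, row-major, for brevity
    float at(int x, int y) const { return px[static_cast<size_t>(y) * w + x]; }
};

// Halve each dimension by averaging 2x2 blocks (assumes power-of-two sizes).
Image downsample(const Image& src) {
    Image dst{src.w / 2, src.h / 2,
              std::vector<float>(static_cast<size_t>(src.w / 2) * (src.h / 2))};
    for (int y = 0; y < dst.h; ++y)
        for (int x = 0; x < dst.w; ++x)
            dst.px[static_cast<size_t>(y) * dst.w + x] =
                (src.at(2 * x, 2 * y)     + src.at(2 * x + 1, 2 * y) +
                 src.at(2 * x, 2 * y + 1) + src.at(2 * x + 1, 2 * y + 1)) / 4.0f;
    return dst;
}

int main() {
    Image mip{4096, 4096, std::vector<float>(4096ull * 4096, 0.5f)};
    while (mip.w > 1) {
        std::printf("generated %dx%d\n", mip.w, mip.h);
        mip = downsample(mip);
    }
}
```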

Likewise, higher refresh rates are no more work unless, as with some old Bethesda games, the physics engine is synced to a max of 60Hz.

HDR I am less sure about, but I suspect in the end it's similar to the texture situation.

Point being, adding all those things costs big studios very little and carries little risk.

Making a whole game and engine with RT-first or RT-only on the other hand?

Well, that would be a hard sell for the majority of the market (consoles) and would limit them on the PC market too.

EDIT:
Which they did because they were not gaining market share in the PC space.
Well, we don't know what kind of generous contract Sony and Microsoft had, but when 7nm was so seriously supply constrained and AMD were losing high value CPU sales, I'm sure they would have preferred to sell higher margin stuff like CPUs, and even their own GPUs rather than making millions of consoles at paltry margins.

Console margins are terrible.
 
I was just pointing out how people seem to have double standards about these technologies. HDR is very similar to ray tracing in that it only works properly on top-end displays, which the majority don't have even in the console space, but no one calls it irrelevant. 1440p, 120Hz and VRR are irrelevant on the PS5 if we go by the number of TVs supporting those features, and yet we have AAA titles like God of War Ragnarok, Forbidden West and Rift Apart supporting them. No one calls those technologies irrelevant. Ray tracing is just one of those things which works better the more money you pay, just like HDR. HDR on the PG32UQX works much better than on the AW3423DW and the Odyssey Neo G8, but it costs more than both combined. So if something is irrelevant because the 4060/7700XT does not support it, then we can also apply that logic to the other technologies.

If AMD were competing with Nvidia effectively in RT, we would be having none of these discussions. But I don't get why people crucify Nvidia for charging more when they have something their competitor does not (talking about the 4090 here).
I crucify based on melting PSU cables, the need to replace my perfectly good MSI 850W PSU I have owned for just under a year, and the fact they cost way, way too much for their performance.

Factors in AMD's favor right now: they improved OpenGL and DX11 to the point it is a non-issue (the main problem I had), and drivers have maintained stability for the majority ever since the 5700 XT.

I will still wait to see how drivers are.
 
Rather suspect that for textures any graphic devs/visual artists will make their assets at 4K or higher and any big studio will have processes in place to scale those down. Ergo, the extra work is very little.

Likewise, higher refresh rates are no more work unless, as with some old Bethesda games, the physics engine is synced to a max of 60Hz.

HDR I am less sure about, but I suspect in the end it's similar to the texture situation.

Point being, adding all those things costs big studios very little and carries little risk.

Making a whole game and engine with RT-first or RT-only on the other hand?

Well, that would be a hard sell for the majority of the market (consoles) and would limit them on the PC market too.
You don't need the engine to be RT-first. Turning on any single RT effect, as minuscule as it may be, crushes performance on the AMD cards. If that wasn't enough, FSR performance mode doesn't look as good as DLSS when upscaling from a lower resolution, especially in motion, so you can't use it as a crutch to get things running.

HDR needs a lot of work to get right. Many games I play have raised blacks, indicating the devs are simply passing the SDR data in an HDR container. I use Special K injection and Windows Auto HDR in such cases.
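
Side note on the raised blacks: here's a toy calculation (purely illustrative, not a claim about what any specific game is doing) of how even a simple transfer-function mismatch lifts the shadows; content graded for a pure 2.2 power curve but decoded with the sRGB piecewise curve comes out noticeably brighter near black.

```cpp
// Toy comparison of two SDR decode curves. Near-black codes come out brighter
// through the sRGB piecewise EOTF than through a pure 2.2 power curve, which
// is one classic way shadows end up "lifted" when content is re-wrapped with
// the wrong assumptions. Numbers are relative luminance (0..1).
#include <cmath>
#include <cstdio>

double eotfPower22(double v) { return std::pow(v, 2.2); }

double eotfSRGBPiecewise(double v) {            // IEC 61966-2-1 decode curve
    return v <= 0.04045 ? v / 12.92
                        : std::pow((v + 0.055) / 1.055, 2.4);
}

int main() {
    std::puts("code    2.2 power   sRGB piecewise   ratio");
    for (double code : {0.02, 0.05, 0.10, 0.25, 0.50}) {
        double a = eotfPower22(code);
        double b = eotfSRGBPiecewise(code);
        std::printf("%.2f    %.5f     %.5f          %.2fx\n", code, a, b, b / a);
    }
}
```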

Also, you are underestimating the time and effort it takes to implement 120Hz, VRR and 1440p on the consoles. To this day, VRR on the PS5 is essentially broken to me as it doesn't work below 48 FPS unless the developer takes the time and effort to implement their own LFC (low framerate compensation) solution. All Sony first-party studios do this but third-party devs don't bother.
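
Rough sketch of the frame-repeat logic I mean by LFC (toy numbers for a hypothetical 48-120Hz VRR window, nothing platform-specific):

```cpp
// Minimal sketch of low framerate compensation: if the game's frame rate falls
// below the display's minimum VRR rate, scan out each frame multiple times so
// the refresh interval stays inside the VRR window. Illustrative only; real
// implementations live in the driver/display pipeline.
#include <cstdio>

constexpr double kMinHz = 48.0;   // hypothetical VRR window, e.g. an HDMI 2.1 TV
constexpr double kMaxHz = 120.0;

// How many times to repeat a frame so fps * repeats lands inside [kMinHz, kMaxHz].
int repeatsFor(double fps) {
    int repeats = 1;
    while (fps * repeats < kMinHz && fps * (repeats + 1) <= kMaxHz)
        ++repeats;
    return repeats;
}

int main() {
    for (double fps : {90.0, 60.0, 40.0, 30.0, 20.0}) {
        int r = repeatsFor(fps);
        std::printf("%.0f fps -> repeat each frame %d time(s), panel refresh %.0f Hz\n",
                    fps, r, fps * r);
    }
}
```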

On the PC side, just look at games like AC and BF2042. Horrible optimization. Unless you have a top-end CPU and GPU, good luck getting them to run without stuttering. Far more difficult than you think.
 
Rather suspect that for textures any graphic devs/visual artists will make their assets at 4K or higher and any big studio will have processes in place to scale those down. Ergo, the extra work is very little.

Likewise, higher refresh rates are no more work unless, as with some old Bethesda games, the physics engine is synced to a max of 60Hz.

HDR I am less sure about, but I suspect in the end it's similar to the texture situation.

Point being, adding all those things costs big studios very little and carries little risk.

Making a whole game and engine with RT-first or RT-only on the other hand?

Well, that would be a hard sell for the majority of the market (consoles) and would limit them on the PC market too.


EDIT:

Well, we don't know what kind of generous contract Sony and Microsoft had, but when 7nm was so seriously supply constrained and AMD were losing high value CPU sales, I'm sure they would have preferred to sell higher margin stuff like CPUs, and even their own GPUs rather than making millions of consoles at paltry margins.

Console margins are terrible.

Have a read of developers' comments on ray tracing; the main thing for RT is to reduce development costs and time.

Taken from 4A's Enhanced Edition write-up:


We saw many strides in performance during this phase of console optimization, many of which gave us a cause to rethink our approaches to certain solutions. We’ve remained very conscious of the fact that we were aiming to have the consoles providing a consistent 60FPS experience for this title, and, with that in the back of our minds, the gradual performance improvements allowed us to also include more and more features. Despite superficial differences in the actual layout and approach of the platform-specific Application Programming Interfaces (APIs), all platforms are now running remarkably similar CPU and GPU code, and have managed to maintain a very consistent feature set.

The groundwork has been laid though and we have successfully brought a product to the 9th generation consoles complete with essentially our entire Ray Tracing feature set. This sets a baseline for this generation’s future projects. We mentioned that we had initially thought of some features as potential fallback solutions to be maintained alongside superior and steadily evolving equivalents on PC. This wasn’t the case, but it could have been, if the consoles weren’t as good as they are. If it had been the case, then so many members of the team would have been hit by the massive increase in workload that comes with working with two separate and distinct systems in parallel. Instead, we now have a new set of standards to base our work on that are consistent across all target platforms.

As it stands then, we can say for sure that projects of this generation, across all targeted platforms, will be based off of this raytraced feature set. That is great news for the end result: it is allowing us to produce scenes with the highest level of graphical fidelity we have ever achieved and that is what the public gets to see, though these features are just as important behind the scenes.

There is a reason why we have always been so vocally critical of the idea of baking assets (pre-generating the results of things like lighting calculations) and shipping them as immutable monoliths of data in the games package files, rather than generating as much as possible on the fly: everything that you pre-calculate is something that you are stuck with. Not “stuck with” in the sense that if it is wrong it can’t be fixed (everyone loves a 50GB patch after all) but “stuck” in a much more limiting sense – that any part of your game, any object in the scene that relies on baked assets will be static and unchanging. You won’t be able to change the way it is lit so you have to be overly cautious with decisions about how the player can affect dynamic lights (you won’t be able to move it, so you disable physics on as much as possible), and the player can’t interact with it in interesting ways, so you pass that problem onto UX design.

The more you have to rely on baked assets for your scenes, the more you restrict your game design options, and the more you take the risk that your environments will feel rigid and lifeless. Perhaps, the biggest advantage that Ray Tracing brings is that it gives game developers a huge boost in the direction of worlds that are truly, fully dynamic, with no dependencies on pre-computed assets whatsoever. There are still similar examples where such problems need to be solved, but lighting was one of the biggest and most all-encompassing examples of the lot and Ray Tracing solves it.

It doesn’t just come down to what you end up shipping either. Game development is by its very nature an iterative process. You need to have some plan for where you want to take a project from the start, of course, but once you begin working on assets and developing features you always test them as part of the larger context of the game, and more often than not this leads to tweaks, refinements, and balancing. Even that might not be the end of it, other features can come along changing the experience in myriad different ways leading to yet more alterations and adjustments. For this process to work developers need an environment to work in that is intuitive, responsive, and as close a representation of the main game as possible. Our editor has always run basically the same simulation as the final game, but technological advancements seen in this generation streamlined the process significantly. Testing environments are quicker to set up with fewer dependencies on assets or on the work of other departments. Changes in visual design are visible (in their final form) immediately. The physical simulation on the whole feels more like a sandbox in which ideas can be tested and iterated upon. This makes for a more comfortable and fluent development experience, more conducive to creativity and experimentation.



In terms of HDR, it probably requires more time, well, if you want to get it right anyway (which very FEW games do, at least in terms of 100% accuracy... there are a lot of games that look good with HDR, such as CP2077 and RDR2, but they aren't quite right in some areas, whereas ones like HZD, Gears of War and Days Gone are perfect).
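
And to make the quoted point about baked lighting concrete, a tiny conceptual sketch (mine, not 4A's code): the baked surface returns whatever was computed at build time, so moving the light changes nothing, while the dynamic version re-evaluates the light every frame, standing in for what a ray-traced query gives you.

```cpp
// Conceptual illustration of baked vs. dynamic lighting (not 4A's code).
// The baked value was fixed offline, so moving the light at runtime changes
// nothing; the dynamic surface recomputes from the light's current position,
// a stand-in for a per-frame ray/visibility query.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

double distance(Vec3 a, Vec3 b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

// Simple inverse-square falloff as a stand-in for "evaluate the light now".
double lightIntensity(Vec3 surfacePos, Vec3 lightPos) {
    double d = distance(surfacePos, lightPos);
    return 1.0 / (d * d);
}

struct BakedSurface {
    double storedIntensity;                            // frozen at build time
    double shade(Vec3 /*lightNow*/) const { return storedIntensity; }
};

struct DynamicSurface {
    Vec3 position;
    double shade(Vec3 lightNow) const { return lightIntensity(position, lightNow); }
};

int main() {
    Vec3 surfacePos{0, 0, 0};
    Vec3 lightAtBuildTime{0, 2, 0};
    Vec3 lightAfterPlayerMovedIt{0, 8, 0};

    BakedSurface baked{lightIntensity(surfacePos, lightAtBuildTime)};
    DynamicSurface dynamic{surfacePos};

    std::printf("baked:   %.4f  (ignores the moved light)\n",
                baked.shade(lightAfterPlayerMovedIt));
    std::printf("dynamic: %.4f  (reacts to the moved light)\n",
                dynamic.shade(lightAfterPlayerMovedIt));
}
```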

 
You don't need the engine to be RT-first. Turning on any single RT effect, as minuscule as it may be, crushes performance on the AMD cards. If that wasn't enough, FSR performance mode doesn't look as good as DLSS when upscaling from a lower resolution, especially in motion, so you can't use it as a crutch to get things running.

HDR needs a lot of work to get right. Many games I play have raised blacks, indicating the devs are simply passing the SDR data in an HDR container. I use Special K injection and Windows Auto HDR in such cases.

Also, you are underestimating the time and effort it takes to implement 120Hz, VRR and 1440p on the consoles. To this day, VRR on the PS5 is essentially broken to me as it doesn't work below 48 FPS unless the developer takes the time and effort to implement their own LFC (low framerate compensation) solution. All Sony first-party studios do this but third-party devs don't bother.

On the PC side, just look at games like AC and BF2042. Horrible optimization. Unless you have a top-end CPU and GPU, good luck getting them to run without stuttering. Far more difficult than you think.

Not to mention, Nvidia have provided a lot of plugins for game engines to enable ray tracing with ease, along with many guides on how to get the best from ray tracing.

In fact, so have AMD and Intel.

EDIT:

It's things like this where AMD also need to be doing something similar:

Vulkan 1.3.233 Released With Three New NVIDIA Extensions


The new Vulkan extensions from NVIDIA are VK_NV_memory_decompression, VK_NV_ray_tracing_invocation_reorder, and VK_NV_copy_memory_indirect. Aside from these new extensions there are also the routine documentation corrections/clarifications.

VK_NV_memory_decompression is a new extension that as implied by the name is for performing memory-to-memory decompression. That's it and is just publicly documented as a one-line spec.

VK_NV_ray_tracing_invocation_reorder allows for having more control over the Vulkan ray-tracing pipeline for reordering for locality. The new NVIDIA interface provides a hit object that can provide result information from the hit and can be used as part of the explicit sorting plus other enhancements to enhance the ray-tracing invocation reordering.

Lastly the third NVIDIA extension with today's update is VK_NV_copy_memory_indirect. The VK_NV_copy_memory_indirect extension is used for performing copies between memory and image regions using indirect parameters. The indirect parameters are read by the device from a buffer during execution. The VK_NV_copy_memory_indirect extension can be useful for performing copies where the copy parameters aren't known during command buffer creation time.

i.e. putting the effort in now so that when RT does really take off, they aren't left behind scrambling for several years to get on par with Nvidia.
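
For a rough feel of what the indirect-copy idea buys you, here's a conceptual analogue (my sketch; it is not the actual VK_NV_copy_memory_indirect API, which should be checked against the Vulkan registry): the copy parameters live in a buffer that is only read at execution time, so something earlier on the GPU can decide what gets copied after the command was recorded.

```cpp
// Conceptual analogue only (not the real VK_NV_copy_memory_indirect entry
// points): the interesting property is that the copy regions are *data in a
// buffer*, read when the copy executes, rather than values baked into the
// command stream when it was recorded.
#include <cstdio>
#include <cstring>
#include <vector>

struct IndirectCopyParams {    // stand-in for one per-copy parameter record
    size_t srcOffset;
    size_t dstOffset;
    size_t size;
};

// "Execution time": parameters are read only here, so they could have been
// written by an earlier (e.g. GPU-side) pass after the copy was requested.
void executeIndirectCopies(const std::vector<IndirectCopyParams>& params,
                           const std::vector<char>& src, std::vector<char>& dst) {
    for (const IndirectCopyParams& p : params)
        std::memcpy(dst.data() + p.dstOffset, src.data() + p.srcOffset, p.size);
}

int main() {
    std::vector<char> src = {'r', 'a', 'y', ' ', 't', 'r', 'a', 'c', 'e', 'd'};
    std::vector<char> dst(10, '.');

    // Not known when the copy was originally recorded; filled in "later".
    std::vector<IndirectCopyParams> params = {{0, 0, 3}, {4, 4, 6}};

    executeIndirectCopies(params, src, dst);
    std::printf("%.*s\n", static_cast<int>(dst.size()), dst.data());
}
```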
 
They can only keep doing this as long as they have a unique selling point for their cards. Raster performance within 5-10% of each other might as well be an error margin unless we are talking about games which are right on the cusp of 60 FPS, which in my experience with the 3080 Ti was only in RT games. When performance is so similar, the differentiators are price and features. When you enter halo card territory, the price becomes kind of irrelevant so it's a war of features. At the mid-range, the price is the differentiator, but here you have an Nvidia card which can somewhat run RT games with 1-2 effects turned on and has DLSS in addition to FSR, while with the AMD card, RT will be inferior and you won't get DLSS.

AMD needs a unique selling point for their cards if they want to gain market share. Just tailgating Nvidia in rasterisation while charging lower prices won't cut it because now the market thinks Nvidia is the more premium option with more features.
AMD are not tailgating Nvidia on raster though; at $900 the raster performance for Ada was somewhere between a 3080 and a 3090, while AMD is looking at delivering 50% more. That card is now set to be cut down further and relaunched as a 4070 Ti, but I doubt it'll cost less than $700, so it will be up against the 7800XT, which should easily be 30%+ faster in raster; that's almost a generational lead.
 
Not directed at you, but we will be doing our best not to be locking threads going forward. The guilty party will go straight to strike/suspension and then loss of access to the section. We have tried to let things flow with warnings but some think they are above it...

Ah OK, that clears it up then, thanks.
 