
AMD RDNA3 unveiling event

Yes it's worth it. I don't have a problem with companies charging halo prices for halo products.

Intel's competitive comeback comes at a cost, you know. AMD are earning more per sale than Intel are, and Intel fab their own CPUs. Pat keeps crying about how they have to get the margins up; he's just gone to Mr Biden for a $52 billion bailout because they are skint.
 
On upscaling tech, AMD will have the same fake frame generation garbage Nvidia have soon... it might be good for bringing the frame rates up in Star Citizen. I might even have it before CIG are done converting the whole game to Vulkan, ongoing now for 2 years and apparently 90% done.

We'll see who gets there first.
More fake frames, just what the world needs!

I really fail to see any reason for that "feature" whatsoever.
 
Meanwhile AMD buys Xilinx for $40 billion...
Xilinx are top notch!

They have put a huge development effort in over the last 10 years, and as a result are very much the leaders in the FPGA and FPGA-SoC space. You'd be surprised where these devices are popping up!

They have A TON of IP that is really useful, from the obvious FPGA designs to the best network endpoints, RF, software-defined radio, etc.
 
Xilinx are top notch!

They have put a huge development effort in over the last 10 years, and as a result are very much the leaders in the FPGA and FPGA-SoC space. You'd be surprised where these devices are popping up!

They have A TON of IP that is really useful, from the obvious FPGA designs to the best network endpoints, RF, software-defined radio, etc.

Like here...


That's right, AMD are in my Game Gear. Actually this isn't my Game Gear but one I'm modding for somebody, but they are also in mine :)
 
Fake frames are not necessarily a bad thing.

You see (or rather, you don't see) fake frames in video all the time.
Well, I'm sure it's decent for what it's meant for, but it's going to become rather irritating when it starts showing up in benchmarking and advertising without some form of qualification to show it's being used.
 
Intel's competitive comeback comes at a cost, you know. AMD are earning more per sale than Intel are, and Intel fab their own CPUs. Pat keeps crying about how they have to get the margins up; he's just gone to Mr Biden for a $52 billion bailout because they are skint.

Why am I not surprised to see you posting pure and utter drivel. It's called the CHIPS and Science Act and it's about increasing semiconductor manufacturing in the US. It's nothing to do with a bailout. Europe is in the process of doing something similar to the tune of €43 billion. It's about reducing dependency on Taiwan.

The $52 billion is only part of the bill; $39 billion of it is allocated for manufacturing. Texas Instruments, Micron, Intel, etc. are all going to receive some of that money. Nvidia and AMD don't manufacture chips, so they don't get any part of that $39 billion. Intel are going to benefit the most from it simply because they are going to be building the most.
 
Fake frames are not necessarily a bad thing.

You see (or rather, you don't see) fake frames in video all the time.
But you don't interact with videos, so interpolating from even a few frames beyond "now" makes little difference, and it has been part of codecs for ages now.

Using frame #3 to help decide what to display for frame #2, so as to make it appear between #1 and #3, is not a problem when you already have frame #3.

Interactive gaming, on the other hand, requires low latency, and aside from a time-jump there is no way to get details about frame #3 to fake-fill frame #2 from.

Now if you just wanted to hold #1 and display it as #2 until #3 is ready... then you have adaptive sync.

Unless I'm missing something?
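The codec idea above can be sketched in a few lines. This is a toy, assuming a plain per-pixel average stands in for a real codec's bi-predictive (B-frame) interpolation; real codecs use motion vectors, not averages, and `interpolate_midframe` is a made-up name for illustration:

```python
def interpolate_midframe(frame1, frame3):
    """Fake a frame #2 as the per-pixel average of frames #1 and #3.

    Frames are flat lists of pixel intensities. In video playback,
    frame #3 is already decoded by the time #2 is displayed, so this
    interpolation costs nothing in latency.
    """
    return [(a + b) // 2 for a, b in zip(frame1, frame3)]

# Playback case: both neighbours exist, so the in-between frame is free.
f1 = [0, 100, 200]
f3 = [50, 100, 0]
f2 = interpolate_midframe(f1, f3)  # [25, 100, 100]
print(f2)
```

In a game the same trick forces a choice: either wait for the real frame #3 (adding a frame of latency before #2 can be shown) or extrapolate from #1 alone, which is exactly where the "fake frame" objection comes from.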
 
Once AMD catch up with RT it'll be something else; perhaps they'll go back to 10-year-old driver arguments.

The 4090 is 60% more expensive. If that's the price of better RT performance I'm more than happy not paying it. Others can if they want, but I don't want to hear that AMD need to do better because my Nvidia GPU is too expensive; the reasons for that are their choices, not AMD's.
This has been a common trend over the AMD threads since the 7900 announcement.

Everyone else is telling me what I need in my GPU and that I should expect more. The deflection is unbelievable.

Let's see what happens in December and we can finally put this nonsense to bed. One thing is true: I will not be paying hundreds in premium cost for RT performance so I can play one or two games with "pretty sparkly" settings.
 
But you don't interact with videos, so interpolating from even a few frames beyond "now" makes little difference, and it has been part of codecs for ages now.

Using frame #3 to help decide what to display for frame #2, so as to make it appear between #1 and #3, is not a problem when you already have frame #3.

Interactive gaming, on the other hand, requires low latency, and aside from a time-jump there is no way to get details about frame #3 to fake-fill frame #2 from.

Now if you just wanted to hold #1 and display it as #2 until #3 is ready... then you have adaptive sync.

Unless I'm missing something?

There is the certainty of artefacts when smudging frames together in real time, which will be more or less visible depending on the game.

I'm not entirely sure which games would benefit, but if it can give a meaningful benefit in some types of games then it's still worth doing.
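The artefact point is easy to demonstrate. A minimal sketch, assuming a naive blend with no motion estimation (`blend` is a hypothetical name): an object that moves between two frames should appear at the midpoint in the generated frame, but a plain average instead leaves two half-brightness ghosts, which is the classic smearing artefact:

```python
def blend(frame1, frame3):
    """Naively average two frames, pixel by pixel (no motion vectors)."""
    return [(a + b) // 2 for a, b in zip(frame1, frame3)]

f1 = [255, 0, 0]   # bright object at pixel 0 in frame #1
f3 = [0, 0, 255]   # object has moved to pixel 2 by frame #3
mid = blend(f1, f3)
print(mid)  # [127, 0, 127]: two ghosts, not the expected [0, 255, 0]
```

Motion-vector-based interpolation exists precisely to put the object at pixel 1 instead, but it can still mispredict, which is why the visibility of artefacts varies so much by game.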
 
Fake frames are not necessarily a bad thing.

You see (or rather, you don't see) fake frames in video all the time.
Eh, broadcasters use this technique because they have to send the signal to a huge number of people. The logistics make sense there, but for something that's locally processed and costs a huge amount of money, it's stupid.

The main reason people will use it is that more frames, even fake ones, will just make them feel better. I'm surprised Nvidia doesn't just release software that says 500 fps when it's actually 60. It's all down to people's pathetic acceptance of poor quality.
 
More fake frames, just what the world needs!

I really fail to see any reason for that "feature" whatsoever.

It's just another (optional) tool to exchange some graphical fidelity (which you may not be able to notice very much) for playable framerates at high resolutions. Where previously you would have had to turn down graphical options or drop resolution to get playable framerates, now you also have frame generation and upscaling to do the same thing. More options are always better than fewer.
 
It's just another (optional) tool to exchange some graphical fidelity (which you may not be able to notice very much) for playable framerates at high resolutions. Where previously you would have had to turn down graphical options or drop resolution to get playable framerates, now you also have frame generation and upscaling to do the same thing. More options are always better than fewer.
Oh, for sure; it would be fine if it were some small, well-hidden thing in the advanced options of a graphics control panel.

However, I cannot see how fake frames would make something appear more playable. Well, not any more than having an old CRT with slow phosphor smearing the previous frame into the current one.
 
The same excuses happened in the past, be it PhysX, HairWorks, G-Sync, etc. The latest one is ray tracing. Whilst it is a cool feature, it's still in its infancy and not supported by much of the volume out there, regardless of what salesmen say. So even now, when AMD is actually 'competitive' and a chunk cheaper, you are still seeing the boo boys playing it down. As I expect most enthusiasts to have an Ampere card already, the recent gen of cards probably won't attract many due to the insane prices. The performance leap is OK, or par for the course. It's going to be interesting to see how many people buy these SKUs, as they are all over a grand.

Once AMD catch up with RT it'll be something else; perhaps they'll go back to 10-year-old driver arguments.

The 4090 is 60% more expensive. If that's the price of better RT performance I'm more than happy not paying it. Others can if they want, but I don't want to hear that AMD need to do better because my Nvidia GPU is too expensive; the reasons for that are their choices, not AMD's.

This has been a common trend over the AMD threads since the 7900 announcement.

Everyone else is telling me what I need in my GPU and that I should expect more. The deflection is unbelievable.

Let's see what happens in December and we can finally put this nonsense to bed. One thing is true: I will not be paying hundreds in premium cost for RT performance so I can play one or two games with "pretty sparkly" settings.

Yep, said it a few pages back, gents.
 
Don't kid yourself. Intel was going to revoke AMD's x86 licence. If AMD hadn't seen that coming, invented 64-bit logic and then tagged it onto x86 to create x86-64, Intel would have been the sole x86 provider a long time ago. Intel never believed AMD have any right to make x86-based CPUs, and they still don't.

We used to have about 10 different GPU makers. Nvidia want to be the sole survivor, and they work every minute of every day to try and make that happen.
Firstly, I'm talking about Nvidia and not Intel, and if they had done that then they would have had to deal with being a monopoly. Besides, that's ancient history now and the environment is markedly different.
I also think it's incorrect to blame Nvidia for all those companies failing; they simply couldn't keep up, and in 3dfx's case their downfall was due to their own arrogance and poor decision-making.

Why are you talking of life support and being ecstatic?

AMD isn't a company living off a 20% PC gaming GPU share. Nvidia has nothing but its GPUs.
We're talking about the GPU division and not the company as a whole.

You don't need the engine to be RT-first. Turning on any single RT effect, as minuscule as it may be, crushes performance on the AMD cards.
Actually that's not true; you can absolutely tune RT to have AMD near or on par with Nvidia. There's definitely a sweet spot for it; just check Far Cry 6, for example. As long as the BVH building is carefully managed, AMD can do fine.
 
AMD can do RT fine as long as the game is designed RT-only (e.g. Metro Exodus Enhanced Edition). If RT is attached on top of raster methods, then sacrifices have to be made for AMD, as we have seen with their sponsored titles, e.g.

- shadows cast from only certain light sources and/or only on certain assets/objects
- reflections only on certain bodies of water and/or surfaces, as well as at much lower resolution

Same story for AO and GI.

When they haven't done the above, performance can crash hard, and it's probably down to not having "dedicated" hardware for RT workloads like Nvidia and Intel have. In fact, I think someone did some testing and found that when rasterisation/other graphics settings were reduced, performance was better whilst keeping RT maxed.


As for Nvidia always making a problem and giving a reason for people to upgrade, you mean innovation?

PhysX - I wasn't a fan, and IMO it was largely a gimmick where effects were locked out and removed from the game to put them behind a paywall, i.e. locked to Nvidia. Again, not a good tactic at all, though some of the effects were very nice, like in Batman, e.g. volumetric smoke/fog reacting to your movements etc.
G-Sync - had it not been for Nvidia, we would have been waiting 2 years for adaptive sync (AMD's branding for it is FreeSync), or maybe even longer. Not to mention, G-Sync when released was superior in many ways to the first FreeSync displays: no issues with black screens, flickering or poor FPS range, and no lack of low framerate compensation or variable overdrive (which is still the main advantage of G-Sync for LCD-based displays). TFTCentral have done a good education piece on the advantages and disadvantages:


And once again, ray tracing is NOT an Nvidia thing...
 