This is NVIDIA's modus operandi, they're quite good at it!
And Apple. They do it a lot.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Intel are in an interesting position given that they can follow this same level of CPU integration with its upcoming GPU lineup.
Nvidia might need to start considering how they can partner with Intel and AMD, which should hopefully align technologies instead of everyone having their own proprietary BS (wishful thinking).
And when Nvidia does share or adopt open tech they try to make it look like they own it. Look at the way they basically adopted FreeSync but did so in a way that was an attempt to rebrand it as G-Sync.
First they implemented FreeSync support in their drivers but made out that they were rolling G-Sync out to FreeSync monitors, and then they updated their latest-generation G-Sync modules to give compatibility with FreeSync-enabled cards, but didn't really go out of their way to share that information in quite the same way.
So now if you do a Google search for cross-compatibility, the top hit is about some FreeSync monitors working with G-Sync rather than the other way round. It took me ages to find out that my Alienware ultrawide could also handle FreeSync (discovered on this forum, actually), and it was this information that helped me change my mind on my GPU purchase from a 3080 to a 6800 XT, as I now know that I'm not locked in to Nvidia.
I’m glad that AMD seems to be upping its game somewhat as it seems that Nvidia needs a serious competitor to keep it honest.
You're not getting it.
Say I am a BGA substrate supplier. I supply all of TSMC's BGA substrate for the AMD-based PS5, Xbox Series X and AMD GPUs. I have an issue: the supply I can provide is hit by x%. Which product likely loses out and is hit hardest by that: the consoles, which will be seen as tier 1 priority with ironclad agreements with AMD and TSMC, or AMD's dGPUs?
I think this is one of the reasons NVIDIA has bought ARM: how long until we see an NVIDIA CPU/GPU SoC so that they can compete with AMD for the next Sony/Microsoft consoles? Currently they can't compete with AMD, as they can only offer a discrete GPU.
AMD SoC = 8C/16T Ryzen “Zen 2” CPU, 10.3 TFLOPS RDNA 2 Radeon GPU, very tasty!
But something might trickle down to the consumer market eventually
With that in mind, I wonder if AMD will come up with a cloud-computed ray tracing solution that people could turn on with an always-on connection? I guess it will be 4-5 years before Sony/Microsoft refresh their consoles.
Does anyone else find the hatred of the Epic Games launcher / platform kind of irrational / childish?
Not amongst the majority perhaps, but a large percentage of Steam gamers.
Yes, Epic, you can buy my affection with free games!
https://www.facebook.com/tinytomlogan/posts/4811607448880130
Sorry if it's been posted before, but it looks like ASUS are going to make an AIO-cooled 6800 series card.
It's also being reported that AIBs are in talks to make custom 6900 XTs.
But what's an extra 100-200 MHz going to get you in terms of framerate? You'll probably get that with a very mild overclock. I think for most people who want to play games rather than benchmark, the difference in performance will be much less significant than the difference in price.
When can we expect reviews btw? At release and not before? Sorry if this has been asked already but there's a lot of pages..
It'll probably be on release day, just like Nvidia.
I mean it wasn't overclocked; RAGE mode doesn't adjust the clocks, just the power budget. And it's not like they hid it, it was stated very clearly on the slide.
I hereby declare you the winner of this engagement.
The Adaptive-Sync technology, which I think he is basing this on, was there before Nvidia coined their own implementation to handle the issue, as nobody was really addressing it at the time.
They were also being boosted by using a 5000-series CPU and SAM.
Let's have it on a non-5000 CPU, or an Intel CPU, with no RAGE and no SAM, and see how it does stock for stock, because that wasn't a fair comparison. It looks to me like, to beat the 3090, you'll need a 5000-series CPU as well. So you may as well say the 6900 is going to cost you £1,290 (if you take the cheapest 5000-series CPU on here), and that's assuming you already have an AMD board you can drop that CPU straight into. If not, it'll cost even more, as not only will you need the CPU, you'll need a motherboard etc. as well.