NVIDIA ‘Ampere’ 8nm Graphics Cards

Are they not going to a more mature 7nm+ manufacturing process? Also, the 5700 XT is not a million miles away from 2080 Ti performance, and it is only 251mm², so there is a lot of room right there alone.

45% slower with fewer features while using 80% of the power. The 2080 Ti has MUCH better performance per watt despite the chip-size and node disadvantage.
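
The perf-per-watt gap implied by those figures can be checked with quick arithmetic. A minimal sketch, taking the 45%-slower and 80%-power numbers above at face value (they are the poster's round figures, not benchmark data):

```python
# Relative performance per watt from the figures quoted above
# (hypothetical round numbers, not measured benchmarks).
perf = 1.0 - 0.45      # 0.55x the 2080 Ti's performance
power = 0.80           # 0.8x the 2080 Ti's power draw

ppw = perf / power                 # relative perf/watt vs the 2080 Ti
advantage = 1.0 / ppw - 1.0        # the 2080 Ti's perf/watt lead

print(f"relative perf/watt: {ppw:.4f}")   # 0.6875
print(f"2080 Ti lead: {advantage:.1%}")   # ~45.5%
```

So on those numbers the 2080 Ti's perf/watt lead is roughly 45%, which matches the "MUCH better" characterisation.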


I think you are underestimating the performance leap RDNA 2 will bring. They will have been working on that for a very long time on consoles, and it will bring nice efficiency gains imo. In my mind, beating the 2080 Ti is not even a question. The question is how close they can come to the 3080 Ti/Titan. That will depend on how well it scales.

All we have on performance/power is AMD's 50% perf-per-watt claim. Interestingly, there was no comment there on the performance gain their next card will have; at worst we might not get a much faster card at all, just one with better perf per watt.
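
That ambiguity is easy to illustrate: a +50% perf/watt claim only links performance to power, it doesn't fix either one. A sketch under assumed numbers (the 225 W baseline is a hypothetical RDNA 1 card normalised to 1.0x performance, not an AMD figure):

```python
# What a 50% perf-per-watt uplift could mean at different power budgets.
# Baseline: hypothetical RDNA 1 card, 1.0x performance at 225 W.
base_power_w = 225
base_perf = 1.0
ppw_uplift = 1.5  # AMD's claimed +50% perf/watt

for power_w in (150, 225, 300):
    perf = base_perf * ppw_uplift * (power_w / base_power_w)
    print(f"{power_w} W -> {perf:.2f}x baseline performance")
# 150 W -> 1.00x : same performance at two-thirds the power
# 225 W -> 1.50x : 50% faster at the same power
# 300 W -> 2.00x : double, if they spend the efficiency on a bigger card
```

Which end of that range the actual card lands on is exactly the open question.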

We will see, I suppose. I am not expecting it to match or beat Nvidia's top offerings myself, but as I said, 2080 Ti performance will be easy to match or beat. I will eat my hat if it doesn't, and you can quote me on that!

Unless Ampere is a Rankine-level fail (aka the NV30/5800 Ultra), the next 80/Ti card will still be far ahead, and so prices will stay high. AMD need to stop competing a generation behind.
 
I’m guessing they have gone with the die they did because it was easier for AMD to fab, both being 7nm. It cuts down costs, as the CPU is already in production. Performance-wise, at 4K the difference between, say, a ten-year-old i5-2500 and a brand-new 3950X is minimal.

It will more likely be to utilise the massively improved computational power and cache of the CPU for other tasks. Those other tasks likely being ray tracing and the new SSD, as well as other background apps and features (game resume etc.), without bogging down the CPU during actual gameplay.

Architecture, IPC and memory design will always provide the best uplift for games; clock speeds have been largely irrelevant for years now. That's why both consoles are just using 3.x GHz boost clocks: it's enough to saturate the capabilities of the chip, and going any higher just adds heat and power draw for very little benefit. Just ask PC users; they know all about how going over 4GHz on any modern CPU provides little benefit for gaming.
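
The heat-and-power point follows from how dynamic CPU power scales, roughly with frequency times voltage squared, and higher clocks need higher voltage. A rough sketch with hypothetical voltage steps (illustrative round numbers, not any specific chip's V/F curve):

```python
# Dynamic power ~ frequency * voltage^2, normalised to 3.5 GHz at 1.00 V.
# The frequency/voltage pairs below are hypothetical illustrations.
def rel_power(freq_ghz: float, volts: float,
              base_freq: float = 3.5, base_volts: float = 1.0) -> float:
    return (freq_ghz / base_freq) * (volts / base_volts) ** 2

for f, v in [(3.5, 1.00), (4.0, 1.10), (4.5, 1.25), (5.0, 1.40)]:
    print(f"{f} GHz @ {v:.2f} V -> {rel_power(f, v):.2f}x power")
# In this model a ~43% clock bump (3.5 -> 5.0 GHz) costs ~2.8x the
# power, which is why consoles stop in the 3.x GHz range.
```

The superlinear cost is the whole argument: past a point, the same watts buy more frames when spent on the GPU instead.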

I'd like to think that now CPUs have far more cores than games actually need, developers will start using them, but I also have my doubts, because the CPU is just so inefficient at many tasks that you can do faster on a GPU.
 
Having experienced AMD for the first time with my Vega 64, the questions I'm asking are:
  • Will it be less expensive for the same performance?
  • Will the control panel be simplified to the point where I don't need Google to use it?
  • As above, will overclocking get anywhere near the simplicity of Nvidia's?
  • Will it be plug and play out of the box, without the need for all this undervolting carry-on? Even that isn't a case of just tweaking a single voltage slider.

There's more to a GPU experience than raw power.

These are good and fair questions imho. I don't know the answers, but I understand where you're coming from.
 
Traditionally, before the dark times of modern GPU pricing, the Ti brand was the performance-uplift SKU at a sensible price, kind of like the Apple iPhone SE. It gives Nvidia the option to release a new SKU at a new price point if they need to, rather than discounting, which they hate to do and will avoid like the plague.
 
You missed the point I was making it seems.
 
Thinking about what you said about the 3090...
Do you think there could be a 3090 Ti later on?
Possibly. No one knows really. If there's no competition at the top end from AMD, then they will take their time, I would imagine.
 

Going off topic now, so I will rein it in after this before I get whacked by a mod.

This will surprise people, but the 2013 Xbox CPU, with its 8 Jaguar cores at 1.75GHz, had approximately the CPU compute power of a K1-based Nvidia Shield or a 2012 Google Chromebook.

In short, the 2013 Xbox CPU was awful at launch: the IPC, clock, architecture, RAM bus speed, the lot.

In CPU terms, this new 2020 Xbox will be as fast as the same 2020 Xbox running a 3900X at the 99th percentile, in the majority of gaming scenarios.

We have proven it on the forum time and time again in our own independent performance testing: games programmers don't (with small exceptions) use all the extra cores they are given beyond 6. Games are not as multithreaded as they should be.

Our £7k rigs worked well in 3DMark and Cinebench, but they meant little when running Batman or Crysis benchmarks; IPC mattered, not cores.

If you took a 3900X, plugged it into the socket of the 2020 Xbox Series X and did a blind test against the standard version, on a TV, in-game, I doubt anyone would notice the difference.

Hence my claim that we have console gaming performance parity with a desktop PC: the IPC of the Zen 2 heart and its clock speed are now as fast as a desktop's for the intended purpose, playing games.

Microsoft and Sony could have gone quicker on the CPU clocks, but I bet that in their internal CPU clock testing they came to the same conclusion this board came to years ago: CPU clock only gets you to a certain FPS point, and beyond that you are burning heat budget to beat a benchmark. The actual user experience is not improved, and you are better off either upgrading your CPU architecture or, more effectively, spending the heat budget on GPU clock and faster, hotter VRAM.
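
The diminishing returns from extra cores described above follow directly from Amdahl's law. A quick sketch, assuming a game whose per-frame work is 70% parallelisable (an illustrative figure, not a measurement from the forum testing mentioned):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = parallel fraction of per-frame work, n = core count.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

p = 0.70  # hypothetical parallel fraction for a game engine
for cores in (2, 4, 6, 8, 12, 16):
    print(f"{cores:2d} cores -> {speedup(p, cores):.2f}x")
# Going from 6 to 16 cores adds only ~0.5x here; once the serial
# portion dominates, extra cores buy very little.
```

Under that assumption, six cores already capture most of the available speedup, which is consistent with the benchmarking claim above.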
Well said, although I thought this was common knowledge.
 
Thanks.

I will wait till after Christmas and see what the prices are.
Hoping AMD bring something great to the table this time around.
 
Sadly I think that’s optimistic.

I agree. Thankfully, it's not a compulsory purchase. :p

I can only speak for myself, but in this post-Covid environment I can't justify much more than that for the level of performance quoted. If I get less than 40% for £700, then I might as well stick with what I have. As always, the market will dictate the price, but with mass redundancies on the horizon worldwide I just can't see how the price/performance ratio can be maintained.
 
Yes, I agree; my price point is about £500. I don't really want to go above that, as sometimes I may not even game for months at a time, depending on how busy work/life is.
 
I think with the upcoming model-name changes it's more helpful to think in terms of price range. 40% over a 2080 Ti has to be in the £700 zone for me.
Why would they ever do that? It would be like Rolex releasing a brand-new Sub for £2,000.
Even if they did, everyone would jump on it (me included) and the price would quickly skyrocket as demand outweighed supply.
 
You're assuming the card in question is top of the range? For all we know, the card that has been spotted at 40% over the 2080 Ti could be the "3070", the "3070 Super" or the "3080 super duper". That's the point I'm making: they've changed the model nomenclature yet again, so it's best to focus on performance at your price point.
 
I'm hoping this spotted card is the RTX 3080; I can't be dealing with an x80 card matching the last-gen Ti card again, when historically they were always better performers, until Turing.
 
Yep and priced at £699 :D

Would prefer £599, but let's be honest, that ain't happening with this exchange rate.
 