
Why are GPUs so expensive?

If what Digital Foundry said about a lower-spec console from Microsoft, around 4TF, is correct, then I wouldn't expect too much from the next-gen consoles, as everything would need to run on that first.
Also, if they want to target 4K@60fps, the jump in image quality won't be that big, so playing at a lower resolution on lower-end hardware will still be possible.
Keep in mind that only towards the end of a console's life do you get games that make the most of it - and which also require more hardware in the PC space. If that 12.3TF monster console, with an 8c/16t CPU (above 3GHz, and it would be nice if it had 24 or 32GB RAM), were to target 1080p 60fps or even 1080p 30fps only, then yes, it would be something wonderful indeed! ;)

The biggest jump would be in CPU performance. If the 8c/16t 3GHz+ turns out to be true, then anything below that will have a hard time keeping up (although 1-2 cores will be kept away from gaming and dedicated to the OS and other stuff).

As it stands now, I don't see much of a problem in the PC space. It will get bad for PC and consoles alike when 5G has enough penetration and can provide low lag and high bandwidth to a significant crowd, as that will enable cloud gaming to take off properly in those areas. Until then... same ol', same ol'...



Some people want AMD to be competitive, especially in the high-end market, just so they can buy cheap nVIDIA products. That can't go on forever. AMD were most of the time (if not always) competitive in at least some areas, but never enjoyed the sales success people handed to nVIDIA and Intel.

Of course they want people to buy the consoles, it's a double win! Triple or even more if they buy both consoles and the PC parts (at higher than normal prices)!

The 4TF console sounds like complete bunkum to me. Why build a 4TF console when there's a 6TF console already out on the market with the X, with a mature manufacturing base already there? They'll just send it down a tier in price.

As for 4k 60 - we've already seen where the market is going with things like Radeon Boost, DLSS and variable rate shading (VRS). Technologies of this nature will be built into the console, so no screwing around with ****** drivers.

I think you are completely missing the point.

Traditionally it's been like this:
80= flagship
70= high-end
60/60Ti= Mainstream/Mid-range
50= Entry level

But now it is like:
2080ti= flagship at premium price
2080= high-end at flagship price
2060/60ti/70= mid-range at high-end price
1660/60ti (which should have been the 50/50ti)= entry level at mid-range price

The point is that people are getting less performance for their money, because Nvidia pushed their card pricing across the board one tier up. The 2060 6GB is the perfect example of everything that is wrong: you have some people arguing that the 2060 having 6GB is fine because it is a "1080p card", but last I checked a "1080p card" had no business launching at $349+, and when even the 2+ year old 1070 had 8GB, it's kind of apparent that Nvidia is trying to milk the consumer - on one hand they can cut corners on the cost of VRAM, and on the other it's the perfect excuse to push people into buying the even more expensive 2070 8GB instead at £100 more. The sketch below lays the shift out.
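
To make that concrete, here's a rough Python sketch of the one-tier-up shift being described. The labels follow the tier list above; this is the poster's framing, not any official segmentation:

```python
# A compact restatement of the argument: each Turing card plays the role of the
# tier below its price. Labels follow the post above; they are the poster's
# framing, not Nvidia's official segmentation.
turing_shift = [
    # (card,               role it plays,  price tier it is sold at)
    ("RTX 2080 Ti",        "flagship",     "premium, above flagship"),
    ("RTX 2080",           "high-end",     "flagship"),
    ("RTX 2060 / 2070",    "mid-range",    "high-end"),
    ("GTX 1660 / 1660 Ti", "entry level",  "mid-range"),
]

for card, role, price in turing_shift:
    print(f"{card:20s} plays the {role:11s} role at {price} pricing")
```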

"1080p card" screams of marketing youtube shill nonsense that's been repeated so much it's now entered popular lingo. 1080p is a sickening resolution, you can pick up alcatel phones with 1080p screens for £40 these days.
 
On average the 2080 is 10-15% faster than the 1080 Ti, especially in new games - on a per-game basis there are a couple of titles where the lead goes even above 15%.

https://youtu.be/jUM_eINGUl4

And it was slower than the 1080 Ti at release for the sim I like to run (Project CARS 2).

https://babeltechreviews.com/the-rtx-2080-vs-the-gtx-1080-ti-in-vr/3/

They have squeezed it as much as possible with drivers since release, but babeltech revisited their VR stuff and the 2080 is still functionally equivalent to the 1080 Ti for my sim.

https://babeltechreviews.com/the-rtx-2080-vs-the-gtx-1080-ti-in-vr-revisited/

Nvidia can do whatever they want with their products, but they will have to do better than this for my money.
 
"1080p card" screams of marketing youtube shill nonsense that's been repeated so much it's now entered popular lingo. 1080p is a sickening resolution, you can pick up alcatel phones with 1080p screens for £40 these days.

1080p is only sickening because games haven't caught up yet.

A 1080p TV program looks many times better than a 4k game. Even an 8k game won't look as good as a 1080p TV program. The difference is night and day.

Simply increasing resolution to ever increasing numbers isn't the answer in my opinion.

Maybe we should go back to 1080p and concentrate on improving it until games look cinematic and real. Then we can increase the resolution to add the detail and clarity that's missing from 1080p.
 
The 4TF console sounds like complete bunkum to me. Why build a 4TF console when there's a 6TF console already out on the market with the X, with a mature manufacturing base already there?
Wonder if it could be a handheld?

Maybe they're trying to break into that market... They already make the Surface tablets.
 
https://youtu.be/jUM_eINGUl4

And it was slower than the 1080 Ti at release for the sim I like to run (Project CARS 2).

https://babeltechreviews.com/the-rtx-2080-vs-the-gtx-1080-ti-in-vr/3/

They have squeezed it as much as possible with drivers since release, but babeltech revisited their VR stuff and the 2080 is still functionally equivalent to the 1080 Ti for my sim.

https://babeltechreviews.com/the-rtx-2080-vs-the-gtx-1080-ti-in-vr-revisited/

Nvidia can do whatever they want with their products, but they will have to do better than this for my money.

Look at reviews for games released in the last 6 months.

And sorry, I have no interest in VR games.
 
The 4TF console sounds like complete bunkum to me. Why build a 4TF console when there's a 6TF console already out on the market with the X, with a mature manufacturing base already there? They'll just send it down a tier in price.

Current consoles have a weak CPU and no RT (assuming the next ones do get RT hardware). A 4TF console could be good enough for 1080p much like a ~12TF one is for 4K, while still having the same powerful CPU. In theory it should be doable, as image quality and gameplay elements would stay the same, just like on the PC - the rough numbers below show why the ratio works out.
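
As a back-of-the-envelope check, using only the teraflop figures floated in this thread (performance never scales linearly with TF or pixel count, so treat this purely as a ratio sketch):

```python
# Pixel-count scaling, using only the TF figures floated in this thread.
# Performance does not really scale linearly with teraflops or pixels,
# so this is a ratio sketch, nothing more.
pixels_4k = 3840 * 2160        # 8,294,400 pixels
pixels_1080p = 1920 * 1080     # 2,073,600 pixels
pixel_ratio = pixels_4k / pixels_1080p   # exactly 4.0

tf_big_console = 12.0    # rumoured ~12TF 4K console
tf_small_console = 4.0   # rumoured 4TF 1080p console

print(f"4K pushes {pixel_ratio:.0f}x the pixels of 1080p")
print(f"TF per 1080p-worth of pixels: {tf_big_console / pixel_ratio:.1f} (big) "
      f"vs {tf_small_console:.1f} (small)")
```

With the numbers as rumoured, the 4TF box would actually have slightly more compute per pixel at 1080p than the 12TF box has at 4K, which is why the comparison holds on paper.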
 
A 1080p TV program looks many times better than a 4k game. Even an 8k game won't look as good as a 1080p TV program. The difference is night and day.

That's because it's "downsampling" reality from 16k or 32k (or whatever it is) to 1080p. Downsample a game from 4k to 1080p and it will look better than a native 1080p image.

But in terms of RT and other expensive "stuff", yeah, it would be nice to keep the focus on 1080p.
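
That downsampling (supersampling) step is simple enough to show. A minimal Pillow sketch, assuming a hypothetical 4K render saved as frame_4k.png - the filename is made up for illustration:

```python
# Minimal supersampling/downsampling sketch with Pillow.
# "frame_4k.png" is a hypothetical 4K render, used here only for illustration.
from PIL import Image

frame = Image.open("frame_4k.png")      # 3840x2160 source render
downsampled = frame.resize(
    (1920, 1080),                       # target 1080p
    resample=Image.LANCZOS,             # filter averages several source pixels
)                                       # into each output pixel
downsampled.save("frame_1080p.png")
```

The averaging of many rendered samples into each output pixel is what cleans up the edges - the same reason reality "downsampled" to 1080p looks so clean.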
 
That's because it's "downsampling" reality from 16k or 32k (or whatever it is) to 1080p. Downsample a game from 4k to 1080p and it will look better than a native 1080p image.

But in terms of RT and other expensive "stuff", yeah, it would be nice to keep the focus on 1080p.

Nothing to do with that, but the power required to make lifelike graphics is nowhere near there yet.

Even 4k gaming or 1440p gaming isn't much different to 1080p, because it's the same graphics, just at a higher resolution.

I have gamed at 4k, 1440p and 1080p; none of them look as good as a real-life movie.
 
It might be "the most life-like yet", but it's still not going to be like a real-life movie.

I completely agree with @Psycho Sonny , the technology for true photo-realism is yonks away.

Where every cloud in the sky changes the lighting on the ground. Where every surface accurately reflects light. Where every shadow is correct. Where every moving body of water reflects correctly in real-time.

Those things are massively computationally expensive. I wouldn't expect 2020 to be the year of photo-realism, I really wouldn't.

Edit: Also, whilst we might get some really good effects close to the camera, the further away an object is, the more devs need to use techniques to reduce the processing expense. So whilst in real life you might get a glint of the sun from a window or roof 2 miles away, in a game you're just not going to get that, because the processing expense of rendering things that far away accurately is not worth the performance hit.
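
For illustration only, a hedged sketch of the kind of distance-based level-of-detail selection being described; the thresholds and labels are made up, not from any particular engine:

```python
# Toy distance-based LOD selection: further objects get cheaper meshes and
# fewer expensive effects. Thresholds and labels are illustrative only.
def pick_lod(distance_m: float) -> dict:
    if distance_m < 50:
        return {"mesh": "high", "reflections": True, "shadows": "per-pixel"}
    if distance_m < 500:
        return {"mesh": "medium", "reflections": False, "shadows": "low-res"}
    # Anything miles out gets a flat stand-in, so no glint off a distant window.
    return {"mesh": "billboard/impostor", "reflections": False, "shadows": "none"}

for d in (10, 200, 3200):   # ~2 miles is roughly 3200 m
    print(d, pick_lod(d))
```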

So yes things are getting better, but true photo-realism is miles off.
 
It might be "the most life-like yet", but it's still not going to be like a real-life movie.

I completely agree with @Psycho Sonny , the technology for true photo-realism is yonks away.

Where every cloud in the sky changes the lighting on the ground. Where every surface accurately reflects light. Where every shadow is correct. Where every moving body of water reflects correctly in real-time.

Those things are massively computationally expensive. I wouldn't expect 2020 to be the year of photo-realism, I really wouldn't.
The biggest leap I see this year is the new 2020 Microsoft Flight Simulator.

But what you guys are describing likely won't happen even in our lifetime. They will get close in the next 20 years though, I reckon. But it will be a close-yet-far type of thing.
 
Out of curiosity, in your scheme what are the 3000 series a replacement for? Or do they just sit above the 2000 series, not replace them?

In my scheme? You obviously misunderstand my post. Marine was judging where a card comes in a line-up based on tradition (x80 high end, x60 mid-range, etc.). But that has changed since Fermi. With the release of Kepler that tradition became pretty meaningless, and Nvidia have changed it every release since then. With the release of Turing, Nvidia have changed the positioning and naming scheme again with the introduction of the RTX cards. Perhaps this shows they are going back to the previous scheme, where the letters in the card's name indicate what level it's at (like the 8800GTX, 8800GT, etc.): GTX meant high end, GT meant mid-range. So now we are back at something similar, with RTX covering the high-end cards and GTX covering all the lower cards.

And I don't know what Nvidia are going to do with the 3000 series. Will they change the naming scheme again? I don't know.
 
Maybe you are embarrassed. At least, when you speak about graphics cards, check their performance first!

[attached benchmark image: rtx-poor-performance.png]

LOL, you bring out a benchmark to show that I am wrong, but all it does is make you look silly.
 
Yep, I didn't know much about it until November, but having seen it, I am blown away. Hopefully it runs well and not at 7 fps lol
People running the alpha seem to be getting good fps with a 2080 Ti, so it will only get better with the final release and on the new 3000 series hardware.
 
Even then, it's a flight simulator.

There's not a lot going on apart from scenery.

Try implementing those kinds of graphics in a Grand Theft Auto or a Battlefield type game.

Should be getting closer when the next-gen consoles hit. Just because it hasn't been done, it doesn't mean it's not possible on current PC hardware.

People running the alpha seem to be getting good fps with a 2080 Ti, so it will only get better with the final release and on the new 3000 series hardware.

That is if the limit is the GPU.
 