The FUTURE of Graphics

With the 1050 being unplayable, I would say the 1060 6GB was the real low-end GPU for 1080p and a half-decent experience, and the 1070 if you wanted 1080p at high refresh rates. The 1080 and 1080 Ti were for 1440p, with the Ti catering for higher refresh rates.

Only if you're sticking to Nvidia. The RX 570 does very well for only £120 at 1080p.
 
I made it through 30 seconds. That's a computer-generated voice reading a text. Awful.

Same here, I couldn't listen to that for a full minute, let alone half an hour. Has anyone watched all the way through? All I'm seeing is an argument about what counts as an entry-level GPU; surely that's not what the thread is about, is it?

Could the OP give us a TLDW summary?
 
The 1660 series was created basically to fool people into thinking the other series are all higher-tier than they really are. I still stick by 2060 low, 2070 mid and 2080 high.

The 1660 series shouldn't even be a thing. It only exists because they raised the prices of every series and had to make something cheaper, so the 1660 was born to fill the gap created by raising all the rest.

Last year the 1060 was entry level and the 1050 was basically for HTPCs with mobile-phone-type gaming. Now they have the 1030 for that job, with the 1660 as a better version if you need it.

Otherwise we could call those £30 GPUs entry level.

The RTX & GTX Turing cards are the gaming-focused GPUs & the GT GPUs are not. I think we should be looking at the GTX & RTX cards as all being one range; the performance supports that, so I look at the 2060 as being a mid-range card.
 
Only in a thread about the Future of Gaming could it so far consist of an argument about the current range of cards and which is budget/entry level. That's why I love the OCUK forums.
 
If you can't be bothered to watch the video because of some issue with the voice narrating it, there's not really much point, as this thread was about hearing different opinions on what's said in the video, not what my opinion of it is. He talks about ray tracing & whether Nvidia's solution is here to stay, considering we've been shown that ray tracing can be done without specific hardware for it, as seen in the recently released Neon Noir benchmark, or whether they'll have to change how they do it in future generations.

Neon Noir download
https://www.cryengine.com/marketplace/product/neon-noir
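
For anyone who skipped it because of the voice, the core point is that ray tracing is just vector maths, so it doesn't strictly need dedicated RT hardware; it's a question of how fast you can do it. Below is a rough toy sketch in Python (nothing to do with how CryEngine/Neon Noir actually implements it, purely an illustration) that traces a single sphere entirely on the CPU:

Code:
# Toy software ray tracer: one sphere, diffuse shading, runs entirely on the CPU.
# Purely to illustrate that ray tracing is ordinary vector maths; this is NOT
# how CryEngine/Neon Noir (or Nvidia's RT cores) actually implement it.
import math

WIDTH, HEIGHT = 40, 20                      # tiny ASCII "framebuffer"
SPHERE_C, SPHERE_R = (0.0, 0.0, 3.0), 1.0   # sphere centre and radius
LIGHT_DIR = (-0.577, 0.577, -0.577)         # unit vector pointing towards the light

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def ray_sphere(origin, direction, centre, radius):
    """Return the distance t to the nearest hit, or None if the ray misses."""
    oc = tuple(o - c for o, c in zip(origin, centre))
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c                  # direction is unit length, so a == 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Map the pixel to a ray through a simple pinhole camera at the origin.
        px = (x / WIDTH - 0.5) * 2.0
        py = (0.5 - y / HEIGHT) * 2.0
        inv_len = 1.0 / math.sqrt(px * px + py * py + 1.0)
        ray = (px * inv_len, py * inv_len, inv_len)
        t = ray_sphere((0.0, 0.0, 0.0), ray, SPHERE_C, SPHERE_R)
        if t is None:
            row += "."                      # ray missed: background
        else:
            hit = tuple(t * d for d in ray) # hit point (camera sits at the origin)
            normal = tuple((h - c) / SPHERE_R for h, c in zip(hit, SPHERE_C))
            shade = max(0.0, dot(normal, LIGHT_DIR))
            row += " .:-=+*#"[min(7, int(shade * 8))]
    print(row)

Obviously a real engine needs acceleration structures, denoising and so on, which is where dedicated hardware earns its keep, but the maths itself runs anywhere.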
 
It's not a case of 'can't be bothered' but a case of 'the voice is actively off-putting'.

Hi. Even so, that's a fairly lame excuse & without watching it for yourself you haven't got an informed opinion.

If the voice is the issue, turn the subtitles on & the sound off.
 

No, it's not a lame excuse. It's a prime example of why presentation matters as much as the message itself.

What context is this?

1080p is a worldwide standard now, much like VGA was in the 90s. Resolutions like 1440p, 1440UW and 4K are still niche, though 4K likely outweighs the other two combined.
 
I remember sitting with a friend playing around with 3D Studio Max and saying that in the future games would render with ray tracing instantaneously. Guess the future is here already, sort of.
 
On July 2, 2019, the GeForce RTX Super line of cards was announced, which comprises higher-spec versions of the 2060, 2070 and 2080. In a departure from Nvidia's usual strategy, the 20 series doesn't have an entry-level range, leaving it to the 16 series to cover this segment of the market.

Mid-range
GeForce RTX 2060
GeForce RTX 2060 Super

High-end
GeForce RTX 2070
GeForce RTX 2070 Super
GeForce RTX 2080
GeForce RTX 2080 Super

Enthusiast
GeForce RTX 2080 Ti
Nvidia Titan RTX
 
I've not watched the whole video, but at around 10:30 he tells us that Quake RTX doesn't actually use Nvidia's RT cores. He then moves on to competition and 2020; I'm guessing he's going to say Intel will be able to muscle OEMs into dropping Nvidia graphics and using Intel instead.
 
What context is this?
You're giving a resolution only and assigning it a tier.

Exactly, you can buy 1080p 240 Hz monitors. So if my card can't max that out on all games, is it then low end or mid end? (See the rough calc after this post.)

The tiers are all getting pushed around and blurred.

The 1660 was made to fill the gap left behind when Nvidia pushed all three tiers upwards. So 1660 = 1060, 2060 = 1070, 2070 = 1080 and 2080 = Titan?

If you accept what they have done, then the 2060 is at best a low-to-mid-end card, the 2070 is mid or mid-to-high end, and the 2080 is high end.

If people accept this as normal, I can only see them doing the same in the future. It will get to the point where they release, say, the 4660 as a £400 card, then have to introduce a new low-end card as the 4550 or something.

It's like my car scenario from earlier: due to the rising cost of cars, BMW, instead of having a 3 and a 5 Series for the majority of their target market, now have the 1, 2, 3, 4 and 5 Series for the average driver, while the other marques are more exotic and expensive, aimed at higher-end or electric markets. They had to fill the gap as they moved upwards. So do people now regard a 3 Series as a high-end car, with the 1 Series being low end?

So essentially the manufacturers have moved the goalposts and now people are expected to accept that? To me the 2060 series will always be low end. I don't care if they moved the goalposts by pricing them higher.
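
To put some rough numbers on the 240 Hz point, here's a quick back-of-the-envelope calc (illustrative only; real frame cost depends on the game, settings and engine, not just pixels pushed per second). By this crude measure, 1080p at 240 Hz moves about as many pixels per second as 4K at 60 Hz:

Code:
# Rough pixel-throughput comparison; illustrative only, real GPU load depends
# on far more than pixels per second (shading cost, settings, engine, etc.).
targets = {
    "1080p @ 240 Hz": (1920, 1080, 240),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "4K    @  60 Hz": (3840, 2160, 60),
}

for name, (w, h, hz) in targets.items():
    print(f"{name}: {w * h * hz / 1e6:,.0f} million pixels/s")

# Output:
# 1080p @ 240 Hz: 498 million pixels/s
# 1440p @ 144 Hz: 531 million pixels/s
# 4K    @  60 Hz: 498 million pixels/s

So a card that "only" does 1080p at high refresh is pushing roughly as many pixels as a 4K/60 card, which is partly why assigning tiers by resolution alone gets blurry.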
 

So what about AMD cards then?

Where do the 590, 580, 570, 560, 550 or even the new 5700 fit in, then, if everything up to the 2060/Super is low end?
 

Personally I don't care, as they don't do G-Sync. I'm limited to Nvidia if I want to utilise my monitor's capabilities.

If AMD could make my G-Sync monitor work like FreeSync then I would be open to them, but for me they aren't even an option to consider, unless I also want to change monitors.

I spoke about this before: how G-Sync essentially trapped me into being Nvidia-only until my monitor dies or is replaced.
 
The fact that you don't want one of those doesn't suddenly invalidate the argument. :rolleyes:
 