Nvidia CEO says next gaming GPUs won't come for "a long time"

I never knew about this! Is it really that bad?

It is too complicated to maintain image quality with a GeForce.

Are you guys with Nvidia image quality issues 100% certain you are not outputting limited RGB to your panel instead of full range? There is a massive difference between the two, and sometimes the damn card defaults to the wrong output.
If in any doubt at all please do check.
Details here
https://www.reddit.com/r/pcgaming/comments/63id3z/make_sure_your_monitor_is_set_to_rgb_full_instead/
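
In case it helps, this is roughly what the limited-to-full expansion does to the 8-bit values. A quick Python sketch (my own illustration; the function name is made up):

Code:
# Minimal sketch: expand limited-range (16-235) 8-bit values to full range (0-255).
# If nothing in the chain performs this stretch, black stays at 16/255 and white
# at 235/255, so the picture looks washed out - the "massive difference" above.

def limited_to_full(v: int) -> int:
    """Map a limited-range (16-235) code value to full range (0-255)."""
    scaled = (v - 16) * 255 / (235 - 16)
    return max(0, min(255, round(scaled)))

for v in (16, 64, 128, 235):
    print(f"limited {v:3d} -> full {limited_to_full(v):3d}")
# limited  16 -> full   0  (true black)
# limited 235 -> full 255  (true white)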

One can use the NV_RGBFullRangeToggle tool for this:

This is technically a registry hack, and once you run CCleaner or Advanced SystemCare the change to Full RGB is gone and you are back to limited colours.

The other thing you need to calibrate is in the Nvidia Control Panel itself:



Digital vibrance +63
Hue +2

:cool:

+65
+0
for me ;)

And once you are OK with the desktop, you need to calibrate the 3D settings as well, because the defaults are set to lower quality.
 
You cannot be serious, that looks bloody horrid :p

? This is the setting that most closely resembles the saturation levels of smartphone screens. If you look at a gorgeous smartphone display and then look at the default vibrance level of +50 in the control panel, you will notice that something is missing.
 
It is simulating HDR :D Full RGB plus +65% digital vibrance.

Hang on, are you not the guy who said HDR is a gimmick, and that you set your non-HDR panels up to mimic HDR?

I have a 4K TV without HDR support and it's perfect. HDR is just a marketing gimmick for the clueless.
Sub-500 nits is the maximum brightness level, which you will normally never ever reach anyway.
 
Hang on, are you not the guy who said HDR is a gimmick, and that you set your non-HDR panels up to mimic HDR?

Mhm, let's clarify what HDR is: 10-bit colour + higher brightness levels + higher digital vibrance. There is no problem simulating it, or getting close to it, on most displays.
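
The "10-bit colour" part is the one you can actually put a number on, and it is the bit no vibrance slider gives you back. Quick back-of-the-envelope Python, assuming nothing beyond the two bit depths:

Code:
# Shades per channel and total RGB combinations at 8-bit (SDR) vs 10-bit (HDR).
# Extra code values mean finer gradients; boosting digital vibrance only
# stretches the existing 256 levels, it does not create new ones.
for bits in (8, 10):
    levels = 2 ** bits        # code values per channel
    colours = levels ** 3     # R x G x B combinations
    print(f"{bits}-bit: {levels} levels per channel, {colours:,} colours")
# 8-bit: 256 levels per channel, 16,777,216 colours
# 10-bit: 1024 levels per channel, 1,073,741,824 colours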
 
Interesting video with info regarding Nvidia's current GPU situation, along with other stuff. Timestamps are in the description. Sorry if it's already been posted.

 
Mhm, let's clarify what HDR is: 10-bit colour + higher brightness levels + higher digital vibrance. There is no problem simulating it, or getting close to it, on most displays.

? I'm confused: you call it a marketing gimmick for the clueless, yet you manually set up your display(s) to mimic it...
 
? I'm confused: you call it a marketing gimmick for the clueless, yet you manually set up your display(s) to mimic it...

I am not in the mood to go shopping for "the latest and greatest" after every new checkbox "feature" that is purposely made to make you think your still-brand-new purchase is already obsolete.
 
I did hear Nvidia have backed out of Hot Chips? Maybe they’ll just release the 1180 before that. Who knows?

I was wondering why they don't take a leaf out of AMD's book, as they don't have to release an all-new range of GPUs every time. They could release a range that uses Pascal chips for the 1160 and lower and Volta for the 1170 and up. AMD have often done this, with examples like the 7850, which became the 265 and the 370, and the 7900 chips, which became the 280s, etcetera.

If, as mentioned in the video, Nvidia are using further cut-down 1070 chips to make 1060s due to having too much stock, it's an acceptable option. Pascal could be used to give us an 1150, 1150 Ti, 1160 & 1160 Ti. They could then expand the middle & high-end range to offer up to six Volta models: the 1070, 1070 Ti, 1080, 1080 Ti & a couple of Titans.
 
I was wondering why they don't take a leaf out of AMD's book, as they don't have to release an all-new range of GPUs every time. They could release a range that uses Pascal chips for the 1160 and lower and Volta for the 1170 and up. AMD have often done this, with examples like the 7850, which became the 265 and the 370, and the 7900 chips, which became the 280s, etcetera.

This isn't a good idea, because it leaves many of the products with an older version of the GCN microarchitecture, and older UVD, HDMI, DisplayPort support, etc.
 
This isn't a good idea, because it leaves many of the products with an older version of the GCN microarchitecture, and older UVD, HDMI, DisplayPort support, etc.

I meant Nvidia doing it with the surplus Pascal chips they now have; I was using the AMD cards as examples. In my opinion, Nvidia doing this wouldn't be a bad thing.
 
I meant Nvidia doing it with the surplus Pascal chips they now have; I was using the AMD cards as examples. In my opinion, Nvidia doing this wouldn't be a bad thing.

Rebadge them and sell them cheaper? Why not just discount them and sell them as they are, thus saving the precious effort for something more valuable? :p
 
So you're suggesting rebranding some chips. Just remember where the idea came from next year when you're all shouting down Nvidia for doing a rebranding. ;)

Personally I don't think they will, as the new architecture chips will have already been researched and designed, and are probably already in the process of being made.
 
I am not in the mood to go shopping for "the latest and greatest" after every new checkbox "feature" that is purposely made to make you think your still-brand-new purchase is already obsolete.

Now I'm even more confused: if it's the latest and greatest, why call it a gimmick? Anyway, some versions of HDR (I believe only DV, HLG and HDR10+) contain metadata to accurately display the image on a scene-by-scene basis, hence the detail in blacks and the intensity in whites. Simply raising the brightness on a set and setting vibrance and hue to make the colours pop does not mimic HDR.
On a dull scene utilising HDR the image is dull, yet the detail stands out, and certain areas will have a deeper gamut, allowing the blacks to have further detail along with the lighter areas.

My old Sony TV at 300-odd nits had HDR along with a 10-bit panel, yet it could never mimic what my OLED produces via a 4K HDR Blu-ray. It's a night-and-day (literally) difference. No tweaking of brightness or anything else will achieve it on that set.

Telling people that they just need to adjust a few settings and they're good to go is false; it's nothing like the real thing.
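
To put rough numbers on the "intensity in whites" point: HDR10 and Dolby Vision grade against the SMPTE ST 2084 (PQ) curve, which ties code values to absolute brightness targets all the way up to 10,000 nits, far beyond what a 300-500 nit SDR panel can show no matter where the vibrance slider sits. A quick Python sketch of that curve (my own illustration, not something from the video or this thread):

Code:
# SMPTE ST 2084 "PQ" EOTF: normalised HDR signal (0.0-1.0) -> luminance in nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """Convert a normalised PQ code value to absolute luminance in cd/m^2 (nits)."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

for code in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ signal {code:.2f} -> {pq_to_nits(code):8.1f} nits")
# Roughly: 0.50 -> ~92 nits, 0.75 -> ~980 nits, 1.00 -> 10,000 nits.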
 
Rebadge them and sell them cheaper? Why not just discount them and sell them as they are, thus saving the precious effort for something more valuable? :p

I know what you mean, but he said in the video Nvidia are taking back loads of GP104-chipped (1070/Ti/1080) boards from board partners and then disabling cores to make 1060s out of them, to help get rid of the surplus stock. If that's true, using them as-is for certain 1100 series cards sounds like less work to me.

So you're suggesting rebranding some chips. Just remember where the idea came from next year when you're all shouting down Nvidia for doing a rebranding. ;)

Personally I don't think they will, as the new architecture chips will have already been researched and designed, and are probably already in the process of being made.

I'm pretty sure they won't as well; I was just making up an alternative use for them rather than what I just wrote about them doing above, which may not even be true.
But no, I wouldn't have an issue with GPUs being reused in the next range as long as they're positioned & priced accordingly.
 
I know what you mean, but he said in the video Nvidia are taking back loads of GP104-chipped (1070/Ti/1080) boards from board partners and then disabling cores to make 1060s out of them. If that's true, using them as-is for certain 1100 series cards sounds like less work to me.

I would have thought the work involved in disabling the cores would make the exercise pointless from a cost point of view.

I have noticed that Nvidia are still selling 1080 and 1070 Ti FE cards direct, which is more likely what is happening to any extra boards.

I also think that board partners will end up using excess boards if NVidia does not give them anything else to sell.
 