I can't do this anymore....
That's why my PC is in cold storage and I bought a PS5.
Ready to make a return, though, if the silliness ends.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Don't AMD have their own DLSS thing?
Indeed. All frames will be the real deal rather than fake ones that boost the numbers but do nothing for input latency.
Maybe, I don't know what to do... Nvidia just launched a £900 "4080 12GB" that's really a 4070, so is that then a £700 "4070" that's really a 4060?
I would like to hope AMD will seize the opportunity to make Nvidia look daft with much more reasonable pricing, but we have taught AMD that it doesn't actually do anything for their market share - all it does is leave margins on the table - so I don't have a lot of hope for that.
My recent GPU history:
GTX 970, £270
GTX 1070, £370
RTX 2070 Super, £480
Now the xx70-class card is £900.
I can't do this anymore....
So you are saying ...
Jensen is disappointed in you.
I tried to think of something witty, I have nothing.
Which in itself is quite witty.
If DLSS 3 takes data from the next frame to fill in the detail on the low-res frame it puts on your screen, how does it do that? Wouldn't it need to render the next frame to get that detail?
I guess it can render the next frame, but not display it. Then you add the frame(s) in between and start displaying the frames.
There are commands in certain games/engines like Render Ahead x (I think it was, and probably still is, in the Nvidia drivers), where a higher x meant a smoother experience at the cost of some input lag, while lower values meant lower input lag but increased stutter. Perhaps this works in a similar way, but now the hardware has enough power to generate frames between those reference ones to increase smoothness (there was talk that for each "real" pixel, seven more are generated by AI?).
In a way, that's how v-sync works, if I'm not mistaken: holding back/generating just enough frames to match the display refresh rate, resulting in a slight lag - sometimes bigger, depending on the game.
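To make the latency point concrete, here's a rough Python sketch of how I picture the timing. Everything in it is made up for illustration (the function names, the 16 ms frame cost) - it's not Nvidia's actual pipeline, just the idea that the generated frame between N and N+1 can only appear once N+1 has already been rendered.

```python
# Rough sketch of frame-generation timing (illustrative only, not Nvidia's code).
import time

def render(i):
    """Stand-in for rendering a real frame: this is where GPU time and the
    latest player input actually go."""
    time.sleep(0.016)  # pretend a real frame costs ~16 ms (illustrative)
    return f"real frame {i}"

def generate_between(a, b):
    """Stand-in for the AI/optical-flow frame: cheap to make, but it never
    sees new player input -- it is built purely from two rendered frames."""
    return f"generated frame between ({a}) and ({b})"

def present(frame):
    print(f"{time.monotonic():.3f}  display: {frame}")

prev = render(0)
present(prev)
for i in range(1, 4):
    nxt = render(i)                       # frame N+1 has to be rendered first...
    present(generate_between(prev, nxt))  # ...then the in-between frame is shown
    present(nxt)                          # ...and only then the real frame N+1
    prev = nxt
```

So the frame counter roughly doubles, but the newest real frame always reaches the screen a step late, which is why it does nothing for input latency.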
Sad state of affairs really, even more so when you think most games now are rubbish - maybe one or two a year that are really good, if you're lucky. Maybe this forum is just full of snobs with much better kit than the average, but it's hard not to see the death of PC gaming if this continues. Crazy that £700ish for a 1080 Ti now seems sensible, with high end now more than double that.
I do one better: people paying this money to wait on games released on the PS5 five years later, while many can get those games now, looking great, performing great and on discount lol.
If AMD's cards are smaller than a medium sized house they'll be winning.
From memory, I believe the popular rumour is that the main die is about 300 mm².
Maybe, I don't know what to do... Nvidia just launched a £900 "4080 12GB" that's really a 4070, so is that then a £700 "4070" that's really a 4060?
I would like to hope AMD will seize the opportunity to make Nvidia look daft with much more reasonable pricing, but we have taught AMD that it doesn't actually do anything for their market share - all it does is leave margins on the table - so I don't have a lot of hope for that.
My recent GPU history:
GTX 970, £270
GTX 1070, £370
5700XT
RTX 2070 Super, £480
Now the xx70-class card is £900.
I can't do this anymore....
Maybe, I don't know what to do... Nvidia just launched a £900 "4080 12GB" that's really a 4070, so is that then a £700 "4070" that's really a 4060?
I would like to hope AMD will seize the opportunity to make Nvidia look daft with much more reasonable pricing, but we have taught AMD that it doesn't actually do anything for their market share - all it does is leave margins on the table - so I don't have a lot of hope for that.
My recent GPU history:
GTX 970, £270
GTX 1070, £370
RTX 2070 Super, £480
Now the xx70-class card is £900.
I said **** it and got an RTX 4080 12GB on pre-order
FTFY
The point is it would be using GPU cycles to render a frame it's not displaying. Displaying the frame costs nothing; rendering it costs GPU cycles, the same as if it was putting that image on your screen. It's rendering a higher-res image to gather data from, not to display on the screen. So does Jensen's explanation of what's going on here make any sense to you? It doesn't to me.
From memory: the AI is trained on supercomputers using high-res images - that was/is (?) the model for current versions. Using that, it can accurately "guess" how to fill in the image properly when it upscales it.
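For what it's worth, here's how I'd sketch the distinction being argued about, as hedged pseudocode - the class and method names are invented for illustration, not Nvidia's API. On this reading, super resolution never renders a higher-res frame at runtime (the detail comes from a network trained offline on high-res images plus samples carried over from previous frames), while frame generation needs two real, fully rendered frames and slots an extra one between them.

```python
# Hedged sketch of the two features being debated (invented names, not Nvidia's API).
class DummyDLSS:
    """Stand-in for the trained network; invented for illustration only."""

    def upscale(self, low_res_frame, motion_vectors, previous_output):
        # Super-resolution style: no higher-res frame is rendered at runtime.
        # The extra detail comes from offline training on high-res images plus
        # samples reprojected from previous frames via the motion vectors.
        return f"upscaled({low_res_frame}, {motion_vectors}, {previous_output})"

    def generate(self, frame_n, frame_n_plus_1, optical_flow):
        # Frame-generation style: both inputs are real rendered frames. Frame
        # N+1 must exist (and be held back) before the in-between frame can be
        # produced, which is where the latency cost comes from.
        return f"generated({frame_n} ~ {frame_n_plus_1}, {optical_flow})"

dlss = DummyDLSS()
print(dlss.upscale("frame_42_low_res", "motion_vectors_42", "output_41"))
print(dlss.generate("frame_42", "frame_43", "optical_flow_42_to_43"))
```

If that reading is right, the GPU cycles go into rendering frame N+1 as a normal frame that will be displayed (just held back briefly), not into a hidden higher-res render that only exists to be sampled.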