NVIDIA 5000 SERIES

Just to quote those posts for the lols:

Still not getting a 5090 :p



There are more but these will do :p
 

But he's happy getting 50fps with a 4090 on his 240Hz monitor
 
But he's happy getting 50fps with a 4090 on his 240Hz monitor

Don't think he is. Hence a 5090 incoming imo. Not just that, but new tech and FOMO.

He will start justifying it by saying he got lots of money for his 4090 and it was only xxx more for the 5090.

Obvious stuff. And I told him this, but he was not having it, saying how the 4090 is so powerful that he won't need an upgrade etc etc, as you can see in his long post I quoted above, which aged like milk :cry:
 
Don't think he is. Hence a 5090 incoming imo. Not just that, but new tech and FOMO.

He will start justifying it by saying he got lots of money for his 4090 and it was only xxx more for the 5090.

Obvious stuff. And I told him this, but he was not having it, saying how the 4090 is so powerful that he won't need an upgrade etc etc, as you can see in his long post I quoted above, which aged like milk :cry:

We already know he's going to buy a 5090 sooner or later
 
We already know he's going to buy a 5090 sooner or later

We knew from a year ago. He is only starting to realise that now, I think.

The bit that makes me laugh is how he speaks with 100% certainty and lays out the reasons why he won't be upgrading. Proper lol stuff :D
 
We knew from a year ago. He is only starting to realise that now, I think.

The bit that makes me laugh is how he speaks with 100% certainty and lays out the reasons why he won't be upgrading. Proper lol stuff :D
My FIL, every time...

Week 1 - Have you seen the new x-series BMWs? [Randomly interjected during a conversation about snooker]

Week 2 - Those x-series sure are sharp [non sequitur while driving]

Week 3 - "I don't need a new car, my current one is the bee's knees." [looking disappointedly at his late-model Jag he bought less than two years ago]

Week 4 - Manufactures minor issues and annoyances with current car.

Week 5 - New x-series sitting on the drive. "They gave me a smacking deal! I couldn't afford to pass it up."
 
To me that would err on the side of the 5080 being the cheaper one. Otherwise all the reviewers would be like "if you're spending this much, just wait a few weeks".

On the other side, if this is all about the Trump tariffs, wouldn't they release the strongest card first?
 
Wcftech seem to think the 5080 will be first out of the gate this time.
In fact, for enthusiasts, the wait time for the NVIDIA GeForce RTX 5090 will be just a few weeks apart from the RTX 5080.
It is said that both the RTX 5090 and the RTX 5090D are eyeing a debut by the end of January or mid-February.
The RTX 5090 will once again offer a gargantuan upgrade over the existing 90-class offerings and set a new benchmark for high-end PC gaming.

Might not be till mid-February before we know the 5090 price :eek::eek::eek:
Or will they release the prices at CES?
 
The sugar tax is only for drinks and wouldn't explain the reduction in cocoa, cocoa butter or milk content, or the increase in sweetness of modern chocolate. Cadbury's chocolate has gotten worse, for example, and Terry's Chocolate Orange is terrible now: less orange flavour and very sweet.
Modern biscuits seem to be excessively sweet.
Instead of natural sugar, sugar syrups have been increasingly used in many products over the last decade.
That sweetness masks poorer-quality ingredients. It's why a lot of US sweets are very sweet.

I always felt it was also to do with getting older and it becoming blasé. The only ones I still enjoy now and then are peanut M&M's, Maltesers, Mr Tom and Mikado.

Plan to cut those out too from the new year tbh. Not that I had them often.
 
My theory is that they're releasing the 5080 first because it'll only look like a good buy in isolation: if the 5070 Ti is, say, 20% cheaper with 10% less performance, and the 5090 is 30% more expensive with 60% more performance, then the 5080 is going to be a tougher sell.

(I am of course pulling random numbers out of my arse)
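Those made-up numbers already tell the story if you work them through. A quick sketch using only the hypothetical figures from the post above (not real prices or benchmarks), with the 5080 as the 1.0 baseline for both price and performance:

```python
# Rough value comparison using the hypothetical numbers from the post above
# (20% cheaper / 10% slower 5070 Ti, 30% dearer / 60% faster 5090).
# None of these figures are real prices or benchmarks.

cards = {
    # name: (relative price, relative performance), 5080 = 1.0 baseline
    "5070 Ti": (0.80, 0.90),
    "5080":    (1.00, 1.00),
    "5090":    (1.30, 1.60),
}

for name, (price, perf) in cards.items():
    value = perf / price  # performance per unit of money, higher is better
    print(f"{name:8s} perf/price = {value:.2f}")
```

With those numbers the 5070 Ti lands around 1.12 perf/price and the 5090 around 1.23, both ahead of the 5080's 1.00, which is exactly why reviewing the 5080 in isolation flatters it.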
 
You may be a person that’s very susceptible to some of these issues but I think, for some people, there is a certain amount of ‘retconning’ when it comes to why they dislike ray-tracing.
No, I simply have functioning eyes. :) Unlike the tiny details people like to bring up, which in many cases need high magnification to see, crawling noise is very visible in motion, as is ghosting on many objects. Our eyes react mainly to contrast and luminance changes rather than colour (we have far more receptors responsible for luminance than for colour recognition), and in many of these cases you have very contrasting noise crawling over darker areas, shadows, textures, reflections etc. It feels like one would have to have physically damaged eyes not to see it. :D That said, I dislike noise in the image generally; I always turn off film grain and similar effects as well, as they are too distracting for me. Some people might consider noise normal and not pay attention to it, but it's not normal. To be clear, RT/PT games aren't the only games with this problem: BO6 doesn't use RT AFAIK and the noise is so bad I stopped playing it (plus other artefacts, as the game is still very buggy). Devs use cheap solutions instead of proper ones to cut costs, then try to mask it with TAA, or don't even bother, and then it looks really bad.
Slowly reacting GI in, for example, CP2077 (before AI partially fixed it) looked so freaky and unnatural, defying the physical understanding of how light works that my brain learned by being outside now and then, that it was very noticeable and distracting.

Before that HUB video it was:

“It tanks performance and you can barely see the difference. Not worth it. Also things can look too shiny.”

Not invalid points! Some people could legitimately argue that RT is rubbish for these reasons, from the perspective of preference.
As HUB showed in another video, there are plenty of games where RT doesn't bring any improvement at all, or even worsens the image. RT/PT by itself is just one of many tools in a developer's portfolio, and how they use it and to what effect is a very different thing from just saying "it's always better"; as reality has shown, it's not always better. Most of the time it's also outside the usable range of most gamers (the majority have xx60-class cards, often from previous generations), or it forces 30FPS on consoles, so it gets disabled to reach 60FPS.

Now after that video it’s:

“Textures become too noisy! It’s weird when you move around! The reflections are lower resolution!”
I've been mentioning this for years now, on here (and not only). :) I am glad HUB actually released the video (TI and other channels have also shown similar things) to shine some light on why people think their eyes got worse or games got blurrier: games really did get much blurrier! It's mostly caused by a premature push of this tech to market, plus cost-cutting publishers shipping games ASAP without giving devs time to do things properly. Modern GPUs (even the 4090) are way too slow (hundreds of times, actually) to generate a proper-quality PT image; they barely handle simplified RT as it is. We get a very small number of samples per pixel, so the final image is just a noisy mess; Q2 RTX is a good demonstration of how the raw PT image in games really looks before all the crutches are used to turn it into something more usable.
It then has to be fixed by TAA and denoising over multiple frames. That removes a bunch of detail (detail that never existed in the first place, since the source image is just a horribly noisy mess), causing a blurry image, ghosting (the TAA here is effectively temporal accumulation, not anti-aliasing) and other artefacts. This can't be fixed without AI dreaming up the pixels missing from the source image, or without adding hundreds of times more rays to it, which at the current pace most likely won't happen for decades.
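The temporal accumulation described above can be sketched in a few lines. This is a toy model of my own for illustration (not any engine's actual denoiser): each frame contributes one noisy sample per pixel, and an exponential moving average blends it with history. The `alpha` blend factor and the uniform noise model are assumptions chosen for demonstration.

```python
import random

def accumulate(true_value, frames, alpha=0.1, noise=0.5, seed=42):
    """Toy temporal accumulation: blend each new noisy one-sample-per-pixel
    estimate into a running history, as TAA-style denoisers do."""
    rng = random.Random(seed)
    history = None
    for _ in range(frames):
        sample = true_value + rng.uniform(-noise, noise)  # noisy PT sample
        history = sample if history is None else (1 - alpha) * history + alpha * sample
    return history

# Average the per-pixel error over many "pixels" (seeds): a single frame is
# very noisy, while accumulating 200 frames converges near the true value.
e1 = sum(abs(accumulate(1.0, 1, seed=s) - 1.0) for s in range(300)) / 300
e200 = sum(abs(accumulate(1.0, 200, seed=s) - 1.0) for s in range(300)) / 300
print(f"mean error after 1 frame: {e1:.3f}, after 200 frames: {e200:.3f}")
```

The catch is exactly what the post describes: convergence takes many frames, and any camera or object motion invalidates the history, which is where the ghosting and smearing come from.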

^Not invalid points either, technically, but you’d have to be a hawk to notice some of these things and pretty much nobody was talking about them before.
Absolutely not true: topics about it, with screenshots and videos, have been posted on these forums and elsewhere for years now. Consumers love selective amnesia to excuse their own spending on shiny new toys. :)

If anything, these points show that the tech has some downsides, but they aren’t primary reasons for disliking it IMO.
It is for me. It's way too early for this tech in its current state, from both quality and performance standpoints, and yet it's being pushed onto a majority of the market who can't even run it properly. NVIDIA marketing and money talk, really. We'll get there in time, once it reaches lower tiers of GPUs with sensible performance after all the AI crutches are added (many of them revealed in recent scientific papers, a lot coming from NVIDIA itself). Without all the AI helpers it's near unusable for me in its current form, outside of tech demos as a curiosity; I don't like playing blurry games full of noise, ghosting and other artefacts. To me it's like releasing a new car model with no wheels yet, just on sledges with one old, slow horse pulling it, and promising the wheels will arrive eventually, by which point you might need a new car anyway. Not an ideal solution from the consumer's point of view, is it?

Even the HUB video notes at several points that the visual presentation of lighting is generally better with RT enabled.
In games where it works well and actually brings visual improvements, yes. As per their slightly older video, there are plenty of examples where it doesn't bring anything good to the table, for example in D4.

Personally, I dislike it when things appear too shiny - that’s the aspect that bothers me. Hence when there is an option for RT shadows only (rather than reflections) I have sometimes opted for that.
Shiny, flat surfaces are much easier and faster to calculate in RT/PT, as all the rays reflect in the same way and can be calculated together, saving loads of time. Add roughness or uneven surfaces and things quickly get much slower. This is one of the main reasons they do it: it looks bling, and it also runs faster (more FPS!). Personally I like GI most of all, but it's still too laggy and noisy in dynamic games; in static games you might as well bake it in, or use hybrid methods. Over 10 years ago NVIDIA showed voxel-based RT for GI, shadows etc. that worked so well it passed realism tests, but they ditched it for some reason (it ran well on 900-series cards!) and instead decided to push full PT on GPUs that are just way too slow for it. And here we are.
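The point about mirror-like surfaces being cheap follows straight from the reflection formula: a perfectly smooth surface needs exactly one reflected ray per hit, r = d − 2(d·n)n, while a rough surface has to average many jittered rays over a lobe. A minimal sketch of my own (the one-ray-versus-many contrast is the point, not any particular engine's code):

```python
def reflect(d, n):
    """Mirror reflection of direction d about unit normal n: r = d - 2(d.n)n."""
    dot = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2 * dot * b for a, b in zip(d, n))

# A perfect mirror: one ray in, one ray out, a single traversal of the scene.
r = reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0))
print(r)  # (1.0, 1.0, 0.0): the ray bounces back upward

# A rough surface would instead trace many perturbed copies of r and average
# them, multiplying the ray count (and the noise, if you can't afford enough
# samples), which is why "everything looks wet and shiny" is the cheap option.
```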
 
or a secondhand 4090 24GB with no warranty with risk of something being wrong with it (remember the melting connectors) and being sold by some random stranger
Fun fact: the connector does NOT change with the 5000 series cards. And the 5090 has a 600W TBP, it seems, so most likely we'll see plenty more melted connectors. :)
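The worry is easy to put numbers on. A back-of-the-envelope sketch: the 600W figure is the rumour from the post above, and the six 12V current-carrying pins are how the 12VHPWR-style connector is commonly described, so treat both as assumptions rather than spec quotes:

```python
# Back-of-the-envelope current load on a 12VHPWR-style connector.
# 600W is the rumoured 5090 TBP from the post above; six 12V pins is the
# commonly described layout of the connector. Both are assumptions here.

tbp_watts = 600.0        # rumoured board power
rail_volts = 12.0        # the connector carries the 12V rail
current_pins = 6         # 12V current-carrying pins

total_amps = tbp_watts / rail_volts          # 50.0 A total
amps_per_pin = total_amps / current_pins     # ~8.33 A per pin

print(f"{total_amps:.1f} A total, {amps_per_pin:.2f} A per pin")
```

Around 8.3A per pin leaves little headroom against the roughly 9A-plus per-pin ratings commonly quoted for these terminals (treat that rating as an assumption too); one poorly seated pin pushes the rest higher, which is the melting-connector failure mode in a nutshell.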
 
We knew from a year ago. He is only starting to realise that now, I think.

The bit that makes me laugh is how he speaks with 100% certainty and lays out the reasons why he won't be upgrading. Proper lol stuff :D
It's no different from him saying he's not a pixel peeper while I still recall the dozens of screenshots he posted doing exactly what he claims he doesn't do. :)
 