4080 and 4085. Wait, what? You mean it's not just the amount of memory? Maybe they should give them different names.
We can't directly compare these. The big difference with the 40x0 cards - and the thing that's really jacked up the price - is the size of the L2 cache. Even the badly named card has 8x the cache of the 3090 Ti. That cache makes brute bandwidth less significant.

The extra L2 cache is likely a cost-saving measure, as it enables Nvidia to cut down the bus size.
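To put a rough number on the cache-vs-bandwidth point, here is a minimal sketch (Python, with entirely hypothetical hit rates and traffic figures - Nvidia doesn't publish these) of how a larger L2 reduces the raw DRAM bandwidth a GPU needs:

```python
# Illustrative only: the more memory traffic the L2 absorbs,
# the less raw DRAM bandwidth is needed for the same workload.
def dram_bw_needed(demand_gbs: float, l2_hit_rate: float) -> float:
    """DRAM bandwidth required after the cache filters traffic."""
    return demand_gbs * (1.0 - l2_hit_rate)

demand = 1000.0  # hypothetical GB/s of traffic generated by the shaders
for hit_rate in (0.3, 0.6):  # assumed hit rates for a small vs large L2
    bw = dram_bw_needed(demand, hit_rate)
    print(f"L2 hit rate {hit_rate:.0%}: ~{bw:.0f} GB/s of DRAM bandwidth needed")
```

With the assumed numbers, a bigger L2 drops the required DRAM bandwidth from ~700 to ~400 GB/s, which illustrates why a big cache can let a narrower bus keep up.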
Conversely, it's fortunate that PC games have enough settings to enable playing new releases on your current hardware for years to come. I get that people want all the resolution, framerate and whatever, but that doesn't exactly exclude anyone who can't afford or justify the latest hardware.

Well, I'm one of those who would rather just not play than play at a substandard level. It's like clay shooting: when I can't afford to do that anymore, I'm not going to go paintballing just to experience something similar.
Jensen is right, he just failed to explain his position by answering as if he's talking to 12-year-olds.

Yes, that is all true. They also didn't really hide it - when J.H. had a call with shareholders just a few weeks ago, he explained NVIDIA's plans clearly: manipulate the market, make sure the 3000 series sells without dropping prices or margins, and thereby protect the high margins on their future products (the 4000 series and later). In short, they want to uphold the high margins from the mining era just because they think they can, irrespective of any production cost increases. Satisfying the consumer wasn't mentioned there at all. It's not really a secret - it's been posted all over the internet - but it had oddly small reach amongst gamers; most people seem not to have heard about it. Though NVIDIA loves to forget that they do have growing competition (AMD, and soon hopefully Intel), which will hopefully force them to adjust their thinking to be more realistic.
The price of high-end chips is not going down - just take a look at TSMC's pricing schedule. You can't manufacture the same size wafer on 5nm and expect it to be cheaper than 12nm; it ain't happening. It used to work that way in the past: smaller node processes were not only denser and more efficient but also cheaper to use.

We're just talking about the silicon here, one of many costs that go into a GPU, and just because the silicon cost has gone up doesn't mean the GPU's price needs to. Nvidia is purposefully designing these expensive GPUs when it doesn't need to, and it's purposefully chasing very high margins - and those things have a greater impact on GPU prices than the cost of making a smaller transistor.
I was sent an email from them a few hours ago notifying me of new stock. I opened the email 50 minutes later and it was already sold out... smh.

A quick question - and I guess it's more of a "what do you think": I'm going for the ZOTAC, which takes 4 cables into the graphics card. Corsair have a cable for their PSUs which is a 2-cable one that can handle the full 600W. I'll be OK to use either, right? I could use the Corsair one if/when it comes to the UK.
A 5nm wafer is really very expensive, at $16,988 each.

You get more chips per wafer though, so it evens out - the per-chip price is all that matters.
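The "more chips per wafer" trade-off is easy to sanity-check. Here's a minimal sketch (Python) using the $16,988 wafer price quoted above, a 300mm wafer, hypothetical die sizes, and the standard gross-dies-per-wafer approximation (ignoring yield and scribe lines):

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Standard approximation for gross dies on a circular wafer."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

WAFER_COST = 16_988  # 5nm wafer price quoted above, in USD
for die_area in (300, 600):  # hypothetical die sizes in mm^2
    n = dies_per_wafer(300, die_area)  # 300mm wafer
    print(f"{die_area} mm^2 die: ~{n} dies/wafer, ~${WAFER_COST / n:.0f}/die at 100% yield")
```

Note that halving the die area more than doubles the die count (edge losses shrink too), so per-chip silicon cost falls faster than the wafer price alone suggests.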
The only reason the 4080s are so overpriced for what you get is the vast stock of 30-series NVIDIA still needs to sell. That's it. That's the only reason. No Moore's Law (dead or otherwise), no L2 cache. Just an overabundance of 30-series they have to make look appealing price-wise or they will never sell.

Get a 4090 if you can afford it; otherwise wait till the backlog of 30-series is sold through. Only then will the price come down to where it should be.

In my view, get a 4090 only if you actually make money with it, or are an enthusiast with enough money that it doesn't matter much to you. If you'd have to take credit for it, and you want it more than need it, plus it brings you no money... that would be a really silly thing to do. In that case it's just a toy that will lose value rapidly for a bit of entertainment. I reckon each person should actually think about it before buying, or face potentially huge buyer's remorse later.
Nvidia has released benchmarks for Overwatch 2.

This is average FPS at 2560x1440, max settings.

Using the RTX 3080 as a base, the RTX 4080 12GB is 19% faster than the 3080, the 4080 16GB is 47% faster, and the RTX 4090 is 105% faster.

Overwatch 2: GeForce RTX 40 Series GPUs with NVIDIA Reflex reach 360+ FPS, 1440p
New 600 FPS frame rate cap and system latency as low as 8 milliseconds thanks to the unbelievable performance of GeForce RTX graphics cards, and the magic of NVIDIA Reflex. www.nvidia.com

They also suggest/recommend in the table in that article that the 3000 series (even the 3080 Ti, and by extension the 3090) is good only for 1080p these days. Want 1440p or higher? Only the 4000 series will do, apparently. Never mind that this is for 360+ FPS (the number of people owning such monitors is minuscule) at full details only. Ah, the PR/marketing BS in full force.
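Those uplift figures are relative, so absolute FPS depends on a baseline Nvidia doesn't state here. A quick sketch (Python, with an assumed 3080 baseline purely for illustration) of how the percentages translate:

```python
# BASE_FPS is an assumption for illustration; Nvidia's chart only gives uplifts.
BASE_FPS = 250.0  # hypothetical RTX 3080 average FPS

uplifts = {"RTX 4080 12GB": 0.19, "RTX 4080 16GB": 0.47, "RTX 4090": 1.05}
for card, uplift in uplifts.items():
    fps = BASE_FPS * (1 + uplift)
    print(f"{card}: {fps:.0f} FPS ({uplift:.0%} faster than the assumed 3080 baseline)")
```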
To be fair, that's tagged as "competitive play" rather than for normal muggins.

Competitive peeps usually lower details a lot - and not just to pump up the FPS, but also to remove visual clutter from the screen - so that table would be of no use to them. Also, looking at the per-frame latency in that table (which is directly comparable to overall latency whilst playing), at such high FPS numbers it's a 1-2ms difference... No human being can perceive such a small latency change, not even super-human competitive players.
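The 1-2ms claim above is just frame-time arithmetic (frame time = 1000 / FPS). A quick check in Python:

```python
def frame_time_ms(fps: float) -> float:
    """Time budget per frame in milliseconds."""
    return 1000.0 / fps

# Going from an already-high FPS to an even higher one saves very little time.
for low, high in ((240, 360), (360, 600)):
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} FPS: frame time shrinks by {saved:.2f} ms")
```

Even the jump from 360 to 600 FPS only shaves about 1.1ms per frame, which supports the point that the difference is imperceptible.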
You only need a 4090 if you play at 4K or ultrawide and want to chase FPS to match a high-Hz monitor/TV.

Other than that, pick up a 3090 or 6900 XT for very high performance at lower resolutions - these cards do well at 4K anyway. DLSS 2.0 does a good job in games where it's implemented well.

Add PC VR to that as well. Unfortunately, even a 4090 isn't going to max out one of the higher-end HMDs. VR is a cruel mistress.
Smaller chips mean more chips per wafer.

This is due to smaller fab processes providing increased performance per area on the chip, with increased density. I hope I didn't bork the wording, but I'm sure you know what I mean.

I think wafer cost doesn't compare to R&D cost; the focus is on it because it sounds like a lot to you and me when we see the wafer price.
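To illustrate the R&D-vs-wafer comparison, a minimal sketch in Python - every number here is a made-up assumption, since Nvidia doesn't break out these figures:

```python
# All figures are hypothetical assumptions, purely to illustrate amortization.
rnd_cost_usd = 3_000_000_000   # assumed R&D spend for a GPU generation
units_sold = 15_000_000        # assumed lifetime unit sales
silicon_cost_usd = 130         # assumed per-die silicon cost

rnd_per_unit = rnd_cost_usd / units_sold
print(f"Amortized R&D per GPU: ${rnd_per_unit:.0f}")
print(f"Silicon per GPU:       ${silicon_cost_usd}")
```

Under these made-up numbers, the amortized R&D ($200 per unit) exceeds the silicon cost, which is the point: the headline wafer price isn't necessarily the dominant cost.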
This is a widely ignored point when people say not to buy the 4090. I'm a VR gamer with a higher-end headset and definitely need the power boost!