NVIDIA ‘Ampere’ 8nm Graphics Cards

What are the options for getting a 3090? Can you wait outside the shop and order, or do you have to try your luck again at 2pm on Thursday? Could someone give me some pointers on how to get my card and not end up with a blank screen like last week? Do you have any idea of the stock levels of the 3090s?
 
It's looking like I can be intensely relaxed. All the benchmarks are really telling me is how good my 2080 Ti is. I'm not fussed about RTX (granted it's nice), but I can wait until the stock situation improves and even then probably not bother until Hopper. That'll be 4 years for my 2080 Ti and thus about £20 a month.
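The £20-a-month figure above is just the purchase price amortised over the card's lifetime. A minimal sketch of that arithmetic, assuming a £1,000 purchase price (a hypothetical round figure; actual 2080 Ti prices varied) kept for four years:

```python
# Rough cost-per-month sketch. The price is a hypothetical round figure,
# not a quoted 2080 Ti price.
purchase_price_gbp = 1000
years_of_use = 4

cost_per_month = purchase_price_gbp / (years_of_use * 12)
print(f"~£{cost_per_month:.2f}/month")  # ~£20.83/month
```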

Of course, if AMD release a cracker, GPU prices will come down and make one or the other a better value proposition.

If you're playing at 4K, you're going to struggle with Cyberpunk 2077 at ultra settings, mark my words.
 
It does not say worst. It would be nice if they actually admitted it was the worst launch ever, but they didn't. They are claiming it was their best launch ever.

To me, even though I managed to get one, it was the worst launch I've ever seen.

OK, most frustrating then. I thought I read 'worst' somewhere.
 
 
Have you not watched Cyberpunk's visual demos? It's obviously next-gen graphics; it's on a whole new level...

I've watched a few early release videos of it and I have to say that they have a very long way to go yet, with apparently not much time left to deliver. The lighting and shadows are especially terrible. It has an amazing aesthetic and I'm sure RTX use will add some really nice ray tracing, but honestly, so far I'm not next-gen blown away. I don't think it was ever really designed to be a visual masterpiece though; I don't know where that hype came from.
 
This is how it should be. Top resolution at ultra should always be too much for the current gen.
But their recommended GPU is a GTX 1060:
SYSTEM REQUIREMENTS

  • MINIMUM:
    • OS: Windows 7 or 10
    • Processor: Intel Core i5-3570K or AMD FX-8310
    • Memory: 8 GB RAM
    • Graphics: NVIDIA GeForce GTX 780 or AMD Radeon RX 470
    • DirectX: Version 12
    • Storage: 70 GB available space
    • Additional Notes: SSD recommended

  • RECOMMENDED:
    • OS: Windows 10
    • Processor: Intel Core i7-4790 or AMD Ryzen 3 3200G
    • Memory: 12 GB RAM
    • Graphics: NVIDIA GeForce GTX 1060 or AMD Radeon R9 Fury
    • DirectX: Version 12
    • Storage: 70 GB available space
 
Have you not watched Cyberpunk's visual demos? It's obviously next-gen graphics; it's on a whole new level...

It's been in development for a few years. In fact, it was supposed to come out shortly after the 2080 Ti, hence the Cyberpunk 2077 edition of the 2080 Ti. It has been delayed more than once to improve performance, too. It will also use DLSS.

COD MW looks amazing, almost next gen even, yet flies along. Just because a game looks nice doesn't mean it is going to run like crap. Death Stranding, for example, is stunning and runs wonderfully on a 2080 Ti, even at 4K.

Releasing your game for cards that don't exist yet is always a bad idea. It would just get slated, which is probably why it has been delayed.
 
No, I don't know exactly how much Nvidia's supercomputer costs to run. What I do know is that, in general, supercomputers are expensive to run. There is the electricity bill, the cooling bill, the cost of land, and the manpower to maintain the equipment. There is also the cost of the computer itself.

If it were quick/cheap to stick DLSS in games, we would have it in more games by now. My assumption is that they are working it in around more significant projects, which is why we have only a handful of games with DLSS after two years.

If they are going to put it in a significant number of games, they would probably want to build a dedicated supercomputer to train DLSS on.
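A toy model of the running costs mentioned above. Every figure here is a made-up placeholder purely to show how the cost categories add up; no real Nvidia numbers are implied.

```python
# Toy annual-cost model for a supercomputer. All figures are hypothetical
# placeholders, not real data.
annual_costs_usd = {
    "electricity": 2_000_000,  # hypothetical
    "cooling": 500_000,        # hypothetical
    "staff": 1_500_000,        # hypothetical
    "facility": 800_000,       # hypothetical
}
hardware_usd = 50_000_000      # hypothetical one-off purchase
amortisation_years = 5

total_per_year = sum(annual_costs_usd.values()) + hardware_usd / amortisation_years
print(f"${total_per_year / 1e6:.1f}M per year")  # $14.8M per year with these placeholders
```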

DLSS 1.0 was trained per game, whereas the new DLSS 2.0 and the upcoming 2.1 are trained in a general, game-agnostic way, and this will likely be the standard moving forward. It's just that the game/engine implementation is still done on a per-game basis, but that's true of almost all AA types today, except for the post-processing shader ones, which are lame and no one should ever use. As with all new tech, it's a steady evolution and then adoption.
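The per-game versus game-agnostic distinction can be sketched as follows. This is purely illustrative (the function and model names are hypothetical, not NVIDIA's actual API): the point is that a 1.0-style scheme needs a dedicated network per title, while a 2.0-style scheme ships one shared network and only the engine integration is per-game.

```python
# Illustrative sketch, not NVIDIA's real API: per-game vs generic models.

# DLSS 1.0 style: one trained network per title (hypothetical identifiers).
per_game_models = {
    "GameA": "model_trained_on_GameA",
    "GameB": "model_trained_on_GameB",
}

def pick_model_v1(title):
    # A title with no dedicated model simply cannot use the 1.0-style scheme.
    return per_game_models.get(title)

# DLSS 2.0 style: a single generalised network shared by every title;
# only the engine-side integration (motion vectors, jitter) is per-game.
GENERIC_MODEL = "model_trained_on_generic_content"

def pick_model_v2(title):
    return GENERIC_MODEL

print(pick_model_v1("GameC"))  # None - no per-game network exists
print(pick_model_v2("GameC"))  # the shared network works for any title
```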
 
DLSS 1.0 was trained per game, whereas the new DLSS 2.0 and the upcoming 2.1 are trained in a general, game-agnostic way, and this will likely be the standard moving forward. It's just that the game/engine implementation is still done on a per-game basis, but that's true of almost all AA types today, except for the post-processing shader ones, which are lame and no one should ever use. As with all new tech, it's a steady evolution and then adoption.

Seriously, the most disappointing thing about RT for me was DLSS. The most impressive thing about RT for me was DLSS 2.0. It really is amazing. Without it we would still be two or three generations away from proper RT games that can run at a decent framerate.

That said, I don't think it looks any better than RAGE 2, which wasn't a fantastic game, but man, it was eye candy.
 
I still feel that AI/DLSS is just an excuse for Nvidia to flog compute cards to gamers. If AMD takes the performance crown or handily beats Nvidia, I would not be surprised if they drop Tensor cores like a rock and just pack the chips with more transistors that directly contribute to raster performance.

My theory is that these six-year console cycles are hurting PC GPU sales, as a lot of games are aimed at being multiplatform and are built for the lowest common denominator, basically the consoles. PC is an afterthought.

So we get about three generations of cards in those six years. On day one, the consoles are equivalent to an average-spec PC GPU/CPU; by year two they're fundamentally low-end PCs; and by year six, where we are now, they're just kind of a joke. By year six in this cycle the demand for rasterisation is low, but the ability to provide it on PC is insane. So we crank settings and up the resolution to true 1440p/4K at 60-120fps instead of 30. But even after all that, it's hard to find where to spend that power.

Nvidia took a gamble: they knew ray tracing was just within reach, but that performance at the resolutions games now expect (1440p/4K) would be impossible. DLSS is a necessary component of making RTX actually functional. Both of these things rely heavily on specially reserved transistors.

AMD could go 100% raster with a similar-sized GPU, and hence offer more total raster performance for your dollar. But what would you spend that additional performance on? 4K is still a vanishingly small segment of gamers.

This happened a few generations back, when one cycle of AMD's cards was so good at 1080p, which everyone ran, that they just stuck six video outputs on the backplate and pushed triple-monitor gaming.

The mainstream 10-15 games used for extreme benchmarking of these cards aren't themselves good representatives of the gaming market. You can't throw a few killer apps at a card to justify it; gamers have to look and see what they'll get out of it, and whether it brings them broad benefit. Nvidia basically gave the common gamer a reason to upgrade: new effects, and DLSS as a way into the 4K bracket.

I'm really intrigued to see what AMD show off; based on the consoles, it does look like they're adopting RT, but in a far more conservative way.
 
Seriously, the most disappointing thing about RT for me was DLSS. The most impressive thing about RT for me was DLSS 2.0. It really is amazing. Without it we would still be two or three generations away from proper RT games that can run at a decent framerate.

That said, I don't think it looks any better than RAGE 2, which wasn't a fantastic game, but man, it was eye candy.

I hated DLSS 1.0; it was a crap upsample, and image-quality purists like myself saw it not only as a step back, but as fundamentally not even AA in the classic sense.

DLSS 2.0 is actually serviceable; I was honestly surprised how good they got it. But you're right, ray tracing is SO hard that not only is it done at quite a low resolution internally, but the rays per pixel used to sample the game space are quite low, which means you have to de-noise that output before you can even blend it with the raster portion and upscale the whole lot.

We've almost hit the reset button on PC graphics now. To get modern 1440p native-res RT with a decent number of rays per pixel would, I suspect, take another 10 years or five generations alone, ignoring all of the other growing demands on the GPU.
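Some back-of-envelope arithmetic shows why real-time RT has to cut corners on resolution and rays per pixel. The figures below are illustrative assumptions, not measured numbers; even at just one ray per pixel, native 1440p at 60fps already demands hundreds of millions of rays per second, before any bounces, denoising, or the rest of the frame.

```python
# Back-of-envelope ray budget at native 1440p. Illustrative assumptions only.
width, height = 2560, 1440   # native 1440p
fps = 60
rays_per_pixel = 1           # real-time budgets are typically this low

rays_per_second = width * height * fps * rays_per_pixel
print(f"{rays_per_second / 1e9:.2f} gigarays/s")  # ~0.22 gigarays/s at 1 ray per pixel
```

Offline film renderers commonly use hundreds of samples per pixel, which is the gap the denoise-and-upscale pipeline is papering over.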
 
Personally, I'm still not on board with DLSS. Even the latest incarnation has issues around motion which you don't see in a lot of the comparisons, but which bother me when I actually spend time playing with it. It's one of those things, like the grainy smearing on "gaming" VA monitors, that once you see you can't "unsee".

(Though I suspect it is somewhat subjective how noticeable it is)
 
I've watched a few early release videos of it and I have to say that they have a very long way to go yet, with apparently not much time left to deliver. The lighting and shadows are especially terrible. It has an amazing aesthetic and I'm sure RTX use will add some really nice ray tracing, but honestly, so far I'm not next-gen blown away. I don't think it was ever really designed to be a visual masterpiece though; I don't know where that hype came from.

Don't forget that you're watching a compressed video stream. Also, the four-hour demo that those journalists played was apparently running full RTX on 2080 Tis and was hitting 1080p 60fps... with DLSS.

Hopefully that wasn't completely optimised
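Note that "1080p with DLSS" means the game renders internally below 1080p and upscales. A minimal sketch, assuming the commonly cited per-axis scale ratios (roughly 1/1.5 for Quality, 1/2 for Performance); treat these as assumptions rather than an official spec:

```python
# Internal render resolution under DLSS-style upscaling. Scale factors are
# commonly cited approximations, not official figures.
DLSS_MODES = {"Quality": 1.5, "Balanced": 1.72, "Performance": 2.0}

def internal_resolution(out_w, out_h, mode):
    factor = DLSS_MODES[mode]
    return round(out_w / factor), round(out_h / factor)

print(internal_resolution(1920, 1080, "Quality"))      # (1280, 720)
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)
```

So a "1080p 60fps with DLSS" demo is plausibly pushing only a 720p-class pixel count through the ray-traced pipeline.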
 