
Who's waiting for the 3090 and which brand are you getting?

Status
Not open for further replies.
Wow... what a thread. Anyway, to everyone crying that 3090 purchasers are fools: mind your own business. It's their money, and for all you know they need more than 10GB of VRAM, or they just want the fastest card and don't upgrade every time something new and shiny comes out, so they'll keep it for a good few years.

It's pretty rude to lecture the people buying them on what you think is right; they may have their reasons. Yes, for gaming right now the 3080 with 10GB will do, but the reality is, as we all know, that for future 4K+ resolutions and new games with high-res textures, 10GB will not be enough. If you're a 1440p or 1440p ultrawide user, though, 10GB will be fine for many years; the card won't be fast enough long before VRAM becomes an issue.

Guys, we are all mad to some extent about what Nvidia has done, and the reality is that in most cases, if a 3080 20GB had been available, I'd bet most people using a 3090 for gaming only would have got that instead. Let's be honest: Nvidia are going to charge a premium for the 20GB model, and it's clearly not coming out just yet; they need to milk the 3090 sales first. But let's hope it appears soon. I keep looking at the 3080 and the 3090 and honestly I'm getting fed up with both; they each have their issues, and Nvidia did that deliberately. So some people who were fed up thought "sod it", bought a 3090, won't upgrade for a few years, and paid the Nvidia tax this time round.

But let's also wait and see what AMD brings to the GPU space soon, and then we can all decide what was really the best value... depending on our needs, NOT wants, but real-use needs.

Show me proof that 10GB isn't enough for games at 4K.

Someone said Horizon Zero Dawn. That uses 7.2GB and has a VRAM memory leak.

In the next 2 years, is there a game that will USE 10GB?
 
Lol. That's not how it works at all.

Most PC games are ports which are upscaled, not PC games downscaled to consoles. Have a look at Warzone MW: it looks like **** on PC no matter how much you turn it up.

Also, Nvidia just officially abandoned SLI.

I think I'll stick with the graphics on a high-end PC, thanks.

And consoles really suck at a lot of the games I play; those controller pads just don't have enough buttons.
 
> Err, what?
>
> Did you understand my point at all?

I understand your point and you are wrong.

If I blow 3k on a couple of GPUs that are worthless in a couple of years' time, I don't care; all I care about is what they can do now.
 
> Show me proof that 10GB isn't enough for games at 4K.
>
> Someone said Horizon Zero Dawn. That uses 7.2GB and has a VRAM memory leak.
>
> In the next 2 years, is there a game that will USE 10GB?

Horizon Zero Dawn is 3 years old.

 
> Lol. That's not how it works at all.
>
> Most PC games are ports which are upscaled, not PC games downscaled to consoles. Have a look at Warzone MW: it looks like **** on PC no matter how much you turn it up.
>
> Also, Nvidia just officially abandoned SLI.

You really are offended, aren't you? :D

You are wrong! Most cross-platform games end up releasing with higher-resolution graphics than the console versions, including texture quality! I'm not even going to bother giving you links to sites that show this frequently using comparisons. But GTA V is one good, solid example: it released later on PC simply because of all the extra work that went into it. Crysis and Crysis Remastered is another very easy example where texture resolution is much higher on PC! Just go look on YouTube if nowhere else, and stop lying to yourself to back up arguments that exist only in your own head.

Oh, and the last nail in your coffin for "consoles leading the way": why has the PC had ray tracing for a while now and not consoles?
 
> You really are offended, aren't you? :D
>
> You are wrong! Most cross-platform games end up releasing with higher-resolution graphics than the console versions, including texture quality! I'm not even going to bother giving you links to sites that show this frequently using comparisons. But GTA V is one good, solid example: it released later on PC simply because of all the extra work that went into it. Crysis and Crysis Remastered is another very easy example where texture resolution is much higher on PC! Just go look on YouTube if nowhere else, and stop lying to yourself to back up arguments that exist only in your own head.
>
> Oh, and the last nail in your coffin for "consoles leading the way": why has the PC had ray tracing for a while now and not consoles?

You clearly don't understand how things work.

Do you realise that games are not VRAM-limited? Do you know what VRAM allocation means? Do you know that frame rates are 99% of the time core-limited? Never in the history of PC gaming has the high-end card not had enough VRAM.
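The allocation-vs-usage distinction being argued here can be sketched with a toy model. This is purely illustrative: the class, names, and numbers below are made up for the sake of the example, not taken from any real engine or monitoring tool.

```python
# Toy model of the allocation-vs-usage distinction (illustrative only).
# An engine ALLOCATES a large texture pool up front, but each frame it
# only actively USES (samples) a fraction of it. Overlay tools report
# the allocation figure, which overstates the real per-frame demand.

class TexturePool:
    def __init__(self, allocated_gb):
        self.allocated_gb = allocated_gb   # what a monitoring overlay reports
        self.resident = {}                 # textures touched this frame

    def sample(self, texture_id, size_gb):
        # mark a texture as part of this frame's working set
        self.resident[texture_id] = size_gb

    def used_gb(self):
        # the actual working set: only textures sampled this frame
        return sum(self.resident.values())

pool = TexturePool(allocated_gb=9.5)  # "9.5GB VRAM" on the overlay
for tex in range(60):                 # the frame samples 60 textures...
    pool.sample(tex, 0.1)             # ...of roughly 100MB each

print(f"allocated: {pool.allocated_gb}GB, actively used: {pool.used_gb():.1f}GB")
```

The point of the sketch is simply that the number an overlay shows (the pool size) can sit well above what the game touches per frame, which is why "game X shows 9.5GB used" doesn't by itself prove 10GB is insufficient.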
 
> Lol. That's not how it works at all.
>
> Most PC games are ports which are upscaled, not PC games downscaled to consoles. Have a look at Warzone MW: it looks like **** on PC no matter how much you turn it up.
>
> Also, Nvidia just officially abandoned SLI.

Interesting you should mention this (true, by the way). Digital Foundry now have an Xbox Series X, and because they don't have any games to play on it, they released a video showing how it copes with running old 360 and Xbox One X games.

One of the more interesting ones was seeing GTA IV run at a constant 60fps with no frame drops at all.

I've never seen that even on modern PCs, as the port from the 360 wasn't very good. I've seen it running at 140fps or whatever on a 2080 Ti, but the frame rate still has occasional issues.
 
> In 2 years' time the mid-range 4000 series will make the 3090 look like 50% of its speed and 25% of its value. Next-gen consoles will push up graphics standards and require much faster GPUs over the next few years.
>
> Don't justify this purchase with an "it will last me 4 years" kind of mentality.
>
> Unless you are taking Radox's mentality, it will only lead to buyer's remorse.

You say that, but a normal 3080 isn't that much faster than a good overclocking 2080 Ti from 2 years ago. In fact the difference is so small that I was thinking of keeping the 2080 Ti for another 2 years.

So no, I doubt a mid-range 4000 will make the 3090 look like 50% of its speed. Looking at how Nvidia are doing things, and given they can't just turn the wattage up to 11 like they did this time (all the non-DLSS, non-RT gains of the 3080 came purely from the power boost), I suspect we'll be looking at a 4080 that's 10-20% faster than a 3090. So yeah, people will finally be able to afford 3090 performance.

Will that make a 3090 owner upgrade? I doubt it. In the meantime they'll have had two years playing 4K games at between 60 and 120fps.
 
> Show me proof that 10GB isn't enough for games at 4K.
>
> Someone said Horizon Zero Dawn. That uses 7.2GB and has a VRAM memory leak.
>
> In the next 2 years, is there a game that will USE 10GB?

I never said it wasn't enough... It will probably be fine for 4K for a good few years too, but it depends on future games, what settings people use for their 4K gaming, and what refresh rates they want.

I still use a 980 Ti 6GB with a 1440p ultrawide and it has never let me down with any game. Then again, my monitor is 60Hz (it can be clocked up to 75Hz, but I don't bother); I'm not a refresh-rate junkie, but I do like nice image quality. Like I was saying, some people like myself don't upgrade every generation; some of us keep GPUs for over half a decade before we upgrade. By then I basically update my whole system and buy a new current GPU too. That's what I've been doing recently: I went 3950X and 64GB RAM, I'm waiting for a 3080 20GB or a 3090 at a sensible price, and I'll probably keep it for 5 years again.

My current system in my signature below is now back to my 980 Ti Classified from a 1080 Ti Kingpin that I had for about 3 weeks and sold again, as it wasn't that much better than what I had. So I decided to wait for a 3080 20GB for that system and get a 3090 for the 3950X system. Or I may just sell the system below; I'm still not sure what I'm going to do with it yet. Since getting the 3950X and comparing both systems, I've realised how good the 5930K system still is for a 6-core, 12-thread CPU: with a 2080 Ti on it as a test, it came out with almost the same frame rates at 4K and 1440p ultrawide in most games that were GPU-bound. 1080p has been dead to me for over a decade, so no need to see how CPU-bound some games get with a 2080 Ti.

The reality is that in most cases all you need to do is turn a couple of settings down and everything runs well even on 6GB of VRAM. But if you're buying new again, wouldn't you want something that will last you longer? That's how I look at my purchases; it stops me doing never-ending upgrades for minimal gains, with money thrown out of the window on every slight bump. The days of huge upgrades with big performance increases have been over for years, and anyone who has been around as long as me and been involved in the PC industry knows this. It's only recently that AMD has shaken things up again and made some real gains worth upgrading early for.
 
> You clearly don't understand how things work.
>
> Do you realise that games are not VRAM-limited? Do you know what VRAM allocation means? Do you know that frame rates are 99% of the time core-limited? Never in the history of PC gaming has the high-end card not had enough VRAM.

Please point out where I once mentioned frame rates? I countered your "visuals" argument. Now that you can't counter-argue it, you try to steer the argument onto a different course? I also countered your "consoles lead the PC" argument. You didn't counter either of these points. WHY? Because you can't, so you changed the course of the argument from visuals to fps?? You've just lost.
 
> You say that, but a normal 3080 isn't that much faster than a good overclocking 2080 Ti from 2 years ago. In fact the difference is so small that I was thinking of keeping the 2080 Ti for another 2 years.
>
> So no, I doubt a mid-range 4000 will make the 3090 look like 50% of its speed. Looking at how Nvidia are doing things, and given they can't just turn the wattage up to 11 like they did this time (all the non-DLSS, non-RT gains of the 3080 came purely from the power boost), I suspect we'll be looking at a 4080 that's 10-20% faster than a 3090. So yeah, people will finally be able to afford 3090 performance.
>
> Will that make a 3090 owner upgrade? I doubt it. In the meantime they'll have had two years playing 4K games at between 60 and 120fps.

This is a new process node.

What happens is that the second generation on a node has higher yields, lower costs, and better power consumption. RAM prices fall, and overall you'll see 30-50% gains, just like every other time this happens.
 
> You clearly don't understand how things work.
>
> Do you realise that games are not VRAM-limited? Do you know what VRAM allocation means? Do you know that frame rates are 99% of the time core-limited? Never in the history of PC gaming has the high-end card not had enough VRAM.

Not true. The 3080 loses 20% of its performance in Doom Eternal at the settings Nvidia used to show that the 3080 was twice as fast as the 2080, and that's because the 3080 ran out of memory at those settings. Even the 1080 Ti beats the 2080 at those settings.
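Why running out of VRAM costs so much performance can be sketched with some back-of-envelope arithmetic. The numbers below are rough ballpark figures, not measurements from any benchmark: the idea is that once the working set exceeds VRAM, the overflow has to stream over PCIe, which is an order of magnitude slower than on-board graphics memory.

```python
# Back-of-envelope sketch (illustrative figures, not benchmark data):
# data that fits in VRAM is served at GDDR speed; any overflow streams
# over PCIe instead, dragging the average bandwidth down sharply.

VRAM_BW_GBPS = 760    # ballpark on-board memory bandwidth (3080-class)
PCIE_BW_GBPS = 32     # ballpark PCIe 4.0 x16 bandwidth

def effective_bandwidth(working_set_gb, vram_gb):
    """Average bandwidth when part of the working set spills over PCIe."""
    if working_set_gb <= vram_gb:
        return VRAM_BW_GBPS
    overflow = (working_set_gb - vram_gb) / working_set_gb
    # combine as time per byte (harmonic weighting), not as raw bandwidth:
    # the slow fraction dominates the total transfer time
    time_per_gb = (1 - overflow) / VRAM_BW_GBPS + overflow / PCIE_BW_GBPS
    return 1 / time_per_gb

print(effective_bandwidth(9.0, 10.0))   # fits in VRAM: full speed
print(effective_bandwidth(11.0, 10.0))  # small overflow, large penalty
```

The harmonic weighting is the key design point: even a ~9% spillover cuts the average bandwidth to a fraction of the on-board figure, which is why an out-of-memory card falls off a cliff rather than degrading gracefully.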
 