Looks like someone got reincarnated after 15 years. I am getting déjà vu.
But based on some comments I am starting to think it ain't just Rollos around here but RedRollos also.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
That goes both ways.

To a point, once the cost becomes too high then people will stop caring about the premium features and care only about the cost.
True, it's a balancing act, but this generation Nvidia has gone too far on price. People expected increases, but not the massive rises seen.
I have some screenshots if you like. DLSS + High Textures > native + ultra textures > fsr + Ultra textures.
Sad story for the VRAM warriors, but it's true. Personal attacks against me may help you pat each other on the back, but they won't change the facts.
At present the 4070 is a good card, not brilliant but good, with a poor price. But it's looking increasingly likely that 12GB of VRAM will not last as long as the minimum standard as 8GB did, and that 16GB is the safer bet for the long run.
Most people buy GPUs to last years, but it's looking like 12GB will not. Most games companies develop for consoles and PCs at the same time, sell more on consoles, and the consoles offer 16GB of VRAM, so they make games with that in mind.
As we have seen, PC optimisation is often poor, so 8GB is not enough and in some cases 12GB is only just enough.
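On the 8GB vs 12GB point, a rough back-of-envelope helps show where the memory actually goes. All the numbers below are my own illustrative assumptions (buffer counts, texture counts, compression ratio), not measurements from any particular game:

```python
# Rough, illustrative VRAM budget estimate. Every constant here is an
# assumption for the sake of the example, not data from a real game.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Render targets: assume colour + depth + one extra buffer,
    each at 4 bytes per pixel."""
    return width * height * bytes_per_pixel * buffers / 1024**2

def texture_mb(resolution, bytes_per_texel=1, count=1):
    """Block-compressed textures average roughly 1 byte per texel
    (assumed), plus ~33% overhead for mipmaps."""
    return resolution * resolution * bytes_per_texel * count * 1.33 / 1024**2

# Hypothetical 3440x1440 scene (the ultrawide resolution mentioned above),
# with an assumed 300 resident 4K textures.
fb = framebuffer_mb(3440, 1440)
tex = texture_mb(4096, count=300)
total_gb = (fb + tex) / 1024
print(f"framebuffers: {fb:.0f} MB, textures: {tex:.0f} MB, total: {total_gb:.1f} GB")
```

The takeaway, under those assumptions, is that render targets are small while resident textures dominate — which is why dropping from ultra to high textures frees so much VRAM, and why an 8GB card starts to squeeze long before a 12GB or 16GB one does.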
Well, I never understood what "12GB will not last" even means. I played Hogwarts on a 3060 Ti at 3440x1440 with everything maxed out on ultra except textures (on high). As I've said before, it looks better than FSR with ultra textures, so I'd rather have the option of DLSS than extra VRAM.
True, but I mean there isn't an ideal option from AMD either, given how much they've increased prices as well, or haven't decreased them enough on the older gen since they don't have 7xxx models in the lower tiers. It's bad from both sides.
Have you tried Forspoken?
Is there a game currently that you can't play on 8GB cards? I want to try it, that's why I'm asking.
Well, Hogwarts is one of the notorious VRAM-hogging games. The other ones, like Forspoken, Godfall etc., look mediocre to bad regardless of settings.

That is true for Hogwarts because the ultra settings are ****** and in many cases it looks better with high settings (and a small number on ultra) than all ultra. It isn't true for many other games.
No, but it looks terrible even on ultra. The ground textures, from what I've seen, look like a 10-year-old game next to A Plague Tale: Requiem, and that one uses 5 to 6GB at 4K ultra. I might give Forspoken a go just for fun.
I suppose what I mean is that the timeframe in which you can play on high settings on a new card, then medium, then low over time, is getting narrower.
20 quid extra gets you the 16GB A770, so £300 for the A770 16GB is reason enough to buy it over the 6650 XT.

A770 8GB: £280
RX 6650 XT: £270
But XeSS? No, come on, why should I buy the A770 instead of the RX 6650 XT?
1. In the past, Nvidia and AMD would stop producing old-gen cards for weeks before the next gen came out, so that there were not many left to have to discount. As they over-produced this time and had more stock left over, they didn't want to reduce prices as much and take a hit to margins.

What I don't understand is, what has changed from before?
Before the 3xxx series, most of the time when a new generation came out, all the old stuff was sold mega cheap to clear stock.
I just don't understand this tactic of keeping the old stuff at the same price and releasing the new stuff all at higher price points, instead of just replacing the old ones.
It feels illogical, perhaps even driven by some sort of algorithm/AI that can't understand why 3xxx-series sales were so ridiculously good. So it just thinks that GPUs will always sell out instantly, hence the higher prices.
Someone modded a 3070 to have 16GB VRAM, impressive uplift.
Modded GeForce RTX 3070 with 16GB memory gets major 1% low FPS boost - VideoCardz.com
Just goes to show that they don't give you the best products they can make, only the ones they can make the most profit on. We all know this, but it's nice to be reminded now and then.

Oh no, it's back... must have breached the containment of the CPU section.
If only the factory could have put more on in the first place!
NVIDIA stops supply of chips, the RTX 4060 Ti comes at the end of May/beginning of June and AMD shows the RX 7600 at Computex (Update) | igor´sLAB
Let’s start the graphics card week with today’s review of an RTX 4070 from Palit and information that I also got confirmed in the meantime: NVIDIA will stop supplying its board partners for a few weeks. In the previous news, there was usually explicit talk about the GeForce RTX 4070, which is now also the case for the GeForce RTX 4080 in a similar form. This doesn’t mean that NVIDIA might stop making chips, but that board partners are complaining that there are too many cards both in the channel and in factories.