I cared 18 months ago and people on here laughed at me, if you remember, saying 8GB would last forever at 1440p-4K.
Yeah, I wouldn't have paid for a Ti... I had the money for a 7900 XT and wanted that originally, but seeing the power consumption of that even at 1080p vs a 4070 at the same settings, it was a no-brainer.
As I say, from real-world testing rather than relying on clearly sponsored online reviews, I've noticed I use around 1-2GB less VRAM than any rival or higher-tier AMD card. So either games are being coded to majorly favour Nvidia, or AMD cards just operate differently, achieving the same or higher fps while being more VRAM hungry. No idea, but I've literally seen it in person, RX 6800 XT vs mine for example, and it makes no sense when both are at the same res/settings, yet mine will use 9.3-9.7GB on the most demanding OTT settings at 1440p or 4K...
So that means in the future, when I'm using 12GB, they'll be using 14-15GB.
As I said in my post mate, when it gets to the point DLSS can't save it at a capped 60fps at 1440p mid-high settings, it'll go in my 2nd rig as an upgrade, and by that point it'll have cost me nothing... as after the 2nd year, the money I save vs running a 6800/6950/7900 XT paired with the rest of my system + speakers/amp/monitor adds up. Remember, my total draw at the wall with monitor/speakers/amp/PC is only 260-280W at 1440p or 4K native ultra, and at 1440p DLSS it's even less... A 6800/6900/6950/7900 XT uses 350W+ at 4K on its own, and that's excluding the rest of the components + monitor/speakers/amp! F that!
Even underclocked, they CANNOT knock 200W+ off and get anywhere close to my stock TDP at 1440p/4K high/ultra native. I've seen friends' builds recently in person and yeah, not possible; anyone who's not biased will happily admit that.
Yeah, IMHO I'd have a 6700 XT over a 4060 Ti 16GB.
I have no interest in the whole "this should have been called an X card, not a Y" argument, as both teams are guilty of that. Just look at the 7900 XT, that's a 7800 XT with a dress on!
Haha, it's funny you should mention mITX; my SFF 2nd system is mITX, so that played a part in getting extra life out of the 4070, as I went for an Asus Dual, which is very small for such a powerful card and will happily fit in my 2nd rig when the time comes. That'll give it further years of life in its chosen usage as a TV media playback/emulation rig.
So I can't lose, really. Especially when it's costing me peanuts a month to run, with my entire setup including speakers/amp/monitor using less than even an undervolted RX 6950 XT/7900 XT at 4K, which can't do RT/frame gen/DLSS 3.0!
As I mentioned before, I'm willing to give stuff a chance, same as with different car manufacturers vs being narrow-minded about one brand. I will say this: I've had to eat my words with DLSS 3.0, as I wasn't convinced before, but it's VERY hard to tell on an up-to-date, patched modern or vanilla game. I literally messed about with my 4K TV, comparing native 4K and 4K DLSS, along with playing 1440p native on my new monitor, etc., and yeah, you'd really HAVE to know where to look to tell, if you notice at all. So many older games are now adding support for it (not that I care, as they wouldn't struggle running native, but it's nice they're bothering regardless).
So TLDR: after 3 years, if it can't play nice even with the aid of DLSS 3.0/frame gen etc. at mid-high 1440p, it'll just go in the 2nd SFF mITX TV rig as a bonus upgrade. By that third year I'll have saved £264+ per year on what I would have burned in extra electricity running an RX 6950/7900 XT paired with the rest of the system/speakers/amp/monitor's power draw, so I doubt I'll be that bothered come year 4, as that's over £500 that would otherwise have been wasted on electricity, effectively giving me a free GPU. Versus by year 4 on an RX AMD card, I'd have no money saved and the same performance or worse once you account for the lack of RT/DLSS... Win win.
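If anyone wants to sanity-check the electricity maths, here's a rough back-of-envelope sketch. The wattage figures, daily hours and unit price below are placeholder assumptions of mine rather than measured numbers, so plug in your own tariff and usage:

```python
# Rough annual running-cost delta between two whole setups (a sketch, not gospel).
# All inputs are assumptions -- swap in your own measured wall draw and tariff.

def annual_cost_gbp(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Yearly electricity cost for a given sustained wall draw."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

my_setup_w = 270    # ~260-280W total at the wall (PC + monitor + speakers/amp)
amd_setup_w = 470   # hypothetical: 350W+ GPU alone plus the rest of the rig/peripherals
hours = 12          # assumed hours of use per day
price = 0.30        # assumed GBP per kWh

delta = annual_cost_gbp(amd_setup_w, hours, price) - annual_cost_gbp(my_setup_w, hours, price)
print(f"Extra electricity per year: £{delta:.0f}")  # ~£263/yr with these placeholders
```

With those placeholder inputs, a ~200W delta at the wall works out to roughly the £264/year ballpark; lighter daily use or a cheaper tariff would shrink it proportionally.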
I'd have bought it whoever made it, mate, same as with a car. It majorly reminds me of how wicked the RX 6600 XT was for its power consumption/undervolting vs performance.