I have had an RTX 3060 Ti since 2021, which is a bit faster, and it's running out of VRAM now at reasonable settings (1440p) in newer games. I don't just max out games. My mate with an RX 6700 XT doesn't have that issue. We both had similar systems until recently (5700X + B450 motherboards).
This was on both a PCI-E 3.0 (5700X) and PCI-E 5.0 (7800X3D) system, BTW. The newer system helps a bit more, probably because of the faster connection and DDR5. It manifests in poor 1% lows and texture pop-in. The RTX 4060 is hit a bit harder, because it uses a PCI-E x8 connection, so when it pages into system RAM there is even less PCI-E bandwidth available.
The RX 6700 XT/RX 6750 XT have been under £300 for the last year or more if you shop around. The RX 6800 16GB/RX 7700 XT 12GB have been available for close to £300 a few times over October and November, as AMD starts clearing out stock.
The new Indiana Jones game is Nvidia sponsored, but despite this, look at what happens:
It needs at least 12GB of VRAM to perform OK. The Nvidia cards with that much VRAM perform fine.
The RTX 4060 makes a lot of sense as an OEM card, because you can find complete desktops using it for as low as £600. But DIY pricing is a bit higher than it should be, IMHO.
With the consoles having between 10GB and 12GB of VRAM available to games, 12GB is probably the entry level for newer, more PC-focused games, IMHO.
If the Intel benchmarks are true and the drivers are OK, the B580 12GB is basically a lower-power RX 6700 XT with better RT and a more advanced media engine.
I hope it also means the RX 8600 and RTX 5060 ship with at least 12GB of VRAM next year.