Well, obviously 8GB of VRAM isn't ideal at a higher price point. You may get away with 8GB at 1080p to a larger extent than you would at 1440p. Developers will likely be forced to cater to 8GB cards since they're a large part of the market, but that won't last forever. The 4060Ti 16GB is also rumoured to be launching at an idiotic price.
You need to factor in the price of the GPU and whether it's worth going for something more power efficient, or something with less initial outlay.
The cheapest 4060Ti on OcUK is £389; the cheapest 6700XT is £329, unless you want to fart about with rebates, in which case it's £295, but that's another story.
At minimum it's costing £60 more to get a 4060Ti. The TPU power consumption page says the 4060Ti takes 59W to run the Cyberpunk 60 fps test, while the 6700XT needs 136W to hit the exact same 60 fps, so that's a 77W difference.
Power in the UK is about to be around 29.7p/kWh. So for every hour you game at the same 60 fps on the 6700XT it's costing you about 2.29p more than it would on the 4060Ti. That means you'd need to game for roughly 2,620 hours before it's more cost effective to buy the 4060Ti from a power efficiency / purchase price standpoint.
Personally I might get 10 hours a week of gaming time, if that, so it'd take about 5 years to recover the extra £60 outlay. If power were dearer or I gamed more, it'd pay off sooner.
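If you want to play with the break-even maths yourself, here's a quick Python sketch using the figures above (OcUK prices, TPU wattages, 29.7p/kWh, 10 hours a week); swap in your own numbers:

```python
# Break-even sketch using the figures quoted above; adjust to your own prices and rates.
price_4060ti = 389.0        # £, cheapest OcUK listing
price_6700xt = 329.0        # £, cheapest OcUK listing (ignoring rebates)
extra_outlay = price_4060ti - price_6700xt           # £60

watts_4060ti = 59           # W, TPU Cyberpunk 60 fps test
watts_6700xt = 136          # W, same test
watt_diff = watts_6700xt - watts_4060ti              # 77 W

pence_per_kwh = 29.7
extra_pence_per_hour = (watt_diff / 1000) * pence_per_kwh    # ~2.29p per gaming hour

breakeven_hours = (extra_outlay * 100) / extra_pence_per_hour
hours_per_week = 10
breakeven_years = breakeven_hours / (hours_per_week * 52)

print(f"Extra cost per gaming hour on the 6700XT: {extra_pence_per_hour:.2f}p")
print(f"Break-even: {breakeven_hours:.0f} hours (~{breakeven_years:.1f} years at {hours_per_week} hrs/week)")
```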
Anyway, the point is you also need to factor the purchase price into your equation. I'm not sure where Brownsville is... Texas? If so, Google says you're paying around 15c/kWh? What did you base your estimates on to get £35 - £60 a month for running your PC? Remember also that if it pulls 250W at load, it draws much, much less at idle or during normal workloads, so you should base your GPU power consumption costs on your average gaming hours.
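As a rough sanity check on that monthly figure, here's a sketch assuming the 15c/kWh rate and the 250W load draw; the idle draw and daily hours are pure guesses on my part, so plug in your own usage:

```python
# Rough monthly GPU electricity estimate at the quoted 15c/kWh rate.
# The 250 W load figure is from the post; idle draw and daily hours are
# guesses here, adjust them to match your actual usage.
rate_usd_per_kwh = 0.15
gpu_load_watts = 250        # quoted load draw
gpu_idle_watts = 15         # rough guess; idle is a small fraction of load

gaming_hours_per_day = 3    # adjust to your habits
idle_hours_per_day = 5

daily_kwh = (gpu_load_watts * gaming_hours_per_day +
             gpu_idle_watts * idle_hours_per_day) / 1000
monthly_cost = daily_kwh * 30 * rate_usd_per_kwh
print(f"GPU electricity: ~${monthly_cost:.2f}/month")
```

Even with a few hours of gaming every day that comes out at only a few dollars a month for the GPU alone, which is a long way off the £35 - £60 estimate.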