I'm guessing a freebie. A certain company is giving these away with orders at the moment.

240GB SSD is an interesting choice.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
..Yes, Doctor. Definitely another diagnosed case of 'Copium Copiatus'
..it's spreading, rapidly. It must be airborne, what shall we do?
Oh wow, just seen the prices. Have Nvidia badged this as the RTX 4070? Yeah, this is a hard pass from me. Will see what AMD bring to the table later on.
Could Nexus18 be a super-spreader? He's been looking a bit peaky since his 10GB 3080 aged like milk in Ray-Tracing.
I agree with you that usually textures are the most important setting when it comes to visual fidelity. But on the other hand, as an owner of the notorious 1060 3GB (as well as the 6GB model), people were going nuts back then about the VRAM as well. And let me tell you, besides PUBG, which didn't run ultra textures without stuttering here and there, every other game was fine. In fact, I changed both cards after a bunch of games (Mafia 3, Watch Dogs 2 and Ghost Recon) were running like complete crap regardless of the textures being used. The raster performance killed the card, not the lack of VRAM.

Just because X game does a terrible job of scaling textures does not mean there are not amazing textures that increase visual fidelity to a massive degree.
The reason people are arguing with you is that you are deluded and will not accept that the 4070 Ti is not a perfect product. The 4090 is a perfect product and has the price to go with it; no flaws with this product's performance at all. The 4070 Ti has pretty much the same VRAM as a GPU from five years ago; times move on, and it will be a limiting factor in a lot of cases.
The 4070 Ti is a better slab of silicon than the 7900 XT; there, I have said it, and I am willing to make that concession. The Nv card has similar performance whilst using less power, and it does cost less if you buy the right card. That does not make it perfect, so open your eyes and see that it does have some flaws. If you want a GPU with no flaws, then Nv has you covered with the 4090.
Yeah it has its place, just overpriced by at least £200. Paying inflated prices will only get us less value for money in future.

THE THREAD TITLE IS:
What do you think of the 4070Ti?
And my answer is I think it fits the BILL perfectly for me! Why? The options were: buy an RTX 4070 Ti for £800, buy an RTX 3080 for £700, or buy an RTX 3090 for £1,300. So I purchased an RTX 4070 Ti and get the same frame rate at 1440p in the games I play as I would on an RTX 3090. But if I had gone by half the posts on here, I should have paid £500 more for the same frame rate and purchased the RTX 3090! Give your heads a shake. Please stop bashing the RTX 4070 Ti; it has its place. A newbie on here might read the verbal diarrhoea written in this thread and spend more money than he needs to!
Yeah there's been a steady price creep on top of inflation for a long time but a big jump starting with the RTX cards.

I'm in complete agreement with you, but a lot of you are way too late now, as there were people raising concerns over the way GPU pricing was going years ago, and all those people were met with was mocking: stop being poor, get better jobs, muh inflation, stop complaining. I'll take a guess that a lot of the people who were doing the mocking years ago are the ones saying something now that it's affecting them and they are priced out of the market.
We also had years of fanboys/vendor reps complaining about how one company was the good guy that needed the money to compete with the evil nasty company, and about how we needed competition; now that this company is pretty much the same as the evil nasty company, they've stopped complaining. We now have a situation where two companies form a duopoly with price fixing. Such competition, much wow.
Yeah there's been a steady price creep on top of inflation for a long time but a big jump starting with the RTX cards.
£799 Delivered

What did you pay for it?
Yeah, it was a freebie!

I'm guessing a freebie. A certain company is giving these away with orders at the moment.
Prices have been crazy; I've been trying my best to resist buying anything. My 3070 lasted me two years and I want an upgrade for VR, so if I get a couple of years from this it's not too bad.

What the...
Didn't you used to always buy whatever the top end card was? Well at least back in the day. What's up with buying a 4070 Ti Boomy?
You got a free graphics card with your new SSD? Good for you.
Firstly, I'm talking about multiple games, not just one. Secondly, it was about RT performance. Lastly, if you only play Warzone 2, then you should only care about Warzone 2 performance and happily buy an AMD card - of course!

In Warzone 2 AMD crushes Nvidia, so that's the only game that matters. Is that a fair assessment, or have I cherry-picked an individual result to best highlight the differences?
What does equally optimised even mean? We're talking about radically different architectures, particularly as it relates to how they both handle RT. It's literally impossible that they could be "equally optimised". Moreover, it would be your burden to prove how exactly these are "unequally optimised" when all these games are made with consoles in mind first, which have been on AMD-only hardware for over a decade now! Not only that, but EVEN IF it were true that NV dominates because of their sponsorships rather than its superior hardware, so what? It's still AMD's problem to solve (and an AMD customer's to suffer)! Ultimately, as a regular PC user you can't choose who sponsors what, only what card you buy. So it's a big fat L for AMD regardless.

Preach.
To add to this, Cyberpunk is Nvidia-sponsored, so it's really not a fair example. If you think the RT implementation is equally optimised for both vendors, you'd be mistaken. This should be obvious to anyone with a modicum of common sense. Warzone 2, however, is vendor-neutral, even though the game performs better on AMD hardware. It also supports DLSS/FSR, although I would strongly recommend using neither in that title, as they are both bad.
It's simple: there's a close relationship with that developer. That means more time for optimisation in game code and development, which one vendor is the beneficiary of. Look at The Witcher 3: same developer, same sponsored title. Now it's RT, but prior to that it was the tessellation stuff in The Witcher 3. Seeing a pattern here? Good. I know you are not stupid, mate, so I'm surprised to see you act like you are. Don't pretend that this does not play a part in overall performance.

What does equally optimised even mean? We're talking about radically different architectures, particularly as it relates to how they both handle RT. It's literally impossible that they could be "equally optimised". Moreover, it would be your burden to prove how exactly these are "unequally optimised" when all these games are made with consoles in mind first, which have been on AMD-only hardware for over a decade now! Not only that, but EVEN IF it were true that NV dominates because of their sponsorships rather than its superior hardware, so what? It's still AMD's problem to solve (and an AMD customer's to suffer)! Ultimately, as a regular PC user you can't choose who sponsors what, only what card you buy. So it's a big fat L for AMD regardless.
Besides, I can think of AMD-sponsored titles where Nvidia still dominates, like The Riftbreaker or Deathloop. Or should we point out how FSR 2 was also faster on Ampere than RDNA 2? Then what's the excuse: Nvidia sponsored AMD to defeat itself? Or former console-only titles where, even with a mild RT implementation, Nvidia is clearly superior all the same (Miles Morales). Or a title like Metro Exodus EE, which HAD to be optimised for AMD because it had to run with RT at 60 fps even on consoles (as puny as the Series S, which is essentially an RX 580 with DX12 features).
Just accept that RDNA is crap when RT is turned on to any significant extent; no need to deny reality because you like one corpo over the other. None of them give a **** about us anyway. Let's at least not spread falsehoods.
It's simple: there's a close relationship with that developer. That means more time for optimisation in game code and development, which one vendor is the beneficiary of. Look at The Witcher 3: same developer, same sponsored title. Now it's RT, but prior to that it was the tessellation stuff in The Witcher 3. Seeing a pattern here? Good. I know you are not stupid, mate, so I'm surprised to see you act like you are. Don't pretend that this does not play a part in overall performance.
Those two titles you mention had RT added at a later date. The big difference: RT was playable on RDNA2, though of course at lower frame rates than on Ampere. To be clear, no one (myself included, since you quoted me) ever said AMD is stronger at RT than Nvidia. RDNA2 was always weaker. But there's a difference between one gen weaker and two gens weaker, and software can play a big part in that; this much should be obvious to you. I guess now that the 7900 XTX is around 3090-3090 Ti level in ray tracing performance in various games, Ampere/RDNA3 is suddenly rubbish at RT? Looking past those cherry-picked results, second-gen RT on both vendors is pretty similar, with Nvidia still being one gen ahead on the 4000 series.