Discussion in 'Graphics Cards' started by LoadsaMoney, Oct 4, 2019.
Does the pope **** in the woods?
The reason I ask is that he has been highly critical of Nvidia, RT and their pricing of late.
I can only imagine that if he got it, it would be for benchmarking, as I don't think he plays anything his current Titan could not handle.
haters gonna hate
Bugger off, he'll buy 4
Not sure we can call him a hater just yet, he has paid for more of Jensen's leather jackets than we both combined ever will. Lol.
Titan comes out, yeah I'll take 4 of those. Then the xx80Ti comes out a few months later, let's get 4 of those also and see how they compare with the Titans! That's Kaapstad right there.
I will be surprised if he buys more than 2 this time, or any at all
Turing was the worst architecture NVidia has produced in a very long time. Too much was expected of it on huge dies which offered very little real benefit at huge cost to the end user. Unfortunately high end Turing cards were also better than anything else available.
I think the Ampere Titan will be total pants but it will still be the best pants around.
Sadly I see Ampere being much the same as Turing price wise, a total joke.
People should buy and post what they want. NVidia make great cards and they are beasts. I might even have a look at the 3090Ti, but price will be a factor, and I might just call it a day and go the PS5 route, as I'm not gaming much, if at all, now.
I am pretty sure you will cave (like you always do) and buy one. You like RT too much to not have the latest and greatest. You will likely get your money's worth just like Kaapstad running benchies; gaming is just a bonus.
At least this time you will end up with a proper RT card that won’t tank fps just to use it
I do like RT. Hopefully Cyberpunk will bring me back this way, but the last game I played was Control and that was a couple of months ago.
Yeah, not been gaming much myself. I get more enjoyment from the anticipation of the new cards and Cyberpunk 2077.
I have got to do a serious PC build later this year, but at the moment what direction Intel CPUs are taking is very unclear.
Until then the Ampere Titan will just be used for gaming.
I won't respond directly as it will be off-topic and peeps getting miffed.
What I will say is that I ran my own thread in this forum about my 1080ti vs. 2070S vs. 5700XT experience; I ran all three cards and kept the 5700XT.
The one thing that would not work with the Nvidia cards was DX12 mode in Division 2. I even changed MB, RAM and CPU to see if it was my rig, and it transpired that the Nvidia cards don't like DX12 in Division 2. Have a mooch around the Division 2 forum here and Nvidia card owners running DX12 mode are a rarer sight than Gregster.
So the question for me, when I get my 3070 Ampere card (see - on topic!) and whack it in my shiny watercooled 5.4GHz uber rig, is "will DX12 in Division 2 work?", because it will be the first game I benchmark against my other cards.
The Division 2 worked fine in DX12 on both my 1080ti-based main rig and my 1070-based laptop. I got a nice boost on the laptop, as it has a 6700HQ CPU, and a modest gain on the main rig; it crashed about once across both setups. So elaborate more on "would not work": you weren't running Windows 7 or something?
Also got to remember The Division 2 was an AMD-sponsored title, so their focus would have been more on AMD GPUs in DX12. As always with DX12/Vulkan, a lot of what the graphics driver used to do, the game dev now has to manage.
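For anyone wondering what "the game dev now has to manage" looks like in practice, here is a minimal sketch of the CPU/GPU synchronisation that the D3D11 driver used to handle behind the scenes but D3D12 pushes onto the engine. The device and queue are assumed to come from elsewhere in the engine's init code; this is just an illustration of the API style, not code from The Division 2.

#include <d3d12.h>
#include <windows.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Block the CPU until the GPU has finished all work submitted to 'queue'.
// D3D11 drivers did this kind of tracking implicitly; in D3D12 the app
// creates the fence, signals it, and waits on it itself.
void WaitForGpu(ID3D12Device* device, ID3D12CommandQueue* queue)
{
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);

    // Tell the GPU to set the fence to 1 once prior work on the queue completes...
    queue->Signal(fence.Get(), 1);

    // ...then sleep the CPU until that signal arrives.
    if (fence->GetCompletedValue() < 1)
    {
        fence->SetEventOnCompletion(1, done);
        WaitForSingleObject(done, INFINITE);
    }
    CloseHandle(done);
}

Get that sort of bookkeeping (or the equivalent resource barriers) slightly wrong and you see exactly the kind of juddering people report in DX12 mode, which is one reason results can differ so much per vendor and per game.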
Worst value for sure, but on a technical level an achievement.
Micro-stutter in DX12. No issues in DX11, as many other forum posters running Nvidia cards will attest to.
I will report back once the 3070 lands.
Will be a looker
It isn't really my thing. Glad if it is real it'll only be the FE cooler.
Will be hot in there!
That's a 3D render; it's not going to look so clean in the metal, but I will agree it's a good-looking card, though I do think it's a bit Marmite. It looks more a necessity than cosmetics, which may tell you something about its power consumption.
Also, no RGB!
I agree, and The Division 1 and 2 in DX12 on my card were rubbish, a juddery-feeling mess, even though G-Sync was working. DX11 worked well, so I wasn't fussed in truth. Ohh, and cheeky sod!