
NVIDIA ‘Ampere’ 8nm Graphics Cards

In the end, all I am saying is their profit margins are up, and it can't all be put down to increases in costs like some here like to do. But you chose to interpret it differently.

That is business: the better you are, the more successful you will be. Everything has a mark-up, and you start with production cost. The 2080 Ti would not have been £1400 had it not been so expensive to make. Like I said, you will soon get a faster card with much better RT performance for £600 less, or, back to the old Ti price (which is what the 3080 is this time around, given there is a 3090).

So they have passed those savings on to whoever buys them. You can't just make a blanket statement without all of the facts and then call it fact; you need to consider everything. The Volta Titan was three grand and was nothing more than an RT tech demo. Then again, they were cast-off Tesla cards used for deep learning, and they cost a fortune too.

It does kinda irk me when everyone bangs on and on about GPU prices. They have hardly changed at all!

You don't NEED a 2080 Ti. You didn't NEED a Titan XP or a 1080 Ti. You didn't NEED a 980 Ti. This goes back to the dawn of time. If you buy the card you NEED, then you are spending £300-£400, and that has not changed. At all.

You're just irked that the top-end cards are very expensive and you don't like it. Ain't we all, but that doesn't change the facts.

Right now, for a decent 1080p experience you NEED, like, an RX 580 or slightly better. For 1440p you NEED a 5700, and so on. Now go look at what they cost. If you want RT, or wanted 4K when it came around, or WANTED anything else, it is there. Just don't whine about the price. "You can have anything you like, at a price." Good olde capitalism.
 
Lol. Dude, as I said, I get that it costs more to make due to the bigger die, etc. My point is that they have also increased margins. Are you saying their margins are the same as they were for Pascal and Maxwell?
 
At the moment you don't need ray tracing to enjoy a game, but you had to pay for the hardware to use it.

But they were when they were introduced.
"Who needs generally programmable shaders anyway? They should just make the existing pixel shaders faster" - heard in about 2001...

It's just another feature, and it's one that's showing up everywhere. I'm not even talking about RTX here. The post I replied to was saying RT is going nowhere, when the opposite is true - it's going everywhere!


The "other techniques", as you call them, were not premium extras added on if you bought into the hardware.

Neither was RTX - I bought my 2080 Ti because it was the most capable chip on the market. RT is just part of it, on top of things like the tensor cores, which I also find interesting.


Did someone playing Battlefield, for instance, enjoy the game more because it looked better with RTX enabled? From what I read, it was turned off because of the performance hit.

No idea; I make no claims about Battlefield. What I am claiming is that "Ugh, it's just eye candy" is a luddite view, especially in a forum dedicated to bleeding-edge graphics cards!
 
Only a handful of games are actually using it, and it's not worth having on in any of them due to the performance hit. It really doesn't add anything to the experience. If Nvidia wasn't giving them cash, they wouldn't have bothered. It's not sustainable in this form, just like PhysX and 3D Vision weren't.

Give it five years, maybe, and it will just become a basic feature, once it runs well and costs drop. But right now there are other priorities which have a far bigger impact on the experience and on sales numbers.

It's going to be in AMD's new cards and in the consoles, as well as in the 30 series cards, so it's likely it will develop pretty quickly now.

As for "it doesn't add anything", see my other comment on Commodore graphics. We all enjoyed Wolfenstein 3D (well, those of us old enough did), but I'm pretty sure we can agree graphics have moved on since then. This is another step forward in realism.
 

But it isn't a huge leap. VR is a huge leap.
 
OK, look, let me explain how this works once.


I don't think you understand my point.

The die is huge because they designed it that way. It's almost as if you are assuming that Nvidia had no choice in the size of the dies. They design this stuff.

Considering the tiny performance gain produced by this approach, maybe they should have gone back to the drawing board.
 
Computer Graphics folks, from academics to games designers, have been 'caring' about real-time raytracing for decades. It's in the new consoles, it's in AMD's next gen, I'd call that a pretty big part of "the direction things are going", personally.

Maybe wait until you figure out a way to do it that is more effective and doesn't balloon the cost of the products.
 
Computer Graphics folks, from academics to games designers, have been 'caring' about real-time raytracing for decades. It's in the new consoles, it's in AMD's next gen, I'd call that a pretty big part of "the direction things are going", personally.
But in the realm of real-time ray tracing, we're still in the era of Stephenson's Rocket. Or maybe horse-drawn carts :p

It's a massive compromise (and will be even more so on the consoles). It's a fudge.

And so far developer use of RT has been awful. Personally, I don't much rate being able to see perfect reflections in the glossy, polished heads of bald NPCs, glaring with the power of the Sun thanks to HDR :p
 
If Nvidia offers a full line of non-RT cards next to more-expensive RT versions, I suspect the non-RT cards will outsell the RT cards.

I've been thinking this for ages. Nvidia won't do it because they're really pushing the whole thing, but like you, I think the non-RT cards would sell a hell of a lot more.

It would be fun if AMD just thought "fudge it, here are our powerful cards with no RT, at a much cheaper price" :D
 
Hope the rumour mill is wrong on the size and power front; my NCase M1 with its Corsair SF600 is not looking good. I love that case too, I've had three builds in it and really don't want to move on to something else. I'm another that skipped Turing; I'm on a 1080 Ti and personally ready to upgrade, but clearly my much-adored case is not.
 
In the end, all I am saying is their profit margins are up, and it can't all be put down to increases in costs like some here like to do. But you chose to interpret it differently.

Of course it isn't; anyone can see that. I think we can all agree that die size and manufacturing cost have some impact, but it's mostly Nvidia gorging from its dominant market position.

Makes some people uncomfortable, I guess. It is what it is; if Nvidia want to play it differently in the future, then they deserve the praise or criticism accordingly. I don't think it's particularly helpful trying to explain, or worse, justify the changes. It's pretty scummy that everything from mid-range to top-end is basically funding Nvidia's growth and bottom line.
 
Hope the rumour mill is wrong on the size and power front; my NCase M1 with its Corsair SF600 is not looking good. I love that case too, I've had three builds in it and really don't want to move on to something else. I'm another that skipped Turing; I'm on a 1080 Ti and personally ready to upgrade, but clearly my much-adored case is not.

I was going to jump straight in, but because of the above I'll wait to see what AMD come up with.
 