
Nvidia rumoured to be launching new GTX 11 series without ray tracing

People wanted 1080 Ti performance at 2060 prices though, as per last gen with the 1070/980 Ti. Can't see an 11 series card coming in at that price. I think they might drop this one if Navi is looking a bit tasty, just to corner the market.


I do wonder if ray tracing is ever really going to take off until it gains acceptance in the console market. With less than 1% of cards able to use it, no sane games maker is going to spend development money on something so niche; some of them won't even make a PC version of their games at all! If an 1180 comes in around the price of a 1080 with the rise in performance, then it's going to be a very interesting card indeed. Let's say for the sake of argument it retailed at the £600-650 mark; I think it would be difficult for retailers to get enough stock!
 
https://www.pcgamesn.com/nvidia/nvidia-gtx-1180-gfxbench-performance

Another site saying much the same as the others: if it's true, RTX is at best postponed. But suppose Nvidia isn't able to get developers to add it to their games, and they keep that information from buyers while still asking RTX prices, knowing it's not the future? If that came out later it would cause all kinds of long-lasting bad PR, so maybe there's more behind the scenes than we are aware of.
 
Occam's razor: by far the simplest explanation is that the rumours are bogus.

Nvidia would never, ever sell a GPU that competes with Turing but has no RTX capabilities. Never in a million years. It makes absolutely no sense in the slightest. If sales figures for Turing are low then Nvidia will simply lower prices.


The only realistic scenarios for an 11 series GPU are:
1) A cut-down Turing without Tensor cores or RTX to fill in the low end, a replacement for the 1050. Nvidia always try to release a top-to-bottom stack of a common architecture.
2) Re-badging excess Pascal stock to clear the supply chain, likely only for Asian markets. A 1080 could be renamed a 1170, for example, and sold on the cheap in China.
3) Some other low-end Turing salvage part without RTX, also for Asian markets.

Ray tracing is the future of gaming GPUs, and already has a significant market in the content creation, professional graphics design, and game developer communities. Nvidia have successfully shown that real-time RT is possible with Turing, just. Performance in BFV is just acceptable, future titles will likely be better with more optimizations, and this is generation 1. This always happens with new technologies: the GPUs can only just make them viable in games. Things like pixel shading were exactly the same.

With 7nm GPUs coming in about a year, Nvidia have a chance to more than double ray tracing performance in a short time, by which point there will be many more games out there with RTX support.


I'm sorry but you are looking through the wrong end of the telescope. Those who have RTX-capable cards make up an infinitesimal number of people, so what realistic incentive is there for developers to spend time and money on adding ray tracing to games when it will not add one penny to the bottom line? Most games are designed for consoles and then ported to PC, not the other way around, and until ray tracing goes mass market it won't be worthwhile for manufacturers to include it.

Obviously we don't know what discussions take place between Nvidia and games studios, but just suppose they've said no to adding ray tracing? What then? Does Nvidia carry on selling what it knows to be a failed technology to unsuspecting buyers? Imagine the fallout if and when that got out. On the other hand, if they continue to offer both technologies and attempt to widen the spread as the tech gets cheaper, then that would make a lot of sense.
 
But that would kill RTX. If you have identical raster performance between a GTX 11 and an RTX 20, would YOU pay a premium for the latter card, which has features you simply cannot use?

Nvidia either have to go all-in on RTX and start paying game devs to implement it and develop the 2nd gen regardless of sales of the 1st gen, or AMD have to get a consumer-working version of Radeon Rays into the next generation of consoles. If neither one of these happen then RTX might as well join PhysX.

There are currently just four RTX cards, and the biggest sellers are all GTX without ray tracing. The number of potential buyers for such games is tiny, and while it was probably worthwhile for DICE in terms of the extra advertising and being the one and only ray tracing game available, the also-rans will not have such an incentive. The only way this could ever work is if Nvidia pay the developers, and the risk then is that they also load the cost onto PC gamers, further compromising the market.

I think both of your conditions need to be met: Nvidia have to pay developers and get the cost/performance of RTX cards right down, and quickly, and AMD have to support it too, with consoles offering ray tracing, or the market is just too small to service in the **** term.
 
Looking on Google at the PC titles which support ray tracing, there hasn't been any update since launch: stated intentions, but nothing concrete. One would imagine that if ray tracing were the great leap forward Nvidia claimed, games companies would be crowing about its inclusion in their launches, yet it's all quiet on the Western front. It all adds to the suspicion that Nvidia might just quietly drop it and return to more conventional cards. Are they still pushing 'Hairworks'?
 
Lisa Su, CEO of AMD, addressing Jensen Huang's criticism of their new card: AMD isn't all in on ray tracing just yet. “The consumer doesn’t see a lot of benefit today because the other parts of the ecosystem are not ready,” says Su. “I think by the time we talk more about ray tracing the consumer’s gonna see the benefit.”

The 'ecosystem' is the media, so it seems a little chicken and egg here.
 
Likely fake news?

What about the GTX 1160? Direct competition with the RTX 2060, and yet there are very strong rumours of its imminent arrival:

"As for the GTX 1160, only two leaks substantiate its existence. The first is from leaker BullsLab Jay, who told VideoCardz that the 1160 would be based on the TU116 core rather than the TU106 that the 2060 is meant to use. The second is basically a hard confirmation from a Lenovo product page, that lists the 1160 as coming in a new laptop model in 3GB and 6GB configurations. Both leaks agree that the card will be launched somewhere between January 8 (the first day of CES) and the 15th of that same month."

https://www.techspot.com/news/78044-rtx-2060-vs-gtx-1160-ten-new-leaks.html

So if they can make a card to compete against one which has only just launched, then why wouldn't they make other, higher-end cards?
 
https://www.techradar.com/news/amd-...ortant-but-still-thinks-nvidia-jumped-the-gun

Interview with AMD CEO Lisa Su, who says many of the things said in this thread.

Su said that “technology for technology’s sake is okay”, but that “technology done together with partners, and really getting the development community fully engaged, I think is really important.”

This underlines AMD’s previous line, which is that it’s too early to be pushing ray tracing just yet, and that Nvidia has gone ahead with it for Turing GPUs just for the sake of saying it has the technology, as opposed to delivering any real benefit to gamers.

AMD’s argument is that it would rather focus on making its GPUs better performing across the entire gamut of games, rather than those with support for a specific feature.

If, as AMD appears to be suggesting, it has no immediate plans to launch a ray tracing product but is looking at it in the longer term, then there is going to be a very small niche market for games designers in the short to medium term.
 
Revisiting the old business-studies battle between VHS and Betamax, it is as if Nvidia are repeating Sony's abject failure without heeding the outcome. Sony studiously disregarded all of the marketing feedback, producing what its R&D team wanted people to buy rather than what people actually wanted, and although Beta might have been the superior product, it failed because it did not address what customers wanted or were prepared to pay.
Customers simply refused to pay the rapacious prices Sony wanted and settled for second best at a more affordable price. It's a lesson Nvidia appear not to have learned.
 
It had better picture quality but was less useful as a recording device. It wasn’t the superior product.

That is only true of Beta 1; by the time they brought out Beta 2, which extended the dreadfully short recording time, the picture quality was equal to VHS. They also refused to include a timer in the recorders, making people pay extra if they wanted one, which made it an even more expensive option. And they refused to licence the technology to other manufacturers, whereas JVC allowed anyone to produce VHS, thus introducing competition and lowering prices.

It's a long litany of management hubris and a testament to the need to force engineers to listen to what the customer wants, not what they want to produce.
 
Here's the link:

https://www.pcgamer.com/nvidia-is-rumored-to-be-readying-a-geforce-gtx-1660-ti-graphics-card/

and an interesting paragraph

This latest rumor aside, it seems inevitable to us that Nvidia will eventually release a newer generation card without ray tracing hardware baked in. The bigger question is whether the rumored TU116 core is the same as the TU106 in the RTX 2070/2060 but with ray tracing and Tensor cores disabled, or if it's a completely new GPU. The latter seems unlikely due to time considerations, though it would be the more cost effective approach.

I think the bigger question is how powerful Nvidia will make this new GTX range: a 1060 equivalent is one thing, but a 1070 and up is another.
 
https://www.engadget.com/2019/01/17/amd-versus-nvidia-radeon-vii-7-nanometer/?guccounter=1

It would seem that AMD's new Radeon VII is a worthy competitor to Nvidia's RTX series and has outperformed the RTX 2080 in some games. More questions are now being raised over the value of ray tracing at this time, and if Nvidia wants to preserve its market share it cannot do that with a card costing around £1,000 competing against another which costs half that and outperforms it in certain applications.
It would make clear business sense to meet the threat head on with a GTX 11 series and keep the ray tracing for those who want it.

Nvidia cards are not just used for games, as the mining boom proved, and the ray tracing and DLSS features are of no interest whatsoever to those users.
 
So if the RTX 2060 launched at $350 and Nvidia did launch a 1680 Ti or equivalent, it should launch at around $875.

Anyone interested at that kind of price?
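For what it's worth, that $875 figure works out as exactly a 2.5× multiple of the 2060's $350 launch price. A trivial sketch of the implied scaling (the multiplier itself is an assumption, inferred only from the two numbers quoted above, not from any Nvidia pricing):

```python
# Hypothetical price extrapolation implied by the figures above:
# $875 is exactly 2.5x the RTX 2060's $350 launch price.
rtx_2060_launch = 350                        # USD, RTX 2060 launch price
implied_multiplier = 875 / rtx_2060_launch   # inferred tier multiplier

def extrapolate_price(base_price: float, multiplier: float) -> float:
    """Scale a base launch price by an assumed tier multiplier."""
    return round(base_price * multiplier, 2)

print(implied_multiplier)                       # 2.5
print(extrapolate_price(rtx_2060_launch, 2.5))  # 875.0
```

Whether a hypothetical top-end GTX card would actually be priced by simple tier scaling is, of course, pure speculation.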
 
There won’t be a 1680ti!!!!!


Everyone was saying there wouldn't be a 1160/1660, but now we know there is, and it is in direct competition with an RTX card. If they are prepared to offer this, then why not a 1670 or a 1680? Remember that gaming is not the only application for graphics cards, and those other applications need neither ray tracing nor DLSS.
 
https://edition.cnn.com/2019/01/28/investing/dow-stock-market-today-caterpillar-nvidia/index.html

More Nvidia doom and gloom, with the shares down 17% in one day!

Nvidia added to the doom-and-gloom by slashing its fourth-quarter sales guidance. The chip maker cited "deteriorating" economic conditions, "particularly in China." Nvidia (NVDA) shares plunged 17%. And the dreary news weighed on rivals Intel (INTC), Applied Materials (AMAT) and Texas Instruments (TXN), all of which fell by more than 1% apiece.

If they have overestimated volume sales of RTX cards, then it would give more impetus to a non-ray-tracing card, which would be cheaper and would probably increase sales volume (and profit), helping to get the company out of what appears to be an ever-deepening black hole.
 
Oops!

https://www.extremetech.com/computi...stroys-tens-of-thousands-of-nvidia-gpu-wafers



"After the Great Cryptographic GPU Shortage of 2017 sent GPU prices into the stratosphere, the slow decline through 2018 was a welcome return to normal. Now, we may see fresh shortages and higher prices thanks to a reported manufacturing problem at TSMC.

The following is a quote from a report by the Chinese site Expreview:

Today, TSMC has been experiencing a security incident. This time, the wafer was contaminated by unqualified raw materials. It is estimated that it will lose tens of thousands of wafers, affecting the 16/12nm process of the main revenue, NVIDIA GPU and many mobile phone chip manufacturers."

The loss of tens of thousands of wafers — if that figure is accurate (and we’re trusting Google with the mechanical translation) — would represent a significant chunk of a typical fab’s monthly output.
 