
Nvidia rumoured to be launching new GTX 11 series without ray tracing

They're just setting up the pricing ladder and are going to have GPU variants at each price point, so there are no big jumps in between. E.g. if you look at AMD, you had very big jumps in pricing between the 570/580 and the Vegas (before all the recent deals; still do, >£50, and it used to be ~£100ish), but Nvidia had something within ~£50 of each card.
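Just to put a number on the "no big jumps" idea, here's a trivial C++ illustration of the gap check being described. The prices are made up for the sake of the example and are nothing to do with Nvidia's actual line-up:

#include <algorithm>
#include <cstdio>
#include <vector>

int main()
{
    // Hypothetical street prices for a tiered GPU stack, cheapest to dearest.
    std::vector<int> ladder = {140, 180, 230, 280, 330, 380};

    int widest_gap = 0;
    for (std::size_t i = 1; i < ladder.size(); ++i)
        widest_gap = std::max(widest_gap, ladder[i] - ladder[i - 1]);

    // A ladder with no gap above ~50 leaves no price point for a rival to
    // slot into; a ~100 hole (the old 580-to-Vega gap) does.
    std::printf("Widest gap between adjacent tiers: GBP %d\n", widest_gap);
    return 0;
}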

Guess it wouldn't be a real graphics card if it didn't fit in and then have 50 million variants across different manufacturers that straddle several pricing levels.
 
Nvidia will continue to stop 4K becoming standard for as long as they can get away with it. I say 4K, but I really mean anything which pushes the required spec up to that level (high refresh on anything above 2K).

We’ve been talking about it since the GTX 780 Ti days, yet the pricing is still too high five years and three generations on.
 
https://www.reddit.com/r/nvidia/comments/as7vuy/galax_gtx_1660_ti_my_friend_got_it_early_475aud/

Just saw this on Reddit: someone got a GTX 1660 Ti early, but it can't run games or benchmarks because there's no driver for it yet.


I assumed the Tensor cores listed in TechPowerUp's GTX 1660 Ti spec were a typo, as GTX GPUs were expected to lack both RT and Tensor cores while RTX GPUs have both. But then I noticed something very interesting...


GTX 1660 Ti confirmed to have 192 Tensor cores!

The Volta Titan V can run the 3DMark Port Royal benchmark, Battlefield V and Metro Exodus DXR ray tracing on its Tensor cores just fine, so the GTX 1660 Ti and GTX 1660 with 192 Tensor cores, and the GTX 1650 with 112 Tensor cores, should be able to run DXR ray tracing too, just slower than with dedicated RT cores.
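For context on how that works in practice: a game doesn't check for RT cores at all, it just asks D3D12 whether the driver exposes a raytracing tier, and the driver is free to implement that tier on RT cores, Tensor cores or plain compute shaders (which is how the Titan V manages it). A rough sketch of that capability check, my own illustration rather than anything from this thread, assuming Windows with the D3D12 headers and d3d12.lib linked:

// Minimal sketch: ask D3D12 whether the installed driver exposes DXR at all.
// The query says nothing about *how* raytracing is accelerated (RT cores,
// Tensor cores or plain shaders); that is entirely the driver's business.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Default adapter, minimum feature level 11_0.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 device available\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5)))
        && options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        std::printf("DXR exposed by the driver (tier %d)\n",
                    static_cast<int>(options5.RaytracingTier));
    }
    else
    {
        std::printf("No DXR support on this adapter/driver\n");
    }
    return 0;
}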

GTX 1660 Ti, GTX 1660 and GTX 1650 reviews will be very interesting to read. I guess Volta has 1st gen Tensor cores and Turing has 2nd gen Tensor cores.

Also, it will be very interesting to find out the TU116 and TU117 die sizes. Without Tensor cores, the TU116 die would measure around 150mm².
 
GIGABYTE GeForce GTX 1660 Ti OC pictured, benchmark leak, price confirmed

https://videocardz.com/newz/gigabyte-geforce-gtx-1660-ti-oc-pictured-benchmark-leak-price-confirmed



Yes, it's GTX 1070 performance according to the Final Fantasy XV benchmark.

If you've observed how graphics cards behave over the years, you'll most probably recall that cards with a new architecture are generally more efficient and deliver better results in published benchmarks thanks to specific optimisations. In real-world gaming, though, I'd bet that 95% of the time the 1660 Ti will be slower than the 1070, although for new games I'd guess it will be the other way round going forward, given the continued driver support the 1660 Ti will get and the 1070 won't. So it's down to individuals to decide what the 1660 Ti's performance means for them, depending on whether they're going to be playing new and future releases or a backlog of games they haven't got round to yet.

Regardless, it's hard to get excited over what's essentially a 6GB 1070 for around £250+ in 2019, especially with the Vega 56 available with free games for under £300. If anything, the 1660 Ti serves more to try to convince people to pay a bit more for a 2060 instead (but then there's also the difficult pill to swallow of a £350 card with only 6GB of VRAM).

For all intents and purposes, the 2060 would have been a great card for the price had it got 8GB of VRAM rather than 6GB, but I guess Nvidia don't want an 8GB 2060 making the 2070 look like a poor proposition.
 
Jensen is now admitting that the poor sales of the RTX cards are down to the high prices:

"
Nvidia's gaming unit took the biggest hit with revenues sinking 45 per cent year-on-year, which the company blamed on "weakness in gaming GPUs and a decline in shipments of SoC modules for gaming platforms."

The firm admitted that sales of its Turing RTX 2070 and 2080 graphics cards came in below company expectations, which Nvidia CEO Jensen Huang said could be due to the GPU's high-end prices; lower-priced cards based on the new flagship architecture didn't arrive on shelves until months later. "

https://www.theinquirer.net/inquire...-turing-gpu-sales-failed-to-meet-expectations

They now have a conventional GPU against an RTX unit, so what's to stop a 1670 next if the 1660 sells well? It's one thing undercutting their own tech, but needs must when the devil drives.
 
But even their lower-end card (the 2060) is overpriced and isn't a very good package compared to the alternatives.

They are pricing themselves out of the market and are now making cards to compete with themselves. The damage from the 20 series has not all been felt yet.
 
https://www.kitguru.net/components/...lson/nvidia-admits-dlss-needs-to-be-improved/

Wow, now Nvidia are admitting that the DLSS part of the new RTX cards isn't working properly and needs to be improved! What is the point of buying one of these cards when the hype turns out to be just, well, hype?

The trouble is hype is enough on its own to make sales. Nvidia have such a blind following, they could market a green dog turd and people would buy it. You’d then have the same people saying, if questioned “it’s my money, I’ll choose how I spend it”.

Just look at the hate the guy from Adored TV gets from purist fanboys for being honest. Inconvenient truths don’t sit well with people.
 

To be honest, it was only ever going to be as good as the AI algorithm behind it. If Nvidia only do one training pass for a game and never come back to it, it'll be gash; retrain on a quarterly basis and you get better results. Nvidia's statement sounds like they've done the former so far.

I think this is why Nvidia were saying game devs could buy deep-learning kit of their own and generate their own DLSS profiles, rather than submitting everything to Nvidia's supercomputers.
 