I guess it's something that has to be seen to fully understand the difference it will make, at least for me. My fps is never that low, so I guess it's something that will generally benefit a slower GPU more than the more powerful ones. I have to admit, though, I don't really have a problem with lag or stutter when using vsync. Some input lag, granted, but I can remove that by either turning off vsync or using an fps limit. Obviously if fps drops then it's noticeable, but personally I don't start to notice it unless fps drops below, say, 50. Tearing does not always occur with it off either; that seems to vary on a game-by-game basis for me. So it sounds like a nice feature, but not one I'd personally describe as game-changing, because there's not really an issue there in the first place, except in a few circumstances. For those situations I'm sure it will help. I can see the benefit of it a bit more now, but I don't think it would change too much for me personally.
Thought I would have a play with what I think the specs will be :D
I think you might be a bit low on the cores. The Quadro K6000 has 2880 @ 902 MHz to deliver 5.2 TFLOPS. Surely Nvidia will want to match/beat the 290X in the numbers game?
Possibly 4.5 GB of memory to keep costs down.
2 x 6-pin like the K6000.
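For anyone wanting to sanity-check that 5.2 TFLOPS figure, here's a minimal sketch of the usual back-of-the-envelope maths. It assumes the standard Kepler convention of 2 FLOPs (one fused multiply-add) per CUDA core per cycle; the function name is just for illustration.

```python
def tflops_fp32(cuda_cores, clock_mhz, flops_per_cycle=2):
    """Peak single-precision throughput in TFLOPS.

    Assumes each core retires one fused multiply-add (2 FLOPs) per cycle,
    which is the usual convention for quoting peak GPU compute.
    """
    # cores * MHz * FLOPs/cycle gives MFLOPS; divide by 1e6 for TFLOPS
    return cuda_cores * clock_mhz * flops_per_cycle / 1e6

# Quadro K6000: 2880 cores @ 902 MHz
print(round(tflops_fp32(2880, 902), 2))  # 5.2
```

Which lines up with the quoted K6000 spec, so a 780 Ti with the same core count would need a higher clock to pull further ahead.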
Hope this is true, saving my pennies. Cannot resist 2880 cores!
Make that GOOD GDDR5 memory with 1700MHz out of the box, and all overclockable to 2000MHz on average as well; that would make the bandwidth match the 290X's 1250MHz~1500MHz on the 512-bit bus. I'm remaining cautiously optimistic for more cores than Titan, and 3GB-6GB variants. I have a feeling the 780ti will be a beast; Nvidia will want to properly take on the 290X with more than just an overclocked 780.
Bring on the 780ti and 290X benchmarks !!
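The bandwidth comparison above can be checked with a quick sketch. This assumes GDDR5's usual quad-pumped signalling (effective transfer rate = 4x the base memory clock) and a 384-bit bus for the 780 Ti, as on the 780/Titan; the helper name is just for illustration.

```python
def bandwidth_gbps(mem_clock_mhz, bus_width_bits, pump=4):
    """Peak memory bandwidth in GB/s.

    GDDR5 is quad-pumped, so effective MT/s = 4 * base clock in MHz.
    Multiply by bus width in bytes to get MB/s, then scale to GB/s.
    """
    effective_mtps = mem_clock_mhz * pump
    return effective_mtps * bus_width_bits / 8 / 1000

# Assumed 780 Ti: 1700 MHz GDDR5 on a 384-bit bus
print(bandwidth_gbps(1700, 384))  # 326.4 GB/s
# 290X: 1250 MHz GDDR5 on its 512-bit bus
print(bandwidth_gbps(1250, 512))  # 320.0 GB/s
```

So 1700MHz on a 384-bit bus does indeed land slightly above the 290X's stock 320 GB/s, which is what makes the "good memory" point matter.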
Mantle is a low-level API. G-Sync is monitor synchronisation. If anything, the two technologies combined would be brilliant, although it is unlikely to happen. Not sure how this in any way, shape or form makes Mantle seem any less of a good thing.
If G-Sync is that great, expect AMD to bring their own version out. That's probably one reason why it's not yet proprietary. I'm not sure the monitor companies would want to lock out 40% or so of the user base from a unique selling point of their monitor. If an AMD version does come out, expect it to be cheaper as well.
Panel manufacturers have enough products on the market to safely cover themselves on that, Matt. Besides, there isn't anything stopping you using a G-Sync enabled monitor on AMD hardware. Potentially completely pointless, but still true.
You really think Nvidia G-Sync will continue to work as soon as an AMD GPU is detected? Think PhysX and arbitrary lockouts.