I'm only asking this as I've been wondering since I got my Xbox One X.
It's got approximately the same "GPU power" as my GTX 1070 laptop, around 6 TFLOPS. Yet I'd say that at 4K HDR the best experience is on the Xbox One X. We all know consoles are better optimised because they serve one specific use and only one or two hardware specifications.
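For context on where that "6 TFLOPS" figure comes from, here's a minimal sketch of the standard theoretical peak FP32 formula (shader cores × clock × 2 ops per cycle, since a fused multiply-add counts as two operations). The core counts and clocks below are the commonly quoted specs for each part, not measured values:

```python
def tflops(shader_cores: int, clock_mhz: float) -> float:
    """Theoretical peak FP32 TFLOPS, assuming one FMA (2 ops) per core per cycle."""
    return shader_cores * clock_mhz * 1e6 * 2 / 1e12

# Xbox One X: 2560 stream processors (40 CUs) at 1172 MHz (commonly quoted spec)
print(f"Xbox One X:        {tflops(2560, 1172):.1f} TFLOPS")  # ~6.0

# Laptop GTX 1070: 2048 CUDA cores at ~1442 MHz base clock (commonly quoted spec)
print(f"GTX 1070 (laptop): {tflops(2048, 1442):.1f} TFLOPS")  # ~5.9
```

Of course, peak TFLOPS says nothing about memory bandwidth, driver overhead, or API efficiency, which is exactly where console-style optimisation can pull ahead.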
However, do we use PC GPUs to the max? Is there more life left in older GPUs? Surely if developers properly optimised a PC release, we'd have much better-looking, much more playable graphics than ever, and something like a GTX 1070 would be classed as a 4K gaming card? Realistically it's the entry level for 4K gaming, isn't it?
**EDIT**: slightly misleading title, we know they're not; I mean, will we ever have them utilised to their full potential?
