Sunday night will be SIGGRAPH and GoT. A good night indeed
grab the popcorn for the comedy event
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Well, from what I understand there wasn't much interest in pushing adaptive sync tech when Nvidia was trying to get it introduced - it was only after they went away and made their own version that VESA suddenly started to do anything.
I am sure it will be golden, bro. 1080 performance at 1080 prices is what I expect, and that will do nicely for those with Freesync screens. Hopefully they have done some sandbagging with the FE and the RX will be Ti performance.
Yer, a 1080 is very good performance and a very capable card in truth. I think a lot of people would accept that for sure, especially knowing they'll squeeze a lot more performance out of it over the course of 6-9 months.
Am I the only one here who doesn't care about gsync, freesync, hdr, 4k?
I don't game on a sofa far from my screen; I just hate that type of experience.
I have used HDR for a decade already in PC gaming; it is not new like Sony and co are trying to make out. It's been around since the early 80s.
Gsync and freesync are only useful if your framerate doesn't match your refresh rate or a multiple of it. The question is why people are playing in such a way. I play at either 30fps or 60fps on a 60Hz screen.
I think pixel count is a false economy: it does wonders for slowing GPUs down, but it's not that great for improving image quality. As resolutions get higher and higher, you hit a point of diminishing returns. I only got a 1440p monitor for desktop real estate; the fact I game at 1440p is only for that reason now. I have noticed no visual improvement over my previous 1050p resolution, whereas things like lighting effects, SGSSAA, tessellation etc. "do" make a meaningful difference to visuals. I always prefer lower resolution with max graphics settings over higher resolution with things turned down.
So to me, buying a slower, hotter, more power-hungry card just so I can use freesync sounds barmy.
I noticed a huge difference at 1440 over 1080; very surprised you can't see it. Everything looks so much better at 1440.
Don't Nvidia support adaptive sync in their gsync laptops?
I could have sworn Maxwell was capable too. Because of that rumour Nvidia was going to release a driver to support adaptive sync, but it never happened.
Not sure they ever announced that...
You can be disappointed, and then watch GoT to move on.
All this GoT talk is making me want to watch the last ep again to get me through till early Monday morning!!!
Then go watch it. Thank goodness for SkyGo; it's the only time I use that service.
Well, I just installed it last night and my god, what a positive difference it made to be back on Win7 for now. BF1 was butter-smooth with no issues; haven't had that in a long while.
What's the backlight bleeding issues like on the ROG? I saw nightmare-inducing stuff on the batches from the first few months.
Mine has a bit of BLB but nothing to worry over; it is in the 4 corners but not bad. My Dell IPS was horrendous for it.
As for Acer, I tend to avoid them after they brought in their 15-dead-pixel policy: anything under 15 dead pixels is considered fine. Thankfully I had ordered from Rainforest, and they got me a refund on the monitor I had at the time.
I don't care for g-sync/freesync either, because you have to lock yourself to one vendor. I'll care as soon as that restriction is lifted; till then, I lived without it for the years before it was around, and I'll survive without it now. Locking card/monitor together is absurd.
Yes, locking card to monitor is annoying and anti-consumer, but that's Nvidia for you.