AMD readies three new GPUs: Greenland, Baffin and Ellesmere

FreeSync is AMD’s name for a complete three-part solution: a Freesync-compatible AMD Radeon™ graphics card, a Freesync-enabled AMD Catalyst™ graphics driver and a DisplayPort™ Adaptive-Sync-compatible display

From AMD's site.

https://community.amd.com/community...t-is-freesync-an-explanation-in-laymans-terms

FS123 said that FreeSync is just AMD's implementation of Adaptive-Sync. That's not technically correct, as Adaptive-Sync by itself isn't doing anything.

FreeSync is a proprietary solution which utilises Adaptive-Sync, nothing more or less.

And that's me done.

But apparently caring about the technicalities makes me a 16-year-old boy.

Intel can then do exactly the same thing with currently existing Adaptive-Sync monitors (which no one has tried to argue against).
 
My final bit on this to clear up any confusion, and we should really get back on topic, chaps :)

What is FreeSync and Adaptive-Sync?

Adaptive-Sync is part of a new VESA standard, included in the DisplayPort 1.2a specification, while FreeSync is an AMD project used in conjunction with the new standard to offer widespread support for the technology through its graphics cards and drivers.

Instead of updating the monitor at a constant refresh rate, the display update is matched to the GPU's frame output, which, somewhat like G-Sync, makes for a smooth and fluid visual experience.

http://toptenmonitors.hubpages.com/hub/G-sync-monitors
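Just to illustrate the mechanics, here is a minimal C++ sketch of the idea, not anything from AMD's or VESA's actual code; the 40-144 Hz refresh window and the frame times are made-up numbers. With a fixed refresh a finished frame waits for the next tick of the panel clock, while with Adaptive-Sync the panel refreshes when the frame is ready, within whatever range the panel supports.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Conceptual sketch only: models when a variable-refresh (Adaptive-Sync style)
// panel would show a frame versus a fixed 60 Hz panel, given frame render times.
// The 40-144 Hz window and the frame times below are invented for illustration.
int main() {
    const double minRefreshHz = 40.0, maxRefreshHz = 144.0;
    const double minInterval = 1000.0 / maxRefreshHz;  // ~6.9 ms
    const double maxInterval = 1000.0 / minRefreshHz;  // 25 ms
    const double fixedInterval = 1000.0 / 60.0;        // ~16.7 ms

    const double frameTimesMs[] = {12.0, 19.5, 14.2, 26.0, 9.8};

    for (double ft : frameTimesMs) {
        // Fixed refresh with vsync: scan-out happens on the next 16.7 ms tick,
        // so a 19.5 ms frame waits for the tick after next.
        double fixedDelay = fixedInterval * std::ceil(ft / fixedInterval);

        // Adaptive refresh: the panel refreshes as soon as the frame is done,
        // clamped to what the panel can physically do.
        double adaptiveDelay = std::clamp(ft, minInterval, maxInterval);

        std::printf("frame %5.1f ms -> fixed 60 Hz shows it at %5.1f ms, "
                    "adaptive shows it at %5.1f ms\n",
                    ft, fixedDelay, adaptiveDelay);
    }
}
```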
 
Maybe you are not a hardcore fanboy like mmj_uk, Lambchop, but I still think you lean more towards Nvidia, especially with the one-sided jokes.

At least I support NVidia because my experience is that their products are far superior and a more worthy recipient of my money, rather than some bizarre notion that I'm supporting the oppressed underdog in a war against the evils of the world.

Corporations are corporations; AMD would be doing all of the things Intel/Nvidia are doing if it were ever in a strong enough position to do so.

Funnily enough, you consider MMJ a hardcore fanboy, yet he's done something I wouldn't even entertain: purchase and own an FX83.

I've owned a few actually, sold a few, and I still own one now, though I do not use it. I wouldn't recommend them for a main gaming system, not if you can afford to go Intel. I just like to play around with all sorts of hardware and make my own judgements rather than relying on the marketing fluff you read from fanboys on forums.
 
At least I support NVidia because my experience is that their products are far superior and a more worthy recipient of my money, rather than some bizarre notion that I'm supporting the oppressed underdog in a war against the evils of the world.

Corporations are corporations; AMD would be doing all of the things Intel/Nvidia are doing if it were ever in a strong enough position to do so.

There are different people in the world with different principles. Some think of the wider picture, some think only of themselves. I will buy whatever is best for my needs, but where equivalents are available I personally would buy from the company I think is better for the gaming ecosystem (i.e. open standards, forward thinking, etc.). I also use Android, Firefox, Linux...
 
At least I support NVidia because my experience is that their products are far superior and a more worthy recipient of my money, rather than some bizarre notion that I'm supporting the oppressed underdog in a war against the evils of the world.

Corporations are corporations; AMD would be doing all of the things Intel/Nvidia are doing if it were ever in a strong enough position to do so.



I've owned a few actually, sold a few, and I still own one now, though I do not use it. I wouldn't recommend them for a main gaming system, not if you can afford to go Intel. I just like to play around with all sorts of hardware and make my own judgements rather than relying on the marketing fluff you read from fanboys on forums.

Yeah, I agree with what you're saying; I think it makes more sense to buy the best hardware you can get within your budget.

Go where the hardware is. Nvidia and AMD are no better or worse than each other as companies. Nvidia outdid themselves with the 970 issue and the worst non-apology in history, only for AMD to launch faulty Fury X pumps into the retail channel and bury their heads in the sand about the issue at first. I.e. they're just the same.

Buy the best hardware you can with your budget; only be loyal to your wallet, imho.

I'll try the high-end cards from both camps next year (I like to try things myself vs just reading reviews) but will only keep what I think is best. Right now that's the 980 Ti. Really hoping AMD come back strong next year, for the sake of competition.
 
Maybe Fury X will be just about rolled out mainstream by the time Greenland comes around.

Let's hope HBM 2 is actually ready for mass production when these cards are released.
 
Maybe Fury X will be just about rolled out mainstream by the time Greenland comes around.

Let's hope HBM 2 is actually ready for mass production when these cards are released.

It could be that they are using the Fury X as a pipe cleaner for the production lines, to test them so to speak, since AMD is releasing all of their 400 series with HBM2 and 14/16nm.

So they may never make them in huge numbers in the end.
 
It could be that they are using the Fury X as a pipe cleaner for the production lines, to test them so to speak, since AMD is releasing all of their 400 series with HBM2 and 14/16nm.

So they may never make them in huge numbers in the end.

There were early rumours that AMD would only make 50k Fiji chips; the rumours were ridiculed by the AMD fanatics and didn't really make any sense at the time. But here we are and Fury is basically missing in action: still terrible stock, still pump problems, Nano seemingly delayed, no word from AMD. Maybe the rumour was right after all and AMD knew they could only produce so many interposers or source a certain amount of HBM. It is a strange pipe cleaner card though, because it must have cost them a fortune in R&D, so they won't see any profits from this round, which is the last thing AMD needs. If their plan is to get Greenland out as early as possible next year then they should have scrapped the Fiji project before it drained their R&D budget and instead tried to work on a full-fat Tonga.
 
Maybe the rumour was right after all and AMD knew they could only produce so many interposers or source a certain amount of HBM. It is a strange pipe cleaner card though, because it must have cost them a fortune in R&D, so they won't see any profits from this round, which is the last thing AMD needs. If their plan is to get Greenland out as early as possible next year then they should have scrapped the Fiji project before it drained their R&D budget and instead tried to work on a full-fat Tonga.

If anything, it is a scaled-up Tonga and more than likely a test card for the new-style HBM memory controller, the use of HBM itself, and ironing out problems with the interposer technology.

So the R&D money wasn't entirely wasted, since these were things that needed testing for their entire 16/14nm refresh; it was still a necessary expense for getting certain technologies ironed out. Kind of like how the 750/750 Ti and their mobile variants are the only Maxwell v1 cards, since they were essentially test cards for Maxwell v2 in a way.

And they more than likely went with a big card since they had not released a new top-end part in a while. But it could also have been to test the limits of the fabrication tech, given that they want to make their huge HPC dies.

I still think that Tonga was them back-porting and testing some of their FinFET designs to see if they were viable on 28nm. But I guess things did not go as well as they wanted, since they tweaked and rebadged again.

And I still think they have been sitting on FinFET designs for the past year or more, but they could not release them in that form due to the delays by the manufacturers, so they kept working on them and improving them.

I remember seeing an early small-die test GPU with some HBM on it. Can't remember where the picture was though; it might have been something from a few years ago now, but the HBM itself was on the GPU's package rather than on an interposer.

There is a lot going on in the background even if the consumer side of things was fluffed.
 
Jesus, is this necessary? I hope a mod clears up a few of these posts and, for the record, I know Tonester and have met him. I don't ever knock anyone for being partial to a brand, that is their choice, but don't miss the fact that I have a massive sense of humour and enjoy life to the full. I love tech/gadgets and will be getting Greenland and reviewing it, and if it is better than Pascal, I will be sticking with Greenland (I stuck with the faster GPU out of the Fury X and Titan X). Don't hate on me just because I speak the truth about GPUs and other hardware/games, and I certainly don't have a gang lol.

Edit:

I am also on my last strike, so I am sure if you push hard enough, you will get me banned permanently :(

No mod will answer your calls; remember we had the same crap in the Fury X thread. Now quit ya whining! :p:D
 
No mod will answer your calls; remember we had the same crap in the Fury X thread. Now quit ya whining! :p:D

True that lol and meh!!!

I am genuinely looking forward to 14/16nm, and whichever is the faster of the first cards from both will remain in my PC. It should be a big boost, and I am expecting at least 50% faster than my TX from both Greenland and Pascal.
 
Plenty of things to look forward to Greg. The shrink is one and the other will be if DX12 can improve as much as the hype expects! :)
 
Plenty of things to look forward to Greg. The shrink is one and the other will be if DX12 can improve as much as the hype expects! :)

I never really expected much in the way of frame improvements, but more in the way of API overhead improvements for SLI/CF using DX12. Some games just don't like using multiple cards full stop, for either brand, but my concern is with devs having the control now instead of Nvidia/AMD. Seeing what Mantle brought was an indication for me, and whilst it certainly helped lower-end/older CPUs, it didn't add much in the way of frames for others.
 
I never really expected much in the way of frame improvements, but more in the way of API overhead improvements for SLI/CF using DX12. Some games just don't like using multiple cards full stop, for either brand, but my concern is with devs having the control now instead of Nvidia/AMD. Seeing what Mantle brought was an indication for me, and whilst it certainly helped lower-end/older CPUs, it didn't add much in the way of frames for others.

If the system was already GPU-bound then it wouldn't. Most games got around the limits by bumping up effects in other areas, reducing draw calls, and limiting the viewing area to tight corridors or sparsely populated open areas.

But Mantle did lower frame and input latency while eliminating hitching (except when the memory bug occurred in BF4, but that is another issue).

And Mantle did show that multi-GPU performance and scaling were being held back by the DX11 bolt-on-top method.

But the major thing that will only come out in future is games having more varied and dynamic environments, because they are no longer massively held back by draw calls. We could even see city scenes with dense populations and streets packed with cars in the future.

The new Deus Ex is already a good example of scenes dense with many NPCs, and no doubt the performance will be far better on DX12 than DX11.
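To put the draw-call point in concrete terms, here's a toy back-of-envelope C++ sketch; the per-call costs are assumptions for illustration only, not measured DX11/DX12 figures. The lower the fixed CPU cost per draw call, the more distinct objects a frame can issue before the CPU, rather than the GPU, becomes the limit.

```cpp
#include <cstdio>

// Back-of-envelope sketch (made-up overhead numbers, not measured figures):
// if the CPU must spend a fixed amount of driver/API time per draw call,
// the per-frame CPU budget caps how many distinct objects a scene can issue.
int main() {
    const double frameBudgetMs = 16.7;       // one 60 fps frame
    const double dx11CostPerDrawUs = 40.0;   // assumed "thick driver" cost per call
    const double dx12CostPerDrawUs = 5.0;    // assumed thin, pre-recorded cost per call

    auto maxDraws = [&](double costUs) {
        return static_cast<int>(frameBudgetMs * 1000.0 / costUs);
    };

    std::printf("DX11-style overhead: ~%d draw calls per frame\n",
                maxDraws(dx11CostPerDrawUs));
    std::printf("DX12-style overhead: ~%d draw calls per frame\n",
                maxDraws(dx12CostPerDrawUs));
    // More headroom per frame is what would let a scene show dense crowds and
    // streets full of cars without the CPU becoming the bottleneck.
}
```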
 
It has been one of my gripes with SLI and Tri-SLI. Scaling in over 80% of games with Tri-SLI was poor, pretty much a wasted GPU, and even SLI at times was questionable in terms of scaling. This is something I am hoping DX12 will fix, but I refer back to my last comment: putting it in the hands of the devs has me worried.
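As a quick worked example of what "a wasted GPU" looks like in numbers (the scaling figures below are invented for illustration, not benchmark data), perfect scaling would mean the speedup equals the number of cards; the ratio of observed to ideal shows how much of the hardware is actually doing work.

```cpp
#include <cstdio>

// Rough sketch of why poor scaling means "a wasted GPU" (the scaling factors
// below are invented for illustration, not benchmark results).
int main() {
    const int gpuCount[] = {1, 2, 3};
    const double observedSpeedup[] = {1.0, 1.7, 1.9};  // assumed, not measured

    for (int i = 0; i < 3; ++i) {
        // Perfect scaling would be speedup == number of GPUs, so the ratio
        // tells you how much of the hardware you paid for is actually working.
        double efficiency = observedSpeedup[i] / gpuCount[i];
        std::printf("%d GPU(s): %.1fx observed, %.0f%% of ideal\n",
                    gpuCount[i], observedSpeedup[i], efficiency * 100.0);
    }
    // With numbers like these the third card adds only ~0.2x on top of two,
    // which is the "pretty much a wasted GPU" situation described above.
}
```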
 