I think you missed mine - trying to use ARK as an indicator of anything is like throwing money into a bottomless pit; it's been terribad from the beginning. The developers are deliberately not optimising the game so they can save on R&D costs - they have had years and they cannot be bothered.
It's like all the people who try to use games like Planetside 2 as a CPU performance indicator - it's very CPU-taxing, but in the end even throwing hardware at it won't solve the performance problems, as people have found to their chagrin.
Hardware enthusiasts on forums need to apply some common sense to this - many games or engines are poorly optimised, or are built to scale far beyond the hardware available at release. People never understood that with Crysis - you could push image quality well past the in-game presets with manual tweaks to the configuration file, which is why it can still stress modern cards at 4K, yet plenty of people were moaning about performance back then after doing exactly that. There were also certain in-game settings at launch, like some of the DX10-only effects, which tanked performance whilst offering virtually no image quality improvement.
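For anyone curious what those tweaks looked like, a typical custom autoexec.cfg from the old Crysis tweak guides went something like the lines below - I am quoting the cvar names from memory of those guides, so treat the exact names and values as illustrative rather than gospel:

con_restricted=0
e_water_ocean_fft=1
r_sunshafts=1
r_colorgrading=1
e_view_dist_ratio=80

The first line unlocks restricted console variables, the middle three force on effects that were normally gated behind the "Very High" preset, and the last pushes draw distance past the preset cap - which is exactly the kind of thing that made tweaked Crysis so much heavier than stock "max settings" Crysis.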
Even then, the whole culture of "maxing out" is pointless - fun to do, but still pointless. There are certain types of AA which are just plain stupid and only exist for people measurebating over screenshots, or for AMD/Nvidia to sell more expensive cards whilst drip-feeding the performance improvements. If there is a genuine improvement, fair enough, but if you need to spend £1400 instead of £500 just to push a few settings up a notch or two, then I would rather drop the settings.
That is just me OFC, but YMMV.