I use the term "optimized" in quotes because it's not really optimization in the classic computing sense. Optimization means a very specific thing: take some function f(x) and make it compute faster or with fewer resources while the output remains the same for any given input. With rendering, that means more FPS at the same visual quality. The problem is that the kind of "optimization" we see on consoles is really just turning down visual settings selectively so you get the most visual bang for your performance buck. It's a smart thing to do, because you're spending your budget of GPU cycles on the things that give the biggest win. BUT it's not the same as truly optimizing the game; you're just tweaking settings, and compared to a PC with everything turned up to Ultra you're simply looking at inferior output on the console. So naturally you'd expect it to perform better on any given hardware. This is why I said that if you want to compare PC/console performance you need settings parity first, so it's at least a fair test.
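To make the "same output, fewer resources" definition concrete, here's a toy sketch (all names made up for illustration): two functions that return the identical result for every input, where the second is a genuine optimization of the first because only the cost changes, not the output.

```python
def sum_of_squares_loop(n: int) -> int:
    """O(n): add up 1^2 + 2^2 + ... + n^2 one term at a time."""
    total = 0
    for i in range(1, n + 1):
        total += i * i
    return total

def sum_of_squares_formula(n: int) -> int:
    """O(1): closed form n(n+1)(2n+1)/6 -- same output, far fewer cycles."""
    return n * (n + 1) * (2 * n + 1) // 6

# A real optimization must preserve the output for ANY input:
assert all(sum_of_squares_loop(n) == sum_of_squares_formula(n) for n in range(200))
```

Dropping visual settings fails that test by definition: the output image changes, so it's a trade-off, not an optimization.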
All of that said, there are definitely more chances for real, genuine optimizations on consoles. First of all, they have fixed hardware across all users, so if you're hunting for ways to be more efficient and you find something, it's a win for everyone. Whereas on PC you have a whole range of hardware, and an optimization that only helps a specific video card benefits a limited set of users, so there's less incentive to hunt for them on PC. As
@Gerard also pointed out, consoles allow for real optimizations by letting engineers program close to the metal. Consoles strive to remove as many layers of abstraction as possible between the game engine and the instructions actually executed on the GPU. That's impossible on PC, which needs all those abstraction layers to support an open, competitive market of different video cards and components.
As for visual trade-offs, game developers are going to aim for the low-hanging fruit first. In a marketplace of competing games that want to use nice visuals to attract gamers, they'll go after whatever visual effects give the most bang for the buck. But a corollary of this is that as time goes on you're forced to move toward effects that are more computationally expensive and have a relatively more subtle impact on visuals. Whether those trade-offs are worth it is kind of just personal preference on image quality vs cost vs performance. It's why consoles, being more or less budget gaming devices, still sell extremely well. Many gamers will be told something is 4K when in fact there's dynamic scaling going on, and most won't be able to tell the difference. High-end settings for PC games are kind of a niche thing; it's enthusiasts who understand how a lot of this tech works and want to experience it in action, even if the visual impacts are subtle.
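For anyone curious how that dynamic scaling typically works, here's a hedged sketch of the common feedback loop (the frame budget, step size, and scale limits are all made-up numbers, not any console's actual values): if the last frame blew its time budget, render the next one at a lower internal resolution and upscale to the 4K output.

```python
TARGET_FRAME_MS = 16.7        # hypothetical 60 FPS budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0
STEP = 0.05

def next_render_scale(current_scale: float, last_frame_ms: float) -> float:
    """Nudge the internal resolution scale toward the frame-time budget."""
    if last_frame_ms > TARGET_FRAME_MS:          # over budget: drop resolution
        current_scale -= STEP
    elif last_frame_ms < 0.9 * TARGET_FRAME_MS:  # plenty of headroom: raise it
        current_scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, current_scale))

# One heavy 22 ms frame quietly drops the "4K" output to a lower internal res:
scale = next_render_scale(1.0, 22.0)
print(round(3840 * scale), round(2160 * scale))  # 3648 2052
```

The upscaled result still gets marketed as 4K, which is exactly why most people can't tell the difference: only the internal render target shrank, and only on the heavy frames.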
Alex @ DF argues for the very same thing regarding console settings on PC: it'd be really handy for the hardware and gaming community to be able to compare games. Hopefully the console DRM will be cracked, people will get access to the file system, and modders can go hunting for this info.