It's not always easy to compare GPU architectures, whether from the same brand or different ones. When looking at cards within the same series, I'd rank the specs in this order of importance:
- More cores
- Faster core speed
- More memory
- Faster memory
Generally, higher-tier cards have more cores, and then there are overclocked versions of each card that run at a higher clock speed. There are situations where a lower-tier card with fewer cores can overclock and run faster than a higher-tier card at stock speeds (less so these days AFAIK, with maybe the exception of a Titan?).

More memory is generally better, but only up to the point that it's required: if a game only needs 3GB of VRAM, there won't be a difference between 4, 6, and 8GB cards, but the one with the higher memory clock, everything else being the same, will perform better. However, if a game uses 7GB of VRAM, you're going to run into performance issues on the 4 and 6GB cards regardless of their memory clock speeds.
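To make that ordering concrete, here's a toy Python sketch. Every card and number in it is invented purely for illustration; it just encodes the idea that VRAM is a pass/fail threshold against what the game needs, and after that more cores, core clock, then memory clock decide.

```python
def sort_key(card, game_vram_gb):
    # VRAM is a threshold, not a linear bonus: either the card has enough
    # for the game or it doesn't. After that: cores > core clock > memory clock.
    return (
        card["vram_gb"] >= game_vram_gb,
        card["cores"],
        card["core_mhz"],
        card["mem_mhz"],
    )

# Two made-up cards, identical except for VRAM size and memory clock.
cards = [
    {"name": "4GB, faster memory", "cores": 2304, "core_mhz": 1400, "mem_mhz": 8800, "vram_gb": 4},
    {"name": "8GB, slower memory", "cores": 2304, "core_mhz": 1400, "mem_mhz": 8000, "vram_gb": 8},
]

# Game needs 3GB: both cards clear the bar, so the faster memory clock wins.
print(max(cards, key=lambda c: sort_key(c, 3))["name"])  # 4GB, faster memory
# Game needs 7GB: the 4GB card fails the threshold, regardless of clocks.
print(max(cards, key=lambda c: sort_key(c, 7))["name"])  # 8GB, slower memory
```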
Nvidia tends to use fewer cores running at higher frequencies; AMD uses more cores at lower frequencies. That's why core counts and clock speeds don't compare directly across brands.
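You can sanity-check that with some quick paper math. FP32 throughput is roughly 2 FLOPs per core per clock (one fused multiply-add), and plugging in the core counts and reference boost clocks for a 580 and a 1060 6GB (clocks from memory, so treat them as ballpark):

```python
# Paper FP32 throughput: 2 FLOPs per core per clock (one fused multiply-add).
def tflops(cores, boost_mhz):
    return 2 * cores * boost_mhz * 1e6 / 1e12

print(f"RX 580       (2304 cores @ 1340 MHz): {tflops(2304, 1340):.1f} TFLOPS")  # ~6.2
print(f"GTX 1060 6GB (1280 cores @ 1708 MHz): {tflops(1280, 1708):.1f} TFLOPS")  # ~4.4
```

On paper the 580 has about 40% more TFLOPS, yet (as the list below shows) the two trade blows in real games, which is exactly why specs only rank cards within the same architecture.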
Currently, off the top of my head:
- 580 ≈ 1060 6GB (the 1060 3GB is a different card, not just less VRAM)
- Vega 56 ≈ 1070
- Vega 64 ≈ 1070 Ti/1080
AMD currently uses more power, and more power means more heat, so their cards will output more heat. That doesn't mean they'll run hotter, though; that depends on the cooler. Generally, the best place to look is reviews. I'd go with Hardware Unboxed/Techspot or TechPowerUp, as they both test a large number of games. These will give you a rough idea of where the type of card you're looking at will slot in, e.g. a 1070 or a 580. Then do some googling for a review of the specific model you're looking at, to make sure the cooler isn't crap and it doesn't get noisy.
There's also been the introduction of adaptive sync tech, which keeps your monitor's refresh rate in sync with your frame rate. AMD has FreeSync, Nvidia has G-Sync. Neither uses the other's tech, but you can run either brand of graphics card on any screen, just without that sync tech enabled. For example, you can buy an Nvidia GPU and use it with a FreeSync screen; it'll work fine, FreeSync just won't be active.
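To illustrate why that matters, here's a toy model of my own (not vendor documentation): on a fixed 60Hz panel with classic vsync, a finished frame has to wait for the next refresh tick, while adaptive sync lets the panel refresh the moment the frame is ready, within the panel's supported range.

```python
import math

REFRESH_MS = 1000 / 60  # fixed refresh interval of a 60Hz panel

def displayed_interval_ms(frame_time_ms, adaptive):
    if adaptive:
        # Panel refreshes as soon as the frame is ready (ignoring the
        # panel's minimum refresh rate for simplicity).
        return frame_time_ms
    # Classic vsync: the finished frame waits for the next refresh tick.
    return math.ceil(frame_time_ms / REFRESH_MS) * REFRESH_MS

for ft in (15.0, 17.0, 20.0):
    print(f"{ft:.0f} ms frame -> vsync: {displayed_interval_ms(ft, False):.1f} ms, "
          f"adaptive: {displayed_interval_ms(ft, True):.1f} ms")
# A 17 ms frame (just under 60 FPS) gets held to 33.3 ms under vsync,
# a visible drop to 30 FPS, while adaptive sync displays it at ~59 FPS.
```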
If any of the above is wrong, I'm sure someone will chime in and correct me!