It's in both their interests to keep prices high; that's how cartels work.
Yet they didn't take this approach with Intel. Because reasons?
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
It won't beat Ampere unless nV's entire R&D team has spent all their time since Turing snorting crack off a lady of the night's bum.

Surely smoking crack or snorting cocaine? I thought that's what made crack so addictive. I guess they could probably afford it too, although it would certainly shake things up for all of us if they were 'busy' doing that rather than the day job.
Yet they didn't take this approach with Intel. Because reasons?

Maybe because their CPU mindshare was almost non-existent; they were last competitive in the CPU space nearly 20 years ago, but in GPUs they've been competitive far more recently. I'm just speculating, but there's so much money involved that the situation begs for shady dealing.
It's in both their interests to keep prices high; that's how cartels work. A quiet chat in a bar between 'friends' who work in the industry on different 'sides' to get a ballpark idea of each other's performance. It's not espionage, it's collusion with zero paper trail and inherently unprovable. How else would different architectures arrive at very similar performance/price? Huawei vs Apple in mobile phones demonstrates what happens when there isn't a gentleman's agreement: similar features for radically different prices. Having just two players makes it much easier.

I don't think it's in AMD's interest at all to match prices, especially at the high end where people are used to buying Nvidia products, which are tried and tested and which people trust.
I don't think it's in AMD's interest at all to match prices, especially at the high end where people are used to buying Nvidia products, which are tried and tested and which people trust.

I agree, a £50 difference isn't enough; they need to disrupt the market rather than repeat the 5700 XT launch. I'm desperately hoping for a 9700 Pro moment but not that confident it will happen. If they can replicate what they've done with CPUs in the GPU space, I'll be very happy.
I'm open to buying AMD, but only if it's quite a bit cheaper than Nvidia for a similar level of performance. If the difference in price is only £50 then I would probably pay the extra to stick with Nvidia, as 50 quid when I'm already shelling out £700-800 isn't a big discount, and not enough for me to take a chance on AMD.
Just pointing out something that's rare when it comes to AMD here: thanks to the node shrink, they have kept power consumption well in check, something Nvidia normally have bragging rights over. Coupled with the higher power draw for Nvidia's new lineup (30x0) this iteration, we might see a very tight battle when it comes to fps/watt/£. Where it gets ultra interesting is if they can clear the 2080 Ti for decent power draw AND price it at realistic levels; then the popcorn will be sold out.
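To put rough numbers on that fps/watt/£ battle, here's a quick Python sketch. Every figure in it is made up purely for illustration (no benchmarks or leaks involved); it just shows how the three metrics combine when comparing two cards.

Code:

# Hypothetical comparison of two cards on raw fps, fps per watt and fps per £.
# All numbers below are placeholders, not real benchmark or pricing data.
def value_metrics(name, fps, watts, price_gbp):
    """Return simple performance, efficiency and value figures for a card."""
    return {
        "card": name,
        "fps": fps,
        "fps_per_watt": fps / watts,
        "fps_per_pound": fps / price_gbp,
    }

cards = [
    value_metrics("hypothetical green card", fps=100, watts=320, price_gbp=700),
    value_metrics("hypothetical red card", fps=95, watts=250, price_gbp=600),
]

for c in cards:
    print(f"{c['card']}: {c['fps']} fps, "
          f"{c['fps_per_watt']:.3f} fps/W, {c['fps_per_pound']:.3f} fps/£")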
Someone explained that to me a long time ago in another forum. He claimed to be an engineer of sorts, don't recall the exact title. But it went a little like this:
The gist was that when it comes to GPUs it's often better to make the die bigger, as the extra area helps dissipate heat. Keeping that heat dissipation well under control helped with the overall power the die needed. This was supposedly why Nvidia used larger dies: it was not about transistor count alone, but about managing thermals, which in turn helped control power consumption.
Packing everything into a tight, confined space (which is what ATI/AMD were allegedly doing) might have decreased the die size but would affect thermals and thus power consumption. There was a bit more to it technically; I'm just going off memory on that.
So with everything shrinking to unheard-of sizes while compute keeps increasing, there might be an equilibrium reached where one needs to do more with their GPU IP than just shrink dies.
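To make the thermal argument above concrete: for the same board power, a larger die spreads the heat over more area, so the power density the cooler has to deal with is lower. A quick sketch with made-up die sizes and power figures (not real GPU specs), just to show the relationship the engineer was describing.

Code:

# Power density (W per mm^2) for two hypothetical dies drawing the same power.
# Lower power density is generally easier to cool; the figures are illustrative
# only and are not taken from any real Nvidia or AMD product.
def power_density(board_power_w, die_area_mm2):
    """Watts dissipated per square millimetre of die area."""
    return board_power_w / die_area_mm2

big_die = power_density(board_power_w=250, die_area_mm2=550)    # ~0.45 W/mm^2
small_die = power_density(board_power_w=250, die_area_mm2=330)  # ~0.76 W/mm^2

print(f"Big die:   {big_die:.2f} W/mm^2")
print(f"Small die: {small_die:.2f} W/mm^2")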
We already know what AMD will do: that super-duper all-in-one SoC that does EVERYTHING, from computers to phones and from cars to manufacturing.
It's fairly true what you say, i.e. look at Ryzen's surface area, which would be a positive when it comes to cooling. The leap I am referring to, however, is this snippet quoted from various news sources: "AMD has been promising big things for RDNA 2 for a while now, specifically a 50% improvement in performance per watt over first gen RDNA."

Hard to say. We are only going on rumours right now, after all.
Normally in the red v green releases, AMD, in an attempt to compete, blast the clocks up at the expense of power efficiency, though it has got marginally better in recent times.
So if the new AMD cards really are 50% better per watt than first gen, and Nvidia are pushing clocks at the expense of power efficiency, it's surely going to be a close one?
However, some things are still not adding up. A doubling of everything doesn't necessarily equal double the performance. We are still missing a lot of information on this card, which is why it's so hard to gauge what Big Navi is going to do versus what RDNA 2 will do on console.
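One way to picture why doubling everything doesn't automatically double performance is a simple sub-linear scaling model: each extra compute unit contributes a bit less because of bottlenecks elsewhere (bandwidth, front end, how parallel the workload is). The sketch below uses an assumed scaling exponent purely for illustration; it isn't based on any AMD data.

Code:

# Toy model of sub-linear performance scaling with compute unit count.
# 'alpha' is an assumed scaling exponent (1.0 would be perfect scaling); the
# value here is made up for illustration and is not an AMD or RDNA 2 figure.
def scaled_performance(base_perf, cu_ratio, alpha=0.85):
    """Estimate relative performance when the CU count is multiplied by cu_ratio."""
    return base_perf * (cu_ratio ** alpha)

base = 100.0  # normalised performance of the smaller part
for ratio in (1.5, 2.0):
    est = scaled_performance(base, ratio)
    print(f"{ratio:.1f}x the CUs -> ~{est:.0f}% of baseline performance")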
If you want an interesting case study of what (practically) doubling shader count, ROPs, memory bandwidth etc. does for RDNA, compare the 5500 XT (4GB or 8GB, though I feel the 4GB gets held back by VRAM limitations too much, so the performance delta is wider than it should be) to the 5700 XT. Average clock speeds are similar for both. Looking at Computerbase.de and TechPowerUp numbers, I found roughly a 1:1 ratio between power increases and performance uplifts. At TPU the 5700 XT used 74% more power than the 8GB 5500 XT but was 76% faster at 1080p, with 1440p and 4K having larger performance deltas. Compared to the 4GB card, the 5700 XT used 94% more power and had a 90% performance uplift. Same story over at Computerbase: their samples had the 5700 XT using 62% more power for 62% more performance. Their 4GB performance numbers are quite a bit down versus the 8GB card while the power usage is basically the same, so there a 64% increase in power got you a >70% increase in performance.

Given this and the advertised 50% perf/watt increase for RDNA 2, that would give us a 5700 XT-performing card at around 140W and twice the card at 280W. Provided AMD can get similar perf:power scaling for 'Big Navi', a 100% performance uplift (or close to it) is possible, although scaling workloads to that many CUs may be an issue.

Interesting comparison; it seems Big Navi could well deliver if the rumours are true regarding specs and power budget. Let's hope there's some architectural secret sauce that really brings it to Nvidia.
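The extrapolation above is easy to reproduce. The sketch below recomputes the power:performance ratios from the quoted TechPowerUp and Computerbase figures and then applies the advertised 50% perf/watt uplift; the ~210W board power for the 5700 XT is my own assumption, and it presumes the roughly linear perf:power scaling seen in the 5500 XT vs 5700 XT comparison carries over to a much bigger chip.

Code:

# Recompute the power vs performance ratios quoted above, then apply the
# advertised 50% perf/watt uplift for RDNA 2. The 210W 5700 XT board power is
# an assumption, and linear perf:power scaling is carried over from the
# 5500 XT vs 5700 XT comparison rather than being a known RDNA 2 property.
pairs = {
    "TPU, 5700 XT vs 5500 XT 8GB": (1.74, 1.76),            # (power ratio, perf ratio)
    "TPU, 5700 XT vs 5500 XT 4GB": (1.94, 1.90),
    "Computerbase, 5700 XT vs 5500 XT 8GB": (1.62, 1.62),
}

for label, (power, perf) in pairs.items():
    print(f"{label}: {perf / power:.2f}x performance per 1x power")

# 50% better performance per watt means the same performance at 1/1.5 the power.
base_power_w = 210                      # assumed 5700 XT board power
rdna2_power_w = base_power_w / 1.5      # ~140W for 5700 XT-class performance
print(f"5700 XT-class performance at roughly {rdna2_power_w:.0f}W")

# If performance keeps scaling ~1:1 with power, doubling that budget doubles it.
print(f"Roughly 2x 5700 XT performance at roughly {2 * rdna2_power_w:.0f}W")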
Pretty obvious NVIDIA would want to be first to market so that they can charge £££££ up until AMD can launch their card. They want to further gouge their user base while they are able!
Yet they didn't take this approach with Intel. Because reasons?

They've caught and surpassed Intel on quality of product. They're setting the pricing now and Intel are scrambling to compete. AMD can't afford to have a price war with Nvidia. Nvidia are too strong.
You don't have to be that cynical about everything lol
AIB vendor claims Nvidia is close to launching Ampere gaming GPUs, while RDNA 2 cards are MIA - AIB vendors don't yet have testing GPUs from AMD - appears Nvidia is launching well ahead of AMD
https://out.reddit.com/t3_hrjkmh?url=https://www.purepc.pl/gralem-w-cyberpunk-2077-wymagania-sprzetowe-i-jakosc-grafiki#comment-725216&token=AQAAB8sOXxerOUZ_eNbY--KYiukKMMrRJVjlI04Q9wMqC0zET3c1&app_name=reddit.com
Bad news for AMD.