What, 1000W?!
It's BS, just Gigabyte drastically overstating the power requirements.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
What, 1000W?!
Just when things could not get any worse for the Vega launch.....
https://www.techpowerup.com/235701/...l-features-for-titan-xp-through-driver-update
The fastest gaming card available just got better!!!!
Shame I have not got any professional work for mine to do.
But it is LITERALLY why we in this country are paying more... The pound was worth ~20% more when the 900 series came out. That is EXACTLY why the 970 started at ~£260. If we had the same exchange rate as back then (~1.6), the 1070s would have started at around ~£300.
There is no disputing this, it is just a fact. :/ Gibbo himself said exactly this at the launch of the 1000 series as well.
The cards are set at a dollar price. Our money is worth fewer dollars now, so we pay more.
The 970 started at around $329 / 1.6 × 1.2 (for VAT) ≈ £247.
The 1070 started at around $379 / 1.3 × 1.2 (for VAT) ≈ £350.
The 1000 series was tricky, as Gibbo explained: they had to be careful of the crashing exchange rate, so there was a certain amount of hedging going on with the prices, given the expected, and then confirmed, plummet of the pound due to Brexit.
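To make the arithmetic explicit, here is a quick sketch (Python; the MSRPs and exchange rates are just the figures quoted above, and it ignores shipping and retailer margin):

```python
# Rough dollar-MSRP to UK-shelf-price conversion described above.
# Figures are the ones quoted in the post, not official launch data.

def uk_price(usd_msrp: float, usd_per_gbp: float, vat: float = 0.20) -> float:
    """Convert a US dollar MSRP to an approximate UK retail price."""
    return usd_msrp / usd_per_gbp * (1 + vat)

print(f"970 at launch (rate ~1.6):  ~£{uk_price(329, 1.6):.0f}")   # ~£247
print(f"1070 at launch (rate ~1.3): ~£{uk_price(379, 1.3):.0f}")   # ~£350
print(f"1070 at the old 1.6 rate:   ~£{uk_price(379, 1.6):.0f}")   # ~£284, roughly the ~£300 claimed above
```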
Typical launch from AMD really.
Tech of tomorrow, today, before the market is ready, which is probably what explains why AMD tech ages well.
Nvidia, on the other hand, push the tech of today for today, which is why they have the performance crown but their cards don't age as well.
Depends on your priorities.
I'm attracted to the tech in Vega and the fact that Vega supports all the DX12 features. The only issue is how many proper DX12 games there are, and with what currently exists, we all know it's mostly better to use DX11.
I do agree, however I would say the 980 Ti, when overclocked well, can hang with a 1080. That's pretty solid for a two/three-year-old card.
That 3x is for a very specific set of applications, and I very much doubt any owner of such a card would suddenly decide to drop it and get a Vega instead. From what I've seen so far it's not like Vega will be close to the 1080 Ti or Titan cards, so what you are saying doesn't really make sense.
It sounds to me like they did a very nice thing for owners of such cards, enabling something they didn't have to. Thank you, move on, nothing to see here.
Not sure if serious. lol
Exactly. It is obvious that they had it locked and released it due to competition forcing their hand. How can one not see this?
If anything Titan owners who will be using this new unlocked performance should be thanking AMD.
Are you actually serious? What competition???
Actually, that's not quite true. We always pay more over here; tech usually carries the same number as in the US, just with a £ in front instead of a $. It's always been like that, the UK has been a cash cow for US companies. Brexit only made it a little worse.
It's BS, just Gigabyte drastically overstating the power requirements.
Did not know you had a fetish for PSUs as well, Loadsa, I always thought it was just CrossFire.
Yeah, they missed a 1 off; it should be 1100W.
As a blanket statement "500W is good enough for Vega" is not good advice. A lot of existing 500W PSUs will be older models, 80+ Bronze at best.
You do know that these Gold, Platinum and Titanium PSUs actually offer pretty similar efficiency from 50% to 90% load these days.
For instance:
Gold: 90% efficiency at 50% load, 87% at 100% load.
Platinum: 92% at 50% load, 89% at 100% load.
Titanium: 94% at 50% load, 91% at 100% load.
So with that, I would say we need to stop giving this advice, because the situation just isn't what it was a decade ago.
Now, with that there is heat, as you mention. However, again, a number of fanless PSUs are out there, and in testing I have done (simulated at work, running 24/7 for a month) they were happy to sit at 80-90% load with no problem. Any decent PSU now will happily sit at 90% load for 5 years without an issue, and that is also why we get 5-10 year warranties these days; warranties that assume 100% load 24/7 for that period.
There is no problem with the 500 watt PSU being suggested if you just play games and don't overclock.
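To put numbers on the headroom argument, here is a minimal sketch (Python, with made-up but plausible figures rather than anything measured). The wattage on the box is rated DC output, so efficiency only changes what you pull from the wall and how much heat the PSU itself has to shed, not whether you are inside the rating:

```python
# Minimal sketch: estimate PSU load and wall draw for a single-GPU gaming box
# on a 500 W unit. All input figures below are illustrative assumptions.

def psu_summary(dc_draw_w: float, psu_rating_w: float, efficiency: float) -> str:
    """dc_draw_w: estimated DC load from CPU + GPU + rest of the system.
    psu_rating_w: the PSU's rated continuous DC output.
    efficiency: efficiency at that load (e.g. ~0.90 for Gold at 50% load)."""
    load_pct = 100 * dc_draw_w / psu_rating_w
    wall_draw = dc_draw_w / efficiency      # what the meter at the wall sees
    waste_heat = wall_draw - dc_draw_w      # dissipated inside the PSU
    return (f"{load_pct:.0f}% load, ~{wall_draw:.0f} W at the wall, "
            f"~{waste_heat:.0f} W of heat in the PSU")

# Assumed figures: ~300 W for a stock Vega-class GPU plus ~100 W for the rest.
print(psu_summary(dc_draw_w=400, psu_rating_w=500, efficiency=0.88))
```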
But in this case it was entirely due to Brexit.
I agree the pound against the dollar is bad at the moment and we are paying more for imports, but my point is it's wrong to just say it's Brexit's fault; there are so many different factors that cause exchange rates to go up and down.
This is not true either -
http://www.alphr.com/components/26408/inno3d-8800-gtx-review
price when reviewed - £333
https://uk.hardware.info/reviews/989/3/nvidia-geforce-8800-gtx-test-point-of-view-geforce-8800-gtx
£419 (this was closer to release)
USD RRP - $599
Exchange rate at the time = ~1.9
So plus 17.5% VAT, that works out at about £370, so again prices were there or thereabouts. I remember buying an 8800 GTX a few months after release for about £340.
Other factors such as supply/demand, launch-day milking etc. do inflate things, yes, but fundamentally we normally pay the dollar conversion + VAT + shipping costs (and then sometimes a cheeky mark-up if the retailer thinks they can get away with it).
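For completeness, plugging the 8800 GTX figures into the same dollar-to-sterling sum used earlier in the thread (a sketch using only the numbers quoted above; VAT was 17.5% at the time):

```python
def uk_price(usd_msrp: float, usd_per_gbp: float, vat: float) -> float:
    """Dollar MSRP converted at the prevailing rate, plus UK VAT."""
    return usd_msrp / usd_per_gbp * (1 + vat)

print(f"8800 GTX: ~£{uk_price(599, 1.9, 0.175):.0f}")  # ~£370, matching the figure above
```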