Soldato
- Joined
- 30 Nov 2011
- Posts
- 11,526
You might be right mate, I just remember a hardware review site on here a couple of months back getting a warning.
Nice site though!
He had banner ads.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Removed
Good to see AMD partnering with IO Interactive on the latest Hitman game. With a February beta and a March release date, could this be the first "full" DX12 game? I quite enjoyed Absolution and will be looking forward to this.
I think if the game is made with DX11 as a base, then it won't really take advantage of DX12 features, and the gains will be minimal. It's like doing a multiplatform title: the game is coded for console capabilities. Likewise here, you won't have huge draw call counts or a heavy async workload. I still believe that for a game to be a truly new-API title, it needs to be exclusive to that API, or maybe ported to DX11 later. But since they're doing both... meh, gonna be crap.
On consoles, you can draw maybe 10,000 or 20,000 chunks of geometry in a frame, and you can do that at 30-60fps. On a PC, you typically can't draw more than 2,000-3,000 without getting into performance trouble, and that's quite surprising - the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.
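The per-draw-call overhead described above can be sketched with a toy cost model (a minimal illustration only - the overhead numbers are made-up assumptions, not measurements of any real API, and the function name is hypothetical):

```python
# Toy model of CPU-side draw-call overhead (illustrative only; the
# cost constants are assumed numbers, not real API measurements).
PER_DRAW_OVERHEAD_US = 50   # assumed fixed CPU cost per draw call
PER_OBJECT_COST_US = 2      # assumed cost to process one object's data

def frame_cpu_time_us(num_objects: int, objects_per_batch: int) -> int:
    """CPU time for one frame if objects are grouped into batches."""
    num_draws = -(-num_objects // objects_per_batch)  # ceiling division
    return num_draws * PER_DRAW_OVERHEAD_US + num_objects * PER_OBJECT_COST_US

# One draw call per object vs. 100 objects per batched/instanced draw:
unbatched = frame_cpu_time_us(10_000, 1)    # 10,000 draws -> 520,000 us
batched = frame_cpu_time_us(10_000, 100)    # 100 draws    ->  25,000 us
print(unbatched, batched)
```

Under these assumed costs, batching 100 objects per draw cuts frame CPU time by roughly 20x, which is why consoles (with their thin APIs and low per-draw cost) could historically push far more draws than DX11 PCs.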
This is very good news, much needed for AMD. Let's hope the rumours are true and Tomb Raider gets DX12 as well, then I'll be very happy.
You do realise console games have had more draw calls in the past than PC games. Devs have needed tricks and magic to convert those scenes for PC.
Random article found by quick googling about this:
http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2
Have to ask Nvidia when they'll release it for TR.
It would certainly be a good thing for Hitman to come with DX12 and use ACE. I could see this giving AMD a good leap and if it is indeed true that Nvidia can't do Async in parallel, at least we get a heads up all round.
I'd be surprised if it takes full advantage of DX12 and gives GCN much of an advantage over Nvidia; the company behind the game and the devs will probably be under pressure not to allow that.
AMD are claiming that GCN is the only GPU architecture capable of running Async. I have an article on my site showing what is what, but I'm not going to link it. Hitman allegedly uses Async!
Then when the performance reviews come out, AMD's cards will be around 5-10fps+ slower than Nvidia's cards, as per usual!
AMD Radeon™ GPUs for blazingly fast DirectX® 12 performance featuring AMD's advanced Graphics Core Next (GCN) architecture, which is currently the only architecture in the world that supports DX12's asynchronous shading
AMD is once again partnering with IO Interactive to bring an incredible Hitman gaming experience to the PC. As the newest member to the AMD Gaming Evolved program, Hitman will feature top-flight effects and performance optimizations for PC gamers.
Hitman will leverage unique DX12 hardware found in only AMD Radeon GPUs—called asynchronous compute engines—to handle heavier workloads and better image quality without compromising performance. PC gamers may have heard of asynchronous compute already, and Hitman demonstrates the best implementation of this exciting technology yet. By unlocking performance in GPUs and processors that couldn’t be touched in DirectX 11, gamers can get new performance out of the hardware they already own.
AMD is also looking to provide an exceptional experience to PC gamers with high-end PCs, collaborating with IO Interactive to implement AMD Eyefinity and ultrawide support, plus super-sample anti-aliasing for the best possible AA quality.
This partnership is a journey three years in the making, which started with Hitman: Absolution in 2012, a top seller in Europe and widely critically acclaimed. PC technical reviewers lauded all the knobs and dials that pushed GPUs of the time to their limit. That was no accident. With on-staff game developers, source code and effects, the AMD Gaming Evolved program helps developers to bring the best out of a GPU. And now in 2016, Hitman gets the same PC-focused treatment with AMD and IO Interactive to ensure that the series’ newest title represents another great showcase for PC gaming!
Hitman is going to be a DX12 game title and AMD is hoping that AMD Radeon GPUs will shine since they have full hardware support for asynchronous compute. NVIDIA is having to rely on their software drivers to handle asynchronous compute. The good news is that both companies are able to support Async Compute, and it will be interesting to see how the two approaches compare when it comes to performance. This AAA game title should finally show how Async Compute Engines will be handled by current GPUs. We are sure that AMD’s upcoming Polaris and NVIDIA’s upcoming Pascal GPU architectures will both offer full hardware support for Async Compute, but both aren’t due out until later this year.
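As a loose CPU-side analogy for the async compute idea discussed above (assumptions: two independent workloads stand in for a graphics queue and a compute queue, and overlap is modelled with worker threads - real GPU queue scheduling works very differently), the win comes from running the two in flight at once instead of back to back:

```python
# Loose CPU-side analogy for async compute (illustrative only):
# two independent workloads run back to back vs. concurrently.
import time
from concurrent.futures import ThreadPoolExecutor

def graphics_pass():
    time.sleep(0.2)   # stand-in for rendering work
    return "frame"

def compute_pass():
    time.sleep(0.2)   # stand-in for a compute job (e.g. post-processing)
    return "result"

# Serial: the compute job waits for the graphics work to finish.
start = time.perf_counter()
graphics_pass()
compute_pass()
serial = time.perf_counter() - start

# "Async": both queues are in flight at the same time.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    f1 = pool.submit(graphics_pass)
    f2 = pool.submit(compute_pass)
    f1.result()
    f2.result()
overlapped = time.perf_counter() - start

print(f"serial {serial:.2f}s, overlapped {overlapped:.2f}s")
```

The overlapped version takes roughly half the wall-clock time here because the two sleeps (standing in for independent workloads) run concurrently; the hardware-vs-driver question in the quote is essentially about how well each vendor can achieve this kind of overlap on the GPU itself.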
Read more at http://www.legitreviews.com/hitman-beta-pc-system-requirements-announced_178991#c3c3sw62PMEDj1vU.99
Played Hitman a few weeks ago on a widescreen 4K FreeSync monitor with a single Fury X. Smooth 60FPS with no performance issues. Should note that it was an alpha build (DX11), so when I set the settings to max I couldn't actually tell if it worked or not. Will have to see what performance is like during the beta.
It will be interesting to see how the episodic stuff works out. I like that there will be random "hits" in the game for a limited time. So you get a notification on your phone that a target needs to be killed and then you have 24 hours to kill him. If you don't kill him then that's it; you won't get the opportunity again.