**XBOX ONE** Official Thread

It's because the Xbox One is meant to be 'next generation', where 1080P should be standard; the 360 came out at a time when 720P TVs were just hitting the market, and Nintendo haven't been considered 'next generation' since the 90s.

Most TVs today are 1080P native, so rendering at 720P and upscaling is a big disappointment, especially for a console that costs significantly more than its 1080P counterpart.
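As a quick sanity check on the pixel counts behind this complaint (my own arithmetic, not from the thread):

```python
# Pixel counts behind the 720p-vs-1080p argument.
pixels_720p = 1280 * 720    # 921,600 pixels
pixels_1080p = 1920 * 1080  # 2,073,600 pixels

ratio = pixels_1080p / pixels_720p
print(f"1080p pushes {ratio:.2f}x the pixels of 720p per frame")  # → 2.25x
```

So a native 1080p frame is 2.25 times the rendering work of a 720p one, which is why the resolution choice matters for a fixed hardware budget.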

What do we have to support this though?

A bunch of 3rd party games, from developers who mostly admit they haven't got to grips with the dev kits yet. You're talking as though the XO is completely restricted to 720p games forever, which is false, and stating it as fact makes those people look stupid.
1080p games like Forza are available day one, and more 1080p games will start flowing in as devs get more and more familiar with BOTH the XO and PS4 kits.
And honestly, the more I see of Kinect, the more "next generational" it looks than anything I have seen from the PS4, which is just a GPU and CPU upgrade, that's it, and it's still MILES behind current PC specs.
 
Forza is only 1080P 60FPS because it isn't using any advanced rendering techniques: no deferred rendering/lighting. It's not a technically advanced game (not that I'm saying it looks bad, it's just not doing anything special).

Any game that is using deferred rendering on the XB1 is 720P 60FPS or 900P 30FPS.

Games not using deferred rendering, which is Forza and a few of the sports games, are 1080P 60FPS.
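For what it's worth, a rough G-buffer size estimate shows why deferred renderers on the XB1 tend to land at 720p/900p. The render-target layout here (four 32-bit targets plus a 32-bit depth buffer) is my assumption of a typical setup, not something stated in the thread; the 32 MB ESRAM figure is the XB1's published spec.

```python
# Rough G-buffer size estimate for a deferred renderer.
# Assumed layout (not from the thread): four 32-bit render targets
# plus a 32-bit depth buffer. The XB1 has 32 MB of fast ESRAM,
# which is where you'd want the render targets to live.

ESRAM_BYTES = 32 * 2**20  # 32 MiB of ESRAM

def gbuffer_bytes(width, height, targets=4, bytes_per_pixel=4):
    """Bytes needed for the G-buffer plus a depth buffer at one resolution."""
    return width * height * (targets + 1) * bytes_per_pixel  # +1 for depth

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    size = gbuffer_bytes(w, h)
    fits = "fits in" if size <= ESRAM_BYTES else "exceeds"
    print(f"{name}: {size / 2**20:.1f} MiB G-buffer, {fits} 32 MiB ESRAM")
```

Under these assumptions a 720p G-buffer (~17.6 MiB) fits in ESRAM while a 1080p one (~39.6 MiB) does not, which lines up with the 720p/900p pattern described above.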
 
Yes, I know all of that, and deferred lighting isn't really required for a game with such set scenes. My point is that people talk as though the XO will NEVER get 1080p games in general. It will, just give devs a bit of time to get used to things.
 
Obtain 1080p they might, but at what cost? That's the real question.

Not a good start really; a so-called next-gen box should **** 1080p and then some. This thing is meant to last 5+ years.
 
Well then, in that regard the PS4 is a complete and utter failure: the machine EVERYONE is touting as the "gamers'" choice, yet for all the talk about its GPU abilities, it also has games not running natively at 1080p.

At what cost? None. Devs will get better at making games for both, simple as that, and MS will find ways to optimise the multiple OSes, which look slicker in each demo, freeing up more resources over time.

Remember when devs were all complaining about how hard it was to get the Cell chip working when the PS3 was first released?
 
When are we going to see an actual teardown of the XB1?

I expect Microsoft to back up their claims about the XB1 being the most powerful console ever and demonstrate exactly how 68 GB/s DDR3 is not bottlenecked by the 102 GB/s on-chip ESRAM. All I ever seem to be able to read on the subject of the competing architectures is that MS say you can simultaneously read and write on ESRAM, whilst the internet calls shenanigans.
 
You guys make me lol. I really don't understand your (or other people's) obsession with "it's a next-gen console, it should be able to do 1080p". The current gen can do 1080p, so of course this new gen can easily do it. Maybe some devs are choosing to run higher graphical settings and only render at 720p, but that is their choice, not a limit of the console. Heck, when I played BF3 on my PC I couldn't run it at 60fps 1080p on max settings; lowering a few settings got it to a steady 60fps.
 
The Internet is wrong then, as you can read and write to the ESRAM at the same time. What benefit that will actually bring is the real question.

Edit: Here is an interview with some Xbox One designers if you're interested - http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview
 
I haven't looked into the ESRAM specifically, but the few articles I have read say you can read and write to it at the same time; obviously, in that situation the transfer rate in each direction is half of what you get in an all-read or all-write scenario.
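That halved-rate scenario works out like this, using the figures quoted earlier in the thread (102 GB/s ESRAM, 68 GB/s DDR3); the even read/write split is my simplifying assumption:

```python
# Bandwidth arithmetic for the ESRAM discussion, using the thread's
# quoted figures. The even read/write split is a simplifying assumption:
# simultaneous reads and writes share the same bus, so each direction
# gets a fraction of the total peak.
ESRAM_PEAK = 102.0  # GB/s, total ESRAM bandwidth
DDR3_PEAK = 68.0    # GB/s, main memory bandwidth

read_bw = ESRAM_PEAK / 2   # 51 GB/s reading
write_bw = ESRAM_PEAK / 2  # 51 GB/s writing

# The aggregate figure sometimes cited adds ESRAM and DDR3 together,
# since they can be used in parallel:
aggregate = ESRAM_PEAK + DDR3_PEAK  # 170 GB/s

print(f"ESRAM split: {read_bw:.0f} GB/s read + {write_bw:.0f} GB/s write")
print(f"ESRAM + DDR3 aggregate: {aggregate:.0f} GB/s")
```

So even in the simultaneous read/write case the total across the ESRAM bus never exceeds the 102 GB/s peak; the headline numbers only grow when ESRAM and DDR3 traffic are counted together.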
 