Game Industry Misbehaving Series: 1080p 60FPS Native for Consoles

A generational step needs a technology jump to go with it, so what jump do we have this generation?

1080p 60FPS on Consoles Should Be Standard (NOT Because of Better Graphics)


All the games on the current generation of consoles (PS4, Xbox One, Wii U) should be running at 1080p native and at 60FPS. The N64 had more games running at 60FPS than the entire last generation of consoles (Xbox 360, PS3 and Wii) did. For those consoles I can only name one 60FPS game off the top of my head: Vanquish, which was a great game! It did not run at 1080p; that was the trade-off to get it running at a fairly solid 60FPS with anti-aliasing (2xMSAA), on both the Xbox 360 and PS3 versions of the game.

 Vanquish is pretty, and plays great! 

I’m Sure You’re Wondering: If It’s Not About Graphics, Why Are You Complaining About Graphics?

I love games, and will happily play them at 720p 30FPS; I don’t mind that much. It’s the game, not the resolution or frame rate, which interests me (as long as it runs at a minimum of 30FPS, otherwise it hurts my eyes). However, this is a new generation, and a much older system could run at 60FPS, so the new generation needs to beat that, right? Dropping the frame rate when stepping from the N64/PS2 era to the Xbox 360/PS3 was acceptable, because we got HD, which was the jump for that generation. But what jump are we getting from last generation to this one? Sure, the first few months of games for a generation will run a bit behind; that’s fine. Developers need to get used to the hardware, and learn to code for optimisation. But a year down the line, something is fundamentally wrong with the hardware if games cannot run at 1080p 60FPS native. It’s only now that we are getting games which can run like that.

Mario 64 ran at 60FPS 

This Is Not About Something Versus Something, This Is About a Generational Technology Jump

Just to put something out there, and to emphasise a point: I don’t care what I game on, for the most part, except handhelds (they don’t interest me, but that’s preference). I don’t care what resolution a game runs at, or how it looks. Pretty graphics are great to have, but not necessary for the core game to be good. I care about you. I care that you are not being ripped off with lies and deceit. I care that consumers and customers (that’s you) get what they pay for. And honestly, with the current generation of consoles, you simply are not getting that. You are being lied to, looked down upon, and deceived over and over again. That is why I am angry at the consoles not running at 1080p 60FPS as standard: not because of graphics, but because that is the jump for this generational step. A step across generations needs to bring some sort of jump, no matter what it is, be it in animations or AI (it doesn’t need to be graphics). This generation’s jump is believed to be, and marketed as, running at 1080p 60FPS native. Is that the case? No. If you look at the standout exclusives for the consoles (the ones whose performance we know), on the PS4 side we have The Order: 1886.

I’m sure it will be a fun game, but all the bull surrounding it puts me off.

Showcase Games

The Order: 1886 is a showcase game, meant to show off what the PS4 can do in terms of gameplay, graphics, frame rate and so on. So what exactly is it showing? That the PS4 can render with 4xMSAA at a sub-1080p resolution (1920×800)? Chief Technology Officer Andrea Pessino gave the reason: “…we do run 4xMSAA which looks spectacular! x800 with AA looks MUCH better than x1080 without : )” and later, “To be clear, x800 with 4xMSAA needs more bandwidth than x1080 would, so 1080 no MS would be cheaper.” By cheaper I assume he means less resource intensive, so less taxing on the system. This means they have openly admitted that the PS4 cannot run this game at the full height, with the extra 280 rows of pixels, while keeping 4xMSAA. Why not just use FXAA? That is much less taxing, and still looks great. The Order is also only running at 30FPS; Ready At Dawn have said the reason is something to this effect: “Running the game at 60FPS makes it look too sci-fi/fantasy, so running it at 30FPS hits the movie look we want.” I have a massive issue with these statements. The first is that it just sounds like an excuse for weak hardware; the second is that they are openly admitting that the very expensive hardware is not very graphically powerful. Looking at the showcase for the Xbox One, Ryse: Son of Rome, exactly the same issue can be found. The game runs at 1920×900 and at 30FPS. So what does that showcase? Exactly the same as on the PS4: a lack of graphical horsepower to run the games. So neither console is better than the other.
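To put rough numbers on Pessino’s bandwidth comparison, here is a quick back-of-the-envelope sketch. It assumes the standard behaviour of 4xMSAA (four stored samples per pixel); these are illustrative figures, not Ready At Dawn’s actual measurements.

```python
# Back-of-the-envelope pixel/sample arithmetic for the resolutions quoted above.

def pixels(width, height):
    return width * height

p_800 = pixels(1920, 800)      # The Order: 1886's render resolution
p_1080 = pixels(1920, 1080)    # full native 1080p

# 4xMSAA stores 4 colour/depth samples per pixel, so the framebuffer
# holds roughly 4x the samples of the no-AA case.
samples_800_msaa = p_800 * 4   # 1920x800 with 4xMSAA
samples_1080_noaa = p_1080     # 1920x1080 with no AA

print(p_800)                   # 1536000 pixels, ~26% fewer than 1080p
print(p_1080)                  # 2073600 pixels
print(samples_800_msaa)        # 6144000 samples
print(samples_800_msaa / samples_1080_noaa)  # ~2.96x the stored samples
```

So even with 280 fewer rows, the 4xMSAA framebuffer holds nearly three times the samples of a plain 1080p one, which is consistent with Pessino’s claim that 1080p without MSAA would be the cheaper option.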

All the effort went into making it look nice, and they failed at making it play well.

I have nothing against a showcase as such; it simply shows the item in question in its best light. But when that best light is only a small amount (resolution-wise) better than the last generation, for me it begs the questions: Were the consoles poorly designed? Were they released too early? Are they too hard to develop for? Or do they simply have weak hardware?

 In The End It Is Just About The Games

I do just love games, and want to play them. So when there are games I deem (for me) worthy of the purchase of a console, then I will get that console, even if I am playing at 720p 30FPS. If the game is amazing, or at the very least amazingly fun, it doesn’t matter how it runs. I have enjoyed it and that, for me, is exactly what I want.

I have enjoyed plenty of games running at lower frame rates or resolutions. The Last of Us ran at 720p 30FPS; did that take away from the gameplay, or the story? No, not at all. It is one of the best games ever made. Halo 3, Reach and 4 ran at 720p/1080p and 30FPS; did that detract from the immense fun, and the hours I have put into those games? Not in the slightest.

 

Such an amazing game; it does not need 1080p, but 60FPS would not hurt (it would look smoother)

Generation Steps Need a Technology Jump To Go With It

That is the root of the issue I have. The jump from the 4th generation to the 5th brought discs (PS1).

PS1 discs were black… for some reason.

The jump from the 5th to the 6th brought a lot more power (Dreamcast), games running from DVD-class discs (PS2, Dreamcast, the original Xbox, GameCube) and a built-in hard drive (the original Xbox).

Probably the best generation, in terms of single-player games of course.

Then came the biggest jump in gaming, from the 6th generation to the 7th (Xbox 360, PS3), which brought high definition gaming to our homes, and not only that: stable, fully fledged online systems (Xbox Live, PSN), which just grew as the generation went along.

The most impressive (so far) console generation graphically.

So what has the jump from the 7th generation to the 8th given us? Very expensive hardware which can’t run games much better than the previous generation could? We are starting to see other little jumps happening, with cloud services and game streaming. But is this enough for a generational step? I don’t think it is; we need a small graphical and frame rate jump as well.

The general gaming community is also looking at it from a graphical perspective. “OH LOOK, IT’S NOT MUCH MORE SHINY,” I hear them cry. Whereas I am looking at it from a technological perspective.

Have I succeeded? Let me know in the comments below. And of course, thank you for reading.


Author
Pierre Fouquet
-- Games are a passion as well as a hobby. Other writing of mine can be found at www.scrncheat.com