
Does Xbox One and PS4 game resolution really matter?

by Nick Cowen

Forza 5

OPINION: Nick Cowen asks: when did 1080p and 60fps become such an issue? And why the hell are players so angry about it?

“1080p, 60fps.”

It’s the new mantra in gaming. The gold standard to which all new IPs and sequels in world-conquering franchises are held. A rallying call for quality in gaming. Anything less is viewed as a slight. A betrayal. A sign that developers, publishers and platform holders aren’t taking the investment of their audiences seriously.

“1080p, 60fps.”

If a developer’s game doesn’t hit these two magical requirements, it can cause an outcry. Take a look at the reaction to the announcement that Call Of Duty: Ghosts didn’t offer the same visual fidelity across platforms. It became such an issue that Infinity Ward moved to address it in the month of the game’s release.

More recently, the fact that Assassin’s Creed: Unity is to be locked at 900p resolution and 30fps across all platforms raised hackles across the board. Twitter was deluged with complaints from players using the hashtag #PS4NoParity to vent that the game’s visuals were being scaled back to accommodate Xbox One owners. Visual fidelity has become a massive issue in the gaming community.
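For anyone who wants the raw numbers behind those labels, the gap is simple arithmetic. The widths below assume the usual 16:9 frame sizes (the "p" label only fixes the height, so that is an assumption here), which puts 900p at roughly 69 per cent of the pixels of full 1080p:

```python
# Pixel counts behind the common resolution labels.
# Widths assume the usual 16:9 aspect ratio; "p" only fixes the height.
resolutions = {
    "1080p": (1920, 1080),
    "900p": (1600, 900),
    "720p": (1280, 720),
}

full_hd = 1920 * 1080  # 2,073,600 pixels

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.0%} of 1080p)")
# 1080p: 2,073,600 pixels (100% of 1080p)
# 900p: 1,440,000 pixels (69% of 1080p)
# 720p: 921,600 pixels (44% of 1080p)
```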

Here’s a question, though: does it really matter?

Call Of Duty Ghosts

Call Of Duty: Ghosts came in for some flak over visual parity at launch

Recently, developers have stepped into the fray to address these concerns, opining that the furor over “1080p, 60fps” misses the point. Neil Thompson, Director of Art and Animation on the forthcoming Dragon Age: Inquisition, turned himself into a potential tackling dummy when he put forward the idea that resolution and frame rate aren’t as important as the community asserts.

"If the experience is satisfying and everyone is happy with that, why be concerned about certain technical parameters that may be invisible to all but the most technically verbose player?" said Thompson to TrustedReviews. "If I go and watch a movie, I’m not questioning the technology behind it, whether it’s 2.35:1 in aspect ratio or whatever. If it’s an enjoyable experience, it’s an enjoyable experience."

"I would be very surprised if anyone, on any game or any movie or anything else, can spot the difference between 720p and 1080p. Even side by side I’d be very surprised," he added.

Assassins Creed Unity

Assassin's Creed: Unity's 900p 30fps news became the topic of a Twitter storm

Thompson’s not alone in this view. Far Cry 4’s Creative Director, Alex Hutchinson, recently said that 30fps was absolutely fine for the type of game he and his team were working on.

"60 is really good for a shooter, action adventure not so much. It actually feels better for people when it's at that 30fps,” he said. “It also lets us push the limits of everything to the maximum. It's like when people start asking about resolution. Is it the number or the quality of the pixels that you want? If the game looks gorgeous, who cares about the number?"
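Hutchinson's "push the limits" point has a concrete basis: the frame rate target sets the time budget an engine has to simulate and render each frame, and moving from 30fps to 60fps halves it. A minimal illustration:

```python
# At a target frame rate, each frame must be simulated and rendered
# within a fixed budget of 1000 / fps milliseconds.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
```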

It’s an interesting question: if a game plays well and looks lovely, who cares what the frame rate and resolution are? These are questions, incidentally, that are never aimed at Nintendo. While Bayonetta 2 and Mario Kart can boast visual fidelity that is in line with the requirements of the “1080p, 60fps” crowd, this isn’t true of all of the Wii U’s fare so far – and no one has given a damn.

Dragon Age Inquisition

Why should technical parameters that may be invisible to all but the most technically verbose player be so important?

In fact, Nintendo has been allowed to skate on issues that would have seen both Microsoft and Sony crucified. Can you imagine the uproar if Sony had cracked down on YouTube clips in the same way Nintendo has? Or if Microsoft had decided that players in Europe could only download adult fare between 11pm and 3am? But hey, it’s quirky Nintendo, so it can do what it wants.

“1080p, 60fps.”

So what’s the problem here?

Well, a new-generation console costs a lot of money and no one wants to feel like they’ve been sold a pup. If your platform of choice doesn’t offer the same visual capability as its closest rival, you’re bound to be irked after the financial outlay.

Splinter Cell

Visual parity wasn't an issue on PS2 or the original Xbox, but then, those systems weren't technologically neck and neck

When you’ve dropped close to £400 on a new machine and it emerges that its rival platform offers a better visual experience as a matter of course, naturally you’re going to be rather upset.

An unfortunate development in all this is the hideous amount of abuse both creators and publishers receive from disgruntled gamers. Rude – and sometimes borderline psychotic – tweets and messages are a daily hazard following announcements of uneven visual fidelity across platforms.

It’s true that visual parity should exist across platforms – God knows the new consoles have the processing power, which, funnily enough, wasn’t an issue when the sales figures of the PS2 left the first Xbox in the dust. But now that the internal workings of Sony’s and Microsoft’s platforms are more neck-and-neck, players’ expectations have grown accordingly.

Far Cry 4

A view to the future: hopefully, once visuals stand up to comparison across platforms, players can focus on what's important in gaming - fun

“1080p, 60fps.”

As much as visual fidelity is an issue right now, chances are it won’t be for long. While third-party publishers are releasing games that don’t share the same visual fidelity across platforms, Microsoft-owned developers are perfectly capable of releasing games that hit the 1080p, 60fps standard. It could be because they’re more experienced with the internal tech. Or it could be because their games require it – just check out the latest Forza title, Forza Horizon 2.

Sony has won a lot of press from the fact that the PS4 was built from the ground up to make life easier for developers. Microsoft has been rather quiet on this subject, instead positioning the Xbox One as the ultimate entertainment console. It could be that Sony has the edge due to its engagement with publishers and developers. As time marches on, however, the two consoles’ visual output is likely to draw closer. Then, hopefully, it’ll come down to which platform offers the most fun.


This Guy

October 20, 2014, 3:51 pm

"It’s true that visual parity should exist across platforms – God knows the new consoles have the processing power, which funnily enough, wasn’t an issue when the sales figures of the PS2 left the first Xbox in the dust."

Yeah, people who were into gaming in those days forget how technically inferior the PS2 was to the original Xbox, yet it still sold like crazy. But at the end of the day it's about games, and the PS2 had great exclusives.

Matthew Bunton

October 20, 2014, 5:08 pm

Agreed, it is ultimately about the library of games, which is precisely why the PS2 sold so well back in its day.

However if people can't see the difference between 720p and 1080p when gaming then they really need to get their eyesight checked.

Besides 1080p is so old now we are rapidly moving towards 4K. Even the 360 and PS3 could play some of their games at 1080p seven years ago.

This whole issue has stemmed from both manufacturers offering underpowered consoles this generation. Yes, the PS4 is slightly more powerful than the One, but still not good enough.

Yojimbo

October 20, 2014, 5:30 pm

Dear Neil Thompson, Director of Art and Animation: if you cannot tell the difference between a movie playing at 720p as opposed to 1080p, you should visit Vision Express.

kftgr

October 20, 2014, 5:57 pm

Isn't Forza Horizon 2 1080p 30fps?

kftgr

October 20, 2014, 6:03 pm

Yep, although 1080p vs 900p is pretty hard to spot, especially when you're playing the game and not playing spot the difference.

Skeet

October 20, 2014, 6:08 pm

I think studios should just refrain from divulging their resolutions and frame rates. Concentrate on all the features and gameplay of the games. #ifyouwantthebestgraphicsthengobuyapc

bill

October 20, 2014, 6:22 pm

I don't know about console gaming, but in PC gaming all I know is that if certain games drop below 50fps, like Far Cry 3, it's a horrible, stuttery experience.

Matthew Bunton

October 20, 2014, 6:32 pm

Agreed and I have said so many times in my other posts.

Matthew Bunton

October 20, 2014, 6:33 pm

Yes you are correct.

Only Forza 5 is 1080p 60fps.

Jonathan George Anaya

October 20, 2014, 8:09 pm

I think that 900p UPSCALED to 1080p is not noticeable. I dare say: take screenshots of Thief on the X1/PS4 and ask random pedestrians which looks better – what would the results be?

Also, MS knew that PCs crush consoles. That's why MS had planned to make the X1 a Steam Machine, of sorts. DRM powered by Cloud Tech for AI Processing?

I signed up for that Future DAY ONE because: A. I have a stable, 100mbs connection and B. I want INNOVATION in my games. As an Indie Dev, I'm SICK and TIRED of the Google Play Store games that are flooding the market with garbage.

the PS4 hardware is powerful, yes, but, it's getting CRUSHED by the PC since launch. MS made their X1 upgradable and I'd like to see what Crackdown and Halo 5 does with Azure. Forza and Titanfall have already proven that parallel processing works quite fine for online asset streaming

Paul

October 21, 2014, 1:19 am

oh yeah it matters

Nettrick Nowan

October 21, 2014, 2:43 am

My takeaway from all of this is that a developer is saying this. My opinion--pathetic. Heck, even that Microsoft exec admitted you could see a difference if you had a large TV--but here you have a developer doubting that a difference can be seen. From what he says, we should be satisfied watching 4:3 formatted games and movies in SD. As I said, my opinion, but truly pathetic.

Brandon Maya

October 21, 2014, 2:48 am

If you read the article, it absolutely does matter. The irony here, intentional or otherwise, is that the target audience of this article is the exact group it openly attempts to undermine.

Guest

October 21, 2014, 2:51 am

Being a game developer contributes to the bias. The end-user experience is wholly different from that of the developer, who can only empirically scrutinize a game based on fundamental gameplay concepts. This article entirely misses the point made by people who argue that this "visual fidelity" is not a pointless endeavor.

Eno

October 21, 2014, 7:09 am

Yes resolution and FPS matters! I bet they know it very well, but they invented this "cinematic" nonsense about 30 fps because the consoles are not powerful enough. What is the most irritating thing is they try to limit the PC versions as well, in my opinion this is unforgivable!

Nate

October 21, 2014, 1:10 pm

I totally agree. The only time I've thought someone could actually hold that opinion is during the actual cinematics. Some cinematics do look better at 30 fps. That's why movies are shot in 27. But the actual gameplay? No. Never.

Nate

October 21, 2014, 1:21 pm

Yes. It matters. End of story. I'm not saying it's the "holy grail". There are other things that are far more important, like framerate and actual gameplay. But with all that being said, it's about the size of your TV/viewing distance. Anything larger than 42" should be 1080p. You can CLEARLY see the difference on a larger TV, and the experience is more immersive because of it. Making the GAMEPLAY FUNNER <<< Hey, would you look at that..

Eno

October 21, 2014, 1:39 pm

Actually, films are usually shot at 23.976 fps with a 1/48 second shutter speed. This is only because in film days it was too hard to shoot at 48 fps – not enough light, etc. Now we are accustomed to that look and call it cinematic. But this has nothing to do with actual gaming. Our brain can distinguish about 72 fps, and optimum game immersion is above that speed, so 60 fps in my opinion is a minimum frame rate (optimum is above 72 fps). 30 fps for gaming is crap!

Nate

October 21, 2014, 2:35 pm

My bad for getting the numbers wrong. But yeah, I know about the whole film issue. Like the fact that films were originally shot in 24fps in the 20's, when they were getting away from silent films. A couple of directors feel strongly that films should be shot at 48fps, but when they do, people spazz. Some film professors preach that it's because we've become accustomed to it. Some preach that it's actually a "picture" effect our brain enjoys that we accidentally found due to our lack of technology.

I actually agree with the second opinion. When film is shot at 48fps you can clearly tell everyone is on a set. It looks TOO real. I don't want Gandalf looking like just some dude in a hat.

I'm pretty sure you know all of this; I just didn't want you thinking I was completely ignorant on the issue. That's just where I stand. My opinion only carries over to gaming because I've seen multiple games remastered from 30 to 60fps. And I personally thought the CINEMATICS looked shittier at 60. It looked like "gameplay I had no control over". It just looked.. strange. But yeah, every time a dev says they were going for the "movie" effect so they stuck with 30fps, they're full of shit.

Eno

October 21, 2014, 4:31 pm

I agree with you, Nate, I just stated the 24 number instead of 27 fps. The story about film was just to highlight why 24 fps was chosen instead of something else. Actually, they chose 24 fps because it was easy to sync with the audio (48 kHz).

30 fps for games sucks! I bet most new game releases will be limited to 30 fps even on computers, and that will make a lot of people angry, including myself. We invested a lot of money in high-end video cards to play games at 30 fps?

jacksjus

October 22, 2014, 1:00 pm

It appears that the developers' and the consumers' opinions differ vastly. Why not just make the console versions adjustable, similar to PC, where you have the option to flip between 60 and 30? 60 is always better.

And 1080p should be standard, period.

Dennis Crosby

October 22, 2014, 1:20 pm

No, it doesn't matter; it never has mattered in console gaming. A game being 1080p 60fps doesn't make it good and can't make a bad game better. If you think 1080p 60fps should be a standard, you need to build a gaming rig. Console gamers just need to worry about having great textures and a smooth frame rate.

Tga215

October 22, 2014, 1:29 pm

Finally an article that makes sense

TheFanboySlayer

October 22, 2014, 2:29 pm

Honestly, resolution doesn't matter much to me. I'm just waiting for the day when the best game gets created and it runs at 720p... I would buy it, but would you?

Funny

October 22, 2014, 2:44 pm

It DOES matter, not because a higher frame rate or resolution would make a game better by itself, but because many gamers chose the PS4 over the Xbox for its superior hardware. The difference isn't minimal either: much faster RAM coupled with a superior GPU means the PS4 is obviously more powerful than the Xbox. So when devs release identical copies of a game on both systems, it's obvious that the best the Xbox can do is not the best the PS4 can. I think gamers deserve the best version of a game and shouldn't be held back by the weaker console.
