# The Framerate Argument: 30fps vs 60fps



## Fernin (Jul 5, 2015)

The 30 vs 60 fps argument has been a running issue in gaming for a long time, with plenty of folks on both sides. This article I found addresses the issue quite well, covering everything from the "The human eye can't see more than 30fps" argument to the "But films are in 24fps" case and more. It's a good read, and I feel it presents a strong case for why 60fps and above is simply better.


Also, could you imagine the horrific "screen tearing"-like effect when turning your head, or doing anything with quick movement or at high speed, if you really "could only see in 30 fps"?


----------



## DrDingo (Jul 5, 2015)

I think that anyone who doubts that 60fps looks better than 30 should watch a video on YouTube that offers 60 frames per second in the quality options. Normal YouTube video quality plays at 30 frames per second.
E.g. https://www.youtube.com/watch?v=xzdsuCIzvOE


----------



## flletcher (Jul 5, 2015)

The human eye doesn't see in FPS, and I can easily notice the difference between 30fps and 60fps. It's a huge difference. I always watch videos in 60fps if I can.


----------



## Vitaly (Jul 5, 2015)

The more, the better. I see no reason to limit fps, unless you're a console peasant and 30 fps is all that your soapstation can do.


----------



## PlusThirtyOne (Jul 5, 2015)

i personally find 60FPS gives me headaches; at least on a TV. Sitting at my computer screen, i do just fine. Movies at higher frame rates always screw with me, but video games look okay. That being said, i'm not bothered by having a game locked at 30FPS. Dropping the frame rate frees up a teeny bit more power to push graphical fidelity. On the topic of resolution, with the right kinds of filters, subtle motion blur and anisotropic filtering, i can't tell the difference between two games at 720p and 1080p unless the interface is stretched out. i would gladly trade a slower frame rate and lower res for higher quality textures and scaling. Object/texture/detail pop-in is my #1 pet-peeve.


----------



## Hakar Kerarmor (Jul 5, 2015)

PlusThirtyOne said:


> Object/texture/detail pop-in is my #1 pet-peeve.



Exactly, screw pretty much anything else. It doesn't matter how pretty the trees in your game are if they pop into view only 20-30 meters away.


----------



## Kalmor (Jul 5, 2015)

I can't bear anything below about 40. I aim for stable 60. I get headaches/simulation sickness otherwise, especially since I play a lot of racing games where you also need the improved input latency to even be competitive at the high-skill sims like iRacing. I also despise motion blur and it's one of the first things I turn off in the settings. I will turn down settings like shadows and AA to get stable 60.

To those who still say 30 fps is better: don't just watch a 60 fps video, *play something* at 60 fps if possible. Use a friend's PC or play one of those rare console titles that somehow run at it.

See: http://30vs60.com/bf4running.php


----------



## Fernin (Jul 5, 2015)

@flletcher: The fact that the human eye doesn't see in frames is one of the points brought up in the article.

@PlusThirtyOne: The high "frame rates" on TVs also give me an immense headache, but the reason is entirely the motion processing TVs employ. One of the major selling points TV makers push is high refresh rates, along with the motion processing that supposedly "takes advantage" of those refresh rates. When you watch something on a TV, the video input is almost always coming in at around 24fps (give or take). So to make the image seem smoother, the TV takes that 24 FPS and generates fake in-between frames. Frankly, this technology is most often awful, and it produces that strange motion that causes headaches. If you take a high refresh TV, plug it into a computer or another source of genuinely high framerate video, and play that (with motion processing turned off on the TV), you'll see it looks vastly better and no longer gives headaches. Personally, I ALWAYS turn off motion processing.
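For anyone curious what "generating fake frames" actually means, here's a toy sketch of the idea (my own illustration, not what any real TV runs; actual sets use motion-vector estimation rather than a plain blend):

```python
# Naive frame interpolation: invent an in-between frame by blending its
# two neighbours. Real TVs use motion-vector estimation rather than a
# plain blend, but the principle (showing frames that were never in the
# source material) is the same.

def blend(frame_a, frame_b, t):
    """Linear blend of two frames (lists of pixel values), t in [0, 1]."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

def interpolate_stream(frames, factor):
    """Turn an N fps stream into an (N * factor) fps stream."""
    out = []
    for cur, nxt in zip(frames, frames[1:]):
        for k in range(factor):
            out.append(blend(cur, nxt, k / factor))
    out.append(frames[-1])
    return out

# A one-pixel "video": brightness ramps 0 -> 240 over three 24fps frames.
source = [[0], [120], [240]]
smooth = interpolate_stream(source, 5)
print(len(smooth))   # 11 frames where the source had 3
print(smooth[1])     # first invented frame, a blend of [0] and [120]
```

A plain blend like this smears anything that moves fast, which is part of why interpolated video can look "off" even when it's technically smoother.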

Now on the other matters... I feel a game should balance gameplay AND graphical fidelity; that means I want 60 FPS AND good visuals to match. This is why I mostly game on PC (though I do own every console except the Wii U). What I feel developers should do is properly balance those two priorities. It CAN be done; there are already games on consoles that look decent and run at 60 FPS. Unfortunately, chasing PC-level eye candy on consoles is something devs have become obsessed with, and the console games suffer for it; many barely even manage a stable 30fps.

And as for motion blur, this is one of the few opinions I share with TotalBiscuit: it's shit, it's terrible, and it needs to go the fuck away, much like FXAA.


----------



## Schwimmwagen (Jul 5, 2015)

I never minded 30fps, but 60fps is objectively superior.

Never been a dealbreaker though.


----------



## flletcher (Jul 5, 2015)

Yea i'm a total fps whore, i demand 60fps!


----------



## Alastair Snowpaw (Jul 5, 2015)

my favorite framerate is the one that is stable.


----------



## Kellie Gator (Jul 5, 2015)

Well, I usually care more about the framerate being smooth and constant.

BUT, it's fucking 2015 now, so you'd think games would've nailed higher frame rates at this point. You know what I believe? That console developers are so obsessed with making games as good-looking as possible, like, cartoonishly so, that it limits the framerate the game can run at. But no developer's gonna have the guts to admit technical limitations like that, so of course they'll make up some bullshit about how 30 is more cinematic or whatever.

https://www.youtube.com/watch?v=QiinO9JPUGw (skip to 2:19)

The new Doom's aiming for 1080p, 60FPS on all platforms and it looks gorgeous here, but I have some doubts. The FPS doesn't look constant but that may be the result of a game still in development. Also, some games have been subjected to graphical downgrades lately so I kinda fear that may be the case with Doom when it's released. But I hope it's not.

Just wanted to share an example of the potential of what 60FPS games can look like. I'm growing tired of 30.


----------



## LazerMaster5 (Jul 5, 2015)

30fps is shit. Have you ever played a game with the fps counter displayed on screen? When the fps jumps from 60 to 30, it is very noticeable. I don't care if Ubisoft thinks 30fps is more cinematic, when I play a game I want everything to feel fast and fluid.


----------



## Willow (Jul 6, 2015)

Honestly it depends on the game for me. Like if the game was made for 60fps, then I'd rather play it in 60fps because games like The Last of Us and Battlefield look gorgeous in it. But then games like Bioshock (the first one) are probably better left in 30fps. 
As far as the whole film thing goes, not only do films have natural motion blur, but I believe it also has a lot to do with the projectors themselves.


----------



## SparkyWolf (Jul 6, 2015)

To me it seems delusional to say that there's no difference between 30 and 60 FPS. I notice a massive difference between the clip at 30 FPS and the one at 60 FPS, and I think that something about actually controlling the game makes you more aware of frame rate than when you watch a clip. Usually, I shoot for about 80 FPS, 'cause I even notice a difference between 60 and 80.


----------



## Kellie Gator (Jul 6, 2015)

SparkyWolf said:


> To me it seems delusional to say that there's no difference between 30 and 60 FPS. I notice a massive difference between the clip at 30 FPS and the one at 60 FPS, and I think that something about actually controlling the game makes you more aware of frame rate than when you watch a clip. Usually, I shoot for about 80 FPS, 'cause I even notice a difference between 60 and 80.


I know of people who go for as high as 125, and in the case of some low-end games like Quake Live people go for 250 and it's kind of amazing, do modern monitors even support that kind of framerate? I know I normally get screen tearing when aiming that high and I hate screen tearing, bugs the hell outta me.


----------



## SparkyWolf (Jul 6, 2015)

Kellie Gator said:


> I know of people who go for as high as 125, and in the case of some low-end games like Quake Live people go for 250 and it's kind of amazing, do modern monitors even support that kind of framerate? I know I normally get screen tearing when aiming that high and I hate screen tearing, bugs the hell outta me.





Certain games I will run at 120 FPS. Honestly, I don't know if monitors support frame rates that high. But I'd guess only the really high-end ones do.


----------



## JerryFoxcoon (Jul 6, 2015)

Oddly 60FPS videos don't look natural to me. As if the movements were too fluid, particularly progressive 60FPS. TVs use interlaced 30FPS to simulate 60FPS and it looks better to me.


----------



## TrishaCat (Jul 6, 2015)

The higher the FPS the better. Always figured this was obvious.

I just hate that it's such a big issue of late. I don't care that much about framerate. I can see the difference, it's clear, but I don't see any reason to demand better from companies when games aren't 60FPS, like the article says. If a device can't handle 60, that's fine. As long as it's playable, I'm happy. The fact that companies justify it as an art choice is silly though.


----------



## Inpw (Jul 6, 2015)

It is true that the human eye can't notice anything above 24 fps. The problem is screen tearing, due to multiple frame rates drawn over each other. The lower the LED response time the better, but it causes problems when frames go missing. It's almost the same as converting a 30fps video file into 24fps. Thus, the higher the frame rate of the content playing back, the less stutter or tearing will happen.

The reason why it doesn't matter with film media playback (not the conversion example) is because film stills are smears of movement, motion blur lasting a frame in length. GPUs render sharp stills, which makes the stuttering more noticeable.
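To make the 30fps-into-24fps conversion example concrete, here's a toy sketch (my own illustration, assuming the simplest possible nearest-earlier-frame resampling):

```python
# Toy 30fps -> 24fps conversion by nearest-earlier-frame sampling.
# 30/24 = 1.25, so the source index advances by 1.25 per output frame;
# with floor sampling, some source frames are simply never shown.

SRC_FPS, DST_FPS = 30, 24

def resample(num_dst_frames):
    """Source frame index shown for each output frame."""
    return [i * SRC_FPS // DST_FPS for i in range(num_dst_frames)]

shown = resample(9)
steps = [b - a for a, b in zip(shown, shown[1:])]
print(shown)   # [0, 1, 2, 3, 5, 6, 7, 8, 10]: frames 4 and 9 never appear
print(steps)   # [1, 1, 1, 2, 1, 1, 1, 2]: motion advances unevenly
```

That uneven 1,1,1,2 stepping is exactly the stutter you notice when one frame rate doesn't divide evenly into another.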

See the below example where the difference is compared directly and clearly visible:

http://30vs60.com/

I'm totally on the 60fps side, not only for the reason above but for the mere fact that I don't like my hardware running at its limit given an amount of polygons, shaders, etc., meaning that anything added to the scene will slow it down to fewer fps. Developers and 3D artists should get their act together and not be lazy fuckers who forgot how to be economical with 3D design.


----------



## Kalmor (Jul 6, 2015)

Kellie Gator said:


> I know of people who go for as high as 125, and in the case of some low-end games like Quake Live people go for 250 and it's kind of amazing, do modern monitors even support that kind of framerate? I know I normally get screen tearing when aiming that high and I hate screen tearing, bugs the hell outta me.


They usually use high refresh rate monitors (144hz+) or old CRT monitors that can have a crazy high refresh rate.


----------



## JerryFoxcoon (Jul 6, 2015)

shteev said:


> FPS is like the horsepower of the computer world. it's like computational flex
> 
> so of course people who run rigs that can't push 60 will advocate for 30 :v



I can play 60FPS but I just don't like it.

I feel I'm going to be that hillbilly supporting 30FPS with nobody else on his side. Not that I care really LOL.


----------



## Fernin (Jul 6, 2015)

"It is true that the human eye can't notice anything above 24 fps"

No, no, no, no. This is flat out not true.

There are many sources that cover WHY 24 FPS came into common usage, and it has nothing to do with it being the limit of what the human eye can see. This piece from Wikipedia summarizes it handily:

> Early silent films had stated frame rates anywhere from 16 to 24 FPS,[5] but since the cameras were hand-cranked, the rate often changed during the scene to fit the mood. Projectionists could also change the frame rate in the theater by adjusting a rheostat controlling the voltage powering the film-carrying mechanism in the projector.[6] Silent films were often intended to be shown at higher frame rates than those used during filming.[7] These frame rates were enough for the sense of motion, but it was perceived as jerky motion. By using projectors with dual- and triple-blade shutters, the rate was multiplied two or three times as seen by the audience. Thomas Edison said that 46 frames per second was the minimum needed by the visual cortex: "Anything less will strain the eye."[8][9] In the mid to late 1920s, the frame rate for silent films increased to between 20 and 26 FPS.[8]
>
> When sound film was introduced in 1926, variations in film speed were no longer tolerated, as the human ear is more sensitive to changes in audio frequency. Many theaters had shown silent films at 22 to 26 FPS, which is why 24 FPS was chosen for sound. From 1927 to 1930, as various studios updated equipment, the rate of 24 FPS became standard for 35 mm sound film.[1] At 24 FPS the film travels through the projector at a rate of 456 millimetres (18.0 in) per second. This allowed simple two-blade shutters to give a projected series of images at 48 per second, satisfying Edison's recommendation. Many modern 35 mm film projectors use three-blade shutters to give 72 images per second: each frame is flashed on screen three times.


----------



## Kellie Gator (Jul 7, 2015)

Kalmor said:


> They usually use high refresh rate monitors (144hz+) or old CRT monitors that can have a crazy high refresh rate.


Now I almost want a CRT monitor but those goddamn things were a pain in the ass with how much they weighed, so finally getting a flat screen monitor was a blessing when they came out.


----------



## Stratelier (Jul 7, 2015)

Inpw said:


> The reason why it doesn't matter with film media playback (not the conversion example) is because film stills are smears of movement, motion blur lasting a frame in length. GPUs render sharp stills, which makes the stuttering more noticeable.


This.  This so much.  Motion blur is a natural part of film capture, while in videogames it is artificial (not that it can't be artistic or even functional, like projectile trails from arrows or bullets that might otherwise be invisible).  So the *only* way for videogames to recreate this sense of natural motion blur is to render graphics at an extremely high framerate such that human perception blurs adjacent frames together.
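The "blur adjacent frames together" idea can be sketched in a few lines. This is my own toy 1-D illustration of temporal supersampling (averaging sub-frame renders over a simulated shutter interval), not any engine's actual implementation:

```python
# Film-style motion blur by temporal supersampling: render many sharp
# sub-frames during a simulated shutter interval and average them.
# Toy 1-D "scene": a single bright dot sweeping across a strip of pixels.

WIDTH = 12

def render(dot_pos):
    """Sharp render: exactly one lit pixel at the dot's position."""
    frame = [0.0] * WIDTH
    frame[int(dot_pos) % WIDTH] = 1.0
    return frame

def render_blurred(t0, t1, velocity, subframes=8):
    """Average sub-frame renders over the shutter interval [t0, t1)."""
    acc = [0.0] * WIDTH
    for k in range(subframes):
        t = t0 + (t1 - t0) * k / subframes
        for i, v in enumerate(render(velocity * t)):
            acc[i] += v / subframes
    return acc

sharp = render(0.0)                      # all energy in one pixel
blurred = render_blurred(0.0, 1.0, 4.0)  # dot sweeps 4 pixels in one frame
print(sharp)
print(blurred)   # the same energy smeared along the dot's path
```

A camera's open shutter does this averaging for free; a game has to pay for every sub-frame it renders, which is why faking the blur cheaply is so tempting and so often looks wrong.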


----------



## Fernin (Jul 8, 2015)

Stratadrake said:


> This.  This so much.  Motion blur is a natural part of film capture, while in videogames it is artificial (not that it can't be artistic or even functional, like projectile trails from arrows or bullets that might otherwise be invisible).  So the *only* way for videogames to recreate this sense of natural motion blur is to render graphics at an extremely high framerate such that human perception blurs adjacent frames together.



This effect is also part of why "motion blur" effects unfailingly look like shit. Instead of mimicking natural blur, it just looks like Vaseline or something has been smudged around moving things.


----------



## DevilishlyHandsome49 (Jul 8, 2015)

I really never gave a damn. I've played video games since I was 5, and a smooth game is a smooth game, regardless of FPS. The Last of Us on PS3 was 30. On PS4, I noticed a slight difference, but it still ran smooth on PS3 because it ran properly.

Sometimes 60 FPS does look smooth, but sometimes it's too smooth for its own good. Looks fake, too clean and perfect. Blech

Plus, my TV already has motion blur, so I'm pretty much in an illusion of a higher frame rate all the time lol


----------



## Stratelier (Jul 8, 2015)

DevilishlyHandsome49 said:


> Sometimes 60 FPS does look smooth, but sometimes it's too smooth for its own good. Looks fake, too clean and perfect. Blech


I know in some films you have these scenes where it's like they shot it with a high-speed camera instead of a regular one, so there's no motion blur in any of the frames whatsoever, and it gives them a slightly odd feel compared to the norm.



> Plus, my TV already has motion blur, so I'm pretty much in an illusion of a higher frame rate all the time lol



Yeah, LED TVs have this spec called "response time" i.e. how fast the pixels can change color.  So when you have really fast movement onscreen (especially if it's between high contrast objects) their previous frames sort of 'ghost' onscreen for a bit.  This does create a sense of motion blur if the distance between frames is small, but it's unsettling if the distance between frames is large (because it's this sort of strobed looking blur).
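A very crude way to picture that ghosting (my own toy model, not a real panel simulation): treat each pixel as closing only a fraction of the gap to its new target value per refresh, so the old frame's content lingers for a few refreshes.

```python
# Crude model of LCD response-time ghosting: per refresh, a pixel closes
# only a fraction of the gap to its new target value, so the previous
# frame's content lingers ("ghosts") across the next few refreshes.

def step(displayed, target, response=0.5):
    """One refresh: the pixel moves `response` of the way to its target."""
    return displayed + response * (target - displayed)

# A pixel asked to flip from black (0) to white (255) in a single frame:
shown = 0.0
history = []
for _ in range(5):
    shown = step(shown, 255.0)
    history.append(shown)
print(history)   # [127.5, 191.25, 223.125, 239.0625, 247.03125]
```

The pixel never quite snaps to white; with large frame-to-frame jumps that lag reads as the strobed-looking blur described above.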


----------



## DevilishlyHandsome49 (Jul 8, 2015)

Stratadrake said:


> I know in some films you have these scenes where it's like they shot it with a high-speed camera instead of a regular one, so there's no motion blur in any of the frames whatsoever, and it gives them a slightly odd feel compared to the norm.
> 
> 
> 
> Yeah, LED TVs have this spec called "response time" i.e. how fast the pixels can change color.  So when you have really fast movement onscreen (especially if it's between high contrast objects) their previous frames sort of 'ghost' onscreen for a bit.  This does create a sense of motion blur if the distance between frames is small, but it's unsettling if the distance between frames is large (because it's this sort of strobed looking blur).



I usually have the response time levels on my TV at medium, so that way the blur isn't too messy


----------



## TrishaCat (Jul 9, 2015)

Fernin said:


> This effect is also part of why "motion blur" effects unfailingly look like shit. Instead of mimicking natural blur, it just looks like Vaseline or something has been smudged around moving things.


That reminds me.
Square Enix ruined Final Fantasy Type 0 HD with motion blur. I hope the upcoming PC release allows people to turn that off. It looks hideous.


----------



## dischimera (Jul 9, 2015)

I don't know how the human eye VS frame rate issue goes. But anything from 30 and beyond is more than enough to me.

My favorite game of all time is in 20 FPS or something.

FPS change is not hard to notice at all. What I truly don't believe is when someone claims there's a huge difference from 720p to 1080p on a screen that isn't even 35''. I can't even tell the difference on a 42'' TV.


----------



## TrishaCat (Jul 9, 2015)

dischimera said:


> My favorite game of all time is in 20 FPS or something.


Let me guess: The Legend of Zelda: Ocarina of Time?


----------



## Stratelier (Jul 9, 2015)

DevilishlyHandsome49 said:


> I usually have the response time levels on my TV at medium, so that way the blur isn't too messy



The actual response time of the LCD grid on the screen isn't a stat you can customize, but whatever.  I know mine has one adjustable setting (sharpness I think -- I'd have to take another look) beyond the usual contrast/brightness but I don't believe it is related to motion.


----------



## DevilishlyHandsome49 (Jul 9, 2015)

Stratadrake said:


> The actual response time of the LCD grid on the screen isn't a stat you can customize, but whatever.  I know mine has one adjustable setting (sharpness I think -- I'd have to take another look) beyond the usual contrast/brightness but I don't believe it is related to motion.



Mine are "judder" and I think "speed"


----------



## AnAnomaly (Jul 10, 2015)

Eyes do not see in frames. There is a continuous stream of information flowing into our brains. That said, our brain uses all sorts of neat tricks to save energy, which is why it stitches together rapid, successive images into motion. The issue has to do with perception, which varies from individual to individual. Personally, I notice the difference between 30 and 60 fps, but I can't really tell if it's higher than 70fps. Some people's cutoff is lower, some higher. However, with the overall increase in processing power in consoles, which will always be the lowest common denominator for devs to consider, I believe that 60fps should be the standard, unless I hear a convincing argument otherwise.


----------



## Hakar Kerarmor (Jul 10, 2015)

Arguing that 30 fps is better than 60 fps sounds to me like arguing that a 12cm dick is better than an 18cm dick.


----------



## Kellie Gator (Jul 10, 2015)

Hakar Kerarmor said:


> Arguing that 30 fps is better than 60 fps sounds to me like arguing that a 12cm dick is better than an 18cm dick.


I dunno, if the 12cm dick has more girth it might be preferable. ;3


----------



## Hakar Kerarmor (Jul 10, 2015)

Kellie Gator said:


> I dunno, if the 12cm dick has more girth it might be preferable. ;3



Ok, fine, assuming for all practical purposes an identical girth.


----------



## Bidoyinn (Jul 10, 2015)

60 FPS is always a treat, but when you've grown up on old consoles you tend not to care too much about FPS as long as the game doesn't play like ass. So both are fine. The fights about it are interesting sometimes.


----------



## dischimera (Jul 10, 2015)

Some games do get a lot of benefit from 60 FPS. Like 3D Sonic. It's a lot easier to react on time if it's 60 FPS...


----------



## -Sliqq- (Jul 10, 2015)

As a guy who had to play 20 fps Minecraft.

*I don't give a fuck.*


----------



## JerryFoxcoon (Jul 10, 2015)

Hakar Kerarmor said:


> Arguing that 30 fps is better than 60 fps sounds to me like arguing that a 12cm dick is better than an 18cm dick.



A girl might prefer 12cm if she's not that deep. I don't think the extra 6cm would be an advantage here. Otherwise we should all go and upgrade our wee-wees with 2ft destruction machines!


----------



## Stratelier (Jul 10, 2015)

Somebody played the Furry Godwin card, I see....  (a.k.a. "in any Internet conversation, the probability of a sexual metaphor approaches one as the discussion progresses")


----------



## jahan_sher (Mar 10, 2016)

I don't like 60fps in live action; it makes everything look like a soap opera to me. For CGI and video games I prefer 60fps though.
But yeah, I remember seeing Empire Strikes Back playing in 4K 60fps at a store once and it just looked nasty XD


----------



## Ahkrin Descol (Mar 10, 2016)

I'll admit I do like my 144fps with G-Sync. 60 is perfectly fine for third person (good ol' Dark Souls), but when it comes to the more immersive first person games I do notice 60 and below jerking. It's a marmite thing, I figure; some like/'need' it, some just don't care (we grew up on 15? for the NES after all).


----------



## Moderator-Gazelle (Mar 10, 2016)

I saw a good video on this once by the Extra Credits team!


----------



## Chir (Mar 18, 2016)

60fps deffo. In fact, even though it's a quickly ramping curve towards diminishing returns after 60fps, I'm investing in a 144Hz monitor and at least a GTX970 (waiting for GTX1000-series tho) to make older games like Chivalry: Medieval Warfare and Payday 2 run as smooooth as possible. We recently built a desktop with a GTX960 for a friend, and it is insane how smooth the older titles look. No turning back now, I want that so bad. Besides display refresh rate and frame rate the game renders at, there's talk about frame latency, which technologies like G-Sync and FreeSync aim to eliminate or at least reduce for even better responsiveness. It's a great time to be a gamer, with all of this being reasonably affordable and becoming more standardized. 

As for someone saying 30fps is fine, I get it. I'm on a 4-year-old mid-range gaming laptop, and most games run at 30-50fps on super low settings at 720p. It's okay and I don't direly NEED more, but hey, if I can invest just $300-500 more on the next desktop to make it buttery smooth, I will. It's not a necessity, but it's a luxury I would definitely pay for.


----------

