# Framerate



## Judge Spear (Dec 29, 2012)

I'm not too keen on this topic. I know it's simple. More frames=good, but lately, I've been hearing things about what's playable and unplayable. I've been hearing 30FPS was unplayable and that 60FPS is barely playable/tolerable. I honestly used to think that 30 was standard and 60 was just awesome, but I was wrong. Someone want to grace me with knowledge? I know anything less than 30 is...no. Explanation and examples would be appreciated. I'm just curious.


----------



## EllieTheFuzzy (Dec 29, 2012)

30 FPS is not what I would say unplayable (I've played PES06 at about 7-13 FPS; the slow commentary is freaking funny though xD, though it is definitely not so playable), but it is pretty noticeable and does have some effect. Also, if the FPS suddenly increases in some games, it really can put you out of the loop. Then again, it does depend on what game you're playing.

Like, say, if I'm playing Gran Turismo 3 or 4, it really does need all the FPS you can get, but something like Tetris, or walking around in a game like GTA SA (or any of them), isn't so hard at 30 FPS.


----------



## Kalmor (Dec 29, 2012)

It mostly depends on personal preference. 60fps is now the "standard" and 120fps is the "awesome" level. I can really notice the difference between 30 and 60fps, which means I find playing at 30fps very jerky and not very pleasing. It also has something to do with your monitor's refresh rate: if it has a 60Hz refresh rate you should be aiming for 60fps, and anything above that is just extra.

For me, being into fast-paced racing simulation "games" (e.g. iRacing, rFactor, etc.), framerate is important, since it gives you a better idea of what's going on around you at a much smoother rate, giving you more time to react to anything. I've also heard that a higher framerate helps the "hardcore" FPS players in much the same way.

Whether something is "unplayable" or not is really down to what you feel.

I'm sure someone else can give you a better answer, but I'm quite tired right now.


----------



## AshleyAshes (Dec 29, 2012)

60 is actually the top end on most displays these days.  With the shift to LCDs, most panels only refresh at 60Hz, so you'll see almost all games capped to 60fps.  So, yeah, your average monitor can't even display pictures faster than 60fps.

30fps I'd say is tolerable but going under that would give you bothersome issues.


----------



## BRN (Dec 29, 2012)

XoPachi said:


> I'm not too keen on this topic. I know it's simple. More frames=good, but lately, I've been hearing things about what's playable and unplayable. I've been hearing 30FPS was unplayable and that 60FPS is barely playable/tolerable. I honestly used to think that 30 was standard and 60 was just awesome, but I was wrong. Someone want to grace me with knowledge? I know anything less than 30 is...no. Explanation and examples would be appreciated. I'm just curious.



Think of frames per second as a type of resolution. I mean, you've got your "screen size" resolution, and the higher your resolution, the smaller the details each pixel can display, right? Frames per second are the same idea applied to time: the higher your "fps" resolution, the more of the events happening each second can be shown.

It's true that 60 FPS is faster than the brain can analyse what your eyes see, but that doesn't mean a faster FPS is useless. The higher you go, the smaller the changes in detail you'll be able to pick up. Great for games that need a quick reaction time.
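To make the "resolution in time" idea concrete, here's a toy sketch (an illustration of the concept, not anything from the posts themselves): a brief on-screen event is captured by more frames at a higher rate, so smaller changes become visible.

```python
# At a higher frame rate, a brief event is spread over more frames,
# so finer changes in motion can be shown (toy numbers only).

def frames_capturing_event(event_duration_s: float, fps: int) -> int:
    """Number of whole frames rendered during an event of the given length."""
    return int(event_duration_s * fps)

# A muzzle flash lasting 100 ms:
for fps in (24, 30, 60, 120):
    n = frames_capturing_event(0.1, fps)
    print(f"{fps:>3} fps -> {n} frame(s) show the event")
```

At 24 fps the flash occupies only a couple of frames; at 120 fps it occupies a dozen, which is the "more detail per second" BRN is describing.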


----------



## Judge Spear (Dec 29, 2012)

Oh, so FPS also increases the detail. That I didn't even guess. I thought it was purely speed.


----------



## Kalmor (Dec 29, 2012)

XoPachi said:


> Oh, so FPS also increases the detail. That I didn't even guess. I thought it was purely speed.


Not the picture/texture detail, just the smoothness of the playback you're getting, so it's not as jerky/disjointed. It's hard to show in a YouTube video because YouTube videos all run at 30fps.


----------



## Runefox (Dec 30, 2012)

The higher the frame rate, the more information is presented to your eyes per second. When something moves, you get to see it happen with greater immediacy with a higher frame rate, and thus can react more quickly (in a game) or otherwise reduce any delay (sports casts). In addition, the faster the frame rate, the easier it is to track motion, since instead of adding motion blur or simply juddering as with a low frame rate, a high frame rate will give you a solid picture during motion.

To put this into perspective, most movies are shot at 24 frames per second (The Hobbit was shot at 48), and most TV is shot at 30 frames per second (with ALL current TV broadcasts in NTSC regions running at 29.97). Generally speaking, these rates are OK for most video, and indeed most games can run fine at 30 as well. In a game, however, going much below 30 begins to introduce more and more significant delay between what you're seeing and what is actually happening, and in a shooter, for example, it can seriously impact your ability to gauge where you're pointing at any given moment unless you're perfectly still.

Some people are more sensitive to this than others, and while 30 is perfectly playable in most scenarios (many graphically intensive console games run at 30), 60 is near the point where the eye can no longer distinguish differences in motion between the computer screen or TV and real life. The next Avatar film is set to be shot at 60 frames per second, and testing is already being done on shooting at 120 frames per second (which is above and beyond the threshold for motion in human vision, has the added benefit of being cleanly divisible by 60 and 24 for older equipment, and is also directly compatible with most 3D equipment, since depending on the technology, 3D is also displayed at this rate). The major drawback to recording film at these speeds is the sheer amount of data required to store it. Thankfully, for games, this isn't a problem.
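One way to see why low frame rates add delay: in the worst case, something that happens just after a frame is drawn isn't visible until the next frame, a full frame interval later. A tiny illustrative sketch:

```python
# Worst-case extra delay before an on-screen event becomes visible:
# one full frame interval (illustrative numbers only).

def frame_interval_ms(fps: float) -> float:
    """Time between consecutive frames, in milliseconds."""
    return 1000.0 / fps

for fps in (15, 24, 30, 60):
    print(f"{fps:>2} fps -> up to {frame_interval_ms(fps):.1f} ms before you see it")
```

Going from 30 fps to 15 fps doubles that worst-case delay from about 33 ms to about 67 ms, which is why dropping much below 30 starts to feel laggy.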

The reason more traditional film and TV work so well at lower frame rates, though, is mainly motion blur. It tricks our minds into thinking that something is moving smoothly by blurring the differences between frames. This is why many people have had issues watching The Hobbit: with less motion blur and more detail in motion, the picture heads into uncanny-valley territory where the motion becomes "too detailed", and thus, for some, distracting.

Check out this nifty utility for some hands-on comparisons. Just bear in mind that your computer screen probably won't go above 60 unless it's 3D-capable, but that should represent the maximum frame rate your computer is able to display. There IS value in having an even higher frame rate than what is displayed on-screen (buffer for sudden drop in fps, lesser effect of input lag), however for the purposes of demonstration, 60 should be your "highest frame rate" benchmark. Experiment with turning motion blur on and off in this app and see just how slow 24 or 30 is compared to 60, and also how much motion blur affects how detailed the picture is and how smooth you perceive it to be. Notice, too, that 30 and below appears to "lag behind" slightly.


----------



## AshleyAshes (Dec 30, 2012)

Runefox said:


> To put this into perspective, most movies are shot at 24 frames per second (The Hobbit was shot at 48), and most TV is shot at 30 frames per second (with ALL current TV broadcasts in NTSC regions running at 29.97).



<FilmStudent>

Well, news, soap operas, sports, and more 'live' things like that are shot at 29.97fps, but dramas, single-camera sitcoms, animation and the like are shot at 23.976fps. :3  So it's not universally 29.97 for 'television'; it depends more on the category of television.

</FilmStudent>


----------



## Saga (Dec 30, 2012)

Depends with me. 3D games or anything fast-paced I like to have at 60 or above, but no higher than 100; with 2D and relaxed games I feel just fine at 30 fps.


----------



## Lobar (Dec 30, 2012)

Depends entirely on what you're playing, but I can't imagine 30fps actually giving you a gameplay handicap in anything but extremely twitchy Japanese fighters or something.


----------



## Milotarcs (Dec 30, 2012)

Normal, crappy video cameras are 10FPS
HD video is 24-30FPS
60FPS usually for high speed and slowing down on small dSLRs
Anything higher is in specialized high speed cameras.


----------



## AshleyAshes (Dec 30, 2012)

Milotarcs said:


> Normal, crappy video cameras are 10FPS
> HD video is 24-30FPS



Uhh, no dude, video cameras typically run at 30fps (well, really, 60 interlaced fields per second, but anyway).  It's not unique to 'HD cameras'.  I mean, sure, various webcams and the like often run at, say, 15fps, but those are their own field of crappy cameras.



Milotarcs said:


> 60FPS usually for high speed and slowing down on small dSLRs
> Anything higher is in specialized high speed cameras.



Nah.  Even run-of-the-mill cameras can shoot higher than 60fps; it's not unique to specialized high-speed cameras.  A Red Epic, for example, a pretty standard cinema camera, can shoot 120fps without cropping the sensor, and up to 300fps if you crop the sensor to save on bandwidth.  'Specialized high-speed cameras' certainly exist, but 'high speed' is well, well beyond 60fps, starting around 500fps and going much higher.


----------



## Runefox (Dec 31, 2012)

AshleyAshes said:


> <FilmStudent>
> 
> Well, news, soap operas, sports, and more 'live' things like that are shot at 29.97fps, but dramas, single-camera sitcoms, animation and the like are shot at 23.976fps. :3  So it's not universally 29.97 for 'television'; it depends more on the category of television.
> 
> </FilmStudent>


Yes, and they're still broadcast at 29.97, despite the fact that recording at 23.976 is hilariously wtf.


----------



## shteev (Dec 31, 2012)

I thought I heard somewhere that the human eye doesn't really detect anything higher than 60 fps, but this could be wrong. I have noticed that, when looking at a TV with a 120Hz refresh rate, there is a noticeable difference over traditional TV sets.

As for what's playable, it really depends on what kind of game it is and what your personal preference is. For example, if you're playing an FPS, you might benefit from getting a full 60 fps over 30, whereas if you were playing a racing game, 30 would do just fine. Some games like Borderlands 2 have framerate smoothing, where the framerate varies between two values but the changes aren't too noticeable, and this works pretty well (IMO). Really, it's about user preference. If you're not sure, then aim for the highest framerate possible by turning down settings, and then turn them back up until you've reached a happy medium. If your computer can play things maxed out at a straight 60 FPS, then you're all set.


----------



## Kalmor (Dec 31, 2012)

Runefox said:


> Yes, and they're still broadcast at 29.97, despite the fact that recording at 23.976 is hilariously wtf.


Perhaps Ashley can answer this: why 29.97/23.976fps rather than 30/24fps? Surely it would be easier if you didn't have to work with decimals all the time.


----------



## AshleyAshes (Dec 31, 2012)

Raptros said:


> Perhaps Ashley can answer this: why 29.97/23.976fps rather than 30/24fps? Surely it would be easier if you didn't have to work with decimals all the time.



Firstly, the reason it's 60Hz at all (60 interlaced fields, which can be entirely separate, or two fields that together make one frame, i.e. '30fps') is the use of the electrical grid to provide a sync signal.  The North American electrical grid runs at 60Hz, so a TV doesn't need fancy timing hardware when it can just read that signal off the electricity going through itself.  It's also why Europe and other places use 25fps/50i: they have 50Hz electrical grids.

But uhh, in the black and white days it was 30fps/60i.  They made a small change to the NTSC signal when they implemented color broadcasts, slightly adjusting the frequency of the color signal to keep it from interfering with the sound signal.  I'm more on the creative-technical side than the engineering side of television broadcasting, so I don't understand it with any greater depth than that.  But basically it was 'forced' as NTSC implemented color broadcasts while keeping the signal backwards compatible with black and white television sets.  This didn't happen in Europe with PAL.  23.976 is derived from 29.97 at the same ratio of difference as there is between 24 and 30.

However, since 29.97 means you fall behind real time by about 1.8 frames every minute, drop-frame timecode skips two frame numbers each minute (except every tenth minute) to make up for it.  You can just mentally think of them as '24' and '30' fps and be sure to set the right framerate when setting up equipment or exporting video.
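For anyone curious, the exact numbers fall out neatly if you treat the rates as the standard fractions 30000/1001 and 24000/1001 (a quick sketch; the drop-frame bookkeeping described is the standard SMPTE scheme):

```python
from fractions import Fraction

# Exact NTSC rates: the colour-era change divided the nominal rates by 1.001.
ntsc_video = Fraction(30000, 1001)   # ~29.97 fps
ntsc_film = Fraction(24000, 1001)    # ~23.976 fps
print(float(ntsc_video))             # 29.97002997...

# Over one minute, a nominal "30 fps" frame count drifts ahead of real time:
drift_per_minute = 30 * 60 - ntsc_video * 60
print(float(drift_per_minute))       # ~1.8 frames per minute

# SMPTE drop-frame timecode compensates by skipping two timecode numbers
# each minute, except every tenth minute: 2 * 54 = 108 skips per hour,
# against a real drift of ~107.9 frames per hour.
print(float(drift_per_minute * 60))  # ~107.9
```

Note the skipped numbers are timecode labels only; no actual frames of video are dropped.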


----------



## Kalmor (Dec 31, 2012)

AshleyAshes said:


> Firstly, the reason it's 60Hz at all (60 interlaced fields, which can be entirely separate, or two fields that together make one frame, i.e. '30fps') is the use of the electrical grid to provide a sync signal.  The North American electrical grid runs at 60Hz, so a TV doesn't need fancy timing hardware when it can just read that signal off the electricity going through itself.  It's also why Europe and other places use 25fps/50i: they have 50Hz electrical grids.
> 
> But uhh, in the black and white days it was 30fps/60i.  They made a small change to the NTSC signal when they implemented color broadcasts, slightly adjusting the frequency of the color signal to keep it from interfering with the sound signal.  I'm more on the creative-technical side than the engineering side of television broadcasting, so I don't understand it with any greater depth than that.  But basically it was 'forced' as NTSC implemented color broadcasts while keeping the signal backwards compatible with black and white television sets.  This didn't happen in Europe with PAL.  23.976 is derived from 29.97 at the same ratio of difference as there is between 24 and 30.
> 
> However, since 29.97 means you fall behind real time by about 1.8 frames every minute, drop-frame timecode skips two frame numbers each minute (except every tenth minute) to make up for it.  You can just mentally think of them as '24' and '30' fps and be sure to set the right framerate when setting up equipment or exporting video.


Ahh thanks for this. The more you know eh? That question has been bugging me for a long time, thanks!


----------



## shteev (Dec 31, 2012)

AshleyAshes said:


> Firstly, the reason it's 60Hz at all (60 interlaced fields, which can be entirely separate, or two fields that together make one frame, i.e. '30fps') is the use of the electrical grid to provide a sync signal.  The North American electrical grid runs at 60Hz, so a TV doesn't need fancy timing hardware when it can just read that signal off the electricity going through itself.  It's also why Europe and other places use 25fps/50i: they have 50Hz electrical grids.
> 
> But uhh, in the black and white days it was 30fps/60i.  They made a small change to the NTSC signal when they implemented color broadcasts, slightly adjusting the frequency of the color signal to keep it from interfering with the sound signal.  I'm more on the creative-technical side than the engineering side of television broadcasting, so I don't understand it with any greater depth than that.  But basically it was 'forced' as NTSC implemented color broadcasts while keeping the signal backwards compatible with black and white television sets.  This didn't happen in Europe with PAL.  23.976 is derived from 29.97 at the same ratio of difference as there is between 24 and 30.
> 
> However, since 29.97 means you fall behind real time by about 1.8 frames every minute, drop-frame timecode skips two frame numbers each minute (except every tenth minute) to make up for it.  You can just mentally think of them as '24' and '30' fps and be sure to set the right framerate when setting up equipment or exporting video.



Gotta love it when things we do in the past just confuse us and make things more complicated in the future, eh?


----------



## Runefox (Dec 31, 2012)

shteev said:


> I thought I heard somewhere that the human eye doesn't really detect anything higher 60 fps, but this could be wrong. I have noticed that, when looking at a TV that has a 120hz refresh rate, there is a noticeable difference over traditional TV sets.


In this case, the 120Hz refers to what the screen is capable of displaying - the input source is still running at what it always does, and furthermore not all 120Hz/240Hz TVs will accept a 120Hz/240Hz signal. What sets them apart and makes things smoother is motion compensation (Sony calls theirs MotionFlow) - it takes the difference between two frames (more frames for "240Hz" models) and, using some mathematical wizardry, blends them to create one or more intermediate images. This causes lag, but for TV broadcasts or other non-interactive media that isn't a problem (games are noticeably slower to respond). I personally like the effect it has on TV broadcasts, and though the idea of increasing the perceived frame rate of a game is enticing, it's always best to turn it off for gaming.
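A toy sketch of why this is harder than it sounds: the simplest possible "interpolation" is a straight blend of two frames, which produces a cross-fade rather than true motion - hence the "mathematical wizardry" (motion-vector estimation) in real sets. Frames here are just lists of pixel brightness values:

```python
# Naive frame interpolation: a linear blend of two frames.
# Real motion compensation estimates motion vectors instead; this toy
# version shows why a plain blend isn't enough.

def blend(frame_a, frame_b, t=0.5):
    """Linear blend between two equal-length frames; t=0.5 is the midpoint."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

f0 = [0, 0, 100, 0]   # bright pixel at index 2
f1 = [0, 0, 0, 100]   # it has moved to index 3
mid = blend(f0, f1)
print(mid)  # [0.0, 0.0, 50.0, 50.0] - a cross-fade, not a pixel at index 2.5
```

A proper interpolator would detect that the bright pixel moved one position and place it "between" the two locations; blending just ghosts it across both, which is the artifact cheap interpolation modes are known for.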

The biggest thing that a 120Hz TV brings to the table is the elimination of 3:2 pulldown. Basically, since film is shot at 24 fps, a 60Hz TV can't display every frame for the same length of time: alternating frames are held for three fields, then two, introducing a judder. Doubling 60 to 120 keeps the display of 30/60Hz content possible while also being cleanly divisible by 24, so film can be displayed properly too. Since most devices don't output at 120Hz, though, it's important to set the source's output to 24Hz to enable this. I don't think it's entirely necessary for analogue connections, but for digital it is.
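The 3:2 pulldown pattern itself is easy to sketch: hold film frames for alternately three and two fields, so four film frames fill ten fields and 24 fps fills 60 fields per second:

```python
# 3:2 pulldown: mapping 24 film frames/sec onto 60 fields/sec by holding
# frames for alternately 3 and 2 fields. Over 4 film frames that's
# 3 + 2 + 3 + 2 = 10 fields, i.e. 24 * (10/4) = 60.

def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        hold = 3 if i % 2 == 0 else 2
        fields.extend([frame] * hold)
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] - 10 fields from 4 frames
```

The uneven hold times (50 ms vs 33 ms per frame) are exactly the judder a 120Hz panel avoids, since 120 divides evenly by 24 (each film frame held for exactly 5 refreshes).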

Anyway...



> Gotta love it when things we do in the past just confuse us and make things more complicated in the future, eh?



To be fair, I don't think that at the time of colour TV's introduction, anyone thought that digital was going to happen or what the consequences were. That said, the only reason this was done in the first place was to keep backwards compatibility with older black and white TV sets.


----------



## AshleyAshes (Dec 31, 2012)

shteev said:


> Gotta love it when things we do in the past just confuse us and make things more complicated in the future, eh?



Considering that ATSC didn't replace NTSC in the United States for over-the-air broadcasts till 2009... No.


----------



## Fernin (Dec 31, 2012)

"I thought I heard somewhere that the human eye doesn't really detect anything higher 60 fps"

I've always hated this because it's wildly incorrect. The human eye doesn't work in frames; it's not rendering individual pictures like a computer, it's taking in a constant stream of information. This is why, when you look out the side window of your car, the world isn't a choppy mess. That magic 38-62 fps range is the point at which the average person can no longer detect individual frames in video playback. In video games, where player input affects what happens onscreen, this point is even more critical, because a high frame rate means the player's actions are visible sooner, resulting in the appearance of faster/smoother response in the game.


----------



## BRN (Dec 31, 2012)

Fernin said:


> "I thought I heard somewhere that the human eye doesn't really detect anything higher 60 fps"
> 
> I've always hated this because it's wildly incorrect. The human eye doesn't work in frames; it's not rendering individual pictures like a computer, it's taking in a constant stream of information. This is why, when you look out the side window of your car, the world isn't a choppy mess. That magic 38-62 fps range is the point at which the average person can no longer detect individual frames in video playback. In video games, where player input affects what happens onscreen, this point is even more critical, because a high frame rate means the player's actions are visible sooner, resulting in the appearance of faster/smoother response in the game.



So what's the functional difference? This is splitting hairs.


----------



## Runefox (Jan 1, 2013)

SIX said:


> So what's the functional difference? This is splitting hairs.


Basically, the human eye detects differences in motion in a constant stream via changes in contrast over minutes of arc, though our vision is calibrated to a certain threshold of motion before the brain considers it as moving (in nature, there are birds that can detect extremely minuscule amounts of motion at ridiculous distances). At a certain threshold, the difference in motion is so subtle that the human eye (or rather, brain) can't detect the changes as they happen. Since this is a biological process, it varies from person to person; universally, though, it happens in video not at 60, but somewhere closer to 100Hz.

For example, traditional CRT monitors suffered from "flicker" akin to a fluorescent light when running at 60Hz, which causes eye strain before too long. That's because the electron beam that refreshes the screen on a CRT is still detectable at 60Hz (60 frames per second), so very briefly we can detect that the image is not a solid image. For most people, a setting of 85Hz was the beginning of the "flicker-free" zone, and thus closer to the threshold of human vision's ability to distinguish extremely subtle motion. Beyond 100Hz, the change in motion between frames is so small that it becomes impossible to resolve.

Going back to film, then, the reason why the human eye tricks the brain into thinking that 24 frames per second is 'natural' is because of motion blur, which smooths out the contrast differences between frames of motion and tricks our brains into perceiving it as speed (much like a camera, our eyes will report motion blur to our brains during sufficiently fast motion - An example of this is to look at the texture of the road while in a moving vehicle. Even though actual speed may be low, the details of the gravel will begin to blend together as speed increases).

It's all very similar to the digital audio world, where an analogue wave is "sampled" at a very high rate and then reconstructed at the DAC level for reproduction by interpolating the differences between each point of data about the sound wave. It works like drawing a smooth curve over a bar graph - the bar graph represents the digital data, and the smooth curve the original sound wave. The audio player's job is to reconstruct the wave before sending the electrical signal to the speaker. Thus, a low sample rate in audio results in a fuzzy mess to our ears - something we hear a lot over a telephone line, which operates around 8kHz. CD audio is recorded at a sample rate of 44.1kHz, which has a Nyquist frequency (half the sample rate, representing the range of sound that can be accurately reproduced by the digital signal) of about 22kHz, the upper threshold of human hearing's ability to distinguish change in pitch.

DVD audio and other formats record at higher frequencies - the idea being that the more information the speaker has about a sound, the more detailed the sound will be, regardless of the hearing range of the listener. This is important because ultrasonic and infrasonic sound waves can interact with the audible spectrum, providing subtle tone differences that we hear in reality but don't necessarily always hear in a digital reproduction. This is why vinyl records are still hailed today as a high quality playback format: they record the sound wave directly as an analogue wave rather than as digital samples (though they have their own drawbacks).

In both worlds, the frame rate or sample rate has an upper limit beyond which the human brain no longer perceives a change between frames / pitch. However, there is still always a benefit in pushing higher, because the more information that is presented to our brains, the more likely our perception of it is to be accurate to the source material, especially since neither digital audio nor video exist in nature, and video itself is in effect an optical illusion. This is why audiophiles return to vinyl recordings with vacuum tube amplifiers and high-end speakers and headphones, and why videophiles will choose their equipment based on how accurately colour and contrast are portrayed (and then calibrate it), as well as setting up their environment for optimal viewing. In both cases, the most accurate reproduction possible is the goal, and while a videophile may not necessarily be interested in the frame rate over their other concerns (after all, most people in this category are there for film), in the interests of immersion, the higher the frame rate, the more "real" the reproduction becomes.
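The Nyquist relationship mentioned above is simple enough to compute (an illustrative sketch of the rule, with the standard rates for telephone, CD, and DVD audio):

```python
# Nyquist rule: a sample rate can faithfully represent frequencies
# only up to half of itself.

def nyquist_hz(sample_rate_hz: float) -> float:
    """Highest frequency representable at a given sample rate."""
    return sample_rate_hz / 2.0

for name, rate in [("telephone", 8_000), ("CD", 44_100), ("DVD audio", 96_000)]:
    print(f"{name}: {rate} Hz sampling -> up to {nyquist_hz(rate):.0f} Hz reproducible")
```

Telephone audio tops out around 4kHz (hence the muffled sound), CD reaches the ~22kHz edge of human hearing, and higher-rate formats extend into the ultrasonic range discussed above.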


----------



## AshleyAshes (Jan 1, 2013)

To sum it up: Faster framerate is always better, though your display will likely max out at 60fps.  Meanwhile, for dramatic television and cinema, 24fps is ideal for freaky reasons, because higher framerates somehow look 'fake'.


----------



## Runefox (Jan 1, 2013)

AshleyAshes said:


> To sum it up: Faster framerate is always better, though your display will likely max out at 60fps.  Meanwhile, for dramatic television and cinema, 24fps is ideal for freaky reasons, because higher framerates somehow look 'fake'.


It looks "fake" because we aren't in control over the camera's motion as an audience, and we feel no external forces when the camera moves. Because of this, at higher frame rates, we perceive more information and a more detailed image, and consider it too uncanny because we have only the visual stimulus of the scene and nothing else. The scale of the actors, the angle of the shot, everything feels strange because of the amount of detail in the moving image. Our suspension of disbelief becomes difficult to maintain. That having been said, though, the mere fact that we became accustomed to 24fps to begin with means that we can become accustomed to 48fps and higher. We've learned to discard the notion of the uncanny valley at 24fps. An audience that has only ever watched a 48fps film may find 24fps film to be similarly jarring in the same way we might today find 12fps film to be, because of our conditioning.

It also looks fake because post-processing looks fake. 

See also: http://www.youtube.com/watch?v=I6yrnnbhOp0 (12fps)
http://www.youtube.com/watch?v=-nU2_ERC_oE (30 vs 60 fps)

Quite frankly, screw 24fps.


----------



## AshleyAshes (Jan 1, 2013)

Runefox said:


> Quite frankly, screw 24fps.



I love my 24fps; I specifically shoot on DSLRs because they're affordable and offer 24p recording.  However, I dispute that we'll 'get used' to 48fps, because relatively higher framerates aren't new.  The reason The Hobbit has that 'Cops' feel is because we're used to seeing shows like Cops at a higher framerate.  While people talk about 60i video as 'two interlaced fields that make a frame', on a LOT of hardware this isn't true at all.  A lot of cameras, like all consumer 'video cameras' (until HD came around) and live studio cameras and ENG (Electronic News Gathering) class cameras, primarily shot at true 60i.  That is, every 60th of a second a field of video was shot, and the next 60th of a second the next field was shot, so no two interlaced fields could ever be combined to make a single progressive frame.  While not quite 60p, this 60i video is very smooth, just like 48p for cinema.  It creates that 'cheap' feeling that leaves us instantly knowing we are watching 'video' instead of 'film'.  It's also why countless television programs with high production value were in fact shot on film at 24fps and then converted to video for post-production and broadcast: they were expensive shows, and no one wanted them to have the cheap feeling of video.

Speaking of motion blur, wanna know what's funny?  Sometimes I have to shoot at a high shutter speed to eliminate motion blur, only to spend a lot of processing time adding motion blur back in post-production.  If you're using a computer to 'solve' a camera move so you can add 3D effects that match the video, motion blur confuses the tracking, which prefers sharply defined objects.  Crazy, no?


----------



## Runefox (Jan 1, 2013)

If we'd started shooting film at 60p to begin with, we'd be used to that frame rate. Fact of the matter is, 24fps is a holdover from the early days of film, kept to retain compatibility with 12fps film and to keep the size of the reel (and projector motors, etc.) down. 24fps wasn't chosen because it was the "perfect" frame rate; the reason it's still chosen today is exactly BECAUSE film was shot at 24. I would go as far as to say that Cops and other high frame rate shows are pioneers of the modern film era, even if the productions themselves weren't high quality. The fact that we're "used" to those productions being low quality and associating the frame rate with that is again part of what I'm talking about as far as our conditioning goes, and the same goes for home video recorder frame rates. Regardless of conditioning, the fact that the film and television industries are only now able to synchronize their frame rates thanks to 120Hz TV sets is a testament to the fact that 24fps is a batshit insane frame rate to use.

High frame rate video isn't new at all, no, but having it in motion pictures as anything but slow motion capture is. As for "video" vs "film", do you actually shoot in film anymore? There's no practical reason to shoot in 24fps anymore except for the fact that we're used to it. It's time to move on. Quite frankly, HFR video is a much greater advancement in the video industry than 3D ever will be. 3D has come and gone time and time again, because quite frankly, stereoscopy is only a very small part of the equation when it comes to resolving a 3D image. In addition, the extra hardware required and the extra production costs dramatically outweigh the effect, and 3D's consumer flop in spite of the massive push is testament to that. A real improvement in image quality will come with a higher frame rate, just as Deep Color, 1080p and other advancements have dramatically improved video quality over 480i/p DVD playback, which again improved over the abysmal quality of VHS. Use the extra storage space for a higher frame rate, or if nothing else, at least increase the bitrate of the video. 3D not only REDUCES image quality by reducing colour depth AND vertical resolution, but the extra hardware required is absolutely not worth the investment.

20 years from now, we'll wonder why we ever watched 24fps video in the first place.


----------



## Arshes Nei (Jan 1, 2013)

The human mind likes to put shapes together as well. That's why a highly rendered environment doesn't hold a person's attention as long as a drawing that is more loose.

Cinematography is an art form. Some people forget that when they try to pursue high quality realism.


----------



## Runefox (Jan 1, 2013)

The way the industry is right now, realism seems to be the direction that everything is going. The film industry doesn't necessarily have to work at 12, 24, 48, 60, 100, 120 or any other arbitrary frame rate, but right now, the film industry is LOCKED at 24fps. The Hobbit is the only major film release I can think of, at least in recent history, that has bothered to break from that convention. I'm not necessarily saying that 24fps is crap, but that EVERYTHING being shot at 24fps is boring, not life-like, and while there have been great artistic films shot at it, I wonder what would happen if 24fps wasn't the holy grail of cinema frame rates. If cinematography is an art form, why not let the artist choose their aspect ratio, their frame rate, their colour depth? Why not film something in 12fps Technicolor if it's for artistic purposes? Why not broadcast sporting and news events at 120fps? The fixation on 24 is stupid. If the intention is realism, shoot at a high frame rate. If the intention is artistic, then shoot at whatever works. Just shooting at 24 and calling it a day seems so strange to me.


----------



## Arshes Nei (Jan 1, 2013)

I'm just countering the
"tldr; higher framerates = great"

I'm saying it generally doesn't work with fantasy. I know there were a lot of complaints about The Hobbit, complaints I agree with.

There are a lot of sci-fi and fantasy movies that end up showing their age because of a fixation on realism/CG instead of thinking about practical effects and what helps make movies...well, magic.


----------



## AshleyAshes (Jan 1, 2013)

Runefox said:


> As for "video" vs "film", do you actually shoot in film anymore? There's no practical reason to shoot in 24fps anymore except for the fact that we're used to it.



Well, when I use 'Video' vs 'Film' I mean it the way it was meant in the 90s and before, where 'Video' and 'Film' had two distinct looks and one was shot electronically while the other was shot photochemically. Now that most material, even for cinema, is shot with digital video technology, it refers to the 'film look'. I argue that there's a major creative element in shooting for a 'film look', because sometimes complete realism isn't what's wanted. The film look can absorb the viewer into what they're watching, and it plays into the suspension of disbelief.



Runefox said:


> 20 years from now, we'll wonder why we ever watched 24fps video in the first place.



I don't think it's reasonable to say that with such absolute conviction.  While I strongly think that 24fps will remain the standard (if anything, more productions are going to 24p; I've noticed that even some documentary shows you'd expect to be shot at 60i are now shot at 24p, though certainly not all), consumers and audiences can be fickle.  They can stick to what they've always known and refuse to change, whereas for something else they'll constantly seek something 'better' and leave things hard to even standardize.  So I don't think a rapid change is something that can be predicted with conviction.



Runefox said:


> The way the industry is right now, realism seems to be the direction that everything is going. The film industry doesn't necessarily have to work at 12, 24, 48, 60, 100, 120 or any other arbitrary frame rate, but right now, the film industry is LOCKED at 24fps. The Hobbit is the only major film release I can think of, at least in recent history, that has bothered to break from that convention. I'm not necessarily saying that 24fps is crap, but that EVERYTHING being shot at 24fps is boring, not life-like, and while there have been great artistic films shot at it, I wonder what would happen if 24fps wasn't the holy grail of cinema frame rates. If cinematography is an art form, why not let the artist choose their aspect ratio, their frame rate, their colour depth? Why not film something in 12fps Technicolor if it's for artistic purposes? Why not broadcast sporting and news events at 120fps? The fixation on 24 is stupid. If the intention is realism, shoot at a high frame rate. If the intention is artistic, then shoot at whatever works. Just shooting at 24 and calling it a day seems so strange to me.



Well, increasing framerate beyond 60fps also has another caveat: shutter speed.  That is, how long the shutter is open to expose the film or sensor to light.  You can easily expose for less time than an entire frame.  When shooting at 24p, the 'standard' shutter is 1/48th of a second before the exposure stops; it's 1/60th for 30p/60i.  This is to maintain a certain amount of motion blur.  I could shoot at a 1/24 shutter speed, but I'd see a lot more motion blur.  However, your shutter speed can't really be slower than your framerate, so shooting at 120fps, for example, mandates a shutter speed no slower than 1/120.  1/120 only allows 40% as much light through as 1/48.  You now need to compensate by either opening the aperture on the lens (which in turn affects the depth of field), increasing the sensitivity of the sensor/film (which increases the noise that's captured as well), or adding more lights to the scene (which can stress electrical systems and increase heat on set).  This is also why high frame rate filming, like the super slow-mo 1000fps stuff, requires STUPID amounts of hot lights to ensure the camera actually sees something.
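
The light-loss arithmetic in the post can be checked with a small sketch (the helper names are mine; each photographic "stop" halves the light):

```python
import math

def light_ratio(shutter_a: float, shutter_b: float) -> float:
    """Fraction of light an exposure of shutter_b seconds gathers
    relative to an exposure of shutter_a seconds."""
    return shutter_b / shutter_a

def stops_lost(shutter_a: float, shutter_b: float) -> float:
    """Exposure difference in photographic stops (each stop halves the light)."""
    return math.log2(shutter_a / shutter_b)

# 24p with the 'standard' shutter exposes for 1/48 s;
# 120fps capture can expose for at most 1/120 s per frame.
ratio = light_ratio(1 / 48, 1 / 120)  # 48/120 = 0.4, i.e. 40% as much light
stops = stops_lost(1 / 48, 1 / 120)   # log2(2.5), about 1.3 stops darker
print(f"{ratio:.0%} of the light, {stops:.2f} stops lost")
```

That ~1.3-stop deficit is exactly what the wider aperture, higher sensor gain, or extra lighting has to buy back.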

So everything has its tradeoffs, including increasing your framerate.

That said, even increasing the shutter speed can have its artistic value.  Saving Private Ryan, for example, was shot at a shutter speed of 1/192 to replicate the high shutter speed of WWII-era cameras, reducing the motion blur to achieve an artistic effect.
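
Cinematographers often express this as a shutter *angle* rather than a time (a convention from rotary-shutter film cameras, where 360 degrees means the shutter is always open). A minimal conversion sketch, with my own helper name:

```python
def shutter_angle(frame_rate: float, exposure_time: float) -> float:
    """Equivalent mechanical shutter angle in degrees for a given
    frame rate (fps) and per-frame exposure time (seconds)."""
    return 360.0 * frame_rate * exposure_time

standard = shutter_angle(24, 1 / 48)   # the 'standard' 180-degree film look
ryan = shutter_angle(24, 1 / 192)      # a 45-degree shutter: crisp, staccato motion
print(f"standard: {standard:.0f} deg, Saving Private Ryan: {ryan:.0f} deg")
```

So the 1/192 exposure at 24fps works out to a 45-degree shutter, a quarter of the usual 180, which is why the action scenes look so sharp and jittery.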


----------



## Judge Spear (Jan 1, 2013)

...All I wanted was the difference between 30 and 60fps. ;-;

I wasn't expecting the thread to blow up like this. This is the most information I've seen being passed around this site in a while.


----------

