Choosing the Frame Rate for Live Streaming
This article explores the factors to consider when choosing your frame rate, or streaming fps, for live streaming.
Let’s start from the beginning.
What Is Frame Rate?
Frames are the individual pictures in a video file. The frame rate, usually expressed in frames per second, or fps, is the number of frames contained in each second of video.
Common frame rates include 24 fps (for film), 29.97/59.94 fps (for U.S. broadcast TV), and 30/60 fps (for live streaming). If you’re shooting in Europe, you’ll probably shoot at 25 fps or 50 fps.
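To make the definition concrete, here’s a minimal sketch of the arithmetic: the time each frame spends on screen is just the reciprocal of the frame rate.

```python
# Per-frame display time for the common frame rates listed above.
for fps in (24, 25, 30, 50, 60):
    duration_ms = 1000 / fps  # milliseconds each frame stays on screen
    print(f"{fps} fps -> {duration_ms:.2f} ms per frame")
```

So a 30 fps video shows a new frame roughly every 33 ms, while a 60 fps video halves that to about 17 ms.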
Give Me the TL;DR, Please
For the vast majority of live streaming producers in the U.S., if you configure your camera at 30 fps, your encoder at 30 fps, and your cloud transcoder at 30 fps, you’re in good shape.
In Europe, you can go with 25 fps or 30 fps.
As I said at the top, for the vast majority of productions, using 30 fps (or 25 fps in Europe) from glass to glass should produce a high-quality result.
Frame Rates You Know and Love
Here are some common frame rates and their typical usage.
24 fps — Used for most movies.
25 fps — Used in Europe for broadcast and streaming.
29.97 fps — Television was originally broadcast at 30 fps in the U.S., but 29.97 was introduced for technical reasons to support color TV. If you’re shooting solely for streaming, you should go with 30 fps.
30 fps — Used for non-broadcast video in the U.S. and elsewhere.
50 fps — High frame rate video in Europe for broadcast or streaming.
59.94 fps — High frame rate video for U.S. broadcast.
60 fps — High frame rate for non-broadcast applications.
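The fractional broadcast rates aren’t rounded-off versions of 30 and 60 so much as exact ratios; a quick sketch using Python’s fractions module:

```python
from fractions import Fraction

# NTSC broadcast rates are exact rationals, not rounded decimals.
ntsc_30 = Fraction(30000, 1001)  # the "29.97 fps" rate
ntsc_60 = Fraction(60000, 1001)  # the "59.94 fps" rate

print(float(ntsc_30))  # ~29.97
print(float(ntsc_60))  # ~59.94
```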
Where Do You Set the Frame Rate?
You typically set the frame rate in up to three places. The first is in the camera (shown below for an iPhone).
The second place is the streaming encoder (shown below for Telestream Wirecast).
The third is in the cloud transcoder. For the record, the Wowza Streaming Cloud transcodes at the incoming rate by default, which you can reduce if desired.
Ditto with Wowza Streaming Engine, which allows you to change the frame rate using the SkipFrameCount tag.
What’s the Difference Between the Frame Rate and Refresh Rate or Hz?
As mentioned, the frame rate defines the number of frames per second in the video file. In contrast, the refresh rate, or Hz, defines the number of times per second a display refreshes.
At this point, most display devices refresh at 60 Hz or faster so they can display each frame in a 60 fps stream, which is typically the highest frame rate used for streaming.
So, unless you’re targeting older devices, the refresh rate of your viewers shouldn’t be a factor in choosing a streaming fps.
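One way to see why the refresh rate rarely matters: divide the display’s refresh rate by the stream’s frame rate. A minimal sketch (refreshes_per_frame is an illustrative name, not a real API):

```python
# How many refresh cycles each video frame occupies on a given display.
def refreshes_per_frame(refresh_hz: float, fps: float) -> float:
    return refresh_hz / fps

# A 60 Hz display simply shows each 30 fps frame twice.
print(refreshes_per_frame(60, 30))  # 2.0
print(refreshes_per_frame(60, 60))  # 1.0
```

As long as the result is 1.0 or greater, the display can show every frame in the stream.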
What Frame Rate Should I Use at the Source?
If you control the frame rate in the camera or other video source, you should match the frame rate to the content.
If you’re shooting fast-motion sports or capturing a computer gaming screen, go with 50/60 fps.
Some purists would argue that 24 fps is best for cinematic experiences, but most producers go with 25/30 fps for non-sports productions.
More isn’t always better when it comes to frame rate and streaming because it takes more bandwidth to support a high-quality 60 fps stream than a high-quality 30 fps stream.
For most productions with normal motion, viewers can’t tell the difference between 30 fps and 60 fps at equal quality levels.
However, when compressed for delivery, the 60 fps stream may exhibit compression artifacts that wouldn’t be present in a 30 fps stream, which degrades visual quality and viewer quality of experience (QoE).
Unless your content really needs 50/60 fps, you should go with 25/30 fps.
What Frame Rate Should I Use at the Encoder?
Typically, you’ll want to match the source. However, if you don’t have sufficient outbound bandwidth to support a high-quality stream at the source frame rate, consider cutting the frame rate in half, say from 60 fps to 30 fps.
As an example, if you were shooting a soccer match for live and on-demand presentation, you may shoot and capture locally at 60 fps but live stream at 30 fps.
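Cutting the frame rate in half is conceptually just keeping every other frame; here’s a toy sketch (real encoders handle this internally):

```python
# Decimate 60 fps to 30 fps by keeping every other frame.
source_frames = list(range(12))  # stand-ins for 12 captured frames
halved = source_frames[::2]      # keep frames 0, 2, 4, ...

print(halved)  # [0, 2, 4, 6, 8, 10]
```

Because 30 divides evenly into 60, the surviving frames are evenly spaced and motion stays smooth.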
Another consideration is the frame rate accepted by the delivery service that you’re using. Both Wowza Streaming Engine and Wowza Streaming Cloud can accept up to 60 fps.
So can YouTube Live, but if you’re streaming to Facebook Live you may be limited to 30 fps unless you’re using an encoder that plugs into Facebook’s application programming interface (API) and can stream up to 60 fps.
What Frame Rate Should I Use at the Transcoder?
Here’s where things get interesting. When you deliver your video using adaptive streaming, your transcoder creates multiple outputs in what’s typically called an encoding ladder.
Apple’s recommended encoding ladder from its HLS Authoring Specification is shown below. As you can see, Apple recommends using the source frame rate for the high-bandwidth, high-quality streams at the bottom of the ladder, but suggests dropping the frame rate for the lower-bandwidth, lower-quality streams at the top of the ladder.
16:9 Aspect Ratio    H.264/AVC Bitrate (Kbps)    Frame Rate
416 x 234            145                         ≤ 30 fps
640 x 360            365                         ≤ 30 fps
768 x 432            730                         ≤ 30 fps
768 x 432            1,100                       ≤ 30 fps
960 x 540            2,000                       same as source
1280 x 720           3,000                       same as source
1280 x 720           4,500                       same as source
1920 x 1080          6,000                       same as source
1920 x 1080          7,800                       same as source

HLS Encoding Targets
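A ladder like this can be expressed as plain data. The sketch below encodes Apple’s rungs along with a hypothetical rung_fps helper (my name, not Apple’s) that caps the sub-2,000 Kbps rungs at 30 fps and passes the source rate through otherwise:

```python
# Apple's HLS ladder as (width, height, bitrate_kbps) tuples.
LADDER = [
    (416, 234, 145), (640, 360, 365), (768, 432, 730), (768, 432, 1100),
    (960, 540, 2000), (1280, 720, 3000), (1280, 720, 4500),
    (1920, 1080, 6000), (1920, 1080, 7800),
]

def rung_fps(bitrate_kbps: int, source_fps: int) -> int:
    """Cap the low-bandwidth rungs at 30 fps; otherwise match the source."""
    return min(source_fps, 30) if bitrate_kbps < 2000 else source_fps

for width, height, kbps in LADDER:
    print(f"{width}x{height} @ {kbps} Kbps -> {rung_fps(kbps, 60)} fps")
```

With a 60 fps source, the four lowest rungs come out at 30 fps and the rest stay at 60 fps; with a 30 fps source, every rung simply matches the source.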
This is because at lower bitrates, you may want to reduce the frame rate to show fewer frames at a higher quality, essentially trading off smoothness of motion for per-frame clarity.
For example, at 640×360 resolution and a bandwidth of 365 Kbps, the individual frames presented at 15 fps would be much clearer than frames presented at 60 fps.
When dropping the frame rate for lower bitrate rungs on the ladder, it’s important to use rates that divide evenly into the source frame rate to ensure smoothness.
For example, when dropping from 30 fps to 10 fps, the transcoder drops two of every three frames, so the remaining frames are evenly spaced.
When dropping from 30 fps to 12 fps, however, the transcoder would have to drop an irregular pattern of frames, which would produce jerky motion during playback.
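The even-versus-uneven division can be sketched as follows; kept_frames is a hypothetical helper that maps each output frame to the nearest earlier source frame:

```python
# Which source-frame indices survive decimation to a lower frame rate.
def kept_frames(source_fps: int, target_fps: int, n_out: int = 10) -> list:
    step = source_fps / target_fps  # source frames per output frame
    return [int(k * step) for k in range(n_out)]

even = kept_frames(30, 10)    # step 3.0: every gap is exactly 3 frames
uneven = kept_frames(30, 12)  # step 2.5: gaps alternate between 2 and 3

print(even)    # [0, 3, 6, 9, 12, 15, 18, 21, 24, 27]
print(uneven)  # [0, 2, 5, 7, 10, 12, 15, 17, 20, 22]
```

The uneven spacing in the second case is exactly what viewers perceive as jerky motion.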
Overall, when choosing your streaming fps, you should consider both the source video and the individual rungs on your encoding ladder.