Understanding Codecs, Proxies, and Transcoding

Overview

This guide explains why it’s important to understand which codec you are working in and how it impacts the Post Production process.

Picking a Camera.  Picking a Codec.

The codec of a file refers to how information from the sensor is encoded and stored onto your memory card. Every camera and camera manufacturer has its own flavor of encoding. When you pick a camera, you pick a codec, which means you’ve already set certain parameters for your flexibility in Post Production.

Because you are recording onto an SD, CF, or SxS card with a limited amount of space, camera manufacturers have to decide what information from the sensor can be discarded to reduce file size while still giving the user a good quality image.

What information is thrown out and what is retained is different for every camera. To understand how your camera is compressing information, check which codec your camera shoots.

Understanding some basic terminology can help you pick the right camera for your project.   

  • Codec: how a device compresses data
  • Chroma Subsampling: how color is sampled per pixel per color channel (e.g., 4:4:4, 4:2:2, 4:2:0), which determines how much color information is stored

NLEPrep_02.png

  • Bit Depth: the camera sensor’s ability to capture fine differences in Chroma and Luma values; more bits mean more distinct values per channel (a rough data-rate sketch follows this list)

NLEPrep_03.png

  • Native Codec: the codec the camera uses when recording
  • Intermediate Codec: a codec you transcode to for editing; it is less taxing on your system, allowing for more consistent playback and a streamlined workflow in color correction and VFX
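
To get a feel for what these terms mean in practice, here is a rough, back-of-the-envelope Python sketch. The function name, resolutions, and frame rates are illustrative assumptions, not figures from any particular camera; it only estimates the uncompressed data rate implied by a given bit depth and chroma subsampling.

# Rough estimate of an uncompressed video data rate. Real codecs compress
# far below these numbers, which is exactly why cameras discard information.
def uncompressed_mbps(width, height, fps, bit_depth, subsampling="4:2:2"):
    # Chroma subsampling keeps fewer color samples per 4-pixel group:
    # 4:4:4 keeps all of them, 4:2:2 keeps half, 4:2:0 keeps a quarter.
    chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[subsampling]
    # Each pixel carries one luma sample plus two (subsampled) chroma samples.
    samples_per_pixel = 1 + 2 * chroma_fraction
    bits_per_frame = width * height * samples_per_pixel * bit_depth
    return bits_per_frame * fps / 1_000_000  # megabits per second

print(round(uncompressed_mbps(1920, 1080, 24, 8, "4:2:0")))   # ~597 Mbps
print(round(uncompressed_mbps(3840, 2160, 24, 10, "4:2:2")))  # ~3981 Mbps

Even before compression, moving from 8-bit 4:2:0 HD to 10-bit 4:2:2 UHD multiplies the data rate several times over, which is why the codec a camera records in matters so much.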

What is Transcoding?

Transcoding is the process of converting your video from one file format to another. For example, if the native codec of your camera is H.264 and you place your footage into Adobe Media Encoder, Prelude, or any other software to change its codec to Apple ProRes 422, that process is called Transcoding. Transcoding never adds quality to your footage, but it can lose quality. It’s important to understand the differences between codecs to ensure that you aren’t compromising the quality of your media.
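
As a concrete sketch of what that conversion looks like under the hood, the snippet below hands a clip to the free command-line tool ffmpeg from Python. It assumes ffmpeg is installed and on your PATH, and the file names are placeholders; Adobe Media Encoder or Prelude does the same job through a GUI.

# Minimal sketch: transcode an H.264 camera file to Apple ProRes 422.
# Assumes ffmpeg is installed; "A001_clip.mp4" is a placeholder file name.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "A001_clip.mp4",   # source file in the camera's native codec
    "-c:v", "prores_ks",     # ffmpeg's ProRes encoder
    "-profile:v", "2",       # profile 2 = standard ProRes 422
    "-c:a", "pcm_s16le",     # uncompressed audio, typical in a ProRes .mov
    "A001_clip_prores.mov",
], check=True)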

When Should I Transcode?

If you shot in a codec that uses interframe compression, you should probably transcode your media. Even at standard HD resolutions, intraframe video is easier to work with in whatever editing software you use. It also ensures that you have a consistent codec within your project, which can make moving into programs like DaVinci Resolve or After Effects much smoother.

Transcoding creates new media that is much larger in file size than the original. It is highly encouraged that you keep both your original and transcoded media (in separate folders) in case of corruption or transcoding errors.

If you’ve shot 4K video, those files tend to be very large and can slow down playback significantly; transcoding to lower-resolution files, called Proxies, can make editing smoother. If you plan on using a Proxy workflow, it’s important to test reconnecting back to your original media, either within your editing software or in a program like DaVinci Resolve.
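
If you do go the proxy route, a simple batch pass might look like the sketch below. The folder layout, the 1280-pixel width, and the ProRes Proxy profile are assumptions to adjust for your own project, and ffmpeg must be installed.

# Minimal sketch: create low-resolution proxy files for every clip in a folder.
import subprocess
from pathlib import Path

source_dir = Path("Footage/Original")
proxy_dir = Path("Footage/Proxies")
proxy_dir.mkdir(parents=True, exist_ok=True)

for clip in sorted(source_dir.glob("*.mp4")):
    out = proxy_dir / (clip.stem + "_proxy.mov")
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=1280:-2",                   # shrink to 1280 px wide, keep aspect
        "-c:v", "prores_ks", "-profile:v", "0",   # profile 0 = ProRes Proxy
        "-c:a", "pcm_s16le",
        str(out),
    ], check=True)

Keeping the original clip name inside the proxy name (only adding a suffix) makes it easier for your editing software to relink back to the full-resolution media later.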

Interframe vs. Intraframe

There are two basic compression schemes that codecs use: interframe and intraframe. Most cameras shoot with interframe codecs.

Interframe compression helps save space when recording onto memory cards by storing full frames only periodically and, for the frames in between, recording only what has changed rather than the entire frame.

Intraframe codecs, on the other hand, store every single frame of the video in full. These formats are excellent for editing because the program does not have to do any work to "decode" what has changed between frames - all the data is right there.

If you've ever had to render frequently or experienced extreme stuttering while scrubbing through your footage in Avid or Premiere, it's probably because you did not transcode from the interframe codec used by your camera.

Interframe_vs._Intraframe.gif
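
To make the trade-off concrete, here is a toy Python sketch. It is purely illustrative - real codecs are far more sophisticated - but it shows why interframe files are smaller on the card yet harder to scrub through, while intraframe files are larger but let any frame be decoded on its own.

# Toy illustration only: "frames" here are just short lists of pixel values.
frames = [
    [10, 10, 10, 10],
    [10, 12, 10, 10],   # one pixel changed from the previous frame
    [10, 12, 10, 11],   # one pixel changed again
]

# "Intraframe": every frame is stored whole, so any frame decodes by itself.
intraframe_store = [list(f) for f in frames]

# "Interframe": store the first frame whole, then only the changed pixels.
interframe_store = [list(frames[0])]
for prev, cur in zip(frames, frames[1:]):
    diffs = [(i, v) for i, (p, v) in enumerate(zip(prev, cur)) if p != v]
    interframe_store.append(diffs)   # tiny to store, but decoding a later
                                     # frame means replaying every diff first

print(intraframe_store)   # 12 stored values
print(interframe_store)   # 4 stored values plus 2 one-pixel diffs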

Which Codec Should I Use?

Knowing the right codec to use in every situation isn't practical - you may have to do some research to find which one is best for what you're trying to do. There are, however, some general guidelines for a few of the more common situations.

  • Uploading to YouTube or Vimeo - Uploading to the web can take a long time and produce poor results if you use the wrong codec. We recommend exporting using H.264 - this codec will produce very small files without much deterioration in quality (see the sample export sketch after this list).
  • Screening for a Class - If you have been asked to produce a video to be screened in class, we recommend exporting using Apple ProRes 422. This will produce a file that will play smoothly on just about any computer. Note that if your professor recommends a different codec, that advice supersedes ours.
  • Transcoding from an Interframe codec or making a Proxy - If you're transcoding to ensure smooth playback during editing, use Apple ProRes 422 or DNxHD 115/DNxHR SQ.
  • Screening in a Theater - For BFA Screenings, BA Capstone screenings, or submission to some film festivals, you may be required to produce a DCP. For specifications see the BA/BFA Class Support page.
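
For the web-upload case, a hedged starting point might look like the sketch below. These settings are reasonable defaults, not official YouTube or Vimeo specifications, and "final_cut.mov" is a placeholder file name.

# Minimal sketch: export an H.264 file suitable for uploading to the web.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "final_cut.mov",
    "-c:v", "libx264",         # H.264 encoder
    "-crf", "18",              # quality-based rate control; lower = better quality
    "-preset", "slow",         # slower encode, more efficient compression
    "-pix_fmt", "yuv420p",     # 8-bit 4:2:0, the most widely compatible choice
    "-c:a", "aac", "-b:a", "192k",
    "final_cut_web.mp4",
], check=True)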

How Can I Check Which Codec I'm Using?

For many cameras, like the Canon 60D, you can tell what codec your files are using in QuickTime Player. Simply open the file in QuickTime and press CMD+I to bring up the inspector window. One of the items listed should be the codec of the video.

Identifying_Codec.jpg

For files that do not open in QuickTime, you can do the same in VLC Player, or check from the command line as in the sketch below.
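
The command-line tool ffprobe (installed alongside ffmpeg) reports the codec directly; the snippet below is one way to call it from Python, with a placeholder file name.

# Minimal sketch: ask ffprobe which codec and pixel format a clip uses.
import subprocess

result = subprocess.run([
    "ffprobe", "-v", "error",
    "-select_streams", "v:0",                      # first video stream only
    "-show_entries", "stream=codec_name,pix_fmt",  # codec plus chroma/bit-depth info
    "-of", "default=noprint_wrappers=1",
    "MVI_0001.MOV",                                # placeholder file name
], capture_output=True, text=True, check=True)

print(result.stdout)   # e.g. codec_name=h264 / pix_fmt=yuv420p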
