
Example ffmpeg commands

by T.J. Nelson

Ffmpeg is a powerful command-line tool for manipulating video files and movies. However, the documentation is confusing. Here are some tested commands for performing typical tasks.

Normally, you'd do this in Linux, but there's also a Windows version of ffmpeg. To use it, click the Start button and type cmd.exe, hit Enter, then type the commands in the little black command window that pops up. To change to a different directory or “folder” in a Windows command box, use the cd command. If a filename contains spaces, you usually have to put it in double quotes (the Windows cd command doesn't seem to need them).
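
For example (the directory name here is just a placeholder, and this assumes ffmpeg.exe is on your PATH or in the same folder):
cd "C:\Users\yourname\Videos"
ffmpeg -i "my clip.wmv" clip.mp4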

Extract frames from a movie
This example extracts the first 2 seconds of a movie in video21.wmv into individual image files. The files will be called img-0001.png, img-0002.png, img-0003.png, etc. It is best to do this in a separate directory.
ffmpeg -i video21.wmv -r 30 -t 2 -f image2 img-%04d.png

For wmv files it's sometimes necessary to specify the frame rate, in this case 30 (-r 30). The parameter '-f image2' tells ffmpeg to write the output as an "image2" image sequence; normally it's not necessary to specify it, but if the format isn't detected automatically, or if the extension is wrong, it may be needed.
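
If you only want a sample of the frames rather than every one, the fps filter should do it. For example, one frame per second (this variation is standard filter usage, not one of the tested commands above):
ffmpeg -i video21.wmv -vf fps=1 img-%04d.png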

A typical command line would be
ffmpeg -i video21.wmv -f image2 %04d.png

Combine individual frames into a movie
Create an MP4 movie from JPEG files with filenames 001.jpg, 002.jpg, etc. This example creates an mp4 with 10 frames per second and 1800 kbps.
ffmpeg -r 10 -i %03d.jpg -b:v 1800k test.mp4
The %03d is 'C' language notation for a zero-padded 3-digit integer. It's a pattern, not a literal filename; the actual files should be named 001.jpg, 002.jpg, and so on, without a percent sign or a 'd' in the name.

This example creates an mpeg at the default frame rate of 25 fps; the -qscale:v 1 option gives high quality instead of the default 200 kbps bit rate.
ffmpeg -i %03d.jpg -qscale:v 1 test.mpeg
ffmpeg figures out what movie format to use based on the filename extension that you specify.

Sometimes you have to set the bitrate to get a good quality movie. Here -b:v 4000k sets the video bit rate to 4000 kbps. In this example, the input filenames must start with 'frame' followed by a dash, then five zero-padded digits, then a dot, then 'jpg'.
ffmpeg -i frame-%05d.jpg -b:v 4000k -qscale:v 1 test.mpeg

Notes on creating movies

1. File names must match the pattern you give and start at 000 or 001. If they start at some other number, renumber them or use the -start_number option (see the example after the table below). If they have a prefix (like "img-"), include the prefix in the pattern or ffmpeg won't see them. If there's a missing number in the sequence, ffmpeg will just stop and create a truncated movie.

2. To get good quality, it's necessary to use the -qscale 1 option. This gives a variable bitrate but constant quality (1 = good, 31 = bad). Example:
ffmpeg -i %04d.png -qscale:v 1 test.MOV

Here are the results I got with a test file:

qscale    File size (bytes)    Quality
1         5826198              OK
2         5859146              OK
4         2651676              not too bad
7         1657662              bad
10        1341036              blocky
15        1109092              terrible
20         987848              terrible
30         895896              terrible
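
As mentioned in note 1, if the numbering starts somewhere other than 0 or 1, the image2 demuxer's -start_number option should handle it. For example, for files starting at img-0037.png (a sketch, not one of the tested commands above):
ffmpeg -start_number 37 -i img-%04d.png -qscale:v 1 test.MOV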

If the filenames aren't in numeric sequence, use this Linux bash script to rename them:

# bash script to rename images to a numeric filename sequence
a=1
for i in *.png; do
    new=$(printf "%04d.png" "${a}")
    echo "renaming ${i} to ${new}"
    mv -- "${i}" "${new}"
    a=$((a+1))
done

Creating animated GIFs in ffmpeg
ffmpeg -i %04d.png -s 288x162 test.gif

creates an animated GIF (GIF89a format) file that's playable in a web browser. Set the image size with the -s option to keep the animated file to a reasonable size. GIF files are only 8 bits/pixel regardless of the pixel depth of the starting files, so the quality is not very good. The qscale option is ignored. There are many options for setting the number of times it loops and for overlaying it on another video.
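
For looping, the GIF muxer's -loop option is documented to control it: 0 (the default) loops forever and -1 plays the animation once. For example (a sketch, not one of the tested commands above):
ffmpeg -i %04d.png -s 288x162 -loop -1 test.gif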

It's often more convenient to do it in 3 steps.

  1. Create an MPEG movie. This example creates one from a series of images, then shrinks it to 288 × 162 pixels.
    ffmpeg -i %04d.png -s 288x162 test.mp4
  2. Generate a palette with the palettegen filter; the two variants below use lanczos scaling or error-diffusion dithering. It creates a small palette file in PNG image format.
    ffmpeg -i test.mp4 -vf scale=900:-1:flags=lanczos,palettegen palette.png
    ffmpeg -i test.mp4 -vf scale=900:-1:sws_dither=ed,palettegen palette2.png
  3. Convert the MP4 into an animated GIF using this rather complicated command:
    ffmpeg -i test.mp4 -i palette2.png -filter_complex "fps=30,scale=250:-1:flags=lanczos[x];[x][1:v]paletteuse" test.gif

Note that the fps parameter doesn't really change the frame rate: it works by deleting intermediate frames rather than slowing the playback down. I could get none of the other options (-r, -default_delay, -max_gif_delay, -min_delay, etc.) to work either; -r drops frames and the others have no effect.

The animated GIF file was 206774 bytes, while the MPEG file was 43008 bytes.

GIF is pretty much obsolete, so it's much better to forget about animated GIF and use the HTML5 video element:
<video width="320" height="240" controls>
   <source src="movie.mp4" type="video/mp4">
   <source src="movie.ogg" type="video/ogg">
   Your browser does not support the video tag.
</video>

In the above HTML, the browser checks each source line to see whether there's a format it supports; if not, it prints the last line. Unfortunately, the default MP4 produced by ffmpeg isn't compatible with HTML5. Last I checked, getting ffmpeg to compile in Linux with H.264 support was a royal pain. So, for the time being, I use the old way, which makes the browser start up Windows Media Player in Windows, Mplayer in Linux, or whatever add-on your browser is set to use. Example:
<a href="fireflies/fireflies-cropped.avi">fireflies-cropped.avi</a>
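
If your ffmpeg was built with libx264 (most distribution packages now include it), something along these lines should produce an HTML5-friendly MP4. This is a sketch, with -pix_fmt yuv420p and -movflags +faststart added for broader player compatibility and faster startup over the web:
ffmpeg -i input.MOV -c:v libx264 -pix_fmt yuv420p -movflags +faststart movie.mp4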

Changing image file format
Sometimes you want to convert the files into JPEGs first. This one-liner uses ImageMagick's convert command to do the conversion and change the extension of each file from .png to .jpg. It works in Linux; note that scripting won't work in Windows unless you have a shell like Bash installed.
for f in *.png ; do convert -quality 100 "$f" "$(basename "$f" png)jpg"; done
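
If ImageMagick isn't installed, ffmpeg itself can do roughly the same job on a numbered sequence (the filename patterns here are just examples); -qscale:v 2 keeps the JPEG quality high:
ffmpeg -i %04d.png -qscale:v 2 %04d.jpg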

Find what file formats are supported

ffmpeg -formats
man ffmpeg-all
man ffmpeg-formats
man ffmpeg-filters

Get help

ffmpeg -h

Convert movie from WMV to mp4 format
Ffmpeg determines what output file format you want from the extension. If your output filename doesn't have the right extension, bad things will happen.
ffmpeg -i video04.wmv -f mp4 -strict -2 -t 5 a.mp4
ffmpeg -i output2.MOV -strict -2 test.mp4

It's recommended to use the -qscale 1 option whenever possible to make sure ffmpeg doesn't over-compress the result, which gives poor quality. This line converts an AVI (RIFF format) file to Apple QuickTime format. Windows Media Player can't read the AVI files from the later (4.1) versions of ffmpeg, but has no trouble with the earlier ones.
ffmpeg -i 9980.avi -qscale 1 9980-merged.MOV

Resize a movie
Resize a movie input.MOV to 640 × 480 pixels. A movie file titled output.MOV is produced.
ffmpeg -i input.MOV -vf scale=640:480 output.MOV

Crop a movie
ffmpeg -i input.MOV -vf crop=100:110:200:80 output.MOV
ffmpeg -i input.MOV -vf crop=in_w:in_h/2:0:in_h/2 output.MOV

The parameters are width:height:x:y in pixels, where x and y are the top-left corner of the region to keep. The first command keeps a 100 × 110 pixel region whose top-left corner is at (200, 80). The second command saves only the bottom half of your movie.
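
If you leave off x and y, the crop filter centers the cropped region automatically (standard crop behavior, not from the tests above). For example, to keep a 640 × 480 window from the middle of the frame:
ffmpeg -i input.MOV -vf crop=640:480 -qscale 1 output.MOV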

Editing a movie
Extract a section out of the movie, saving only the five seconds between 70 and 75 in output.MOV.

ffmpeg -i input.MOV -ss 00:01:10 -t 00:00:05 -c:v copy -c:a copy output.MOV

The -c:v copy -c:a copy options make it faster by copying the video and audio streams instead of decoding and re-encoding them. You could also use -vf trim=70:75, but this doesn't reset the time stamps, so viewers will just see a black screen for the first 70 seconds. Supposedly the setpts filter can fix this, but I couldn't get it to work.
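
For reference, the usual way the setpts approach is written looks like this (a sketch I haven't verified here; atrim and asetpts do the same thing for the audio track):
ffmpeg -i input.MOV -vf "trim=70:75,setpts=PTS-STARTPTS" -af "atrim=70:75,asetpts=PTS-STARTPTS" output.MOV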

Another way to copy 12 seconds from a movie starting at 1 minute 15 seconds, put in test.MOV:
ffmpeg -i input.MOV -ss 00:01:15.0 -codec copy -t 12 test.MOV

(Note: In older versions of ffmpeg this command crops a few hundred milliseconds from the beginning of the movie. If those are important, upgrade to 4.1 or merge the movie with something else first.)

Concatenating two or more movies
To merge an arbitrary number of video files:
ffmpeg -f concat -i list.txt -c copy merged.MOV

The list file must have the filenames, one per line, preceded by the word 'file'.
file test01.MOV
file test02.MOV

Both video and audio tracks are merged.
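
If there are many files, a bash one-liner can generate list.txt (this assumes the filenames contain no spaces and sort correctly by name):
for f in test*.MOV; do echo "file $f"; done > list.txt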

With this technique, I've had trouble with extra unwanted frames showing up in the concatenated movie. So far, the only way I've been able to get exactly the frames that are needed is to split the movie into individual image files, delete the bad ones, and create a new movie from the images. Unfortunately, this eliminates the sound.

Retrieving metadata from a movie
This reads the metadata and prints it on the screen. As with all ffmpeg commands, there are many options (man ffprobe). ExifTool gives a lot more information.
ffprobe DSC_6881.MOV

  Metadata:
    major_brand     : qt
    minor_version   : 537331968
    compatible_brands: qt niko
    creation_time   : 2015-06-09 01:10:21
  Duration: 00:01:41.35, start: 0.000000, bitrate: 18896 kb/s
    Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, bt470bg/unknown/bt470m), 1920x1080 [SAR 1:1 DAR 16:9], 17339 kb/s, 23.98 fps, 23.98 tbr, 24k tbn, 47.95 tbc (default)
    Metadata:
      creation_time   : 2015-06-09 01:10:21
    Stream #0:1(eng): Audio: pcm_s16le (sowt / 0x74776F73), 48000 Hz, 2 channels, s16, 1536 kb/s (default)
    Metadata:
      creation_time   : 2015-06-09 01:10:21
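
For output that's easier to read or to feed to a script, ffprobe's -show_format and -show_streams options should work:
ffprobe -v error -show_format -show_streams DSC_6881.MOV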

Filtering a movie
It is possible to split a movie into frames, process each individual frame in an image analysis program, and then re-assemble it into a movie. But this gets tedious after the first few hundred thousand frames.

Brightening, changing the gamma, inverting, and many other functions are available in ffmpeg through the filter option. Filtering uses the -vf option followed by a series of commands. They can be very simple, but don't forget the -qscale 1 option or you'll get garbage.

To resize a movie to 320 × 240 pixels:
ffmpeg -i input.MOV -vf scale=320:240 -qscale 1 output.MOV

To invert the colors in a movie:
ffmpeg -i output2.MOV -vf lutrgb="r=negval:g=negval:b=negval" -qscale 1 output3.MOV

To increase brightness by a factor of four:
ffmpeg -i output2.MOV -vf lutyuv=y=val*4 -qscale 1 output3.MOV

To lighten a movie while reducing the contrast:
ffmpeg -i DSC_2650.MOV -vf lutyuv=y=val+30 -qscale 1 DSC-2650a.mov

To increase red by a factor of two:
ffmpeg -i output2.MOV -vf lutrgb=r=val*2 -qscale 1 output3.MOV

To increase gamma by a factor of 5:
ffmpeg -i output2.MOV -vf 'lutyuv=y=gammaval(0.2)' -qscale 1 output3.MOV
Single quotes are needed to prevent the shell from messing with the command because it contains parentheses.

To rotate a movie by 45 degrees:
ffmpeg -i output2.MOV -vf rotate=45 -qscale 1 output3.MOV

To sharpen or blur a movie (-qscale 1 is omitted for clarity):
ffmpeg -i output2.MOV -vf unsharp output3.MOV
ffmpeg -i output2.MOV -vf unsharp=7:7:-2:7:7:-2 output3.MOV
ffmpeg -i output2.MOV -vf unsharp=5:5:1.5:5:5:0.0 output3.MOV


The defaults for unsharp are 5:5:1.0:5:5:0.0.
1st = kernel of luma filter x size (odd 3 to 63)
2nd = kernel of luma filter y size (odd 3 to 63)
3rd = amount of luma filtering (−1.5 to 1.5 but can be any number); negative=blur, positive=sharpen
4th = kernel of chroma filter x size (odd 3 to 63)
5th = kernel of chroma filter y size (odd 3 to 63)
6th = amount of chroma filtering (−1.5 to 1.5 but can be any number); negative=blur, positive=sharpen

To draw a box or grid on the movie:
ffmpeg -i output2.MOV -vf drawbox=x=10:y=20:w=200:h=60:color=red@0.5 output3.MOV

ffmpeg -i output2.MOV -vf drawgrid=width=100:height=100:thickness=2:color=red@0.5 output3.MOV


More complex filters
Filters can also be very complex. Many seemingly simple operations require splitting the processing stream. This example crops and flips half of the image. This information is from the man page (man ffmpeg-filters).

               [main]
     input --> split ---------------------> overlay --> output
                 |                             ^
                 |[tmp]                  [flip]|
                 +-----> crop --> vflip -------+

The input is split into two streams. One stream goes through the crop filter and the vflip filter, and is then merged back with the other stream by overlaying it on top. The start and end of each path require labels enclosed in square brackets. All these commands go on a single line, not broken up as shown here.

ffmpeg -i inputmovie -vf "split [main][tmp]; [tmp] crop=iw:ih/2:0:0, vflip [flip]; [main][flip] overlay=0:H/2" outputmovie

Here are some examples of filtering. The first one uses the YUV look-up table filter to multiply the luminance by a factor of 5, which can be useful for making extremely dark images brighter. There are also commands for changing the gamma and contrast.

ffmpeg -i DSC_6881.MOV -vf "split [main][tmp]; [tmp] lutyuv=y=val*5 [tmp2]; [main][tmp2] overlay" output.MOV

This example raises the gamma. In ffmpeg, a value less than 1.0 makes dark areas lighter and a value above 1.0 makes them darker, which is the opposite of what you'd expect:
ffmpeg -i DSC_6881.MOV -vf "split [main][tmp]; [tmp] lutyuv=y=gammaval(0.6) [tmp2]; [main][tmp2] overlay" output.MOV

The RGB look-up table filter is similar, and allows you to do stuff to the red, green, and blue channels separately. In this case we invert them to make a negative image.

ffmpeg -i DSC_6887.MOV -vf "split [main][tmp]; [tmp] lutrgb=r=negval:g=negval:b=negval [tmp2]; [main][tmp2] overlay" output.MOV

This example denoises a movie file. This helps reduce those rectangular compression artifacts.
ffmpeg -i output2.MOV -vf "split [main][tmp]; [tmp] dctdnoiz=4.5 [tmp2]; [main][tmp2] overlay" output3.MOV

There are numerous other filters, such as deshake, delogo, drawtext, fade, lens correction, rotate, subtitles, and the fft filter. Some I could get to work, some (like drawtext) I couldn't, and some take a very long time to run.
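
For instance, the fade filter's positional syntax is type:start_frame:number_of_frames; a fade-in over the first 30 frames would look something like this (a sketch, not one of the tested commands above):
ffmpeg -i output2.MOV -vf fade=in:0:30 -qscale 1 output3.MOV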

Combining filters
You can put many filters together in the same command. The following rules apply:

1. Within a single -vf chain, separate the filters with commas; they are applied in order from left to right.
2. Separate independent chains with semicolons and connect them with labels in square brackets, as in the split/overlay examples above.
3. Put the whole filter string in quotes so the shell doesn't interpret the special characters.
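
For example, this chains a resize, a brightness boost, and a sharpen in one pass (an illustrative combination, not one of the tested commands above):
ffmpeg -i input.MOV -vf "scale=640:480,lutyuv=y=val*2,unsharp" -qscale 1 output.MOV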


jun 11 2015; last updated jan 02 2020
