FFmpeg 0.11.5
Since 0.7

Overlay one video on top of another.

It takes two inputs and one output. The first input is the "main" video on which the second input is overlaid.

It accepts the parameters: x:y[:options].

x is the x coordinate of the overlaid video on the main video, and y is the y coordinate. x and y are expressions that can contain the following parameters:

main_w, main_h

main input width and height

W, H

same as main_w and main_h

overlay_w, overlay_h

overlay input width and height

w, h

same as overlay_w and overlay_h
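
For example, expressions like the following should draw the overlaid video at the center of the main video:

# draw the overlay centered on the main video
overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2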

options is an optional list of key=value pairs, separated by ":".

The description of the accepted options follows.

rgb

If set to 1, force the filter to accept inputs in the RGB color space. Default value is 0.
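
For example, the following should force RGB processing while drawing the overlay in the top-left corner:

# process the inputs in RGB and place the overlay at the top-left corner
overlay=0:0:rgb=1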

Be aware that frames are taken from each input video in timestamp order. Hence, if their initial timestamps differ, it is a good idea to pass the two inputs through a setpts=PTS-STARTPTS filter to have them begin at the same zero timestamp, as is done in the example for the movie filter.
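
For example, a minimal sketch (file names assumed) that resets both inputs' timestamps before overlaying them:

# make both inputs start at timestamp zero, then overlay the second
# input near the top-left corner of the first
ffmpeg -i main.mp4 -i over.mp4 -filter_complex
'[0:v] setpts=PTS-STARTPTS [main]; [1:v] setpts=PTS-STARTPTS [over]; [main][over] overlay=10:10' output.mp4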

Some examples follow:

# draw the overlay at 10 pixels from the bottom right
# corner of the main video.
overlay=main_w-overlay_w-10:main_h-overlay_h-10

# insert a transparent PNG logo in the bottom left corner of the input
ffmpeg -i input -i logo -filter_complex 'overlay=10:main_h-overlay_h-10' output

# insert 2 different transparent PNG logos (second logo on bottom
# right corner):
ffmpeg -i input -i logo1 -i logo2 -filter_complex
'overlay=10:H-h-10,overlay=W-w-10:H-h-10' output

# add a transparent color layer on top of the main video,
# WxH specifies the size of the main input to the overlay filter
color=red@.3:WxH [over]; [in][over] overlay [out]
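
Here red@.3 specifies a red color with an alpha (opacity) of 0.3. Assuming, for instance, a 640x480 main input, the same graph with concrete values would read:

color=red@.3:640x480 [over]; [in][over] overlay [out]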

You can chain together more overlays, but the efficiency of such an approach is yet to be tested.
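
For instance, extending the two-logo example above with a third logo (file name assumed) placed in the top-right corner:

ffmpeg -i input -i logo1 -i logo2 -i logo3 -filter_complex
'overlay=10:H-h-10,overlay=W-w-10:H-h-10,overlay=W-w-10:10' output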