diff --git a/index.html b/index.html
index 94f62a8..26967cf 100644
--- a/index.html
+++ b/index.html
@@ -131,9 +131,9 @@

Filtergraphs

Many FFmpeg commands use filters that manipulate the video or audio stream in some way: for example, hflip to horizontally flip a video, or amerge to merge two or more audio tracks into a single stream.

-The use of a filter is signalled by the flag -vf (video filter) or -af (audio filter), followed by the name and options of the filter itself. For example, take the convert colourspace command:

+The use of a filter is signaled by the flag -vf (video filter) or -af (audio filter), followed by the name and options of the filter itself. For example, take the convert colorspace command:

ffmpeg -i input_file -c:v libx264 -vf colormatrix=src:dst output_file

-Here, colormatrix is the filter used, with src and dst representing the source and destination colourspaces. This part following the -vf is a filtergraph.

+Here, colormatrix is the filter used, with src and dst representing the source and destination colorspaces. The part following -vf is a filtergraph.

It is also possible to apply multiple filters to an input, which are sequenced together in the filtergraph. A chained set of filters is called a filterchain, and a filtergraph may include multiple filterchains. Filters in a filterchain are separated from each other by commas (,), and filterchains are separated from each other by semicolons (;). For example, take the inverse telecine command:

ffmpeg -i input_file -c:v libx264 -vf "fieldmatch,yadif,decimate" output_file

Here we have a filtergraph including one filterchain, which is made up of three video filters.
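
A filtergraph with more than one filterchain requires labeled intermediate pads; as an illustrative sketch (not a command from the original entry), this splits the video into two copies, flips one copy, and stacks the two side by side:

ffmpeg -i input_file -filter_complex "[0:v]split[a][b];[a]hflip[flipped];[b][flipped]hstack" output_file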

@@ -161,7 +161,7 @@

Stream mapping

-Stream mapping is the practice of defining which of the streams (e.g., video or audio tracks) present in an input file will be present in the output file. FFmpeg recognises five stream types:

+Stream mapping is the practice of defining which of the streams (e.g., video or audio tracks) present in an input file will be present in the output file. FFmpeg recognizes five stream types:

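In FFmpeg's stream specifier syntax these are v (video), a (audio), s (subtitle), d (data) and t (attachment). As an illustrative sketch of mapping (not a command from the original entry), this keeps the first video stream and the second audio stream while copying both without reencoding:

ffmpeg -i input_file -map 0:v:0 -map 0:a:1 -c copy output_file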
-Transcode video to a different colourspace

-This command uses a filter to convert the video to a different colour space.

+Transcode video to a different colorspace

+This command uses a filter to convert the video to a different colorspace.

ffmpeg -i input_file -c:v libx264 -vf colormatrix=src:dst output_file

ffmpeg
starts the command
-i input file
path, name and extension of the input file
-c:v libx264
tells FFmpeg to encode the video stream as H.264
--vf colormatrix=src:dst
-the video filter colormatrix will be applied, with the given source and destination colourspaces.

+-vf colormatrix=src:dst
+the video filter colormatrix will be applied, with the given source and destination colorspaces.
Accepted values include bt601 (Rec.601), smpte170m (Rec.601, 525-line/NTSC version), bt470bg (Rec.601, 625-line/PAL version), bt709 (Rec.709), and bt2020 (Rec.2020).
For example, to convert from Rec.601 to Rec.709, you would use -vf colormatrix=bt601:bt709.
output file
path, name and extension of the output file
-Note: Converting between colourspaces with FFmpeg can be done via either the colormatrix or colorspace filters, with colorspace allowing finer control (individual setting of colourspace, transfer characteristics, primaries, range, pixel format, etc). See this entry on the FFmpeg wiki, and the FFmpeg documentation for colormatrix and colorspace.

+Note: Converting between colorspaces with FFmpeg can be done via either the colormatrix or colorspace filters, with colorspace allowing finer control (individual setting of colorspace, transfer characteristics, primaries, range, pixel format, etc). See this entry on the FFmpeg wiki, and the FFmpeg documentation for colormatrix and colorspace.
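
As an illustrative sketch of the finer-grained alternative (not a command from the original entry), the colorspace filter's all option sets colorspace, primaries and transfer characteristics together:

ffmpeg -i input_file -c:v libx264 -vf colorspace=all=bt709 output_file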


-Convert colourspace and embed colourspace metadata

+Convert colorspace and embed colorspace metadata

ffmpeg -i input_file -c:v libx264 -vf colormatrix=src:dst -color_primaries val -color_trc val -colorspace val output_file

ffmpeg
starts the command
-i input file
path, name and extension of the input file
-c:v libx264
encode video as H.264
--vf colormatrix=src:dst
-the video filter colormatrix will be applied, with the given source and destination colourspaces.
--color_primaries val
-tags video with the given colour primaries.

+-vf colormatrix=src:dst
+the video filter colormatrix will be applied, with the given source and destination colorspaces.
+-color_primaries val
+tags video with the given color primaries.
Accepted values include smpte170m (Rec.601, 525-line/NTSC version), bt470bg (Rec.601, 625-line/PAL version), bt709 (Rec.709), and bt2020 (Rec.2020).
-color_trc val
tags video with the given transfer characteristics (gamma).
Accepted values include smpte170m (Rec.601, 525-line/NTSC version), gamma28 (Rec.601, 625-line/PAL version) [1], bt709 (Rec.709), bt2020_10 (Rec.2020 10-bit), and bt2020_12 (Rec.2020 12-bit).
@@ -677,17 +677,17 @@

To Rec.709:

ffmpeg -i input_file -c:v libx264 -vf colormatrix=bt601:bt709 -color_primaries bt709 -color_trc bt709 -colorspace bt709 output_file

MediaInfo output examples:

-MediaInfo screenshots of colourspace metadata
+MediaInfo screenshots of colorspace metadata

Using this command it is possible to add Rec.709 tags to a file that is actually Rec.601 (etc), so apply with caution!

These commands are relevant for H.264 and H.265 videos, encoded with libx264 and libx265 respectively.

-Note: If you wish to embed colourspace metadata without changing to another colourspace, omit -vf colormatrix=src:dst. However, since it is libx264/libx265 that writes the metadata, it’s not possible to add these tags without reencoding the video stream.

+Note: If you wish to embed colorspace metadata without changing to another colorspace, omit -vf colormatrix=src:dst. However, since it is libx264/libx265 that writes the metadata, it’s not possible to add these tags without reencoding the video stream.

For all possible values for -color_primaries, -color_trc, and -colorspace, see the FFmpeg documentation on codec options.
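
For instance, an illustrative sketch of tagging a stream as Rec.709 without converting it, as the note above describes:

ffmpeg -i input_file -c:v libx264 -color_primaries bt709 -color_trc bt709 -colorspace bt709 output_file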


[1] Out of step with the regular pattern, -color_trc doesn’t accept bt470bg; it is instead referred to here directly as gamma.
In the Rec.601 standard, 525-line/NTSC and 625-line/PAL video have assumed gammas of 2.2 and 2.8 respectively.

@@ -762,32 +762,32 @@
ffmpeg
starts the command
-i input_file
path, name and extension of the input file
-vf "width:height"
Crops the video to the given width and height (in pixels).
-By default, the crop area is centred: that is, the position of the top left of the cropped area is set to x = (input_width - output_width) / 2, y = input_height - output_height) / 2.

+By default, the crop area is centered: that is, the position of the top left of the cropped area is set to x = (input_width - output_width) / 2, y = (input_height - output_height) / 2.
output_file
path, name and extension of the output file

It's also possible to specify the crop position by adding to your crop filter the x and y coordinates of the top left corner of the cropped area:

ffmpeg -i input_file -vf "crop=width:height[:x_position:y_position]" output_file

Examples

-The original frame, a screenshot of the SMPTE colourbars:

+The original frame, a screenshot of the SMPTE colorbars:

VLC screenshot of Maggie Cheung

-Result of the command ffmpeg -i smpte_coloursbars.mov -vf "crop=500:500" output_file:

+Result of the command ffmpeg -i smpte_colorsbars.mov -vf "crop=500:500" output_file:

VLC screenshot of Maggie Cheung, cropped from original

-Result of the command ffmpeg -i smpte_coloursbars.mov -vf "crop=500:500:0:0" output_file, appending :0:0 to crop from the top left corner:

+Result of the command ffmpeg -i smpte_colorsbars.mov -vf "crop=500:500:0:0" output_file, appending :0:0 to crop from the top left corner:

VLC screenshot of Maggie Cheung, cropped from original

-Result of the command ffmpeg -i smpte_coloursbars.mov -vf "crop=500:300:500:30" output_file:

+Result of the command ffmpeg -i smpte_colorsbars.mov -vf "crop=500:300:500:30" output_file:

VLC screenshot of Maggie Cheung, cropped from original
-Change video colour to black and white

+Change video color to black and white

ffmpeg -i input_file -filter_complex hue=s=0 -c:a copy output_file

-A basic command to alter colour hue to black and white using filter_complex (credit @FFMPEG via Twitter).

+A basic command to alter color hue to black and white using filter_complex (credit @FFMPEG via Twitter).

ffmpeg
starts the command
-i input_file
path, name and extension of the input file
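
hue=s=0 sets the saturation to zero, which strips all color information. Since only one input and one filter are involved here, a plain video filter gives the same result (an illustrative variant, not from the original entry):

ffmpeg -i input_file -vf hue=s=0 -c:a copy output_file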
@@ -1032,7 +1032,7 @@

ffmpeg -i input_1.avi -i input_2.mp4 -filter_complex "[0:v:0][0:a:0][1:v:0][1:a:0]concat=n=2:v=1:a=1[video_out][audio_out]" -map "[video_out]" -map "[audio_out]" output_file

This command takes two or more files of different file types and joins them together into a single file.

The input files may differ in many respects - container, codec, chroma subsampling scheme, framerate, etc. However, the above command only works properly if the files to be combined have the same dimensions (e.g., 720x576). Also note that if the input files have different framerates, then the output file will be of variable framerate.

-Some aspects of the input files will be normalised: for example, if an input file contains a video track and an audio track that do not have exactly the same duration, the shorter one will be padded. In the case of a shorter video track, the last frame will be repeated in order to cover the missing video; in the case of a shorter audio track, the audio stream will be padded with silence.

+Some aspects of the input files will be normalized: for example, if an input file contains a video track and an audio track that do not have exactly the same duration, the shorter one will be padded. In the case of a shorter video track, the last frame will be repeated in order to cover the missing video; in the case of a shorter audio track, the audio stream will be padded with silence.

ffmpeg
starts the command
-i input_1.ext
path, name and extension of the first input file
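
If the inputs do not share dimensions or framerate, they can be conformed within the same filtergraph before the concat; a sketch, with the 720x576 size and 25 fps target chosen purely for illustration:

ffmpeg -i input_1.avi -i input_2.mp4 -filter_complex "[0:v:0]scale=720:576,fps=25[v0];[1:v:0]scale=720:576,fps=25[v1];[v0][0:a:0][v1][1:a:0]concat=n=2:v=1:a=1[video_out][audio_out]" -map "[video_out]" -map "[audio_out]" output_file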
@@ -1136,7 +1136,7 @@
-ss 00:05:00 -t 10
Beginning five minutes into the original video, this command will create a 10-second-long excerpt.
-Note: In order to keep the original timestamps, without trying to sanitise them, you may add the -copyts option.

+Note: In order to keep the original timestamps, without trying to sanitize them, you may add the -copyts option.
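
Combining the options above into one illustrative sketch:

ffmpeg -i input_file -ss 00:05:00 -t 10 -copyts -c copy output_file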

@@ -1337,7 +1337,7 @@

Create centered, transparent text watermark

E.g. for creating access copies with your institution’s name

-ffmpeg -i input_file -vf drawtext="fontfile=font_path:fontsize=font_size:text=watermark_text:fontcolor=font_colour:alpha=0.4:x=(w-text_w)/2:y=(h-text_h)/2" output_file

+ffmpeg -i input_file -vf drawtext="fontfile=font_path:fontsize=font_size:text=watermark_text:fontcolor=font_color:alpha=0.4:x=(w-text_w)/2:y=(h-text_h)/2" output_file

ffmpeg
starts the command
-i input_file
path, name and extension of the input file
@@ -1346,7 +1346,7 @@
fontfile=font_path
Set path to font. For example in macOS: fontfile=/Library/Fonts/AppleGothic.ttf
fontsize=font_size
Set font size. 35 is a good starting point for SD. Ideally this value is proportional to video size, for example use ffprobe to acquire video height and divide by 14.
text=watermark_text
Set the content of your watermark text. For example: text='FFMPROVISR EXAMPLE TEXT'
-fontcolor=font_colour
-Set colour of font. Can be a text string such as fontcolor=white or a hexadecimal value such as fontcolor=0xFFFFFF

+fontcolor=font_color
+Set color of font. Can be a text string such as fontcolor=white or a hexadecimal value such as fontcolor=0xFFFFFF
alpha=0.4
Set transparency value.
x=(w-text_w)/2:y=(h-text_h)/2
Sets x and y coordinates for the watermark. These relative values will centre your watermark regardless of video dimensions.
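
A filled-in sketch with illustrative values (the font path assumes macOS; the text is a placeholder):

ffmpeg -i input_file -vf drawtext="fontfile=/Library/Fonts/AppleGothic.ttf:fontsize=35:text='FFMPROVISR EXAMPLE TEXT':fontcolor=white:alpha=0.4:x=(w-text_w)/2:y=(h-text_h)/2" output_file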
@@ -1388,9 +1388,9 @@
fontfile=font_path
Set path to font. For example in macOS: fontfile=/Library/Fonts/AppleGothic.ttf
fontsize=font_size
Set font size. 35 is a good starting point for SD. Ideally this value is proportional to video size, for example use ffprobe to acquire video height and divide by 14.
timecode=starting_timecode
Set the timecode to be displayed for the first frame. Timecode is to be represented as hh:mm:ss[:;.]ff. Colon escaping is determined by the OS; for example, in Ubuntu timecode='09\\:50\\:01\\:23'. Ideally, this value would be generated from the file itself using ffprobe.
-fontcolor=font_colour
-Set colour of font. Can be a text string such as fontcolor=white or a hexadecimal value such as fontcolor=0xFFFFFF

+fontcolor=font_color
+Set color of font. Can be a text string such as fontcolor=white or a hexadecimal value such as fontcolor=0xFFFFFF
box=1
Enable box around timecode
-boxcolor=box_colour
-Set colour of box. Can be a text string such as fontcolor=black or a hexadecimal value such as fontcolor=0x000000

+boxcolor=box_color
+Set color of box. Can be a text string such as boxcolor=black or a hexadecimal value such as boxcolor=0x000000
rate=timecode_rate
Framerate of video. For example 25/1
x=(w-text_w)/2:y=h/1.2
Sets x and y coordinates for the timecode. These relative values will horizontally centre your timecode in the bottom third regardless of video dimensions.
"
quotation mark to end drawtext filter command
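
A filled-in sketch with illustrative values (macOS font path; the timecode and rate are placeholders):

ffmpeg -i input_file -vf drawtext="fontfile=/Library/Fonts/AppleGothic.ttf:fontsize=35:timecode='09\\:50\\:01\\:23':fontcolor=white:box=1:boxcolor=black:rate=25/1:x=(w-text_w)/2:y=h/1.2" output_file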
@@ -1775,7 +1775,7 @@

Create PowerShell script to batch process with FFmpeg

As of Windows 10, it is possible to run Bash via Bash on Ubuntu on Windows, allowing you to use bash scripting. To enable Bash on Windows, see these instructions.

-On Windows, the primary native command line programme is PowerShell. PowerShell scripts are plain text files saved with a .ps1 extension. This entry explains how they work with the example of a PowerShell script named “rewrap-mp4.ps1”, which rewraps .mp4 files in a given directory to .mkv files.

+On Windows, the primary native command line program is PowerShell. PowerShell scripts are plain text files saved with a .ps1 extension. This entry explains how they work with the example of a PowerShell script named “rewrap-mp4.ps1”, which rewraps .mp4 files in a given directory to .mkv files.

“rewrap-mp4.ps1” contains the following text:

$inputfiles = ls *.mp4
   foreach ($file in $inputfiles) {
@@ -1871,7 +1871,7 @@
         
• 44.1 kHz: "asetnsamples=n=44100"
• 96 kHz: "asetnsamples=n=96000"

-Note: This filter trandscodes audio to 16 bit PCM by default. The generated framemd5s will represent this value. Validating these framemd5s will require using the same default settings. Alternatively, when your file has another quantisation rates (e.g. 24 bit), then you might add the audio codec -c:a pcm_s24le to the command, for compatibility reasons with other tools, like BWF MetaEdit.

+Note: This filter transcodes audio to 16 bit PCM by default. The generated framemd5s will represent this value. Validating these framemd5s will require using the same default settings. Alternatively, if your file has another quantization rate (e.g. 24 bit), you might add the audio codec -c:a pcm_s24le to the command, for compatibility with other tools, like BWF MetaEdit.

ffmpeg
starts the command
-i input_file
path, name and extension of the input file
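
Putting the note into practice for a 24-bit, 48 kHz file (an illustrative sketch; the output filename is a placeholder):

ffmpeg -i input_file -af "asetnsamples=n=48000" -c:a pcm_s24le -f framemd5 -vn output.framemd5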
@@ -2065,12 +2065,12 @@

Play HD SMPTE bars

-Test an HD video projector by playing the SMPTE colour bars pattern.

+Test an HD video projector by playing the SMPTE color bars pattern.

ffplay -f lavfi -i smptehdbars=size=1920x1080

ffplay
starts the command
-f lavfi
tells ffplay to use the Libavfilter input virtual device
--i smptehdbars=size=1920x1080
-asks for the smptehdbars filter pattern as input and sets the HD resolution. This generates a colour bars pattern, based on the SMPTE RP 219–2002.

+-i smptehdbars=size=1920x1080
+asks for the smptehdbars filter pattern as input and sets the HD resolution. This generates a color bars pattern, based on the SMPTE RP 219–2002.
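
The same generator can also write the pattern to a file instead of playing it (an illustrative sketch; the 10-second duration and H.264 encoding are arbitrary choices):

ffmpeg -f lavfi -i smptehdbars=size=1920x1080 -t 10 -c:v libx264 smpte_bars.mp4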
@@ -2081,12 +2081,12 @@

Play VGA SMPTE bars

-Test a VGA (SD) video projector by playing the SMPTE colour bars pattern.

+Test a VGA (SD) video projector by playing the SMPTE color bars pattern.

ffplay -f lavfi -i smptebars=size=640x480

ffplay
starts the command
-f lavfi
tells ffplay to use the Libavfilter input virtual device
--i smptebars=size=640x480
-asks for the smptebars filter pattern as input and sets the VGA (SD) resolution. This generates a colour bars pattern, based on the SMPTE Engineering Guideline EG 1–1990.

+-i smptebars=size=640x480
+asks for the smptebars filter pattern as input and sets the VGA (SD) resolution. This generates a color bars pattern, based on the SMPTE Engineering Guideline EG 1–1990.