Many FFmpeg commands use filters that manipulate the video or audio stream in some way: for example, hflip to horizontally flip a video, or amerge to merge two or more audio tracks into a single stream.
The use of a filter is signaled by the flag -vf
(video filter) or -af
(audio filter), followed by the name and options of the filter itself. For example, take the convert colorspace command:
ffmpeg -i input_file -c:v libx264 -vf colormatrix=src:dst output_file
Here, colormatrix is the filter used, with src and dst representing the source and destination colorspaces. This part following the -vf
is a filtergraph.
It is also possible to apply multiple filters to an input, which are sequenced together in the filtergraph. A chained set of filters is called a filter chain, and a filtergraph may include multiple filter chains. Filters in a filter chain are separated from each other by commas (,), and filter chains are separated from each other by semicolons (;). For example, take the inverse telecine command:
ffmpeg -i input_file -c:v libx264 -vf "fieldmatch,yadif,decimate" output_file
Here we have a filtergraph including one filter chain, which is made up of three video filters.
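The separators can be sketched with plain shell string-building. The filter names come from the examples above, but combining these particular chains is hypothetical and shown only to illustrate the separators:

```shell
# Commas join filters into one chain; semicolons join chains into a graph.
# Illustrative only: a real multi-chain graph also needs [link] labels.
chain_one="fieldmatch,yadif,decimate"   # one filter chain, three filters
chain_two="hflip"                       # a second, single-filter chain
filtergraph="${chain_one};${chain_two}"
echo "$filtergraph"
```

The resulting string would be passed to -vf or -filter_complex in quotes, as in the commands above.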
Stream mapping is the practice of defining which of the streams (e.g., video or audio tracks) present in an input file will be present in the output file. FFmpeg recognizes five stream types:
a
- audio
v
- video
s
- subtitle
d
- data
t
- attachment

The scale filter (scale=1440:1080) works for both upscaling and downscaling. We use the Lanczos scaling algorithm (flags=lanczos), which is slower but gives better results than the default bilinear algorithm. The pad filter (pad=1920:1080:240:0) completes the transformation from SD to HD.
This command uses a filter to convert the video to a different colorspace.
ffmpeg -i input_file -c:v libx264 -vf colormatrix=src:dst output_file
The accepted values for src and dst are bt601 (Rec.601), smpte170m (Rec.601, 525-line/NTSC version), bt470bg (Rec.601, 625-line/PAL version), bt709 (Rec.709), and bt2020 (Rec.2020). For example, to convert from Rec.601 to Rec.709, you would use -vf colormatrix=bt601:bt709.
Note: Converting between colorspaces with FFmpeg can be done via either the colormatrix or colorspace filters, with colorspace allowing finer control (individual setting of colorspace, transfer characteristics, primaries, range, pixel format, etc.). See this entry on the FFmpeg wiki, and the FFmpeg documentation for colormatrix and colorspace.
ffmpeg -i input_file -c:v libx264 -vf colormatrix=src:dst -color_primaries val -color_trc val -colorspace val output_file
The accepted -color_primaries and -colorspace values include smpte170m (Rec.601, 525-line/NTSC version), bt470bg (Rec.601, 625-line/PAL version), bt709 (Rec.709), and bt2020 (Rec.2020).

The accepted -color_trc values include smpte170m (Rec.601, 525-line/NTSC version), gamma28 (Rec.601, 625-line/PAL version)1, bt709 (Rec.709), bt2020_10 (Rec.2020 10-bit), and bt2020_12 (Rec.2020 12-bit).

To Rec.709:
ffmpeg -i input_file -c:v libx264 -vf colormatrix=bt601:bt709 -color_primaries bt709 -color_trc bt709 -colorspace bt709 output_file
MediaInfo output examples:
⚠ Using this command it is possible to add Rec.709 tags to a file that is actually Rec.601 (etc.), so apply with caution!
These commands are relevant for H.264 and H.265 videos, encoded with libx264
and libx265
respectively.
Note: If you wish to embed colorspace metadata without changing to another colorspace, omit -vf colormatrix=src:dst
. However, since it is libx264
/libx265
that writes the metadata, it’s not possible to add these tags without reencoding the video stream.
For all possible values for -color_primaries
, -color_trc
, and -colorspace
, see the FFmpeg documentation on codec options.
1. Out of step with the regular pattern, -color_trc
doesn’t accept bt470bg
; it is instead here referred to directly as gamma.
In the Rec.601 standard, 525-line/NTSC and 625-line/PAL video have assumed gammas of 2.2 and 2.8 respectively. ↩
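To apply the Rec.601-to-Rec.709 conversion above across several files, a loop can derive each output name from the input name. A minimal sketch, assuming .mov inputs and a "_rec709" suffix of our own choosing (the ffmpeg line is commented out so only the name derivation runs):

```shell
# Hypothetical batch wrapper for the Rec.601 -> Rec.709 command above.
# The filenames and the "_rec709" suffix are illustrative assumptions.
for f in tape_01.mov tape_02.mov; do
  out="${f%.*}_rec709.mov"   # strip the extension, append the suffix
  echo "$out"
  # ffmpeg -i "$f" -c:v libx264 -vf colormatrix=bt601:bt709 \
  #   -color_primaries bt709 -color_trc bt709 -colorspace bt709 "$out"
done
```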
It's also possible to specify the crop position by adding the x and y coordinates representing the top left of your cropped area to your crop filter, as such:
ffmpeg -i input_file -vf "crop=width:height[:x_position:y_position]" output_file
The original frame, a screenshot of the SMPTE colorbars:
Result of the command ffmpeg -i smpte_colorsbars.mov -vf "crop=500:500" output_file
:
Result of the command ffmpeg -i smpte_colorsbars.mov -vf "crop=500:500:0:0" output_file
, appending :0:0
to crop from the top left corner:
Result of the command ffmpeg -i smpte_colorsbars.mov -vf "crop=500:300:500:30" output_file
:
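When no x and y are given, crop centers the cropped area: the offsets it uses are (in_w-out_w)/2 and (in_h-out_h)/2. A shell sketch of that arithmetic, with assumed 720x576 input dimensions:

```shell
# Reproduce crop's default (centered) offsets with shell arithmetic.
# The input dimensions are assumed for illustration.
in_w=720;  in_h=576
out_w=500; out_h=500
x=$(( (in_w - out_w) / 2 ))
y=$(( (in_h - out_h) / 2 ))
echo "crop=${out_w}:${out_h}:${x}:${y}"
```

For these dimensions this prints crop=500:500:110:38, i.e. the centered position written out explicitly.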
ffmpeg -i input_file -filter_complex hue=s=0 -c:a copy output_file
A basic command to alter color hue to black and white using filter_complex (credit @FFMPEG via Twitter).
ffmpeg -i input_1.avi -i input_2.mp4 -filter_complex "[0:v:0][0:a:0][1:v:0][1:a:0]concat=n=2:v=1:a=1[video_out][audio_out]" -map "[video_out]" -map "[audio_out]" output_file
This command takes two or more files of different file types and joins them together to make a single file.
The input files may differ in many respects - container, codec, chroma subsampling scheme, framerate, etc. However, the above command only works properly if the files to be combined have the same dimensions (e.g., 720x576). Also note that if the input files have different framerates, then the output file will be of variable framerate.
Some aspects of the input files will be normalized: for example, if an input file contains a video track and an audio track that do not have exactly the same duration, the shorter one will be padded. In the case of a shorter video track, the last frame will be repeated in order to cover the missing video; in the case of a shorter audio track, the audio stream will be padded with silence.
Note: In order to keep the original timestamps, without trying to sanitize them, you may add the -copyts
option.
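The [0:v:0][0:a:0][1:v:0][1:a:0] pad list in the concat command grows with the number of inputs, so it can be generated. A sketch (the variable names and input count are ours):

```shell
# Build the concat filter_complex string for n inputs. The pad list
# interleaves each input's first video and first audio stream, in order.
n=3
pads=""
i=0
while [ "$i" -lt "$n" ]; do
  pads="${pads}[${i}:v:0][${i}:a:0]"
  i=$((i + 1))
done
fc="${pads}concat=n=${n}:v=1:a=1[video_out][audio_out]"
echo "$fc"
```

With n=3 this yields the same shape as the two-input example above, extended to a third input; it would be passed as -filter_complex "$fc" together with a third -i.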
E.g., for creating access copies with your institution's name:
ffmpeg -i input_file -vf drawtext="fontfile=font_path:fontsize=font_size:text=watermark_text:fontcolor=font_color:alpha=0.4:x=(w-text_w)/2:y=(h-text_h)/2" output_file
fontfile=/Library/Fonts/AppleGothic.ttf
- the path and name of the font file to use
fontsize=35
- 35 is a good starting point for SD. Ideally this value is proportional to video size; for example, use ffprobe to acquire the video height and divide by 14.
text='FFMPROVISR EXAMPLE TEXT'
- the watermark text to display
fontcolor=white
- or a hexadecimal value such as fontcolor=0xFFFFFF

For burning in timecode, the timecode value takes the form hh:mm:ss[:;.]ff. Colon escaping is determined by the OS; for example, in Ubuntu, timecode='09\\:50\\:01\\:23'. Ideally, this value would be generated from the file itself using ffprobe.
fontcolor=white
- or a hexadecimal value such as fontcolor=0xFFFFFF
fontcolor=black
- or a hexadecimal value such as fontcolor=0x000000
rate=25/1
- the rate at which the timecode counts, in frames per second
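The "divide the video height by 14" guideline above can be scripted. A sketch with an assumed height (the commented ffprobe line shows one way to read the real value):

```shell
# Derive a drawtext fontsize proportional to video height (height / 14),
# per the guideline above. The height is assumed here; with a real file:
# height=$(ffprobe -v error -select_streams v:0 \
#   -show_entries stream=height -of csv=p=0 input_file)
height=1080
fontsize=$(( height / 14 ))
echo "fontsize=$fontsize"
```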
As of Windows 10, it is possible to run Bash via Bash on Ubuntu on Windows, allowing you to use bash scripting. To enable Bash on Windows, see these instructions.
On Windows, the primary native command line program is PowerShell. PowerShell scripts are plain text files saved with a .ps1 extension. This entry explains how they work with the example of a PowerShell script named “rewrap-mp4.ps1”, which rewraps .mp4 files in a given directory to .mkv files.
“rewrap-mp4.ps1” contains the following text:
$inputfiles = ls *.mp4
foreach ($file in $inputfiles) {
    $output = [io.path]::ChangeExtension($file, '.mkv')
    ffmpeg -i $file -map 0 -c copy $output
}
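For comparison, a rough Bash equivalent of the same rewrap logic (a sketch with hypothetical filenames; it assumes the rewrap is a plain stream copy, so the ffmpeg line is commented out and only the name derivation runs):

```shell
# Sketch of a Bash counterpart to rewrap-mp4.ps1: derive a .mkv name
# from each .mp4 name. The filenames here are hypothetical.
for file in clip_a.mp4 clip_b.mp4; do
  output="${file%.mp4}.mkv"   # swap the .mp4 extension for .mkv
  echo "$output"
  # ffmpeg -i "$file" -map 0 -c copy "$output"
done
```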
44.1 kHz: "asetnsamples=n=44100"
96 kHz: "asetnsamples=n=96000"
Note: This filter transcodes audio to 16 bit PCM by default. The generated framemd5s will represent this value. Validating these framemd5s will require using the same default settings. Alternatively, when your file has a different quantization rate (e.g. 24 bit), you might add the audio codec -c:a pcm_s24le to the command, for compatibility with other tools, like BWF MetaEdit.
- ffmpeg
- starts the command
- -i input_file
- path, name and extension of the input file
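The n value is simply the number of audio samples per framemd5 frame, so matching it to the file's sample rate yields one checksum per second of audio. A sketch (the sample rate is assumed here; in practice it could be read with ffprobe):

```shell
# Choose an asetnsamples value that gives one framemd5 per second of
# audio: n equals the sample rate. The rate is assumed for illustration.
sample_rate=96000
n=$sample_rate
echo "asetnsamples=n=$n"
```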
Play HD SMPTE bars
Test an HD video projector by playing the SMPTE color bars pattern.
ffplay -f lavfi -i smptehdbars=size=1920x1080
- ffplay
- starts the command
- -f lavfi
- tells ffplay to use the Libavfilter input virtual device
- -i smptehdbars=size=1920x1080
- asks for the smptehdbars filter pattern as input and sets the HD resolution. This generates a color bars pattern, based on the SMPTE RP 219–2002.
Play VGA SMPTE bars
Test a VGA (SD) video projector by playing the SMPTE color bars pattern.
ffplay -f lavfi -i smptebars=size=640x480
- ffplay
- starts the command
- -f lavfi
- tells ffplay to use the Libavfilter input virtual device
- -i smptebars=size=640x480
- asks for the smptebars filter pattern as input and sets the VGA (SD) resolution. This generates a color bars pattern, based on the SMPTE Engineering Guideline EG 1–1990.