From 99450e61de536be89800bd5a77a7d08a81c524c9 Mon Sep 17 00:00:00 2001
From: Reto Kromer
Date: Sun, 24 Dec 2017 13:19:54 +0100
Subject: [PATCH] a little housekeeping

- use secure protocol whenever possible
- alignment
---
 index.html | 96 +++++++++++++++++++++++++++---------------------------
 1 file changed, 48 insertions(+), 48 deletions(-)

diff --git a/index.html b/index.html
index 661f251..604070b 100644
--- a/index.html
+++ b/index.html
@@ -1,14 +1,14 @@
- ffmprovisr
+ ffmprovisr
@@ -46,16 +46,16 @@

FFmpeg is a powerful tool for manipulating audiovisual files. Unfortunately, it also has a steep learning curve, especially for users unfamiliar with a command line interface. This app helps users through the command generation process so that more people can reap the benefits of FFmpeg.

Each button displays helpful information about how to perform a wide variety of tasks using FFmpeg. To use this site, click on the task you would like to perform. A new window will open up with a sample command and a description of how that command works. You can copy this command and understand how it works with a breakdown of each of the flags.

Tutorials -

For FFmpeg basics, check out the program’s official website.

+

For FFmpeg basics, check out the program’s official website.

For instructions on how to install FFmpeg on Mac, Linux, and Windows, refer to Reto Kromer’s installation instructions.

-

For Bash and command line basics, try the Command Line Crash Course. For a little more context presented in an ffmprovisr style, try explainshell.com!

+

For Bash and command line basics, try the Command Line Crash Course. For a little more context presented in an ffmprovisr style, try explainshell.com!

License

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Sister projects -

Script Ahoy: Community Resource for Archivists and Librarians Scripting

+

Script Ahoy: Community Resource for Archivists and Librarians Scripting

The Sourcecaster: an app that helps you use the command line to work through common challenges that come up when working with digital primary sources.

Cable Bible: A Guide to Cables and Connectors Used for Audiovisual Tech

@@ -99,17 +99,17 @@
-

Learn about more advanced FFmpeg concepts

+

Learn about more advanced FFmpeg concepts

Filtergraphs

-

Many FFmpeg commands use filters that manipulate the video or audio stream in some way: for example, hflip to horizontally flip a video, or amerge to merge two or more audio tracks into a single stream.

+

Many FFmpeg commands use filters that manipulate the video or audio stream in some way: for example, hflip to horizontally flip a video, or amerge to merge two or more audio tracks into a single stream.
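For instance, a minimal sketch (filenames are placeholders) using one of the filters named above to horizontally flip the video while leaving the audio untouched:

ffmpeg -i input_file -vf hflip -c:a copy output_file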

The use of a filter is signalled by the flag -vf (video filter) or -af (audio filter), followed by the name and options of the filter itself. For example, take the convert colourspace command:

ffmpeg -i input_file -c:v libx264 -vf colormatrix=src:dst output_file -

Here, colormatrix is the filter used, with src and dst representing the source and destination colourspaces. This part following the -vf is a filtergraph.

+

Here, colormatrix is the filter used, with src and dst representing the source and destination colourspaces. This part following the -vf is a filtergraph.

It is also possible to apply multiple filters to an input, which are sequenced together in the filtergraph. A chained set of filters is called a filter chain, and a filtergraph may include multiple filter chains. Filters in a filter chain are separated from each other by commas (,), and filter chains are separated from each other by semicolons (;). For example, take the inverse telecine command:

ffmpeg -i input_file -c:v libx264 -vf "fieldmatch,yadif,decimate" output_file

Here we have a filtergraph including one filter chain, which is made up of three video filters.

@@ -154,7 +154,7 @@

To map all streams in the input file to the output file, use -map 0. However, note that not all container formats can include all stream types: for example, .mp4 cannot contain timecode.

When no mapping is specified in an ffmpeg command, the default for video files is to take just one video and one audio stream for the output: other stream types, such as timecode or subtitles, will not be copied to the output file by default. If multiple video or audio streams are present, the best quality one is automatically selected by FFmpeg.
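As a minimal sketch of the flag described above (stream copy assumed acceptable), remuxing every stream of the input into the output:

ffmpeg -i input_file -map 0 -c copy output_file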

-

For more information, check out the FFmpeg wiki Map page, and the official FFmpeg documentation on -map.

+

For more information, check out the FFmpeg wiki Map page, and the official FFmpeg documentation on -map.

@@ -582,7 +582,7 @@ For example, to convert from Rec.601 to Rec.709, you would use -vf colormatrix=bt601:bt709.
output file
path, name and extension of the output file
-

Note: Converting between colourspaces with FFmpeg can be done via either the colormatrix or colorspace filters, with colorspace allowing finer control (individual setting of colourspace, transfer characteristics, primaries, range, pixel format, etc). See this entry on the FFmpeg wiki, and the FFmpeg documentation for colormatrix and colorspace.

+

Note: Converting between colourspaces with FFmpeg can be done via either the colormatrix or colorspace filters, with colorspace allowing finer control (individual setting of colourspace, transfer characteristics, primaries, range, pixel format, etc). See this entry on the FFmpeg wiki, and the FFmpeg documentation for colormatrix and colorspace.
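As an illustrative sketch of the finer-grained colorspace filter (the conversion pair chosen here, 625-line Rec.601 to Rec.709, is an assumption for demonstration):

ffmpeg -i input_file -c:v libx264 -vf colorspace=all=bt709:iall=bt601-6-625 output_file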


Convert colourspace and embed colourspace metadata

ffmpeg -i input_file -c:v libx264 -vf colormatrix=src:dst -color_primaries val -color_trc val -colorspace val output_file

@@ -611,7 +611,7 @@

Using this command it is possible to add Rec.709 tags to a file that is actually Rec.601 (etc), so apply with caution!

These commands are relevant for H.264 and H.265 videos, encoded with libx264 and libx265 respectively.

Note: If you wish to embed colourspace metadata without changing to another colourspace, omit -vf colormatrix=src:dst. However, since it is libx264/libx265 that writes the metadata, it’s not possible to add these tags without re-encoding the video stream.
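For instance, a sketch of tagging without converting (the Rec.709 values are illustrative; note the stream is still re-encoded, as explained above):

ffmpeg -i input_file -c:v libx264 -color_primaries bt709 -color_trc bt709 -colorspace bt709 output_file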

-

For all possible values for -color_primaries, -color_trc, and -colorspace, see the FFmpeg documentation on codec options.

+

For all possible values for -color_primaries, -color_trc, and -colorspace, see the FFmpeg documentation on codec options.


1. Out of step with the regular pattern, -color_trc doesn’t accept bt470bg; it is instead here referred to directly as gamma.
In the Rec.601 standard, 525-line/NTSC and 625-line/PAL video have assumed gammas of 2.2 and 2.8 respectively.

@@ -675,7 +675,7 @@
-color_trc VALUE -movflags write_colr
Set a new color_transfer value.
-field_order VALUE
Set interlacing (field order) values.
-

The possible values for -color_primaries, -color_trc, and -field_order are given in the Codec Options section of the FFmpeg docs - scroll down to near the bottom of the section.

+

The possible values for -color_primaries, -color_trc, and -field_order are given in the Codec Options section of the FFmpeg docs - scroll down to near the bottom of the section.
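Assembling the rows in this table into one hypothetical command (assuming a QuickTime/MOV output and stream copy, which -movflags write_colr suggests; the values are placeholders):

ffmpeg -i input_file -c copy -color_primaries bt709 -color_trc bt709 -field_order tt -movflags write_colr output_file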

@@ -783,7 +783,7 @@

ffmpeg -i input_file -af loudnorm=print_format=json -f null -

This filter calculates and outputs loudness information in JSON about an input file (labeled input), as well as what the levels would be if loudnorm were applied in its one-pass mode (labeled output). The values generated can be used as inputs for a 'second pass' of the loudnorm filter, allowing more accurate loudness normalization than if it is used in a single pass.

These instructions use the loudnorm defaults, which align well with PBS recommendations for target loudness. More information can be found at the loudnorm documentation.

-

Information about PBS loudness standards can be found in the PBS Technical Operating Specifications document. Information about EBU loudness standards can be found in the EBU R 128 recommendation document.

+

Information about PBS loudness standards can be found in the PBS Technical Operating Specifications document. Information about EBU loudness standards can be found in the EBU R 128 recommendation document.

ffmpeg
starts the command
input_file
path, name and extension of the input file
@@ -819,7 +819,7 @@

One Pass Loudness Normalization

ffmpeg -i input_file -af loudnorm=dual_mono=true -ar 48k output_file

This will normalize the loudness of an input using one pass, which is quicker but less accurate than using two passes. This command uses the loudnorm filter defaults for target loudness. These defaults align well with PBS recommendations, but loudnorm does allow targeting of specific loudness levels. More information can be found at the loudnorm documentation.

-

Information about PBS loudness standards can be found in the PBS Technical Operating Specifications document. Information about EBU loudness standards can be found in the EBU R 128 recommendation document.

+

Information about PBS loudness standards can be found in the PBS Technical Operating Specifications document. Information about EBU loudness standards can be found in the EBU R 128 recommendation document.

ffmpeg
starts the command
input_file
path, name and extension of the input file
@@ -839,7 +839,7 @@

Two Pass Loudness Normalization

ffmpeg -i input_file -af loudnorm=dual_mono=true:measured_I=input_i:measured_TP=input_tp:measured_LRA=input_lra:measured_thresh=input_thresh:offset=target_offset:linear=true -ar 48k output_file

This command allows using the levels calculated using a first pass of the loudnorm filter to more accurately normalize loudness. This command uses the loudnorm filter defaults for target loudness. These defaults align well with PBS recommendations, but loudnorm does allow targeting of specific loudness levels. More information can be found at the loudnorm documentation.
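For instance, with hypothetical first-pass measurements plugged in for the placeholder values:

ffmpeg -i input_file -af loudnorm=dual_mono=true:measured_I=-27.2:measured_TP=-14.4:measured_LRA=18.1:measured_thresh=-37.7:offset=0.9:linear=true -ar 48k output_file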

-

Information about PBS loudness standards can be found in the PBS Technical Operating Specifications document. Information about EBU loudness standards can be found in the EBU R 128 recommendation document.

+

Information about PBS loudness standards can be found in the PBS Technical Operating Specifications document. Information about EBU loudness standards can be found in the EBU R 128 recommendation document.

ffmpeg
starts the command
input_file
path, name and extension of the input file
@@ -890,7 +890,7 @@
ffmpeg
starts the command
-f concat
forces ffmpeg to concatenate the files and to keep the same file format
-
-i mylist.txt
path, name and extension of the input file. Per the FFmpeg documentation, it is preferable to specify relative rather than absolute file paths, as allowing absolute file paths may pose a security risk.
+
-i mylist.txt
path, name and extension of the input file. Per the FFmpeg documentation, it is preferable to specify relative rather than absolute file paths, as allowing absolute file paths may pose a security risk.
This text file contains the list of files to be concatenated and should be formatted as follows:
file './first_file.ext'
   file './second_file.ext'
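A minimal end-to-end sketch (filenames hypothetical; assumes all files share the same codecs and parameters): write the list, then concatenate with stream copy:

printf "file './first_file.ext'\nfile './second_file.ext'\n" > mylist.txt
ffmpeg -f concat -i mylist.txt -c copy output_file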
@@ -918,7 +918,7 @@
         
-i input_file
Takes in a normal file.
-c copy
Use stream copy mode to re-mux instead of re-encode.
-map 0
tells FFmpeg to map all streams of the input to the output.
-
-f segment
Use segment muxer for generating the output.
+
-f segment
Use segment muxer for generating the output.
-segment_time 60
Set duration of each segment (in seconds). This example creates segments with max. duration of 60s each.
-reset_timestamps 1
Reset timestamps of each segment to 0. Meant to ease the playback of the generated segments.
output_file-%03d.mkv
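Reassembled from the rows above, the full segmenting command reads:

ffmpeg -i input_file -c copy -map 0 -f segment -segment_time 60 -reset_timestamps 1 output_file-%03d.mkv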
@@ -1190,7 +1190,7 @@
ffmpeg
starts the command
-i input_video_file
path, name and extension of the input video file
-i input_image_file
path, name and extension of the image file
-
-filter_complex overlay=main_w-overlay_w-5:5
This calls the overlay filter and sets x and y coordinates for the position of the watermark on the video. Instead of hardcoding specific x and y coordinates, main_w-overlay_w-5:5 uses relative coordinates to place the watermark in the upper right hand corner, based on the width of your input files. Please see the FFmpeg documentation for more examples.
+
-filter_complex overlay=main_w-overlay_w-5:5
This calls the overlay filter and sets x and y coordinates for the position of the watermark on the video. Instead of hardcoding specific x and y coordinates, main_w-overlay_w-5:5 uses relative coordinates to place the watermark in the upper right hand corner, based on the width of your input files. Please see the FFmpeg documentation for more examples.
output_file
path, name and extension of the output file
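Reassembled from the rows above, the watermarking command reads:

ffmpeg -i input_video_file -i input_image_file -filter_complex overlay=main_w-overlay_w-5:5 output_file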
@@ -1242,7 +1242,7 @@

Create thumbnails or GIFs

@@ -1417,7 +1417,7 @@

ffplay -f lavfi "amovie='input.mp3', astats=metadata=1:reset=1, adrawgraph=lavfi.astats.Overall.Peak_level:max=0:min=-30.0:size=700x256:bg=Black[out]"

ffplay
starts the command
-
-f lavfi
tells ffplay to use the Libavfilter input virtual device
+
-f lavfi
tells ffplay to use the Libavfilter input virtual device
"
quotation mark to start the lavfi filtergraph
amovie='input.mp3'
declares audio source file on which to apply filter
,
comma signifies the end of audio source section and the beginning of the filter section
@@ -1446,7 +1446,7 @@

ffplay -f lavfi "movie='input.mp4', signalstats=out=brng:color=cyan[out]"

ffplay
starts the command
-
-f lavfi
tells ffplay to use the Libavfilter input virtual device
+
-f lavfi
tells ffplay to use the Libavfilter input virtual device
"
quotation mark to start the lavfi filtergraph
movie='input.mp4'
declares video file source to apply filter
,
comma signifies closing of video source assertion and ready for filter assertion
@@ -1501,7 +1501,7 @@
[a][b]hstack[out]
Takes the outputs from the previous steps ([a] and [b]) and uses the hstack (horizontal stack) filter on them to create the side-by-side output. This output is then named [out]
"
quotation mark to end filtergraph
-map [out]
Maps the output of the filter chain
-
-f nut
Sets the format for the output video stream to Nut
+
-f nut
Sets the format for the output video stream to Nut
-c:v rawvideo
Sets the video codec of the output video stream to raw video
-
tells FFmpeg that the output will be piped to a new command (as opposed to a file)
|
Tells the system you will be piping the output of the previous command into a new command
@@ -1534,7 +1534,7 @@
-show_data
adds a short “hexdump” to show_streams command output
-print_format
Set the output printing format (in this example “xml”; other formats include “json” and “flat”)
-

See also the FFmpeg documentation on ffprobe for a full list of flags, commands, and options.

+

See also the FFmpeg documentation on ffprobe for a full list of flags, commands, and options.
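A sketch combining the flags described above (the input name is a placeholder):

ffprobe -i input_file -show_streams -show_data -print_format xml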

@@ -1680,7 +1680,7 @@

This will create an XML report for use in QCTools for a video file with one video track and one audio track. See also the QCTools documentation.

ffprobe
starts the command
-
-f lavfi
tells ffprobe to use the Libavfilter input virtual device
+
-f lavfi
tells ffprobe to use the Libavfilter input virtual device
-i
input file and parameters
"movie=input_file:s=v+a[in0][in1], [in0]signalstats=stat=tout+vrep+brng, cropdetect=reset=1:round=1, idet=half_life=1, split[a][b];[a]field=top[a1];[b]field=bottom, split[b1][b2];[a1][b1]psnr[c1];[c1][b2]ssim[out0];[in1]ebur128=metadata=1, astats=metadata=1:reset=1:length=0.4[out1]"
This very large lump of commands declares the input file and passes in a request for all potential data signal information for a file with one video and one audio track
@@ -1705,7 +1705,7 @@

This will create an XML report for use in QCTools for a video file with one video track and NO audio track. See also the QCTools documentation.

ffprobe
starts the command
-
-f lavfi
tells ffprobe to use the Libavfilter input virtual device
+
-f lavfi
tells ffprobe to use the Libavfilter input virtual device
-i
input file and parameters
"movie=input_file,signalstats=stat=tout+vrep+brng, cropdetect=reset=1:round=1, idet=half_life=1, split[a][b];[a]field=top[a1];[b]field=bottom,split[b1][b2];[a1][b1]psnr[c1];[c1][b2]ssim"
This very large lump of commands declares the input file and passes in a request for all potential data signal information for a file with one video track and no audio track
@@ -1750,7 +1750,7 @@

If hex isn't your thing, closed captioning character and code sets can be found in the documentation for SCTools.

ffprobe
starts the command
-
-f lavfi
tells ffprobe to use the libavfilter input virtual device
+
-f lavfi
tells ffprobe to use the libavfilter input virtual device
-i input_file
input file and parameters
readeia608 -show_entries frame=pkt_pts_time:frame_tags=lavfi.readeia608.0.line,lavfi.readeia608.0.cc,lavfi.readeia608.1.line,lavfi.readeia608.1.cc -of csv
specifies the first two lines of video in which EIA-608 data (hexadecimal byte pairs) are identifiable by ffprobe, outputting comma-separated values (CSV)
>
redirects the standard output (the data created by ffprobe about the video)
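Assembled as one hypothetical command (the CSV filename is a placeholder, and the filtergraph form movie=input_file,readeia608 is an assumption based on the lavfi input device):

ffprobe -f lavfi -i movie=input_file,readeia608 -show_entries frame=pkt_pts_time:frame_tags=lavfi.readeia608.0.line,lavfi.readeia608.0.cc,lavfi.readeia608.1.line,lavfi.readeia608.1.cc -of csv > output.csv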
@@ -1777,7 +1777,7 @@

ffmpeg -f lavfi -i mandelbrot=size=1280x720:rate=25 -c:v libx264 -t 10 output_file

ffmpeg
starts the command
-
-f lavfi
tells FFmpeg to use the Libavfilter input virtual device
+
-f lavfi
tells FFmpeg to use the Libavfilter input virtual device
-i mandelbrot=size=1280x720:rate=25
asks for the mandelbrot test filter as input. Adjusting the size and rate options allows you to choose a specific frame size and framerate.
-c:v libx264
transcodes video from rawvideo to H.264. Set -pix_fmt to yuv420p for greater H.264 compatibility with media players.
-t 10
specifies recording time of 10 seconds
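Following the compatibility note above, the same command with the pixel format set explicitly:

ffmpeg -f lavfi -i mandelbrot=size=1280x720:rate=25 -c:v libx264 -pix_fmt yuv420p -t 10 output_file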
@@ -1795,7 +1795,7 @@

ffmpeg -f lavfi -i smptebars=size=720x576:rate=25 -c:v prores -t 10 output_file

ffmpeg
starts the command
-
-f lavfi
tells FFmpeg to use the Libavfilter input virtual device
+
-f lavfi
tells FFmpeg to use the Libavfilter input virtual device
-i smptebars=size=720x576:rate=25
asks for the smptebars test filter as input. Adjusting the size and rate options allows you to choose a specific frame size and framerate.
-c:v prores
transcodes video from rawvideo to Apple ProRes 4:2:2.
-t 10
specifies recording time of 10 seconds
@@ -1813,7 +1813,7 @@

ffmpeg -f lavfi -i testsrc=size=720x576:rate=25 -c:v v210 -t 10 output_file

ffmpeg
starts the command
-
-f lavfi
tells FFmpeg to use the libavfilter input virtual device
+
-f lavfi
tells FFmpeg to use the libavfilter input virtual device
-i testsrc=size=720x576:rate=25
asks for the testsrc filter pattern as input. Adjusting the size and rate options allows you to choose a specific frame size and framerate.
The different test patterns that can be generated are listed here.
-c:v v210
transcodes video from rawvideo to 10-bit Uncompressed Y′CBCR 4:2:2. Alter this setting to set your desired codec.
@@ -1833,7 +1833,7 @@

ffplay -f lavfi -i smptehdbars=size=1920x1080

ffplay
starts the command
-
-f lavfi
tells ffplay to use the Libavfilter input virtual device
+
-f lavfi
tells ffplay to use the Libavfilter input virtual device
-i smptehdbars=size=1920x1080
asks for the smptehdbars filter pattern as input and sets the HD resolution. This generates a colour bars pattern, based on the SMPTE RP 219–2002.
@@ -1849,7 +1849,7 @@

ffplay -f lavfi -i smptebars=size=640x480

ffplay
starts the command
-
-f lavfi
tells ffplay to use the Libavfilter input virtual device
+
-f lavfi
tells ffplay to use the Libavfilter input virtual device
-i smptebars=size=640x480
asks for the smptebars filter pattern as input and sets the VGA (SD) resolution. This generates a colour bars pattern, based on the SMPTE Engineering Guideline EG 1–1990.
@@ -1865,7 +1865,7 @@

ffmpeg -f lavfi -i "sine=frequency=1000:sample_rate=48000:duration=5" -c:a pcm_s16le output_file.wav

ffmpeg
starts the command
-
-f lavfi
tells FFmpeg to use the Libavfilter input virtual device
+
-f lavfi
tells FFmpeg to use the Libavfilter input virtual device
-i "sine=frequency=1000:sample_rate=48000:duration=5"
Sets the signal to 1000 Hz, sampling at 48 kHz, and for 5 seconds
-c:a pcm_s16le
encodes the audio codec in pcm_s16le (the default encoding for wav files). pcm represents pulse-code modulation format (raw bytes), 16 means 16 bits per sample, and le means "little endian"
output_file.wav
path, name and extension of the output file
@@ -1883,7 +1883,7 @@

ffmpeg -f lavfi -i smptebars=size=720x576:rate=25 -f lavfi -i "sine=frequency=1000:sample_rate=48000" -c:a pcm_s16le -t 10 -c:v ffv1 output_file

ffmpeg
starts the command
-
-f lavfi
tells FFmpeg to use the libavfilter input virtual device
+
-f lavfi
tells FFmpeg to use the libavfilter input virtual device
-i smptebars=size=720x576:rate=25
asks for the smptebars test filter as input. Adjusting the size and rate options allows you to choose a specific frame size and framerate.
-f lavfi
use libavfilter again, but now for audio
-i "sine=frequency=1000:sample_rate=48000"
Sets the signal to 1000 Hz, sampling at 48 kHz.
@@ -1906,7 +1906,7 @@
ffmpeg
starts the command
-i input_file
takes in a normal file
-
-bsf noise=1
sets bitstream filters for all streams to 'noise'. Filters can be set on specific streams using syntax such as -bsf:v for video, -bsf:a for audio, etc. The noise filter intentionally damages the contents of packets without damaging the container. This sets the noise level to 1, but it could be left blank or set to any number above 0.
+
-bsf noise=1
sets bitstream filters for all streams to 'noise'. Filters can be set on specific streams using syntax such as -bsf:v for video, -bsf:a for audio, etc. The noise filter intentionally damages the contents of packets without damaging the container. This sets the noise level to 1, but it could be left blank or set to any number above 0.
-c copy
use stream copy mode to re-mux instead of re-encode
output_file
path, name and extension of the output file
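Reassembled from the rows above, the damage sketch reads:

ffmpeg -i input_file -bsf noise=1 -c copy output_file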
@@ -1923,7 +1923,7 @@

ffplay -f lavfi life=s=300x200:mold=10:r=60:ratio=0.1:death_color=#C83232:life_color=#00ff00,scale=1200:800

ffplay
starts the command
-
-f lavfi
tells ffplay to use the Libavfilter input virtual device
+
-f lavfi
tells ffplay to use the Libavfilter input virtual device
life=s=300x200
use the life filter and set the size of the video to 300x200
:
indicates there’s another parameter coming
mold=10:r=60:ratio=0.1
sets up the rules of the game: cell mold speed, video rate, and random fill ratio
@@ -1938,7 +1938,7 @@

Use OCR

@@ -1978,7 +1978,7 @@
ffprobe
starts the command
-show_entries
sets a list of entries to show
frame_tags=lavfi.ocr.text
shows the lavfi.ocr.text tag in the frame section of the video
-
-f lavfi
tells ffprobe to use the Libavfilter input virtual device
+
-f lavfi
tells ffprobe to use the Libavfilter input virtual device
-i "movie=input_file,ocr"
declares 'movie' as input_file and passes in the 'ocr' command
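Putting the pieces together (note: the ocr filter requires an FFmpeg build compiled with the tesseract library):

ffprobe -show_entries frame_tags=lavfi.ocr.text -f lavfi -i "movie=input_file,ocr"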
@@ -2121,7 +2121,7 @@

This ffprobe command prints a CSV correlating timestamps and their YDIF values, useful for determining cuts.

ffprobe
starts the command
-
-f lavfi
uses the Libavfilter input virtual device as chosen format
+
-f lavfi
uses the Libavfilter input virtual device as chosen format
-i movie=input_file
path, name and extension of the input video file
,
comma signifies closing of video source assertion and ready for filter assertion
signalstats
tells ffprobe to use the signalstats command
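A hypothetical assembly of this table (the -show_entries selection is an assumption, targeting the YDIF value mentioned above):

ffprobe -f lavfi -i movie=input_file,signalstats -show_entries frame=pkt_pts_time:frame_tags=lavfi.signalstats.YDIF -of csv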
@@ -2242,9 +2242,9 @@ Change the above data-target field, the hover-over description, the button text, -->