Mirror of https://github.com/amiaopensource/ffmprovisr.git, synced 2025-10-15 02:09:57 +02:00
Compare commits: v2020-07-1 ... v2023-08-0 (36 commits)
8290b25fe5
53334033d0
86a04859a2
dada53dff4
f100b46233
d1083e012c
08a27055c3
63033942f1
aba38e0a08
393a8cc22f
20cbf26144
e6552c5494
0d27ffa5d1
bf93a20da8
3c393a688e
961f401c34
f3cc3b518f
3555a9b61e
9a36ef66ba
e7d8dd58b9
a25c20a855
c0d3761ce5
869ffda8c4
789329b415
23e0097b54
ca86a0eca4
bc0edae268
87314cef76
77a7a2b3dc
f0aab69f0f
5c0b98bded
2677cd74e7
46d65cf4d0
638bb951dd
87581c717c
ba49741c9f
BIN img/deinterlaced_video_frames.png (new file, binary not shown; size: 741 KiB)
BIN img/interlaced_video_fields.png (new file, binary not shown; size: 422 KiB)
115 index.html
@@ -58,12 +58,13 @@
 <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank"><img alt="Creative Commons License" src="img/cc.png"></a><br>
 This work is licensed under a <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank">Creative Commons Attribution 4.0 International License</a>.
 </p>
-<h3>Sister projects</h3>
+<h3>Sibling projects</h3>
 <p><a href="https://dd388.github.io/crals/" target="_blank">Script Ahoy</a>: Community Resource for Archivists and Librarians Scripting</p>
 <p><a href="https://datapraxis.github.io/sourcecaster/" target="_blank">The Sourcecaster</a>: an app that helps you use the command line to work through common challenges that come up when working with digital primary sources.</p>
 <p><a href="https://pugetsoundandvision.github.io/micropops/" target="_blank">Micropops</a>: One liners and automation tools from Moving Image Preservation of Puget Sound</p>
 <p><a href="https://amiaopensource.github.io/cable-bible/" target="_blank">Cable Bible</a>: A Guide to Cables and Connectors Used for Audiovisual Tech</p>
-<p><a href="https://eaasi.gitlab.io/qemu-qed/" target="_blank">QEMU QED</a>: instructions for using QEMU (Quick EMUlator), a command line application for computer emulation and virtualization</p>
+<p><a href="https://eaasi.gitlab.io/program_docs/qemu-qed/" target="_blank">QEMU QED</a>: instructions for using QEMU (Quick EMUlator), a command line application for computer emulation and virtualization</p>
+<p><a href="https://amiaopensource.github.io/ffmpeg-artschool/" target="_blank">ffmpeg-artschool</a>: An AMIA workshop featuring scripts, exercises, and activities to make art using FFmpeg</p>
 </div>

 <div class="well">
@@ -107,6 +108,18 @@
 <div class="well">

 <h2 id="concepts">Learn about more advanced FFmpeg concepts</h2>
+<!-- Loop usage explanation -->
+<label class="recipe" for="batch-loop">Batch and Loop script usage</label>
+<input type="checkbox" id="batch-loop">
+<div class="hiding">
+<h5>Batch and Loop script usage</h5>
+<p><code>ffmpeg -nostdin -i <em>input_file</em> ...</code></p>
+<p>One of the frequent uses of FFmpeg is to run batch commands within loops to, for example, generate access files for an entire collection at once.</p>
+<p>When running an FFmpeg command within a loop it is often necessary to use the <code>-nostdin</code> flag prior to the input in order to ensure successful execution of the commands. This is needed to override FFmpeg's default behavior of enabling interaction on standard input which can result in errors as loop inputs are fed to the ongoing command.</p>
+<p class="link"></p>
+</div>
+<!-- End loop usage explanation -->
+
 <!-- Codec Defaults explanation -->
 <label class="recipe" for="codec-defaults">Codec defaults</label>
 <input type="checkbox" id="codec-defaults">
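The new Batch and Loop entry above is clearest with a worked loop. A minimal sketch, assuming a directory of .mov masters and H.264/AAC access copies as the goal (the file names and codec choices are illustrative, not part of the recipe):

# Make an access copy of every .mov in the current directory.
# -nostdin keeps ffmpeg from swallowing the loop's standard input.
for f in ./*.mov; do
  ffmpeg -nostdin -i "$f" -c:v libx264 -pix_fmt yuv420p -c:a aac "${f%.mov}_access.mp4"
done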
@@ -212,7 +225,7 @@
 The new container you are rewrapping to is defined by the filename extension used here, e.g. .mkv, .mp4, .mov.</dd>
 </dl>
 <h4>Important caveat</h4>
-<p>It may not be possible to rewrap a file's contents to a new container without re-encoding one or more of the streams within (that is, the video, audio, and subtitle tracks). Some containers can only contain streams of a certain encoding type: for example, the .mp4 container does not support uncompressed audio tracks. (In practice .mp4 goes hand-in-hand with an H.264-encoded video stream and an AAC-encoded audio stream, although other types of video and audio streams are possible). Another example is that the Matroska container does not allow data tracks; see the <a href="#mkv-to-mp4">MKV to MP4 recipe</a>.</p>
+<p>It may not be possible to rewrap a file's contents to a new container without re-encoding one or more of the streams within (that is, the video, audio, and subtitle tracks). Some containers can only contain streams of a certain encoding type: for example, the .mp4 container does not support uncompressed audio tracks. (In practice .mp4 goes hand-in-hand with an H.264-encoded video stream and an AAC-encoded audio stream, although other types of video and audio streams are possible). Another example is that the Matroska container does not allow data tracks.</p>
 <p>In such cases, FFmpeg will throw an error. If you encounter errors of this kind, you may wish to consult the <a href="#transcode">list of transcoding recipes</a>.</p>
 <p class="link"></p>
 </div>
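As a sketch of the caveat above in practice (the .mov source and the AAC choice are assumptions for illustration): rewrapping with stream copy fails when the target container rejects a stream, and transcoding only the offending stream is the usual workaround.

# Rewrap by copying every stream; this errors out if .mp4 cannot hold
# one of them (for example uncompressed PCM audio):
ffmpeg -i input_file.mov -c copy output_file.mp4
# Copy the video untouched and transcode only the audio to AAC instead:
ffmpeg -i input_file.mov -c:v copy -c:a aac output_file.mp4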
@@ -337,6 +350,38 @@
 </div>
 <!-- ends Transcode to H.264 -->

+<!-- Transcode to H.264 or H.265 using the GPU -->
+<label class="recipe" for="transcode_gpu">Transcode to H.264/H.265 using the GPU</label>
+<input type="checkbox" id="transcode_gpu">
+<div class="hiding">
+<h5>Transcode to H.264/H.265 using the GPU</h5>
+<p><code>ffmpeg -i <em>input_file</em> -c:v h264_nvenc -preset llhq -rc:v vbr_hq -cq:v 19 -b:v 8000k -maxrate:v 12000k -profile:v high -c:a copy <em>output_file</em></code></p>
+<p>This command takes an input file and transcodes it to H.264 using the encoding functionality of an Nvidia GPU (without transcoding the audio). If you're using H.264 with AAC or AC3 audio, you can output to an .mp4 file; if you're using HEVC and/or more exotic audio, you should output to .mkv. While Nvidia's fixed-function hardware can be 10x as performant as encoding on the CPU, it requires a few more parameters in order to optimize quality at lower bitrates.</p>
+<dl>
+<dt>ffmpeg</dt><dd>starts the command</dd>
+<dt>-i <em>input_file</em></dt><dd>path, name and extension of the input file</dd>
+<dt>-c:v <em>h264_nvenc</em></dt><dd>tells FFmpeg to encode the video stream as H.264 using Nvidia's encoder.</dd>
+<dt>-preset <em>llhq</em></dt><dd>uses the "low latency, high quality" encoding preset, a good default when working with nvenc.</dd>
+<dt>-rc:v <em>vbr_hq</em></dt><dd>means "variable bitrate, high quality," allowing you to set a minimum and maximum bitrate for the encode.</dd>
+<dt>-cq:v <em>19</em></dt><dd>is the same as the CRF quality level specified using x264 or other CPU-based encoders, where 0 is lossless, 51 is the worst possible quality, and values from 18-23 are typical.</dd>
+<dt>-b:v <em>8000k -maxrate:v 12000k</em></dt><dd>corresponds to a minimum bitrate of 8 megabits (8000k) per second, and a maximum of 12 megabits per second. nvenc is not as good at estimating bitrates as CPU-based encoders, and without this data, will occasionally choose a visibly lower bitrate. The 8-12 mbit range is generally a good one for high-quality 1080p h264.</dd>
+<dt>-profile:v <em>high</em></dt><dd>uses the "high quality" profile of h264, something that's been baked in to the spec for a long time so that older players can declare compatibility; almost all h264 video now uses high.</dd>
+<dt>-c:a <em>copy</em></dt><dd>will skip reencoding the audio stream, and copy the audio from the source file.</dd>
+<dt><em>output_file</em></dt><dd>path, name and extension of the output file</dd>
+</dl>
+<p>In order to encode to HEVC instead, and optionally transcode the audio, you can try changing the command like this:</p>
+<p><code>ffmpeg -i <em>input_file</em> -c:v hevc_nvenc -preset llhq -rc:v vbr_hq -cq:v 19 -b:v 5000k -maxrate:v 8000k -profile:v main10 -c:a aac <em>output_file</em></code></p>
+<dl>
+<dt>-c:v <em>hevc_nvenc</em></dt><dd>encodes to HEVC (also called H.265), a more efficient codec supported on GPUs from approximately 2015 and newer.</dd>
+<dt>-b:v <em>5000k -maxrate:v 8000k</em></dt><dd>specifies a slightly lower bitrate than when using h264, per HEVC's greater efficiency.</dd>
+<dt>-profile:v <em>main10</em></dt><dd>declares the "main10" profile for working with HEVC; one of the primary advantages of this codec is better support for 10-bit video, enabling consumer HDR.</dd>
+<dt>-c:a <em>aac</em></dt><dd>reencodes the audio to AAC with default parameters, a very common and widely supported format for access copies.</dd>
+</dl>
+<p>Much of the information in this entry was taken from <a href="https://superuser.com/a/1236387" target="_blank">this superuser.com post</a> provided by an Nvidia developer, one of the best sources of information on the ffmpeg Nvidia encoders.</p>
+<p class="link"></p>
+</div>
+<!-- ends Transcode to H.264 or H.265 using the GPU -->
+
 <!-- H.264 from DCP -->
 <label class="recipe" for="dcp_to_h264">Transcode from DCP to an H.264 access file</label>
 <input type="checkbox" id="dcp_to_h264">
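Because the nvenc recipes added above only work when FFmpeg was built with Nvidia support, a quick capability check is worth running first; a sketch (output varies by build):

# List the encoders this build knows about and filter for nvenc;
# no output means the hardware encoders are unavailable.
ffmpeg -hide_banner -encoders | grep nvenc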
@@ -459,7 +504,6 @@
 <h5>Transcode to Ogg/Theora</h5>
 <p><code>ffmpeg -i <em>input_file</em> -acodec libvorbis -b:v 690k <em>output_file</em></code></p>
 <p>This command takes an input file and transcodes it to Ogg/Theora in an .ogv wrapper with 690k video bitrate.</p>
-<p><strong>Note:</strong> FFmpeg must be installed with support for Ogg Theora. If you are using Homebrew, you can check with <code>brew info ffmpeg</code> and then update it with <code>brew upgrade ffmpeg --with-theora --with-libvorbis</code> if necessary.</p>
 <dl>
 <dt>ffmpeg</dt><dd>starts the command</dd>
 <dt>-i <em>input file</em></dt><dd>path, name and extension of the input file</dd>
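In the recipe above it is the .ogv extension that selects Theora for the video stream. A variant with both codecs spelled out may be easier to read; a sketch that should be equivalent under default settings (an assumption, not from the entry itself):

# Same transcode with the codecs named rather than inferred from .ogv:
ffmpeg -i input_file -c:v libtheora -c:a libvorbis -b:v 690k output_file.ogv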
@@ -628,10 +672,10 @@
 <dt>-c:a copy</dt><dd>copies the audio stream as-is, without re-encoding<br>
 For silent videos you can replace <code>-c:a copy</code> with <code>-an</code>.</dd>
 <dt><em>output_file</em></dt><dd>path, name and extension of the output file</dd>
-<p>If your source is interlaced, you will want to deinterlace prior to scaling. In that case, your command would look like this:</p>
-<p><code>ffmpeg -i <em>input_file</em> -c:v libx264 -filter:v "yadif, colormatrix=bt601:bt709, scale=1440:1080:flags=lanczos, pad=1920:1080:240:0" -c:a copy <em>output_file</em></code></p>
-<p>See the <a href="#ntsc_to_h264">Interlaced NTSC to MP4 recipe</a> for a fuller explanation of the deinterlacing step.</p>
 </dl>
+<p>If your source is interlaced, you will want to deinterlace prior to scaling. In that case, your command would look like this:</p>
+<p><code>ffmpeg -i <em>input_file</em> -c:v libx264 -filter:v "yadif, colormatrix=bt601:bt709, scale=1440:1080:flags=lanczos, pad=1920:1080:240:0" -c:a copy <em>output_file</em></code></p>
+<p>See the <a href="#ntsc_to_h264">Interlaced NTSC to MP4 recipe</a> for a fuller explanation of the deinterlacing step.</p>
 <p class="link"></p>
 </div>
 <!-- ends SD to HD -->
@@ -723,7 +767,7 @@
 <li>In the sound filter <code>atempo</code> the numerator <code>output_fps</code> sets the output speed and the denominator <code>input_fps</code> sets the input speed; both values are given in frames per second.</li>
 </ul>
 The different filters in a complex filter can be divided either by comma or semicolon. The quotation marks allow to insert a space between the filters for readability.</dd>
-<dt>-map "[v]"</dt><dd>maps the video stream and:</dd>
+<dt>-map "[v]"</dt><dd>maps the video stream and</dd>
 <dt>-map "[a]"</dt><dd>maps the audio stream together into:</dd>
 <dt><em>output_file</em></dt><dd>path, name and extension of the output file</dd>
 </dl>
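For context, the mapping pattern this hunk touches typically appears in a speed-change command like the following sketch (the 25 to 24 fps figures are illustrative, not from the hunk):

# Slow 25 fps material to 24 fps playback: setpts stretches the video,
# atempo slows the audio by the matching ratio, then both are mapped out.
ffmpeg -i input_file -filter_complex "[0:v]setpts=25/24*PTS[v];[0:a]atempo=24/25[a]" -map "[v]" -map "[a]" output_file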
@@ -940,7 +984,7 @@
 <p><code>ffmpeg -i <em>input_file</em> -af loudnorm=print_format=json -f null -</code></p>
 <p>This filter calculates and outputs loudness information in json about an input file (labeled input) as well as what the levels would be if loudnorm were applied in its one pass mode (labeled output). The values generated can be used as inputs for a 'second pass' of the loudnorm filter allowing more accurate loudness normalization than if it is used in a single pass.</p>
 <p>These instructions use the loudnorm defaults, which align well with PBS recommendations for target loudness. More information can be found at the <a href="https://ffmpeg.org/ffmpeg-filters.html#loudnorm" target="_blank">loudnorm documentation</a>.</p>
-<p>Information about PBS loudness standards can be found in the <a href="https://www-tc.pbs.org/capt/Producing/TOS-2012-Pt2-Distribution.pdf" target="_blank">PBS Technical Operating Specifications</a> document. Information about EBU loudness standards can be found in the <a href="https://tech.ebu.ch/docs/r/r128-2014.pdf" target="_blank">EBU R 128</a> recommendation document.</p>
+<p>Information about PBS loudness standards can be found in the <a href="http://bento.cdn.pbs.org/hostedbento-prod/filer_public/PBS_About/Producing/Red%20Book/TOS%20Pt%201%20Submission%202016.pdf" target="_blank">PBS Technical Operating Specifications</a> document. Information about EBU loudness standards can be found in the <a href="https://tech.ebu.ch/docs/r/r128-2014.pdf" target="_blank">EBU R 128</a> recommendation document.</p>
 <dl>
 <dt>ffmpeg</dt><dd>starts the command</dd>
 <dt><em>input_file</em></dt><dd>path, name and extension of the input file</dd>
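The 'second pass' the entry refers to feeds the measured values back into loudnorm. A sketch of the round trip (the measured numbers below are placeholders, not real output):

# Pass 1: measure only; loudnorm prints JSON statistics, no file is written.
ffmpeg -i input_file -af loudnorm=print_format=json -f null -
# Pass 2: supply the measured values for a more accurate, linear normalization.
ffmpeg -i input_file -af loudnorm=linear=true:measured_I=-27.2:measured_TP=-4.1:measured_LRA=18.1:measured_thresh=-37.7 output_file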
@@ -1060,16 +1104,18 @@
 <div class="hiding">
 <h5>Join files together</h5>
 <p><code>ffmpeg -f concat -i mylist.txt -c copy <em>output_file</em></code></p>
-<p>This command takes two or more files of the same file type and joins them together to make a single file. All that the program needs is a text file with a list specifying the files that should be joined. However, it only works properly if the files to be combined have the exact same codec and technical specifications. Be careful, FFmpeg may appear to have successfully joined two video files with different codecs, but may only bring over the audio from the second file or have other weird behaviors. Don’t use this command for joining files with different codecs and technical specs and always preview your resulting video file!</p>
+<p>This command takes two or more files of the same file type and joins them together to make a single file. All that the program needs is a text file with a list specifying the files that should be joined. If possible, run the command from the same directory where the files and the text file reside. Otherwise you'll have to use <code>-safe 0</code>, see below for more information. However, it only works properly if the files to be combined have the exact same codec and technical specifications. Be careful, FFmpeg may appear to have successfully joined two video files with different codecs, but may only bring over the audio from the second file or have other weird behaviors. Don’t use this command for joining files with different codecs and technical specs and always preview your resulting video file!</p>
 <dl>
 <dt>ffmpeg</dt><dd>starts the command</dd>
 <dt>-f concat</dt><dd>forces ffmpeg to concatenate the files and to keep the same file format</dd>
 <dt>-i <em>mylist.txt</em></dt><dd>path, name and extension of the input file. Per the <a href="https://ffmpeg.org/ffmpeg-formats.html#Options" target="_blank">FFmpeg documentation</a>, it is preferable to specify relative rather than absolute file paths, as allowing absolute file paths may pose a security risk.<br>
-This text file contains the list of files to be concatenated and should be formatted as follows:
-<pre>file '<em>./first_file.ext</em>'
-file '<em>./second_file.ext</em>'
+This text file contains the list of files (without their absolute path) to be concatenated and should be formatted as follows:
+<pre>
+file '<em>first_file.ext</em>'
+file '<em>second_file.ext</em>'
 . . .
-file '<em>./last_file.ext</em>'</pre>
+file '<em>last_file.ext</em>'
+</pre>
 In the above, <strong>file</strong> is simply the word "file". Straight apostrophes ('like this') rather than curved quotation marks (‘like this’) must be used to enclose the file paths.<br>
 <strong>Note:</strong> If specifying absolute file paths in the .txt file, add <code>-safe 0</code> before the input file.<br>
 e.g.: <code>ffmpeg -f concat -safe 0 -i mylist.txt -c copy <em>output_file</em></code></dd>
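Writing mylist.txt by hand scales poorly; a sketch of generating it from a directory of same-spec files (the .mp4 glob is an assumption, and file names must not themselves contain apostrophes):

# Build the list file, one "file '...'" line per input, then concatenate.
for f in ./*.mp4; do printf "file '%s'\n" "$f"; done > mylist.txt
ffmpeg -f concat -i mylist.txt -c copy output_file.mp4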
@@ -1363,6 +1409,43 @@
 </div>
 <!-- ends Deinterlace video -->

+<!-- Deinterlace video fields -->
+<label class="recipe" for="deinterlace_fields">Deinterlace video fields to frames</label>
+<input type="checkbox" id="deinterlace_fields">
+<div class="hiding">
+<h5>Deinterlace video fields to frames</h5>
+<p><code>ffmpeg -i <em>input_file</em> -c:v libx264 -vf "idet,bwdif,format=yuv420p" <em>output_file</em></code></p>
+<p>This command takes an interlaced input file and outputs a deinterlaced H.264 MP4, with each field separated into its own frame. This is preferred for interlaced video that contains a lot of motion, as the double-rate output preserves the visual cadence of the source material.</p>
+<dl>
+<dt>ffmpeg</dt><dd>starts the command</dd>
+<dt>-i <em>input file</em></dt><dd>path, name and extension of the input file</dd>
+<dt>-c:v libx264</dt><dd>tells FFmpeg to encode the video stream as H.264</dd>
+<dt>-vf</dt><dd>video filtering will be used (<code>-vf</code> is an alias of <code>-filter:v</code>)</dd>
+<dt>"</dt><dd>start of filtergraph (see below)</dd>
+<dt>idet</dt><dd>detect interlaced video field order<br>
+<a href="https://ffmpeg.org/ffmpeg-filters.html#idet" target="_blank">idet</a> will try to detect if the video is interlaced, and if so, what the order of the fields are (top-field-first, or bottom-field-first). This is done to ensure the output of the deinterlacing filter is correct.</dd>
+<dt>bwdif</dt><dd>deinterlacing filter (‘Bob Weaver Deinterlacing Filter’)<br>
+By default, <a href="https://ffmpeg.org/ffmpeg-filters.html#bwdif-1" target="_blank">bwdif</a> will output one frame for each field, matching the visual cadence of interlaced video.</dd>
+<dt>,</dt><dd>separates filters</dd>
+<dt>format=yuv420p</dt><dd>chroma subsampling set to 4:2:0<br>
+By default, <code>libx264</code> will use a chroma subsampling scheme that is the closest match to that of the input. This can result in Y′C<sub>B</sub>C<sub>R</sub> 4:2:0, 4:2:2, or 4:4:4 chroma subsampling. QuickTime and most other non-FFmpeg based players can’t decode H.264 files that are not 4:2:0, therefore it’s advisable to specify 4:2:0 chroma subsampling.</dd>
+<dt>"</dt><dd>end of filtergraph</dd>
+<dt><em>output file</em></dt><dd>path, name and extension of the output file</dd>
+</dl>
+<p><code>"idet,bwdif,format=yuv420p"</code> is an FFmpeg <a href="https://trac.ffmpeg.org/wiki/FilteringGuide#FiltergraphChainFilterrelationship" target="_blank">filtergraph</a>. Here the filtergraph is made up of one filter chain, which is itself made up of the three filters (separated by the comma).<br>
+The enclosing quote marks are necessary when you use spaces within the filtergraph, e.g. <code>-vf "idet, bwdif, format=yuv420p"</code>, and are included above as an example of good practice.</p>
+<p><strong>Note:</strong> bwdif also supports the older method of outputting one frame for each frame (thereby halving the number of output frames per second) with the syntax <code>bwdif=mode=send_frame</code>. This can be used when the presentation device is not capable of reproducing 50 (PAL) or 60 (NTSC) frames per second.</p>
+<p>For more H.264 encoding options, see the latter section of the <a href="#transcode_h264">encode H.264 command</a>.</p>
+<div class="sample-image">
+<h2>Example</h2>
+<p>Before and after deinterlacing with bwdif:</p>
+<img src="img/interlaced_video_fields.png" alt="VLC screenshot of original interlaced video">
+<img src="img/deinterlaced_video_frames.png" alt="VLC screenshot of deinterlaced video">
+</div>
+<p class="link"></p>
+</div>
+<!-- ends Deinterlace video fields -->
+
 <!-- Inverse telecine -->
 <label class="recipe" for="inverse-telecine">Inverse telecine</label>
 <input type="checkbox" id="inverse-telecine">
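Before committing to bwdif it can help to confirm the material really is interlaced; a sketch using idet on its own (read the "Multi frame detection" summary at the end of the log):

# Scan the file and report how many frames look TFF, BFF, or progressive.
ffmpeg -i input_file -vf idet -an -f null -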
@@ -1506,7 +1589,7 @@
 <label class="recipe" for="embed_subtitles">Embed subtitles</label>
 <input type="checkbox" id="embed_subtitles">
 <div class="hiding">
-<h5>Embed a subtitle file into a movie file </h5>
+<h5>Embed a subtitle file into a movie file</h5>
 <p><code>ffmpeg -i <em>input_file</em> -i <em>subtitles_file</em> -c copy -c:s mov_text <em>output_file</em></code></p>
 <dl>
 <dt>ffmpeg</dt><dd>starts the command</dd>
@@ -2335,7 +2418,6 @@
 <input type="checkbox" id="ocr_on_top">
 <div class="hiding">
 <h5>Plays video with OCR on top</h5>
-<p>Note: ffmpeg must be compiled with the tesseract library for this script to work (<code>--with-tesseract</code> if using the <code>brew install ffmpeg</code> method).</p>
 <p><code>ffplay input_file -vf "ocr,drawtext=fontfile=/Library/Fonts/Andale Mono.ttf:text=%{metadata\\\:lavfi.ocr.text}:fontcolor=white"</code></p>
 <dl>
 <dt>ffplay</dt><dd>starts the command</dd>
@@ -2359,7 +2441,6 @@
 <input type="checkbox" id="ffprobe_ocr">
 <div class="hiding">
 <h5>Exports OCR data to screen</h5>
-<p>Note: FFmpeg must be compiled with the tesseract library for this script to work (<code>--with-tesseract</code> if using the <code>brew install ffmpeg</code> method)</p>
 <p><code>ffprobe -show_entries frame_tags=lavfi.ocr.text -f lavfi -i "movie=<em>input_file</em>,ocr"</code></p>
 <dl>
 <dt>ffprobe</dt><dd>starts the command</dd>
@@ -173,7 +173,7 @@ Last updated: 2019-12-11

 [The Cable Bible](https://amiaopensource.github.io/cable-bible/): A Guide to Cables and Connectors Used for Audiovisual Tech
 [FFCommand_Engine](https://github.com/ColorlabMD/FFCommand_Engine): a tool for easier use of FFmpeg binaries
-[QEMU QED](https://eaasi.gitlab.io/qemu-qed): instructions for using QEMU (Quick EMUlator), a command line application for computer emulation and virtualization
+[QEMU QED](https://eaasi.gitlab.io/program_docs/qemu-qed/): instructions for using QEMU (Quick EMUlator), a command line application for computer emulation and virtualization
 [Script Ahoy](http://dd388.github.io/crals/): Community Resource for Archivists and Librarians Scripting
 [sourcecaster](https://datapraxis.github.io/sourcecaster/): helps you use the command line to work through common challenges that come up when working with digital primary sources.

29 recipes.txt
@@ -16,6 +16,10 @@ ffmpeg -i input_file -map 0 -dn -c:v ffv1 -level 3 -g 1 -slicecrc 1 -slices 16 -
 ffmpeg -i concat:input_file_1\|input_file_2\|input_file_3 -c:v libx264 -c:a aac output_file.mp4
 # Transcode to an H.265/HEVC MP4
 ffmpeg -i input_file -c:v libx265 -pix_fmt yuv420p -c:a copy output_file
+# Transcode to H.264 using the GPU
+ffmpeg -i input_file -c:v h264_nvenc -preset llhq -rc:v vbr_hq -cq:v 19 -b:v 8000k -maxrate:v 12000k -profile:v high -c:a copy output_file
+# Transcode to H.265 using the GPU
+ffmpeg -i input_file -c:v hevc_nvenc -preset llhq -rc:v vbr_hq -cq:v 19 -b:v 5000k -maxrate:v 8000k -profile:v main10 -c:a copy output_file
 # Transcode to an Ogg Theora
 ffmpeg -i input_file -acodec libvorbis -b:v 690k output_file
 # Convert WAV to MP3
@@ -108,7 +112,7 @@ ffmpeg -i input_file -ss 00:00:20 -vframes 1 thumb.png
 ffmpeg -i input_file -vf fps=1/60 out%d.png
 # Create GIF from still images
 ffmpeg -f image2 -framerate 9 -pattern_type glob -i "input_image_*.jpg" -vf scale=250x250 output_file.gif
-# Create GIF from a video
+# Create GIF from a video
 ffmpeg -ss HH:MM:SS -i input_file -filter_complex "fps=10,scale=500:-1:flags=lanczos,palettegen" -t 3 palette.png
 ffmpeg -ss HH:MM:SS -i input_file -i palette.png -filter_complex "[0:v]fps=10, scale=500:-1:flags=lanczos[v], [v][1:v]paletteuse" -t 3 -loop 6 output_file
 # Transcode an image sequence into uncompressed 10-bit video
@@ -161,7 +165,7 @@ ffmpeg -f lavfi -i smptebars=size=720x576:rate=25 -c:v prores -t 10 output_file
 ffmpeg -f lavfi -i testsrc=size=720x576:rate=25 -c:v v210 -t 10 output_file
 # Play HD SMPTE bars
 ffplay -f lavfi -i smptehdbars=size=1920x1080
-# Play VGA SMPTE bars
+# Play VGA SMPTE bars
 ffplay -f lavfi -i smptebars=size=640x480
 # Generate a sine wave test audio file
 ffmpeg -f lavfi -i "sine=frequency=1000:sample_rate=48000:duration=5" -c:a pcm_s16le output_file.wav
@@ -181,7 +185,7 @@ ffmpeg -i input_one -i input_two -filter_complex signature=detectmode=full:nb_in
 ffmpeg -i input -vf signature=format=xml:filename="output.xml" -an -f null -
 # Play an image sequence
 ffplay -framerate 5 input_file_%06d.ext
-# Split audio and video tracks
+# Split audio and video tracks
 ffmpeg -i input_file -map 0:v:0 video_output_file -map 0:a:0 audio_output_file
 # Merge audio and video tracks
 ffmpeg -i video_file -i audio_file -map 0:v -map 1:a -c copy output_file
@@ -195,22 +199,3 @@ ffmpeg -i input_file -filter:v drawbox=w=iw:h=7:y=ih-h:t=max output_file
 ffmpeg -re -i ${INPUTFILE} -map 0 -flags +global_header -vf scale="1280:-1,format=yuv420p" -pix_fmt yuv420p -level 3.1 -vsync passthrough -crf 26 -g 50 -bufsize 3500k -maxrate 1800k -c:v libx264 -c:a aac -b:a 128000 -r:a 44100 -ac 2 -t ${STREAMDURATION} -f tee "[movflags=+faststart]${TARGETFILE}|[f=flv]${STREAMTARGET}"
 # View FFmpeg subprogram information
 ffmpeg -h type=name
-# Rip a CD with CD Paranoia
-cdparanoia -L -B -O [Drive Offset] [Starting Track Number]-[Ending Track Number] output_file.wav
-# Rip a CD with Cdda2wav
-cdda2wav -L0 -t all -cuefile -paranoia paraopts=retries=200,readahead=600,minoverlap=sectors-per-request-1 -verbose-level all output.wav
-# Compare two images
-compare -metric ae image1.ext image2.ext null:
-# Create thumbnails of images
-mogrify -resize 80x80 -format jpg -quality 75 -path thumbs *.jpg
-# Creates grid of images from text file
-montage @list.txt -tile 6x12 -geometry +0+0 output_grid.jpg
-# Get file signature data
-convert -verbose input_file.ext | grep -i signature
-# Removes exif metadata
-mogrify -path ./stripped/ -strip *.jpg
-# Resizes image to specific pixel width
-convert input_file.ext -resize 750 output_file.ext
-# Transcoding to/from FLAC
-flac --best --keep-foreign-metadata --preserve-modtime --verify input.wav
-flac --decode --keep-foreign-metadata --preserve-modtime --verify input.flac
@@ -4,17 +4,12 @@
 # connected to the Web, and the local version otherwise.

 if [[ "$(uname -s)" = "Darwin" ]] ; then
-  default_browser=$(plutil -convert json ~/Library/Preferences/com.apple.LaunchServices/com.apple.launchservices.secure.plist -r -o - | grep https -b1 | tail -n1 | cut -d'"' -f4)
   if ping -c 1 amiaopensource.github.io >/dev/null 2>&1 ; then
     ffmprovisr_path='https://amiaopensource.github.io/ffmprovisr/'
   else
     ffmprovisr_path=$(find /usr/local/Cellar/ffmprovisr -iname 'index.html' | sort -M | tail -n1)
   fi
-  if [[ -n "${default_browser}" ]] ; then
-    open -b "${default_browser}" "${ffmprovisr_path}"
-  else
-    open "${ffmprovisr_path}"
-  fi
+  open "${ffmprovisr_path}"
 elif [[ "$(uname -s)" = "Linux" ]] ; then
   if ping -c 1 amiaopensource.github.io >/dev/null 2>&1 ; then
     ffmprovisr_path='https://amiaopensource.github.io/ffmprovisr/'