Mirror of https://github.com/amiaopensource/ffmprovisr.git (synced 2025-10-25 15:10:52 +02:00)
Compare commits
30 Commits
v2017-10-2 ... v2017-12-0
f0e0cf8ed3
5c1c336d77
d71793583e
6705bdf41d
25e779a59f
ba0852a957
88024c040f
3d9b9edf1c
c0326ad7d9
0cb6827b39
0d68614c04
1e86b70ba4
ced142a215
bf301daa71
278ac2baae
10b8e4c941
1d1b3e4eac
5a3e437d76
75a7aa1299
cf13529485
7c03ae2f80
affae48547
11ab1ae731
6b324152ac
edaa0a3443
133f81db8e
634f72213b
94049d2338
46356c3b98
53d3cf8a74
css/css.css (25 lines changed)
@@ -44,6 +44,9 @@ html, body {
        "content"
        "footer";
  }
  code {
    word-break: break-all;
  }
}

@media only screen and (min-width: 1000px) {
@@ -61,6 +64,10 @@ html, body {
  .grid {
    max-width: 70%;
    margin: 0 auto;
    grid-template-areas:
      "header header"
      "sidebar content"
      "footer footer";
  }
}

@@ -114,8 +121,10 @@ code {
  color: #c7254e;
  background-color: #f9f2f4;
  border-radius: 4px;
  word-break: break-all;
  word-wrap: break-word;
  max-width: 800px;
  white-space: normal;
  display: inline-block;
}

dd {
@@ -124,6 +133,10 @@ dd {

dt {
  font-family: Menlo, Monaco, Consolas, "Courier New", monospace;
  word-break: break-all;
  word-wrap: break-word;
  white-space: normal;
  display: inline-block;
}

img {
@@ -134,7 +147,17 @@ img {
}

.sample-image {
  margin: 0 auto;
  margin-bottom: 18px;
  max-width: 500px;
  text-align: center;
}

.sample-image-small {
  margin: 0 auto;
  margin-bottom: 18px;
  max-width: 250px;
  text-align: center;
}

div {
New binary files:
  img/crop_example_aftercrop1.png (245 KiB)
  img/crop_example_aftercrop2.png (167 KiB)
  img/crop_example_aftercrop3.png (146 KiB)
  img/crop_example_orig.png (436 KiB)
  img/life.gif (574 KiB)
index.html (133 lines changed)
@@ -83,6 +83,20 @@
</div>
<!-- End Basic structure of an FFmpeg command -->

<!-- Streaming vs. Saving -->
<label class="recipe" for="streaming-saving">Streaming vs. Saving</label>
<input type="checkbox" id="streaming-saving">
<div class="hiding">
<h3>Streaming vs. Saving</h3>
<p>FFplay allows you to stream created video and FFmpeg allows you to save video.</p>
<p>The following command creates and saves a 5-second video of SMPTE bars:</p>
<code>ffmpeg -f lavfi -i smptebars=size=640x480 -t 5 output_file</code>
<p>This command plays and streams SMPTE bars but does not save them on the computer:</p>
<code>ffplay -f lavfi smptebars=size=640x480</code>
<p>The main difference is small but significant: the <code>-i</code> flag is required for FFmpeg but not for FFplay. Additionally, the FFmpeg command needs to have <code>-t 5</code> and <code>output_file</code> added to specify the length of time to record and the place to save the video.</p>
<p class="link"></p>
</div>
<!-- End Streaming vs. Saving -->
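The same preview-versus-save distinction applies when working with an existing file rather than a generated test source. A minimal sketch, with input_file and output_file as placeholder names:

  # preview without writing anything to disk
  ffplay input_file
  # save a 5-second excerpt of the same source to a new file
  ffmpeg -i input_file -t 5 output_file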
</div>
<div class="well">
<h2 id="concepts">Learn about more advanced FFmpeg concepts</h2>
@@ -117,6 +131,34 @@
</div>
<!-- End Filtergraph explanation -->

<!-- Stream mapping explanation -->
<label class="recipe" for="stream-mapping">Stream mapping</label>
<input type="checkbox" id="stream-mapping">
<div class="hiding">
<h3>Stream mapping</h3>
<p>Stream mapping is the practice of defining which of the streams (e.g., video or audio tracks) present in an input file will be present in the output file. FFmpeg recognises five stream types:</p>
<ul>
<li><code>a</code> - audio</li>
<li><code>v</code> - video</li>
<li><code>s</code> - subtitle</li>
<li><code>d</code> - data (including timecode tracks)</li>
<li><code>t</code> - attachment</li>
</ul>
<p>Mapping is achieved by use of the <code>-map</code> flag, followed by an action of the type <code>file_number:stream_type[:stream_number]</code>. Numbering is zero-indexed, and it's possible to map by stream type and/or overall stream order within the input file. For example:</p>
<ul>
<li><code>-map 0:v</code> means ‘take all video streams from the first input file’.</li>
<li><code>-map 0:3</code> means ‘take the fourth stream from the first input file’.</li>
<li><code>-map 0:a:2</code> means ‘take the third audio stream from the first input file’.</li>
<li><code>-map 0:0 -map 0:2</code> means ‘take the first and third streams from the first input file’.</li>
<li><code>-map 0:1 -map 1:0</code> means ‘take the second stream from the first input file and the first stream from the second input file’.</li>
</ul>
<p>To map <i>all</i> streams in the input file to the output file, use <code>-map 0</code>. However, note that not all container formats can include all stream types: for example, .mp4 cannot contain timecode.</p>
<p>When no mapping is specified in an ffmpeg command, the default for video files is to take just one video and one audio stream for the output: other stream types, such as timecode or subtitles, will not be copied to the output file by default. If multiple video or audio streams are present, the best quality one is automatically selected by FFmpeg.</p>
<p>For more information, check out the FFmpeg wiki <a href="https://trac.ffmpeg.org/wiki/Map" target="_blank">Map</a> page, and the official FFmpeg <a href="http://ffmpeg.org/ffmpeg.html#Advanced-options" target="_blank">documentation on <code>-map</code></a>.</p>
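A minimal sketch of how these mapping options combine in practice; input_file and output_file are placeholder names, and the stream layout is assumed:

  # keep every stream from the input, copying rather than re-encoding
  ffmpeg -i input_file -map 0 -c copy output_file
  # keep the first video stream and only the second audio stream (streams are zero-indexed)
  ffmpeg -i input_file -map 0:v:0 -map 0:a:1 -c copy output_file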
<p class="link"></p>
</div>
<!-- End Stream Mapping explanation -->

</div>
<div class="well">
<h2 id="rewrap">Change container (rewrap)</h2>
@@ -638,6 +680,36 @@
</div>
<!-- ends Make stream properties explicate -->

<!-- Crop video -->
<label class="recipe" for="crop_video">Crop video</label>
<input type="checkbox" id="crop_video">
<div class="hiding">
<h3>Crop video</h3>
<p><code>ffmpeg -i <i>input_file</i> -vf "crop=<i>width</i>:<i>height</i>" <i>output_file</i></code></p>
<p>This command crops the input video to the dimensions defined.</p>
<dl>
<dt>ffmpeg</dt><dd>starts the command</dd>
<dt>-i <i>input_file</i></dt><dd>path, name and extension of the input file</dd>
<dt>-vf "crop=<i>width</i>:<i>height</i>"</dt><dd>Crops the video to the given width and height (in pixels).<br>
By default, the crop area is centred: that is, the position of the top left of the cropped area is set to x = (<i>input_width</i> - <i>output_width</i>) / 2, y = (<i>input_height</i> - <i>output_height</i>) / 2.
</dd>
<dt><i>output_file</i></dt><dd>path, name and extension of the output file</dd>
</dl>
<p>It's also possible to specify the crop position by adding the x and y coordinates representing the top left of your cropped area to your crop filter, as such:</p>
<p><code>ffmpeg -i <i>input_file</i> -vf "crop=<i>width</i>:<i>height</i>[:<i>x_position</i>:<i>y_position</i>]" <i>output_file</i></code></p>
<h3>Examples</h3>
<p>The original frame, a screenshot of the SMPTE colourbars:</p>
<img class="sample-image" src="img/crop_example_orig.png" alt="Screenshot of the original SMPTE colourbars frame">
<p>Result of the command <code>ffmpeg -i <i>smpte_colourbars.mov</i> -vf "crop=500:500" <i>output_file</i></code>:</p>
<img class="sample-image-small" src="img/crop_example_aftercrop1.png" alt="Screenshot of the SMPTE colourbars, cropped from the original">
<p>Result of the command <code>ffmpeg -i <i>smpte_colourbars.mov</i> -vf "crop=500:500:0:0" <i>output_file</i></code>, appending <code>:0:0</code> to crop from the top left corner:</p>
<img class="sample-image-small" src="img/crop_example_aftercrop2.png" alt="Screenshot of the SMPTE colourbars, cropped from the original">
<p>Result of the command <code>ffmpeg -i <i>smpte_colourbars.mov</i> -vf "crop=500:300:500:30" <i>output_file</i></code>:</p>
<img class="sample-image-small" src="img/crop_example_aftercrop3.png" alt="Screenshot of the SMPTE colourbars, cropped from the original">
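Beyond fixed pixel values, the crop filter also accepts expressions based on the input dimensions, which the recipe above does not cover. A hedged sketch of a centred crop that trims 100 pixels from every edge, with input_file and output_file as placeholders:

  # in_w and in_h are the input width and height; cropping is centred by default, so this removes 100 pixels from each side
  ffmpeg -i input_file -vf "crop=in_w-200:in_h-200" output_file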
<p class="link"></p>
</div>
<!-- ends Crop video -->

</div>
<div class="well">
<h2 id="audio-files">Change or view audio properties</h2>
@@ -1359,7 +1431,7 @@
<dt>"</dt><dd>quotation mark to start the lavfi filtergraph</dd>
<dt>movie='<i>input.mp4</i>'</dt><dd>declares video file source to apply filter</dd>
<dt>,</dt><dd>comma signifies closing of video source assertion and ready for filter assertion</dd>
- <dt>signalstats=out=brng:</dt><dd>tells ffplay to use the signalstats command, output the data, use the brng filter</dd>
+ <dt>signalstats=out=brng</dt><dd>tells ffplay to use the signalstats command, output the data, use the brng filter</dd>
<dt>:</dt><dd>indicates there’s another parameter coming</dd>
<dt>color=cyan[out]</dt><dd>sets the color of out-of-range pixels to cyan</dd>
<dt>"</dt><dd>quotation mark to end the lavfi filtergraph</dd>
@@ -1443,7 +1515,7 @@
<dt>-show_data</dt><dd>adds a short “hexdump” to show_streams command output</dd>
<dt>-print_format</dt><dd>Set the output printing format (in this example “xml”; other formats include “json” and “flat”)</dd>
</dl>
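A hypothetical invocation assembling the flags described above; input_file is a placeholder and the exact flag combination is an illustration rather than the recipe's own command:

  # hypothetical combination of the flags listed in this entry
  ffprobe -i input_file -show_streams -show_data -print_format xml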
- <p>See also the <a href="www.ffmpeg.org/ffprobe.html" target="_blank"> FFmpeg documentation on ffprobe</a> for a full list of flags, commands, and options.</p>
+ <p>See also the <a href="http://www.ffmpeg.org/ffprobe.html" target="_blank"> FFmpeg documentation on ffprobe</a> for a full list of flags, commands, and options.</p>
<p class="link"></p>
</div>
<!-- ends Pull specs -->
@@ -1798,7 +1870,7 @@
<dt>-c:a pcm_s16le</dt><dd>encodes the audio codec in <code>pcm_s16le</code> (the default encoding for wav files). <code>pcm</code> represents pulse-code modulation format (raw bytes), <code>16</code> means 16 bits per sample, and <code>le</code> means "little endian"</dd>
<dt>-t 10</dt><dd>specifies recording time of 10 seconds</dd>
<dt>-c:v ffv1</dt><dd>Encodes to <a href="https://en.wikipedia.org/wiki/FFV1" target="_blank">FFV1</a>. Alter this setting to set your desired codec.</dd>
- <dt><i>output_file</i>.wav</dt><dd>path, name and extension of the output file</dd>
+ <dt><i>output_file</i></dt><dd>path, name and extension of the output file</dd>
</dl>
<p class="link"></p>
</div>
@@ -1822,6 +1894,31 @@
</div>
<!-- ends Broken File -->

<!-- Game of Life -->
<label class="recipe" for="game_of_life">Conway's Game of Life</label>
<input type="checkbox" id="game_of_life">
<div class="hiding">
<h3>Conway's Game of Life</h3>
<p>Simulates <a href="https://en.wikipedia.org/wiki/Conway%27s_Game_of_Life">Conway's Game of Life</a>.</p>
<p><code>ffplay -f lavfi life=s=300x200:mold=10:r=60:ratio=0.1:death_color=#C83232:life_color=#00ff00,scale=1200:800</code></p>
<dl>
<dt>ffplay</dt><dd>starts the command</dd>
<dt>-f lavfi</dt><dd>tells ffplay to use the <a href="http://ffmpeg.org/ffmpeg-devices.html#lavfi" target="_blank">Libavfilter</a> input virtual device</dd>
<dt>life=s=300x200</dt><dd>use the life filter and set the size of the video to 300x200</dd>
<dt>:</dt><dd>indicates there’s another parameter coming</dd>
<dt>mold=10:r=60:ratio=0.1</dt><dd>sets up the rules of the game: cell mold speed, video rate, and random fill ratio</dd>
<dt>:</dt><dd>indicates there’s another parameter coming</dd>
<dt>death_color=#C83232:life_color=#00ff00</dt><dd>specifies color for cell death and cell life; mold_color can also be set</dd>
<dt>,</dt><dd>comma signifies closing of video source assertion and ready for filter assertion</dd>
<dt>scale=1200:800</dt><dd>scale to 1200 width and 800 height</dd>
</dl>
<img src="img/life.gif" alt="GIF of above command">
<p>To save a portion of the stream instead of playing it back infinitely, use the following command:</p>
<p><code>ffmpeg -f lavfi -i life=s=300x200:mold=10:r=60:ratio=0.1:death_color=#C83232:life_color=#00ff00,scale=1200:800 -t 5 <i>output_file</i></code></p>
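Since the entry notes that mold_color can also be set, one possible variation of the playback command is sketched below; the colour value chosen here is arbitrary and not taken from the entry:

  # mold_color value is an arbitrary illustration
  ffplay -f lavfi life=s=300x200:mold=10:r=60:ratio=0.1:death_color=#C83232:life_color=#00ff00:mold_color=#C8C832,scale=1200:800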
<p class="link"></p>
</div>
<!-- ends Game of Life -->

</div>
<div class="well">
<h2 id="ocr">Use OCR</h2>
@@ -1938,20 +2035,44 @@
<input type="checkbox" id="split_audio_video">
<div class="hiding">
<h3>Split audio and video tracks</h3>
- <p><code>ffmpeg -i <i>input_file</i> -map <i>0:v:0 video_output_file</i> -map <i>0:a:0 audio_output_file</i></code></p>
+ <p><code>ffmpeg -i <i>input_file</i> -map 0:v:0 <i>video_output_file</i> -map 0:a:0 <i>audio_output_file</i></code></p>
<p>This command splits the original input file into a video and audio stream. The -map command identifies which streams are mapped to which file. To ensure that you’re mapping the right streams to the right file, run ffprobe before writing the script to identify which streams are desired.</p>
<dl>
<dt>ffmpeg</dt><dd>starts the command</dd>
<dt>-i <i>input_file</i></dt><dd>path, name and extension of the input file</dd>
- <dt>-map <i>0:v:0</i></dt><dd>grabs the first video stream and maps it into:</dd>
+ <dt>-map 0:v:0</dt><dd>grabs the first video stream and maps it into:</dd>
<dt><i>video_output_file</i></dt><dd>path, name and extension of the video output file</dd>
- <dt>-map <i>0:a:0</i></dt><dd>grabs the first audio stream and maps it into:</dd>
+ <dt>-map 0:a:0</dt><dd>grabs the first audio stream and maps it into:</dd>
<dt><i>audio_output_file</i></dt><dd>path, name and extension of the audio output file</dd>
</dl>
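For the ffprobe check suggested above, a minimal sketch that lists each stream's index and type; input_file is a placeholder:

  # adjust input_file to the file you plan to split
  ffprobe -i input_file -show_entries stream=index,codec_type -of compact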
<p class="link"></p>
</div>
<!-- ends Split audio and video tracks -->

<!-- Merge audio and video tracks -->
<label class="recipe" for="merge_audio_video">Merge audio and video tracks</label>
<input type="checkbox" id="merge_audio_video">
<div class="hiding">
<h3>Merge audio and video tracks</h3>
<p><code>ffmpeg -i <i>video_file</i> -i <i>audio_file</i> -map 0:v -map 1:a -c copy <i>output_file</i></code></p>
<p>This command takes a video file and an audio file as inputs, and creates an output file that combines the video stream in the first file with the audio stream in the second file.</p>
<dl>
<dt>ffmpeg</dt><dd>starts the command</dd>
<dt>-i <i>video_file</i></dt><dd>path, name and extension of the first input file (the video file)</dd>
<dt>-i <i>audio_file</i></dt><dd>path, name and extension of the second input file (the audio file)</dd>
<dt>-map <i>0:v</i></dt><dd>selects the video streams from the first input file</dd>
<dt>-map <i>1:a</i></dt><dd>selects the audio streams from the second input file</dd>
<dt>-c copy</dt><dd>copies streams without re-encoding</dd>
<dt><i>output_file</i></dt><dd>path, name and extension of the output file</dd>
</dl>
<p><b>Note:</b> in the example above, the video input file is given prior to the audio input file. However, input files can be added in any order, as long as they are indexed correctly when stream mapping with <code>-map</code>. See the entry on <a href="#stream-mapping">stream mapping</a>.</p>
<h4>Variation:</h4>
<p>Include the audio tracks from both input files with the following command:</p>
<p><code>ffmpeg -i <i>video_file</i> -i <i>audio_file</i> -map 0:v -map 0:a -map 1:a -c copy <i>output_file</i></code></p>
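If the audio should be re-encoded rather than copied, for instance to produce an MP4 with AAC audio, a hedged variation on the command above is:

  # the .mp4 container and aac codec are illustrative choices, not from the original entry
  ffmpeg -i video_file -i audio_file -map 0:v -map 1:a -c:v copy -c:a aac output_file.mp4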
<p class="link"></p>
</div>
<!-- ends Merge audio and video tracks -->

<!-- Create ISO -->
<label class="recipe" for="create_iso">Create ISO files for DVD access</label>
<input type="checkbox" id="create_iso">

js/js.js (5 lines changed)
@@ -3,10 +3,11 @@ $(document).ready(function() {
  // open recipe window if a hash is found in URL
  if(window.location.hash) {
    id = window.location.hash
    console.log(id.substring(1))
    document.getElementById(id.substring(1)).checked = true;
    $('html, body').animate({ scrollTop: $(id).offset().top}, 1000);
    $(id).closest('div').find('.link').empty();
-   $(id).closest('div').find('.link').append("<small>Link to this command: <a href="+window.location.href+">"+window.location.href+"</a></small>");
+   $(id).closest('div').find('.link').append("<small>Link to this command: <a href='https://amiaopensource.github.io/ffmprovisr/index.html"+window.location.hash+"'>https://amiaopensource.github.io/ffmprovisr/index.html"+window.location.hash+"</a></small>");
  }

  // add hash URL when recipe is opened
@@ -14,7 +15,7 @@ $(document).ready(function() {
    id = $(this).attr("for");
    window.location.hash = ('#' + id)
    $('#' + id).closest('div').find('.link').empty();
-   $('#' + id).closest('div').find('.link').append("<small>Link to this command: <a href="+window.location.href+">"+window.location.href+"</a></small>");
+   $('#' + id).closest('div').find('.link').append("<small>Link to this command: <a href='https://amiaopensource.github.io/ffmprovisr/index.html"+window.location.hash+"'>https://amiaopensource.github.io/ffmprovisr/index.html"+window.location.hash+"</a></small>");
  });

});

README.md
@@ -44,7 +44,7 @@ You can read our contributor code of conduct [here](https://github.com/amiaopens

## Maintainers

- [Ashley Blewer](https://github.com/ablwr), [Katherine Frances Nagels](https://github.com/kfrn), [Kieran O'Leary](https://github.com/kieranjol) and [Reto Kromer](https://github.com/retokromer)
+ [Ashley Blewer](https://github.com/ablwr), [Katherine Frances Nagels](https://github.com/kfrn), [Kieran O'Leary](https://github.com/kieranjol), [Reto Kromer](https://github.com/retokromer) and [Andrew Weaver](https://github.com/privatezero)

## Contributors
* Gathered using [octohatrack](https://github.com/LABHR/octohatrack)
@@ -104,4 +104,6 @@ All Contributors: 22

## License

- <a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/80x15.png" /></a><br />This <span xmlns:dct="http://purl.org/dc/terms/" href="http://purl.org/dc/dcmitype/InteractiveResource" rel="dct:type">work</span> by <a xmlns:cc="http://creativecommons.org/ns#" href="http://amiaopensource.github.io/ffmprovisr/" property="cc:attributionName" rel="cc:attributionURL">ffmprovisr</a> is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>.<br />Based on a work at <a xmlns:dct="http://purl.org/dc/terms/" href="https://github.com/amiaopensource/ffmprovisr" rel="dct:source">https://github.com/amiaopensource/ffmprovisr</a>.
+ <a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/80x15.png"></a><br>
+ This <span xmlns:dct="http://purl.org/dc/terms/" href="http://purl.org/dc/dcmitype/InteractiveResource" rel="dct:type">work</span> by <a xmlns:cc="http://creativecommons.org/ns#" href="http://amiaopensource.github.io/ffmprovisr/" property="cc:attributionName" rel="cc:attributionURL">ffmprovisr</a> is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>.<br>
+ Based on a work at <a xmlns:dct="http://purl.org/dc/terms/" href="https://github.com/amiaopensource/ffmprovisr" rel="dct:source">https://github.com/amiaopensource/ffmprovisr</a>.