06-07-2021, 10:00 PM
I like extracting stills from videos, so I made this script to help automate the process. It works on Linux only, as it's a bash script. It grabs approx 10 screenshots per second, which means a 1080p 60-minute video can easily produce 20GB of screenshots, as these are full-quality PNGs.
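Rough math behind that figure, for reference (the per-frame size is an assumption, not a measurement):
Code:
# 60 min x 60 s x 10 frames/s = 36000 frames
# at roughly 0.5 MB per 1080p PNG that works out to somewhere near 17-20 GB
echo "$((60 * 60 * 10)) frames, roughly $((60 * 60 * 10 / 2 / 1024)) GB"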
Steps to use:
- Create a file named screenshot.sh in your home folder (or wherever else you like) and paste the code below into it
- Run chmod +x ~/screenshot.sh to make it executable
- Open a terminal where the video file you want is located (the script dumps its screenshots into whatever directory you call it from)
- Run ~/screenshot.sh /full/path/to/videofile.mp4 (an example call with the optional width argument is shown right after the script)
Code:
#!/bin/bash
# NOTE: if the frames folder already exists and contains files that match frame_%03d.png, no frames will be generated

# optional second argument: output width in pixels (-1 keeps the original width)
if [ -n "$2" ]
then
    size=$2
else
    size=-1
fi

# frames go into a "<videofile>frames" folder in the directory you call this from
tmp_dir="./$(basename "$1")frames"
mkdir -p "$tmp_dir"

echo -e "\033[33m Extract frames $1 (${size}px wide)"
echo -e "\033[32m ## Extracting frames..."

# pull 10 frames per second, scaled to $size pixels wide (height keeps the aspect ratio)
nice ionice ffmpeg -i "$1" -vf "scale=$size:-1" -r 10 "$tmp_dir/frame_%03d.png"

# jdupes removes duplicate frames, keeping one copy of each
jdupes -rdN "$tmp_dir"
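To give a concrete example of a call (the 1280 is just an illustration; the optional second argument sets the output width in pixels for the scale filter):
Code:
# scale the extracted frames down to 1280px wide; leave the number off for full size
~/screenshot.sh /full/path/to/videofile.mp4 1280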
If you get an error about ffmpeg being missing (or this script appears to not work), do an
Code:
apt install ffmpeg
or whatever you normally use to install software in your distro. Note that jdupes is just used because sometimes you get the same screenshot twice; it's not absolutely essential and is only a cleanup step. To reduce the number of frames per second, change
Code:
-r 10
to 1 or 5 or whatever you want.
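For example, at one frame per second the ffmpeg line in the script would become (only the -r value changes):
Code:
nice ionice ffmpeg -i "$1" -vf "scale=$size:-1" -r 1 "$tmp_dir/frame_%03d.png"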
This will make a directory with a TON of PNG files in it, which may crash your file manager when you try to open it; the solution is to split that directory. Here is another helper script for that. Call it as script.sh directory_to_be_split. It will split the files into subdirectories containing up to 5000 files each. You can change that limit where it says 5000 in the code.
Code:
#!/bin/bash
# Split the files in the given directory into subdirectories of at most 5000 files each
cd "$1" || exit 1
i=0

# collect every file with an extension, sorted in natural (version) order
readarray -d '' entries < <(printf '%s\0' *.* | sort -zV)

for f in "${entries[@]}"; do
    # dir_001 gets files 1-5000, dir_002 gets 5001-10000, and so on
    d=dir_$(printf %03d $((i/5000+1)))
    mkdir -p "$d"
    mv "$f" "$d"
    ((i++))
    echo "On file $i"
done
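As an illustration, assuming you saved it as split.sh in your home folder and want to split the frames folder produced by the first script (both names here are just examples):
Code:
chmod +x ~/split.sh
~/split.sh ./videofile.mp4frames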
Hope this is useful to somebody