General code improvement

Using tabs instead of spaces
Fixed argument parsing
Tidied up some code
Long arguments are no longer supported (blame getopts).
Xerbo 2019-07-23 15:47:48 +01:00
parent 6d745c1228
commit 3821157d09
2 changed files with 108 additions and 111 deletions


@@ -12,22 +12,22 @@ Windows users can get it to work via Microsoft's [WSL](https://docs.microsoft.co
## Usage
Make it executable with
`chmod +x furaffinity-dl`
And then run it with
`./furaffinity-dl section/username`
All files from the given section and user will be downloaded to the current directory.
### Examples
`./furaffinity-dl gallery/mylafox`
`./furaffinity-dl -o mylasArt gallery/mylafox`
`./furaffinity-dl -o koulsFavs favorites/koul`
You can also log in to download restricted content. To do that, log in to FurAffinity in your web browser, export cookies to a file from your web browser in Netscape format (there are extensions to do that [for Firefox](https://addons.mozilla.org/en-US/firefox/addon/ganbo/) and [for Chrome/Vivaldi](https://chrome.google.com/webstore/detail/cookiestxt/njabckikapfpffapmjgojcnbfjonfjfg)) and pass the file to the script with the `-c` option, like this:
`./furaffinity-dl -c /path/to/your/cookies.txt gallery/gonnaneedabiggerboat`
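For reference, a Netscape-format cookies file is plain text with one tab-separated line per cookie: domain, subdomain flag, path, secure flag, expiry timestamp, name and value. A minimal sketch (the cookie names and values below are placeholders, not real FurAffinity cookies):

```
# Netscape HTTP Cookie File
.furaffinity.net	TRUE	/	TRUE	1893456000	a	00000000-0000-0000-0000-000000000000
.furaffinity.net	TRUE	/	TRUE	1893456000	b	00000000-0000-0000-0000-000000000000
```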
## TODO
* Download user bio, post tags and ideally user comments


@@ -1,99 +1,94 @@
#!/bin/bash
set -e

# Detect installed metadata injectors
eyed3=true
if [ -z "$(command -v eyeD3)" ]; then
	eyed3=false
	echo "INFO: eyed3 is not installed, no metadata will be injected into music files."
fi

exiftool=true
if [ -z "$(command -v exiftool)" ]; then
	exiftool=false
	echo "INFO: exiftool is not installed, no metadata will be injected into pictures."
fi
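# Note: "command -v" finds the tools anywhere on PATH, rather than assuming they live in /usr/bin.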
# Helper functions
help() {
	echo "Usage: $0 [ARGUMENTS] SECTION/USER
Downloads the entire gallery/scraps/favorites of any furaffinity user.
Arguments:
 -h (H)elp screen
 -i Use an (I)nsecure connection when downloading
 -o The (O)utput directory to put files in
 -c If you need to download restricted content
    you can provide a path to a (C)ookie file
 -p (P)lain file without any additional metadata
Examples:
 $0 gallery/mylafox
 $0 -o mylasArt gallery/mylafox
 $0 -o koulsFavs favorites/koul
You can also log in to FurAffinity to download restricted content, like this:
 $0 -c /path/to/your/cookies.txt gallery/gonnaneedabiggerboat
DISCLAIMER: It is your own responsibility to check whether batch downloading is allowed by FurAffinity's terms of service and to abide by them."
	exit 1
}

cleanup() {
	rm -r "$tempfile"
}
# Arguments
[[ $# -eq 0 ]] && help

outdir="."
prefix="https:"
metadata=true
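# In the getopts option string, a trailing ':' marks a flag that takes an
# argument; getopts only understands single-letter flags, hence no long options.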
case "$1" in metadata=true
# Help while getopts 'o:c:i:p:h:' flag; do
-h|--help) help;; case "${flag}" in
o) outdir=${OPTARG};;
# HTTP / HTTPS c) cookiefile=${OPTARG};;
-i|--http) prefix="http:";; i) prefix="http:";;
p) metadata=false;;
h) help;;
*) help;;
esac
done
# Attempt to create the output directory
mkdir -p "$outdir"
# Setup runtime directory
runtime_dir="$HOME"'/.cache/furaffinity-dl'
mkdir -p "$runtime_dir"
tempfile="$(umask u=rwx,g=,o= && mktemp "$runtime_dir"/fa-dl.XXXXXXXXXX)"
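# (the umask runs inside the subshell, so the temp file is created with
# user-only permissions without changing the umask for the rest of the script)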
# Call cleanup function on exit
trap cleanup EXIT
if [ "$cookiefile" = "" ]; then if [ -z "$cookiefile" ]; then
# Set wget with a custom user agent # Set wget with a custom user agent
fwget() { fwget() {
wget -nv --user-agent="Mozilla/5.0 furaffinity-dl-ng (https://github.com/Xerbo/furaffinity-dl-ng)" $* wget -nv --user-agent="Mozilla/5.0 furaffinity-dl (https://github.com/Xerbo/furaffinity-dl)" $*
} }
else else
# Set wget with a custom user agent and cookies # Set wget with a custom user agent and cookies
fwget() { fwget() {
wget -nv --user-agent="Mozilla/5.0 furaffinity-dl-ng (https://github.com/Xerbo/furaffinity-dl-ng)" --load-cookies "$cookiefile" $* wget -nv --user-agent="Mozilla/5.0 furaffinity-dl (https://github.com/Xerbo/furaffinity-dl)" --load-cookies "$cookiefile" $*
} }
fi fi
url="https://www.furaffinity.net/${@: -1}"
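# ("${@: -1}" expands to the last positional parameter, i.e. the SECTION/USER argument)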
# Iterate over the gallery pages with thumbnails and links to artwork view pages
while true; do
	fwget "$url" -O "$tempfile"
	if [ -n "$cookiefile" ] && grep -q 'furaffinity.net/login/' "$tempfile"; then
		echo "ERROR: You have provided a cookies file, but it does not contain valid cookies.
If this file used to work, this means that the cookies have expired;
you will have to log in to FurAffinity from your web browser and export the cookies again.
@@ -103,56 +98,58 @@ in Netscape format (this is normally done through \"cookie export\" browser exte
and supplied the correct path to the cookies.txt file to this script.
If that doesn't resolve the issue, please report the problem at
https://github.com/Xerbo/furaffinity-dl/issues" >&2
		exit 1
	fi
	# Get the URL for the next page out of the "Next" button. Required for favorites, whose pages are not numbered
	next_page_url="$(grep '<a class="button-link right" href="' "$tempfile" | grep '">Next &nbsp;&#x276f;&#x276f;</a>' | cut -d '"' -f 4 | sort -u)"
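	# (illustrative: the greps match markup shaped like
	#  <a class="button-link right" href="/favorites/user/123/next">Next &nbsp;&#x276f;&#x276f;</a>)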
	# Extract links to pages with individual artworks and iterate over them
	artwork_pages=$(grep '<a href="/view/' "$tempfile" | grep -E --only-matching '/view/[[:digit:]]+/' | uniq)
	for page in $artwork_pages; do
		# Download the submission page
		fwget -O "$tempfile" 'https://www.furaffinity.net'"$page"

		if grep -q "System Message" "$tempfile"; then
			echo "WARNING: $page seems to be inaccessible, skipping."
			continue
		fi
		# Get the full size image URL.
		# This will be a facdn.net link; we default to HTTPS,
		# but this can be disabled with -i for specific reasons
		image_url=$prefix$(grep --only-matching --max-count=1 ' href="//d.facdn.net/art/.\+">Download' "$tempfile" | cut -d '"' -f 2)

		# Get metadata
		description=$(grep 'og:description" content="' "$tempfile" | cut -d '"' -f4)
		title=$(grep 'og:title" content="' "$tempfile" | cut -d '"' -f4)
		title="${title%" by"*}" # Remove the " by Artist" bit
		file_type=${image_url##*.}
		file="$outdir/$title.$file_type"
wget "$image_url" -O "$file"
# Add metadata
if [ $file_type == "mp3" ] || [ $file_type == "wav" ] || [ $file_type == "wmv" ] || [ $file_type == "ogg" ] || [ $file_type == "flac" ]; then
# Use eyeD3 for injecting injecting metadata into audio files (if it's installed)
if [ $eyed3 ] && [ $metadata ]; then
if [ -z "$description" ]; then
eyeD3 -t "$title" "$file"
else
# HACK eyeD3 throws an error if a description containing a ":"
eyeD3 -t "$title" --add-comment "${description//:/\\:}" "$file"
fi
fi
elif [ $file_type == "png" ] || [ $file_type == "jpg" ] || [ $file_type == "jpeg" ]; then
# Use exiftool for injecting metadata into pictures (if it's installed)
if [ $exiftool ] && [ $metadata ]; then
exiftool "$file" -description="$description" -title="$title"
fi
fi
done
[ "$next_page_url" = "" ] && break # Download the image
url='https://www.furaffinity.net'"$next_page_url" wget "$image_url" -O "$file"
# Add metadata
if [ $file_type == "mp3" ] || [ $file_type == "wav" ] || [ $file_type == "wmv" ] || [ $file_type == "ogg" ] || [ $file_type == "flac" ]; then
# Use eyeD3 for injecting metadata into audio files (if it's installed)
if [ $eyed3 ] && [ $metadata ]; then
if [ -z "$description" ]; then
eyeD3 -t "$title" "$file"
else
# HACK: eyeD3 throws an error if a description containing a ":"
eyeD3 -t "$title" --add-comment "${description//:/\\:}" "$file"
fi
fi
elif [ $file_type == "png" ] || [ $file_type == "jpg" ] || [ $file_type == "jpeg" ]; then
# Use exiftool for injecting metadata into pictures (if it's installed)
if [ $exiftool ] && [ $metadata ]; then
exiftool "$file" -description="$description" -title="$title" -overwrite_original
fi
fi
done
[ -z "$next_page_url" ] && break
url='https://www.furaffinity.net'"$next_page_url"
done done