Merge pull request #19 from Xerbo/master

Ownership migration/changes
This commit is contained in:
Sergey "Shnatsel" Davidoff 2019-07-21 11:44:51 +02:00 committed by GitHub
commit 6d745c1228
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
2 changed files with 119 additions and 55 deletions


@@ -1,39 +1,36 @@
 # FurAffinity Downloader
 **furaffinity-dl** is a bash script for batch downloading of galleries and favorites from furaffinity.net users.
-I've written it for preservation of culture, to counter the people nuking their galleries every once a while.
+It was written for preservation of culture, to counter the people nuking their galleries every once in a while.
-Supports all known submission types: images, texts and audio. Sorts downloaded files in chronological order.
-I'd like to eventually expand it to download the description pages as well. Patches are welcome!
+Supports all known submission types: images, texts and audio.
 ## Requirements
-Coreutils, bash and wget are the only dependencies.
+Coreutils, bash and wget are the only dependencies. However, if you want to embed metadata into files you will need eyeD3 and exiftool.
 furaffinity-dl was tested only on Linux. It should also work on Mac and BSDs.
-Windows users can get it to work via Microsoft's "BASH on Windows". Cygwin is not supported.
+Windows users can get it to work via Microsoft's [WSL](https://docs.microsoft.com/en-us/windows/wsl/install-win10). Cygwin is not supported.
 ## Usage
-`furaffinity-dl section/username`
+Make it executable with
+`chmod +x furaffinity-dl-ng`
+and run it with
+`./furaffinity-dl-ng section/username`
 All files from the given section and user will be downloaded to the current directory.
 ### Examples
-`furaffinity-dl gallery/kodardragon`
-`furaffinity-dl scraps/---`
-`furaffinity-dl favorites/kivuli`
+`./furaffinity-dl-ng gallery/mylafox`
+`./furaffinity-dl-ng -o=mylasArt gallery/mylafox`
+`./furaffinity-dl-ng --out=koulsFavs favorites/koul`
 You can also log in to download restricted content. To do that, log in to FurAffinity in your web browser, export cookies to a file from your web browser in Netscape format (there are extensions to do that [for Firefox](https://addons.mozilla.org/en-US/firefox/addon/ganbo/) and [for Chrome/Vivaldi](https://chrome.google.com/webstore/detail/cookiestxt/njabckikapfpffapmjgojcnbfjonfjfg)) and pass the file to the script, like this:
-`furaffinity-dl gallery/gonnaneedabiggerboat /path/to/your/cookies.txt`
+`./furaffinity-dl-ng -c=/path/to/your/cookies.txt gallery/gonnaneedabiggerboat`
 ## TODO
-* Download author's description of the artwork, and ideally the entire description page along with user comments
+* Download user bio, post tags and ideally user comments
 ## Disclaimer
-It is your own responsibility to check whether batch downloading is allowed by FurAffinity terms of service and to abide by them. For further disclaimers see LICENSE.
-## See also
-There is a similar downloader for VCL art library, see https://github.com/Shnatsel/vcl-dl
+It is your own responsibility to check whether batch downloading is allowed by FurAffinity's terms of service and to abide by them. For further disclaimers see LICENSE.
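The `-o=DIR` / `-c=FILE` style options shown in the examples attach their value with `=`; the script splits the value off with POSIX shell parameter expansion rather than an external tool. A minimal sketch of that idiom (the `arg` variable here is illustrative, not part of the script):

```shell
#!/bin/sh
# "${var#*=}" strips the shortest prefix ending in '=',
# leaving only the value part of a "-o=DIR" style flag.
arg='-o=mylasArt'
outdir="${arg#*=}"
echo "$outdir"   # prints: mylasArt
```

`--out=DIR` matches the same `-o=*|--out=*` case pattern, so both spellings reach the same expansion.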


@@ -1,52 +1,99 @@
 #!/bin/bash
 set -e
-if [ "$1" = "" ] || [ "$1" = "-h" ] || [ "$1" = "--help" ]; then
-	echo "Usage: $0 SECTION/USER [YOUR_USERNAME]
-Downloads the entire gallery/scraps/favorites of any furaffinity.net user.
-Examples:
- $0 gallery/kodardragon
- $0 scraps/---
- $0 favorites/kivuli
-You can also log in to FurAffinity and download restricted content, like this:
- $0 gallery/gonnaneedabiggerboat /path/to/your/cookies.txt"
-	exit 1
-fi
+# Detect installed applications
+if [ -f /usr/bin/eyeD3 ]; then
+	eyed3=true
+else
+	eyed3=false
+	echo "INFO: eyeD3 is not installed, no metadata will be injected into music files."
+fi
+if [ -f /usr/bin/exiftool ]; then
+	exiftool=true
+else
+	exiftool=false
+	echo "INFO: exiftool is not installed, no metadata will be injected into pictures."
+fi
-runtime_dir="$HOME"'/.cache/furaffinity-dl'
-mkdir -p "$runtime_dir"
-tempfile="$(umask u=rwx,g=,o= && mktemp $runtime_dir/fa-dl.XXXXXXXXXX)"
+# Helper functions
+help() {
+	echo "Usage: $0 SECTION/USER [ARGUMENTS]
+Downloads the entire gallery/scraps/favorites of any furaffinity user.
+Arguments:
+ -h --help       This text
+ -i --http       Use an insecure connection
+ -o --out        The directory to put files in
+ -c --cookiefile If you need to download restricted content
+                 you can provide a path to a cookie file
+ -p --nometa     Plain file without any additional metadata
+Examples:
+ $0 gallery/mylafox
+ $0 -o=myla gallery/mylafox
+ $0 --out=koul favorites/koul
+You can also log in to FurAffinity to download restricted content, like this:
+ $0 -c=/path/to/your/cookies.txt gallery/gonnaneedabiggerboat
+DISCLAIMER: It is your own responsibility to check whether batch downloading is allowed by FurAffinity's terms of service and to abide by them."
+	exit 1
+}
+cleanup() {
+	rm -r "$tempfile"
+}
+# Arguments
+if [ $# -eq 0 ]; then
+	help
+fi
+prefix="https:"
+outdir="."
+metadata=true
+case "$1" in
+	# Help
+	-h|--help) help;;
+	# HTTP / HTTPS
+	-i|--http) prefix="http:";;
+	# Output directory
+	-o=*|--out=*) outdir="${1#*=}"; shift 1;;
+	# Cookie file
+	-c=*|--cookiefile=*) cookiefile="${1#*=}"; shift 1;;
+	# Metadata
+	-p|--nometa) metadata=false;;
+esac
+mkdir -p "$outdir"
+runtime_dir="$HOME"'/.cache/furaffinity-dl-ng'
+mkdir -p "$runtime_dir"
+tempfile="$(umask u=rwx,g=,o= && mktemp "$runtime_dir"/fa-dl.XXXXXXXXXX)"
+trap cleanup EXIT
-COOKIES_FILE="$2"
-if [ "$COOKIES_FILE" = "" ]; then
-	# set a wget wrapper with custom user agent
+if [ "$cookiefile" = "" ]; then
+	# Set a wget wrapper with a custom user agent
 	fwget() {
-		wget --user-agent="Mozilla/5.0 furaffinity-dl (https://github.com/Shnatsel/furaffinity-dl)" $*
+		wget -nv --user-agent="Mozilla/5.0 furaffinity-dl-ng (https://github.com/Xerbo/furaffinity-dl-ng)" "$@"
 	}
 else
-	# set a wget wrapper with custom user agent and cookies
+	# Set a wget wrapper with a custom user agent and cookies
 	fwget() {
-		wget --user-agent="Mozilla/5.0 furaffinity-dl (https://github.com/Shnatsel/furaffinity-dl)" \
-			--load-cookies "$COOKIES_FILE" $*
+		wget -nv --user-agent="Mozilla/5.0 furaffinity-dl-ng (https://github.com/Xerbo/furaffinity-dl-ng)" --load-cookies "$cookiefile" "$@"
 	}
 fi
-url=https://www.furaffinity.net/"$1"
+url="https://www.furaffinity.net/${@: -1}"
 # Iterate over the gallery pages with thumbnails and links to artwork view pages
 while true; do
-	fwget -O "$tempfile" "$url"
-	if [ "$COOKIES_FILE" != "" ] && grep -q 'furaffinity.net/login/' "$tempfile"; then
-		echo "--------------
-ERROR: You have provided a cookies file, but it does not contain valid cookies.
+	fwget "$url" -O "$tempfile"
+	if [ "$cookiefile" != "" ] && grep -q 'furaffinity.net/login/' "$tempfile"; then
+		echo "ERROR: You have provided a cookies file, but it does not contain valid cookies.
 If this file used to work, this means that the cookies have expired;
 you will have to log in to FurAffinity from your web browser and export the cookies again.
@@ -56,7 +103,7 @@
 in Netscape format (this is normally done through \"cookie export\" browser extensions)
 and supplied the correct path to the cookies.txt file to this script.
 If that doesn't resolve the issue, please report the problem at
-https://github.com/Shnatsel/furaffinity-dl/issues" >&2
+https://github.com/Xerbo/furaffinity-dl-ng/issues" >&2
 		exit 1
 	fi
@@ -75,15 +122,35 @@
 	fi
 	# Get the full size image URL.
-	# This will be a facdn.net link, we have to use HTTP
-	# to get around DPI-based page blocking in some countries.
-	image_url='http:'$(grep --only-matching --max-count=1 ' href="//d.facdn.net/art/.\+">Download' "$tempfile" | cut -d '"' -f 2)
-	# TODO: Get the submission title out of the page
-	# this trick may come in handy for avoiding slashes in filenames:
-	# | tr '/' ''
-	wget --timestamping "$image_url"
+	# This will be a facdn.net link; we default to HTTPS,
+	# but this can be disabled with -i or --http for specific reasons.
+	image_url=$prefix$(grep --only-matching --max-count=1 ' href="//d.facdn.net/art/.\+">Download' "$tempfile" | cut -d '"' -f 2)
+	# Get metadata
+	description=$(grep 'og:description" content="' "$tempfile" | cut -d '"' -f 4)
+	title=$(grep 'og:title" content="' "$tempfile" | cut -d '"' -f 4)
+	file_type=${image_url##*.}
+	file="$outdir/$title.$file_type"
+	wget "$image_url" -O "$file"
+	# Add metadata
+	if [ "$file_type" = "mp3" ] || [ "$file_type" = "wav" ] || [ "$file_type" = "wmv" ] || [ "$file_type" = "ogg" ] || [ "$file_type" = "flac" ]; then
+		# Use eyeD3 to inject metadata into audio files (if it's installed)
+		if [ "$eyed3" = true ] && [ "$metadata" = true ]; then
+			if [ -z "$description" ]; then
+				eyeD3 -t "$title" "$file"
+			else
+				# HACK: eyeD3 throws an error if a description contains a ":"
+				eyeD3 -t "$title" --add-comment "${description//:/\\:}" "$file"
+			fi
+		fi
+	elif [ "$file_type" = "png" ] || [ "$file_type" = "jpg" ] || [ "$file_type" = "jpeg" ]; then
+		# Use exiftool to inject metadata into pictures (if it's installed)
+		if [ "$exiftool" = true ] && [ "$metadata" = true ]; then
+			exiftool "$file" -description="$description" -title="$title"
+		fi
+	fi
 	[ "$next_page_url" = "" ] && break
 done
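The metadata handling this commit adds pulls `og:title` and `og:description` out of the downloaded page with a plain grep/cut pipeline. A self-contained sketch of that pipeline against a stand-in HTML fragment (the sample meta tags are fabricated for illustration; the real script fetches the page into `$tempfile` with its `fwget` wrapper):

```shell
#!/bin/sh
# Stand-in for a downloaded submission page.
tempfile=$(mktemp)
cat > "$tempfile" <<'EOF'
<meta property="og:title" content="Example Title">
<meta property="og:description" content="An example description.">
EOF
# Same pipeline as the script: split the matching line on double
# quotes and take field 4, i.e. the value of the content attribute.
title=$(grep 'og:title" content="' "$tempfile" | cut -d '"' -f 4)
description=$(grep 'og:description" content="' "$tempfile" | cut -d '"' -f 4)
echo "$title"         # prints: Example Title
echo "$description"   # prints: An example description.
rm -f "$tempfile"
```

Field 4 of the `"`-delimited split is the `content` attribute's value only because the attributes appear in this exact order; a real HTML parser would be more robust, but the grep/cut form keeps the script dependency-free.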