add: user submissions downloading

(but yeah, it's a kludge)
Kentai Radiquum 2022-06-15 23:40:40 +05:00
parent 253cc74f7d
commit 77760e7135
2 changed files with 41 additions and 15 deletions


@@ -1,7 +1,8 @@
 This branch is the development version of furaffinity-dl rewritten in python.
 # FurAffinity Downloader
-**furaffinity-dl** is a python script for batch downloading of galleries (and scraps/favourites) from furaffinity users.
+**furaffinity-dl** is a python script for batch downloading of galleries (and scraps/favourites) from furaffinity users, or your own submissions!
 It was written for preservation of culture, to counter the people nuking their galleries every once in a while.
 Supports all known submission types: images, text, flash and audio.
@@ -18,18 +19,20 @@ furaffinity-dl has only been tested on Linux, however it should also work o
 When downloading a folder, make sure to put everything after **/folder/**, for example 123456/Folder-Name-Here instead of just 123456 (ref [#60](https://github.com/Xerbo/furaffinity-dl/issues/60)).
-```
+```help
-usage: furaffinity-dl.py [-h] [--output OUTPUT] [--cookies COOKIES] [--ua UA] [--start START] [--dont-redownload] [--interval INTERVAL] [--metadir METADIR]
+usage: furaffinity-dl.py [-h] [--output OUTPUT] [--cookies COOKIES] [--ua UA] [--start START] [--stop STOP] [--dont-redownload]
+                         [--interval INTERVAL] [--metadir METADIR]
                          [category] [username] [folder]
-Downloads the entire gallery/scraps/favorites of a furaffinity user
+Downloads the entire gallery/scraps/favorites of a furaffinity user, or your submissions
 positional arguments:
   category              the category to download, gallery/scraps/favorites
   username              username of the furaffinity user
   folder                name of the folder (full path, for instance 123456/Folder-Name-Here)
-optional arguments:
+options:
   -h, --help            show this help message and exit
   --output OUTPUT, -o OUTPUT
                         output directory
@@ -38,6 +41,7 @@ optional arguments:
   --ua UA, -u UA        Your browser's useragent, may be required, depending on your luck
   --start START, -s START
                         page number to start from
+  --stop STOP, -S STOP  Page number to stop on. For favorites pages, specify the full URL after the username (1234567890/next).
   --dont-redownload, -d
                         Don't redownload files that have already been downloaded
   --interval INTERVAL, -i INTERVAL
@@ -50,10 +54,12 @@ Examples:
  python3 furaffinity-dl.py -o koulsArt gallery koul
  python3 furaffinity-dl.py -o mylasFavs favorites mylafox
-You can also log in to FurAffinity in a web browser and load cookies to download restricted content:
+You can also log in to FurAffinity in a web browser and load cookies to download age-restricted content or your submissions:
  python3 furaffinity-dl.py -c cookies.txt gallery letodoesart
+ python3 furaffinity-dl.py -c cookies.txt msg submissions
 DISCLAIMER: It is your own responsibility to check whether batch downloading is allowed by FurAffinity terms of service and to abide by them.
 ```
 You can also log in to download restricted content. To do that, log in to FurAffinity in your web browser and export your cookies to a file in Netscape format (there are extensions to do that [for Firefox](https://addons.mozilla.org/en-US/firefox/addon/ganbo/) and [for Chrome-based browsers](https://chrome.google.com/webstore/detail/cookiestxt/njabckikapfpffapmjgojcnbfjonfjfg)). You can then pass them to the script with the `-c` flag, like this (you may also have to provide your user agent):
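The diff does not show how the script actually consumes the exported cookies.txt, so here is a minimal, purely illustrative sketch of one way a Netscape-format cookie file can be loaded and sent along with a request. The file name, user-agent string, and URL are placeholders, and `requests` is assumed to be the HTTP library in use; the script itself may do this differently.

```python
# Sketch only: loading a Netscape-format cookies.txt exported from a browser.
# Assumes the `requests` library; not necessarily what furaffinity-dl.py does.
from http.cookiejar import MozillaCookieJar

import requests

jar = MozillaCookieJar("cookies.txt")
jar.load(ignore_discard=True, ignore_expires=True)  # keep session cookies too

headers = {"User-Agent": "Mozilla/5.0 (X11; Linux x86_64) ..."}  # placeholder UA
resp = requests.get("https://www.furaffinity.net/msg/submissions/",
                    cookies=jar, headers=headers)
resp.raise_for_status()
print(len(resp.text), "bytes of (logged-in) HTML")
```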
@@ -62,9 +68,9 @@ You can also log in to download restricted content. To do that, log in to FurAff
 ## TODO
 - Download user profile information.
 - "Classic" theme support
 - Login without having to export cookies
 ## Disclaimer


@@ -1,5 +1,6 @@
 #!/usr/bin/python3
 import argparse
+from types import NoneType
 from tqdm import tqdm
 from argparse import RawTextHelpFormatter
 import json
@@ -15,13 +16,14 @@ Please refer to LICENSE for licensing conditions.
 '''
 # Argument parsing
-parser = argparse.ArgumentParser(formatter_class=RawTextHelpFormatter, description='Downloads the entire gallery/scraps/favorites of a furaffinity user', epilog='''
+parser = argparse.ArgumentParser(formatter_class=RawTextHelpFormatter, description='Downloads the entire gallery/scraps/favorites of a furaffinity user, or your submissions', epilog='''
 Examples:
  python3 furaffinity-dl.py gallery koul
  python3 furaffinity-dl.py -o koulsArt gallery koul
  python3 furaffinity-dl.py -o mylasFavs favorites mylafox\n
-You can also log in to FurAffinity in a web browser and load cookies to download restricted content:
+You can also log in to FurAffinity in a web browser and load cookies to download age-restricted content or your submissions:
- python3 furaffinity-dl.py -c cookies.txt gallery letodoesart\n
+ python3 furaffinity-dl.py -c cookies.txt gallery letodoesart
+ python3 furaffinity-dl.py -c cookies.txt msg submissions\n
 DISCLAIMER: It is your own responsibility to check whether batch downloading is allowed by FurAffinity terms of service and to abide by them.
 ''')
 parser.add_argument('category', metavar='category', type=str, nargs='?', default='gallery', help='the category to download, gallery/scraps/favorites')
@@ -52,7 +54,7 @@ else:
 # Check validity of category
-valid_categories = ['gallery', 'favorites', 'scraps']
+valid_categories = ['gallery', 'favorites', 'scraps', 'msg']
 if args.category not in valid_categories:
     raise Exception('Category is not valid', args.category)
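As a side note on the design, the same whitelist could be enforced by argparse itself via its `choices` parameter instead of the manual list check above. The snippet below is only an illustration of that alternative, not what the script does:

```python
import argparse

# Illustration only: argparse can reject invalid categories on its own,
# replacing the manual membership test and raise Exception(...).
parser = argparse.ArgumentParser()
parser.add_argument('category', nargs='?', default='gallery',
                    choices=['gallery', 'favorites', 'scraps', 'msg'],
                    help='the category to download')

print(parser.parse_args(['msg']).category)   # -> msg
# parser.parse_args(['journal'])             # would exit with a usage error
```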
@@ -171,9 +173,11 @@ def download(path):
     return True
+global i
+i = 1
 # Main downloading loop
 while True:
     if args.stop and args.stop == page_num:
         print(f"Reached page {args.stop}, stopping.")
         break
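One subtlety worth spelling out: `page_num` is later taken from a URL segment, so it is a string, and the `--stop` value therefore has to stay a string (argparse's default) for this equality check to ever fire. A small sketch with a made-up href, not the script's exact code:

```python
# Illustration only, with a hypothetical href from a "Next" button.
next_page_link = '/gallery/koul/2/'
page_num = next_page_link.split('/')[-2]   # -> '2' (a str, not an int)

args_stop = '2'                            # argparse stores --stop as str unless type=int is given
if args_stop and args_stop == page_num:
    print(f"Reached page {args_stop}, stopping.")
```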
@@ -204,8 +208,24 @@ while True:
     for img in s.findAll('figure'):
         download(img.find('a').attrs.get('href'))
         sleep(args.interval)
+    if args.category == "msg":
+        next_button = s.find('a', class_='button standard more', text="Next 48")
+        if next_button is None or next_button.parent is None:
+            next_button = s.find('a', class_='button standard more-half', text="Next 48")
+            if next_button is None or next_button.parent is None:
+                print('Unable to find next button')
+                break
+        next_page_link = next_button.attrs['href']
+        if args.category != "favorites": i = i + 1
+        page_num = next_page_link.split('/')[-2]
+        page_url = base_url + next_page_link
+        print('Downloading page', i, page_url)
+    elif args.category != "favorites":
         next_button = s.find('button', class_='button standard', text="Next")
         if next_button is None or next_button.parent is None:
             print('Unable to find next button')
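To make the new branch easier to follow outside the diff, here is a self-contained sketch of the same "Next 48" pagination idea for the submissions page: find the next-page anchor by its button classes, fall back to the half-width variant, and follow its href. The base URL, session handling, and selector details are assumptions taken from the diff, not a verified description of FurAffinity's markup.

```python
# Sketch of "Next 48" pagination for /msg/submissions (assumptions noted above).
import requests
from bs4 import BeautifulSoup

BASE_URL = "https://www.furaffinity.net"

def iter_submission_pages(session: requests.Session, start_path: str = "/msg/submissions/"):
    """Yield a BeautifulSoup document per submissions page until no next button is found."""
    path = start_path
    while True:
        resp = session.get(BASE_URL + path)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        yield soup

        # The next-page link is an <a> styled as a button; the half-width variant
        # ("more-half") appears when a "Previous 48" button is shown alongside it.
        next_button = soup.find("a", class_="button standard more", string="Next 48")
        if next_button is None:
            next_button = soup.find("a", class_="button standard more-half", string="Next 48")
        if next_button is None:
            break                         # last page reached
        path = next_button.attrs["href"]  # hypothetical relative link to the next 48 submissions
```

A `requests.Session` carrying the exported cookies (see the cookie-loading sketch earlier) would be passed in, since the submissions page is only available when logged in.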