profile.fav.PurgeRegexp('*'); profile.fav.Reset(*backup);
16 years ago
General
After years of seeing a huge swath of favorites that I never remembered adding (and suspected were either database corruption or the result of my early attempts at tracking users, before FA's more modern late-200x url format), I have finally made a list of the favorites I want to keep.
Lacking a Nuke button for favorites, I've opened a trouble ticket; however, at this time that feature doesn't even seem to be at Wishlist priority, so I've taken to doing it in a way that is at least automated, if less efficient.
ff3x-cookies.sh is a script for retrieving the cookies from a Firefox 3 sqlite3 database; I didn't write that code, but I did modify the wrapper.
The process is effectively: "load favorites page 0", "grab all the delete urls", "wget the urls", "exit on error", "loop".
Oh, one thing that is not automated is backing up the favorites the user wants to keep and re-adding them. I might see if the add-favorite url can be mutated from the backup urls (there's a rough sketch of that idea after the scripts below), but if not it won't take -that- long to re-add my favs.
#!/bin/bash
# ff3x-cookies.sh
# Dump a Firefox 3 cookies.sqlite as tab-separated Netscape
# cookies.txt lines on stdout.
SQLFILE=${1:-~/.mozilla/firefox/*.default/cookies.sqlite}
# Work on a copy in /dev/shm so we don't fight the lock a running
# Firefox holds on the live database.
cp $SQLFILE /dev/shm/cookies.sqlite
sqlite3 /dev/shm/cookies.sqlite <<EOF
.mode tabs
.header off
select host as domain,
case substr(host,1,1)='.' when 0 then 'FALSE' else 'TRUE' end as flag,
path,
case isSecure when 0 then 'FALSE' else 'TRUE' end as secure,
expiry as expiration, name, value from moz_cookies;
.exit
EOF
rm /dev/shm/cookies.sqlite
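For reference, those columns line up with the Netscape cookies.txt layout that wget's --load-cookies expects: domain, subdomain flag, path, secure flag, expiration (unix time), name, value. A quick smoke test looks something like this (the cookie name and values here are made up for illustration):

~/bin/ff3x-cookies.sh | grep furaffinity
.furaffinity.net	TRUE	/	FALSE	1255555555	a	0123456789abcdef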
#!/bin/sh
# whatever
# Grab just the FA cookies out of the browser's jar.
~/bin/ff3x-cookies.sh | grep furaffinity > /dev/shm/facookies.txt
# Loop while the last command exited cleanly: the grep on the first
# pass, the delete-url wget on every pass after that.
while [ "$?" = "0" ]
do
# Fetch page 0 of the favorites list.
wget --load-cookies /dev/shm/facookies.txt -O /dev/shm/fafavs 'http://www.furaffinity.net/controls/favorites/0/'
# Scrape every single-quoted delete link out of that page.
perl -ne 'if($_ =~ m{'"'"'(/controls/favorites/delete/[0-9]+/)'"'"'}){print "http://www.furaffinity.net$1 \n"}' \
/dev/shm/fafavs > /dev/shm/faurls
# No delete links left on page 0 means we're done.
if [ ! -s /dev/shm/faurls ] ; then break ; fi
# Hit each delete url in turn.
wget --load-cookies /dev/shm/facookies.txt -i /dev/shm/faurls -O /dev/null
done
rm /dev/shm/facookies.txt /dev/shm/fafavs /dev/shm/faurls
### Edit 1 # Added 1 space before \n to try to retain the \ in the face of lacking a [code] style tag.
### Edit 2 # Using a temporary file to test for an empty url list, since wget returns true when given no urls on its input.
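On the re-adding front, here's the rough shape of what I had in mind. This is an untested sketch, and the add-favorite url format is an assumption on my part; the live link on a submission page may well carry a per-session key, so check a real one before trusting this:

#!/bin/sh
# refav.sh -- UNTESTED sketch. Assumes keepers.txt holds one backed-up
# submission id per line, and that /fav/<id>/ is a valid add-favorite
# url (hypothetical; the real link may require a key parameter).
while read id
do
wget --load-cookies /dev/shm/facookies.txt -O /dev/null \
"http://www.furaffinity.net/fav/$id/"
done < keepers.txt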
