Just wondering what the best way is to prevent bulk website-downloader software from accessing the albums and downloading everything?
I'm running 1.4 on a Windows Apache server. I tried playing around with .htaccess, but I was wondering if there is a better way to secure the files so that people can only access them via the Coppermine website.
This is a tricky subject, as there is little difference between a legitimate human visitor, a (legitimate) search engine spider and an (abusive) offline copier. There's no method built into Coppermine. Read the FAQ from one of the offline copier projects: http://www.httrack.com/html/abuse.html#WEBMASTERS
If you ask me, there are two suitable options:
- use robots.txt to keep legitimate spiders and copiers from indexing/downloading Coppermine's albums folder, since copying the actual content (your pics and multimedia files) is what steals most of the bandwidth
- use honeypot scripts that slow down copiers (option 1i, "Use technical tricks to lag offline browsers", in the HTTrack FAQ I referred to above)
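To illustrate the robots.txt option, here's a minimal sketch. It assumes Coppermine is installed in /cpg/; adjust the path to match your own install:

```
# robots.txt, placed in the web root
# Path assumes Coppermine lives in /cpg/ (hypothetical; use your own path)
User-agent: *
Disallow: /cpg/albums/
```

Keep in mind robots.txt only stops well-behaved clients: HTTrack honors it by default, but its users can turn that off, which is why the honeypot approach exists as a second line of defense.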
Quick tip for Apache: make sure you have turned indexes off for folders that don't contain an index.php/index.htm/index.html.
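For the Apache tip above, this is roughly what it looks like in an .htaccess file (assuming your server config's AllowOverride allows Options to be set there; it can equally go in the main server config inside a Directory block):

```
# .htaccess: disable automatic directory listings
# for this folder and everything below it
Options -Indexes
```

With this in place, requests to a folder without an index file get a 403 instead of a file listing.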
Your question is not directly related to Coppermine (as suggested, Coppermine doesn't have a built-in feature to prevent bulk downloading except making registration mandatory), but more a matter of webserver setup, so I suggest googling for this: there are certainly better resources than this forum to figure out details.
cool, thanks for the quick reply.
Yeah, I've already got indexes turned off, and viewing/listing images in the albums isn't possible unless you know the picture's full name/path.
I will add the robots.txt, though I'm more concerned about rogue attempts at downloading all pictures using a program like HTTrack, not so much because of bandwidth but because some pictures are more or less private.
In any case, I'll take a look at the scripts as well.
Thanks!