One basic option that I know of is Squidalyser, http://ababa.org/
It doesn't do everything that you want, but I've found
it useful myself on a number of fronts.
Now it won't hive off things that have been downloaded, but
if your cache is tuned for it and you've got the disk space you
can keep some rather large objects in there.
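For reference, tuning the cache to hold large objects is a couple of lines in squid.conf; the sizes and paths below are just example values, adjust them to your own disk space:

```
# squid.conf fragment (example values only)
# Raise the largest object squid will cache (default is quite small):
maximum_object_size 200 MB
# Give the on-disk cache enough room: 10000 MB, 16 first-level dirs, 256 second-level
cache_dir ufs /var/spool/squid 10000 16 256
```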
It works by going through your squid logs & chucking them into a MySQL db
with an expiration date (default is one month).
Its frontend is a Perl CGI, and from this you can do a "relatively" quick
search on what's gone through squid & potentially from there save
stuff from your cache to disk.
it's also useful to see how much Pr0n your directors are grabbing per hour :)
Another (probably simpler) way is
"grep interesting_stuff squid/access.log" & from there attack your
(pruned) output with a proxy-configured wget.
I'm pretty sure that this wouldn't be too difficult to script.
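Something along these lines would do it. This is only a sketch: the log path, proxy address and port are assumptions for your setup, and it assumes squid's default "native" log format, where the request URL is field 7 of each line:

```shell
#!/bin/sh
# Sketch: pull matching URLs out of a squid access.log, then re-fetch
# them through the proxy so the cached copies land on local disk.

# Field 7 of squid's default "native" access.log format is the request URL.
extract_urls() {
    # $1 = grep pattern, $2 = path to access.log
    grep "$1" "$2" | awk '{print $7}' | sort -u
}

# Example usage (proxy host/port and log path are assumptions):
#   extract_urls '\.rpm$' /var/log/squid/access.log |
#   while read -r url; do
#       # -x recreates the URL as a directory tree, -nc skips existing files
#       http_proxy=http://localhost:3128/ wget -x -nc "$url"
#   done
```

The `wget -x` flag mirrors the URL into a `hostname/path/file` directory tree, which gives you roughly the `rpms/url_of_rpm/downloadedfile.rpm` layout asked about below.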
Justin MacCarthy wrote:
> I have a Linux gateway, with squid & firewall & IDS at home. I was
> wondering is there a way of caching / saving any downloads that are
> requested from machines on the network and copying them, along with
> their URLs, to a central repository.
> Something like,
> e.g. all RPMs are saved to rpms/url_of_rpm/downloadedfile.rpm
> Is there anything that does it? Can you get Squid to do something like this?