From: Wynne, Conor (Conor.Wynne at domain COMPAQ.COM)
Date: Fri 05 Apr 2002 - 16:30:13 IST
RPM and apt are nice and all.... [certainly if you are in a hurry]
But isn't it "better" to compile from source anyway, so that you get packages optimised for your architecture? Like i686/athlon or whatever processor optimisations?
I often come across sites that say, aw yeah, compile it yourself because it's better....
[I am talking about Squid in particular -> see squid.conf, where some directives say you can only use them if you compiled the source with the --blah-blah option. I think it's the dnsserver and child processes; too lazy to connect and check.]
Or is that just geek bull?
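For reference, the compile-it-yourself route looks roughly like this (just a
sketch; the tarball name is a placeholder and the exact ./configure switches
depend on the Squid version, so check ./configure --help first):

    # build from a source tarball with CPU-specific compiler flags
    tar xzf squid-2.x.tar.gz
    cd squid-2.x
    # -march=i686 asks gcc to optimise for that CPU family; pick your own
    CFLAGS="-O2 -march=i686" ./configure --prefix=/usr/local/squid
    make
    make install    # as root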
We have a cool utility internally here that creates an HTML file listing every package [and services/logs etc etc] installed or running on the system. It was ported from Tru64 yada yada yada.
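Nothing as fancy as that tool, but the bare-bones version of the same idea
on an RPM box could be as small as this (a hypothetical one-liner, packages
only, no services or logs):

    # dump the installed package list into a minimal HTML page
    ( echo "<html><body><pre>"; rpm -qa | sort; echo "</pre></body></html>" ) > inventory.html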
Bye Bye and have a good one.
> Package managers are very handy but sometimes you can only get a program
> in a .tar.gz type file. So my question is, what did people do before package
> managers to keep track of what was on their system? I know that it is
> possible to put code you download into a .rpm or whatever file and install it
> with the package manager, but just say you don't want to.
That's exactly why package management systems were developed - it's tricky,
to say the least, to keep track of what a make install installs without
going into the Makefile and making notes, and in the real world that's not
going to happen. Hence package managers evolved and have reached their
pinnacle in . . . (F**ked if I'm going to start another religious war this
early on a Friday morning)
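For the .tar.gz case, one old manual trick is to mark the time just before
installing and then list everything newer (a sketch; the paths and the
filelist name are arbitrary, and it only catches files under the prefix you
search):

    # crude manual tracking of what "make install" dropped on the system
    touch /tmp/install-marker
    make install
    find /usr/local -newer /tmp/install-marker -type f > /root/foo-1.0.filelist

    # or, if the Makefile honours DESTDIR, stage the install and record the list
    make install DESTDIR=/tmp/stage
    ( cd /tmp/stage && find . -type f ) > /root/foo-1.0.filelist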