On Monday 03 July 2006 09:26, rance at frontiernet.net wrote:
> I'm open to nfs, ftp, or http content delivery to clients in case one
> or the other makes a difference.

If I wanted to set up and then support a large number of workstations,
my answer would perhaps be "hackish" but would work. YMMV.

1) Set up a yum repo webserver on the local network. This makes
workstation installations go much faster, and allows me to control the
rollout of updates. (sketch #1 below my sig)

2) Set up a single workstation with all the packages that you want.
   -> Set up yum on the workstation to use the yum repo.
   -> Use rpm -qa and make a stupid-simple "yum install `rpm -qa`"
      script out of it.
   -> Put the yum files (/etc/yum*) and the yum install script on the
      yum repo webserver as a .tgz file. (sketch #2 below)

3) Install new workstations with minimal installs of CentOS (only the
first CD is needed if you choose "custom" and then uncheck all
packages), which takes just a couple of minutes per system.

4) wget the setup script that you built in #2, and run it. Since it's
all local, it will install very fast. (sketch #3 below)

With this method, I could probably set up 5-10 workstations per hour if
they were pre-built and had a fast network. This method has the
additional advantage of leaving the systems preconfigured to get
updates whose rollout I control.

I would additionally recommend setting up a cron script on each
workstation to do updates automatically ("yum update") every night at
3 AM. Grep for "kernel" in the output to toggle an automatic reboot,
if that makes sense for your environment. (sketch #4 below)

I use similar methods to set up porn-filtering web-proxy servers (that
we sell to schools). Thus, I can roll out security updates
automatically for the entire organization in < 24 hours while
performing only a few minutes of work.

Again, there may be "better" ways to achieve the above, but the shell
scripting to support it would take me about 4 hours to cook up, and
would work quite well.

-Ben

-- 
"The best way to predict the future is to invent it."
 - XEROX PARC slogan, circa 1978
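P.S. Rough sketches of the scripts I mean, appended below my sig. All
hostnames, mirror URLs, and paths in them are invented for
illustration, so adjust to taste.

Sketch #1, the repo webserver (step 1). Any local mirror plus
createrepo will do; this is just one layout:

  #!/bin/sh
  # One-time setup on the webserver: mirror the distro RPMs into the
  # Apache docroot and build yum metadata over them.
  # mirror.example.org and the paths are invented for this sketch.
  mkdir -p /var/www/html/centos/4/os /var/www/html/centos/4/updates
  rsync -av rsync://mirror.example.org/centos/4/os/i386/ \
      /var/www/html/centos/4/os/
  rsync -av rsync://mirror.example.org/centos/4/updates/i386/ \
      /var/www/html/centos/4/updates/
  createrepo /var/www/html/centos/4/os
  createrepo /var/www/html/centos/4/updates

Since you decide when the rsync runs, you decide when updates become
visible to the workstations -- that's the rollout control.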
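Sketch #2, run on the golden workstation (step 2). One tweak over the
literal "yum install `rpm -qa`": I capture package NAMES only, since
raw rpm -qa output carries version strings that yum can't resolve once
the repo moves past them. Filenames are invented:

  #!/bin/sh
  # Capture the golden box's package set, package names only.
  rpm -qa --qf '%{NAME}\n' | sort -u > pkglist.txt

  # The stupid-simple install script itself.
  echo '#!/bin/sh'                        >  ws-setup.sh
  echo 'yum -y install `cat pkglist.txt`' >> ws-setup.sh
  chmod +x ws-setup.sh

  # Bundle the yum config with the installer; tar strips the leading /
  # so the etc/ half can be overlaid onto a new box with -C / later.
  tar czf ws-setup.tgz /etc/yum* ws-setup.sh pkglist.txt
  # ...then copy ws-setup.tgz into the repo webserver's docroot.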
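Sketch #3, run on each freshly minimal-installed box (step 4).
repo.example.lan stands in for whatever your repo webserver is:

  #!/bin/sh
  # Bootstrap a fresh minimal install from the local repo.
  cd /tmp
  wget http://repo.example.lan/ws-setup.tgz

  # Overlay the golden box's yum config (etc/yum*) onto this system,
  # then unpack the installer here and run it.
  tar xzf ws-setup.tgz -C / etc
  tar xzf ws-setup.tgz ws-setup.sh pkglist.txt
  sh ws-setup.sh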
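Sketch #4, the nightly updater with the "grep for kernel" reboot
toggle. The log path and the 5-minute grace period are just one way to
do it:

  #!/bin/sh
  # Called from root's crontab, e.g.:
  #   0 3 * * * /usr/local/sbin/nightly-update
  # Paths here are invented for the sketch.

  LOG=/var/log/nightly-yum.log
  yum -y update > $LOG 2>&1

  # A new kernel only takes effect after a reboot; give a short grace
  # period. Comment this out on boxes that shouldn't bounce themselves.
  if grep -qi kernel $LOG; then
      /sbin/shutdown -r +5 "rebooting into updated kernel"
  fi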