The ongoing discussion regarding Perl modules and RPMs prompted me to post an issue I am currently facing. I support a development team that uses various third-party tools, some free, some not, some of which are installed via RPMs, and some that are not. The development cycle is 18-24 months long, and then there are many years of support (continuing engineering) after that, during which the various versions of the tools need to remain available. There are a few dozen engineers on the project.
Let's say that I want to install SuperTool V1.1, and it's available only via RPMs. After thorough testing/qualification on one workstation, I need to install it on the few dozen workstations, which is straightforward but possibly time-consuming (and you have to deal with workstations that may be down when you are doing the installs). When V1.2 comes out, I can again test it on one workstation and then deploy it on the remaining workstations. But what if there is a bug, and we need to test it against the old version of the tool? With RPMs, it is cumbersome to go back and forth between versions. Contrast that with a tar-based install, where I can untar the tool into a directory that includes a version number in the path, and then automount that directory on all the workstations. There is only one install, and I can easily move back and forth among the various versions.
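The versioned-directory scheme above can be sketched like this (the directory names are made up, and /tmp stands in for the NAS/automount path; a "current" symlink makes switching versions a one-line change):

```shell
#!/bin/sh
# Each release is untarred into its own versioned directory
# (paths here are hypothetical; /tmp stands in for the shared mount).
mkdir -p /tmp/tools/SuperTool-1.1 /tmp/tools/SuperTool-1.2

# Deploy V1.2 as the default version:
ln -sfn /tmp/tools/SuperTool-1.2 /tmp/tools/SuperTool-current

# Bug report against the old release? Point back at V1.1:
ln -sfn /tmp/tools/SuperTool-1.1 /tmp/tools/SuperTool-current

readlink /tmp/tools/SuperTool-current   # -> /tmp/tools/SuperTool-1.1
```

Users and build scripts reference SuperTool-current (or an explicit versioned path when they need to pin a release), so no per-workstation work is needed when switching.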
Now I understand the benefits of RPMs, but when it comes to supporting tool versions, dealing with tar-based installations and using automounts really simplifies things. I am thinking of installing the RPM-based tools on one system and then copying the entire install tree to a NAS and automounting it. I know this is not ideal, but I can't think of a better solution. Any thoughts from the members of this list? Have you tried to solve similar problems in the past? If so, how did you do it?
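For what it's worth, one way the copy-the-RPM-tree idea might look in practice is the sketch below; the staging path, NAS host, export path, and autofs map entries are all hypothetical, and this is only viable for tools that keep everything under their install prefix:

```shell
# Install the RPM into a private root on the build box, with its own
# RPM database so the host's database is untouched (paths hypothetical):
rpm -ivh --root /staging/supertool-1.1 \
    --dbpath /var/lib/rpm-supertool-1.1 supertool-1.1.rpm

# Copy the resulting tree to the NAS export:
rsync -a /staging/supertool-1.1/ nas:/export/tools/supertool-1.1/

# On each workstation, an autofs map mounts it read-only on demand.
# /etc/auto.master:
#   /tools  /etc/auto.tools
# /etc/auto.tools:
#   supertool-1.1  -ro,soft  nas:/export/tools/supertool-1.1
```

The big caveat is RPM scriptlets: packages that write outside their prefix at install time (init scripts, /etc entries, ldconfig runs) won't survive this kind of relocation cleanly, so each tool would need a quick audit first.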
Alfred
On Thu, 2006-06-01 at 22:30, Alfred von Campe wrote:
> Now I understand the benefits of RPMs, but when it comes to supporting tool versions, dealing with tar-based installations and using automounts really simplifies things. I am thinking of installing the RPM-based tools on one system and then copying the entire install tree to a NAS and automounting it. I know this is not ideal, but I can't think of a better solution. Any thoughts from the members of this list? Have you tried to solve similar problems in the past? If so, how did you do it?
One group of developers here tries to check everything that might be version-related into CVS, with everything tagged at release points. That means anyone can start with a fairly bare machine, and a build script can check out all the tool and source versions needed to re-create anything they have ever done. A side effect of this is that the developers never have to build the deployed executables - they just give the tags or a build script to the operators, who build and install the final versions, ensuring that the build can be repeated.

A different group always uses the same build machine and has had an assortment of problems, like forgotten tweaks being lost during upgrades or when restoring after disk failures. Due to the nature of their product they rarely have to work with old revisions, and that's probably a good thing.

If I were setting up another incarnation of the 'dedicated build machine' concept, I'd probably do it as a virtual machine under VMware and archive copies of the machine before any changes to the tool set, so it would be possible to revive exact copies of old versions.
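The tag-everything workflow in the first paragraph looks roughly like this in CVS terms (the repository modules and tag name below are made up):

```shell
# At a release point, tag the tool tree and the sources together
# so the whole environment is pinned as one unit:
cvs rtag REL_1_1 tools sources

# Years later, a build script re-creates that exact environment
# on a bare machine with a single tagged checkout:
cvs checkout -r REL_1_1 tools sources
```

The point is that the tag, not any particular machine, becomes the record of what was shipped.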