We have a directory full of installation and configuration scripts that are updated on a fairly regular basis. I would like to implement some sort of version control for these files. I have used SVN and CVS in the past, but I thought I'd ask if anyone can recommend a simple, easy-to-use tool that would be better than cvs or subversion for this fairly simple setup.
Sean Carolan wrote:
RCS ? It doesn't get much simpler. Otherwise, I'd use SVN.
On Thu, Mar 13, 2008 at 10:39 AM, Sean Carolan scarolan@gmail.com wrote:
I don't really think you can get much easier than CVS if you need centralized management over a network. If it never needs to leave the machine, there is RCS. If those aren't simple enough, I don't think any of the others are going to help.
Thanks for the pointers, it looks like we will go with CVS.
On Thu, Mar 13, 2008 at 6:38 PM, Sean Carolan scarolan@gmail.com wrote:
I'd recommend you reconsider SVN. It's as simple as CVS in terms of command-line ease of use, but it also adds important things:
1. Atomic commits: when checking in multiple file changes, either all of them or none of them will go in.
2. Directory operations: moving files and directories around is as simple as "svn mv source destination".
3. Branches are a breeze, e.g. "svn mkdir branches/project-a; svn cp trunk/file branches/project-a".
I don't see any reason for anyone to get themselves into the trap that is CVS in this day and age.
(BTW, if you started with CVS you should still be able to move over to SVN; there are programs to convert the repository.)
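A minimal session exercising those three points might look like the following (this is a sketch, assuming Subversion is installed; the repository path and file names are made up for illustration):

```shell
# Create a repository and a conventional trunk/branches layout (paths are hypothetical)
svnadmin create /srv/svn/scripts
svn mkdir -m "Initial layout" \
    file:///srv/svn/scripts/trunk file:///srv/svn/scripts/branches

# Get a working copy and make an atomic commit
svn checkout file:///srv/svn/scripts/trunk wc
cd wc
echo '#!/bin/sh' > install.sh
svn add install.sh
svn commit -m "Add install script"   # all-or-nothing, even across many files

# Renames are first-class operations; history follows the file
svn mv install.sh setup.sh
svn commit -m "Rename install script"

# A branch is just a cheap server-side copy
svn copy -m "Branch for project A" \
    file:///srv/svn/scripts/trunk file:///srv/svn/scripts/branches/project-a
```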
Cheers,
--Amos
Sean Carolan a écrit :
I'm a writer, and I keep all my .tex source files for LaTeX on an SVN server. It takes no more than 5 minutes to learn, plus maybe 15 minutes to learn how to set up your own SVN server.
Cheers,
Niki
I have run into a snag with my CVS installation:
[scarolan@neinei:~]$ cvs co -P installfiles
cvs checkout: Updating installfiles
cvs [checkout aborted]: out of memory; can not allocate 1022462837 bytes
Unfortunately we have a couple of large binary .tgz files in the repository. I was able to check them in, but as you can see I can't check them out because of memory limitations. I have even added 2 more GB of swap space, but it still errors out; I noticed while watching it that it doesn't seem to use all the swap space. Any pointers?
From what I recall, CVS attempts to create the checked-out file under /tmp, so that might be where you are lacking space.
MAL
From: Sean Carolan, Sent: March 13, 2008 15:50, To: CentOS mailing list, Subject: Re: [CentOS] Good version control package?
Sean Carolan wrote:
Try upping your ulimit. What does "ulimit -a" give?
[scarolan@neinei:~]$ ulimit -a
core file size        (blocks, -c) 0
data seg size         (kbytes, -d) unlimited
file size             (blocks, -f) unlimited
max locked memory     (kbytes, -l) 4
max memory size       (kbytes, -m) unlimited
open files                    (-n) 1024
pipe size          (512 bytes, -p) 8
stack size            (kbytes, -s) 10240
cpu time             (seconds, -t) unlimited
max user processes            (-u) 7168
virtual memory        (kbytes, -v) unlimited
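For reference, soft limits like these can be adjusted for the current shell before retrying the failing command (a sketch; raising a hard limit, rather than the soft one, generally requires root or an entry in /etc/security/limits.conf):

```shell
# Raise the soft data-segment limit up to the hard limit for this shell
hard=$(ulimit -H -d)
ulimit -S -d "$hard"
ulimit -S -d        # confirm the new soft limit took effect

# Then retry the failing checkout in the same shell:
# cvs co -P installfiles
```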
On Thu, Mar 13, 2008 at 1:49 PM, Sean Carolan scarolan@gmail.com wrote:
Checking binary files into CVS, or into any revision control system, is usually a broken thing. You want to either check in the contents of the tarball separately (if they are going to change), or just have CVS copy it into the archive verbatim by updating CVSROOT/cvswrappers:
*.tar -k 'b' -m 'COPY'
*.tbz -k 'b' -m 'COPY'
*.tgz -k 'b' -m 'COPY'
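As an aside, individual files can also be marked binary without touching cvswrappers, using the -kb flag (a sketch; the file names are illustrative):

```shell
# Add a new file in binary mode: no keyword expansion, no line-ending
# munging, and CVS won't try to diff or merge it
cvs add -kb bigfile.tgz
cvs commit -m "Add vendor tarball as binary"

# For a file that was already checked in as text, switch it to binary
cvs admin -kb bigfile.tgz
cvs update -A bigfile.tgz
```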
Check out: http://www.ibm.com/developerworks/java/library/j-svnbins.html
It has tips on tuning subversion for binary deltas.
-Ross
This comes back to the point of my first post: I'm looking for an *easy*-to-manage system to keep track of one directory of files that are updated once in a while. We're not working on a huge code base with multiple branches, etc. I suppose we could check in the files inside the .tar.gz separately, but I was hoping to avoid that since the contents of this binary are maintained by a different department. I'd really rather keep it intact as it is.
Will SVN be better equipped to cope with large binaries? I don't understand why CVS chokes on a 1 GB file when all it has to do is move it from one directory to another. I even gave this machine 3 GB of swap, so it had 5 GB of total memory space available, but it still dies when doing a cvs checkout.
On Thu, Mar 13, 2008 at 3:03 PM, Sean Carolan scarolan@gmail.com wrote:
Because these tools are meant to deal with source code files and with diffs of such files. You are cramming 1 GB of compressed bits at it, and it's trying to make sure it can give you a diff of it later on. I don't have any idea why you would want to store it in a CVS-type tool if it's not going to change that often; a simple MD5 sum of the tarball, checked against later, would probably do.
Remember, we have no clue what you want to do with this or why; a couple of sentences does not give us enough background on the problem you are trying to solve. You never mentioned 1 GB tarballs or other things; you mentioned config and script files that update quite frequently, that's all.
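The checksum idea above, sketched with standard coreutils (the tarball name is hypothetical):

```shell
# Record a checksum whenever the tarball is legitimately updated...
md5sum bigfile.tgz > bigfile.tgz.md5

# ...and verify it later; md5sum -c exits non-zero if the file changed
md5sum -c bigfile.tgz.md5
```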
Thank you for the reply. I am going to try checking this file in as a binary using Stephen's cvswrapper suggestion and see if this makes a difference.
Sean Carolan wrote:
If the contents are text, one of the real values of a version control system is that you can do a diff between any two versions to see what changed. And ideally, you would push the access back to the people maintaining it so they could take advantage of the tools themselves.
Will SVN be better equipped to cope with large binaries?
SVN handles binaries moderately well and tries to store just the diffs, even though they aren't usable for human viewing. The tradeoff is that SVN keeps two complete copies in your checked-out working directory, and it wants to work on directories, not individual files.
I don't understand why CVS chokes on a 1GB file when all it has to do is move it from one directory to another. I even gave this machine 3Gb of swap so it had 5Gb of total memory space available but it still dies when doing a cvs checkout.
I don't think that should be a problem, but on the other hand I also don't think CVS is going to do anything useful with a compressed tarball. With either CVS or SVN, I'd recommend installing ViewVC as a companion program. It will give you a nice web-browser view of the repository, with the ability to view history and diffs and grab copies of files.
Sean Carolan wrote:
Subversion has better handling for binaries. I'm not sure it will solve the problem, but it was one of the factors (among others) that caused us to switch from CVS to SVN.
If you do go the SVN route, use the "fsfs" storage backend; it's much more robust and recovers better from corruption than the other backends available in SVN (at least in my experience).
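The backend is chosen at repository-creation time; concretely, something like this (a sketch; the repository path is just an example):

```shell
# Create a repository explicitly backed by FSFS
svnadmin create --fs-type fsfs /srv/svn/repo

# Confirm which backend an existing repository uses
cat /srv/svn/repo/db/fs-type
```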
Good luck :)
"just copy it into the archive by updating CVSROOT/cvswrappers"
This worked great. Thank you, Stephen. The enormous .tar.gz is now easily swallowed by the CVS snake.