Trying to rsync a rather large file from a Windows server to a CentOS server; everything but this one file is working fine.
As it's a 20 GB file, I am trying to send only the diff with -c, and I suspect the low bandwidth presents an issue. I also stage this file locally on another CentOS server, so I could calculate the diff there, create a patch, send that, and compare checksums, etc...
A quick look at bsdiff and bspatch suggests their memory requirements on my 20 GB file make that solution unacceptable.
Anyone know a better solution to accomplish this?
Thanks! jlc
On 1/20/2010 12:01 PM, Joseph L. Casale wrote:
> Trying to rsync a rather large file from a windows server to a centos server and all but this is working fine. [snip]
> Anyone know a better solution to accomplish this?
Is your windows rsync version fairly current? I think older cygwin versions had a file size limit.
On Wed, Jan 20, 2010 at 1:01 PM, Joseph L. Casale jcasale@activenetwerx.com wrote:
> Trying to rsync a rather large file from a windows server to a centos server and all but this is working fine. [snip]
> Anyone know a better solution to accomplish this?
> Thanks! jlc
I don't understand why the diff shenanigans. Rsync has that built-in, so you shouldn't need to be doing that as a separate step.
If it is a file size limit, you could try split(1) on the file, then rsync the chunks. You might also try cygwin 1.7, which has dramatically improved support for modern Windows versions.
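A minimal sketch of that split-then-rsync idea (filenames, chunk size, and the rsync destination are made up, and the dd line merely stands in for the real 20 GB file):

```shell
# Create a small stand-in for the big file, then split it into chunks.
# For the real file you would use something like:
#   split -b 500m bigfile.bin bigfile.part-
dd if=/dev/urandom of=bigfile.bin bs=1024 count=64 2>/dev/null
split -b 16384 bigfile.bin bigfile.part-

# Transfer the chunks; a failed chunk is cheap to retry, e.g.:
#   rsync -avP bigfile.part-* rsync://centosbox/backup/

# On the receiving side, reassemble (the glob sorts the suffixes
# in order) and compare against the original:
cat bigfile.part-* > reassembled.bin
cmp bigfile.bin reassembled.bin && echo "files match"
```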
> I don't understand why the diff shenanigans. Rsync has that built-in, so you shouldn't need to be doing that as a separate step.
I am using cwRsync (3.0.6) and it's failing no matter what switches I try.
> If it is a file size limit, you could try to split(1) the file, then rsync the chunks. You might also try cygwin 1.7, which has improved the support for modern Windows OS dramatically.
Looking into cygwin now; I don't think it will make any difference, as cwRsync is fairly recent.
Hrm, splitting first; I'll try that.
On 1/20/2010 2:05 PM, Joseph L. Casale wrote:
> I am using cwrsync (3.0.6) and its failing no matter what switches I try. [snip]
> Looking into cygwin now, I didn't think it will make any difference, cwrsync is fairly recent.
People have reported different results with different builds on the BackupPC list, but I don't remember the versions. How is it failing? There is still a maximum path-length limit. Also, how are you running it? rsync under cygwin sshd has had problems that might also be fixed in the latest versions, although issuing the command from the Windows side, or running rsync in daemon mode, has always worked.
Don't want to sound like a spoilsport, but you could scp it over and already be well on your way.
> People have reported different results with different builds on the backuppc list but I don't remember the versions. How is this failing? [snip]
Yeah, I caught some threads in their forum about it. It fails with exit code 30, a timeout. The CentOS box (the receiving side) is running rsync in daemon mode.
I'll try the --blocking-io switch tonight.
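Since exit code 30 is rsync's I/O timeout and the CentOS side runs in daemon mode, the daemon's own idle timeout is worth checking. A minimal rsyncd.conf sketch (the module name and path are made up):

```
# /etc/rsyncd.conf (sketch)
pid file = /var/run/rsyncd.pid

[backup]
    path = /srv/backup
    read only = false
    # rsync exit code 30 is an I/O timeout; timeout = 0 disables
    # the daemon-side idle timeout entirely
    timeout = 0
```

The client-side --timeout option sets the same kind of limit from the sending end; whichever side has the shorter timeout wins.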
On 1/20/2010 3:06 PM, Joseph L. Casale wrote:
> Yeah, I caught some threads in their forum wrt it. It gives me an exit code 30, timeout. The centos (receiving side) is running rsync in daemon mode.
> I'll try the --blocking-io switch tonight.
Does it pick up where it stopped if you use the -P option? Also, -z might help if you have low bandwidth.
On Wed, Jan 20, 2010 at 11:13 AM, Brian Mathis brian.mathis@gmail.com wrote:
On Wed, Jan 20, 2010 at 1:01 PM, Joseph L. Casale jcasale@activenetwerx.com wrote:
> Trying to rsync a rather large file from a windows server to a centos server and all but this is working fine. [snip]
> Anyone know a better solution to accomplish this?
> Thanks! jlc
> I don't understand why the diff shenanigans. Rsync has that built-in, so you shouldn't need to be doing that as a separate step.
> If it is a file size limit, you could try to split(1) the file, then rsync the chunks. You might also try cygwin 1.7, which has improved the support for modern Windows OS dramatically.
Try adding
--blocking-io
to rsync flags.
It's the default on Linux if you're using rsh or remsh.
Also, "low bandwidth" is undefined. In any case, try limiting the bandwidth with
--bwlimit=KBPS
Note, I have no idea if these flags work in the Windows version of rsync.
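Pulling the suggestions in this thread together, a sender-side invocation might look like the following sketch (host, module, and path are placeholders, and the numbers are guesses to tune):

```shell
# Keep partial transfers so the copy can resume (-P = --partial
# --progress), compress (-z), force blocking I/O, cap the rate at
# 256 KBytes/sec, and widen the idle timeout to 10 minutes to dodge
# the exit-code-30 timeout.
RSYNC_OPTS="-P -z --blocking-io --bwlimit=256 --timeout=600"
echo rsync $RSYNC_OPTS /cygdrive/d/bigfile.bin rsync://centosbox/backup/
```

The echo just prints the assembled command for inspection; drop it to actually run the transfer.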
On Wed, Jan 20, 2010 at 1:01 PM, Joseph L. Casale jcasale@activenetwerx.com wrote:
> Trying to rsync a rather large file from a windows server to a centos server and all but this is working fine. [snip]
> Anyone know a better solution to accomplish this?
I had to do a similar thing across a satellite link. I ended up splitting the file into 500 MB segments and rsyncing those. When that completed, I sshed into the remote host, rejoined the segments, and verified the checksum.
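That rejoin-and-verify step can be sketched like this (a tiny stand-in file and hypothetical names; on the real transfer the .md5 file travels alongside the chunks):

```shell
# Sender: record a checksum of the whole file, then split it.
printf 'stand-in for the 20 GB payload' > payload.bin
md5sum payload.bin > payload.md5
split -b 8 payload.bin payload.part-

# ...rsync payload.part-* and payload.md5 to the remote host...

# Receiver: rejoin under the original name and verify.
rm payload.bin                      # pretend we're now on the remote side
cat payload.part-* > payload.bin
md5sum -c payload.md5               # prints "payload.bin: OK" on success
```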
On Wed, Jan 20, 2010 at 1:01 PM, Joseph L. Casale <jcasale@activenetwerx.com> wrote:
> <snip> As it's a 20 gig file I am trying to send the diff of with a -c, I suspect over <snip>
This might be too basic a question, but what type of file system are you using on the CentOS system? For instance, if it is ext2/ext3, there are file-size limits depending on things such as the block size: http://en.wikipedia.org/wiki/Ext2#File_system_limits.
Brett