[CentOS] Best practices for copying lots of files machine-to-machine

Wed May 17 17:45:24 UTC 2017
Andrew Holway <andrew.holway at gmail.com>

Rsync seems to be the obvious answer here.
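A typical invocation for this kind of job might look like the sketch below (hostnames and paths are illustrative, not from the thread). The local demo shows the key point: -a alone does not preserve hard links, so -H is needed.

```shell
# -a  archive mode: recursive copy preserving symlinks, permissions,
#     times, owner, group, and device files
# -H  also preserve hard links (NOT implied by -a)
# --info=progress2  overall transfer progress (rsync >= 3.1)
#
# Remote form (hostname/path illustrative):
#   rsync -aH --info=progress2 /data/ otherbox:/data/
#
# Local demo showing that a hard link survives the copy:
mkdir -p /tmp/rsync-demo/src
echo "payload" > /tmp/rsync-demo/src/a
ln -f /tmp/rsync-demo/src/a /tmp/rsync-demo/src/b   # second name, same inode
rsync -aH /tmp/rsync-demo/src/ /tmp/rsync-demo/dst/
ls -li /tmp/rsync-demo/dst   # a and b should show the same inode number
```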

On 17 May 2017 at 18:16, Robert Moskowitz <rgm at htt-consult.com> wrote:

>
>
> On 05/17/2017 12:03 PM, ken wrote:
>
>> An entire filesystem (~180 GB) needs to be copied from one local Linux
>> machine to another.  Since both systems are on the same local subnet,
>> there's no need for encryption.
>>
>> I've done this sort of thing before a few times in the past in different
>> ways, but wanted to get input from others on what's worked best for them.
>>
>> One consideration is that the source filesystem contains quite a few
>> hardlinks and symlinks and of course I want to preserve these, and preserve
>> all timestamps and ownerships and permissions as well.   Maintaining the
>> integrity of this metadata and the integrity of the files themselves is of
>> course the top priority.
>>
>> Speed is also a consideration, but having done this before, I find it
>> even more important to have a running progress report or log so I can see
>> how the session is proceeding and approximately how much longer it will be
>> until finished... and also to see if something's hung up.
>>
>> One other consideration:  There isn't much disk space left on the source
>> machine, so creating a tar file, even compressed, isn't an option.
>>
>> What relevant methods have you been impressed by?
>>
>
> I use rsync for such work.  It is good at preserving hard links, symlinks,
> and timestamps.  It can give you a running progress report as well.
>
> One thing I have learned is that crud happens and I lose my local session
> for some stupid reason or another, so I often run rsync in a screen session
> that I can easily reconnect to.
>
> _______________________________________________
> CentOS mailing list
> CentOS at centos.org
> https://lists.centos.org/mailman/listinfo/centos
>