Thanks to all of you for your help, and especially to Tim Shubitz, who faced the same problem; his solution worked perfectly for me.
However, now that I have properly created a GPT partition of 2.7 TB, which filesystem is best for it? It will be used to store backups of various other Linux systems, so the files will mostly be small, though some systems do host big movie files, and SVN dumps and DB dumps can get fairly large. I am going to be using rsnapshot to do the backups, so perhaps I should be careful about the number of inodes I create and try to maximise it?
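In case it helps frame the inode question: with ext3 the inode count is fixed at mkfs time via the bytes-per-inode ratio, and you can check the result afterwards with tune2fs. A minimal sketch, using a small loopback image so nothing real gets formatted (the 64M size and 4096 bytes-per-inode ratio are just example values):

```shell
# Create a small throwaway image file to experiment on (not a real disk).
truncate -s 64M test.img

# Format it as ext3 with one inode per 4 KiB of space (-i 4096).
# A lower -i value means more inodes for the same capacity.
mke2fs -q -F -t ext3 -i 4096 test.img

# Inspect how many inodes were actually created.
tune2fs -l test.img | grep 'Inode count'
```

XFS, by contrast, allocates inodes dynamically, so this tuning step would not apply there. Note also that rsnapshot hard-links unchanged files between snapshots, and hard links share a single inode, which works in your favour.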
I am thinking of using XFS, but I'm not sure. I seem to remember hearing that one should avoid EXT3 on such huge filesystems, but I can't find a reference or proper justification for it. JFS is another option, but some mailing-list threads report that it has lost data for them, so I'm a bit confused as to what is best in my scenario.
As for XFS, I have read that a UPS is necessary; this is not a problem, since these machines are already connected to a UPS (and that UPS has a backup as well).
Any help appreciated, thanks,
Khusro