On Wed, Aug 12, 2009 at 7:29 PM, Andreas Rogge <a.rogge at solvention.de> wrote:
> I just hacked it together using wget and xslt. It is still quite rough,
> but works.
> Downloading can be done with wget -r -nH -np http://whatever/base/url/
> However, there are css files missing then (wget doesn't follow @import
> statements).
> After that you can just apply the attached XSLT to remove the header and
> add the disclaimer.
>
> So far I only tested with the Deployment Guide. I wanted to know whether
> that kind of solution is ok for you. If it is, I can probably create a
> rather small script for mirroring and maintain it.

Hi again,

After being busy with some other stuff, I had some time to look into this
some more.

First, there is the issue that wget does not follow URLs mentioned in CSS.
Some good news here: the current development version of wget (1.12) has
received support for this. I've tested it and it indeed works. I can
probably create an RPM for it if people are interested.

So, once wget has run, I see two more things that need to be done.

The first is to add the disclaimer to each page. Andreas, I've tried using
your XSLT (via xsltproc) but it does not seem to work here. It's probably
my total ignorance about XML and XSLT and how to use them, so could you
show exactly how to apply it?

Secondly, all the Red Hat logos need to be removed or replaced. Looking at
the Deployment Guide, this looks to be a relatively short list, so it
should not be that hard for the script to keep a list of files to replace
after the mirroring.

So, if people have some time in the coming days to have a go at this, let
us know. If not, I will probably have a go at it soonish.

Thanks again for the help,
Tim

--
Tim Verhoeven - tim.verhoeven.be at gmail.com - 0479 / 88 11 83
Hoping the problem magically goes away by ignoring it is the "microsoft
approach to programming" and should never be allowed. (Linus Torvalds)
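For what it's worth, the three steps above (mirror, transform, swap logos) could be sketched roughly as below. The stylesheet name (remove-header.xsl) and the logos/ directory of replacement images are placeholders for whatever the real stylesheet and substitute files end up being:

```shell
#!/bin/sh
# Sketch of the mirroring script discussed in this thread.
# remove-header.xsl and the logos/ directory are assumed names.

# Step 1: mirror the guide. wget >= 1.12 also follows @import in CSS,
# so the stylesheets come along with the pages.
mirror_guide() {
    wget -r -nH -np "$1"
}

# Step 2: run every mirrored page through the XSLT that strips the
# header and appends the disclaimer.
apply_xslt() {
    stylesheet=$1
    find . -name '*.html' | while read -r f; do
        xsltproc -o "$f.tmp" "$stylesheet" "$f" && mv "$f.tmp" "$f"
    done
}

# Step 3: overwrite every mirrored copy of a logo with the local
# replacement image of the same file name.
replace_logos() {
    for img in "$1"/*; do
        find . -name "$(basename "$img")" -exec cp "$img" {} \;
    done
}

# Example invocation (not run here):
# mirror_guide http://whatever/base/url/
# apply_xslt remove-header.xsl
# replace_logos ./logos
```

This keeps the list of logo files implicit (anything present in logos/ gets swapped in by name), which matches the "relatively short list of files to replace" idea without hard-coding paths.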