On Tue, 2007-08-28 at 23:30 -0400, Stephen Harris wrote:
On Tue, Aug 28, 2007 at 05:04:33PM -0500, Les Mikesell wrote:
<snip>
I'm probably fighting a losing battle; I was shell scripting 17 years ago when every fork/exec was expensive. I cry when I see people writing grep | awk type combinations (and don't start me on cat | grep).
AMEN BROTHER! It's hard to shake the efficiency concerns that started ... oh well, why not ... in 1978 on PWB V7. I astounded the sysadmin with my multi-thousand line shell scripts. Having been programming, not administering, for some years prior to that, I saw nothing wrong with well structured code in large quantities when well commented.
The "cat | <your-util-of-choice>" construct that shows up so frequently is one of my pet peeves too. People need to be familiar with the I/O redirection provided as a *standard* feature of almost all *IX utilities.
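To make the point concrete, here is a small sketch of the same job done three ways, from the most extra processes to the fewest. The file name and pattern are invented for the example; the idea is only that grep takes file arguments directly, and awk can do the matching itself.

```shell
# Sample data, made up for this illustration.
printf 'alpha 1\nbeta 2\nalpha 3\n' > /tmp/demo.txt

# Useless use of cat: an extra process just to feed the pipe.
cat /tmp/demo.txt | grep alpha | awk '{print $2}'

# grep reads the file itself: one process fewer.
grep alpha /tmp/demo.txt | awk '{print $2}'

# awk alone does both the match and the field selection: one process.
awk '/alpha/ {print $2}' /tmp/demo.txt

rm -f /tmp/demo.txt
```

All three print the same two lines; on a one-off command the difference is invisible, but inside a loop that runs thousands of times the forks add up.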
Old geek statement: If you think perl is the answer to a simple filter question, think twice. You might be right, but it's likely that smaller, faster tools already exist. And I say this as someone who has written 1000-line shell scripts and even bigger perl scripts; perl is good for complicated tasks, but rarely required for simple stuff. Don't wield your hammer just because it's all you know.
In this case, everyone who responded with a perl solution needs their hammer taken away.
As a minor disagreement with the above: I agree, but the sad fact is that it is usually faster to use what you already know than to track down one or more appropriate utilities and learn how to use them. In a production environment, that time may be very dear.
I offer for your consideration that among the many good answers to the OP, csplit was never mentioned, although it is certainly one of the useful options.
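Since the OP's exact input isn't quoted here, a hedged sketch of what csplit does: it cuts one file into pieces at lines matching a pattern, writing them out as xx00, xx01, and so on. The "SECTION" marker below is purely hypothetical.

```shell
# Work in a scratch directory so the xx* output files don't clobber anything.
cd "$(mktemp -d)"

# Hypothetical input: records separated by lines beginning with "SECTION".
printf 'SECTION a\nline1\nSECTION b\nline2\n' > input.txt

# Split at every line matching /^SECTION/; '{*}' repeats until input runs out.
# -s suppresses the per-file byte counts csplit normally prints.
csplit -s input.txt '/^SECTION/' '{*}'

ls xx*   # xx00 (empty lead-in), xx01 and xx02 (one section each)
```

xx00 holds whatever preceded the first match (here, nothing), and each subsequent xxNN holds one section including its marker line. No perl, no loop, one process.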
-- Bill