Les Mikesell wrote:
> On 9/17/2010 10:12 AM, m.roth at 5-cent.us wrote:
>> Les Mikesell wrote:
>>> On 9/17/2010 8:24 AM, m.roth at 5-cent.us wrote:
>>>>
>>>>> Proper scripting abilities are perhaps beyond reach for a short
>>>>> course, but you could at least show off some one-liners or those
>>>>> short, stunningly useful examples to help them get the idea that they
>>>>> definitely should get their feet wet on it sooner or later.
>>>>
>>>> awk, awk! Perl's a day, minimum, by itself, but awk you can do in an
>>>> hour or two, and have immediate results.
>>>
>>> But awk is a dead end that can't do a lot of things by itself. And
>>
>> So, what's the longest awk script you've ever written, Mike? It works
>> wonderfully well for what it was intended - and mostly, I use it for
>> reports or data conversion.
>
> Don't think I've ever written one from scratch, at least not since perl
> was around, because it was too painful to debug. I agree that it works
> fine when you copy someone else's already-debugged code. I'm not
> recommending never using awk, I just don't see the point of learning to
> write it.
>
>>> learning how to embed awk into other scripts is even more syntactically
>>> obscure than just using perl in the first place. Besides, perl's '-c'
>>> check and debug facilities make it much more usable to beginners than
>>> awk's propensity to find errors mid-run (and worse,
>>> mid-some-other-script, because you had to embed it).
>>
>> Misuse of awk.
>>
>> mark "why, yes, I *have* written 100- and 200-line awk scripts
>> to do data conversion and data validation"
>
> But why, when very likely better versions of whatever you were doing
> have already been written and debugged as CPAN perl modules? Would you
> do something like time parsing or format conversion in awk, or extract
> a mime attachment from a mail message? Those sound simple but aren't,
> and in perl you only have to write a couple of lines yourself to do them.

Ah, no.
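But first, since one-liners came up: this is the sort of immediately useful awk that sells it in an hour or two. (A minimal sketch - the file name, columns, and data below are invented purely for the example.)

```shell
# Invented sample data: item<TAB>amount (illustration only)
printf 'widgets\t3.50\ngadgets\t2.25\nwidgets\t1.00\n' > sales.txt

# One-liner report: total the amount column
awk -F'\t' '{ sum += $2 } END { printf "total %.2f\n", sum }' sales.txt
# prints: total 6.75
```

One pattern-action rule plus an END block, and you have a report - that's the "immediate results" I meant.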
I wrote 30 scripts around '91-'92 to take datafiles from 30 sources and reformat them, to feed to the C program I'd written with embedded SQL, in place of the d/b's sqlloader (*bleah*).

Then, 11 years ago, I wrote a validation program for data that was being loaded by another program I didn't want to change; the data had been exported from ArcInfo and had to go into our Oracle d/b. Really simple to do in awk - there was just so much of it - and no, perl would have offered no improved/shorter way to do it.

And yes, I do know perl: in '04, for example, I rewrote a call routing and billing system from perl (written by my then-manager, who'd never studied programming - can you say spaghetti?) into reasonable perl. Actually, I just wrote a scraper in perl, using HTML::Parser.

Anyway, the point of that was to demonstrate that I know both, and awk is better, IMO, for some jobs.

mark
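P.S. To give the flavor of that kind of field-by-field validation (the actual ArcInfo export format is long gone - the file, fields, and checks below are invented for illustration):

```shell
# Invented export format: id,lat,lon - illustration only
printf '1,45.2,-93.1\n2,91.0,-93.1\n3,44.9,oops\n' > export.csv

# One pattern-action rule per check; flag any record that fails
awk -F',' '
    $2 !~ /^-?[0-9.]+$/ || $2 < -90 || $2 > 90 { print "line " NR ": bad lat " $2 }
    $3 !~ /^-?[0-9]+(\.[0-9]+)?$/              { print "line " NR ": bad lon " $3 }
' export.csv
# prints: line 2: bad lat 91.0
#         line 3: bad lon oops
```

A real validator is just more rules like these, one per field - which is why the scripts ran to 100-200 lines: lots of checks, not lots of complexity.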