On Fri, Sep 17, 2010 at 11:47, <m.roth@5-cent.us> wrote:
Ah, no. Around '91-'92 I wrote 30 scripts to take data files from 30
sources and reformat them to feed the C program I'd written with
embedded SQL, in place of the d/b's SQL*Loader (*bleah*). Then, 11 years
ago, I wrote a validation program for data that was being loaded by
another program I didn't want to change; the data had been exported
from ArcInfo and had to go into our Oracle d/b.
Really simple to do in awk - there was just so much of it - and no, Perl
would have offered no improved or shorter way to do it. And yes, I do
know Perl: in '04, for example, I rewrote a call-routing and billing
system from Perl (written by my then-manager, who'd never studied
programming - can you say spaghetti?) into reasonable Perl. Actually, I
just wrote a scraper in Perl, using HTML::Parser. Anyway, the point of
all that was to demonstrate that I know both, and awk is better, IMO,
for some jobs.
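Just to give a flavor of the kind of awk I mean, here's a throwaway
sketch - not one of the real scripts, and the delimiter and field layout
are invented since the actual files aren't shown: read a pipe-delimited
export, trim the fields, and write them back out in the order the loader
wants.

# reformat.awk - minimal sketch; field layout is made up for illustration
BEGIN { FS = "|"; OFS = "\t" }    # pipe-delimited in, tab-delimited out
NR == 1 || NF == 0 { next }       # skip the header record and blank lines
{
    # strip leading/trailing blanks from every field
    for (i = 1; i <= NF; i++)
        gsub(/^[ \t]+|[ \t]+$/, "", $i)
    # reorder into whatever the target table expects: id, name, value
    print $3, $1, $2
}

That's the whole program; for column-oriented text like those exports,
it's hard to get much shorter in anything else.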
mark
It is the beauty of Linux...