[CentOS] looking for cool, post-install things to do on a centos 5.5 system

m.roth at 5-cent.us
Fri Sep 17 17:45:39 UTC 2010


Les Mikesell wrote:
> On 9/17/2010 10:47 AM, m.roth at 5-cent.us wrote:
>>
>> Ah, no. I wrote 30 scripts around '91-'92 to take datafiles from 30
>> sources and reformat them, to feed to the C program I'd written with
>> embedded sql, in place of the d/b's sqlloader (*bleah*). Then, 11 years
>> ago, I wrote a validation program for data that was being loaded by
>> another program that I didn't want to change; the data had been exported
>> from ArcInfo, and had to go into our Oracle d/b.
>>
>> Really simple to do in awk - just so much of it, and no, perl would have
>> offered no improved/shorter way to do it,
>
> I don't get it.  Why wouldn't you just talk to the db directly with
> perl's dbi/dbd, replacing both the awk and C parts?  I do that all the
> time.  Or was that before dbi - or the dbd you needed?

Mike, you really aren't reading all of what I wrote. Perl itself wasn't
available in '91-'92. I'd already written the C program, and then the
hypothesis that our company could tell all the sources of the data what
format to put it in turned out to be less realistic than the typical TV
commercial.
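
For reference, the dbi/dbd route Les is talking about looks roughly like
this - a minimal sketch, with made-up table and column names and the stock
demo credentials, not anything from either project:

    #!/usr/bin/perl
    # Sketch only: read delimited records, reformat the fields in perl,
    # and insert straight into Oracle through DBI, with no separate awk
    # pass and no sqlloader. Table and column names are invented.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Oracle:geodb', 'scott', 'tiger',
                           { RaiseError => 1, AutoCommit => 0 });
    my $sth = $dbh->prepare(
        'INSERT INTO parcels (parcel_id, street, zone) VALUES (?, ?, ?)');

    while (my $line = <>) {
        chomp $line;
        my ($id, $street, $zone) = split /\|/, $line;  # the reformat step
        $sth->execute($id, uc($street), $zone);        # the load step
    }

    $dbh->commit;
    $dbh->disconnect;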

I don't know the state of dbd/dbi in '98 or '99, but I was *not* going to
touch the existing program that loaded the data; all I wanted was very
basic validation that reported what was wrong with each bad record and let
the rest be loaded.
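
In perl terms - the real thing was awk, and the field rules below are
invented for the example - that kind of filter looks something like:

    #!/usr/bin/perl
    # Sketch of a validate-and-pass-through filter: complain to stderr
    # about whatever is wrong with each bad record, and print the good
    # records to stdout for the existing loader. Field rules are made up.
    use strict;
    use warnings;

    while (my $line = <>) {
        chomp $line;
        my @f = split /,/, $line, -1;
        my @errs;
        push @errs, "expected 5 fields, got " . scalar(@f) if @f != 5;
        push @errs, "non-numeric parcel id '$f[0]'" if @f && $f[0] !~ /^\d+$/;
        push @errs, "empty street name" if @f > 1 && $f[1] eq '';

        if (@errs) {
            warn "record $.: " . join('; ', @errs) . "\n";  # what was wrong
        } else {
            print "$line\n";                                # let it be loaded
        }
    }

Run it as a filter in front of the loader (something like perl validate.pl
< export.dat > clean.dat, file names invented), with the complaints going
to stderr.
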
>
>>  and yes, I do know perl - in
>> '04, for example, I rewrote a call routing and billing system from perl
>> (written by my then-manager, who'd never studied programming, can you
>> say spaghetti?) into reasonable perl. Actually, I just wrote a scraper in
>> perl, using HTML::Parser.  Anyway, the point of that was to demonstrate
>> that I know both, and awk is better, IMO, for some jobs.
>
> That depends on how you define better.  I can see how it could save a
> microsecond of loading time on tiny jobs, but not how it can do anything
> functionally better.  Have you tried feeding one of your long scripts to
> a2p and timing some job with enough input to matter?  I'd expect perl to
> win anything where there is enough actual work to make up for the
> compile/tokenize pass.

Nope. The one company no longer exists as such; it was sold over 10 years
ago, and that project ended something like 15 years ago. The other - I've
no idea what they're doing these days with the City of Chicago's 911
system for geodata loading, but I'd be surprised if they weren't still
using my system, money being tight and the VAR I worked with being cheaper
than ever.

          mark



