On Mon, Apr 7, 2014 at 1:41 PM, Stephen John Smoogen <smooge@gmail.com> wrote:
>>> What Unix philosophy or religion?
>> Ken Thompson's.
> That is the one-liner elevator pitch. It gets to be much more nuanced when you get to the details and the complexity of where things go. Plus it was written when text was universal and the main thing that people dealt with. Today text is a deep-down thing that gets dealt with, but not what people actually interact with.
It's not about text. It is about elegant, reusable simplicity. How many system calls and options should there be to create a new process? How many programs should there be for that first process that is the parent of all others? Can't people who don't like those aspects of unix design find some other OS to butcher?
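For reference, the classic answer is two calls with essentially no options - fork() to create the process and one of the exec family to load a program into it - and one program, init (pid 1), as the ancestor of everything else. A rough sketch of that model on a POSIX system (/bin/echo is just an arbitrary program to run):

    /* Minimal sketch of the classic Unix process-creation model:
       one call to create a process (fork, no options), one to load
       a program (execv), and one to collect the result (waitpid). */
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        pid_t pid = fork();            /* duplicate the calling process */
        if (pid < 0) {
            perror("fork");
            return EXIT_FAILURE;
        }
        if (pid == 0) {
            /* child: replace ourselves with another program */
            char *argv[] = { "echo", "hello from the child", NULL };
            execv("/bin/echo", argv);
            perror("execv");           /* only reached if exec fails */
            _exit(127);
        }
        int status;                    /* parent: wait for the child */
        waitpid(pid, &status, 0);
        return WIFEXITED(status) ? WEXITSTATUS(status) : EXIT_FAILURE;
    }

That's the whole model, and it's the same one init uses to start everything else on the system.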
> Rule 3 of his philosophy:
> Design and build software, even operating systems, to be tried early, ideally within weeks. Don't hesitate to throw away the clumsy parts and rebuild them.
Yeah - where does that say "push this out to vast numbers of users, wait till they learn and like it, then change it in ways that will break everything they know"? Or anything like what Fedora does...?
> Rule 4 of the Unix philosophy:
> Use tools in preference to unskilled help to lighten a programming task, even if you have to detour to build the tools and expect to throw some of them out after you've finished using them.
Likewise - where does that say to publish the bad versions before throwing them out? Or to throw them out if they still work correctly?
> When in doubt, use brute force.
> These are the core issues that we all run into and that cause us to froth and foam about change. We had a lovely 30 years where the following factors buffered us from those rules:
No, most of that 30 years was about minor variations with patent/copyright protection to prevent reuse and attempts to lock customers in.
> - The number of programmers who were solving these problems was in the hundreds versus the millions it is today. Out of those hundreds you had camps, which meant the central architects could probably be counted on one hand. That meant that groupthink would allow for 'consensus' on how to solve problems and keep the number of different solutions to a minimum.
So with millions of programmers, wouldn't you expect better results if they build on each other's work instead of all of them starting over from scratch with no concern for backwards compatibility?
> - Due to the cost of leased lines and mailing tapes, it would take years to filter out the churn that the above rules generated. Thus your 2-4 Unix boxes could go for years without anything but software patches and might not see the next architecture change to the OS until the next hardware rollout. [Unless you were adventurous and could have the VAX down for a week as you worked on the phone with some tech on why cc didn't produce the kernel you wanted.]
I've never been in a situation like that. I don't think it is an accurate depiction of the time since Usenet existed.
> - Your site was pretty isolated and would reinforce groupthink via purchasing. You might be a VAX shop or a Sun shop or an HP shop, but in most cases the majority of the boxes would be one architecture and one vendor would supply the OS. You might have other systems, but they would be small fry compared to the main architecture, and normally you would use tooling to make the one-offs look like the main vendor via symlinks, etc.
I started with AT&T boxes. And by the mid-'90s, Linux was a fair approximation of SysVr4 - didn't need a lot of munging.
> - Buying systems was hard because they were extremely expensive: adjusting for inflation, we are talking millions of dollars of investment for 1-2 systems, while today we can have hundreds of boxes for that amount. You bought 1 box and made it do everything you needed.
That was long, long ago. Dell sold PCs with SysV installed at roughly high-end PC prices before Win95 came out.
> However, none of these factors exists anymore. Systems are cheaper, which means we can have tons of boxes dedicated to one task, which means you can write your tools for that task very well, but they won't work that well on other stuff.
But that's all the more reason for simple reusable designs. And maintaining backwards compatibility as the hardware churns.
> The internet and other changes mean that you are not isolated: you can see all the different ways to solve a problem, without an outside filter telling you that if you go with vendor/architecture X you will solve all your problems with X's way versus Y's way.
Nobody believes that after the first time. But I suppose there are a lot of people with no experience who think everything old is bad.
> The internet also makes it easier for the churn to be pushed out. Where this stuff might once have gone into a lab for years and only been pushed out when the vendor decided to ship a new hardware system, you can see it now.
This does have a certain value. Look at how bad the code was on the first Red Hat release that was likely to boot on most PCs (maybe 4.0 or so) and how many bugs have been found and fixed as a result of so many people trying to make it work. But if everything with bugs gets thrown away instead of fixed, what was the point?
> And finally, we now have much less groupthink than we had back then because the barrier to thinking differently is much lower. [Doesn't mean it isn't there, but it is a lot less.] Thus instead of 5-10 brute-force solutions to a problem you have hundreds or thousands.
Ummm, venture capital? People wanting to build something different enough to sell?
> It isn't nice or pretty, and there are days when I feel like a dinosaur wishing the damn meteor would hurry up and get here... but it also is just how things have changed, and I just need to figure out ways to filter stuff better to make it work for the environments I need to work in.
None of which really addresses the issue of elegant designs that scale down as well as up, or helps with version-tracking of packages across some number of systems.