
Re: ::scr open sores



I have various random thoughts on this subject, but since I've been trying,
unsuccessfully, to do some work for most of the day, and then had to go and
discover I've forgotten all the Spanish I ever learned, and then realised my
house is covered in scaffolding a day earlier than I expected it to be, I
have only just got round to replying. Given that, I'm not going to try to
follow up to each post in turn, but rather gather it all together here.

Something of a rant, actually. Overly sensitive people with an excessive
attachment to open source software should not read :)

1. Open source in general

Since it is not possible to sell OS software for money, and it is not yet
clear it is possible to make money with it at all, its use is limited to
small, business-plan-on-the-back-of-a-napkin startups, dispersed internet
communities, and academics. Small, poorly funded businesses generally produce
terrible software, and the OS ones do not seem to be an exception; anyway,
most of the firms trying to develop OS stuff have grown out of
communities of developers around "bazaar" projects. Academics do, of course,
sometimes produce excellent software, but they generally have little
interest in actually making it useful.

As for the much vaunted bazaar model ... it seems to work great in
duplicating what has already been done, at least when the developers
properly understand what they are trying to imitate. I reckon that applies
to implementing properly documented standards, too. It's not impossible for
all infrastructure to end up as open source, some day. It sucks terribly,
terribly badly for anything involving a vision of how things should be that
is not purely technical, though. Brooks talks about a property of software
systems called "conceptual integrity". Those OS projects that have tried to be
original noticeably lack it.

2. Linux

It seems to be a perfectly decent Unix kernel and utilities. I've been
running it since 1.2 for various house-keeping tasks, and in odd fits of
enthusiasm I've even used it as a desktop. My company - like many others -
uses Linux on all its small server systems, but we don't run big application
servers or databases on it. Most of the things holding Linux back have
nothing to do with Linux, but apply more generally to all Unices.

The only time I found it satisfactory as a desktop OS was when I was
developing Unix software. The tools for everything else - most especially
for writing (of which I do quite a bit) and surfing the web (of which I do
even more) - are well below Windows' standards. The worrying thing is that
they seem to have fallen behind, and failed to capitalise on their
strengths. NS3 was a reasonable web browser. NS4.7 is extremely poor by
current standards, and while Mozilla has improved, it's still ugly and clunky
as hell. And no, I can't fix that with skins. Skins don't fix basic
usability problems, they just make them look pretty. Similarly, some tools
that showed promise a few years ago - XML standards based word processors,
for instance - seem to have lost their way. I know I could fix some of these
problems by applying my 'leet coding skillz. However, I have other, more
interesting, things to do, and therefore I want to use an operating system,
not write one. If that means using MS, I do it, though I plan to experiment
with MacOS X. It looks sweet.

3. X

Which leads me to X. X is truly, appallingly, terribly bad. Its basic
interfaces are all wrong. The rendering model was broken when it was
invented, and as PostScript-like technology has found its way onto other
desktops, the fact that the inventors of X warped the PostScript model,
deliberately or accidentally, in several places has become more and more
obvious.

On top of that, X requires a context switch for every drawing operation to
be executed. When you compare this with, say, DirectX, in which applications
talk straight to the graphics card, or GGI, in which a very thin layer in
the kernel mediates, the cost is appalling. Network transparency is a worthy
goal, but it's basically wrong to make everything talk over a socket to
achieve this. Do we use local pipes to call procedures because we sometimes
want to call them over the LAN? No, we call procedures that write over the
LAN, but look as if they are local.
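
To make that concrete, here is a rough sketch in Python (my own
illustration, using the standard xmlrpc.client module; LocalCanvas,
make_canvas and draw_line are invented names, not anything X actually
provides). The caller gets the same interface whether the implementation is
in-process or on the other end of a socket:

    import xmlrpc.client

    class LocalCanvas:
        """In-process implementation: a plain method call, no socket."""
        def draw_line(self, x1, y1, x2, y2):
            print("line", x1, y1, x2, y2)

    def make_canvas(url=None):
        # Same interface either way: a network proxy when a URL is given,
        # an ordinary local object when it is not. The caller can't tell.
        if url:
            return xmlrpc.client.ServerProxy(url)   # talks over the LAN
        return LocalCanvas()                        # stays in-process

    canvas = make_canvas()          # or make_canvas("http://display:8000/")
    canvas.draw_line(0, 0, 100, 100)

That is the shape of the thing: the network is an implementation detail
behind an ordinary procedure call, not something every caller is forced to
speak.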

The mass of "extensions" added to X in attempts to fix these problems is
not transparent to the developer, and thus serves to make X programming
difficult, and causes X applications to degrade ungracefully when facilities
they would like are not there.

4. Gnome and KDE

Cargo cult GUI development. Having been on both the Gnome and KDE
development lists at different times, I can say they seem to proceed by
hacking together a bit of code to do The Next Kewl Bit, and actually creating
development libraries, component models, and GUI standards as something of
an afterthought. The fact that the purpose of a GUI is to produce usable
software seems to escape them completely. The secondary fact that if
developers are to be expected to create usable software, it needs to be made
easy for them is Beyond The Scope of This Project. Yes, Windows is not
terribly usable, but quite how anyone expects to beat it by imitating it
and actually managing to make some aspects - inconsistent naming, poor
terminology, tiny things you need to click on, failure to exploit infinite
depth, wasted screen real estate - *worse* is a mystery. Oh, they have
transparent terminal windows. That really makes up for it all, doesn't it?
See above about conceptual integrity.

It gets worse, though. Gnome and KDE apps are more monolithic, and less
component-based, than the worst bloatware to come out of Redmond, and
Mozilla is worse, and StarOffice ... let's just not talk about that. Things
take years to be rolled into the system libraries, so for ages, for example,
every Gnome app did printing differently. Now they all do it the same way,
but can never make it work, because I don't use supermegagruvynewllpr, or
something. At one point, every Unix app being written seemed to contain its
very own buggy HTML parser and renderer. The result is that if the vastness
of the X server and constant context switching weren't enough, you've now
got constant swapping and occasional crashes to add to your woes.

5. "Small tools" and text

The Unix "small tools" philosophy as usually applied is basically knackered.
OK, maybe that is a bit strong. It works OK for well known file formats that
are simple enough not to incur a parsing overhead. For anything else, it's
knackered. For a component-based system to really work, standard interfaces,
either file formats or APIs, are needed. Of course this never happens in
Unixland, because of the screaming hordes yelling "just use text!" (as if
that were a file format), or "who needs interfaces, I can knock that up in 3
lines of Perl!".
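
To show what I mean, a rough Python sketch (entirely my own illustration;
the record format here is invented): the "just use text" approach scrapes
fields out of whitespace-separated lines and quietly falls apart the moment
a field contains a space, whereas a declared format plus a record type gives
every consumer the same explicit structure:

    import csv
    import io
    from dataclasses import dataclass

    # Ad-hoc "just use text": split on whitespace and hope.
    raw = "alice 42 London\nbob 7 New York\n"
    for line in raw.splitlines():
        fields = line.split()
        print(fields[0], fields[1], fields[2])  # bob's city comes out "New"

    # A declared interface: the format (CSV with named columns) and the
    # record type are explicit, so every consumer parses it the same way.
    @dataclass
    class Entry:
        name: str
        count: int
        city: str

    csv_text = "name,count,city\nalice,42,London\nbob,7,New York\n"
    entries = [Entry(r["name"], int(r["count"]), r["city"])
               for r in csv.DictReader(io.StringIO(csv_text))]
    print(entries)

The point is not CSV versus whitespace; it is that the structure is part of
the interface rather than something every consumer re-guesses.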

6. Apache

Given how widely used it is, and given that corporate politics are against it
and that the cost of changing web servers is not that great, they must be doing
something right. The Jakarta stuff is really very cool.

7. Perl, etc

I have the Wrong Kind of Brain for Perl. I can't figure out why anyone would
ever write code they might ever need to read in a language with a Magic
Hidden Variable called $_. Ruby and Python, however, seem really nice. OS
seems to work quite well for this.
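
To illustrate (my own sketch; the point is the hidden variable, not any
particular program): the classic Perl idiom "while (<>) { print uc }" reads
each line into $_, and uc quietly defaults to operating on it without it
ever being named. The explicit equivalent names everything it touches:

    import sys

    # Upper-case every line of input. Each value that flows through the
    # loop has a visible name; nothing operates on an implicit variable.
    for line in sys.stdin:
        print(line.upper(), end="")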

Simon