The HeliOS Project is now.....

Same mission, same folks...just a different name



Sunday, May 03, 2009

Deja Vu All Over Again

It was a website called Lobby4Linux...my first step into GNU/Linux and Free Software advocacy.

It wasn't half bad. A good but long-lost friend named Tracy Kuhlman from Oklahoma built the site from a Xoops template, and he spent dozens of hours making it work. To my surprise it became fairly popular. A famous Linux journalist, Joe Barr, interviewed me about Lobby4Linux in 2006.

May God give Love and Rest to Joe's spirit. We lost him about two years ago.

Today Lobby4Linux resides redundantly, safely tarred on several hard drives...existing in its own little cube. The only change it sees these days comes when I swap icon sets and it is dressed in someone's representation of what a tar.gz file should look like. A 4-gig tar.gz file. Every comment made, every article written lies silent inside that cube.

Things change...evolve...

In that venue, in the summer of 2005, I wrote an article that pissed thousands of people off. To date, that article has drawn the third-most comments we've ever received. Many of them were not positive. I lost friends over it. I publicly made some fairly controversial statements. I said that the current model of the GNU/Linux desktop system would flounder in obscurity until a permanent level of standardization was reached.

And then I crossed the line.

I stated that while I was all for choice and freedom, Linux would continue to simmer on the back burner...never tasted by those seated at the tables out front...never quite done...never quite ready for public consumption. We need one distribution method for applications. Deb, rpm, tar.gz, apt-get, emerge...to the common user, that's information overload, and it keeps Linux forever ignored by the masses.

We needed to think about possibly presenting just one distro to the world.

McDonald's. I think their food sucks. Many of you think their food sucks, but we continue to eat it. So why are they successful? In part, because whether you are in Bamberg, Germany or Boise, Idaho, it sucks exactly the same. Windows is no different as a product. It sucks, but people have grown used to the taste of bad software...and they not only accept it, they make excuses as to why they continue to consume it.

Even when they are presented with a better way of doing things.

Because they know what it is going to taste like before they eat it. When they sit down with it in front of them, they are already familiar with it.

I stated that we needed to form a single entity that took in and distributed funds for specific development teams. I even tried to organize it and was screamed at by people on six continents. We could have our Photoshop...our AutoCAD...our GarageBand. Developers have to pay mortgages. Love of what they do and love for their fellow man doesn't keep the repossession agent from the door. Many of you despise it, but the fact remains that capitalism is the driving motivation behind the majority of successes at every level.

"Developers have to eat."

That is quoted for a reason.

In the spring of 2009, at a Linux gathering in the Northwest USA, a man stood before a large group of Linux developers and said the same things. I am no smarter than this man; he's forgotten more about GNU/Linux than I will ever remember. I simply gave it some thought before he did.

I wonder how many friends he will lose because he said it.

You be the judge.

All-Righty Then


NoobixCube said...

There are some good points in that video, and probably a lot of them are the same as what you said years back. I don't necessarily think we all need to standardize on one package management solution, but I do think that commercial developers need something they can use independent of distribution. Autopackage does an alright job at this, and I can definitely imagine buying a game off the shelf at my local EB with an Autopackage installer. As for commercial software on Linux, I've often thought that it would be a brilliant move for Microsoft to make MS Office for Linux. Lots of companies are either holding off from buying the latest versions of MS software or switching to Linux, usually in numbers like thousands of computers at a time. If MS made MS Office for Linux, then the uninformed CTOs of companies would probably say "Okay, we'll do the Linux switch, but I want MS Office for compatibility". Microsoft wins, because those people probably would have waited another three years for the next hardware rollout - instead, in this case, Microsoft squeezes in an extra Office bulk licensing deal they wouldn't have made otherwise.

Photoshop for Linux. Who really wants that? Adobe software is notoriously bloated, and their Linux software buggy (how many times has your Flash plugin crapped out while you're watching a video?). What we need is either Gimp to get really good really fast, or a new, ground up Photoshop-like solution. Gimp was never meant to be a Photoshop competitor, remember. I can imagine one day downloading "Canonical Canvas" (I just made that up :P), since if anyone has enough stake in Linux to make a good image editor, it would be Canonical.

The Observer said...

When I first saw the title to that video, I thought it would be some Windows enthusiast ranting. I was quite surprised when I watched the full video. I think he was right on many points. I had known about many of the problems with memory managers and hardware support for a while, but I had never thought of package management as being a real issue. I'm relatively new to Linux. I've only been seriously using a Linux OS for about a year. I'm really amazed at the lack of standardization in the Linux community. I'm used to web development, so I'm used to using various standardized technologies. I can't understand why such standardization hasn't been done in Linux.

Anonymous said...

Ken, my name is Rob Jinks and I remember that website and that blog well. It was well written, well thought out and you did not come close to deserving the roasting you got from the fanboys. If I remember clearly, it was a particular distro that you had earlier aligned yourself with that hammered at you the hardest. That is a shame. You did fantastic work for them.

Yes things evolve. You have evolved nicely. Thousands of bloggers have come and gone and you are still here. I am impressed with your fortitude, especially in the face of some of the libelous attacks you've suffered.

Yes, I remember that blog and I remember Lobby4Linux well. I was a constant on your forums there too. Tracy Kuhlman...wasn't he "Okie"?

Great guy. I often wondered what happened to him.

You've become a fixture Ken, one I think most of us have come to count on. We need that kind of stable presence in the community. Thank you for hanging with us over the years.


Carla Schroder said...

hey ole chum, I sort of agree and I sort of don't. I don't agree that Linux needs to be standardized like McDonald's. There is no upgrade path from McDonald's to better cuisine.

I see it as more of a marketing problem. We need a 'gateway Linux' that draws new users in, and then gives them a clean path to the hard stuff. Canonical is doing a great job at becoming the face of Linux. It's hip, it's friendly, it's attracting bazillions of noobs. They get hooked, and then over time they can learn about Debian the Mothership; tiny Linuxes like Puppy and Tiny; Free Linuxes like gNewSense and Mandriva Free; CentOS for the server room; router Linuxes like OpenWRT and Pyramid; VoIP Linuxes like SipX and Trixbox; security Linuxes, fixit Linuxes, liveCD/USB Linuxes, and on and on.

Trying to get adoption of a common installer, common filesystem, and other such infrastructure is probably doomed to failure, since it would require massive overhauls. So maybe the answer is some kind of front-end that understands these things, and provides a common user interface.

Them's my thoughts. Add two bucks and you can buy a cup of coffee!

Owain's Blog said...

I agree with what you said in your blog of 2005, well, to a point anyway. The greatest thing about GNU/Linux is probably choice. This is also its biggest drawback.
Apple is trying to compete with Microsoft. Linux is trying to compete with Apple, Microsoft and itself!
It's always Gnome vs. KDE, RPM vs. Deb.
This is in some ways a good thing, as it creates dedicated fans, but it also holds Linux back from its potential.
Wouldn't it be great if, slowly, all these different package managers, distributions and desktop environments came together to merge their ideas and develop GNU/Linux into something far beyond what we have today?

Andy Cater said...

We already have what you need: Ubuntu as the distro for new folks, Debian as the distro for when they need more.

One package format, LSB compliant etc. etc.

Show me the people who "need" Adobe on Linux - then ask them why they're not getting satisfaction from Adobe and what they're doing about lobbying.

Commercial Linuxes like Red Hat go only so far: what fires people is enthusiasm and, even if you pay prominent developers, you may not get what you want - dunc-tank for Debian springs to mind. On the other hand, you have Alan Cox - he has worked for several major distributions, is now paid by Intel - and has never had to move from his own house :) A kernel hacker's dream.

Mark Shuttleworth is attempting to get distributions to unify release cycles, but it's not so straightforward. "Release early and often" produces Fedora and, sadly, Ubuntu stability; "release when ready" produces Debian; and "release when ready according to marketing, then be forced to pin on seven years' worth of backports and kludges" produces Red Hat :)

In the meantime, keep on keeping on :)

All best

Reece Dunn said...

I have used and programmed on Windows for over 10 years. In the last few years I have migrated my home computer over to Linux.

What I can say from my experience is that Windows and Linux fundamentally share some of the same problems, but approach them in different ways.

For the developer, Microsoft release an ever increasing number of programming interfaces, only to officially drop support for them in subsequent releases (GDI+, Windows Forms). In many ways, this is similar to the ever evolving APIs on Linux.

From a user point of view, Linux is becoming more integrated (with Gnome applications being able to look like KDE applications and vice-versa). Also, the specific widget toolkits (Gnome, KDE, etc.) provide the framework for a consistent look. Windows, on the other hand, is becoming more fragmented (try running a Windows 3.1 app, or a 95/2000 app, on XP, or look at the mismatch of interfaces on Vista and later). Developers need to do more work on Windows to get a consistent look (and because the source isn't available, this cannot be fixed without buy-in from the company).

That's not even considering the inconsistency that is Windows Media Player, Microsoft Office, iTunes and others running on Windows.

Don't get me wrong, I've had my share of audio problems and driver issues on Linux (sound and wireless are the major problems, but with recent Linux distros I haven't had any wireless issues). But I've had the same problems on Windows (more-so if you count all the CDs that you need to hunt for and install from to get the drivers, including the ones for your motherboard!).

Nvidia don't provide an installer for their Go-based graphics cards on Vista laptops - you need to go to the laptop manufacturer to get the driver. When I tried Vista (before jumping to Linux), the Nvidia drivers were rubbish to the point where the computer was unusable. Nvidia did however provide the latest drivers for Go-based graphics cards on Linux. Fortunately, I didn't encounter the Creative soundcard driver issues that plagued some Vista users.

Every time Microsoft release a new operating system, the driver writers have to catch up. This includes the 64-bit only versions (that require the drivers to be signed and authenticated from Microsoft). On Linux, the drivers either come with the kernel, or the X11 desktop environment unless you are dependent on proprietary drivers.

As for installers, Linux has the better platform. Yes, there may be diverse schemes to run installers (deb vs rpm vs emerge). The solution is not to converge on a single one but to inter-operate so that a package can be installed on a different platform (e.g. being able to install an rpm on Ubuntu). This is available now, it just needs better integration.

On Windows, you need countless reboots and downloading of updates. To get a working system, you need to spend a day or more hunting for all the applications, downloading them from websites and installing them. Not to mention keeping track of all the license keys.

I'm not saying that Linux is perfect - nothing is. It will have regressions and rough edges, but as long as people band together then we can make Linux the best it can be, and make it accessible to all, not just those who a market survey says that it should be targeting.

Anonymous said...

In various ways, many people have made these points over and over again ever since Linux became a serious desktop alternative. But most of them are just wishful thinking and will never happen.

One thing which people who migrate to Linux should accept is that there are several benefits - and hence there are bound to be some downsides. Linux cannot be all that it is and _also_ include all the good points of OS X _and_ all the good points of Windows.

Unknown said...

I'm going to disagree. I think that it would be totally suicidal to imitate the failed Windows model.

Choice is one of the greatest options that Free Software provides. Choice is antithetical to Microsoft, and something that the company is terrified of.

Anonymous said...

I'm beginning to wonder, based on some of the comments, how many people actually clicked the link at the end of the article. I mean, the clip IS kind of the entire point of the blog; the story that preceded it only introduced the clip.

For those that didn't, the presenter was quite eloquent in stating his case. The present development model is a huge burden on our developers. They spend more time duplicating effort than they do improving the code. Many developers have devolved into simple application packagers. That's not what they do. Once I watched the clip, I found myself shaking my head in total agreement. If you did not watch the clip, you have missed the whole point.

twitter said...

You were wrong then and you are wrong now. Freedom creates diversity because different people prefer different things. Choice is good and variety is not only interesting, it gives everyone what they want. Distro and desktop hopping is harmful. The same free software can be found in almost all distributions. GNU/Linux is succeeding precisely because people can get what they want and stick with it. It's not a one-size-fits-all straitjacket. I've been running Debian for ten years. I customize it to make myself happy. My wife and kids run Mepis; they like it out of the box. It all works the same and it all works together. We are all happy this way. We would not be happy if someone "unified" our experience into something we've already rejected. If it is not broken, don't fix it.

No need for hate though. I'm sorry you lost friends over this. Friends should forgive trivial mistakes. Keep up the good work. Use your freedom to make exactly what you want, and people will love you and beat a path to your door if that's what they want. Don't be surprised when people use their freedom to fork it.

Unknown said...


I watched it. Quite frankly the impression I get is that while Bryan's been around a long time, like Eric Raymond, he doesn't have a clue. Yes, I am being deliberately nasty.

The greatest strength of Free Software is its diversity. Take desktop environments. Say the KDE team makes a totally horrible design choice, which ruins the KDE desktop. While the team is trying to fix it, we can use XFCE, or Gnome, or Enlightenment, or vanilla X, or [INSERT NAME HERE].

And if the KDE team can't fix the problem, and the team breaks up? Well, think of it as Evolution in Action.

When Microsoft committed the atrocity that is Vista on their poor suffering customers, the only option was to use 2000/XP, both of which are technologically ancient, and insecure as hell. Windows 7 doesn't appear to be any better, so the suffering continues.

Diversity is our strength. Yes, we end up doing things 3 or 4 times, 3 or 4 different ways. And dammit, we get better code that way. Code that Microsoft would love to steal, because they as a company can see how good it is.

Oh, and my apologies to the KDE team for using them as an example. I'm sure all of you remember the flame wars when KDE 4.0 was introduced, and all of the concern in the community about whether it would ever be usable. Well, they did one heck of a job, as 4.1 and later showed us.

And I agree with Carla - newbies go to Ubuntu, and later onto other things.

Anonymous said...

@ twitter

You were wrong then and you are wrong now.

Tell me I am wrong.

I have been developing software for GNU/Linux for eight years. Have you stopped to think where the Linux operating system would be today if we did not have to duplicate the work of others? I'm not going to go down the shopping list of examples. It is obvious you did not bother to watch the film the author presented at the end of the blog.

We work for nothing so you and your family can enjoy the diversity you speak of. Be careful how you take this for granted, and it is obvious you do.
There may come a time when people like me get sick and tired of replicating our work and building packages for 7 different installation engines instead of innovating and making what you use better.

Watch the film and tell me how wrong we are for making the statements we make.

Lam said...

My question to this kind of suggestion would be "then what?". Assume that we succeeded in "unifying" the effort and experience, and GNU/Linux achieves significant market share in desktops, then what? What have we offered to the world but Windows-with-Linux-kernel?

The real meat of GNU/Linux is not about speed, or stability, or usability. These are the result of the real meat, side effects at best. If you wish those that much, you can go for Macintosh just fine. Or some specialized version of Windows. GNU/Linux is not about these. I still put the "GNU/" part there to remind myself that GNU/Linux is all about personality and freedom. That is what this system offers to the world.

True, it takes time and effort to put together a GNU/Linux system. True, it may be weird and inefficient from time to time. But so what? It is, for God's sake, MY GNU/Linux system. It's different from yours. It's unique. It's like how your parents feel about you: you are not the best, but you are theirs, period. They love you, enjoy spending time with you, nurture you, not because you are the best, but because you are theirs.

Same goes for the "joy of computing". It's not about showing off, or speed, or stability. It's a PC, a personal computer. It's mine. It took me a month to put everything together, and it sort of works. But you know what? I love it. It's mine. I can sometimes even feel it working for me, restlessly overcoming any mistakes I made along the way. It's personal. If I don't like such and such a component, well, away with it. I will switch. I hand-pick the parts that I trust and love. My fingerprint is all over the place: it's my personal computer.

Can Windows ever achieve that? Can Macintosh ever achieve that? Windows Vista is flashy all right, professional all right, but it's someone else's; it's something I bought, not built. It's just impersonal, cold, and fake at best (cruel at worst, if you are talking about the EULA).

GNU/Linux is different, that's the main point. It takes time, effort, and sometimes money. It can frustrate me from time to time. But it's like my little kid.

And that's what GNU project offers to the world: a taste of freedom, of possessing your own system. It's the meat of the whole project, and later the whole free software movement. Linux is adopted to be a part of that. Thus, it is best that GNU/Linux stays what it is, "an alternative to the mainstream" (quoted from forgotten source).

If we change, if we "standardize" the system, we just prove one thing: that Thomas Paine and his bunch were wrong after all, or any freedom worshiper for that matter. It is best to have some "developers" control your system; it's best to hand over your computer to someone else; it's best to do nothing; it's best to be under control, to be a slave, to not think. Freedom, personality, possession are too effort-intensive, too time-consuming. Is that what GNU/Linux is about? Oh, sorry. Is that what Linux is about? Is that a PC? A brand that sounds good and connotes nobility, but denotes nothing more than an empty name?

Anonymous said...

I don't think packaging etc. will ever be standardized across distros. First, it's politically impossible and second, it's probably impossible to specify a detailed enough standard to actually make everything work across many distros. It's very much like writing specs as opposed to writing working code.

The way to get a de facto standard desktop is to get one of the distros to take a large majority of the market share. At the moment, it looks like that might be Ubuntu. It's a bit of a shame that Red Hat does not want to go after the desktop market. They are the largest company in the field, employ many more developers than Canonical does, and make huge contributions to free software. Fedora will probably never seek to be pre-installed and supported by OEMs, and that puts the largest contributor to the OS out of the picture in some important ways. Sure, the code ends up in Ubuntu, too, but Canonical is still not in as good a position to provide support. Sometimes I feel Fedora and Ubuntu should merge (and I guess in that case Red Hat would buy Canonical), but I suppose that's impossible. Besides, they both already have the better package format.

jelabarre59 said...

Thinking back to the software developed for the multitude of home computers back in the early-to-mid 1980s, I have to wonder about the excuse that it's too hard for software houses to target multiple platforms. Gee, back then there were MANY architectures, as well as MANY operating systems. Not just different distributions of the SAME OS, but entirely different systems. And they managed, back then, to make variants of their software for a good subset of them.

Fast-forward to now, and it seems they are incapable of managing some small variant of the same thing. It's entirely feasible in a technological sense; I can only think it all boils down to the "grossly inflated quarterly numbers at the expense of all else, including the company's future" mentality. I would hope the economic collapse (otherwise known as the "New Depression") would shake out the cruft and dead meat, but I doubt that's going to happen.

Unknown said...

Anonymous said:

"Have you stopped to think where the Linux operating system would be today if we did not have to duplicate the work of others."

Yes I have. Rather than being light years ahead of Windows, it would be a poor copy of it, lacking features and capabilities, if we'd done things your way.

The competition between the different Free Software projects has driven the evolution of our technologies at a breakneck pace. Consider the GNote/Tomboy situation, where in less than a month, the developer of Gnote has cloned Tomboy in a different language, providing almost all of the same features, and producing a faster, leaner application.

Think of it as Evolution in Action.

While this sort of competition might be rough on the developers of Tomboy (whose major mistake appears to be using Mono), it provides us with software that is far better than Microsoft can provide. Microsoft doesn't allow internal competition among its programmers, and so what their customers get is the first junk that works.

Yes, it can be frustrating, trying to build another implementation of a feature - but why didn't you use the existing code? You are a Free Software programmer, aren't you?

I find it curious that you have not identified yourself. Quite frankly I know several people who work for Microsoft, and the arguments that you advance, are the same ones I hear from them.

Anonymous said...

"Deja vu all over again"
exactly that said it all
most of the points are the same as what we all know for years
what i really want to know is
who is this guy from the video ?
and what non-trivial things did he do / is doing to fix something
i really don't see the point of keep saying the same things eternally without any action

Anonymous said...

"Tell me I am wrong"
You are doing wrong if you are packaging for 7 diferent distros.You should provide a standalone agnostic prebuild tar.gz (if any) and the source that's it.distributions should do their job and package the software ,they exist with that purpose

Amri said...

Or, developers could focus on, say, two major package management systems like rpm and apt, and let others fork to their favourite system if and when they want to.

I'd propose that the open source community focus on two or three systems to keep some diversity and healthy competition without going in too many directions, which would waste valuable resources.

And hey, if rpm and apt can support each other's packages seamlessly, and distros can be switched back and forth almost effortlessly, I'd say, more power to the user. In such a case, having standards helps.

Reece Dunn said...

Indeed. One of the main reasons why there is so much diversity in build systems, compilers, package managers, applications and so forth is that no single application meets everyone's needs.

Don't like Mono, or concerned over patents? Or do you just want a leaner application without further bloat because you are using a constrained netbook? (Similar arguments - except for patents - can be made about Java and Python.)

Look at the evolution that happened on the distributed version control systems (git, mercurial and bazaar).

Also, look at what is happening on the netbook space with the Ubuntu Netbook Remix UI.

As for upstream packages (not the distributions), the program should support prefix/DESTDIR style make variables. The distributions should take care of the rest. Will there be duplication? Of course, but then each distribution is subtly different. They will also include custom patches and build adjustments where necessary.
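As a rough sketch of the prefix/DESTDIR convention mentioned above (the project and file names here are hypothetical): `prefix` says where the files will live on the running system, while `DESTDIR` lets a distribution's packaging tool stage the install into a scratch root instead of the live filesystem.

```shell
# Hypothetical mini-project demonstrating prefix/DESTDIR make variables.
mkdir -p /tmp/destdir-demo && cd /tmp/destdir-demo
printf '#!/bin/sh\necho hello\n' > hello.sh

# A minimal Makefile whose install target honours both variables
# (written via printf so the recipe lines get real tab characters).
printf 'prefix = /usr/local\ninstall:\n\tinstall -d $(DESTDIR)$(prefix)/bin\n\tinstall -m 755 hello.sh $(DESTDIR)$(prefix)/bin/hello\n' > Makefile

# A packager stages the files under ./stage rather than the live system:
make install DESTDIR="$PWD/stage"
ls stage/usr/local/bin
```

The staged tree under `stage/` mirrors the final installed layout, which is exactly what rpm or dpkg tooling archives into the package; the distribution then applies its own patches and build adjustments on top, as the comment describes.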

Hopefully, there is good communication between the distribution and upstream package so that fixes flow to the individual packages (so that everyone benefits).

On the Windows side, it is not as standardised as it seems (at least from a development point of view). Microsoft change project formats with every release of Visual Studio, so you need to either do a one-way upgrade, support multiple versions of the project files, or use another build system (Jam/Boost.Jam & Boost.Build, Scons, cmake and others are available for Windows).

Not to mention that Microsoft shift the goalposts with every release of Windows, such that if you did migrate to .NET when Windows Forms was current, Microsoft are now pushing you to use WPF (and what will be the next big thing?).

Before Vista supported hyperlink controls, you either needed to roll your own or use one in one of the UI frameworks. Then there are quality of implementation issues. In addition to this, if you intend to support XP or earlier you'll need to have conditionals in the code to use the Vista control on Vista and a fallback on XP or earlier.

So this is not exclusively a Linux issue. Remember, one of the motivating reasons to choose an alternate solution (or provide your own) is that the one that is available does not meet your needs, is moving slower than you need it to, or is not accepting your improvements.

The advantage for GNU/Linux is that the code is already there, so they can use that however they want -- the GPL license is designed to pass that benefit on to the next person, so that the ecosystem gradually improves.

Reece Dunn said...

Standards matter. Interoperability matters. Beyond that, who cares?

Choice matters. Look at what has been happening on the browser space. Microsoft and some silly moves on Netscape's part (like completely rewriting it) killed Netscape's browser. They decided to make it open source. Over time that browser has evolved to where it is today.

Opera, Konqueror and Firefox continued to innovate and push what it meant to be a browser. The introduction of Chrome into the browser space has continued to push JavaScript performance. Opera and the WebKit (Safari and Chrome) teams continue to push standards compliance, and Firefox is catching up.

Should those be standardised on a single browser or rendering engine? No. The web standards (assuming compliance *and* interoperability) are enough.

Standards need to evolve at a healthy pace, or people need to carry the torch (like is happening with HTML5) to keep the innovations happening.

Unfortunately, Microsoft are behind the curve - both in JavaScript performance and in standards support - instead, pushing their own proprietary frameworks that are controlled by a single stakeholder. If people are driven by money alone (like big companies often are) then if they don't see any value in something (like look and feel or accessibility issues), they won't put the effort into fixing or improving it.

Freedom matters. Choice matters. Community matters. Da?

Yes, the Linux kernel developers keep changing the driver API. That API is internal (the public APIs are stable as far as I am aware). Does this cause problems when using things like the Nvidia drivers? Sure. But if those drivers cause your system to crash, how are you going to fix it? If you have a stability issue, how do you know it's not one of those drivers misbehaving? You don't... unless you have the source.

What if the company decides that it is not worth fixing? Or that they are going to release a fix in around six months' time? Or they go under? Or they decide to stop releasing updates because it is for 1/2/5-year-old hardware?

The kernel guys know what they are doing. Yes, they mess up from time to time. Yes, there are bugs and regressions. But this is complicated software here (same for package managers, desktop environments and the like). And they are still human, just like us.

Anonymous said...

Personally? Things like multiple formats for programs (RPM, tar.gz, .deb) are all very annoying. There are some programs I like that exist only as RPMs, others I'd like but can't use because they are tar.gz's...I wish there was either:

1. One format, or

2. An autonomous system in distros to convert them into the required format.

Why? Because a developer can only get so far with one format, and quite a few don't bother to transcribe into other formats.

Reece Dunn said...

There is a program called alien that converts between different package formats. I'm not sure if there is a GUI front-end for it, though.

Other than that, I'm fairly sure that you can install an RPM in the Debian (DEB) package manager.

There is also PackageKit, which is supposed to handle different package formats. Fedora/Red Hat and possibly other RPM-based distributions appear to be standardising on PackageKit.

As for the other distributions, Debian/Ubuntu have a nicely integrated package management system - not sure what the support is for different packages, though.

Then there is the SRun project that is a part of the SuperOS variant of Ubuntu, as has been mentioned on here by Ken. Not sure if this uses PackageKit or the update manager UI inherited from Ubuntu.

Gentoo and other source-based distributions are pretty much on their own - you need the corresponding build script.

Slackware (that uses the tar.gz format, IIRC) I am not sure about. Nor am I sure about the other distributions.

I personally believe that interoperability is the answer, not unification. And decent progress is being - and has been - made in that direction.