Gregor Kiczales gave one of the keynotes at last year’s OOPSLA (slides and audio). The abstract was promising, and it sounds like the talk was well received. Nonetheless, I think Kiczales is off target. But as the talk has a number of important themes running through it, and the subject matter is extremely pertinent to my own research interests, I will attempt some course correction, rather than just pick holes. As with my last post, I intend these remarks as friendly constructive criticism. The talk was in the exploratory spirit, after all, and it deserves a response in the same spirit. Unfortunately the various strands of the talk are woven together in ways that are both confusing and confused.
Pluralism in software, pluralism in science
Kiczales’ central observation is this: software abstractions are not fixed, context-free entities, but are essentially dynamic and context-sensitive. Which abstraction is the right one depends on the task, and the perspective, of whoever is working with the system.
Kiczales also makes the interesting point that the shift towards perspectivalism in software in many respects mirrors a similar shift in how we think about natural systems. (At one point he asks the audience, leadingly, whether anyone believes in “scientific objectivity”.) Here it seems that Kiczales has been heavily influenced by Brian Smith’s book On the Origin of Objects. Smith was in the office next door to Kiczales for several years, and some of his ideas seem to have cross-pollinated. I’ll come back to Smith’s book shortly. But first let’s consider this analogy between software systems and natural systems. (I apologise for the brief philosophical digression, but I think it’s one of the threads of the talk which hits on an important point.)
Twentieth-century philosophy of science, certainly since the decline of logical empiricism, was dominated by various flavours of a metaphysical position called realism: roughly, the view that there is a unique, observer-independent fact of the matter about what the world contains, which our best theories may or may not correctly describe.
What’s wrong with the realist picture is that there is something smelly about the idea of a fact which is in principle beyond the reach of empirical science. Scientific theories typically “parse” low-level ontologies into higher-level ontologies; these macro-ontologies are really nothing more than patterns swirling in the low-level structure, and the “truth”, or otherwise, of such theories is, in scientific terms, fully exhausted by the empirical success of that theory. From science’s point of view, “molecules” are just patterns in the quantum-mechanical substrate (or whatever) which satisfy a certain behavioural or structural description, and the extent to which the theory of molecules is more (or less) objectively true than any other theory is just the extent to which that theory successfully (or unsuccessfully) systematises the phenomena. There are no further scientific facts – no “trans-empirical” facts – which determine which theory is “the one true theory”.
I can’t hope to have done justice to this topic in that one paragraph, even if I knew enough about the subject to give it proper treatment (and I don’t). But my aim here is only to concede to Kiczales and Smith what I think is fair concession: that any plausible alternative to realism must be pluralistic. It must allow for there to be multiple descriptions of the same natural system – perhaps with radically differing ontologies – without imposing the requirement that at most one of them is “correct”. A theory of quantum gravity, if we ever find one, will not reveal General Relativity to have been “false” – but mysteriously successful – all along. It will just be a better theory.
So let us grant the point that realism is a metaphysical red herring. And I think we can also agree that the analogy with what’s wrong with our traditional conception of software is compelling. We tend to think there is a unique, objective fact about what a piece of software “does” – a unique theory of its behaviour. Our awareness of the existence of some underlying source code tends to fuel this intuition. But really we need to be much more pluralistic, and accept that what a piece of software “does” inescapably depends on your point of view. A security engineer might have a completely different view of a system than an end user. Each end user probably has a different view than other users, inasmuch as she can’t see what other users are doing. Reports or audits produced for management are really nothing more than abstractions of how the system behaves. Even a bug-fix, without too much of a stretch of the imagination, is just a view of an erroneous program that applies a correcting delta to its behaviour. And many of these views and perspectives aren’t just design-time artifacts, but are live perspectives on a running program. This pluralistic way of thinking about software is even more dynamic and fluid than fluid AOP: let’s call it superfluid AOP.
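The pluralism described above can be made concrete with a minimal sketch (all of these class and method names are invented for illustration): two live, partial views over the same running state, neither of which is “the” canonical description of what the system does.

```python
class OrderSystem:
    """A tiny stand-in for a running program."""
    def __init__(self):
        self.orders = []      # raw domain state
        self.audit_log = []   # every mutation is also recorded

    def place_order(self, user, item):
        self.orders.append((user, item))
        self.audit_log.append(f"{user} ordered {item}")

class EndUserView:
    """An end user sees only her own orders."""
    def __init__(self, system, user):
        self._system, self._user = system, user
    def my_orders(self):
        return [item for (u, item) in self._system.orders if u == self._user]

class SecurityView:
    """A security engineer sees the audit trail, not the domain objects."""
    def __init__(self, system):
        self._system = system
    def events(self):
        return list(self._system.audit_log)

system = OrderSystem()
system.place_order("alice", "book")
system.place_order("bob", "lamp")

print(EndUserView(system, "alice").my_orders())  # ['book']
print(SecurityView(system).events())             # both events, no domain detail
```

Each view is an abstraction that exists only in the service of one stakeholder’s interaction with the program; neither is reducible to, or privileged over, the other.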
An exciting possibility, then, is that fixing our philosophy of the natural world and learning how to think properly about software might end up dovetailing rather nicely. God’s not a mathematician, he’s a programmer, right?
So far so good. But where Kiczales’ talk goes awry is in its leap from pluralism, to the ushering in of a new era of “formality-free computing”. In this fluffy new future, we will sit around engaging in the social negotiation of our interfaces and abstractions, formality having been left behind.
The following excerpt from the Amazon “review” of Smith’s book (presumably written by his publisher) captures the sickly flavour of Smith’s vision:
Critics of programming practice have compared it to alchemy and Smith recalls the characterisation of Newton as the last of the magicians. Is this a pre-Newtonian phase, lacking “Laws”, awaiting the differential calculus? Another position is suggested:
“… that we are post-Newtonian, in the sense of being inappropriately wedded to a particular reductionist form of scientism, inapplicable to so rich an intentional phenomenon. Another generation of scientists may be the last thing we need. Maybe, instead, we need a new generation of magicians”. [p362]
Magician? Magus? Seeking the secret of how it is we “deconvolve the deixis” – plus ça change, plus c’est la même chose. The Alchemist: not a charlatan, but one possessed of much empirical wisdom stumbling after the scheme of things; as this new Science of the Artificial must do, self constructed, self referential, post-post-modern, a metaphysics for the 21st century.
I’m sorry, what?? When exactly did Gary Gygax get together with Jacques Derrida? It’s somewhere between uninformative and downright misleading to attach significance to the idea that software is intentional (in the philosophical sense originally popularised by Dennett, and somewhat misappropriated by Smith). We can also skip gaily past Smith’s notions of “registration” and “the middle distance”, which, whatever their suggestive power, do no technical work for us here.
And while it may be true that interfaces are, unsurprisingly, often socially negotiated, we must be careful what we infer from this. So, after all, are the spelling of identifiers, the pattern of whitespace in a source file, and the arrangement of plant pots in an office. What we must cleanly demarcate are the forces that define a particular technical problem, and any particular solution to that problem. The problem that Kiczales has quite rightly identified is just this: abstractions are essentially dynamic and context-sensitive. There is no unique “correct” ontology for any man-made system, any more than there is a unique correct theory of any natural system. And one of the key forces that happens to drive this dynamism and context-sensitivity – but only one among many – is social interaction. (“One man’s constant is another man’s variable”, as Alan Perlis nicely put it.) But it is a mistake to think that any observations about computing as a social activity offer insight into potential solutions to this problem.
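Perlis’s quip is worth making concrete, since it is itself a small instance of the context-sensitivity of abstractions. In this sketch (the function names are mine, purely illustrative), the very same parameter is a variable from one perspective and a constant from another:

```python
from functools import partial

def scale(factor, x):
    # from this function's perspective, `factor` is a variable
    return factor * x

# from the perspective of `double`, the same parameter is a constant
double = partial(scale, 2)

print(double(21))    # 42 — factor fixed at 2
print(scale(3, 21))  # 63 — factor free again
```

Nothing about `factor` itself determines which it is; the classification exists only relative to a point of view.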
Formality all the way down
This leads us to the final Smithesque strand we need to extract from Kiczales’ talk and lay to one side. We are all familiar with the observation that simple interactions between parts often give rise to “emergent” phenomena – behaviours that are somehow novel or surprising, such as the macroscopic behaviour of ant colonies or eBay shoppers, but which are not in any way mystical or magical. As Figure 1 attempts to show, emergent behaviours are in a sense dual to the requirements on a solution. Requirements are known and intended in advance; emergent behaviours are precisely those that were not.
Emergence is an important topic. But again, we must be careful not to make the leap from the uncontroversial phenomenon of emergence, to the highly controversial idea that reality (and by analogy software) might not be “formal all the way down”, as Kiczales, following Smith, suggests. Formal all the way down is exactly what reality is. What else could it possibly be?
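Indeed, the standard toy example of emergence makes the point: Conway’s Game of Life is formal all the way down, and that is exactly what makes its emergent objects striking. The sketch below (a routine set-based Life implementation, nothing from the talk) shows a “glider” – a macro-level object with its own identity and motion that appears nowhere in the purely local update rule – reappearing one cell down and right after four generations:

```python
from itertools import product

def step(live):
    """One generation of Life on an unbounded grid; live is a set of (row, col)."""
    counts = {}
    for (r, c) in live:
        for dr, dc in product((-1, 0, 1), repeat=2):
            if (dr, dc) != (0, 0):
                cell = (r + dr, c + dc)
                counts[cell] = counts.get(cell, 0) + 1
    # birth on exactly 3 neighbours; survival on 2 or 3
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)

# the glider has moved as a unit: same shape, shifted by (1, 1)
assert state == {(r + 1, c + 1) for (r, c) in glider}
```

The rule mentions only cells and neighbour counts; “glider” belongs to a higher-level ontology that we, the observers, bring to it – emergence without any failure of formality.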
Smith’s new-age version of emergentism is just an invalid inference from the failure of the reductionist programme in science. In the 1960s, many scientists, as well as philosophers such as Ernest Nagel, were optimistic that we would eventually be able to deductively derive all of science from fundamental physics, by establishing the right “bridge laws” between theories. Half a century later, this optimism looks naive. There has been only limited success, for example, in deriving much of chemistry from quantum mechanics on a “first principles” basis.
But the failure of this kind of reductionist programme does not mean giving up on formalism. We simply need a more mature perspective on the relationship between two theories, perhaps seeing the relationship as closer to one of implementation than of deductive derivation.
Once we concede this, then as with social negotiation, we can see that emergence is only indirectly related to the technical problem of enabling “perspectival programming”. We don’t need to design for emergence; what we mean by “emergent” is, after all, just that which doesn’t come built-in. There are no insights we can export from emergence itself to the foundations of computing. Emergence comes about from the way we use things, the way things contingently interact, not from the formal substrate itself.
The technical challenge: a new paradigm for interactive computing
So at last, I think we can distill the central challenge lurking at the heart of Kiczales’ talk. How do we expect to realise the task-centric, perspectival model of programming that we know is coming? If abstractions indeed need exist only in the service of specific interactions the programmer or user has with the program, then in the future we may be abstracting and unabstracting as frequently as we switch between edit buffers today. In their various ways, systems like Mylyn, fluid AOP, and Subtext offer a glimpse of what this world might look like (although Subtext is the only one of these that offers a glimpse of just how fluid the new paradigm might be). But do we have the technical maturity to realise this superfluid, aspects-on-steroids vision?
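To fix intuitions about what “abstracting and unabstracting” at runtime might feel like, here is a deliberately crude sketch in the fluid-AOP spirit (every name in it is invented; real fluid AOP operates on source-level views, not monkey-patching). A cross-cutting concern – tracing – is woven around a live object on demand, and unwoven again when the perspective is no longer wanted:

```python
import functools

class Fluid:
    """Weave and unweave an 'around' advice on an object's method at runtime."""
    def __init__(self, obj):
        self._obj, self._saved = obj, {}

    def weave(self, name, around):
        original = getattr(self._obj, name)
        self._saved[name] = original
        @functools.wraps(original)
        def woven(*args, **kwargs):
            return around(original, *args, **kwargs)
        setattr(self._obj, name, woven)

    def unweave(self, name):
        setattr(self._obj, name, self._saved.pop(name))

class Account:
    def __init__(self):
        self.balance = 0
    def deposit(self, n):
        self.balance += n

trace = []
def tracing(proceed, *args):
    """An 'around' advice: record the call, then proceed."""
    trace.append(("deposit", args))
    return proceed(*args)

acct = Account()
view = Fluid(acct)
view.weave("deposit", tracing)   # adopt the tracing perspective
acct.deposit(10)
view.unweave("deposit")          # drop it again
acct.deposit(5)

print(acct.balance, trace)       # 15, but only the first deposit was traced
```

The point of the sketch is the lifecycle, not the mechanism: the tracing view exists only for the duration of one interaction, and the program’s “real” behaviour is whatever is left when every such view has been peeled away – which is to say, no one view in particular.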
I suspect that Kiczales would agree that the answer is no. We simply lack a compelling paradigm for building robust interactive systems. But contra Kiczales, and as I argued in my last post, working out this new paradigm will require us to embrace the formal, not reject it. The answer is not going to be to make things less formal.
Conclusion: less pop, more sci
To sum up, I sincerely doubt that there is an impending “post-formalist” reconstruction of the foundations of computing. If we want things to be fluffy, they’re damn well going to have to be fluffy in some kind of technical, mathematically robust sense, not in some…well, merely pop sense.
What this kind of question, the popularity of Smith’s book and, to a lesser degree, Kiczales’ talk ultimately bring home is perhaps this: that if it’s socially negotiated artifacts we’re after, we need look no further than the world of