Tuesday, December 27, 2005

Order and Chaos and the Power/Threat of the unknown

Right, as you can tell by the title, this blog is about a number of ideas that I feel are related. I'll try to keep it short. What got me blogging was a post on the XP discussion group on Yahoo. The discussion was about the newest XP practice, "Slack", as mentioned in XPE2 (Extreme Programming Explained, 2nd Edition). The question in most of the posts was "what is it?"

The answer was that no one really knew. The one person who should have known (Kent Beck) contributed a link to a forum discussing his aforementioned book (XPE2), but offered no detailed explanation of Slack himself.

I pointed out what seemed obvious to me: slack (leaving space in the plan) will mean different things to different teams at different times, so trying to elevate this behaviour to a defined "Practice" is a futile exercise. Unfortunately only one person explicitly agreed with me, though I'm sure a few others did too but chose not to be as vocal.


So why is it that we feel the need to define the indefinable? We are always modelling the world and trying to put it in a box. I think for many of us (left-brained thinkers: mathematicians, scientists and programmers), that is just the way our brains work.

Creative people (poets, artists, writers, etc.) seem to suffer from this need a lot less. To them, something that is inherently indefinable is a source of fascination and interest. They tend not to find such things threatening at all. Concepts such as beauty and elegance resist precise definition, yet many communities quite happily share a common understanding of what these things are. This understanding becomes what is known as culture.


A book that had a profound effect on me was "Language in Thought and Action" by S. I. Hayakawa. The book describes how thought is intimately linked to language, and how our choice of language can limit our thinking. Central to the book is the idea that the label is not the thing; the map is not the territory. We use 'labels' in language all the time as stand-ins for ideas, concepts, groups, activities etc. The label itself takes on its own meaning and inferences, yet any instance of a thing that we choose to refer to using a label may have its own unique characteristics which do not fit neatly into the meaning inferred from the label.

So the key is to know when you are using a label, and to remember that the real world is a lot more complex and diverse. An example of this that comes to mind, and is related to software, is the CMM. The CMM is an attempt to model software development organisations. It identifies Key Process Areas (KPAs) and their relationships with each other. Each KPA is assigned to a role that must exhibit concrete abilities.

As part of my training as a certified CMM assessor, I was exposed to a computer model of the CMM. It was great: you could see the effect of changing abilities within a given KPA on other KPAs and on the process as a whole. Logically the model made a lot of sense. The big gap, though, was that people and organisations often defy logic.

So within the CMM, a group of demonstrable KPAs represented an organisation whose output was repeatable on a per-team basis (Level 2), and an additional group of KPAs defined an organisation whose output was repeatable and consistent across the organisation (Level 3). The next level of maturity, Level 4, contained KPAs that demonstrated that an organisation could quantitatively measure its performance as a pre-requisite to quantitative process improvement.

Unfortunately, the moment you expose this simplified model to real-world situations it immediately begins to fall apart. For instance, is it not possible that an organisation may have a single team that has demonstrably delivered repeatedly, using quantitative techniques? In such a scenario, the set of abilities demonstrated may cut across maturity levels, and may even cut through KPAs, leaving the CMM model in tatters.

In a sense XP has fallen into the same trap as the CMM. When I was first introduced to XP, I was told that you needed to do all the XP practices together; that they are all related and interdependent. Yet after four years of real-world experience, XP has been shown to have gaps. Despite doing all the practices, there are still issues and problems that XP doesn't address. Is this surprising? So what is the solution? Well, for Kent Beck it was to add one more XP value (respect) and an additional practice (Slack).

The model is flawed, so just add to the model. There is another way: remember that the model is just that, a model. The real world is a lot more complex and diverse. The label is not the thing. The map is not the territory.

I could blog (rant) some more. But in summary, as developers we need to get better at using the other side of our brains. Not knowing shouldn't be threatening, a state to be extinguished quickly with ever more elaborate models. No, not knowing is just part of the rich fabric of life. In fact there is power in the unknown. In acknowledging that there isn't a simple formula, order can be found in chaos. Examples of this exist in philosophy, especially Eastern philosophy. So Zen sayings like "knowing without knowing" really do have meaning. Higher-order thinking can help us to gain understanding in an inherently complex and chaotic world. All we need is the courage to embrace the power of the unknown.

Saturday, December 10, 2005

XPDAY5 - Getting Customers to Know Themselves

In the same session on "Getting to know your customer" mentioned in my previous post, Steve and Andy went on to describe Innovation Games. The goal of these games is to get your customer to think about their problems in a different way, and to come up with new innovative ideas on how to tackle them.

Interestingly, just talking to customers about what they want isn't always adequate. Often customers do not know what they want and end up telling you what they think you want to hear. I think this could be related to the three basic assumptions challenged by the Cynefin sense making framework (see my previous post). If our customers think that we expect them to be rational, intent-driven and ordered then they will answer our questions in kind.

Through game play, people are let off the hook. They do not need to pretend to be rational. They are free to get in touch with themselves and the true nature of their problem; after all, it's just a game.

One of the games is called Product Box. Customers are encouraged to decorate a box containing the "product". The decoration should highlight the most exciting and sellable features. The idea is to make the product appealing. Another game is called Speed Boat. Here customers are encouraged to draw a speed boat which is tied to things that are holding it back, things that are stopping them speeding ahead in their work.

These games are similar to the domestic probes used by Bill Gaver and fit in well with the Cynefin probe-sense-respond approach.

Here is a link to more Innovation Games: http://www.enthiosys.com/innovation.php

The combination of Cynefin sense making, Innovation Games and Bill Gaver's Ludic Technologies really made XPDay for me. I'll never look at innovation and product ideas in the same way again.

XPDAY5 - Knowing your Customer

One of the sessions at XPDay that I attended was about knowing your customer. Steve Freeman and Andy Pols introduced us to the Cynefin sense making framework. This framework challenges three of the basic assumptions we generally make when trying to understand our customers and their organisations. The three assumptions are:

  • The assumption of rational choice - People will do what is rational.
  • The assumption of intent - Everything that happens is done intentionally.
  • The assumption of order - Organisations are well ordered, rule based entities.
It turns out that if you remove these assumptions, then most organisations turn out to be highly complex and chaotic (sound familiar?). The framework goes on to suggest two approaches to effecting change in such organisations:
  • probe-sense-respond - for complex organisations and,
  • act-sense-respond - for chaotic ones.


This is very different to the sense-categorize-respond approach that has traditionally been taken (for example, Business Process Re-engineering). They point out that the traditional approach is best suited to ordered organisations.

Here is a link to their paper:

http://www.research.ibm.com/journal/sj/423/kurtz.pdf

A bit of a heavy read. Thankfully the approach they recommend for complex/chaotic organisations, namely probe/act-sense-respond, is already widely known as suck-it-and-see :^).

XPDAY5 - Anti-Intellectualism and Dead Fish

Tim Lister's keynote at XPDay this year was an impassioned plea for people to wake up and use their brains. An example of "brain-less" practice that he described was a team that continued to provide effort estimates to two decimal places even though their estimates were consistently between 100% and 200% out. Anyone who knows anything about maths (Tim's father, for example) could clearly see the inconsistency between the estimating accuracy and precision: estimates quoted to two decimal places imply a precision that errors of 100-200% make meaningless.

So how does this type of thing happen? How do supposedly intelligent people keep making such fundamental mistakes? Organisational culture has a lot to do with it, coupled with the inherent difficulty of accurately predicting software development. The stats for software development failures are abysmal. Most projects fail, and those that don't are either late or deliver less functionality than promised. Only a small percentage actually succeed.

Tim spoke about anti-intellectualism. People not wanting to know. I've experienced this too. If you propose an intellectually sound approach to tackling a complex question, the response is often negative. It is almost as though people do not want to think. Worse still, it is as though they are scared that if they did think things through, they could come up with answers that aren't palatable.

The consequence of this willful ignorance is what Tim Lister refers to as the dead fish. Dead fish projects are projects where everyone knows that they are going to fail, but no one actually comes out and says it. The dead fish stays there smelling and everyone blissfully ignores it. Tim said that you can sense the mood in dead fish teams; there is an atmosphere, a stench that is perceptible straight away. People know when they've been set up to fail, and it shows in their body language and general lack of enthusiasm.

Tim did a good job of identifying the problem, but didn't offer solutions. My experience is that telling people that there is a dead fish doesn't always help. Usually by that time, many people feel implicated and vulnerable. Hearing the problem described so eloquently, though, was powerful. It has definitely motivated me to do more about dead fish in the future.

Sunday, December 04, 2005

XPDAY 5 - "Gathering" Requirements

For me there was a sort of theme to XPDay 5. The theme was innovation. How do we know that we're building the right things? Both of the keynote speakers touched on this subject. It is an area that has concerned me too (see my post on software as an art). As developers we have managed to lull ourselves into the belief that product requirements are just out there and all we need to do is "gather" them. In Tim Lister's keynote he showed a slide with a picture of a little girl picking a flower, with the caption "Oh what a pretty requirement, let me gather it". It had us all laughing, but it made the point well.

Bill Gaver, the second keynote speaker, spoke about his work with Ludic technologies. His interest is computational devices for the home. He believes that computers in the home need not be practical, work-oriented devices, but instead can enrich the experience of domestic living in other, non-practical ways. So the question is: what should we build for the home? Well, it turns out that the obvious approach isn't necessarily the best. Just asking people what they want in their homes doesn't always lead to new, innovative thinking. To tackle this problem, Gaver and his team have come up with a probe pack which they use to probe the minds of potential users and gain an insight into how they use their homes:

Domestic Probes:




Really interesting stuff. One of the products produced as a result of this type of requirements elicitation is the drift table.


To find out more about Ludic Technologies visit the Equator project website.

Monday, November 28, 2005

LISP and The Unified Theory of Everything

Spent today at XPDay London 2005. It was a great day. I listened to Tim Lister, an inspirational speaker, give the keynote speech. I'll never think of dead fish in the same way again, but that's food (pardon the pun) for another blog.

It is 3am and my mind is spinning and making connections. Perhaps spending the day with "thought"-oriented people has something to do with it. Anyway, I can't sleep until I get my latest connection down on paper (or should I say down on disk).

The serverside "Beyond Java" discussion has really got me looking deeply into alternatives. One of the posts on the serverside had a reference to a modern online book about LISP and Agile development. IIn the book LISP is touted as "the programming language to create programming languages" or something like that. The claim expressed is that LISP gives programmers the ability to define their own language and thus solve their own specific problems in an optimal way.

Anyone that has used EJB has come across the situation where your chosen toolset is getting in the way of you solving your problem. For a simple problem it's overkill, and for a really complex problem it just isn't flexible enough. In short, all the pre-defined rules which are supposedly there to "help you" are just getting in the way.

Well, with LISP there are minimal rules. LISP breaks programming down to its basic constituents: lists and symbols. If you think about it, all programs (and programming languages) are made up of lists of things. An expression is a list containing various symbols. Literals are just symbols; so are operators and methods. Expressions themselves can be lists of other expressions. LISP calls the most fundamental symbol an atom, so '1' or 'A' or '+' is an atom.
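
To make the idea that programs are just lists of symbols concrete, here is a minimal sketch (in Java rather than LISP, and purely illustrative): the expression (+ 1 (* 2 3)) modelled as nested lists of atoms, with a toy evaluator that walks the structure.

import java.util.Arrays;
import java.util.List;

// Purely illustrative: models the LISP idea that an expression is just a
// list whose elements are either atoms (numbers, symbols) or further lists.
public class SExpressionSketch {

    // (+ 1 (* 2 3)) as nested lists of atoms
    static final Object EXPR = Arrays.asList("+", 1, Arrays.asList("*", 2, 3));

    // Toy evaluator: an atom evaluates to itself; a list is treated as
    // (operator argument argument) and evaluated recursively.
    static Object eval(Object e) {
        if (!(e instanceof List)) {
            return e; // an atom
        }
        List<?> list = (List<?>) e;
        String op = (String) list.get(0);
        int a = ((Number) eval(list.get(1))).intValue();
        int b = ((Number) eval(list.get(2))).intValue();
        return "+".equals(op) ? a + b : a * b;
    }

    public static void main(String[] args) {
        System.out.println(eval(EXPR)); // prints 7
    }
}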

Right, so the connection. Anyone with a passing knowledge of physics has heard about the quest for the Unified Theory of Everything. Physicists have managed to explain all natural phenomena by relating them to four fundamental forces (gravity, the weak force, the strong force and the electromagnetic force, off the top of my head). So why four forces, why not 40 or 4000?

It is an act of faith in the physics community that there must be a single law out there, a common thread that unifies all four forces - the Unified Theory of Everything. The thought process behind this faith goes like this: if we can identify the common thread, the single underlying principle, then our understanding of how the universe works at a fundamental level would be much improved.

Anyway, physicists have had some success. By colliding particles with each other at ever-increasing energies they have found new fundamental elements of matter. They haven't found the 'atom' yet, but they are learning more about the basic building blocks that make up the universe. This research has led to string theory. Strings are so small that they are virtually impossible to identify empirically, but theoretical physicists have good reason to believe that they exist. So as it currently stands, strings are the common glue that holds together everything in the universe, the 'atoms'. By understanding strings, physicists hope to discover the Theory of Everything. By all accounts they are well on their way.

I'm on a similar quest. I'm looking into the fundamentals of programming. I have a theory. My theory is that at its heart, programming is an intellectual exercise. It is about the ability of human beings to master complexity and chaos through abstraction. As programmers we deal with abstractions all the time. Programming languages are abstractions, libraries are abstractions, objects are abstractions and so on. So what are the fundamental forces? What are the fundamental particles? What are the 'strings' of computer science? I think that LISP may hold the key to these questions. By stripping programming down to its fundamental parts, LISP allows you to explore the cognitive issues involved in identifying optimal abstractions for complex problems. So in a sense LISP is the string theory of computer science - the exploration of the fundamental building blocks of the software universe.

To me it is no surprise that LISP derived languages like Smalltalk are still the crucible for innovation in the Software Industry, despite not being at the top of the hit parade.

Connection made and stored - I guess its back to bed.

Sunday, November 27, 2005

REST Revisited

I never did finish off blogging about REST. Well, where I got to in my understanding is that REST is all about state transfer: moving and changing state between distributed application components in an asynchronous way. To support state transfer REST has a small number of fixed verbs: GET, POST, PUT, DELETE, etc. Along with defining the type of the state in question (MIME types) and the location of that state (URIs), that's all you need for state transfer.
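
As a rough illustration of what 'just state transfer' looks like from the client side, here is a hedged sketch in Java using the standard HttpURLConnection (the resource URL and the XML payload are invented for the example). The point is that the only verbs in play are the fixed HTTP ones; everything else is the representation (MIME type) and the address (URI).

import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Illustrative only: GET the current representation of a resource's state,
// then PUT a new representation back. No application-specific verbs needed.
public class RestStateTransferSketch {
    public static void main(String[] args) throws Exception {
        URL resource = new URL("http://myserver/orders/42");

        // GET: retrieve a representation of the resource's current state.
        HttpURLConnection get = (HttpURLConnection) resource.openConnection();
        get.setRequestMethod("GET");
        InputStream in = get.getInputStream(); // e.g. an XML representation
        in.close();

        // PUT: replace the resource's state with a new representation.
        HttpURLConnection put = (HttpURLConnection) resource.openConnection();
        put.setRequestMethod("PUT");
        put.setDoOutput(true);
        put.setRequestProperty("Content-Type", "text/xml");
        OutputStream out = put.getOutputStream();
        out.write("<order status='shipped'/>".getBytes());
        out.close();
        System.out.println("PUT response: " + put.getResponseCode());
    }
}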

My interest is object-orientated distributed programming. Well, with no new verbs, object-orientated messaging is out of the question with REST (look back at my post about sending the message "deleteStuff" using HTTP GET). So for OO the only RESTful option is object transfer (code on demand).

This is cool for an application that is basically asynchronous in nature. What happens though when you need synchronised transactions, say for Croquet? I posted this question on the REST discussion group back in October and I'm still waiting for a reply:

http://groups.yahoo.com/group/rest-discuss/message/5347

REST is also about constraint, and the REST constraints just don't allow for messaging. Objects contain state, hence RESTful "code-on-demand" is possible, but they also contain behaviour which is selected through messages (verbs). Message sending is not possible with REST. So replicating objects across several clients and coordinating their behaviour using transactions, as required in Croquet, just isn't possible with REST (IMHO).

It would be interesting to hear from anyone who thinks otherwise.

In agile development there is a saying: "The simplest thing that can possibly work, but no simpler". For state transfer of static and semi-static web content, REST is that simplest thing. For interactive and dynamic multi-user transactions, such as in Croquet (or online shopping/auctions?), REST just doesn't cut it IMO. So it turns out that REST (HTTP) is what I've always intuitively felt it to be: a limited protocol for a limited set of applications.

Wednesday, November 16, 2005

Back to the future with Croquet

The Squeak community have been pretty busy of late. It looks like they've re-invented the operating system, distributed computing and the internet all in one go. See the Croquet website.

This stuff looks like something out of a sci-fi movie, and yes, it actually works!

I've been airing my views on programming in a discussion thread on TheServerSide. The thread was started in response to Bruce Tate's new book "Beyond Java". A lot of people are looking for a better way to develop software. There is a lot of ignorance out there, especially when it comes to anything other than C/C++/Java/C#. Seeing what has been achieved in Croquet using 1970s ideas has really got me thinking.

What is it about our society that we only value things that have popular appeal? If something isn't popular then we instantly dismiss it. Very often, whether something is popular or not just comes down to circumstances, marketing and luck. One of the most strongly argued alternatives to Java on TheServerSide was something called MDA (Model Driven Architecture). It turns out that MDA is just CASE reborn, with the same old code generation that failed in the early 90s. Why would anyone want to try CASE again? Yet there still seems to be an appeal. I guess it must be the whiz-bang appeal of flashy graphics.

I'm dead certain that late-bound languages like Smalltalk will now take off on the back of Croquet, but I can't help but wonder where we'd be today if we had spent the last 20 years building on sound engineering principles.

The term "Software Engineering" has gone out of vogue in recent years. Many in the Agile community have turned away from it. I find this a bit of a shame. Engineering is a practical profession, solving real world problems in a practical way. I don't see why engineering should not be viewed as creative. Instead titles like Analyst, Architect and Coach have become popular.

Hopefully Croquet will make an in-depth appreciation of technology and the title "Engineer" popular again. I remember reading BYTE magazine in the 90s. Technology would be put through its paces. It was really interesting. SIGs and ACM publications were available too. Today this type of in-depth analysis just isn't readily available. Instead, technology has become like pop music - you use whatever is currently top of the pops.

We need a new BYTE. Hopefully the achievements of Croquet will make Software Engineering popular again. Developers may start to take more of an interest in how the software technology they use actually works, relying less on vendors and more on their own informed judgement.

It looks like interesting times ahead.

Friday, October 21, 2005

A TDD Episode

I'm at the tail end of a hairy refactor using JFreeChart. My client has assets whose quality they need to manage. The application uses a JFreeChart to show a Colour Coded Quality (CCQ) representation of the asset. The Y axis identifies four quality measures and the X axis is the longitudinal position along the asset. A colour-coded item at each X/Y intersection indicates the asset quality at that location and for that quality measure. (Sorry about the terse description, but I'm trying my best not to give away the identity of my client.)

Anyway... There is an existing implementation, but the data generation, plotting and rendering are all terribly intermingled. The rest of the app has been rewritten to get away from an abstract data model (don't ask, it ran like a dog), but the new charting was still to be done. So we needed the old chart, but we didn't want the old data generation bit.

How to proceed? Well, I started with a standard JFreeChart plot that sort of did what we needed, a ScatterPlot - a GREEN bar. So what to do next? Well, the guts of the dataset code could be clearly identified scattered around the old code, so I copied and pasted that code from the old to the new. A few changes, the creation of some canned data, and... GREEN bar (a running scatter plot using a CCQ dataset). The CCQ chart shows rectangular data points of varying colours (DataItems) depending on the quality. For example, a "compliant" DataItem is green and a "failed" item is red, etc. The old renderer did this, so we copied and pasted the old rendering code - GREEN bar (a running scatter plot with CCQ data and CCQ colour rendering). Along the way we identified lots of small refactors that needed to be done, but we just took a note of them. After all, we didn't have a true green bar as our plot wasn't exactly the same as the original. We did the same process with the "Plot" and... again, with a bit of work, GREEN bar.
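
To give a feel for the moving parts, here is roughly the shape of the thing (a hedged sketch with canned data and a toy colouring rule, not the client's code; it assumes JFreeChart's ChartFactory.createScatterPlot and an overridden getItemPaint on the renderer).

import java.awt.Color;
import java.awt.Paint;
import org.jfree.chart.ChartFactory;
import org.jfree.chart.JFreeChart;
import org.jfree.chart.plot.PlotOrientation;
import org.jfree.chart.renderer.xy.XYLineAndShapeRenderer;
import org.jfree.data.xy.XYSeries;
import org.jfree.data.xy.XYSeriesCollection;

public class QualityChartSketch {

    public static JFreeChart createChart() {
        // Canned data: x = position along the asset, y = quality measure index.
        XYSeries series = new XYSeries("quality");
        series.add(0.0, 1.0);
        series.add(10.0, 2.0);
        XYSeriesCollection dataset = new XYSeriesCollection(series);

        // Start from the stock scatter plot...
        JFreeChart chart = ChartFactory.createScatterPlot(
                "Asset quality", "Position", "Measure",
                dataset, PlotOrientation.VERTICAL, false, false, false);

        // ...then swap in a renderer that colours each item by its quality.
        // Shapes only, no lines; the modulo rule stands in for the real
        // compliant (green) / failed (red) lookup.
        XYLineAndShapeRenderer renderer = new XYLineAndShapeRenderer(false, true) {
            public Paint getItemPaint(int row, int column) {
                return column % 2 == 0 ? Color.GREEN : Color.RED;
            }
        };
        chart.getXYPlot().setRenderer(renderer);
        return chart;
    }
}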

Now we had the three main components needed for a chart: a custom dataset, renderer and plot. Our chart factory was still the original Scatter Plot, but everything else had been migrated to custom CCQ charting code. Yet our plot didn't look like the original. We were missing the labels on the Y axis and couldn't work out why.

After a while of head scratching we decided to plug our new dataset, renderer and plot into the old code. We then refactored the old "factory" until it was all in one place and looked similar to ours. After each small change we would run the code and check that everything was still GREEN. We finally got down to the final difference. The refactored old code didn't set the chart orientation, relying on the default; our new factory did set it. So we decided to add this line of code to the old code and BANG, RED bar. It turns out that all we needed to do was remove this line from our new code - still not sure why, though; possibly a bug in the JFreeChart library.

So, time for a mini retrospective. What were the lessons learned?

  • You don't need JUnit to do TDD. Often if the code compiles and runs then that is a good indicator of a GREEN bar. You may only have partial functionality, but if your code does what you expect it to do, then that's a GREEN bar.
  • Refactoring to patterns really works. It allowed us to make some real big refactors in one step. We identified our target design by looking at the JFreeChart ScatterPlot code.
  • When refactoring on the grand scale, getting the overall structure right and getting back to a green bar as soon as possible is the main goal. Small local refactors can happen later. By keeping a list of refactors you are able to choose the order in which to do them.
  • always "listen" to the code. Code smells will tell you what to refactor. Commenting out declarations and seeing the resultant errors in your IDE also tells you a lot. A refactor with a lot of dependencies isn't a good candidate to do first, because it could leave you on a RED bar for a long time. Do something easier first, and get back to GREEN .
  • If you find yourself on a RED bar for a long time, then undo the change, get back to GREEN, and make a series of smaller changes. This is how we identified the orientation bug. If an ambitious refactor is failing, you always have the option to go back and take smaller steps.
  • Small refactors are better. In hindsight, after identifying our target design pattern, we should have refactored the old code to the new design in small steps, rather than copy and paste. We would probably have maintained a better rhythm and learned more about the old code, whilst spending shorter intervals on RED.

Thursday, October 20, 2005

How ReSTful are you?

I've joined the REST discussion group on Yahoo Groups. And a very knowledgeable and friendly bunch they are too.

After following the group discussions I'm beginning to realise that there is a bit more to this REST stuff than I first thought.

For example, one would assume that Hessian web services are pretty ReSTful (they obey the REST constraints). After all, what could be wrong with:

http://myserver/myServiceInterface?method=doSomething&param1=value1


Where 'myServiceInterface' is a remote web service and 'doSomething' is the method on the remote interface you wish to invoke.

Well the uniform 'HTTP' connector has its own generic commands: GET, PUT, DELETE, POST etc. So what happens when your client sends:

http://myserver/myServiceInterface?method=deleteStuff


Using a GET?
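
My take (and the reason I keep coming back to this example): HTTP defines GET as a safe operation with no side effects, so tunnelling a destructive "deleteStuff" call through it breaks the uniform interface even though the request looks perfectly legal on the wire. A more RESTful shape, sketched below with an invented URL, would expose the stuff as a resource and use the DELETE verb directly.

import java.net.HttpURLConnection;
import java.net.URL;

// Illustrative only: address the thing to be removed as a resource and use
// the uniform DELETE verb, rather than hiding "method=deleteStuff" behind
// a supposedly side-effect-free GET.
public class RestfulDeleteSketch {
    public static void main(String[] args) throws Exception {
        URL stuff = new URL("http://myserver/stuff/123");
        HttpURLConnection con = (HttpURLConnection) stuff.openConnection();
        con.setRequestMethod("DELETE");
        System.out.println("DELETE response: " + con.getResponseCode());
    }
}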

Sunday, October 16, 2005

REST: A not so Uniform Connector Example

Still buzzing with the possibilities of REST. One of the things I don't like about HTTP is that it has a tendency to reduce everything down to the lowest common denominator: strings. So all your parameters are strings and often your data is one long string in the form of XML. This is a result of the REST constraint of a uniform connector across the entire web. But the world isn't uniform; it contains different realms, each with its own unique characteristics. Isn't it sufficient that your connector is uniform within its given realm?

For example, say I want access to a resource on a remote server running Squeak. Why not create a Squeak-specific connector by subclassing:

HttpConnector subclass: #SqueakConnector
"A connector that understands the Squeak object representation"

formUrl: squeakServer object: anObject message: aMessage params: parameters
"Answers a URL object"

GET: url
"Answers the Squeak object at url"

A Squeak client would look like:

con := SqueakConnector new.
url := con formUrl: #myServer object: #UserDB message: #getUser params: (Dictionary new at: #user put: 'fred'; yourself).
fred := con GET: url.

The actual URL passed to "HttpConnector>>GET:" would be:

"http://SqueakUniverse/myServer/UserDB/getUser?user=fred"

As long as all servers in the "SqueakUniverse" agree on their object representation and use an agreed URL format then this should work. And my client code doesn't need to worry about dealing with the actual object representation on the wire.

I'm sure this is still RESTful, honouring the concept of a uniform connector, but constraining its scope to a specified namespace. I need to see if this type of stuff is already being done.

Saturday, October 15, 2005

REST and OO Abstraction

I'm still getting over just how simple REST is. In my first post on REST I was wondering where a protocol like IIOP fits in. Mark's comment kindly explained that, as far as REST is concerned, it doesn't. So I've been thinking: how does abstraction fit into REST? In REST a datum is data that is exposed by a component, which makes sense, since data that you want to communicate to another component cannot be hidden. So the datums exposed by my object are resources that I can look up:

// client side lookup
datum1 = connector.GET(
"protocol://myServer/myServiceObject/Datum1");

OK, this is pretty fine-grained; what if I want to get a lot of data at once? Well, I can think of two options. Option 1: expose all of the data in my object and send it through the connector to my client. This is what IIOP does through marshalling. What IIOP goes on to do is un-marshal my data back into an object at the client end, which breaks the REST constraints on a connector.

So without unmarshalling we have:

// client side lookup for a data structure
structuredDatumXML =
    connector.GET("protocol://myServer/myServiceObject/ObjectDataAsXML");

// data structure access
structuredDatum = new XMLObject(structuredDatumXML);
datum1 = structuredDatum.getElement("datum1").getValue();

But we've broken encapsulation, so the messaging is no longer object-orientated. Going back to IIOP, my structuredDatum would be a data transfer object (DTO), with a set of getters, one for each data element. But a DTO is really just a data structure also, so IIOP doesn't help. So how can we do this in REST and not break encapsulation? Option 2: how about mobile objects:

// client side lookup for a real Object
objectDatumSerialised =
    connector.GET("protocol://myServer/myServiceObject/Serialise");

// client side use of object
objectDatum = ObjectLoader.load(objectDatumSerialised);
datum1 = objectDatum.getDatum1();

Treating an object as a resource should be possible with VM based languages.

So after my fears about abstraction and encapsulation, it turns out that the REST constraints do not preclude abstraction at all. I'm new to this REST stuff, so the above could be very wrong and all comments are welcome. In particular I'm interested in finding out more about "self-describing data", as it sounds like a mechanism that will help abstraction.

So it looks like I don't need to lose object-orientated abstraction to use REST. Until someone explains otherwise.

Friday, October 14, 2005

REST is so Simple

In my last post I was struggling to understand REST. After looking at this example it all made sense. Almost too simple to be true. After reading the dissertation, I was expecting REST to be a lot more complex than it is. In practice the REST style (a set of architectural constraints) is simplicity itself.

If you intend to read the dissertation, make sure to read chapter 1 before reading chapter 5. Chapter 1 gives a precise definition of terms, which you'll need. Or if you're like me, just jump straight to the example. Don't be put off by its brevity; connectors, components and datums as defined by REST are all there.

Will REST win out over SOAP? Well I for one won't be using SOAP/WSDL/UDDI etc. Why would anyone want to use that stuff given the choice?

Thursday, October 13, 2005

REST to the Rescue?

I've heard a lot of good things about REST (Representational State Transfer). Apparently REST represents a distributed computing architectural pattern that brings simplicity and consistency to the web. Well, at least that is what I have been led to believe. Anyway, I've finally found a description of REST that I can understand.

After reading through the description (and it is quite lengthy) I was somewhat disappointed. I'm not sure why, but somehow I was expecting more. I come from a telecoms background and I am pretty familiar with network protocols and communications. I remember reading the GSM protocols for the first time; I was immediately impressed. GSM is a very complex system with a number of orthogonal protocols that work together to address the various issues associated with a single application: mobile voice and data services. After reading the REST description, I was left thinking: what is this architecture meant to do? What problem is it meant to solve?

This seems to be the key issue. With the exception of hyperlinks, there doesn't appear to be a generic web application; instead there are a number of disparate applications that happen to share the HTTP protocol (out of convenience) and URIs as a common addressing mechanism. REST somehow tries to come up with an architectural pattern that addresses all these disparate needs, and in the end becomes so unspecific it is almost meaningless (IMHO).

The only benefit I can see is that it dissuades extensions to the web that are not consistent with good network communications practice. It appears to be an attempt to limit some of the network protocol abuses that have characterised the web from its inception.

What is missing are specific protocols for specific applications. For example, I once hoped that IIOP would become the established protocol for client/server applications over the web. In my opinion HTTP is just inappropriate for such applications, and if the W3C promoted an alternative I'm sure firewalls would be opened up (in the same way that FTP is accepted as the protocol for file transfer, telnet for terminals, POP3/SMTP for mail, etc.).

The article talks about mobile code, but again nothing specific. I may be missing the point here, in which case can someone please enlighten me. I was hoping that REST would be the antidote to SOAP, UDDI, SOA, etc. Please tell me that my hope isn't misplaced.

Wednesday, October 12, 2005

The Backlash Against Open Source has Started

I've recently been looking into Open Source Software. Whilst searching for blog entries on the subject, I came across this post.

It's worth a read. The basic gist of the argument made against open source is that open source developers are being exploited by unscrupulous Software Vendors. To stop 'exploitation' the author suggests that the government should step in and 'regulate'.

It may come as no surprise that the author is a Microsoft supporter.

If anyone out there comes across other interesting blogs/articles on Open Source please let me know.

Monday, October 03, 2005

Is .NET (Mono) The Open Source Platform of the Future?

Been looking into Mono, the open source .NET clone. In an attempt to win people over from Java, it appears that Microsoft has gone to some lengths to ensure that .NET is perceived as open.

In fact, my understanding is that the .NET VM (Common Language Runtime, or CLR in Microsoft-speak) specification has been submitted to ECMA, an international standards body.

I guess that the people at Microsoft thought that as long as they could control the .NET implementation, then .NET would be proprietary. But with an open standard anyone can create an implementation.

The Mono implementation supports Windows, Linux and OS X on the Mac. In addition, Mono has ported a broad set of languages to the .NET platform, including Ruby.NET, #Smalltalk and even Java.

I first thought that this common language VM thing was just a marketing ploy. But if you think about it, it is a great idea. What normally locks developers into a given platform is the supporting tools. But with a VM your code will run on any platform that the VM has been ported to. Thanks to Mono, for .NET that means all major platforms. The second barrier to developers changing technology is library support. If the only good user interface library for Windows is written in C#, then I guess you're stuck using C#, but with .NET/Mono you could choose to program in, say, Ruby.NET and still use your C# libraries.

So you program in Java, but there is a great library in Gtk#? No problem: just compile your Gtk# libraries into a Java jar (Java stubs) and link at run-time.

I'm not sure just how good Mono is, but I do know that it is backed by Novell, and with time it can only get better. From the look of things on the Mono website, it is more than ready for production use. The other concern of mine is that I'm not sure just how well the CLR supports languages other than C#. In particular I'm not sure how much support is present for dynamic languages like Smalltalk and Ruby. If the CLR byte code does support both dynamic and static languages fully or is extended to do so, then .NET has got to be the ideal open source platform.

Think about it. You decide that Ruby is more productive than C#? Well, just use Ruby.NET. You've got loads of Java libraries you want to use? Well, if you've got the source, just re-compile for .NET, or if not, dynamically translate the jars to .NET at runtime.

I don't think the Java/J2EE world has an answer to this kind of flexibility. The EJB monolithic container model has failed, and the lightweight, mix-and-match, services-a-la-carte model is available on .NET as Spring.NET and Hibernate.NET.

I'm keeping an eye on Mono. If it delivers, we as developers will be free to use whatever platform, language or libraries we choose. And with .NET locked into the ECMA standards process, I don't think Microsoft can do anything about it.

Microsoft setting the world free. Isn't that a thought!

Open Source - Who's doing it and why?

In my last post I discussed the effects of open source software on the Java community. In my opinion, Java was a reaction to a perception by many in the software industry that, unless something was done, Microsoft would end up owning both the desktop and the server. If this were to happen, then in time Microsoft would be sure to use its dominant position to shut out other server software vendors. After all, this is exactly what they did with Lotus, WordPerfect and others on the desktop.

As a consequence, Sun, Oracle, IBM et al. got together and agreed that the Java VM would be "the common means of passage" on the server. Cross-platform, machine-independent, OS-neutral and with open APIs, Java would fend off Microsoft's attempts to own the server.

Interestingly, Java has become a runaway train, no longer under the control of the vendors. The developer community have taken the open Java APIs and crafted tools and APIs of their own, all open source. So who owns Java now? Well, it appears to me that the open source community are in the driving seat.

So who are the open source community? Well, things have changed a bit from the days of Richard Stallman. It appears that many open source projects have big backers. The Mono project, a .NET clone, is backed by Novell, and IBM backs Eclipse (the leading Java IDE). Linux has become an open source phenomenon in its own right, with backers such as Sun, Oracle and most of the big players. The only large player not somehow involved in open source is Microsoft.

Open source has become a viable business model for many. Look at Skype which was recently sold for billions. A less dramatic open-source business model is to offer consulting and bespoke variants on the back of an open source product. It seems that different people are involved in open source for different reasons.

I have always thought that their should be a "common means of interaction" in software, much like language or law in society. This common base should be freely available to everyone and owned by no one. This seems to be exactly what we're now getting in Software.

But why are the vendors backing it? One argument is that producing software open source is cheaper. True, but you can't sell it. I've read an article that suggests that open source is being used as a means of wresting market share from the incumbent monopoly (Microsoft). It is difficult even for a monopoly to compete with something that is free. Once the free software becomes the de facto standard (70-80% market share), a new market is available where the backers of the open source product can sell consultancy services and add-ons. I believe this is already happening with Linux.

For example, I wonder how many Oracle licenses have been sold under Linux. Now, without the success of Linux, what percentage of those licenses would have gone to SQL Server under Windows? Most, right?

If this strategy is true, then it sounds like a dangerous game for the vendors, as they are sure to lose control in the same way that they have lost control of Java. And in a world where customers feel empowered to pick and choose their own software without vendor endorsement, innovation and new ideas could quickly leave the vendors flat-footed. Look at the emergence of Ruby on Rails.

Anyway I'm going to get myself a good book on open source. I think it's time I took the open source movement a lot more seriously.

Saturday, October 01, 2005

Java for Sale

I was an early adopter of Java. I was easily persuaded by the arguments in favor. A few years before Java came out, I had dabbled with Smalltalk, so I knew the advantages of a VM. Besides things couldn't go on the way they were with C++ and memory leaks.

Back then Java still had many detractors. As with Smalltalk, performance was the main criticism. Yet I knew that microprocessors would get faster over time, and Java was fast enough for most applications even back then (desktop GUIs were sluggish though). I knew Java wasn't as elegant as Smalltalk and the use of a VM wasn't anything new, but Java was type-safe, C-like in syntax, and reasonably performant. For the majority of C++ developers trying to produce server-side software, I could see that Java and J2EE would be a great leap forward.

It took me about two years to convince my company to use Java. By this time Java had gained considerable momentum in the industry. J2EE application server implementations were beginning to emerge, and consultancies like Valtech were championing Java as the technology of the future. In fact quite a few big-name companies had thrown their weight behind Java. By this time Sun had repositioned themselves as a software company and were practically betting the business on Java. Oracle had jumped on the bandwagon, and as well as creating their own J2EE App. Server, had written Java into the core of their database product. Initially IBM were on the fence, backing both Smalltalk and Java as the OO champions that would steal developer mind-share from Microsoft. Over time, though, IBM reduced support for Smalltalk and became massive backers of Java (I think even bigger backers than Sun).

I could see the power behind the Java promise of "Write once, run anywhere" especially in light of the threat from Microsoft. Without Java, we could all end up Microsoft developers, with anti-Microsoft vendors forced out. Microsoft had shown their intent for world domination by refusing to "play fair" within the OMG. It was all out warfare.

The anti-Microsoft camp was throwing Millions into Java, downloadable free over the internet. The thought crossed my mind more than once, "How did these companies expect to make money?" If Java software was available free, then developers would try it? Yeah OK. So if developers were using Java then companies would need to support it? True, but what could you sell to companies?

The first commercial Java tool that I used was WebLogic (no, not BEA WebLogic, just WebLogic as it was back then). This was a relatively lightweight App. Server written totally in Java. Easy to use, and configured through a single property file. It worked a treat. A license of £2K per developer and a run-time license of £11K per server seemed a bit steep, but was reasonable given the gains in developer productivity.

But none of this money was going to Sun. How was Sun going to make money out of Java? I'm not sure if the people at Sun knew, but if world domination was the name of the game, it would be better if the glue pulling everything together on the server was the Java VM rather than Microsoft Windows OS.

The first big non-commercial Java tool I used was the Apache web server with built-in J2EE servlet support in the form of Tomcat. The Tomcat servlet engine performed much of the server-side resource management that full-blown App. Servers did, and it wasn't long before some began to recognise that for most applications the full J2EE stack (Servlets, Stateless/Stateful/Entity Beans, JMS, JNDI, JTA, etc.) just wasn't needed. Besides, if you really did need a full J2EE App. Server you could download one for free: JBoss.

By this time the marketing machine behind Java was in full swing. The App. Server market had matured with IBM's WebSphere and BEA WebLogic as the market leaders (Sun missing out again). These new App. Servers weren't simple and clean like the original WebLogic. Instead, they were cumbersome and clunky. Awkward web-based App. Server administration was in vogue, and complex XML application configuration could take weeks. Feature bloat had set in, and they were sold as all things to all men. The complexity of these App. Servers was a real problem. Class loader hell was a common experience for many J2EE developers, and gaining mastery over your App. Server took time and dedication.

Microsoft had woken up to the marketing power of the application server. Middleware was the new industry buzz. After being forced to give up on "Microsoft Java", Microsoft produced their own "clean-room" Java clone, C#. They then went on to market their new middleware vision. In addition to "Write once, run anywhere (Microsoft chooses)" they added "Write in any language you like (if we choose to support it)" and "Expose your business services over the web (to anyone who uses our protocols)". The .NET era had arrived; it would take about three years before working .NET code would ship, but in marketing terms .NET was real and the industry was holding its breath.

Sun had put in place the JCP process as a "community"-led way to create and improve Java standards. The problem was that the JCP was a community of Java vendors, not Java users. Sun has always flirted with the open-source community as a way of gaining credibility with developers. The fact that you could download the Java JDK source code for free made Java appear almost open-source, yet there was a growing group of vendors who had bet a lot of money on Java and were looking for a return on investment.

Every .NET announcement would result in a corresponding JSR being raised to provide the same functionality in Java. The vendors were playing the "my App. Server has more features than yours" game with Microsoft and each other. Whilst this was going on, the developer community were realising that much of the stuff coming out of the JSRs didn't actually work that well (EJB 2.1 Entity Beans comes to mind), and they were busy producing better open-source solutions themselves. Eventually the gap between what worked and what the JSRs had specified became untenable.

Spring and Hibernate showed what could be done when you actually tried to solve problems rather than sell product. The cat was out of the bag, and all the JCP could do was try to standardise what the developer community had already chosen for themselves. I would argue that this is exactly what the JCP should have been doing all along.

They say you can't serve two masters. The JCP process can't serve the interests of both vendors and developers at the same time. The developer community have taken matters into their own hands and are busy shaping their own future. Microsoft has gone very quiet with .NET recently. The open-source community has targeted .NET too, with Mono. The Java vendor community are still trying to pretend that it is business as usual with JSF (JSR 127) and EJB3 (JSR 220). The JSF technology is practically obsolete before the first decent vendor tool has shipped. Everyone knows that the future of web development lies with continuations and Ajax, both missing from JSF, but available open-source (Squeak Smalltalk and Seaside). EJB3 functionality is available open-source too, as Spring and Hibernate. Why wait to pay for a buggy Spring/Hibernate implementation from BEA?

So my take is that Ruby will carry on picking up momentum on the back of Rails, and, who knows, even Smalltalk may see a resurgence in popularity with Seaside (I hope). As for Java, well, it's still for sale, but I think there will be fewer takers in the future.

Sunday, September 25, 2005

The Economy: China to the rescue?

Just watched Panorama (a BBC documentary) on Gordon Brown's 'Economic Miracle'. Apparently what has sustained the longest period of economic growth in recent history has been consumer and public spending. Even with spending, inflation has remained low. Low inflation has been a consequence of cheap imports from the Far East and China. This has allowed us to avoid the boom-and-bust economic cycle of the past (high spending -> high inflation -> high interest rates -> economic slowdown).

The growth of imports has been at the cost of local manufacturing. So we no longer build stuff, but stay afloat by selling things to each other that were built cheaply elsewhere. And of course there's the growing 'service sector', whatever that is (retail parks, DIY superstores, etc.).

I'm no economist, but I know that most people feel pretty insecure at work. Most people are working harder for less. Job stability is a thing of the past, and most people have come to accept short term contracts as part of life. The only thing that seems to produce 'a feel good factor' is house prices - and who knows when that will suddenly come to an end.

The scariest thing that came out of the programme for me was that once UK manufacturers have moved their production to China, the benefits to the UK economy (lower retail prices) will be realised only once. Once things have settled down, low prices and consumer spending can no longer be relied upon to sustain future growth. Worse still, without a manufacturing base of our own, we will become sensitive to price inflation in China. Should the Chinese decide to pay themselves more, then that will be reflected in higher prices in the UK.

So after years of protectionist EU trade policies on food, we will suddenly find ourselves dependent on others when it comes to manufactured goods.

What will become of the average man on the street? After all, we can't all work in retail parks, and I don't think we're going to lose our appetite for consumer goods any time soon. Maybe the French have the right idea, and it is time to start putting up the barricades!

It really does look like unstable times ahead. Economic power in China, military power in the US, the Europeans in the middle. Friction between China and the US seems inevitable to me. Somehow I feel that the Americans, with their vast resources and dynamism, will be able to meet the Chinese challenge. I fear that China's gains will be at the expense of the Europeans, including the UK.

I don't fancy learning Mandarin, so it looks like I need to get myself a Green Card quick!

Tuesday, September 20, 2005

Software Vision - Creating the Backlog

On the back of my last post, I've come up with an idea. My criteria for envisioning new systems:


  • Each release should take no longer than 3 months and deliver usable value
  • The complete vision may take a number of releases to realise
  • The system/product definition (backlog) should be owned by customer/marketing
  • Customer/marketing should decide how much 'traction' is needed between releases
The last point may need some explaining. By traction, I mean the degree of organisational change required to support a release. In circumstances where organisational change is difficult, it may make sense to use a 'low gear', introducing small organisational changes between each release, providing more 'traction' and increasing the possibility that the organisational change will stick. Conversely, if the going is good, then larger changes may be possible, allowing the end vision to be realised sooner.

The point here is that software vision is a skill, and the people responsible need training. So my bright idea is training for customers/marketing people on how to create a product backlog.

Agile development starts with the backlog. But what happens before that? Here is my suggestion:


  • A new product idea (instigated by anyone)
  • If the idea meets some minimal criteria, it enters the project funnel
  • The organisation uses some criteria to decide which projects to fund and in what order
  • Funded projects acquire a 'customer team'
  • Customer team go through some training on creating 'product backlog'
  • Customer team create first pass backlog
  • Development team work with customer team to refine and realise backlog
I'm going to give some thought to the training and assistance 'product' teams need to envision a good product backlog (product definition). I will use the term 'product team' instead of 'customer team' as it is consistent with 'product backlog' and Scrum.

Hopefully I can come up with some good guidelines.

Anyway watch this space.

Monday, September 19, 2005

Software Vision

Alan Cooper and Kent Beck debate XP vs Interaction Design. Alan Cooper's book The Inmates Are Running the Asylum describes interaction design.

Not sure what 'Interaction Design' is, but from the article it appears to be some type of business analysis that occurs prior to software development. The assumption is that users need 'help' in deciding what it is they want built. Rather than just automating what is, we should be designing something better. Sounds similar to the Business Process Re-engineering ideas of the late Nineties.

In my post on software as art, I point out that there isn't much guidance available on how best to envision new software systems. I'm not sure whether having a middle man between customers and developers is the right idea though.

It is all well and good if your interaction designers are good at what they do and add value, but how do you test this? Also, how do you keep the customer accountable for how he chooses to spend his money?

All sounds a bit naive to me.

Envisioning should be a customer/marketing responsibility, with developer input on feasibility and cost.

I'm with Kent Beck on this one.

Tuesday, August 23, 2005

The Verdict - NLP is Mumbo Jumbo

Past experience has taught me humility. Hence my criticism of NLP has been somewhat guarded.

Having looked up NLP on Wikipedia, I am pretty convinced that NLP is pseudoscience. If the lives of the inventors of NLP are anything to go by, then NLP is definitely dubious to say the least. On Wikipedia there are links to several critiques, of which this one is typical.

What does NLP say about our society? Why is everyone looking for a quick route to "success"?

I'm of Jamaican descent. In Jamaica, especially in rural areas, old West African values still hold. People are a lot more relaxed about life, a lot more social, and a lot happier. In the town (Kingston), however, European values tend to dominate. On a small island, the two cultures sit uneasily together.

When visiting Jamaica, it is clear to me what we in the west have lost. It is also clear how global economics is forcing much of the world down the same path.

People need to ask who is benefiting from this trend. Certainly not the majority of the world's population. Where will it end? One billion Chinese are about to join the rat race in earnest - can the planet support this?

Frankly I find it worrying.

Materialism, Spirituality and NLP

Been reading more about NLP. From the moment I was introduced to NLP, I've been a bit uncomfortable with it. In my last post on NLP, I questioned whether NLP could help if your desired outcomes were themselves spiritually unfulfilling. Well, the answer is sort of, at least from what I've read. In choosing outcomes, NLP suggests that you identify outcomes that 'suit you', outcomes that are congruent with all aspects of you. NLP goes on to say that change should be supported by your subconscious.

NLP also describes a concept known as modeling. This is where the behaviour of "successful people" is analysed and copied. In this way NLP believes that success can be taught.

Unfortunately in my reading thus far NLP has not defined what it means by success. From what I've read, the implication is material success.

So if material gain is the yardstick for success, where does the soul fit in? It seems to me that this focus on materialism turns spirituality on its head. Rather than seeking contentment, peace and happiness from within, NLP seems to be promoting the search for happiness through external things.

I think this is a sign of our modern times. Production and consumption are our new God. As human beings we seem to have lost the connection with our own souls, with nature and with God.

NLP seems to be trying to harness the resources of the soul in the service of material gain. If my interpretation is correct, then we in the west are truly lost. Here is a quote: "... In this spiritual confusion, many cults, sects and - issuing also emerge. Both religion and philosophy become materialistic and politicized...". This is taken from Brahma Kumaris.

Life is speeding up, and is increasingly materialistic. In the west, we are less communal, more individualistic and increasingly isolated as individuals. In this all-consuming rush for wealth, our humanity itself seems to be the victim.

Friday, August 19, 2005

Creativity, Simplicity and Art

I've been thinking some more about 'software as a creative process'. Good software can be a thing of beauty. Simple idioms and patterns repeated to produce something that is complex, yet simple. Much like the double helix structure of DNA or fractal patterns in maths.

Beauty in software has two main manifestations. Firstly, in its conceptualisation. Knowing what problem to solve, and what the solution should be, is a creative process. One instantly recognises a good solution to a problem. A good solution is often simple, and a good fit for its intended purpose. When going through open source projects on SourceForge, good project ideas immediately jump out at you: a project that meets a real need, and is simple.

The second manifestation, I think, is in the structure of the solution itself. Object-oriented languages offer great scope for efficient, elegant solutions, where functionality is implemented once and once only. Dynamic OO languages like Smalltalk offer even further scope for elegance and simplicity. A dynamic language can greatly improve productivity, allowing time for trying things out. Learning through discovery leads to increased beauty.

Unfortunately, the beauty of a software concept, although visible to the business, is often overlooked. Very few business leaders spend time agonising over whether a proposed software concept is beautiful; software just isn't thought of that way. More important to most business leaders is the projected return on investment. I've never calculated ROI, but I'm sure that it is a difficult thing to predict. I would also hazard a guess that the projects that actually do provide a good ROI are both beautiful and simple in concept.
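
For what it's worth, the textbook sum is simple enough: ROI is roughly (benefit - cost) / cost, so a project costing £200k that delivers £300k of benefit returns 50%. The hard part is that the benefit figure is a guess made long before any code exists.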

Worse still, the beauty of the final implementation is not visible to the business at all. The first the business becomes aware that a software solution may be less than beautiful is when bugs begin to emerge, or when maintenance is more difficult and expensive than anticipated. The developers know when a software solution is ugly, but unfortunately the business holds the purse strings and makes the final decision.

Implementing beautiful software is a creative skill that can be taught, much like painting or playing a musical instrument. The art of software development is well established, and there is a whole gamut of practices and disciplines for developers to draw upon. Agile development recognises the creative nature of software development, and promotes practices that support creativity. In contrast, software conceptualisation seems to be less well understood. The nearest I've seen to a disciplined approach to deciding what software to build is the approach outlined by SCRUM, namely keep it small and keep it simple. Beyond this there doesn't seem to be much guidance around, unfortunately.

So where is the software industry today?

  • Well, we've got managers who would like to think of software development as a defined process: Gantt charts and LOC estimates, leaving little scope for discovery and creativity.
  • We've got developers who tend to be more interested in technology than in delivering business value.
  • Where beauty is visible, in conceptualisation, the art is not well understood.
  • Where the art is well developed, in software implementation, the resultant beauty is not visible.
  • The business cares little for conceptual beauty, as it does not appreciate its importance. Developers can avoid discussing implementation beauty honestly with business people, as the implementation is not visible.
It would be interesting to see what business leaders from creative industries like fashion and music would make of the software industry. I'm sure they could teach us a lot.

Sunday, August 14, 2005

Coaching and NLP

I've not blogged about my agile coaching role for a while. To be honest, I'm a bit disheartened with coaching. Coaching as I understand it places the emphasis on the person being coached to create change. Whilst on the surface this makes sense, it does assume that the person being coached has positive goals and is motivated to achieve them. If this is true, then the coach is a mere assistant showing the way.

Whilst many sportsmen and women do have elevated ideals, I don't think the same can be said for the workplace. My experience of the workplace is one of extreme politics. By politics, I mean numerous agendas all vying for supremacy. In my experience, very few of these agendas can be described as noble, aimed at making the world a better place. In my opinion, these competing agendas are responsible for making organisations sub-optimal, and less than ideal places to work and thrive.

Some organisations do manage to maintain lofty ideals, with everyone working towards a common goal. Organisations such as universities, for instance, where an ethos of equality and openness is well established. In such organisations, motives tend to be somewhat different than in the commercial workplace. The pursuit of personal financial security (money) is largely replaced by the desire to gain the respect of your peers.

In the workplace the need to be held in high regard by others can exist too. The success of Japanese companies like Toyota, for instance, can largely be put down to the well-established clan system that has existed in Japan for many centuries. By fostering clan-like loyalty, Japanese companies have managed to achieve a high degree of optimisation.

In a last-ditch attempt to see if coaching can really address the problems in the modern work environment, I've started looking into NLP (Neuro Linguistic Programming). NLP is a psychological science touted by many in the coaching profession as the theoretical underpinning for what they do.

NLP is an interesting mix of eastern leaning philosophy, spiritual awareness, and modern western psychology. NLP proclaims that we can better achieve our desired outcomes, by re-programming the way we think, speak and behave, and in so doing become successful. In NLP, the role of the coach is to help people 're-program' so that they can succeed.

As someone who is well versed in eastern Buddhist philosophy, and who believes in the power of meditation (directed thought), a potential flaw in NLP jumped out at me straight away. NLP starts with the desired outcomes of the practitioner, but very often it is these desires that are at the root of the problem.

To explain this last statement fully, would take more time than is warranted here. But in short, it is what we desire (our intended outcome), that is often what leads to our unhappiness in the first place. For example, many of those who desire 'financial security' still find themselves feeling 'insecure' and desperately unhappy, even when they have acquired more than adequate wealth.

People caught like this do not fully understand their own motives. They are consumed by their surface desires, unable to see what lies underneath. Often their desires stem from uncomfortable human emotions, such as fear, jealousy, low self-esteem etc. Only when free of the need for a quick fix are they able to see whatever it is that will make them truly happy. I describe this type of awareness as wisdom.

This is why I feel that what is needed in the workplace is leaders. People that can lead by example, act as role models and inspire those around them to elevate their thinking and gain wisdom. Such people don't merely facilitate, but set the direction and promote values that others can follow.

In my coaching, I make a conscious effort to lead. This can manifest itself in several ways. One important way is establishing congruence between my actions and my words. By this I mean 'walking the talk', practising what I preach. Through this I manage to gain trust, a commodity that is in short supply in the workplace. Next, I try to limit my tendency to judge others and to become over-critical. This is something that I really struggle with, but when I do achieve it, I find that it increases my ability to influence others. People tend to be more open to you when they feel safe from personal attack. Finally, when necessary, I'm brutally honest, and unequivocal in saying what I feel needs to be said. After a while, when people feel safe and that they can trust me, they become open to taking my lead. I find that when I talk, most people are willing to listen, especially when I take the time to listen to them.

Despite my personal achievements in persuasion and leadership, I find myself in scenarios where a positive course of action is being blocked by someone with higher authority. In most cases this person has sanctioned my role, but has chosen not to consult me on a decision that affects my ability to do my job. They are happy to pass on responsibility whilst reserving control for themselves, leaving me powerless.

This, in my opinion, is a failure in leadership, most often triggered by fear. For such people, what is missing is the ability to lead themselves - to conquer their fears and self-doubts. Leaders aren't born, in my opinion, but made. The concept of leadership is well established, and many cultures have established ways of ensuring that they foster good leaders. With a good leader, a coach becomes a useful tool, helping the leader and their team(s) become the best they can be. The sportsmen and women who use coaches to achieve their goals are a good example of this. A sportsperson who does not pursue excellence, or who is not motivated to do the hard work needed, will not be helped by a coach.

So whilst the role of coach is significant, what I feel the workplace really needs is good leaders. Not being a captain of industry myself, I have limited my ambitions to leading my own life. NLP may help me get to where I'm heading faster, but if I'm heading in the wrong direction does NLP help?

I would love to hear from someone with real experience of NLP.

Wednesday, August 03, 2005

NASA finally sees the light

It looks like common sense has finally prevailed at NASA. You have to give them credit:

Space shuttle replacement

Meanwhile the Russians are having a quiet chuckle, whilst planning their own future:

Russia in Space

In my last entry, I couldn't remember the name the Russians gave their rockets. Well it is Soyuz.

BTW the Russian Soyuz rockets date back to the 1960s, and hence have over 40 years of lineage. Apparently, the Russians have made monumental leaps in rocket technology, eclipsing the early achievements of the German pioneers. So perhaps it was a bit unfair of me to imply that their current Soyuz rockets are based on German WWII designs - sorry.

Tuesday, August 02, 2005

The space shuttle in trouble again

Not wanting to annoy any Americans out there, but the latest episode in the Space Shuttle saga is evidence, if we needed it, that complexity should be avoided.

Space travel, like software development, is inherently risky. So why compound that risk by devising a space vehicle miles more complex than it needs to be?

The Russians seem able to put people into space a lot more reliably and at a fraction of the cost. So what is their secret? Well, a simple rocket design, inherited from the Germans after World War II, is still used by the Russians today.

This simple design has its advantages: the dangerous fuel tank, which makes up the bulk of the vehicle, is behind the capsule where the astronauts reside. The cockpit capsule can be ejected from the rest of the vehicle, protecting lives in a catastrophe. In contrast, the Space Shuttle design has the astronauts sitting on top of a massive fuel tank and adjacent to two solid rocket boosters, with no escape route.

From a safety viewpoint, the Space Shuttle design is ludicrous. In terms of complexity, the Shuttle is significantly more complex than the rockets the Russians use so successfully.

So why? Oh yes, the Space Shuttle comes back to earth - that has got to be an advantage. Well, no: the Russians manage to travel to space at a fraction of the cost, even though they have no reusable parts.

So what is the real reason? Well, if I were to hazard a guess, I would say ego. The same reason why so much software is much more complicated than it needs to be. Having a space vehicle that looks like it came out of a "Buck Rogers" movie is more flattering to the ego than a plain old rocket as depicted in a Bugs Bunny cartoon.

I'm sure that national ego will keep the shuttle program going, when all involved must know that the basic design is fundamentally flawed. I've seen this type of "group think" before, often in companies. Compound a bad decision by ignoring it, glossing over the facts, and pouring good money after bad. After all, we don't want to admit that we got it wrong, do we?

I've been learning Ruby lately, and I've taken a look at Rails. I find Ruby an elegant and productive language (when compared to Java), and for most web apps I've built, I'm sure that Rails would have done the job in a fraction of the time (and cost).

So why are people still building web applications using EJBs, JNDI, XML, JSP, CSS, JavaScript, etc, etc? For some, mastering this soup of TLAs is an end in itself. Being able to do this stuff just makes them feel good. It plays to their ego.

As for me, I get my kicks by knowing that I've produced something useful. Something that will make someone's life easier, better, less stressed etc. So with all this press about the Space Shuttle, I like to think of the Russians - brushing the dust off their 1945 designs, and knocking rockets together with bits of old metal. It's not glamorous, but they are still launching multi-million dollar satellites into space.

Monday, July 25, 2005

Agile Development - Learning from Tommy Hilfiger

I've always seen software development as more of a creative process than a science. A couple of recent experiences have convinced me even more of this.

The first incident occurred when I was explaining story writing to a BA. "Oh" she said, "a story board." "It sounds just like the story board we'd put together when working on a new fashion collection." Prior to being a BA she worked for several years in fashion. Apparently, in a fashion boutique someone would come up with a concept for a new range or collection on a story board. A team would then populate the board with ideas: drawings and snippets of fabric building and expanding on the basic theme.

The more I thought about her analogy, the more I liked it. After all, in the beginning, the concept for a new piece of software is no less creative than a fashion concept. Stories are nothing more than incomplete snippets that help flesh out the basic vision. Some stories will be accepted, depending on their perceived value, much like ideas in fashion.

The second incident occurred whilst watching television. The programme was called "Rich Girls", and followed the life of Tommy Hilfiger's daughter. Tommy's team had come up with a story board for a collection aimed at teenagers. As a way of proving their ideas, Tommy asked his teenage daughter and one of her friends to come into the office and go through the story board.

His daughter took her "work" seriously, and went to efforts to explain to her friend that "she needed to be honest." If she didn't like something she should say so. "Don't be diplomatic." The scene was an interesting one. The story board taking pride of place in the centre of a large room. Fashion designers huddled in a doorway, peering in. Tommy leading his daughter and her friend through ideas on the board. Tommy's focus was solely on his daughter and her friend; after all, they were the potential customers. The fashion designers were tense and uncomfortable as they listened to a severe critique handed out by two teenagers.

Tommy's management technique was interesting. There were only two experts in that room that day: the teenagers. Watching them say "I wouldn't wear that" or "I wouldn't be seen in stripes like those", with everyone listening, was amazing. A real lesson in direct customer feedback.

I recently read a post by Rachel Davies where she refers to the lies and half truths that characterise the relationship between development and the business in most organisations.

Tommy Hilfiger's management technique seemed miles apart from what I'm used to, and from what Rachel describes in her blog.

Managing a creative process is different to managing a defined science. Perhaps IT managers could learn a lot from the creative professions. Acknowledging that software development is essentially creative, would be a good first step.

Monday, May 30, 2005

Agile Development - Skill

Knowledge, Understanding and Skill. The three stages of learning (according to Japanese TQM). I have found this to be true. In my opinion a little knowledge can be dangerous, understanding comes only with experience and skill is acquired only after repeated practice in varying conditions.

The theme of learning has been central to my blogs on agile development. One of the tenets of the agile manifesto is to value people over process. If you accept that software development is a complex, non-deterministic, creative discipline, then this primary tenet must be true.

As an organisation, if you value people over process, then it is only logical that you get good at developing people. As an individual you owe it to yourself to improve.

I have been using XP for some time, and up to now I have reserved judgment on its general applicability. Well, I feel that I've learnt enough to come off the fence and offer an opinion.

I don't think XP provides enough guidance on how to get the best out of people - how best to build and develop individuals and teams. XP says: go get a skilled team, apply these practices and, bingo, success. But if the team is not skilled, or worse still, if they think they have the skills but in reality do not, what then?

In a sense this is an opportunity for people like myself to offer our services as "Coach". Come in, exhibit skill and leave with a fat check. But what happens after we go?

I have been reading Crystal Clear (CC) by Alistair Cockburn. CC advertises itself as a human-powered methodology. I prefer to think of it as a people-centric approach. It was developed over several years by interviewing successful agile teams and asking what worked. The outcome of these interviews has been boiled down into a number of guidelines for success.

CC addresses many of the shortcomings of XP by truly focusing on people. I have my own ideas on how the two approaches could be merged, taking the best from each. I'll save that for another blog. I'll leave you with an omission in XP that, if you think about it, is a startling oversight.

How do people improve? People learn from other people, often by example. For this to occur a relationship between teacher and pupil(s) must exist. This relationship is very important and requires mutual respect and personal safety. In the relationship the teacher leads, and the pupil follows. The essential skill of the teacher is leadership. Without good leadership improvement and eventual success is unattainable.


XP does not address the concept of leadership. Instead it assumes that a meritocracy will arise within teams and appropriate leaders will be found at appropriate times. My experience is that what is as likely to occur is that the blind will end up leading the blind. Alternatively, in scenarios where several people have the required leadership ability, uncertainty and confusion can arise as no one knows who to follow.

You only need to look to sport to see that teams need leaders. Good leaders get the best out of their teams.

Saturday, May 28, 2005

Agile Development - Understanding

Another entry on my role as agile coach. My last entry on this subject described what happened when I looked for management support to help solve an internal problem. It wasn't a surprise when that support was not forthcoming.

The problem was that some team members did not understand XP's approach to design, and as a consequence were adding technical debt to the project at a rate of knots. Worse still, they did not accept my authority.

What to do? Well, in a last-ditch attempt I called an impromptu brown-bag session on agile design and the cost of change. I described the need to "flatten" the cost of change curve when performing agile development, and I used examples from the project to make the point. The session was discussion-based, with members of the team asking questions and offering opinions. It became clear very quickly that the team members who were hostile to TDD just did not have an appreciation of the cost of change, never mind how best to address it.

During the meeting I could see knowledge transform into understanding. They had heard the terminology before, some had even skimmed "XP Explained", but this was the first time that they actually understood.

After the meeting, members of the team who had previously said that "they knew it already" asked to borrow my copy of "XP Explained". There is evidently a big gap between knowledge and understanding. Perhaps I should have held the brown bag earlier. Yet I have a feeling that without the pain that the team went through, the less experienced would not have understood. Knowledge plus experience leads to understanding; knowledge on its own is not sufficient.

Consequently my authority in the team has increased. The doubters doubt a lot less, and are more willing to take my lead.

Thinking through the lessons learned, I've concluded that this episode demonstrates a weakness in XP. How do you ensure that the less experienced developers do not cause too much harm, especially if they feel that "they know it already"? XP provides no guidance other than to avoid inexperienced developers. I've been reading Crystal Clear, which provides a pragmatic solution to this problem: let them do non-critical tasks, like bug fixes and maintenance.

This idea brings us back to where we started - managing people. As a leader I need to be recognised as such and empowered, but XP does not stress leadership.

In fact XP provides little guidance on people issues. In some ways I find XP a bit dehumanising, reducing software development to a set of rigid disciplines. In contrast, Crystal Clear takes a more people-centric approach. I can see myself borrowing from Crystal Clear in the future.

Sunday, May 15, 2005

Agile Development - Management

Back to blogging about my experiences as an agile coach. In my last entry I mentioned that we were weeks away from our first release, but that we were also burdened with technical debt.

Well, we have delivered our first release (sort of), but it has been painful. The more experienced developers in the team are now quite aware of the impact of technical debt, mainly as a consequence of having to refactor large swathes of the system in order to implement the final few stories.

The technical team lead approached me and asked what to do. Quite frankly, I was stumped for an answer. The problem was that some team members were quite convinced that their "up front designs" were playing a vital role, and in their opinion their views were just as valid as mine. After all, common sense dictates that unit testing is about testing, not design, right?

By this stage my attempts to persuade them otherwise had failed abysmally. Our team leader's attempts to assert some influence had failed too. People are complex. The reasons why some team members resisted TDD are still unclear to me. What was clear, though, is that their stance was putting the project at risk, and that management should be informed.

So I penned an e-mail to our manager. It had to be an e-mail, since the only person I could clearly identify as taking direct management responsibility was based in the States (all the team members, including myself and the customer-assigned team lead, are contractors). As a consequence, a tele-conference between myself, the technical team lead, the onsite customer-assigned team lead, and the offsite manager ensued (if this structure sounds complex, that is because it is!).

Nothing much came of this meeting other than me being told "Call me back when there is a crisis" by our manager. So what is the moral of the story? The pointy-haired boss is alive and well, even on agile projects.

Saturday, April 30, 2005

The cost of software

I've recently watched "The Aviator" on DVD - the movie about Howard Hughes. What an interesting personality. After watching the movie I was keen to find out more about the man.

I found a biographical link on the web: http://en.wikipedia.org/wiki/Howard_Hughes - read it yourself. My conclusion on Howard is that he was a man caught up with his own self-importance, driven by ego. I couldn't help feeling that perhaps he was a victim - possibly of an overbearing father who expected nothing less than greatness from his only son. Unfortunately the movie and the online biography say very little about his childhood. Apart from his affliction with OCD, the sadness of his life for me was that he spent all his time proving himself, and very little time actually living.

After reading about Howard Hughes, I was keen to explore the lives of other very wealthy people to see if I could find a common link. I looked up Bill Gates: http://en.wikipedia.org/wiki/Bill_Gates. I expected Bill's biography to contain all the signs of an egocentric megalomaniac (much like Howard Hughes). But it didn't. Pretty dull actually. Bill Gates appears to be your regular nerd. No visions of grandeur. No big I am. It was when I read his open letter to hobbyist programmers written back in 1976 that I got a real insight into the man: http://www.blinkenlights.com/classiccmp/gateswhine.html

Now Bill Gates is the man we all love to hate (myself included), but reading his letter I realised that he had a point back then. He wanted to produce good software and get it out there to the fledgling personal computer community. What's wrong with that? I found myself asking. To do it he needed good programmers, and good programmers deserve to get paid for what they do. He didn't want people "stealing" his software, especially when they stole and distributed pre-release buggy code - giving him and his software a bad rep.

After reading his letter I now see Bill Gates as the first nerd who cared enough about what he did to elevate software to something valuable, and in common with the producers of other valuable products, he demanded payment.

So how does this sit with the ideals of the open source community? Not sure. I've always seen the common sense in collaboration - working together for the common good. But I always thought that this work should be paid for by the people that benefited from it. For example, a lot of Fortune 500 companies could save themselves a lot of money if they worked with each other to generate common software that they shared (open sourced). They would end up paying the programmers they hired a lot less than they currently pay software vendors like Oracle. The idea of programmers writing code in their spare time for free, and then "donating" it to their employers, never made sense to me.

Some people have used open source as a means of bootstrapping a services business; JBoss comes to mind. This is a neat idea, but the few JBoss consultants that actually get paid are leveraging the work of a whole army of programmers out there who never see a penny. So my question is: why do they do it? The programmers, I mean. What motivates them?

My guess is that many in the open source community may have more in common with Howard Hughes than Bill Gates does. The ego boost of having their code used and worshipped by their peers is perhaps payment enough.

Friday, April 08, 2005

Agile Development - Knowledge, Understanding and Skill

As I mentioned in a previous blog, on my current contract I have been performing the role of agile coach. I'm not sure that I'm a natural fit for coach, but the project desperately called out for agile practices, I was the one with agile experience, so I was made coach.

On my previous project, I had the great fortune to be coached by Rachel Davies. Rachel is a very well respected member of the agile community. She is the chair of the Extreme Tuesday Club in London (www.xpdeveloper.com) and a director on the board of the Agile Alliance (you know, the people that wrote the agile manifesto: http://www.agilealliance.org/home - people like Kent Beck, Martin Fowler, Bob Martin etc). In the agile community you can't get better credentials than this. Rachel really impressed me in the way she assisted our team in coming to the right answers for ourselves. Never telling, but patiently observing, taking notes, pairing with individual members of the team, and providing input in the most subtle ways. In short, Rachel demonstrated a very high level of skill.

In contrast, I'm more of a teller and a doer, not that good at observing and influencing, but inspired by Rachel I resolved to do my best in my new role. Interestingly, my immediate successes were with the management. Within a short time I was able to demonstrate the power of user stories. The technical management had been having a difficult time defining the requirements with the users and avoiding scope creep. They were quick to appreciate how stories would help.

Soon we were up and running, bootstrapping the XP process by initially writing stories ourselves from the technical specs. We then introduced stories to the testers, who were quick to get on board after months in the dark. The next person on board was the user. He found seeing what he was getting, and defining priorities, a massive improvement. So much so that he cleared space in his diary to be with the team three days a week.

The people that proved most difficult to influence were (a) the business analysts and (b) the developers. The response of the business analysts was somewhat expected. These people have a vested interest in producing paper specs that everyone knew had limited value. We needed to tread carefully, coaxing the BAs into realising that stories could be good for them too. Even with the BAs we have had some success, with one or two of them being quite keen to write stories instead of continually revising specs.

The difficulty with the developers came as a great surprise. Some members of the team initially found it extremely difficult being told what to do. I believe the phrase 'don't teach me to suck eggs' was quoted more than once. I had to go to great lengths to stress that I was merely identifying an alternative way of working, which in no way invalidated the way they had worked in the past. The experienced developers were quick to see the benefits of XP practices even though the practices were not familiar in themselves. The less experienced developers in the team, however, found it more difficult to see past what they already knew. For example, TDD is still seen by some as primarily a unit testing technique (as opposed to a programming and design approach).

NIH (Not Invented Here) and egocentric technical debates soon engulfed the team. Unnecessary complexity was added to the code base, and technical debt began to grow. Worse still, I was in danger of becoming the main protagonist in these technical debates. People wanted to prove that I was wrong, and that they knew better. This situation called for swift action. I decided to take a back seat, stopped leading from the front and allowed the team to make decisions for themselves. That iteration no stories were delivered; instead people busied themselves with technical tasks, making "improvements" to the core architecture.

After the experience of zero velocity, many in the team realised for themselves that Agile development required discipline. My next step was to clarify my role, by suggesting that we needed a technical lead. Fortunately we had a good candidate in the team, who was keen to take on the role. For a couple of iterations velocity soared under our new lead.

Now we are potentially weeks away from our first release. Unfortunately the technical debt incurred during earlier iterations still needs to be paid. It will be interesting to see how things work out.

Thursday, March 31, 2005

Agile Development - What's in a language?

Since my last entry things for me have changed. In my current contract I am performing the role of "Agile Coach". Interesting in itself (I take my hat off to people like Rachel Davies - coaching is not easy), but I won't be blogging about it today. No, what I want to blog about is programming languages, and languages that support agile development.

For a while I have been hearing people in the agile camp say things like "dynamic languages are the way to go". Untyped (risky) and interpreted (slow) - both bad things, right? Well, in my coaching role I've been taking a closer look at test-driven development - and of course, having to practice what I preach, I have been doing a fair amount of TDD recently. For TDD there are two things you definitely need: a good test framework (JUnit) and a good refactoring IDE (Eclipse).

The more refactoring I have been doing, the more I have taken the refactorings available in Eclipse for granted, and the more refactoring facilities I want. Surely there should be a button to do this or that refactoring - and if not, why not?

Both of the tools I rely on for TDD have a common heritage: JUnit descends from SUnit, and Eclipse refactoring is inspired by the Smalltalk Refactoring Browser. In search of a better environment to support TDD, I dug up my old copy of the purple book and went on a quest on the internet to find the Smalltalk Refactoring Browser.

The first thing I noticed was that the Smalltalk community is still alive, although a bit muted. The impact of Java has definitely taken its toll though. A quick search on Jobserve for Smalltalk vacancies identified only 10 or so, and they appeared to be for maintenance work. It looks like the remaining Smalltalkers are using Smalltalk just out of interest, or just for fun in their spare time. With the exception of one or two German consultancies, most of the Smalltalk-related websites were either academic or open source related - very few businesses.

The second thing I noticed is that you can get commercial-quality Smalltalk for free. I remember the days when a VW Smalltalk licence ran well over a thousand pounds. Finally I found Squeak, a free implementation of Smalltalk from Smalltalk's inventor, Alan Kay.

The first thing that happened when I started using Smalltalk again was that I immediately fell back in love with the syntax. I first taught myself Smalltalk back in the early nineties as a way of learning OO programming. Learning OO and learning C++ at the same time was proving too difficult, so I decided to learn OO without C++. It worked a treat. OO expressions are built on sending messages to objects. The Smalltalk syntax mimics natural language: an object is a "subject", a message is a "verb", parameters are "complements" and expressions end in a full stop.
For example:

bunny moveForward: 5.

Tells the bunny object to move forward 5 pixels.
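
There are only three kinds of message in the language, which is pretty much all the syntax there is. Roughly (the extra bunny messages here are hypothetical, just in the spirit of the one above):

bunny hide. "unary message - no parameters"
3 + 4. "binary message - '+' sent to the object 3 with the argument 4"
bunny moveForward: 5 speed: 2. "keyword message - each parameter gets its own keyword"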

The reflectiveness of Smalltalk and its focus on late binding open up tremendous possibilities for tooling. Where Eclipse struggles to make Java self-aware, Smalltalk positively encourages tools to inspect this, discover that, refactor this or rewrite whatever...
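
Because classes and methods are themselves ordinary objects, that self-awareness is available from any workspace expression. For example (these are standard reflection messages; the bunny is again just illustrative):

bunny class. "answers the bunny's class, itself an object you can send messages to"
bunny respondsTo: #moveForward:. "true if the bunny understands the message"
bunny class selectors. "the set of all messages the class implements"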

So where are the Smalltalk tools? Well, the old Smalltalk favourites are still there - Browser, Inspector etc. Interestingly, these 30-year-old tools compete pretty well with their modern counterparts like Eclipse. Eclipse is ahead in some areas (e.g. code completion), but the Smalltalk tools are just so much more accessible and easy to use. I also found some newer tools; for example GLORP is an object-relational mapping framework for Smalltalk. If you are familiar with Hibernate you should take a look at GLORP: OR mapping without XDoclet, XML or byte code manipulation. The power of making everything an object (including classes) is that adding new object types and extending behaviour is easy. For example, GLORP has a class for a TableMapping, a OneToManyMapping etc., and of course overriding 'new' and extending its behaviour is a cinch. Object queries using GLORP look remarkably like SQL, yet they are standard Smalltalk expressions. No need for a special query language like HQL.
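
To give a flavour of that last point, a GLORP read looks something like this. I'm sketching from memory, and assuming a mapped Customer class and an open GLORP session, so treat the details as approximate:

| customers |
customers := session readManyOf: Customer where: [:each | each surname = 'Smith'].

The where: argument is an ordinary Smalltalk block; GLORP interprets it and generates the SQL for you, so there is no separate query language to learn.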

By this stage I was pretty smitten with Smalltalk. For agile development this has got to be the way to go. The slogan of XP is "Embrace Change". Thinking about the recent changes to Java - dynamic proxies, annotations, generics etc. - all of these have required changes to the core language (JVM and compiler). All these features either already existed in the original Smalltalk-80 or have been added without the need to change the language at all. Thinking about changes (refactorings) to existing apps, Smalltalk tools like the Refactoring Browser and the code rewriter have yet to be surpassed in the Java world (and probably never will be without further fundamental changes to the Java language).

But Smalltalk is untyped! With test-driven development, who cares? A compiler can only ensure the type assertions made by the programmer. What does that mean? Well, it could mean that your program is self-consistently wrong. Besides, you have to type in all that type info just so that your compiler can do the checks (what a chore). With TDD you make assertions about the correctness of your code with respect to its use, and these tests are enforced each time you build. So if you've got unit tests, who needs type safety?
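
To make that concrete, here is a minimal SUnit sketch. The Account class and its protocol are made up for illustration, but the TestCase subclass and assert: idiom are standard SUnit:

TestCase subclass: #AccountTest
    instanceVariableNames: ''
    classVariableNames: ''
    poolDictionaries: ''
    category: 'Banking-Tests'

testDepositIncreasesBalance
    "States how an Account is meant to be used - no type declarations anywhere"
    | account |
    account := Account new.
    account deposit: 50.
    self assert: account balance = 50

Run it and the assertion is checked against a live object. Get the protocol wrong and the test fails straight away - much the same feedback a type checker would give, only based on real behaviour rather than declared intent.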

Alan Kay, the inventor of Smalltalk, always envisioned that computers would become easier to program. So easy, in fact, that children could do it. In this view of the world, the interface to a computer is its programming language. So a graphical user interface becomes a programming language for end users.

I got to put this idea to the test the other day. Whilst playing with Squeak, my girlfriend decided to pop herself on my knee and took a look at what I was doing. I was playing with the Alice framework port (Alice.org). Alice is a 3D graphical framework which allows you to create a 3D world populated by Actors. On the screen was a pink bunny rabbit wearing dark glasses and carrying a drum and a mallet. I entered a few Smalltalk expressions, asking the system to 'do-it' each time, and watched my girlfriend's expression as the bunny responded to my every command. She soon caught on, and started entering commands of her own: "bunny head turn." Getting bored, I was about to issue "bunny destroy" when she stopped me. "No, you can't do that to such a cute bunny, make it beat the drum instead." At this point she grabbed the keyboard and typed "bunny beatDrum", but the system responded with "bunny didNotUnderstand".

Within 5 minutes, with no instruction from myself, my girlfriend was coding in Smalltalk. Maybe Alan Kay was right! The didNotUnderstand message was a bit of a bummer though. What would she need to do to make the rabbit beat the drum? Well, the following expression may work: "bunny mallet beat", or there could be another sequence of messages that would work. If not, she would have to define and implement a message herself. This is where things would become difficult. How do you make a bit-mapped representation of a mallet move in 3D on a 2D surface? You would have to calculate the position and colour of a lot of bits. This would require a lot of maths. Perhaps there is an intermediate abstraction that could help, like the Morphic framework. Even so, you would still need to know a lot to use it effectively. So much for simplicity and end-user programming.
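
Just to illustrate the gap, defining the missing message might look roughly like this. The mallet part and its turn: message are hypothetical - I'm assuming the Alice port exposes something like them:

beatDrum
    "Hypothetical method on the bunny's class: swing the assumed mallet part down and back up a few times"
    3 timesRepeat: [
        self mallet turn: 45.
        self mallet turn: -45]

Easy enough to type, but only because someone has already done the hard 3D work that makes a message like turn: move pixels on the screen. Where that abstraction doesn't exist, the five-minute programmer is stuck.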

I've recently read an interview with Alan Kay on the web, in which he was somewhat despondent about the direction computer languages have taken over the last couple of decades. His feeling is that 'good enough' short-termism has always won out over sound engineering principles. He is right, but I do think he missed something. Computers do very simple things very quickly. To do something meaningful (like beat a drum) the computer needs to perform a very large number of very simple operations in a pre-determined sequence. Herein lies the complexity. Abstraction can hide the complexity somewhat. For example, the bunny responds to a small number of simple messages - pretty simple - but when the appropriate abstraction does not exist, complexity raises its head once more.

Because of this complexity the world cares very little for programming. It's too hard and requires far too much skill. Instead people hire other people to do their programming for them. All they care about is whether the solution is performant, functional and affordable. It is the programmer's problem to "talk" to the computer. For the programmer, his main concern is that he can get work (using a language that he knows and that is popular) and that the solution is "good enough". Until users are aware of the long-term costs of "good enough" solutions nothing will change. So there we have it. Looks like I'll be using Java for some time to come then (is there anyone out there with a lot of money who fancies marketing Smalltalk? No? I didn't think so).