My last post on Emergent design got me thinking about other common architectural patterns (in addition to ORM) that may not be the best solution for the types of problems to which they are commonly applied.
I believe N-tier architecture, where N=3 is a common choice, is yet another example of Big Upfront Design locking us into an unnecessary straitjacket before we have even looked closely at our problem. The purpose of each tier in such an architecture is to provide a service interface to subsequent tiers, the goal being that tiers are loosely coupled and hence interchangeable. From a client's perspective, the tier you will most likely want to change is the presentation tier. A PDA client will want a different presentation tier than a desktop client, for example, but both clients would want to share everything else from the service tier down.
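In the canonical N=3 case the stack looks something like this, each tier exposing a service interface to the tier above it (a simplified sketch; real deployments vary):

    Presentation tier   (HTML UI, PDA UI, desktop UI, ...)
         |  service interface
    Business tier       (domain logic, services)
         |  service interface
    Data tier           (relational database)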
All of this is common wisdom and has spawned a whole architectural vernacular of its own around the term Service Oriented Architecture. So what is wrong with SOA, you ask? Nothing, other than the term is not very specific and it tells you nothing about the type of problem to which it applies. SOA is the proverbial solution looking for a problem. So let's start this discussion again from the other direction. Let's choose a concrete problem and then identify a clean working solution.
The problem: My users want to store a bunch of related data records in a database. They also want to create, retrieve, update and delete (CRUD) these records remotely through a web browser interface.
OK. They want to store related data records. Sounds like a job for a relational database to me. They then want to view these records over HTTP. Well, the verbs in HTTP (POST, GET, PUT, DELETE) map fairly naturally to CRUD operations, so why not make use of this? Also, my users will view data records via HTML web forms presented to the browser, so there is inherent data coupling between my web presentation and my entity/domain model. Why not accept this coupling as intrinsic?
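To make that mapping concrete, here is how the four verbs might line up against CRUD and SQL for a single resource (the /records URLs and the table name are purely illustrative):

    POST   /records      ->  Create  ->  INSERT INTO records ...
    GET    /records/1    ->  Read    ->  SELECT * FROM records WHERE id = 1
    PUT    /records/1    ->  Update  ->  UPDATE records SET ... WHERE id = 1
    DELETE /records/1    ->  Delete  ->  DELETE FROM records WHERE id = 1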
From an Emergent design perspective, the simplest thing that can possibly work is an architecture that allows you to create data record hashmaps from HTML presentation views and persist them in a relational database. Better still, from an OO perspective those hashmaps could persist themselves, and perhaps do other useful housekeeping, like data record validation, that only they should know how to do. Voilà: the simplest CRUD architecture possible. A Controller to handle the HTTP verbs and map them to actions on our database, an ActiveRecord that models a database record as a hashmap and is capable of persisting itself, and a View that translates the model to an HTML presentation ready to be sent back to the browser over HTTP.
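As a sketch of the idea in plain Ruby (the Record class, the DB connection handle and the validation rule are all hypothetical, not the real ActiveRecord implementation):

    # A toy 'active record': a hashmap of column values that validates
    # and persists itself. DB stands in for some database connection.
    class Record
      def initialize(attributes = {})
        @attributes = attributes          # e.g. { 'name' => 'Fred' }
      end

      # Housekeeping only the record should know how to do.
      def valid?
        !@attributes['name'].to_s.empty?
      end

      # The record persists itself.
      def save
        return false unless valid?
        columns      = @attributes.keys.join(', ')
        placeholders = (['?'] * @attributes.size).join(', ')
        DB.execute("INSERT INTO records (#{columns}) VALUES (#{placeholders})",
                   *@attributes.values)
        true
      end
    end

The controller then becomes little more than glue: copy the form fields from the HTTP request into a hashmap, ask the resulting Record to save itself, and hand the model to a view for rendering.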
Sounds very much like Ruby on Rails to me. Now how do you think DHH got there? Did he use big upfront design, or did he choose to focus on his problem afresh and apply emergent design principles? It should come as no surprise to find out that Rails was harvested from a real-world CRUD application that evolved over time. Nor should it come as a surprise that 37signals are big advocates of TDD. In fact, Rails is the only web framework I know of that has TDD built in.
One last thing. The astute amongst you are probably asking what happened to our nice separation of concerns and SOA. Well, if you look at my problem statement, I didn't need them (YAGNI). My client was using a web browser. I've saved myself a heck of a lot of unnecessary code and shipped my release three months early. It's now out there earning a return on investment for my customer whilst those N-tier folks are still deciding how many tiers they need. Hang on, I hear you cry: so what happens when your customer does decide that he needs another client to the system?
Problem 2: Not only does the system need to support CRUD data entry by people, but in addition we have recently gone into partnership with another company who wants to access the same data records using a remote system. So we now need some way to support this too.
The world has changed, so we need to change our system to suit. Unlike the BUFD guys, we will only make such an investment if driven to do so by our problem. OK. We understand how to map HTTP verbs to database actions, and we also know how to map our model to HTML. Given this, it makes a lot of sense to re-use this infrastructure and choose a service interface that makes the most of HTTP, so we settle on a RESTful approach. Our previous CRUD-based controllers still apply, but now the presentation needs to be machine-readable XML instead of HTML. We could put a lot of if..else.. logic in our controllers to determine what type of presentation to create, but that would mean a lot of code duplication, so we choose to encapsulate this logic in one place which all controllers can use. This approach is so successful that we then decide to factor it out as part of our home-grown framework. Sounds pretty much like how Rails 1.2 was born.
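This is essentially the shape of the respond_to mechanism that shipped with Rails 1.2. A minimal controller action along these lines (the Record model name is illustrative) serves both presentations from a single code path:

    class RecordsController < ApplicationController
      # GET /records/1      -> HTML for the browser
      # GET /records/1.xml  -> XML for a remote machine client
      def show
        @record = Record.find(params[:id])
        respond_to do |format|
          format.html                                   # renders the HTML view
          format.xml { render :xml => @record.to_xml }  # machine-readable form
        end
      end
    end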
So now we have a RESTful application that is fully SOA-aware and is built on our existing technology base. The only new technology is XML, and given that XHTML is a variant of XML, we could argue that we haven't had to introduce any new technology at all. So we have managed to introduce a service tier, when we needed it, at very little cost.
Some following this discussion may conclude that DHH just got lucky. To them my answer would be that you create your own luck. By gaining a deep understanding of the problem, keeping the design clean and simple, and deferring unnecessary design decisions rather than prematurely locking yourself into false assumptions, 9 times out of 10 evolutionary paths will present themselves without you having to plan (guess) for them upfront.
In the 1 case out of 10 where there is no low-cost option, what your problem is telling you is that your application requires diverse personalities, and whether you plan for this upfront or evolve it later, producing such an application will be expensive. The benefit of emergent design is that, by applying the YAGNI principle, you do not incur this cost unless you really need to.
Sunday, March 30, 2008
4 comments:
I have worked on a number of n-tier projects, and while I think the intent is actually quite pure (decouple to permit easier maintenance or a change of implementation later), the outcomes are somewhat misaligned. The decoupled layers were largely not taken advantage of, i.e. you aren't gonna need it (YAGNI) on an architectural scale. Of course, the immediate downside of this is the additional coding effort required to add in all the artificial layers which, on their own, don't actually add any business value. There is always inherent coupling between layers no matter how many you have, so adding more simply complicates the development tasks of maintenance, adding new features and tracking down bugs.
When working on these projects I used to get the feeling that the old guard (client-server) crew were laughing at us, and now I'm sure they were. I know I keep banging on about this, and I am definitely not advocating the hacking mentality, but it always comes back to business value. A manager on one of these projects wanted to turn a simple request from the business into an architecture exercise; the business were amazed at how long it took the team to deliver, lost faith and cancelled the project. Much more importantly, they lost a critical market window. We as developers need to become much more in touch with these realities of life.
I agree with your Rails argument. I certainly don't think it will be the end game on this subject, but until something better comes along it looks like a good fit for many business problems.
We currently have some projects coming up at work, and the first thing I hear is 'we need a web service to do X'. That really bothers me: I want to know what the underlying business problem is before going down any technology selection path. This seems to be a common problem that is getting worse, not better. While it is true that more and more businesses are looking into B2B these days, let's not blindly accept that a web service is required, as if it's a panacea for all problems.
Think people, think!
SOA and n-Tiers..
Ok, more definitions:
SOA and Tiers and Layers are three different architecture styles (patterns, so to speak). As patterns, they are intended to suggest a solution that has worked in a similar context, when some particular forces and conditions are met, and of course they all list the consequences of applying the idea. The Idea. That is, a pattern is not a reusable block of code, but the idea of a solution that may be the base for your own custom solution.
I agree. People simply select 3-tier architecture for anything, which is a case of Golden Hammer Syndrome, and we get an impedance mismatch at the end. Your definition is wrong though: you are talking of layers, not tiers, but I may blog later about the difference. The main point holds: to use a pattern (be it strategic, tactical or operational) you should work on the idea and decide if it fits. And yes, 90% of people are using patterns the wrong way, as a recipe. Bad, bad boys.
SOA, the vendor buzzword. If you have something with web services, you have SOA, they said. I agree that maybe 95% of people do not actually know what SOA is. Despite this, SOA can be a good architecture style if correctly applied.
Hey! When you describe your take on the CRUD problem, you are actually talking like an architect/designer! Look at the needs and choose the best fit. Do not simply pick the canned solution; see if it fits, or modify it to fit! That is what you should do with a pattern. You are actually using MVC!!! And you actually did Separation of Concerns too.
SOA is not web services, so you may have added some service definition to your solution very cheaply and very efficiently.
Problem 2! The only problem I can see there is if you put part of your BL into the HTML. If you do that, then you may have some additional work to do. But if you separated correctly, you just have to build another front end. That is not luck; you made the correct decision previously, based on your experience. You planned that (without knowing it?) and that was not guessing. The RoR guys planned that, and the architecture they have is ready for it. They didn't guess; they were just prepared for the common trends.
I like a lot what Andrew says. Client-server is actually a 2-tier style. And everybody wants to have web services (like having the latest cell phone model with Google Earth embedded, just to talk the same talk); that is an antipattern called Fashion Bandwagon. We don't need to blame web services; we need to blame the managers for not listening to architects and for listening to vendors.
William Martinez Pomares
Hi William,
"SOA and Tiers and Layers are three different architecture styles (patterns, so to speak)."
I agree (although I look forward to your distinction between tiers and layers; I thought those two were the same thing).
The problem is that they are typically used together because they are all considered "best practice".
Our pattern languages are failing for a number of reasons. One is that "best practice" doesn't exist. What does exist is "right practice" for a given context. For right practice, and for the tailoring of patterns to specific problems (which we both agree is needed), you need to understand the underlying design principles that led to the initial use of a pattern in the first place.
"Hey! When you describe your take on the CRUD problem, you are actually talking like an architect/designer!"
I know. I act as a Designer, Architect, Business Analyst, Systems Analyst, Tester... each time I sit down to program. Like an artist, I start with a blank canvas. We have different ideas about the developer role and what competence in this role actually means.
DHH is a young developer. He chose to shun contemporary best practice and used the simplest pattern that could possibly work, which for him was MVC.
There is a whole industry out there built on J2EE 3-tier architectures which people are using to build CRUD applications where MVC would do.
Why? Where in the J2EE patterns literature does it say when not to use the Sun J2EE BluePrints patterns?
We have a one-size-fits-all approach, yet most of the time it is overkill (YAGNI). A lot of ideas have been appropriated by people whose sole interest is to sell us technology we don't need. I have seen Architecture and the role of Architect appropriated this way, and I have also seen the same happen with pattern languages.
If you believe that developers can't make "best practice" decisions and should just work and not think, then it is easy to believe that you can buy technology that encapsulates "best practice" patterns, buy an expert, call him an "Architect", and get him to do all your design for you, which you then get cheap "coders" to implement in a painting-by-numbers style.
If those coders stay in their box and do not understand the principles, then they will follow the design/patterns blindly. They will not recognise when the patterns don't fit the problem, and they will not be able to provide feedback to the expert Architect.
The end result? The vendors get rich, the Architect's ego grows, the developers' working lives are unrewarding because they have been de-skilled and dis-empowered (and they are possibly offshore, working in India for a third of the price), and the customer pays a lot of money and gets back very little value in return.
Is this a situation you recognise?
My blog posts aren't just an academic discussion over technical terminology. I am relaying what I have seen and learnt over 18 years in the software business. This experience includes hands-on programming, team leading, project management, systems engineering, and engineering management. I have seen what this business does to its people and what it does to its customers. We are getting to the stage where customers would rather go halfway around the world to get their IT needs met than trust their own local software professionals to deliver value. This is my motivation for blogging. We are failing and we need to accept this fact!
We, and the industry we have built, are responsible for this failure!
"Problem 2! The only problem I can see there is if you put part of your BL into the HTML. If you do that, then you may have some additional work to do. But if you separated correctly, you just have to build another front end. That is not luck; you made the correct decision previously, based on your experience. You planned that (without knowing it?) and that was not guessing. The RoR guys planned that, and the architecture they have is ready for it. They didn't guess; they were just prepared for the common trends."
We agree here. To do this, though, DHH had to have an appreciation of first principles: the separation of concerns that led to the MVC pattern in the first place.
"I like a lot what Andrew says. Client-server is actually a 2-tier style. And everybody wants to have web services (like having the latest cell phone model with Google Earth embedded, just to talk the same talk); that is an antipattern called Fashion Bandwagon. We don't need to blame web services; we need to blame the managers for not listening to architects and for listening to vendors."
Lots of people are to blame, including the customers themselves. Taylorist beliefs mean that the developers usually get most of the blame, yet in most organisations they have the least control and influence.
We need a situation which is more transparent, where the consequences of decisions are more readily traced back to the decision makers. Having been a manager, I must admit that managers are often the ones most to blame, in my opinion. This is why many believe that teams should be self-managed, making their own decisions. But IT professionals do not help either. On Andrew's blog he speaks of the developer whose sole interest is technology and who has scant regard for the business and for delivering business value.
It is my strong belief that we should streamline the number of roles, minimising the number of hand-offs and focusing the minds of practitioners on the end goal, which is the delivery of business value. This is why I believe that the developer role needs to be expanded, not contracted and de-skilled. Good developers should understand the business, they should understand architecture and its impact on business value, and they should understand design principles. Finally, they need to understand how to program in a given language with a set of APIs they have selected for themselves.
This is the point I was making about the Unix operating system. The people on that team were multi-skilled and multi-disciplined, with an appreciation of the whole process. They could do "the whole job", and hence could "optimise the whole" rather than just narrowly focusing on their own bit.
We sub-optimise bits of the development process in isolation and wonder why the whole doesn't hang together and why the whole doesn't deliver the expected results. Taiichi Ohno would say "optimise the whole", not the parts.
To do this, I believe the breadth and the prestige of the developer role need to be expanded. Given the right skills, developers are often best placed to make the right choices.