Yes, there was over-engineering. Loads of it. Back in the mid 90s, when I started my career as a developer, the goal was to become an architect. No serious developer would dare write a single line of code before selecting their preferred design patterns—we would then decide how the business requirements would fit into them. Yes, we would first come up with our architecture, then our macro design (layers) and domain model (including UML diagrams), and only then would we think about the business requirements and detailed use cases. Yes, user stories were not really a thing back then. We had long use cases with a basic flow and many alternative flows.
The 90s were an interesting period to be a developer. We finally had the Internet, and a few academic papers on structured design from the 70s and 80s became available to us. We also had a few books focusing on Object-Oriented Programming, still a novelty in many places.
I still remember the countless months we spent drawing class, sequence, component, deployment, and many other diagrams, trying to come up with the best design for the software that we would one day build. We built some prototypes as well, so I can’t say we were not coding or experimenting during the elaboration phase. All the lessons from the prototypes would be fed back into our diagrams, of course.
Besides the internal design, the 90s also brought an explosion of distributed systems. We had CORBA and DCOM. We had client/server and multi-tier architectures. We learnt about single points of failure when using databases as a communication point between applications. Yes, we had nasty problems with that. We learnt how to create “services”—SOA was emerging as an architectural concept. We learnt how to think about scalability and security. Many projects outside governments and banks were also becoming very large and complex.
Creating a great design was the goal for most passionate software developers; it was also the path to becoming an architect and advancing our careers. We had to study and be good at software design. We had to understand in depth the principles of structured and object-oriented design. We had to understand the principles of distributed systems. We had to understand in depth all the levels of cohesion and coupling. We had to understand covariance and contravariance. We had to learn how to design component boundaries, including their contracts and invariants. We had to learn how to identify the verbs and nouns, the language of the business, and map them into software. We would never pass a job interview without knowing how to model data in our relational databases and, most importantly, how to make our queries perform well. We had to define our proprietary protocols at the right granularity in order to make them easier for other systems to consume while also addressing I/O and bandwidth concerns. Yes, we spent a lot of time doing that.
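Since covariance and contravariance still trip many people up, here is a minimal Java sketch of the idea using generics wildcards; the classes and method names are mine, purely for illustration:

```java
import java.util.ArrayList;
import java.util.List;

class Animal { }
class Dog extends Animal { }

public class VarianceDemo {
    // Covariance: a list producing some subtype of Animal can be
    // read as a source of Animals. A List<Dog> is accepted here.
    static Animal first(List<? extends Animal> animals) {
        return animals.get(0);
    }

    // Contravariance: a list consuming Dogs can be typed to any
    // supertype of Dog. A List<Animal> is accepted here.
    static void addDog(List<? super Dog> sink) {
        sink.add(new Dog());
    }

    public static void main(String[] args) {
        List<Dog> dogs = new ArrayList<>();
        dogs.add(new Dog());
        Animal a = first(dogs);   // covariant read

        List<Animal> animals = new ArrayList<>();
        addDog(animals);          // contravariant write
        System.out.println(a + " / " + animals.size());
    }
}
```

The same “producer extends, consumer super” reasoning applies to function and interface types in most typed languages; it is exactly the kind of foundational knowledge we were expected to master.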
Software development in the 90s was all about design and architecture. And yes, it had to be done with UML.
But although we learnt how to design, we failed in many other aspects. We didn’t deliver fast enough. In fact, in some projects we didn’t deliver anything at all. Most of the time, all the thinking we did trying to create a perfect design was completely wasted due to a lack of quick feedback and our inability to cope with rapid business evolution. I still remember the huge spreadsheets we used to control change requests before we had written any code. Yes, we screwed up big time. Many times.
Fast-forward 20 years, and some of us understood that what we were doing was wrong. We then incorporated Agile, Lean, and many other principles and practices into the way we work. We incorporated many new design and architecture techniques into our toolkit. We incorporated new technologies. We incorporated different ways to collaborate with the business and new ways to structure our teams. We learnt that although design was important, nothing was more important than continuously delivering software. We learnt how to get feedback and iterate. We learnt we were responsible for testing our own code. We learnt we had to support our software in production. We learnt the value of creating prototypes and throwing them away. We learnt the value of experimentation. But we also learnt that we didn’t need to throw away all the design skills we had acquired over the years in order to work in a better way. We didn’t need to discard all the great work done before and during our time, mainly around software design, during the 70s, 80s, and 90s. But the most important lesson we learnt was that context is king and that software design is all about trade-offs. Design is pointless without delivery. But can we keep delivering code without a solid design foundation? I don’t think so.
One of the biggest problems I see today in software design is binary thinking. If X is bad, then Y must be good. If X worked for company A, then it will also work for us. If a well-known person said something in a 45-minute conference talk, or we found something written in a blog, it must be true. Always.
Another common mistake caused by binary thinking is the belief that all features in a software project have the same degree of complexity and that a single design choice will fit them all. But the truth is, some features are quite simple, others very complex, and many others somewhere in between. Sometimes the complexity is in the implementation; other times it is in understanding and modelling the domain. Sometimes the complexity is in understanding what we should be building. Other times the complexity is in the integration with other systems. Parts of the same feature can also vary a lot in complexity: some parts may be trivial and quickly implemented, while others can be extremely complex and demand a lot of thinking up front. Some features are shallow (very few lines of code) while others are deep (thousands of lines of code spread across different modules). Non-functional requirements can also make the implementation of simple features (business-wise) very complex. And the interesting thing is that all of this can be found within the same software project. So, if we agree that different features in a software project have different degrees of complexity and size, there is no way we can apply binary thinking to software design—no single design approach will ever work in a reasonably complex software project.
We live in a world where information is easily and quickly accessible. One Google search and we can find many ready-made solutions to our problems. My fear is that, as an industry, we are losing the ability to think. We are losing the ability to research and make our own choices. More and more, we are looking for a ready-made recipe. A shortcut. I call it the “Stack Overflow solution.”
It makes me sad that, to some people, software design is synonymous with over-engineering. It also makes me sad to see that “no design at all” is becoming synonymous with Agile, Lean Startup, and fast delivery. I don’t think the originators and main proponents of good software design, Agile, and Lean principles ever meant that. Over-engineering is bad, but so is no design at all. Simple doesn’t mean crap. Simple means just enough design for what we know today, but not less. Paraphrasing Einstein, software design should be made as simple as possible, but not simpler. Or, to put it another way: code must be well-designed but not over-designed.
After interviewing a lot of developers and reviewing a lot of code in the past few years, my main concern is that we are developing a hacker culture. Many developers I have met who have been in the industry for less than a decade have very little knowledge of good software design. They will claim otherwise, of course. If you think I’m exaggerating, ask the developers in your team to explain cohesion and its different levels. Ask them about connascence, covariance, and contravariance. Ask them about the different degrees and types of coupling. Ask them about design by contract and invariants. For those who have heard about the SOLID principles, ask them where those principles came from. Many developers today say design patterns are bad. Ask them to describe some patterns, their differences, and when they should or should not be applied. Ask them about the different pattern classifications. Ask them about the difference between a Bridge, an Adapter, and a Mediator. What problem is a Visitor supposed to solve? What is a Memento? If they can’t explain that, how can they say patterns are bad?
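To be concrete about the Visitor question, since it is the one that stumps most people: the pattern exists so that new operations can be added over a fixed set of element types without modifying those types. Here is a minimal sketch, with hypothetical names of my own:

```java
// Each element type knows how to accept a visitor (double dispatch).
interface Shape {
    <R> R accept(ShapeVisitor<R> visitor);
}

class Circle implements Shape {
    final double radius;
    Circle(double radius) { this.radius = radius; }
    public <R> R accept(ShapeVisitor<R> v) { return v.visitCircle(this); }
}

class Square implements Shape {
    final double side;
    Square(double side) { this.side = side; }
    public <R> R accept(ShapeVisitor<R> v) { return v.visitSquare(this); }
}

interface ShapeVisitor<R> {
    R visitCircle(Circle c);
    R visitSquare(Square s);
}

// A new operation (area) is just a new visitor;
// Circle and Square stay untouched.
class AreaVisitor implements ShapeVisitor<Double> {
    public Double visitCircle(Circle c) { return Math.PI * c.radius * c.radius; }
    public Double visitSquare(Square s) { return s.side * s.side; }
}

public class VisitorDemo {
    public static void main(String[] args) {
        Shape s = new Circle(2.0);
        System.out.println(s.accept(new AreaVisitor())); // ~12.566
    }
}
```

The trade-off is the mirror image: adding a new element type forces every visitor to change. Knowing when that trade-off pays off is exactly the kind of design knowledge I’m talking about.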
Many developers today are not aware of the software design foundation laid during the 70s, 80s, and 90s. Others prefer to ignore it: that is over-engineering, they say, and they don’t need it. That’s fine; I can respect that. But why do we still have software that sucks, then? Is the software being produced today really better than the software produced 20 years ago? Why do developers still struggle to design code with TDD? Why are we still talking about legacy code? For me, legacy code is synonymous with code that is badly designed and hence difficult to test and maintain.
By no means am I defending over-engineering or wasting time drawing diagrams in UML. What I’m trying to say is that, while I would never spend hours designing the classes inside my core domain up front, I would also not try to build an enterprise application, one test at a time, without thinking about its overall design before starting to code. Design is essential for software development. If I’m building an application that will live in an ecosystem alongside many other applications, or one with heavy non-functional requirements, or one that needs to comply with regulations, then yes, of course I would put a lot of thought into its overall structure (macro design) before I start coding, but I would still develop its features (micro level) one test at a time. Design happens at all levels: from up front at the architectural level to just in time at the micro level as part of my TDD flow. Deciding how much to design is a skill—it’s all about finding the inflection point, a subject I covered in a previous blog post.
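To illustrate what design at the micro level looks like in a TDD flow, here is a small, hypothetical Java/JUnit 5 sketch; the domain and names are invented for illustration. The test comes first and forces the design decisions (names, contracts, invariants) before the implementation exists:

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

// Written first: the test drives the public contract of the class,
// its constructor, method names, and failure behaviour.
class DiscountCalculatorTest {

    @Test
    void appliesPercentageDiscountToTotal() {
        DiscountCalculator calculator = new DiscountCalculator(10); // 10% off
        assertEquals(90.0, calculator.apply(100.0), 0.001);
    }

    @Test
    void rejectsDiscountsOutsideValidRange() {
        // An invariant surfaced by the test: discounts are 0-100%.
        assertThrows(IllegalArgumentException.class,
                     () -> new DiscountCalculator(-5));
    }
}

// The simplest implementation that makes the tests pass.
class DiscountCalculator {
    private final double percentage;

    DiscountCalculator(double percentage) {
        if (percentage < 0 || percentage > 100) {
            throw new IllegalArgumentException("discount must be 0-100%");
        }
        this.percentage = percentage;
    }

    double apply(double total) {
        return total * (100 - percentage) / 100;
    }
}
```

Notice that even this tiny example surfaces a contract and an invariant (the percentage range): exactly the micro-level design decisions TDD drives out, one test at a time.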
Software design is one of the most important skills in software development. Good design enables developers to collaborate, business features to be added and changed frequently, and reliable test automation to be built. With experience, we learn how to quickly identify problems and decide how much time we should spend on them. We also learn that most design decisions should be made at the last responsible moment; that is, we try not to commit to a design too early, while we don’t yet know enough about the problem.
And that’s why I say that not everything we did in the 90s was in vain. Although we over-engineered everything and didn’t deliver much, we learnt how to design. We learnt how to think for ourselves. We learnt how to research. We learnt how to reason about trade-offs. It took us a while, but we also learnt how to avoid binary thinking and restrain our excitement about new trends. The combination of a strong software design foundation with Agile and Lean principles and practices puts us in a much better position today, not only to deliver software fast but also to deliver software continuously.
Our goal is to enable business agility, and that can be achieved through software that can be continuously deployed to production. Deploying software to production once is not that hard, but deploying software to production multiple times a day, and keeping that rhythm for months if not years, well, that’s not so simple. We need a lot of discipline and engineering to achieve continuous delivery, which makes software design and TDD two of the most important technical disciplines we have to master.