I recently stumbled across this video, which gives a good introduction to what Continuous Delivery is at its core.
A hot topic of the last few years has been the debate as to whether traditional (aka waterfall-like) methodologies or agile ones (XP, Scrum, etc.) deliver better results. Much of the discussion I am aware of has focused on questions like:
- Which approach fits the organization?
- How strategic or tactical (both terms usually go undefined) is the project and how does this affect the suitability of one approach over the other?
- What legal and compliance requirements must be taken into account?
- How large and distributed is the development team?
This is all very important stuff, and thinking about it is vital. Interestingly, though, what has largely been ignored, at least in the articles I have come across, is the tooling aspect. A methodology without proper tool support has relatively little practical value. Of course the tools exist. But can they effectively be used in the project? In my experience they mostly cannot, at least when we speak about the “usual suspects” for requirements and test management. The reason for that is simply money, which comes in many incarnations:
- Few organizations have enterprise licenses for the respective tools, and there is normally no budget for buying extra licenses for the project, either because that part of the budget was rejected or because it was forgotten altogether.
- Even if people are willing to invest for the project, the purchasing process itself can be quite prohibitive.
- Even if licenses exist, most of these comprehensive tools have a steep learning curve (no blame intended; this is a complicated subject).
- No project manager, unless career-wise suicidal, is willing to let the project budget pay for people getting to know this software.
- Even if there were budget (in terms of cash flow), it takes time, and often more than one project, to become proficient with these tools.
Let’s be clear: this is not product or methodology bashing. It is simply my personal, 100% subjective experience from many projects.
Now let’s compare this with Version Control Systems (VCSes), where the situation looks quite different. Products like Subversion (SVN) are well established and widely used; their value is not questioned, and every non-trivial project uses one. Why are things so different here, and since when? (The second part of the question is very important.) VCSes have been around for many years (RCS, CVS and many commercial ones), but none of them really gained the acceptance that SVN enjoys today. I cannot present a scientific study here, but my gut feeling is that the following points were crucial:
- Freely available
- Very simple to use compared to other VCSes. This causes issues for more advanced use-cases, especially merging, but allows for a fast start, which is certainly better than avoiding a VCS in the first place.
- Good tool support (e.g. TortoiseSVN for Windows)
Many people started using SVN under the covers for the aforementioned reasons, and from there it gradually made its way into the official corporate arena; it is now widely accepted as the standard. A similar pattern can be observed for unit testing (as opposed to full-blown integration and user acceptance testing): many people use JUnit or something comparable with huge success. Or look at Continuous Integration with Hudson. CruiseControl was around quite a bit longer, but its configuration was perceived as cumbersome. And on top of its ease of use, Hudson added something else: extensibility via plug-ins. The Hudson developers accepted upfront that people would want to do more than the core product could deliver.
All these tools were designed bottom-up by people who knew exactly what they needed. And by “sheer coincidence” much of this is exactly what an agile approach requires. My hypothesis is that more and more such tools (narrow in scope, free, extensible) will appear and move up the value chain. A good example is the Framework for Integrated Test (FIT), which addresses user acceptance tests. As this happens, and as integration between the various tools at different levels progresses, the different methodologies will also converge.
This is a term from the (software) project management area. It basically means that deliverables are not fixed in terms of content, which is the “normal” approach. The latter means that something has been agreed upon as scope and will be delivered once that scope has been implemented; hopefully the planned deadline will be met, but it can be sacrificed in order to deliver the full thing. With time boxing, however, one is willing to reduce the scope in order to meet a deadline (this goes hand-in-hand with the MoSCoW method). You therefore often find it in the following contexts:
- Agile projects
- “Political” projects where keeping a date matters more than delivering full functionality
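To make the idea concrete, here is a minimal sketch of time boxing combined with MoSCoW priorities. All item names and effort numbers are purely illustrative assumptions, not taken from any real project: the point is simply that the date is fixed and the scope flexes.

```python
# Time boxing sketch: the capacity (deadline) is fixed; scope is cut
# in MoSCoW priority order until the remaining items fit the box.
PRIORITY_ORDER = {"must": 0, "should": 1, "could": 2, "wont": 3}

def plan_time_box(items, capacity_days):
    """Return the item names that fit into the time box.

    items: list of (name, priority, effort_in_days) tuples.
    Items are taken greedily in priority order; "wont" items and
    anything that no longer fits is dropped to protect the deadline.
    """
    planned, used = [], 0
    for name, priority, effort in sorted(items, key=lambda i: PRIORITY_ORDER[i[1]]):
        if priority != "wont" and used + effort <= capacity_days:
            planned.append(name)
            used += effort
    return planned

items = [
    ("login", "must", 5),
    ("reporting", "should", 8),
    ("theming", "could", 4),
    ("search", "must", 6),
]

# With a 20-day box the "could" item is sacrificed; the date holds.
print(plan_time_box(items, 20))  # -> ['login', 'search', 'reporting']
```

In a fixed-scope project the same situation would instead push the deadline until "theming" is done as well; that inversion is the whole difference.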
For a more thorough discussion check out this article.