It’s easy to feel overwhelmed these days. Choosing a Netflix show can take up an entire weekend and even lead to a breakup. Sorting through reviews on Trivago to find a hotel can trigger a rash. Even getting a coffee can cause a nosebleed, with so many options for roasts, sizes (often in different languages), and milks from various animals and nuts. We are both fortunate and burdened by endless choice. And it’s not getting any easier: the pace of technology is so relentless, so unyielding, and so all-encompassing that it can be tough to catch our breath. Is it any wonder that neo-luddism is gaining popularity?
One source of comfort is that in our personal lives, we can adopt technology at our own speed. In the business world, we rarely have that luxury. Let’s take a moment to appreciate enterprise leaders during these chaotic times, as technology adoption is mostly influenced by the urgent and changing demands of employees, customers, and competitors. Whether we want to or not, we are all in a digital race to use tech and data more smartly to gain an edge over our rivals.
With tech-driven change bursting forth inexorably, it can be hard for business leaders to find their feet in this world of software-driven disruption. Every new initiative, every new piece of tech, every glitzy new methodology is a “must have.” You can’t blame the C-suite for thinking: “Hey, things are complicated enough, won’t adding more new stuff add even more fuel to the bonfire, not to mention cost even more money?” And that’s not even taking into consideration the endless mergers and acquisitions, the organizational restructuring and so on. How do you see the wood for the trees?
When it comes to the enigmatic and mercurial world of software delivery, a blend of visibility and flexibility can destress and even embolden leaders. A modular toolchain infrastructure, underpinned by Value Stream Integration, enables business and IT leaders to steady themselves in this vertiginous economy. Even better, they can see how business value is flowing across the IT organization and can confidently experiment with new tools and processes without interrupting existing workflows and product value streams (also known as a “win-win”).
Crucially, by making this complex process visible, leaders can see what is working (and, significantly, what is not), and can introduce solutions with minimal blowback to end users; no more wild punts and gambles. Instead, they can pursue an informed investment strategy focused on the realities of large-scale software delivery and how to tie its measures to business outcomes.
After all, that’s what digital transformation is all about. It’s not a fad; it’s the difference between survival and prosperity. Ask Nokia, which lost the mobile market it helped create, or General Electric, the last original member to drop out of the Dow due to its failure to transform, about the perils of ill-designed digital strategies and investments.
With the right understanding of how their businesses are delivering value through software, business and IT leaders can collaborate to find their calm in the chaos. And the only thing customers will ever see is their great product working or getting better. Or if there is a problem, these enterprises can fix it quickly while communicating with their customers. Continuous delivery as it should be: uninterrupted, evolving, impactful, delighting.
To obtain this holistic control and innovative environment, we need to consider what constitutes a software delivery value stream. How does value-adding work flow across the process? What are the impediments to acceleration?
A typical software delivery process starts and ends with the customer and involves a complex and implicit network of specialist teams (sales, product, project managers, business analysts, developers, testers, operations, etc.) and domain-specific tools and methods (SAFe, Agile, DevOps) that help plan, build and deliver software. These teams are bound by the work they collaborate on: a knowledge-sharing network of communication centered on four essential units of value:
- Features (business value)
- Defects (quality)
- Risk (security, governance, compliance)
- Technical debt (removal of impediments to future delivery)
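As a toy illustration of this common model (the type names and fields here are invented for this sketch, not taken from any particular tool), the four units of value can be represented as a shared classification that every connected tool’s work items map into:

```python
from dataclasses import dataclass
from enum import Enum

class FlowItemType(Enum):
    """The four essential units of value in a software delivery value stream."""
    FEATURE = "feature"        # business value
    DEFECT = "defect"          # quality
    RISK = "risk"              # security, governance, compliance
    TECH_DEBT = "tech_debt"    # removal of impediments to future delivery

@dataclass
class WorkItem:
    item_id: str
    item_type: FlowItemType
    source_tool: str  # the planning, development, or test tool the item lives in

# Tallying a backlog by type shows the balance of value work across tools:
backlog = [
    WorkItem("F-101", FlowItemType.FEATURE, "planning-tool"),
    WorkItem("D-202", FlowItemType.DEFECT, "test-tool"),
    WorkItem("F-102", FlowItemType.FEATURE, "planning-tool"),
]
counts: dict[FlowItemType, int] = {}
for item in backlog:
    counts[item.item_type] = counts.get(item.item_type, 0) + 1
print(counts[FlowItemType.FEATURE])  # 2
```

Once every tool’s items are normalized into one model like this, flow across the whole network becomes countable and therefore measurable.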
By connecting these domain-specific tools, disconnected by design, organizations can automate the flow of work across their product value streams and make it visible, traceable and measurable. Once this network is connected, the modular infrastructure reveals the intricacies of the process and removes the headache of working out when and where to invest resources. Everything is lit up at the click of a button. No more stumbling around in the dark.
For example, perhaps development teams are pushing IT management for the latest Agile planning tool. However, with a visible value stream, one can see that the work isn’t actually waiting in development, where teams are churning out code like it’s going out of fashion. Instead, the bottleneck is revealed to be further upstream in the business analyst’s tool, where there’s a wait state of five weeks.
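As a rough sketch of how such a bottleneck surfaces once per-stage wait times become visible (the stage names and numbers below are invented for illustration), finding the constraint is just a matter of comparing where work sits idle longest:

```python
# Hypothetical average wait times (days a work item sat idle before being
# picked up), aggregated from the connected tools in the value stream.
stage_waits = {
    "ideation": 3,
    "business analysis": 35,  # five weeks of wait state, upstream of development
    "development": 2,
    "testing": 4,
    "deployment": 1,
}

# The bottleneck is the stage with the longest wait, not the busiest team.
bottleneck = max(stage_waits, key=stage_waits.get)
print(bottleneck, stage_waits[bottleneck])  # business analysis 35
```

The point of the sketch: without this visibility, the loudest team looks like the constraint; with it, the data points somewhere else entirely.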
Five weeks! That’s precious time during which a customer isn’t receiving their product. It’s probably best to start lifting the constraint there before throwing more tech, people, and money at the development teams. Moreover, this visibility leads into the flexibility benefits, into the experimental world espoused by the Three Ways of DevOps: flow, feedback and continual learning. There are myriad benefits to a visible, modular infrastructure that build a strong case for integrating before speculating, two of which will instantly change the way an organization approaches and manages the technical tides of change:
New tools, upgrades, version/API changes
Software tool vendors such as Atlassian can amend their APIs at any moment for multiple reasons. As business needs change rapidly and unexpectedly, developers must integrate new application components into existing architecture to ensure the software is as innovative, functional and responsive as possible for their customers. Robust and reliable enterprise-grade integration between tools supported by 24/7 automated testing ensures that any tool changes are automatically incorporated to eliminate disruption to product value streams.
Accommodating organizational change
Organizations are mutable, living, breathing entities that reflect the people they employ and the work they deliver. As ever, change is constant, especially in a time when larger companies are absorbing other businesses to protect and grow their market share at an incredible rate. In the first nine months of 2018 alone, M&As around the world reached $3.3 trillion in value, a record high. Every M&A brings its own set of headaches to software delivery, such as new teams, tools, workflows, and processes. Through a modular infrastructure, IT leaders can easily plug these elements into existing and new product value streams. This capability mitigates the negative impact of the transition on things like product downtime and employee happiness.
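One way to picture that “plug in” step is a thin connector per tool that normalizes its work items into a common shape, so an inherited tracker joins the value stream without disturbing anything already connected. A minimal sketch, with invented class and tool names (not any vendor’s API):

```python
from abc import ABC, abstractmethod

class ToolConnector(ABC):
    """Adapter that normalizes one tool's work items into a common shape."""
    @abstractmethod
    def fetch_items(self) -> list[dict]: ...

class AcquiredCompanyTracker(ToolConnector):
    # Hypothetical issue tracker inherited through an acquisition.
    def fetch_items(self) -> list[dict]:
        return [{"id": "T-9", "type": "defect", "title": "Login fails"}]

class ValueStream:
    def __init__(self) -> None:
        self.connectors: list[ToolConnector] = []

    def plug_in(self, connector: ToolConnector) -> None:
        # New teams and tools join without touching existing connectors.
        self.connectors.append(connector)

    def all_items(self) -> list[dict]:
        return [item for c in self.connectors for item in c.fetch_items()]

stream = ValueStream()
stream.plug_in(AcquiredCompanyTracker())
print(len(stream.all_items()))  # 1
```

The design choice doing the work here is that existing connectors are never modified when a new one is plugged in, which is what keeps the transition from rippling into product downtime.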
The combination of being able to see how software delivery is creating business value with the ability to minimize the shockwaves of inevitable change is a potent mix. Suddenly disruption doesn’t seem so scary. In fact, this modular, connected infrastructure makes the unknown seem exciting.
To paraphrase Jeffrey Snover of Microsoft, periods of disruption are periods of opportunity. With this in mind, the question is no longer about how to handle the Catherine wheel of disruption posed by tech. Instead, it’s about how you can artfully take advantage of it.