Issue 24: Part 7 - A 5-stage process for Digital Transformation

Democratising and demystifying the process

If this email was forwarded to you, I’d love to see you onboard. You can sign up here:

Sign up now

In trying to explain Digital Transformation to organisations that are interested in it, we technical people tend to lose sight of the big picture and confuse business leaders by burying the discussion in technical jargon. To address this, I thought I’d try to demystify the Digital Transformation process; this is my attempt at that.

On to this week’s issue.


In Issue 22: Part 6 … I introduced a methodology to help you structure your Digital Transformation plans and covered the first two steps: auditing the current situation and segmentation. This issue concentrates on the third step, using a 5-lever framework that focuses on the aspects of Digital Transformation that directly affect your products and services and your internal operations.


Segmentation and Targeting, continued…

Following on from the introduction to segmentation, I thought that a better phrase to explain the process would be “pinpointing opportunities”. The idea is to understand the market in basic terms, terms that have been used for decades in marketing: age, sex, income, location, etc. These demographics provide a basic overview of the type of population that may be inclined to purchase your products and services. In a purely internal transformation project, this will help you get a better sense of your co-workers, but not much more than that.

Which is where the next tool comes in: the Empathy Map. The tool, as we’ve seen, guides you to build up a profile of a potential client or co-worker from their perspective, and more importantly from the perspective of the jobs they’re trying to get done. Take a look back at issues 6 and 7 for an explanation of the JTBD concept, but put briefly: people have things they are trying to accomplish, and to do so they “hire” products and services to help them get those things done more easily, faster, better, or in any other way that reduces friction or adds value to their task.

You can see that from this exercise we can make the leap from blindly observing the work being done to starting to understand the motivations of a client. With a better understanding of these motivations, it is a relatively easy step to identify products or services that respond to them. Let’s look at a couple of examples.

Black & Decker is a good example of a company that thought intensely about the JTBD, and it created a whole new line of personal power tools as a result:

Prior to 1960, handheld electric tools were heavy and rugged, designed for professionals—and very expensive. B&D introduced a line of plastic-encased tools with universal motors that would only last twenty-five to thirty hours of operation—which actually was more than adequate for most do-it-yourselfers who drill a few holes per month. In today’s dollars, B&D brought the cost of these tools down from $150 to $20, enabling a whole new population to own and use their own tools.

Christensen, Clayton M., and Raynor, Michael E. The Innovator's Solution: Creating and Sustaining Successful Growth.

Clearly, technology can influence your thinking and hence any ideas you may have on doing the JTBD better, but it’s important not to lose sight of the goal of the exercise: better understanding the end result the user is trying to achieve.

The five levers of Digital Transformation

Now that we have clear insights into the areas that are most ripe for Digital Transformation, either in your organisation or by means of new or improved products and services, we can look at the different levers at our disposal to develop an execution plan to implement at a practical level. I personally tend to use a set of 5 levers, but there are surely many more, some of which I haven’t thought about and some that are outside my field of expertise. These 5 basic levers do, however, cover most scenarios and are a great starting point for anyone wishing to achieve results quickly and easily.

The 5 levers are: Innovation, Value Proposition, Customers, Competition and Data.

I’ve talked briefly about all of them over the course of this newsletter, but it is here I will outline them in greater detail, hopefully to contextualise them in Digital Transformation project terms. I’ll start with Innovation and the next issue will concentrate on the Value Proposition.

Innovation

When we talk of innovation, we tend to think of the finished product, like when the initial iPhone was unveiled. It was a stunning “innovation” and completely trumped what the market at that time had to offer… in some ways. Interestingly, it is best explained by Disruption Theory: the iPhone entered the market with fewer features but did the JTBD brilliantly, so brilliantly in fact that it quickly ate up market share, to the dismay of all the incumbents in the market at the time.

But innovation isn’t the product; innovation is the process. And innovation doesn’t have to produce a physical product at the end of the process; it can simply be a better way an organisation works internally, for example something as simple as better stock management in the warehouse. To cut it down to its basics, innovation is any change in a business process, product or service that adds value. Innovation is, similarly, not limited to the development of something new and exciting; it can be an incremental change that produces better outcomes for the organisation. Lastly, innovation is a continuous process. Once you have innovated, the work doesn’t stop there. Remember, the new new becomes the new old very quickly in these times!

The innovation process has itself been transformed, mostly by digital technologies, but also by the research and learnings from the likes of Alexander Osterwalder and Yves Pigneur, not to mention Clayton Christensen et al. Traditional innovation was focused entirely upon the finished product or service and was thus difficult to test without incurring large costs. The result? A fear of failure was at the heart of most decision-making processes, with financial controllers keen to avoid large sunk costs, and innovators and managers keen to avoid looking bad in front of the CEO when it all went south. Finally, the testing process was often slow and cumbersome, dissuading true innovation.

The innovation process should be more scientific, and the digital age helps us do just that. Decision making in the pre-digital age was often driven and arbitrated by seniority, testing was done on a minimum-required basis (being that it was slow and expensive), only experts were charged with experimentation, and the finished product took centre stage, setting the results up for failure.

The digital age affords us, in many cases, the possibility to focus on a minimum viable product to get something out of the gate quickly and inexpensively, to test more frequently and cheaply, and to empower the users of the products themselves to make the decisions, or at least influence the product direction. The focus is, clearly, on serving the JTBD better.

In the enterprise, the buyer is generally not the user of the product. The digital age has turned that on its head, with end users dictating what it is they want to use. I’ve participated in numerous IT projects over the last 25 years where the company had decided on a product that was less than well received by the users and consequently never achieved the ROI estimated in the sales process. It was often put down to change management being the problem. I’m sure that in many cases it was not only change management: because the product didn’t do the JTBD better, faster or more simply than the incumbent product or process, it was rejected and ignored by the very people it was designed to aid.

How do you innovate?

The process is actually quite simple in and of itself, and it builds upon the previous work as outlined in Issue 22.

  1. List the opportunities unveiled by the empathy map and the understanding of the JTBD.

  2. Rank the opportunities using a classification system (1 to 5, for example) for each criterion you’re looking to improve: speed, simplicity, etc.

  3. Total the rankings to help you choose the priority opportunities.

  4. Do you need to collect data whilst testing? The answer is generally yes. Decide how you are going to do this.

  5. Design the KPIs to use to validate the results.

  6. Design and build the Minimum Viable Solution (MVS).

  7. Test it with the final user.

  8. Observe the use and collect the data in accordance with point no. 4 above.

  9. Analyse the results to reach your conclusion.

  10. Using the data and the results, iterate on the MVS to either develop a better solution or conclude that the opportunity cannot progress.

  11. Re-test the new MVS or restart the process from a different opportunity you have identified.

The information learned from the previous stage should give you enough to list each opportunity you have identified as a potential target for improving the JTBD. These need to be listed in basic and functional terms; remember, the goal is to unveil opportunities for making the JTBD better. Example: an opportunity to automate the process of contacting third-party service partners to intervene in the on-site repair process.

Classify the list of opportunities based on the criteria: cost, speed, simplicity and quality. Each opportunity should be scored on the basis of your assumptions about the level of benefit each criterion will afford. Example: we believe the cost of the new widget will be half the cost of the current one, therefore we score it 5 out of 5 for the opportunity of Cost Saving. Note that the score doesn’t have to be perfectly accurate as long as it is grounded in reality; it will be tested later in the methodology, so don’t let it stop you from moving forward.
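To make the scoring and ranking concrete, here is a minimal sketch in Python. The opportunities, criteria and scores shown are purely hypothetical assumptions for illustration, not taken from a real project; a spreadsheet works just as well.

# A minimal, illustrative scoring sketch. The opportunities and scores
# below are hypothetical assumptions, not real project data.

CRITERIA = ["cost", "speed", "simplicity", "quality"]

opportunities = {
    "Automate third-party partner contact": {"cost": 5, "speed": 4, "simplicity": 3, "quality": 3},
    "Improve warehouse stock management":   {"cost": 3, "speed": 3, "simplicity": 4, "quality": 2},
}

# Total each opportunity's scores and rank the highest total first.
ranked = sorted(opportunities.items(),
                key=lambda item: sum(item[1][c] for c in CRITERIA),
                reverse=True)

for name, scores in ranked:
    print(f"{name}: {sum(scores.values())} / {5 * len(CRITERIA)}")

The totals simply make it easier to pick the priority opportunities; they are assumptions to be tested, not conclusions.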

Determine if a data collection strategy is required. Most opportunities will need some form of data analysis to confirm or deny your hypothesis. Determine how you will collect that data. Example: the application will automate the ordering process and increase efficiency by an estimated 15%, saving approximately 2 minutes per transaction. We need to integrate into the application a timer routine to verify this hypothesis. If the process is manual, we need to observe real-world usage, i.e., sitting next to the user to time the operations. Don’t get hung up on elaborate data capture strategies; we only need basic data at this stage. If you need to build a MongoDB database to handle it, you’re doing it wrong for the moment!
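As a rough sketch of what such a basic timer routine could look like (in Python, with the log file name and the work being timed as illustrative placeholders), something this simple is usually enough at this stage:

# A minimal sketch of timing transactions and logging them to a CSV file.
# LOG_FILE and process_order() are hypothetical placeholders.

import csv
import time
from datetime import datetime

LOG_FILE = "transaction_times.csv"

def record_transaction(process_order):
    """Run one transaction and append its duration (in seconds) to the log."""
    start = time.perf_counter()
    process_order()  # the real work being measured
    duration = time.perf_counter() - start
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(), round(duration, 2)])
    return duration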

Determine the indicators that you will need to measure in order to validate or invalidate your tests. If your MVS is designed to save time, you need to measure the time spent as compared to the old solution. In the above example, the opportunity exists to save time, so measure the current time to process the transactions and compare it against the time taken when the MVS is introduced. A good, simple test plan at this stage will help you as you go through the innovation cycles that you will inevitably have to run.
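Validating that kind of KPI can be as simple as comparing the average time per transaction before and after the MVS is introduced. A minimal sketch, with entirely made-up numbers and an assumed 15% target:

# A minimal sketch of the KPI check. The timings and the 15% target
# are illustrative assumptions only.

baseline_times = [132, 140, 128, 150, 137]   # seconds per transaction, current process
mvs_times = [111, 118, 109, 125, 115]        # seconds per transaction, with the MVS

baseline_avg = sum(baseline_times) / len(baseline_times)
mvs_avg = sum(mvs_times) / len(mvs_times)
improvement = (baseline_avg - mvs_avg) / baseline_avg

print(f"Average time saving: {improvement:.0%}")
print("Hypothesis validated" if improvement >= 0.15 else "Hypothesis not met")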

Develop a minimum viable solution (MVS) that responds to the basic needs of the opportunity revealed in the previous process. It doesn’t have to be perfect. It doesn’t have to resemble a finished product and it doesn’t have to be pretty. For example, if it’s a physical product you’re testing, look at 3D printing as a quick and dirty way to manufacture a “good enough” prototype to be put into the hands of the user.

Test it with the customer profile you previously identified, and do not be tempted to test it with your design team or friends and family; they’ll want to be nice to you and will often subconsciously not give you the feedback you’ll need. Be careful also to explicitly state that it is a “beta” or “test” in order to correctly set the expectations of your testers. You may need to incentivise the users to get them to use it, but try to avoid doing this too much in order to prevent skewed results and biases. A recent study at ESSEC Business School showed that students’ evaluations were skewed in favour of the school if two conditions were true: they were surveyed after having their grades, and their grades were good.

Observe obsessively how they use the MVS, collecting data either manually or automatically on its use. This is a delicate one when it comes to testing software, particularly if it’s software designed to be used for personal purposes. Be very clear and explicit and get upfront approval from the users as to what you will do with the data. I would suggest getting professional legal advice on the terms and conditions, and additionally professional advice from a GDPR specialist, particularly if you’re dealing with European testers and handling personally identifiable information.

Analyse the results to determine the ultimate success or failure of the test. If you have set the expectations correctly and developed the right indicators to measure, at this stage it should be a simple matter of collating the data and gauging whether the measured indicators are being met, surpassed or not. Again, at this early stage it is only important to validate the initial hypothesis.

Iterate on the MVS using the knowledge collected from the testing process. What I mean by that is that the data you have now may help you tweak your design to do the JTBD even better. It’s tempting to restart the whole process to get better and better, but keep in mind that at some point you need to release the product, and if it already does the JTBD better than the incumbent, clean it up and release it now. Further iteration will come further down the Digital Transformation process, so there will be plenty of opportunity to improve upon the product or service. For now, iterate only if the test shows promise but doesn’t fully meet your expectations.


I hope this helps you start your own Digital Transformation process. If you have any questions or want to discuss your own projects, please let me know, I’d be only too happy to see how I can help out.


Reading List


Tomorrow’s Digital Transformation Battles Will Be Fought at the Edge

IoT will be a significant Digital Transformation enabler – enabling new opportunities to integrate digital capabilities into the organization’s assets, products and operational processes in order to improve efficiency, enhance customer value, mitigate risk, and uncover new monetization opportunities. 

IoT value creation occurs when the IoT Analytics collide with IoT Applications (like predictive maintenance, manufacturing performance optimization, waste reduction, reducing obsolete and excessive inventory, and first-time-fix) to deliver measurable sources of business and operational value 

I’ve not talked much about IoT in this newsletter; I’m hoping to rectify that over the coming months, however. Don’t touch that dial.


Meet Morgan Beller, the 26-year-old woman behind Facebook's plan to make its own currency

I’ll not lecture you (again!) on my personal feelings about Facebook… This article is worth a read though, even if only to get a better picture of the people behind the (I believe, ultimately doomed) product, Libra. See the next article!


Facebook is backpedaling from its ambitious vision for Libra

As ever, Ars Technica does a great write-up on the subject:

So Facebook's challenge in the coming months is to design a new network architecture that strikes a reasonable balance between these competing objectives—a network that is locked-down enough to satisfy regulators but open enough to attract a healthy developer ecosystem.

The Future is Digital Newsletter is intended for anyone interested in learning about Digital Transformation and how it affects their business. I strongly encourage you to forward it to people you feel may be interested; go on, help them out :-)

You can read all the free back issues here:

Back Issues

Thanks for being a supporter, have a great day.

Matthew Cowen @matthewcowen