Matthew Cowen
  • The state of Digital Transformation in 2020 and some follow-up on the disruption of Intel

    My apologies for the long lapse between essays. If I’m honest, the world has been just too much to afford me the mental space to think, write, edit and record them. I’ve taken on more responsibilities and simultaneously fallen down the rabbit hole of the US presidential elections. And, despite not being directly affected by the ousting of the “Great Orange Liar”, I can’t help but be touched on a personal level. Seeing a once-great nation disintegrate in front of my eyes bothers me greatly. That’s all I’ll say for the moment.

    On a more positive note, as we draw ever closer to deploying what might be a solution to our number one problem, the novel coronavirus, I thought I’d take a look at what the pandemic has exposed and what digital transformation means today.

    One last note on housekeeping: the keen-eyed will notice that I have moved this newsletter to my company’s domain (dgtlfutures.com). Everything else stays the same, and all the existing links still work. It is now easier to find, at newsletter.dgtlfutures.com


    The state of Digital Transformation in 2020

    When the pandemic appeared to be a real threat to the countries in which it took hold, many, including myself, estimated that this might be the impetus required to get companies to start, deploy and finish their digital transformations. The reality has turned out quite differently, and it comes back to some difficulties I’ve been discussing for years.

    Let’s take a look at where we are in digital transformation today. I’ll go on to show what is missing, what is needed and how we get there.

    The first wave - The implementation of computerised back-end systems

    The likes of IBM with their AS/400 and DEC with their VAX minis and mainframes were the masters of this. Not because it was hard, but because it was easy and it was a licence to print money. Ever since the first VisiCalc spreadsheet showed the CEO that he or she didn’t need to manually calculate the columns and rows themselves (or have one of the minions do it), it was inevitable that computerised systems would penetrate deeper and deeper into businesses. Those that could afford the outrageous costs of the time were given an advantage that had not been seen since the invention of the wheel (the innovation that disrupted the movement of atoms from one place to another).

    Quickly, IBM and their competitors spun up massive sales teams that crisscrossed the globe demonstrating and selling stock control systems, basic accounting ledgers and simple statistical reporting programs. On the back of this, software-focused companies ramped up work producing ever more complex designs that piggy-backed on the already-deployed hardware. Being that the business model of the IBMs and DECs of the era was to sell high-margin hardware and lucrative support contracts for that hardware, they were pleased to let the software houses integrate their software as it made the hardware even more desirable.

    This symbiotic relationship even gave rise to the “Killer App” moniker we use today.

    The businesses that needed to make ever-quicker decisions wrote practically open cheques to the manufacturers, as they were confident that the returns on the investments would outdo the spend. And they were right to believe this, as the entire industry structured itself to become a self-fulfilling prophecy.

    A long time ago, I interviewed for a support role at one of the world’s largest banks, in their London office. I was applying for a position as a support engineer who would be dispatched to the trading floor. I was genuinely intrigued by the floor and asked if it would be possible to see the environment in which I would be working. After a short pause, the IT manager agreed to the unusual request and led me down the stairs to the big oak doors that displayed an ominous sign on a brass plate. “Do Not Feed The Animals”, it read. I chuckled and braced myself for the spectacle that was a trading floor in the early 90s. I’ll tell that story another day. But what I most remember is that each trader had two complete AS/400 systems under their desk. This, in a room with perhaps 60 to 100 people in it.

    That would be a multi-million-pound budget at today’s rates. But this was standard issue, as the opportunity cost was so high for traders who were even slightly slower than their competition during trading hours. Their killer app was the trading platform, which operated on super-slim timing to augment the trader’s abilities.

    The second wave - The paperless office

    After the fury of this first wave of digital transformation, it was clear to businesses that they needed to go further, and hunting season was declared on paper—a hunting season that has not, by any stretch of the imagination, finished yet even in 2020.

    But that is beside the point.

    End-user facing documents and reports were the next low-hanging fruit, and businesses proceeded to digitise these objects as and when they could, based on the technologies at their disposal. Very few companies bought software with the sole intention of moving paper reports to digital versions of themselves. It was just the icing on the cake for most. So now, timesheets, TPS reports (see Office Space) and countless other document types were converted.

    Operators would export data from ERP systems like SAP, import it into Excel and produce the charts that ended up in Word and PowerPoint documents. But even at this stage, paper wasn’t entirely eliminated, as often these reports (and I see this still today) were printed out in colour and distributed manually or by mail (the physical internal and external postal systems) for analysis and treatment. At some point, people realised that this could be made more efficient. With internal email systems gaining popularity, sending the PPT over email became the method of choice, enabling faster and better “collaboration”.

    I used the quotes because, by today’s standards, this could hardly be further from what collaboration really means. The back-and-forth of individually saved and edited documents on the network led to an exponential growth in data storage needs, for both the email systems and the personal data storage systems.

    When working on sophisticated archiving systems, I discovered that it was not uncommon to have approximately 100 copies of the same document on the network. That email sent to 20 colleagues and saved by each on their “personal space” produces, in one step, 41 copies: the sender’s original, 20 mailbox copies and 20 saved copies!
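    A quick back-of-envelope sketch of that blow-up, with illustrative numbers of my own:

        # Illustrative numbers: how one emailed attachment multiplies
        # across mailboxes and "personal spaces" on the network.
        recipients = 20
        attachment_mb = 5

        # Sender's original + one copy per inbox + one saved copy each.
        copies = 1 + recipients + recipients
        print(copies)                        # 41
        print(copies * attachment_mb, "MB")  # 205 MB of storage for one 5 MB file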

    The third wave - The age of collaboration

    When the limitations of this model became apparent, and the software had caught up, we started to build out specific collaboration software to address them.

    As a side note, it must be said that IBM was (not for the first time) way ahead of this curve. Whilst the likes of Novell and Microsoft were supplying the pipes to connect businesses with unstable and simplistic networking hardware, IBM bought Lotus and extended a virtually limitless collaboration system that was too much, too early. Lotus Domino was the first “proper” collaboration tool that let businesses easily deploy just-what-was-needed software to decision-makers. It included storage, sharing, app-building and elementary database capabilities that were far ahead of the curve at the time. Its complexity and, frankly, hostile user interface were part of its downfall, but it was an essential step in building the interface between computing and business.

    Fast-forward to pre-pandemic and the state of collaboration today. Companies that had implemented the new generation of basic collaboration systems could provide some semblance of business continuity, whilst those who hadn’t yet taken those steps scrambled to implement tools, shoehorning them into day-to-day operations, with varying degrees of success it should be noted. The pandemic has forced many companies to evaluate whether the tools work. They do, and surprisingly well, as they are built on years of research and experience testing. But forcing them onto unprepared staff will only expose the frailty of your operations if you don’t do the hard work of real digital transformation.

    But back to the pandemic. Businesses that have been forced to close their doors to the public in their offices and shops have turned to their most pressing problem: managing the customer relationship. How do I sell to someone who would previously wander around my shop for 15 minutes before picking an item and purchasing it?

    Facebook and WhatsApp, for example, have provided a means to interact with the client and possibly drive some sales. I’d argue that those sales were probably not lost in the first place, but let’s not split hairs. However, they don’t address the fundamental problem of a wholesale shift in the customer journey from discovery to purchase and beyond. Plasters on a broken leg might make you feel a little better, but they don’t repair the break! So, as we progress through the pandemic, most are preoccupied with the customer-facing elements, ignoring the opportunity to implement real change that would set them up for the afterworld.

    In trying to schematise the idea, I’ve settled on three blocks: the back-end, the operations and the customer-facing parts.

    [Diagram: the three blocks of digital transformation: back-end, operations and customer-facing]

    Source: Matthew Cowen

    The back-end has been deployed for many years and is efficient at what it does. What it doesn’t do, however, is the problem we have today. Legacy AS/400 and DEC systems are still prevalent all around the world, and they are notoriously difficult to interface with, notoriously poor at real-time and notoriously poor at providing reusable data for business intelligence, or BI.

    But the customer-facing elements are the new centre of focus. It doesn’t take much work to find hundreds and hundreds of businesses that will increase your visibility online and help to market and promote your wares, and that’s just in our region. If you think about Internet assumptions, each one of these businesses competes with the millions around the world providing the same thing. That’s the reality of the Internet. But let’s not get hung up on that; let’s focus on the value-added service they provide in a world where it is increasingly challenging to be found. This value-add is limited in that it doesn’t interface well with the real issue for businesses tackling digital transformation: the operations and the back-end. You can sell online, great, but how does that affect the whole value chain and all the interlacing parts of your business?

    The fourth wave (we’re not here yet) - Reimagining the value chain

    The elephant in the room is that big block in the middle: Operations. Real digital transformation comes from looking at the whole can of worms that makes up your daily operations, from the simple tasks right up to the complex multi-layer, multi-purpose and multi-approval workflows.

    I’m not diminishing the value of the fire-fighting going on today in any way (it is a case of survival in many instances), but the real work should start now. Businesses need to evaluate in detail every single process that makes up their very existence. In assessing, they need to determine three things: 1) the reason a process exists: is it there simply because that’s the way it has always been done? 2) the worth of the process, or to put it another way: what value does that process bring to the business? And 3) the justification for the process: which is not the reason, nor the individual value, but an evaluation of how it fits into the whole. Is the whole greater than the sum of its parts?
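    As a sketch of what such an audit could look like in practice (the fields and examples are my own illustration, not a formal method):

        # Hypothetical audit record for the three questions above; the
        # fields and examples are illustrative, not a formal method.
        from dataclasses import dataclass

        @dataclass
        class ProcessAudit:
            name: str
            reason: str       # 1) why does the process exist?
            value: str        # 2) what does it bring to the business?
            fits_whole: bool  # 3) does it still make sense in the whole?

        audits = [
            ProcessAudit("Weekly sales print-out", "always done this way",
                         "none identified", fits_whole=False),
            ProcessAudit("Customer credit check", "reduces bad debt",
                         "protects cash flow", fits_whole=True),
        ]

        # Candidates for redesign or elimination fall out immediately.
        for audit in audits:
            if not audit.fits_whole or audit.reason == "always done this way":
                print("Re-examine:", audit.name)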

    Redesigning and redeploying those processes is necessary, and it is the only path to digital transformation that will bear fruit in the future. It will undoubtedly call your back-end into question and will almost certainly change your front-end. And as hard as it will be, it will likely be the difference between your business’s prosperity and its ultimate demise.


    Intel’s Disruption Follow-up

    I’ve been studying and writing about the disruption taking place around Intel’s x86 line for several years. I’ve written substantially about it in this newsletter (here and here) and in unpublished form. I explained why it is hard to spot disruption, even when it is happening in front of us. It becomes doubly hard when we are focused on keeping our businesses alive, as so many of us are in this pandemic:

    From Intel’s Pohoiki Beach and Disruption Theory:

    The problem with theories like these is that it is pretty much the same problem we have when we discuss human or animal evolution. We find it hard to understand the future direction of the evolutive process in real-time, mostly because it happens so slowly and over many generations. From a retrospective position, we can see what happened, and we can often even have an informed guess as to why it happened. Reliable prediction, it seems, is just unreachable. With modern digital technologies and the pace at which they evolve, we might be able to see enough into the future to discern and predict outcomes for companies in this new world.

    I described the history behind Intel’s disruption:

    Intel is a well-run, long-established microprocessor design and fabrication company, with a phenomenal marketing arm and deep links to the most important companies in the computing industry. Founded in 1968, a year before AMD, it has run head to head with AMD and in nearly every battle beaten it on just about every level that means anything: marketing, price, availability and design, to name a few.

    The new entrant in the microprocessor market is known as ARM, previously Advanced RISC Machines and before that Acorn RISC Machine (giving you an idea as to its origins: it powered Acorn Archimedes personal computers). Founded in Cambridge, UK, in 1990 (22 years after Intel), it produced a processor design that was a complete revolution and rethink of classic processor design, with the clue in the company’s original name: RISC.

    RISC means Reduced Instruction Set Computer. The instruction set of a processor is fundamental to how the processor behaves and, more importantly, how it is directed to do things, which is more commonly known as being programmed (modern terms such as coding mean essentially the same thing). Different microprocessors can have the same instruction set, allowing programmers to write the same code, or allowing compilers (software designed to turn human-readable code into native machine language, which is virtually impossible for a human to understand) to translate it into the same instructions.

    Compared to ARM, Intel microprocessors are CISC: Complex Instruction Set Computers. Without going into microprocessor design, an instruction is a type of command run by the processor to achieve a desired outcome, like a multiplication, division or comparison. Complex instructions can be of variable length and take more than one processor cycle to execute (processor cycles govern how fast the microprocessor can operate), but they are more efficient in memory (RAM) usage. RISC instructions are simpler and standardised and, critically, take only one processor cycle to execute. The trade-offs are that they manage memory (RAM) less efficiently and require the compilers to do more work when translating code into machine language, i.e., potentially slower development times whilst waiting for compilation to finish.
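    To make that difference concrete, here is a toy model in Python. The instruction names and cycle counts are illustrative assumptions, not real chip timings:

        # Toy model of the CISC vs RISC trade-off. Cycle counts are
        # illustrative assumptions, not measurements of any real chip.
        cisc_program = [
            ("MUL [a], [b]", 4),    # one fat instruction, several cycles
        ]
        risc_program = [
            ("LDR r1, [a]", 1),     # load first operand from memory
            ("LDR r2, [b]", 1),     # load second operand
            ("MUL r0, r1, r2", 1),  # single-cycle multiply
            ("STR r0, [c]", 1),     # store the result back
        ]

        def cycles(program):
            return sum(cost for _, cost in program)

        print("CISC:", len(cisc_program), "instructions,", cycles(cisc_program), "cycles")
        print("RISC:", len(risc_program), "instructions,", cycles(risc_program), "cycles")

    Same work either way: fewer, denser instructions and smaller code on the CISC side; more, uniform, single-cycle instructions (and simpler hardware) on the RISC side.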

    The memory issue was only an issue until recently: memory has become effectively abundant and cheap, allowing hardware designers to incorporate huge amounts of RAM in their designs.

    At the beginning of my writing, I tried to frame it in terms of Clayton Christensen’s Disruption Theory, an observation that was, at that time, not frequently put forward. A recent blog post on Medium by one of Christensen’s students vindicated what I’d been writing about for a long time. Have a read if you’re interested in the theory; he does a great job of articulating it.

    But it got me thinking about how one could spot disruption or, more importantly, how one could use the framework to try to provoke disruption. In this essay, I’m delving into that first point.


    How do you spot disruption?

    Going back to basics, and ignoring all the incorrect uses of the word (no, disruption is not just doing the same thing but cheaper), I thought I’d try to give you the tools to see disruption happening in your markets.

    The big red flag to look for is a product or service that does some of what your existing product or service does, but does it better on a few simple metrics: price, efficiency and friction.

    Cheaper is not disruption

    A less-expensive product alone is not a sign of disruption. What is, however, is a product that evolves steadily, providing more and more features and competing on more and more parts of your product or service, all whilst its price is significantly lower than yours. Then again, it might just be a cheaper product to produce and sell. You have to ask from the customer’s point of view: is it fulfilling the job to be done? Is it good enough that it makes your buyers question the need to pay your prices? Can your customers successfully integrate and use the competing product, despite its shortcomings compared to yours, and still find value?

    Efficiency is key

    If the competing product is more efficient, be that in the sales cycle, the use cycle or the complete lifecycle of the product, buyers will, of course, investigate and evaluate that product. Can the product do essentially the same job as the existing product on the market? Does it do this faster and better, with more predictable outcomes? Efficiency might afford your competitor lower costs too.

    Friction is inversely proportional to sales

    Friction is often overlooked as an important force influencing buying habits. The more you reduce friction in the purchase and reception of the product, the more likely the buyer is to choose your product over a competing one. If a new entrant has substantially reduced this friction compared to the friction involved in purchasing your product, that should be another indicator that things might be difficult for you from this point forward.

    Loyal customers may soften the impact at first, but even the most faithful will switch if the product fulfils several of the criteria I have highlighted above.
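    Pulling the three signals together, a crude early-warning checklist might look like this (the thresholds are entirely my own, for illustration):

        # Crude snapshot check on the three metrics above. A real analysis
        # would track these over time; thresholds are my own invention.
        def disruption_signals(price_ratio, efficiency_gain, friction_drop):
            # price_ratio: competitor price / your price
            # efficiency_gain: competitor efficiency / yours
            # friction_drop: fraction of purchase friction they removed
            signals = []
            if price_ratio < 0.5:
                signals.append("significantly cheaper")
            if efficiency_gain > 1.2:
                signals.append("more efficient")
            if friction_drop > 0.3:
                signals.append("much lower friction")
            return signals

        print(disruption_signals(price_ratio=0.4, efficiency_gain=1.5,
                                 friction_drop=0.5))
        # ['significantly cheaper', 'more efficient', 'much lower friction']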

    But that alone is not enough; you need historical data, or you need to develop a way to project into the future and make assumptions about where you think the product or service is going, or where it could go. Just look at the image explaining disruptive innovation below. Comparing the “Entrant’s disruptive trajectory” line with the “Mainstream” line shows how, eventually, a disruptive innovation surpasses the mainstream product or service when “performance” is evaluated on the metrics I’ve highlighted above (to name a few).

    [Chart: the disruptive innovation model, entrant and mainstream trajectories]

    Source: HBR
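    You can reproduce the shape of those two lines with a couple of assumed improvement rates; the numbers below are invented purely to show the crossover:

        # Invented numbers to illustrate the crossover in the HBR chart.
        incumbent, entrant = 100.0, 40.0  # arbitrary starting "performance"
        year = 0
        while entrant <= incumbent:
            incumbent *= 1.05  # assumed 5% yearly improvement (sustaining)
            entrant *= 1.25    # assumed 25% yearly improvement (disruptive)
            year += 1
        print(f"On these assumptions the entrant overtakes in year {year}")  # year 6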

    If you are the incumbent and you wish to stay profitable and dominant, you have but two choices: embark on a “sustaining trajectory” or launch a disruptive innovation of your own. Just be aware, a sustaining trajectory has its upper limits!


    The Future is Digital Newsletter is intended for anyone interested in Digital Technologies and how they affect their business. I’d really appreciate it if you would share it with those in your network.

    Share The Future is Digital

    If this email was forwarded to you, I’d love to see you on board. You can sign up here:

    Subscribe now

    Visit the website to read all the archives.

    Thanks for being a supporter, have a great day.

    → 3:43 PM, Nov 17
  • Digital Commerce. Blockchain (again)

    If you listened to the podcast version, you’d note I added some music. I broke out my skills in Garageband to make a quick accompanying jingle to spice up the podcast. Let me know what you think. 🎵

    On to this week’s topics.

    I’m astonished I didn’t get roasted for completely dissing Blockchain as a useless technology a couple of weeks ago. I thought I’d talk about an example where I’m actually quite bullish about the technology. But first, I wanted to expand on a thought following a conversation I’d had with someone about why digital commerce is so different from brick-and-mortar commerce. Read on for my thoughts.


    The unique challenges of digital commerce for physical goods

    The difficulty for online retailers selling physical goods in the digital economy is that the value of digital products is significantly reduced, and in some cases virtually zero. That inherently puts pressure on the value of physical goods that are commoditised. Fortunately, luxury goods are seeing less pressure on their perceived value, but that is more a function of time than of real value.

    Luxury goods houses and retailers are seeing these changes and are starting to act. Apple has now changed its retail sales processes to resemble a luxury brand’s one-to-one service rather than a Walmart get-it-off-the-shelf-yourself operation.

    From Apple’s press release:

    When iPhone 12, iPhone 12 Pro, and iPad Air are available Friday, October 23, customers can get their all-new products directly from Apple through tailored purchase experiences offered online, by phone, or in store. From a chat session with a Specialist that starts online and finishes with contactless delivery, to visiting select Apple Store locations for a one-on-one session with an Apple Specialist, customers can find the best way to get the products they’re looking for.

    [Infographic: Apple’s new ways to shop for iPad Air, iPhone 12 Pro and iPhone 12]

    Source: Apple

    So, the question is, how do businesses make money now that the products they sell are virtually worthless (economically speaking)?

    Answers to some of that lie in the services surrounding the product. Be that sales (see above), installation/delivery, support, subscription or ongoing help, the key lies in giving the customer an ‘experience’ rather than a sale. Deriving value comes from developing and innovating on several levels to create a whole greater than the sum of its parts.

    If we look at the above example from Apple, you’ll note they’re selling a phone. But that phone is so much more than a simple “… widescreen iPod with touch controls, a revolutionary mobile phone and a breakthrough internet communicator.” The iPhone has, by some estimates, replaced over 50 products: mobile phone, point-and-shoot camera, torch, calendar, SatNav and personal assistant, to name a few. But it is a product in a sea of other products designed to resemble each other closely. There is little material difference between the flagship Samsung and Google devices that compete directly with the iPhone. And smartphones themselves are becoming commoditised, as evidenced by the ever-smaller gains made in hardware design and the technologies deployed. The differentiator is the software and what that software can enable the hardware to do for the overall end-user experience.

    Two excellent examples are computational photography and health analytics.

    In computational photography, we are nearing the phase whereby even the specialist cameras of the past are being innovated out of existence. It is only a matter of time before the computational aspect overcomes the pure optical limitations of smartphone camera modules. In health, simple movement sensors initially enabled step tracking, instantly killing a growing market segment, and eventually enabled detailed sleep tracking that has itself been out-innovated by smartwatches. It is no longer science fiction to imagine the doctor’s office on your wrist.

    The overall experience of owning these products, and their potential beyond the initial use-case, is what I mean by ‘customer experience’. Apple has gone a step further and developed a Covid-friendly, white-glove shopping experience previously reserved for the rich and famous. You book a one-to-one either in-store or directly on the Apple Retail site, and you are led through your purchase to contactless delivery or pick-up. It is a personal shopping service for the rest of us.

    Who doesn’t want to be made to feel special when buying something?


    Blockchain. Again

    I’m surprised I didn’t get more of a roasting for my somewhat sceptical articles about Blockchain here and here:

    According to a detailed academic-style “peer-reviewed” study by the Centre for Evidence-Based Blockchain and reported in the FT:

    “… outside of cryptoland, where blockchain does actually have a purpose insofar as it allows people to pay each other in strings of 1s and 0s without an intermediary, we have never seen any evidence that it actually does anything, or makes anything better. Often, it seems to make things a whole lot worse.”

    Worse, the report repeatedly highlights that the technology is a solution looking for a problem: the antithesis of the Jobs to be Done theory that helps us better design and provide solutions. With over 55% of projects showing no evidence of useful outcomes, and over 45% showing “unfiltered evidence” (i.e., next to worthless), it would appear that Blockchain is a belief system rather than a technological solution.

    And …

    …it is a huge energy consumer and hence by definition is inefficient. That, sadly, is not its only efficiency problem. Blockchain is actually extremely limited in its speed and quantity of transactions and scales poorly. So much so that in 2016 several banks exploring the possibility of using the technology in the personal and business banking sector abandoned the work as blockchain was just too slow.

    Quite the downer, if I’m honest. But whilst many projects show no use for Blockchain, some show promise. One such example comes from Dominica’s Economic Growth team at the Climate Resilience Execution Agency for Dominica (CREAD).

    They are currently developing a parametric insurance product that uses blockchain technology to help small businesses, and those typically underserved by traditional insurance products, manage their risk of natural disasters in an innovative way. It’s called BHP, or Blockchain Hurricane Protection. From the article on LinkedIn:

    BHP aims to extend coverage to those excluded from traditional indemnity insurance, and provide Dominicans with a flexible and affordable tool for managing climate risk. Total damage from Hurricane Maria which struck the island as a category 5 storm on September 18, 2017 was US$1.3 billion, representing 226% of GDP. Uninsured losses were US$1.1 billion, or 86% of total damages. Damages to the housing sector totalled US$520 million. MSMEs suffered US$73 million in damages, and agriculture also suffered US$73 million in damage. In the years since Hurricane Maria, premiums for traditional indemnity policies have increased by more than 80%.

    This was the background to Dominica’s drive for innovation to better protect itself after multiple incidents that substantially affected citizens and businesses over the last decade.

    So, what is parametric insurance, and why blockchain?

    From Wikipedia:

    Parametric insurance is a type of insurance that does not indemnify the pure loss, but ex ante agrees to make a payment upon the occurrence of a triggering event. The triggering event is often a catastrophic natural event which may ordinarily precipitate a loss or a series of losses. But parametric insurance principles are also applied to agricultural crop insurance and other normal risks not of the nature of disaster, if the outcome of the risk is correlated to a parameter or an index of parameters.

    What’s great about this project is that it is intelligently using technology in the right places to fulfil the “Job to be Done”. Again, from that LinkedIn article:

    Once customers have downloaded the mobile wallet to their smartphone, they simply indicate the level of coverage that they would like to purchase, tag the location where they want the policy to apply, enter some basic information, and pay the premium. The policy is then issued and stored in the blockchain. In the event of a triggering event that meets the criteria of the policy, the payout is generated automatically and delivered to the customer's mobile wallet within three days of the triggering event.

    This product reduces friction at the critical stages of the insurance lifecycle: the signup and the payout.
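    A minimal sketch of that difference in Python, assuming a made-up wind-speed trigger (the names and thresholds are mine, not BHP’s):

        # A minimal sketch of a parametric payout, assuming a wind-speed
        # trigger. Names and thresholds are hypothetical, not BHP's.
        from dataclasses import dataclass

        @dataclass
        class Policy:
            holder: str
            location: str
            coverage: float        # payout amount agreed up front
            wind_trigger_kmh: int  # parametric trigger: sustained wind speed

        def settle(policy: Policy, measured_wind_kmh: int) -> float:
            # Payout is automatic once the trigger is met: no assessor,
            # no claims process, no declaration of disaster to wait for.
            if measured_wind_kmh >= policy.wind_trigger_kmh:
                return policy.coverage
            return 0.0

        policy = Policy("MSME in Roseau", "15.30N, 61.39W",
                        coverage=10_000.0, wind_trigger_kmh=178)  # ~category 3

        print(settle(policy, measured_wind_kmh=250))  # 10000.0, paid automatically
        print(settle(policy, measured_wind_kmh=120))  # 0.0, trigger not met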

    You’d be right to ask why a “normal” insurance product couldn’t do the same. And there’s no reason traditional insurance can’t reduce friction in the signup and management of the product. Payout is where the difficulty lies. Frequently, insurance companies need to wait for a “Natural Disaster” to be declared or, for smaller incidents, for assessors and inspectors to audit and report back before the insurer can start the payout process, which itself can be lengthy.

    In this product, they are disrupting traditional insurance at a particular level (this is not general insurance), and that disruption, like all disruptions, benefits customers by way of simplification and increased speed in onboarding and payouts. Not to mention the pricing, which makes it more accessible and hence more likely to be adopted, benefitting all in the process.

    But the interesting aspect from a tech point of view is the use of blockchain. In this instance, it is playing to blockchain’s strengths and not trying to overcome its weaknesses (see above). And that’s the intelligent way to use it.

    BHP is a product that doesn’t need to scale to hundreds if not thousands of transactions per second like the traditional banking systems deployed around the world. For one, Dominica (thankfully) doesn’t suffer a significant natural disaster every day, and secondly, its population is currently only around 73,000 people. Today’s blockchains are more than capable of sustaining the likely transaction requirements of this implementation. And even if BHP becomes available to a broader audience throughout the Caribbean, it is still unlikely to overwhelm the system, as processor designs and energy-efficiency gains are around the corner.
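    A back-of-envelope check with my own assumed numbers shows why throughput isn’t the constraint here:

        # Back-of-envelope with assumed figures: even a slow public
        # blockchain covers a BHP-scale worst case comfortably.
        population = 73_000          # Dominica, approximately
        policies = population // 3   # generous guess: one policy per household
        chain_tps = 7                # rough throughput of a slow public chain

        # Worst case: every policy pays out after one island-wide event.
        hours = policies / chain_tps / 3600
        print(f"{policies} payouts at {chain_tps} tx/s is about {hours:.1f} hours")
        # ~1 hour of chain time, against a three-day payout promise.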

    A cursory glance shows that the major chip designers and builders are all exploring the possibilities of tailoring their products for blockchain applications to overcome the shortcomings of the technology.

    https://www.amd.com/en/technologies/blockchain

    https://www.intel.com/content/www/us/en/security/blockchain-overview.html

    Interesting times.


    You can find the archives of all my essays here:

    The Future is Digital Archives


    The Future is Digital Newsletter is intended to inform and entertain on the topic of tech and the digital world. Share it with someone you know who would like it.

    Share The Future is Digital

    If this email was forwarded to you, I’d love to see you on board, here’s where you can sign up spam-free:

    Subscribe now

    Thanks for being a supporter, have a great day.

    → 8:26 PM, Oct 21
  • Nvidia, AI and the Big Picture

    There are currently many moving parts to the tech industry, and as tech becomes more and more pervasive in society, it is getting roped into discussions and being judged by standards that never applied in the past. Debates range from whether big tech has built unmatched and unrivalled monopolies, and whether those monopolies are legal, to whether big tech will be responsible for the downfall of democracy and ultimately the next world war. I can’t give you a credible answer, but I can say that it is mentally draining to follow the tech industry. Not because there’s nothing to read and write about; precisely the opposite. The fire-hose of news in this industry has relentlessly increased, driving an information flow-rate that is impossible to manage.

    Too much choice is ultimately a bad thing.

    I’m currently researching the ICT industry in the Eastern Caribbean, and the same data points appear continually. Small businesses are trying to survive by offering the same services, competing with the same compatriots on the same value propositions. It is not only a zero-sum game, it is also badly misaligned with what is possible if we consider internet assumptions.

    I’ll write more about this topic in the future as I clarify my thoughts and the research reveals further insights.

    I thought I’d write a follow-up to the last newsletter, as pretty much right after I’d recorded and published it, news broke about the sale of that issue’s subject: ARM. Read on for my thoughts.


    ARM’s History

    When I wrote about the disruption of part of Intel’s processor design and build process in the issues The Tale of Intel’s Disruption and Blockchain is Useless and Intel’s Pohoiki Beach and Disruption Theory, and delved further into Disruption Theory in Disruption Theory. Is Wizzee a Disruptor?, I was trying to give you an overview of Disruption Theory and how it may apply to your own business. I recommend you read those articles for better context for this essay.

    Getting back to that news: I was aware of the potential sale of one of the most important actors in the field, ARM Holdings. What I didn’t expect was such a quick sale, and a sale to a company that, logic would reason, is not best suited to the type of business it was buying.

    Let’s back up just a little and recap the timeline and where I think this is going.

    ARM Limited, as it is now known, was initially incorporated in 1990 under the name Advanced RISC Machines. Funnily enough, even that wasn’t its original name: it was born as Acorn RISC Machine, from the Acorn Archimedes computer that was powered by the new microprocessor design. The change was apparently made at the request of Apple, which objected to a competitor’s name appearing in the name of a processor it had jointly designed and was using in the ill-fated (but arguably necessary) Apple Newton. Advanced RISC Machines became ARM Holdings for the Initial Public Offering (IPO) that took place in 1998.

    In 2016, SoftBank, a Japanese telecoms company with an appetite for venture capital, purchased ARM for approximately 32 billion USD. The transaction guaranteed that operations would continue as they were: a UK headquarters with offices in Silicon Valley and Tokyo. It allowed ARM to be close to the world’s disrupters and designers (Silicon Valley) and the world’s builders (Taiwan and China). ARM capitalised on this, and the catalogue of products that currently use ARM chip designs is unfathomable. Just about every device that requires a processor of some kind, and that isn’t a computer, contains an ARM chip. And that’s before we even talk about the just-starting revolution of the Internet of Things, or IoT.

    Source: Nvidia

    ARM has just been sold to Nvidia for an announced price of 40 billion USD, an 8 billion USD premium over its purchase price, or a 25% profit over four years. SoftBank will retain a 10% stake in ARM too. This money will go some way to stopping the haemorrhage SoftBank recently suffered when it indicated that it might lose up to 80 billion USD from failed investments, WeWork (cough, cough) among them.

    As a recap, ARM makes no processors itself. In some cases, it doesn’t even design the subsequent generations of some of its processor designs. ARM licenses its intellectual property, or IP, to anyone, following a long tradition of British tech design houses that have sprung up over the last couple of decades, like Imagination Technologies and ARC International. Depending on the licence terms, companies are more or less free to use the designs as they see fit. ARM presents itself as the Switzerland of processor design, i.e., neutral. ARM reports that its designs are in around 180 billion processors to date.

    Qualcomm uses them for the processors that run the majority of Android phones, and most famously, Apple has a lifetime licence (from the days when it was one of the original designers) and uses that asset to design and implement the most advanced mobile processors on the planet. But even Apple doesn’t build those processors; it farms that work out to the specialist I mentioned in the last newsletter, Taiwan Semiconductor Manufacturing Company, TSMC. You can’t get a more explicit name reflecting a company’s primary purpose than that! Which leads me to where I think this is going.

    ARM-ing the Future

    The big question is why a graphics card builder like Nvidia would splash out on a chip designer.

    Part of the answer lies in the fact that Nvidia is itself a licensee of ARM, and presumably that annual fixed cost will be removed from the books now that it owns the company it used to buy a licence from. It’s an upfront investment that pays off over several years, and if ARM’s value accumulation continues, the investment might be justified relatively quickly (from an accounting point of view).

    But I think it goes beyond that. I hinted earlier that ARM processors are just about everywhere and are being integrated into more and more devices in the form of smart tech. The fridge, the toaster and the espresso machine are all candidates for a coming home-smarts revolution. And the already processor-enabled world of home appliances, like washing machines, will be enhanced by the technological possibilities available to their builders.

    The TAM, or Total Addressable Market, for ARM’s IP is almost infinite. The ubiquity of wifi and the incoming 5G avalanche only reinforce the inclusion of ARM-type processors in devices: even the comms technology itself, the routers, switches and amplifiers, uses ARM processor designs. ARM is set to become the de facto processor of things that are not traditional PCs.

    Besides Apple, Microsoft is using more and more of the technology in its designs. The new Surface Duo is an ARM-based foldable phone/tablet hybrid with impressive screen technology, all running on a customised ARM design. The Surface Pro X, a new generation of the popular Microsoft Surface PC line, is also ARM-based, running a version of Windows compiled for a customised ARM design.

    Beyond Computers

    But again, it goes beyond this, to what will inevitably be as pervasive a technology as oil-powered personal transport has become. I’m talking about AI, or Artificial Intelligence.

    From simple statistical models to more advanced language models like GPT-3, AI is set to be included in everything from your everyday-carry phone to the entertainment system of the future. Think Blade Runner 2049. Where does Nvidia step in, then?

    From the Nvidia Deep Learning AI website:

    I AM A VISIONARY, A HEALER, A CREATOR, AND SO MUCH MORE. i am ai.

    POWERING CHANGE WITH AI AND DEEP LEARNING

    AI doesn’t stand still. It’s a living, changing entity that powers change throughout every industry across the globe. As it evolves, so do we all. From the visionaries, healers, and navigators to the creators, protectors, and teachers. It’s what drives us today. And what comes next.

    Many data scientists and technology teams around the world realised that they needed powerful processors to perform a highly reduced and specific set of calculations, which only specialised and extremely expensive supercomputers could perform. Supercomputer makers like Cray and IBM sold their systems to large research institutes and universities at high profit margins, on the back of their unique ability to calculate rapidly and in a massively parallel fashion, an important factor for calculations of that type.

    At the other end of the computing spectrum, users wanted better-quality graphics for video gaming and image manipulation. Nvidia started to design, build and sell specialised video cards to OEMs (Original Equipment Manufacturers) like Dell for integration on their motherboards, a win-win situation. Better graphics meant that computers became more desirable as gaming or design machines. These designs evolved over the years and are sold as separate cards for builders to include in their offerings.

    In a quirk of circumstance, the type of processing required to produce detailed and fluid graphics for games turned out to be ideal for the type of calculations required for AI. At a fraction of the cost, scientists and researchers could equip ordinary PCs with a bank of graphics cards that could compete with multi-million-dollar supercomputers. These GPU designs are Nvidia’s own rather than ARM’s (though, as noted above, Nvidia is itself an ARM licensee elsewhere in its products). As a pioneer in AI, Nvidia has developed a deep understanding of the field, as the website above indicates. That pivot let Nvidia surpass Intel a few weeks ago as the world’s most valuable chipmaker.

    With AI becoming ubiquitous, like it or not, the purchase of ARM, which Nvidia clearly sees as a cornerstone of its technological chops, will no doubt allow it to extend its lead in the AI world.

    Challenges remain, in that ARM licenses its technology to Nvidia’s direct competitors like AMD (Advanced Micro Devices), and just how they will navigate those forces is unclear. If you think about it, Nvidia has to develop and execute a way to successfully operate and profit from two very different business models. Selling IP is nothing like selling processors, and that job has just become much more complicated given the inherently competing interests of the IP’s buyers and the manufacturer.

    For now, Nvidia has stated that the business will continue to run as-is, with Nvidia itself being one of many ARM customers that “buy” the technology. Only time will tell if they manage to pull this off, but I’m currently positive on the long-term prospects of the deal.

    Intel’s Disruption Train

    As you know, I firmly believe we are witnessing a considerable disruption of an incumbent in the tech industry. It doesn’t happen as often as the tech press would like you to think, but it is an exciting spectacle to observe from a purely theoretical point of view. There will be many articles and possibly books written on the subject, if we still have books by then.

    With the Nvidia acquisition of ARM, Intel’s woes in the mobile processor, AI and datacentre fields have just got worse. Above, I purposely stuck to the more consumer-facing implications. But it is in the datacentre that the next battle for processor superiority will commence in earnest. We’ve already seen the inroads that Google and Amazon are making with the design of ARM-based servers that slot into racks alongside thousands of other servers, all furnishing your email, photo management and countless other modern-day necessities.

    With Nvidia, the risk for Intel is that Nvidia successfully manages to vertically integrate the ARM designs into its products in a much tighter manner, producing even more effective cloud server designs for specialist applications like AI and Machine Learning. It could, of course, do this without buying ARM, but the acquisition may give it a head start that puts the nail in the coffin of a significant business for Intel.

    I have no crystal ball, and I may be horribly wrong, but I think the next few years are going to be critical for Intel if it is to survive in its current form. As I’ve noted in other essays, this is not lost on Intel, and it is making business decisions that keep its margins up and fix the short term. I would like to see Intel be bold and try to out-disrupt itself. This deal gives Intel an opportunity to profit from the industry’s scepticism about Nvidia’s long-term intentions and stun us with something truly new. It is time for Intel to look to the research labs for the next thing and give it a shot. That’s easier said than done, however.


    The Future is Digital Newsletter is intended for anyone interested in Digital Technologies and how they affect their business. I’d really appreciate you sharing it with those in your network.

    Share The Future is Digital

    If this email was forwarded to you, I’d love to see you on board. You can sign up here:

    Subscribe now

    Thanks for being a supporter, have a great day.

    → 3:44 PM, Sep 25
  • The tale of Intel’s disruption and Blockchain is useless

    It’s good to be back.

    I decided to take a summer holiday, of sorts, retiring myself from the pressure of writing these articles. If you know me, you’ll recall that I’ve pretty much not had a holiday since I started my professional life—much to the consternation of my family. So I decided to take a little time for myself this year. Being that this year has been, er, rather unusual to say the least, I thought this would be the perfect opportunity.

    These articles are a labour of love and earn me absolutely nothing in monetary terms, so I have to work at the same time to earn a living in my day job, putting pressure on the time I have for this writing. I really enjoy the writing and hope to make it a significant part of my professional life in the near future.

    Speaking of which, I have a small favour to ask of you, my dear readers. I’m running a small study about the ICT industry in the eastern Caribbean and have concocted a short survey to give me an overview of the market. If only half of you respond, I’ll be well on my way to having useful data to work with. I’m sure you can be that half :)

    It’s not all one-way either. The better the data and the more data I have, the more I’ll write about the results here directly to your inboxes. You give, I give. What could be fairer?

    You can take the survey here:

    Quick Survey

    Thanks for your help.


    Disrupting Intel

    Last year, I wrote about Intel ignoring the threat of disruption to its core business, a threat that was following Clayton Christensen’s Disruption Theory virtually to the letter. From that article:

    If we follow DT to its conclusion, it is possible to see the risks Intel poses for itself, namely being innovated out of business. I’m clearly not suggesting that Intel will fail next year, but I think the long-term future is at risk if there is not some kind of reaction, with Intel creating further opportunities.

    I wrote at the time that Intel was concentrating on moving further up the stack to increasingly profitable zones, avoiding the threat of lower-end processor makers like AMD, and that Pohoiki Beach was designed to ensure Intel’s prosperous future.

    It was a good strategy on the face of it. Desktop and laptop chips were under better-than-ever competition, something that was not the case in Intel’s heyday. The real threat, Advanced RISC Machines’ ARM designs, was only beginning to poke its head out from the development studios, and whilst ARM had ambitions of capturing only a small percentage of the market (10%, if I recall correctly), this together with AMD put real pressure on Intel. Intel had to react, and it did, by going upscale and upping margins on those products to compensate for reduced unit numbers.

    The thing many people don’t understand is just how phenomenally expensive it is to design a CPU. It takes months of research and prototyping, and each iteration and innovation adds substantially to those costs. As CPUs are designed with smaller and smaller transistors, costs go the other way, growing exponentially. Design costs are often dwarfed by tooling costs too: tooling is the process of building and bringing online the fabrication plants that make the processors. Marketing is another expensive cost centre. Intel has famously piled millions into elaborate marketing campaigns to convince the public that laptop chips only come from Intel.

    Other factors influence the costs too. It should be noted that CPUs are defined not only by their speed (something that has mostly been maximised today, in that we can’t get the electrons to move any faster, or for long periods, without breaking the silicon) but also by their transistor size in nanometres, or nm. When you look at processor specifications, they will talk of 14nm, 10nm and smaller. The following chart from International Business Strategies gives an idea of the estimated costs, and how they have multiplied as semiconductors have shrunk:

    [Chart: estimated semiconductor design costs by process node]

    Source: IBS
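    To get a feel for the shape of that curve, here is a toy calculation assuming design costs roughly double per node; the real IBS estimates are in the chart above:

        # Toy illustration only: assume design cost roughly doubles per
        # node. The IBS chart above carries the real estimates.
        cost_musd = 50.0  # assumed design cost at 28nm, millions of USD
        for node in ["28nm", "16nm", "10nm", "7nm", "5nm"]:
            print(f"{node}: ~${cost_musd:,.0f}M to design a leading-edge chip")
            cost_musd *= 2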

    In the beginning, when it was a simple arms race of raw processor speeds, Moore’s law (i.e., the number of transistors on a die will double every 18 months or so) meant that Intel could produce faster and faster chips for its target markets, namely desktop and server devices. The server-specific chips came further down the road, after Intel saw the opportunity to custom-design and build what was essentially a desktop-class chip to supply a burgeoning market of businesses that needed to store documents and applications centrally. It followed the second level of the Digital Transformation model I wrote about in Issue 4: The Digital Transformation model in detail:

    Internal Exploitation

    The second level, Internal Exploitation, is defined by the process in which organisations attempt to integrate the different silos of information systems and datasets with the aim of producing a ‘whole’. Integration is difficult, slow and often results in failures when starting out from a base that is not adapted to integration. Just how do you get the Accounts, Stock, HR and Sales systems integrated?

    There are two types of integration: technical, and business process interdependence. According to the model, most enterprises spend more time integrating on a technical level than on the business processes.

    Since then, the battle has become more technical and has required close coordination between designers’ wants and builders’ capabilities. So far, Intel has been outpaced in reducing transistor sizes by the likes of TSMC, which has become a world leader in producing the most densely packed systems-on-a-chip. TSMC is not the only one either; Qualcomm and a couple of others are also at the forefront of producing ever-tinier devices year in, year out.

    The keen-eyed among you will note that I switched from talking about CPUs and processors to systems-on-a-chip (SoCs, pronounced “socks”). That is where the most prominent battleground currently lies: not pure CPUs, but chips that contain several previously separate ancillary systems on the same die. Graphics, memory and other components are being reduced in size and brought physically closer to the processing units. In these minute devices, even a fraction of a millimetre can yield significant gains in data exchange and processing.

    Intel seems to be having trouble developing and manufacturing smaller transistors reliably, which in part explains its multi-core and multi-processor CPU designs. Its designs don’t need to be too concerned with size, power and heat-dissipation requirements: a desktop or a server is plugged into what is, for all intents and purposes, an infinite power source, and the cooling systems in server rooms, or the space in an office, afford all the heat sinking required to ensure stable operation.

    From the beginning, Intel took responsibility for designing, making, marketing and shipping its chips to PC and server makers. This vertically integrated strategy served it well, so well in fact that it became the de facto world leader in processors. Remember Intel Inside? But as recent news highlights, a traditional strength can be turned into a weakness when disruption theory is well understood and utilised by competitors.

    Intel has now shown that it is nearly a whole generation, or “node” as it is known in the industry, behind TSMC. As a result, it has stated that it is going to outsource some of its production to... none other than the company outpacing it in chip building: TSMC, of course. From the FT:

    “To make up for the delay, Intel said it was considering turning to outside manufacturers from 2023 onwards for some of its production — a plan that appeared to leave little option but to subcontract its most advanced manufacturing to TSMC. The shift raised the prospect of a change in Intel’s business model, according to some analysts, forcing it to consider abandoning more of its manufacturing to focus on design.”

    Classic disruption theory!

    Cementing TSMC’s lead in processor manufacturing is another piece of news that may send shockwaves around the industry. No doubt most of you heard the long-awaited news that Apple, following its successful transition from PowerPC processors to Intel processors years ago, announced its intention to move its entire line of Macs to its in-house-designed A-series processors, the same family of processors that power iPhones and iPads and probably countless other items in Apple’s inventory.

    These SoCs are currently world leaders in speed, power and cooling capabilities. For reference, the release of the iPhone 11 and iPad Pro showed that Apple-designed processors were faster than most of the Intel and AMD processors found in laptops of the day. Apple has successfully designed its processors using a small, relatively unknown player in the CPU market to produce a succession of world-beaters. That partner is ARM, or Advanced RISC Machines. If you’re interested, I could bore you with the ins and outs of the different instruction set philosophies (the RISC bit stands for Reduced Instruction Set Computer). They are all built by TSMC.

    Once Intel lets go of this market, it will be all but impossible to get it back. Don’t worry, Intel is not going out of business anytime soon, and it will undoubtedly record higher profits and better margins in the future as it shifts production away from a market that looks set to transition lock, stock and barrel to ARM-designed processors and SoCs. Intel’s own mobile-focused processors are starting to get better, but it is too little, too late in my opinion.


    Blockchain, Schmockchain

    Regular readers will know that I have been mostly sceptical of Blockchain’s potential to change the world. It is currently a big, slow database that complicates things rather than simplifying them. From Blockchain ≠ Cryptocurrency, I said:

    ... that it is a huge energy consumer and hence by definition is inefficient. That, sadly, is not its only efficiency problem. Blockchain is actually extremely limited in its speed and quantity of transactions and scales poorly. So much so that in 2016 several banks exploring the possibility of using the technology in the personal and business banking sector abandoned the work as blockchain was just too slow.

    According to a detailed academic-style “peer-reviewed” study by the Centre for Evidence-Based Blockchain and reported in the FT:

    ... outside of cryptoland, where blockchain does actually have a purpose insofar as it allows people to pay each other in strings of 1s and 0s without an intermediary, we have never seen any evidence that it actually does anything, or makes anything better. Often, it seems to make things a whole lot worse.

    Worse, the report repeatedly highlights that the technology is a solution looking for a problem: the antithesis of the Jobs to be Done theory that helps us better design and provide solutions. With over 55% of projects showing no evidence of useful outcomes, and over 45% showing “unfiltered evidence” (i.e., next to worthless), it would appear that Blockchain is a belief system rather than a technological solution.


    The Future is Digital Newsletter is intended for anyone interested in Digital Technologies and how they affect their business. I’d really appreciate it if you would share it with those in your network.

    Share The Future is Digital

    If this email was forwarded to you, I’d love to see you on board. You can sign up here:

    Subscribe now

    Visit the website to read all the archives. And don’t forget, you can comment on the article and my writing here:

    Leave a comment

    Thanks for being a supporter, have a great day.

    → 9:45 PM, Sep 9
  • The digital Caribbean and the digital reality

    Today's essay is a slightly longer one; it'll take a couple more minutes to read than is customary for my essays. The subject is broad and couldn't be condensed without losing some of the finer details. I hope you don't mind.

    This essay is based on a small presentation I gave about COVID-19 and how we could kickstart our economies once the worst of the pandemic is over. It expands on the first part of that presentation, exploring the themes in more detail. Look out for the second part in the near future.

    Enjoy! Your feedback is welcome.

    Leave a comment


    The digital Caribbean

    I've written here several times about the state of digital in the Caribbean, and I encourage you to read those earlier essays; you can find them all in the archives.

    However, what I didn't emphasise is just how connected we all are in the Caribbean, if only in ways that are mostly ephemeral. Of the nearly forty-four million people in the Caribbean, around 77%, that's thirty-three and a half million people, are connected with a data-capable mobile phone. More than 26 million connect to the internet using a computer. But critically, over 50% of us regularly use social networks, a higher proportion than in most parts of the world.


    Which raises the question: why are our services, our stores and our governments not online?

    It would appear that COVID-19 might be the impetus that finally changes that, and I think change is more likely to come from this pandemic than from an earthquake or even a hurricane. We are in a big, forced experiment in which the entire world is collectively conscious at the same time and we all retain some control over our outcomes. That is entirely different from a natural disaster, where a) we have virtually no control over our outcomes, and b) nothing works during the disaster event.

    If you look at the islands in the northern Leewards that suffered greatly from Irma and Maria in the 2017 hurricane season, or at the Bahamas during last year's terrifying and unimaginably tragic passage of hurricane Dorian, every island affected suffered a complete breakdown of most, if not all, services, digital services included. With COVID-19 there has been virtually zero downtime and zero outages. Sure, many people could not work when their jobs required them to be physically present, but those who could work remotely continued with relatively little effort, if we ignore Zoom fatigue, which is a real thing.

    To answer the question above: I'm noticing more and more services coming online, like the recently announced online Immigration and Customs form for travelling to Barbados. It's great to see this move, but it raises the question of why it took so long. The real answer, of course, is will, or the lack thereof. There are virtually no technical reasons in 2020 not to have most services online; it was possible even ten years ago. And even for services that cannot be completed fully online, a major component can be digitised to make the process easier.

    Quite often, in the companies I consult for, I see multi-step manual processes still in use, say A ➔ B ➔ C, despite the company being willing to digitise. These processes cling on in manual form because process A cannot easily or successfully be digitised, even though B and C are eligible. The result is the abandonment of the whole digitisation effort. There are at least two ways to go about this. One is to digitise processes B and C, with process A being manually entered into the system that runs B and C. The other, and the way I tend to analyse it, is to rethink the process from start to finish to see whether we can digitise not only B and C but also part of A. It looks like this: A1 (manual) ➔ A2 (digital) ➔ B ➔ C. Going a step further, the process can then be redesigned around the goal of eliminating manual steps altogether, say X ➔ Y ➔ Z, where X, Y and Z achieve the same goals but the data entry and data processing are reorganised to remove as much manual entry as possible. At this point the new process no longer resembles the original. I'm simplifying the work, of course, but you get the picture.
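
    To make the idea concrete, here is a toy sketch in Python. The step names, data fields and tax rate are invented for illustration; the point is that only the irreducibly manual part of A keeps a human in the loop:

    ```python
    # A toy sketch of the A1 (manual) -> A2 (digital) -> B -> C idea.
    # Step names and data fields are hypothetical.

    def a1_manual_capture() -> dict:
        """The irreducibly manual part of step A: a person keys the data in once."""
        return {"customer": input("Customer name: "),
                "amount": float(input("Amount: "))}

    def a2_digitise(record: dict) -> dict:
        """The digital remainder of step A: validate and normalise the entry."""
        record["customer"] = record["customer"].strip().title()
        if record["amount"] <= 0:
            raise ValueError("amount must be positive")
        return record

    def b_process(record: dict) -> dict:
        """Step B runs entirely on the digitised data."""
        record["tax"] = round(record["amount"] * 0.085, 2)  # hypothetical rate
        return record

    def c_archive(record: dict) -> dict:
        """Step C stores the result; here we simply mark it as filed."""
        record["status"] = "filed"
        return record

    # The pipeline: only a1_manual_capture still needs a human.
    result = c_archive(b_process(a2_digitise(a1_manual_capture())))
    print(result)
    ```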


    The digital reality

    We're living in a new "digital reality" and approaching an inflexion point where the majority of our lives will be online, and those who fail to embrace and effect change will feel pain in many areas of their society. That is why these first steps cannot come soon enough, and why politicians and businesses need to start radically changing their thinking to adapt to what is coming, not what is current.

    So, how do we achieve this?

    The first thing is to understand the current state of affairs. My writing on the current state of digital in the Caribbean goes some way towards this, but more research is needed into the economic, socio-political and business worlds of the region. Again, I'm doing some of this and intend to do more going forward, but funding is needed for it to be more widespread.


    A few examples from my research can be briefly summarised in five important categories: discovery, purchasing, payments, aggregation/uberisation and automation. Let's take a quick peek at each in turn.

    Discovery

    In a world of virtually infinite information, content generation and a never-ending avalanche of information flow, the chances that the information we want does not exist somewhere are virtually zero; in other words, the information is out there. The issue is finding it. This is where Google originally stepped in. Google understood that the exponential growth of websites would render the old model of listings and directories useless at best and dissuasive at worst. Google's trick was to ignore direct listings of names and URLs and concentrate on the relationships between sites: PageRank was designed around the principle that a link from one page to another is a vote of confidence, with votes from highly ranked pages counting for more, and was implemented to provide more "relevant" results for people's searches. Up to that point, the internet had logically reproduced the physical Yellow Pages model.
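
    For the curious, the core of the idea fits in a few lines. This is a minimal PageRank power-iteration sketch over a made-up four-page link graph, with the usual 0.85 damping factor; production systems are, of course, vastly larger and more sophisticated:

    ```python
    # Minimal PageRank power iteration over a toy link graph.
    links = {            # page -> pages it links to (all hypothetical)
        "a": ["b", "c"],
        "b": ["c"],
        "c": ["a"],
        "d": ["c"],
    }
    damping = 0.85
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}   # start with a uniform score

    for _ in range(50):                         # iterate until roughly stable
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:             # each link is a weighted "vote"
                new_rank[target] += share
        rank = new_rank

    for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))            # "c" wins: most pages vote for it
    ```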

    As a result, a whole new industry was born around getting businesses better visibility on the internet: Search Engine Optimisation, or SEO for short, a name born in the era when search was the primary tool used online. The name is already becoming outdated, as the "optimisation" is no longer restricted to search engines but is relevant to all online platforms, like Twitter and Facebook, which run their own in-house algorithms over their users' content.

    Purchasing

    Purchasing habits were being fundamentally altered even before COVID-19 hit. Today's purchasing can be summed up in a few words:

    Buy online. Pick-up in-store.

    According to a recent survey by Qudini, a specialist retail-experience SaaS company, 76% of respondents said they had bought items using in-store pickup after researching and evaluating them online. This is only part of the story, as online retail giants like Amazon are putting greater effort into reducing friction at the point of sale, enabling easier and faster consumption. And despite this, there is still room for niche markets to be highly profitable businesses, simply because of the sheer scale of the internet. A niche on the internet is a misnomer.

    Payments

    There is an ongoing trend of mass democratisation in the financial world. Banking is being disrupted, with online-only banks not only reducing the friction of accessing your money but providing more timely services at a fraction of the cost of traditional banks. And as nothing exists in a vacuum, traditional banks are not ignoring this; they are implementing new strategies to ensure survival, for example spinning off parts of the business as online banks under a different brand. Consolidation at the back end also lets them capitalise on the opportunity to act as guarantor for the online banks.

    Payments are being simplified and ever more integrated with online platforms, in everything from membership systems to complete online marketplaces. Stripe is probably the best-known and most capable player in this industry. However, more and more banks are starting to roll out their own online payment solutions; unwilling to let Stripe eat their lunch, they are hoping to keep their clients in-house. They'll need to take care over hidden fees, simplicity and friction reduction to pull that off... something banks have shown they are not very good at up to now.
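
    To give a feel for how little code a modern payments platform demands, here is a minimal server-side sketch using Stripe's Python library (pip install stripe). The key, amount and order reference are placeholders; treat this as an outline, not production code:

    ```python
    # Minimal sketch: create a Stripe PaymentIntent server-side.
    import stripe

    stripe.api_key = "sk_test_..."  # placeholder test key; never hard-code live keys

    # Amounts are in the currency's smallest unit: 2500 = $25.00.
    intent = stripe.PaymentIntent.create(
        amount=2500,
        currency="usd",
        payment_method_types=["card"],
        description="Order #1234",  # hypothetical order reference
    )

    # The client_secret goes to the front end, which confirms the payment
    # with the customer's card details; the server never touches the card.
    print(intent.client_secret)
    ```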

    Investing is also opening up and becoming easier for the public. Services like Betterment and Wealthfront are only the first step in a wholesale dismantling of the staid, exclusive boys' club of traditional investment banking. Not only that; their use of technology is also outperforming traditional investment experts:

    Betterment portfolios outperformed average advised portfolios 88% of the time.

    Aggregation and Uberisation

    Aggregation is largely an internet phenomenon, but it extends a well-trodden path: powerful retailers use their muscle to keep customers coming back, then use that pull to entice suppliers to prioritise their stores (since the stores can guarantee customers); rinse and repeat. Traditional giant stores like Macy's and Debenhams rode this wave for decades. With digital distribution essentially free, the value chain has been turned upside down: those who integrate throughout the value chain and commoditise their supply generally out-profit the incumbents.

    The uberisation of services is another trend that appears unstoppable for now. Uberisation enables a peer-to-peer business model, using technology to simplify the on-demand delivery of physical goods and services. The growth of mobile and the constant connection to the internet allowed Uber to deliver an application that works for both drivers and passengers, hooking them up without the need for a central taxi operator. The model has since been developed and exploited by food-delivery services; interestingly, Uber has just acquired Postmates on the back of declining mobility and increasing online ordering. It doesn't take a giant leap to see how this could become a deliver-anything service.
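
    Strip away the scale and the matching logic at the heart of the model is surprisingly simple. A toy sketch, with made-up names and coordinates: pair each request with the nearest available driver, no dispatcher required:

    ```python
    # Toy "uberisation" matcher: nearest available driver wins.
    from math import dist

    drivers = {"dana": (14.60, -61.07), "eli": (14.64, -61.03)}   # hypothetical
    requests = [("rider-1", (14.61, -61.06)), ("rider-2", (14.59, -61.09))]

    for rider, location in requests:
        # Greedy nearest-driver match; real systems also weigh ETA,
        # ratings and surge pricing.
        driver = min(drivers, key=lambda d: dist(drivers[d], location))
        print(f"{rider} matched with {driver}")
        del drivers[driver]   # that driver is now busy
    ```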

    Automation

    Automation, and more specifically Machine Learning and Artificial Intelligence, is the last key element. Its democratisation by the Microsofts of the world (see Azure Cognitive Services) is enabling a completely new generation of software designs. Simple operations like scanning receipts directly into accounting software free up administrative staff for more valuable roles. Even simple workflow tools like Microsoft Power Automate can tap directly into the APIs of these services and perform simple, repetitive tasks as an aid to decision-making.
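
    As an illustration of how accessible these services have become, here is a rough sketch of sending a receipt image to Azure's prebuilt receipt model over REST (pip install requests). The endpoint path, API version and polling flow are assumptions based on how the service was documented around this time; check Azure's current documentation before relying on any of it:

    ```python
    # Rough sketch: scan a receipt with Azure's prebuilt receipt model.
    # Endpoint, API version and key are placeholders/assumptions.
    import time
    import requests

    endpoint = "https://<your-resource>.cognitiveservices.azure.com"
    key = "<your-key>"
    url = f"{endpoint}/formrecognizer/v2.1/prebuilt/receipt/analyze"

    with open("receipt.jpg", "rb") as f:
        resp = requests.post(
            url,
            headers={"Ocp-Apim-Subscription-Key": key,
                     "Content-Type": "image/jpeg"},
            data=f.read(),
        )
    resp.raise_for_status()

    # Analysis is asynchronous: poll the Operation-Location URL.
    result_url = resp.headers["Operation-Location"]
    while True:
        result = requests.get(
            result_url, headers={"Ocp-Apim-Subscription-Key": key}).json()
        if result["status"] in ("succeeded", "failed"):
            break
        time.sleep(1)

    print(result["status"])  # the JSON carries merchant, date, line items, totals
    ```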

    Which brings me to online automation products, of which IFTTT was probably the first to gain notoriety. If This Then That simplified the creation of fun automations that switched on your lights as you neared home, or flashed them in the colours of your favourite team when they scored. It went further by hooking into popular SaaS products, allowing you to "join" together previously disparate systems. Zapier and Power Automate take this much further, with examples of users replacing no-longer-supported legacy software with modern workflows that are modular and allow for data analysis, unlike the systems they replace.
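
    The trigger-action model behind all of these tools is easy to picture in code. A toy rule engine, with invented event fields, just to show the shape of the idea:

    ```python
    # Toy "If This Then That" engine: each rule pairs a trigger predicate
    # with an action. Event fields are invented for the demo.
    from typing import Callable

    Rule = tuple[Callable[[dict], bool], Callable[[dict], None]]

    rules: list[Rule] = [
        # IF I arrive home THEN switch on the lights
        (lambda e: e.get("type") == "geofence" and e.get("where") == "home",
         lambda e: print("lights: on")),
        # IF my team scores THEN flash the lights in team colours
        (lambda e: e.get("type") == "goal" and e.get("team") == "my-team",
         lambda e: print("lights: flash team colours")),
    ]

    def handle(event: dict) -> None:
        for trigger, action in rules:
            if trigger(event):
                action(event)

    handle({"type": "geofence", "where": "home"})   # -> lights: on
    handle({"type": "goal", "team": "my-team"})     # -> lights: flash team colours
    ```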

    The second step is to try to envisage what the future will bring. Easier said than done, but current affairs do give us a few hints at what the future may hold for the internet and for business on it. In my research, three factors come up time and time again: regulation, health and information misuse.


    Regulation

    I like memes so I couldn’t resist:

    [Image: the "Winter is coming" meme]

    The EU has restarted efforts on its Digital Services Act, a far-reaching proposal to regulate Artificial Intelligence and data collection. Even the traditionally Wild West US is hauling its biggest tech CEOs before Congress to determine whether the AAAF (i.e., Alphabet, Amazon, Apple and Facebook) are using anti-competitive practices (they are). We can expect an end to the free-for-all that is the current posture in most countries. Regulation will affect not only the online giants but all the supporting systems and smaller operations too. GDPR was only a first attempt, but its implementation has given impetus to the next wave of regulation, one that will bite harder; of that there is no question.

    Regulation will not stop at competition and data harvesting; it will also start to govern what information can and cannot be published on the internet, much as traditional media cannot publish absolutely anything. The days of self-regulation are nearly over because, just like the banks, the internet is incapable of regulating itself effectively. Regulation will be very difficult, full of competing ideologies pulling against each other, and just how we are going to shoehorn a global internet into the current state of political division around the world remains an open question. I suspect the EU will move first, and any business that relies in some way on the EU will feel the early pain, including us here in the Caribbean.

    Health

    When I discuss health, I'm talking about what is an increasingly difficult subject for parents and concerned parties like schools and businesses: Digital Health. Often reduced, incorrectly in my view, to "screen time", digital health will become a subject that every employer and supplier has to be cognizant of. They will be forced to take it into account when developing systems and processes, to prevent people from being adversely affected.

    Employers will have to better discern good screen time from bad, to ensure their employees are not overly exposed to the bad kind. But what is bad screen time? How do we define it? How do we measure it effectively? How do we control it? These and many other questions are starting to be debated the world over. Like regulation, it is only a matter of time before digital health becomes a central aspect of your digitalisation strategy.

    Information misuse

    The most prominent danger for those of us who spend most of our working and personal lives online is fake news, a phrase the short-fingered vulgarian in the White House likes to overuse when attacking his imagined foes. But fake news is absolutely real, and ironically it is often created and perpetuated by the likes of Mr Drumpf. It is so easy to produce and distribute, and that is what makes it such a danger to the world and so difficult to regulate.

    Another problematic innovation we're seeing, in currently limited use, is deepfakes. They are not yet indistinguishable from real photos, film or audio, but they are getting better, and machine learning is accelerating their progress. Legislation and regulation have not caught up with these developments and likely will not any time soon. It's a disaster waiting to happen, if it hasn't happened already and we just haven't noticed.


    Interesting times

    In summary, we can see that the digital integration of business is evolving and accelerating, partly due to the current pandemic and partly because of natural changes in a society more exposed to digital than ever before. Customers in the Caribbean are becoming more digital, although not for the reasons we thought they would (convenience and availability) but for reasons more to do with safety in the face of a virus we still know little about. We see business and structural changes brought about by investment, and by governmental and organisational willingness to face one of the most damaging crises we have seen to date.

    We also see that the very nature of the internet is about to change, and change significantly. With more and more governments and populations favouring some form of regulation, it is only a matter of time before whole sections of the internet come under frameworks of operation that were previously rejected. Media, social media and the sale of goods and services are likely to be the first areas regulated, but make no mistake, regulation of everything else will follow soon after. Because the internet operates at a scale hitherto unseen, regulation will logically take effect at mass scale too.

    Even without the pandemic, we were living in interesting times. Adding COVID-19 to the mix has been like pouring water into a boiling chip pan. Hold on tight.


    The Future is Digital Newsletter is intended for anyone interested in digital technologies and how they affect their business. I'd really appreciate it if you would share it with those in your network.

    Share The Future is Digital

    If this email was forwarded to you, I’d love to see you on board. You can sign up here:

    Subscribe now

    Visit the website to read all the archives.

    Thanks for being a supporter, have a great day.

    → 9:55 PM, Jul 16