Matthew Cowen
  • 2020 Tropical Cyclone Season. Nearly over, but not quite.

    A truly breathtaking season that thankfully resulted in few storms directly hitting us here in the Caribbean. The US, particularly Louisiana, wasn’t so fortunate, with Laura, Marco, Sally and Delta all affecting the state. The overly busy season that was predicted did indeed materialise, with (so far) 29 depressions, 28 named storms, 12 hurricanes (Categories 1 and 2) and 4 major hurricanes (Category 3+).

    Each year, NOAA and other meteorological institutions around the world forecast the season ahead using a scale called the ACE, or Accumulated Cyclone Energy, index. After studying reliable data from 1981 to 2010, a team at Colorado State University, and independently NOAA, derived a scale to measure and help predict tropical cyclone activity for each season. They arrived at an average of 12.1 named storms, with 6.4 hurricanes and 2.7 major hurricanes. This enabled the development of the indicator, the ACE, and set a value of 106 for an average season.

    The 2020 season currently stands at an ACE of 143, and it is not over yet.
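
    To make that number a little more concrete, here is a minimal sketch of how ACE accumulates: sum the squares of a storm’s maximum sustained winds (in knots) at each six-hourly reading while it is at tropical-storm strength or above, then divide by 10,000. The wind values below are invented purely for illustration.

    ```python
    # Minimal sketch of the ACE (Accumulated Cyclone Energy) calculation.
    # ACE = sum of the squares of six-hourly maximum sustained winds (in knots),
    # counted only at tropical-storm strength (>= 34 kt) or above, divided by 10,000.

    def ace_contribution(six_hourly_winds_kt):
        """ACE contributed by one storm, given its six-hourly winds in knots."""
        return sum(v ** 2 for v in six_hourly_winds_kt if v >= 34) / 10_000

    # Hypothetical storm: wind readings invented purely for illustration.
    example_storm = [30, 35, 45, 60, 75, 90, 85, 70, 50, 35]
    print(f"ACE contribution: {ace_contribution(example_storm):.2f}")  # ~3.64

    # A season's ACE is the sum over every storm; 2020's running total of ~143
    # sits well above the ~106 benchmark for an average season.
    ```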

    Bear in mind that in 1954, Hurricane Alice formed on 30 December in the mid-Atlantic to the north-east of the Leeward Islands, travelling south-west and directly impacting these islands with maximum winds reaching 90+ mph, around 150 km/h.

    Stay vigilant.

    Image: [upload.wikimedia.org/wikipedia...](https://upload.wikimedia.org/wikipedia/en/timeline/1f4a7932cbee5dcb8c94b0df46fc7f48.png)

    2 November 2020 — French West Indies

    → 9:34 AM, Nov 2
  • Digital Commerce. Blockchain (again)

    If you listened to the podcast version, you’d note I added some music. I broke out my skills in Garageband to make a quick accompanying jingle to spice up the podcast. Let me know what you think. 🎵

    On to this week’s topics.

    I’m astonished I didn’t get roasted for completely dissing Blockchain as a useless technology a couple of weeks ago, so I thought I’d talk about an example where I’m actually quite bullish about the technology. But first, I wanted to expand on a thought prompted by a recent conversation about why digital commerce is so different from brick-and-mortar commerce. Read on for my thoughts.


    The unique challenges of digital commerce for physical goods

    The difficulty for online retailers selling physical goods in the digital economy is that the value of digital products is significantly reduced, and in some cases is virtually zero. That inherently puts pressure on the value of physical goods that are commoditised. Fortunately, luxury goods are seeing less pressure on their perceived value, but that is more a function of time than of real value.

    Luxury goods houses and retailers are seeing these changes and are starting to act. Apple has now changed its retail sales processes to resemble a luxury brand’s one-to-one service rather than a Walmart get-it-off-the-shelf-yourself operation.

    From Apple’s press release:

    When iPhone 12, iPhone 12 Pro, and iPad Air are available Friday, October 23, customers can get their all-new products directly from Apple through tailored purchase experiences offered online, by phone, or in store. From a chat session with a Specialist that starts online and finishes with contactless delivery, to visiting select Apple Store locations for a one-on-one session with an Apple Specialist, customers can find the best way to get the products they’re looking for.

    Image: Apple infographic on new ways to shop for iPad Air, iPhone 12 Pro and iPhone 12

    Source: Apple

    So, the question is, how do businesses make money now that the products they sell are virtually worthless (economically speaking)?

    Part of the answer lies in the services surrounding the product. Whether that’s sales (see above), installation and delivery, support, subscriptions, ongoing help or many other possibilities, the key lies in giving the customer an ‘experience’ rather than a sale. Deriving value comes from developing and innovating on several levels to create a whole greater than the sum of its parts.

    If we look at the above example from Apple, you’ll note they’re selling a phone. But that phone is so much more than a simple “… widescreen iPod with touch controls, a revolutionary mobile phone and a breakthrough internet communicator.” The iPhone has, by some estimates, replaced over 50 products: mobile phone, point-and-shoot camera, torch, calendar, SatNav, personal assistant, to name a few. But it is a product in a sea of other products designed to resemble each other closely. There is little material difference between the flagship Samsung and Google devices that compete directly with the iPhone. And smartphones themselves are becoming commoditised, as evidenced by the ever smaller gains made in hardware design and the technologies deployed. The differentiator is the software, and what that software enables the hardware to do for the overall end-user experience.

    Two excellent examples are computational photography and health analytics.

    In computational photography, we are nearing the phase where even the specialist cameras of the past are being innovated out of existence. It is only a matter of time before computation outperforms the purely optical limitations of smartphone camera modules. In health, simple movement sensors initially enabled step tracking, instantly killing a growing market segment, and eventually enabled detailed sleep tracking that has itself been out-innovated by smartwatches. It is no longer science fiction to imagine the doctor’s office on your wrist.

    The overall experience of owning these products, and their potential beyond the initial use-case, is what I mean when I say ‘customer experience’. Apple has gone a step further and developed a Covid-friendly, white-glove shopping experience previously reserved for the rich and famous. You book a one-to-one session either in-store or directly on the Apple Retail site, and you are led through your purchase to contactless delivery or pick-up. It is a personal shopping service for the rest of us.

    Who doesn’t want to be made to feel special when buying something?


    Blockchain. Again

    I’m surprised I didn’t get more of a roasting for my somewhat sceptical articles about Blockchain, here and here:

    According to a detailed academic-style “peer-reviewed” study by the Centre for Evidence-Based Blockchain and reported in the FT:

    “… outside of cryptoland, where blockchain does actually have a purpose insofar as it allows people to pay each other in strings of 1s and 0s without an intermediary, we have never seen any evidence that it actually does anything, or makes anything better. Often, it seems to make things a whole lot worse.”

    Worse, the report repeatedly highlights that the technology is a solution currently looking for a problem, the antithesis of the Jobs to be Done theory that helps us better design and provide solutions. With over 55% of projects showing no evidence of useful outcomes, and over 45% showing only “unfiltered evidence” (i.e., next to worthless), it would appear that Blockchain is a belief system rather than a technological solution.

    And …

    …it is a huge energy consumer and hence by definition is inefficient. That, sadly, is not its only efficiency problem. Blockchain is actually extremely limited in its speed and quantity of transactions and scales poorly. So much so that in 2016 several banks exploring the possibility of using the technology in the personal and business banking sector abandoned the work as blockchain was just too slow.

    Quite the downer, if I’m honest. But whilst many projects show no use for Blockchain, some show promise. One such example comes from the Economic Growth team at the Climate Resilience Execution Agency for Dominica (CREAD).

    They are currently developing a parametric insurance product that uses blockchain technology to help small businesses, and those typically underserved by traditional insurance products, manage their risk of natural disasters in an innovative way. It’s called BHP, or Blockchain Hurricane Protection. From the article on LinkedIn:

    BHP aims to extend coverage to those excluded from traditional indemnity insurance, and provide Dominicans with a flexible and affordable tool for managing climate risk. Total damage from Hurricane Maria which struck the island as a category 5 storm on September 18, 2017 was US$1.3 billion, representing 226% of GDP. Uninsured losses were US$1.1 billion, or 86% of total damages. Damages to the housing sector totalled US$520 million. MSMEs suffered US$73 million in damages, and agriculture also suffered US$73 million in damage. In the years since Hurricane Maria, premiums for traditional indemnity policies have increased by more than 80%.

    This was the background to Dominica’s drive for innovation to better protect itself after multiple incidents that substantially affected citizens and businesses over the last decade.

    So, what is parametric insurance, and why blockchain?

    From Wikipedia:

    Parametric insurance is a type of insurance that does not indemnify the pure loss, but ex ante agrees to make a payment upon the occurrence of a triggering event. The triggering event is often a catastrophic natural event which may ordinarily precipitate a loss or a series of losses. But parametric insurance principles are also applied to agricultural crop insurance and other normal risks not of the nature of disaster, if the outcome of the risk is correlated to a parameter or an index of parameters.

    What’s great about this project is that it is intelligently using technology in the right places to fulfil the “Job to be Done”. Again, from that LinkedIn article:

    Once customers have downloaded the mobile wallet to their smartphone, they simply indicate the level of coverage that they would like to purchase, tag the location where they want the policy to apply, enter some basic information, and pay the premium. The policy is then issued and stored in the blockchain. In the event of a triggering event that meets the criteria of the policy, the payout is generated automatically and delivered to the customer's mobile wallet within three days of the triggering event.

    This product reduces friction at the critical stages of an insurance lifecycle: the signup and the payout.
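
    To make the payout mechanics concrete, here is a minimal sketch of a parametric trigger. This is not BHP’s actual contract logic (the article doesn’t publish it), and the names, threshold and amounts below are invented. The point is that once the agreed parameter is met, the payout follows automatically, with no loss assessment in between.

    ```python
    # Minimal sketch of a parametric payout trigger. Illustrative only; this is
    # not CREAD/BHP's actual contract logic, and all names and numbers are invented.
    from dataclasses import dataclass

    @dataclass
    class Policy:
        holder_wallet: str    # mobile wallet the payout is delivered to
        location: str         # tagged location the policy applies to
        coverage_usd: float   # payout level chosen at signup
        trigger_wind_kt: int  # the parameter: wind speed that triggers the payout

    def settle(policy: Policy, observed_wind_kt: int) -> float:
        """Payout owed, given a reported wind observation for the policy's location.
        No loss assessment or adjuster is involved: if the parameter meets the
        trigger, the agreed payout is released automatically."""
        return policy.coverage_usd if observed_wind_kt >= policy.trigger_wind_kt else 0.0

    policy = Policy("wallet-abc123", "Roseau, Dominica", coverage_usd=5_000, trigger_wind_kt=96)
    print(settle(policy, observed_wind_kt=120))  # 5000.0 -> trigger met, paid automatically
    print(settle(policy, observed_wind_kt=60))   # 0.0 -> no trigger, no payout
    ```

    The blockchain’s role is to make the recorded policy and its trigger tamper-evident, so neither side can dispute the agreed terms once the event has occurred.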

    You’d be right to ask why a “normal” insurance product couldn’t do the same. And there’s no reason traditional insurance can’t reduce friction in the signup and management of the product. Payout is where the difficulty lies. Frequently, insurance companies need to wait for a “Natural Disaster” to be declared, or, for a smaller incident, for assessors and inspectors to audit and report back to the insurer before the payout process can even start, a process that can itself be lengthy.

    In this product, they are disrupting traditional insurance at a particular level (this is not general insurance), and that disruption, like all disruptions, benefits customers through simplification and increased speed in onboarding and payouts. Not to mention pricing that makes it more accessible and hence more likely to be adopted, benefitting all in the process.

    But the interesting aspect from a tech point of view is the use of blockchain. In this instance, it is playing to blockchain’s strengths and not trying to overcome its weaknesses (see above). And that’s the intelligent way to use it.

    BHP is a product that doesn’t need to scale to hundreds if not thousands of transactions per second like the traditional banking systems deployed around the world. For one, Dominica (thankfully) doesn’t suffer a significant natural disaster every day, and secondly, its population is currently only around 73,000 people. Today’s blockchains are more than capable of sustaining the likely transaction requirements of this implementation. And even if BHP becomes available to a broader audience throughout the Caribbean, it is still unlikely to overwhelm the system, with processor design and energy-efficiency gains just around the corner.
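
    A quick back-of-envelope calculation, with deliberately extreme assumptions of my own, shows why throughput isn’t the constraint here:

    ```python
    # Back-of-envelope: worst-case transaction rate for an island-wide payout.
    # Deliberately extreme assumptions: every resident holds a policy and every
    # policy pays out within the three-day settlement window.
    policyholders = 73_000            # roughly Dominica's population
    payout_window_s = 3 * 24 * 3600   # three days, in seconds

    peak_tps = policyholders / payout_window_s
    print(f"{peak_tps:.2f} transactions per second")  # ~0.28 tps
    # Even slow public blockchains process a handful of transactions per second,
    # so this workload sits orders of magnitude below their limits.
    ```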

    A cursory glance shows that the major chip designers and builders are all exploring the possibilities of tailoring their products for blockchain applications to overcome the shortcomings of the technology.

    https://www.amd.com/en/technologies/blockchain

    https://www.intel.com/content/www/us/en/security/blockchain-overview.html

    Interesting times.


    You can find the archives of all my essays here:

    The Future is Digital Archives


    The Future is Digital Newsletter is intended to inform and entertain on the topic of tech and the digital world. Share it with someone you know who would like it.

    Share The Future is Digital

    If this email was forwarded to you, I’d love to see you on board, here’s where you can sign up spam-free:

    Subscribe now

    Thanks for being a supporter, have a great day.

    → 8:26 PM, Oct 21
  • Nvidia, AI and the Big Picture

    There are currently many moving parts to the tech industry, and as tech becomes more and more pervasive in society, it is getting roped into discussions and judged by standards that never applied in the past. Debates range from whether big tech has built unmatched and unrivalled monopolies, and whether those monopolies are legal, to whether big tech will be responsible for the downfall of democracy and ultimately the next world war. I can’t give you a credible answer, but I can say that it is mentally draining to follow the tech industry. Not because there’s nothing to read and write about, but precisely the opposite. The fire-hose of news in this industry has relentlessly increased, driving an information flow-rate that is impossible to manage.

    Too much choice is ultimately a bad thing.

    I’m currently researching the ICT industry in the Eastern Caribbean, and the same data points keep appearing. Small businesses are trying to survive by offering the same services, competing with the same compatriots on the same value propositions. It is not only a zero-sum game, it is also badly misaligned with what is possible if we consider internet assumptions.

    I’ll write more about this topic in the future as I clarify my thoughts and the research reveals further insights.

    I thought I’d write a follow-up on the last newsletter, as pretty much right after I’d recorded and published it, news broke about the sale of the subject of that issue, ARM. Read on for my thoughts on this.


    ARM’s History

    When I wrote about the disruption of part of Intel’s processor design and build process in the issues The Tale of Intel’s Disruption and Blockchain is Useless and Intel’s Pohoiki Beach and Disruption Theory, and delved further into the theory in Disruption Theory. Is Wizzee a Disruptor?, I was trying to give you an overview of Disruption Theory and how it may apply to your own business. I recommend you read those articles for better context for this essay.

    Getting back to that news. I was aware of the potential sale of one of the most important actors in that field, ARM Holdings. What I didn’t expect was such a quick sale, and a sale to a company that, logic would suggest, is not best suited to the type of business it was buying.

    Let’s back up just a little and recap the timeline and where I think this is going.

    ARM Limited, as it is now known, was initially incorporated in 1990 under the name Advanced RISC Machines. Funnily enough, even that wasn’t its original name. It was born as Acorn RISC Machine, from the Acorn Archimedes computer that was powered by the new microprocessor design. The change was apparently at the request of Apple, which objected to a competitor’s name appearing in the name of a processor it had jointly designed and then used in the ill-fated (but arguably necessary) Apple Newton. Advanced RISC Machines became ARM Holdings for the Initial Public Offering (IPO) that took place in 1998.

    In 2016, SoftBank, a Japanese telecoms company with an appetite for venture capital, purchased ARM for approximately 32 billion USD. That transaction guaranteed that operations would continue as they were: a UK headquarters, with offices in Silicon Valley and Tokyo. It allowed ARM to be close to the world’s disrupters and designers (Silicon Valley) and the world’s builders (Taiwan and China). ARM capitalised on this, and the catalogue of products that currently use ARM chip designs is unfathomable. Just about every device that requires a processor of some kind, and that isn’t a computer, contains an ARM chip. And that’s before we even talk about the just-starting revolution of the Internet of Things, or IoT.

    Image: Nvidia

    ARM has just been sold to Nvidia for an announced price of 40 billion USD, an 8 billion USD premium over its purchase price, or a 25% profit over four years. SoftBank will also retain a 10% stake in ARM. The money will go some way to stopping the haemorrhage SoftBank recently suffered when it indicated that it might lose up to 80 billion USD from failed investments, WeWork (cough, cough) among them.
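
    As a quick aside on that 25% figure: spread over four years and compounded, it is a much more modest annual return. A rough calculation, ignoring the retained stake and the mix of cash and Nvidia stock in the announced deal:

    ```python
    # Rough annualised return on SoftBank's ARM purchase, ignoring its retained
    # 10% stake and the cash/stock mix of the announced deal.
    bought_usd = 32e9   # 2016 purchase price
    sold_usd = 40e9     # 2020 announced sale price
    years = 4

    total_return = sold_usd / bought_usd - 1
    annualised = (sold_usd / bought_usd) ** (1 / years) - 1
    print(f"total: {total_return:.0%}, annualised: {annualised:.1%}")
    # total: 25%, annualised: 5.7% (respectable, but modest by Vision Fund standards)
    ```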

    As a recap, ARM makes no processors itself. And in some cases, it doesn’t even design the subsequent generations of some of its processor designs. ARM licenses its intellectual property, or IP, to anyone, following a long tradition of British tech design houses that have sprung up over the last couple of decades, like Imagination Technologies and ARC International. Depending on the licence terms, companies are more or less free to use the designs as they see fit. ARM presents itself as the Switzerland of processor design, i.e., neutral. ARM reports that around 180 billion chips based on its designs have shipped to date.

    Qualcomm uses them for the processors that run the majority of Android phones, and, most famously, Apple has a lifetime licence (from the days when it was one of the original designers) and uses that asset to design and implement the most advanced mobile processors currently on the planet. But even Apple doesn’t build those processors; it farms that work out to the specialist I mentioned in the last newsletter, Taiwan Semiconductor Manufacturing Company, TSMC. You can’t get a more explicit name reflecting a company’s primary purpose than that! Which leads me to where I think this is going.

    ARM-ing the Future

    The big question is: why would a graphics card builder like Nvidia splash out on a chip designer?

    Part of the answer lies in the fact that Nvidia is itself an ARM licensee, and presumably that annual fixed cost will be removed from the books now that it owns the company it used to buy a licence from. It’s an upfront investment that pays off over several years, and if ARM’s value accumulation continues, the investment might be justified relatively quickly (from an accounting point of view).

    But I think it goes beyond that. I hinted earlier that ARM processors are just about everywhere and are being integrated into more and more devices in the form of smart tech. The fridge, the toaster and the espresso machine are all candidates for a coming home-smarts revolution. And the already processor-enabled world of home appliances, like washing machines, will be enhanced by the technological possibilities available to their builders.

    The TAM, or Total Addressable Market, for ARM’s IP is almost infinite. The ubiquity of Wi-Fi and the incoming 5G avalanche only reinforce the inclusion of ARM-type processors in devices: even the comms technology itself, the routers, switches and amplifiers, uses ARM processor designs. ARM is set to become the de facto processor of things that are not traditional PCs.

    Besides Apple, Microsoft is using more and more of the technology in its designs. The new Surface Duo is an ARM-based foldable phone/tablet hybrid with impressive screen technology, all running on a customised ARM design. The Surface Pro X is a new generation of the popular Microsoft Surface PC line and is also ARM-based, running a version of Windows compiled for ARM on a customised ARM processor.

    Beyond Computers

    But again, it goes beyond this, to what will inevitably be as pervasive a technology as oil-powered personal transport has become. I’m talking about AI or Artificial Intelligence.

    From simple statistical models to more nuanced algorithms like GPT-3, AI is set to be included in everything from your everyday-carry phone to the entertainment system of the future. Think Blade Runner 2049. Where does Nvidia step in, then?

    From the Nvidia Deep Learning AI website:

    I AM A VISIONARY, A HEALER, A CREATOR, AND SO MUCH MORE. i am ai.

    POWERING CHANGE WITH AI AND DEEP LEARNING

    AI doesn’t stand still. It’s a living, changing entity that powers change throughout every industry across the globe. As it evolves, so do we all. From the visionaries, healers, and navigators to the creators, protectors, and teachers. It’s what drives us today. And what comes next.

    Many data scientists and technology teams around the world realised that they needed powerful processors to perform a highly reduced and specific set of calculations, calculations that only specialised and extremely expensive supercomputers could perform. Supercomputer makers like Cray and IBM sold their systems to large research institutes and universities at high profit margins, on the back of their unique ability to calculate rapidly and in a massively parallel fashion, an important factor for workloads of that type.

    At the other end of the computing spectrum, users wanted better-quality graphics for video gaming and image manipulation. Nvidia started to design, build and sell specialised video cards to OEMs (Original Equipment Manufacturers) like Dell for integration with their motherboards, a win-win situation. Better graphics meant that computers became more desirable as gaming or design machines. These designs evolved over the years and are now also sold as separate cards for builders to include in their offerings.

    In a quirk of circumstance, the type of processing required to produce detailed and fluid graphics for games is also ideal for the type of calculations required for AI. At a fraction of the cost, scientists and researchers could equip ordinary PCs with a bank of these cards and compete with multi-million-dollar supercomputers. (The GPU cores themselves are Nvidia’s own designs, though Nvidia already pairs them with ARM-based CPUs in several of its platforms.) As a pioneer in AI, Nvidia has developed a deep understanding of the field, as the website above indicates. That pivot let Nvidia surpass Intel a few weeks ago as the world’s most valuable chipmaker.
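
    To make that “quirk” concrete: both real-time graphics and neural-network training boil down to enormous numbers of independent multiply-and-add operations, which is exactly what a GPU’s thousands of cores are built for. A minimal sketch, using PyTorch purely to illustrate the workload:

    ```python
    # Minimal sketch: the core workload of deep learning is large matrix
    # multiplication, which maps naturally onto a GPU's thousands of parallel cores.
    import torch

    # Use a GPU if one is present, otherwise fall back to the CPU.
    device = "cuda" if torch.cuda.is_available() else "cpu"

    # A single neural-network layer is essentially: activations @ weights
    activations = torch.randn(4096, 4096, device=device)
    weights = torch.randn(4096, 4096, device=device)

    output = activations @ weights  # billions of independent multiply-adds
    print(output.shape, "computed on", device)
    ```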

    With AI becoming ubiquitous, like it or not, the purchase of ARM, which Nvidia sees as a cornerstone of its technological chops, will no doubt allow Nvidia to extend its lead in the AI world.

    Challenges remain, in that ARM licenses its technology to Nvidia’s direct competitors, like AMD (Advanced Micro Devices), and just how they will navigate those forces is unclear. If you think about it, Nvidia has to develop and execute a way to operate and profit from two very different business models at once. Selling IP is nothing like selling processors, and that job has just become much more complicated given the inherently competing interests of the buyers of the IP and the manufacturer.

    For now, Nvidia has stated that the business will continue to run as-is, with Nvidia itself being just one of many ARM customers “buying” the technology. Only time will tell if they manage to pull this off, but I’m currently positive on the long-term prospects of the deal.

    Intel’s Disruption Train

    As you know, I firmly believe we are witnessing a considerable disruption of an incumbent in the tech industry. It doesn’t happen as often as the tech press would like you to think. But it is an exciting spectacle to observe from a purely theoretical point of view. There will be many articles and possibly books written on the subject, if we still have books by then.

    With the Nvidia acquisition of ARM, Intel’s woes in the mobile processor, AI and datacentre fields have just got worse. Above, I purposely stuck to the more consumer side of the implications. But it is in the datacentre that the next battle for processor superiority will soon commence in earnest. We’ve already seen the inroads that Google and Amazon are making with the design of ARM-based servers that slot into racks alongside thousands of other servers, all furnishing your email, photo management and countless other modern-day necessities.

    With Nvidia, the risk for Intel is that Nvidia successfully manages to integrate the ARM designs into its products much more tightly and vertically, producing even more effective cloud server designs for specialist applications like AI and machine learning. It could, of course, do this without having to buy ARM, but the acquisition may give Nvidia a head start that puts the nail in the coffin of a significant business for Intel.

    I have no crystal ball, and I may be horribly wrong, but I think the next few years are going to be critical for Intel if it is to survive in its current form. As I’ve noted in other essays, this is not lost on Intel, and it is making business decisions that keep its margins up and fix the short term. I would like to see Intel be bold and try to out-disrupt itself. I think this deal gives Intel an opportunity to profit from the industry’s scepticism about Nvidia’s long-term intentions and to stun us with something truly new. It is time for Intel to look to the research labs for the next thing and give it a shot. That’s easier said than done, however.


    The Future is Digital Newsletter is intended for anyone interested in digital technologies and how they affect their business. I’d really appreciate you sharing it with those in your network.

    Share The Future is Digital

    If this email was forwarded to you, I’d love to see you on board. You can sign up here:

    Subscribe now

    Thanks for being a supporter, have a great day.

    → 3:44 PM, Sep 25
  • The tale of Intel’s disruption and Blockchain is useless

    It’s good to be back.

    I decided to take a summer holiday of sorts, stepping back from the pressure of writing these articles. If you know me, you’ll recall that I’ve pretty much not had a holiday since I started my professional life, much to the consternation of my family. So I took a little time for myself this year. Given that this year has been, er, rather unusual to say the least, it seemed the perfect opportunity.

    These articles are a labour of love and earn me absolutely nothing in monetary terms, so I have to work at the same time to earn a living in my day job, putting pressure on the time I have for this writing. I really enjoy the writing and hope to make it a significant part of my professional life in the near future.

    Speaking of which, I have a small favour to ask of you, my dear readers. I’m running a small study about the ICT industry in the eastern Caribbean and have concocted a short survey to give me an overview of the market. If only half of you respond, I’ll be well on my way to having useful data to work with. I’m sure you can be that half :)

    It’s not all one-way either. The better the data and the more data I have, the more I’ll write about the results here directly to your inboxes. You give, I give. What could be fairer?

    You can take the survey here:

    Quick Survey

    Thanks for your help.


    Disrupting Intel

    Last year, I wrote about Intel ignoring the threat of disruption to its core business, a situation that follows Clayton Christensen’s Disruption Theory virtually to the letter. From that article:

    If we follow DT to its conclusion, it is possible to see the risks Intel poses for itself, namely being innovated out of business. I’m clearly not suggesting that Intel will fail next year, but I think the long-term future is at risk if there is not some kind of reaction, with Intel creating further opportunities.

    I wrote at the time that Intel was concentrating on moving further up the stack to increasingly profitable zones, avoiding the threat of lower-end processor makers like AMD, and that Pohoiki Beach was designed to ensure Intel’s prosperous future.

    It was a good strategy on the face of it. Desktop and laptop chips were facing stronger competition than ever, something that was not the case in Intel’s heyday. The real threat, Advanced RISC Machines’ ARM designs, was only beginning to poke its head out of the development studios, and whilst ARM only had ambitions of capturing a small percentage of the market (10%, if I recall correctly), this, together with AMD, put real pressure on Intel. Intel had to react, and it did, by going upscale and upping margins on those products to compensate for reduced unit numbers.

    The thing many people don’t understand is just how phenomenally expensive it is to design a CPU. It takes months of research and prototyping, and each iteration and innovation adds substantially to those costs. As CPUs are designed with smaller and smaller transistors, costs go in the other direction, exponentially. Design costs are often dwarfed by tooling costs too. Tooling is the process of building and bringing online the fabrication plants that make the processors. Marketing is another expensive cost centre: Intel has famously piled millions into elaborate marketing campaigns to get the public to think that laptop chips only come from them.

    Other factors influence the costs too. It should be noted that CPUs are defined not only by their speed (something that has mostly been maxed out today, in that we can’t push the electrons any faster, or for long periods, without breaking the silicon) but also by their transistor size in nanometres, or nm. When you look at processor specifications, they will talk of 14nm, 10nm and smaller. The following chart from International Business Strategies gives an idea of the estimated design costs, and how they have multiplied as semiconductors have shrunk:

    Image: chart of estimated chip design costs by process node

    Source: IBS

    In the beginning, when it was a simple arms race of raw processor speeds, Moore’s Law (i.e., the number of transistors on a die will double every 18 months or so) meant that Intel could produce faster and faster chips for its target markets, namely desktops and servers. The server-specific chips came further down the road, after Intel saw the opportunity to custom-design and build what was essentially a desktop-class chip for a burgeoning market of businesses that needed to store documents and applications centrally. It followed the second level of the Digital Transformation model I wrote about in Issue 4: The Digital Transformation model in detail:

    Internal Exploitation

    The second level, Internal Exploitation, is defined by the process in which organisations attempt to integrate the different silos of information systems and datasets with the aim of producing a ‘whole’. Integration is difficult, slow and often results in failure when starting from a base that is not adapted to integration. Just how do you get the Accounts, Stock, HR and Sales systems integrated?

    There are two types of integration: technical integration and business-process interdependence. According to the model, most enterprises spend more time integrating on a technical level than on the business processes.

    Since then, the battle has become more technical and has required close coordination between the designers’ wants and the builders’ capabilities. So far, Intel has been outpaced in shrinking its transistors by the likes of TSMC, which has become the world leader in producing the most densely packed systems-on-a-chip. TSMC is not the only one either: Qualcomm and a couple of others are also at the forefront of ever-tinier devices year in, year out.

    The keen-eyed among you will note that I switched from talking about CPUs and processors to systems-on-a-chip (SoCs, pronounced ‘socks’). That is where the most prominent battleground is currently playing out. Not on pure CPUs, but on chips that contain several previously separate ancillary systems on the same die. Graphics, memory and other components are being reduced in size and brought physically closer to the processing units. In these minute devices, even a fraction of a millimetre can yield significant gains in data exchange and processing.

    Intel seems to be having trouble developing and manufacturing smaller transistors reliably, which in part explains its reliance on multi-core and multi-processor CPU designs. Its designs don’t need to be too concerned with size, power and heat-dissipation requirements: a desktop or a server is, for all intents and purposes, plugged into an infinite power source, and the cooling systems in server rooms, or the space in an office, provide all the heat-sinking required to ensure stable operation.

    Since the beginning, Intel took responsibility for designing, making, marketing and shipping its chips to PC and server makers. This vertically integrated strategy served it well, so well, in fact, that it became the de facto world leader in processors. Remember Intel Inside? But as recent news highlights, a traditional strength of an organisation can be turned into a weakness when disruption theory is well understood and utilised by competitors.

    Intel has now shown that it is nearly a whole generation, or “node” as it is known in the industry, behind TSMC. As a result, it has stated that it is going to outsource some of its production to... none other than the company outpacing it in chip building: TSMC, of course. From the FT:

    “To make up for the delay, Intel said it was considering turning to outside manufacturers from 2023 onwards for some of its production — a plan that appeared to leave little option but to subcontract its most advanced manufacturing to TSMC. The shift raised the prospect of a change in Intel’s business model, according to some analysts, forcing it to consider abandoning more of its manufacturing to focus on design.”

    Classic disruption theory!

    Cementing TSMC’s lead in processor manufacturing is another piece of news that may send shockwaves around the industry. No doubt most of you heard the long-awaited news that Apple, following its successful transition from PowerPC processors to Intel processors years ago, announced its intention to move its entire line of Macs to its in-house designed A-series processors, the same family of processors that powers its iPhones and iPads and probably countless other items in its inventory.

    These SoCs are currently world leaders in speed, power and cooling capabilities. For reference, the release of the iPhone 11 and iPad Pro showed that Apple-designed processors were faster than most of the Intel and AMD processors found in laptops of the day. Apple has successfully designed its processors using a small, relatively unknown player in the CPU market to produce a succession of world-beaters. That partner is ARM, or Advanced RISC Machines. If you’re interested, I could bore you with the ins and outs of the different instruction-set philosophies in CPUs (the RISC bit stands for Reduced Instruction Set Computer). They are all built by TSMC.

    Once Intel lets go of this market, it will be all but impossible to get it back in the future. Don’t worry, Intel is not going out of business anytime soon, and it will undoubtedly record higher profits and better margins in the future as it shifts its production away from a market that looks set to transition lock, stock and barrel over to ARM-designed processors and SoCs. Intel’s own mobile-focused processors are starting to get better, but it is too little, too late in my opinion.


    Blockchain, Schmockchain

    Regular readers will know that I have been mostly sceptical of Blockchain’s potential to change the world. It is currently a big, slow database that complicates things rather than simplifying them. From Blockchain ≠ Cryptocurrency, I said:

    ... that it is a huge energy consumer and hence by definition is inefficient. That, sadly, is not its only efficiency problem. Blockchain is actually extremely limited in its speed and quantity of transactions and scales poorly. So much so that in 2016 several banks exploring the possibility of using the technology in the personal and business banking sector abandoned the work as blockchain was just too slow.

    According to a detailed academic-style “peer-reviewed” study by the Centre for Evidence-Based Blockchain and reported in the FT:

    ... outside of cryptoland, where blockchain does actually have a purpose insofar as it allows people to pay each other in strings of 1s and 0s without an intermediary, we have never seen any evidence that it actually does anything, or makes anything better. Often, it seems to make things a whole lot worse.

    Worse, the report repeatedly highlights that the technology is a solution currently looking for a problem, the antithesis of the Jobs to be Done theory that helps us better design and provide solutions. With over 55% of projects showing no evidence of useful outcomes, and over 45% showing only “unfiltered evidence” (i.e., next to worthless), it would appear that Blockchain is a belief system rather than a technological solution.


    The Future is Digital Newsletter is intended for anyone interested in digital technologies and how they affect their business. I’d really appreciate it if you would share it with those in your network.

    Share The Future is Digital

    If this email was forwarded to you, I’d love to see you on board. You can sign up here:

    Subscribe now

    Visit the website to read all the archives. And don’t forget, you can comment on the article and my writing here:

    Leave a comment

    Thanks for being a supporter, have a great day.

    → 9:45 PM, Sep 9
  • After the first real TW, comes the first real TS

    Genuinely a worrying image to see when you live in the Caribbean. For the best sources to follow, and to learn how these predictions are made, the NOAA’s NHC website is key, but my preference is Tropical Tidbits. His nightly presentations (during active storms) on his YouTube channel are second to none: easy to understand, easy to take the key messages from, and free of the drama that causes panic, something the world’s TV channels would do well to emulate.

    22 July 2020 — French West Indies

    → 10:05 AM, Jul 22