Issue 2: What is Digital Transformation?

Defining Digital Transformation

In this, the second issue, I dig into the origins of digital technologies and how they came about, tracing their uses up to the modern day and discussing innovations like 5G and the smartphone.

On to the newsletter, I hope you enjoy it.

What is Digital Transformation?

It’s a very good question: what actually is Digital Transformation?

One of my favourite definitions of Digital Transformation is one from R “Ray” Wang of Constellation Research:

“Digital Transformation is the methodology in which organizations transform and create new business models and culture with digital technologies”

But therein lies one of the difficulties associated with Digital Transformation: there is currently no standardised definition, only a bunch of aims, mantras and promises from leading organisations, mostly tech companies, with a vested interest in selling you something.

That’s not to say these definitions are inherently bad; it’s just important to understand the motivations behind them in order to properly evaluate the products, services and information shared.

To better understand the origins of Digital Transformation, we can look back at the not-too-distant past and the things that influenced today’s digital world.

Business before Digital Transformation

The history of computerisation is relatively recent, with essentially three inventions founding pretty much everything we have today.

The first: in an article published on 30 June 1945, John von Neumann described the concept of storing a program in a calculating machine. The second, something even more profound for our everyday lives, came to life in the famous Xerox PARC labs: Ethernet, developed in 1973 largely through the efforts of Bob Metcalfe, allowed computers to communicate with one another for the first time. The last, and arguably the most important, is HTTP (Hypertext Transfer Protocol), invented by Sir Tim Berners-Lee in 1989 at the European Organization for Nuclear Research (CERN, from the French Conseil Européen pour la Recherche Nucléaire). This innovation is the basis for absolutely everything that happens on the Web.

As a side note, I recently discovered that HTTP, even at that early stage, had provisions for micro-payments built in. We’re all familiar with infamous error pages like the ubiquitous 404 and Twitter’s Fail Whale, but how many know that there is a 402?

10.4.3 402 Payment Required

This code is reserved for future use.
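Just to illustrate, here is a hypothetical sketch (not anything from the original spec) of how trivially a toy web server could answer with that 402 status today, using only Python’s standard library; the port and response text are arbitrary choices of mine:

```python
# A toy HTTP server that answers every request with 402 Payment Required,
# purely to show the status code in action. Hypothetical sketch: the port
# and response body are arbitrary choices, not anything defined by the spec.
from http.server import BaseHTTPRequestHandler, HTTPServer

class PaywallHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(402, "Payment Required")
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"402 Payment Required - reserved for future use.\n")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), PaywallHandler).serve_forever()
```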

From these beginnings all Information Technology has grown, with four distinct categories defining what Information Systems (IS) are: Hardware, Software, Networks and the Human.

Hardware

Hardware, as the name suggests, makes up the bottommost layer of the stack: essentially, the items that physically constitute the IS in its entirety. Servers, routers, switches, personal computers, printers, terminals and mobile devices are all hardware, regardless of their ultimate function.

Arguably, today in our cloud world, the server and its associated support hardware (routers etc.) are the most important elements of our information systems. Various categories of server are in use, the most popular being application servers, directory servers, communications servers, database servers, file servers, media distribution servers and, more generically, web and proxy servers.

The democratisation of the server happened in the 1990s, on the back of the popularity of the personal computer. Companies like IBM, DEC and Compaq (now part of HP) disrupted the mainframe and mini business by producing what were essentially souped-up personal computers that fit neatly into businesses of all types and sizes, providing simple server-based functions for running the company.

This gave life to the Client/Server Computing model, where we see a distinction between the roles of each part of the system: clients present results and help formulate demands, but the bulk of the work is done at the server level. The acceleration of Intel’s processors only made this model more and more compelling, with even the smallest of organisations able to command calculation power orders of magnitude higher than anything they could have imagined barely a decade earlier. Remind you of anything?
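To make that division of labour concrete, here’s a minimal, hypothetical sketch in Python (the host, port and the “work” being done are all invented for illustration): the client formulates a small demand, the server does the heavy computation, and only the result travels back.

```python
# Minimal sketch of the client/server split: the client formulates a demand,
# the server does the heavy lifting and returns only the result.
# Hypothetical example; host, port and the "work" are arbitrary choices.
import socket
import threading

srv = socket.create_server(("localhost", 5000))  # server listens before any client connects

def serve_once():
    conn, _ = srv.accept()
    with conn:
        n = int(conn.recv(64).decode())          # the client's demand
        result = sum(i * i for i in range(n))    # the bulk of the work happens server-side
        conn.sendall(str(result).encode())       # only the result travels back

threading.Thread(target=serve_once, daemon=True).start()

with socket.create_connection(("localhost", 5000)) as client:
    client.sendall(b"1000000")                   # a small request goes out...
    print("Result from server:", client.recv(64).decode())  # ...a computed answer comes back
srv.close()
```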

Software

As hinted at previously, the roots of software are found in that article from John von Neumann, and like hardware, many forms exist. In some respects, one could argue that an infinite number of software types could exist by definition: software is infinitely ‘programmable’, unlike hardware, which once designed is static until the next design modification.

However, we can identify three main types of foundational software: system software (used to start and run the hardware), development software (used, in a meta manner, to create new software) and application software (think about what you’re reading this on, probably a mail client or web browser).

The main system software are operating systems (OSes) like Unix, Linux and Windows. Interestingly, macOS is based on Unix, and Android on Linux. But without thinking too deeply, you may notice that more than just computers are run by OSes: your TV decoder, for example, and even household appliances often contain sophisticated computers booting off an OS like Linux. Development software is extremely elaborate, with parts such as the IDE (Integrated Development Environment) that contain editors, compilers and debugging software.

Compilers translate text written in a specific syntax into machine-readable code, i.e. something the processor can execute directly to produce the desired result. The editor is there to help the developer construct the instructions, and the debugger, as its name suggests, helps the developer find issues and bad instructions in the code.
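As a loose illustration of that translation step (using Python’s bytecode rather than true machine code, so treat it as an analogy only), here’s a tiny sketch that compiles a line of source text and shows the low-level instructions it produces:

```python
# Loose analogy for what a compiler does: source text goes in, executable
# instructions come out. Python compiles to bytecode for its virtual machine
# rather than native machine code, but the translation principle is the same.
import dis

source = "result = (3 + 4) * 2"                      # text written in a specific syntax
code_object = compile(source, "<example>", "exec")   # translate it into instructions

dis.dis(code_object)       # inspect the low-level instructions produced
exec(code_object)          # execute them to produce the desired result
```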

Application software is the most prevalent, and it is the part of the stack that is touched directly, quite literally in some cases, by end users.

Networks

The network links all the parts together: hardware, software and the people. Networks are of increasing importance in Digital Transformation, and the network effects of the new business models will become apparent in future issues.

Dartmouth College, based in New Hampshire, USA, is credited with the birth of computer networks. In 1940 a “Teletype” machine there communicated with another in New York (a keypress was received at the other end of the network in real time). Then in the 1950s a primitive network called SAGE (Semi-Automatic Ground Environment), and don’t you just love the names they had for things in those days, was developed for the military’s radar systems. Its usefulness was quickly appreciated and commercial applications rapidly followed, the first being SABRE (Semi-Automated Business Research Environment), which was used as a reservation system by the airlines of the time.

At around the same time the Advanced Research Projects Agency (ARPA) created its concept network, the Intergalactic Computer Network. Its aim: to allow any computer to communicate with any other computer, regardless of its physical position on Earth. This quickly gave rise to WANs (Wide Area Networks), with the agency’s own called ARPANET. Coincidentally, it’s also worth noting that telephone networks had just been computerised by the likes of Western Electric.

The first commercial use of networks happened in the 1970s, using the X.25 and TCP/IP protocols, the same ones in use today on the Internet, which is itself a direct descendant of the original ARPANET.

As a side note, my first Internet experience was at a university in the UK, where I would have to log on to an academic network called JANET, hop on to an intermediary one called NIST, then finally reach the Internet, availing myself of all the wonderful things on offer and eventually using a new protocol and interface called Gopher.

Getting back to the subject, everything changed in the 2000s. Broadband was born, and suddenly virtually anyone who wanted fast internet was able to have it. Then, on the heels of that, mobile broadband arrived through technological advances like EDGE, 3G and then 4G. Currently 5G is being implemented, with promised speeds of around 1 to 10 Gbps, which, to put it in perspective, is enough to download several HD films in less than a minute.
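To put some rough numbers on that claim, here’s a quick back-of-the-envelope calculation; the ~4 GB film size and the 4G comparison speed are my own assumptions for illustration, not figures from the article:

```python
# Back-of-the-envelope download times for an HD film at different speeds.
# Assumptions (mine, for illustration): an HD film is roughly 4 GB, and a
# decent 4G connection runs at about 0.1 Gbps.
FILM_SIZE_GB = 4
film_size_bits = FILM_SIZE_GB * 8 * 10**9            # gigabytes -> bits

for label, speed_gbps in [("4G (~0.1 Gbps)", 0.1),
                          ("5G low end (1 Gbps)", 1),
                          ("5G high end (10 Gbps)", 10)]:
    seconds = film_size_bits / (speed_gbps * 10**9)
    print(f"{label}: about {seconds:.0f} seconds per film")

# Roughly 320 s on 4G, 32 s at 1 Gbps and 3 s at 10 Gbps, so even the low
# end of 5G fetches a couple of HD films comfortably inside a minute.
```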

People (Meatware)

All systems are operated and used by humans, but the crucial part of the equation is that end users need to be supplied with something they are interested in if they are to dedicate time to the system. This is where we see that most systems, from the beginning of the IS revolution right up to modern-day computing and social networks, are geared towards pleasing their targeted public.

For a business, the CIO/CTO plays the crucial role in choosing and implementing systems, and outside the business it’s the users who vote en masse, either for or against a particular piece of software engineering. The difficulty for businesses is to find and use systems that are both pleasing and efficient. But I’m getting ahead of myself and will cover this difficulty in the near future.

Other important factors

Although I singled out only four main categories (hardware, software, networks and humans), there are two more that merit discussion: virtualisation and the smartphone. Virtualisation is software; you could argue over whether it’s system software or application software, but it’s unimportant to classify it so narrowly. What is important is that it has had a profound effect on IS since its inception in the 1960s.

IBM invented virtualisation for its S/360 computer to allow it to run several things simultaneously, using a rudimentary “time sharing” algorithm. Only more recently has the general IT world caught up with this idea: products like VMware and Microsoft Hyper-V have appeared over the last couple of decades, allowing the construction of massive datacenters around the world and giving businesses the ability to run programs on “time-shared” servers. Virtualisation, being so successful at the processor and system software end, spawned business models that allow anyone to ‘hire’ a slice of a computer for a one-off requirement or for the creation of something new, foregoing the old upfront costs of buying and setting up a server. Almost everything needed at the datacenter is virtualised to allow multiple simultaneous users and, of course, revenue streams.

Saving for last the device with possibly the biggest impact on society: the smartphone first came along in 1994 via a concept from IBM, then called the Simon Personal Communicator. It was a PDA (Personal Digital Assistant) and telephone in an all-in-one package. A couple of years later Nokia introduced its 9000 Communicator, again a PDA and phone inseparably linked. However, it came with an extra that differentiated it from the rest of the bunch: it had the Internet. Quickly, as is the way, many other companies took a stab at this new and exciting market, with the likes of Microsoft, BlackBerry, Handspring and Palm building and selling their own take.

Then, once again and as it always does (we’ll cover disruption in the future), it all changed.

In January 2007 Steve Jobs demonstrated an incredible new device on stage for the first time, ushering the iPhone into the public eye. It’s worth remembering what phones looked like back then (see image), but also remember that Nokia and BlackBerry owned the market at the time, and in just a few short years Apple and Google, in a cruel one-two punch, pushed both of those players out of the handset business.

Image: Phones before the iPhone, 2007 (source: Obama Pacman)

Image: Apple

Now that we have a good overview of digital history, the next edition will dive into the subject of digitalisation and its origins.

Reading List

The Story of Twitter’s Fail Whale

In keeping with this week’s subject, I thought it would be good to link to this article from 2015 that, despite coming only two years after the event, looked back at it so nostalgically. Things get anchored in Internet folklore so quickly, and are oftentimes forgotten even quicker. Sadly missed, even though it highlighted more of a negative than a positive.

From shopping to car design, our customers and partners spark innovation across every industry

Microsoft’s vision of Digital Transformation, worth reading. I’ll be visiting Microsoft, probably regularly, as it is an actor in the space that we simply cannot avoid, and I don’t mean that negatively. Microsoft are in their own transformation, pivoting into something more than just a tech company. Good to see Satya Nadella moving things in this direction.


Thanks for reading. In the next issue (on Friday, then every Friday after that), we’ll get into Digitalisation and why Digital Transformation is different.


The Future is Digital Newsletter is intended for a single recipient, but I encourage you to forward it to people you feel may be interested in the subject matter.

Thanks for being a supporter, I wish you an excellent day.

Matthew Cowen @matthewcowen