The impact of digital technologies on industry and society at large is the subject of numerous recent studies — but how significant is “digital transformation” when put in historical context? Is it any more significant than that which followed the introduction of the personal computer in industry and society in the 1980s or that instigated by the use of the Internet in the 1990s? Using my experience in IT across this period, I provide some possible answers to these questions.
The mainframe era
In the late 1960s, Ford’s centre for UK operations in Warley, Essex had two IBM System/360 Model 65 mainframe computers in its ground-floor computer room for use by its data-processing department. Each machine had 32 disc drives and 32 tape drives, plus card readers, card punches, several printers, and up to one megabyte of core memory. Production, inventory, payroll, and sales data came in from the main UK manufacturing plants for processing.
In the tape library, trolley loads of tapes were set up for running different programming jobs and taken to the computer room. They were later collected and put back on the stacked shelves in the tape library. Each tape had a cardboard record card on which movements in and out of the tape library were noted down. The cards were then stored away in sliding metal drawers.
Upstairs in the finance department, about 25 employees worked on analysing data. Equipped with lined graph paper, pencils, rubbers, pens, and printouts from the computer room, they were assigned tasks involving arithmetic calculations, some of them quite complex. Their work was then checked over by the senior analyst.
The advent of the personal computer
Glaxo Pharmaceuticals in Greenford, London had Hewlett-Packard HP 3000 minicomputers in its information management division, which were connected in a radial fashion to about 1,000 visual display units (VDUs). From these “dumb terminals”, users could access the company’s main production, sales, and finance systems (written in-house in COBOL, Pascal or Fortran).
In the mid-to-late 1980s, the IT provision changed dramatically as the personal computer (PC) was introduced across the company’s four main sites. These new desktop machines could still access the company’s main information systems, but also offered a range of new PC tools, including spreadsheets, databases, graphics packages and word processors (Figure 1). It became possible to download data from the main information systems into spreadsheets and databases for local analysis. End-user computing had arrived.
The end of the 1980s witnessed further disruptive change as the first modules of a packaged business software product, SAP, were implemented. Some old in-house program suites were phased out, with major implications for future systems development and support. On the PCs, Microsoft Windows emerged as the dominant operating system, bringing about major consolidation in the PC software sector.
In the 1980s, Microsoft Word was just one of many products available for the personal computer, but this package subsequently came to dominate the word-processing software market. Similar consolidation followed with spreadsheet, graphics and email packages.
The Internet “changes everything”
By the year 2000, cider-maker HP Bulmer in Hereford, UK had migrated from its old legacy systems running on proprietary hardware to an integrated set of software packages. Most of this software was Oracle-based and ran on Intel-based servers. The Internet, laptop computers, and mobile phones were widely used across the company.
E-business capabilities were being rapidly developed by an in-house team, with business-to-business and business-to-consumer order capture systems available on the web (Figure 2). Ownership and management of some elements of the IT provision had been transferred to the business functions for process management, data and systems maintenance, and e-business operations. Some of the systems were hosted off-site by third-party providers.
Some of the key IT issues of the day were highlighted at the IT Directors Forum in 2000. These included the rapid growth of e-business, the growing significance of information and data management, the need for new skills and competencies in the IT team, and the increasing importance of IT security following the Y2K technology hiatus.
The year 2023
The School of Computing and Engineering at the University of Gloucestershire, where I work, has a “best of breed” approach to its information systems, some of which are accessed via the Cloud while others remain on-premises. It has planned a full migration to cloud servers within the next 3-5 years.
Digital technologies are evident across the university’s four main campuses, notably social media, mobile, Internet of Things, and analytics. Artificial intelligence and blockchain applications are being trialled and assessed. WiFi connectivity has replaced much of the old hard-wired local area network infrastructure.
The university has a digital strategy (as well as an IT strategy), which notes, “The evolution of technology is having an impact on how teaching is delivered and increasing the expectation of students. Education 4.0 will change the way teaching and learning are delivered, making it more rewarding, and preparing students for this new dependence on systems, automation and intelligence in the workplace.”
Digital transformation: evolution or revolution?
These new ways of working are seen by some as a “remote work revolution,” accelerated by the Covid-19 pandemic. Digitalisation has also been the catalyst for changes in how information technology is managed in some organisations. Business functions have taken on new responsibilities for managing digital technology projects, and IT strategy is being repurposed accordingly.
The subjective snapshots outlined above provide only a partial view of change over the recent past. Zaryn Dentzel recently observed that “information technologies have wrought fundamental change throughout society, driving it forward from the industrial age to the networked era.” While digitalisation has had a significant impact on business and society, it has arguably not been any greater than that of new technology developments in previous eras.
At Ford, in the late sixties, each mainframe computer had just one megabyte of memory, and business analysts used pencil and paper. Today, desktop PCs typically have at least eight gigabytes of memory, and we have a remarkable range of communication and analytical tools at our disposal. This has been the cumulative result of a series of evolutionary technological changes and their application in business and society. Nevertheless, the continued development and widening deployment of digital technologies remain among the key challenges facing all organisations today. Their impact on our daily lives will inevitably continue to gather pace.