Rowan Crozier, chief executive of the 125-year-old UK manufacturing company Brandauer, based in the Midlands, pauses for a moment while describing the company’s shift to a new state-of-the-art computer system.

“We switched from a highly flexible, largely paper-based system to one that was a lot more rigid . . . there was a lot we had to go through,” he says.

Brandauer’s business is based on technical excellence. Its precision presses make metal components, from dental floss cutters to ethernet connector pins, with such efficiency that it is one of the rare UK companies to export parts to China rather than import them.

But the IT challenges Brandauer has faced reflect a wider struggle among businesses overhauling creaking, but functional, software systems in order to become more productive. The changes mean that practices once second nature must be relearnt, old shortcuts no longer work and executives doubt whether it will all be worth it.

Meanwhile, software has continued to improve. Applications that only a few years ago would not have been out of place in a science fiction movie — such as video calls — are now commonplace.

Yet across the developed world productivity growth has fallen since the financial crisis began in 2007.

The great age of productivity growth was between around 1920 and 1970, an era that saw technological advances from telephones and washing machines to cars and aircraft. There was a second boost in the 1990s, when wider use of workplace computers helped companies produce more goods and services per worker — and ditch whole classes of employees such as secretaries.

The expectation was that the internet revolution of the early 2000s would bring another rise in productivity. But it has not worked out like that.

In the UK, the output of the average worker has not risen for nearly a decade. The Center for American Progress estimates the US will produce $2.8tn less in goods and services than was forecast before the 2008 financial crash, with slower productivity growth the main reason for the shortfall.

Productivity growth matters because without it companies will struggle to increase wages. This affects both workers and a nation’s tax receipts.

There is no agreement among economists about the reasons for the decline.

Professor Robert Gordon, of Northwestern University in the US, believes some inventions, however flashy, are just not as significant as others. His argument is in the ascendant, partly because alternative explanations on their own — from a lack of business investment to mismeasurement of the digital economy — do not seem sufficient.

Global corporate IT spending had been rising, but research company Gartner forecasts it will be flat in 2016. Even so, Gartner estimates that $3.41tn will be spent on IT this year, up from $2.6tn four years ago and more than the UK’s GDP.

John-David Lovelock, an analyst at Gartner, says the research shows a clear relationship between higher corporate IT spending and the economic output generated by each employee. But it is not a perfect fit by any means.

On the question of mismeasurement, Sir Charlie Bean, a former deputy governor of the Bank of England, concluded in a review published this year that official statistics were not capturing all of the benefits of the digital economy. If these were fully accounted for, average annual UK growth over the past decade would have been between 0.4 and 0.7 percentage points higher, the review said. Though significant, that adjustment would still leave productivity growth stalled.

Chad Syverson, of the Chicago Booth School of Business, has calculated that the unrecorded value of the digital economy would need to be worth $8,400 a year to the average US citizen — a fifth of net disposable income — to close the productivity gap.

Despite the apparent gloom, there is a more optimistic argument: that we are in a temporary pause before the gains from improved IT begin to show. Professors Erik Brynjolfsson and Andrew McAfee at MIT argue there has always been a delay between the time when technology arrives and when it really begins to make a difference. The gains are on the way, they argue.

Brandauer’s Mr Crozier takes this view. His company’s £400,000 IT investment was necessary even if it took time to bed in. “We’ve gone from making decisions based on gut instinct, to judgments based on data,” he says. The company is seeing measurable benefits and expects the investment to have paid for itself inside 18 months, he says.

This is just as well. Future living standards depend on small examples like this becoming commonplace.
