
Communications of the ACM


What Do Computing and Economics Have to Say to Each Other?


CACM Senior Editor Moshe Y. Vardi

In July 2020, I wrote[a] about a computational perspective on economics. I described a 1999 result by Elias Koutsoupias and Christos Papadimitriou regarding multi-agent systems. They studied systems in which non-cooperative agents share a common resource, and proposed the ratio between the worst possible Nash equilibrium and the social optimum as a measure of the effectiveness of the system. This ratio has become known as the "Price of Anarchy," as it measures how far from optimal such non-cooperative systems can be. They showed that the price of anarchy can be arbitrarily high, depending on the complexity of the system. The Price-of-Anarchy concept has since been extended to other types of equilibria, such as Pareto-optimal equilibria.[b]
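To make the ratio concrete, here is a minimal Python sketch (an illustration of mine, not from the column) computing the Price of Anarchy for Pigou's classic two-link routing example: one unit of traffic chooses between a link of fixed cost 1 and a link whose per-user cost equals the fraction of traffic on it. Selfish agents all take the variable link, while the social optimum splits the traffic evenly, so the ratio works out to 4/3.

# Pigou's two-link example: an illustrative sketch, not code from the column.
def social_cost(x):
    # Total travel cost when a fraction x of one unit of traffic takes the
    # variable link (cost x per user) and 1 - x takes the fixed link (cost 1).
    return x * x + (1 - x) * 1

# Nash equilibrium: the variable link never costs more than 1, so every
# selfish agent uses it (x = 1).
nash_cost = social_cost(1.0)  # = 1.0

# Social optimum: minimize x^2 + (1 - x) over [0, 1]; a coarse grid
# search is enough for this sketch (the true minimum is at x = 0.5).
opt_cost = min(social_cost(i / 10000) for i in range(10001))  # -> 0.75

print(nash_cost / opt_cost)  # Price of Anarchy = 1.333... = 4/3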

Price-of-Anarchy results debunk the key idea underlying market fundamentalism,[c] which is that "markets know best." This idea is rather tempting. Societies struggle to agree on the value of "things" due to the multiplicity of opinions. Why not let the market arbitrate? The value of a "thing" in equilibrium must be its true value, the argument goes. However, markets have many equilibria, and the actual equilibrium reached is not necessarily the "best" one, so we should not take it as the arbiter of true value. There is no escaping the multiplicity-of-opinions problem.
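The many-equilibria point shows up in even the smallest games. The following toy Python sketch (again mine, not the column's, with made-up payoffs) enumerates the pure Nash equilibria of a two-player "stag hunt"; it finds two equilibria with different social value, so the equilibrium the players happen to reach says nothing about which outcome is best.

# A toy "stag hunt" illustrating multiple equilibria; payoffs are invented.
# payoffs[r][c] = (row player's payoff, column player's payoff)
payoffs = [
    [(4, 4), (0, 3)],   # row plays Stag
    [(3, 0), (3, 3)],   # row plays Hare
]

def is_nash(r, c):
    # A cell is a pure Nash equilibrium if neither player gains by
    # unilaterally switching strategies.
    row_ok = all(payoffs[r][c][0] >= payoffs[r2][c][0] for r2 in range(2))
    col_ok = all(payoffs[r][c][1] >= payoffs[r][c2][1] for c2 in range(2))
    return row_ok and col_ok

for r in range(2):
    for c in range(2):
        if is_nash(r, c):
            names = ('Stag', 'Hare')
            print(f"Nash equilibrium at {names[r]}/{names[c]}, "
                  f"social value {sum(payoffs[r][c])}")
# Prints two equilibria: Stag/Stag (value 8) and Hare/Hare (value 6).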

What does economics say about computing? A decade ago, the economist Robert Gordon spoke of "the death of innovation, the end of growth."[d] The deep-learning revolution launched at about that same time. Current predictions estimate[e] that artificial intelligence may add some USD 13 trillion to global GDP by 2030. Predictions are difficult… especially about the future.

At the heart of economic pessimism about computing is the Productivity Paradox, named for the late economist Robert Solow's 1987 quip, "You can see the computer age everywhere but in the productivity statistics." As described by Erik Brynjolfsson, the paradox refers to the slowdown in U.S. productivity growth in the 1970s and 1980s despite the rapid development of computing technology over the same period.[f] This discrepancy is at the root of Gordon's pessimism. Nevertheless, in 2021, Brynjolfsson took the optimistic side of a bet[g] with Gordon that productivity would surge in the coming decade, thanks to AI.

Yet the analysis at the heart of the Productivity Paradox is weak. Economic productivity measures output per unit of input, such as labor, capital, or other resources. Productivity grows via myriad inventions and improvements spreading through the economy. Productivity growth in the U.S. from 1973 to 1995 was notably slower than in the preceding two decades (1951–1972). To truly assess the impact of computing on productivity, however, one would have to compare productivity growth between 1973 and 1995 with that of a hypothetical 1973–1995 U.S. economy without innovation and investment in computing. In other words, to assess the value of computing technology to the economy, one must contemplate how the economy would have fared without it. Such a comparison is, of course, not feasible; no such hypothetical model can be developed reliably. The point is that we do not really know what caused the slowdown in productivity growth from 1973 to 1995. In fact, it might have been worse without the investment in computing.

It seems intuitive that today's complex economic world would simply be infeasible without computing technology. JoAnne Yates described the intimate connection between information technology and economic complexity in her 1989 book, Control through Communication: The Rise of System in American Management. Modern computing has intensified this trend to the point that Bruce Lindsay, a well-known IBM database researcher, quipped that "relational databases form the bedrock of Western civilization." Indeed, if a massive electromagnetic pulse wiped out our computing infrastructure, our society would face catastrophic collapse.

Humanity has been fashioned by some very significant cultural revolutions: the development of language about 200,000 years ago, the development of writing about 5,400 years ago, the invention of printing in the 1440s, and the invention of the telegraph in the 1830s. Language, writing, print, and telegraph are all means of transmitting information. A culminating chapter in this 200,000-year information revolution is the development of computing and communication technologies in the 20th century, through which we learned not only to transmit information but also to process it.

Computing is huge! Pay attention to optimistic economists.


Author

Moshe Y. Vardi (vardi@cs.rice.edu) is University Professor and the Karen Ostrum George Distinguished Service Professor in Computational Engineering at Rice University, Houston, TX, USA. He is the former editor-in-chief of Communications.


Footnotes

a. See https://bit.ly/4adkde0.

b. See https://bit.ly/4ap2G2K.

c. See https://bit.ly/3RAp7dT.

d. See https://bit.ly/3tcubf4.

e. See https://bit.ly/3RC3pGu.

f. See https://bit.ly/4aujYvg.

g. See https://longbets.org/868/.


© 2024 Copyright held by the owner/author(s).
