Communications of the ACM

The Profession of IT

Misconceptions About Computer Science



When many of us were in school, we were given definitions of computer science such as "the study of information processes and their transformations" or "the study of phenomena surrounding computers." But when we entered the world of professional practice, we experienced computer science in a completely different way from these abstract definitions.

In our professional world, our ability to obtain a job depends on how well we display competence in using computational methods and tools to solve problems of interest to our employers. We have to be able to create small apps on the fly with no more effort than writing a Post-It note. We discover that we have customers who can be satisfied or not with our work—and that our professional advancement depends on an ever-expanding legacy of satisfied customers. We discover that over time we become proficient and our peers and bosses call on us to solve ever more complex problems. We are beset with unpredictable surprises and contingencies not covered in school or our previous experience—and yet we must deal with them effectively.

As an example, the current surge of deep-learning AI technologies has generated many benefits and created well-paying new jobs for data analysts and software designers who automate some mental tasks. These technologies are permanently displacing workers who used to do those tasks manually. Many readers of this column are well-paid designers, and yet even they worry that a technology surprise might push them overnight into the ranks of the unemployed. Our Internet technology has facilitated the globalization of labor and raised living standards in much of the world, yet it has stimulated a backlash of anti-immigration, anti-trade sentiment. The same technology has developed a dark side that includes hackers, data and identity thieves, scammers, polarizing websites, terrorists, and more. To help us cope with all this change and churn, we have organized ourselves into several hundred professional specialty groups hosted by ACM, IEEE, and others.

Because computing is so intimately involved with many fields, an educational movement called "CS for All" has emerged that aims to include some computing in everyone's K–12 education or professional development.

We note that the CS for All movement does not advocate that every child should learn to program for the sake of becoming a professional programmer or software engineer. Computing occupations are projected to grow at a higher rate than all other STEM areas combined. By one estimate, more than 7.7 million Americans use computers in complex ways in their jobs, almost half of them in fields that are not directly related to STEM.1 Regardless of their career, many professionals will be using computer science at work.

We have worked closely with many people in this movement. They have been confronted with a number of misconceptions about computer science, both in the audiences they are trying to reach and among themselves. These misconceptions can lead to expectations that cannot be met—for example, graduates thinking they have studied things that will help them land and keep good jobs, or employer expectations about what graduating professionals can do for them. These misconceptions can also interfere with practitioner abilities to navigate effectively in the real world of computing. Our purpose here is to call out the nine most pernicious of these misconceptions and call on our professional colleagues to work to dispel them.

CS = programming. The idea that programming is the core activity of computer science is easy to accept and yet it is only partly true. Computing professionals are expected to be able to program. But computing professionals engage in many other important activities such as designing software and hardware systems, networks, databases, and applications. The idea that coding (a subset of programming) opens the door to many career opportunities has intrigued the public because of the successful publicity of Hour of Code, after-school coding clubs for boys and girls, and coding competitions.

This misconception is not new. It took root in the 1970s and was repeatedly challenged over the years; ACM and IEEE, for example, spent considerable effort in the 1990s uprooting it.7 The most recent ACM/IEEE college curriculum includes 17 areas of computing technology besides programming.2 Even when computing is distilled to its core scientific and engineering principles, it is still a huge field in which programming is not the lead.4 The new Advanced Placement course CS Principlesa reflects a much broader view of computer science for high school seniors. Code.org's K–12 curriculumb covers much more than coding. Yet the "learn to code" movement seems to offer quick access to many well-paying jobs once you work your way through intensive bootcamps and workshops. The moment of truth comes when you discover in interviews that employers look for much more than the ability to code.

Programming is concerned with expressing a solution to a problem as notation in a language. The purpose of programs is to control machines—not to provide a means of algorithmic self-expression for programmers. Starting with Ada Lovelace's example programs in the 1840s, programming has always been concerned with giving instructions to a machine so that the machine will produce an intended effect. A programming language is a notation used to encode an algorithm that, when compiled into executable code, instructs a machine.

Computer scientists have long understood that every programming language (the "syntax") is bound to an abstract machine (the "semantics"). The machine—simulated by the compiler and operating system on the real hardware and network—carries out the work specified by the algorithm encoded in the program. Advanced programmers go further: they design new abstract machines that are more suited to the types of problems needing solution. The idea that programs are simply a notation for expression is completely disconnected from this fundamental reality that programs control machines.
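
As a minimal sketch of this binding (in Python, our choice of illustration language, not the column's), the function's source text is the notation, and the standard dis module shows the instructions the CPython abstract machine executes once that notation is compiled; the exact opcodes vary by interpreter version.

```python
import dis

def double_all(values):
    """Return a new list with every value doubled."""
    return [2 * v for v in values]

# The source above is notation in a language. The listing below is
# the compiled form: instructions for the CPython abstract machine,
# a stack machine simulated on the real hardware.
dis.dis(double_all)
```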

A recent illustration of this is the legal battle by copyright owners to block the distribution of decryption software that unlocked copyright protection. The decryption software would have been uninteresting if it were merely a means of expression. But that software, when run on a machine, broke the copy protection.


Once you master a core knowledge base including variables, sequencing, conditionals, loops, abstraction, modularization, and decomposition, you will be a computing professional. This is a woefully incomplete characterization of what computing professionals need to know. The concepts listed are all programming concepts, and programming is a small subset of CS. The listed concepts were central in the 1960s and 1970s when programming was the main interface to computers. Today, you simply cannot be a competent programmer with little skill at systems, architectures, and design, and with little knowledge of the domain in which your software will be used.
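
To make the point concrete, here is a hypothetical beginner exercise (in Python; the function is our invention) that exhibits every one of the listed concepts in about ten lines. Mastering everything in it still says nothing about systems, architectures, design, or domain knowledge.

```python
def count_positives(numbers):
    """Decomposition: the subtask is isolated in one small function."""
    count = 0                      # variable
    for n in numbers:              # loop
        if n > 0:                  # conditional
            count += 1             # sequencing: statements run in order
    return count

# Abstraction and modularization: callers use the name, not the details.
print(count_positives([3, -1, 4, -1, 5, -9]))  # prints 3
```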

Programming is easy to learn. Programming and coding are skills. Programmers progress from beginner to expert over a long period of time, and it takes ever more practice and experience to reach the higher stages. Becoming proficient at programming real-world applications is not easy. The much-publicized kid coders are mostly beginners.

Educators have been searching for many years for ways to accelerate learning programming. Seymour Papert introduced the Logo language in the 1970s and watched how children got excited by computing and learned how to think computationally. He attuned Logo to children's interests; even so, it still took students time to move from the fascination of the introduction to the ability to program useful computations regularly.

Computational thinking is the driver of programming skill. Computational thinking (CT) is an old idea in CS, first discussed by pioneers such as Alan Perlis in the late 1950s.8 Perlis thought "algorithmizing" would become part of every field as computing moved in to automate processes. Dijkstra recognized in 1974 that he had learned new mental skills while programming. In his 1980 book Mindstorms, Papert was the first to mention the term CT explicitly when discussing the mental skills children developed while programming in Logo. Jeannette Wing catalyzed a discussion about how people outside CS could benefit from learning computing.9 The common thread was always that CT is the consequence of learning to program.

Modern versions of the CT story have turned this upside down, claiming that CT is a knowledge set that drives programming skill. A student who scores well on tests to explain and illustrate abstraction and decomposition can still be an incompetent or insensitive algorithm designer. The only way to learn the skill is to practice for many hours until you master it. The newest CSTA guidelines move to counteract this upside-down story, emphasizing exhibition of programming skill in contests and projects.c

Because computation has invaded so many fields, and because people who do computational design in those fields have made many new discoveries, some have hypothesized that CT is the most fundamental kind of thinking, trumping all the others such as systems thinking, design thinking, logical thinking, scientific thinking, etc. This is computational chauvinism. There is no basis to claim that CT is more fundamental than other kinds of thinking.

When we engage in everyday step-by-step procedures we are thinking computationally. Descriptions of everyday procedures use the term "step" loosely to refer to an isolated action of a person. That meaning of step is quite different from a machine instruction; thus most "human executable recipes" cannot be implemented by a machine. This misconception leads people to misunderstand algorithms and therefore to overestimate what a machine can do.

Step-by-step procedures in life, such as recipes,d do not satisfy the definition of algorithm because not all their steps are machine executable. That humans can simulate some computational steps does not change the requirement that a machine be able to perform the steps. This misconception undermines the definition of algorithm and teaches people the wrong things about computing.
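
The contrast is easy to see in a real algorithm. In Euclid's algorithm below (our illustration, in Python), every step is precise enough for a machine to execute, which is exactly what recipe steps like "stir until smooth" are not.

```python
def gcd(a, b):
    """Greatest common divisor by Euclid's algorithm."""
    # Every step is machine executable: a comparison, a remainder,
    # an assignment. No step appeals to human judgment the way
    # "season to taste" does.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # prints 21
```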

Computational thinking improves problem-solving skills in other fields. This old claim is called the "transfer hypothesis." It assumes that a thinking skill automatically transfers into other domains simply by being present in the brain. It would revolutionize education if true. Education researchers have studied automatic transfer of CT for three decades and have never been able to substantiate it.6 There is evidence on the other side: slavish faith in a single way of thinking can make you a worse problem solver than being open to multiple ways of thinking.

Another form of transfer—designed transfer—holds more promise. Teachers in a non-CS field, such as biology, can bring computational thinking into their field by showing how programming is useful and relevant in that field. In other words, studying computer science alone will not make you a better biologist. You need to learn biology to accomplish that.
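
As a hypothetical illustration of designed transfer (the scenario and function name are ours), a biology teacher might have students compute the GC content of a DNA sequence, a quantity biologists routinely care about, so the programming is learned inside the biology rather than alongside it.

```python
def gc_content(dna):
    """Fraction of bases in a DNA sequence that are G or C."""
    dna = dna.upper()
    return (dna.count("G") + dna.count("C")) / len(dna)

print(gc_content("ATGGCGCATT"))  # prints 0.5
```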

CS is basically science and math. The engineering needed to produce the technology is all based on the science and math. History tells us otherwise. Electrical engineers designed and built the first electronic computers without knowing any computer science—CS did not exist at the time. Their goal was to harness the movement of electrons in circuits to perform logical operations and numerical calculations. Programs controlled the circuits by opening and closing gates. Later scientists and mathematicians brought rigorous formal and experimental methods to computing. To find out what works and what does not, engineers tinker and scientists test hypotheses. In much of computing the engineering has preceded the science. However, both engineers and scientists contribute to a vibrant computing profession: they need each other.
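
A sketch of what those engineers wired in hardware, rendered in Python purely for illustration: logical operations composed into numerical calculation. XOR and AND on single bits form a half adder; two half adders and an OR form a full adder, the building block of binary arithmetic.

```python
def half_adder(a, b):
    # XOR gives the sum bit, AND gives the carry bit.
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    total, c2 = half_adder(s1, carry_in)
    return total, c1 | c2

# 1 + 1 + 1 in binary is 11: sum bit 1, carry bit 1.
print(full_adder(1, 1, 1))  # prints (1, 1)
```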

Old CS is obsolete. The important developments in CS such as AI and big data analysis are all recent. Computing technology is unique among technologies in that it sustains exponential growth (Moore's Law) at the levels of individual chips, systems, and economies.3 Thus it can seem that computer technology continually fosters upheavals in society, economies, and politics, and that it renders itself obsolete every decade or so. Yet many of the familiar principles of CS were identified in the 1950s and 1960s and continue to be relevant today. The early CS shaped the world we find ourselves in. Our history shows us what worked and what did not. The resurrection of the belief that CS = programming illustrates how those who forget history can repeat it.

Artificial intelligence is an old subfield of CS, started in the early 1950s. For its first 30 years, AI researchers pursued a dream of intelligent machines. When they were unable to come close to realizing that dream, they gave up on rule-based AI systems and turned instead to machine learning, which focuses on automating narrow mental tasks rather than on general intelligence. They were able to build amazing automations based on neural networks without trying to imitate human brain processes. Today's neural network models do far better than humans at some mental tasks, so much better that we now face social disruption over joblessness caused by AI-driven automation.


Conclusion

We welcome the enthusiasm for computer science and its ways of thinking. As professionals, we need to be careful that in our enthusiasm we do not entertain and propagate misconceptions about our field. Let us not let others oversell our field. Let us foster expectations we can fulfill.


References

1. Change the Equation. The hidden half. Blog post. (Dec. 7, 2015); http://changetheequation.org/blog/hidden-half

2. Computer Science Curricula 2013; https://www.acm.org/education/CS2013-final-report.pdf

3. Denning, P. and Lewis, T.G. Exponential laws of computing growth. Commun. ACM 60, 1 (Jan. 2017).

4. Denning, P. and Martell, C. Great Principles of Computing. MIT Press, 2015.

5. Denning, P.J. et al. Computing as a discipline. Commun. ACM 32, 1 (Jan. 1989), 9–23.

6. Guzdial, M. Learner-Centered Design of Computing Education: Research on Computing for Everyone. Morgan & Claypool, 2015.

7. Tedre, M. Science of Computing: Shaping a Discipline. CRC Press, Taylor & Francis, 2014.

8. Tedre, M. and Denning, P.J. The long quest for computational thinking. In Proceedings of the 16th Koli Calling Conference on Computing Education Research (Koli, Finland, Nov. 24–27, 2016), 120–129.

9. Wing, J. Computational thinking. Commun. ACM 49, 3 (Mar. 2006), 33–35.


Authors

Peter J. Denning (pjd@nps.edu) is Distinguished Professor of Computer Science and Director of the Cebrowski Institute for Information Innovation at the Naval Postgraduate School in Monterey, CA, is Editor of ACM Ubiquity, and is a past president of ACM. The author's views expressed here are not necessarily those of his employer or the U.S. federal government.

Matti Tedre (matti.tedre@acm.org) is Associate Professor of Computer and Systems Sciences at Stockholm University, Sweden, adjunct professor at University of Eastern Finland, and the author of Science of Computing: Shaping a Discipline (CRC Press, Taylor & Francis, 2014).

Pat Yongpradit (pat@code.org) is the Chief Academic Officer for Code.org and served as staff lead on the development of the K–12 Computer Science Framework. A former high school computer science teacher, Pat has been featured in the book American Teacher: Heroes in the Classroom, has been recognized as a Microsoft Worldwide Innovative Educator, and is certified in biology, physics, math, health, and technology education.


Footnotes

a. https://advancesinap.collegeboard.org/stem/computer-science-principles

b. https://code.org/educate

c. http://www.csteachers.org/?page=CSTA_Standards

d. One of the K–12 curriculum recommendations actually cites making a peanut butter and jelly sandwich as an example of an algorithm.


Copyright held by authors.



 
