Coding Should Not Replace Foreign Language Requirements

Allowing coding or even computer science to displace other forms of instruction belies its prominence in real-world commerce.

***

In late 2013, the state of Texas passed legislation allowing some students to replace their high school foreign language requirements with computer science credits. In the years since, several other states have joined suit, debating changes to school curricula that would make Coding 101 a fair substitution for French, Latin, or Spanish.

The benefits of learning to program are fairly obvious. The ability to construct automated systems to solve problems is an eminently valuable (and employable) skill.

The case for developing foreign language ability is less clear, however, at least as far as potential employment is concerned. After all, English is the global language of commerce: NATO, for example, conducts most of its operations in English. Europeans are often multilingual by necessity, and multilingual ability is also increasingly prevalent in Asian and African markets. Many native English speakers may therefore feel that there is no need to learn a foreign language when they can so often communicate readily in English. Moreover, people who learn to program are often funneled into high-paying jobs as programmers, while people who learn to speak Spanish, French, or Japanese may find that their linguistic skills do not translate as readily into clearly defined careers.

So is foreign language instruction necessary for native English speakers? In a word: yes.

Human vs. Computer Languages

In its most basic form, programming is the act of feeding a predetermined set of instructions to a machine that interprets and executes them. The simplest computer scripts are comparable to recipes, and programming is an essentially linear, deductive, goal-oriented form of thinking. Unlike a recipe for cooking, however, programming can be unforgivingly rigid. There is an oft-repeated saying that “computers do what you tell them to do, not what you want them to do.” Beginner (and non-beginner) programmers often grapple with the exactitude required for writing programs. You could, for example, ruin an entire database with a misplaced bracket.
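That exactitude can be made concrete with a small, hypothetical Python sketch (the record names here are invented for illustration): a single misplaced pair of parentheses silently changes what the program does, and the computer executes the mistake without complaint.

```python
# A tiny, hypothetical cleanup task: keep everything except drafts.
records = ["draft-a", "draft-b", "final-report"]

# Intended: drop only the records whose names start with "draft".
kept = [r for r in records if not r.startswith("draft")]
print(kept)    # ['final-report']

# One slip -- the parentheses and argument are left off -- and the
# condition tests the method object itself, which is always truthy,
# so `not r.startswith` is always False and *every* record is dropped.
broken = [r for r in records if not r.startswith]
print(broken)  # []
```

The computer did exactly what it was told in both cases; only the programmer's intent distinguishes the correct version from the broken one.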

Human-to-human communication, on the other hand, is idiomatic, expressive, and often downright ambiguous. (Consider that the word “private” can refer to a military rank, a body part, exclusivity, or ownership.) People, of course, are not machines, and while they are certainly more accommodating than computers, they are also less compliant and predictable. One can’t feed verbal inputs into a person and perfectly predict the resulting behavior. To the extent that foreign language instruction is an exercise in interpersonal communication, computer programming of any stripe is a very poor substitute.

Another distinction between human and computer language has to do with translatability. Some programming languages can be described as Turing-complete, meaning that they can express the universe of computable algorithms and procedures and can, in principle, be translated into one another. By contrast, human languages are shaped by historical, cultural, and environmental contexts; there are many well-known instances of words and sentiments that cannot be translated from one language into another.

There is a final important practical difference between human language and computer language: human languages have evolved over thousands of years, whereas computer languages — at least so far — have had much shorter lives, and much shorter half-lives. They emerge and become obsolete in sometimes startlingly brief intervals.

Variation in Technological Literacy

There is reason to be somewhat skeptical of the claim that “coding is the new literacy.” The ability to code is one of many possible tools a person can leverage in the digital economy. It is true that today’s leading industries reward workers whose skills complement the strengths of machines. These skills, however, are found not only in programmers, developers, and engineers, but also in marketers who can leverage the power of social media; machinists who can use AutoCAD and laser-sintering 3-D printers; businesspeople who understand how the Internet can provide novel solutions to problems and generate value; and designers who understand the intersection of Web interfaces, art, and psychology. Many of the emerging industries and most sought-after technology job titles are essentially interdisciplinary, and computer programming, or the use of computers, is typically just one component of the work. Coding, then, is a single facet of technological literacy.

Allowing coding or even computer science to displace other forms of instruction belies its prominence in real-world commerce. Even wildly successful software companies like Facebook and Google employ only about 23,000 and 72,000 people, respectively (of whom a relative handful are engineers or developers whose primary job consists of writing code); contrast that with manufacturers like IBM, which employs more than 380,000 people, or retailers like Walmart, which employs 2.2 million. Despite the well-known shortfall of skilled technology workers, it is unlikely that Internet and software companies will ever approach the employment rolls of manufacturers and retailers.

Need for Intercultural Competence

Any list of the “aims” of education is likely to mention “employability,” but there are also oft-repeated phrases such as “critical thinking,” “civic engagement,” “ethical and moral sense,” and “communication.” Derek Bok, the former president of Harvard, specifically identified intercultural competence and the ability to live in a culturally and experientially diverse world as being important for students, citing globalization and the increasing diversity of the American population as reasons for developing these essential skills.

Foreign language instruction gives students the opportunity to learn about the histories and experiences of others, as well as to reexamine their own cultural perspectives. If the purpose of technology is to solve human problems, then shortchanging the parts of schooling that engage with humanity and culture is extremely misguided.

A Well-Rounded Education

Arguments against foreign language instruction and in favor of computer science replacement are similar to those leveled against art, music, drama, and other subjects deemed nonessential. With every publication from the Programme for International Student Assessment (PISA) or Trends in International Mathematics and Science Study (TIMSS), the pendulum swings toward the goal of “practical” skills and employability. Meanwhile, psychometric tests of creativity have registered continually declining scores in creative thinking for decades. Arts education is frequently slashed despite its demonstrable benefits to students’ cognitive and academic abilities.

The bottom line is that while computer programming is important to learn, it should not displace instructional time and resources reserved for other disciplines, and it should not be considered comprehensive with respect to general technological literacy. As some schools have shown in their pioneering work with art, perhaps the answer lies not in replacing one academic priority with another, but in integrating technology into every aspect of education, including foreign language instruction.

Charles is a professional nerd with a deep and abiding interest in lifelong learning and personal growth. Outside of work he studies social science and the Western Canon, practices martial arts, listens to classical music and heavy metal, and thinks about open-ended questions. He is a sometimes ambivalent alum of the University of Chicago ('11), where he double-majored in economics and political science and was heavily steeped in the life of the mind while largely miseducated about the real world.