Eliminating the Foreign Language Requirement

By Dr. Katharine B. Nielson

A couple of weeks ago, a colleague from a large U.S. research university contacted me, distressed because her institution had eliminated its foreign language requirement for undergraduates. The Dean of Libraries thought a good alternative would be to offer students a certificate for completing the levels of a one-size-fits-all, self-study language learning application, without any human interaction. My colleague, a professor of applied linguistics and an expert in bilingualism, had reviewed the application and determined that the program “is completely lacking in any theoretical basis, depends on translation, and is decontextualized,” and she hoped I could point her to research confirming her opinion.

Sadly, this is not a new phenomenon, despite research indicating that self-study language tools without any in-person support are an inadequate replacement for face-to-face instruction. U.S. colleges and universities have been looking for ways to cut costs by outsourcing language instruction to exclusively online providers for years. And that’s when they aren’t finding easy ways for learners to bypass taking foreign languages, or arguing that we should do away with the language requirement altogether. Just last week I was quoted in an article reporting that Moody College in Texas is no longer requiring foreign language study. High schools are jumping on the bandwagon as well: Indiana, Virginia, and Tennessee all want to make it possible to graduate from high school without ever having taken a foreign language.

As I’ve pointed out before, multiple times: generally speaking, the U.S. does an embarrassing job of teaching second languages. We force students to sit through years of mind-numbing grammar lessons and stilted dialogues, testing them on the rules governing when to use the passé composé or imparfait, and leaving them unable to order a coffee at a French bistro. But that doesn’t mean we should abandon teaching languages. The U.S. is the fifth largest Spanish-speaking country in the world (behind Mexico, Spain, Argentina, and Colombia), yet fewer than 20% of our citizens identify as bilingual. In today’s increasingly interconnected world, it’s incredibly important for everyone to speak more than one language, appreciate more than one culture, and communicate across borders and barriers.

It’s striking that while these institutions of higher education have been eliminating language courses, researchers have been learning more and more about how people learn languages and how to teach them effectively. We know from decades of empirical research that language learning works best when it’s relevant and based on learners’ needs. Fundamentally, language is a tool we use to do the things that are important to us, and that’s how we should organize our instruction. If more schools offered instruction that prepared learners to use the languages they are studying in a meaningful way, we wouldn’t have to convince people that language learning is a vital part of education; it would be as obvious to them as it is to anyone who has successfully learned a second language.

And, in the meantime, offering college credit for self-study language courses makes no sense at all. If anything, we should offer credit to students who can demonstrate real-world language proficiency: carrying on conversations, reading instructions, writing emails, following recipes, understanding poetry, watching movies, listening to music, or participating in political debates. You know, the kinds of things that human beings use language to do all day long.