Month: May 2019

Gianfranco Conti’s Claim and the Evidence

Dr. Gianfranco Conti just joined CI LIFTOFF. Yay! Now, along with Bill VanPatten, this group has two experts. VanPatten has a PhD in linguistics (where he focused on SL processing strategies) and has written a few hundred articles in scholarly journals and books, a whack of books, some textbooks, and some novels (his 2011 CV is here). Conti is a French and Spanish teacher with a PhD in applied linguistics (his CV is here).

Conti and I have divergent views about SLA. Mine is pretty standard: during a discussion, I wrote that “the only way language acquisition occurs is through processing comprehended input.” Whatever else may be going on with a learner and their class/learning environment (e.g. forced output, grammar teaching & practice, grammar-“rule” feedback, etc), it is the C.I. that the learner is getting that drives the acquisition bus. That’s my claim.

Conti countered with this: “Chris Stolz you are welcome to your viewpoint, but the weight of research is solidly against you. Explicit instruction appears time and again to be superior to implicit instruction and there is an argument that it demonstrates to the learners that they can approach language empirically, just like biology or chemistry, and thus makes it more interesting to a wider range of learners.”

Note that Conti’s claim has one giant problem: he doesn’t define what “superior” means, or to what it applies. I give him the benefit of the doubt and guess that “superior” means “generates more durable and accurate mental representation of the target language in the learner’s brain.”

Being the data-slave that I am, I asked for evidence. Conti immediately sent me what looked prima facie like a bibliography for a book. It is a list of articles that he says support his claim that explicit grammar instruction does more for acquisition of language than does mere input.

Here is Conti’s support for his position. My comments follow each work he cites.

Chan, A. Y. W. & D. C. S. Li. 2002. ‘Form‐focused remedial instruction: an empirical study’. International Journal of Applied Linguistics 12/1: 24-53. This study of Cantonese L1s studying L2 English lacks a real control group. When the study began, the control group’s instructional materials were so boring that the researchers were forced to give them almost the same instruction as the treatment group. Both groups experienced equal gains on post-treatment and delayed post-treatment assessment. The absence of a real control group– eg one which got regular English instruction, or just comprehensible input aurally and/or in writing– means we have no idea what the intervention did relative to other interventions. The assessments also focus on conscious learning. This study therefore does not appear to support Conti’s claim.

Craik, F.I.M. & R. S. Lockhart. 1972. ‘Levels of processing: a framework for memory research’. Journal of Verbal Learning and Verbal Behavior 11: 671-684. This article does not look at the role that grammar instruction plays in language acquisition, and therefore does not support Conti’s claim.

DeKeyser, R. 1994. ‘Implicit and explicit learning of L2 grammar: a pilot study’. TESOL Quarterly 28/1: 188-194.
DeKeyser’s study used an artificial language (Implexan) and examined whether people who were told grammar rules (and then given input) acquired a better “feel” for the meaning of the language, and better production, than did those who received merely input. He showed subjects pictures with accompanying sentences in the synthetic language. DeKeyser used a production test and a grammaticality judgment test to assess learning.

This study has a number of problems.  These include:
1. n = 6
2. He doesn’t provide his data. Am I missing something? I have a PDF of the article and I don’t see his data.
3. DeKeyser doesn’t explain what he means when he says that the subjects who received explicit grammar instruction “learned categorical rules.” Does this mean they were able to consciously formulate them? Does this mean they could apply them?
4. No delayed post-test was conducted.
5. DeKeyser does not say whether or not the subjects were told the meaning of the sentences. If they were not, this experiment is looking at pattern recognition (general cognitive processing) and not language acquisition. This leads us to…
6. …the probability that the grammar-rule instruction included references to meaning, and therefore made the input more comprehensible. Eg, a student sees a picture of a man on a horse, and the sentence “flerb guf dibble.” They are then told that the word order is subject object verb. This means the sentence probably reads “man horse rides/is on.” This person has a leg up on the person who just sees the picture and the sentence and has to guess its meaning.

For these reasons, DeKeyser’s study does not support Conti’s claim.

DeKeyser, R. 1995. ‘Learning second language grammar rules: an experiment with a miniature linguistic system’. Studies in Second Language Acquisition 17/3: 379-410. This synthetic-language (Implexan) study is the only one which appears to support Conti’s hypothesis that explicit grammar teaching is more effective than delivering comprehensible input when building proficiency in production and comprehension. In this study, which is basically a much bigger version of his 1994 study (and in this one, subjects were told sentence meanings in English), DeKeyser found that the explicit and implicit learning groups were able to recognise the same amount of vocab by the end of the study. However, the explicit-instruction group significantly outperformed the other group in production accuracy. So far, so good, but…
1. There was no delayed post-test.
2. The use of artificial languages– done to simplify testing– is problematic. One must note that nobody has ever gotten DeKeyser’s results with a real language.


DeKeyser, R. 1998. ‘Beyond focus on form: cognitive perspectives on learning and practicing second language grammar’ in C. Doughty & J. Williams (eds.). Focus on Form in Classroom Second Language Acquisition. Cambridge: Cambridge University Press. This is not an empirical study. It therefore does not support Conti’s claim.

Erlam, R. 2003. ‘The effects of deductive and inductive instruction on the acquisition of direct object pronouns in French as a second language’. The Modern Language Journal 87/2: 242-260. This study– by the author’s own acknowledgement– shows that people who get explicit “grammar” instruction do well on tests where explicit (declarative) knowledge of “grammar” can be accessed. The post-treatment measures of speaking and writing, by Erlam’s admission, did not prevent students from “thinking about” answers. In other words, Erlam tested for conscious learning and not implicit acquisition. This study therefore does not appear to address Conti’s claim.

Fotos, S. & R. Ellis. 1991. ‘Communicating about grammar: a task-based approach’. TESOL Quarterly 25/4: 605-628. This study, in the authors’ words, measured and “encouraged communication about grammar.” The authors tried to see if conscious-grammar problem-solving helped students learn grammar rules (learn = consciously understand and explain). This study does not focus on acquisition, and therefore does not appear to support Conti’s claim.

Gass, S. & L. Selinker. 2008. Second Language Acquisition: an Introductory Course (Third Edition). New York: Routledge/Taylor. This is not a book of empirical studies, and therefore does not support Conti’s claim.

Genesee, F. 1987. Learning through Two Languages. New York: Newbury House.  This book does not contain empirical data, and so does not support Conti’s claim.

Hulstijn, J. 1995. ‘Not all grammar rules are equal: giving grammar instruction its proper place in foreign language teaching’ in R. Schmidt (ed.). Attention and Awareness in Foreign Language Learning (Technical Report Nº 9). Honolulu, Hawai’i: University of Hawai’i, Second Language Teaching and Curriculum Center, 359-386. There may be a mis-citation in the list Conti provides: I could only find a 1994 collection from R. Schmidt (ed.), and that volume does not contain any empirical studies. This paper– if I have found the right one– outlines theories about consciousness and the “noticing hypothesis” in SLA, but does not provide empirical data. It therefore does not appear to support Conti’s claim.

Johnson, K. 1996. Language Teaching and Skill Learning. Oxford: Blackwell. This is a teaching handbook, not an empirical study, and therefore does not support Conti’s claim.

Klapper, J. & J. Rees. 2003. ‘Reviewing the case for explicit grammar instruction in the university foreign language learning context’. Language Teaching Research 7/3: 285-314. This study of English L1 students of L2 German compared two groups: those taught a basically “hardcore grammar” German class (focus on forms), and those taught a much less grammar-focused “society and culture” class (focus on form).

There are a number of problems with this study:
1. As the researchers themselves note, their study “might be thought to favour explicit over implicit language knowledge and it is certainly possible that slightly different results might have been obtained with fluency measures.” No kidding! The assessment tool was a gap-fill grammar test, on which the hardcore grammar students (FoFs) predictably beat the others.
2. There was no control group which received “pure” C.I.

This study, because it assesses conscious grammar knowledge and not spontaneous comprehension and/or production, does not support Conti’s claim.

Ming, C. S. & N. Maarof. 2010. ‘The effect of C‐R activities on personal pronoun acquisition’. Procedia – Social and Behavioral Sciences 2/2: 5045-5050. For the purposes of examining acquisition, this study is flawed: it uses neither a control group nor a delayed post-test, and it assesses conscious learning rather than acquisition. It therefore does not support Conti’s claim.

Nation, P. 2007. ‘The four strands’. Innovation in Language Learning and Teaching 1/1: 1-12. This does not provide any data and therefore does not support Conti’s claim.

Norris, J. M. & L. Ortega. 2000. ‘Effectiveness of L2 instruction: a research synthesis and quantitative meta‐analysis’. Language Learning 50/3: 417-528. Ah yes, the bad boy. This study looked at… every other study about language instruction, and concluded that, yes, grammar teaching is necessary. The devil, as always, is in the details: Norris and Ortega do not distinguish between research focused on acquisition (unrehearsed, spontaneous language use) and learning (where explicit awareness, rule-learning etc come into play).

Skehan, P. 2003. ‘Task-based instruction’. Language Teaching 36/ 1:1-14. This article discusses task-based teaching. It does not provide any empirical evidence for Conti’s claim.

Spada, N. & P. M. Lightbown. 2008. ‘Form-focused instruction: isolated or integrated?’ TESOL Quarterly 42: 181‐207. This discusses various types of instruction but does not provide data. It therefore does not support Conti’s claim.

Spada, N. & Y. Tomita. 2010. ‘Interactions between type of instruction and type of language feature: a meta‐analysis’. Language Learning 60/2: 1‐46. In this study, the authors summarise research into instructional practices and conclude that, yes, explicit grammar instruction “works.”  However…in my view, this meta-analysis is deeply flawed, for the following reasons.

1. Only four of the sampled studies included a second, delayed post-test, administered on average five weeks after treatment. No delayed P.T. = we have no idea if the gains are durable.

2. As Eric Herman points out, “just because language use is ‘free’ or ‘spontaneous’ does not rule out people using explicit knowledge (especially if the treatment primed them to do so), and especially in an untimed written mode, which was counted as a ‘free’ response!” The authors themselves write, “Thus, one cannot be certain that the oral production tasks used in the primary studies for this meta-analysis are indeed measures of implicit knowledge” (p. 287).

3. One “effective” study of VanPatten’s processing instruction (Benati, 2005) was characterized as “explicit.” However, in P.I., as VanPatten has pointed out, students are not “being taught rules.”

4. The single-biggest problem here, however, is the authors’ description of language acquisition as “learning rules.” This is not what happens with language. There are, strictly speaking, no “rules” to be learned.

One can summarise Spada and Tomita by saying they provide evidence that explicit grammar instruction appears to help production under conditions where conscious language use can occur. This meta-analysis thus appears to contradict Conti’s claim.

Swain, M. 1985. ‘Communicative competence: some roles of comprehensible input and comprehensible output in its development’, in S. Gass & C. Madden (eds.). Input in Second Language Acquisition. Rowley, MA: Newbury House, 235-253. Swain does not provide any data here; the essay focuses on possible roles for input and output rather than on specific evidence about language acquisition. In addition, she notes that the main “job” of output in language acquisition is to generate more– and more focused– input for the learner. This article therefore does not support Conti’s claim.

Swan, M. 1994. ‘Design criteria for pedagogic language rules’, in M. Bygate, A. Tonkyn & E. Williams (eds.). Grammar and the Language Teacher. London: Prentice Hall, 45-55. This article does not contain any empirical evidence, and therefore does not support Conti’s claim.

VanPatten, B. & S. Oikkenon. 1996. ‘Explanation versus structured input in processing instruction’. Studies in Second Language Acquisition 18/4: 495-510.
This is a follow-up to the classic VanPatten & Cadierno (1993) study on the effects of processing instruction. The authors wanted to see whether exposure to language, direct instruction, or processing activities resulted in better development of the ability to process “non-Englishy” language (Spanish, with its pronoun orders). Their conclusion: “[r]esults showed that the beneficial effects of instruction were due to the structured input activities and not to the explicit information (explanation) provided to learners.” In other words, this refutes Conti’s claim: it is the processing of input, not explicit instruction, that develops mental representation of language.

Willis, D. & J. Willis. 2007. Doing Task-Based Teaching. Oxford: Oxford University Press. This book does not contain any empirical studies.  It therefore does not support Conti’s claim.

Conclusion: Gianfranco Conti has claimed that explicit instruction generates more language acquisition (fast, accurate and spontaneous comprehension and production of language) than does the provision of comprehensible input. He provided 23 references that allegedly supported his claim. Only one– marginally– does so. The rest either contradict his claim, are irrelevant, or otherwise do not support it. As of this reading, Gianfranco Conti’s claims await substantiation.

(How) Should I Use Questions to Assess Reading?

Yesterday I found a kid in my English class copying this from her neighbour. It is a post-reading assessment– in Q&A form– for the novel Les yeux de Carmen. TPT is full of things like this, as are teachers’ guides, workbooks, etc.

The idea here is, read, then show your understanding of the novel by answering various questions about it. It “works” as a way to get learners to re-read, and as what Adminz like to call “the accountability piece,” ie, “the reason to do it is cos it’s for marks.”

Before I get into today’s post, I should note that I (and every teacher I know) use some kind of post-reading activity.

Q: Should I use questions to assess reading?

A: Probably not. Here’s why.

  1. How do we mark it? What if the answer is right, but the French is poor? Or the reverse? Half a mark each? Do we want complete sentences? What qualifies as acceptable and not for writing purposes? What if there is more than one answer? What’s the rubric we use for marking?
  2. It can (and, basically, should) be copied. This is the kind of thing that a teacher would send home to get kids to re-read the novel. Fine, but…it’s boring, and it takes a long time. It doesn’t use much brain power. If I were a student, I would copy this off my neighbour. If you don’t get caught, you save a bunch of time, and the teacher has no way of noticing.
  3. It would totally suck to mark this. Do you actually want to read 30– or 60!– of these?! I dunno about you folks, but I have a life. We have to mark, obviously, but these, ugh, I’d fall asleep.
  4. It’s a lot of work for few returns. I asked the kid who’d lent her answers to her friend how long it took (btw, there is one more page I didn’t copy), and she said “about 45 min.” This is a lot of time where very little input is happening.  The activity should either be shorter, or should involve reading another story. As Beniko Mason, Stephen Krashen and Jeff McQuillan (aka The Backseat Linguist) show us, input is more efficient than input plus activities (ie, instead of questions about a story, read another story).  As the great Latinist James Hosler once remarked, “for me, assessment is just another excuse to deliver input.”

So…how should we assess reading? Here are a bunch of ideas, none of them mine, that work.

A. Read the text, and make it into a comic. Easy, fun, useful for your classroom library and requires a bit of creativity.

B. Do some smash doodles. This is basically a comic, but minus any writing. As usual, Martina Bex has killer ideas.

C. Do a discourse scramble activity. For these, take 5-10 sentences from the text, and print them out of order (eg a sentence from the end of the text near the beginning, etc). Students have to sort them into correct order, then translate them into L1. This is fairly easy– and even easier if a student has actually done the reading, heh heh– and it requires re-reading without requiring output.

Another variant on a discourse scramble: have students copy the sentences down in order and then illustrate them.

For C, they get one mark per correct translation (or accurate pic), and one mark for each sentence in its proper place. Discourse scramble answers can be copied, so I get kids to do them in class.  They are also due day-of, because if kids take them home others will copy.

D. If you have kids with written output issues, you can always just interview them informally: stop at their desk or have them come to you and ask them some questions (L1, or simple L2) about the text.

Alrighty! Go forth and assess reading mercifully :-).