Volume 11, Issue 1: 2018

Argument Essays Written in the 1st and 3rd Years of College: Assessing Differences in Performance

by Irene Clark, California State University, Northridge, and Bettina J. Huber, California State University, Northridge

Building on earlier longitudinal studies and focusing on the concept of writing performance and the issue of transfer, this article discusses an assessment of thesis-driven argument essays written by students in their freshman and junior years at a large, urban, Hispanic-serving university. The article addresses the question of whether students in the study were able to “transfer” what they had been taught in their first-year writing classes to writing tasks assigned in third-year classes and indicates that modest gains did occur, particularly in the use of sources and evidence. It also discusses several factors contributing to improvement in student performance and, in particular, to maximizing the possibility of transfer. Throughout, the emphasis is on process-oriented strategies and thesis-driven argument in the first-year writing class and on the specificity and clarity of the writing prompts assigned in junior-level classes. Examination of these paired essays, written in a similar genre by the same students, reveals that improvement was greatest for students with less than adequate writing skills. The study thus suggests that “near transfer” of the ability to write a thesis-driven argument essay did occur between the first and third years for this student population.


Although most American colleges and universities offer a writing course to first-year students, the nature and effectiveness of that course continue to generate scholarly debate. Contested features include establishing the purpose of the course—that is, whether “we imagine writing is a general or more specialized ability” (Soliday, 2011, p. 5), deciding whether the course helps students “improve” as writers, defining what is meant by “improvement,” developing an appropriate and effective assessment protocol, and understanding what factors contribute to improvement. Moreover, it is also important to recognize that when students “try to transfer new skills and knowledge from one academic setting to another” (Anson & Moore, 2017, p. 7), the nature of the writing task in the new setting may affect students’ capacity to do so.

In the context of evaluating the efficacy of the first-year composition course and of determining factors that can contribute to students’ ability to transfer writing skill from one situation to another, this article, situated in the concept of performance and building on the work of Haswell (2000) and Kelly-Riley (2015), will discuss an assessment of thesis-driven argument essays written by students in their freshman and junior years at a large, urban, Hispanic-serving university. Examination of these paired essays, written in a similar genre by the same students—all of whom participated in the multi-year Learning Habits Project—allowed us to highlight several factors understood to affect student performance, in particular the role of junior-level writing prompts in fostering students’ ability to transfer.

 Theoretical Framework

The Growing Importance of the Concept of Transfer

Recent interest in the concept of “transfer” (Adler-Kassner, Clark, Robertson, Taczak & Yancey, 2017; Anson & Forsberg, 1990; Anson & Moore, 2017; Beaufort, 2007; Bergmann & Zepernick, 2007; Clark & Hernandez, 2011; Driscoll & Wells, 2012; McCarthy, 1987; Moore & Bass, 2017; Nowacek, 2011; Nelms & Dively, 2007; Wardle, 2007) has highlighted the difficulty of defining writing ability and the extent to which a general writing course can improve that ability (Anson & Forsberg, 1990; Beaufort, 2007; Bergmann & Zepernick, 2007; Clark & Hernandez, 2011; Driscoll & Wells, 2012; McCarthy, 1987; Nowacek, 2011; Nelms & Dively, 2007; Wardle, 2007). This issue has generated a number of recent publications, among them the Elon Statement on Writing Transfer (2015), the 2012 special issue of Composition Forum, and an edited collection titled Critical Transitions: Writing and the Question of Transfer (Anson & Moore, 2017). Of particular relevance to the present study are Perkins and Salomon’s (1992) discussions of “near” and “far” transfer, and “low road” and “high road” transfer. “Near transfer,” according to their definition, “occurs when knowledge or skill is used in situations very like the initial context of learning,” whereas “far transfer occurs when people make connections to contexts that intuitively seem vastly different from the context of learning” (Perkins & Salomon, 1992, p. 202). Low road transfer enables a learner to perceive similarities between a new context and a prior situation almost automatically (as in driving a car and then driving a truck). In the context of fostering near transfer from the first to third years of college, the nature of writing prompts can play a significant role in that students are more likely to transfer what they learned in a first-year writing course when the expectations of the junior-level prompt are similar to those they have encountered before and when the specifications of the prompt are clearly stated.
Far transfer, in contrast, requires “deliberate mindful abstraction of skill or knowledge from one context for application in another” (Perkins & Salomon, 1992, p. 25). When writing prompts are poorly constructed, students are less likely to perceive similarity, and thus, their performance is likely to be weaker. Such a result was indicated in the study discussed in this article.

The distinction between writing “ability” and writing “performance” is a significant factor in considering the issue of transfer. Those assessing writing should remain cautious about what is meant by measuring writing ability; accordingly, the present study focused not on writing ability but on writing performance. The idea that writing is a type of performance corresponds to the definition included in The Encyclopedia of Rhetoric and Composition, which describes writing as “a performative act” (p. 108; see also Lunsford, 2015), and in the context of that definition, this article will assess the performance of first- and third-year students. We recognize, however, that improvement in performance is likely to indicate improvement in ability.

The Role of the Writing Prompt in Enabling Transfer

Because the quality of writing prompts can contribute to students’ ability to transfer writing knowledge from one context to another, it is important to acknowledge their role in assessing student performance, a relationship long recognized by committees that construct placement exams, such as the Advanced Placement Language exam committee. Of course, most instructors explain the requirements of the writing tasks they assign during class. However, the clarity and specificity of the written prompt itself are also likely to influence how successfully students perform, because students may not take adequate notes in class and may forget at least some of what an instructor has said. Thus, the quality of the written assignment prompt can be a significant factor in student performance and successful transfer. As John Bean (2011) has observed, writing prompts vary a great deal in how successfully students are able to complete them, some being more easily fulfilled than others. Bean notes that a “traditional” writing prompt is often presented as follows:

There will be a term paper due at the end of the semester. The term paper can be on any aspect of the course that interests you, but I have to approve your topic in advance. (p. 90)

Bean maintained that this type of writing prompt may be quite suitable “for skilled upper-division students who have already learned the conventions of inquiry and argumentation in a discipline” (p. #). However, “for many college writers, the freedom of an open-topic research paper is debilitating” (Bean, 2011, p. 91). These students are not yet comfortable with “academic writing or with the discourse conventions of a new discipline” (Bean, 2011, p. #), so the essays they submit are likely to be immature, “all-about papers” (papers that present information chronologically, but have no main point), or “quasi plagiarized data dumps with long quotations and thinly disguised paraphrases” (Bean, 2011, p. 91). For the students enrolled in the Learning Habits Project focused on here, the more “traditional” type of research assignment would most likely have posed a challenge even though all entered the University with higher than average GPAs.

In contrast to this “traditional” assignment prompt, Bean (2011) argued that assignments that specify “rhetorical context—purpose, audience, genre—can create significant differences in students’ writing and thinking processes as well as in their final products” (p. 93). This perspective has been supported by a recent large-scale study of writing improvement by Anderson, Anson, Gonyea, and Paine (2015), as well as by Clark’s (2012) approach to developing effective writing prompts, which stresses the importance of defining the goals and exigencies of a writing assignment and of scaffolding the task, segmenting it “into various components that build upon one another as a means of fostering students’ understanding” (p. 443).

Those of us who have worked in writing centers are well acquainted with writing prompts that do not adequately explain the writing task, thereby causing considerable confusion and frustration for students (and the tutors who attempt to help them). These are the prompts that Muriel Harris (2010) has termed “Assignments from Hell,” which she illustrates with the following example:

Analyze the problem of gender in Hedda Gabler and Uncle Tom’s Cabin. Remember to consider other relevant factors such as race, social conditions, economic class and author’s nationality. I expect clean, tightly written, interesting prose that is free of literary jargon. If your thinking is sloppy, the paper will be sloppy, and I grade accordingly. (p. 200)

Harris (2010) pointed out that this prompt placed emphasis on “thinking” and “clean” writing and cautioned against “sloppy” thinking. But, she questioned, would most students be able to define “sloppy” thinking or “clean” writing? She also noted the ambiguity of the phrases “other relevant factors” and “interesting prose,” the lack of instructions specifying length, format, and other grading criteria, and the emphasis on form over substance. Because the Learning Habits Project gathered both student writing and writing prompts, the role of the latter in student performance could be evaluated in relation to the quality of student writing.

Research Questions

Situated in the concept of performance outlined above, along with the relationship of clearly defined writing prompts to the quality of student writing and the issue of transfer, this article outlines key conclusions emerging from the following research questions:

  1. To what extent did improvement in students’ writing performance occur between the first and third year at the university?
  2. If improvement occurred,
    1. In what areas was it most evident?
    2. For which students was it most notable?
    3. What factors contributed most significantly to that improvement?

The following sections discuss the results of this study, the student population involved, and the methods used to assess changes in student performance through time. They also explore the study’s implications in the context of transfer and curricular design.


Method

Launched in Fall 2007 by a group of CSUN faculty and administrators, the Learning Habits Project was designed to track, over a four- to six-year period, several groups of newly enrolled students likely to succeed at a large, urban, public, Hispanic-serving university. The initial student participants entered the University in Fall 2007, while the last set of freshman participants joined in Fall 2011. Participating students were among the most highly qualified incoming freshmen in their cohorts: They entered with high-school GPAs of at least 3.5 and/or were fully prepared for college-level work in mathematics and English.

The purpose of the decade-long project was to gain insight into the characteristics and practices of these promising students—that is, we sought to find out about their learning habits, including those related to reading and writing. An additional goal of the study, however, was to determine whether students’ performance in fulfilling the expectations of a writing assignment had improved between their first and third years—that is, to find out whether transfer had occurred. The assessment study focused on a subset of the essays collected as part of the Learning Habits Project. During the period in which students participated in the study, they were asked to submit a thesis-driven argument essay written during their freshman year and a similar essay written in one of various General Education courses taken in the junior year. Along with their essays, students submitted the prompts that instructors had provided to guide their work. As a result, it was possible to assess the adequacy of the prompts, along with the essays themselves.

In addition, during their multi-year participation in the Learning Habits Project, most students responded to a range of further inquiries about their learning. These included 10 end-of-term surveys containing a variety of open-ended questions, some of which explicitly addressed reading and writing. The Learning Habits students also sometimes discussed such topics during face-to-face, in-depth interviews at the end of their first year at the university, early in their junior year, and during their last semester at the University (for more detailed discussion of Learning Habits procedures, see Berry, Huber, & Rawitch, 2018, pp. 3-13).

Characteristics of the Essay Writers

The students whose essays were assessed in this study constituted one tenth of the larger set of Learning Habits students; two fifths of them entered CSUN in Fall 2011. Like other Project participants, three quarters were proficient in both English and mathematics at entry, while three fifths (62%) had high school GPAs of 3.50 or higher. In addition, the majority (56%) planned majors housed in one of three Colleges: Arts, Media, and Communication; Science and Mathematics; or Business and Economics.

Table 1

In terms of background, two thirds of the essay writers were women, while seven tenths stemmed from Better Served racial and ethnic backgrounds.[i] Only a third were Pell-Grant recipients, while the majority had at least one parent with a four-year college degree (55%). Most were native English speakers (70%), even though close to three fifths heard another language at home while growing up (59%) and had at least one parent who was raised outside the United States (65%). Despite their immigrant backgrounds, the students whose essays were being examined entered the University well-prepared for college work and stemmed from relatively comfortable backgrounds. They did not differ from most Learning Habits students in these respects.[ii]

The Learning Habits students, however, did differ from other entering freshmen. The 759 freshmen involved in the project were more likely than their larger entry cohorts to be women (64% vs. 58%) and less likely to stem from minority backgrounds in general and from Latina/o backgrounds in particular (37% and 31% vs. 66% and 49%).[iii]

Approach to Essay Assessment

Because we focused our analysis on thesis-driven argument essays, selecting pairs in which the junior writing sample was most similar to what students had written in their first-year writing course, it was possible to develop a rubric, outlined below, that was appropriate for both the freshman and junior essays. The set of essays we received was sorted according to this criterion, and we matched students’ ID numbers to ensure that each essay pair had been written by the same student.

The privileging of thesis-driven argument in the required first-year writing course was based on the idea that this genre of writing would most effectively enable students to complete writing assignments in other courses. Although it is recognized that argument may be defined differently in disciplines across the curriculum (Yancey, Robertson, & Taczak, 2014), it is also true, as Graff (2003) has noted, that “one of the most closely guarded secrets that academia unwittingly keeps from students and everybody else is that all academics, despite their many differences, play a version of the same game of persuasive argument” (p. 22), which he referred to as “arguespeak,” a perspective supported by Soliday (2011) and Thaiss and Zawacki (2006).

Although we were aware that different disciplines often define “thesis” differently (see Melzer, 2014a), we were careful to select junior-level essays with prompts that were most like those assigned in the first-year writing course—thesis-driven arguments supported by evidence. We therefore aimed to maximize the possibility of “near transfer” as well as what Melzer (2014b) referred to as “lateral transfer—transfer to related tasks that do not require new skills or more complex learning” (p. 81).

Development of the Rubric

Work on the current assessment exercise commenced early in Summer 2013, when a four-person group of experienced writing teachers with expertise in both holistic and analytic scoring began the process of developing a scoring rubric to guide evaluation of the essays in question. Based on their experience in constructing rubrics, the group developed a rubric (see Appendix for a complete version) consisting of six dimensions or traits, each of which was evaluated independently:

  • The context and purpose for writing and critical thinking
  • Organization and cohesion
  • Content development and coherence
  • Genre and disciplinary conventions
  • Appropriate reliance on sources and evidence
  • Control of syntax and mechanics

The decision to use analytic or “multiple trait scoring” (Hamp-Lyons, 2007, 2016) was based on our interest in learning which aspects of writing performance had improved (see O’Neill, Moore, & Huot, 2009).

Four summary descriptions of student performance served to assess each of the dimensions outlined above: Less than Adequate, Satisfactory, Competent, and Superior. These evaluations could be further refined with the addition of pluses and minuses (e.g., Competent + or Satisfactory -). For purposes of data analysis, the qualitative ratings were translated into numerical equivalents, ranging from 0.8 for a score of Less than Adequate minus to 4.2 for a score of Superior plus. We then calculated a resolved score for each dimension of each freshman and junior essay by adding together the two readers’ ratings (O’Neill, Moore, & Huot, 2009, pp. 201, 204). Each dimension-specific resolved score could thus range from 1.6 to 8.4, and the possible collective score across all six dimensions for any given essay ranged from 9.6 to 50.4.
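To make this arithmetic concrete, the scoring scheme can be sketched as follows. This is a minimal illustration with hypothetical function names (the article does not describe the project's actual tooling), assuming that a plus or minus shifts a base rating of 1 through 4 by 0.2 and that a resolved score adds the two readers' ratings for a dimension:

```python
# Numerical equivalents of the four qualitative ratings (base values),
# with plus/minus modifiers shifting a rating by 0.2.
BASE = {
    "Less than Adequate": 1.0,
    "Satisfactory": 2.0,
    "Competent": 3.0,
    "Superior": 4.0,
}
MODIFIER = {"+": 0.2, "": 0.0, "-": -0.2}

def numeric(rating: str, modifier: str = "") -> float:
    """Translate a qualitative rating (optionally refined with +/-) into a number."""
    return round(BASE[rating] + MODIFIER[modifier], 1)

def resolved(reader1: float, reader2: float) -> float:
    """Resolved score for one rubric dimension: the sum of the two readers' ratings."""
    return round(reader1 + reader2, 1)

def total_score(dimension_scores: list[float]) -> float:
    """Collective score for an essay: the sum of its six dimension-specific resolved scores."""
    assert len(dimension_scores) == 6
    return round(sum(dimension_scores), 1)

# Example: an essay rated Competent by both readers on every dimension.
per_dimension = resolved(numeric("Competent"), numeric("Competent"))  # 6.0
print(total_score([per_dimension] * 6))  # 36.0
```

On this scale, an essay rated Competent by both readers on all six dimensions would receive a collective score of 36.0.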

Analysis of Essay Prompts

The essay prompts we collected were analyzed in conjunction with students’ resolved scores. That is, the prompts attached to the writing samples with the largest gains in overall essay scores (a gain of more than 4 points between the overall freshman and junior scores; 24 essay pairs) were systematically evaluated for clarity and consistency.
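As a concrete illustration of this selection criterion, with invented student IDs and scores (the study's actual data are not reproduced here), the filtering step might look like this:

```python
# Flag essay pairs whose overall resolved score rose by more than 4 points
# between the freshman and junior essays; these pairs' junior-level prompts
# are the ones evaluated for clarity and consistency.
pairs = [
    # (student_id, freshman_total, junior_total) -- illustrative values only
    ("S01", 28.0, 44.4),
    ("S02", 31.2, 33.0),
    ("S03", 36.0, 34.8),
]

large_gains = [sid for sid, freshman, junior in pairs if junior - freshman > 4]
print(large_gains)  # ['S01']
```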

Analysis of End-of-Term Responses Dealing with Views of Writing Instruction

The end-of-term survey questions to which students responded on a regular basis included a multi-part question focusing on writing:

  1. Has the way you approach your writing assignments changed since you came to the university? (Y/N)
  2. Why or why not? [open-ended]
  3. If you have taken courses here at the university that were particularly helpful in strengthening your writing skills, what was it about them that proved so helpful? [open-ended]

This question was posed at the end of students’ second fall term at the university and was treated like other end-of-term questions during the analysis process, which relied on traditional content analysis procedures to identify major themes emerging from students’ responses (Weber, 1990).


Results

This section begins with a review of the findings emerging from the assessment of our students’ writing samples. Thereafter, relevant responses to end-of-term questions focusing on the introductory writing course are reviewed, as are the findings relating to the importance of clear and well-defined writing prompts.

The results of this assessment indicated

1) that modest improvement in students’ writing performance did occur, particularly in their ability to use sources and evidence,

2) that improvement was greatest for students who arrived at the University with less than adequate writing skills,

3) that a majority of students credited the first-year writing class, with its emphasis on helping students develop process-oriented strategies, as having had a significant impact on their ability to transfer writing knowledge to assignments in other classes, and

4) that a significant influence on student performance and successful transfer was the specificity and clarity of the writing prompts provided in junior-level classes.

The Writing Assessment

The primary goal of the Learning Habits Writing Study was to learn whether and how the students in the Project had improved in their ability to write a thesis-driven argument essay between their freshman and junior years—that is, to analyze the quality of their performance of that particular writing task, which would indicate that some transfer of writing knowledge had occurred. However, a secondary goal was to identify factors that facilitated improvement in performance; in particular, those that had been emphasized in the first-year writing course, with special attention to the role of writing prompts in enabling transfer.

In terms of the first goal, the results of the study, as noted in Table 2, indicated that most students did make modest gains in their ability to write thesis-driven argument essays between the two assessment points.

Table 2

Further, as Table 2 indicates, the greatest improvement was evident among students whose freshman essays were evaluated as Satisfactory or lower: Students who received low scores on their freshman essays received proportionally higher scores on their junior essays. Significantly, students whose freshman essays were evaluated as Competent or Superior continued to be evaluated in these terms when they were juniors. To some degree, the limited improvement among such students can be attributed to what is known as the “ceiling effect” (i.e., there is less room for improvement if a student starts out scoring close to the maximum).

Table 3

Table 3 indicates that the most notable dimension-specific improvement emerged for the use of Sources and Evidence. This dimension is the only one for which a statistically significant gain was evident. Nevertheless, it is important to note that, although the average junior essay scores increased only modestly on most dimension-specific measures, they did not decrease, a result which we regard as encouraging and exciting.

Tables 4 and 5, which summarize students’ longitudinal performance in more detail, are in keeping with the above conclusions. Table 4 indicates that approximately half of all students received competent ratings across all rubric dimensions and at both the freshman and junior levels. Further, Table 5 shows that individual performance is least likely to have declined for Sources and Evidence; that is, student essay ratings remained constant through time or improved.

Table 4

Table 5

Student Views of the Writing Course

In terms of the additional goals outlined above, responses to the multi-part survey question distributed to Learning Habits students after their second university year indicated that two thirds of respondents felt their approach to writing had changed since college entry, most often because they had learned new process-oriented strategies for completing a writing assignment, such as starting essay assignments well in advance, organizing background materials before beginning to write, and considering one’s audience before and during the writing process.

One striking feature of the students’ responses was the number of unsolicited statements affirming that their first-year writing class had contributed to their development as writers and therefore was likely to have been a factor in their performance on junior-level writing assignments. In the relevant responses, students noted two significant factors: that the course had fostered their understanding of academic genres and that it had enabled them to learn process-oriented approaches that strengthened their ability to complete writing tasks in other academic contexts. In addition, students mentioned other helpful factors: their exceptional instructors, the writing practice required, and the new knowledge about academic writing that they had acquired.

The Writing Prompt: A Significant Influence on Students’ Performance

As noted earlier in this article, a key influence on students’ performance on writing assignments is the extent to which they understand the expectations conveyed in the writing prompt. Because most of the first-year writing instructors teaching the Learning Habits students had been given extensive training in assignment development, most of their prompts met the criteria of clarity and specificity. However, some of the junior-level essay prompts did not, and among students whose assessment scores decreased between their first and third years, the inadequacy of the prompt may explain much of the decline.

The effect of an ambiguous writing prompt. To illustrate the possible effect of a writing prompt that is characterized by the ambiguities noted by Harris (2010), we cite the example of a student whose scores declined by 15% between the freshman and junior years, as displayed in Table 6.

Table 6

Although many factors can contribute to a student’s inability to perform a writing task successfully, an analysis of the writing prompts from both the freshman and junior years suggests that, in this case, the decrease may have been due, at least in part, to the contrast between the clarity of the freshman prompt and the lack of clarity of the junior prompt.

Analysis of the freshman prompt. The freshman prompt, which appears below, specifies the expectations for the essay: the necessity of addressing a controversy, developing an argumentative thesis, addressing a counterargument, and writing multiple drafts. Two readings are listed, as are the due dates for the various components of the assignment.

Freshman Prompt

The essay that the student wrote in response to this prompt demonstrated that he or she had understood its expectations. The title of the essay, “Even Educational Games Can Have Negative Consequences,” indicated that the student was aware of the requirement to address a controversy, and the first paragraph, which led to the thesis sentence, further exhibited that understanding. The thesis statement reads as follows:

Although Webkinz provides a safe environment with educational games for children, if played too much, children will become socially awkward and lose the sense of having an imagination, if all rules and regulations are set up for them, as well as increase their health problems in the future.

Of course, one might find this thesis a bit exaggerated; nonetheless, the essay indicated that the student understood the exigence for the writing and was able to develop an idea that could then be supported with relevant evidence from outside sources. As a freshman essay, it is not Superior, but, as the scores in Table 6 indicate, it is more than Satisfactory (i.e., almost all of the dimension-specific resolved scores exceed 4.4).

Analysis of the junior prompt. In contrast to the clearly written freshman prompt, the junior prompt, shown below, is characterized by a number of ambiguities.

Junior Prompt

The specifications suggest that the instructor wanted the student to do research on a specific person or topic and develop a thesis that shows how this person or topic relates to a theme discussed in the course (a standard academic writing trope, as discussed in Wolfe, Olson, & Wilder, 2014). However, that goal was not explicitly stated in the prompt, requiring students not already familiar with the trope to probe the prompt to figure out what was expected. Moreover, rather than specifying what the student should do to address the goals of the assignment, the prompt contained injunctions about what the student should not do.

One can imagine that students receiving this prompt might have a number of questions—perhaps about how many sources should be included, how long the essay should be, and, most importantly, the purpose and genre of the assignment. Perhaps this information was provided during class. However, the scores the student received on the junior-level essay suggest that either no explanation was given or the student did not understand it: the only area in which the student’s essay score increased was the rubric dimension for Syntax and Mechanics. An increase in this dimension suggests that the student’s writing had probably improved at the sentence level between the freshman and junior years but that the student had not understood the expectations presented in the prompt.

In terms of Context and Purpose, the student’s resolved score decreased by 1.2 points, while in terms of Organization and Cohesion, it decreased by almost two full points. Reading through the essay, we could discern no thesis; there is a great deal of summary but no clear argument, even though the prompt (albeit obliquely) had indicated that the instructor wanted a thesis. Moreover, few references to outside sources were used, and the ones that were included merely document facts. The decline in the student’s scores, then, is not surprising, because it is unlikely that the student understood what was expected.

A junior-level prompt for a student whose longitudinal scores increased. The junior-level writing prompt discussed above may be contrasted with a more clearly written junior-level prompt that was assigned to a student whose scores increased significantly:

Junior prompt--scores increased

Unlike the assignment discussed earlier, which required the student to figure out the exigence, purpose, and expected academic trope for the essay, the assignment shown above makes these elements clear, focusing attention on particular parameters of the problem (the factors most significant in causing the housing boom and bust). Although one might wonder whether adequate exploration of the problem could be achieved in a 3- to 5-page essay, given the scope of the topic, the student who wrote in response to this prompt received high scores in all dimensions of the rubric, raising the total resolved score from 28.0 on the freshman essay to 44.4 on the junior essay. Table 7 displays the full set of scores for this student on both the freshman- and junior-level essays.

As the table indicates, the scores of the student who responded to the junior-level prompt above increased in all dimensions of the rubric, a result that could have been due to a number of factors. Nevertheless, the clarity and explicitness of the junior-level prompt is likely to have been a contributing factor.

Table 7

Moreover, what is particularly significant about the junior-level prompts for students whose total scores improved by more than 4 points is that each prompt conveyed clear expectations—all of them, in fact, specifying patterns that Wolfe, Olson, and Wilder (2014) maintained are typical disciplinary tropes, such as addressing an issue of stasis (a central issue involving definition, causality, or evaluation), developing a proposal, or using a conceptual lens (applying a concept, term, theory, or hypothesis to a particular idea, situation, or text). Further, all of the relevant prompts specified the importance of developing a thesis or main idea. Several required multiple drafts, specified page lengths and other formatting requirements, explained the type and number of readings that were acceptable, and outlined the process that students were required to complete. A mathematics assignment was particularly well constructed—and notable because writing is not traditionally assigned in mathematics classes. Some of its features are summarized below:

The assignment for the final paper asks you to research and report on connections between Mathematics and a particular field or area that interests you. The key component of this paper is teaching your reader about some specific mathematics topic. The paper should establish a clear thesis concerning the topic and goal for the paper in the opening paragraph.

The assignment prompt also outlines the “Process for Completing the Final Project.”

In addition to calling attention to the exigence of the assignment (to teach the reader about some specific mathematics topic), the prompt also specifies the number and type of sources required, the necessity of having a well-developed thesis supported by evidence, and the requirements for formatting. It is not surprising, then, that this student’s total score improved remarkably, from 28.0 to 44.4, an increase of nearly 59%.

Poorly constructed freshman prompt—Well-constructed junior prompt. Another example worth noting was a case in which the freshman prompt was poorly constructed and the junior-level prompt was well-constructed. The freshman-level prompt had been assigned in a first-year political science class, not, as was the case in most of the other freshman assignment prompts, in the composition program:

Freshman Prompt 2

This prompt calls for “complete analysis” and an emphasis on clarity, perhaps on the assumption that a definition or even an example of an “analysis” had been provided in class. But the essay that the student produced was a summary, not an analysis, suggesting that he or she had not understood the requirements of the assignment. In contrast, as Table 8 indicates, the same student performed outstandingly on the junior-level prompt, with his or her score increasing by 21 points, the largest increase in the set. Not surprisingly, the junior-level prompt was clear and included a statement of what was required:

Best Six stages prompt

This prompt, which involves the application of a theory or concept to a particular issue, a common academic trope, clearly delineates the expectations of the assignment, and the student obviously understood what was required.

Table 8 indicates the scores for this student.

Table 8


A significant strength of the assessment discussed here is that it used paired essays written in a similar genre by the same students at comparable points in their college careers, thereby helping the researchers control for two factors that are important in writing assessment—the students and the genre. We were thus able to minimize much of the variation that is unavoidable in the more typical cross-sectional approach to the assessment of how effectively students are able to perform a given writing task. Moreover, a study based on access to paired essays prepared by students attending a comprehensive institution is relatively rare, enabling us to address the requirements outlined by Brian Huot (2002): that assessments should be “site-based, locally controlled, context-sensitive, rhetorically oriented and accessible” (cited in Elliot & Perelman, 2012, p. 65).

Use of Sources and Evidence

In the findings summarized above, the most notable dimension-specific improvement is evident for the use of Sources and Evidence. The significance of this finding is in keeping with a recent study of students’ use of sources, which noted that the ability to use sources effectively “requires students not only to be familiar with defined areas of disciplinary content, but also to be able to represent themselves through their writing as articulate and authoritative authors” (Thompson, Morton, & Storch, 2013, p. 101). The ability to use sources and evidence appropriately is also in accord with the emphasis on performance in which the study was situated: an important element in being able to write in the academy is being able to perform the role of an academic writer. This perspective is supported by James Gee’s (2001) concept of the “identity kit,” which he maintained enables students to function within an unfamiliar discourse community. Gee argued that unless a student comes from an educationally privileged background, he or she becomes proficient in academic discourse only by acquiring an “identity kit,” a set of tools that helps students perform effectively and enables them to be viewed as members of that community. Thus, improved competence in using sources demonstrates students’ developing ability to “represent their own ideas in relation to those of the authors of the source texts they employ” (Thompson et al., 2013, p. 100), another indicator of students’ improved ability to write a thesis-driven argument essay and transfer their knowledge to junior-level writing tasks.

The Role of the First-Year Writing Course

One of the goals of our assessment study concerned the extent to which the first-year writing course contributed to the improvement evident in the junior-level essays. This goal is impossible to realize definitively because many factors can influence performance on a particular writing task, including students’ prior knowledge, growing familiarity with college writing, developing maturity between their first and third years, social/psychological factors such as dispositions (Driscoll & Wells, 2012), and motivational variables, such as interest in course content. However, the fact that students improved the most in their use of sources and evidence may be due to the emphasis in the first-year writing course on evaluating and incorporating secondary sources and to the involvement of university librarians with the writing program curriculum. In fact, students’ participation in the Learning Habits Project itself may have played a role in their development of self-confidence, self-efficacy, and motivation.

Moreover, students frequently commented on the value of their first-year writing courses in their responses to survey questions concerning courses that had strengthened their writing skills. Close to two fifths of the respondents noted that their freshman writing course had provided valuable guidance about how to approach their academic writing tasks, with one quarter mentioning the value of thinking about writing in a different “college” way. A fair number of students also mentioned that their freshman writing classes taught the value of organization and the importance of constructing logical arguments, developing a thesis, and supporting it with credible evidence. Most significant in these responses are the students’ emphases on their new understanding of academic genres and the acquisition of process-oriented approaches. These, as well as other aspects of their first-year courses, were perceived by students as having strengthened their ability to complete writing tasks in other academic contexts.

Revised Versus Unrevised Writing

 An additional factor that needs to be considered in interpreting the degree to which student writing performance improved between the first and third years is whether the submitted essays had received feedback and had been revised. Based on curricular requirements, all of the freshman essays were written in multiple drafts, received both peer and instructor feedback, and were additionally revised for a department-mandated portfolio assessment that occurred at the end of the semester.

Thus, the texts that the first-year students submitted to the Learning Habits Project were revised essays, whereas the third-year essays may not have been revised as extensively. It is possible, then, that if the junior-level essays had consistently received the extensive feedback typical of the freshman assignments, and had been equivalently revised, they would have received higher scores, with the consequence that the degree of improvement and transfer evident in the assessment study would have been greater.

Significant Findings

The Learning Habits Writing Study discussed here is notable for its use of multiple sources of evidence to document changes in students’ performance on a particular writing task (thesis-driven argument essay) between their first and third years of college, suggesting that students had been able to transfer writing knowledge to junior-level writing assignments. In addition to writing samples, the study also included reports by the students themselves about how the first-year writing course helped them improve as writers. Moreover, although the claims we are making here are concerned with students’ performance on a particular task, the students’ self-reports indicated that they had also learned process-oriented strategies that were useful for approaching various writing tasks more effectively.

The Learning Habits Writing Study thus constitutes a significant contribution to the ongoing “transfer debate” because it not only demonstrates that helping students learn a particular type of writing in their first-year writing course can enable them to perform well on similar writing tasks assigned in subsequent courses, but also because students’ comments about what was most helpful to them suggest that they had also gained a metacognitive awareness of what is involved in engaging with a writing task. This is in accord with the research of Devitt (2004), Beaufort (2007), Wardle (2007), Nowacek (2011), and Clark and Hernandez (2011).

In addition, the Learning Habits Writing Study highlighted not only student improvement but also several factors that can strengthen student performance. These are as follows:

  1. Overall, the students under study made modest gains in writing thesis-driven essays between the end of their first-year composition course and the completion of similar coursework in their junior years. This finding suggests that performance did improve, and some transfer of writing knowledge did occur.
  2. Improvement was most significant for students who arrived at the university with less than adequate writing skills.
  3. Dimension-specific improvement was most notable for the use of Sources and Evidence.
  4. The writing tasks emphasized in a first-year writing class can foster students’ ability to engage with similar writing tasks in subsequent academic contexts.
  5. The specificity and clarity of writing prompts in upper division classes can affect students’ ability to perform successfully.

Implications for Further Study and Curriculum Development

The Learning Habits Writing Study offers a useful contribution to scholarship concerned with transfer, particularly that of “near transfer.” However, additional examination of the elements that are likely to foster transfer constitutes an important direction for further research. Beyond knowledge about writing and opportunities to learn process-oriented strategies that are presented in a classroom setting, interpersonal and psychological factors (e.g., agency, disposition, and motivation) constitute such a direction (Driscoll & Wells, 2012; Haswell, 2000; Robertson, Taczak, & Yancey, 2012), as does the issue of “prior knowledge” (Yancey, 2015).

 In terms of curriculum design, the results of this study point to the advantage of developing what Melzer (2014b) referred to as a “connected curriculum”—that is, a curriculum characterized by “carefully sequenced writing courses in composition, general education, and the majors that connect to and build upon one another” (p. 78). Although the concept of a vertical curriculum has been discussed in previous scholarship (Crowley, 1998; Hall, 2006; Jamieson, 2009; Miles et al., 2008), discussions of this topic and of transfer have tended to occur in isolation.

In the context of the present study, we were happy to learn that students’ performance did improve, albeit modestly, that it did not decline, and that the greatest improvement was made by students whose entry-level writing ability was less than adequate. This is a particularly encouraging result, given the fact that this study was conducted at a large, public, urban university enrolling many minority and first-generation students, whose previous education may not have provided adequate writing instruction.

References


Adler-Kassner, L., Clark, I., Robertson, L., Taczak, K., & Yancey, K. B. (2017). Assembling knowledge: The role of threshold concepts in facilitating transfer. In C. M. Anson & J. L. Moore (Eds.), Critical transitions: Writing and the question of transfer (pp. 17-47). Fort Collins, CO: WAC Clearinghouse and University of Colorado Press.

Anderson, P., Anson, C. M., Gonyea, R. M., & Paine. C. (2015). The contributions of writing to learning and development: Results from a large-scale multi-institutional study. Research in the Teaching of English, 50(2), 199-235.

Anson, C. M., & Forsberg, L. L. (1990). Moving beyond the academic community: Transitional stages in professional writing. Written Communication, 7, 200-231.

Anson, C. M., & Moore, J. L. (Eds.). (2017). Critical transitions: Writing and the question of transfer. Fort Collins, CO: WAC Clearinghouse and University of Colorado Press.

Bean, J. C. (2011). Engaging ideas: The professor’s guide to integrating writing, critical thinking and active learning in the classroom (2nd ed.). San Francisco, CA: Jossey-Bass.

Beaufort, A. (2007). College writing and beyond: A new framework for college writing instruction. Logan, UT: Utah State UP.

Bergmann, L. S., & Zepernick, J. S. (2007). Disciplinarity and transfer: Students’ perceptions of learning to write. Writing Program Administration, 31(1/2), 124-149.

Berry, E., Huber, B., & Rawitch, C. Z. (2018). Learning from the learners: Successful college students share their effective learning habits. Lanham, MD: Rowman & Littlefield.            

Clark, I. L. (2012). Concepts in composition: Theory and practice in the teaching of writing. New York, NY: Routledge.

Clark, I. L., & Hernandez, A. (2011). Genre awareness, academic argument and transferability. The WAC Journal, 22, 65-78.

Crowley, S. (1998). Composition in the university: Historical and polemical essays. Pittsburgh, PA: University of Pittsburgh Press.

Devitt, A. (2004). Writing genres. Carbondale, IL: Southern Illinois UP.

Driscoll, D. L., & Wells, J. H. M. (2012). Beyond knowledge and skills: Writing transfer and the role of student dispositions in and beyond the writing classroom. Composition Forum, 26. Retrieved from http://compositionforum.com/issue/26/beyond-knowledge-skills.php

Elliot, N., & Perelman, L. (Eds.). (2012). Writing assessment in the 21st century: Essays in honor of Edward M. White. New York, NY: Hampton Press.

Elon Statement on Writing Transfer. (2015). Retrieved from www.centerforengagedlearning.org/elon-statement-on-writing-transfer/

Gee, J. P. (2001). Literacy, discourse and linguistics: Introduction and what is literacy? In E. Cushman, E. R. Kintgen, B. M. Kroll, & M. Rose (Eds.), Literacy: A critical sourcebook (pp. 525-544). Boston, MA: Bedford/St. Martin’s.

Graff, G. (2003). Clueless in academe: How schooling obscures the life of the mind. New Haven, CT: Yale UP.

Hall, J. (2006). Toward a unified curriculum: Integrating WAC/WID with freshman composition. The WAC Journal, 17, 5-22.

Hamp-Lyons, L. (2016). Farewell to holistic scoring? Assessing Writing, 27, A1-A2.

Hamp-Lyons, L. (2016). Farewell to holistic scoring. Part two: Why build a house with only one brick? Assessing Writing, 29, A1-A5.

Harris, M. (2010). Assignments from hell: The view from the writing center. In P. Sullivan, H. Tinberg, & S. Blau (Eds.), What is college level writing? Vol. 2 (pp. 183-206). Urbana, IL: NCTE.

Haswell, R. H. (2000). Documenting improvement in college writing: A longitudinal approach. Written Communication, 17(3), 307-351.

Huot, B. (2002). (Re)articulating writing assessment for teaching and learning. Logan, UT: Utah State UP.

Jamieson, S. (2009). The vertical writing curriculum. In J. C. Post & J. A. Inman (Eds.), Composition(s) in the new liberal arts (pp. 159-184). Cresskill, NJ: Hampton.

Kelly-Riley, D. (2015). Toward a validational framework using student course papers from common undergraduate curricular requirements as viable outcomes evidence. Assessing Writing, 20, 60-74.

Lunsford, A. (2015). Writing is performative. In L. Adler-Kassner & E. Wardle (Eds.), Naming what we know: Threshold concepts in writing studies. Logan, UT: Utah State UP.

McCarthy, L. (1987). A stranger in strange lands: A college student writing across the curriculum. Research in the Teaching of English, 21(3), 233-265.

Melzer, D. (2014a). Assignments across the curriculum: A national study of college writing. Logan, UT: Utah State UP.

Melzer, D. (2014b). The connected curriculum: Designing a vertical transfer writing curriculum. The WAC Journal, 25, 78-91.

Miles, L., Pennell, M., Hensley Owens, K., Dyehouse, J., O’Grady, H., . . . Shamoon, L. (2008). Thinking vertically. College Composition and Communication, 59(3), 503-511.

Moore, J. L., & Bass, R. (Eds). (2017). Understanding writing transfer: Implications for transformative student learning in higher education. Sterling, VA: Stylus.

Nelms, G., & Dively, R. L. (2007). Perceived roadblocks to transferring knowledge from first- year composition to writing intensive major courses: A pilot study. Writing Program Administration, 31(1-2), 214-240.

Nowacek, R. S. (2011). Agents of integration: Understanding transfer as a rhetorical act. Carbondale, IL: Southern Illinois UP.

O’Neill, P., Moore, C., & Huot, B. (Eds.). (2009). A guide to college writing assessment. Logan, UT: Utah State UP.

Perkins, D., & Salomon, G. (1992). Transfer of learning. International encyclopedia of education (2nd ed.). Boston, MA: Pergamon Press. 

Robertson, L., Taczak, K., & Yancey, K. (2012). Notes toward a theory of prior knowledge and its role in college composers’ transfer of knowledge and practice. Composition Forum, 26. Retrieved from http://compositionforum.com/issue/26/prior-knowledge-transfer.php

Soliday, M. (2011). Everyday genres: Writing assignments across the disciplines. Carbondale, IL: Southern Illinois UP.

Thaiss, C., & Zawacki, T. M. (2006). Engaged writers and dynamic disciplines: Research on the academic writing life. Portsmouth, NH: Boynton/Cook.

Thompson, C., Morton, J., & Storch, N. (2013). Where from, who, why, and how? A study of the use of sources by first year L2 university students. Journal of English for Academic Purposes, 12(2), 99-109.

Weber, R. P. (1990). Basic content analysis (2nd ed.). Newbury Park, CA: Sage.

Wardle, E. (2007). Understanding transfer from FYC: Preliminary results of a longitudinal study. Writing Program Administration, 31(1-2), 65-85.

Wolfe, J., Olson, B., & Wilder, L. (2014). Knowing what we know about writing in the disciplines: A new approach to teaching for transfer in FYC. The WAC Journal, 25, 42-77.

Writing. (2010). In Encyclopedia of rhetoric and composition. New York, NY: Routledge/Taylor & Francis.

Yancey, K. B., Robertson, L., & Taczak, K. (2014). Writing across contexts: Transfer, composition, and sites of writing. Logan, UT: Utah State UP.

Yancey, K. B. (2015). Introduction: Coming to terms: Composition/rhetoric, threshold concepts, and a disciplinary core. In L. Adler-Kassner & E. Wardle (Eds.), Naming what we know: Threshold concepts in writing studies (pp. xvii-xxxi). Logan, UT: Utah State UP.


[i] Students from Caucasian and Asian backgrounds, as well as those declining to state a racial and ethnic background, are considered to stem from Better Served backgrounds. The remainder stem from backgrounds Traditionally Underserved by higher education.

[ii] It should be noted that the essay writers are somewhat less likely than other Learning Habits participants to be native English-speakers (70% vs. 81%). The difference is modest, however.

[iii] The characteristics of the Learning Habits students are discussed in more detail in Berry, Huber, and Rawitch (2018, pp. 13-38).