Volume 13, Issue 2: 2020

Rebuilding Habits: Assessing the Impact of a Hybrid Learning Contract in Online First-Year Composition Courses

by Michelle A. Stuckey, Arizona State University; Ebru Erdem, Arizona State University; Zachary Waggoner, Arizona State University

This article examines a pilot study of a learning contract in an online first-year writing program. The program uses a master-class model with a shared curriculum and serves more than 3,500 students a semester. In this pilot, we implemented the contract within half of our courses. Our goal was to understand the impact of a learning contract on student retention in our first-year writing courses. We also hoped to determine if the learning contract helped shift student and instructor focus from grades to skill transfer. In this article, we first discuss the process of developing a learning contract, including the challenges of collaborating with faculty to address their needs and concerns; building instructor and instructional designer buy-in; and working through the limitations of the learning management system (LMS) to implement the contract in online courses. Second, we assess the results of the initial pilot to determine whether the contract functioned as we hoped by tracking the differences in retention, pass rates, and grade distributions between learning contract and traditional courses. We also examine survey data from students and faculty to make initial observations about students’ and instructors’ perceptions of how the learning contract impacted teaching and learning.

Keywords: learning contract, first-year writing, assessment, online writing instruction, writing program faculty development

The Framework for Success in Postsecondary Writing (Council of Writing Program Administrators [CWPA] et al., 2011) proposes eight “habits of mind,” or intellectual behaviors, as essential for college readiness and as practical preparation for lifelong learning. The habits include curiosity, openness, engagement, creativity, persistence, responsibility, flexibility, and metacognition. These eight habits of mind are central to the curriculum in our online first-year composition (FYC) program; we use them in conjunction with the WPA Outcomes Statement for First-Year Composition (CWPA, 2014) as the learning outcomes across our courses. Students engage with these habits their first week of class, reflecting on the habits they feel most confident about as well as those they need to strengthen. Students also reflect on their growth related to these habits throughout the class and consider how they apply them to other academic, professional, and personal situations. Rather than using the habits of mind to reinforce White language supremacy, we introduce them as an opportunity for students to set their own goals related to developing habits and dispositions that will support their individual learning while “attend[ing] to an ever-widening universe of reflective discourses” (Inoue, 2019a, p. 361).

Research on the habits of mind intersects with work on reflection and metacognition and has renewed interest in cognitive approaches within writing studies. For example, recent years have seen the publication of The Framework for Success in Postsecondary Writing: Scholarship and Application (Behm et al., 2017), Contemporary Perspectives on Cognition and Writing (Portanova et al., 2017), and the forthcoming Teaching at the Intersection of Cognition and Writing (Rifenburg et al., in press). E. Shelley Reid (2017) explores the relationship between the habits of mind and student dispositions, noting that work on self-efficacy (Bandura, 1986), self-regulated learning (Zimmerman, 2002), and learner mindsets (Dweck, 1996) suggests a link between “emotional dispositions and the more general concept of metacognition or reflective practice,” such that “learners’ dispositions are revealed through metacognition and thus should be read as complementary rather than opposed to learners’ cognition” (p. 292). Students in our courses face many challenges as primarily non-traditional learners in online, accelerated classes, and we use the habits of mind to support their growth by actively engaging them in thinking about how their mental habits factor into their ability to accomplish the work in the course. Scholarship consistently indicates that online students have lower retention rates than students in face-to-face classes (Bawa, 2016), and this is true of our program. Research on student habits and dispositions thus influenced us as we considered strategies for improving student persistence in our courses. As we were already asking students to re-think and re-build their cognitive habits related to learning, and to writing more specifically, we considered the use of contract grading as an additional tool to shift student dispositions about learning away from grades and onto their learning itself.

Our goal in simplifying the assessment process was not only to shift students’ focus but also to encourage instructors to continue to develop instructional habits that prioritize an inclusive, student-centered feedback process while de-emphasizing the numerical structure of the gradebook. Faculty workload in this non-traditional teaching environment is intense: Faculty teach a 5/5 load, and courses are only 7.5 weeks long, which intensifies the workload related to responding to student writing. The contract grading system we developed streamlined the assessment process, allowing teachers to focus on learning and skill transfer rather than on grades, without adding to their workloads. Our goal was also to encourage faculty to focus on students’ individual growth as learners by creating more inclusive assessment methods that account for the diversity of student experiences and that “[teach] our students where they are” (Harris, 2019, Introduction). In addition, we hoped to identify faculty dispositions and practices that contribute to moving beyond the transactional model of teaching and learning. We wanted to understand what uncertainties and concerns instructors had about adopting this method and how the experience impacted their approaches to teaching online. Ultimately, we hoped to begin to build new habits of thinking about learning and teaching outside of grading for both students and instructors.

In this article, then, we first discuss the process of developing a learning contract for a master-class online FYC program. We address the challenges we encountered, which included deciding what type of contract to adopt, building instructor and instructional designer buy-in, and adapting the contract to the learning management system (LMS). As the majority of the scholarship on contract grading has been conducted in traditional classroom contexts, our experience will be helpful to faculty in alternative instructional environments who are seeking to develop a contract grading system. Second, we share the quantitative results of the initial pilot, describe our subsequent revisions, and assess the pilot’s impact on retention. Third, we share qualitative survey responses from students and faculty from the pilot and revised pilot to assess how the learning contract impacted teaching and learning. Fourth and finally, we share our takeaways on the applicability of our experience to other composition programs.



We are administrators of a fully online FYC program that serves more than 8,000 students per academic year, including summer sessions. Our courses run on an accelerated, 7.5-week schedule, which compresses the work of a 15-week, 3-credit course into half the time. Summer sessions are only 6 weeks long and thus move even more quickly. It is an intense work schedule for students and instructors alike. Students complete scaffolded assignments to compose two multimodal projects and a digital portfolio, in which they reflect on their learning at the beginning of the course, in the middle, and at the end. These three reflective writing activities help them assess their development in relation to the course learning outcomes, connect the writing they do in our classes to other composing they are doing or might do, and develop and evaluate their own goals as writers.

Our online students are predominantly non-traditional; according to Spring 2020 data, 56% are ages 25 to 39 while 53% are juniors and seniors. The racial and ethnic composition of our university’s online students does not reflect the national average: 59% of our online students are White, 19% are Hispanic, 8% are African American, and 5% are Asian American while nationally 54% of online students are White, 27% are African American, 11% are Hispanic, and 3% are Asian American. Although many online students transfer to our university with some college credit, as with most first-year writing courses, a majority of new online students enroll in one of our courses their first semester. Historically, about two thirds of students who enroll pass our classes with a C or better.

Each of our courses, English (ENG) 101, 102, and 105, uses a master-class model in which a course coordinator oversees the design and set-up of the Canvas courses and the development of curriculum. Multiple instructors teach in each of the Canvas classes, all using the pre-established curriculum. For instance, in Fall A 2019, we had seven distinct ENG 102 courses, each with between four and five instructors. While our courses might have as many as 186 students, we use the “section” option in Canvas, so students are assigned to a specific instructor and only interact with other students in that smaller section.

Given our master-class model, it is not possible for one instructor to individually decide to use a contract grading system. Rather, these decisions happen in conversation with coordinators and faculty. As we began to consider adopting a contract grading system, we thus knew it would be at a much larger scale with many more stakeholders involved than in a traditional class.   

While much previous work on grading contracts has focused on traditional class structures taught by tenured faculty, most writing faculty do not operate in this privileged space. Rather, the majority of writing instructors are not tenure-track and have heavy teaching loads. Furthermore, non-traditional teaching and learning contexts such as online education continue to grow (Seaman et al., 2018). We believe our experience developing a contract grading model program-wide, in a non-traditional context, will be useful for faculty members and administrators who are grappling with the challenges of teaching in constrained contexts and struggling to adopt student-centered assessment models that privilege dialogical feedback practices.  

In our accelerated, online courses, we face a variety of challenges, one of which is student retention and persistence.  In the 2019 introduction to the updated Bedford Bibliography of Research in Online Writing Instruction, Heidi Skurat Harris identifies several areas in online education where further research is needed, among them retention and student experiences with and perceptions of online education. She notes that the updated bibliography includes only one study of retention and recommends that more data-driven research be done that “supports effective writing practices and demonstrates the value of writing instruction to retention at the university-level” (Introduction). Harris (2019) also recommends the need to “systematically study students’ perceptions and experiences in online writing classes….Making our online writing courses inclusive and teaching our students where they are continues to be a focus of [online writing instruction] OWI research” (Introduction).

In a 2016 literature review of research on retention in online education, Bawa identifies several factors that can lead students to withdraw from online courses. These include misconceptions about cognitive load, students’ ongoing family commitments and social obligations, motivation and accountability, and lack of familiarity with educational technologies among both technically novice and technically savvy students. Bawa also finds that attrition is related to faculty preparation and experience, specifically instructors’ lack of understanding of the needs of online learners, faculty inexperience with educational technology, and the lack of institutional investment in training for faculty to support their transition to online instruction.

We address several of these challenges already in our program; namely, we provide technical resources and orientations to students and instructors, as well as ongoing faculty development related to OWI. However, many of the student factors are difficult to address on the instructional side. As Bawa (2016) summarizes:

Learners do not consider the magnitude of workload and the required depth of their involvement in the online courses as reasonable criteria to make the decision to go online. As a result, when they attend the online classes, many of them are unpleasantly surprised to find that the conveniences of flexible hours and lower cost outweigh the inconveniences of excessive demands on lifestyles, technical issues, and concerns related to the attitude and aptitude of learners toward a new platform. Online learning environment is very largely self-driven and dependent on the learners’ ability to manage academic responsibilities, with fewer props than those available in face-to-face classes. (p. 4)

This signals a disconnect between student perceptions and the real conditions of online learning. Bawa suggests that these misconceptions contribute to “cognitive overload,” citing research from Paas et al. (2004) who define cognitive overload as “a situation where learners are intimidated by a large amount of information that needs to be processed all at once before real learning can begin” (p. 4). Unfortunately, we are not in a position at our institution to mandate online orientations for students or implement more detailed self-assessments for readiness for online learning, which might help better prepare students to be online learners. In developing a learning contract, we intended to mitigate the cognitive overload students might experience as new online learners in our courses by simplifying the assessment model and creating a map for successful completion of the course.

As we developed a contract-based assessment model, the work of Peter Elbow, Ira Shor, and Asao Inoue on grading contracts deeply informed our approach. Scholars such as Shor (2009) and Inoue (2019b) use a negotiated contract, in which students participate in developing criteria for the grading contract. Shor (2009) writes that he “negotiate[s] grading contracts with students to construct the classroom as a public sphere for democratic arts” (p. 7) while grading student writing based on quality on a traditional A to F scale. Inoue (2019b) also negotiates the contract specifics with students although his assessment focuses on students’ labor rather than on the quality of their writing. While we share Shor’s and Inoue’s goals of establishing a more democratic and less hierarchical teaching and learning environment, we made the decision to use a unilateral contract. Danielewicz and Elbow’s (2009) “A Unilateral Grading Contract” became an important model for the contract we developed. They describe their unilateral approach to grading contracts as follows: “With our goal of reducing the effect of grading, we give up as much power over grading as we can manage, but we keep full power over course requirements” (p. 247). In their contract, and in ours, the conditions of the contract are pre-determined and are not negotiated with students.

We chose the unilateral contract model because the accelerated, asynchronous structure of our courses does not allow for meaningful engagement of students in the process of developing the contract or rubrics for the course. Grading contracts are clearly emerging as an important pedagogical approach in the field, one that can benefit online learners. But as scholarship on learning contracts primarily comes from relatively traditional teaching and learning environments, in face-to-face classrooms using quarter or semester systems, it was necessary for us to adapt the contract model to our particular context. Synchronous discussions can create burdens for online students, many of whom work full time and live in different time zones. In addition, our university strongly encourages faculty to make online courses fully available to students prior to the actual start date. Because our non-traditional students are often negotiating competing demands, it seemed unreasonable to potentially alter expectations during the course, when all students likely would not be able to contribute to contract revisions.

We also found much value in Inoue’s (2019b) work on labor-based assessment. In ideal teaching and learning conditions, we would prefer to eliminate quality as a vector of assessment and focus primarily on student growth. Although tracking labor and contribution to the course community seems to be a valuable method of prioritizing student growth and engagement, this system is fairly labor intensive for both students and teachers. We know that our students have unrealistic expectations about the workload in online courses, which is exacerbated by the accelerated 7.5-week timeframe. Because students already struggle to complete the required work, adding work, in particular asking them to do more work in order to earn an A, would prove to be an additional stressor for students. In addition, our faculty have heavy loads, often working with 60 students at once, and in the 7.5-week courses must respond to student work very quickly. We did not want to add further challenges to that already heavy workload. Finally, our grading schema already privileges student labor as well as their own self-assessments by heavily weighting process invention work and portfolio reflections (a combined 75% of the total course grade). Portfolio reflections are a central part of the assessment process in our courses and give students the opportunity to explain their learning to their instructors.

Thus, we developed a hybrid contract to meet the needs of online learners and teachers. We primarily drew on Danielewicz and Elbow’s (2009) contract in our approach to determining the grading system. We followed their schema of using the B grade to anchor the contract. That is, students are required to do the following for each assignment in order to earn a B: complete all components for each assignment, complete the assignment in a meaningful and substantive manner as outlined by the rubric, demonstrate consideration of previous feedback, and submit the assigned work on time. As Danielewicz and Elbow acknowledge, this approach does widen the category of B to include a broader range of student writing. We embraced this possibility as we hoped it would enable us to move closer to equitable learning conditions by accounting for students’ diverse range of access to dominant literacy practices and valuing their interventions into college-level writing. Our contract also describes how students can exceed the B grade through “the thoroughness and quality of their thinking and composing.” Students can fall short of the B grade by missing assignments, not completely meeting assignment requirements, and submitting late work.

Our goal was to encourage students to worry less about grades by giving them a clear path to success and thereby allow them to focus more on establishing their own goals with their writing and feel more confident taking risks as writers. Research on the relationship between self-efficacy and outcome expectations in writing instruction suggests the perceived outcome of completing a task can influence individuals’ willingness to engage in that task (Pajares, 1997, p. 6, as cited in Khost, 2017). As Khost (2017) notes, there is disagreement in the social cognitive theory literature about whether students’ perceptions of their ability to complete a task are a stronger predictor of whether they will do it than their perceptions of the likely outcome. We see the learning contract as functioning to clearly present outcomes to students, such that it will influence their willingness to engage. While this may not account for the impact of their sense of self-efficacy, the portfolio reflections in the course offer a means for them to set and work toward their own learning goals.

Making a program-wide shift to contract grading required professional development with faculty to help them understand the potential benefits, discuss their concerns, and gather their feedback. We involved faculty in initial drafts of the contract, and from Fall 2018 to Fall 2019, we spent time in faculty meetings discussing and revising the contract as well as engaging in discussions on research related to contract grading. We also invited Asao Inoue to a meeting in which faculty had the opportunity to learn the details of his contract and experience teaching it.

A survey of faculty conducted prior to the contract implementation in Fall 2019 (see Appendix A for questions) revealed that initial perceptions of the contract benefits included shifting students’ focus away from grades and onto learning. Faculty anticipated the contract would help students and instructors “focus on the writing process” and “engage more with feedback.” One instructor commented,

I love the focus on student learning rather than grading. This is beneficial for both sides of the classroom - the student can focus more on what they take away from the content (transfer), and the instructor can focus more on looking at what the student is learning from these assignments. While the student is less concerned about scores and grades, this also takes that same burden off the instructor - they no longer need to spend so much energy looking at what was done wrong, and instead can focus on what was done right.

The sentiment that the contract would also “free” instructors to focus more on student learning was echoed in many of the responses to the survey. Instructors wrote they saw potential for the contract to simplify the grading process as well as make their jobs “easier and more straightforward.” They looked forward to “being able to give feedback more freely with less of a constraint of trying to justify a grade.” 

Perhaps unsurprisingly for faculty used to working with a points system and matching those end-of-semester totals to an A through F traditional grade scale, much of the uncertainty and anxiety expressed by instructors was related to distinguishing between A submissions (exceeds expectations for a B) and B submissions (meets expectations). Instructors were uncomfortable with the gray area they perceived between these criteria. Yet many instructors also expressed that the new assessment model would better reflect their pedagogical approach and thus improve their teaching practices. For instance, one instructor remarked that the contract might “free up time I spend agonizing over whether a number grade I assign is fair, and I can use that time to create better teaching materials and better student video feedback on assignments. I think I will become more efficient as a guide.” Another expressed,

I think that using a learning contract would allow me, as an instructor, to focus on educating students rather than “grading.” I think it's a shift in perception, largely, but I believe, for instance, that my marginal comments on students' essays would cease being implicit justification for why the student earned the grade they did and instead be a more supportive stepping stone in their learning.

Conversely, other instructors worried about the learning contract model from the student perspective: Some expressed concern that students would not understand the new assessment model and that the online format would make it difficult to ensure student understanding. For instance, one instructor remarked, “I'm not sure how they will interpret this contract, and I'm afraid that an online learning environment might not allow me to answer all questions in the way that face-to-face teaching allows me to field questions in real time.” While one goal of the contract was to relieve anxiety for struggling or inexperienced student composers with the hope of increasing retention and persistence in our courses, our teachers expressed concerns about students driven to receive A (or even A+) scores and final course grades. One teacher expressed this concern by writing, “[I have] concerns about ‘A’ or scholarship students…[we need to] define the requirements to receive an A.” Although we want students to focus on their learning and transferring that learning to other composing situations, many student scholarships at our university are tied to students maintaining a specific GPA. This final-grade-based mindset was perceived by many teachers as an obstacle to getting student buy-in on our proposed learning contract.



In our program, we serve more than 3,500 students a semester, and we implemented the learning contract within approximately half of our courses. We piloted the contract in two multi-section Canvas classes of ENG 101, with a total of 255 students and seven instructors; three multi-section Canvas classes of ENG 102, with a total of 500 students and 11 instructors; and one multi-section Canvas class of ENG 105, with a total of 95 students and three instructors. As we use a master-class model with a shared curriculum, we were able to implement the learning contract consistently across the three courses.

Because of both the scale of our enrollment and the online nature of our courses, we need to designate grades for each assignment in the gradebook in Canvas, our course LMS. We worked with our instructional designer to determine how best to represent the three categories of grades. Ultimately, we decided on a 3-point grading scale as follows:

  • 3 score = exceeds expectations for a B
  • 2 score = meets expectations for a B
  • 1 score = does not meet expectations for a B

We changed the default grading scheme to align with our 3-point scale, shown in Appendix B. We also revised our grading rubrics for all assignments to reflect the assessment expectations.
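As a concrete illustration, here is a minimal, hypothetical sketch of how per-assignment scores on this 3-point scale might roll up into a final letter grade. The function name and the numeric cutoffs are illustrative assumptions only; the grading scheme we actually adopted is the one shown in Appendix B.

```python
# Hypothetical sketch: converting 3-point contract scores to a letter grade.
# The cutoffs below are illustrative assumptions, not the Appendix B scheme.

def course_grade(scores):
    """scores: per-assignment marks on the 3-point scale
    (3 = exceeds, 2 = meets, 1 = does not meet expectations for a B)."""
    average = sum(scores) / len(scores)
    if average >= 2.5:        # mostly "exceeds" work
        return "A"
    elif average >= 2.0:      # consistently meets the contract terms
        return "B"
    elif average >= 1.5:
        return "C"
    else:
        return "D"

# A student who meets expectations on every assignment earns the anchor B:
grade = course_grade([2, 2, 2, 2, 2])
```

Under this sketch, the B functions as the anchor grade: meeting every contract condition yields a B, and only work that consistently exceeds expectations moves the average into the A range.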

To effectively communicate the goals and expectations, as well as the advantages of the learning contract, we included the learning contract in the introductory module in all course shells. To access the rest of the modules, students were required to view the contract. Faculty also engaged students with the contract by making a screencast video going over the key points, explaining the goals and rationale of contract grading in an announcement, and/or opening ungraded class discussions about the contract. To support faculty, we formed a learning community where we met once a week online to discuss implementation and suggestions for improving the contract and related materials. We designed practice assessment activities that helped us all “see” student work through the lens of the contract by assessing sample assignments together and talking through the nuances of applying this new assessment method.

We used a mixed-methods approach to determine the outcomes of the contract assessment model in the pilot courses. With the assistance of researchers at our university’s online education unit, we collected quantitative data on student engagement, persistence, and retention. We also obtained IRB approval to survey faculty and students in order to assess perceptions of, attitudes toward, and responses to the contract model. Participants completed consent forms in Google Forms prior to completing the distinct surveys. We administered a short answer, anonymous survey through Google Forms to all 43 faculty before the beginning of the semester to gauge their perceptions of how the learning contract would impact teaching and learning in the program; we received 25 responses. At the end of the pilot, we administered a follow-up, short answer, anonymous survey to the 18 instructors teaching in the pilot courses; we received 14 responses. Pre-course and post-course survey questions can be found in Appendices A and C, respectively. 

To guide our quantitative data collection on student engagement, persistence, and retention, we worked collaboratively with the online education unit to look at the following:

  • Whether the use of a contract assessment model was associated with a significantly and meaningfully higher pass rate (percent of students who earn a C or higher) than in traditional courses.
  • Whether the use of a contract assessment model was associated with a significantly and meaningfully lower withdrawal rate than in traditional courses.
  • Whether there was a significant and meaningful difference in grade distributions between contract and traditional courses.
  • Whether the use of a contract assessment model was associated with significantly and meaningfully higher levels of student engagement with their instructor and their learning process (as captured by the survey) than in traditional courses.

The online education unit provided comparative data related to student pass rate, withdrawal rate, and grade distributions between the contract and traditional courses, which we will discuss in the Results section. To capture data on student engagement, we developed nine 5-point Likert-scale questions administered to students in both traditional and learning contract courses at the end of the session. To assess student perceptions of and responses to the contract model, we included three additional open-ended questions for students in the pilot courses. (The student survey questions can be found in Appendix D.) The anonymous survey was administered via Google Forms with IRB approval.


Results

As discussed earlier, one of our goals for developing a contract assessment model was to present clear outcomes to students in order to influence their willingness to complete coursework and persist in the class. To determine the potential impact of the learning contract on student retention, we compared both the pass rates and withdrawal rates of students in the learning contract courses to those in our traditional classes. For both ENG 101 and ENG 102, two-tailed, two-proportion z tests were conducted to compare the pass rates (percent of students who earned a C or higher) and the withdrawal rate between students in the learning contract and traditional sections of the courses. Because of the small number of students enrolled in ENG 105, all sections of that class participated in the learning contract pilot study, and thus we could only compare this course with historical rates (see Appendix E).

The data revealed that ENG 101 students in the learning contract sections of the fall course had significantly lower pass rates (53% vs. 67%) and significantly higher withdrawal rates (31% vs. 21%) compared to students in the traditional sections of the course. However, for ENG 102, there were not significant differences in pass and withdrawal rates between students in the learning contract and traditional sections of the course. Because of the differences in student outcomes between the two classes, we cannot know whether the adoption of learning contracts directly led to lower levels of success in ENG 101 or if those outcomes were the result, in whole or in part, of other factors, such as pedagogical differences between instructors or student experiences. We did note significantly more first-time, first-year students in the ENG 101 classes than in the ENG 102 sections. In ENG 101, 60% of the learning contract students and 58% of the traditional students were first-time, first-year students, compared to 23% of students in learning contract sections and 24% of students in traditional sections of ENG 102. We speculate this could be a factor in the different outcomes for the courses, which we will consider as we revise our contract and supporting materials going forward. This seems to be supported by existing research on retention in online education, which suggests that less experienced students who are earlier in their progression toward a degree are more likely to withdraw (Bawa, 2016).
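For readers who wish to run a similar comparison on their own data, the two-tailed, two-proportion z test can be computed with only the Python standard library. The counts below are illustrative placeholders chosen to echo the reported 53% vs. 67% ENG 101 pass rates; they are not the study’s actual enrollment figures.

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-tailed, two-proportion z test.

    x1, x2: number of 'successes' (e.g., students passing);
    n1, n2: group sizes.
    """
    p1, p2 = x1 / n1, x2 / n2
    # Pooled proportion under the null hypothesis of equal rates
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-tailed p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative counts only (chosen to echo the reported 53% vs. 67%
# pass rates), not the study's actual data:
z, p = two_proportion_z_test(135, 255, 170, 255)
```

With a difference of this size at these sample sizes, the test returns a p-value well below .05, which is the sense in which the pilot's pass-rate gap was "significant."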

The overall distribution of grades was significantly different between the learning contract and traditional courses in both ENG 101 and ENG 102. Approximately 50% of grades in both courses were As and Bs. However, in the traditional sections, 31% were As and 20% were Bs, while in the learning contract sections the breakdown was nearly the mirror image: 18% were As and 34% were Bs. In short, a much higher percentage of students earned Bs in the learning contract courses. While additional research and discussion are needed to better understand the relationship between contract grading and student outcomes, we interpret these results to suggest that our use of the B contract grade as the default in structuring both the contract and the rubrics was the primary factor in the increase of B grades and the decrease of A grades in the learning contract sections.
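A difference in overall grade distributions of the kind described above can be tested with a Pearson chi-square test of independence on the grade counts. The counts below are hypothetical, constructed only to illustrate the computation; they are not our enrollment data.

```python
def chi_square_statistic(table):
    """Pearson chi-square statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts of A, B, C, D, and E grades
observed = [
    [62, 40, 30, 10, 58],  # traditional sections
    [36, 68, 32, 12, 52],  # learning contract sections
]
stat = chi_square_statistic(observed)
# With (2 - 1) * (5 - 1) = 4 degrees of freedom, the critical value at
# alpha = .05 is 9.488, so stat > 9.488 indicates a significant difference.
significant = stat > 9.488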

Figure 1

Distributions of Course Grades for ENG 101

Note. The blue distribution is of students in the traditional course sections. The orange distribution is of students in the learning contract course sections.

Figure 2

Distributions of Course Grades for ENG 102

Note. The blue distribution is of students in the traditional course sections. The orange distribution is of students in the learning contract course sections.


In light of these findings, we worked with faculty in the pilot to revise course materials for the spring semester, clarifying the language on the contract and rubrics that distinguishes A and B grades, and then implemented the learning contract program wide with the updated materials. Although there were still fewer As on average in Spring A than historically across all course sections, the percentage of As was much closer to historic averages than in the pilot (see Appendix E). Furthermore, while retention did not improve as a result of the learning contract, the pass rate in all sections, including ENG 101, was in line with historical averages. It is still not clear what factors impacted the low pass rate in the ENG 101 pilot; more research is required to understand and account for these data (see Appendix E).

Student Survey Responses

To determine if the contract helped increase student focus on the learning process and skill transfer and decrease the focus on grades, we studied results from the end-of-term student survey (see Appendix D). Students in both traditional and contract sections responded to nine questions using a 5-point Likert scale, with 5 being the highest and 1 being the lowest. Students in contract courses also responded to three short answer questions. Respondents included 53 students in the traditional courses and 44 students in the contract courses (see Appendix F). 

Five of the questions showed a difference of more than 15 percentage points in the share of students in traditional versus learning contract courses who rated their response a 5, the highest ranking. For instance, 55% of respondents from the Fall 19 Session A learning contract sections rated their level of engagement with their instructor (Question 1) as a 5, compared to 36% of respondents from traditional sections. Question 3 asked students to rate the degree to which their instructor emphasized improving their writing and composing skills; 67% of respondents in contract sections rated this a 5, compared to 48% in traditional sections. Because the sample size is small, we cannot extrapolate these data to make definitive statements about student experiences. However, coupling these data with the comments we received from students and faculty (discussed below) makes us optimistic about the possibility for the learning contract to increase student-teacher engagement.
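The screening described above (comparing the share of respondents who rated an item a 5 across section types and flagging large gaps) can be sketched as follows. The ratings are hypothetical stand-ins, not our survey data.

```python
from collections import Counter

def top_box_rate(ratings, top=5):
    """Share of respondents who gave the highest rating."""
    return Counter(ratings)[top] / len(ratings)

def flag_gaps(contract, traditional, threshold=0.15):
    """Return question IDs whose top-box rates differ by more than the threshold."""
    flagged = []
    for q in contract:
        gap = abs(top_box_rate(contract[q]) - top_box_rate(traditional[q]))
        if gap > threshold:
            flagged.append(q)
    return flagged

# Hypothetical 5-point Likert responses for two questions
contract = {"Q1": [5, 5, 5, 4, 3, 5, 5, 2, 5, 4],
            "Q3": [5, 5, 4, 4, 5, 5, 3, 5, 5, 5]}
traditional = {"Q1": [5, 4, 3, 4, 3, 5, 2, 4, 5, 5],
               "Q3": [5, 4, 4, 5, 3, 4, 5, 4, 5, 5]}
gaps = flag_gaps(contract, traditional)
```

Note that this is a descriptive screen, not a significance test; with samples as small as ours, such gaps are suggestive rather than conclusive.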

Student responses to Question 8, however, raise questions about the potential effectiveness of the contract to change student habits related to grade-focused learning. Specifically, 57% of respondents from contract courses rated their concern about their grade a 5, whereas 23% of respondents in traditional sections rated their concern for grades a 5. This may be related to Question 9, which asked students to rate their level of understanding of the grading expectations in the course. Of the respondents in contract courses, only 37% rated their understanding as a 5, whereas 64% of respondents in traditional sections did. Student confusion over the distinction between A and B grades may have contributed to this: students in contract courses might have been more concerned about their grade because they were less clear about the criteria for an A. Similarly, Question 7 asked students to rate the degree to which their instructor emphasized getting a good grade in the course. Of the respondents in contract courses, 32% rated this a 5, whereas 21% of respondents in traditional courses did. This raises the question of how students understand teacher feedback and support. If students measure their own success in terms of grades, perhaps they also measure supportive instruction as “emphasizing good grades.” Ongoing qualitative research is needed to better understand student attitudes towards learning, instruction, and grades in contract grading environments.

Responses to Questions 4 and 5 also raise questions for future research. Question 4 asked students to rate the degree to which they read and acted upon their instructor’s feedback and comments on their work in the course. Of the respondents in learning contract courses, 57% rated this a 5, whereas 67% of respondents in traditional courses did, suggesting respondents in traditional courses were more attentive to the feedback they received from their instructors. This could be the result of students in contract courses feeling more comfortable taking risks in their writing because of the contract nature of the assessment process. In contrast, for Question 5, 43% of respondents in contract courses rated the degree to which their writing and composing knowledge and skill improved as a result of the class a 5, whereas 28% of respondents in traditional courses did. Finally, 25% of respondents in contract courses rated their level of enjoyment of the course (Question 6) a 5, compared to 21% of respondents in the traditional classes. Again, more qualitative research is needed to better understand student experiences, but responses to the open-ended questions provide some possible explanations for these results.

To better understand how students understood and experienced the learning contract, we analyzed responses to the open-ended survey questions. Of the 44 respondents to the survey, 33 responded to the first two-part question: “What aspects of the learning contract in your Writers’ Studio class did you find to be most beneficial to your experience in the class? In what ways do you think it might have helped you?” Of those responses, three indicated the student did not like anything, and one was ambivalent. Another 15 suggested the student either did not understand the question or did not understand what we meant by the “learning contract”; that is, many referenced other elements of the course, such as the writing process, peer and instructor feedback, and reflection. Two students indicated they did not know how to answer or what the question was asking. As this is a first-year writing class that for many students is their first college course or first course at our university (or both), it is not surprising they may not have found contract grading unusual. They are encountering many new expectations, discourses, and digital learning environments, so the contract may not seem any more new or disorienting than other aspects of their experience as new college students.

The final 14 responded in ways that indicated they valued the way the contract clarified outcomes, decreased potential stress or cognitive overload related to grades, and redirected the focus to their learning. The following student statement was typical of the student responses to this question:

The aspect of the learning contract that I found most beneficial was knowing that as long as I completed my work in a meaningful way that showed I put in my best effort, I would get a B at the end of the course. This allowed me to worry less about how exactly to get a good grade, and gave me the chance to focus on how to complete each assignment to my best possible ability and grow as a writer.

Of these 14 responses, 12 indicated the contract grading system’s clear articulation of outcomes reassured them their work or “best effort” would be valued. Other students who focused on clear outcomes mentioned benefits such as feeling reassured they could pass the class, holding themselves more accountable, knowing what was expected of them, and focusing on their growth as writers. Another student wrote, “It was good to know that hard work/perseverance would pay off even if an ‘A’ grade was not earned. Work hard, follow assignment instructions/rubric, and turn in the work on-time earns a B.” These responses appear to support our conjecture that clearly presenting outcomes to students could positively influence their engagement and persistence.

Related to clear outcomes, three respondents also indicated the contract reduced stress (or cognitive overload), and six suggested it allowed them to focus more on their learning. The following student connects the two, indicating that a decrease in worry about grades allowed for more freedom as a writer:

I found that knowing I would receive a B if I turned in my work on time and complete to the best of my abilities took some of the stress off of the actual writings being done for the course. I think this was helpful because it allowed me to write more freely, without being worried that I would be marked down for grammar or spelling mistakes, misunderstandings, ect. [sic]

Similar comments indicated the contract allowed students to “obtain the most from the class” and “focus solely on my learning and improving my writing skills.” These student comments affirm the goal of creating an assessment model that encourages students to worry less about grades and focus more on their own writing goals.

We also asked the students the following: “What aspects of the learning contract did you find confusing or challenging?” The most common response (n = 11) indicated students were uncertain about how to move from a B to an A on assignments; four also mentioned being confused by the 3-point scale.  The following student response typified these sentiments:

I think the grading system was very challenging and confusing. I put a lot of time, hard work, and effort into this class, but the grading system seems to make it very hard to get anything higher than a B.

As we have discussed, the distinction between B and A grades was a source of confusion in the course. This seems to be part of what the student identifies as challenging. Yet this also seems to suggest that only A grades signify recognition of effort and learning to some students who are invested in and more comfortable with traditional grading scales.

Three students discussed being worried or stressed about grades. For instance, one student said, “The grading was clear, I just wasn't fond of it. My personality was not a match. The grading stressed me out.” This student identified their own disposition as not compatible with the contract model. Another commented, “I found it challenging at times to not worry about my final grade in the course since it is still an unavoidable element to the course.” These responses echo Inman and Powell’s (2018) findings in “In the Absence of Grades,” specifically that students’ attitudes and beliefs about themselves as writers are deeply informed by the experience of being graded. As Inman and Powell argue, students rely on grades to measure their academic progress, which they often conflate with learning, and to signal their success or failure. Changing deeply rooted emotions and dispositions related to grading is not something that can be accomplished in a 7.5-week course, especially when, as the student notes, grades are still ultimately “unavoidable.” Yet it can be an introduction to thinking differently about learning, and as with any new concept, students’ exposure to these assessment practices in other courses might help them continue to build habits and dispositions that disassociate learning from grades.

Finally, we asked the contract students the following question: “If you had the choice between using a learning contract and being graded in a more traditional way, which would you choose? Why?” Of the 37 student responses to this question, six said they were not sure or did not care. Of the remaining respondents, 13 said they preferred a more traditional model, and 18 said they preferred the contract. The students who favored traditional evaluation methods cited confusion about the grading scale and more comfort in the familiarity of traditional assessment measures. Many of our students had clearly not encountered contracts in their previous educational experiences; the fact that students are more comfortable with educational models they are “accustomed to” is one obstacle to student buy-in on learning contracts. This suggests again that the habit of associating learning with a grade or score is difficult to change.

The most comprehensive of the many positive responses we received was the following:

I would choose the learning contract over being graded in a more traditional way. After being graded in a more traditional way for most of my academic career, I have felt the pressure to receive good grades even if they were not a true reflection of my learning. I have always valued my learning over grades because the knowledge is permanent and the grades are not. I carry the knowledge with me into all areas of my life, but not the grades. I would have been very happy to even just receive the minimum grade I could have given the learning contract. However, by letting go of my anxiety of needing to pass the class, I exceeded my own expectations, improved my writing skills more than I thought I could, and received an even higher grade than I was even aiming for. The learning contract provided a refreshing perspective that resonated with my own beliefs on what education truly should be and gave me hope that more classes could start heading in this direction.

This student expresses what we hoped would be the experience of all students who participated in the contract courses: reduced focus on grades that opened a space for students to set and exceed their own learning goals.

Instructor Survey Results

To compare instructors’ perceptions of contract grading after implementation with those they reported beforehand, and to collect data on their experience using the contract, we conducted a follow-up survey. The follow-up survey produced results similar to the initial survey, while also revealing an increase in instructors’ dialogical feedback practices and a preference for the streamlined assessment process. We were particularly interested in whether and how the learning contract helped instructors change or build new teaching habits. A majority of the instructors found the learning contract simplified the grading process; 12 out of 14 respondents to Question 1 (“What aspects of the learning contract did you enjoy most? Why?”) indicated this. For instance, one instructor commented:

I appreciated the way the contract simplifies grading. I did not have to make as many fine-grained (and ultimately counterproductive) decisions about how many points to deduct in a certain case. I also appreciated that the contract makes explicit what was already implicit in my grading practices, namely that I put more weight on meeting the rubric requirements as regards completeness, effort and attention to revision (where applicable) than more abstract judgments of quality.

Other respondents (n = 6) indicated they enjoyed the focus on feedback the contract enabled; three mentioned they found it helped students succeed, and one mentioned preferring the focus on student effort.

Instructors’ perception that the learning contract enabled them to focus more on feedback was also reflected in their responses to Question 3 (“What aspects did you find facilitated or improved teaching? Why?”); of the 14 responses, eight reiterated that the contract helped them to focus more on dialogic feedback, for example:

For the same reasons as the students, the focus was less on grading and more on discussing the work, process, growth, progress, transfer, etc. with the students. This meant I had more time to dig into the important, interesting details of the assignments when giving feedback rather than having to spend time explaining why I was taking points off in certain categories.

This comment suggests that the contract encourages instructors to continue to develop instructional habits that prioritize a dialogical feedback process by helping build new feedback habits focused on guiding student growth. Other respondents said the contract made grading less stressful (n = 3), that it allowed for more fairness in grading (n = 3), that it made grading less time consuming (n = 1), and that it simplified responding to student questions about grades (n = 1).  

We also asked instructors to reflect on the impact the contract had on their teaching practices (Question 6) and their engagement with students and their writing (Question 7). The 13 responses to each of these questions varied widely. The most common responses to Question 6 were again related to feedback: Four respondents discussed ways the contract improved their approach to feedback or student perception of it. Two respondents indicated the contract had not had a clear impact on their teaching practices, two discussed being able to emphasize the writing process more in their interactions with students, two indicated it improved communication with students, and two noted their teaching became more student focused. Other instructors noted they were “more careful in the language I use in my feedback,” that they have become “more fair,” and that their “teaching practices have been heightened.” The 13 responses to Question 7 again included four responses that the contract had not changed their engagement with students. However, many instructors noted positive changes in their practices; for example, four instructors discussed the way their communication with students about their writing had improved. Specifically, instructors commented, “Our dialogue has become more conversational and less prescriptive,” and “I find myself being more personable and conversational in my feedback.” Two instructors also noted they engaged more with students about process; for example, “My feedback focuses on process and developing transferable skills, strategies, and practices. The contract emphasizes the structure of our courses' invention and reflection work.” Thus, instructors’ responses indicate the contract both enhances their existing instructional practices and has helped them continue to build more student-centered pedagogical habits.

In Questions 4 and 5, we asked instructors to reflect on the challenges they anticipated prior to using the contract and those they in fact encountered. Responses to Question 4 indicated that the majority of instructors’ initial concerns centered on grades: students arguing about their grades (n = 3), being overly concerned about obtaining a “3” (n = 3), or not understanding the grading scale (n = 3). For instance, one instructor remarked,

As a student who would have been concerned about how to get an A in the contract, I anticipated that to be a challenge. Most of the grade questions I got were about getting an A, but I think I would have had those questions with the traditional system; however, with the contract I went back to the rubric and learning outcomes more than I did before.

Two instructors indicated concern about unclear assessment criteria, and three indicated they had had no initial concerns. The challenges instructors reported actually encountering (Question 5) included determining scores on the 3-point scale (n = 4), explaining the contract to students (n = 3), and using the rubrics (n = 2). Comparing these responses to the responses to Question 1, in which most instructors said the contract simplified grading, suggests that, while the contract ultimately simplified the assessment process, adopting this new assessment model required instructors to rethink previous practices and develop new habits of assessment.

Finally, instructors were asked how using the contract impacted their identities as teachers (Question 8). Of the 13 responses, six indicated their identity as teachers had not changed or had changed very little. The remaining seven respondents listed a variety of ways their identity had changed, including having space to be more nurturing, focusing more on student improvement, and “offering ‘advice’ [rather] than judgment.” Instructors indicated they emphasize learning more, specifically critical thinking and transfer, feel clearer about programmatic expectations, worry less about grades, and focus more on feedback. One instructor indicated feeling “harder on students” than in previous terms, but this sentiment seemed to be an outlier. To our question about whether they wanted to continue with the contract, all 14 respondents who gave a clear answer preferred continuing, although eight wanted revisions. (We received one additional, ambiguous response about whether the instructor supported continued use of the contract.)

Instructors found the contract to be a valuable tool for meeting the needs of the diverse student population our program serves, as expressed by the following comment:

I would prefer to continue with a learning contract. The expectations are clearly stated up front and more students benefit from the practice of the contract. The contract allows for conversations with over-achieving students that challenge them to think about their learning as an end result not a grade. It supports student learning for those students who haven't been rewarded for working the grading system. I find it to be an exciting way to address learner engagement for first-year students from diverse backgrounds.

In addition to helping them build new approaches to assessment and feedback, instructors overwhelmingly cite the contract as helping to “re-energize” their teaching, “heightening” their own practices, and improving their communication with students. 



Findings from our experience developing a contract grading system in a large-scale online writing program can help administrators and teachers in online first-year writing programs develop similar, context-specific contract assessment models. Our data reveal that, while contract grading is not a panacea for the complex issues related to retention in online courses, it does have the potential to improve student and teacher experiences. Simplifying assessment can help online teachers better manage the “paper” load, freeing them to focus on coaching students through feedback. Contract grading can also increase equity in writing assessment by expanding grading categories to account for the diverse writing knowledge and experiences first-year students bring to the FYC class. This is particularly exigent in online courses, where it can counter the potential for anonymous and impersonal instruction and assessment.

Our data align with the findings of Inman and Powell (2018) and suggest that successful implementation of contract grading in any environment requires changing the habits and dispositions that shape students’ and instructors’ investment in grades. This is a hurdle that anyone considering a transition to contract grading must grapple with. In online courses with a diverse mix of first-time, first-year students and returning and non-traditional students, our findings suggest that preparing faculty for the transition is essential for successful implementation. Ongoing professional development with faculty about how contract grading impacts their teaching and their students’ learning is essential to guiding the process.   

Students who are coming to post-secondary online learning directly from secondary education contexts are more likely to perceive their learning and their identity as learners through the lens of traditional grading structures. Anticipating that first-time, first-year students might struggle the most with understanding the contract model of assessment can help teachers and administrators develop methods for interpreting contract grading for this student population and supporting faculty as they engage with these students about the rationale for contract grading. Incorporating the habits of mind and systematic reflection alongside the use of contract-based assessment can help students transition to a new way of thinking about learning by transforming their cognitive habits and dispositions. As with any new concept, students’ exposure to these assessment practices in other courses might help them continue to build habits and dispositions that disassociate learning from grades. These findings, we hope, will encourage and support other teachers and administrators in online and non-traditional instructional contexts to adopt contract assessment models that privilege student-centered feedback practices.


Author Note

We have no known conflict of interest to disclose.

Correspondence concerning this article should be addressed to Michelle A. Stuckey, College of Integrative Sciences and Arts, Arizona State University, P.O. Box 870604, Tempe, AZ, 85287-0604. Email: Michelle.Stuckey@asu.edu

Michelle Stuckey is a Clinical Assistant Professor at Arizona State University and the Writing Program Administrator for the Writers’ Studio, a fully online first-year composition program housed in the College of Integrative Sciences and Arts. She also oversees a course-embedded tutor program of more than 60 writing mentors who provide additional instructional support to online students.

Ebru Erdem is a Course Coordinator in the Writers' Studio, an online first-year composition program at Arizona State University. Working collaboratively with the instructional team, she oversees English 102.

Zach Waggoner is a Course Coordinator for the Writers' Studio, a fully online first-year composition program housed in the College of Integrative Sciences and Arts. Previously, he served as the Associate Director of the Program in Writing and Rhetoric at Stanford University.


References
Bandura, A. (1997). Self-efficacy: The exercise of control. W. H. Freeman.

Bawa, P. (2016). Retention in online courses: Exploring issues and solutions—A literature review. SAGE Open, 6(1), 1-11. https://doi.org/10.1177/2158244015621777

Behm, N. N., Rankins-Robertson, S., & Roen, D. (Eds.). (2017). The framework for success in postsecondary writing: Scholarship and applications. Parlor Press.

Council of Writing Program Administrators. (2014). WPA outcomes statement on first-year composition (3.0). http://wpacouncil.org/aws/CWPA/pt/sd/news_article/243055/_PARENT/layout_details/false

Council of Writing Program Administrators, National Council of Teachers of English, and National Writing Project. (2011). Framework for success in post-secondary writing. http://wpacouncil.org/framework

Danielewicz, J., & Elbow, P. (2009). A unilateral grading contract to improve learning and teaching. College Composition and Communication, 61(2), 244-268.

Dweck, C. S. (2006). Mindset: The new psychology of success. Random House.

Harris, H. S. (2019). Introduction to the updated version. In The Bedford bibliography of research in online writing instruction. Bedford/St. Martin's.

Inman, J. O., & Powell, R. A. (2018). In the absence of grades: Dissonance and desire in course-contract classrooms. College Composition and Communication, 70(1), 30-56.

Inoue, A. (2019a). 2019 Chair’s Address: How do we language so people stop killing each other, or what do we do about White language supremacy? College Composition and Communication, 71(2), 352-369.

Inoue, A. (2019b). Labor-based grading contracts: Building equity and inclusion in the compassionate writing classroom. The WAC Clearinghouse; University Press of Colorado. https://wac.colostate.edu/books/perspectives/labor/

Khost, P. H. (2017). Researching habits-of-mind self-efficacy in first-year college writers. In P. Portanova, J. M. Rifenburg, & D. Roen (Eds.), Contemporary perspectives on cognition and writing (pp. 271-289). The WAC Clearinghouse; University Press of Colorado. https://wac.colostate.edu/books/perspectives/cognition/

Paas, F., Renkl, A., & Sweller, J. (2004). Cognitive load theory: Instructional implications of the interaction between information structures and cognitive architecture. Instructional Science, 32, 1-8. http://www.ucs.mun.ca/~bmann/0_ARTICLES/CogLoad_Paas04.pdf

Portanova, P., Rifenburg, J. M., & Roen, D. (Eds.). (2017). Contemporary perspectives on cognition and writing. The WAC Clearinghouse; University Press of Colorado. https://wac.colostate.edu/books/perspectives/cognition/

Reid, E. S. (2017). Defining dispositions: Mapping student attitudes and strategies in college composition. In P. Portanova, J. M. Rifenburg, & D. Roen (Eds.), Contemporary perspectives on cognition and writing (pp. 291-312). The WAC Clearinghouse; University Press of Colorado. https://wac.colostate.edu/books/perspectives/cognition/

Rifenburg, J. M., Portanova, P., & Roen, D. (in press). Teaching at the intersection of cognition and writing. Parlor Press.

Seaman, J. E., Allen, I. E., & Seaman, J. (2018). Grade increase: Tracking distance education in the United States. Babson Survey Research Group. https://www.onlinelearningsurvey.com/highered.html

Shor, I. (2009). Critical pedagogy is too big to fail. Journal of Basic Writing, 28(2), 6-27.

Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory Into Practice, 41(2), 64-70.


Appendix A

Pre-Course Faculty Survey Questions

  1. What aspects of using a learning contract most interest or excite you? Why?
  2. What aspects of using a learning contract are you concerned about? Why?
  3. What challenges do you anticipate encountering with the learning contract?
  4. What impact do you think using a learning contract might have on your teaching practices? How might they change?
  5. How might you see your identity as a teacher changing as a result of using the learning contract?


Appendix B

Grading Conversion Scale for Learning Contract Courses



Appendix B table



Appendix C

Post-Course Faculty Survey Questions

  1. What aspects of using a learning contract did you enjoy the most? Why?
  2. What aspects were most conducive to student learning? Why?
  3. What aspects were most conducive to teaching? Why?
  4. What challenges did you anticipate encountering with the learning contract? Did you actually encounter those challenges?
  5. What aspects of using a learning contract were most challenging for you? Why?
  6. What impact do you think using a learning contract has had on your teaching practices? How have they changed?
  7. What impact do you think using a learning contract has had on how you engage with students and their writing? How has that changed?
  8. Do you think your sense of yourself as a teacher has changed? For instance, have your priorities or focus changed? Or has this impacted the way you view your relationship to your students?
  9. If given the choice, would you continue to use a version of the learning contract? Why or why not?

Appendix D

Student Survey Questions

  1. Rate your level of engagement with your instructor.
  2. Rate the degree to which you feel your instructor valued the effort you put into learning in this course.
  3. Rate the degree to which your instructor emphasized improving your writing and composing skills.
  4. Rate the degree to which you read and acted upon your instructor’s feedback and comments on your work in the course.
  5. Rate the degree to which your writing and composing knowledge and skill improved as a result of this class.
  6. Rate your level of enjoyment of the course.
  7. Rate the degree to which your instructor emphasized getting a good grade in the course.
  8. Rate how concerned you were about your grade in this class.
  9. Rate your level of understanding of the grading expectations of the course.
  10. What aspects of the learning contract in your...class did you find to be most beneficial to your experience in the class? In what ways do you think it might have helped you?
  11. What aspects of the learning contract did you find confusing or challenging?
  12. If you had the choice between using a learning contract and being graded in a more traditional way, which would you choose? Why?


Appendix E

Writers' Studio Distribution of Grades by Course

Appendix E table


Appendix F

Writers Studio Pilot Traditional Course Survey Results




Appendix F table
Note. LC = learning contract classes; Trad = traditional classes.