Volume 13, Issue 1: 2020

Collaborative Placement of Multilingual Writers: Combining Formal Assessment and Self-Evaluation

by Dana Ferris, University of California, Davis; Amy Lombardi, University of California, Davis

Placement of multilingual writers within writing programs is an important and challenging issue. If students perceive that the placement process is rigid and unfair, this perception may affect their attitudes and motivation levels while taking courses in the writing program. The purpose of this study was to see whether a specific subgroup of students (n = 65) in a large university writing program for multilingual students could be successful if allowed to collaborate, with guidance, in their own placement. Various data were collected about these students in their first quarter after matriculating into the writing program: instructors’ initial ratings, students’ outcomes in their initial course (final portfolio scores and course grades), and students’ satisfaction levels with their placement after they had completed the course (via a brief survey). These data were compared to those of another group of students (n = 65) who received similar placement scores but were not given the choice to move up or down a level. Findings indicated that the pilot group succeeded in their chosen courses at levels comparable to those of the comparison group and that they were satisfied with their placement choices. Implications for placement processes in multilingual writing programs are discussed.

Keywords: multilingual writers, writing assessment, directed self-placement, placement processes, student agency


As student populations at North American universities become more diverse and as colleges and universities aggressively increase proportions of international undergraduates (Institute of International Education, 2018), the placement of multilingual[1] writers into first-year writing sequences has become an issue with important practical implications. Proper placement and assessment of incoming student writers are important: If students are placed too low, they waste time and money; if they are placed too high, they may fail to complete their requirements successfully. Misplacement of students can lead to classrooms filled with students of disparate abilities, instructors who are stretched to the limit trying to meet a wide range of needs, and programs that struggle to accurately and fairly assess student achievement. Braine (1996) argued that such initial placement decisions can influence students’ “success or failure” (p. 91) in first-year writing courses.

In her book on the placement of multilingual writers into college composition courses, Saenkhum (2016) noted that the multilingual university students with whom she interacted seemed

aimless and passive, being moved around by various authority figures…when questions we really should be asking are what the students themselves want, how they can make well-informed placement decisions and exercise their own agency…instead of just doing what others tell them. (p. 4)

As Saenkhum (2016) implies here, possibly the worst risk of poorly conceived placement processes is that, if students perceive the placement process is rigid and unfair, this perception may affect their attitudes, motivation, and self-efficacy levels while taking courses in the writing program (see also Chiang & Schmida, 1999; Ortmeier-Hooper, 2008).

The placement of multilingual writers thus has high stakes and can be a contested space among administrators, instructors, and students themselves. However, placement also poses practical problems for institutions and writing programs due to scale—rapidly increasing student numbers—because doing placement well can be labor-intensive and very expensive (Silva, 1994). All too often, institutions choose the most expeditious and cost-effective placement route (most typically relying on standardized test scores, such as the SAT or the TOEFL) and only rarely question whether students have been accurately placed and whether the placements are reliable and have face validity to those most affected by them—the students. Consequently, it is important to question the forms of multilingual writing placement while being appropriately cognizant of the practical constraints involved in making changes. It is equally significant to ask whether institutional writing placement practices are ethical in light of social justice concerns (Poe, Inoue, & Elliot, 2018).

The analysis described in this paper was conducted in a large, four-level developmental writing program for first-year multilingual students at a U.S. university. For this project, a subgroup of incoming students was placed by a combination of formal assessment (a locally administered placement examination) and self-evaluation (their responses to a survey asking which of the course levels they believed would be most appropriate for them). The course outcomes for this group of students (n = 65) were studied alongside those of a comparison group (n = 65); further, we examined their attitudes toward their initial placement once they had completed their first course.[2] The goal of this study was to evaluate whether incorporating student agency into placement decisions could be accomplished without adding substantial labor or cost to the process and without harming students’ ability to be successful in these initial writing courses. The ultimate objective of this research effort was and is to arrive at a model of writing placement that is enlightened, collaborative, ethical, and fair, in addition to being institutionally feasible and programmatically effective.

 

Background

Self-Assessment in Alternative Writing Placement Models

In recent work on writing assessment in general and placement in particular, scholars have turned attention beyond statistical notions of reliability and practical questions of feasibility to examine the ethical and social justice issues raised by traditional assessment models (Poe et al., 2018; see also special issues of the Journal of Writing Assessment [2016, 2019] and of College English [2016]). Questioning the validity of standardized tests as sole writing placement mechanisms (Kokhan, 2013), institutions of higher education across the US have sought opportunities to improve placement processes in “substantive, creative and ethical ways” (Estrem, Shepard, & Sturman, 2018, p. 57). Alternative placement strategies have included the use of multiple measures, such as high school GPA and/or student surveys along with or in place of test scores (Hassel & Giordano, 2015), locally constructed exams evaluated by readers familiar with the specific courses into which students are placed, and even entrance portfolios, though the labor intensity of placement by portfolio is often prohibitive (Frus, 2003). Many programs are turning to placement models that include student self-assessment.

The Conference on College Composition and Communication (CCCC, 2014) Position Statement on Writing Assessment argues that “students should have the right to weigh in on their assessment.” Royer and Gilles (1998) argued in their seminal piece that student choice could serve effectively as the sole criterion of placement for all students. Many institutions have implemented processes that give all students complete agency to place themselves while others use self-assessment as a factor in evaluators’ placement decisions or offer self-placement for only a subset of the student population. Considering that “self-placement without direction may become merely a right to fail” (CCCC, 2014), it is clear that any process that incorporates student choice must be carefully designed to ensure students are provided adequate guidance in preparation for the decisions they are asked to make.

In Directed Self-Placement (DSP), as implemented at Grand Valley State University and described by Royer and Gilles (1998), students choose between two courses after having been introduced to the course options and completing a survey examining their own literacy backgrounds. While surveys are a staple in self-placement (Toth & Aull, 2014), many programs have included additional tools to help students gain a more profound understanding of their options in relationship to their preparedness. One approach requires a writing task as part of the placement process, so students may reflect on the experience, and/or evaluators may consider the writing sample in making their recommendations (Jones, 2008; Kenner, 2016; Lewiecki-Wilson, Sommers, & Tassoni, 2000; Toth, 2019). Classroom-based approaches can allow the placement process to occur during a summer term or during the first few weeks of the fall term (Nicolay, 2002). Bedore and Rossen-Knill (2004) advocate for Informed Self-Placement (ISP), which includes one-on-one meetings with advisors. As the authors admit, though, a labor-intensive process such as ISP may be difficult to scale up at a large institution.

One way to reduce the labor intensity of ISP is to offer it only to a subset of entering students. The ISP process at the University of Rochester, as described by Bedore and Rossen-Knill (2004), for example, targeted only students with low test scores. Study data revealed that, while students who participated in the personalized ISP process were well informed about their options, others who had received only the standard handbook course information had not fully understood their choices. As Toth (2019) explained, the use of test scores to determine eligibility for self-placement may be problematic in that the two modes of assessment (testing and self-assessment) represent conflicting ideologies. Some institutions in Toth’s study indicated their intention to expand self-placement after having first established some viability through a more limited application of the self-placement mechanism. Any decision to offer certain students a choice regarding their placement while denying that choice to others must be made with careful consideration for not only practicality but also fairness and philosophical coherence.

As alternative strategies involving student participation in placement processes have become more commonplace, multilingual writing specialists have begun to advocate for self-assessment as part of the placement process for multilingual students (Crusan, 2006, 2011; Ferris, Evans, & Kurzer, 2017; Ruecker, 2011). Crusan (2006) outlined an online placement process that incorporated self-assessment as one of multiple measures but did not ultimately allow the student to select their own course. While multilingual students must certainly have been included within previous study samples, to our knowledge, none of the available research on ISP and DSP has focused exclusively on multilingual students.

One challenge that we quickly noted from reviewing the previous research on DSP/ISP and other alternative placement models was that student participants in those studies were usually provided with binary choices: a developmental basic writing course versus a graduation-credit-bearing first-year composition course or a sheltered option for multilingual students versus a mainstream/mixed course. In their article, “A Class for Students Like Me,” Costino and Hyon (2007) reported that multilingual students were satisfied with different placement options (out of two possibilities) they had chosen if the conditions for success were optimal (e.g., supportive instructors and an inclusive curriculum). However, the program in the present study included four consecutive levels of instruction representing a broad range of language and writing competencies, not two relatively equivalent choices. With a wider range of placement possibilities, we were unsure as to the feasibility of DSP for our context. For example, when considering a student whose proficiency skills might have placed them in our lowest level course using traditional placement mechanisms, allowing them to choose to skip three levels to the highest course might pose considerable risks not only for that individual student but also for overall programmatic coherence.

The limited model of DSP piloted in this study (one level up-or-down course choice for multilingual students with borderline scores on an assessment of their performance on a writing placement examination) was chosen with institutional and demographic considerations in mind. Placement decisions had to be made several months in advance of the students’ arrival on campus at the beginning of a 10-week fall term. The placement process took place remotely with limited opportunity for personalized advising. A large percentage of students would be entering the United States educational system for the first time and could not be expected to have substantial familiarity with the expectations for university writing in the US. With these factors in mind, we designed a modest and limited DSP process that would present minimal risk to participating students while investigating whether it could potentially be scaled up to a wider population of students in the program.

 

Previous Placement Studies in this Context

In an earlier research project focused on the same multilingual writing program, Ferris et al. (2017) compared the placement exam scores of over 1,100 students with students’ self-assessments of which course level they felt would best meet their needs for language and writing support. They found a reasonably good fit between the two sets of scores, with 79% of the participants’ self-evaluations either matching their placement scores or just one level off (higher or lower). Ferris et al. (2017) concluded that the findings “did not completely convince us that self-assessment alone would work for effective placement of students in our four-level L2 writing program but they also did not demonstrate that incorporation of such student input would be a complete disaster, either” (p. 8). They suggested that a possible next step would be a pilot to examine the effects of allowing a group of students limited choices to move up or down a level. The present study represents that next step.

In a broader study of overall student satisfaction with the multilingual writing program, Ferris (2018) found that a sizable minority of the 355 student respondents, surveyed 1 to 3 years after completing the program, remembered feeling dissatisfied about their initial course placement. Twenty-eight percent said they had felt frustrated about being placed (in their opinion) too low, and another 21% said that, while they also had felt they were placed too low, they were glad, in hindsight, that they had taken the course into which they were placed. This is a substantial number of students who had felt frustrated about their placement, whether at the time or even years later. It seems fair to speculate that this frustration could have led to overall dissatisfaction or negative attitudes toward the course and the program. These two studies, taken together, created the exigence for the present study, which takes the previous work a step further by not simply asking students’ views about their placement before or after the fact but by actually allowing those views to influence their placement outcomes.

In sum, our study adds to the previous research on DSP by focusing on multilingual writers across several placement levels and by asking students, as they finished the term, for their attitudes about and reactions to the placement process they had experienced. This study was guided by the following research questions:

  1. What was the effect of a self-placement option on students’ course outcomes?
  2. What was the effect of a self-placement option on student attitudes and satisfaction after completing the course?

 

Method

Context

This study was conducted in a developmental writing program for first-year multilingual university students in the United States. The university is a large public research institution that is highly ranked and has extremely competitive admissions. Like many state universities in the US over the last decade, this institution aggressively increased its undergraduate international student enrollment; as of Fall 2017, when the study was conducted, there were over 4,100 international undergraduates (14% of the total). In addition to the rapidly growing international student population, the university is nearing federal Hispanic-serving institution status (meaning that at least 25% of undergraduates are from Hispanic backgrounds). Thus, the new students coming to this university are extremely diverse linguistically and culturally (see https://www.ucdavis.edu/sites/default/files/upload/files/uc-davis-student-profile.pdf for a detailed breakdown as of Fall 2019). We did not collect or analyze demographic data for this study population in particular, but from previous research conducted in the same program (Ferris, 2018; Ferris et al., 2017), we know most of the students were international and not born in the US, the vast majority of the international students were from China, and the majority of the multilingual students in the first-year developmental writing program who were U.S. residents were Spanish speakers.

The primary language and writing support for first-year multilingual students is housed in the university’s writing program, a stand-alone unit in the College of Letters and Science. The number of first-year multilingual students (including both international visa students and resident multilingual students) served by this program grew from just over 400 in 2012-2013 to over 1,300 in 2017-2018. The program, designed for students who have not yet fulfilled the university’s entry-level writing requirement upon admission and matriculation, comprised[3] four developmental levels into which multilingual students could be placed, with each level taking one academic quarter (10 weeks) to complete. The first three levels were designed exclusively for multilingual students, and the fourth level was a basic/entry-level writing course that offered sections for native English speakers needing writing support and smaller sheltered sections reserved for students who had exited the third level of the multilingual program.

Since 2014, this first-year program, now known as English for Multilingual Students (EMS), has used a locally designed and administered examination to place incoming multilingual students into the most appropriate of the four levels. The exam consists of both reading and writing activities, and, since 2016, a background and self-evaluation questionnaire (see Ferris et al., 2017, for details about the development of the exam and the questionnaire). Over the 6 years this exam has been administered, the placements have typically skewed toward the lower two levels of the sequence (EMS 2 placements usually around 45-50%, EMS 1 between 20-30%, and EMS 3-4 usually around 25-30% combined). The exams are read by teams of EMS instructors, and each exam receives at least two scores, with a supervisor providing a third reading if necessary to resolve borderline cases.

While students are given a specific course placement as a result of their exam scores, there is a range within levels, and this factor influenced the design of the present study. Table 1 shows how the ranges worked across the program.

 

Table 1

This table shows that, while there were only four possible placements, there could be some variation in observed language and writing proficiency within levels (and below and above the ranges, as well).

Student Participants

Participants in this study were 130 new first-year multilingual students who had taken the placement exam in May 2017. Over 1,150 multilingual students took courses in the program in Fall 2017, so this study group represented about 9% of the student population at the time (see Tables 4-6 below for additional details). As previously noted, we kept this phase of the research relatively small, so we could assess the effects of allowing student choice in the placement process on a small scale before ramping it up program-wide (Toth, 2019). All 130 students had received a borderline +/- score (see Table 1) for the particular course level into which they had been placed. However, half of the students (n = 65) had indicated on their self-evaluation survey that they believed they belonged in the next level up or down (i.e., the level closest to their borderline score). One hypothetical example of this is a student who received a score of 77 (EMS 2+) but had indicated on the survey they felt EMS 3 would be the best placement. Because their exam score was on the high end of the range for their placement level, and they believed the next level up was appropriate, they were offered the opportunity to move up to the next level. Similarly, if a student’s score was on the low end for that level (e.g., 70 or EMS 2-), and they had selected the lower level as the best placement, they were offered the choice to move down a level (to EMS 1).

The 65 students who had borderline exam scores and whose self-placement choice was one level higher or lower than their placement scores indicated were sent emails explaining the choice (see Appendix A) and given five days to respond. This was our pilot self-placement group. If they chose to move up or down, their final placement score was entered as that higher or lower level. A second group of exam takers (n = 65) also had borderline scores, but their self-evaluation score matched their exam placement (e.g., a student who said “EMS 2” on the survey but whose score was 70 or 77). These students were not offered the opportunity to change levels and were investigated as a comparison group to our pilot students.
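To make the grouping rule just described concrete, the following sketch (ours, for illustration only; it is not the program’s actual placement code) expresses the logic in Python. The cut-off values are assumptions taken from the EMS 2 example above (70 as the borderline-low “2-” score and 77 as the borderline-high “2+” score).

```python
# Illustrative sketch only: the grouping rule described above, not the
# program's actual code. Cut-offs are assumptions based on the EMS 2
# example in the text (70 = borderline-low "2-", 77 = borderline-high "2+").

def classify(exam_score, exam_level, self_assessed_level,
             borderline_low, borderline_high):
    """Return (group, level): the study group and the level offered or kept."""
    if exam_score == borderline_high and self_assessed_level == exam_level + 1:
        return "pilot", exam_level + 1    # offered the choice to move up one level
    if exam_score == borderline_low and self_assessed_level == exam_level - 1:
        return "pilot", exam_level - 1    # offered the choice to move down one level
    if exam_score in (borderline_low, borderline_high) and self_assessed_level == exam_level:
        return "comparison", exam_level   # borderline score, but self-assessment matched
    return "not in study", exam_level     # all other exam takers

# The hypothetical student from the text: a 77 (EMS 2+) who self-assessed at EMS 3
print(classify(77, exam_level=2, self_assessed_level=3,
               borderline_low=70, borderline_high=77))  # ('pilot', 3)
```

In practice, of course, the offer was made and accepted by email (Appendix A) rather than computed and applied automatically.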

As indicated in Table 2, of the 65 students in the pilot group, 39 of them chose to move up a level when given a choice, six chose to move down a level, and 20 chose to remain at the level into which they had tested. The most common movement was from EMS 1 to EMS 2 (24 students), followed by 15 students moving up from EMS 2 to EMS 3.[4] Five students moved down from their original placement in EMS 3 to EMS 2, and one moved down from EMS 2 to EMS 1.           

Table 2

Data Collection

To address our research questions, we collected four pieces of data. First, after students’ placements were finalized and they had enrolled in the appropriate fall quarter writing courses, their teachers were asked, at the end of the first week of instruction, to evaluate whether our study group of students appeared to be appropriately placed. This was an early indicator as to whether the limited self-placement option might be detrimental to students in the pilot group. The instructors were sent a list of students in their classes who were part of the study and asked to rate them, after a week of in-class diagnostic activities, on a scale of 1-3: 1 = low for this class; 2 = average for this class; 3 = high for this class. Though teachers knew we were collecting the ratings for research purposes, we did not disclose why these particular students were of interest or what exactly we were studying.

The second and third pieces of data were students’ course outcomes at the conclusion of the fall term. Specifically, we obtained students’ final e-portfolio grades and their final course grades. Portfolios included final versions of two multiple-draft assignments, an in-class final essay exam, and a portfolio letter; the portfolio scores comprised 50% of the final course grade. These letter grades were converted to numerical scores on a 4.0 grading scale (A = 4.0, A- = 3.7, etc.) for statistical analysis. We also obtained portfolio and course grades for all students completing EMS 1-3 that term (N = 1,158), so we could see whether the smaller study group (n = 130) was representative of the general population in the courses.
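As a minimal sketch of the grade conversion just described (assuming a standard 4.0 mapping; the text specifies only the general scheme, so the entries for A+ and the C/D range are our assumptions):

```python
# A minimal sketch of the letter-grade conversion described above, using the
# standard 4.0 scale (A = 4.0, A- = 3.7, etc.). The entries for A+ and the
# C/D range are assumptions; the text specifies only the general scheme.
GRADE_POINTS = {
    "A+": 4.0, "A": 4.0, "A-": 3.7,
    "B+": 3.3, "B": 3.0, "B-": 2.7,
    "C+": 2.3, "C": 2.0, "C-": 1.7,
    "D+": 1.3, "D": 1.0, "D-": 0.7,
    "F": 0.0,
}

def to_grade_points(letter_grades):
    """Convert a list of letter grades to numerical values for analysis."""
    return [GRADE_POINTS[g] for g in letter_grades]

print(to_grade_points(["A-", "B+", "B"]))  # [3.7, 3.3, 3.0]
```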

The final data were responses to an electronic survey, the link to which was emailed to all students in both the pilot and comparison groups after the term was over. This survey is shown in Appendix B. Its purpose was to assess students’ feelings after the fact about their course placements in the fall and, more generally, their opinions about who should be in control of student placement. The pilot group survey had an additional question about how they felt about the choices they had made to move up or down a level or to stay in the level in which the exam had placed them. Forty-seven of the 130 students completed the survey, which was voluntary and anonymous, for a response rate of 36%. Of these, 25 were from the comparison group, and 22 were from the pilot group.

 

Data Analysis

The two groups were compared across the three performance measures (teacher ratings, portfolio grades, final course grades) via t tests, chosen as a simple way to compare group means. For the student surveys, we examined frequencies and percentages of the responses for each question; sample sizes of survey responses were not large enough for inferential statistical comparisons.
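For illustration, a comparison of this kind could be run as follows; this is a minimal sketch using SciPy’s independent-samples t test, and the grade values shown are placeholders rather than the study’s data.

```python
# A minimal sketch of the group comparison described above: an
# independent-samples t test on 4.0-scale course grades.
# The two lists below are placeholder values, not the study's data.
from scipy import stats

pilot_grades = [3.0, 2.7, 3.3, 3.0, 3.7, 2.7, 3.0]        # hypothetical pilot group
comparison_grades = [3.3, 3.0, 3.7, 3.0, 3.7, 3.3, 3.0]   # hypothetical comparison group

t_stat, p_value = stats.ttest_ind(pilot_grades, comparison_grades)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```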

 

Results

As already discussed (Table 2), when the pilot group was given the (limited) opportunity to choose their placements, a few moved down a level, some chose to stay in the same level in which the exam had placed them, and quite a few moved up to the next level. Our two research questions focused on whether these choices appeared to affect students’ performance in the classes they took and to influence their feelings or attitudes about their placements.

 

Research Question 1: Did Self-Placement Options Affect Student Outcomes?

Instructor ratings. When instructors were asked to give an initial impression of the students’ level based on their first-day in-class writing and other Week 1 diagnostic activities, the pilot group was indistinguishable from the comparison group. The average rating on the 1-3 scale described above across all course levels was 1.961 for the pilot group and 1.900 for the comparison group. Table 3 breaks the results down into course-specific detail. In short, instructors did not detect any major difference (at least in the first week of class) between students who had been moved up/down upon their request and those who had simply been placed according to their exam scores. This early-term checkpoint reassured us that there was no glaring mismatch between the pilot group’s placements and their teachers’ observations about their proficiency and that their chances for success in the courses were equivalent to those of their peers who had been placed in traditional ways (i.e., via the local placement exam).

Table 3

Students’ course outcomes. Final portfolio scores, worth 50% of the course grade, were examined to compare the two study groups (pilot and comparison) and to look at both compared to the general population of students who took EMS 1-3 courses during the same term. Table 4 summarizes these results.

Table 4

The results in Table 4 suggest that students in the pilot group received somewhat lower portfolio scores than did students in either the comparison group or the general population. Specifically, the pilot group students received fewer A’s and more B’s on their portfolios than did the other two groups. However, both study groups received fewer failing scores (D/F grades) than did the general population.

It is not particularly surprising that the comparison group received overall higher grades than did the pilot group or the students in general, as many had been on the high end of their placement level but had not been offered the opportunity to move up a course level, meaning they potentially were among the strongest students in their classes from the beginning. Conversely, it is possible that by electing to move up a level to challenge themselves, some members of the pilot group may have traded A’s for B’s on their portfolios. Nonetheless, when it came to final course grades, the differences across groups narrowed considerably. These results are shown in Table 5.

Table 5

The final course grades took into consideration a broader range of student work beyond the portfolios (which counted for 50%), such as in-class work, homework, and an in-class midterm. Because some of those grade points involved simply submitting completed work on time, it makes sense that the overall course grades skewed higher than the portfolios alone. This seemed especially important for the pilot group, which overall earned many more A’s (32%) for their course grades than they did for their portfolios (11%). Indeed, when A’s and B’s are combined, 80% of the pilot group students earned either an A or a B in the course, compared with just 60% receiving A’s or B’s on the portfolio. Theory and research on DSP predict that, when students are allowed to have a voice in their own placement, they will work harder to prove to themselves and others that they made the right choice (Royer & Gilles, 1998; Sinha, 2014). In this study, the course grades, especially when compared to portfolio grades (which are less sensitive to overall effort, such as coming to class or submitting homework), at least speculatively support this assumption from the literature.

We also compared the two sets of grades (portfolio and course) between the two study groups via t tests. Table 6 shows the results.

Table 6

The grade differences on both t tests were statistically significant, especially for the course grades measure. However, when the means of the numerical grades were converted back to letter grades, the practical difference was minor—the pilot group’s mean course grade was a B-, while the comparison group’s average grade was a B. In short, though the pilot group students received lower course grades than did the comparison group, these are relatively small differences. There was no difference in failure rates between the two study groups, and as shown in Table 5, both groups received a slightly lower proportion of D/F (failing) grades (5%) than did the general population (6%). Overall, the course outcomes show no real detrimental effect on the pilot group students of being allowed to have input into their course placement.

 

Research Question 2: How Did Placement Options Influence Student Attitudes?

As discussed previously, a brief electronic survey (Appendix B) was sent via email to all study group participants once the term was over, and we had 47 responses.

Table 7 shows students’ responses to Questions 3 and 5 on the survey (both groups) and the pilot group’s response to Question 6 (which the comparison group did not have on their version of the survey).

Table 7

Though the survey response numbers were too small for statistical comparisons, the student respondents from the pilot group were more likely to say (on Question 3) their class level “was perfect for me” (77%) compared with students in the comparison group (58%). It is worth remembering that the pilot group included students who had chosen not to move up or down but rather to stay at the course level indicated by the placement exam score. In contrast, only one student from the pilot group felt their class was “too easy for me,” while seven of 25 respondents (28%) in the comparison group chose this response. Overall, students who had some choice as to their final placement were more likely to say, after completing the course, that they had been in the right class level for their needs than those who had not had this choice presented to them. These survey responses stand in strong contrast to the reactions of previous cohorts from the same program, none of whom had been offered input into their initial course placement and many of whom had felt frustrated, during and after the fact, about their placement outcomes (Ferris, 2018).

Question 5 asked students’ opinions about who should be in control of the placement process. More students from the pilot group (who had some input into their placement) said that students alone should determine their placements, while the reverse was true for the comparison group that had been offered no choices—they were more likely to say that the program/teachers should have complete control. However, the numbers for both response options were small, and a clear majority of both groups (64% for each) felt placement should be determined collaboratively, with programs and students making the decision together—which was in fact the scenario experienced by the pilot group in this study.

Finally, Question 6, which was only in the survey version sent to the pilot group, showed respondents who had experienced some limited agency in the placement process were happy about their choices. Eleven respondents (50% of that group) said they’d chosen to move up a level and were glad they had, and eight students (35%) said they were glad they had stayed in the original level indicated by the placement exam score. Only one student out of the 22 respondents expressed regret about their choice, saying they had stayed in the level suggested by the exam but now wished they had moved when offered the opportunity to do so. For nearly all respondents, it appeared having been offered the choice—regardless of what that choice was—contributed to their overall satisfaction with the placement process once they had experienced its effects by completing the course. Again, this finding contrasts sharply with the finding of an earlier study (Ferris, 2018) that a substantial minority of students had been frustrated with their course placements, over which they had no input or control.

 

Discussion and Conclusion

Our study yielded three important findings. First, students in the pilot group did not appear to abuse the self-placement option. They took their choices seriously. One of the most persistent objections to applying DSP approaches to multilingual writers’ placements is the suspicion that the students would always choose a higher level if offered an opportunity, even if doing so might not really be in their best interests (Crusan, 2002, 2006; Reynolds, 2003). However, in this study we saw that, while many students chose to move up a level, nearly one third chose to stay in the level in which they had been placed, and others even made the choice to move down one level (Table 2). In short, our findings for this study group suggest that students can be trusted to make good decisions when given guidance about their options. This finding is consistent with other DSP/ISP research conducted in mainstream composition settings (Inoue, 2009; Sinha, 2014), suggesting that multilingual students in developmental programs, like their mainstream peers, are capable of exercising agency in responsible ways that best serve their own needs.

Second, when students were given choices, it did not harm their course outcomes in any substantial way. Administrators and teachers may fear that, if multilingual students are empowered through DSP, they will misplace themselves to the detriment of their academic progress (Crusan, 2002, 2006). We did not find this to be the case. Teachers polled in the first week of the term discerned no major differences in ability across the students in our study group (Table 3). The 65 students in the pilot group passed their courses at a rate slightly higher than that of the general population taking the courses during that term (Table 5). Although the portfolio grades and final course grades were higher for the comparison group than for the pilot group (Tables 4-6), the differences were between A and B grades on the portfolios (Table 4) or between B and B- for the course grades (Table 5)—not between passing and failing the level. We suspect most students, given the option, would happily trade a slightly lower but still acceptable course grade for saving themselves a full term of developmental coursework.

Third, giving students a voice in their own placement seemed to contribute to their overall satisfaction with the process. As Table 7 demonstrated, survey respondents from the pilot group were overall quite happy with their placements after the fact—whether or not they chose to change levels. Further, the majority of students in both groups expressed the opinion that placement should be a decision jointly made between a writing program and the students themselves.

 

Implications and Next Steps

Because we did not know, in designing the study, what the effects of this pilot effort would be, we deliberately kept it rather small (130 students total and just 65 students in the pilot group). One next step for moving the inquiry forward would be to broaden the groups by allowing all students receiving borderline placement scores to choose whether to remain in the course level indicated by the placement exam or to move up to the next level or down to the previous one. This would not have to be an especially labor-intensive process. In this study, students in the pilot group were simply sent an email explaining their choices (Appendix A) and given five days to respond. Final placement scores were not entered into the database until they had made a choice (which included not responding at all).

A bigger and potentially riskier subsequent step would be to allow students who received average exam scores for a level, rather than borderline-high or borderline-low scores (e.g., EMS 2 rather than 2+; see Table 1), the opportunity to change levels, up or down. This choice could again be framed, as it was for the current study, by the student’s responses to the self-evaluation questionnaire. For example, if a student’s self-assessment was lower than the placement exam score, they might consider moving down a level if they are not confident they can handle the level into which the exam placed them. On the other hand, some multilingual students have lower levels of confidence than their abilities warrant (Eckstein & Ferris, 2018; Ferris & Eckstein, 2020), and the chance to consider the placement exam score as feedback on their proficiency might provide the encouragement they need to try a level that, without such input, they might have considered too difficult.

 

Other Follow-Up Research

Beyond continuing to assess whether larger, broader student populations within our program could benefit from the focused DSP approach taken in this study, several other interesting questions are raised by this study and the ones that preceded it. For example, before the data were collected for the previous study (Ferris et al., 2017), the student self-evaluation questionnaire was administered separately from the exam, and teachers scoring the exam did not have access to the information provided by the student test-takers. However, beginning in 2016, we added the questionnaire to the exam and encouraged scorers to consider that information along with the exam results (short-answer reading comprehension questions and a short essay) in determining placement scores. Though anecdotally we have heard that the raters have found this more holistic approach to placement a helpful step forward (i.e., that they do read the student questionnaire responses and consider them in scoring), we have not studied the effects of this change empirically.

It also would be useful to more closely examine our placement process using a longitudinal case-study approach, in addition to the larger-scale quantitative assessments we have already undertaken. Students could be interviewed at intervals and followed across the levels to see how their initial placement and any choices they made regarding that placement have influenced their performance, outcomes, attitude, engagement, and satisfaction (see Saenkhum, 2016, for an example of this type of longitudinal multiple-case study). They could also be queried after they have exited the program (as in Ferris, 2018) as to their perspectives on their placement once they have experienced its effects over time. Though our studies of larger groups of students (Ferris, 2018; Ferris et al., 2017; the present study) have been helpful in moving along our thought process about our placement system, we suspect there are many individual variables at play in how students feel about their experience in the multilingual writing program, starting with their initial placement (for further examinations of this point, see also Evans & Ferris, 2019).

It is important to observe that none of the steps we have taken thus far to examine the feasibility of incorporating DSP into our placement process have been either expensive or especially time-consuming. The student self-evaluation questionnaire has been added to the exam platform and is completed electronically, and the process of allowing students a choice before finalizing their placements could also be easily automated. Because one of the biggest impediments to programs adopting a more collaborative placement process is usually resources—time and money—the findings from this study suggest students could be given some agency with minimal disruption and effort, without negative effects on their academic progress, and with discernibly positive benefits for their attitudes and satisfaction.

Most writing instructors and writing program administrators would likely agree, hypothetically, that it is both ethical and wise to incorporate student voices into decisions that will affect them, such as their course placement. It is the “how” that often stops administrators from pursuing such approaches. While our study does not answer all of the possible practical questions, it at least provides promising evidence that DSP, as part of the placement process for multilingual writers, can be successful and can even yield positive outcomes in both student achievement and satisfaction. This, we believe, is a valuable step forward for a writing assessment subfield—the placement of multilingual student writers—that has been slow to adopt or even seriously consider such approaches.

 

Acknowledgments

We are grateful to Ms. Jamie Ferrando and Ms. Helen Sutton, both formerly of the University Writing Program at UC Davis, for their assistance in data collection for this project.

Author Bios

Dana Ferris is Professor and Director of the University Writing Program at the University of California, Davis. She is also currently co-editor of the Journal of Second Language Writing.

Amy Lombardi is a Ph.D. candidate in Linguistics at the University of California, Davis. Her research focuses on reading-writing connections for multilingual writers and particularly on how they understand and deploy information from sources in their writing.

 

References

Bedore, P., & Rossen-Knill, D. F. (2004). Informed self-placement: Is a choice offered a choice received? WPA: Writing Program Administration, 28(1-2), 55-78.

Braine, G. (1996). ESL students in first-year writing courses: ESL versus mainstream classes. Journal of Second Language Writing, 5(2), 91-107.

Chiang, Y-S., & Schmida, M. (1999). Language identity and language ownership: Linguistic conflicts of first-year university writing students. In L. Harklau, K. Losey, & M. Siegal (Eds.), Generation 1.5 meets college composition (pp. 81-96). Mahwah, NJ: Lawrence Erlbaum.

Conference on College Composition and Communication. (2014). Writing assessment: A position statement. Retrieved from https://ncte.org/statement/writingassessment/

Costino, K. A., & Hyon, S. (2007). “A class for students like me”: Reconsidering relationships among identity labels, residency status, and students’ preferences for mainstream or multilingual composition. Journal of Second Language Writing, 16(2), 63-81.

Crusan, D. (2002). An assessment of ESL writing placement assessment. Assessing Writing, 8(1), 17-30.

Crusan, D. (2006). The politics of implementing online directed self-placement for second language writers. In P. K. Matsuda, C. Ortmeier-Hooper, & X. You (Eds.), The politics of second language writing: In search of the promised land (pp. 205-221). West Lafayette, IN: Parlor Press.

Crusan, D. (2011). The promise of directed self-placement for second language writers. TESOL Quarterly, 45(4), 774–774.

Eckstein, G., & Ferris, D. (2018). Comparing L1 and L2 texts and writers in first-year composition. TESOL Quarterly, 52(1), 137-162.

Estrem, H., Shepard, D., & Sturman, S. (2018). Reclaiming writing placement. WPA: Writing Program Administration, 42(1), 56–71.

Evans, K., & Ferris, D. (2019). Revision from multiple feedback sources: The attitudes and behaviors of three multilingual student writers. Research in the Teaching of English, 54(2), 131-160.

Ferris, D. R. (2018). Using student satisfaction surveys for program improvement. CATESOL Journal, 30(2), 19-42.

Ferris, D., & Eckstein, G. (2020—forthcoming). Language matters: Examining the language-related needs and wants of writers in a first-year university writing course. Journal of Writing Research.

Ferris, D., Evans, K., & Kurzer, K. (2017).  Placement of multilingual writers: Is there a role for student voices? Assessing Writing, 32(1), 1-11.

Frus, P. (2003). Directed self-placement at a large research university: A writing center perspective. In D. Royer & R. Gilles (Eds.), Directed self-placement: Principles and practices (pp. 179-192). Cresskill, NJ: Hampton Press.

Hassel, H., & Giordano, J. (2015). The blurry borders of college writing: Remediation and the assessment of student readiness. College English, 78(1), 56.

Inoue, A. B. (2009). Self-Assessment as programmatic center: The first-year writing program and its assessment at California State University, Fresno. Composition Forum, 20(3). Retrieved from http://compositionforum.com/issue/20/calstate-fresno.php

Institute of International Education. (2018). International student enrollment trends. Open Doors Report on International Educational Exchange. Retrieved August 27, 2019, from http://www.iie.org/opendoors/

Jones, E. (2008). Self-placement at a distance: Challenge and opportunities. WPA: Writing Program Administration, 32(1), 57–75.

Kenner, K. (2016). Student rationale for self-placement into first-year composition: Decision making and directed self-placement. Teaching English in the Two-Year College, 43(3), 274–289.

Kokhan, K. (2013). An argument against using standardized test scores for placement of international undergraduate students in English as a second language (ESL) courses. Language Testing, 30(4), 467–489.

Lewiecki-Wilson, C., Sommers, J., & Tassoni, J. P. (2000). Rhetoric and the writer’s profile: Problematizing directed self-placement. Assessing Writing, 7(2), 165-183.

Nicolay, T. F. (2002). Placement and instruction in context: Situating writing within a first-year program. WPA: Writing Program Administration, 25(3), 41–59.

Ortmeier-Hooper, C. (2008). “English may be my second language—but I’m not ‘ESL.’” College Composition and Communication, 59(3), 389-419.

Poe, M., Inoue, A., & Elliot, N. (Eds.). (2018). Writing assessment, social justice, and the advancement of opportunity. Fort Collins, CO: The WAC Clearinghouse.

Reynolds, E. (2003). The role of self-efficacy in writing and directed self-placement. In D. Royer & R. Gilles (Eds.), Directed self-placement: Principles and practices (pp. 73-103). Cresskill, NJ: Hampton Press.

Royer, D. J., & Gilles, R. (1998). Directed self-placement: An attitude of orientation. College Composition and Communication, 50(1), 54-70.

Ruecker, T. (2011). Improving the placement of L2 writers: The students’ perspective. WPA: Writing Program Administration, 35(1), 91-117.

Saenkhum, T. (2016). Decisions, agency, and advising: Key issues in the placement of multilingual writers into first-year composition courses. Boulder, CO: Utah State University Press.

Silva, T. (1994). An examination of writing program administrators' options for the placement of ESL students in first year writing classes. WPA: Writing Program Administration, 18, 37-43.

Sinha, A. (2014). Exploring directed self-placement as a placement alternative for first-year college students in writing classes (Unpublished doctoral dissertation). University of California, Davis.

Toth, C. (2019). Directed self-placement at two-year colleges: A kairotic moment. Journal of Writing Assessment, 12(1).

Toth, C., & Aull, L. (2014). Directed self-placement questionnaire design: Practices, problems, possibilities. Assessing Writing, 20, 1–18.

 

Appendix A

Sample Emails Sent to Pilot Group Students

 

Move UP email

Dear Student,

Your English Language Placement Exam that you took on May 13 has been scored. You received a score of 77, which places you into Level 2. 

However, on your questionnaire, you said that you think a higher-level class would be the best fit for you. Since your score of 77 is on the high end of the Level 2 range, we will give you the choice of going up one level in your placement, to Level 3.  It is your choice.  Your ELPE suggests that Level 3 will be difficult for you, but if you feel that it's the right choice and will work hard in the class, you may choose to give it a try.

Please think it over and email back by Sunday, June 4, at midnight, with one of the following responses:

Yes, I'd like to move my placement up to Level 3.

No, I want to keep my placement in Level 2.

 

Move DOWN email

Dear Student,


Your English Language Placement Exam (ELPE) that you took on May 13 has been scored. You received a score of 70, which places you into Level 2. 

However, on your questionnaire, you said that you think a lower level class would be the best fit for you. Since your score of 70 is on the low end of the Level 2 range, we will give you the choice of going down one level in your placement, to Level 1.  It is your choice.  Your ELPE suggests that Level 2 will be difficult for you, but if you feel that it's the right choice and will work hard in the class, you may choose to give it a try.

Please think it over and email back by Sunday, June 4, at midnight, with one of the following responses:

Yes, I'd like to move my placement down to Level 1.

No, I want to keep my placement in Level 2.

 

Appendix B

 

End-of-Term Student Survey

  1. What is your name?
  2. What is your email address?
  3. How did you feel about your placement (class level) in the ESL program this quarter? Choose the statement that BEST matches your opinion.
  • My class level was perfect for me.
  • My class level was somewhat easy for me, but I’m still glad I took the class.
  • My class level was somewhat hard for me, but I did OK anyway.
  • My class was too hard for me, and I wish I’d taken a lower level.
  • My class was too easy for me, and I wish I’d taken a higher level.
  • Not sure/no opinion
  4. How did you like your ESL class this quarter? Choose the statement that BEST describes your opinion.
  • I enjoyed it and found it valuable.
  • It was so-so. I didn’t love it or hate it.
  • I didn’t enjoy it that much, but I still learned some valuable things.
  • I didn’t like it at all.
  • Not sure/no opinion
  5. What is your opinion about placement processes such as the English Language Placement Exam? Choose the statement that BEST matches your opinion.
    • Students should have total control over their own placement.
    • Programs/teachers should have total control over student placement.
    • Students and programs/teachers should decide together about student placement.
    • Not sure/no opinion

[This question was only on the version sent to the pilot group.]

  6. Last summer, we gave you the opportunity to change your placement after you took the ELPE. How did/do you feel about this? Choose the statement that best describes your feeling now.
    • I moved up a level, and I’m glad that I did.
    • I moved down a level, and I’m glad that I did.
    • I stayed in the same level as my ELPE score, and I’m glad that I did.
    • I moved up a level, and I’m sorry that I did.
    • I moved down a level, and I’m sorry that I did.
    • I stayed in the same level as my ELPE score, but I wish I’d chosen to change levels.

 

[1] We use the term multilingual in this paper to refer to students who are writing in a language that was not their primary and/or only home language. This includes both those pursuing study abroad (i.e., international/visa students) and resident multilingual students (immigrants or children of immigrants whose primary home language was not the language in which they are writing/studying now). For consistency, we use this label rather than ESL or second language or L2, though all of these terms are used somewhat interchangeably in the literature.

[2] This project was classified by our Institutional Review Board (IRB) as program evaluation rather than research, so it was exempt from human subjects review. Students who requested a placement change did so voluntarily in response to an email (Appendix A), and survey responses were voluntary and anonymous. The rest of the data were collected from program files/instructors and did not require individual consent.

[3] The structure of this developmental writing program has changed since 2017, which is why the description is written in the past tense.

[4] Due to institutional constraints, students could not be given the option to move up to or beyond EMS 4.