Volume 7, Issue 1: 2014

Language Background and the College Writing Course

by Jonathan Hall

In an era of growing linguistic diversity, assessment of all college writing courses needs to include a focus on multilingual equity: How well does the course serve the needs of students with varying language backgrounds and educational histories? In this study, an Education and Language Background (ELB) survey was developed on a scale measuring divergence from default assumptions of college students as U.S.-educated monolingual English speakers. These survey data were used in the assessment of a junior-level college writing course by correlating student ELB data with writing sample scores. On the pre-test, multilingual students and immigrants educated in non-U.S. systems scored significantly lower, but by the post-test this effect had disappeared, suggesting that junior-level writing instruction may be of especial utility to such students. Survey data also revealed important language and education differences between students who began their careers at College Y in first-year composition and those who transferred in later. Students’ language background information should be routinely collected at the beginning of each course using an instrument, such as the ELB, that systematically quantifies student language identity using multiple questions, thus permitting both a nuanced portrait of how multilinguality interacts with student writing proficiency and the development of differentiated instruction strategies.


Assessment of multi-section college writing courses has not routinely focused on questions about the language background and educational history of enrolled students; in fact, the information is often not even collected, let alone connected to actual student writing samples. But there is no course taught today on a U.S. college campus--at any level from first-year through graduate, in any discipline, in any format--in which the language and literacy background of the students enrolled is irrelevant. This is doubly true for writing courses, where sophisticated language use within particular academic genres and registers is central to the objectives.

Student language background and language learning history have always been influential factors in determining student success in college writing. The full impact of all the languages a student knows and uses is only now becoming more visible because of the demographic shifts, driven by globalization and immigration, in the student population of higher education over the past few decades (Staklis & Horn, 2012). Writing course assessment--of all writing courses, not just those sections that are designated as “ESL” or “ELL” or “for multilingual learners”--must be cognizant of these shifts. One of the questions that must always be asked, now and in the future, is: Does the course design and execution provide opportunities and access in an even-handed manner to both multilingual and English monolingual students? And how do we know if multilingual and monolingual students are, in fact, achieving the goals of the course in an equitable distribution?

My first section explores this concept of multilingual equity as it applies to college writing courses, especially the process of assessing it: Once we have defined multilingual equity, how will we know whether we have achieved it in a given course? I argue that current measures of student linguistic identity are inadequate to both assessment and pedagogical needs in college writing courses, and that a more comprehensive instrument is a necessary prerequisite to any assessment plan that aims to include a measure of multilingual equity.

In my second section, I describe the construction of such an instrument, an Education and Language Background survey (ELB) that was used in an assessment project at College Y. I show how results from the survey can be used to compute measures of individual divergence from default monolingualist assumptions of U.S. academia, and to offer a composite linguistic diversity index that could be compared with results from other campuses.

My third section focuses on the assessment of a required junior-level research writing course at College Y. It examines multilingual equity in the course through statistical connections between pre-test/post-test writing samples and responses on the ELB survey.

My fourth section illustrates the utility of the ELB as a pre-assessment instrument by comparing results from the junior-level assessment project with data from an earlier pilot study of the survey involving first-year students at the same campus.

In conclusion, I reflect upon both some of the possibilities for future research and some of the challenges presented when making use of such data. Gathering detailed, timely information about our students can be used to improve pedagogy, revise curricula, revamp administrative procedures in writing programs, and address institutional metrics concerning the campus environment for students with various language identities.

I. Contexts of Language Background and Assessment: Institutional, Writing Program, and Writing Course Levels

Writing course assessment, the focus of the present project, must always be situated in the context of other levels of assessment; it is best conceived as a subset of multilingual issues at the institutional and writing program levels.

Institutional Attitudes: Beyond the Myth of Transience for Multilingual Students

The myth of transience--the belief that language development should be complete before college work commences, and that first-year composition should take care of any remaining writing issues--is alive and well in academia despite much recent work dedicated to examining and debunking its underlying premises about the nature of literacy development. Mike Rose first proposed the term in 1985 to name the tendency to conceive of issues of student writing in terms of a temporary crisis: “the belief persists in the American university that if we can just do x or y, the problem will be solved--in five years, ten years, or a generation--and higher education will be able to return to its real work” (p. 355). David Russell later picked up “the myth of transience” and applied it in the context of writing across the curriculum, arguing that it “masked deep conflicts in the mass-education system over the nature of writing and learning” (2002, p. 9).

What began as a critique of academic conceptions of student writing later evolved into a broader critique of academic conceptions of student language acquisition, in work by Vivian Zamel (1995), Ruth Spack (1997), Bruce Horner and John Trimbur (2002), A. Suresh Canagarajah (2006), Paul Kei Matsuda (2006), and Gail Shuck (2006), among others. Zamel in particular attacked “the notion that these students’ problems are temporary and can be remediated--so long as some isolated set of courses or program of instruction, but not the real courses in the academy, takes on the responsibility of doing so” (1995, p. 510). Issues of language and writing, then, which may at first appear to be technical issues for specialists, turn out to be centrally implicated in conflicting ideas about the basic nature of a university education.

If language learning is not a merely transient phenomenon, but rather one that continues throughout a student’s college career and beyond, then administrators at all levels have reason to ask questions about how particular programs, courses, and instructors are approaching the challenge of providing an equitable writing education for students of all language identities.

Criteria for institutional multilingual equity:

· To what degree does the institution as a whole create a welcoming and supportive atmosphere for multilingual learners?

· Are sufficient support services in place, such as Writing Center tutors who understand how to work with students who are learning English and academic discourse at the same time?

· Are there counseling opportunities to help with cultural transitions--most obviously for international students arriving from abroad, but also for first-in-family transitions to college, for students transferring from community colleges to four-year institutions, etc.?

· Does the institution have serious foreign language requirements for all students? If it’s a curricular target for all students to achieve some degree of multilinguality, then all students become L1/L2 users, not just a particular minority.

Re-examining Writing Program Language Assumptions: The Translingual Student

Multilingual students--like monolingual students--come to college writing courses with widely varying levels of writing and reading proficiency. Once there, they--again like monolingual students--learn and progress at widely varying paces. It is no more true that multilingual students uniformly face a disadvantage in college writing than it is true that monolingual students possess a uniform advantage. Nevertheless, language background is unavoidably a part of the picture for all students. All the languages a student has learned or can use in everyday life are inextricably involved, often in complex, subtle, and sometimes unconscious ways, in every literacy act and process that student undertakes. This phenomenon has sometimes been referred to as a “translingual” space (Canagarajah, 2013; Pennycook, 2008; Horner et al., 2011), where the edges of languages are pliable rather than rigid, and where languages mix together, not as separate systems, but as mutually interacting elements in a student’s total literacy capacities. Everything a student knows, remembers, or creatively invents is potentially impacted at every moment of linguistic activity by all the experiences that student has had, in whatever context and in whatever language.

Writing researchers, administrators, and instructors are still far from being able to make this complex process of translinguality fully visible in the classroom or in students’ written documents; we are farther yet from being able to calibrate feedback and responses to what particular students need, given their particular mix of language background, educational history, and writing proficiency. And we may never get really good at this, because of the infinite possible variations and individual combinations of these variables. Perhaps our goal should be to help students become more aware of the ways their various language resources fit together, since ultimately it is the students themselves who will need to reconcile language identities and language proficiencies. From the outside, as instructors and administrators, posing the right questions is a difficult task, and gathering the proper data to answer them is perhaps even more daunting. Nevertheless we have to try at least to articulate the problem: We must find ways of incorporating consideration of impact on multilingual learners into writing program assessment.

Criteria for Writing Program multilingual equity:

· Is the writing curriculum--from basic writing through first-year composition through writing across the curriculum and disciplinary writing courses--designed to be universally accessible to students with varying language background profiles?

· Do instructors, full time and part time, receive sufficiently focused professional development opportunities so that they are comfortable in delivering differentiated instruction to mixed classrooms of multilingual and monolingual students?

· Do assessment procedures for student learning take sufficient account of the persistence of language learning among even advanced students?

· Do multilingual students have a clear sense that their ideas--and not just their grammar--are being examined fairly by their instructors?

· Do all students have a sense of what they need to be doing to improve their language skills? Do they understand--and feel empowered to make use of--available resources to help them?

· Do multilingual students feel part of a community that views language difference as a resource rather than a deficit?

Writing Course Assessment for Multilingual Equity

All of these broader issues of attitudes and services at the institutional and writing program level unavoidably impinge on the level of the writing course. In recent years, research on Writing Across the Curriculum and Writing in the Disciplines (WAC/WID) has begun to focus on the often-subtle ways in which language learning intersects with the learning of new academic genres and registers (Matsuda and Jablonski, 2001; Johns, 2005; Hall, 2009; Cox and Zawacki, 2011, 2014). First-year composition has a long-standing dialogue with research in second language writing (e.g., Matsuda et al., 2006), including the concept of “Generation 1.5” students (Harklau et al., 1999; Roberge et al., 2009). But research on writing assessment has not usually focused on equity for multilingual learners within multi-section college writing courses--especially “mainstream” writing courses that are not specifically designated as tailored for “ESL students” or “multilingual learners.” Rather, studies have mostly centered on issues of testing equity (Crislip and Heck, 2001; Kim, 2011), Generation 1.5 vs. international students (Di Gennaro, 2009), rater language background (Elder et al., 2007; Johnson and Lim, 2009; Lim, 2011), and/or faculty attitudes and expectations across the curriculum (Lindsey and Crusan, 2011; Ives et al., 2014; Zawacki and Habib, 2014).

Research Questions

The question of multilingual equity in a writing course may be articulated thus: Does a given course provide sufficient instruction and support so that students of all language backgrounds demonstrate satisfactory progress toward meeting the goals of the course? This overall issue may be divided up into three research questions that formed the basis of this project:

1. How can we best describe language background groups among the population enrolled in the course? The first step in understanding and meeting the needs of our students as writers and readers is to find out who they are as language users and what their experiences have been with writing and reading in English and in other languages. On many campuses this information is still not routinely available, and what information does exist is seldom sufficient to provide an adequate basis for placement decisions, classroom instruction, or assessment purposes. Faculty and administrators need an instrument that will solicit this information from a number of different angles and provide a concise, preferably numerical, means of characterizing the complex process of language identity. What I am calling an “Education and Language Background” survey (ELB) can provide this information, enabling us to calculate a Linguistic Diversity Index both for individual students and for purposes of describing a given student population.

2. Do students’ language background scores correlate with a pattern of performance on tasks measuring Writing, Reading and Thinking (WRT) proficiency at the beginning and at the end of the course? After gathering information on student language background and educational history, the next step toward achieving multilingual equity is to connect this data to actual samples of student writing. Doing so is necessary to foster meaningful assessment of individual student performance, of the effectiveness of particular courses in serving the needs of diverse subsets of the student population, and of program-wide and institution-wide commitments to serving the needs of multilingual and monolingual students alike. The goal is to search for patterns that suggest relationships within a particular student cohort between performance on particular college-level writing tasks and the data on language background and educational history.

3. How well does the course serve the needs of students with varying language backgrounds and educational histories? The focus of this linguistic diversity project is a particular assessment of a particular writing course: a junior-level research writing course taught at College Y, a public urban campus with an extremely diverse student body. Like any assessment, this investigation is situated within a local curriculum, administrative structure, and, especially, student population. The details can be extremely complex, and yet the principles and procedures are fairly straightforward and transferable to other contexts; there’s nothing new about writing samples administered at the beginning and ending of a course. But in this case, these samples were statistically evaluated in relation to an education and language background survey to gauge the degree to which the course--and by implication the overall writing curriculum and writing program and institution--is equitably serving the needs of students with diverse language backgrounds and educational histories.

II. Mapping Linguistic Diversity: Rationale, Structure, and Findings from the ELB

Who are our students? The answer will be different on every campus. On some, multilingual learners are primarily international students who arrive directly from abroad with considerable literacy in their first language but often highly variable proficiency in English. In other places, the local population--based on immigration patterns, refugee resettlement, and other factors--provides a large pool of students with a single non-English language (often Spanish, but sometimes French, Chinese, Creole, Korean, or Hmong). Still others, usually in large urban areas, combine a mixture of many different language groups on the same campus--this is the case with College Y, our example here.

A subset of students, even some born in the U.S., have experienced an educational history so disrupted--by familial moves, difficulties with immigration status, and/or repeated migration between U.S. and overseas school systems--that it has affected their literacy development in both English and their other languages. Even students who self-identify as native English speakers may really be more comfortable in “another English”; that is, in a dialect other than Standard American English. Writing administrators and instructors are becoming increasingly aware that many students have varying degrees of connection to and proficiency in multiple languages, and that they continue to use this multicompetence (Hall and Navarro, 2011) for their own purposes in local and global contexts.

In constructing their assessment plans, writing programs should not rely solely on existing institutional data to identify variables that might be relevant to issues of multilingual equity. At best students might have been asked one or two questions on admissions forms, for example, whether one is a native speaker of English, or what one’s “home language” is. This type of data, even if it exists, is often not readily available to writing program administrators in a form that would assist their efforts to make placement decisions, to revise curricula, or to construct assessment plans that are fair to students at all points on the multilingual spectrum. And even if that data were available, it would be irresponsible to rely on one or two measures of linguistic identity for programmatic decisions, given the complex linguistic diversity of college student bodies today.

The Education and Language Background Survey: Purposes, Scope, and Scale

A more comprehensive instrument is needed, one that probes a broader range of language learning, experience, and current use; that remains under the control of the local writing program, not in another office on campus; that may be easily implemented in all writing courses using existing course management software (e.g. Blackboard); and that provides program administrators and course instructors with real-time information that may be used for placement decisions and for differentiated instruction. The ELB survey used in this study is one such instrument.

(For a more detailed description of study procedures, see Appendix A. The complete text of the ELB survey is included in Appendix B, along with the percentages of students giving each answer in the junior-level assessment study and the earlier pilot study of first-year students. In the text, references such as ELBxx refer to the results of Question xx of the survey.)

Purposes of the ELB. Designing an instrument to capture the most relevant variables--and determining what, in fact, the most relevant variables are--is a complex enterprise, and various approaches have been taken (see Marian, Blumenfeld, and Kaushanskaya, 2007). Much recent research has added complexity to our understanding of exactly what it means to be an “English Language Learner” or “Generation 1.5” student (see Roberge et al., 2009) and has offered analysis of how we need to adjust educational approaches. At minimum, a survey designed to measure linguistic diversity must be constructed in a way that yields:

· A meaningful profile of individual students

· A group portrait of a given student body

· A numerical score that can be used to:

> Interrogate statistical relationships with other indicators (e.g., writing sample scores) in the same population

> Compare results across institutions and/or populations

Scope of the ELB. The Education and Language Background Survey contains 17 questions, 14 of which are multiple-choice questions constructed on the scale discussed below. It focuses on language identity, affiliation, and expertise (Leung et al., 1997) from a number of different angles. The three main areas are:

· Immigration and language learning. Was the student born in the U.S.? If not, in what country? At what age did the student arrive? At what age did she begin speaking English?

· Educational history. Did the student attend U.S. schools a) during the K-8 period? b) during the high school period? Where did the student learn to read and write in English? Did the student learn to read and write in another language? Has the student studied a foreign language in U.S. schools?

· Language use. Does the student speak and/or write any other languages? If so, in what contexts does she use them today? Does the student prefer to speak in English or another language? To read?

Scale: The Education and Language Background continuum. The ELB was constructed on a scale designed to test the widespread two-pronged default assumption of U.S. student language identity:
Assumption 1: The “mainstream” student is a monolingual English speaker.
Assumption 2: Our students have been educated entirely in the U.S. school system.

These assumptions are usually not articulated fully, but the entire U.S. educational system has been unconsciously constructed upon this basis (Hall, 2014). That’s beginning to shift, of course, as institutions and instructors adjust to the developing situation in front of them. In order to drive institutional and pedagogical change, we need reliable measures of exactly how far from these assumptions a particular reality may be in a particular classroom, on a particular campus, or in the college student population as a whole.

The Education and Language Background Survey asks similar questions in several different ways to create a nuanced picture of students’ language history, exposure, and usage. Each question uses a five-point scale: 1 = a response that would be given by a student who matched one or both of the default assumptions articulated above, and 5 = an answer that might be given by a student who did not match the assumptions at all, with room for graduated responses in between. This allows us to map our students’ language backgrounds along a continuum of language identity and educational history.
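To make the scale concrete, a single scaled item might be represented as in the sketch below. The answer wording is taken from ELB07 as reported in Table 10 later in this article; the Python data structure itself is an illustrative assumption, not the survey's actual implementation.

```python
# Illustrative sketch (not the survey's actual code): one ELB item on the
# five-point scale, where 1 matches the default assumptions and 5 diverges
# from them most sharply. Wording follows ELB07 (see Table 10).
from dataclasses import dataclass

@dataclass
class ELBItem:
    qid: str        # question identifier, e.g. "ELB07"
    prompt: str     # question stem shown to the student
    options: dict   # scale value (1-5) -> answer text

elb07 = ELBItem(
    qid="ELB07",
    prompt="For my high school education (or equivalent)...",
    options={
        1: "I attended a U.S. high school for four years.",
        2: "I attended a U.S. high school for three years.",
        3: "I attended a U.S. high school for two years.",
        4: "I attended a U.S. high school for one year.",
        5: "I did not attend a U.S. high school.",
    },
)
```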

Used in this way, the totals of student responses to the 14 multiple-choice questions yield a quantified portrait of each individual student in relation to others in the study, and the overall responses contribute to a composite linguistic portrait of the population under study. By totaling the responses to the questions composed on the 1-5 scale, we can give each student an ELB score between 14 (answered 1 to every question) and 70 (answered 5 to every question).
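The scoring arithmetic is simple enough to sketch in a few lines. The following is a minimal illustration in Python, assuming the 14 scaled responses have been exported from the course management system as a list of integers per student; the function and variable names are hypothetical, and the group cut-points follow Table 1 below.

```python
# Minimal sketch of ELB scoring: 14 items coded 1-5, so totals run 14-70.
# Group bands follow Table 1 (average response per item).
from statistics import mean, median

SCALED_ITEMS = 14

def elb_total(responses):
    """Sum of the 14 scaled responses: 14 (all 1s) to 70 (all 5s)."""
    assert len(responses) == SCALED_ITEMS
    return sum(responses)

def elb_group(total):
    """Band a student by average response per item, per Table 1."""
    avg = total / SCALED_ITEMS
    if avg < 2.0:
        return "JLG1"  # mostly monolingual and U.S.-educated
    if avg < 3.0:
        return "JLG2"
    if avg < 4.0:
        return "JLG3"
    return "JLG4"      # more likely multilingual and/or non-U.S. educated

# Toy cohort of three students' scaled answers
cohort = [
    [1, 2, 1, 3, 2, 1, 1, 2, 4, 3, 2, 1, 2, 1],  # total 26 -> JLG1
    [2] * 14,                                     # total 28 -> JLG2
    [5] * 14,                                     # total 70 -> JLG4
]
totals = [elb_total(s) for s in cohort]
print([(t, elb_group(t)) for t in totals])
# The cohort mean/median of totals serves as a linguistic diversity index:
print(mean(totals), median(totals))
```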

Language background groups: charting linguistic identity. In the junior assessment study and the first-year composition pilot study at College Y, actual responses ranged from 14 to 62, with a median of 26 for both groups, and a mean of 27.1 for first-year students and 29.9 for juniors. These group figures may be thought of as a linguistic diversity index of this particular student population, a useful measure that could be compared with results from other campuses if the ELB is administered elsewhere in the future.

If we divide the students into groups based on the average of their responses (see Table 1)--with JLG1 as the group closest to the imagined default of monolingual, U.S.-educated native English speakers, and JLG4 as the group most resembling the prototypical English language learner--we can see that fully 43% (first-year) to 46% (juniors) of College Y students in the studies do not meet the default conditions of the U.S.-educated monolingual English speaker. And among juniors, 23% were in the two groups (JLG3 and JLG4) that diverged most sharply from the default assumptions.

Table 1

Education and Language Background Groups at College Y

| Group | 1st yr % | Junior % | ELB total (average per item) | Jun-FY % |
|-------|----------|----------|------------------------------|----------|
| JLG1 | 57.3 | 54.1 | 14 to 27 (1.0-1.9): mostly monolingual and U.S.-educated | -3.3 |
| JLG2 | 29.8 | 23.0 | 28 to 41 (2.0-2.9) | -6.9 |
| JLG3 | 8.8 | 18.9 | 42 to 54 (3.0-3.9) | 10.1 |
| JLG4 | 4.1 | 4.1 | > 55 (4.0-5.0): more likely to be multilingual and/or non-U.S. educated | 0.0 |

Once ELB data have been collected and language groups have been identified, the relation between language background, educational history, and actual demonstrated college writing proficiency can be investigated statistically. Clearly on the campus of College Y, multilingual students are not some tiny minority; rather, they are right in the mainstream. Thus the course itself--in its administration, pedagogy, and assessment--needs to adjust to student needs.

III. Connecting Language Background with Writing Proficiency: Writing Course Assessment and Multilingual Equity

The ELB survey data captures a vivid portrait of student language identity and educational history and provides a more detailed linguistic mapping of the student body at a given institution than might otherwise be available. This information can be extremely valuable for Writing Program administrators, Writing Center directors, and others involved with writing curriculum planning, implementation, and support. Such a survey yields primarily demographic information: interesting, but not as useful as it might be for immediate classroom pedagogy, course-level assessment, or program-wide curriculum development, unless and until it is tied to samples of students’ actual writing.

The project reported here examined data about educational history and language use in conjunction with scores on actual pretest/posttest writing samples and explored how the ELB/Writing Sample combination provides a web of complex assessment information. The specific focus is on assessment of a required junior-level research writing course at College Y, a campus of an urban public university.

Findings on Language Background Scores and Performance Correlation

Writing proficiency in groups with high ELB scores improved from pre-test to post-test. As discussed previously, the 14 multiple-choice questions on the Education and Language Background survey use a five-point scale, where 1 = a response that would be given by a monolingual English speaker and/or lifetime U.S. resident, and 5 = an answer that might be given by a recent arrival, a student who is relatively new to the U.S. school system, and/or a student for whom English is not the principal language, with graduated responses in between.

Using the groups previously identified based on the average of their responses (Table 1)--with JLG1 as the group closest to the imagined default of monolingual, U.S.-educated native English speakers, and JLG4 as the group most resembling prototypical English language learners--we can compare the different ELB groups’ performance on the writing samples given at the beginning and end of the WRIT 30- course under study (Table 2).

Table 2

ELB Groups: Writing Sample Scores, Pre-test and Post-test



| Group | ELB Range | N | % | WS1 (Pre-test) Mean | WS1 SD | WS1 Min | WS1 Max | WS2 (Post-test) Mean | WS2 SD | WS2 Min | WS2 Max | W2-W1 |
|-------|-----------|---|---|---------------------|--------|---------|---------|----------------------|--------|---------|---------|-------|
| JLG1 | 14-27 (1.0-1.9) | 40 | 54.1% | 34.8 | 6.1 | 20.5 | 50.5 | 35.5 | 6.1 | 24.0 | 45.5 | 0.69 |
| JLG2 | 28-42 (2.0-2.9) | 17 | 23.0% | 35.0 | 6.2 | 26.0 | 45.0 | 36.8 | 7.2 | 27.0 | 52.5 | 1.85 |
| JLG3 | 43-55 (3.0-3.9) | 14 | 18.9% | 27.4 | 7.5 | 13.5 | 38.0 | 34.8 | 7.9 | 22.5 | 47.5 | 7.43 |
| JLG4 | > 55 (4.0-5.0) | 3 | 4.1% | 22.8 | 6.3 | 18.0 | 30.0 | 29.0 | 5.3 | 23.5 | 34.0 | 6.17 |


The means for students with the highest language background scores (the JLG3 and JLG4 groups) showed marked improvement over the course of WRIT 30-, by 7.43 and 6.17 points respectively, a more dramatic increase than the other two language groups experienced. This improvement put students in JLG3 essentially at the overall mean for the course.

Even with this improvement, JLG4 students still lagged behind the overall mean for the post-test Writing Sample 2 (WS2), at 29.0 vs. the overall WS2 mean of 35.2. Ultimately, though, there was a clear, positive impact on the groups that diverged most from the default student assumptions.
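To show how such a group comparison might be computed locally, here is a minimal sketch using pandas. The DataFrame columns ("group", "ws1_total", "ws2_total") and the toy scores are assumptions for illustration, not the study's actual dataset or analysis code.

```python
# Sketch of the per-group pre/post comparison behind Table 2.
# One row per student; column names are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "group":     ["JLG1", "JLG1", "JLG2", "JLG3", "JLG4"],
    "ws1_total": [34.0,   36.0,   35.0,   27.5,   22.0],   # pre-test totals
    "ws2_total": [35.0,   36.5,   37.0,   35.0,   28.5],   # post-test totals
})

summary = df.groupby("group").agg(
    n=("ws1_total", "size"),
    ws1_mean=("ws1_total", "mean"),
    ws2_mean=("ws2_total", "mean"),
)
summary["gain"] = summary["ws2_mean"] - summary["ws1_mean"]  # the W2-W1 column
print(summary)
```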

Findings on Multilingual Equity

Students with high ELB scores closed an initial gap to reach the course mainstream by the post-test. Multilingual students and students with substantial non-U.S. education histories, as measured by the ELB, were likely to score lower not only on the total score of the diagnostic essay at the beginning of the course but also on numerous individual dimensions of the rubric. Answers on the ELB were strongly and negatively correlated with scores on the first writing sample (WS1). Altogether, there were 150 possible correlations: a 15 x 10 grid (Table 3), where the 15 comprises the 14 primary ELB questions plus the ELB total, and the 10 comprises the 9 rubric dimensions of the WS1 score plus the WS1 total. On the pre-test, 125 of the 150 possible correlations (83%) were statistically significant at the 0.05 level (Table 3).

On Writing Sample 2 (WS2), the post-test, however, just over 70% of these correlations disappeared: only 18 (12%) of the Question/Dimension intersections remained statistically significant (Table 4).

The best way to see this is to visually examine the shading on Table 3 vs. Table 4.

Table 3 shows correlations between ELB questions/ELB total and Writing Sample 1. Shaded areas indicate statistical correlation at the 0.05 level. Correlation at the 0.01 level is indicated by **.

Table 4 shows the results of the same analysis for the ELB and WS2.
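For programs wishing to replicate this kind of grid on their own data, the analysis can be sketched as follows. Note the assumptions: the article does not specify the statistical software or the correlation statistic used, so this sketch uses Pearson correlations via SciPy, and all column names are hypothetical.

```python
# Sketch of the 15 x 10 correlation grid behind Tables 3 and 4:
# (14 ELB items + ELB total) x (9 rubric dimensions + writing sample total).
import pandas as pd
from scipy.stats import pearsonr

def correlation_grid(df, elb_cols, ws_cols):
    """Correlate every ELB measure with every writing-sample dimension,
    flagging significance at 0.05 (*) and 0.01 (**)."""
    rows = []
    for q in elb_cols:
        for d in ws_cols:
            r, p = pearsonr(df[q], df[d])
            sig = "**" if p < 0.01 else "*" if p < 0.05 else ""
            rows.append({"elb": q, "dimension": d, "r": r, "p": p, "sig": sig})
    return pd.DataFrame(rows)

# Usage, with hypothetical column names:
# grid = correlation_grid(
#     df,
#     elb_cols=[f"elb{i:02d}" for i in range(1, 15)] + ["elb_total"],
#     ws_cols=[f"ws1_dim{i}" for i in range(1, 10)] + ["ws1_total"],
# )
# (grid["sig"] != "").sum()  # counts significant cells: 125 of 150 on WS1
```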

Or, to state it most simply: On the pre-test, students with high ELB scores did worse than students with low ones in almost every category. By the post-test this effect had disappeared, suggesting that junior-level writing instruction may be of especial utility to multilingual students, to immigrants educated in non-U.S. educational systems, and to speakers of multiple Englishes.

Transfer students comprised a very substantial majority of WRIT 30- enrollment, as shown in ELB18 (Table 6). In such a situation, transfer students are the mainstream, and assessment must therefore account for multiple student bodies at the same physical campus. Curricular design needs to reflect the reality that many students will not be exposed to an institution’s general education program, at least not in its entirety.

Beyond their sheer prevalence in WRIT 30-, this study shows two things about transfer students, based on their responses to the survey and their performance on the writing samples at the beginning and end of the course:

· Transfer students begin their careers at College Y less well prepared in writing, reading, and critical thinking than their home-grown counterparts, and thus stand in need of an effective intervention.

· WRIT 30-, on the basis of this study, seems to be fairly effective at providing that intervention: the gap between transfer students and those who began their academic careers at College Y had greatly narrowed, and nearly disappeared, by the writing sample at the end of the course.

Table 5

Writing Assignments total score comparison (maximum possible=54) between those who started at College Y and those who transferred in

| Student College History | N | W1 (Pre-test) Total | W2 (Post-test) Total |
|-------------------------|---|---------------------|----------------------|
| Only College Y | 29 | 34.4 | 36.2 |
| Transferred with < 60 credits | 16 | 34.6 | 38.0 |
| Transferred with >= 60 credits | 42 | 30.2 | 33.4 |
| All students | 87 | 32.4 | 35.2 |


Discussion: How did WRIT 30- Help Multilingual Students, Immigrants, and Transfer Students?

The results from the course assessment of WRIT 30- suggest that semester-long writing courses taught by experienced writing instructors may be an effective tool to help junior-level writers accommodate themselves to the demands of advanced literacy. Furthermore, we have found that, at least in this cohort, WRIT 30- benefits most directly some of our most vulnerable students: multilingual students and immigrants who learned to read and write English in another country, and transfer students making the transition from community colleges or elsewhere. Both of these groups scored well below the “home-grown” College Y students on the pre-test writing sample, but demonstrated significant gains by the post-test, to the extent that they were now in the mainstream of the class, no longer statistically distinguishable from their “home-grown” counterparts.

Good news, of course--but how to explain it? Does this mean that the pedagogy of the course adheres in some way to cutting edge “translingual” or “multilingual” pedagogical principles? Not at all. While instructors in the course are certainly aware that College Y has a very diverse population, they are not trained in methods that are specifically designed to address the needs of second language learners, and/or to focus on the particular needs of transfer students. In fact, a majority of the course instructors are part-time faculty, some of long standing at the college, but most without any significant background in the teaching of English as a second language or other systematic approaches to multilingual pedagogies.

The first part of the course--sometimes dubbed “Junior Composition”--is a review of common first-year composition skills such as summary, analysis, and response, but with more emphasis on advanced techniques such as synthesis. The second part of the course is a carefully scaffolded research project, on a topic from the student’s major, but written for a more general audience. So in its staffing and in its pedagogical design, WRIT 30- is not much different from any number of college writing courses. One advantage the course does possess is its unusual positioning at the junior level, where it therefore includes transfer students as well as “home-grown” College Y students.

But I am not claiming any radical pedagogical breakthrough. Rather, these results would seem to be an illustration of Vivian Zamel’s description of the relation between Writing Across the Curriculum pedagogy and pedagogy for “ESOL” students: “What faculty ought to be doing to enhance the learning of ESOL students is not a concession, a capitulation, a giving up of standards....What ESOL students need...is good pedagogy for everyone” (pp. 518-519).

Let me re-state Zamel’s point a little more negatively. Instructors who teach a relatively homogeneous student population--students who share a similar language background and educational history with one another, and who come from a cultural background generally similar to that of the instructor--can get away with some pretty sloppy pedagogy, and students will still learn, because they will be able to fill in the gaps with their own background knowledge and their own cultural assumptions and experiences. If, however, these conditions do not apply--if there is diversity of language background, diverging educational histories, and/or multiple cultural experiences, assumptions, and perceptions among students in the same class--then there will be a much smaller margin for pedagogical error. Instructors need to be much more precise and explicit in scaffolding assignments and in managing and monitoring the steps of the research and writing process that students go through in major projects. They must take a critical approach to their own cultural assumptions--and those of the discipline of the course.

IV. The ELB as Assessment and Research Tool

One Campus, Two Student Bodies: Findings from a Comparison of Junior vs. First-year ELB Results

The Education and Language Background Survey had been piloted one year earlier in a study of first-year students enrolled in a composition course at College Y. Though the pilot study and the WRIT 30- assessment project focus respectively on first-year students and juniors at the same institution, the results do not represent longitudinal movement along a continuous development curve. Rather, the studies are best seen as two discontinuous snapshots. This disparity is more fundamental than the obvious fact that the specific students in the pilot did not have enough time to progress, in one year, from first-year composition to junior-level writing. As previously discussed, the majority of the students in WRIT 30- had experienced first-year composition not at College Y at all but at other institutions, most commonly community colleges in the same urban university system (Table 6, ELB18).

Table 6

Transfer status of students in Junior study

| ELB18: Which statement best describes your background? | N | % |
|--------------------------------------------------------|---|---|
| I completed an Associate’s degree at a community college before coming to College Y. | 34 | 35.1 |
| I completed less than ten credits at another college before enrolling at College Y. | 3 | 3.1 |
| I completed more than ten credits but less than sixty credits before transferring to College Y. | 16 | 16.5 |
| I completed more than sixty credits at another college before transferring to College Y. | 11 | 11.3 |
| College Y is the only college I have attended. | 33 | 34.0 |
| Total | 97 | 100.0 |


The pilot study did not include a post-test writing sample as the junior study did, so the progress in writing proficiency from beginning to end of the course cannot be directly compared. But in the analysis that follows, ELB answers from these two groups of students are compared in order to shed light on the differences between first-year and junior students at the same institution. The following results are, of course, applicable only to one particular campus and one particular student body, and they are a snapshot of a particular one-year span. Each campus would need to do its own study to find out who its juniors are and how they differ from its first-year students. But this study should be useful as an example of how different they can be, and of how complex the language backgrounds, the educational histories, and the interaction between the two can be.

Beyond a portrait of a particular student body--or rather bodies--at this particular institution, the differences also suggest several wider possible uses for the ELB or similar instruments for assessment or research purposes, even in the absence of the labor-intensive scoring of writing samples.

Identifying students in need of language support services. According to the ELB, first-year and junior students alike include a small but significant group, 6%-7%, who are more comfortable speaking another language than English (ELB03; see Table 8 below). This group fits the usual profile of “second language learners.” But we also need to remember that these students, who are at a relatively early stage in English fluency, represent a fairly small percentage of multilingual students; they are not the norm. These students are likely to need extra support as they work on reading and writing assignments in English in our courses, and we need to make sure that they get that support. We should recognize, however, that even highly proficient users of English as a nonnative language may still require various forms of language support and differentiated instruction if we are to help them make use of their linguistic resources.

The paradigm shift from language difference as problem to language difference as resource is at the heart of how a project like this can reveal data that can be used to empower our students to develop their linguistic and rhetorical knowledge. We need to further examine the data to map the extremely diverse language use patterns that cannot be captured by simple binaries such as monolingual/multilingual.

Distinguishing multilinguality from multiple Englishes. Even more intriguingly, the ELB suggests that juniors at College Y are more likely than first-year students to have been born outside the U.S., and are more likely to say that they are relatively recent immigrants, arriving in the U.S. after the age of 12 (Table 7, ELB01).

Table 7

Immigration history

| ELB01: Which statement best describes your background? | 1st yr % | Junior % | Jun-FY % |
|---------------------------------------------------------|----------|----------|----------|
| 1. I was born in the United States and have lived here all my life. | 66.2 | 41.2 | -25.0 |
| 2. I was born in another country, but have lived in the U.S. since before the age of 2. | 1.8 | 1.0 | -0.8 |
| 3. I was born in another country, but came to the U.S. between the ages of 2 and 6. | 7.2 | 5.2 | -2.1 |
| 4. I was born in another country and came to the U.S. between the ages of 6 and 12. | 6.8 | 12.4 | 5.6 |
| 5. I was born in another country and came to the United States after the age of 12. | 18.0 | 40.2 | 22.2 |
| Total | 100.0 | 100.0 | |

These same students are also more likely to identify themselves as monolingual English speakers (Table 8, ELB03).

Table 8

Language Identity

| ELB03: Which statement best describes your background? | 1st yr % | Junior % | Jun-FY % |
|---------------------------------------------------------|----------|----------|----------|
| 1. English is the only language that I speak. | 31.1 | 43.3 | 12.2 |
| 2. I use a language other than English in some situations, but I would not describe myself as fluent in it. | 19.8 | 12.4 | -7.4 |
| 3. I speak a language other than English fluently, but am more comfortable in English. | 16.7 | 10.3 | -6.4 |
| 4. I am equally comfortable speaking English and another language. | 25.2 | 26.8 | 1.6 |
| 5. I am more comfortable speaking another language than I am speaking English. | 5.9 | 7.2 | 1.4 |
| Total | 100.0 | 100.0 | |

The key factor is where these immigrants originated (Table 9).

Table 9

Country of Birth

ELB02: If you were born outside the United States, in what country were you born? (If you were born in the United States, skip to question #3.) Responses were entered in an open field--no multiple choices were given--and are grouped here by region.

| Region of birth | 1st yr % | Junior % | Jun-FY % |
|-----------------|----------|----------|----------|
| Born outside the U.S. (all) | 29.5 | 42.9 | 13.4 |
| Born in the Anglophone Caribbean (Guyana, Jamaica, Trinidad, St. Lucia) | 7.4 | 26.8 | 19.4 |
| Born in Spanish-speaking countries | 6.0 | 9.2 | 3.2 |
| Born in South Asia (India, Bangladesh, Pakistan, Nepal) | 9.7 | 5.8 | -3.9 |
| Born in other non-U.S. areas | 6.4 | 1.1 | -5.3 |

On the College Y campus, the pattern of immigration includes many students from English-speaking countries (ELB02), including the Anglophone Caribbean islands and mainland (especially Guyana, Jamaica, and Trinidad), and also post-colonial Asian and African countries, where English is a main component of the educational system (such as India, Pakistan, Bangladesh, Nigeria, and the Philippines). The mere fact that these students very legitimately consider themselves to be “native English speakers,” however, should not be the end of the conversation: Students often remark that although their education in their country of birth had been in English, the gap between their writing experiences in their secondary education and the demands of the U.S. university seemed--much to their surprise--to be larger than the parallel gap experienced by their U.S. contemporaries.

So there is much more going on here than issues relating to writing in a “second language.” For a large proportion of relatively recent immigrants, the salient linguistic concerns are not the classical “second language” issues; rather, they point to divisions within “English” itself rather than to differences between English and another language. Overall, the ELB results remind us that we need to complicate our conceptions of both “multilinguality” and “monolinguality” and move beyond simple binary oppositions.

Exploring educational history: U.S. high school. The ELB reveals significant differences between juniors and first-year students in their educational history, most strikingly at the secondary level. While the experiences of multilingual students who do attend U.S. high schools are, of course, far from unproblematic and often raise complex issues of linguistic identity (Ortmeier-Hooper, 2010) and literacy development (Frodesen, 2002), a different set of issues can arise when students arrive directly from a different educational system, even if their previous writing instruction was in English.

Table 10

U.S. High School attendance

| ELB07: For my high school education (or equivalent)... | 1st yr % | Junior % | Jun-FY % |
|---------------------------------------------------------|----------|----------|----------|
| 1. I attended a U.S. high school for four years. | 87.8 | 67.0 | -20.8 |
| 2. I attended a U.S. high school for three years. | 5.0 | 5.2 | 0.2 |
| 3. I attended a U.S. high school for two years. | 0.9 | 6.2 | 5.3 |
| 4. I attended a U.S. high school for one year. | 0.9 | 3.1 | 2.2 |
| 5. I did not attend a U.S. high school. | 5.4 | 18.6 | 13.2 |
| Total | 100.0 | 100.0 | |


Among juniors, 18.6% did not attend a U.S. high school at all, and an additional 14% attended a U.S. high school for less than 4 years. First-year students are much more likely to come from a U.S. high school background, with only 5.4% not attending at least some U.S. high school (ELB07). While the U.S. secondary education system has well-documented shortcomings, students who come through it are nevertheless certainly inculcated with U.S. educational values and procedures. They are familiar with U.S. pedagogical methods, and have had at least some practice with common genres of classroom writing in U.S. educational culture. And yet such U.S.-educated students still often struggle, of course, with the transition to college writing. How much more difficult, then, must be the adjustment faced by students who have attended high school in another country, even though all the instruction was in the local English.

Once again, a dichotomy between “native speaker”--which these students certainly are and identify as--vs. “non-native speaker” can obscure important differences within and among speakers of Englishes. These differences are not purely linguistic, but rather are cultural and rhetorical, the product of possibly divergent conceptions of what “writing” is and even what the purpose of a secondary education is.

Exploring Educational and Linguistic History: Early Literacy Experiences.

ELB10 addressed earlier educational history, asking where students learned to read and write in English.

Table 11

Early Educational History: English literacy

| ELB10: Which statement best describes how you learned to READ and WRITE in English? | 1st yr % | Junior % | Jun-FY % |
|--------------------------------------------------------------------------------------|----------|----------|----------|
| 1. I learned to read and write in English in a United States elementary school. | 74.8 | 46.3 | -28.5 |
| 2. I learned to read and write in English in an elementary school in an English-speaking country outside the U.S. | 7.2 | 24.2 | 17.0 |
| 3. I studied English fairly seriously in another country before coming to the U.S., and I was a proficient reader and writer of English when I arrived. | 6.3 | 10.5 | 4.2 |
| 4. I studied some English in another country before coming to the U.S., but I would not describe myself as a proficient English reader and writer at that time. | 10.8 | 11.6 | 0.8 |
| 5. I did not study English until I arrived in the U.S. after the age of 12. | 0.9 | 7.4 | 6.5 |
| Total | 100.0 | 100.0 | |


And here we see again a major difference: Juniors at College Y are less likely than first-year students to have come up through the U.S. school system. Compared to first-year students, juniors are much less likely, by 28.5 percentage points, to have learned to read and write in a U.S. elementary school; in fact, fewer than half of them did (ELB10).

In some cases, a familiar “multilingual student” pattern emerges: More juniors than first-year students did not begin to study English until after the age of 12. Some of these students may be inhabiting a translingual space, in which English subtly interacts with their other language(s) as they perform their reading and writing assignments. More research is needed to explore how they do this work and how we might help them to do it better.

But what will turn out to be equally important is Answer #2: Juniors were 17 percentage points more likely than first-year students to report that they learned to read and write in an elementary school in an English-speaking country outside the U.S. Nearly a quarter of juniors chose this response--a huge number. The choice of this response, however, raises more questions than it answers. Specifically, it leads us to ask:

· How did the educational systems in which students studied define “learned to read and write”? How do these definitions continue to exert effects on students’ writing, reading, and critical thinking practices even today?

· In the cultural context into which these students were born, what, exactly, is the meaning of “English”? Is it a language of a prestigious elite, an international medium of academic and economic exchange, a foreign language not used locally, or some more complicated combination (Nero, 2006; Irvine, 2008)? How might these connotations--and students’ later experiences with multiple “Englishes”--affect what they do when asked to complete a reading, writing, and critical thinking assignment in our classrooms?

The present data do not provide answers to these questions, but they do point to a direction for future research.

At minimum, assuming familiarity with U.S. elementary and secondary education procedures and principles is not a good strategy with such a diverse student population. Even though many of these students have had at least two years of exposure to the U.S. higher education system, whether at College Y, a community college, or elsewhere, they are still relative newcomers to dominant U.S. literacy practices. They will likely need many assumptions and procedures to be made explicit and taught directly--assumptions and procedures that instructors who presume a U.S. cultural and educational background might otherwise allow to remain merely implicit.

Exploring current language use. One of the non-multiple-choice questions (ELB11) asked students to list all the languages they can use and to characterize their use of each one as “Constantly,” “Frequently,” “Occasionally,” “Rarely,” or “Never.” Combining the results of the pilot and the junior study, 34.5% of students identified 26 non-English first languages. Not surprisingly, given College Y’s geographic location, Spanish was the most frequently cited language.
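As an illustration of how responses to an open-ended item like ELB11 might be tallied, consider the following sketch; the data shape and the example responses are invented for demonstration.

```python
# Hypothetical tally of ELB11-style responses: (language, frequency) pairs,
# one per non-English language a student reports using.
from collections import Counter

responses = [
    ("Spanish", "Constantly"),
    ("Spanish", "Frequently"),
    ("Haitian Creole", "Occasionally"),
    ("Bengali", "Constantly"),
]

languages = Counter(lang for lang, _ in responses)
active = sum(freq in ("Constantly", "Frequently") for _, freq in responses)
print(languages.most_common())                       # most cited languages
print(f"{active / len(responses):.0%} constant/frequent use")
```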

Perhaps the most striking result from this question was that 81% of these students said that they currently use their non-English first language “constantly” or “frequently.” This is an important finding, especially in U.S. culture, where the tradition of the melting-pot metaphor has fostered a pervasive subtractive expectation (Hall, 2009, p. 35): the belief that as students learn English, their original languages will fade away through disuse. The ELB responses suggest that this expectation is off base, and that our students continue to be multicompetent in various languages, dialects, and registers in their daily lives. Thus we need to ask: How do these continuing and frequent multilingual experiences affect students’ academic work and their personal identities? When we investigate language background and educational history, we are not only looking at students’ past experiences, but charting their linguistic present as well.

Conclusion

So how do the demographic patterns and linguistic complexity revealed by the ELB survey impact the task of writing assessment? The simplest answer is: You cannot assess what you cannot see, and you cannot see what you have not looked for. Students whose linguistic patterns and educational histories diverge from the default assumptions of U.S. college writing pedagogy--that all our students are native, monolingual English speakers who have been completely educated in the U.S. school system--are very likely to have been poorly served by a system designed for an America that no longer exists, and one which, even in the distant past, was far more linguistically diverse than was acknowledged at the time. Like other previously excluded groups, multilingual students have been largely invisible, and, like race or gender or similar categories, language diversity is far more complex than college writing studies is currently equipped to explore with its existing conceptual frameworks and research methods.

And yet, as writing assessment scholarship continues to make clear, “one-size-fits-all” assessment practices are unlikely to produce a coherent picture of student writing at a given institution. As we become more aware of the increasing linguistic and cultural diversity of today’s student bodies, we also become retrospectively more aware of how our past pedagogical and placement practices have been unconsciously shaped by unspoken assumptions of English monolinguality, of uniform U.S.-based educational histories, and of cultural homogeneity shaped by exclusive immersion in U.S. society. Even if, as might perhaps happen, a particular course were constituted entirely by students who, at least on the surface, met all of these criteria, we would still be compelled to take a critical view of that population: How was this homogeneity achieved? Is it a reflection, for example, of admissions procedures at a particular institution? Or is it, especially if there is a disjunction between the overall student population and that of a particular course, a result of disciplinary assumptions and practices that tend, probably inadvertently, to funnel particular types of students into the course and steer others away? We need to interrogate linguistic and cultural homogeneity, that is, just as systematically as we examine diverse multicompetencies and literacy backgrounds in a classroom.

The Education and Language Background Survey was conceived as a tool to gauge the distance a given student, or a given student population, diverges from the default assumption of the U.S.-educated monolingual English speaker: the lower the total score, the closer to the assumption; the higher the score, the farther away. Such a survey can be conveniently and economically implemented from within existing course management systems, and thus could become a routine part of every writing course--maybe every college course, period. One can almost envision a time when it would be unthinkable for a college instructor not to administer, at the beginning of each course, an instrument such as the ELB that systematically quantifies student language identity using multiple questions to create a nuanced portrait of how multilinguality interacts with student writing proficiency. Once instructors have this information in real time, the next step would be to adjust and personalize pedagogy using differentiated instruction strategies developed for various language background profiles.

At the institutional level, the composite information about a given student body’s linguistic identity would permit, in follow-up studies, detailed comparisons across campuses. The urban public university in this study is probably on the high end of that linguistic diversity measure, but, given current trends, possibly provides a glimpse of the future everywhere.

On any campus, with any combination of student language profiles, the first step is simply to gather the information, something that many institutions still do not do systematically. If this project has shown anything, it is that investigating the complexity and diversity of student language identity and language learning history is a necessary prerequisite before we can assess the effectiveness of our writing courses, or tailor our pedagogical approaches to the needs of all our students.


Acknowledgements

The pilot study of the Linguistic Diversity Project was supported by a PSC-CUNY grant, and the assessment study was under the auspices of the college’s Outcomes Assessment Committee. I would like to thank Shao-wei Wu for her invaluable statistical assistance with both studies. I would also like to thank my colleagues who were part of an earlier project to collect language background data at another university: Minoo Varzegar, Nela Navarro, and Thomas LaPointe.

Biographical Note

Jonathan Hall is Associate Professor of English at York College, City University of New York. His work on multilingual learners and writing across the curriculum has appeared in The WAC Journal, Across the Disciplines, and elsewhere.

References

Canagarajah, A. S. (2006). Toward a writing pedagogy of shuttling between languages: Learning from multilingual writers. College English, 68(6), 589-604.

Canagarajah, A. S. (2013). Translingual practice: Global Englishes and cosmopolitan relations. New York, NY: Routledge.

Carey, B. (2005, August 16). Have you heard? Gossip turns out to serve a purpose. The New York Times.

Cox, M., & Zawacki, T. M. (Eds.). (2011). Special issue: WAC and second language writing: Cross-field research, theory, and program development. Across the Disciplines: A Journal of Language, Learning, and Academic Writing, 8(4). Retrieved from http://wac.colostate.edu/atd/ell/index.cfm

Cox, M., & Zawacki, T. M. (Eds.). (2014). WAC and second language writers: Research towards developing linguistically and culturally inclusive programs and practices. Anderson, SC: Parlor Press.

Crislip, M. A., & Heck, R. H. (2001). Accountability, writing assessment, and equity: Testing a multilevel model. Paper presented at the annual meeting of the American Educational Research Association, Seattle, WA (ERIC No. ED452203).

Di Gennaro, K. (2009). Investigating differences in the writing performance of international and Generation 1.5 students. Language Testing, 26(4), 533-559.

Elder, C., Barkhuizen, G., Knoch, U., & Von Randow, J. (2007). Evaluating rater responses to an online training program for L2 writing assessment. Language Testing, 24(1), 37-64.

Frodesen, J. (2002). At what price success?: The academic writing development of a Generation 1.5 “latecomer.” CATESOL Journal, 14(1), 191-206.

Hall, J. (2009). WAC/WID in the next America: Re-thinking professional identity in the age of the multilingual majority. The WAC Journal, 20, 33-47.

Hall, J. (2014). Multilinguality is the mainstream. In B. Horner & K. Kopelson (Eds.), Reworking English in rhetoric and composition: Global-local contexts, commitments, consequences (31-48). Carbondale, IL: Southern Illinois University Press.

Hall, J., & Navarro, N. (2011). Lessons for WAC/WID from language learning research: Multicompetence, register acquisition, and the college writing student. Across the Disciplines: A Journal of Language, Learning, and Academic Writing, 8(4). Retrieved from http://wac.colostate.edu/atd/ell/hall-navarro.cfm

Harklau, L., Losey, K. M., & Siegal, M. (1999). Generation 1.5 meets college composition: Issues in the teaching of writing to U.S.-educated learners of ESL. New York, NY: Routledge.

Horner, B., & Trimbur, J. (2002). English only and U.S. college composition. College Composition and Communication, 53(4), 594-630.

Horner, B., Lu, M.-Z., Trimbur, J., & Royster, J. J. (2011). Language difference in writing: Toward a translingual approach. College English, 73(3), 299-317.

Irvine, A. (2008). Contrast and convergence in Standard Jamaican English: The phonological architecture of the standard in an ideologically bidialectal community. World Englishes, 27(1), 9-25.

Ives, L., Leahy, E., Leming, A., Pierce, T., & Schwartz, M. (2014). “I don’t know if that was the right thing to do”: Faculty respond to multilingual writers in the disciplines. In M. Cox & T. M. Zawacki (Eds.), WAC and second language writers: Research towards developing linguistically and culturally inclusive programs and practices (211-232). Anderson, SC: Parlor Press.

Johns, A. M. (2001). ESL students and WAC programs: Varied populations and diverse needs. In S. H. McLeod, E. Miraglia, M. Soven, & C. Thaiss (Eds.), WAC for the new millennium: Strategies for continuing Writing-Across-the-Curriculum programs (141-164). Urbana, IL: NCTE.

Johnson, J. S., & Lim, G. S. (2009). The influence of rater language background on writing performance assessment. Language Testing, 26(4), 485-505.

Kim, Y. H. (2011). Diagnosing EAP writing ability using the Reduced Reparameterized Unified Model. Language Testing, 28(4), 509-541.

Leung, C., Harris, R., & Rampton, B. (1997). The idealised native speaker, reified ethnicities, and classroom realities. TESOL Quarterly, 31(3), 543-560.

Lim, G. S. (2011). The development and maintenance of rating quality in performance writing assessment: A longitudinal study of new and experienced raters. Language Testing, 28(4), 543-560.

Lindsey, P., & Crusan, D. (2011). How faculty attitudes and expectations toward student nationality affect writing assessment. Across the Disciplines: A Journal of Language, Learning, and Academic Writing, 8(4), n.p.

Marian, V., Blumenfeld, H. K., & Kaushanskaya, M. (2007). The Language Experience and Proficiency Questionnaire (LEAP-Q): Assessing language profiles in bilinguals and multilinguals. Journal of Speech, Language & Hearing Research, 50(4), 940-967.

Matsuda, P. K. (2006). The myth of linguistic homogeneity in U.S. college composition. College English, 68(6), 637-651.

Matsuda, P. K., Cox, M., Jordan, J., & Ortmeier-Hooper, C. (2006). Second-language writing in the composition classroom: A critical sourcebook. Boston, MA: Bedford/St. Martin’s.

Matsuda, P. K., & Jablonski, J. (2000). Beyond the L2 metaphor: Towards a mutually transformative model of ESL/WAC collaboration. Academic Writing: Interdisciplinary Perspectives on Communication Across the Curriculum. Retrieved from http://wac.colostate.edu/aw/articles/matsuda_jablonski2000.htm

Nero, S. (2006). Language, identity, and education of Caribbean English speakers. World Englishes, 25(3-4), 501-511.

Ortmeier-Hooper, C. (2010). The shifting nature of identity: Social identity, L2 writers, and high school. In M. Cox, J. Jordan, C. Ortmeier-Hooper, & G. G. Schwartz (Eds.), Reinventing identities in second language writing (5-28). Urbana, IL: National Council of Teachers of English.

Pennycook, A. (2008). Translingual English. Australian Review of Applied Linguistics, 31(3), 30.1-30.9.

Rose, M. (1985). The language of exclusion: Writing instruction at the university. College English, 47(4), 341-359.

Russell, D. R. (2002). Writing in the academic disciplines: A curricular history (2nd ed.). Carbondale, IL: Southern Illinois University Press.

Shuck, G. (2006). Combating monolingualism: A novice administrator’s challenge. WPA: Writing Program Administration, 30(1-2), 59-82.

Spack, R. (1997). The rhetorical construction of multilingual students. TESOL Quarterly, 31(4), 765-774.

Staklis, S., & Horn, L. (2012). New Americans in postsecondary education: A profile of immigrant and second-generation American undergraduates (Stats in Brief, NCES 2012-213). National Center for Education Statistics. Retrieved from http://nces.ed.gov/

Tough, P. (2006, November 26). What it takes to make a student. The New York Times.

Zamel, V. (1995). Strangers in academia: The experiences of faculty and ESL students across the curriculum. College Composition and Communication, 46(4), 506-521.

Zawacki, T., & Habib, A. S. (2014). Negotiating “errors” in L2 writing: Faculty dispositions and language difference. In M. Cox & T. M. Zawacki (Eds.), WAC and second language writers: Research towards developing linguistically and culturally inclusive programs and practices (183-210). Anderson, SC: Parlor Press.

Appendix A

Appendix B

Appendix C

Appendix D