Volume 9, Issue 2: 2016

Recognizing Multiplicity and Audience across the Disciplines: Developing a Questionnaire to Assess Undergraduates’ Rhetorical Writing Beliefs

by Michelle Neely, University of Colorado at Colorado Springs

How do students feel about expressing uncertainty in their academic writing? To what extent do they think about their readers as they compose? Understanding the enactment of rhetorical knowledge is among the goals of many rich qualitative studies about students’ reading and writing processes (e.g., Haas & Flower, 1988; Roozen, 2010). The current study seeks to provide a quantitative assessment of students’ rhetorical beliefs based on a questionnaire. This study reports on (1) the development of the Measure of Rhetorical Beliefs and (2) a demonstration of the measure’s construct validity and utility through comparison of undergraduates’ rhetorical and epistemological beliefs, as well as their composing processes, across different majors. The new Measure of Rhetorical Beliefs (MRB) was administered to engineering, business, and liberal arts and science majors, along with the Inventory of Processes in College Composition (Lavelle & Zuercher, 2001) and the Epistemological Belief Inventory (Schraw, Bendixen, & Dunkle, 2002). Findings suggest that rhetorical writing beliefs are a measurable construct distinct from, but related to, epistemological beliefs and composing practices, and that students from different majors may have different rhetorical beliefs and composing practices. Implications for use of the Measure of Rhetorical Beliefs are discussed, including further validation of the instrument and its potential use for research, program evaluation, and instructional practice.

1.0 Introduction

For many faculty members, challenging—and perhaps changing—the way that students approach learning, knowing, and writing is at least as important as their course content (Fives & Buehl, 2012). Such values are often deeply embedded in the choices that faculty make about course content, teaching approaches, and assignment design. Just as teachers’ own beliefs about learning, knowledge, and writing shape what they do in the classroom (Hillocks, 1999), students’ beliefs may also influence their academic practices. In the work reported here, I describe the initial stages of developing a Measure of Rhetorical Beliefs. This scale may assist faculty, composition researchers, writing program administrators, and even students themselves in understanding and assessing beliefs about persuasion and multiple viewpoints relative to written communication tasks. This consideration of perspectives is termed “multiplicity” by some epistemological researchers (Kuhn, Cheney, & Weinstock, 2000; Perry, 1998), a term that I will use to describe students’ metacognitive consideration of multiple perspectives.

As Beaufort (2007) reminded us about transfer, the goal of writing instruction should not be to create students who are experts in general writing skills. Not only is doing so unrealistic, but we also cannot know exactly which jobs, roles, genres, and positions students will find themselves writing in. Rather, we should be developing students who hold metacognitive expertise about writing and who are able to strategize about the complex situations in which they will need to write, adapt their skills to new contexts, and situate their messages relative to audience and purpose. This metacognitive expertise about writing includes students’ rhetorical knowledge and rhetorical beliefs (Alexander, Schallert, & Hare, 1991; Beaufort, 2007; Flower & Hayes, 1984).

Quantifying students’ metacognitive growth, specifically their epistemological and rhetorical writing beliefs, may augment the evidence of student growth we seek in students’ written artifacts. Scales assessing students’ beliefs about knowledge, composing processes, and rhetorical beliefs may also help to triangulate instructional narratives, providing a way of understanding student change that supplements qualitative types of data and instructional accounts (Charlton, 2007). Eventually, brief quantitative measures such as the Measure of Rhetorical Beliefs may help us further research students’ transitions from first-year composition into their disciplinary writing, supplementing rich qualitative studies like that of Johnson and Krase (2012), who tracked the ways that students carried strategies for managing claims and evidence from their first-year writing courses into their disciplines. In these ways, scales can help us to understand the contexts in which students’ epistemological and rhetorical beliefs change, whether those beliefs are “portable” from composition courses into disciplines, and which beliefs help (or hinder) students as they move through their coursework.

2.0 Scope and Purpose of Current Study

The purpose of the work reported here is to explain the development and testing of items for a Measure of Rhetorical Beliefs, including an exploration of its construct validity. Testing included administering the measure, along with related scales, to composition students majoring in business, engineering, and liberal arts and sciences. These scales were used to explore the relationship between students’ epistemological and rhetorical beliefs and their composing processes; I also examined potential differences in undergraduates’ beliefs and composing processes across majors.

While this work represents some of the earliest stages of development and testing of the Measure of Rhetorical Beliefs, my hope is that, by sharing the details of the scale development process, others will use and further develop the measure, validating it with a larger sample and a more varied population of undergraduates.

2.1 Undergraduates’ Rhetorical Knowledge

How do we understand and conceptualize the rhetorical knowledge that writers need for success in their college and other writing tasks? Writing studies scholars have worked to theorize and describe the different knowledge, skills, and strategies that writers bring to composing contexts, with an aim of better supporting student writers pedagogically. Theorists explained the situatedness of texts within discourse communities (Bizzell, 1992; Gee, 2012; Swales, 1990), which contributes to the rhetorical knowledge that writers bring to composing tasks. Writing assignments ask students to replicate genres, while sometimes also asking them to analyze genres; students who understand genres as living, generative texts (Bawarshi, 2003), socially constructed and informed (Herrington & Moran, 2005), and representative of the values of discourse communities may be at an advantage when facing these complex assignments.

Building rhetorical knowledge may go beyond understanding concepts of genre and discourse community. It may also include building content knowledge about texts and understanding the referential nature of discourse pieces. As Porter (1986) explained, all texts exist in relation to other texts, so it would make sense that writers with a rich bank of text experience would come to writing tasks with deeper rhetorical knowledge, and a greater chance of success, compared to those who lack such resources. The task for new arrivals is daunting because they enter the conversation with two deficits: that of content (what has been discussed) and that of the “rules and features” (Bartholomae, 1986) for that discussion. This is evident in Bartholomae’s (1986) basic freshman writer essays, as students stepped into the discourse of the university and worked to learn and balance what to say with how to say it (Bereiter & Scardamalia, 1987). Similarly, McCarthy (1987) provided an ethnographic account of an undergraduate writing across the different disciplines required for his degree. Here, the student gathered context and content knowledge, with varying degrees of success, across the different and shifting writing demands of his classes.

Rhetorical knowledge may be informed by a writer’s conceptualization of genre, along with familiarity with conventions of discourse and content. The extent to which writers are able to draw upon these funds of knowledge (Gee, 2012) may depend on the metacognitive strategies (Pintrich, 2002) that they bring to writing tasks. A writer’s knowledge of these heuristics, along with the ability to self-regulate, appears to be an important feature of “sophisticated” rhetorical thinking. The extent to which writers conceptualize writing tasks as complex communication problems (Flower & Hayes, 1986) and can pivot among a repertoire of strategies may also contribute to their success in accomplishing writing tasks, as cross-sectional studies comparing expert and novice writers have shown (Perl, 1979; Sommers, 1980).

Taken together, “rhetorical knowledge” is a construct that may actually be informed by students’ conceptualization of discourse communities, including audience and genre; their knowledge bank of texts; and their metacognitive strategies for approaching and regulating their own composing processes. Research tracing undergraduates’ development of rhetorical knowledge has identified a common trajectory as students move toward contextualizing texts and their authors, and considering audience and purpose while composing—all of which builds funds of rhetorical knowledge. This path is demonstrated by Beaufort (2007), as she described a student’s transition from first-year writing to his required major courses, then to on-the-job writing after graduation, leveraging prior writing experiences to inform new ones. As he moved deeper into his major-level coursework, Beaufort’s student came to view texts as authored, constructed assertions, recognizing the texts he read and those he produced as contingent on context. Haas’ (1994) longitudinal study of a biology major traced a similar trend, as the student came to conceptualize her reading and writing as part of a larger conversation taking place within biology, moving away from her original “asituational or arhetorical” understanding of texts (Haas, 1994, p. 46). Other studies of academic enculturation (Berkenkotter, Huckin, & Ackermann, 1998; Herrington & Curtis, 2002; Penrose & Geisler, 1994) identified students’ rhetorical knowledge growth across their academic paths and the increased complexity with which they came to view their own and others’ texts as author-built, context-specific, and assailable.

These longitudinal, qualitative studies of rhetorical knowledge revealed growth in students as readers and writers. Similarly, a quantitative assessment of students’ epistemological and rhetorical beliefs, such as that provided by the Measure of Rhetorical Beliefs reported here, may allow us a peek into students’ understanding of knowledge, persuasion, and knowing. In this way, measuring changes in students’ beliefs may help us to track metacognitive growth and understand outcomes of certain types of instruction and programming. The Measure of Rhetorical Beliefs, as well as other scales that assess composing practices and epistemological beliefs, may be useful to writing program faculty and program administrators in describing the changes that may occur as part of instructional intervention or larger-scale programming.

2.2 Linking Epistemological Beliefs to Rhetorical Thinking

Just as scholars have studied the growth of students constructing rhetorical knowledge within reading and writing assignments, educational psychologists have for decades explored links between students’ epistemological beliefs and their academic performance. Their findings illustrate the ways that beliefs may influence students’ approaches to academic tasks. For instance, in the academic task of summarizing a reading passage, researchers (Schommer, 1990; Schraw, Bendixen, & Dunkle, 2002) identified a relationship between college students’ beliefs about knowledge and their performance on comprehension tasks involving writing a summary paragraph and completing a mastery test. Findings from this study indicated that students’ beliefs in quick learning predicted oversimplified conclusions, overconfident self-reports regarding understanding of the material, and poorer performance on a mastery test. Students who viewed knowledge as more certain also performed more poorly on the paragraph-writing task, creating paragraphs with inappropriate conclusions and oversimplified claims. Other investigations of epistemological development and academic performance have studied beliefs in quick learning as they relate to high school (Schommer, 1993; Schommer & Dunnell, 1994) and college (Schommer-Aikins, 2002; Schommer, Crouse, & Rhodes, 1992) performance; students who believed that learning happens fast tended to have lower grades.

The concept of epistemological beliefs and the widely known work begun by William Perry (1998) have also resonated with composition scholars, who have linked these beliefs with students’ writing performance. For instance, Hays, Brandt, and Chantry (1988; Hays & Brandt, 1992) evaluated student papers for rhetorical quality and evidence of students’ epistemological stance. Although they did not use a separate assessment of students’ beliefs, they coded student essays for evidence of epistemological stance by rating them according to Perry’s stages. Their analysis supported strong relationships between students’ epistemological beliefs and the quality of their essays. Students who held more constructivist epistemologies wrote essays of higher quality and greater audience awareness. Work by Charney, Newman, and Palmquist (1995) also examined the relationship between undergraduates’ epistemological styles, writing grades, and their attitudes toward writing. They found that students with more absolutist beliefs, defined as a view of knowledge as provable facts that are either true or false, tended to have lower grades on their writing. Students who were more evaluative generally had higher writing grades and identified themselves as good writers. These findings support a potential relationship among undergraduates’ epistemological styles, their attitudes toward writing, and writing performance.

2.3 Students’ Beliefs about Disciplinary Knowledge

Scholars exploring rhetorical thinking and epistemological beliefs have done so with an interest in students’ metacognition and the way that it may influence their approach to and performance on academic tasks. Understanding the ways students’ epistemological and rhetorical beliefs may differ across contexts could help us improve instructional opportunities for undergraduate groups. For instance, research suggests that, in addition to holding general epistemological beliefs, students may also have different beliefs about knowledge within different disciplines. Hofer (2000) found that first-year college students across majors categorized knowledge in the sciences as more certain and static when compared to knowledge and knowing in fields like psychology. To understand students’ perceptions of knowledge in different disciplines, Schommer-Aikins, Duell, and Barker (2003) studied students’ general epistemological beliefs, assessed by the Epistemological Beliefs Questionnaire, and their specific beliefs about math, business, and the social sciences, assessed by a modified version of the Epistemological Belief Questionnaire that asked students to keep a specific discipline in mind when responding to the questions. Results suggested that students do hold general epistemological beliefs that are separate from, but moderately related to, discipline-specific beliefs. In addition, those discipline-specific beliefs are influenced by their experience within that domain; students with more experience in math, for instance, had more “sophisticated” views of knowledge within the domain of math.

2.4 Students’ Beliefs across Different Majors

Just as individual students may hold different epistemological beliefs about disciplines, as explained above, groups of students from different majors may also hold different general epistemological beliefs. For example, previous work revealed differences across majors regarding students’ beliefs about knowledge and learning. Findings by Jehng, Johnson, and Anderson (1993) showed that undergraduate majors had different epistemological beliefs, with social science, arts, and humanities students reporting a stronger belief in the uncertainty of knowledge and a view of learning as a slow process compared to business and engineering majors. Paulsen and Wells’ (1998) results also supported a difference between majors, with students from the humanities, fine arts, and social sciences reporting more complex views of knowledge, learning, and certainty compared to students in education, business, and engineering. Additionally, longitudinal work by Trautwein and Lüdtke (2007) found that students’ high school beliefs about certain knowledge were predictive of their major in college. Business students reported the highest certainty beliefs, and social science students the lowest. Charney and colleagues (1995) found that students majoring in the humanities had more constructed views of knowledge compared to social science, business, and science/engineering students. The humanities students in their study also reported more positive attitudes about writing, including enjoyment of writing and writing efficacy. Hays, Brandt, and Chantry (1998) tested for differences in epistemologies based on major, but they did not find that academic major was a significant predictor of epistemological stance, complicating earlier reports of differences by major. In general, however, evidence supports the notion that undergraduates’ epistemological beliefs vary by major.

2.5 Measuring Rhetorical Writing Beliefs

A recent investigation of the relationship between undergraduates’ beliefs about knowledge and learning and their persuasive writing found key differences between students who had a more constructed, contingent view of knowledge and those who did not (Neely, 2014). Freshman students who believed knowledge is absolute and learning abilities are fixed wrote arguments that tended to engage in summarizing information about the selected topic instead of synthesizing and transforming their research to support their key points.

However, assessments of students’ ideas about writing (White & Bruning, 2005), writing process (Lavelle, 1993; Lavelle & Zuercher, 2001), and affect in regard to writing (Charney et al., 1995; Daly & Wilson, 1983) have generally not accounted for students’ beliefs about what constitutes a persuasive argument. The measure by Charney and colleagues (1995) assessed students’ enjoyment of writing and their sense of writing efficacy, as well as their ideas about the learnability of writing. It did not tap into students’ ideas about the rhetorical features of writing, such as consideration of audience and alternative perspectives. The closest assessment of students’ rhetorical perspective with regard to persuasive writing is White and Bruning’s (2005) Writing Beliefs Inventory (WBI), which measures students’ views of writing as either transmitting knowledge or transacting with knowledge. Their work with this instrument found a significant positive relationship between essay quality and the view of writing as a constructive, transformative process. Subsequent research using the WBI, by Mateos, Cuevas, Martin, Martin, Echeita, and Luna (2010), explored potential relationships between students’ writing beliefs, reading beliefs, and their writing performance, but found only reading beliefs predictive of writing performance.

The present study fills a gap in the field of writing research by presenting a measure that taps into students’ rhetorical beliefs about writing, a construct that is not assessed by currently available scales and questionnaires.

3.0 Research Questions

The following research questions guided this inquiry and structure this report:

  1. What items could constitute a scale assessing college students’ rhetorical beliefs?
  2. Do these items demonstrate content validity according to experienced composition instructors?
  3. To what extent does this scale assessing rhetorical beliefs demonstrate construct validity?
  4. How does this new instrument perform, with regard to reliability, to help us understand possible differences in students’ beliefs across different majors?
  5. What are the relationships between rhetorical beliefs, epistemological beliefs, and students’ composing processes?

4.0 Methods

4.1 Creating Items and Exploring Content Validity for the Measure of Rhetorical Beliefs

A reliable, valid scale to assess students’ beliefs about persuasive writing would be useful to understand the ways certain types of writing assignments and instruction may promote student growth. For example, from a programmatic perspective, the ability to assess students’ rhetorical beliefs can help us identify the types of assignments and activities that promote development of rhetorical understanding. Instructionally, such a scale can help individual faculty members gauge students’ rhetorical understanding and facilitate discussion about these beliefs by making them explicit via the scale items.

As explained above, scales such as those assessing epistemological beliefs are widely used, valid, and relatively reliable. These scales attempt to measure the broad domain of students’ beliefs about knowledge and learning. The task-specific scale created by White and Bruning (2005) to assess students’ beliefs about writing, the Writing Beliefs Inventory, is a valid measure and has been used by other researchers within academic contexts. However, the measure did not perform reliably in earlier work (Neely, 2010), nor does it necessarily assess students’ beliefs about the persuasive nature of writing.

In order to address the limitations of existing writing belief measures, I set out to create a new scale to assess undergraduates’ rhetorical writing beliefs (Neely, 2012). This process included holding semi-structured focus group sessions with undergraduates, asking them questions about their academic and nonacademic writing, their approach to writing assignments, and their ideas about persuasive texts (see the focus group discussion guide in Appendix A).

Next, I tested these items by administering them to undergraduates enrolled in composition courses and requesting feedback about item clarity. Once I had the final set of scale items, I further validated them with feedback from experienced composition faculty. Participants in each of these steps are described below.

4.1.1 Focus group participants. A total of 52 students participated in nine semi-structured focus groups. The majority of participants were sophomores (80%), female (70%), and education majors or minors (90%), as they came from a subject pool for an education course.

From the focus group transcripts, I composed pilot items for a new Measure of Rhetorical Beliefs, available in Appendix B. The process of conducting focus groups, transcribing the conversations, and using these data to draft scale items supports the content validity of the scale (Vogt, King, & King, 2004), the “extent to which a specific set of items reflects a content domain” (DeVellis, 2003, p. 49).

4.1.2 Initial scale administration to participants. After drafting the scale, I administered pilot scale items to 187 students enrolled in composition courses, seeking student input about question content and phrasing. Of these participants, 57% were female and 86% were underclassmen; 19% were Asian/Pacific Islander, 13% Latinx, 47% White, 10% mixed/other, 4% Black, and 6% gave no response. Average age was 19.4 years. Approximately half (51%) had declared majors in the College of Letters, Arts, and Sciences, which also housed undeclared majors. Business majors comprised 27% of the sample, and engineering majors 23%. (See Appendix B for a list of the pilot Measure of Rhetorical Beliefs items.)

These participants were asked to respond to items as they would in a usual paper-and-pencil survey and were also asked to circle or underline any items that were unclear. In addition, a final, open-ended item asked for written suggestions about ways to make the survey clearer. Later, as I entered their responses into spreadsheet software for analysis, I noted scale items that received multiple student comments regarding clarity and addressed these via minor revisions.

4.1.3 Composition faculty item validation. After the scale development procedures from the initial scale administration (results reported below), eight items remained in the Measure of Rhetorical Beliefs. Those items were further validated by input from seven veteran rhetoric and writing instructors, each of whom had taught freshman composition for at least eight semesters.

4.2 Establishing Construct Validity & Exploring the Beliefs of Students across Majors

Once I developed the Measure of Rhetorical Beliefs, I worked to establish its construct validity by exploring its relationship to scales that assess related constructs. To this end, I administered the Measure of Rhetorical Beliefs, along with the Epistemological Belief Inventory and the Inventory of Processes in College Composition, to groups of undergraduate students majoring in engineering, liberal arts and sciences, and business. These measures were related to the MRB in that they assess related metacognitive constructs regarding beliefs about knowledge, learning, and writing practices. Prior work found that these scales were related to academic performance (Schommer-Aikins, 2002; Schommer, Crouse, & Rhodes, 1992) and may vary across student majors (Charney, Newman, & Palmquist, 1995; Paulsen & Wells, 1998). Thus, in this project, I also explored potential differences in these scales across students from different majors.

4.2.1 Participants and context. Initial round of construct validity testing. The eight items of the Measure of Rhetorical Beliefs, along with two other scales, were administered to a group of 115 undergraduate education and liberal arts majors from a subject pool associated with an education course in order to (a) explore underlying subscales within the eight items and (b) establish construct validity. The average age of the participants was 19.9 years; 70% of them were female, and 75% were underclassmen. The scale was administered in the middle of the semester, and the participant group was 40% White, 20% Latinx, 17% Asian/Pacific Islander, 9% Black, 8% mixed/other, and 6% students who chose not to identify.

Second round of testing with composition students from across majors. A total of 260 undergraduates, most at the freshman and sophomore level, completed the series of questionnaires, comprised of the three measures described below. All participants had earned credit for the first-year composition course, a prerequisite for each of the three courses from which I recruited. The liberal arts and science majors came from a second-year rhetoric and writing course that is required of liberal arts and science majors. The business and engineering students came from a second-year professional and technical writing course. All students completed the questionnaires during the middle of the semester, between weeks 7 and 8 of a 16-week semester. Participation was completely voluntary, and faculty did not know which of their students had participated.

Liberal arts and sciences majors. A total of 127 students comprised this group, majoring in English (10), History (5), Communications (22), Women’s and Ethnic Studies (1), Philosophy (2), Sociology (2), Psychology (28), Geography (9), Nursing (11), Criminal Justice (14), Health Sciences (15), or Education (9). Of these students, 63% were female and 83% were underclassmen; 17% were Latinx, 29% Asian/Pacific Islander, 40% White, 8% mixed/other, and 6% gave no response. Average age was 19.6 years. This particular semester, the mean SAT-Verbal score for all students in the College of Letters, Arts, and Sciences (LAS) at this university was 540.

Business majors. Of the 133 total business majors, 56% were female and 77% were underclassmen. Their average age was 20 years. The ethnic breakdown was similar to that of students in the liberal arts and sciences group. This particular semester, the mean SAT-Verbal score for all students in the College of Business at this university was 535.

Engineering majors. Of the 70 students in this group, 45% were female and 78% were underclassmen at the freshman or sophomore level. Their average age was 20.2 years. The ethnic breakdown was similar to that of students in the liberal arts and sciences group. This particular semester, the mean SAT-Verbal score for all students in the College of Engineering at this university was 576.

4.2.2 Measures. Scales assessing students’ epistemological beliefs, writing processes, and rhetorical beliefs were given to the engineering, business, and liberal arts and sciences (LAS) majors. Details about each of the three scales are provided below.

Epistemic Beliefs Inventory (EBI). Epistemic beliefs were measured using a 32-item scale from Schraw, Bendixen, and Dunkle (2002), with participants rating their agreement with each item on a 1-5 scale. Factor analysis procedures replicated those of Schraw and colleagues but identified a four-factor structure rather than a five-factor structure. Even so, the identified factors were very similar to those of Schraw and colleagues; the main difference was that items assessing beliefs about the certainty and structure of knowledge appeared to function as a single subscale.

The first factor, about simple knowledge, had a coefficient alpha of .62. A sample item from this factor was, “Things are simpler than most professors would have you believe.” The second factor, about omniscient authority, had a coefficient alpha of .70. A sample item was, “People should always obey the law.” The third factor, about whether the ability to learn is innate, had a coefficient alpha of .73. A sample item was, “People’s intellectual ability is fixed at birth.” The fourth factor, about the stability of knowledge, had a coefficient alpha of .67. A sample item was, “What is true today will be true tomorrow.” (See Appendix C for items and factor loadings.)

Inventory of Processes in College Composition (IPCC). Students’ composing processes and practices were measured using a 52-item scale, rated from 1-5, from Lavelle and Zuercher (2001; Lavelle & Guarino, 2003). Their factor analysis of the scale indicated five subscales of composition processes. In other words, the scale assessed students’ writing processes across several different components: (1) Elaborative, or writing as a search for personal meaning and self-investment; (2) Low Self-Efficacy, a fearful or doubting approach to one’s writing; (3) Reflective-Revision, or approaching writing as a “sculptor,” using writing as an opportunity to construct knowledge; (4) Spontaneous-Impulsive, an approach to writing that overestimates skill and undervalues revision; and (5) Procedural, indicating a strict adherence to rules and to getting an essay “right.”

I used factor analysis procedures identical to Lavelle and Zuercher’s (Principal Components Analysis with Varimax rotation) and found similar results, but the scree plot suggested a structure of four factors, not five, accounting for 42% of the variance in the sample population. The failure to replicate the previous factor structure was not a major cause for concern, as the items still clustered in factors that resonated with previous findings and with our understanding of the elements of composing processes. When analyzing factor analysis data, it is important that the numbers resonate with the theory and practice of the phenomenon (Allen & Yen, 2001). This was the case with my analysis of the IPCC; the items clustered into factors that resonated with my experience teaching composition and with the research of composition theorists. Reliability of each of the four factors ranged from .69 to .93, indicating that items within each factor were likely assessing the same construct (see Appendix D for items and factor loadings).

Measure of Rhetorical Beliefs (MRB). This 7-item measure, also on a 5-point scale, consisted of questions based on the focus group item development and testing described above. The scale items are listed in Table 1. Overall reliability of the scale (coefficient alpha) was .77, and reliability for the two factors was .63 and .78.
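For readers who wish to replicate the reliability analysis, coefficient (Cronbach’s) alpha can be computed directly from an item-response matrix. The brief Python sketch below uses NumPy with hypothetical Likert-type ratings; the data, function name, and variable names are illustrative only and are not drawn from the study’s dataset.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for an (n_respondents, n_items) matrix
    of Likert-type responses."""
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 ratings from six respondents on a three-item subscale
responses = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 3],
    [3, 4, 4],
])
print(round(cronbach_alpha(responses), 2))  # prints 0.9 for this toy data
```

Values in the .60-.90 range, like those reported for the EBI, IPCC, and MRB subscales above, are conventionally read as acceptable to strong internal consistency.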

5.0 Results

Results are presented in response to the Research Questions in Section 3.0, with research question 1 addressed in 5.1, research question 2 addressed in section 5.2, and so on.

5.1 Testing Items Written from Focus Groups 

The group of 187 undergraduate composition students represented an adequate initial number of responses to the scale (Comrey & Lee, 1992; Spector, 1992). Thus, I conducted factor analysis (Principal Components Analysis with Varimax rotation) to determine whether the measure was assessing independent factors regarding students’ rhetorical writing beliefs. The resulting scree plot suggested a four-factor solution accounting for 42% of the variance in the sample population. After reviewing the resulting factor score correlation matrix, I dropped items that had low factor loadings (less than .40, per DeVellis, 2003) or that cross-loaded. Dropped items included those assessing non-academic writing (items 16-19 in Appendix B), students’ ideas about reading persuasive texts (items 21-36), and oral communication (items 37-38).
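The scree-plot logic used here, examining eigenvalues of the item correlation matrix to choose how many factors to retain, can be sketched as follows. The data are synthetic, with a two-factor structure built in by assumption, so the "elbow" appears after the second eigenvalue.

```python
import numpy as np

def scree_eigenvalues(data: np.ndarray) -> np.ndarray:
    """Eigenvalues of the item correlation matrix, sorted descending.

    Principal Components Analysis extracts these; a scree plot is simply these
    values plotted in order, with the 'elbow' suggesting how many factors to
    retain (Cattell's scree test).
    """
    corr = np.corrcoef(data, rowvar=False)
    return np.linalg.eigvalsh(corr)[::-1]  # eigvalsh: for symmetric matrices

# Synthetic example: 200 respondents, 6 items driven by 2 latent factors.
rng = np.random.default_rng(1)
factors = rng.normal(size=(200, 2))
loadings = np.array([[.8, 0], [.7, 0], [.6, 0], [0, .8], [0, .7], [0, .6]])
data = factors @ loadings.T + rng.normal(scale=0.5, size=(200, 6))
ev = scree_eigenvalues(data)
explained = ev[:2].sum() / ev.sum()  # proportion of variance for two components
```

With this construction, the first two eigenvalues sit well above 1 (the Kaiser criterion) and the rest fall below it, mirroring the kind of pattern that led to the four-factor and two-factor decisions reported in this section.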

Within the eight items, I explored the potential factors, or subscales, to determine whether the eight items were measuring a single, broad construct called “rhetorical beliefs” or whether these eight items might actually comprise more than one measure of constructs related to rhetorical beliefs. Factor analysis again involved Principal Components Analysis with Varimax rotation, selected because the factors were assumed to be orthogonal (DeVellis, 2003), and yielded a scree plot suggesting that two factors could be extracted (Cattell, 1966) for a two-factor solution. I therefore ran an analysis to extract two factors, which together accounted for 38% of the variance in the sample population. Table 1 presents the factor loadings of each item. Loadings greater than .40 (positive or negative) are considered acceptable (DeVellis, 2003), and items should not cross-load (i.e., load on more than one factor). Cross-loaded items, or items that do not load high enough on either factor, do not help in measuring the underlying construct. The final item in Table 1, “Writing in college is mainly reporting what authorities think about an issue,” loaded only at .20 on the “multiplicity” factor and at .07 on “audience,” indicating that the item was not working with the other items to assess the underlying factors. As a result, I dropped that item from subsequent analysis as a means to optimize the Measure of Rhetorical Beliefs.

Since the analysis described above suggested that the seven items clustered into two factors, I set out to name each of these subscales. One of these factors encompassed items that gauged an openness to multiple, even competing, ideas while writing, so I labeled it “multiplicity.” The second cluster of items included statements about the extent to which students think about their readers as they compose, so I labeled it “audience.”

Table 1: Scale Items and Factor Loadings of the Measure of Rhetorical Beliefs  

Item Multiplicity Audience
1. When writing a paper for school, I try to imagine who will be reading it. .23 .85
2. When writing a paper for school, I think about readers who might disagree with my opinion. .24 .74
3. When writing a paper for school, I think about the professor who will be reading it. -.07 .81
4. When writing a paper for school, I try to stick only to my opinion and not present too many sides of an issue. -.55 .14
5. When writing a paper for school, I show multiple sides of the issue. .80 .13
6. It is important to know exactly what you’re trying to say before you start writing a paper for school. -.45 -.19
7. I’ve had the experience of changing my mind about an issue after writing a paper about it. .47 .11
8.* Writing in college is mainly reporting what authorities think about an issue. -.20 .07

* Item dropped from the final measure due to low factor loadings.

5.2 Feedback from Experienced Composition Faculty

After identifying the core items in the Measure of Rhetorical Beliefs, I presented copies of the eight-item scale to seven veteran composition faculty in order to further establish content validity for the measure (DeVellis, 2003). The instructors provided input about the appropriateness of the items for understanding students’ beliefs about persuasive writing by writing comments on a draft of the instrument and returning it to me for review. They agreed that “at face” the items appeared to tap into students’ persuasive beliefs about writing, supporting the instrument’s face validity, which refers to whether a group of experts believes, based on a reading of the items, that an instrument will assess the construct (DeVellis, 2003). Based on the faculty members’ feedback, I made a few minor edits so that survey items were in language clearer to students (e.g., changing the word “people” to “readers” in item 2).

5.3 Initial Round of Construct Validity Testing

In order to explore its construct validity, which refers to the way in which the Measure of Rhetorical Beliefs relates to other established measures of related constructs (Haswell, 2001), students’ responses to the Measure of Rhetorical Beliefs were compared to two established scales assessing related constructs: the Epistemological Belief Inventory (Schraw, Bendixen, & Dunkle, 2002) and the Inventory of Process in College Composition (Lavelle & Zuercher, 2001). Correlation analysis indicated that the Measure of Rhetorical Beliefs assessed a construct different from those measured by the Epistemological Belief Inventory and the Inventory of Process in College Composition. Results are reported in Table 2. While a subscale of the Measure of Rhetorical Beliefs (MRB) was significantly correlated with the Epistemic Beliefs Inventory (EBI) and some elements of the Inventory of Process in College Composition (IPCC), it was not so highly correlated as to suggest that the scales were measuring the same construct.

Table 2: Correlation Coefficients of Subscales**     

  EBI IPCC Elaborative IPCC Efficacy IPCC Spontan IPCC Procedural MRB Multip
IPCC Elaborative -.21*          
IPCC Efficacy -.29* .42*        
IPCC Spontaneity .11 .35* -.06      
IPCC Procedural .10 .21* -.05 -.52*    
MRB Multiplicity -.12 .20* -.03 .01 -.12  
MRB Audience .05 .23* .13 -.10 -.48* .15

*Correlation significant at p< .05 level

**Note: EBI=Epistemological Beliefs Inventory, IPCC= Inventory of Process in College Composition, MRB= Measure of Rhetorical Beliefs

5.4 Second Round of Construct Validity with Identification of Differences among Majors

In this step, scale responses from the 260 composition students across majors were analyzed in order to explore whether students from different majors responded differently to the scales. As reported in Table 3, significance tests (ANOVA), which included a Bonferroni correction, and post hoc testing revealed that the liberal arts and science majors’ (n=127) scores were significantly different from those of the business (n=63) and engineering (n=70) majors on the Elaborationist subscale of the Inventory of Process in College Composition (F(2, 258) = 9.35, p < .05, partial eta squared = .13) and the Audience subscale of the Measure of Rhetorical Beliefs (F(2, 258) = 3.90, p < .05, partial eta squared = .07).
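The ANOVA, Bonferroni correction, and partial eta squared reported here follow standard procedures. The sketch below uses hypothetical subscale scores: the group sizes mirror the study's (127, 63, 70), but the values themselves are simulated, not the study's data.

```python
import numpy as np
from scipy import stats

def anova_with_effect_size(*groups):
    """One-way ANOVA plus eta squared (SS_between / SS_total, which equals
    partial eta squared in a one-way design)."""
    f, p = stats.f_oneway(*groups)
    pooled = np.concatenate(groups)
    grand = pooled.mean()
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_total = ((pooled - grand) ** 2).sum()
    return f, p, ss_between / ss_total

# Hypothetical 5-point subscale scores for three majors (simulated).
rng = np.random.default_rng(2)
las = rng.normal(3.0, 0.6, 127)
bus = rng.normal(3.6, 0.6, 63)
eng = rng.normal(3.6, 0.6, 70)
f, p, eta_sq = anova_with_effect_size(las, bus, eng)

# Bonferroni-corrected pairwise follow-ups: 3 comparisons, so multiply each p by 3.
pairs = [(las, bus), (las, eng), (bus, eng)]
corrected = [min(stats.ttest_ind(a, b).pvalue * 3, 1.0) for a, b in pairs]
```

Here the built-in 0.6-point gap between the LAS group and the other two produces a significant omnibus F and significant corrected follow-ups for the LAS contrasts, the same pattern of analysis reported in Table 3.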

Table 3: Scale and Subscale Means (Standard Deviations) by Major    

Scale: Subscale High Score Interpretation LAS Business Engineering
EBI: Certain Knowledge Truth is stable and generally does not change.
EBI: Innate Learning Believes ability to learn is fixed and individuals cannot "learn to learn."
EBI: Omniscient Authority Views authority as unassailable and generally expert.
EBI: Simple Knowing Preference for simple knowledge.
IPCC: Elaborationist Views writing as constructive process of sculpting meaning, often personal.
IPCC: Self Efficacy Knowledge of and confidence about writing process.
IPCC: Spontaneous Tends not to engage in revision. Reports writing impulsively.
IPCC: Procedure Concerned with correctness, grammar, and staying with plan.
MRB: Multiplicity Considers competing perspectives while writing.
MRB: Audience Views writing as communicative and considers audience perspectives.

[Mean (SD) values by major are not preserved in this copy of the table.]

The highlighted differences suggest that the engineering and business majors identified as more elaborative in their writing, using their composing processes to construct meaning to a greater extent than the LAS majors. Further, the groups of engineering and business students considered audience more often and/or to a greater extent during composing compared to the LAS majors. In addition, the LAS majors scored significantly lower on the Self Efficacy subscale of the Inventory of Process in College Composition compared to both the business and engineering groups (F(2, 258) = 3.62, p < .05, partial eta squared = .07), which indicates that the LAS majors as a group felt less confident about academic writing tasks than the business and engineering students.

5.5 Exploring Relationships among Scale Constructs

Once the analysis of variance across majors was complete, I combined all of the students’ scores into a single group and ran a correlational analysis in order to explore potential relationships between scales. These results are reported in Table 4. As illustrated in Table 4, the relationship between epistemological certainty beliefs and rhetorical beliefs related to multiplicity was significant, but weak (correlation coefficient = .37). In addition, there were moderate significant relationships between some subscales of the Inventory of Process in College Composition and the Measure of Rhetorical Beliefs.

Table 4: Correlation of Subscales across All Majors    

(Column order matches the row order: EBI Innate Learning, EBI Omniscient Authority, EBI Prefer Simple Knowledge, IPCC Elaborate, IPCC Self Efficacy, IPCC Spontaneous, IPCC Procedural Correct, MRB Multiplicity, MRB Audience)

EBI: Certainty .25** .32** .36** -.02 -.17* .22** .17 -.37** -.07
EBI: Innate Learning   .09 .10 .09 .05 .10 .01 -.06 .25**
EBI: Omniscient Authority     .16 -.01 .01 -.07 .41** -.11 .02
EBI: Prefer Simple Knowledge       .07 .04 .22* .09 -.18* -.03
IPCC: Elaborate         .60** -.32** .12 .20* .50**
IPCC: Self Efficacy           -.21* .02 .23** .30**
IPCC: Spontaneous             -.17 -.06 -.32**
IPCC: Procedural Correct               -.26** .10
MRB: Multiplicity                 .18*

**. Correlation is significant at the 0.01 level

*. Correlation is significant at the 0.05 level

Note: EBI = Epistemic Beliefs Inventory, IPCC = Inventory of Process in College Composition, MRB = Measure of Rhetorical Beliefs 
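A correlation table of the kind shown in Table 4, with significance flags at the .05 and .01 levels, can be produced as follows. The subscale scores below are simulated for illustration only; the `Elaborative`/`Efficacy` relationship is built in by construction.

```python
import numpy as np
from scipy import stats

def corr_with_stars(data: np.ndarray, names):
    """Pairwise Pearson correlations with significance flags, Table 4 style."""
    rows = []
    k = data.shape[1]
    for i in range(k):
        for j in range(i + 1, k):
            r, p = stats.pearsonr(data[:, i], data[:, j])
            star = "**" if p < 0.01 else "*" if p < 0.05 else ""
            rows.append((names[i], names[j], round(r, 2), star))
    return rows

# Simulated subscale scores for 260 students (hypothetical, not the study's data).
rng = np.random.default_rng(3)
elab = rng.normal(3, 1, 260)
effic = 0.6 * elab + rng.normal(0, 0.8, 260)   # built-in positive relationship
spont = rng.normal(3, 1, 260)                  # unrelated by construction
table = corr_with_stars(np.column_stack([elab, effic, spont]),
                        ["Elaborative", "Efficacy", "Spontaneous"])
```

With n = 260, the constructed Elaborative/Efficacy pair comes out moderately and significantly correlated, while the unrelated subscale hovers near zero, the same shape of result the matrix above reports.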

6.0 Discussion

The purpose of the focus group testing, pilot testing of items across students in composition courses, and solicitation of feedback from experienced composition faculty was to explore and establish the content validity of the Measure of Rhetorical Beliefs. Focus groups are an important step in creating items for a new measure, as undergraduates represent both the expert group and target audience for the scale (Creswell, Plano Clark, Guttmann, & Hanson, 2003). That is, since I was working to compose items about students’ beliefs about persuasion, these focus groups provided me with some access and insight into these ideas and the language that students use to describe the phenomenon. The next step, in which I administered pilot scale items to a group of undergraduates and conducted factor analyses on the results, helped me to understand whether the items were functioning together, and which items should be dropped from the instrument draft. Sharing the items with composition faculty was an additional step to understand face validity of the Measure of Rhetorical Beliefs.

Below, I explain the steps taken to establish construct validity for the final eight items of the measure, which included administering the scale to two groups of undergraduates, along with several other instruments that assess related constructs.

6.1 Demonstrating Construct Validity of the MRB

The correlations reported in Table 2 suggest that the Measure of Rhetorical Beliefs has acceptable discriminant validity (Clark & Watson, 1995), which supports the notion that the construct of epistemological beliefs is related to, but distinct from, composing processes and rhetorical beliefs. If the three scales had been highly correlated (greater than .90), then that high correlation would suggest that the scales were not assessing unique, discrete constructs, but rather the same construct. Thus significant, but not overly high, correlation of the Measure of Rhetorical Beliefs to related scales represents an important demonstration of its construct validity.

6.2 Identifying Similarities in Student Beliefs across Majors

Unlike earlier studies that found differences in students’ epistemological beliefs across majors (Charney et al., 1995; Jehng et al., 1993; Paulsen & Wells, 1998; Trautwein & Lüdtke, 2007), students’ epistemological belief scores across majors in the current study were not significantly different. Scores were also similar across majors with regard to composing processes; across majors, students tended not to engage in revision, with an average Inventory of Process in College Composition score of 3.07 on the 5-point “Spontaneous” subscale. Students’ concern with grammar while composing was similar across majors, as measured by the “Procedure” subscale. Scores by major also did not vary with regard to the Measure of Rhetorical Beliefs multiplicity subscale, which includes considering competing perspectives while writing.

6.3 Accounting for Differences in Student Beliefs across Majors

Some differences across majors did emerge from the analysis of mean scores. When compared to the LAS majors, business and engineering students reported more advanced considerations of audience in terms of their rhetorical beliefs, as well as greater idea elaboration and writing self-efficacy. One possible explanation is the higher average SAT-Verbal score of students in the engineering program, which was at least 36 points higher than the averages of the other two groups. It might be that engineering students had greater access to academic experiences that promoted the kind of advanced thinking about writing assessed by the scales in this study, experiences that correspond with higher SAT scores as well.

The average SAT-Verbal score of students in the engineering program that semester was 576, compared to 535 for business program students and 540 for students enrolled in LAS programs. Although SAT-Verbal scores were not collected for individual students, we may tentatively consider the different composing practices and writing beliefs in light of the engineering students’ higher mean SAT-Verbal scores.

Even so, explaining different composing practices and writing beliefs via the engineering students’ higher SAT-Verbal scores provides only part of the picture. The LAS SAT-Verbal scores were very close to those of students in the business program, with only a 5-point difference in mean scores. Despite the similarity in SAT-Verbal scores to their LAS counterparts, the business majors scored significantly higher on the Elaborationist composing subscale, reporting a constructive view of writing, and on the audience component of the Measure of Rhetorical Beliefs.

Future work with the Measure of Rhetorical Beliefs should include each participant’s verbal aptitude as a covariate, but this study raises the question of what may account for the differences in scale scores across majors beyond SAT-Verbal performance. Why are majors from the professional schools reporting more advanced composing processes, writing efficacy, and consideration of audience while writing? In the case of our university population, these differences may be the result of the specific instruction in professional genres that the business and engineering students receive in their second-year composition courses and throughout their majors. Once they have completed first-year composition, students in the professional schools (engineering and business) are tracked into a professional and technical writing course to fulfill their second composition requirement, courses that they had started (but not finished) at the time of the survey data collection. In contrast, students in LAS major programs take a research-based writing course that covers academic genres, but not professional ones. This difference in instructional content may account for the different scores around audience and elaboration indicated here.

6.3.1 Understanding rhetorical beliefs as related to epistemological beliefs. In the correlational analysis reported in Table 4, I expected a strong link between students’ epistemological and rhetorical beliefs; however, the results do not support such a relationship. The weaker-than-expected relationship between epistemological certainty beliefs and rhetorical beliefs related to multiplicity, though significant (correlation coefficient = .37), was surprising in light of qualitative studies (e.g., Beaufort, 2007; Berkenkotter, Huckin, & Ackerman, 1988; Haas, 1994; Penrose & Geisler, 1994) that illustrate the development of students’ rhetorical knowledge and epistemological stance within their fields of study. Thus, the constructs of epistemological and rhetorical beliefs, and their development within students, may be more independent than I originally anticipated. Further use of the measure is needed to verify the independence of the two constructs.

6.3.2 Exploring rhetorical beliefs as related to composing processes. Other correlations between the scales, also reported in Table 4, are noteworthy. For instance, there were moderate significant relationships between some subscales of the Inventory of Process in College Composition and the Measure of Rhetorical Beliefs. Within the composing process measure, students who tended to view writing as an elaborative process (IPCC: Elaborative) also had high feelings of efficacy toward their writing processes (IPCC: Efficacy), another moderate relationship (correlation coefficient = .52). Put another way, students who viewed writing as a way to construct understanding also tended to enjoy writing, compared to those who viewed writing as a means to relay information. This makes sense if students feel successful in their use of writing to construct meaning. Another significant, though weak, relationship was that students who felt efficacious about their writing (IPCC: Efficacy) tended to be less spontaneous about writing (IPCC: Spontaneous), preferring instead to plan their composing. This aligns with an emphasis on process-model instruction, in which students are encouraged to plan, revise, and re-vision their writing.

Students who reported considering audience as they wrote (MRB: Audience) also viewed writing as a process of sculpting meaning (IPCC: Elaborative), felt efficacious about writing (IPCC: Efficacy), and reported considering multiple perspectives in their writing (MRB: Multiplicity). Those who considered their audience also were less likely to write spontaneously (IPCC: Spontaneous), preferring instead to plan and revise.

6.3.3 Composing process related to epistemological beliefs. Finally, with regard to students’ composing processes, those students who reported concern with grammar and format correctness (IPCC: Procedural) also viewed knowledge as coming from omniscient authority (EBI: Omniscient Authority) with a correlation of .41. It might be that for these students “good” writing is grammatically impeccable, which would resonate with the notion of authority-based knowledge, reflected in their EBI scores. These findings align with Rose’s (1980) description of the “rigid rules and inflexible plans” of novice writers and descriptions of writers entering their discipline, as provided by Berkenkotter and colleagues (1988) and Beaufort (2007).

From a quantitative perspective, prior work had not explored correlations between students’ epistemological beliefs and their composing processes, as assessed by the IPCC. Thus, the findings reported here should be taken as tentative and in need of further testing.

Finally, as we further consider the correlations reported in Table 4, it is worth remembering that correlation does not imply causation. With that said, the relationship between students’ beliefs about knowledge, their rhetorical beliefs, and their reported composing practices does suggest a complex system of relationships linking beliefs and practice. Further use of these measures, and linking them with authentic student artifacts, may help us better understand ways to promote beliefs about writing that will lead not only to improved writing performance, but also to epistemological growth and rhetorical thinking.

The findings presented here report on the initial steps toward creating a questionnaire to measure rhetorical writing beliefs and using that questionnaire to compare beliefs of undergraduates across different majors. Overall findings suggest that rhetorical writing beliefs are a measurable construct distinct from, but related to, epistemological beliefs and composing practices and that students from different majors may have different rhetorical beliefs and composing practices.

Although the Measure of Rhetorical Beliefs is still in the pilot testing stage, these early results indicate that students from different majors may have different ideas about audience as they complete academic writing tasks, in addition to different beliefs about writing as a constructive process. Generally, these results are supported by earlier studies comparing epistemological and writing beliefs of students from different majors.

6.4 Changing Students’ Beliefs

To some extent, students’ rhetorical and epistemological beliefs likely grow from their prior experiences with writing and the way that writing is presented to them in general education requirements and within their majors. The types of assignments we ask of our students shape the way they understand knowledge construction and meaning making within their disciplines. For instance, more “authentic,” problem-posing assignments may promote growth in rhetorical and epistemological beliefs to a greater extent than static assignments, such as the traditional research paper (Davis & Shadle, 2000; Larson, 1982). This is supported by the higher audience scores of the engineering and business students, which we might tentatively link to the difference in writing tasks that they experience in their courses.

The measure of rhetorical beliefs reported here is potentially useful to understand the effect of instructional interventions and types of assignments. Such work has been done with measures of epistemological beliefs and other writing belief scales, linking certain types of instructional intervention to epistemological change in college students. Work by Kienhues, Bromme, and Stahl (2008) points to instruction designed to directly challenge students’ beliefs, as opposed to narrative-based instruction, as changing education and psychology undergraduates’ epistemological beliefs about the field of genetics. Valanides and Angeli (2005) compared the effect of three different instructional approaches on undergraduates’ epistemological beliefs and a written outline assignment. Students who were in the groups that received traditional lectures or Socratic questioning from an expert did not experience changes in their epistemological beliefs. However, students in the “infusion” group, whose instruction was more dialogue-based with an expert, did experience changes in their epistemological belief scores.

Instruction designed to shift students’ specific beliefs about writing provides a somewhat more confusing picture about the malleability of these important beliefs via certain instructional techniques. A small study of undergraduates showed improved writing performance after direct instruction on a synthesis-writing task (Boscolo, Arfé, & Quarisa, 2007). Included in the study design was a pre- and post-assessment of writing beliefs via the Inventory of Process in College Composition, a scale used in the study reported here. Even though students’ quality of synthesis writing improved over the course of the intervention, their beliefs about writing, specifically text elaboration and revision, did not significantly change across the intervention. Instead, students came to view academic writing as less personal and less evaluative than they had thought prior to the synthesis writing intervention. Thus, while these findings support a possible link between instruction and writing belief change, the instruction may result in a change in an unintended direction, at least as a temporary side effect.

Additional work also tracked an unintended potential consequence of writing instruction on students’ beliefs (Neely, 2014). Across a 16-week first-year composition course, students’ beliefs about the nature of knowledge shifted as they came to view learning more as a slow process and knowledge as less certain. Their writing beliefs also shifted away from the view of writing as a product and of writing as one-sided and certain. However, students’ views about authority did not shift toward the more constructivist ends of the measures. That is, they continued to see a key purpose of writing as reporting authorities’ ideas about issues. Although they did not regress, students’ beliefs about the role of reporting authority in writing remained the same even after a semester-long course that emphasized a rhetorical approach to research-based writing. This is perhaps because research-based writing, even taught within a rhetorical framework, often does emphasize the location and proper use of appropriate, credible sources. This is not to say that rhetorically based research writing instruction necessarily leads to negative effects, but rather that such instruction may reinforce existing, functional beliefs that students have about writing. Pre- and post-designs assessing students’ rhetorical beliefs and epistemological beliefs may help us understand the impact of certain assignments and instruction on their metacognitive growth.

6.5 Limitations and Future Directions

What I have labeled “rhetorical beliefs” in this project is actually a complex and multi-faceted construct that may be informed by students’ genre understanding and experience (Bawarshi, 2003; Bizzell, 1992), funds of knowledge (Gee, 2012; Porter, 1986), and metacognitive strategies (Flower & Hayes, 1977; Pintrich, 2002). Scales such as this one attempt to reduce complex phenomena to a series of easily answerable items in the name of efficiency, gathering data on larger numbers of students than would be possible with more qualitative approaches. However, we should be cautious as we construct and use such scales not to veer too far from the original construct we set out to measure. This is especially important during the statistical analyses of scale items. The items that “survived” the scale validation and factor analyses described here are those that assess the extent to which writers consider audience and think about competing viewpoints when composing, subscales that I named “audience” and “multiplicity.” This study raises the question: are these items an adequate representation of a writer’s rhetorical beliefs, a vast and complicated phenomenon?

One of the challenges of constructing this scale was omitting items that were not contributing to the reliability of the scale. After the focus groups, I began with thirty-eight items (see Appendix B). These were items I had written based on focus group conversations about rhetorical thinking in different contexts, including items to capture potential difference between reading and writing beliefs. After the factor analyses were complete, only eight scale items remained. While those eight items were strong in terms of their reliability, they were also limited in scope because there were so few. Thus, a future direction for this project would be to build a large pool of participants, perhaps even across institutions, who could test items, thus building a stronger item bank (DeVellis, 2003).

There are other substantial limitations within the design of the study reported here: All of the questionnaires are self-reports, without written artifacts and other data to triangulate findings. Self-report data gathered via quantitative assessments have a problem of “truthiness,” as the tidiness of the numbers belies underlying complications, including issues of social desirability response bias (Fowler, 2014). Thus, self-report data may contain error variance due to inaccurate perception of actual practices and beliefs. At its best, self-report data is just that: It indicates how participants, in this case undergraduates across different majors, understand themselves relative to their knowledge and writing beliefs. The hope is that these types of scales tap into students’ perceptions of themselves. Thus, these are measures of students’ perceptions of their beliefs (in the case of the Epistemological Beliefs Inventory and Measure of Rhetorical Beliefs) and practices (in the case of the Inventory of Process in College Composition) more than measures of their “true” beliefs and practices. In crafting questions for the MRB, I made efforts to use value-neutral language when composing items and to use phrasing students provided in the focus groups. Attention to authentic language and neutral phrasing were attempts to mitigate the threat of response bias to the MRB’s validity.

An additional limitation is the weak to moderate correlation coefficients reported in Table 4. Although some of these correlation coefficients were in the weak range (.30), their significance is reported here as part of early development data for the Measure of Rhetorical Beliefs. The hope is that future research builds upon this measure, perhaps also using the other scales, constructing a clearer understanding of the relationship among these variables. Some weaker coefficient alphas can also be addressed via further work with the Measure of Rhetorical Beliefs, including possibly adding additional items to the “Audience” subscale.

These caveats aside, it is worth understanding and accounting for the more “expert-like” composing practices and perspectives of the engineering and business majors in this study. The differences may be related to the nature of assignments in their courses. Engineering and business majors are often assigned writing that involves a specific format and audience. Lab reports, business proposals, and service audits, for example, have specific features and audiences often made clear to students, with some writing assignments calling for students to coauthor a single report (Nesi and Gardner, 2012). Whether different types of assignments and collaborative writing opportunities may result in different student beliefs about writing is speculative within the scope of this study; it is an area for future research, perhaps using the Measure of Rhetorical Beliefs.

In addition to investigating contexts for rhetorical and epistemological growth, future studies should account for other variations among student groups besides just majors; controlling for variance due to verbal ACT or SAT scores may show whether the difference between majors remains even after considering verbal aptitude. For instance, Schommer’s study (1993) first identified differences between the epistemological beliefs of students from two different majors; however, once other variables, including parental education level and students’ verbal aptitude were accounted for, these differences were no longer significant. Future study designs with these scales should account for such covariates.

The field of writing assessment would benefit from further research into the links between students’ epistemological beliefs and the quality of their writing (e.g. Boscolo, et al., 2007; Charney et al., 1995; Hays, Brandt, and Chantry, 1988; Hays & Brandt, 1992; Kardash & Scholes, 1996; Mason & Boscolo, 2004; Mateos, Cuevas, Martin, Martin, Echeita, and Luna, 2011; Neely, 2014; Schommer, 1993) to include whether students’ rhetorical writing beliefs relate to writing performance. In addition to further use of the Measure of Rhetorical Beliefs, other new scales assessing students’ task-specific writing beliefs, such as that by Sanders-Reio, Alexander, Reio, and Newman (2014) should be included along with epistemological beliefs and written artifacts. Such questionnaires may also be useful in longitudinal studies as a means to track students’ shifting beliefs and practices across time.

Author Note

Michelle Neely is an Assistant Professor, Attendant Rank, in the English Department at the University of Colorado at Colorado Springs. There she also directs the Writing across the Curriculum and the Writing Portfolio Assessment programs. Her interests include student and faculty epistemological beliefs and the role of writing in assessing general education.


References
Alexander, P. A., Schallert, D. L., & Hare, V. C. (1991). Coming to terms: How researchers in learning and literacy talk about knowledge. Review of Educational Research, 61(3), 315-343.

Allen, M. J., & Yen, W. M. (2001). Introduction to measurement theory. Long Grove, IL: Waveland Press.

Bartholomae, D. (1986). Inventing the university. Journal of Basic Writing, 5(1), 4-23.

Bawarshi, A. (2003). Genre and the invention of the writer: Reconsidering the place of invention in composition. University Press of Colorado.

Beaufort, A. (2007). College writing and beyond: A new framework for university writing instruction. Logan, UT: Utah State University Press.

Bereiter, C., & Scardamalia, M. (1987). The psychology of written composition. Hillsdale, NJ: Lawrence Erlbaum Associates.

Berkenkotter, C., Huckin, T. N., & Ackerman, J. (1988). Conventions, conversations, and the writer: Case study of a student in a rhetoric Ph.D. program. Research in the Teaching of English, 22, 9-44.

Bizzell, P. (1992). Academic discourse and critical consciousness. Pittsburgh, PA: University of Pittsburgh Press.

Boscolo, P., Arfé, B., & Quarisa, M. (2007). Improving the quality of students’ academic writing: an intervention study. Studies in Higher Education, 32(4), 419-438.

Cattell, R. B. (1966). The scree test for the number of factors. Multivariate Behavioral Research, 1(2), 245-276.

Charlton, M. (2007). That’s just a story: Academic genres and teaching anecdotes in writing across the curriculum projects. The WAC Journal, 18, 19-29.

Charney, D., Newman, J. H., & Palmquist, M. (1995). “I’m just no good at writing”: Epistemological style and attitudes toward writing. Written Communication, 12, 298-329.

Clark, L. A., & Watson, D. (1995). Constructing validity: Basic issues in objective scale development. Psychological Assessment, 7(3), 309-319.

Comrey, A. L., & Lee, H. B. (1992). A first course in factor analysis (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

Creswell, J. W., Plano Clark, V. L., Gutmann, M. L., & Hanson, W. E. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 209-240). Thousand Oaks, CA: Sage.

Daly, J. A., & Wilson, D. A. (1983). Writing apprehension, self-esteem, and personality. Research in the Teaching of English, 17, 327-342.

Davis, R., & Shadle, M. (2000). “Building a mystery:” Alternative research writing and the academic art of seeking. College Composition and Communication, 51(3), 417-446.

DeVellis, R. F. (2003). Scale development: Theory and applications. Thousand Oaks, CA: Sage.

Fives, H., & Buehl, M. M. (2012). Spring cleaning for the “messy” construct of teachers’ beliefs: What are they? Which have been examined? What can they tell us? In K. R. Harris, S. Graham, & T. Urdan (Eds.), APA Educational Psychology Handbook (Vol. 2, pp. 471-499). Washington, DC: American Psychological Association.

Flower, L. S., & Hayes, J. R. (1977). Problem-solving strategies and the writing process. College English, 39(4), 449-461.

Flower, L., & Hayes, J. R. (1984). Images, plans, and prose: The representation of meaning in writing. Written Communication, 1(1), 120-160.

Fowler, F. J. (2014). Survey research methods (Vol. 1). Thousand Oaks, CA: Sage.

Gee, J. (2012). Social linguistics and literacies: Ideology in discourses. New York: Routledge.

Haas, C. (1994). Learning to read biology: One student’s rhetorical development in college. Written Communication, 11(1), 43-84.

Haas, C., & Flower, L. (1988). Rhetorical reading strategies and the construction of meaning. College Composition and Communication, 39(2), 167-183.

Haswell, R. (2001). Validation: Part of the circle. In R. Haswell (Ed.), Beyond outcomes: Assessment and instruction within a university writing program. Westport, CT: Ablex.

Hays, J. N., Brandt, K. M., & Chantry, K. H. (1988). The impact of friendly and hostile audiences on the argumentative writing of high school and college students. Research in the Teaching of English, 22, 391-416.

Hays, J. N., & Brandt, K. S. (1992). Socio-cognitive development and students' performance on audience-centered argumentative writing. In M. Secor & D. Charney (Eds.), Constructing rhetorical education (pp. 202-229). Carbondale, IL: Southern Illinois University Press.

Herrington, A., & Curtis, M. (2000). Persons in process: Four stories of writing and personal development in college. Urbana, IL: National Council of Teachers of English.

Herrington, A., & Moran, C. (2005). Genre across the curriculum. Logan, UT: Utah State University Press.

Hillocks, G. (1999). Ways of thinking, ways of teaching. New York: Teachers College Press.

Hofer, B. K. (2000). Dimensionality and disciplinary differences in personal epistemology. Contemporary Educational Psychology, 25(4), 378-405.

Jehng, J. C. J., Johnson, S. D., & Anderson, R. C. (1993). Schooling and students’ epistemological beliefs about learning. Contemporary Educational Psychology, 18(1), 23-35.

Johnson, J.P., & Krase, E. (2012). Articulating claims and presenting evidence: A study of twelve student writers from first-year composition to writing across the curriculum. The WAC Journal, 23, 31-48.

Kardash, C. M., & Scholes, R. J. (1996). Effects of preexisting beliefs, epistemological beliefs, and need for cognition on interpretation of controversial issues. Journal of Educational Psychology, 88(2), 260-271.

Kienhues, D., Bromme, R., & Stahl, E. (2008). Changing epistemological beliefs: The unexpected impact of a short‐term intervention. British Journal of Educational Psychology, 78(4), 545-565.

King, P. M., & Kitchener, K. S. (1994). Developing reflective judgment. San Francisco, CA: Jossey-Bass.

Kuhn, D., Cheney, R., & Weinstock, M. (2000). The development of epistemological understanding. Cognitive Development, 15(3), 309-328.

Larson, R. L. (1982). The “research paper” in the writing course: A non-form of writing. College English, 44(8), 811-816.

Lavelle, E. (1993). Development and validation of an inventory to assess processes in college composition. British Journal of Educational Psychology, 63, 489-499.

Lavelle, E., & Guarino, A. J. (2003). A multidimensional approach to understanding college writing processes. Educational Psychology, 23(3), 295-305.

Lavelle, E., & Zuercher, N. (2001). The writing approaches of university students. Higher Education, 42, 373-391.

Mason, L., & Boscolo, P. (2004). Role of epistemological understanding and interest in interpreting a controversy and in topic-specific belief change. Contemporary Educational Psychology, 29(2), 103-128.

Mateos, M., Cuevas, I., Martin, E., Martin, A., Echeita, G., & Luna, M. (2011). Reading to write an argumentation: The role of epistemological, reading, and writing beliefs. Journal of Research in Reading, 1, 1-17.

McCarthy, L. P. (1987). A stranger in strange lands: A college student writing across the curriculum. Research in the Teaching of English, 21(3), 233-265.

Neely, M.E. (2010). “My audience is all readers, ages 18-90:” Named audiences & writing quality in first-year composition. Paper presented at the annual meeting of the American Educational Research Association, Denver, CO.

Neely, M.E. (2012). Developing a measure of undergraduates’ rhetorical beliefs. Paper presented at the annual meeting of the American Educational Research Association, Vancouver.

Neely, M. E. (2014). Epistemological and writing beliefs in a first-year college writing course: Exploring shifts across a semester and relationships with argument quality. Journal of Writing Research, 6(2).

Nesi, H., & Gardner, S. (2012). Genres across the disciplines: Student writing in higher education. Cambridge: Cambridge University Press.

Paulsen, M. B., & Wells, C. T. (1998). Domain differences in the epistemological beliefs of college students. Research in Higher Education, 39(4), 365-384.

Penrose, A. M., & Geisler, C. (1994). Reading and writing without authority. College Composition and Communication, 45(4), 505-520.

Perl, S. (1979). The composing processes of unskilled college writers. Research in the Teaching of English, 13(4), 317-336.

Perry, W. G. Jr. (1998). Forms of ethical and intellectual development in the college years: A scheme. San Francisco: Jossey-Bass. (Originally published in 1970. New York: Holt, Rinehart & Winston).

Pintrich, P. R. (2002). The role of metacognitive knowledge in learning, teaching, and assessing. Theory into Practice, 41(4), 219-225.

Porter, J. E. (1986). Intertextuality and the discourse community. Rhetoric Review, 5(1), 34-47.

Roozen, K. (2010). Tracing trajectories of practice: Repurposing in one student’s developing disciplinary writing processes. Written Communication, 27(3), 318-354.

Rose, M. (1980). Rigid rules, inflexible plans, and the stifling of language: A cognitivist analysis of writer's block. College Composition and Communication, 31(4), 389-401.

Sanders-Reio, J., Alexander, P. A., Reio Jr, T. G., & Newman, I. (2014). Do students’ beliefs about writing relate to their writing self-efficacy, apprehension, and performance? Learning and Instruction, 33, 1-11.

Schommer, M. (1990). Effects of beliefs about the nature of knowledge on comprehension. Journal of Educational Psychology, 82(3), 498-504.

Schommer, M. (1993). Comparisons of beliefs about the nature of knowledge and learning among postsecondary students. Research in Higher Education, 34, 355-370.

Schommer, M., Crouse, A., & Rhodes, N. (1992). Epistemological beliefs and mathematical text comprehension: Believing it is simple does not make it so. Journal of Educational Psychology, 84, 435-443.

Schommer, M., & Dunnell, P. A. (1994). A comparison of epistemological beliefs between gifted and non-gifted high school students. Roeper Review, 16, 207-210.

Schommer-Aikins, M. (2002). An evolving theoretical framework for an epistemological belief system. In B. Hofer & P. Pintrich (Eds.), Personal Epistemology: The Psychology of Belief about Knowledge and Knowing (pp. 103-118). Mahwah, NJ: Lawrence Erlbaum Associates.

Schommer-Aikins, M. (2004). Explaining the epistemological belief system: Introducing the embedded systemic model and coordinated research approach. Educational Psychologist, 39, 19-29.

Schommer-Aikins, M., Duell, O. K., & Barker, S. (2003). Epistemological beliefs across domains using Biglan's classification of academic disciplines. Research in Higher Education, 44(3), 347-366.

Schommer-Aikins, M., & Easter, M. (2006). Ways of knowing and epistemological beliefs: Combined effect on academic performance. Educational Psychology, 26, 411-423.

Schommer-Aikins, M., & Hutter, R. (2002). Epistemological beliefs and thinking about everyday controversial issues. The Journal of Psychology, 136, 5-20.

Schraw, G., Bendixen, L. D., & Dunkle, M. E. (2002). Development and validation of the Epistemic Belief Inventory (EBI). In B. Hofer & P. R. Pintrich (Eds.), Personal Epistemology (pp. 231-260). Mahwah, NJ: Lawrence Erlbaum Associates.

Sommers, N. (1980). Revision strategies of student writers and experienced adult writers. College Composition and Communication, 31(4), 378-388.

Spector, P. E. (1992). Summated rating scale construction: An introduction. Quantitative Applications in the Social Sciences series. Newbury Park, CA: Sage.

Swales, J. (1990). Genre analysis: English in academic and research settings. Cambridge: Cambridge University Press.

Trautwein, U., & Lüdtke, O. (2007). Epistemological beliefs, school achievement, and college major: A large-scale longitudinal study on the impact of certainty beliefs. Contemporary Educational Psychology, 32(3), 348-366.

Valanides, N., & Angeli, C. (2005). Effects of instruction on changes in epistemological beliefs. Contemporary Educational Psychology, 30(3), 314-330.

Vogt, D. S., King, D. W., & King, L. A. (2004). Focus groups in psychological assessment: Enhancing content validity by consulting members of the target population. Psychological Assessment, 16, 231-243.

White, M. J., & Bruning, R. (2005). Implicit writing beliefs and their relationship to writing quality. Contemporary Educational Psychology, 30, 166-190. 


Appendix A

Focus group discussion guide

Semi-structured discussion questions were left vague and open-ended in order to generate conversations among focus group participants.

  • When was the last time you wrote for school? What did you write? What steps did you take?
  • When was the last time you wrote outside of school? What did you write? What steps did you take?
  • How is school writing different from non-school writing?
  • Is posting on Facebook “writing”? How so?
  • Tell me about a time when your writing has been misunderstood.
  • Tell me about the parts of writing that are easy for you.
  • Tell me about the parts of writing that are hard for you.
  • What do you notice about your friends’ writing?
  • What types of things do you read for school?
  • What types of things do you read outside of school?
  • Do you read the things for school differently than you read the non-school stuff?
  • Have you ever read something and then changed your mind about an issue? Tell me about that.
  • Have you ever written something for school and then changed your mind about an issue? Tell me about that. 


Appendix B

Pilot items generated from focus groups

Rhetorical Beliefs Scale

The following questions are about the way you complete writing assignments for school:

  • When I write a paper, I try to imagine who will be reading it.
  • When I write a paper or essay, I think about readers who might disagree with my opinion.
  • When I write a paper or essay, I try to stick only to my opinion and not present too many sides of the issue.
  • When I write a paper or essay, I try to show multiple sides of the issue.
  • Before I start writing, I’ve made up my mind about the issue.
  • It’s important to know exactly what you’re going to say before you start writing a paper or essay.
  • I’ve had the experience of changing my mind about an issue after writing a paper on it.
  • Sometimes, after writing a paper about a topic, I find that my opinion of the issue has changed.
  • When I write a paper or essay, I think about the professor who will be reading it.
  • Writing in college is mainly about reporting what authorities think about issues.
  • All issues have a right and a wrong side.
  • There are two sides to every issue: right and wrong.
  • Issues and controversies have multiple sides.
  • There are many legitimate sides to an argument.
  • When I’m writing, it’s good to explain the limitations of my ideas in my paper.

The following questions are about how you write to your friends:

  • When I write a message to a friend, I imagine how they will read/interpret it.
  • When I write a message to a friend (email, Facebook, etc.), I reread it before I post or send it.
  • When I write a message to a friend, it’s important for me to know exactly what I’m going to say before I start to write.
  • When I write a message to friends, I think about their response to it.

The following questions are about what you expect when you’re reading:

  • Writers who include opinions that disagree with their own weaken their argument.
  • Writers who present too many sides of an issue are wishy-washy.
  • When writers explain multiple sides of an issue, it looks like they can’t decide what they believe.
  • Writers who present multiple perspectives on an issue build my trust in them.
  • Writers who offer several sides about an issue appear indecisive.
  • In order to persuade me, writers should stick to one side of the issue.
  • It’s annoying to read an article that presents too many sides of a topic.
  • If a writer changes my mind about an issue, it’s because he/she manipulated my thinking.
  • My opinions about many issues are set.
  • If two authors write differing opinions on an issue, one of them must be wrong.
  • Writers’ main concerns should be with persuading readers.
  • Writers should not worry about accuracy if they can convince readers to change their minds.
  • If a writer is really passionate, he/she will likely persuade me.
  • If a writer has good facts, then he/she will likely persuade me.
  • If a writer seems dedicated, but doesn’t have a lot of facts, he/she will persuade me.
  • If a writer has a lot of good facts, but doesn’t seem passionate about the issue, then he/she will persuade me.
  • If a writer expresses doubt in his/her own argument, I’m less likely to believe them.
  • It is more important for a speaker to be passionate than to have facts and evidence.
  • If a speaker is confident, I’m more likely to believe their argument.


Appendix C

Epistemological Beliefs Inventory

Items and Factor loadings

Factor 1: Simple knowledge

Factor 2: Omniscient authority

Factor 3: Innate ability to learn

Factor 4: Stability/structure of knowledge   

Item (factor: loading)

Most things worth knowing are easy to understand.* (Factor 1: .37; Factor 4: .38)
What is true is a matter of opinion. (Factor 4: -.42)
Students who learn things quickly are the most successful. (Factor 3: .55)
People should always obey the law. (Factor 2: .61)
People's intellectual ability is fixed at birth. (Factor 3: .67)
Absolute moral truth does exist. (Factor 2: .46)
Parents should teach their children all there is to know about life.* (Factor 2: .21; Factor 4: .26)
Really smart students don't have to work as hard to do well in school. (Factor 3: .51)
If a person tries too hard to understand a problem, they will most likely end up being confused. (Factor 1: .52)
Too many theories just complicate things. (Factor 1: .52)
The best ideas are often the most simple.* (Factor 1: .46; Factor 4: .43)
Instructors should focus on facts instead of theories. (Factor 1: .50)
Some people are born with special talents and gifts. (Factor 3: .50)
How well you do in school depends on how smart you are. (Factor 3: .61)
If you don't learn something quickly, you won't ever learn it. (Factor 3: .46)
Some people have a knack for learning and others don't. (Factor 1: .54)
Things are simpler than most professors would have you believe. (Factor 1: .66)
If two people are arguing about something, at least one of them must be wrong. (Factor 1: .45)
Children should be allowed to question their parents' authority. (Factor 2: -.66)
If you haven't understood a chapter the first time through, going back over it won't help. (Factor 1: .23)
Science is easy to understand because it contains so many facts. (Factor 4: .67)
The more you know about a topic, the more there is to know. (Factor 4: .47)
What is true today will be true tomorrow. (Factor 4: .50)
When someone in authority tells me to do something, I usually do it. (Factor 2: .65)
People shouldn't question authority. (Factor 2: .75)
Working on a problem with no solution is a waste of time. (Factor 3: .50)
Sometimes there are no answers to life's big problems.* (Factor 1: .31; Factor 4: -.37)
It's annoying to listen to a lecturer who can't seem to make up their mind about the topic. (Factor 1: .43)
If professors stuck to the facts and theorized less, I'd get more out of college. (Factor 1: .51)
Once you have the facts, most questions have only one right answer. (Factor 2: .51)

*Item removed from subsequent analyses due to cross-loading.




Appendix D

Inventory of Process in College Composition Questionnaire

Items and Factor loadings

Factor 1: Elaborationist

Factor 2: Self efficacy

Factor 3: Spontaneous/Impulsive

Factor 4: Procedure/Correctness    

Item (factor: loading)

Writing is like a journey. (Factor 1: .86)
Writing makes me feel good. (Factor 1: .74)
Writing helps me keep information organized in my mind. (Factor 1: .73)
I sometimes get sudden inspirations while writing. (Factor 1: .66)
Writing is symbolic. (Factor 1: .63)
Writing reminds me of other things that I do. (Factor 1: .62)
When I write a paper, I often get ideas for other papers. (Factor 1: .60)
At times, my writing has given me deep personal satisfaction. (Factor 1: .59)
I use written assignments as learning experiences. (Factor 1: .59)
I imagine the reaction that my readers might have. (Factor 1: .59)
It's important to me to like what I've written. (Factor 1: .58)
The main reason for writing a paper is to get a good grade on it. (Factor 1: -.57)
I compare and contrast ideas to make my writing clear. (Factor 1: .50)
I try to entertain, inform, or impress my audience.* (Factor 1: .48; Factor 2: .41)
It is important to me to like what I have written. (Factor 1: .46; Factor 2: .40)
I think about how I come across in my writing. (Factor 1: .45; Factor 2: .33)
My essay or paper often goes beyond the specifications of the assignment. (Factor 1: .44; Factor 2: .43)
I put a lot of myself in my writing. (Factor 1: .44)
I re-examine and restate my thoughts in revision. (Factor 1: .43; Factor 2: .37; Factor 3: -.25)
I visualize what I am writing about.* (Factor 1: .35; Factor 2: .28; Factor 3: -.25)
When given an assignment calling for an argument or viewpoint, I immediately know which side I would take. (Factor 1: -.33)
I can write simple, compound, and complex sentences. (Factor 2: .65)
I do well on essay tests. (Factor 2: .63)
I can usually find one main sentence that tells the theme of my paper. (Factor 2: .60)
I expect good grades on papers. (Factor 2: .59)
In my writing, I use some ideas to support other, larger ideas. (Factor 2: .56)
I can write a term paper. (Factor 2: .56)
I tend to give a lot of description and detail. (Factor 2: .50)
My writing rarely expresses what I think. (Factor 2: -.49)
I closely examine what the essay calls for. (Factor 2: .47)
When writing an essay, I stick to the rules. (Factor 2: .42)
Writing is always a slow process.* (Factor 2: -.42; Factor 4: .40)
I often use analogy and metaphor in my writing. (Factor 2: .41)
I use a lot of definitions and examples to make things clear. (Factor 2: .37)
I keep my theme or topic clearly in my mind as I write.* (Factor 1: .27; Factor 2: .32; Factor 4: .26)
My writing "just happens" with little planning or preparation. (Factor 3: .79)
Often my first draft is the finished product. (Factor 3: .77)
I often do writing assignments at the last minute. (Factor 3: .77)
I set aside specific time to do my writing assignments. (Factor 3: -.59)
I never think about how I approach writing. (Factor 3: .58)
Revision is a one-time process at the end. (Factor 3: .47)
I plan, write, and revise all at the same time. (Factor 3: .45)
I plan out my writing and stick to the plan. (Factor 4: .65)
I often start with a fairly detailed outline. (Factor 4: .56)
I worry about how much time my paper will take to write. (Factor 4: .44)
Studying grammar and punctuation would greatly improve my writing. (Factor 4: .44)
I like to work in small groups to discuss ideas or revisions.* (Factor 1: .26; Factor 4: .32)
The most important thing in writing is observing the rules of grammar and punctuation. (Factor 4: .35)

*Item removed from subsequent analyses due to cross-loading.



All scale items are provided in these appendices with the hope that other researchers and faculty may use them in their own work with students. An online survey site is also available for those who wish to administer items to groups of students. Please contact the author [mneely2@uccs.edu] for more details.