In the past decade, open-access institutions have faced an onslaught of legislative and administrative initiatives aimed at reducing or eliminating developmental coursework (Hassel et al. 2015; Whinnery and Pompelia 2018), including moving underprepared students to credit-bearing composition courses with varied forms of corequisite support, integrating reading and writing courses as an alternative to standalone reading courses, and eliminating developmental education entirely. These initiatives stem from concerns about the impact of developmental courses on time to degree completion, but they are also strongly rooted in higher education austerity measures. An emphasis on accelerating students’ development as college readers and writers has fundamentally changed (and continues to reshape) writing program curricula and instruction at an ever-increasing number of community colleges and some less selective universities. Until recently, most students with low standardized test scores would have been required to take multiple semesters of developmental reading and writing coursework at a typical community college.
These changes are unfolding at the same time that students’ literacy experiences both inside and outside of school are rapidly evolving through significant social changes connected to use of new technologies and global communication (National Council of Teachers of English 2013; Brandt 2015). Drawing from Deborah Brandt's concept of “literacy accumulation,” Daniel Keller (2014: 5) asserted that twenty-first-century reading is rooted in “a culture of acceleration” in which “literacy is tied to educational, business, social, and technological contexts that value speed and increasingly enable and promote faster ways of reading and writing.” He suggested that literacy acceleration happens both because “literacy technologies and practices tend toward speed” and “literacies can accelerate: appearing, changing, and merging with other literacies, or fading at a faster rate” (7). It is not surprising, then, that even the least experienced students are now expected to begin college-level reading and writing at a faster rate and earlier in their pathways in higher education while simultaneously using literacy technologies for reading in new ways. The expectation that students (even those who are still developing comprehension skills) should accumulate literacy at a faster rate with less time in English courses complicates how instructors help students develop information literacy and use critical reading to assess the credibility of sources within a rhetorical context and identify deliberate “fake news.”
Our article focuses on a critical question that faces instructors working with underprepared learners: What are the implications for teaching college writing in an era of accelerated literacies and fake news when large numbers of students enter college without the reading comprehension skills required for laying a foundation for critical reading and analyzing sources? We draw from a study that traced the literacy journeys of fifteen students whose placement assessments (based on multiple measures of college readiness) suggested that they were underprepared for college reading. The students in our study represent a large group of readers who typically start their postsecondary education at an open-admissions institution. Such students are more likely than their peers to struggle not just with making progress through writing program sequences but also with successfully moving into reading-intensive courses across the curriculum that require them to analyze rhetorically complex digital and print texts (see Sternglass 2009; Tinberg and Nadeau 2010).
We situate the findings of our study within current scholarship on critical reading and information literacy, tracing the proficiency that our participating students demonstrated in three areas of critical reading and writing from sources in increasing levels of sophistication: using ideas from sources, evaluating the credibility of sources, and engaging critically with sources. We also identify strategies that helped students develop complex critical reading and information literacy skills and explain implications for teaching situated within a “culture of acceleration.”
Defining Critical Reading and Information Literacy
Alice S. Horning, Deborah-Lee Gollnitz, and Cynthia Haller (2017: 7) define college-level academic reading as “a complex, recursive process in which readers actively and critically understand and create meaning through connections to texts.” This definition can be productively read against the six frames presented in the Association of College and Research Libraries’ (2016) “Framework for Information Literacy for Higher Education,” whose definition of the concept of information literacy—essentially the ability to critically engage with and understand sources—is increasingly crucial as part of college writing instruction: “Information literacy is the set of integrated abilities encompassing the reflective discovery of information, the understanding of how information is produced and valued, and the use of information in creating new knowledge and participating ethically in communities of learning.” Our work is influenced by scholarship on how college writers find, analyze, and integrate secondary sources into their own writing (Kantz 1990; Greene 2001; Association of College and Research Libraries 2016; Jamieson 2017). Sandra Jamieson (2017) and the Association of College and Research Libraries (2016) both observed that students are more likely to use a checklist mode of information gathering and to select sentences from sources rather than engage with the complete content of a source. Therefore, structured approaches to critical reading that support students in building their ability to read rhetorically as well as critically (with attention to how texts create meaning together) are even more important for information literacy than strategies for gathering sources (Carillo 2015; Sullivan 2017).
What our project adds to conversations about pedagogies supporting critical reading and information literacy is a longitudinal empirical examination of the literacy development of students who are underrepresented in the literature, tracing their work as readers and writers across multiple semesters. Our study supplements prior scholarship such as the studies emerging from the Citation Project (which examine only student papers, unlinked to specific student information) or work such as Ellen Carillo's (2015), which pairs historical understanding with a focus on qualitative data from writing instructors. Other sources draw from interviews with students about their reading practices (Gogan 2017). Our study focuses not just on papers but on writers, readers, and their literacy experiences and academic outcomes over the first two college years.
For over a decade we conducted a series of related studies at a two-year, open-admissions liberal arts institution that traced students’ development as readers and writers over two or more semesters, beginning with their initial placements into writing and reading courses, analyzing their work throughout their first college year and for some students into their second or third college years as they moved toward completing required writing courses (Hassel and Giordano 2009, 2015; Giordano and Hassel 2016). To gain a better sense of the literacy development of students who are most affected by college literacy acceleration and rapidly changing approaches to developmental reading and writing curricula and instruction, we designed a study that focused on students who started college in a developmental reading and/or writing course and eventually completed a university transfer-level writing course.
Our study, approved by the University of Wisconsin College institutional review board, was part of a systematic assessment of changes to our developmental (non-credit-bearing) English and writing courses designed to support students’ transitions to critical reading and source-based writing across every level of our writing program. Previously, developmental reading and writing instructors designed their own courses, which typically focused on skill and drill activities. Before our study began, our English department adopted new course guidelines and learning outcomes that focused on introducing students to reading comprehension strategies, critical analysis, and writing about reading both in writing and in reading courses.
Our project focused on two research questions related to assessing changes to our program:
In what ways does a curricular and pedagogical approach in developmental courses focused on critical reading, intellectual inquiry, and analysis help significantly underprepared student populations develop the skills needed to move from non-degree-credit to degree-credit composition?
What are the barriers for some students to achieving first-year composition learning outcomes necessary for making a successful transition to college-level writing?
In particular, we were interested in tracing students’ literacy development by analyzing their writing about reading, beginning with their initial experiences in a developmental writing and/or reading course across two or more semesters and ending when they completed the research-based writing course that required them to conduct independent research using library databases and then write academic texts that analyze information and ideas from their research.
Profile of Participating Students
All campus students in a developmental reading or writing course were invited to participate in the study through their instructors at the start of the semester. Study participants had ACT reading and/or English scores of 15 or lower, which are much lower than the college readiness benchmarks of 18 for English and 22 for reading. Nationally, students in this score range fall at or below the 22nd percentile on the reading test and the 30th percentile on the English test (ACT, Inc. 2019). We also included students with scores on a state placement exam that were slightly better than, or sometimes lower than, a score resulting from random guessing. On standardized tests, these students demonstrated basic comprehension skills (locating basic facts and identifying the topic of a paragraph) without demonstrating the more complex comprehension and critical reading skills required for first-year writing and other college courses (see ACT, Inc. 2017).
Thirty-nine students agreed to participate in the study. From the original cohort, fifteen students took the final course in our writing program within two years and provided us with writing; these fifteen included eight bilingual Hmong speakers who were educated in US high schools. Two participants began college in a credit-bearing first-year writing course with a developmental reading course and took only two writing courses. The other thirteen started in developmental writing, and we followed them into their second year as they worked toward completing their required writing coursework. The remaining students in the original cohort transferred after one year, withdrew from the institution, or took more than two years to complete first-year writing.
Process for Collecting Data and Student Writing
We focused on students’ experiences with learning outcomes for critical reading and writing from sources across our entire writing program rather than what happened with a particular class. We purposefully collected writing from students’ varied experiences with assignments in sections taught by different instructors. Instead of looking at a specific type of writing assignment, we analyzed student work for varied assignments that participants encountered based on how instructors chose to implement a standard set of curricular guidelines and learning outcomes for each course.
We collected writing either directly from students or from instructors who provided copies, along with the instructions for major assignments. We collected multiple pieces of evidence from each of our thirty-nine student participants over a two-year period, from their initial placements through the writing program sequence: developmental writing (which most participants took concurrently with developmental reading), introductory first-year writing (English 101), and a second research-based writing course that fulfilled state system degree requirements (English 102).
We started our research by examining traditional measures for college readiness based on each student's placement profile, including standardized test scores, a placement writing sample, high school grades and coursework, and a placement questionnaire with a self-assessment. In the first semester, participants also completed a survey about their educational backgrounds. For two academic years we collected and analyzed writing from each student for their college reading and writing courses. We also traced students’ overall college success outcomes.
For the fifteen students who finished the required research-based writing course within two years, we analyzed multiple pieces of writing they produced over time in different courses in our program. We analyzed writing from three or four courses for each participant, except for one student who submitted writing from just two courses. For each student, we analyzed at least four pieces of writing in each English course and the accompanying instructions, which varied based on the choices each instructor made about how to develop assignments around program guidelines. For some courses we looked at essays produced at different points in the semester. For portfolio-based writing or reading courses, we examined midterm and final portfolios that included both essays and reflective writing. We also analyzed reading journal assignments, informal pieces of writing, additional reflections, and low-stakes timed essay exams.
Methods for Analysis
As we traced the fifteen students’ development as college writers and readers, we analyzed how they integrated, responded to, and used shared course texts and research sources in their formal and informal writing assignments in relation to learning outcomes for our first-year writing program. We divided our analysis into three categories: using ideas and arguments from sources, evaluating the credibility of sources, and engaging critically with sources. We evaluated multiple pieces of writing for each student to determine whether they demonstrated a particular move in writing about reading in any of their submitted pieces and classified their work into one of three categories:
Did not demonstrate: We did not observe the move in any piece of writing submitted for the project.
Developing: The writer demonstrated the move at least once in a piece of writing but did not use the move consistently within the same paper and did not demonstrate it in subsequent writing assignments.
Proficient: The writer used the move multiple times and demonstrated it with consistency (but not perfection) in at least one piece of submitted writing from the first two college years. Students’ writing was also categorized as proficient if they demonstrated the skill consistently in one assignment or course but were not assigned source-based writing that required the move in later coursework.
We also identified the earliest points in the writing program when students fully demonstrated proficiency in a particular move for writing about reading. Throughout our analysis, we use the word move to indicate what students were doing in their writing, in contrast to skill, which suggests an ability. Although moves and skills are interrelated (a student can't make a move proficiently without reading and writing skills), some students may have been able to do a move that they did not do in the writing that we assessed.
Using Ideas and Arguments from Sources
We first looked at whether students distinguished between their own thinking and ideas from sources and then analyzed how they approached using authors’ arguments in selecting evidence in their writing. Table 1 indicates the number of students who demonstrated proficiency with a particular move for using sources and the first course in which they used the move in their submitted writing.
By the end of the research-based writing course (English 102), all fifteen students used transitions from their own thinking to ideas from sources (typically with signal phrases), but that was the only source-based writing move that all of the students made by the time they finished their writing coursework. Nine students referred directly to specific arguments or ideas from texts, while six discussed concepts from texts in a general way without engaging with specific ideas (apart from quotations). Seven students demonstrated proficiency in distinguishing between authors’ arguments and factual information in a developmental reading and/or introductory first-year writing course. The remaining students tended to treat all ideas from sources as facts and/or to plug source material into texts without doing anything as writers to indicate that they understood the difference between an argument and factual information, even in the final research-based writing course. Two students mostly limited their use of sources to making connections between information from sources and their own personal experiences rather than using learning from reading to support an argument or achieve another academic purpose.
Evaluating the Credibility of Sources
Predictably, students were less proficient in evaluating the credibility of sources than in incorporating evidence from texts into their writing. For us, credibility in first-year writing means that a source (a) comes from a library database, publication, or website that focuses on peer-reviewed or edited work; (b) is written by an author with expertise in the topic of the text; or (c) uses verifiable evidence (i.e., the text itself uses credible sources). The appropriateness of particular sources for college-level writing varies depending on the course, discipline, and requirements for a writing project, as well as the purpose for which it is used by the writer. In our writing program, students were expected to use library database sources for their final research projects in English 102 but not necessarily in English 101, and we considered those differences and the assignment instructions provided to us by the instructors of the courses as we looked at how students approached analyzing the credibility of sources.
We assessed the extent to which students demonstrated proficiency in evaluating the credibility of sources based on four criteria (table 2). None of the students wrote about independent research in first-semester developmental courses. When they transitioned to credit-bearing courses, fewer than half (six) used sources that were clearly relevant to their audience, purpose, research issue, and thesis in at least one formal writing assignment. Five of the students used sources inconsistently, with some irrelevant sources, and four students didn't show even a developing proficiency in selecting relevant material from credible sources to use in their writing. Only four students directly evaluated the credibility of sources they used in their writing, and three identified subjectivity or bias in at least one text.
Four students established the credibility of the sources used in their writing by identifying the author's expertise or relevant experience while also using credible sources that were relevant to the focus of their writing projects. Six students showed developing proficiency by using at least some sources written by authors with expertise in the subject of their essays without making any moves as writers to indicate that they could identify whether an author had relevant expertise. Some of these students introduced sources by identifying an author's background even when the sources came from blogs or other websites written by individuals who lacked professional expertise or relevant credentials for writing about the topic. They were able to use a surface-level writing skill for attribution without the accompanying critical reading of a text required for determining the credibility of a source.
Engaging Critically with Sources
We also assessed students’ development as college readers by exploring the extent to which they moved beyond simply using ideas from texts to analysis. We examined how students analyzed sources by starting with the basic move of making connections between a text and their own arguments and then looked at whether students critically analyzed authors’ arguments and/or brought together ideas from more than one text (table 3).
Nine students made connections between ideas from texts and their own arguments as writers, while others inserted source material into their writing without commenting on it and/or without discussing the relationship between the author's ideas and the points they were making as writers. Only five students responded to an author's arguments in a direct way with any type of analysis to indicate deeper critical reading (such as agreement, disagreement, evaluation, or interpretation). Only three brought together ideas from multiple texts or differentiated between the arguments and positions of two or more authors. Five students did not engage critically with texts in any writing assignment in any course, which means that we were not able to verify that by the end of their second college year they had transitioned beyond basic reporting on information from texts to writing that draws from critical reading.
The Surface of Source Integration
Study participants’ writing about reading over multiple semesters is evidence that students, as Rebecca Moore Howard, Tricia Serviss, and Tanya K. Rodrigue (2010) suggested, “write from sentences” while writing from sources. Sentence-level moves for using ideas from texts are only a starting point for transitioning to college-level critical reading and writing from sources. Introducing and citing sources in writing about reading can signal a developing but still very basic transition to evaluating sources. However, instructors can read this move as a proxy for more complex skills in the absence of a targeted assessment that can get at students’ deep reading and information literacy proficiency. All study participants developed a capacity for using sentence-level skills to write about texts, and most used attributive phrases to provide background information about authors or publication sources, but most did not use those phrases to identify credible sources. Some wrote error-free phrases and citations to introduce blogs or biased online sources without the accompanying critical reading and evaluation of texts that helps more experienced readers select evidence from credible sources.
Kha, a bilingual Hmong student, is just one example of several writers in our study who learned over time how to use the surface features of writing about reading without the accompanying critical analysis or evaluation of texts required for proficiency. Her low standardized test scores in reading were one indication of a need for reading support as she started college, even though she had earned As and Bs in high school English courses. She remained in good standing long enough to complete four semesters of college work but did not earn a C or higher in any of the reading-intensive courses she completed, and she withdrew from four courses.
In a reflection for her developmental reading course, she evaluated her preparation for college reading: “If I was to rate myself on my reading comprehension, I would definitely give myself a four [on a scale of 1 to 10]. I am a very weak reader; often I find myself re-reading sentences a couple of times before I finally understand the meaning of the sentence or passage.” However, she later identified fictional works that she enjoyed reading, noting that “if I am really into a book I understand what I am reading. However if it is something that does not interest me, I have a hard time understanding it.” Kha's stated enthusiasm for leisure reading may seem incongruent with her skill in academic reading. However, she struggled to complete assigned tasks in her developmental reading class that required analysis of nonfiction texts and instead focused on discussing content. Her final self-assessment indicates that her transition to college-level reading still focused on comprehension at the end of the first semester: “My development skills for reading have improved not as much as I would like it to improve but it has some improvements.” In our research and teaching in an open-access context, Kha is just one example of many students whose experience with and interest in fiction reading as adolescents did not translate into proficiency with critical reading of nonfiction texts in college, suggesting that the deeper reading required for critical literacy requires more than simply doing a lot of reading.
Kha's continued lack of complex engagement with sources led to a surprising (but not unique) finding: a final paper for English 101 on home cooking was recycled—almost verbatim—for English 102, even though her English 102 course required more sources and an annotated bibliography. Kha repurposed her paper in a way that did not demonstrate increased sophistication in research, source integration, or rhetorical reading. Emerging research findings from the Citation Project show that “in the area of traditional sources (books and journals) and in non-traditional sources (websites), first-year students are mostly able to identify, find, access, and cite sources in ways that would satisfy traditional bibliographic instruction” (Jamieson 2017: 127). However, by situating individual students’ critical reading and research practices across multiple semesters, we can learn more about the depth and degree to which students are moving toward proficiency in these skills. In this case, Kha met some benchmarks as they were assessed at individual points throughout her writing coursework, but her struggle to meet new and more rigorous learning outcomes was visible only through an analysis of multiple pieces of writing across more than one semester and course.
Like other researchers who have observed similar patterns in standalone paper analyses, we confirm that students who struggle with academic discourse make limited use of critical reading in their writing and in their work with research sources. Further, the experiences of students in our two-year study highlight just how challenging it is to move college writers without strong prior critical reading foundations toward the information literacy frames, even over multiple semesters of coursework. Without college-level critical reading and information literacy skills, students are more likely to struggle to adapt their literacy and reading practices to courses across the curriculum, many of which demand not just critical reading skills but discipline-specific reading skills. The foundations for these practices begin in the first year, not exclusively but heavily in first-year writing, which is often the only required course emphasizing academic literacy skills that nearly all students take.
Supporting Sophisticated Source Use: From Shared Texts to Independent Inquiry
A second, more encouraging theme in our participants’ experiences is that structured opportunities to read, analyze, discuss, and reflect on shared texts, credible research sources, and their own reading experiences put them on a trajectory to develop the college-level critical reading skills spelled out by scholars and the information literacy practices and dispositions that the Association of College and Research Libraries has endorsed. These students, who began college without meeting even minimal reading readiness benchmarks, showed clear progress toward evaluating sources, but that progress required scaffolded, sequenced, and sustained experiences with integrating and then critically analyzing texts.
Two participants illustrated this progress. Callie was one of two participants who started in credit-bearing writing based on her high school grades while also enrolling in a reading course based on low standardized test scores. In a final reading course reflection, Callie assessed her own incomplete readiness for college reading, describing “a struggle for me in high school, grasping the main point of what I read,” and later reflected on her high school experiences: “I also, in high school didn't have to read a textbook to find the main points and summarize what I read.” Throughout the reading course, she was able to articulate specific reading strategies and goals: “Get better at textbook reading and get faster with it. Second, I want to get better at finding the main point and key concepts because it is going to help me once I transfer. . . . I plan on accomplishing this by doing more textbook reading in Bio and Sociology next semester.” She pointed to specific experiences in the English 101 course in which she was concurrently enrolled, explaining, “In my English 101 class, we are going rather slowly through the book Into the Wild by Jon Krakauer, which is nice because it is allowing me to slowly progress into college-level reading of non-fiction books.” Her instructor for that course confirmed in an interview that he saw her becoming “a lot more sophisticated in understanding what she's doing with the sources and using them in her arguments.” Unlike some students in her cohort, by English 102 she was able to locate sources in the library database and integrate them with a level of emerging proficiency.
Thai, who identified as a bilingual speaker of Hmong and English, also showed tremendous growth over his writing and reading coursework. However, he ultimately earned a C-minus in his English 102 course (a good illustration of how course grades are imperfect barometers of success in assessing program effectiveness). With an ACT score of 11 in reading and high school grades in English ranging from Bs to Ds, Thai's placement profile suggested a student in need of support to develop college-level reading skills, and he took both developmental reading and writing in his first semester. His progress over time proved substantive; by English 102 he had some of the most advanced critical reading and information literacy skills of the participants in the cohort. We saw evidence in Thai's early college writing that he was working toward establishing source credibility, synthesizing authors’ perspectives, and evaluating credibility. In English 101 he worked to address ethos but was still unpacking exactly how source types and author credentials are interrelated yet separate ways of establishing the credibility and use value of a source: “Some forms of ethos, was quoting from a physicist and medicinal professor of UW-Madison, Prof. Alta Charo. Even though the data was from a blog/org.com website, I believe Charo knew a whole lot of what she was informing her readers about.” By English 102 Thai was locating, citing, and interacting with scholarly sources. However, even though Thai demonstrated significant development in critical reading and in the rhetorical moves required for writing about reading, his English grades were unexpectedly low. As a second-language writer, he faced challenges in earning high grades in at least one writing class where the instructor had very explicitly stated grading standards around “grammar, mechanics, spelling, completeness, attention to detail, proper format.”
Implications for Instruction and Assessment
The students in our study illustrate a crucial concept from the College Reading and Learning Association white paper “The Terrain of College Developmental Reading” (Holschuh and Paulson 2013): proficiency in reading doesn't automatically evolve throughout a student's education, and earlier progress in reading doesn't necessarily translate into more advanced reading in the absence of ongoing literacy instruction. Fostering students’ information literacy development in an open-admissions English classroom and teaching readers to critically engage with texts in an era of accelerating literacies require intensive reading support and substantial, repeated experiences with writing about reading over multiple semesters and years. Despite the national push for literacy acceleration, a single first-year writing course is insufficient for helping underprepared college readers achieve the advanced critical reading skills required for evaluating the credibility of sources and evidence.
Students in developmental reading courses in our study demonstrated sometimes surprising growth as both readers and writers in their first semester when instructors organized a course around multiple pieces of informal and formal writing about reading, accompanied by portfolios that required them to consistently and substantially document their development as college readers and that provided ongoing instructor feedback. Students were more likely to move beyond surface-level use of texts and become proficient in critically evaluating and analyzing sources after intensive immersion in college reading. That immersion involved repeated assignments that asked them to write about reading: reading journal assignments with varying critical reading purposes, essay assignments asking them to engage with key issues from shared course texts, and informal reflections about their reading practices and literacy development over time. We saw similar gains for students in portfolio-based writing courses that emphasized critical reading and writing from sources.
Moreover, we observed more growth in critical reading for students who were asked to do several different kinds of reading and writing activities with shared course texts before engaging in research-based writing, especially if writing about reading was an ongoing part of their literacy experiences across two or three semesters. For example, we identified students who were analyzing and situating sources rhetorically in their developmental reading classes as first-semester students (primarily because they were required to), but their subsequent writing courses did not demand an equivalent level of source engagement or practice with critical reading. Those courses focused on writing techniques, genres, and practices without an accompanying emphasis on gathering, critically reading, and evaluating sources. As a result, some students with early gains in developmental reading demonstrated less proficiency later in the writing program than students in course sections that emphasized critical reading and writing from sources.
When given the opportunity to use library research databases, students benefited most when assignment instructions set clear expectations about the use of scholarly articles and when feedback and assessment of their work held them accountable for those expectations. Across three semesters, a surprising number of students in our study only ever used informational websites in their writing projects, which, while consistent with prior research, ran counter to the stated focus of our writing program curriculum and the learning outcomes for the courses. Given the increasing challenges that students, instructors, and the public face around information credibility and accuracy, helping students develop the critical capacity for analyzing informational websites is just as important as teaching them to find and deeply read scholarly sources. However, comprehending and critically reading both types of sources require structured opportunities within an academic context (for examples, see Anson 2017). As our analysis shows, assessments of student writing that do not account for evaluating the credibility of sources, or that privilege the demonstration of sentence-level writing skills, tended to misrepresent students’ information literacy and their abilities to read critically.
The National Council of Teachers of English (2016) report “Professional Knowledge for the Teaching of Writing” includes the concept that “writing and reading are related” as one of the “professional principles that guide effective teaching” at all levels of writing instruction. However, the experiences of student participants in our study illustrate that simply assigning students to write about reading or complete source-based research projects is insufficient to help them develop the critical reading skills required for information literacy in an era of fake news and accelerated use of literacy technologies, especially in open-admissions writing programs. Though the instructors in our study regularly assigned reading for varying audiences and purposes, the writing about texts, particularly in the degree-credit courses, rarely asked students to enact or make visible their reading practices. The standard “composition research paper” assignment prevailed in the courses taught by our participating instructors; as Emily Isaacs's (2016) study shows, 80.2 percent of the 106 programs she surveyed required a research paper, primarily described as “an essay which includes student-selected secondary sources” (99). But this assignment did not require students to demonstrate critical reading knowledge or practices. Our finer-grained analysis of student writing suggests that merely asserting that reading and writing are related, or assigning independent research, even with an accompanying emphasis on instruction in research (which 91.4 percent of programs in Isaacs's study provided), is insufficient to cultivate students’ reading and information literacy skills.
These findings suggest that the work of supporting students’ development of proficiency in critical reading and information literacy requires a program-level approach that equips instructors not just to assign reading tasks or texts but also to provide instruction on critical reading strategies. At the level of the profession, conversations about reading in writing studies must include students and instructors who work in open-access contexts, in part because the reading experiences and literacy development of such students require different interventions compared to students who meet the admissions standards for selective institutions.
All student names are pseudonyms.