The Bergen Community College Journal of Scholarly Teaching, 2018-19



In this issue:

• Studying History like a Historian: Using Translated Archival and Primary Sources in the Classroom, by Ilan Ehrlich

• Writing Center Assessment at the Community College Level, by John R. Findura & Margaret M. Roidi

• What Jabari Taught Me, by Sara Mastellone

• Federal Reserve Economic Data (FRED®) in Principles of Economics Courses, by Takvor H. Mutafoglu

• Experiencing the Past through Service to the Community: The Case for Incorporating Service-Learning Opportunities into BCC History Courses, by Daniel Saperstein & Evan Saperstein

• Effects of Written Performance by Academic English as a Second Language Students Using Integrated Listening-to-Writing Task Repetitions, by John Bandman

• Reading for Academic Purposes: Vocabulary Knowledge Improves Reading, by Carol Miele & Leah Carmona


In This Issue

Foreword . . . 2
Carol Miele, Professor, English as a Second Language, Editor

Studying History like a Historian: Using Translated Archival and Primary Sources in the Classroom . . . 3
Ilan Ehrlich, Professor, History

Writing Center Assessment at the Community College Level . . . 7
John R. Findura & Margaret M. Roidi, Cerullo Learning Assistance Center

What Jabari Taught Me . . . 15
Sara Mastellone, Assistant Professor, Math

Federal Reserve Economic Data (FRED®) in Principles of Economics Courses . . . 24
Takvor H. Mutafoglu, Assistant Professor, Economics

Experiencing the Past through Service to the Community: The Case for Incorporating Service-Learning Opportunities into BCC History Courses . . . 33
Daniel Saperstein and Evan Saperstein, Adjunct Professors, History

Effects of Written Performance by Academic English as a Second Language Students Using Integrated Listening-to-Writing Task Repetitions . . . 41
John Bandman, Assistant Professor, Business Hotel Restaurant Management

Reading for Academic Purposes: Vocabulary Knowledge Improves Reading . . . 55
Carol Miele, Professor, and Leah Carmona, Assistant Professor, ESL

Editor: Carol Miele, Ed.D.
Special thanks to Paula Williams, Ed.D., for consultation on APA format.



Foreword

Professor Carol Miele, Ed.D., Editor

This edition of The Bergen Journal of Scholarly Teaching includes ongoing work of professors seeking to enrich learning, whether by pursuing their own research in their field or by researching new ways to understand and encourage student learning. This journal encourages professors to reflect on how their research and scholarly pursuits inform course content and methods of instruction, and to share the results of their inquiries with their colleagues. The articles that appear here span a wide range of interests and exemplify the wealth of abilities and skills of the Bergen faculty.

Professors are recognizing the value of introducing real-world resources into the community college learning experience. An economics professor makes use of the Federal Reserve Economic Database, and a history professor uses translated archival and primary sources. Service learning is another way to bring a discipline to life through experiences that connect the classroom to the world through community service. There is a wealth of opportunities in local historical societies, for example, to enliven historical studies through civic engagement. The instances where this has been done in colleges and universities provide inspiration for course designs that include service as experiential learning.

As in past issues, the teaching of English as a second language is represented by professors seeking to explore new methods and approaches to improve the reading and writing skills of college ESL students. A Writing Center assessment study shows our institution's concern for ongoing efforts to improve student writing and the quality of the academic support services devoted to this end.

Finally, equity in the classroom is becoming a critical topic for faculty research as institutions of higher education seek to address achievement gaps, particularly among historically underserved minority students. Reflecting on classroom experiences has led to questioning how students of various backgrounds, races, and ethnicities respond to teaching approaches. An article in this issue relates a promising research project exploring these issues.

The Bergen Community College Journal of Scholarly Teaching was the result of the vision of the former Vice President of Academic Affairs, Dr. William Mullaney. His support for faculty scholarship at the community college inspired not only the authors published in this edition but also many other faculty members to increase their efforts as scholarly teachers in order to inspire in students a love of learning. We owe Dr. Mullaney our gratitude for the example he set as an enthusiastic proponent of the scholarship of teaching and learning and of all our scholarly endeavors.


Studying History like a Historian: Using Translated Archival and Primary Sources in the Classroom

Dr. Ilan Ehrlich, Professor, History

How did Latin American students oppose violent, lawless, or despotic presidents? How did university students view their place in the region? What role did U.S. businesses and government officials play in Latin American politics? These are important questions for a class on the Spanish-speaking Caribbean and Central America since 1898, where student protests were politically influential. They are also relevant for current students, many of whom have roots in the region. But what about the answers? As a specialist in this area and time period, I have collected thousands of archival sources that can place them in perspective. Why not, I asked myself, introduce these sources in the classroom as a way of showing students how history is understood and written about by historians themselves?

I began with two sources from the National Archives and Records Administration (NARA) in College Park, Maryland. The first was a letter written by an expelled University of Havana law student named Eduardo Chibás to U.S. Secretary of State Henry Stimson (Chibás, 1933). Chibás accused the U.S. ambassador in Havana, Harry Guggenheim, of backing a violent dictator responsible for "the assassination in large scale of men, women and children who demand freedom and justice." He was referring to President Gerardo Machado, who had illegally extended his presidential tenure by two years and subsequently won another term by banning opposition parties. Those who opposed such tactics were often jailed and sometimes murdered. Chibás also accused Guggenheim of plying American journalists with falsehoods to manipulate U.S. public opinion. Thus, Edwin C. Hill, a popular radio commentator, dismissed those opposing President Machado as "office seekers" and claimed removing him from office would lead to a "communist uprising." When Chibás pressed Hill about the source of these canards, the latter replied that he had consulted Ambassador Guggenheim. Chibás hoped to set the record straight by publishing his letter in the Washington Herald.

The second source is a memorandum of a conversation between Frank Mahoney, president of the Cuban Electric Company, which was owned by Electric Bond and Share, and H. Freeman Matthews, a State Department official. Mahoney worried that the new administration of Franklin D. Roosevelt would view Machado in a different light. Hence, he told Matthews that "frankly he hoped nothing would be done to oust Machado" and added that "his Company had always gotten along very well with the president." Indeed, the Cuban Electric Company enjoyed preferential tax rates and land rights. Moreover, the company reaped fantastic profits since (by 1927 estimates) it cost the company just 1.5 cents to produce a kilowatt of electricity while Cubans were charged 17 cents. In the end, Mahoney was optimistic, since he had spoken to the new ambassador and pronounced himself "favorably impressed."

These two typewritten documents, which comprise less than seven double-spaced pages, are rich with possibility for discussion. Why were university students among the first to denounce President Machado? Was Chibás's public relations campaign a shrewd tactic? What is the proper role of a U.S. ambassador? How did U.S. firms in Latin America influence local politics? At the same time, I was limiting myself to English-language sources.

Therefore, in May of 2017, I applied for a Scholarship of Teaching and Learning (SOTL) grant to gather Spanish-language documents from Havana's National Archive and the Cuban History Institute's collection of magazines and newspapers from the 1930s. After the SOTL committee generously approved my proposal, I focused on sources that would show my students the next part of this story – namely, Cuba's 1933 revolution.

Cuba's military forced President Machado to resign on August 12, 1933 to avoid a possible U.S. intervention. U.S. intervention was in fact legal and had been enshrined in Cuba's constitution via the hated Platt Amendment. Not only had United States troops remained in Cuba until 1902, nearly four years after the Spanish-American War concluded, but the U.S. army had occupied the island from 1906 to 1909 after an electoral dispute. U.S. marines landed in Cuba in 1912 when a political uprising threatened U.S. property and again in 1921 during a financial crisis. Despite being lawful, constant U.S. interference was highly unpopular. In this case, U.S. marines would remain in their barracks. However, in 1933 Machado was replaced with a provisional president handpicked by U.S. Ambassador Sumner Welles. That president, Carlos Manuel de Céspedes, was toppled three weeks later by a coalition of former University of Havana students and enlisted military men.

The government that followed, led by a former physiology professor named Ramón Grau San Martín, was known as the first that was "100 percent Cuban." In this spirit, Grau declared that Cuba no longer recognized the Platt Amendment. He also offered no special treatment to U.S.-owned businesses on the island. Thus, he decreed a 40 percent reduction in electricity prices, which predictably enraged the Cuban Electric Company. Perhaps most damaging of all, Grau investigated an $80 million public works loan taken out by President Machado in 1928 from Chase National Bank. Calling the loan "illegal," he suggested the Cuban government would refuse to pay. As a result, the U.S. government never formally recognized President Grau and negotiated with the chief-of-staff of Cuba's armed forces, Fulgencio Batista, to overthrow him.

For this period and slightly beyond, I introduced a series of brief documents. The first is an official statement from Grau to United Press the day after Céspedes was overthrown. It begins in the following manner:

    All we ask of the United States is fair play. A chance to carry out our program which is nonpolitical and embodies the Cuban ideal. We overthrew Machado because he was a tyrant; we overthrew Céspedes because he was a puppet. Our aim is to give the Cuban people an opportunity freely to express its will in drawing up a new Constitution and not merely to approve one elaborated at the American Embassy (Grau, 1933).

This brief paragraph is rife with possibilities for discussion. What does Grau mean by fair play? Was the transition from Machado to Céspedes a positive one? Were Cubans rash in not giving the latter a chance? Should the United States support a genuinely independent Cuba even if this means some U.S. interests might suffer? In the concluding paragraph of his statement, Grau points out that, "We learn of American warships ordered to Cuba. Their presence here will render our task more difficult. Have we been prejudged and found guilty? Must the first truly Cuban government this country has had succumb to intimidation?"

Six months after Grau was ousted as president, in July of 1934, he issued a statement to the Cuban daily Ahora in which he once again criticized the U.S. battleships that remained in Cuba's territorial waters. He further stated that, "If the Americans have a suitable bay in the Gulf of Mexico or Florida, they ought to transfer their naval base in Guantánamo Bay over there." He concludes by asking, "To what ends and with what rights do these warships remain silently and threateningly in Havana bay?"

In conclusion, this is a project that is still evolving – especially as my trove of documents provides so many options for explaining the prelude and aftermath of Cuba's 1933 revolution. At the same time, my experience thus far has been that students are more than capable of studying history like historians. In other words, there is no reason why archival and primary source materials cannot be integrated among assigned readings. Moreover, when I explain that the documents they are about to read are from Havana, and fodder for historians, this generates a buzz of excitement all its own. This, for me, is precisely the purpose.

References

Chibás, E. (1933, February). [Letter to U.S. Secretary of State Henry Stimson]. National Archives and Records Administration (NARA), College Park, MD.

Grau, R. (1933). [Official statement to United Press the day after Céspedes was overthrown]. Havana, Cuba: Archivo Nacional de Cuba.

Grau, R. (1934). [Statement to the Cuban daily Ahora criticizing the U.S. battleships that remained in Cuba's territorial waters]. Havana, Cuba: Archivo Nacional de Cuba.

Mahoney, F. (1933). [Memorandum from the president of the Cuban Electric Company to H. Freeman Matthews, U.S. State Department official]. Havana, Cuba: Archivo Nacional de Cuba.


Writing Center Assessment at the Community College Level

John R. Findura & Margaret M. Roidi, PhD, The Cerullo Learning Assistance Center

In the almost two decades since Law and Murphy (1997) first stated their puzzlement at the lack of studies related to Writing Centers and assessment, it appears little has changed. Although much has been written about Writing Centers since they became ever-present in the 1970s (Jones, 2001), Kinkead and Harris (1993) concluded that because of the wide variations in centers and the schools that house them, it is impossible to make generalizations about them; thus, there are still "enormous gaps in the existing literature on writing centers" (Jones, 2001, p. 3). The impact of writing courses on student learning, particularly in open-admissions community colleges, needs to be examined alongside institutional assessment practices (Borg & Deane, 2011). Race, Brown, and Smith (2005) encouraged educators to reflect on the importance of providing learners with purposeful assessment and feedback on their work. In turn, we are forced to ask: if such assessment is necessary for our students, then why not for ourselves?

The approach that we took was simple enough: look at one community college and one writing center to assess the effect designated tutorial support might have had on students' final grades and self-reported perceptions of success, compared to a peer group of students who did not utilize such services during the same academic semesters.

Context

The academic services offered to learners oftentimes include one-on-one appointments, walk-in assistance,



study groups, workshops, and in-class tutoring, as well as Supplemental Instruction (SI). Since tutors are expected to seamlessly accommodate tutees' unique learning styles through purposeful assistance, it is important to identify the means through which optimal tutorial methodology can be presented and reinforced in a consistent and purposeful manner (Roidi, 2015).

The setting for this assessment study is the Henry & Edith Cerullo Learning Assistance Center (CLAC), an award-winning learning assistance center housed at a community college in the northeast United States. All tutorial assistance is centralized and monitored closely to ensure consistency and optimal methodology. An extensive range of free avenues of support is available to all students registered at the institution, and every semester the center employs over 200 peer (student) and professional (degree-holding) tutors to accommodate students' unique learning styles.

This study looks at the students enrolled in one of four courses at Bergen Community College (BCC), an institution of almost 16,000 students located in suburban Paramus, New Jersey, over three semesters, from fall 2013 to fall 2014.

Assessment Goal

English Basic Skills (EBS) and Writing (WRT) students who attend the CLAC Writing Center will achieve a statistically significantly higher final grade and self-reported success in their EBS and WRT courses than a peer group that does not attend the Writing Center.

Definitions

For the purposes of this study, success was defined as a higher final grade and self-reported course competence for learners who utilized any of the Writing services offered by the CLAC versus a peer group that did not. The Writing services reviewed for this study included: (1) one-on-one appointments, (2) walk-in assistance, and (3) drop-in appointments.

Population

The population under examination focused on learners enrolled in 15-week sessions within the designated semesters. The data collected for EBS-011, EBS-012, WRT-101 and WRT-201 excluded the following groups: AIMS (Academic Intervention and Monitoring System), Title V, International, and Online students, and students enrolled at BCC's Meadowlands Campus.

Methodology

The four courses are (1) English Basic Skills 011 (EBS-011, a developmental English course), (2) English Basic Skills 012 (EBS-012, also a developmental English course), (3) Composition 101 (WRT-101, a college-level writing and composition course) and (4) Composition 201 (WRT-201, a higher-level college writing and composition course). Although it was initially planned to target four sections each of EBS-011, EBS-012, WRT-101 and WRT-201, the sample size would not have allowed for a significant investigation; therefore, in the fall of 2014, it was determined that every section of EBS-011, EBS-012, WRT-101 and WRT-201 should be included in this assessment initiative.

Assessment Tools

TutorTrac, the web-based appointment system utilized by the CLAC, was employed to record students' visits, while Datatel, the college's data management system, provided students' final grades. Also, self-reported surveys were distributed through SurveyMonkey to the target population in an effort to capture parameters involving perceived confidence and acquired skill level as an effect of the tutorial services received. As Paulson and Armstrong (2011) state, "Students' beliefs about learning and their conceptualizations about themselves as learners have long been associated with levels of academic success" (p. 494).

Quantitative data. All sections of EBS-011, EBS-012, WRT-101, and WRT-201 were reviewed to assess the correlation between tutorial visits and academic success. Students were split into two cohorts: those who attended a Writing Center service and those who did not. TutorTrac was utilized to identify all the students who used the CLAC Writing Center services under review, while Datatel was utilized to assess participants' final grades and pass rates.

Qualitative data. The students enrolled in the target courses were asked to complete a self-reported survey. Thus, students' perceptions in regard to their level of confidence and the writing skills they believe they might have gained through their experience at the Writing Center were captured.

Results

The CLAC worked closely with the Bergen Community College Center of Institutional Effectiveness in designing and analyzing the pass rates of students who attended the Writing Center versus those who did not, as well as in conducting the final grade comparison. While the difference in grades was not statistically significant, there is evidence of higher pass rates for students who attended these designated Writing Center services compared to those who did not (see Appendix, Tables 1, 2 and 3), as well as a higher percentage of A, B+, B, C+ and C grades for Writing Center attendees (see Appendix, Tables 4, 5 and 6). The only cohorts that did not follow this pattern were the spring 2014 and fall 2014 EBS-012 classes, which both had lower pass rates for Writing Center attendees as well as more D, E and F grades. These two groups, however, also had the lowest Writing Center participation, with the fall 2014 cohort having only nine attendees, of which five ultimately passed.
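To illustrate what a pass-rate comparison of this kind involves, here is a minimal sketch of a standard two-proportion z-test applied to the Fall 2013 EBS-011 row from Table 1 in the Appendix (attendees 38/41 passed, non-attendees 428/560). This is an illustration only; the study does not state which test the Center of Institutional Effectiveness used, and the function name below is ours.

```python
# Two-proportion z-test (normal approximation) comparing pass rates of
# Writing Center attendees vs. non-attendees. Illustrative only.
from math import sqrt, erfc

def two_proportion_z(passed1, n1, passed2, n2):
    """Return (z, two_sided_p) for the difference between two pass rates."""
    p1, p2 = passed1 / n1, passed2 / n2
    pooled = (passed1 + passed2) / (n1 + n2)            # pooled pass rate
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # standard error
    z = (p1 - p2) / se
    return z, erfc(abs(z) / sqrt(2))                    # two-sided p-value

# Fall 2013 EBS-011 (Table 1): attendees 38/41 passed, non-attendees 428/560
z, p = two_proportion_z(38, 41, 428, 560)
```

For this row the attendee pass rate (93%) exceeds the non-attendee rate (76%) with z ≈ 2.4. For the much smaller EBS-012 cells the same test would have far less power, which is consistent with the study's caution about small samples.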



For the survey results, only 47 out of 266 respondents answered the open-ended question, which asked, "Please tell us about your writing center experience." The majority of the responses were positive, with most along the lines of the following answers:
• "The writing center is a very helpful resource"
• "The tutors really helped me with my papers"
• "The writing center is great and I highly recommend students take this opportunity"
• "Good tutors, great at explaining"
• "They are very knowledgeable and thorough"
• "My first experience in the writing center was good. Now, I am comfortable going there and asking for help"
• "I used the writing center a lot and I really appreciate the help I could get"
• "I like it so much because it's so helpful. It helps me when I need it most"
• "It's great!"

Limitations

There are a few limitations that need to be discussed. This study documented only the final pass rates and grade distributions; it did not look into the significance of the number of times each student might have visited the Writing Center, nor did it take into account previous grades in EBS or WRT courses. While this initiative examined only one community college and one Writing Center, it did capture data from 10,678 students over the course of the three targeted semesters.

Also, there is a discrepancy between the two sets of data that were obtained for this initiative. Although both quantitative and qualitative approaches were utilized, the data collection process for the latter methodology commenced at the end of the fall 2014 semester; thus, participants were asked to denote individually the semester during which they had been enrolled in the courses under review. Researchers should consider replicating this study and administering the end-of-semester survey at the end of each designated semester.

It should also be noted that few respondents chose to answer the open-ended question; unfortunately, not much qualitative data were acquired aside from positive remarks regarding the Writing Center. Future studies would do well to ask more pointed, direct questions about perceived student success in order to properly assess this component.

Due to the relatively small sample sizes of students who attended the Writing Center compared to students who did not, there is no real statistical significance between attendance and final grades or grade distribution. However, there is evidence of students who used the Writing Center consistently receiving more passing grades and having a higher grade distribution than students who did not use the Writing Center for the three targeted courses and semesters. There is also a higher self-perceived success rate amongst students who used the Writing Center as compared to those who did not. These conclusions support Wurtz's (2015) claim about Learning Centers that "Current research examining the impact of LAC use on academic achievement does indicate an effect on academic success" (p. 3). Given these findings, learning center assessment is a fruitful and under-examined field of research that should be revisited by learning center professionals and higher education administrators.

Conclusion

Writing Centers play a vital role in preparing students for the rigors of college-level writing (Tobin, 2010). However, literature on the assessment of Writing Centers on college campuses is few and far between (Law & Murphy, 1997). This study attempted to create discussion worthy of further study on this topic.

Appendix

Table 1 – Pass Rates – Fall 2013

CLASS     ATTENDED TUTORING   TOTAL   PASSED   PERCENTAGE
EBS-011   Yes                    41       38       93%
          No                    560      428       76%
EBS-012   Yes                    21       16       76%
          No                    134       90       67%
WRT-101   Yes                    84       77       92%
          No                   1819     1502       83%
WRT-201   Yes                    51       49       96%
          No                   1057      905       86%

Table 2 – Pass Rates – Spring 2014

CLASS     ATTENDED TUTORING   TOTAL   PASSED   PERCENTAGE
EBS-011   Yes                    28       26       93%
          No                    163      101       62%
EBS-012   Yes                    27       19       70%
          No                    358      364       74%
WRT-101   Yes                    87       81       93%
          No                   1115      881       79%
WRT-201   Yes                   111      103       93%
          No                   1307     1147       88%


Table 3 – Pass Rates – Fall 2014

CLASS     ATTENDED TUTORING   TOTAL   PASSED   PERCENTAGE
EBS-011   Yes                    33       27       82%
          No                    520      353       68%
EBS-012   Yes                     9        5       56%
          No                    149      110       74%
WRT-101   Yes                    84       77       92%
          No                   1783     1425       80%
WRT-201   Yes                    59       57       97%
          No                   1078      910       84%

Table 4 – Grade Distribution – Fall 2013

CLASS     ATTENDED TUTORING   A, B+, B, C+, C   D, E*, F   DIFFERENCE
EBS-011   Yes                       76%            24%         +7%
          No                        69%            31%
EBS-012   Yes                       53%            47%         +2%
          No                        51%            49%
WRT-101   Yes                       84%            16%         +6%
          No                        78%            22%
WRT-201   Yes                       92%             8%         +9%
          No                        83%            17%

Table 5 – Grade Distribution – Spring 2014

CLASS     ATTENDED TUTORING   A, B+, B, C+, C   D, E*, F   DIFFERENCE
EBS-011   Yes                       61%            39%        +10%
          No                        51%            49%
EBS-012   Yes                       44%            56%        -18%
          No                        62%            38%
WRT-101   Yes                       92%             8%         +7%
          No                        85%            15%
WRT-201   Yes                       88%            12%         +5%
          No                        83%            17%

Table 6 – Grade Distribution – Fall 2014

CLASS     ATTENDED TUTORING   A, B+, B, C+, C   D, E*, F   DIFFERENCE
EBS-011   Yes                       70%            30%         +9%
          No                        61%            39%
EBS-012   Yes                       55%            45%        -12%
          No                        67%            33%
WRT-101   Yes                       84%            16%         +8%
          No                        76%            24%
WRT-201   Yes                       92%             8%        +11%
          No                        81%            19%

*E grades are recorded for Unofficial Withdrawals from class (i.e., a student who stops attending) and are equivalent to a grade of F.



References Babcock, R. D., Manning, K., & Rogers, T. (2012). A synthesis of qualitative studies of writing center tutoring, 1983-2006. New York: Peter Lang.

n What Jabarii Taught Me

Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W.H. Freeman. Borg, E., & Deane, M. (2011). Measuring the outcomes of individualised writing instruction: A multilayered approach to capturing changes in students’ text. Teaching in Higher Education, 16(3), 319-331. https://doi.org/10.1080/13562517.2010.546525 Carino, P., & Enders, D. (2001). Does frequency of visits to the writing center increase student satisfaction? A statistical correlation study – or story. Writing Center Journal, 22(1), 83-103. Retrieved from http://www.jstor.org/stable/43442137 Jones, C. (2001). The relationship between writing centers and improvement in writing ability: An assessment of the literature. Education, 122(1), 3-20.

Sarah Mastellone Assistant Professor Math

Kinkead, J. A., & Harris, J. A. (1993). Writing centers in context: Twelve case studies. Urbana, IL: National Council of Teachers of English. Law. J., & Murphy, C. (1997). Formative assessment and the paradigms of writing center practice. Clearing House, 71(2), 106-108. Mackiewicz, J., & Thompson, I. (2014). Instruction, cognitive scaffolding, and motivational scaffolding in writing center tutoring. Composition Studies, 41(1), 54-78. Paulson. E. J., & Armstrong, S. L. (2011). Mountains and pit bulls: Students’ metaphors for college transitional reading and writing. Journal of Adolescent & Adult Literacy, 54(7), 494-503. DOI: 10.2307/41203399

On December 4, 1992, I went to a workshop called “Failing at Fairness” given by Myra and David Sadker. The Sadkers' work on gender equity in education (Sadker, & Sadker 1994) was groundbreaking and much cited in the late 1980s, throughout the 1990s, and beyond. While they focused on gender inequities in the classroom, I believed that their strategies would successfully address bias in the classroom in general. In this 1992 workshop, the presenters shared a list of thirteen pedagogies that facilitate an equitable learning environment. While all of these suggestions proved to be helpful, what happened the day after the workshop in my classroom taught me the most powerful lesson about equitable teaching practice. One of my classes during the 1992-1993 school year in an urban-rim New Jersey high school was a pre-calculus class for high school seniors. As most people know, December is the end of student effort in most high school senior classrooms. In this particular class, I had already seen evidence of this trend. However, Jabari had not decreased his level of work because, since September, he had done none of the homework assignments. Despite this lack of effort, he was maintaining a C average on tests.

Race, P., Brown, S., & Smith, B. (2005). 500 tips on assessment (2nd ed.). London: Routledge-Falmer. Roidi, M. M. (2015). Tutor training procedures in higher education: Creating a community of lifelong learners. Synergy: The online Journal for the Association for the Tutoring Profession, 7, 1-17. Tobin, T. (2010). The writing center as a key actor in secondary school preparation. Clearing House, 83(6), 230-234. Retrieved from http://www.jstor.org/stable/41149850 Vacca, R. T. (2006). They can because they think they can. Educational Leadership, 63(5), 56-59.

On December 5, 1992, I began my pre-calculus class by sharing what I had learned in the Sadkers' gender equity workshop the day before. I projected the thirteen ways to create an equitable learning environment list (see below) onto the classroom screen and asked

Wurtz, K. A. (2015). Impact of learning assistance center utilization on success. Journal of Developmental Education, 38(3), 2-10.

i

14

Writing Center Assessment

What Jabari Taught Me

All student names are pseudonyms 15


the students to evaluate my practice of these strategies. I was fairly confident that the students would rank me highly as my pedagogy was grounded in the cooperative techniques developed by Roger and David Johnson (Johnson, Johnson, & Holubec, 1984), which enhanced the learning by purposely creating an environment that emphasized the responsibilities of both the teacher and the students. I had been practicing all but two of the 13 strategies for an equitable classroom, which I had learned from my training in cooperative classrooms given by the Johnsons. The remaining two, “code yourself” and “alert students to issues of equity”, were not techniques that I had deliberately practiced in the classroom. However, I had an interest in addressing issues of equity for many years. In fact, this concern for equity had motivated me to co-found a club at the high school called “Facing Race”. The readings and movies discussed as club activity had supported my pedagogy by heightening my awareness of how I communicated with students and how I fostered an inclusive and respectful environment in the classroom Nevertheless, I was surprised by what happened. “Code yourself” raised some questions from the students. I told them that the Sadkers suggested that colleagues watch each other teach and “code” or keep score of how equitably we interacted with students. This scoring was to focus on whom the teacher called on in class and the quality of interactions with each student. Colleagues would focus on questions such as: Were the questions or comments to some students more encouraging or critical than they were to others? Did the teacher communicate more interest in the learning of some students than other students? Did interactions change depending on whether the questions or comments to some students were more encouraging (or critical) than they were to others? Did the teacher’s interactions with students change depending on a particular demographic? 
I explained to the students that having colleagues code each other was problematic for two reasons. First, colleagues are far too busy to sit in on each other’s classes. Second, colleagues would be evaluating each other’s actions from their particular frames of reference, and I thought it was more important to learn what each of my students thought of the fairness of my interactions with every student in the class. Therefore, I proposed that from then on students should let me know when I did something that could be seen as inequitable. After a few more minutes discussing the remaining equitable practices listed on the chart, and finding that students thought I did well with each item, I began the math lesson.

Not five minutes into the lesson, Jabari raised his hand. He said, "I don't know if you want to hear this". I encouraged him to "spit it out", as I had asked for the feedback and I truly wanted to hear from them. He continued by saying, "Jennifer (African American female) asked you a question and you answered it, but Jason (European American male) asked a question and you answered it, gave him an example, and asked him a question". I asked, "What can that mean?" Another student said, "Jennifer is smarter than Jason". I said, "It could mean that I know what each student needs and I gave them each the appropriate feedback, but what else could it mean?" Jabari said, "You care more about Jason learning it than Jennifer". Responding to Jabari’s perception, I turned to Jennifer, gave an example related to her question, asked her a question, and moved on with the lesson. I thought it was important to take Jabari's comment at face value rather than further defend my differentiation of the responses given to Jennifer and Jason, because my intention was to invite students' perceptions and to be seen by students as responding in a way that improved equity. From that day forward, Jabari did all of the assigned homework and got A's on the remaining tests. I regret never debriefing with Jabari to learn why his behavior changed. I can only surmise that he cared about issues of racial inequity and wanted to work for a teacher who showed, by her actions, that she did too. This experience underlined the power of number six on the Sadkers' list: "Alert students to issues of equity". The power of engaging students in building an equitable and responsive classroom atmosphere may have a far-reaching, positive impact on their willingness to engage with both the work of the class and the professor who is creating that atmosphere. I believe this is what inspired Jabari to engage more fully in this class. In addressing the damage done to cross-cultural relations by racism, the perception of a professor's actions has at least as much effect on students' willingness to engage as the professor's intentions. While differentiation in teaching helps to meet the needs of students in diverse environments, the unspoken interpretation of these pedagogical moves can cause great damage to students' engagement in class. Taking the bold step to make these alternate interpretations vocal, and addressing the interpretations that are legitimized by our racialized past, can have a great positive impact on successful learning.

Equitable Learning Environments at Bergen Community College

Community colleges today have to repair achievement gaps. It may be productive to open a campus-wide conversation about effective pedagogical practice for members of diverse groups of students. Having such an ongoing conversation, guided by what we learn from students, may engage Bergen students the way it engaged Jabari.

In an effort to initiate this community-wide conversation about equity, members of the Mathematics Department developed a pilot 36-question Likert scale survey asking students to rate the effectiveness of a variety of pedagogies in engaging their effort in their courses. After completing the survey, students were asked



to answer purposefully inclusive demographic questions. The survey was administered during Spring 2019 through MyMathLab to Developmental Mathematics students. Four hundred eighty-two (482) students responded. Once the survey responses were collected, members of the Institutional Research department analyzed the results. (For the IR Data Brief, see the Appendix.)

Partial Results

The results below show the top 12 pedagogies identified by students as encouraging their engagement, with the mean score for each. The survey used a five-point Likert scale on which Neutral = 3, Agree = 4, and Strongly Agree = 5, so a mean such as 3.52 falls between Neutral and Agree.

For the highest ranked pedagogies there were statistically significant differences for the following groups: Middle Eastern, Asian, White North American, Black Caribbean, and Islamic. Two surprising outcomes can be seen in the tables below. Clearly stating how assignments will be graded is more important to Asian students and less important to Middle Eastern students than to the general population; it is important to note that this prompt ranks high for all respondents. However, when looking at allowing students to re-submit essays and papers, Black Caribbean respondents ranked this prompt at only 3.81, which places it outside their top twelve (12) pedagogies, whereas every other group placed it among the top twelve.

Table 1: Top 12 pedagogies, ranked by mean score

I am encouraged by professors who . . .                            Mean
clearly state how assignments will be graded.                      4.52
encourage students who are having trouble.                         4.49
are passionate about the material they teach.                      4.39
use humor in their teaching.                                       4.34
make suggestions about how I can improve my work.                  4.33
allow students to retake tests and/or quizzes.                     4.33
break down large assignments into smaller steps.                   4.29
help me structure my work and studying.                            4.28
allow students to re-submit essays and papers.                     4.27
provide virtual/on-line office hours for students.                 4.25
communicate that they believe in me and want me to do well.        4.25
ask students how they feel about the topics discussed in class.    4.22

Table 2: Prompt: clearly state how assignments will be graded

Middle Eastern        Non-Middle Eastern        Statistical Significance Level
N=24                  N=458
4.23                  4.53                      0.043

Asian                 Non-Asian
N=21                  N=461
4.76                  4.50                      0.017

Table 3: Prompt: allow students to re-submit essays and papers

White North American  Non-White North American  Statistical Significance Level
N=65                  N=417
4.45                  4.24                      0.028

Black Caribbean       Non-Black Caribbean
N=22                  N=460
3.81                  4.29                      0.012
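The ranking in Table 1 is a straightforward mean-and-sort computation. The sketch below reproduces the method with hypothetical Likert responses for three prompts; the actual 482 responses live in the ‘Math and Equity Survey Means’ spreadsheet and are not reproduced here.

```python
from statistics import mean

# Hypothetical Likert responses (1-5) for three of the 36 survey prompts.
# The real survey had 482 respondents; these scores are illustrative only.
responses = {
    "clearly state how assignments will be graded": [5, 4, 5, 4, 5],
    "use humor in their teaching": [4, 4, 5, 4, 4],
    "relate course topics to real life": [4, 4, 4, 5, 3],
}

# Rank prompts by mean score, highest first, as in Table 1.
ranked = sorted(responses.items(), key=lambda kv: mean(kv[1]), reverse=True)
for prompt, scores in ranked:
    print(f"{mean(scores):.2f}  {prompt}")
```

Each prompt's mean places it somewhere on the five-point scale, and sorting by that mean yields the "top pedagogies" ordering reported above.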

Future Directions

Further study and discussion of the complete findings need to take place. Consideration can be given to scaling up the project so that more members of the Bergen community can provide input on the survey. Results disaggregated by the demographics identified in the survey will add to a community conversation about equity at BCC and demonstrate the institution's dedication to this initiative. Attention to equity in the classroom, similar to my interaction with Jabari, may also prove helpful in engaging students more fully in their studies. That decision, of course, is up to each faculty member.



13 Ways to Create an Equitable Learning Environment

1. Increase wait time: both between asking a question and naming a student, and between naming a student and expecting an answer.

2. Separate instruction from management: do not use questioning techniques to control student behavior.

3. Intentionality: be aware of what communicates expectations and of changes in behavior.

4. Class grouping: think carefully about groupings of any kind in the classroom. Groupings involving perceived ability, race, and gender should be intentional and well thought through for the messages they transmit.

5. Collaborative learning: social skills, accountability, and role assignments must be taught and reinforced in this learning technique.

6. Alert students to issues of equity: students not only perceive different treatment but exaggerate the degree of differentiation that exists.

7. Code yourself: or allow yourself to be coded by your students.

8. Don't rely on volunteers to answer in class.

9. Create strategies to involve all students: poker chips, seating charts, quick-response boards, cooperative work groups, whole-class discussion strategies (revoicing, paraphrasing, saying more, and sentence starters).

10. Individual meetings with some students or groups: conversations with students about inappropriate behavior should not take up class time and should not be a public chastisement.

11. Teacher geographic mobility: how you position yourself in the room can communicate expectations to students.

12. Be sure teaching materials and displays in the classroom reflect all types of students in non-stereotypical pursuits: written projects that illustrate different cultures' contributions to the content under study can motivate students to achieve in that subject.

13. Eliminate put-downs of all kinds: a safe learning environment is essential for the depth of learning that we desire to produce.

13 Ways to Create an Equitable Learning Environment is based on the work of Myra and David Sadker (Failing at Fairness) and Thomas L. Good and Jere E. Brophy (Looking in Classrooms).

References

Good, T. L., & Brophy, J. E. (2003). Looking in classrooms (9th ed.). Boston, MA: Allyn & Bacon.

Johnson, D. W., & Johnson, R. T. (1984). Cooperative small-group learning. Curriculum Report, 14(1). Retrieved from https://files.eric.ed.gov/fulltext/ED249625.pdf

Johnson, D. W., Johnson, R. T., & Holubec, E. J. (1984). Circles of learning: Cooperation in the classroom. Alexandria, VA: Association for Supervision and Curriculum Development.

Joseph, G. G. (1990). The crest of the peacock: Non-European roots of mathematics. London: Penguin Books.

Meek, A. (1989). On creating ganas: A conversation with Jaime Escalante. Educational Leadership, 46(5), 46. Retrieved from http://library.saintpeters.edu/login?url=http://search.ebscohost.com.library.saintpeters.edu/login.aspx?direct=true&db=ehh&AN=8522504&site=ehost-live&scope=site

Sadker, M., & Sadker, D. (1994). Failing at fairness: How America's schools cheat girls. New York, NY: Macmillan Publishing.

Tatum, B. (1997). Why are all the black kids sitting together in the cafeteria? (Rev. ed.). New York, NY: Basic Books.



Appendix

Math and Equity Survey Explanation Sheet

• The Math and Equity Survey was conducted in the Spring 2019 semester. After IRB approval, the survey was distributed to different Math classes throughout the semester. Four hundred and eighty-two (482) responses were collected.

• The students were asked 36 Likert scale questions. All questions started with the phrase ‘I am encouraged to work hard by professors who…’, followed by different pedagogical approaches. The students were then asked demographic questions about their gender, race/ethnicity, and religious beliefs.

• The Likert scale had five potential answers: Strongly Agree (5), Agree (4), Neutral (3), Disagree (2), and Strongly Disagree (1). Using this method, each question had a mean score that placed the students’ responses somewhere on the scale. For instance, the phrase ‘I am encouraged to work hard by professors who relate course topics to real life’ had a mean score of 4.16. This means that overall respondents agreed with this statement, but not as strongly as with ‘I am encouraged to work hard by professors who clearly state how assignments will be graded’, which had a mean score of 4.52.

• The demographic questions were purposefully inclusive. The gender question did not include just ‘Male’ and ‘Female’ but other options, including ‘Trans Male’ and ‘Trans Female’, as well as a ‘prefer not to answer’ option and a space to write in any gender not covered by the choices given. Race/ethnicity did not ask about the major race categories (Hispanic, White, Asian, Black, etc.), but instead included specific ethnicities and, in some cases, nationalities, like Caribbean Indian and Korean. There was also a ‘prefer not to answer’ option and a space to write in a race/ethnicity not covered by the given choices. Likewise, religious beliefs had numerous options, including Catholicism, Judaism, No religion, and Spiritual. A ‘prefer not to answer’ option was given, as well as a space to write in a religious belief not covered by the given choices.

• Due to the sensitive nature of the race/ethnicity, gender, and religious beliefs questions, the IRB insisted on a Note on Confidentiality and Anonymity, as well as a warning that the survey might cause emotional distress and a statement of where appropriate counseling services are located within Bergen.

• Once the survey responses were collected, the results were downloaded and stripped of all identifiable information (IP addresses, date of survey completion, etc.), and means were calculated for the 36 Likert scale questions.

• To match a previous report done on a similar study, results were disaggregated by self-identified genders, race/ethnicities, and religious beliefs to see if any patterns emerged showing that certain groups responded more positively (or more negatively) to certain teaching styles than others.

• In order to do this analysis, an Independent Samples T-Test was conducted to compare means between test groups and control groups and test for statistical significance.

• Test groups were groups that had a certain gender, race/ethnicity, or religious belief while the control group was the antithesis of this group. For instance, the means for Males (174 respondents) were compared to Non-Males (308 respondents) and the means for Hispanic/Latinos (170 respondents) were compared to the means for Non-Hispanic/Latinos (312 respondents). Not every group was tested because the study was limited to groups that had 20 or more respondents. • Twelve groups were tested and are listed below with the number of respondents in each group: ° Males, N=174 ° Females, N=254 ° Hispanic/Latino, N=170 ° African American, N=43 ° Middle Eastern, N=24 ° Asian, N=21 ° Black Caribbean, N=22 ° White Eastern European, N=37 ° White North American, N=65 ° Islamic, N=22 ° Had No Religion, N=87 ° Catholic, N=150 • Groups were compared to their antitheses to show differences in means and these differences were tested for statistical significance. • Results that were statistically significant at the .05 level were shaded, bolded, and italicized in the final Excel spreadsheet. Being statistically significant at the .05 level means that we can be quite certain that the differences in means are not statistical anomalies, but can instead be attributed to different opinions between the test group and the control group. For instance, ‘only giving high grades for excellent work’ is more important amongst White Eastern Europeans than amongst Non-White Eastern Europeans (mean of 3.97 compared to a mean of 3.50). This finding is statistically significant at the .01 level (less than .05) so we can be fairly certain that Eastern Europeans do find ‘only giving high grades for excellent work’ more important than their peers and that it is not just a random statistical anomaly. • The final Excel spreadsheet for this analysis has 13 tabs. The first shows the overall results of the survey and each group that was tested for mean differences and statistical significance. 
Every tab after is labelled with the specific test group and only shows results for that test group and its control group. Once again, statistically significant differences in means between test and control groups are shaded, bolded, and italicized. While differences in means might exist for all 36 Likert scale questions, only those shaded, bolded, and italicized have been deemed statistically significant and warrant further research.
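The group-versus-control comparison described above can be sketched in a few lines of code. The data brief does not specify whether a pooled-variance or unequal-variance test was run, so the sketch below uses Welch's (unequal-variance) t statistic as an illustrative assumption, with made-up Likert scores rather than the actual survey data.

```python
from statistics import mean, variance

def welch_t(group, control):
    """Welch's independent-samples t statistic (unequal variances),
    comparing a demographic test group against its control group."""
    n1, n2 = len(group), len(control)
    v1, v2 = variance(group), variance(control)
    se = (v1 / n1 + v2 / n2) ** 0.5          # standard error of the difference
    t = (mean(group) - mean(control)) / se
    # Welch-Satterthwaite approximation for degrees of freedom
    df = (v1 / n1 + v2 / n2) ** 2 / (
        (v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1)
    )
    return t, df

# Hypothetical Likert scores on one prompt: a small test group vs. its control.
test_group = [5, 4, 4, 5, 5, 4]
control = [4, 3, 4, 4, 3, 4, 4, 3]
t, df = welch_t(test_group, control)
```

In practice the resulting t and df would be converted to a p-value (for example with scipy.stats) and checked against the .05 threshold used in the analysis above.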

This data brief is an explanation for the ‘Math and Equity Survey Means’ Excel Spreadsheet BCC CIE Data Brief IR:JJ 4/19




n Federal Reserve Economic Data (FRED®) in Principles of Economics Courses

Takvor H. Mutafoglu Assistant Professor Economics

The utilization of current and historical data in economics courses has always been a difficult task due, primarily, to restrictions placed on access by the institutions collecting and maintaining large databases. However, in recent years, several of those institutions, including but not limited to the World Bank, the International Monetary Fund, the Organisation for Economic Co-operation and Development (OECD), central banks around the world, and the individual Federal Reserve banks in the United States, have started to offer online, albeit in some cases limited, access to their numerous databases with no subscription requirements attached. Hence, the collection and display of historical data relevant to course material has become less and less costly, not only in monetary terms but also in terms of time and energy. Consequently, this has made it easier for instructors to relate the economic concepts and theories introduced in Principles of Economics courses to real data through visual applications, which help reveal the “story” behind the historical movement of a particular variable. Thus, time-series plots of historical data have become important instruments for exposing historical events that can be discussed in reference to theoretical relationships. They also lay the groundwork for empirical analysis of the economic concepts and theories presented in the classroom. The focus of this paper is the Federal Reserve Economic Data (FRED) repository created and maintained by the Research Department at the Federal


Reserve Bank of St. Louis. It can be accessed directly via https://fred.stlouisfed.org. Suiter and Taylor (2016) describe FRED as follows: FRED is a set of free online tools for finding, displaying, analyzing, and downloading time series. The data in FRED are aggregated from 80 public and private sources, including the OECD, the World Bank, and the U.S. Bureau of Labor Statistics. By aggregating over 240,000 time series, FRED maximizes the time professors and students have to analyze data by minimizing time spent navigating Web sites. FRED’s main Web site allows users to search through the time series and then graph, transform, and download the series of their choice. The graphing software allows users to customize their graphs and add other series to them. These graphs can be in the form of line, bar, scatterplot, area, and pie charts. Users also can transform individual data series and construct a single series from multiple series (p.71). In fact, at the time of this writing, FRED has over 500,000 U.S. and international time series available from 86 sources. Mendez-Carbajo (2015) argues that data visualization and analysis using the FRED Web site help “students locate and effectively use the quantitative information that they need to evaluate abstract concepts” (p.420), such as inflation expectations developed by financial markets. Moreover, Patel and Saunoris (2016) emphasize active learning strategies utilizing a FRED add-in for Microsoft Excel, rather than the FRED Web site, to foster “learning-by-doing strategies that promote two-way interaction in order to improve learning outcomes” (p.37). Their paper advocates the use of Excel features such as the powerful Data Analysis tool embedded in the application to turn data into information.
More recently, Mendez-Carbajo, Taylor, and Bayles (2017) discuss how to build and display a graph of a Taylor rule (a “guided” monetary policy interest rate) with FRED by incorporating real data into teaching and learning activities. While those studies are all novel, they concentrate explicitly on students of intermediate-level economics courses taking advantage of FRED and its functions. However, students of principles-level economics courses could also benefit from the data visualization and analysis tools of FRED through instructor-led collection, manipulation, and presentation of real economic data in the classroom. Furthermore, the database could be utilized to create FRED-specific student assignments to achieve information literacy learning outcomes in Principles of Economics courses. To that end, this paper will not only present how to build and display a graph of Okun’s Law but also provide additional suggested discussion points using visual representation of historical data.
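It is worth noting that, alongside its graphing tools, FRED can export any graph's data as a CSV file that students can inspect with very little code. The sketch below parses a few rows shaped like such an export; the column name UNRATE and the values are illustrative assumptions for the example, not actual FRED data.

```python
import csv
import io

# Rows shaped like a FRED CSV export: a DATE column plus one column per
# series ID. The series ID "UNRATE" and the values here are illustrative.
fred_csv = """DATE,UNRATE
2009-01-01,7.8
2009-04-01,9.0
2009-07-01,9.6
"""

rows = list(csv.DictReader(io.StringIO(fred_csv)))
series = {row["DATE"]: float(row["UNRATE"]) for row in rows}
peak = max(series.values())  # highest unemployment rate in the sample
```

An instructor-led assignment might ask students to download a real export from the FRED site and repeat exactly this kind of inspection, satisfying an information literacy outcome.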



Content of Lesson

Okun’s Law proposes an inverse relationship between unemployment and the output produced in the economy. That is, when the unemployment rate increases, the output produced in the economy will decrease, and vice-versa. More specifically, Okun’s Law suggests that for every one percentage point by which the actual unemployment rate is higher than the natural rate of unemployment, there will be a negative output, or Gross Domestic Product (GDP), gap of about two percent. The natural rate of unemployment is a medium-to-long-run measure of the unemployment rate that would prevail in the economy if it were producing at its potential output level. The GDP gap is measured by the difference between actual GDP and potential GDP, which is defined as the level of output that would prevail at the natural rate of unemployment in the economy. Although both the natural rate of unemployment and potential GDP are unobservable, estimates by the U.S. Congressional Budget Office are readily available for the U.S. economy in FRED. The challenge in the classroom is to make sure that our students clearly understand not only the magnitude of the loss (or gain) in the level of output due to high unemployment (or employment) but also the scale of change in household and business income and spending. Accordingly, a visual presentation of Okun’s Law will help illustrate to students the extent of the loss or gain in the level of output due to high unemployment or employment in the economy. We can now begin illustrating the steps necessary to construct and display Okun’s Law. The reader can refer to the Appendix at the end of the paper for actual screen shots. The first step consists of creating a graph of the unemployment rate. On the homepage of the FRED Web site, search for “Civilian Unemployment Rate” by typing it in the search box and hitting the enter key. Select the “Civilian Unemployment Rate – Percent, Monthly, Not Seasonally Adjusted” variable on the next page by clicking on it.
The second step is to add the natural rate of unemployment to the same graph so we can visually illustrate the deviation of the unemployment rate from its natural rate. The instructor will edit the graph in order to add this second line to the present graph. Thus, within the FRED graph displaying the “civilian unemployment rate”, the instructor will click on the orange “EDIT GRAPH” button located on the upper-right-side of the page. Once the orange button is clicked on, a side window will appear. The instructor will click on the “ADD LINE” tab and type “Natural Rate of Unemployment” in the add data series to graph box. Select “Natural Rate of Unemployment (Long-Term) – Quarterly, Percent, Not Seasonally Adjusted” by clicking on it and then click on “Add data series” button right below the add data series to graph box. However, at this point, it should be noted that the frequency for the civilian unemployment rate (monthly) and the natural rate of


unemployment (quarterly) are different from each other. Therefore, to maintain consistency, the frequency of the civilian unemployment rate needs to be changed to quarterly by clicking on the “EDIT LINE” tab, selecting “EDIT LINE 1”, and finally selecting the Quarterly option from the Modify frequency drop-down menu. If desired, the instructor, at this point, could pause for a minute, close the side window, and start discussing with students the historical movement of the unemployment rate, and its reasons, relative to the natural rate. In order to reveal the actual value of either the civilian unemployment rate or the natural rate of unemployment at a specific quarter in time, the instructor could move the mouse pointer over that time period on the graph to make the values visible. The next step is to customize the data in order to apply Okun’s Law. Once again, the instructor will click on the orange “EDIT GRAPH” button, go to the “EDIT LINE” tab, select “EDIT LINE 1”, and type “Natural Rate of Unemployment (Long-Term)” in the search box under the customize data section. Select “Natural Rate of Unemployment (Long-Term)” from the drop-down menu by clicking on it and finally clicking on the grey “Add” button. At this point, the instructor will see no change in the graph created earlier since the natural rate of unemployment was already visible. However, the instructor will note a difference on the “EDIT LINE 1” screen. There will now be two variables under the “EDIT LINE 1” screen: (a) Civilian Unemployment Rate, Percent, Not Seasonally Adjusted (UNRATESA), and (b) Natural Rate of Unemployment (Long-Term), Percent, Not Seasonally Adjusted (NROU). This will allow the instructor to apply the formula for Okun’s Law. Consequently, the last step is to simply enter the following formula in the text box right next to the label “Formula:” and click on the “Apply” button: (a-b)*-2.
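Since the formula (a-b)*-2 is ordinary arithmetic, its effect can also be checked outside FRED. The sketch below applies the same transformation to a few illustrative quarterly values; these numbers are assumptions for the example, not actual FRED or CBO estimates.

```python
# a: civilian unemployment rate; b: natural rate of unemployment (percent).
# Illustrative quarterly values only, not actual FRED/CBO data.
civilian_unemployment = [9.3, 9.6, 9.9]
natural_rate = [4.8, 4.8, 4.9]

# FRED's formula (a-b)*-2: each result estimates the percent gap in real
# output implied by Okun's Law for that quarter (negative = output loss).
okun_gap = [(a - b) * -2 for a, b in zip(civilian_unemployment, natural_rate)]
```

A 4.5-point excess of unemployment over the natural rate thus maps to roughly a 9 percent output loss, which is the magnitude students should take away from the graph.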
The FRED graph will update and display the difference between the civilian unemployment rate and the natural rate of unemployment multiplied by negative two (represented by a blue line on the graph), as per Okun’s Law. The instructor will see the natural rate of unemployment (the red line on the graph) along with the estimated loss or gain (in percentage terms) in the level of real output, as hypothesized by Arthur M. Okun. The instructor, at this point, could include the civilian unemployment rate on the same picture to enhance discussion of the relationships between the variables.2

Suggested Discussion Points

As indicated in the introductory section of this article, there is always a “story” (in many cases, more than one) behind the historical performance of a particular variable, and visual representation of data makes it easier and, most importantly, more exciting to tell it. For instance, the instructor could start off by pointing out the relationship between output growth (blue line) and unemployment (green line) in the U.S. economy, i.e., how the output growth rate is positive (negative) whenever the civilian unemployment rate is below (above) the natural rate of unemployment (red line). Next, the instructor could present economic events in order to explain the behavior of the unemployment rate, and therefore output growth, in different time periods. For instance, the oil crises produced by the Organization of Petroleum Exporting Countries (OPEC) in the early and late 1970s led the U.S. economy into recession (represented by grey shaded areas on the graph), raised the civilian unemployment rate above the long-term natural rate of unemployment, and therefore produced negative output growth in the economy. Conversely, the surge in new technologies relating to computers, such as advancements in microprocessor technology, the Internet, improvements in communications technology in the form of cellular phones and e-mail, as well as electronic commerce, led to a rise in information-technology-related employment and contributed to a positive output growth rate in the U.S. economy during the latter half of the 1990s and early 2000s. It should be pointed out that the civilian unemployment rate (the green line) during this period was below the long-term natural rate of unemployment. Perhaps the most striking period on the graph is the 2007-2009 recession, also known as the “Great Recession”, which was triggered by a sharp decline in housing prices and a related crisis in mortgage loans. Since several financial securities in the economy were actually built on those home mortgage loans, defaults in the mortgage market led to the collapse of many key U.S. financial institutions and a subsequent freeze in the credit markets, i.e., lending halted. The resulting impact of the credit crunch was felt in the broader economy, especially in the labor market (hiring froze and layoffs soared), with the official unemployment rate climbing to almost 10 percent.
As the gap between the long-term natural rate of unemployment (red line) and the actual rate of unemployment (green line) widened, real output in the economy declined by nearly 9.7 percent (blue line), as estimated by the Okun’s Law formula.

Conclusion

The Federal Reserve Economic Data repository is a user-friendly and powerful Web-based application that can be used to complement discussions of economic theories and concepts with graphical representations that may not be readily available in popular economics textbooks. The online data manipulation feature of the FRED database eliminates the need to download data series and manipulate them through spreadsheets, reducing the amount of time and energy required to construct a data series of interest. Moreover, created data series and graphics can be saved online as long as the user has a registered account, free of charge, with FRED. The instructor can easily refer to the saved graphics in the classroom to elaborate on economic concepts and theories as well as economic events to explain variable movements. In fact, the author of this paper executed a series of workshops for economics faculty, sponsored by the Center for Innovation in Teaching and Learning (CITL) on campus, during the spring semester of 2018, covering several topics by means of the FRED database in order to assist instructors with classroom presentations and discussions. Moreover, the FRED database is a perfect platform for Principles of Economics courses to satisfy learning outcomes related to information literacy. Therefore, it is as important for instructors to walk through the construction of a FRED graphic in the classroom as it is to discuss economic concepts and theories by displaying finished FRED graphics. Consequently, instructors can create assignments that require students to utilize the available tools of FRED and provide the means to complete them successfully. Although I have only recently started to use the FRED repository and its data manipulation tools in the classroom, student feedback has been quite positive. More precisely, the visual aspect of time-series data on the white screen has made students more interested in historical economic events. As Mendez-Carbajo, Taylor, and Bayles (2017) write, “…by connecting course material to current events and articulating abstract theoretical concepts through the use of real data, the curriculum of undergraduate macroeconomics becomes more engaging to our students” (p.23).

References

Mendez-Carbajo, D. (2015). Visualizing data and the online FRED database. The Journal of Economic Education, 46(4), 420-429.

Mendez-Carbajo, D., Taylor, K., & Bayles, M. (2017). Building a Taylor rule using FRED. Journal of Economics Teaching, 2(2), 15-29.

Patel, D., & Saunoris, J. (2016). Using the FRED Excel-based application to improve learning outcomes in economic courses: From student to practitioner. Journal of Economics and Finance Education, 15(2), 37-49.

Suiter, M., & Taylor, K. (2016). Resources for economic educators from the Federal Reserve Bank of St. Louis. The Journal of Economic Education, 47(1), 71-75.

Federal Reserve Economic Data (FRED®)

Appendix

Figure 1 – Screen Capture of Civilian Unemployment Rate (Complete Data Range)
Figure 2 – Screen Capture of Edit Graph Options Window
Figure 3 – Screen Capture of Add Line Tab and Add Data Series Button
Figure 4 – Screen Capture of Edit Line Tab and Modify Frequency Drop-Down Menu
Figure 5 – Screen Capture of Civilian Unemployment Rate & Natural Rate of Unemployment
Figure 6 – Screen Capture of Customize Data Menu
Figure 7 – Screen Capture of EDIT LINE 1 Tab
Figure 8 – Screen Capture of Okun’s Law and Natural Rate of Unemployment
Figure 9 – Screen Capture of Okun’s Law, Natural Rate of Unemployment, & Civilian Unemployment Rate

n Experiencing the Past through Service to the Community: The Case for Incorporating Service-Learning Opportunities into BCC History Courses

Daniel Saperstein and Evan Saperstein

Adjunct Professors, History

History is important. It can tell us who we were, who we have become, and who we can be. It can bear witness and impart knowledge. It can connect the past with the present and build bridges to the future. It can explain ideas and movements, provoke thought and questions, and stimulate discussion and debate. Yet, for too many college students, the study of history is humdrum. It conjures images of dry, required survey courses and the monotony of memorizing lists of people, places, and events. It invokes memories of tattered textbooks, tedious timelines, and tired topics. While history professors have tried to find ways to challenge these stereotypes (e.g., in-class projects, interactive dialogue, digital media), students all too often see history as “boring.”


History is not the only subject that faces these challenges. Faculty across disciplines continue to seek new ways to make course material more relatable and relevant. To that end, through recent departmental and school-wide initiatives, a growing number of colleges and universities have turned their focus to promoting student learning experiences outside of the classroom, a decades-old pedagogical philosophy known as “experiential learning” (Dewey, 1916, 1938; Kolb, 1984, 2009; Lewin, 1951; Piaget, 1952). One form of experiential learning that has gained particular attention in academia is “service-learning,” which promotes learning experiences outside of the



classroom through civic engagement with the community (e.g., field trips, conferences, internships, volunteer work, and local community initiatives) (Bellner & Pomery, 2005; Deeley, 2010; Donahue, 2000; Foucar-Szocki & Bolsing, 1999). To breathe life into historical study and groom civic-minded citizens, Bergen Community College (BCC) should follow the lead of other colleges and universities and offer increased service-learning opportunities for students enrolled in history courses, as detailed below.

How Colleges and Universities Have Incorporated Service-Learning Into History Courses

While the scholarship on service-learning continues to grow, “[t]he scholarship drops off in the discussion of humanities courses in general, and in history courses in particular” (Straus & Eckenrode, 2014, p. 255). Nevertheless, in recent years, several scholarly journals have described the efforts of a growing number of U.S. colleges and universities that have offered service-learning projects as part of their history courses. This scholarship explains how some history professors have blended course material with service-learning experiences in courses ranging from U.S. to Latin American to Asian history. It also speaks to the range of potential service-learning opportunities across local, regional, and national institutions, and the resulting benefits to student development and the community. Examples of such scholarship include the following:

For an Asian history survey course at Portland Community College (PCC) in Oregon, students volunteered for 10 hours at local institutions such as the Portland Classical Chinese Garden, the Nikkei Legacy Center, or PCC’s Student Success Center (Gray, 2006). At the Portland Classical Chinese Garden, students devised programs for children, participated in festivals, and welcomed visitors. At the Nikkei Legacy Center, students helped promote Japanese culture by participating in poetry and literary events, studying the history of Japanese Americans, and making paper cranes (origami). At the PCC Student Success Center, students interacted with foreign students through conversation groups. As part of the service-learning opportunity, students had to maintain a reflective journal and write an essay about the experience, be evaluated by the participating institution, and present about the experience to the class. By the end of the course, the instructor found that students had “gained a valuable understanding” of other cultures and “emerge[d] as citizens” with a greater appreciation for history (Gray, 2006, p. 205).

For an urban history course at a liberal arts university in Western New York (SUNY Fredonia), students also left “the comfort zone of a traditional history course” and partnered with the local community (e.g., the Fenton History Center) to design digital exhibits about Italian American, Latin American, and African American experiences across the state of New York, and to present their research at a history conference (Straus & Eckenrode, 2014, p. 262). The experience included students interviewing a site director, taking photographs, and performing specific, hands-on research tasks. Although the students first expressed “fear, resistance, and skepticism,” according to the instructor, they later “develop[ed] their academic and professional skills” and “rated the course highly” (Straus & Eckenrode, 2014, pp. 262-263).

There are other examples of partnerships with local institutions. For a course entitled “Conflict and Consensus in History” at Gwynedd Mercy University in Pennsylvania, students conducted biographical interviews at a local senior living community to actively learn about earlier generations (Clinton, 2015). These students completed an oral presentation and a reflection essay about their experience, including standard written accounts and more “creative” narratives (e.g., scrapbooks, web pages, videos) (Clinton, 2015). One of the students noted “that the history told in textbooks is not the only history that there is to know” (Clinton, 2015, p. 112).

For a U.S. social and cultural history course at Kennesaw State University in Georgia, students also interviewed local residents about daily life during previous eras (e.g., the Great Depression and World War II) and chronicled their findings (through reflection essays) in the Center for Regional History database (Nystrom, 2002). Rather than writing a traditional research paper and/or completing a final exam, students submitted pre-reflection and post-reflection essays about the interview process. As Nystrom (2002) recalls, while most students were initially “unenthusiastic, borderline bored, or quickly dropped the class” (p. 62) considering the amount of work involved, “all the students who completed the assignment enjoyed the experience” (p. 67).

Other service-learning projects have involved national institutions.
For example, for an honors course entitled “Wealth and Poverty in American History” at Ithaca College, students worked with organizations like the Red Cross, a homeless shelter, Loaves and Fishes, and the Living Wage Coalition to actively learn about economic inequality (Smith, 2009). As part of their experience, students analyzed primary and secondary sources for the purpose of exploring American wealth and poverty in the past and present. Subsequently, they wrote weekly journal entries (reflections on course reading and service work), a final portfolio essay, and a letter reflecting on their service-learning experience. Smith (2009) concludes that “service-learning in history courses can emphatically refocus students on the social benefits of studying the past,” and help lead students “to a life devoted to the service of others” (p. 69). And, for a course entitled “Service and Study in Latin America” (History 243) at Pace University, students went so far as to travel to Argentina, Brazil, or Peru to fulfill a service-learning component (Greenberg, 2008). As part of the course, students completed community development term projects (e.g., renovating a public elementary school, renovating a community social center, providing health care training abroad, renovating a building and constructing a hospital prenatal care program, and providing disaster relief and food donations), capstone seminars, and journal reflections. By the end of each trip, students completed evaluations that “revealed a strong preference for civic engagement over conventional teaching methodologies” (Greenberg, 2008, p. 299).

How BCC Can Incorporate Service-Learning Opportunities Into History Courses

In recent years, BCC has made strides to incorporate service-learning as part of the broader educational experience through organizations like Campus Compact and the Mahwah Environmental Volunteers Organization (MEVO). On its website, BCC defines service-learning as a “course-based, credit-bearing educational experience,” where students partake in an “organized service activity” that “meets community needs,” nurtures “deeper understanding of course content,” instills a “strong sense of civic responsibility,” and “enhance[s] critical thinking skills” (Bergen Community College, 2018). Service-learning projects can “range from single day events to several hours a week per semester” (Bergen Community College, 2018). BCC emphasizes the benefits of service-learning for faculty, students, and the community alike, including that it is an “effective teaching tool,” “enriches student learning of course material,” and “gives the community the opportunity to serve as a partner/mentor in education” (Bergen Community College, 2018). Presently, more than 25 departments at BCC offer service-learning opportunities through BCC’s Career and Workforce Development Center (Bergen Community College, 2018). These departments combine to cover a whole host of disciplines, ranging from mathematics to the sciences to the liberal arts; history, however, is not among them.
We are proposing that students enrolled in history courses at BCC have greater opportunity to partake in service-learning initiatives at local, statewide, regional, and national organizations as part of fulfilling course requirements. In coordination with history professors and through collaboration with BCC’s Career and Workforce Development Center, service-learning projects can reinforce course material, strengthen student bonds to the community, cultivate a culture of civic-minded service, increase potential career prospects, and further enrich educational experiences. In furtherance of this pursuit, BCC history professors can work with one another to share ideas and create a database of ongoing service-learning opportunities. Such a database can in turn serve to assist and encourage history students in their search for the appropriate service-learning experience. As part of the experience, students should have to participate for a minimum of 10 hours at the site, maintain a log, write a reflection essay (in lieu of the final research paper), and deliver a presentation to the class about their service activity. And there is no shortage of service-learning opportunities for history students to


partake in throughout the tri-state area. BCC students enrolled in U.S. survey history courses (HIS-111, HIS-112, HIS-113, and HIS-114) can partner with the Bergen County Historical Society (BCHS), located in River Edge, New Jersey, with town historical societies, or with local museums in Bergen County (e.g., the Aviation Hall of Fame in Teterboro or the New Jersey Naval Museum in Hackensack). They also can serve the community by interviewing and then telling the stories of community leaders, residents of local independent or assisted living facilities, and war veterans. In addition, such students can document local history through photographic projects, songs, poems, articles, or journals about a particular area or building within the community (e.g., a community agency, homeless shelter or local soup kitchen, library, park, or local historic preservation commission or organization). BCC U.S. history students also may complete service-learning projects at statewide institutions, such as the New Jersey Historical Society (NJHS) in Newark, or at more regionally focused institutions, such as the museum and library of the New York Historical Society in Manhattan and the Bronx, Brooklyn, Queens, or Staten Island Historical Societies. Moreover, through internships, volunteer work, or other means, such students can assist with collections and exhibitions at institutions such as the American Museum of Natural History (AMNH) or the Museum of the City of New York (MCNY), or help carry out the mission of non-profit organizations in the boroughs of New York City. Service-learning opportunities are not limited to students studying U.S. history.
BCC students enrolled in Genocide and Holocaust (HIS-146) can partake in events and other activities at the BCC Center for Peace, Justice and Reconciliation (CPJR), which partners with the New Jersey Commission on Holocaust Education and other New Jersey colleges and universities, or can help organize TEDx talks about genocide at BCC or at other institutions across the state (such as the Ramapo College Gross Center for Holocaust & Genocide Studies, the Rutgers University Center for the Study of Genocide and Human Rights, or the Montclair State University Holocaust, Genocide, and Human Rights Education Project). Students studying women’s history (HIS-105 and HIS-116) can participate in seminars and other activities at the Association for Women’s Rights in Development (AWID), the National Organization for Women (NOW), or the Women’s Environment and Development Organization (WEDO). And students enrolled in a Western Civilization survey course (HIS-101 and HIS-102) can volunteer at cultural institutions of different faiths and nationalities, such as the New Jersey Buddhist Culture Center (NJBCC) or the Ukrainian American Cultural Center of New Jersey (UACCNJ). Moreover, after taking a service-learning focused course in the history of Modern Europe (HIS-106 and HIS-107), Modern Asia (HIS-121), Modern Africa (HIS-126), or Latin America (HIS-130, HIS-131, and HIS-132), students may be more inclined to study abroad. And by living abroad, students can actively learn about the rich cultures, traditions, and histories of other countries. In turn, BCC can continue to build partnerships with higher education institutions across Europe, Asia, Africa, and Latin America.

Conclusion

BCC’s history department offers a diverse range of courses, from broader survey courses in Western Civilization and U.S. history to more specialized courses in African, Asian, Latin American, and Women’s history. While not all BCC history courses will lend themselves to service-learning, such opportunities, if properly tailored, can better the educational experiences and career development of history students while helping to build ties with local communities. As other colleges and universities have shown, service-learning can further the mission of historical inquiry and civic participation.

While it may sound clichéd, history does repeat itself. The takeaways from historical study and analysis can and do advance the civic good. That is not to say that history always serves this purpose; there are certainly examples where the lessons of history have been misinterpreted or misapplied. But overall, the study of history can and does benefit society, which is why history professors find such meaning in their work. And what better way to instill this ethos than to provide students with the opportunity to complement their in-class studies with out-of-class service-learning experiences?

References

Bellner, M., & Pomery, J. (Eds.). (2005). Service-learning: Intercommunity & interdisciplinary explorations. Indianapolis, IN: University of Indianapolis Press.

Bergen Community College. (2018). Service learning. Retrieved from http://ww3.bergen.edu/pages 1/Pages/2150.aspx

Clinton, M. (2015). “I’ll remember that”: Oral history, service learning, and historical understanding. Teaching History: A Journal of Methods, 40(2), 107-117.

Deeley, S. J. (2010). Service-learning: Thinking outside the box. Active Learning in Higher Education, 11(1), 43-53.

Dewey, J. (1916). Democracy and education: An introduction to the philosophy of education. New York, NY: Macmillan.

Dewey, J. (1938). Experience and education. New York, NY: Collier Books.

Donahue, D. M. (2000). Charity basket or revolution: Beliefs, experiences, and context in preservice teachers’ service learning. Curriculum Inquiry, 30(4), 429-450.

Foucar-Szocki, R., & Bolsing, C. (1999). Linking hospitality management programs to industry. Hospitality Management Education, 37-65.

Gray, S. (2006). Beyond facts: Service-learning and Asian history. Academic Exchange Quarterly, 10(2), 201-206.

Greenberg, D. J. (2008). Teaching global citizenship, social change, and economic development in a history course: A course model in Latin American travel/service learning. The History Teacher, 41(3), 283-304.

Kolb, A. Y., & Kolb, D. A. (2009). Experiential learning theory: A dynamic, holistic approach to management learning, education and development. In S. J. Armstrong & C. V. Fukami (Eds.), The SAGE handbook of management learning, education and development (pp. 42-68). London: Sage Publications.

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. New Jersey: Prentice Hall.

Lewin, K. (1951). Field theory in social science. New York, NY: Harper & Row.

Nystrom, E. A. (2002). Remembrance of things past: Service learning opportunities in U.S. history. Oral History Review, 29(2), 61-68.

Piaget, J. P. (1952). The origins of intelligence in children. New York, NY: International Universities Press.

Smith, M. (2009). What history is good for: Service-learning and studying the past. Learning and Teaching, 2(3), 50-73.

Straus, E., & Eckenrode, D. (2014). Engaging past and present: Service-learning in the college history classroom. The History Teacher, 47(2), 253-266.

n Literature Review: Effects of Written Performance by Academic English as a Second Language Students Using Integrated Listening-to-Writing Task Repetitions

John Bandman
Assistant Professor, Business Hotel Restaurant Management

Introduction

The aim of researching the effects of listening-to-writing task repetition on academic ESL writing is to examine how students in an academic ESL program improve in their writing after repeating a series of tasks at delayed time intervals (one week between the initial task and the first repetition, and two weeks between the first and second repetitions). Many seminal researchers have investigated the effects of task repetition on subsequent task performance (Bygate, 1996, 2001; Bygate & Samuda, 2005; Skehan, 1998). Much of the task repetition research has focused on oral, not written, language production. Initial task completion is considered preparation for subsequent performance, as learners become familiar with the task during the first performance. Bygate (1996) indicates that at the first task performance, learners focus primarily on the conceptualization process. Therefore, it is necessary to compare the students’ corpora after a series of written task repetitions based on a listening passage, and to measure complexity, accuracy, and fluency (CAF). While learners perform a task for the first time, they spend much time planning preverbal messages (Bygate, 1996). Learners retrieve from memory the target language they consider most appropriate for the task, thereby familiarizing themselves with the messages they wish to convey. Through repetition of this process, they produce more meaningful language at higher levels of complexity and fluency as an outcome of task familiarity.



The repetition process mirrors a behaviorist perspective in that drills are given in foreign language classes because educators assume students learn their target language through habit formation via repeated task completion. According to Paulston and Bruder (1976), repetition drills are defined as “plain repetition of the cue” (p. 12). On the contrary, Bygate (2006) indicates that task repetition is not mere repetition of cues but rather repetition of form and content that have become familiar to the learners. Because it is challenging for students to focus on meaning and form at the same time, repetition provides the additional time needed to communicate effectively and fulfill the task while properly formulating target grammar and vocabulary. A possible reason for the absence of research showing improvements after repeating a task is that researchers have taken such results for granted.

Research Focus & Necessity for a Literature Review

This research aims to answer the following question: How does repetition of integrated listening and writing tasks impact international students’ academic writing? The subordinate research questions are:

1. How does repeating the same integrated task at varied time intervals affect CAF?
2. How does feedback influence CAF in a repeated task?
3. What are the learners’ perceptions of integrated task repetition and the extent to which repetition helps build second language proficiency?

Theories that Help Guide this Study

According to Willis (1996), tasks are goal-oriented with real, specific outcomes, such that the target language is used in a meaningful way to finish the task rather than to produce specific forms. Willis identifies six types of tasks: listing; ordering and sorting; comparing; problem solving; sharing personal experiences; and innovating. Under the Willis model, learners who are asked to make up their own versions of story endings or to give advice are completing a problem-solving task. Willis suggests a task that engages learners’ interests, places primary focus on meaning, offers an outcome, requires completion, and relates to real-world outcomes.

Pedagogical Tasks Central to Language Learning

According to Ellis (2003), “tasks hold a central place in current second language acquisition (SLA) research and language pedagogy” (p. 1). Willis (1996) maintains that a task is any activity in which the learner uses the target language to achieve an outcome by communicating in the newly learned second language (L2). The major roles that tasks play in improving language learning rest on the theory that language is primarily a way to make meaning, the major pillar of language use. Richards and Rodgers (2001) state that tasks provide the learner with both the input and output processing needed to learn the L2. Nunan (1989) adds that tasks require incorporating real activities in the classroom, and Nunan (1991) further adds that tasks are not only the core but also the foundation of classroom instruction. He maintains that tasks place emphasis on learning experiences combined with communication, thereby supporting task use as an integral learning component.

Task Repetition Improves CAF

The position on task repetition presented throughout my research is that task repetition positively improves a learner’s L2 output with respect to CAF (Bygate, 1996, 2001; Gass, Mackey, Alvarez-Torres, & Fernandez-Garcia, 1999; Lynch & McLean, 2001).
This follows Bygate’s (1996) claim that if learners are given the opportunity to repeat a task, they will gain oral accuracy because they are familiar with the task content by having completed the first task, thereby allowing them to focus the next time on producing the correct L2 formation. Additionally, learners search from their memory the language needed to complete the task so they become familiar with the message content. Then when they repeat the task, they are familiar with the content, so they shift their attention to selecting the language, thereby monitoring CAF (Bygate, 1999). Bygate (2001) adds: “part of the work of conceptualization, formation and articulation carried out on the first occasion is kept in the learners’ memory store and can be reused on the second occasion” (p. 29). These beneficial effects of task repetition align with Skehan’s (1998) dual-mode system proposal where L2 is categorized by exemplar- and rule-based systems, the former being lexical and the latter



requiring processing and therefore well suited for more controlled and less fluent language performance. This proposal suggests task repetition facilitates learners’ use of the rule-based system, thus helping improve fluency and complexity. Further, the Trade-Off Hypothesis (Skehan, 1998, 2009), previously known as the Limited Attentional Capacity Model (Skehan, 1998; Skehan & Foster, 2001), proposes that “due to capacity limitations, speakers must divide their attentional resources between all the processes a task requires… If various task demands exceed the available resources, then complexity, accuracy and fluency compete with each other” (Sample & Michel, 2014, p. 27). This supports the importance of reassigning the same task to allow the learner this shift of focus. Nunan (1991) states: “A lot of grammar can be learnt from the repeated use and exposure of target language forms. Also, the authentic use of target language by the learners in task-based materials can be regarded as positive evidence” (p. 283). While some of these theories pertain to oral repetition, the stages of task repetition they describe frame my study, which assesses written language production.

Definition of a Task

Defining a task can be difficult, and researchers offer additional meanings that can be ambiguous to practitioners attempting to distinguish between an activity, a drill, or an exercise (Crookes, 1986; Long, 1985, cited in Ellis, 2003; Nunan, 1989). Nunan (1989) defines a task as a “piece of classroom work which involves learners in comprehending, manipulating, producing or interacting in the target language while their attention is principally focused on meaning rather than form” (p. 10). A task differs from a drill in that a task includes a purpose for using and learning the L2, as opposed to merely learning the new language without a context.
According to Ellis (2003), a task is a type of workplan involving a primary focus on meaning, real-world language use, and incorporation of the language skills; a task draws on cognitive processes and provides a clearly stated communicative outcome (pp. 9-10). Skehan (1998) is critical of the presentation-practice-production approach, stating that only advanced learners would become proficient, and he instead supports a task-based approach for more rapid and efficient L2 learning. Even with these variations in definitions, common features of tasks are specific goals and outcomes, input data, and at least one relevant activity (Kumaravadivelu, 1993).

Definition of CAF

The dictionary defines complexity as being “composed of two or more parts” and “hard to separate, analyze or solve” (Merriam-Webster). According to Pallotti (2009), accuracy (correctness) is “the degree of conformity to certain norms … Fluency is the capacity to produce speech at normal rate without interruption” (p. 591). Pallotti (2016) streamlines three definitions of complexity: first, structural (proper


arrangement of texts and linguistic systems pertaining to their relational patterns); second, cognitive (pertaining to the processing associated with the structures); and third, developmental (“the order in which linguistic structures emerge and are mastered in second language acquisition”) (p. 118). Housen and Kuiken (2009) state that complexity is the most ambiguous of the three dimensions of the CAF triad, distinguishing task complexity (properties of the language task) from L2 complexity (cognitive or linguistic: performance and proficiency). Simply stated, complexity and accuracy pertain mainly to L2 knowledge representation and the level of analysis of linguistic patterns. Fluency pertains to learners’ ability to apply their linguistic L2 knowledge, as reflected in the speed and ease with which they access relevant L2 information to communicate meanings in real time, with “control improv[ing] as the learner automatizes the process of gaining access” (Wolfe-Quintero et al., 1998). CAF entails the properties of L2 as a product, while its development is a process in itself. Numerous researchers have explored the relationship between tasks and linguistic performance, measured along the CAF dimensions. Skehan (1996a) added complexity as a third component to accuracy and fluency to complete a three-dimensional language proficiency model. Evidently, there is some lack of agreement on the meaning of CAF among researchers. Skehan (1996a) adds that “[complexity] requires a learner who explicitly accepts such developments as goals and who is driven by whatever means to achieve them” (p. 46). Complexity also extends to the diverse and complex nature of vocabulary use in student writing. Lexical diversity measures the number of different words used in a text. According to Jarvis (2013), variability, volume, evenness, rarity, and discrepancy are pillars of lexical diversity.
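The simplest operationalization of lexical diversity is a type-token ratio: the proportion of distinct words to total words. The sketch below is illustrative only (the sample sentences are invented, and published CAF studies use length-corrected measures such as those Jarvis discusses):

```python
def type_token_ratio(text: str) -> float:
    """Ratio of distinct words (types) to total words (tokens).

    A crude proxy for lexical diversity: higher values mean less
    word repetition. Length-corrected indices are used in practice,
    since raw TTR falls as texts get longer.
    """
    tokens = text.lower().split()
    return len(set(tokens)) / len(tokens) if tokens else 0.0

# Invented sample sentences (not drawn from any study corpus)
repetitive = "the cat saw the cat and the cat ran"               # 9 tokens, 5 types
varied = "every learner brings unique prior knowledge to class"  # 8 tokens, 8 types
print(round(type_token_ratio(repetitive), 2))  # 0.56
print(round(type_token_ratio(varied), 2))      # 1.0
```

A measure like this could be applied to each draft in a repeated writing task to track whether diversity rises across repetitions.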
In order for a text to have higher lexical diversity, many different words with very little repetition are needed. Various factors that influence word, or lexical, complexity are word length, morphology, familiarity, etymological roots, ambiguity, and context (Specia et al., 2012).

Definition of Feedback

Feedback is the teacher’s response to errors. Kulhavey (1977) defines feedback as “any of the numerous procedures that are used to tell a learner if an instructional response is right or wrong” (p. 211). One major goal of providing feedback on writing is to engage students in revising their work through repeating the tasks. Yet Kulhavey states that few studies incorporate feedback, apart from feedback as an aid in scaffolding. An exception, however, is Ellis (2009), who emphasizes the crucial importance of both task repetition and feedback as learning tools in language teaching. Ferris and Hedgcock (2014) support feedback, stating: “Both teachers and students feel that teacher feedback on student writing is a critical, nonnegotiable aspect of writing instruction” (pp. 237-238). Manchon (2014) adds: “the availability of feedback and the role of feedback in bringing about potential benefits should be made central in future task repetition…these gains are



purported to be crucially dependent on the learners’ own engagement with and processing of the feedback received” (p. 31). The writing performance of those who receive feedback and of those who do not will be compared and contrasted, thereby testing feedback’s importance in task repetition.

Literature Review

Written Task Repetition

Written task repetition helps free students’ limited attentional resources, allowing them to devote more of their cognitive resources to the “formal and systemic aspects of language” (Ahmadian & Tavakoli, 2011; Ellis, 2005). The advanced skills needed for a knowledge-transforming approach to writing require a great deal of memory and focus. For many beginning writers given a task for the first time, using the transcription and oral fluency skills necessary to decode the meanings of words competes with the working memory required for critical thinking, which is crucial for success in academic ESL (Kellogg, 2001; Olive & Kellogg, 2002; Torrance & Galbraith, 2005). Bygate (2001) argues that when L2 students complete a task for the first time, their speech production system needs to carry out all the relevant language-processing steps under time constraints. Thus, based on this limited attentional model of speech production (Kormos, 2006; Skehan, 2009), it could be argued that on the first performance of a given task, learners have to use their cognitive skills to conceptualize the context of the task. Their attention is devoted to understanding the task rather than to formulating and monitoring their L2 production. Similarly, students also have to overcome breakdowns in their L2 performance due to incomplete lexical or syntactic proficiency (Dörnyei & Kormos, 1998). Although this literature provides arguments for oral task repetition as a rationale for task usage, the extent to which it transfers to written task repetition is not known.
Task repetition, unlike first-time completion, allows L2 students to rely on previously conceptualized content they recall and to apply recently used linguistic constructions to formulate their messages. This also reduces attentional demands, so they can simultaneously conceptualize, encode, and monitor their L2 production. During the task repetition stage, students can focus on improving their lexical complexity as well as their fluency.

Task-Based Writing
Writing gives students more time than speaking to think about their L2 output and to plan what they want to convey. Many studies show that L2 output improves after students repeat the same task. Much of this research, however, has focused on integrated listening-to-speaking and reading-to-writing tasks as well

as on independent speaking and writing tasks. Because many tasks focus more on meaning than on form, when it comes to prioritizing one or the other, students will most likely choose meaning. Kellogg (1996) theorizes that working memory aids students in planning ideas that they then transcribe into sentences, thus supporting knowledge transfer as a resulting skill and showing how they use previously learned information. Essay writing requires students to use many cognitive processes, including planning, developing sentences, and revising (Hayes, 1996; Kellogg, 1996; Levy & Ransdell, 1995). This endorses the importance of treating writing as a process when repeating integrated tasks.

CAF in Task Repetition
Bygate (1999) maintains that task repetition has many beneficial effects on L2 performance. Studies show it improves accuracy and fluency, although accuracy may decrease when complexity increases. Ellis (2003) suggests accuracy will develop if tasks incorporate planned discourse; conversely, lexical complexity will increase if tasks incorporate unplanned discourse. Planning, however, is a necessary step to improve L2 output in task repetition. Foster and Skehan (1996) suggest that at least ten minutes of pre-task planning increases complexity, accuracy, and fluency. Crookes (1989), unlike Foster and Skehan (1996), argues that more planning time before task completion increases only complexity, not accuracy. Ellis (2005) argues that task repetition is part of task planning in that repeating identical or partially altered versions of the same task improves fluency and complexity. Bygate (1996) was among the first to research task repetition and its effect on L2 cognitive processing. In this study, students watched a short video, narrated it, and then repeated the same task. Results showed greater complexity in L2 production in the second narration than in the first.
Results also showed repetition of several phrases from the first narration, likely resulting from discourse planning between tasks. Several years later, Gass, Mackey, Alvarez-Torres, and Fernández-García (1999) conducted a study of 104 Spanish language students. As in Bygate’s (1996) study, students were asked to narrate a short video, but there were three task repetitions, for a total of four task completions. Results showed an increase in accuracy and lexical complexity. Results from both of these studies suggest a shift of focus from conceptualization to fluency. Bygate (2001) conducted another task repetition study, on 48 ESL students who completed the same narrative and interview tasks twice with a ten-week delay between initial task completion and repetition. Bygate (2001) found stronger fluency and complexity, but no significant improvement in accuracy. When learners repeat tasks, they become familiar with the content the first time


when they do most of their conceptualizing, formulating, and articulating of language. This enables them to redirect their attention from content to proper language usage, leading to greater complexity, fluency, and accuracy (Bygate, 1999; Bygate & Samuda, 2005). Bygate and Samuda (2005) report, “part of the work of conceptualization, formulation, and articulation carried out on the first occasion is kept in the learners’ memory store and can be reused on the second occasion” (p. 29). Although conceptualization, formulation, and articulation typically improve in L2 after task repetition, Bygate and Samuda suggest that such an effect may be minimal if students’ performance of the same task is already automated; in that case, learners are simply reusing information retrieved from memory when producing language. Further empirical research comparing L2 output after timed intervals between initial task completion and task repetition would strengthen or refute this idea. Ellis (2005) differentiates task completion from task repetition: the first performance is a preparation for the repetition, during which learners conceptualize the task as a rehearsal. Skehan (2009) indicates that when students perform oral tasks, they are naturally enticed to plan what they will say during this conceptualization stage. When they repeat the task, however, the transition from learning to practice frees them to enhance their L2 output. Bygate (2001) also stresses that learners must first formulate the language needed to complete the task; when they repeat it, they are more familiar with the topic, so conceptualization time decreases, freeing them to enhance complexity, accuracy, and fluency.
Skehan (1998) relates fluency to the ability to communicate meaningfully with minimal pauses, accuracy to the ability to apply the target grammar, and complexity to a wider range of sentence structure. Skehan (1998) links the shift of focus between initial task completion and task repetition to the “trade-off” hypothesis: learners have limited attentional capacity, so they must prioritize where they place their attention. According to Bygate (1999), learners usually focus first on meaning; task repetition then “frees” them to switch their attention to selecting the appropriate language to use. This suggests that reducing the demands on cognitive processing through task repetition is useful for teaching grammar. A number of later researchers built on this seminal task repetition research. Lynch and Maclean (2000) conducted research on fourteen English for Specific Purposes students, who completed a poster-carousel task that required them to repeatedly answer questions from classmates about posters they had prepared. Some questions were duplicated while others were similar. Lynch and Maclean reported that accuracy and fluency improved as a result of task repetition.
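Skehan's three constructs can be operationalized as simple numeric indices. The sketch below shows one common operationalization; the specific measures vary across the studies cited here, and the counts are invented for illustration.

```python
def caf_indices(words, minutes, clauses, error_free_clauses, t_units):
    """One illustrative operationalization of CAF (measures vary by study):
    fluency    -> words produced per minute
    accuracy   -> proportion of error-free clauses
    complexity -> clauses per T-unit (amount of subordination)
    """
    return {
        "fluency": words / minutes,
        "accuracy": error_free_clauses / clauses,
        "complexity": clauses / t_units,
    }

# Hypothetical counts for a first performance and its repetition.
first = caf_indices(words=120, minutes=3, clauses=18, error_free_clauses=9, t_units=15)
repeat = caf_indices(words=150, minutes=3, clauses=21, error_free_clauses=14, t_units=14)
print(first["fluency"], repeat["fluency"])    # 40.0 50.0
```

Under the trade-off hypothesis, one would expect gains in some indices to come at the expense of others when attentional resources are stretched; repetition is argued to relax that constraint.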

Birjandi and Ahangari (2008) cite Bygate (1999) on the beneficial outcomes of task repetition: “Learners are likely to initially focus on message content, and once message content and the basic language needed to encode it has been established, to switch their attention to the selection and monitoring of appropriate language” (p. 31). Birjandi and Ahangari also build on Bygate (1996, 1999) by noting that because a task was previously formulated, there would be fewer hesitations in L2 oral output. In Birjandi and Ahangari’s study involving Iranian English as a Foreign Language (EFL) learners, three types of oral tasks were used: personal narratives, story narratives, and decision-making tasks, which the participants practiced through task repetition. Results indicated increases in fluency and complexity, but not accuracy. The researchers added that gains in accuracy and complexity were highest for personal narrative tasks. The reason may be that the participants were more familiar with this topic than with the others, resulting in a shift of attention from accuracy to fluency. Not all studies, however, show strong improvements in fluency. Matsumara, Kawamura, and Affricano (2008) conducted a study of English language learners who completed both narrative and decision-making tasks. The findings indicate that task type does significantly affect performance: there was no significant improvement in fluency, but complexity and accuracy improved. Accuracy improved more strongly on the narrative task, while complexity improved more strongly on the decision-making task. Despite such variations in the findings, task repetition research consistently supports classroom implementation. While task repetition studies on oral language output continue to flourish, there has also been research on task-type repetition as opposed to same-task repetition. For example, Gass et al.
(1999) conducted a study to identify whether student performance of the proper usage of estar in Spanish would improve if a similar task were assigned instead of an identical one. The results showed improvements only when the same task was repeated, not as a result of task-type repetition. Takimoto (2012) conducted another such study, on politeness forms, with English language learners. Results show that same-task repetition produced greater improvement in L2 output than did task-type repetition.

Integrated Tasks
According to Plakans (2010), language test developers and educators need a greater understanding of how writers respond to integrated tasks and how they compose the meaning of the information they convey in their writing. Academic ESL writing tasks are usually integrated with at least one other skill in order to elicit more authentic integrative language use (Hinkel, 2006; Plakans, 2009). Researchers have investigated task representation in integrated reading-to-writing tasks. Various studies suggest that integrated task usage results in positive test results, a reflection on language teaching and learning (Cumming et al., 2004; Fox,

2004). Meanwhile, results from other studies suggest that integrated task usage effectively addresses construct validity, the degree to which a test measures what it claims to measure (Bachman & Palmer, 1996; Charge & Taylor, 1997; Upshur & Turner, 1999). Writing has often been the sole construct in writing courses and independent assessment tasks. However, writing is not an autonomous language skill; rather, it is combined with reading, listening, and speaking. Various tests, like the Test of English as a Foreign Language (TOEFL), have incorporated integrated writing tasks in addition to independent tasks (Horowitz, 1986). For example, the TOEFL added a writing task that requires test takers to listen to and read dialogue and then to summarize the content. Other high-stakes tests that use integrated tasks are the Canadian Academic English Assessment, the Ontario Test of English as a Second Language, and the Certificate of Proficiency in English (Yang & Plakans, 2012). Such integrated task assessments warrant an understanding of the impact of skills integration on L2 students when faced with writing tasks that require other skills. Further research on this question will have implications for test developers and educators, as there appears to be a gap in the research exploring written performance on integrated listening-to-writing tasks.

References
Ahmadian, M. J., & Tavakoli, M. (2011). The effects of simultaneous use of careful online planning and task repetition on accuracy, complexity, and fluency in EFL learners' oral production. Language Teaching Research, 15(1), 35–59. https://doi.org/10.1177/1362168810383329
Bachman, L. F., & Palmer, A. S. (1996). Language testing in practice. Oxford, England: Oxford University Press.
Birjandi, P., & Ahangari, S. (2008). Effects of task repetition on the fluency, complexity and accuracy of Iranian EFL learners. The Asian EFL Journal, 10(3), 28–52.
Bygate, M. (1996). Effects of task repetition: Appraising the developing language of learners. In J. Willis & D. Willis (Eds.), Challenge and change in language teaching (pp. 136–146). Oxford, England: Heinemann.
Bygate, M. (1999). Task as the context for the framing, re-framing and unframing of language. System, 27, 33–48.
Bygate, M. (2001). Effects of task repetition on the structure and control of oral language. In M. Bygate, P. Skehan, & M. Swain (Eds.), Researching pedagogic tasks (pp. 23–48). Harlow: Longman.
Bygate, M. (2006). Areas of research that influence L2 speaking instruction. In E. Usó-Juan & A. Martínez-Flor (Eds.), Current trends in the development and teaching of the four skills. Berlin: Mouton de Gruyter.
Bygate, M., & Samuda, V. (2005). Integrative planning through the use of task repetition. In R. Ellis (Ed.), Planning and task performance in a second language (pp. 37–74). Amsterdam: John Benjamins.
Charge, N., & Taylor, L. B. (1997). Recent developments in IELTS. ELT Journal, 51(4), 374–380.
Cumming, A., Grant, L., Mulcahy-Ernt, P., & Powers, D. (2004). A teacher-verification study of speaking and writing prototype tasks for a new TOEFL. Language Testing, 21(2), 159–197.
Dörnyei, Z., & Kormos, J. (1998). Problem-solving mechanisms in L2 communication: A psycholinguistic perspective. Studies in Second Language Acquisition, 20, 349–385. https://doi.org/10.1017/S0272263198003039
Ellis, R. (2003). Task-based language learning and teaching. Oxford: Oxford University Press.
Ellis, R. (2005). Planning and task-based performance: Theory and research. In R. Ellis (Ed.), Planning and task performance in a second language (pp. 3–34). Amsterdam: John Benjamins.

Ellis, R. (2009). The differential effects of three types of task planning on the fluency, complexity, and accuracy in L2 oral production. Applied Linguistics, 30, 474–509.
Ferris, D., & Hedgcock, J. (2014). Teaching L2 composition: Purpose, process, and practice (3rd ed.). New York: Routledge.
Foster, P., & Skehan, P. (1996). The influence of planning on performance in task-based learning. Studies in Second Language Acquisition, 18(3), 299–324.
Fox, J. (2004). Test decisions over time: Tracking validity. Language Testing, 21, 437–465.
Gass, S., Mackey, A., Alvarez-Torres, M. J., & Fernández-García, M. (1999). The effects of task repetition on linguistic output. Language Learning, 49, 549–581.
Hayes, J. R. (1996). A new framework for understanding cognition and affect in writing. In C. M. Levy & S. E. Ransdell (Eds.), The science of writing (pp. 1–27). Mahwah, NJ: Erlbaum.
Hinkel, E. (2006). Current perspectives on teaching the four skills. TESOL Quarterly, 40(1), 109–131.
Horowitz, D. (1986). What professors actually require: Academic tasks for the ESL classroom. TESOL Quarterly, 20(3), 445–462.
Housen, A., & Kuiken, F. (2009). Complexity, accuracy, and fluency in second language acquisition. Applied Linguistics, 30, 461–473. https://doi.org/10.1093/applin/amp048
Jarvis, S. (2013). Capturing the diversity in lexical diversity. Language Learning, 63, 87–106.
Kellogg, R. T. (1996). A model of working memory in writing. In C. M. Levy & S. E. Ransdell (Eds.), The science of writing (pp. 57–71). Mahwah, NJ: Erlbaum.
Kellogg, R. T. (2001). Commentary on processing modalities and development of expertise in writing. In G. Rijlaarsdam, D. Alamargot, & L. Chanquoy (Eds.), Studies in writing: Vol. 9. Through the models of writing (pp. 219–228). Dordrecht, The Netherlands: Kluwer Academic Publishers.
Kellogg, R. T., Turner, C., Whiteford, A., & Mertens, A. (2016). The role of working memory in planning and generating written sentences. The Journal of Writing Research, 7(3), 397–416.
Kormos, J. (2006). Speech production and L2 acquisition. Mahwah, NJ: Lawrence Erlbaum.
Kulhavy, R. W. (1977). Feedback in written instruction. Review of Educational Research, 47, 211–229.
Kumaravadivelu, B. (1993). The name of the task and the task of naming: Methodological aspects of task-based pedagogy. In G. Crookes & S. M. Gass (Eds.), Tasks in a pedagogical context. Clevedon: Multilingual Matters.
Levelt, W. J. M. (1989). Speaking: From intention to articulation. Cambridge, MA: MIT Press.
Levelt, W. J. M. (1999). Producing spoken language: A blueprint of the speaker. In C. Brown & P. Hagoort (Eds.), Neurocognition of language (pp. 83–122). Oxford: Oxford University Press.
Levy, C. M., & Ransdell, S. E. (1995). Is writing as difficult as it seems? Memory and Cognition, 23, 767–779.
Lynch, T., & Maclean, J. (2000). Exploring the benefits of task repetition and recycling for classroom language learning. Language Teaching Research, 4, 221–250.
Manchón, R. M. (2014). The distinctive nature of task repetition in writing: Implications for theory, research, and pedagogy. Estudios de Lingüística Inglesa Aplicada (ELIA), 14, 13–41.
Matsumara, M., Kawamura, K., & Affricano, A. (2008). Narrative task type repetition and changes in second language use in a classroom environment: A case study. The Educational Sciences, 10, 124–145.
Nunan, D. (1989). Designing tasks for a communicative classroom. Cambridge: Cambridge University Press.
Nunan, D. (1991). Communicative tasks and the language curriculum. TESOL Quarterly, 25(2), 279–295.
Olive, T., & Kellogg, R. T. (2002). Concurrent activation of high- and low-level production processes in written composition. Memory & Cognition, 30(4), 594–600.
Pallotti, G. (2009). CAF: Defining, refining and differentiating constructs. Applied Linguistics, 30(4), 590–601.
Pallotti, G. (2015). A simplistic view of linguistic complexity. Second Language Research, 31(1), 117–134.
Paulston, C., & Bruder, M. (1976). Teaching English as a second language: Techniques and procedures. Cambridge, MA: Winthrop Publishers.
Plakans, L. (2009). The role of reading strategies in integrated L2 writing tasks. Journal of English for Academic Purposes, 8(2), 252–266.
Plakans, L. (2010). Independent vs. integrated writing tasks: A comparison of task representation. TESOL Quarterly, 44, 185–194.
Ransdell, S., Arecco, M., & Levy, C. (2001). Bilingual long-term working memory: The effects of working memory loads on writing quality and fluency. Applied Psycholinguistics, 22(1), 113–128.
Richards, J., & Rodgers, T. (2001). Approaches and methods in language teaching (2nd ed.). Cambridge: Cambridge University Press.
Sample, E., & Michel, M. (2014). An exploratory study into trade-off effects of complexity, accuracy, and fluency on young learners' oral task repetition. TESL Canada Journal, 31(8), 23–46. https://doi.org/10.18806/tesl.v31i0.1185

Skehan, P. (1996a). A framework for the implementation of task-based instruction. Applied Linguistics, 17(1), 38–62.
Skehan, P. (1996b). Second language acquisition research and task-based instruction. In J. Willis & D. Willis (Eds.), Challenge and change in language teaching (pp. 17–30). Oxford: Heinemann.
Skehan, P. (1998). A cognitive approach to language learning. Oxford: Oxford University Press.
Skehan, P. (2009). Modelling L2 performance: Integrating complexity, accuracy, fluency and lexis. Applied Linguistics, 30(4), 510–532.
Skehan, P., & Foster, P. (2001). Cognition and tasks. In P. Robinson (Ed.), Cognition and second language instruction (pp. 183–205). Cambridge, UK: Cambridge University Press.
Specia, L., Jauhar, S. K., & Mihalcea, R. (2012). SemEval-2012 task 1: English lexical simplification. Proceedings of the First Joint Conference on Lexical and Computational Semantics (*SEM) (pp. 347–355). Montreal, Canada: Association for Computational Linguistics.
Takimoto, M. (2012). Assessing the effects of identical task repetition and task-type repetition on learners' recognition and production of second language request downgraders. Intercultural Pragmatics, 9(1), 71–96.
Torrance, M., & Galbraith, D. (2005). The processing demands of writing. In C. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of writing research (pp. 67–82). New York: Guilford.
Upshur, J. A., & Turner, C. (1999). Systematic effects in the rating of second-language speaking ability: Test method and learner discourse. Language Testing, 16, 82–111.
Willis, J. (1996). A framework for task-based learning. Harlow: Longman.
Wolfe-Quintero, K., Inagaki, S., & Kim, H. Y. (1998). Second language development in writing: Measures of fluency, accuracy and complexity. Manoa: University of Hawaii Press.
Yang, H.-C., & Plakans, L. (2012). Second language writers' strategy use and performance on an integrated reading-listening-writing task. TESOL Quarterly, 46, 80–103.

n Reading for Academic Purposes: Vocabulary Knowledge Improves Reading
Carol Miele Professor, English as a Second Language

Leah Carmona Assistant Professor, English as a Second Language

College students who are English language learners face many difficulties when trying to learn a new language and use it in their academic studies. Second language learning depends to a large extent on building up vocabulary that can eventually be called upon automatically. However, this process is difficult and time-consuming. Schmitt (2008) illustrates how learners first have to pay close attention to many aspects of the language they are trying to comprehend and produce. Paying attention, in this context, means using cognitive resources to process information and develop strategies concerning the form and use of particular words. To be successful, students need to learn how to use each word correctly. Schmitt (2008) refers to this concept as “the quality or depth of vocabulary knowledge” (p. 333). In other words, simply being able to recall the meanings of previously memorized words is not enough. It is necessary to understand how context impacts meaning and how to use grammatical knowledge to infer the various meanings and usages of words and phrases. This can be very cumbersome for language learners; consequently, it is important to provide students with processes and resources not only to retrieve previously learned vocabulary but also to understand how it is used in a particular text and to practice using it appropriately in speaking or writing. Students must learn to use all their knowledge of English as they develop reading strategies for success in college-level reading. English Composition I is one course that demands

fluent reading of difficult texts, such as the ones offered by W.W. Norton in customized readers known as The Norton Mix. All of the articles present numerous obstacles to second language readers. Typical style and content are laden with cultural references and idiomatic usages. Readers are expected to comprehend layers of meaning through close reading. Their ability to understand what they read and how to interact with a text will be influenced by their confidence and experience seeing and understanding the language in use by all types of writers.

Addressing these challenges requires a teaching approach that starts with vocabulary development and recognizes that expanding vocabulary knowledge is a way to improve reading comprehension, critical thinking, and writing. The work of prominent researchers in the field of language acquisition and teaching, especially Paul Nation, has provided the theoretical foundation for a vocabulary-based approach to teaching reading (Nation, 2001). Grabe (2009) offers principles for vocabulary learning, which were used in developing the instructional design for a college English for academic purposes reading course that incorporates vocabulary concepts applicable to the comprehension and analysis of texts.

Extensive Reading
The first principle involves extensive reading. Nation (2001) describes extensive reading as an important component of vocabulary development and language learning. Basically, extensive reading is reading for pleasure, which is possible when the reading level of the text and of the reader are comparable. According to Hu and Nation (2000), “Extensive reading can only occur if 95%-98% of the running words in a text are already familiar to the learner or are of no burden to the learner” (p. 405). A graded reader series with level assessment tests is useful. The Penguin Readers are graded and simplified for English language learners so that students who read at their level will encounter the right proportion of known words. The high-interest titles in the series seem to stimulate the students, and reading at the appropriate level can increase their motivation and confidence. Continuous, habitual extensive reading can help develop general reading skills and fluency, and it can increase vocabulary and the ability to tolerate unknown words in order to follow the text.

Word Study
The second principle for effective vocabulary learning is extensive word study. To begin, students must realize that learning vocabulary is more than memorizing meanings and much more than attaching a translation to a new word. In written text, words are encountered in a context determined by the subject or topic the author has chosen. The new context-specific vocabulary often leads to the possibility of retrieving previously acquired words belonging to the same context. Students access background knowledge and make associations to infer meanings. Furthermore, it is necessary to consider the grammatical context in which words are used. Readers encounter words in sentences and phrases, so it is important to recognize the grammatical patterns and how grammatical structures contribute to meaning. Therefore, understanding word formation and recognizing the use of derivational forms to incorporate a word into a syntactic pattern helps the reader to follow the ideas contained in the sentences. Recognition and understanding of word families based on forms can reinforce learning of a base meaning and provide a key to unlocking the meaning of a whole sentence.

Awareness of additional aspects of word usage must be developed in order to fully appreciate and understand the language as it is used in written and spoken discourse. For example, it is useful to know about collocations: words often go together with other words in a fixed or semi-fixed relationship (Cambridge Dictionary). Recognizing the phrasal structures within a sentence helps readers see that words do not exist separately from other words and that meanings depend on surrounding words. This is especially important when idiomatic language is being used.
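Hu and Nation's 95%-98% coverage threshold can be checked mechanically for any text, given a list of words a learner is assumed to know. The sketch below is illustrative only; the known-word list is invented, and real graded-reader publishers use much larger curated vocabularies.

```python
def coverage(text: str, known_words: set) -> float:
    """Fraction of running words in a text that the learner already knows."""
    tokens = [w.strip(".,;:!?\"'").lower() for w in text.split()]
    tokens = [t for t in tokens if t]
    return sum(t in known_words for t in tokens) / len(tokens)

# A toy known-word list standing in for a learner's vocabulary.
known = {"the", "cat", "sat", "on", "a", "mat", "and", "slept"}
text = "The cat sat on the mat and slept."
if coverage(text, known) >= 0.95:
    print("suitable for extensive reading")
```

In practice a text's coverage would be computed against graded word lists, which is essentially what level assessment tests in a graded reader series approximate from the learner's side.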

The Instructional Design
Vocabulary-based reading instruction at the advanced levels presupposes that students have grasped the concepts of vocabulary knowledge and are ready to put them to use as they read. They have learned that words often have multiple meanings which are related to the basic meaning but may not be the same. Students know that it is important to see which definition best fits the context and helps unlock the overall sense of the text. The same is true for synonyms. Readers must recognize, among numerous possible synonyms, the ones that fit the context of the reading and be on the lookout for the synonyms used by the author to achieve textual cohesion and to avoid repetition. What is more, students are used to the fact that words are not used separately from other words; students know that very often words are part of a phrase or idiom, and often exist in collocations. Finally, students become accustomed to the value of the grammatical function, recognizing that the part of speech of a word holds the key to its function and meaning in a sentence. The concept of word families also entails affixes, both suffixes and prefixes, and students have to get used to how related forms can change meanings.
The goal of a teaching approach that starts with vocabulary development is to train students from the beginning levels to improve reading skills by providing strategies for confronting unfamiliar words in difficult texts. In contrast to typical pre-reading strategies, which mainly focus on prior knowledge of the topic, pre-reading activities in this approach involve preparing students to encounter the vocabulary of the topic; e.g., if the topic is technology, the reader will benefit from being familiar with words pertaining to that field. Depending on the level of the class, the teacher will activate previously acquired words while introducing some of the new words students will find in the current lesson. As the instructor pre-teaches relevant words, phrases, and idioms, students begin to engage with the topic. The concepts dealt with by the author of the text begin to emerge as students explore the lexical items that pertain to the semantic domain.
Among the 18 key implications for planning vocabulary activities suggested by Grabe (2009) is developing consistent procedures for selecting words to teach. One way to do this is to create a vocabulary profile of the text the students will read by using a text-analysis website such as Compleat Lexical Tutor (www.lextutor.ca), which analyzes scanned texts and displays words in color-coded categories by frequency level. Once the analysis is done, the pre-reading activities are devised to activate prior knowledge of the theme as well as of vocabulary related to it.
A pre-reading class begins with a discussion of the theme of the reading in order to activate prior knowledge. The theme is written on the board, and the whole class thinks of words that relate to it. In addition, the instructor can guide the students and introduce some new words from the text analysis. The answers, mainly simple words, are written on the board, creating a word map or schema. Students are asked to copy the information into their notebooks; at this time, they are shown how to create flashcards to study vocabulary, including on their flashcards parts of speech, prefixes, suffixes, synonyms, and antonyms. Students are also taught how to use resources such as yourdictionary.com and other online resources for vocabulary study. Next, the students look at the title of the text they are going to read, and as a group, they discuss the topic and predict the main idea. Students can use information from their flashcards or notes to guess what the reading will be about and add their prediction to their notes. Students will revisit this prediction after they read the article to verify that their guess was accurate.
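The kind of frequency profile a tool like Compleat Lexical Tutor produces can be sketched in a few lines. Everything here is illustrative: the band names follow the common K1/K2/AWL convention, but the word lists are tiny hand-made stand-ins, not the large corpus-derived lists a real profiler uses.

```python
# Hypothetical miniature frequency bands; a real profiler draws on
# large corpus-derived lists (e.g., the most frequent 1000 word families).
BANDS = {
    "K1": {"the", "a", "and", "new", "use", "people"},
    "K2": {"technology", "design", "affect"},
    "AWL": {"data", "network", "device"},
}

def profile(text):
    """Count running words per frequency band; unmatched words are off-list."""
    counts = {band: 0 for band in BANDS}
    counts["off-list"] = 0
    for raw in text.lower().split():
        word = raw.strip(".,;:!?")
        for band, words in BANDS.items():
            if word in words:
                counts[band] += 1
                break
        else:
            counts["off-list"] += 1
    return counts

print(profile("New technology and smartphone data use"))
# {'K1': 3, 'K2': 1, 'AWL': 1, 'off-list': 1}
```

The resulting counts suggest which words to pre-teach: high-frequency words can be assumed known at the advanced level, while off-list and academic-band words are candidates for the pre-reading word map.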
After the pre-reading activities, students read the text, if possible without stopping when they come to words they do not know. They may circle the unfamiliar words and try to guess the meanings based on context and clues such as sentence structure, punctuation, and transitional devices. Next, students read the text again paragraph by paragraph, writing down the most important information they are able to grasp, which could be words, sentences, or quotes, along with references showing the page and paragraph number. In this way students attempt to understand the content in general while trying to pick out the most important points. This active reading strategy helps students deal with words they do not know so that those words are not obstacles to reading fluency. Following this activity, the students re-read the article with the instructor. Readers become aware of text structuring, e.g., statement, restatement, example, and of cohesive devices and other syntactic structures used to show connections within

and between sentences, as well as phrases, idioms, and collocations. When students finish reading, they review their notes and write the main idea of the article. Students refer to the word map or use an online dictionary or thesaurus for synonyms if needed. They also revisit their prediction of the main idea and make any appropriate corrections. In addition to stating the author's thesis, students can be taught to distinguish between main points and supporting details. Using their flashcards and notes, they can learn how to paraphrase and write summaries. Reading for academic purposes often necessitates close reading of a text so that the context can be evaluated and the ideas applied in comparisons with other texts and with the ideas of other authors writing on similar topics. Readers will also be expected to formulate their own ideas as they become more engaged in their fields of study. Critical thinking will flow more naturally as students engage more confidently in close reading, using effective vocabulary comprehension strategies for dealing with difficult texts.

Implementation of the Design
American Language III: Reading (ALP-064) at Bergen Community College (BCC) prepares advanced English learners for reading in college-level courses. Vocabulary-based reading was first adopted in the fall 2017 semester in one section of the course. The results of the formative evaluation were applied in spring 2018, again to one section (ALP-064-004). The textbook for the course was The Norton Mix: American Language III: Bergen Community College. This customized text was selected because English Composition I uses a version of The Norton Mix. In selecting readings for the Reading III version, Lexile levels were considered as well as topics of general interest, e.g., technology, education, immigration, and gender.
In preparation for reading the essays in the textbook, the class reviewed the vocabulary concepts of English (parts of speech, word families, multiple meanings, base and related meanings, prefixes, suffixes, and roots; synonyms and antonyms; phrases, idioms, and collocations). Students practiced using electronic dictionaries and created flashcards or a notebook system to engage in continuous word study. Through discussion and guided reading in class, students began to understand how the language is used by authors to create meaning and express ideas. At the same time, students were involved in the extensive reading assignment. First, they took the Penguin Reader placement test, which determines each student's starting level. Each test has 30 multiple-choice questions (www.penguinreader.com). After taking the placement test, students selected a book according to their test results. They chose a book at their level (not higher) to read for enjoyment. When they finished one book, they chose another book either at that level or at the next higher level. They were instructed to read one book every two weeks, or five books during the semester. There was a weekly journal assignment about their reading. At the end of the semester students took the Penguin Reader post-test.

Chart 1: Level 3 Students by Section

Outcomes

Positive outcomes of the implementation of the vocabulary-based reading approach and the extensive reading activity were seen in the success rate of the students, based on their post-test results and their results on the department exit exam as compared with other students in the same course. To establish a baseline for analyzing outcomes, a correlation was created between ALP/ESL levels at BCC and the Penguin Reader levels to measure improvement. The Common European Framework of Reference (CEFR), a set of guidelines used to determine the levels and needs of foreign language learners and also used by Penguin Readers, served as a benchmark. Correspondence between the learning outcomes of the CEFR and the ALP/ESL levels was determined. The starting level of each student on the Penguin placement pre-test showed where they were in relation to the optimal starting point for that level. The post-test at the end of the semester showed whether the goal for the level was reached, as shown in Table 1 below.

Table 1: Correlations

ALP/ESL Program   CEFR    Penguin Reader Pre-Test   Penguin Reader Post-Test Goal
Foundations       A1      1/2                       2/3
Level 1           A2      2/3                       3/4
Level 2           B1      4/4                       4/5
Level 3           B2/C1   4/5                       5/6/+ (beyond Penguin)

The results of the placement pre-test at the beginning of the semester indicated that 13 students out of 22 were at the 5/6 level, six students started at level 4, and two students started at level 3.

Chart 2: Penguin Reader Pre-Test Levels
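The correspondence in Table 1 can be expressed as a simple lookup that checks whether a student's post-test result falls in the goal band for their course level. This is an illustrative sketch, not part of the study; the function name and the band encoding are assumptions, and the band values are taken from Table 1 as printed.

```python
# Hypothetical encoding of Table 1: for each ALP/ESL level, the CEFR
# benchmark, the Penguin Reader pre-test band, and the post-test goal band.
CORRELATIONS = {
    "Foundations": {"cefr": "A1",    "pre": "1/2", "goal": "2/3"},
    "Level 1":     {"cefr": "A2",    "pre": "2/3", "goal": "3/4"},
    "Level 2":     {"cefr": "B1",    "pre": "4/4", "goal": "4/5"},
    "Level 3":     {"cefr": "B2/C1", "pre": "4/5", "goal": "5/6/+"},
}

def goal_reached(alp_level, post_test_level):
    """Return True if a post-test result lands in the goal band for the level.

    A result of "+" means placement above the highest Penguin level and
    always counts as reaching the goal.
    """
    goal_band = CORRELATIONS[alp_level]["goal"].split("/")
    return post_test_level == "+" or str(post_test_level) in goal_band

print(goal_reached("Level 3", 6))  # True: level 6 is within the 5/6/+ goal band
```

A dictionary keyed by course level keeps the table readable and makes it easy to extend if the program adds levels.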

Data Analysis

Of the 147 students registered in Level 3 Reading, 22 were in the treatment group (ALP-064-004), representing 15% of the students in all sections. The post-test results show the percentage improvement in Penguin Reader level for each student at the end of the semester.



Table 2: Post-Test Penguin Readers Percentage Improvement

Student      Pre-Test   Post-Test   Reader Level Improvement   % Improvement
Student 1    4          +           3                          50%
Student 2    4          6           2                          33%
Student 3    6          +           1                          17%
Student 4    4          No test
Student 5    3          5           2                          33%
Student 6    6          +           1                          17%
Student 7    5          No test
Student 8    6          +           1                          17%
Student 9    5          +           2                          33%
Student 10   4          5           1                          17%
Student 11   6          +           1                          17%
Student 12   6          +           1                          17%
Student 13   6          No test
Student 14   4          +           3                          50%
Student 15   3          4           1                          17%
Student 16   6          No test
Student 17   6          +           1                          17%
Student 18   5          6           1                          17%
Student 19   5          6           1                          17%
Student 20   5          No test
Student 21   No test    No test
Student 22   4          6           2                          33%

* Plus sign (+) indicates placement above the highest Penguin level

Overall, Table 2 shows that 10 students improved 17%, four improved 33%, and two improved 50%. Because six students did not take the post-test, their results cannot be recorded; counting them, there was a 74% improvement rate for the class. Removing those six students from the class total, 100% of the students who took the post-test improved.

The data collected from the extensive reading activity, as shown by the Penguin Reader post-test results, can also be used to analyze exit test results. The exit test consists of a B2/C1-level reading with 35 to 36 multiple-choice questions. These questions cover stated information, inferences, guessing meaning from context, and parts of speech. These standard reading skills are measured routinely to determine readiness for college-level reading. The treatment group represented 24% of the total number of students taking the test in the morning session, which is the only session from which data are analyzed. Twenty students in the treatment group (91%) passed the exit test and two (9%) failed. Comparison of the exit test results for ALP-064-004 with the passing and failing rates of the other classes in the department indicates a higher passing rate and a lower failing rate (Chart 3). The passing scores of the treatment group were higher than those of the non-treatment group, as shown in Chart 4.

Chart 3: Pass Rate: Comparison of Exit Test Results

Chart 4: Grades
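The percentage figures in Table 2 follow from treating each Penguin Reader level gained as one sixth of the six-level scale, so one level equals 17%, two levels 33%, and three levels 50%. A minimal sketch of that arithmetic follows; the function and variable names are hypothetical, and the list of levels gained is transcribed from Table 2, with None marking the six students who did not take the post-test.

```python
# Each Penguin Reader level gained counts as one sixth of the six-level scale.
PENGUIN_LEVELS = 6

def pct_improvement(levels_gained):
    """Convert levels gained between pre- and post-test to a rounded percentage."""
    return round(levels_gained / PENGUIN_LEVELS * 100)

# Levels gained by the 22 treatment-group students (Table 2);
# None marks a student with no post-test result.
gains = [3, 2, 1, None, 2, 1, None, 1, 2, 1, 1, 1, None, 3, 1, None,
         1, 1, 1, None, None, 2]

tested = [g for g in gains if g is not None]

print(pct_improvement(1))  # 17
print(pct_improvement(2))  # 33
print(pct_improvement(3))  # 50
# Every student who took the post-test improved by at least one level.
print(sum(1 for g in tested if g > 0) / len(tested))  # 1.0
```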



Discussion of Results

The exit test requires the ability to apply the reading strategies included in the vocabulary-based reading approach. The higher pass rate and grades of the students who were instructed with the approach described in this study suggest that it is promising. Students practiced these strategies throughout the semester while confronting difficult texts. Their understanding and ability to respond to the multiple-choice questions on the test may reflect their ability to use all of the reading strategies related to syntax, lexis, and context. It seems likely that the students gained confidence through the process of applying vocabulary, syntax, and discourse structure as practiced throughout the course.

The goal of this teaching approach is to give students strategies that contribute to their skills in analyzing and evaluating what they read. Motivation and curiosity seemed to increase as students were able to be more engaged as independent readers. As language learners, they appeared to gain proficiency by seeing how the parts of the language relate to the whole in the text they are reading. Students not using this approach tend to have a mostly fragmented knowledge of the elements of language and tend not to connect the fragments, at least not as they read.

The combination of extensive reading (reading for enjoyment at an appropriate level of difficulty) and intensive or close reading of difficult texts seems to empower English language learners when both types of reading are occurring. The study of vocabulary concepts as part of reading class ensures that language learning is continuous. Students have many opportunities to build up their linguistic awareness and to experiment with language usage as they engage in text-based assignments, such as paraphrasing and summarizing, where their proficiency in producing written language also grows.

The vocabulary-based reading instruction suggested in this paper draws on current theories of language acquisition and vocabulary development. Online resources such as the Compleat Lexical Tutor, as well as the many online dictionaries, are an integral part of the pedagogical methods employed by the teacher in preparing the lessons and teaching the classes.

References

Cobb, T. (2017). VocabProfilers. In Compleat Lexical Tutor [Computer software]. Retrieved from https://www.lextutor.ca/vp/

Collocation. (n.d.). In Cambridge Dictionary. Retrieved from https://dictionary.cambridge.org/us/

Fowler, W. S. (2005). Penguin Readers teacher's guide: Placement tests. Harlow, UK: Pearson Education. Retrieved from http://www.pearsonlongman.com/ae/emac/newsletters/Penguin_Readers_Placement_Tests.pdf

Grabe, W. (2009). Reading in a second language: Moving from theory to practice. New York: Cambridge University Press.

Hu, M., & Nation, P. (2000). Unknown vocabulary density and reading comprehension. Reading in a Foreign Language, 13(1), 403-430.

Nation, I. S. P. (2001). Learning vocabulary in another language. Cambridge: Cambridge University Press.

Schmitt, N. (2008). Instructed second language vocabulary learning. Language Teaching Research, 12(3), 329-363.
