NOTE: This report is part of a larger article, Action Research: Reseeing Learning and Rethinking Practice in the LOTE Classroom, published by the LOTE Center for Educator Development. Please access the main page for full text and copyright information.
Greg Foulds, North East ISD
The purpose of this study was to give students choices in the ways in which they are assessed and in the types of assessments used to evaluate their proficiency in Spanish. This topic is of particular importance to me because even after eleven years of teaching Spanish at the secondary level, I still feel that, more often than not, students end up showing me what they do not know rather than what they do. I want them to be able to demonstrate what they have actually learned.
With this goal in mind, I wanted to answer three questions in my research: 1) What would happen if students were given choices in assessment? 2) Do the tests reflect the way we have been practicing in class? 3) How would students measure success?
I chose to conduct my action research with 67 Spanish II Pre-AP students (three classes). The project included an examination of pre- and post-experiment survey results and a comparison of scores from two student-generated tests with scores from a teacher-generated test. I used the first semester of the 2002-2003 school year to orient my students to the state-adopted textbook and state standards for language learners, the way in which I teach, the types of tests that are given (textbook publisher exams), and my expectations for them. In January 2003, I told my students that I was going to research some ways in which I could help them improve their Spanish. Together, we would specifically target how students could better perform for me and how they would measure success if they were given a chance to have input into the assessment process. My students seemed interested in the idea, even though at that time none of us was quite sure what it would look like.
I began with a reflective survey to see how the students felt about assessment in general and in the Spanish class in particular. Most students (74%) felt somewhat comfortable with our current assessment procedure, and 80% agreed that the way we practiced Spanish in class matched the way in which they were being tested. However, 87% also supported making some changes to the evaluation process as a whole. Once the survey results had been tallied, I told students that for the next four weeks, they would be given the opportunity to design their own assessments.
Each time we reached another evaluation point in our studies, I placed the students in groups of three or four, in which they brainstormed how they wanted to be evaluated, deciding on the content and format of each test. For example, we had been studying a past tense in Spanish, so I told them that they needed to show me they had learned how to conjugate verbs in the preterit as well as to recognize and use these forms in communicating. Once all the ideas had been shared, the class decided on specific ways to demonstrate their proficiency. As a class, students also decided that they would measure success by the grades they had earned, by whether they were able to correct and learn from their mistakes, and by whether they felt they could repeat the performance.

The test development procedure was repeated after we had finished studying informal commands in Spanish. At the end of both units, each class generated its own unique test to assess their knowledge and skills. In both cases, the process led to a student-generated, pencil-and-paper test with items that were not altogether different from those I might have created for them myself! Their tests included word banks with fill-in-the-blank items, short speaking sections, kinesthetic activities (responding to a command), small writing/translation items, and items that asked students to decide whether things were correct or incorrect. At no time did I tell them they had to have written tests, nor did I tell them to choose from any particular test format(s). I simply acted as a facilitator and helped them verbalize and realize a test of their own.
Finally, I gave students a third assessment, a textbook test (TBT) covering the same material (preterit and informal commands) as the two student-generated tests (SGT). I was then able to compare their scores on the three evaluations. In each class, the number of students who passed (grade of 70 or better) the student-generated tests was, with one exception, essentially the same as the number who passed the traditional assessment:
Class | SGT Preterit | SGT Commands | TBT Preterit & Commands
A (N=21) | 57% | 48% | 50%
B (N=21) | 91% | 72% | 76.5%
C (N=25) | 72% | 70% | 79%
Although scores for the student-generated exams were not consistently higher than those on the textbook test, classes A and B each scored better on one of the student-generated assessments, and the remaining scores were not considerably lower than those on the textbook test. Students may also have felt somewhat more empowered by deciding on their own assessments. And although they indicated that they favored a change in the way assessments had been given, two students put it this way: “I’ve learned that even if students design the assessments it can still be hard,” and “Students realized that they still have to study for tests, even if it’s their own.”
I also found the results of the post-experiment survey very interesting. Ninety-two percent of students believed that their assessments matched the way we practice in class and that their language skills had improved. Eighty-eight percent liked the changes made in the evaluation process and did not wish to return to the textbook tests. Ninety-four percent felt that the experiment had been both successful and valuable. Reflecting on how he had measured success, one student said, “I have actually learned the material instead of just knowing the basics for the test,” and another commented, “I can see an improvement in not only my grades, but in my esteem, and I feel better in taking tests.”
My students and I all learned a great deal in the process of this research. The overwhelming majority of my students felt that their voices mattered, that they were successful, and that the process was a rewarding and valuable one. A small number of students did not enjoy the process or consider it particularly successful, simply because they felt uncomfortable making their own tests and believed that the teacher knew more about instruction than they did. One student commented, “Student designed assessments benefit some students greatly; however, others would rather be told what to do.” Many of them had never been given this type of freedom, and, frankly, I think it intimidated them and may even have scared a few.
It is hard to make such changes in one’s teaching and then return to the status quo, even after just a few weeks of “thinking outside of the box.” Consequently, I plan to continue giving students the opportunity to have input into the evaluation process so that they can show me what they really can do in the language. I also plan to let them correct their work and learn from their mistakes while giving them the chance to improve their grades. Now that a dialogue has begun and my students and I openly communicate about their assessments, we will be able to continue the negotiation process that will ultimately benefit all students. For example, I now believe that my students generated traditional types of assessments because that is what they know and are familiar with. Therefore, I want to expand my students’ vision and lead them to more real-world, authentic types of assessment so they can apply what they have learned in ways that are meaningful to them.
This action research project was frustrating because it led my students and me into uncharted waters, but it was also intriguing for the very same reason. I feel that I have learned an enormous amount about instruction in the classroom, about myself as an educator, about my students as learners of a language not their own, and about the give-and-take that exists in all human relationships. This process has helped me grow as an individual and has helped reinforce my personal conviction that teaching is a difficult, yet noble and rewarding life-calling.