Evaluation Questions—Guiding Inquiry in Schools
 

Through the No Child Left Behind (NCLB) legislation, school and district practitioners are being asked to become more involved in evaluating the effectiveness of their schools' efforts and progress. Many practitioners are short on time, funding, and evaluation experience. However, educators can maximize their learning from this work by building their evaluation around clearly articulated evaluation questions.

The critical guidance for evaluation work, just as in school-based action research, comes from identifying, using, and reflecting on essential questions. These questions drive the learning, and evaluation is about learning:

  • Learning how students and teachers are using technology
  • Learning what kinds of professional development and support are making a difference in classroom practice
  • Learning how the infusion of technology is changing student approaches to learning, characteristics of student products, and student achievement in curricular areas

As practitioners engage in evaluation work, whether involved in a formal evaluation (perhaps supporting the work of outside evaluators) or undertaking an informal examination of a school initiative, they need to consider the following aspects of evaluation questions.

Question Identification

Identify the overarching questions that you want to answer and why. First of all, what do you and the people in your school want to learn from this evaluation work? If you are working with grant funds, what do the funders want to learn? For example, if your school district had received a grant to engage faculty, students, and community members in using a variety of technologies to enhance science and mathematics learning through a community-based environmental study, what would you want to learn from your evaluation efforts? Some of your evaluation questions might be:

  • How has the funding from the grant actually been used? What training was provided to students and faculty in using various technologies? What was the perceived quality of the training? How many students, faculty, and community members were involved in the training?
  • How did students and faculty use technologies in the environmental study? What areas of mathematics did students explore? To what extent did students engage in mathematics and science inquiry? What role did technology play in this inquiry?
  • What mathematical concepts or skills did students gain through this project? To what extent did students demonstrate mathematics and science inquiry skills?
  • As this program is instituted and continued, are there notable increases in the percentage of students meeting grade-level-appropriate technology standards? Is there improvement in student achievement in the area of mathematics focused upon in the project?
  • How have student, faculty, and community attitudes changed through this project (e.g., attitudes toward mathematics and science, the use of technology, or the environment)?

Identifying and prioritizing these questions is the first step toward meaningful evaluation and essential learning for your school.

Matching Methods to Questions

It is essential to remember that the identification of evaluation questions dictates the choice of evaluation methods. Practitioners need to ensure that the data-gathering methods used will result in answers to the identified questions.

Using methods such as questionnaires, interviews, and focus groups makes perfect sense when you wish to determine changes in attitudes (e.g., attitudes toward technology use). However, classroom observations become the essential method (with interviews or questionnaires providing additional information) for gaining useful data about the use of technologies or the engagement of students in mathematics and science inquiry.

Although teacher interviews may give some insights into student learning and changes in student achievement of technology standards, an analysis of student products will more directly answer such an evaluation question. If certain areas of mathematics learning have been targeted within the project, an appropriate method may be tracking changes over time in teacher-designed assessments or in selected sections of standardized tests.

It is essential to choose methods for your evaluation that will yield appropriate data for answering your top-priority questions.

Reflections on Evaluation Questions

Finally, when the data are in and analyzed, it is critical to return to the evaluation questions and the results in order to determine the implications for your future work. For instance, perhaps you found that although the quality and reach of the technology-related professional development were excellent, too much time elapsed between that learning and the actual use of the technology in the environmental study, so time and energy had to be wasted on additional training. Or perhaps your classroom observations indicated that although the use of graphing calculators was to be an essential component of the environmental project, their actual use was negligible. What if, after three years of similar project work, the targeted areas for improvement in mathematics achievement showed no improvement? Findings such as these would certainly lead you to strategic changes in your work.

—by Ann Abeille,
Director of Research and Evaluation
Learning Innovations at WestEd

Originally printed in SEIR*TEC NewsWire Volume Five, Number Three, 2002
For more information, contact