Tips for Writing an Evaluation Plan for a Technology Grant
 

Ask anyone who has reviewed proposals for federal or state grants which factor most often determines which ones are funded and which are not, and they will invariably say the evaluation section. As Zucchini Dean of the Mississippi Department of Education says, "Most proposals contain very little about evaluation... what they do say usually doesn't correlate with the goals they indicated in the proposal, and the focus is usually not on student achievement and teacher competency." There are dozens of reasons for these shortcomings, but one is that many of the educators who write grant proposals have little or no experience in developing evaluation plans. With that in mind, SEIR*TEC offers the following tips for writing an evaluation plan that will win approval:

  1. Start with your project goals and objectives and work your way backwards to determine your evaluation questions, strategies, and methods. For example, if your goal is to improve student achievement, you need to define what you mean by "student achievement," and then identify the conditions that have to be in place in order for improvement to occur. Some essential conditions are as follows:
    • Curriculum, assessment, and technology use should be aligned.
    • Teachers and students have to use technology in meaningful ways.
    • Teachers must have ongoing high-quality professional development that is directly related to what students are supposed to learn.

  2. Ask good evaluation questions. Good questions will lead to the answers you need in order to determine whether your project makes a difference in teaching and learning. Evaluation questions might include:
    • To what extent are teachers using what they learned in professional development activities?
    • Do teachers and students have ready access to modern computers and the Internet?
    • How effective is the project in identifying and addressing barriers to technology integration? (See the article Evaluation Questions—Guiding Inquiry in Schools for additional information.)

  3. Collect baseline data at the beginning of the project and ask the same questions over time. For example, if your project focuses on professional development, begin by determining teachers' current level of technology proficiency, use of technology, attitudes, interests, and needs. If you periodically ask them the same kinds of questions, and if their proficiency and use improve, you have some evidence of the cumulative effectiveness of the program.

  4. Counting boxes isn't enough. It can be useful to know the number of computers available for student use or the student-to-computer ratio, but if you want to know whether technology is making a difference in teaching and learning, you have to examine how well and how much students and teachers are using it.

  5. Look beyond standardized student achievement data. Standardized tests seldom measure the areas of learning where technology has been shown to have an impact, such as research skills, communication skills, quality of student work, dropout rates, and discipline referrals.

  6. Surveys are no longer adequate as the single measure for determining the quality and impact of a technology project, mainly because self-reported data are often unreliable. Consider using a variety of qualitative and quantitative measures, such as classroom observations, school portfolios, interviews, and focus groups. (See Thinking Beyond Surveys for advantages and disadvantages of various measures.)

  7. You don't have to develop your own evaluation tools; some excellent ones already exist. The U.S. Department of Education's book An Educator's Guide to Evaluating the Use of Technology in Schools and Classrooms is a good place to start. Also look at the web sites of the Regional Technology in Education Consortia (RTEC), such as the High Plains RTEC's Profiler, the South Central RTEC's Insight, the North Central RTEC's enGauge, and SEIR*TEC's Technology Integration Progress Gauge. (See Tools for Evaluating Technology Projects and Programs.)

  8. Above all else, read the directions in the grant application package. If you don't meet all the funding agency's requirements for evaluation, the agency will be hard pressed to fund your project. This is especially true for technology grants funded through the No Child Left Behind legislation, because the states must provide data from the districts in order to show that the money is being well spent.
  9. If you follow these tips and still feel uncertain about the quality of your evaluation plan, remember that it's okay to ask for help. Although there isn't an abundance of evaluators with experience in educational technology, you should be able to find an evaluator or researcher at a nearby college or university who can review your plan and offer suggestions.

    —by Elizabeth Byrom, Ed.D.
    Principal Investigator, SEIR*TEC

     
    Originally printed in SEIR*TEC NewsWire Volume Five, Number Three, 2002
     
    For more information, contact