


Stories with Data and
Data with Stories

by Margaret Bingham


As many school districts approach the 20th anniversary of microcomputers in their classrooms, they are facing loud calls for proof of results from the investments of time and money made in technology during that period. Jane Oates, education advisor for Senator Edward Kennedy, stated in two different speeches this fall that what is needed (by Congress and the federal government) are not just stories and not just data, but "stories with data and data with stories."

In response to these local calls for proof and in reaction to statements such as those from Ms. Oates, educators are searching for tools and instruments to measure and document the impact of technology, particularly microcomputers, on teaching and learning.

Measuring Sticks Exist

Within the last year, several groups have introduced measuring sticks for gauging the impact of technology in education. The Milken Exchange has developed the Seven Dimensions for judging progress in implementing technology (www.mff.org). They continue to collaborate with state technology leaders and educational groups to further develop the dimensions and related tools. Currently, they are completing a Professional Development tool (www.mff.org/publications/publications.taf?page=159).

In addition, for the past two years they have documented the status of state efforts to implement technology in their annual report, "Technology Counts." This report establishes the accomplishments of each state as well as provides comparisons of the states in various categories.

The CEO Forum's School Technology and Readiness (STaR) Chart (www.ceoforum.org) is another tool in use in schools and is part of several state initiatives. Its self-assessment quiz, which guides respondents to label their school as low tech, mid tech, high tech, or target tech, establishes a means of tracking progress in six categories. In other instruments, rubrics guide the user in pinpointing current status in various areas, from the use of technology by students and teachers to the involvement of the community in the technology program. One instrument with a strong set of rubrics addressing student engagement with technology comes from the North Central Regional Technology in Education Consortium (NCRTEC). Their online instrument, entitled Learning with Technology Profile Tool, is found at www.ncrtec.org. These tracking tools probably represent just the first set of offerings, as discussions at technology meetings increasingly call for means of documenting technology progress.

One State's Effort

In the late fall of 1997, the North Carolina Department of Public Instruction's Instructional Technology Division adopted the CEO Forum's STaR Chart as a tool for school districts to use in tracking the implementation of their local instructional technology plans. With permission from the CEO Forum office, the state agency added a row to the bottom of the chart. This row consisted of the Target Tech School standards for North Carolina schools. These standards, from the North Carolina Instructional Technology Plan, were closely aligned to those in the Target Tech row of the STaR Chart. These modified charts were distributed to school districts as part of a package of tools for tracking technology implementation. School districts were encouraged to use the North Carolina version of the STaR Chart, with the accompanying self-assessment quiz, as part of their efforts to revise their local instructional technology plans.

When the revised plans were submitted to the state office in June 1998, approximately one-third of the 117 school districts had followed the recommendation and used the STaR Chart. Some had every school complete the self-assessment quiz in order to "stake out" where that school was at the end of the first two years of the local plan implementation. Most of those districts indicated their plans to continue collecting this data yearly. Other districts reported using the STaR Chart to determine areas to focus on in the revised technology plan. Of those who had not used the STaR Chart in the revision process, most included it as a tool in their revised evaluation plan or indicated that they planned to use it in their Technology Literacy Challenge Fund (TLCF) proposal activities.

The second instrument distributed in the package of tools was a chart of rubrics designed for tracking progress in the North Carolina TLCF proposals. This chart, jointly designed by the Instructional Technology Division staff and the SouthEast and Islands Regional Technology in Education Consortium (SEIR*TEC), emphasized reflection on progress being made toward the proposal goals. Even though the instrument (seirtec.org) was designed to capture common data from the 44 very different TLCF proposals, several school districts adopted portions to use in their local technology plan evaluation efforts.

In the months since June, a second round of Technology Literacy Challenge Fund subgrants has been awarded and an updated profiling tool distributed. When that tool was received, comments were heard to the effect that using the first profile tool and the STaR Chart was the first time they [the school systems] had stopped "doing" and had taken time to reflect on what accomplishments had been made. With both instruments, the North Carolina state education agency has provided the school districts with a means of tracking technology program progress and of collecting data to accompany their stories.

One District's Effort

Using tools to track progress in technology implementation is just one way to have "stories with data and data with stories." Another strategy is to develop an evaluation plan consisting of observations, data collection from testing, and portfolios. Asheboro City Schools (N.C.) is applying this strategy to determine how AlphaSmart keying devices from Intelligent Peripherals benefit student writing skills in grades 3 and 4. Through their Technology Literacy Challenge Fund project, Asheboro City Schools has provided every student in grades 3 and 4 with an AlphaSmart. Their goal is to increase student scores on the fourth grade writing test. In addition to providing staff development and technical support, Intelligent Peripherals is working with the district to determine the factors associated with gains on the writing test.

Under the direction of Dr. Sheila Cory of the University of North Carolina, teachers are recording student use and activities during one week each month from November 1998 through March 1999. Dr. Cory is observing teacher workshops and classroom instruction. She will also analyze the scores on a released version of the test taken by third graders in March 1998, and at the beginning of the fourth grade. Even though no official results exist yet, Mike Ingram, Director of Technology Services for the district, reports that an unanticipated result has occurred already: Students are using the devices for almost every curricular activity, not just for writing. Additionally, several teachers have told him that the AlphaSmarts are the single best tool they have ever been given to increase student motivation and to have kids learning effectively and efficiently. After March 1999, when the students take the official writing test and the district receives the evaluation from Dr. Cory, Asheboro City Schools should have data to go with their stories!

One Region's Effort

SEIR*TEC is using both the STaR Chart and the SEIR*TEC Technology Integration Progress Gauge as tools for tracking the technology program in twelve schools across the region. These schools, referred to as Intensive Sites by SEIR*TEC, are typically rural, resource-poor schools. Located in Virginia, North Carolina, South Carolina, Georgia, Arkansas, Tennessee, Mississippi, Alabama, Florida, Puerto Rico, and the U.S. Virgin Islands, these schools have received staff development and technical assistance from SEIR*TEC for three years. Now in the fourth year, an effort is being made at these "laboratory" schools to take a snapshot of where they are in implementing technology. To gain this record, SEIR*TEC staff members are working with the school leaders to complete the STaR Chart and the SEIR*TEC Gauge. The Gauge (seirtec.org) is similar in format to the instrument developed for the N.C. TLCF but addresses five areas of the technology program: Student Engagement, Teacher Engagement, Resources, Support, and Community Involvement. These same tools will be used again after summer activities are completed, in order to have a second snapshot. Using the two snapshots, along with the record of assistance provided and activities undertaken, SEIR*TEC will have initial data to go with the rich collection of existing stories and lessons learned at these Intensive Sites.


SEIR*TEC has completed a comparison of the areas addressed by their Gauge to those addressed by the Milken Seven Dimensions, the CEO Forum STaR Chart, the NCRTEC profiling tool, the NSSE indicators (www.nsse.org/ioq4.html), the NSBA's ITTE toolkit (www.nsba.org/sbot/toolkit/index.html), and several others. (See Survey of Profiling Instruments.) Three conclusions are obvious from the comparison: 1) all address hardware, 2) most address student and/or teacher engagement, but 3) few address community involvement. Users need to determine which of these tools best matches the activity they wish to track. They may want to use only parts of these tools, or sets of rubrics, as components of their evaluation program.

To share their "stories with data and data with stories," schools, districts, states, and regional groups will have to begin tracking the progress of their technology programs. Tools are needed. Strong evaluation plans are needed. The cry for proof of results from investments of time and money in implementing technology is only going to increase in volume. Now is the time to take a look at some of these tools.

Survey of Profiling Instruments

Overview: A review of several instruments for profiling school, student, and educator use of technology was conducted by SEIR*TEC staff. Components or major focus areas of the listed instruments were compared to the domains and indicators of the SEIR*TEC Gauge.

For more information, contact