Measuring Progress With Technology in Schools

SETDA’s PETI Framework and Suite of Tools Address State Assessment Needs

Today’s education policymakers are seeing technology through the lens of the No Child Left Behind (NCLB) Act, which creates expectations for a “learning return” on all technology investments. Across the nation, schools see highly qualified teachers, differentiated instruction, and data-driven decision-making as effective strategies for improving the academic achievement of all students, and all three strategies depend on smart, integrated uses of technology.

While states have established accountability systems to measure student achievement, they have not yet set up clear measures for effective integration of technology into teaching, learning and leadership in schools. Fortunately, SETDA has stepped up to fill that gap.

Over the last three years, SETDA has developed a framework for effective technology use, aligned to Title II D (Enhancing Education Through Technology) of NCLB, and a suite of tools that helps states, districts, schools and researchers profile both a school’s readiness to use technology and the condition of its current use. These tools provide answers to the important question: “Why does technology work in some schools and not in others?” With those answers in hand, schools can work toward establishing the type of learning environment that advances NCLB goals through technology.

This suite of Profiling Educational Technology Integration (PETI) tools includes:

  1. Survey instruments, including a teacher survey (20-25 minutes), a school administrator survey (30-40 minutes) and a district administrator survey (35-45 minutes);
  2. Site visitation protocols for classroom observations, focus groups with students and teachers, teacher and principal interviews, school walk-throughs, artifact reviews, sample communiqués, and strategies for ensuring interrater reliability;
  3. Recommended report structures; and
  4. Sampling strategies for reducing the data collection burden on schools.

The Development Process

The PETI tools are based on a framework SETDA developed in 2003, focusing on five essential conditions for schools to effectively, systemically and equitably use technology (see sidebar below).

SETDA, in conjunction with the Metiri Group, began with this framework and then further described each condition with a set of indicators that collectively define what successful attainment of that condition looks like in practice. For example, the accompanying chart lists Condition 1 of the SETDA framework with its associated indicators of success.

After SETDA secured funding from the U.S. Department of Education, the association commissioned the Metiri Group to build the PETI suite of assessment tools. The group systematically developed survey questions and site protocols aligned to the SETDA conditions and indicators. As a result, educators using the tools can have a high degree of confidence that the PETI profile of their school reliably and accurately portrays their progress in using technology effectively.

PETI in Action

Data from SETDA’s fall 2004 “National Trends” report indicated that states collect technology data from districts in a variety of ways and for a number of reasons. Some states survey districts, schools and teachers, and a few even survey students; more than half conduct site visits, although these are typically limited to grantees of federal or state programs. One intent of SETDA’s development of the PETI instruments was to provide consistent definitions and question sets for such data collection. (For more information on state technology surveys, visit www.setda.org and click on “2005 National Trends Report.”)

Given the variability of data collection on education technology across states and school districts, SETDA anticipates that use of the PETI instruments will vary from state to state and district to district. Such use will be determined by a number of factors, including assessments currently in place, state mandates, and the goals of the state and district technology/learning plans. In general, SETDA anticipates the following uses:

  • States, districts or researchers with sophisticated profiling tools already in place may want to re-analyze that toolset in light of SETDA’s PETI tools, updating where necessary. (Note: The Metiri Group, the tools’ developer, cautions educators who customize the tools that, in order to maintain validity and reliability, such changes should be made at the indicator level, adding and/or eliminating indicators and their associated survey items and protocols.)
  • States, districts or researchers currently using surveys only to determine their school districts’ technology readiness may want to add a new element: site visitation data. Although such visits are not required for valid and reliable profiles, they increase the credibility of resultant reports to policymakers.
  • States, districts or researchers currently using hardware/software inventories may want to continue collecting such data, adding the SETDA/Metiri surveys and/or site visitations to get more complete baselines and trend patterns on their school districts’ technology readiness over time.
  • States, districts or researchers currently not collecting data statewide or schoolwide may want to adopt the SETDA/Metiri tools, using only the three surveys, or a combination of the surveys and the site visitation protocols.

Tips and Techniques for Using PETI

PETI is an excellent tool for gauging a school’s readiness to use technology effectively as well as its current state of technology integration. Because local goals and specific technology interventions vary among schools, PETI must be accompanied by local assessments of student learning correlated with the specific uses of technology the school is implementing.

Options for Data Collection. Users can either collect data from all districts, schools and teachers, or use stratified sampling techniques at various levels to obtain a representative sample. While the resultant data sets yield different types of reports, both approaches are statistically sound.

Some states will want to survey every school district and every school building, while others will want to reduce the burden of data collection by surveying representative samples from those populations. Some states will include site visitations (in a sampling of schools representative of the state), while others will rely on the surveys only to determine their school districts’ technology readiness and state of technology integration.

State Example: A state may want to consider surveying every district, randomly selecting a representative sample of schools (and of teachers within those schools) to survey, and selecting a smaller subsample of schools and classrooms for site visits. Why? Policymakers often want data on every district, but are satisfied with general trend data by locale (demographics), geographic area, level and size of district/school. Including the site visits in the state data collection adds a dimension of credibility for policymakers who read the reports. It also provides an opportunity to bring the quantifiable data alive through anecdotes from the field that illustrate what the data trends mean.
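To make this design concrete, the following is a minimal sketch in Python of the nested selection the state example describes: every district is included, schools and teachers are drawn at random within each district, and a smaller subsample of the sampled schools is flagged for site visits. The district, school and teacher names, the sampling rates, and the code itself are illustrative assumptions; PETI does not prescribe any particular implementation.

```python
import random

# Illustrative sketch only: the roster, rates and structure are invented,
# not part of the PETI instruments themselves.
random.seed(42)  # reproducible sample selection

# Hypothetical roster: every district with its schools and teachers
roster = {
    "District A": {"School A1": ["T1", "T2", "T3", "T4"],
                   "School A2": ["T5", "T6", "T7"]},
    "District B": {"School B1": ["T8", "T9"],
                   "School B2": ["T10", "T11", "T12"]},
}

SCHOOL_RATE = 0.5       # fraction of schools surveyed per district
TEACHER_RATE = 0.5      # fraction of teachers surveyed per sampled school
SITE_VISIT_RATE = 0.25  # smaller subsample of sampled schools gets a site visit

plan = []
for district, schools in roster.items():          # survey every district
    n_schools = max(1, round(len(schools) * SCHOOL_RATE))
    for school in random.sample(sorted(schools), n_schools):
        teachers = schools[school]
        n_teachers = max(1, round(len(teachers) * TEACHER_RATE))
        plan.append({
            "district": district,
            "school": school,
            "teachers": random.sample(teachers, n_teachers),
            "site_visit": random.random() < SITE_VISIT_RATE,
        })

for row in plan:
    print(row)
```

In practice, a state would stratify the school draw by locale, level and size so that the trend reports policymakers request can be broken out along those same lines.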

District Example: A district may want to consider surveying every school administrator in the district, surveying a stratified sample of teachers, surveying all district administrators, and conducting site visitations in every school that serves students. Why? The sampling is used strictly to reduce the burden of data collection. Whereas a sample of teachers may be acceptable, school board members and superintendents often want validation of the quantifiable data through site visits in all schools. These visits are also an opportunity to bring the quantifiable data to life through stories of impact on student learning, teaching, parental outreach and administrative efficiencies.
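As a companion sketch for the district example, and again using invented data, the stratified teacher sample can be drawn by grouping teachers into strata (grade band is one plausible choice among many) and sampling a fixed fraction from each stratum so that every group is represented.

```python
import random

random.seed(7)

# Invented teacher roster keyed by a stratification variable (grade band here);
# a district might instead stratify by subject, experience level, or school.
teachers_by_stratum = {
    "K-5":  ["T01", "T02", "T03", "T04", "T05", "T06"],
    "6-8":  ["T07", "T08", "T09", "T10"],
    "9-12": ["T11", "T12", "T13", "T14", "T15"],
}

SAMPLE_RATE = 0.5  # survey half the teachers in each stratum

sample = []
for stratum, teachers in teachers_by_stratum.items():
    n = max(1, round(len(teachers) * SAMPLE_RATE))
    sample.extend(random.sample(teachers, n))

print(sample)  # every grade band contributes teachers in proportion to its size
```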

In Summary

SETDA’s timely development of its framework and the PETI suite of assessment tools has provided a forum for states to discuss their assessment needs and design a common framework and suite of tools. This set of tools adds consistency to state data collection, including common definitions of terms, and gives educators a low-cost suite of tools with high validity and reliability. These are tools educators can use with confidence, knowing that the resultant reports provide data for continually increasing the learning return on technology investments.

The Application of PETI

When used to their fullest extent, the PETI tools inform a district's efforts to use technology effectively from a number of perspectives. The summary reports from the PETI tools enable school districts to track their progress with technology over time.

Drilling down further to the indicator and item levels enables schools not only to know whether they are using technology effectively, but also to identify what actions they can take to improve. A set of questions from each survey, as well as protocols from the site visits, addresses each indicator. This enables schools to get answers to key condition-level questions, and then drill down to the indicator and item levels for data to inform their next steps. For example:

Example of Alignment: Condition → Indicator → Cluster of Items

A district using the PETI tools would learn how its schools are doing on indicator C2-2 by analyzing responses to the survey questions and site-visit protocols aligned to that indicator.
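Purely as an illustration of that drill-down, the sketch below rolls invented survey responses up from the item level to the indicator level (e.g., C2-2) and then to the condition level. The item IDs, response scale and alignment map are assumptions made for the example; the actual PETI items and scoring are defined by SETDA and the Metiri Group.

```python
from statistics import mean

# Hypothetical alignment map and responses; the real PETI items,
# indicators and scales are not reproduced here.
item_to_indicator = {
    "Q12": "C2-2", "Q13": "C2-2",   # items aligned to indicator C2-2
    "Q14": "C2-3",                  # item aligned to another indicator
}
indicator_to_condition = {"C2-2": "Condition 2", "C2-3": "Condition 2"}

# Teacher survey responses on a 1-5 scale (invented sample data)
responses = {"Q12": [4, 3, 5], "Q13": [2, 4, 4], "Q14": [3, 3, 2]}

# Roll item means up to indicator scores...
by_indicator = {}
for item, values in responses.items():
    by_indicator.setdefault(item_to_indicator[item], []).append(mean(values))
indicator_scores = {ind: round(mean(v), 2) for ind, v in by_indicator.items()}

# ...and indicator scores up to condition scores.
by_condition = {}
for ind, score in indicator_scores.items():
    by_condition.setdefault(indicator_to_condition[ind], []).append(score)
condition_scores = {cond: round(mean(v), 2) for cond, v in by_condition.items()}

print(indicator_scores)   # e.g. {'C2-2': 3.67, 'C2-3': 2.67}
print(condition_scores)   # e.g. {'Condition 2': 3.17}
```

Aggregated this way, a low indicator score points a school directly to the cluster of items, and thus the practices, most in need of attention.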
