Data Lends Schools a Helping Hand

If we assume that an education system has collectively agreed on a "destination" for student learning with technology, then it is important to ensure that everyone travels to and arrives at that destination. Without continuous feedback, travelers are likely to land in many unintended places instead of the expected destination.

The following illustration makes this a bit clearer. A presenter asks her audience, "Have any of you ever taken a direct flight to Honolulu? Or London?" Almost a third of the audience smugly raise their hands, indicating some well-seasoned travelers. "Well," the presenter replies, "I don't want to scare you, but there is no possible way you took a direct flight. As a plane heads for its destination, it is commonly off course 80% of the time after takeoff." The presenter continues to make her point to a puzzled audience: "You arrive at the city of your choice only because pilots are trained and expected to use their data instruments to continuously gather information for intermittent course correction. You might wonder how you ever arrived at this workshop otherwise."

Four Cornerstones Chart

I. Readiness

Collective Vision
Community Support/Benefits
Leadership Readiness
Information Technologies Readiness
Staff Capacity

II. Learning

Information Technologies Readiness
Libraries as Information Centers
Instructional Practices
Equitable Opportunities
Home/School Connection
Ubiquitous Access

III. System Capacity

Community Support/Benefits
Policies and Procedures
Staff Development Program
Purchasing Decisions
Budget Support

IV. Technology Deployment

Ubiquitous Access
Tool Capacity
Technical Support

As you reflect on this story, you might examine your own community's and school board's expectation that data be used to ensure the final destination is actually reached. Unless data collection has been embedded in the technology planning process from the beginning, accountability cannot be requested with any validity after implementation is complete. School boards are as responsible for setting this expectation prior to implementation as their technology committees.


Schools Need to Organize Their Accountability Efforts at Two Levels

First, planning teams need to provide their district with an overall system assessment. This would measure the district's current status and subsequent progress on multiple indicators that influence technology's ability to make a difference for all students. Education Technology Planners, Inc. (ETP) has developed a Technology Profile Summary (ETP Online) tool based on twenty indicators that are clustered into four general areas called the Four Cornerstones. (See Four Cornerstones Chart.) All four areas - readiness, learning, system capacity and technology deployment - need equal attention when setting goals and resourcing implementation strategies if technology benefits are to be realized for students. If any of these four areas is weak, no amount of equipment will be able to yield, by itself, either the results or the accountability expected by parents, community and staff.

Second, all goals need to be explicitly focused on student learning. Groups frequently begin by creating equipment or technology-skill goals. Such goals need to be queried to understand the intended results for students. For example: "Our goal is for students to do 2-3 PowerPoint presentations a year." Try asking, "And if students did use PowerPoint, what would that do for them?" Or another common example: "Our goal is to network the school." Try asking, "And if you did network the school, what would that do for students?" If the actual language of the goal cannot incorporate a student-learning perspective, it is critical to write a student rationale statement making explicit what the goal would do for students. Criteria for shaping goals must include identifying expected student results and assessment strategies within the planning document. This targets all activities, decisions and resources on specific student outcomes, with pre-determined tools for conducting intermittent assessment. (Schmoker 1996) Formal collective agreement on the goals, results and assessment strategies is an essential process to ensure everyone takes responsibility for accountability.


Data Helps Schools Course Correct Their Flight Plan

Education Technology Planners, Inc. has been developing and using tools and processes with schools for the last seven years to create planning documents as well as conduct implementation impact audits. Two case studies of districts that used their data findings to support the next steps in their implementation efforts are discussed below.

Data Conclusion for Staff Development Program/Instructional Practices Indicators: The teacher survey given to 3,500 staff yielded the following findings: 58% of the staff feel proficient and well prepared in technology uses. Out of 49 possible classroom activities, the top practices reported by teachers fell into two general categories: automation of administrative tasks and the use of technology to automate learning assignments. Only 16% of staff are using teaching and learning practices that fall into the "Evolving Uses" category of the ETP Technology and Learning Spectrum, which represents student-centered learning. (ETP 1995) Note: Any percentage of use below 30% represents early-adopter use. Until the percentages of use begin to cross the 80% level, practices cannot be considered institutionalized or "the norm." (Rogers 1995)
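The adoption thresholds cited above can be sketched as a small classification function. The function name and category labels below are illustrative assumptions, not part of the ETP instrument; only the 30% and 80% cut points come from the text.

```python
# Classify a practice by the share of staff reporting its use, following the
# thresholds noted in the text: below 30% indicates early-adopter territory;
# 80% and above suggests a practice has become "the norm" (Rogers 1995).

def adoption_level(percent_using: float) -> str:
    """Return an adoption category for a reported percentage of use."""
    if percent_using < 30:
        return "early adopter"
    elif percent_using < 80:
        return "spreading, not yet the norm"
    else:
        return "institutionalized"

# Example: only 16% of staff reported student-centered "Evolving Uses".
print(adoption_level(16))   # early adopter
print(adoption_level(58))   # spreading, not yet the norm
print(adoption_level(83))   # institutionalized
```

A district could run each survey item through such a function to see at a glance which practices remain in early-adopter territory.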

Top Learning Practices with Technology (what teachers are doing):

  • Generate worksheets and letters (64%)
  • Keep electronic grade books (34%)
  • Print out grades/progress reports (34%)
  • Provide word processing for writing assignments (33%)
  • Send and receive electronic mail (29%)
  • Track student attendance (27%)

Top "NOT" Practices with Technology (what teachers are not doing):

  • Conduct online searching/investigations (83%)
  • Support online collaboration with groups outside of the classroom (82%)
  • Use complex situations to understand cause and effect, i.e. simulations, modeling, query of spreadsheets/databases (81%)
  • Facilitate electronic portfolios (78%)
  • Support collaborative projects within the classroom (77%)
  • Provide graphic and visual tools for analysis (68%)
  • Explore and learn topics on their own (65%)
  • Enable effective presentations (50%)

Implication: The teachers' primary technology uses are administrative tasks. Use levels with students for learning tasks are minimal. These findings reflect the district's staff development program of five years ago, which gave every teacher 30 clock hours of training in productivity tools. The staff development budget was then terminated, leaving teachers with limited support for connecting technology with curriculum and learning. Equipment acquisitions continued, but staff development efforts for integrating technology with learning did not.

The level of teachers' satisfaction with their competency and training is higher than would be expected if a challenging vision were in place. The teachers' top choices of tools and practices to learn show little forward vision beyond personal productivity uses. Their staff development selections of Internet, e-mail, content specific software, spreadsheet, online reference tools, electronic grade book, alternative research papers, and print out grades/progress report do not indicate a student-focused vision driving their learning needs. A compelling student-focused vision and support system to make strong use of the technology for higher, complex learning tasks is missing. A more challenging vision must be created and supported by school leaders.

Strategies Developed: The planning team presented the data findings to their school board with an action plan and budget for an active district staff development program to support the next stages of teacher practices focused on student results. Teams of teachers were then involved in defining essential skills and practices for all students and teachers. Skills were defined as technical competencies, while practices were defined as the ongoing uses of those skills applied to learning standards or tasks. A "toolkit" of hardware and software was standardized for all classrooms. Staff development efforts were organized around having all staff acquire the essential skills, practices and tools, and multiple strategies, including in-school curriculum support personnel, were designed. Teacher surveys are to be given every two years to assess progress in acquiring the identified essential skills and practices. A survey was also designed to assess student skills and practices, allowing triangulation of data to identify the progress of transferring teacher learning into classroom practice.
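The triangulation idea above amounts to comparing the share of teachers who report a practice with the share of students who report experiencing it. The sketch below illustrates that comparison; the practice names, percentages, and the 10-point flag threshold are all illustrative assumptions, not the district's actual data.

```python
# Compare teacher-reported and student-reported use of the same practices to
# flag where teacher learning may not be transferring into classroom practice.

teacher_reported = {"online searching": 45, "electronic portfolios": 30}
student_reported = {"online searching": 25, "electronic portfolios": 28}

def transfer_gap(practice: str) -> int:
    """Teacher-reported minus student-reported use, in percentage points."""
    return teacher_reported[practice] - student_reported[practice]

for practice in teacher_reported:
    gap = transfer_gap(practice)
    flag = "  <- check transfer into classroom practice" if gap > 10 else ""
    print(f"{practice}: {gap}-point gap{flag}")
```

A large gap on an item would prompt a closer look at whether a skill taught in staff development is actually reaching students.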

Data Conclusion for Tool Capacity/Ubiquitous Access Indicators: Equipment inventories of the buildings yielded significantly disparate statistics when comparing computer/student ratios for all workstations versus ratios for standard equipment only. Counting all equipment, 96% of the 127 buildings have at least a 1:15 computer/student ratio and 75% of them have at least a 1:10 ratio. This appears to be a strength that benchmarks closely with the reported average national and state ratios of 1:9, until the aging inventory is factored into the ratios. When using the workstation standard of being network and Internet capable, however, only 18% of the buildings have at least a 1:15 ratio, and only 12% of them have better than a 1:10 ratio.

The ratios between buildings (presented visually to the school board) are radically disparate, revealing disturbing inequities in student access. The ratios range from 27 buildings with no better than a 1:350 computer/student ratio, to 66 buildings with no better than 1:100, to 20 buildings with at least a 1:15 ratio and, finally, only 15 buildings with at least a 1:10 ratio.
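The two inventory views described above, counting all workstations versus counting only standard (network- and Internet-capable) machines, can be sketched as a simple per-building calculation. The building figures below are illustrative assumptions, not the district's data.

```python
# Compute a building's computer/student ratio two ways: counting all equipment,
# and counting only machines that meet the workstation standard.

def students_per_computer(students: int, computers: int) -> float:
    """Students per computer; lower is better (a 1:10 ratio returns 10.0)."""
    return students / computers if computers else float("inf")

building = {"students": 600, "all_computers": 60, "standard_computers": 12}

all_ratio = students_per_computer(building["students"], building["all_computers"])
std_ratio = students_per_computer(building["students"], building["standard_computers"])

print(f"All equipment:      1:{all_ratio:.0f}")   # 1:10
print(f"Standard equipment: 1:{std_ratio:.0f}")   # 1:50
```

The gap between the two numbers for the same building is what surfaces aging inventory that a raw count would hide.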

Implication: A significant amount of the equipment is considered "aging inventory." Many schools have used "brick and mortar money" to make original purchases with no further funding available to upgrade equipment. A number of other school sites have been using grant money and other fundraising activities to acquire their equipment. Finally, individual school sites presently take sole responsibility for all equipment acquisitions and upgrades.

Strategies Developed: The data was used to strategize three general action items for balancing the ratios between schools over time. First, technology equipment was declared the domain and responsibility of the central office rather than of site-based management. The acquisition process was put into an existing program called facilities management, meaning that ongoing budgeting and upgrading will be handled much like roofs, painting and other infrastructure needs. Second, an in-house program for leasing equipment from the central office was designed to work within the site-based budget process, ensuring that upgrades happen within specified periods of time. Third, a well-defined process for re-purposing aging equipment was designed to make use of obsolete equipment without disturbing equitable access for identified programs. Annual inventories are now collected to monitor progress.


Keeping the Charted Course

Today, school as people have known it for over 100 years is being pressured to reshape itself in unprecedented ways by the quantity and speed of change in the work world. The rate of change, influenced significantly by technology, has now exceeded the public's understanding of the school's role in young people's lives. The public's response to this turmoil is an increasing demand for more communication and more accountability as it grapples with the changes technology brings. Policy makers and community members want answers to their questions about technology's effect on student learning and on students' readiness with workforce skills. Collecting data is a powerful mechanism for responding to the public's concerns while, at the same time, answering educators' own questions about the value of technology in our schools. Data collection is not a task meant to punish or challenge our professional work; it is an opportunity to keep our charted "course." We all want to move our visions of what schools now need to be into practice, ensuring students are prepared for their changing world. Technology is a "robust" vehicle for that journey.

Schools will want to take stock of their planning process and documents to incorporate six elements for setting the stage for accountability. Using data will ensure that we do, indeed, arrive at our final, designated destination.


Six Elements for Setting the Stage for Accountability

The following elements are essential in all technology plans before implementation is resourced. Planning groups need to assess whether they are "Accountability Ready" (ETP Online). School boards and communities cannot, at the end of implementation activities, expect valid measurement of results, rather than mere effort, without having organized for this expectation from the beginning. Goals, results and assessment strategies need collective agreement at all levels as doable and worthy before action plans and implementation plans are resourced. (Ibid)

  • Goals that articulate expected, measurable results (benefits) focused on student learning
  • Baseline data to provide current reality and gap analysis
  • Specific assessment strategies and tools aligned with expected results
  • Everyone, not just special groups, takes responsibility for assessing the implementation, asking and answering with their colleagues the question of "So What?"
  • Expectations for periodic assessments with public reporting and reflection
  • Budgets for staff, dollars and time to conduct assessment activities that support periodic evaluation processes
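The baseline-data and gap-analysis elements above can be sketched as a comparison between current reality and the measurable results a goal articulates. The indicator names and numbers below are illustrative assumptions, not from any ETP instrument.

```python
# Gap analysis: percentage-point distance between each goal's target and the
# baseline measurement of current reality, per indicator.

baseline = {"students publishing online": 16, "classrooms with Internet": 40}
target = {"students publishing online": 80, "classrooms with Internet": 100}

def gap_report(current: dict, goals: dict) -> dict:
    """Return the remaining gap, in percentage points, for each indicator."""
    return {indicator: goals[indicator] - current[indicator] for indicator in goals}

print(gap_report(baseline, target))
```

Re-running the same report at each periodic assessment shows whether the implementation is closing the gaps or drifting off course.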

This article originally appeared in the 04/01/1999 issue of THE Journal.