A Plan Without a Plan
Without some leadership and cooperation at all levels, a new federal mandate for gathering data on students’ technological literacy will produce meaningless results.
SOMETIME THIS YEAR, the federal government, through your state’s department of education, will be asking you how many eighth-grade students in your district have been determined to be technologically literate. (The exact formulation of the questions to be used in the collection of this information is not final.) This is thanks to Title II-D of the No Child Left Behind Act—Enhancing Education Through Technology—which has as one of its goals: “To [ensure] that every student is technologically literate by the time the student finishes the eighth grade, regardless of the student’s race, ethnicity, gender, family income, geographic location, or disability.”
When NCLB was signed by President Bush in 2001, the federal DoE told state educational technology directors and state DoEs that they would not be required to collect data associated with EETT’s technological literacy goal. That policy stayed in effect until this past summer, when the Office of Management and Budget (OMB) informed the Education Department that it must now begin to gather data on student tech literacy. Data will be collected for the 2006-2007 school year, even though the terms of the data requirement have not been finalized. As of this writing, the DoE:
- will not define technological literacy
- will not define how to assess technological literacy
- will not provide any additional money to use in assessing technological literacy
- will not provide more flexibility in how EETT money can be spent to assess technological literacy
- will not require that states distinguish between districts that receive EETT funds and those that do not
 
In short, all of these matters are being left to the states. So, what are the states choosing to do? A few already have tests in place for assessing student tech literacy, and a few more are thinking about it, but most will ask questions of those at the district level and send in the results. But the results will be meaningless. Why? Have a look again at the list of things missing from the government’s plan for gathering data on tech literacy, then imagine the quality of the results that will emerge as each district defines tech literate slightly differently from the next, and makes assessments however it chooses.
Most states have a formal definition of technological literacy, but they do not or will not require local districts to use it in connection with this effort. Talking with state technology directors, I more often than not heard the comment, “We are a local-control state.” Tim Magner, director of the Office of Educational Technology, alluded to the same problem, saying, “That is one of the big challenges of a federal system.”
To get an idea of the disparity among the various approaches states are considering or have already taken, consider the following:
- One state is contemplating defining technological literacy as the ability to use an online testing program.
- Another state is planning to tie the definition to the ability to pass a Technology, Life, and Careers course.
- Another state has a laptop initiative for all students in the seventh and eighth grades. State officials are assuming that all their eighth-graders are tech literate.
- Another state has technology standards embedded in its core curriculum, and thinks it will say that any student who has taken its core curriculum is technologically literate.
 
And then there is the vast majority of states, which are leaving it entirely up to the districts to tell them how many students are tech literate without requiring the use of the state definition, while allowing assessment tools as varied as a hands-on skills test, a multiple-choice knowledge test, a project result, an aggregation of projects in a portfolio, or simply teacher observation.
To say that comparing data from different assessment methods is like comparing apples and oranges isn’t going far enough. It’s much worse. This isn’t comparing apples and oranges, or even calling apples and oranges a fruit salad. It’s like taking apples, car batteries, sailboats, chairs, cats, printers, and gravel and bringing them all together under one name. If this all gets implemented as currently envisioned for this year, we will amass an enormous quantity of useless data. In fairness to the Education Department, this requirement was put on it by the OMB, and everyone involved, from the DoE to the state directors of technology, is trying to make things as easy as possible for all concerned. They all realize there is no additional money for this effort, and schools are already up to their ears in testing and other data requirements.
So, we in the ed tech community are faced with an opportunity and a challenge. We are suffering, like most educational programs, from a lack of information. We can take the easy road and collect a bunch of bad data, or we can take the more difficult road and start a process of collecting better data. School districts need to show a willingness and desire to gather better data, and states need to work together to try to unify that data. The federal government needs to show some leadership, flexibility, and preferably some money.
We are not that far away. Under the leadership of the State Educational Technology Directors Association, the states have developed a definition for technological literacy, and most state officials I talked to referenced it. Most states adhere to the International Society for Technology in Education’s National Educational Technology Standards (NETS). The standards are undergoing revision this year, and the new standards are expected to be unveiled at the 2007 National Education Computing Conference. To be clear, I am not advocating a national technology curriculum, although I would argue that we have a de facto national curriculum with NETS. I am not arguing for a national assessment, but I am heartened by how much the states are able to learn from each other.
In particular, West Virginia has taken a leadership position, rewriting its entire curriculum to enforce the development of 21st-century skills, including global awareness, problem solving, critical thinking, and technology proficiency. The state is looking at a rigorous assessment of all of those areas.
This new attention to EETT’s goal of technological literacy relates to a previous complaint from OMB that there is insufficient data on the impact of the program. Now the department is requiring that the data be collected, but without an organized, consolidated collection strategy, the data will not reveal the impact of EETT or anything else. If we are going to go to the trouble of collecting data, let’s arrive at information that will help us know and understand more. Instead of leaving ourselves with apples and oranges, let’s at least do the work to create a fruit salad.
Geoffrey H. Fletcher is editorial director of T.H.E. Journal and executive director of T.H.E. Institute.