Reinventing the Technology ‘Wheel’

Dr. Geoffrey H. Fletcher, Editor-at-Large

When I started working in the field of futures study in the mid-’70s, I recall hearing then-professor Chris Dede (now Chair, Teaching & Learning, Technology in Education, Harvard Graduate School of Education, and a member of our editorial board) talk about technology assessment. At that time, the term “technology assessment” referred to the practice or study of a technology’s impact on the field into which it was introduced, and on other aspects of society. One point Chris made, which has stuck with me ever since, is that the two most common errors of technology assessment are: 1) to overestimate an innovation’s speed of diffusion, and 2) to underestimate its eventual consequences. Obviously, we cannot anticipate all of the changes that may occur as a result of implementing a technology innovation in education. But part of our job should be to think through the consequences of implementing a technology beyond the immediate budget.

Most of us do not have the tools to systematically and systemically consider possible consequences. However, the ISTE Leadership Symposium at this year’s National Educational Computing Conference in Philadelphia provided an experience with a tool called “The Implications Wheel” (www.implicationswheel.com), led by renowned futurist Joel Barker, which could be phenomenally helpful to leaders in technology and education. The Implications Wheel is a new iteration of the “Futures Wheel” that I first learned about from Joel in 1976 at a Minnesota workshop. I often used the wheel when I was teaching middle school and high school futures and English classes. The updated and computerized Implications Wheel not only automates a simple exercise that can be done with students of all ages, but also adds significant power, such as enabling the quantification of consequences and encouraging a healthy discussion of each one. Some ISTE staff members are trained to implement this tool; it could be useful for you, too.

Holding On to the Computer Lab Model

I know that if I had thought of using a Futures Wheel while I was a technology coordinator and bureaucrat, or if I had had access to the Implications Wheel, I would have made different decisions, or at least thought differently about certain policies.

A case in point was the push in the ’80s for computer literacy, with a number of states requiring that all of their middle school students take a computer literacy class. Most districts responded to this effort by aggregating computers from around the school and the district into computer labs at the middle schools. They often hired someone to be the “computer teacher,” or converted a teacher from another field to be the computer teacher. Students went to these labs once or twice each week and became computer literate—policy accomplished!

However, another impact of that policy was that it delayed the integration of technology throughout curriculum and instruction for a long time, in two ways:

  1. It concentrated most of a district’s computing resources in computer labs at a single school level. The resulting lack of access elsewhere made it virtually impossible for kids or teachers to get excited about technology, or to use it to teach and learn differently.
  2. Hiring a dedicated computer teacher absolved all other teachers of both the opportunity and the responsibility of using technology in their teaching and learning environments.

Two decades later, many schools are maintaining the computer lab model, often at the expense of using technology in other classrooms. In these schools, technology is still the hood ornament on the conventional classroom, instead of the driveshaft of the teaching and learning process.

Unforeseen problem. What started these musings was reading two stories for this month. In the first, Una Daly writes about “The Hidden Costs of Wireless Computer Labs” (page 13). She notes, “While space savings and an improved student-to-computer ratio was realized throughout our district, security and maintenance issues surfaced that were not always factored into the purchase decision.” Indeed, looking at all aspects of consequences—especially maintenance and security—is critically important for a successful implementation of any technology. In the second article, “Doing More With Less” (page 34), Pam Haney points out: “We must not only guarantee connectivity, but also restrict illicit use of the Internet, which can put us in a catch-22 situation in our middle and high schools. These students are often very computer-savvy, but are still in the process of developing an ethical awareness. Ultimately, we’re responsible for keeping them out of mischief, but the better they’re taught—which is the district’s fundamental goal—the harder our job becomes.”

I don't know that anyone envisioned this problem when we started using the Internet in schools. Interestingly, a true unintended consequence of the ever-broadening security problem has been the expansion of the technology security industry. And this industry will most likely continue to grow as, unfortunately, the outlaws and evildoers persist in their efforts to create and disseminate viruses, spam, and other plagues.
