Outsmarting the Machine: Redesigning Education to Make Students, Teachers Relevant in the Algorithm Age

With algorithms and artificial intelligence expected to reshape the economy and job market, should an algorithm curriculum be worked into K–12? The Pew Research Center’s Lee Rainie discusses how research and development in algorithms will affect K–12, higher education and the next-generation labor force.


Algorithms are incredibly useful tools for accomplishing tasks. They optimize practically every process these days — from catching bad guys to making friends. They help us view curated content on our social media feeds and enable us to ask voice assistants to order takeout. The future is promising too, with algorithms playing a pivotal role in the development of machine learning and deep learning.

Some experts are embracing the “Age of Algorithms,” while others are seriously concerned that algorithms created with good intentions can have unintended consequences. In the cases of “algocratic governance” or “surveillance capitalism,” for example, automation appears to do more harm than good.

“Experts worry [algorithms] can also put too much control in the hands of corporations and governments, perpetuate bias, create filter bubbles, cut choices, creativity and serendipity, and could result in greater unemployment,” according to the latest report from the Pew Research Center titled “Code-Dependent: Pros and Cons of the Algorithm Age.”


Lee Rainie, director of internet, science and technology research at the center, co-wrote the report with Janna Anderson, director of Elon University’s Imagining the Internet Center. In 2000, as the internet was taking over popular culture, Rainie helped launch the “Pew Internet Project,” knowing the trend was going to have major impacts. The nonprofit, nonpartisan “fact tank” studies the social impact of digital technologies and has issued more than 600 reports that examine the public’s online activities and the internet’s role in their lives.

To shed light on current attitudes about the potential impacts of algorithms in the next decade, the Pew Research Center, in partnership with Elon University, recently asked more than 1,300 technology experts, scholars, corporate practitioners and government leaders whether they thought the overall effect of algorithms on individuals and society would be positive or negative.

Rainie discussed the survey’s findings during a recent Future Trends Forum video chat hosted by futurist Bryan Alexander. The canvassing revealed that 38 percent of respondents think the positive impacts of algorithms outweigh the negatives, while 37 percent believe the opposite. The remaining 25 percent said the overall impact would be equally positive and negative.

Survey participants were also asked to explain their answers further — and from that information, seven overarching themes emerged. First, algorithms will continue to spread everywhere. As a result, good things lie ahead, with plenty of opportunities to utilize data for informed decision-making. There is even a chance that “the future could be governed by benevolent artificial intelligence (AI),” according to the report.

As for concerns and challenges, Rainie said individuals are worried that algorithms will diminish human thinking and agency, or our capacity to make judgments. The report highlights a response from an assistant professor in human-centered computing at Clemson University, who wrote:

“The goal of algorithms is to fit some of our preferences, but not necessarily all of them: They essentially present a caricature of our tastes and preferences. My biggest fear is that, unless we tune our algorithms for self-actualization, it will be simply too convenient for people to follow the advice of an algorithm (or, too difficult to go beyond such advice), turning these algorithms into self-fulfilling prophecies, and users into zombies who exclusively consume easy-to-consume items.”
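The feedback loop the professor describes can be sketched in a few lines of Python. This is a toy simulation, not code from the report: the topics, starting scores and reinforcement rule are all invented for illustration. A recommender always serves the highest-scoring topic, the user conveniently clicks it, and the click raises that topic’s score, so the feed collapses onto a single interest.

```python
from collections import Counter

def recommend(scores):
    """Serve the single highest-scoring topic (a hypothetical toy policy)."""
    return max(scores, key=scores.get)

# A user starts with mildly varied interests; "sports" has a tiny edge.
scores = {"news": 1.0, "sports": 1.1, "music": 1.0, "science": 1.0}

history = []
for _ in range(50):
    topic = recommend(scores)
    history.append(topic)
    scores[topic] += 0.5  # each convenient click reinforces the recommendation

print(Counter(history))  # prints Counter({'sports': 50})
```

After 50 rounds the user has seen nothing but sports: the algorithm’s initial guess became a self-fulfilling prophecy. Real recommenders add exploration and diversity terms precisely to dampen this loop.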

Another major concern is that human biases are baked in. Even when big data involves millions, or billions, of data points being fed into algorithmic systems, those data sets still won’t represent everybody.

“Algorithms are only as good as the coders who create them and they’re only as good as the data sets that feed them — and both of those have the potential for bias,” Rainie explained. “There are ways that these data sort of have their limits that don't necessarily get reflected in the way that algorithms interpret them, nor the way humans interpret the algorithmic findings.”
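Rainie’s point that algorithms are only as good as their data can be made concrete with a small sketch. Everything below is invented for illustration: a training set that heavily samples one group and barely samples another, and a “model” that simply learns per-group approval rates from that history.

```python
# Invented training data: group_a is heavily sampled, group_b barely at all,
# and group_b's few records reflect historically skewed denials.
training = (
    [("group_a", True)] * 90 + [("group_a", False)] * 10 +
    [("group_b", True)] * 2 + [("group_b", False)] * 8
)

def approval_rate(group):
    """'Learn' a per-group approval rate straight from the historical data."""
    records = [approved for g, approved in training if g == group]
    return sum(records) / len(records)

def model(group):
    """Approve whenever the learned rate clears 50% -- the bias carries over."""
    return approval_rate(group) > 0.5

print(model("group_a"))  # True: 90% historical approval
print(model("group_b"))  # False: 20% approval in a tiny, skewed sample
```

Nothing in the code is malicious; the model faithfully reproduces the limits of its data, which is exactly the failure mode Rainie describes.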

A third challenge is that as algorithms become ever more present in our lives, there will be deeper divides between the digitally savvy and those who are less connected or unable to participate. However, Rainie pointed out that these divides occur over time with almost all cutting-edge technology.

“The extra-worry now in the Age of Algorithms is that people who know how to work with them, explain them and use them in productive ways will move even farther ahead than people who are not proficient using them,” he commented.

The last major concern — since algorithms are deeply connected to AI and robotics research — is that the impact on jobs will be severe and unemployment will rise in the Age of Algorithms.

“As machines, robotics and artificial intelligence take over more functions that are typically human functions, we’re going to go through an enormous social, political and economic turmoil because people will lose their jobs,” said Rainie.

A final thought that emerged from the survey, which Rainie said was much more “a plea” to the public, is that the need grows for algorithmic literacy, transparency and oversight. As a fellow at the Oxford Martin School answered in the report, “It will take us some time to develop the wisdom and the ethics to understand and direct this power. In the meantime, we honestly don’t know how well or safely it is being applied. The first and most important step is to develop better social awareness of who, how, and where it is being applied.”

Alexander, who said he was struck by this final note on the need for “algorithm literacy,” asked if there should be an algorithm curriculum worked in through K–12 and higher ed, or if there is another way to do it.

Rainie first discussed what happened during the industrial revolution: The need for knowledge and literacy exploded, and people prepared for the new economy by educating themselves. “The surge of literacy that emerged from that was very helpful,” he explained. “There’s that same argument now and you see it in a lot of ways about data – the broad public understands that a new cluster of literacies organized around digital skills is as essential for people to master as the traditional literacies.”

He also highlighted a strong mindset sweeping K–12 schools: that including coding in the curriculum helps students understand how devices work under the hood, making it an important skill to learn.

“As the culture shifts evermore toward a more knowledge-based economy and a brand new set of jobs where [coding] will be foundational aspects of those jobs, then you’re doing society a lot of good by preparing children — particularly K–12 but also at the postsecondary level — to know what’s going on and to make sure that there’s skills matched with the jobs of the future,” Rainie said.

Alexander followed up with an even more “futuristic question,” in his words, asking if Rainie had a sense of how education would change if the economy were to go through a period of underemployment and unemployment.

“If automation is meant just to remove a lot of jobs and create not enough yet to replace them, and we have say a decade of high underemployment and unemployment, what happens to education then?” Alexander asked. “Is it all the more important that we teach for the shrinking number of jobs, is it more important that we focus on technology, or should we prepare students for a life of unemployment?”

Rainie again alluded to what happened when the internet revolution took place, when every industry was disrupted, “yet the formal structures of education just didn’t seem to be moving at all, or hardly at all.” The basic dynamics of classrooms weren’t changing. Instead, technology was being grafted onto old learning systems rather than used to build interactive, feedback-oriented ones. “In a way, we’ve moved a little bit now, but there’s a much greater change yet to come if people can get their heads around this.”

Experts certainly hope for an era where students, no matter their age, come together and feel motivated to learn. They aim to provide plenty of individualized feedback and increase the capacity for self-paced, credential-based learning, he said.

“The structures that we’ve held might break apart,” he said. “People are going to have to constantly upgrade their skills and constantly be assessing their capacity to be relevant for the jobs of the future. Then some cluster of skills will emerge — some of which will be business-based, some of which will be available in the public domain and open source-based, some of which will be profit-based, peer-to-peer and more.”

When the forum was opened up for audience questions, a longtime superintendent and education instructor said he teaches emerging concepts about algorithms and machine learning. But he questions whether we should be teaching kids these topics or focusing on a different approach: redesigning the structure of a day at school.

“As a person who tries to get my students to re-envision and redesign schools – and these students are teachers, administrators, higher ed people – I try to get them to ask, ‘How does this technology reshape the form of education in the school, and how does the structure of the day change because of that?’” the audience member said. “This idea that everyone has to be a coder is not necessarily unimportant, but if we had schools with learning spaces that were designed for using algorithms and different types of technology instead, and redesigned the relationship to be school-to-community, that is the model we need to migrate toward.”

Rainie emphasized that teaching with specific applications isn’t the right way to go: it’s easy to get lost in the tools, and gadget-based literacy isn’t what most of these experts are talking about.

“One of the striking things from these [survey] answers that made a lot of sense to me was that the grand goal of education in the future will be to teach people how to be learners. There won't be a fixed body of knowledge that makes them relevant forever for the job market. They really have to keep self-assessing and do deep-reckonings about whether their skill set matches the way the wind is blowing.”

Another participant, who leads classes at a lifelong learning institute where the median age is about 75, also teaches his students about algorithms, artificial intelligence, robotics and related topics. In his experience, he said, a key part of any program is working with people outside of schools and colleges to try to address the fears that are circulating. In many cases, he has noticed, adults’ fear of technology and their befuddlement are projected back to the community, and one of two things happens: Either the kids themselves become fearful and buy into the idea that the machines are out to get you, or the adults become irrelevant and children stop listening to anything the adults say, which stalls a lot of innovation.

“We need to have multi-generational conversations about this,” the participant suggested. “We learn from the kids and also have them learn from us … about how we stay relevant in a world that is working hard to make you irrelevant.”

Rainie said that these inter-generational conversations have been amplified by technology in recent years, and that “as worried as we all can be about the mismatch between people’s talents and the job market for the future, it's maybe a comfort to know that humans have never socially innovated as quickly as they have technologically innovated.” It has always been the case that technology first comes on the scene, people enjoy it and then over time develop the norms, regulations, laws and new social structures that pull that technology into the broader contours of society, he said.

“These are things to worry about now, but my guess is you’re going to have parallel stories in the next generation where the larger tech continues at pace and disruptions continue at pace. However, there will be social disruptions and new ways to organize ourselves to use information to navigate the world that aren’t obvious now, but will take account of these realities over time,” noted Rainie.

In the upcoming weeks, the Pew Internet Project plans to release a report directly addressing how education and skills are going to break apart and serve up a variety of options for people as they think about how to stay connected to the job market.
