Research: Young Students Learn Better with Mix of Virtual and Real Worlds

Young learners do up to five times better when instruction combines the real world with the virtual one. That's the finding from Carnegie Mellon University, where researchers built a platform to test how technology can best contribute to learning.

"NoRILLA," as the testing platform is called, is a mixed-reality set-up that bridges physical and virtual worlds. The system includes software and hardware components, including a computer depth camera (Microsoft's Kinect for Windows) to provide personalized feedback while experimenting in the real world.

The researchers designed an experiment to see whether 92 children aged six to eight learned better with a mixed-reality educational game or with a screen-only version. The test also explored what impact adding a physical control (such as shaking the tablet) had on the students' enjoyment of the activity.

In the experiment students played one of four versions of "EarthShake," a game featuring a friendly gorilla and towers made from blocks. The game teaches students basic physics principles such as stability and balance.

In a mixed-reality version of the game, the gorilla asked the students which of two block towers on an earthquake table in front of them would fall first. The students used a mouse to select their choice on the projected interface and, after discussing why they thought that tower would fall, clicked the shake button on the projected screen; the earthquake table was then activated with a switch. Once a tower fell, the game recognized the result in the real world and gave interactive feedback accordingly.

When students chose the correct tower, the gorilla congratulated them and asked why they thought it fell first. If they chose the wrong tower, the gorilla told them so. Either way, the students were then asked to select the reason they thought the given tower fell from a list of possible explanations.

The researchers then offered three variations on that approach. In one version, the students played the mixed-reality game but could control the earthquake table themselves with a physical switch. In a second version, students played a screen-only version of the game on a laptop, answering questions and pressing the shake button with a mouse; there was no physical interaction with the game. In the third version, students played the screen-only version on a tablet and could shake the tablet to trigger the earthquake. Students were tested before and after the game to measure learning.

The mixed-reality variations improved learning almost five times more than the screen-only alternatives did. On top of that, students enjoyed the game more in the mixed-reality conditions. However, the addition of physical controls (shaking the tablet or operating the earthquake switch) did little to improve either learning or enjoyment of the game.

As the researchers concluded in a paper on the experiment, "Mixed-reality games that support physical observation in the real world have a great potential to enhance learning and enjoyment for young children."

Next, the researchers expect to extend the experiment to understand how learning and enjoyment compare when more elaborate hands-on activities are added, such as children building with the blocks. They also expect to expand the game to other content areas, for example by using a balance scale rather than an earthquake table.

PhD student Nesra Yannier is working on the project with Kenneth Koedinger and Scott Hudson, two professors in the university's Human-Computer Interaction Institute. The team recently presented a paper on their findings at the Association for Computing Machinery's Conference on Human Factors in Computing Systems in Seoul.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
