Assessing the Impact of Instructional Technology on Student Achievement


The WEB Project was a five-year Technology Innovation Challenge Grant that concluded in September 2000. Its purpose was to infuse standards-based instruction in multimedia, digital art, music composition, and online discourse into the general arts and humanities curricula of Vermont K-12 schools. Multimedia technology was incorporated within six academic content areas: art, music, technology, history/social studies, English/language arts, and interdisciplinary studies.

Students shared their works-in-progress with a virtual community of other students, teachers, digital artists, traditional artists, musicians, composers, Web page designers, and other experts. They did so through a virtual learning environment called The WEB Exchange, which resided on The WEB Project's server. Through threaded design conversations, students requested feedback on their works of art, music, and multimedia. They then filtered the feedback they received and used it to improve their final artistic products, many of which they posted on The WEB Exchange.

Concurrently, language arts students, aided by language arts teachers and mentors from the Vermont Center for the Book, discussed curriculum-related texts. Moderating their own discussions, students engaged in deep, rich dialogue focused on standards-based activities such as responding to text, substantiating arguments with evidence from the text, and making informed decisions. These interventions were stable over the last two to three years of the five-year grant.

Measuring the Project's Impact

One of the research questions posed by the RMC Research Corporation evaluation team was, "What is the impact of The WEB Project on student achievement?" Our intent was to generalize our methodology to other instructional technology grants in which student achievement must be reported. Findings from an online survey of The WEB Project teachers and administrators, repeated in spring 1998, 1999, and 2000, indicated that a connection between student motivation, metacognition, and learning processes, as outlined in a conceptual model developed by Sternberg (1998), might be applicable.

According to Sternberg, motivation drives metacognition, which, in turn, stimulates the development of thinking and learning skills. Thinking and learning skill development further stimulates metacognition, resulting in the development of expertise. The evaluation team extended Sternberg's Developing Expertise model by defining "expertise" as student achievement measured by teacher-created rubrics. Participating teachers developed, refined, and benchmarked rubrics for student-created products over the final three years of the grant. Teachers also selected a rubric for measuring student learning processes from Marzano et al.'s (1993) Dimensions of Learning model; this rubric addressed the depth and richness of revisions to student-created products and performances.

The Survey

The evaluation team used mixed methods, consisting of the online survey, student pretest and posttest surveys, and scores on teacher-created or teacher-selected rubrics that assessed students' learning processes and final products, and applied structural equation modeling to relate the elements of the extended Sternberg model. The hypothesis was that motivation would drive metacognition, and that metacognition would drive thinking and learning processes (specifically, inquiry learning and application of skills, two scales derived from WEB Project activities). Increases in thinking and learning processes would, in turn, result in increases in teacher-scored measures of student achievement. The student survey was pilot-tested in spring 1999, and the three derived scales (metacognition, inquiry learning, and application of skills) had high internal consistency (alpha = .72 to .84). Two ten-item sets of questions measuring "in this class" and "in school in general" motivation were added to the survey in spring 2000.
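
For readers who want to reproduce this kind of reliability check, Cronbach's alpha for a derived scale can be computed directly from the item responses. The following sketch is illustrative only; the data file and item column names are hypothetical placeholders, not the project's actual instrument.

  # Illustrative sketch: Cronbach's alpha for a derived survey scale.
  # The file name and item columns are hypothetical, not the WEB Project instrument.
  import pandas as pd

  def cronbach_alpha(items: pd.DataFrame) -> float:
      """Cronbach's alpha for a DataFrame whose columns are one scale's items."""
      items = items.dropna()
      k = items.shape[1]                               # number of items in the scale
      item_var = items.var(axis=0, ddof=1).sum()       # sum of item variances
      total_var = items.sum(axis=1).var(ddof=1)        # variance of the summed scale
      return (k / (k - 1)) * (1 - item_var / total_var)

  responses = pd.read_csv("student_survey.csv")        # hypothetical data file
  metacognition_items = responses[["meta_1", "meta_2", "meta_3", "meta_4"]]
  print(f"Metacognition alpha = {cronbach_alpha(metacognition_items):.2f}")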

In January 2000, the survey was administered to 165 students in nine cooperating schools. One hundred thirty-seven responses came from students who had not yet been exposed to the intervention and could therefore be used as pretests. Internal consistency reliability for all five scales (class motivation, school motivation, metacognition, inquiry learning, and application of skills) ranged from alpha = .70 to alpha = .87. In May 2000, at the end of the spring term, the survey was re-administered as a posttest to the same group of students. As of August 2000, 131 completed surveys had been returned from all nine schools. About 75% of the responding students were from high schools and 25% were from middle schools; gender was about equally distributed.

Seventy-six valid data sets were matched in order to conduct a true repeated measures analysis (pretest vs. posttest). Only the "application of skills" scale increased significantly during the spring term (two-tailed p = .0165).
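
The matched pretest-posttest comparison reported here is a standard paired (repeated measures) test. A minimal sketch follows, assuming one row per matched student and hypothetical file and column names.

  # Illustrative sketch: paired pretest/posttest comparison for each survey scale.
  # Assumes one row per matched student; file and column names are hypothetical.
  import pandas as pd
  from scipy import stats

  matched = pd.read_csv("matched_pre_post.csv")        # the 76 matched records (hypothetical file)

  scales = ["class_motivation", "school_motivation", "metacognition",
            "inquiry_learning", "application_of_skills"]
  for scale in scales:
      pre, post = matched[f"{scale}_pre"], matched[f"{scale}_post"]
      t_stat, p_value = stats.ttest_rel(post, pre)     # two-tailed paired t-test
      print(f"{scale}: mean change = {(post - pre).mean():.2f}, p = {p_value:.4f}")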

For the path analysis, the posttest survey results were correlated with teacher assessments. Participating teachers assigned each student's final product a "product" score of "0" (no evidence), "1" (approaches standards), "2" (meets standards), or "3" (exceeds standards). Products were re-scored by a jury of experts to increase reliability, resulting in 91 reported "product" scores. One hundred seven teachers assigned a "process" score of "1" (low) to "4" (high) to each of their participating students for the quality and depth of revisions to their final products, which they construed as a measure of student learning processes. These data constituted two independent measures of student achievement, which served to complete the model.
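
Completing the model required one analysis record per student, joining the posttest survey scales to the two rubric scores. Below is a minimal sketch of that data assembly, assuming hypothetical student identifiers, files, and column names.

  # Illustrative sketch: join posttest survey scales with the two teacher rubric
  # scores to form one analysis record per student. Identifiers, files, and
  # column names are hypothetical.
  import pandas as pd

  posttest = pd.read_csv("posttest_survey.csv")        # survey scale scores per student
  product = pd.read_csv("product_scores.csv")          # jury-verified product score, 0-3
  process = pd.read_csv("process_scores.csv")          # teacher process score, 1-4

  analysis = (posttest
              .merge(product, on="student_id", how="inner")
              .merge(process, on="student_id", how="inner"))

  # Sanity-check the rubric ranges before modeling.
  assert analysis["product_score"].between(0, 3).all()
  assert analysis["process_score"].between(1, 4).all()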

Four separate simplified path analysis models were tested. The first pair addressed process and product outcomes for class motivation, and the second pair addressed school motivation. The statistically significant (p < .05) results were as follows:

  • Motivation was related to metacognition. The relationship between class motivation and metacognition (R = .307) was slightly stronger than the relationship between school motivation and metacognition (R = .282, p < .0001).
  • The relationship between metacognition and inquiry learning (Beta = .546, p < .0001) was stronger than the relationship between metacognition and application of skills (Beta = .282, p < .0001).
  • The relationship between inquiry learning and the student learning process outcome (Beta = .384, p = .001) was stronger than the relationship between application of skills and the student learning process outcome (Beta = -.055, not significant).
  • The relationship between application of skills and the student product outcome (Beta = .371, p = .004) was stronger than the relationship between inquiry learning and the student product outcome (Beta = .063, not significant).

Clearly, correlation does not imply causation. However, when each of these elements was treated as an independent variable, there was a corresponding change in its associated dependent variable. For example, the significant correlation between motivation and metacognition indicates that students' enthusiasm for learning with technology may stimulate their metacognitive (strategic) thinking processes. The significant correlations among motivation, metacognition, inquiry learning, and the student learning process score indicate that motivation may drive increases in the four elements connected by the first path. Similarly, the significant correlations among motivation, metacognition, application of skills, and the student product score indicate that motivation may drive increases in the four elements connected by the second path.
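
A simplified path analysis of this kind can be approximated as a chain of regressions on standardized variables, with the standardized coefficients playing the role of the Beta weights reported above. The sketch below continues from the hypothetical "analysis" data frame assembled earlier; it illustrates the general technique, not the evaluation team's actual model specification.

  # Illustrative sketch: a simplified path analysis run as a chain of OLS
  # regressions on standardized (z-scored) variables, so the fitted coefficients
  # are comparable to standardized Beta weights. Column names are hypothetical,
  # and the "analysis" frame comes from the earlier data-assembly sketch.
  import pandas as pd
  import statsmodels.api as sm

  def standardize(s: pd.Series) -> pd.Series:
      return (s - s.mean()) / s.std(ddof=1)

  z = analysis[["class_motivation", "metacognition", "inquiry_learning",
                "application_of_skills", "process_score", "product_score"]].apply(standardize)

  def path_weight(outcome: str, predictor: str) -> float:
      """Standardized coefficient for a single-predictor path."""
      X = sm.add_constant(z[[predictor]])
      return sm.OLS(z[outcome], X, missing="drop").fit().params[predictor]

  print("class motivation -> metacognition:  ", round(path_weight("metacognition", "class_motivation"), 3))
  print("metacognition -> inquiry learning:  ", round(path_weight("inquiry_learning", "metacognition"), 3))
  print("inquiry learning -> process outcome:", round(path_weight("process_score", "inquiry_learning"), 3))
  print("application of skills -> product:   ", round(path_weight("product_score", "application_of_skills"), 3))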

The significant correlations of the two teacher measures of student achievement with the student survey data validated the evaluation team's extension of the Developing Expertise model as an explanation of increases in student performance resulting from engagement in technology-supported learning activities. Moreover, nearly all students across the project met the standards on both the teacher-created product assessment and the learning process assessment. This indicates that, in general, the project had a positive impact on student achievement.

 

Conclusions

These preliminary findings suggest that teachers should emphasize the use of metacognitive skills, application of skills, and inquiry learning as they infuse technology into their respective academic content areas. Moreover, these activities are directly in line with the Vermont Reasoning and Problem Solving Standards, and with similar standards in other states. The ISTE/NETS standards for assessment and evaluation also suggest that teachers:

  • Apply technology in assessing student learning of subject matter using a variety of assessment techniques.
  • Use technology resources to collect and analyze data, interpret results, and communicate findings to improve instructional practice and maximize student learning.
  • Apply multiple evaluation methods to determine students' appropriate use of technology resources for learning, communication, and productivity.

 

Rockman (1998) suggests that "A clear assessment strategy that goes beyond standardized tests enables school leaders, policymakers, and the community to understand the impact of technology on teaching and learning." RMC Research Corporation's extension of the Sternberg model can be used to organize and interpret a variety of student self-perceptions, teacher observations of student learning processes, and teacher-scored student products. It captures the overlapping kinds of expertise that students developed throughout their technology-related activities.

One of the greatest challenges facing the Technology Innovation Challenge Grants and the Preparing Tomorrow's Teachers To Use Technology (PT3) grants is to establish a link between educational technology innovations, promising practices for teaching and learning with technology, and increases in student achievement. We believe that this model may be replicable in other educational settings, including schools, districts, institutions of higher learning, and grant-funded initiatives. To use this model, however, participating teachers must be able to clearly identify the standards they are addressing in their instruction, articulate the specific knowledge and skills to be fostered through technology, carefully observe student behavior as students create and refine their work, and create and benchmark the rubrics they intend to use to evaluate student work.

Lorraine Sherry
Shelley Billig
Daniel Jesse
Deborah Watson-Acosta
RMC Research Corporation, Denver

References

Marzano, R.J., Pickering, D., & McTighe, J. 1993. Assessing student outcomes: Performance assessment using the Dimensions of Learning model. Littleton, CO: McREL Institute.

Rockman, S. 1998. Communicating our successes: Issues and tactics. Unpublished manuscript.

Sternberg, R.J. 1998. "Abilities are forms of developing expertise." Educational Researcher, 27(3), 11-20.
