4 Ways to Improve the Use of Data in CTE


While the collection and use of data to improve instruction has been well explored in K-12 and higher education, career and technical education (CTE) on its own has received far less focus. MDRC recently issued a 10-page brief laying out the challenges CTE programs face in collecting data (such as a lack of staff dedicated to the job) and four steps they can take "to strengthen their own CTE data-collection and measurement activities." MDRC is a nonprofit research organization that develops solutions to difficult problems in numerous fields, including education.

The information covered in the brief came from a "scan of leading CTE programs" (including those that are part of MDRC's Center for Effective CTE) and interviews with the people running those programs and researching them. Research Associate Hannah Dalporto, the author of the report, also talked with "innovative leaders, consultants and organizations" involved in CTE to understand their data strategies and the obstacles they've faced.

According to the report, creating a data strategy requires answering two questions: What problem does the program address, and how does it do that? Knowing those answers is the first step in figuring out what data to collect and how to use it to measure whether the program is meeting its goals. That feeds into the creation of what MDRC called a "theory of change" — a model that lays out the "essential components and mechanisms" of the program that result in success. From there, it's a matter of collecting data to answer questions about the outcomes.

Challenges surface in the process. One is the difficulty of measuring outcomes that require getting data from a number of institutions or schools. "For example," Dalporto noted, "a high school CTE program may feature a work-based learning component such as an internship and also offer classes that count for college credit. To measure outcomes, that program might need to collect data from secondary, postsecondary and workforce data systems."

The report winnowed the steps for effective data collection and usage down to four:

  1. Conduct a needs assessment and develop a theory of change. That should cover what problems the program is trying to address in the community and how the model leads to change and makes an impact.
  2. Define the "priority research questions" and the most important outcomes. Having that information will help narrow down what data to collect in the limited time available.
  3. Set up the data collection processes, through the use of spreadsheets or dedicated software (LaunchPath for work-based learning management and ImBlaze for internship management both receive a nod) and tapping into data already collected by school districts, community colleges and state or national agencies.
  4. "Iterate, adapt and update." Only once program people have begun collecting, analyzing and reporting on the data will they identify the gaps and come up with solutions for addressing them.
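
To make the data-linking step concrete, here is a minimal sketch of what combining records from separate secondary, postsecondary and workforce data systems might look like. All system names, field names and records are invented for illustration; they are not from the MDRC brief, and real systems would require formal data-sharing agreements and careful record matching.

```python
# Hypothetical sketch: linking per-student records from separate data
# systems (secondary, postsecondary, workforce) to answer an outcome
# question. All fields and values are illustrative.

secondary = {  # keyed by a shared student ID
    "s001": {"completed_internship": True},
    "s002": {"completed_internship": False},
}
postsecondary = {
    "s001": {"college_credits_earned": 6},
    "s002": {"college_credits_earned": 0},
}
workforce = {
    "s001": {"employed_after_exit": True},
    "s002": {"employed_after_exit": True},
}

def merge_student_records(*systems):
    """Combine per-student records from several data systems into one view."""
    merged = {}
    for system in systems:
        for student_id, fields in system.items():
            merged.setdefault(student_id, {}).update(fields)
    return merged

records = merge_student_records(secondary, postsecondary, workforce)

# Example outcome question: of students who completed an internship,
# how many also earned college credit?
interns = [r for r in records.values() if r.get("completed_internship")]
with_credit = sum(1 for r in interns if r.get("college_credits_earned", 0) > 0)
print(f"{with_credit} of {len(interns)} interns earned college credit")
```

The point of the sketch is simply that each system holds a different slice of a student's record, and the outcome question can only be answered once those slices are joined on a shared identifier.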

There's a lot to be gained from having "better metrics," Dalporto suggested. One is enabling programs and funders "to pinpoint the near-term measures that predict future workforce or college success." Another is to help avoid "the mistakes" of the past for vocational education. As an example, "funders are increasingly paying attention to diversity, equity and inclusion and asking for data on related outcomes," the brief stated. "Simply examining the outcomes of subgroups defined by race and ethnicity, gender or socioeconomic status can reveal otherwise hidden inequities."
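
The subgroup analysis the brief describes can be sketched in a few lines. The data below is invented to show how an acceptable aggregate rate can hide a large gap between groups; the grouping labels and outcome field are hypothetical.

```python
# Hypothetical sketch: disaggregating one outcome by subgroup to surface
# otherwise hidden inequities. All records are invented for illustration.
from collections import defaultdict

students = [
    {"group": "A", "completed_program": True},
    {"group": "A", "completed_program": True},
    {"group": "B", "completed_program": True},
    {"group": "B", "completed_program": False},
    {"group": "B", "completed_program": False},
]

def completion_rate_by_group(rows, group_key, outcome_key):
    """Return each subgroup's share of students with a positive outcome."""
    totals, positives = defaultdict(int), defaultdict(int)
    for row in rows:
        totals[row[group_key]] += 1
        positives[row[group_key]] += bool(row[outcome_key])
    return {g: positives[g] / totals[g] for g in totals}

rates = completion_rate_by_group(students, "group", "completed_program")
# The aggregate rate here is 60%, which hides that group A completes at
# 100% while group B completes at 33%.
for group, rate in sorted(rates.items()):
    print(f"Group {group}: {rate:.0%}")
```

The same function could be pointed at any grouping variable — gender or socioeconomic status as well as race and ethnicity — which is what makes disaggregation a cheap first check once the data is in hand.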

The full report, "Building Effective Data Strategies in Career and Technical Education," is openly available on the MDRC site.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
