2023-24 Tools Competition

Facilitating Learning Science Research

Finalists for this track have been announced. Learn more here.

Phase III results were released on April 4, 2024. For Phase II competitors, please contact our team at ToolsCompetition@the-learning-agency.com if you would like to receive feedback on your Phase II proposal. 



Learn About the Track

Track Description

Tools and technologies that facilitate the learning science research process in order to improve K-12 learning interventions and expand the field's knowledge of what works for learners. Competitors in this track may propose solutions that make research more accessible, reliable, and efficient.
The track seeks to drive innovation and collaboration in support of a deeper, more holistic understanding of what works well, for whom, and under what conditions.
Education research and development, particularly for historically marginalized populations, is notoriously underfunded compared to other fields. Moreover, researchers, developers, and educators too often work in silos, stunting advancement in the field. Strengthened research in education enables the development and improvement of K-12 edtech solutions, professional development, and classroom implementation – and ultimately boosts student learning.

Target User/Audience

Tools should target academics, school administrators, practitioners, or innovators interested in performing high-quality research related to K-12 learning as their primary user group. 

Who should submit?

Competitors worldwide are invited to submit (see the official rules for any restrictions). We welcome proposals from teams or individuals from all backgrounds, including researchers/universities, edtech companies, educators, or students (undergraduate or graduate).

Competitive Priorities

Based on the most pressing needs in learning and learning technologies, a subset of awards will be reserved for tools that address specific competitive priorities.

Examples

While the Tools Competition has different tracks and competitive priorities from year to year, the winning tools below are examples of what would be considered compelling for the Facilitating Learning Science Research Track in this year’s competition.

Judges

Tools Competition judges play a critical role in selecting Tools Competition Winners and bring expertise spanning philanthropy, research, industry, and education. Judges will hear virtual pitches from finalists in Phase III of the competition.

Joanna Cervantez
Microschool Founder and Math Ed Consultant
A Fuller Education

Deb Crawford
Mathematics Supervisor
Frederick County Public Schools

Mihai Dascalu
Professor
University Politehnica of Bucharest

Matt Glanville
Director of Assessment
International Baccalaureate

Libby Hills
Co-Lead, Learning Schools
Jacobs Foundation

Dan Jarratt
Fellow, Learning Engineering
Schmidt Futures

Romana Kropilova
Program Manager, Learning Schools
Jacobs Foundation

Temple Lovelace
Executive Director, Assessment for Good
Advanced Education Research and Development Fund (AERDF)

Katie McCarthy
Assistant Professor
Georgia State University

Caitlin Mills
Assistant Professor of Educational Psychology
University of Minnesota

Pedja Neskovic
Program Manager
Office of Naval Research

Na’ilah Suad Nasir
President
Spencer Foundation

Bryan Richardson
Senior Program Officer
Bill & Melinda Gates Foundation

Rod Roscoe
Associate Professor
Arizona State University

Jon Sotsky
Director, Strategic Impact and Learning
Overdeck Family Foundation

Martin Tomasik
Full Professor for Research Methods in Developmental and Educational Science
University of Zurich

Compete in this Track

1. Select your award level

When submitting a proposal, competitors must select the relevant award level based on the size and scale of their tool. Proposals at all award levels should detail how the proposed tool will solve a defined problem, rather than focus on past achievements.

Catalyst Level Awards: $50,000

These awards are designed for early-stage competitors.

Growth Level Awards: $150,000

These awards are designed for competitors whose tool already has some users and early scale.

Transform Level Awards: $300,000

These awards are designed for advanced competitors with an established tool.

2. Apply for supplemental prizes

Determine whether you are eligible to apply for the supplemental prizes below (optional).

Implementation Impact Prize:

Supplemental funding of $100,000 is available to competitors submitting at the Growth or Transform level to implement their product and conduct research in a school setting. The district or consortium of districts must serve at least 10,000 students, the majority of whom come from historically marginalized populations. Competitors must also submit a research plan.

In Phase I, competitors will indicate if they intend to compete for the Implementation Impact Prize. In Phase II, competitors will detail their research plan and submit a letter of agreement from the school partner.

OpenAI Learning Impact Prize:

In partnership with OpenAI, the Tools Competition is thrilled to offer the OpenAI Learning Impact Prize to catalyze the impact of artificial intelligence (AI) on learning outcomes and drive edtech innovation leveraging advanced computational methods.
Given the rise of ChatGPT and growing interest in the possibilities of AI in educational settings, the 2023-24 competition is especially interested in supporting teams that are exploring, testing, or using AI-enabled tools and services to impact learning.

OpenAI will select up to three recipients of the OpenAI Learning Impact Prize from among competition winners. Recipients will receive $100,000 in additional funds, $10,000 in OpenAI API credits, and technical guidance from OpenAI engineers. Select contenders will also receive $2,500 in OpenAI API credits.

Competitors in any track who indicate an interest in being considered for this award will complete additional requirements when submitting their proposal materials.


Proposals will be evaluated against the following criteria:

1. Likelihood to improve learning and instruction
2. Likelihood to improve research
3. Ability to support equity and close opportunity gaps
4. Ability to scale


What is a Tool?

The Tools Competition funds edtech tools and technologies that support learning outcomes and can contribute to learning science research.

Eligible tools have the potential to generate novel learning data that researchers can study to better understand learning at scale. This may include an app, software, algorithm, or other digital technology that facilitates or supports continuous data collection and has the potential to scale at minimal cost.

Please note that this definition is not exhaustive. As technology continues to develop and innovations are created globally, other tool concepts may also be competitive.

Not sure your tool is eligible? Explore winning tools from previous years or get in touch.

Timeline


September 21, 2023
Competition Launch
November 10, 2023
Deadline for Phase I Abstracts

Competitors submit an abstract describing the concept for their tool and responding to the evaluation criteria.

December 8, 2023
Select competitors invited to Phase II
February 2, 2024
Deadline for Phase II Proposals

Competitors develop a proposal and budget detailing their tool and its technology and responding in detail to the evaluation criteria. Rubrics will be posted when Phase II opens.

April 2024
Finalists invited to Phase III pitches
April-May 2024
Phase III Pitches (virtual)
Finalists pitch before a panel of expert judges and have the opportunity for support and feedback in crafting their pitch.
June 2024
Winners Announced
Winners are announced and receive the first installment of their award. Winners receive coaching, the opportunity to connect with leaders in the field, and the ability to present to researchers or to refine their tool.
Year Following the Competition - Winner Impact Study
All winners will work with Georgia State University during the year following the competition to define impact measures for their tool and set up processes for ongoing data collection and evaluation.
December 2024
Product Review Day
Winners present their progress to date and receive feedback from other winners and leaders in the field. Progress is assessed against each winner's proposal, and winners receive the second installment of their award after Product Review Day.

Explore a Different Track

Meet the finalists for the Building an Adaptive & Competitive Workforce track here. Meet the finalists for all other tracks here.