2022 Learning Engineering Tools Competition
Frequently Asked Questions

Eligibility

Eligible tools and technologies include an app or platform, software, an algorithm, or other similar educational technology.

We are interested in tools and technologies that enhance learning or improve the education system, collect and generate data that supports learning science research and continuous product improvement, and can be scaled. This is core to the learning engineering objectives of the competition.

For the purposes of this competition, proposals that focus solely on hardware, curricular resources such as lesson plans or video guides, community platforms, or in-person programming are rarely competitive, as such proposals struggle either to support learning engineering principles or to scale without significant continuous investment.

Please refer to the Official Rules. All participants must agree to these rules to compete.

Yes! The Tools Competition is eager to hear from participants from across the globe. Participants must be able to accept funds from US-based entities.

Individuals and entities residing in Cuba, Iran, North Korea, Russia, Sudan, and Syria are not eligible to participate in the competition.

Yes, proposals must be in English.
Yes! Winners retain full intellectual property. Competition organizers do not seek shares of or equity in the product or company.

Yes! We are eager to hear from and support individuals who are new to the field. We encourage newcomers to compete at the Catalyst award level, where their proposals will be most competitive. Please see more information below on award levels, and take the eligibility quiz for more guidance.

Yes! Anyone 18 years or older is eligible, and we are eager to hear from people at all stages of the development process.
Yes! You are not required to compete as part of a team or to be affiliated with an organization or company.
You are welcome to partner with other organizations. This should be mentioned in your proposal and reflected in the budget.
Yes! We encourage you to submit a proposal and make a note of your conflict.

Developing successful proposals

Submissions for the 2022 Tools Competition are open through November 20, 2022. You can read more about our submission process and how to compete here.

The Tools Competition seeks to spur new tools and technology. This means that something about the proposal needs to be fresh, innovative, or original. This does not mean you have to create a new tool or new platform.

Proposals seeking a Growth Phase or Transform Phase award must build on an existing platform, which may be at varying levels of development and scale. The proposal might be an API that will improve the platform, a new tool to improve effectiveness, or new infrastructure that allows external researchers to access your data.

See more about award levels and eligibility requirements here.

The competition is open to solutions for Pre-K to secondary learners.
Active users are defined as individuals who use the tool regularly and in a meaningful way. We recommend that your users fall within the age range and target audience that your tool focuses on. Beta users, if they are just testing the functionality of the tool for a specified period of time, do not count as active users.
We are looking for user engagement at every stage of development and implementation. This can look like conducting interviews or focus groups to understand user needs, iterating on the functionality or features of your tool based on user feedback, designing the user interface based on feedback, and so on. Visit our blog to learn more about how previous Tools Competition winners (Springboard Collaborative, Podsie, and Humanitus Learning Sciences and Consulting Services) successfully engage users.

The competition has four ‘tracks’ or priority areas that reflect the pressing needs and opportunities in education. Competitors will be required to select one of the tracks in which their submission will be primarily evaluated. Competitors can also select a secondary track.

The competition tracks include:

  • Transforming assessments to collect new measures, drive quality and reduce cost.
  • Strengthening teacher development and support.
  • Facilitating faster, better, and cheaper learning science research.
  • Accelerating learning for all.

Each track has somewhat different requirements and eligibility criteria and certain tracks may be more or less competitive than others depending on final funding allocation and the number of competitors in each track. Tracks may also have different total prize purses, depending on sponsor priorities.

See more on each track here.

The Phase I submission form will ask you to select a primary track and a secondary track. Your primary track should be the track that is the best match.

If competition organizers invite you to Phase II, they will carefully review the proposal to confirm your track or recommend a new one.

Proposals will need to stand alone independently, but they can certainly support each other. That said, it is unlikely that more than one proposal by the same team will win.
Phase II proposals should not be drastically different from Phase I, but we do expect that there may be changes as you further refine your idea.
You may provide a description of both, but your focus should be on what the funds from the award would enable you to do.

Consider the following recommendations:

  • Focus on competitive priorities. In the Assessment, Research, and Accelerating Learning Tracks, there is a competitive priority for solutions for math learning across Pre-K to secondary education. In the Assessment Track only, there is an additional competitive priority for solutions addressing non-academic measures across Pre-K to secondary education.
  • Prioritize your tool’s alignment to learning engineering principles. See more on learning engineering here.
  • Incorporate the need and demand of learners, families, and educators into the design and development of the tool. See more on this here.
  • For Growth or Transform Phase competitors in the Learning Science Research track, there is a supplemental award of $100,000 available for proposals that include a district partnership. See more below.
Please include references as needed. Any citation format is acceptable. References will count against character and word limits, so we recommend using links or abbreviated references where possible in the PDF version of your proposal.
You may include images, but the proposal should be able to stand alone in written form. You should not include any external links, unless requested. Reviewers will not click on or consider any external links. Please note that you will only be able to include images in the PDF version of your proposal and not via the submission form. This is the version that reviewers will consider.
The optional video is a brief (30 second) introduction of yourself and your team and your idea, similar to an elevator pitch. This will help us get to know you beyond your proposal. Your video does not need to include a demo (nor will you likely have the time!). Reviewers will not consider videos longer than 30 seconds. You should include a link to the video in the submission form question.
You can start building the tool or functionality, but there is no promise of funding. Similarly, it is OK for the execution plan to begin before funding is administered.

Award Levels & Budget

The competition is designed to be inclusive and support talent and ideas at all stages of development. As such, competitors can compete at one of three award levels:
  • Catalyst ($50,000): aimed at new competitors, including students, teachers, civic technologists, or those who need that initial spark of support to get started.
  • Growth ($100,000): for teams that have a minimum viable product upon which their new idea will build and some users.
  • Transform ($250,000): for teams with an established platform with more than 10,000 users upon which the new idea will build.

Growth or Transform Phase competitors in the Learning Science Research track are eligible to receive a supplemental award of $100,000 for district partnerships, if the district has at least 10,000 students of which the majority come from historically marginalized populations. See more below.

Complete the eligibility quiz to determine which award level best fits your proposal.

Tools at the Catalyst award level will look different and be at varying stages of development. The product of your proposal may be an MVP or a prototype, or you may still be in the ideation phase and taking steps toward these goals as a result of your proposal.
If your tool has no current users and is not on the market, we recommend the Catalyst award level.
No, you are not required to compete in the Transform award level. You are welcome to compete for a lower award level if you believe that the idea is in an earlier stage of development.
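As a rough illustration of the award-level criteria above, the decision logic can be sketched in Python. This is only our reading of the FAQ (the function name and thresholds-as-code are our own framing); the eligibility quiz and Official Rules are authoritative:

```python
def suggest_award_level(has_mvp: bool, active_users: int) -> str:
    """Suggest an award level from the FAQ's criteria.

    Illustrative sketch only: the official eligibility quiz and rules
    govern actual eligibility, and teams may always opt into a lower
    award level than the one suggested here.
    """
    if active_users > 10_000:
        # Transform: an established platform with more than 10,000 users
        return "Transform ($250,000)"
    if has_mvp and active_users > 0:
        # Growth: a minimum viable product and some users
        return "Growth ($100,000)"
    # Catalyst: new competitors with no tool on the market yet
    return "Catalyst ($50,000)"

print(suggest_award_level(has_mvp=False, active_users=0))      # Catalyst ($50,000)
print(suggest_award_level(has_mvp=True, active_users=25_000))  # Transform ($250,000)
```

Note that the third branch matches the guidance above: a tool with no current users and not yet on the market points to the Catalyst level.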

Growth or Transform Phase competitors in the Learning Science Research track are eligible to receive a supplemental award of $100,000 for partnerships with a district or consortium of districts with at least 10,000 students of which the majority come from historically marginalized populations. The partnership must include:

  • co-design of research questions
  • data collection from at least 3,000 students by Year 2
  • a strategy to incorporate the findings from the research into district instruction or program

If you are entering the competition as a district or consortium of districts, you are also eligible to compete, as long as you are partnered with a researcher.

Refer to the Official Rules for full eligibility requirements.
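The supplemental-award conditions above amount to a short checklist, which can be sketched as follows. The parameter names are hypothetical and this is only our summary of the FAQ, not an official eligibility test:

```python
def qualifies_for_supplement(district_students: int,
                             marginalized_students: int,
                             codesigns_research_questions: bool,
                             students_with_data_by_year2: int,
                             has_findings_strategy: bool) -> bool:
    """Check the district-partnership supplement criteria described above.

    Hypothetical helper for illustration; the Official Rules govern
    actual eligibility.
    """
    return (district_students >= 10_000                         # district or consortium size
            and marginalized_students * 2 > district_students   # majority historically marginalized
            and codesigns_research_questions                    # co-designed research questions
            and students_with_data_by_year2 >= 3_000            # data collection by Year 2
            and has_findings_strategy)                          # findings feed back into instruction

# A 12,000-student district where 7,000 students come from historically
# marginalized populations and all partnership commitments are met:
print(qualifies_for_supplement(12_000, 7_000, True, 3_500, True))  # True
```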

Note: Budgets are required from Phase II of the competition only. Proposals will be evaluated on whether they are clear, concise, actionable, and attainable, with budgets that are realistic and aligned with what is being proposed. Judges will evaluate how you will maximize your impact.

Indirect costs should not exceed 10 percent of the total budget. Other than that, there are no specific requirements on what costs are allowed or not allowed (within reason, of course).

Note: Budgets are required from Phase II of the competition only. There is no definitive time period for the award. The grant can cover expenses before winners are announced. It is recommended that awarded proposals demonstrate significant progress by Product Review Day in Fall 2023 to receive the second installment of funds. This progress will be measured against the timeline for execution outlined in the proposal.
The funding is structured as a grant that will be paid in two installments. 50% will be paid after winners are announced. The remaining 50% may be paid after Product Review Day, where organizers will assess if the winner has made sufficient progress based on their plan for execution.

Each entrant is responsible and liable for all international, Federal, state, and local taxes arising from any grant that may be awarded.

For non-US citizens, certain amounts may be withheld from the grant as required by tax laws, reducing the total amount received by winning Entrants. The Sponsor will determine the withholding percentage after winning Entrants submit appropriate tax forms.

We review the budget to evaluate if the team has a clear sense of how they’ll execute and if the award will provide the necessary funds to accomplish the proposal. Budgets should include the total amount for significant categories related to the project. High level budgets are acceptable, however, details help reviewers understand and build confidence in your execution strategy.
We request that the total budget reflect the award amount. If the budget and the award amount are different, you should clearly show what the award funding will go to, detail where the additional funding will come from (and if it is already secured), and include any additional context on how the award amount fits into the greater budget.
This is not advisable – we recommend your budget add up to the award amount.
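Taken together, the guidelines above (the total should add up to the award amount, and indirect costs should not exceed 10 percent of the total budget) can be sanity-checked mechanically. The Python sketch below is only an illustration of those two rules; the category names are hypothetical:

```python
def budget_issues(award: float, direct_costs: dict, indirect: float) -> list:
    """Return a list of problems with a proposed budget.

    Encodes two FAQ guidelines: the total should equal the award amount,
    and indirect costs should not exceed 10% of the total budget.
    Illustrative only; the budget categories used below are hypothetical.
    """
    total = sum(direct_costs.values()) + indirect
    issues = []
    if abs(total - award) > 0.01:
        issues.append(f"total ${total:,.2f} does not equal the award ${award:,.2f}")
    if indirect > 0.10 * total:
        issues.append("indirect costs exceed 10 percent of the total budget")
    return issues

# Hypothetical Growth-level budget: $91,000 direct + $9,000 indirect = $100,000
budget = {"engineering": 60_000, "research partnership": 20_000, "user testing": 11_000}
print(budget_issues(100_000, budget, indirect=9_000))  # [] (no issues)
```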
No. Since the projects are so different, we do not provide a budget template. However, spreadsheets or tables are preferable.
As part of the Phase II submission, you will upload a single PDF that includes the full proposal and budget. Please note that this PDF is the only place you will include your budget - there is no separate text box on the webform to include your budget.

Competition Tracks

For this track, we are looking for tools that both capture traditionally unmeasured elements of learning and development and improve the quality of assessments to better meet the needs of educators, students and families while reducing the time to develop, administer, or interpret them. All forms of assessment – diagnostic, formative, summative, direct-to-family – are eligible.

This year’s competition is especially focused on new ideas that focus on one or multiple of the following areas:

  • Non-academic measures. Tools that evaluate non-academic measures, including social emotional learning – relationships with adults and peers, emotional functioning, sense of identity, etc. – or approaches to learning – emotional and behavioral self-regulation, initiative, creativity, grit, etc. Many of these measures are “unconstrained” or developed gradually and without a ‘ceiling.’ This will influence the way the tool evaluates and helps users interpret progress. An example might be a tool that detects emotions through facial recognition. A subset of the overall award money for the assessment track will be reserved for proposals that identify non-academic measures for pre-K or pre-literate children, specifically.

  • Math performance. Tools that capture performance related to math across all grade levels, from number sense to advanced arithmetic expressions to data science. As an example, consider 2021 winner M-Powering Teachers, which uses NLP to evaluate student mathematical reasoning.

  • Stealth assessments. Many academic and non-academic measures can be effectively evaluated through stealth assessments, and many of the ideas listed above for non-academic and math performance tools would constitute stealth assessments. Other examples include a tool that evaluates motivation and growth mindset by monitoring response time and error rate on digital learning platforms, or 2021 winner University of Wisconsin-Madison, which is creating a suite of games to measure student progress across various academic domains.

For examples of other promising innovations in assessment, review last year’s Assessment Track winners.

For this track, we are looking for tools that cultivate or support prospective, developing, and established teachers to improve their practice and maximize learning for all. Tools that support teacher retention, satisfaction, and effectiveness across schools are encouraged.

Just as technology has the potential to personalize and improve learning for Pre-K to secondary students, the same is true for adults. Schools of education, school districts, and other teacher development entities can leverage tools to prepare educators for the classroom as well as offer data and feedback to inform educators’ instructional decisions or improve practice.

As an example, consider Teaching Lab Plus, a 2020 Tools Competition winner that will collect effectiveness data on professional learning programs in order to improve current programs. Or, consider a simulation that allows teacher candidates to practice how they would respond to difficult moments in a classroom and receive real-time feedback.

For this track, we are looking for tools that accelerate the learning science research process in order to improve learning interventions. Tools may facilitate A/B testing and randomized controlled trials, improve research design, promote replication, or release knowledge and data for external research. This year, there is a competitive priority for proposals that directly address or could be applied to math instruction.

Please review last year’s winners for examples of competitive proposals in the Learning Science Research track.

The competition is eager to promote tools that are developed in consultation with practitioners. As a result, this year, Growth or Transform Phase competitors in the Learning Science Research track are eligible to receive a supplemental award of $100,000 if they partner with a district or consortium of districts with at least 10,000 students of which the majority come from historically marginalized populations. The district partners would co-design research questions, implement the tool with at least 3,000 students, and incorporate the research findings into district instruction or policy.

For this track, we’re looking for tools that accelerate outcomes in literacy and math and increase relevance of instruction to prepare students for college and careers. Tools should have an equity focus, addressing the declines in academic progress across different races, ethnicities, socioeconomic groups, geographies and disability statuses. The competition also aims to support making knowledge and skills more relevant.

Please review last year’s winners for examples of competitive proposals in the K-12 Accelerated Learning track.

Feedback & Evaluation

The Tools Competition has a three-phase selection process in order to give competitors time and feedback to strengthen their tool and build a team. Proposals will be reviewed at each phase and selected submissions will be invited to submit to the next round. Please note that budgets are required from Phase II of the competition only.

For more information refer to our How to Compete page.

Proposals will be evaluated against others within the same track. Proposals at higher award levels will be subject to greater scrutiny. At each stage of the competition, reviewers will evaluate proposals based on eligibility requirements for the award level as well as:

  • Potential impact and likelihood to improve learning
  • Attention to equity to support learning of historically marginalized populations
  • Demand from learners, educators, and families
  • Ability to support rapid experimentation and continuous improvement
  • Ability to scale to additional users and/or domains
  • Team passion and readiness to execute

For more information on eligibility criteria, refer to the Official Rules.

Yes! Interested competitors are welcome to reach out to ToolsCompetition@the-learning-agency.com with questions or feedback.

Additional avenues for support, including 1:1 feedback calls and office hours, will be announced via our email list, so please make sure to sign up for updates here.

We also recommend joining the Learning Engineering Google Group. Opportunities for partnership and additional support are also frequently posted there.

Phase II reviewers are context experts, technical experts, and researchers in fields directly related to track and proposal topics. Phase II proposals will be scored according to the track rubric and will receive an expert review related to the feasibility of the proposal and the tool’s contribution to the field before being nominated as a finalist.

Research Partnerships

From Phase II of the competition, Growth or Transform competitors in the Accelerated Learning, Assessment, or Strengthening Teacher Development tracks are required to either (1) identify an external researcher that has agreed to partner on the project, or (2) provide evidence from multiple external researchers that the tool could enable research.
To fulfill the research partnership requirement, the researcher(s) must be external to your team. Having an external research partner demonstrates that there is interest and demand for your tool and data set in the wider research community.
External researchers must be external to the immediate organization that is receiving the funds, but they may work for the same institution in another department.

If you need help identifying a researcher, please reach out to ToolsCompetition@the-learning-agency.com. We have a large and growing network of researchers who can assist platforms with:

  1. How best to instrument a platform in ways that would serve the field,
  2. Determining what data a platform is able to collect and how best to collect it,
  3. Using the data and related research to answer questions of interest.

We can facilitate connections to researchers through individual requests or broader networking listservs and events.

You can also read our blog for other suggestions on how to connect with external researchers.

You’re welcome to use results of previous research to strengthen your proposal, but these results alone would not satisfy the research partnership requirement. The Tools Competition seeks to encourage scalable and iterative research practices that advance learning engineering and improve knowledge of learning. Working with an external researcher you have partnered with in the past is acceptable. To learn more about the research partnership requirements, read our blog post.
You can include costs for external researchers, but ideally, your tool allows multiple researchers to leverage the data. Given that, your budget should cover establishing the infrastructure to allow external researchers to access your data. We anticipate interested researchers will be able to fundraise to conduct research using your data.

Competitors seeking a Growth Phase or Transform Phase award must have a commitment from one or more external researchers interested in using data from their platform by the time they submit their detailed Phase II proposal, which is due February 24, 2023.

This does not need to be a formal agreement, and the researcher does not need to have already secured funding. Instead, we want to see that you have started forming partnerships with external researchers to share your data and consider how that will require you to adapt your tool.

Most importantly, the tool must be designed so that multiple researchers can access data from the platform over time. Given this, we assume that if the researcher you are working with falls through for any reason, you will be able to establish another partnership quickly.

The goal of the research partnership is to enhance the field’s knowledge of learning. Your research partner may come from any discipline (e.g., learning science, psychology, computer science, business) as long as their research will pertain to ‘learning’ or the relationship between the tool and the learner.
You should describe the data sharing process in your proposal and provide evidence that the tool can enable research for multiple external researchers. If you do charge a fee for researchers to use your data, it should not be so significant that it hinders access.
You do not need to submit any specific documentation or formal commitments. In your proposal you should describe the partnership, name the researcher(s) you will partner with and the nature of the partnership, and include researchers on your team (if applicable). Learn more about how you can address the research partnership in your proposal here.

What happens after the competition?

Winners will receive their award by check or bank transfer in two installments.

Winners will receive the first installment soon after winning. Winners will receive the second installment of the award after Product Review Day if they are making sufficient progress on the plan they outlined in their Phase II proposal.

Winners will present during a virtual Product Review Day to their peers and others in the field to get feedback and perspective on their progress.

Approximately one year after winners are notified, winners will convene again to present their progress in a Demo Day.

Yes! We strive to support all competitors, not just winners. At each phase, the organizers will compile lists of opportunities for additional funding, support, mentorship, and partnership.

We also encourage your team, if not selected, to stay in touch with the organizers through ToolsCompetition@the-learning-agency.com and the Learning Engineering Google Group.

Competition organizers are eager to support winners and learn from their work to inform future resources for competitors and winners. To do so, all winners will participate in an impact study during which research advisors will work with you to incorporate new measures into your internal evaluation process. In addition, all winners will complete two surveys each year for 3-5 years after winning.


SPONSORED BY

Schmidt Futures
Citadel
Walton Family Foundation

“Bill & Melinda Gates Foundation” is a registered trademark of the Bill & Melinda Gates Foundation in the United States and is used with permission.