2025 Artificial General Intelligence Mizzou Hackathon

AI Hackathon and Showcase

Nov. 15-Nov. 16, 2025 | Lafferre Hall


Purpose

Inspired by the global ARC Prize competition, founded by Mizzou Engineering alum Mike Knoop, the AI Hackathon & Showcase brings together students, faculty and industry partners to take on the exciting challenges of abstraction and reasoning in artificial intelligence. Designed as a bridge between classroom learning, research and community engagement, the event gives students across the College of Engineering, and the broader campus community, a chance to apply AI fundamentals to a cutting-edge problem in a collaborative, creative environment.

Sponsored by the College of Engineering and supported by the College of Engineering Graduate Student Association, the EECS GSA and the ARC Prize Foundation, the hackathon and showcase is open to all and promises an engaging experience at the intersection of innovation, teamwork and discovery.


2025 Results

  1. t[AI]ger Roar: Wen-Hsin Chen (CS MS), Jeong Wook Lee (informatics PhD), Alina Rohulia (informatics PhD), Samrat Kumar Dey (informatics PhD)
  2. Cosmic Interface: Upasana Roy (CS PhD), Gourab Nandi (physics PhD)
  3. Logic AI: Khadichakhon Nurakhmedova (economics undergrad), Akbarjon Kamoldinov (EE undergrad)
  4. Decode is all you need: Zhiguang Liu (CS PhD), Jiuyi Zheng (CS PhD)
  5. Sam Squared: Samuel Hirner (CS undergrad), Sam Byerly (CS undergrad)
  6. MMCV Lab Team: Chimdi Walter Ndubuisi (ECE PhD), Christian Fluharty (CS undergrad)

Event Details

Focus

ARC-AGI-2 benchmark (2025 competition dataset). The event emphasizes exploration, creativity and learning rather than outperforming the state of the art. It is designed to be fun and collaborative, with open-ended exploration of AI reasoning tasks.
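For participants new to the benchmark: ARC tasks are distributed as JSON, with a few "train" demonstration input/output grid pairs and one or more "test" inputs to solve. A minimal sketch of the format and of checking a candidate rule against the demonstrations (the grids and the toy rule below are illustrative, not from the real ARC-AGI-2 dataset):

```python
import json

# An ARC-style task: "train" holds demonstration input/output grid pairs,
# "test" holds inputs whose outputs the solver must predict.
# Grids are lists of rows; each cell is an integer color 0-9.
# (These grids are made-up examples, not real ARC-AGI-2 tasks.)
task_json = """
{
  "train": [
    {"input": [[0, 1], [1, 0]], "output": [[1, 0], [0, 1]]},
    {"input": [[2, 0], [0, 2]], "output": [[0, 2], [2, 0]]}
  ],
  "test": [
    {"input": [[3, 0], [0, 3]]}
  ]
}
"""

task = json.loads(task_json)

def mirror_rows(grid):
    # Toy candidate rule for this made-up task: reverse each row.
    return [list(reversed(row)) for row in grid]

# Verify the rule reproduces every training pair before
# applying it to the test input.
assert all(mirror_rows(p["input"]) == p["output"] for p in task["train"])
prediction = mirror_rows(task["test"][0]["input"])
print(prediction)  # [[0, 3], [3, 0]]
```

Real solvers replace `mirror_rows` with a search over candidate programs or a learned model, but the verify-on-train-then-predict loop is the same.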


Timeline
  • 5 p.m., Friday, Oct. 10
    • Individual registration deadline
      • More than 120 Mizzou students from disciplines including computer science, engineering, data science, math, statistics, physics, plant science and economics have registered
  • September-early November
    • Pre-event exploration
      • Review ARC tasks, explore solvers and brainstorm approaches
  • Wednesday, Oct. 22 in Ketcham Auditorium
    • Mizzou ARC Prize 2025 Workshop, “Learn, share and spark ideas for ARC Prize 2025”
      • Top performers from the ARC Prize competition gave guest talks, including Jeremy Berman from Reflection AI (ranked No. 1 on the global ARC-AGI-1 and ARC-AGI-2 public leaderboards) and Jack Cole from Tufa Labs (ranked No. 3 on the global Kaggle ARC Prize 2025 leaderboard).
  • 10-11 a.m. Saturday, Nov. 15
    • Team Registration
      • Teams (2-5 members) will form from individuals who registered by the Oct. 10 deadline. Finalized team lists will be submitted at check-in.
  • Saturday, Nov. 15-Sunday, Nov. 16
    • Hackathon and Showcase weekend
      • Teams present their work, share experiences and celebrate creativity in a festival-style event.

Support

Amazon has provided $24,450 in AWS credits to participants and held AWS Research Day on Oct. 10 and AWS Immersion Day on Oct. 23 to help participants learn how to use AWS services.


Compute

The College of Engineering will provide high-performance computing (HPC) resources to give all teams equitable access to compute. Details coming soon.


Mentorship

Faculty and graduate student mentors will be available during the event.


Sponsorship and Prizes

Cash prizes

  • First place: $2,000
  • Second place: $1,000
  • Third place: $500

Additional door prizes


Food and Logistics

Breakfast, lunch, dinner and snacks will be provided.


Participation and Registration

Open to all MU students (College of Engineering and beyond).

Register by 5 p.m. on Friday, Oct. 10. Required information includes name, student email address, college and department.


Schedule
Saturday, Nov. 15: Hackathon Kickoff and Collaboration

Time | Event | Location
9:30 a.m. | Welcome and overview: Dean Marisa Chrysochoou and Mike Knoop (co-founder of Zapier and of the ARC Prize Foundation) | Naka Hall 102
10 a.m. | Team registration (E3508) and team time | Lafferre Hall E3508, E3509, E3510, E3511
12 p.m. | Lunch and networking | Lafferre Hall E3508, E3509, E3510, E3511
1 p.m. | Team collaboration | Lafferre Hall E3508, E3509, E3510, E3511
6 p.m. | Dinner and social mixer | Lafferre Hall Ketcham Auditorium
7 p.m. | Optional evening work session | Lafferre Hall E3508, E3509, E3510, E3511

Sunday, Nov. 16: Showcase and Celebration

Time | Event | Location
9 a.m.-12 p.m. | Round 1: Lightning pitches/demos. All teams give five-minute pitches/demos to judges and peers; the top six teams advance to the final. | Lafferre Hall Ketcham Auditorium
12:30-1:30 p.m. | Lunch and networking | Lafferre Hall Ketcham Auditorium
1:30-3:45 p.m. | Round 2: Finalist presentations. Top six teams deliver full 20-minute presentations to all judges. | Lafferre Hall Ketcham Auditorium
4-5:15 p.m. | Idea exchange and discussion panels (finalists and faculty) | Lafferre Hall Ketcham Auditorium
5:30-6 p.m. | Closing ceremony and awards | Lafferre Hall Ketcham Auditorium

Judging Rubric for Round 1
Category | Excellent (3-5 pts) | Good (2-3 pts) | Poor (1 pt)
Clarity of idea | Problem and approach clearly explained in 1-2 sentences (3 pts) | Somewhat clear but missing focus or key details (2 pts) | Hard to understand; vague or confusing explanation (1 pt)
Creativity/novelty | Approach (from algorithms to implementations) is unique, inventive and insightful (5 pts) | Some originality but resembles baseline/common solutions (3 pts) | No clear creativity; mostly trivial or copied baseline (1 pt)
Demo/results | Demo or visuals shown, with evidence of progress/results (4 pts) | Demo presented but limited or unclear results (2-3 pts) | No demo or unclear whether work has been done (1 pt)
Presentation | Concise, engaging and within time limit (3 pts) | Slightly over/under time, somewhat engaging (2 pts) | Disorganized, unengaging or poorly timed (1 pt)

Max score per team: 15 pts. Judges use scores and discussion to advance six teams.
Judging Rubric for Round 2
Category | Excellent (5 pts) | Good (3 pts) | Poor (1 pt)
Creativity and innovation | Highly original approach; demonstrates fresh ideas beyond standard solvers | Some novelty; extends or modifies existing methods meaningfully | Lacks originality; mostly replication with little extension
Technical depth | Strong technical foundation; clear application of AI concepts | Demonstrates technical understanding with some depth | Minimal technical rigor; superficial or unclear methodology
Clarity and communication | Clear explanation of approach, results and challenges; accessible to a broad audience | Mostly clear, though occasionally hard to follow | Poorly explained; difficult to understand
Demo and results | Working demo or well-documented results; strong evidence of experimentation and analysis | Demo presented but limited results or weak analysis | Little to no demo; results unclear or missing
Teamwork and execution | Strong collaboration evident; polished delivery; all members engaged | Some collaboration; delivery mostly handled by 1-2 members | Weak collaboration; unpolished delivery; minimal participation from some members
Impact and potential | Approach has potential beyond the hackathon (research, teaching, applications) | Some potential impact but limited scope | Minimal impact; unclear relevance or applicability

Max score per team: 30 pts. Judges use scores and discussion to award the first-, second- and third-place teams.