2024 · Problem A — To Play or Not to Play: Modeling Future Olympic Games
Tags: AHP/TOPSIS/EWM · Multi-criteria decision-making · Logistic regression · Time series
The problem in one paragraph
The IOC needs to decide which Sports/Disciplines/Events (SDEs) to add or remove for the 2032 Brisbane Summer Olympics. Build a model that evaluates SDEs against the IOC's six criteria (popularity, gender equity, sustainability, inclusivity, relevance/innovation, safety) and recommends which SDEs to include.
Requirements, restated
- Identify factors needed to address the IOC criteria (quantitative/qualitative, constant/variable, deterministic/probabilistic). Justify and include units.
- Build a model that scores SDEs against the criteria.
- Test on ≥3 SDEs added or removed in 2020/2024/2028 and ≥3 SDEs that have been in the Games continuously since 1988. Verify that the model affirms their current status.
- Identify 3 SDEs to add or reintroduce for 2032 Brisbane and rank them 1st/2nd/3rd. Discuss candidates for 2036 and beyond.
- Sensitivity analysis — identify which factors drive scores.
- 1–2 page letter to IOC summarizing findings non-technically.
How top teams approached this
Factor breakdown
| IOC criterion | Measurable proxies |
|---|---|
| Popularity / Accessibility | Google Trends search volume, social-media mentions, fan-base size, # of professional leagues, broadcast viewership |
| Gender Equity | Ratio of men/women athletes worldwide, # of mixed events, prize-money equality |
| Sustainability | Equipment/venue carbon cost, single-use materials, water usage |
| Inclusivity | Number of countries with national federations, # continents represented, % of UN member states |
| Relevance / Innovation | Age skew of participants, growth rate of youth participation, tech adoption (AR/VR for "physical virtual" sports) |
| Safety / Fair Play | Injury rate per 1000 hours of competition, anti-doping infractions, governance score |
The dominant model template
Per the judges' commentary, "the vast majority of submissions incorporated AHP, EWM, TOPSIS, or some combination." The two top approaches were:
- Combined Weight Model (CWM): average AHP-derived (subjective) weights with EWM-derived (data-driven) weights, then apply TOPSIS for ranking.
- Logistic regression: build a binary classifier (included vs. not) trained on historical SDE×Year data with criteria as features. Predict inclusion probability for 2032 candidates.
An ARIMA / time-series component can model the temporal drift of popularity scores (e.g., breaking grew rapidly 2018–2024).
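The CWM pipeline above can be sketched in a few lines of NumPy. The criteria matrix and AHP weights below are toy numbers for illustration only (a real submission would derive the AHP weights from a pairwise-comparison matrix and check its consistency ratio):

```python
import numpy as np

def entropy_weights(X):
    """EWM: data-driven weights from a criteria matrix X (rows = SDEs,
    columns = criteria, higher = better)."""
    P = X / X.sum(axis=0)                      # column-normalize to proportions
    # Shannon entropy per criterion; epsilon guards log(0)
    E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(X.shape[0])
    d = 1.0 - E                                # divergence: informative criteria score high
    return d / d.sum()

def topsis(X, w):
    """TOPSIS: rank alternatives by relative closeness to the ideal solution."""
    Z = X / np.linalg.norm(X, axis=0)          # vector-normalize each column
    V = Z * w                                  # apply criterion weights
    best, worst = V.max(axis=0), V.min(axis=0) # ideal and anti-ideal points
    d_best = np.linalg.norm(V - best, axis=1)
    d_worst = np.linalg.norm(V - worst, axis=1)
    return d_worst / (d_best + d_worst)        # closeness in [0, 1], higher = better

# Toy data: 4 candidate SDEs x 6 IOC criteria, all benefit-type, pre-scaled to [0, 1].
X = np.array([
    [0.9, 0.8, 0.5, 0.9, 0.4, 0.7],   # cricket
    [0.5, 0.9, 0.7, 0.6, 0.8, 0.8],   # squash
    [0.7, 0.6, 0.6, 0.5, 0.9, 0.6],   # parkour
    [0.4, 0.7, 0.8, 0.4, 0.5, 0.9],   # lacrosse
])
w_ahp = np.array([0.30, 0.20, 0.15, 0.15, 0.10, 0.10])  # subjective (illustrative)
w_ewm = entropy_weights(X)                               # objective, from the data
w = 0.5 * w_ahp + 0.5 * w_ewm                            # Combined Weight Model
scores = topsis(X, w)                                    # one closeness score per SDE
```

A simple average of the two weight vectors is the most common combination rule; some papers instead tune the mixing coefficient or use a multiplicative combination.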
The "momentum" insight
A clever winning paper (Team 15504, Nanjing) treated inclusion as a probabilistic process with three components: $P_{\text{total}} = P_0 + P_{\text{inertia}} + P_{\text{bias}}$. The inertia term captures the fact that sports included in the previous Games are very likely to return in the next. This "momentum" idea is worth borrowing.
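A hedged sketch of the three-term probability. The constants (`inertia`, `bias_scale`) are illustrative guesses, not values from the paper, and the clipping to [0, 1] is one simple way to keep the sum a valid probability:

```python
def inclusion_probability(base, was_in_last_games, host_affinity,
                          inertia=0.35, bias_scale=0.10):
    """Sketch of P_total = P0 + P_inertia + P_bias.

    base             -- P0, the criteria-driven score in [0, 1]
    was_in_last_games -- incumbency flag; adds the inertia bonus
    host_affinity    -- host-country preference in [-1, 1] (e.g. cricket
                        for an Australian host)
    All parameter values are illustrative, not from Team 15504's paper.
    """
    p = base
    if was_in_last_games:
        p += inertia                      # sports rarely drop out once in
    p += bias_scale * host_affinity       # host-country bias term
    return min(max(p, 0.0), 1.0)         # clip to a valid probability
```

For example, an incumbent with a middling base score (`inclusion_probability(0.5, True, 0.0)`) still comes out well ahead of a strong newcomer, which is exactly the momentum effect the paper exploited.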
Apply
Test the model on a diverse set:
- SDEs added: surfing (2020), karate (2020 — then removed), breaking (2024), flag football (2028)
- SDEs continuously in since 1988: athletics, swimming, gymnastics, basketball
The model should predict "in" for the long-standing SDEs and explain why the recent additions score well (or why karate was removed).
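The logistic-regression route from the template above can be validated the same way. A minimal sketch on synthetic SDE×Year data (the "ground truth" here is invented purely so the example runs; a real submission would use the historical dataset):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic historical rows: 6 criteria scores per SDE-year -> included (1) or not (0).
n = 200
X = rng.uniform(0.0, 1.0, size=(n, 6))
# Invented ground truth for illustration: popularity and gender equity dominate.
logits = 4.0 * X[:, 0] + 2.0 * X[:, 1] - 3.0
y = (logits + rng.normal(0.0, 0.5, n) > 0).astype(int)

clf = LogisticRegression().fit(X, y)

# Predict inclusion probability for a hypothetical 2032 candidate's criteria vector.
candidate = np.array([[0.8, 0.9, 0.6, 0.7, 0.5, 0.8]])
p = clf.predict_proba(candidate)[0, 1]
```

The validation test then amounts to checking that long-standing SDEs (athletics, swimming) get `p` near 1 while removed ones (karate post-2020) drop below the inclusion threshold.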
2032 candidates worth considering
Cricket, lacrosse, squash, esports, kabaddi, frisbee/ultimate, parkour. Score each, justify your top 3.
Sensitivity
Most papers ran Monte Carlo over the criterion weights and showed which SDEs were "robust" (consistent rank under noise) vs. "marginal".
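A minimal version of that Monte Carlo check, using a weighted-sum score as a stand-in for the full TOPSIS pipeline (the noise level and trial count are arbitrary choices):

```python
import numpy as np

def rank_stability(X, w0, n_trials=2000, noise=0.1, seed=0):
    """Perturb the criterion weights multiplicatively and report, for each
    rank position, the fraction of trials in which the same SDE holds it."""
    rng = np.random.default_rng(seed)
    base_rank = np.argsort(-X @ w0)            # ranking under nominal weights
    hits = np.zeros(X.shape[0])
    for _ in range(n_trials):
        w = w0 * (1.0 + rng.normal(0.0, noise, size=w0.shape))
        w = np.clip(w, 1e-9, None)             # keep weights positive
        w /= w.sum()                           # renormalize to sum to 1
        hits += (np.argsort(-X @ w) == base_rank)
    return hits / n_trials                     # 1.0 = rank never changes
```

SDEs whose positions survive nearly all perturbations are the "robust" picks; positions that flip frequently mark "marginal" SDEs whose ranking depends on the analyst's weights.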
Pitfalls flagged by judges
- "Long lists of variables included upfront" — be focused.
- Using AHP/TOPSIS without explaining what they are, or where the weights came from.
- Building multiple models and not picking one.
- Picking obvious 2032 candidates without engaging with the IOC criteria.
- Forgetting the host-country effect (an Australian host might favor cricket; teams in 2024 that engaged with this stood out).