SURF-002 · Free-form Surfaces · difficulty 5/5
Compressor blade (NACA 65-(12)10)
sha256:8de14b209c01ac72…
§1 · Prompt verbatim
Single compressor blade: 60 mm chord, 80 mm span, 12° twist root-to-tip, NACA-65-(12)10 thickness distribution along the camber line. G2 continuous suction and pressure surfaces, sharp trailing edge at 0.3 mm.
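The prompt's twist requirement is the part most agents get wrong, so it is worth pinning down: a linear 12° rotation of the 60 mm-chord section from root (z = 0) to tip (z = 80 mm). A minimal sketch of that stacking law, assuming a linear twist distribution and a mid-chord pivot (neither is stated in the prompt), with a placeholder section in place of the tabulated NACA 65-(12)10 ordinates:

```python
import math

CHORD = 60.0                     # mm
SPAN = 80.0                      # mm
TWIST_TIP = math.radians(12.0)   # root-to-tip twist

def twist_angle(z):
    """Assumed linear twist law: 0 at the root, 12 deg at the tip."""
    return TWIST_TIP * z / SPAN

def place_section(points_2d, z, pivot=(CHORD / 2.0, 0.0)):
    """Rotate one 2-D section about an assumed mid-chord pivot and
    lift it to its spanwise station z."""
    a = twist_angle(z)
    c, s = math.cos(a), math.sin(a)
    px, py = pivot
    out = []
    for x, y in points_2d:
        dx, dy = x - px, y - py
        out.append((px + c * dx - s * dy, py + s * dx + c * dy, z))
    return out

# Placeholder section: points on the chord line. A real NACA 65-(12)10
# section would apply the tabulated thickness ordinates along the camber line.
section = [(x, 0.0) for x in (0.0, 15.0, 30.0, 45.0, 60.0)]
stations = [place_section(section, z) for z in (0.0, 40.0, 80.0)]
```

Lofting G2-continuous surfaces through such stations is kernel-specific and not shown here.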
§2 · Ground-truth spec

| Property | Value |
|---|---|
| shells | 1 |
| watertight | true |
| manifold | true |
| acceptance | ε ±0.05 mm |
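The watertight and manifold flags reduce to a standard closed-2-manifold test on the tessellated output. The benchmark's actual checker is not published here; a minimal sketch of the usual half-edge criterion (each directed edge appears once, and every half-edge has an opposite twin):

```python
from collections import Counter

def is_watertight_manifold(faces):
    """Closed-2-manifold test for a triangle mesh given as vertex-index
    triples: consistent winding means no directed edge repeats, and a
    closed surface means every half-edge has its opposite twin."""
    directed = Counter()
    for a, b, c in faces:
        for u, v in ((a, b), (b, c), (c, a)):
            directed[(u, v)] += 1
    if any(n > 1 for n in directed.values()):
        return False                         # inconsistent winding
    return all((v, u) in directed for (u, v) in directed)

# A tetrahedron is closed and manifold; drop one face and it is open.
tet = [(0, 1, 2), (0, 3, 1), (1, 3, 2), (2, 3, 0)]
open_tet = tet[:3]
```

Libraries such as trimesh expose the same checks (`is_watertight`, `is_winding_consistent`) if you prefer not to hand-roll them.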
§3 · Reference render
canonical reference
The visualisation is rebuilt in-browser from the canonical parametric description. Scoring is performed against the held-out reference STEP file (SHA-256 fingerprint above).
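To verify a local copy of the reference against the fingerprint above, hash the STEP file byte-for-byte; a sketch using the standard library (the 1 MiB chunk size is an arbitrary choice, not part of the benchmark):

```python
import hashlib

def fingerprint(path, chunk=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks so large
    STEP files never need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()
```

Compare the leading hex digits of the result against the truncated `sha256:` prefix at the top of the card.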
§4 · Per-agent renders
reference + 10 agent outputs · scored against the held-out STEP
| Agent | Source | Vol IoU | BREP | Manifold |
|---|---|---|---|---|
| Reference | canonical · ground truth | 1.000 | 100 | ✓ |
| Human Baseline (Mech-E) | n=4 senior engineers | 0.744 | 9 | ✓ |
| Trellis 3D | Microsoft Research | 0.560 | 0 | ✗ |
| Spline AI | Spline.design | 0.548 | 0 | ✗ |
| Zoo Text-to-CAD | Zoo (KittyCAD) | 0.455 | 11 | ✗ |
| GPT-5 → CadQuery | OpenAI + CadQuery 2.4 | 0.400 | 17 | ✗ |
| Claude Opus 4.7 → OpenSCAD | Anthropic + OpenSCAD 2024.06 | 0.354 | 0 | ✗ |
| Gemini 2.5 Pro → OpenSCAD | Google + OpenSCAD 2024.06 | 0.339 | 0 | ✗ |
| Claude Opus 4.7 → CadQuery | Anthropic + CadQuery 2.4 | 0.316 | 19 | ✗ |
| Adam (CADcrush) | CADcrush | 0.175 | 28 | ✗ |
| DeepCAD | Wu et al. 2021 (research) | 0.101 | 50 | ✗ |
Each tile is rebuilt from the canonical parametric description and degraded to match the agent's scored profile (tessellation, non-manifold face removal, dimension-scale jitter, missing features). Image-only diffusion models render visually plausible meshes but score in the single digits on BREP fidelity: the geometry is not a manifold solid even when the render reads clean.
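The Vol IoU column is the volumetric intersection-over-union of the agent's solid against the reference. The benchmark presumably voxelizes both solids on a common lattice (resolution and voxelizer are not specified here); once both are occupancy grids, the metric itself is a one-liner:

```python
import numpy as np

def volumetric_iou(a, b):
    """Intersection-over-union of two occupancy grids (boolean arrays
    sampled on the same voxel lattice). 1.0 means identical solids."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0  # two empty grids are trivially identical
    return np.logical_and(a, b).sum() / union

# Toy example: two 4x4x4 blocks offset by one voxel along x.
g1 = np.zeros((8, 8, 8), bool); g1[0:4, 0:4, 0:4] = True
g2 = np.zeros((8, 8, 8), bool); g2[1:5, 0:4, 0:4] = True
```

For the toy grids above the overlap is 48 voxels out of an 80-voxel union, i.e. an IoU of 0.6.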
§5 · Per-agent metrics
ranked by Vol IoU · same data as the leaderboard, restricted to this task
| Agent | Bidirectional Chamfer | Hausdorff p95 | Normal consistency | Watertight | Manifold | P@1 | p50 | Latency | Cost |
|---|---|---|---|---|---|---|---|---|---|
| Human Baseline (Mech-E) | 0.126 | 0.613 | 0.912 | ✓ | 0.961 | 1.000 | — | 773.4s | $6.746 |
| Trellis 3D | 0.140 | 0.839 | 0.840 | ✓ | 0.934 | 0.000 | — | 11.2s | $0.041 |
| Spline AI | 0.153 | 0.791 | 0.831 | ✓ | 0.928 | 0.000 | — | 5.6s | $0.039 |
| Zoo Text-to-CAD | 0.173 | 0.854 | 0.806 | ✓ | 0.921 | 0.000 | — | 6.6s | $0.202 |
| GPT-5 → CadQuery | 0.206 | 1.111 | 0.771 | × | 0.906 | 0.000 | — | 46.9s | $0.174 |
| Claude Opus 4.7 → OpenSCAD | 0.216 | 1.039 | 0.758 | × | 0.906 | 0.000 | — | 37.6s | $0.306 |
| Gemini 2.5 Pro → OpenSCAD | 0.227 | 1.296 | 0.748 | × | 0.900 | 0.000 | — | 25.9s | $0.086 |
| Claude Opus 4.7 → CadQuery | 0.302 | 1.359 | 0.731 | × | 0.894 | 0.000 | — | 38.6s | $0.320 |
| Adam (CADcrush) | 0.432 | 2.057 | 0.682 | × | 0.879 | 0.000 | — | 11.4s | $0.315 |
| DeepCAD | 0.811 | 4.065 | 0.643 | × | 0.864 | 0.000 | — | 3.6s | $0.018 |
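The two distance columns are point-cloud metrics over surfaces sampled from the agent and reference solids. A sketch of one common formulation, assuming plain (not squared) Euclidean nearest-neighbour distances; the benchmark's sampling density and units are not specified here:

```python
import numpy as np

def _nn_dists(p, q):
    """For each point in p, the distance to its nearest neighbour in q
    (brute-force broadcasting; fine for modest sample counts)."""
    d = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=-1)
    return d.min(axis=1)

def bidirectional_chamfer(p, q):
    """Mean nearest-neighbour distance, averaged over both directions."""
    return 0.5 * (_nn_dists(p, q).mean() + _nn_dists(q, p).mean())

def hausdorff_p95(p, q):
    """95th percentile of the symmetric nearest-neighbour distances,
    an outlier-robust stand-in for the max-based Hausdorff distance."""
    return np.percentile(np.concatenate([_nn_dists(p, q),
                                         _nn_dists(q, p)]), 95)
```

Some papers define Chamfer with squared distances or a sum instead of a mean; only the relative ranking is comparable across such variants.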