CAM-001 · CAM Toolpath Validity · difficulty 3/5
5-pocket plate, Ø3 endmill finish
sha256:9c0fa1ee08b4c172…
§1 Prompt verbatim
Aluminium plate 100 × 60 × 12 mm with five rectangular pockets (20×20×6 mm deep) on a 2×3 grid (one corner cell empty), inside corners R 1.6 mm. The geometry must produce a collision-free 3-axis G-code program at 0.05 mm finish stepover with a Ø 3 mm endmill.
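The R 1.6 mm inside corners are the binding constraint on tool choice: a Ø 3 mm flat endmill has a 1.5 mm radius, so it clears them with only 0.1 mm to spare. A minimal sketch of that reachability check (function name is illustrative, not part of the benchmark harness):

```python
def corner_reachable(tool_diameter_mm: float, corner_radius_mm: float) -> bool:
    """A flat endmill can cut an inside corner only if its radius
    does not exceed the corner radius."""
    return tool_diameter_mm / 2.0 <= corner_radius_mm

# Ø3 mm endmill vs. the spec's R 1.6 mm inside corners
assert corner_reachable(3.0, 1.6)       # radius 1.5 mm <= 1.6 mm: reachable
assert not corner_reachable(3.5, 1.6)   # radius 1.75 mm would gouge the corner
```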
§2 Ground-truth spec
shells: 1
watertight: true
manifold: true
acceptance: ε ±0.05 mm
features: pocket_x5, internal_R1.6
§3 Reference render
canonical reference
Visualisation is rebuilt in-browser from the canonical parametric description. Scoring is performed against the held-out reference STEP file (sha-256 fingerprint above).
§4 Per-agent renders
reference + 10 agent outputs · scored against the held-out STEP
Vol IoU · BREP · manifold
REFERENCE
canonical · ground truth
1.000 · 100 · ✓
Zoo Text-to-CAD
Zoo (KittyCAD)
0.723 · 11 · ✓
Human Baseline (Mech-E)
n=4 senior engineers
0.719 · 8 · ✓
Adam (CADcrush)
CADcrush
0.630 · 9 · ✗
GPT-5 → CadQuery
OpenAI + CadQuery 2.4
0.536 · 10 · ✗
Claude Opus 4.7 → CadQuery
Anthropic + CadQuery 2.4
0.505 · 11 · ✗
Claude Opus 4.7 → OpenSCAD
Anthropic + OpenSCAD 2024.06
0.486 · 0 · ✗
Gemini 2.5 Pro → OpenSCAD
Google + OpenSCAD 2024.06
0.416 · 0 · ✗
DeepCAD
Wu et al. 2021 (research)
— · 74 · ✗
no manifold solid produced
Trellis 3D
Microsoft Research
— · 4 · ✗
no manifold solid produced
Spline AI
Spline.design
0.000 · 0 · ✗
Each tile is rebuilt from the canonical parametric description and degraded to match the agent's scored profile (tessellation, non-manifold face removal, dimension scale jitter, missing features). Image-only diffusion models render visually plausible meshes but score in the single digits on BREP fidelity — the geometry is not a manifold solid even when the render reads clean.
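Volumetric IoU of the kind scored above can be sketched with two boolean occupancy grids. This is a toy voxelisation at 1 mm resolution, not the benchmark's actual scoring kernel:

```python
def vol_iou(a: set, b: set) -> float:
    """Intersection-over-union over two sets of occupied voxel coordinates."""
    inter = len(a & b)
    union = len(a | b)
    return inter / union if union else 0.0

# toy 100x60x12 grid matching the plate's bounding box,
# reference solid vs. a copy shifted 2 mm in x
ref  = {(x, y, z) for x in range(100) for y in range(60) for z in range(12)}
pred = {(x + 2, y, z) for x in range(100) for y in range(60) for z in range(12)}
print(round(vol_iou(ref, pred), 3))  # prints 0.961
```

Even a 2 mm rigid shift costs only ~0.04 IoU on a solid this size, which is why a mesh can post a respectable Vol IoU while still failing the manifold and BREP checks.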
§5 Per-agent metrics
ranked by Vol IoU · same data as the leaderboard, restricted to this task
| Agent | Watert. | Manif. | FeatRec | CAM Reachability | P@1 | p50 | latency | cost |
|---|---|---|---|---|---|---|---|---|
| Zoo Text-to-CAD | ✓ | 0.957 | 0.709 | 0.737 | 0.000 | — | 6.7s | $0.144 |
| Human Baseline (Mech-E) | ✓ | 0.958 | 0.887 | 0.905 | 0.000 | — | 536.7s | $7.008 |
| Adam (CADcrush) | ✓ | 0.941 | 0.640 | 0.613 | 0.000 | — | 10.3s | $0.224 |
| GPT-5 → CadQuery | ✓ | 0.933 | 0.728 | 0.639 | 0.000 | — | 40.4s | $0.207 |
| Claude Opus 4.7 → CadQuery | ✓ | 0.927 | 0.753 | 0.691 | 0.000 | — | 35.2s | $0.289 |
| Claude Opus 4.7 → OpenSCAD | ✓ | 0.925 | 0.562 | 0.487 | 0.000 | — | 39.5s | $0.367 |
| Gemini 2.5 Pro → OpenSCAD | ✓ | 0.916 | 0.548 | 0.448 | 0.000 | — | 27.1s | $0.086 |
| DeepCAD (kernel error: BRepCheck_NotClosed) | × | 0.000 | — | — | 0.000 | — | 4.6s | $0.021 |
| Trellis 3D (kernel error: BRepCheck_NotClosed) | × | 0.000 | — | — | 0.000 | — | 11.2s | $0.043 |
| Spline AI | × | 0.850 | 0.093 | 0.044 | 0.000 | — | 6.6s | $0.041 |
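P@1 is 0.000 for every agent, including the human baseline, which is consistent with an all-or-nothing gate: one dimension outside the ±0.05 mm acceptance band fails the whole part. A hedged sketch of such a pass@1 gate (the dimension lists below are hypothetical, not measured benchmark outputs):

```python
EPS_MM = 0.05  # acceptance band from the ground-truth spec

def pass_at_1(nominals, measured, eps=EPS_MM) -> bool:
    """Part passes only if every dimension lands inside the band."""
    return all(abs(m - n) <= eps for n, m in zip(nominals, measured))

# plate LxWxH, pocket WxL, pocket depth, corner radius (hypothetical values)
nominal  = [100.0, 60.0, 12.0, 20.0, 20.0, 6.0, 1.6]
measured = [100.02, 59.98, 12.01, 20.03, 19.96, 6.07, 1.61]
print(pass_at_1(nominal, measured))  # prints False: depth is 0.07 mm off
```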