BOOL-009 · Boolean Robustness · difficulty 5/5
High-genus subtraction (lattice block)
sha256:771ec0aa1bd2c6f3…
§1 Prompt verbatim
Subtract a 7 × 7 × 7 array of 4 mm cylindrical holes from a 60 × 60 × 60 mm cube. Holes spaced at 8 mm pitch, fully through. Genus = 343.
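The arithmetic implied by the prompt can be sanity-checked before any kernel call. A minimal Python sketch (the variable names and the assumption that the 7-hole array is centred on the cube are ours, not part of the task) lays out one axis of the hole lattice and the resulting wall thicknesses:

```python
# Hypothetical layout check for the hole lattice described in the prompt:
# 7 holes per axis at 8 mm pitch, 4 mm diameter, centred in a 60 mm cube.
CUBE = 60.0   # mm, cube edge
PITCH = 8.0   # mm, hole spacing
N = 7         # holes per axis
DIA = 4.0     # mm, hole diameter

span = (N - 1) * PITCH            # 48 mm across the outer hole centres
margin = (CUBE - span) / 2        # 6 mm from a cube face to the first centre
centers = [margin + i * PITCH for i in range(N)]  # per-axis centre coordinates

wall = PITCH - DIA                # 4 mm of material between adjacent hole walls
edge_wall = margin - DIA / 2      # 4 mm from a cube face to the nearest hole wall
```

Both clearances come out to 4 mm, so the lattice fits the cube with positive wall thickness everywhere; the geometry is feasible, and failures on this task come from the Boolean itself rather than an over-constrained layout.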
§2 Ground-truth spec

- shells: 1
- V − E + F: −684
- genus: 343
- watertight: true
- manifold: true
- acceptance: ε = ±0.02 mm
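The Euler-characteristic and genus rows are not independent: for a single closed orientable shell, the Euler–Poincaré formula V − E + F = 2 − 2g ties them together, which is presumably what the Euler–Poincaré compliance column in §5 scores. A one-line check (our sketch, not the grader):

```python
# Euler-Poincaré for one closed orientable shell: V - E + F = 2 - 2*genus.
genus = 343
euler_characteristic = 2 - 2 * genus  # -684, matching the spec row above
```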
§3 Reference render
canonical reference
Visualisation is rebuilt in-browser from the canonical parametric description. Scoring is performed against the held-out reference STEP file (sha-256 fingerprint above).
§4 Per-agent renders

reference + 10 agent outputs · scored against the held-out STEP

| Agent | Provenance | Vol IoU | BREP | Manifold |
|---|---|---|---|---|
| canonical reference | ground truth | 1.000 | 100 | ✓ |
| Human Baseline (Mech-E) | n=4 senior engineers | 0.805 | 8 | ✓ |
| Claude Opus 4.7 → OpenSCAD | Anthropic + OpenSCAD 2024.06 | 0.611 | 0 | ✗ |
| Claude Opus 4.7 → CadQuery | Anthropic + CadQuery 2.4 | 0.528 | 12 | ✗ |
| Adam (CADcrush) | CADcrush | 0.486 | 11 | ✗ |
| GPT-5 → CadQuery | OpenAI + CadQuery 2.4 | 0.480 | 11 | ✗ |
| DeepCAD | Wu et al. 2021 (research) | 0.444 | 13 | ✗ |
| Gemini 2.5 Pro → OpenSCAD | Google + OpenSCAD 2024.06 | 0.378 | 0 | ✗ |
| Trellis 3D | Microsoft Research | 0.121 | 0 | ✗ |
| Spline AI | Spline.design | 0.085 | 0 | ✗ (no manifold solid produced) |
| Zoo Text-to-CAD | Zoo (KittyCAD) | — | 92 | ✗ |
Each agent render is rebuilt from the canonical parametric description and degraded to match the agent's scored profile (tessellation, non-manifold face removal, dimension scale jitter, missing features). Image-only diffusion models produce visually plausible meshes but score in the single digits on BREP fidelity: the geometry is not a manifold solid even when the render reads clean.
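The Vol IoU score used throughout is a volumetric intersection-over-union. A minimal voxel-occupancy sketch (the set-of-voxels representation and the sampling grid are our assumptions; the benchmark's actual sampler is not published here):

```python
def vol_iou(a: set, b: set) -> float:
    """Volumetric IoU over two voxel-occupancy sets on a common grid."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

# Toy example: two overlapping 2-voxel-thick slabs on a 4x4 grid.
ref  = {(x, y, z) for x in range(4) for y in range(4) for z in range(2)}
pred = {(x, y, z) for x in range(4) for y in range(4) for z in range(1, 3)}
print(vol_iou(ref, pred))  # 16/48, one shared layer out of three occupied
```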
§5 Per-agent metrics
ranked by Vol IoU · same data as the leaderboard, restricted to this task
| Agent | Vol IoU | Watertight | Manifold | Euler–Poincaré compliance | P@1 | p50 | Latency | Cost |
|---|---|---|---|---|---|---|---|---|
| Human Baseline (Mech-E) | 0.805 | ✓ | 0.964 | ✓ | 1.000 | — | 919.7s | $6.557 |
| Claude Opus 4.7 → OpenSCAD | 0.611 | ✓ | 0.942 | ✓ | 0.000 | — | 32.4s | $0.260 |
| Claude Opus 4.7 → CadQuery | 0.528 | ✓ | 0.935 | ✓ | 0.000 | — | 32.7s | $0.360 |
| Adam (CADcrush) | 0.486 | ✓ | 0.919 | × | 0.000 | — | 8.2s | $0.247 |
| GPT-5 → CadQuery | 0.480 | ✓ | 0.924 | × | 0.000 | — | 35.8s | $0.239 |
| DeepCAD | 0.444 | ✓ | 0.913 | × | 0.000 | — | 3.5s | $0.018 |
| Gemini 2.5 Pro → OpenSCAD | 0.378 | × | 0.910 | × | 0.000 | — | 32.5s | $0.085 |
| Trellis 3D | 0.121 | × | 0.868 | × | 0.000 | — | 11.7s | $0.048 |
| Spline AI | 0.085 | × | 0.864 | × | 0.000 | — | 9.5s | $0.032 |
| Zoo Text-to-CAD (kernel error: `BRepCheck_NotClosed`) | 0.000 | × | 0.000 | — | 0.000 | — | 7.1s | $0.190 |
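For a triangulated boundary, the Watertight and Manifold columns reduce to an edge-incidence test: a closed 2-manifold surface has every undirected edge shared by exactly two faces. A self-contained sketch of that test (toy meshes, not the benchmark's checker):

```python
from collections import Counter

def edge_counts(faces):
    """Count undirected edge occurrences across triangular faces."""
    counts = Counter()
    for a, b, c in faces:
        for e in ((a, b), (b, c), (c, a)):
            counts[tuple(sorted(e))] += 1
    return counts

def is_watertight(faces):
    # Every edge bounded by exactly two faces -> closed 2-manifold surface.
    return all(n == 2 for n in edge_counts(faces).values())

# A closed tetrahedron passes; removing one face opens a boundary and fails.
tet = [(0, 1, 2), (0, 3, 1), (1, 3, 2), (2, 3, 0)]
print(is_watertight(tet))       # True
print(is_watertight(tet[:-1]))  # False
```

The same edge table also yields the Euler data: V, E = `len(edge_counts(faces))`, and F = `len(faces)` feed directly into the V − E + F check against 2 − 2·genus from §2.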