DFMCNC-002 · DFM · 3-Axis CNC · difficulty 4/5
3-axis-machinable manifold block
sha256:ab02f10c8e51dd23…
§1 Prompt verbatim
Solid 80 × 60 × 40 mm aluminium block. Three Ø 6 H7 bores from the top, two Ø 4 H7 bores from each side, all intersecting an internal Ø 8 manifold channel running along the long axis. The geometry must be fully machinable on a 3-axis VMC with Ø 6 → Ø 4 → Ø 1 tools, plus a 90° drill-rotation between sides.
§2 Ground-truth spec

shells: 1
watertight: true
manifold: true
acceptance: ε ±0.05 mm
features: manifold_channel, bore_H7_x7
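The acceptance band above can be illustrated with a minimal sketch. The function name and the example dimensions are illustrative only; note that the symmetric ±0.05 mm band is the task's stated acceptance criterion, which is distinct from (and looser than) the H7 fit class on the bores themselves.

```python
def within_tolerance(measured_mm: float, nominal_mm: float, eps_mm: float = 0.05) -> bool:
    """True if a measured dimension sits inside the symmetric +/- eps acceptance band."""
    return abs(measured_mm - nominal_mm) <= eps_mm

# Nominal dimensions from the spec: 80 x 60 x 40 mm block, Ø8 channel, Ø6/Ø4 bores.
print(within_tolerance(6.03, 6.0))  # -> True  (0.03 mm deviation)
print(within_tolerance(6.08, 6.0))  # -> False (0.08 mm deviation)
```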
§3 Reference render
canonical reference
Visualisation is rebuilt in-browser from the canonical parametric description. Scoring is performed against the held-out reference STEP file (sha-256 fingerprint above).
§4 Per-agent renders

reference + 10 agent outputs · scored against the held-out STEP

| Agent | Source | Vol IoU | BREP | Manifold |
|---|---|---|---|---|
| canonical reference | canonical · ground truth | 1.000 | 100 | ✓ |
| Zoo Text-to-CAD | Zoo (KittyCAD) | 0.594 | 11 | ✗ |
| Claude Opus 4.7 → CadQuery | Anthropic + CadQuery 2.4 | 0.594 | 9 | ✗ |
| Human Baseline (Mech-E) | n=4 senior engineers | 0.569 | 8 | ✗ |
| Adam (CADcrush) | CADcrush | 0.559 | 10 | ✗ |
| GPT-5 → CadQuery | OpenAI + CadQuery 2.4 | 0.449 | 11 | ✗ |
| Claude Opus 4.7 → OpenSCAD | Anthropic + OpenSCAD 2024.06 | 0.383 | 0 | ✗ |
| DeepCAD | Wu et al. 2021 (research) | 0.356 | 16 | ✗ |
| Gemini 2.5 Pro → OpenSCAD | Google + OpenSCAD 2024.06 | 0.300 | 0 | ✗ |
| Trellis 3D | Microsoft Research | 0.004 | 0 | ✗ · no manifold solid produced |
| Spline AI | Spline.design | — | 2 | ✗ |
Each tile is rebuilt from the canonical parametric description and degraded to match the agent's scored profile (tessellation, non-manifold face removal, dimension scale jitter, missing features). Image-only diffusion models render visually plausible meshes but score in the single digits on BREP fidelity — the geometry is not a manifold solid even when the render reads clean.
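The volumetric IoU column can be read as intersection-over-union of occupied voxels after both solids are sampled on a common grid. A minimal pure-Python sketch over voxel-index sets follows; the actual harness presumably voxelises the agent output and the held-out reference STEP at a fixed resolution, but that pipeline is not shown here.

```python
def voxel_iou(a: set, b: set) -> float:
    """IoU of two solids represented as sets of occupied (i, j, k) voxel indices."""
    union = a | b
    if not union:
        return 0.0
    return len(a & b) / len(union)

# Toy example: two 2x2x2 blocks offset by one voxel along x.
ref = {(x, y, z) for x in range(2) for y in range(2) for z in range(2)}
agent = {(x + 1, y, z) for x in range(2) for y in range(2) for z in range(2)}
print(voxel_iou(ref, agent))  # 4 shared voxels / 12 in the union = 0.333...
```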
§5 Per-agent metrics
ranked by Vol IoU · same data as the leaderboard, restricted to this task
| Agent | Watertight | Manifold | DFM | Min-Wall Compliance | CAM Reachability | P@1 | p50 | latency | cost |
|---|---|---|---|---|---|---|---|---|---|
| Zoo Text-to-CAD | ✓ | 0.936 | 81.8 | 0.623 | 0.706 | 0.000 | — | 7.1s | $0.153 |
| Claude Opus 4.7 → CadQuery | ✓ | 0.936 | 82.4 | 0.622 | 0.683 | 0.000 | — | 46.2s | $0.330 |
| Human Baseline (Mech-E) | ✓ | 0.943 | 93.0 | 0.803 | 0.836 | 0.000 | — | 623.8s | $7.071 |
| Adam (CADcrush) | ✓ | 0.936 | 79.7 | 0.579 | 0.596 | 0.000 | — | 11.1s | $0.263 |
| GPT-5 → CadQuery | ✓ | 0.919 | 80.5 | 0.617 | 0.647 | 0.000 | — | 29.4s | $0.248 |
| Claude Opus 4.7 → OpenSCAD | × | 0.906 | 76.8 | 0.599 | 0.529 | 0.000 | — | 36.2s | $0.360 |
| DeepCAD | × | 0.905 | 64.6 | 0.338 | 0.343 | 0.000 | — | 4.7s | $0.022 |
| Gemini 2.5 Pro → OpenSCAD | × | 0.899 | 75.3 | 0.528 | 0.443 | 0.000 | — | 24.1s | $0.094 |
| Trellis 3D | × | 0.851 | 50.2 | 0.203 | 0.094 | 0.000 | — | 13.8s | $0.056 |
| Spline AI (kernel error: BRepCheck_NotClosed) | × | 0.000 | — | — | — | 0.000 | — | 9.4s | $0.035 |