ASM-005 · Assembly & Mating · difficulty 3/5
Pin-in-hole, H7/g6 sliding fit
sha256:5cd0a7e3b4f10918…
§1 Prompt (verbatim)
Cylindrical pin Ø 10 g6 × 40 mm long, chamfered 1×45° on both ends. Held-out partner has a Ø 10 H7 through-hole; the assembled clearance must lie in [0.005, 0.029] mm.
§2 Ground-truth spec

| Property | Value |
|---|---|
| shells | 1 |
| watertight | true |
| manifold | true |
| acceptance ε | ±0.005 mm |
| clearance | [0.005, 0.029] mm |
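The clearance band is not arbitrary: it follows directly from the ISO 286 limit deviations for a Ø 10 H7/g6 fit. A minimal sketch of that arithmetic (the deviation values, in mm, are the standard table entries for a 10 mm nominal):

```python
# ISO 286-2 limit deviations for a 10 mm nominal, values in mm.
# Hole 10 H7: EI = 0.000, ES = +0.015; shaft 10 g6: ei = -0.014, es = -0.005.
HOLE_H7 = (0.000, 0.015)    # (lower, upper) deviation
SHAFT_G6 = (-0.014, -0.005)

def clearance_band(hole, shaft):
    """Min/max diametral clearance for a hole/shaft deviation pair."""
    c_min = hole[0] - shaft[1]   # tightest: smallest hole meets largest shaft
    c_max = hole[1] - shaft[0]   # loosest: largest hole meets smallest shaft
    return round(c_min, 3), round(c_max, 3)

print(clearance_band(HOLE_H7, SHAFT_G6))  # → (0.005, 0.029)
```

Any in-tolerance pin/hole pair therefore lands in the spec's [0.005, 0.029] mm band by construction.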
§3 Reference render
canonical reference
Visualisation is rebuilt in-browser from the canonical parametric description. Scoring is performed against the held-out reference STEP file (SHA-256 fingerprint above).
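The "vol IoU" score used below is a volume intersection-over-union. One minimal way to approximate it, shown here as an illustration rather than the harness's actual implementation, is to voxelise both solids at a fixed pitch and compare the occupancy sets:

```python
# Hypothetical voxel-set IoU: each solid is represented by the set of
# voxel coordinates it occupies on a common grid.
def vol_iou(vox_a: set, vox_b: set) -> float:
    """Intersection-over-union of two voxel occupancy sets."""
    union = len(vox_a | vox_b)
    return len(vox_a & vox_b) / union if union else 0.0

# Toy example: a 10 x 10 x 40 voxel block vs the same block shifted off-axis.
block = {(x, y, z) for x in range(10) for y in range(10) for z in range(40)}
shifted = {(x + 2, y, z) for (x, y, z) in block}
print(round(vol_iou(block, shifted), 3))  # → 0.667
```

A perfect reproduction scores 1.000 (as the reference row does below); any positional or dimensional error erodes both the intersection and the score.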
§4 Per-agent renders
reference + 10 agent outputs · scored against the held-out STEP
| Agent | System | Vol IoU | BREP | Manifold |
|---|---|---|---|---|
| REFERENCE | canonical · ground truth | 1.000 | 100 | ✓ |
| Human Baseline (Mech-E) | n=4 senior engineers | 0.866 | 9 | ✓ |
| Zoo Text-to-CAD | Zoo (KittyCAD) | 0.639 | 9 | ✗ |
| Claude Opus 4.7 → CadQuery | Anthropic + CadQuery 2.4 | 0.601 | 12 | ✗ |
| GPT-5 → CadQuery | OpenAI + CadQuery 2.4 | 0.582 | 12 | ✗ |
| Adam (CADcrush) | CADcrush | 0.550 | 9 | ✗ |
| Claude Opus 4.7 → OpenSCAD | Anthropic + OpenSCAD 2024.06 | 0.421 | 0 | ✗ |
| DeepCAD | Wu et al. 2021 (research) | 0.339 | 18 | ✗ |
| Gemini 2.5 Pro → OpenSCAD | Google + OpenSCAD 2024.06 | 0.335 | 0 | ✗ |
| Trellis 3D | Microsoft Research | 0.000 | 0 | ✗ |
| Spline AI | Spline.design | 0.000 | 0 | ✗ |
Each tile is rebuilt from the canonical parametric description and degraded to match the agent's scored profile (tessellation, non-manifold face removal, dimension scale jitter, missing features). Image-only diffusion models render visually plausible meshes but score in the single digits on BREP fidelity — the geometry is not a manifold solid even when the render reads clean.
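The manifold flag referenced above can be probed with the classic edge-incidence test: in a closed 2-manifold triangle mesh, every undirected edge is shared by exactly two triangles. A stdlib-only sketch of that check (illustrative; the benchmark's own checker is not published here):

```python
from collections import Counter

def is_manifold_closed(triangles):
    """True iff every undirected edge is shared by exactly two triangles."""
    edge_counts = Counter()
    for a, b, c in triangles:
        for u, v in ((a, b), (b, c), (c, a)):
            edge_counts[frozenset((u, v))] += 1
    return all(n == 2 for n in edge_counts.values())

# A tetrahedron is closed and manifold; dropping one face opens a boundary.
tet = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
print(is_manifold_closed(tet))       # → True
print(is_manifold_closed(tet[:-1]))  # → False (boundary edges seen once)
```

A mesh that "reads clean" in a render can still fail this test wherever triangles overlap, self-intersect, or leave hairline gaps, which is exactly what the image-only models are penalised for.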
§5 Per-agent metrics
ranked by Vol IoU · same data as the leaderboard, restricted to this task
| Agent | Watertight | Manifold | Feature Recall | Mating Clearance | Fit-Class Compliance | P@1 | p50 | Latency | Cost |
|---|---|---|---|---|---|---|---|---|---|
| Human Baseline (Mech-E) | ✓ | 0.979 | 0.939 | 0.802 | 0.849 | 1.000 | — | 805.3s | $5.119 |
| Zoo Text-to-CAD | ✓ | 0.940 | 0.713 | 0.771 | 0.573 | 0.000 | — | 7.2s | $0.166 |
| Claude Opus 4.7 → CadQuery | ✓ | 0.947 | 0.777 | 0.601 | 0.414 | 0.000 | — | 40.3s | $0.296 |
| GPT-5 → CadQuery | ✓ | 0.936 | 0.693 | 0.577 | 0.376 | 0.000 | — | 36.2s | $0.179 |
| Adam (CADcrush) | ✓ | 0.934 | 0.701 | 0.593 | 0.446 | 0.000 | — | 7.9s | $0.263 |
| Claude Opus 4.7 → OpenSCAD | × | 0.910 | 0.609 | 0.499 | 0.225 | 0.000 | — | 36.8s | $0.318 |
| DeepCAD | × | 0.901 | 0.429 | 0.442 | 0.173 | 0.000 | — | 5.3s | $0.021 |
| Gemini 2.5 Pro → OpenSCAD | × | 0.903 | 0.546 | 0.461 | 0.229 | 0.000 | — | 27.2s | $0.104 |
| Trellis 3D | × | 0.850 | 0.189 | 0.096 | 0.005 | 0.000 | — | 9.1s | $0.051 |
| Spline AI | × | 0.850 | 0.085 | 0.049 | 0.001 | 0.000 | — | 9.8s | $0.032 |
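The Fit-Class Compliance column can be read as the fraction of simulated pin/hole pairs whose diametral clearance lands inside the [0.005, 0.029] mm band. The sketch below illustrates that reading under assumed inputs; the sampling model and the jitter magnitude are illustrative choices, not the benchmark's published definition:

```python
import random

BAND = (0.005, 0.029)  # mm, required clearance band for the 10 H7/g6 fit

def compliance(pin_dias, hole_dias, band=BAND):
    """Fraction of mating pairs whose clearance (hole - pin) is in-band."""
    pairs = list(zip(pin_dias, hole_dias))
    ok = sum(band[0] <= h - p <= band[1] for p, h in pairs)
    return ok / len(pairs)

rng = random.Random(0)
# In-tolerance parts: pin anywhere within 10 g6, hole anywhere within 10 H7.
pins = [rng.uniform(9.986, 9.995) for _ in range(1000)]
holes = [rng.uniform(10.000, 10.015) for _ in range(1000)]
print(compliance(pins, holes))  # → 1.0: conforming parts always mate in-band

# A 10 µm Gaussian scale jitter on the pin (the degradation mode noted in
# §4) pushes a visible share of pairs out of the band.
jittered = [p + rng.gauss(0, 0.010) for p in pins]
print(compliance(jittered, holes) < 1.0)  # → True
```

This is why dimension jitter is so punishing here: the band is only 24 µm wide, so even a sub-hundredth-millimetre scale error drags compliance well below the human baseline's 0.849.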