REVENG-002 · Reverse Engineering · difficulty 4/5
Three-view ortho → bracket
sha256:30d72b41ae9c0014…
§1 Prompt verbatim
From the supplied 1:1 front/top/side dimensioned drawing (PNG, 600 dpi) reproduce the part. All dimensions and tolerances on the drawing are authoritative.
§2 Ground-truth spec
shells: 1
watertight: true
manifold: true
acceptance: ε ±0.1 mm
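The watertight and manifold acceptance criteria can be checked mechanically on a triangulated export. A minimal pure-Python sketch, assuming a simple face-index-list representation (in practice a geometry kernel or a library such as trimesh would be used); the test below covers only the closed-surface condition (every edge shared by exactly two triangles), which is necessary but not sufficient for a full 2-manifold check:

```python
from collections import Counter

def edge_counts(faces):
    """Count occurrences of each undirected edge across all triangles."""
    counts = Counter()
    for a, b, c in faces:
        for edge in ((a, b), (b, c), (c, a)):
            counts[tuple(sorted(edge))] += 1
    return counts

def is_watertight(faces):
    # Closed surface: every edge belongs to exactly two triangles.
    return all(n == 2 for n in edge_counts(faces).values())

# Tetrahedron: 4 faces, 6 edges, each used twice -> watertight.
tet = [(0, 1, 2), (0, 3, 1), (1, 3, 2), (2, 3, 0)]
print(is_watertight(tet))      # True
print(is_watertight(tet[:3]))  # False: removing a face opens a boundary
```

A full manifold test would additionally reject edges shared by more than two faces and check consistent winding across shared edges.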
§3 Reference render
canonical reference
Visualisation is rebuilt in-browser from the canonical parametric description. Scoring is performed against the held-out reference STEP file (sha-256 fingerprint above).
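Checking an artifact against that fingerprint is a plain SHA-256 over the STEP file's bytes; a minimal sketch (the helper name and chunk size are illustrative):

```python
import hashlib

def sha256_fingerprint(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()

# The digest of the empty byte string is a well-known constant.
assert hashlib.sha256(b"").hexdigest().startswith("e3b0c442")
```

Streaming matters only for large files; for small ones, `hashlib.sha256(open(path, "rb").read()).hexdigest()` is equivalent.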
§4 Per-agent renders
reference + 10 agent outputs · scored against the held-out STEP
vol IoU · BREP · manifold
canonical reference · ground truth
1.000 · 100 · ✓
Claude Opus 4.7 → CadQuery
Anthropic + CadQuery 2.4
0.676 · 8 · ✓
Human Baseline (Mech-E)
n=4 senior engineers
0.646 · 8 · ✗
Trellis 3D
Microsoft Research
0.550 · 0 · ✗
GPT-5 → CadQuery
OpenAI + CadQuery 2.4
0.442 · 15 · ✗
Claude Opus 4.7 → OpenSCAD
Anthropic + OpenSCAD 2024.06
0.415 · 0 · ✗
Gemini 2.5 Pro → OpenSCAD
Google + OpenSCAD 2024.06
0.403 · 0 · ✗
Zoo Text-to-CAD
Zoo (KittyCAD)
0.392 · 15 · ✗
Adam (CADcrush)
CADcrush
0.316 · 17 · ✗
DeepCAD
Wu et al. 2021 (research)
0.092 · 52 · ✗
no manifold solid produced
Spline AI
Spline.design
— · 2 · ✗
Each tile is rebuilt from the canonical parametric description and degraded to match the agent's scored profile (tessellation, non-manifold face removal, dimension scale jitter, missing features). Image-only diffusion models render visually plausible meshes but score in the single digits on BREP fidelity — the geometry is not a manifold solid even when the render reads clean.
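The Vol IoU chip can be approximated by voxelizing both solids on a shared grid and comparing occupancy. A minimal pure-Python sketch under stated assumptions: the solids are given as point-membership predicates (a real scorer would voxelize the actual meshes with a geometry kernel), and the two unit cubes below are illustrative:

```python
def voxel_iou(inside_a, inside_b, lo, hi, res):
    """Estimate volumetric IoU by sampling cell centers of a res^3 grid
    spanning the cube [lo, hi]^3."""
    step = (hi - lo) / res
    count_a = count_b = count_both = 0
    for i in range(res):
        for j in range(res):
            for k in range(res):
                p = (lo + (i + 0.5) * step,
                     lo + (j + 0.5) * step,
                     lo + (k + 0.5) * step)
                in_a, in_b = inside_a(p), inside_b(p)
                count_a += in_a
                count_b += in_b
                count_both += in_a and in_b
    union = count_a + count_b - count_both
    return count_both / union if union else 0.0

# Two unit cubes offset by 0.5 along x; true IoU = 0.5 / 1.5 = 1/3.
in_a = lambda p: all(0.0 <= c < 1.0 for c in p)
in_b = lambda p: 0.5 <= p[0] < 1.5 and all(0.0 <= c < 1.0 for c in p[1:])
print(round(voxel_iou(in_a, in_b, lo=0.0, hi=2.0, res=10), 3))  # 0.429
```

At res=10 the estimate overshoots the exact 1/3 because boundary cells are counted whole; refining the grid converges toward the true value.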
§5 Per-agent metrics
ranked by Vol IoU · same data as the leaderboard, restricted to this task
| Agent | Vol IoU | Watertight | Manifold | Named-Dimension RMSE | FeatRec | P@1 | p50 | Latency | Cost |
|---|---|---|---|---|---|---|---|---|---|
| Claude Opus 4.7 → CadQuery | 0.676 | ✓ | 0.956 | 0.232 | 0.709 | 0.000 | — | 37.2s | $0.328 |
| Human Baseline (Mech-E) | 0.646 | ✓ | 0.943 | 0.111 | 0.937 | 0.000 | — | 844.1s | $6.605 |
| Trellis 3D | 0.550 | ✓ | 0.928 | 0.508 | 0.208 | 0.000 | — | 10.1s | $0.055 |
| GPT-5 → CadQuery | 0.442 | ✓ | 0.917 | 0.245 | 0.733 | 0.000 | — | 42.0s | $0.229 |
| Claude Opus 4.7 → OpenSCAD | 0.415 | ✓ | 0.913 | 0.218 | 0.599 | 0.000 | — | 34.2s | $0.341 |
| Gemini 2.5 Pro → OpenSCAD | 0.403 | ✓ | 0.913 | 0.276 | 0.508 | 0.000 | — | 25.9s | $0.101 |
| Zoo Text-to-CAD | 0.392 | × | 0.906 | 0.201 | 0.738 | 0.000 | — | 5.6s | $0.188 |
| Adam (CADcrush) | 0.316 | × | 0.896 | 0.165 | 0.736 | 0.000 | — | 6.7s | $0.228 |
| DeepCAD | 0.092 | × | 0.864 | 0.300 | 0.428 | 0.000 | — | 4.1s | $0.021 |
| Spline AI (kernel error: BRepCheck_NotClosed) | 0.000 | × | 0.000 | — | — | 0.000 | — | 8.3s | $0.036 |