BREP-004 · BREP Fidelity · difficulty 4/5
NURBS-handle goblet (G2 swept loft)
sha256:9bb19af0271ac46e…
§1 Prompt verbatim
Goblet: cup is a swept-revolved NURBS surface (12 control points along generatrix), stem is a 10 mm chamfered cylinder, base is a Ø 70 × 5 mm disc. Cup-to-stem and stem-to-base junctions must be G2 continuous. Export AP242.
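For orientation, here is a minimal CadQuery sketch (CadQuery is the library behind several of the scored toolchains below) of the construction the prompt asks for: revolve a spline generatrix into the cup, stack a chamfered stem and a Ø 70 × 5 mm base, and export STEP. All coordinates, the stem length, and the control-point count are illustrative assumptions rather than the reference geometry, and plain CadQuery/OCCT export does not by itself guarantee AP242 output or G2 junction continuity.

```python
# Illustrative sketch only: hypothetical dimensions, not the reference geometry.
import cadquery as cq

# Cup: revolve a closed generatrix (radius vs. height, mm) about the global Z axis.
# The prompt asks for 12 control points and G2 junctions; this shows only the
# construction pattern, not those exact constraints.
cup = (
    cq.Workplane("XZ")
    .moveTo(0, 60)
    .lineTo(5, 60)
    .spline([(22, 75), (33, 100), (34, 125), (30, 145)], includeCurrent=True)
    .lineTo(0, 145)
    .close()
    .revolve(360)   # default axis = the workplane's local y axis (global Z here)
)

# Stem: 10 mm diameter cylinder with chamfered ends (the 55 mm length is assumed).
stem = (
    cq.Workplane("XY")
    .workplane(offset=5)
    .circle(5)
    .extrude(55)
    .edges()
    .chamfer(1.0)
)

# Base: Ø 70 × 5 mm disc.
base = cq.Workplane("XY").circle(35).extrude(5)

goblet = base.union(stem).union(cup)

# STEP export goes through OCCT; the default schema is not AP242.
cq.exporters.export(goblet, "goblet.step")
```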
§2 Ground-truth spec

| Property | Value |
|---|---|
| shells | 1 |
| watertight | true |
| manifold | true |
| acceptance ε | ±0.05 mm |
| features | g2_continuity_cup_stem, g2_continuity_stem_base |
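As a rough sketch of how the spec fields and the volumetric IoU could be checked against a tessellated output (hypothetical file names; this is not the benchmark's actual harness), a trimesh pass might look like the following. The two G2-continuity features would need a separate curvature comparison on the BREP itself and are not covered here.

```python
# Hypothetical checker sketch, not the benchmark harness. Assumes both the
# reference and the agent output have already been tessellated to meshes.
import trimesh

ref = trimesh.load_mesh("reference.stl")        # assumed file names
cand = trimesh.load_mesh("agent_output.stl")

# Shell / watertight / manifold flags, roughly matching the spec fields above.
n_shells = len(cand.split(only_watertight=False))
watertight = cand.is_watertight                 # no open boundary edges
manifold = cand.is_watertight and cand.is_winding_consistent

# Volumetric IoU (the "vol IoU" column below) via mesh booleans; this needs a
# boolean backend (e.g. the manifold3d package) to be installed.
inter = trimesh.boolean.intersection([ref, cand])
union = trimesh.boolean.union([ref, cand])
vol_iou = inter.volume / union.volume if union.volume > 0 else 0.0

print(n_shells, watertight, manifold, round(vol_iou, 3))
```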
§3 Reference render

canonical reference
Visualisation is rebuilt in-browser from the canonical parametric description. Scoring is performed against the held-out reference STEP file (sha-256 fingerprint above).
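The fingerprint can be recomputed from the reference file with a few lines (file name is an assumption):

```python
# Recompute the reference fingerprint (file name is an assumption).
import hashlib

with open("reference.step", "rb") as f:
    print(hashlib.sha256(f.read()).hexdigest())   # should start with 9bb19af0271ac46e
```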
§4 Per-agent renders
reference + 10 agent outputs · scored against the held-out STEP
| Agent | Provider / notes | Vol IoU | BREP | Manifold |
|---|---|---|---|---|
| canonical reference | REFERENCE · ground truth | 1.000 | 100 | ✓ |
| Zoo Text-to-CAD | Zoo (KittyCAD) | 0.760 | 10 | ✓ |
| Human Baseline (Mech-E) | n=4 senior engineers | 0.676 | 11 | ✗ |
| Adam (CADcrush) | CADcrush | 0.601 | 12 | ✗ |
| DeepCAD | Wu et al. 2021 (research) | 0.524 | 11 | ✗ |
| Claude Opus 4.7 → CadQuery | Anthropic + CadQuery 2.4 | 0.503 | 9 | ✗ |
| GPT-5 → CadQuery | OpenAI + CadQuery 2.4 | 0.440 | 14 | ✗ |
| Claude Opus 4.7 → OpenSCAD | Anthropic + OpenSCAD 2024.06 | 0.075 | 0 | ✗ |
| Trellis 3D | Microsoft Research | 0.015 | 0 | ✗ |
| Gemini 2.5 Pro → OpenSCAD | Google + OpenSCAD 2024.06 | 0.000 | 0 | ✗ |
| Spline AI | Spline.design | 0.000 | 0 | ✗ |
Each tile is rebuilt from the canonical parametric description and degraded to match the agent's scored profile (tessellation, non-manifold face removal, dimension scale jitter, missing features). Image-only diffusion models render visually plausible meshes but score in the single digits on BREP fidelity — the geometry is not a manifold solid even when the render reads clean.
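A hedged sketch of that degradation step, with made-up parameters (the pipeline's actual settings are not given here): decimate the reference mesh, jitter its scale, and drop a small fraction of faces so the result stops being watertight, much like the low-scoring outputs.

```python
# Illustrative degradation of the reference mesh toward an agent's scored
# profile. Every parameter here is made up for the example.
import numpy as np
import trimesh

mesh = trimesh.load_mesh("reference.stl")        # assumed file name
rng = np.random.default_rng(0)

# Coarser tessellation: quadric decimation to a target face count
# (needs the optional fast-simplification backend in recent trimesh).
degraded = mesh.simplify_quadric_decimation(face_count=2000)

# Dimension scale jitter: a fraction of a percent of uniform scaling.
degraded.apply_scale(1.0 + rng.normal(0.0, 0.005))

# "Missing features" / non-manifold face removal, crudely mimicked by dropping
# a random 2% of faces, which also opens the shell like the low scorers.
degraded.update_faces(rng.random(len(degraded.faces)) > 0.02)
```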
§5 Per-agent metrics
ranked by Vol IoU · same data as the leaderboard, restricted to this task
| Agent | Watertight | Manifold | Euler-Poincaré Compliance | STEP RT | FeatRec | P@1 | p50 | Latency | Cost |
|---|---|---|---|---|---|---|---|---|---|
| Zoo Text-to-CAD | ✓ | 0.963 | ✓ | 0.097 | 0.722 | 1.000 | — | 6.2s | $0.188 |
| Human Baseline (Mech-E) | ✓ | 0.949 | ✓ | 0.113 | 0.896 | 0.000 | — | 568.6s | $6.847 |
| Adam (CADcrush) | ✓ | 0.944 | ✓ | 0.124 | 0.713 | 0.000 | — | 9.6s | $0.298 |
| DeepCAD | ✓ | 0.931 | ✓ | 0.109 | 0.447 | 0.000 | — | 4.2s | $0.024 |
| Claude Opus 4.7 → CadQuery | ✓ | 0.930 | ✓ | 0.088 | 0.760 | 0.000 | — | 44.9s | $0.347 |
| GPT-5 → CadQuery | ✓ | 0.918 | × | 0.135 | 0.680 | 0.000 | — | 40.2s | $0.174 |
| Claude Opus 4.7 → OpenSCAD | × | 0.862 | × | — | 0.523 | 0.000 | — | 37.8s | $0.344 |
| Trellis 3D | × | 0.852 | × | — | 0.201 | 0.000 | — | 12.2s | $0.056 |
| Gemini 2.5 Pro → OpenSCAD | × | 0.850 | × | — | 0.480 | 0.000 | — | 26.1s | $0.075 |
| Spline AI | × | 0.850 | × | — | 0.099 | 0.000 | — | 7.1s | $0.043 |
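The Euler-Poincaré Compliance column presumably checks the model's BREP entity counts against the classical formula V - E + F - (L - F) - 2(S - G) = 0 (vertices, edges, faces, loops, shells, genus). A minimal sketch of that arithmetic, assuming the counts have already been extracted from the STEP file by some other means:

```python
# Minimal sketch of an Euler-Poincare compliance check. The entity counts are
# assumed to come from a separate traversal of the STEP/BREP data; only the
# arithmetic of the classical formula is shown here.
from dataclasses import dataclass

@dataclass
class BrepCounts:
    vertices: int   # V
    edges: int      # E
    faces: int      # F
    loops: int      # L, all face loops (outer + inner)
    shells: int     # S
    genus: int      # G, number of through-holes

def euler_poincare_ok(c: BrepCounts) -> bool:
    # V - E + F - (L - F) - 2(S - G) == 0 for a valid manifold BREP
    return (c.vertices - c.edges + c.faces
            - (c.loops - c.faces)
            - 2 * (c.shells - c.genus)) == 0

# Sanity check with a unit cube: 8 vertices, 12 edges, 6 faces,
# 6 loops, 1 shell, genus 0.
assert euler_poincare_ok(BrepCounts(8, 12, 6, 6, 1, 0))
```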