# Engineering Metrics & Validation
This page documents how we plan to measure the effectiveness of our solutions.
## What We Measure
- DOM performance: render cycles, fragment diff time, reflow frequency
- State consistency: update latency, transaction counts
- Developer experience: onboarding minutes, LOC per component, event bindings per component
- Bundle size and Lighthouse scores: size budgets and category score thresholds
## Data Shape
- Schema: `scripts/metrics.schema.json`
- Output file: `test-results/metrics.json`
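The schema file above is authoritative. As an orientation aid only, here is a hypothetical TypeScript view of the shape, with field names inferred from the environment variables listed below; every field is optional, since any signal can be absent from a given run.

```ts
// Illustrative shape only -- the authoritative definition lives in
// scripts/metrics.schema.json. Field names here are assumptions
// inferred from the BD_* environment variables.
interface EngineeringMetrics {
  bundle: {
    totalKb?: number;  // BD_BUNDLE_TOTAL_KB
    vendorKb?: number; // BD_BUNDLE_VENDOR_KB
  };
  lighthouse: {
    performance?: number;   // BD_LH_PERF
    accessibility?: number; // BD_LH_A11Y
    bestPractices?: number; // BD_LH_BP
    seo?: number;           // BD_LH_SEO
  };
  dom: {
    renderCycles?: number; // BD_DOM_CYCLES
    diffMs?: number;       // BD_DOM_DIFF_MS
    reflows?: number;      // BD_DOM_REFLOWS
  };
  state: {
    updateLatencyMs?: number; // BD_STATE_LAT_MS
    transactions?: number;    // BD_STATE_TX
  };
  dx: {
    onboardingMinutes?: number;         // BD_DX_ONBOARD_M
    locPerComponent?: number;           // BD_DX_LOC
    eventBindingsPerComponent?: number; // BD_DX_EVT
  };
}
```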
## Collecting Metrics (CI Placeholder)
- Command: `npm run metrics:collect`
- Environment variables consumed (optional):
  - `BD_BUNDLE_TOTAL_KB`, `BD_BUNDLE_VENDOR_KB`
  - `BD_LH_PERF`, `BD_LH_A11Y`, `BD_LH_BP`, `BD_LH_SEO`
  - `BD_DOM_CYCLES`, `BD_DOM_DIFF_MS`, `BD_DOM_REFLOWS`
  - `BD_STATE_LAT_MS`, `BD_STATE_TX`
  - `BD_DX_ONBOARD_M`, `BD_DX_LOC`, `BD_DX_EVT`
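The collection script is still a placeholder, but its intended behavior can be sketched: read whichever `BD_*` variables are set and write them into `test-results/metrics.json`. The grouping below mirrors the hypothetical shape shown earlier and is an assumption, not the shipped script.

```ts
// Hypothetical collector behind `npm run metrics:collect` -- a sketch
// of the intent, not the real implementation.
import { mkdirSync, writeFileSync } from "node:fs";

// Parse an optional numeric env var; leave the field absent when unset.
const num = (name: string): number | undefined => {
  const raw = process.env[name];
  return raw !== undefined && raw !== "" ? Number(raw) : undefined;
};

const metrics = {
  bundle: {
    totalKb: num("BD_BUNDLE_TOTAL_KB"),
    vendorKb: num("BD_BUNDLE_VENDOR_KB"),
  },
  lighthouse: {
    performance: num("BD_LH_PERF"),
    accessibility: num("BD_LH_A11Y"),
    bestPractices: num("BD_LH_BP"),
    seo: num("BD_LH_SEO"),
  },
  dom: {
    renderCycles: num("BD_DOM_CYCLES"),
    diffMs: num("BD_DOM_DIFF_MS"),
    reflows: num("BD_DOM_REFLOWS"),
  },
  state: {
    updateLatencyMs: num("BD_STATE_LAT_MS"),
    transactions: num("BD_STATE_TX"),
  },
  dx: {
    onboardingMinutes: num("BD_DX_ONBOARD_M"),
    locPerComponent: num("BD_DX_LOC"),
    eventBindingsPerComponent: num("BD_DX_EVT"),
  },
};

mkdirSync("test-results", { recursive: true });
writeFileSync("test-results/metrics.json", JSON.stringify(metrics, null, 2));
```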
CI uploads `test-results/metrics.json` as an artifact named `engineering-metrics`.
## Future Harness
- PerformanceObserver sampling in E2E to populate DOM metrics (see the first sketch after this list)
- Parsing Lighthouse JSON to fill category scores (see the second sketch after this list)
- Bundle analyzer parsing to emit vendor/core split sizes
- Telemetry during local dev to populate DX signals (opt-in)
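A minimal sketch of the PerformanceObserver idea, assuming the app wraps its diff phase in `performance.measure()` calls named `fragment-diff` (a hypothetical convention; the real instrumentation points are not yet defined):

```ts
// Runs in the page during an E2E scenario. Counts render cycles and
// records fragment diff durations from performance.measure() entries.
const diffTimes: number[] = [];
let renderCycles = 0;

const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // "fragment-diff" is an assumed measure name, not an existing one.
    if (entry.entryType === "measure" && entry.name === "fragment-diff") {
      renderCycles += 1;
      diffTimes.push(entry.duration);
    }
  }
});
observer.observe({ entryTypes: ["measure"] });

// After the scenario finishes, the test runner can read this snapshot
// and export BD_DOM_CYCLES / BD_DOM_DIFF_MS for metrics:collect.
export function snapshot() {
  const avgDiffMs =
    diffTimes.length > 0
      ? diffTimes.reduce((a, b) => a + b, 0) / diffTimes.length
      : 0;
  return { renderCycles, avgDiffMs };
}
```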
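And a sketch of the Lighthouse parsing step. Lighthouse JSON reports expose category scores as 0-1 values under `categories`; scaling to 0-100 and the report path `lighthouse-report.json` are assumptions here. Printing `KEY=value` lines lets a CI step append them to its environment before running `metrics:collect`.

```ts
// Map a Lighthouse JSON report onto the BD_LH_* variables.
// The report path is a hypothetical example.
import { readFileSync } from "node:fs";

const report = JSON.parse(readFileSync("lighthouse-report.json", "utf8"));

// Lighthouse stores category scores as 0-1 (or null); scale to 0-100.
const pct = (id: string): number =>
  Math.round((report.categories[id]?.score ?? 0) * 100);

console.log(`BD_LH_PERF=${pct("performance")}`);
console.log(`BD_LH_A11Y=${pct("accessibility")}`);
console.log(`BD_LH_BP=${pct("best-practices")}`);
console.log(`BD_LH_SEO=${pct("seo")}`);
```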