How We Test
Our Workflow-First Method
We test document scanners by outcomes that matter in daily work: time-to-digital, jam recovery, OCR accuracy, and cloud filing success. Each device runs through standardized “mixed stack” jobs that mirror real life—receipts, IDs, business cards, multi-page contracts, and duplex forms with different sizes and paper weights.
Test Environment
- OS: Current Windows and macOS releases; select Linux where vendor drivers exist
- Connections: USB, Ethernet, and Wi‑Fi (2.4/5 GHz) under typical small and midsize business (SMB) office network constraints
- Software: Vendor drivers/apps, TWAIN/ICA/WIA, and common middleware (e.g., Power Automate, Zapier connectors where available)
- Cloud: Google Drive, OneDrive/SharePoint, Dropbox, Box; common DMS targets where possible
Core Metrics
- Time-to-Digital: From first sheet load to searchable, correctly named, and filed PDF (or PDF/A) in the right folder
- Throughput: Effective pages-per-minute (EPPM) including prep, misfeeds, and recovery time (see the sketch after this list)
- Jam Prevention & Recovery: Misfeed rate, auto-retry behavior, and preservation of batch order
- OCR Quality: Word-level accuracy and layout retention on typed text and receipts, including embedded metadata
- Filing Success: Authenticated, rules-based naming and routing; retry behavior on network hiccups
- Image Quality: Auto-crop, deskew, blank removal, color fidelity, and compression artifacts
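To make the throughput and OCR numbers concrete, here is a minimal sketch of how EPPM and word-level accuracy can be tallied. The ScanRun fields, the crude word comparison, and the example timings are illustrative stand-ins, not our production tooling.

```python
from dataclasses import dataclass

@dataclass
class ScanRun:
    """One timed pass of a mixed stack (all values in seconds)."""
    pages: int            # pages that ended up in the final PDF
    prep_time: float      # unfolding, de-stapling, loading the ADF
    scan_time: float      # feeder running, including rescans after misfeeds
    recovery_time: float  # clearing jams, re-ordering, restarting the batch
    filing_time: float    # OCR, naming, and upload until the file lands in the folder

def effective_ppm(run: ScanRun) -> float:
    """Effective pages-per-minute over the whole job, not just feeder speed."""
    total_seconds = run.prep_time + run.scan_time + run.recovery_time + run.filing_time
    return run.pages / (total_seconds / 60.0)

def word_accuracy(reference: list[str], ocr_output: list[str]) -> float:
    """Crude word-level accuracy: fraction of reference words matched in order.
    (A real comparison would use an alignment or edit-distance tool.)"""
    matches = sum(1 for ref, got in zip(reference, ocr_output) if ref == got)
    return matches / len(reference) if reference else 0.0

run = ScanRun(pages=100, prep_time=90, scan_time=240, recovery_time=45, filing_time=60)
print(f"EPPM: {effective_ppm(run):.1f}")  # prints "EPPM: 13.8" for this example
```

The point of the sketch is that every minute of prep, jam recovery, and filing counts against EPPM, not just the seconds the feeder is moving.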
Mixed Stack Protocol
We run three standardized stacks:
- Receipts & IDs: 50 mixed items, various sizes and conditions
- Office Batch: 100 pages, duplex, mixed letter/legal, occasional staples removed
- Contract Set: 30 pages mixed with tabs and shaded backgrounds
Each run is repeated three times per connection type to capture variance.
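For readers who want to reproduce the repetition scheme, the sketch below lays out the run matrix (stack by connection by repeat) and a simple per-configuration summary. The stack and connection labels mirror this section; the helper names and timings are invented for illustration.

```python
from itertools import product
from statistics import mean, stdev

STACKS = ["receipts_ids", "office_batch", "contract_set"]
CONNECTIONS = ["usb", "ethernet", "wifi"]
REPEATS = 3

def summarize(times_to_digital: dict[tuple[str, str], list[float]]) -> None:
    """Print mean and spread of time-to-digital (seconds) per stack/connection pair."""
    for stack, conn in product(STACKS, CONNECTIONS):
        runs = times_to_digital.get((stack, conn), [])
        if len(runs) == REPEATS:
            print(f"{stack:>13} over {conn:<8} "
                  f"mean {mean(runs):6.1f}s  stdev {stdev(runs):5.1f}s")

# Illustrative timings for one device, one configuration:
summarize({("office_batch", "usb"): [412.0, 398.5, 405.2]})
```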
Profiles and Automation
We create reproducible scan profiles: destination, resolution, color mode, duplex, OCR, naming tokens, and routing rules (client/matter, vendor, project). Profiles must be shareable across users and persistent after reboots/updates.
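As an illustration of what a reproducible profile captures, here is a hypothetical profile expressed as plain data. The field names, tokens, and paths are ours for the example and do not correspond to any particular vendor's schema.

```python
from datetime import date

# Hypothetical scan profile; vendor apps expose similar settings under their own names.
PROFILE = {
    "name": "client-intake-duplex",
    "destination": "onedrive://Clients/{client}/{matter}/Intake",
    "resolution_dpi": 300,
    "color_mode": "auto",          # color when detected, otherwise grayscale
    "duplex": True,
    "ocr": {"language": "en", "output": "pdfa"},
    "naming": "{date:%Y-%m-%d}_{client}_{doctype}_{seq:03d}.pdf",
    "routing_rules": [
        {"match": {"doctype": "receipt"}, "route_to": "Bookkeeping/{vendor}/{date:%Y-%m}"},
        {"match": {"doctype": "contract"}, "route_to": "Legal/{matter}"},
    ],
}

def render_name(profile: dict, **tokens) -> str:
    """Expand naming tokens into a final file name (illustrative only)."""
    return profile["naming"].format(**tokens)

print(render_name(PROFILE, date=date(2024, 5, 1), client="Acme", doctype="contract", seq=7))
# -> 2024-05-01_Acme_contract_007.pdf
```

Expressing a profile as data like this is what makes it shareable across users and easy to verify after a reboot or update.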
Reliability & Longevity
- Driver Stability: Install/upgrade experience, conflicts, and error handling
- Maintenance: Roller/ADF cleaning intervals, cost of consumables, and replacement counters
- Durability: Long-run batches (1,000–5,000 pages) to assess heat, slowdown, and feed wear
- Noise & Footprint: dB at 1m and desk-fit for shared spaces
- Power Use: Idle and active draw during typical runs
Scoring & Reporting
We weight results by user impact: time-to-digital (35%), reliability/jam recovery (25%), OCR/file fidelity (20%), cloud filing success (15%), and maintenance/TCO (5%); a worked scoring example follows the list below. Every review includes:
- Scorecard summary and contextual notes
- Known issues and workarounds
- Best-fit use cases (home office, bookkeeping, legal intake, field scans)
- Compatibility matrix for OS, connections, and cloud
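As a worked example of the weighting above, the sketch below combines per-category scores into an overall number. The weights are the ones stated in this section; the category scores are invented for illustration.

```python
# Weights as stated above; category scores (0-100) below are invented for illustration.
WEIGHTS = {
    "time_to_digital": 0.35,
    "reliability_jam_recovery": 0.25,
    "ocr_file_fidelity": 0.20,
    "cloud_filing_success": 0.15,
    "maintenance_tco": 0.05,
}

def overall_score(scores: dict[str, float]) -> float:
    """Weighted average of per-category scores on a 0-100 scale."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

example = {
    "time_to_digital": 90,
    "reliability_jam_recovery": 80,
    "ocr_file_fidelity": 85,
    "cloud_filing_success": 70,
    "maintenance_tco": 60,
}
print(f"Overall: {overall_score(example):.1f}")  # 82.0 with these invented scores
```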
Retesting and Updates
We retest after major firmware/driver releases or when integrations change. Recommendations shift when the data does. All changes are timestamped in the review’s update log.
Independence
Vendors cannot preview or edit results. Loaners receive the same protocol as purchased units. If a unit is pre-production, we flag it and prioritize retail retest before final ratings.