BestKYC

Methodology

How we evaluate KYC, KYB, AML, and fraud prevention software.

Every few weeks one of us picks software for a new project at work: a bank onboarding flow, an ecommerce fraud stack, a marketplace KYB process. We write down what we learn. Those notes turn into the rankings on this site. The weights change by project. The criteria don't.


What we look at depends on why we're looking. Picking KYC for a bank onboarding flow is not the same as picking an ID check for an ecommerce cart. A regulator-driven project puts compliance above conversion. A fraud-driven project puts it the other way around. So the weights shift. The criteria don't.

These are the seven we always end up back at.

The seven criteria

  1. Voice from the public

    What the practitioners actually using the software say. We read the forums, the LinkedIn posts, the Reddit threads, and we talk to the people in our community who live with these tools every day. A vendor's case studies tell us what the vendor wants us to think. The voice from their customers tells us what to think. We don't use G2 or Gartner star averages — on those sites, the sample is whichever customers the vendor's CSM team contacted with a gift card, not a random one.

  2. Pricing

    Is the price on the website, or is it a phone call. What the minimum commitment is. What the bill actually is at our volume. We have yet to see a ranking change because a vendor had pretty pricing, but we have seen procurement cycles double because pricing took three calls to nail down.

  3. Experience and accuracy

    How the flow feels on a mid-range Android with a bad camera. Drop-off at each step. False rejects, where a real user gets turned away. False accepts, where a fraudster gets through. These are the numbers that decide whether the funnel grows or shrinks.

  4. Sales process

    We get on a call with every vendor and watch how they sell. What gets promised on the call versus what shows up in production. How much of the time is spent on discovery rather than pitching. Whether the rep's answers feel like product knowledge or sales script. How long from first contact to a useful conversation. The first sales call is the first integration test — if a vendor can't accurately describe the limits of their own product in 30 minutes, the next 18 months will be harder than they need to be.

  5. Support and customer service

    How long it takes to get a real person. Whether the support engineer can fix the problem or just takes a ticket. What the vendor does when the webhook goes down at 3am and your onboarding queue backs up behind it. Dedicated CSM vs. a rotating pool.

  6. Integration

    Time from signed contract to first real verification in your production flow. SDK quality. API ergonomics. Sandbox maturity. Whether the webhook payload has the fields you need or you have to make two extra calls to get them.

  7. Compliance

    Whether the vendor lets you pass your regulator's audit. Depending on your jurisdiction that means different things: FCA, BaFin, MAS, FINMA, OCC, OFAC. SOC 2 and ISO 27001 as table stakes. Sanctions, PEPs, and adverse media where they belong. A usable audit trail. Something sensible to say about DORA.
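The accuracy numbers in criterion 03 come down to two ratios. A minimal sketch, with hypothetical counts that aren't from any vendor we've reviewed:

```python
def verification_rates(genuine_passed, genuine_total, fraud_passed, fraud_total):
    """The two rates criterion 03 cares about.

    false_reject_rate: share of real users wrongly turned away.
    false_accept_rate: share of fraudsters who got through.
    """
    false_reject_rate = (genuine_total - genuine_passed) / genuine_total
    false_accept_rate = fraud_passed / fraud_total
    return false_reject_rate, false_accept_rate

# Hypothetical month of production traffic:
# 9,600 of 10,000 real users passed; 12 of 400 fraud attempts got through.
frr, far = verification_rates(9_600, 10_000, 12, 400)
print(f"false rejects: {frr:.1%}, false accepts: {far:.1%}")
# prints: false rejects: 4.0%, false accepts: 3.0%
```

Both rates move together: tighten the checks and false accepts fall while false rejects rise, which is exactly the compliance-versus-conversion trade-off the weights below encode.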

How the weights shift

For a bank onboarding flow, compliance is non-negotiable. If a vendor doesn't pass your regulator's audit, nothing else matters. Experience and accuracy matter a lot. Pricing and support matter enough to break ties.

For an ecommerce fraud stack, it's flipped. Experience and accuracy are non-negotiable — if the flow doesn't convert, the category stops existing. Pricing and support matter a lot. Compliance might not be relevant at all, because no regulator is asking for your audit.
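The mechanics of "same criteria, shifting weights" can be sketched as a weighted score. The scores and weights below are illustrative placeholders, not the ones we actually use:

```python
# Hypothetical 1-10 scores for one vendor across the seven criteria.
scores = {
    "voice_from_the_public": 8,
    "pricing": 6,
    "experience_and_accuracy": 9,
    "sales_process": 7,
    "support": 6,
    "integration": 8,
    "compliance": 9,
}

# Same criteria every time; only the weights move with the project type.
weights = {
    "bank_onboarding": {
        "compliance": 0.30, "experience_and_accuracy": 0.20,
        "voice_from_the_public": 0.15, "integration": 0.10,
        "support": 0.10, "pricing": 0.10, "sales_process": 0.05,
    },
    "ecommerce_fraud": {
        "experience_and_accuracy": 0.30, "pricing": 0.20,
        "voice_from_the_public": 0.15, "support": 0.15,
        "integration": 0.10, "sales_process": 0.10, "compliance": 0.00,
    },
}

def weighted_score(scores, project_weights):
    # Weighted sum over the seven criteria; weights sum to 1.0.
    return sum(project_weights[c] * scores[c] for c in project_weights)

for project, w in weights.items():
    print(f"{project}: {weighted_score(scores, w):.2f}")
```

Note the zero compliance weight in the ecommerce profile: a criterion can drop out of a ranking without dropping out of the methodology.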

Voice from the public shifts least. It's the first signal we look at on every project, because the people already using a vendor are running the experiment you're about to run.

Who ranks

We are the people picking this software. Every ranking has at least one practitioner on the byline — someone who has recently run, or is currently running, the relevant process in production at their day job. Usually a compliance lead, a fraud strategist, or a product owner. Their bio and any conflicts appear at the bottom of every review they touch.

What we don't do

We don't run benchmarks at scale from a lab. We use the numbers from real production flows, from the people running them, and we check the vendor's claims against that. Where we have our own test corpus for a category, we use it — but we won't pretend we have a setup that mirrors your onboarding volume, because we don't.

This page will keep getting refined as we publish more rankings.