The honest 2026 benchmark: across most industries, 60% to 80% of visitors who start a web form never finish it. The exact number depends heavily on form length, traffic source, device, and how you define "started." High-intent forms (booking a demo, requesting a quote) sit at the lower end of that range. Low-intent or long forms (insurance quotes, multi-step checkouts, mortgage applications) sit at the upper end. Anything below 50% is excellent. Anything above 85% is a problem.
How is form abandonment rate calculated?
Form abandonment rate is the share of form starters who do not become form submitters. The formula is straightforward but the definitions matter.
Form abandonment rate = 1 - (form submissions / form starts)

Where:
- form start = the first focus event on any field in the form
- form submission = a successful submit event (validation passed and the server accepted it)
Two common mistakes:
- Counting page views as form starts. A visitor who scrolls past your form without touching it is not an abandoner. Anchor your denominator on the first focus event, not on page view.
- Counting failed submissions as submissions. If validation rejected the form, the visitor did not convert. Count only the submissions that the server actually accepted.
Form abandonment rate is a pure ratio. It does not depend on traffic volume, which makes it directly comparable across pages, but only if you measure it the same way every time.
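The formula can be sketched as a small helper (a minimal illustration; the function name and the guard conditions are our choices, not any tool's API):

```javascript
// Minimal sketch of the formula above. "starts" is the count of
// first-focus events; "submissions" counts only server-accepted submits.
function abandonmentRate(starts, submissions) {
  if (starts === 0) return 0; // no starters, nothing to abandon
  if (submissions > starts) {
    throw new RangeError("submissions cannot exceed starts");
  }
  return 1 - submissions / starts;
}

// 100 visitors focused a field, 40 submitted successfully: 60% abandonment.
console.log(abandonmentRate(100, 40)); // 0.6
```

Note that the guard against `submissions > starts` is worth keeping: if it ever fires, your start and submit events are being counted from different populations, which is exactly the measurement mistake described above.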
What about cart abandonment vs form abandonment?
These are different metrics measured in different ways. The Baymard Institute, which has tracked e-commerce checkout behavior since 2010, reports an average documented online shopping cart abandonment rate of around 70% — averaged across roughly 50 different studies. That number applies to checkouts, which are multi-step funnels with payment friction, and is not directly comparable to a single contact form. We mention it because it is the most-cited number in this space and because it sets the right intuition: the majority is normal.
What are the 2026 form abandonment benchmarks by industry?
A frank caveat first: there is no single, current, statistically rigorous public dataset for form abandonment broken down by industry. The numbers most blog posts cite ultimately trace back to a small handful of studies — chiefly from Formisimo (2014-2019), Zuko's quarterly reports, and Baymard's checkout work. We've built the table below from those public sources plus what we observe in real CloseTrace accounts. Treat the bands as directional, not as a leaderboard.
| Form type | Typical abandonment range | Notes |
|---|---|---|
| Newsletter signup (single field) | 20-40% | Lowest friction; mostly intent failures |
| Contact form (3-5 fields) | 50-70% | The CF7 baseline |
| Demo request / B2B lead form | 55-75% | High intent, but often gated by length |
| Free trial signup | 45-65% | Better than demo because friction is lower |
| Quote / insurance form (multi-step) | 75-90% | Long forms with personal data |
| Checkout (e-commerce) | ~70% | Per Baymard's averaged studies |
| Job application | 60-85% | Highly variable by length and login wall |
| Mortgage / financial application | 80-95% | Longest forms in the dataset |
Sources and method: Baymard Institute for the checkout figure; Zuko's published industry reports for the form-type bands; CloseTrace's own anonymized customer data for the qualitative ranges. Where a public source isn't available we have intentionally given a range, not a precise percentage.
What counts as form "abandonment"? (the definitional problem)
This is where benchmarks get slippery. Different tools count different things. The definition you choose can swing your reported abandonment rate by 20 percentage points without anything actually changing on the site.
The four common definitions, from loosest to strictest:
1. Page view minus conversion. A visitor saw the page, didn't submit. The loosest definition; over-counts by including everyone who never engaged with the form at all.
2. Field focus minus submission. A visitor focused at least one field, didn't submit. The most common definition. This is what most form-analytics tools mean by "form starts."
3. Multiple field focus minus submission. A visitor engaged with two or more fields, didn't submit. A stricter definition that filters out accidental focuses.
4. Filled to last field minus submission. A visitor reached the last field but didn't hit submit. The strictest definition, useful for identifying friction in the final step.
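To make the swing concrete, here is a small sketch that computes the rate under all four definitions from the same set of visits (the visit schema and property names are illustrative, not any particular tool's data model):

```javascript
// Each visit is summarized as:
// { viewed, fieldsFocused, reachedLastField, submitted }
function abandonmentByDefinition(visits) {
  const rate = (isStarter) => {
    const starters = visits.filter(isStarter);
    if (starters.length === 0) return null;
    const submits = starters.filter((v) => v.submitted).length;
    return 1 - submits / starters.length;
  };
  return {
    pageView:   rate((v) => v.viewed),             // definition 1 (loosest)
    firstFocus: rate((v) => v.fieldsFocused >= 1), // definition 2 (most common)
    multiFocus: rate((v) => v.fieldsFocused >= 2), // definition 3
    lastField:  rate((v) => v.reachedLastField),   // definition 4 (strictest)
  };
}

// Eleven visits to the same form: five never touched it,
// four engaged but abandoned, two submitted.
const visits = [
  ...Array(5).fill({ viewed: true, fieldsFocused: 0, reachedLastField: false, submitted: false }),
  ...Array(2).fill({ viewed: true, fieldsFocused: 1, reachedLastField: false, submitted: false }),
  { viewed: true, fieldsFocused: 2, reachedLastField: false, submitted: false },
  { viewed: true, fieldsFocused: 3, reachedLastField: true, submitted: false },
  ...Array(2).fill({ viewed: true, fieldsFocused: 3, reachedLastField: true, submitted: true }),
];
```

On this sample traffic the same form reports roughly 82% abandonment under definition 1, 67% under definition 2, 50% under definition 3, and 33% under definition 4. Nothing about the form changed; only the denominator did.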
When you compare numbers from two tools or two reports, always check the definition. A "55% form abandonment rate" under definition (1) is much worse than a "55%" under definition (3).
What are the five biggest drivers of high form abandonment?
These are the patterns that show up over and over in real session recordings — and the corresponding fixes. None of them are exotic; the value is in actually doing them.
1. The form is too long for the value exchange
The classic mismatch. Asking for 14 fields in exchange for "let us send you a PDF" almost guarantees a high drop-off. The fix is not always shorter — it is shorter for the perceived value. If you genuinely need 14 fields, raise the perceived value (custom report, real human callback) or split them across steps so the visitor can see progress.
2. A specific field is broken or confusing
The single highest-leverage thing you can find. One field with a 35% error rate or a 60-second average time-on-field will destroy the funnel by itself. Use the funnel drop-off rate to surface it, and watch a few replays to see why it's confusing. Common culprits: phone number formats, date pickers, password rules, and country dropdowns.
3. Validation fires at the wrong time
Inline validation that fires while the visitor is still typing — "this is not a valid email" before they've finished typing — is a known abandonment driver. Validate on blur, not on every keystroke. The exception is positive feedback ("password meets requirements"), which is fine on input.
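A minimal sketch of the blur-not-keystroke pattern (the field id, the error element id, and the deliberately simple email regex are all assumptions about your markup, not a spec-complete validator):

```javascript
// Deliberately simple check: "something@something.tld". Not RFC-complete.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

function isValidEmail(value) {
  return EMAIL_RE.test(value.trim());
}

// Browser wiring: negative feedback on blur, never mid-keystroke.
if (typeof document !== "undefined") {
  const field = document.querySelector("#email");       // assumed field id
  const error = document.querySelector("#email-error"); // assumed error element
  field.addEventListener("blur", () => {
    // Show the error only after the visitor leaves the field...
    error.hidden = isValidEmail(field.value);
  });
  field.addEventListener("input", () => {
    // ...but clear it the moment the visitor fixes it.
    if (isValidEmail(field.value)) error.hidden = true;
  });
}
```

The asymmetry is the point: errors appear on blur, but disappear on input, so the visitor is never scolded mid-typing and never left staring at a stale error.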
4. The submit button doesn't look like a submit button
Sounds trivial. Isn't. We routinely see dead-click and rage-click heatmaps cluster on disabled-looking submit buttons, on submit buttons that scrolled off-screen on mobile, and on buttons with vague labels like "Continue" when the visitor expected "Submit" or "Send." Make the action obvious, sticky on mobile, and high-contrast.
5. A modal, cookie banner, or sticky element is covering the form
Especially on mobile. A cookie banner that occupies the bottom 30% of the viewport hides the submit button on small screens. A live-chat widget covers the country dropdown. The visitor doesn't file a bug — they just leave. A scroll heatmap plus a few mobile replays will surface this in minutes.
How do long forms compare to short forms?
The intuition is right: short forms convert better than long ones. But the size of the effect is often overstated.
- A single-field email signup typically converts 3-5x better than a 10-field demo form. That is a real, large effect.
- A 5-field form vs a 7-field form is much smaller — single-digit percentage points in most cases. The marginal field matters less than which fields you choose and how you ask.
- Multi-step forms ("conversational forms") typically beat single-page long forms of the same total length, because the visitor sees progress and can't see the full ask up front. The lift is usually meaningful but not magic — somewhere in the 10-30% range based on most published case studies.
- Optional vs required matters more than the field count. A required phone number tanks B2B forms more than adding two optional fields would.
The practical advice: don't obsess over removing fields. Obsess over removing required fields that aren't load-bearing for your sales process.
How do you measure your own form abandonment rate?
Three options, in order of effort:
- Hand-rolled JavaScript — listen for the first focus event and the submit event, post both to your analytics tool, and divide. This is what we walk through in how to track form abandonment in Contact Form 7. Cheap, accurate, and you own the data.
- A form-analytics tool — Zuko, Mouseflow's form module, Insiteful, CloseTrace. These give you the metric out of the box, plus field-level drop-off and a funnel view. The differences between these tools and pure session replay are covered in form analytics vs session replay.
- A general analytics tool with custom events — GA4, Heap, PostHog. Workable but more configuration: you have to define `form_start` and `form_submit` events yourself and build the funnel manually.
Whichever you pick, instrument the metric the same way across all your forms so the numbers are comparable.
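The hand-rolled option could be sketched like this (the endpoint URL, the form selector, and the event names are placeholders for whatever your analytics stack expects):

```javascript
// Core logic, kept separate from the DOM so it can be tested anywhere.
function createFormTracker(send) {
  let started = false;
  return {
    // Call on every focus inside the form; only the first counts as a start.
    onFieldFocus() {
      if (started) return false;
      started = true;
      send({ event: "form_start" });
      return true;
    },
    // Call only after the server has accepted the submission;
    // failed validation must not count as a conversion.
    onAcceptedSubmit() {
      send({ event: "form_submit" });
    },
  };
}

// Browser wiring (illustrative):
if (typeof document !== "undefined") {
  const send = (payload) =>
    navigator.sendBeacon("/analytics", JSON.stringify(payload)); // placeholder endpoint
  const tracker = createFormTracker(send);
  const form = document.querySelector("#contact-form"); // assumed selector
  form.addEventListener("focusin", () => tracker.onFieldFocus());
  // In your submit success handler, after the server returns 2xx:
  // tracker.onAcceptedSubmit();
}
```

Dividing accepted `form_submit` events by `form_start` events in your analytics tool then gives the rate from the formula above.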
Frequently asked questions
What is a good form abandonment rate?
It depends on the form type. A newsletter signup with anything above 60% abandonment is a problem. A 5-field contact form sitting around 50-60% is healthy. A long quote or insurance form sitting at 75% is normal. The right benchmark is your own form measured the same way over time — if it's improving, you're winning, regardless of the absolute number.
Why is the average form abandonment rate so high?
Two reasons. First, the denominator includes anyone who focused a field for any reason — including curious or accidental focuses. Second, web forms compete with infinite alternatives; a visitor can leave at any moment with one tap. The base rate of abandonment in a frictionless medium is just high. The job is not to drive it to zero; it is to recover the recoverable share.
How is form abandonment different from bounce rate and exit rate?
Bounce rate measures whether the visitor left after a single page view. Exit rate measures whether the visitor's session ended on a specific page. Form abandonment measures whether a visitor who started a form did not submit it. A page can have a low bounce rate (visitors stay and explore) and still have a high form abandonment rate (the form itself is broken).
Does mobile have a higher abandonment rate than desktop?
Almost always, yes. Public reports from Zuko, Baymard, and others consistently show mobile abandonment running roughly 5-15 percentage points higher than desktop on the same forms, driven by smaller viewports, awkward keyboards, autofill conflicts, and overlay UI covering the submit button. If you're not measuring abandonment by device, start.
Is cart abandonment the same as form abandonment?
No. Cart abandonment is a checkout-funnel metric that includes payment friction, shipping calculations, account creation, and trust signals. Form abandonment is narrower — it measures a single form. Baymard's widely-cited ~70% cart abandonment number is not a form benchmark. Use it for checkout work and form benchmarks for everything else.
How do I find which field is causing the abandonment?
Two complementary tools. A field-level form-analytics report tells you which field has the highest drop-off, the highest error rate, and the longest time-on-field. Session replays of those abandoning sessions tell you why — the broken date picker, the confusing label, the disabled submit button. Form analytics finds the field; replay explains the field. Both together is the fastest path to a fix. See why session replay matters for lead recovery for the connection to revenue.
Where can I find a current public dataset on form abandonment?
There isn't one comprehensive public dataset. The closest things are Baymard Institute's checkout abandonment list, Zuko's quarterly form-analytics reports, and academic studies on individual form designs. Treat any single source as one input, not as the definitive number — and weight your own data more heavily than any benchmark.
The bottom line
Form abandonment is one of the most under-measured metrics in marketing analytics. The top-line benchmark — 60% to 80% — is high enough to be uncomfortable, and consistent enough across industries that it is worth measuring on every form you own. The goal is not to hit a magic number. The goal is to find the field that is killing your specific form and fix it, then find the next one. To see the framework CloseTrace uses for that, see form abandonment: the quiet revenue killer.