White paper · May 2026

Under the Surface

What 15,542 Public Website Scans Reveal About Hidden Website Quality Signals

Abstract

Most business owners evaluate their websites by what they can see: the design, the logo, the colors, the homepage layout, and whether the site appears to load when they open it. Those visible signals matter, but they do not tell the full story.

The Website Quality Report examined 15,542 public websites to identify broad measurable patterns across overall site quality, performance, mobile experience, load time, accessibility issues, and structural health. Of those sites, 15,541 scans completed successfully and 1 scan failed. The average overall Site Score was 60.45, the median Site Score was 61, and only 49 completed scans reached a Site Score of 90 or higher.

The findings show a clear pattern: many public websites that appear operational on the surface are underperforming across measurable quality signals that affect usability, accessibility, crawlability, mobile experience, and technical structure. This does not prove why any individual website performed poorly. It does not prove who built the website, how much was paid for it, or whether that website produces business results. It does show that hidden website quality problems are common across a large sample of public sites.

This white paper explains what was measured, what the results show, what the results do not prove, and why business owners, marketers, developers, agencies, journalists, and researchers should treat website quality as infrastructure rather than decoration.


Executive Summary

The Website Quality Report scanned 15,542 public websites. The scan focused on measurable quality signals rather than subjective design opinions. It looked at the parts of a website that are often invisible to business owners, including performance, mobile experience, accessibility signals, technical structure, crawlability, SEO metadata, schema, security signals, infrastructure, media handling, and client-side health.

The results were concentrated in the middle and lower score ranges. Only 49 out of 15,541 completed scans scored 90 or higher. That means 99.68% of completed scans scored below 90.

The average overall Site Score was 60.45. The median Site Score was 61. The highest measured Site Score was 96. The lowest measured Site Score was 19.

Load time was one of the clearest issues. The average measured load time was 7.19 seconds. A total of 9,726 completed scans, or 62.60%, took more than 5 seconds to load. A total of 3,943 completed scans, or 25.38%, took 10 seconds or longer.

Accessibility issues were also common. A total of 12,081 completed scans, or 77.74%, had at least one detectable accessibility issue. A total of 3,412 completed scans, or 21.96%, had 10 or more detectable accessibility issues.

Mobile performance showed broad weakness. A total of 9,814 completed scans, or 63.15%, had a mobile score under 70.

Structure health also showed measurable weakness. A total of 4,924 completed scans, or 31.68%, had structure health under 70.

The central finding is not that every website in the dataset was unusable, offline, or visibly broken. The central finding is that many live, public websites were weak across measurable quality signals that business owners often do not know how to inspect.

A website can look acceptable and still load slowly. It can have a modern design and still perform poorly on mobile. It can have a clean homepage and still be missing structural signals that help search engines understand it. It can be live and still create friction for users, search engines, and assistive technology.

That is the gap this report identifies.


1. Purpose of the White Paper

This white paper provides a formal, defensible explanation of the issue revealed by the Website Quality Report dataset.

The purpose is to:

  1. Explain what was measured.
  2. Summarize the key findings.
  3. Interpret what the results mean for business owners and website decision makers.
  4. Clarify what the dataset does not prove.
  5. Provide a practical framework for evaluating website quality beyond visual design.
  6. Support citation, discussion, and further analysis of broad website quality patterns.

This white paper is not intended to shame individual businesses, call out vendors, assign blame to agencies, or claim that any one factor alone determines rankings, revenue, or business outcomes.

The report should be read as a broad point-in-time analysis of measurable website quality signals across a large public dataset.


2. The Problem: Business Owners Are Judging Websites by the Wrong Finish Line

Most business owners judge a website by what they can see.

They open the homepage. They look at the logo, the colors, the photos, the layout, and whether the design feels newer than what they had before. They may check whether the phone number is visible, whether the contact form exists, and whether the site loads on their own computer.

That process is understandable. Design is visible. Performance architecture is not. Mobile scoring is not obvious at a glance. Structured data is invisible. Crawl paths are invisible. Compression headers are invisible. Accessibility issues may not be obvious to someone who does not rely on assistive technology. Metadata, schema, cache headers, indexability, heading hierarchy, robots.txt, sitemap quality, internal linking, and client-side errors are all easy to miss if the evaluation stops at visual approval.

That creates a practical problem.

A website can pass the visual test and fail the infrastructure test.

It can look fine while still loading slowly, working poorly on mobile, missing basic SEO signals, carrying detectable accessibility issues, using inefficient media, depending too heavily on JavaScript, or making it harder for search engines to crawl and understand the site.

The Website Quality Report was created to look below the surface.


3. Dataset Overview

The Website Quality Report is based on a point-in-time scan of 15,542 public websites.

Scan overview

Scan overview statistics
Metric                           Result
Total websites scanned           15,542
Completed scans                  15,541
Failed scans                     1
Highest overall Site Score       96
Lowest overall Site Score        19
Average overall Site Score       60.45
Median overall Site Score        61
Websites scoring 90 or higher    49
Websites scoring below 90        15,492

Completed scans were included in score analysis. The failed scan was retained in the dataset but excluded from score distribution calculations.

The public dataset removes business names and phone numbers. It keeps website URL, city, scan ID, scan timestamp, and scan metrics.


4. What Was Measured

The scan measured website quality signals across multiple categories. The goal was not to judge visual design. The goal was to evaluate measurable signals that affect how a website performs, how it works for visitors, and how clearly it presents itself to search engines and crawlers.

Measured areas included:

  1. Overall Site Score
  2. Performance score
  3. Mobile score
  4. Load time
  5. Accessibility issues
  6. Structure health
  7. Media handling
  8. SEO core signals
  9. Schema and structured data
  10. Document structure
  11. Heading structure
  12. Crawlability
  13. URL integrity
  14. Indexing signals
  15. Performance optimization
  16. Internal linking and navigation
  17. Infrastructure and server signals
  18. Security signals
  19. Social metadata
  20. Client-side health

The score is best understood as a summary of measurable website health across many signals. A single missed check does not necessarily mean a website is broken. The broader pattern matters when multiple basic signals are missing, weak, or underperforming.
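
To make these categories concrete, the sketch below shows how a handful of the basic on-page signals, such as the title tag, meta description, canonical tag, viewport, and language declaration, could be spot-checked for a single URL. It is a minimal Python illustration rather than the scanning system used for the report, and it assumes the third-party requests and beautifulsoup4 packages; example.com is a placeholder URL.

    # Minimal sketch: spot-check a few on-page signals for one URL.
    # Assumes third-party packages: requests, beautifulsoup4.
    import requests
    from bs4 import BeautifulSoup

    def basic_signals(url: str) -> dict:
        resp = requests.get(url, timeout=30)
        soup = BeautifulSoup(resp.text, "html.parser")
        html_tag = soup.find("html")
        return {
            "status_code": resp.status_code,
            "https": url.startswith("https://"),
            "title_present": soup.find("title") is not None,
            "meta_description_present": soup.select_one('meta[name="description"]') is not None,
            "canonical_present": soup.select_one('link[rel="canonical"]') is not None,
            "viewport_present": soup.select_one('meta[name="viewport"]') is not None,
            "lang_declared": bool(html_tag and html_tag.get("lang")),
        }

    if __name__ == "__main__":
        print(basic_signals("https://example.com"))  # example.com is a placeholder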

The full list of measured checks appears in Appendix A.


5. Methodology Summary

The dataset summarizes a point-in-time scan of public websites. Website performance and technical measurements can change over time due to server response, network conditions, third-party scripts, redirects, caching, temporary outages, and live site changes.

The scan measured multiple quality indicators across performance, mobile experience, accessibility, structure, SEO, crawlability, indexing, infrastructure, security, and client-side health.

A small validation sample was rescanned using a second scanning system. The rescans showed directional consistency with the original results, with most sampled scores falling within a small variance range. One larger variance was observed, which is consistent with live website testing where load conditions and third-party dependencies can change between scans.

Methodological guardrails

The dataset does not prove why a website performed poorly. It does not prove whether a site was professionally built. It does not prove how much was paid for a site. It does not prove who built any specific site. It does not prove whether any specific site produces leads, calls, sales, or revenue. It does not assign blame to individual businesses, agencies, freelancers, platforms, or vendors.

The purpose is to identify broad measurable patterns across a large sample of public websites.


6. Key Finding 1: Very Few Websites Reached the Highest Scoring Range

Only 49 out of 15,541 completed scans reached a Site Score of 90 or higher. That represents approximately 0.32% of completed scans.

A total of 15,492 completed scans scored below 90. That represents 99.68% of completed scans.

This does not mean every website below 90 was useless or completely broken. It means the overwhelming majority of completed scans fell short of the highest quality range within this scoring model.

Site Score distribution

Site Score distribution across completed scans
Score range    Websites    Share of completed scans
0–30           128         0.82%
30.01–40       673         4.33%
40.01–50       1,949       12.54%
50.01–60       4,761       30.64%
60.01–70       5,305       34.14%
70.01–80       2,262       14.56%
80.01–90       426         2.74%
90.01–100      37          0.24%

The completed scans were concentrated between 50 and 70. A total of 10,066 completed scans fell between 50.01 and 70, representing 64.78% of completed scans.

This concentration matters because it suggests that the typical scanned website was neither catastrophically broken nor highly optimized. The more common pattern was a site that appeared to operate acceptably on the surface while carrying measurable weaknesses below it.


7. Key Finding 2: Load Time Was a Widespread Issue

The average measured load time was 7.19 seconds. The median measured load time was 6.51 seconds.

A total of 9,726 completed scans, or 62.60%, took more than 5 seconds to load.

A total of 3,943 completed scans, or 25.38%, took 10 seconds or longer.

Load time distribution

Load time distribution across completed scans
Load time range    Websites    Share of completed scans
Under 2 seconds    1,882       12.11%
2–3 seconds        1,125       7.24%
3–5 seconds        2,792       17.97%
5–10 seconds       5,799       37.32%
10+ seconds        3,943       25.38%

Load time is one of the easiest hidden issues to underestimate. A site may load acceptably on the business owner’s desktop, on a fast connection, or after browser caching has already stored parts of the page. That does not mean the same site loads well for a first-time visitor on mobile, over a weaker connection, or through a crawler evaluating page performance.
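
Full page load, as measured in this report, depends on every resource the page requests. As a rough, hedged proxy, the sketch below times only the initial HTML response, which ignores images, scripts, fonts, and rendering and will usually understate what a real visitor experiences. It assumes the third-party requests package.

    # Rough proxy only: times the initial HTML response, not a full browser page load.
    # Assumes the third-party requests package.
    import time
    import requests

    def html_fetch_timing(url: str) -> dict:
        start = time.perf_counter()
        resp = requests.get(url, timeout=30)
        total = time.perf_counter() - start
        return {
            "status_code": resp.status_code,
            "time_to_headers_s": round(resp.elapsed.total_seconds(), 2),  # request sent to headers parsed
            "html_download_s": round(total, 2),                           # includes the HTML body
            "html_size_kb": round(len(resp.content) / 1024, 1),
        }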

Slow load time creates friction. It can delay the first impression, increase user frustration, and weaken the experience before the visitor reads a single sentence.

For local businesses, this matters because search visits are often high-intent. Someone searching for a roofer, plumber, accountant, attorney, lawn care company, auto shop, restaurant, or medical provider is often trying to make a decision. If the site is slow, confusing, or hard to use, the visitor does not need to complain. They can simply leave and choose another result.


8. Key Finding 3: Detectable Accessibility Issues Were Common

A total of 12,081 completed scans, or 77.74%, had at least one detectable accessibility issue.

A total of 3,412 completed scans, or 21.96%, had 10 or more detectable accessibility issues.

Accessibility issue distribution

Distribution of detectable accessibility issues across completed scans
Detected accessibility issues    Websites    Share of completed scans
0 issues                         3,444       22.18%
1–5 issues                       6,737       43.40%
6–10 issues                      2,260       14.56%
11–20 issues                     1,778       11.45%
21+ issues                       1,306       8.41%

These measurements identify detectable issues. They do not represent a complete manual accessibility audit and should not be interpreted as the full lived experience of every visitor.

Even with that limitation, the pattern is important. Accessibility problems often overlap with broader usability problems. Missing alt text, unlabeled form inputs, buttons without accessible text, missing language declarations, and related issues can affect visitors using assistive technology, but they can also reflect weak development discipline across the site.
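
The specific issues named above are among the easier ones to detect automatically. The sketch below counts a few of them on a single page; it is a simplified illustration rather than the report's scanner, and it assumes the third-party requests and beautifulsoup4 packages.

    # Minimal sketch: count a few detectable accessibility issues on one page.
    # Assumes third-party packages: requests, beautifulsoup4.
    import requests
    from bs4 import BeautifulSoup

    def accessibility_spot_check(url: str) -> dict:
        soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")

        images_missing_alt = [img for img in soup.find_all("img") if not img.get("alt")]

        # Simplified: only matches <label for="..."> associations, not labels that wrap their input.
        label_targets = {lbl.get("for") for lbl in soup.find_all("label") if lbl.get("for")}
        unlabeled_inputs = [
            inp for inp in soup.find_all("input")
            if inp.get("type") not in ("hidden", "submit", "button")
            and inp.get("id") not in label_targets
            and not inp.get("aria-label")
        ]

        empty_buttons = [
            btn for btn in soup.find_all("button")
            if not btn.get_text(strip=True) and not btn.get("aria-label")
        ]

        html_tag = soup.find("html")
        return {
            "images_missing_alt": len(images_missing_alt),
            "inputs_without_labels": len(unlabeled_inputs),
            "buttons_without_text": len(empty_buttons),
            "lang_declared": bool(html_tag and html_tag.get("lang")),
        }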

Accessibility should not be treated as a separate legal checkbox after the website is done. It should be part of building a site that works clearly for more people, more devices, and more user contexts.


9. Key Finding 4: Mobile Scores Were Weak Across a Majority of Completed Scans

A total of 9,814 completed scans, or 63.15%, had a mobile score under 70.

The average mobile score was 66.88. The median mobile score was 66.

This matters because mobile experience is not a secondary version of the website. For many business categories, mobile is the primary experience. Customers search from phones, compare options from phones, read reviews from phones, tap to call from phones, request quotes from phones, and make decisions from phones.

A site can look acceptable on a desktop monitor while being slow, cramped, unstable, or difficult to use on mobile. A mobile score under 70 does not automatically explain a site’s business performance, but it does indicate measurable weakness in a user environment that is central to modern website behavior.

Mobile weakness can include performance problems, layout instability, poor responsive behavior, horizontal overflow, weak navigation, oversized media, excessive scripts, or interface elements that do not work well on smaller screens.

For business owners, the practical takeaway is simple: approving a homepage on a laptop is not the same as validating the customer experience on a phone.


10. Key Finding 5: Structure Health Was Often Weaker Than the Visible Design Suggested

A total of 4,924 completed scans, or 31.68%, had structure health under 70.

The average structure health score was 72.60. The median structure health score was 76.

Structure health distribution

Structure health score distribution across completed scans
Structure health range    Websites    Share of completed scans
0–30                      0           0.00%
30.01–50                  944         6.10%
50.01–70                  4,346       28.07%
70.01–90                  10,153      65.59%
90.01–100                 40          0.26%

Structure health is one of the clearest examples of a quality signal business owners usually cannot judge visually.

Structure includes the underlying organization and technical clarity of a page. It can include document validity, semantic HTML, heading structure, metadata, schema, canonical tags, navigation links, crawlability, sitemap availability, indexability, internal linking, and whether the site gives machines a clean way to understand what the content is and how the pages relate to each other.
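
Heading structure is one of the simpler components in that list to inspect directly. The sketch below counts H1 tags and flags skipped heading levels on a single page, again as a minimal illustration assuming the third-party requests and beautifulsoup4 packages.

    # Minimal sketch: check H1 usage and heading level jumps on one page.
    # Assumes third-party packages: requests, beautifulsoup4.
    import requests
    from bs4 import BeautifulSoup

    def heading_structure(url: str) -> dict:
        soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
        levels = [int(h.name[1]) for h in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])]
        skipped = [
            (prev, cur) for prev, cur in zip(levels, levels[1:])
            if cur > prev + 1  # e.g. an H2 followed directly by an H4
        ]
        return {"h1_count": levels.count(1), "heading_sequence": levels, "skipped_levels": skipped}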

A weak structure does not always make the homepage look bad. That is the problem. A visually acceptable page can still be harder for search engines to understand, harder for users to navigate, and harder for future systems to analyze.

As search increasingly depends on structured understanding, website structure becomes more important, not less.


11. Key Finding 6: The Problem Was Not Isolated to One Metric

The findings were not driven by one weak category alone. The dataset showed measurable issues across several different areas:

Key threshold findings across completed scans
Threshold finding                            Count     Share of completed scans
Sites with load time over 5 seconds          9,726     62.60%
Sites with load time 10 seconds or longer    3,943     25.38%
Sites with 1+ accessibility issue            12,081    77.74%
Sites with 10+ accessibility issues          3,412     21.96%
Sites with structure health under 70         4,924     31.68%
Sites with mobile score under 70             9,814     63.15%
Sites with performance score under 70        4,026     25.91%

This is important because website quality problems are often discussed in isolated categories. Speed is treated as one issue. SEO is treated as another. Accessibility is treated as another. Mobile is treated as another. Structure is treated as another.

In practice, these signals overlap. A slow site can weaken the customer experience. A weak mobile site can reduce conversion opportunities. Accessibility issues can also be usability issues. Poor structure can make content harder to crawl, interpret, and organize. Missing metadata can weaken how pages appear and are understood. Broken scripts can interfere with navigation, forms, layout, and tracking.

The business impact is not created by one technical category. It is created by the combined friction.


12. What “Broken” Means in This Context

The word “broken” can be misunderstood.

In this context, broken does not mean every website was offline. It does not mean the homepage failed to open. It does not mean every site was throwing an obvious server error. It does not mean every business had a useless website.

The more accurate interpretation is that many sites were broken under the surface in measurable ways. That can mean:

  1. The site loads too slowly.
  2. The mobile experience is weak.
  3. The page has detectable accessibility issues.
  4. The site structure is incomplete or inefficient.
  5. Search engines may have less clarity than they should.
  6. Important metadata may be missing.
  7. Images may be poorly optimized.
  8. Caching or compression may be missing.
  9. Crawlability signals may be incomplete.
  10. Schema may be missing or invalid.
  11. Navigation may be difficult for crawlers or users.
  12. Scripts may create errors or unnecessary overhead.
  13. Security headers may be missing.
  14. The page may be visually acceptable but technically weak.

A website does not need to be offline to cost a business attention, trust, visibility, and calls.

A site can be live and still underperform in ways that matter.


13. Why Website Quality Affects Search Visibility

Search visibility depends on more than keywords.

A search engine has to discover a site, crawl it, understand the content, interpret the structure, identify which pages should be indexed, evaluate the experience, and decide when those pages may be relevant to a query.

No single score in this report proves that a specific site ranks poorly because of a specific technical issue. Search systems are complex, and rankings depend on many factors.

However, the measured signals in this report are connected to the basic requirements of a search-friendly website:

  1. The site should load efficiently.
  2. The site should work well on mobile.
  3. The site should be crawlable.
  4. The site should be indexable where appropriate.
  5. The page structure should be clear.
  6. Titles, descriptions, canonical tags, and metadata should be present and useful.
  7. Structured data should be valid where used.
  8. Internal links should help users and crawlers understand relationships between pages.
  9. Images and scripts should not create unnecessary drag.
  10. The technical foundation should not fight the content.

That is why website quality matters. It does not replace content, brand, links, reviews, proximity, authority, relevance, or business reputation. It supports the system that allows those assets to work.

A strong website gives search engines fewer obstacles and gives visitors fewer reasons to leave.
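
As one illustration of the crawlability requirements listed above, the sketch below checks whether robots.txt and sitemap.xml respond at their conventional locations for a given origin. It is a coarse spot check rather than a crawl, it assumes the third-party requests package, and sitemaps declared at other locations in robots.txt would be missed.

    # Minimal sketch: check robots.txt and sitemap.xml at their conventional paths.
    # Assumes the third-party requests package.
    from urllib.parse import urljoin
    import requests

    def crawl_signals(origin: str) -> dict:
        results = {}
        for name in ("robots.txt", "sitemap.xml"):
            try:
                resp = requests.get(urljoin(origin, "/" + name), timeout=30)
                results[name] = {"status_code": resp.status_code, "reachable": resp.status_code == 200}
            except requests.RequestException as exc:
                results[name] = {"error": str(exc)}
        return results

    # Usage: crawl_signals("https://example.com")  # placeholder origin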


14. Why Website Quality Affects Conversion

Website quality is not only a ranking issue. It is also a conversion issue.

A website can attract a visitor and still lose the opportunity because the experience creates friction.

Friction can come from slow loading, unclear navigation, poor mobile usability, inaccessible form controls, confusing structure, missing trust signals, broken scripts, layout problems, weak calls to action, or contact paths that are difficult to complete.

For local service businesses, this matters because many visitors are not casually browsing. They are trying to solve a problem. They need a quote. They need a repair. They need an appointment. They need pricing clarity. They need proof they can trust the business. They need to call, submit a form, book, or decide who to contact first.

If the site slows them down, confuses them, or fails to work properly on their device, the business may lose the lead without ever knowing it happened.

That is why website quality should be evaluated as business infrastructure. The site is not just a marketing asset. It is part of the customer’s path to action.


15. The Gap Between Visual Approval and Technical Validation

The website industry has trained many business owners to approve websites visually.

They are shown mockups. They are asked about colors. They are asked what websites they like. They are sold on the appearance of the homepage.

Design matters. A poorly designed website can damage trust. A confusing layout can reduce action. A site that looks outdated can weaken confidence.

But design is not the whole job.

A website is not a digital poster. It is a functional system. It needs to load. It needs to work on phones. It needs to be understandable. It needs to be crawlable. It needs to be structured. It needs to be accessible. It needs to guide visitors toward action. It needs to be maintained after launch.

The visual layer should support that system. It should not distract from the absence of it.

The Website Quality Report shows that the visible layer is not enough. Many sites that appear live and acceptable still carry measurable weaknesses in the parts business owners rarely inspect.


16. Implications for Business Owners

The main lesson for business owners is not to panic. It is to change the standard.

A website should not be judged only by whether it looks better than the old one. A stronger evaluation should ask:

  1. Does the site load quickly for first-time visitors?
  2. Does it perform well on mobile?
  3. Are images optimized and served in modern formats?
  4. Are title tags, descriptions, canonical tags, and social metadata present?
  5. Is structured data present and valid where appropriate?
  6. Does the site have a sitemap?
  7. Can important pages be crawled?
  8. Are there broken internal links or redirect chains?
  9. Does the site use clear heading structure?
  10. Are forms labeled and usable?
  11. Are buttons accessible?
  12. Does the site avoid obvious client-side errors?
  13. Are compression and caching configured?
  14. Are security headers present where appropriate?
  15. Does the site have a clear internal linking structure?
  16. Does the site make it easy for visitors to call, request help, or take the next step?

The practical takeaway is simple: do not treat visual approval as final approval.

Before accepting that a website is “done,” business owners should ask for evidence that the site works under the surface.


17. Implications for Agencies, Developers, and Website Providers

For agencies, developers, freelancers, and website providers, the report points to a responsibility gap.

Business owners often do not know how to inspect technical quality. That means providers carry the responsibility to build, test, explain, and maintain the parts clients cannot see.

A credible website build should not stop at visual delivery. It should include:

  1. Performance testing.
  2. Mobile testing.
  3. Accessibility checks.
  4. Image optimization.
  5. Metadata validation.
  6. Schema validation.
  7. Crawlability review.
  8. Sitemap verification.
  9. Indexability review.
  10. Heading structure review.
  11. Internal linking review.
  12. Redirect review.
  13. Security header review.
  14. Compression and caching review.
  15. Error checking.
  16. Post-launch monitoring.

The best providers already treat these items as part of the job. The report suggests that, across a large public sample, that standard is not consistently reflected in the websites that were scanned.

That does not mean every provider is careless. It does mean the market needs clearer standards.


18. Implications for Marketers and SEO Teams

SEO teams and marketers often inherit websites after they are already built. They are asked to improve rankings, generate leads, increase calls, and fix visibility problems, sometimes while working on a weak technical foundation.

The Website Quality Report reinforces a practical point: SEO cannot be separated from website quality.

Content strategy matters. Keyword strategy matters. Location pages matter. Reviews matter. Links matter. Authority matters. But the technical foundation still matters.

A site with weak speed, mobile issues, poor structure, missing metadata, crawlability problems, or indexing issues forces marketing work to fight uphill.

Marketers should treat technical site quality as a baseline requirement, not an optional developer concern.

Before scaling campaigns, content, ads, or SEO work, teams should understand whether the site itself is helping or hurting the effort.


19. Implications for Journalists and Researchers

The Website Quality Report provides a public dataset intended for citation, analysis, and discussion of broad measurable patterns across public websites.

It should be described carefully.

Recommended framing: “The Website Quality Report is based on a point-in-time scan of 15,542 public websites. It found widespread measurable issues across performance, mobile experience, accessibility signals, structure health, and overall site quality.”

Avoid framing that claims the dataset proves:

  1. Who built any individual website.
  2. Why any individual website performed poorly.
  3. Whether a specific business wasted money.
  4. Whether a specific vendor did poor work.
  5. Whether a specific website fails to produce business results.
  6. Whether one technical issue alone caused a ranking problem.

The strongest use of the dataset is broad pattern analysis, not individual blame.


20. Practical Website Quality Framework

The following framework can be used to evaluate a website beyond visual design.

Layer 1: Access and availability

The site should load reliably, use HTTPS, avoid broken pages, and make key content available without obvious server or access errors.

Layer 2: Speed and performance

The site should load quickly, minimize blocking resources, use compression, serve optimized assets, and avoid unnecessary script weight.

Layer 3: Mobile experience

The site should be usable on mobile devices, avoid horizontal overflow, maintain readable layouts, and make calls to action easy to use.

Layer 4: Accessibility and usability

Images should have appropriate alt text. Forms should have labels. Buttons should have accessible text. The language should be declared. The site should avoid obvious accessibility blockers.

Layer 5: Search clarity

Pages should have useful title tags, meta descriptions, canonical tags, structured data where appropriate, valid metadata, and clear indexability signals.

Layer 6: Crawlability and structure

The site should have a sitemap, accessible robots.txt, crawlable internal links, logical navigation, clean URL behavior, and a structure that helps search engines understand the relationship between pages.

Layer 7: Technical resilience

The site should avoid broken scripts, console errors, redirect chains, mixed content, missing security headers, and unstable page behavior.

Layer 8: Conversion path

The site should make it easy for visitors to understand the business, trust the company, call, submit a form, request a quote, book, or take the next step.

A website that looks good but fails across these layers is not finished. It is decorated.


21. Recommendations

For business owners

Do not approve a website based only on appearance. Ask for measurable proof that the site performs well, works on mobile, avoids obvious accessibility issues, has proper structure, and can be understood by search engines.

Ask for reports that show real technical checks, not vague statements that the site is “optimized.”

Treat the website as infrastructure. It should be monitored, maintained, improved, and tested after launch.

For agencies and developers

Build technical validation into the standard delivery process. Do not make clients ask for things they do not know exist.

Performance, mobile usability, accessibility, crawlability, metadata, structured data, internal linking, caching, compression, and error checks should be part of the build process.

A website should not be considered complete because it visually matches the mockup. It should be considered complete when it performs its job.

For marketers and SEO teams

Audit the foundation before scaling traffic campaigns or SEO content. If the site is slow, weak on mobile, poorly structured, or difficult to crawl, marketing performance may be limited by the site itself.

Treat technical quality as part of growth strategy.

For researchers and media

Use the dataset to discuss broad measurable patterns. Avoid using it to blame individual businesses or vendors.

Describe the results as point-in-time measurements and include the stated limitations when citing the report.


22. Limitations

This white paper relies on the Website Quality Report dataset and supporting files. The scan is a point-in-time measurement and should not be interpreted as a permanent condition for any website.

Website performance can change due to:

  1. Server response.
  2. Network conditions.
  3. Third-party scripts.
  4. Redirect behavior.
  5. Caching.
  6. Temporary outages.
  7. Hosting changes.
  8. CDN behavior.
  9. Plugin or platform changes.
  10. Live site updates.
  11. Content changes.
  12. Image changes.
  13. Script changes.
  14. Security configuration changes.

The dataset does not prove causation. It identifies measurable conditions. The dataset does not prove that a site with a lower score fails as a business tool. The dataset does not prove that a site with a higher score will outrank competitors or produce more leads.

The dataset does not replace a manual audit, user testing, accessibility review, analytics review, conversion analysis, or search performance analysis.

The value of the report is in identifying broad patterns across a large sample of public websites.


23. Conclusion

The Website Quality Report shows a simple but important problem: many public websites are being judged by the part people can see while underperforming in the parts that make the site work.

A website can be live and still weak. It can look modern and still load slowly. It can have a clean homepage and still perform poorly on mobile. It can have attractive images and still miss basic accessibility signals. It can have polished copy and still be difficult for search engines to crawl, understand, and index clearly. It can look finished and still be incomplete as business infrastructure.

The scan of 15,542 public websites found broad measurable weaknesses across site quality, load time, mobile scores, accessibility issues, performance, and structure health. Only 49 completed scans reached a Site Score of 90 or higher. The average overall Site Score was 60.45.

The point is not that every website must be perfect. The point is that visual design is not enough.

  • Business owners need a better standard.
  • Agencies and developers need to deliver the parts clients cannot see.
  • Marketers need to stop treating technical quality as separate from growth.
  • Researchers and media need to interpret the dataset carefully and use it to discuss broad patterns, not individual blame.

A strong website should load fast, work well on mobile, be accessible, be structured clearly, be understandable to search engines, and make it easy for visitors to take action.

That is not extra. That is the job.


Appendix A: Full Scoring Criteria Summary

The Website Quality Report score was based on measurable website quality signals across performance, media handling, accessibility, security, SEO and schema, document structure, crawlability, indexing, mobile optimization, infrastructure, and client-side health.

1. Core performance metrics

  • Time to Interactive (TTI)
  • First Contentful Paint (FCP)
  • Speed Index
  • Google PageSpeed Mobile and Desktop scores
  • Largest Contentful Paint (LCP)
  • Interaction to Next Paint / Total Blocking Time proxy
  • Cumulative Layout Shift (CLS)
  • Time to First Byte (TTFB)
  • Server response time

2. Image and media checks

  • Use of modern formats such as WebP and SVG
  • Image compression level
  • Responsive images using srcset
  • Lazy loading enabled
  • Alt text presence
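
A hedged sketch of how these image delivery hints could be spot-checked on one page follows; it relies on naive filename and attribute checks and assumes the third-party requests and beautifulsoup4 packages.

    # Minimal sketch: spot-check image delivery hints on one page (naive attribute checks).
    # Assumes third-party packages: requests, beautifulsoup4.
    import requests
    from bs4 import BeautifulSoup

    def image_checks(url: str) -> dict:
        soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
        imgs = soup.find_all("img")
        return {
            "images": len(imgs),
            "modern_format_refs": sum(1 for i in imgs if (i.get("src") or "").lower().endswith((".webp", ".svg"))),
            "with_srcset": sum(1 for i in imgs if i.get("srcset")),
            "lazy_loaded": sum(1 for i in imgs if i.get("loading") == "lazy"),
            "missing_alt": sum(1 for i in imgs if not i.get("alt")),
        }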

3. Accessibility checks

  • Alt text on images
  • Form inputs have labels
  • Buttons have accessible text
  • No critical accessibility blockers
  • Language declared on the html element

4. Security checks

  • HTTPS enabled
  • HTTP to HTTPS redirect
  • No mixed content
  • Content Security Policy
  • HSTS header
  • X-Content-Type-Options
  • X-Frame-Options
  • Referrer-Policy
  • Permissions-Policy
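
A minimal sketch of how the presence of these response headers could be checked for a single URL, assuming the third-party requests package:

    # Minimal sketch: report which common security headers a response includes.
    # Assumes the third-party requests package.
    import requests

    SECURITY_HEADERS = [
        "Strict-Transport-Security",
        "Content-Security-Policy",
        "X-Content-Type-Options",
        "X-Frame-Options",
        "Referrer-Policy",
        "Permissions-Policy",
    ]

    def security_headers(url: str) -> dict:
        headers = requests.get(url, timeout=30).headers  # case-insensitive mapping
        return {name: name in headers for name in SECURITY_HEADERS}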

5. SEO core signals

  • Title tag present
  • Title length optimized
  • Meta description present
  • Canonical tag present
  • Canonical URL validity
  • Open Graph title, description, and image
  • Twitter card metadata
  • Structured data present
  • Structured data validity using JSON-LD
  • Number of schema types
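
The structured data items above can be spot-checked by parsing JSON-LD blocks, as in the sketch below. It only confirms that JSON-LD parses and records which @type values appear, not full schema validity, and it assumes the third-party requests and beautifulsoup4 packages.

    # Minimal sketch: list JSON-LD schema types found on one page (no schema validation).
    # Assumes third-party packages: requests, beautifulsoup4.
    import json
    import requests
    from bs4 import BeautifulSoup

    def jsonld_types(url: str) -> dict:
        soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
        blocks = soup.find_all("script", type="application/ld+json")
        types, invalid = [], 0
        for block in blocks:
            try:
                data = json.loads(block.string or "")
            except json.JSONDecodeError:
                invalid += 1
                continue
            items = data if isinstance(data, list) else [data]
            types.extend(item.get("@type") for item in items if isinstance(item, dict))
        return {"jsonld_blocks": len(blocks), "invalid_blocks": invalid, "types": types}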

6. Document structure

  • DOCTYPE declared
  • Valid HTML structure
  • UTF-8 charset
  • Viewport meta tag
  • html, head, and body presence

7. Heading structure

  • H1 tag present
  • Only one H1
  • Logical heading hierarchy

8. Crawlability

  • robots.txt present
  • robots.txt accessible
  • robots.txt not blocking crawling
  • sitemap.xml present
  • Sitemap accessible
  • Sitemap contains valid URLs
  • Sitemap URLs return 200

9. URL integrity

  • Internal links crawlable
  • No broken internal links
  • No redirect loops
  • No redirect chains
  • No orphan pages where multi-page crawling is available
  • No duplicate URLs
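
Redirect behavior can be observed from the response history of a single request, as in the sketch below (requests package assumed). It reports hops but cannot by itself distinguish a deliberate redirect from an unnecessary chain.

    # Minimal sketch: report the redirect hops seen when requesting one URL.
    # Assumes the third-party requests package (redirects are followed by default).
    import requests

    def redirect_chain(url: str) -> dict:
        resp = requests.get(url, timeout=30, allow_redirects=True)
        hops = [(r.status_code, r.headers.get("Location")) for r in resp.history]
        return {
            "requested_url": url,
            "final_url": resp.url,
            "redirect_count": len(hops),
            "hops": hops,
            "final_status": resp.status_code,
        }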

10. Indexing

  • Page indexable with no unintended noindex
  • Canonical page indexable
  • Proper use of noindex
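
A minimal sketch of how noindex signals could be spot-checked, via the robots meta tag and the X-Robots-Tag header, assuming the third-party requests and beautifulsoup4 packages:

    # Minimal sketch: look for noindex signals in the robots meta tag and X-Robots-Tag header.
    # Assumes third-party packages: requests, beautifulsoup4.
    import requests
    from bs4 import BeautifulSoup

    def index_signals(url: str) -> dict:
        resp = requests.get(url, timeout=30)
        soup = BeautifulSoup(resp.text, "html.parser")
        meta = soup.select_one('meta[name="robots"]')
        meta_content = (meta.get("content") or "").lower() if meta else ""
        header = resp.headers.get("X-Robots-Tag", "").lower()
        return {
            "meta_robots": meta_content or None,
            "x_robots_tag": header or None,
            "noindex_detected": "noindex" in meta_content or "noindex" in header,
        }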

11. Performance optimization

  • Render-blocking resources minimized
  • HTML minified
  • Compression enabled using GZIP or Brotli
  • Browser caching enabled
  • Cache headers configured
  • Static asset caching
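
Compression and caching choices leave traces in response headers, as in the sketch below (requests package assumed; requests sends an Accept-Encoding header by default, so Content-Encoding reflects what the server chose for a compressible request).

    # Minimal sketch: inspect compression and caching headers on one response.
    # Assumes the third-party requests package.
    import requests

    def delivery_headers(url: str) -> dict:
        headers = requests.get(url, timeout=30).headers
        return {
            "content_encoding": headers.get("Content-Encoding"),  # e.g. gzip or br
            "cache_control": headers.get("Cache-Control"),
            "expires": headers.get("Expires"),
            "etag_present": "ETag" in headers,
        }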

12. Internal linking and navigation

  • Internal linking structure exists
  • Navigation links crawlable
  • Navigation not JavaScript-dependent

13. Mobile optimization

  • Responsive layout
  • Viewport configured
  • No horizontal overflow
  • Mobile navigation usable

14. Infrastructure and server

  • HTTP/2 or HTTP/3 enabled
  • Keep-alive enabled

15. Social metadata

  • Open Graph title
  • Open Graph description
  • Open Graph image
  • Twitter card metadata

16. Miscellaneous checks

  • Favicon present
  • Console errors
  • Broken scripts

17. High-level scan categories

  • Performance
  • Media and payload
  • Markup efficiency
  • SEO and schema
  • Accessibility
  • Security
  • Infrastructure

Appendix B: Summary Data Tables

Scan overview

Scan overview statistics
Metric                           Value
Total websites scanned           15,542
Completed scans                  15,541
Failed scans                     1
Highest overall Site Score       96
Lowest overall Site Score        19
Average overall Site Score       60.45
Median overall Site Score        61
Websites scoring 90 or higher    49
Websites scoring below 90        15,492

Category averages

Average scores by category
Category                         Average
Performance score                74.93
Mobile score                     66.88
Load time                        7.19 seconds
Accessibility issues detected    7.44
Structure health score           72.60

Site Score distribution

Site Score distribution
Score range    Count    Percent of completed scans
0–30           128      0.82%
30.01–40       673      4.33%
40.01–50       1,949    12.54%
50.01–60       4,761    30.64%
60.01–70       5,305    34.14%
70.01–80       2,262    14.56%
80.01–90       426      2.74%
90.01–100      37       0.24%

Load time distribution

Load time distribution
Load time range    Count    Percent of completed scans
Under 2 seconds    1,882    12.11%
2–3 seconds        1,125    7.24%
3–5 seconds        2,792    17.97%
5–10 seconds       5,799    37.32%
10+ seconds        3,943    25.38%

Accessibility issue distribution

Accessibility issue distribution
Accessibility issue range    Count    Percent of completed scans
0 issues                     3,444    22.18%
1–5 issues                   6,737    43.40%
6–10 issues                  2,260    14.56%
11–20 issues                 1,778    11.45%
21+ issues                   1,306    8.41%

Structure health distribution

Structure health distribution
Structure health range    Count     Percent of completed scans
0–30                      0         0.00%
30.01–50                  944       6.10%
50.01–70                  4,346     28.07%
70.01–90                  10,153    65.59%
90.01–100                 40        0.26%

Key threshold findings

Key threshold findings
Finding                                      Count     Percent of completed scans
Sites with load time over 5 seconds          9,726     62.60%
Sites with load time 10 seconds or longer    3,943     25.38%
Sites with 1+ accessibility issue            12,081    77.74%
Sites with 10+ accessibility issues          3,412     21.96%
Sites with structure health under 70         4,924     31.68%
Sites with mobile score under 70             9,814     63.15%
Sites with performance score under 70        4,026     25.91%

Appendix C: Suggested Citation

The Website Quality Report. Produced by Trevor Hunter / FOCUS. websitequalityreport.com.

When referencing the dataset, describe it as a point-in-time scan of 15,542 public websites.

Suggested citation language

“According to the Website Quality Report, a point-in-time scan of 15,542 public websites found that only 49 out of 15,541 completed scans reached a Site Score of 90 or higher. The report identified broad measurable patterns across site quality, performance, mobile experience, load time, accessibility issues, and structure health.”

Appendix D: Safe Interpretation Language

Use this language when discussing the report publicly:

  • The report identifies broad measurable patterns across public websites.
  • The scan was point-in-time and scores may change.
  • The dataset does not prove why any individual website performed poorly.
  • The dataset does not prove who built any specific website.
  • The dataset does not prove how much was paid for any website.
  • The dataset does not prove whether any specific website produces business results.
  • The report is not intended to shame individual businesses or assign blame to specific vendors.
  • The findings suggest that many public websites may be underperforming across measurable technical and structural quality signals.

Avoid this language unless it is clearly qualified:

  • “This proves the vendor failed.”
  • “This proves the site cannot rank.”
  • “This proves the business wasted money.”
  • “This proves the site gets no leads.”
  • “This proves one specific issue caused poor rankings.”

Appendix E: Public Summary Version

Most business owners judge a website by what they can see. The Website Quality Report looked at what they usually cannot see.

The report scanned 15,542 public websites and found widespread measurable issues across performance, mobile experience, accessibility signals, load time, structure health, and overall site quality.

Only 49 out of 15,541 completed scans reached a Site Score of 90 or higher. The average overall Site Score was 60.45. More than 62% of completed scans took over 5 seconds to load. More than 77% had at least one detectable accessibility issue. More than 63% had a mobile score under 70.

The report does not blame individual businesses or vendors. It does not prove why any individual website performed poorly. It does show that many public websites that appear acceptable on the surface are underperforming across measurable quality signals that affect usability, accessibility, search visibility, and structure.

The takeaway is simple: a website should not be judged by appearance alone. A strong website should load fast, work well on mobile, be accessible, be structured clearly, be understandable to search engines, and make it easy for visitors to take action.