It’s Time to Rethink How We Measure Consumer Understanding. Part 3: What Are the Key Recommendations for Using the Amplifi Framework Effectively?

Written by Dr Sarah Sabbaghan | Sep 9, 2025 11:49:25 AM

Introduction

Analysing and Interpreting Amplifi’s Multi-Level Comprehension Framework Results

Real-World Application: The FCA and SRA Case Studies

Recommendations: Using Intelligibility as the Anchor for Better Communication

Introduction 

In the previous article, we introduced the Amplifi Multi-Level Comprehension Framework. We explored what it is, why it was developed, and how it moves beyond existing methods to offer a more practical, real-world approach to measuring understanding. We looked at the framework’s structure, its purpose, and why it matters: ensuring that complex legal and financial documents are not just readable, but truly intelligible.

In this follow-up, we shift focus from what the framework is to how it can be used in practice. 

We will explain how results from the Amplifi Multi-Level Comprehension Framework can be analysed and interpreted and share real-world applications through FCA and SRA case studies. We will provide useful advice on how to use the framework effectively in document design, testing, and compliance.

Analysing and Interpreting Amplifi’s Multi-Level Comprehension Framework Results

The results generated through the Amplifi Multi-Level Comprehension Framework can be analysed in several ways, depending on the objectives of the study. 

For open-ended responses, comprehension is assessed through inter-rater evaluation using a structured set of scoring criteria, or ‘rubric’. This means that two or more independent reviewers score the same responses against predefined criteria, which improves consistency, helps identify problem areas, and reduces individual bias. For example, a rubric may include levels of information completeness and accuracy, allowing researchers to evaluate the depth, coherence, and relevance of responses. Such qualitative analysis provides insight into how well participants interpret and apply information, going beyond surface-level understanding. It also helps identify patterns of misunderstanding or problem areas in the document, offering valuable insights into how it might be improved.
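To illustrate the inter-rater step, the sketch below computes Cohen’s kappa, a widely used agreement statistic, for two hypothetical reviewers scoring the same ten open-ended responses on an invented 0–3 rubric scale. The data and scale are illustrative only, not Amplifi’s actual rubric:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters scoring the same items."""
    n = len(rater_a)
    # Proportion of items where both raters gave the same score
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's score distribution
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two reviewers' rubric scores (0 = no understanding, 3 = full understanding)
reviewer_1 = [3, 2, 2, 1, 0, 3, 2, 1, 1, 2]
reviewer_2 = [3, 2, 1, 1, 0, 3, 2, 2, 1, 2]

kappa = cohens_kappa(reviewer_1, reviewer_2)  # ≈ 0.71: substantial agreement
```

A low kappa would signal that the rubric itself needs tightening before the comprehension scores can be trusted.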

For closed-ended tasks, such as multiple-choice or true/false questions, results can be interpreted using a variety of statistical methods. Descriptive statistics, such as the percentage of correct responses at each comprehension level, offer a basic overview of performance. For deeper analysis, inferential techniques, such as regression analysis or cross-tabulations, can be used to explore relationships between comprehension outcomes and variables like document type, participant demographics, or confidence levels. Thresholds can also be set to determine an acceptable level of performance at each level, supporting decisions concerning compliance, usability, or design testing.
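As a minimal sketch of the descriptive side of this analysis, the example below computes pass rates per comprehension level and flags any level falling below a threshold. The response data, level names, and 70% threshold are all hypothetical, chosen for illustration:

```python
from collections import defaultdict

# Hypothetical closed-ended results: (comprehension_level, answered_correctly)
responses = [
    ("recall", True), ("recall", True), ("recall", True), ("recall", False),
    ("applied", True), ("applied", False), ("applied", False), ("applied", True),
    ("reflection", True), ("reflection", False), ("reflection", False), ("reflection", False),
]

THRESHOLD = 0.7  # illustrative acceptable pass rate per level

totals, correct = defaultdict(int), defaultdict(int)
for level, ok in responses:
    totals[level] += 1
    correct[level] += ok

# Percentage correct at each level, and levels needing revision
pass_rates = {level: correct[level] / totals[level] for level in totals}
flagged = [level for level, rate in pass_rates.items() if rate < THRESHOLD]
```

In this invented dataset, recall passes (75%) while applied (50%) and reflection (25%) fall below the threshold, pointing the revision effort at the right sections.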

Ultimately, the method of analysis should reflect the intended use of the data. For example, whether to benchmark performance, identify content that needs to be revised, or demonstrate compliance with understanding outcomes.

Real-World Application: The FCA and SRA Case Studies

The Financial Conduct Authority (FCA) and Solicitors Regulation Authority (SRA) have already used Amplifi’s framework in critical regulatory documents:

  • CP25/17: Supporting Consumer Investment Decisions (2025): Our methodology was referenced by the FCA in their recent consultation. The Amplifi Framework helped to structure the FCA’s comprehension evaluation, measuring everything from basic understanding to decision confidence (p.167).
  • Research Note: Reading Between the Lines (2025): The FCA used the framework to assess how consumers understood targeted financial support, including real-world decision outcomes (p.13).
  • SRA Study: The implementation of this assessment and the use of the Amplifi Multi-Level Comprehension Framework produced significant insights into how simplification can enhance comprehension of legal information and regulatory guidance. The study revealed that simplified documents scored higher across all comprehension levels, with the exception of the basic recall task. The most significant gap lay in performance on applied comprehension tasks: simplifying the information produced higher levels of comprehension.

Recommendations: Using Intelligibility as the Anchor for Better Communication

To ensure documents are not just clear but understood, organisations must move beyond readability, and adopt a more comprehensive approach embracing intelligibility and comprehension. The Amplifi Multi-Level Comprehension Framework offers a powerful foundation for this shift. Here are some practical recommendations for effectively using the framework across document design, testing, and compliance:

  1. Make Intelligibility the Anchor, Not Readability

Relying solely on tools that focus on readability provides a very limited view of consumer experience. A document can be readable without being intelligible, especially for legal and financial content.

Prioritise intelligibility as the benchmark for communication quality (it is the legal requirement after all!). Use the Amplifi Multi-Level Comprehension Framework to assess how well a document supports understanding in context, not just surface clarity.

  2. Align Comprehension Levels to Document Purpose

Not every section of a document requires the same level of cognitive engagement. Some communications may only require basic recall, while others, such as explaining repayment terms or financial advice, may require scenario-based application or decision-making.

Map each section to the appropriate comprehension level (e.g., recall, applied, or reflection) based on what the user needs to do with the information. Design and test accordingly.

  3. Don’t Stop at Basic Recall

High basic recall does not indicate deep understanding. A user may remember a fee, interest rate or deadline, but fail to interpret how these things may change or affect them differently based on their behaviour.

Include assessment tasks at multiple levels to capture functional understanding and self-efficacy. 

  4. Use Comprehension Failures as Diagnostic Signals

Gaps in comprehension highlight key weaknesses within a document's structure, language, or logic. These areas are prime candidates for tailoring, simplifying, or further explaining.

Analyse failures by level: for example, if users struggle with applied tasks but excel at basic recall, the problem might be complexity, not awareness. Use this analysis to target revisions more effectively.
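One way this diagnostic could be operationalised is sketched below: flag any level whose pass rate falls well below basic recall, which suggests a complexity problem rather than an awareness one. The pass rates and the 25-point gap threshold are invented for illustration:

```python
# Hypothetical per-level pass rates from a comprehension test
pass_rates = {"recall": 0.92, "applied": 0.54, "reflection": 0.75}

def diagnose(rates, gap=0.25):
    """Flag levels falling well below basic recall: users remember the
    content but cannot apply it, pointing at complexity, not awareness."""
    baseline = rates["recall"]
    return [level for level, rate in rates.items()
            if level != "recall" and baseline - rate >= gap]

problem_levels = diagnose(pass_rates)  # only "applied" is flagged here
```

Here the applied level sits 38 points below recall and is flagged, while reflection (17 points below) is not, so revision effort goes to the sections users must act on, not the ones they merely remember.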

  5. Integrate the Framework Early in the Content Lifecycle

Many documents are evaluated for clarity only at the final stages, after drafting is complete. This limits the ability to make structural improvements. By using the Amplifi Multi-Level Comprehension Framework earlier in the process, organisations can embed these checks into content strategy, prototyping, and user testing. Embedding this as a core part of content governance ensures that documents are clear and comprehensible before rollout, reducing risks and improving consumer outcomes. 

  6. Benchmark Comprehension Across Audiences

Different user groups (e.g., vulnerable consumers, low-literacy readers, non-native speakers) may experience documents differently.

Use comprehension testing across diverse populations to detect gaps in understanding and assess equity in how information is communicated. This supports both accessibility and regulatory inclusion goals.

  7. Use Results for Iterative Improvement and Transparency

Comprehension testing is not just a compliance exercise: it can drive better user experience, build trust, and reduce negative emotional responses for the reader.

Treat results as feedback loops, informing revisions and demonstrating a commitment to transparency and fairness. This supports your delivery of the FCA’s Consumer Duty in financial services, and access to justice in a legal context.