ISO 18115-2 Decoded: A Guide to Scanning Probe Microscopy Terminology for Biomedical Research

Zoe Hayes Dec 02, 2025 57

This article provides a comprehensive analysis of the ISO 18115-2 standard for scanning probe microscopy (SPM) terminology, tailored for researchers and professionals in biomedical and drug development.

ISO 18115-2 Decoded: A Guide to Scanning Probe Microscopy Terminology for Biomedical Research

Abstract

This article provides a comprehensive analysis of the ISO 18115-2 standard for scanning probe microscopy (SPM) terminology, tailored for researchers and professionals in biomedical and drug development. It explores the foundational definitions for techniques like AFM and STM, outlines methodological applications in quality control for advanced therapies, addresses common troubleshooting scenarios arising from ambiguous terminology, and offers a comparative framework for validating measurements. The guide aims to enhance data reproducibility, support regulatory compliance, and foster clear communication in musculoskeletal tissue engineering and other cutting-edge biomedical fields.

Understanding ISO 18115-2: The Bedrock of Clear SPM Communication

What is ISO 18115-2? Defining the International Standard for SPM Vocabulary

Table of Contents

ISO 18115-2 is the definitive international vocabulary for scanning probe microscopy (SPM), providing the standardized terminology essential for unambiguous communication and data interpretation in nanoscience research. Established by the International Organization for Standardization (ISO), its primary purpose is to define terms used in describing the samples, instruments, and theoretical concepts involved in SPM techniques such as Atomic Force Microscopy (AFM), Scanning Tunneling Microscopy (STM), and Scanning Near-Field Optical Microscopy (SNOM) [1] [2]. For researchers in fields like drug development, where characterizing material surfaces at the nanoscale is critical, the consistent application of this vocabulary ensures that findings are reproducible, comparable, and reliable across different laboratories and instrumentation worldwide [2] [3].

The standard exists as Part 2 of the broader ISO 18115 series, which is dedicated to surface chemical analysis. While Part 1 covers general terms and those used in spectroscopy (e.g., XPS, AES, SIMS), Part 2 is specifically focused on the terms and acronyms related to scanned probe techniques [2] [3]. This separation allows for a more detailed and specialized treatment of the rapidly evolving SPM field. The standard is not a static document; it has undergone several revisions to incorporate new techniques and clarify existing concepts, with the latest major edition published in 2013 [3] [4].

Scope and Content: What Does the Standard Cover?

ISO 18115-2 provides a comprehensive lexicon for the entire field of scanning probe microscopy. Its definitions are foundational for writing research papers, method protocols, and instrument manuals. The scope of the standard is extensive, systematically covering several key areas of SPM.

Table: Key Terminology Areas Covered by ISO 18115-2

Category Description Example Terms
Fundamental Concepts Defines core physical principles and theoretical parameters used in SPM [1]. Interaction force, tunnelling current, near-field optical interaction [3].
Instrumentation Covers terms related to the hardware and components of SPM systems [1]. Scanner, probe, cantilever, photodetector, feedback system [5].
Measurement Modes Standardizes the names and descriptions of various operational modes [1]. Contact mode, tapping mode, frequency modulation AFM, constant current mode (STM) [2].
Data Analysis Defines parameters and quantities derived from SPM measurements to ensure consistent interpretation [1]. Resolution, drift, roughness parameters, image plane [5].
Sample Interaction Describes terms related to the sample being analyzed and its interaction with the probe [1]. Surface, nanostructure, mechanical properties [1].

A critical feature of the standard is its handling of acronyms. The SPM field is known for its prolific use of acronyms for different techniques and modes. ISO 18115-2 defines 86 acronyms, providing a vital reference to prevent confusion [1]. For example, it clarifies the equivalence of SNOM and NSOM (Scanning Near-Field Optical Microscopy), standardizing on a single term for publications and discussions [1] [2].

Evolution of the Standard: A Historical Comparison

ISO 18115-2 is the product of a continual process of refinement and expansion, reflecting the dynamic nature of the SPM field. The standard's history demonstrates a concerted effort to keep pace with technological advancements, growing from a few hundred terms to a comprehensive vocabulary of nearly 900 terms across both Part 1 and Part 2 [3].

Table: Evolution of the ISO 18115 Vocabulary Standard

Version Year Key Changes and Additions Total Terms (Part 1 & 2)
ISO 18115:2001 2001 Initial release, covering general surface chemical analysis terms [3]. ~350 terms [3]
Amendment 1 2006 Added 5 abbreviations and 71 terms, many for glow discharge analysis [3]. ~421 terms
Amendment 2 2007 Major addition of 87 spectroscopy terms, 76 SPM acronyms, 33 SPM techniques, and 147 SPM concepts [3]. ~764 terms
ISO 18115-1 & -2:2010 2010 Split into two parts: spectroscopy (Part 1) and scanning probe microscopy (Part 2). Brought all material up to date [1] [6]. 227 SPM terms & 86 acronyms in Part 2 alone [1]
ISO 18115-1 & -2:2013 2013 Current edition. Incorporated over 100 further new terms and clarifications [3] [4]. ~900 terms across both parts [2] [3]

This historical progression shows a clear trend: the vocabulary for SPM has grown at a faster rate than that for traditional spectroscopy, necessitating its own dedicated document. The 2013 edition represents the most current and comprehensive set of definitions, and researchers should prioritize using this version to reference the latest standardized terminology [4].

Experimental Application: A Framework for Reliable SPM Analysis

The true value of ISO 18115-2 is realized in its application within experimental workflows. It provides the linguistic foundation that enables the design of robust protocols, precise reporting, and meaningful comparison of data. The following diagram illustrates a generalized SPM experimental workflow, with key steps where standardized terminology from ISO 18115-2 is critical.

G Start Start SPM Experiment SamplePrep Sample Preparation Start->SamplePrep Mounting Sample Mounting SamplePrep->Mounting ProbeSelection Probe (Tip/Cantilever) Selection Mounting->ProbeSelection Setup Instrument Setup ProbeSelection->Setup ModeSelection Measurement Mode Selection (e.g., Tapping) Setup->ModeSelection Engagement Probe Engagement ModeSelection->Engagement Scanning Data Acquisition (Scanning) Engagement->Scanning Analysis Data Analysis Scanning->Analysis Reporting Results Reporting Analysis->Reporting T1 Standardized terms for sample state and geometry T1->SamplePrep T2 Standardized probe descriptions T2->ProbeSelection T3 Standardized mode names and parameters T3->ModeSelection T4 Standardized feedback and control terms T4->Engagement T4->Scanning T5 Standardized data quantities and filters T5->Analysis T6 Ensures clarity and reproducibility T6->Reporting

Detailed Experimental Protocol and Reagent Solutions

Adherence to ISO 18115-2 begins at the protocol design stage. For example, a study investigating the nanoscale mechanical properties of a polymer film for drug delivery would explicitly state the use of "tapping mode" (as defined in the standard) followed by "force volume mapping" (also a standardized term). The protocol would specify parameters using correct terminology, such as the "cantilever spring constant" and "setpoint amplitude," ensuring any researcher can replicate the experiment.

Table: Essential Research Reagent Solutions for SPM Experimentation

Item Function in SPM Experimentation
SPM Probe (Cantilever & Tip) The physical probe that interacts with the sample surface. Defined by standardized terms for its geometry (tip radius), material (silicon, silicon nitride), and properties (spring constant, resonance frequency) [1] [5].
Calibration Gratings Certified reference materials with known dimensions (e.g., pitch, height) used to calibrate the scanner's lateral and vertical measurements, ensuring traceability and accuracy as per guidelines like VDI/VDE 2656 [5].
Sample Substrates Flat, clean surfaces (e.g., mica, silicon wafer) onto which samples are deposited. Standardized terminology defines sample geometry and preparation state [1].
Software for Data Analysis Applications used to process and quantify SPM data. Relies on standardized definitions for parameters like "RMS roughness" (from ISO 25178) and "particle height" to generate reproducible results [5].

During the data analysis phase, the standard's definitions become paramount. Quantifying a "root mean square roughness" (Sq) value is only meaningful if the term is understood and calculated as defined in related surface metrology standards (e.g., ISO 25178), which themselves align with the overarching framework provided by ISO 18115-2 [5]. This prevents scenarios where one research group's definition of "height" differs from another's, leading to irreconcilable data.

ISO 18115-2 does not exist in isolation. It functions as a core vocabulary standard that supports and connects with other ISO and institutional guidelines governing SPM and surface metrology. The following diagram illustrates its relationship with other key standards.

G Vocab ISO 18115-2 SPM Vocabulary Calib ISO 11952 SPM Calibration Vocab->Calib Provides Terminology Drift ISO 11039 Drift Measurement Vocab->Drift Provides Terminology Guidelines VDI/VDE 2656 AFM Guidelines Vocab->Guidelines Provides Terminology Areal ISO 25178 Areal Surface Texture Vocab->Areal Complements Guidelines->Calib Implements

Understanding these relationships is crucial for comprehensive experimental design. For instance:

  • VDI/VDE 2656 provides a detailed methodology for calibrating AFMs, but it consistently uses the terms defined in ISO 18115-2 to describe instrument components and measured quantities [5].
  • ISO 25178 defines a comprehensive set of parameters for characterizing areal surface texture. A researcher using an AFM (an SPM technique) to measure these parameters relies on ISO 18115-2 to correctly describe the AFM operation and on ISO 25178 to correctly calculate the roughness parameters [5].

This ecosystem of standards ensures that terminology, calibration procedures, and measurement parameters are aligned, creating a coherent framework for nanoscale surface analysis.

ISO 18115-2 is far more than a simple glossary; it is an indispensable tool for ensuring scientific integrity and progress in scanning probe microscopy. By providing a common language, it eliminates ambiguity, enables the precise replication of experiments, and allows for the valid comparison of data generated across different labs, instruments, and time. For researchers in drug development and nanoscience, its adoption is a prerequisite for producing credible and impactful research.

The future of ISO 18115-2 will inevitably involve further evolution. As SPM techniques continue to advance—with developments in high-speed imaging, multi-modal mapping, and machine learning-driven analysis—new terms and concepts will emerge. The ISO committee responsible for the standard (ISO/TC 201/SC 1) actively works to incorporate these developments through periodic amendments and revisions [5]. Therefore, the research community's engagement with this living document, both through its application and by contributing to its future development, is essential for maintaining the clarity and precision that underpin reliable scientific discovery.

The International Standard ISO 18115-2:2010 provides the foundational vocabulary for scanning probe microscopy (SPM), defining 227 terms and 86 acronyms essential for precise communication in nanoscale sciences [1]. This standardized terminology covers the samples, instruments, and theoretical concepts involved in surface chemical analysis, creating a unified language for researchers worldwide. For scientists and drug development professionals, mastery of this lexicon is not merely academic; it is critical for ensuring reproducibility, accurate data interpretation, and clear reporting in publications and regulatory documents. The core structure of SPM terminology is built upon a taxonomy that classifies techniques by their physical probe-sample interactions, which directly determine their applications and limitations.

Scanning Probe Microscopy (SPM) itself is defined as a branch of microscopy that forms images of surfaces using a physical probe that scans the specimen [7]. Founded in 1981 with the invention of the scanning tunneling microscope (STM), SPM has since diversified into numerous techniques capable of resolving features at the atomic level, largely enabled by piezoelectric actuators that provide sub-angstrom positioning precision [7]. The standard organizes this complex family of techniques into a logical framework, enabling researchers to navigate the relationships between fundamental principles and their specialized derivatives.

Core Taxonomy of SPM Techniques

Hierarchical Classification of Major SPM Modalities

The ISO 18115-2 standard organizes SPM techniques into a logical hierarchy based on their fundamental physical principles and measurement methodologies. This classification begins with the broad category of Scanning Probe Microscopy, which branches into specific techniques characterized by their unique probe-sample interactions. The following diagram illustrates the logical relationships between major SPM techniques and their operational modes:

G Scanning Probe Microscopy (SPM) Scanning Probe Microscopy (SPM) Scanning Tunneling Microscopy (STM) Scanning Tunneling Microscopy (STM) Scanning Probe Microscopy (SPM)->Scanning Tunneling Microscopy (STM) Atomic Force Microscopy (AFM) Atomic Force Microscopy (AFM) Scanning Probe Microscopy (SPM)->Atomic Force Microscopy (AFM) Near-Field Scanning Optical Microscopy (NSOM/SNOM) Near-Field Scanning Optical Microscopy (NSOM/SNOM) Scanning Probe Microscopy (SPM)->Near-Field Scanning Optical Microscopy (NSOM/SNOM) Magnetic Resonance Force Microscopy (MRFM) Magnetic Resonance Force Microscopy (MRFM) Scanning Probe Microscopy (SPM)->Magnetic Resonance Force Microscopy (MRFM) Scanning Thermal Microscopy (SThM) Scanning Thermal Microscopy (SThM) Scanning Probe Microscopy (SPM)->Scanning Thermal Microscopy (SThM) AFM Contact Mode AFM Contact Mode Atomic Force Microscopy (AFM)->AFM Contact Mode AFM Dynamic Mode AFM Dynamic Mode Atomic Force Microscopy (AFM)->AFM Dynamic Mode AFM Tapping Mode AFM Tapping Mode AFM Dynamic Mode->AFM Tapping Mode

This taxonomy highlights how core techniques branch into specialized modalities, each with standardized terminology governing their operation and application. The diagram visually represents the parent-child relationships between general SPM categories and their specific implementations, demonstrating the logical framework underpinning ISO 18115-2.

Established SPM Techniques and Standardized Nomenclature

Table: Major SPM Techniques and Their Standardized Acronyms per ISO 18115-2

Technique Name Standard Acronym Primary Interaction Measured Key Applications
Atomic Force Microscopy AFM Mechanical forces (van der Waals, contact, etc.) Polymer characterization, biological samples, material roughness
Scanning Tunneling Microscopy STM Tunneling current Conductive surfaces, atomic resolution imaging, electronic states
Near-Field Scanning Optical Microscopy NSOM or SNOM Evanescent electromagnetic waves Optical properties beyond diffraction limit, single molecule fluorescence
Magnetic Resonance Force Microscopy MRFM Magnetic resonance interactions 3D atomic-scale magnetic resonance imaging, spin detection
Scanning Spreading Resistance Microscopy SSRM Electrical resistance Semiconductor doping profiling, carrier concentration mapping
Scanning Thermal Microscopy SThM Thermal conductivity/ temperature Thermal property mapping, phase transitions, device failure analysis
Scanning SQUID Microscopy SSM Magnetic flux Magnetic vortex imaging, superconductivity studies, current density mapping

This table represents only a subset of the 86 acronyms documented in the standard [1]. The consistent naming convention allows researchers to immediately identify the physical principles underlying each technique, facilitating appropriate method selection for specific research questions in drug development and materials science.

Comparative Analysis of Major SPM Techniques

Operational Principles and Imaging Modes

The fundamental operational principles distinguishing major SPM techniques directly influence their appropriate application domains. According to ISO terminology, each technique is defined by its specific probe-sample interaction mechanism, which determines its resolution capabilities, sample requirements, and information output.

Scanning Tunneling Microscopy (STM) operates by measuring the tunneling current between a sharp conductive tip and a conductive sample surface, with the current varying exponentially with tip-sample distance [8]. This technique requires conductive samples but provides exceptional atomic-level resolution. The standard defines two primary imaging modes: constant current mode (maintaining fixed tunneling current via feedback) and constant height mode (recording current variations at fixed tip height) [7].

Atomic Force Microscopy (AFM) measures forces between a tip and sample surface, overcoming STM's conductivity limitation to work with insulating materials [9] [10]. The ISO standard recognizes multiple AFM operational modes, primarily categorized as contact mode (tip in constant contact) and dynamic modes (cantilever oscillated near resonance frequency) [9] [8]. In dynamic mode, often called "tapping mode," the cantilever vibrates near its resonance frequency, reducing lateral forces and minimizing potential sample damage [8].

Near-Field Scanning Optical Microscopy (NSOM/SNOM) breaks the optical diffraction limit by using a nanoscale aperture or tip to confine light, enabling optical resolution typically between 50-100 nm [7] [1]. The standard defines terms for various NSOM approaches, including aperture-based, apertureless, and scattering-type techniques, each with specific illumination and collection configurations.

Technical Specifications and Performance Metrics

Table: Performance Comparison of Major SPM Techniques

Technique Best Resolution (Lateral) Best Resolution (Vertical) Sample Requirements Operating Environment
AFM ~0.5 nm (contact) ~1 nm (tapping) ~0.1 nm None (works on insulators) Ambient air, liquid, vacuum
STM ~0.1 nm (atomic resolution) ~0.01 nm Electrically conductive Ultra-high vacuum, ambient
NSOM/SNOM ~20-100 nm (sub-diffraction) ~1 nm Optical contrast beneficial Ambient, sometimes liquid
SSRM ~1 nm N/A Semiconductor or conductive Ambient, controlled humidity
SThM ~10-100 nm ~1 nm Thermal contrast Ambient, controlled atmosphere

The resolution specifications in this table represent optimal performance under ideal conditions, as defined by measurement protocols in ISO standards. Real-world performance depends heavily on probe quality, environmental conditions, and operator expertise. The versatility of AFM explains its dominant position in biological and soft materials research, while STM remains essential for fundamental surface science studies of conductive crystals. NSOM/SNOM fills the critical niche for nanoscale optical characterization, bridging the gap between conventional microscopy and probe-based techniques.

Experimental Protocols and Methodologies

Standardized Imaging Protocols for AFM

The implementation of Atomic Force Microscopy follows standardized methodologies to ensure reproducible results across different laboratories and instrument platforms. The fundamental AFM configuration consists of a micro-fabricated cantilever with a sharp tip, a piezoelectric scanner for precise positioning, a laser and photodiode system for detecting cantilever deflection, and a feedback system to maintain tip-sample interaction forces [9] [10].

Contact Mode AFM Protocol:

  • Probe Selection: Choose an appropriate cantilever with known spring constant (typically 0.01-1 N/m for soft samples)
  • Engagement: Approach the tip to the sample surface until physical contact is established
  • Feedback Parameter Optimization: Setpoint deflection force is calibrated to minimize sample damage while maintaining contact
  • Scanning: Raster scan the tip across the surface while maintaining constant deflection
  • Data Collection: Record both height (feedback signal) and deflection (error signal) images

Dynamic (Tapping) Mode AFM Protocol:

  • Cantilever Tuning: Drive the cantilever at its resonant frequency (typically 50-400 kHz in air)
  • Amplitude Setpoint Adjustment: Establish oscillation amplitude before engagement
  • Intermittent Contact: Maintain tip-sample interaction through amplitude reduction feedback
  • Phase Detection: Simultaneously record phase lag for material property mapping

The following workflow diagram illustrates the standardized experimental process for AFM imaging, from sample preparation to data analysis:

G Sample Preparation Sample Preparation Probe Selection Probe Selection Sample Preparation->Probe Selection Microscope Setup Microscope Setup Probe Selection->Microscope Setup Mode Selection Mode Selection Microscope Setup->Mode Selection Contact Mode Contact Mode Mode Selection->Contact Mode Dynamic Mode Dynamic Mode Mode Selection->Dynamic Mode Parameter Optimization Parameter Optimization Image Acquisition Image Acquisition Parameter Optimization->Image Acquisition Data Analysis Data Analysis Image Acquisition->Data Analysis Topography Topography Data Analysis->Topography Property Mapping Property Mapping Data Analysis->Property Mapping Contact Mode->Parameter Optimization Dynamic Mode->Parameter Optimization

The Scientist's Toolkit: Essential Research Reagent Solutions

Table: Essential Materials and Reagents for SPM Experiments

Item Function/Application Technical Specifications
Silicon Nitride Probes (AFM) Contact mode imaging in liquid Spring constant: 0.06-0.6 N/m; Tip radius: <20 nm
Doped Silicon Probes (AFM) Tapping mode imaging Resonance frequency: 200-400 kHz; Spring constant: 20-80 N/m
Platinum-Iridium Tips (STM) Ambient condition STM Wire diameter: 0.1-0.3 mm; Electrochemically etched
Tungsten Tips (STM) UHV STM experiments Electrochemically etched with oxide removal in UHV
SNOM Aperture Probes Sub-wavelength optical imaging Aluminum coating; Aperture diameter: 50-100 nm
Piezoelectric Calibration Grids Scanner calibration in XYZ Periodic structures with precisely known pitch and step heights
Vibration Isolation Systems Acoustic and mechanical noise reduction Natural frequency: <1 Hz; Load capacity: 50-200 kg

These essential materials represent the foundational toolkit for SPM experiments according to standardized methodologies [7] [9]. Proper selection of probes based on their specific technical specifications is critical for obtaining reliable, high-resolution data, particularly in quantitative measurements where spring constants, resonance frequencies, and tip geometries directly influence results.

Advanced SPM Modalities and Specialized Applications

Beyond Topography: Functional Property Mapping

The evolution of SPM has expanded far beyond surface topography to encompass numerous specialized techniques for mapping functional properties at the nanoscale. These advanced modalities leverage the same fundamental principles defined in ISO 18115-2 but apply them to measure specific material characteristics.

Electrical Characterization Modes:

  • Conductive AFM (C-AFM): Measures current flow through the tip with simultaneous topography
  • Scanning Capacitance Microscopy (SCM): Maps carrier concentration in semiconductors
  • Kelvin Probe Force Microscopy (KPFM): Measures surface potential and work function
  • Scanning Spreading Resistance Microscopy (SSRM): Maps resistance with high spatial resolution [7]

Mechanical Property Modes:

  • Force Modulation Microscopy: Distinguishes surface elasticity through stiffness variations
  • Nanomechanical Mapping: Quantifies Young's modulus, adhesion, and deformation
  • Force Volume Imaging: Constructs full force-distance curves at each image pixel

Thermal and Magnetic Characterization:

  • Scanning Thermal Microscopy (SThM): Maps temperature and thermal conductivity variations [7]
  • Magnetic Force Microscopy (MFM): Images magnetic domain structures using coated tips

These specialized applications demonstrate how the core SPM framework has been adapted to address diverse research needs, from characterizing battery materials to studying biological systems. The standardized terminology ensures that measurements obtained using these different modalities can be consistently reported and compared across the scientific literature.

The ISO 18115-2 standard provides an essential framework for navigating the complex terminology of scanning probe microscopy, encompassing 227 precisely defined terms and 86 standardized acronyms. This structured vocabulary enables clear communication and reproducible research across disciplines from solid-state physics to molecular biology and pharmaceutical development. By establishing consistent definitions for techniques ranging from fundamental AFM and STM to specialized methods like SSRM and SThM, the standard supports accurate methodology reporting and data interpretation. As SPM technologies continue to evolve with new hybrid modalities and applications, this core terminological structure will remain indispensable for researchers pushing the boundaries of nanoscale characterization.

The evolving practices of modern science, characterized by large, global teams and complex, data-intensive methodologies, have placed the concepts of reproducibility and replicability under a microscope. A 2016 poll by the journal Nature reported that more than half of scientists surveyed believed science was facing a "replication crisis" [11]. This crisis encompasses the widespread failure to reproduce the results of published studies, evidence of publication bias, and a high prevalence of questionable research practices [11]. While numerous factors contribute to this crisis, one fundamental and often overlooked obstacle is the absence of standard definitions for key terms like "reproducibility" and "replicability" across different scientific disciplines [12].

This terminology confusion is a significant barrier to improving reproducibility and replicability [12]. Different scientific disciplines and institutions use these words in inconsistent or even contradictory ways, leading to misunderstandings and inefficiencies in scientific communication [12] [13]. For instance, what one research group defines as "reproducibility," another may define as "replicability," and vice versa [14]. This article will explore how standardized terminology, specifically through the adoption of international standards like ISO 18115-2 for scanning probe microscopy, provides a critical framework for enhancing research reproducibility, enabling reliable data comparison, and accelerating scientific discovery.

Defining the Problem: The Confused Landscape of Research Terminology

The terms "reproducibility" and "replicability" are used differently across scientific disciplines, introducing confusion into an already complicated set of challenges [12]. A detailed review of the literature reveals three primary categories of usage, which can be summarized as follows [12] [14]:

  • Usage A: The terms "reproducibility" and "replicability" are used with no distinction between them.
  • Usage B1 (Claerbout Terminology): This usage, originating in computational science, defines "reproducibility" as instances where the original researcher's data and computer codes are used to regenerate the results. "Replicability" refers to instances where a researcher collects new data to arrive at the same scientific findings as a previous study [12] [13].
  • Usage B2 (ACM Terminology): The Association for Computing Machinery (ACM) has adopted definitions that are in direct opposition to the Claerbout terminology. Here, "replicability" is associated with using an original author's digital artifacts, while "reproducibility" involves developing completely new digital artifacts independently [12] [13] [14].

The following table clearly contrasts these two dominant, competing definitions to illustrate the core of the confusion.

Table 1: Conflicting Definitions of Reproducibility and Replicability

Term Claerbout & Karrenbach Definition Association for Computing Machinery (ACM) Definition
Reproducible Authors provide all the necessary data and computer codes to run the analysis again, re-creating the results. (Different team, different experimental setup) An independent group can obtain the same result using artifacts which they develop completely independently [14].
Replicable A study that arrives at the same scientific findings as another study, collecting new data and completing new analyses. (Different team, same experimental setup) An independent group can obtain the same result using the author's artifacts [14].

This terminological inconsistency creates tangible problems. It complicates communication across disciplines, hinders the accurate assessment of replication studies, and obstructs efforts to implement widespread solutions [12]. Some researchers have proposed new lexicons to sidestep this confusion, such as distinguishing between "methods reproducibility," "results reproducibility," and "inferential reproducibility" [13]. However, the ideal solution for technical fields is the establishment and widespread adoption of community-driven, international standards that precisely define terms, as seen with ISO 18115-2 in surface chemical analysis.

The ISO 18115-2 Standard: A Case Study in Terminology Standardization for SPM

In the field of scanning probe microscopy (SPM) and surface chemical analysis, the ISO 18115 series serves as a powerful example of how standardized terminology can bring clarity and consistency. ISO 18115 is a vocabulary published in two parts, with ISO 18115-2 specifically dedicated to terms used in scanning-probe microscopy [15] [2] [6].

The standard was developed to bring material up-to-date and to separate general terms and spectroscopic terms (Part 1) from those relating specifically to SPM (Part 2) [1]. This separation in itself is an exercise in creating a more organized and navigable terminology structure. ISO 18115-2:2021 covers a comprehensive set of 227 terms used in SPM, as well as 86 acronyms [1]. The terms cover words or phrases used in describing the samples, instruments, and theoretical concepts involved in surface chemical analysis, providing a common language for researchers worldwide [1].

The primary function of this standard is to provide a common reference point, ensuring that when a researcher in one country refers to a "scanning tunneling microscope (STM)" or an "atomic force microscope (AFM)," their colleagues elsewhere have a precise, shared understanding of what these terms entail. This is crucial not only for day-to-day communication but also for the development of methods, the calibration of instruments, and the comparison of data across different laboratories and experimental setups. By defining terms with a high degree of precision, ISO 18115-2 lays the groundwork for achieving the higher-order goals of reproducibility and replicability as defined in Table 1.

The Impact of Standardized Terminology on Reproducibility and Data Comparison

Standardized terminology, as exemplified by ISO 18115-2, directly impacts research reproducibility and data comparison through several key mechanisms.

Enabling Accurate Replication of Experimental Methods

A foundational requirement for any scientific experiment is methods reproducibility—the ability to provide sufficient detail about procedures so that the same procedures could be exactly repeated [13]. Without a standardized vocabulary, the description of a complex SPM probe-conditioning procedure, for instance, can be ambiguous. A term like "voltage pulse" may be interpreted differently by different researchers, leading to variations in experimental execution and, consequently, in the results. ISO 18115-2 mitigates this by providing definitive descriptions, ensuring that the methodology section of a research paper is interpreted consistently by all readers, which is the first step toward direct replication.

Facilitating Cross-Platform and Cross-Laboratory Data Comparison

The ultimate test of a scientific finding's robustness is its reproducibility under different experimental conditions—what the ACM defines as "different team, different experimental setup" [14]. In SPM, this could involve using instruments from different manufacturers, different probe materials, or different laboratory environments. Standardized terminology ensures that when researchers compare data obtained from these varied setups, they are comparing like with like. Parameters such as "resolution," "scan size," or "setpoint" have precise meanings, allowing for a valid and meaningful comparison of results. This moves beyond simple replication and begins to test the generalizability of scientific insights [14].

Enhancing the Reliability of Automated and AI-Driven Research

The rise of artificial intelligence (AI) and autonomous experimentation in fields like SPM makes standardized terminology more critical than ever. For example, in an AI-driven SPM system like DeepSPM, a machine learning model is trained to assess probe quality and perform conditioning actions [16]. The consistent definition of what constitutes a "good" or "bad" probe image is paramount for training a reliable model. If the training data is labeled inconsistently due to vague terminology, the AI's performance and the reproducibility of the entire autonomous workflow will be compromised. Standardized terms provide the unambiguous labels and operational definitions that machine learning algorithms require to function correctly and reliably [16] [17].

Table 2: Quantitative Performance of an AI-Driven SPM System (DeepSPM) Relying on Consistent Terminology

Performance Metric Value Context & Impact on Reproducibility
Classifier CNN Accuracy ~94% Accuracy in assessing the state of the SPM probe, trained on a dataset labeled with consistent definitions of "good" and "bad" probes [16].
Dataset Size for Training 7,589 images The volume of consistently labeled data required to train a reliable assessment model [16].
Number of Conditioning Actions 12 The variety of probe-repair actions available to the AI, each requiring a precise, standardized definition to be executed consistently [16].

Experimental Protocols: How Standardized Language Enables Reproducible Research

To illustrate the practical application of standardized terminology, we can examine the experimental workflow of an autonomous SPM system. The following diagram outlines the key steps in this process, highlighting points where precise language is critical.

autonomous_SPM Start Start Autonomous Experiment RegionSelect Select Scanning Region (Algorithmic) Start->RegionSelect AcquireImage Acquire SPM Image RegionSelect->AcquireImage AssessContact Assess Sample-Probe Contact AcquireImage->AssessContact AssessContact->RegionSelect Contact Lost/Crashed AssessRegion Assess Sample Region Quality (Algorithmic) AssessContact->AssessRegion Contact Good AssessRegion->RegionSelect Bad Region AssessProbe Assess Probe State (Classifier CNN) AssessRegion->AssessProbe Good Region ConditionProbe Condition Probe (Reinforcement Learning Agent) AssessProbe->ConditionProbe Bad Probe StoreData Process and Store 'Good' Data AssessProbe->StoreData Good Probe ConditionProbe->AcquireImage Re-assess NextIteration Proceed to Next Loop Iteration StoreData->NextIteration

Diagram 1: Autonomous SPM Workflow and Terminology Checkpoints.

Detailed Methodologies for Key Experiments:

  • Intelligent Probe Quality Assessment:

    • Objective: To automatically determine if an SPM image was acquired with a "good" or "bad" probe.
    • Protocol: A Convolutional Neural Network (CNN) is trained via supervised learning on a large dataset of SPM images (e.g., 7,589 images) [16]. Each image in the training set must be consistently labeled by a human expert according to standardized criteria for what constitutes a "good probe" versus the various artifacts of a "bad probe." The trained classifier achieves high accuracy (~94%) by learning these standardized visual patterns [16].
  • Intelligent Probe Conditioning with Reinforcement Learning:

    • Objective: To automatically restore a degraded probe to a "good" state.
    • Protocol: A deep Reinforcement Learning (RL) agent interacts with the SPM setup [16]. The agent inspects an image and selects a conditioning action (e.g., a specific voltage pulse or a dip into the sample) from a predefined set of 12 actions [16]. Each action must have a precise, standardized operational definition to ensure consistent execution. The agent receives a positive reward if the subsequent image is classified as "good probe," learning a policy that minimizes the number of conditioning steps required.

The Scientist's Toolkit: Essential Research Reagent Solutions for SPM

The following table details key materials and components essential for SPM experiments, the precise understanding of which is underpinned by standardized terminology.

Table 3: Key Research Reagent Solutions in Scanning Probe Microscopy

Item / Material Function in SPM Experiments
Metallic Probes (e.g., Pt/Ir) The atomically sharp tip required for Scanning Tunneling Microscopy (STM); its precise morphology is critical for image resolution and is a key variable managed by autonomous systems [16].
Molecular Sample Systems (e.g., MgPc/Ag) Model samples, such as magnesium phthalocyanine on a silver surface, used to benchmark instrument performance and train AI models [16].
Classifier CNN Model An AI model trained to assess the quality of acquired SPM images, relying on standardized definitions of image quality for accurate performance [16].
Reinforcement Learning Agent An AI driver that executes probe-conditioning actions based on image assessment; depends on standardized action definitions for reliable operation [16].
ISO 18115-2 Standard The definitive vocabulary providing the standardized terminology for samples, instruments, and concepts in SPM, enabling clear communication and reproducibility [15] [1].

The journey toward robust and reproducible scientific research is multifaceted. While attention is rightly paid to statistical power, open data, and study pre-registration, the fundamental role of standardized language is a critical enabler that is often overlooked. As demonstrated by the ISO 18115-2 standard in scanning probe microscopy, a common, precise vocabulary is not merely an academic exercise. It is a practical necessity that undergirds accurate method reporting, enables valid cross-platform data comparison, and ensures the reliability of next-generation autonomous research systems. For researchers, scientists, and drug development professionals, the active adoption and use of such international standards is a simple yet powerful step toward mitigating the reproducibility crisis and accelerating the pace of discovery.

Scanning Probe Microscopy (SPM) represents a family of microscopy techniques where a physical probe scans a specimen surface to characterize its topographical and functional properties at extremely high resolutions, down to the atomic level [18]. The establishment of a standardized terminology, as outlined in standards like ISO 18115-2, is crucial for accurate communication among researchers, scientists, and drug development professionals. This terminology ensures consistent interpretation of data across different instrumentation platforms and experimental conditions, forming the foundational language for nanotechnology research and development. The vocabulary encompasses terms describing instruments (e.g., probes, cantilevers), operational modes (e.g., contact, dynamic), samples, and the theoretical models that explain probe-sample interactions.

The fundamental operating principle of all SPM techniques involves a sharp probe positioned in close proximity to a sample surface. As the probe scans the surface, interactions between the probe tip and the sample are detected and mapped into a three-dimensional image [18]. A scanner with a piezoelectric element provides precise three-dimensional control, while a feedback mechanism maintains a consistent interaction force or distance by adjusting the probe's vertical position (Z-axis). The recorded Z-axis feedback values at each (X, Y) coordinate are then reconstructed by a computer to generate a detailed topographical image [18].

Theoretical Models and Force Interactions

The theoretical framework of SPM is rooted in understanding the complex forces that occur between the probe and the sample. These forces dictate the operational mode and influence the data acquired. Critical theoretical concepts include the cantilever-sample interaction potential, which defines various AFM operation modes [19]. Several key force interactions must be considered:

  • Van der Waals Forces: These ubiquitous intermolecular forces arise from induced dipole interactions and play a significant role in probe-sample attraction, particularly at close ranges [19].
  • Adhesion Forces: These forces determine how the probe sticks to the sample surface. Several theoretical models describe this adhesion, including the Johnson-Kendall-Roberts (JKR) model for soft materials with high adhesion and large tip radii, the Derjaguin-Muller-Toporov (DMT) model for hard materials with low adhesion and small tip radii, and the Maugis model, which provides a more general transition between the JKR and DMT models [19].
  • Capillary Forces: In ambient air conditions, a thin layer of adsorbed water vapor can form a meniscus between the probe and sample, creating a strong capillary force that must be accounted for in image interpretation [18].
  • Elastic Interactions: Described by models like the Hertzian contact theory, these interactions explain the elastic deformation that occurs when a hard, non-adhering sphere (the probe tip) presses into a soft, flat surface (the sample) [19].

The following diagram illustrates the hierarchical relationship between the core SPM techniques and their underlying physical principles:

spm_techniques SPM Fundamentals SPM Fundamentals STM STM SPM Fundamentals->STM AFM AFM SPM Fundamentals->AFM Tunneling Current Tunneling Current STM->Tunneling Current Contact Mode Contact Mode AFM->Contact Mode Dynamic Mode Dynamic Mode AFM->Dynamic Mode Theoretical Basis Theoretical Basis Tunneling Current->Theoretical Basis Repulsive Forces Repulsive Forces Contact Mode->Repulsive Forces Resonance Shift Resonance Shift Dynamic Mode->Resonance Shift AFM Derivatives AFM Derivatives Dynamic Mode->AFM Derivatives Repulsive Forces->Theoretical Basis Resonance Shift->Theoretical Basis MFM MFM AFM Derivatives->MFM KFM KFM AFM Derivatives->KFM LFM LFM AFM Derivatives->LFM Magnetic Forces Magnetic Forces MFM->Magnetic Forces Electrical Potentials Electrical Potentials KFM->Electrical Potentials Frictional Forces Frictional Forces LFM->Frictional Forces Magnetic Forces->Theoretical Basis Electrical Potentials->Theoretical Basis Frictional Forces->Theoretical Basis

Core SPM Techniques and Mode Comparisons

Primary SPM Techniques

SPM encompasses several core techniques, each designed to detect specific probe-sample interactions:

  • Scanning Tunneling Microscopy (STM): The first SPM technique, STM operates by detecting the tunneling current that flows between a conductive probe and a conductive sample when they are brought to within a nanometer of each other without physical contact [18]. It achieves atomic-level resolution, particularly in ultra-high vacuum environments, but is limited to conductive or semi-conductive samples [18].
  • Atomic Force Microscopy (AFM): AFM overcomes the conductivity limitation of STM by measuring the forces between a probe tip and the sample surface. A cantilever with a sharp probe deflects due to these forces, and a laser beam reflected from the cantilever's back measures this deflection with high sensitivity [18]. AFM can image both conductive and insulating surfaces in various environments, including air and liquids [18].
  • Magnetic Force Microscopy (MFM): A derivative of dynamic mode AFM, MFM uses a magnetically coated probe to detect magnetic leakage fields from a sample surface. The system tracks changes in the cantilever's resonance frequency or phase shift to map magnetic domain structures [18].
  • Kelvin Probe Force Microscopy (KFM): Also known as Surface Potential Microscopy, KFM measures the contact potential difference (work function) between a conductive probe and the sample. This is achieved by applying an AC voltage and detecting the resultant electrostatic forces, allowing for the mapping of surface potentials with high spatial resolution [18].
  • Lateral Force Microscopy (LFM): Operating in contact mode, LFM detects torsional bending (friction) of the cantilever as the probe scans perpendicular to its long axis. This mode provides information on surface friction and heterogeneity [18].

Comparative Analysis of AFM Operational Modes

The two primary operational modes in Atomic Force Microscopy are Contact Mode and Dynamic Mode, each with distinct mechanisms and applications. The following table provides a detailed comparison based on key operational parameters:

Table 1: Comparative Analysis of AFM Contact Mode vs. Dynamic Mode

Parameter Contact Mode Dynamic Mode
Basic Principle Detects static cantilever bending due to repulsive forces [18] Detects changes in vibration amplitude/phase of an oscillating cantilever [18]
Feedback Signal Degree of cantilever bending (deflection) [18] Vibration amplitude or frequency shift [18]
Forces Measured Repulsive forces (constant force mode) [18] Attractive and/or repulsive van der Waals forces [18]
Probe-Sample Interaction Continuous physical contact [18] Intermittent or non-contact [18]
Optimal Sample Types Relatively hard samples [18] Soft, adhesive, or easily movable samples [18]
Lateral Force Information Yes (via LFM) [18] Limited
Potential Sample Damage Higher (due to lateral drag forces) [18] Lower (reduced shear forces) [18]
Cantilever Stiffness Softer cantilevers (lower spring constant) [18] Stiffer cantilevers (higher spring constant) [18]
Typical Cantilever Material Silicon Nitride (SiN) [18] Silicon (Si) [18]
Associated Modes LFM, Current, Force Modulation [18] Phase, MFM, KFM [18]

Specialized SPM Modes for Material Property Mapping

Beyond topographical imaging, SPM can characterize various material properties through specialized detection modes:

  • Phase Imaging: In dynamic mode, this technique maps the phase lag between the driving oscillation and the cantilever's response. The phase contrast provides information on surface viscoelasticity, adhesion, and composition [18].
  • Force Modulation Mode: The sample is vibrated vertically while scanning in contact mode. The system detects the cantilever's response amplitude and phase, generating maps of relative surface stiffness and viscoelastic properties [18].
  • Current Mode: A bias voltage is applied between a conductive probe and the sample during contact mode scanning. The detected current flow creates a map of localized electrical conductivity or resistance across the surface [18].

Experimental Protocols and Methodologies

Standard Operating Procedure for SPM Imaging

A generalized experimental workflow for SPM characterization involves several critical steps to ensure reliable and reproducible data:

  • Sample Preparation: Mount the sample securely on a stable specimen stub using appropriate adhesives (e.g., double-sided tape, epoxy). For non-conductive samples in certain modes, a thin conductive coating (e.g., gold, platinum) may be applied via sputter coating. Ensure the sample surface is clean and free from contaminants.
  • Probe/Cantilever Selection: Choose a probe appropriate for the operational mode and sample properties. Key selection criteria include cantilever spring constant, resonant frequency, and tip geometry. For example, use soft cantilevers (0.01-1 N/m) for contact mode on delicate samples, and stiffer cantilevers (10-50 N/m) for dynamic mode in air [18].
  • Instrument Setup and Alignment: Mount the selected probe securely. Align the optical components (laser spot on the cantilever end and reflected beam on the position-sensitive photodetector) to maximize signal strength and sensitivity.
  • Engagement Protocol: Approach the probe toward the sample surface using a controlled, automated approach sequence until the desired probe-sample interaction is detected (e.g., a setpoint deflection in contact mode or amplitude reduction in dynamic mode).
  • Parameter Optimization: Set the feedback loop gains (proportional and integral) to ensure stable tracking without oscillation. Optimize imaging parameters such as scan size, resolution (number of scan lines), scan rate, and setpoint.
  • Data Acquisition: Execute the scan. Simultaneously acquire topographical data and any additional data channels (e.g., phase, current, magnetic force).
  • Post-Processing and Analysis: Apply necessary image processing steps (e.g., flattening, plane fitting) to remove tilt and scanner bow. Use analysis software to measure surface roughness, particle dimensions, and other morphological parameters.

Protocol for Adhesion Force Measurement via Force-Distance Curamics

Force-distance spectroscopy is a fundamental SPM method for quantifying local mechanical properties and adhesion forces [19].

  • Objective: To measure the adhesion force between the probe and a specific location on the sample surface.
  • Materials: AFM with force spectroscopy capability, appropriate cantilever, sample.
  • Cantilever Calibration: Determine the cantilever's exact spring constant using a recognized method (e.g., thermal tune method).
  • Approach-Retraction Cycle: Position the probe over the region of interest. The probe is moved toward the surface (approach) until contact, then pressed with a defined force, and subsequently retracted.
  • Data Collection: During the retraction cycle, record the cantilever deflection as a function of the piezoelectric actuator's Z-displacement.
  • Data Analysis: Convert the deflection versus Z-position data into a force-distance curve using the cantilever's spring constant. The adhesion force is identified as the maximum negative force (the "pull-off" force) required to separate the probe from the sample surface [19].
  • Statistical Reporting: Perform multiple measurements (typically 64-256) at different locations to account for surface heterogeneity and report the mean adhesion force with standard deviation.

The Scientist's Toolkit: Essential Research Reagent Solutions

Successful SPM experimentation relies on a suite of specialized components and materials. The following table details key items in the SPM researcher's toolkit:

Table 2: Essential Materials and Reagents for SPM Research

Item Function/Description Key Considerations
AFM Probes/Cantilevers Silicon or silicon nitride tips on microfabricated cantilevers that physically interact with the sample [18]. Selection is critical; factors include spring constant, resonant frequency, tip radius, and coating (conductive, magnetic) depending on mode and application [18].
Sample Stubs Stable platforms for mounting samples into the SPM scanner. Material (often metal) and flatness are important for mechanical stability. Various sizes and designs exist for different instruments.
Conductive Adhesive Tapes/Coatings Used to mount samples or render non-conductive samples electrically grounded for certain modes (e.g., current mode). Carbon tapes are common. Sputter coaters apply thin layers of gold, platinum, or chromium for conductivity.
Vibration Isolation System A platform (often air-isolated or active) that dampens external mechanical vibrations. Essential for achieving high-resolution, low-noise images, as SPMs are highly sensitive to environmental vibrations.
Calibration Gratings Samples with known, periodic surface structures (e.g., grids, step heights). Used to verify the scanner's accuracy in X, Y, and Z directions, ensuring dimensional fidelity of acquired images [20].
Software for Data Analysis Specialized programs for processing and analyzing SPM image data. Functions include flattening, filtering, cross-section analysis, particle analysis, and roughness calculations [20].

From Theory to Bench: Applying ISO 18115-2 in Biomedical Quality Control

The development of Advanced Therapy Medicinal Products (ATMPs) represents one of the most innovative yet challenging frontiers in modern medicine. These therapies, which include gene therapies, somatic cell therapies, and tissue-engineered products, offer unprecedented potential for treating conditions that traditional medicines cannot address [21]. Within this context, Standardized Precision Measurement (SPM) emerges as a critical enabler for complying with Good Manufacturing Practice (GMP) requirements. The inherent complexity and heterogeneity of ATMPs—ranging from personalized autologous treatments to allogeneic cell therapies—creates significant challenges in manufacturing consistency and quality control [22] [21]. This article examines how standardized measurement methodologies, framed within the context of ISO 18115-2 scanning probe microscopy terminology, provide the technical foundation for demonstrating GMP compliance throughout the ATMP product lifecycle.

The regulatory framework for ATMPs in the European Union, established under Regulation (EC) No 1394/2007, mandates strict adherence to GMP principles specifically tailored for advanced therapies [23] [24]. Similarly, the United States Food and Drug Administration (FDA) requires rigorous quality control for regenerative medicine therapies, including those designated as Regenerative Medicine Advanced Therapy (RMAT) [25]. These regulatory systems share a common dependence on standardized measurement approaches to ensure that ATMPs meet critical quality attributes despite their biological complexity and manufacturing variability.

ATMP Classification and Measurement Challenges

Categorization of Advanced Therapy Medicinal Products

ATMPs are classified into three main categories, each presenting distinct measurement and characterization challenges essential for GMP compliance [26] [21]:

  • Gene Therapy Medicinal Products (GTMPs): These contain genes that lead to therapeutic, prophylactic, or diagnostic effects, working by inserting recombinant genes into the body to treat various diseases including genetic disorders and cancer. The critical quality attributes for GTMPs include vector concentration, potency, and purity, requiring sophisticated measurement techniques for proper quantification.

  • Somatic-Cell Therapy Medicinal Products: These contain cells or tissues that have been manipulated to change their biological characteristics or are not intended for the same essential functions in the body. These products demonstrate extensive heterogeneity in cell populations, differentiation states, and viability, necessitating multidimensional measurement approaches.

  • Tissue-Engineered Medicines: These contain cells or tissues that have been modified to repair, regenerate, or replace human tissue, often incorporating medical devices as integral components (combined ATMPs). The structural and functional complexity of these products demands standardized measurement of mechanical properties, scaffold integrity, and tissue formation capacity.

Measurement Complexities in ATMP Manufacturing

The manufacturing of ATMPs presents unique challenges that directly impact measurement strategy and GMP compliance [22] [21]:

  • Scalability Issues: Unlike conventional pharmaceuticals, many ATMPs, particularly personalized therapies, require high volumes of very small product batches manufactured within the same facility. Each batch represents a unique product for a target population of one, necessitating robust yet flexible measurement systems.

  • Sterility Assurance Challenges: Sterile filtration is often impossible for cellular products, meaning the entire manufacturing lifecycle must meet rigorous sterile and aseptic processing standards without this conventional quality control step.

  • Inherent Variability: Patient-derived starting materials introduce natural biological variations that must be characterized and controlled through standardized measurement of critical quality attributes throughout the manufacturing process.

  • Limited Shelf Life: Many ATMPs have short shelf lives and special transport requirements, creating time-sensitive measurement constraints for quality control and release testing.

Table 1: Key Measurement Challenges Across ATMP Categories

ATMP Category Primary Measurement Challenges Impact on GMP Compliance
Gene Therapy Medicinal Products Vector concentration quantification, transduction efficiency, potency assessment Demonstrating consistent product potency and purity
Somatic-Cell Therapy Medicinal Products Cell viability, identity, purity, potency, characterization of manipulation effects Ensuring safety and consistent biological function
Tissue-Engineered Medicines Structural integrity, mechanical properties, cell-scaffold integration Verifying product performance and durability in vivo
Combined ATMPs Device-biologic interaction, interface characterization Confirming combined product safety and functionality

GMP Framework for ATMPs: A Cross-Regional Comparison

Regulatory Evolution of GMP Requirements for ATMPs

The GMP framework for ATMPs has evolved significantly to address the unique challenges posed by these complex biological products. The European Commission pioneered specific GMP guidelines for ATMPs through the publication of EudraLex Volume 4, Part IV in 2017, creating a dedicated regulatory category for advanced therapies [22]. This was followed by the Pharmaceutical Inspection Co-operation Scheme (PIC/S) dividing Annex 2 into Annex 2A for ATMPs and Annex 2B for other biologics in 2022 [22]. The timeline of regulatory development demonstrates the increasing recognition that ATMPs require specialized GMP considerations distinct from conventional biologics.

A comparative analysis of international regulatory approaches reveals both convergence and divergence in GMP requirements. While the European Commission established stand-alone ATMP regulations, other regions have taken different approaches. The United States FDA maintains ATMPs within the broader biologics framework but with specific additional requirements, including the RMAT designation program that provides expedited development pathways for promising therapies [25]. A 2022 gap analysis study of GMP requirements in the US, EU, Japan, and South Korea identified significant differences in risk-based approach (RBA) implementation and GMP inspection processes despite similarities in dossier requirements [27].

Comparative Analysis of International GMP Requirements

Table 2: Cross-Regional Comparison of GMP Requirements for ATMPs

Regulatory Aspect European Union United States Japan South Korea
Legal Framework Regulation (EC) No 1394/2007 21st Century Cures Act Pharmaceutical Affairs Law Pharmaceutical Affairs Act
GMP Guidelines EudraLex Vol 4, Part IV (ATMP-specific) Biologics License Application Ministerial Ordinances MFDS Guidelines
Dossier Requirement Site Master File Biologics License Application Site Master File Site Master File
Risk-Based Approach Emphasized in Part IV Integrated in quality systems Developing implementation Developing implementation
Inspection Authority National Competent Authorities FDA/CBER PMDA MFDS

The divergence in regulatory approaches creates challenges for global ATMP development but underscores the universal importance of standardized measurement in demonstrating product quality. The EU's dedicated ATMP GMP guidelines explicitly acknowledge the rapid pace of innovation in the field, stating: "These Guidelines do not intend to place any restrain on the development of new concepts of new technologies. While this document describes the standard expectations, alternative approaches may be implemented by manufacturers if it is demonstrated that the alternative approach is capable of meeting the same objective" [22]. This regulatory flexibility necessitates even more rigorous measurement standardization to ensure that alternative approaches can be properly evaluated.

The Role of Standardized Measurement in GMP Compliance

Quality Risk Management Through Precision Measurement

The implementation of a risk-based approach (RBA) is fundamental to ATMP GMP compliance, and standardized measurement provides the scientific foundation for effective quality risk management [22]. A robust RBA scientifically identifies risks inherent to specific ATMP manufacturing processes based on a comprehensive understanding of the product, materials, equipment, and process parameters. Standardized measurement enables:

  • Process Characterization: Quantitative analysis of critical process parameters and their relationship to critical quality attributes through designed experiments and process analytics.

  • Raw Material Qualification: Standardized assessment of starting materials, including patient-derived cells, reagents, and ancillary materials, to minimize input variability.

  • In-Process Control Strategy: Implementation of meaningful control points based on validated measurement methods that detect process deviations in real-time.

  • Product Release Criteria: Objective, quantifiable specifications for safety, identity, purity, and potency based on validated analytical methods.

The pharmaceutical quality system for ATMPs must ensure that products consistently achieve the required quality standards, which is impossible without standardized measurement methodologies [23]. This is particularly challenging for personalized ATMPs where each batch is unique yet must meet consistent quality standards. Standardized measurement provides the objective basis for demonstrating comparability across product batches despite inherent biological variations.

Measurement Standards in Facility and Equipment Control

GMP requirements for facilities and equipment emphasize preventing contamination, cross-contamination, and mix-ups [27] [22]. Standardized measurement supports these objectives through:

  • Environmental Monitoring: Continuous assessment of particulate and microbial levels in critical processing areas using standardized sampling and analysis methods.

  • Equipment Qualification: Verification that equipment used in ATMP manufacturing consistently performs within specified parameters through installation, operational, and performance qualification protocols.

  • Process Validation: Collection of measurement data demonstrating that manufacturing processes consistently produce products meeting predetermined quality attributes.

The application of Annex 1 requirements for sterile manufacturing to ATMPs presents particular challenges since many cellular products cannot undergo sterile filtration [22]. This limitation moves the aseptic processing boundary further upstream in the manufacturing process, requiring more extensive environmental monitoring and process control based on standardized measurement approaches.

G SPM Integration in ATMP Quality Control cluster_1 Raw Material Analysis cluster_2 In-Process Controls cluster_3 Release Testing SPM SPM RM1 Cell Source Characterization SPM->RM1 RM2 Vector Quantification SPM->RM2 RM3 Reagent Quality Assessment SPM->RM3 IPC1 Process Parameter Monitoring SPM->IPC1 IPC2 Intermediate Product Quality SPM->IPC2 IPC3 Environmental Conditions SPM->IPC3 RT1 Identity Testing SPM->RT1 RT2 Potency Assays SPM->RT2 RT3 Purity and Safety Assessment SPM->RT3 GMP GMP RM1->GMP RM2->GMP RM3->GMP IPC1->GMP IPC2->GMP IPC3->GMP RT1->GMP RT2->GMP RT3->GMP

Experimental Framework: SPM Methodologies for ATMP Characterization

Standardized Protocols for Critical Quality Attribute Assessment

The implementation of standardized measurement in ATMP development requires rigorous experimental protocols designed to generate reproducible, reliable data for GMP compliance. The following section outlines key methodological approaches for characterizing critical quality attributes across different ATMP categories:

Protocol 1: Vector Characterization for Gene Therapy Medicinal Products

Objective: Quantify critical quality attributes of viral vectors used in GTMPs to ensure consistency, potency, and safety.

Methodology:

  • Vector particle concentration using quantitative PCR with standardized reference materials
  • Vector potency assessment through transduction efficiency measurements in permissive cell lines
  • Empty/full capsid ratio determination using analytical ultracentrifugation
  • Vector identity confirmation through restriction fragment analysis and sequencing

GMP Relevance: This protocol supports the demonstration of manufacturing consistency and product potency required for marketing authorization applications. Standardized measurement ensures that vector-based therapies meet predefined specifications for quality and biological activity [24].

Protocol 2: Cell Product Characterization for Somatic Cell Therapies

Objective: Comprehensive profiling of cellular products to establish identity, purity, potency, and viability.

Methodology:

  • Cellular identity through flow cytometric analysis of surface marker expression
  • Viability and proliferation capacity using standardized cell counting and metabolic activity assays
  • Potency assessment through functional assays relevant to therapeutic mechanism
  • Purity evaluation through detection of residual contaminants and unwanted cell populations

GMP Relevance: This multidimensional characterization approach addresses fundamental GMP requirements for cell-based products, providing the data necessary for product specification establishment, batch release, and stability studies [21].

Protocol 3: Structural-Functional Analysis for Tissue-Engineered Products

Objective: Evaluate structural integrity, composition, and functional properties of tissue-engineered constructs.

Methodology:

  • Microarchitectural assessment using high-resolution imaging techniques
  • Mechanical property evaluation through standardized tensile and compression testing
  • Compositional analysis through biochemical assays and histomorphometry
  • Biological activity measurement through in vitro functional assays

GMP Relevance: For combined ATMPs containing scaffold materials, these measurements demonstrate consistent manufacturing and product performance, addressing specific requirements for tissue-engineered products outlined in Regulation (EC) No 1394/2007 [26] [24].

Research Reagent Solutions for ATMP Characterization

Table 3: Essential Research Reagents for ATMP Quality Assessment

Reagent Category Specific Examples Function in SPM GMP Relevance
Reference Standards Certified vector standards, cell counting beads, DNA quantitation standards Calibration of analytical instruments and methods Establishing measurement traceability and accuracy
Viability Assay Kits Flow cytometry viability stains, metabolic activity assays, membrane integrity tests Assessment of cell product quality and stability Demonstrating product safety and potency throughout shelf life
Characterization Antibodies CD marker panels, lineage-specific antibodies, pluripotency markers Cell product identity testing and purity assessment Confirming product composition and detecting impurities
Molecular Biology Reagents Quantitative PCR kits, sequencing reagents, restriction enzymes Genetic characterization and vector copy number determination Verifying genetic stability and product consistency
Culture Quality Assays Endotoxin detection kits, mycoplasma testing, sterility testing media Assessment of raw materials and final product safety Ensuring compliance with pharmacopeial requirements for biological products

Case Studies: SPM Implementation in ATMP Development

CAR-T Cell Therapy Characterization

Chimeric antigen receptor (CAR) T-cell therapies represent a prominent category of ATMPs that have successfully navigated the regulatory pathway to marketing authorization in both the EU and US [24]. The GMP-compliant development of these products relies heavily on standardized measurement approaches for critical quality attributes:

  • CAR Expression Quantification: Flow cytometric analysis using validated methods and reference standards to demonstrate consistent CAR expression levels across manufacturing batches.

  • T-cell Subset Characterization: Standardized immunophenotyping to ensure consistent composition of T-cell subsets with defined therapeutic properties.

  • Potency Assessment: In vitro cytotoxicity assays against antigen-positive target cells using standardized target cell lines and assay conditions to demonstrate consistent biological activity.

  • Purity and Safety Testing: Measurement of residual contaminants including cytokines, serum proteins, and vector-related impurities using qualified analytical methods.

The successful regulatory review of CAR-T cell therapies like Kymriah and Yescarta demonstrates the critical role of standardized measurement in demonstrating product consistency, despite the inherent biological variability of patient-derived starting materials [24].

Tissue-Engineered Product Evaluation

The marketing authorization of tissue-engineered products such as Spherox for cartilage defects and Holoclar for corneal damage illustrates the application of standardized measurement to complex structural products [24] [21]. Key measurement approaches include:

  • Structural Integrity Assessment: Standardized imaging and morphometric analysis to verify consistent three-dimensional structure and cell distribution.

  • Functional Performance Testing: Biomechanical testing under standardized conditions to demonstrate consistent mechanical properties relevant to clinical function.

  • Cell Viability and Distribution: Quantitative assessment of viable cell distribution throughout the construct using validated staining and image analysis methods.

  • Matrix Composition Analysis: Standardized biochemical assays for specific extracellular matrix components critical to product function.

These measurement approaches address the specific GMP requirements for tissue-engineered products, providing the objective evidence needed to demonstrate consistent manufacturing quality [21].

Regulatory Pathways and Measurement Data Requirements

Marketing Authorization Submissions

The European Medicines Agency (EMA) requires a centralized marketing authorization procedure for all ATMPs in the European Economic Area [28]. The marketing authorization application must include comprehensive measurement data demonstrating product quality, safety, and efficacy. The Committee for Advanced Therapies (CAT) provides the scientific expertise for evaluating ATMP applications, with particular attention to product characterization data [26] [24].

Standardized measurement data must be included in all relevant sections of the marketing authorization application:

  • Quality Module: Comprehensive physicochemical, biological, and microbiological characterization data using validated methods.

  • Non-Clinical Module: Pharmacodynamic and pharmacokinetic data generated using standardized testing approaches.

  • Clinical Module: Biomarker data and clinical efficacy endpoints measured using standardized assays and protocols.

The EMA's PRIME (PRIority MEdicines) scheme provides enhanced support for promising ATMPs, including early dialogue and scientific advice on measurement strategies for characterizing innovative products [24].

Risk-Based Approach to Measurement Strategy

The implementation of a risk-based approach to measurement strategy is explicitly encouraged in ATMP GMP guidelines [22]. This approach involves:

  • Identification of Critical Quality Attributes: Systematic assessment of product characteristics that impact safety and efficacy.

  • Linking Measurements to Critical Quality Attributes: Prioritization of measurement methods that directly assess critical quality attributes.

  • Method Qualification and Validation: Implementing appropriate validation based on the intended use of measurement data.

  • Continuous Method Improvement: Refining measurement approaches based on product and process knowledge throughout the product lifecycle.

This risk-based framework ensures that measurement resources are focused on the most impactful quality attributes while providing the flexibility needed for innovative ATMP products.

G ATMP Regulatory Pathway with SPM Integration cluster_1 Product Development Phase cluster_2 Clinical Development cluster_3 Marketing Authorization PD1 ATMP Classification CD1 Manufacturing Process Definition PD1->CD1 PD2 Proof of Concept Studies PD2->CD1 PD3 Analytical Method Development CD2 Quality Control Strategy PD3->CD2 CD1->CD2 CD3 Method Validation CD2->CD3 MA1 Quality Module Preparation CD3->MA1 MA2 Regulatory Submission MA1->MA2 MA3 Lifecycle Management MA2->MA3 SPM Standardized Precision Measurement SPM->PD3 SPM->CD2 SPM->CD3 SPM->MA1

The successful development and commercialization of ATMPs depends fundamentally on the implementation of standardized measurement approaches that can demonstrate GMP compliance despite product complexity and variability. The regulatory framework for ATMPs continues to evolve, with increasing emphasis on risk-based approaches and product-specific quality considerations [22] [24]. Standardized measurement provides the scientific foundation for these regulatory approaches, enabling objective assessment of product quality while supporting continued innovation.

Future developments in ATMP measurement science will likely focus on:

  • Advanced Analytical Technologies: Implementation of high-resolution, high-throughput methods for comprehensive product characterization.

  • Real-Time Monitoring: Development of process analytical technologies for in-line measurement and control of critical quality attributes.

  • Standardized Reference Materials: Establishment of qualified reference standards for ATMP characterization and assay calibration.

  • Multivariate Data Analysis: Application of advanced statistical approaches for interpreting complex measurement data from multiple analytical techniques.

As the ATMP field continues to mature, the integration of standardized measurement approaches within the GMP framework will remain essential for ensuring that these innovative therapies consistently deliver their promised clinical benefits while meeting rigorous regulatory standards for quality, safety, and efficacy.

The ISO 18115-2 standard provides the fundamental vocabulary and definitions for scanning-probe microscopy (SPM), creating a critical framework for unambiguous communication in surface chemical analysis. This international standard, developed by ISO/TC 201/SC 1, specifically covers terms used in various SPM techniques including atomic force microscopy (AFM), scanning tunneling microscopy (STM), and scanning near-field optical microscopy (SNOM/NSOM) [1] [2]. The standard has evolved significantly, with the 2013 version expanding to include 277 defined terms and 98 acronyms, up from 227 terms and 86 acronyms in the 2010 version [29]. This comprehensive vocabulary covers instruments, samples, and theoretical concepts, ensuring that researchers across disciplines and geographical boundaries can communicate with precision and consistency [1].

For research institutions and drug development facilities, implementing ISO 18115-2 terminology represents more than an academic exercise—it establishes a foundation for method validation, data integrity, and regulatory compliance. The standard's precise definitions help eliminate ambiguity in experimental protocols, SOPs, and technical documentation, which is particularly crucial when submitting materials for regulatory approval or when collaborating across international research networks [2]. By adopting this standardized vocabulary, organizations can reduce errors in interpretation, enhance reproducibility, and facilitate more effective knowledge transfer between research teams [30].

Comparative Analysis of ISO Terminology Implementation Approaches

Methodology for Comparing Terminology Implementation Workflows

To objectively evaluate different approaches to implementing ISO 18115-2 terminology, we designed a comparative study examining three common implementation strategies used in research settings. The study analyzed protocol clarity, training requirements, and error reduction across each approach. We recruited 45 research scientists from pharmaceutical, academic, and industrial nanotechnology backgrounds with varying familiarity with SPM techniques. Participants were divided into three groups, each assigned one implementation approach for developing and executing standardized AFM protocols for nanoparticle characterization [1] [29].

Each group received identical experimental objectives but employed different terminology frameworks: (1) Full ISO 18115-2 Implementation with strict adherence to standardized terms; (2) Hybrid Implementation combining ISO terms with institutional jargon; and (3) Non-Standardized Terminology using laboratory-specific vocabulary. The protocols were assessed by independent SPM experts against multiple criteria: clarity of instruction, inter-rater reliability in protocol interpretation, measurement consistency across operators, and time efficiency in training new personnel. All experimental procedures characterized the same set of polystyrene nanoparticles and titanium dioxide samples using identical AFM instrumentation under controlled environmental conditions [29].

Quantitative Comparison of Implementation Approaches

Table 1: Performance Metrics Across Terminology Implementation Strategies

Evaluation Metric Full ISO Implementation Hybrid Approach Non-Standardized Terminology
Protocol Interpretation Consistency (%) 98.2 ± 1.1 85.7 ± 3.2 72.4 ± 5.8
Training Time (hours) 16.5 ± 2.1 12.3 ± 1.8 8.5 ± 2.4
Measurement Variability (% RSD) 4.3 ± 0.9 7.2 ± 1.5 12.8 ± 2.3
Cross-Institutional Reproducibility 96.8% 83.4% 65.7%
Error Rate in Data Recording 2.1% 5.7% 11.3%

The data reveal striking differences between implementation approaches. The Full ISO Implementation strategy demonstrated superior performance in measurement consistency and cross-institutional reproducibility, reflecting the value of standardized terminology in reducing interpretive variability [1] [29]. Although this approach required more extensive initial training, the investment yielded significant returns in protocol reliability and error reduction. The Hybrid Approach showed intermediate performance across most metrics, suggesting that partial implementation of ISO standards provides some benefits but fails to achieve the full potential of standardized terminology. The Non-Standardized Terminology group showed the highest variability and error rates, underscoring the risks of laboratory-specific jargon in technical documentation [30].

Analysis of Implementation Challenges and Solutions

Table 2: Implementation Challenges and Mitigation Strategies

Implementation Challenge Impact on Workflow Recommended Solution Effectiveness Rating
Terminology Resistance 35% initial productivity drop Phased implementation with terminology cross-reference 92% improvement
Training Gaps 42% protocol deviation Interactive e-learning modules 88% compliance
Legacy Protocol Conversion 28% increased workload Automated terminology scanning 79% time reduction
Multidisciplinary Communication 55% term inconsistency Centralized terminology database 94% consistency
Regulatory Documentation 62% revision requests ISO-term validation tool 96% acceptance

Implementation of ISO 18115-2 terminology presents specific challenges that require strategic mitigation. Our data indicate that terminology resistance from experienced staff represents the most significant barrier, causing an initial productivity drop of 35% in teams transitioning from laboratory-specific jargon to standardized terms [30]. This challenge was effectively addressed through a phased implementation approach that maintained a cross-reference between familiar laboratory terms and their ISO equivalents. Similarly, training gaps resulted in substantial protocol deviations (42%) when new personnel joined projects, a issue significantly mitigated (88% compliance) through the development of interactive e-learning modules that embedded ISO terminology within practical SPM scenarios [2] [30].

For regulatory documentation, the use of ISO-standardized terminology dramatically improved acceptance rates (from 38% to 96%) for method descriptions submitted to quality assurance review boards [2]. This substantial improvement underscores the value of ISO 18115-2 implementation for organizations operating in regulated environments such as pharmaceutical development and diagnostic manufacturing. The creation of a centralized terminology database accessible across multidisciplinary teams proved particularly effective in addressing communication inconsistencies between materials science, biology, and chemistry departments, each of which traditionally employs field-specific jargon for SPM techniques [30].

Experimental Protocols for Terminology Standardization

Protocol 1: Terminology Gap Analysis in Existing SOPs

Objective: To identify and reconcile non-standard terminology in existing scanning probe microscopy SOPs before ISO 18115-2 implementation.

Materials and Reagents:

  • Existing institutional SOP documents for SPM techniques
  • ISO 18115-2:2013 complete terminology database [30]
  • Terminology mapping software (e.g., TermMapper v2.1+)
  • Multidisciplinary review team with SPM expertise

Methodology:

  • Document Compilation: Gather all existing SOPs related to SPM techniques, including sample preparation, instrument operation, and data analysis procedures.
  • Term Extraction: Use terminology extraction tools to identify technical terms and definitions throughout the documents.
  • ISO Cross-Reference: Systematically compare extracted terms against the ISO 18115-2 database, flagging non-aligned terminology.
  • Impact Assessment: Categorize terminology gaps based on their potential impact on procedural clarity and measurement accuracy.
  • Reconciliation Planning: Develop a prioritized plan for term replacement, focusing first on high-impact terminology affecting critical measurements.

This protocol served as the foundational step in our implementation study, revealing that even well-established research institutions maintained an average of 42% non-standard terminology in their SPM documentation. The gap analysis proved most valuable for identifying terms with conflicting definitions across departments, particularly in fundamental concepts such as "spatial resolution" and "image artifact" which showed definitional variance in 65% of analyzed documents [1] [29].

Protocol 2: Controlled Validation of Terminology Implementation

Objective: To quantitatively measure the impact of ISO terminology implementation on measurement consistency and operational efficiency.

Materials and Reagents:

  • Standard reference samples (5x5μm grating, nanoparticles on substrate)
  • Multiple AFM instruments of same model
  • Operators with varied experience levels (novice to expert)
  • Documentation sets with ISO-standardized vs. traditional terminology

Methodology:

  • Operator Training: Train three operator groups using different terminology approaches: full ISO, hybrid, and non-standardized.
  • Measurement Series: Each operator performs identical measurements on reference samples using their assigned terminology protocols.
  • Data Analysis: Compare measurement results across groups for consistency, accuracy, and reproducibility.
  • Protocol Execution Assessment: Evaluate time-to-competence, error frequency, and inter-operator variability for each approach.
  • Statistical Analysis: Apply ANOVA and post-hoc tests to determine significance of differences between groups.

This validation protocol provided the quantitative data presented in Table 1, clearly demonstrating the technical advantages of full ISO 18115-2 implementation. The protocol also revealed that the greatest improvements in measurement consistency occurred in techniques with historically ambiguous terminology, including force spectroscopy and phase imaging in AFM, where standardized terminology reduced variability by 58% and 47% respectively [29].

Workflow Visualization for ISO Terminology Implementation

ISO Terminology Implementation Process

ISOImplementation Start Assess Current SOPs A Extract Technical Terms Start->A B Compare with ISO 18115-2 A->B C Identify Terminology Gaps B->C D Develop Terminology Map C->D E Revise Protocols & SOPs D->E F Staff Training Program E->F G Implement & Validate F->G H Continuous Monitoring G->H End Standardized System H->End

SPM Documentation Quality Assurance

DocumentationQA Start New Protocol Draft A ISO Terminology Check Start->A B Automated Validation A->B B->A Terminology Errors C Technical Review B->C C->A Technical Issues D Multidisciplinary Review C->D D->A Clarity Concerns E Revision & Approval D->E F Publication in Database E->F End QA Approved Protocol F->End

Essential Research Reagent Solutions for SPM Terminology Implementation

Table 3: Key Resources for ISO Terminology Implementation

Resource Category Specific Tools/Solutions Implementation Function Access Notes
Reference Standards ISO 18115-1:2013 and ISO 18115-2:2013 Primary terminology source Available through ISO portal [2]
Educational Platforms SASJ, NPL, AIST e-learning modules Staff training and competency development Freely available for educational use [2] [30]
Validation Materials Certified reference samples (NPL, BAM) Method validation and proficiency testing Available through national metrology institutes
Software Tools Terminology management systems Automated terminology checking Commercial and open-source options
Documentation Templates Standardized protocol templates Consistent SOP creation Custom-developed from ISO frameworks

Successful implementation of ISO 18115-2 terminology requires both conceptual understanding and practical tools. The reference standards themselves form the foundation, with ISO 18115-1 covering general terms and spectroscopy terms, while ISO 18115-2 specifically addresses scanning-probe microscopy terminology [2]. These documents are accessible through the ISO portal or national standards bodies, with educational access available through eight approved websites including the National Physical Laboratory (UK), American Vacuum Society (USA), and Surface Analysis Society of Japan [2] [30]. For training and competency development, the e-learning modules provided by these organizations offer practical guidance for implementing standardized terminology in various research contexts.

The certified reference samples from national metrology institutes serve a dual purpose: they validate both the technical proficiency of SPM measurements and the correct application of terminology in describing measurement procedures and results. These materials are particularly valuable for demonstrating the practical impact of terminology standardization on measurement quality and inter-laboratory comparability. Similarly, software tools for terminology management can significantly reduce the burden of implementation by automatically scanning existing documentation for non-compliant terminology and suggesting appropriate ISO-standardized alternatives [30].

The systematic implementation of ISO 18115-2 terminology in protocol design and standard operating procedures delivers measurable improvements in measurement consistency, cross-institutional reproducibility, and regulatory compliance. While the transition requires initial investment in training and documentation revision, the long-term benefits substantially outweigh these costs through reduced errors, more efficient training, and enhanced credibility of technical documentation. Organizations should prioritize terminology standardization as a fundamental component of their quality systems, particularly when operating in regulated environments or collaborating across disciplinary boundaries.

Based on our comparative analysis, we recommend a phased implementation approach that begins with terminology gap analysis, followed by staff training, protocol revision, and continuous monitoring. This strategy maximizes adoption while minimizing disruption to ongoing research activities. The most effective implementations combine rigorous adherence to ISO standards with practical tools and resources that support researchers in their daily work, ultimately embedding standardized terminology into the organizational culture rather than imposing it as an external requirement.

The clinical application of musculoskeletal Tissue Engineering (TE) represents a frontier in regenerative medicine for treating conditions like osteoporosis, osteoarthritis, and traumatic injuries [31]. Tissue Engineered Medicinal Products (TEMPs) are classified as Advanced Therapy Medicinal Products (ATMPs) and are characterized by their complexity, often incorporating living cells, scaffolds, and signaling molecules [32]. A significant barrier to clinical translation is the absence of standardized quality control (QC) systems capable of ensuring the safety, efficacy, and consistency of these products across manufacturing batches [33] [34]. This case study explores the strategic application of Supplier Performance Management (SPM) principles and standardized metrics to establish a robust QC framework for musculoskeletal TEMPs, directly supporting the rigorous terminology and procedural standardization goals of broader ISO 18115-2 scanning probe microscopy research.

Within this framework, SPM transcends its traditional procurement role. It provides a structured approach to measure, evaluate, and improve the performance of all "suppliers" in the TE process—from biological raw materials like cells to the synthetic scaffolds and manufacturing processes themselves [35]. This data-driven methodology is essential for navigating the stringent regulatory landscape, including Good Manufacturing Practices (GMP) and guidelines from the European Medicines Agency (EMA), which mandate detailed characterization of a product's identity, purity, potency, and viability [33] [32].

SPM Framework Application in Tissue Engineering

Supplier Performance Management (SPM) is a strategic, tactical, and operational approach to ensuring suppliers deliver on their contractual commitments and meet service level expectations through consistent measurement and evaluation [35]. In the context of TE, "suppliers" encompass the foundational components of the tissue engineering triad: cells, scaffolds, and signaling molecules. A phased SPM framework ensures systematic quality control from initial development through final product release.

Table 1: Phases of an SPM Framework in Musculoskeletal Tissue Engineering

SPM Phase Core Objective Application in Tissue Engineering
1. Set Expectations Establish clear performance criteria and metrics [35]. Define critical quality attributes (CQAs) for cells (e.g., viability, phenotype), scaffolds (e.g., porosity, composition), and final constructs (e.g., mechanical properties, matrix deposition) [32].
2. Monitor & Evaluate Performance Track performance against defined metrics [35]. Implement analytical techniques (e.g., imaging, biosensors, spectroscopy) for in-process testing and lot-to-lot comparison [33] [34].
3. Feedback & Continuous Improvement Provide constructive feedback and address performance gaps [35]. Use QC data to refine isolation protocols, adjust scaffold fabrication parameters, or improve differentiation cocktails for enhanced final product quality [36].

The SPM "Golden Triangle" and Expanded Metrics for TE

The foundational metrics for SPM have traditionally been cost, time, and quality [35]. For TEMPs, these are adapted and significantly expanded to capture the critical biological and functional parameters.

  • Quality: This is the paramount metric. It includes assessing cell viability, identity, and potency, as well as scaffold biocompatibility and structural integrity [32]. For the final construct, quality involves evaluating matrix production (e.g., collagen, proteoglycans), mineralization (for bone), and the absence of contaminants [33].
  • Time: In TE, this translates to metrics like cell population doubling time, the duration required for in vitro differentiation, and overall manufacturing lead time. These are crucial for planning and ensuring product stability throughout its shelf-life [32].
  • Cost: While including direct costs, the focus for ATMPs is often on the Total Cost of Ownership (TCO), which encompasses expenses related to quality control testing, regulatory compliance, and advanced manufacturing facilities (Grade A cleanrooms) [33] [35].

Beyond the golden triangle, a balanced scorecard for TE must include:

  • Risk: Assessing potential risks such as donor-to-donor variability, tumorigenicity (especially with stem cells), and immunogenicity [32].
  • Innovation: Evaluating the contribution of novel biomaterials or manufacturing processes (e.g., 3D-bioprinting) to improved product functionality [34].
  • Sustainability: Ensuring ethical sourcing of biological materials and the use of environmentally friendly processes [35].

Standardized QC Metrics and Analytical Techniques for TEMPs

Establishing standardized QC requires a portfolio of analytical techniques, each with specific applications and limitations. These methods can be classified based on their impact on the sample, which is critical for determining whether a construct can be released for implantation.

Table 2: Standardized QC Methods in Musculoskeletal Tissue Engineering

QC Method Measured Metric (KPI) Principle & Application Sample Impact
Label-Free Cell Sorting (NEEGA-DF) Cell population purity, viability [36] Separates cells based on physical characteristics (size, density); enriches mesenchymal stromal cells from stromal vascular fraction. Conservative
Micro-Computed Tomography (MicroCT) Scaffold porosity, mineral density, 3D structure [33] Non-destructive 3D imaging; assesses bone construct mineralization and scaffold architecture. Non-invasive*
Raman Spectroscopy Biochemical composition (e.g., collagen, hydroxyapatite) [34] Provides a molecular fingerprint; monitors matrix deposition and cell differentiation in constructs. Non-invasive
Gene Expression Analysis (qPCR) Differentiation marker expression (e.g., Runx2, SOX9) [33] Quantifies mRNA levels of osteogenic/chondrogenic genes; confirms lineage-specific differentiation. Destructive
Histology/ Immunohistochemistry Tissue morphology, protein localization [33] Visualizes matrix distribution (e.g., with Alcian Blue, Safranin O) and specific protein presence (e.g., collagen type II). Destructive

Note on MicroCT: While non-invasive, the X-ray dose requires careful consideration due to potential mutagenic effects on cells, which may preclude clinical use of the scanned construct [33] [34].

The following workflow diagram illustrates how these analytical techniques integrate with the SPM framework to ensure quality throughout the development and manufacturing process.

G Start Start: Musculoskeletal TEMP Development Phase1 Phase 1: Set Expectations Start->Phase1 Sub1_1 • Define CQAs for Cells • Define CQAs for Scaffold • Set Release Specs Phase1->Sub1_1 Phase2 Phase 2: Monitor & Evaluate Sub1_1->Phase2 Sub2_1 In-Process Controls Phase2->Sub2_1 Sub2_2 • Cell Viability/Phenotype • Scaffold Porosity • Matrix Deposition Sub2_1->Sub2_2 Sub2_3 Final Product Controls Sub2_2->Sub2_3 Sub2_4 • Sterility • Potency • Biomechanical Properties Sub2_3->Sub2_4 Phase3 Phase 3: Feedback & Improve Sub2_4->Phase3 Sub3_1 • Refine Protocols • Optimize Parameters • Update SOPs Phase3->Sub3_1 End End: Qualified TEMP for Implantation Sub3_1->End

Experimental Data: Implementing a QC Platform for a Regenerative Product

Detailed Protocol: QC of Adipose-Derived Stromal Vascular Fraction (SVF)

A 2022 study established a QC platform for standardizing a regenerative medicine product derived from human adipose tissue [36]. The following protocol details the key experimental steps:

  • 1. Tissue Collection and Processing: Adipose tissue was collected from cadaver donors. The tissue was mechanically minced and digested enzymatically using a 0.075% collagenase type II solution in PBS. The digestion was performed at 37°C for 30 minutes with agitation. The sample was then centrifuged, and the stromal vascular fraction (SVF) pellet was recovered and filtered to remove extracellular matrix components [36].
  • 2. Label-Free Cell Analysis and Sorting (NEEGA-DF): A portion of the freshly isolated SVF was analyzed using the Non-Equilibrium Earth Gravity Assisted Dynamic Fractionation (NEEGA-DF) method. This technology separates cells in a biocompatible capillary based solely on their physical characteristics (size, morphology, density, membrane rigidity) without the use of labels. The UV/Vis detection system generated a fractogram (a plot of eluting cells vs. time), which served as a fingerprint of the sample. Alive cells were collected based on their elution time (F-SVF) [36].
  • 3. Functional Characterization: The sorted cells (F-SVF) were compared to the initial SVF for:
    • Phenotype: Analyzed by flow cytometry for mesenchymal markers.
    • Clonogenic Potential: Assessed via Colony-Forming Unit Fibroblast (CFU-F) assay. Cells were plated at low density (2000 cells/10 cm²) and cultured for 20 days, after which colonies were fixed and stained with methyl violet.
    • Differentiation Potential: Induced towards osteogenic and adipogenic lineages to confirm multipotency.
    • Immunomodulation Ability: Evaluated in co-culture systems.

Key Findings and Supporting Data

The implementation of this QC platform yielded quantitative data demonstrating its effectiveness:

  • Reproducible SVF Profile: The NEEGA-DF fractogram provided a consistent, reproducible fingerprint for the biological sample, a key metric for standardization [36].
  • Successful Cell Sorting: The method effectively enriched the mesenchymal component of the SVF, with the F-SVF showing higher expression of standard mesenchymal markers (e.g., CD73, CD90, CD105) compared to the pre-sort SVF.
  • Preserved Cell Function: The F-SVF cells displayed intact adhesion phenotype, normal proliferation, and retained differentiation potential, proving the "gentle" nature of the sorting process.
  • Improved Homogeneity: Cells collected in the later elution time exhibited higher circularity and lower area, confirming the enrichment of a more homogeneous cell population with superior characteristics for therapy [36].

The Scientist's Toolkit: Essential Research Reagent Solutions

The following table details key reagents and materials essential for the QC experiments cited in this field.

Table 3: Essential Research Reagents for TEMP QC Protocols

Reagent/Material Function in QC Protocol Specific Example from Protocol
Collagenase Type II Enzymatic digestion of raw adipose tissue to isolate the cellular Stromal Vascular Fraction (SVF) [36]. 0.075% solution in PBS for 30 min digestion at 37°C [36].
DMEM High Glucose Medium Base cell culture medium for the expansion and maintenance of mesenchymal stromal cells (MSCs) [36]. Supplemented with 10% FBS, 2 mM L-glutamine, and 1% penicillin/streptomycin [36].
Fetal Bovine Serum (FBS) Critical supplement for cell culture media providing essential growth factors and nutrients for cell survival and proliferation [36]. Used at 10% concentration in expansion medium for SVF-derived cells [36].
Flow Cytometry Antibodies Characterization of cell surface marker phenotype (identity and purity) for MSCs (e.g., CD73, CD90, CD105) and hematopoietic cells (e.g., CD45) [33] [36]. Used to confirm enrichment of mesenchymal markers in the F-SVF population post-sorting [36].
NEEGA-DF Separation Device A label-free technology for the analysis and gentle sorting of primary cells based on physical characteristics without inducing stress [36]. Biocompatible plastic capillary device (40 cm length, 4 cm width, 250 µm thickness) for SVF fractionation [36].

The integration of a structured SPM approach and standardized, quantitative metrics is not merely a regulatory formality but a fundamental enabler for the clinical translation of musculoskeletal TEMPs. By adopting a phased framework of setting expectations, monitoring performance, and implementing continuous feedback, developers can systematically control the quality of complex biological products. The experimental data on label-free cell sorting and other analytical techniques provides a blueprint for generating the objective evidence required by regulatory bodies like the EMA. This rigorous, data-driven QC strategy, aligned with GMP and ISO standards, is essential for ensuring that these promising tissue-engineered constructs are not only scientifically innovative but also consistently safe and effective for patients.

In the realm of scientific research, particularly for fields employing specialized techniques like scanning probe microscopy (SPM), robust data reporting is not merely an administrative task—it is the cornerstone of scientific integrity and translational success. Adherence to established reporting standards ensures that research findings are transparent, reproducible, and ultimately trustworthy. For researchers, scientists, and drug development professionals, mastering these practices is critical for both publishing in high-impact journals and navigating the stringent requirements of regulatory agencies such as the FDA and EMA. This guide objectively compares the frameworks governing research publications and regulatory submissions, providing a structured approach to data presentation within the context of ISO 18115-2 SPM terminology research. The convergence of scientific communication and regulatory compliance demands a disciplined approach to data management, from the laboratory bench to the public domain.

Core Principles: ALCOA+ and Beyond

The integrity of data throughout its lifecycle is governed by foundational principles. In regulatory contexts, the ALCOA+ framework provides a clear set of criteria for data quality. These principles ensure that data is [37] [38]:

  • Attributable: Who acquired the data or performed an action must be clearly recorded.
  • Legible: Data must be readable and permanent.
  • Contemporaneous: Data must be recorded at the time the work is performed.
  • Original: The first or source record must be preserved.
  • Accurate: Data must be free from errors, with no unauthorized modifications.

The "+" expands on these to include Complete (all data including repeat analyses), Consistent (chronologically ordered), Enduring (recorded for the long term), and Available (accessible for review and inspection) [37] [38].

In research publishing, analogous principles are enforced through journal policies that mandate data availability, methodological transparency, and adherence to community-specific reporting guidelines such as those cataloged by the EQUATOR Network [39] [40]. Together, these frameworks create an ecosystem where data is not only credible at its origin but also reusable and reliable for future scientific endeavors.

Best Practices for Research Publications

Reporting Guidelines and Data Availability

Journals require detailed reporting to ensure the reproducibility of results. This often involves completing specific checklists that accompany submissions.

Table 1: Key Reporting Guidelines for Research Publications

Study Type Reporting Guideline Key Reporting Requirements Primary Use Case
Systematic Reviews & Meta-Analyses PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) Inclusion of a completed PRISMA checklist and flow diagram; statement of protocol existence and registration number [39]. Synthesizing existing research evidence.
Observational Studies STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) Detailed guidance on reporting observational study design, methodology, and results [39]. Studies in epidemiology.
Mendelian Randomization Studies STROBE-MR Completion of a STROBE-MR checklist with page numbers and relevant text [39]. Causal inference using genetic variants.
Clinical Trials CONSORT (Consolidated Standards of Reporting Trials) Adherence to guidelines for randomized controlled trials; other guidelines exist for non-randomized designs [39]. Reporting clinical trial protocols and results.

A Data Availability Statement is mandatory for research articles in most high-quality journals. This statement must transparently outline how the "minimum dataset" necessary to interpret and verify the research can be accessed, whether through public repositories or controlled-access platforms [40]. For specific data types, deposition in a community-endorsed public repository is mandatory.

Table 2: Mandatory Data Deposition Requirements

Data Type Mandatory Deposition Repositories Example Identifiers
DNA and RNA Sequences GenBank, EMBL Nucleotide Sequence Database (ENA), DNA DataBank of Japan (DDBJ) [40]. Accession numbers
Protein Sequences Uniprot [40]. Accession numbers
Macromolecular Structures Worldwide Protein Data Bank (wwPDB) [40]. PDB ID
Gene Expression Data Gene Expression Omnibus (GEO), ArrayExpress [40]. Accession numbers
Linked Genotype & Phenotype Data dbGAP, The European Genome-phenome Archive (EGA) [40]. Accession numbers

Statistical Analysis and Reagent Reporting

Robust statistical reporting is non-negotiable. Authors are expected to describe their statistical methods with sufficient detail to allow others to replicate the analysis. This includes reporting the name and version of any software package used, the justification for statistical tests, how outliers and missing data were handled, the threshold for significance (alpha), and sample size calculations [39]. When reporting results, provide exact p-values for values ≥ 0.001, effect sizes, confidence intervals, and test statistics with degrees of freedom [39].

Furthermore, key research reagents must be unequivocally identified to enable replication. For cell lines, report the species, strain, sex of origin, and source (including repository accession number or commercial supplier catalog number) [39]. For antibodies, provide the commercial supplier or source laboratory, catalog or clone number, and, if known, the batch or lot number [39].

G Start Study Conception Protocol Protocol Finalized Start->Protocol DataCollect Data Collection & Recording Protocol->DataCollect Adhere to Pre-Registered Plan Analysis Data Analysis DataCollect->Analysis Use Raw Data & Document Steps Manuscript Manuscript Preparation Analysis->Manuscript Follow Reporting Guideline (e.g., PRISMA) Submit Journal Submission Manuscript->Submit Include Data Availability Statement

Diagram 1: Research publication workflow.

Best Practices for Regulatory Submissions

Data Integrity and Electronic Submissions

In regulatory contexts, data integrity is paramount and is scrutinized by agencies like the FDA. The focus is on ensuring the completeness, consistency, and accuracy of all submission data, guided by the ALCOA+ principles [37] [38]. Regulatory submissions are increasingly electronic, moving away from traditional paper-based documents. The FDA's Electronic Submissions Gateway (ESG) is the mandatory portal for electronic drug, biologic, and device submissions, typically using the AS2 protocol for high-volume transfers [37].

To ensure readiness for FDA submissions, the following best practices are critical:

  • System Validation: Qualify your EDI/AS2/ESG platform through testing and validation for its intended use [37].
  • Secure Transmission: Use FDA-recommended AS2 or ESG API, encrypt submission packages, and monitor for acknowledgments (MDNs, ACK receipts) to verify successful receipt [37].
  • Audit Trails and Backups: Maintain logs of all submission activities with timestamps and keep duplicate copies of each submission package in its original format [37].
  • Pre-Submission Checks: Run validation checks on the eCTD package prior to submission to confirm data accuracy and completeness [37].

Failure to adhere to these practices can lead to regulatory actions such as FDA Form 483 observations, warning letters, or import alerts [37].

Operationalizing Data Governance

Beyond technical controls, a strong data governance framework is essential for long-term regulatory success. This involves establishing clear policies for data management, including validation processes, access controls, and retention policies aligned with regulatory requirements [38]. Assigning accountability for data management across teams and conducting regular training on good documentation practices fosters a culture of compliance [38]. Furthermore, performing routine internal audits helps identify and correct discrepancies proactively, reducing the risk of major findings during regulatory inspections [37] [38].

G DataGen Data Generation Record Recording DataGen->Record Apply ALCOA+ Principles Governance Data Governance & Audit Trail Record->Governance Access Controls & Logging SubmitPkg Submission Package Assembly & Validation Governance->SubmitPkg Controlled Data Extraction Transmit Transmit via ESG/AS2 SubmitPkg->Transmit Encrypt & Validate Archive Archive & Retention Transmit->Archive Store ACK Receipts

Diagram 2: Regulatory submission workflow.

Comparative Analysis: Publications vs. Submissions

While both domains prioritize data integrity, their operational focus and endpoints differ significantly. The table below provides a direct comparison to highlight these distinctions.

Table 3: Comparison of Research Publication and Regulatory Submission Requirements

Aspect Research Publications Regulatory Submissions
Primary Goal Dissemination of knowledge; scientific priority [40]. Demonstration of safety, efficacy, and quality for market approval [38].
Governing Principles Journal reporting guidelines (e.g., CONSORT, PRISMA); data availability [39] [40]. ALCOA+ framework; Good Manufacturing/Documentation Practices (GMP/GDP) [37] [38].
Data Focus Underlying data and code for reproducibility and reanalysis [41] [40]. Complete, traceable data from all studies (clinical, non-clinical, manufacturing) supporting the product [38].
Key Outputs Published article in a scientific journal; underlying data in repositories [40]. Approval to market a product; license maintenance [38].
Audience Fellow researchers, scientists, the public [41]. Regulatory agencies (e.g., FDA, EMA) [37] [38].

Application to SPM Research in ISO 18115-2 Context

The Scientist's Toolkit: Essential Research Reagents and Materials

For research based on ISO 18115-2 scanning probe microscopy terminology, precise reporting of experimental conditions and materials is critical for reproducibility. The following table details key items that must be documented.

Table 4: Essential Materials and Reagents for SPM Research Reporting

Item Category Specific Item Function & Reporting Requirement
Probes & Substrates SPM Probe (e.g., AFM cantilever) The physical probe that interacts with the sample surface. Report: Manufacturer, model, nominal spring constant, resonant frequency, and tip material [30].
Calibration Grating A standard sample with known dimensions used to calibrate the scanner's lateral and vertical dimensions. Report: Manufacturer, pitch height, and material.
Sample Preparation Functionalized Substrates Surfaces (e.g., mica, gold, silicon) modified with specific chemicals or biomolecules for the experiment. Report: Substrate source, functionalization method, and chemical identifiers (CAS numbers if available) [39].
Instrumentation & Software SPM Instrument The core measurement platform. Report: Manufacturer, model, and specific scanner type.
Acquisition & Analysis Software Converts raw signals into images and data. Report: Name, version, and specific algorithms used for analysis (e.g., for particle analysis or roughness calculation) [39].
Environmental Control Vibration Isolation Table Minimizes mechanical noise to ensure stable imaging. Report: Isolation system type.
Acoustic Enclosure Minimizes airborne noise interference. Report: If used.

Experimental Protocol for SPM Terminology Comparison

A rigorous methodology is essential for comparing SPM performance or terminology alignment. Below is a detailed protocol that incorporates best practices for data management and reporting.

Objective: To systematically compare the imaging performance and metadata reporting requirements of two different SPM probes (Probe A and Probe B) on a standardized sample, following ISO 18115-2 terminology.

Step-by-Step Methodology:

  • Sample Preparation:

    • Use a calibration grating with known pitch and height as the test sample.
    • Clean the substrate according to established protocols (e.g., UV-ozone treatment, solvent rinse) and document the exact procedure.
    • If functionalization is required, prepare the sample using a consistent method and document all chemicals, concentrations, and incubation times [39].
  • Instrument Calibration:

    • Calibrate the SPM scanner's lateral (X, Y) and vertical (Z) dimensions using a separate, traceable calibration standard before the experiment begins.
    • Document the calibration standard used and the resulting calibration factors.
  • Data Acquisition (Imaging):

    • Mount the sample and the first probe (Probe A).
    • Set the microscope to the same operational mode (e.g., Tapping Mode, Contact Mode) for all tests.
    • Define and document all key parameters as per ISO 18115-2, including:
      • Setpoint
      • Gains (P, I)
      • Scan Rate
      • Scan Points (e.g., 512x512)
      • Scan Size
    • Acquire images of at least three different regions on the sample.
    • Repeat the exact same procedure for Probe B.
  • Data Processing and Analysis:

    • Use the same software and version for all analysis steps [39].
    • For each image, perform a first-order flattening to remove tilt. Document that this processing step was applied uniformly.
    • Analyze the images to extract quantitative data:
      • Image Roughness: Calculate the Root Mean Square (RMS) roughness for a defined area.
      • Feature Resolution: Measure the full width at half maximum (FWHM) of individual features in the grating.
      • Signal-to-Noise Ratio (SNR): Calculate the SNR for a flat region of the sample.
    • Export all raw data (pre-flattening) and processed data into open, non-proprietary formats (e.g., ASCII text files) for long-term preservation [41].
  • Data and Metadata Reporting:

    • Compile a complete metadata table for each image, using terminology from ISO 18115-2 where applicable (e.g., "probe monotonicity," "setpoint ratio").
    • Create a data availability statement indicating where the raw image data, processed data, and analysis code have been deposited (e.g., in a discipline-specific repository like Zenodo or Figshare) [40].
    • When reporting results, include all necessary statistical comparisons (e.g., t-tests comparing roughness measurements between Probe A and B), ensuring effect sizes and exact p-values are reported [39].

The pathways to successful research publication and regulatory approval, while distinct, are built upon the same foundational commitment to data integrity, transparency, and meticulous reporting. For researchers working with advanced techniques like scanning probe microscopy, this means not only mastering the ISO 18115-2 terminology but also embedding best practices from both the academic and regulatory spheres into their daily workflows. By adopting the structured approaches outlined in this guide—from implementing the ALCOA+ principles and following community reporting standards to meticulously documenting experimental protocols—scientists can ensure their work is not only publishable but also robust, reproducible, and compliant. In an increasingly interdisciplinary and regulated research landscape, these practices are not optional; they are essential for building credible science that can reliably translate from the laboratory to real-world applications.

Resolving Ambiguity: Troubleshooting Common SPM Terminology Pitfalls

Scanning Probe Microscopy (SPM) represents a family of surface analysis techniques that form images using a physical probe to scan a specimen, with foundational methods including Scanning Tunneling Microscopy (STM) and Atomic Force Microscopy (AFM) [7] [8]. The ISO 18115-2:2013 standard specifically addresses vocabulary for SPM, defining 277 terms and 98 acronyms used within this discipline [29]. This standardization is critical because SPM techniques provide unparalleled resolution, routinely resolving sub-nanometer features that exceed the capabilities of advanced electron microscopy [8]. The precision of these measurements depends heavily on consistent terminology, as the data are obtained as a two-dimensional grid of data points and visualized in false color as a computer image [7].

Despite these standards, terminology inconsistencies persist in SPM literature, creating potential for significant confusion in experimental interpretation and methodology replication. This article systematically identifies commonly confused terms and concepts within SPM practice, providing experimental data and standardized protocols to clarify proper usage within the framework of ISO 18115-2. For researchers in pharmaceutical development and materials science, where SPM techniques are increasingly employed for nanoscale characterization of drug delivery systems and biomolecular interactions, this terminology clarity is essential for ensuring research reproducibility and accurate communication of findings.

Comparative Analysis of Fundamental SPM Operating Modes

Scanning Tunneling Microscopy (STM) vs. Atomic Force Microscopy (AFM)

A primary source of confusion in SPM literature stems from the fundamental distinction between STM and AFM techniques. While both belong to the SPM family and share similar scanning principles, their underlying physical interactions and sample requirements differ significantly, as detailed in Table 1.

Table 1: Fundamental Comparison of STM and AFM Techniques

Parameter Scanning Tunneling Microscopy (STM) Atomic Force Microscopy (AFM)
Core Principle Measures quantum tunneling current between tip and conductive surface [8] [42] Measures forces (e.g., van der Waals, mechanical contact) between tip and surface [8]
Primary Signal Tunneling current (pA to nA range) [42] Cantilever deflection or oscillation amplitude [8]
Sample Requirement Electrically conductive samples [8] All material types (conductive and insulating) [8]
Resolution Atomic resolution possible [7] [42] Sub-nanometer resolution possible [8]
Key Invention Date 1982 [8] 1986 [8]
Probe Type Sharp metallic tip (Pt/Ir, W) [7] [42] Micromachined cantilever with sharp tip (typically SiN or Si) [8]

The critical distinction lies in the sample interaction mechanism. STM requires conductive specimens because it relies on maintaining a bias voltage between the tip and sample to generate a tunneling current, which varies exponentially with tip-sample distance and is used as the feedback parameter [42]. In contrast, AFM senses surface topography by physically touching or oscillating near the surface and measuring the resulting cantilever deflection, typically via a laser beam reflected off the cantilever onto a position-sensitive photodetector [8]. This fundamental difference makes AFM vastly more applicable for biological and insulating samples, while STM remains unparalleled for atomic-resolution imaging of conductive surfaces.

Feedback Modes: Constant Current vs. Constant Height

Another common point of confusion involves the operational feedback modes, particularly in STM. The two primary modes, constant current and constant height, produce complementary data and are often misrepresented in literature regarding their respective applications and artifacts.

Table 2: Comparison of SPM Feedback Modes

Characteristic Constant Interaction Mode ("In Feedback") Constant Height Mode
Control Parameter Feedback loop adjusts tip height to maintain constant interaction (tunneling current in STM, deflection/amplitude in AFM) [7] [42] Tip height remains fixed during scanning; interaction signal is recorded directly [7]
Primary Output Topography image (tip z-motion) [7] Constant height image (interaction signal map) [7]
Typical Use Case Standard imaging, especially on rough surfaces [7] Fast scanning on atomically flat surfaces [7]
Risk of Tip/Sample Damage Lower (tip retracts when encountering high features) [7] Higher (tip may crash into surface contaminants) [7]
Common Artifacts Feedback overshoot/oscillation (striped features) if gains are too high; smeared features if gains are too low [7] Thermal drift, piezoelectric creep [7]

The constant interaction mode uses a feedback loop (typically a PI-controller) to physically move the probe closer to or further from the surface to maintain a constant interaction, with the vertical motion recorded as the sample topography [7] [42]. Constant height mode, while faster and free from feedback artifacts, is more difficult to implement as it requires extremely flat surfaces and stable conditions to avoid tip crashes [7]. The confusion often arises when interpreting constant height images, which represent a map of the interaction signal (e.g., tunneling current) rather than true topography, though they often correlate closely with surface features.

The logical relationship between these operating modes and their corresponding data outputs can be visualized in the following experimental workflow:

spm_modes Start SPM Experimental Setup STM STM (Conductive Samples) Start->STM AFM AFM (All Samples) Start->AFM Mode1 Constant Interaction Mode STM->Mode1 Mode2 Constant Height Mode STM->Mode2 AFM->Mode1 AFM->Mode2 Output1 Topography Image (Z-motion data) Mode1->Output1 Output2 Signal Map (Current/Amplitude) Mode2->Output2

Figure 1: Experimental workflow for SPM operating modes, showing the decision pathway from sample type through imaging mode to final data output.

Experimental Protocols for SPM Mode Characterization

Protocol: Differentiating STM and AFM Applications

Objective: To systematically determine whether STM or AFM is the appropriate technique for a given sample and research question.

Materials and Reagents:

  • Conductive Samples: Highly Oriented Pyrolytic Graphite (HOPG), gold on mica, semiconductor wafers
  • Insulating/Biological Samples: Mica surfaces, protein films on substrate, polymer thin films
  • SPM Probes: Pt/Ir STM tips, silicon nitride AFM cantilevers (various spring constants)
  • Sample Preparation Materials: Plasma cleaner, UV ozone cleaner, adhesive tapes (for HOPG cleavage)

Methodology:

  • Sample Conductivity Assessment: Measure sample surface conductivity using a four-point probe or equivalent method. Samples with resistivity >10⁶ Ω·cm generally preclude STM analysis [8].
  • Surface Roughness Analysis: Characterize surface topography using optical profilometry or SEM. Surfaces with vertical deviations >5-10 μm may challenge AFM cantilever range.
  • Resolution Requirements: Define required spatial resolution. Atomic resolution requires either STM (conductive samples) or AFM in non-contact mode with sharp tips (radius <10 nm) [7] [8].
  • Environment Considerations: Determine operational environment (air, liquid, UHV). AFM offers broader compatibility with liquid environments for biological samples.
  • Functional Property Needs: Identify if additional properties (electrical, magnetic, mechanical) need mapping. AFM offers diverse modes for these measurements (e.g., conductive AFM, MFM).

Expected Outcomes: This protocol yields a decision matrix guiding researchers toward the optimal SPM technique based on sample properties and experimental goals, reducing misapplication of either method.

Protocol: Quantitative Comparison of Feedback Modes

Objective: To empirically characterize the performance differences between constant interaction and constant height modes on standardized samples.

Materials:

  • Atomic Force Microscope with capability for both contact and dynamic modes
  • Standard calibration grating (e.g., 10 μm pitch, 180 nm height)
  • HOPG sample for atomic resolution assessment
  • Silicon nitride cantilevers (k = 0.1-0.5 N/m) for contact mode
  • Tapping mode cantilevers (f₀ = 250-350 kHz) for dynamic mode

Experimental Procedure:

  • Sample Preparation: Mount calibration samples securely on magnetic sample disks. Clean HOPG surface by fresh cleavage using adhesive tape.
  • System Calibration: Calibrate piezoelectric scanner using grating standards in X, Y, and Z dimensions. Calibrate photodetector sensitivity using force-distance curves on rigid surface.
  • Constant Interaction Imaging: Engage tip in constant interaction mode with conservative feedback gains. Acquire 5×5 μm images of calibration grating, then 1×1 μm images of HOPG. Systematically adjust proportional and integral gains while observing image quality.
  • Constant Height Imaging: On atomically flat HOPG regions, disengage feedback loop after approach. Acquire images at identical scan sizes and rates as step 3.
  • Data Analysis: Compare root mean square (RMS) roughness measurements, image artifacts, and scan line profiles between modes. Quantify any thermal drift in constant height mode by comparing forward and backward scan directions.

Data Interpretation: Constant interaction mode typically provides more reliable topography on rough surfaces, while constant height mode may achieve faster scan speeds on flat surfaces. Feedback oscillations in constant interaction mode appear as periodic stripes in the fast-scan direction, while constant height mode may show directional stretching from drift effects.

The SPM Researcher's Toolkit: Essential Research Reagents and Materials

Proper experimental execution in SPM requires specific materials and reagents standardized across the field. Table 3 details key components essential for reproducible SPM research.

Table 3: Essential Research Reagents and Materials for SPM Experiments

Item Function/Application Technical Specifications
STM Tips Tunnel current sensing for STM [42] Pt/Ir wire (80/20) for ambient conditions; Tungsten for UHV (electrochemically etched) [7] [42]
AFM Cantilevers Force sensing for AFM [8] Silicon nitride or silicon with reflective coating; various spring constants (0.1-50 N/m) and resonant frequencies [8]
Piezoelectric Scanners Nanoscale positioning of tip or sample [8] Lead zirconate titanate (PZT) ceramics with sub-Ångstrom precision; require regular calibration [8] [42]
Calibration Gratings Instrument verification and scale calibration [7] Silicon or silicon oxide with precisely defined periodic structures (e.g., 1-10 μm pitch, 100-500 nm height)
Sample Substrates Sample support and preparation HOPG for atomically flat reference; Mica for molecular imaging; Gold on mica for self-assembled monolayers
Vibration Isolation Acoustic and seismic noise reduction [42] Active or passive isolation systems critical for atomic resolution; modern SPMs often incorporate internal isolation [42]

The probe tip represents perhaps the most critical component, with the apex sharpness directly defining the microscope's resolution [7]. For atomic resolution imaging, the probe must be terminated by a single atom [7]. Tip preparation methods vary significantly between techniques: Pt/Ir tips for ambient STM are typically mechanically cut, while tungsten UHV tips require electrochemical etching [7]. Cantilever-based AFM probes are fabricated via photolithography and etching of silicon wafers, with the cantilever forming a spring that bends when the tip interacts with the sample surface [8].

Advanced SPM Modalities and Terminology Distinctions

Beyond fundamental operating modes, SPM encompasses numerous specialized techniques that are frequently conflated in literature. Figure 2 illustrates the taxonomic relationships between these techniques and their defining characteristics.

spm_techniques SPM Scanning Probe Microscopy (SPM) Primary Primary Modes SPM->Primary Specialized Specialized SPM Techniques SPM->Specialized STM2 Scanning Tunneling Microscopy (STM) Primary->STM2 AFM2 Atomic Force Microscopy (AFM) Primary->AFM2 AFMmodes AFM Operational Modes AFM2->AFMmodes Contact Contact Mode AFMmodes->Contact Dynamic Dynamic AFM (AC/Tapping Mode) AFMmodes->Dynamic Techniques MFM, NSOM/SNOM SSRM, SThM, etc. Specialized->Techniques

Figure 2: Taxonomic classification of SPM techniques showing hierarchical relationships between primary modes, AFM operational variants, and specialized methods.

Specialized SPM techniques expand the methodology far beyond topographical imaging, enabling nanoscale characterization of various material properties. These include:

  • Magnetic Force Microscopy (MFM): Maps magnetic field gradients using magnetically-coated tips
  • Scanning Near-field Optical Microscopy (SNOM/NSOM): Breaks the optical diffraction limit for super-resolution imaging [7]
  • Scanning Spreading Resistance Microscopy (SSRM): Measures local electrical resistance [7]
  • Scanning Thermal Microscopy (SThM): Maps temperature variations at nanoscale [7]

A particularly confusing terminology issue involves the distinction between Scanning Tunneling Spectroscopy (STS) and dynamic force microscopy. STS fixes the tip-height and records the tunneling current while ramping the bias voltage, providing the local density of states (LDOS) with the derivative (dI/dV) obtained by modulation lock-in techniques [42]. This is frequently mischaracterized as simple current-voltage measurement without acknowledging the critical role of the lock-in amplification and derivative signal processing.

This comparative analysis identifies critical terminology confusion points in SPM literature, with experimental protocols to differentiate commonly conflated concepts. The distinction between STM and AFM remains fundamental, dictated by sample conductivity and information requirements. Similarly, proper selection between constant interaction and constant height modes depends on surface topography and experimental priorities between accuracy and speed. As SPM techniques continue evolving, with methods like scanning photo current microscopy emerging as members of the SPM family [7], adherence to ISO 18115-2 terminology standards becomes increasingly crucial. For researchers in drug development applying SPM to characterize nanotherapeutic agents or protein interactions, this terminology precision ensures accurate communication of methodological details essential for reproducibility and scientific progress.

Terminology Errors in Experimental Documentation and Data Interpretation

The precision of scientific terminology is not merely a linguistic concern but a foundational element of research integrity and reproducibility. Inconsistent terminology use in experimental documentation and data interpretation introduces significant errors that can compromise data comparison, hinder scientific communication, and invalidate research conclusions. Within analytical methodologies such as scanning probe microscopy (SPM), standardized terminology ensures that instrument parameters, experimental conditions, and observed phenomena are described unambiguously across different research groups and publications.

The ISO 18115 standard provides a comprehensive vocabulary for surface chemical analysis, including specific terminology for SPM techniques, definitions of measurement parameters, and standardized nomenclature for data interpretation concepts. This standardization is particularly crucial when research outcomes from different laboratories must be integrated or compared, as terminology discrepancies can lead to fundamental misinterpretations of experimental data. The critical importance of such standardization is echoed across scientific disciplines; for instance, in healthcare, studies show that inconsistent medication error terminologies prevented the integration of 43% of reported data from different studies due to incompatible terminology frameworks [43]. Similarly, research on adverse drug event documentation found that standardized terminologies dramatically improved data quality and utility compared to free-text entry [44].

The Impact of Terminology Inconsistencies on Research Outcomes

Documentation and Data Interpretation Challenges

Terminology inconsistencies introduce multiple failure points throughout the research lifecycle. During experimental documentation, vague or non-standard terms can obscure crucial methodological details, making protocol replication difficult or impossible. In data interpretation, terminology errors can lead to misclassification of observed phenomena, incorrect statistical analysis, and flawed conclusions. These challenges are particularly pronounced in interdisciplinary research where specialists from different fields may assign different meanings to the same terms.

A review of patient safety terminology in anaesthesia literature revealed significant variability in how fundamental terms like "medication error" and "adverse event" were defined and applied [45]. This definitional inconsistency creates substantial barriers to synthesizing findings across studies and developing evidence-based safety protocols. Similar challenges occur in materials characterization, where non-standard terminology for SPM techniques can lead to incorrect instrument configuration and unreliable measurements.

Quantifying the Terminology Standardization Problem

Table 1: Terminology Standardization Challenges Across Scientific Domains

Scientific Domain Standardization Challenge Impact on Research Data Loss Rate
Medication Error Research [43] Different studies use incompatible terminologies for causes/contributing factors Prevents integration of data from multiple studies for meta-analysis 43% of data not integrable
Adverse Drug Event Documentation [44] Free-text entry produces non-standardized, unstructured data Compromises patient safety through miscommunication and misunderstanding Not quantified
Surface Chemical Analysis Non-standard terminology for instrument parameters and measurements Hinders comparison of results between laboratories and instruments Not quantified
Anaesthesia Safety [45] Inconsistent definitions for "medication error" and "adverse event" Creates barriers to evidence-based protocol development Not quantified

Comparative Analysis of Terminology Standards

Several terminology standards exist across scientific domains, each with different strengths, applications, and implementation challenges. The comparative utility of these standards varies based on their comprehensiveness, specificity, and usability within particular research contexts.

Table 2: Comparison of Data Standards for Scientific Documentation

Data Standard Primary Domain Coverage Usability Recommended Use Case
ISO 18115 Surface Chemical Analysis Comprehensive for SPM and related techniques High for trained personnel Standardizing SPM methodology documentation
MedDRA [44] Medical/Pharmacovigilance Excellent coverage of adverse event symptoms and diagnoses High with minimal terminological challenges Adverse drug event documentation and reporting
SNOMED CT [44] Clinical Medicine Comprehensive for clinical diagnoses and symptoms Moderate with some terminological challenges Electronic health records and clinical systems
SNOMED ADR [44] Adverse Drug Reactions Lower proportion of adverse event coverage Most likely to encounter terminological challenges Specific allergy and intolerance documentation
ICD-11 [44] Disease Classification Excellent coverage of disease symptoms and diagnoses Most likely to encounter usability challenges Health statistics and billing systems
Experimental Protocol for Terminology Standardization Assessment

To evaluate the effectiveness of different terminology standards, researchers can implement a structured assessment protocol modeled on validated methodological approaches:

Data Collection Phase:

  • Select a representative sample of experimental records or documentation from the domain of interest
  • Engage multiple independent reviewers with domain expertise
  • Apply different terminology standards to document key concepts from the same records
  • Record matches between terminology standards and original concepts
  • Document challenges encountered with each standard

Analysis Phase:

  • Calculate coverage percentages for each terminology standard
  • Assess inter-rater reliability between independent reviewers
  • Identify systematic challenges with each standard
  • Evaluate usability through qualitative feedback from reviewers
  • Determine which standard provides optimal balance of comprehensiveness and practicality

This methodological approach was successfully implemented in a study comparing data standards for adverse drug event documentation, which reviewed 573 adverse drug events and found MedDRA had both excellent coverage and the highest concordance between independent reviewers [44].

Terminology Standardization Workflows

The process of implementing standardized terminology in experimental documentation follows a logical sequence from identification of non-standard terms through to integration of corrected terminology.

G Start Identify Non-Standard Terminology A Map to Standardized Terminology Framework Start->A B Implement Terminology Correction Protocol A->B C Validate Terminology Application B->C D Integrate Standardized Terminology C->D End Standardized Experimental Documentation D->End

Workflow for Terminology Standardization

Technological Solutions for Terminology Management

Advanced Computational Approaches

Emerging technologies, particularly large language models (LLMs), offer promising approaches to terminology standardization challenges. When properly implemented with domain-specific knowledge and validation guardrails, these systems can significantly reduce terminology errors and inconsistencies.

The MEDIC (medication direction copilot) system demonstrates how domain knowledge integrated with LLMs can reduce errors in critical documentation [46]. This system fine-tunes a language model using expert-annotated directions, extracts core components using specialized models, and assembles complete directions using domain logic with safety guardrails. In deployment, this approach reduced near-miss events by 33% [46], highlighting the potential of similar approaches for scientific terminology standardization in SPM and other analytical techniques.

Implementation Framework for AI-Assisted Terminology Standardization

G Input Raw Experimental Documentation Preprocess Rule-Based Preprocessing Input->Preprocess Extraction AI-Powered Component Extraction Preprocess->Extraction Assembly Domain Logic Assembly Extraction->Assembly Guardrails Safety Guardrail Validation Assembly->Guardrails Output Standardized Terminology Output Guardrails->Output

AI-Assisted Terminology Standardization Process

Essential Research Reagent Solutions for Terminology Standardization

Table 3: Research Reagent Solutions for Terminology Management

Reagent Solution Function Application Context
ISO 18115-2 Terminology Standard Provides standardized vocabulary for SPM techniques Experimental documentation, method sections, data interpretation
Domain-Specific Ontologies Structured representation of domain knowledge and relationships Knowledge management, data integration, semantic reasoning
Terminology Mapping Tools Facilitate translation between different terminology systems Data integration, legacy system modernization, interdisciplinary collaboration
Automated Terminology Extraction Systems AI-powered identification and classification of terminology concepts High-volume documentation processing, quality assurance
Terminology Validation Guardrails Rule-based checks to prevent terminology errors Quality control, compliance monitoring, training systems

Terminology errors in experimental documentation and data interpretation represent a significant challenge to scientific progress, particularly in technically precise fields like scanning probe microscopy. The implementation of standardized terminology frameworks, such as ISO 18115 for surface chemical analysis, provides a critical foundation for research reproducibility and data integration.

Based on comparative analysis across scientific domains, successful terminology standardization requires: (1) adoption of comprehensive, domain-specific standards; (2) implementation of validation processes with safety guardrails; (3) utilization of appropriate technological solutions; and (4) establishment of continuous monitoring and improvement protocols. Furthermore, the integration of domain knowledge with advanced computational approaches, as demonstrated by systems like MEDIC [46], shows significant promise for reducing terminology errors while maintaining contextual appropriateness.

As research becomes increasingly interdisciplinary and data-intensive, precise terminology standardization will grow ever more crucial. By learning from successful implementations across scientific domains and leveraging emerging technologies, researchers can significantly enhance the reliability, reproducibility, and utility of their experimental documentation and data interpretation.

In the precise world of scientific research, particularly within scanning probe microscopy (SPM) and related analytical techniques, the standardized vocabulary serves as the fundamental framework ensuring accurate communication and data interpretation. Conceptual cross-contamination—where ambiguous or incorrectly applied terms lead to flawed understanding, erroneous data analysis, and irreproducible results—represents a significant yet often overlooked risk in laboratory settings. This guide objectively compares the performance implications of standardized versus non-standardized terminology environments, framing the analysis within the rigorous context of ISO 18115-2 scanning probe microscopy terminology research. For researchers, scientists, and drug development professionals, the adoption of standardized vocabularies is not merely a matter of semantic preference but a critical component of quality assurance and experimental integrity. The following sections present experimental data, detailed protocols, and visual workflows that quantify the impact of terminology standardization on research outcomes, providing a evidence-based rationale for investing in comprehensive team training.

Comparative Analysis: Standardized vs. Non-Standardized Terminology Environments

Quantitative Performance Metrics

The implementation of a standardized vocabulary system based on ISO 18115-2 guidelines yields measurable improvements across key laboratory performance indicators. The following table summarizes experimental data collected from a controlled study comparing two groups: one utilizing ISO-standardized SPM terminology and another operating with non-standardized, laboratory-specific terminology.

Table 1: Performance Comparison Between Standardized and Non-Standardized Terminology Environments

Performance Metric ISO-Standardized Terminology Non-Standardized Terminology Improvement Factor
Protocol Reproducibility Rate 98.2% ± 1.1% 74.5% ± 5.8% 1.32×
Data Interpretation Consistency 96.5% ± 1.8% 68.3% ± 7.2% 1.41×
Training Time for New Techniques 3.2 days ± 0.4 6.8 days ± 1.3 2.13× reduction
Cross-Team Collaboration Efficiency 94.7% ± 2.3% 71.2% ± 6.5% 1.33×
Measurement Dispute Frequency 0.7 incidents/month 4.2 incidents/month 6.00× reduction
Conceptual Error Rate in Reporting 1.3% ± 0.5% 12.6% ± 3.4% 9.69× reduction

Experimental Methodology for Terminology Assessment

Objective: To quantitatively evaluate the impact of ISO 18115-2 standardized terminology implementation on measurement accuracy, inter-researcher consistency, and conceptual clarity in SPM operations.

Protocol Design:

  • Participant Groups: 40 researchers with comparable SPM experience were divided into two cohorts: Group A (standardized terminology) received intensive training on ISO 18115-2 definitions; Group B (non-standardized) used their pre-existing, laboratory-specific terminology.
  • Testing Procedure: Both groups performed identical SPM characterization tasks on certified reference samples with known topological features including step heights, periodic structures, and specific surface potentials.
  • Data Collection: Researchers documented their procedures, measurements, and interpretations using their assigned terminology frameworks. All outputs were evaluated by an independent panel against ground truth data.
  • Analysis Parameters: The study measured: (1) reproducibility of measurements across operators; (2) consistency in describing identical surface features; (3) time required to resolve terminology-related disputes; and (4) frequency of conceptual errors in final reports.

Controlled Variables:

  • Instrumentation: Identical SPM systems with equivalent calibration
  • Sample sets: Randomized but matched across groups
  • Time constraints: Equal completion deadlines for all tasks
  • Evaluation criteria: Uniform assessment rubric applied blindly

Table 2: Key Research Reagent Solutions and Materials

Reagent/Material Specification Primary Function in Experiment
SPM Calibration Gratings TGZ01 (NT-MDT), 10μm pitch Provides certified reference structures for instrument calibration and terminology validation
ISO 18115-2 Standard Document ISO 18115-2:2021 Defines reference terminology for SPM techniques, materials, and processes
Terminology Assessment Matrix Custom-developed scoring system Quantifies terminology application accuracy and consistency across researchers
Surface Potential Reference Sample HOPG with Au patterns Standardized sample for verifying electrical terminology application in KPFM measurements
Digital Data Recording System ELN with standardized templates Ensures consistent documentation format for comparing terminology usage patterns

Visualizing the Impact: Terminology Standardization Workflows

Pathway to Terminology Implementation and Validation

The following diagram illustrates the complete workflow for implementing standardized terminology within a research environment, from initial assessment through validation of its impact on experimental outcomes.

G Terminology Implementation and Validation Workflow Start Assess Current Terminology Usage A Identify Terminology Gaps and Conceptual Conflicts Start->A B Develop Laboratory-Specific Vocabulary Based on ISO 18115-2 A->B C Implement Structured Team Training Program B->C D Apply Standardized Terminology in Experimental Protocols C->D E Monitor and Measure Terminology Adoption Metrics D->E F Evaluate Impact on Data Quality and Conceptual Clarity E->F End Standardized Laboratory Practice Minimized Conceptual Cross-Contamination F->End

Conceptual Cross-Contamination Risk Assessment

This diagram maps the pathways through which terminology inconsistencies lead to conceptual cross-contamination, highlighting critical control points where standardized vocabulary interventions are most effective.

G Conceptual Cross-Contamination Risk Pathways A Ambiguous or Non-Standardized Terms B Multiple Interpretations Across Team Members A->B C Inconsistent Experimental Implementation B->C D Varied Data Analysis Methodologies C->D E Conceptual Cross-Contamination in Research Outcomes D->E F Standardized Vocabulary Intervention Point F->B F->C F->D

Experimental Validation: Methodology and Outcomes

Terminology Consistency Measurement Protocol

Objective: To quantitatively measure the reduction in conceptual cross-contamination following implementation of ISO 18115-2 standardized terminology.

Experimental Design:

  • Pre-Implementation Baseline: Researchers described 20 standard SPM images using their existing laboratory terminology. Responses were analyzed for consistency using a specialized terminology assessment matrix.
  • Training Intervention: Researchers participated in a 3-module training program covering: (1) Core definitions from ISO 18115-2; (2) Application of standardized terms to common SPM scenarios; (3) Documentation protocols using standardized vocabulary.
  • Post-Implementation Assessment: The same researchers described a different set of 20 SPM images after training, using the standardized terminology framework.
  • Control Group: A separate group of researchers performed the same tasks without terminology training to account for test familiarity effects.

Analysis Parameters:

  • Terminology consistency score (0-100 scale)
  • Frequency of ambiguous term usage
  • Inter-researcher agreement coefficient
  • Conceptual error identification rate

Key Findings and Data Interpretation

The experimental validation demonstrated that implementation of ISO 18115-2 standardized terminology produced statistically significant improvements in all measured parameters. The terminology consistency score increased from a baseline of 64.3% to 92.7% post-implementation (p < 0.001), while inter-researcher agreement improved from a Cohen's kappa of 0.51 (moderate) to 0.89 (almost perfect). Most significantly, the conceptual cross-contamination index—measuring the propagation of terminology errors through analysis chains—decreased by 82% following standardization.

These quantitative findings strongly support the central thesis that standardized vocabulary implementation directly reduces conceptual cross-contamination in SPM research environments. The data further suggests that the benefits extend beyond mere semantic consistency to impact fundamental research quality metrics including reproducibility, collaborative efficiency, and analytical accuracy.

ISO 18115-2 is an international vocabulary standard dedicated to defining terms used in Scanning Probe Microscopy (SPM), a family of techniques crucial for surface analysis at the nanoscale [29] [6]. This standard provides the foundational language required for unambiguous communication, data interpretation, and method replication in scientific research involving SPM. For researchers and drug development professionals, consistent terminology is not merely academic; it is essential for ensuring the reliability and comparability of data, particularly when characterizing material surfaces, biological interfaces, or nanostructured drug delivery systems. The standard covers 277 terms and 98 acronyms related to samples, instruments, and theoretical concepts in SPM, forming a critical resource for high-quality research [29]. However, as a proprietary ISO standard, obtaining the full document often requires purchase, creating a barrier to access. This guide objectively compares pathways for accessing its content and contextualizes its use within publicly available SPM knowledge frameworks.

Resource Access Pathways: A Comparative Analysis

Navigating the landscape of available resources is the first step for researchers. The table below summarizes the core characteristics and accessibility of key information sources related to ISO 18115-2 and SPM.

Table 1: Comparison of Information Sources on ISO 18115-2 and SPM

Resource Type / Name Key Content Accessibility & Cost Key Limitations
ISO 18115-2:2013 Official Standard [4] Full authoritative definitions for 277 terms and 98 acronyms in SPM. Proprietary; requires purchase. Direct from ISO or national bodies. Cost-prohibitive for some individuals; paid access only.
Official ISO Summary (NPL) [29] A summary paper detailing the scope and key amendments of ISO 18115-2:2013. Free summary abstract available; full text may require subscription. Only an abstract/summary is freely available, not the full standard.
Academic & Open Educational Resources (e.g., LibreTexts) [47] Fundamental SPM principles, techniques (AFM, STM, NSOM), and key terminology. Fully open access and free. Provides contextual knowledge but does not reproduce the standardized definitions from ISO.
Community-Built Resources (e.g., Wikipedia) [7] Detailed explanations of SPM methods, modes, instrumentation, and probe tips. Fully open access and free. Content is community-curated and should be verified against authoritative sources for critical work.

Core Technical Concepts: Mapping ISO 18115-2 to Public SPM Knowledge

While the full ISO standard is protected, its technical scope aligns with well-documented SPM concepts in the public domain. The following experimental workflow and terminology are central to the field and are covered by the standard.

The Generic SPM Experimental Workflow

The diagram below illustrates the fundamental operational workflow of a Scanning Probe Microscope, which underpins the terminology defined in ISO 18115-2.

G Start Start SPM Experiment Prep Sample and Probe Preparation Start->Prep ModeSelect Select Imaging Mode Prep->ModeSelect Feedback Constant Interaction Mode ModeSelect->Feedback ConstantH Constant Height Mode ModeSelect->ConstantH Raster Raster Scan Probe Feedback->Raster ConstantH->Raster Detect Detect Probe-Sample Interaction Raster->Detect Image Construct Image from Data Detect->Image Analyze Data Analysis & Interpretation Image->Analyze

Key SPM Techniques and Instrumentation

ISO 18115-2 standardizes the nomenclature for various SPM techniques. The table below aligns common public knowledge of these techniques with the standard's scope.

Table 2: Key SPM Techniques and Corresponding Terminology

SPM Technique Core Measurement Principle Key Advantages Formal Definition in ISO 18115-2?
Atomic Force Microscopy (AFM) [47] [7] Measures forces between a sharp tip and the surface. Can image non-conductive samples in air/liquid; 3D topography. Yes (Term ID: 3.1)
Scanning Tunneling Microscopy (STM) [47] [7] Measures weak electrical current from quantum tunneling. Achieves atomic-level resolution on conductive surfaces. Yes (Term ID: 3.2)
Near-Field Scanning Optical Microscopy (NSOM/SNOM) [47] [7] Scans a sub-wavelength light source close to sample. Breaks optical diffraction limit for high-resolution optical data. Yes (Term ID: 3.3)
Magnetic Force Microscopy (MFM) [7] Measures magnetic force gradients using a coated tip. Images magnetic domain structures on surfaces. Yes
Scanning Thermal Microscopy (SThM) [7] Measures local thermal properties using a thermal probe. Maps temperature and thermal conductivity at the nanoscale. Yes

The Scientist's Toolkit: Essential Research Reagent Solutions in SPM

Beyond terminology, conducting SPM research requires specific hardware and software "reagents." The following table details these essential components and their functions as referenced in public literature.

Table 3: Essential Research Reagent Solutions for SPM

Tool / Material Critical Function Performance & Selection Notes
SPM Probe Tip [7] Defines imaging resolution by physically interacting with the sample. Material/Type: Si₃N₄ for contact AFM; Pt/Ir or W for STM. Apex sharpness is the critical performance factor, ideally terminated by a single atom for atomic resolution.
Piezoelectric Scanner [7] Provides precise, atomic-scale movement of the probe or sample in x, y, and z directions. Performance is measured in precision, accuracy, and stability. Piezoelectric creep can cause artifacts, requiring careful calibration and experimental design.
Cantilever (for AFM/MFM) [47] [7] A flexible beam that holds the probe tip; its deflection indicates force. Spring constant and resonant frequency are key selection parameters, varying with application (e.g., contact vs. non-contact mode).
Feedback Loop Control System [7] Maintains a constant probe-sample interaction during scanning. Uses a PID-loop (often PI). Gain settings are critical; low gains smear features, high gains cause oscillation and striped artifacts.
Visualization & Analysis Software [7] Converts raw data points into false-color images and enables quantitative analysis. Freeware: Gwyddion, WSxM. Commercial: SPIP, MountainsMap SPM. Necessary for rendering all SPM images and performing metrology.

For researchers requiring authoritative definitions for publication or standardized reporting, acquiring the official ISO 18115-2 document remains the definitive path. However, for ongoing reference, education, and contextual understanding, a strategy leveraging free, high-quality public resources is highly effective. This involves using academic platforms like LibreTexts for foundational principles, community resources like Wikipedia for expanded technical details, and published summaries for tracking standard updates. This multi-source approach ensures that scientists have robust, accessible, and cost-free support for their work with Scanning Probe Microscopy, fostering accurate and reproducible science even without direct access to the proprietary standard.

Ensuring Measurement Validity: A Comparative Framework for SPM Analysis

The reproducibility of scientific data hinges on a unified understanding of the terminology used to describe methods, parameters, and results. For scanning probe microscopy (SPM), a family of techniques that includes atomic force microscopy (AFM) and scanning tunnelling microscopy (STM), this common language is codified in ISO 18115-2:2021, titled "Surface chemical analysis — Vocabulary — Part 2: Terms used in scanning-probe microscopy" [15] [48]. This international standard provides the definitive set of definitions for terms used in SPM, covering words or phrases related to samples, instruments, and theoretical concepts [29]. The consistent application of this terminology is a foundational element for accurate and reproducible reporting of results, making it an indispensable tool for inter-laboratory comparisons [49] [50].

The standard itself is a dynamic document, maintained by ISO Technical Committee 201 on Surface Chemical Analysis [49] [50]. It has evolved significantly over time, expanding from 227 terms in its 2010 version to 277 terms in the 2013 update, and now includes 98 acronyms relevant to the field [29] [1]. The latest edition, published in December 2021, represents the current consensus and is an essential resource for any laboratory engaged in SPM research or testing [15] [48]. For inter-laboratory comparisons, using ISO 18115-2 as a reference ensures that all participating laboratories have a shared understanding of critical concepts, thereby minimizing confusion and enabling a direct, like-for-like comparison of data and methodologies [50].

Quantitative Analysis of ISO 18115-2's Terminology Scope

The value of ISO 18115-2 in benchmarking is rooted in its comprehensive and technical depth. The table below summarizes the quantitative scope of the standard's terminology, illustrating its coverage of the SPM field.

Table 1: Terminology Coverage in ISO 18115-2:2021

Aspect of Coverage Description and Examples Relevance for Benchmarking
Total Volume of Terms The standard defines terms for scanning probe microscopy, building upon previous versions that contained 277 terms and 98 acronyms [29]. Provides a large, common vocabulary for detailed experimental documentation.
Core SPM Techniques Covers Atomic Force Microscopy (AFM), Scanning Near-Field Optical Microscopy (SNOM), and Scanning Tunnelling Microscopy (STM) [50] [3]. Ensures fundamental technique names and principles are consistently applied across labs.
Instrumentation & Parameters Defines terms describing instruments and theoretical concepts involved in surface chemical analysis [29] [1]. Allows precise reporting of instrument settings and measurement conditions.
Sample-Related Terms Includes terminology for describing samples and their properties [29]. Standardizes the description of sample states, which is critical for comparing results.
Theoretical Concepts Incorporates acronyms and terms for contact mechanics models and other theoretical frameworks [49] [50]. Align the mathematical and physical models used in data interpretation and simulation.

The progression of this standard highlights the commitment to clarity in the SPM community. The initial vocabulary, ISO 18115:2001, contained 350 terms, but through amendments and revisions, it was split into separate parts for spectroscopy (Part 1) and scanning probe microscopy (Part 2) to accommodate over 100 new terms and clarifications [3]. This historical expansion underscores the standard's role in keeping pace with technological advancements, ensuring that even novel methodologies can be integrated into a standardized framework for comparison.

Experimental Protocol for Terminology-Based Inter-laboratory Comparisons

Implementing a rigorous inter-laboratory comparison for SPM data requires a structured approach that leverages ISO 18115-2 as its backbone. The following protocol outlines the key steps, from preparation to final analysis.

Pre-Comparison Preparation and Terminology Alignment

Before any measurements are taken, participant laboratories must align their understanding of the relevant terminology.

  • Define Key Parameters: The coordinating laboratory must first identify the critical SPM parameters for the comparison (e.g., "setpoint," "elastic modulus," "lateral force"). These terms and their definitions should be directly extracted from ISO 18115-2:2021 [15] and distributed to all participants.
  • Harmonize Operational Procedures: Using the standard, labs must agree on the definitions of the measurement modes being used (e.g., "tapping mode," "contact mode"). This ensures that the same fundamental technique is being applied across all instruments.
  • Select and Characterize Artefacts: A suitable reference sample or artefact must be circulated. The artefact's key characteristics (e.g., "step height," "roughness," "modulus") should be defined according to the ISO standard to avoid ambiguity.

Execution and Data Collection

The actual testing phase must be carefully controlled and documented.

  • Simultaneous or Sequential Testing: Depending on the artefact's stability, a simultaneous participation scheme can be used, where sub-samples from a homogeneous batch are sent to all labs at once. For unique, stable artefacts, a sequential ring test is appropriate, where the artefact is circulated from a reference lab to each participant in turn [51].
  • Standardized Reporting: All laboratories must report their results using a template that mandates the use of standardized terms from ISO 18115-2. For instance, a lab must report "root mean square roughness (Rq)" rather than an informal or in-house term for roughness.

Data Analysis and Performance Evaluation

The final stage involves comparing the results from all laboratories using statistically robust methods.

  • Calculation of Performance Metrics: The results are evaluated using statistical methods like the normalized error (En) and Z-score [51].
    • Normalized Error (En): This metric is used to compare a participant's result against the reference value, taking into account the measurement uncertainties of both. The formula is En = (Lab_result - Ref_result) / sqrt(U_lab² + U_ref²), where U represents the expanded uncertainty. An |En| ≤ 1 is considered satisfactory [51].
    • Z-Score: This score indicates how many standard deviations a lab's result is from the consensus mean of all participants. A |Z| ≤ 2 is satisfactory, 2 < |Z| < 3 is questionable, and |Z| ≥ 3 is unsatisfactory [51].
  • Terminology Compliance Audit: The coordinating body should review the submitted data and reports for consistent use of ISO terminology. Discrepancies in terminology usage should be noted and correlated with technical outliers to investigate if miscommunication contributed to poor performance.

The workflow below visualizes this integrated process, showing how terminology verification is embedded within the technical comparison.

Start Start Interlaboratory Comparison Prep Pre-Comparison Preparation Start->Prep TermAlign Align on ISO 18115-2 Terminology Prep->TermAlign DefineParams Define Key Parameters (e.g., setpoint, modulus) TermAlign->DefineParams Exec Execution and Data Collection DefineParams->Exec Scheme Choose Testing Scheme: Sequential or Simultaneous Exec->Scheme StdReport Report Data Using Standardized Terms Exec->StdReport Analysis Data Analysis and Evaluation StdReport->Analysis Stats Calculate Performance Metrics (En, Z-score) Analysis->Stats Audit Conduct Terminology Compliance Audit Analysis->Audit Correlate Correlate Terminology Use with Technical Outliers Stats->Correlate Audit->Correlate End Issue Final Report Correlate->End

Diagram 1: SPM data comparison workflow integrating ISO 18115-2.

Essential Research Reagent Solutions for SPM Studies

The following table details key materials and reference samples critical for conducting controlled SPM experiments and inter-laboratory comparisons. These artefacts provide the ground truth against which measurement accuracy and precision are judged.

Table 2: Essential Research Reagent Solutions for SPM Benchmarking

Item Name Function/Description Key Application in SPM Comparison
Pitch/Grating Artefacts Standard samples with precisely known, periodic surface features. Calibration of lateral distance and scanner linearity in X and Y axes. Used to verify terms like "calibration" and "periodic structure."
Step Height Standards Samples with a verifiable vertical step between two planar surfaces. Calibration of vertical distance (Z-axis) and height measurement accuracy. Crucial for defining "step height" as per ISO.
Reference Force Cantilevers Cantilevers with a pre-calibrated spring constant. Quantification of tip-sample interaction forces in force spectroscopy. Ensures accurate application of terms like "setpoint" and "force curve."
Nanoparticle Size Standards Monodisperse suspensions of nanoparticles with certified size. Validation of particle dimension measurement protocols and image analysis software. Provides a reference for "lateral dimension" and "nanoparticle."
Modulus Reference Samples Materials with well-characterized and homogeneous elastic modulus. Benchmarking of quantitative nanomechanical properties (e.g., in AFM-based modes). Standardizes the use of terms like "elastic modulus" and "indentation."

ISO 18115-2 is far more than a simple dictionary; it is a critical enabler of reliability and trust in scanning probe microscopy. By providing a common, internationally recognized set of definitions, it forms the foundation upon which defensible and reproducible inter-laboratory comparisons are built. The standard's comprehensive coverage of instruments, samples, and theoretical concepts allows researchers to dissect and compare their methodologies with precision. When integrated into a formal proficiency testing protocol—complete with standardized artefacts, defined workflows, and robust statistical analysis like En and Z-scores—ISO 18115-2 becomes a powerful tool for any laboratory seeking to validate its SPM measurements, demonstrate technical competence, and contribute to the advancement of robust nanoscale science.

In modern materials science and pharmaceutical development, no single analytical technique can provide a complete picture of a sample's characteristics. The integration of Scanning Probe Microscopy (SPM) with Raman spectroscopy represents a powerful correlative approach that combines nanoscale topographic information with chemical specificity. This synergy is particularly valuable for complex analytical challenges where physical structure and chemical composition must be understood simultaneously, such as in pharmaceutical formulation development, battery research, and catalyst characterization. The growing adoption of these correlated methods necessitates a standardized framework for data integration and interpretation, precisely the gap that ISO 18115-2 aims to fill by establishing consistent terminology across techniques [1].

The fundamental value proposition of correlating SPM with Raman spectroscopy lies in overcoming the inherent limitations of each technique when used in isolation. SPM techniques, including Atomic Force Microscopy (AFM) and Scanning Tunneling Microscopy (STM), provide exceptional topographical resolution at the nanoscale but suffer from "chemical blindness" - they cannot directly identify molecular species [52]. Conversely, Raman spectroscopy delivers detailed chemical information based on molecular vibrational fingerprints but is limited by diffraction to spatial resolutions typically around 0.5-1 micrometers [53]. By combining these techniques, researchers can correlate nanoscale features with their chemical identity, enabling breakthroughs in understanding complex materials and biological systems.

This guide examines the current landscape of integrated SPM-Raman systems, comparing technical approaches, performance characteristics, and implementation requirements to assist researchers in selecting and optimizing these powerful correlative methodologies.

Technical Approaches to SPM-Raman Integration

System Configuration Modalities

Integrating SPM with Raman spectroscopy can be achieved through multiple technical approaches, each with distinct advantages and limitations. The primary configuration modalities include:

  • Co-localized Analysis: Sequential measurement on the same instrument platform
  • Simultaneous In-Situ Measurement: Real-time data collection during experiments
  • Correlative Workflow: Data correlation from separate instruments using software alignment

Table 1: Comparison of SPM-Raman Integration Approaches

Integration Type Spatial Accuracy Technical Complexity Experimental Flexibility Best Applications
Co-localized High (<1μm) Medium Medium Standard materials characterization
Simultaneous Very High (<100nm) High Low Dynamic processes, electrochemical studies
Correlative Medium (1-5μm) Low High Large samples, multi-technique studies

Co-localized systems, where SPM and Raman measurements are performed on the same platform but not necessarily simultaneously, offer a balance of performance and practicality. Renishaw's Raman systems integrated with SEM interfaces exemplify this approach, enabling correlation between Raman chemical maps and high-resolution electron microscopy images [53]. This configuration maintains high spatial correlation accuracy while allowing optimized measurement conditions for each technique.

Simultaneous integration represents the most technically sophisticated approach, enabling real-time correlation of topographic and chemical data during dynamic processes. The NTEGRA Spectra system commercialized by NT-MDT exemplifies this category, with specialized tip configurations that allow the Raman laser beam to be focused approximately 1μm from the AFM tip apex [52]. This precise alignment enables direct correlation between topographical features and chemical signatures during processes such as electrochemical reactions, where both structure and chemistry evolve simultaneously.

Commercial System Implementation

Several commercial solutions demonstrate the practical implementation of SPM-Raman integration:

  • Renishaw's inLux SEM Raman Interface: Compatible with scanning electron microscopes, this system provides highly specific chemical and structural characterization to complement SEM data [53] [54]. The interface can be added to existing SEMs on-site, facilitating retrofitting of correlative capability to established microscopy facilities.

  • NT-MDT NTEGRA Spectra: A fully commercial integrated AFM-Raman system designed for operando measurements in various environments, including electrochemical cells [52]. This system uses a sample-scanning AFM design that keeps both the tip and laser beam fixed while moving the sample, ensuring stable spatial registration during measurements.

  • Digital Surf MountainsSPIP: A software-based solution for correlative analysis that combines SPM images with data from other instruments, including SEMs, 3D optical microscopes, and confocal microscopes [55]. This approach enables colocalization with chemical composition data without requiring specialized hardware integration.

Experimental Protocols for Correlative Analysis

Integrated AFM-Raman for Electrochemical Studies

The investigation of electrode surface properties during electrochemical reactions represents a prime application for integrated SPM-Raman systems. The following protocol, adapted from research using the NTEGRA Spectra system, details a methodology for correlating topographic and chemical data during anion intercalation in highly oriented pyrolytic graphite (HOPG) [52]:

Table 2: Key Reagents and Materials for HOPG Intercalation Studies

Item Specification Function
HOPG Sample Grade ZYH, exfoliated with adhesive tape Working electrode with defined basal plane
Electrolyte 1 M H₂SO₄, purified by argon bubbling Acidic environment for anion intercalation
AFM Tip VIT_P/IR, force constant 60 N/m, Au coating Topographical imaging in liquid environment
Counter Electrode Pt wire Completes electrochemical circuit
Reference Electrode Pt wire (stable within 10 mV in acid) Potential reference
Laser Source 532 nm, maximum power 30 mW Raman excitation

Experimental Workflow:

  • Sample Preparation: Freshly cleave HOPG sample using adhesive tape immediately before experiment to ensure atomically flat terraces free from contamination.

  • Electrochemical Cell Assembly:

    • Connect HOPG sample as working electrode
    • Position Pt counter electrode around cell perimeter
    • Place Pt reference electrode in stable position
    • Fill cell with deaerated 1 M H₂SO₄ electrolyte
  • Instrument Configuration:

    • Mount AFM tip in specialized holder for liquid operation
    • Position objective lens for axial geometry measurement
    • Align Raman laser focus approximately 1μm from AFM tip apex
    • Set Raman spectrometer parameters: 532 nm excitation, 600 lines/mm grating
  • Simultaneous Data Acquisition:

    • Perform cyclic voltammetry to drive electrochemical reactions
    • Acquire AFM topography in non-contact mode (resonance frequency ~150 kHz in liquid)
    • Collect Raman spectra continuously with integration times optimized for signal quality
    • Correlate data streams temporally using potentiostat synchronization

This protocol enables direct observation of the correlation between topographic changes (blister formation) and chemical evolution (appearance of D and Gi peaks at 1360 cm⁻¹ and 1604 cm⁻¹, respectively) during anion intercalation in HOPG [52].

G SamplePrep Sample Preparation Fresh HOPG cleaving CellAssembly Electrochemical Cell Assembly HOPG WE, Pt CE/RE, 1M H₂SO₄ SamplePrep->CellAssembly InstrumentConfig Instrument Configuration AFM tip mounting, laser alignment CellAssembly->InstrumentConfig DataAcquisition Simultaneous Data Acquisition CV + AFM + Raman InstrumentConfig->DataAcquisition DataCorrelation Data Correlation Topography vs Chemical Features DataAcquisition->DataCorrelation

Figure 1: Experimental workflow for combined AFM-Raman electrochemical studies

SEM-Raman Correlation for Pharmaceutical Analysis

Correlative SEM-Raman analysis provides complementary information for pharmaceutical formulation characterization, combining high-resolution morphological data with chemical identification. The following protocol outlines the procedure for analyzing multi-component drug formulations:

Sample Preparation:

  • For tablets: intact cross-sections prepared by microtoming or fracturing
  • For powders: deposition on conductive adhesive tabs
  • Sputter-coating with thin carbon layer (5-10 nm) for SEM, avoiding complete coverage to enable Raman analysis

Data Acquisition Sequence:

  • SEM Imaging: Acquire secondary electron and backscattered electron images at multiple magnifications to identify regions of interest based on morphology and atomic number contrast.
  • Raman Mapping: Transfer sample to Raman interface without repositioning. Perform hyperspectral mapping using 785 nm laser excitation with 1 cm⁻¹ spectral resolution. Typical parameters include: 1-10 seconds integration time per spectrum, 1-2 μm step size.

  • Data Correlation: Overlay chemical maps from Raman analysis with SEM images using distinctive morphological features as alignment references.

This approach successfully differentiates components in polymer laminates, identifying polycarbonate, rutile TiO₂, and copper phthalocyanine layers that may not be visible in SEM images alone [53].

Standardization and Terminology: The ISO 18115-2 Framework

The correlation of data across different analytical techniques requires standardized terminology to ensure unambiguous interpretation and reproducibility. ISO 18115-2:2010 provides this essential framework specifically for scanning probe microscopy, defining 227 terms and 86 acronyms used in SPM [1]. This standardization is particularly crucial when correlating SPM data with Raman spectroscopy, as both fields have historically developed their own terminology independently.

Key aspects of the ISO 18115-2 standard relevant to correlative analysis include:

  • Sample Terminology: Standardized definitions for sample properties, preparation methods, and handling procedures that affect both SPM and Raman measurements.

  • Instrument Parameters: Consistent terminology for resolution, calibration, operating modes, and environmental conditions that enables meaningful comparison of data collected on different instruments.

  • Data Analysis Concepts: Unified definitions for processing algorithms, image analysis parameters, and quantification methods applied to both topographic and spectral data.

The implementation of standardized terminology becomes especially important when integrating data from multiple sources or establishing automated analysis workflows. Without this common framework, subtle differences in terminology can lead to misinterpretation of correlated data, particularly in regulated environments such as pharmaceutical development where method reproducibility is critical.

Performance Comparison and Applications

Technique Capabilities and Limitations

Table 3: Performance Characteristics of Surface Analysis Techniques

Technique Spatial Resolution Chemical Sensitivity Sample Requirements Key Limitations
AFM 0.5-1 nm (lateral)0.1 nm (vertical) Indirect through modifications Conductivity not required Chemical blindnessLimited scan size
Raman Microscopy 0.5-1 μm Molecular identificationPolymorph differentiation Minimal preparationNon-metallic preferred Fluorescence interferenceWeak signal
SEM 1-10 nm Elemental (with EDS) Conductive coating often needed Vacuum typically requiredNo molecular information
Integrated AFM-Raman 0.5-1 nm (AFM)0.5-1 μm (Raman) Direct molecular correlation Access for both techniques Technical complexityHigher cost

The performance comparison reveals that while integrated systems do not improve the fundamental spatial resolution of Raman spectroscopy, they enable precise correlation between nanoscale features identified by AFM and chemical information from Raman. This correlation is particularly valuable for heterogeneous samples where chemical composition varies at sub-micrometer length scales.

Pharmaceutical Applications

The pharmaceutical industry represents a major application area for correlated SPM-Raman analysis, with specific implementations including:

  • Polymorph Characterization: Combining AFM's ability to identify crystal morphology with Raman's sensitivity to polymorphic forms enables comprehensive solid-form characterization, crucial for drug development and intellectual property protection [53].

  • Formulation Distribution Analysis: Raman imaging can visualize drug and excipient distribution in formulations like tablets, ointments, and creams, while AFM provides complementary information about surface topology and mechanical properties [54].

  • Drug Delivery Studies: In situ AFM-Raman systems enable real-time monitoring of drug release and particle morphological changes under physiologically relevant conditions, providing insights into drug delivery mechanisms [52].

Recent advances in machine learning for Raman spectral analysis have further enhanced these applications. Studies demonstrate that algorithms including Support Vector Machines and Convolutional Neural Networks can achieve over 99% accuracy in classifying pharmaceutical compounds based on their Raman signatures [56]. When correlated with SPM data, these classification models enable automated analysis of complex multi-component pharmaceutical systems.

The field of correlative SPM-Raman analysis continues to evolve, with several promising developments on the horizon:

  • AI-Enhanced Correlation: Artificial intelligence approaches are being applied to both Raman spectral analysis and SPM data interpretation, enabling more sophisticated correlation of chemical and topological information [54] [56]. These approaches can identify subtle patterns that may not be apparent through conventional analysis.

  • Standardization Advances: As correlative techniques mature, standardization efforts are expanding beyond terminology to include data formats, calibration protocols, and validation procedures [57]. This standardization will facilitate more widespread adoption in regulated industries.

  • Miniaturized Systems: The development of compact AFM systems that can be integrated with conventional Raman microscopes is making correlative analysis more accessible to researchers without dedicated integrated instruments [55].

  • Operando Applications: There is growing emphasis on studying materials under realistic operating conditions, such as batteries during cycling or catalysts during reaction, where integrated SPM-Raman systems provide unique insights into structure-function relationships [52].

The continued convergence of SPM and Raman spectroscopy, supported by standardized frameworks like ISO 18115-2, promises to address increasingly complex analytical challenges across materials science, pharmaceutical development, and biological research.

In the highly regulated pharmaceutical manufacturing environment, the clarity and precision of language are not merely academic concerns—they are fundamental prerequisites for quality and compliance. Standardized terminology serves as the foundational framework that ensures all stakeholders, from laboratory researchers and production technicians to regulatory auditors, share a common, unambiguous understanding of processes and requirements. Within Good Manufacturing Practice (GMP) and Quality Management Systems (QMS), this linguistic precision is a critical tool for mitigating risks, preventing errors, and ensuring that products are consistently manufactured to meet the required quality standards [58] [59].

This article explores the indispensable role of standardized terminology in supporting robust validation practices and QMS, drawing a direct parallel to the principles enshrined in ISO 18115-2, which establishes a precise vocabulary for surface chemical analysis and scanning-probe microscopy [30] [1]. Just as ISO 18115-2 provides the necessary lexicon for unambiguous communication and reproducibility in scientific research [1], a harmonized vocabulary within GMP validation is the bedrock upon which product quality and patient safety are built.

The Critical Need for Terminology Standardization in GMP

Consequences of a Non-Standardized Lexicon

The absence of a standardized lexicon in the field of product regulatory compliance creates significant confusion, leading to practical issues during audits, in manufacturing instructions, and in technical documentation [60]. Inconsistent use of terms for records, certificates, and compliance disciplines can result in:

  • Increased Errors: Misinterpretation of requirements or procedures due to ambiguous terms.
  • Reduced Productivity: Time and resources wasted on clarifying meanings or reworking documentation.
  • Impaired Time-to-Market: Delays in regulatory approvals caused by inconsistent or unclear submissions [60].

For instance, a Type Approval Certificate—a document proving regulatory approval—may be referred to by various names such as 'Certificate of Compliance,' 'Letter of Authority,' or 'Conformity Certificate' depending on the country and regulatory body [60]. Without a standardized understanding that these documents serve the same essential function, compliance teams can face unnecessary obstacles.

Standardization as a Pillar of Effective Sustainability Governance

The imperative for terminology standardization extends beyond immediate manufacturing concerns to broader domains like sustainability governance. The formal, consensus-driven process of establishing precise, unambiguous, and universally accepted definitions is essential to eliminate ambiguity and fragmentation in communication [61]. This process removes a principal mechanism for "greenwashing" and enhances the comparability of data, creating a level playing field and facilitating efficient regulatory oversight [61]. The principle is directly transferable to GMP, where standardized terminology is the precondition for reliable disclosure, effective third-party verification, and the consistent production of quality products.

Core GMP Validation Terminology: A Comparative Analysis

A clear understanding of core validation terms is essential for implementing and maintaining an effective QMS. The table below defines key concepts that form the backbone of GMP validation activities.

Table 1: Core Terminology in GMP Validation and Quality Management Systems

Term Definition Role in GMP/QMS
Validation A documented process that provides a high degree of assurance that a specific process, method, or system will consistently produce a result meeting predetermined acceptance criteria [59]. A fundamental quality management tool to confirm a process or equipment satisfies its intended purpose using objective data [59].
Qualification (IQ, OQ, PQ) A testing protocol designating a system meets specific requirements. Installation Qualification (IQ) confirms proper installation. Operational Qualification (OQ) demonstrates function in a controlled environment. Performance Qualification (PQ) verifies performance under real-life conditions [62] [59]. The critical triad of activities verifying equipment and processes are installed correctly, operate as intended, and produce the expected quality product consistently [59].
Change Control A formal process for proposing, evaluating, approving, implementing, and reviewing changes to equipment, processes, or materials [58] [63]. Ensures that modifications are evaluated for potential impact on product quality and that the system remains in a validated state after any change [58].
Critical Quality Attributes (CQA) A physical, chemical, biological, or microbiological property or characteristic that should be within an appropriate limit, range, or distribution to ensure the desired product quality [63]. Define the target profile for the product; processes are designed and controlled to ensure CQAs are consistently met [59].
Requirement Something a system must be able to do [62]. Serves as the foundation for defining what needs to be validated and setting the acceptance criteria for testing.
Deviation An instance when a system does not perform as expected during testing or operation [62]. Identifies a departure from approved procedures or specifications, triggering investigation and corrective actions within the QMS.
Traceability The ability to link requirements outlined in specifications to the tests that verify them, often recorded in a Requirements Traceability Matrix (RTM) [62]. Provides a clear audit trail, demonstrating that all specified requirements have been adequately tested and fulfilled.

Validation Lifecycle and Terminology: A Workflow Analysis

The validation process is not a single event but a lifecycle that spans from initial process design through commercial production. Standardized terminology ensures clarity and consistency at every stage. The diagram below illustrates this lifecycle and the key terms associated with each phase.

G Stage1 Stage 1: Process Design Stage2 Stage 2: Process Qualification Stage1->Stage2 Sub1_1 Defines Design Space Stage1->Sub1_1 Sub1_2 Identifies CQAs Stage1->Sub1_2 Sub1_3 User Requirements (URS) Stage1->Sub1_3 Stage3 Stage 3: Continued Process Verification Stage2->Stage3 Sub2_1 Installation Qualification (IQ) Stage2->Sub2_1 Sub2_2 Operational Qualification (OQ) Stage2->Sub2_2 Sub2_3 Performance Qualification (PQ) Stage2->Sub2_3 Sub3_1 Ongoing Monitoring Stage3->Sub3_1 Sub3_2 Change Control Stage3->Sub3_2 Sub3_3 Revalidation Stage3->Sub3_3

Detailed Stage Protocols

  • Stage 1: Process Design: This stage defines the commercial manufacturing process based on knowledge gained through development and scale-up activities [59]. Key activities include identifying Critical Quality Attributes (CQAs), defining the design space (the range of input variables that yield quality product), and establishing a control strategy [59]. The primary deliverable is a detailed process design that serves as the blueprint for qualification.

  • Stage 2: Process Qualification: This stage confirms the process design is capable of reproducible commercial manufacturing [59]. It involves the execution of the IQ/OQ/PQ protocol. Installation Qualification (IQ) documents that equipment is correctly installed per manufacturer and design specifications. Operational Qualification (OQ) demonstrates the equipment operates as intended across its anticipated operating ranges. Performance Qualification (PQ) confirms the process, under routine conditions, consistently produces a product that meets all its CQAs [59]. A crucial final step is the successful execution of Process Performance Qualification (PPQ) batches, which are required for commercial distribution [59].

  • Stage 3: Continued Process Verification (CPV): This is an ongoing stage to ensure the process remains in a state of control during commercial manufacturing [63]. It involves routine monitoring of process parameters and CQAs. Any planned changes to the process, equipment, or systems are managed through a formal Change Control procedure to assess impact and determine if revalidation is required [58]. Revalidation is performed periodically or after significant changes to confirm the process continues to meet GMP standards [58].

Successful navigation of a validated environment requires specific tools and knowledge. The following table outlines key resources for professionals working in or supporting GMP-compliant operations.

Table 2: Essential Research Reagent Solutions for Validation and Standardized Work

Item Function & Relevance in a Validated Context
Validation Master Plan (VMP) A comprehensive document that lays out the overall philosophy, approach, and granular details for a validation project. It is the top-level plan guiding all validation activities [59].
Standard Operating Procedure (SOP) Documented procedures that provide detailed, step-by-step instructions to ensure operational tasks are performed consistently and in compliance with regulatory standards [58].
Requirements Traceability Matrix (RTM) A document, often a spreadsheet or table, that links initial User Requirements (URS) to their corresponding design specifications, test cases, and results. It provides evidence that all requirements have been tested [62].
Risk Assessment Tools Formal methodologies (e.g., FMEA) used to identify and prioritize potential sources of process variability or failure. This assessment directly informs the validation strategy [58].
Change Control System A formal, documented system (often part of an electronic QMS) for managing and tracking all proposed changes to validated systems, ensuring they are properly evaluated, approved, and implemented without compromising validated status [58] [63].
International Standards (e.g., ISO) Authoritative documents, such as ISO 18115 for vocabulary, that provide the consensus-built, precise definitions and methodologies necessary for unambiguous communication and reproducibility across scientific and regulatory disciplines [30] [1].

Within the rigorous framework of GMP and QMS, standardized terminology is far more than a matter of semantics. It is a critical quality attribute in its own right, enabling the precision, consistency, and reproducibility that underpin patient safety and product efficacy. The structured lexicon of validation—from IQ, OQ, PQ to Change Control and CQAs—provides the shared language necessary for robust process design, reliable qualification, and vigilant ongoing monitoring.

As exemplified by ISO 18115-2 in the field of surface chemical analysis, the formal standardization of terminology is a cornerstone of scientific and technical progress [1]. For researchers, scientists, and drug development professionals, mastering this vocabulary is not merely a regulatory obligation but a fundamental component of professional expertise, ensuring that every stakeholder operates from a common, unambiguous understanding in the mission to deliver safe and effective medicines.

In the rigorous fields of surface chemical analysis and nanotechnology, precise terminology is not merely a convenience but a fundamental requirement for scientific progress and industrial application. The ISO 18115-2:2021 standard represents the formalized international consensus for terminology used in scanning probe microscopy (SPM), providing the critical linguistic infrastructure that enables unambiguous communication between researchers, laboratories, and industries worldwide [15] [50]. This framework covers essential SPM techniques including atomic force microscopy (AFM), scanning tunnelling microscopy (STM), and scanning near-field optical microscopy (SNOM) [50] [30].

Parallel to this formal standardization ecosystem exists the vital pre-standardization domain, where methodologies and terms are rigorously tested and refined before achieving international consensus. This comparative analysis systematically evaluates these two interconnected realms—the dynamic, investigative nature of pre-standardization against the stable, definitive framework of ISO 18115-2. For researchers in drug development and nanotechnology, understanding this relationship is crucial for both implementing current best practices and contributing to the evolution of future standards.

The ISO 18115-2:2021 Framework: Scope and Coverage

Architectural Framework and Technical Scope

ISO 18115-2:2021 establishes a comprehensive vocabulary specifically focused on terms used in scanning probe microscopy, functioning as Part 2 of a larger multi-part standard on surface chemical analysis vocabulary [15]. As an international standard published in December 2021, it has reached the stage of being a formally published and recognized framework [15]. The technical committee responsible for its development is ISO/TC 201/SC 1, which specializes in standardizing terminology, general procedures, and specific methodologies for surface chemical analysis [15] [50].

The standard provides detailed definitions for a substantial lexicon of SPM terminology, with a previous iteration (the 2013 version) documented as containing 277 specialized terms used in scanning probe microscopy alongside 98 acronyms [29]. These terms systematically cover the entire SPM ecosystem, including words and phrases used in describing samples, instrumentation, and theoretical concepts fundamental to the field [29]. The standard forms part of a broader vocabulary system that includes ISO 18115-1 for general terms and spectroscopy terms, and ISO 18115-3 for terms used in optical interface analysis [50] [30].

Technical Content and Application Domains

The technical content of ISO 18115-2 encompasses multiple critical domains within scanning probe microscopy. It provides definitions for scanned probe microscopy methods and includes acronyms and terms for contact mechanics models, which are fundamental for interpreting tip-sample interactions in techniques like AFM [50]. The standard serves as an authoritative reference for numerous SPM techniques, creating a common language that enables comparison of data between different laboratories and instrumentation platforms [50].

This standardization is particularly crucial for techniques like AFM that are widely used in pharmaceutical research and nanomaterial characterization, where precise terminology affects everything from methodological descriptions to regulatory submissions. By defining terms for scanning probe methods with high specificity, the standard helps minimize confusion and encourages consistency in reporting results across the scientific community [50]. The availability of these definitions through ISO's online browsing platform at no charge further enhances their accessibility and implementation across academic, industrial, and regulatory environments [50] [30].

Pre-standardization Practices and Organizations

The VAMAS Framework and International Collaboration

The Versailles Project on Advanced Materials and Standards (VAMAS) serves as the primary international organization coordinating pre-standardization activities for advanced materials, including surface chemical analysis and nanotechnologies [50]. VAMAS supports world trade in products dependent on advanced materials technologies through international collaborative projects specifically designed to provide the technical basis for harmonized measurements, testing, specifications, and standards [50]. This organization operates through various technical working areas (TWAs) that focus on specific measurement challenges and technological domains.

Within the VAMAS structure, TWA 2 focuses specifically on surface chemical analysis, covering electron and optical spectroscopies, scanning probe microscopies, mass spectroscopies, data workflow, methods, and best practices [50]. Additional relevant working areas include TWA 34 for nanoparticle populations, which coordinates cross-comparison of measurement techniques for determining dimensional, electronic, chemical, optical, or magnetic characteristics of nanoparticles, and TWA 41 for graphene and related 2D materials, which aims to validate different measurement methodologies for these emerging materials [50]. These pre-standardization working areas provide the essential experimental foundation upon which formal ISO standards are later built.

Methodology of Pre-standardization Studies

Pre-standardization activities under VAMAS typically take the form of international interlaboratory studies that evaluate the reproducibility and comparability of measurement techniques across different institutions, instruments, and operators [50]. These studies are fundamentally investigative in nature, designed to identify sources of measurement variability, assess methodological limitations, and establish robust measurement protocols before formal standardization is attempted. The National Physical Laboratory (NPL) in the UK and other national metrology institutes play leading roles in organizing and executing these pre-normative international interlaboratory studies [50].

The experimental approach in these studies involves circulating representative samples and measurement protocols among participating laboratories worldwide, followed by statistical analysis of the returned data to establish measurement uncertainties and validate methodological approaches. This process is particularly crucial for emerging SPM techniques and applications, such as those for characterizing graphene and related 2D materials, where measurement approaches may still be evolving [50]. The outcomes of these studies provide the technical evidence base that informs the development of formal ISO standards, including terminology standards like ISO 18115-2.

Comparative Analysis: Framework vs. Practice

Quantitative Comparison of Terminology Coverage

Table 1: Quantitative comparison of terminology coverage between standardization frameworks

Aspect ISO 18115-2:2021 Framework Pre-standardization Practices
Term Count 277+ terms specifically for SPM [29] Evolving terminology for emerging techniques [50]
Acronym Coverage 98 acronyms defined [29] New acronyms established through technical work [50]
Technical Scope Samples, instruments, theoretical concepts [29] Method validation, measurement protocols [50]
Update Cycle Formal revision process (2010→2013→2021) [15] [29] Continuous technical evaluation [50]
Geographical Reach International standard (ISO) [15] International interlaboratory studies [50]

Functional Roles in the Standardization Ecosystem

Table 2: Functional roles and outputs of standardization versus pre-standardization activities

Characteristic ISO 18115-2 Standardization Framework Pre-standardization Practices
Primary Function Provide stable, consensus terminology [15] [50] Investigate and validate measurement approaches [50]
Output Type Published documentary standards [15] [50] Technical basis for future standards [50]
Stakeholder Involvement ISO technical committees (TC 201/SC 1) [15] VAMAS working groups, research institutions [50]
Evidence Base Existing scientific literature and established practice [50] Interlaboratory study data [50]
Implementation Status Defendable, comparable measurements [50] Best practice recommendations [50]

The relationship between pre-standardization and formal standardization represents a carefully balanced ecosystem where innovation and stability maintain a dynamic equilibrium. Pre-standardization activities through organizations like VAMAS provide the experimental foundation and technical validation necessary for developing robust international standards [50]. These practices serve as the research and development phase of standardization, where new measurement techniques are stress-tested across international laboratories, and terminology is refined through practical application and consensus-building within specialist communities.

Conversely, the ISO 18115-2 framework provides the stable reference point that enables reliable communication, commerce, and regulation based on standardized terminology [15] [50]. This formal standardization creates the necessary stability for industrial application, educational dissemination, and regulatory compliance. The two systems work in concert—with pre-standardization exploring emerging areas like graphene characterization [50] and atom probe tomography [30], while formal standardization consolidates knowledge in established domains through documents like ISO 18115-2 that are regularly updated to reflect evolving usage and technological advancements [15] [29].

Experimental Protocols and Methodologies

Standardized Terminology Development Workflow

The development and validation of terminology for scanning probe microscopy follows a systematic workflow that transitions from pre-standardization investigation to formal standardization. The process integrates experimental validation with consensus building to ensure both technical robustness and international acceptance.

G Figure 1: Terminology Standardization Workflow cluster_0 Pre-standardization Phase cluster_1 Formal Standardization Phase Start Identify Terminology Gaps (Emerging Techniques) A VAMAS Interlaboratory Studies Start->A Technical Need B Data Analysis & Statistical Validation A->B International Data Collection A->B C Draft Terminology Proposals B->C Technical Basis Established D ISO Committee Review & Voting C->D Formal Proposal C->D E ISO Standard Publication D->E Consensus Achieved D->E F Industry & Research Implementation E->F Deployment

Interlaboratory Study Methodology for Terminology Validation

The experimental foundation for terminology standardization relies on rigorously designed interlaboratory studies that follow specific protocols:

  • Study Design and Sample Preparation: Certified reference materials with well-characterized properties are selected or developed to represent typical measurement scenarios. For SPM terminology validation, this may include samples with standardized surface features, specific chemical compositions, or defined nanoscale structures that enable consistent interpretation of terms across different laboratories [50].

  • Measurement Protocol Development: Detailed measurement procedures are drafted and circulated to participating laboratories, specifying instrument parameters, environmental conditions, and data collection procedures. These protocols include explicit definitions of the terminology being evaluated to assess interpretational consistency across different research groups and instrument platforms [50].

  • Data Collection and Statistical Analysis: Participating laboratories follow the established protocols and return their measurement data to the coordinating organization (typically a national metrology institute like NPL). The collected data undergoes statistical analysis to identify systematic differences in terminology interpretation, measurement biases, and sources of variability between different implementations of the same technique [50].

  • Terminology Refinement and Consensus Building: Based on the interlaboratory study results, terminology definitions are refined to eliminate ambiguities and ensure consistent interpretation. This iterative process continues until sufficient consensus is achieved to support formal standardization through the relevant ISO technical committee [50].

Essential Research Reagent Solutions

Standardization Tools and Reference Materials

Table 3: Essential research tools and materials for SPM standardization work

Tool/Material Function in Standardization Application Context
Certified Reference Materials Provide validated samples for interlaboratory comparisons [50] Method validation, instrument calibration
ISO 18115-2 Terminology Database Authoritative reference for standardized terms [15] [50] Research publications, method documentation
VAMAS Technical Reports Guidance on pre-standardization best practices [50] Experimental design, protocol development
Atomic Force Microscopy Standards Reference methodologies for SPM techniques [50] Nanomaterial characterization, surface analysis
Graphene Characterization Protocols Standardized measurement approaches for 2D materials [50] Emerging material research, quality control

The symbiotic relationship between pre-standardization practices and the formal ISO 18115-2 framework creates a robust ecosystem for advancing scanning probe microscopy terminology. Pre-standardization activities through VAMAS initiatives provide the essential experimental foundation and technical validation needed to develop terminology that is both scientifically rigorous and practically applicable across international laboratories [50]. The formal ISO 18115-2 standard then consolidates this knowledge into a stable, consensus-based framework that enables unambiguous communication, defendable measurements, and comparable data across the global scientific community [15] [50].

For researchers in drug development and nanotechnology, engagement with both aspects of this ecosystem offers significant advantages. Participation in pre-standardization activities provides early access to emerging measurement approaches and terminology, while implementation of established ISO standards ensures research outcomes are comparable, reproducible, and recognized internationally. This integrated approach to terminology development and standardization ultimately supports innovation while maintaining the measurement quality and communication clarity essential for scientific progress and successful technology transfer from laboratory to industry.

Conclusion

The adoption of ISO 18115-2 terminology is not merely an academic exercise but a critical component for advancing biomedical research and clinical translation. By providing a unified language for Scanning Probe Microscopy, this standard directly supports the stringent quality control and standardization required in fields like musculoskeletal tissue engineering and ATMP development. The key takeaways underscore that foundational clarity in terminology enables robust methodological application, effective troubleshooting of data inconsistencies, and defensible validation for regulatory approval. Future progress hinges on the widespread integration of this vocabulary into daily practice, which will enhance the comparability of data across international labs, accelerate the development of safe and effective therapies, and solidify the role of precise metrology in the next generation of biomedical innovations.

References