Chemical characterization under ISO 10993-18 is one of the most documentation-intensive areas in a biological evaluation package. The testing itself is technical, but the deficiency findings that come back from reviewers are rarely about the analytical chemistry. They are about what the documentation failed to justify, explain, or connect.
A lab delivers a clean extractables and leachables (E&L) report with well-run chromatography and properly identified peaks. The manufacturer includes it in the submission. The notified body or FDA reviewer sends back a deficiency asking why only one solvent polarity was used, why the analytical evaluation threshold has no supporting calculation, or why three of the seven identified compounds have no corresponding entry in the toxicological risk assessment. The analytical work was sound. The documentation around it was not.
This article covers what E&L documentation actually needs to include beyond the test results themselves, and where gaps most commonly appear.
Solvent selection and polarity coverage
ISO 10993-18 and the FDA's draft guidance on chemical characterization both expect extraction studies to cover the range of polarity relevant to the clinical contact environment. For most devices, this means using both polar and non-polar solvents, because the substances that leach from a device into body tissues include compounds across the polarity spectrum.
In practice, many E&L studies are conducted with a single polar solvent. Sometimes this is appropriate, for example, when the device contact is exclusively with aqueous fluids and there is no lipophilic tissue contact. But the documentation needs to say that explicitly. The rationale for solvent selection must address why the chosen solvents represent the clinical exposure conditions and, if a polarity range was excluded, why that exclusion is justified for this specific device and its intended use.
A study that used only water and isopropanol without explaining why non-polar extraction was omitted will draw a finding. Not because the solvent choice was necessarily wrong, but because the reviewer has no basis to evaluate whether it was appropriate. The documentation gap is the problem, not the analytical decision.
What reviewers expect to see: A documented rationale for each solvent used, an explanation of how the selected solvents represent the clinical contact conditions (aqueous, lipophilic, or mixed), and a justification for any polarity range that was not covered in the study.
Analytical evaluation threshold
The analytical evaluation threshold (AET) defines the concentration below which an extractable compound does not require individual identification and toxicological assessment. It is the line that separates compounds requiring a margin of safety calculation from those that can be addressed through threshold-based arguments like the Threshold of Toxicological Concern (TTC).
The AET is one of the most common sources of deficiency findings in E&L documentation. The issue is not that companies set it incorrectly. The issue is that they state the AET as a number without showing how it was derived. A reviewer who sees "AET = 1.5 µg/device" with no supporting methodology cannot verify whether the threshold is appropriate for the device, the patient exposure scenario, or the analytical method's sensitivity.
A properly documented AET derivation connects several inputs: the tolerable exposure or tolerable intake value for a compound of concern, the uncertainty factors applied, the number of devices a patient could be exposed to simultaneously, and the clinical exposure duration. It also needs to account for the semi-quantitative nature of the analytical techniques used. The FDA has specifically noted that submitted AET calculations often fail to account for the instrument variability inherent to techniques like GC-MS and LC-MS when applied semi-quantitatively.[1]
If your AET documentation consists of a stated value and a reference to ISO 10993-18, that is likely insufficient. The calculation methodology, the inputs, and the assumptions all need to be visible in the documentation.
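To make the point concrete, the derivation described above can be sketched as a small calculation. This is a minimal illustration, not the normative formula: the variable names and example values (a 1.5 µg/day dose-based threshold, a 50 mL extract, an uncertainty factor of 2 for semi-quantitative response variability) are assumptions chosen for demonstration, and a real derivation must follow ISO 10993-18 and the applicable FDA guidance for the specific device and exposure scenario.

```python
def aet_ug_per_ml(dbt_ug_per_day, n_devices_extracted, extract_volume_ml,
                  devices_per_patient_per_day, uf_analytical=2.0):
    """Illustrative AET derivation in concentration form (ug/mL).

    dbt_ug_per_day: dose-based threshold (tolerable intake), ug/day
    n_devices_extracted: devices pooled into one extract
    extract_volume_ml: volume of extraction vehicle, mL
    devices_per_patient_per_day: simultaneous clinical exposure
    uf_analytical: uncertainty factor for semi-quantitative
        GC-MS/LC-MS response variability (assumed value)
    """
    per_ml = (dbt_ug_per_day * n_devices_extracted) / (
        extract_volume_ml * devices_per_patient_per_day)
    return per_ml / uf_analytical

# Hypothetical inputs: DBT 1.5 ug/day, 1 device in 50 mL, 1 device/patient/day
aet = aet_ug_per_ml(1.5, 1, 50.0, 1)  # 0.015 ug/mL
```

The value of writing the derivation out this way, even in a spreadsheet, is that every input the reviewer needs to verify (threshold, device count, volume, uncertainty factor) is visible rather than collapsed into a single stated number.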
Exhaustive vs simulated-use extraction
ISO 10993-18 distinguishes between exhaustive extraction (designed to remove the maximum quantity of extractable substances) and simulated-use extraction (designed to approximate the quantity of substances released under clinical conditions). Both have a role in chemical characterization, and the documentation needs to justify which approach was used and why.
Exhaustive extraction is typically used for initial characterization and worst-case hazard identification. It tells you what could come out of the device. Simulated-use extraction tells you what actually comes out under conditions that approximate clinical exposure. The toxicological risk assessment needs to be based on clinically relevant exposure data, which usually means simulated-use conditions. But the hazard identification may rely on exhaustive extraction to ensure nothing was missed.
The documentation gap that commonly appears here is when a study uses exhaustive extraction conditions but the TRA treats the resulting quantities as if they represent clinical exposure without any adjustment or discussion. Exhaustive extraction deliberately overestimates patient exposure. If the margin of safety calculations use exhaustive extraction quantities directly, the documentation should acknowledge that the resulting MOS values are conservative. If simulated-use data is available, the TRA should reference it for the exposure estimates and explain the relationship between the two data sets.
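The effect of the data source on the margin of safety can be shown with a simple sketch. The tolerable intake and release quantities below are hypothetical, chosen only to illustrate how exhaustive-extraction data drives the MOS down relative to simulated-use data for the same compound.

```python
def margin_of_safety(tolerable_intake_ug_day, exposure_ug_day):
    """MOS = tolerable intake / estimated patient exposure (both ug/day)."""
    return tolerable_intake_ug_day / exposure_ug_day

# Hypothetical compound with a tolerable intake of 120 ug/day
ti = 120.0
exhaustive_ug_day = 40.0  # total extractable mass, treated as released in one day
simulated_ug_day = 4.0    # release measured under simulated clinical conditions

mos_worst_case = margin_of_safety(ti, exhaustive_ug_day)  # 3.0, conservative
mos_clinical = margin_of_safety(ti, simulated_ug_day)     # 30.0
```

If only the exhaustive-extraction figure is available, the TRA can still use it, but the documentation should state, as in the worst-case calculation above, that the resulting MOS is deliberately conservative rather than a clinical exposure estimate.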
Identified vs unidentified peaks
No extractables study identifies every single peak in the chromatographic profile. The question reviewers ask is not whether 100% identification was achieved, but whether the unidentified fraction was adequately addressed.
ISO 10993-18 expects that compounds above the AET are identified and quantified to the extent possible. When compounds above the AET cannot be identified, the documentation needs to explain what analytical steps were taken to attempt identification, why identification was not achievable, and how the toxicological risk of the unidentified compound was assessed in the absence of specific structural information.
A common documentation gap is handling unidentified peaks with a blanket statement like "unidentified compounds are present at low levels and are not expected to pose a toxicological concern." This is not a risk assessment. If the compound is above the AET, it requires individual assessment regardless of the perceived level. When identification is not possible, approaches like TTC-based assessment or structural class categorization based on available spectral data can be used, but the rationale needs to be documented and the limitations acknowledged.
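A TTC-based screen for an unidentified compound above the AET can be documented as an explicit comparison against a threshold, rather than a blanket statement. The sketch below is illustrative only: the threshold values are assumptions for demonstration, and the applicable TTC depends on exposure duration, structural-alert screening, and the governing guidance (ISO/TS 21726 addresses TTC application for medical device constituents).

```python
# Illustrative TTC thresholds (ug/day). Assumed values for demonstration;
# actual thresholds depend on exposure duration and structural-alert status.
TTC_UG_PER_DAY = {
    "potential_mutagen": 1.5,   # compound cannot be ruled out as mutagenic
    "non_mutagen_no_id": 90.0,  # no structural alerts, identity unknown
}

def ttc_screen(estimated_exposure_ug_day, compound_class):
    """Return True if the estimated exposure falls below the TTC threshold
    for the assigned class, i.e. the compound passes the screen."""
    threshold = TTC_UG_PER_DAY[compound_class]
    return estimated_exposure_ug_day <= threshold

# An unidentified peak estimated at 0.8 ug/day, worst-case mutagenic class
passes = ttc_screen(0.8, "potential_mutagen")  # True
```

The documented output of such a screen should state the class assignment, the threshold used, the exposure estimate, and the limitation that the assessment is threshold-based rather than compound-specific.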
Connecting E&L data to the toxicological risk assessment
The chemical characterization study and the toxicological risk assessment are separate documents that need to function as a connected chain. Every compound identified above the AET in the E&L study should have a corresponding entry in the TRA with an individual margin of safety calculation. This sounds straightforward, but the chain breaks in practice more often than it should.
The most common breaks are compounds identified in the E&L study that do not appear in the TRA at all, compounds that appear in both documents but with different quantification values because the TRA referenced a different version of the E&L report, and compounds that appear in the TRA but with a generic safety statement instead of a calculated margin of safety.
For example, an E&L study identifies seven compounds above the AET. The TRA calculates margins of safety for four of them, addresses one by referencing a published safety assessment from a regulatory body, and describes the remaining two as "trace-level compounds present below toxicological concern." That documentation will generate a finding. Every compound above the AET needs a documented, individual risk assessment with traceable tolerable exposure values from recognized sources such as EPA IRIS, ECHA DNEL, or published TDI/ADI values.
The chain that reviewers verify: E&L study identifies compound above AET → compound appears in TRA → TRA assigns tolerable exposure from a recognized source → TRA calculates patient exposure based on clinical use conditions → TRA calculates margin of safety → MOS is adequate for the exposure scenario. Every link in this chain needs to be documented and traceable.
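The chain checks described above are mechanical enough to automate before submission. The sketch below assumes the E&L results and TRA entries have been reduced to simple lookup tables; the compound names and structure are hypothetical, and a real check would parse the actual report and TRA documents.

```python
def check_traceability(el_compounds, tra_entries):
    """Flag breaks in the E&L-to-TRA chain.

    el_compounds: {compound_name: quantity} for compounds above the AET
    tra_entries:  {compound_name: {"quantity": float, "mos": float or None}}
    Returns a list of finding strings; an empty list means the chain holds.
    """
    findings = []
    for name, qty in el_compounds.items():
        entry = tra_entries.get(name)
        if entry is None:
            findings.append(f"{name}: above AET in E&L study but absent from TRA")
        elif entry["quantity"] != qty:
            findings.append(f"{name}: quantity mismatch "
                            f"(E&L {qty}, TRA {entry['quantity']})")
        elif entry.get("mos") is None:
            findings.append(f"{name}: no calculated margin of safety in TRA")
    return findings

# Hypothetical example: one compound traced fully, one missing from the TRA
el = {"compound A": 12.0, "compound B": 3.4}
tra = {"compound A": {"quantity": 12.0, "mos": 8.5}}
gaps = check_traceability(el, tra)  # flags compound B as absent
```

Even without automation, walking each compound through these same three checks by hand, and recording the result, is what makes the chain auditable for the reviewer.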
Final finished form and the test article
ISO 10993-18 requires chemical characterization to be performed on the device in its final finished form, meaning after all manufacturing processes, surface treatments, coatings, and sterilization have been applied. This requirement is frequently cited but not always met in practice.
The documentation gap appears when the test article differs from the final device in ways that are not addressed. For example, testing was performed on a pre-sterilized sample because sterilization was not yet validated at the time of testing, or testing was performed on a material coupon rather than the finished device because the device geometry made extraction impractical. These are not necessarily disqualifying, but the documentation must acknowledge the difference, assess the potential impact on the extractable profile, and justify why the results remain representative of the final device.
If your E&L report describes the test article as the "final finished device" but your manufacturing records show that the sterilization process changed between testing and the current production process, that inconsistency will be found during review. The documentation needs to address any differences between the tested article and the current production device, including whether the differences could affect the extractable profile.
What changed in 2025
The 2025 revision of ISO 10993-1 removed the "physical and/or chemical information" column from the endpoint evaluation tables (previously Table A.1 in the 2018 version, now Tables 1 through 4). Some companies have interpreted this removal as a reduced emphasis on chemical characterization. That interpretation is incorrect.
Chemical characterization remains a prerequisite for biological risk assessment under the 2025 standard. The definition of chemical characterization in the 2025 revision is unchanged: "the process of obtaining chemical information, accomplished either by information gathering or by information generation, for example, by literature review or chemical testing." The removal from the tables was intended to clarify that chemical characterization is not an endpoint to be evaluated in the same way as cytotoxicity or sensitization, but rather a foundational input to the entire risk assessment process.[2]
If your biological evaluation documentation was updated for 2025 and chemical characterization was deprioritized or removed from the evaluation plan based on the table changes, that will likely generate a finding. The documentation should reflect that chemical characterization is still required as an input to risk assessment, regardless of its position in the evaluation tables.
Common documentation gaps at a glance
Solvent selection without rationale. The E&L study used specific solvents but the documentation does not explain why those solvents represent the clinical contact conditions or why other polarity ranges were excluded.
AET stated without methodology. The analytical evaluation threshold appears as a number with no derivation showing the inputs, assumptions, and calculation methodology.
Exhaustive extraction treated as clinical exposure. Margin of safety calculations use exhaustive extraction quantities without acknowledging the conservative nature of the data or referencing simulated-use conditions.
Unidentified peaks dismissed generically. Compounds above the AET that could not be identified are addressed with blanket safety statements rather than documented risk assessment approaches.
Broken chain to the TRA. Compounds identified in the E&L study are absent from the TRA, present with different quantification values, or addressed with generic statements instead of individual margin of safety calculations.
Test article does not match the final device. The E&L study was performed on a sample that differs from the current production device, and the documentation does not address the difference or its potential impact.
2025 revision misinterpreted. Chemical characterization was deprioritized in the evaluation plan based on the removal of the physical/chemical column from the endpoint tables, without recognizing that it remains a prerequisite for risk assessment.
If your biological evaluation package includes extractables and leachables data, the documentation surrounding that data needs to be as thorough as the analytical work itself. A well-run E&L study paired with inadequate documentation is a preventable source of deficiency findings. If you want to check how your chemical characterization documentation holds up before submission, request a gap analysis or get in touch to discuss your package.
[1] FDA Draft Guidance, "Biocompatibility Testing Considerations for Medical Devices: Chemical Characterization of Materials" (2024). Section on AET derivation and semi-quantitative analytical method variability.
[2] NAMSA, "ISO 10993-1:2025 Updates: Top 10 Biological Evaluation Essentials in the Revision" (October 2025). Discussion of chemical characterization's removal from endpoint tables and its continued role as a risk assessment input.
BioEvalPro scores your chemical characterization documentation.
Solvent rationale, AET methodology, polarity coverage, E&L-to-TRA traceability, and final finished form consistency. Every sub-criterion scored with specific findings and remediation guidance.