Session 5: ToxCast Data Interpretation
National Center for Computational Toxicology, US EPA
The U.S. EPA ToxCast program is in its tenth year, and significant progress has been made in the collection, analysis, quality control, and interpretation of the data.
The analysis of ~1,800 chemicals across ~700 high-throughput in vitro assays has revealed that most environmental and industrial chemicals are largely non-selective in the biological targets they perturb, while a small subset of chemicals is relatively selective for specific biological targets.
The selectivity of a chemical informs interpretation of the screening results while also guiding future work based on mode-of-action or adverse outcome pathway approaches. Coupling the high-throughput in vitro assays with additional in vitro pharmacokinetic assays and in vitro-to-in vivo extrapolation modeling allows conversion of in vitro bioactive concentrations into estimates of an administered dose (mg/kg/day). High-throughput exposure models are also being developed that generate exposure estimates based on key aspects of chemical production, fate, transport, and personal use. Comparison of the administered dose to human exposure estimates provides a risk-based context.
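The dose conversion and risk comparison described above can be sketched as follows. This is a minimal illustration of linear reverse dosimetry, not the EPA's actual implementation; the function names and the steady-state concentration parameter are assumptions introduced here for clarity:

```python
def administered_equivalent_dose(ac50_uM, css_uM_per_mgkgday):
    """Convert an in vitro bioactive concentration (AC50, in uM) into an
    estimated administered dose (mg/kg/day), assuming the steady-state
    plasma concentration (Css) produced by a 1 mg/kg/day dose scales
    linearly with dose. Css comes from in vitro pharmacokinetic assays."""
    return ac50_uM / css_uM_per_mgkgday

def bioactivity_exposure_ratio(aed_mgkgday, exposure_mgkgday):
    """Risk-based context: how far the estimated bioactive dose sits above
    the estimated human exposure (a larger ratio means a larger margin)."""
    return aed_mgkgday / exposure_mgkgday

# Hypothetical chemical: AC50 = 5 uM; a 1 mg/kg/day dose yields
# Css = 2 uM; estimated human exposure = 0.001 mg/kg/day.
aed = administered_equivalent_dose(5.0, 2.0)   # 2.5 mg/kg/day
ber = bioactivity_exposure_ratio(aed, 0.001)   # 2500.0
```

The numbers here are placeholders; in practice the exposure estimates would come from the high-throughput exposure models mentioned above.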
Since its inception, the ToxCast program has devoted considerable effort to ensuring data quality and public access in order to provide scientific confidence in the results. NCCT's data analysis software has recently been revamped to increase statistical rigor.
Data quality flags have been added to indicate concerns with chemical purity and identity, noisy data, outliers, systematic assay errors, and activity in the range of cytotoxicity. The raw data have also been released, together with the data analysis software, to enable users to reproduce the summary values.
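A minimal sketch of how such quality flags might be attached to a single concentration-response series is shown below. The flag names and thresholds are illustrative assumptions, not those used in the EPA's data analysis software:

```python
from statistics import median

def quality_flags(responses, cytotox_limit=None, noise_threshold=0.5):
    """Return a list of illustrative data-quality flags for one
    concentration-response series (responses in assay units, ordered
    by concentration). All thresholds are hypothetical placeholders."""
    flags = []
    med = median(responses)
    # Noisy data: a large spread of responses around the median.
    if max(abs(r - med) for r in responses) > noise_threshold * max(abs(med), 1.0):
        flags.append("noisy_data")
    # Outlier: a single point far from the average of its neighbors.
    for i in range(1, len(responses) - 1):
        neighbor_avg = (responses[i - 1] + responses[i + 1]) / 2
        if abs(responses[i] - neighbor_avg) > 3 * noise_threshold:
            flags.append("outlier")
            break
    # Activity in the range of cytotoxicity, if a limit is supplied.
    if cytotox_limit is not None and max(responses) > cytotox_limit:
        flags.append("cytotoxicity_range")
    return flags
```

A series with one aberrant point, e.g. `quality_flags([0.1, 0.1, 5.0, 0.1, 0.1])`, would pick up both the noise and outlier flags, while a flat series returns no flags.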
Additional efforts, such as an external data audit and release of an owner’s manual, are planned for later this year to further increase transparency and data usability. This abstract does not necessarily reflect U.S. EPA policy.