The main goal of my work is a paradigm shift in toxicity testing to improve public health. Drawing on my background as head of the European Center for the Validation of Alternative Methods (ECVAM) of the European Commission (2002-2008), I am involved in the implementation of the 2007 NRC vision document “Toxicity Testing in the 21st Century – a vision and a strategy”. I have furthered the translation of concepts from evidence-based medicine to toxicology (evidence-based toxicology), which aims at the systematic assessment of the quality of all tools for regulatory toxicology and at the development of new approaches based on annotated pathways of toxicity (the Human Toxome).
I have a broad background in clinical and experimental pharmacology and toxicology, documented in more than 550 publications. Previous work centered on the immune recognition of bacteria, including pyrogen testing, and the induced inflammatory response; the pharmacological modulation of these responses was studied in experimental and clinical approaches. I relocated to the US in early 2009 and, besides the directorship of the Center for Alternatives to Animal Testing (CAAT), established a laboratory for developmental neurotoxicity research based on genomics and metabolomics; the respective technologies were made available through a Thought-Leader Award from Agilent. Recent advances use big data and artificial intelligence for predictive toxicology.
2022 OpenTox Virtual Conference
Toward probabilistic risk assessment – the ONTOX project
The 2007 NAS report on Toxicity Testing for the 21st Century was a watershed moment for toxicology. Since then, the discussion has no longer been whether to change but how and how fast. With knowledge in the life sciences doubling every seven years, we now have four times the understanding, and a number of disruptive technologies have emerged that were not anticipated in the report, such as Microphysiological Systems (MPS) and Machine Learning, aka Artificial Intelligence (AI). To embrace these developments and move toxicology to a more holistic and integrated paradigm, the Basic Research Office of the Office of the Under Secretary of Defense for Research and Engineering, OUSD(R&E), hosted the Future Directions workshop “Advancing the Next Scientific Revolution in Toxicology” on April 28-29, 2022, at the Basic Research Innovation Collaboration Center (BRICC) in Arlington, VA. A vanguard of scientific and technical experts and agency observers developed a report laying out how recent developments can be embraced to set the direction of “Toxicology for the 21st Century 2.0” in the coming decades.
Computational approaches, especially AI, play a key role here:
A central role for exposomics in the shift toward a more exposure-driven toxicology, with AI enabling us to make sense of ~omics (big) data
Predictive toxicology through automated read-across such as read-across-based structure-activity relationships (RASAR)
The computational modeling of in vitro tests and MPS
Digital pathology through image analysis
Information extraction by Natural Language Processing of scientific literature and the grey information of the internet as well as curated databases of legacy data
Integration of different evidence streams to enable probabilistic risk assessment
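The last point can be made concrete with a minimal Monte Carlo sketch. This is an illustration only, not the methodology of any project named here: all distribution parameters below are made up. Instead of comparing a single exposure estimate to a single threshold, both exposure and the point of departure (POD) are treated as uncertain distributions, and risk is expressed as the probability that exposure exceeds the POD.

```python
import random

random.seed(42)

# Hypothetical lognormal distributions (illustrative parameters only):
# exposure and point of departure (POD), both in mg/kg bw/day.
N = 100_000
exceedances = 0
for _ in range(N):
    exposure = random.lognormvariate(mu=-3.0, sigma=0.8)  # median ~0.05 mg/kg/day
    pod = random.lognormvariate(mu=0.0, sigma=0.5)        # median ~1 mg/kg/day
    if exposure > pod:
        exceedances += 1

# Probabilistic output: a probability of exceedance, not a binary safe/unsafe verdict.
risk_probability = exceedances / N
print(f"P(exposure > POD) = {risk_probability:.4f}")
```

The result is a probability statement that carries the uncertainty of both evidence streams, rather than a deterministic margin of safety.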
The EU project ONTOX is working toward the implementation of some of these goals. ONTOX aims to deliver New Approach Methodologies (so-called NAMs) for probabilistic risk assessment in toxicology (Vinken et al., 2021). To this end, physiological and toxicological data are collected and aggregated into physiological maps. These maps represent current knowledge of physiological and toxicological perturbations caused by chemicals and serve as input for establishing new Adverse Outcome Pathways (AOPs) or improving existing ones. From these, quantitative AOPs will be developed to quantitatively model compound-biology interactions. Next to the AOPs and physiological maps, ONTOX aims to develop a big-data approach for performing probabilistic risk assessment (Maertens et al., 2022), based on read-across structure-activity relationships (RASAR) (Luechtefeld et al., 2018). This artificial intelligence approach is ultimately intended as an information toolbox for performing chemical toxicological risk assessment.
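As a rough intuition for read-across, here is a deliberately simplified sketch: a chemical without test data inherits the hazard label of structurally similar, data-rich chemicals, weighted by fingerprint similarity. This is not the RASAR implementation of Luechtefeld et al. (2018), which applies machine learning to large binary-fingerprint databases; all chemical names, fingerprints, and labels below are invented for illustration.

```python
def tanimoto(a: set, b: set) -> float:
    """Tanimoto (Jaccard) similarity between two fingerprint bit sets."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical database: chemical -> (fingerprint bits, known toxic label)
database = {
    "chem_A": ({1, 2, 3, 7}, True),
    "chem_B": ({1, 2, 4}, True),
    "chem_C": ({5, 6, 8}, False),
    "chem_D": ({5, 6, 9}, False),
}

def read_across(query_fp: set, k: int = 3) -> float:
    """Similarity-weighted estimate that the query chemical is toxic."""
    neighbors = sorted(
        ((tanimoto(query_fp, fp), toxic) for fp, toxic in database.values()),
        reverse=True,
    )[:k]
    total = sum(sim for sim, _ in neighbors)
    if total == 0:
        return 0.5  # no similar neighbors: uninformative estimate
    return sum(sim for sim, toxic in neighbors if toxic) / total

# Query chemical resembling the two toxic entries in the toy database
p_toxic = read_across({1, 2, 3, 4})
print(f"P(toxic) = {p_toxic:.2f}")
```

Real read-across additionally weighs data quality, applicability domain, and mechanistic plausibility; the sketch only shows the similarity-weighting core.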
Maertens A, Golden E, Luechtefeld TH, Hoffmann S, Tsaioun K and Hartung T. Probabilistic Risk Assessment – the Keystone for the Future of Toxicology. ALTEX 2022, 39:3-29. doi: 10.14573/altex.2201081.
Vinken M, Benfenati E, Busquet F, Castell J, Clevert D-A, de Kok T, Dirven H, Fritsche E, Geris L, Gozalbes R, Hartung T, Jennen D, Jover R, Kandarova H, Kramer N, Krul C, Luechtefeld T, Masereeuw R, Roggen E, Schaller S, Vanhaecke T, Yang C, and Piersma AH. Safer chemicals using less animals: kick-off of the European ONTOX project. Toxicology 2021, 458, 152846, doi: 10.1016/j.tox.2021.152846.
Luechtefeld T, Marsh D, Rowlands C and Hartung T. Machine learning of toxicological big data enables read-across structure activity relationships (RASAR) outperforming animal test reproducibility. Toxicological Sciences, 2018, 165:198-212. doi: 10.1093/toxsci/kfy152.