Workshop participants envision a Tox-21c 2.0 that reflects real-world, exposure-based designs (in silico models, cellular systems, organoids, model organisms, and longitudinal epidemiological studies). It will
include population-scale measurements that are based on readily
available biobanks and ecobanks that inform on the distribution
of thousands of chemical and non-chemical stressors in relevant
populations (general population, relevant subgroups, disease
cohorts). Notably, the exposomics approach can potentially interrogate all types of stressors, not just chemicals, that actually perturb biology and change biomarkers in body fluids.
Study designs and computational approaches will be aligned to
provide interpretable and actionable results. Ethical issues, policy
implications, community engagement, and citizen participation
will keep pace with and inform the technology, rather than being
exclusively reactive to the technology. In the near-term, a critical first implementation step for exposure-driven toxicology and precision health is to scale up mass spectrometry technology for high-quality, inexpensive assessment of thousands of chemicals that can be tagged to exogenous exposures, including non-chemical stressors. Libraries that tag key information for those chemicals (metadata layering) will need to be expanded and developed to facilitate interpretation and to guide preventive strategies, interventions, and policy recommendations.
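As a concrete illustration of what such metadata layering could look like in practice, the sketch below annotates untargeted mass-spectrometry features against a metadata-tagged chemical library. The library format, field names, and mass tolerance are illustrative assumptions, not an established standard:

```python
# Minimal sketch: annotating mass-spectrometry features against a
# metadata-layered chemical library. Library format, field names, and
# tolerance are illustrative assumptions, not a published standard.

from dataclasses import dataclass

@dataclass
class LibraryEntry:
    name: str                 # chemical name
    monoisotopic_mass: float  # Da
    exposure_source: str      # metadata layer: likely exogenous source
    stressor_class: str       # metadata layer: chemical / non-chemical proxy

# Toy library; a real one would hold thousands of curated entries.
LIBRARY = [
    LibraryEntry("bisphenol A", 228.1150, "consumer plastics", "chemical"),
    LibraryEntry("cotinine", 176.0950, "tobacco smoke", "chemical"),
    LibraryEntry("cortisol", 362.2093, "endogenous (stress proxy)", "non-chemical"),
]

def annotate(feature_mass: float, tol_ppm: float = 5.0):
    """Return library entries whose mass matches within tol_ppm."""
    return [
        e for e in LIBRARY
        if abs(e.monoisotopic_mass - feature_mass) / e.monoisotopic_mass * 1e6 <= tol_ppm
    ]

# Example: an untargeted feature detected at 228.1148 Da
for hit in annotate(228.1148):
    print(hit.name, "-", hit.exposure_source, "-", hit.stressor_class)
```

Each metadata layer attached to a matched feature (source, stressor class, and in a fuller library, regulatory status or toxicity reference values) is what turns a raw chemical detection into an interpretable exposure signal.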
In the mid-term, technologies will be required that link the exposome
with health outcomes, and leverage longitudinal studies and
biobanks retrospectively and prospectively, ensuring “FAIR”-ness (Findability, Accessibility, Interoperability, and Reuse of digital assets; https://www.go-fair.org/fair-principles/). In the long-term (20 years), we envision that exposome-
disease predictions and exposome-targeted prevention and
treatment solutions will become part of the toxicology and
public health practice landscape, also leveraging other omics
technologies, genomic information, and clinical characteristics.
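To make the FAIR principles concrete, the following is a minimal sketch of a machine-readable dataset descriptor for a biobank exposome study. Field names loosely follow DCAT/schema.org conventions, and all identifiers and values are placeholders, not real resources:

```python
# Minimal sketch of a FAIR-oriented dataset descriptor. All identifiers
# and values are placeholders; field names loosely follow DCAT/schema.org.

import json

record = {
    "identifier": "doi:10.xxxx/placeholder",       # Findable: persistent ID
    "title": "Untargeted serum exposomics, adult cohort",
    "accessURL": "https://example.org/datasets/exposome-001",  # Accessible
    "license": "CC-BY-4.0",
    "conformsTo": "mzML 1.1",                       # Interoperable: open format
    "keywords": ["exposome", "mass spectrometry", "biobank"],
    "provenance": {                                 # Reusable: rich context
        "instrument": "high-resolution LC-MS (vendor-neutral description)",
        "collectionPeriod": "2015-2020",
        "consent": "broad research use",
    },
}

print(json.dumps(record, indent=2))
```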
Technology-enabled Toxicology
The workshop participants discussed technological advances
over the last 10-15 years of great relevance to toxicology in
three key areas: cell and tissue biology, bioengineering, and
computational methods. Workshop participants noted that
while biological technologies, such as stem cell engineering,
have emerged as routine, commercial enterprises for
biomedical research, their potential in toxicology could be
further expanded through a) reliable and genetically diverse
cell sourcing, b) improved protocols to differentiate patient-
derived stem cells into adult cell phenotypes across essential
tissues, c) integrative and non-invasive biomarkers, d) integration
of dynamic physiology and pathophysiology outcomes, e)
capturing population heterogeneity and susceptibility across the life course,
and f) biological surrogates for non-chemical stressors. On
the bioengineering side, workshop participants noted that
technological capabilities, such as microphysiological systems
(MPS), have shown many successes in the laboratory but need
to be further developed to 1) include a variety of models of
increasing architectural complexity (monolayer/suspension
cultures, organoids and multi-organoid systems) for different
stages of drug/chemical development, 2) better represent
healthy and diseased populations by a personalized multiverse
of possible futures, 3) codify platform standardization, 4)
increase throughput, 5) demonstrate validation against in vivo
outcomes, 6) incorporate perfusion and biosensors with near
real-time outputs, and 7) develop automated fabrication. In
addition, the workshop participants noted that the emergence
of “big data” and “big compute” has revolutionized much of
biology, through the ability to analyze and interpret complex and
multi-dimensional information. Computational capabilities and
models are of utmost importance for toxicology, serving as the
key enabling technology. For instance, AI/Machine Learning has
emerged as a key technology to support data mining, predictive
modeling, hypothesis generation, and evidence interpretation
(e.g., explainable AI). Data acquisition and data-sharing following the FAIR principles are key to unleashing these opportunities. The emergence of these needs in toxicology necessitates widespread use and understanding of these technologies, combining them with expert knowledge to yield augmented intelligence workflows. Moreover, given the quantity of information generated
and consumed by these new technologies, the workshop
participants agreed that there is a need for comparable,
compatible, integrable multi-omic databases, quantitative in vitro
to in vivo extrapolation, and the development of in silico “digital
twins” of in vitro and in vivo systems.
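As one concrete piece of quantitative in vitro to in vivo extrapolation (QIVIVE), the sketch below implements reverse dosimetry with a deliberately simplified one-compartment, steady-state model, the kind of calculation a digital twin of an in vitro system would embed. All parameter values are illustrative assumptions; real applications rely on measured clearances and population pharmacokinetic models:

```python
# Minimal sketch of QIVIVE by reverse dosimetry, using a deliberately
# simplified one-compartment, steady-state model. All parameter values
# are illustrative assumptions.

def steady_state_css(dose_mg_per_kg_day: float,
                     clearance_l_per_kg_day: float,
                     fraction_absorbed: float = 1.0) -> float:
    """Steady-state plasma concentration (mg/L) for a constant daily dose."""
    return dose_mg_per_kg_day * fraction_absorbed / clearance_l_per_kg_day

def oral_equivalent_dose(ac50_uM: float, mol_weight: float,
                         clearance_l_per_kg_day: float) -> float:
    """Dose (mg/kg/day) predicted to produce a Css equal to the in vitro AC50."""
    ac50_mg_per_l = ac50_uM * mol_weight / 1000.0   # uM -> mg/L
    css_at_unit_dose = steady_state_css(1.0, clearance_l_per_kg_day)
    return ac50_mg_per_l / css_at_unit_dose

# Example: AC50 = 10 uM, MW = 228 g/mol, clearance = 5 L/kg/day (all assumed)
print(f"{oral_equivalent_dose(10.0, 228.0, 5.0):.2f} mg/kg/day")
```

Comparing such an oral-equivalent dose against measured or modeled population exposures is what connects in vitro bioactivity to real-world risk context.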
Evidence-integrated Toxicology
Workshop participants discussed the key challenge of integrating
data and methods (evidence streams) in test strategies,
systematic reviews, and risk assessments. They agreed that
evidence-based toxicology and probabilistic risk assessments are
emerging solutions to this challenge. Evidence integration across
evidence streams (epidemiological, animal toxicology, in vitro, in
silico, non-chemical stressors, etc.) is expected to play a key role
in translating evidence into knowledge that can inform decision-
making. The group developed a vision to conduct complex
rapid/real-time evidence integration by combining advances in data-sharing and the application of artificial intelligence (e.g., natural language processing) with the transparency and rigor of systematic reviews (see the screening sketch below). To implement this vision, the workshop participants identified a need for collaborative, open
platform(s) to transparently collect, process, share, and interpret
data, information, and knowledge on chemical and non-chemical
stressors. Creating these platforms is foundational for rapid and
real-time evidence integration and will empower all steps of
protection of human health and the environment. Several needs
were identified to create this platform: 1) software development to create dynamic and accessible interfaces, 2) definitive standards and key data elements to facilitate analysis of metadata and automated annotation, and 3) considerations for quality control.
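The following is a minimal sketch of the natural-language-processing step envisioned above: a TF-IDF relevance classifier for screening abstracts in a systematic review. The training abstracts are synthetic placeholders, and any production system would keep a human reviewer in the loop:

```python
# Minimal sketch of NLP-assisted screening for systematic review, one
# ingredient of rapid evidence integration. The toy abstracts below are
# synthetic placeholders.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Synthetic training examples: 1 = relevant to the review question, 0 = not.
abstracts = [
    "Serum biomarkers of chemical exposure and liver toxicity in a cohort study",
    "In vitro hepatotoxicity of industrial solvents in primary hepatocytes",
    "A survey of museum attendance among retired adults",
    "Weather patterns and regional crop yields in 2019",
]
labels = [1, 1, 0, 0]

# TF-IDF features feeding a logistic-regression relevance classifier.
screener = make_pipeline(TfidfVectorizer(), LogisticRegression())
screener.fit(abstracts, labels)

new_abstract = ["Dose-response analysis of solvent exposure and hepatic injury"]
prob_relevant = screener.predict_proba(new_abstract)[0, 1]
print(f"Predicted relevance: {prob_relevant:.2f}")
```

In a transparent review workflow, such scores would prioritize abstracts for human screening rather than replace it, preserving the rigor the workshop participants emphasized.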
In conclusion, the workshop advocates a paradigm shift
to “Toxicology 2.0” based on the evidence integration of
emerging disruptive technologies, especially exposomics,
microphysiological systems, and machine learning. To date,
exposure considerations typically follow the identification of a hazard. Future Tox-21c 2.0 must be guided by the identification