The risk of a terrorist-mediated nuclear or radiological attack has been identified as a major threat to the United States in the coming decade. In the event of a “dirty bomb” or a nuclear blast, thousands of individuals can be expected to present for medical attention to determine whether they have suffered radiation exposure. Radiation exposure can cause a spectrum of hematological toxicities, from mild immunosuppression to myeloablation with concordant life-threatening complications. Accurate biological dosimetry will therefore be critical for caregivers to triage individuals to the appropriate medical management. Current biodosimetric tools include lymphocyte depletion kinetics and cytogenetic analysis, both of which require several days to yield results. New technologies should therefore be applied to develop more rapid tests to diagnose radiation exposure in humans. We propose that high-throughput genomic analysis of peripheral blood mononuclear cells (PB MNCs) can sensitively identify patterns of molecular changes that occur following different levels of radiation exposure. In this study, we collected primary PB MNCs from 10-week-old C57BL/6 mice at 6 hours following 4 different levels of radiation exposure: normal (non-irradiated), 50 cGy (trivial exposure), 200 cGy (myelosuppressive) and 1000 cGy (lethal). Ten samples were collected per condition. RNA was extracted from each sample and spotted array hybridizations were performed. We first identified genes whose expression most highly correlated with exposure to radiation at a particular dose. We then performed a binary regression analysis to elucidate patterns of gene expression that distinguish a normal animal from one exposed to various levels of radiation. Distinct gene expression patterns (30–75 genes/pattern) were evident within PB MNCs at each of the 4 exposure levels, demonstrating the feasibility of this approach.
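The gene-selection step described above can be sketched in code. The snippet below ranks genes by the absolute Pearson correlation of their expression with a binary exposure indicator; the data are entirely synthetic, and the gene count, sample count, and top-gene cutoff are illustrative assumptions rather than values from the study.

```python
import numpy as np

# Sketch of dose-correlated gene selection on synthetic data.
# All sizes and effect magnitudes here are hypothetical.
rng = np.random.default_rng(0)

n_genes, n_samples = 500, 20            # hypothetical array dimensions
labels = np.array([0] * 10 + [1] * 10)  # 10 normal, 10 irradiated samples

# Simulated log-expression matrix (genes x samples);
# make the first 40 genes dose-responsive.
expr = rng.normal(size=(n_genes, n_samples))
expr[:40, labels == 1] += 2.0

def dose_correlated_genes(expr, labels, top_k=50):
    """Return indices of the top_k genes whose expression is most
    correlated (in absolute value) with the exposure indicator."""
    centered = expr - expr.mean(axis=1, keepdims=True)
    lab = labels - labels.mean()
    corr = (centered @ lab) / (
        np.linalg.norm(centered, axis=1) * np.linalg.norm(lab)
    )
    return np.argsort(-np.abs(corr))[:top_k]

selected = dose_correlated_genes(expr, labels, top_k=50)
print(len(selected), "genes selected")
```

In practice the selected genes would feed into the regression model as a candidate signature; correlation ranking is only one of several reasonable filters for this step.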
We next performed a leave-one-out cross validation to assess the ability of the patterns to predict the relevant samples and distinguish a particular exposure level from other doses. We found that the selected metagene pattern for “normal” was able to distinguish normal from 50 cGy, 200 cGy and 1000 cGy exposure with 100% predictive capacity. The predictors selected for 50 cGy and 1000 cGy were equally powerful at distinguishing these levels of exposure from all others. The predictor for 200 cGy readily distinguished that dose from normal and 50 cGy exposure, whereas its separation from 1000 cGy was less distinct. These data demonstrate the power of this approach to correctly distinguish clinically relevant levels of radiation exposure. To validate these molecular predictors generated in mice and translate them into profiles of human radiation response, we are currently testing, in a blinded manner, whether these predictors can distinguish different levels of radiation exposure in human PB samples collected from patients who have undergone 200 cGy or 1000 cGy total body irradiation. We plan to further refine and prune these predictors to identify core groups of genes (20–25 per predictor) capable of effectively distinguishing different levels of human radiation response. These validated biomarkers of radiation response can serve as templates for rapid (e.g., real-time PCR-based) screening tests for radiation exposure and, more broadly, are potential targets for therapeutic intervention.
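The leave-one-out validation loop can be illustrated as follows. Logistic regression stands in here for the binary regression model used in the study, and the expression matrix is synthetic; sample and gene counts are illustrative assumptions. The point is the loop structure: the model is refit with each sample held out in turn, and that sample is then classified.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut

# Sketch of leave-one-out cross validation on a synthetic two-class
# expression data set (normal vs. irradiated). Logistic regression is
# a simplified stand-in for the study's binary regression model.
rng = np.random.default_rng(1)

n_samples, n_genes = 20, 50             # 10 normal + 10 irradiated; 50 signature genes
y = np.array([0] * 10 + [1] * 10)
X = rng.normal(size=(n_samples, n_genes))
X[y == 1, :10] += 2.0                   # hypothetical dose-responsive signature

correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    model = LogisticRegression(max_iter=1000)
    model.fit(X[train_idx], y[train_idx])   # refit with one sample withheld
    correct += int(model.predict(X[test_idx])[0] == y[test_idx][0])

accuracy = correct / n_samples
print(f"LOOCV accuracy: {accuracy:.2f}")
```

Because each held-out sample never influences the model that classifies it, the resulting accuracy is a less biased estimate of how the predictor would perform on new samples than resubstitution accuracy.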
