Background: The initial management of adult acute lymphoblastic leukemia (ALL) has shifted significantly over the past decade. While pediatric patients have historically responded well to chemotherapy, with cure rates around 90%, adult ALL patients have significantly worse outcomes, with long-term cure rates in the mid-40% range. Several studies in the last decade have shown improved overall survival (OS) in adolescents and young adults (AYA) treated with pediatric-type regimens. In 2011, our institution adopted guidelines recommending that all patients under the age of 40 at the time of diagnosis be treated with a Berlin-Frankfurt-Münster (BFM)-based regimen, typically containing asparaginase. Patients over the age of 40 typically receive hyperfractionated cyclophosphamide, vincristine, doxorubicin, and dexamethasone (hyper-CVAD)-based therapy, with the option to proceed to either maintenance chemotherapy or transplantation following complete remission (CR). Prior to 2011, our group's approach to ALL patients was clinician-dependent, with some physicians using BFM-based regimens and others using hyper-CVAD, without a standardized institutional approach.

Methods: We conducted a single-center retrospective review of patients with a new diagnosis of ALL treated at the University of Wisconsin Hospital and Clinics between 2008 and 2016. Data collected included age at diagnosis (<40 vs ≥40 years), white blood cell (WBC) count at presentation, B- vs T-cell subtype, hyper-CVAD- vs BFM-based regimen, BCR/ABL status, transplant status, enrollment in a clinical trial, OS, relapse-free survival (RFS), and minimal residual disease (MRD) status.

Results: Overall, 40 patients with a new diagnosis of ALL were identified. Estimated median OS for all newly diagnosed patients was 3.93 years. At the time of diagnosis, 23 patients were over the age of 40 and 18 were under the age of 40. There was no significant difference in estimated median OS between the under-40 and over-40 subgroups (p=0.103). Eighteen patients were initiated on a hyper-CVAD-based regimen and 23 on a non-hyper-CVAD regimen. Estimated median OS was 3.93 years for patients treated with a hyper-CVAD-based regimen vs 4.17 years for those treated with a non-hyper-CVAD-based regimen, a difference that was not significant (p=0.89). Ten patients underwent allogeneic transplant in first remission; there was no significant difference in OS between those who underwent transplant and those who did not (p=0.42). Nine patients were enrolled in a clinical trial at the time of diagnosis, all but one of whom were under the age of 30 and treated on Children's Oncology Group (COG) studies. Patients enrolled in a clinical trial had a significantly higher estimated median OS (8.48 vs 2.88 years, p=0.02). MRD status was evaluated after induction in 27 patients. Patients who achieved MRD-negative status had significantly longer median OS than those with positive MRD testing (median not reached vs 2.15 years, p=0.001). Presenting WBC count, B- vs T-cell disease, and BCR/ABL status were not significant predictors of OS.
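The abstract does not specify the statistical methods used; the sketch below shows one conventional way such estimates could be produced, assuming Kaplan-Meier estimation of median OS and a log-rank comparison between treatment groups, using the Python lifelines library. The data file and column names (os_years, deceased, regimen) are hypothetical.

```python
# Minimal sketch of a survival analysis of the type reported above, assuming
# Kaplan-Meier estimation and a log-rank test (not stated in the abstract).
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical per-patient data: OS in years, death indicator (1 = deceased),
# and initial regimen ("hyper-CVAD" or "non-hyper-CVAD").
df = pd.read_csv("all_cohort.csv")

# Estimated median OS per regimen group via Kaplan-Meier.
kmf = KaplanMeierFitter()
for regimen, group in df.groupby("regimen"):
    kmf.fit(group["os_years"], event_observed=group["deceased"], label=regimen)
    print(regimen, "estimated median OS (years):", kmf.median_survival_time_)

# Log-rank test comparing OS between the two regimen groups.
a = df[df["regimen"] == "hyper-CVAD"]
b = df[df["regimen"] == "non-hyper-CVAD"]
result = logrank_test(
    a["os_years"], b["os_years"],
    event_observed_A=a["deceased"], event_observed_B=b["deceased"],
)
print("log-rank p-value:", result.p_value)
```

The same pattern would apply to the other comparisons reported (age group, transplant, trial enrollment, MRD status), swapping in the relevant grouping column.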

Conclusion: In 2011, the leukemia group at the University of Wisconsin adopted general guidelines for choosing initial treatment regimens for patients under vs over 40 years of age (BFM-based vs hyper-CVAD). Our single-institution analysis showed that achieving MRD-negative status was associated with improved estimated OS. This observation is consistent with published data indicating that achieving MRD-negative status is the single greatest prognostic indicator after the initiation of therapy. Enrollment on a clinical trial was also associated with improved outcomes, although the number of patients on trial was substantially higher in the under-40 population. There was no significant difference in OS or RFS between hyper-CVAD- and non-hyper-CVAD-based regimens. Limitations of this study include a relatively small sample size and short follow-up duration. These data suggest that achieving disease control, as measured by MRD analysis, with whichever initial regimen an institution is most familiar may be the single greatest predictor of successful treatment.

Disclosures

No relevant conflicts of interest to declare.

Author notes

* Asterisk with author names denotes non-ASH members.
