Management Strategies in Acute Lymphoblastic Leukemia

Survival in acute lymphoblastic leukemia (ALL) has improved in recent decades due to recognition of the biologic heterogeneity of ALL, utilization of risk-adapted therapy, and development of protocols that include optimized chemotherapy combinations, effective central nervous system (CNS) prophylaxis, post-induction intensification of therapy, and a prolonged maintenance phase of treatment. Recent molecular studies have yielded novel insights into both leukemia biology and host pharmacogenetic factors, and large cooperative group clinical research studies have successively refined effective treatment strategies. While children have higher remission and cure rates than adults, both populations have benefited from these discoveries and innovations. Future challenges in this field include improving outcomes for high-risk patients and those with relapsed disease, and developing and integrating novel targeted therapeutic agents into current regimens to reduce toxicities while further improving outcomes.

Acute lymphoblastic leukemia (ALL) is the most common malignancy of childhood, accounting for approximately 25% of childhood cancers and approximately 4,900 cases per year in the United States.[1] A second peak in incidence occurs after 50 years of age, and although ALL accounts for a smaller proportion of adult than of pediatric malignancies, the absolute number of adult cases is ten times greater.[2] Survival in childhood ALL has improved dramatically over the past 50 years; once a nearly incurable disease, pediatric ALL now has overall survival rates of over 80%.[3] Survival in adults has also improved over time but remains considerably poorer at approximately 40%.[4] Some of the key principles responsible for the improvement in outcomes include the use of combination chemotherapy to prevent the emergence of resistant clones, preventive CNS-directed therapy, the introduction of a delayed intensification phase of treatment, and risk stratification based on cytogenetic features and early response to treatment. Nevertheless, significant challenges remain. Survival in certain subgroups, such as infants and patients with adverse genetic features (eg, hypodiploidy), has improved very little over time. In addition, salvage rates are dismal for most patients who relapse. This review will cover the key elements of modern ALL treatment regimens, focusing primarily on front-line treatment and concluding with a brief discussion of the management of relapsed disease. Childhood ALL will be the primary focus, since children constitute over 60% of ALL cases and the majority are enrolled in clinical research trials. Significant contrasts with adult ALL will be highlighted.

Risk Stratification

TABLE 1: Prognostic Features in Childhood ALL

One of the key factors responsible for survival gains in ALL is the recognition that, rather than being treated as a single entity, ALL should be treated as a set of heterogeneous disease subgroups that each require tailored therapy. Table 1 lists the key risk factors that affect prognosis on current regimens. It should be noted that risk factors are not absolute; rather, they differ in significance depending on the treatment regimen. Host factors include age, gender, and race and ethnicity. Disease characteristics include initial white blood cell (WBC) count at diagnosis, immunophenotype, genetic features, extramedullary involvement, and treatment response. Both age (with the exception of infants under one year of age) and initial WBC count behave as continuous variables, with increasing values associated with progressively worse prognosis. However, in most pediatric risk stratification schemas they are treated as categorical variables, and cut-off values known as the National Cancer Institute (NCI)/Rome criteria are used: age <1 year or ≥10 years, or an initial WBC count ≥50,000/µL, places a patient in the high-risk group.[5] Increasing age and initial WBC count are both significant prognostic factors in adults as well, with the age cutoffs for high risk on different protocols ranging anywhere from 35 to 65 years, and initial WBC count cutoffs from 5,000 to 30,000/µL.[4] Males historically have had slightly worse survival, although this difference has diminished recently.[3] Race and ethnicity also affect outcome, with Asians having the best outcomes, followed by Caucasians, blacks, and Hispanics.[6] Racial and ethnic differences in outcome have multifactorial causes, including socioeconomic factors and biologic differences in disease features and host pharmacogenetics.[6,7] Indeed, a gene expression signature significantly associated with Hispanic ethnicity was recently identified, which may partially explain the survival disadvantage in Hispanics[8]; also, a component of genomic variation cosegregating with Native American ancestry was recently reported to be associated with increased risk of relapse.[9]
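To make the age and WBC cut-offs concrete, the following minimal Python sketch applies the NCI/Rome criteria described above. The function name and return labels are illustrative rather than taken from any protocol, and infants are flagged separately because they are treated on dedicated regimens.

def nci_rome_risk(age_years: float, wbc_per_ul: float) -> str:
    # Infants (<1 year) are handled on dedicated infant-ALL protocols.
    if age_years < 1:
        return "infant (separate protocols)"
    # NCI/Rome high risk: age >= 10 years or initial WBC >= 50,000/uL.
    if age_years >= 10 or wbc_per_ul >= 50_000:
        return "high risk"
    return "standard risk"

# Example: a 4-year-old with a presenting WBC of 12,000/uL is standard risk,
# while a 12-year-old with the same count is high risk by age alone.
print(nci_rome_risk(4, 12_000))   # standard risk
print(nci_rome_risk(12, 12_000))  # high risk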

Immunophenotype is used to assign patients to distinct treatment regimens for T-cell, B-precursor, or mature B-cell leukemia. Other immunophenotypic differences (eg, the adverse effect of CD10 negativity in B-precursor ALL, and a recently identified early T-cell precursor immunophenotype) affect prognosis but do not at present alter treatment.[10] The genetic features of ALL have been intensively researched for decades, and new insights have been generated by each successive methodological advance, including karyotyping, fluorescence in situ hybridization (FISH), gene expression and single-nucleotide polymorphism (SNP) array profiling, conventional and next-generation sequencing, and other techniques.[11] However, only a few features fulfill the criteria for incorporation into risk stratification schemas on a widespread basis:

• Occurrence in a clinically relevant proportion of patients.

• Contribution of independent prognostic information beyond that of other established risk factors.

• Ready availability in everyday clinical practice.

Significant favorable features used for risk stratification in most modern regimens include the ETV6-RUNX1 fusion gene generated by the t(12;21) translocation, and high hyperdiploidy (particularly trisomies of chromosomes 4, 10, and 17).[12] Adverse features include the BCR-ABL1 fusion gene generated by the t(9;22) translocation, hypodiploidy, and MLL rearrangements.[12] Recent studies have identified additional novel adverse prognostic factors that are beginning to be incorporated into risk stratification schemas: Ikaros (IKZF1) deletions or mutations,[13] Janus kinase 2 (JAK2) activating mutations and/or cytokine receptor–like factor 2 (CRLF2) overexpression,[14] and intrachromosomal amplification of chromosome 21 (iAMP21).[15] Genetically defined subtypes also have distinct drug sensitivity patterns: ETV6-RUNX1-positive ALL shows enhanced sensitivity to asparaginase, and hyperdiploid ALL to methotrexate, whereas ETV6-RUNX1-positive, TCF3-PBX1-positive, and T-cell ALL require higher methotrexate doses to achieve equivalent intracellular concentrations of the active methotrexate polyglutamate metabolites.[16]
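As a rough illustration of how such lesions feed into risk assignment, the hypothetical lookup below maps the features named above to the prognostic direction they carry; the lesion labels and the default "intermediate" category are editorial choices, and real protocols combine these lesions with age, presenting WBC count, and treatment response rather than using them in isolation.

# Illustrative only; not a cooperative-group classification.
GENETIC_RISK = {
    "ETV6-RUNX1": "favorable",            # t(12;21) fusion
    "high hyperdiploidy": "favorable",    # esp. trisomies 4, 10, and 17
    "BCR-ABL1": "adverse",                # t(9;22) fusion
    "hypodiploidy": "adverse",
    "MLL rearrangement": "adverse",
    "IKZF1 deletion/mutation": "adverse",
    "JAK2 mutation / CRLF2 overexpression": "adverse",
    "iAMP21": "adverse",
}

def genetic_risk(lesion: str) -> str:
    # Lesions not listed above are treated as prognostically neutral here.
    return GENETIC_RISK.get(lesion, "intermediate")

print(genetic_risk("ETV6-RUNX1"))  # favorable
print(genetic_risk("iAMP21"))      # adverse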

FIGURE 1: Event-Free Survival (EFS) of All Patients Enrolled in the Pediatric Oncology Group 9900 Series of Trials Who Had Satisfactory End-Induction Minimal Residual Disease (MRD)

Treatment response has assumed importance relatively recently, as technologic advances have made detection of minimal residual disease (MRD) possible on a routine clinical basis. MRD assays are based either on flow cytometric detection of an aberrant combination of surface markers characteristic of the leukemic clone, or on polymerase chain reaction (PCR) detection of a fusion transcript, gene mutation, or clonal immunoglobulin or T-cell receptor rearrangement.[17] Many current regimens include measurement of MRD during and at the end of induction, and sometimes at later time points as well. MRD positivity generally necessitates intensification of therapy (see Figure 1), whereas MRD negativity in some cases may warrant consideration of decreased treatment intensity—eg, for selected favorable-risk patients with low MRD at days 8 and 29, a recent series reported 5-year event-free survival (EFS) of 97% ± 1%.[18] Bone marrow morphology following one or two weeks of induction therapy was previously used as a measure of early response, but it is increasingly being replaced by bone marrow or peripheral blood MRD measurements because of their superior sensitivity. The Berlin-Frankfurt-Münster (BFM) Study Group also continues to employ another measure of early response, namely, response to an initial seven-day prednisone window.[19]
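A hedged sketch of this kind of MRD-guided adaptation is shown below. The 0.01% positivity threshold, the day-8/day-29 time points, and the suggested actions are assumptions chosen for illustration (flow cytometric MRD assays commonly report down to roughly this level); actual protocols define their own thresholds, time points, and responses.

# Assumed threshold: 0.01% of nucleated marrow cells; not taken from the text.
MRD_POSITIVE_THRESHOLD = 0.0001

def adapt_therapy(day8_mrd: float, day29_mrd: float, favorable_features: bool) -> str:
    # End-induction (day-29) MRD positivity drives intensification.
    if day29_mrd >= MRD_POSITIVE_THRESHOLD:
        return "intensify post-induction therapy"
    # Very low MRD at both time points in favorable-risk patients may
    # support consideration of reduced treatment intensity.
    if favorable_features and day8_mrd < MRD_POSITIVE_THRESHOLD:
        return "consider reduced-intensity therapy"
    return "continue risk-assigned therapy"

print(adapt_therapy(day8_mrd=0.00005, day29_mrd=0.0, favorable_features=True))
# consider reduced-intensity therapy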
