Management Strategies in Acute Lymphoblastic Leukemia

ABSTRACT: Survival in acute lymphoblastic leukemia (ALL) has improved in recent decades due to recognition of the biologic heterogeneity of ALL, utilization of risk-adapted therapy, and development of protocols that include optimized chemotherapy combinations, effective central nervous system (CNS) prophylaxis, post-induction intensification of therapy, and a prolonged maintenance phase of treatment. Recent molecular studies have yielded novel insights into both leukemia biology and host pharmacogenetic factors; also, large cooperative group clinical research studies have successively refined effective treatment strategies. While children have higher remission and cure rates than adults, both populations have benefited from these discoveries and innovations. Future challenges in this field include improving outcomes for high-risk patients and those with relapsed disease, and developing and integrating novel targeted therapeutic agents into current regimens to reduce toxicities while further improving outcomes.

Acute lymphoblastic leukemia (ALL) is the most common malignancy of childhood, accounting for approximately 25% of childhood cancer and approximately 4,900 cases per year in the United States.[1] A second peak in incidence occurs after 50 years of age, and although ALL accounts for a much smaller proportion of adult than of pediatric malignancies, a substantial number of cases occur in adults.[2] Survival in childhood ALL has improved dramatically over the past 50 years; once a nearly incurable disease, pediatric ALL now has overall survival rates of over 80%.[3] Survival in adults has also improved over time but remains considerably poorer, at approximately 40%.[4] Key principles responsible for the improvement in outcomes include the use of combination chemotherapy to prevent the emergence of resistant clones, preventive CNS-directed therapy, the introduction of a delayed intensification phase of treatment, and risk stratification based on cytogenetic features and early response to treatment. Nevertheless, significant challenges remain. Survival in certain subgroups, such as infants and patients with adverse genetic features (eg, hypodiploidy), has improved very little over time. In addition, salvage rates are dismal for most patients who relapse. This review will cover the key elements of modern ALL treatment regimens, focusing primarily on front-line treatment and concluding with a brief discussion of the management of relapsed disease. Childhood ALL will be the primary focus, since children constitute over 60% of ALL cases and the majority are enrolled in clinical research trials. Significant contrasts with adult ALL will be highlighted.

Risk Stratification

TABLE 1: Prognostic Features in Childhood ALL

One of the key factors responsible for survival gains in ALL is the recognition that, rather than being treated as a single entity, ALL should be treated as a set of heterogeneous disease subgroups, each of which requires tailored therapy. Table 1 lists the key risk factors that affect prognosis on current regimens. It should be noted that risk factors are not absolute; rather, their significance differs depending on the treatment regimen. Host factors include age, gender, and race and ethnicity. Disease characteristics include white blood cell (WBC) count at diagnosis, immunophenotype, genetic features, extramedullary involvement, and treatment response. Both age (with the exception of infants under 1 year of age) and initial WBC count behave as continuous variables, with increasing values associated with increasingly poor prognosis. However, most pediatric risk stratification schemas treat them as categorical variables using the National Cancer Institute (NCI)/Rome cut-off criteria, in which age <1 year or ≥10 years and initial WBC count ≥50,000/µL are considered high risk.[5] Increasing age and initial WBC count are significant prognostic factors in adults as well, with the age cutoffs for high risk on different protocols ranging anywhere from 35 to 65 years, and initial WBC count cutoffs from 5,000 to 30,000/µL.[4] Males historically have had slightly worse survival, although this difference has diminished recently.[3] Race and ethnicity also affect outcome, with Asians having the best outcomes, followed by Caucasians, blacks, and Hispanics.[6] Racial and ethnic differences in outcome have multifactorial causes, including socioeconomic factors and biologic differences in disease features and host pharmacogenetics.[6,7] Indeed, a gene expression signature significantly associated with Hispanic ethnicity was recently identified, which may partially explain the survival disadvantage in Hispanics[8]; also, a component of genomic variation cosegregating with Native American ancestry was recently reported to be associated with increased risk of relapse.[9]
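To make the categorical assignment concrete, the following minimal Python sketch applies the NCI/Rome cut-offs to age and presenting WBC count. The function name and interface are hypothetical, introduced only for illustration; actual protocols layer genetic features and treatment response on top of these criteria.

```python
def nci_rome_risk(age_years: float, wbc_per_ul: float) -> str:
    """Classify ALL risk by the NCI/Rome criteria.

    High risk: age <1 year or >=10 years, or initial WBC >=50,000/uL.
    Standard risk: everything else. Hypothetical helper for
    illustration; real protocols add genetic and response criteria.
    """
    if age_years < 1 or age_years >= 10 or wbc_per_ul >= 50_000:
        return "high"
    return "standard"


# Example: a 4-year-old presenting with WBC 12,000/uL is standard risk,
# while a 12-year-old with WBC 8,000/uL is high risk by age alone.
print(nci_rome_risk(age_years=4, wbc_per_ul=12_000))   # -> "standard"
print(nci_rome_risk(age_years=12, wbc_per_ul=8_000))   # -> "high"
```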

Immunophenotype is used to stratify patients to distinct treatment regimens for T-cell, B-precursor, or mature B-cell leukemia. Other immunophenotypic differences (eg, the adverse effect of CD10 negativity in B-precursor ALL, and a recently identified early T-cell precursor immunophenotype) affect prognosis but do not at present alter treatment.[10] The genetic features of ALL have been intensively researched for decades, with new insights generated by each successive methodological advance: karyotyping, fluorescence in situ hybridization (FISH), gene expression and single nucleotide polymorphism (SNP) array profiling, conventional and next-generation sequencing, and other techniques.[11] However, only a few features fulfill the criteria for incorporation into risk stratification schemas on a widespread basis:

• Occurrence in a clinically relevant proportion of patients.

• Contribution of independent prognostic information beyond that of other established risk factors.

• Ready availability in everyday clinical practice.

Significant favorable features used for risk stratification for most modern regimens include the ETV6-RUNX1 fusion gene generated by the t(12;21) translocation, and high hyperdiploidy (particularly trisomies 4, 10, and 17).[12] Adverse features include the BCR-ABL1 fusion gene generated by the t(9;22) translocation, hypodiploidy, and MLL rearrangements.[12] Recent studies have identified additional novel adverse prognostic factors that are beginning to be incorporated in risk stratification schemas: Ikaros (IKZF1) deletions or mutations,[13] Janus kinase 2 (JAK2) activating mutations and/or cytokine receptor–like factor 2 (CRLF2) overexpression,[14] and chromosome 21 intrachromosomal amplification (iAMP21).[15] Genetically defined subtypes have specific drug sensitivity patterns; enhanced sensitivity to asparaginase is seen in ETV6-RUNX1-positive ALL, and to methotrexate in hyperdiploid ALL, whereas ETV6-RUNX1-positive, TCF3-PBX1-positive, and T-cell ALL require higher methotrexate doses to yield equivalent intracellular concentrations of the active methotrexate polyglutamate metabolites.[16]
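One way to picture how these genetic lesions feed into a stratification schema is as a simple lookup from lesion to prognostic direction. The sketch below is purely illustrative and assumes a hypothetical mapping and helper function; real protocols weigh these features alongside the NCI/Rome criteria and MRD response rather than in isolation.

```python
# Illustrative mapping of the genetic lesions discussed above to their
# prognostic direction; hypothetical structure, not an actual protocol.
GENETIC_PROGNOSIS = {
    "ETV6-RUNX1": "favorable",            # t(12;21) fusion
    "high hyperdiploidy": "favorable",    # esp. trisomies 4, 10, 17
    "BCR-ABL1": "adverse",                # t(9;22) fusion
    "hypodiploidy": "adverse",
    "MLL rearrangement": "adverse",
    "IKZF1 deletion/mutation": "adverse",
    "JAK2 mutation / CRLF2 overexpression": "adverse",
    "iAMP21": "adverse",
}


def genetic_risk(lesions: list[str]) -> str:
    """Return 'adverse' if any adverse lesion is present, else
    'favorable' if a favorable lesion is present, else 'neutral'."""
    categories = {GENETIC_PROGNOSIS.get(l, "neutral") for l in lesions}
    if "adverse" in categories:
        return "adverse"
    if "favorable" in categories:
        return "favorable"
    return "neutral"


print(genetic_risk(["high hyperdiploidy"]))                     # -> "favorable"
print(genetic_risk(["ETV6-RUNX1", "IKZF1 deletion/mutation"]))  # -> "adverse"
```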

FIGURE 1: Event-Free Survival (EFS) of All Patients Enrolled in the Pediatric Oncology Group 9900 Series of Trials Who Had Satisfactory End-Induction Minimal Residual Disease (MRD)

Treatment response has assumed importance relatively recently, as technologic advances have made detection of minimal residual disease (MRD) possible on a routine clinical basis. MRD assays are based either on flow cytometric detection of an aberrant combination of surface markers characteristic of the leukemic clone, or on polymerase chain reaction (PCR) detection of a fusion transcript, gene mutation, or clonal immunoglobulin or T-cell receptor rearrangement.[17] Many current regimens include measurements of MRD during and at the end of induction, and sometimes at later time points as well. MRD positivity generally necessitates intensification of therapy (see Figure 1), and MRD negativity in some cases may warrant consideration of decreased treatment intensity—eg, for selected favorable-risk patients with low MRD at days 8 and 29, a recent series reported 5-year event-free survival (EFS) of 97% ± 1%.[18] Bone marrow morphology following one or two weeks of induction therapy was previously used as a measure of early response, but this is generally being replaced by measures of either bone marrow or peripheral blood MRD due to the superior sensitivity of these tests. The Berlin-Frankfurt-Münster (BFM) Study Group continues to employ another measure of early response as well, namely, response to an initial seven-day prednisone window.[19]
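As a concrete illustration of the MRD thresholds discussed here and in the next section, the sketch below computes an MRD level as the fraction of leukemia-phenotype events among total analyzed events and applies the commonly cited 0.01% (10^-4) cut-off. The function names are hypothetical, and real flow cytometric MRD quantification involves multiparameter gating and extensive quality control.

```python
def mrd_level(leukemic_events: int, total_events: int) -> float:
    """Fraction of analyzed events with the leukemia-associated
    immunophenotype. A simplified stand-in for flow cytometric MRD
    quantification, which in practice requires multiparameter gating."""
    if total_events <= 0:
        raise ValueError("total_events must be positive")
    return leukemic_events / total_events


def is_mrd_positive(level: float, threshold: float = 1e-4) -> bool:
    """Apply the 0.01% (10^-4) threshold commonly used to define
    molecular remission; protocols may use other cut-offs and time points."""
    return level >= threshold


# Example: 37 leukemic events among 1,000,000 analyzed events
# = 0.0037%, below 0.01%, ie, MRD-negative at this sensitivity.
level = mrd_level(leukemic_events=37, total_events=1_000_000)
print(f"MRD = {level:.4%}, positive: {is_mrd_positive(level)}")
```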

Induction

Traditionally, the goal of induction has been to achieve morphologic remission (<5% blasts in the bone marrow). However, it is now recognized that achieving a molecular remission, generally defined as below a threshold of 0.01% blasts by MRD assay, substantially improves the chance of long-term cure.[18] Complete remission is achieved in approximately 98% of children and 85% of adults.[20] Generally, induction regimens consist of vincristine, asparaginase (Elspar), a glucocorticoid, and in some cases an anthracycline, for a period of 4 to 6 weeks. While this general framework has been employed for decades, some modifications have been made as new drug formulations have become available. Whether dexamethasone or prednisone is used during induction varies across cooperative groups because each of these glucocorticoids has its pros and cons.[21] Dexamethasone has the advantages of more potent cytotoxicity and superior CNS penetration. However, it also carries an increased risk of infection, avascular necrosis (AVN), and other toxicities. Most cooperative groups currently use prednisone in patients over 10 years of age, because of the significant risk of AVN in this age group, while they use dexamethasone in children younger than 10 years. The asparaginase formulation used in most regimens has shifted from Escherichia coli (or native) asparaginase to a pegylated form, pegaspargase (Oncaspar), which has the advantages of a longer half-life and lower immunogenicity.[22] The lower immunogenicity has dual benefits: a lower frequency of both hypersensitivity reactions and neutralizing antibodies that reduce drug efficacy. An anthracycline, most often daunorubicin, is generally included in induction only for a subset of high-risk patients. A recent meta-analysis has questioned the benefit of anthracyclines altogether, suggesting that antileukemic efficacy is counterbalanced by increased cardiotoxicity and treatment-related deaths.[23]

Asparaginase is less well tolerated in adults; thus, another common regimen in this population is hyper-CVAD, which consists of courses of cyclophosphamide, vincristine, doxorubicin, and dexamethasone alternating with high-dose methotrexate and cytarabine, for a total of eight courses.[24] However, more recent studies have demonstrated that survival in adolescents and young adults treated on pediatric cooperative group trials is superior to that of those treated on adult trials.[25] It is unclear whether these survival differences are attributable to differences in regimen composition and dose intensity, to physician adherence to protocol, or to demographic differences between patients enrolled in adult versus pediatric studies. Nevertheless, several recent trials have begun evaluating pediatric-style regimens in adult patients.

Post-Induction Intensification

It is well established that further intensification of therapy to consolidate remission following induction improves outcomes in ALL.[26,27] This phase is generally termed consolidation or intensification. A wide variety of systemic chemotherapy regimens have been successfully utilized for consolidation—eg, methotrexate (at doses ranging from 20 mg/m2 to 5 g/m2) with or without mercaptopurine[28]; prolonged asparaginase[29,30]; and cyclophosphamide, cytarabine, and thioguanine.[31] Consolidation is usually followed by an interim maintenance phase, and then by a reinduction or delayed intensification (DI) phase that includes many of the same chemotherapy agents used in induction and consolidation. A key Children's Cancer Group (CCG) study demonstrated that augmentation of post-induction therapy improved outcomes for high-risk ALL patients with a slow early response to therapy.[32] The augmented regimen included two interim maintenance and DI phases rather than one; these phases incorporated additional vincristine, asparaginase, and dexamethasone, as well as escalating-dose intravenous methotrexate with asparaginase (the "Capizzi" methotrexate regimen) in place of standard post-induction intensification.[32] A subsequent CCG study demonstrated that high-risk ALL patients with a rapid early response benefited from augmented intensity of post-induction therapy but not from double versus single DI.[33] More recently, a CCG study of standard-risk ALL patients confirmed the benefit of Capizzi methotrexate during interim maintenance[34] but again showed no benefit of double versus single DI.[35] Although a wide variety of agents and dose schedules have demonstrated efficacy, the general principle that significant post-induction intensification improves outcomes appears to hold true across treatment studies and ALL subgroups, with increasingly high-risk subgroups benefiting from correspondingly greater increases in treatment intensity and duration.
