Most human cells express insufficient telomerase to compensate for the loss of telomere repeats that invariably follows DNA replication. As a result, telomeres shorten with age and, in most cells, with accumulated cell divisions. These and other findings have made the role of telomerase and telomere shortening in aging a hot research topic. Of particular interest are the cells of the immune system. Antigen-specific T and B cells need to respond rapidly to antigenic challenges, and multiple responses involving many cell divisions may be required over a lifetime. If the replicative potential of lymphocytes were limited by progressive telomere shortening, immune function could eventually be compromised. Previous studies by Norrback et al (Blood 88:222-229) and Weng et al (Proc Natl Acad Sci U S A 94:10827-10832) showed that certain B cells appear to prevent progressive telomere shortening by expressing high levels of telomerase. The situation with T cells is less clear. While the overall telomere length in lymphocytes clearly decreases with age (see Rufer et al, J Exp Med 190:157-167), the specific subpopulations involved in normal immune responses have not been studied in detail.

In this issue, Plunkett et al (page 700) study telomere length in antigen-specific T cells during acute infection with Epstein-Barr virus (EBV). For these studies they combined tetramer staining, to identify EBV-specific CD8+ T cells, with fluorescence in situ hybridization and flow cytometry (flow FISH), to measure telomere length. Interestingly, no significant telomere shortening was observed in antigen-specific T cells during acute infection and for up to one year thereafter; at later time points, however, substantial shortening did occur. These results suggest that telomerase may postpone, but not prevent, replicative telomere shortening in at least some CD8+ T cells. Further studies are needed to determine whether telomere shortening in T cells has biologic consequences and whether shortening is delayed in all T cells or only in those that express high levels of telomerase.

Eguchi-Ishimae and colleagues (page 737) report that TEL/AML1 fusion transcripts can be detected in lymphoblastoid cell lines after exposure to apoptotic stimuli. Their data support the hypothesis that repair of double-strand breaks induced by apoptotic stimuli may result in chromosomal translocation and fusion of genes such as TEL and AML1. Furthermore, using sensitive nested RT-PCR strategies, they report that a significant proportion of healthy individuals (8.8%) and cord blood samples (1.1%) harbor the TEL/AML1 gene rearrangement. The authors suggest that TEL/AML1 rearrangements occurring as a consequence of sublethal apoptotic stimuli may contribute to the development of leukemia. TEL/AML1 thus joins a growing list of fusion genes that can be detected in healthy individuals, including BCR/ABL, MLL tandem duplications, and IGH-BCL2.

This report and others like it raise a number of interesting questions. Does the presence of an oncogenic fusion gene, detected by RT-PCR, confer an increased risk of developing leukemia? Taken together, the data indicate that RT-PCR–detectable fusion genes confer at most a minimal risk: the cumulative likelihood that an individual harbors at least one RT-PCR–detectable fusion gene is quite high, yet the overall incidence of leukemia is only about 1 in 100 000 per year in the general population. On the one hand, this provides some comfort that we are not all “walking time bombs” and raises questions about the pathophysiologic significance of RT-PCR–detectable fusion transcripts and double-strand breaks. On the other hand, RT-PCR assays may in fact identify individuals at risk for progression to leukemia, albeit a low risk. This latter possibility has several important implications. For example, what obligation do we have to report or follow up on RT-PCR positivity for a known oncogene in healthy individuals? At a minimum, investigators engaged in such analyses should consider designing and implementing studies that would allow assessment of the relative risk of leukemia conferred by RT-PCR–detectable fusion genes.
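The magnitude of this risk can be bounded with a back-of-the-envelope calculation from the figures above. This is an illustrative upper bound, not an estimate from the study; it assumes, very conservatively, that every leukemia arises in a TEL/AML1-positive individual:

$$\text{annual risk per positive individual} \;\le\; \frac{\text{annual leukemia incidence}}{\text{prevalence of positivity}} \;\approx\; \frac{1/100\,000}{0.088} \;\approx\; 1.1 \times 10^{-4}$$

That is roughly 1 in 9000 per year even under this worst-case assumption, consistent with the view that a positive RT-PCR result by itself signals only a very small absolute risk.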

A major problem in treating acute lymphoblastic leukemia (ALL) is the occurrence of significant side effects as a consequence of high-dose chemotherapy. Side effects may also be encountered with otherwise effective biological agents such as interleukin-4 (IL-4), owing to the widespread expression of IL-4 receptors. New forms of therapy are therefore urgently needed. In this issue, Srivannaboon et al (page 752) take advantage of the fact that there are 2 known types of IL-4 receptors: 1 type (IL-4Rα/IL-2Rγ) expressed on ALL cells (and normal lymphoid cells) and a second type (IL-4Rα/IL-13Rα) expressed on endothelial cells and fibroblasts and probably responsible for many of the side effects caused by IL-4 in vivo. By engineering an IL-4 variant that discriminates between the 2 receptor types, the authors have been able to select for IL-4–induced apoptotic effects on ALL cells. While possible effects on normal lymphoid cells remain unknown, it is very encouraging that at least some stimulatory effects on endothelial cells are not seen with this IL-4 variant.

Beyond its own clinical potential, this IL-4 variant highlights the feasibility of selecting for particular functions of cytokines when these are mediated by different receptors. Clearly, the combination of cytokine/cytokine receptor structure-function analyses and determination of what types of receptors and functions are expressed in different primary cell types offers a powerful approach for the development of novel second-generation cytokines with unique biologic properties and clinical usefulness.

Breuer et al (page 792) are the first to measure desferrioxamine-chelatable iron (DFI) in serum, using a fluorescein-desferrioxamine probe whose fluorescence is quenched by iron. This new assay is used in patients with thalassemia major to study the transfer of iron between the 2 iron chelators deferiprone and desferrioxamine. The orally active, highly permeable chelator deferiprone results in an increased level of DFI in serum, and this DFI disappears when intravenous desferrioxamine is given. The authors hypothesize that deferiprone enters cells, binds iron, and mobilizes it into serum; this iron, they suggest, is then transferred to the higher-affinity ligand desferrioxamine, which expedites its excretion. The “shuttle” effect they propose provides an explanation for the earlier findings of Wonke et al (Br J Haematol 103:361-364) that simultaneous administration of deferiprone and desferrioxamine results in an additive or even synergistic increase in urine iron excretion compared with either chelator given alone. The metabolic studies of Grady et al (2000, 42nd Annual Meeting of the American Society of Hematology, Abstract 2594) have also shown that, when the 2 drugs are given simultaneously, urine and fecal iron excretion increases synergistically to 2.4 to 3.4 times that achieved when desferrioxamine is given alone.

Combined therapy should improve compliance among patients who self-administer desferrioxamine infusions if these are needed only on 1 or 2 days per week, rather than the current 5 to 7 days per week. Moreover, daily oral deferiprone therapy, which alone is insufficient in some patients to achieve negative iron balance, may well achieve it with the additional boost from 1 or 2 days of desferrioxamine. Combined therapy may also allow doses of both drugs low enough to avoid side effects. In addition, deferiprone may, by entering cells, remove from vital organs (eg, the heart) iron that is then rapidly excreted with the aid of desferrioxamine. The side effects of both drugs are now reasonably well established. Vigilance, however, will be needed in patients receiving chelator combinations, and long-term clinical trials are required to test the efficacy and safety of combining desferrioxamine with deferiprone, or of any other combination of new iron chelators still to be introduced into clinical practice.

Emilia et al (page 812) describe a potential association between Helicobacter pylori infection and idiopathic thrombocytopenic purpura (ITP). This is intriguing because it may provide both a clue to the etiology of ITP and a simple, safe treatment option. Thirteen of 30 patients with ITP were documented to have H pylori infection; 1 week of treatment eradicated the infection in 12 of the 13 patients, and 6 of the 12 then had an increased platelet count, some to sustained normal values. The implication of this observation is that chronic H pylori infection may be involved in the etiology of ITP in some patients. The appeal is the simplicity and safety of the potential treatment.

The goal of treatment for patients with ITP is to prevent bleeding. No treatment is the appropriate management for patients with moderate thrombocytopenia and negligible risk of bleeding. When thrombocytopenia is severe and symptomatic, current treatment options are limited: glucocorticoids uncommonly result in durable remissions and their side effects can be intolerable, splenectomy has risks and may be effective in only one half to two thirds of patients, and more intensive immunosuppression carries greater risks. Thus, observations such as this are immediately attractive. Should this observation change our practice? Why not? The costs and risks are minimal. But we cannot yet be confident that this observation will be reproducible; even if it is true and reproducible, we cannot yet be certain that H pylori eradication will help the patients with severe ITP who most need help. This observation does, however, provide a sound basis for a systematic study of consecutive patients with ITP to determine whether H pylori infection truly is a common coexisting condition and whether eradication of H pylori alters the clinical course of ITP.
