The WHO guidelines for scaling up antiretroviral therapy in resource-limited settings recommend that anyone with clinical AIDS symptoms should initiate therapy, regardless of CD4 count. The guidelines also recommend that therapy be started when CD4 count falls below 200 cells/mm3 (or when CD4 percentage is less than 15%), whether symptoms are present or not. If CD4 count is not practically obtainable, WHO recommends starting therapy when total lymphocyte count (TLC) falls below 1200/mm3 in the presence of symptomatic HIV disease. TLC below 1200 alone, without symptoms, is not considered grounds for starting treatment.
Despite this endorsement, the usefulness of TLC remains ambiguous. The WHO guidelines note that, “While the total lymphocyte count correlates relatively poorly with the CD4 count, in combination with clinical staging it is a useful marker of prognosis and survival.”
Accordingly, WHO includes TLC among its “basic recommended” lab tests (TLC is calculated from a white blood cell count and differential). Due to cost and technical requirements, CD4 counts, typically performed by flow cytometry in central laboratories, are not currently among WHO’s basic recommended tests for resource-limited settings, although the use of alternative, low-cost cell counting technology is strongly encouraged.
There have been efforts to improve the predictive ability of TLC, for example by combining it with changes in haemoglobin, weight, delayed-type hypersensitivity and symptomatology. A number of studies have sought to validate the diagnostic potential of TLC by retrospectively analysing the medical records of large clinical cohorts of HIV-positive people in developed countries to see how well TLC values correlate with standard CD4 counts in predicting disease progression.
Total lymphocyte count as a surrogate for CD4 cell count
A recently reported study of this type comes from the Chelsea and Westminster HIV cohort in London (Sawleshwarkar). Patients with CD4 counts and TLC taken within the three months prior to development of an AIDS-defining opportunistic infection (ADOI) or the initiation of prophylaxis for an opportunistic infection (OI) were identified, and the last TLC/CD4 values obtained during that period were analysed. Of 5,774 patients in the cohort, 1,097 were included for analysis. The cut-off for TLC in this cohort, the point of maximum sensitivity and minimal error, was established at 1500 cells/mm3, corresponding to the canonical CD4 cut-off of 200 cells/mm3, below which therapy should always be introduced. This TLC cut-off is somewhat higher than the level recommended by WHO for use in resource-limited settings.
The study reported that individuals with a TLC between 1000 and 1500 were 40% more likely to develop an ADOI than those with TLC above 1500 (sensitivity = 68.6%; specificity = 66.0%). This compares with a 34% greater likelihood of an ADOI predicted by a CD4 count between 150 and 200 versus one over 200 (sensitivity = 73.8%; specificity = 75.6%). Furthermore, the risk of an ADOI was three times greater for those with TLC between 500 and 1000 compared to those with TLC above 1000. The authors conclude that TLC would have been only moderately less predictive of disease progression than CD4 count in this cohort.
A prospective study of TLC as a simple surrogate for CD4 count conducted in Mozambique comes to a different conclusion (Liotta). Researchers at a day hospital in Matola-Maputo analysed 651 paired TLC and CD4 counts obtained from asymptomatic patients who presented at the clinic between March 2002 and May 2003. The WHO-prescribed TLC cut-off of 1200 cells/mm3 was tested for its ability to classify patients as either above or below the CD4 count cut-off of 200 cells/mm3. The investigators also evaluated body mass index (BMI), an easy-to-obtain value derived from weight and height, as a correlate of CD4 count. The cut-off for BMI was set at 18 kg/m2, generally regarded as the threshold of malnutrition.
In individual comparisons of TLC with CD4 count and BMI with CD4 count, the classification capability of TLC was 67.7% (sensitivity = 48.9%; specificity = 81.3%); for BMI it was 65.2% (sensitivity = 47.0%; specificity = 78.5%). The authors note that had TLC actually been used to determine treatment eligibility in this patient group, therapy would have been denied to half of those who needed it according to their CD4 counts, and would have been inappropriately started in 20% of those who did not yet have CD4 counts below 200. Using BMI would have produced similar results. The authors conclude that "the use of TLC as a surrogate for the CD4 cell count appears to be unjustified."
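The link between the study's reported sensitivity and specificity and its clinical conclusion can be made concrete with a little arithmetic. The sketch below uses the published sensitivity (48.9%) and specificity (81.3%); the 40% prevalence of CD4 counts below 200 is an assumed figure for illustration only, as the paper's patient-level breakdown is not reproduced here.

```python
# Illustration: translating a screening test's sensitivity and
# specificity into missed and over-treated patients.
# Prevalence (fraction of patients with CD4 < 200) is ASSUMED at 40%.

def classification_outcomes(n, prevalence, sensitivity, specificity):
    """Return (tp, fn, tn, fp) counts for a test applied to n patients."""
    needing_therapy = n * prevalence       # patients with CD4 < 200
    not_needing = n - needing_therapy
    tp = needing_therapy * sensitivity     # correctly flagged by TLC < 1200
    fn = needing_therapy - tp              # missed: therapy would be denied
    tn = not_needing * specificity         # correctly ruled out
    fp = not_needing - tn                  # therapy inappropriately started
    return tp, fn, tn, fp

# Sensitivity and specificity as reported by Liotta et al.
tp, fn, tn, fp = classification_outcomes(651, 0.40, 0.489, 0.813)
print(f"missed (false negatives): {fn / (tp + fn):.1%} of those needing therapy")
print(f"over-treated (false positives): {fp / (tn + fp):.1%} of those not needing it")
```

Whatever prevalence is assumed, the missed fraction is simply 1 − sensitivity (51.1%, i.e. about half) and the over-treated fraction is 1 − specificity (18.7%, i.e. about 20%), which is exactly the authors' point.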
Measuring immune response to treatment
While evidence for the value of TLC as an indicator of when to start antiretroviral therapy or prophylaxis for opportunistic infections remains contradictory, a retrospective study from Rhode Island reports that TLC may be useful for registering immune response after therapy has begun (Mahajan). Medical records were identified for 126 patients with CD4 counts of less than 250 cells/mm3 who were either treatment-naïve or had not been on therapy for the preceding three months, and who had started a new regimen and had remained on it for at least six months. The baseline values of CD4, TLC and viral load were those established at the last clinic visit prior to initiating therapy.
The investigators’ analysis focused on the ability of changes in TLC to predict changes in CD4 count over time and, as such, required multiple determinations in order to establish a trend. Encouragingly, the ability of a positive change in TLC to predict a positive CD4 response at six months — the positive predictive value — was 98% (sensitivity = 94%; specificity = 85%). Similar results held over longer follow-up periods. The authors conclude that “clinicians seeing a positive trend in TLC in patients on HAART can be almost certain that the CD4 count change is also positive.”
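A 98% predictive value can arise from more modest sensitivity and specificity because predictive value also depends on how common a positive CD4 response is among patients on effective therapy. The sketch below applies Bayes' rule using the study's reported sensitivity and specificity; the 88% response rate is an assumed figure chosen for illustration, not a number from the paper.

```python
# Sketch: positive predictive value (PPV) from sensitivity, specificity
# and the prevalence of a positive CD4 response in the cohort.
# The 88% response rate below is an ASSUMPTION for illustration.

def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

print(f"PPV = {ppv(0.94, 0.85, 0.88):.2f}")  # ~0.98 when most patients respond
```

The same arithmetic explains the weaker performance of a declining TLC: when true CD4 drops are rare in the cohort, the negative-direction predictive value falls sharply even at the same sensitivity and specificity.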
While this seems to validate a role for TLC in confirming response to therapy, the picture becomes less clear if TLC starts to decline. A falling TLC during the two years after starting therapy predicted a drop in CD4 count only 43-63% of the time, which reveals the limits of TLC for identifying regimen failure.
One potential confounder may be simple measurement error, which the authors suggest can be addressed with multiple TLC determinations at a single time point. They also speculate that more frequent TLC measurements over time might help reduce the effect of high variability. Pending further study to resolve these problems, the authors suggest that TLC might be useful in resource-limited settings as a non-specific monitoring tool of therapeutic efficacy that would trigger a need for confirmatory CD4 counts when significant drops occurred.
The authors also note the need for additional research to understand how to use TLC (alone and along with haemoglobin or symptom staging) in resource-limited settings and how to account for the observed variability in TLC that has been associated with intercurrent infections, regional and ethnic differences, and differing measurement technologies. With TLC now established in the WHO guidelines, many of these studies may be expected in the coming years. Ultimately, though, most observers agree that a new generation of simple, low-cost, reliable and rapid point-of-care diagnostics for both CD4 count and HIV viral load is urgently needed.
References
Liotta G et al. Is total lymphocyte count a reliable predictor of the CD4 lymphocyte cell count in resource-limited settings? AIDS 18 (7): 1082, 2004.
Sawleshwarkar S et al. A cohort study to review the efficacy of the total lymphocyte count (TLC) as a predictor of AIDS-defining opportunistic infection (ADOI) in HIV-infected patients. Tenth Annual Conference of the British HIV Association, abstract P50, 2004.
Mahajan AP et al. Changes in total lymphocyte count as a surrogate for changes in CD4 count following initiation of HAART: Implications for monitoring in resource-limited settings. JAIDS 36 (1): 567, 2004.