ABSTRACT: IMPORTANCE Hypertension is a major public health problem in sub-Saharan Africa, but the lack of affordable treatment and the poor quality of health care compromise antihypertensive treatment coverage and outcomes. OBJECTIVE To report the effect of a community-based health insurance (CBHI) program on blood pressure in adults with hypertension in rural Nigeria. DESIGN, SETTING, AND PARTICIPANTS We compared changes in outcomes from baseline (2009) between the CBHI program area and a control area in 2011 through consecutive household surveys. Households were selected from a stratified random sample of geographic areas. Among 3023 community-dwelling adults, all nonpregnant adults (aged ≥18 years) with hypertension at baseline were eligible for this study. INTERVENTION Voluntary CBHI covering primary and secondary health care and quality improvement of health care facilities. MAIN OUTCOMES AND MEASURES The difference in change in blood pressure from baseline between the program and the control areas in 2011, which was estimated using difference-in-differences regression analysis. RESULTS Of 1500 eligible households, 1450 (96.7%) participated, including 564 adults with hypertension at baseline (313 in the program area and 251 in the control area). Longitudinal data were available for 413 adults (73.2%) (237 in the program area and 176 in the control area). Baseline blood pressure in respondents with hypertension who had incomplete data did not differ between areas. Insurance coverage in the hypertensive population increased from 0% to 40.1% in the program area (n = 237) and remained less than 1% in the control area (n = 176) from 2009 to 2011. Systolic blood pressure decreased by 10.41 (95% CI, -13.28 to -7.54) mm Hg in the program area, constituting a 5.24 (-9.46 to -1.02)-mm Hg greater reduction compared with the control area (P = .02), where systolic blood pressure decreased by 5.17 (-8.29 to -2.05) mm Hg.
Diastolic blood pressure decreased by 4.27 (95% CI, -5.74 to -2.80) mm Hg in the program area, a 2.16 (-4.27 to -0.05)-mm Hg greater reduction compared with the control area, where diastolic blood pressure decreased by 2.11 (-3.80 to -0.42) mm Hg (P = .04). CONCLUSIONS AND RELEVANCE Increased access to and improved quality of health care through a CBHI program were associated with a significant decrease in blood pressure in a hypertensive population in rural Nigeria. Community-based health insurance programs should be included in strategies to combat cardiovascular disease in sub-Saharan Africa.
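The difference-in-differences estimate described above can be illustrated with a short numeric sketch. The function and group means below are hypothetical (chosen to reproduce the reported effect sizes), not the study's regression model or data:

```python
# Difference-in-differences on group mean systolic blood pressure.
# All numbers are illustrative, not the study data.
def diff_in_diff(pre_program, post_program, pre_control, post_control):
    """Return (program change, control change, DiD estimate)."""
    change_program = post_program - pre_program
    change_control = post_control - pre_control
    return change_program, change_control, change_program - change_control

# Hypothetical mean systolic BP (mm Hg): the program area falls by 10.41,
# the control area by 5.17, giving a 5.24 mm Hg greater reduction.
prog, ctrl, did = diff_in_diff(150.0, 139.59, 148.0, 142.83)
print(round(prog, 2), round(ctrl, 2), round(did, 2))  # -10.41 -5.17 -5.24
```

The published analysis additionally adjusts for covariates within a regression framework; this sketch shows only the core contrast of changes between areas.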
ABSTRACT: To determine the long-term outcomes of treatment and prevalence of genotypic drug resistance in children and adolescents on combination antiretroviral therapy.
A cross-sectional study (September 2009 to October 2010) in which clinical, immunologic and virologic outcomes were assessed at a single-study visit and through patient records in a cohort of HIV-infected children and adolescents. Risk factors for clinical and immunologic responses and virologic outcome were evaluated using logistic regression, and the accuracy of clinical and immunologic criteria in identifying virologic failure was assessed.
Four hundred twenty-four patients were enrolled with a median age of 10.8 years (range: 1.7-18.8) and a median duration on combination antiretroviral therapy of 3.4 years (range: 1.0-8.1). Thirty-three percent were stunted and 17% underweight. Eighty-four percent (95% confidence interval: 79-87) of children >5 years had CD4 ≥350 cells/mm(3), and in 74% (95% confidence interval: 62-84) of younger children CD4% was ≥25. CD4 values and age at combination antiretroviral therapy initiation were independently associated with CD4 outcomes; 124 (29%) had HIV-1 RNA ≥1000 copies/mL, with no significant predictors. Sensitivity of the weight-for-age, height-for-age, and CD4 cell count (<350 cells/mm(3)) criteria remained under 50% (15-42%); the CD4 cell count criterion showed the best specificity, ranging from 91% to 97%. Of 52 samples tested, ≥1 mutation was observed in 91% (nucleoside reverse transcriptase inhibitors) and 95% (non-nucleoside reverse transcriptase inhibitors); 1 to 2 thymidine analogue-associated mutations were detected in 16 (31%) and ≥3 thymidine analogue-associated mutations in 7 (13%).
Nearly 1 in 3 children showed virologic failure, and >10% of the subgroup of children with treatment failure in whom genotyping was performed demonstrated multiple HIV drug resistance mutations. Neither clinical condition nor CD4 cell count was a good indicator of treatment failure.
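The accuracy assessment above comes down to sensitivity and specificity computed from a confusion matrix. A minimal sketch with hypothetical counts (not the study's data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical: a CD4-based criterion detects 40 of 124 true virologic
# failures and falsely flags 12 of 300 virologically suppressed children.
sens, spec = sensitivity_specificity(tp=40, fn=84, tn=288, fp=12)
print(round(sens, 2), round(spec, 2))  # 0.32 0.96
```

This pattern (sensitivity well under 50%, specificity above 90%) mirrors the abstract's finding that clinical and immunologic criteria miss most virologic failures while rarely misclassifying suppressed children.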
ABSTRACT: To study the prevalence of target organ damage (TOD) in hypertensive adults in a general population in rural Nigeria, to assess determinants of TOD, and to evaluate the contribution of TOD screening to determining eligibility for antihypertensive treatment.
All adults diagnosed with hypertension (n = 387) and a random sample (n = 540) out of all nonhypertensive adults, classified during a household survey in 2009, had a blood pressure measurement and were invited for TOD (myocardial infarction, left ventricular hypertrophy, angina pectoris, kidney disease) screening in 2011.
Participation in TOD screening was 51% (n = 196) in respondents with hypertension and 33% (n = 179) in those without hypertension. TOD prevalence in hypertensive and nonhypertensive adults was 32% and 15%, respectively. Hypertension severity was a strong determinant of TOD [grade 1 odds ratio (OR) 2.66, 95% confidence interval (CI) 1.04-6.84; grade 2 OR 3.82, 95% CI 1.41-10.36]. Of 196 hypertensive patients, 151 were untreated, of whom all grade 2 hypertensive patients (n = 71) were eligible for treatment. Screening revealed TOD in 19 of 80 grade 1 hypertensive respondents (24%), thereby also classifying them as eligible for treatment. Hypertensive adults who did not participate in TOD screening had more severe hypertension than those who did, which may have resulted in an underestimation of the true prevalence of TOD among adults with hypertension.
A high TOD prevalence of 32% was observed in hypertensive adults in rural Nigeria. Almost a quarter of respondents with grade 1 hypertension were eligible for antihypertensive treatment based on TOD screening findings. As TOD screening is mostly unavailable in sub-Saharan Africa, we propose antihypertensive treatment for all patients with hypertension.
Journal of Hypertension 11/2013; · 4.02 Impact Factor
ABSTRACT: Physiological effects of aging make the older population more susceptible to adverse drug events and drug-drug interactions. We evaluated the impact of aging and gender on the pharmacokinetics (PK) of atazanavir/ritonavir (ATV/r) 300/100 mg once daily (qd) in 22 well-suppressed HIV-infected patients. This was a 24-h intensive PK study. Subjects were HIV-1-infected adults aged ≥18 years with HIV RNA <50 copies/ml and treated with ATV/r 300/100 mg once daily plus two nucleoside reverse transcriptase inhibitors (NRTIs) for at least 2 weeks. Atazanavir and ritonavir plasma concentrations were measured by validated high-performance liquid chromatography (HPLC). Plasma PK parameters were calculated using noncompartmental methods. Since 50% of the patients were older than 42 years, age 42 was selected as the cut-off point for the older (>42 years) group. Gender, weight, duration of ATV/r therapy, and proportion treated with tenofovir disoproxil fumarate (TDF)-containing regimens did not differ between the two groups. Patients from the aging group had a reduced creatinine clearance (91 versus 76 ml/min). The older group had a higher atazanavir exposure, with median AUC0-24 71.2 vs. 53.1 mg·h/liter, Cmax 8.5 vs. 5.5 mg/liter, and Ctrough 1.17 vs. 0.78 mg/liter, and slower apparent clearance (3.5 vs. 4.8 liter/h). Ten patients (91%) from the older group and 36% from the younger group had ATV Ctrough levels higher than the proposed upper limit for toxicity of 0.85 mg/liter. Females had a lower body weight (BW) (46 versus 63 kg) than the males, but atazanavir concentrations in females were greater. However, in multivariate analysis, older age was the only significant predictor of higher atazanavir concentrations. The parameter estimate for the effect of age on atazanavir AUC, after adjusting for gender and BW, was 2.17 (95% CI 1.01-3.33). That is, for every year increase in age, AUC increases by approximately 2 mg·h/liter.
Age seems to be an important factor influencing atazanavir pharmacokinetics. Patients from the aging group appeared to have higher atazanavir exposure compared to the younger group. Further PK explorations of ATV in the extremely aged population are warranted.
AIDS Research and Human Retroviruses 10/2013; · 2.18 Impact Factor
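The noncompartmental PK parameters reported above (AUC, apparent clearance) are conventionally derived with the linear trapezoidal rule over the dosing interval. The concentration-time profile and the 300 mg dose used for clearance below are hypothetical, for illustration only:

```python
def auc_trapezoid(times, concs):
    """AUC over the sampling interval by the linear trapezoidal rule."""
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                             zip(times[1:], concs[1:])))

# Hypothetical 24-h atazanavir profile: times in h, concentrations in mg/liter.
times = [0, 2, 4, 8, 12, 24]
concs = [1.2, 8.5, 6.0, 3.5, 2.4, 1.2]
auc = auc_trapezoid(times, concs)
print(round(auc, 1))        # -> 76.6 (mg·h/liter)
print(round(300 / auc, 1))  # apparent clearance CL/F = dose/AUC -> 3.9 liter/h
```

Real noncompartmental analyses also report Cmax and Ctrough directly from the observed profile and may use log-trapezoidal segments on the declining phase; this sketch shows the linear rule only.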
ABSTRACT: Background. The immunomodulatory nutritional product NR100157 has been developed for human immunodeficiency virus (HIV)-infected individuals. We hypothesised that targeting the compromised gastrointestinal tract of HIV-infected subjects would result in systemic immunological benefits. Methods. In a multicentre, randomised controlled double-blind trial (ISRCTN81868024 - BITE), 340 HIV-1-positive adults not on antiretroviral therapy, with CD4(+) T-cell counts <800/μl, were given either NR100157 or an isocaloric and isonitrogenous control for 52 weeks. Primary outcome was CD4(+) T-cell count. Secondary outcomes included plasma viral load (pVL), safety, and tolerability. In a pilot study (n=20), levels of CD4(+)CD25(+) and CD8(+)CD38(+) activation were measured. The trial is registered at the Dutch Trial Register (NTR886) and ISRCTN81868024. Results. At 52 weeks, CD4(+) T-cell decline showed a 40 cells/μl difference (p=0·03) in the ITT population in favour of the immunomodulatory NR100157 (control vs active, -68±15 vs -28±16 cells/μl/year). The change in pVL from baseline was similar between groups (p=0·81). In the pilot study, %CD4(+)CD25(+) was lower in the active group (p<0·05) and correlated with changes in CD4(+) T-cell count (r=-0·55, p<0·05). %CD8(+)CD38(+) levels were unaffected. Conclusions. The specific immunonutrition NR100157 significantly reduces CD4(+) decline in HIV-1-infected individuals, and this is associated with decreased levels of CD4(+)CD25(+). This nutritional intervention likely acts locally on gut integrity and gut-associated lymphoid tissue homeostasis, which in turn translates into systemic benefits.
ABSTRACT: BACKGROUND: Prisoners are at high risk of developing tuberculosis (TB), a major cause of morbidity and mortality in this population. Prison facilities encounter many challenges in TB screening procedures and TB control. This review explores screening practices for detection of TB and describes limitations of TB control in prison facilities worldwide. METHODS: A systematic search of online databases (e.g., PubMed and Embase) and conference abstracts was carried out. Research papers describing screening and diagnostic practices among prisoners were included. A total of 52 articles met the inclusion criteria. A meta-analysis of TB prevalence in prison facilities by screening and diagnostic tools was performed. RESULTS: The most common screening tool was the symptom questionnaire (63·5%), mostly assessing the presence of cough. Microscopy of sputum with Ziehl-Neelsen staining and solid culture were the most frequently combined diagnostic methods (21·2%). Chest X-ray and tuberculin skin tests were used by 73·1% and 50%, respectively, as a screening and/or diagnostic tool. Median TB prevalence among prisoners across all included studies was 1,913 cases of TB per 100,000 prisoners (interquartile range [IQR]: 332-3,517). The overall annual median TB incidence was 7·0 cases per 1,000 person-years (IQR: 2·7-30·0). Major limitations for successful TB control were inaccuracy of diagnostic algorithms and the lack of adequate laboratory facilities, reported by 61·5% of studies. The most frequent recommendation for improving TB control and case detection was to increase screening frequency (73·1%). DISCUSSION: TB screening algorithms differ by income area and should be adapted to local contexts. To control TB, prison facilities must improve laboratory capacity and make frequent use of effective screening and diagnostic tools. Sustained political will and funding are critical to achieve this.
PLoS ONE 01/2013; 8(1):e53644. · 3.73 Impact Factor
ABSTRACT: Objectives To investigate whether an unrecognised diagnosis of tuberculosis (TB) at the start of antiretroviral therapy (ART) influences subsequent CD4+ T cell (CD4) count recovery in an urban HIV clinic in Uganda. Methods In a retrospective cohort study, a multivariable polynomial mixed effects model was used to estimate CD4 recovery in the first 96 weeks of ART in two groups of patients: prevalent TB (started ART while on TB treatment) and unrecognised TB (developed TB within 6 months after starting ART). Results Included were 511 patients with a median baseline CD4 count of 57 cells/mm(3) (interquartile range: 22-130), of whom 368 (72%) had prevalent TB and 143 (28%) had unrecognised TB. Compared with prevalent TB, unrecognised TB was associated with lower CD4 count recovery at 96 weeks: -22.3 cells/mm(3) (95% confidence interval -43.2 to -1.5, P = 0.036). These estimates were adjusted for gender, age, baseline CD4 count and the use of a zidovudine-based regimen. Conclusions Unrecognised TB at the time of ART initiation resulted in impaired CD4 recovery compared with TB treated before ART initiation. More vigilant screening with more sensitive and rapid TB diagnostics prior to ART initiation is needed to decrease the risk of ART-associated TB and sub-optimal immune reconstitution.
Tropical Medicine & International Health 11/2012; · 2.94 Impact Factor
ABSTRACT: The World Health Organization recommends that treatment of tuberculosis (TB) in HIV-infected patients should be integrated with HIV care. In December 2008, a separate outdoor-integrated TB/HIV clinic was instituted for attendees of a large urban HIV clinic in Uganda. We sought to evaluate associated TB and HIV treatment outcomes.
Routinely collected clinical, pharmacy, and laboratory data were merged with TB clinic data for patients initiating TB treatment in 2009 and with TB register data for patients in 2007. TB treatment outcomes and (timing of) antiretroviral therapy (ART) initiation in ART-naive patients [overall and stratified by CD4+ T cell (CD4) count] in 2007 and 2009 were compared. Nosocomial transmission rates could not be assessed.
Three hundred forty-six patients were initiated on TB treatment in 2007 and 366 in 2009. Median CD4 counts at TB diagnosis did not differ. TB treatment cure or completion increased from 62% to 68%, death or default decreased from 33% to 25% (P < 0.001). Fewer ART-naive TB patients were initiated on ART in 2009 versus 2007 (57% and 66%, P = 0.031), but this decrease was only in patients with CD4 counts >250 cells per cubic millimeter (19% vs. 48%, P = 0.003). More patients were started on ART during TB treatment (94% vs. 78%, P < 0.001). Moreover, the majority were now initiated during intensive phase (60% vs. 23%, P < 0.001).
Integration of TB and HIV care has led to improved TB treatment outcomes and earlier, prioritized ART initiation. This supports rollout of a fully integrated TB/HIV service delivery model throughout high-prevalence TB and HIV settings.
ABSTRACT: The objective of this study was to assess the benefit of temporary combination antiretroviral therapy (cART) during primary HIV infection (PHI).
Adult patients with laboratory evidence of PHI were recruited in 13 HIV treatment centers in the Netherlands and randomly assigned to receive no treatment or 24 or 60 wk of cART (allocation in a 1∶1∶1 ratio); if therapy was clinically indicated, participants were randomized over the two treatment arms (allocation in a 1∶1 ratio). Primary end points were (1) viral set point, defined as the plasma viral load 36 wk after randomization in the no treatment arm and 36 wk after treatment interruption in the treatment arms, and (2) the total time that patients were off therapy, defined as the time between randomization and start of cART in the no treatment arm, and the time between treatment interruption and restart of cART in the treatment arms. cART was (re)started in case of confirmed CD4 cell count < 350 cells/mm(3) or symptomatic HIV disease. In total, 173 participants were randomized. The modified intention-to-treat analysis comprised 168 patients: 115 were randomized over the three study arms, and 53 randomized over the two treatment arms. Of the 115 patients randomized over the three study arms, mean viral set point was 4.8 (standard deviation 0.6) log(10) copies/ml in the no treatment arm, and 4.0 (1.0) and 4.3 (0.9) log(10) copies/ml in the 24- and 60-wk treatment arms (between groups: p < 0.001). The median total time off therapy in the no treatment arm was 0.7 (95% CI 0.0-1.8) y compared to 3.0 (1.9-4.2) and 1.8 (0.5-3.0) y in the 24- and 60-wk treatment arms (log rank test, p < 0.001). In the adjusted Cox analysis, both 24 wk (hazard ratio 0.42 [95% CI 0.25-0.73]) and 60 wk of early treatment (hazard ratio 0.55 [0.32-0.95]) were associated with time to (re)start of cART.
In this trial, temporary cART during PHI was found to transiently lower the viral set point and defer the restart of cART during chronic HIV infection.
PLoS Medicine 03/2012; 9(3):e1001196. · 15.25 Impact Factor
ABSTRACT: High early mortality after antiretroviral therapy (ART) initiation in resource-limited settings is associated with low baseline CD4 cell counts and a high burden of opportunistic infections. Our large urban HIV clinic in Uganda has made concerted efforts to initiate ART at higher CD4 cell counts and to improve diagnosis and care of patients coinfected with tuberculosis (TB). We sought to determine associated treatment outcomes.
Routinely collected data for all patients who initiated ART from 2005 to 2009 were analysed. Median baseline CD4 cell counts by year of ART initiation were compared using the Cuzick test for trend. Mortality and TB incidence rates in the first year of ART were computed. Hazard ratios (HRs) were calculated using multivariable Cox proportional hazards models.
First-line ART was initiated in 7659 patients; 64% were women, and the mean age was 37 years (standard deviation 9 years). Median baseline CD4 counts increased from 2005 to 2009 [82 cells/μL (interquartile range (IQR) 24, 153) to 148 cells/μL (IQR 61, 197), respectively; P<0.001]. The mortality rate fell from 6.5/100 person-years at risk (PYAR) [95% confidence interval (CI) 5.5-7.6 PYAR] to 3.6/100 PYAR (95% CI 2.2-5.8 PYAR). TB incidence rates increased from 8.2/100 PYAR (95% CI 7.1-9.5 PYAR) to 15.6/100 PYAR (95% CI 12.4-19.7 PYAR). A later year of ART initiation was independently associated with decreased mortality (HR 0.91; 95% CI 0.83-1.00; P=0.04).
Baseline CD4 cell counts have increased over time and are associated with decreased mortality. Additional reductions in mortality might be a result of a better standard of care and increased TB case finding. Further efforts to initiate ART earlier should be prioritized even in a setting of capped or reduced funding for ART programmes.
HIV Medicine 02/2012; 13(6):337-44. · 3.16 Impact Factor
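The mortality and TB incidence figures in the study above are crude rates per 100 person-years at risk (PYAR), with confidence intervals conventionally computed on the log scale. The event and follow-up counts below are invented for illustration, not the study's data:

```python
import math

def rate_per_100_pyar(events, person_years):
    """Crude event rate per 100 person-years with an approximate 95% CI
    (normal approximation on the log of the rate)."""
    rate = events / person_years
    se_log = 1 / math.sqrt(events)  # SE of log(rate) for Poisson counts
    lower = rate * math.exp(-1.96 * se_log)
    upper = rate * math.exp(1.96 * se_log)
    return 100 * rate, 100 * lower, 100 * upper

# Hypothetical: 90 deaths over 2,500 person-years of follow-up.
r, lo, hi = rate_per_100_pyar(90, 2500)
print(round(r, 1), round(lo, 1), round(hi, 1))  # 3.6 2.9 4.4
```

Exact Poisson limits are preferred when event counts are small; the normal approximation shown here is adequate for the large counts typical of cohort analyses.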
ABSTRACT: Introduction: Epstein-Barr virus (EBV) viraemia is associated with nasopharyngeal carcinoma and lymphoproliferative diseases. In HIV-1 infection, persistent EBV viraemia is a common phenomenon. The underlying mechanism of these high EBV DNA loads has not been clarified. We studied EBV viraemia during primary HIV-1 infection (PHI) to explore the mechanism of EBV viraemia in HIV-1 infection. Methods: Patients with PHI participating in the Primo-SHM study, a clinical trial with three study arms (no treatment, 24 weeks of combination antiretroviral therapy (cART) and 60 weeks of cART), were sampled longitudinally during PHI and 24 and 48 weeks thereafter. EBV DNA was assayed by PCR on stored samples of lysed whole blood. Results: 39 patients were tested, in 22 of whom EBV DNA was detected at one or more time points. All patients tested positive for anti-VCA and anti-EBNA antibodies; most patients who had EBV viraemia did not receive cART or interrupted cART. The prevalence of EBV viraemia at baseline was 29%, 18% and 33% for the untreated, 24-week cART and 60-week cART groups, respectively. At week 48, these percentages were 38%, 64% and 17%, respectively (p < 0.05). Individual concentrations of EBV DNA for the three groups are shown in Figure 1. Conclusion: Intermittent EBV viraemia is highly prevalent in patients with PHI. Assuming that patients with very early HIV-1 infection are still immunocompetent, this indicates that EBV viraemia is not caused by immunodeficiency. Antiretroviral therapy started during PHI, but not later during chronic HIV infection, might reduce the prevalence of EBV viraemia in HIV-1 infection.
Journal of the International AIDS Society 01/2012; 15(6):18406. · 3.94 Impact Factor
ABSTRACT: We assessed pharmacokinetic (PK) parameters of reduced-dose lopinavir/ritonavir (LPV/r) and compared generic and branded tablets. Twenty HIV-infected patients using protease inhibitors, with HIV RNA <50 copies per milliliter, were randomized to generic or branded LPV/r 200/50 mg twice daily (BID). At week 2, PK sampling was performed. Patients crossed over to the other arm until week 12, with another PK sampling at week 4. Subtherapeutic lopinavir concentrations were observed in 10 of 40 samples. PK parameters were comparable between branded and generic tablets. All patients remained virologically suppressed at week 12. In conclusion, LPV/r 200/50 mg BID does not lead to adequate lopinavir plasma concentrations. Generic and branded LPV/r have comparable PK parameters.
ABSTRACT: Cardiovascular disease (CVD) is the leading cause of adult mortality in low-income countries but data on the prevalence of cardiovascular risk factors such as hypertension are scarce, especially in sub-Saharan Africa (SSA). This study aims to assess the prevalence of hypertension and determinants of blood pressure in four SSA populations in rural Nigeria and Kenya, and urban Namibia and Tanzania.
We performed four cross-sectional household surveys in Kwara State, Nigeria; Nandi district, Kenya; Dar es Salaam, Tanzania; and Greater Windhoek, Namibia, between 2009 and 2011. Representative population-based samples were drawn in Nigeria and Namibia. The Kenya and Tanzania study populations consisted of specific target groups. Within a final sample size of 5,500 households, 9,857 non-pregnant adults were eligible for analysis on hypertension. Of those, 7,568 respondents ≥18 years were included. The primary outcome measure was the prevalence of hypertension in each of the populations under study. The age-standardized prevalence of hypertension was 19.3% (95% CI: 17.3-21.3) in rural Nigeria, 21.4% (19.8-23.0) in rural Kenya, 23.7% (21.3-26.2) in urban Tanzania, and 38.0% (35.9-40.1) in urban Namibia. In individuals with hypertension, the proportion of grade 2 (≥160/100 mmHg) or grade 3 hypertension (≥180/110 mmHg) ranged from 29.2% (Namibia) to 43.3% (Nigeria). Control of hypertension ranged from 2.6% in Kenya to 17.8% in Namibia. Obesity prevalence (BMI ≥30) ranged from 6.1% (Nigeria) to 17.4% (Tanzania) and, together with age and gender, BMI independently predicted blood pressure level in all study populations. Diabetes prevalence ranged from 2.1% (Namibia) to 3.7% (Tanzania).
Hypertension was the most frequently observed risk factor for CVD in both urban and rural communities in SSA and will contribute to the growing burden of CVD in SSA. Low levels of control of hypertension are alarming. Strengthening of health care systems in SSA to contain the emerging epidemic of CVD is urgently needed.
PLoS ONE 01/2012; 7(3):e32638. · 3.73 Impact Factor
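The age-standardized prevalences in the study above come from direct standardization: age-specific prevalences weighted by a standard population's age distribution. The strata, prevalences, and weights below are invented for illustration (real analyses typically weight by the WHO world standard population):

```python
def age_standardized_prevalence(age_prevalence, standard_weights):
    """Direct standardization: sum of age-specific prevalences weighted by
    the standard population's age distribution (weights sum to 1)."""
    return sum(p * standard_weights[age] for age, p in age_prevalence.items())

# Hypothetical age-specific hypertension prevalences and standard weights.
prev = {"18-34": 0.08, "35-54": 0.25, "55+": 0.45}
weights = {"18-34": 0.50, "35-54": 0.35, "55+": 0.15}
print(round(age_standardized_prevalence(prev, weights), 3))  # 0.195
```

Standardization matters here because the four study populations have different age structures; weighting to a common standard makes the 19.3%-38.0% prevalence figures comparable across sites.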
ABSTRACT: Incidence and severity of herpes zoster (HZ) and postherpetic neuralgia increase with age, associated with an age-related decrease in immunity to varicella-zoster virus (VZV). One dose of zoster vaccine (ZV) has demonstrated substantial protection against HZ; this study examined the impact of a second dose of ZV.
This randomized, double-blind, multicenter study of 210 subjects ≥60 years old compared immunity and safety profiles after one and two doses of ZV, separated by 6 weeks, versus placebo. Immunogenicity was evaluated using a VZV interferon-gamma (IFN-γ) enzyme-linked immunospot (ELISPOT) assay and a VZV glycoprotein enzyme-linked immunosorbent antibody (gpELISA) assay. Adverse experiences (AEs) were recorded on a standardized Vaccination Report Card.
No serious vaccine-related AEs occurred. VZV IFN-γ ELISPOT geometric mean count (GMC) of spot-forming cells per 10(6) peripheral blood mononuclear cells increased in the ZV group from 16.9 prevaccination to 49.5 and 32.8 at 2 and 6 weeks postdose 1, respectively. Two weeks, 6 weeks and 6 months postdose 2, GMC was 44.3, 42.9, and 36.5, respectively. GMC in the placebo group did not change during the study. The peak ELISPOT response occurred ∼2 weeks after each ZV dose. The gpELISA geometric mean titers (GMTs) in the ZV group were higher than in the placebo group at 6 weeks after each dose. Correlation between the IFN-γ ELISPOT and gpELISA assays was poor.
ZV was generally well-tolerated and immunogenic in adults ≥60 years old. A second dose of ZV was generally safe, but did not boost VZV-specific immunity beyond levels achieved postdose 1.
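The geometric mean counts and titers (GMC/GMT) reported above are computed on the log scale and exponentiated back, which tames the right-skew typical of immunoassay data. The ELISPOT counts below are hypothetical:

```python
import math

def geometric_mean(values):
    """Exponentiate the arithmetic mean of the log-transformed values."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical ELISPOT spot-forming cell counts per 10^6 PBMCs.
counts = [20, 35, 50, 60, 90]
print(round(geometric_mean(counts), 1))  # 45.2
```

Note the geometric mean (45.2) sits below the arithmetic mean of these counts (51.0), as it always does for skewed positive data; this is why vaccine immunogenicity studies report GMC/GMT rather than plain averages.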
ABSTRACT: We longitudinally evaluated HIV-specific T-cell immunity after discontinuation of highly active antiretroviral therapy (HAART). After treatment interruption (TI), some individuals could maintain a low plasma viral load (<15,000 copies/mL), whereas others could not (>50,000 copies/mL). Before HAART was initiated, plasma viral load was similar. After TI, the numbers of CD8(+) T cells increased more in individuals without viral control, whereas individuals maintaining a low viral load showed a more pronounced increase in HIV-specific CD8(+) T-cell numbers. No differences were seen in the number or percentage of cytokine-producing HIV-1-specific CD4(+) T cells, or in proliferative capacity of T cells. Four weeks after TI, the magnitude of the total HIV-1-specific CD8(+) T-cell response (IFN-γ(+) and/or IL-2(+) and/or CD107a(+)) was significantly higher in individuals maintaining viral control. Degranulation contributed more to the overall CD8(+) T-cell response than cytokine production. Whether increased T-cell functionality is a cause or consequence of low viral load remains to be elucidated.
ABSTRACT: Whether temporary antiretroviral treatment during primary HIV infection (PHI) lowers the viral set point or affects the subsequent CD4 count decline remains unclear. The objectives of this study were to analyze the clinical, viral, and immunological effects of temporary early HAART during PHI. This is a cohort study of patients with laboratory evidence of PHI. Independent predictors of early HAART and the viral set point were analyzed using multiple regression analysis. Plasma HIV-1 RNA (pVL) and CD4 trajectories were analyzed using linear mixed models. A total of 332 patients were included in the analysis. Sixty-four patients started HAART within 180 days of seroconversion. A higher baseline pVL was independently predictive of the start of early HAART (OR: 2.69 per log10 pVL, p = 0.001). Thirty-two patients who interrupted early HAART were compared with 250 patients who remained untreated for more than 180 days after seroconversion. Temporary early HAART was not significantly associated with a longer AIDS-free survival but did result in an initial, but transient, lowering of the viral set point. The viral set point was initially 0.6 log copies/ml lower after interruption of early HAART (p < 0.001) and remained lower during 83 weeks of follow-up. No significant difference in the slopes of CD4 decline was detected between the groups. Temporary HAART in PHI is started more frequently in patients with a higher pVL and can transiently lower the viral set point compared to never-treated patients.
AIDS Research and Human Retroviruses 04/2010; 26(4):379-87. · 2.18 Impact Factor