ABSTRACT: Background: We estimated the impact of loss to follow-up (LTFU) on the mortality rate amongst HIV-1 infected patients in Curaçao. Methods: A total of 214 therapy-naïve HIV-1 infected patients aged 15 years or older upon entry into HIV care between January 2005 and July 2009 were included. Persons who discontinued follow-up for more than 365 days were defined as LTFU and traced with the aim of registering their vital status. If no personal contact could be made, data were matched with the Curaçao National Death Registry. Mortality rates were estimated before and after starting combined antiretroviral therapy (cART). We used log-rank tests to compare survival rates amongst patients LTFU and patients who experienced continuous follow-up. Results: Pre-cART mortality in patients LTFU was similar to pre-cART mortality in those with continuous follow-up (p=0.79). All pre-cART deaths occurred within 6 months after entry. Late diagnosis was predictive of a shorter time to death after entry. After adjusting for those who were LTFU, the mortality rate after starting cART increased from 4.3 to 5.5 per 100 person-years of observation (p=0.06). Mortality after starting cART was highest in the first 2 months, especially for those with late-stage disease. Mortality rates were lower in patients with continuous follow-up than in patients LTFU (p<0.001). Conclusion: Mortality rates in HIV-1 infected patients who have started cART in Curaçao are underestimated as a result of inefficient patient administration combined with people starting cART at a very late disease stage. Monitoring HIV treatment could help reduce the risk of LTFU and may improve the effect of treatment.
AIDS Research and Human Retroviruses 08/2013 (impact factor 2.18).
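The LTFU adjustment described in the abstract above amounts to recomputing a person-time mortality rate after tracing: newly ascertained deaths enter the numerator, while follow-up time is censored at the last known contact. A minimal sketch; the death and person-year figures below are illustrative, chosen only to reproduce the reported 4.3 → 5.5 per 100 PYO shift, not the cohort's actual data:

```python
def rate_per_100_py(deaths: int, person_years: float) -> float:
    """Mortality rate per 100 person-years of observation (PYO)."""
    return 100.0 * deaths / person_years

# Naive analysis: only deaths known to the clinic, with all nominal
# follow-up time counted in the denominator.
naive = rate_per_100_py(deaths=30, person_years=700.0)

# After tracing LTFU patients: extra deaths are found, and follow-up
# time is censored at the last contact rather than assumed to continue.
adjusted = rate_per_100_py(deaths=30 + 6, person_years=650.0)

print(f"naive: {naive:.1f}, adjusted: {adjusted:.1f} per 100 PYO")
```

The adjustment raises the rate through both channels at once: a larger numerator and a smaller denominator.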
ABSTRACT: The advent of effective combination antiretroviral therapy (ART) in 1996 resulted in fewer patients experiencing clinical events, so that some prognostic analyses of individual cohort studies of human immunodeficiency virus-infected individuals had low statistical power. Because of this, the Antiretroviral Therapy Cohort Collaboration (ART-CC) of HIV cohort studies in Europe and North America was established in 2000, with the aim of studying the prognosis for clinical events in acquired immune deficiency syndrome (AIDS) and the mortality of adult patients treated for HIV-1 infection. In 2002, the ART-CC collected data on more than 12,000 patients in 13 cohorts who had begun combination ART between 1995 and 2001. Subsequent updates took place in 2004, 2006, 2008, and 2010. The ART-CC database now includes data on more than 70,000 patients participating in 19 cohorts who began treatment before the end of 2009. Data are collected on patient demographics (e.g. sex, age, assumed transmission group, race/ethnicity, geographical origin), HIV biomarkers (e.g. CD4 cell count, plasma viral load of HIV-1), ART regimen, dates and types of AIDS events, and dates and causes of death. In recent years, additional data have been collected whenever available on co-infections such as hepatitis C; risk factors such as smoking, alcohol and drug use; non-HIV biomarkers such as haemoglobin and liver enzymes; and adherence to ART. The data remain the property of the contributing cohorts, whose representatives manage the ART-CC via the steering committee of the Collaboration. External collaboration is welcomed. Details of contacts are given on the ART-CC website (www.art-cohort-collaboration.org).
International Journal of Epidemiology 04/2013 (impact factor 6.98).
ABSTRACT: Retention in care is one of the major challenges to scaling up and maximizing the effectiveness of combination antiretroviral therapy (cART). High attrition rates, varying from 6% to 23%, have been reported in the Caribbean region. We studied the incidence of and risk factors for intermittent care in a cohort of 214 therapy-naïve HIV-1-infected patients aged 15 years or older who entered HIV care in Curaçao between January 2005 and July 2009. Intermittent care was defined as at least one period of 365 days or longer in which there was no HIV care contact in Curaçao. Cox regression models were used to identify characteristics associated with time to intermittent care. In all, 203 (95%) patients could be classified as having intermittent or continuous care. The incidence of intermittent care before starting cART was 25.4 per 100 person-years of observation (PYO), whilst it was 6.1 per 100 PYO after starting cART. Being born outside Curaçao was associated with intermittent care both before and after starting cART. Time from diagnosis to entry into care was an independent predictor of intermittent care before starting cART, and younger age was independently associated with intermittent care after starting cART. Half of the patients who intermitted care later returned to care. Upon returning to care, median CD4 count was 264 cells/mm(3) (IQR, 189-401) for those who intermitted care before starting cART, and 146 cells/mm(3) (IQR, 73-436) for those who intermitted care after starting cART. In conclusion, the incidence of intermittent care is high in Curaçao, especially before starting cART, and intermitting care before starting cART is an independent predictor of starting cART late.
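The 365-day-gap definition of intermittent care used above is straightforward to operationalize from per-patient contact dates. A minimal sketch; the visit dates are invented, and whether the interval from the last recorded contact to the end of the study window also counts as a gap is an assumption made here:

```python
from datetime import date

def has_intermittent_care(contact_dates, end_of_study, gap_days=365):
    """Flag a patient as having intermittent care if any gap between
    consecutive HIV-care contacts (or, by assumption, between the last
    contact and the end of the study window) is gap_days or longer."""
    dates = sorted(contact_dates)
    gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
    gaps.append((end_of_study - dates[-1]).days)  # trailing gap
    return any(g >= gap_days for g in gaps)

# Gap between 2006-07-05 and 2008-03-01 exceeds 365 days:
visits = [date(2006, 1, 10), date(2006, 7, 5), date(2008, 3, 1)]
print(has_intermittent_care(visits, end_of_study=date(2009, 7, 1)))
```

A patient with visits at least annually through the end of the window would be classified as continuous care.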
ABSTRACT: Background: In HIV-negative patients, radiotherapy (RT) decreases CD4 T-cell counts. We studied the effects of RT in HIV-1 positive patients. Methods: HIV-1 positive patients with a subsequent diagnosis of a solid tumor were selected from the Dutch national observational HIV cohort (ATHENA). Patients were grouped according to whether or not they had received RT. The primary endpoint of the study was the time from baseline to reaching CD4 cell counts higher than those at baseline. Kaplan-Meier estimates of the percentage of patients reaching the endpoint were calculated. Results: A total of 90 patients were included, of whom 36 received RT and 54 did not. Median duration of RT was 46 (IQR 30-63) days. The median first CD4 cell count after stopping RT was 150 (IQR 30-270) ×10(6)/l lower than at baseline. In 13 of the 36 patients receiving RT, CD4 cell counts recovered to baseline, after a median of 469 (IQR 345-595) days. In 35 of the 54 patients without RT, CD4 cell counts recovered to baseline or higher, after a median of 112 (IQR 42-182) days. After 3 years, CD4 cell counts had recovered to baseline or higher in 39% of patients who had received RT, compared with 71% of patients without RT (p<0.0001). In a Cox regression adjusted for potential confounders, RT was associated with a longer time to return to baseline (HR 0.29, 95% CI 0.13-0.63), and cART use with a shorter time (HR 2.46, 95% CI 1.11-5.48). Conclusion: RT resulted in a significant and prolonged decrease in CD4 cell counts.
ABSTRACT: The emergence of CXCR4-using HIV variants (X4-HIV) is associated with accelerated disease progression in the absence of antiretroviral therapy. However, the effect of X4-HIV variants on the treatment response remains unclear. Here we determined whether the presence of X4-HIV variants influenced the time to undetectable viral load and CD4+ T cell reconstitution after initiation of cART in 732 patients. The presence of X4-HIV variants was determined by MT-2 assay prior to cART initiation, and viral load and CD4+ T cell counts were analyzed every 3 to 6 months during a three-year follow-up period. Kaplan-Meier and Cox proportional hazards analyses were performed to compare time to viral suppression, and the absolute CD4+ T cell counts and increases in CD4+ T cell counts during follow-up were compared for patients with and without X4-HIV at the start of cART. Patients harboring X4-HIV variants at baseline showed a delay in the time to achieve viral suppression below the viral load detection limit. This delay in viral suppression was independently associated with high viral load and the presence of X4-HIV variants. Furthermore, the absolute CD4+ T cell counts were significantly lower in patients harboring X4-HIV variants at all time points during follow-up. However, no differences were observed in the increase in absolute CD4+ T cell numbers after treatment initiation, indicating that the reconstitution of CD4+ T cells is independent of the presence of X4-HIV variants. The emergence of X4-HIV has been associated with an accelerated CD4+ T cell decline during the natural course of infection; therefore, patients who develop X4-HIV variants may benefit from earlier treatment initiation in order to obtain faster reconstitution of the CD4+ T cell population to normal levels.
PLoS ONE 01/2013; 8(10):e76255 (impact factor 3.73).
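The Kaplan-Meier comparison of time to viral suppression used in the study above can be sketched with a bare product-limit estimator. The follow-up times below are illustrative, with `events=0` marking patients censored before reaching a viral load below the detection limit:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate. `times` are follow-up times
    (e.g. months to suppression or censoring); `events` is 1 if the
    event was observed, 0 if censored. Returns (time, S(t)) steps."""
    s = 1.0
    curve = []
    for t in sorted(set(times)):
        # d: events at time t; n: subjects still at risk just before t
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)
        if d:
            s *= 1.0 - d / n
            curve.append((t, s))
    return curve

times  = [3, 4, 4, 6, 8, 8, 10, 12]   # months, illustrative
events = [1, 1, 0, 1, 1, 1, 0, 1]
print(kaplan_meier(times, events))
```

Comparing two such curves (X4-HIV present vs absent) with a log-rank test is the standard next step; in practice a library such as lifelines would replace this hand-rolled version.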
ABSTRACT: To document progress in HIV treatment in the Netherlands since 1996 by reviewing changing patterns of cART use and relating these to trends in patients' short-term clinical outcomes between 1996 and 2010.
Data from 10,278 patients in the Dutch ATHENA national observational cohort over 1996-2010 were analysed. The annual number of patients starting each type of regimen was quantified. Trends in the following outcomes were described: i) recovery of 150 CD4 cells/mm(3) within 12 months of starting cART; ii) achieving viral load (VL) suppression ≤1,000 copies/ml within 12 months of starting cART; iii) switching from a first-line to a second-line regimen within three years of starting treatment; and iv) all-cause mortality rate per 100 person-years within three years of starting treatment.
Between 1996 and 2010, first-line regimens changed from lamivudine/zidovudine-based or lamivudine/stavudine-based regimens with unboosted PIs to tenofovir combined with either emtricitabine or lamivudine, plus an NNRTI. Mortality rates did not change significantly over time. VL suppression and CD4 recovery improved over time, and the incidence of switching due to virological failure and toxicity more than halved between 1996 and 2010. These effects appear to be related to the use of new regimens rather than to improvements in clinical care.
The use of first-line cART in the Netherlands closely follows changes in guidelines, to the benefit of patients. While there was no significant improvement in mortality, newer drugs with better tolerability and simpler dosing resulted in improved immunological and virological recovery and reduced incidences of switching due to toxicity and virological failure.
PLoS ONE 01/2013; 8(9):e76071 (impact factor 3.73).
ABSTRACT: Background and objective: HIV-associated Pneumocystis jirovecii pneumonia (PJP) remains one of the commonest opportunistic infections in Western countries. Although it has been suggested that racial differences in PJP incidence exist, early studies reported conflicting results. This study aims to investigate differences in PJP incidence in a developed country among patients originating from sub-Saharan Africa compared with other regions of origin. Design and methods: A retrospective observational cohort study was performed among 13,844 HIV-infected patients from the Dutch ATHENA cohort. The main outcome measure was the occurrence of PJP. Results: A total of 1,055 PJP infections were diagnosed. After adjustment for confounders, patients originating from sub-Saharan Africa had a significantly lower risk of having PJP at the time of HIV diagnosis than patients of Western origin (Western Europe, Australia and New Zealand; adjusted odds ratio (aOR) 0.21 (95% CI 0.15-0.29)). Other factors associated with higher PJP risk were increasing age (aOR 1.01 per year (95% CI 1.00-1.02)), a low CD4 count at HIV diagnosis (CD4 <50 versus >350 cells/mm(3): aOR 123.3 (95% CI 77.8-195.5)) and a high plasma HIV-RNA (>100,000 copies/ml) at HIV diagnosis (aOR 1.41 (95% CI 1.19-1.66)). Moreover, a clearly lower risk of acquiring PJP later during follow-up was observed among sub-Saharan African versus Western patients (aHR 0.60 (95% CI 0.39-0.90)). Conclusion: Among HIV-infected patients living in the Netherlands, the occurrence of PJP is substantially lower in patients originating from sub-Saharan Africa than in Western patients. Differences in genetic susceptibility may partially explain the lower PJP incidence in these patients.
AIDS (London, England) 12/2012 (impact factor 4.91).
ABSTRACT: Background: We assessed whether quadruple or triple-class therapy for the initial treatment of HIV-1 infection provides a virological benefit over standard triple therapy in patients with very high plasma viraemia. Design: National observational HIV cohort in the Netherlands. Methods: Inclusion criteria were age ≥18 years, treatment-naïve, plasma viral load (pVL) ≥500,000 copies/ml and initiation of quadruple or triple therapy between 2001 and 2011. Time to viral suppression, defined as pVL <50 c/ml, was compared between the two groups using Kaplan-Meier plots and multivariate Cox regression analysis. Results: 675 patients were included: 125 (19%) initiated quadruple and 550 (81%) triple therapy. Median pVL was 5.9 (IQR 5.8-6.1) log(10) c/ml in both groups (P=0.49). 22 (18%) patients on quadruple and 63 (12%) on triple therapy interrupted the treatment regimen because of drug-related toxicity (P=0.06). Median time to viral suppression was 5.8 (IQR 4.6-7.9) and 6.0 (4.0-9.4) months in the patients on quadruple and triple therapy, respectively (log rank, P=0.42). In the adjusted Cox analysis, quadruple therapy was not associated with time to viral suppression (HR 1.07 (95% CI 0.86-1.33), P=0.53). Similar results were seen when comparing triple- versus dual-class therapy (n=72 vs. n=601, respectively). Conclusions: Initial quadruple or triple-class therapy was as effective as standard triple therapy in the suppression of HIV-1 in treatment-naïve patients with very high viraemia and did not result in a faster pVL decline, but did expose patients to additional toxicity.
ABSTRACT: Changes in risk behaviour among men who have sex with men (MSM) in the Netherlands were estimated by fitting a mathematical model to annual HIV and AIDS diagnoses in the period 1980-2009 and, independently, from rates of unprotected anal intercourse in a prospective cohort study in Amsterdam. The agreement between the two approaches was very good, confirming that, in terms of incidence, increasing risk behaviour among MSM is offsetting the benefits offered by enhanced testing and treatment.
AIDS (London, England) 07/2012; 26(14):1840-3 (impact factor 4.91).
ABSTRACT: To quantify the performance of existing first-line and second-line combination antiretroviral therapy (cART) regimens on patients' clinical outcomes in the Netherlands using ATHENA data, and to evaluate the potential for new drug regimens to improve patients' clinical outcomes using a data-based mathematical model.
We analysed data from 3995 patients in the Dutch ATHENA national observational cohort between 2000 and 2010. We quantified the main drug-related reasons for switching from first-line and second-line cART, classified as toxicity, simplification/new medication becoming available, virological failure, or other reasons. We developed a deterministic model describing HIV infection and treatment in the Netherlands, parameterized on the basis of these data. The model simulated how a new drug regimen, with either an improved toxicity or virological failure profile, could impact on patients' clinical outcomes.
The main reason for switching current first-line and second-line regimens was toxicity, accounting for around 50% of switches from both first-line and second-line cART. The model found that a new drug regimen with an improved tolerability profile could have the highest potential impact on patients' outcomes, especially as a first-line treatment. A new first-line drug regimen with improved tolerability could increase the time patients spend on first-line cART, decrease their risk of switching from first-line cART and thus simplify patient management.
New drug regimens with improved toxicity profiles could have the greatest impact on patient outcomes and simplify patient management in the Netherlands.
AIDS (London, England) 06/2012; 26(15):1953-9 (impact factor 4.91).
ABSTRACT: OBJECTIVES: The aim of the study was to compare health-related quality of life (HRQL) over 96 weeks in patients receiving no treatment or 24 or 60 weeks of combination antiretroviral therapy (cART) during primary HIV-1 infection (PHI). METHODS: A multicentre prospective cohort study of PHI patients, with an embedded randomized trial, was carried out. HRQL was assessed with the Medical Outcomes Study Health Survey for HIV (MOS-HIV) and a symptom checklist administered at weeks 0, 8, 24, 36, 48, 60, 72, 84 and 96. Mixed linear models were used for the analysis of differences in HRQL among the three groups. RESULTS: A total of 112 patients were included in the study: 28 received no treatment, 45 received 24 weeks of cART and 39 received 60 weeks of cART. Over 96 weeks of follow-up, the groups receiving 24 and 60 weeks of cART had better cognitive functioning than the no-treatment group (P = 0.005). Patients receiving 60 weeks of cART had less pain (P = 0.004), better role functioning (P = 0.001), better physical functioning (P = 0.02) and a better physical health summary score (P = 0.006) than the groups receiving no treatment or 24 weeks of cART. Mental health was better in patients receiving 24 weeks of cART than in patients in the no-treatment group or the group receiving 60 weeks of cART (P = 0.02). At week 8, patients in the groups receiving 24 and 60 weeks of cART reported more nausea (P = 0.002), diarrhoea (P < 0.001), abdominal pain (P = 0.02), stomach pain (P = 0.049) and dizziness (P = 0.01) than those in the no-treatment group. These differences had disappeared by week 24. CONCLUSIONS: Temporary cART during PHI had a significant positive impact on patients' HRQL as compared with no treatment, despite the initial, short-term occurrence of more physical symptoms, probably related to drug toxicity.
HIV Medicine 04/2012; 13(10):630-635 (impact factor 3.16).
ABSTRACT: Several studies reported an association between immunodeficiency and non-AIDS-defining diseases. We investigated whether nonstructured treatment interruptions and episodes of viremia during suppressive combination antiretroviral therapy were independently associated with non-AIDS diseases.
Six thousand four hundred forty patients with viral suppression (<50 copies/mL) within 48 weeks of starting combination antiretroviral therapy were selected from the Dutch ATHENA cohort. In proportional hazards models, associations between treatment interruptions, viral suppression, low-level (50-400 copies/mL), and high-level viremia (>400), and serious non-AIDS diseases (cardiovascular disease, chronic renal failure, liver fibrosis/cirrhosis) were investigated by including time-updated cumulative exposure to either viremia and interruptions or HIV RNA >400 copies per milliliter.
During 24,603 person-years, of which 88.5% occurred during viral suppression, 102 patients developed cardiovascular disease, 54 chronic renal failure, and 70 liver fibrosis/cirrhosis. Overall incidence of non-AIDS diseases ranged from 1.41 (95% confidence interval: 0.73 to 2.46) per 100 person-years for CD4 counts <200 to 0.71 (0.49 to 1.00) for CD4 ≥500 cells per cubic millimeter. Compared with viral suppression, high-level viremia was associated only with cardiovascular disease (relative hazard: 1.37, 1.04 to 1.81 per year longer), whereas interruptions and low-level viremia were not associated with non-AIDS diseases. Relative hazards for cumulative exposure to RNA >400 versus ≤400 copies per milliliter were 1.32 (1.01 to 1.73) for cardiovascular disease, 1.13 (0.66 to 1.92) for renal failure, and 0.86 (0.51 to 1.44) for fibrosis/cirrhosis.
Lower CD4 counts are associated with increased risk of non-AIDS diseases, whereas high-level viremia seems to be independently associated with cardiovascular disease. However, the power to detect associations with viremia or interruptions may have been limited as most events occurred during viral suppression.
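The time-updated cumulative-exposure covariate used in the proportional hazards models above (years spent with HIV RNA >400 copies/ml, with relative hazards expressed per year longer) can be sketched as simple interval bookkeeping. The episode boundaries below are hypothetical:

```python
def cumulative_exposure(intervals, follow_up_end):
    """Cumulative years with HIV RNA >400 copies/ml accrued by
    `follow_up_end`. `intervals` are (start_year, stop_year) episodes
    of high-level viremia on the follow-up time axis."""
    total = 0.0
    for start, stop in intervals:
        # Clip each episode to the follow-up window before summing.
        overlap = min(stop, follow_up_end) - min(start, follow_up_end)
        total += max(0.0, overlap)
    return total

# Patient with two viremic episodes, evaluated at 5 years of follow-up:
episodes = [(0.5, 1.0), (3.0, 4.5)]
print(cumulative_exposure(episodes, follow_up_end=5.0))  # 2.0 years
```

In a real analysis this value would be recomputed at each event time ("time-updated") and entered as a covariate in the Cox model.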
ABSTRACT: The objective of this study was to assess the benefit of temporary combination antiretroviral therapy (cART) during primary HIV infection (PHI).
Adult patients with laboratory evidence of PHI were recruited in 13 HIV treatment centers in the Netherlands and randomly assigned to receive no treatment or 24 or 60 wk of cART (allocation in a 1:1:1 ratio); if therapy was clinically indicated, participants were randomized over the two treatment arms (allocation in a 1:1 ratio). Primary end points were (1) viral set point, defined as the plasma viral load 36 wk after randomization in the no treatment arm and 36 wk after treatment interruption in the treatment arms, and (2) the total time that patients were off therapy, defined as the time between randomization and start of cART in the no treatment arm, and the time between treatment interruption and restart of cART in the treatment arms. cART was (re)started in case of confirmed CD4 cell count < 350 cells/mm(3) or symptomatic HIV disease. In total, 173 participants were randomized. The modified intention-to-treat analysis comprised 168 patients: 115 were randomized over the three study arms, and 53 randomized over the two treatment arms. Of the 115 patients randomized over the three study arms, mean viral set point was 4.8 (standard deviation 0.6) log(10) copies/ml in the no treatment arm, and 4.0 (1.0) and 4.3 (0.9) log(10) copies/ml in the 24- and 60-wk treatment arms (between groups: p < 0.001). The median total time off therapy in the no treatment arm was 0.7 (95% CI 0.0-1.8) y compared to 3.0 (1.9-4.2) and 1.8 (0.5-3.0) y in the 24- and 60-wk treatment arms (log rank test, p < 0.001). In the adjusted Cox analysis, both 24 wk (hazard ratio 0.42 [95% CI 0.25-0.73]) and 60 wk of early treatment (hazard ratio 0.55 [0.32-0.95]) were associated with time to (re)start of cART.
In this trial, temporary cART during PHI was found to transiently lower the viral set point and defer the restart of cART during chronic HIV infection.
PLoS Medicine 03/2012; 9(3):e1001196 (impact factor 15.25).
ABSTRACT: Viral blips may be an indication of poor adherence to antiretroviral treatment. This article studies how variation in the definition of viral blips and in the choice of sampling frame may contribute to the uncertainty of the associations between viral blips and their possible causes.
A mathematical modeling study allows us to examine the impact of different sampling frames and different blip definitions on study results, which is usually not feasible in clinical settings.
Using a previously published mathematical model, scenarios of different drug adherence levels and viral blips, with different sampling frames, were modeled.
In the case of viral blips resulting from nonadherence to combination antiretroviral therapy, rather than calculating the incidence of blips directly from the number of blips observed in a given period of time, it is better to report the proportion of observations in that period that are ≥50 copies per milliliter. The number of observations in the period is therefore important as the denominator. However, the proportion of blips is not very informative about the level of drug adherence.
We should standardize the definition of viral blips and the choice of sampling frame, and report the proportion of observations in a given sampling frame and period of time that are ≥50 copies per milliliter, so that comparable data can be generated across different populations.
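The recommendation above — report the proportion of measurements ≥50 copies/ml rather than a raw blip count — can be illustrated with a toy series sampled at two different frequencies. The viral-load values below are invented:

```python
def blip_summary(viral_loads, threshold=50):
    """Summarise a series of viral-load measurements two ways:
    (count of measurements >= threshold,
     proportion of measurements >= threshold)."""
    above = sum(1 for vl in viral_loads if vl >= threshold)
    return above, above / len(viral_loads)

# Same hypothetical patient observed under two sampling frames:
quarterly  = [40, 60, 45, 55, 40, 40, 40, 40]  # 8 visits over 2 years
semiannual = [40, 60, 40, 40]                  # 4 visits over 2 years

print(blip_summary(quarterly))   # (2, 0.25)
print(blip_summary(semiannual))  # (1, 0.25)
```

The raw count doubles with the denser sampling frame, while the proportion stays comparable — which is exactly why the abstract argues for reporting the proportion together with the number of observations.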
ABSTRACT: Infection with HIV-1 may result in severe cognitive and motor impairment, referred to as HIV-1-associated dementia (HAD). While its prevalence has dropped significantly in the era of combination antiretroviral therapy, milder neurocognitive disorders persist with a high prevalence. To identify additional therapeutic targets for treating HIV-associated neurocognitive disorders, several candidate gene polymorphisms have been evaluated, but few have been replicated across multiple studies.
Here we tested 7 candidate gene polymorphisms for association with HAD in a case-control study consisting of 86 HAD cases and 246 non-HAD AIDS patients as controls. Since infected monocytes and macrophages are thought to play an important role in the infection of the brain, 5 recently identified single nucleotide polymorphisms (SNPs) affecting HIV-1 replication in macrophages in vitro were also tested.
The CCR5 wt/Δ32 genotype was only associated with HAD in individuals who developed AIDS prior to 1991, in agreement with the observed fading effect of this genotype on viral load set point. A significant difference in genotype distribution among all cases and controls irrespective of year of AIDS diagnosis was found only for a SNP in candidate gene PREP1 (p = 1.2 × 10(-5)). Prep1 has recently been identified as a transcription factor preferentially binding the -2,518 G allele in the promoter of the gene encoding MCP-1, a protein with a well established role in the etiology of HAD.
These results support previous findings suggesting an important role for MCP-1 in the onset of HIV-1-associated neurocognitive disorders.
PLoS ONE 01/2012; 7(2):e30990 (impact factor 3.73).