Chin-Shang Li

University of California, Davis, Davis, California, United States


Publications (59) · 159.92 Total impact

  •
    ABSTRACT: Despite effective local therapy with surgery and radiotherapy (RT), ~50% of patients with high-grade soft tissue sarcoma (STS) will relapse and die of disease. Since experimental data suggest a significant synergistic effect when antiangiogenic targeted therapies such as sorafenib are combined with RT, we chose to evaluate preoperative combined modality sorafenib and conformal RT in a phase I/II trial among patients with extremity STS amenable to treatment with curative intent. For the phase I trial, eight patients with intermediate- or high-grade STS >5 cm in maximal dimension or low-grade STS >8 cm in maximal dimension received concomitant sorafenib (dose escalation cohort 1: 200 twice daily; cohort 2: 200/400 daily) and preoperative RT (50 Gy in 25 fractions). Sorafenib was continued during the entire period of RT as tolerated. Surgical resection was completed 4-6 weeks following completion of neoadjuvant sorafenib/RT. Three sorafenib dose levels were planned. Primary endpoints of the phase I trial were maximal tolerated dose and dose-limiting toxicity (DLT). Eight patients were enrolled in the phase I trial (five females, median age 44 years, two high-grade pleomorphic, two myxoid/round cell liposarcoma, four other). Median tumor size was 16 cm (range 8-29), and all tumors were located in the lower extremity. Two of five patients treated at dose level 2 developed DLT consisting of grade 3 rash not tolerating drug reintroduction. Other grade 3 side effects included anemia, perirectal abscess, and supraventricular tachycardia. Radiation toxicity (grade 1 or 2 dermatitis; N = 8) and post-surgical complications (three grade 3 wound complications) were comparable to historical controls and other series of preoperative RT monotherapy. Complete pathologic response (≥95% tumor necrosis) was observed in three patients (38%). Neoadjuvant sorafenib in combination with RT is tolerable and appears to demonstrate activity in locally advanced extremity STS. 
Further study to determine efficacy at dose level 1 is warranted. (ClinicalTrials.gov identifier NCT00805727).
    Annals of Surgical Oncology 02/2014; · 4.12 Impact Factor
  •
    ABSTRACT: Introduction With increasing longevity, a growing proportion of patients who present with lower extremity peripheral arterial disease (PAD) are ≥ 80 years old. While smoking and diabetes (DM) have traditionally been the main risk factors associated with PAD, we noted a pattern of severe infrapopliteal PAD in patients ≥ 80 years old in the absence of these traditional risk factors. As recognition of patterns of disease affects decisions regarding diagnostic and therapeutic approach, we sought to confirm this observation. Methods A single-center retrospective review was performed of all patients who underwent lower extremity arteriography between 3/2007 and 9/2009. Arteriograms were scored in blinded fashion. Any infrapopliteal PAD was defined as one or more infrapopliteal arteries with either >50% stenosis or total occlusion. Severe infrapopliteal PAD was defined as 2 or more infrapopliteal arteries with >50% stenosis or total occlusion. Fisher's exact test and two-sample t-test or Wilcoxon rank-sum test were used for analysis. Results 297 patients comprised the study population. 82% (145/176) of those ≤ 70 years old versus 96% (46/48) of those ≥ 80 years old had any infrapopliteal PAD (p = 0.02). 30% of patients ≥ 80 years old with infrapopliteal PAD had no history of DM or smoking, while only 5% of younger patients had infrapopliteal PAD in the absence of DM or smoking (p < 0.0001). A similar pattern was seen for severe infrapopliteal PAD. Tissue loss was an indication for lower extremity arteriography in 45% of those ≤ 70 years of age versus 65% of those ≥ 80 (p = 0.022). Conclusions A significant proportion of patients ≥ 80 years of age with PAD develop arterial disease in the infrapopliteal pattern in the absence of the traditional risk factors of smoking and DM. Our data also showed that this pattern of disease is significantly associated with tissue loss and critical limb ischemia, particularly in patients ≥ 80 years of age. 
Primary care providers need to be educated to suspect ischemic etiology for foot pain and ulcers in elderly patients not otherwise thought to have risk factors associated with PAD. Vascular specialists need to anticipate this pattern of disease when planning interventions. As smoking becomes less prevalent and as the population ages, octogenarians with severe infrapopliteal arterial occlusive disease will become a larger proportion of the patients treated by vascular specialists.
    Annals of Vascular Surgery 01/2014; · 0.99 Impact Factor
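The headline age-group comparison in the abstract above can be checked directly from the reported counts with Fisher's exact test, the test the authors name. A minimal sketch (assumes SciPy is available; the counts are taken from the abstract):

```python
# Fisher's exact test on the reported 2x2 table:
# any infrapopliteal PAD in 82% (145/176) of patients <= 70 years old
# vs. 96% (46/48) of patients >= 80 years old.
from scipy.stats import fisher_exact

table = [
    [145, 176 - 145],  # <= 70 years old: PAD, no PAD
    [46, 48 - 46],     # >= 80 years old: PAD, no PAD
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.3f}, p = {p_value:.4f}")
```

An odds ratio below 1 here reflects that the younger group was less likely to have infrapopliteal PAD, consistent with the reported p = 0.02.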
  • Source
    ABSTRACT: A majority of individuals infected with human immunodeficiency virus (HIV) have inadequate access to antiretroviral therapy and ultimately develop debilitating oral infections that often correlate with disease progression. Due to the impracticalities of conducting host-microbe systems-based studies in HIV infected patients, we have evaluated the potential of simian immunodeficiency virus (SIV) infected rhesus macaques to serve as a non-human primate model for oral manifestations of HIV disease. We present the first description of the rhesus macaque oral microbiota and show that a mixture of human commensal bacteria and "macaque versions" of human commensals colonize the tongue dorsum and dental plaque. Our findings indicate that SIV infection results in chronic activation of antiviral and inflammatory responses in the tongue mucosa that may collectively lead to repression of epithelial development and impact the microbiome. In addition, we show that dysbiosis of the lingual microbiome in SIV infection is characterized by outgrowth of Gemella morbillorum that may result from impaired macrophage function. Finally, we provide evidence that the increased capacity of opportunistic pathogens (e.g. E. coli) to colonize the microbiome is associated with reduced production of antimicrobial peptides.
    PLoS ONE 11/2013; 8(11):e80863. · 3.73 Impact Factor
  •
    ABSTRACT: To examine the age- and gender-specific trends of Schedule II opioid use among California residents, with special reference to multiple provider users (doctor shoppers). Utilizing data from the California Prescription Drug Monitoring Program, we examined age- and gender-specific trends of Schedule II opioid use during calendar years 1999-2007. Specifically, we analyzed the following: (1) the prevalence of Schedule II opioid users among California's population and (2) the proportion of these opioid users who were doctor shoppers (defined as an individual who used more than five different prescribers for all Schedule II opioids he or she obtained in a calendar year). Among all age and gender groups, the prevalence of Schedule II opioid users in California increased by 150%-280% and the prevalence of doctor shoppers among users increased by 111%-213% over 9 years. By 2007, the prevalence of opioid users was lowest among men 18-44 years old (1.25%) and highest among women 65 years and older (5.31%). The prevalence of doctor shoppers was approximately 1.4% among those up to age 64 years and 0.5% among those 65 years and older. The gender difference in doctor shoppers among all age groups was negligible. On average, the cumulative morphine-equivalent amount of Schedule II opioid per individual obtained per year was threefold to sixfold higher for doctor shoppers than for the general population across different age and gender groups. Age and gender differences in opioid use were relatively small, whereas the trends for use of opioids and multiple providers grew at a disquieting rate.
    Pharmacoepidemiology and Drug Safety 08/2013; · 2.90 Impact Factor
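The doctor-shopper definition used above (more than five different prescribers for all Schedule II opioids obtained in a calendar year) translates directly into code. This is an illustrative sketch, not the authors' implementation; the record layout and all names are hypothetical:

```python
# Flag "doctor shoppers": patient-years with more than `threshold`
# distinct prescribers, given (patient_id, year, prescriber_id) records.
from collections import defaultdict

def find_doctor_shoppers(prescriptions, threshold=5):
    """Return the set of (patient_id, year) pairs whose count of
    distinct prescribers exceeds `threshold`."""
    prescribers = defaultdict(set)
    for patient_id, year, prescriber_id in prescriptions:
        prescribers[(patient_id, year)].add(prescriber_id)
    return {key for key, docs in prescribers.items() if len(docs) > threshold}

# Hypothetical records: patient "A" sees 6 prescribers in 2007, "B" sees 2.
records = [("A", 2007, f"dr{i}") for i in range(6)] + \
          [("B", 2007, "dr1"), ("B", 2007, "dr2")]
print(find_doctor_shoppers(records))  # only ("A", 2007) exceeds 5 prescribers
```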
  •
    ABSTRACT: Disparities in smoking rates remain prominent among Asian Americans. Medical pluralism and cultural tailoring may enhance Asian Americans' engagement with tobacco cessation assistance. We conducted a retrospective analysis of a community clinic's smoking cessation program targeting a Chinese population that offered acupuncture, nicotine replacement therapy (NRT), and counseling from 2007 to 2010. Most participants used acupuncture, with about half choosing acupuncture and NRT, followed by more than 40% choosing acupuncture only; few chose NRT only. Tobacco cessation rates at 6 months were relatively high for the acupuncture + NRT group and the acupuncture-only group (37.7% vs. 28.9%). For tobacco reduction >50% from baseline, compared with an expanded NRT-only group, the acupuncture + NRT group had a higher odds ratio and the acupuncture-only group a lower one. Our evaluation of this real-world community program offering acupuncture as a cultural adjunct to a tobacco cessation program suggests that acupuncture might help engage Chinese American male smokers in a tobacco cessation program that offers counseling and NRT. Future larger studies should further evaluate the efficacy of offering acupuncture in combination with NRT on the outcomes of cessation and reduction.
    Health Promotion Practice 05/2013;
  •
    ABSTRACT: PURPOSE: Malignancies may cause urinary tract obstruction, which is often relieved with placement of a percutaneous nephrostomy tube, an internal double J nephro-ureteric stent (double J), or an internal-external nephroureteral stent (NUS). We evaluated the effect of these palliative interventions on quality of life (QoL) using previously validated surveys. METHODS: Forty-six patients with malignancy related ureteral obstruction received nephrostomy tubes (n = 16), double J stents (n = 15), or NUS (n = 15) as determined by a multidisciplinary team. QoL surveys were administered at 7, 30, and 90 days after the palliative procedure to evaluate symptoms and physical, social, functional, and emotional well-being. Number of related procedures, fluoroscopy time, and complications were documented. Kruskal-Wallis and Friedman's tests were used to compare patients at 7, 30, and 90 days. Spearman's rank correlation coefficient was used to assess correlations between clinical outcomes/symptoms and QoL. RESULTS: Responses to QoL surveys were not significantly different for patients receiving nephrostomies, double J stents, or NUS at 7, 30, or 90 days. At 30 and 90 days there were significantly higher reported urinary symptoms and pain in those receiving double J stents compared with nephrostomies (P = 0.0035 and P = 0.0189, respectively). Significantly greater fluoroscopy time was needed for double J stent-related procedures (P = 0.0054). Nephrostomy tubes were associated with more frequent minor complications requiring additional changes. CONCLUSION: QoL was not significantly different. However, a greater incidence of pain in those receiving double J stents and more frequent tube changes in those with nephrostomy tubes should be considered when choosing palliative approaches.
    CardioVascular and Interventional Radiology 02/2013; · 2.09 Impact Factor
  •
    ABSTRACT: BACKGROUND: Skeletal surveys for non-accidental trauma (NAT) include lateral spinal and pelvic views, which have a significant radiation dose. OBJECTIVE: To determine whether pelvic and lateral spinal radiographs should routinely be performed during initial bone surveys for suspected NAT. MATERIALS AND METHODS: The radiology database was queried for the period May 2005 to May 2011 using CPT codes for skeletal surveys for suspected NAT. Studies performed for skeletal dysplasia and follow-up surveys were excluded. Initial skeletal surveys were reviewed to identify fractures present, including those identified only on lateral spinal and/or pelvic radiographs. Clinical information and MR imaging were reviewed for the single patient with vertebral compression deformities. RESULTS: Of the 530 children, 223 (42.1%) had rib and extremity fractures suspicious for NAT. No fractures were identified solely on pelvic radiographs. Only one child (<0.2%) had vertebral compression deformities identified on a lateral spinal radiograph. This infant had rib and extremity fractures and was clinically paraplegic. MR imaging confirmed the vertebral body fractures. CONCLUSION: Because no fractures were identified solely on pelvic radiographs, and a lateral spinal radiograph yielded additional findings in only one child who had other clear evidence of NAT, routine inclusion of these views in the initial evaluation of children for suspected NAT may not be warranted.
    Pediatric Radiology 01/2013; · 1.57 Impact Factor
  •
    ABSTRACT: BACKGROUND: There are conflicting data regarding improvements in postoperative outcomes with perioperative epidural analgesia. We sought to examine the effect of perioperative epidural analgesia vs. intravenous narcotic analgesia on perioperative outcomes including pain control, morbidity, and mortality in patients undergoing gastric and pancreatic resections. METHODS: We evaluated 169 patients from 2007 to 2011 who underwent open gastric and pancreatic resections for malignancy at a university medical center. Emergency, traumatic, pediatric, enucleations, and disseminated cancer cases were excluded. Clinicopathologic data were reviewed among epidural (E) and non-epidural (NE) patients for their association with perioperative endpoints. RESULTS: One hundred twenty patients (71 %) received an epidural and 49 (29 %) did not. There were no significant differences (P > 0.05) in mean pain scores at each of the four days (days 0-3) among the E (3.2 ± 2.7, 3.2 ± 2.3, 2.3 ± 1.9, and 2.1 ± 1.9, respectively) and NE patients (3.7 ± 2.7, 3.4 ± 1.9, 2.9 ± 2.1, and 2.4 ± 1.9, respectively). Within each of the E and NE patient groups, mean pain scores differed significantly from day 0 to day 3 (P < 0.0001). Of the E patients, 69 % also received intravenous patient-controlled analgesia (PCA). Ileus (13 % E vs. 8 % NE), pneumonia (12 % E vs. 8 % NE), venous thromboembolism (6 % E vs. 4 % NE), length of stay [11.0 ± 12.1 (8, 4-107) E vs. 12.2 ± 10.7 (7, 3-54) NE], overall morbidity (36 % E vs. 39 % NE), and mortality (4 % E vs. 2 % NE) were not significantly different. CONCLUSIONS: Routine use of epidurals in this group of patients does not appear to be superior to PCA.
    Journal of Gastrointestinal Surgery 01/2013; · 2.36 Impact Factor
  • Source
    ABSTRACT: BACKGROUND: Diabetic foot ulcers (DFUs) represent a significant source of morbidity and an enormous financial burden. Standard care for DFUs involves systemic glucose control, ensuring adequate perfusion, debridement of nonviable tissue, off-loading, control of infection, local wound care and patient education, all administered by a multidisciplinary team. Unfortunately, even with the best standard of care (SOC) available, only 24% or 30% of DFUs will heal by week 12 or 20, respectively. The extracellular matrix (ECM) in DFUs is abnormal and its impairment has been proposed as a key target for new therapeutic devices. These devices are intended to replace the aberrant ECM by implanting a matrix, either devoid of cells or enhanced with fibroblasts, keratinocytes, or both, as well as various growth factors. These new bioengineered skin substitutes are proposed to encourage angiogenesis and in-growth of new tissue, and to utilize living cells to generate cytokines needed for wound repair. To date, the efficacy of bioengineered ECM containing live cellular elements for improving healing above that of a SOC control group has not been compared with the efficacy of an ECM devoid of cells relative to the same SOC. Our hypothesis is that there is no difference in the improved healing effected by either of these two product types relative to SOC. METHODS: To test this hypothesis we propose a randomized, single-blind, clinical trial with three arms: SOC, SOC plus Dermagraft® (bioengineered ECM containing living fibroblasts), and SOC plus Oasis® (ECM devoid of living cells) in patients with nonhealing DFUs. The primary outcome is the percentage of subjects that achieved complete wound closure by week 12. DISCUSSION: If our hypothesis is correct, then immense cost savings could be realized by using the orders-of-magnitude less expensive acellular ECM device without compromising patient health outcomes. 
The article describes the protocol proposed to test our hypothesis. Trial registration: NCT01450943. Registered: 7 October 2011.
    Trials 01/2013; 14(1):8. · 2.21 Impact Factor
  •
    ABSTRACT: The Joint Commission Venous Thromboembolism (VTE) National Hospital Inpatient Quality Measure VTE-5 outlines four criteria for discharge patient education when starting anticoagulation (usually, warfarin) therapy. The criteria do not specify content regarding patient recognition of potentially dangerous warfarin-related scenarios. A study was conducted to investigate how well patients assess the risks and consequences of potential warfarin-related safety threats. From an adult population on long-term warfarin, 480 patients were randomly selected for a telephone-based survey. Warfarin-knowledge questions were drawn from a previous survey; warfarin-associated risk scenarios were developed via focus interviews. Expert anticoagulation pharmacists categorized each scenario as urgent, moderately urgent, or not urgent, as did survey participants. For the 184 patients (38% completion rate), the mean knowledge score was 69% (standard deviation [SD], 0.20). Overall classification accuracy of situational urgency was 59% (95% confidence interval [CI], 57.3%-60.3%). Respondents overestimated non-urgent-severity situations 23% of the time (95% CI, 20.8%-24.7%), while underestimating urgent-severity situations 21% of the time (95% CI, 19.0%-23.9%). A significant percentage of patients failed to recognize the urgency of stroke symptoms (for example, loss of vision), the risk of bleeding after incidental head trauma, or medication mismanagement. Despite fair factual warfarin knowledge, participants did not appear to recognize well the clinical severity of warfarin-associated scenarios. Warfarin education programs should incorporate patient-centered strategies to teach recognition of high-risk situations that compromise patient safety.
    Joint Commission journal on quality and patient safety / Joint Commission Resources 01/2013; 39(1):22-31.
  •
    ABSTRACT: OBJECTIVE: To assess the frequency and associations of barrier protection use during sexual activity in a population of women who have sex with women (WSW). METHODS: WSW were invited to participate in an international internet-based survey. Information regarding ethnodemographics, sexual health, and barrier use during sexual activities was collected. RESULTS: The study cohort comprised 1557 participants. Barrier use was least prevalent during digital genital stimulation (11.3% ever used barriers) and most prevalent during stimulation with a sex toy (34.4% ever used barriers). Univariate analysis revealed that women in non-monogamous relationships were more likely than monogamous women to always use barrier protection for sexual activity (14.3% vs 3.5%). On multivariate analysis, there was no association between barrier use and frequency of casual sexual activity or history of sexually transmitted infection. Small associations were noted between barrier use and certain sexual activities, age, race, and number of partners. CONCLUSION: Many WSW do not use barrier protection during sexual activity, even in the context of potentially risky sexual behaviors. Safer-sex practices among WSW merit increased attention from healthcare providers and public health researchers.
    International journal of gynaecology and obstetrics: the official organ of the International Federation of Gynaecology and Obstetrics 10/2012; · 1.41 Impact Factor
  •
    ABSTRACT: INTRODUCTION: While the anti-resorptive effects of the bisphosphonates (BPs) are well documented, many questions remain about their mechanisms of action, particularly following long-term use. This study evaluated the effects of alendronate (Ale) treatment on TGF-β1 signaling in mesenchymal stem cells (MSCs) and osteocytes, and the relationship between prolonged alendronate treatment on systemic TGF-β1 levels and bone strength. METHODS: TGF-β1 expression and signaling were evaluated in MSCs and osteocytic MLO-Y4 cells following Ale treatment. Serum total TGF-β1 levels, a bone resorption marker (DPD/Cr), three-dimensional microCT scans and biomechanical tests from both the trabecular and cortical bone were measured in ovariectomized rats that either received continuous Ale treatment for 360 days or Ale treatment for 120 days followed by 240 days of vehicle. Linear regression tests were performed to determine the association of serum total TGF-β1 levels and both the trabecular (vertebrae) and cortical (tibiae) bone strength. RESULTS: Ale increased TGF-β1 signaling in the MSCs but not in the MLO-Y4 cells. Ale treatment increased serum TGF-β1 levels and the numbers of TGF-β1-positive osteocytes and periosteal cells in cortical bone. Serum TGF-β1 levels were not associated with vertebral maximum load and strength but were negatively associated with cortical bone maximum load and ultimate strength. CONCLUSIONS: The increase of serum TGF-β1 levels during the acute phase of estrogen deficiency is likely due to increased osteoclast-mediated release of matrix-derived latent TGF-β1. Long-term estrogen deficiency generally results in a decline in serum TGF-β1 levels that are maintained by Ale treatment. Measuring serum total TGF-β1 levels may help to determine cortical bone quality following alendronate treatment.
    Bone 10/2012; · 3.82 Impact Factor
  • Masud Seyal, Lisa M Bateman, Chin-Shang Li
    ABSTRACT: Purpose: Sudden unexpected death in epilepsy (SUDEP) is the leading cause of epilepsy-related mortality. Seizure-related respiratory dysfunction (RD), the duration of postictal generalized electroencephalography (EEG) suppression (PGES), and duration of postictal immobility (PI) may be important in the pathophysiology of SUDEP. Periictal interventions may reduce the risk of SUDEP. Methods: We assessed the impact of periictal nursing interventions on RD, PGES, and PI duration in patients with localization-related epilepsy and secondarily generalized convulsions (GCs) recorded during video-EEG telemetry in the epilepsy monitoring unit. Video-EEG data were retrospectively reviewed. Interventions including administration of supplemental oxygen, oropharyngeal suction, and patient repositioning were evaluated. Interventions were performed based on nursing clinical judgment at the bedside and were not randomized. The two-sided Wilcoxon rank-sum test was used to compare GCs with and without intervention. Robust simple linear regression was used to assess the association between timing of intervention and duration of hypoxemia (SaO2 < 90%), PGES, and PI using data from only the first GC for each patient. Key Findings: Data from 39 patients with 105 GCs were analyzed. PGES >2 s occurred following 31 GCs in 16 patients. There were 21 GCs with no intervention (NOINT) and 84 GCs with interventions (INT). In the INT group, the duration of hypoxemia was shorter (p = 0.0014) when intervention occurred before hypoxemia onset (mean duration 53.1 s) than when intervention was delayed (mean duration 132.42 s). Linear regression indicated that in GCs with nursing interventions, earlier intervention was associated with shorter duration of hypoxemia (p < 0.0001) and shorter duration of PGES (p = 0.0012). Seizure duration (p < 0.0001) and convulsion duration (p = 0.0457) were shorter with earlier intervention. 
PI duration was longer for GCs with PGES than for GCs without PGES (p < 0.0001). The mean delay to first active nonrespiratory movement following GCs with PGES was 251.96 s and for GCs without PGES was 66.06 s. Longer PI duration was associated with a lower SaO2 nadir (p = 0.003) and a longer duration of oxygen desaturation (p = 0.0026). There was no association between PI duration and seizure duration (p = 0.773), between PI duration and PGES duration (p = 0.758), or between PI duration and the timing of first intervention relative to seizure onset (p = 0.823). PGES did not occur in the NOINT group. The mean duration of desaturation was longer (110.9 vs. 49.9 s) (p < 0.0001), mean SaO2 nadir was lower (72.8% vs. 79.7%) (p = 0.0086), and mean end-tidal CO2 was higher (58.6 vs. 50.3 mmHg) (p = 0.0359) in the INT group compared with the NOINT group. The duration of the seizure or of the convulsive component was not significantly different between the INT and NOINT groups. Significance: Early periictal nursing intervention was associated with reduced duration of RD and reduced duration of PGES. These findings suggest the possibility that such interventions may be effective in reducing the risk of SUDEP in the outpatient setting. Validation of these preliminary data with a prospective study is needed before definitive conclusions can be reached regarding the efficacy of periictal interventions in reducing the risk of SUDEP.
    Epilepsia 09/2012; · 3.96 Impact Factor
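The group comparisons described above rely on a two-sided Wilcoxon rank-sum test. A hedged sketch (assumes SciPy; the duration arrays are invented for illustration, and only the test design, not the data, comes from the abstract):

```python
# Two-sided Wilcoxon rank-sum test comparing a variable (e.g., desaturation
# duration in seconds) between intervention (INT) and no-intervention (NOINT)
# groups of generalized convulsions. Values below are hypothetical.
from scipy.stats import ranksums

int_durations = [80, 95, 110, 120, 130, 150]    # hypothetical INT group
noint_durations = [30, 40, 50, 55, 60, 65]      # hypothetical NOINT group

stat, p = ranksums(int_durations, noint_durations)
print(f"rank-sum statistic = {stat:.2f}, p = {p:.4f}")
```

With fully separated samples like these, the positive statistic indicates larger values in the first (INT) group, mirroring the direction of the reported comparison (110.9 vs. 49.9 s).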
  •
    ABSTRACT: Currently, there are no well-established duplex ultrasound (DUS) criteria for the evaluation of the mesenteric arteries after stenting for occlusive disease. Previous studies suggested DUS velocity criteria in the native superior mesenteric artery (SMA) overestimate stenosis in stented arteries, but most studies have not evaluated DUS imaging after SMA stenting longitudinally. This study was undertaken to determine the accuracy of DUS after mesenteric artery revascularization and, in particular, to evaluate the utility of DUS imaging for the detection of in-stent stenosis (ISS) of the SMA. A retrospective record review was performed for all patients who underwent SMA stenting for chronic mesenteric ischemia at a single institution from January 2004 to May 2011. Mesenteric artery occlusive disease resulted in 24 patients undergoing mesenteric stenting of the SMA alone (n = 20) or the SMA and celiac artery simultaneously (n = 3). The mean ± standard deviation peak systolic velocity (PSV) in 13 prestent DUS images of the SMA was 464 ± 130 cm/s. Prestenting angiography revealed an average SMA stenosis of 79% ± 14%. After stenting, completion angiography in each case revealed <20% residual stenosis. No significant correlation was identified between SMA PSV and angiographic stenosis before and after stenting (P > .05). Follow-up SMA DUS imaging showed an average PSV of 335 ± 138 cm/s at 0.9 ± 1.5 months, 360 ± 143 cm/s at 4.8 ± 2.6 months, and 389 ± 95 cm/s at 14.4 ± 5.1 months. A significant difference existed between the prestent and the first poststent mean SMA PSV (P < .05), but no significant difference existed between each poststenting interval. Eight reinterventions for SMA ISS were performed, with a mean elevated in-stent SMA PSV of 505 ± 74 vs 341 ± 145 cm/s in patients who did not undergo reintervention. Angiography before the eight reinterventions demonstrated an average SMA ISS of 53% ± 25%. 
In-stent SMA PSV decreased from 505 ± 74 to 398 ± 108 cm/s after the reintervention (P < .05). Consistent with other reports, our data demonstrate that the PSV in successfully stented SMAs remains higher than the PSV threshold of 275 cm/s used for the diagnosis of high-grade native SMA stenosis. In addition, in-stent SMA PSVs did not significantly change over DUS surveillance for patients who did not undergo reintervention. Thus, a baseline DUS obtained early after mesenteric stenting should be considered as a reference for future surveillance DUS. An increase above this baseline or an in-stent SMA PSV approaching 500 cm/s should be considered suspicious for ISS, but larger prospective studies will be required to validate these preliminary findings.
    Journal of vascular surgery: official publication, the Society for Vascular Surgery [and] International Society for Cardiovascular Surgery, North American Chapter 09/2012; 56(5):1364-71. · 3.52 Impact Factor
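The surveillance recommendation in the conclusion can be phrased as a simple screening rule: compare each follow-up in-stent PSV against the patient's early post-stent baseline and flag values approaching 500 cm/s. The function below is a hypothetical sketch; the 500 cm/s figure comes from the article's conclusion, while the 25% rise-over-baseline margin is an invented placeholder, not a validated criterion:

```python
# Hypothetical ISS screening rule sketched from the article's conclusions.
def suspicious_for_iss(psv_cm_s, baseline_psv_cm_s,
                       absolute_threshold=500.0,  # cm/s, from the conclusion
                       relative_rise=1.25):       # illustrative margin only
    """Flag a follow-up in-stent SMA PSV as suspicious for in-stent stenosis
    if it approaches the absolute threshold or rises well above the patient's
    early post-stent baseline."""
    return (psv_cm_s >= absolute_threshold or
            psv_cm_s > relative_rise * baseline_psv_cm_s)

# E.g., a PSV of 505 cm/s (the mean before reintervention in the series)
# is flagged; 360 cm/s against a 340 cm/s baseline is not.
print(suspicious_for_iss(505, 340), suspicious_for_iss(360, 340))
```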
  •
    ABSTRACT: We sought to determine if complete pathological necrosis (pathCR) predicts favorable oncological outcome in soft tissue sarcoma (STS) patients receiving pre-operative radiation monotherapy (RT). We evaluated 30 patients with primary STS treated with neoadjuvant RT followed by definitive resection, from 2000 to 2010 at our institution. We defined ≥95% tumor necrosis as pathCR. There were 22 STS of the extremities (73%), 7 of the retroperitoneum (23%), and 1 (4%) of the trunk. The median pathological percentage of tumor necrosis was 35% (range 5-100%), with three tumors (10%) demonstrating pathCR. With a median follow-up of 40 months, the 5-year local recurrence-free survival (LRFS), distant recurrence-free survival (DRFS), and overall survival (OS) for the entire cohort were 100%, 61% ± 11%, and 69% ± 11%, respectively. Among patients with pathCR, 3-year DRFS was 100% compared to 63% ± 11% in patients without pathCR (p = 0.28). Following neoadjuvant RT for STS, pathCR is associated with a clinically but not statistically significant 37% improvement in 3-year DRFS.
    Anticancer research 09/2012; 32(9):3911-5. · 1.71 Impact Factor
  • Source
    ABSTRACT: Opportunistic oral infections can be found in over 80% of HIV+ patients, often causing debilitating lesions that also contribute to deterioration in nutritional health. Although appreciation for the role that the microbiota is likely to play in the initiation and/or enhancement of oral infections has grown considerably in recent years, little is known about the impact of HIV infection on host-microbe interactions within the oral cavity. In the current study, we characterize modulations in the bacterial composition of the lingual microbiome in patients with treated and untreated HIV infection. Bacterial species profiles were elucidated by microarray assay and compared between untreated HIV infected patients, HIV infected patients receiving antiretroviral therapy, and healthy HIV negative controls. The relationship between clinical parameters (viral burden and CD4+ T cell depletion) and the loss or gain of bacterial species was evaluated in each HIV patient group. In untreated HIV infection, elevated viremia was associated with significantly higher proportions of potentially pathogenic Veillonella, Prevotella, Megasphaera, and Campylobacter species in the lingual microbiome than observed in healthy controls. The upsurge in the prevalence of potential pathogens was juxtaposed by diminished representation of commensal Streptococcus and Veillonella species. Colonization of Neisseria flavescens was lower in the lingual microbiome of HIV infected patients receiving antiretroviral therapy than in uninfected controls. Our findings provide novel insights into the potential impact of HIV infection and antiretroviral therapy on the community structure of the oral microbiome, and implicate potential mechanisms that may increase the capacity of non-commensal species to gain a stronger foothold.
    BMC Microbiology 07/2012; 12:153. · 3.10 Impact Factor
  •
    ABSTRACT: There has been scant attention to predictors of sexual dysfunction in women who have sex with women (WSW). We aimed to investigate factors associated with high risk of sexual dysfunction in an Internet cohort of WSW. A modified version of the Female Sexual Function Index (FSFI) was used to quantify each subject's sexual function. WSW were invited to participate in an Internet-based survey by invitations posted on e-mail listservs and on social media sites catering to WSW. Ethnodemographic, health status, and sexual/relationship data were collected. The study was completed by 2,433 adult women. Of these, 1,566 participants had complete data on the FSFI and comprised the study cohort; 388 (24.8%) met the FSFI criteria for high risk of female sexual dysfunction (HRFSD). On multivariable analysis, the following variables were found to be independently associated with HRFSD: moderate or severe subjective bother regarding sexual function (OR 4.8, 95% CI 3.0-7.9 and OR 13.7, 95% CI 7.5-25.1, respectively), overactive bladder (OAB) (OR 2.1, 95% CI 1.0-4.5), and having a nonfemale or no partner (OR 2.3, 95% CI 1.1-4.7 and OR 3.2, 95% CI 2.0-5.2, respectively). A history of pregnancy was associated with lower odds of HRFSD (OR 0.57, 95% CI 0.37-0.87). Mean FSFI domain scores for all domains except desire were negatively impacted by partner factors and OAB. A single-item question on sexual bother is strongly predictive of potentially distressing sexual problems in WSW. A number of health and social factors are associated with risk of sexual problems in WSW. Assessment of sexual well-being in WSW is a priority for practicing healthcare providers.
    Journal of Sexual Medicine 02/2012; 9(5):1261-71. · 3.51 Impact Factor
  •
    ABSTRACT: The Framingham risk score predicts a patient's 10-year risk of developing cardiovascular disease. Many risk factors included in its calculation influence or are influenced by circulating testosterone. To investigate the possible association between testosterone and cardiovascular risk, as defined by the Framingham score, a Veterans Affairs (VA) database was analyzed. A retrospective chart review was performed. Inclusion criteria were male sex and age ≥ 20 years. Exclusion criteria included pre-existing cardiovascular disease, stroke, and diabetes. Data were collected on veterans who had total plasma testosterone checked in the year 2008. The study included 1,479 patients (mean age 61 years). Framingham score was negatively associated with both total testosterone (p < 0.0001) and free testosterone (p = 0.0003). There was a positive association between total testosterone and high-density lipoprotein and negative associations between total testosterone and body mass index (BMI), total cholesterol, triglycerides, and blood pressure medication use. Free testosterone was positively associated with total cholesterol, low-density lipoprotein, and current smoking status and negatively associated with age, BMI, and blood pressure medication use. The BMI was not associated with Framingham score. Lower plasma testosterone may suggest the presence of cardiovascular risk factors and potentially increased risk for heart disease.
    The Aging Male 02/2012; 15(3):134-9. · 1.71 Impact Factor
  •
    ABSTRACT: To describe persistent nephrographic patterns detected by unenhanced renal CT at 24 h after cardiac catheterisation and intervention. This prospective study was Health Insurance Portability and Accountability Act-compliant and institutional review board approved. Twenty-nine patients (20 men, nine women; mean age 63.3 years, range 41-85) agreed to undergo unenhanced dual-energy computed tomography (CT) limited to their kidneys at 24 h after cardiac catheterisation. CT attenuation values (Hounsfield units) were measured in the cortical and medullary regions, and single-kidney total parenchymal iodine values (milligrams) were measured. Spearman's rank correlation coefficient and a two-sided Fisher's exact test were used for statistical analysis. Focal nephrograms were observed in at least one kidney (range, one to five regions per kidney) in 10/29 (34%) of patients and bilateral global nephrograms in 13/29 (45%) of patients. Focal nephrograms correlated with cardiac catheterisation fluoroscopic time (r = 0.48; P = 0.0087). For global nephrograms, the total iodine content of the right and left kidneys correlated with fluoroscopic time (r = 0.79 and r = 0.76; P < 0.0001, respectively) and with the amount of contrast material (CM) used (r = 0.77 and r = 0.74; P < 0.0001, respectively). Persistent focal and global nephrograms occur commonly as assessed by non-contrast CT at 24 h post cardiac catheterisation, and our observations suggest they could be related to procedural factors.
    Insights into Imaging 02/2012; 3(1):49-60.
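The correlations reported above (e.g. r = 0.79 between total iodine content and fluoroscopic time) are Spearman rank correlations: the Pearson correlation computed on the rank vectors of the two variables, with ties assigned average ranks. A dependency-free sketch, on illustrative data rather than the study's measurements, looks like:

```python
def rank(xs):
    """Assign ranks 1..n, giving tied values their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of tied values starting at position i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of positions, converted to 1-based rank
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    return pearson(rank(x), rank(y))
```

Because it operates on ranks, the coefficient measures monotonic rather than strictly linear association, which is why it is the usual choice for skewed measurements such as fluoroscopic times and iodine loads. (The p-values quoted in the abstract would require a separate significance test, not shown here.)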

Publication Stats

332 Citations
159.92 Total Impact Points


  • 2008–2014
    • University of California, Davis
      • Department of Public Health Sciences
      Davis, California, United States
  • 2013
    • Peking Union Medical College Hospital
      Beijing, China
    • University of Washington Seattle
      • Department of Radiology
      Seattle, Washington, United States
  • 2010–2012
    • California State University, Sacramento
      Sacramento, California, United States