Browsing by Subject "Clinical research"
Now showing 1 - 12 of 12
Item Atrial fibrillation and the risk of sudden cardiac death: the atherosclerosis risk in communities (ARIC) study and cardiovascular health study (CHS) (2012-05) Chen, Lin Yee
Objective: It is unknown whether atrial fibrillation (AF) is associated with an increased risk of sudden cardiac death (SCD) in the general population. The objective of this study was to examine the association between incident AF and SCD in 2 population-based cohorts. Research Design & Methods: In the Atherosclerosis Risk in Communities (ARIC) Study, we analyzed data from 15,439 participants (aged 45-64 years at baseline, 55% women, and 27% black) obtained from baseline (1987-1989) through December 31, 2001. In the Cardiovascular Health Study (CHS), we analyzed data from 5,479 participants (aged ≥65 years at baseline, 58% women, and 15% black) obtained from baseline (first cohort, 1989-1990; second cohort, 1992-1993) through December 31, 2006. SCD was physician-adjudicated and was defined as a sudden, pulseless condition presumed due to a ventricular tachyarrhythmia in a previously stable individual. We used multivariable Cox proportional hazards models to assess the association between AF and SCD, adjusting for age, sex, race, field center, and baseline cardiovascular risk factors. Results: In ARIC, 894 incident AF and 269 SCD events occurred during follow-up (median, 13.1 years). In participants with and without AF, the crude incidence rates of SCD were 2.89 and 1.30 per 1,000 person-years, respectively. The multivariable hazard ratio (HR) (95% confidence interval [CI]) of AF for SCD was 3.26 (2.17-4.91), P < .001. In CHS, 1,458 incident AF and 292 SCD events occurred during follow-up (median, 13.1 years). In participants with and without AF, the incidence rates of SCD were 12.00 and 3.82 per 1,000 person-years, respectively. The multivariable HR (95% CI) of AF for SCD was 2.34 (1.76-2.91), P < .001. Conclusions: Incident AF is associated with an increased risk of SCD in the general population.
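The crude incidence rates above follow from simple person-time arithmetic; a minimal sketch (the helper name is illustrative, and the crude rate ratio below uses only the ARIC rates reported in the abstract, not the underlying data):

```python
def rate_per_1000_py(events, person_years):
    """Crude incidence rate per 1,000 person-years of follow-up."""
    return 1000 * events / person_years

# Example: 13 events over 10,000 person-years -> 1.3 per 1,000 PY.
example_rate = rate_per_1000_py(13, 10000)

# Crude rate ratio from the ARIC rates reported above
# (2.89 vs. 1.30 SCD events per 1,000 person-years).
rate_ratio = 2.89 / 1.30  # ~2.22, vs. the adjusted HR of 3.26
```

Note that the crude ratio differs from the multivariable HR, which additionally adjusts for age, sex, race, field center, and baseline risk factors.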
Additional research to identify predictors of SCD in patients with AF is warranted.

Item CD16xCD33 bispecific killer cell engager (BiKE) activates natural killer (NK) cells from myelodysplastic syndrome (MDS) patients against primary MDS and myeloid-derived suppressor cell (MDSC) CD33-positive targets (2014-11) Gleason, Michelle Kathleen
Myelodysplastic syndromes (MDS) are stem cell disorders that can progress to acute myeloid leukemia (AML). While hematopoietic cell transplantation (HCT) can be curative, additional therapies are needed for a disease that disproportionately afflicts the elderly. We tested the ability of a CD16xCD33 bispecific killer cell engager (BiKE) to induce natural killer (NK) cell function in 67 MDS patients. Compared to age-matched normal controls, CD7+ lymphocytes, NK cells, and CD16 expression were markedly decreased in MDS patients. Despite this, reverse antibody-dependent cell-mediated cytotoxicity (R-ADCC) assays showed potent degranulation and cytokine production when resting MDS-NK cells were triggered with an agonistic CD16 mAb. Blood and marrow MDS-NK cells treated with BiKE showed significantly enhanced degranulation and TNF-alpha and IFN-gamma production against HL-60 and endogenous CD33+ MDS targets. MDS patients had a significantly increased proportion of immunosuppressive CD33+ myeloid-derived suppressor cells (MDSC), which negatively correlated with MDS lymphocyte populations and CD16 loss on NK cells. Treatment with the CD16xCD33 BiKE successfully reversed MDSC immunosuppression of NK cells and induced MDSC target-cell lysis. Lastly, the BiKE induced optimal MDS-NK cell function irrespective of disease stage. Our data suggest that the CD16xCD33 BiKE functions against both CD33+ MDS and MDSC targets and may be therapeutically beneficial for MDS patients.

Item “Child height and the risk of young-adult obesity” (2010-12) Stovitz, Steven David
BACKGROUND: Childhood obesity is a major risk factor for adult obesity.
Our research aim was to evaluate whether child height alters the longitudinal tracking of clinically recommended overweight categories from childhood into young adulthood. METHODS: A multicenter prospective cohort study of subjects assessed in both 3rd grade and 12th grade (n = 2,802). Main exposures were CDC childhood body mass index (BMI) categories and height quartiles from 3rd-grade measurements. The main outcome measure was CDC adult BMI category from 12th-grade measurements. Associations between childhood height quartiles, childhood BMI categories, and adult BMI categories were assessed using chi-square tests and logistic regression models. RESULTS: Overall, 79% of overweight children remained overweight as young adults. Among children who were overweight or obese, the probability of becoming an overweight or obese young adult was 85% for children in the top quartile of height and 67% for children in the bottom quartile of height (p = 0.007). Among children who were normal weight, the probability of becoming an overweight or obese young adult was 25% for children in the top height quartile vs. 17% for children in the bottom height quartile (p = 0.003). CONCLUSIONS: When clinicians classify children by BMI categories and counsel about the risk for future obesity, they should recognize that greater height is not protective. Rather, greater height may be a marker for increased risk of adult overweight and obesity.

Item “Democratizing” clinical research? Efficiency and inclusiveness in an electronic primary care research network (2010-06) Hudson, Brenda L.
This dissertation is a critical ethnography and rhetorical study of the development of an electronic network designed to advance medical research and improve health.
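The child-height analysis above relies on chi-square tests comparing proportions between height quartiles; a minimal pure-Python sketch (the cell counts below are hypothetical, since the abstract reports only percentages and p-values):

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        exp = row * col / n  # expected count under independence
        stat += (obs - exp) ** 2 / exp
    return stat

def p_value_df1(stat):
    """Upper-tail p-value for chi-square with 1 degree of freedom."""
    return math.erfc(math.sqrt(stat / 2))

# Hypothetical counts (not from the study): 85 of 100 vs. 67 of 100
# overweight children becoming overweight adults, by height quartile.
stat = chi2_2x2(85, 15, 67, 33)
p = p_value_df1(stat)
```

For one degree of freedom, the chi-square upper tail equals the two-sided normal tail at sqrt(stat), which is what `math.erfc` computes here.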
Specifically, this study focuses on the network's social and technological affordances of efficiency and inclusiveness to connect communities of primary care providers and clinical researchers to both expand participation in and expedite the research process. By examining the network's technical elements aligned with its social context, the assumptions that influence the choice of technologies, and the network's subsequent design, Brenda L. Hudson explores the network's hierarchical structure and potential democratizing capabilities in clinical research. Through field notes, interviews, and textual analysis, Hudson provides a micro-level examination of the electronic network's development and technical affordances during the program's three-year funded contract. An ethnographic narrative describes how the group functions as a "community of practice" to create a network linking primary care practices with clinical research. Further, Hudson provides a macro-level examination that draws on critical theories of technology and explores to what extent the network might serve as a "democratic" technology through its involvement of previously unprivileged populations in clinical research--primary care providers and patients. Results indicate that assumptions of efficiency and inclusiveness in clinical research--and specifically in the network's technical affordances--provide potential benefits to patients' health by widening the pool of researchers and participants and streamlining the recruitment process. However, manifest in this electronic network, these assumptions also pose potential risks and ethical challenges surrounding private health information and "therapeutic misconception," whereby a research participant believes that enrolling in a research study will provide direct therapeutic benefit. 
Further results indicate that although the development team has done much to ensure "democratic" development and use of the technology by operating as a "community of practice," there exist unintentional asymmetrical hierarchies of who controls and uses the network, favoring primary care providers and practices already engaged in clinical research.

Item Environmental contamination in households of patients with recurrent Clostridium difficile infection (2014-09) Shaughnessy, Megan Kosel
Background: Recurrent Clostridium difficile infection (R-CDI) is common and difficult to treat, potentially necessitating fecal microbiota transplantation (FMT). Although C. difficile spores can persist in the hospital environment and cause infection, little is known about their potential presence in the household environment. Methods: Households of R-CDI subjects in the peri-FMT period, and of geographically and age-matched controls, were analyzed for the presence of C. difficile. Household environmental surfaces and fecal samples from humans and pets in the household were examined. Post-FMT subject households were also examined (environmental surfaces only). Participants were surveyed regarding their personal history and household cleaning habits. Environmental and fecal samples were cultured for C. difficile. Species identity and molecular characteristics of presumptive C. difficile isolates were determined using the PRO kit (Remel, USA), Gram staining, PCR, toxinotyping, tcdC gene sequencing, and pulsed-field gel electrophoresis (PFGE). Results: Environmental cultures detected C. difficile on ≥1 surface in 8/8 (100%) peri-FMT households vs. 3/8 (38%) post-FMT households and 3/8 (38%) control households (P = 0.025). The most common C. difficile-positive surfaces were the vacuum (11/27, 41%), toilet (8/30, 27%), and bathroom sink (5/29, 17%). C. difficile was detected in 3/36 (8%) fecal samples (2 R-CDI subjects, 1 household member). Nine (90%) of 10 households with multiple C. difficile-positive samples each had a single genotype present. Conclusions: C. difficile was found in the household environment of R-CDI patients. Whether this is a cause or consequence of R-CDI is unknown. If household contamination leads to R-CDI, effective decontamination may be protective.

Item Markers of Endothelial Function in Heart Transplant Recipients and Associations with Cardiac Allograft Vasculopathy (2011-01) Colvin, Monica Mechele
Cardiac allograft vasculopathy (CAV) is a major limitation to long-term survival in heart transplant recipients (HTR), accounting for almost 30% of deaths after 5 years. Early non-invasive detection remains a major challenge due to the insensitivity and invasiveness of current diagnostic tests. Small-vessel disease and endothelial dysfunction are key players in the pathophysiology of CAV. We hypothesized that in HTR there is an impairment of endothelial cellular repair and changes in arterial elasticity, especially in small arteries, resulting in a reduction of small artery elasticity (SAE), an increase in the number of circulating endothelial cells (CEC), and increased CEC activation, and that these changes are pronounced in HTR who develop CAV. Methods: Ninety-seven HTR and 22 normal controls were included in this study. SAE was measured from the radial artery. CEC (CD146+ cells) were enumerated and assessed for activation based on VCAM expression. Continuous variables were analyzed using t-tests and dichotomous variables using chi-square tests. Logistic regression with stepwise selection was performed to evaluate determinants of CEC, CEC activation, and SAE. Results: The median age was 61 years (range, 18-76). The mean duration of transplant was 5.4 ± 5.3 years. 77% were male and 57% had CAV. HT was associated with significantly lower SAE (p < 0.0001) and increased CEC activation (p = 0.0004) compared with healthy controls. We also found that CAV was significantly associated with SAE and CEC (p = 0.04 and 0.01, respectively).
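The continuous comparisons above (e.g., SAE between transplant recipients and controls) were analyzed with t-tests; a minimal sketch of Welch's t statistic, which does not assume equal variances (the sample values below are hypothetical, not study data):

```python
from statistics import mean, variance

def welch_t(x, y):
    """Welch's t statistic for two independent samples (unequal variances)."""
    nx, ny = len(x), len(y)
    se2 = variance(x) / nx + variance(y) / ny  # squared standard error
    return (mean(x) - mean(y)) / se2 ** 0.5

# Hypothetical small-artery-elasticity values (ml/mmHg x 100), not study data.
htr      = [3.1, 2.8, 4.0, 3.5, 2.6, 3.2]
controls = [6.2, 5.8, 7.1, 6.5, 5.9, 6.8]
t_stat = welch_t(htr, controls)  # strongly negative: lower SAE in HTR
```

A p-value would then come from the t distribution with Welch-Satterthwaite degrees of freedom, omitted here for brevity.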
On stepwise regression, hypertension treatment and duration of transplant were associated with CAV. Conclusion: Heart transplantation is characterized by endothelial activation and dysfunction, as evidenced by reduced SAE and increased CEC activation. Prospective studies are needed to evaluate these markers as predictors of risk.

Item Nicotine exposure and subject response following use of two smokeless, spitless tobacco products and medicinal nicotine (2012-12) Kotlyar, Michael
Introduction: Tobacco products that can be used discreetly are being introduced into the marketplace with limited information currently available about them. Methods: Eleven smokeless tobacco users used either Camel Snus, Taboka, or a nicotine lozenge at each of three sessions for 30 minutes. Nicotine concentrations were measured over 90 minutes, and subjective measures were assessed during product use. Results: Significant differences were found among products in Cmax (significantly higher during nicotine lozenge and Camel Snus use than during Taboka use) and in the 90-minute AUC (significantly higher during nicotine lozenge use than during Camel Snus use, which in turn was significantly higher than during Taboka use). Declines in craving and withdrawal symptoms did not differ among products, and few differences were seen on measures of product effects or liking. Conclusion: Based on current data, smokers interested in switching to less harmful products should be encouraged to use medicinal nicotine.

Item Pain and root canal therapy: exploring their relationships within the DPBRN. (2012-01) Nixdorf, Donald Robert
Introduction: Root canal therapy is a commonly employed and effective dental treatment often used to treat intraoral pain. Unfortunately, some patients experience severe pain post-operatively, and it is not clear why.
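The 90-minute AUC in the nicotine study above is a standard pharmacokinetic summary, ordinarily computed with the trapezoid rule over the concentration-time curve; a minimal sketch (the times and concentrations below are hypothetical, not study data):

```python
def auc_trapezoid(times, conc):
    """Area under the concentration-time curve by the trapezoid rule."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for t1, t2, c1, c2 in zip(times, times[1:], conc, conc[1:]))

# Hypothetical nicotine concentrations (ng/ml) sampled over 0-90 minutes.
t = [0, 15, 30, 45, 60, 90]
c = [0.0, 6.0, 10.0, 8.0, 6.0, 4.0]
auc_0_90 = auc_trapezoid(t, c)  # in ng*min/ml over the 90-minute window
```

Cmax, by contrast, is simply the maximum observed concentration, `max(c)`.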
Methods: Using a prospective observational study design within the Dental Practice-Based Research Network (DPBRN), we enrolled patients presenting for initial orthograde root canal therapy. Patients and dentists completed questionnaires before treatment, immediately after, and at 1 week following treatment. Descriptive statistics were used to assess study variables. Results: Over 6 months, 708 subjects were enrolled within the practices of 62 dentists (46 generalists and 16 endodontists) typical of those in the U.S. At baseline, 79% of patients were experiencing pain, with an average intensity of 5/10 (SD: 2.8), and 63% reported that their pain interfered with daily activities. Necrosis was the most common pulpal diagnosis (49%), while symptomatic apical periodontitis was the most common apical diagnosis (39%). Widespread pain was reported by 29% of patients. Within the 1-week post-operative period, about 16% of patients reported severe dental pain (≥7/10) and 6% reported severe pain and swelling. Conclusions: Patients presenting for initial orthograde root canal therapy have a significant amount of pain and pain-related interference with daily life. Severe postoperative pain was reported by almost 1 in 6 patients treated.

Item Predictive factors of pathological complete response after long-course neoadjuvant chemoradiation therapy for rectal cancer (2012-05) Wallin, Ulrik George
Background: Preoperative chemoradiation therapy (CRT) in patients with rectal cancer results in pathologic complete response (pCR) in about 10-30% of patients. Predictive factors for obtaining pCR may influence the selection of patients for this therapy. The aim of this study was to evaluate the impact of tumor size, stage, location, circumferential extent, patient characteristics, and pretreatment CEA levels on the development of pCR after CRT.
Methods: 530 patients treated with preoperative CRT and radical surgery for rectal adenocarcinoma between 1998 and 2011 were identified. A total of 469 patients remained after excluding patients with a history of pelvic radiation (n=2), previous transanal endoscopic microsurgery or polypectomy of the primary lesion (n=15), concurrent malignant tumor (n=14), or no information about pre- or post-treatment T stage in the chart (n=30). The clinical tumor stage and size were assessed by endorectal ultrasound (90%), MRI (10%), pelvic CT (5%), flexible sigmoidoscopy/colonoscopy (100%), chest X-ray (100%), and PET-CT (1%). Pathologic complete response was defined as the absence of viable tumor cells in the rectal wall and in any of the resected lymph nodes. CRT consisted of a 5-fluorouracil-based regimen and external beam radiation with a mean dose of 50 Gy given over a mean of 5.7 weeks. Mean time between completion of CRT and surgery was 8.7 weeks (SD ±7.4). Results: Ninety-six patients (20%) were found to have pCR in the operative specimen. Low pretreatment CEA (3.4 vs. 9.6 ng/ml; p<0.008) and smaller mean tumor size (4.2 vs. 4.7 cm; p<0.02) were significantly associated with pCR, but only low CEA level remained a significant predictor of pCR in the multivariate analysis. When stratifying by smoking status, low CEA was significantly associated with pCR only in the group of non-smokers (p=0.02). Conclusions: The current study indicates that non-smoking rectal cancer patients with a low CEA level have an improved chance of obtaining pCR after CRT. Further studies are necessary to determine whether a low CEA level can aid in identifying patients who are candidates for close follow-up rather than surgery after CRT.

Item Predictors of survival in dialysis patients in the United States after acute myocardial infarction. (2010-12) Riad, Samy Magdy
Background: Acute myocardial infarction (AMI) in dialysis patients continues to be associated with poor survival.
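The cohort accounting in the rectal-cancer study above can be checked directly from the exclusion counts reported in the abstract; a minimal sketch:

```python
# Exclusion counts as reported in the abstract above.
enrolled = 530
excluded = {"prior pelvic radiation": 2,
            "prior TEM or polypectomy of the primary lesion": 15,
            "concurrent malignant tumor": 14,
            "missing pre-/post-treatment T stage": 30}

analyzed = enrolled - sum(excluded.values())   # 469, matching the abstract
pcr_rate = 96 / analyzed                       # ~20%, matching the abstract
```

This kind of arithmetic check is a quick way to confirm that a study's flow diagram is internally consistent.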
This study aimed to identify predictors of survival in dialysis patients prior to AMI and to examine the association between survival and different revascularization techniques. Methods and Results: 3,049 US prevalent dialysis patients hospitalized for AMI between April 1, 1998, and June 30, 2000, were identified by cross-matching the United States Renal Data System (USRDS) database and the Third National Registry of Myocardial Infarction (NRMI 3). Of the 3,011 data abstraction forms, 1,696 were suitable for analysis. Mean age was 67.0 ± 11.9 years and average dialysis duration was 2.8 ± 3.2 years. Of the cohort, 69% were white and 47% were women. Diabetes and dysrhythmia were present in 72.5% and 65.5%, respectively; these two conditions were used for patient stratification. At 1 year post-AMI, 62% of the cohort had died. The impact of independent predictors on survival was examined in a Cox proportional hazards model. Beta-blocker use was associated with improved 1-year all-cause mortality (hazard ratio [HR] 0.8, P = 0.003). Compared with dialysis via catheter, fistula use was associated with a favorable outcome (HR = 0.75, P = 0.0047), as was graft use (HR = 0.8, P = 0.0054). Compared with predialysis systolic blood pressure of 120-179 mmHg, values < 120 mmHg were more hazardous (HR = 1.46, P ≤ 0.0001) and values ≥ 180 mmHg less hazardous (HR = 0.7, P = 0.004). Coronary artery bypass graft surgery (CABG) and percutaneous coronary intervention (PCI) within 30 days of AMI were examined in time-dependent Cox models. CABG was not significantly associated with improved survival (HR = 0.87, P = 0.35), while PCI showed a strong protective association (HR = 0.67, P = 0.0005). Conclusion: Beta-blocker use prior to AMI, vascular access via fistula or graft, and PCI within 30 days of AMI are associated with improved one-year survival in dialysis patients. Optimal target blood pressure in dialysis patients remains controversial.
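The 1-year mortality figure above comes from time-to-event analysis; a minimal sketch of the Kaplan-Meier estimator, the nonparametric counterpart of the Cox models used in the study (toy follow-up data, not the USRDS cohort):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times  -- follow-up times sorted in ascending order
    events -- 1 for death at that time, 0 for censoring
    Returns (time, S(t)) pairs at each death time.
    """
    at_risk = len(times)
    surv, out = 1.0, []
    for t, d in zip(times, events):
        if d:
            surv *= 1 - 1 / at_risk  # survival drops at each death
            out.append((t, surv))
        at_risk -= 1                 # deaths and censorings leave the risk set
    return out

# Toy follow-up times (months): 1 = death, 0 = censored alive.
times  = [2, 3, 5, 8, 12, 12]
events = [1, 1, 0, 1, 0, 0]
curve = kaplan_meier(times, events)
```

A Cox model then relates covariates (e.g., beta-blocker use, access type) to the hazard underlying such curves.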
Validation of these observational data by randomized clinical trials is needed, as the impact of selection bias and unknown confounders may not be accounted for in our study.

Item Quantitative digital assessment of periapical healing. (2011-08) Wiswall, Jeffrey Herbert
Abstract summary not available.

Item Trends in the use of implantable accelerated partial breast irradiation therapy for early stage breast cancer in the United States. (2012-02) Abbott, Andrea Marie
Background: In 2002 the Food and Drug Administration approved an implantable balloon catheter that delivers accelerated partial breast irradiation (APBI) directly to the tumor bed and surrounding area after breast-conserving surgery (BCS). Our aim was to determine the use of implantable APBI (IAPBI) in the United States and the patient and tumor factors associated with IAPBI use. Methods: Using the Surveillance, Epidemiology, and End Results database, we conducted a retrospective analysis of patients who received whole breast radiation therapy (WBRT) or IAPBI after BCS for ductal carcinoma in situ or stage I or II breast cancer from 2000 to 2007. We determined WBRT and IAPBI rates across time and across demographic and tumor factors using chi-square tests and Cochran-Armitage tests for trend in our unadjusted analyses, and used logistic regression for our multivariate analysis to adjust for potential confounders. Results: We identified 127,257 patients who met the inclusion criteria. Over the study period, the proportion of patients receiving IAPBI increased by 1600% (2000: 0.4%; 2007: 6.8%; p < 0.001). This trend remained significant in the logistic regression (OR = 20.3, 95% CI 15.5 to 26.6). The increase in IAPBI use was statistically significant across all stages and all age categories over 40 (p < 0.001). The use of IAPBI was most notable in older women (70-79 years), with a > 2100% increase in use during the study period (2000: 0.4%; 2007: 9.0%; p < 0.001).
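The percent increases reported in the breast-irradiation study above follow from simple relative-change arithmetic on the start- and end-of-period usage rates; a minimal sketch:

```python
def percent_increase(old, new):
    """Relative increase, in percent, from an old value to a new value."""
    return 100 * (new - old) / old

overall = percent_increase(0.4, 6.8)  # overall IAPBI use, 2000 -> 2007
older   = percent_increase(0.4, 9.0)  # women aged 70-79, 2000 -> 2007
```

Because the 2000 baseline is so small (0.4%), even a modest absolute rise in use produces a very large relative increase.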
We also found significant variation in IAPBI use by region. Conclusions: IAPBI use has increased markedly since 2000, particularly in the elderly population. The rapid and widespread adoption of IAPBI is concerning because large multicenter randomized controlled trials have not yet demonstrated the long-term effectiveness of IAPBI compared with WBRT.