Results 1 - 20 of 71
1.
Clin Chem ; 2023 May 26.
Article in English | MEDLINE | ID: mdl-37232052

ABSTRACT

BACKGROUND: Fecal immunochemical tests (FITs) are widely used for colorectal cancer (CRC) screening; however, high ambient temperatures were found to reduce test accuracy. More recently, proprietary globin stabilizers were added to FIT sample buffers to prevent temperature-associated hemoglobin (Hb) degradation, but their effectiveness remains uncertain. We aimed to determine the impact of high temperature (>30°C) on Hb concentration in current OC-Sensor FITs, characterize FIT temperatures during mail transit, and determine the impact of ambient temperature on FIT Hb concentration using data from a CRC screening program. METHODS: FITs were analyzed for Hb concentration after in vitro incubation at different temperatures. Data loggers packaged alongside FITs measured temperatures during mail transit. Separately, screening program participants completed and mailed FITs to the laboratory for Hb analysis. Regression analyses compared the impact of environmental variables on FIT temperatures and separately on FIT sample Hb concentration. RESULTS: In vitro incubation at 30 to 35°C reduced FIT Hb concentration after >4 days. During mail transit, maximum FIT temperature averaged 6.4°C above maximum ambient temperature, but exposure to temperatures above 30°C lasted less than 24 hours. Screening program data showed no association between FIT Hb concentration and maximum ambient temperatures. CONCLUSIONS: Although FIT samples are exposed to elevated temperatures during mail transit, this exposure is brief and does not significantly reduce FIT Hb concentration. These data support continuation of CRC screening during warm weather with modern FITs containing a stabilizing agent when mail delivery is ≤4 days.

2.
J Clin Microbiol ; 59(2)2021 01 21.
Article in English | MEDLINE | ID: mdl-33177120

ABSTRACT

We evaluated the utility of the commercial Allplex genital ulcer real-time PCR multiplex assay for detecting Treponema pallidum, herpes simplex virus 1 (HSV-1) and 2 (HSV-2), and Chlamydia trachomatis serovar L (lymphogranuloma venereum [LGV]) DNA in mucosal and genital ulcers in the context of suspected syphilis. In total, 374 documented genital and mucosal ulcers from patients with and without syphilis presenting at several sexually transmitted infection (STI) centers in France from October 2010 to December 2016 were analyzed at the National Reference Center (CNR) for Bacterial STIs at Cochin Hospital in Paris. T. pallidum subsp. pallidum detection results were compared with the final diagnosis based on a combination of clinical examination, serological results, and in-house nested PCR (nPCR). HSV and LGV detection results were validated against reference methods. We found that 44.6% of the 374 samples tested were positive for T. pallidum subsp. pallidum, 21% for HSV, and 0.8% for LGV. No positive results were obtained for 30.7% of samples, and 4.8% presented coinfections. For T. pallidum subsp. pallidum detection, the overall sensitivity was 80% (95% confidence interval [CI], 76.1 to 84.1%), specificity was 98.8% (95% CI, 97.7 to 99.9%), positive predictive value was 98.8% (95% CI, 97.7 to 99.9%), and negative predictive value was 80.2% (95% CI, 76.2 to 84.2%), with a rate of concordance with the reference method of 92.5% (κ = 0.85). This PCR multiplex assay is suitable for T. pallidum subsp. pallidum detection in routine use and facilitates the simultaneous rapid detection of a broad panel of pathogens relevant in the context of suspected syphilis lesions.
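As an aside for readers checking such figures, the sensitivity, specificity, PPV, and NPV reported above all derive from a 2×2 confusion table. A minimal sketch, using hypothetical counts rather than the study's raw data (`diagnostic_metrics` is an illustrative helper, not part of the assay software):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-accuracy metrics from 2x2 confusion-table counts."""
    sensitivity = tp / (tp + fn)   # true-positive rate among diseased
    specificity = tn / (tn + fp)   # true-negative rate among non-diseased
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for illustration only
sens, spec, ppv, npv = diagnostic_metrics(tp=160, fp=2, fn=40, tn=172)
```

Note that, unlike sensitivity and specificity, PPV and NPV shift with disease prevalence in the tested population.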


Subject(s)
Syphilis , Treponema pallidum , France , Humans , Multiplex Polymerase Chain Reaction , Paris , Syphilis/diagnosis , Treponema pallidum/genetics , Ulcer
4.
Cancer Prev Res (Phila) ; 12(9): 631-640, 2019 09.
Article in English | MEDLINE | ID: mdl-31266825

ABSTRACT

Suboptimal participation is commonly observed in colorectal cancer screening programs utilizing fecal tests. This randomized controlled trial tested whether the offer of a blood test as either a "rescue" strategy for fecal test nonparticipants or an upfront choice, could improve participation. A total of 1,800 people (50-74 years) were randomized to control, rescue, or choice groups (n = 600/group). All were mailed a fecal immunochemical test (FIT, OC-Sensor, Eiken Chemical Company) and a survey assessing awareness of the screening tests. The rescue group was offered a blood test 12 weeks after FIT nonparticipation. The choice group was given the opportunity to choose to do a blood test (Colvera, Clinical Genomics) instead of FIT at baseline. Participation with any test after 24 weeks was not significantly different between groups (control, 37.8%; rescue, 36.9%; choice, 33.8%; P > 0.05). When the rescue strategy was offered after 12 weeks, an additional 6.5% participated with the blood test, which was greater than the blood test participation when offered as an upfront choice (1.5%; P < 0.001). Awareness of the tests was greater for FIT than for blood (96.2% vs. 23.1%; P < 0.0001). In a population familiar with FIT screening, provision of a blood test either as a rescue of FIT nonparticipants or as an upfront choice did not increase overall participation. This might reflect a lack of awareness of the blood test for screening compared with FIT.


Subject(s)
Blood Chemical Analysis , Colorectal Neoplasms/diagnosis , Early Detection of Cancer/methods , Occult Blood , Patient Participation/statistics & numerical data , Aged , Blood Chemical Analysis/methods , Blood Chemical Analysis/psychology , Blood Chemical Analysis/statistics & numerical data , Choice Behavior , Colorectal Neoplasms/psychology , Early Detection of Cancer/psychology , Early Detection of Cancer/statistics & numerical data , Female , Humans , Male , Mass Screening/methods , Mass Screening/psychology , Mass Screening/statistics & numerical data , Middle Aged , Patient Acceptance of Health Care/psychology , Patient Acceptance of Health Care/statistics & numerical data , Socioeconomic Factors , South Australia/epidemiology
5.
Crit Care ; 23(1): 222, 2019 Jun 18.
Article in English | MEDLINE | ID: mdl-31215498

ABSTRACT

BACKGROUND: During the initial phase of critical illness, the association between the dose of nutrition support and mortality risk may vary among patients in the intensive care unit (ICU) because the prevalence of malnutrition varies widely (28 to 78%), and not all ICU patients are severely ill. Therefore, we hypothesized that a prognostic model that integrates nutritional status and disease severity could accurately predict mortality risk and classify critically ill patients into low- and high-risk groups. Additionally, in critically ill patients placed on exclusive nutritional support (ENS), we hypothesized that their risk categories could modify the association between dose of nutrition support and mortality risk. METHODS: A prognostic model that predicts 28-day mortality was built from a prospective cohort study of 440 patients. The association between dose of nutrition support and mortality risk was evaluated in a subgroup of 252 mechanically ventilated patients via logistic regressions, stratified by low- and high-risk groups, and days of exclusive nutritional support (ENS) [short-term (≤ 6 days) vs. longer-term (≥ 7 days)]. Only the first 6 days of ENS were evaluated for a fair comparison. RESULTS: The prognostic model demonstrated good discrimination [AUC 0.78 (95% CI 0.73-0.82)], and a bias-corrected calibration curve suggested fair accuracy. In high-risk patients with short-term ENS (≤ 6 days), each 10% increase in goal energy and protein intake was associated with an increased adjusted odds (95% CI) of 28-day mortality [1.60 (1.19-2.15) and 1.47 (1.12-1.86), respectively]. In contrast, each 10% increase in goal protein intake during the first 6 days of ENS in high-risk patients with longer-term ENS (≥ 7 days) was associated with a lower adjusted odds of 28-day mortality [0.75 (0.57-0.99)]. Despite the opposing associations, the mean predicted mortality risks and prevalence of malnutrition between short- and longer-term ENS patients were similar.
CONCLUSIONS: Combining baseline nutritional status and disease severity in a prognostic model could accurately predict 28-day mortality. However, the association between the dose of nutrition support during the first 6 days of ENS and 28-day mortality was independent of baseline disease severity and nutritional status.


Subject(s)
Critical Illness/therapy , Mortality/trends , Nutritional Status , Nutritional Support/standards , Aged , Area Under Curve , Cohort Studies , Critical Illness/epidemiology , Critical Illness/mortality , Energy Intake/physiology , Female , Humans , Logistic Models , Male , Middle Aged , Nutritional Support/methods , Prognosis , Prospective Studies , ROC Curve , Severity of Illness Index , Singapore/epidemiology
6.
Dig Dis Sci ; 64(9): 2555-2562, 2019 09.
Article in English | MEDLINE | ID: mdl-30835026

ABSTRACT

BACKGROUND: Early detection and removal of precursor lesions reduce colorectal cancer morbidity and mortality. Sessile serrated adenomas/polyps (SSP) are a recognized precursor of cancer, but there are limited studies on whether current screening techniques detect this pathology. AIMS: To investigate the sensitivity of fecal immunochemical tests (FIT) and epigenetic biomarkers in blood for detection of SSP. METHODS: A prospective study offered FIT and a blood test (Colvera for methylated BCAT1 and IKZF1) to adults referred for colonoscopy. Sensitivity of FIT and the blood test were determined for four types of pathology: low-risk conventional adenoma, high-risk adenoma, SSP, and absence of neoplasia. Comparisons were made for FIT positivity at 10 and 20 µg hemoglobin (Hb)/g feces. RESULTS: One thousand eight hundred and eighty-two subjects completed FIT and underwent colonoscopy. One thousand four hundred and three were also tested for methylated BCAT1/IKZF1. The sensitivity of FIT (20 µg Hb/g feces) for SSP was 16.3%. This was lower than the sensitivity for high-risk adenomas (28.7%, p < 0.05), but no different to that for low-risk adenomas (13.1%) or no neoplasia (8.4%). A positive FIT result for SSP was not associated with demographics, morphology, concurrent pathology or intake of medications that increase bleeding risk. FIT sensitivity for SSP did not significantly increase through lowering the positivity threshold to 10 µg Hb/g feces (20.4%, p > 0.05). Sensitivity of the blood test for SSP was 8.8%, and 26.5% when combined with FIT. CONCLUSIONS: Both FIT and blood-based markers of DNA hypermethylation have low sensitivity for detection of SSP. Further development of sensitive screening tests is warranted.


Subject(s)
Adenoma/diagnosis , Colonic Neoplasms/diagnosis , Colonic Polyps/diagnosis , DNA Methylation , Early Detection of Cancer/methods , Occult Blood , Adenoma/blood , Adenoma/pathology , Adult , Aged , Aged, 80 and over , Biomarkers, Tumor/blood , Colonic Neoplasms/blood , Colonic Neoplasms/pathology , Colonic Polyps/blood , Colonic Polyps/pathology , Female , Hemoglobins/analysis , Humans , Ikaros Transcription Factor/blood , Ikaros Transcription Factor/genetics , Immunochemistry , Male , Middle Aged , Prospective Studies , Sensitivity and Specificity , Transaminases/blood , Transaminases/genetics
7.
J Gastrointest Surg ; 23(7): 1309-1317, 2019 07.
Article in English | MEDLINE | ID: mdl-30478530

ABSTRACT

PURPOSE: Endoscopic surveillance for Barrett's oesophagus is undertaken to detect dysplasia and early cancer, and to facilitate early intervention. Evidence supporting current practice is of low quality and often influenced by opinion. This study investigated the preferences of patients for surveillance of Barrett's oesophagus in an Australian cohort. METHODS: Four Barrett's oesophagus surveillance characteristics/attributes were evaluated within a discrete choice experiment based on literature and expert opinion: (1) surveillance method (endoscopy vs a blood test vs a novel breath test), (2) risk of missing a cancer over a 10-year period, (3) screening interval, and (4) out-of-pocket cost. The data from the discrete choice experiment were analysed within the framework of random utility theory using a mixed logit regression model. RESULTS: The study sample comprised patients (n = 71) undergoing endoscopic surveillance for Barrett's oesophagus, of whom n = 65 completed the discrete choice experiment. The sample was predominantly male (77%) with an average age of 65 years. All attributes except surveillance method significantly influenced respondents' preference for Barrett's oesophagus surveillance. Policy analyses suggested that, compared to the reference case (i.e. endoscopy provided annually at no upfront cost and with a 4% risk of missing cancer), increasing test sensitivity to 0.5% risk of missing cancer would increase participation by up to 50%; surveillance every 5 years would lead to a 26% reduction, while every 3 to 3.5 years would result in a 7% increase in participation. Respondents were highly averse to paying A$500 for the test, which resulted in a 48% reduction in participation. Neither of the alternative surveillance methods (blood or breath test) was preferred to endoscopy; each resulted in an 11% reduction in participation. CONCLUSION: Test sensitivity, test frequency and out-of-pocket cost were the key factors influencing surveillance uptake. Patients prefer a test with the highest sensitivity, offered frequently, that incurs no upfront costs.


Subject(s)
Barrett Esophagus/complications , Early Detection of Cancer/economics , Esophageal Neoplasms/diagnosis , Esophagoscopy/economics , Patient Preference , Aged , Australia , Barrett Esophagus/diagnosis , Breath Tests , Cohort Studies , Female , Health Care Costs , Humans , Logistic Models , Male , Middle Aged , Time Factors
8.
Ann Intensive Care ; 8(1): 98, 2018 Oct 22.
Article in English | MEDLINE | ID: mdl-30350233

ABSTRACT

BACKGROUND: The timing and dose of exclusive nutrition support (ENS) have not been investigated in previous studies aimed at validating the modified Nutrition Risk in Critically Ill (mNUTRIC) score. We therefore evaluated the mNUTRIC score by determining the association between dose of nutrition support and 28-day mortality in high-risk patients who received short- and longer-term ENS (≤ 6 days vs. ≥ 7 days). METHODS: A prospective cohort study included data from 252 adult patients with > 48 h of mechanical ventilation in a tertiary care institution in Singapore. The dose of nutrition support (amount received ÷ goal: expressed in percentage) was calculated for a maximum of 14 days. Associations between the dose of energy (and protein) intake and 28-day mortality were evaluated with multivariable Cox regressions. Since patients have different durations of ENS, only the first 6 days of ENS in patients with short- and longer-term ENS were assessed in the Cox regressions to ensure a valid comparison of the associations between energy (and protein) intake and 28-day mortality. RESULTS: In high-risk patients with short-term ENS (n = 106), each 10% increase in goal energy intake was associated with an increased hazard of 28-day mortality [adj-HR 1.37 (95% CI 1.17, 1.61)], and this was also observed for protein intake [adj-HR 1.31 (95% CI 1.10, 1.56)]. In contrast, each 10% increase in goal protein intake in high-risk patients with longer-term ENS (n = 146) was associated with a lower hazard of 28-day mortality [adj-HR 0.78 (95% CI 0.66, 0.93)]. The mean mNUTRIC scores in these two groups of patients were similar. CONCLUSION: When timing and dose of nutrition support were examined, the mNUTRIC did not differentiate high-risk patients who would derive the most benefit from nutrition support.
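The adjusted hazard ratios quoted per 10% increase in goal intake come from exponentiating a Cox regression coefficient. A minimal sketch of that relationship, assuming a hypothetical coefficient back-calculated to match the reported adj-HR of 1.37 (illustrative only, not the study's fitted model):

```python
import math

def hazard_ratio(beta, increment=1.0):
    """Hazard ratio implied by a Cox coefficient `beta` for a covariate
    change of `increment` units (here, one unit = 10% of goal intake)."""
    return math.exp(beta * increment)

# Hypothetical coefficient chosen so the per-unit HR equals the reported 1.37
beta = math.log(1.37)
hr_per_10pct = hazard_ratio(beta)       # reproduces 1.37
hr_per_20pct = hazard_ratio(beta, 2.0)  # hazard ratios multiply: 1.37**2
```

The multiplicative scaling is why a modest per-unit HR can imply a large difference in hazard across the full range of intake.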

9.
Nat Microbiol ; 3(7): 814-823, 2018 07.
Article in English | MEDLINE | ID: mdl-29946163

ABSTRACT

Stem-cell-derived organoids recapitulate in vivo physiology of their original tissues, representing valuable systems to model medical disorders such as infectious diseases. Cryptosporidium, a protozoan parasite, is a leading cause of diarrhoea and a major cause of child mortality worldwide. Drug development requires detailed knowledge of the pathophysiology of Cryptosporidium, but experimental approaches have been hindered by the lack of an optimal in vitro culture system. Here, we show that Cryptosporidium can infect epithelial organoids derived from human small intestine and lung. The parasite propagates within the organoids and completes its complex life cycle. Temporal analysis of the Cryptosporidium transcriptome during organoid infection reveals dynamic regulation of transcripts related to its life cycle. Our study presents organoids as a physiologically relevant in vitro model system to study Cryptosporidium infection.


Subject(s)
Cryptosporidiosis/genetics , Cryptosporidium/pathogenicity , Gene Expression Profiling/methods , Organoids/parasitology , Cryptosporidiosis/parasitology , Cryptosporidium/growth & development , Gene Expression Regulation , Humans , Intestine, Small/parasitology , Lung/parasitology , Models, Biological , Organ Culture Techniques , Sequence Analysis, RNA , Spatio-Temporal Analysis
10.
Eur J Cancer Prev ; 27(5): 425-432, 2018 09.
Article in English | MEDLINE | ID: mdl-28368949

ABSTRACT

Participation rates in colorectal cancer (CRC) screening programmes using faecal occult blood tests (FOBTs) are low. Nonparticipation is commonly attributed to psychosocial factors, but some medical conditions also prevent screening. These barriers might be partially overcome if a blood test for CRC screening were available. This study determined whether people who had always declined screening by FOBT would participate if offered a blood test. An audit of registrants within a personalized CRC screening programme was undertaken to determine the reasons for regular nonparticipation in FOBT. Consistent nonparticipants (n=240) were randomly selected and invited for CRC screening with a blood test. Demographic characteristics and the reasons for prior FOBT nonparticipation were collected by means of a questionnaire. Nonparticipation in the screening programme could be classified as either behavioural (8.6%), with consistent noncompliance, or due to medical contraindications (8.5%), which included chronic rectal bleeding, being deemed unsuitable by a health professional, and needing personal assistance. Blood test uptake was 25%, with participation in the medical contraindications group greater than that in the behavioural group (43 vs. 12%, P<0.001). Reported behavioural reasons for nonparticipation in faecal immunochemical testing included procrastination and dislike of the test, but these were not associated with blood test uptake (P>0.05). There is a subgroup of the community who have medical reasons for nonparticipation in CRC screening with FOBT but will participate if offered a blood test. The option of a blood test does not, however, improve uptake in those who admit to behavioural reasons for noncompliance with screening.


Subject(s)
Colorectal Neoplasms/diagnosis , Early Detection of Cancer/statistics & numerical data , Hematologic Tests/statistics & numerical data , Mass Screening/statistics & numerical data , Patient Acceptance of Health Care/statistics & numerical data , Aged , Colorectal Neoplasms/blood , Colorectal Neoplasms/prevention & control , Early Detection of Cancer/methods , Female , Humans , Male , Middle Aged , Occult Blood , Risk , Surveys and Questionnaires
11.
JPEN J Parenter Enteral Nutr ; : 148607117726060, 2017 Aug 01.
Article in English | MEDLINE | ID: mdl-28813205

ABSTRACT

BACKGROUND: This study aimed to determine the agreement between the modified Nutrition Risk in Critically ill Score (mNUTRIC) and the Subjective Global Assessment (SGA) and compare their ability in discriminating and quantifying mortality risk independently and in combination. METHODS: Between August 2015 and October 2016, all patients in a Singaporean hospital received the SGA within 48 hours of intensive care unit admission. Nutrition status was dichotomized into presence or absence of malnutrition. The mNUTRIC of patients was retrospectively calculated at the end of the study, and high mNUTRIC was defined as scores ≥5. RESULTS: There were 439 patients and 67.9% had high mNUTRIC, whereas only 28% were malnourished. Hospital mortality was 29.6%, and none was lost to follow-up. Although both tools had poor agreement (κ statistics: 0.13, P < .001), they had similar discriminative value for hospital mortality (C-statistics [95% confidence interval (CI)], 0.66 [0.62-0.70] for high mNUTRIC and 0.61 [0.56-0.66] for malnutrition, P = .12). However, a high mNUTRIC was associated with higher adjusted odds for hospital mortality compared with malnutrition (adjusted odds ratio [95% CI], 5.32 [2.15-13.17], P < .001, and 4.27 [1.03-17.71], P = .046, respectively). Combination of both tools showed malnutrition and high mNUTRIC were associated with the highest adjusted odds for hospital mortality (14.43 [5.38-38.78], P < .001). CONCLUSION: The mNUTRIC and SGA had poor agreement. Although they individually provided a fair discriminative value for hospital mortality, the combination of these approaches is a better discriminator to quantify mortality risk.
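The κ statistic used above to quantify agreement between mNUTRIC and SGA corrects observed agreement for the agreement expected by chance alone. A minimal sketch with hypothetical 2×2 counts (not the study's data):

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for two binary classifications on a 2x2 table:
    a = both positive, b = tool1+/tool2-, c = tool1-/tool2+, d = both negative."""
    n = a + b + c + d
    p_observed = (a + d) / n
    # Chance agreement from the marginal totals of each classification
    p_expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical counts for illustration only
kappa = cohens_kappa(a=50, b=10, c=10, d=30)
```

A κ near 0 (as in the abstract's 0.13) means the two tools agree little more than chance would predict, even if raw percent agreement looks respectable.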

12.
JPEN J Parenter Enteral Nutr ; 41(5): 744-758, 2017 07.
Article in English | MEDLINE | ID: mdl-26838530

ABSTRACT

Malnutrition is associated with poor clinical outcomes among hospitalized patients. However, studies linking malnutrition with poor clinical outcomes in the intensive care unit (ICU) often have conflicting findings, due in part to the inappropriate diagnosis of malnutrition. We primarily aimed to determine whether malnutrition diagnosed by validated nutrition assessment tools such as the Subjective Global Assessment (SGA) or Mini Nutritional Assessment (MNA) is independently associated with poorer clinical outcomes in the ICU, and whether the use of nutrition screening tools demonstrates a similar association. PubMed, CINAHL, Scopus, and the Cochrane Library were systematically searched for eligible studies. Search terms included synonyms of malnutrition, nutritional status, screening, assessment, and intensive care unit. Eligible studies were case-control or cohort studies that recruited adults in the ICU; conducted the SGA, MNA, or used nutrition screening tools before or within 48 hours of ICU admission; and reported the prevalence of malnutrition and relevant clinical outcomes, including mortality, length of stay (LOS), and incidence of infection (IOI). Twenty of 1168 studies were eligible. The prevalence of malnutrition ranged from 38% to 78%. Malnutrition diagnosed by nutrition assessments was independently associated with increased ICU LOS, ICU readmission, IOI, and the risk of hospital mortality. The SGA clearly had better predictive validity than the MNA. The association between malnutrition risk determined by nutrition screening and these outcomes was less consistent. Malnutrition is independently associated with poorer clinical outcomes in the ICU. Compared with nutrition assessment tools, the predictive validity of nutrition screening tools was less consistent.


Subject(s)
Intensive Care Units , Malnutrition/diagnosis , Malnutrition/epidemiology , Hospital Mortality , Humans , Length of Stay , Nutrition Assessment , Nutritional Status , Prevalence , Randomized Controlled Trials as Topic , Risk Factors , Treatment Outcome
13.
World J Surg ; 41(4): 1023-1034, 2017 Apr.
Article in English | MEDLINE | ID: mdl-27882416

ABSTRACT

BACKGROUND: Endoscopic surveillance of Barrett's esophagus (BE) is probably not cost-effective. A sub-population with BE at increased risk of high-grade dysplasia (HGD) or esophageal adenocarcinoma (EAC) who could be targeted for cost-effective surveillance was sought. METHODS: The outcome of BE surveillance from 2003 to 2012 in a structured program was reviewed. Incidence rates and incidence rate ratios for developing HGD or EAC were calculated. Risk stratification identified individuals who could be considered for exclusion from surveillance. A health-state transition Markov cohort model evaluated the cost-effectiveness of focusing on higher-risk individuals. RESULTS: During 2067 person-years of follow-up of 640 patients, 17 individuals progressed to HGD or EAC (annual IR 0.8%). Individuals with columnar-lined esophagus (CLE) ≥2 cm had an annual IR of 1.2% and a >8-fold increased relative risk of HGD or EAC compared to CLE <2 cm [IR 0.14% (IRR 8.6, 95% CI 4.5-12.8)]. Limiting the surveillance cohort after the first endoscopy to individuals with CLE ≥2 cm or dysplasia, followed by a further restriction after the second endoscopy (exclusion of patients without intestinal metaplasia), removed 296 (46%) patients and 767 (37%) person-years from surveillance. Limiting surveillance to the remaining individuals reduced the incremental cost-effectiveness ratio from US$60,858 to US$33,807 per quality-adjusted life year (QALY). Further restrictions were tested but failed to improve cost-effectiveness. CONCLUSIONS: Based on stratification of risk, the number of patients requiring surveillance can be reduced by at least a third. At a willingness-to-pay threshold of US$50,000 per QALY, surveillance of higher-risk individuals becomes cost-effective.
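The incremental cost-effectiveness ratio cited above is the difference in cost between two strategies divided by the difference in QALYs gained. A schematic calculation with illustrative figures (the study's Markov model inputs are not reproduced here):

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Illustrative figures only
ratio = icer(cost_new=120_000, cost_old=80_000, qaly_new=11.0, qaly_old=10.0)

# A strategy is deemed cost-effective when the ICER falls below the
# willingness-to-pay threshold, as the study applies at US$50,000/QALY
cost_effective = ratio <= 50_000
```

Restricting surveillance to higher-risk patients lowers the numerator (cost) faster than the denominator (QALYs), which is how the study moves the ICER below the threshold.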


Subject(s)
Barrett Esophagus/pathology , Precancerous Conditions/pathology , Risk Assessment , Watchful Waiting/economics , Aged , Aged, 80 and over , Australia , Cell Transformation, Neoplastic , Cohort Studies , Cost-Benefit Analysis , Female , Follow-Up Studies , Humans , Male , Middle Aged , Quality-Adjusted Life Years
14.
Crit Care ; 20(1): 232, 2016 08 01.
Article in English | MEDLINE | ID: mdl-27476581

ABSTRACT

BACKGROUND: The promotility agents currently available to treat gastroparesis and feed intolerance in the critically ill are limited by adverse effects. The aim of this study was to assess the pharmacodynamic effects and pharmacokinetics of single doses of the novel gastric promotility agent motilin agonist camicinal (GSK962040) in critically ill feed-intolerant patients. METHODS: A prospective, randomized, double-blind, parallel-group, placebo-controlled study was performed in mechanically ventilated feed-intolerant patients [median age 55 (19-84), 73% male, APACHE II score 18 (5-37), with a gastric residual volume ≥200 mL]. Gastric emptying and glucose absorption were measured both pre- and post-treatment after intragastric administration of 50 mg camicinal (n = 15) or placebo (n = 8) using the (13)C-octanoic acid breath test (BTt1/2), acetaminophen concentrations, and 3-O-methyl glucose concentrations, respectively. RESULTS: Following 50 mg enteral camicinal, there was a trend to accelerated gastric emptying [adjusted geometric means: pre-treatment BTt1/2 117 minutes vs. post-treatment 76 minutes; 95% confidence interval (CI) 0.39, 1.08] and increased glucose absorption [AUC240min pre-treatment: 28.63 mmol.min/L vs. post-treatment: 71.63 mmol.min/L; 95% CI 1.68, 3.72]. When two patients who did not have detectable plasma concentrations of camicinal were excluded from analysis, camicinal accelerated gastric emptying [adjusted geometric means: pre-treatment BTt1/2 121 minutes vs. post-treatment 65 minutes; 95% CI 0.32, 0.91] and increased glucose absorption [AUC240min pre-treatment: 33.04 mmol.min/L vs. post-treatment: 74.59 mmol.min/L; 95% CI 1.478, 3.449]. In those patients receiving placebo, gastric emptying was similar pre- and post-treatment. CONCLUSIONS: When absorbed, a single enteral dose of camicinal (50 mg) accelerates gastric emptying and increases glucose absorption in feed-intolerant critically ill patients.
TRIAL REGISTRATION: The study protocol was registered with the US NIH clinicaltrials.gov on 23 December 2009 (Identifier NCT01039805 ).


Subject(s)
Feeding and Eating Disorders/drug therapy , Gastric Emptying/drug effects , Gastrointestinal Motility/drug effects , Glucose/analysis , Piperazines/pharmacology , Piperidines/pharmacology , Adult , Aged , Aged, 80 and over , Critical Illness/therapy , Double-Blind Method , Enteral Nutrition/methods , Enteral Nutrition/standards , Female , Gastric Absorption/drug effects , Humans , Intensive Care Units/organization & administration , Male , Middle Aged , Piperazines/therapeutic use , Piperidines/therapeutic use , Placebos , Prospective Studies , South Australia
15.
J Gastroenterol Hepatol ; 31(2): 294-301, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26114968

ABSTRACT

BACKGROUND AND AIM: Percutaneous thermal ablation using radiofrequency ablation (RFA) and microwave ablation (MWA) are both widely available curative treatments for hepatocellular carcinoma. Despite significant advances, it remains unclear which modality results in better outcomes. This meta-analysis of randomized controlled trials (RCT) and observational studies was undertaken to compare the techniques in terms of effectiveness and safety. METHODS: Electronic reference databases (Medline, EMBASE and Cochrane Central) were searched between January 1980 and May 2014 for human studies comparing RFA and MWA. The primary outcome was the risk of local tumor progression (LTP). Secondary outcomes were complete ablation (CA), overall survival, and major adverse events (AE). The ORs were combined across studies using the random-effects model. RESULTS: Ten studies (two prospective and eight retrospective) were included, and the overall LTP rate was 13.6% (176/1298). There was no difference in LTP rates between RFA and MWA [OR (95% CI): 1.01(0.67-1.50), P = 0.9]. The CA rate, 1- and 3-year overall survival and major AE were similar between the two modalities (P > 0.05 for all). In subgroup analysis, there was no difference in LTP rates according to study quality, but LTP rates were lower with MWA for treatment of larger tumors [1.88(1.10-3.23), P = 0.02]. There was no significant publication bias or inter-study heterogeneity (I(2) < 50% and P > 0.1) observed in any of the measured outcomes. CONCLUSION: Overall, both RFA and MWA are equally effective and safe, but MWA may be more effective compared to RFA in preventing LTP when treating larger tumors. Well-designed, larger, multicentre RCTs are required to confirm these findings.


Subject(s)
Ablation Techniques/methods , Carcinoma, Hepatocellular/surgery , Liver Neoplasms/surgery , Catheter Ablation , Databases, Bibliographic , Disease Progression , Humans , Microwaves/therapeutic use , Observational Studies as Topic , Randomized Controlled Trials as Topic , Treatment Outcome
16.
J Arthroplasty ; 31(2): 501-5, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26427940

ABSTRACT

BACKGROUND: Accurate acetabular component orientation in hip resurfacing is mandatory. The aim of this study was to analyze whether interpretation of pelvic radiographs with computer-aided design (CAD) software is comparable to computed tomography (CT) in measurement of acetabular anteversion and inclination of a Birmingham Hip Resurfacing (BHR) hip. METHODS: A consecutive series of 49 patients (50 hips) who underwent hip resurfacing arthroplasty between 2005 and 2007 with the BHR system was retrospectively included. The surgical procedure was performed by 1 orthopedic surgeon at the beginning of his learning curve. Computer-aided design software was used to measure acetabular component orientation on an anteroposterior pelvic radiograph. These measurements were compared with CT measurements. We calculated the correlation between the CAD software and CT analysis. The degree of underestimation or overestimation was determined, and a Bland-Altman plot was created to visualize the agreement between CAD software and CT results. RESULTS: We analyzed 50 BHR hips with mean inclination of 54.6° and 55.6° and mean anteversion of 24.8° and 13.3° measured by CT and CAD, respectively. The Pearson correlation coefficient for inclination was 0.69 (P < .001) and for anteversion 0.81 (P < .001). CAD showed a mean underestimation of anteversion of 11.6° (P < .001). There was no significant underestimation or overestimation of inclination with CAD analysis compared to CT measurements. CONCLUSION: The CAD software is useful to assess acetabular inclination in hip resurfacing but underestimates anteversion.
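The agreement analysis described (Pearson correlation plus the mean difference underlying a Bland-Altman plot) can be sketched as follows; the paired angle readings below are hypothetical, not the study's measurements:

```python
import statistics

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two paired series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def bland_altman_bias(xs, ys):
    """Mean difference (bias) between two measurement methods."""
    return statistics.fmean(x - y for x, y in zip(xs, ys))

# Hypothetical CT vs CAD anteversion readings in degrees (illustration only)
ct  = [25.0, 22.0, 30.0, 18.0, 28.0]
cad = [13.0, 11.0, 18.0, 7.0, 16.0]
r = pearson_r(ct, cad)
bias = bland_altman_bias(ct, cad)
```

The two statistics answer different questions: a high r shows the methods track each other, while a large positive bias shows one method reads systematically lower, which is exactly the pattern the study reports for anteversion.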


Subject(s)
Acetabulum/diagnostic imaging , Arthroplasty, Replacement, Hip/methods , Computer-Aided Design , Hip Joint/diagnostic imaging , Adult , Female , Hip Prosthesis , Humans , Learning Curve , Male , Middle Aged , Retrospective Studies , Tomography, X-Ray Computed
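The agreement analysis described in entry 16 (Pearson correlation plus a Bland-Altman plot comparing CAD against CT) can be sketched as below. The paired measurements here are hypothetical illustration values, not the study's data; only the form of the computation follows the abstract.

```python
import numpy as np

# Hypothetical paired anteversion measurements in degrees (NOT the study's data)
ct  = np.array([22.0, 25.5, 28.0, 21.0, 30.5, 24.0, 27.5, 23.5])
cad = np.array([11.0, 14.5, 15.0,  9.5, 19.0, 12.0, 16.5, 11.5])

# Pearson correlation coefficient between the two methods
r = np.corrcoef(ct, cad)[0, 1]

# Bland-Altman statistics: bias (mean difference, CAD minus CT)
# and 95% limits of agreement (bias +/- 1.96 * SD of the differences)
diff = cad - ct
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

print(f"r = {r:.2f}, bias = {bias:.1f} deg, LoA = [{loa_low:.1f}, {loa_high:.1f}]")
```

A negative bias here corresponds to the study's finding that CAD systematically underestimates anteversion relative to CT; a full Bland-Altman plot would additionally scatter `diff` against the per-pair means.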
17.
J Med Screen ; 22(4): 187-93, 2015 Dec.
Article in English | MEDLINE | ID: mdl-25977374

ABSTRACT

OBJECTIVES: Positive rates in faecal immunochemical test (FIT)-based colorectal cancer screening programmes vary, suggesting that differences between programmes may affect test results. We examined whether demographic, pathological, behavioural, and environmental factors affected haemoglobin concentration and positive rates where samples are mailed. METHODS: A retrospective cohort study; 34,298 collection devices were sent, over five years, to screening invitees (median age 60.6 years). Participant demographics, temperature on the sample postage day, and previous screening were recorded. Outcomes from colonoscopy performed within a year after FIT were collected. Multivariate logistic regression identified significant predictors of test positivity. RESULTS: A higher positive rate was independently associated with male gender, older age, lower socioeconomic status, and distally located neoplasia, and negatively associated with previous screening (p < 0.05). Older males had higher faecal haemoglobin concentrations and were less likely to have a false positive result at colonoscopy (p < 0.05). High temperature on the sample postage day was associated with reduced haemoglobin concentration and positivity rate (26-35°C: odds ratio 0.78, 95% confidence interval 0.66-0.93), but was not associated with missed significant neoplasia at colonoscopy (p > 0.05). CONCLUSIONS: Haemoglobin concentrations, and therefore FIT positivity, were affected by factors that vary between screening programmes. Participant demographics and high temperature at postage had significant effects. The impact of temperature could be reduced by seasonal scheduling of invitations. The importance of screening, and of following up positive test results, particularly in older males, should be promoted.


Subject(s)
Colorectal Neoplasms/diagnosis , Feces/chemistry , Hemoglobins/analysis , Adult , Aged , Aged, 80 and over , Analysis of Variance , Early Detection of Cancer/methods , Female , Humans , Logistic Models , Male , Mass Screening/methods , Middle Aged , Odds Ratio , Retrospective Studies , Risk Factors , Socioeconomic Factors , Young Adult
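An odds ratio with a Wald 95% confidence interval, of the kind reported in entry 17 (OR 0.78, 95% CI 0.66-0.93 for the 26-35°C band), can be sketched from a 2x2 table as below. The counts are hypothetical placeholders, not the study's data, and this is the unadjusted calculation; the study's estimate came from multivariate logistic regression.

```python
import math

# Hypothetical 2x2 counts (NOT the study's data): FIT positivity by
# temperature band on the sample postage day.
#                      positive  negative
hot_pos, hot_neg   = 160, 1840   # 26-35 degrees C
cool_pos, cool_neg = 700, 6300   # reference band

# Odds ratio, with a Wald 95% confidence interval computed on the log scale
or_ = (hot_pos / hot_neg) / (cool_pos / cool_neg)
se = math.sqrt(1/hot_pos + 1/hot_neg + 1/cool_pos + 1/cool_neg)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)

print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An OR below 1 whose interval excludes 1, as here, corresponds to the abstract's finding of reduced positivity at high postage-day temperatures.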
18.
J Gastrointestin Liver Dis ; 23(3): 243-8, 2014 Sep.
Article in English | MEDLINE | ID: mdl-25267950

ABSTRACT

BACKGROUND AND AIMS: Swallowing difficulties become increasingly prevalent in older age. Differences exist in lower esophageal sphincter (LES) function between older and younger patients with dysphagia, but the contribution of aging per se to these is unclear. METHODS: Esophageal motor function was measured using high resolution manometry in older (aged 81±1.7 yrs) and younger (23±1.7 yrs) asymptomatic healthy adults. After baseline recording, motility was assessed by swallowing boluses of liquid (right lateral and upright postures) and solids. Basal LES pressure, integrated relaxation pressure, distal esophageal peristaltic amplitude, distal contractile integral, and velocity were measured. Data are presented as mean ± SEM. RESULTS: Despite a trend for lower basal LES pressure (15.8±2.9 mmHg vs. 21.0±0.2 mmHg; P=0.08), completeness of LES relaxation was reduced in older subjects (liquid, right lateral: P=0.003; upright: P=0.007; solid: P=0.03), with higher integrated relaxation pressure when upright (liquid: 6.9±1.1 vs. 3.1±0.4 mmHg; P=0.01; solids: 8.1±1.1 vs. 3.6±0.3 mmHg; P=0.001) and a longer time to recovery after liquid boluses (right lateral: P=0.01; upright: P=0.04). In young, but not older, adults, esophageal peristaltic velocity was increased when upright (3.6±0.2 cm/s; P=0.04) and reduced with solids (3.0±0.1 cm/s; P=0.03). Distal contraction amplitude was higher with solid than with liquid boluses in the younger individuals (51.8±7.9 mmHg vs. 41.4±6.2 mmHg; P=0.03). In older subjects, the distal contractile integral was higher with liquid swallows in the upright posture (P=0.006). CONCLUSION: There are subtle changes in LES function even in asymptomatic older individuals. These age-related changes may contribute to the development of dysphagia.


Subject(s)
Aging , Deglutition Disorders/etiology , Deglutition , Esophageal Sphincter, Lower/physiopathology , Peristalsis , Adult , Age Factors , Aged , Aged, 80 and over , Deglutition Disorders/diagnosis , Deglutition Disorders/physiopathology , Female , Humans , Male , Manometry , Posture , Pressure , Risk Factors , Young Adult
19.
J Crohns Colitis ; 8(5): 370-4, 2014 May.
Article in English | MEDLINE | ID: mdl-24161810

ABSTRACT

Inflammatory bowel disease (IBD) management is increasingly concentrated in units with expertise in the condition, leading to substantial improvements in outcomes. Such units often employ nurses with a specialised interest in IBD, and the enhancements in care reflect, in part, the more efficient use of medical and hospital services that this role promotes. However, the relative contributions of nurse specialist input and of medical staff with a sub-speciality interest in IBD are unclear, although this has major implications for funding. Assessing the direct impact of an IBD nurse on reducing admissions and outpatient attendances demonstrates immediate cost benefits, but the long-term sustainability of these savings has not previously been investigated. We therefore assessed the effect of an IBD nurse on patient outcomes in a tertiary hospital IBD unit where the position has been established for 8 years, by measuring the number of occasions of service (OOS) and the outcomes of all interactions between the nurse and patients over a 12-month period. There were 4920 OOS recorded, involving 566 patients. IBD nurse intervention led to the avoidance of 27 hospital admissions (representing a saving of 171 occupied bed days), 32 Emergency Department presentations, and 163 outpatient reviews. After deducting the salary and on-costs of the IBD nurse, there was a net direct saving to the hospital of AUD $136,535. IBD nurse positions provide sustained direct cost reductions to health services by reducing hospital attendances. This is additional to the benefits that accrue through better patient knowledge, earlier presentation, and increased compliance.


Subject(s)
Inflammatory Bowel Diseases/nursing , Nurse's Role , Adult , Cost-Benefit Analysis , Emergency Service, Hospital/economics , Female , Hospitals, University , Humans , Inflammatory Bowel Diseases/economics , Inpatients , Male , South Australia
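The net-saving arithmetic described in entry 19 (gross savings from avoided activity, minus the nurse's salary and on-costs) can be sketched as below. The activity counts (171 bed days, 32 Emergency Department presentations, 163 outpatient reviews) come from the study, but the unit costs and salary figure are hypothetical placeholders, not the study's actual inputs, so the resulting totals are illustrative only.

```python
# Activity avoided through IBD nurse intervention (counts from the study)
bed_days_avoided = 171
ed_presentations_avoided = 32
outpatient_reviews_avoided = 163

# Hypothetical unit costs in AUD (placeholders, NOT the study's figures)
cost_per_bed_day = 1000
cost_per_ed_presentation = 400
cost_per_outpatient_review = 250
nurse_salary_and_oncosts = 90000

gross_saving = (bed_days_avoided * cost_per_bed_day
                + ed_presentations_avoided * cost_per_ed_presentation
                + outpatient_reviews_avoided * cost_per_outpatient_review)
net_saving = gross_saving - nurse_salary_and_oncosts

print(f"Gross saving: AUD ${gross_saving:,}; net saving: AUD ${net_saving:,}")
```

Under these placeholder costs the position remains cost-saving after deducting its own salary, which is the structure of the study's conclusion.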
20.
Spectrochim Acta A Mol Biomol Spectrosc ; 117: 406-12, 2014 Jan 03.
Article in English | MEDLINE | ID: mdl-24001982

ABSTRACT

Co3(PO4)2, SrCo2(PO4)2, Co2P2O7, BaCoP2O7 and SrCoP2O7 present different geometries of five-coordinated Co2+ ([5]Co2+) sites, coexisting with [6]Co2+ in Co3(PO4)2 and Co2P2O7, and with [4]Co2+ in SrCo2(PO4)2. [5]Co K-edge XANES spectra show that the intensity of the pre-edge and main edge is intermediate between those of [6]Co and [4]Co. Diffuse reflectance spectra show the contributions of Co2+ in D3h symmetry for SrCo2(PO4)2, and in C4v symmetry for BaCoP2O7 and SrCoP2O7. In Co3(PO4)2 and Co2P2O7, the multiple transitions observed arise from energy-level splitting and may be labeled in C2v symmetry. The spectroscopic data confirm that D3h and C4v symmetries can be distinguished on the basis of the intensity of the optical absorption bands and the crystal field splitting values. We discuss the influence of the geometrical distortion and of the nature of the next-nearest neighbors.


Subject(s)
Barium/chemistry , Cobalt/chemistry , Coordination Complexes/chemistry , Phosphates/chemistry , Strontium/chemistry , X-Ray Absorption Spectroscopy , Fiber Optic Technology , Models, Chemical