Results 1 - 16 of 16
1.
Acad Pediatr; 23(2): 279-286, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36410601

ABSTRACT

OBJECTIVE: To determine whether a multicomponent intervention focused on early peanut introduction was associated with a lower peanut allergy incidence in young children. METHODS: The study cohort comprised all children born January 1, 2013 through December 31, 2018, who received care at a large health care organization. Intervention activities occurred over 16 months and included provider educational programs, electronic health record tools, and new patient instructions. We used an interrupted time series design to assess whether peanut allergy incidence differed across 3 time periods (preintervention, interim, postintervention) among high- and low-risk children. The primary outcome was incident peanut allergy by age 24 months, defined as peanut allergy in the allergy field or active problem list plus a positive supportive test. Presence of severe eczema and/or egg allergy defined high risk. Because the study was conducted as part of routine care, it was not feasible to measure what counseling clinicians provided, or how and when parents fed their children peanut-containing foods. RESULTS: In a cohort of 22,571 children, the percent with peanut allergy by age 24 months was 17.3% (116 of 671) among high-risk and 0.8% (181 of 21,900) among low-risk children. In multivariate analyses, the adjusted peanut allergy rate per 100 person-years was not significantly different across study periods among high-risk (9.6 preintervention, 11.7 interim, and 9.9 postintervention, P = .70) or low-risk (0.5 preintervention, 0.7 interim, and 0.5 postintervention, P = .17) children. CONCLUSIONS: In a community-based setting, the incidence of peanut allergy did not decline following a multicomponent intervention focused on early peanut introduction.
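The per-period comparison in an interrupted time series analysis like this one rests on incidence rates computed as events over person-time. A minimal sketch follows; the person-year denominators are hypothetical, chosen only so the resulting rates echo the high-risk figures quoted in the abstract, and this is not the study's code.

```python
# Incidence rates per 100 person-years for each study period.
# Event counts and person-years below are illustrative, not the published data.

def rate_per_100py(events, person_years):
    """Incidence rate per 100 person-years."""
    return 100.0 * events / person_years

periods = {
    "preintervention": (48, 500.0),   # (incident allergies, person-years) -- hypothetical
    "interim": (55, 470.0),
    "postintervention": (50, 505.0),
}

rates = {name: rate_per_100py(e, py) for name, (e, py) in periods.items()}
for name, r in rates.items():
    print(f"{name}: {r:.1f} per 100 person-years")
```

A full interrupted time series model would additionally regress the period-level rates on time, with level and slope terms at each change point, rather than comparing pooled rates.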


Subject(s)
Eczema , Egg Hypersensitivity , Peanut Hypersensitivity , Humans , Male , Female , Child, Preschool , Child , Arachis , Peanut Hypersensitivity/epidemiology , Primary Health Care , Risk , Age Factors
2.
J Contin Educ Health Prof; 41(2): 145-152, 2021 Apr 1.
Article in English | MEDLINE | ID: mdl-33758129

ABSTRACT

INTRODUCTION: Continuing medical education (CME) interventions often evaluate participant commitment to change (CTC) clinical practice. Evidence linking CTC to actual practice change is limited. METHODS: In an intervention that combined live CME with changes to the electronic health record to promote judicious antibiotic use for children with urinary tract infections (UTIs), we evaluated CTC and subsequent prescribing behavior in Kaiser Permanente Colorado, an integrated health care system. CTC was assessed immediately after the session using closed-ended questions about session learning objectives and open-ended questions to elicit specific practice changes. Perceived barriers to implementing recommended changes were also assessed. RESULTS: Among 179 participants, 80 (45%) completed postsession evaluations and treated one or more child with a UTI in the subsequent 17 months (856 UTIs in total). In closed-ended responses about session learning objectives, 45 clinicians (56%) committed to changing practice for antibiotic choice and duration, whereas 37 (46%) committed to implementing new practice guidelines. When asked open-ended questions to identify specific practice changes, 32 (40%) committed to antibiotic choice change and 29 (36%) committed to treatment duration change. Participants who made specific CTC statements had greater improvement in antibiotic choice (relative rate ratio 1.56, 95% CI 1.16-2.09) and duration (relative rate ratio 1.59, 95% CI 1.05-2.41) than participants who did not make specific commitments. Few perceived barriers affected subsequent prescribing. DISCUSSION: Commitments to changing specific clinical behaviors were associated with sustained changes in prescribing for children with UTIs. Linking self-evaluations with clinical data in integrated health care systems is an important tool for CME evaluators.


Subject(s)
Education, Medical, Continuing , Learning , Child , Humans
3.
Pediatrics; 145(4), 2020 Apr.
Article in English | MEDLINE | ID: mdl-32127361

ABSTRACT

OBJECTIVES: To determine if a multicomponent intervention was associated with increased use of first-line antibiotics (cephalexin or sulfamethoxazole and trimethoprim) among children with uncomplicated urinary tract infections (UTIs) in outpatient settings. METHODS: The study was conducted at Kaiser Permanente Colorado, a large health care organization with ∼127 000 members <18 years of age. After conducting a gap analysis, an intervention was developed to target key drivers of antibiotic prescribing for pediatric UTIs. Intervention activities included development of new local clinical guidelines, a live case-based educational session, pre- and postsession e-mailed knowledge assessments, and a new UTI-specific order set within the electronic health record. Most activities were implemented on April 26, 2017. The study design was an interrupted time series comparing antibiotic prescribing for UTIs before versus after the implementation date. Infants <60 days old and children with complex urologic or neurologic conditions were excluded. RESULTS: During January 2014 to September 2018, 2142 incident outpatient UTIs were identified (1636 preintervention and 506 postintervention). Pyelonephritis was diagnosed for 7.6% of cases. Adjusted for clustering of UTIs within clinicians, the proportion of UTIs treated with first-line antibiotics increased from 43.4% preintervention to 62.4% postintervention (P < .0001). The use of cephalexin (first-line, narrow spectrum) increased from 28.9% preintervention to 53.0% postintervention (P < .0001). The use of cefixime (second-line, broad spectrum) decreased from 17.3% preintervention to 2.6% postintervention (P < .0001). Changes in prescribing practices persisted through the end of the study period. CONCLUSIONS: A multicomponent intervention with educational and process-improvement elements was associated with a sustained change in antibiotic prescribing for uncomplicated pediatric UTIs.


Subject(s)
Ambulatory Care , Anti-Bacterial Agents/therapeutic use , Anti-Infective Agents, Urinary/therapeutic use , Urinary Tract Infections/drug therapy , Adolescent , Age Factors , Cephalexin/therapeutic use , Child , Child, Preschool , Cystitis/drug therapy , Female , Humans , Infant , Interrupted Time Series Analysis , Male , Process Assessment, Health Care , Pyelonephritis/drug therapy , Trimethoprim, Sulfamethoxazole Drug Combination/therapeutic use , Urinary Tract Infections/epidemiology
4.
Acad Pediatr; 19(5): 572-580, 2019 Jul.
Article in English | MEDLINE | ID: mdl-30959224

ABSTRACT

OBJECTIVE: Clinical specialty societies recommend long-acting reversible contraceptives (LARCs) as first-line contraception for adolescent women. We evaluated whether a combined educational and process improvement intervention enhanced LARC placement in primary care within an integrated health care system. METHODS: The intervention included journal clubs, live continuing education, point-of-care guidelines, and new patient materials. We conducted a retrospective cohort study across 3 time periods: baseline (January 2013-September 2015), early implementation (October 2015-March 2016), and full implementation (April 2016-June 2017). The primary outcome was the proportion of LARCs placed by primary care clinicians among women aged 13 to 18 years compared with gynecology clinicians. RESULTS: Kaiser Foundation Health Plan of Colorado cared for approximately 20,000 women aged 13 to 18 years in each calendar quarter between 2013 and 2017. Overall, LARC placement increased from 7.0 per 1000 members per quarter at baseline to 13.0 per 1000 during the full intervention. Primary care clinicians placed 6.2% of all LARCs in 2013, increasing to 32.1% by 2017 (P < .001), including 45.5% of contraceptive implants. Clinicians who attended educational sessions were more likely to adopt LARCs than those who did not (17.9% vs 6.4% respectively, P = .009). Neither overall LARC placement rates (relative risk, 1.9; 95% confidence interval, 0.7-5.6) nor contraceptive implant rates (relative risk, 3.0; 95% confidence interval, 0.9-9.8) increased significantly in clinicians who attended educational activities. CONCLUSIONS: This multimodal intervention was associated with increased LARC placement for adolescent women in primary care. The combination of education and process improvement is a promising strategy to promote clinician behavior change.


Subject(s)
Delivery of Health Care, Integrated , Long-Acting Reversible Contraception , Primary Health Care , Adolescent , Contraception Behavior , Female , Health Education , Humans , Retrospective Studies , Socioeconomic Factors , Young Adult
5.
J Am Heart Assoc; 7(7), 2018 Mar 26.
Article in English | MEDLINE | ID: mdl-29581222

ABSTRACT

BACKGROUND: Primary prevention implantable cardioverter-defibrillators (ICDs) reduce mortality in selected patients with left ventricular systolic dysfunction by delivering therapies (antitachycardia pacing or shocks) to terminate potentially lethal arrhythmias; inappropriate therapies also occur. We assessed device therapies among adults receiving primary prevention ICDs in 7 healthcare systems. METHODS AND RESULTS: We linked medical record data, adjudicated device therapies, and the National Cardiovascular Data Registry ICD Registry. Survival analysis evaluated therapy probability and predictors after ICD implant from 2006 to 2009, with attention to Centers for Medicare and Medicaid Services Coverage With Evidence Development subgroups: left ventricular ejection fraction, 31% to 35%; nonischemic cardiomyopathy <9 months' duration; and New York Heart Association class IV heart failure with cardiac resynchronization therapy defibrillator. Among 2540 patients, 35% were <65 years old, 26% were women, and 59% were white. During 27 (median) months, 738 (29%) received ≥1 therapy. Three-year therapy risk was 36% (appropriate, 24%; inappropriate, 12%). Appropriate therapy was more common in men (adjusted hazard ratio [HR], 1.84; 95% confidence interval [CI], 1.43-2.35). Inappropriate therapy was more common in patients with atrial fibrillation (adjusted HR, 2.20; 95% CI, 1.68-2.87), but less common among patients ≥65 years old versus younger (adjusted HR, 0.72; 95% CI, 0.54-0.95) and in recent implants (eg, in 2009 versus 2006; adjusted HR, 0.66; 95% CI, 0.46-0.95). In Centers for Medicare and Medicaid Services Coverage With Evidence Development analysis, inappropriate therapy was less common with cardiac resynchronization therapy defibrillator versus single chamber (adjusted HR, 0.55; 95% CI, 0.36-0.84); therapy risk did not otherwise differ for Centers for Medicare and Medicaid Services Coverage With Evidence Development subgroups. 
CONCLUSIONS: In this community cohort of primary prevention patients receiving ICD, therapy delivery varied across demographic and clinical characteristics, but did not differ meaningfully for Centers for Medicare and Medicaid Services Coverage With Evidence Development subgroups.
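The 3-year therapy risks above come from survival analysis of time to first device therapy. As an illustrative sketch (not the study's code), here is a bare-bones Kaplan-Meier estimator over hypothetical follow-up data, assuming no tied event/censoring times:

```python
def kaplan_meier(times):
    """Kaplan-Meier survival steps.

    times: list of (time, event) pairs, event=1 for a first device therapy,
    event=0 for censoring. Returns [(t, S(t))] at each event time.
    Assumes no tied times, so each observation is processed individually.
    """
    steps = []
    s = 1.0
    at_risk = len(times)
    for t, event in sorted(times):
        if event:
            s *= (at_risk - 1) / at_risk  # survival drops by the event's hazard
            steps.append((t, s))
        at_risk -= 1                      # both events and censorings leave the risk set
    return steps

# Hypothetical follow-up: (months, event flag)
follow_up = [(6, 1), (12, 0), (18, 1), (27, 0), (30, 0)]
for t, s in kaplan_meier(follow_up):
    print(f"month {t}: S(t) = {s:.3f}")
```

Therapy *risk* at 3 years is then 1 - S(36); the adjusted hazard ratios in the abstract would come from a Cox model layered on top of this kind of risk-set bookkeeping.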


Subject(s)
Arrhythmias, Cardiac/prevention & control , Death, Sudden, Cardiac/prevention & control , Defibrillators, Implantable , Electric Countershock/instrumentation , Primary Prevention/instrumentation , Ventricular Dysfunction, Left/therapy , Aged , Arrhythmias, Cardiac/diagnosis , Arrhythmias, Cardiac/mortality , Arrhythmias, Cardiac/physiopathology , Centers for Medicare and Medicaid Services, U.S. , Electric Countershock/adverse effects , Electric Countershock/mortality , Female , Heart Rate , Humans , Male , Middle Aged , Retrospective Studies , Risk Factors , Time Factors , Treatment Outcome , United States , Ventricular Dysfunction, Left/diagnosis , Ventricular Dysfunction, Left/mortality , Ventricular Dysfunction, Left/physiopathology , Ventricular Function, Left
6.
J Am Heart Assoc; 6(11), 2017 Nov 9.
Article in English | MEDLINE | ID: mdl-29122811

ABSTRACT

BACKGROUND: In US clinical practice, many patients who undergo placement of an implantable cardioverter-defibrillator (ICD) for primary prevention of sudden cardiac death receive dual-chamber devices. The superiority of dual-chamber over single-chamber devices in reducing the risk of inappropriate ICD shocks in clinical practice has not been established. The objective of this study was to compare risk of adverse outcomes, including inappropriate shocks, between single- and dual-chamber ICDs for primary prevention. METHODS AND RESULTS: We identified patients receiving a single- or dual-chamber ICD for primary prevention who did not have an indication for pacing from 15 hospitals within 7 integrated health delivery systems in the Longitudinal Study of Implantable Cardioverter-Defibrillators from 2006 to 2009. The primary outcome was time to first inappropriate shock. ICD shocks were adjudicated for appropriateness. Other outcomes included all-cause hospitalization, heart failure hospitalization, and death. Patient, clinician, and hospital-level factors were accounted for using propensity score weighting methods. Among 1042 patients without pacing indications, 54.0% (n=563) received a single-chamber device and 46.0% (n=479) received a dual-chamber device. In a propensity-weighted analysis, device type was not significantly associated with inappropriate shock (hazard ratio, 0.91; 95% confidence interval, 0.59-1.38 [P=0.65]), all-cause hospitalization (hazard ratio, 1.03; 95% confidence interval, 0.87-1.21 [P=0.76]), heart failure hospitalization (hazard ratio, 0.93; 95% confidence interval, 0.72-1.21 [P=0.59]), or death (hazard ratio, 1.19; 95% confidence interval, 0.93-1.53 [P=0.17]). CONCLUSIONS: Among patients who received an ICD for primary prevention without indications for pacing, dual-chamber devices were not associated with lower risk of inappropriate shock or differences in hospitalization or death compared with single-chamber devices. 
This study does not justify the use of dual-chamber devices to minimize inappropriate shocks.
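The propensity score weighting described above can be sketched as inverse-probability-of-treatment weighting (IPTW). Everything below (the cohort, the single confounder, and the probabilities) is simulated purely for illustration; in the study the propensity score was modeled from patient, clinician, and hospital factors rather than known.

```python
import random

random.seed(7)

def iptw_weight(treated, ps):
    """Inverse-probability-of-treatment weight: 1/ps for treated, 1/(1-ps) for controls."""
    return 1.0 / ps if treated else 1.0 / (1.0 - ps)

# Simulated cohort: one confounder drives both device choice and shock risk;
# the outcome is deliberately independent of device type given the confounder.
cohort = []
for _ in range(5000):
    high_risk = random.random() < 0.5
    ps = 0.7 if high_risk else 0.3          # P(dual-chamber | confounder)
    dual = random.random() < ps
    p_shock = 0.20 if high_risk else 0.10   # outcome depends only on the confounder
    shock = random.random() < p_shock
    cohort.append((dual, ps, shock))

def weighted_risk(rows, arm):
    """IPTW-weighted outcome risk within one treatment arm."""
    num = sum(iptw_weight(t, p) for t, p, s in rows if t == arm and s)
    den = sum(iptw_weight(t, p) for t, p, s in rows if t == arm)
    return num / den

risk_dual = weighted_risk(cohort, True)
risk_single = weighted_risk(cohort, False)
print(f"IPTW-weighted shock risk: dual={risk_dual:.3f}, single={risk_single:.3f}")
```

Because weighting balances the confounder across arms, the two weighted risks converge despite the confounded treatment assignment, mirroring the null association reported for inappropriate shock.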


Subject(s)
Death, Sudden, Cardiac/prevention & control , Defibrillators, Implantable/adverse effects , Heart Failure/therapy , Primary Prevention/methods , Registries , Aged , Death, Sudden, Cardiac/epidemiology , Equipment Design , Female , Heart Failure/mortality , Humans , Incidence , Longitudinal Studies , Male , Middle Aged , Retrospective Studies , Survival Rate/trends , Treatment Outcome , United States/epidemiology
7.
J Am Heart Assoc; 4(6): e002005, 2015 Jun 2.
Article in English | MEDLINE | ID: mdl-26037083

ABSTRACT

BACKGROUND: Patient sex and age may influence rates of death after receiving an implantable cardioverter-defibrillator for primary prevention. Differences in outcomes other than mortality and whether these differences vary by heart failure symptoms, etiology, and left ventricular ejection fraction are not well characterized. METHODS AND RESULTS: We studied 2954 patients with left ventricular ejection fraction ≤0.35 undergoing first-time implantable cardioverter-defibrillator for primary prevention within the Cardiovascular Research Network; 769 patients (26%) were women, and 2827 (62%) were aged >65 years. In a median follow-up of 2.4 years, outcome rates per 1000 patient-years were 109 for death, 438 for hospitalization, and 111 for heart failure hospitalizations. Procedure-related complications occurred in 8.36%. In multivariable models, women had significantly lower risks of death (hazard ratio 0.67, 95% CI 0.56 to 0.80) and heart failure hospitalization (hazard ratio 0.82, 95% CI 0.68 to 0.98) and higher risks for complications (hazard ratio 1.38, 95% CI 1.01 to 1.90) than men; patients aged >65 years had higher risks of death (hazard ratio 1.55, 95% CI 1.30 to 1.86) and heart failure hospitalization (hazard ratio 1.25, 95% CI 1.05 to 1.49) than younger patients. Age and sex differences were generally consistent in strata according to symptoms, etiology, and severity of left ventricular systolic dysfunction, except the higher risk of complications in women, which differed by New York Heart Association classification (P=0.03 for sex-New York Heart Association interaction), and the risk of heart failure hospitalization in older patients, which differed by etiology of heart failure (P=0.05 for age-etiology interaction). CONCLUSIONS: The burden of adverse outcomes after receipt of an implantable cardioverter-defibrillator for primary prevention is substantial and varies according to patient age and sex. 
These differences in outcome generally do not vary according to baseline heart failure characteristics.


Subject(s)
Defibrillators, Implantable/statistics & numerical data , Age Factors , Aged , Cardiovascular Diseases/mortality , Cardiovascular Diseases/surgery , Defibrillators, Implantable/adverse effects , Female , Heart Failure/mortality , Hospitalization/statistics & numerical data , Humans , Male , Middle Aged , Prosthesis Implantation/mortality , Prosthesis Implantation/statistics & numerical data , Risk Factors , Sex Factors , United States/epidemiology
8.
JAMA; 310(2): 155-62, 2013 Jul 10.
Article in English | MEDLINE | ID: mdl-23839749

ABSTRACT

IMPORTANCE: Little is known about how different financial incentives between Medicare Advantage and Medicare fee-for-service (FFS) reimbursement structures influence use of cardiovascular procedures. OBJECTIVE: To compare regional cardiovascular procedure rates between Medicare Advantage and Medicare FFS beneficiaries. DESIGN, SETTING, AND PARTICIPANTS: Cross-sectional study of Medicare beneficiaries older than 65 years between 2003-2007 comparing rates of coronary angiography, percutaneous coronary intervention (PCI), and coronary artery bypass graft (CABG) surgery across 32 hospital referral regions in 12 states. MAIN OUTCOMES AND MEASURES: Rates of coronary angiography, PCI, and CABG surgery. RESULTS: We evaluated a total of 878,339 Medicare Advantage patients and 5,013,650 Medicare FFS patients. Compared with Medicare FFS patients, Medicare Advantage patients had lower age-, sex-, race-, and income-adjusted procedure rates per 1000 person-years for angiography (16.5 [95% CI, 14.8-18.2] vs 25.9 [95% CI, 24.0-27.9]; P < .001) and PCI (6.8 [95% CI, 6.0-7.6] vs 9.8 [95% CI, 9.0-10.6]; P < .001) but similar rates for CABG surgery (3.1 [95% CI, 2.8-3.5] vs 3.4 [95% CI, 3.1-3.7]; P = .33). There were no significant differences between Medicare Advantage and Medicare FFS patients in the rates per 1000 person-years of urgent angiography (3.9 [95% CI, 3.6-4.2] vs 4.3 [95% CI, 4.0-4.6]; P = .24) or PCI (2.4 [95% CI, 2.2-2.7] vs 2.7 [95% CI, 2.5-2.9]; P = .16). Procedure rates varied widely across hospital referral regions among Medicare Advantage and Medicare FFS patients. For angiography, the rates per 1000 person-years ranged from 9.8 to 40.6 for Medicare Advantage beneficiaries and from 15.7 to 44.3 for Medicare FFS beneficiaries. For PCI, the rates ranged from 3.5 to 16.8 for Medicare Advantage and from 4.7 to 16.1 for Medicare FFS. The rates for CABG surgery ranged from 1.5 to 6.1 for Medicare Advantage and from 2.5 to 6.0 for Medicare FFS. 
Across regions, we found no statistically significant correlation between Medicare Advantage and Medicare FFS beneficiary utilization for angiography (Spearman r = 0.19, P = .29) and modest correlations for PCI (Spearman r = 0.33, P = .06) and CABG surgery (Spearman r = 0.35, P = .05). Among Medicare Advantage beneficiaries, adjustment for additional cardiac risk factors had little influence on procedure rates. CONCLUSIONS AND RELEVANCE: Although Medicare beneficiaries enrolled in capitated Medicare Advantage programs had lower angiography and PCI procedure rates than those enrolled in Medicare FFS, the degree of geographic variation in procedure rates was substantial among Medicare Advantage beneficiaries and was similar in magnitude to that observed among Medicare FFS beneficiaries.
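The cross-region comparison uses Spearman rank correlation. A self-contained sketch follows; it omits tie handling, and the per-region rates fed in are made up for illustration, not the study's data.

```python
def spearman(xs, ys):
    """Spearman rank correlation, assuming no tied values."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mean = (n + 1) / 2.0
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)
    return cov / var  # rank variances are equal when there are no ties

# Hypothetical per-region angiography rates (per 1000 person-years):
ma_rates = [12.1, 25.3, 18.0, 31.4, 22.7, 15.9]
ffs_rates = [20.5, 30.2, 17.8, 41.0, 26.3, 19.1]
print(f"Spearman r = {spearman(ma_rates, ffs_rates):.2f}")
```

Ranking first makes the statistic sensitive only to whether high-utilization Medicare Advantage regions are also high-utilization FFS regions, not to the absolute rate levels, which is exactly the question the regional comparison asks.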


Subject(s)
Coronary Angiography/statistics & numerical data , Coronary Artery Bypass/statistics & numerical data , Fee-for-Service Plans/statistics & numerical data , Medicare Part C/statistics & numerical data , Medicare/statistics & numerical data , Percutaneous Coronary Intervention/statistics & numerical data , Age Factors , Aged , Aged, 80 and over , Capitation Fee , Cross-Sectional Studies , Female , Geography , Humans , Male , Reimbursement, Incentive , Sex Factors , United States
9.
Circ Cardiovasc Qual Outcomes; 5(6): e78-85, 2012 Nov.
Article in English | MEDLINE | ID: mdl-23170006

ABSTRACT

BACKGROUND: Implantable cardioverter-defibrillators (ICDs) are increasingly used for primary prevention after randomized, controlled trials demonstrating that they reduce the risk of death in patients with left ventricular systolic dysfunction. The extent to which the clinical characteristics and long-term outcomes of unselected, community-based patients with left ventricular systolic dysfunction undergoing primary prevention ICD implantation in a real-world setting compare with those enrolled in the randomized, controlled trials is not well characterized. This study is being conducted to address these questions. METHODS AND RESULTS: The study cohort includes consecutive patients undergoing primary prevention ICD placement between January 1, 2006 and December 31, 2009 in 7 health plans. Baseline clinical characteristics were acquired from the National Cardiovascular Data Registry ICD Registry. Longitudinal data collection is underway, and will include hospitalization, mortality, and resource use from standardized health plan data archives. Data regarding ICD therapies will be obtained through chart abstraction and adjudicated by a panel of experts in device therapy. Compared with the populations of primary prevention ICD therapy randomized, controlled trials, the cohort (n=2621) is on average significantly older (by 2.5-6.5 years), more often female, more often from racial and ethnic minority groups, and has a higher burden of coexisting conditions. The cohort is similar, however, to a national population undergoing primary prevention ICD placement. CONCLUSIONS: Patients undergoing primary prevention ICD implantation in this study differ from those enrolled in the randomized, controlled trials that established the efficacy of ICDs. Understanding a broad range of health outcomes, including ICD therapies, will provide patients, clinicians, and policy makers with contemporary data to inform decision-making.


Subject(s)
Death, Sudden, Cardiac/prevention & control , Defibrillators, Implantable , Electric Countershock/instrumentation , Primary Prevention/methods , Ventricular Dysfunction, Left/therapy , Aged , Chi-Square Distribution , Death, Sudden, Cardiac/etiology , Electric Countershock/adverse effects , Electric Countershock/mortality , Female , Hospitalization , Humans , Longitudinal Studies , Male , Middle Aged , Registries , Research Design , Time Factors , Treatment Outcome , United States , Ventricular Dysfunction, Left/complications , Ventricular Dysfunction, Left/diagnosis , Ventricular Dysfunction, Left/mortality
10.
Toxicon; 55(7): 1396-404, 2010 Jun 15.
Article in English | MEDLINE | ID: mdl-20184911

ABSTRACT

Within the last two decades, Prymnesium parvum (golden algae) has rapidly spread into inland waterways across the southern portion of North America and this organism has now appeared in more northerly distributed watersheds. In its wake, golden algae blooms have left an alarming trail of ecological devastation, namely massive fish kills, which are threatening the economic and recreational value of freshwater systems throughout the United States. To further understand the nature of this emerging crisis, our group investigated the chemical nature of the toxin(s) produced by P. parvum. We approached the problem using a two-pronged strategy that included analyzing both laboratory-grown golden algae and field-collected samples of P. parvum. Our results demonstrate that there is a striking difference in the toxin profiles for these two systems. An assemblage of potently ichthyotoxic fatty acids consisting primarily of stearidonic acid was identified in P. parvum cultures. While the concentration of the fatty acids alone was sufficient to account for the rapid-onset ichthyotoxic properties of cultured P. parvum, we also detected a second type of highly labile ichthyotoxic substance(s) in laboratory-grown golden algae that remains uncharacterized. In contrast, the amounts of stearidonic acid and its related congeners present in samples from recent bloom and fish kill sites fell well below the limits necessary to induce acute toxicity in fish. However, a highly labile ichthyotoxic substance, which is similar to the one found in laboratory-grown P. parvum cultures, was also detected. We propose that the uncharacterized labile metabolite produced by P. parvum is responsible for golden algae's devastating fish killing effects. Moreover, we have determined that the biologically-relevant ichthyotoxins produced by P. parvum are not the prymnesins as is widely believed. Our results suggest that further intensive efforts will be required to chemically define P. parvum's ichthyotoxins under natural bloom conditions.


Subject(s)
Chrysophyta/chemistry , Eutrophication , Fishes/physiology , Marine Toxins/toxicity , Alkalies , Animals , Biological Assay , Cell Line, Tumor , Cell Survival/drug effects , Chromatography, High Pressure Liquid , Esterases/chemistry , Fatty Acids/chemistry , Fatty Acids/metabolism , Fatty Acids, Unsaturated/chemistry , Fatty Acids, Unsaturated/metabolism , Humans , Hydrolysis , Spectrometry, Mass, Electrospray Ionization , Spectrophotometry, Ultraviolet
11.
Analyst; 133(11): 1581-6, 2008 Nov.
Article in English | MEDLINE | ID: mdl-18936836

ABSTRACT

Ionic surfactant coatings are a popular means to convert reversed-phase columns into ion-exchange phases with adjustable ion-exchange capacity and selectivity. However, the perceived lack of stability of surfactant coatings has hindered their use for routine separations. Coating conditions (acetonitrile content, ionic strength, surfactant concentration and temperature) were varied to determine their effect on coating stability. Under all coating conditions, cetyltrimethylammonium bromide (CTAB)-coated columns exhibited an initial decrease in ion retention. However, after the initial 1 L flush, both retention times and efficiency remained stable for ≥3000 column volumes. Greatest column stability and control of column capacity are achieved if the surfactant in the coating solution is below its critical micelle concentration.

12.
J Phys Chem B; 111(33): 9828-37, 2007 Aug 23.
Article in English | MEDLINE | ID: mdl-17672496

ABSTRACT

Mixed surfactants play a promising role in surface chemical applications. In this study, interfacial and bulk behaviors of binary and ternary combinations of tetradecyltrimethylammonium bromide (C(14)TAB), tetradecyltriphenylphosphonium bromide (C(14)TPB), and tetradecylpyridinium bromide (C(14)PB) have been examined in detail using the methods of tensiometry, conductometry, fluorimetry, and microcalorimetry. The state of micellar aggregation, amphiphile composition in the micelle, extent of counterion binding by the micelle, and interaction among the surfactant monomers in the binary and ternary combinations have been quantitatively assessed in the light of the regular solution theories of Rubingh and that of Rubingh and Holland. The monomer packing in the micelles and their expected shapes have also been estimated from topological considerations. Conceptual rationalization of results has been presented together with associated energetics of the interfacial adsorption and self-aggregation in the bulk.


Subject(s)
Organophosphorus Compounds/chemistry , Pyridinium Compounds/chemistry , Surface-Active Agents/chemistry , Trimethyl Ammonium Compounds/chemistry , Algorithms , Calorimetry , Chemical Phenomena , Chemistry, Physical , Electric Conductivity , Energy Transfer , Fluorometry , Micelles , Surface Tension
13.
J Sep Sci; 30(11): 1628-45, 2007 Jul.
Article in English | MEDLINE | ID: mdl-17623445

ABSTRACT

The focus of this review is on the current status and ongoing developments in ion chromatography (IC) using monolithic phases. The use and potential of both silica and polymeric monoliths in IC are discussed, with silica monoliths achieving efficiencies upwards of 10^5 plates/m for inorganic ions in a few minutes or less. Ion exchange capacity can be introduced onto the monolithic columns through the addition of ion interaction reagents to the eluent, coating of the monolith with ionic surfactants or polyelectrolyte latexes, and covalent bonding. The majority of the studies to date have used surfactant-coated columns, but the stability of surfactant coatings limits this approach. Applications of monolithic IC columns to the separation of inorganic anions and cations are tabulated. Finally, a discussion on the recent commercialization of monolithic IC columns and the use of monolithic phases for IC peripherals such as preconcentrator columns, microextractors and suppressors is presented.

14.
J Chromatogr A; 1155(1): 8-14, 2007 Jun 29.
Article in English | MEDLINE | ID: mdl-17306813

ABSTRACT

A silica monolith column (Merck Chromolith, 100 mm x 4.6 mm) has been coated with Dionex AS9-SC latex nanoparticles to convert the column into an anion-exchange stationary phase. For comparison purposes, a reversed-phase silica monolith was also converted into an anion-exchange column by coating with the cationic surfactant didodecyldimethylammonium bromide (DDAB). Separations of common inorganic anions were carried out using 7.5 or 5.0 mM 4-hydroxybenzoic acid at pH 7.0 along with suppressed conductivity detection. Direct comparisons were then made between the two columns in terms of selectivity, efficiency and stability. The latex-coated column was on average 50% more efficient than the DDAB-coated column. A 10% decrease in retention times was observed on the DDAB column over 11 h of continuous eluent flow, while the latex coating exhibited <1% change in retention even after 2.5 months of periodic use.


Subject(s)
Chromatography, Ion Exchange/instrumentation , Chromatography, Ion Exchange/methods , Latex/chemistry , Silicon Dioxide/chemistry , Anions/chemistry , Nanoparticles/chemistry , Reproducibility of Results
15.
Oecologia; 143(4): 537-47, 2005 May.
Article in English | MEDLINE | ID: mdl-15791427

ABSTRACT

The role of stoichiometric food quality in influencing genotype coexistence and competitive interactions between clones of the freshwater microcrustacean, Daphnia pulex, was examined in controlled laboratory microcosm experiments. Two genetically distinct clones of D. pulex, which show variation in their ribosomal rDNA structure, as well as differences in a number of previously characterized growth-rate-related features (i.e., life-history features), were allowed to compete in two different arenas: (1) batch cultures differing in algal food quality (i.e., high vs. low carbon:phosphorus (C:P ratio) in the green alga, Scenedesmus acutus); (2) continuous flow microcosms receiving different light levels (i.e., photosynthetically active radiation) that affected algal C:P ratios. In experiment 1, a clear genotype × environment interaction was observed, with clone 1 out-competing clone 2 under high nutrient (i.e., low food C:P) conditions, while the exact opposite pattern was observed under low nutrient (i.e., high C:P) conditions. In experiment 2, clone 1 dominated over clone 2 under high light (higher C:P) conditions, but clonal coexistence was observed under low light (low C:P) conditions. These results indicate that food (nutrient) quality effects (hitherto an often overlooked factor) may play a role in microevolutionary (genotypic) responses to changing stoichiometric conditions in natural populations.


Subject(s)
Animal Nutritional Physiological Phenomena , Competitive Behavior/physiology , Daphnia/physiology , Environment , Genetic Variation , Analysis of Variance , Animals , Carbon/metabolism , Chlorophyta/physiology , Daphnia/genetics , Genotype , Light , Phosphorus/metabolism , Population Dynamics
16.
Breastfeed Rev; 11(1): 5-10, 2003 Mar.
Article in English | MEDLINE | ID: mdl-14768306

ABSTRACT

The aim of our study was to assess the effectiveness of finger feeding in encouraging a breastfeeding-type suck in preterm infants. We hypothesised that preterm breastfeeding rates could be increased by identifying each baby who was still developing a suck technique, or was found to have a faulty one, and correcting that technique while the infant was cared for in the Special Care Nursery (SCN). Breastfeeding on discharge from the SCN was assessed at two time periods, before and after the introduction of the Baby Friendly Hospital Initiative (BFHI), in one hospital in Perth, Western Australia. Prior to the BFHI, 44% of preterm infants were breastfed on discharge from the SCN, compared with 71% after BFHI implementation. Using this pre- and post-comparison of a breastfeeding health promotion initiative within a maternity hospital, we have shown that preterm breastfeeding rates on discharge from the SCN can be increased.


Subject(s)
Breast Feeding , Feeding Methods , Adult , Breast Feeding/psychology , Breast Feeding/statistics & numerical data , Female , Humans , Infant Food , Infant, Newborn , Infant, Premature , Male , Postnatal Care , Sucking Behavior/physiology , Western Australia