Results 1 - 20 of 1,104
2.
Am J Med Sci ; 2024 Sep 04.
Article in English | MEDLINE | ID: mdl-39241828

ABSTRACT

BACKGROUND: Current guidelines lack clarity about the optimal duration of octreotide therapy for patients with esophageal variceal hemorrhage (EVH). To address this lack of evidence, we conducted a randomized clinical trial (RCT) of 24-hour versus 72-hour continuous infusion of octreotide for patients with EVH. METHODS: This multi-center, prospective RCT (NCT03624517) randomized patients with EVH to a 24-hour versus a 72-hour infusion of octreotide. Patients were required to undergo esophageal variceal band ligation prior to enrollment. The primary endpoint was the rebleeding rate at 72 hours. The study was terminated early due to an inability to recruit during and after the COVID-19 pandemic. RESULTS: For patients randomized to 72 hours of octreotide (n = 19) vs 24 hours (n = 15), there were no differences in the need for transfusion, average pRBC units transfused per patient (3 units vs 2 units), infection (5% vs 0%), mechanical ventilation (11% vs 7%), or the need for vasopressors (5% vs 3%), respectively; none of these differences was statistically significant. There were 2 rebleeding events in the 72-hour group (11%) and no rebleeding events in the 24-hour group (p = 0.49). 8/15 patients receiving 24 hours of octreotide were discharged at or before hospital day 3, while none in the 72-hour group was discharged before day 3 (p < 0.001). There was one death (in the 72-hour group) within 30 days. CONCLUSIONS: A 24-hour infusion is non-inferior to a 72-hour infusion of octreotide for prevention of rebleeding in patients with EVH. We propose that a shortened octreotide duration may help reduce hospital stay and related costs in these patients.
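The rebleeding comparison above (2/19 vs. 0/15 events) is consistent with a two-sided Fisher's exact test; below is a minimal sketch reproducing the reported p-value, assuming that test was the one used (the abstract does not name the method).

```python
# Reproduce the reported rebleeding comparison (2/19 events in the
# 72-hour arm vs. 0/15 in the 24-hour arm) with a two-sided Fisher's
# exact test. The choice of test is an assumption; the abstract does
# not state which method produced p = 0.49.
from scipy.stats import fisher_exact

#         rebled  no rebleed
table = [[2, 17],   # 72-hour arm (n = 19)
         [0, 15]]   # 24-hour arm (n = 15)

_, p_value = fisher_exact(table, alternative="two-sided")
print(f"p = {p_value:.2f}")  # ~0.49, matching the abstract
```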

3.
Theor Appl Genet ; 137(9): 214, 2024 Sep 03.
Article in English | MEDLINE | ID: mdl-39223330

ABSTRACT

KEY MESSAGE: A GWAS in an elite diversity panel, evaluated across 10 environments, identified genomic regions regulating six fiber quality traits, facilitating genomics-assisted breeding and gene discovery in upland cotton. In this study, an elite diversity panel of 348 upland cotton accessions was evaluated in 10 environments across the US Cotton Belt and genotyped with the cottonSNP63K array for a genome-wide association study of six fiber quality traits. All fiber quality traits, upper half mean length (UHML: mm), fiber strength (FS: g tex⁻¹), fiber uniformity (FU: %), fiber elongation (FE: %), micronaire (MIC) and short fiber content (SFC: %), showed high broad-sense heritability (> 60%). All traits except FE showed high genomic heritability. UHML, FS and FU were all positively correlated with each other and negatively correlated with FE, MIC and SFC. GWAS of these six traits identified 380 significant marker-trait associations (MTAs), including 143 MTAs on 30 genomic regions. These 30 genomic regions included MTAs identified in at least three environments, and 23 of them were novel associations. Phenotypic variation explained by the MTAs in these 30 genomic regions ranged from 6.68 to 11.42%. Most of the fiber quality-associated genomic regions mapped to the D-subgenome. Further, this study confirmed the pleiotropic region on chromosome D11 (UHML, FS and FU) and identified novel co-localized regions on D04 (FU, SFC), D05 (UHML, FU), and D06 (UHML, FU). Marker haplotype analysis identified superior combinations of fiber quality-associated genomic regions with high trait values (UHML = 32.34 mm; FS = 32.73 g tex⁻¹; FE = 6.75%). Genomic analyses of traits, haplotype combinations and candidate gene information described in the current study could help leverage genetic diversity for targeted genetic improvement and gene discovery for fiber quality traits in cotton.
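Since the study reports broad-sense heritability above 60% for all six traits, here is a minimal sketch of the standard entry-mean heritability calculation for a multi-environment trial; the variance components below are hypothetical placeholders, not values from the study.

```python
# Entry-mean broad-sense heritability for a multi-environment trial:
# H2 = Vg / (Vg + Vge/e + Ve/(e*r)), where e is the number of
# environments and r the number of replicates per environment. The
# variance components below are hypothetical; the study evaluated
# 348 accessions in 10 environments.
def broad_sense_heritability(v_g, v_ge, v_e, n_env, n_reps):
    return v_g / (v_g + v_ge / n_env + v_e / (n_env * n_reps))

h2 = broad_sense_heritability(v_g=2.0, v_ge=1.5, v_e=3.0, n_env=10, n_reps=2)
print(f"H2 = {h2:.2f}")  # 0.87 here, i.e. "high" (> 0.60) as in the study
```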


Subject(s)
Cotton Fiber , Genotype , Gossypium , Phenotype , Quantitative Trait Loci , Gossypium/genetics , Gossypium/growth & development , Cotton Fiber/analysis , Polymorphism, Single Nucleotide , Genome-Wide Association Study , Genetic Association Studies , Linkage Disequilibrium , Chromosome Mapping/methods , Genome, Plant , Plant Breeding
4.
Article in English | MEDLINE | ID: mdl-39032694

ABSTRACT

BACKGROUND: Borderline Personality Disorder (BPD) is associated with heightened impulsivity, evidenced by increased substance abuse, self-harm and suicide attempts. Addressing impulsivity in individuals with BPD is a therapeutic objective, but its underlying neural basis in this clinical population remains unclear, partly due to its frequent comorbidity with attention-deficit/hyperactivity disorder (ADHD). METHODS: We employed a response inhibition paradigm - the interleaved pro-/anti-saccade task (IPAST) - among adolescents diagnosed with BPD with and without comorbid ADHD (N=25 and N=24, respectively) during concomitant video-based eye-tracking. We quantified various eye movement response parameters reflective of impulsive action during the task, including delay to fixation acquisition, fixation breaks, anticipatory saccades, and direction errors with express saccade (saccade reaction time [SRT]: 90-140 ms) and regular saccade latencies (SRT > 140 ms). RESULTS: Individuals with BPD exhibited deficient response preparation, exemplified by reduced visual fixation on task cues and greater variability of saccade responses (i.e., SRT and peak velocity). The ADHD/BPD group shared these traits and also produced an increased frequency of anticipatory responses and direction errors with express saccade latencies, as well as reduced error correction. CONCLUSIONS: Saccadic deficits in BPD and ADHD/BPD stem not from an inability to execute anti-saccades, but rather from inadequate preparation for the upcoming task set. These distinctions may arise from abnormal signaling in cortical areas such as the frontal eye fields, posterior parietal cortex, and anterior cingulate cortex. Understanding these mechanisms could provide insights into targeted interventions focusing on task set preparation to manage response inhibition deficits in BPD and ADHD/BPD.
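A minimal sketch of the latency-based classification implied above: the abstract defines the express (90-140 ms) and regular (> 140 ms) windows, and treating SRT < 90 ms as anticipatory is our assumption.

```python
# Classify saccade responses by saccade reaction time (SRT) using the
# windows given in the abstract: express saccades at 90-140 ms and
# regular saccades at > 140 ms. Treating SRT < 90 ms as anticipatory
# is an assumption; the abstract does not give that cutoff explicitly.
def classify_saccade(srt_ms: float) -> str:
    if srt_ms < 90:
        return "anticipatory"
    if srt_ms <= 140:
        return "express"
    return "regular"

for srt in (60.0, 110.0, 200.0):
    print(f"SRT {srt:.0f} ms -> {classify_saccade(srt)}")
```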

5.
Psychophysiology ; : e14657, 2024 Jul 29.
Article in English | MEDLINE | ID: mdl-39075668
6.
Gastrointest Endosc ; 100(2): 348, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39025602
8.
Gastroenterology ; 2024 Jul 20.
Article in English | MEDLINE | ID: mdl-39038762
9.
Dig Dis Sci ; 69(9): 3206-3213, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38977523

ABSTRACT

BACKGROUND: Endoscopic procedures are among the most commonly performed medical procedures, and the serious adverse event rate is reported to be 1-3 adverse events per 1000 procedures. AIMS: Here, we examined the safety of endoscopy specifically in cirrhotic populations. METHODS: We conducted a retrospective case (cirrhosis)-control (non-cirrhosis) study of the outcomes of patients undergoing endoscopy in a large academic medical center. The primary outcome was a procedural or post-procedural complication. Complete clinical data were collected for all patients undergoing endoscopic procedures, including esophagogastroduodenoscopy, colonoscopy, EUS, ERCP, flexible sigmoidoscopy, and others. Cirrhosis was carefully defined on clinico-pathological grounds. RESULTS: We identified 16,779 patients who underwent endoscopy, including 2618 with cirrhosis and 14,161 without cirrhosis. There were 167 complications (0.99%), which included 15/2618 cirrhotics (0.6%) and 152/14,161 non-cirrhotics (1.1%). The most common complications were cardiopulmonary (including hypotension and hypoxemia), found in 67% of patients; procedurally related complications occurred in 19% of patients. The complication rate was the same or lower in cirrhotics than in controls undergoing esophagogastroduodenoscopy (0.6% vs 0.9%, p = 0.03), colonoscopy (0.6% vs. 0.6%, p = NS), or ERCP (0.7% vs. 1.4%, p = NS). Logistic regression analysis identified the following features as associated with an increased risk of complication: inpatient status, history of myocardial infarction, and an EUS procedure. CONCLUSIONS: Endoscopy in cirrhotic patients was as safe as, or safer than, endoscopy in non-cirrhotic patients undergoing similar procedures.
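A minimal sketch of the kind of logistic regression described in the results; the column names and input file are hypothetical, and the study's actual model specification may have differed.

```python
# Hedged sketch of the complication risk model: logistic regression of
# complication (0/1) on inpatient status, history of myocardial
# infarction, and whether the procedure was an EUS. Column names and
# the data file are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("endoscopy_outcomes.csv")  # hypothetical data source
X = sm.add_constant(df[["inpatient", "prior_mi", "eus_procedure"]])
result = sm.Logit(df["complication"], X).fit()
print(result.summary())       # coefficients on the log-odds scale
print(np.exp(result.params))  # exponentiate for odds ratios
```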


Subject(s)
Liver Cirrhosis , Humans , Male , Liver Cirrhosis/complications , Retrospective Studies , Female , Middle Aged , Aged , Case-Control Studies , Adult , Endoscopy, Digestive System/adverse effects , Endoscopy, Digestive System/methods , Risk Factors
11.
Clin Gastroenterol Hepatol ; 22(8): 1575-1583, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38864796

ABSTRACT

DESCRIPTION: In this Clinical Practice Update (CPU), we will provide Best Practice Advice (BPA) guidance on the appropriate management of iron deficiency anemia. METHODS: This expert review was commissioned and approved by the AGA Institute Clinical Practice Updates Committee (CPUC) and the AGA Governing Board to provide timely guidance on a topic of high clinical importance to the AGA membership, and underwent internal peer review by the CPUC and external peer review through the standard procedures of Clinical Gastroenterology and Hepatology. These Best Practice Advice (BPA) statements were drawn from a review of the published literature and from expert opinion. Since systematic reviews were not performed, these BPA statements do not carry formal ratings regarding the quality of evidence or strength of the presented considerations. BEST PRACTICE ADVICE 1: No single formulation of oral iron has any advantage over any other. Ferrous sulfate is preferred as the least expensive iron formulation. BEST PRACTICE ADVICE 2: Give oral iron once a day at most. Every-other-day iron dosing may be better tolerated by some patients, with rates of iron absorption similar or equal to daily dosing. BEST PRACTICE ADVICE 3: Add vitamin C to oral iron supplementation to improve absorption. BEST PRACTICE ADVICE 4: Intravenous iron should be used if the patient does not tolerate oral iron, ferritin levels do not improve with a trial of oral iron, or the patient has a condition in which oral iron is not likely to be absorbed. BEST PRACTICE ADVICE 5: Intravenous iron formulations that can replace iron deficits with 1 or 2 infusions are preferred over those that require more than 2 infusions. BEST PRACTICE ADVICE 6: All intravenous iron formulations have similar risks; true anaphylaxis is very rare. The vast majority of reactions to intravenous iron are complement activation-related pseudo-allergy (infusion reactions) and should be treated as such. BEST PRACTICE ADVICE 7: Intravenous iron therapy should be used in individuals who have undergone bariatric procedures, particularly those that are likely to disrupt normal duodenal iron absorption, and who have iron-deficiency anemia with no identifiable source of chronic gastrointestinal blood loss. BEST PRACTICE ADVICE 8: In individuals with inflammatory bowel disease and iron-deficiency anemia, clinicians first should determine whether iron-deficiency anemia is owing to inadequate intake or absorption, or to loss of iron, typically from gastrointestinal bleeding. Active inflammation should be treated effectively to enhance iron absorption or reduce iron depletion. BEST PRACTICE ADVICE 9: Intravenous iron therapy should be given to individuals with inflammatory bowel disease, iron-deficiency anemia, and active inflammation with compromised absorption. BEST PRACTICE ADVICE 10: In individuals with portal hypertensive gastropathy and iron-deficiency anemia, oral iron supplements should be used initially to replenish iron stores. Intravenous iron therapy should be used in patients with ongoing bleeding who do not respond to oral iron therapy. BEST PRACTICE ADVICE 11: In individuals with portal hypertensive gastropathy and iron-deficiency anemia without another identified source of chronic blood loss, treatment of portal hypertension with nonselective β-blockers can be considered. BEST PRACTICE ADVICE 12: In individuals with iron-deficiency anemia secondary to gastric antral vascular ectasia who have an inadequate response to iron replacement, consider endoscopic therapy with endoscopic band ligation or thermal methods such as argon plasma coagulation. BEST PRACTICE ADVICE 13: In patients with iron-deficiency anemia and celiac disease, ensure adherence to a gluten-free diet to improve iron absorption. Consider oral iron supplementation based on the severity of iron deficiency and patient tolerance, followed by intravenous iron therapy if iron stores do not improve. BEST PRACTICE ADVICE 14: Deep enteroscopy in patients with iron-deficiency anemia suspected to have bleeding small-bowel angioectasias should be performed with a distal attachment to improve detection and facilitate treatment. Small-bowel angioectasias may be treated with ablative thermal therapies such as argon plasma coagulation or with mechanical methods such as hemostatic clips. BEST PRACTICE ADVICE 15: Endoscopic treatment of angioectasias should be accompanied by iron replacement. Medical therapy for small-bowel angioectasias should be reserved for compassionate treatment in refractory cases when iron replacement and endoscopic therapy are ineffective.
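A minimal sketch encoding the oral-versus-intravenous decision in Best Practice Advice 4; the boolean inputs are hypothetical simplifications of clinical judgment, not criteria taken verbatim from the update.

```python
# Simplified encoding of Best Practice Advice 4: intravenous iron if
# oral iron is not tolerated, ferritin fails to improve on a trial of
# oral iron, or a condition makes oral absorption unlikely. The inputs
# are hypothetical simplifications of clinical judgment.
def recommend_iron_route(oral_tolerated: bool,
                         ferritin_improved_on_oral_trial: bool,
                         oral_absorption_unlikely: bool) -> str:
    if (not oral_tolerated
            or not ferritin_improved_on_oral_trial
            or oral_absorption_unlikely):
        return "intravenous iron"
    return "oral iron (ferrous sulfate, once daily or every other day)"

print(recommend_iron_route(True, False, False))  # -> intravenous iron
```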


Subject(s)
Anemia, Iron-Deficiency , Humans , Anemia, Iron-Deficiency/therapy , Anemia, Iron-Deficiency/drug therapy , Iron/administration & dosage , Iron/therapeutic use , Administration, Oral , Disease Management
12.
Am J Gastroenterol ; 119(3): 438-449, 2024 03 01.
Article in English | MEDLINE | ID: mdl-38857483

ABSTRACT

Gastrointestinal (GI) bleeding is the most common GI diagnosis leading to hospitalization within the United States. Prompt diagnosis and treatment of GI bleeding is critical to improving patient outcomes and reducing high healthcare utilization and costs. Radiologic techniques including computed tomography angiography, catheter angiography, computed tomography enterography, magnetic resonance enterography, nuclear medicine red blood cell scan, and technetium-99m pertechnetate scintigraphy (Meckel scan) are frequently used to evaluate patients with GI bleeding and are complementary to GI endoscopy. However, multiple management guidelines exist which differ in the recommended utilization of these radiologic examinations. This variability can lead to confusion as to how these tests should be used in the evaluation of GI bleeding. In this document, a panel of experts from the American College of Gastroenterology and Society of Abdominal Radiology provide a review of the radiologic examinations used to evaluate for GI bleeding including nomenclature, technique, performance, advantages, and limitations. A comparison of advantages and limitations relative to endoscopic examinations is also included. Finally, consensus statements and recommendations on technical parameters and utilization of radiologic techniques for GI bleeding are provided.


Subject(s)
Gastrointestinal Hemorrhage , Humans , Gastrointestinal Hemorrhage/diagnostic imaging , Gastrointestinal Hemorrhage/diagnosis , Consensus , United States , Gastroenterology/standards , Societies, Medical , Diagnostic Imaging/methods , Diagnostic Imaging/standards , Endoscopy, Gastrointestinal
13.
Ann Gastroenterol ; 37(3): 303-312, 2024.
Article in English | MEDLINE | ID: mdl-38779640

ABSTRACT

Background: The aim of this study was to investigate the impact of blood transfusion (BT) on mortality and rebleeding in patients with gastrointestinal bleeding (GIB), and whether BT at a threshold of ≤7 g/dL may improve these outcomes. Methods: A prospective study was conducted in patients admitted with GIB between 2013 and 2021. Antithrombotic (AT) use and clinical outcomes were compared between transfused and non-transfused patients, and between those transfused at a threshold of ≤7 vs. >7 g/dL. Multivariate analysis was performed to identify predictors of mortality and rebleeding. Results: A total of 667 patients, including 383 transfused, were followed up for a median of 56 months. Predictors of end-of-follow-up mortality included the age-adjusted Charlson Comorbidity Index, stigmata of recent hemorrhage (SRH), and being on anticoagulants only upon presentation (P=0.026). SRH was a predictor of end-of-follow-up rebleeding, while having been on only antiplatelet therapy (AP) upon presentation was protective (P<0.001). BT was not associated with mortality or rebleeding at 1 month or at the end of follow-up. Among transfused patients, being discharged only on AP protected against mortality (P=0.044). BT at >7 g/dL did not affect the risk of short- or long-term rebleeding or mortality compared to BT at ≤7 g/dL. Conclusions: Short- and long-term mortality and rebleeding in GIB were not affected by BT, nor by a transfusion threshold of ≤7 vs. >7 g/dL, but were affected by the use of AT. Further studies that account for AT use are needed to determine the best transfusion strategy in GIB.

14.
JCI Insight ; 9(10)2024 May 22.
Article in English | MEDLINE | ID: mdl-38775155

ABSTRACT

Physician-scientists play a crucial role in advancing medical knowledge and patient care, yet the long periods of time required to complete training may impede expansion of this workforce. We examined the relationship between postgraduate training and time to receipt of NIH or Veterans Affairs career development awards (CDAs) for physician-scientists in internal medicine. Data from NIH RePORTER were analyzed for internal medicine residency graduates who received specific CDAs (K08, K23, K99, or IK2) in 2022. Additionally, information on degrees and training duration was collected. Internal medicine residency graduates constituted 19% of K awardees and 28% of IK2 awardees. Of MD-PhD internal medicine-trained graduates who received a K award, 92% received a K08 award; of MD-only graduates who received a K award, a majority received a K23 award. The median time from medical school graduation to CDA was 9.6 years for K awardees and 10.2 years for IK2 awardees. The time from medical school graduation to K or IK2 award was shorter for US MD-PhD graduates than US MD-only graduates. We propose that the time from medical school graduation to receipt of CDAs must be shortened to accelerate training and retention of physician-scientists.


Subject(s)
Education, Medical, Graduate , Internal Medicine , Humans , Internal Medicine/education , United States , Internship and Residency/statistics & numerical data , Biomedical Research/education , Physicians/statistics & numerical data , Research Personnel/statistics & numerical data , Research Personnel/education , Time Factors , Awards and Prizes , National Institutes of Health (U.S.) , United States Department of Veterans Affairs , Male , Female
15.
Dig Dis Sci ; 69(8): 3061-3068, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38782854

ABSTRACT

INTRODUCTION: Patients with cirrhosis are at risk for cardiac complications such as heart failure, particularly heart failure with preserved ejection fraction (HFpEF) due to left ventricular diastolic dysfunction (LVDD). The H2FPEF score is a predictive model used to identify patients with HFpEF. Our primary aim was to assess the H2FPEF score in patients with cirrhosis and determine its potential to identify patients at risk for heart failure after liver transplant. METHODS: This was a cohort study of patients undergoing liver transplant for cirrhosis between January 2010 and October 2018 who had a pre-transplant transthoracic echocardiogram. RESULTS: A total of 166 subjects with cirrhosis were included in the study. The majority were men (65%) and Caucasian (85%); NASH was the most common cause of cirrhosis (41%), followed by alcohol (34%). The median H2FPEF score was 2.0 (1.0-4.0). Patients with NASH cirrhosis had higher H2FPEF scores (3.22, 2.79-3.64) than those with alcohol-induced cirrhosis (1.89, 1.5-2.29, p < 0.001) or other causes of cirrhosis (1.73, 1.28-2.18, p < 0.001). All subjects with an H2FPEF score > 6 had NASH cirrhosis. There was no association between H2FPEF scores and measures of severity of liver disease (bilirubin, INR, or MELD score). Patients with heart failure after liver transplant had higher H2FPEF scores than those without heart failure (4.0, 3.1-4.9 vs. 2.3, 2.1-2.6, respectively; p = 0.015), but the score did not predict post-transplant mortality. CONCLUSION: H2FPEF scores are higher in cirrhosis patients with NASH and appear to be associated with post-transplant heart failure, but not with death.
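For readers unfamiliar with the score, here is a minimal sketch using the commonly published H2FPEF components (Reddy et al., 2018); the abstract does not define these, so this parameterization is an assumption.

```python
# H2FPEF score as commonly published (Reddy et al., 2018): BMI > 30
# (2 points), >= 2 antihypertensive medications (1), atrial
# fibrillation (3), pulmonary artery systolic pressure > 35 mmHg (1),
# age > 60 years (1), and E/e' > 9 (1). These components are not
# defined in the abstract, so treat this parameterization as an
# assumption.
def h2fpef_score(bmi, n_antihypertensives, afib, pasp_mmhg, age, e_over_eprime):
    score = 2 * (bmi > 30)
    score += n_antihypertensives >= 2
    score += 3 * afib
    score += pasp_mmhg > 35
    score += age > 60
    score += e_over_eprime > 9
    return int(score)  # 0-9; higher values favor HFpEF

print(h2fpef_score(bmi=33, n_antihypertensives=2, afib=False,
                   pasp_mmhg=30, age=58, e_over_eprime=10))  # -> 4
```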


Subject(s)
Heart Failure , Liver Cirrhosis , Liver Transplantation , Non-alcoholic Fatty Liver Disease , Humans , Male , Liver Transplantation/adverse effects , Female , Middle Aged , Heart Failure/etiology , Heart Failure/diagnosis , Non-alcoholic Fatty Liver Disease/complications , Non-alcoholic Fatty Liver Disease/epidemiology , Liver Cirrhosis/complications , Liver Cirrhosis/surgery , Aged , Stroke Volume , Retrospective Studies , Echocardiography , Risk Factors , Ventricular Dysfunction, Left/etiology , Ventricular Dysfunction, Left/physiopathology , Risk Assessment/methods
16.
Nat Plants ; 10(6): 1039-1051, 2024 06.
Article in English | MEDLINE | ID: mdl-38816498

ABSTRACT

Cotton (Gossypium hirsutum L.) is the key renewable fibre crop worldwide, yet its yield and fibre quality show high variability due to genotype-specific traits and complex interactions among cultivars, management practices and environmental factors. Modern breeding practices may limit future yield gains due to a narrow founding gene pool. Precision breeding and biotechnological approaches offer potential solutions, contingent on accurate cultivar-specific data. Here we address this need by generating high-quality reference genomes for three modern cotton cultivars ('UGA230', 'UA48' and 'CSX8308') and updating the 'TM-1' cotton genetic standard reference. Despite hypothesized genetic uniformity, considerable sequence and structural variation was observed among the four genomes, which overlap with ancient and ongoing genomic introgressions from 'Pima' cotton, gene regulatory mechanisms and phenotypic trait divergence. Differentially expressed genes across fibre development correlate with fibre production, potentially contributing to the distinctive fibre quality traits observed in modern cotton cultivars. These genomes and comparative analyses provide a valuable foundation for future genetic endeavours to enhance global cotton yield and sustainability.


Subject(s)
Genome, Plant , Gossypium , Plant Breeding , Gossypium/genetics , Gossypium/growth & development , Plant Breeding/methods , Cotton Fiber , Genetic Variation , Phenotype
17.
Article in English | MEDLINE | ID: mdl-38754795
19.
Expert Opin Drug Saf ; 23(4): 527-537, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38482670

ABSTRACT

BACKGROUND: Management of side effects in clinical trials has to balance the generation of meaningful data against risk to patients. One toxicity area requiring detailed management guidelines is drug-induced liver injury (DILI). In oncology trials, patients are often included despite baseline liver test abnormalities, and there is as yet no consensus on the levels of liver test change that should trigger action, such as drug interruption or discontinuation, in these patients. METHODS: We provide an innovative approach to managing hepatocellular DILI in oncology trials for patients with abnormal baseline alanine aminotransferase (ALT) levels. The proposed algorithm is based on mathematical derivation of action thresholds from those generally accepted for patients with normal baselines. RESULTS: The resulting algorithm is grouped by level of baseline abnormality and avoids calculation of baseline multiples. Suggested layered action levels are 4, 6, and 11 × the upper limit of normal (ULN) for patients with baseline ALT between 1.5 and 3 × ULN, and 6, 8, and 12 × ULN for patients with baseline ALT between 3 and 5 × ULN, respectively. CONCLUSIONS: Our concept and the resulting algorithm are consistent, transparent, and easy to follow, and the method of derivation from consensus-based thresholds may also be applicable to other drug toxicity areas.
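A minimal sketch of the layered thresholds described above; the boundary handling and the notion of numbered "layers" are hypothetical, since the abstract gives the threshold values but not the actions tied to each one.

```python
# Layered ALT action levels from the abstract, in multiples of the
# upper limit of normal (ULN): 4/6/11 x ULN for baseline ALT in
# [1.5, 3) x ULN, and 6/8/12 x ULN for baseline ALT in [3, 5] x ULN.
# The boundary handling and layer semantics are assumptions; the
# abstract specifies only the threshold values.
def alt_action_levels(baseline_alt_xuln: float) -> tuple:
    if 1.5 <= baseline_alt_xuln < 3:
        return (4, 6, 11)
    if 3 <= baseline_alt_xuln <= 5:
        return (6, 8, 12)
    raise ValueError("baseline ALT outside the bands covered by the algorithm")

def alt_action_layer(current_alt_xuln: float, baseline_alt_xuln: float) -> int:
    """Return 0 (below all thresholds) through 3 (highest layer reached)."""
    levels = alt_action_levels(baseline_alt_xuln)
    return sum(current_alt_xuln >= level for level in levels)

print(alt_action_layer(7, 2))  # -> 2: second layer (>= 6 x ULN) reached
```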


Subject(s)
Carcinoma, Hepatocellular , Chemical and Drug Induced Liver Injury , Drug-Related Side Effects and Adverse Reactions , Liver Neoplasms , Humans , Liver Neoplasms/drug therapy , Chemical and Drug Induced Liver Injury/diagnosis , Chemical and Drug Induced Liver Injury/etiology , Alanine Transaminase , Liver
20.
Plant Dis ; 2024 Mar 27.
Article in English | MEDLINE | ID: mdl-38537146

ABSTRACT

Cranberry (Vaccinium macrocarpon L.) is a commercial small fruit that is native to North America. Oregon ranks fourth in cranberry production in the U.S.A., with 1,052 ha of cranberry beds and annual production of 23,590 metric tonnes (USDA NASS 2021). Cranberry fruit rot diseases are caused by a complex of 15 fungal pathogens belonging to 10 genera (Polashock et al. 2017). In fruit rot surveys of 'Stevens' cranberry beds in Coos and Curry Counties, Oregon, berries were collected before harvest and sorted by symptoms (e.g., softening, shriveling, or discoloration). Cranberries with field rot symptoms were surface-disinfested and lesion margin tissue was placed on V8 agar. Cultures were incubated at room temperature, and outgrowing fungi were transferred twice onto fresh V8 agar to obtain single isolates. Storage rots that developed on asymptomatic cranberries held at 4°C for 8 weeks were processed similarly. Since 2017, we have periodically isolated Neofabraea actinidiae, which is not a member of the cranberry fruit rot complex, from rotted cranberries (Polashock et al. 2017). In 2022, N. actinidiae was isolated from 22% of 45 cranberries that were collected from an organically managed farm and developed storage rot, and from 6% of 50 storage-rot cranberries from three conventionally managed farms. Symptoms associated with N. actinidiae on 'Stevens' cranberries often include softening and brown tissue surrounded by a yellow ring. On V8 agar, N. actinidiae grew as compact, white, circular colonies with dense aerial hyphae near the center, accompanied by a red pigment in the agar. Pink, mucoid, irregular conidiomata often developed on the colony after 3 weeks. Conidia were hyaline, aseptate, and ellipsoidal to fusiform, ranging from 7.5 to 12.6 µm long × 3.5 to 5.6 µm wide (n=100). Genomic DNA was extracted from N. actinidiae isolates obtained from cranberries in 2017 and 2022. β-tubulin and the ITS region were amplified and sequenced with primers Bt-T2m-Up/Bt-LEV-Lo1 and ITS5/ITS4 using the conditions of de Jong et al. (2001) and White et al. (1990), respectively. Sequences were deposited in NCBI GenBank (accessions OR606303, OR606305, OR606306, and OR606309 for ITS; OR610314, OR610316, OR610317, and OR610320 for β-tubulin). BLASTn analysis of β-tubulin (695 bp) and ITS (489-490 bp) sequences showed 99.6 to 99.8% and 99.8 to 100% identity, respectively, to Neofabraea actinidiae (CBS 121403) (accessions KR859285 for β-tubulin and KR859079 for ITS). Phylogenetic analysis of concatenated sequences using Tamura-Nei neighbor-joining (Tamura et al. 2004) confirmed the identity of the cranberry isolates as N. actinidiae. Koch's postulates were confirmed with four N. actinidiae isolates from cranberry. Agar or hyphal plugs were placed in a 3 mm wound on the side of six surface-disinfested, asymptomatic berries and incubated in a moist chamber at 20°C for 15 days or at 4°C for 55 days. Similar symptoms developed on each berry inoculated with N. actinidiae, but not with agar alone. The fungus was recovered from symptomatic tissues and its identity confirmed by colony morphology. N. actinidiae causes a ripe rot and storage rot in kiwifruit and is one of the species causing bull's eye rot of pome fruits (Tyson et al. 2019). Cryptosporiopsis actinidiae (anamorph) was isolated from cranberry roots in British Columbia, Canada, but was considered unlikely to be the causal agent of dieback disease of cranberry vines (Sabaratnam et al. 2009). We have demonstrated that N. actinidiae causes a cranberry fruit rot in beds and in storage. Its prevalence, associated fruit rot symptoms, and disease incidence will continue to be monitored.
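A minimal sketch of a neighbor-joining identification workflow like the one described; it uses Biopython's built-in identity distance rather than the Tamura-Nei model reported in the study, and the alignment file name is hypothetical.

```python
# Hedged sketch of a neighbor-joining tree from aligned ITS sequences,
# in the spirit of the analysis above. Biopython's 'identity' distance
# stands in for the Tamura-Nei model reported in the study, and the
# alignment file is a hypothetical placeholder.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

alignment = AlignIO.read("its_alignment.fasta", "fasta")  # pre-aligned sequences
calculator = DistanceCalculator("identity")
constructor = DistanceTreeConstructor(calculator, method="nj")
tree = constructor.build_tree(alignment)
Phylo.draw_ascii(tree)
```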
