1.
Virus Res ; 348: 199446, 2024 Oct.
Article in English | MEDLINE | ID: mdl-39127239

ABSTRACT

The Human papillomavirus (HPV) causes tumors in part by hijacking the host cell cycle and forcing uncontrolled cellular division. While there are >200 genotypes of HPV, 15 are classified as high-risk and have been shown to transform infected cells and contribute to tumor formation. The remaining low-risk genotypes are not considered oncogenic and result in benign skin lesions. In high-risk HPV, the oncoprotein E7 contributes to the dysregulation of cell cycle regulatory mechanisms. High-risk E7 is phosphorylated in cells at two conserved serine residues by Casein Kinase 2 (CK2) and this phosphorylation event increases binding affinity for cellular proteins such as the tumor suppressor retinoblastoma (pRb). While low-risk E7 possesses similar serine residues, it is phosphorylated to a lesser degree in cells and has decreased binding capabilities. When E7 binding affinity is decreased, it is less able to facilitate complex interactions between proteins and therefore has less capability to dysregulate the cell cycle. By comparing E7 protein sequences from both low- and high-risk HPV variants and using site-directed mutagenesis combined with NMR spectroscopy and cell-based assays, we demonstrate that the presence of two key nonpolar valine residues within the CK2 recognition sequence, present in low-risk E7, reduces serine phosphorylation efficiency relative to high-risk E7. This results in significant loss of the ability of E7 to degrade the retinoblastoma tumor suppressor protein, thus also reducing the ability of E7 to increase cellular proliferation and reduce senescence. This provides additional insight into the differential E7-mediated outcomes when cells are infected with high-risk versus low-risk HPV. Understanding these oncogenic differences may be important to developing targeted treatment options for HPV-induced cancers.


Subject(s)
Papillomavirus E7 Proteins , Phosphorylation , Papillomavirus E7 Proteins/metabolism , Papillomavirus E7 Proteins/genetics , Humans , Casein Kinase II/metabolism , Casein Kinase II/genetics , Papillomavirus Infections/virology , Papillomavirus Infections/metabolism , Papillomavirus Infections/genetics , Protein Binding , Retinoblastoma Protein/metabolism , Retinoblastoma Protein/genetics , Papillomaviridae/genetics , Papillomaviridae/metabolism , Papillomaviridae/physiology , Cell Cycle , Mutagenesis, Site-Directed
2.
J Sch Health ; 2024 Aug 05.
Article in English | MEDLINE | ID: mdl-39103203

ABSTRACT

BACKGROUND: Millions of children are diagnosed with a traumatic brain injury (TBI) each year, most being mild TBI (mTBI). The effect of mTBIs on academic performance is of significant importance. We investigate mTBI's impact on parent-reported academic outcomes in school-aged pediatric participants. METHODS: This cross-sectional survey study queried parents (N = 285) regarding letter grade performance and the presence or absence of academic accommodations before and after an mTBI, including complicated mTBI (c-mTBI, or mTBI with radiographic abnormality). RESULTS: We found a parent-reported decline in letter grades following c-mTBI (p < .001), with no significant change following uncomplicated mTBIs. Degree and length of recovery were also associated with grade changes (p < .05). Those with no academic accommodations prior to the injury showed significant decreases in grades after injury regardless of post-injury accommodation status (p < .05). IMPLICATIONS OF SCHOOL HEALTH POLICY, PRACTICE, AND EQUITY: This study underscores the need for an improved framework of support to maximize academic performance of children following mTBI, especially in those with a c-mTBI and still recovering from their injury. CONCLUSION: Our study identifies children who are at risk for adverse academic outcomes following mTBI. We encourage efforts to better support school nurses in this effort, including improved communication between health care teams and school teams.

3.
Pediatr Qual Saf ; 9(3): e738, 2024.
Article in English | MEDLINE | ID: mdl-38868756

ABSTRACT

Introduction: Asthma exacerbations are common presentations to pediatric emergency departments. Standard treatment for moderate-to-severe exacerbations includes administration of oral corticosteroids concurrently with bronchodilators. Early administration of corticosteroids has been shown to decrease emergency department length of stay (LOS) and hospitalizations. Our SMART aim was to reduce the time from arrival to oral corticosteroids (dexamethasone) administration in pediatric patients ≥2 years of age with an initial Pediatric Asthma Severity Score >6 from 60 to 30 minutes within 6 months. Methods: We used the model for improvement with collaboration between ED physicians, nursing, pharmacy, and respiratory therapists. Interventions included nursing education, dosage rounding in the electronic medical record, supplying triage with 1-mg tablets and a pill crusher, updates to an asthma nursing order set, and pertinent chief complaints triggering nurses to document a Pediatric Asthma Severity Score in the electronic medical record and use the order set. Our primary outcome measure was the time from arrival to dexamethasone administration. Secondary outcome measures included ED LOS for discharged patients and admission rate. We used statistical process control to analyze changes in measures over time. Results: From October 2021 to March 2022, the average time for dexamethasone administration decreased from 59 to 38 minutes. ED LOS for discharged asthma exacerbation patients rose with overall ED LOS for all patients during the study period. There was no change in the admission rate. Conclusions: Using quality improvement methodology, we successfully decreased the time from ED arrival to administration of dexamethasone in asthma exacerbation patients from 59 to 38 minutes over 10 months.

4.
Chest ; 166(4): 743-753, 2024 Oct.
Article in English | MEDLINE | ID: mdl-38788896

ABSTRACT

BACKGROUND: The last national estimates of US ICU physician staffing are 25 years old and lack information about interprofessional teams. RESEARCH QUESTION: How are US adult ICUs currently staffed? STUDY DESIGN AND METHODS: We conducted a cross-sectional survey (May 4, 2022-February 2, 2023) of adult ICU clinicians (targeting nurse/physician leadership) contacted using 2020 American Hospital Association (AHA) database information and, secondarily, through professional organizations. The survey included questions about interprofessional ICU staffing availability and roles at steady state (pre-COVID-19). We linked survey data to hospital data in the AHA database to create weighted national estimates by extrapolating ICU staffing data to nonrespondent hospitals based on hospital characteristics. RESULTS: The cohort consisted of 596 adult ICUs (response rates: AHA contacts: 2.1%; professional organizations: unknown) with geographic diversity and size variability (median, 20 beds; interquartile range, 12-25); most cared for mixed populations (414 [69.5%]), yet medical (55 [9.2%]), surgical (70 [11.7%]), and specialty (57 [9.6%]) ICUs were well represented. A total of 554 (93.0%) had intensivists available, with intensivists covering all patients in 75.6% of these and onsite 24 h/d in one-half (53.3% weekdays; 51.8% weekends). Of all ICUs, 69.8% had physicians-in-training and 77.7% had nurse practitioners/physician assistants. For patients on mechanical ventilation, nurse to patient ratios were 1:2 in 89.6% of ICUs. Clinical pharmacists were available in 92.6%, and respiratory therapists were available in 98.8%. We estimated 85.1% (95% CI, 84.5%-85.7%) of hospitals nationally had ICUs with intensivists, 51.6% (95% CI, 50.6%-52.5%) had physicians-in-training, 72.1% (95% CI, 71.3%-72.9%) had nurse practitioners/physician assistants, 98.5% (95% CI, 98.4%-98.7%) had respiratory therapists, and 86.9% (95% CI, 86.4%-87.4%) had clinical pharmacists.
For patients on mechanical ventilation, 86.4% (95% CI, 85.8%-87.0%) used 1:2 nurses/patients. INTERPRETATION: We found that intensivist presence in adult US ICUs has greatly increased over 25 years. Intensivists, respiratory therapists, and clinical pharmacists are commonly available, and each nurse usually provides care for two patients on mechanical ventilation. However, team composition and workload vary.


Subject(s)
Intensive Care Units , Personnel Staffing and Scheduling , Humans , United States , Intensive Care Units/statistics & numerical data , Intensive Care Units/organization & administration , Cross-Sectional Studies , Personnel Staffing and Scheduling/statistics & numerical data , COVID-19/epidemiology , Workforce/statistics & numerical data , Adult , Surveys and Questionnaires
5.
Crit Care Med ; 51(10): 1285-1293, 2023 10 01.
Article in English | MEDLINE | ID: mdl-37246915

ABSTRACT

OBJECTIVE: Predictive models developed for use in ICUs have been based on retrospectively collected data, which does not take into account the challenges associated with live, clinical data. This study sought to determine if a previously constructed predictive model of ICU mortality (ViSIG) is robust when using data collected prospectively in near real-time. DESIGN: Prospectively collected data were aggregated and transformed to evaluate a previously developed rolling predictor of ICU mortality. SETTING: Five adult ICUs at Robert Wood Johnson-Barnabas University Hospital and one adult ICU at Stamford Hospital. PATIENTS: One thousand eight hundred and ten admissions from August to December 2020. MEASUREMENTS AND MAIN RESULTS: The ViSIG Score comprises severity weights for heart rate, respiratory rate, oxygen saturation, mean arterial pressure, mechanical ventilation, and values for OBS Medical's Visensia Index. This information was collected prospectively, whereas data on discharge disposition were collected retrospectively to measure the ViSIG Score's accuracy. The distribution of patients' maximum ViSIG Score was compared with ICU mortality rate, and cut points determined where changes in mortality probability were greatest. The ViSIG Score was validated on new admissions. The ViSIG Score was able to stratify patients into three groups: 0-37 (low risk), 38-58 (moderate risk), and 59-100 (high risk), with mortality of 1.7%, 12.0%, and 39.8%, respectively (p < 0.001). The sensitivity and specificity of the model to predict mortality for the high-risk group were 51% and 91%. Performance on the validation dataset remained high. There were similar increases across risk groups for length of stay, estimated costs, and readmission. CONCLUSIONS: Using prospectively collected data, the ViSIG Score produced risk groups for mortality with good sensitivity and excellent specificity.
A future study will evaluate making the ViSIG Score visible to clinicians to determine whether this metric can influence clinician behavior to reduce adverse outcomes.
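The reported stratification amounts to a simple threshold rule on the score. As an editorial illustration (the function name and structure are ours, not OBS Medical's; the cut points and mortality rates are taken from the abstract), a minimal sketch:

```python
def visig_risk_group(score: float) -> str:
    """Map a maximum ViSIG Score (0-100) to the risk strata reported
    in the abstract. Illustrative only; not the vendor's implementation."""
    if not 0 <= score <= 100:
        raise ValueError("ViSIG Score must be between 0 and 100")
    if score <= 37:
        return "low"       # observed ICU mortality ~1.7%
    if score <= 58:
        return "moderate"  # observed ICU mortality ~12.0%
    return "high"          # observed ICU mortality ~39.8%
```

Under these cut points a maximum score of 45, for example, would fall in the moderate-risk stratum.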


Subject(s)
Critical Illness , Intensive Care Units , Adult , Humans , Retrospective Studies , Treatment Outcome , Hospital Mortality , Risk Factors
6.
J Environ Manage ; 337: 117690, 2023 Jul 01.
Article in English | MEDLINE | ID: mdl-36933535

ABSTRACT

Wetlands provide essential ecosystem services, including nutrient cycling, flood protection, and biodiversity support, that are sensitive to changes in wetland hydrology. Wetland hydrological inputs come from precipitation, groundwater discharge, and surface run-off. Changes to these inputs via climate variation, groundwater extraction, and land development may alter the timing and magnitude of wetland inundation. Here, we use a long-term (14-year) comparative study of 152 depressional wetlands in west-central Florida to identify sources of variation in wetland inundation during two key time periods, 2005-2009 and 2010-2018. These time periods are separated by the enactment of water conservation policies in 2009, which included regional reductions in groundwater extraction. We investigated the response of wetland inundation to the interactive effects of precipitation, groundwater extraction, surrounding land development, basin geomorphology, and wetland vegetation class. Results show that water levels were lower and hydroperiods were shorter in wetlands of all vegetation classes during the first (2005-2009) time period, which corresponded with low rainfall conditions and high rates of groundwater extraction. Under water conservation policies enacted in the second (2010-2018) time period, median wetland water depths increased 1.35 m and median hydroperiods increased from 46% to 83%. Water-level variation was additionally less sensitive to groundwater extraction. The increase in inundation differed among vegetation classes with some wetlands not displaying signs of hydrological recovery. After accounting for effects of several explanatory factors, inundation still varied considerably among wetlands, suggesting a diversity of hydrological regimes, and thus ecological function, among individual wetlands across the landscape.
Policies seeking to balance human water demand with the preservation of depressional wetlands would benefit by recognizing the heightened sensitivity of wetland inundation to groundwater extraction during periods of low precipitation.


Subject(s)
Groundwater , Wetlands , Humans , Ecosystem , Fresh Water , Water
7.
R Soc Open Sci ; 9(6): 220582, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35706674

ABSTRACT

Deforestation alters wildlife communities and modifies human-wildlife interactions, often increasing zoonotic spillover potential. When deforested land reverts to forest, species composition differences between primary and regenerating (secondary) forest could alter spillover risk trajectory. We develop a mathematical model of land-use change, where habitats differ in their relative spillover risk, to understand how land reversion influences spillover risk. We apply this framework to scenarios where spillover risk is higher in deforested land than mature forest, reflecting higher relative abundance of highly competent species and/or increased human-wildlife encounters, and where regenerating forest has either very low or high spillover risk. We find the forest regeneration rate, the spillover risk of regenerating forest relative to deforested land, and how rapidly regenerating forest regains attributes of mature forest determine landscape-level spillover risk. When regenerating forest has a much lower spillover risk than deforested land, reversion lowers cumulative spillover risk, but instantaneous spillover risk peaks earlier. However, when spillover risk is high in regenerating and cleared habitats, landscape-level spillover risk remains high, especially when cleared land is rapidly abandoned then slowly regenerates to mature forest. These results suggest that proactive wildlife management and awareness of human exposure risk in regenerating forests could be important tools for spillover mitigation.

8.
Crit Care Med ; 50(7): 1148-1149, 2022 Jul 01.
Article in English | MEDLINE | ID: mdl-35726978
9.
Sci Total Environ ; 835: 155391, 2022 Aug 20.
Article in English | MEDLINE | ID: mdl-35461930

ABSTRACT

Invasive alien species (IAS) are a major driver of global biodiversity loss, hampering conservation efforts and disrupting ecosystem functions and services. While accumulating evidence documented ecological impacts of IAS across major geographic regions, habitat types and taxonomic groups, appraisals for economic costs remained relatively sparse. This has hindered effective cost-benefit analyses that inform expenditure on management interventions to prevent, control, and eradicate IAS. Terrestrial invertebrates are a particularly pervasive and damaging group of invaders, with many species compromising primary economic sectors such as forestry, agriculture and health. The present study provides synthesised quantifications of economic costs caused by invasive terrestrial invertebrates on the global scale and across a range of descriptors, using the InvaCost database. Invasive terrestrial invertebrates cost the global economy US$ 712.44 billion over the investigated period (up to 2020), considering only high-reliability source reports. Overall, costs were not equally distributed geographically: North America (73%) reported the greatest costs, with far lower costs reported in Europe (7%), Oceania (6%), Africa (5%), Asia (3%), and South America (< 1%). These costs were mostly due to invasive insects (88%) and mostly resulted from direct resource damages and losses (75%), particularly in agriculture and forestry; relatively little (8%) was invested in management. A minority of monetary costs was directly observed (17%). Economic costs displayed an increasing trend with time, with an average annual cost of US$ 11.40 billion since 1960, rising to as much as US$ 165.01 billion in 2020, although reporting lags reduced costs in recent years.
The massive global economic costs of invasive terrestrial invertebrates require urgent consideration and investment by policymakers and managers, in order to prevent and remediate the economic and ecological impacts of these and other IAS groups.


Subject(s)
Ecosystem , Introduced Species , Animals , Biodiversity , Invertebrates , Reproducibility of Results
10.
Sci Total Environ ; 819: 153404, 2022 May 01.
Article in English | MEDLINE | ID: mdl-35148893

ABSTRACT

The global increase in biological invasions is placing growing pressure on the management of ecological and economic systems. However, the effectiveness of current management expenditure is difficult to assess due to a lack of standardised measurement across spatial, taxonomic and temporal scales. Furthermore, there is no quantification of the spending difference between pre-invasion (e.g. prevention) and post-invasion (e.g. control) stages, although preventative measures are considered to be the most cost-effective. Here, we use a comprehensive database of invasive alien species economic costs (InvaCost) to synthesise and model the global management costs of biological invasions, in order to provide a better understanding of the stage at which these expenditures occur. Since 1960, reported management expenditures have totalled at least US$95.3 billion (in 2017 values), considering only highly reliable and actually observed costs - 12 times less than damage costs from invasions ($1130.6 billion). Pre-invasion management spending ($2.8 billion) was over 25 times lower than post-invasion expenditure ($72.7 billion). Management costs were heavily geographically skewed towards North America (54%) and Oceania (30%). The largest shares of expenditures were directed towards invasive alien invertebrates in terrestrial environments. Spending on invasive alien species management has grown by two orders of magnitude since 1960, reaching an estimated $4.2 billion per year globally (in 2017 values) in the 2010s, but remains 1-2 orders of magnitude lower than damages. National management spending increased with incurred damage costs, with management actions delayed on average by 11 years globally following damage reporting. These management delays on the global level have caused an additional invasion cost of approximately $1.2 trillion, compared to scenarios with immediate management.
Our results indicate insufficient management - particularly pre-invasion - and urge better investment to prevent future invasions and to control established alien species. Recommendations to improve reported management cost comprehensiveness, resolution and terminology are also made.
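The headline ratios in the abstract above follow directly from the reported totals; as a quick arithmetic check (all values in US$ billions, 2017 values, taken from the abstract; variable names are ours):

```python
# Totals reported in the abstract (US$ billions, 2017 values)
damage_total = 1130.6      # damage costs from invasions
management_total = 95.3    # reported management expenditure since 1960
pre_invasion = 2.8         # prevention-stage spending
post_invasion = 72.7       # post-invasion (control/eradication) spending

damage_to_management = round(damage_total / management_total, 1)
post_to_pre = round(post_invasion / pre_invasion, 1)

print(damage_to_management)  # ~11.9: damages roughly 12 times management spending
print(post_to_pre)           # ~26.0: post-invasion spending over 25 times pre-invasion
```

Both printed ratios reproduce the "12-times" and "over 25-times" figures quoted in the abstract.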


Subject(s)
Ecosystem , Introduced Species , Animals , Invertebrates , North America
11.
Sci Total Environ ; 806(Pt 3): 151318, 2022 Feb 01.
Article in English | MEDLINE | ID: mdl-34743879

ABSTRACT

The United States has thousands of invasive species, representing a sizable, but unknown burden to the national economy. Given the potential economic repercussions of invasive species, quantifying these costs is of paramount importance both for national economies and invasion management. Here, we used a novel global database of invasion costs (InvaCost) to quantify the overall costs of invasive species in the United States across spatiotemporal, taxonomic, and socioeconomic scales. From 1960 to 2020, reported invasion costs totaled $4.52 trillion (USD 2017). Considering only observed, highly reliable costs, this total cost reached $1.22 trillion with an average annual cost of $19.94 billion/year. These costs increased from $2.00 billion annually between 1960 and 1969 to $21.08 billion annually between 2010 and 2020. Most costs (73%) were related to resource damages and losses ($896.22 billion), as opposed to management expenditures ($46.54 billion). Moreover, the majority of costs were reported from invaders from terrestrial habitats ($643.51 billion, 53%) and agriculture was the most impacted sector ($509.55 billion). From a taxonomic perspective, mammals ($234.71 billion) and insects ($126.42 billion) were the taxonomic groups responsible for the greatest costs. Considering the apparent rising costs of invasions, coupled with increasing numbers of invasive species and the current lack of cost information for most known invaders, our findings provide critical information for policymakers and managers.


Subject(s)
Ecosystem , Introduced Species , Agriculture , Animals , Cost of Illness , Health Care Costs , Insecta , United States
12.
Ecology ; 103(2): e03577, 2022 02.
Article in English | MEDLINE | ID: mdl-34714929

ABSTRACT

Populations and communities fluctuate in their overall numbers through time, and the magnitude of fluctuations in individual species may scale to communities. However, the composite variability at the community scale is expected to be tempered by opposing fluctuations in individual populations, a phenomenon often called the portfolio effect. Understanding population variability, how it scales to community variability, and the spatial scaling in this variability are pressing needs given shifting environmental conditions and community composition. We explore evidence for portfolio effects using null community simulations and a large collection of empirical community time series from the BioTIME database. Additionally, we explore the relative roles of habitat type and geographic location on population and community temporal variability. We find strong portfolio effects in our theoretical community model, but weak effects in empirical data, suggesting a role for shared environmental responses, interspecific competition, or a litany of other factors. Furthermore, we observe a clear latitudinal signal - and differences among habitat types - in population and community variability. Together, this highlights the need to develop realistic models of community dynamics, and hints at spatial, and underlying environmental, gradients in variability in both population and community dynamics.


Subject(s)
Ecosystem , Models, Theoretical , Population Dynamics
13.
Sci Total Environ ; 803: 149875, 2022 Jan 10.
Article in English | MEDLINE | ID: mdl-34478901

ABSTRACT

Invasive alien fishes have had pernicious ecological and economic impacts on both aquatic ecosystems and human societies. However, a comprehensive and collective assessment of their monetary costs is still lacking. In this study, we collected and reviewed reported data on the economic impacts of invasive alien fishes using InvaCost, the most comprehensive global database of invasion costs. We analysed how total (i.e. both observed and potential/predicted) and observed (i.e. empirically incurred only) costs of fish invasions are distributed geographically and temporally and assessed which socioeconomic sectors are most affected. Fish invasions have potentially caused the economic loss of at least US$37.08 billion (US2017 value) globally, from just 27 reported species. North America reported the highest costs (>85% of the total economic loss), followed by Europe, Oceania and Asia, with no costs yet reported from Africa or South America. Only 6.6% of the total reported costs were from invasive alien marine fish. The costs that were observed amounted to US$2.28 billion (6.1% of total costs), indicating that the costs of damage caused by invasive alien fishes are often extrapolated and/or difficult to quantify. Most of the observed costs were related to damage and resource losses (89%). Observed costs mainly affected public and social welfare (63%), with the remainder borne by fisheries, authorities and stakeholders through management actions, environmental, and mixed sectors. Total costs related to fish invasions have increased significantly over time.

Subject(s)
Ecosystem , Introduced Species , Animals , Europe , Fisheries , Fishes , Humans
14.
Crit Care Med ; 49(11): e1177, 2021 11 01.
Article in English | MEDLINE | ID: mdl-34643584
16.
Sci Total Environ ; 775: 145238, 2021 Jun 25.
Article in English | MEDLINE | ID: mdl-33715860

ABSTRACT

Much research effort has been invested in understanding ecological impacts of invasive alien species (IAS) across ecosystems and taxonomic groups, but empirical studies about economic effects lack synthesis. Using a comprehensive global database, we determine patterns and trends in economic costs of aquatic IAS by examining: (i) the distribution of these costs across taxa, geographic regions and cost types; (ii) the temporal dynamics of global costs; and (iii) knowledge gaps, especially compared to terrestrial IAS. Based on the costs recorded from the existing literature, the global cost of aquatic IAS conservatively summed to US$345 billion, with the majority attributed to invertebrates (62%), followed by vertebrates (28%), then plants (6%). The largest costs were reported in North America (48%) and Asia (13%), and were principally a result of resource damages (74%); only 6% of recorded costs were from management. The magnitude and number of reported costs were highest in the United States of America and for semi-aquatic taxa. Many countries and known aquatic alien species had no reported costs, especially in Africa and Asia. Accordingly, a network analysis revealed limited connectivity among countries, indicating disparate cost reporting. Aquatic IAS costs have increased in recent decades by several orders of magnitude, reaching at least US$23 billion in 2020. Costs are likely considerably underrepresented compared to terrestrial IAS; only 5% of reported costs were from aquatic species, despite 26% of known invaders being aquatic. Additionally, only 1% of aquatic invasion costs were from marine species. Costs of aquatic IAS are thus substantial, but likely underreported. Costs have increased over time and are expected to continue rising with future invasions. We urge increased and improved cost reporting by managers, practitioners and researchers to reduce knowledge gaps. 
Few costs are proactive investments; increased management spending is urgently needed to prevent and limit current and future aquatic IAS damages.


Subject(s)
Ecosystem , Introduced Species , Africa , Animals , Asia , North America
17.
J Transl Med ; 18(1): 374, 2020 10 02.
Article in English | MEDLINE | ID: mdl-33008420

ABSTRACT

BACKGROUND: Cannabis has been documented for use in alleviating anxiety. However, certain research has also shown that it can produce feelings of anxiety, panic, paranoia and psychosis. In humans, Δ9-tetrahydrocannabinol (THC) has been associated with an anxiogenic response, while anxiolytic activity has been attributed mainly to cannabidiol (CBD). In animal studies, the effects of THC are highly dose-dependent, and biphasic effects of cannabinoids on anxiety-related responses have been extensively documented. A more precise assessment is required of both the anxiolytic and anxiogenic potentials of phytocannabinoids, with an aim towards the development of the 'holy grail' in cannabis research, a medicinally active formulation which may assist in the treatment of anxiety or mood disorders without eliciting any anxiogenic effects. OBJECTIVES: To systematically review studies assessing cannabinoid interventions (e.g. THC or CBD or whole cannabis interventions) both in animals and humans, as well as recent epidemiological studies reporting on anxiolytic or anxiogenic effects from cannabis consumption. METHOD: The articles selected for this review were identified up to January 2020 through searches in the electronic databases OVID MEDLINE, Cochrane Central Register of Controlled Trials, PubMed, and PsycINFO. RESULTS: Acute doses of CBD were found to reduce anxiety both in animals and humans, without having an anxiogenic effect at higher doses. Epidemiological studies tend to support an anxiolytic effect from the consumption of either CBD or THC, as well as whole plant cannabis. Conversely, the available human clinical studies demonstrate a common anxiogenic response to THC (especially at higher doses). CONCLUSION: Based on current data, cannabinoid therapies (containing primarily CBD) may provide a more suitable treatment for people with pre-existing anxiety or as a potential adjunctive role in managing anxiety or stress-related disorders.
However, further research is needed to explore other cannabinoids and phytochemical constituents present in cannabis (e.g. terpenes) as anxiolytic interventions. Future clinical trials involving patients with anxiety disorders are warranted due to the small number of available human studies.


Subject(s)
Anti-Anxiety Agents , Cannabidiol , Cannabis , Animals , Anti-Anxiety Agents/pharmacology , Anti-Anxiety Agents/therapeutic use , Anxiety/drug therapy , Anxiety Disorders/drug therapy , Cannabidiol/pharmacology , Cannabidiol/therapeutic use , Humans
19.
FEMS Microbiol Lett ; 367(13)2020 07 01.
Article in English | MEDLINE | ID: mdl-32589217

ABSTRACT

Autotrophic microorganisms catalyze the entry of dissolved inorganic carbon (DIC = CO2 + HCO3⁻ + CO3²⁻) into the biological component of the global carbon cycle, despite dramatic differences in DIC abundance and composition in their sometimes extreme environments. "Cyanobacteria" are known to have CO2 concentrating mechanisms (CCMs) to facilitate growth under low CO2 conditions. These CCMs consist of carboxysomes, containing the enzymes ribulose 1,5-bisphosphate carboxylase/oxygenase and carbonic anhydrase, partnered to DIC transporters. CCMs and their DIC transporters have been studied in a handful of other prokaryotes, but it was not known how common CCMs were beyond "Cyanobacteria". Since it had previously been noted that genes encoding potential transporters were found neighboring carboxysome loci, α-carboxysome loci were gathered from bacterial genomes, and potential transporter genes neighboring these loci are described here. Members of transporter families whose members all transport DIC (CHC, MDT and Sbt) were common in these neighborhoods, as were members of the SulP transporter family, many of which transport DIC. 109 of 115 taxa with carboxysome loci have some form of DIC transporter encoded in their genomes, suggesting that CCMs consisting of carboxysomes and DIC transporters are widespread not only among "Cyanobacteria", but also among members of "Proteobacteria" and "Actinobacteria".


Subject(s)
Bacteria/genetics , Carbon Dioxide/metabolism , Genes, Bacterial/genetics , Genetic Variation , Membrane Transport Proteins/genetics , Bacteria/metabolism , Biological Transport/genetics
20.
Resuscitation ; 153: 105-110, 2020 08.
Article in English | MEDLINE | ID: mdl-32504768

ABSTRACT

BACKGROUND: Capillary refill time (CRT) is easy, quick to perform and when prolonged in critical illness, correlates with progression of organ failure and mortality. It is utilized in our hospital's early warning score (EWS) as one of 11 parameters. We sought to define CRT's value in predicting patient outcomes, compared to the remaining EWS elements. METHODS: Five-year prospective observational study of 6480 consecutive Rapid Response Team (RRT) patients. CRT measured at the index finger was considered prolonged if time to return of previous color was >3 s. We analyzed the odds ratio of normal vs prolonged CRT, compared to the other EWS variables, to individual and combined outcomes of mortality, cardiac arrest and higher-level of care transfer. RESULTS: Twenty percent (N = 1329) of RRT patients had prolonged CRT (vs normal CRT); they were twice as likely to die (36% vs 17.8%, p < .001), more likely to experience the combined outcome (72.1% vs 54.2%, p < .001) and had longer hospital lengths of stay, 15.3 (SD 0.3) vs 13.5 days (SD 0.5) (p < .001). Multivariable logistic regression for mortality ranked CRT second to hypoxia among all 11 variables evaluated (p < .001). CONCLUSIONS: This is the first time CRT has been evaluated in RRT patients. Its measurement is easy to perform and proves useful as an assessment of adult patients at-risk for clinical decline. Its prolongation in our population was an independent predictor of mortality and the combined outcome. This study and others suggest that CRT should be considered further as a fundamental assessment of patients at-risk for clinical decline.
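For context, the crude (unadjusted) association implied by the reported mortality proportions can be recomputed directly. This is an illustrative arithmetic check on the abstract's figures, not the multivariable estimate from the paper:

```python
def odds(p: float) -> float:
    """Convert a probability to odds."""
    return p / (1 - p)

# Mortality proportions reported for prolonged vs normal CRT
p_prolonged = 0.36
p_normal = 0.178

risk_ratio = p_prolonged / p_normal          # ~2.0, matching "twice as likely to die"
odds_ratio = odds(p_prolonged) / odds(p_normal)

print(round(risk_ratio, 1))   # 2.0
print(round(odds_ratio, 2))   # 2.6
```

Note that the "twice as likely" phrasing corresponds to the risk ratio; the crude odds ratio is larger (~2.6), as is typical when the outcome is common.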


Subject(s)
Early Warning Score , Hospital Rapid Response Team , Adult , Critical Illness , Humans , Microcirculation , Prospective Studies