Results 1 - 20 of 53
1.
Eur J Clin Pharmacol ; 79(12): 1613-1621, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37737911

ABSTRACT

PURPOSE: The primary aim of this study was to investigate the effect of including the Dutch National Pharmacotherapy Assessment (DNPA) in the medical curriculum on the level and development of the prescribing knowledge and skills of junior doctors. The secondary aim was to evaluate the relationship between curriculum type and the prescribing competence of junior doctors. METHODS: We re-analysed the data of a longitudinal study conducted in 2016 involving recently graduated junior doctors from 11 medical schools across the Netherlands and Belgium. Participants completed three assessments during the first year after graduation (around graduation (± 4 weeks), 6 months after graduation, and 1 year after graduation), each of which contained 35 multiple choice questions (MCQs) assessing knowledge and three clinical case scenarios assessing skills. Only one medical school used the DNPA in its medical curriculum; the other medical schools used conventional means to assess prescribing knowledge and skills. Five medical schools were classified as providing solely theoretical clinical pharmacology and therapeutics (CPT) education; the others provided both theoretical and practical CPT education (mixed curriculum). RESULTS: Of the 1584 invited junior doctors, 556 (35.1%) participated, 326 (58.6%) completed the MCQs and 325 (58.5%) the clinical case scenarios in all three assessments. Junior doctors whose medical curriculum included the DNPA had higher knowledge scores than the other junior doctors (76.7% [SD 12.5] vs. 67.8% [SD 12.6]; 81.8% [SD 11.1] vs. 76.1% [SD 11.1]; 77.0% [SD 12.1] vs. 70.6% [SD 14.0]; p < 0.05 for all three assessments, respectively). There was no difference in skills scores at the moment of graduation (p = 0.110), but after 6 and 12 months junior doctors whose medical curriculum included the DNPA had higher skills scores (both p < 0.001). Junior doctors educated with a mixed curriculum had significantly higher scores for both knowledge and skills than junior doctors educated with a solely theoretical curriculum (p < 0.05 in all assessments). CONCLUSION: Our findings suggest that the inclusion of the knowledge-focused DNPA in the medical curriculum improves the prescribing knowledge, but not the skills, of junior doctors at the moment of graduation. However, after 6 and 12 months, both knowledge and skills were higher in the junior doctors whose medical curriculum included the DNPA. A curriculum that provides both theoretical and practical education seems to improve both prescribing knowledge and skills relative to a solely theoretical curriculum.


Subject(s)
Curriculum, Medical Education, Humans, Longitudinal Studies, Netherlands, Hospital Medical Staff/education, Clinical Competence
2.
New Phytol ; 239(2): 592-605, 2023 07.
Article in English | MEDLINE | ID: mdl-37203379

ABSTRACT

Traditional phenological models use chilling and thermal forcing (temperature sums or degree-days) to predict budbreak. Because of the increasing impact of climate and other related biotic or abiotic stressors, a model with greater biological support is needed to better predict budbreak. Here, we present an original mechanistic model based on the physiological processes taking place before and during budbreak of conifers. As a general principle, we assume that phenology is driven by the carbon status of the plant, which is closely related to environmental variables and the annual dormancy-activity cycle. The carbon balance of a branch was modelled from autumn to winter, when cold acclimation and dormancy occur, and from winter to spring, when deacclimation and growth resumption occur. After being calibrated in a field experiment, the model was validated across a large area (>34,000 km²) covering multiple conifer stands in Québec (Canada) and across heated plots of the SPRUCE experiment in Minnesota (USA). The model accurately predicted the observed dates of budbreak in both Québec (±3.98 d) and Minnesota (±7.98 d). The site-independent calibration provides interesting insights into the physiological mechanisms underlying the dynamics of dormancy break and the resumption of vegetative growth in spring.
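The traditional chilling/thermal-forcing approach that this abstract contrasts with its carbon-based model can be sketched in a few lines. The following is a minimal illustration of that classical scheme, not the paper's model; all thresholds and parameter values are hypothetical placeholders.

```python
import numpy as np

def predict_budbreak(daily_temp_c, t_base=5.0, forcing_req=150.0,
                     chill_thresh=5.0, chill_req=60):
    """Sketch of a classical chilling/thermal-forcing budbreak model.

    All parameter values are illustrative, not the paper's. Chilling
    days accumulate while temperature is below `chill_thresh`; once
    `chill_req` days are reached, growing degree-days above `t_base`
    accumulate, and budbreak is predicted when they exceed `forcing_req`.
    """
    chill_days, forcing = 0, 0.0
    for day, temp in enumerate(daily_temp_c):
        if chill_days < chill_req:
            chill_days += temp < chill_thresh
        else:
            forcing += max(temp - t_base, 0.0)
            if forcing >= forcing_req:
                return day  # predicted day of budbreak (index into the series)
    return None  # requirements not met within the series

# Example: a synthetic autumn-to-spring daily temperature series
rng = np.random.default_rng(0)
temps = np.concatenate([rng.normal(-5, 3, 150), np.linspace(-5, 15, 120)])
print(predict_budbreak(temps))
```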


Subject(s)
Picea, Tracheophyta, Carbon, Climate, Plants, Seasons, Trees
3.
J Allergy Clin Immunol Pract ; 11(2): 519-526.e3, 2023 02.
Article in English | MEDLINE | ID: mdl-36581072

ABSTRACT

BACKGROUND: The quality of allergy documentation in electronic health records is frequently poor. OBJECTIVE: To compare the usability of 3 graphical user interfaces (GUIs) for drug allergy documentation. METHODS: Physicians tested 3 GUIs by means of 5 fictional drug allergy scenarios: the current GUI (GUI 0), using mainly free text, and 2 new coded versions (GUI 1 and GUI 2) asking for information on allergen category, specific allergen, symptom(s), symptom onset, timing of initial reaction, and diagnosis status, with a semiautomatic delabeling feature. Satisfaction was measured by the System Usability Scale questionnaire, efficiency by time to complete the tasks, and effectiveness by a task completion score. Posttest interviews provided more in-depth qualitative feedback. RESULTS: Thirty physicians from 7 different medical specialties and with varying degrees of experience participated. The mean System Usability Scale scores for GUI 1 (77.25, adjective rating "Good") and GUI 2 (78.42, adjective rating "Good") were significantly higher than for GUI 0 (56.58, adjective rating "OK") (Z, 6.27, Padj < .001 and Z, 6.62, Padj < .001, respectively). There was no significant difference in task time between GUIs. Task completion scores for GUI 1 and GUI 2 were higher than for GUI 0 (Z, 9.59, Padj < .001 and Z, 11.87, Padj < .001, respectively). Quantitative and qualitative findings were combined to propose a GUI 3 with high usability. CONCLUSIONS: The usability and quality of allergy documentation were higher for the newly developed coded GUIs with a semiautomatic delabeling feature, without being more time-consuming.
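For context, the System Usability Scale scores reported above follow the standard published scoring rule: 10 Likert items (1-5), odd items positively worded and even items negatively worded, rescaled to 0-100. A minimal sketch with one hypothetical participant's answers:

```python
def sus_score(responses):
    """Standard System Usability Scale scoring.

    `responses` is a list of 10 Likert answers (1-5). Odd-numbered items
    contribute (answer - 1), even-numbered items (5 - answer); the sum
    is scaled by 2.5 to give a 0-100 score.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# One participant's hypothetical answers
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```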


Subject(s)
Drug Hypersensitivity, Hypersensitivity, Humans, User-Computer Interface, Electronic Health Records, Documentation, Drug Hypersensitivity/diagnosis
4.
Br J Clin Pharmacol ; 89(4): 1374-1385, 2023 04.
Article in English | MEDLINE | ID: mdl-36321834

ABSTRACT

AIMS: Many clinical decision support systems trigger warning alerts for drug-drug interactions potentially leading to QT prolongation and torsades de pointes (QT-DDIs). Unfortunately, there is both overalerting and underalerting because stratification is based only on a fixed QT-DDI severity level. We aimed to improve QT-DDI alerting by developing and validating a risk prediction model considering patient- and drug-related factors. METHODS: We fitted 31 predictor candidates to a stepwise linear regression for 1000 bootstrap samples and selected the predictors present in 95% of the 1000 models. A final linear regression model with those variables was fitted on the original development sample (350 QT-DDIs). This model was validated on an external dataset (143 QT-DDIs). Both true QTc and predicted QTc were stratified into three risk levels (low, moderate and high). Stratification of QT-DDIs could be appropriate (predicted risk = true risk), acceptable (one risk level difference) or inappropriate (two risk levels difference). RESULTS: The final model included 11 predictors, the three most important being use of antiarrhythmics, age and baseline QTc. Comparing current practice to the prediction model, appropriate stratification increased significantly from 37% to 54% of QT-DDIs (increase of 17.5% on average [95% CI +5.4% to +29.6%], padj = 0.006) and inappropriate stratification decreased significantly from 13% to 1% of QT-DDIs (decrease of 11.2% on average [95% CI -17.7% to -4.7%], padj ≤ 0.001). CONCLUSION: The prediction model including patient- and drug-related factors outperformed QT alerting based on QT-DDI severity alone and is therefore a promising strategy to improve DDI alerting.
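The bootstrap-based predictor selection described in the METHODS can be sketched as follows. This uses scikit-learn's SequentialFeatureSelector as a stand-in for the paper's p-value-driven stepwise regression, on synthetic data; the dataset, the number of features selected per resample, and the selection criterion are all assumptions.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
# Synthetic stand-in for the 350 QT-DDI development sample with 31 candidates
X, y = make_regression(n_samples=350, n_features=31, n_informative=8,
                       noise=10.0, random_state=42)

n_boot, keep_frac = 1000, 0.95   # 1000 resamples as in the abstract; reduce for a quick demo
counts = np.zeros(X.shape[1])
for _ in range(n_boot):
    idx = rng.integers(0, len(y), len(y))   # bootstrap resample with replacement
    sfs = SequentialFeatureSelector(LinearRegression(),
                                    n_features_to_select=11,  # final model size in the abstract
                                    direction="forward", cv=3)
    sfs.fit(X[idx], y[idx])
    counts += sfs.get_support()             # tally which predictors were selected

# Keep predictors selected in at least 95% of the bootstrap models
stable = np.flatnonzero(counts >= keep_frac * n_boot)
print("predictors retained in >=95% of bootstrap models:", stable)
final_model = LinearRegression().fit(X[:, stable], y)  # refit on the original sample
```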


Subject(s)
Clinical Decision Support Systems, Long QT Syndrome, Torsades de Pointes, Humans, Long QT Syndrome/chemically induced, Long QT Syndrome/diagnosis, Drug Interactions, Torsades de Pointes/chemically induced, Torsades de Pointes/prevention & control, Antiarrhythmic Agents, Risk Factors, Electrocardiography
5.
J Med Syst ; 46(12): 100, 2022 Nov 23.
Article in English | MEDLINE | ID: mdl-36418746

ABSTRACT

In clinical practice, many drug therapies are associated with prolongation of the QT interval. In the literature, estimation of the risk of drug-induced QT prolongation has mainly been performed by means of logistic regression; only one paper reported the use of machine learning techniques. In this paper, we compare the performance of both techniques on the same dataset. High risk for QT prolongation was defined as a corrected QT interval (QTc) ≥450 ms for male and ≥470 ms for female patients. Both conventional statistical methods (CSM) and machine learning techniques (MLT) were used. All algorithms were validated internally (512 drug-drug interactions with possible drug-induced QTc prolongation) and on a hold-out dataset (102 such interactions). MLT outperformed the best CSM in both internal and hold-out validation. Random forest and AdaBoost classification performed best in the hold-out set, with an equal harmonic mean of sensitivity and specificity (HMSS) of 81.2% and an equal accuracy of 82.4%. Sensitivity and specificity were both high (75.6% and 87.7%, respectively). The most important features were baseline QTc value, C-reactive protein level, baseline heart rate, age, calcium level, renal function, serum potassium level and atrial fibrillation status. All CSM performed similarly, with HMSS varying between 60.3% and 66.3%; the overall performance of logistic regression was 62.0%. MLT (bagging and boosting) thus outperform CSM in predicting drug-induced QTc prolongation, with random forest and AdaBoost classification gaining 19.2% in performance over logistic regression (the technique most used in the literature to estimate the risk of QTc prolongation). Future research should focus on testing the classifiers on fully external data, further exploring the potential of other (new) machine and deep learning models, and generating data pipelines to automatically feed the data to the classifier.
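The HMSS metric used above is the harmonic mean of sensitivity and specificity. A minimal sketch of the model comparison on synthetic stand-in data (the real predictors and dataset are not reproduced here; sample sizes mirror the 512 + 102 split):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

def hmss(y_true, y_pred):
    """Harmonic mean of sensitivity and specificity, as in the abstract."""
    sens = recall_score(y_true, y_pred)               # true positive rate
    spec = recall_score(y_true, y_pred, pos_label=0)  # true negative rate
    return 2 * sens * spec / (sens + spec)

# Synthetic stand-in: the real features included baseline QTc, CRP,
# heart rate, age, calcium, renal function, potassium and AF status.
X, y = make_classification(n_samples=614, n_features=8, weights=[0.7],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=102,
                                          random_state=0, stratify=y)

for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("random forest", RandomForestClassifier(random_state=0)),
                  ("AdaBoost", AdaBoostClassifier(random_state=0))]:
    clf.fit(X_tr, y_tr)
    print(f"{name}: HMSS = {hmss(y_te, clf.predict(X_te)):.3f}")
```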


Subject(s)
Long QT Syndrome, Machine Learning, Humans, Female, Male, Drug Interactions, Algorithms, Heart Rate, Long QT Syndrome/chemically induced
6.
Br J Clin Pharmacol ; 88(12): 5218-5226, 2022 12.
Article in English | MEDLINE | ID: mdl-35716366

ABSTRACT

AIM: The aim of this study was to investigate how the prescribing knowledge and skills of junior doctors in the Netherlands and Belgium develop in the year after graduation. We also analysed differences in knowledge and skills between surgical and nonsurgical junior doctors. METHODS: This international, multicentre (n = 11), longitudinal study analysed the learning curves of junior doctors working in various specialties via three validated assessments: at about the time of graduation, 6 months after graduation, and 1 year after graduation. Each assessment contained 35 multiple choice questions (MCQs) on medication safety (passing grade ≥85%) and three clinical scenarios. RESULTS: In total, 556 junior doctors participated, 326 (58.6%) of whom completed the MCQs and 325 (58.5%) the clinical case scenarios of all three assessments. Mean prescribing knowledge was stable in the year after graduation, with 69% (SD 13) of questions answered correctly at assessment 1 and 71% (SD 14) at assessment 3, whereas prescribing skills decreased: 63% of treatment plans were considered adequate at assessment 1 but only 40% at assessment 3 (P < .001). While nonsurgical doctors had learning curves for knowledge and skills similar to those of surgical doctors (P = .53 and P = .56, respectively), their overall level was higher at all three assessments (all P < .05). CONCLUSION: These results show that junior doctors' prescribing knowledge and skills did not improve while they were working in clinical practice. Moreover, their level was below the predefined passing grade. As this might adversely affect patient safety, educational interventions should be introduced to improve the prescribing competence of junior doctors.


Subject(s)
Clinical Competence, Hospital Medical Staff, Medical Practice Patterns, Humans, Clinical Competence/statistics & numerical data, Follow-Up Studies, Longitudinal Studies
7.
J Eval Clin Pract ; 28(4): 599-606, 2022 08.
Article in English | MEDLINE | ID: mdl-35080261

ABSTRACT

RATIONALE: Intravenous (IV) fluids are frequently involved in iatrogenic complications in hospitalized patients. Knowledge of IV fluids appears inadequate, and the topic is not covered sufficiently in standard medical education. METHODS: Two surveys were developed, based on the 2016 British National Institute for Health and Care Excellence guideline 'IV fluid therapy in adults in hospital', to provide insight into the learning needs and expectations of physicians and nurses. Each survey focused on profession-specific practice and consisted of three parts: demographics, knowledge questions and evaluation of current habits. Physicians and nurses practicing in a Belgian university hospital were invited to complete the survey electronically, in January and May 2018, respectively. RESULTS: A total of 103 physicians (19%) and 259 nurses (24%) participated. Although every indication for fluid therapy may require a specific fluid and electrolyte mixture, and hence knowledge of their exact composition, most physicians and nurses did not know the composition of commonly prescribed solutions for IV infusion. Senior physicians did not score better than juniors on questions concerning the daily needs of a nil-by-mouth patient. The availability of an IV fluid on the ward guides physicians in prescribing IV fluids (17%). Nurses (56%) feel they share responsibility in fluid management, as they frequently intervene in urgent situations. More than half of the participants (70% of physicians, 79% of nurses) indicated a need for additional information. CONCLUSIONS: A clear need for more structured information on IV fluids was identified. Both physicians and nurses struggle with fluid therapy. Continuing education on IV fluid management, emphasizing multidisciplinary collaboration and monitoring evidence-based practice, is essential to support the clinical decision process in daily practice.


Subject(s)
Physicians, Adult, Hospitals, Humans, Intravenous Infusions, Professional Practice, Surveys and Questionnaires
8.
Br J Clin Pharmacol ; 88(2): 753-763, 2022 02.
Article in English | MEDLINE | ID: mdl-34331720

ABSTRACT

AIMS: To analyse the appropriateness of direct oral anticoagulant (DOAC) dosing and determinants of under- and overdosing, as well as acceptance and implementation rates of pharmacists' interventions. METHODS: Cross-sectional study in a tertiary hospital in hospitalized patients with atrial fibrillation on DOACs in 2019 (n = 1688). The primary outcome was the proportion of patients with inappropriate DOAC prescribing, with identification of determinants of under- and overdosing. Secondary outcomes included acceptance and implementation rates of pharmacists' recommendations and determination of reasons for nonacceptance/nonimplementation. RESULTS: Inappropriate prescribing was observed in 16.9% of patients (n = 286), with underdosing (9.7%) being more prevalent than overdosing (6.9%). For all DOACs considered together, body weight <60 kg (odds ratio [OR] 0.46 [0.27-0.77]), edoxaban use (OR 0.42 [0.24-0.74]), undergoing surgery (OR 0.57 [0.37-0.87]) and being DOAC naïve (OR 0.45 [0.29-0.71]) were associated with significantly lower odds of underdosing. Bleeding history (OR 1.86 [1.24-2.80]) and narcotic use (OR 1.67 [1.13-2.46]) were associated with significantly higher odds of underdosing. Determinants with significantly higher odds of overdosing were renal impairment (OR 11.29 [6.23-20.45]) and body weight <60 kg (OR 2.34 [1.42-3.85]), whereas dabigatran use (OR 0.24 [0.08-0.71]) and apixaban use (OR 0.18 [0.10-0.32]) were associated with significantly lower odds of overdosing compared to rivaroxaban. Physicians accepted the pharmacists' advice in 179 cases (79.2%), comprising 92 (51.4%) recommendations concerning underdosing, 82 (45.8%) concerning overdosing and 5 (2.8%) concerning contraindications. CONCLUSION: Inappropriate DOAC prescribing remains common, although there is a slight improvement compared to our 2016 study. Clinical services led by pharmacists help physicians to reduce the number of inadequate prescriptions for high-risk medications such as DOACs.


Subject(s)
Atrial Fibrillation, Physicians, Stroke, Oral Administration, Anticoagulants/adverse effects, Atrial Fibrillation/drug therapy, Body Weight, Cross-Sectional Studies, Dabigatran/therapeutic use, Humans, Pharmacists, Retrospective Studies, Rivaroxaban, Stroke/drug therapy
9.
Br J Clin Pharmacol ; 88(5): 2419-2429, 2022 05.
Article in English | MEDLINE | ID: mdl-34907577

ABSTRACT

AIMS: Direct oral anticoagulants (DOACs) are increasingly used for stroke prevention in atrial fibrillation. However, little is known about the association between medication adherence, patient satisfaction and treatment knowledge. The objective was to determine patients' DOAC adherence and their treatment satisfaction over time. Furthermore, we investigated possible associations of treatment satisfaction and treatment knowledge with adherence. METHODS: Longitudinal study conducted in atrial fibrillation patients hospitalized in 2019 in a tertiary university hospital. DOAC adherence, treatment satisfaction and knowledge were assessed with validated questionnaires. A mixed effects logistic regression model was fitted to investigate the effect of both treatment satisfaction and knowledge on DOAC adherence over time. RESULTS: In total, 164 patients participated, of whom 128 could be recontacted after 3 months (first contact) and 101 after 6 months (second contact) to assess adherence and treatment satisfaction. Suboptimal adherence was observed in 40.6% of the patients after 3 months and in 42.6% after 6 months (P = .78). There was no significant difference (P = .29) in the total score for treatment satisfaction between the first (79.2%) and the second contact (80.6%). DOAC adherence was not affected by time (P = .71), total knowledge score (P = .61) or treatment satisfaction score (P = .34). Nonetheless, a strong correlation between treatment satisfaction and knowledge was found (P = .004). CONCLUSION: DOAC adherence was suboptimal. Treatment satisfaction and knowledge were not associated with DOAC adherence over a 6-month period. Knowledge gaps were identified that could be remediated through patient education and follow-up.
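For illustration, a mixed effects logistic regression of adherence over time could be set up as below. This uses statsmodels' variational Bayes binomial mixed GLM as an available stand-in for the paper's (unspecified) mixed model implementation; the data frame, variable names and effect structure are all assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical long-format data: one row per patient per follow-up contact.
rng = np.random.default_rng(1)
n = 128
df = pd.DataFrame({
    "patient": np.repeat(np.arange(n), 2),
    "time": np.tile([3, 6], n),                 # months since inclusion
    "satisfaction": rng.normal(80, 10, 2 * n),  # treatment satisfaction score (%)
    "knowledge": rng.normal(70, 12, 2 * n),     # treatment knowledge score (%)
})
df["adherent"] = rng.binomial(1, 0.58, 2 * n)   # ~60% adherent, as in the abstract

# Random intercept per patient; fixed effects for time, satisfaction, knowledge.
model = BinomialBayesMixedGLM.from_formula(
    "adherent ~ time + satisfaction + knowledge",
    {"patient": "0 + C(patient)"}, df)
result = model.fit_vb()
print(result.summary())
```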


Subject(s)
Atrial Fibrillation, Stroke, Oral Administration, Anticoagulants, Atrial Fibrillation/complications, Atrial Fibrillation/drug therapy, Humans, Longitudinal Studies, Medication Adherence, Patient Satisfaction, Personal Satisfaction, Stroke/prevention & control
10.
Int J Med Inform ; 148: 104393, 2021 04.
Article in English | MEDLINE | ID: mdl-33486355

ABSTRACT

OBJECTIVE: Evaluation of the effect of six optimization strategies in a clinical decision support system (CDSS) for drug-drug interaction (DDI) screening on alert burden and alert acceptance, and description of clinical pharmacist intervention acceptance. METHODS: The optimizations in the new CDSS were customization of the knowledge base (with addition of 67 extra DDIs and changes in severity classification), a new alert design, required override reasons for the most serious alerts, the creation of DDI-specific screening intervals, patient-specific alerting, and a real-time follow-up system in which clinical pharmacists reviewed all alerts and intervened by telephone. Alert acceptance was evaluated both at the prescription level (i.e., prescription acceptance: was the DDI prescribed?) and at the administration level (i.e., administration acceptance: did the DDI actually take place?). Finally, the new follow-up system was evaluated by assessing the acceptance of clinical pharmacists' interventions. RESULTS: In the pre-intervention period, 1087 alerts (92.0% level 1 alerts) were triggered, accounting for 19 different DDIs. In the post-intervention period, 2630 alerts (38.4% level 1 alerts) were triggered, representing 86 different DDIs. The relative risk for prescription acceptance in the post-intervention period compared to the pre-intervention period was 4.02 (95% confidence interval (CI) 3.17-5.10; 25.5% versus 6.3%). The relative risk for administration acceptance was 1.16 (95% CI 1.08-1.25; 54.4% versus 46.7%). Finally, 86.9% of the clinical pharmacist interventions were accepted. CONCLUSION: Six concurrently implemented CDSS optimization strategies resulted in high alert acceptance and clinical pharmacist intervention acceptance. Administration acceptance was remarkably higher than prescription acceptance.
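The relative risks quoted above can be reproduced approximately from the reported percentages and alert counts, using the standard Katz log method for the confidence interval. The event counts below are back-calculated approximations, not the study data.

```python
import math

def relative_risk(events_post, n_post, events_pre, n_pre, z=1.96):
    """Relative risk with a 95% CI (Katz log method)."""
    p_post, p_pre = events_post / n_post, events_pre / n_pre
    rr = p_post / p_pre
    # Standard error of log(RR)
    se = math.sqrt(1 / events_post - 1 / n_post + 1 / events_pre - 1 / n_pre)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# Prescription acceptance: ~25.5% of 2630 post-intervention alerts
# versus ~6.3% of 1087 pre-intervention alerts.
print(relative_risk(events_post=round(0.255 * 2630), n_post=2630,
                    events_pre=round(0.063 * 1087), n_pre=1087))
# -> roughly (4.1, 3.2, 5.2), close to the reported RR 4.02 (95% CI 3.17-5.10)
```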


Subject(s)
Clinical Decision Support Systems, Medical Order Entry Systems, Pharmaceutical Preparations, Drug Interactions, Humans, Pharmacists
11.
Hypertens Res ; 43(10): 995-1005, 2020 10.
Article in English | MEDLINE | ID: mdl-32451494

ABSTRACT

The nucleus tractus solitarius (NTS), paraventricular nucleus (PVN), and rostral ventrolateral medulla (RVLM) are the most targeted regions of central blood pressure control studies. Glutamate and gamma-aminobutyric acid (GABA) interact within these brain regions to modulate blood pressure. The brain renin-angiotensin system also participates in central blood pressure control. Angiotensin II increases blood pressure through the stimulation of angiotensin II type 1 (AT1) receptors within the PVN and RVLM and attenuates baroreceptor sensitivity, resulting in elevated blood pressure within the NTS. Angiotensin II type 2 (AT2) receptors in cardiovascular control centers in the brain also appear to be involved in blood pressure control and counteract AT1 receptor-mediated effects. The current review is focused on the interaction of GABA with AT1 and AT2 receptors in the control of blood pressure within the RVLM, PVN and NTS. Within the NTS, GABA is released from local GABAergic interneurons that are stimulated by local AT1 receptors and mediates a hypertensive response. In contrast, the local increase in GABA levels observed after AT2 receptor stimulation within the RVLM, likely from GABAergic nerve endings originating in the caudal ventrolateral medulla, is important in the mediation of the hypotensive response. Preliminary results suggest that the hypertensive response to AT1 receptor stimulation within the RVLM is associated with a reduction in GABA release. The current experimental evidence therefore indicates that GABA is an important mediator of brainstem responses to AT1 and AT2 receptor stimulation and that increased GABA release may play a role in hypertensive and hypotensive responses, depending on the site of action.


Subject(s)
Blood Pressure, Brain Stem/metabolism, Angiotensin Type 1 Receptor/metabolism, Angiotensin Type 2 Receptor/metabolism, gamma-Aminobutyric Acid/metabolism, Animals, Humans, Renin-Angiotensin System
12.
Environ Entomol ; 49(2): 496-501, 2020 04 14.
Article in English | MEDLINE | ID: mdl-32159758

ABSTRACT

With current trends in global warming, it has been suggested that spruce budworm outbreaks may spread to northern parts of the boreal forest. However, the major constraints on a northward expansion are the availability of suitable host trees and the insect's winter survival capacity. This study aimed to determine the effect of larval feeding on balsam fir, white spruce and black spruce on various life history traits of both the parental and progeny generations of the spruce budworm. Results indicated that the weight of the overwintering larval progeny and their winter survival were influenced by the host tree species on which larvae of the parental generation fed. White spruce was the most suitable host for the spruce budworm, producing the heaviest pupae and the heaviest overwintering larvae, while black spruce was the least suitable, producing the smallest pupae and the smallest overwintering progeny. Overwintering larvae produced by parents that fed on black spruce also suffered higher winter mortality than individuals whose parents fed on balsam fir or white spruce. With current trends in global warming, the spruce budworm is expected to expand its range into northern boreal forests where black spruce is the dominant tree species. Such a northern range expansion might not result in outbreaks if low offspring winter survival on black spruce persists.


Subject(s)
Abies, Moths, Picea, Animals, Pupa, Trees
14.
Int J Med Inform ; 133: 104013, 2020 01.
Article in English | MEDLINE | ID: mdl-31698230

ABSTRACT

OBJECTIVE: To investigate whether context-specific alerts for potassium-increasing drug-drug interactions (DDIs) in a clinical decision support system reduced the alert burden, increased alert acceptance, and had an effect on the occurrence of hyperkalemia. MATERIALS AND METHODS: In the pre-intervention period all alerts for potassium-increasing DDIs were level 1 alerts advising absolute contraindication, while in the post-intervention period the same drug combinations could trigger a level 1 (absolute contraindication), a level 2 (monitor potassium values), or a level 3 alert (informative, not shown to physicians), based on the patient's recent laboratory value of potassium. Alert acceptance was defined as non-prescription or non-administration of the interacting drug combination for level 1 alerts, and as monitoring of potassium levels for level 2 alerts. RESULTS: The alert burden decreased by 92.8%. The relative risk (RR) for alert acceptance based on prescription rates for level 1 alerts and monitoring rates for level 2 alerts was 15.048 (86.5% vs 5.7%; 95% CI 12.037-18.811; P < 0.001). With alert acceptance for level 1 alerts based on actual administration and for level 2 alerts on monitoring rates, the RR was 3.597 (87.6% vs 24.4%; 95% CI 3.192-4.053; P < 0.001). In the generalized linear mixed model, the effect of the intervention on the occurrence of hyperkalemia was not significant (OR 1.091, 95% CI 0.172-6.919). CONCLUSION: The proposed strategy seems effective for managing the delicate balance between over- and underalerting.


Subject(s)
Potassium, Clinical Decision Support Systems, Drug Interactions, Humans
15.
Front Neurosci ; 13: 589, 2019.
Article in English | MEDLINE | ID: mdl-31231188

ABSTRACT

AIM: The nucleus tractus solitarii (NTS) densely expresses angiotensin II type 2 receptors (AT2R), which are mainly located on inhibitory gamma-aminobutyric acid (GABA) neurons. Central AT2R stimulation reduces blood pressure, and AT2R stimulation in the rostral ventrolateral medulla (RVLM) mediates a hypotensive response through a GABAergic mechanism. We aimed to test the hypothesis that AT2R-mediated inhibition of GABA release within the NTS might be involved in this hypotensive response, by assessing possible alterations in blood pressure and heart rate, as well as in GABA levels, in normotensive Wistar rats. METHODS: In vivo microdialysis was used to measure extracellular GABA levels and to perfuse the selective AT2R agonist Compound 21 within the NTS. Our set-up also allowed simultaneous determination of excitatory glutamate levels in the dialysate. The mean arterial pressure and heart rate responses were monitored with a pressure transducer. RESULTS: Local perfusion of Compound 21 into the NTS did not modify blood pressure or heart rate, nor glutamate or GABA levels, compared to baseline concentrations. A putative effect was also not unmasked by concomitant angiotensin II type 1 receptor blockade with candesartan. Positive control experiments confirmed that the experimental set-up was sensitive enough to detect a reduction in GABA dialysate levels and blood pressure. CONCLUSION: The results did not provide evidence for a role of the AT2R within the NTS in the control of blood pressure, nor for an interaction with local GABAergic signaling in normotensive rats.

16.
Front Pharmacol ; 10: 460, 2019.
Article in English | MEDLINE | ID: mdl-31130861

ABSTRACT

AIM: It is well established that angiotensin II exerts a dampening effect on the baroreflex within the nucleus tractus solitarii (NTS), the principal brainstem site for termination of baroreceptor afferents, which is densely populated with gamma-aminobutyric acid (GABA)ergic neurons and nerve terminals. The present study was designed to investigate whether local release of GABA is involved in the effects mediated by local angiotensin II within the NTS. METHODS: In vivo microdialysis was used for measurement of extracellular glutamate and GABA levels and for infusion of angiotensin II within the NTS of conscious normotensive Wistar rats. The mean arterial pressure (MAP) and heart rate responses to local infusion of angiotensin II were subsequently monitored with a pressure transducer under anesthesia. The angiotensin II type 1 receptor (AT1R) antagonist candesartan was used to assess whether responses were AT1R dependent, and the nitric oxide (NO) synthase inhibitor N(ω)-nitro-L-arginine methyl ester (L-NAME) was used to assess the involvement of NO in the responses evoked by infusion of angiotensin II. RESULTS: Local infusion of angiotensin II into the NTS induced a significant, up to ninefold, increase in extracellular GABA levels and increased MAP by 15 mmHg. These responses were both abolished by co-infusion of either the AT1R antagonist candesartan or the NO synthase inhibitor L-NAME, demonstrating that the effect is both AT1R and NO dependent. The pressor response to angiotensin II was reversed by co-infusion with the GABAA receptor antagonist bicuculline. Local blockade of NO synthase decreased both GABA and glutamate concentrations. CONCLUSION: Our results suggest that the AT1R-mediated hypertensive response to angiotensin II within the NTS in normotensive rats is GABA and NO dependent. Nitric oxide produced within the NTS tonically potentiates local GABA and glutamate release.

17.
Eur J Clin Pharmacol ; 75(7): 895-900, 2019 Jul.
Article in English | MEDLINE | ID: mdl-30877328

ABSTRACT

PURPOSE: Centrally authorised medicinal products (CAMPs) in the European Union may offer added therapeutic value (ATV) but may be linked to high prices and limited efficiency. Health technology assessment (HTA) and managed entry schemes (MES) may facilitate the reimbursement decision, respectively by providing reliable estimates of the medicinal product's value and costs and by controlling the remaining uncertainty. We investigated the impact of HTA criteria and the initiation of a MES on the reimbursement decision for CAMPs in Belgium. METHODS: We selected all reimbursement submissions for new centrally authorised medicinal products in the 2010-2015 period. We retrieved data relating to the reimbursement decision, the HTA outcome and the use of a managed entry scheme. RESULTS: The decision of the Minister was available for 115 dossiers, covering 36 (31.3%) orphan medicinal products (OMPs) and 79 ATV products. A MES was used in 41 submissions. A positive reimbursement decision was obtained in 65% of cases. The significant factors affecting the reimbursement decision were the approval of ATV, a medical need considered 'important or major', and the use of a managed entry scheme. Price, budget impact and efficiency had no significant impact. CONCLUSIONS: Added therapeutic value and high medical need increase the odds of a positive reimbursement decision. No impact of the cost-related HTA criteria could be demonstrated. Cost elements may be biased by the use of a confidential MES. Without a MES, only 53% of the centrally authorised medicinal products, including OMPs, are reimbursed in Belgium.


Subject(s)
Pharmaceutical Preparations/economics, Reimbursement Mechanisms, Biomedical Technology Assessment, Belgium, Decision Making
18.
Cerebrovasc Dis Extra ; 9(1): 1-8, 2019.
Article in English | MEDLINE | ID: mdl-30616238

ABSTRACT

BACKGROUND: In the first 5 years after their stroke, about a quarter of patients will suffer a recurrent stroke. Digital health interventions facilitating interactions between a caregiver and a patient from a distance are a promising approach to improve patient adherence to the lifestyle changes proposed by secondary prevention guidelines. Many of these interventions are not implemented in daily practice, even though efficacy has been shown. One of the reasons may be the lack of clear economic incentives for implementation. We propose to map all health economic evidence regarding digital health interventions for secondary stroke prevention. SUMMARY: We performed a systematic search according to the PRISMA-P guidelines in PubMed, Web of Science, Cochrane, and the National Institute for Health Research Economic Evaluation Database. Only digital health interventions for secondary prevention in stroke patients were included, and all study designs and health economic outcomes were accepted. We combined the terms "Stroke OR Cardiovascular", "Secondary prevention", "Digital health interventions", and "Cost" in one search string using the AND operator. The search, performed on April 20, 2017, yielded 163 records, of which 26 duplicates were removed. After abstract screening, 20 articles were retained for full-text analysis, of which none reported any health economic evidence that could be included for analysis or discussion. KEY MESSAGES: There is a lack of evidence on the health economic outcomes of digital health interventions for secondary stroke prevention. Future research in this area should take health economics into consideration when designing a trial, and there is a clear need for health economic evidence and models.


Subject(s)
Health Care Costs, Secondary Prevention/economics, Stroke/economics, Stroke/prevention & control, Telemedicine/economics, Cost-Benefit Analysis, Humans, Recurrence, Stroke/diagnosis, Time Factors, Treatment Outcome
19.
Front Pharmacol ; 9: 1220, 2018.
Article in English | MEDLINE | ID: mdl-30425641

ABSTRACT

Background and Objectives: Appropriate dosing of direct oral anticoagulants (DOACs) is required to avoid underdosing and overdosing, which may precipitate strokes or thromboembolic events and bleedings, respectively. Our objective was to analyze the appropriateness of DOAC dosing according to the summaries of product characteristics (SmPC). Furthermore, determinants of inappropriate prescribing were investigated. Methodology: Retrospective cohort study of hospitalized patients aged ≥60 years with at least one DOAC intake during the hospital stay. Descriptive analyses were used to summarize the characteristics of the study population. The chi-square test was used to evaluate differences between DOACs. Binary logistic regression analysis was performed to assess determinants of inappropriate prescribing. Results: For the 772 included patients, inappropriate dosing occurred in 25.0% of hospitalizations, with 23.4, 21.9, and 29.7% for dabigatran, rivaroxaban, and apixaban, respectively (p = 0.084). Underdosing was most prevalent for apixaban (24.5%) compared to dabigatran (14.0%) and rivaroxaban (12.8%), p < 0.001. In 67.1% (apixaban), 26.7% (dabigatran), and 51.2% (rivaroxaban) of DOAC cases underdosed according to the SmPC, the dose would be considered appropriate according to the European Heart Rhythm Association (EHRA) guidelines. Overdosing was observed in 4.5% (apixaban), 4.7% (dabigatran), and 7.7% (rivaroxaban) of patients. For all DOACs, our analysis showed age ≥80 years (p = 0.036), use of apixaban (p = 0.026), DOAC use before hospitalization (p = 0.001), intermediate renal function (p = 0.014), and use of narcotic analgesics (p = 0.019) to be associated with a higher rate of inappropriate prescribing. Undergoing surgery was associated with lower odds of inappropriate prescribing (p = 0.012). For rivaroxaban, use of medication for hypothyroidism (p = 0.027) and the reduced dose (p < 0.001) were determinants of inappropriate prescribing. Treatment of venous thromboembolism was associated with fewer errors (p = 0.002). For apixaban, severe renal insufficiency (p < 0.001) and initiation in hospital (p = 0.016) were associated with less, and the reduced dose (p < 0.001) with more, inappropriate prescribing. No determinants were found in the dabigatran subgroup. Conclusions: Inappropriate DOAC prescribing is frequent, with underdosing being the most common drug-related problem when using the SmPC as reference. More appropriate prescriptions were found when taking the EHRA guidelines into account. Analysis of determinants of inappropriate prescribing yielded insights into the risk factors associated with inappropriate DOAC prescriptions.
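A binary logistic regression reporting odds ratios with confidence intervals, as used in this analysis, can be sketched as follows. The data frame and predictor names are hypothetical stand-ins for the study variables, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data in the spirit of the abstract: one row per patient,
# binary outcome = inappropriate DOAC dose according to the SmPC.
rng = np.random.default_rng(7)
n = 772
df = pd.DataFrame({
    "inappropriate": rng.binomial(1, 0.25, n),
    "age_ge_80": rng.binomial(1, 0.4, n),
    "apixaban": rng.binomial(1, 0.35, n),
    "doac_before_admission": rng.binomial(1, 0.6, n),
    "surgery": rng.binomial(1, 0.3, n),
})

model = smf.logit(
    "inappropriate ~ age_ge_80 + apixaban + doac_before_admission + surgery",
    data=df).fit(disp=False)

# Odds ratios with 95% confidence intervals, as reported in such analyses.
ors = pd.concat([np.exp(model.params), np.exp(model.conf_int())], axis=1)
ors.columns = ["OR", "2.5%", "97.5%"]
print(ors)
```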

20.
J Med Internet Res ; 20(9): e258, 2018 09 07.
Article in English | MEDLINE | ID: mdl-30194058

ABSTRACT

BACKGROUND: Worldwide, the burden of allergies, in particular drug allergies, is growing. In the process of prescribing, dispensing, or administering a drug, a medication error may occur and can have adverse consequences; for example, a drug may be given to a patient with a documented allergy to that particular drug. Computerized physician order entry (CPOE) systems with built-in clinical decision support systems (CDSS) have the potential to prevent such medication errors and adverse events. OBJECTIVE: The aim of this review is to provide a comprehensive overview of all aspects of CDSS for drug allergy, including documenting, coding, rule bases, alerts and alert fatigue, and outcome evaluation. METHODS: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed as much as possible, and searches were conducted in 5 databases using CPOE, CDSS, alerts, and allergic or allergy as keywords. Bias could not be evaluated according to the PRISMA guidelines due to the heterogeneity of study types included in the review. RESULTS: Of the 3160 articles considered, 60 met the inclusion criteria. A further 9 articles were added based on expert opinion, resulting in a total of 69 articles. An interrater agreement of 90.9% with a reliability of κ=.787 (95% CI 0.686-0.888) was reached. Large heterogeneity across study objectives, study designs, study populations, and reported results was found. Several key findings were identified. Evidence of the usefulness of clinical decision support for drug allergies has been documented. Nevertheless, there are some important problems associated with its use. Accurate and structured documentation of drug allergy information in electronic health records (EHRs) is difficult, as it is often not clear to healthcare providers how and where to document drug allergies. Besides the underreporting of drug allergies, outdated or inaccurate drug allergy information in EHRs poses an important problem. Research on the use of coding terminologies for documenting drug allergies is sparse. There is no generally accepted standard terminology for structured documentation of allergy information. The final key finding is the consistently reported low specificity of drug allergy alerts. Current systems have high alert override rates of up to 90%, leading to alert fatigue. Important challenges remain for increasing the specificity of drug allergy alerts. We found only one study specifically reporting outcomes related to CDSS for drug allergies. It showed that adverse drug events resulting from overridden drug allergy alerts do not occur frequently. CONCLUSIONS: Accurate and comprehensive recording of drug allergies is required for good use of CDSS for drug allergy screening. We found considerable variation in the way drug allergies are recorded in EHRs. It remains difficult to reduce drug allergy alert overload while maintaining patient safety as the highest priority. Future research should focus on improving alert specificity, thereby reducing override rates and alert fatigue. Also, the effect on patient outcomes and cost-effectiveness should be evaluated.
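Interrater agreement and Cohen's kappa, as quoted above, can be computed as in the sketch below. The two reviewers' inclusion decisions are hypothetical and do not reproduce the reported values; they only illustrate the two statistics.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical inclusion decisions (1 = include, 0 = exclude) by two
# reviewers on 69 articles: 63 agreements and 6 disagreements.
reviewer_a = [1] * 55 + [0] * 8 + [1] * 3 + [0] * 3
reviewer_b = [1] * 55 + [0] * 8 + [0] * 3 + [1] * 3

# Raw agreement is the fraction of identical decisions; kappa corrects
# that fraction for the agreement expected by chance.
agreement = sum(a == b for a, b in zip(reviewer_a, reviewer_b)) / len(reviewer_a)
kappa = cohen_kappa_score(reviewer_a, reviewer_b)
print(f"raw agreement = {agreement:.1%}, kappa = {kappa:.3f}")
```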


Subject(s)
Clinical Decision Support Systems/standards, Drug Hypersensitivity/diagnosis, Drug Hypersensitivity/pathology, Humans, Reproducibility of Results