Results 1 - 8 of 8
1.
Pharm Pract (Granada) ; 21(4), Oct-Dec 2023.
Article in English | IBECS | ID: ibc-229974

ABSTRACT

Background: Hypomagnesemia is common in surgical patients and often requires intravenous (IV) magnesium replacement. Because of the renal handling of magnesium, prolonging the duration of an IV magnesium infusion has been postulated to improve magnesium retention by reducing renal excretion. However, the evidence supporting this hypothesis is limited. Objective: To compare the change in serum magnesium level from baseline after IV magnesium replacement between prolonged (infusion rate < 0.5 g/h) and short infusions (infusion rate ≥ 0.5 g/h) in hospitalized surgical patients. Methods: Medical records of surgical patients with hypomagnesemia who received IV magnesium replacement for three consecutive days and were admitted to a university hospital between 2012 and 2022 were reviewed. Patients were separated by replacement rate into two cohorts: prolonged infusion and short infusion. The primary outcome was the change in serum magnesium per gram administered from baseline. The secondary outcome was the percentage of patients who achieved an optimal serum magnesium level after IV magnesium replacement. Results: 114 participants were enrolled in the study. The short infusion cohort showed a significantly greater increase in serum magnesium per gram administered from baseline (0.07 mg/dL/g) than the prolonged infusion cohort (0.05 mg/dL/g) (p = 0.04). The difference in serum magnesium level between the two cohorts was 0.013 mg/dL/g of Mg. The percentage of patients who achieved the optimal serum magnesium level after IV magnesium replacement did not differ between the two cohorts (prolonged infusion 66.7% vs. short infusion 70.2%; p = 0.84). The change in serum magnesium levels was influenced by renal function and by the timing of serum magnesium measurement after IV magnesium replacement.
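The primary outcome in the abstract above is simple arithmetic: the serum magnesium change divided by the grams of magnesium given. A minimal sketch; the baseline and follow-up values below are hypothetical, chosen only to reproduce the short-infusion cohort's reported mean of 0.07 mg/dL/g:

```python
def mg_change_per_gram(baseline_mg_dl, followup_mg_dl, grams_given):
    """Serum Mg change (mg/dL) per gram of IV magnesium administered."""
    if grams_given <= 0:
        raise ValueError("grams_given must be positive")
    return (followup_mg_dl - baseline_mg_dl) / grams_given

# Hypothetical patient: baseline 1.4 mg/dL, 1.68 mg/dL after a 4 g replacement.
print(round(mg_change_per_gram(1.4, 1.68, 4), 2))  # prints 0.07
```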


Subject(s)
Humans , Male , Female , Middle Aged , Aged , Aged, 80 and over , Magnesium Deficiency/drug therapy , Magnesium/administration & dosage , Magnesium/blood , Treatment Outcome , Retrospective Studies , Cohort Studies
2.
Int J Crit Illn Inj Sci ; 13(3): 118-124, 2023.
Article in English | MEDLINE | ID: mdl-38023581

ABSTRACT

Background: An appropriate dose of gentamicin is important to prevent and treat infections. This study aimed to determine the optimal dose of gentamicin to achieve pharmacokinetic/pharmacodynamic (PK/PD) targets for efficacy and safety in multiple trauma patients. Methods: PK parameters of gentamicin in multiple trauma patients were gathered to develop a one-compartment PK model for prediction, and a Monte Carlo simulation was performed. A 24-h area under the concentration-time curve to minimum inhibitory concentration ratio (AUC24h/MIC) ≥50 was defined as the infection prevention target; AUC24h/MIC ≥110 or a maximum serum concentration to MIC ratio ≥8-10 was defined as the target for treatment of serious Gram-negative infection. The risk of nephrotoxicity was defined as a minimum serum concentration ≥2 mg/L. The optimal dose of gentamicin was the dose at which the efficacy target was achieved in >90% of simulated patients with the lowest risk of nephrotoxicity. Results: The optimal gentamicin dose to prevent infection when the MIC was <1 mg/L was 6-7 mg/kg/day. Even a gentamicin dose as high as 10 mg/kg/day could not reach the target for treating serious Gram-negative infection when the expected MIC was ≥1 mg/L. The probability of nephrotoxicity was minimal, at 0.2-4%, with gentamicin doses of 5-10 mg/kg/day for 3 days. Conclusions: Once-daily gentamicin doses of 6-7 mg/kg are recommended to prevent infections in patients with multiple trauma. Gentamicin monotherapy cannot be recommended for serious infections. Further clinical studies are required to confirm these results.
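The Monte Carlo step described above can be sketched for the AUC24h/MIC ≥50 prevention target. In a one-compartment model, AUC24h is simply the daily dose divided by clearance, so the simulation reduces to sampling clearance. This is an illustration only: the clearance mean and variability below are hypothetical population values, not the parameters estimated in the study.

```python
import math
import random

def pta_auc_target(dose_mg_per_kg, mic, cl_mean=0.1, cl_cv=0.3,
                   target=50, n=20000, seed=1):
    """Percentage of simulated patients reaching AUC24h/MIC >= target.
    One-compartment model: AUC24h = daily dose / clearance.
    cl_mean (L/h/kg) and cl_cv are assumed values, not from this study."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # Sample clearance from a lognormal distribution around cl_mean.
        cl = rng.lognormvariate(math.log(cl_mean), cl_cv)  # L/h/kg
        auc24 = dose_mg_per_kg / cl                         # mg*h/L
        if auc24 / mic >= target:
            hits += 1
    return 100.0 * hits / n
```

With these assumed parameters, 6 mg/kg/day attains the prevention target at low MICs but not at MIC ≥2 mg/L, mirroring the pattern the abstract reports.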

3.
Value Health Reg Issues ; 37: 97-104, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37393722

ABSTRACT

OBJECTIVES: This study aimed to compare rabbit antithymocyte globulin plus cyclosporine (rATG/CsA) with oxymetholone in terms of direct medical expenditures and economic evaluation in patients with severe acquired aplastic anemia (SAA) and very severe acquired aplastic anemia (vSAA). METHODS: Patients with SAA/vSAA who initiated treatment with rATG/CsA or oxymetholone between 2004 and 2018 were included. A trial-based cost-effectiveness evaluation from the healthcare provider perspective was performed. Direct medical costs were retrieved from the hospital database, inflated, and converted to 2020 US dollars (30.01 Baht per US dollar). One-way sensitivity analysis and probabilistic sensitivity analysis by nonparametric bootstrap were performed. RESULTS: After 2-year follow-up, the total mean (SD) direct medical expenditures per patient were $8 514.48 ($12 595.67) for the oxymetholone group and $41 070.88 ($22 084.04) for the rATG/CsA group. Oxymetholone had a significantly lower survival rate than rATG/CsA (P=.001) and higher second-year blood transfusion need (71.4% vs 18.2%) and hospitalization (14.3% vs 0%). The incremental cost-effectiveness ratio was $45 854.08 per life-year gained when rATG/CsA was used instead of oxymetholone (95% CI $24 244.03-$143 496.67 per life-year gained). The probabilistic sensitivity analysis indicated that rATG/CsA had no chance of being cost-effective for SAA/vSAA when a willingness-to-pay threshold of one to three times the national gross domestic product per capita was applied. CONCLUSIONS: Oxymetholone remains a viable alternative in a resource-limited country. Despite its high cost, rATG/CsA is the preferred treatment option because of its significant advantages in reducing mortality, treatment complications, and hospitalization.
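The headline figure above is an incremental cost-effectiveness ratio (ICER): incremental cost divided by incremental effect. A short sketch; the two mean costs come from the abstract, but the life-year values are hypothetical, back-calculated so the example reproduces the reported ICER of $45,854.08 per life-year gained:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per unit of
    extra effect (here, US dollars per life-year gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Costs are the abstract's 2-year means; the 0.71 life-year gain implied
# by effect_new=1.71 vs effect_old=1.0 is an assumption, not a reported value.
print(round(icer(41070.88, 8514.48, 1.71, 1.0), 2))  # ~ 45854.08
```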


Subject(s)
Anemia, Aplastic , Antilymphocyte Serum , Animals , Rabbits , Anemia, Aplastic/drug therapy , Antilymphocyte Serum/therapeutic use , Cost-Effectiveness Analysis , Cyclosporine , Oxymetholone , Thailand , Humans
4.
Clin Pharmacol ; 15: 67-76, 2023.
Article in English | MEDLINE | ID: mdl-37427084

ABSTRACT

Background: In addition to the maximum plasma concentration (Cmax) to minimum inhibitory concentration (MIC) ratio, the 24-hour area under the concentration-time curve (AUC24h) to MIC ratio has recently been suggested as a pharmacokinetic/pharmacodynamic (PK/PD) target for efficacy and safety in once-daily dosing of gentamicin (ODDG) in critically ill patients. Purpose: This study aimed to predict the optimal effective dose and the risk of nephrotoxicity for gentamicin in critically ill patients for two different PK/PD targets within the first 3 days of infection. Methods: Pharmacokinetic and demographic data in critically ill patients gathered from 21 previously published studies were used to build a one-compartment pharmacokinetic model. Monte Carlo simulation (MCS) was conducted with once-daily gentamicin doses ranging from 5-10 mg/kg. The percentage of target attainment (PTA) for the efficacy targets, Cmax/MIC ~8-10 and AUC24h/MIC ≥110, was studied. AUC24h >700 mg⋅h/L and Cmin >2 mg/L were used to predict the risk of nephrotoxicity. Results: Gentamicin 7 mg/kg/day achieved both efficacy targets in more than 90% of simulated patients when the MIC was <0.5 mg/L. When the MIC increased to 1 mg/L, gentamicin 8 mg/kg/day could reach the PK/PD and safety targets. However, for pathogens with MIC ≥2 mg/L, no studied gentamicin dose was sufficient to reach the efficacy target. The risk of nephrotoxicity using the AUC24h >700 mg⋅h/L criterion was small, but the risk was greater when applying the Cmin >2 mg/L target. Conclusion: Considering both targets, Cmax/MIC ~8-10 and AUC24h/MIC ≥110, an initial gentamicin dose of 8 mg/kg/day is recommended in critically ill patients for pathogens with MIC ≤1 mg/L. Clinical validation of our results is essential.
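Under a one-compartment bolus approximation, the two efficacy targets discussed above reduce to two ratios: Cmax/MIC with Cmax ≈ dose/Vd, and AUC24h/MIC with AUC24h = daily dose/CL. A sketch with illustrative population values for Vd and CL (they are assumptions, not the estimates from this study):

```python
def exposure_targets(dose_mg_per_kg, mic, vd=0.25, cl=0.07):
    """Check both once-daily gentamicin PK/PD efficacy targets under a
    one-compartment IV-bolus approximation. vd (L/kg) and cl (L/h/kg)
    are illustrative values only, not estimates from this study."""
    cmax = dose_mg_per_kg / vd    # mg/L, peak concentration after a bolus
    auc24 = dose_mg_per_kg / cl   # mg*h/L over 24 h at steady state
    return {
        "Cmax/MIC": cmax / mic,
        "AUC24h/MIC": auc24 / mic,
        "meets_both": (cmax / mic >= 8) and (auc24 / mic >= 110),
    }
```

With these assumed parameters, 8 mg/kg meets both targets at MIC 1 mg/L but fails the AUC24h/MIC ≥110 target at MIC 2 mg/L, which matches the qualitative pattern in the abstract.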

5.
Ann Hematol ; 102(6): 1333-1340, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37010569

ABSTRACT

Iron deficiency anemia (IDA) is a common health problem in developing countries. Many studies have shown that low-dose oral iron can achieve similar efficacy with fewer gastrointestinal effects in iron deficiency without anemia. This prospective, open-label, randomized controlled study was designed to test whether 200 mg ferrous fumarate thrice weekly (TIW) was non-inferior to the thrice-daily (TID) regimen, and to compare the incidence of adverse events (AEs) between the two regimens, in the treatment of adult patients with IDA. The primary endpoint was either an increase in Hb ≥ 3 g/dL or reaching an Hb of 12 g/dL in females or 13 g/dL in males at the 12th week of treatment. Secondary outcomes included AEs, red blood cell indices, iron profiles, and patient compliance. Sixty-four patients were randomized: 32 to the TIW arm and 32 to the TID arm. Response rates did not differ between the two arms in either the intention-to-treat analysis (72.0%, 95% CI 56.6-88.5 vs. 71.9%, 95% CI 53.3-86.3, p = 0.777) or the per-protocol analysis (88.9%, 95% CI 70.8-97.6 vs. 88.5%, 95% CI 69.8-97.6, p = 1.0). The trial demonstrated non-inferiority at a margin of 23%. Although the iron profile response in the TID arm was earlier than in the TIW arm, almost all patients recovered from anemic symptoms at week 4, and hematologic responses did not differ at week 12. There were more gastrointestinal AEs in the TID arm. In conclusion, this study showed that TIW was non-inferior to TID iron treatment in IDA patients, with fewer AEs and lower costs.
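The 23% non-inferiority margin above can be checked with a confidence interval on the difference in response rates: the TIW regimen is non-inferior if the lower bound of the CI for (TIW - TID) lies above -0.23. A sketch using the abstract's ITT rates; the normal (Wald) approximation and the use of the full randomized arm sizes (32 per arm) are simplifying assumptions:

```python
import math

def noninferior(p_test, n_test, p_ref, n_ref, margin=0.23, z=1.96):
    """Non-inferiority of test vs reference on a response-rate difference:
    True if the lower bound of the ~95% Wald CI for (p_test - p_ref)
    is above -margin."""
    diff = p_test - p_ref
    se = math.sqrt(p_test * (1 - p_test) / n_test
                   + p_ref * (1 - p_ref) / n_ref)
    return (diff - z * se) > -margin

# ITT response rates from the abstract: TIW 72.0%, TID 71.9%.
print(noninferior(0.720, 32, 0.719, 32))  # prints True
```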


Subject(s)
Anemia, Iron-Deficiency , Anemia , Male , Female , Humans , Adult , Anemia, Iron-Deficiency/drug therapy , Prospective Studies , Iron , Ferrous Compounds , Anemia/chemically induced , Hemoglobins/analysis
6.
J Neurosci Rural Pract ; 10(4): 582-587, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31831975

ABSTRACT

Background: Early posttraumatic seizure (PTS) is a significant cause of unfavorable outcomes in traumatic brain injury (TBI). This study aimed to investigate the incidence of early PTS and to develop a predictive model for it. Materials and Methods: A prospective cohort study of 484 TBI patients was conducted. All patients were evaluated for seizure activity within 7 days after the injury. Risk factors for early PTS were identified using univariate analysis. Candidate risk factors with p < 0.1 were entered into a multivariable logistic regression analysis to identify predictors of early PTS. Model fit and discriminative power were assessed with the area under the receiver operating characteristic (AUROC) curve. A nomogram for individual prediction of early PTS was developed. Results: There were 27 patients (5.6%) with early PTS in this study. The final model identified chronic alcohol use (odds ratio [OR]: 4.06, 95% confidence interval [CI]: 1.64-10.07), epidural hematoma (OR: 3.98, 95% CI: 1.70-9.33), and Glasgow Coma Scale score 3-8 (OR: 3.78, 95% CI: 1.53-9.35) as predictors of early PTS. The AUROC was 0.77 (95% CI: 0.66-0.87). Conclusions: The significant predictors of early PTS were chronic alcohol use, epidural hematoma, and severe TBI. The nomogram is considered a reliable tool for prediction.
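The reported odds ratios can be turned into an individual risk estimate with the usual logistic transform, which is essentially what the nomogram encodes. The abstract does not report the model intercept, so the value below is hypothetical, chosen only to give a baseline risk below the 5.6% overall incidence:

```python
import math

def early_pts_probability(chronic_alcohol, edh, gcs_3_to_8, intercept=-3.9):
    """Predicted probability of early PTS from the reported odds ratios.
    Inputs are 0/1 indicators. The intercept is NOT reported in the
    abstract; -3.9 is a hypothetical value for illustration only."""
    log_odds = (intercept
                + math.log(4.06) * chronic_alcohol   # chronic alcohol use
                + math.log(3.98) * edh               # epidural hematoma
                + math.log(3.78) * gcs_3_to_8)       # GCS 3-8 (severe TBI)
    return 1.0 / (1.0 + math.exp(-log_odds))
```

Each present risk factor multiplies the odds by its OR; with all three present, the odds rise by a factor of about 61 over baseline.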

7.
Int J Gen Med ; 12: 455-463, 2019.
Article in English | MEDLINE | ID: mdl-31819596

ABSTRACT

PURPOSE: Serum digoxin concentration (SDC) monitoring may be unavailable in some healthcare settings, so predicted SDC comes into play in monitoring digoxin efficacy and toxicity. Renal function is an important parameter for predicting SDC. This study was conducted to compare measured and predicted SDC when using creatinine clearance (CrCl) from the Cockcroft-Gault (CG) equation and estimated glomerular filtration rate (eGFR) calculated from the CKD-Epidemiology Collaboration (CKD-EPI), re-expressed Modification of Diet in Renal Disease (Re-MDRD4), Thai-MDRD4, and Thai-eGFR equations in Sheiner's and Konishi's pharmacokinetic models. PATIENTS AND METHODS: In this retrospective study, patients with cardiovascular disease and a steady-state SDC within 0.5-2.0 mcg/L were enrolled. CrCl and the studied eGFRs adjusted for body surface area (BSA) were used in the models to determine the predicted SDC. The discrepancies between the measured and the predicted SDC were analyzed and compared. RESULTS: One hundred and twenty-four patients ranging in age from 22 to 88 years (median 60 years, IQR 50.2, 69.2) were studied. Their serum creatinine ranged from 0.40 to 1.80 mg/dL (median 0.90 mg/dL, IQR 0.79, 1.10). The mean±SD measured SDC was 1.12±0.34 mcg/L. In Sheiner's model, the mean predicted SDC calculated using the CG and the BSA-adjusted CKD-EPI equations did not differ from the measured levels (1.10±0.36 mcg/L (p=0.669) and 1.08±0.42 mcg/L (p=0.374), respectively). The CG, CKD-EPI, and Re-MDRD4 equations gave better predictions with minimal errors in patients with serum creatinine ≥0.9 mg/dL. In Konishi's model, the predicted SDC using the CG and the studied eGFR equations was lower than the measured SDC (p<0.05). CONCLUSION: In Sheiner's model, the CG and the BSA-adjusted CKD-EPI equations should be used for predicting SDC, especially in patients with serum creatinine ≥0.9 mg/dL. The other studied eGFR equations underestimated SDC in both Sheiner's and Konishi's models.
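The Cockcroft-Gault equation used above is a fixed, well-known formula: CrCl = (140 - age) × weight / (72 × SCr), multiplied by 0.85 for females. A minimal sketch (the example patient is hypothetical):

```python
def cockcroft_gault(age_yr, weight_kg, scr_mg_dl, female):
    """Creatinine clearance (mL/min) by the Cockcroft-Gault equation."""
    crcl = (140 - age_yr) * weight_kg / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

# Hypothetical patient near the study medians: 60 y, 70 kg, SCr 0.9 mg/dL.
print(round(cockcroft_gault(60, 70, 0.9, False), 1))  # prints 86.4
```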

8.
Am J Crit Care ; 21(4): 280-6, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22751371

ABSTRACT

BACKGROUND: Albumin is broadly prescribed for critically ill patients although it does not have a mortality benefit over crystalloids. One common use of albumin is to promote diuresis. OBJECTIVES: To compare urine output in patients treated with furosemide with and without albumin and to assess other variables possibly associated with enhanced diuresis. METHODS: A retrospective study was conducted on patients in a medical intensive care unit who received furosemide as a continuous infusion, with or without 25% albumin, for more than 6 hours. Primary end points were urine output and net fluid loss. RESULTS: A total of 31 patients were included in the final analysis. Mean urine output in patients treated with furosemide alone did not differ significantly from output in patients treated with furosemide plus albumin at 6, 24, and 48 hours: mean output, 1119 (SD, 597) mL vs 1201 (SD, 612) mL, P = .56; 4323 (SD, 1717) mL vs 4615 (SD, 1741) mL, P = .42; and 7563 (SD, 2766) mL vs 7432 (SD, 2324) mL, P = .94, respectively. Additionally, net fluid loss did not differ significantly between the 2 groups at 6, 24, and 48 hours. Higher concentrations of serum albumin did not improve urine output. The only independent variable significantly associated with enhanced urine output at 24 and 48 hours was increased fluid intake. CONCLUSION: Addition of albumin to a furosemide infusion did not enhance the diuresis obtained with furosemide alone in critically ill patients.


Subject(s)
Diuresis/drug effects , Furosemide/administration & dosage , Hypoalbuminemia/drug therapy , Serum Albumin/administration & dosage , Adult , Aged , Aged, 80 and over , Arizona , Critical Care/methods , Diuretics/administration & dosage , Drug Therapy, Combination , Female , Humans , Linear Models , Male , Middle Aged , Outcome Assessment, Health Care , Retrospective Studies , Serum Albumin/pharmacology , Young Adult