Results 1 - 12 of 12
1.
Crit Care Med ; 49(4): e444-e453, 2021 04 01.
Article in English | MEDLINE | ID: mdl-33591007

ABSTRACT

OBJECTIVES: Septic cardiomyopathy develops frequently in patients with sepsis and likely increases short-term mortality. However, whether septic cardiomyopathy is associated with long-term outcomes after sepsis is unknown. We investigated whether septic patients with septic cardiomyopathy have worse long-term outcomes than septic patients without septic cardiomyopathy. DESIGN: Retrospective cohort study. SETTING: Adult ICU. PATIENTS: Adult ICU patients with sepsis. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Left ventricular global longitudinal systolic strain was our primary measure of septic cardiomyopathy. We employed a suite of multivariable survival analyses to explore linear and nonlinear associations between left ventricular global longitudinal systolic strain and major adverse cardiovascular events, which included death, stroke, and myocardial infarction. Our primary outcome was major adverse cardiovascular events through 24 months after ICU discharge. Among 290 study patients, median left ventricular global longitudinal systolic strain was -16.8% (interquartile range, -20.4% to -12.6%), and 38.3% of patients (n = 111) experienced a major adverse cardiovascular event within 24 months after discharge. In our primary, linear analysis, there was a trend (p = 0.08) toward an association between left ventricular global longitudinal systolic strain and major adverse cardiovascular events (odds ratio, 1.03; CI, < 1 to 1.07). In our nonlinear analysis, the association was highly significant (p < 0.001), with both high and low left ventricular global longitudinal systolic strain associated with major adverse cardiovascular events among patients with pre-existing cardiac disease. This association was pronounced among patients who were younger (age < 65 yr) and had a Charlson Comorbidity Index greater than 5. CONCLUSIONS: Among patients with sepsis and pre-existing cardiac disease who survived to ICU discharge, left ventricular global longitudinal systolic strain demonstrated a U-shaped association with cardiovascular outcomes through 24 months. The relationship was especially strong among younger patients with more comorbidities. These observations are likely to be useful in the design of future trials.
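As a hedged illustration of the nonlinear analysis this abstract describes, the sketch below compares a linear logistic model of 24-month MACE against one with a spline term for strain, which is one common way to detect a U-shaped association; the file name, column names, and spline degrees of freedom are assumptions, not the study's actual code.

```python
# Hypothetical sketch: testing for a nonlinear (possibly U-shaped) association
# between LV global longitudinal strain (GLS) and 24-month MACE.
# File name, column names, and spline df are assumptions, not the study's code.
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("sepsis_echo_cohort.csv")   # hypothetical: one row per ICU survivor

# Linear analogue of the primary analysis
linear = smf.logit("mace_24mo ~ gls + age + charlson + prior_cardiac_disease",
                   data=df).fit()

# Nonlinear analogue: replace the linear GLS term with a B-spline basis
spline = smf.logit("mace_24mo ~ bs(gls, df=4) + age + charlson + prior_cardiac_disease",
                   data=df).fit()

# Likelihood-ratio test: does allowing curvature improve the fit?
lr_stat = 2 * (spline.llf - linear.llf)
p_nonlin = stats.chi2.sf(lr_stat, df=spline.df_model - linear.df_model)
print(f"LR stat {lr_stat:.2f}, p for nonlinearity {p_nonlin:.4f}")
```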


Subjects
Heart Ventricles/physiopathology; Sepsis/complications; Sepsis/physiopathology; Ventricular Dysfunction, Left/physiopathology; Adult; Cardiomyopathies/physiopathology; Echocardiography; Humans; Male; Middle Aged; Retrospective Studies; Risk Factors; Stroke Volume; Time Factors; Ventricular Dysfunction, Left/diagnosis; Ventricular Dysfunction, Left/etiology
2.
Clin Infect Dis ; 70(8): 1781-1787, 2020 04 10.
Article in English | MEDLINE | ID: mdl-31641768

ABSTRACT

Improving antibiotic prescribing in outpatient settings is a public health priority. In the United States, urgent care (UC) encounters are increasing and have high rates of inappropriate antibiotic prescribing. Our objective was to characterize antibiotic prescribing practices during UC encounters, with a focus on respiratory tract conditions. This was a retrospective cohort study of UC encounters in the Intermountain Healthcare network. Among 1.16 million UC encounters, antibiotics were prescribed in 34%, and respiratory conditions accounted for 61% of all antibiotics prescribed. Of respiratory encounters, 50% resulted in antibiotic prescriptions, yet prescribing rates varied across individual providers from 3% to 94%. Similar variability between providers was observed for respiratory conditions where antibiotics were not indicated and in first-line antibiotic selection for sinusitis, otitis media, and pharyngitis. These findings support the importance of developing antibiotic stewardship interventions specifically targeting UC settings.
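For readers interested in how a provider-level spread like 3%-94% is typically tabulated, here is a minimal pandas sketch; the file and column names are assumptions, and the study's own cohort definitions were more involved.

```python
# Hypothetical sketch of per-provider prescribing rates for respiratory encounters.
import pandas as pd

enc = pd.read_csv("uc_encounters.csv")   # assumed: one row per urgent care encounter

resp = enc[enc["dx_group"] == "respiratory"]
rates = (resp.groupby("provider_id")["antibiotic_prescribed"]
             .agg(n_encounters="size", rx_rate="mean"))

# Restrict to providers with enough volume before quoting a spread
rates = rates[rates["n_encounters"] >= 30]
print(rates["rx_rate"].describe(percentiles=[0.05, 0.5, 0.95]))
```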


Subjects
Anti-Bacterial Agents; Respiratory Tract Infections; Ambulatory Care; Anti-Bacterial Agents/therapeutic use; Humans; Inappropriate Prescribing; Outpatients; Practice Patterns, Physicians'; Respiratory Tract Infections/drug therapy; Retrospective Studies; United States
3.
Eur Respir J ; 54(1)2019 07.
Article in English | MEDLINE | ID: mdl-31023851

ABSTRACT

QUESTION: Is broad-spectrum antibiotic use associated with poor outcomes in community-onset pneumonia after adjusting for confounders? METHODS: We performed a retrospective, observational cohort study of 1995 adults with pneumonia admitted from four US hospital emergency departments. We used multivariable regressions to investigate the effect of broad-spectrum antibiotics on 30-day mortality, length of stay, cost and Clostridioides difficile infection (CDI). To address indication bias, we developed a propensity score using multilevel (individual provider) generalised linear mixed models and performed inverse-probability of treatment weighting (IPTW) to estimate the average treatment effect in the treated. We also manually reviewed a sample of mortality cases for antibiotic-associated adverse events. RESULTS: 39.7% of patients received broad-spectrum antibiotics, but drug-resistant pathogens were recovered in only 3%. Broad-spectrum antibiotics were associated with increased mortality in both the unweighted multivariable model (OR 3.8, 95% CI 2.5-5.9; p<0.001) and the IPTW analysis (OR 4.6, 95% CI 2.9-7.5; p<0.001). Broad-spectrum antibiotic use by either analysis was also associated with longer hospital stay, greater cost and increased CDI. Healthcare-associated pneumonia was not associated with mortality independent of broad-spectrum antibiotic use. On manual review, we identified antibiotic-associated adverse events in 17.5% of mortality cases. CONCLUSION: Broad-spectrum antibiotics appear to be associated with increased mortality and other poor outcomes in community-onset pneumonia.
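A minimal sketch of inverse-probability-of-treatment weighting targeting the average treatment effect in the treated, assuming hypothetical column names; the study itself used a multilevel (provider-level) mixed model for the propensity score, and valid inference would also require robust or bootstrapped standard errors, which are omitted here.

```python
# Simplified IPTW (ATT) sketch; column names and covariates are assumptions.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("pneumonia_cohort.csv")   # hypothetical cohort file

# 1) Propensity of receiving broad-spectrum antibiotics (plain logistic stand-in
#    for the study's multilevel mixed model)
ps_model = smf.logit("broad_spectrum ~ age + curb65 + charlson + hcap", data=df).fit()
df["ps"] = ps_model.predict(df)

# 2) ATT weights: treated patients get weight 1, controls get ps / (1 - ps)
df["w"] = 1.0
controls = df["broad_spectrum"] == 0
df.loc[controls, "w"] = df.loc[controls, "ps"] / (1 - df.loc[controls, "ps"])

# 3) Weighted outcome model for 30-day mortality (point estimate only;
#    robust/bootstrap standard errors omitted for brevity)
out = smf.glm("death_30d ~ broad_spectrum", data=df,
              family=sm.families.Binomial(), freq_weights=df["w"]).fit()
print(out.summary())
```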


Subjects
Anti-Bacterial Agents/therapeutic use; Community-Acquired Infections/drug therapy; Length of Stay/statistics & numerical data; Pneumonia/drug therapy; Pneumonia/mortality; Aged; Anti-Bacterial Agents/classification; Clostridium Infections; Databases, Factual; Drug Resistance, Bacterial; Female; Humans; Linear Models; Male; Middle Aged; Multivariate Analysis; Propensity Score; Retrospective Studies; Time Factors; Utah/epidemiology
4.
Int J Sports Phys Ther ; 19(1): 1438-1453, 2024.
Article in English | MEDLINE | ID: mdl-38179590

ABSTRACT

Background: Plantar fasciitis (PF) results in pain-related disability and excessive healthcare costs. Photobiomodulation therapy (PBMT) has shown promise for decreasing both pain and disability related to PF. Purpose: The purpose was to assess the clinical impact of PBMT on pain and function in people with PF. Study Design: Prospective, randomized controlled clinical trial. Methods: A convenience sample of adults with PF was randomly assigned to one of three groups: (1) usual care (UC), (2) usual care plus nine doses of PBMT with 25 W output power over three weeks, or (3) usual care plus nine doses of PBMT with 10 W output power over three weeks. Participants in both the 10 W and 25 W PBMT groups received the same total dose (10 J/cm2), calculated with a simple area equation. Pain (Defense and Veterans Pain Rating Scale) and function (Foot and Ankle Ability Measure) were measured at baseline and weeks 3 and 6 for all groups, and at weeks 13 and 26 for the PBMT groups. Results: The PBMT groups experienced a reduction in pain over the first three weeks (from an average of 4.5 to 2.8), after which their pain levels remained mostly constant, while the UC group experienced a smaller reduction in pain (from an average of 4 to 3.8). The effects on pain did not differ between the PBMT groups. PBMT in both treatment groups also improved function more than UC, again with the improvement occurring within the first three weeks. Conclusions: Pain and function improved during the three weeks of PBMT plus UC and remained stable over the following three weeks. Improvements were sustained through six months in the PBMT plus UC groups. Level of Evidence: Level II - RCT or Prospective Comparative Study.
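The "simple area equation" amounts to holding fluence (energy per unit area) constant, so the two power settings differ only in irradiation time for the same treated area; the sketch below works through that arithmetic with an assumed treatment area, not the trial's actual device parameters.

```python
# Illustrative dose arithmetic for matching fluence (J/cm^2) across laser powers.
# Treatment area below is an assumption for the sketch, not a trial setting.
def treatment_time_s(target_fluence_j_per_cm2, power_w, area_cm2):
    """Time needed so that power * time / area equals the target fluence."""
    return target_fluence_j_per_cm2 * area_cm2 / power_w

AREA = 100.0     # assumed treated surface, cm^2
FLUENCE = 10.0   # J/cm^2, as in the protocol

for power in (25.0, 10.0):
    t = treatment_time_s(FLUENCE, power, AREA)
    print(f"{power:>4.0f} W over {AREA:.0f} cm^2 -> {t:.0f} s for {FLUENCE} J/cm^2")
```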

5.
Mil Med ; 187(5-6): 684-689, 2022 05 03.
Article in English | MEDLINE | ID: mdl-34559224

ABSTRACT

INTRODUCTION: The six-item Foot Posture Index (FPI-6) was previously developed as an assessment tool to measure the posture of the foot across multiple segments and planes. It was derived from a criterion-based observational assessment of six components of each foot during static standing. The association between abnormal foot posture and musculoskeletal injuries remains unclear and is in need of further exploration. HYPOTHESIS/PURPOSE: The purpose of this study was to assess the association between foot biomechanics and self-reported history of musculoskeletal pain or injury. STUDY DESIGN: Retrospective, cross-sectional study of collegiate football players at the U.S. Naval Academy. MATERIALS AND METHODS: For each athlete, data were recorded on height, weight, self-reported history of pain or injury, and foot posture, which was measured using the FPI-6 with each item measuring the degree of pronation/supination. The primary outcome was each athlete's maximum deviation from neutral posture across the six-item index (FPImax). The prespecified primary analysis used generalized linear models to measure the association between FPImax and self-reported history of pain or injury. Exploratory analyses measured the association using penalized regression (L1-norm) and a type of tree-based ensemble known as extreme gradient boosting (XGBoost). RESULTS: Data were collected on 101 athletes, 99 of whom had sufficient body mass index (BMI) data to be included for analysis. Among the 99 athletes, higher FPImax was associated with a prior history of musculoskeletal pain (odds ratio [OR] 1.15, 95% confidence interval [CI] 0.97 to 1.35), although the sample size was too small for the association to reach statistical significance (P = .107). FPImax was not associated with a history of knee injury/pain (OR 0.98, 95% CI 0.83 to 1.15, P = .792), nor with a history of ankle/foot injury or pain (OR 1.04, 95% CI 0.90 to 1.21, P = .599). From the L1-penalized model, the FPI components with the strongest linear associations were L6, R2, R1-squared, and FPImax. From the XGBoost model, the most important variables were FPItotal, BMI, R1, and R2. CONCLUSIONS: U.S. Naval Academy football players whose foot postures deviated from neutral were more likely to have reported a previous history of musculoskeletal pain. However, this deviation from neutral was not strongly associated with a specific history of pain or injury to the knee, ankle, or foot. CLINICAL RELEVANCE: The information ascertained from this study could be used to better inform clinicians about the value of the FPI in predicting or mitigating injuries for varsity football athletes.
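The sketch below illustrates the three analysis styles named above (a GLM for FPImax, an L1-penalized logistic model, and XGBoost with feature importances) on a hypothetical data frame; the column names and hyperparameters are assumptions, not the study's code.

```python
# Hedged sketch of the three analyses: GLM, L1-penalized logistic, XGBoost.
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from xgboost import XGBClassifier

df = pd.read_csv("fpi_athletes.csv")   # hypothetical: one row per athlete
features = ["fpi_max", "fpi_total", "bmi", "l1", "l2", "l6", "r1", "r2"]

# 1) Primary GLM analogue: association of FPImax with any prior MSK pain
glm = smf.logit("msk_pain ~ fpi_max + bmi", data=df).fit()
print(glm.summary())

# 2) L1-penalized logistic regression (standardize so the penalty is comparable)
X = StandardScaler().fit_transform(df[features])
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, df["msk_pain"])
print(dict(zip(features, lasso.coef_[0])))

# 3) Gradient-boosted trees and their importance ranking
xgb = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
xgb.fit(df[features], df["msk_pain"])
print(dict(zip(features, xgb.feature_importances_)))
```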


Subjects
Football; Musculoskeletal Pain; Athletes; Biomechanical Phenomena; Cross-Sectional Studies; Football/injuries; Humans; Musculoskeletal Pain/epidemiology; Musculoskeletal Pain/etiology; Retrospective Studies
6.
J Diabetes Sci Technol ; 16(2): 383-389, 2022 Mar.
Article in English | MEDLINE | ID: mdl-32935561

ABSTRACT

BACKGROUND: Approximately 30 million Americans currently suffer from diabetes, and nearly 55 million people will be impacted by 2030. Continuous glucose monitoring (CGM) systems help patients manage their care with real-time data. Although approximately 95% of those with diabetes suffer from type 2, few studies have measured CGM's clinical impact for this segment within an integrated healthcare system. METHODS: A parallel randomized, multisite prospective trial was conducted using a new CGM device (Dexcom G6) compared to a standard of care finger stick glucometer (FSG) (Contour Next One). All participants received usual care in primary care clinics for six consecutive months while using these devices. Data were collected via electronic medical records, device outputs, exit surveys, and insurance company (SelectHealth) claims in accordance with institutional review board approval. RESULTS: Ninety-nine patients were randomized for analysis (n = 50 CGM and n = 49 FSG). CGM patients significantly decreased hemoglobin A1c (p = .001), total visits (p = .009), emergency department encounters (p = .018), and labs ordered (p = .001). Among SelectHealth non-Medicare Advantage patients, per member per month savings were $417 for CGM compared to FSG, but $9 more for Medicare Advantage. Seventy percent of CGM users reported that the technology helped them better understand daily activity and diet compared to only 16% for FSG. DISCUSSION: Participants using CGM devices had meaningful improvements in clinical outcomes, costs, and self-reported measures compared to the FSG group. Although a larger study is necessary to confirm these results, CGM devices appear to improve patient outcomes while making treatment more affordable.
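As a rough illustration only, per-member-per-month (PMPM) cost is typically computed as total paid claims divided by total enrolled member-months within each study arm; the sketch below assumes hypothetical claims and enrollment files, not SelectHealth's actual schema.

```python
# Illustrative PMPM calculation; file and column names are assumptions.
import pandas as pd

claims = pd.read_csv("claims.csv")          # assumed: one row per paid claim
enroll = pd.read_csv("member_months.csv")   # assumed: member_id, study_arm, member_months

paid = claims.groupby("member_id")["paid_amount"].sum()
df = enroll.join(paid, on="member_id").fillna({"paid_amount": 0.0})

totals = df.groupby("study_arm").agg(paid=("paid_amount", "sum"),
                                     member_months=("member_months", "sum"))
totals["pmpm"] = totals["paid"] / totals["member_months"]
print(totals["pmpm"])   # e.g., compare the CGM and FSG arms
```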


Subjects
Delivery of Health Care, Integrated; Diabetes Mellitus, Type 1; Aged; Blood Glucose; Blood Glucose Self-Monitoring; Diabetes Mellitus, Type 1/drug therapy; Glycated Hemoglobin/analysis; Humans; Medicare; Prospective Studies; United States
7.
IEEE Trans Biomed Eng ; 68(1): 181-191, 2021 01.
Article in English | MEDLINE | ID: mdl-32746013

ABSTRACT

OBJECTIVE: Septic shock is a life-threatening manifestation of infection with a mortality of 20-50% [1]. A catecholamine vasopressor, norepinephrine (NE), is widely used to treat septic shock, primarily by increasing blood pressure. For this reason, knowledge of future blood pressure is invaluable for properly controlling NE infusion rates in septic patients. However, recent machine learning and data-driven methods often treat the physiological effects of NE as a black box. In this paper, a real-time, physiology-informed model of human mean arterial blood pressure (MAP) for septic shock patients undergoing NE infusion is studied. METHODS: Our methods combine learning theory, adaptive filter theory, and physiology. We learn least mean square adaptive filters to predict three physiological parameters (heart rate, pulse pressure, and the product of total arterial compliance and arterial resistance) from previous data and the previous NE infusion rate. These predictions are combined according to a physiology model to predict future mean arterial blood pressure. RESULTS: Our model successfully forecasts mean arterial blood pressure for 30 septic patients from two databases. Specifically, we predict mean arterial blood pressure 3.33 minutes to 20 minutes into the future with a root mean square error from 3.56 mmHg to 6.22 mmHg. Additionally, we compare the computational cost of different models and discover a correlation between learned NE response models and a patient's SOFA score. CONCLUSION: Our approach advances our capability to predict the effects of changing NE infusion rates in septic patients. SIGNIFICANCE: More accurately predicted MAP can lessen clinicians' workload and reduce error in NE titration.
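A minimal sketch of the general approach, assuming synthetic data: a normalized LMS adaptive filter forecasts each physiological signal one step ahead from its own lags and the NE infusion rate, and the forecasts are combined through a Windkessel-style relation MAP ≈ (HR/60) × PP × (R·C). The filter order, step size, and that combination formula are illustrative assumptions, not the paper's exact model.

```python
# Hedged sketch: normalized LMS forecasts combined into a MAP forecast.
import numpy as np

def nlms_forecast(signal, exog, order=8, mu=0.5, eps=1e-6):
    """One-step-ahead normalized-LMS prediction of `signal` from its own lags
    and lags of an exogenous input (here, the NE infusion rate)."""
    w = np.zeros(2 * order)
    preds = np.full(len(signal), np.nan)
    for t in range(order, len(signal) - 1):
        x = np.concatenate([signal[t - order:t], exog[t - order:t]])
        preds[t + 1] = w @ x
        err = signal[t + 1] - preds[t + 1]
        w += mu * err * x / (eps + x @ x)   # normalized LMS weight update
    return preds

# Synthetic minute-by-minute demo signals (purely illustrative)
rng = np.random.default_rng(0)
n = 600
ne = np.clip(0.1 + 0.002 * np.arange(n) + rng.normal(0, 0.01, n), 0, 1)   # infusion rate
hr = 95 + 5 * np.sin(np.arange(n) / 40) + rng.normal(0, 1, n)             # beats/min
pp = 35 + 25 * ne + rng.normal(0, 1.5, n)                                 # mmHg
rc = 0.9 + 0.3 * ne + rng.normal(0, 0.02, n)                              # s (R*C product)

hr_hat, pp_hat, rc_hat = (nlms_forecast(s, ne) for s in (hr, pp, rc))

# Assumed physiology combination: CO ~ (HR/60) * C * PP and MAP ~ CO * R,
# so MAP ~ (HR/60) * PP * (R*C)
map_hat = (hr_hat / 60.0) * pp_hat * rc_hat
map_ref = (hr / 60.0) * pp * rc
rmse = np.sqrt(np.nanmean((map_hat - map_ref) ** 2))
print(f"synthetic one-step RMSE: {rmse:.2f} mmHg")
```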


Subjects
Norepinephrine; Shock, Septic; Arterial Pressure; Blood Pressure; Humans; Norepinephrine/pharmacology; Prospective Studies; Shock, Septic/drug therapy; Vasoconstrictor Agents/pharmacology; Vasoconstrictor Agents/therapeutic use
8.
J Orthop Sports Phys Ther ; 51(12): 619-627, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34847698

ABSTRACT

OBJECTIVE: To investigate the influence of time taken to begin musculoskeletal rehabilitation on injury recurrence and ankle-related medical care use at 1 year after ankle sprain. DESIGN: Retrospective cohort study of all beneficiaries of the US Military Health System seeking care for an ankle sprain over a 4-year period. METHODS: Individuals were classified according to whether they did or did not receive physical rehabilitation. For those who received rehabilitation (n = 6150), linear relationships (with appropriate covariate controls) were analyzed with generalized linear models and generalized additive models to measure the effects of rehabilitation timing on injury recurrence and injury-related medical care use (costs and visits) at 1 year after injury. The nonlinear effect of rehabilitation timing on the probability of recurrence was also assessed. RESULTS: Approximately 1 in 4 people received rehabilitation. The probability of ankle sprain recurrence increased for each day that rehabilitation was not provided during the first week. The probability of recurrence then plateaued until about 2 months after the initial injury before increasing again, with 2 times greater odds of recurrence compared to those who received physical rehabilitation within the first month. When rehabilitation care was delayed, the odds of recurrence (odds ratio [OR] = 1.28), the number of foot/ankle-related visits (OR = 1.22), and foot/ankle-related costs (OR = 1.13; up to $1400 per episode) all increased. CONCLUSION: The earlier musculoskeletal rehabilitation care started after an ankle sprain, the lower the likelihood of recurrence and the downstream ankle-related medical costs incurred. J Orthop Sports Phys Ther 2021;51(12):619-627. doi:10.2519/jospt.2021.10730.
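A hedged sketch of how the nonlinear effect of rehabilitation timing on recurrence might be probed with a spline logistic model (a simpler stand-in for the generalized additive models mentioned above); file and column names are assumptions.

```python
# Hypothetical sketch: nonlinear effect of days-to-first-rehab-visit on recurrence.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ankle_sprain_rehab.csv")   # assumed: one row per rehabilitated patient

model = smf.logit(
    "recurrence_1yr ~ bs(days_to_rehab, df=4) + age + C(sex) + charlson", data=df
).fit()

# Predicted recurrence probability across the timing range,
# with other covariates held at typical values
grid = pd.DataFrame({
    "days_to_rehab": np.arange(0, 121),
    "age": df["age"].median(),
    "sex": df["sex"].mode()[0],
    "charlson": df["charlson"].median(),
})
grid["p_recurrence"] = model.predict(grid)
print(grid.loc[[0, 7, 30, 60, 90, 120]])
```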


Subjects
Ankle Injuries; Military Health Services; Sprains and Strains; Ankle Joint; Humans; Retrospective Studies; United States
9.
Mil Med ; 185(1-2): e203-e211, 2020 02 12.
Article in English | MEDLINE | ID: mdl-31268524

ABSTRACT

INTRODUCTION: Acquired brain injury (ABI), whether resulting from traumatic brain injury (TBI) or cerebrovascular accident (CVA), represents a major health concern for the Department of Defense and the nation. TBI has been referred to as the "signature" injury of recent U.S. military conflicts in Iraq and Afghanistan, affecting approximately 380,000 service members from 2000 to 2017, whereas CVA has been estimated to affect 795,000 individuals each year in the United States. TBI and CVA often present with similar motor, cognitive, and emotional deficits; therefore, the treatment interventions for both often overlap. The Defense Health Agency and Veterans Health Administration would benefit from enhanced rehabilitation solutions to treat deficits resulting from ABI, including both TBI and CVA. The purpose of this study was to evaluate the feasibility of implementing a novel, integrative, and intensive virtual rehabilitation system for treating symptoms of ABI in an outpatient clinic. The secondary aim was to evaluate the system's clinical effectiveness. MATERIALS AND METHODS: Military healthcare beneficiaries with ABI diagnoses completed a 6-week randomized feasibility study of the BrightBrainer Virtual Rehabilitation (BBVR) system in an outpatient military hospital clinic. Twenty-six candidates were screened, consented, and randomized, 21 of whom completed the study. The BBVR system is an experimental adjunct ABI therapy program which utilizes virtual reality and repetitive bilateral upper extremity training. Four self-report questionnaires measured participant and provider acceptance of the system. Seven clinical outcomes included the Fugl-Meyer Assessment of Upper Extremity, Box and Blocks Test, Jebsen-Taylor Hand Function Test, Automated Neuropsychological Assessment Metrics (ANAM), Neurobehavioral Symptom Inventory, Quick Inventory of Depressive Symptomatology-Self-Report, and Post-Traumatic Stress Disorder Checklist-Civilian Version. The statistical analyses used bootstrapping, non-parametric statistics, and multilevel/hierarchical modeling as appropriate. This research was approved by the Walter Reed National Military Medical Center and Uniformed Services University of the Health Sciences Institutional Review Boards. RESULTS: All of the participants and providers reported moderate to high levels of utility, ease of use, and satisfaction with the BBVR system (x̄ = 73-86%). Adjunct therapy with the BBVR system trended toward statistical significance for the measure of cognitive function (ANAM; x̄ = -1.07, 95% CI -2.27 to 0.13, p = 0.074); however, none of the other effects approached significance. CONCLUSION: This research provides evidence for the feasibility of implementing the BBVR system in an outpatient military setting for treatment of ABI symptoms. It is believed these data justify conducting a larger, randomized trial of the clinical effectiveness of the BBVR system.
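Since the analyses relied on bootstrapping with a small sample (21 completers), a minimal bootstrap confidence-interval sketch is shown below; the change scores are synthetic and purely illustrative.

```python
# Minimal percentile-bootstrap CI for a mean pre/post change score.
import numpy as np

rng = np.random.default_rng(42)
change = rng.normal(-1.0, 2.5, size=21)      # synthetic stand-in for 21 completers

boot_means = np.array([
    rng.choice(change, size=change.size, replace=True).mean()
    for _ in range(10_000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean change {change.mean():.2f}, 95% bootstrap CI ({lo:.2f}, {hi:.2f})")
```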


Subjects
Brain Injuries; Military Personnel; Telerehabilitation; Afghanistan; Brain Injuries/complications; Feasibility Studies; Humans; Iraq; United States
10.
IEEE J Transl Eng Health Med ; 7: 4100209, 2019.
Article in English | MEDLINE | ID: mdl-31475080

ABSTRACT

Norepinephrine (NE), an endogenous catecholamine, is a mainstay treatment for septic shock, which is a life-threatening manifestation of severe infection. NE counteracts the loss in blood pressure associated with septic shock. However, an NE infusion that is too low fails to counteract the blood pressure drop, and an NE infusion that is too high can cause a hypertensive crisis and heart attack. Ideally, the NE infusion rate should maintain a patient's mean arterial blood pressure (MAP) above 65 mmHg. There are a few data-driven, quantitative models that predict MAP and incorporate NE effects. This paper presents a model, driven by intensive care unit (ICU) measurable data and known NE inputs, to predict the future MAP of an ICU patient. We derive a least squares estimation model for MAP based on available ICU data, including heart period, NE infusion rate, and respiration wave. We learn the parameters of our model from initial patient data and then use this information to predict future MAP data. We assess our model with data from 12 septic patients. Our model successfully predicts and tracks MAP when the NE infusion rate changes. Specifically, we predict MAP 3 to 20 min into the future with a mean error of less than 4 to 7 mmHg across the 12 patients. Conclusion: This new approach creates the potential to advance methods for predicting the NE infusion rate in septic patients. Significance: Successfully predicting patients' MAP could reduce catastrophic human error and lessen clinicians' workload.
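A minimal sketch of a batch least-squares one-step-ahead MAP predictor built from lagged ICU signals and the NE infusion rate; the lag structure and synthetic signals are assumptions, not the paper's exact regressors.

```python
# Hedged least-squares MAP forecasting sketch on synthetic data.
import numpy as np

def fit_map_model(map_sig, heart_period, ne_rate, resp, lags=5):
    """Solve min_w ||A w - MAP(t+1)||^2 with lagged covariates as columns of A."""
    rows, target = [], []
    for t in range(lags, len(map_sig) - 1):
        rows.append(np.concatenate([
            map_sig[t - lags:t], heart_period[t - lags:t],
            ne_rate[t - lags:t], resp[t - lags:t], [1.0],
        ]))
        target.append(map_sig[t + 1])
    A, y = np.asarray(rows), np.asarray(target)
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w, A, y

# Synthetic demo signals (illustrative only)
rng = np.random.default_rng(1)
n = 400
ne = np.clip(0.1 + 0.001 * np.arange(n) + rng.normal(0, 0.01, n), 0, 1)   # infusion rate
hp = 0.65 + rng.normal(0, 0.01, n)                                        # heart period, s
resp = np.sin(np.arange(n) / 5) + rng.normal(0, 0.1, n)                   # respiration wave
mp = 60 + 40 * ne + rng.normal(0, 1.5, n)                                 # MAP, mmHg

w, A, y = fit_map_model(mp, hp, ne, resp)
rmse = np.sqrt(np.mean((A @ w - y) ** 2))
print(f"in-sample one-step RMSE: {rmse:.2f} mmHg")
```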

11.
Ann Am Thorac Soc ; 14(2): 200-205, 2017 Feb.
Article in English | MEDLINE | ID: mdl-27690519

ABSTRACT

RATIONALE: Guidelines recommend a switch from intravenous to oral antibiotics once patients who are hospitalized with pneumonia achieve clinical stability. However, little evidence guides the selection of an oral antibiotic for patients with health care-associated pneumonia, especially where no microbiological diagnosis is made. OBJECTIVES: To compare outcomes between patients who were transitioned to broad- versus narrow-spectrum oral antibiotics after initially receiving broad-spectrum intravenous antibiotic coverage. METHODS: We performed a secondary analysis of an existing database of adults with community-onset pneumonia admitted to seven Utah hospitals. We identified 220 inpatients with microbiology-negative health care-associated pneumonia from 2010 to 2012. After excluding inpatient deaths and treatment failures, 173 patients remained in whom broad-spectrum intravenous antibiotics were transitioned to an oral regimen. We classified oral regimens as broad-spectrum (fluoroquinolone) versus narrow-spectrum (usually a ß-lactam). We compared demographic and clinical characteristics between groups. Using a multivariable regression model, we adjusted outcomes by severity (electronically calculated CURB-65), comorbidity (Charlson Index), time to clinical stability, and length of intravenous therapy. MEASUREMENTS AND MAIN RESULTS: Age, severity, comorbidity, length of intravenous therapy, and clinical response were similar between the two groups. Observed 30-day readmission (11.9 vs. 21.4%; P = 0.26) and 30-day all-cause mortality (2.3 vs. 5.3%; P = 0.68) were also similar between the narrow and broad oral antibiotic groups. In multivariable analysis, we found no statistically significant differences in the adjusted odds of 30-day readmission (adjusted odds ratio, 0.56; 95% confidence interval, 0.06-5.2; P = 0.61) or 30-day all-cause mortality (adjusted odds ratio, 0.55; 95% confidence interval, 0.19-1.6; P = 0.26) between the narrow and broad oral antibiotic groups. CONCLUSIONS: On the basis of analysis of a limited number of patients observed retrospectively, our findings suggest that it may be safe to switch from broad-spectrum intravenous antibiotic coverage to a narrow-spectrum oral antibiotic once clinical stability is achieved for hospitalized patients with health care-associated pneumonia when no microbiological diagnosis is made. A larger retrospective study with propensity matching or a regression-adjusted test of equivalence, or ideally a prospective comparative effectiveness study, will be necessary to confirm our observations.
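A hedged sketch of the kind of adjusted comparison described above: a logistic model of 30-day readmission on oral antibiotic spectrum, adjusted for severity, comorbidity, time to stability, and intravenous days; the file and column names are assumptions, not the study's data set.

```python
# Hypothetical adjusted logistic regression for 30-day readmission.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hcap_oral_switch.csv")   # assumed: one row per patient

model = smf.logit(
    "readmit_30d ~ broad_oral + curb65 + charlson + time_to_stability + iv_days",
    data=df,
).fit()

or_broad = np.exp(model.params["broad_oral"])
ci_lo, ci_hi = np.exp(model.conf_int().loc["broad_oral"])
print(f"adjusted OR {or_broad:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```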


Subjects
Anti-Bacterial Agents/therapeutic use; Community-Acquired Infections/drug therapy; Patient Readmission/statistics & numerical data; Pneumonia/drug therapy; Pneumonia/mortality; Administration, Oral; Aged; Aged, 80 and over; Databases, Factual; Drug Resistance, Bacterial; Female; Humans; Logistic Models; Male; Middle Aged; Mortality; Multivariate Analysis; Odds Ratio; Utah/epidemiology
12.
Acta Trop ; 99(2-3): 113-8, 2006 Oct.
Article in English | MEDLINE | ID: mdl-17022931

ABSTRACT

BACKGROUND: Anaemia is a major complication of Plasmodium falciparum malaria among small children in sub-Saharan Africa. We studied the performance of the Integrated Management of Childhood Illness (IMCI) recommended assessment of no/some/severe pallor as a predictor of anaemia in health surveys at community level and in clinical practice in an outpatient department (OPD) and in a hospital ward in rural Tanzania. METHODS: The study was undertaken among children aged 6-36 months. Pallor was evaluated as a combined assessment of the conjunctiva, tongue and palms and categorised as no, some or severe pallor. Packed cell volume (PCV) was measured and related to pallor. FINDINGS: A total of 740 examinations were performed at the village, OPD and hospital ward levels. The prevalences of severe pallor were 0%, 1.5% and 7%, respectively. The prevalences of any pallor were 14%, 41% and 86%. The prevalences of severe anaemia (PCV < 21%) were 1%, 5% and 81%, and of any anaemia (PCV < 33%) 68%, 73% and 98%. Severe pallor could not detect severe anaemia: the sensitivities were only 0%, 0% and 8%. The sensitivities of any pallor for detecting severe anaemia were, however, 86% and 98% at the two health care facility levels, but the predictive value remained relatively poor since the specificities were only 61% and 68%. INTERPRETATION: Division of pallor into some or severe degrees was of no use at any health care level. The identification of any pallor was of no use at village level, but it may possibly be of some value as a screening test for severe anaemia at health care facilities, if additional assessment is included in view of the low specificity and positive predictive value of the finding.
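The screening-test arithmetic behind these figures is straightforward; the sketch below computes sensitivity, specificity, and positive predictive value from a 2x2 table whose counts are invented for illustration only (chosen merely to be roughly consistent with the reported percentages).

```python
# Worked screening-test arithmetic; the 2x2 counts are hypothetical.
def screening_stats(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp) if (tp + fp) else float("nan")
    return sens, spec, ppv

# Example: "any pallor" as a test for severe anaemia (PCV < 21%) at an OPD-like site
tp, fp, fn, tn = 12, 150, 2, 236     # invented counts for illustration
sens, spec, ppv = screening_stats(tp, fp, fn, tn)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}, PPV {ppv:.0%}")
```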


Subjects
Anemia/diagnosis; Anemia/parasitology; Malaria, Falciparum/diagnosis; Pallor/parasitology; Plasmodium falciparum/growth & development; Anemia/blood; Anemia/pathology; Animals; Child, Preschool; Cross-Sectional Studies; Female; Hematocrit; Humans; Infant; Malaria, Falciparum/blood; Malaria, Falciparum/parasitology; Malaria, Falciparum/pathology; Male; Pallor/blood; Pallor/pathology; Predictive Value of Tests; Rural Population; Sensitivity and Specificity; Tanzania