1 - 20 of 52
1.
AIDS ; 38(7): 1025-1032, 2024 Jun 01.
Article En | MEDLINE | ID: mdl-38691049

OBJECTIVE: Investigate the role of the Ryan White HIV/AIDS Program (RWHAP) - which funds services for vulnerable and historically disadvantaged populations with HIV - in reducing health inequities among people with HIV over a 10-year horizon. DESIGN: We use an agent-based microsimulation model to incorporate the complexity of the program and the long time horizon. METHODS: We use a composite measure (the Theil index) to evaluate the health equity implications of the RWHAP for each of four subgroups (based on race and ethnicity, age, gender, and HIV transmission category) and two outcomes (probability of being in care and treatment and probability of being virally suppressed). We compare results with the RWHAP fully funded versus a counterfactual scenario, in which the medical and support services funded by the RWHAP are not available. RESULTS: The model indicates the RWHAP will improve health equity across all demographic subgroups and outcomes over a 10-year horizon. In Year 10, the Theil index for race and ethnicity is 99% lower for both outcomes under the RWHAP compared to the non-RWHAP scenario; 71-93% lower across HIV transmission categories; 31-44% lower for age; and 73-75% lower for gender. CONCLUSION: Given the large number of people served by the RWHAP and our findings on its impact on equity, the RWHAP represents an important vehicle for achieving the health equity goals of the National HIV/AIDS Strategy (2022-2025) and the Ending the HIV Epidemic Initiative goal of reducing new infections by 90% by 2030.
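
As a concrete illustration of the equity measure used above (our sketch, not code from the article): the Theil T index across subgroups can be computed from each subgroup's population share and mean outcome, and equals 0 under perfect equity. The subgroup shares and rates below are invented.

    import math

    def theil_index(pop_shares, outcome_rates):
        """Theil T index across subgroups (0 = perfect equity).

        pop_shares: population share p_i of each subgroup (sums to 1).
        outcome_rates: mean outcome y_i per subgroup, e.g. probability
        of viral suppression.
        """
        y_bar = sum(p * y for p, y in zip(pop_shares, outcome_rates))
        return sum(p * (y / y_bar) * math.log(y / y_bar)
                   for p, y in zip(pop_shares, outcome_rates) if y > 0)

    # Hypothetical race/ethnicity subgroups under two scenarios.
    shares = [0.40, 0.30, 0.20, 0.10]
    funded = [0.88, 0.87, 0.86, 0.88]    # near-equal outcomes
    unfunded = [0.70, 0.55, 0.50, 0.65]  # wider gaps
    print(theil_index(shares, funded))    # near 0 (more equitable)
    print(theil_index(shares, unfunded))  # larger (less equitable)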


HIV Infections , Health Equity , United States Health Resources and Services Administration , Humans , Male , Female , HIV Infections/prevention & control , HIV Infections/epidemiology , Adult , Middle Aged , Adolescent , Young Adult , United States , Aged , Child , Child, Preschool , Aged, 80 and over , Infant , White
2.
J Gen Intern Med ; 38(Suppl 3): 805-813, 2023 07.
Article En | MEDLINE | ID: mdl-37340257

BACKGROUND: Travel is a major barrier to healthcare access for Veterans Affairs (VA) patients, and disproportionately affects rural Veterans (approximately one quarter of Veterans). The CHOICE and MISSION Acts are intended to increase timeliness of care and decrease travel, although neither effect has been clearly demonstrated, and the impact on outcomes remains unclear. Increased reliance on community care raises VA costs and increases care fragmentation. Retaining Veterans within the VA is a high priority, and reducing travel burden will help achieve this goal. Sleep medicine is presented as a use case to quantify travel-related barriers. OBJECTIVE: Observed and excess travel distances are proposed as two measures of healthcare access, allowing quantification of the travel burden associated with healthcare delivery. A telehealth initiative that reduced travel burden is presented. DESIGN: Retrospective, observational study utilizing administrative data. SUBJECTS: VA patients who received sleep-related care between 2017 and 2021. In-person encounters: office visits and polysomnograms; telehealth encounters: virtual visits and home sleep apnea tests (HSATs). MAIN MEASURES: Observed distance: distance between the Veteran's home and the treating VA facility. Excess distance: difference between where the Veteran received care and the nearest VA facility offering the service of interest. Avoided distance: distance between the Veteran's home and the nearest VA facility offering the in-person equivalent of a telehealth service. KEY RESULTS: In-person encounters peaked between 2018 and 2019 and have trended downward since, while telehealth encounters have increased. During the 5-year period, Veterans traveled an excess 14.1 million miles, while 10.9 million miles of travel were avoided through telehealth encounters and 48.4 million miles through HSAT devices. CONCLUSIONS: Veterans often experience a substantial travel burden when seeking medical care. Observed and excess travel distances are valuable measures for quantifying this major barrier to healthcare access. These measures allow assessment of novel healthcare approaches to improve Veteran healthcare access and identify specific regions that may benefit from additional resources.
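
A minimal sketch of the three distance measures defined above (ours, not the authors' code), assuming great-circle (haversine) distance as a stand-in for whatever distance computation the study actually used; the facility lists and coordinates are hypothetical (lat, lon) pairs.

    from math import radians, sin, cos, asin, sqrt

    def haversine_miles(a, b):
        """Great-circle distance in miles between (lat, lon) points."""
        lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
        h = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 3958.8 * asin(sqrt(h))

    def observed_distance(home, treating_facility):
        return haversine_miles(home, treating_facility)

    def excess_distance(home, treating_facility, facilities_with_service):
        nearest = min(haversine_miles(home, f) for f in facilities_with_service)
        return observed_distance(home, treating_facility) - nearest

    def avoided_distance(home, facilities_with_in_person_equivalent):
        # Distance not traveled because the encounter was telehealth.
        return min(haversine_miles(home, f)
                   for f in facilities_with_in_person_equivalent)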


Telemedicine , Veterans , Humans , Health Services Accessibility , Retrospective Studies , Travel , Travel-Related Illness , United States/epidemiology , United States Department of Veterans Affairs , Veterans Health
3.
Clin Infect Dis ; 73(9): 1735-1741, 2021 11 02.
Article En | MEDLINE | ID: mdl-33462589

Universities are faced with decisions on how to resume campus activities while mitigating severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) risk. To provide guidance for these decisions, we developed an agent-based network model of SARS-CoV-2 transmission to assess the potential impact of strategies to reduce outbreaks. The model incorporates important features related to risk at the University of California San Diego. We found that structural interventions for housing (singles only) and instructional changes (from in-person to hybrid with class size caps) can substantially reduce the basic reproduction number, but masking and social distancing are required to reduce it to 1 or below. Within a risk mitigation scenario, increasing the frequency of asymptomatic testing from monthly to twice weekly has minimal impact on average outbreak size (1.1-1.9) but substantially reduces the maximum outbreak size and cumulative number of cases. We conclude that an interdependent approach incorporating risk mitigation, viral detection, and public health intervention is required to mitigate risk.
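
A toy sketch of the kind of agent-based network step such a model performs (not the UCSD model itself, which is far richer): contacts form a random graph, and interventions such as masking and distancing scale the per-contact transmission probability. All parameter values below are invented for illustration.

    import random
    import networkx as nx

    def outbreak_size(n=1000, mean_contacts=8, p_transmit=0.05,
                      masking=0.5, distancing=0.7, days=100, seed=1):
        """Cumulative cases from one stochastic outbreak; masking and
        distancing multiply the per-contact transmission probability."""
        random.seed(seed)
        g = nx.gnm_random_graph(n, n * mean_contacts // 2, seed=seed)
        p = p_transmit * masking * distancing
        infected, recovered = {0}, set()
        for _ in range(days):
            new = {v for u in infected for v in g.neighbors(u)
                   if v not in infected and v not in recovered
                   and random.random() < p}
            recovered |= infected
            infected = new
            if not infected:
                break
        return len(recovered)

    print(outbreak_size())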


COVID-19 , Universities , Basic Reproduction Number , Disease Outbreaks/prevention & control , Humans , SARS-CoV-2
4.
J Acquir Immune Defic Syndr ; 86(2): 174-181, 2021 02 01.
Article En | MEDLINE | ID: mdl-33093330

BACKGROUND: With an annual budget of more than $2 billion, the Health Resources and Services Administration's Ryan White HIV/AIDS Program (RWHAP) is the third largest source of public funding for HIV care and treatment in the United States, yet little analysis has been done to quantify the long-term public health and economic impacts of the federal program. METHODS: Using an agent-based, stochastic model, we estimated health care costs and outcomes over a 50-year period in the presence of the RWHAP relative to those expected to prevail if the comprehensive and integrated system of medical and support services funded by the RWHAP were not available. We made the conservative assumption that, in the absence of the RWHAP, only uninsured clients would lose access to these medical and support services. RESULTS: The model predicts that the proportion of people with HIV who are virally suppressed would be 25.2 percentage points higher in the presence of the RWHAP (82.6 percent versus 57.4 percent without the RWHAP). The number of new HIV infections would be 18 percent (190,197) lower, the number of deaths among people with HIV would be 31 percent (267,886) lower, the number of quality-adjusted life years would be 2.7 percent (5.6 million) higher, and cumulative health care costs would be 25 percent ($165 billion) higher in the presence of the RWHAP relative to the counterfactual. Based on these results, the RWHAP has an incremental cost-effectiveness ratio of $29,573 per quality-adjusted life year gained compared with the non-RWHAP scenario. Sensitivity analysis indicates that the probability of transmitting HIV via male-to-male sexual contact and the cost of antiretroviral medications have the largest effect on the cost-effectiveness of the program. CONCLUSIONS: The RWHAP would be considered very cost-effective by the standard guideline of a cost per quality-adjusted life year below the per capita gross domestic product of the United States. The results suggest that the RWHAP plays a critical and cost-effective role in the United States' public health response to the HIV epidemic.
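
The reported ratio follows from the standard ICER definition; a back-of-the-envelope check using the rounded figures in the abstract (the model's exact outputs differ slightly):

    \[
    \mathrm{ICER}
      = \frac{C_{\text{RWHAP}} - C_{\text{no RWHAP}}}
             {\mathrm{QALY}_{\text{RWHAP}} - \mathrm{QALY}_{\text{no RWHAP}}}
      \approx \frac{\$165 \text{ billion}}{5.6 \text{ million QALYs}}
      \approx \$29{,}500 \text{ per QALY gained,}
    \]

consistent with the reported $29,573 per quality-adjusted life year.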


Cost-Benefit Analysis , Delivery of Health Care/economics , HIV Infections/drug therapy , Health Care Costs , United States Health Resources and Services Administration , Anti-Retroviral Agents/therapeutic use , HIV Infections/economics , Humans , Male , Patient Protection and Affordable Care Act/economics , United States , United States Health Resources and Services Administration/statistics & numerical data
5.
Am J Kidney Dis ; 77(3): 397-405, 2021 03.
Article En | MEDLINE | ID: mdl-32890592

Kidney disease is a common, complex, costly, and life-limiting condition. Most kidney disease registries or information systems have been limited to single institutions or regions. A national US Department of Veterans Affairs (VA) Renal Information System (VA-REINS) was recently developed. We describe its creation and present key initial findings related to chronic kidney disease (CKD) without kidney replacement therapy (KRT). Data from the VA's Corporate Data Warehouse were processed and linked with national Medicare data for patients with CKD receiving KRT. Operational definitions for VA user, CKD, acute kidney injury, and kidney failure were developed. Among 7 million VA users in fiscal year 2014, CKD was identified using either a strict or liberal operational definition in 1.1 million (16.4%) and 2.5 million (36.3%) veterans, respectively. Most were identified using an estimated glomerular filtration rate laboratory phenotype, some through proteinuria assessment, and very few through International Classification of Diseases, Ninth Revision coding. The VA spent ∼$18 billion for the care of patients with CKD without KRT, most of which was for CKD stage 3, with higher per-patient costs by CKD stage. VA-REINS can be leveraged for disease surveillance, population health management, and improving the quality and value of care, thereby enhancing VA's capacity as a patient-centered learning health system for US veterans.


Health Care Costs/statistics & numerical data , Health Expenditures/statistics & numerical data , Renal Insufficiency, Chronic/economics , Veterans , Adult , Aged , Aged, 80 and over , Ambulatory Care/economics , Drug Costs , Female , Hospitalization/economics , Humans , Male , Middle Aged , Prevalence , Renal Insufficiency, Chronic/epidemiology , United States/epidemiology , United States Department of Veterans Affairs , Young Adult
6.
J Acquir Immune Defic Syndr ; 86(2): 164-173, 2021 02 01.
Article En | MEDLINE | ID: mdl-33109934

BACKGROUND: The Health Resources and Services Administration's Ryan White HIV/AIDS Program provides services to more than half of all people diagnosed with HIV in the United States. We present and validate a mathematical model that can be used to estimate the long-term public health and cost impact of the federal program. METHODS: We developed a stochastic, agent-based model that reflects the current HIV epidemic in the United States. The model simulates each individual's progression along the HIV care continuum, using 2 network-based mechanisms for HIV transmission: injection drug use and sexual contact. To test the validity of the model, we calculated HIV incidence, mortality, life expectancy, and lifetime care costs and compared the results with external benchmarks. RESULTS: The estimated HIV incidence rate for men who have sex with men (502 per 100,000 person-years), mortality rate of all people diagnosed with HIV (1663 per 100,000 person-years), average life expectancy for individuals with low CD4 counts not on antiretroviral therapy (1.52-3.78 years), and lifetime costs ($362,385) all met our validity criterion of falling within 15% of external benchmarks. CONCLUSIONS: The model represents a complex HIV care delivery system rather than a single intervention, which required developing solutions to several challenges, such as calculating need for and receipt of multiple services and estimating their impact on care retention and viral suppression. Our strategies to address these methodological challenges produced a valid model for assessing the cost-effectiveness of the Ryan White HIV/AIDS Program.


Cost-Benefit Analysis , HIV Infections/drug therapy , United States Health Resources and Services Administration , Anti-Retroviral Agents/economics , Anti-Retroviral Agents/therapeutic use , Continuity of Patient Care , HIV Infections/mortality , HIV Infections/transmission , Humans , Models, Theoretical , Mortality , United States
7.
Palliat Med ; 33(4): 457-461, 2019 04.
Article En | MEDLINE | ID: mdl-30747040

BACKGROUND: Chronic kidney disease palliative care guidelines would benefit from more diverse and objectively defined health status measures. AIM: To identify high-risk patients from administrative data and facilitate timely and uniform palliative care involvement. DESIGN: Retrospective cohort study. SETTING/PARTICIPANTS: In total, 45,368 Veterans with chronic kidney disease Stage 3, 4, or 5 were monitored for up to 6 years and categorized into three groups based on whether they died, started dialysis, or avoided both outcomes. RESULTS: Patients' appointment utilization was a significant predictor of both outcomes and separated individuals into low, medium, and high appointment utilizers. Among low appointment utilizers, the risk of death did not change significantly, while the risk of dialysis increased. Medium appointment utilizers had a stable risk of death and a decreasing risk of dialysis. Substantial appointment utilization (above 31 visits during the baseline year) helped high-risk patients avoid both outcomes of interest, death and dialysis. CONCLUSION: Our model could justify a novel trigger for introducing palliative care, as patients with medium demand for care may benefit from additional palliative care evaluation. The trigger could help standardize preparations for conservative treatment, prompt messages to the managing physician when a patient crosses the threshold between low and medium appointment utilization, and aid in system-level policy development. Furthermore, our results highlight the benefit of substantial appointment utilization among high-risk patients.


Appointments and Schedules , Palliative Care , Aged , Aged, 80 and over , Female , Humans , Kidney Failure, Chronic/therapy , Male , Middle Aged , Renal Dialysis , Retrospective Studies , United States
8.
Ophthalmic Physiol Opt ; 36(1): 60-8, 2016 Jan.
Article En | MEDLINE | ID: mdl-26307152

PURPOSE: A recent randomised controlled trial indicated that providing long-term multifocal wearers with a pair of distance single-vision spectacles for use outside the home reduced falls risk in active older people. However, it also found that participants disliked continually switching between two pairs of glasses, and adherence to the intervention was poor. In this study we determined whether intermediate addition multifocals (which could be worn most of the time inside and outside the home, thus avoiding continual switching) could provide gait safety on stairs similar to distance single-vision spectacles whilst also providing adequate 'short-term' reading and near vision. METHODS: Fourteen healthy long-term multifocal wearers completed stair ascent and descent trials over a 3-step staircase wearing intermediate and full addition bifocals, progressive-addition lenses (PALs), and single-vision distance spectacles. Gait safety/caution was assessed using foot clearance measurements (toe on ascent, heel on descent) over the step edges and ascent and descent duration. Binocular near visual acuity, critical print size and reading speed were measured using Bailey-Lovie near charts and MNRead charts at 40 cm. RESULTS: Gait safety/caution measures were worse with full addition bifocals and PALs than with intermediate bifocals and PALs. The intermediate PALs provided gait ascent/descent measures similar to those with distance single-vision spectacles. The intermediate addition PALs also provided good reading ability: near word acuity and MNRead critical print size were better with the intermediate addition PALs than with the single-vision lenses (p < 0.0001), with a mean near visual acuity of 0.24 ± 0.13 logMAR (~N5.5), which is satisfactory for most near vision tasks performed for a short period of time. CONCLUSIONS: The better ability to 'spot read' with the intermediate addition PALs compared to single-vision spectacles suggests that elderly individuals might better comply with the use of intermediate addition PALs outside the home. The lack of difference in gait parameters between the intermediate addition PALs and distance single-vision spectacles suggests they could be used to help prevent falls in older, well-adapted full addition PAL wearers. A randomised controlled trial to investigate the usefulness of intermediate multifocals in preventing falls seems warranted.


Accidental Falls/prevention & control , Eyeglasses , Presbyopia/rehabilitation , Walking/physiology , Aged , Female , Gait/physiology , Humans , Male , Middle Aged , Myopia , Reading , Vision, Binocular/physiology , Visual Acuity/physiology
9.
Math Biosci Eng ; 11(6): 1449-64, 2014 Dec.
Article En | MEDLINE | ID: mdl-25365600

A study of the pharmacokinetics-pharmacodynamics (PKPD) of antibiotics and their interaction with bacteria during peritoneal dialysis-associated peritonitis (PDAP) is presented. We propose a mathematical model describing the evolution of the bacteria population in the presence of antibiotics for different peritoneal dialysis regimens. Using the model along with experimental data, clinical parameters, and physiological values, we compute variations in PD fluid distributions, drug concentrations, and the number of bacteria in the peritoneal and extra-peritoneal cavities. Scheduling algorithms for the PD exchanges that minimize the bacteria count are investigated.
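
A minimal sketch of a coupled PKPD system of the kind described, with invented parameter values and a simple saturable (Hill-type) kill term; the paper's actual model additionally tracks fluid distribution across compartments and dialysis exchanges.

    from scipy.integrate import solve_ivp

    def pkpd(t, y, ke=0.3, growth=0.8, emax=2.0, ec50=4.0):
        """y = [C, B]: drug concentration (mg/L), bacteria (CFU/mL).

        dC/dt: first-order drug elimination from the peritoneal cavity.
        dB/dt: exponential growth minus concentration-dependent kill.
        """
        c, b = y
        kill = emax * c / (ec50 + c)
        return [-ke * c, (growth - kill) * b]

    # One 6-hour dwell starting at 25 mg/L drug and 1e6 CFU/mL bacteria.
    sol = solve_ivp(pkpd, (0.0, 6.0), [25.0, 1e6])
    print(sol.y[1, -1])  # bacteria remaining at the end of the dwell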


Anti-Bacterial Agents/pharmacology , Bacteria/growth & development , Models, Biological , Peritoneal Dialysis/adverse effects , Peritonitis/microbiology , Adult , Anti-Bacterial Agents/administration & dosage , Anti-Bacterial Agents/therapeutic use , Colony Count, Microbial , Humans , Male , Peritonitis/drug therapy
10.
Exp Gerontol ; 55: 152-8, 2014 Jul.
Article En | MEDLINE | ID: mdl-24768822

BACKGROUND: Falls sustained when descending stairs are the leading cause of accidental death in older adults. Highly visible edge highlighters/friction strips (often set back from the tread edge) are sometimes used to improve stair safety, but there is no evidence for the usefulness of either. OBJECTIVE: To determine whether an edge highlighter and its location relative to the tread edge affect foot placement/clearance and accidental foot contacts when descending stairs. METHOD: Sixteen older adults (mean±1SD age; 71±7years) with normal vision (experiment 1) and eight young adults (mean±1SD age; 24±4years) with visual impairment due to simulated age-related cataract (experiment 2) completed step descent trials during which a high contrast edge highlighter was either not present, placed flush with the tread edge, or set back from the edge by 10mm or 30mm. Foot placement/clearance and the number of accidental foot contacts were compared across conditions. RESULTS: In experiment 1, a highlighter set back by 30mm led to a reduction in final foot placement (p<0.001) and foot clearance (p<0.001) compared to a highlighter placed flush with the tread edge, and the percentage of foot clearances that were less than 5mm increased from 2% (abutting) to 17% (away30). In experiment 2, a highlighter placed flush with the tread edge led to a decrease in within-subject variability in final foot placement (p=0.004) and horizontal foot clearance (p=0.022), a decrease in descent duration (p=0.009), and a decrease in the number of low clearances (<5mm, from 8% to 0%) and the number of accidental foot contacts (15% to 3%) when compared to a tread edge with no highlighter present. CONCLUSIONS: Changes to foot clearance parameters as a result of highlighter presence and position suggest that stairs with high-contrast edge highlighters positioned flush with the tread edge will improve safety on stairs, particularly for those with age-related visual impairment.


Accident Prevention/methods , Accidental Falls/prevention & control , Safety , Vision, Low/physiopathology , Adult , Aged , Aging/physiology , Cataract/complications , Female , Foot/physiology , Gait/physiology , Humans , Male , Proprioception/physiology , Vision, Low/etiology , Young Adult
11.
JAMA Intern Med ; 174(3): 391-7, 2014 Mar.
Article En | MEDLINE | ID: mdl-24424348

IMPORTANCE: Older adults are often excluded from clinical trials. The benefit of preventive interventions tested in younger trial populations may be reduced when applied to older adults in the clinical setting if they are less likely to survive long enough to experience those outcomes targeted by the intervention. OBJECTIVE: To extrapolate a treatment effect similar to those reported in major randomized clinical trials of angiotensin-converting enzyme inhibitors and angiotensin II receptor blockers for prevention of end-stage renal disease (ESRD) to a real-world population of older patients with chronic kidney disease. DESIGN, SETTING, AND PARTICIPANTS: Simulation study in a retrospective cohort conducted in Department of Veterans Affairs medical centers. We included 371,470 patients 70 years or older with chronic kidney disease. EXPOSURE: Level of estimated glomerular filtration rate (eGFR) and proteinuria. MAIN OUTCOMES AND MEASURES: Among members of this cohort, we evaluated the expected effect of a 30% reduction in relative risk on the number needed to treat (NNT) to prevent 1 case of ESRD over a 3-year period. These limits were selected to mimic the treatment effect achieved in major trials of angiotensin-converting enzyme inhibitors and angiotensin II receptor blockers for prevention of ESRD. These trials have reported relative risk reductions of 23% to 56% during observation periods of 2.6 to 3.4 years, yielding NNTs to prevent 1 case of ESRD of 9 to 25. RESULTS: The NNT to prevent 1 case of ESRD among members of this cohort ranged from 16 in patients with the highest baseline risk (eGFR of 15-29 mL/min/1.73 m² with a dipstick proteinuria measurement of ≥ 2+) to 2500 for those with the lowest baseline risk (eGFR of 45-59 mL/min/1.73 m² with negative or trace proteinuria and eGFR of ≥ 60 mL/min/1.73 m² with dipstick proteinuria measurement of 1+). Most patients belonged to groups with an NNT of more than 100, even when the exposure time was extended over 10 years and in all sensitivity analyses. CONCLUSIONS AND RELEVANCE: Differences in baseline risk and life expectancy between trial subjects and real-world populations of older adults with CKD may reduce the marginal benefit to individual patients of interventions to prevent ESRD.
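
The NNT arithmetic behind these figures is standard: for a fixed relative risk reduction (RRR), the NNT is the reciprocal of the absolute risk reduction, so it is driven entirely by baseline risk:

    \[
    \mathrm{NNT} = \frac{1}{\mathrm{ARR}}
                 = \frac{1}{R_{\text{baseline}} \times \mathrm{RRR}} .
    \]

For example, an NNT of 16 under a 30% RRR implies a 3-year baseline ESRD risk near \(1/(16 \times 0.30) \approx 21\%\), while an NNT of 2500 implies a baseline risk of only about 0.13%.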


Clinical Trials as Topic , Kidney Failure, Chronic/prevention & control , Aged , Aged, 80 and over , Angiotensin-Converting Enzyme Inhibitors/pharmacology , Angiotensin-Converting Enzyme Inhibitors/therapeutic use , Computer Simulation , Female , Glomerular Filtration Rate/drug effects , Humans , Kidney Failure, Chronic/drug therapy , Male , Outcome Assessment, Health Care , Retrospective Studies , Risk Factors , United States
12.
Shock ; 41(1): 3-11, 2014 Jan.
Article En | MEDLINE | ID: mdl-24346647

Given that the leading clinical conditions associated with acute kidney injury (AKI), namely, sepsis, major surgery, heart failure, and hypovolemia, are all associated with shock, it is tempting to attribute all AKI to ischemia on the basis of macrohemodynamic changes. However, an increasing body of evidence has suggested that in many patients, AKI can occur in the absence of overt signs of global renal hypoperfusion. Indeed, sepsis-induced AKI can occur in the setting of normal or even increased renal blood flow. Accordingly, renal injury may not be entirely explained solely on the basis of the classic paradigm of hypoperfusion, and thus other mechanisms must come into play. Herein, we put forward a "unifying theory" to explain the interplay between inflammation and oxidative stress, microvascular dysfunction, and the adaptive response of the tubular epithelial cell to the septic insult. We propose that this response is mostly adaptive in origin, that it is driven by mitochondria, and that it ultimately results in and explains the clinical phenotype of sepsis-induced AKI.


Acute Kidney Injury/etiology , Inflammation/complications , Kidney Tubules/physiopathology , Sepsis/complications , Acute Kidney Injury/physiopathology , Adaptation, Physiological/physiology , Animals , Energy Metabolism/physiology , Glomerular Filtration Rate/physiology , Humans , Microcirculation/physiology , Renal Circulation/physiology , Sepsis/physiopathology
13.
J Vis ; 13(14)2013 Dec 04.
Article En | MEDLINE | ID: mdl-24306853

Perceived time is inherently malleable. For example, adaptation to relatively long or short sensory events leads to a repulsive aftereffect such that subsequent events appear to be contracted or expanded (duration adaptation). Perceived visual duration can also be distorted via concurrent presentation of discrepant auditory durations (multisensory integration). The neural loci of both distortions remain unknown. In the current study we use a psychophysical approach to establish their relative positioning within the sensory processing hierarchy. We show that audiovisual integration induces marked distortions of perceived visual duration. We proceed to use these distorted durations as visual adapting stimuli yet find subsequent visual duration aftereffects to be consistent with physical rather than perceived visual duration. Conversely, the concurrent presentation of adapted auditory durations with nonadapted visual durations results in multisensory integration patterns consistent with perceived, rather than physical, auditory duration. These results demonstrate that recent sensory history modifies human duration perception prior to the combination of temporal information across sensory modalities and provides support for adaptation mechanisms mediated by duration selective neurons situated in early areas of the visual and auditory nervous system (Aubie, Sayegh, & Faure, 2012; Duysens, Schaafsma, & Orban, 1996; Leary, Edwards, & Rose, 2008).


Illusions/physiology , Neural Pathways/physiology , Time Perception/physiology , Visual Perception/physiology , Adult , Female , Humans , Male
14.
BMC Med Inform Decis Mak ; 13: 102, 2013 Sep 04.
Article En | MEDLINE | ID: mdl-24007376

BACKGROUND: Medical care commonly involves the apprehension of complex patterns of patient derangements to which the practitioner responds with patterns of interventions, as opposed to single therapeutic maneuvers. This complexity renders the objective assessment of practice patterns using conventional statistical approaches difficult. METHODS: Combinatorial approaches drawn from symbolic dynamics are used to encode the observed patterns of patient derangement and associated practitioner response patterns as sequences of symbols. Concatenating each patient derangement symbol with the contemporaneous practitioner response symbol creates "words" encoding the simultaneous patient derangement and provider response patterns and yields an observed vocabulary with quantifiable statistical characteristics. RESULTS: A fundamental observation in many natural languages is the existence of a power law relationship between the rank order of word usage and the absolute frequency with which particular words are uttered. We show that population-level usage patterns of these derangement-response words in two entirely unrelated domains of medical care display power law relationships similar to those of natural languages, and that, in one of these domains, power law behavior at the population level reflects power law behavior at the level of individual practitioners. CONCLUSIONS: Our results suggest that patterns of medical care can be approached using quantitative linguistic techniques, a finding that has implications for the assessment of expertise, machine learning identification of optimal practices, and construction of bedside decision support tools.
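
A minimal sketch of the rank-frequency analysis described (a log-log fit of Zipf's law), using an invented table of derangement:response "words"; the paper's symbolic-encoding pipeline is not reproduced here.

    from collections import Counter
    import numpy as np

    def zipf_exponent(words):
        """Fit frequency ~ rank**(-s) by least squares in log-log space."""
        counts = sorted(Counter(words).values(), reverse=True)
        ranks = np.arange(1, len(counts) + 1)
        slope, _ = np.polyfit(np.log(ranks), np.log(counts), 1)
        return -slope  # natural languages typically give s near 1

    # Hypothetical derangement:response words from encounter records.
    corpus = (["hypotension:fluids"] * 40 + ["hypoxia:oxygen"] * 20
              + ["fever:antibiotics"] * 13 + ["oliguria:diuretic"] * 10
              + ["anemia:transfusion"] * 8)
    print(zipf_exponent(corpus))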


Language , Practice Patterns, Physicians' , Symptom Assessment/psychology , Verbal Behavior , Databases, Factual/statistics & numerical data , Humans , Vocabulary
15.
J Nephrol ; 26(1): 3-15, 2013.
Article En | MEDLINE | ID: mdl-23065915

The dynamics of health and health care provision in the United States vary substantially across regions, with marked heterogeneity in population density, age distribution, disease prevalence, race and ethnicity, poverty, and the ability to access care. Geocoding and geographic information systems (GIS) are important tools to link patient or population location to information regarding these characteristics. In this review, we provide an overview of basic GIS concepts and provide examples to illustrate how GIS techniques have been applied to the study of kidney disease, in particular to understanding the interplay between race, poverty, rural residence, and the planning of renal services for this population. The interplay of socioeconomic status and renal disease outcomes remains an important area for investigation, and recent publications have explored this relationship using GIS techniques to incorporate measures of socioeconomic status and the racial composition of neighborhoods. In addition, there are many potential challenges in providing care to rural patients with chronic kidney disease, including long travel times and sparse renal services such as transplant and dialysis centers. Geospatially fluent analytic approaches can also inform system-level analyses of health care delivery and can be applied to identify an optimal distribution of dialysis facilities. GIS analysis could help untangle the complex interplay between geography, socioeconomic status, and racial disparities in chronic kidney disease, and could inform policy decisions and resource allocation as the population ages and the prevalence of renal disease increases.


Geographic Information Systems , Health Services Needs and Demand/trends , Healthcare Disparities/ethnology , Renal Insufficiency, Chronic/ethnology , Rural Health Services/supply & distribution , Forecasting , Health Services Accessibility , Humans , Socioeconomic Factors , United States/epidemiology
16.
Curr Opin Crit Care ; 18(6): 599-606, 2012 Dec.
Article En | MEDLINE | ID: mdl-23079618

PURPOSE OF REVIEW: The number of individuals with chronic kidney disease (CKD) and end-stage renal disease (ESRD) is rising, and these individuals often require intensive care. RECENT FINDINGS: Patients with CKD and ESRD require critical care more frequently than those without these conditions and do so for similar reasons as the general population. However, the burden of comorbidities, overall severity of illness as assessed by standard scoring systems, and mortality are higher in patients with ESRD than in the non-ESRD critically ill. After adjustment for demographics, comorbidities, and physiologic variables, the increased mortality risk in patients with ESRD is attenuated. In comparison to patients with dialysis-requiring acute kidney injury (AKI), critically ill patients with ESRD have a more favorable prognosis. Severity of illness scoring systems such as the Acute Physiology and Chronic Health Evaluation and the Simplified Acute Physiology Score tend to overestimate the risk of death in critically ill ESRD patients. ICU admission does not appear to dramatically affect long-term mortality in those with ESRD who survive their initial acute illness, as compared with ESRD patients without critical illness. SUMMARY: Despite the manifest physiologic derangements attending CKD/ESRD, a higher burden of comorbid conditions and a greater severity of illness on presentation account for much of the increased mortality. There is no justification for therapeutic nihilism in this population.


Critical Care/methods , Kidney Failure, Chronic/pathology , Kidney Failure, Chronic/therapy , Comorbidity , Hospital Mortality , Humans , Intensive Care Units , Kidney Failure, Chronic/mortality , Prognosis , Severity of Illness Index
19.
Proc Biol Sci ; 279(1729): 690-8, 2012 Feb 22.
Article En | MEDLINE | ID: mdl-21831897

The task of deciding how long sensory events seem to last is one that the human nervous system appears to perform rapidly and, for sub-second intervals, seemingly without conscious effort. That these estimates can be performed within and between multiple sensory and motor domains suggests that time perception forms one of the core, fundamental processes of our perception of the world around us. Given this significance, the current paucity in our understanding of how this process operates is surprising. One candidate mechanism for duration perception posits that duration may be mediated via a system of duration-selective 'channels', which are differentially activated depending on the match between afferent duration information and the channels' 'preferred' duration. However, this model awaits experimental validation. In the current study, we use the technique of sensory adaptation and present data that are well described by banks of duration channels that are limited in their bandwidth, sensory-specific, and appear to operate at a relatively early stage of visual and auditory sensory processing. Our results suggest that many of the computational principles the nervous system applies to the coding of visual spatial and auditory spectral information are common to its processing of temporal extent.
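
One common way to formalize such a channel bank (our hedged reading, not equations from the paper): each channel i has a log-Gaussian tuning curve over duration d, and perceived duration is a gain-weighted average of the channels' preferred durations; adaptation lowers the gain g_i of channels tuned near the adapted duration, producing the repulsive aftereffect.

    \[
    r_i(d) = g_i \exp\!\left( -\frac{(\ln d - \ln d_i)^2}{2\sigma^2} \right),
    \qquad
    \hat{d} = \frac{\sum_i r_i(d)\, d_i}{\sum_i r_i(d)} .
    \]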


Models, Theoretical , Time Perception , Acoustic Stimulation , Adaptation, Physiological , Adult , Auditory Perception , Humans , Visual Perception
20.
Adv Exp Med Biol ; 696: 401-10, 2011.
Article En | MEDLINE | ID: mdl-21431580

Peritoneal dialysis-associated peritonitis (PDAP) can be treated using very different regimens of antimicrobial administration, which result in different pharmacokinetic outcomes and systemic exposure levels. Currently, there is no population-level pharmacokinetic framework germane to the treatment of PDAP. We coupled a differential-equation-based model of antimicrobial kinetics to a Monte Carlo simulation framework and conducted "in silico" clinical trials to explore the anticipated effects of different antimicrobial dosing regimens on relevant pharmacokinetic parameters (AUC/MIC and time above 5 × MIC) and the level of systemic exposure.
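
A minimal Monte Carlo sketch of the approach (invented lognormal parameter distributions and dosing values; the paper couples sampling to a full differential-equation dwell model): sample patient pharmacokinetics, superpose repeated doses, and report the two targets named above.

    import numpy as np

    rng = np.random.default_rng(0)

    def trial(n=5000, dose=15.0, interval=24.0, mic=2.0,
              horizon=144.0, dt=0.1):
        """Mean 24-h AUC/MIC and % time above 5*MIC across n simulated
        patients (one-compartment model, repeated IV-bolus dosing)."""
        ke = rng.lognormal(np.log(0.10), 0.3, size=n)  # elimination, 1/h
        vd = rng.lognormal(np.log(10.0), 0.2, size=n)  # volume, L
        t = np.arange(0.0, horizon, dt)
        c = np.zeros((n, t.size))
        for td in np.arange(0.0, horizon, interval):   # superpose boluses
            c += ((dose / vd)[:, None]
                  * np.exp(-ke[:, None] * np.clip(t - td, 0.0, None))
                  * (t >= td))
        auc_per_day = c.sum(axis=1) * dt / (horizon / 24.0)
        frac_above = (c > 5 * mic).mean(axis=1) * 100
        return (auc_per_day / mic).mean(), frac_above.mean()

    print(trial())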


Anti-Infective Agents/pharmacokinetics , Peritonitis/drug therapy , Peritonitis/metabolism , Anti-Infective Agents/administration & dosage , Area Under Curve , Ascitic Fluid/metabolism , Cefazolin/administration & dosage , Cefazolin/pharmacokinetics , Ceftazidime/administration & dosage , Ceftazidime/pharmacokinetics , Clinical Trials as Topic , Computational Biology , Computer Simulation , Dialysis Solutions , Humans , Microbial Sensitivity Tests , Models, Biological , Monte Carlo Method , Peritoneal Dialysis/adverse effects , Peritonitis/etiology , User-Computer Interface
...