ABSTRACT
Importance: Despite guideline recommendations, clinicians do not systematically use prior screening or health history to guide colorectal cancer (CRC) screening decisions in older adults. Objective: To evaluate the effect of a personalized multilevel intervention on screening orders in older adults due for average-risk CRC screening. Design, Setting, and Participants: Interventional 2-group parallel unmasked cluster randomized clinical trial conducted from November 2015 to February 2019 at 2 US Department of Veterans Affairs (VA) facilities: 1 academic VA medical center and 1 of its connected outpatient clinics. Randomization occurred at the primary care physician/clinician (PCP) level, stratified by study site and clinical full-time equivalency. Participants were 431 average-risk, screen-due US veterans aged 70 to 75 years attending a primary care visit. Data analysis was performed from August 2018 to August 2023. Intervention: The intervention group received a multilevel intervention including a decision-aid booklet with detailed information on screening benefits and harms, personalized for each participant based on age, sex, prior screening, and comorbidity. The control group received a multilevel intervention including a screening informational booklet. All participants received PCP education and system-level modifications to support personalized screening. Main Outcomes and Measures: The primary outcome was whether screening was ordered within 2 weeks of the clinic visit. Secondary outcomes were concordance between screening orders and screening benefit, and screening utilization within 6 months. Results: A total of 436 patients were consented, and 431 were analyzed across 67 PCPs. Patients had a mean (SD) age of 71.5 (1.7) years; 424 were male (98.4%); 374 were White (86.8%); 89 were college graduates (21.5%); and 351 (81.4%) had undergone prior screening. A total of 258 (59.9%) were randomized to intervention, and 173 (40.1%) to control. Screening orders were placed for 162 of 258 intervention patients (62.8%) vs 114 of 173 control patients (65.9%) (adjusted difference, -4.0 percentage points [pp]; 95% CI, -15.4 to 7.4 pp). In a prespecified interaction analysis, the proportion receiving orders was lower in the intervention group than in the control group for those in the lowest benefit quartile (59.4% vs 71.1%). In contrast, the proportion receiving orders was higher in the intervention group than in the control group for those in the highest benefit quartile (67.6% vs 52.2%) (interaction P = .049). Fewer intervention patients (106 of 256 [41.4%]) utilized screening overall at 6 months than controls (96 of 173 [55.9%]) (adjusted difference, -13.4 pp; 95% CI, -25.3 to -1.6 pp). Conclusions and Relevance: In this cluster randomized clinical trial, patients who were presented with personalized information about screening benefits and harms in the context of a multilevel intervention were more likely to receive screening orders concordant with benefit and were less likely to utilize screening. Trial Registration: ClinicalTrials.gov Identifier: NCT02027545.
Subjects
Colorectal Neoplasms, Early Detection of Cancer, Humans, Male, Aged, Female, Employment, Colorectal Neoplasms/diagnosis, Ambulatory Care Facilities, Mass Screening
ABSTRACT
OBJECTIVES: To quantify temporal changes in colonoscopy indication and assess appropriateness of surveillance use in older adults. STUDY DESIGN: Retrospective longitudinal study of national Veterans Health Administration (VHA) data on all patients who underwent outpatient colonoscopy from 2005 to 2014. METHODS: After validating an electronic algorithm for classifying colonoscopy indication in VHA, we examined trends in colonoscopy indication over time and across patient characteristics. RESULTS: The proportion of colonoscopies performed for postpolypectomy surveillance increased significantly during the study period, particularly among older patients with limited life expectancy, raising concern for possible overuse. CONCLUSIONS: Guidelines should make clear recommendations about when and how to discontinue postpolypectomy surveillance colonoscopy. Doing so could reduce harms from complications of low-value procedures, moderate demand, and thereby improve procedural access for patients more likely to benefit.
Subjects
Colonoscopy, Colorectal Neoplasms, Aged, Colorectal Neoplasms/diagnosis, Colorectal Neoplasms/epidemiology, Humans, Longitudinal Studies, Retrospective Studies
ABSTRACT
BACKGROUND: Inadequate bowel preparation undermines the quality of colonoscopy, but patients likely to be affected are difficult to identify beforehand. AIMS: This study aimed to develop, validate, and compare prediction models for bowel preparation inadequacy using conventional logistic regression (LR) and random forest machine learning (RFML). METHODS: We created a retrospective cohort of patients who underwent outpatient colonoscopy at a single VA medical center between January 2012 and October 2015. Candidate predictor variables were chosen after a literature review. We extracted all available predictor variables from the electronic medical record and bowel preparation quality from the endoscopy database. The data were split into 70% training and 30% validation sets. Multivariable LR and RFML were used to predict preparation inadequacy as a dichotomous outcome. RESULTS: The cohort included 6,885 Veterans, of whom 964 (14%) had inadequate preparation. Using LR, the area under the receiver operating characteristic curve (AUC) for the validation cohort was 0.66 (95% CI 0.62, 0.69) and the Brier score, in which a lower score indicates better performance, was 0.11. Using RFML, the AUC for the validation cohort was 0.61 (95% CI 0.58, 0.65) and the Brier score was 0.12. CONCLUSIONS: LR and RFML had similar performance in predicting bowel preparation inadequacy, which was modest and likely insufficient for use in practice. Future research is needed to identify additional predictor variables and to test other machine learning algorithms. At present, endoscopy units should focus on universal strategies to enhance preparation adequacy.
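As a hedged illustration of the modeling approach described above (a 70/30 train-validation split, logistic regression vs. random forest, evaluated with AUC and Brier score), the sketch below uses scikit-learn; the file name, column names, and feature set are assumptions, not the study's actual VA data.

```python
# Minimal sketch, assuming a flat analytic file with one row per colonoscopy;
# "inadequate_prep" and the predictor columns are hypothetical names.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score, brier_score_loss

df = pd.read_csv("colonoscopy_cohort.csv")             # hypothetical extract
X = df.drop(columns=["inadequate_prep"])                # candidate predictors
y = df["inadequate_prep"]                               # 1 = inadequate preparation

# 70% training / 30% validation, as described in the abstract
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.30, stratify=y, random_state=0)

models = {"LR": LogisticRegression(max_iter=1000),
          "RF": RandomForestClassifier(n_estimators=500, random_state=0)}
for name, model in models.items():
    model.fit(X_train, y_train)
    p = model.predict_proba(X_val)[:, 1]                # predicted risk of inadequacy
    print(name, "AUC:", round(roc_auc_score(y_val, p), 2),
          "Brier:", round(brier_score_loss(y_val, p), 2))
```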
Subjects
Veterans, Humans, Logistic Models, Machine Learning, Retrospective Studies, Risk Assessment
ABSTRACT
PURPOSE: Stereotactic body radiation therapy (SBRT) use has increased among patients without pathologic confirmation (PC) of lung cancer. Empirical SBRT without PC raises concerns about variation in workup and patient selection, but national trends have not been well described. In this study, we assessed patterns of empirical SBRT use, workup, and causes of death among a large national non-small cell lung cancer (NSCLC) cohort. METHODS AND MATERIALS: We identified 2221 patients treated with SBRT for cT1-T2aN0M0 NSCLC in the Veterans Affairs health care system from 2008 to 2015. We reviewed their pretreatment workup and assessed associations between absence of PC and clinical and demographic factors. We compared causes of death between PC and non-PC groups and used Cox proportional hazards modeling to compare overall survival and lung cancer-specific survival (LCSS) between these groups. RESULTS: Treatment without PC varied from 0% to 61% among Veterans Affairs medical centers with at least 5 cases of stage I NSCLC. Overall, 14.9% of patients were treated without PC, and 8.8% did not have a biopsy attempt. Ten percent of facilities were responsible for almost two-thirds (62%) of cases of treatment without PC. Of non-PC patients, 95.5% had positron emission tomography scans, 40.6% had biopsy procedures attempted, and 12.7% underwent endobronchial ultrasound. Non-PC patients were more likely to have cT1 tumors and to live outside the histoplasmosis belt. Age, sex, smoking status, and Charlson comorbidity index were similar between groups. Lung cancer was the most common cause of death in both groups. Overall survival was similar between groups, whereas non-PC patients had better LCSS (hazard ratio = 0.77, P = .031). CONCLUSIONS: Empirical SBRT use varied widely among institutions, and appropriate radiographic workup was consistently used in this national cohort. Future studies should investigate determinants of variation and reasons for higher LCSS among non-PC patients.
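A rough sketch of the survival comparison described above, using the lifelines library; the file and column names (followup_months, lcss_event, no_pc, and the covariates) are assumptions rather than the study's actual variables.

```python
# Hedged sketch: Cox proportional hazards model comparing lung cancer-specific
# survival between non-PC and PC groups, with a couple of assumed covariates.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("sbrt_cohort.csv")   # hypothetical analytic file
cols = ["followup_months", "lcss_event", "no_pc", "age", "charlson_index"]

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="followup_months", event_col="lcss_event")
cph.print_summary()                    # abstract reports HR = 0.77 for the non-PC group
```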
ABSTRACT
Importance: Inflammatory bowel disease (IBD) is commonly treated with corticosteroids and anti-tumor necrosis factor (TNF) drugs; however, these medications have well-described adverse effects. Prior work suggests that anti-TNF therapy may reduce all-cause mortality compared with prolonged corticosteroid use among Medicare and Medicaid beneficiaries with IBD. Objective: To examine the association between use of anti-TNF or corticosteroids and all-cause mortality in a national cohort of veterans with IBD. Design, Setting, and Participants: This cohort study used a well-established Veterans Health Administration cohort of 2997 patients with IBD treated with prolonged corticosteroids (≥3000 mg prednisone equivalent and/or ≥600 mg of budesonide within a 12-month period) and/or new anti-TNF therapy from January 1, 2006, to October 1, 2015. Data were analyzed between July 1, 2019, and December 31, 2020. Exposures: Use of corticosteroids or anti-TNF. Main Outcomes and Measures: The primary end point was all-cause mortality as defined by the Veterans Health Administration vital status file. Marginal structural modeling was used to compare associations between anti-TNF therapy or corticosteroid use and all-cause mortality. Results: A total of 2997 patients (2725 men [90.9%]; mean [SD] age, 50.0 [17.4] years) were included in the final analysis, 1734 (57.9%) with Crohn disease (CD) and 1263 (42.1%) with ulcerative colitis (UC). All-cause mortality was 8.5% (n = 256) over a mean (SD) of 3.9 (2.3) years' follow-up. At cohort entry, 1836 patients were new anti-TNF therapy users, and 1161 were prolonged corticosteroid users. Anti-TNF therapy use was associated with a lower likelihood of mortality for CD (odds ratio [OR], 0.54; 95% CI, 0.31-0.93) but not for UC (OR, 0.33; 95% CI, 0.10-1.10). In a sensitivity analysis expanding prolonged corticosteroid users to include patients receiving corticosteroids within 90 to 270 days after initiation of anti-TNF therapy, the OR for UC was statistically significant, at 0.33 (95% CI, 0.13-0.84), and the OR for CD was 0.55 (95% CI, 0.33-0.92). Conclusions and Relevance: This study suggests that anti-TNF therapy may be associated with reduced mortality compared with long-term corticosteroid use among veterans with CD, and potentially among those with UC.
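The abstract's marginal structural model handles time-varying exposure and confounding; as a much-simplified, point-treatment illustration of its core idea (inverse-probability-of-treatment weighting followed by a weighted outcome model), the sketch below uses statsmodels. All file and column names, and the confounder set, are assumptions.

```python
# Hedged, simplified sketch of IPTW; robust (sandwich) standard errors are used
# because the weights make observations pseudo-replicated.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("ibd_cohort.csv")   # hypothetical: anti_tnf (1/0), died (1/0), covariates

# 1) Probability of receiving anti-TNF given assumed baseline confounders
ps = smf.logit("anti_tnf ~ age + male + crohn + charlson_index", data=df).fit().predict(df)

# 2) Stabilized inverse-probability-of-treatment weights
p_trt = df["anti_tnf"].mean()
df["sw"] = df["anti_tnf"] * p_trt / ps + (1 - df["anti_tnf"]) * (1 - p_trt) / (1 - ps)

# 3) Weighted outcome model: marginal odds ratio for all-cause mortality
out = smf.glm("died ~ anti_tnf", data=df, family=sm.families.Binomial(),
              freq_weights=df["sw"]).fit(cov_type="HC0")
print(out.summary())
```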
Subjects
Budesonide/therapeutic use, Ulcerative Colitis/drug therapy, Ulcerative Colitis/mortality, Crohn Disease/drug therapy, Crohn Disease/mortality, Glucocorticoids/therapeutic use, Prednisone/therapeutic use, Tumor Necrosis Factor Inhibitors/therapeutic use, Adult, Aged, Aged 80 and over, Cause of Death, Cohort Studies, Female, Humans, Male, Middle Aged, United States, United States Department of Veterans Affairs, Veterans Health, Young Adult
ABSTRACT
BACKGROUND: The Veterans Health Administration (VA) has recently been scrutinized for prolonged wait times for routine medical care, including elective outpatient procedures such as colonoscopy. Wait times for colonoscopy following a positive fecal occult blood test (FOBT) are associated with worse clinical outcomes only if greater than 6 months. OBJECTIVE: We aimed to investigate time trends in wait time for outpatient colonoscopy in VA and factors influencing wait time. DESIGN: Retrospective cohort study using mixed-effects regression of VA administrative data from the Corporate Data Warehouse. PARTICIPANTS: Veterans who underwent outpatient colonoscopy for positive FOBT in 2008-2015 at 124 VA endoscopy facilities. MAIN MEASURES: The main outcome measure was wait time (in days) between positive FOBT and colonoscopy completion, stratified by year and adjusted for sedation type and potentially influential patient- and facility-level factors. KEY RESULTS: In total, 125,866 outpatient colonoscopy encounters for positive FOBT occurred during the study period. The number of colonoscopies for this indication declined slightly over time (17,586 in 2008 vs. 13,245 in 2015; range 13,425-19,814). In 2008, the median wait time across sites was 50 days (interquartile range [IQR] = 33, 75). There was no secular trend in wait times (2015 median = 52 days, IQR = 34, 77). In adjusted analyses, no patient- or facility-level factor had a clinically meaningful effect on wait time. CONCLUSIONS: Wait times for colonoscopy for positive FOBT have been stable over time. Despite the perception of prolonged VA wait times, wait times for outpatient colonoscopy for positive FOBT are well below the threshold at which clinically meaningful differences in patient outcomes have been observed.
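A hedged sketch of the kind of mixed-effects regression described above, with a random intercept for endoscopy facility; it relies on statsmodels, and the file and column names are assumptions.

```python
# Minimal sketch: wait time modeled with fixed effects for year and sedation
# type plus a random intercept per facility. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fobt_colonoscopy_waits.csv")   # hypothetical extract
model = smf.mixedlm("wait_days ~ C(year) + C(sedation_type) + age + charlson_index",
                    data=df, groups=df["facility_id"])
result = model.fit()
print(result.summary())
```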
Subjects
Colorectal Neoplasms, Veterans, Colonoscopy, Humans, Mass Screening, Occult Blood, Outpatients, Retrospective Studies, Veterans Health, Waiting Lists
ABSTRACT
OBJECTIVES: To assess patient preferences for colorectal cancer screening with stool-based tests after initial colonoscopy with suboptimal bowel preparation. METHODS: An online scenario-based survey of adults aged 45 to 75 years at average risk for colorectal cancer was performed. RESULTS: When presented with a hypothetical scenario of screening colonoscopy with suboptimal bowel preparation, 59% of respondents chose stool-based testing as a next step, 29% preferred a repeat colonoscopy within a year, and 12% preferred a repeat colonoscopy in 10 years (N = 1,080). CONCLUSIONS: Clinicians should consider offering stool-based screening tests as an alternative to repeat colonoscopy after suboptimal bowel preparation.
Subjects
Colonoscopy/methods, Colorectal Neoplasms/diagnosis, Early Detection of Cancer/psychology, Mass Screening/psychology, Patient Preference/statistics & numerical data, Aged, Cathartics/administration & dosage, Colonoscopy/psychology, Cross-Sectional Studies, Early Detection of Cancer/methods, Early Detection of Cancer/statistics & numerical data, Female, Humans, Male, Mass Screening/methods, Mass Screening/statistics & numerical data, Middle Aged, Occult Blood, Patient Preference/psychology, Time Factors
ABSTRACT
OBJECTIVE AND DESIGN: The objective of this study was to assess the ability of vamorolone, a first-in-class dissociative steroidal compound, to inhibit inflammation when administered after disease onset in the murine collagen antibody-induced arthritis model. ANIMALS: 84 DBA1/J mice were used in this study (n = 12 per treatment group). TREATMENT: Vamorolone or prednisolone was administered orally after disease onset for a duration of 7 days. METHODS: Disease score and bone erosion were assessed using previously described scoring systems. Cytokines were measured in joints via immunoassay, and joint cathepsin B activity (a marker of inflammation) was assessed using optical imaging of joints in live mice. RESULTS: We found that vamorolone treatment led to a reduction of several disease parameters, including disease score, joint inflammation, and the presence of pro-inflammatory mediators, to a degree similar to that observed with prednisolone treatment. More importantly, histopathological analysis of affected joints showed that vamorolone treatment significantly reduced the degree of bone erosion, whereas this bone-sparing property was not observed with prednisolone treatment at any of the tested doses. CONCLUSIONS: While many intervention regimens in other studies are administered prior to disease onset in animal models, the current study involves delivery of the potential therapeutic after disease onset. Based on these findings, vamorolone may offer an efficacious yet safer alternative to conventional steroidal compounds in the treatment of rheumatoid arthritis and other inflammatory diseases.
Subjects
Anti-Inflammatory Agents/therapeutic use, Experimental Arthritis/drug therapy, Pregnadienediols/therapeutic use, Animals, Monoclonal Antibodies/immunology, Experimental Arthritis/immunology, Experimental Arthritis/pathology, Collagen Type II/immunology, Cytokines/immunology, Joints/drug effects, Joints/immunology, Joints/pathology, Lipopolysaccharides, Male, Inbred DBA Mice
ABSTRACT
OBJECTIVES: Identify predictors of persistence with adalimumab (ADA) among veterans and privately insured patients with inflammatory bowel disease (IBD) in the United States. STUDY DESIGN: Retrospective cohort study. METHODS: Patients with IBD taking ADA as their first biologic were identified from the Veterans Health Administration (VHA) database from 2009 to 2013 and the Truven Health MarketScan database from 2009 to 2012 with a 12-month follow-up. Persistence was defined as continued use 1 year after initiation. Adherence was assessed by calculating a medication possession ratio, which was dichotomized as greater than 0.86 or less than or equal to 0.86. Multivariable logistic regression was used to evaluate predictors of persistence. RESULTS: There were 1030 patients in the VHA population compared with 3264 patients in the privately insured (MarketScan) cohort. In MarketScan, 1800 patients (55%) remained on ADA compared with 755 (73%) in the VHA cohort. In multivariable analysis, male sex (odds ratio [OR], 1.38; 95% CI, 1.16-1.63; P <.01), Crohn disease (OR, 1.27; 95% CI, 1.02-1.57; P = .03), greater adherence (OR, 1.83; 95% CI, 1.45-2.30; P <.01), and dose escalation (OR, 1.82; 95% CI, 1.42-2.33; P <.01) were associated with higher ADA persistence in the MarketScan cohort; narcotic use (OR, 0.71; 95% CI, 0.58-0.88; P <.01) and hospitalization or new steroid use after initiation (OR, 0.04; 95% CI, 0.03-0.05; P <.01) were associated with lower persistence. In the VHA cohort, only a hospitalization or new steroid use (OR, 0.50; 95% CI, 0.36-0.70; P <.01) was associated with lower persistence. CONCLUSIONS: Despite being older and having more comorbidities, patients in the VHA, which is an integrated healthcare system, appear to be more likely to remain on ADA at 1 year than patients in the MarketScan database. Further studies of system differences are needed to understand the reasons behind this discrepancy.
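A minimal sketch of the medication possession ratio (MPR) calculation used to define adherence above, dichotomized at 0.86; the claims layout and the 365-day window are assumptions for illustration.

```python
# Hedged sketch: MPR = total days of drug supplied / days in the follow-up
# period, capped at 1.0; adherent if MPR > 0.86, per the abstract.
import pandas as pd

def medication_possession_ratio(fills: pd.DataFrame, days_in_period: int = 365) -> float:
    """Proportion of the follow-up period covered by dispensed adalimumab."""
    return min(fills["days_supply"].sum() / days_in_period, 1.0)

# Example: twelve 28-day fills in a one-year follow-up window
claims = pd.DataFrame({"days_supply": [28] * 12})
mpr = medication_possession_ratio(claims)
print(round(mpr, 2), "adherent:", mpr > 0.86)   # 0.92 adherent: True
```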
Subjects
Adalimumab/therapeutic use, Anti-Inflammatory Agents/therapeutic use, Inflammatory Bowel Diseases/drug therapy, Health Insurance/statistics & numerical data, Medication Adherence/statistics & numerical data, Veterans/statistics & numerical data, Adult, Female, Humans, Logistic Models, Male, Middle Aged, Retrospective Studies, United States, Veterans/psychology
ABSTRACT
Background: Inflammatory bowel disease (IBD) is a chronic disease characterized by unpredictable flares and periods of remission. Tools that accurately predict disease course would substantially aid therapeutic decision-making. This study aims to construct a model that accurately predicts the combined end point of outpatient corticosteroid use and hospitalizations as a surrogate for IBD flare. Methods: Predictors evaluated included age, sex, race, use of corticosteroid-sparing immunosuppressive medications (immunomodulators and/or anti-TNF), longitudinal laboratory data, and number of previous IBD-related hospitalizations and outpatient corticosteroid prescriptions. We constructed models using logistic regression and machine learning methods (random forest [RF]) to predict the combined end point of hospitalization and/or corticosteroid use for IBD within 6 months. Results: We identified 20,368 Veterans Health Administration patients with a first (index) IBD diagnosis between 2002 and 2009. The area under the receiver operating characteristic curve (AuROC) for the baseline logistic regression model was 0.68 (95% confidence interval [CI], 0.67-0.68). The AuROC for the RF longitudinal model was 0.85 (95% CI, 0.84-0.85). The AuROC for the RF longitudinal model using previous hospitalization or steroid use was 0.87 (95% CI, 0.87-0.88). The 5 leading independent risk factors for future hospitalization or steroid use were age, mean serum albumin, immunosuppressive medication use, and mean and highest platelet counts. Previous hospitalization and corticosteroid use were highly predictive when included in specified models. Conclusions: A novel machine learning model substantially improved our ability to predict IBD-related hospitalization and outpatient steroid use. This model could be used at the point of care to distinguish patients at high and low risk for disease flare, allowing individualized therapeutic management.
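As a hedged sketch of how the longitudinal model above might be assembled (aggregating repeated labs into patient-level features, then fitting a random forest and reporting AuROC); the file names, column names, and aggregation choices are assumptions.

```python
# Minimal sketch: summarize longitudinal labs per patient, join baseline data
# and prior utilization, then fit a random forest classifier.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

labs = pd.read_csv("labs_long.csv")            # hypothetical: patient_id, albumin, platelets
features = labs.groupby("patient_id").agg(
    mean_albumin=("albumin", "mean"),
    mean_platelets=("platelets", "mean"),
    max_platelets=("platelets", "max"))

base = pd.read_csv("baseline.csv").set_index("patient_id")   # age, prior_hosp, flare_6mo
df = features.join(base, how="inner").dropna()

X, y = df.drop(columns=["flare_6mo"]), df["flare_6mo"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print("AuROC:", round(roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1]), 2))
```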
Subjects
Corticosteroids/therapeutic use, Hospitalization/statistics & numerical data, Immunosuppressive Agents/therapeutic use, Inflammatory Bowel Diseases/drug therapy, Machine Learning, Outpatients/statistics & numerical data, Area Under Curve, Female, Humans, Male, Middle Aged, Retrospective Studies, Risk Factors
ABSTRACT
OBJECTIVE: To generate practice-based evidence of outcomes in an interdisciplinary spasticity management clinic using practical application of the Goal Attainment Scale (GAS). DESIGN: Retrospective chart review. PATIENTS: A total of 225 adult patients who were referred for spasticity management at a tertiary rehabilitation hospital and returned for follow-up between 2010 and 2013. METHODS: GAS scores were determined for all patients. GAS T-scores were evaluated based on age; sex; diagnosis; International Classification of Functioning, Disability and Health (ICF) domain; body region affected; and site of botulinum neurotoxin injection. RESULTS: The distribution of GAS outcomes did not vary by age, sex or diagnosis. The overall GAS T-score for the clinic was 47.7, which is consistent with appropriate goal setting. GAS T-scores did not vary by diagnosis or ICF domain. Significant intervention effects were identified for botulinum neurotoxin, with improvements in GAS T-scores for treatment targeted to both upper and lower limb muscles, compared with no botulinum neurotoxin, across diagnoses and ICF domains. CONCLUSION: The GAS is a useful patient-centred outcome measure that can be practically applied in the clinical setting for a heterogeneous population with diverse goals. Botulinum neurotoxin treatment in this setting was associated with improved goal attainment relating to multiple ICF domains.
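For context, the conventional Kiresuk-Sherman GAS T-score aggregates goal attainment levels (scored -2 to +2) into a score with an expected mean of 50, which is why the clinic-wide value of 47.7 above is read as consistent with appropriate goal setting. The sketch below is a hedged implementation of that standard formula; the example scores, equal weights, and the customary inter-goal correlation of 0.3 are assumptions, not values taken from this study.

```python
# Hedged sketch of the standard GAS T-score:
#   T = 50 + 10 * sum(w_i * x_i) / sqrt((1 - rho) * sum(w_i^2) + rho * (sum(w_i))^2)
# where x_i is the attainment level (-2..+2), w_i the goal weight, rho ~ 0.3.
from math import sqrt

def gas_t_score(scores, weights=None, rho=0.3):
    weights = weights or [1.0] * len(scores)
    wx = sum(w * x for w, x in zip(weights, scores))
    denom = sqrt((1 - rho) * sum(w * w for w in weights) + rho * sum(weights) ** 2)
    return 50 + 10 * wx / denom

# Example: three equally weighted goals, attainment 0, +1, and -1 -> T = 50
print(round(gas_t_score([0, 1, -1]), 1))
```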
Subjects
Muscle Spasticity/therapy, Adult, Female, Humans, Male, Middle Aged, Muscle Spasticity/drug therapy, Retrospective Studies, Treatment Outcome
ABSTRACT
OBJECTIVE: To evaluate the prevalence and effect of spasticity after traumatic spinal cord injury (SCI). DESIGN: Prospective cohort study of the Rick Hansen Spinal Cord Injury Registry (RHSCIR) and retrospective review of inpatient medical charts. SETTING: Quaternary trauma center, rehabilitation center, and community settings. PARTICIPANTS: Individuals (N=860) with a traumatic SCI between March 1, 2005, and March 31, 2014, prospectively enrolled in the Vancouver site RHSCIR were eligible for inclusion. INTERVENTIONS: Not applicable. MAIN OUTCOME MEASURES: Questionnaires (Penn Spasm Frequency Scale, Spinal Cord Injury Health Questionnaire) and antispasticity medication use. RESULTS: In 465 patients, the prevalence of spasticity at community discharge was 65%, and the prevalence of problematic spasticity (defined as discharged on antispasticity medication) was 35%. Problematic spasticity was associated with cervicothoracic neurologic level and injury severity (P<.001). In community follow-up, the prevalence of patients reporting any spasticity treatment (ie, problematic spasticity) was 35% at 1 year, 41% at 2 years, and 31% at 5 years postinjury. Interference with function caused by spasticity was reported by 27% of patients at 1 year, 25% at 2 years, and 20% at 5 years postinjury. Patients with American Spinal Injury Association Impairment Scale grade C injuries had the highest prevalence of ongoing spasticity treatment and functional limitation. CONCLUSIONS: Spasticity is a highly prevalent secondary consequence of SCI, particularly in patients with severe motor incomplete cervicothoracic injuries. It is problematic in one third of all patients with SCI up to 5 years postinjury. One in 5 patients will have ongoing functional limitations related to spasticity, highlighting the importance of close community follow-up and the need for further research into spasticity management strategies.
Subjects
Muscle Spasticity/etiology, Muscle Spasticity/rehabilitation, Spinal Cord Injuries/complications, Spinal Cord Injuries/rehabilitation, Activities of Daily Living, Adolescent, Adult, Aged, Aged 80 and over, Disability Evaluation, Female, Humans, Male, Middle Aged, Muscle Spasticity/physiopathology, Prevalence, Prospective Studies, Rehabilitation Centers, Retrospective Studies, Trauma Severity Indices, Young Adult
ABSTRACT
Heterogeneity in regional end-expiratory lung volume (EELV) may lead to variations in regional strain (ε). High ε levels have been associated with ventilator-associated lung injury (VALI). While both whole-lung and regional EELV may be affected by changes in positive end-expiratory pressure (PEEP), regional variations are not revealed by conventional respiratory system measurements. Differential rates of deflation of adjacent lung units due to regional variation in expiratory time constants (τE) may create localized regions of ε that are significantly greater than implied by whole-lung measures. We used functional respiratory imaging (FRI) in an ex vivo porcine lung model to: (i) demonstrate that computed tomography (CT)-based imaging studies can be used to assess global and regional values of ε and τE, and (ii) demonstrate that the manipulation of PEEP will cause measurable changes in total and regional ε and τE values. Our study provides three insights into lung mechanics. First, image-based measurements reveal regional variation that cannot be detected by traditional methods such as spirometry. Second, the manipulation of PEEP causes global and regional changes in R, E, ε, and τE values. Finally, regional ε and τE were correlated in several lobes, suggesting the possibility that regional τE could be used as a surrogate marker for regional ε.
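To make the strain concept concrete: regional strain is commonly computed as the regional volume change over a breath divided by the regional end-expiratory volume. The toy numbers below are invented for illustration and are not data from this study; the ΔV/EELV definition is the usual convention and is assumed here to match the authors' usage.

```python
# Hedged toy example: regional strain = (end-inspiratory - end-expiratory) / end-expiratory
end_expiratory_ml = {"RUL": 310, "RLL": 420, "LUL": 300, "LLL": 380}   # regional EELV
end_inspiratory_ml = {"RUL": 390, "RLL": 560, "LUL": 370, "LLL": 470}

for lobe, eelv in end_expiratory_ml.items():
    strain = (end_inspiratory_ml[lobe] - eelv) / eelv
    print(f"{lobe}: strain = {strain:.2f}")
```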
Subjects
Lung/diagnostic imaging, Respiratory Function Tests/methods, X-Ray Computed Tomography/methods, Animals, Female, Three-Dimensional Imaging/methods, Three-Dimensional Imaging/standards, Lung/physiology, Respiratory Function Tests/standards, Respiratory Mechanics, Swine, X-Ray Computed Tomography/standards
ABSTRACT
We utilized a multicompartment model to describe the effects of changes in tidal volume (VT) and positive end-expiratory pressure (PEEP) on lung emptying during passive deflation before and after experimental lung injury. Expiratory time constants (τE) were determined by partitioning the expiratory flow-volume (V̇E-V) curve into multiple discrete segments and individually calculating τE for each segment. Under all conditions of PEEP and VT, τE increased throughout expiration both before and after injury. Segmented τE values increased throughout expiration with a slope that was different from zero (P < 0.01). On average, τE increased by 45.08 msec per segment. When an interaction between injury status and τE segment was included in the model, it was significant (P < 0.05), indicating that, after injury, later segments had disproportionately higher τE values than early segments. Higher PEEP and VT values were associated with higher τE values. No evidence was found for an interaction between injury status and VT or PEEP. The current experiment confirms previous observations that τE values are smaller in subjects with injured lungs when compared to controls. We are the first to demonstrate changes in the pattern of τE before and after injury when examined with a multiple-compartment model. Finally, increases in PEEP or VT increased τE throughout expiration, but did not appear to have effects that differed between the uninjured and injured states.
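A hedged numerical sketch of the segment-wise τE calculation described above: during passive deflation the volume above end-expiratory volume is approximately -τE times the flow, so within each volume segment τE can be estimated as the negative slope of a volume-on-flow regression. The synthetic single-compartment signal and the choice of five segments are assumptions for illustration only.

```python
# Hedged sketch: partition a (synthetic) expiratory flow-volume curve into
# segments and estimate tau_E in each one from V ~ -tau_E * Vdot.
import numpy as np

t = np.linspace(0.0, 2.0, 400)                 # time, s
volume = 0.5 * np.exp(-t / 0.6)                # L above EELV; true tau = 0.6 s
flow = np.gradient(volume, t)                  # expiratory flow (negative), L/s

n_segments = 5
for i, idx in enumerate(np.array_split(np.arange(len(t)), n_segments), start=1):
    slope, _intercept = np.polyfit(flow[idx], volume[idx], 1)
    print(f"segment {i}: tau_E ~ {-slope:.2f} s")   # ~0.60 s for every segment here
```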