Results 1 - 20 of 64
1.
Am J Ind Med ; 40(4): 374-92, 2001 Oct.
Article in English | MEDLINE | ID: mdl-11598987

ABSTRACT

BACKGROUND: Job characteristics may constitute a barrier to return-to-work (RTW) after compensated disabling low back pain (LBP). This study examines the impact of psychosocial job factors on time to RTW separately during the acute and subacute/chronic disability phases. METHODS: This is a retrospective cohort study of 433 LBP workers' compensation claimants with 1-4 years of follow-up. The association of psychosocial job factors with duration of work disability was estimated with Cox regression models, adjusting for injury history and severity, physical workload, and demographic and employment factors. RESULTS: High physical and psychological job demands and low supervisory support are each associated with about 20% lower RTW rates during all disability phases. High job control, especially control over work and rest periods, is associated with over 30% higher RTW rates, but only during the subacute/chronic disability phase starting 30 days after injury. Job satisfaction and coworker support are unrelated to time to RTW. CONCLUSIONS: Duration of work disability is associated with psychosocial job factors independent of injury severity and physical workload. The impact of these risk factors changes significantly over the course of disability.
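The Cox-model results above are stated as percent differences in RTW rates. The translation between a hazard ratio and such a percent change is simple arithmetic; the sketch below is illustrative only, and the function names are ours, not the study's:

```python
import math

def hazard_ratio(beta):
    # convert a Cox regression coefficient to a hazard ratio
    return math.exp(beta)

def percent_change_in_rate(hr):
    # express a hazard ratio as a percent change in the event (here, RTW) rate
    return (hr - 1.0) * 100.0

# an HR of 0.8 corresponds to "about 20% lower RTW rates";
# an HR above 1.3 to "over 30% higher RTW rates"
```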


Subject(s)
Back Injuries/psychology , Low Back Pain/psychology , Occupational Diseases/psychology , Adult , Back Injuries/economics , Back Injuries/rehabilitation , California/epidemiology , Cohort Studies , Disability Evaluation , Female , Follow-Up Studies , Humans , Internal-External Control , Job Description , Job Satisfaction , Low Back Pain/epidemiology , Low Back Pain/etiology , Male , Occupational Diseases/economics , Occupational Diseases/epidemiology , Retrospective Studies , Risk Factors , Stress, Physiological , Work Capacity Evaluation , Work Schedule Tolerance/psychology , Workers' Compensation
2.
JAMA ; 285(23): 2987-94, 2001 Jun 20.
Article in English | MEDLINE | ID: mdl-11410097

ABSTRACT

CONTEXT: For many elderly patients, an acute medical illness requiring hospitalization is followed by a progressive decline, resulting in high rates of mortality in this population during the year following discharge. However, few prognostic indices have focused on predicting posthospital mortality in older adults. OBJECTIVE: To develop and validate a prognostic index for 1-year mortality of older adults after hospital discharge, using information readily available at discharge. DESIGN: Data analyses derived from 2 prospective studies with 1 year of follow-up, conducted from 1993 through 1997. SETTING AND PATIENTS: We developed the prognostic index in 1495 patients aged at least 70 years who were discharged from a general medical service at a tertiary care hospital (mean age, 81 years; 67% female) and validated it in 1427 patients discharged from a separate community teaching hospital (mean age, 79 years; 61% female). MAIN OUTCOME MEASURE: Prediction of 1-year mortality using risk factors such as demographic characteristics, activities of daily living (ADL) dependency, comorbid conditions, length of hospital stay, and laboratory measurements. RESULTS: In the derivation cohort, 6 independent risk factors for mortality were identified and weighted using logistic regression: male sex (1 point); number of dependent ADLs at discharge (1-4 ADLs, 2 points; all 5 ADLs, 5 points); congestive heart failure (2 points); cancer (solitary, 3 points; metastatic, 8 points); creatinine level higher than 3.0 mg/dL (265 micromol/L) (2 points); and low albumin level (3.0-3.4 g/dL, 1 point; <3.0 g/dL, 2 points). Several variables associated with 1-year mortality in bivariable analyses, such as age and dementia, were not independently associated with mortality after adjustment for functional status. We calculated risk scores for patients by adding the points of each independent risk factor present.
In the derivation cohort, 1-year mortality was 13% in the lowest-risk group (0-1 point), 20% in the group with 2 or 3 points, 37% in the group with 4 to 6 points, and 68% in the highest-risk group (>6 points). In the validation cohort, 1-year mortality was 4% in the lowest-risk group, 19% in the group with 2 or 3 points, 34% in the group with 4 to 6 points, and 64% in the highest-risk group. The area under the receiver operating characteristic curve for the point system was 0.75 in the derivation cohort and 0.79 in the validation cohort. CONCLUSIONS: Our prognostic index, which used 6 risk factors known at discharge and a simple additive point system to stratify medical patients 70 years or older according to 1-year mortality after hospitalization, had good discrimination and calibration and generalized well in an independent sample of patients at a different site. These characteristics suggest that our index may be useful for clinical care and risk adjustment.
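As an aid to readers, the additive point system described above can be written out directly. The function and argument names below are ours; the point values and risk strata are taken verbatim from the abstract:

```python
def discharge_risk_score(male, dependent_adls, chf, cancer,
                         creatinine_mg_dl, albumin_g_dl):
    """Additive 1-year mortality point score from the six discharge risk factors.

    `cancer` is one of None, "solitary", or "metastatic".
    """
    score = 0
    if male:
        score += 1                       # male sex: 1 point
    if 1 <= dependent_adls <= 4:
        score += 2                       # 1-4 dependent ADLs: 2 points
    elif dependent_adls == 5:
        score += 5                       # all 5 ADLs: 5 points
    if chf:
        score += 2                       # congestive heart failure: 2 points
    if cancer == "solitary":
        score += 3
    elif cancer == "metastatic":
        score += 8
    if creatinine_mg_dl > 3.0:
        score += 2                       # creatinine > 3.0 mg/dL: 2 points
    if 3.0 <= albumin_g_dl <= 3.4:
        score += 1                       # albumin 3.0-3.4 g/dL: 1 point
    elif albumin_g_dl < 3.0:
        score += 2                       # albumin < 3.0 g/dL: 2 points
    return score

def risk_group(score):
    """Map a point total to the four risk strata reported in the abstract."""
    if score <= 1:
        return "lowest (0-1)"
    if score <= 3:
        return "2-3 points"
    if score <= 6:
        return "4-6 points"
    return "highest (>6)"
```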


Subject(s)
Health Status Indicators , Hospitalization/statistics & numerical data , Mortality , Aged , Aged, 80 and over , Female , Humans , Logistic Models , Male , Multivariate Analysis , Prognosis , Risk Assessment , Risk Factors
3.
Stat Med ; 20(12): 1739-53, 2001 Jun 30.
Article in English | MEDLINE | ID: mdl-11406838

ABSTRACT

Repeat measurements of patient characteristics are often used to assess response to treatment. In this paper we discuss a normal mixture model for the observed change in the characteristic of interest in treated patients. The methods described can be used to estimate the overall proportion of non-response to treatment and also the probability that a patient has not responded to treatment given his or her observed change. The model parameters are estimated using maximum likelihood, and the delta method is used to construct a pointwise confidence band for the conditional probability that a patient is a non-responder to treatment. The work was initially motivated by analysis issues in the Fracture Intervention Trial (FIT), a randomized trial of the osteoporosis drug alendronate, and the method is illustrated with data from that study. We also evaluate key aspects of the estimation procedure with two simulation studies. In the first, the data generation model is the assumed normal mixture model, and in the second, the data are generated according to a shifted and scaled central t-distribution model suggested by the FIT data.
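For illustration, the conditional probability of non-response under a two-component normal mixture follows from Bayes' rule. The sketch below uses plug-in parameter values supplied by the caller; the paper instead estimates the parameters by maximum likelihood and builds a delta-method confidence band, which is not reproduced here:

```python
import math

def normal_pdf(x, mu, sd):
    # density of a Normal(mu, sd**2) distribution at x
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def prob_nonresponse(change, p_nr, mu_nr, sd_nr, mu_r, sd_r):
    """Posterior probability that a patient is a non-responder given the
    observed change, under a two-component normal mixture with
    non-responder weight p_nr (Bayes' rule; illustrative parameters)."""
    num = p_nr * normal_pdf(change, mu_nr, sd_nr)
    den = num + (1.0 - p_nr) * normal_pdf(change, mu_r, sd_r)
    return num / den
```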


Subject(s)
Alendronate/therapeutic use , Models, Biological , Osteoporosis, Postmenopausal/drug therapy , Probability , Aged , Bone Density/drug effects , Computer Simulation , Female , Hip Fractures/prevention & control , Humans , Likelihood Functions , Middle Aged , Osteoporosis, Postmenopausal/prevention & control , Spinal Fractures/prevention & control , Treatment Outcome
4.
J Occup Environ Med ; 43(6): 515-25, 2001 Jun.
Article in English | MEDLINE | ID: mdl-11411323

ABSTRACT

Although doctors are increasingly evaluated on the basis of return-to-work (RTW) outcomes, the effect of doctor-patient communication about the workplace and RTW after an occupational injury has received little research attention. The effect of patient-reported doctor communication on duration of disability was examined retrospectively in a 3-year cohort of 325 claimants with a lost-time low back injury. Although doctor proactive communication was associated with a greater likelihood of RTW during the acute phase (< 30 days of disability), this effect disappeared when injury and workload characteristics were taken into account. A positive RTW recommendation was associated with about a 60% higher RTW rate during the subacute/chronic phase (> 30 days of disability) only. Prospective studies are needed to confirm this effect. The impact of physician communication on RTW is largely confounded by injury and workplace factors.


Subject(s)
Back Injuries/rehabilitation , Occupational Diseases/rehabilitation , Physician-Patient Relations , Work Capacity Evaluation , Workers' Compensation , Adult , Back Injuries/economics , Confounding Factors, Epidemiologic , Female , Humans , Male , Middle Aged , Occupational Diseases/economics , Regression Analysis , Risk Factors , Time Factors , United States
5.
J Occup Environ Med ; 42(3): 323-33, 2000 Mar.
Article in English | MEDLINE | ID: mdl-10738711

ABSTRACT

Little is known about predictors of duration of work disability (DOD). This cohort study of 433 workers' compensation claimants estimated DOD for job, injury, and demographic factors during consecutive disability phases using Cox regression analysis. DOD was calculated from administrative records. Results show that DOD increases with the time spent bending and lifting or pushing or pulling heavy objects at work, but it is unrelated to sitting, standing, or vibration. Younger age, longer pre-injury employment, less severe injuries, and a previous back injury predicted shorter disability, the latter factor only during the subacute/chronic disability phases. The effect of injury severity decayed over time. This study demonstrates the usefulness of a phase-specific analysis and shows that physical job and injury factors have a significant and time-varying impact on DOD.


Subject(s)
Employment/statistics & numerical data , Occupational Diseases/epidemiology , Spinal Injuries/epidemiology , Workers' Compensation , Adult , Age Distribution , California/epidemiology , Cohort Studies , Confidence Intervals , Disability Evaluation , Female , Humans , Injury Severity Score , Low Back Pain/diagnosis , Low Back Pain/epidemiology , Low Back Pain/therapy , Male , Middle Aged , Occupational Diseases/diagnosis , Occupational Diseases/therapy , Physical Endurance , Proportional Hazards Models , Sex Distribution , Spinal Injuries/diagnosis , Spinal Injuries/therapy , Survival Analysis , Workplace
6.
Monogr Soc Res Child Dev ; 65(3): i-vi, 1-123, 2000.
Article in English | MEDLINE | ID: mdl-12467096

ABSTRACT

How do children learn their first words? The field of language development has been polarized by responses to this question. Explanations range from constraints/principles accounts that emphasize the importance of cognitive heuristics in language acquisition, to social-pragmatic accounts that highlight the role of parent-child interaction, to associationistic accounts that highlight the role of "dumb attentional mechanisms" in word learning. In this Monograph, an alternative to these accounts is presented: the emergentist coalition theory. A hybrid view of word learning, this theory characterizes lexical acquisition as the emergent product of multiple factors, including cognitive constraints, social-pragmatic factors, and global attentional mechanisms. The model makes three assumptions: (a) that children cull from multiple inputs available for word learning at any given time, (b) that these inputs are differentially weighted over development, and (c) that children develop emergent principles of word learning, which guide subsequent word acquisition. With few exceptions, competing accounts of the word learning process have examined children who are already veteran word learners. By focusing on the very beginnings of word learning at around 12 months of age, however, it is possible to see how social and cognitive factors are coordinated in the process of vocabulary development. After presenting a new method for investigating word learning, the development of reference is used as a test case of the theory. In 12 experiments, with children ranging in age from 12 to 25 months of age, data are described that support the emergentist coalition model. This fundamentally developmental theory posits that children construct principles of word learning. As children's word learning principles emerge and develop, the character of word learning changes over the course of the 2nd year of life.


Subject(s)
Language , Learning , Mental Recall/physiology , Psychological Theory , Child, Preschool , Female , Humans , Male , Memory/physiology , Vocabulary
7.
Epidemiology ; 10(6): 717-21, 1999 Nov.
Article in English | MEDLINE | ID: mdl-10535786

ABSTRACT

A mother's prepregnancy obesity has been suggested as a risk factor for having offspring with an abdominal wall defect. We evaluated this hypothesis among 104 cases of gastroschisis--a severe birth defect of the abdominal wall most prevalent in infants of young women--and 220 controls with no defect. Using Quetelet's index (QI = weight in kg/height in m2) as a measure of body mass, we found a higher risk of gastroschisis (odds ratio (OR) = 3.2; 95% confidence interval (CI) = 1.4-7.3) for underweight mothers (QI<18.1 kg/m2) and a lower risk (OR = 0.2; 0.05-0.9) for overweight mothers (QI>28.3 kg/m2) as compared with mothers of normal weight. As QI was correlated with height, with the correlation varying according to the mother's ethnicity and age, we adjusted for these factors in the analysis; the adjusted values approximated the unadjusted values. Evaluation of QI as a continuous variable showed that, for every unit increase in QI, the risk for gastroschisis decreased by about 11%. Sociodemographic, pregnancy, and nutrient factors did not confound the association. These results suggest that low prepregnancy body mass rather than obesity is a risk factor for gastroschisis.
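The body-mass measure and cutpoints used above are easy to reproduce. The helper names below are ours, while the formula and the 18.1/28.3 kg/m2 cutpoints come from the abstract:

```python
def quetelet_index(weight_kg, height_m):
    # Quetelet's index (body mass index): weight in kg / height in m squared
    return weight_kg / height_m ** 2

def weight_category(qi, low_cut=18.1, high_cut=28.3):
    # the study's classification: QI < 18.1 underweight, QI > 28.3 overweight
    if qi < low_cut:
        return "underweight"
    if qi > high_cut:
        return "overweight"
    return "normal"
```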


Subject(s)
Body Mass Index , Gastroschisis/epidemiology , Adult , California/epidemiology , Female , Humans , Infant, Newborn , Multivariate Analysis , Odds Ratio , Retrospective Studies , Risk Assessment , Risk Factors
8.
Am J Ophthalmol ; 127(6): 659-65, 1999 Jun.
Article in English | MEDLINE | ID: mdl-10372875

ABSTRACT

PURPOSE: Tear exchange under a soft contact lens is modest, and higher exchange rates may be necessary to reduce extended-wear complications; what is not known is the optimal soft lens design to increase tear mixing. We explored the effect of lens diameter on tear mixing. METHODS: Twenty-three subjects wore four different soft contact lenses with diameters of 12.0, 12.5, 13.0, and 13.5 mm. Tear mixing was quantified by placing fluorescein isothiocyanate-dextran on the posterior lens surface, inserting the lens, and monitoring the changes in fluorescence intensity in the postlens tear film. Tear mixing, expressed as the percentage decrease in fluorescence intensity per blink, was estimated using an exponential model. Lens movement was videotaped and lens comfort was graded on a 50-point scale (50 = excellent comfort). Subjects reporting a comfort level of less than 35 were excluded. RESULTS: The mean +/- SE tear mixing rates were 1.82% +/- 0.17%, 1.61% +/- 0.16%, 1.34% +/- 0.17%, and 1.24% +/- 0.17% per blink for the 12.0-, 12.5-, 13.0-, and 13.5-mm diameter lenses, respectively. By regression analysis we found that, on average, mixing under the 12.0-mm lens was 0.59% per blink greater than with the 13.5-mm lens (P = .0024). Lens diameter was a significant predictor of lens comfort, and adjusting for the effects of comfort weakened the relationship between diameter and tear replenishment rate, although the mean rate under the 12.0-mm lens was still 0.43% per blink greater than with the 13.5-mm lens (P = .0468). CONCLUSIONS: These data suggest that smaller-diameter soft lenses provide substantially better tear mixing than larger lenses; however, even small lenses provide modest tear mixing compared with rigid contact lenses.
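The exponential model used above to express tear mixing as a percentage decrease in fluorescence per blink can be sketched as a simple log-linear fit. This is an illustration only; the authors' exact estimation procedure is not specified in the abstract, and the function name is ours:

```python
import math

def per_blink_mixing_rate(intensities):
    """Estimate the per-blink fractional decrease r from a series of
    post-lens fluorescence readings (one per blink), assuming
    I_n = I_0 * (1 - r)**n.  Fits ln(I_n) = ln(I_0) + n*ln(1 - r)
    by ordinary least squares and returns r."""
    n = len(intensities)
    xs = range(n)
    ys = [math.log(i) for i in intensities]
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return 1.0 - math.exp(slope)
```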


Subject(s)
Contact Lenses, Hydrophilic , Tears/physiology , Adult , Dextrans , Female , Fluorescein-5-isothiocyanate/analogs & derivatives , Fluorophotometry , Humans , Male , Patient Satisfaction , Prosthesis Design , Regression Analysis
9.
Am J Ind Med ; 35(6): 604-18, 1999 Jun.
Article in English | MEDLINE | ID: mdl-10332514

ABSTRACT

BACKGROUND: Studies of low back pain (LBP) disability remain largely incomparable because of different outcome definitions. To date, systematic comparisons of alternative outcome measures have not been made. METHODS: Duration of work disability was studied in a 3-year cohort of 850 workers' compensation LBP claimants. Eleven administrative outcome measures were compared using Kaplan-Meier estimates of the proportion of claimants still on disability benefits during 3.5 years of follow-up. RESULTS: The estimated mean duration of work disability was 75 days for the first temporary disability (TD) episode, 108 days for cumulative time on TD, and 337 days for total compensated days, which includes all types of wage replacement benefits during vocational rehabilitation, temporary disability, and permanent disability. CONCLUSIONS: Commonly used administrative measures of lost workdays--time to first return to work and time on temporary disability--substantially underestimate the duration of work disability compared with measures based on all wage replacement benefits.


Subject(s)
Back Injuries/rehabilitation , Disability Evaluation , Low Back Pain/rehabilitation , Occupational Diseases/rehabilitation , Workers' Compensation/statistics & numerical data , Adult , Back Injuries/epidemiology , California/epidemiology , Cohort Studies , Female , Humans , Low Back Pain/epidemiology , Male , Middle Aged , Occupational Diseases/epidemiology , Prevalence , Survival Analysis , Time Factors
10.
Am J Ind Med ; 35(6): 619-31, 1999 Jun.
Article in English | MEDLINE | ID: mdl-10332515

ABSTRACT

BACKGROUND: Workers' compensation wage replacement data have recently been used to estimate time to return to work (RTW) and the number of work days lost after occupational injury. The degree to which indemnity-based measures reflect self-reported work disability has until now not been studied. METHOD: Kaplan-Meier curves of administrative and self-reported measures of duration of work disability were compared within a sample of 433 low back injury claimants followed up for 1 to 3.7 years. RESULTS: Administrative measures consistently and significantly underestimated the duration of disability when compared to self-reported measures of RTW. The difference between the estimated mean number of work days lost for comparable administrative and self-reported measures ranged from 142 to 334 days. CONCLUSIONS: Number of work days lost after low back injury is substantially underestimated by measures based on the duration of wage replacement benefits. This calls into question the adequacy of indemnity benefits and underscores the need for disability prevention programs.


Subject(s)
Back Injuries/rehabilitation , Disability Evaluation , Low Back Pain/rehabilitation , Workers' Compensation/statistics & numerical data , Adult , Back Injuries/epidemiology , California/epidemiology , Cohort Studies , Humans , Low Back Pain/epidemiology , Male , Middle Aged , Survival Analysis , Time Factors
11.
JAMA ; 280(23): 2001-7, 1998 Dec 16.
Article in English | MEDLINE | ID: mdl-9863851

ABSTRACT

CONTEXT: The Lifestyle Heart Trial demonstrated that intensive lifestyle changes may lead to regression of coronary atherosclerosis after 1 year. OBJECTIVES: To determine whether patients can sustain intensive lifestyle changes for a total of 5 years and to assess the effects of these lifestyle changes (without lipid-lowering drugs) on coronary heart disease. DESIGN: Randomized controlled trial conducted from 1986 to 1992 using a randomized invitational design. PATIENTS: Forty-eight patients with moderate to severe coronary heart disease were randomized to an intensive lifestyle change group or to a usual-care control group, and 35 completed the 5-year follow-up quantitative coronary arteriography. SETTING: Two tertiary care university medical centers. INTERVENTION: Intensive lifestyle changes (10% fat whole-foods vegetarian diet, aerobic exercise, stress management training, smoking cessation, group psychosocial support) for 5 years. MAIN OUTCOME MEASURES: Adherence to intensive lifestyle changes, changes in coronary artery percent diameter stenosis, and cardiac events. RESULTS: Experimental group patients (20 [71%] of 28 patients completed 5-year follow-up) made and maintained comprehensive lifestyle changes for 5 years, whereas control group patients (15 [75%] of 20 patients completed 5-year follow-up) made more moderate changes. In the experimental group, the average percent diameter stenosis at baseline decreased 1.75 absolute percentage points after 1 year (a 4.5% relative improvement) and by 3.1 absolute percentage points after 5 years (a 7.9% relative improvement). In contrast, the average percent diameter stenosis in the control group increased by 2.3 percentage points after 1 year (a 5.4% relative worsening) and by 11.8 percentage points after 5 years (a 27.7% relative worsening) (P = .001 between groups).
Twenty-five cardiac events occurred in 28 experimental group patients vs 45 events in 20 control group patients during the 5-year follow-up (risk ratio for any event for the control group, 2.47 [95% confidence interval, 1.48-4.20]). CONCLUSIONS: More regression of coronary atherosclerosis occurred after 5 years than after 1 year in the experimental group. In contrast, in the control group, coronary atherosclerosis continued to progress and more than twice as many cardiac events occurred.
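Note the distinction above between absolute percentage-point changes in stenosis and relative improvement: the two are linked by the baseline stenosis. The baseline value used in the check below is implied by the reported numbers, not stated in the abstract:

```python
def relative_change_pct(absolute_pp_change, baseline_pct):
    # relative change (%) implied by an absolute percentage-point change
    # from a given baseline percent diameter stenosis
    return 100.0 * absolute_pp_change / baseline_pct

# a 1.75-point decrease from a baseline of roughly 38.9% stenosis
# matches the 4.5% relative improvement reported above
```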


Subject(s)
Coronary Disease/prevention & control , Health Behavior , Life Style , Aged , Angina Pectoris , Coronary Angiography , Coronary Artery Disease/prevention & control , Coronary Disease/diagnosis , Coronary Disease/epidemiology , Coronary Disease/physiopathology , Diet , Disease Progression , Exercise , Feasibility Studies , Female , Humans , Lipids/blood , Male , Middle Aged , Risk Factors , Self-Help Groups , Smoking Cessation , Stress, Psychological/prevention & control , Time Factors
12.
Br J Ophthalmol ; 82(4): 376-81, 1998 Apr.
Article in English | MEDLINE | ID: mdl-9640184

ABSTRACT

AIMS: The authors recently reported that a single drop fluorophotometric technique for estimating corneal epithelial permeability (Pdc) to fluorescein is not sufficiently precise for monitoring permeability changes in individual patients, but may be useful for evaluating mean differences in Pdc in population based research. To determine whether this technique provides a more sensitive index of epithelial integrity than conventional clinical assessments, the effects of mild corneal trauma on Pdc, the slit lamp appearance of the cornea, and corneal thickness (CT) were assessed. METHODS: After baseline slit lamp examinations (SLE) and CT measurements, one randomly chosen eye of each of 32 normal subjects underwent 1 hour of closed eye soft contact lens (CL) wear while the fellow eye served as a control (no CL). After removing the CL, the SLE and CT measurements were repeated. Then, Pdc to fluorescein was assessed using a single drop fluorophotometric method refined to enhance feasibility, precision, and accuracy. RESULTS: The mean (95% confidence interval) difference in natural log (Pdc) between 32 pairs of eyes (CL minus no CL) was 0.341 (0.069, 0.613), p = 0.016. By contrast, none of the 32 subjects exhibited corneal epithelial disruption upon SLE with white light following the closed eye period. Also, no substantial differences were apparent in the corneal swelling response between paired eyes, mean delta CT (95% CI) = -2.31 (-7.53, 2.91) microns, p = 0.37. CONCLUSIONS: Pdc measurements, used in studies of modest sample size, appear capable of detecting average differences in corneal barrier function that remain undetectable by SLE or pachymetry.


Subject(s)
Contact Lenses, Hydrophilic/adverse effects , Epithelium, Corneal/metabolism , Adult , Epithelium, Corneal/anatomy & histology , Fluorophotometry , Humans , Permeability , Sensitivity and Specificity , Time Factors
13.
Invest Ophthalmol Vis Sci ; 39(1): 3-17, 1998 Jan.
Article in English | MEDLINE | ID: mdl-9430539

ABSTRACT

PURPOSE: To assess corneal structure and the effects of acute hyperglycemia on corneal function in subjects with type 1 diabetes. METHODS: Twenty-one diabetic and 21 nondiabetic volunteers of similar age were recruited. Baseline measurements of intraocular pressure (IOP), corneal thickness (CT), corneal autofluorescence (CAF), corneal sensitivity (CST), central and temporal endothelial cell density (DenC and DenT), and coefficient of variation in cell area (CVC and CVT) were taken. Corneal edema was induced, and the percent recovery per hour (PRPH) from hypoxic edema and endothelial permeability to fluorescein were determined. These procedures were done twice in the diabetic subjects under controlled euglycemic (EG) and hyperglycemic (HG) conditions, and once in control subjects while they were fasting. RESULTS: Substantial differences in baseline measurements were found for IOP, CT, CAF, CST, DenC, and CVT. The mean +/- SE corneal swelling in the HG diabetic subjects (51.6 +/- 2.3 microm) was less when compared to the swelling in the EG diabetic subjects (56.2 +/- 1.87 microm, P = 0.05) and the control subjects (58.9 +/- 1.56 microm, P = 0.011). During euglycemia, the mean +/- SE PRPH was less in diabetic subjects than in control subjects (65.0 +/- 3.20 versus 73.8 +/- 1.81%/hour, P = 0.02) but did not differ in diabetic subjects under EG and HG conditions (65.0 +/- 3.20 versus 67.7 +/- 3.1%/hour, P = 0.56). No significant differences were noted between groups in endothelial permeability. CONCLUSIONS: In addition to differences in baseline corneal structure, diabetic subjects showed less corneal swelling and reduced corneal recovery from hypoxia than did control subjects. During acute hyperglycemia, corneal swelling was less than during euglycemia in diabetic subjects, which suggests that hyperglycemia affected corneal hydration control.


Subject(s)
Blood Glucose , Cornea/physiopathology , Diabetes Mellitus, Type 1/physiopathology , Diabetic Retinopathy/physiopathology , Acute Disease , Adult , Blood Glucose/physiology , Cell Count , Cell Size , Cornea/metabolism , Cornea/pathology , Diabetes Mellitus, Type 1/blood , Diabetic Retinopathy/blood , Endothelium, Corneal/metabolism , Endothelium, Corneal/pathology , Female , Fluorophotometry , Humans , Hyperglycemia/blood , Hyperglycemia/physiopathology , Intraocular Pressure , Male , Middle Aged , Permeability , Sensation/physiology
14.
Teratology ; 58(6): 241-50, 1998 Dec.
Article in English | MEDLINE | ID: mdl-9894673

ABSTRACT

The young age of mothers of infants with gastroschisis, a congenital defect of the abdominal wall, suggested that deficient nutrition, with maternal-fetal competition for nutrients, could be a risk factor for gastroschisis. This population-based hypothesis-generating study consisted of 55 cases of gastroschisis and 182 matched controls. We assessed maternal nutrient intake during the trimester before conception with a self-reported food-frequency questionnaire and screened 38 nutrients to identify those most likely to be associated with gastroschisis. We used statistical classification trees to empirically generate cutpoints that determined the low and high levels of nutrient intakes corresponding to the risk of gastroschisis; cutpoints for most nutrients were similar to the corresponding recommended daily dietary allowances (RDAs). In univariate analysis, low intake of several nutrients emerged as the leading risk factors: carotenoids, e.g., alpha-carotene (odds ratio (OR) = 4.6; 95% confidence interval (CI) = 2.2-9.5), beta-carotene (OR = 3.1; 95% CI = 1.6-6.0); amino-acid compounds, e.g., total glutathione (OR = 3.5; 95% CI = 1.7-7.2); vitamin C (OR = 2.2; 95% CI = 1.5-7.8); vitamin E (OR = 2.3; 95% CI = 1.2-4.4); and minerals, fiber, and the fruit-and-vegetable group (OR = 3.1; 95% CI = 1.5-6.2). High intake of nitrosamines (OR = 2.4; 95% CI = 1.3-4.5) was also a good candidate. Many nutrient values were correlated and, in multivariate analysis, those most associated with gastroschisis were low alpha-carotene (OR = 4.3; 95% CI = 1.9-9.8), low total glutathione (OR = 3.3; 95% CI = 1.4-7.6), and high nitrosamines (OR = 2.6; 95% CI = 1.3-5.4). Adjusting for variables associated with gastroschisis in previous analyses of this population did not substantially alter those risks. 
These results suggest that maternal dietary inadequacy may be a risk factor for gastroschisis, and the three nutrients that emerged from the nutrient screening appear to be the best candidates to examine in further epidemiological analyses or biological studies.


Subject(s)
Gastroschisis/etiology , Nutritional Status , Carotenoids , Diet , Female , Glutathione , Humans , Maternal-Fetal Exchange , Multivariate Analysis , Nitrosamines/adverse effects , Pregnancy , Risk Factors , Software , beta Carotene
15.
Invest Ophthalmol Vis Sci ; 38(9): 1830-9, 1997 Aug.
Article in English | MEDLINE | ID: mdl-9286273

ABSTRACT

PURPOSE: Permeability (Pdc) to sodium fluorescein (F) is a characteristic of the barrier function of the corneal epithelium. The repeatability of several in vivo fluorophotometric methods used to measure permeability in humans remains largely undocumented. This study examines the repeatability of a method based on topical instillation of a single drop of F for the quantitative assessment of Pdc. METHODS: Nine healthy subjects with no history of ocular disease provided 1 (n = 1), 2 (n = 1), or 3 (n = 7) repeated measurements of each eye at successive visits. After making 3 baseline fluorescence scans centrally through the tear film and cornea, 2 microliters of 0.35% F were instilled and 10 fluorescence scans were obtained at approximately 2-minute intervals immediately after instillation. Subsequently, the eyes were rinsed three times with nonpreserved saline and four additional scans were performed. RESULTS: Pdc was calculated by dividing the baseline-corrected postrinse stromal fluorescence by the time integral of the tear film fluorescence calculated over the 20-minute exposure period. After applying a logarithmic transformation to the Pdc estimates, a mixed-model analysis was used to assess measurement repeatability. On the Pdc scale, there is an estimated 95% chance that a second measurement could be as much as 2.88 times higher or 0.35 times lower than a first measurement. CONCLUSIONS: This substantial variability between repeated measurements indicates that the single-drop procedure is unreliable for monitoring individual patient changes. However, with careful sample size planning, this technique can be used in population-based research to compare differences in treatment effects between groups of subjects.
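The stated calculation (baseline-corrected post-rinse stromal fluorescence divided by the time integral of tear film fluorescence) can be sketched numerically. The trapezoidal approximation and the names below are ours, not the authors':

```python
def corneal_permeability(stromal_fluorescence, tear_times_min, tear_fluorescence):
    """Pdc = baseline-corrected post-rinse stromal fluorescence divided by the
    time integral of tear-film fluorescence over the exposure period.
    The integral is approximated with the trapezoidal rule; units follow
    whatever scale the fluorometer reports."""
    integral = 0.0
    for (t0, f0), (t1, f1) in zip(
            zip(tear_times_min, tear_fluorescence),
            zip(tear_times_min[1:], tear_fluorescence[1:])):
        integral += 0.5 * (f0 + f1) * (t1 - t0)
    return stromal_fluorescence / integral
```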


Subject(s)
Cell Membrane Permeability/physiology , Cornea/metabolism , Adult , Cornea/cytology , Epithelium/metabolism , Fluorescein , Fluoresceins/metabolism , Fluorescent Dyes/metabolism , Fluorophotometry/methods , Humans , Middle Aged , Reproducibility of Results
16.
Curr Eye Res ; 14(5): 349-55, 1995 May.
Article in English | MEDLINE | ID: mdl-7648860

ABSTRACT

Corneal acidosis has been shown to reduce corneal hydration control (CHC), measured as the percent recovery per hour (PRPH), the rate at which corneal thickness decreases exponentially after an increased hydration load. Since the effect of pH on corneal function is of scientific interest and may have clinical implications, we explored the relationship between pH and PRPH in greater detail by examining the effect of different stromal pH levels on corneal hydration control. Corneal edema was induced by a 90-min period of hypoxic contact lens (CL) wear. Following removal of the CL, eyes were randomly assigned, over four eye-test combinations, to exposure to 0%, 3%, 5%, or 7% CO2, while stromal pH and corneal thickness were monitored using slit lamp fluorophotometry and optical pachometry, respectively. From these measurements we determined the pH-dose/PRPH relationship. The average stromal pH +/- 1 SD resulting from exposure to 0%, 3%, 5%, and 7% CO2 was 7.65 +/- 0.11, 7.30 +/- 0.09, 7.15 +/- 0.08, and 7.04 +/- 0.07 (p < 0.001), respectively. Analysis based on a quadratic model of the dose-response relationship between PRPH and corneal pH indicates that PRPH is relatively unchanged for pH in the physiological range (pH = 7.40-7.65) and then decreases notably below the physiological range.


Subject(s)
Cornea/physiology , Corneal Edema/physiopathology , Corneal Stroma/physiopathology , Acidosis , Adolescent , Adult , Body Water , Carbon Dioxide , Cell Hypoxia , Contact Lenses/adverse effects , Corneal Edema/etiology , Female , Fluorophotometry , Humans , Hydrogen-Ion Concentration , Male , Oxygen Consumption
17.
Diabetes ; 42(9): 1324-32, 1993 Sep.
Article in English | MEDLINE | ID: mdl-8349044

ABSTRACT

Plasma glucose values after oral glucose challenge vary widely in nondiabetic subjects. We have now evaluated the role of insulin resistance in determining the plasma glucose response to oral glucose in 74 volunteer subjects with normal glucose tolerance. In these subjects, we determined the plasma glucose and insulin responses over a 3-h period to a 75-g oral glucose challenge, and the steady-state plasma glucose concentration during a continuous infusion of somatostatin, glucose, and insulin (a quantitative measure of insulin resistance). The plasma glucose response was defined as the incremental increase in plasma glucose concentration above the fasting value for 3 h after the oral glucose challenge. Multiple regression analysis was used to define the relationship between the dependent variable (plasma glucose response) and various predictors of this response. These analyses indicated that both the steady-state plasma glucose and the incremental insulin response during the first 30 min after the glucose load were significant predictors of the plasma glucose response. In those individuals in whom insulin action was impaired and the 30-min plasma insulin response was decreased, plasma glucose values reached higher levels. When standardized regression coefficients were determined, the incremental glucose response was directly correlated with steady-state plasma glucose (r = 0.700, P < 0.001) and inversely with the insulin response during the first 30 min (r = 0.268, P = 0.023). Furthermore, the correlation between steady-state plasma glucose and glucose response was significantly greater (P < 0.005) than that between the glucose response and 30-min insulin concentration.(ABSTRACT TRUNCATED AT 250 WORDS)


Subject(s)
Blood Glucose/metabolism , Insulin Resistance/physiology , Insulin/metabolism , Adult , Aged , Female , Glucose Tolerance Test , Humans , Insulin/physiology , Insulin Secretion , Male , Middle Aged , Reference Values , Regression Analysis , Time Factors
18.
J Clin Endocrinol Metab ; 76(2): 489-93, 1993 Feb.
Article in English | MEDLINE | ID: mdl-8432795

ABSTRACT

The plasma cholecystokinin (CCK) response to a test meal was studied in 16 control subjects and 15 patients with noninsulin-dependent diabetes mellitus (NIDDM). Basal CCK levels were approximately 1 pmol in both groups. However, after the test meal, plasma CCK levels were 2-fold greater in the controls than in the diabetics. In controls, CCK levels maximally increased by 5.6 +/- 0.8 pmol (mean +/- SEM) 10 min after feeding, whereas in the NIDDM patients this value was 1.9 +/- 0.6 pmol (P < 0.001). After the test meal, the normal subjects showed no postprandial rise in blood glucose, whereas the diabetic patients showed a rise of 2.6 +/- 0.7 mmol. To determine whether the decreased CCK levels may have been related to the postprandial hyperglycemia, 7 diabetic subjects were infused with CCK. With this CCK infusion, postprandial glucose levels did not rise. These data suggest, therefore: 1) a role for cholecystokinin in regulating postprandial hyperglycemia in man, and 2) that abnormalities in CCK secretion occur in NIDDM and may contribute to the hyperglycemia seen in this disease.


Subject(s)
Cholecystokinin/metabolism , Diabetes Mellitus, Type 2/physiopathology , Food , Hyperglycemia/etiology , Adult , Aged , Blood Glucose/metabolism , Cholecystokinin/physiology , Female , Humans , Insulin/blood , Male , Middle Aged
19.
Arch Virol ; 128(1-2): 29-41, 1993.
Article in English | MEDLINE | ID: mdl-7916588

ABSTRACT

Complementary DNA representing 1418 nucleotides (nt) of the 3'-poly(A)-proximal tract of the genomic RNA of a potyvirus causing woodiness disease in South African passion fruit, was cloned and sequenced. The sequence contained a single long open reading frame (ORF) of 1188 nt with no initiation codon, and a 3'-non-coding region (3'-NCR) of 230 nt followed by a poly-adenylate tract. Comparison of the ORF with other potyviral coat protein (CP) sequences led to the prediction that a 279 residue CP of MW 31722 is encoded by 836 nt at the 3'-terminus of the ORF. This virus is not merely a South African strain of passion fruit woodiness virus (PWV): the deduced CP sequence is only distantly related to CPs of other sequenced strains of PWV, although it is part of a distinct subgroup of potyviruses related to PWV. The virus was therefore designated as South African passiflora virus (SAPV). The DNA containing the putative CP was cloned into the pUEX2 bacterial expression vector and expressed in Escherichia coli as a beta-gal-CP fusion protein. The fusion protein reacted with polyclonal antisera raised against the native virus, and antisera raised against partially purified fusion protein reacted with viral CP in Western blots.


Subject(s)
Capsid Proteins , Capsid/genetics , Plant Viruses/genetics , RNA Viruses/genetics , Amino Acid Sequence , Base Sequence , Cloning, Molecular , DNA, Viral , Escherichia coli , Introns , Molecular Sequence Data , Phylogeny , Plant Viruses/classification , Plants, Toxic , RNA Viruses/classification , Nicotiana/microbiology
20.
Math Biosci ; 111(2): 249-59, 1992 Oct.
Article in English | MEDLINE | ID: mdl-1515746

ABSTRACT

The two-state recurrent stochastic model with time-independent transition rates is generalized to a model with time-dependent transition rates. The rates can be any general function of external time, that is, any general function of the calendar time in which the process unfolds. Formulas for the state transition probabilities, the proportion of individuals in a particular state at time t, the distribution function, and the expectation of the number of individuals in a particular state at time t are derived.
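The abstract does not reproduce the derived formulas, but the forward equation for such a two-state process is straightforward to integrate numerically. A sketch under assumed rate functions (the rates and scheme below are illustrative, not the paper's derivation): with time-dependent rates lam(t) for the 0-to-1 transition and mu(t) for the 1-to-0 transition, the occupancy probability p1(t) obeys dp1/dt = lam(t)(1 - p1) - mu(t) p1.

```python
import math

def p1(t_end, lam, mu, p1_0=0.0, steps=10_000):
    """Integrate the Kolmogorov forward equation
    dp1/dt = lam(t) * (1 - p1) - mu(t) * p1
    with a simple fixed-step RK4 scheme. lam and mu may be any
    functions of calendar time, matching the model's generalization."""
    h = t_end / steps
    f = lambda t, p: lam(t) * (1.0 - p) - mu(t) * p
    t, p = 0.0, p1_0
    for _ in range(steps):
        k1 = f(t, p)
        k2 = f(t + h / 2, p + h * k1 / 2)
        k3 = f(t + h / 2, p + h * k2 / 2)
        k4 = f(t + h, p + h * k3)
        p += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return p

# Sanity check: constant rates recover the classical time-homogeneous
# result p1(t) = lam / (lam + mu) * (1 - exp(-(lam + mu) * t)).
lam_c, mu_c = 0.3, 0.1
exact = lam_c / (lam_c + mu_c) * (1 - math.exp(-(lam_c + mu_c) * 5.0))
approx = p1(5.0, lambda t: lam_c, lambda t: mu_c)
print(abs(exact - approx) < 1e-6)
```

Passing, say, a seasonal rate such as `lam = lambda t: 0.3 * (1 + 0.5 * math.sin(t))` exercises the time-dependent case that the generalized model addresses.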


Subject(s)
Disease , Models, Biological , Stochastic Processes , Epidemiology , Humans , Mathematics , Population Dynamics , Time Factors