1.
Mol Biol Rep ; 46(4): 4631-4643, 2019 Aug.
Article En | MEDLINE | ID: mdl-31093875

The reliable analysis of the cell cycle status has become increasingly relevant for scientific and clinical work, especially for the determination of tumor cell growth. One established method to characterize the proliferation activity of cells is the analysis of the Ki-67 protein. Ki-67 is expressed in the nucleus during the whole cell cycle except for the G0 phase. Several different protocols exist for the examination of the Ki-67 protein in tissue and cell culture, but most of them are defined for human cells. For the analysis of the Ki-67 protein in murine tissue and cell culture, a variety of protocols exists, recommending different fixation and permeabilization reagents or special kits. In this study, we established a reliable protocol for Ki-67 staining in murine cells and tissue based on PFA fixation, which can be used not only for flow cytometry but also for immunofluorescence microscopy analysis. We tested our protocol successfully with three different anti-mouse Ki-67 antibodies in cell culture, regenerating liver tissue and mouse melanoma tumor to demonstrate its general applicability.


Cell Proliferation/physiology , Ki-67 Antigen/analysis , Staining and Labeling/methods , Animals , Cell Division , Cell Line, Tumor , Cell Nucleus/metabolism , Flow Cytometry/methods , Humans , Ki-67 Antigen/metabolism , Mice , Mice, Inbred C57BL , Microscopy, Fluorescence/methods , Tumor Cells, Cultured
2.
J Child Orthop ; 13(2): 213-219, 2019 Apr 01.
Article En | MEDLINE | ID: mdl-30996747

PURPOSE: Tibia fractures are the third most common long bone fracture in children. Because of the remodelling potential of the tibial diaphysis, nonoperative treatment has historically been advocated for most tibial shaft fractures in children. The purpose of this study was to estimate the rate of surgical treatment of tibial shaft fractures over time and identify demographic factors associated with surgical treatment, utilizing a large, publicly available, national database. METHODS: The Healthcare Cost and Utilization Project Kids' Inpatient Database was evaluated for the years between 2000 and 2012. Tibial shaft fractures and surgically treated patients were identified using International Classification of Diseases, 9th Revision, Clinical Modification codes. Univariable and multivariable logistic regression were used to determine variables associated with a greater proportion of surgical treatment. Statistical analyses were performed utilizing SAS statistical software v.9.4. Statistical significance was set at p < 0.05. RESULTS: In all, 24 166 tibial shaft fracture admissions were identified, with 15 621 (64.7%) treated surgically. The percentage of patients receiving surgery to treat tibial shaft fractures increased from 57.3% in 2000 to 74.3% in 2012 (p < 0.001). Multivariable regression showed that increasing age was associated with increased rate of surgical treatment (p < 0.001). The greatest increase in surgical treatment was seen in children aged five to nine years, increasing from 23.0% in 2000 to 46.2% in 2012. CONCLUSION: The rate of operative treatment of paediatric tibial shaft fractures increased over time. The largest increase was seen in children aged five to nine years. Increased proportion of surgical treatment was associated with older age, concurrent femur fracture and non-Medicaid insurance status. LEVEL OF EVIDENCE: Level III - Retrospective comparative study.
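
The regression analyses in this and the two following studies were run in SAS on the licensed KID data; purely as an illustration of the kind of multivariable logistic model described, the Python sketch below fits one to synthetic admission-level data, with variable names and simulated effect sizes invented for the example.

    # Minimal sketch (not the authors' SAS code): multivariable logistic regression
    # of operative treatment on age, admission year and payer status, using
    # synthetic data in place of the licensed KID records.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 2000
    df = pd.DataFrame({
        "age":      rng.integers(0, 18, n),                        # years
        "year":     rng.choice([2000, 2003, 2006, 2009, 2012], n),
        "medicaid": rng.integers(0, 2, n),                         # 1 = Medicaid payer
    })
    # Simulate surgery so that older age and later year raise the odds of operation.
    logit_p = -2.0 + 0.15 * df["age"] + 0.05 * (df["year"] - 2000) - 0.3 * df["medicaid"]
    df["surgical"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    fit = smf.logit("surgical ~ age + year + medicaid", data=df).fit(disp=0)
    print(np.exp(fit.params))  # odds ratios for each predictor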

3.
J Child Orthop ; 12(2): 111-116, 2018 Apr 01.
Article En | MEDLINE | ID: mdl-29707048

PURPOSE: To estimate the rate of surgical treatment of paediatric proximal humerus fractures over time utilizing a large, publicly available national database. METHODS: The Healthcare Cost and Utilization Project Kids' Inpatient Database was evaluated between the years 2000 and 2012. Proximal humerus fractures were identified using International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9 CM) diagnosis codes. ICD-9 CM procedure codes were used to identify patients who received surgical treatment. Univariable and multivariable logistic regression were used to determine variables associated with greater proportions of surgical treatment. All statistical analyses were performed utilizing SAS statistical software v.9.4. Statistical significance was set at p < 0.05. RESULTS: A total of 7520 proximal humerus fracture admissions were identified; 3247 (43.2%) were treated surgically. The percentage of patients receiving surgery increased from 39.3% in 2000 to 46.4% in 2012 (p < 0.001). After adjustment for potential confounders, increased age, increased ICD-9 derived injury severity scores (ICISS) and more recent year were associated with an increased proportion of patients receiving surgical treatment (p < 0.001). Medicaid payer status (p < 0.001) and admission to a children's hospital (p = 0.045) were associated with a lower proportion of surgical treatment. CONCLUSION: The rate of operative treatment of paediatric proximal humerus fractures increased over time between 2000 and 2012. Increased surgical rates were independently associated with older age, increased ICISS, treatment at a non-children's hospital and non-Medicaid insurance status. Further study is needed to provide evidence to support improved outcomes after operative treatment of paediatric proximal humerus fractures.

4.
J Child Orthop ; 11(3): 201-209, 2017 Jun 01.
Article En | MEDLINE | ID: mdl-28828064

PURPOSE: Forearm fractures are one of the most commonly sustained injuries in children and are often treated non-operatively. The purpose of this study was to estimate the rate of inpatient surgical treatment of paediatric forearm fractures over time using a large, publicly available, national database. METHODS: The Healthcare Cost and Utilization Project (HCUP) Kids' Inpatient Database (KID) was evaluated between 2000 and 2012. Forearm fractures and surgeries were identified using International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9 CM) diagnosis and procedure codes. Univariable and multivariable logistic regression were used to determine variables associated with a greater proportion of surgical treatment. All statistical analyses were performed using SAS statistical software v.9.4 (SAS Institute Inc., Cary, NC, USA). Statistical significance was set at p < 0.05. RESULTS: The database identified 30 936 forearm fracture admissions. Overall, 19 837 of these patients were treated surgically (64.12%). The percentage of patients treated with surgery increased from 59.3% in 2000 to 70.0% in 2012 (p < 0.001). Multivariable regression analysis found increased age (p < 0.001), more recent year (p < 0.001), male gender (p = 0.003) and admission to a children's hospital (p < 0.001) were associated with an increased proportion of patients receiving surgical treatment. Medicaid payer status was associated with a lower proportion of surgical treatment (p < 0.001). CONCLUSIONS: The rate of operative treatment for paediatric forearm fractures admitted to the hospital increased over time. Increased surgical rates were associated with older age, male gender, treatment at a children's hospital and non-Medicaid insurance status.

5.
Food Chem Toxicol ; 44(10): 1636-50, 2006 Oct.
Article En | MEDLINE | ID: mdl-16891049

The European Food Safety Authority (EFSA) and the World Health Organization (WHO), with the support of the International Life Sciences Institute, European Branch (ILSI Europe), organized an international conference on 16-18 November 2005 to discuss how regulatory and advisory bodies evaluate the potential risks of the presence in food of substances that are both genotoxic and carcinogenic. The objectives of the conference were to discuss the possible approaches for risk assessment of such substances, how the approaches may be interpreted and whether they meet the needs of risk managers. ALARA (as low as reasonably achievable) provides advice based solely on hazard identification and does not take into account either potency or human exposure. The use of quantitative low-dose extrapolation of dose-response data from an animal bioassay raises numerous scientific uncertainties related to the selection of mathematical models and extrapolation down to levels of human exposure. There was consensus that the margin of exposure (MOE) was the preferred approach because it is based on the available animal dose-response data, without extrapolation, and on human exposures. The MOE can be used for prioritisation of risk management actions but the conference recognised that it is difficult to interpret it in terms of health risk.
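
As a rough illustration of the MOE arithmetic favoured at the conference: the reference point from the animal dose-response data (for example a BMDL10) is divided by the estimated human exposure, with no low-dose extrapolation. The numbers below are invented, and the 10,000 cut-off reflects EFSA's subsequently published view that an MOE of that size, based on a BMDL10, is of low concern for prioritisation purposes.

    # Worked example of a margin-of-exposure (MOE) calculation; all values illustrative.
    bmdl10_mg_per_kg_bw_day = 0.3              # hypothetical BMDL10 from a rodent bioassay
    human_exposure_mg_per_kg_bw_day = 2.5e-5   # hypothetical dietary intake estimate

    moe = bmdl10_mg_per_kg_bw_day / human_exposure_mg_per_kg_bw_day
    print(f"MOE = {moe:,.0f}")                 # 12,000 in this example

    # An MOE is used to prioritise risk management actions rather than to state a health risk.
    print("low priority" if moe >= 10_000 else "higher priority for risk management")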


Carcinogens/toxicity , Food/standards , Mutagens/toxicity , Animals , Carcinogenicity Tests , Europe , Foodborne Diseases/etiology , Foodborne Diseases/genetics , Humans , Mutagenicity Tests , Risk Assessment , World Health Organization
6.
Toxicol Sci ; 86(2): 226-30, 2005 Aug.
Article En | MEDLINE | ID: mdl-15829616

The concept that "safe levels of exposure" for humans can be identified for individual chemicals is central to the risk assessment of compounds with known toxicological profiles. The Threshold of Toxicological Concern (TTC) refers to the establishment of a level of exposure for all chemicals, whether or not there are chemical-specific toxicity data, below which there would be no appreciable risk to human health. The concept proposes that a low level of exposure with a negligible risk can be identified for many chemicals, including those of unknown toxicity, based on knowledge of their chemical structures. The present paper aims to describe the history of the TTC principle, its use to date, its potential future applications and the incorporation of the TTC principle in the risk assessment paradigm.


Food Contamination , Risk Assessment/methods , Animals , Decision Trees , Food Additives/toxicity , Food Packaging , History, 20th Century , History, 21st Century , Humans , No-Observed-Adverse-Effect Level , Risk Assessment/history , Risk Management , Structure-Activity Relationship
7.
Food Chem Toxicol ; 42(1): 65-83, 2004 Jan.
Article En | MEDLINE | ID: mdl-14630131

The threshold of toxicological concern (TTC) is a pragmatic risk assessment tool that is based on the principle of establishing a human exposure threshold value for all chemicals, below which there is a very low probability of an appreciable risk to human health. The concept that there are levels of exposure that do not cause adverse effects is inherent in setting acceptable daily intakes (ADIs) for chemicals with known toxicological profiles. The TTC principle extends this concept by proposing that a de minimis value can be identified for many chemicals, in the absence of a full toxicity database, based on their chemical structures and the known toxicity of chemicals which share similar structural characteristics. The establishment and application of widely accepted TTC values would benefit consumers, industry and regulators. By avoiding unnecessary toxicity testing and safety evaluations when human intakes are below such a threshold, application of the TTC approach would focus limited resources of time, cost, animal use and expertise on the testing and evaluation of substances with the greatest potential to pose risks to human health and thereby contribute to a reduction in the use of animals. An Expert Group of the European branch of the International Life Sciences Institute (ILSI Europe) has examined the TTC principle for its wider applicability in food safety evaluation. The Expert Group examined metabolism and accumulation, structural alerts, endocrine disrupting chemicals and specific endpoints, such as neurotoxicity, teratogenicity, developmental toxicity, allergenicity and immunotoxicity, and determined whether such properties or endpoints had to be taken into consideration specifically in a step-wise approach. The Expert Group concluded that the TTC principle can be applied for low concentrations in food of chemicals that lack toxicity data, provided that there is a sound intake estimate. The use of a decision tree to apply the TTC principle is proposed, and this paper describes the step-wise process in detail. Proteins, heavy metals and polyhalogenated dibenzodioxins and related compounds were excluded from this approach. When assessing a chemical, a review of prior knowledge and context of use should always precede the use of the TTC decision tree. The initial step is the identification and evaluation of possible genotoxic and/or high potency carcinogens. Following this step, non-genotoxic substances are evaluated in a sequence of steps related to the concerns that would be associated with increasing intakes. For organophosphates (OPs) a TTC of 18 µg per person per day (0.3 µg/kg bw/day) is proposed, and when the compound is not an OP, the TTC values for the Cramer structural classes III, II and I, with their respective TTC levels (90, 540 and 1800 µg per person per day; or 1.5, 9 and 30 µg/kg bw/day), would be applied sequentially. All other endpoints or properties were shown to have a distribution of no observed effect levels (NOELs) similar to the distribution of NOELs for general toxicity endpoints in Cramer classes I, II and III. The document was discussed with a wider audience during a workshop held in March 2003 (see list of workshop participants).
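
As a rough sketch only, not a published tool, the stepwise logic of the proposed decision tree could be expressed as follows, using the per-person TTC values quoted above; the exclusions and the genotoxicity check come before the Cramer-class comparison, and the function and argument names are invented for the example.

    # Illustrative sketch of the stepwise TTC logic described in the abstract.
    TTC_UG_PER_DAY = {"OP": 18, "III": 90, "II": 540, "I": 1800}

    def ttc_evaluation(intake_ug_per_day, *, excluded_group=False,
                       genotoxic_alert=False, organophosphate=False,
                       cramer_class="III"):
        """Return a coarse TTC verdict for a chemical lacking toxicity data."""
        if excluded_group:       # proteins, heavy metals, dioxin-like compounds
            return "TTC approach not applicable; compound-specific data required"
        if genotoxic_alert:      # structural alert for genotoxicity/high-potency carcinogenicity
            return "possible genotoxic or high-potency carcinogen; handled in the initial step"
        threshold = TTC_UG_PER_DAY["OP" if organophosphate else cramer_class]
        if intake_ug_per_day <= threshold:
            return f"intake below the TTC of {threshold} ug/person/day; low concern"
        return "intake above the TTC; compound-specific risk assessment needed"

    print(ttc_evaluation(40, cramer_class="III"))  # 40 ug/day against the 90 ug/day threshold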


Diet , Food/toxicity , Structure-Activity Relationship , Animals , Carcinogens/toxicity , Decision Trees , Endocrine Glands/drug effects , Food Hypersensitivity , Humans , Metabolism , No-Observed-Adverse-Effect Level , Pharmacokinetics , Risk Assessment
8.
Food Chem Toxicol ; 41(12): 1625-49, 2003 Dec.
Article En | MEDLINE | ID: mdl-14563389

There is growing interest among both consumers and industry in the development of food products with 'functional' properties, or health benefits. These products may take the form of dietary supplements or of foods. The health benefits are conferred by particular ingredients, and in many cases these are derived from botanicals. The variety of plants providing these functions is large, ranging from staple food sources such as cereals, fruits and vegetables, to herbals as used in traditional medicine. The food or ingredient conferring health properties may consist of the plants themselves, extracts thereof, or more purified components. The scientific literature is abundant with articles not only on the beneficial properties, but also on possible adverse health effects of plants and their components. The present report discusses the data required to determine the safe use of these types of ingredients, and provides advice on the development of risk assessment strategies consistent with due diligence under existing food regulations. Product specifications, composition and characterisation of standardised and authentic materials, documented history of use and comparison to existing products (taking into account the effect of industrial processing), description of the intended use and consequent exposure are highlighted as key background information on which to base a risk evaluation. The extent of experimental investigation required, such as in vitro, animal, and/or human studies, depends on the adequacy of this information. A decision tree is presented as an aid to determine the extent of data requirements based on product comparison. The ultimate safety in use depends on the establishment of an adequate safety margin between expected exposure and identified potential hazards. Health hazards may arise from inherent toxicities or contaminants of the plant materials, or from the mechanism of the intended beneficial effect itself. A lower safety margin may therefore be expected than for food ingredients or additives where no physiological effects are intended. In rare cases, post-launch monitoring programmes may be envisaged to confirm expected exposures and adequacy of the safety margin. This guidance document was elaborated by an expert group of the Natural Toxin Task Force of the European Branch of the International Life Sciences Institute (ILSI Europe) and discussed with a wider audience of scientists at a workshop held on 13-15 May 2002 in Marseille, France.


Dietary Supplements/adverse effects , Food Additives/adverse effects , Plant Preparations/adverse effects , Animals , Decision Trees , Diet , Dietary Supplements/standards , Food Additives/standards , Food Industry/standards , Humans , Plant Preparations/standards , Risk Assessment
10.
Food Chem Toxicol ; 40(2-3): 145-91, 2002.
Article En | MEDLINE | ID: mdl-11893397

This paper is one of several prepared under the project "Food Safety In Europe: Risk Assessment of Chemicals in Food and Diet" (FOSIE), a European Commission Concerted Action Programme, organised by the International Life Sciences Institute, Europe (ILSI). The aim of the FOSIE project is to review the current state of the science of risk assessment of chemicals in food and diet, by consideration of the four stages of risk assessment, that is, hazard identification, hazard characterisation, exposure assessment and risk characterisation. The contribution of animal-based methods in toxicology to hazard identification of chemicals in food and diet is discussed. The importance of first applying existing technical and chemical knowledge to the design of safety testing programs for food chemicals is emphasised. There is consideration of the presently available and commonly used toxicity testing approaches and methodologies, including acute and repeated dose toxicity, reproductive and developmental toxicity, neurotoxicity, genotoxicity, carcinogenicity, immunotoxicity and food allergy. They are considered from the perspective of whether they are appropriate for assessing food chemicals and whether they are adequate to detect currently known or anticipated hazards from food. Gaps in knowledge and future research needs are identified; research on these could lead to improvements in the methods of hazard identification for food chemicals. The potential impact of some emerging techniques and toxicological issues on hazard identification for food chemicals, such as new measurement techniques, the use of transgenic animals, assessment of hormone balance and the possibilities for conducting studies in which common human diseases have been modelled, is also considered.


Environmental Exposure/adverse effects , Food Analysis , Food Contamination/prevention & control , Hazardous Substances/toxicity , Models, Animal , Toxicology/methods , Animals , Food , Food Contamination/analysis , Foodborne Diseases/prevention & control , Humans , No-Observed-Adverse-Effect Level , Risk Assessment , Risk Management , Safety
11.
Food Chem Toxicol ; 40(2-3): 193-236, 2002.
Article En | MEDLINE | ID: mdl-11893398

In vitro methods are common and widely used for screening and ranking chemicals, and have also been taken into account sporadically for risk assessment purposes in the case of food additives. However, the range of food-associated compounds amenable to in vitro toxicology is considered much broader, comprising not only natural ingredients, including those from food preparation, but also compounds formed endogenously after exposure, permissible/authorised chemicals including additives, residues, supplements, chemicals from processing and packaging and contaminants. A major promise of in vitro systems is to obtain mechanism-derived information that is considered pivotal for adequate risk assessment. This paper critically reviews the entire process of risk assessment by in vitro toxicology, encompassing ongoing and future developments, with major emphasis on cytotoxicity, cellular responses, toxicokinetics, modelling, metabolism, cancer-related endpoints, developmental toxicity, prediction of allergenicity, and finally, development and application of biomarkers. It describes in depth the use of in vitro methods in strategies for characterising and predicting hazards to the human. Major weaknesses and strengths of these assay systems are addressed, together with some key issues concerning major research priorities to improve hazard identification and characterisation of food-associated chemicals.


Food Analysis/methods , Hazardous Substances/toxicity , Risk Assessment , Toxicology/methods , Animal Testing Alternatives , Animals , Biomarkers , Food Additives , Food Contamination , Food Handling , Food Packaging , Humans , In Vitro Techniques
12.
Food Chem Toxicol ; 40(2-3): 237-82, 2002.
Article En | MEDLINE | ID: mdl-11893399

Hazard characterisation of low molecular weight chemicals in food and diet generally uses a no-observed-adverse-effect level (NOAEL) or a benchmark dose as the starting point. For hazards that are considered not to have thresholds for their mode of action, low-dose extrapolation and other modelling approaches may be applied. The default position is that rodents are good models for humans. However, some chemicals cause species-specific toxicity syndromes. Information on quantitative species differences is used to modify the default uncertainty factors applied to extrapolate from experimental animals to humans. A central theme for extrapolation is unravelling the mode of action for the critical effects observed. Food can be considered an extremely complex and variable chemical mixture. Interactions among low molecular weight chemicals are expected to be rare given that the exposure levels generally are far below their NOAELs. Hazard characterisation of micronutrients must consider that adverse effects may arise from intakes that are too low (deficiency) as well as too high (toxicity). Interactions between different nutrients may complicate such hazard characterisations. The principle of substantial equivalence can be applied to guide the hazard identification and hazard characterisation of macronutrients and whole foods. Macronutrients and whole foods must be evaluated on a case-by-case basis and cannot follow a routine assessment protocol.
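
A small worked example of the default starting-point arithmetic referred to above: a NOAEL from an animal study is divided by the conventional 10 x 10 uncertainty factors (interspecies and interindividual) to derive a health-based guidance value such as an ADI. All numbers are illustrative.

    # Default NOAEL/uncertainty-factor arithmetic; values invented for illustration.
    noael_mg_per_kg_bw_day = 5.0   # hypothetical NOAEL from a rodent study
    uf_interspecies = 10           # default animal-to-human extrapolation factor
    uf_intraspecies = 10           # default human variability factor

    adi = noael_mg_per_kg_bw_day / (uf_interspecies * uf_intraspecies)
    print(f"ADI = {adi} mg/kg bw/day")   # 0.05 mg/kg bw/day in this example

    # Chemical-specific data on mode of action or kinetics can replace parts of these
    # default factors, as discussed in the paper for species-specific effects.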


Hazardous Substances/toxicity , Micronutrients/adverse effects , Animals , Dose-Response Relationship, Drug , European Union , Hazardous Substances/administration & dosage , Humans , Micronutrients/administration & dosage , Models, Animal , Molecular Weight , No-Observed-Adverse-Effect Level , Risk Assessment/methods , Rodentia , Species Specificity
13.
Food Chem Toxicol ; 40(2-3): 283-326, 2002.
Article En | MEDLINE | ID: mdl-11893400

The present review reports on the mathematical methods and statistical techniques presently available for hazard characterisation. The state of the art of mathematical modelling and quantitative methods used currently for regulatory decision-making in Europe and additional potential methods for risk assessment of chemicals in food and diet are described. Existing practices of JECFA, FDA, EPA, etc., are examined for their similarities and differences. A framework is established for the development of new and improved quantitative methodologies. Areas for refinement, improvement and increase of efficiency of each method are identified in a gap analysis. Based on this critical evaluation, needs for future research are defined. It is concluded from our work that mathematical modelling of the dose-response relationship would improve the risk assessment process. An adequate characterisation of the dose-response relationship by mathematical modelling clearly requires the use of a sufficient number of dose groups to achieve a range of different response levels. This need not necessarily lead to an increase in the total number of animals in the study if an appropriate design is used. Chemical-specific data relating to the mode or mechanism of action and/or the toxicokinetics of the chemical should be used for dose-response characterisation whenever possible. It is concluded that a single method of hazard characterisation would not be suitable for all kinds of risk assessments, and that a range of different approaches is necessary so that the method used is the most appropriate for the data available and for the risk characterisation issue. Future refinements to dose-response characterisation should incorporate more clearly the extent of uncertainty and variability in the resulting output.
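
As a rough illustration of the dose-response modelling the paper advocates, the sketch below fits a simple log-logistic curve to invented quantal bioassay data and reads off a benchmark dose at 10% extra risk; the model form, data and numerical choices are assumptions, not a regulatory benchmark-dose procedure.

    # Toy benchmark-dose calculation on invented quantal data.
    import numpy as np
    from scipy.optimize import curve_fit, brentq

    doses     = np.array([0.0, 10.0, 30.0, 100.0, 300.0])  # mg/kg bw/day
    affected  = np.array([1, 3, 8, 22, 41])                 # responding animals per group
    n_animals = np.array([50, 50, 50, 50, 50])
    frac = affected / n_animals

    def log_logistic(d, background, ec50, slope):
        """Background response plus a log-logistic increase with dose."""
        return background + (1 - background) / (1 + (ec50 / np.maximum(d, 1e-9)) ** slope)

    params, _ = curve_fit(log_logistic, doses, frac, p0=[0.02, 100.0, 1.0],
                          bounds=([0.0, 1.0, 0.1], [0.5, 1e4, 10.0]))
    bg = params[0]

    # Benchmark dose: the dose giving 10% extra risk over the fitted background.
    bmr = bg + 0.10 * (1 - bg)
    bmd10 = brentq(lambda d: log_logistic(d, *params) - bmr, 1e-3, 1e4)
    print(f"BMD10 ~ {bmd10:.1f} mg/kg bw/day")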


Hazardous Substances/toxicity , Models, Theoretical , Animals , Decision Making , Dose-Response Relationship, Drug , European Union , Hazardous Substances/pharmacokinetics , Humans , Models, Animal , Risk Assessment/methods , Structure-Activity Relationship , Threshold Limit Values
14.
Food Chem Toxicol ; 40(2-3): 327-85, 2002.
Article En | MEDLINE | ID: mdl-11893401

Exposure assessment is one of the key parts of the risk assessment process. Only intake of toxicologically significant amounts can lead to adverse health effects even for a relatively toxic substance. In the case of chemicals in foods this is based on three major aspects: (i) how to determine quantitatively the presence of a chemical in individual foods and diets, including its fate during the processes within the food production chain; (ii) how to determine the consumption patterns of the individual foods containing the relevant chemicals; (iii) how to integrate both the likelihood of consumers eating large amounts of the given foods and of the relevant chemical being present in these foods at high levels. The techniques used for the evaluation of these three aspects have been critically reviewed in this paper to determine those areas where the current approaches provide a solid basis for assessments and those areas where improvements are needed or desirable. For those latter areas, options for improvements are being suggested, including, for example, the development of a pan-European food composition database, activities to better understand the effects of processing on individual food chemicals, harmonisation of food consumption survey methods with the option of a regular pan-European survey, evaluation of probabilistic models and the development of models to assess exposure to food allergens. In all three areas, the limitations of the approaches currently used lead to uncertainties which can either cause an over- or underestimation of real intakes and thus risks. Given these imprecisions, risk assessors tend to build in additional uncertainty factors to avoid health-relevant underestimates. This is partly done by using screening methods designed to look for "worst case" situations. Such worst-case assumptions lead to intake estimates that are higher than reality. These screening methods are used to screen all those chemicals with a safe intake distribution. For chemicals with a potential risk, more information is needed to allow more refined screening or even the most accurate estimation. More information and more refined methods, however, require more resources. The ultimate aims are: (1) to obtain appropriate estimations for the presence and quantity of a given chemical in a food and in the diet in general; (2) to assess the consumption patterns for the foods containing these substances, including especially those parts of the population with high consumption and thus potentially high intakes; and (3) to develop and apply tools to predict reliably the likelihood of high-end consumption with the presence of high levels of the relevant substances. It has thus been demonstrated that a tiered approach at all three steps can be helpful to optimise the use of the available resources: if relatively crude tools - designed to provide a "worst case" estimate - do not suggest a toxicologically significant exposure (or a relevant deficit of a particular nutrient), it may not be necessary to use more sophisticated tools. These will be needed if initially high intakes are indicated for at least parts of the population. Existing pragmatic approaches are a first crude step to model food chemical intake. It is recommended to extend, refine and validate this approach in the near future. This has to result in a cost-effective exposure assessment system to be used for existing and potential categories of chemicals. This system of knowledge (with information on sensitivities, accuracy, etc.) will guide future data collection.
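
As a rough illustration of the probabilistic ("Monte Carlo") intake modelling recommended above for evaluation, the sketch below combines an assumed food-consumption distribution with an assumed concentration distribution and reports upper intake percentiles for high-end consumers; every distribution and parameter is invented for the example.

    # Toy probabilistic exposure assessment for one chemical in one food.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    consumption_g_per_day = rng.lognormal(mean=np.log(50), sigma=0.6, size=n)     # food eaten
    concentration_mg_per_kg = rng.lognormal(mean=np.log(0.2), sigma=0.8, size=n)  # chemical in food

    intake_mg_per_day = consumption_g_per_day / 1000 * concentration_mg_per_kg
    body_weight_kg = 60.0
    intake_ug_per_kg_bw = intake_mg_per_day * 1000 / body_weight_kg

    # High-end consumers drive the risk characterisation, so report upper percentiles.
    p50, p95, p99 = np.percentile(intake_ug_per_kg_bw, [50, 95, 99])
    print(f"median {p50:.3f}, P95 {p95:.3f}, P99 {p99:.3f} ug/kg bw/day")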


Food Contamination/analysis , Hazardous Substances/toxicity , Animals , Diet , Diet Surveys , Eating , European Union , Feeding Behavior , Food Analysis , Food Chain , Hazardous Substances/administration & dosage , Humans , Risk Assessment/methods
15.
Food Chem Toxicol ; 40(2-3): 387-424, 2002.
Article En | MEDLINE | ID: mdl-11893402

Epidemiologic studies directly contribute data on risk (or benefit) in humans as the investigated species, and in the full food intake range normally encountered by humans. This paper begins by introducing the epidemiologic approach, followed by a discussion of perceived differences between toxicological and epidemiologic risk assessment. Areas of contribution of epidemiology to the risk assessment process are identified, and ideas for tailoring epidemiologic studies to the risk assessment procedures are suggested, dealing with data collection, analyses and reporting of both existing and new epidemiologic studies. The dietary habits and subsequent disease occurrence of over three million people are currently under observation worldwide in cohort studies, offering great potential for use in risk assessment. The use of biomarkers and data on genetic susceptibility is discussed. The paper describes a scheme to classify epidemiologic studies for use in risk assessment, and deals with combining evidence from multiple studies. Using a matrix approach, the potential contribution to each of the steps in the risk assessment process is evaluated for categories of food substances. The contribution to risk assessment of specific food substances depends on the quality of the exposure information. Strengths and weaknesses are summarized. It is concluded that epidemiology can contribute significantly to hazard identification, hazard characterisation and exposure assessment.


Epidemiologic Studies , Food Contamination/analysis , Hazardous Substances/toxicity , Risk Assessment/methods , Biomarkers , Epidemiologic Methods , Feeding Behavior , Humans , Toxicology/methods
16.
Brain Cogn ; 49(2): 194-8, 2002 Jul.
Article En | MEDLINE | ID: mdl-15259387

The purpose of this study was to compare the neuropsychological functioning of 12 veterans who were HIV-positive to 21 age-matched veterans who were HIV-negative. Consistent with expectations, the HIV-positive group was found to perform more poorly in areas related to attention and concentration, immediate and delayed verbal recall, immediate and delayed visual recall, visual learning, and tasks requiring psychomotor speed, while a number of language tasks were left intact. This was similar to dysfunction often seen in HIV-related dementia cases. However, this group was also significantly more impaired in confrontation naming, planning, mental calculations, and abstract thought when compared to the HIV-negative group. Comorbid substance abuse found in the majority of our HIV-positive subjects was thought to contribute to the HIV-related dysfunction.


AIDS Dementia Complex/complications , Cognition Disorders/etiology , HIV Infections/complications , Memory Disorders/etiology , Substance-Related Disorders/complications , Veterans/classification , AIDS Dementia Complex/psychology , Adult , Analysis of Variance , Attention , Cognition Disorders/ethnology , Female , HIV Infections/psychology , Humans , Male , Matched-Pair Analysis , Middle Aged , Neuropsychological Tests , New York , Reference Values , Verbal Learning , Veterans/psychology , Veterans/statistics & numerical data
17.
Brain Cogn ; 49(2): 216-20, 2002 Jul.
Article En | MEDLINE | ID: mdl-15259394

We compared the verbal learning and visuomotor attention of 34 Alzheimer's patients and 18 depressive patients. Verbal learning was assessed using the Hopkins Verbal Learning Test-Revised (HVLT-R); visuomotor attention was assessed using the Trail Making Test (TMT). The Alzheimer's patients had significantly lower scores on immediate and delayed recall of a word list. There was a nonsignificant trend in this group toward fewer true positives and a greater number of false positives. Alzheimer's patients were significantly slower on Trails A, with a nonsignificant trend toward slower performance on Trails B. No difference was observed in accuracy of attentional processing. The results are discussed in terms of other factors, such as stage of cognitive decline, which might have influenced the findings.


Alzheimer Disease/physiopathology , Attention , Depressive Disorder/physiopathology , Factitious Disorders/physiopathology , Memory , Verbal Learning , Aged , Aged, 80 and over , Analysis of Variance , Female , Fixation, Ocular , Geriatric Assessment , Humans , Male , Reaction Time
18.
Am J Physiol Regul Integr Comp Physiol ; 280(5): R1434-9, 2001 May.
Article En | MEDLINE | ID: mdl-11294765

The paraventricular nucleus of the hypothalamus (PVH) occupies a pivotal point within the network of brain nuclei coordinating critical host-defense responses. In mice, T cell-dependent immune stimuli, including the bacterial superantigen staphylococcal enterotoxin B (SEB), can activate the PVH. To determine whether T cell-dependent immune stimuli activate the PVH in rats, we assessed plasma corticosterone (Cort) levels, fever responses, and c-Fos expression in the PVH in animals treated with intraperitoneal injections of SEB. In animals with previously implanted abdominal thermistors, intraperitoneal injection of 1 mg/kg SEB resulted in a significant rise in body temperature, with a latency of 3.5-4 h. In separate animals, intraperitoneal injection of 1 mg/kg SEB resulted in a significant elevation of plasma Cort and induced c-Fos expression in parvocellular neurons within the PVH. These results support the idea that T cell-dependent immune stimuli activate brain pathways mediating host-defense responses such as fever and neuroendocrine changes.


Corticosterone/blood , Enterotoxins/pharmacology , Fever/physiopathology , Neurons/physiology , Paraventricular Hypothalamic Nucleus/physiology , Proto-Oncogene Proteins c-fos/metabolism , Animals , Body Temperature/drug effects , Body Temperature Regulation/drug effects , Enterotoxins/administration & dosage , Fever/chemically induced , Injections, Intraperitoneal , Male , Mice , Neurons/drug effects , Paraventricular Hypothalamic Nucleus/drug effects , Paraventricular Hypothalamic Nucleus/physiopathology , Rats , Rats, Sprague-Dawley , Staphylococcus aureus , Superantigens/pharmacology
20.
Ann Nutr Metab ; 45(6): 235-54, 2001.
Article En | MEDLINE | ID: mdl-11786646

Recombinant DNA techniques are capable of introducing genetic changes into food organisms that are more predictable than those introduced through conventional breeding techniques. This review discusses whether the consumption of DNA in approved novel foods and novel food ingredients derived from genetically modified organisms (GMOs) can be regarded as being as safe as the consumption of DNA in existing foods. It concludes that DNA from GMOs is equivalent to DNA from existing food organisms that has always been consumed with human diets. Any risks associated with the consumption of DNA will remain, irrespective of its origin, because the body handles all DNA in the same way. The breakdown of DNA during food processing and passage through the gastrointestinal tract reduces the likelihood that intact genes capable of encoding foreign proteins will be transferred to gut microflora. The review does not specifically address food safety issues arising from the consumption of viable genetically modified microorganisms but it shows that the likelihood of transfer and functional integration of DNA from ingested food by gut microflora and/or human cells is minimal. Information reviewed does not indicate any safety concerns associated with the ingestion of DNA per se from GMOs resulting from the use of currently available recombinant DNA techniques in the food chain.


DNA/administration & dosage , Food, Genetically Modified , Consumer Product Safety , DNA/chemistry , DNA/pharmacokinetics , DNA/physiology , Digestion , Food Microbiology , Food Technology/standards , Food, Genetically Modified/adverse effects , Food, Genetically Modified/standards , Gene Transfer, Horizontal , Genetic Engineering , Humans , Structure-Activity Relationship
...