Results 1 - 20 of 498
1.
Rev Clin Esp (Barc) ; 223(1): 60-61, 2023 01.
Article in English | MEDLINE | ID: mdl-36372381
2.
Rev Clin Esp (Barc) ; 222(8): 504-505, 2022 10.
Article in English | MEDLINE | ID: mdl-35750596
3.
Early Interv Psychiatry ; 15(5): 1395-1408, 2021 10.
Article in English | MEDLINE | ID: mdl-33283472

ABSTRACT

AIM: Pennsylvania (PA) first-episode psychosis (FEP) program evaluation is a statewide initiative, supported by the PA Office of Mental Health and Substance Abuse Services (PA-OMHSAS) and administered by the PA Early Intervention Center/Heads Up, which evaluates fidelity and outcomes of PA Coordinated Specialty Care (CSC) programs. Programs participate in standard computerized measures of CSC outcomes using centralized informatics. The aims of the current report are to describe implementation of this core battery for program evaluation in PA and to present 6- and 12-month outcomes. METHODS: Participants (n = 697) from nine PA CSC programs completed the core battery at admission. The battery was re-administered at 6- and 12-month follow-up, and data were analysed for individuals (n = 230) who had completed 12 months of treatment. Domains assessed via clinician report and/or self-report included symptoms, role and social functioning, self-perceived recovery, and service utilization. RESULTS: PA FEP CSC participants showed improvement over time in several domains, including decreased symptoms, higher role and social functioning, decreased hospitalizations, and improved self-perception of recovery, quality of life, and satisfaction with services. Trends towards improvement were observed for participant happiness, hopelessness, and school enrolment. Nearly all improvements were observed at 6-month follow-up, with earlier gains maintained at 12 months. CONCLUSIONS: PA FEP CSC programs demonstrate the ability to assess and improve critical outcomes of coordinated specialty care in PA. Improved outcomes by 12 months in treatment provide evidence of an effective treatment model and support the continuation of these programs in pursuit of the goal of reducing the disease burden of schizophrenia on individuals and society.


Subjects
Psychotic Disorders, Schizophrenia, Humans, Pennsylvania, Program Evaluation, Psychotic Disorders/diagnosis, Psychotic Disorders/therapy, Quality of Life
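
The abstract above does not specify the statistical methods used for the 6- and 12-month comparisons. Purely as an illustrative sketch, a paired pre/post comparison of one outcome for the 230 completers could be set up in Python as follows; all column names and values below are hypothetical and not from the PA FEP dataset:

```python
# Illustrative sketch only: paired comparison of a hypothetical symptom score
# at admission vs. 12-month follow-up. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 230  # completers with 12-month data, per the abstract

# Hypothetical symptom scores at admission and at 12 months
symptoms_admission = rng.normal(loc=60, scale=12, size=n)
symptoms_12mo = symptoms_admission - rng.normal(loc=10, scale=8, size=n)

# Paired t-test (one of several reasonable choices for pre/post data)
t_stat, p_value = stats.ttest_rel(symptoms_admission, symptoms_12mo)
mean_change = float(np.mean(symptoms_12mo - symptoms_admission))
print(f"mean change = {mean_change:.1f}, t = {t_stat:.2f}, p = {p_value:.3g}")
```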
4.
Rev Clin Esp (Barc) ; 220(2): 115-116, 2020 Mar.
Article in English, Spanish | MEDLINE | ID: mdl-32063263
5.
Rev Clin Esp ; 220(2): 115-116, 2020 Mar.
Article in Spanish | MEDLINE | ID: mdl-34170983
6.
Aust Vet J ; 97(11): 465-472, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31418855

ABSTRACT

OBJECTIVE: To monitor cobalt concentrations in urine, red blood cells and plasma after chronic parenteral administration of cobalt chloride, and to evaluate these results against the current International Federation of Horseracing Authorities thresholds for detecting cobalt misuse. DESIGN: Eight mares were randomly assigned to four treatment groups, with two mares in each group: Group 1 - control; Group 2 - 25 milligrams cobalt intravenously as CoCl2 weekly; Group 3 - 50 milligrams cobalt intravenously as CoCl2 weekly; and Group 4 - 25 milligrams cobalt intravenously mid-week and at the end of the week. Urine and blood samples were collected before each weekly administration so that trough levels could be assessed. In the group receiving two doses per week, urine and blood were collected prior to the dose given at the end of each week. Samples were collected at time zero and then weekly for 10 weeks, with three further collections of urine and blood at days 81, 106 and 127. METHODS: Urine creatinine measurements, used to assess horse hydration status, were performed by the Jaffe reaction method. Cobalt determinations in plasma, blood and urine were performed by inductively coupled plasma-mass spectrometry. Haematocrit measurements, used to calculate red cell cobalt levels, were performed using a microhaematocrit centrifuge. Statistical analyses were conducted in Genstat (v17, VSNi). RESULTS: Marked cobalt accumulation was evident, with increasing cobalt concentrations in all sample matrices in specimens collected immediately prior to cobalt administration. Correlation between the sample matrices improved when urine cobalt concentration was adjusted for creatinine level. Red cell cobalt levels remained elevated for at least 12 weeks after cessation of administration, consistent with the lifespan of the red cell. There was no significant change in haematocrit for the duration of the study. CONCLUSION: The current urine cobalt threshold was effective only at detecting acute cobalt exposure, whereas the plasma cobalt threshold consistently identified chronic high-level cobalt exposure and potential cobalt misuse. The threshold values legislated for urine cobalt do not correlate with those set for plasma. The acute nature of urinary cobalt excretion provides a relatively small window through which cobalt administration can be detected. Plasma and red cell cobalt concentrations provide a clearer picture of potential cobalt misuse.


Subjects
Cobalt/blood, Cobalt/urine, Creatinine/urine, Horses/urine, Animals, Cobalt/administration & dosage, Cobalt/standards, Female, New South Wales, Plasma/chemistry, Sports
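
The abstract above refers to two derived quantities: urine cobalt adjusted for creatinine (to correct for hydration status) and red cell cobalt calculated using the haematocrit. The exact formulas are not given in the abstract; the sketch below shows one standard way these could be computed, assuming whole blood cobalt is a haematocrit-weighted mixture of red cell and plasma cobalt, with all numbers made up for demonstration:

```python
# Illustrative sketch only: the abstract does not give its formulas, so the
# partitioning below is the standard mass-balance assumption and all values
# are hypothetical.

def creatinine_adjusted_cobalt(urine_co_ug_per_l, urine_creatinine_mmol_per_l):
    """Express urine cobalt per unit creatinine to correct for hydration status."""
    return urine_co_ug_per_l / urine_creatinine_mmol_per_l  # ug cobalt per mmol creatinine

def red_cell_cobalt(whole_blood_co_ug_per_l, plasma_co_ug_per_l, haematocrit):
    """Back-calculate red cell cobalt, assuming whole blood is a
    haematocrit-weighted mixture of red cells and plasma."""
    return (whole_blood_co_ug_per_l - (1.0 - haematocrit) * plasma_co_ug_per_l) / haematocrit

# Example with hypothetical values
print(creatinine_adjusted_cobalt(25.0, 10.0))  # 2.5 ug/mmol
print(red_cell_cobalt(4.0, 1.5, 0.35))         # about 8.6 ug/L in red cells
```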
16.
J Anim Sci ; 92(1): 282-91, 2014 Jan.
Article in English | MEDLINE | ID: mdl-24243896

ABSTRACT

Cattle obtain water primarily from the moisture in their feed and from drinking water. On pasture, the moisture content of the diet is determined by plant tissue water (internal water) and surface moisture (external water), which may include dew, guttation, and intercepted rain; both influence the drinking water requirement. This study investigated the relationship between daily drinking water intake (DWI, L/d) of steers on pasture (19 steers with mean initial BW of approximately 400 kg) and the soil and weather factors known to affect plant water status (dry matter content) and surface moisture formation and persistence. Daily records of weather conditions and DWI were obtained during 2 grazing seasons with contrasting spring, summer, and autumn rainfall patterns. Plant-available water in the soil (PAW, mm) was modeled from actual and potential evapotranspiration and the water-holding capacity of the soil. The DWI averaged over the herd varied among days from 0 to 29 L/d (grazing season mean 9.8 L/d). The DWI on both dry (<0.2 mm rainfall on the corresponding and previous days) and wet (>2 mm) days increased with increasing temperature (mean, maximum, and minimum), sunshine hours, and global radiation, and with decreasing relative humidity; the slopes and coefficients of determination were generally greater for wet days. Wind reduced DWI on wet days but had no effect on dry days. The DWI was reduced by up to 4.4 L/d on wet days compared with dry days, but DWI did not correlate with rainfall amount. Increasing PAW decreased DWI by more than 10 L/d on both dry and wet days. These results are all consistent with environmental effects on the water status (dry matter content) of pasture vegetation and canopy surface moisture, the associated effects on grazing-related water intake, and the corresponding compensating changes in DWI. Using the observed relationships with environmental factors, we derived a new model predicting DWI for any soil moisture condition, for both wet and dry days, which includes mean ambient temperature and relative humidity and explains virtually all variation in DWI not caused by random scatter among individual animals.


Subjects
Animal Husbandry, Drinking Behavior, Weather, Animals, Cattle, Germany, Male, Models, Biological, Rain, Seasons, Water/analysis
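
The final sentence of the abstract describes an empirical model predicting DWI from mean ambient temperature, relative humidity, soil moisture condition, and wet/dry-day status. The published model form and coefficients are not given in the abstract, so the sketch below only illustrates how such a model might be fitted by ordinary least squares to daily records, using a single wet-day indicator as a simplification; column names and data are hypothetical:

```python
# Illustrative sketch only: fits a simple linear model of daily drinking water
# intake (DWI) on weather and soil variables, in the spirit of the model the
# abstract describes. All data below are made up.
import numpy as np
import pandas as pd

# Hypothetical daily records: mean temperature (deg C), relative humidity (%),
# plant-available water (mm), wet-day indicator, and herd-mean DWI (L/d)
daily = pd.DataFrame({
    "temp_mean":    [12.0, 18.5, 22.1, 9.4, 25.3, 16.0],
    "rel_humidity": [85, 60, 55, 90, 45, 70],
    "paw_mm":       [60, 40, 20, 75, 10, 35],
    "wet_day":      [1, 0, 0, 1, 0, 1],
    "dwi_l_per_d":  [3.5, 10.2, 15.8, 1.9, 21.4, 7.0],
})

# Design matrix with an intercept, then ordinary least squares
X = np.column_stack([
    np.ones(len(daily)),
    daily["temp_mean"],
    daily["rel_humidity"],
    daily["paw_mm"],
    daily["wet_day"],
])
coef, *_ = np.linalg.lstsq(X, daily["dwi_l_per_d"].to_numpy(), rcond=None)
print(dict(zip(["intercept", "temp", "rh", "paw", "wet"], np.round(coef, 3))))
```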
17.
Transpl Infect Dis ; 15(1): E20-4, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23279826

ABSTRACT

We discuss a case of acute disseminated toxoplasmosis in a renal transplant recipient presenting with septic shock. Our literature review of disseminated toxoplasmosis presenting as septic shock reveals a disease process that is rapid and almost uniformly fatal. This unusual presentation warrants a high index of suspicion in transplant recipients and immediate administration of appropriate empiric antimicrobials.


Subjects
Kidney Transplantation/adverse effects, Septic Shock/diagnosis, Toxoplasma/isolation & purification, Toxoplasmosis/diagnosis, Black or African American, Fatal Outcome, Humans, Male, Middle Aged, Septic Shock/parasitology, Time Factors, Toxoplasmosis/etiology
19.
Skin Pharmacol Physiol ; 24(6): 312-21, 2011.
Article in English | MEDLINE | ID: mdl-21822032

ABSTRACT

Detection of the antioxidative capacity of the skin is of great practical relevance, since free radicals are involved in many skin-damaging processes, including aging and inflammation. The nitroxide TEMPO (2,2,6,6-tetramethyl-1-piperidinyloxyl), in combination with electron paramagnetic resonance spectroscopy, was found suitable for measuring the antioxidative capacity because it reacts rapidly with reducing agents. However, to achieve longer measurement times, e.g. in inflammatory skin diseases, the stabilizing effect of an invasome (ultraflexible vesicle/liposome) suspension containing TEMPO was investigated ex vivo on porcine skin and in vivo on human skin. Invasomes increased the measurement time ex vivo 2-fold, and the reduction was significantly slowed in vivo, which is attributed to membrane-associated and therefore protected TEMPO. Furthermore, TEMPO accumulation in the membrane phase, as well as the decreasing polarity of TEMPO's immediate surroundings during skin penetration, explains the stabilizing effect. Thus, an invasome suspension with TEMPO exhibits stabilizing effects both ex vivo and in vivo.


Subjects
Antioxidants/chemistry, Cyclic N-Oxides/chemistry, Electron Spin Resonance Spectroscopy/methods, Skin/metabolism, Adult, Humans, Middle Aged
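
In this approach, the antioxidative capacity is inferred from how quickly the TEMPO EPR signal decays in the skin, with invasomes slowing that decay. The abstract does not state the kinetic model used; the sketch below assumes simple first-order decay of the signal amplitude and fits a rate constant to made-up time points:

```python
# Illustrative sketch only: fits a first-order decay A(t) = A0 * exp(-k * t)
# to EPR signal amplitudes over time. A smaller k means slower TEMPO reduction
# (e.g., with invasomes). Data points are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def first_order_decay(t, a0, k):
    return a0 * np.exp(-k * t)

t_min = np.array([0, 5, 10, 15, 20, 30])                  # minutes after application
signal = np.array([1.00, 0.78, 0.61, 0.48, 0.37, 0.23])   # normalized EPR amplitude

(a0, k), _ = curve_fit(first_order_decay, t_min, signal, p0=(1.0, 0.05))
half_life = np.log(2) / k
print(f"rate constant k = {k:.3f} 1/min, half-life = {half_life:.1f} min")
```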
20.
Praxis (Bern 1994) ; 99(12): 705-14, 2010 Jun 09.
Article in German | MEDLINE | ID: mdl-20533230

ABSTRACT

Patients with acute heart failure usually present with dyspnoea and edema secondary to elevated intracardiac filling pressure resulting from volume overload. Despite significant progress in understanding heart failure, the treatment strategy for acute heart failure has not evolved at the same pace. Diuretics, especially loop diuretics, are the most common therapy used in this setting. Intravenous diuretics act acutely by exerting a modest vasodilatory response and chronically by reducing circulating blood volume. Despite the near-universal use of diuretics in patients hospitalized with acute heart failure, nearly half of these patients are discharged from hospital without weight loss. This could be due to inadequate diuresis, overdiuresis with subsequent volume replacement, or diuretic resistance. Aggressive diuresis carries a significant risk of electrolyte and volume depletion with subsequent arrhythmias, hypotension, and worsening renal function. Currently, scant data are available from randomized clinical trials to guide therapeutic choices with diuretics. Thus, the choice and dosing of diuretic therapy must be individualized, based on general knowledge of potency and on pharmacokinetic and pharmacodynamic considerations.


Subjects
Diuretics/therapeutic use, Heart Failure/drug therapy, Acute Disease, Algorithms, Combined Modality Therapy, Diagnosis, Differential, Diuretics/adverse effects, Dose-Response Relationship, Drug, Heart Failure/etiology, Heart Failure/physiopathology, Humans, Randomized Controlled Trials as Topic, Water-Electrolyte Balance/drug effects, Water-Electrolyte Balance/physiology