Results 1 - 10 of 10
2.
ANZ J Surg ; 91(11): 2436-2442, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34224192

ABSTRACT

BACKGROUND: To determine whether a bariatric surgical procedure is associated with a reduction in healthcare utilisation among patients with obesity and high pre-procedural healthcare needs. METHODS: Design: Retrospective cohort study. SETTING: Tertiary Victorian public hospital. PARTICIPANTS: Twenty-nine adults who underwent publicly funded primary bariatric surgery between 2008 and 2018 at the Alfred Hospital, Melbourne, and had high resource use over the year prior to surgery, defined as at least two of: ≥3 hospital admissions; ≥7 inpatient bed days for obesity-related co-morbidities; or inpatient hospital costs ≥$10 000. MAIN OUTCOME MEASURES: Change in inpatient and outpatient resource use. RESULTS: In the year following bariatric surgery, total hospital bed days decreased from 663 to 80, the median (Q1, Q3) bed days per patient decreased from 7 (4.5, 15) to 5 (2.25, 9.75) (p = 0.001), and the total number of hospital admissions fell from 118 to 67 (p < 0.001). The median annual cost of inpatient care decreased from $11 405 ($4408, $22 251) to $3974 ($0, $4325) (p < 0.001). The total and median number of outpatient attendances did not change significantly 12 months after bariatric surgery, but demand for outpatient services unrelated to bariatric surgery declined by a median of four visits per patient (p = 0.013). CONCLUSIONS: The evidence from this small pilot study suggests that bariatric surgery has the potential to decrease resource use and inpatient hospital costs over a 1-year time frame for obese patients with high resource use. These data will be used to design a prospective randomised controlled trial to provide more definitive information on this important issue.
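The before/after comparison reported here — per-patient medians with (Q1, Q3) and a paired nonparametric test — can be sketched in Python. The numbers below are invented for illustration only; they are not the study's data.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical per-patient inpatient bed days in the year before and the
# year after surgery (illustrative values, not the Alfred Hospital cohort).
before = np.array([7, 4, 15, 9, 12, 5, 8, 20, 6, 11])
after = np.array([5, 2, 9, 4, 10, 3, 6, 12, 2, 7])

def summarise(x):
    """Median and (Q1, Q3), the summary format used in the abstract."""
    return np.median(x), (np.quantile(x, 0.25), np.quantile(x, 0.75))

med_before, q_before = summarise(before)
med_after, q_after = summarise(after)

# Paired, skewed count data -> Wilcoxon signed-rank test on the pairs
stat, p = wilcoxon(before, after)
```

A paired test is the natural choice because each patient serves as their own control across the two periods.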


Subjects
Bariatric Surgery, Adult, Delivery of Health Care, Humans, Patient Acceptance of Health Care, Pilot Projects, Prospective Studies, Retrospective Studies
3.
Hum Genomics ; 12(1): 24, 2018 Apr 26.
Article in English | MEDLINE | ID: mdl-29695297

ABSTRACT

BACKGROUND: Genomic and biosocial research data about individuals is rapidly proliferating, opening up novel opportunities for data integration and use. The scale, pace and novelty of these applications raise a number of urgent sociotechnical, ethical and legal questions, including optimal methods of data storage, management and access. Although the open science movement advocates unfettered access to research data, many of the UK's longitudinal cohort studies operate systems of managed data access, in which access is governed by legal and ethical agreements between stewards of research datasets and researchers wishing to make use of them. Amongst other things, these agreements aim to respect the reasonable expectations of the research participants who provided data and samples, as expressed in the consent process. Arguably, responsible data management and governance of data and sample use are foundational to the consent process in longitudinal studies and are an important source of trustworthiness in the eyes of those who contribute data to genomic and biosocial research. METHODS: This paper presents an ethnographic case study exploring the foundational principles of a governance infrastructure for Managing Ethico-social, Technical and Administrative issues in Data ACcess (METADAC), which are operationalised through a committee known as the METADAC Access Committee. METADAC governs access to phenotype, genotype and 'omic' data and samples from five UK longitudinal studies. FINDINGS: Using the example of METADAC, we argue that three key structural features are foundational for practising responsible data sharing: independence and transparency; interdisciplinarity; and participant-centric decision-making.
We observe that the international research community is proactively working towards optimising the use of research data, integrating/linking these data with routine data generated by health and social care services and other administrative data services to improve the analysis, interpretation and utility of these data. The governance of these new complex data assemblages will require a range of expertise from across a number of domains and disciplines, including that of study participants. Human-mediated decision-making bodies will be central to ensuring achievable, reasoned and responsible decisions about the use of these data; the METADAC model described in this paper provides an example of how this could be realised.


Subjects
Big Data, Biomedical Research/ethics, Genomics/ethics, Information Dissemination/ethics, Biomedical Research/economics, Databases, Genetic/economics, Databases, Genetic/ethics, Genotype, Humans
4.
Bioinformatics ; 31(16): 2691-6, 2015 Aug 15.
Article in English | MEDLINE | ID: mdl-25908791

ABSTRACT

MOTIVATION: Very large studies are required to provide sufficiently large sample sizes for adequately powered association analyses. This can be an expensive undertaking, and it is important that an accurate sample size is identified. For more realistic sample size calculation and power analysis, the impact of unmeasured aetiological determinants and the quality of measurement of both outcome and explanatory variables should be taken into account. Conventional methods to analyse power use closed-form solutions that are not flexible enough to cater for all of these elements easily. They often result in a potentially substantial overestimation of the actual power. RESULTS: In this article, we describe the Estimating Sample-size and Power in R by Exploring Simulated Study Outcomes (ESPRESSO) tool, which allows assessment errors to be incorporated into power calculations under various biomedical scenarios. We also report a real-world analysis in which we used this tool to answer an important strategic question for an existing cohort. AVAILABILITY AND IMPLEMENTATION: The software is available for online calculation and download at http://espresso-research.org. The code is freely available at https://github.com/ESPRESSO-research. CONTACT: louqman@gmail.com. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
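ESPRESSO itself is an R tool; the underlying idea — estimating power by simulation rather than a closed form, so that outcome measurement error can be injected directly — can be sketched in Python. All parameter values below are invented for illustration, not taken from the tool.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)

def simulated_power(n, effect, error_sd, n_sims=200, alpha=0.05):
    """Estimate power as the fraction of simulated studies in which the
    exposure-outcome association is detected. The outcome is observed with
    measurement error, which a closed-form calculation based on the true
    effect size would ignore (and thereby overstate power)."""
    hits = 0
    for _ in range(n_sims):
        x = rng.normal(size=n)                      # exposure, e.g. a genotype score
        y_true = effect * x + rng.normal(size=n)    # true outcome
        y_obs = y_true + rng.normal(scale=error_sd, size=n)  # noisy measurement
        _, p = pearsonr(x, y_obs)
        hits += p < alpha
    return hits / n_sims

power_clean = simulated_power(n=500, effect=0.2, error_sd=0.0)
power_noisy = simulated_power(n=500, effect=0.2, error_sd=1.0)
```

Comparing `power_clean` with `power_noisy` shows how measurement error erodes power at a fixed sample size, which is why simulation-based estimates can differ substantially from conventional ones.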


Subjects
Genetic Association Studies, Software, Algorithms, Blood Pressure, Canada, Cohort Studies, Humans, Internet, Linear Models, Quantitative Trait, Heritable, Sample Size, Systole
5.
Int J Epidemiol ; 40(5): 1314-28, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21804097

ABSTRACT

BACKGROUND: Proper understanding of the roles of, and interactions between, genetic, lifestyle, environmental and psycho-social factors in determining the risk of development and/or progression of chronic diseases requires access to very large high-quality databases. Because of the financial, technical and time burdens related to developing and maintaining very large studies, the scientific community is increasingly synthesizing data from multiple studies to construct large databases. However, the data items collected by individual studies must be inferentially equivalent to be meaningfully synthesized. The DataSchema and Harmonization Platform for Epidemiological Research (DataSHaPER; http://www.datashaper.org) was developed to enable the rigorous assessment of the inferential equivalence, i.e. the potential for harmonization, of selected information from individual studies. METHODS: This article examines the value of using the DataSHaPER for retrospective harmonization of established studies. Using the DataSHaPER approach, the potential to generate 148 harmonized variables from the questionnaires and physical measures collected in 53 large population-based studies (6.9 million participants) was assessed. Variable and study characteristics that might influence the potential for data synthesis were also explored. RESULTS: Out of all assessment items evaluated (148 variables for each of the 53 studies), 38% could be harmonized. Certain characteristics of variables (i.e. relative importance, individual targeted, reference period) and of studies (i.e. observational units, data collection start date and mode of questionnaire administration) were associated with the potential for harmonization. For example, for variables deemed to be essential, 62% of paired assessment items could be harmonized. CONCLUSION: The current article shows that the DataSHaPER provides an effective and flexible approach for the retrospective harmonization of information across studies.
To implement data synthesis, some additional scientific, ethico-legal and technical considerations must be addressed. The success of the DataSHaPER as a harmonization approach will depend on its continuing development and on the rigour and extent of its use. The DataSHaPER has the potential to take us closer to a truly collaborative epidemiology and offers the promise of enhanced research potential generated through synthesized databases.
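The retrospective-harmonization bookkeeping described above — checking each study's collected items against a reference schema and reporting the fraction of (variable, study) pairs that can be mapped — can be illustrated with a toy sketch. The schema, study names, and mapping rule here are all invented; DataSHaPER's actual pairing rules are richer than a simple category-set match.

```python
# Hypothetical reference schema ("DataSchema"): target variables and the
# categories a harmonized version must be able to express.
DATASCHEMA = {
    "smoking_status": {"never", "former", "current"},
    "education_level": {"primary", "secondary", "tertiary"},
}

# Hypothetical study-specific codings mapped onto the reference categories.
STUDY_MAPPINGS = {
    "study_A": {
        "smoking_status": {"0": "never", "1": "former", "2": "current"},
        # study_A did not collect education -> that pair is not harmonizable
    },
    "study_B": {
        "smoking_status": {"no": "never", "ex": "former", "yes": "current"},
        "education_level": {"low": "primary", "mid": "secondary", "high": "tertiary"},
    },
}

def harmonization_rate(schema, mappings):
    """Fraction of (variable, study) assessment pairs that can be harmonized:
    the study collected the item and its coding covers every target category."""
    total = len(schema) * len(mappings)
    ok = sum(
        1
        for study_map in mappings.values()
        for var, categories in schema.items()
        if var in study_map and set(study_map[var].values()) == categories
    )
    return ok / total

rate = harmonization_rate(DATASCHEMA, STUDY_MAPPINGS)
```

Here three of the four pairs can be mapped, so the rate is 0.75 — the same kind of summary as the 38% figure reported across the 53 studies.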


Subjects
Data Collection/methods, Data Interpretation, Statistical, Retrospective Studies, Chronic Disease/epidemiology, Cohort Studies, Databases, Factual, Epidemiologic Methods, Humans, Risk Factors, Surveys and Questionnaires
6.
Int J Epidemiol ; 39(5): 1383-93, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20813861

ABSTRACT

BACKGROUND: Vast sample sizes are often essential in the quest to disentangle the complex interplay of the genetic, lifestyle, environmental and social factors that determine the aetiology and progression of chronic diseases. The pooling of information between studies is therefore of central importance to contemporary bioscience. However, there are many technical, ethico-legal and scientific challenges to be overcome if an effective, valid, pooled analysis is to be achieved. Perhaps most critically, any data that are to be analysed in this way must be adequately 'harmonized'. This implies that the collection and recording of information and data must be done in a manner that is sufficiently similar in the different studies to allow valid synthesis to take place. METHODS: This conceptual article describes the origins, purpose and scientific foundations of the DataSHaPER (DataSchema and Harmonization Platform for Epidemiological Research; http://www.datashaper.org), which has been created by a multidisciplinary consortium of experts that was pulled together and coordinated by three international organizations: P³G (Public Population Project in Genomics), PHOEBE (Promoting Harmonization of Epidemiological Biobanks in Europe) and CPT (Canadian Partnership for Tomorrow Project). RESULTS: The DataSHaPER provides a flexible, structured approach to the harmonization and pooling of information between studies. Its two primary components, the 'DataSchema' and 'Harmonization Platforms', together support the preparation of effective data-collection protocols and provide a central reference to facilitate harmonization. The DataSHaPER supports both 'prospective' and 'retrospective' harmonization. CONCLUSION: It is hoped that this article will encourage readers to investigate the project further: the more research groups and studies are actively involved, the more effective the DataSHaPER programme will ultimately be.


Subjects
Clinical Trials as Topic/statistics & numerical data, Data Interpretation, Statistical, Epidemiologic Methods, Information Storage and Retrieval/methods, Meta-Analysis as Topic, Data Collection/methods, Health Behavior, Humans, Residence Characteristics, Socioeconomic Factors
7.
Am J Psychiatry ; 167(3): 253-9, 2010 Mar.
Article in English | MEDLINE | ID: mdl-20194488

ABSTRACT

Aggressive patients often target psychiatrists and psychiatric residents, yet most clinicians are insufficiently trained in violence risk assessment and management. Consequently, many clinicians are reluctant to diagnose and treat aggressive and assaultive features in psychiatric patients and instead focus attention on other axis I mental disorders with proven pharmacological treatment in the hope that this approach will reduce the aggressive behavior. Unclear or nonexistent reporting policies or feelings of self-blame may impede clinicians from reporting assaults, thus limiting our knowledge of the impact of, and best response to, aggression in psychiatric patients. The authors present the case of a young adult inpatient with a long history of antisocial and assaultive behavior who struck and injured a psychiatric resident. With this case in mind, the authors discuss the diagnostic complexities related to violent patients, the importance of assessing violence risk when initially evaluating a patient, and the relevance of risk assessment for treatment considerations and future management. This report illustrates common deficiencies in the prevention of violence on inpatient psychiatric units and in the reporting and response to an assault, and has implications for residency and clinician training.


Subjects
Aggression/psychology, Antisocial Personality Disorder/psychology, Internship and Residency, Psychiatry/education, Violence/psychology, Adolescent, Antisocial Personality Disorder/diagnosis, Antisocial Personality Disorder/therapy, Commitment of Mentally Ill, Comorbidity, Dangerous Behavior, Drug Therapy, Combination, Firesetting Behavior/psychology, Humans, Male, Mood Disorders/diagnosis, Mood Disorders/psychology, Mood Disorders/therapy, Prisoners/psychology, Psychotropic Drugs/adverse effects, Psychotropic Drugs/therapeutic use, Risk Assessment, Risk Management, Violence/prevention & control
8.
Obes Surg ; 18(9): 1104-8, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18431612

ABSTRACT

BACKGROUND: Laparoscopic adjustable gastric banding (LAGB) has commonly been complicated by the problem of band slippage or prolapse. Since popularization of the pars flaccida approach and improved anterior fixation, it is our impression that the problem of symmetrical dilatation of the proximal gastric pouch has become more important. METHODS: We have reviewed the results of a series of 425 LAGBs, all performed by the pars flaccida approach from June 2003 to October 2007, to analyze the incidence and implications of this new pattern. RESULTS: There were no posterior prolapses, 2 anterior prolapses, and 17 cases of symmetrical pouch dilatation (SPD) (revision rate 4.4%). Teenage patients had a 22% revision rate for SPD. All revisions were completed laparoscopically with no mortality, no significant complications, and a median hospital stay of 1 day. The median weight loss following revisional surgery was not significantly different from the background cohort. CONCLUSION: SPD is the most common reason for revision of LAGB in this series. We postulate that SPD is caused by excessive pressure in the proximal gastric pouch. This may be generated by eating too quickly, eating too large a volume, or excessive tightening of the band. The radial forces in the pouch may ultimately cause pressure on the phrenoesophageal ligament and a secondary hiatal hernia.


Subjects
Gastric Dilatation/epidemiology, Gastric Dilatation/surgery, Gastroplasty/adverse effects, Laparoscopy, Obesity, Morbid/surgery, Adolescent, Adult, Aged, Cohort Studies, Databases, Factual, Female, Gastric Dilatation/diagnosis, Gastroplasty/instrumentation, Humans, Incidence, Male, Middle Aged, Reoperation, Retrospective Studies, Time Factors, Treatment Outcome, Young Adult
9.
Stat Med ; 24(15): 2401-28, 2005 Aug 15.
Article in English | MEDLINE | ID: mdl-16015676

ABSTRACT

There has been a recent growth in the use of Bayesian methods in medical research. The main reasons for this are the development of computer-intensive, simulation-based methods such as Markov chain Monte Carlo (MCMC), increases in computing power and the introduction of powerful software such as WinBUGS. This has enabled increasingly complex models to be fitted. The ability to fit these complex models has led to MCMC methods being used as a convenient tool by frequentists, who may have no desire to be fully Bayesian. Often researchers want 'the data to dominate' when there is no prior information and thus attempt to use vague prior distributions. However, with small amounts of data the use of vague priors can be problematic: the results are potentially sensitive to the choice of prior distribution. In general there are fewer problems with location parameters; the main problem is with scale parameters. With scale parameters, not only does one have to decide the distributional form of the prior distribution, but also whether to put the prior distribution on the variance, standard deviation or precision. We have conducted a simulation study comparing the effects of 13 different prior distributions for the scale parameter on simulated random effects meta-analysis data. We varied the number of studies (5, 10 and 30) and compared three different between-study variances to give nine different simulation scenarios. One thousand data sets were generated for each scenario and each data set was analysed using the 13 different prior distributions. The frequentist properties of bias and coverage were investigated for the between-study variance and the effect size. The choice of prior distribution was crucial when there were just five studies. There was a large variation in the estimates of the between-study variance for the 13 different prior distributions. With a large number of studies the choice of prior distribution was less important.
The estimated effect size was not biased, but the precision with which it was estimated varied with the choice of prior distribution, leading to varying coverage intervals and, potentially, to different statistical inferences. Again there was less of a problem with a larger number of studies. There is a particular problem if the between-study variance is close to the boundary at zero, as MCMC results tend to produce upwardly biased estimates of the between-study variance, particularly if inferences are based on the posterior mean. The choice of 'vague' prior distribution can lead to a marked variation in results, particularly in small studies. Sensitivity to the choice of prior distribution should always be assessed.
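The prior-sensitivity phenomenon described above can be demonstrated without full MCMC: on a grid, compute the posterior for the between-study standard deviation tau under two different "vague" priors and compare the posterior means. This is a crude sketch (mu is profiled out at its precision-weighted estimate rather than integrated over, and the priors, study count, and parameter values are invented), not the paper's 13-prior simulation study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a small random-effects meta-analysis: 5 studies, true tau = 0.3
K, mu_true, tau_true = 5, 0.5, 0.3
se = np.full(K, 0.2)                              # within-study standard errors
theta = rng.normal(mu_true, tau_true, size=K)     # true study-level effects
y = rng.normal(theta, se)                         # observed study effects

def posterior_mean_tau(y, se, log_prior, grid=np.linspace(1e-3, 5.0, 2000)):
    """Grid posterior for tau, with mu replaced by its conditional
    precision-weighted estimate at each tau (a rough approximation,
    adequate for illustrating prior sensitivity)."""
    log_post = np.empty_like(grid)
    for i, tau in enumerate(grid):
        var = tau**2 + se**2
        mu_hat = np.sum(y / var) / np.sum(1.0 / var)
        loglik = -0.5 * np.sum(np.log(var) + (y - mu_hat) ** 2 / var)
        log_post[i] = loglik + log_prior(tau)
    w = np.exp(log_post - log_post.max())
    w /= w.sum()
    return np.sum(grid * w)

# Two common "vague" choices for a scale parameter
half_normal = lambda tau: -0.5 * tau**2           # half-normal(1) prior on tau
flat = lambda tau: 0.0                            # uniform prior on tau over the grid

est_hn = posterior_mean_tau(y, se, half_normal)
est_flat = posterior_mean_tau(y, se, flat)
```

With only five studies the likelihood for tau is flat and the two posterior means differ noticeably; the half-normal prior, which down-weights large tau, always yields the smaller estimate.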


Subjects
Bayes Theorem, Markov Chains, Monte Carlo Method, Software, Anti-Bacterial Agents/administration & dosage, Anti-Bacterial Agents/therapeutic use, Computer Simulation, Humans, Otitis Media/drug therapy
10.
Genet Epidemiol ; 24(1): 24-35, 2003 Jan.
Article in English | MEDLINE | ID: mdl-12508253

ABSTRACT

Gibbs sampling-based generalized linear mixed models (GLMMs) provide a convenient and flexible way to extend variance components models for multivariate normally distributed continuous traits to other classes of phenotype. This includes binary traits and right-censored failure times such as age-at-onset data. The approach has applications in many areas of genetic epidemiology. However, the required GLMMs are sensitive to nonrandom ascertainment. In the absence of an appropriate correction for ascertainment, they can exhibit marked positive bias in the estimated grand mean and serious shrinkage in the estimated magnitude of variance components. To compound practical difficulties, it is currently difficult to implement a conventional adjustment for ascertainment because of the need to undertake repeated integration across the distribution of random effects. This is prohibitively slow when it must be repeated at every iteration of the Markov chain Monte Carlo (MCMC) procedure. This paper motivates a correction for ascertainment that is based on sampling random effects rather than integrating across them and can therefore be implemented in a general-purpose Gibbs sampling environment such as WinBUGS. The approach has the characteristic that it returns ascertainment-adjusted parameter estimates that pertain to the true distribution of determinants in the ascertained sample rather than in the general population. The implications of this characteristic are investigated and discussed. This paper extends the utility of Gibbs sampling-based GLMMs to a variety of settings in which family data are ascertained nonrandomly.
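The ascertainment bias the paper addresses can be shown with a minimal simulation: families share a random effect, families are ascertained through a proband exceeding a threshold, and the naive grand-mean estimate from the ascertained sample is inflated. All distributions and parameter values here are invented for illustration; this shows the bias only, not the paper's sampling-based correction.

```python
import numpy as np

rng = np.random.default_rng(7)

# Sibling pairs sharing a family random effect u (a toy variance-components model)
n_families, mu, sd_family, sd_resid = 20_000, 0.0, 0.7, 0.7
u = rng.normal(0.0, sd_family, n_families)              # family random effects
proband = mu + u + rng.normal(0.0, sd_resid, n_families)
sibling = mu + u + rng.normal(0.0, sd_resid, n_families)

# Single ascertainment: keep only families whose proband exceeds a threshold
threshold = 1.0
kept = proband > threshold

naive_proband_mean = proband[kept].mean()  # grossly inflated (direct selection)
naive_sibling_mean = sibling[kept].mean()  # inflated too, via the shared u
population_mean = sibling.mean()           # ~ mu, the target of inference
```

Even the siblings, who were not selected on directly, show a positively biased mean because they share the family random effect with the proband. This is the "marked positive bias in the estimated grand mean" that the ascertainment correction must remove.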


Subjects
Genetic Diseases, Inborn/genetics, Genetic Variation/genetics, Linear Models, Models, Genetic, Phenotype, Age of Onset, Analysis of Variance, Bias, Effect Modifier, Epidemiologic, Genetic Diseases, Inborn/epidemiology, Genetics, Population, Genotype, Humans, Markov Chains, Monte Carlo Method, Pedigree, Predictive Value of Tests, Sampling Studies, Sensitivity and Specificity