1.
Stud Health Technol Inform; 302: 871-875, 2023 May 18.
Article En | MEDLINE | ID: mdl-37203520

Conducting large-scale epidemiologic studies requires powerful software for electronic data capture, data management, data quality assessments, and participant management. There is also an increasing need to make studies and the data collected findable, accessible, interoperable, and reusable (FAIR). However, reusable software tools from major studies that address such needs are not necessarily known to other researchers. This work therefore gives an overview of the main tools used to conduct the internationally highly networked population-based project Study of Health in Pomerania (SHIP), as well as the approaches taken to improve its FAIRness. Deep phenotyping, formalized processes from data capture to data transfer, and a strong emphasis on cooperation and data exchange have laid the foundation for a broad scientific impact, with more than 1500 published papers to date.


Data Management, Software, Humans, Cohort Studies, Research, Epidemiologic Studies
3.
NPJ Aging Mech Dis; 7(1): 15, 2021 Jun 01.
Article En | MEDLINE | ID: mdl-34075044

The development of 'age clocks', machine learning models predicting age from biological data, has been a major milestone in the search for reliable markers of biological age and has since become an invaluable tool in aging research. However, beyond their unquestionable utility, current clocks offer little insight into the molecular biological processes driving aging, and their inner workings often remain non-transparent. Here we propose a new type of age clock, one that couples predictivity with interpretability of the underlying biology, achieved through the incorporation of prior knowledge into the model design. The clock, an artificial neural network constructed according to well-described biological pathways, allows the prediction of age from gene expression data of skin tissue with high accuracy, while at the same time capturing and revealing aging states of the pathways driving the prediction. The model recapitulates known associations of aging gene knockdowns in simulation experiments and demonstrates its utility in deciphering the main pathways by which accelerated aging conditions such as Hutchinson-Gilford progeria syndrome, as well as pro-longevity interventions like caloric restriction, exert their effects.
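To make the idea of a pathway-structured, interpretable age clock concrete, here is a minimal sketch in PyTorch. It is not the published model: the masking scheme, layer sizes, and activation are assumptions made for illustration, and the pathway membership matrix would in practice come from a curated pathway database.

import torch
import torch.nn as nn

class PathwayAgeClock(nn.Module):
    """Illustrative age clock whose first layer is masked so that each hidden
    unit corresponds to one biological pathway and only sees its member genes.
    This is a sketch of the general idea, not the authors' architecture."""

    def __init__(self, n_genes: int, pathway_mask: torch.Tensor):
        # pathway_mask: (n_pathways, n_genes) binary matrix, 1 where a gene
        # belongs to a pathway (assumed to come from a curated database).
        super().__init__()
        self.pathway_layer = nn.Linear(n_genes, pathway_mask.shape[0])
        self.register_buffer("mask", pathway_mask.float())
        self.readout = nn.Linear(pathway_mask.shape[0], 1)  # predicted age

    def forward(self, expression: torch.Tensor):
        # Zero out connections that violate pathway membership, compute a
        # per-pathway "aging state", then regress age on those states.
        masked_weight = self.pathway_layer.weight * self.mask
        pathway_state = torch.tanh(expression @ masked_weight.T + self.pathway_layer.bias)
        age = self.readout(pathway_state).squeeze(-1)
        return age, pathway_state

Because each hidden unit maps to a named pathway, the returned pathway_state can be inspected per sample to see which pathways push a prediction toward accelerated or decelerated aging, which is the kind of interpretability the abstract describes.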

4.
Article De | MEDLINE | ID: mdl-29147857

BACKGROUND: Cohort studies are a longitudinal observational study type. They are firmly established within epidemiology to assess the course of diseases and risk factors. Yet, standards to describe and evaluate quality characteristics of cohort studies need further development. OBJECTIVE: Within the TMF ("Technologie- und Methodenplattform für die vernetzte medizinische Forschung e. V.") project "Quality management standards in cohort studies", a catalogue of requirements was compiled and evaluated, focusing on the preparation and conduct of epidemiologic cohort studies. MATERIALS AND METHODS: The catalogue of requirements was established in a consensus process between representatives of seven German epidemiologic cohort studies. For this purpose, a series of expert meetings (telephone, face-to-face, web-based) was conducted, and both the importance of each element of the catalogue and its degree of implementation were assessed. RESULTS: A catalogue of 138 requirements was agreed upon by consensus. It is structured into ten sections: 1. Study documentation; 2. Selection of instruments; 3. Study implementation; 4. Organizational structure; 5. Qualification and certification; 6. Participant recruitment; 7. Preparation, conduct and follow-up processing of examinations; 8. Study logistics and maintenance; 9. Data capture and data management; 10. Reporting and monitoring. In total, 41 requirements were categorized as essential, 91 as important, and 6 as less important. CONCLUSION: The catalogue of requirements provides a guideline for improving the preparation and operation of cohort studies. The rated importance and degree of implementation of requirements depended on the study design. With adaptations, the catalogue might be transferable to other study types.


Cohort Studies, Data Accuracy, Epidemiologic Methods, Data Collection/standards, Documentation/standards, Germany, Humans, Patient Selection, Research Design/standards
5.
Stud Health Technol Inform; 235: 549-553, 2017.
Article En | MEDLINE | ID: mdl-28423853

Valid scientific inferences from epidemiological and clinical studies require high data quality. Data-generating departments therefore aim to detect data irregularities as early as possible in order to guide quality management processes. In addition, the obtained data quality must be evaluated after data collection is completed. This can be challenging in complex studies due to a wide scope of examinations, numerous study variables, and multiple examiners, devices, and examination centers. This paper describes Square2, a Java EE web application used to monitor and evaluate data quality in institutions running multiple complex studies. It uses the Java library Apache MyFaces 2, extended by BootsFaces for layout and style. RServe and REngine manage calls to R server processes. All study data and metadata are stored in PostgreSQL. R is the statistics backend, and LaTeX is used to generate print-ready PDF reports. A GUI manages the entire workflow. Square2 covers all steps of the data monitoring workflow, including the setup of studies and their structure, the handling of metadata for data monitoring purposes, the selection of variables, the upload of data, statistical analyses, and the generation and inspection of quality reports. To address data protection requirements, Square2 comprises an extensive user rights and roles concept.
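Square2 itself is a Java EE application with an R backend, but the kind of check it automates can be illustrated with a short, self-contained Python/pandas sketch: per-examiner counts of missing and out-of-range values for one study variable. The column names and plausibility limits below are invented for the example and are not taken from Square2.

import pandas as pd

def quality_summary(df: pd.DataFrame, var: str, lo: float, hi: float) -> pd.DataFrame:
    """Per-examiner missingness and out-of-range counts for one study variable."""
    # Values outside the plausibility limits, counting only non-missing entries.
    out_of_range = (~df[var].between(lo, hi)) & df[var].notna()
    return (
        df.assign(missing=df[var].isna(), out_of_range=out_of_range)
          .groupby("examiner")[["missing", "out_of_range"]]
          .sum()
          .assign(n=df.groupby("examiner").size())
    )

# Example: flag examiners with many missing or implausible height measurements.
data = pd.DataFrame({
    "examiner": ["A", "A", "B", "B", "B"],
    "height_cm": [172.0, None, 180.5, 999.0, 165.2],
})
print(quality_summary(data, "height_cm", lo=100, hi=230))

In a production tool, such checks would run per variable, examiner, device, and examination center, and would feed into the kind of standardized quality reports the abstract describes.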


Biomedical Research, Data Accuracy, Data Collection, Internet, Software, Epidemiologic Studies, Workflow