1.
J Public Health Manag Pract ; 25(4): 366-372, 2019.
Article in English | MEDLINE | ID: mdl-31136510

ABSTRACT

CONTEXT: Leaders of government agencies are responsible for stewardship over taxpayer investments. Stewardship strengthens agency performance, which is critical to improving population health. Most industries, including health care, and public enterprises, such as education, have uniform data-reporting policies and financial systems that allow analytical techniques to be applied to individual organizations and entire systems. However, this is not yet mainstream practice in local and state governmental public health. PROGRAM: The Public Health Uniform National Data System (PHUND$) is a financial information system for local health departments that advances the application of uniform practices to close financial analytical gaps. A 10-year retrospective overview of the development, implementation, and utility of PHUND$ is provided and supported by documented program and agency improvements to validate the analytical features and demonstrate a best practice. RESULTS: Benefits of utilizing PHUND$ included reducing financial risks, supporting requests for increased revenues, providing comparative analysis, isolating drivers of costs and deficits, increasing workforce financial management skills, enhancing decision-making processes, and fostering agency sustainability to support continuous improvements in quality and population health. The PHUND$ financial data definitions in the data dictionary provided the structure needed for standardized data collection and confirmed the feasibility of a standardized public health chart of accounts. CONCLUSION: PHUND$ analysis provided evidence on the relationship between financial and operational performance and informed strategies for managing risks and improving quality. Such analysis is critical to identifying financial and operational problems and essential to mitigating financial crises, avoiding disruption of services, and fostering agency sustainability. PHUND$ can additionally serve as an instrument to guide the development of standards for assessing the soundness of agency financial management systems.


Subject(s)
Informatics/standards , Program Evaluation/standards , United States Public Health Service/economics , Florida , Humans , Informatics/instrumentation , Informatics/statistics & numerical data , Local Government , Program Evaluation/statistics & numerical data , Public Health/economics , Public Health/methods , United States
2.
J Med Internet Res ; 19(2): e47, 2017 02 24.
Article in English | MEDLINE | ID: mdl-28235748

ABSTRACT

BACKGROUND: The enactment of the General Data Protection Regulation (GDPR) will have an impact on European data science. Particular concerns have been raised that its consent requirements would severely restrict medical data research. OBJECTIVE: Our objective is to explain the changes in data protection laws that apply to medical research and to discuss their potential impact. METHODS: Analysis of the ethicolegal requirements imposed by the GDPR. RESULTS: The GDPR makes the classification of pseudonymised data as personal data clearer, although the issue is not entirely resolved. Biomedical research on personal data for which consent has not been obtained must be of substantial public interest. CONCLUSIONS: The GDPR introduces protections for data subjects that aim for consistency across the EU. The proposed changes should have little impact on biomedical data research.


Subject(s)
Biomedical Research/methods , Computer Security , Informatics/methods , Research Design , Biomedical Research/ethics , Humans , Informatics/standards
3.
ScientificWorldJournal ; 2014: 809219, 2014.
Article in English | MEDLINE | ID: mdl-25013868

ABSTRACT

This study reviews the new pension accounting under K-IFRS and documents empirical changes in the liability for retirement allowances following K-IFRS adoption. It helps clarify the effect of pension accounting on individual firms' financial reports and the importance of publicly announcing actuarial assumptions. Firms that adopted K-IFRS showed varied changes in retirement liability compared with their previous financial reports, which were not prepared under K-IFRS. Their actuarial assumptions for pension accounting should be disclosed, but only a few firms published them. The data analysis shows that small differences in actuarial assumptions can result in large changes in retirement-related liability. Firms in the IT industry behave similarly, which suggests that additional financial regulation of pension accounting is warranted.
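As a rough, self-contained illustration of the sensitivity claim (not code or figures from the paper; the benefit amount, horizon, and discount rates are assumed placeholders), the sketch below shows how a half-percentage-point change in the discount-rate assumption shifts the present value of a single future retirement benefit by several percent.

```python
# Hypothetical illustration (not from the paper): sensitivity of a retirement
# benefit obligation's present value to the discount-rate assumption.

def present_value(payment: float, years: int, discount_rate: float) -> float:
    """Present value of a single benefit payment due in `years` years."""
    return payment / (1.0 + discount_rate) ** years

# Assume a single retirement benefit of 100,000 payable in 20 years.
benefit, horizon = 100_000.0, 20

for rate in (0.030, 0.035):  # a 0.5 percentage-point difference in assumption
    pv = present_value(benefit, horizon, rate)
    print(f"discount rate {rate:.1%}: liability = {pv:,.0f}")

# Roughly 55,400 at 3.0% versus 50,300 at 3.5%: about a 9% swing in the
# reported liability from a small change in one actuarial assumption.
```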


Subject(s)
Accounting/standards , Retirement/economics , Accounting/methods , Industry/economics , Industry/legislation & jurisprudence , Industry/standards , Informatics/economics , Informatics/legislation & jurisprudence , Informatics/standards , Retirement/legislation & jurisprudence
4.
Med Tr Prom Ekol ; (1): 36-43, 2014.
Article in Russian | MEDLINE | ID: mdl-25069277

ABSTRACT

The increasing flow of information, while accelerating societal progress, can affect health, which raises the task of its hygienic regulation. The paper considers the physical aspects of information, its parameters and units of measurement, approaches to measuring and evaluating information quantity and quality, and criteria for its permissible and optimal levels. Measurements of the quantity of text information produced per year on computers are presented for 17 occupations across 10 economic sectors. A principle of IT-based automation of operators' work and of dynamic monitoring is proposed. On the basis of this research, a glossary of terms and a computer-supported guide to the problem have been developed to accumulate experience and clarify future prospects.


Subject(s)
Industry/statistics & numerical data , Informatics/statistics & numerical data , Occupational Health/statistics & numerical data , Humans , Industry/legislation & jurisprudence , Industry/standards , Informatics/legislation & jurisprudence , Informatics/standards , Occupational Health/legislation & jurisprudence , Occupational Health/standards
5.
Neuroimage ; 82: 647-61, 2013 Nov 15.
Article in English | MEDLINE | ID: mdl-23727024

ABSTRACT

Data sharing efforts increasingly contribute to the acceleration of scientific discovery. Neuroimaging data are accumulating in distributed, domain-specific databases, and there is currently neither an integrated access mechanism nor an accepted format for the critically important meta-data that are necessary for making use of the combined, available neuroimaging data. In this manuscript, we present work from the Derived Data Working Group, an open-access group sponsored by the Biomedical Informatics Research Network (BIRN) and the International Neuroinformatics Coordinating Facility (INCF) focused on practical tools for distributed access to neuroimaging data. The working group develops models and tools facilitating the structured interchange of neuroimaging meta-data and is making progress towards a unified set of tools for such data and meta-data exchange. We report on the key components required for integrated access to raw and derived neuroimaging data as well as associated meta-data and provenance across neuroimaging resources. The components include (1) a structured terminology that provides semantic context to data, (2) a formal data model for neuroimaging with robust tracking of data provenance, (3) a web service-based application programming interface (API) that provides a consistent mechanism to access and query the data model, and (4) a provenance library that can be used for the extraction of provenance data by image analysts and imaging software developers. We believe that the framework and set of tools outlined in this manuscript have great potential for solving many of the issues the neuroimaging community faces when sharing raw and derived neuroimaging data across the various existing database systems for the purpose of accelerating scientific discovery.
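The sketch below is a minimal, hypothetical illustration of the kind of structured, terminology-anchored provenance record that components (1)-(4) would exchange; the field names, tool name, and term identifiers are invented for illustration and are not the working group's actual data model or API.

```python
# Hypothetical sketch of a provenance record tying a derived neuroimaging
# result back to its raw inputs, software, parameters, and terminology.

from dataclasses import dataclass, field, asdict
import json

@dataclass
class ProvenanceRecord:
    entity_id: str                  # identifier of the derived data item
    derived_from: list[str]         # identifiers of the raw inputs
    software: str                   # tool that produced the derivation
    software_version: str
    parameters: dict = field(default_factory=dict)
    term_refs: dict = field(default_factory=dict)  # links into a structured terminology

record = ProvenanceRecord(
    entity_id="deriv:thickness_map_0001",
    derived_from=["raw:T1w_sub-01"],
    software="cortical_thickness_pipeline",        # hypothetical tool name
    software_version="1.2.0",
    parameters={"smoothing_fwhm_mm": 10},
    term_refs={"measure": "term:cortical_thickness"},
)

# Serialised form of the record that a web-service query could return.
print(json.dumps(asdict(record), indent=2))
```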


Subject(s)
Database Management Systems/organization & administration , Database Management Systems/standards , Informatics/standards , Information Dissemination/methods , Neuroimaging/methods , Databases, Factual/standards , Humans , Informatics/methods , Informatics/trends , Internet , Neuroimaging/standards
6.
Epilepsia ; 53 Suppl 2: 28-32, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22765502

ABSTRACT

The 2010 International League Against Epilepsy (ILAE) classification and terminology commission report proposed a much-needed departure from previous classifications to incorporate advances in molecular biology, neuroimaging, and genetics. It proposed an interim classification and defined two key requirements that need to be satisfied. The first is the ability to classify epilepsy along multiple dimensions according to a variety of purposes, including clinical research, patient care, and drug discovery. The second is the ability of the classification system to evolve with new discoveries. Multidimensionality and flexibility are crucial to the success of any future classification. In addition, a successful classification system must play a central role in the rapidly growing field of epilepsy informatics. An epilepsy ontology, based on classification, will allow information systems to facilitate data-intensive studies and provide a proven route to meeting the two foregoing key requirements. The epilepsy ontology would be a structured terminology system that accommodates proposed and evolving ILAE classifications, the National Institutes of Health/National Institute of Neurological Disorders and Stroke (NIH/NINDS) Common Data Elements, and the International Classification of Diseases (ICD) systems, and explicitly specifies all known relationships between epilepsy concepts within a formal framework. This will aid evidence-based epilepsy diagnosis, investigation, treatment, and research for a diverse community of clinicians and researchers. Benefits range from systematization of electronic patient records to multimodal data repositories for research and training manuals for those involved in epilepsy care. Given the complexity, heterogeneity, and pace of research advances in the epilepsy domain, such an ontology must be collaboratively developed by key stakeholders in the epilepsy community and experts in knowledge engineering and computer science.
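As a purely illustrative sketch (not the proposed ILAE ontology or any of its actual terms), the snippet below shows the core idea of recording explicit, machine-queryable relationships between epilepsy concepts as concept-relation-concept triples.

```python
# Toy, hypothetical triples illustrating how an ontology makes relationships
# between epilepsy concepts explicit and queryable by software.

TRIPLES = [
    # (subject concept,          relation,          object concept)
    ("focal seizure",            "is_a",            "seizure type"),
    ("generalized tonic-clonic", "is_a",            "seizure type"),
    ("Dravet syndrome",          "is_a",            "electroclinical syndrome"),
    ("Dravet syndrome",          "associated_with", "SCN1A variant"),
    ("SCN1A variant",            "is_a",            "genetic etiology"),
]

def related(concept: str, relation: str) -> list[str]:
    """Return all concepts linked to `concept` by `relation`."""
    return [o for s, r, o in TRIPLES if s == concept and r == relation]

# Example query: the recorded etiologic association for a syndrome.
print(related("Dravet syndrome", "associated_with"))   # ['SCN1A variant']
```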


Subject(s)
Epilepsy/classification , Informatics/standards , Terminology as Topic , Humans
7.
ScientificWorldJournal ; 2012: 614635, 2012.
Article in English | MEDLINE | ID: mdl-22988428

ABSTRACT

Assessment of software nonfunctional properties (NFPs) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) extend the software models with annotations describing the NFP of interest; (b) automatically transform the annotated software model into the formalism chosen for NFP analysis; (c) analyze the formal model using existing solvers; (d) assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation that generates a Deterministic and Stochastic Petri Net (DSPN) formal model, and the assessment of the system properties based on the DSPN results.
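As a minimal illustration of the kind of dependability measure such an analysis ultimately yields (the MTTF/MTTR figures are assumed, and this two-state availability formula is a deliberately simple special case, not the paper's DSPN models), consider:

```python
# Steady-state availability of a single repairable component with exponential
# failure and repair times -- a two-state special case of a dependability model.

def steady_state_availability(mttf_hours: float, mttr_hours: float) -> float:
    """Availability = MTTF / (MTTF + MTTR) for a single repairable component."""
    return mttf_hours / (mttf_hours + mttr_hours)

# Assumed figures, for illustration only.
mttf = 2_000.0   # mean time to failure, hours
mttr = 4.0       # mean time to repair, hours

avail = steady_state_availability(mttf, mttr)
print(f"steady-state availability: {avail:.5f}")
# About 0.99800, i.e. roughly 17.5 hours of expected downtime per year.
```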


Subject(s)
Informatics/standards , Programming Languages , Software Design , Software/standards , Models, Theoretical , Reproducibility of Results
9.
Future Oncol ; 6(10): 1551-62, 2010 Oct.
Article in English | MEDLINE | ID: mdl-21062155

ABSTRACT

The welcoming attitude of the 'omics community, journals, and research funders towards data sharing, coupled with successful implementations of data standards, has resulted in resource dissemination and a better understanding of many diseases, including cancer. Sharing experimental data is beneficial for knowledge generation, allowing reproduction and validation of results. Adherence to a reporting guideline enables full value to be extracted from costly data and is an inexpensive way to increase quality. For therapy data in particular, easy access to the range of new approaches and the ability to perform valid comparisons between them would be especially useful. We discuss initiatives that support resource sharing and summarize three reporting guidelines for experimental data that have been adopted successfully. Finally, we introduce a new guideline that encompasses the diverse data types in therapeutic experiments and is intended to be of use to the cancer therapeutics community.


Subject(s)
Guideline Adherence/standards , Guidelines as Topic/standards , Informatics/standards , Information Dissemination/methods , Neoplasms , Animals , Humans , Informatics/methods , Informatics/organization & administration
10.
Genes Brain Behav ; 19(7): e12676, 2020 09.
Article in English | MEDLINE | ID: mdl-32445272

ABSTRACT

Phenotyping mouse model systems of human disease has proven to be a difficult task, with frequently poor inter- and intra-laboratory replicability, particularly in behavioral domains such as social and cognitive function. However, establishing robust animal model systems with strong construct validity is of fundamental importance, as they are central tools for understanding disease pathophysiology and developing therapeutics. To complete our studies of mouse model systems relevant to autism spectrum disorder (ASD), we present a replication of the main findings from our two published studies of five genetic mouse model systems of ASD. To assess the intra-laboratory robustness of previous results, we chose the two model systems that showed the greatest phenotypic differences, the Shank3/F and Cntnap2 models, and repeated assessments of general health, activity, and social behavior. We additionally explored all five model systems in the same framework, comparing all results obtained in this three-year-long effort using informatics techniques to assess commonalities and differences. Our results showed high intra-laboratory replicability, even for effects that were not particularly large, suggesting that discrepancies in the literature may depend on subtle but pivotal differences in testing conditions, housing enrichment, or background strains, and less so on the variability of the behavioral phenotypes themselves. The overall informatics analysis suggests that, in our behavioral assays, the set of tested mouse model systems can be separated into two main classes that in some respects lie on opposite ends of the behavioral spectrum, supporting the view that autism is not a unitary concept.


Subject(s)
Autism Spectrum Disorder/genetics , Behavior, Animal , Disease Models, Animal , Informatics/methods , Animals , Autism Spectrum Disorder/physiopathology , Body Weight , Female , Informatics/standards , Learning , Male , Membrane Proteins/genetics , Mice , Mice, Inbred C57BL , Microfilament Proteins/genetics , Nerve Tissue Proteins/genetics , Reproducibility of Results , Social Behavior
12.
Sci Eng Ethics ; 15(4): 467-89, 2009 Dec.
Article in English | MEDLINE | ID: mdl-19247811

ABSTRACT

This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.


Subject(s)
Authorship , Editorial Policies , Informatics/standards , Periodicals as Topic/standards , Research/standards , Computers/history , Guidelines as Topic , History, 20th Century , Humans , Informatics/ethics , Periodicals as Topic/ethics , Periodicals as Topic/history , Research/history
13.
Hu Li Za Zhi ; 56(3): 29-35, 2009 Jun.
Article in Zh | MEDLINE | ID: mdl-19472110

ABSTRACT

While the quality of data affects every aspect of business, it is frequently overlooked in customer data integration, data warehousing, business intelligence, and enterprise applications. Regardless of which data terms are used, a high level of data quality is a critical precondition for satisfying user needs and for developing effective applications. In this paper, the author introduces methods, a management framework, and the major factors involved in data quality assessment. The author also integrates expert opinions to develop data quality assessment tools.


Subject(s)
Database Management Systems , Informatics/standards , Database Management Systems/organization & administration , Humans
14.
Mol Inform ; 38(4): e1800108, 2019 04.
Article in English | MEDLINE | ID: mdl-30499195

ABSTRACT

Despite the increasing volume of available data, the proportion of experimentally measured data remains small compared with the virtual chemical space of possible chemical structures. Therefore, there is strong interest in simultaneously predicting different ADMET and biological properties of molecules, which are frequently strongly correlated with one another. Such joint data analyses can increase the accuracy of models by exploiting a common representation and identifying features shared between individual properties. In this work we review recent developments in multi-task learning approaches and cover the freely available tools and packages that can be used to perform such studies.
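The sketch below is a minimal illustration of the joint-modelling idea, not the authors' code: a single network with a shared hidden layer predicts three synthetic, correlated endpoints at once, so a common representation is learned across tasks. The descriptors, endpoints, and data are placeholders.

```python
# Minimal multi-task (multi-output) regression sketch on synthetic data.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic "descriptor" matrix: 500 molecules x 64 features.
X = rng.normal(size=(500, 64))

# Three correlated endpoints (placeholders for e.g. solubility, permeability,
# clearance), generated from a shared latent signal plus task-specific noise.
shared = X @ rng.normal(size=(64, 1))
Y = np.hstack([shared + 0.3 * rng.normal(size=(500, 1)) for _ in range(3)])

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

# One network with a shared hidden layer predicts all three endpoints jointly,
# so the common feature representation is learned across tasks.
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X_tr, Y_tr)
print("Average R^2 across the three tasks:", round(model.score(X_te, Y_te), 3))
```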


Subject(s)
Chemistry/methods , Databases, Chemical , Informatics/methods , Machine Learning , Informatics/standards
15.
J Acad Nutr Diet ; 119(8): 1375-1382, 2019 08.
Article in English | MEDLINE | ID: mdl-31353011

ABSTRACT

It is the position of the Academy of Nutrition and Dietetics that nutrition informatics is a rapidly evolving area of practice for registered dietitian nutritionists and nutrition and dietetic technicians, registered; and that the knowledge and skills inherent to nutrition informatics permeate all areas of the dietetics profession. Further, nutrition and dietetics practitioners must continually learn and update their informatics knowledge and skills to remain at the forefront of nutrition practice. Nutrition informatics is the intersection of information, nutrition, and technology. However, informatics is not just using technology to do work. The essence of nutrition informatics is to manage nutrition data in combination with standards, processes, and technology to improve knowledge and practice that ultimately lead to improved quality of health care and work efficiency. Registered dietitian nutritionists and nutrition and dietetic technicians, registered, are already experts in using evidence to practice in all areas of nutrition and dietetics. To remain at the forefront of technological innovation, the profession must actively participate in the development of standards, processes, and technologies for providing nutrition care.


Subject(s)
Dietetics/standards , Informatics/standards , Nutrition Therapy/standards , Nutritionists/standards , Academies and Institutes , Clinical Competence , Dietetics/methods , Humans , Informatics/methods
16.
Diabetes Technol Ther ; 10(1): 16-24, 2008 Feb.
Article in English | MEDLINE | ID: mdl-18275359

ABSTRACT

BACKGROUND: Research suggests that Internet-based care management tools are associated with improvements in care and patient outcomes. However, although such tools change workflow, their usability is rarely addressed and reported. This article presents a usability study of an Internet-based informatics application called the Comprehensive Diabetes Management Program (CDMP), developed by content experts and technologists. Our aim is to demonstrate a process for conducting a usability study of such a tool and to report the results. METHODS: We conducted the usability test with six diabetes care providers under controlled conditions. Each provider worked with the CDMP in a single session using a "think aloud" process. Providers performed standardized tasks with fictitious patient data, and we observed how they approached these tasks, documenting verbalizations and subjective ratings. The providers then completed a usability questionnaire and interviews. RESULTS: Overall, the scores on the usability questionnaire were neutral to favorable. For specific subdomains of the questionnaire, the providers reported problems with the application's ease of use, performance, and support features, but were satisfied with its visual appeal and content. The results from the observational and interview data indicated areas for improvement, particularly in navigation and terminology. CONCLUSIONS: The usability study identified several issues for improvement, confirming the need for usability testing of Internet-based informatics applications, even those developed by experts. To our knowledge, there have been no other usability studies of an Internet-based informatics application with the functionality of the CDMP. Such studies can form the foundation for translating Internet-based medical informatics tools into clinical practice.


Subject(s)
Diabetes Mellitus/therapy , Health Personnel/education , Informatics/methods , Internet , Disease Management , Humans , Informatics/standards , Surveys and Questionnaires , User-Computer Interface
17.
J Am Med Inform Assoc ; 25(2): 206-209, 2018 02 01.
Article in English | MEDLINE | ID: mdl-28633483

ABSTRACT

As part of an interdisciplinary acute care patient portal task force with members from 10 academic medical centers and professional organizations, we held a national workshop with 71 attendees representing over 30 health systems, professional organizations, and technology companies. Our consensus approach identified 7 key sociotechnical and evaluation research focus areas related to the consumption and capture of information from patients, care partners (eg, family, friends), and clinicians through portals in the acute and post-acute care settings. The 7 research areas were: (1) standards, (2) privacy and security, (3) user-centered design, (4) implementation, (5) data and content, (6) clinical decision support, and (7) measurement. Patient portals are not yet in routine use in the acute and post-acute setting, and research focused on the identified domains should increase the likelihood that they will deliver benefit, especially as there are differences between needs in acute and post-acute care compared to the ambulatory setting.


Subject(s)
Continuity of Patient Care , Hospitalization , Informatics/standards , Patient Participation , Patient Portals , Computer Security , Decision Support Systems, Clinical , Family , Humans , Informatics/organization & administration , Patient Portals/standards
18.
Proteomics ; 7 Suppl 1: 35-40, 2007 Sep.
Article in English | MEDLINE | ID: mdl-17893862

ABSTRACT

The use of gel electrophoresis to separate and, in some instances, to quantify the abundance of large numbers of proteins from complex mixtures has been well established for several decades. The quantity of publicly available data is still relatively modest, owing to a lack of community-accepted data standards, tools to facilitate the data-sharing process, and controlled vocabularies to ensure that consistent terminology is used to describe the experimental methodology. It is becoming widely recognised that there are significant benefits in data sharing for proteomics, allowing results to be verified and new findings to be generated by re-analysis of published studies. We report on standards development by the Gel Analysis Workgroup of the Proteomics Standards Initiative. The workgroup develops reporting requirements, data formats and controlled vocabularies for experimental gel electrophoresis, and for informatics performed on gel images. We present a tutorial on how such resources can be used and how the community should get involved with the ongoing projects. Finally, we present a roadmap for future developments in this area.


Subject(s)
Electrophoresis, Gel, Two-Dimensional/standards , Proteomics/standards , Databases, Genetic , Informatics/standards , Vocabulary, Controlled
19.
Inform Prim Care ; 15(3): 143-50, 2007.
Article in English | MEDLINE | ID: mdl-18005561

ABSTRACT

BACKGROUND: Routinely collected general practice computer data are used for quality improvement; poor data quality, including inconsistent coding, can reduce their usefulness. OBJECTIVE: To document the diversity of data entry systems currently in use in UK general practice and highlight possible implications for data quality. METHOD: General practice volunteers provided screen shots of the clinical coding screen they would use to code a diagnosis or problem title in the clinical consultation. The six clinical conditions examined were: depression, cystitis, type 2 diabetes mellitus, sore throat, tired all the time, and myocardial infarction. We looked at the picking lists generated for these problem titles in the EMIS, IPS, GPASS and iSOFT general practice clinical computer systems, using the Triset browser as a gold standard for comparison. RESULTS: A mean of 19.3 codes is offered in the picking list after entering a diagnosis or problem title. EMIS produced the longest picking lists and GPASS the shortest, with a mean number of choices of 35.2 and 12.7, respectively. Approximately three-quarters (73.5%) of codes are diagnoses, one-eighth (12.5%) are symptom codes, and the remainder come from a range of Read chapters. There was no readily detectable, consistent order in which codes were displayed. Velocity coding, whereby commonly used codes are placed higher in the picking list, results in variation between practices even where they have the same brand of computer system. CONCLUSIONS: Current systems for clinical coding promote diversity rather than consistency of clinical coding. As the UK moves towards an integrated health IT system, consistency of coding will become more important. A standardised, limited list of codes for primary care might help address this need.


Subject(s)
Data Display/standards , Informatics/standards , Private Practice/classification , Private Practice/standards , Depression/classification , Depression/epidemiology , Humans , Software , United Kingdom
20.
Stud Health Technol Inform ; 129(Pt 1): 233-6, 2007.
Article in English | MEDLINE | ID: mdl-17911713

ABSTRACT

In the past, the training of health information professionals (HIPs) has focussed almost exclusively on technical matters, the concerns of software developers and purveyors have essentially centred on security and functionality, and health care providers have mainly worried about costs and efficiency. This paper outlines some ethical threats that are ignored by such a purely technical focus and argues that because of the increasing globalization of health care delivery through e-Health, and because of the international threats to confidentiality posed by legislation such as the US Patriot Act, the health informatics community should pursue a project of global certification for HIPs that includes information ethics as an integral component. It also argues that a corresponding certification process for health care institutions and software developers should be initiated.


Subject(s)
Certification , Informatics/standards , Codes of Ethics , Confidentiality , Informatics/education , Informatics/ethics , Organizations/ethics , Security Measures/legislation & jurisprudence , United States