Results 1 - 8 of 8
1.
Sci Eng Ethics; 21(4): 1033-48, 2015 Aug.
Article in English | MEDLINE | ID: mdl-24938695

ABSTRACT

For the past half-century, issues relating to the ethical conduct of human research have focused largely on the domains of medical and, more recently, social-psychological research. The modern regime of applied ethics, emerging as it has from the Nuremberg trials and certain other historical antecedents, applies the key principles of autonomy, respect for persons, beneficence, non-maleficence, and justice to human beings who enter trials of experimental drugs and devices (Martensen in J Hist Med Allied Sci 56(2):168-175, 2001). Institutions such as Institutional Review Boards (in the U.S.) and Ethics Committees (in Europe and elsewhere) oversee most governmentally funded medical research around the world, in more than a hundred nations that are signatories of the Declaration of Helsinki (World Medical Association 2008). Increasingly, research outside of medicine has been recognized to pose potential risks to human subjects of experiments. Ethics committees now operate in the U.S., Canada, the U.K., and Australia to oversee all governmentally funded research, and in other jurisdictions the range of research covered by such committees is expanding. Social science, anthropology, and other fields are falling under clearer directives to conduct a formal ethical review for basic research involving human participants (Federman et al. in Responsible research: a systems approach to protecting research participants. National Academies Press, Washington, 2003, p. 36). The legal and institutional response for protecting human subjects in the course of developing non-medical technologies, engineering, and design is currently vague, but some universities are establishing ethics committees to oversee their human subjects research even where the experiments involved are non-medical and not technically covered by the Declaration of Helsinki. In the Netherlands, as in most of Europe, Asia, Latin America, and Africa, no laws mandate an ethical review of non-medical research. Yet nearly 2 years ago we launched a pilot ethics committee at our technical university and began soliciting our colleagues to submit their studies for review. In the past year, we have become officially recognized as a human subjects ethics committee for our university and are beginning the process of requiring all studies using human subjects to apply for our approval. In this article, we consider some of the special problems relating to protecting human participants in a technology context, and discuss some of our experiences and insights from reviewing human subjects research at a technical university. We conclude that: no less than in medical studies, human participants in technology research benefit from ethics committees' reviews; practical requirements for publications, grants, and avoiding legal liability are also served by such committees; and ethics committees in such contexts share many similarities with medical ethics committees but have certain special foci of their own. We believe that this experience and these observations are helpful for those seeking to establish such committees in technology research contexts, and for framing the particular issues that may arise in such contexts, for the benefit of researchers and nascent committees seeking to establish their own procedures.


Subject(s)
Engineering/ethics; Ethical Review; Ethics Committees, Research; Human Experimentation/ethics; Research Subjects; Ethics, Research; Humans; Netherlands
2.
Genes (Basel); 14(6), 2023 Jun 1.
Article in English | MEDLINE | ID: mdl-37372394

ABSTRACT

Atherogenesis and dyslipidemia increase the risk of cardiovascular disease, which is the leading cause of death in developed countries. While blood lipid levels have been studied as disease predictors, their accuracy in predicting cardiovascular risk is limited by their high interindividual and interpopulation variability. The lipid ratios atherogenic index of plasma (AIP = log(TG/HDL-C)) and Castelli risk index 2 (CI2 = LDL-C/HDL-C) have been proposed as better predictors of cardiovascular risk, but the genetic variability associated with these ratios has not been investigated. This study aimed to identify genetic associations with these indexes. The study population (n = 426) included males (40%) and females (60%) aged 18-52 years (mean 39 years); the Infinium GSA array was used for genotyping. Regression models were developed using R and PLINK. AIP was associated with variation in APOC3, KCND3, CYBA, CCDC141/TTN, and ARRB1 (p-value < 2.1 × 10⁻⁶); the first three were previously associated with blood lipids. CI2 was associated with variants in DIPK2B, LIPC, and rs11251177 at 10q21.3 (p-value 1.1 × 10⁻⁷); the latter was previously linked to coronary atherosclerosis and hypertension. KCND3 rs6703437 was associated with both indexes. This study is the first to characterize the potential link between genetic variation and the atherogenic indexes AIP and CI2, highlighting the relationship between genetic variation and dyslipidemia predictors. These results also contribute to consolidating the genetics of blood lipids and lipid indexes.
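The two indexes above are simple arithmetic on routine lipid panel values. As a minimal illustrative sketch (not code from the study, whose association models used R and PLINK), the following Python assumes the conventional base-10 logarithm and mmol/L units for AIP, neither of which is stated in the abstract; the example values are hypothetical.

    import math

    def atherogenic_index_of_plasma(tg: float, hdl_c: float) -> float:
        # AIP = log(TG / HDL-C); base-10 log and mmol/L units are assumed here
        return math.log10(tg / hdl_c)

    def castelli_risk_index_2(ldl_c: float, hdl_c: float) -> float:
        # CI2 = LDL-C / HDL-C; unit-free as long as both inputs share units
        return ldl_c / hdl_c

    # Hypothetical lipid values, not drawn from the study population
    print(atherogenic_index_of_plasma(1.7, 1.2))  # ~0.15
    print(castelli_risk_index_2(3.0, 1.2))        # 2.5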


Subject(s)
Atherosclerosis; Coronary Artery Disease; Dyslipidemias; Male; Female; Humans; Case-Control Studies; Atherosclerosis/genetics; Coronary Artery Disease/genetics; Lipids; Dyslipidemias/genetics
3.
Sci Eng Ethics; 16(1): 119-33, 2010 Mar.
Article in English | MEDLINE | ID: mdl-19644770

ABSTRACT

The age-old maxim of scientists whose work has resulted in deadly or dangerous technologies is: scientists are not to blame, but rather technologists and politicians must be morally culpable for the uses of science. As new technologies threaten not just populations but species and biospheres, scientists should reassess their moral culpability when researching fields whose impact may be catastrophic. Looking at real-world examples such as smallpox research and the Australian "mousepox trick", and considering fictional or future technologies like Kurt Vonnegut's "ice-nine" from Cat's Cradle, and the "grey goo" scenario in nanotechnology, this paper suggests how ethical principles developed in biomedicine can be adjusted for science in general. An "extended moral horizon" may require looking not just to the effects of research on individual human subjects, but also to effects on humanity as a whole. Moreover, a crude utilitarian calculus can help scientists make moral decisions about which technologies to pursue and disseminate when catastrophes may result. Finally, institutions should be devised to teach these moral principles to scientists, and require moral education for future funding.


Subject(s)
Biomedical Research/ethics; Double Effect Principle; Moral Obligations; Professional Role; Science/ethics; Bioethics; Bioterrorism/ethics; Bioterrorism/prevention & control; Codes of Ethics; Decision Making/ethics; Ethical Theory; Forecasting; Genetic Engineering/ethics; Guidelines as Topic; Humans; Information Dissemination/ethics; Principle-Based Ethics; Research Support as Topic/ethics; Risk Assessment/ethics; Variola virus/genetics
4.
J Empir Res Hum Res Ethics; 9(3): 67-73, 2014 Jul.
Article in English | MEDLINE | ID: mdl-25746787

ABSTRACT

Human research ethics has developed in both theory and practice mostly from experiences in medical research. Human participants, however, are used in a much broader range of research than ethics committees oversee, including both basic and applied research at technical universities. Although ethics review of non-medical research involving humans is mandated in the United States, the United Kingdom, Canada, and Australia, it is not required in much of Europe, Asia, Latin America, and Africa. Our survey of the top 50 technical universities in the world shows that, where not specifically mandated by law, most do not employ ethics committees to review human studies. As the domains of basic and applied sciences expand, ethics committees are increasingly needed to guide and oversee all such research regardless of legal requirements. We offer as examples, from our experience as an ethics committee in a major European technical university, ways in which such a committee provides needed services and can help ensure more ethical studies involving humans outside the standard medical context. We offer some arguments for creating such committees, and in our supplemental article we provide specific examples of cases and concerns that may confront technical, engineering, and design research, and outline the general framework we have used in creating our committee.


Subject(s)
Engineering/ethics; Ethics Committees, Research; Human Experimentation/ethics; Research; Science/ethics; Technology/ethics; Universities; Ethical Review; Ethics, Research; Humans; Internationality; Research Design
5.
Nanoethics; 3(2): 157-166, 2009 Aug.
Article in English | MEDLINE | ID: mdl-20234881

ABSTRACT

Much of the discussion regarding nanotechnology centers around perceived and prophesied harms and risks. While there are real risks that could emerge from futuristic nanotechnology, there are other current risks involved with its development, not involving physical harms, that could prevent its full promise from being realized. Transitional forms of the technology, involving "microfab," or localized, sometimes desktop, manufacture, offer a good opportunity for a case study. How can we develop legal and regulatory institutions, specifically centered around the problems of intellectual property, that both stimulate innovation and make the best possible use of what will eventually be a market in "types" rather than "tokens"? This paper argues that this is the most critical current issue facing nanotechnology and suggests a way to approach it.

7.
J Empir Res Hum Res Ethics; 4(1): 43-58, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19374479

ABSTRACT

Ontologies describe reality in specific domains in ways that can bridge various disciplines and languages. They allow easier access and integration of information that is collected by different groups. Ontologies are currently used in the biomedical sciences, geography, and law. A Biomedical Ethics Ontology (BMEO) would benefit members of ethics committees who deal with protocols and consent forms spanning numerous fields of inquiry. There already exists the Ontology for Biomedical Investigations (OBI); the proposed BMEO would interoperate with OBI, creating a powerful information tool. We define a domain ontology and begin to construct a BMEO, focused on the process of evaluating human research protocols. Finally, we show how our BMEO can have practical applications for ethics committees. This paper describes ongoing research and a strategy for its broader continuation and cooperation.
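As a minimal sketch of what a machine-readable fragment of such a domain ontology could look like, the following Python uses rdflib to declare a few classes and relations relevant to protocol review, plus one example protocol instance. The namespace URI, class names, properties, and instances are hypothetical illustrations and are not taken from the actual BMEO or OBI.

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF, RDFS

    # Hypothetical namespace; the real BMEO/OBI IRIs would differ
    BMEO = Namespace("http://example.org/bmeo#")

    g = Graph()
    g.bind("bmeo", BMEO)

    # Declare a few illustrative classes used in protocol evaluation
    for cls in (BMEO.ResearchProtocol, BMEO.InformedConsentForm,
                BMEO.HumanParticipant, BMEO.RiskAssessment):
        g.add((cls, RDF.type, RDFS.Class))

    # An illustrative relation linking protocols to their consent forms
    g.add((BMEO.hasConsentForm, RDF.type, RDF.Property))
    g.add((BMEO.hasConsentForm, RDFS.domain, BMEO.ResearchProtocol))
    g.add((BMEO.hasConsentForm, RDFS.range, BMEO.InformedConsentForm))

    # A hypothetical protocol instance an ethics committee might evaluate
    g.add((BMEO.protocol_example, RDF.type, BMEO.ResearchProtocol))
    g.add((BMEO.protocol_example, RDFS.label,
           Literal("Usability study with human participants")))
    g.add((BMEO.protocol_example, BMEO.hasConsentForm, BMEO.consentForm_example))

    print(g.serialize(format="turtle"))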


Subject(s)
Decision Support Techniques; Ethical Analysis/methods; Knowledge Bases; Peer Review, Research/methods; Vocabulary, Controlled; Automation; Classification; Ethics Committees, Research; Humans; Reference Standards; Semantics; United States