Results 1 - 20 of 26
1.
Environ Health ; 20(1): 104, 2021 Sep 17.
Article in English | MEDLINE | ID: mdl-34535123

ABSTRACT

Toxic chemicals - "toxicants" - have been studied and regulated as single entities, and, carcinogens aside, almost all toxicants, single or mixed and however altered, have been thought harmless in very low doses or very weak concentrations. Yet much work in recent decades has shown that toxicants can injure wildlife, laboratory animals, and humans following exposures previously expected to be harmless. Additional work has shown that toxicants can act not only individually and cumulatively but also collectively and even synergistically and that they affect disadvantaged communities inordinately - and therefore, as argued by reformers, unjustly. As late as December 2016, the last full month before the inauguration of a president promising to rescind major environmental regulations, the United States federal environmental-health establishment, as led by the Environmental Protection Agency (EPA), had not developed coherent strategies to mitigate such risks, to alert the public to their plausibility, or to advise leadership in government and industry about their implications. To understand why, we examined archival materials, reviewed online databases, read internal industry communications, and interviewed experts. We confirmed that external constraints, statutory and judicial, had been in place prior to EPA's earliest interest in mixture toxicity, but we found no overt effort, certainly no successful effort, to loosen those constraints. We also found internal constraints: concerns that fully committing to the study of complex mixtures involving numerous toxicants would lead to methodological drift within the toxicological community and that trying to act on insights from such study could lead only to regulatory futility. Interaction of these constraints, external and internal, shielded the EPA by circumscribing its responsibilities and by impeding movement toward paradigmatic adjustment, but it also perpetuated scientifically dubious policies, such as those limiting the evaluation of commercial chemical formulations, including pesticide formulations, to only those ingredients said by their manufacturers to be active. In this context, regulators' disregard of synergism contrasted irreconcilably with biocide manufacturers' understanding that synergism enhanced lethality and patentability. In the end, an effective national response to mixture toxicity, cumulative risk, and environmental injustice did not emerge. In parallel, though, the National Institute of Environmental Health Sciences, which was less constrained, pursued with scientific investigation what the EPA had not pursued with regulatory action.


Subjects
Environmental Policy/history; Environmental Pollutants/toxicity; Hazardous Substances/toxicity; National Institute of Environmental Health Sciences (U.S.)/history; Risk Assessment/history; United States Environmental Protection Agency/history; Environmental Health/history; Government Regulation; History, 20th Century; History, 21st Century; Humans; Social Justice; United States
3.
Risk Anal ; 40(S1): 2218-2230, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33135225

ABSTRACT

Before the founding of the Society for Risk Analysis (SRA) in 1980, food safety in the United States had long been a concern, but there was a lack of systematic methods to assess food-related risks. In 1906, the U.S. Congress passed, and President Roosevelt signed, the Pure Food and Drug Act and the Meat Inspection Act to regulate food safety at the federal level. These Acts followed the publication of multiple reports of food contamination, culminating in Upton Sinclair's novel The Jungle, which highlighted food and worker abuses in the meatpacking industry. Later in the 20th century, important developments in agricultural and food technology greatly increased food production. But chemical exposures from agricultural and other practices resulted in major amendments to federal food laws, including the Delaney Clause, aimed specifically at cancer-causing chemicals. When quantitative risk assessment methods were given greater scientific status in a seminal National Research Council report, food safety risk assessment became more systematized. Additionally, in these last 40 years, food safety research has resulted in increased understanding of a range of health effects from foodborne chemicals, and technological developments have improved U.S. food safety from farm to fork by offering new ways to manage risks. We discuss the history of food safety and the role risk analysis has played in its evolution, starting from over a century ago, but focusing on the last 40 years. While we focus on chemical risk assessment in the U.S., we also discuss microbial risk assessment and international food safety.


Subjects
Food Safety; Risk Assessment/history; Carcinogens/analysis; Food Contamination/analysis; History, 20th Century; United States; United States Food and Drug Administration
4.
Environ Res ; 158: 773-788, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28756009

ABSTRACT

The LNT single-hit model was derived from the Nobel Prize-winning research of Herman J. Muller, who showed that X-rays could induce gene mutations in Drosophila and that the dose response for these so-called mutational events was linear. Lewis J. Stadler, another well-known and respected geneticist at the time, strongly disagreed with and challenged Muller's claims. Detailed evaluations by Stadler over a prolonged series of investigations revealed that Muller's experiments had induced gross heritable chromosomal damage instead of the specific gene mutations Muller had claimed at his Nobel Lecture. These X-ray-induced alterations became progressively more frequent and of larger magnitude (more destructive) with increasing doses. Thus, Muller's claim of having induced discrete gene mutations represented a substantial speculative overreach and was, in fact, without proof. The post hoc arguments of Muller to support his gene-mutation hypothesis were significantly challenged and weakened by a series of new findings in the areas of cytogenetics, reverse mutation, adaptive and repair processes, and modern molecular methods for estimating induced genetic damage. These findings represented critical and substantial limitations to Muller's hypothesis of X-ray-induced gene mutations. Furthermore, they challenged the scientific foundations used in support of the LNT single-hit model by severing the logical nexus between Muller's data on radiation-induced heritable alterations and the model itself. These findings exposed fundamental scientific flaws that undermined not only the seminal recommendation of the 1956 BEAR I Genetics Panel to adopt the LNT single-hit model for risk assessment but also any rationale for its continued use in the present day.


Subjects
Mutation/radiation effects; Neoplasms/etiology; Risk Assessment/history; Animals; Dose-Response Relationship, Radiation; Drosophila/radiation effects; History, 20th Century; Humans; Models, Genetic
5.
Environ Res ; 154: 362-379, 2017 Apr.
Article in English | MEDLINE | ID: mdl-28167448

ABSTRACT

There are both statistically valid and invalid reasons why scientists with differing default hypotheses can disagree in high-profile situations. Examples can be found in recent correspondence in this journal, which may offer lessons for resolving challenges to mainstream science, particularly when adherents of a minority view attempt to elevate the status of outlier studies and/or claim that self-interest explains the acceptance of the dominant theory. Edward J. Calabrese and I have been debating the historical origins of the linear no-threshold theory (LNT) of carcinogenesis and its use in the regulation of ionizing radiation. Professor Calabrese, a supporter of hormesis, has charged a committee of scientists with misconduct in their preparation of a 1956 report on the genetic effects of atomic radiation. Specifically, he argues that the report mischaracterized the LNT research record and suppressed calculations of some committee members. After reviewing the available scientific literature, I found that the contemporaneous evidence overwhelmingly favored a (genetics) LNT and that no calculations were suppressed. Calabrese's claims about the scientific record do not hold up, primarily because of a lack of attention to statistical analysis. Ironically, outlier studies were more likely to favor supra-linearity, not sub-linearity. Finally, the claim of investigator bias, which underlies Calabrese's accusations about key studies, is based on a misreading of text. Attention to ethics charges, early on, may help seed a counter-narrative explaining the community's adoption of a default hypothesis and may help focus attention on valid evidence and any real weaknesses in the dominant paradigm.


Subjects
Carcinogenesis/radiation effects; Hormesis/radiation effects; Mutation/radiation effects; Neoplasms, Radiation-Induced/genetics; Neoplasms/radiotherapy; Radiotherapy/adverse effects; Radiotherapy/history; Dose-Response Relationship, Radiation; History, 20th Century; History, 21st Century; Humans; Neoplasms/history; Neoplasms, Radiation-Induced/history; Radiation, Ionizing; Risk Assessment/history; Maximum Allowable Concentration
7.
Sci Context ; 28(3): 427-64, 2015 Sep.
Article in English | MEDLINE | ID: mdl-26256506

ABSTRACT

This paper brings together the history of risk and the history of DNA repair, a biological phenomenon that emerged as a research field between molecular biology, genetics, and radiation research in the 1960s. The case of xeroderma pigmentosum (XP), an inherited hypersensitivity to UV light and, hence, a disposition to skin cancer, will be the starting point for arguing that, in the 1970s and 1980s, DNA repair became entangled in the creation of new models of the human body at risk (what is here conceptually referred to as the vulnerability aspect of body history) and in new attempts at cancer prevention and enhancement of the body associated with the newly flourishing research areas of antimutagenesis and anticarcinogenesis. The aim will be to demonstrate that DNA repair gave rise to a distinctive approach to disease prevention: molecular enhancement, the search for means to increase the self-repair abilities of the body at the molecular level. Prevention in this sense meant enhancing the body's ability to cope with the environmental hazards of an already toxic world. This strategy has recently been adopted by the beauty industry, which has introduced DNA care as a new target for skin-care research and anti-aging formulas.


Subjects
DNA Repair; Genetics/history; Molecular Biology/history; Radiologic Health/history; Xeroderma Pigmentosum/history; History, 20th Century; History, 21st Century; Humans; Risk Assessment/history; Xeroderma Pigmentosum/etiology
8.
Ann Epidemiol ; 25(3): 147-54, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25721747

ABSTRACT

PURPOSE: Since Doll published the first population attributable fraction (PAF) in 1951, it has been a mainstay of epidemiology. Confusion in terminology abounds with regard to these measures, and the ability to estimate all of them in case-control studies as well as in cohort studies is not widely appreciated. METHODS: This article reviews and comments on the historical development of the population attributable fraction (PAF), the exposed attributable fraction (EAF), the rate difference (ID), the population rate (or incidence) difference (PID), and the caseload difference (CD). RESULTS: The desire for PAFs to sum to no more than 100% and the interpretation of the complement of a PAF as the proportion of a rate attributable to other causes are shown to stem from the same problem: a failure to recognize the pervasiveness of shared etiologic responsibility among causes. A lack of appreciation that "expected" numbers of cases and deaths are not actually the numbers to be expected when an exposure or intervention appreciably affects the person-time denominators of rates, as in the case of smoking and non-normal body mass, makes many CD estimates inflated. A movement may be gaining momentum to shift away from assuming, often unrealistically, the complete elimination of harmful exposures and toward estimating the effects of realistic interventions. This movement could culminate in a merger of the academic concept of transportability with the applied discipline of risk assessment. CONCLUSIONS: A suggestion is offered to pay more attention to absolute measures such as the rate difference, the population rate difference, and the caseload difference, when they can be validly estimated, and less attention to proportional measures such as the EAF and PAF.
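For reference, the standard forms of the measures discussed, as conventionally defined in epidemiology texts (the notation is assumed here, not taken from the article): write I_p for the population rate, I_e for the rate in the exposed, and I_u for the rate in the unexposed. Then

```latex
\[
\mathrm{PAF} = \frac{I_p - I_u}{I_p}, \qquad
\mathrm{EAF} = \frac{I_e - I_u}{I_e}, \qquad
\mathrm{ID} = I_e - I_u, \qquad
\mathrm{PID} = I_p - I_u,
\]
\[
\mathrm{CD} = \mathrm{PID} \times T, \quad T = \text{population person-time at risk.}
\]
```

The last line makes the abstract's point about inflated CD estimates concrete: if an exposure also changes the person-time T (as smoking does by shortening life), multiplying a rate difference by the unadjusted T misstates the caseload.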


Subjects
Case-Control Studies; Cohort Studies; Incidence; Population Surveillance/methods; Risk Assessment/history; Female; History, 20th Century; Humans; Male; Middle Aged; Risk Assessment/methods; Risk Factors
9.
Arch Toxicol ; 87(9): 1621-33, 2013 Sep.
Article in English | MEDLINE | ID: mdl-23887208

ABSTRACT

This paper identifies the origin of the linearity at low-dose concept [i.e., linear no threshold (LNT)] for ionizing radiation-induced mutation. After the discovery of X-ray-induced mutations, Olson and Lewis (Nature 121(3052):673-674, 1928) proposed that cosmic/terrestrial radiation-induced mutations provide the principal mechanism for the induction of heritable traits, providing the driving force for evolution. For this concept to be general, a LNT dose relationship was assumed, with genetic damage proportional to the energy absorbed. Subsequent studies suggested a linear dose response for ionizing radiation-induced mutations (Hanson and Heys in Am Nat 63(686):201-213, 1929; Oliver in Science 71:44-46, 1930), supporting the evolutionary hypothesis. Based on an evaluation of spontaneous and ionizing radiation-induced mutation with Drosophila, Muller argued that background radiation had a negligible impact on spontaneous mutation, discrediting the ionizing radiation-based evolutionary hypothesis. Nonetheless, an expanded set of mutation dose-response observations provided a basis for collaboration between theoretical physicists (Max Delbruck and Gunter Zimmer) and the radiation geneticist Nicolai Timoféeff-Ressovsky. They developed interrelated physical science-based genetics perspectives including a biophysical model of the gene, a radiation-induced gene mutation target theory and the single-hit hypothesis of radiation-induced mutation, which, when integrated, provided the theoretical mechanism and mathematical basis for the LNT model. The LNT concept became accepted by radiation geneticists and recommended by national/international advisory committees for risk assessment of ionizing radiation-induced mutational damage/cancer from the mid-1950s to the present. The LNT concept was later generalized to chemical carcinogen risk assessment and used by public health and regulatory agencies worldwide.
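To make the single-hit logic concrete, here is the textbook target-theory calculation (standard background, not reproduced from the paper): if a mutation requires a single ionizing "hit" within a sensitive target, and hits at dose D are Poisson-distributed with mean kD for some target-size constant k, then

```latex
\[
P(\text{mutation}) = 1 - e^{-kD} \approx kD \quad \text{for } kD \ll 1,
\]
```

so the predicted response is proportional to dose at low doses and has no threshold; this is the mathematical core of the LNT model.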


Subjects
Carcinogenesis/radiation effects; Mutagenesis/radiation effects; Radiation Injuries/prevention & control; Radiation, Ionizing; Toxicology/history; Animals; Carcinogenesis/chemically induced; Carcinogens/administration & dosage; Carcinogens/toxicity; Dose-Response Relationship, Drug; Dose-Response Relationship, Radiation; Environmental Illness/prevention & control; History, 20th Century; History, 21st Century; Humans; Linear Models; Mutagenesis/drug effects; Mutagens/administration & dosage; Mutagens/toxicity; Occupational Injuries/prevention & control; Radiation Tolerance; Risk Assessment/history; Risk Assessment/legislation & jurisprudence; Risk Assessment/methods; Maximum Allowable Concentration; Toxicology/legislation & jurisprudence; United States
11.
J Environ Monit ; 14(2): 340-7, 2012 Feb.
Article in English | MEDLINE | ID: mdl-22109739

ABSTRACT

This paper reviews how aerosol exposure assessment, for people in both working and living environments, has evolved over the years. It charts the main scientific developments that led to progressively improved ways of thinking and methods to assess exposure to airborne particulate matter in a manner more relevant to human health. It has been a long scientific journey as one generation of pioneering contributors has handed off to the next. In the process a consistent rationale has emerged, producing aerosol sampling criteria, and in turn exposure standards, which have been increasingly relevant to actual human exposures. The journey continues as a new generation of scientists steps up to deal with the new challenges that are emerging. An appreciation of the history of what went before is essential to charting the most effective path looking forward.


Subjects
Aerosols/analysis; Air Pollutants/analysis; Environmental Exposure/analysis; Air Pollution/statistics & numerical data; Environmental Exposure/history; Environmental Exposure/statistics & numerical data; Environmental Monitoring/history; Environmental Monitoring/statistics & numerical data; History, 17th Century; History, 20th Century; History, 21st Century; Humans; Risk Assessment/history; Risk Assessment/trends
12.
Arch Toxicol ; 83(3): 203-25, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19247635

ABSTRACT

This article assesses the historical foundations of how linearity at low dose became accepted by the scientific and regulatory communities. While the threshold model was used in the 1920s and 1930s in establishing radiation health standards, its foundations were challenged by the genetics community, which argued that radiation-induced mutations in reproductive cells followed a linear response and were cumulative and deleterious. The scientific foundations of linearity for gonadal mutations rested on inconclusive evidence from studies that were not conducted at low doses. Following years of debate, leaders in the genetics community participated in the U.S. National Academy of Sciences (NAS) Biological Effects of Atomic Radiation (BEAR) I Committee (1956), where their perspective was accepted and linearity for radiation-induced mutational effects was incorporated into risk assessment. Over time, the concept of linearity was generalized to include somatic effects induced by radiation, based on a protectionist philosophy. This affected the course of radiation-induced and, later, chemically induced carcinogen risk assessment. Acceptance of linearity at low dose for chemical carcinogens was strongly influenced by the NAS Safe Drinking Water Committee report of 1977, which provided the critical guidance that led the U.S. EPA to adopt linear-at-low-dose modeling for risk assessment of chemical carcinogens with little supportive data, much of which has been either discredited or seriously weakened over the past three decades. Nonetheless, there has been little practical change in regulatory policy concerning carcinogen risk assessment. These observations suggest that while scientific disciplines are self-correcting, regulatory 'science' fails to display the same self-correcting mechanism despite contradictory data.


Subjects
Carcinogens/toxicity; Dose-Response Relationship, Drug; Linear Models; Animals; Carcinogenicity Tests; History, 20th Century; Humans; Mutation; Risk Assessment/history; Risk Assessment/methods
13.
Yakushigaku Zasshi ; 44(2): 64-70, 2009.
Article in Japanese | MEDLINE | ID: mdl-20527311

ABSTRACT

The first remarkable adverse drug reaction (ADR) reported in Japan was anaphylactic shock caused by penicillin. Although intradermal testing for antibiotics had long been used as a method of predicting anaphylactic shock, it was discontinued in 2004 because there was no evidence of its predictive value. Limb and other malformations caused by thalidomide became a global problem, and thalidomide was withdrawn from the market; teratogenicity testing during new drug development has been implemented since 1963. Chinoform (clioquinol)-iron chelate was detected in the green tongue and green urine of patients with subacute myelo-optic neuropathy (SMON) and identified as a causal agent of SMON in 1970. Chinoform was withdrawn from the market, and a fund for relief of health damage caused by ADRs was established in 1979. The co-administration of sorivudine with fluorouracil anticancer agents induced fatal agranulocytosis, and sorivudine was withdrawn from the market in 1993 after being on sale for only one month. The guidelines for package inserts were revised on this occasion, and early-phase pharmacovigilance of new drugs was introduced later. Because acquired immune deficiency syndrome and hepatitis B and C were transmitted by virus-contaminated blood products, the Ministry of Health, Labour and Welfare tightened regulations on biological products in 2003, and a fund for relief of health damage caused by infections derived from biological products was established in 2004. Other remarkable ADRs were quadriceps contracture induced by repeated intramuscular injections and Creutzfeldt-Jakob disease caused by the transplantation of dried human cranial dura mater. The significance of drug safety measures based on experience with ADRs is worthy of notice. New drugs are approved on the basis of a benefit-risk assessment: the expected therapeutic benefits must outweigh the possible risks associated with treatment. Because unexpected, rare, and serious ADRs are detected only after administration to many patients in the post-marketing stage, risk management is required throughout product life-cycle management.


Subjects
Drug-Related Side Effects and Adverse Reactions/history; Drug-Related Side Effects and Adverse Reactions/prevention & control; History, 20th Century; History, 21st Century; Humans; Japan; Risk Assessment/history; Risk Management/history
15.
J Neurosurg ; 108(1): 186-93, 2008 Jan.
Article in English | MEDLINE | ID: mdl-18173333

ABSTRACT

An important factor in making a recommendation for treatment of a patient with an arteriovenous malformation (AVM) is to estimate the risk of surgery for that patient. A simple, broadly applicable grading system designed to predict the risk of morbidity and mortality attending the operative treatment of specific AVMs is proposed. The lesion is graded on the basis of size, pattern of venous drainage, and neurological eloquence of adjacent brain. All AVMs fall into one of six grades. Grade I malformations are small, superficial, and located in non-eloquent cortex; Grade V lesions are large, deep, and situated in neurologically critical areas; and Grade VI lesions are essentially inoperable AVMs. Retrospective application of this grading scheme to a series of surgically excised AVMs has demonstrated its correlation with the incidence of postoperative neurological complications. The application of a standardized grading scheme will enable comparison of results between various clinical series and between different treatment techniques, and will assist in the process of management decision making.
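As a concrete illustration, below is a minimal sketch of how such a grade can be computed from the three factors, using the point values of the published Spetzler-Martin scale; the function name and interface are invented for illustration:

```python
def avm_grade(size_cm: float, eloquent: bool, deep_venous_drainage: bool) -> int:
    """Return an AVM grade (1-5) from the three Spetzler-Martin factors.

    Point values of the published scale: size <3 cm = 1 point, 3-6 cm = 2,
    >6 cm = 3; eloquent adjacent brain adds 1; deep venous drainage adds 1.
    Grade VI (essentially inoperable) is a clinical judgment, not a point total.
    """
    size_points = 1 if size_cm < 3 else (2 if size_cm <= 6 else 3)
    return size_points + int(eloquent) + int(deep_venous_drainage)

# Example: a 4 cm AVM in eloquent cortex with deep venous drainage -> Grade IV.
assert avm_grade(4.0, eloquent=True, deep_venous_drainage=True) == 4
```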


Subjects
Cerebral Angiography/history; Intracranial Arteriovenous Malformations/history; Risk Assessment/history; History, 20th Century; Humans; Intracranial Arteriovenous Malformations/diagnostic imaging; Intracranial Arteriovenous Malformations/surgery; Postoperative Complications
16.
J Radiol Prot ; 26(2): 141-59, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16738413

ABSTRACT

Since the discovery of fission, the notion of a chain reaction in a critical mass releasing massive amounts of energy has haunted physicists. The possibility of a bomb or a reactor prompted much of the early work on determining a critical mass, but the need to avoid an accidental critical excursion during processing or transport of fissile material drove much that took place subsequently. Because of the variety of possible situations that might arise, it took some time to develop adequate theoretical tools for criticality safety and the early assessments were based on direct experiment. Some extension of these experiments to closely similar situations proved possible, but it was not until the 1960s that theoretical methods (and computers to run them) developed enough for them to become reliable assessment tools. Validating such theoretical methods remained a concern, but by the end of the century they formed the backbone of criticality safety assessment. This paper traces the evolution of these methods, principally in the UK and USA, and summarises some related work concerned with the nature of criticality accidents and their radiological consequences. It also indicates how the results have been communicated and used in ensuring nuclear safety.
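The central quantity in such assessments, stated here as standard background rather than anything specific to this paper, is the effective neutron multiplication factor,

```latex
\[
k_{\mathrm{eff}} = \frac{\text{neutron population in generation } n+1}{\text{neutron population in generation } n},
\]
```

with k_eff < 1 subcritical, k_eff = 1 critical, and k_eff > 1 supercritical; a criticality safety assessment demonstrates that k_eff remains below unity, with margin, under all credible process and transport conditions.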


Subjects
Models, Chemical; Nuclear Fission; Nuclear Physics/history; Radioactive Hazard Release/prevention & control; Radiometry/history; Risk Assessment/history; Safety Management/history; History, 20th Century; History, 21st Century; Internationality; Radiation Protection/history
18.
Am Heart J ; 148(1): 16-26, 2004 Jul.
Article in English | MEDLINE | ID: mdl-15215787

ABSTRACT

Despite major advances in the diagnosis and treatment of atherosclerotic cardiovascular disease (CVD) in the past century, it remains a serious clinical and public health problem. Multivariable risk factor analysis is now commonly performed to identify high-risk candidates for CVD who need preventive measures and to seek out clues to the pathogenesis of the disease. The set of risk factors used for the former is constrained by practical considerations, and the set used for the latter is constrained by the hypothesis being tested. This report reviews the evolution and usefulness of multivariable risk functions crafted for estimating the risk of clinical manifestations of atherosclerosis and for gaining insights into their pathogenesis. Decades of evaluation of CVD risk factors by the Framingham Study led to the conclusion that CVD risk is most fruitfully appraised from the multivariable risk posed by a set of established risk factors. Such assessment is essential because risk factors seldom occur in isolation, and the risk associated with each varies widely depending on the burden of associated risk factors. About half the CVD in the general population arises from the segment with multiple marginal risk factor abnormalities. Although disease-specific profiles are available, a multivariable risk formulation for coronary disease composed of age, sex, the total/high-density lipoprotein cholesterol ratio, systolic blood pressure, glucose intolerance, cigarette smoking, and electrocardiographic left ventricular hypertrophy is also predictive of peripheral artery disease, heart failure, and stroke because of shared risk factors. Correcting risk factors for any particular CVD has the potential to protect against one or more of the others. Multivariable risk stratification is now recognized as essential for efficiently identifying likely candidates for CVD and quantifying the hazard.
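To show the general shape of such a multivariable risk function, here is a minimal logistic-model sketch over the risk factors named in the abstract; the coefficients are placeholders for illustration, not the published Framingham values:

```python
import math

def cvd_risk(age_years: float, male: bool, tc_hdl_ratio: float,
             sbp_mmhg: float, glucose_intolerance: bool,
             smoker: bool, ecg_lvh: bool) -> float:
    """Estimate CVD risk with a logistic multivariable risk function.

    Coefficients below are placeholders for illustration only; a real score
    uses coefficients fitted to cohort data (e.g., the Framingham Study).
    """
    x = (-9.0                        # intercept (placeholder)
         + 0.07 * age_years
         + 0.50 * male
         + 0.25 * tc_hdl_ratio
         + 0.018 * sbp_mmhg
         + 0.40 * glucose_intolerance
         + 0.55 * smoker
         + 0.80 * ecg_lvh)
    return 1.0 / (1.0 + math.exp(-x))  # logistic link maps score to (0, 1)

# Example: 55-year-old male smoker, TC/HDL 5.0, SBP 140, no diabetes, no LVH.
print(f"estimated risk: {cvd_risk(55, True, 5.0, 140, False, True, False):.1%}")
```

The point of the functional form is the one the abstract makes: risk depends on the joint burden of factors, so the same blood pressure carries a very different absolute risk depending on the other covariates.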


Subjects
Cardiovascular Diseases; Risk Assessment/methods; Adult; Aged; Cholesterol/blood; Female; Glucose Intolerance; History, 20th Century; History, 21st Century; Humans; Hypertension; Male; Middle Aged; Multivariate Analysis; Regression Analysis; Risk Assessment/history; Risk Factors; Triglycerides/blood
19.
Health Phys ; 85(1): 4-12, 2003 Jul.
Article in English | MEDLINE | ID: mdl-12852465

ABSTRACT

The theme that runs through this 26th Taylor Lecture is the question of how data on the mechanisms by which radiation and chemicals induce genetic alterations can be used to support the development of risk estimates, particularly at low exposure levels. The premise is that chromosomal alterations are involved in the development of tumors and birth defects, and that data generated for genetic alterations can be interpreted in terms of these adverse health outcomes. The general conclusions are that chromosomal alterations can be induced by ionizing radiation through a single energy-loss event in a target the size of a DNA molecule, and that aberrations generally result from misrepair of, or failure to repair, the induced lesions (generally assumed to be double-strand breaks). Chromosomal alterations induced by chemicals are produced almost exclusively by replication errors on a damaged DNA template. Thus, cell cycle stage and the fidelity of DNA repair and replication influence overall sensitivity to aberration induction. These same features are also important in considerations of genetic susceptibility: alterations in cell cycle control, DNA repair, or replication fidelity can alter sensitivity. The differences in the mechanisms by which ionizing radiation and chemicals induce chromosomal aberrations are most important when considering cells at risk and comparative sensitivities among species and cell types. Models of cancer induction have gradually evolved from initiation-promotion-progression models to multistep genetic models and, most recently, to one of six acquired characteristics. This evolution has shifted the focus of research from single genes and single cells to multiple genes (pathways) and whole tissues, areas ideally suited to the new genomics, proteomics, and computational modeling approaches. The attention is still on the role of genetic alterations in cancer and hereditary effects and the mechanisms of their formation; it is the approaches used to address these that are changing.


Subjects
Nuclear Medicine/history; Nuclear Physics/history; Animals; Chromosomes/radiation effects; England; History, 21st Century; Humans; Neoplasms, Radiation-Induced/history; Neoplasms, Radiation-Induced/prevention & control; Risk Assessment/history; Risk Assessment/methods; United States