1.
J Res Natl Inst Stand Technol ; 115(5): 373-92, 2010.
Article in English | MEDLINE | ID: mdl-27134792

ABSTRACT

Between 1911 and 1984, the National Bureau of Standards (NBS) conducted a large number of corrosion studies that included the measurement of corrosion damage to samples exposed to real-world environments. One of these studies was an investigation, conducted between 1922 and 1940, into the corrosion of bare steel and wrought iron pipes buried underground at 47 sites representing different soil types across the United States. At the start of this study, very little was known about the corrosion of ferrous alloys underground. The objectives were to determine (i) whether coatings would be required to prevent corrosion, and (ii) whether soil properties could be used to predict corrosion and determine when coatings would be required. While the study determined very quickly that coatings would be required for some soils, it found the results so divergent that even generalities based on the data had to be drawn with care. The investigators concluded that so many diverse factors influence corrosion rates underground that planning proper tests and interpreting the results were matters of considerable difficulty, and that quantitative interpretations or extrapolations could be made "only in approximate fashion" and attempted only in the "restricted area" of the tests until more complete information became available.

Following the passage of the Pipeline Safety Improvement Act in 2002, and at the urging of the pipeline industry, the Office of Pipeline Safety of the U.S. Department of Transportation approached the National Institute of Standards and Technology (NBS became NIST in 1988) and requested that the data from this study be reexamined to determine whether the information handling and analysis capabilities of modern computers and software could extract more meaningful information from these data. This report is a summary of the resulting investigations. The data from the original NBS studies were analyzed using a variety of commercially available software packages for statistical analysis. The emphasis was on identifying trends in the data that could later be exploited in the development of an empirical model for predicting the range of expected corrosion behavior for any given set of soil chemistry and conditions.

A large number of issues were identified with this corrosion dataset, but given the limited knowledge of corrosion and statistical analysis at the time the study was conducted, these shortcomings are not surprising, and many of them were recognized by the investigators before the study concluded. It is important to keep in mind, however, that complete soil data are provided for fewer than half of the sites in this study. In agreement with the initial study, it was concluded that differences in the corrosion behavior of the alloys could not be resolved: the scatter in the results caused by environmental factors obscured any significant difference between the alloys. Linear regression and curve fitting of the corrosion damage measurements against the measured soil composition and properties found some weak trends. These trends improved with multiple regression, and empirical equations representing the performance of the samples in the tests were developed with uncertainty estimates. The uncertainties in these empirical models for the corrosion data were large, and extrapolation beyond the parameter space or exposure times of these experiments will introduce additional uncertainties.
It is concluded that equations for estimating corrosion damage distributions and rates can be developed from these data, but such models will always have relatively large uncertainties that limit their utility. These uncertainties result from scatter in the measurements due to annual, seasonal, and sample-position-dependent variations at the burial sites. The data indicate that more complete datasets, with soil-property measurements reflecting the soil and groundwater directly in contact with the sample and obtained from statistically designed experiments, would greatly reduce this scatter and enable more representative predictions.
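A minimal sketch of the style of regression analysis described above, assuming Python with statsmodels (the original work used unnamed commercial statistical packages): a multiple regression of a corrosion-damage metric against soil properties, with a prediction interval for a new site. The column names and data are hypothetical placeholders, not the NBS dataset.

# Sketch: multiple regression of a corrosion-damage metric (e.g., maximum
# pit depth) against soil properties, with prediction-interval estimates.
# Column names and data are hypothetical, not the NBS/NIST dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 47  # one record per burial site, mirroring the 47 NBS sites
soil = pd.DataFrame({
    "ph": rng.uniform(4.0, 9.0, n),
    "resistivity_ohm_cm": rng.uniform(300, 20000, n),
    "chloride_ppm": rng.uniform(5, 500, n),
})
# Synthetic response with noise, standing in for measured pit depth (mils)
pit_depth = (80 - 6 * soil["ph"]
             - 0.002 * soil["resistivity_ohm_cm"]
             + 0.05 * soil["chloride_ppm"]
             + rng.normal(0, 10, n))

X = sm.add_constant(soil)           # design matrix with intercept
fit = sm.OLS(pit_depth, X).fit()    # ordinary least squares
print(fit.summary())                # coefficients with standard errors

# 95 % prediction interval for a hypothetical new site
new_site = sm.add_constant(
    pd.DataFrame({"ph": [6.5], "resistivity_ohm_cm": [1500],
                  "chloride_ppm": [120]}),
    has_constant="add")
pred = fit.get_prediction(new_site).summary_frame(alpha=0.05)
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])

The wide observation-level prediction interval produced by a fit like this is the code-level analogue of the "relatively large uncertainties" the abstract describes.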

2.
Addit Manuf ; 36, 2020 Dec.
Article in English | MEDLINE | ID: mdl-34141601

ABSTRACT

Melt pool monitoring (MPM) is a technique used in laser powder bed fusion (LPBF) to extract features from in situ sensor signals that correlate with defect formation or general part fabrication quality. Various melt pool phenomena have been shown to relate to the measured transient absorption of the laser energy, which, in turn, can be related to the melt pool emission measured in MPM systems. This paper describes the use of a reflectometer-based instrument to measure the dynamic laser energy absorption during single-line laser scans. Scans were conducted on bare metal and on a single powder layer of nickel alloy 625 (IN625) over a range of laser powers. In addition, a photodetector aligned co-axially with the laser, of the kind often found in commercial LPBF monitoring systems, measured the incandescent emission from the melt pool synchronously with the dynamic laser absorption. Relationships between the dynamic laser absorption, the co-axial MPM signal, and surface features on the tracks are observed, illustrating the melt pool dynamics that formed these features. Time-integrated measurements of laser absorption are shown to correlate well with the MPM signal and to indicate the transition between conduction and keyhole mode. This transition is corroborated by metallographic cross-section measurements as well as topographic measurements of the solidified tracks. Ultimately, this paper exemplifies the utility of dynamic laser absorption measurements for informing both the physical nature of the melt pool dynamics and the interpretation of process monitoring signals.
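A minimal sketch, under synthetic assumptions, of the correlation named above: integrate a dynamic absorptance trace and a co-axial MPM photodetector trace over each single-line scan, then correlate the two integrals across laser powers. The sampling rate, power levels, and sigmoidal absorptance model (standing in for the conduction-to-keyhole transition) are illustrative placeholders, not the measured IN625 behavior.

# Sketch: time-integrated laser absorption per scan track versus the
# time-integrated co-axial MPM photodetector signal. All data below are
# synthetic placeholders; variable names are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
powers_w = np.array([50, 100, 150, 200, 250, 300])  # laser power per track
dt = 1e-6                                           # 1 MHz sampling (assumed)
t = np.arange(0, 2e-3, dt)                          # 2 ms single-line scan

integrated_absorbed_j = []
integrated_mpm = []
for p in powers_w:
    # Placeholder dynamic absorptance: rises toward keyhole-mode values
    # at high power (the conduction-to-keyhole transition in the text).
    absorptance = 0.3 + 0.4 / (1 + np.exp(-(p - 180) / 30))
    a_t = absorptance + 0.02 * rng.standard_normal(t.size)
    v_t = 0.8 * a_t * p / 300 + 0.01 * rng.standard_normal(t.size)
    integrated_absorbed_j.append((a_t * p).sum() * dt)  # absorbed energy, J
    integrated_mpm.append(v_t.sum() * dt)               # MPM signal, V*s

r = np.corrcoef(integrated_absorbed_j, integrated_mpm)[0, 1]
print(f"Pearson r, integrated absorption vs. MPM signal: {r:.3f}")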

3.
Article in English | MEDLINE | ID: mdl-34123701

ABSTRACT

The complex physical nature of the laser powder bed fusion (LPBF) process warrants the use of multiphysics computational simulations to predict or design optimal operating parameters or resultant part qualities such as microstructure or defect concentration. Many of these simulations rely on tuning based on characteristics of the laser-induced melt pool, such as the melt pool geometry (length, width, and depth). Additionally, many of the numerous interacting variables that make the LPBF process so complex can be reduced and controlled by performing simple, single-track experiments on bare (no-powder) substrates, while still producing important and applicable physical results. The 2018 Additive Manufacturing Benchmark (AM Bench) tests and measurements were designed for this application. This paper describes the experiment design for the tests conducted using LPBF on bare metal surfaces, along with the measurement results for melt pool geometry and melt pool cooling rate obtained on two LPBF systems. Several factors, such as the accurate laser spot size, were determined after the 2018 AM Bench conference; the results of those additional tests are also reported here.
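A hedged sketch of how a melt-pool cooling rate of the kind reported here can be extracted from a temperature-time trace: the drop from liquidus to solidus divided by the time between the two crossings. The trace below is synthetic, the IN625 transition temperatures are approximate literature values, and the real radiometric measurement chain (calibration, emissivity, spot size) is not modeled.

# Sketch: cooling rate as a surface point cools from liquidus to solidus,
# taken from a radiometric temperature-time trace. Synthetic data only.
import numpy as np

T_LIQUIDUS_K = 1623.0  # ~1350 C, approximate literature value for IN625
T_SOLIDUS_K = 1563.0   # ~1290 C, approximate literature value for IN625

dt = 2e-6                                     # 500 kHz sampling (assumed)
t = np.arange(0, 1e-3, dt)
temp_k = 2200.0 * np.exp(-t / 2e-4) + 400.0   # placeholder cooling curve

# First indices where the trace crosses liquidus and solidus on the way down
i_liq = np.argmax(temp_k <= T_LIQUIDUS_K)
i_sol = np.argmax(temp_k <= T_SOLIDUS_K)

cooling_rate = (T_LIQUIDUS_K - T_SOLIDUS_K) / (t[i_sol] - t[i_liq])
print(f"Cooling rate through solidification: {cooling_rate:.3e} K/s")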
