Results 1 - 20 of 51
1.
J Comput Aided Mol Des ; 37(8): 373-394, 2023 08.
Article in English | MEDLINE | ID: mdl-37329395

ABSTRACT

Using generative deep learning models and reinforcement learning together can effectively generate new molecules with desired properties. By employing a multi-objective scoring function, thousands of high-scoring molecules can be generated, making this approach useful for drug discovery and material science. However, the application of these methods can be hindered by computationally expensive or time-consuming scoring procedures, particularly when a large number of function calls are required as feedback in the reinforcement learning optimization. Here, we propose the use of double-loop reinforcement learning with simplified molecular-input line-entry system (SMILES) augmentation to improve the efficiency and speed of the optimization. By adding an inner loop that augments the generated SMILES strings to non-canonical SMILES for use in additional reinforcement learning rounds, we can both reuse the scoring calculations at the molecular level, thereby speeding up the learning process, and gain additional protection against mode collapse. We find that employing between 5 and 10 augmentation repetitions is optimal for the scoring functions tested and is further associated with increased diversity in the generated compounds, improved reproducibility of the sampling runs, and the generation of molecules with higher similarity to known ligands.
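
The inner augmentation loop can be sketched with RDKit's randomized SMILES output. The snippet below is a minimal illustration under stated assumptions, not the authors' implementation: `expensive_scoring_function` is a hypothetical stand-in for the costly multi-objective scorer, and the cache simply shows how one score per molecule can be reused across its non-canonical SMILES variants.

```python
# Minimal sketch of the inner augmentation loop (illustrative, not the authors' code).
from functools import lru_cache
from rdkit import Chem

def expensive_scoring_function(canonical_smiles: str) -> float:
    """Hypothetical stand-in for a costly multi-objective scorer (e.g. docking + QSAR)."""
    return float(len(canonical_smiles) % 10) / 10.0   # dummy value, illustration only

@lru_cache(maxsize=None)
def cached_score(canonical_smiles: str) -> float:
    """Score keyed on the canonical SMILES, so every augmented (non-canonical)
    string of the same molecule reuses a single expensive evaluation."""
    return expensive_scoring_function(canonical_smiles)

def augment_smiles(smiles: str, n: int = 5) -> list[str]:
    """Return up to n randomized (non-canonical) SMILES of the same molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return []
    return list({Chem.MolToSmiles(mol, canonical=False, doRandom=True) for _ in range(n)})

def score_augmented(smiles: str, n_aug: int = 5) -> list[tuple[str, float]]:
    mol = Chem.MolFromSmiles(smiles)
    score = cached_score(Chem.MolToSmiles(mol))       # computed once per molecule
    return [(s, score) for s in augment_smiles(smiles, n_aug)]

print(score_augmented("CC(=O)Oc1ccccc1C(=O)O"))       # aspirin, 5 augmented copies
```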


Subjects
Drug Design , Neural Networks, Computer , Reproducibility of Results , Drug Discovery/methods
2.
J Comput Aided Mol Des ; 36(5): 363-371, 2022 05.
Article in English | MEDLINE | ID: mdl-34046745

ABSTRACT

Exploring the origin of multi-target activity of small molecules and designing new multi-target compounds are highly topical issues in pharmaceutical research. We have investigated the ability of a generative neural network to create multi-target compounds. Data sets of experimentally confirmed multi-target, single-target, and consistently inactive compounds were extracted from public screening data considering positive and negative assay results. These data sets were used to fine-tune the REINVENT generative model via transfer learning to systematically recognize multi-target compounds, distinguish them from single-target or inactive compounds, and construct new multi-target compounds. During fine-tuning, the model showed a clear tendency to increasingly generate multi-target compounds and structural analogs. Our findings indicate that generative models can be adopted for de novo multi-target compound design.


Subjects
Drug Design , Neural Networks, Computer
3.
Sensors (Basel) ; 22(12)2022 Jun 14.
Article in English | MEDLINE | ID: mdl-35746285

ABSTRACT

Classification is a very common image processing task. The accuracy of the classified map is typically assessed through a comparison with real-world situations or with available reference data to estimate the reliability of the classification results. Common accuracy assessment approaches are based on an error matrix and provide a measure of the overall accuracy. A frequently used index is the Kappa index. As the Kappa index has increasingly been criticized, various alternative measures have been investigated, with minimal success in practice. In this article, we introduce a novel index that overcomes these limitations. Unlike Kappa, it is not sensitive to asymmetric distributions. The quantity and allocation disagreement index (QADI) computes the degree of disagreement between the classification results and reference maps by counting wrongly labeled pixels as A and quantifying the difference in the pixel count for each class between the classified map and reference data as Q. These values are then used to determine a quantitative QADI value, which expresses the disagreement between a classification result and the reference (training) data. It can also be used to generate a graph that indicates the degree to which each factor contributes to the disagreement. The performance of Kappa and QADI was compared in six use cases. The results indicate that the QADI index generates more reliable classification accuracy assessments than the traditional Kappa index. We also developed a toolbox in a GIS software environment.
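
The Q and A components described above follow the standard quantity/allocation decomposition of an error matrix. The sketch below computes them from a confusion matrix; the exact way QADI combines the two values is defined in the paper, so only the components are shown here, with a toy matrix.

```python
import numpy as np

def disagreement_components(conf: np.ndarray) -> tuple[float, float]:
    """Quantity (Q) and allocation (A) disagreement from an error matrix
    (rows = classified map, columns = reference data), as proportions."""
    p = conf / conf.sum()                      # convert counts to proportions
    row, col = p.sum(axis=1), p.sum(axis=0)
    overall = 1.0 - np.trace(p)                # total disagreement
    q = 0.5 * np.abs(row - col).sum()          # class-size (quantity) mismatch
    a = overall - q                            # remaining spatial mis-allocation
    return float(q), float(a)

# Example: 3-class error matrix (pixel counts)
conf = np.array([[50, 3, 2],
                 [4, 40, 6],
                 [1, 5, 39]])
q, a = disagreement_components(conf)
print(f"Q = {q:.3f}, A = {a:.3f}, total disagreement = {q + a:.3f}")
```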


Assuntos
Processamento de Imagem Assistida por Computador , Tecnologia de Sensoriamento Remoto , Processamento de Imagem Assistida por Computador/métodos , Tecnologia de Sensoriamento Remoto/métodos , Reprodutibilidade dos Testes , Software
4.
Sensors (Basel) ; 22(9)2022 Apr 19.
Article in English | MEDLINE | ID: mdl-35590797

ABSTRACT

This work evaluates the performance of three machine learning (ML) techniques, namely logistic regression (LGR), linear regression (LR), and support vector machines (SVM), and two multi-criteria decision-making (MCDM) techniques, namely the analytical hierarchy process (AHP) and the technique for order of preference by similarity to ideal solution (TOPSIS), for mapping landslide susceptibility in the Chitral district, northern Pakistan. Moreover, we create landslide inventory maps from LANDSAT-8 satellite images through the change vector analysis (CVA) change detection method. The change detection yields more than 500 landslide spots. After manual post-processing correction, the landslide inventory spots are randomly split into two sets with a 70/30 ratio for training and validating the performance of the ML techniques. Sixteen topographical, hydrological, and geological landslide-related factors of the study area are prepared as GIS layers. They are used to produce landslide susceptibility maps (LSMs) with weighted overlay techniques using different factor weights. The accuracy assessment shows that the ML techniques outperform the MCDM methods, with SVM yielding the highest accuracy of 88% for the resulting LSM.
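
A minimal sketch of the 70/30 split and accuracy comparison with scikit-learn, assuming the sixteen conditioning factors have been sampled at the inventory points; the arrays below are random placeholders, not the Chitral dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# X: rows = inventory points, columns = 16 conditioning factors; y: 1 = landslide, 0 = stable.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 16))                 # placeholder factor stack
y = rng.integers(0, 2, size=500)               # placeholder labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

models = {
    "LGR": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, model.predict(X_te)))
```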


Assuntos
Deslizamentos de Terra , Sistemas de Informação Geográfica , Modelos Logísticos , Paquistão , Máquina de Vetores de Suporte
5.
Sensors (Basel) ; 21(1)2021 Jan 04.
Article in English | MEDLINE | ID: mdl-33406613

ABSTRACT

Remote sensing sensors play an increasingly important role in the monitoring and evaluation of natural hazard susceptibility and risk. The present study aims to assess the flash-flood potential of a small catchment in Romania, using information provided by remote sensing sensors and Geographic Information System (GIS) databases as input data for four ensemble models. In the first phase, 481 points affected by torrential processes were acquired with the help of high-resolution satellite images from the Google Earth application, and another 481 points were randomly positioned in areas without torrential processes. Seventy percent of the dataset was kept as training data, while the other 30% was assigned to the validation sample. Further, in order to train the machine learning models, information regarding the 10 flash-flood predictors was extracted at the training sample locations. Finally, the following four ensembles were used to calculate the Flash-Flood Potential Index across the Bâsca Chiojdului river basin: Deep Learning Neural Network-Frequency Ratio (DLNN-FR), Deep Learning Neural Network-Weights of Evidence (DLNN-WOE), Alternating Decision Trees-Frequency Ratio (ADT-FR) and Alternating Decision Trees-Weights of Evidence (ADT-WOE). The models' performance was assessed using several statistical metrics. In terms of sensitivity, the highest value of 0.985 was achieved by the DLNN-FR model, while the lowest (0.866) was obtained by the ADT-FR ensemble. The specificity analysis shows that the highest value (0.991) was attributed to the DLNN-WOE algorithm, while the lowest value (0.892) was achieved by ADT-FR. During the training procedure, the models achieved overall accuracies between 0.878 (ADT-FR) and 0.985 (DLNN-WOE). The K-index again shows that the best-performing model was DLNN-WOE (0.97). The Flash-Flood Potential Index (FFPI) values revealed that the surfaces with high and very high flash-flood susceptibility cover between 46.57% (DLNN-FR) and 59.38% (ADT-FR) of the study zone. The Receiver Operating Characteristic (ROC) curve used for results validation highlights that the FFPI derived from DLNN-WOE is characterized by the most precise results, with an Area Under the Curve of 0.96.
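
The frequency ratio component used in these ensembles assigns each predictor class the ratio between its share of flash-flood points and its share of the study area. A small illustrative sketch follows (toy data, not the Bâsca Chiojdului layers):

```python
import numpy as np

def frequency_ratio(predictor_classes: np.ndarray, event_mask: np.ndarray) -> dict:
    """FR per predictor class: (% of event pixels in class) / (% of all pixels in class)."""
    fr = {}
    total_events = event_mask.sum()
    total_pixels = predictor_classes.size
    for c in np.unique(predictor_classes):
        in_class = predictor_classes == c
        pct_events = event_mask[in_class].sum() / total_events
        pct_area = in_class.sum() / total_pixels
        fr[int(c)] = float(pct_events / pct_area) if pct_area > 0 else 0.0
    return fr

# Toy example: slope reclassified into 3 classes, with a binary torrential-point mask.
slope_classes = np.array([1, 1, 2, 2, 2, 3, 3, 3, 3, 3])
events        = np.array([0, 0, 1, 1, 0, 1, 0, 0, 0, 0])
print(frequency_ratio(slope_classes, events))   # FR > 1 marks classes prone to flash floods
```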

6.
J Environ Manage ; 284: 112067, 2021 Apr 15.
Article in English | MEDLINE | ID: mdl-33556831

ABSTRACT

Land subsidence (LS) in arid and semi-arid areas, such as Iran, is a significant threat to sustainable land management. The purpose of this study is to predict the LS distribution by generating land subsidence susceptibility models (LSSMs) for the Shahroud plain in Iran using three different multi-criteria decision making (MCDM) and five different artificial intelligence (AI) models. The MCDM models we used are the VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR), Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) and Complex Proportional Assessment (COPRAS), and the AI models are the extreme gradient boosting (XGBoost), Cubist, Elasticnet, Bayesian multivariate adaptive regression spline (BMARS) and conditional random forest (Cforest) methods. We used the Receiver Operating Characteristic (ROC) curve, Area Under Curve (AUC) and different statistical indices, i.e. accuracy, sensitivity, specificity, F score, Kappa, Mean Absolute Error (MAE) and the Nash-Sutcliffe Criterion (NSC), to validate and evaluate the methods. Based on the different validation techniques, the Cforest method yielded the best results, with minimum and maximum values of 0.04 and 0.99, respectively. According to the Cforest model, 30.55% of the study area is extremely vulnerable to land subsidence. The results of our research will be of great help to planners and policy makers in the identification of the most vulnerable regions and the implementation of appropriate development strategies in this area.
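
The reported validation indices can be computed for any binary susceptibility model along the following lines; the arrays are placeholders, and the Nash-Sutcliffe criterion is written out explicitly because it is not part of scikit-learn.

```python
import numpy as np
from sklearn.metrics import (roc_auc_score, accuracy_score, f1_score,
                             cohen_kappa_score, mean_absolute_error, confusion_matrix)

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])                      # observed subsidence (placeholder)
y_prob = np.array([.9, .2, .8, .7, .4, .1, .6, .3, .85, .25])          # model susceptibility scores
y_pred = (y_prob >= 0.5).astype(int)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
nse = 1 - np.sum((y_true - y_prob) ** 2) / np.sum((y_true - y_true.mean()) ** 2)  # Nash-Sutcliffe

print(dict(AUC=roc_auc_score(y_true, y_prob), accuracy=accuracy_score(y_true, y_pred),
           sensitivity=sensitivity, specificity=specificity,
           F1=f1_score(y_true, y_pred), kappa=cohen_kappa_score(y_true, y_pred),
           MAE=mean_absolute_error(y_true, y_prob), NSE=nse))
```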


Assuntos
Inteligência Artificial , Teorema de Bayes , Irã (Geográfico) , Curva ROC
7.
J Chem Inf Model ; 60(12): 5918-5922, 2020 12 28.
Article in English | MEDLINE | ID: mdl-33118816

ABSTRACT

In the past few years, we have witnessed a renaissance of the field of molecular de novo drug design. The advancements in deep learning and artificial intelligence (AI) have triggered an avalanche of ideas on how to translate such techniques to a variety of domains, including the field of drug design. A range of architectures have been devised to find the optimal way of generating chemical compounds by using either graph- or string (SMILES)-based representations. With this application note, we aim to offer the community a production-ready tool for de novo design, called REINVENT. It can be effectively applied to drug discovery projects that are striving to resolve either exploration or exploitation problems while navigating chemical space. It can facilitate the idea generation process by bringing to the researcher's attention the most promising compounds. REINVENT's code is publicly available at https://github.com/MolecularAI/Reinvent.


Assuntos
Inteligência Artificial , Desenho de Fármacos , Descoberta de Drogas
8.
Sensors (Basel) ; 20(5)2020 Feb 28.
Article in English | MEDLINE | ID: mdl-32121238

ABSTRACT

Gully erosion is a form of natural disaster and one of the land loss mechanisms causing severe problems worldwide. This study aims to delineate the areas with the most severe gully erosion susceptibility (GES) using the machine learning techniques Random Forest (RF), Gradient Boosted Regression Tree (GBRT), Naïve Bayes Tree (NBT), and Tree Ensemble (TE). The gully inventory map (GIM) consists of 120 gullies. Of the 120 gullies, 84 gullies (70%) were used for training and 36 gullies (30%) were used to validate the models. Fourteen gully conditioning factors (GCFs) were used for GES modeling, and the relationships between the GCFs and gully erosion were assessed using the weight-of-evidence (WofE) model. The GES maps were prepared using RF, GBRT, NBT, and TE and were validated using the area under the receiver operating characteristic (AUROC) curve, the seed cell area index (SCAI) and five statistical measures including precision (PPV), false discovery rate (FDR), accuracy, mean absolute error (MAE), and root mean squared error (RMSE). Nearly 7% of the basin has high to very high susceptibility for gully erosion. Validation results proved the excellent ability of these models to predict the GES. Of the analyzed models, the RF (AUROC = 0.96, PPV = 1.00, FDR = 0.00, accuracy = 0.87, MAE = 0.11, RMSE = 0.19 for the validation dataset) is accurate enough for modeling and better suited for GES modeling than the other models. Therefore, the RF model can be used to model the GES areas not only in this river basin but also in other areas with the same geo-environmental conditions.
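
The weight-of-evidence relationship between a binary conditioning factor and the gully inventory can be sketched as follows; the data are toy values, not the study's GCF layers.

```python
import numpy as np

def wofe_weights(factor_present: np.ndarray, event: np.ndarray) -> tuple[float, float, float]:
    """Weights-of-evidence for one binary conditioning factor:
    W+ where the factor is present, W- where it is absent, and the contrast C = W+ - W-."""
    b, d = factor_present.astype(bool), event.astype(bool)
    p = lambda x: x.mean()                                     # proportion of True values
    w_plus = np.log(p(b[d]) / p(b[~d]))                        # ln P(B|gully) / P(B|no gully)
    w_minus = np.log(p(~b[d]) / p(~b[~d]))
    return float(w_plus), float(w_minus), float(w_plus - w_minus)  # C > 0: factor favours gullying

factor = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0])   # e.g. weak lithology present
gully  = np.array([1, 1, 0, 0, 1, 1, 0, 0, 0, 0])   # gully head cut observed
print(wofe_weights(factor, gully))
```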

9.
Sensors (Basel) ; 20(2)2020 Jan 07.
Article in English | MEDLINE | ID: mdl-31936038

ABSTRACT

Gully erosion is a severe problem; therefore, it must be predicted using highly accurate predictive models to avoid losses caused by gully development and to guarantee sustainable development. This research investigates the predictive performance of seven multiple-criteria decision-making (MCDM), statistical, and machine learning (ML)-based models and their ensembles for gully erosion susceptibility mapping (GESM). A case study of the Dasjard River watershed, Iran, uses a database of 306 gully head cuts and 15 conditioning factors. The database was divided 70:30 to train and verify the models. Their performance was assessed with the area under the prediction rate curve (AUPRC), the area under the success rate curve (AUSRC), accuracy, and kappa. Results show that slope is key to gully formation. The maximum entropy (ME) ML model has the best performance (AUSRC = 0.947, AUPRC = 0.948, accuracy = 0.849 and kappa = 0.699). The second best is the random forest (RF) model (AUSRC = 0.965, AUPRC = 0.932, accuracy = 0.812 and kappa = 0.624). By contrast, the TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) model was the least effective (AUSRC = 0.871, AUPRC = 0.867, accuracy = 0.758 and kappa = 0.516). RF increased the performance of the statistical index (SI) and frequency ratio (FR) statistical models. Furthermore, the combination of a generalized linear model (GLM) and functional data analysis (FDA) improved their performance. The results demonstrate that a combination of geographic information systems (GIS) with remote sensing (RS)-based ML models can successfully map gully erosion susceptibility, particularly in low-income and developing regions. This method can aid the analyses and decisions of natural resource managers and local planners to reduce damage by focusing attention and resources on the areas most prone to damaging gully erosion.
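
TOPSIS, the weakest performer here, is nevertheless a compact algorithm; a generic sketch follows (toy decision matrix and weights, not the study's 15 conditioning factors).

```python
import numpy as np

def topsis(decision: np.ndarray, weights: np.ndarray, benefit: np.ndarray) -> np.ndarray:
    """Closeness of each alternative to the ideal solution (higher = better).
    decision: alternatives x criteria; benefit[j] is True if criterion j is to be maximized."""
    norm = decision / np.sqrt((decision ** 2).sum(axis=0))   # vector normalization
    v = norm * weights                                        # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)

# Toy example: 4 map units scored on 3 conditioning factors (slope, drainage density, land use score).
decision = np.array([[30., 2.1, 0.6],
                     [12., 0.9, 0.2],
                     [45., 3.0, 0.8],
                     [ 8., 1.2, 0.4]])
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, True, True])        # all three factors increase susceptibility here
print(topsis(decision, weights, benefit))     # relative susceptibility ranking
```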

10.
Cartogr Geogr Inf Sci ; 47(1): 28-45, 2020.
Article in English | MEDLINE | ID: mdl-32104165

ABSTRACT

A long-standing question in GIScience is whether geographic information systems (GIS) facilitate an adequate, quantifiable representation of the concept of place. Considering the difficulties of quantifying elusive concepts related to place, several researchers focus on more tangible dimensions of the human understanding of place. The most common approaches are semantic enrichment of spatial information and holistic conceptualization of the notion of place. However, these approaches place emphasis on either space or human meaning, or they mainly exist as concepts without practically proven, usable artifacts. A partial answer to this problem was proposed by the function-based model that treats place as functional space. This paper focuses primarily on the level of composition, describing and formalizing it as a rule-based framework with the following objectives: (a) to contribute to the formalization efforts of the notion of place and its integration within GIS and (b) to maintain tangible properties intertwined with the human understanding of place. The operationalization potential of the proposed framework is illustrated with an example of identifying the shopping areas in an urban region. The results show that the proposed model is able to capture all shopping malls as well as other areas that are not explicitly labeled as such but still function similarly to a shopping mall.

11.
Appl Opt ; 56(23): 6548, 2017 08 10.
Article in English | MEDLINE | ID: mdl-29047944

ABSTRACT

The authors regret the incomplete acknowledgment in Appl. Opt. 55, 6199 (2016), https://doi.org/10.1364/AO.55.006199.

12.
Appl Opt ; 55(23): 6199-211, 2016 Aug 10.
Article in English | MEDLINE | ID: mdl-27534460

ABSTRACT

The emphasis of the present work lies on the examination of the distribution and spectral behavior of the optical properties of atmospheric aerosols in the Indo-Gangetic plains (IGP). Measurements were performed using an AErosol RObotic NETwork (AERONET) Sun photometer at four sites (Karachi, Lahore, Jaipur, and Kanpur) with different aerosol environments during the period 2007-2013. The aerosol optical depth (AOD) and Ångström exponent (α) were measured, and the results revealed a high AOD with a low α value over Karachi and Jaipur in July, while a high AOD with a high α value was reported over Lahore and Kanpur during October and December. The pattern of the aerosol volume size distribution (VSD) was similar across all four sites, with a prominent peak in the coarse mode at a radius of 4.0-5.0 µm, and in the fine mode at a radius of 0.1-4.0 µm, for all seasons. On the other hand, during the winter months, the fine-mode peaks were comparable to the coarse mode, which was not the case during the other seasons. The single scattering albedo (SSA) was found to be strongly wavelength-dependent during all seasons and at all sites, with the exception of Kanpur, where the SSA decreases with increasing wavelength during winter and post-monsoon. The phase function of the atmospheric aerosol was found to be high at small scattering angles and stable over the scattering-angle range of 90°-180° at all sites and during all seasons. Spectral variation of the asymmetry parameter (ASY) revealed a decreasing trend with increasing wavelength, and this decreasing trend was more pronounced during the summer, winter, and post-monsoon than during the pre-monsoon. Furthermore, extensive measurements suggest that both the real (RRI) and imaginary (IRI) parts of the refractive index (RI) show contrasting spectral behavior during all seasons. Finally, back-trajectory analysis with the National Oceanic and Atmospheric Administration Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model revealed that the seasonal variation in aerosol types was influenced by a contribution of air masses from multiple source locations.
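
For reference, the Ångström exponent reported above is the standard two-wavelength relation derived from the spectral dependence of the aerosol optical depth (not specific to this paper's processing chain):

```latex
% Standard two-wavelength Angstrom exponent used with AERONET AOD retrievals
\alpha = -\,\frac{\ln\!\big(\tau_{\lambda_1}/\tau_{\lambda_2}\big)}{\ln\!\big(\lambda_1/\lambda_2\big)},
\qquad \tau_\lambda \propto \lambda^{-\alpha}
```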

13.
Int J Health Geogr ; 14: 11, 2015 Mar 20.
Article in English | MEDLINE | ID: mdl-25888924

ABSTRACT

BACKGROUND: Deprivation indices are useful measures to analyze health inequalities. There are several methods to construct these indices; however, few studies have used Geographic Information Systems (GIS) and multi-criteria methods to construct a deprivation index. Therefore, this study applies multi-criteria evaluation to calculate weights for the indicators that make up the deprivation index, and a GIS-based fuzzy approach to create different scenarios of this index is also implemented.
METHODS: The Analytical Hierarchy Process (AHP) is used to obtain the weights for the indicators of the index. The Ordered Weighted Averaging (OWA) method using linguistic quantifiers is applied in order to create different deprivation scenarios. Geographically Weighted Regression (GWR) and a Moran's I analysis are employed to explore spatial relationships between the different deprivation measures and two health factors: the distance to health services and the percentage of people that have never had a live birth. This last indicator was used as the dependent variable in the GWR. The case study is the city of Quito, Ecuador.
RESULTS: The AHP-based deprivation index shows medium and high levels of deprivation (0.511 to 1.000) in specific zones of the study area, even though most of the study area has low values of deprivation. OWA results show deprivation scenarios that can be evaluated considering the different attitudes of decision makers. GWR results indicate that the deprivation index and its OWA scenarios can be considered as local estimators for health-related phenomena. Moran's I calculations demonstrate that several deprivation scenarios, in combination with the 'distance to health services' factor, could be explanatory variables to predict the percentage of people that have never had a live birth.
CONCLUSIONS: The AHP-based deprivation index and the OWA deprivation scenarios developed in this study are multi-criteria instruments that can support the identification of highly deprived zones and can support health inequality analysis in combination with different health factors. The methodology described in this study can be applied in other regions of the world to develop spatial deprivation indices based on multi-criteria analysis.
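
The AHP weighting step can be illustrated with a principal-eigenvector computation on a small pairwise comparison matrix; the matrix values below are toy judgments, not the study's indicator comparisons.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> tuple[np.ndarray, float]:
    """Priority weights from a reciprocal pairwise comparison matrix (principal eigenvector)
    plus the consistency ratio (CR < 0.1 is conventionally acceptable)."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = pairwise.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)                      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}.get(n, 1.32)    # Saaty's random index
    return w, float(ci / ri)

# Toy comparison of three deprivation indicators (values on Saaty's 1-9 scale).
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])
weights, cr = ahp_weights(pairwise)
print("weights:", weights.round(3), "CR:", round(cr, 3))
```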


Assuntos
Sistemas de Informação Geográfica , Disparidades em Assistência à Saúde/economia , Análise Espacial , Equador/epidemiologia , Sistemas de Informação Geográfica/estatística & dados numéricos , Disparidades em Assistência à Saúde/estatística & dados numéricos , Humanos
14.
Sensors (Basel) ; 15(7): 17013-35, 2015 Jul 14.
Article in English | MEDLINE | ID: mdl-26184221

ABSTRACT

In this article we critically discuss the challenge of integrating contextual information, in particular spatiotemporal contextual information, with human and technical sensor information, which we approach from a geospatial perspective. We start by highlighting the significance of context in general and spatiotemporal context in particular, and introduce a smart city model of interactions between humans, the environment, and technology, with context at the common interface. We then focus on both the intentional and the unintentional sensing capabilities of today's technologies and discuss current technological trends that we consider have the ability to enrich human and technical geo-sensor information with contextual detail. The different types of sensors used to collect contextual information are analyzed and sorted into three groups on the basis of their names, frequently used related terms, and characteristic contextual parameters. These three groups, namely technical in situ sensors, technical remote sensors, and human sensors, are analyzed and linked to three dimensions involved in sensing (data generation, geographic phenomena, and type of sensing). In contrast to other scientific publications, we found a large number of technologies and applications using in situ and mobile technical sensors within the context of smart cities, and surprisingly limited use of remote sensing approaches. We further provide a critical discussion of possible impacts and influences of both technical and human sensing approaches on society, pointing out that a larger number of sensors, increased fusion of information, and the use of standardized data formats and interfaces will not necessarily result in any improvement in the quality of life of the citizens of a smart city. This article seeks to improve our understanding of technical and human geo-sensing capabilities, and to demonstrate that the use of such sensors can facilitate the integration of different types of contextual information, thus providing an additional, namely geospatial, perspective on the future development of smart cities.

15.
ISPRS J Photogramm Remote Sens ; 87(100): 180-191, 2014 Jan.
Article in English | MEDLINE | ID: mdl-24623958

ABSTRACT

The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm.

16.
Comput Geosci ; 64: 81-95, 2014 Mar.
Article in English | MEDLINE | ID: mdl-25843987

ABSTRACT

GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of the weights using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
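
The Monte Carlo part of the uncertainty analysis can be sketched as repeated perturbation of the criteria weights followed by recomputation of the weighted linear combination; the layers and base weights below are placeholders, not the study's GIS data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_criteria, n_runs = 1000, 7, 500

criteria = rng.random((n_cells, n_criteria))       # standardized criteria layers (placeholder)
base_w = np.array([0.25, 0.20, 0.15, 0.15, 0.10, 0.10, 0.05])

# Dirichlet sampling keeps each perturbed weight vector positive and summing to one,
# concentrated around the base weights.
weight_samples = rng.dirichlet(base_w * 100, size=n_runs)

susceptibility = criteria @ weight_samples.T       # cells x runs, weighted linear combination
mean_map = susceptibility.mean(axis=1)             # expected susceptibility per cell
uncertainty_map = susceptibility.std(axis=1)       # spread attributable to weight uncertainty
print(mean_map[:5].round(3), uncertainty_map[:5].round(3))
```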

17.
Comput Geosci ; 73: 208-221, 2014 Dec.
Article in English | MEDLINE | ID: mdl-26089577

ABSTRACT

Landslide susceptibility mapping (LSM) is making increasing use of GIS-based spatial analysis in combination with multi-criteria evaluation (MCE) methods. We have developed a new multi-criteria decision analysis (MCDA) method for LSM and applied it to the Izeh River basin in south-western Iran. Our method is based on fuzzy membership functions (FMFs) derived from GIS analysis. It makes use of nine causal landslide factors identified by local landslide experts. Fuzzy set theory was first integrated with an analytical hierarchy process (AHP) in order to use pairwise comparisons to compare LSM criteria for ranking purposes. FMFs were then applied in order to determine the criteria weights to be used in the development of a landslide susceptibility map. Finally, a landslide inventory database was used to validate the LSM map by comparing it with known landslides within the study area. Results indicated that the integration of fuzzy set theory with AHP produced significantly improved accuracies and a high level of reliability in the resulting landslide susceptibility map. Approximately 53% of known landslides within our study area fell within zones classified as having "very high susceptibility", with a further 31% falling into zones classified as having "high susceptibility".
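
As an illustration of an FMF, a simple increasing linear membership function maps a causal factor such as slope onto [0, 1]; the breakpoints below are arbitrary, not the values chosen by the local experts.

```python
import numpy as np

def linear_membership(x: np.ndarray, low: float, high: float) -> np.ndarray:
    """Increasing linear fuzzy membership: 0 below `low`, 1 above `high`,
    linear in between (e.g. steeper slopes -> higher landslide membership)."""
    return np.clip((x - low) / (high - low), 0.0, 1.0)

slope_deg = np.array([2.0, 10.0, 18.0, 27.0, 40.0])
print(linear_membership(slope_deg, low=5.0, high=35.0).round(2))
# -> [0.   0.17 0.43 0.73 1.  ]
```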

18.
Biomass Bioenergy ; 55: 3-16, 2013 Aug.
Article in English | MEDLINE | ID: mdl-26109751

ABSTRACT

Renewable energy will play a crucial role in the future society of the 21st century. The various renewable energy sources need to be balanced and their use carefully planned, since they are characterized by high temporal and spatial variability that will pose challenges to maintaining a well-balanced supply and to the stability of the grid. This article examines the ways that future 'energy landscapes' can be modelled in time and space. Biomass needs a great deal of space per unit of energy produced, but it is an energy carrier that may be strategically useful in circumstances where other renewable energy carriers are likely to deliver less. A critical question considered in this article is whether a massive expansion in the use of biomass will allow us to construct future scenarios while repositioning the 'energy landscape' as an object of study. A second important issue is the utilization of heat from biomass energy plants. Biomass energy also has a larger spatial footprint than other carriers such as solar energy. This article seeks to provide a bridge between energy modelling and spatial planning while integrating research and techniques in energy modelling with Geographic Information Science; this encompasses GIS, remote sensing, spatial disaggregation techniques and geovisualization. Several case studies in Austria and Germany demonstrate a top-down methodology and selected results, calculating potentials stepwise from theoretical to technically feasible potentials and setting the scene for the definition of economic potentials based on scenarios and assumptions.

19.
Sensors (Basel) ; 12(7): 9800-22, 2012.
Article in English | MEDLINE | ID: mdl-23012571

ABSTRACT

Ubiquitous geo-sensing enables context-aware analyses of physical and social phenomena, i.e., analyzing one phenomenon in the context of another. Although such context-aware analysis can potentially enable a more holistic understanding of spatio-temporal processes, it is as yet rarely documented in the scientific literature. In this paper we analyzed collective human behavior in the context of the weather. We therefore explored the complex relationships between these two spatio-temporal phenomena to provide novel insights into the dynamics of urban systems. Aggregated mobile phone data, which served as a proxy for collective human behavior, was linked with weather data from climate stations in the case study area, the city of Udine, Northern Italy. To identify and characterize potential patterns within the weather-human relationships, we developed a hybrid approach which integrates several spatio-temporal statistical analysis methods. We thereby show that exploratory factor analysis, when applied to a number of meteorological variables, can be used to differentiate between normal and adverse weather conditions. Further, we measured the strength of the relationship between the 'global' adverse weather conditions and the spatially explicit effective variations in user-generated mobile network traffic for three distinct periods using the Maximal Information Coefficient (MIC). The analyses result in three spatially referenced maps of MICs which reveal interesting insights into collective human dynamics in the context of weather, but also raise several new scientific challenges.
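
The MIC computation itself can be reproduced with the third-party minepy package (an assumption; the paper does not state which implementation was used). The series below are synthetic stand-ins for the weather factor and the mobile network traffic.

```python
import numpy as np
from minepy import MINE   # third-party MIC implementation (assumed available)

rng = np.random.default_rng(7)
adverse_weather = rng.random(200)                                       # placeholder 'global' weather factor
network_traffic = 0.6 * adverse_weather ** 2 + 0.1 * rng.random(200)    # placeholder cell traffic

mine = MINE(alpha=0.6, c=15)          # default MINE parameters from the original MIC paper
mine.compute_score(adverse_weather, network_traffic)
print("MIC:", round(mine.mic(), 3))   # strength of the (possibly non-linear) association
```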

20.
Sci Total Environ ; 835: 155512, 2022 Aug 20.
Article in English | MEDLINE | ID: mdl-35489485

ABSTRACT

This study deals with the issue of greenwashing, i.e. the false portrayal of companies as environmentally friendly. The analysis focuses on the US metal industry, which is a major emission source of sulfur dioxide (SO2), one of the most harmful air pollutants. One way to monitor the distribution of atmospheric SO2 concentrations is through satellite data from the Sentinel-5P programme, which represents a major advance due to its unprecedented spatial resolution. In this paper, Sentinel-5P remote sensing data was combined with a plant-level firm database to investigate the relationship between the US metal industry and SO2 concentrations using a spatial regression analysis. Additionally, this study considered web text data, classifying companies based on their websites in order to depict their self-portrayal on the topic of sustainability. In doing so, we investigated the topic of greenwashing, i.e. whether or not a positive self-portrayal regarding sustainability is related to lower local SO2 concentrations. Our results indicated a general, positive correlation between the number of employees in the metal industry and local SO2 concentrations. The web-based analysis showed that only 8% of companies in the metal industry could be classified as engaged in sustainability based on their websites. The regression analyses indicated that these self-reported "sustainable" companies had a weaker effect on local SO2 concentrations compared to their "non-sustainable" counterparts, which we interpreted as an indication of the absence of general greenwashing in the US metal industry. However, the large share of firms without a website and lack of specificity of the text classification model were limitations to our methodology.


Assuntos
Poluentes Atmosféricos , Poluição do Ar , Poluentes Atmosféricos/análise , Poluição do Ar/análise , Mineração de Dados , Monitoramento Ambiental , Humanos , Indústrias , Metais/análise , Análise de Regressão , Dióxido de Enxofre/análise