ABSTRACT
A soil seed bank is the collective name for the viable seeds stored naturally in the soil. At the species or population level, the ability to form a seed bank represents a strategy for (re)colonization following a disturbance or other change in local environmental conditions. At the community level, seed banks are thought to buffer local diversity during periods of environmental change and are often studied in relation to the potential for passive habitat restoration. The role that seed banks play in plant population and community dynamics, as well as their importance to the agricultural sector, means that they have been widely studied in ecological research. This database is the result of a comprehensive literature search, including all seed bank studies in the Web of Science from which data could be extracted, plus an additional search of the Russian-language literature. The database contains information on species richness, seed density, and/or seed abundance in 3096 records from at least 1929 locations across the world's seven continents, extracted from 1442 studies published between 1940 and 2020. Records are grouped into five broad habitat categories (aquatic, arable, forest, grassland (including shrubland), and wetland), together with information on degradation from, or restoration to, other habitats (14 combinations in total). Sampling protocols were also extracted for each record, and the database was extensively checked for errors. The location of each record was then used to extract summary climate data and biome classifications from external published databases. The database has several potential uses. Its large geographical spread relative to many other global biodiversity datasets makes it relevant for investigating patterns of diversity in biogeographical or macroecological contexts. Habitat type and status (intact, degraded, and restored) can be used to provide insights for biodiversity conservation, while the potential effects of sampling method and effort can be used to inform optimized data collection for future seed bank studies. The database is released under the CC-BY license.
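As an illustration of the kind of query the database is designed to support, the following Python sketch groups records by habitat and status. The file name and all column names (habitat, status, species_richness, latitude) are assumptions for illustration, not the published schema.

```python
# Hypothetical usage sketch; "seed_bank_database.csv" and the column
# names below are illustrative assumptions, not the actual schema.
import pandas as pd

records = pd.read_csv("seed_bank_database.csv")

# Mean seed bank species richness per habitat type and status
summary = (
    records
    .groupby(["habitat", "status"])["species_richness"]
    .agg(["count", "mean", "std"])
)
print(summary)

# Georeferenced intact records, e.g., for biogeographical analyses
intact = records[(records["status"] == "intact") & records["latitude"].notna()]
```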
ABSTRACT
This study investigates the relationship between the development of the life insurance market and bank stability in developing countries. We used data from 2012 to 2020 across 108 developing countries and applied econometric techniques, including fixed-effects and system generalized method of moments (GMM) methods, to test the relationship between life insurance market size, life insurance market growth, and bank stability at the country level. Our results indicate a positive relationship between life insurance market size and bank stability; that is, a large life insurance market can help increase bank stability in developing countries. However, these countries should refrain from developing their life insurance markets too quickly: our empirical results show an inverted U-shaped relationship between life insurance market growth and bank stability. Given the growing life insurance market in developing countries and the increasing cooperation between banks and insurance companies to expand it, our research provides important policy implications for ensuring the stability of financial markets in general.
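A minimal sketch of the inverted-U test described above, assuming a country-year panel with illustrative variable names (z_score, insurance_size, insurance_growth); the paper's exact specification and software are not reproduced here.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("panel.csv")  # hypothetical country-year panel
df["growth_sq"] = df["insurance_growth"] ** 2

# Country and year fixed effects via dummies; an inverted U is indicated
# by a positive growth coefficient and a negative squared-term coefficient.
model = smf.ols(
    "z_score ~ insurance_size + insurance_growth + growth_sq"
    " + C(country) + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["country"]})
print(model.params[["insurance_size", "insurance_growth", "growth_sq"]])

# Turning point of the inverted U: growth rate at which stability peaks
b1 = model.params["insurance_growth"]
b2 = model.params["growth_sq"]
print("estimated turning point:", -b1 / (2 * b2))
```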
ABSTRACT
BACKGROUND: The transfusion quality improvement project (QIP) serves as a valuable tool for assessing and educating individuals who request blood components. The World Health Organization (WHO) recommends that each institution utilize a blood transfusion request form to ensure the effective conveyance of patient information to the hospital's blood bank. This QIP aimed to implement a transfusion request form and measure compliance with its use. METHODS: A prospective study was conducted at Al Managil Teaching Hospital, Sudan, from May 1 to August 3, 2024, to address the lack of standardized transfusion request forms. The study included three cycles involving pre-intervention analysis, two phases of intervention with training sessions, and post-intervention evaluations. The interventions focused on developing and implementing a new transfusion request form, training clinical physicians, and reinforcing the form's use. Data from 100 randomly selected transfusion request forms were analyzed for completeness and adherence. RESULTS: The study showed significant improvements in the completeness of transfusion request forms across three cycles. In the first cycle, no data were collected, highlighting the absence of standardized forms. During the second cycle, with the introduction of the new form, the completion rates varied: some fields, such as patient information and clinical details, were fully completed in 50 cases (100%), while critical clinical parameters, such as current hemoglobin (Hb) and platelet (PLT) levels, were completed in only four requests (8%). By the third cycle, there was a substantial increase in completion rates across all domains. For example, patient information fields achieved 100% completion in 50 cases, and clinical parameters saw significant improvement, with current Hb and PLT levels documented in 48 cases (96%). The mean percentage completion increased from 68.1% in the second cycle to 97.9% in the third cycle, demonstrating the effectiveness of the interventions and training sessions. Minor decreases were observed in health insurance documentation and certain clinical details, indicating areas for further improvement. CONCLUSION: The systematic implementation and iterative evaluation of transfusion request forms significantly enhanced documentation completeness.
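The completeness arithmetic above is simple rate calculation; a toy check using the second-cycle counts quoted in the abstract (nothing beyond those figures is implied):

```python
# Toy check of the cycle-two completeness figures quoted above.
audited_forms = 50
completed_per_field = {
    "patient information": 50,       # reported as 100%
    "clinical details": 50,          # reported as 100%
    "current Hb and PLT levels": 4,  # reported as 8%
}
for field, n in completed_per_field.items():
    print(f"{field}: {100 * n / audited_forms:.0f}% complete")
```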
ABSTRACT
Interactions between international organisations (IOs) within a regime complex often manifest themselves through competition and cooperation. Current research has examined the factors that promote inter-organisational competition and cooperation, yet the precise timing of when such competition or cooperation commences remains unclear. This paper focuses on two pivotal IOs in global health governance, the World Health Organization (WHO) and the World Bank, to explore the timing and onset of competition and cooperation within a regime complex, as well as the driving factors in the evolution of their inter-organisational relationship. By examining the interactions between the WHO and the World Bank in norm-setting and resource mobilisation, the paper sheds light on how their relationship has transitioned from competition to cooperation. It systematically presents the mechanisms and processes of policy transformation in inter-organisational interactions. As a new agenda arises, IOs within a regime complex often compete for dominance, with ideational differences driving them to propose and implement distinct governance strategies. They compete for resources and for the mainstreaming of their own strategies. The negative spillover effects of competitive policies consequently undermine the effectiveness of IOs' policies, thereby undercutting their legitimacy. To surmount these challenges, the international community should promote inter-institutional coordination in global governance.
Subject(s)
Global Health; International Cooperation; United Nations; World Health Organization; Humans; Cooperative Behavior; Health Policy
ABSTRACT
Food banks have become commonplace in the UK as an emergency response to food insecurity. However, food banks are not a long-term solution to food insecurity and are often not accessed by those in need. In the context of the cost-of-living crisis and increased food insecurity, this systematic review applied market/government failure theory, voluntary failure theory, and Radimer et al.'s (1990) domains of food insecurity to explore three important aspects of the food banking experience: the drivers of food bank use; the limitations of the current food bank model; and the impacts of the food banking model on food bank clients. Empirical, peer-reviewed articles written in English, set in a UK food bank context, and reporting data relevant to these aspects were eligible for inclusion. In total, 221 titles were identified using four databases (Web of Science, SCOPUS, PubMed, CINAHL Plus) in July 2022. The final sample of 41 articles (comprising qualitative, quantitative, and mixed-methods studies) was quality assessed using the Mixed Methods Appraisal Tool. Data were extracted and analysed through directed content analysis. Market and government failures were widely reported to drive food bank use. Insufficiency, paternalism, and particularism represented key limitations of the food bank model. Negative health and psychological impacts of food bank use were prominent, yet social impacts were largely positive. Consequently, new solutions are needed to promote positive health and psychological impacts for food bank clients in the UK. The applicability of these findings to other high-income countries experiencing food insecurity should be determined.
ABSTRACT
The absence of solvent molecules in high-resolution protein crystal structure models deposited in the Protein Data Bank (PDB) contradicts the fact that, for proteins crystallized from aqueous media, water molecules are always expected to bind to the protein surface, as well as to some sites in the protein interior. An analysis of the contents of the PDB indicated that the expected ratio of the number of water molecules to the number of amino-acid residues exceeds 1.5 in atomic resolution structures, decreasing to 0.25 at around 2.5 Å resolution. Nevertheless, almost 800 protein crystal structures determined at a resolution of 2.5 Å or higher are found in the current release of the PDB without any water molecules, whereas some other depositions have unusually low or high occupancies of modeled solvent. Detailed analysis of these depositions revealed that the lack of solvent molecules might be an indication of problems with either the diffraction data, the refinement protocol, the deposition process or a combination of these factors. It is postulated that problems with solvent structure should be flagged by the PDB and addressed by the depositors.
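A minimal sketch of the water-to-residue ratio check described above, using Biopython as an assumed tool (the authors' actual pipeline is not specified) and a hypothetical input file:

```python
from Bio.PDB import PDBParser

parser = PDBParser(QUIET=True)
structure = parser.get_structure("model", "example.pdb")  # hypothetical file

waters, residues = 0, 0
for residue in structure.get_residues():
    if residue.get_resname() == "HOH":
        waters += 1
    elif residue.id[0] == " ":  # standard polymer residue (no hetero flag)
        residues += 1

ratio = waters / residues if residues else 0.0
print(f"waters per residue: {ratio:.2f}")
# Per the analysis above, ~1.5 is typical at atomic resolution, falling
# to ~0.25 near 2.5 Å; a ratio of 0 at 2.5 Å or better is a red flag.
```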
ABSTRACT
PURPOSE: To investigate factors influencing long-term utilization and disposal patterns of cryopreserved semen straws in oncological patients. METHODS: This retrospective study included all men who cryopreserved semen because of cancer between October 1993 and December 2021. To characterize unused cryopreserved sperm straws, we compared usage for fertility treatments versus disposal requests and summarized the total of unused cases. A Kaplan-Meier curve was used to describe last usage and disposal requests over a 15-year analysis window, and a log-rank test was applied to compare age and paternal-status categories. RESULTS: The cohort consisted of 445 patients. Of these, 55 patients used thawed semen for fertility treatments and 65 opted for disposal, leaving 325 patients who neither used nor disposed of their cryopreserved straws. Our findings revealed a distinct pattern based on age, with the youngest age group (< 25 years) exhibiting significantly lower utilization and disposal rates compared with older patient groups. Additionally, men without children submitted significantly fewer disposal requests than fathers. The median number of straws cryopreserved was 10 (interquartile range, 6 to 17), while the median number used was only 2 (interquartile range, 2 to 6). DISCUSSION: Our study draws attention to the needless burden of prolonged storage from both the patient and the preserved-straw perspectives. Implementing a cost-effective policy, incorporating time and straw limits and considering demographic characteristics, could enhance efficiency and would necessitate patient consent before preservation.
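A sketch of the time-to-event comparison described in the methods, using the lifelines library as an assumed implementation choice and synthetic data in place of the real cohort:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
# Years from cryopreservation to last use/disposal; event = 1 if observed
young_t = rng.exponential(12.0, 100).clip(max=15.0)
older_t = rng.exponential(6.0, 100).clip(max=15.0)
young_e = (young_t < 15.0).astype(int)  # censored at 15 years
older_e = (older_t < 15.0).astype(int)

kmf = KaplanMeierFitter()
kmf.fit(young_t, young_e, label="<25 years at cryopreservation")
print(kmf.median_survival_time_)

res = logrank_test(young_t, older_t,
                   event_observed_A=young_e, event_observed_B=older_e)
print(f"log-rank p = {res.p_value:.4f}")
```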
ABSTRACT
The transplantation of CD34+ hematopoietic stem-progenitor cells (HSPCs) derived from cord blood serves as the standard treatment for selected hematological, oncological, metabolic, and immunodeficiency disorders, and the cell dose is pivotal to the clinical outcome. Based on numerous maternal and neonatal parameters, we evaluated the power of mathematical pipelines, both parametric and non-parametric, to predict the proportion of CD34+ cells in the final cryopreserved cord blood product. Twenty-four predictor variables associated with the cord blood processing of 802 cord blood units randomly sampled in 2020-2022 were retrieved and analyzed. Prediction models were developed with parametric (multivariate linear regression) and non-parametric (random forest and back-propagation neural network) statistical models to investigate the data patterns determining the single outcome (the proportion of CD34+ cells). The multivariate linear regression model produced the lowest root-mean-square deviation (0.0982). However, the model created by the back-propagation neural network produced the highest median absolute deviation (0.0689) and the highest predictive power (56.99%) in comparison with the random forest and multivariate linear regression models. A predictive model built on a combination of continuous and discrete maternal and neonatal parameters associated with cord blood processing can therefore predict the CD34+ dose in the final product for clinical utilization. The back-propagation neural network produces the model with the highest predictive power and can be widely applied to assist cell banks in optimal cord blood unit selection, ensuring the highest chance of transplantation success.
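A minimal sketch of the three-model comparison described above using scikit-learn (an assumption; the authors' software is not named), with synthetic stand-ins for the 24 predictors and the CD34+ proportion outcome:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, median_absolute_error

rng = np.random.default_rng(42)
X = rng.normal(size=(802, 24))  # 802 units, 24 placeholder predictors
# Placeholder outcome in (0, 1) standing in for the CD34+ proportion
y = 1.0 / (1.0 + np.exp(-(X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.5, 802))))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "multivariate linear regression": LinearRegression(),
    "random forest": RandomForestRegressor(random_state=0),
    "back-propagation NN (MLP)": MLPRegressor(hidden_layer_sizes=(32, 16),
                                              max_iter=2000, random_state=0),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    medae = median_absolute_error(y_te, pred)
    print(f"{name}: RMSD={rmse:.4f}  MedAD={medae:.4f}")
```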
Subject(s)
Algorithms; Antigens, CD34; Fetal Blood; Hematopoietic Stem Cells; Machine Learning; Humans; Fetal Blood/cytology; Hematopoietic Stem Cells/cytology; Hematopoietic Stem Cells/metabolism; Antigens, CD34/metabolism; Female; Neural Networks, Computer; Hematopoietic Stem Cell Transplantation/methods; Cord Blood Stem Cell Transplantation/methods; Infant, Newborn; Cryopreservation/methods
ABSTRACT
OBJECTIVE: Households with children accessing food aid in high-income countries are often food insecure. We aimed to review the evidence on food aid interventions in households with children and their impact on food insecurity, diet quality, and mental health. DESIGN: A systematic search was conducted using Web of Science, MEDLINE, CINAHL, and PsycINFO. Articles published from January 2008 to July 2022, including cross-sectional, cohort, and interventional studies in high-income countries, were eligible. SETTING: Food aid is defined as the use of interventions providing free food items by community and/or charitable organisations. PARTICIPANTS: Two-parent households, lone-parent households, or households with a primary caregiver, with at least one child ≤ 18 years. RESULTS: From a total of 10 394 articles, nine were included. Food banks, a mobile pantry combined with a free meal for children, backpack provision during school term, and food parcel home delivery interventions were evaluated. Food bank models offering additional support, such as community programmes, health and social services, cooking classes, and free meals for children, client-choice-based models, and programmes providing convenient access were associated with improved food security and diet quality (increased intake of wholegrains, fruit, and vegetables). One study reported an improvement in mental health after 18 months of food bank access, but not at earlier timepoints, and one study reported no change in parents' mental health. CONCLUSIONS: Accessing food aid was linked to improved diet quality and reduced food insecurity in some studies. Allowing clients to choose food items and providing support services were most effective.
Subject(s)
Developed Countries; Diet; Family Characteristics; Food Assistance; Food Insecurity; Mental Health; Adolescent; Child; Child, Preschool; Female; Humans; Male; Diet/statistics & numerical data; Diet/psychology; Diet, Healthy/psychology; Diet, Healthy/statistics & numerical data; Food Supply/statistics & numerical data; Infant, Newborn; Infant
ABSTRACT
BACKGROUND: Streptococcus pyogenes (Group A Streptococcus, GAS) is a significant pathogen that causes diverse infections, ranging from pharyngitis to severe invasive diseases. Asymptomatic carriage in children is pivotal for transmission. The COVID-19 pandemic's health measures, including mask wearing and enhanced hand hygiene, likely influenced GAS transmission dynamics. This study evaluated the impact of these precautions on the prevalence of asymptomatic pharyngeal GAS carriage among schoolchildren in the southern West Bank, Palestine. METHODS: This cross-sectional study was conducted in two phases: pre-COVID-19 (November 2019-January 2020) and post-COVID-19 (November 2023-April 2024). Throat swabs were collected from 701 children (345 pre-COVID-19, 356 post-COVID-19) via cluster sampling. The samples were tested with the ABON Strep A rapid test and confirmed by culture. Sociodemographic, health, and household data were also collected. The statistical analyses included descriptive statistics, chi-square tests, and binary logistic regression. RESULTS: The prevalence of asymptomatic pharyngeal GAS carriage declined from 15.7% pre-COVID-19 to 10.4% post-COVID-19 (p = 0.038). Significant reductions were observed among urban residents (from 23.5% to 10.1%, p = 0.003) and those from medium socioeconomic backgrounds (from 16.0% to 9.1%, p = 0.008). Compared with urban residents, rural residents had lower GAS carriage rates (adjusted OR = 0.505, p = 0.023). Carriage rates also decreased among children with frequent sore throats (from 17.6% to 7.3%, p = 0.007) and those using private wells (from 52.5% to 14.9%, p < 0.001). Higher BMI was a significant risk factor (adjusted OR = 17.68, p < 0.001), whereas frequent tooth brushing (adjusted OR = 0.055, p < 0.001) and hand washing (adjusted OR = 0.367, p < 0.001) were protective factors. CONCLUSIONS: COVID-19-related health precautions were correlated with a significant reduction in asymptomatic GAS carriage among Palestinian children. These findings suggest that public health measures, such as mask wearing and hand hygiene, can influence the transmission of respiratory pathogens. Ongoing surveillance and targeted interventions are essential for managing GAS infections, particularly in resource-limited settings.
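A sketch of how the adjusted odds ratios above could be estimated, using statsmodels logistic regression on synthetic stand-in data; variable names are illustrative assumptions, not the study's coding scheme.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 701  # matching the total number of swabbed children
df = pd.DataFrame({
    "carriage": rng.integers(0, 2, n),    # 1 = GAS-positive swab
    "post_covid": rng.integers(0, 2, n),  # 1 = post-COVID-19 phase
    "rural": rng.integers(0, 2, n),       # 1 = rural residence
    "bmi": rng.normal(17.0, 3.0, n),
})
fit = smf.logit("carriage ~ post_covid + rural + bmi", data=df).fit(disp=0)
print(np.exp(fit.params))      # adjusted odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```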
Subject(s)
COVID-19; Carrier State; Streptococcal Infections; Streptococcus pyogenes; Adolescent; Child; Child, Preschool; Female; Humans; Male; Arabs/statistics & numerical data; Asymptomatic Infections/epidemiology; Carrier State/epidemiology; Carrier State/microbiology; COVID-19/epidemiology; COVID-19/prevention & control; COVID-19/transmission; Cross-Sectional Studies; Hand Hygiene; Masks/statistics & numerical data; Middle East/epidemiology; Pharynx/microbiology; Pharynx/virology; Prevalence; Streptococcal Infections/epidemiology; Streptococcal Infections/microbiology; Streptococcal Infections/prevention & control; Streptococcus pyogenes/isolation & purification
ABSTRACT
OBJECTIVE: This study aimed to determine the prevalence of the D antigen negative phenotype, adherence to routine antenatal anti-D immunoglobulin prophylaxis (RAADP) administration, and D antigen sensitisation among pregnant women in the UAE. DESIGN: Data were collected from pregnant women enrolled in the Mutaba'ah Study, an ongoing prospective mother and child cohort study in the UAE. Data were extracted from the medical records and from the baseline questionnaire administered to the participants between May 2017 and January 2021. SETTING: The study was conducted in the city of Al Ain, UAE. PARTICIPANTS: A total of 5080 pregnant women residing in Al Ain participated in the study. OUTCOME MEASURES: The study estimated the prevalence of the D antigen negative phenotype and the provision of RAADP in this population. RESULTS: Of the 5080 pregnant women analysed, 4651 (91.6%) were D antigen positive, while 429 (8.4%) were D-negative. D antigen sensitisation was low at 0.5%, and uptake of RAADP was high at 88.8%. CONCLUSIONS: Adherence to RAADP is consistent with published data from other healthcare settings. Knowledge of the prevalence of D antigen negative mothers is crucial to the financial and resource considerations for implementing antenatal foetal cell-free DNA screening to determine foetal D antigen status.
Subject(s)
Prenatal Care; Rho(D) Immune Globulin; Humans; Female; Pregnancy; United Arab Emirates; Adult; Cross-Sectional Studies; Prospective Studies; Young Adult; Rh-Hr Blood-Group System/immunology
ABSTRACT
Objective: Nutrition interventions delivered through food pantries could reduce health disparities for people experiencing food insecurity. We identified clients' preferences for cuisines, nutrition interventions, and outcomes, and whether preferences differ across subpopulations. Methods: Cross-sectional study at a large pantry in Dallas, Texas (N = 200). A survey collected from February to May 2023 covered demographics, cuisine preferences, nutrition intervention preferences, and the outcomes clients hope to achieve when changing lifestyle (weight loss, feeling comfortable in clothes, feeling good about diet, wellbeing). A subsample (N = 130) had height and weight measured. We tested whether food security and BMI (categorical) were associated with intervention or outcome preferences using IBM SPSS Statistics (Version 29) to conduct analysis of variance. Results: The top-rated cuisines were Mexican, Chinese, and Italian. Participants reported a desire for interventions implemented through the pantry, reflected in high Nutrition Intervention Index scores. The highest-rated intervention was bringing more healthy food into the pantry; the lowest rated was restricting unhealthy donations. Overall wellbeing was the most important outcome and weight loss the least important. Neither food security nor BMI was associated with desire for interventions. All outcomes were rated in a similar pattern, though people with obesity and overweight rated weight loss as more important than people with normal weight. Conclusions: Most participants demonstrated a strong desire for healthier, ethnically diverse options and for nutrition interventions delivered through the pantry. Our findings describe the cuisines and outcomes preferred by people who use food pantries, which can guide researchers, clinicians, and non-profit organizations in planning and promoting nutrition programs for pantry clients.
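The authors ran the analysis of variance in IBM SPSS; an equivalent check in Python (with synthetic data standing in for the real scores) might look like this:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(7)
# Nutrition Intervention Index scores for three food-security groups
high = rng.normal(4.0, 0.6, 70)
marginal = rng.normal(4.1, 0.6, 65)
low = rng.normal(4.1, 0.6, 65)

stat, p = f_oneway(high, marginal, low)
print(f"F = {stat:.2f}, p = {p:.3f}")  # no association expected, as reported
```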
ABSTRACT
Given progressive population ageing and the increasing number of patients with comorbidities, the management of chronic and/or hard-to-heal wounds (HHWs) is now a common problem in many clinical settings. In these cases, standard strategies may not be sufficient. Autologous grafting represents the gold standard for permanent wound closure, but it can rarely be performed when the skin loss is extensive or the patient is young. The grafting of homologous skin/dermal tissue procured from cadaver donors (i.e., allografting) represents the best alternative, especially when the dermal component is lost. This demand supports the activities of skin bank establishments (including donor screening, skin procurement, processing, storage, and distribution), which are regulated by specific guidelines and must continuously meet quality standard requirements. The aim of this work is both to give specific insights into all the procedures involved in allograft preparation and to provide an overview of their practical application in the treatment of different HHWs. The particular characteristics of each skin/dermal allograft released by the Siena Skin Bank (cryopreserved/glycerol-preserved skin/de-epidermized dermis, acellular lyophilized de-epidermized dermis/reticular dermis) are also discussed. The illustrative series of HHWs managed in the Dermatology Department of Siena was classified according to etiology into post-traumatic, vascular (arterial/venous/mixed/lymphatic), inflammatory, surgical, and heat/chemical burns. Overall, the clinical advantages obtained include acceleration of the healing process, pain sparing, resistance to bacterial contamination, dermal regeneration (instead of scarring), and better aesthetic-functional outcomes.
ABSTRACT
Background Blood component transfusion is vital for various medical conditions and requires thorough pretransfusion testing to ensure safety and compatibility. The Type and Screen (T and S) method allows efficient detection of clinically significant antibodies while reducing unnecessary crossmatching. This study evaluates the impact of T and S on turnaround time (TAT) and man-hour utilization compared with conventional crossmatching. Methodology The study included 835 elective crossmatch requests. Blood grouping, antibody screening, and antihuman globulin (AHG) crossmatching were performed using column agglutination technology, while immediate saline crossmatching was done by the tube technique. TAT, man-hour utilization, and the associated savings were calculated for both the T and S protocol and routine AHG crossmatch. Results In this study, 835 elective blood samples underwent antibody screening and immediate saline phase crossmatching, with validation by AHG phase crossmatch. The T and S protocol significantly reduced TAT for both first (51.24 vs. 71.56 minutes) and subsequent transfusions (17.47 vs. 39.67 minutes) compared with AHG crossmatching, with the reductions being statistically significant (p < 0.0001). T and S also saved 279.28 man-hours (11.6 man-days), equating to 0.33 man-hours saved per sample. Conclusion Our study shows that the T and S protocol significantly enhances blood bank efficiency by reducing TAT and man-hour utilization compared with conventional AHG crossmatching. This improvement not only optimizes manpower but also makes the process more cost-effective.
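A quick arithmetic check of the efficiency figures quoted above (note that the reported 11.6 man-days implies 24-hour days):

```python
# Arithmetic check of the efficiency figures quoted above.
samples = 835
man_hours_saved = 279.28
print(round(man_hours_saved / samples, 2))  # 0.33 man-hours per sample
print(round(man_hours_saved / 24, 1))       # 11.6 man-days (24-hour days)
# Per-sample TAT savings: 71.56 - 51.24 = 20.32 min (first transfusion),
# 39.67 - 17.47 = 22.20 min (subsequent transfusions).
```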
ABSTRACT
BACKGROUND AND AIM: Some patients with intracerebral hemorrhage are on antithrombotic agents at the time of the event, and these may worsen outcome, but the relative risks of different oral anticoagulants and antiplatelet agents are uncertain. We determined associations between pre-onset intake of antithrombotic agents and initial stroke severity and outcomes in patients with intracerebral hemorrhage. METHODS: Patients with intracerebral hemorrhage admitted within 24 h of onset between January 2017 and December 2020 and recruited to the Japan Stroke Data Bank, a hospital-based multicenter prospective registry, were included. Enrolled patients were classified into four groups based on the type of antithrombotic agent being used on admission. The outcomes were the National Institutes of Health Stroke Scale (NIHSS) score on admission and a modified Rankin Scale (mRS) score of 5-6 at discharge. RESULTS: Of 9810 patients with intracerebral hemorrhage (4267 females; mean age = 70 ± 15 years), 77.1% were classified into the no-antithrombotic group, 13.2% into the antiplatelet group, 4.0% into the warfarin group, and 5.8% into the direct oral anticoagulant (DOAC) group. The median (interquartile range) NIHSS score on admission was 12 (5-22), 13 (5-26), 15 (5-30), and 13 (6-24), respectively, in the four groups. In multivariable analysis, prestroke warfarin use was associated with a higher NIHSS score (adjusted incidence rate ratio = 1.09 (95% confidence interval (CI) = 1.06-1.13), with the no-antithrombotic group as the reference), but the antiplatelet group (1.00 (95% CI = 0.98-1.02)) and DOAC group (0.98 (95% CI = 0.95-1.01)) were not. The rate of mRS 5-6 at discharge was 30.8%, 41.9%, 48.6%, and 41.5%, respectively, in the four groups. In multivariable analysis, prestroke warfarin use was associated with mRS 5-6 (adjusted odds ratio = 1.90 (95% CI = 1.28-2.81), with the no-antithrombotic group as the reference), but the antiplatelet group (1.12 (95% CI = 0.91-1.37)) and DOAC group (1.25 (95% CI = 0.88-1.77)) were not. CONCLUSION: Patients taking warfarin prior to intracerebral hemorrhage onset suffered more severe hemorrhages, as evidenced by higher admission NIHSS scores and worse discharge mRS. In contrast, no increase in severity was seen with antiplatelet agents.
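A sketch of how an adjusted incidence rate ratio for the count-like NIHSS score could be obtained, using Poisson regression in statsmodels on synthetic stand-in data; the registry's actual model specification and covariate set are not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "nihss": rng.poisson(12, n),        # count-like severity score
    "warfarin": rng.integers(0, 2, n),  # 1 = prestroke warfarin use
    "age": rng.normal(70.0, 15.0, n),
})
fit = smf.poisson("nihss ~ warfarin + age", data=df).fit(disp=0)
print(np.exp(fit.params["warfarin"]))  # adjusted incidence rate ratio
```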
ABSTRACT
The accuracy of the information in the Protein Data Bank (PDB) is of great importance for the myriad downstream applications that make use of protein structural information. Despite best efforts, the occasional introduction of errors is inevitable, especially where the experimental data are of limited resolution. A novel protein structure validation approach based on spotting inconsistencies between the residue contacts and distances observed in a structural model and those computationally predicted by methods such as AlphaFold2 has previously been established. It is particularly well suited to the detection of register errors. Importantly, this new approach is orthogonal to traditional methods based on stereochemistry or map-model agreement, and it is resolution independent. Here, thousands of likely register errors are identified by scanning 3-5 Å resolution structures in the PDB. Unlike most methods, this approach yields suggested corrections to the register of affected regions, which, as shown even by limited implementation, lead to improved refinement statistics in the vast majority of cases. A few limitations and confounding factors, such as fold-switching proteins, are characterized, but this approach is expected to have broad application in spotting potential issues in current accessions and, through its implementation and distribution in CCP4, in helping to ensure the accuracy of future depositions.
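A conceptual sketch of the core idea, not the published method: compare the observed CA-CA distance matrix of a deposited model against predicted distances (e.g., derived from AlphaFold2) and flag residues whose local contacts disagree systematically, which is the signature of a register error.

```python
import numpy as np

def disagreement_profile(observed: np.ndarray, predicted: np.ndarray,
                         contact_cutoff: float = 8.0) -> np.ndarray:
    """Mean |observed - predicted| distance per residue, averaged over
    that residue's predicted contacts (CA-CA distance < cutoff)."""
    contacts = predicted < contact_cutoff
    np.fill_diagonal(contacts, False)
    diff = np.abs(observed - predicted)
    counts = contacts.sum(axis=1)
    profile = np.zeros(len(observed))
    mask = counts > 0
    profile[mask] = (diff * contacts).sum(axis=1)[mask] / counts[mask]
    return profile

# Residues whose local contact distances disagree strongly with the
# prediction are register-shift suspects (threshold is illustrative):
# profile = disagreement_profile(obs_dmat, pred_dmat)
# suspects = np.where(profile > 3.0)[0]
```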
ABSTRACT
Many plants exhibit a canopy seed bank, in which seeds persist within the canopy for prolonged periods, gradually descending over time and potentially influencing seed predation and animal-mediated dispersal. However, the impact of delayed seed drop on animal predation and seed dispersal remains unclear. We used seeds of the Chinese Armand pine (Pinus armandii) to simulate the delayed seed drop of a canopy seed bank, releasing 7800 pine seeds in both winter and the following summer over 2 years and tracking their fates to investigate the effect on seed predation and dispersal by rodents in a pine plantation in southwest China. Results showed significant seasonal differences in seed fate. In summer, seeds experienced higher predation rates (62.08% vs. 3.80% in winter) and lower scatter-hoarding rates (4.18% vs. 15.40% in winter). Additionally, seeds in summer were dispersed farther (4.20 m vs. 3.56 m in winter) and primarily formed single-seed caches, as opposed to the multi-seed caches formed in winter. Although delayed seed drop increased immediate predation risk, favorable summer conditions allowed rapid germination, reducing long-term exposure to predation. In conclusion, while delayed seed drop increases immediate predation risk and reduces caching, it concurrently enhances dispersal distances and reduces cache size.
ABSTRACT
The coexistence of marine sensitive areas with the oil industry requires robust preparedness and rapid response capabilities for monitoring and mitigating oil spill events. Scientifically proven satellite-based methods for the visual detection of oil spills are widely recognized as effective, low-cost, transferable, scalable, and operational solutions, particularly in developing economies. Following meticulous design and implementation, we adopted and executed a relatively low-cost operational system for detecting oil spills on the ocean surface and issuing alerts. We analyzed over 1500 satellite images, issuing over 70 warning reports on oil slicks and spills in the southern Gulf of Mexico. To assess the system's efficiency and performance, we leveraged data from three major oil spill incidents in the study region during June and July of 2023, covering a maximum area of 669 km² and tracked for 12 to 24 days. We documented the evolution of these oil spills by integrating satellite sensing data with on-site Lagrangian drifting buoys, a network of high-frequency radars, and citizen reports to validate the outcomes of the system. We generated timely technical information on each spill's evolution, informing decision-makers and local community leaders to strengthen their mitigation response capabilities. Additionally, we developed a robust database with spectral and spatiotemporal features of satellite-detected oil, thereby contributing to the scientific understanding of sea surface dynamics related to natural and anthropogenic oil sources. This study also outlines immediate-, medium-, and long-term research agendas and establishes a reference for a sustained, transferable, and operational oil spill monitoring system.
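For illustration only, the classic dark-spot principle behind visual satellite detection (oil damps capillary waves, so slicks appear anomalously dark in radar backscatter) can be sketched as a toy threshold detector; this is an assumption for exposition, not the operational system's algorithm.

```python
import numpy as np

def dark_spot_mask(backscatter_db: np.ndarray, k: float = 2.0) -> np.ndarray:
    """Flag pixels darker than (scene mean - k * std) as slick candidates."""
    mu, sigma = backscatter_db.mean(), backscatter_db.std()
    return backscatter_db < (mu - k * sigma)

# Synthetic scene: uniform sea clutter with one injected dark patch
rng = np.random.default_rng(0)
scene = rng.normal(-8.0, 1.0, (512, 512))  # dB-like backscatter values
scene[200:260, 300:380] -= 6.0             # simulated slick
mask = dark_spot_mask(scene)
print("candidate slick pixels:", int(mask.sum()))
```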