ABSTRACT
Biodiversity contributes to the ecological and climatic stability of the Amazon Basin [1,2], but is increasingly threatened by deforestation and fire [3,4]. Here we quantify these impacts over the past two decades using remote-sensing estimates of fire and deforestation and comprehensive range estimates of 11,514 plant species and 3,079 vertebrate species in the Amazon. Deforestation has led to large amounts of habitat loss, and fires further exacerbate this already substantial impact on Amazonian biodiversity. Since 2001, 103,079-189,755 km² of Amazon rainforest has been impacted by fires, potentially affecting the ranges of 77.3-85.2% of species that are listed as threatened in this region [5]. The impacts of fire on the ranges of species in Amazonia could be as high as 64%, and greater impacts are typically associated with species that have restricted ranges. We find close associations between forest policy, fire-impacted forest area and their potential impacts on biodiversity. In Brazil, forest policies that were initiated in the mid-2000s corresponded to reduced rates of burning. However, relaxed enforcement of these policies in 2019 has seemingly begun to reverse this trend: approximately 4,253-10,343 km² of forest has been impacted by fire, leading to some of the most severe potential impacts on biodiversity since 2009. These results highlight the critical role of policy enforcement in the preservation of biodiversity in the Amazon.
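The per-species metric described above, the share of a species' range that overlaps fire-impacted forest, can be approximated with a simple raster overlay. Below is a minimal sketch assuming hypothetical co-registered boolean rasters on an equal-area grid; the study's actual data layers and processing pipeline are not reproduced here.

```python
import numpy as np

def fire_impacted_fraction(range_mask: np.ndarray, fire_mask: np.ndarray) -> float:
    """Fraction of a species' range cells that intersect fire-affected cells.

    Both inputs are boolean rasters on the same grid (hypothetical example
    data; an equal-area projection is assumed so cell counts proxy area).
    """
    range_cells = range_mask.sum()
    if range_cells == 0:
        return 0.0
    return float((range_mask & fire_mask).sum() / range_cells)

# Toy example: a 100x100 grid with a restricted range partially burned.
rng = np.random.default_rng(0)
species_range = np.zeros((100, 100), dtype=bool)
species_range[40:60, 40:60] = True            # restricted range
fires = rng.random((100, 100)) < 0.1          # ~10% of cells burned

print(f"Range impacted by fire: {fire_impacted_fraction(species_range, fires):.1%}")
```

Restricted-range species concentrate their cells in few places, so a single burned region can drive this fraction high, consistent with the pattern reported above.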
Subject(s)
Biodiversity , Conservation of Natural Resources/legislation & jurisprudence , Droughts , Forestry/legislation & jurisprudence , Rainforest , Wildfires/statistics & numerical data , Animals , Brazil , Climate Change/statistics & numerical data , Forests , Geographic Mapping , Plants , Trees/physiology , Vertebrates
ABSTRACT
Increased exposure to extreme heat from both climate change and the urban heat island effect (total urban warming) threatens the sustainability of rapidly growing urban settlements worldwide. Extreme heat exposure is highly unequal and severely impacts the urban poor. While previous studies have quantified global exposure to extreme heat, the lack of a globally accurate, fine-resolution temporal analysis of urban exposure crucially limits our ability to deploy adaptations. Here, we estimate daily urban population exposure to extreme heat for 13,115 urban settlements from 1983 to 2016. We harmonize global, fine-resolution (0.05°), daily temperature maxima and relative humidity estimates with geolocated and longitudinal global urban population data. We measure the average annual rate of increase in exposure (person-days per year) at the global, regional, national, and municipality levels, separating the contribution to exposure trajectories from urban population growth versus total urban warming. Using a daily maximum wet bulb globe temperature threshold of 30 °C, global exposure increased nearly 200% from 1983 to 2016. Total urban warming elevated the annual increase in exposure by 52% compared to urban population growth alone. Exposure trajectories increased for 46% of urban settlements, which together in 2016 comprised 23% of the planet's population (1.7 billion people). However, how total urban warming and population growth drove exposure trajectories is spatially heterogeneous. This study reinforces the importance of employing multiple extreme heat exposure metrics to identify local patterns and compare exposure trends across geographies. Our results suggest that previous research underestimates extreme heat exposure, highlighting the urgency of targeted adaptations and early warning systems to reduce harm from urban extreme heat exposure.
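The person-days exposure metric reduces to a count of threshold exceedances multiplied by settlement population. A minimal sketch under assumed inputs (a year of daily maximum WBGT values per settlement and a matched population estimate; the harmonized datasets themselves are not shown):

```python
import numpy as np

WBGT_THRESHOLD_C = 30.0  # daily-maximum wet bulb globe temperature threshold

def annual_person_days(wbgt_max_daily: np.ndarray, population: float) -> float:
    """Person-days of extreme heat exposure for one settlement-year.

    wbgt_max_daily: daily maximum WBGT values (deg C) for one year.
    population: settlement population in that year (assumed constant).
    """
    hot_days = int((wbgt_max_daily > WBGT_THRESHOLD_C).sum())
    return hot_days * population

# Toy example: a warm settlement of one million people (synthetic maxima).
rng = np.random.default_rng(1)
wbgt = rng.normal(loc=27.0, scale=3.0, size=365)
print(f"{annual_person_days(wbgt, 1_000_000):,.0f} person-days")
```

Holding the WBGT series fixed while varying population (or vice versa) is one way to separate the population-growth and warming contributions the abstract describes.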
Subject(s)
Environmental Exposure/adverse effects , Extreme Heat/adverse effects , Extreme Weather , Hot Temperature , Urban Population/statistics & numerical data , Cities/statistics & numerical data , Climate , Global Warming , Humans , Public Health , Urbanization
ABSTRACT
A new tuberculosis vaccine is a high priority. However, the classical development pathway is a major deterrent. Most tuberculosis cases arise within 2 years after Mycobacterium tuberculosis exposure, suggesting that a 3-year trial period should be possible if the sample size is large enough to maximize the number of early exposures. An increased sample size could be facilitated by working alongside optimized routine services for case ascertainment, with strategies for enhanced case detection and safety monitoring. Shortening enrolment could be achieved by simplifying screening criteria and procedures and by strengthening site capacity. Together, these measures could enable radically shortened phase 3 tuberculosis vaccine trials.
Subject(s)
Mycobacterium tuberculosis , Tuberculosis Vaccines , Tuberculosis , Humans , Tuberculosis Vaccines/immunology , Tuberculosis/prevention & control , Tuberculosis/immunology , Mycobacterium tuberculosis/immunology , Double-Blind Method
ABSTRACT
Lima bean production has been an economically valuable staple in Delaware agriculture for almost a century, with annual revenue approaching $8 million (USDA-NASS, 2019; Evans et al. 2007). From 2019 to 2021, lima beans displaying symptoms of brown discoloration, referred to as "brown bean," were observed in the green baby lima variety 'Cypress' across multiple commercial and research fields. Symptoms were present in approximately 1-5% of beans and not visible until pods were opened for harvest. Thirty-seven symptomatic beans were collected and surface disinfested in 0.85% sodium hypochlorite for 30 s, rinsed in sterile deionized water for 30 s, sectioned into four pieces and plated onto potato dextrose agar (PDA) amended with 50 µg/ml penicillin G and streptomycin sulfate. Petri dishes were incubated at 23°C and observed for colony morphology. Pure cultures were obtained with tan colonies that had mycelia with right-angle branching and septations near the branch, consistent with the description of Rhizoctonia solani Kühn (Sneh et al. 1991). Pathogen identity was confirmed by DNA extraction and sequencing of the internal transcribed spacer (ITS) region of nuclear ribosomal DNA using primers ITS4/ITS5 (White et al. 1990) for the thirty-seven isolates collected in 2019 and 2020. Isolates were identified as Rhizoctonia solani AG-4 (99.9% sequence identity with GenBank accession MN106359.1). A representative isolate was selected to complete Koch's postulates, and its sequence was deposited in GenBank as accession number MW560551. To observe colonization ability, 10 detached pods were sterilized in 75% EtOH for 60 s, then rinsed in Milli-Q water. The detached pods were divided among two 150 mm Petri dishes, each containing a single 150 mm filter paper saturated with Milli-Q water. Five 1 mm² agar plugs colonized with the representative R. solani isolate were placed 0.5 cm apart along the length of each pod. Plates were sealed with parafilm and left at room temperature. Control pods were kept in identical conditions but inoculated using clean agar plugs. The trial was repeated, and a second trial was conducted on 12 attached asymptomatic pods from 'C-Elite Select' lima bean plants at the succulent seed stage to complete Koch's postulates. Pods were surface disinfested with 70% ethanol. Three attached pods were wounded with the tip of a sterile scalpel blade where a colonized agar plug was placed and loosely wrapped with a thin parafilm layer to maintain contact. Three non-wounded attached pods were also inoculated with a colonized agar plug and wrapped with parafilm. Three wounded and three non-wounded pods received clean agar plugs as controls. Both attached and detached pods were kept at room temperature for one week until symptoms began to manifest on the pod surfaces, at which point the beans from infected pods were removed and placed on PDA, three to a plate. In the attached assay, all beans of both wounded and non-wounded pods developed symptoms. The plates were stored in identical conditions and monitored for 5 days until tan colonies were observed. Culture morphology was consistent with the original isolate in all beans. Sequencing of the ITS region confirmed identity as R. solani AG-4. No symptoms were observed on control pods or seeds. Rhizoctonia solani is most frequently associated with symptoms of root rot (Sharma-Poudyal et al. 2015), but no stem symptoms are associated with the late-season "brown bean" that has been observed throughout production in recent years.
To our knowledge, this is the first report of Rhizoctonia solani AG-4 causing brown bean symptoms on lima bean in Delaware. In preliminary observations, symptoms appear to be worse in pods that may have had contact with the soil directly or via rain splash. Because the disease cannot be detected until pods are split open, it has the potential to reduce lima bean quality at harvest. Further monitoring should be conducted to quantify yield impacts and to develop appropriate preventive and curative management techniques.
ABSTRACT
Introduction: How SARS-CoV-2 manifests itself in older adults was poorly understood at the outset of the pandemic. We undertook a retrospective observational analysis of all patients admitted to older people's services with confirmed COVID-19 in one of the largest hospitals in Europe. We detail presenting symptoms, prognostic features and vulnerability to nosocomial spread. Methods: We retrospectively collected data for each patient with a positive SARS-CoV-2 RT-PCR result between 18 March and 20 April 2020 in a department of medicine for the elderly in Glasgow. Results: 222 patients were included in our analysis. Age ranged from 56 to 99 years (mean = 82) and 148 were female (67%). 119 patients had a positive swab for SARS-CoV-2 within the first 14 days of admission; only 32% of these patients presented with a primarily respiratory illness. 103 patients (46%) tested positive after 14 days of admission, which was felt to represent likely nosocomial infection. 95 patients (43%) died by day 30 after diagnosis. Discussion: These data indicate that older people were more likely to present with non-respiratory symptoms. High clinical frailty scores, severe lymphopenia and cumulative comorbidities were associated with higher mortality rates. Several contributing factors are likely to have led to nosocomial transmission.
Subject(s)
COVID-19 Testing , COVID-19/diagnosis , Age Factors , Aged , Aged, 80 and over , COVID-19/complications , COVID-19/mortality , Cross Infection/complications , Cross Infection/diagnosis , Cross Infection/mortality , Female , Frail Elderly , Frailty/complications , Frailty/diagnosis , Humans , Male , Middle Aged , Prognosis , Retrospective Studies , Scotland/epidemiology
ABSTRACT
BACKGROUND: In October 2015, 65 people came into direct contact with a healthcare worker presenting with a late reactivation of Ebola virus disease (EVD) in the United Kingdom. Vaccination was offered to 45 individuals with an initial assessment of high exposure risk. METHODS: Approval for rapid expanded access to the recombinant vesicular stomatitis virus-Zaire Ebola virus (rVSV-ZEBOV) vaccine as an unlicensed emergency medicine was obtained from the relevant authorities. An observational follow-up study was carried out for 1 year following vaccination. RESULTS: Twenty-six of the 45 individuals elected to receive vaccination on 10 or 11 October 2015 following written informed consent. By day 14, 39% had seroconverted, increasing to 87% by day 28 and 100% by 3 months, although these responses were not always sustained. Neutralizing antibody responses were detectable in 36% by day 14 and 73% at 12 months. Common side effects included fatigue, myalgia, headache, arthralgia, and fever. These were positively associated with glycoprotein-specific T-cell responses but not with immunoglobulin (Ig) M or IgG antibody responses. No severe vaccine-related adverse events were reported. No one exposed to the virus became infected. CONCLUSIONS: This paper reports the use of the rVSV-ZEBOV vaccine given as an emergency intervention to individuals exposed to a patient presenting with a late reactivation of EVD. The vaccine was relatively well tolerated, but a high percentage of recipients developed a fever ≥37.5°C, necessitating urgent screening for Ebola virus, and a small number developed persistent arthralgia.
Subject(s)
Ebola Vaccines/therapeutic use , Hemorrhagic Fever, Ebola , Post-Exposure Prophylaxis , Antibodies, Viral , Ebolavirus , Follow-Up Studies , Hemorrhagic Fever, Ebola/prevention & control , Humans , Recurrence , United Kingdom
ABSTRACT
Streptococcus pneumoniae is the major bacterial cause of community-acquired pneumonia, and the leading cause of childhood pneumonia deaths worldwide. Nasal colonization is an essential step prior to infection. The cytokine IL-17 protects against such colonization, and vaccines that enhance IL-17 responses to pneumococcal colonization are being developed. The role of IL-17 in host defence against pneumonia is not known. To address this issue, we have utilized a murine model of pneumococcal pneumonia in which the gene for the IL-17 cytokine family receptor, Il17ra, has been inactivated. Using this model, we show that IL-17, produced predominantly by γδ T cells, protects mice against death from the invasive TIGR4 strain (serotype 4), which expresses a relatively thin capsule. However, in pneumonia produced by two heavily encapsulated strains with low invasive potential (serotypes 3 and 6B), IL-17 significantly enhanced mortality. Neutrophil uptake and killing of the serotype 3 strain were significantly impaired compared to the serotype 4 strain, and depletion of neutrophils with antibody enhanced survival of mice infected with the highly encapsulated SRL1 strain. These data strongly suggest that IL-17-mediated neutrophil recruitment to the lungs clears infection by the invasive TIGR4 strain, but that lung neutrophils exacerbate disease caused by the highly encapsulated pneumococcal strains. Thus, whilst augmenting IL-17 immune responses against pneumococci may decrease nasal colonization, it may worsen outcomes during pneumonia caused by some strains.
Subject(s)
Interleukin-17/immunology , Pneumonia, Pneumococcal/immunology , Receptors, Interleukin-17/genetics , Streptococcus pneumoniae/immunology , Animals , Bacteremia/immunology , Bacteremia/microbiology , Bacterial Capsules/immunology , Bacterial Capsules/ultrastructure , Bronchoalveolar Lavage Fluid/cytology , Bronchoalveolar Lavage Fluid/microbiology , Disease Models, Animal , Lung/cytology , Lung/enzymology , Lung/immunology , Mice , Mice, Inbred C57BL , Mice, Knockout , Microscopy, Electron, Transmission , Microscopy, Fluorescence , Nasopharynx/microbiology , Neutrophils/cytology , Neutrophils/immunology , Peroxidase/metabolism , Phagocytosis , Pneumonia, Pneumococcal/mortality , Pneumonia, Pneumococcal/prevention & control , Receptors, Antigen, T-Cell, gamma-delta/genetics , Receptors, Antigen, T-Cell, gamma-delta/immunology , Specific Pathogen-Free Organisms , Streptococcus pneumoniae/ultrastructure
ABSTRACT
Policy makers around the world tout decentralization as an effective tool in the governance of natural resources. Despite the popularity of these reforms, there is limited scientific evidence on the environmental effects of decentralization, especially in tropical biomes. This study presents evidence on the institutional conditions under which decentralization is likely to be successful in sustaining forests. We draw on common-pool resource theory to argue that the environmental impact of decentralization hinges on the ability of reforms to engage local forest users in the governance of forests. Using matching techniques, we analyze longitudinal field observations on both social and biophysical characteristics in a large number of local government territories in Bolivia (a country with a decentralized forestry policy) and Peru (a country with a much more centralized forestry policy). We find that territories with a decentralized forest governance structure have more stable forest cover, but only when local forest user groups actively engage with the local government officials. We provide evidence in support of a possible causal process behind these results: When user groups engage with the decentralized units, it creates a more enabling environment for effective local governance of forests, including more local government-led forest governance activities, fora for the resolution of forest-related conflicts, intermunicipal cooperation in the forestry sector, and stronger technical capabilities of the local government staff.
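The matching design described above, pairing decentralized territories with observably similar centralized ones before comparing forest outcomes, can be illustrated with nearest-neighbour covariate matching. The sketch below runs on synthetic data; the variable names, covariates and estimator are hypothetical stand-ins, not the study's actual specification.

```python
import numpy as np

def nearest_neighbor_att(treated_X, control_X, treated_y, control_y):
    """Average treatment effect on the treated (ATT) via 1-nearest-neighbour
    matching in standardized covariate space. All data here are synthetic."""
    X = np.vstack([treated_X, control_X])
    mu, sd = X.mean(axis=0), X.std(axis=0)
    t = (treated_X - mu) / sd
    c = (control_X - mu) / sd
    effects = []
    for i in range(len(t)):
        j = int(np.argmin(np.linalg.norm(c - t[i], axis=1)))  # closest control
        effects.append(treated_y[i] - control_y[j])
    return float(np.mean(effects))

# Hypothetical territories: 3 covariates, outcome = forest-cover stability.
rng = np.random.default_rng(2)
Xt, Xc = rng.normal(size=(50, 3)), rng.normal(size=(200, 3))
beta = np.array([0.2, -0.1, 0.3])
yt = 0.8 + Xt @ beta + rng.normal(0, 0.1, 50)   # decentralized (treated)
yc = 0.6 + Xc @ beta + rng.normal(0, 0.1, 200)  # centralized (control)
print(f"Estimated ATT: {nearest_neighbor_att(Xt, Xc, yt, yc):.2f}")
```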
Subject(s)
Community-Institutional Relations , Conservation of Natural Resources , Forestry/legislation & jurisprudence , Local Government , Bolivia , Ecosystem , Forests , Humans , Peru , Public Policy , Trees
ABSTRACT
Land cover maps increasingly underlie research into socioeconomic and environmental patterns and processes, including global change. It is known that map errors impact our understanding of these phenomena, but quantifying these impacts is difficult because many areas lack adequate reference data. We used a highly accurate, high-resolution map of South African cropland to assess (1) the magnitude of error in several current generation land cover maps, and (2) how these errors propagate in downstream studies. We first quantified pixel-wise errors in the cropland classes of four widely used land cover maps at resolutions ranging from 1 to 100 km, and then calculated errors in several representative "downstream" (map-based) analyses, including assessments of vegetative carbon stocks, evapotranspiration, crop production, and household food security. We also evaluated maps' spatial accuracy based on how precisely they could be used to locate specific landscape features. We found that cropland maps can have substantial biases and poor accuracy at all resolutions (e.g., at 1 km resolution, up to ~45% underestimates of cropland (bias) and nearly 50% mean absolute error (MAE, describing accuracy); at 100 km, up to 15% underestimates and nearly 20% MAE). National-scale maps derived from higher-resolution imagery were most accurate, followed by multi-map fusion products. Constraining mapped values to match survey statistics may be effective at minimizing bias (provided the statistics are accurate). Errors in downstream analyses could be substantially amplified or muted, depending on the values ascribed to cropland-adjacent covers (e.g., with forest as adjacent cover, carbon map error was 200%-500% greater than in input cropland maps, but ~40% less for sparse cover types). The average locational error was 6 km (600%). These findings provide deeper insight into the causes and potential consequences of land cover map error, and suggest several recommendations for land cover map users.
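The two error measures used above have standard definitions: bias is the mean signed difference between mapped and reference cropland fractions per aggregated cell (negative values indicate underestimation), and MAE is the mean absolute difference. A minimal sketch on synthetic rasters; the actual reference and candidate maps are assumed, not shown:

```python
import numpy as np

def map_error_stats(mapped: np.ndarray, reference: np.ndarray):
    """Bias and mean absolute error of mapped cropland fractions.

    mapped, reference: cropland fraction (0-1) per grid cell at a common
    aggregated resolution (e.g., 1 km). Synthetic example data below.
    """
    diff = mapped - reference
    return float(diff.mean()), float(np.abs(diff).mean())

# Toy example: a map that systematically underestimates cropland.
rng = np.random.default_rng(3)
reference = rng.uniform(0, 1, size=(500, 500))
mapped = np.clip(reference * 0.6 + rng.normal(0, 0.1, reference.shape), 0, 1)

bias, mae = map_error_stats(mapped, reference)
print(f"bias = {bias:+.2f}, MAE = {mae:.2f}")  # negative bias = underestimate
```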
Subject(s)
Conservation of Natural Resources/statistics & numerical data , Crops, Agricultural , Environmental Monitoring/methods , Forests , Crop Production , Environmental Monitoring/standards , Environmental Monitoring/statistics & numerical data , Geographic Information Systems , Geographic Mapping , South Africa
ABSTRACT
BACKGROUND: The bacterial pathogen Streptococcus pneumoniae colonizes the nasopharynx prior to causing disease, necessitating successful competition with the resident microflora. Cytokines of the IL-17 family are important in host defence against this pathogen, but their effect on the nasopharyngeal microbiome is unknown. Here we analyse the influence of IL-17 on the composition and interactions of the nasopharyngeal microbiome before and after pneumococcal colonization. RESULTS: Using a murine model and 16S rRNA profiling, we found that a lack of IL-17 signalling led to profound alterations in the nasal but not the lung microbiome, characterized by decreased diversity and richness, increases in Proteobacteria, and reductions in Bacteroidetes, Actinobacteria and Acidobacteria. Following experimental pneumococcal nasal inoculation, animals lacking IL-17 family signalling showed increased pneumococcal colonization, though both wild-type and knockout animals showed significant disruption of nasal microbiome composition, with increases in the proportion of Proteobacteria, even in animals that did not have persistent colonization. Sparse correlation analysis of the composition of the microbiome at various time points after infection showed strong positive interactions within the Firmicutes and Proteobacteria, but strong antagonism between members of these two phyla. CONCLUSIONS: These results show the powerful influence of IL-17 signalling on the composition of the nasal microbiome before and after pneumococcal colonization, and an apparent lack of interspecific competition between pneumococci and other Firmicutes. IL-17-driven changes in nasal microbiome composition may thus be an important factor in successful resistance to pneumococcal colonization and could potentially be manipulated to augment host defence against this pathogen.
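The diversity and richness contrasts reported above are typically computed per sample from a 16S taxon-abundance table; the Shannon index is one standard choice, though the study's exact indices are not specified here. A minimal sketch on hypothetical abundance vectors, not the study's data:

```python
import numpy as np

def shannon_diversity(counts: np.ndarray) -> float:
    """Shannon index H' = -sum(p_i * ln p_i) for one sample's taxon counts."""
    counts = counts[counts > 0]
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def richness(counts: np.ndarray) -> int:
    """Number of taxa observed at least once."""
    return int((counts > 0).sum())

# Hypothetical nasal samples: wild-type vs IL-17 receptor knockout.
wild_type = np.array([120, 80, 60, 40, 30, 20, 10, 5, 3, 2])
knockout = np.array([400, 50, 10, 5, 0, 0, 0, 0, 0, 0])  # fewer, skewed taxa

for name, sample in (("wild-type", wild_type), ("knockout", knockout)):
    print(f"{name}: richness={richness(sample)}, H'={shannon_diversity(sample):.2f}")
```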
Subject(s)
Interleukin-17/metabolism , Microbiota , Nasal Mucosa/metabolism , Pneumococcal Infections/genetics , Streptococcus pneumoniae/physiology , Animals , Genetic Variation , Interleukin-17/genetics , Lung/cytology , Lung/metabolism , Lung/microbiology , Mice , Mice, Inbred C57BL , Mice, Knockout , Nasal Mucosa/cytology , Nasal Mucosa/microbiology , Pneumococcal Infections/microbiology , Receptors, Interleukin-17/metabolism , Ribotyping , Streptococcus pneumoniae/genetics , Streptococcus pneumoniae/isolation & purification , Streptococcus pneumoniae/pathogenicity
ABSTRACT
Multiplex PCR can provide rapid diagnosis for patients presenting with an acute undifferentiated febrile illness. Such technology is useful in deployed settings, where access to conventional microbiological diagnosis is limited. It was used in Sierra Leone to guide the management of febrile healthcare workers for whom Ebola virus disease was a possible diagnosis. In particular, it informed appropriate antibiotic treatment while minimising the risk to clinicians of exposure to the causative organism.
Subject(s)
Fever/diagnosis , Fever/microbiology , Gastroenteritis/diagnosis , Gastroenteritis/microbiology , Hemorrhagic Fever, Ebola/therapy , Adult , Disease Outbreaks , Gastroenteritis/complications , Health Personnel , Humans , Male , Multiplex Polymerase Chain Reaction
ABSTRACT
Lectin-like bacteriocins consist of tandem monocot mannose-binding domains and display a genus-specific killing activity. Here we show that pyocin L1, a novel member of this family from Pseudomonas aeruginosa, targets susceptible strains of this species through recognition of the common polysaccharide antigen (CPA) of P. aeruginosa lipopolysaccharide, which is predominantly a homopolymer of D-rhamnose. Structural and biophysical analyses show that recognition of CPA occurs through the C-terminal carbohydrate-binding domain of pyocin L1 and that this interaction is a prerequisite for bactericidal activity. Further to this, we show that the previously described lectin-like bacteriocin putidacin L1 shows a similar carbohydrate-binding specificity, indicating that oligosaccharides containing D-rhamnose, and not D-mannose as was previously thought, are the physiologically relevant ligands for this group of bacteriocins. The widespread inclusion of D-rhamnose in the lipopolysaccharide of members of the genus Pseudomonas explains the unusual genus-specific activity of the lectin-like bacteriocins.
Subject(s)
Bacteriocins/metabolism , Lipopolysaccharides/metabolism , Pseudomonas aeruginosa/metabolism , Rhamnose/metabolism , Amino Acid Sequence , Bacteriocins/chemistry , Immunoblotting , Molecular Sequence Data , Mutagenesis, Site-Directed , Polymerase Chain Reaction , Protein Structure, Quaternary , Pseudomonas aeruginosa/chemistry , Rhamnose/chemistry
ABSTRACT
A central challenge in global ecology is the identification of key functional processes in ecosystems that scale across landscapes without requiring data for individual species. Given that nearly all tree species form symbiotic relationships with one of two types of mycorrhizal fungi - arbuscular mycorrhizal (AM) and ectomycorrhizal (ECM) fungi - and that AM- and ECM-dominated forests often have distinct nutrient economies, the detection and mapping of mycorrhizae over large areas could provide valuable insights about fundamental ecosystem processes such as nutrient cycling, species interactions, and overall forest productivity. We explored remotely sensed tree canopy spectral properties to detect underlying mycorrhizal association across a gradient of AM- and ECM-dominated forest plots. Statistical mining of reflectance and reflectance derivatives across moderate/high-resolution Landsat data revealed distinct phenological signals that differentiated AM and ECM associations. This approach was trained and validated against measurements of tree species and mycorrhizal association across ~130,000 trees throughout the temperate United States. We were able to predict 77% of the variation in mycorrhizal association distribution within the forest plots (P < 0.001). This work moves us toward mapping mycorrhizal association globally and advancing our understanding of biogeochemical cycling and other ecosystem processes.
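The mapping step, predicting plot-level mycorrhizal association from canopy reflectance phenology, amounts to a supervised regression problem. Below is a minimal scikit-learn sketch on synthetic "spectral-phenology" features; the feature construction, model choice and target definition are all hypothetical stand-ins for the study's actual workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for per-plot phenological reflectance features
# (e.g., seasonal band values and derivatives); the target is a
# hypothetical plot-level fraction of ECM-associated trees.
rng = np.random.default_rng(4)
n_plots, n_features = 1000, 12
X = rng.normal(size=(n_plots, n_features))
ecm_fraction = 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 3]
                                 + rng.normal(0, 0.5, n_plots))))

X_train, X_test, y_train, y_test = train_test_split(
    X, ecm_fraction, test_size=0.3, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
# The study reports 77% of variation explained; this R^2 reflects only
# the synthetic data above, not the published result.
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")
```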
Subject(s)
Ecology/methods , Forests , Mycorrhizae/physiology , Trees/microbiology , Remote Sensing Technology , Satellite Imagery
ABSTRACT
NEW FINDINGS: What is the central question of this study? To what extent focal abdominal aortic aneurysmal (AAA) disease is associated with systemic remodelling of the vascular tree remains unknown. The present study examined whether anatomical differences exist between the distances of the intervisceral artery origins and AAA location/size in patients with AAA disease compared with healthy controls. What is the main finding and its importance? Intervisceral artery distances were consistently greater in AAA patients, highlighting the systemic nature of AAA disease, which extends proximally to the abdominal aorta and its branches. The anatomical description of the natural variation in visceral artery origins has implications for the design of stent grafts and the planning of complex open aortic surgery. The initial histopathology of abdominal aortic aneurysmal (AAA) disease is atherosclerotic, later diverting towards a distinctive dilating rather than occlusive aortic phenotype. To what extent focal AAA disease is associated with systemic remodelling of the vascular tree remains unknown. The present study examined whether anatomical differences exist between the intervisceral artery origins and AAA location/size in patients with AAA disease (AAA+) relative to those without (AAA-). Preoperative contrast-enhanced computerized tomograms were reviewed in 90 consecutive AAA+ patients scheduled for open repair who underwent an infrarenal (n = 45), suprarenal (n = 26) or supracoeliac clamp (n = 19). These were compared with 39 age-matched AAA- control patients. Craniocaudal measurements were recorded from the distal origin of the coeliac artery to the superior mesenteric artery and from the origin of the superior mesenteric artery to both renal artery origins. Serial blood samples were obtained for estimation of the glomerular filtration rate before and after surgery. Distances between intervisceral artery origins were consistently greater in AAA+ patients (P < 0.05 versus AAA-), although unrelated to AAA diameter (P > 0.05). Postoperative renal function became progressively more impaired the more proximal the clamp placement (estimated glomerular filtration rate for supracoeliac < suprarenal < infrarenal clamps, P < 0.05). These findings highlight the systemic nature of AAA disease, which extends proximally to the abdominal aorta and its branches. The anatomical description of the natural variation in visceral artery origins has implications for the design of stent grafts and the planning of complex open aortic surgery.
Subject(s)
Aorta, Abdominal/pathology , Aortic Aneurysm, Abdominal/pathology , Vascular Remodeling/physiology , Aged , Female , Glomerular Filtration Rate/physiology , Humans , Male , Mesenteric Arteries/pathology , Renal Artery/pathology
ABSTRACT
This study uses a mail survey of private landowners in the Midwest United States to understand the characteristics of owners who have planted trees or intend to plant trees in the future. The analysis examines which policy tools encourage owners to plant trees, and how policy tools operate across different ownership attributes to promote tree-planting on private lands. Logistic regression results suggest that cost-subsidizing policy tools, such as low-cost and free seedlings, significantly increase the odds of actual and planned reforestation when landowners consider them important for increasing forest cover. Individuals most likely to plant trees, when low-cost seedlings are available and important, are fairly recent (<5 years), college-educated owners who own small parcels (<4 ha) and use the land for recreation. Motivations to reforest were also shaped by owners' planning horizons, connection to the land, previous tree-planting experience, and peer influence. The study has relevance for the design of policy approaches that can encourage private forestation by providing economic incentives and capacity-building support to private landowners.
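The logistic-regression framing above, the log-odds of planting as a function of policy-tool importance and ownership attributes, can be sketched with statsmodels on synthetic survey data. The variable names and coefficients below are hypothetical, not the survey's actual items or estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic survey: binary predictors for tool importance and attributes.
rng = np.random.default_rng(5)
n = 800
df = pd.DataFrame({
    "low_cost_seedlings": rng.integers(0, 2, n),  # rated tool as important
    "college_educated": rng.integers(0, 2, n),
    "small_parcel": rng.integers(0, 2, n),
    "recreation_use": rng.integers(0, 2, n),
})
logit_p = (-1.5 + 1.2 * df.low_cost_seedlings + 0.4 * df.college_educated
           + 0.3 * df.small_parcel + 0.5 * df.recreation_use)
df["planted"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit(
    "planted ~ low_cost_seedlings + college_educated + small_parcel "
    "+ recreation_use", data=df).fit(disp=False)
print(np.exp(model.params))  # exponentiated coefficients = odds ratios
```

Exponentiating the fitted coefficients yields odds ratios, the "increase the odds" quantities the abstract refers to.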
Subject(s)
Conservation of Natural Resources/economics , Motivation , Ownership , Trees/growth & development , Conservation of Natural Resources/methods , Humans , Indiana , Surveys and Questionnaires , United States
ABSTRACT
The need to strengthen capacity to prevent, detect, and respond to public health threats around the globe is increasingly being recognized. CDC, in partnership with the World Health Organization (WHO), has committed to building capacity by assisting member states with strengthening their national capacity for integrated disease surveillance and response, as required by the International Health Regulations (IHR). CDC and other U.S. agencies have reinforced their pledge through the creation of global health security (GHS) demonstration projects. One such project was conducted during March-September 2013, when the Uganda Ministry of Health (MoH) and CDC implemented upgrades in three areas: 1) strengthening the public health laboratory system by increasing the capacity of diagnostic and specimen referral networks, 2) enhancing the existing communications and information systems for outbreak response, and 3) developing a public health emergency operations center (EOC) (Figure 1). The GHS demonstration project outcomes included development of an outbreak response module that allowed reporting of suspected cases of illness caused by priority pathogens via short messaging service (SMS; i.e., text messaging) to the Uganda District Health Information System (DHIS-2), and expansion of the biologic specimen transport and laboratory reporting system supported by the President's Emergency Plan for AIDS Relief (PEPFAR). Other enhancements included strengthening laboratory management, establishing and equipping the EOC, and evaluating these enhancements during an outbreak exercise. In 6 months, the project demonstrated that targeted enhancements resulted in substantial improvements to the ability of Uganda's public health system to detect and respond to health threats.
Subject(s)
Capacity Building/organization & administration , Disease Outbreaks/prevention & control , Global Health , International Cooperation , Population Surveillance , Centers for Disease Control and Prevention, U.S. , Humans , Uganda , United States , World Health Organization
ABSTRACT
Timely and accurate detection and identification of species are crucial for monitoring wildlife for conservation and management. Technological advances, including connectivity of camera traps to mobile phone networks and artificial intelligence (AI) algorithms for automated species identification, can potentially improve the timeliness and accuracy of species detection and identification. Adoption of this new technology, however, is often seen as cost-prohibitive, as it has been difficult to calculate the cost savings or qualitative benefits over the life of a program. We developed a decision tool to quantify potential cost savings associated with incorporating mobile phone network connectivity and AI technologies into monitoring programs. Using a feral cat eradication program as a case study, we used our decision tool to quantify technology-related savings in costs and carbon emissions, and compared the accuracy of AI species identification to that of experienced human observers. Over the life of the program, AI technology alone yielded cost savings of $0.27 M; when coupled with mobile phone network connectivity, it saved $2.15 M and 115,838 kg in carbon emissions, with AI algorithms outperforming human observers in both speed and accuracy. Our case study demonstrates how advanced technologies can improve the accuracy, cost-effectiveness, and efficiency of monitoring programs.
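At its core, such a decision tool is an accounting comparison: field-visit costs avoided when images are transmitted over the mobile network and triaged by AI, offset against equipment and data costs, summed over the life of the program. A minimal sketch with hypothetical cost parameters, not the eradication program's actual figures:

```python
def monitoring_cost(n_cameras: int, years: int, visits_per_year: float,
                    cost_per_visit: float,
                    fixed_tech_cost_per_camera: float = 0.0,
                    annual_data_cost_per_camera: float = 0.0) -> float:
    """Total program cost for one monitoring scenario (hypothetical inputs)."""
    visit_cost = n_cameras * years * visits_per_year * cost_per_visit
    tech_cost = n_cameras * (fixed_tech_cost_per_camera
                             + annual_data_cost_per_camera * years)
    return visit_cost + tech_cost

# Scenario comparison: manual card-swapping vs connected cameras + AI triage.
manual = monitoring_cost(n_cameras=300, years=5, visits_per_year=26,
                         cost_per_visit=120.0)
connected = monitoring_cost(n_cameras=300, years=5, visits_per_year=2,
                            cost_per_visit=120.0,
                            fixed_tech_cost_per_camera=400.0,
                            annual_data_cost_per_camera=180.0)
print(f"manual: ${manual:,.0f}  connected+AI: ${connected:,.0f}  "
      f"savings: ${manual - connected:,.0f}")
```

A per-visit travel distance and emissions factor could be attached to the same visit counts to derive the carbon-emissions savings the abstract reports.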
Subject(s)
Artificial Intelligence , Animals , Humans , Cell Phone , Algorithms , Carbon , Cats , Cost Savings , Conservation of Natural Resources/methods , Conservation of Natural Resources/economics
ABSTRACT
Marine megafauna exposed to fisheries bycatch belong to some of the most threatened taxonomic groups and include apex and mesopredators that contribute to ecosystem regulation. Fisheries bycatch is a major threat to the conservation of albatrosses, large petrels and other pelagic seabirds. Using data sourced from a fisheries electronic monitoring system, we assessed the effects of time of day and relative fishing depth on seabird and target-species catch rates for a Pacific Ocean pelagic longline fishery that targets albacore tuna and has an apparently high albatross bycatch rate. Using a Bayesian inference workflow with a spatially explicit generalized additive mixed model for albacore tuna, and generalized linear mixed regression models for both combined albatrosses and combined seabirds, we found that time of day and fishing depth did not significantly affect the target-species catch rate, while night-time deep setting had > 99% lower albatross and total seabird catch rates compared with both deep and shallow partial day-time sets. This provides the first evidence that night-time setting combined with deep setting reduces seabird catch risk and may be commercially viable in this and similar albacore tuna longline fisheries. Findings support evidence-informed interventions to reduce the mortality of threatened seabird species caught as bycatch in pelagic longline fisheries.
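The catch-rate comparison can be illustrated with a Poisson regression of seabird captures on set type, using log(hooks) as an exposure offset. This is a simplified, non-Bayesian stand-in for the paper's mixed models, run on synthetic data with hypothetical variable names:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic longline sets: effort (hooks) and a night-deep indicator.
rng = np.random.default_rng(6)
n_sets = 500
df = pd.DataFrame({
    "night_deep": rng.integers(0, 2, n_sets),   # 1 = night-time deep set
    "hooks": rng.integers(2000, 3500, n_sets),  # effort per set
})
# Simulate a ~99% lower per-hook seabird catch rate for night-deep sets.
base_rate = 0.004 * np.where(df.night_deep == 1, 0.01, 1.0)
df["seabirds"] = rng.poisson(base_rate * df.hooks)

model = smf.poisson("seabirds ~ night_deep", data=df,
                    offset=np.log(df.hooks)).fit(disp=False)
reduction = 1 - np.exp(model.params["night_deep"])
print(f"estimated catch-rate reduction for night-deep sets: {reduction:.1%}")
```

The exponentiated coefficient is a rate ratio, so 1 minus it recovers the percentage reduction in catch rate, the form in which the abstract reports its > 99% result.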