ABSTRACT
BACKGROUND: Blood products form the cornerstone of contemporary hemorrhage control but are limited resources. Freeze-dried plasma (FDP), which contains coagulation factors, is a promising adjunct in hemostatic resuscitation. We explore the association of FDP, alone or in combination with other blood products, with 24-h mortality. STUDY DESIGN AND METHODS: This is a secondary data analysis from a cross-sectional prospective observational multicenter study of adult trauma patients in the Western Cape of South Africa. We compare mortality among trauma patients at risk of hemorrhage in three treatment groups: Blood Products only, FDP + Blood Products, and FDP only. We apply inverse probability of treatment weighting and fit a multivariable Cox proportional hazards model to assess the hazard of 24-h mortality. RESULTS: Four hundred and forty-eight patients were included, and 55 (12.2%) died within 24 h of hospital arrival. Compared to the Blood Products only group, we found no difference in 24-h mortality for the FDP + Blood Products group (p = .40) and a lower hazard of death for the FDP only group (hazard ratio, 0.38; 95% CI, 0.15-1.00; p = .05). However, sensitivity analyses showed no difference in 24-h mortality across treatments in subgroups with moderate and severe shock, in those with early blood product administration, and after accounting for immortal time bias. CONCLUSION: We found insufficient evidence to conclude there is a difference in relative 24-h mortality among trauma patients at risk for hemorrhage who received FDP alone, blood products alone, or blood products with FDP. There may be an adjunctive role for FDP in hemorrhagic shock resuscitation in settings with significantly restricted access to blood products.
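As an illustration of the weighting-and-survival workflow described above, the sketch below fits a propensity model, forms stabilized inverse-probability-of-treatment weights, and passes them to a weighted Cox model. The binary treatment coding, column names, and the scikit-learn/lifelines stack are assumptions made for the sketch, not the study's actual pipeline.

```python
# Minimal sketch of IPTW followed by a weighted Cox model, assuming a pandas
# DataFrame `df` with hypothetical columns: `treatment` (0/1), baseline
# covariates, `time_to_event` (hours), and `died_24h` (0/1).
import numpy as np
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

def iptw_cox(df, covariates):
    # 1) Propensity model: probability of receiving the treatment given covariates.
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treatment"])
    ps = ps_model.predict_proba(df[covariates])[:, 1]

    # 2) Stabilized inverse-probability-of-treatment weights.
    p_treat = df["treatment"].mean()
    weights = np.where(df["treatment"] == 1, p_treat / ps, (1 - p_treat) / (1 - ps))

    # 3) Weighted Cox proportional hazards model with a robust variance estimate.
    fit_df = df[["time_to_event", "died_24h", "treatment"]].assign(iptw=weights)
    cph = CoxPHFitter()
    cph.fit(fit_df, duration_col="time_to_event", event_col="died_24h",
            weights_col="iptw", robust=True)
    return cph
```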
Subjects
Freeze Drying; Hemorrhage; Plasma; Wounds and Injuries; Humans; Female; Male; Hemorrhage/mortality; Hemorrhage/therapy; Hemorrhage/etiology; Adult; Wounds and Injuries/mortality; Wounds and Injuries/therapy; Wounds and Injuries/complications; Wounds and Injuries/blood; Middle Aged; Prospective Studies; Cross-Sectional Studies; South Africa/epidemiology; Blood Component Transfusion; Resuscitation/methods
ABSTRACT
BACKGROUND: Rapidly developing tests for emerging diseases is critical for early disease monitoring. In the early stages of an epidemic, when low prevalences are expected, high-specificity tests are desired to avoid numerous false positives. Selecting a cutoff to classify positive and negative test results that has the desired operating characteristics, such as specificity, is challenging for new tests because of limited validation data with known disease status. While there is ample statistical literature on estimating quantiles of a distribution, there is limited evidence on estimating extreme quantiles from limited validation data and the resulting test characteristics in the disease testing context. METHODS: We propose using extreme value theory to select a cutoff with predetermined specificity by fitting a Pareto distribution to the upper tail of the negative controls. We compared this method to five previously proposed cutoff selection methods in a data analysis and simulation study. We analyzed COVID-19 enzyme-linked immunosorbent assay antibody test results from long-term care facilities and skilled nursing staff in Colorado between May and December of 2020. RESULTS: We found the extreme value approach had minimal bias when targeting a specificity of 0.995. Using the empirical quantile of the negative controls performed well when targeting a specificity of 0.95. The higher target specificity is preferred for overall test accuracy when prevalence is low, whereas the lower target specificity is preferred when prevalence is higher and resulted in less variable prevalence estimation. DISCUSSION: While commonly used, the normal-based methods showed considerable bias compared to the empirical and extreme value theory-based methods. CONCLUSIONS: When determining disease testing cutoffs from small training data samples, we recommend using the extreme value-based method when targeting a high specificity and the empirical quantile when targeting a lower specificity.
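A minimal sketch of the extreme value approach, assuming a peaks-over-threshold fit of a generalized Pareto distribution to the upper tail of negative-control signals; the tail fraction, function name, and simulated inputs are illustrative choices rather than the paper's exact estimator.

```python
# Sketch: pick an assay cutoff with a target specificity by fitting a
# generalized Pareto distribution (GPD) to the exceedances over a threshold
# among negative-control signals (peaks-over-threshold), under assumed inputs.
import numpy as np
from scipy import stats

def evt_cutoff(negative_controls, target_specificity=0.995, tail_frac=0.10):
    x = np.asarray(negative_controls, dtype=float)
    # Threshold at the (1 - tail_frac) empirical quantile; exceedances form the tail.
    u = np.quantile(x, 1.0 - tail_frac)
    exceedances = x[x > u] - u
    # Fit the GPD to the exceedances with location fixed at 0.
    shape, _, scale = stats.genpareto.fit(exceedances, floc=0.0)
    # We want P(X > cutoff) = tail_frac * (1 - F_GPD(cutoff - u)) = 1 - specificity.
    tail_prob = (1.0 - target_specificity) / tail_frac
    return u + stats.genpareto.ppf(1.0 - tail_prob, shape, loc=0.0, scale=scale)

# Example with simulated negative-control optical densities.
rng = np.random.default_rng(1)
neg = rng.lognormal(mean=-1.0, sigma=0.4, size=100)
print(evt_cutoff(neg))  # cutoff aiming for ~99.5% specificity
```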
Subjects
Diagnostic Tests, Routine; Humans; Sensitivity and Specificity; Bias
ABSTRACT
Women remain underrepresented among faculty in nearly all academic fields. Using a census of 245,270 tenure-track and tenured professors at United States-based PhD-granting departments, we show that women leave academia overall at higher rates than men at every career age, in large part because of strongly gendered attrition at lower-prestige institutions, in non-STEM fields, and among tenured faculty. A large-scale survey of the same faculty indicates that the reasons faculty leave are gendered, even for institutions, fields, and career ages in which retention rates are not. Women are more likely than men to feel pushed from their jobs and less likely to feel pulled toward better opportunities, and women leave or consider leaving because of workplace climate more often than work-life balance. These results quantify the systemic nature of gendered faculty retention; contextualize its relationship with career age, institutional prestige, and field; and highlight the importance of understanding the gendered reasons for attrition rather than focusing on rates alone.
ABSTRACT
OBJECTIVE: Nutrition therapy for gestational diabetes mellitus (GDM) has conventionally focused on carbohydrate restriction. In a randomized controlled trial (RCT), we tested the hypothesis that a diet (all meals provided) with liberalized complex carbohydrate (60%) and lower fat (25%) (CHOICE diet) could improve maternal insulin resistance and 24-h glycemia, resulting in reduced newborn adiposity (NB%fat; powered outcome) versus a conventional lower-carbohydrate (40%) and higher-fat (45%) (LC/CONV) diet. RESEARCH DESIGN AND METHODS: After diagnosis (at ~28-30 weeks' gestation), 59 women with diet-controlled GDM (mean ± SEM; BMI 32 ± 1 kg/m2) were randomized to a provided LC/CONV or CHOICE diet (BMI-matched calories) through delivery. At 30-31 and 36-37 weeks of gestation, a 2-h, 75-g oral glucose tolerance test (OGTT) was performed and a continuous glucose monitor (CGM) was worn for 72 h. Cord blood samples were collected at delivery. NB%fat was measured by air displacement plethysmography (13.4 ± 0.4 days). RESULTS: There were 23 women per group (LC/CONV [214 g/day carbohydrate] and CHOICE [316 g/day carbohydrate]). For LC/CONV and CHOICE, respectively (mean ± SEM), NB%fat (10.1 ± 1 vs. 10.5 ± 1), birth weight (3,303 ± 98 vs. 3,293 ± 81 g), and cord C-peptide levels were not different. Weight gain, physical activity, and gestational age at delivery were similar. At 36-37 weeks of gestation, CGM fasting (86 ± 3 vs. 90 ± 3 mg/dL), 1-h postprandial (119 ± 3 vs. 117 ± 3 mg/dL), 2-h postprandial (106 ± 3 vs. 108 ± 3 mg/dL), percent time in range (%TIR; 92 ± 1 vs. 91 ± 1), and 24-h glucose area under the curve values were similar between diets. The %time >120 mg/dL was statistically higher (8%) in CHOICE, as was the nocturnal glucose AUC; however, nocturnal %TIR (63-100 mg/dL) was not different. There were no between-group differences in OGTT glucose and insulin levels at 36-37 weeks of gestation. CONCLUSIONS: A ~100 g/day difference in carbohydrate intake did not result in between-group differences in NB%fat, cord C-peptide level, maternal 24-h glycemia, %TIR, or insulin resistance indices in diet-controlled GDM.
Subjects
Diabetes, Gestational; Insulin Resistance; Pregnancy; Female; Infant, Newborn; Humans; Adiposity; C-Peptide; Random Allocation; Blood Glucose; Obesity; Glucose; Diet, Fat-Restricted
ABSTRACT
Emergence of a novel pathogen drives the urgent need for diagnostic tests that can aid in defining disease prevalence. The limitations associated with rapid development and deployment of these tests result in a dilemma: In efforts to optimize prevalence estimates, would tests be better used in the lab to reduce uncertainty in test characteristics or to increase sample size in field studies? Here, we provide a framework to address this question through a joint Bayesian model that simultaneously analyzes lab validation and field survey data, and we define the impact of test allocation on inferences of sensitivity, specificity, and prevalence. In many scenarios, prevalence estimates can be most improved by apportioning additional effort towards validation rather than to the field. The joint model provides superior estimation of prevalence, sensitivity, and specificity, compared with typical analyses that model lab and field data separately, and it can be used to inform sample allocation when testing is limited.
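The core idea of the joint model can be sketched as follows, assuming PyMC and illustrative counts: the lab validation arms inform sensitivity and specificity, the field arm observes the apparent prevalence implied by all three parameters, and everything is estimated in one posterior. Priors and counts here are placeholders, not the paper's specification.

```python
# Sketch of a joint Bayesian model that pools lab validation data (known
# positives/negatives) with a field survey, so sensitivity, specificity, and
# prevalence are estimated simultaneously. Counts below are illustrative.
import pymc as pm

# Lab validation data: known-positive and known-negative control samples.
n_pos_controls, x_pos = 60, 54      # test-positive among known positives
n_neg_controls, x_neg = 100, 97     # test-negative among known negatives
# Field survey data.
n_field, x_field = 1000, 80         # test-positive among field samples

with pm.Model() as joint_model:
    sens = pm.Beta("sensitivity", 1, 1)
    spec = pm.Beta("specificity", 1, 1)
    prev = pm.Beta("prevalence", 1, 1)

    # Lab arms inform sensitivity and specificity directly.
    pm.Binomial("lab_pos", n=n_pos_controls, p=sens, observed=x_pos)
    pm.Binomial("lab_neg", n=n_neg_controls, p=spec, observed=x_neg)

    # Field arm sees the *apparent* prevalence (true and false positives).
    p_apparent = pm.Deterministic("p_apparent", prev * sens + (1 - prev) * (1 - spec))
    pm.Binomial("field_pos", n=n_field, p=p_apparent, observed=x_field)

    idata = pm.sample(2000, tune=1000, chains=4, random_seed=7)
```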
Subjects
Sensitivity and Specificity; Bayes Theorem; Prevalence; Sample Size
ABSTRACT
BACKGROUND: Deaths due to injuries exceed 4.4 million annually, with over 90% occurring in low- and middle-income countries. A key contributor to high trauma mortality is prolonged trauma-to-treatment time. Earlier receipt of medical care following an injury is critical to better patient outcomes. Trauma epidemiological studies can identify gaps and opportunities to help strengthen emergency care systems globally, especially in lower income countries, and among military personnel wounded in combat. This paper describes the methodology of the "Epidemiology and Outcomes of Prolonged Trauma Care (EpiC)" study, which aims to investigate how the delivery of resuscitative interventions and their timeliness impacts the morbidity and mortality outcomes of patients with critical injuries in South Africa. METHODS: The EpiC study is a prospective, multicenter cohort study that will be implemented over a 6-year period in the Western Cape, South Africa. Data collected will link pre- and in-hospital care with mortuary reports through standardized clinical chart abstraction and will provide longitudinal documentation of the patient's clinical course after injury. The study will enroll an anticipated sample of 14,400 injured adults. Survival and regression analysis will be used to assess the effects of critical early resuscitative interventions (airway, breathing, circulatory, and neurologic) and trauma-to-treatment time on the primary 7-day mortality outcome and secondary mortality (24-h, 30-day) and morbidity outcomes (need for operative interventions, secondary infections, and organ failure). DISCUSSION: This study is the first effort in the Western Cape of South Africa to build a standardized, high-quality, multicenter epidemiologic trauma dataset that links pre- and in-hospital care with mortuary data. In high-income countries and the U.S. military, the introduction of trauma databases and registries has led to interventions that significantly reduce post-injury death and disability. The EpiC study will describe epidemiological trends over time, and it will enable assessments of how trauma care and system processes directly impact trauma outcomes to ultimately improve the overall emergency care system. TRIAL REGISTRATION: Not applicable as this study is not a clinical trial.
Subjects
Emergency Medical Services; Wounds and Injuries; Adult; Cohort Studies; Humans; Prospective Studies; Registries; South Africa/epidemiology; Wounds and Injuries/epidemiology; Wounds and Injuries/therapy
ABSTRACT
The COVID-19 pandemic severely impacted long-term care facilities, resulting in the death of approximately 8% of residents nationwide as of March 2021. As COVID-19 case rates declined and state and county restrictions were lifted in spring 2021, facility managers and local and state health agencies were challenged with defining their own policies moving forward to appropriately mitigate disease transmission. The continued emergence of variants of concern and variable vaccine uptake across facilities highlighted the need for a readily available tool that can be employed at the facility level to determine best practices for mitigation and ensure resident and staff safety. To assist leadership in determining the impact of various infection surveillance and response strategies, we developed an agent-based model and an online dashboard interface that simulates COVID-19 infection within congregate care settings under various mitigation measures. This dashboard quantifies the continued risk for COVID-19 infections within a facility given a designated testing schedule and vaccine requirements. Key findings were that choice of COVID-19 diagnostic (e.g., nasal swab qRT-PCR vs. rapid antigen) and testing cadence have less impact on attack rate and staff workdays missed than do vaccination rates among staff and residents. Specifically, low vaccine uptake among staff at long-term care facilities puts staff and residents at risk of ongoing COVID-19 outbreaks. Here we present our model and dashboard as an exemplar of a tool for state public health officials and facility directors to gain insights from an infectious disease model that can directly inform policy decisions in the midst of a pandemic.
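A toy version of such an agent-based simulation is sketched below; all parameters (transmission probability, vaccine protection, test sensitivity, testing cadence) are illustrative placeholders rather than the dashboard's calibrated values.

```python
# Toy agent-based sketch of facility transmission with surveillance testing,
# isolation of detected cases, and vaccination. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_staff=50, n_residents=100, staff_vax=0.4, resident_vax=0.8,
             beta=0.03, vax_protection=0.7, test_every=3, test_sens=0.85,
             days=90, infectious_days=7):
    n = n_staff + n_residents
    vaccinated = np.concatenate([rng.random(n_staff) < staff_vax,
                                 rng.random(n_residents) < resident_vax])
    infectious_left = np.zeros(n, dtype=int)   # days of infectiousness remaining
    recovered = np.zeros(n, dtype=bool)
    isolated = np.zeros(n, dtype=bool)
    infectious_left[rng.integers(0, n_staff)] = infectious_days  # index case: a staff member

    for day in range(days):
        # Transmission pressure scales with infectious, non-isolated agents.
        n_contagious = np.sum((infectious_left > 0) & ~isolated)
        p_inf = 1 - (1 - beta) ** n_contagious
        susceptible = (infectious_left == 0) & ~recovered
        risk = np.where(vaccinated, p_inf * (1 - vax_protection), p_inf)
        new_inf = susceptible & (rng.random(n) < risk)
        infectious_left[new_inf] = infectious_days

        # Surveillance testing: infectious agents test positive with test_sens.
        if day % test_every == 0:
            detected = (infectious_left > 0) & (rng.random(n) < test_sens)
            isolated |= detected

        # Progress infections; agents finishing infection recover and leave isolation.
        ending = infectious_left == 1
        recovered |= ending
        isolated &= ~ending
        infectious_left = np.maximum(infectious_left - 1, 0)

    # Attack rate: fraction ever infected over the simulated period.
    return float(np.mean(recovered | (infectious_left > 0)))

print(f"attack rate: {simulate():.2f}")
```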
ABSTRACT
INTRODUCTION: The infection fatality rate (IFR) of COVID-19 has been carefully measured and analysed in high-income countries, whereas there has been no systematic analysis of age-specific seroprevalence or IFR for developing countries. METHODS: We systematically reviewed the literature to identify all COVID-19 serology studies in developing countries that were conducted using representative samples collected by February 2021. For each of the antibody assays used in these serology studies, we identified data on assay characteristics, including the extent of seroreversion over time. We analysed the serology data using a Bayesian model that incorporates conventional sampling uncertainty as well as uncertainties about assay sensitivity and specificity. We then calculated IFRs using individual case reports or aggregated public health updates, including age-specific estimates whenever feasible. RESULTS: In most locations in developing countries, seroprevalence among older adults was similar to that of younger age cohorts, underscoring the limited capacity that these nations have to protect older age groups. Age-specific IFRs were roughly 2 times higher than in high-income countries. The median value of the population IFR was about 0.5%, similar to that of high-income countries, because disparities in healthcare access were roughly offset by differences in population age structure. CONCLUSION: The burden of COVID-19 is far higher in developing countries than in high-income countries, reflecting a combination of elevated transmission to middle-aged and older adults as well as limited access to adequate healthcare. These results underscore the critical need to ensure medical equity to populations in developing countries through provision of vaccine doses and effective medications.
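The arithmetic underlying a test-adjusted IFR can be sketched as below (a Rogan-Gladen correction followed by a deaths-over-infections ratio); the full Bayesian model described above additionally handles seroreversion and assay uncertainty, and the inputs here are illustrative only.

```python
# Minimal arithmetic behind a test-adjusted IFR estimate: correct raw
# seropositivity for assay sensitivity/specificity (Rogan-Gladen), then divide
# cumulative deaths by the implied number of infections. Inputs are illustrative.
def adjusted_prevalence(raw_positive_rate, sensitivity, specificity):
    adj = (raw_positive_rate + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(adj, 0.0), 1.0)  # clamp to [0, 1]

def infection_fatality_rate(deaths, population, raw_positive_rate,
                            sensitivity, specificity):
    prevalence = adjusted_prevalence(raw_positive_rate, sensitivity, specificity)
    infections = prevalence * population
    return deaths / infections

# Example: 6.2% raw seropositivity, assay sensitivity 0.90 / specificity 0.99,
# 600 cumulative deaths in a population of 2 million.
ifr = infection_fatality_rate(600, 2_000_000, 0.062, 0.90, 0.99)
print(f"adjusted IFR: {100 * ifr:.2f}%")
```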
Subjects
COVID-19; Developing Countries; Aged; Bayes Theorem; COVID-19/epidemiology; Health Services Accessibility; Humans; Middle Aged; Public Policy; Seroepidemiologic Studies
ABSTRACT
Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) has caused millions of deaths around the world within the past 2 years. Transmission within the United States has been heterogeneously distributed by geography and social factors with little data from North Carolina. Here, we describe results from a weekly cross-sectional study of 12,471 unique hospital remnant samples from 19 April to 26 December 2020 collected by four clinical sites within the University of North Carolina Health system, with a majority of samples from urban, outpatient populations in central North Carolina. We employed a Bayesian inference model to calculate SARS-CoV-2 spike protein immunoglobulin prevalence estimates and conditional odds ratios for seropositivity. Furthermore, we analyzed a subset of these seropositive samples for neutralizing antibodies. We observed an increase in seroprevalence from 2.9% (95% confidence interval [CI], 1.8 to 4.5) to 12.8% (95% CI, 10.6 to 15.2) over the course of the study. Latinx individuals had the highest odds ratio of SARS-CoV-2 exposure at 6.56 (95% CI, 4.66 to 9.44). Our findings aid in quantifying the degree of asymmetric SARS-CoV-2 exposure by ethnoracial grouping. We also find that 49% of a subset of seropositive individuals had detectable neutralizing antibodies, which was skewed toward those with recent respiratory infection symptoms. IMPORTANCE: PCR-confirmed SARS-CoV-2 cases underestimate true prevalence. Few robust community-level SARS-CoV-2 ethnoracial and overall prevalence estimates have been published for North Carolina in 2020. Mortality has been concentrated among ethnoracial minorities and may result from a high likelihood of SARS-CoV-2 exposure, which we observe was particularly high among Latinx individuals in North Carolina. Additionally, neutralizing antibody titers are a known correlate of protection. Our observation that development of SARS-CoV-2 neutralizing antibodies may be inconsistent and dependent on severity of symptoms makes vaccination a high priority despite prior exposure.
Subjects
COVID-19; SARS-CoV-2; Antibodies, Neutralizing; Bayes Theorem; COVID-19/epidemiology; Cross-Sectional Studies; Humans; North Carolina/epidemiology; Seroepidemiologic Studies; Spike Glycoprotein, Coronavirus
ABSTRACT
During early phases of the SARS-CoV-2 epidemic, many research laboratories repurposed their efforts towards developing diagnostic testing that could aid public health surveillance while commercial and public diagnostic laboratories developed capacity and validated large scale testing methods. Simultaneously, the rush to produce point-of-care and diagnostic facility testing resulted in FDA Emergency Use Authorizations based on scarce and poorly validated clinical samples. Here, we review serologic test results from 186 serum samples collected in early phases of the pandemic (May 2020) from skilled nursing facilities tested with six laboratory-based and two commercially available assays. Serum neutralization titers were used to set cutoff values using positive-to-negative ratio (P/N) analysis to account for batch effects. We found that laboratory-based receptor binding domain (RBD) binding assays had equivalent or superior sensitivity and specificity compared to commercially available tests. We also determined the seroconversion rate and compared it with qPCR outcomes. Our work suggests that research laboratory assays can contribute reliable surveillance information and should be considered important adjuncts to commercial laboratory testing facilities during early phases of disease outbreaks.
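A sketch of the positive-to-negative (P/N) normalization and the resulting sensitivity/specificity calculation is shown below, assuming a pandas DataFrame with hypothetical column names; the cutoff itself would come from neutralization-based reference status as described above.

```python
# Sketch of a positive-to-negative (P/N) normalization to reduce plate/batch
# effects, plus sensitivity/specificity of a chosen P/N cutoff against a
# reference status (e.g., neutralization-positive). Column names are assumed.
import pandas as pd

def add_pn_ratio(df, value_col="od", batch_col="plate", control_col="is_negative_control"):
    # Divide each sample's signal by the mean negative-control signal on its plate.
    neg_means = (df[df[control_col]]
                 .groupby(batch_col)[value_col]
                 .mean()
                 .rename("neg_mean"))
    out = df.join(neg_means, on=batch_col)
    out["pn_ratio"] = out[value_col] / out["neg_mean"]
    return out

def sens_spec(df, cutoff, ratio_col="pn_ratio", truth_col="reference_positive"):
    called_pos = df[ratio_col] >= cutoff
    truth = df[truth_col].astype(bool)
    sensitivity = (called_pos & truth).sum() / truth.sum()
    specificity = (~called_pos & ~truth).sum() / (~truth).sum()
    return sensitivity, specificity
```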
Subjects
COVID-19; SARS-CoV-2; Humans; COVID-19/diagnosis; COVID-19 Testing; Clinical Laboratory Techniques/methods; Antibodies, Viral; Sensitivity and Specificity; Serologic Tests/methods
ABSTRACT
SARS-CoV-2 has had a disproportionate impact on nonhospital health care settings, such as long-term care facilities (LTCFs). The communal nature of these facilities, paired with the high-risk profile of residents, has resulted in thousands of infections and deaths and a high case fatality rate. To detect presymptomatic infections and identify infected workers, we performed weekly surveillance testing of staff at two LTCFs, which revealed a large outbreak at one of the sites. We collected serum from staff members throughout the study and evaluated it for binding and neutralization to measure seroprevalence, seroconversion, and type and functionality of antibodies. At the site with very few incident infections, we detected that over 40% of the staff had preexisting SARS-CoV-2 neutralizing antibodies, suggesting prior exposure. At the outbreak site, we saw rapid seroconversion following infection. Neutralizing antibody levels were stable for many weeks following infection, suggesting a durable, long-lived response. Receptor-binding domain antibodies and neutralizing antibodies were strongly correlated. The site with high seroprevalence among staff had two unique introductions of SARS-CoV-2 into the facility through seronegative infected staff during the period of study, but these did not result in workplace spread or outbreaks. Together, our results suggest that a high seroprevalence rate among staff can contribute to immunity within a workplace and protect against subsequent infection and spread within a facility. IMPORTANCE: Long-term care facilities (LTCFs) have been disproportionately impacted by COVID-19 due to their communal nature and high-risk profile of residents. LTCF staff have the ability to introduce SARS-CoV-2 into the facility, where it can spread, causing outbreaks. We tested staff weekly at two LTCFs and collected blood throughout the study to measure SARS-CoV-2 antibodies. One site had a large outbreak and infected individuals rapidly generated antibodies after infection. At the other site, almost half the staff already had antibodies, suggesting prior infection. The majority of these antibodies bind to the receptor-binding domain of the SARS-CoV-2 spike protein and are potently neutralizing and stable for many months. The non-outbreak site had two unique introductions of SARS-CoV-2 into the facility, but these did not result in workplace spread or outbreaks. Our results reveal that high seroprevalence among staff can contribute to immunity and protect against subsequent infection and spread within a facility.
Subjects
Antibody Formation; COVID-19/epidemiology; COVID-19/immunology; Disease Outbreaks; Long-Term Care; Antibodies, Neutralizing/immunology; Antibodies, Viral/blood; Antibodies, Viral/immunology; Asymptomatic Infections/epidemiology; Binding Sites, Antibody; COVID-19 Testing; Humans; Monitoring, Immunologic; RNA, Viral; SARS-CoV-2/genetics; SARS-CoV-2/immunology; Sensitivity and Specificity; Seroepidemiologic Studies; Spike Glycoprotein, Coronavirus/immunology
ABSTRACT
Importance: Detailed analysis of infection rates paired with behavioral and employee-reported risk factors is vital to understanding how transmission of SARS-CoV-2 infection may be exacerbated or mitigated in the workplace. Institutions of higher education are heterogeneous work units that supported continued in-person employment during the COVID-19 pandemic, providing a test site for occupational health evaluation. Objective: To evaluate the association between self-reported protective behaviors and prevalence of SARS-CoV-2 infection among essential in-person employees during the first 6 months of the COVID-19 pandemic in the US. Design, Setting, and Participants: This cross-sectional study was conducted from July 13 to September 2, 2020, at an institution of higher education in Fort Collins, Colorado. Employees 18 years or older without symptoms of COVID-19 who identified as essential in-person workers during the first 6 months of the pandemic were included. Participants completed a survey, and blood and nasal swab samples were collected to assess active SARS-CoV-2 infection via quantitative reverse transcriptase-polymerase chain reaction (qRT-PCR) and past infection by serologic testing. Exposure: Self-reported practice of protective behaviors against COVID-19 according to public health guidelines provided to employees. Main Outcomes and Measures: Prevalence of current SARS-CoV-2 infection detected by qRT-PCR or previous SARS-CoV-2 infection detected by an IgG SARS-CoV-2 testing platform. The frequency of protective behavior practices and essential workers' concerns regarding contracting COVID-19 and exposing others were measured based on survey responses. Results: Among 508 participants (305 [60.0%] women, 451 [88.8%] non-Hispanic White individuals; mean [SD] age, 41.1 [12.5] years), there were no qRT-PCR positive test results, and only 2 participants (0.4%) had seroreactive IgG antibodies. Handwashing and mask wearing were reported frequently both at work (480 [94.7%] and 496 [97.8%] participants, respectively) and outside work (465 [91.5%] and 481 [94.7%] participants, respectively). Social distancing was reported less frequently at work (403 [79.5%]) than outside work (465 [91.5%]) (P < .001). Participants were more highly motivated to avoid exposures because of concern about spreading the infection to others (419 [83.0%]) than for personal protection (319 [63.2%]) (P < .001). Conclusions and Relevance: In this cross-sectional study of essential workers at an institution of higher education, when employees reported compliance with public health practices both at and outside work, they were able to operate safely in their work environment during the COVID-19 pandemic.
Subjects
COVID-19; Pandemics; Public Health; SARS-CoV-2; Social Behavior; Universities; Workplace; Adult; COVID-19/blood; COVID-19/prevention & control; COVID-19/transmission; COVID-19/virology; COVID-19 Testing; Colorado; Communicable Disease Control; Cross-Sectional Studies; Female; Guidelines as Topic; Humans; Immunoglobulin G/blood; Male; Middle Aged; Occupational Health; Polymerase Chain Reaction; SARS-CoV-2/growth & development; SARS-CoV-2/immunology; Self Report
ABSTRACT
Background: Robust community-level SARS-CoV-2 prevalence estimates have been difficult to obtain in the American South and outside of major metropolitan areas. Furthermore, though some previous studies have investigated the association of demographic factors such as race with SARS-CoV-2 exposure risk, fewer have correlated exposure risk to surrogates for socioeconomic status such as health insurance coverage. Methods: We used a highly specific serological assay utilizing the receptor binding domain of the SARS-CoV-2 spike-protein to identify SARS-CoV-2 antibodies in remnant blood samples collected by the University of North Carolina Health system. We estimated the prevalence of SARS-CoV-2 in this cohort with Bayesian regression, as well as the association of critical demographic factors with higher prevalence odds. Findings: Between April 21st and October 3rd of 2020, a total of 9,624 unique samples were collected from clinical sites in central NC, and we observed a seroprevalence increase from 2·9% (1·7, 4·3) to 9·1% (7·2, 11·1) over the study period. Individuals who identified as Latinx were associated with the highest odds ratio of SARS-CoV-2 exposure at 7·77 overall (5·20, 12·10). Increased odds were also observed among Black individuals and individuals without public or private health insurance. Interpretation: Our data suggest that for this care-accessing cohort, SARS-CoV-2 seroprevalence was significantly higher than cumulative total cases reported for the study geographical area six months into the COVID-19 pandemic in North Carolina. The increased odds of seropositivity by ethnoracial grouping as well as health insurance highlight the urgent and ongoing need to address underlying health and social disparities in these populations.
ABSTRACT
Establishing how many people have been infected by SARS-CoV-2 remains an urgent priority for controlling the COVID-19 pandemic. Serological tests that identify past infection can be used to estimate cumulative incidence, but the relative accuracy and robustness of various sampling strategies have been unclear. We developed a flexible framework that integrates uncertainty from test characteristics, sample size, and heterogeneity in seroprevalence across subpopulations to compare estimates from sampling schemes. Using the same framework and making the assumption that seropositivity indicates immune protection, we propagated estimates and uncertainty through dynamical models to assess uncertainty in the epidemiological parameters needed to evaluate public health interventions and found that sampling schemes informed by demographics and contact networks outperform uniform sampling. The framework can be adapted to optimize serosurvey design given test characteristics and capacity, population demography, sampling strategy, and modeling approach, and can be tailored to support decision-making around introducing or removing interventions.
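One way to see the propagation step is the sketch below: each posterior draw of seroprevalence seeds a simple SIR projection, and uncertainty bands are read off the resulting ensemble. The SIR parameters and the stand-in posterior draws are illustrative, and the framework in the paper is more general than this caricature.

```python
# Sketch: propagate uncertainty in estimated seroprevalence through a simple
# SIR model by simulating one trajectory per posterior draw. The beta/gamma
# values and the posterior draws are illustrative placeholders.
import numpy as np

def sir_trajectory(seroprevalence, beta=0.25, gamma=0.10, i0=1e-4, days=180):
    # Seropositive individuals are treated as immune (recovered) at t = 0.
    s, i, r = 1.0 - seroprevalence - i0, i0, seroprevalence
    infected_curve = []
    for _ in range(days):
        new_inf = beta * s * i
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        infected_curve.append(i)
    return np.array(infected_curve)

rng = np.random.default_rng(3)
posterior_seroprev = rng.beta(20, 180, size=500)  # stand-in for posterior draws (~10%)
curves = np.stack([sir_trajectory(p) for p in posterior_seroprev])
peak_low, peak_med, peak_high = np.quantile(curves.max(axis=1), [0.025, 0.5, 0.975])
print(f"projected peak infectious fraction: {peak_med:.3f} "
      f"(95% interval {peak_low:.3f}-{peak_high:.3f})")
```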
Subjects
COVID-19/epidemiology; Adolescent; Adult; Age Factors; Aged; Bayes Theorem; COVID-19/diagnosis; COVID-19 Serological Testing; Child; Child, Preschool; Humans; Infant; Infant, Newborn; Middle Aged; Pandemics; SARS-CoV-2/isolation & purification; Seroepidemiologic Studies; Uncertainty; Young Adult
ABSTRACT
Many existing statistical and machine learning tools for social network analysis focus on a single level of analysis. Methods designed for clustering optimize a global partition of the graph, whereas projection-based approaches (e.g., the latent space model in the statistics literature) represent in rich detail the roles of individuals. Many pertinent questions in sociology and economics, however, span multiple scales of analysis. Further, many questions involve comparisons across disconnected graphs that will inevitably be of different sizes, either due to missing data or the inherent heterogeneity in real-world networks. We propose a class of network models that represent network structure on multiple scales and facilitate comparison across graphs with different numbers of individuals. These models differentially invest modeling effort within subgraphs of high density, often termed communities, while maintaining a parsimonious structure between said subgraphs. We show that our model class is projective, highlighting an ongoing discussion in the social network modeling literature on the dependence of inference paradigms on the size of the observed graph. We illustrate the utility of our method using data on household relations from Karnataka, India. Supplementary material for this article is available online.
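The flavor of the model class can be conveyed with a small generative sketch: a latent-space rule governs ties within each dense community, while a single low baseline probability governs ties between communities. This is only an illustration of the multiscale idea, not the article's exact specification.

```python
# Generative sketch of a multiscale network: a latent-space model *within* each
# community (richer structure where ties are dense) and a single low baseline
# probability *between* communities.
import numpy as np

rng = np.random.default_rng(42)

def simulate_multiscale_graph(community_sizes=(30, 40, 50), dim=2,
                              alpha=1.0, p_between=0.01):
    n = sum(community_sizes)
    community = np.repeat(np.arange(len(community_sizes)), community_sizes)
    z = rng.normal(size=(n, dim))  # latent positions, used only within communities
    adj = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if community[i] == community[j]:
                # Within-community tie probability decays with latent distance.
                dist = np.linalg.norm(z[i] - z[j])
                p = 1.0 / (1.0 + np.exp(-(alpha - dist)))  # logistic link
            else:
                p = p_between  # parsimonious between-community structure
            if rng.random() < p:
                adj[i, j] = adj[j, i] = 1
    return adj, community

adj, community = simulate_multiscale_graph()
print("edges:", adj.sum() // 2)
```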
ABSTRACT
We propose an application of network analysis to determine which traits and behaviors predict fertilizations within and between populations. This approach quantifies how reproductive behavior between individuals shapes patterns of selection and gene flow, filling an important gap in our understanding of the connection between evolutionary processes and emergent patterns.
Subjects
Genetic Speciation; Reproductive Behavior; Biological Evolution; Gene Flow; Phenotype; Reproductive Isolation
ABSTRACT
International environmental treaties are the key means by which states overcome collective action problems and make specific commitments to address environmental issues. However, systematically assessing states' influence in promoting global environmental protection has proven difficult. Analyzing newly compiled data with a purpose-built statistical model, we provide a novel measurement of state influence within the scope of environmental politics and find strong influences among states and treaties. Specifically, we report evidence that states are less likely to ratify when states within their region ratify, and results suggesting that countries positively influence other countries at similar levels of economic development. By examining several prominent treaties, we illustrate the complex nature of influence: a single act of ratification can dramatically reshape global environmental politics. More generally, our findings and approach provide an innovative means to understand the evolution and complexity of international environmental protection.
Subjects
Conservation of Natural Resources/legislation & jurisprudence; Global Health; International Cooperation; Politics; Developing Countries; Humans
ABSTRACT
Animals use morphological signals such as ornamental traits or weaponry to mediate social interactions, and the extent of signal trait elaboration is often positively associated with reproductive success. By demonstrating relationships between signal traits and fitness, researchers often make inferences about how behaviour operates to shape those outcomes. However, detailed information about fine-scale individual behaviour, and its physiological basis, can be difficult to obtain. Here we show that experimental manipulations to exaggerate a signal trait (plumage colour) and concomitant changes in testosterone and stress-induced corticosterone levels altered social interactivity between manipulated males and their social mates. On average, darkened males did not have higher levels of interactivity than unmanipulated males; however, males who experienced a greater shift in colour (pale to dark), a larger, positive change in testosterone levels, and a dampened stress-induced corticosterone response had a larger increase in the number of interactions with their social mate post-manipulation compared to pre-manipulation. This work provides new insights into the integration and real-time flexibility of multivariate phenotypes and direct evidence for the role of social interactions in pair bond maintenance.
Subjects
Birds/physiology; Phenotype; Sexual Behavior, Animal; Animals; Body Size; Color; Corticosterone/blood; Feathers/anatomy & histology; Testosterone/blood
ABSTRACT
Relational event data, which consist of events involving pairs of actors over time, are now commonly available at the finest of temporal resolutions. Existing continuous-time methods for modeling such data are based on point processes and directly model interaction "contagion," whereby one interaction increases the propensity of future interactions among actors, often as dictated by some latent variable structure. In this article, we present an alternative approach to using temporal-relational point process models for continuous-time event data. We characterize interactions between a pair of actors as either spurious or as resulting from an underlying, persistent connection in a latent social network. We argue that consistent deviations from expected behavior, rather than solely high frequency counts, are crucial for identifying well-established underlying social relationships. This study aims to explore these latent network structures in two contexts: one comprising college students and another involving barn swallows.
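The intuition that persistent ties reveal themselves through consistent deviation from a baseline, rather than through raw counts alone, can be caricatured as below: score each pair by the Poisson tail probability of its observed count under a shared baseline rate. The article's actual latent-network point process model is considerably richer than this sketch.

```python
# Caricature of flagging persistent ties as consistent deviations from a shared
# baseline interaction rate: score each pair by the Poisson tail probability of
# its count. The baseline here is averaged over observed pairs only.
from collections import Counter
from scipy import stats

def flag_persistent_pairs(events, total_time, alpha=0.01):
    """events: iterable of (sender, receiver) interaction records."""
    counts = Counter(tuple(sorted(pair)) for pair in events)
    n_pairs = len(counts)
    baseline_rate = sum(counts.values()) / (n_pairs * total_time)  # events per pair per unit time
    flagged = {}
    for pair, k in counts.items():
        # Probability of seeing >= k events under the shared baseline Poisson rate.
        p_tail = stats.poisson.sf(k - 1, baseline_rate * total_time)
        if p_tail < alpha:
            flagged[pair] = p_tail
    return flagged

events = [("a", "b")] * 30 + [("a", "c")] * 3 + [("b", "c")] * 2
print(flag_persistent_pairs(events, total_time=10.0))
```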
ABSTRACT
BACKGROUND: Calculus is a foundational course for STEM-intending students yet has been shown to dissuade students from pursuing STEM degrees. In this report, we examine factors related to students and instructors reporting a lack of time in class for students to understand difficult ideas and relate this to students' and instructors' perceptions of opportunities to learn using a hierarchical linear model. This work is part of the US national study on college calculus, which provides an ideal landscape to examine these questions on a large scale. RESULTS: We find a number of student factors associated with students experiencing negative opportunities to learn, such as student gender, lacking previous calculus experience, and reports of poor and non-student-centered teaching. Factors weakly associated with instructor reports of lack of time were a common final and reporting that approximately half of the students lacked the ability to succeed in the course. CONCLUSIONS: This analysis offers insight into how we might create more positive opportunities to learn in our own classrooms. This includes preparing students before they enter calculus, so they feel confident in their abilities, as well as weakening the internal framing of the course by engaging in teaching practices that provide students opportunities to communicate and influence their learning (e.g., discussion and group work). We argue that this is especially important in introductory college calculus courses that are packed with material, taught to a diverse population of students in terms of demographics, mathematical preparation, and career goals.
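A sketch of the kind of two-level model used here, with student-level predictors and a random intercept for section/instructor, is shown below using statsmodels on synthetic stand-in data; the variable names are placeholders, not the survey's actual items.

```python
# Sketch of a two-level hierarchical linear model: student-level reports of
# opportunity to learn, with a random intercept for the section/instructor.
# The data and column names below are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_sections, n_per = 40, 25
section_id = np.repeat(np.arange(n_sections), n_per)
section_effect = rng.normal(0, 0.5, n_sections)[section_id]  # level-2 variation

df = pd.DataFrame({
    "section_id": section_id,
    "female": rng.integers(0, 2, n_sections * n_per),
    "prior_calculus": rng.integers(0, 2, n_sections * n_per),
    "student_centered_teaching": rng.normal(0, 1, n_sections * n_per),
})
df["opportunity_to_learn"] = (
    0.3 * df["prior_calculus"] + 0.4 * df["student_centered_teaching"]
    - 0.2 * df["female"] + section_effect + rng.normal(0, 1, len(df))
)

# Random-intercept model: students (level 1) nested within sections (level 2).
model = smf.mixedlm(
    "opportunity_to_learn ~ female + prior_calculus + student_centered_teaching",
    data=df, groups=df["section_id"],
)
print(model.fit().summary())
```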