ABSTRACT
The 2020 general election occurred while many parts of the nation were under emergency orders related to the COVID-19 pandemic. This led to new requirements and considerations for voting systems. We introduce a model of the voting process that captures pandemic-related changes. Using a discrete event simulation case study of Milwaukee, WI, we study how to design in-person voting systems whose performance is robust to pandemic conditions, such as the protective measures implemented during the COVID-19 pandemic. We assess various voting system designs in terms of voter wait times, voter sojourn times, line lengths at polling locations, the time voters spend indoors, and the number of voters indoors. The analysis indicates that poll worker shortages, social distancing, and personal protective equipment usage and sanitation measures can lead to extremely long voter wait times. We consider several design choices for mitigating the impact of pandemic-related changes on these voting metrics. The case study suggests that long wait times can be avoided by staffing additional check-in locations, expanding early voting, and avoiding consolidated polling locations. Additionally, the analysis suggests that implementing a priority queue discipline has the potential to reduce waiting times for vulnerable populations with increased susceptibility to the health risks associated with in-person voting.
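To make the queueing mechanics concrete, the following is a minimal Python sketch of a multi-server check-in queue. It assumes Poisson arrivals and exponential service times with illustrative parameters (not the Milwaukee data), and it is a simplified stand-in for, not a reproduction of, the paper's discrete event simulation; it illustrates how fewer check-in stations combined with longer, sanitation-slowed service times can produce very long waits.

import heapq
import random

def simulate_checkin(n_voters, arrival_rate, mean_service, n_stations, seed=0):
    """Simulate a FIFO multi-server check-in queue and return voter wait times.

    Arrivals follow a Poisson process (exponential inter-arrival times) and
    service times are exponential; both are illustrative modeling assumptions.
    """
    rng = random.Random(seed)
    # Next-free times of the check-in stations (all idle at time zero).
    free_at = [0.0] * n_stations
    heapq.heapify(free_at)
    t, waits = 0.0, []
    for _ in range(n_voters):
        t += rng.expovariate(arrival_rate)        # arrival time of next voter
        station_free = heapq.heappop(free_at)     # earliest available station
        start = max(t, station_free)              # voter waits if all stations are busy
        waits.append(start - t)
        heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_service))
    return waits

# Hypothetical example: 1,000 voters arriving at 2 per minute, 3-minute
# check-ins (slowed by sanitation protocols), with 5 versus 8 stations.
for c in (5, 8):
    w = simulate_checkin(1000, arrival_rate=2.0, mean_service=3.0, n_stations=c)
    print(c, "stations: mean wait", round(sum(w) / len(w), 1), "minutes")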
ABSTRACT
In recent years, there have been growing concerns about risks in the United States federal information technology (IT) supply chains that protect cyber infrastructure. A critical need faced by decision-makers is to prioritize investment in security mitigations so as to maximally reduce risk in IT supply chains. We extend existing stochastic expected budgeted maximum multiple coverage models, which identify solutions that are "good" on average but may be unacceptable in certain circumstances. We propose three alternative models that use different robustness methods to hedge against worst-case risks: models that maximize the worst-case coverage, minimize the worst-case regret, and maximize the average coverage in the (1 − α) worst cases (conditional value at risk). We illustrate the solutions of the robust models with a case study and discuss the insights they provide into mitigation selection compared to an expected-value maximizer. Our study provides valuable tools and insights for decision-makers with different risk attitudes to manage cybersecurity risks under uncertainty.
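For readers who want the shape of these robustness criteria, one common way to write them over a finite scenario set S is sketched below in generic notation (x_j = 1 if mitigation j is funded, a_j its cost, B the budget, and c_s(x) the coverage achieved in scenario s). This is an illustration of the three criteria under uniform scenario probabilities, not a restatement of the paper's exact formulations.

\begin{align*}
\text{(worst-case coverage)} \quad & \max_{x}\ \min_{s \in S}\ c_s(x) \\
\text{(worst-case regret)} \quad & \min_{x}\ \max_{s \in S}\ \bigl( c_s^{*} - c_s(x) \bigr),
  \qquad c_s^{*} = \max_{x'} c_s(x') \\
\text{(conditional value at risk)} \quad & \max_{x,\,\eta}\ \eta \;-\;
  \frac{1}{(1-\alpha)\,|S|} \sum_{s \in S} \bigl[\eta - c_s(x)\bigr]^{+} \\
\text{subject to} \quad & \sum_{j} a_j x_j \le B, \qquad x_j \in \{0,1\}.
\end{align*}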
ABSTRACT
Emergency medical services provide immediate care to patients with various types of needs. When the system is congested, the response to urgent emergency calls can be delayed. To address this issue, we propose a spatial Hypercube approximation model with a cutoff priority queue that estimates performance measures for a system in which some servers are reserved exclusively for high-priority calls when the system is congested. In the cutoff priority queue, low-priority calls are not served immediately (they are either lost or placed in a queue) whenever the number of busy ambulances is equal to or greater than the cutoff. The spatial Hypercube approximation model can be used to evaluate the design of public safety systems that employ a cutoff priority queue. A mixed-integer linear programming model uses the Hypercube model to identify deployment and dispatch decisions in a cutoff priority queue paradigm. Our computational study suggests that the improvement in expected coverage is significant when the cutoff is imposed, and it elucidates the tradeoff between this coverage improvement and the cost to low-priority calls that are "lost" when a cutoff is used. Finally, we present a method for selecting the cutoff value for a system based on the relative importance of low-priority calls to high-priority calls.
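As a non-spatial illustration of the cutoff mechanism (and not the paper's spatial Hypercube approximation), the short Python sketch below treats the fleet as an M/M/N loss system in which low-priority calls are simply lost once the number of busy ambulances reaches the cutoff. All parameter values are hypothetical; the point is the tradeoff between high-priority blocking and lost low-priority calls.

def cutoff_loss_probs(lam_high, lam_low, mu, n_ambulances, cutoff):
    """Steady-state analysis of an M/M/N loss system with a dispatch cutoff.

    State n = number of busy ambulances. Low-priority calls are accepted only
    while n < cutoff (otherwise lost); high-priority calls are accepted while
    n < n_ambulances. This is a simplified, non-spatial sketch of the idea.
    """
    # Unnormalized birth-death probabilities: p[n+1] = p[n] * arrival / ((n+1)*mu).
    p = [1.0]
    for n in range(n_ambulances):
        arrival = lam_high + (lam_low if n < cutoff else 0.0)
        p.append(p[-1] * arrival / ((n + 1) * mu))
    total = sum(p)
    p = [x / total for x in p]
    low_lost = sum(p[cutoff:])       # low-priority arrivals blocked when n >= cutoff
    high_blocked = p[n_ambulances]   # high-priority arrivals blocked when all busy
    return low_lost, high_blocked

# Hypothetical example: 6 ambulances, 1-hour mean service, cutoff of 6 (no
# reservation) versus 4 (two ambulances reserved for high-priority calls).
for c in (6, 4):
    lo, hi = cutoff_loss_probs(lam_high=2.0, lam_low=3.0, mu=1.0,
                               n_ambulances=6, cutoff=c)
    print(f"cutoff={c}: P(low-priority lost)={lo:.3f}, P(high-priority blocked)={hi:.3f}")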
Subjects
Emergency Service Communication Systems/organization & administration, Systems Theory, Triage/methods, Emergency Service Communication Systems/standards, Emergency Medical Services/organization & administration, Humans, Time Factors
ABSTRACT
INTRODUCTION: Substance use disorder (SUD), overdose, and drug use-related crime continue to increase in the U.S. Pre-arrest diversion-to-treatment programs may decrease crime recidivism and overdose deaths. We assessed the impact of a community-wide diversion-to-treatment initiative on crime, incarceration, and overdose. METHODS: This article reports on a prospective evaluation of a law enforcement-led, pre-arrest diversion-to-treatment program, comparing crime, incarceration, and overdose deaths among participants who did not engage with the program (non-engaged; n = 103), engaged but did not complete it (non-completers; n = 60), or completed it (completers; n = 100). Participants included 263 adults apprehended by police officers for low-level, drug use-related crimes between September 1, 2017 and August 31, 2020. The program offered eligible persons participation in a six-month program consisting of a clinical assessment, referral to addiction treatment services based on each individual's needs, connection to recovery peer support, and treatment engagement monitoring. Completers had their initial criminal charges 'voided,' while non-engaged and non-completer participants had their original charges filed with local prosecutors. The project collected participant-level data on arrests and incarceration within 12 months before and 12 months after program enrollment, and data on fatal overdose within 12 months after program enrollment. Logistic regression predicted outcomes using baseline demographics (sex, age, race, housing status) and pre-index-crime arrest and incarceration indices as covariates. RESULTS: After accounting for baseline demographics and pre-enrollment arrest/incarceration history, logistic regression models found that the non-engaged and non-completer groups were more likely than completers to be arrested (odds ratios [ORs]: 3.9 [95% CI, 2.0-7.7] and 3.6 [95% CI, 1.7-7.5], respectively) and incarcerated (ORs: 10.3 [95% CI, 5.0-20.8] and 21.0 [95% CI, 7.9-55.7], respectively) during the 12-month follow-up. Rates of overdose death during the 12-month follow-up were highest in the non-engaged (6/103, 5.8%) and non-completer (2/60, 3.3%) groups; completers had the lowest rate (2/100, 2.0%), with all of their deaths occurring after completion of the six-month treatment/monitoring program. CONCLUSIONS: Collaboration between law enforcement, clinicians, researchers, and the broader community to divert adults who commit a low-level, drug use-related crime from criminal prosecution to addiction treatment may effectively reduce crime recidivism, incarceration, and overdose deaths.
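The adjusted odds ratios reported above come from exponentiating logistic regression coefficients and their confidence limits. A minimal Python sketch of that mechanic, using statsmodels on synthetic data (all column names, group sizes, and values here are hypothetical placeholders, not the study data), is:

import numpy as np
import pandas as pd
import statsmodels.api as sm

np.random.seed(0)

# Hypothetical data: one row per participant, with a binary outcome (arrested
# within 12 months), group indicators (completers as the reference category),
# demographics, and a pre-enrollment arrest count as covariates.
df = pd.DataFrame({
    "arrested_12mo": np.random.binomial(1, 0.4, 263),
    "non_engaged":   np.r_[np.ones(103), np.zeros(160)],
    "non_completer": np.r_[np.zeros(103), np.ones(60), np.zeros(100)],
    "male":          np.random.binomial(1, 0.6, 263),
    "age":           np.random.normal(35, 10, 263),
    "prior_arrests": np.random.poisson(1.5, 263),
})

X = sm.add_constant(df[["non_engaged", "non_completer", "male", "age", "prior_arrests"]])
fit = sm.Logit(df["arrested_12mo"], X).fit(disp=0)

# Exponentiate coefficients and confidence limits to obtain adjusted odds ratios.
ors = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(pd.DataFrame({"OR": ors, "2.5%": ci[0], "97.5%": ci[1]}))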
Subjects
Crime, Drug Overdose, Law Enforcement, Program Evaluation, Recidivism, Substance-Related Disorders, Humans, Male, Female, Adult, Drug Overdose/mortality, Drug Overdose/prevention & control, Law Enforcement/methods, Recidivism/prevention & control, Recidivism/statistics & numerical data, Substance-Related Disorders/mortality, Crime/prevention & control, Crime/statistics & numerical data, Crime/legislation & jurisprudence, Prospective Studies, Middle Aged, Prisoners/statistics & numerical data, Prisoners/legislation & jurisprudence, Prisoners/psychology, Incarceration
ABSTRACT
BACKGROUND: Overdose deaths, addiction, and drug-related crime have increased in the United States over the past decade. Treatment improves outcomes, including reducing crime, but few individuals with addiction receive treatment. Here, we determine whether the Madison Addiction Recovery Initiative (MARI), a community policing program implemented by the City of Madison (Wisconsin) Police Department (MPD) that diverts adults who committed a non-violent, drug use-related crime from criminal prosecution to addiction treatment, reduces the risk of recidivism (i.e., an arrest) in the 6-month period following the index crime. METHODS: Observational data were collected by the MPD from 12 months before through 6 months after an index crime for participants in the MARI program (n = 263), who were referred to MARI between September 1, 2017 and August 31, 2020, and for a Historical Comparison group (n = 52) who committed a comparable crime between September 1, 2015 and August 31, 2016. Average effects were estimated using intention-to-treat (ITT), per-protocol, and complier average causal effect (CACE) analyses, adjusted for covariates. RESULTS: The ITT analysis did not show that assignment to MARI lowered the adjusted odds of 6-month recidivism (aOR = 0.59 [0.32, 1.12], p = 0.11). The per-protocol analysis showed that completing MARI lowered the adjusted odds of 6-month recidivism (aOR = 0.23 [0.10, 0.52], p < 0.001). The CACE analysis indicated that, among individuals who would complete the MARI program if assigned to it, assignment to MARI lowered the adjusted odds of 6-month recidivism (aOR = 0.85 [0.80, 0.90], p < 0.001). CONCLUSIONS: Diverting adults who committed a non-violent, drug use-related crime from criminal prosecution to addiction treatment may reduce 6-month recidivism.
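For context on how the three estimands relate, the unadjusted, risk-difference form of the CACE is the classic instrumental-variable (Wald) ratio shown below, where Z denotes assignment to MARI, D denotes program completion, and Y denotes recidivism. The study's estimates are on the adjusted odds-ratio scale, so this identity is included only as an illustration of the estimand under the usual monotonicity and exclusion-restriction assumptions.

\[
\mathrm{ITT} \;=\; \mathbb{E}[Y \mid Z = 1] - \mathbb{E}[Y \mid Z = 0],
\qquad
\mathrm{CACE} \;=\; \frac{\mathbb{E}[Y \mid Z = 1] - \mathbb{E}[Y \mid Z = 0]}
                         {\mathbb{E}[D \mid Z = 1] - \mathbb{E}[D \mid Z = 0]}.
\]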