1.
Nicotine Tob Res ; 2024 Feb 26.
Article in English | MEDLINE | ID: mdl-38407960

ABSTRACT

INTRODUCTION: The use of electronic vaping products (EVPs) containing nicotine, marijuana, and/or other substances remains prominent among youth, and EVPs containing nicotine have been the most commonly used tobacco product among youth since 2014. However, a detailed understanding of the chemical composition of these products is limited. METHODS: During February 25-March 15, 2019, a total of 576 EVPs, including 233 e-cigarette devices (43 of them disposable vape pens) and 343 e-liquid cartridges/pods/bottled e-liquids, were found at or confiscated from a convenience sample of 16 public high schools in California. The liquids inside 251 vape pens and cartridges/pods/bottled e-liquids were analyzed using gas chromatography/mass spectrometry (GC/MS). For comparison, new JUUL pods (the most commonly used e-cigarette among youth during 2018-2019) with different flavorings and nicotine contents were purchased and analyzed. RESULTS: Among the e-cigarette cartridges/pods/bottled e-liquids, nicotine was detected in 204 of 208 (98.1%) samples. Propylene glycol (PG) and vegetable glycerin (VG) were the dominant solvents in nicotine-containing EVPs. Among the 43 disposable vape pen devices, cannabinoids such as tetrahydrocannabinol (THC) or cannabidiol (CBD) were identified in 39 of 43 (90.7%) samples, 3 of which contained both nicotine and THC. Differences in chemical composition were observed between confiscated or collected JUULs and purchased JUULs, and the measured nicotine was inconsistent with the labels on some confiscated or collected bottled e-liquids. CONCLUSIONS: EVPs from the 16 participating schools widely contained substances with known adverse health effects for youth, including nicotine and cannabinoids, and labeled and measured nicotine content disagreed for some products collected from schools. IMPLICATIONS: This study characterized the main chemical constituents of EVPs found at 16 California public high schools. Continued efforts are warranted, including at the school level, to educate youth and to prevent and reduce their use of EVPs.

2.
Proc Natl Acad Sci U S A ; 117(9): 5067-5073, 2020 03 03.
Article in English | MEDLINE | ID: mdl-32054785

ABSTRACT

Forecasting the spatiotemporal spread of infectious diseases during an outbreak is an important component of epidemic response. However, it remains challenging both methodologically and with respect to data requirements, as disease spread is influenced by numerous factors, including the pathogen's underlying transmission parameters and epidemiological dynamics, social networks and population connectivity, and environmental conditions. Here, using data from Sierra Leone, we analyze the spatiotemporal dynamics of recent cholera and Ebola outbreaks and compare and contrast the spread of these two pathogens in the same population. We develop a simulation model of the spatial spread of an epidemic in order to examine the impact of a pathogen's incubation period on the dynamics of spread and the predictability of outbreaks. We find that differences in the incubation period alone can determine the limits of predictability for diseases with different natural histories, both empirically and in our simulations. Our results show that diseases with longer incubation periods, such as Ebola, where infected individuals can travel farther before becoming infectious, result in more long-distance sparking events and less predictable disease trajectories, compared with the more predictable wave-like spread of diseases with shorter incubation periods, such as cholera.
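
For intuition about the mechanism described in this abstract, the following is a minimal, hypothetical Python sketch (not the authors' simulation model): each infected person takes a random walk until infectiousness begins, so a longer incubation period produces more long-distance sparking. The step size and incubation means are illustrative assumptions.

```python
# A minimal, hypothetical sketch (not the authors' model): each infected person
# takes a 2-D random walk until infectiousness begins, so a longer incubation
# period allows travel farther from the infection site ("sparking"). Step size
# and incubation means are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def spark_distances(mean_incubation_days, n=10_000, daily_step_km=2.0):
    """Distance from the infection site at the moment infectiousness begins."""
    days = rng.poisson(mean_incubation_days, size=n)
    max_days = int(days.max())
    dx = rng.normal(0.0, daily_step_km, size=(n, max_days))
    dy = rng.normal(0.0, daily_step_km, size=(n, max_days))
    moving = np.arange(max_days) < days[:, None]        # move only while incubating
    return np.hypot((dx * moving).sum(axis=1), (dy * moving).sum(axis=1))

for label, mean_inc in [("cholera-like, short incubation", 1.5),
                        ("Ebola-like, long incubation", 10.0)]:
    d = spark_distances(mean_inc)
    print(f"{label}: median {np.median(d):5.1f} km, 95th percentile {np.percentile(d, 95):5.1f} km")
```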


Subject(s)
Cholera/epidemiology; Computer Simulation; Disease Outbreaks; Hemorrhagic Fever, Ebola/epidemiology; Communicable Diseases/epidemiology; Epidemics; Epidemiologic Methods; Forecasting; Humans; Sierra Leone/epidemiology
3.
Clin Infect Dis ; 71(1): 14-21, 2020 06 24.
Article in English | MEDLINE | ID: mdl-31412358

ABSTRACT

BACKGROUND: Hepatitis A is a vaccine-preventable viral disease transmitted by the fecal-oral route. During 2016-2018, the County of San Diego investigated an outbreak of hepatitis A infections primarily among people experiencing homelessness (PEH) to identify risk factors and support control measures. At the time of the outbreak, homelessness was not recognized as an independent risk factor for the disease. METHODS: We tested the association between homelessness and infection with hepatitis A virus (HAV) using a test-negative study design comparing patients with laboratory-confirmed hepatitis A with control subjects who tested negative for HAV infection. We assessed risk factors for severe hepatitis A disease outcomes, including hospitalization and death, using multivariable logistic regression. We measured the frequency of indications for hepatitis A vaccination according to Advisory Committee on Immunization Practices (ACIP) guidelines. RESULTS: Among 589 outbreak-associated cases reported, 291 (49%) occurred among PEH. Compared with those who were not homeless, PEH had 3.3 (95% confidence interval [CI], 1.5-7.9) times higher odds of HAV infection, 2.5 (95% CI, 1.7-3.9) times higher odds of hospitalization, and 3.9 (95% CI, 1.1-16.9) times higher odds of death associated with hepatitis A. Among PEH, 212 (73%) had at least one other ACIP indication for hepatitis A vaccination recorded. CONCLUSIONS: PEH were at higher risk of infection with HAV and of severe hepatitis A disease outcomes compared with those not experiencing homelessness. Approximately one-fourth of PEH had no other ACIP indication for hepatitis A vaccination. These findings support the recent ACIP recommendation to add homelessness as an indication for hepatitis A vaccination.
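
For readers unfamiliar with the test-negative framing, the sketch below computes a crude odds ratio with a Wald 95% confidence interval from a 2x2 table (homelessness as the exposure; HAV-positive patients as cases, HAV-negative patients as controls). The counts are hypothetical, and the paper's reported estimates come from multivariable logistic regression, not this crude calculation.

```python
# Hypothetical counts, for illustration only: a crude (unadjusted) odds ratio and
# Wald 95% CI from a 2x2 table in a test-negative design, with homelessness as the
# exposure, HAV-positive patients as cases, and HAV-negative patients as controls.
# The paper's reported estimates come from multivariable logistic regression.
import math

def crude_odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    or_ = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)
    se_log_or = math.sqrt(1 / exposed_cases + 1 / unexposed_cases
                          + 1 / exposed_controls + 1 / unexposed_controls)
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, lo, hi

# hypothetical 2x2 table (not the study's data)
print("OR (95%% CI): %.1f (%.1f-%.1f)" % crude_odds_ratio(120, 150, 60, 250))
```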


Subject(s)
Hepatitis A virus; Hepatitis A; Ill-Housed Persons; Disease Outbreaks; Hepatitis A/epidemiology; Hepatitis A Vaccines; Humans; Vaccination
4.
Proc Natl Acad Sci U S A ; 114(15): 4023-4028, 2017 04 11.
Article in English | MEDLINE | ID: mdl-28351976

ABSTRACT

Strategies for containing an emerging infectious disease outbreak must be nonpharmaceutical when drugs or vaccines for the pathogen do not yet exist or are unavailable. The success of these nonpharmaceutical strategies will depend not only on the effectiveness of isolation measures but also on the epidemiological characteristics of the infection. However, there is currently no systematic framework to assess the relationship between different containment strategies and the natural history and epidemiological dynamics of the pathogen. Here, we compare the effectiveness of quarantine and symptom monitoring, implemented via contact tracing, in controlling epidemics using an agent-based branching model. We examine the relationship between epidemic containment and the disease dynamics of symptoms and infectiousness for seven case-study diseases with diverse natural histories, including Ebola, influenza A, and severe acute respiratory syndrome (SARS). We show that the comparative effectiveness of symptom monitoring and quarantine depends critically on the natural history of the infectious disease, its inherent transmissibility, and the feasibility of the intervention in the particular healthcare setting. The benefit of quarantine over symptom monitoring is generally maximized for fast-course diseases, but we show the conditions under which symptom monitoring alone can control certain outbreaks. This quantitative framework can guide policymakers on how best to use nonpharmaceutical interventions and prioritize research during an outbreak of an emerging pathogen.
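
The trade-off this abstract describes can be made concrete with a minimal branching-process sketch, shown below. This is not the authors' agent-based model; the distributions and parameter values are hypothetical. The point it illustrates is that quarantine truncates a traced contact's transmission at the tracing delay, whereas symptom monitoring truncates it only after symptoms are recognized, so the share of transmission occurring before symptom recognition drives the difference.

```python
# A minimal branching-process sketch of the comparison (not the authors'
# agent-based model). A quarantined contact stops transmitting once traced; a
# monitored contact stops only after symptoms are recognized, so transmission
# occurring before or just after symptom onset is what separates the two.
# All distributions and parameter values below are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def effective_r(R0=2.0, trace_delay=1.0, symptom_recognition_delay=0.5,
                quarantine=True, n=20_000):
    incubation = rng.gamma(4.0, 5.0 / 4.0, n)          # ~5-day mean time to symptoms
    offspring = rng.poisson(R0, n)
    transmitted = 0
    for inc, k in zip(incubation, offspring):
        t = inc + rng.normal(0.0, 2.0, k)              # transmission times near symptom onset
        t = t[t > 0]                                   # cannot transmit before own infection
        isolation_time = trace_delay if quarantine else inc + symptom_recognition_delay
        transmitted += np.sum(t < isolation_time)      # only pre-isolation transmission counts
    return transmitted / n

print("quarantine:         R_eff ~", round(effective_r(quarantine=True), 2))
print("symptom monitoring: R_eff ~", round(effective_r(quarantine=False), 2))
```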


Subject(s)
Communicable Diseases, Emerging/epidemiology; Communicable Diseases, Emerging/prevention & control; Disease Outbreaks/prevention & control; Coronavirus Infections/prevention & control; Hemorrhagic Fever, Ebola/prevention & control; Hepatitis A/prevention & control; Humans; Influenza, Human/prevention & control; Models, Theoretical; Quarantine; Severe Acute Respiratory Syndrome/prevention & control; Smallpox/prevention & control
5.
MMWR Morb Mortal Wkly Rep ; 67(51-52): 1415-1418, 2019 Jan 04.
Article in English | MEDLINE | ID: mdl-30605447

ABSTRACT

During September 29-October 6, 2017, the County of San Diego Public Health Services (COSD) was notified of two patients with suspected wound botulism and a history of using black tar heroin. On October 9, COSD, which had reported an average of one wound botulism case per year during 2001-2016, sent a health alert through the California Health Alert Network, notifying Southern California providers of these two patients, including their signs and symptoms and black tar heroin exposure. In collaboration with the California Department of Public Health, COSD conducted an investigation to identify additional cases, determine risk factors for illness, estimate the cost of medical care, and develop recommendations to prevent further illness. By April 18, 2018, nine (eight confirmed and one probable) patients with wound botulism had been identified, all of whom were hospitalized; one of the nine died. All nine were persons who inject drugs; seven specifically reported using black tar heroin, and six practiced subcutaneous injection, known as 'skin popping'. Clinically compatible signs and symptoms included muscle weakness, difficulty swallowing, blurred vision, drooping eyelids, slurred speech, difficulty breathing, loss of facial expression, and descending paralysis. All patients were treated with heptavalent botulism antitoxin (BAT). Wound botulism is likely underrecognized because of its rarity and because its signs and symptoms overlap with those of opioid intoxication, overdose, and other neurologic syndromes, including Guillain-Barré syndrome, the Miller Fisher variant of Guillain-Barré syndrome, and myasthenia gravis. Prompt diagnosis, administration of BAT, and provision of supportive care can help stop the progression of paralysis and be lifesaving.


Subject(s)
Botulism/epidemiology; Disease Outbreaks; Heroin Dependence/complications; Wound Infection/epidemiology; California/epidemiology; Humans
6.
Lancet ; 388(10062): 2904-2911, 2016 12 10.
Article in English | MEDLINE | ID: mdl-27837923

ABSTRACT

BACKGROUND: The ongoing yellow fever epidemic in Angola strains the global vaccine supply, prompting WHO to adopt dose sparing for its vaccination campaign in Kinshasa, Democratic Republic of the Congo, in July-August, 2016. Although a 5-fold fractional-dose vaccine is similar to the standard-dose vaccine in safety and immunogenicity, its efficacy is untested. There is an urgent need to ensure the robustness of fractional-dose vaccination by elucidating the conditions under which dose fractionation would reduce transmission. METHODS: We estimate the effective reproductive number for yellow fever in Angola using disease natural history and case report data. With simple mathematical models of yellow fever transmission, we calculate the infection attack rate (the proportion of the population infected over the course of an epidemic) under various levels of transmissibility and 5-fold fractional-dose vaccine efficacy for two vaccination scenarios, ie, random vaccination in a hypothetical population that is completely susceptible, and the Kinshasa vaccination campaign in July-August, 2016, with different age cutoffs for fractional-dose vaccines. FINDINGS: We estimate that the effective reproductive number early in the Angola outbreak was between 5·2 and 7·1. If vaccine action is all-or-nothing (ie, a proportion VE of vaccine recipients receive complete protection and the remainder receive no protection), n-fold fractionation can greatly reduce the infection attack rate as long as VE exceeds 1/n. This benefit threshold becomes more stringent if vaccine action is leaky (ie, the susceptibility of each vaccine recipient is reduced by a factor equal to the vaccine efficacy). The age cutoff for fractional-dose vaccines chosen by WHO for the Kinshasa vaccination campaign (2 years) provides the largest reduction in infection attack rate if the efficacy of 5-fold fractional-dose vaccines exceeds 20%. INTERPRETATION: Dose fractionation is an effective strategy for reducing the infection attack rate and would be robust, with a large margin for error, in case fractional-dose VE is lower than expected. FUNDING: NIH-MIDAS, HMRF-Hong Kong.
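
The "VE exceeds 1/n" condition for all-or-nothing vaccine action can be illustrated with the standard final-size relation z = 1 - exp(-R0*s0*z), where s0 is the unprotected fraction. The sketch below is a back-of-the-envelope check under assumed values of R0, dose supply, and standard-dose efficacy; it is not the paper's model. With coverage c = 0.15 and n = 5, a fractional efficacy of 0.20 protects the same effective fraction as standard dosing, and anything higher does better, which is the break-even intuition.

```python
# A back-of-the-envelope check of the "VE > 1/n" threshold (not the paper's
# model), using the standard final-size relation for an all-or-nothing vaccine.
# A fixed dose supply covers a fraction c at standard dose, or min(1, n*c) at
# n-fold fractional dose with efficacy VE_frac. R0, c, and the assumption that
# standard-dose efficacy is 1.0 are all illustrative.
import math
from scipy.optimize import brentq

def infection_attack_rate(R0, protected_fraction):
    s0 = 1.0 - protected_fraction                   # fraction still susceptible
    if R0 * s0 <= 1.0:
        return 0.0                                  # below threshold: no major outbreak
    z = brentq(lambda z: z - (1.0 - math.exp(-R0 * s0 * z)), 1e-9, 1.0)
    return s0 * z                                   # attack rate in the whole population

R0, c, n = 6.0, 0.15, 5                             # hypothetical transmissibility and dose supply
print("standard dose:", round(infection_attack_rate(R0, c), 3))
for ve_frac in (0.10, 0.20, 0.50, 0.90):            # 1/n = 0.20 is the break-even efficacy
    protected = min(1.0, n * c) * ve_frac
    print(f"5-fold fractional dose, VE={ve_frac:.2f}:",
          round(infection_attack_rate(R0, protected), 3))
```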


Subject(s)
Disease Outbreaks/prevention & control; Dose-Response Relationship, Drug; Yellow Fever Vaccine/administration & dosage; Yellow Fever Vaccine/supply & distribution; Yellow Fever/drug therapy; Angola/epidemiology; Democratic Republic of the Congo/epidemiology; Humans; Models, Theoretical; Vaccination/methods; Yellow Fever/epidemiology; Yellow Fever/transmission
8.
J Adolesc Health ; 69(2): 342-345, 2021 08.
Article in English | MEDLINE | ID: mdl-33712386

ABSTRACT

PURPOSE: To examine the chemical composition of JUUL pods collected from a convenience sample of 16 high schools in California to identify possible consumer modification or counterfeit use. METHODS: Using gas chromatography-mass spectrometry, we quantitatively analyzed the nicotine, propylene glycol (PG), and vegetable glycerin (VG) in JUUL pods (n = 26) collected from California high schools and compared the results with commercial 3% (n = 15) and 5% (n = 24) JUUL pods purchased online. RESULTS: Most of the collected JUUL pods (24/26) had a nicotine concentration (43.3 mg/ml, 95% prediction interval [PI]: 21.5-65.1) outside the prediction intervals of the 3% (33.5 mg/ml, 95% PI: 31.8-35.2) and 5% (55.0 mg/ml, 95% PI: 51.5-58.3) commercial JUUL pods. Most (73%) of the collected JUUL pods had VG concentrations (583.5 mg/ml, PI: 428.9-738.1) lower than those of the 3% (722.2 mg/ml, PI: 643.0-801.4) and 5% (710.5 mg/ml, PI: 653.1-767.8) commercial JUUL pods. CONCLUSIONS: Used JUUL products collected from high school students or found on school grounds were not chemically consistent with the manufacturer's stated formulations.
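
As an illustration of the comparison being made, the sketch below computes a normal-theory 95% prediction interval from replicate measurements of commercial pods and flags collected pods that fall outside it. All numbers are hypothetical stand-ins, not the study's measurements.

```python
# Illustrative only: a normal-theory 95% prediction interval (PI) computed from
# replicate measurements of commercial pods, then used to flag field-collected
# pods falling outside it. All concentrations below (mg/mL) are made-up
# stand-ins, not the study's measurements.
import numpy as np
from scipy import stats

def prediction_interval(x, level=0.95):
    x = np.asarray(x, dtype=float)
    n, mean, sd = len(x), x.mean(), x.std(ddof=1)
    t_crit = stats.t.ppf(0.5 + level / 2.0, df=n - 1)
    half_width = t_crit * sd * np.sqrt(1.0 + 1.0 / n)   # PI for one new observation
    return mean - half_width, mean + half_width

rng = np.random.default_rng(2)
commercial_5pct = rng.normal(55.0, 1.7, size=24)        # hypothetical replicate measurements
lo, hi = prediction_interval(commercial_5pct)
for nicotine in (43.3, 54.2, 61.0):                     # hypothetical collected pods
    status = "inside" if lo <= nicotine <= hi else "OUTSIDE"
    print(f"{nicotine:5.1f} mg/mL is {status} the 5% commercial PI ({lo:.1f}-{hi:.1f})")
```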


Subject(s)
Electronic Nicotine Delivery Systems; Vaping; California; Flavoring Agents; Humans; Schools; Students
9.
medRxiv ; 2020 Apr 26.
Article in English | MEDLINE | ID: mdl-32511440

ABSTRACT

Background: Voluntary individual quarantine and voluntary active monitoring of contacts are core disease control strategies for emerging infectious diseases, such as COVID-19. Given the impact of quarantine on resources and individual liberty, it is vital to assess under what conditions individual quarantine can more effectively control COVID-19 than active monitoring. As an epidemic grows, it is also important to consider when these interventions are no longer feasible and broader mitigation measures must be implemented. Methods: To estimate the comparative efficacy of these case-based interventions to control COVID-19, we fit a stochastic branching model to reported parameters for the dynamics of the disease. Specifically, we fit the model to the incubation period distribution and to two estimates of the serial interval distribution: a shorter one with a mean serial interval of 4.8 days and a longer one with a mean of 7.5 days. To assess variable resource settings, we consider two feasibility settings: a high-feasibility setting with 90% of contacts traced, a half-day average delay in tracing and symptom recognition, and 90% effective isolation; and a low-feasibility setting with 50% of contacts traced, a two-day average delay, and 50% effective isolation. Findings: Our results suggest that individual quarantine in high-feasibility settings, where at least three-quarters of infected contacts are individually quarantined, contains an outbreak of COVID-19 with a short serial interval (4.8 days) 84% of the time. However, in settings where this level of performance is not achievable and the outbreak continues to grow, the burden of contacts traced for active monitoring or quarantine will grow as well. When resources are prioritized for scalable interventions such as social distancing, we show that active monitoring or individual quarantine of high-risk contacts can contribute synergistically to mitigation efforts. Interpretation: Our model highlights the urgent need for more data on the serial interval and the extent of presymptomatic transmission in order to make data-driven policy decisions regarding the cost-benefit comparison of individual quarantine vs. active monitoring of contacts. To the extent that these interventions can be implemented, they can help mitigate the spread of COVID-19.

10.
Lancet Infect Dis ; 20(9): 1025-1033, 2020 09.
Article in English | MEDLINE | ID: mdl-32445710

ABSTRACT

BACKGROUND: Voluntary individual quarantine and voluntary active monitoring of contacts are core disease control strategies for emerging infectious diseases such as COVID-19. Given the impact of quarantine on resources and individual liberty, it is vital to assess under what conditions individual quarantine can more effectively control COVID-19 than active monitoring. As an epidemic grows, it is also important to consider when these interventions are no longer feasible and broader mitigation measures must be implemented. METHODS: To estimate the comparative efficacy of individual quarantine and active monitoring of contacts to control severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), we fit a stochastic branching model to reported parameters for the dynamics of the disease. Specifically, we fit a model to the incubation period distribution (mean 5·2 days) and to two estimates of the serial interval distribution: a shorter one with a mean serial interval of 4·8 days and a longer one with a mean of 7·5 days. To assess variable resource settings, we considered two feasibility settings: a high-feasibility setting with 90% of contacts traced, a half-day average delay in tracing and symptom recognition, and 90% effective isolation; and a low-feasibility setting with 50% of contacts traced, a 2-day average delay, and 50% effective isolation. FINDINGS: Model fitting by sequential Monte Carlo resulted in a mean time of infectiousness onset before symptom onset of 0·77 days (95% CI -1·98 to 0·29) for the shorter serial interval, and for the longer serial interval it resulted in a mean time of infectiousness onset after symptom onset of 0·51 days (95% CI -0·77 to 1·50). Individual quarantine in high-feasibility settings, where at least 75% of infected contacts are individually quarantined, contains an outbreak of SARS-CoV-2 with a short serial interval (4·8 days) 84% of the time. However, in settings where the outbreak continues to grow (eg, low-feasibility settings), so too will the burden of the number of contacts traced for active monitoring or quarantine, particularly uninfected contacts (who never develop symptoms). When resources are prioritised for scalable interventions such as physical distancing, we show active monitoring or individual quarantine of high-risk contacts can contribute synergistically to mitigation efforts. Even under the shorter serial interval, if physical distancing reduces the reproductive number to 1·25, active monitoring of 50% of contacts can result in overall outbreak control (ie, effective reproductive number <1). INTERPRETATION: Our model highlights the urgent need for more data on the serial interval and the extent of presymptomatic transmission to make data-driven policy decisions regarding the cost-benefit comparisons of individual quarantine versus active monitoring of contacts. To the extent that these interventions can be implemented, they can help mitigate the spread of SARS-CoV-2. FUNDING: National Institute of General Medical Sciences, National Institutes of Health.
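
A deliberately crude way to see the "overall outbreak control" arithmetic in these findings: if physical distancing leaves a reproductive number R and a fraction p of contacts are actively monitored, with each monitored contact's onward transmission reduced by a factor e, then roughly R_eff = R(1 - pe). This approximation ignores the timing effects and heterogeneity the paper's branching model captures; it is shown only to make the threshold explicit.

```python
# A crude approximation, not the paper's branching model: if physical distancing
# alone leaves a reproductive number R, and a fraction p of contacts are actively
# monitored with each monitored contact's onward transmission reduced by a factor
# e, then roughly R_eff = R * (1 - p * e). This ignores the timing effects and
# heterogeneity that the full model captures; it only illustrates the "overall
# outbreak control" arithmetic.
R, p = 1.25, 0.50
for e in (0.2, 0.4, 0.6, 0.8):
    r_eff = R * (1 - p * e)
    verdict = "controlled (R_eff < 1)" if r_eff < 1 else "not controlled"
    print(f"per-contact transmission reduction e={e:.1f}: R_eff = {r_eff:.2f} -> {verdict}")
```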


Subject(s)
Betacoronavirus/isolation & purification; Contact Tracing; Coronavirus Infections/prevention & control; Disease Outbreaks/prevention & control; Models, Theoretical; Pandemics/prevention & control; Pneumonia, Viral/prevention & control; Quarantine; COVID-19; Coronavirus Infections/epidemiology; Coronavirus Infections/transmission; Coronavirus Infections/virology; Epidemiological Monitoring; Humans; Monte Carlo Method; Pneumonia, Viral/epidemiology; Pneumonia, Viral/transmission; Pneumonia, Viral/virology; SARS-CoV-2; Voluntary Programs
11.
PLoS Negl Trop Dis ; 12(2): e0006257, 2018 02.
Article in English | MEDLINE | ID: mdl-29489815

ABSTRACT

BACKGROUND: Oral cholera vaccination is an approach to preventing outbreaks in at-risk settings and controlling cholera in endemic settings. However, vaccine-derived herd immunity may be short-lived due to interactions between human mobility and imperfect or waning vaccine efficacy. As the supply and utilization of oral cholera vaccines grows, critical questions related to herd immunity are emerging, including: who should be targeted; when should revaccination be performed; and why have cholera outbreaks occurred in recently vaccinated populations? METHODS AND FINDINGS: We use mathematical models to simulate routine and mass oral cholera vaccination in populations with varying degrees of migration, transmission intensity, and vaccine coverage. We show that migration and waning vaccine efficacy strongly influence the duration of herd immunity while birth and death rates have relatively minimal impacts. As compared to either periodic mass vaccination or routine vaccination alone, a community could be protected longer by a blended "Mass and Maintain" strategy. We show that vaccination may be best targeted at populations with intermediate degrees of mobility as compared to communities with very high or very low population turnover. Using a case study of an internally displaced person camp in South Sudan which underwent high-coverage mass vaccination in 2014 and 2015, we show that waning vaccine direct effects and high population turnover rendered the camp over 80% susceptible at the time of the cholera outbreak beginning in October 2016. CONCLUSIONS: Oral cholera vaccines can be powerful tools for quickly protecting a population for a period of time that depends critically on vaccine coverage, vaccine efficacy over time, and the rate of population turnover through human mobility. Due to waning herd immunity, epidemics in vaccinated communities are possible but become less likely through complementary interventions or data-driven revaccination strategies.
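
The interplay of waning direct protection and population turnover can be sketched with a few lines of arithmetic, shown below. The half-life, turnover rate, and effective coverage are illustrative guesses, not estimates for the South Sudan camp; the point is only that plausible waning plus high turnover can leave a vaccinated community largely susceptible within a couple of years.

```python
# A minimal sketch (not the authors' model) of how waning direct protection and
# population turnover rebuild susceptibility after a mass campaign. The waning
# half-life, turnover rate, and initial effective coverage are illustrative
# guesses, not estimates for the South Sudan camp.
import math

def susceptible_fraction(t_years, coverage=0.9, initial_ve=0.8,
                         waning_halflife_years=2.0, turnover_per_year=0.5):
    ve_t = initial_ve * 0.5 ** (t_years / waning_halflife_years)   # waning vaccine efficacy
    original_residents = math.exp(-turnover_per_year * t_years)    # fraction not yet replaced
    protected = coverage * ve_t * original_residents               # newcomers assumed unvaccinated
    return 1.0 - protected

for t in (0.0, 1.0, 2.0):
    print(f"{t:.0f} years after the campaign: ~{susceptible_fraction(t):.0%} susceptible")
```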


Subject(s)
Cholera Vaccines/immunology; Cholera/immunology; Immunity, Herd; Population Dynamics; Vaccine Potency; Child; Cholera/epidemiology; Cholera Vaccines/supply & distribution; Disease Outbreaks/prevention & control; Disease Outbreaks/statistics & numerical data; Humans; Immunization, Secondary/methods; Mass Vaccination; Models, Theoretical; Refugee Camps; South Sudan/epidemiology; Vaccination; Vaccination Coverage
12.
Int J Epidemiol ; 47(5): 1562-1570, 2018 10 01.
Article in English | MEDLINE | ID: mdl-29947788

ABSTRACT

Background: Travel restrictions were implemented on an unprecedented scale in 2015 in Sierra Leone to contain and eliminate Ebola virus disease. However, the impact of epidemic travel restrictions on mobility itself remains difficult to measure with traditional methods. New 'big data' approaches using mobile phone data can provide, in near real-time, the type of information needed to guide and evaluate control measures. Methods: We analysed anonymous mobile phone call detail records (CDRs) from a leading operator in Sierra Leone between 20 March and 1 July in 2015. We used an anomaly detection algorithm to assess changes in travel during a national 'stay at home' lockdown from 27 to 29 March. To measure the magnitude of these changes and to assess effect modification by region and historical Ebola burden, we performed a time series analysis and a crossover analysis. Results: Routinely collected mobile phone data revealed a dramatic reduction in human mobility during a 3-day lockdown in Sierra Leone. The number of individuals relocating between chiefdoms decreased by 31% within 15 km, by 46% for 15-30 km and by 76% for distances greater than 30 km. This effect was highly heterogeneous in space, with higher impact in regions with higher Ebola incidence. Travel quickly returned to normal patterns after the restrictions were lifted. Conclusions: The effects of travel restrictions on mobility can be large, targeted and measurable in near real-time. With appropriate anonymization protocols, mobile phone data should play a central role in guiding and monitoring interventions for epidemic containment.
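
The core measurement in this analysis, the percentage change in relocations by distance band during the lockdown relative to baseline, can be reproduced in form (not in data) with a short pandas sketch. The trip counts below are invented so that the resulting percentages match those reported in the abstract; they are not the study's data.

```python
# Illustrative only: percentage change in inter-chiefdom trips by distance band,
# lockdown vs baseline, from aggregated (hypothetical) CDR-derived counts. The
# trip counts are invented so that the resulting percentage changes match those
# reported in the abstract; they are not the study's data.
import pandas as pd

trips = pd.DataFrame({
    "period":   ["baseline"] * 3 + ["lockdown"] * 3,
    "distance": ["<15 km", "15-30 km", ">30 km"] * 2,
    "trips":    [10_000, 4_000, 2_500, 6_900, 2_160, 600],
})

pivot = trips.pivot(index="distance", columns="period", values="trips")
pivot["pct_change"] = 100.0 * (pivot["lockdown"] - pivot["baseline"]) / pivot["baseline"]
print(pivot.sort_values("pct_change"))
```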


Subject(s)
Cell Phone/statistics & numerical data; Epidemics; Hemorrhagic Fever, Ebola/epidemiology; Travel/legislation & jurisprudence; Travel/statistics & numerical data; Hemorrhagic Fever, Ebola/transmission; Humans; Incidence; Infection Control/methods; Population Dynamics; Retrospective Studies; Sierra Leone/epidemiology
13.
Am J Trop Med Hyg ; 92(4): 811-817, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25667053

ABSTRACT

In addition to being effective, fast-acting, and well tolerated, artemisinin-based combination therapies (ACTs) are able to kill certain transmission stages of the malaria parasite. However, the population-level impacts of ACTs on reducing malaria transmission have been difficult to assess. In this study on the history of malaria control in Vietnam, we assemble annual reporting on malaria case counts, coverage with insecticide-treated nets (ITN) and indoor residual spraying (IRS), and drug purchases by provincial malaria control programs from 1991 to 2010 in Vietnam's 20 southern provinces. We observe a significant negative association between artemisinin use and malaria incidence, with a 10% absolute increase in the purchase proportion of artemisinin-containing regimens being associated with a 29.1% (95% confidence interval: 14.8-41.0%) reduction in slide-confirmed malaria incidence, after accounting for changes in urbanization, ITN/IRS coverage, and two indicators of health system capacity. One budget-related indicator of health system capacity was found to have a smaller association with malaria incidence, and no other significant factors were found. Our findings suggest that including an artemisinin component in malaria drug regimens was strongly associated with reduced malaria incidence in southern Vietnam, whereas changes in urbanization and coverage with ITN or IRS were not.
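
To unpack the headline estimate, the snippet below assumes a log-linear incidence model (the abstract does not state the exact regression form) and back-calculates the implied coefficient from the reported association between a 10-percentage-point increase in artemisinin purchase share and a 29.1% reduction in incidence.

```python
# A hedged interpretation check, assuming a log-linear model of incidence (the
# abstract does not state the exact regression form). If a 10-percentage-point
# increase in the artemisinin purchase proportion multiplies incidence by
# (1 - 0.291), the implied coefficient per unit change (0 to 1) in that
# proportion is:
import math

irr_per_10pp = 1.0 - 0.291                     # incidence ratio for a +0.10 change
beta = math.log(irr_per_10pp) / 0.10           # coefficient on the 0-1 purchase proportion
print(f"implied beta ~= {beta:.2f}")
print(f"a 20-pp increase would imply a {1 - math.exp(beta * 0.20):.0%} reduction, "
      f"under this log-linear assumption")
```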


Subject(s)
Anti-Infective Agents/administration & dosage; Artemisinins/administration & dosage; Malaria/epidemiology; Mosquito Control; Plasmodium/drug effects; Animals; Case Management; Confidence Intervals; Culicidae/parasitology; Humans; Incidence; Insect Vectors/parasitology; Insecticide-Treated Bednets; Malaria/drug therapy; Malaria/transmission; Models, Statistical; Vietnam/epidemiology
14.
ACS Appl Mater Interfaces ; 6(9): 6257-63, 2014 May 14.
Article in English | MEDLINE | ID: mdl-24758478

ABSTRACT

We report a novel, low-resource malaria diagnostic platform inspired by the coffee ring phenomenon, selective for Plasmodium falciparum histidine-rich protein-II (PfHRP-II), a biomarker indicative of the P. falciparum parasite strain. In this diagnostic design, a recombinant HRP-II (rcHRP-II) biomarker is sandwiched between 1 µm Ni(II)nitrilotriacetic acid (NTA) gold-plated polystyrene microspheres (AuPS) and Ni(II)NTA-functionalized glass. After rcHRP-II malaria biomarkers had reacted with Ni(II)NTA-functionalized particles, a 1 µL volume of the particle-protein conjugate solution is deposited onto a functionalized glass slide. Drop evaporation produces the radial flow characteristic of coffee ring formation, and particle-protein conjugates are transported toward the drop edge, where, in the presence of rcHRP-II, particles bind to the Ni(II)NTA-functionalized glass surface. After evaporation, a wash with deionized water removes nonspecifically bound materials while maintaining the integrity of the surface-coupled ring produced by the presence of the protein biomarker. The dynamic range of this design was found to span 3 orders of magnitude, and rings are visible with the naked eye at protein concentrations as low as 10 pM, 1 order of magnitude below the 100 pM PfHRP-II threshold recommended by the World Health Organization. Key enabling features of this design are the inert and robust gold nanoshell to reduce nonspecific interactions on the particle surface, inclusion of a water wash step after drop evaporation to reduce nonspecific binding to the glass, a large diameter particle to project a large two-dimensional viewable area after ring formation, and a low particle density to favor radial flow toward the drop edge and reduce vertical settling to the glass surface in the center of the drop. This robust, antibody-free assay offers a simple user interface and clinically relevant limits of biomarker detection, two critical features required for low-resource malaria detection.


Subject(s)
Antigens, Protozoan/metabolism; Biomarkers/metabolism; Gold/chemistry; Malaria, Falciparum/parasitology; Plasmodium falciparum/metabolism; Polystyrenes/chemistry; Protozoan Proteins/metabolism; Titanium/chemistry; Animals; Microscopy, Electron, Transmission; Surface Properties