ABSTRACT
The ordered assembly of tau protein into filaments characterizes several neurodegenerative diseases, which are called tauopathies. It was previously reported that, by cryo-electron microscopy, the structures of tau filaments from Alzheimer's disease(1,2), Pick's disease(3), chronic traumatic encephalopathy(4) and corticobasal degeneration(5) are distinct. Here we show that the structures of tau filaments from progressive supranuclear palsy (PSP) define a new three-layered fold. Moreover, the structures of tau filaments from globular glial tauopathy are similar to those from PSP. The tau filament fold of argyrophilic grain disease (AGD) differs, instead resembling the four-layered fold of corticobasal degeneration. The AGD fold is also observed in ageing-related tau astrogliopathy. Tau protofilament structures from inherited cases of mutations at positions +3 or +16 in intron 10 of MAPT (the microtubule-associated protein tau gene) are also identical to those from AGD, suggesting that relative overproduction of four-repeat tau can give rise to the AGD fold. Finally, the structures of tau filaments from cases of familial British dementia and familial Danish dementia are the same as those from cases of Alzheimer's disease and primary age-related tauopathy. These findings suggest a hierarchical classification of tauopathies on the basis of their filament folds, which complements clinical diagnosis and neuropathology and also allows the identification of new entities, as we show for a case diagnosed as PSP, but with filament structures that are intermediate between those of globular glial tauopathy and PSP.
Subject(s)
Cryoelectron Microscopy , Protein Folding , Tauopathies/classification , tau Proteins/chemistry , tau Proteins/ultrastructure , Aged , Aged, 80 and over , Amino Acid Sequence , Dementia/genetics , Denmark , Female , Humans , Introns/genetics , Male , Middle Aged , Models, Molecular , Mutation , Protein Isoforms/chemistry , Protein Isoforms/ultrastructure , Supranuclear Palsy, Progressive , Tauopathies/pathology , United Kingdom
ABSTRACT
The American Society of Anesthesiologists (ASA) opposes automatic reversal of do-not-resuscitate orders during the perioperative period, instead advocating for a goal-directed approach that aligns decision-making with patients' priorities and clinical circumstances. Implementation of ASA guidelines continues to face significant barriers including time constraints, lack of longitudinal relationships with patients, and difficulty translating goal-focused discussion into concrete clinical plans. These challenges mirror those of advance care planning more generally, suggesting a need for novel frameworks for serious illness communication and patient-centered decision-making. This review considers ASA guidelines in the context of ongoing transitions to serious illness communication and increasingly multidisciplinary perioperative care. It aims to provide practical guidance for the practicing anesthesiologist while also acknowledging the complexity of decision-making, considering limitations inherent to anesthesiologists' role, and outlining a need to conceptualize delivery of ethically informed care as a collaborative, multidisciplinary endeavor.
Subject(s)
Resuscitation Orders , Humans , Resuscitation Orders/ethics , Practice Guidelines as Topic/standards , Perioperative Care/ethics , Perioperative Care/methods , Perioperative Care/standards
ABSTRACT
INTRODUCTION: Neoadjuvant chemoradiation therapy (NCRT) for cT1b esophageal cancer is not recommended despite the risk of pathologic upstaging with increased depth of penetration. We aimed to (1) define the rate of and factors associated with pathologic upstaging, (2) describe current trends in treatments, and (3) compare overall survival (OS) with and without NCRT for surgically resected cT1b lesions. METHODS: We used the 2020 National Cancer Database to identify patients with cT1b N0 esophageal cancer with or without pathologic upstaging who underwent removal of their tumor. We built multivariable logistic regression models to assess factors associated with pathologic upstaging. Survival was compared using log-rank analysis and modeled using multivariable Cox proportional hazards regressions. RESULTS: Out of 1106 patients with cT1b esophageal cancer, 17.3% (N = 191) had pathologic upstaging. A higher tumor grade (P = 0.002), greater tumor size (P < 0.001), and presence of lympho-vascular invasion (P < 0.001) were associated with pathologic upstaging. NCRT was used in 8.0% of patients (N = 114). Five-year OS was 49.4% for patients who received NCRT compared to 67.2% for upfront esophagectomy (P < 0.05). Pathologic upstaging was associated with decreased OS (pathologic upstaging 43.7% versus no pathologic upstaging 67.7%) (hazard ratio 2.12 [95% confidence interval, 1.70-2.65; P < 0.001]). Compared to esophagectomy, endoscopic local tumor excision was associated with a decreased OS (hazard ratio 1.50 [95% confidence interval, 1.19-1.89; P = 0.001]). CONCLUSIONS: Pathologic upstaging of cT1b lesions is associated with decreased OS. Esophagectomy is associated with a survival benefit over endoscopic local tumor excision for these lesions. NCRT is not associated with an increase in OS in cT1b lesions compared to upfront esophagectomy.
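The survival comparisons above rest on standard time-to-event machinery. As a minimal, self-contained illustration (not the study's actual NCDB analysis), the Kaplan-Meier estimator behind figures such as the five-year OS percentages can be sketched in pure Python; the toy cohort below is invented purely for illustration.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times:  follow-up time for each patient
    events: 1 if the patient died at that time, 0 if censored
    Returns a list of (event time, survival probability) pairs."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        n_with_t = sum(1 for tt, _ in data if tt == t)
        if deaths:
            # Multiply by the fraction of at-risk patients surviving time t.
            surv *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, surv))
        n_at_risk -= n_with_t  # remove deaths and censored cases from risk set
        i += n_with_t
    return curve

# Toy cohort: times in months, 1 = death, 0 = censored (illustrative only).
times = [6, 12, 12, 20, 30, 30, 42, 60]
events = [1, 1, 0, 1, 0, 1, 0, 0]
print(kaplan_meier(times, events))
```

Log-rank tests and Cox models build on the same risk-set bookkeeping; in practice, libraries such as R's survival package or Python's lifelines handle ties and censoring conventions.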
Subject(s)
Adenocarcinoma , Carcinoma, Squamous Cell , Esophageal Neoplasms , Humans , Neoadjuvant Therapy , Neoplasm Staging , Esophageal Neoplasms/surgery , Adenocarcinoma/surgery , Esophagectomy , Retrospective Studies , Treatment Outcome
ABSTRACT
Pathological tau accumulates in the brain in tauopathies such as Alzheimer's disease, Pick's disease, progressive supranuclear palsy and corticobasal degeneration, and forms amyloid-like filaments incorporating various post-translational modifications (PTMs). Cryo-electron microscopic (cryo-EM) studies have demonstrated that tau filaments extracted from tauopathy brains are characteristic of each disease and share a common fold within the same disease group. Furthermore, the tau PTM profile changes during tau pathology formation and disease progression, and disease-specific PTMs are detected in and around the filament core. In addition, templated seeding has been suggested to trigger pathological tau amplification and spreading in vitro and in vivo, although the molecular mechanisms are not fully understood. Recently, we reported that the cryo-EM structures of tau protofilaments in SH-SY5Y cells seeded with patient-derived tau filaments show a core structure resembling that of the original seeds. Here, we investigated PTMs of tau filaments accumulated in the seeded cells by liquid chromatography/tandem mass spectrometry and compared them with the PTMs of patient-derived tau filaments. Examination of insoluble tau extracted from SH-SY5Y cells showed that numerous phosphorylation, deamidation and oxidation sites detected in the fuzzy coat in the original seeds were well reproduced in SH-SY5Y cells. Moreover, templated tau filament formation preceded both truncation of the N- and C-termini of tau and PTMs in and around the filament core, indicating that these PTMs may predominantly be introduced after the degradation of the fuzzy coat.
Subject(s)
Alzheimer Disease , Neuroblastoma , Tauopathies , Humans , Alzheimer Disease/pathology , Brain/pathology , Neuroblastoma/metabolism , Neuroblastoma/pathology , Protein Processing, Post-Translational , tau Proteins/metabolism , Tauopathies/pathology
ABSTRACT
To read this article, you have to constantly direct your gaze at the words on the page. If you go for a run instead, your gaze will be less constrained, so many factors could influence where you look. We show that you are likely to spend less time looking at the path just in front of you when running alone than when running with someone else, presumably because the presence of the other runner makes foot placement more critical.
Subject(s)
Running , Humans , Running/physiology , Adult , Male , Female , Young Adult , Fixation, Ocular/physiology
ABSTRACT
Currently, all eligible goalball players compete together irrespective of their level of vision impairment, yet it remains unclear whether those with more impairment are disadvantaged during competition. Following the International Paralympic Committee's requirement for evidence-based, sport-specific classification, this study assessed whether individual goalball performance relates to the level of visual impairment. Using results from the 2016 and 2020 Paralympic Games, players' sport classes and in-competition key performance statistics (minutes played, throws per minute, goals per minute, penalties conceded per minute, blocks per minute, and goals per throw) were extracted. Players' visual acuity and visual field results were obtained through the IBSA Sport Administration System. Results showed no statistically significant differences in performance between classes. Further, there were no significant relationships between vision and performance for any of the six variables for female players. A small but significant positive correlation was found between visual acuity and the number of penalties conceded for male players. Collectively, the results suggest that currently eligible players compete fairly against one another during competitive goalball matches. Results provide support for the existing system of classification whereby all eligible athletes compete against each other irrespective of their level of impairment.
Subject(s)
Athletic Performance , Competitive Behavior , Visual Acuity , Humans , Athletic Performance/physiology , Male , Female , Visual Acuity/physiology , Competitive Behavior/physiology , Sports for Persons with Disabilities/classification , Sports for Persons with Disabilities/physiology , Vision Disorders , Visual Fields/physiology , Adult
ABSTRACT
Despite evidence that elite-level cricket umpires are highly accurate in making leg-before-wicket (LBW) judgements, there is limited understanding as to how they make these judgements. In this study, we explored the explicit LBW decision-making expertise of elite-level cricket umpires (N = 10) via 10 individual semi-structured interviews. Using thematic analysis, we aimed to identify the sources of information that umpires incorporate into their decision-making process. Results indicated that umpires engage in intentional pre-delivery information-gathering to guide their expectations, and to set context-specific parameters as to what would constitute an LBW dismissal. Not only do umpires use information about the ball trajectory, but they also use additional information about the condition of the pitch, the action-capabilities and susceptibilities of players, and the unique requirements of different match formats. Umpires reported employing a gaze-anchor strategy when gathering information for each delivery and described the processing of this information as initially intuitive, before engaging in deeper post-hoc reasoning. Findings highlight the importance of including contextual information when exploring officials' decisions and may inform future training interventions for cricket umpires.
Subject(s)
Cricket Sport , Decision Making , Judgment , Humans , Cricket Sport/physiology , Male , Female , Adult , Young Adult , Leg/physiology
ABSTRACT
As the development of drugs with a covalent mode of action is becoming increasingly popular, well-validated covalent fragment-based drug discovery (FBDD) methods have been comparatively slow to keep up with the demand. In this chapter the principles of covalent fragment reactivity, library design, synthesis, and screening methods are explored in depth, focussing on literature examples with direct applications to practical covalent fragment library design and screening. Further, questions about the future of the field are explored and potential useful advances are proposed.
Subject(s)
Drug Discovery , Small Molecule Libraries , Small Molecule Libraries/pharmacology , Drug Design
ABSTRACT
Alzheimer's disease (AD) is a genetically heterogeneous disorder characterized by early hippocampal atrophy and cerebral amyloid-β (Aβ) peptide deposition. Using TissueInfo to screen for genes preferentially expressed in the hippocampus and located in AD linkage regions, we identified a gene on 10q24.33 that we call CALHM1. We show that CALHM1 encodes a multipass transmembrane glycoprotein that controls cytosolic Ca²⁺ concentrations and Aβ levels. CALHM1 homomultimerizes, shares strong sequence similarities with the selectivity filter of the NMDA receptor, and generates a large Ca²⁺ conductance across the plasma membrane. Importantly, we determined that the CALHM1 P86L polymorphism (rs2986017) is significantly associated with AD in independent case-control studies of 3404 participants (allele-specific OR = 1.44, p = 2 × 10⁻¹⁰). We further found that the P86L polymorphism increases Aβ levels by interfering with CALHM1-mediated Ca²⁺ permeability. We propose that CALHM1 encodes an essential component of a previously uncharacterized cerebral Ca²⁺ channel that controls Aβ levels and susceptibility to late-onset AD.
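The allele-specific odds ratio quoted above is the standard cross-product ratio of a 2×2 allele-count table. A minimal sketch, using invented counts rather than the study's data, with the usual log-OR normal approximation for the 95% confidence interval:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 allele-count table:
        a = risk allele in cases,    b = other allele in cases,
        c = risk allele in controls, d = other allele in controls.
    Returns (OR, (lower, upper)) with a 95% CI from the log-OR
    normal approximation."""
    or_val = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_val) - 1.96 * se)
    upper = math.exp(math.log(or_val) + 1.96 * se)
    return or_val, (lower, upper)

# Hypothetical allele counts for illustration only -- not the CALHM1 data.
or_val, ci = odds_ratio(300, 1400, 200, 1350)
print(round(or_val, 2), tuple(round(x, 2) for x in ci))
```

In practice an allele-specific OR would come from genotype counts aggregated across cohorts, with the meta-analytic p-value computed separately.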
Subject(s)
Alzheimer Disease/genetics , Amyloid beta-Peptides/metabolism , Calcium/metabolism , Genetic Predisposition to Disease , Membrane Glycoproteins/genetics , Membrane Glycoproteins/metabolism , Polymorphism, Genetic , Aged , Aged, 80 and over , Amino Acid Sequence , Calcium Channels , Cell Membrane/metabolism , Chromosomes, Human, Pair 10 , Cytosol/metabolism , Female , Genome, Human , Humans , Male , Membrane Glycoproteins/chemistry , Middle Aged , Molecular Sequence Data , Phylogeny , Sequence Alignment
ABSTRACT
The house fly, Musca domestica, is a pest of livestock, transmits pathogens of human diseases, and is a model organism in multiple biological research areas. The first house fly genome assembly was published in 2014 and has been of tremendous use to the community of house fly biologists, but that genome is discontiguous and incomplete by contemporary standards. To improve the house fly reference genome, we sequenced, assembled, and annotated the house fly genome using improved techniques and technologies that were not available at the time of the original genome sequencing project. The new genome assembly is substantially more contiguous and complete than the previous genome. The new genome assembly has a scaffold N50 of 12.46 Mb, which is a 50-fold improvement over the previous assembly. In addition, the new genome assembly is within 1% of the estimated genome size based on flow cytometry, whereas the previous assembly was missing nearly one-third of the predicted genome sequence. The improved genome assembly has much more contiguous scaffolds containing large gene families. To provide an example of the benefit of the new genome, we used it to investigate tandemly arrayed immune gene families. The new contiguous assembly of these loci provides a clearer picture of the regulation of the expression of immune genes, and it leads to new insights into the selection pressures that shape their evolution.
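Scaffold N50, the contiguity statistic quoted above, is the scaffold length at which the length-sorted scaffolds first cover half of the total assembly. A minimal sketch with toy scaffold lengths (not the house fly assembly):

```python
def n50(lengths):
    """Return the N50: the length L such that scaffolds of length >= L
    together cover at least half of the total assembly size."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running * 2 >= total:  # reached >= 50% of the assembly
            return length
    return 0

# Toy scaffold lengths in bp -- illustrative only, not the real data.
scaffolds = [50, 40, 30, 20, 10]  # total 150; 50 + 40 = 90 covers half
print(n50(scaffolds))  # -> 40
```

A 50-fold N50 improvement, as reported here, means half the assembly now sits in scaffolds roughly fifty times longer, which is what makes tandemly arrayed gene families resolvable.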
ABSTRACT
BACKGROUND: Eye tracking is a promising method for objectively assessing functional visual capabilities, but its suitability remains unclear when assessing the vision of people with vision impairment. In particular, accurate eye tracking typically relies on a stable and reliable image of the pupil and cornea, which may be compromised by abnormalities associated with vision impairment (e.g., nystagmus, aniridia). OBJECTIVES: This study aimed to establish the degree to which video-based eye tracking can be used to assess visual function in the presence of vision impairment. DATA SOURCES: A systematic review was conducted using PubMed, EMBASE, and Web of Science databases, encompassing literature from inception to July 2022. STUDY ELIGIBILITY CRITERIA, PARTICIPANTS, AND INTERVENTIONS: Studies included in the review used video-based eye tracking, included individuals with vision impairment, and used screen-based tasks unrelated to practiced skills such as reading or driving. STUDY APPRAISAL AND SYNTHESIS METHODS: The included studies were assessed for quality using the Strengthening the Reporting of Observational Studies in Epidemiology assessment tool. Data extraction and synthesis were performed according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. RESULTS: Our analysis revealed that five common tests of visual function were used: (i) fixation stability, (ii) smooth pursuit, (iii) saccades, (iv) free viewing, and (v) visual search. The studies reported considerable success when testing individuals with vision impairment, yielding usable data from 96.5% of participants. LIMITATIONS: There was an overrepresentation of conditions affecting the optic nerve or macula and an underrepresentation of conditions affecting the anterior segment or peripheral retina. 
CONCLUSIONS AND IMPLICATIONS OF KEY FINDINGS: The results offer promise for the use of eye tracking to assess the visual function of a considerable proportion of those with vision impairment. Based on the findings, we outline a framework for how eye tracking can be used to test visual function in the presence of vision impairment.
Subject(s)
Eye-Tracking Technology , Nystagmus, Pathologic , Humans , Retina , Pursuit, Smooth
ABSTRACT
High-level athletes can predict the actions of an opposing player. Interestingly, such predictions are also reflected by the athlete's gaze behavior. In cricket, for example, players first pursue the ball with their eyes before they very often initiate two predictive saccades: one to the predicted ball-bounce point and a second to the predicted ball-bat-contact point. That means they move their eyes ahead of the ball and "wait" for the ball at the new fixation location, potentially using their peripheral vision to update information about the ball's trajectory. In this study, we investigated whether predictive saccades are linked to the processing of information in peripheral vision and whether predictive saccades are superior to continuously following the ball with foveal vision using smooth-pursuit eye-movements (SPEMs). In the first two experiments, we evoked the typical eye-movements observed in cricket and showed that the information gathered during SPEMs is sufficient to predict when the moving object will hit the target location and that (additional) peripheral monitoring of the object does not help to improve performance. In a third experiment, we show that it could actually be beneficial to use SPEMs rather than predictive saccades to improve performance. Thus, predictive saccades ahead of a target are unlikely to be performed to enhance the peripheral monitoring of the target.
Subject(s)
Psychomotor Performance , Saccades , Humans , Psychomotor Performance/physiology , Eye Movements , Pursuit, Smooth , Visual Perception
ABSTRACT
Scouts search for "sleepers": players who may be initially overlooked but ultimately exceed expectations. The psychological characteristics of those players are often neglected because they are difficult to observe, yet they hold promise for identifying sleepers given, for example, the self-regulation and perceptual-cognitive skills that such developing players might need to flourish. The aim of this study was to examine whether sleepers could be retrospectively identified using psychological characteristics. Ninety-five junior elite ice-hockey players (aged 15-16) were assessed on self-regulation and perceptual-cognitive skills before the yearly draft. Seventy players were drafted after the second round (37th or later). Three years later, professional scouts identified 15/70 sleepers they would now pick if given the chance. Those identified by the scouts showed higher self-regulation planning, and had distinguishable gaze behaviour (fewer fixations on more areas of interest) when performing a video-based decision-making task than other late-drafted players (84.3% correct classification; R² = .40). In addition, two latent profiles differentiated by self-regulation were found, with the profile with higher scores including 14/15 players selected by the scouts. Psychological characteristics were successful in retrospectively predicting sleepers, and may in future help scouts to make better selections of talent.
Subject(s)
Hockey , Humans , Hockey/physiology , Retrospective Studies , Aptitude
ABSTRACT
Footballers with vision impairment (VI) are eligible to compete in the Para sport if they meet the minimum impairment criteria (MIC) based on measures of their visual acuity (VA) and/or visual field. Despite the requirements of the International Paralympic Committee Athlete Classification Code that each sport uses an evidence-based classification system, VI football continues to use a medical-based system that lacks evidence to demonstrate the relationship between impairment and performance in the sport. The aim of this study was to systematically simulate vision loss to establish the minimum level of impairment that would affect performance in futsal. Nineteen skilled sighted players completed tests of individual technical skill and anticipation performance under six levels of simulated blur that decreased both VA and contrast sensitivity (CS). VA needed to be reduced to a level of acuity that represents worse vision than that currently used for inclusion in VI football before meaningful decreases in performance were observed. CS did not have a clear effect on football performance. These findings produce the first evidence for the minimum impairment criteria in VI football and suggest that a more severe degree of impairment may be required for the MIC.
Subject(s)
Para-Athletes , Soccer , Vision Disorders , Humans , Athletic Performance
ABSTRACT
BACKGROUND: Children born with Trisomy 13 or 18 (T13/18) often have multiple congenital anomalies, many of which drastically shorten their lifespan. Among these defects are cleft lip and palate, the repair of which presents an ethical dilemma to the surgeon given the underlying comorbidities associated with T13/18. The authors present an ethical discussion and institutional experience in navigating this dilemma. METHODS: The authors analyzed existing literature on T13 and T18 surgery and mortality. A retrospective study over ten years was also conducted to identify pediatric patients who underwent surgical correction of cleft lip and/or palate secondary to a confirmed diagnosis of T13/18. The authors identified two patients and examined their treatment course. RESULTS: The authors' review of the literature, coupled with their institution's experience, builds on the published successes of correcting cleft lip and palate in the setting of T13/18. Both patients identified in the case series underwent successful correction with no surgical complications. CONCLUSION: A careful balance must be struck between improved quality of life, benefits of treatment, and risks of surgery in children with T13/T18. Careful consideration should be given to the medical status of these complex patients. If the remaining medical comorbidities are well managed and under control, there is an ethical precedent for performing cleft lip and palate surgeries on these children. A diagnosis of T13/T18 alone is not enough to disqualify patients from cleft lip/palate surgery.
ABSTRACT
Para sport classification aims to minimize the impact of impairments on the outcome of competition. The International Paralympic Committee requires classification systems to be evidence based and sport specific, yet the sport of goalball uses a structure that is not supported by evidence demonstrating its legitimacy for competition. This study aimed to establish expert opinions on how a sport-specific system of classification should be structured in the sport of goalball. Using a three-round Delphi survey, 30 international experts expressed their views across topics linked to goalball classification. Participants were divided as to whether the current system fulfills the aim to minimize the impact of impairment on competition. Most felt that less impairment should be required to compete but that the one-class structure should remain. Experts identified measures of visual function that should be considered and 15 core components of individual goalball performance. Findings constitute a crucial first step toward evidence-based classification in goalball.
Subject(s)
Disabled Persons , Sports for Persons with Disabilities , Humans , Delphi Technique , Disabled Persons/classification , Disability Evaluation , Male , Female , Vision Disorders/classification
ABSTRACT
People often equate "Lean" with the tools that are used to create efficiencies and standardize processes. However, implementing tools represents at most 20 percent of the effort in Lean transformations. The other 80 percent is expended on changing leaders' practices and behaviors, and ultimately their mindset. Senior management has an essential role in establishing conditions that enable 80 percent of the effort to succeed. Their involvement includes establishing governance arrangements that cross divisional boundaries, supporting a thorough, long-term vision of the organization's value-producing processes, and holding everyone accountable for meeting Lean commitments. This is accomplished through regular, direct involvement. When upper management sets the example, durable Lean success and an increasingly Lean leadership mindset follow.
Subject(s)
Efficiency, Organizational , Leadership , Humans
ABSTRACT
For decades, quaternary ammonium compound (QAC)-based sanitizers have been broadly used in food processing environments to control foodborne pathogens such as Listeria monocytogenes. Still, there is a lack of consensus on the likelihood and implications of reduced Listeria susceptibility to benzalkonium chloride (BC) that may emerge due to sublethal exposure to the sanitizers in food processing environments. With a focus on fresh produce processing, we attempted to fill multiple data and evidence gaps surrounding the debate. We determined a strong correlation between tolerance phenotypes and known genetic determinants of BC tolerance with an extensive set of fresh produce isolates. We assessed BC selection on L. monocytogenes through a large-scale and source-structured genomic survey of 25,083 publicly available L. monocytogenes genomes from diverse sources in the United States. With the consideration of processing environment constraints, we monitored the temporal onset and duration of adaptive BC tolerance in both tolerant and sensitive isolates. Finally, we examined residual BC concentrations throughout a fresh produce processing facility at different time points during daily operation. While genomic evidence supports elevated BC selection and the recommendation for sanitizer rotation in the general context of food processing environments, it also suggests a marked variation in the occurrence and potential impact of the selection among different commodities and sectors. For the processing of fresh fruits and vegetables, we conclude that properly sanitized and cleaned facilities are less affected by BC selection and unlikely to provide conditions that are conducive for the emergence of adaptive BC tolerance in L. monocytogenes.
IMPORTANCE: Our study demonstrates an integrative approach to improve food safety assessment and control strategies in food processing environments through the collective leveraging of genomic surveys, laboratory assays, and processing facility sampling. In the example of assessing reduced Listeria susceptibility to a widely used sanitizer, this approach yielded multifaceted evidence that incorporates population genetic signals, experimental findings, and real-world constraints to help address a lasting debate of policy and practical importance.
Subject(s)
Listeria monocytogenes , Listeria , Listeria monocytogenes/genetics , Benzalkonium Compounds/pharmacology , Drug Resistance, Bacterial/genetics , Food Handling , Food Microbiology
ABSTRACT
Whole-genome sequencing (WGS) for public health surveillance and epidemiological investigation of foodborne pathogens predominantly relies on sequencing platforms that generate short reads. Continuous improvement of long-read nanopore sequencing, such as that of Oxford Nanopore Technologies (ONT), presents the potential to leverage multiple advantages of the technology in public health and food industry settings, including rapid turnaround and onsite applicability in addition to superior read length. Using an established cohort of Salmonella Enteritidis isolates for subtyping evaluation, we assessed the technical readiness of nanopore long-read sequencing for single nucleotide polymorphism (SNP) analysis and core-genome multilocus sequence typing (cgMLST) of a major foodborne pathogen. By multiplexing three isolates per flow cell, we generated sufficient sequencing depths in <7 h of sequencing for robust subtyping. SNP calls by ONT and Illumina reads were highly concordant despite homopolymer errors in ONT reads (R9.4.1 chemistry). In silico correction of such errors allowed accurate allelic calling for cgMLST and allelic difference measurements to facilitate heuristic detection of outbreak isolates.
IMPORTANCE: Evaluation, standardization, and implementation of the ONT approach to WGS-based, strain-level subtyping is challenging, in part due to its relatively high base-calling error rates and frequent iterations of sequencing chemistry and bioinformatic analytics. Our study established a baseline for the continuously evolving nanopore technology as a viable solution to high-quality subtyping of Salmonella, delivering comparable subtyping performance when used standalone or together with short-read platforms. This study paves the way for evaluating and optimizing the logistics of implementing the ONT approach for foodborne pathogen surveillance in specific settings.
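The allelic-difference measurements used for heuristic outbreak detection reduce, at their core, to counting loci with discordant allele calls between cgMLST profiles, ignoring loci with missing calls. A minimal sketch with hypothetical six-locus profiles (real cgMLST schemes for Salmonella span roughly 3,000 loci):

```python
def allelic_distance(profile_a, profile_b):
    """Number of cgMLST loci with different allele calls between two
    isolates, ignoring loci missing (None) in either profile."""
    diffs = 0
    for locus, allele_a in profile_a.items():
        allele_b = profile_b.get(locus)
        if allele_a is None or allele_b is None:
            continue  # missing call: locus cannot be compared
        if allele_a != allele_b:
            diffs += 1
    return diffs

# Hypothetical six-locus profiles; allele numbers are arbitrary labels.
iso1 = {"L1": 1, "L2": 5, "L3": 2, "L4": 7, "L5": None, "L6": 3}
iso2 = {"L1": 1, "L2": 5, "L3": 4, "L4": 7, "L5": 9, "L6": 1}
print(allelic_distance(iso1, iso2))  # differs at L3 and L6 -> 2
```

Outbreak-detection heuristics then typically cluster isolates whose pairwise distances fall below a scheme-specific threshold, which is why accurate allelic calling after homopolymer-error correction matters.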
Subject(s)
Nanopores , Salmonella enteritidis , Genome, Bacterial , High-Throughput Nucleotide Sequencing , Humans , Multilocus Sequence Typing , Polymorphism, Single Nucleotide , Salmonella enteritidis/genetics , Whole Genome Sequencing
ABSTRACT
Intracellular accumulation of abnormal proteins with conformational changes is the defining neuropathological feature of neurodegenerative diseases. The pathogenic proteins that accumulate in patients' brains adopt an amyloid-like fibrous structure and exhibit various ultrastructural features. The biochemical analysis of pathogenic proteins in sarkosyl-insoluble fractions extracted from patients' brains also shows disease-specific features. Intriguingly, these ultrastructural and biochemical features are common within the same disease group. These differences among the pathogenic proteins extracted from patients' brains have important implications for definitive diagnosis of the disease, and also suggest the existence of pathogenic protein strains that contribute to the heterogeneity of pathogenesis in neurodegenerative diseases. Recent experimental evidence has shown that prion-like propagation of these pathogenic proteins from host cells to recipient cells underlies the onset and progression of neurodegenerative diseases. The reproduction of the pathological features that characterize each disease in cellular and animal models of prion-like propagation also implies that the structural differences in the pathogenic proteins are inherited in a prion-like manner. In this review, we summarize the ultrastructural and biochemical features of pathogenic proteins extracted from the brains of patients with neurodegenerative diseases that accumulate abnormal forms of tau, α-synuclein, and TDP-43, and we discuss how these disease-specific properties are maintained in the brain, based on recent experimental insights.