ABSTRACT
Decoherence-free subspaces and subsystems (DFS) preserve quantum information by encoding it into symmetry-protected states unaffected by decoherence. An inherent DFS of a given experimental system may not exist; however, through the use of dynamical decoupling (DD), one can induce symmetries that support DFSs. Here, we provide the first experimental demonstration of DD-generated decoherence-free subsystem logical qubits. Utilizing IBM Quantum superconducting processors, we investigate two- and three-qubit DFS codes comprising up to six and seven noninteracting logical qubits, respectively. Through a combination of DD and error detection, we show that DFS logical qubits can achieve up to a 23% improvement in state preservation fidelity over physical qubits subject to DD alone. This constitutes a beyond-breakeven fidelity improvement for DFS-encoded qubits. Our results showcase the potential utility of DFS codes as a pathway toward enhanced computational accuracy via logical encoding on quantum processors.
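For orientation, the simplest textbook example of such a symmetry-protected encoding is the two-qubit DFS against collective dephasing. The sketch below is illustrative only and does not reproduce the specific two- and three-qubit codes of the abstract; the logical labels |0_L>, |1_L> and the collective phase phi are our notation.

```latex
% Collective dephasing applies the same phase to every physical qubit:
% |q_1 q_2>  ->  e^{i\phi(q_1 + q_2)} |q_1 q_2>.
% Encoding the logical qubit in the single-excitation subspace
\[
  |0_L\rangle = |01\rangle, \qquad |1_L\rangle = |10\rangle
\]
% means an arbitrary logical state acquires only an unobservable global phase:
\[
  \alpha|0_L\rangle + \beta|1_L\rangle \;\longmapsto\;
  e^{i\phi}\bigl(\alpha|0_L\rangle + \beta|1_L\rangle\bigr).
\]
```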
ABSTRACT
Deflectometric profilometers are used to precisely measure the form of beam-shaping optics for synchrotrons and X-ray free-electron lasers. They often utilize autocollimators, which measure slope by evaluating the displacement of a reticle image on a detector. Based on our privileged access to the raw image data of an autocollimator, we discuss novel strategies for reducing systematic measurement errors by using a set of overlapping images of the reticle obtained at different positions on the detector. It is demonstrated that imaging properties such as geometrical distortions and vignetting can be extracted from this redundant set of images without recourse to external calibration facilities. This approach rests on the fact that the properties of the reticle itself do not change: all changes in the reticle image are due to the imaging process. Firstly, by combining interpolation and correlation, it is possible to determine the shift of a reticle image relative to a reference image with minimal error propagation. Secondly, the intensity of the reticle image is analysed as a function of its position on the CCD and a vignetting correction is calculated. Thirdly, the size of the reticle image is analysed as a function of its position and an imaging distortion correction is derived. It is demonstrated that, for different measurement ranges and aperture diameters of the autocollimator, the systematic errors can be reduced by up to a factor of four to five without recourse to external measurements.
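The first step, determining the shift of a reticle image relative to a reference image by combining interpolation and correlation, can be sketched generically as follows. This is a one-dimensional simplification with function and parameter names of our own choosing, not the authors' implementation.

```python
import numpy as np

def subpixel_shift(profile, reference):
    """Estimate the shift of a 1-D intensity profile (numpy array) relative to
    a reference profile via cross-correlation, refined to sub-pixel resolution
    by a parabolic fit around the correlation peak (illustrative sketch only)."""
    a = profile - profile.mean()        # remove offsets so the peak reflects
    b = reference - reference.mean()    # structure, not overall brightness
    corr = np.correlate(a, b, mode="full")
    k = int(np.argmax(corr))            # integer-pixel peak position
    if 0 < k < len(corr) - 1:
        # Parabola through the peak and its two neighbours gives a
        # sub-pixel offset in [-0.5, 0.5].
        y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
        delta = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
    else:
        delta = 0.0
    # Index in the 'full' correlation minus (len(reference) - 1) is the lag.
    return (k + delta) - (len(reference) - 1)
```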
ABSTRACT
Circulating tumor DNA (ctDNA) is a promising biomarker reflecting the presence of tumor cells. Sequencing-based detection of ctDNA at low tumor fractions is challenging due to the raw error rate of sequencing. To mitigate this challenge, we developed ultra-deep mutation-integrated sequencing (UMIseq), a fixed-panel deep targeted sequencing approach that is universally applicable to all colorectal cancer (CRC) patients. UMIseq features UMI-mediated error correction, exclusion of mutations related to clonal hematopoiesis, a panel of normal samples for error modeling, and signal integration from single-nucleotide variations, insertions, deletions, and phased mutations. UMIseq was trained and independently validated on pre-operative (pre-OP) plasma from CRC patients (n = 364) and healthy individuals (n = 61). UMIseq displayed an area under the curve above 0.95 for allele frequencies (AFs) down to 0.05%. In the training cohort, the pre-OP detection rate reached 80% at 95% specificity, compared with 70% in the validation cohort. UMIseq enabled the detection of AFs down to 0.004%. To assess the potential for detecting residual disease, 26 post-operative plasma samples from stage III CRC patients were analyzed; in these samples, detection of ctDNA was associated with recurrence. In conclusion, UMIseq demonstrated robust performance with high sensitivity and specificity, enabling the detection of ctDNA at low allele frequencies.
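The UMI-mediated error-correction step can be illustrated with a generic consensus-collapsing sketch. The function name, family-size threshold, and agreement threshold below are illustrative assumptions, not the UMIseq implementation.

```python
from collections import Counter, defaultdict

def umi_consensus(reads, min_family_size=3, min_agreement=0.9):
    """Collapse reads sharing a UMI into consensus sequences (generic sketch).
    `reads` is an iterable of (umi, sequence) pairs in which all sequences
    cover the same target region and have equal length."""
    families = defaultdict(list)
    for umi, seq in reads:
        families[umi].append(seq)

    consensuses = {}
    for umi, seqs in families.items():
        if len(seqs) < min_family_size:
            continue  # too few reads in this family to correct errors reliably
        consensus = []
        for bases in zip(*seqs):              # walk the family column by column
            base, count = Counter(bases).most_common(1)[0]
            # Keep the base only if the family agrees; otherwise mark ambiguous.
            consensus.append(base if count / len(bases) >= min_agreement else "N")
        consensuses[umi] = "".join(consensus)
    return consensuses
```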
Subjects
Biomarkers, Tumor; Circulating Tumor DNA; Colorectal Neoplasms; High-Throughput Nucleotide Sequencing; Mutation; Humans; Colorectal Neoplasms/genetics; Colorectal Neoplasms/blood; Colorectal Neoplasms/diagnosis; Circulating Tumor DNA/genetics; Circulating Tumor DNA/blood; High-Throughput Nucleotide Sequencing/methods; Male; Female; Biomarkers, Tumor/blood; Biomarkers, Tumor/genetics; Aged; Middle Aged; Adult; Gene Frequency; Aged, 80 and over; Cell-Free Nucleic Acids/genetics; Cell-Free Nucleic Acids/blood; Sensitivity and Specificity
ABSTRACT
Accurate circulating tumor DNA (ctDNA) detection has immense biomarker potential in all phases of the cancer disease course. The presence of ctDNA in the blood has been shown to have prognostic value in various cancer types, as it may reflect the actual tumor burden. There are two main approaches to consider: tumor-informed and tumor-agnostic analysis of ctDNA. Both exploit the short half-life of circulating cell-free DNA (cfDNA)/ctDNA for disease monitoring and, ultimately, future clinical treatment intervention. Urothelial carcinoma is characterized by a broad mutation spectrum but very few hotspot mutations, which limits the tumor-agnostic use of hotspot mutations or fixed gene sets for ctDNA detection. Here we focus on a tumor-informed analysis for ultrasensitive patient- and tumor-specific ctDNA detection using personalized mutation panels, i.e., probes that bind to specific genomic sequences to enrich for the regions of interest. In this chapter, we describe methods for purification of high-quality cfDNA and guidelines for designing tumor-informed customized capture panels for sensitive detection of ctDNA. Furthermore, a detailed protocol for library preparation and panel capture utilizing a double enrichment strategy with low amplification is described.
Subjects
Carcinoma, Transitional Cell; Cell-Free Nucleic Acids; Circulating Tumor DNA; Urinary Bladder Neoplasms; Humans; Circulating Tumor DNA/genetics; Mutation; Biomarkers, Tumor/genetics
ABSTRACT
Wisconsin card-sorting tasks provide unique opportunities to study cognitive flexibility and its limitations, which express themselves behaviorally as perseverative errors (PE). PE are those behavioral errors on Wisconsin card-sorting tasks that are committed when cognitive rules are maintained even though recently received outcomes demand a switch to other rules (i.e., cognitive perseveration). We explored error-suppression effects (ESE) across three Wisconsin card-sorting studies. ESE refer to the phenomenon that PE are reduced on repetitive trials compared to non-repetitive trials. We replicated ESE in all three Wisconsin card-sorting studies. Study 1 revealed that non-associative accounts of ESE, in particular the idea that cognitive inhibition may account for them, are not tenable. Study 2 suggested that models of instrumental learning are among the most promising associative accounts of ESE. According to dual-process models of instrumental learning, instrumental learning comprises goal-directed control and the formation of corresponding associative memories over and above the formation of habitual memories. Study 3 showed that cognitive, rather than motor, representations of responses should be conceptualized as the elements entering goal-directed instrumental memories. Collectively, the results imply that ESE on Wisconsin card-sorting tasks are not only a highly replicable phenomenon but also provide an opportunity to study the cognitive mechanisms of goal-directed instrumental control. Based on the reported data, we present a novel theory of cognitive perseveration (the 'goal-directed instrumental control', or GIC, model), which is outlined in the Concluding Discussion.
ABSTRACT
This is a brief account of Turing's ideas on biological pattern and the events that led to their wider acceptance by biologists as a valid way to investigate developmental pattern, and of the value of theory more generally in biology. Periodic patterns have played a key role in this process, especially 2D arrays of oriented stripes, which proved a disappointment in theoretical terms in the case of Drosophila segmentation, but a boost to theory as applied to skin patterns in fish and model chemical reactions. The concept of "order from fluctuations" is a key component of Turing's theory, wherein pattern arises by selective amplification of spatial components concealed in the random disorder of molecular and/or cellular processes. For biological examples, a crucial point from an analytical standpoint is knowing the nature of the fluctuations, where the amplifier resides, and the timescale over which selective amplification occurs. The answer clarifies the difference between "inelegant" examples such as Drosophila segmentation, which is perhaps better understood as a programmatic assembly process, and "elegant" ones expressible in equations like Turing's: that the fluctuations and selection process occur predominantly in evolutionary time for the former, but in real time for the latter, and likewise for error suppression, which for Drosophila is historical, in being lodged firmly in past evolutionary events. The prospects for a further extension of Turing's ideas to the complexities of brain development and consciousness are discussed, where a case can be made that it could well be in neuroscience that his ideas find their most important application.
ABSTRACT
The error-attenuation capacity of a target tracking system is a key indicator of the system's tracking precision. Without changing the system's feedback control structure, the first-order integral control that is widely used in traditional tracking systems cannot achieve higher precision for fast, highly maneuverable targets. The work described in this paper addresses this problem and proposes a cascade lag control scheme of one or more orders to raise the system's active error-suppression capacity in the low-frequency range. By substituting cascade lag controllers for an additional integral operator, a system with a higher amplitude ratio, and hence higher tracking precision, is obtained without loss of stability. Because designing the many parameters involved is a difficult task, the concept of relative order and a configuration proportion law are proposed to simplify both the analysis and the parameter tuning. The relationship between the relative order and system performance is given. The efficiency of the multi-order cascade lag control scheme is demonstrated both in theoretical analysis and in experiments on an electro-optical tracking system.
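A standard way to see why cascaded lag stages raise the low-frequency amplitude ratio without the stability penalty of an additional integrator is the generic lag-compensator form below; the symbols z, p, and n are our notation, not the paper's.

```latex
% Generic n-stage cascade lag compensator (illustrative, not the paper's design):
\[
  C(s) \;=\; \left(\frac{s + z}{s + p}\right)^{n}, \qquad z > p > 0,
\]
% Each stage boosts the low-frequency gain by z/p while leaving the
% high-frequency gain near unity, so low-frequency error suppression improves:
\[
  |C(j\omega)| \xrightarrow{\;\omega \to 0\;} \left(\frac{z}{p}\right)^{n} > 1,
  \qquad
  |C(j\omega)| \xrightarrow{\;\omega \to \infty\;} 1 .
\]
% Unlike an extra integrator 1/s, the phase lag is confined to the band
% between p and z and vanishes well above z, preserving the stability margin.
```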
ABSTRACT
BACKGROUND: There is currently no method to precisely measure the errors that occur in the sequencing instrument/sequencer, which is critical for next-generation sequencing applications aimed at discovering the genetic makeup of heterogeneous cellular populations. RESULTS: We propose a novel computational method, SequencErr, to address this challenge by measuring the base correspondence between overlapping regions in forward and reverse reads. An analysis of 3777 public datasets from 75 research institutions in 18 countries revealed the sequencer error rate to be ~ 10 per million (pm) and 1.4% of sequencers and 2.7% of flow cells have error rates > 100 pm. At the flow cell level, error rates are elevated in the bottom surfaces and > 90% of HiSeq and NovaSeq flow cells have at least one outlier error-prone tile. By sequencing a common DNA library on different sequencers, we demonstrate that sequencers with high error rates have reduced overall sequencing accuracy, and removal of outlier error-prone tiles improves sequencing accuracy. We demonstrate that SequencErr can reveal novel insights relative to the popular quality control method FastQC and achieve a 10-fold lower error rate than popular error correction methods including Lighter and Musket. CONCLUSIONS: Our study reveals novel insights into the nature of DNA sequencing errors incurred on DNA sequencers. Our method can be used to assess, calibrate, and monitor sequencer accuracy, and to computationally suppress sequencer errors in existing datasets.
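The core idea, measuring base correspondence in the region where the forward and reverse reads of a pair overlap, can be sketched generically. The function below assumes the reads are already oriented and trimmed to the overlapping region; it is an illustration of the principle, not the SequencErr implementation.

```python
def overlap_disagreement_rate(pairs):
    """Measure base disagreement between forward and reverse reads of the same
    pair over their overlapping region (generic sketch). `pairs` is an iterable
    of (fwd, rev) strings already oriented and trimmed so that position i of
    `fwd` should report the same base as position i of `rev`."""
    mismatches = 0
    compared = 0
    for fwd, rev in pairs:
        for f, r in zip(fwd, rev):
            if f == "N" or r == "N":
                continue                  # skip uncalled bases
            compared += 1
            if f != r:
                mismatches += 1
    # Both reads observe the same DNA fragment, so any disagreement reflects
    # an error incurred on the instrument in at least one of the two reads.
    return 1e6 * mismatches / compared if compared else 0.0   # per million bases
```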
Subjects
High-Throughput Nucleotide Sequencing/methods; Algorithms; Calibration; Gene Library; Humans; Models, Genetic; SARS-CoV-2; Sequence Analysis, DNA/methods
ABSTRACT
We explored short-term behavioral plasticity on the Modified Wisconsin Card Sorting Test (M-WCST) by deriving novel error metrics that stratify traditional set-loss and perseverative errors. Separating the rule set and the response set allowed performance to be measured across four trial types, crossing rule set (i.e., maintain vs. switch) and response demand (i.e., repeat vs. alternate). Critically, these four trial types can be grouped based on trial-wise feedback on t - 1 trials. Rewarded (correct) maintain t - 1 trials should lead to error enhancement when the response demands shift from repeat to alternate. In contrast, punished (incorrect) t - 1 trials should lead to error suppression when the response demands shift from repeat to alternate. The results supported the error-suppression prediction: an error suppression effect (ESE) was observed across numerous patient samples. Exploratory analyses showed that the ESE did not share substantial portions of variance with traditional neuropsychological measures of executive functioning. They further suggest that striatal or limbic circuit neuropathology may be associated with an enhanced ESE. These data suggest that punishment of the recently executed response induces behavioral avoidance, which is detectable as the ESE on the WCST. Assessment of the ESE might provide an index of response-related avoidance learning on the WCST.
ABSTRACT
BACKGROUND: Ultra-deep next-generation sequencing of circulating tumor DNA (ctDNA) holds great promise as a tool for the early detection of cancer and for monitoring disease progression and therapeutic responses. However, the low abundance of ctDNA in the bloodstream, coupled with technical errors introduced during library construction and sequencing, complicates mutation detection. RESULTS: To achieve highly accurate variant calling by better distinguishing low-frequency ctDNA mutations from background errors, we introduce TNER (Tri-Nucleotide Error Reducer), a novel background error suppression method that provides a robust estimate of background noise to reduce sequencing errors. Results on both simulated data and real data from healthy subjects demonstrate that the proposed algorithm consistently outperforms a current, state-of-the-art, position-specific error-polishing model, particularly when the sample size of healthy subjects is small. CONCLUSIONS: TNER significantly enhances the specificity of downstream ctDNA mutation detection without sacrificing sensitivity. The tool is publicly available at https://github.com/ctDNA/TNER .
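A minimal sketch of tri-nucleotide background error suppression follows, assuming pooled error counts per tri-nucleotide context from healthy subjects and a simple binomial test; the data structures, thresholds, and function names are our illustrative assumptions, not the TNER code.

```python
from scipy.stats import binom

def call_variants(candidates, healthy_counts, alpha=1e-5):
    """Toy tri-nucleotide background-error filter (illustrative sketch).
    `healthy_counts` maps a tri-nucleotide context, e.g. 'ACG', to
    (error_reads, total_reads) pooled across healthy subjects; `candidates`
    is a list of (context, alt_reads, depth) tuples from the plasma sample."""
    calls = []
    for context, alt_reads, depth in candidates:
        err, total = healthy_counts.get(context, (0, 0))
        # Background error rate for this context, with a small pseudocount.
        bg_rate = (err + 1) / (total + 2)
        # P(X >= alt_reads) under the background model; keep the candidate
        # only if its alt-read count is very unlikely to be background noise.
        p_value = binom.sf(alt_reads - 1, depth, bg_rate)
        if p_value < alpha:
            calls.append((context, alt_reads, depth))
    return calls
```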