ABSTRACT
Addressing health inequity is a central component of the Sustainable Development Goals and a priority of the World Health Organization (WHO). WHO supports countries in strengthening their health information systems in order to better collect, analyze and report health inequality data. Improving information and research about health inequality is crucial to identifying and addressing the inequalities that lead to poorer health outcomes. Building the analytical capacities of individuals, particularly in low-resource areas, empowers them to build a stronger evidence base, leading to more informed policy and programme decision-making. However, health inequality analysis requires a unique set of skills and knowledge. This paper describes three resources developed by WHO to support the analysis of inequality data by non-statistical users using Microsoft Excel, a widely used and accessible software programme. The resources comprise a practical eLearning course that trains learners in the preparation and reporting of disaggregated data using Excel; an Excel workbook that takes users step by step through the calculation of 21 summary measures of health inequality; and a workbook that automatically calculates these measures from the user's disaggregated dataset. The utility of the resources is demonstrated through an empirical example.
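The simplest of the summary measures can be illustrated in a few lines. Below is a minimal Python sketch (outside Excel, purely to show the arithmetic) of two of the measures the workbook covers, difference and ratio between the most- and least-advantaged subgroups, applied to hypothetical disaggregated coverage estimates:

```python
# Sketch of two basic summary measures of health inequality -- the
# difference (D) and ratio (R) between the best- and worst-off
# subgroups. The coverage data below are hypothetical; the WHO workbook
# covers 21 measures, of which these are the two simplest.

def difference_and_ratio(subgroups):
    """subgroups: dict mapping subgroup name -> indicator estimate."""
    highest = max(subgroups.values())
    lowest = min(subgroups.values())
    return highest - lowest, highest / lowest

# Hypothetical coverage estimates (%) disaggregated by wealth quintile.
coverage = {"Q1 (poorest)": 40.0, "Q2": 55.0, "Q3": 65.0,
            "Q4": 72.0, "Q5 (richest)": 80.0}

d, r = difference_and_ratio(coverage)
print(f"Difference: {d:.1f} percentage points, Ratio: {r:.2f}")
# Difference: 40.0 percentage points, Ratio: 2.00
```

More sophisticated measures (e.g. regression-based or population-weighted indices) follow the same pattern of operating on one row per subgroup, which is why a spreadsheet is a natural host for them.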
Subject(s)
Health Status Disparities , Software , World Health Organization , Humans
ABSTRACT
In Taiwanese higher education, many universities have college students learn Excel in a self-directed way, and the current Excel curriculum relies mainly on self-directed learning. In this study, we designed the digital game "Legendary Wizard Excel" and used a certified Excel textbook as the comparison tool. The game integrates role-play with cognitive scaffolding to help learners acquire Excel skills, whereas the textbook was "Excel Expert" from the Microsoft Office Specialist certification series. We compared learning effectiveness, flow state, and technology acceptance between the two tools with 187 college students and found that: (1) the game achieved high technology acceptance; (2) both groups of learners improved significantly in learning effectiveness and were engaged in the activity; (3) learners in the game-based learning group achieved higher learning effectiveness than learners in the textbook group; and (4) learners in the game-based learning group were more engaged in the activity than learners in the textbook group. We therefore look forward to bringing these results to higher education and workplace training to enhance Excel skills.
ABSTRACT
Clinical research using restricted diffusion-weighted imaging, especially diffusion kurtosis (DK) imaging, has been progressing, with reports on its effectiveness in the diagnostic imaging of cerebral infarctions, neurodegenerative diseases, and tumors, among others. However, the application of DK imaging in daily clinical practice has not spread because of the long imaging time required and the use of specific software for image creation. Herein, with the aim of promoting clinical research using DK imaging at any medical facility, we evaluated fast DK imaging using a new software program. We developed a new macro program that produces DK images using general-purpose, inexpensive software (Microsoft Excel and ImageJ), and we evaluated fast DK imaging using bio-phantoms and a healthy volunteer in clinical trials. The DK images created by the new software with diffusion-weighted images captured with short-time imaging sequences were similar to the original DK images captured with long-time imaging sequences. The DK images using three b-values, which can reduce the imaging time by 43%, were equivalent to the DK images using five b-values. The DK imaging technique developed herein might allow any medical facility to increase its daily clinical use of DK imaging and easily conduct clinical research.
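The arithmetic behind a per-pixel DK fit can be sketched briefly. Assuming the standard diffusion kurtosis signal representation ln S(b) = ln S0 − bD + (b²D²K)/6, an acquisition with exactly three b-values determines D and K by fitting a quadratic through the three (b, ln S) points — the same computation a spreadsheet macro can perform cell by cell. The signal values below are synthetic, not from the study:

```python
import math

def fit_dk(bvals, signals):
    """Fit ln S(b) = ln S0 - b*D + (b^2 * D^2 * K) / 6 exactly through
    three (b, S) points and return (D, K)."""
    b0, b1, b2 = bvals
    y0, y1, y2 = (math.log(s) for s in signals)
    s01 = (y1 - y0) / (b1 - b0)          # first divided difference
    s02 = (y2 - y0) / (b2 - b0)
    c2 = (s02 - s01) / (b2 - b1)         # quadratic coefficient = D^2*K/6
    c1 = s01 - c2 * (b0 + b1)            # linear coefficient = -D
    D = -c1
    K = 6.0 * c2 / (D * D)
    return D, K

# Synthetic three-b-value signal (b in s/mm^2; made-up tissue values).
D_true, K_true, S0 = 1.0e-3, 0.8, 1000.0
b = (0.0, 1000.0, 2000.0)
S = [S0 * math.exp(-bi * D_true + (bi ** 2 * D_true ** 2 * K_true) / 6)
     for bi in b]
D, K = fit_dk(b, S)
print(f"D = {D:.2e} mm^2/s, K = {K:.3f}")
```

With more than three b-values the fit becomes a least-squares problem, but the three-point case shows why reducing the protocol to three b-values still yields well-defined D and K maps.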
Subject(s)
Diffusion Magnetic Resonance Imaging , Software , Diffusion , Diffusion Magnetic Resonance Imaging/methods , Humans , Phantoms, Imaging
ABSTRACT
The sympathetic nervous system is important for the beat-by-beat regulation of arterial blood pressure and the control of blood flow to various organs. Microneurographic recordings of pulse-synchronous muscle sympathetic nerve activity (MSNA) are used by numerous laboratories worldwide. The transduction of hemodynamic and vascular responses elicited by spontaneous bursts of MSNA provides novel, mechanistic insight into sympathetic neural control of the circulation. Although some of these laboratories have developed in-house software programs to analyze these sympathetic transduction responses, they are not openly available and most require higher-level programming skills and/or costly platforms. In the present paper, we present an open-source, Microsoft Excel-based analysis program designed to examine the pressor and/or vascular responses to spontaneous resting bursts of MSNA, including across longer, continuous MSNA burst sequences, as well as following heartbeats not associated with MSNA bursts. An Excel template with embedded formulas is provided. Detailed written and video-recorded instructions are provided to facilitate use and promote implementation among the research community. Open science activities such as the dissemination of analytical programs and instructions may assist other laboratories in their pursuit to answer novel and impactful research questions regarding sympathetic neural control strategies in human health and disease. NEW & NOTEWORTHY: The pressor responses to spontaneous bursts of muscle sympathetic nerve activity provide important information regarding sympathetic regulation of the circulation. Many laboratories worldwide quantify sympathetic neurohemodynamic transduction using in-house, customized software requiring high-level programming skills and/or costly computer programs.
To overcome these barriers, this study presents a simple, open-source, Microsoft Excel-based analysis program along with video instructions to assist researchers without the necessary resources to quantify sympathetic neurohemodynamic transduction.
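The core signal-averaging step of such transduction analyses can be sketched compactly. The following is a minimal illustration, not the Excel template's actual formulas: given beat-by-beat mean arterial pressure (MAP) and the indices of beats carrying MSNA bursts, it tracks the average change in MAP over the cardiac cycles that follow each burst. All data below are hypothetical:

```python
# Sketch of burst-triggered averaging for sympathetic transduction:
# mean change in MAP at 1..n_lags beats after each MSNA burst.
# Real analyses align beat-by-beat MAP with burst-annotated integrated
# neurograms; the series and burst indices here are made up.

def transduction(map_beats, burst_idx, n_lags=3):
    """Mean MAP change at 1..n_lags beats after each burst."""
    deltas = [[] for _ in range(n_lags)]
    for i in burst_idx:
        for lag in range(1, n_lags + 1):
            if i + lag < len(map_beats):
                deltas[lag - 1].append(map_beats[i + lag] - map_beats[i])
    return [sum(d) / len(d) for d in deltas]

map_beats = [90, 91, 94, 96, 93, 90, 92, 95, 97, 94]  # mmHg, hypothetical
bursts = [1, 6]                                        # beats with MSNA bursts
print(transduction(map_beats, bursts))  # mean dMAP at lags 1-3
# [3.0, 5.0, 2.0]
```

The same averaging applied to heartbeats without bursts provides the non-burst comparison responses described above.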
Subject(s)
Electrocardiography/methods , Software , Sympathetic Nervous System/physiology , Action Potentials , Animals , Heart/innervation , Heart/physiology , Humans
ABSTRACT
The near universal availability of UV-Visible spectrophotometers makes this instrument a highly exploited tool for the inexpensive, rapid examination of iron-sulfur clusters. Yet, the analysis of iron-sulfur cluster reconstitution experiments by UV-Vis spectroscopy is notoriously difficult due to the presence of broad, ill-defined peaks. Other types of spectroscopies, such as electron paramagnetic resonance spectroscopy and Mössbauer spectroscopy, are superior in characterizing the type of cluster present and their associated electronic transitions but require expensive, less readily available equipment. Here, we describe a tool that utilizes the accessible and convenient platform of Microsoft Excel to allow for the semi-quantitative analysis of iron-sulfur clusters by UV-Vis spectroscopy. This tool, which we call Fit-FeS, could potentially be used to additionally decompose spectra of solutions containing chromophores other than iron-sulfur clusters.
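The idea behind such semi-quantitative decomposition can be sketched in miniature. The following is an illustrative two-component linear least-squares fit (closed-form normal equations), not Fit-FeS's actual implementation, and the reference spectra are invented for the example:

```python
# Sketch of spectral decomposition: express a measured UV-Vis spectrum
# as a linear combination of reference component spectra and solve for
# the mixing coefficients by least squares (2 components, closed form).
# Reference spectra below are made up for illustration.

def fit_two_components(measured, ref1, ref2):
    a11 = sum(x * x for x in ref1)
    a12 = sum(x * y for x, y in zip(ref1, ref2))
    a22 = sum(y * y for y in ref2)
    b1 = sum(x * m for x, m in zip(ref1, measured))
    b2 = sum(y * m for y, m in zip(ref2, measured))
    det = a11 * a22 - a12 * a12          # normal-equation determinant
    c1 = (a22 * b1 - a12 * b2) / det
    c2 = (a11 * b2 - a12 * b1) / det
    return c1, c2

ref_2Fe2S = [0.10, 0.30, 0.20, 0.05]   # hypothetical component spectra
ref_4Fe4S = [0.05, 0.10, 0.25, 0.15]
measured = [2 * a + 3 * b for a, b in zip(ref_2Fe2S, ref_4Fe4S)]
c1, c2 = fit_two_components(measured, ref_2Fe2S, ref_4Fe4S)
print(f"coefficients: {c1:.2f}, {c2:.2f}")
# coefficients: 2.00, 3.00
```

Because the fit is linear in the component spectra, it extends naturally to other chromophores, as the closing sentence of the abstract suggests.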
Subject(s)
Iron/chemistry , Sulfur/chemistry , Electron Spin Resonance Spectroscopy , Ferrous Compounds/chemistry , Molecular Conformation , Peptide Library , Peptides/chemistry , Spectrophotometry, Ultraviolet
ABSTRACT
Coupled with the reduction in sequencing costs, the number of RAD-seq analyses has been surging, generating vast genetic knowledge for many crops. Despite the growing interest in using datasets obtained by high-throughput sequencing, specialized platforms can be intimidating to non-expert users and difficult to set up on each computer. Therefore, RAD-R scripts were developed on Windows 10 for RAD-seq analysis, allowing users who are not familiar with bioinformatics to easily analyze big sequence data. These RAD-R scripts, which run a pipeline from the raw sequence reads of an F2 population of self-fertilizing plants through linkage map construction to QTL analysis, can also be useful to many users with limited experience owing to the simplicity of copying Excel cells into the R console. In a comparison of linkage maps constructed by the RAD-R scripts and Stacks, the RAD-R scripts were shown to construct the linkage map with less missing genotype data and a shorter total genetic distance. QTL analysis results can be easily obtained by selecting reliable genotype data, visually inferred to be appropriate for error correction, from the genotype data files created by the RAD-R scripts.
ABSTRACT
BACKGROUND & AIMS: Variants in STAT4 (rs7574865) have been associated with seroconversion to hepatitis B e antigen (HBeAg) and reduction in levels of hepatitis B virus (HBV) DNA in patients with chronic infection treated with interferon alpha (IFNA). We evaluated the associations among rs7574865, loss of HB surface antigen (HBsAg, a marker of functional cure of HBV infection), and response to treatment with pegylated IFNA (PegIFN) or nucleos(t)ide analogues (NUCs) in HBeAg-positive patients with chronic HBV infection. METHODS: We performed a retrospective analysis of 1823 HBeAg-positive patients with chronic HBV infection (954 patients treated with PegIFN and 869 patients treated with NUCs) included in 4 phase-4 multicenter randomized controlled trials. The Cochran-Armitage trend test was used to evaluate the association of rs7574865 genotype with combined response (CR, defined as HBeAg seroconversion and HBV DNA level <2000 IU/mL) and loss of HBsAg at week 72, for patients given PegIFN, or week 104, for patients given NUCs. RESULTS: We found a significant association between rs7574865 genotype and CR (P = .004) and loss of HBsAg (P = .037) in patients treated with PegIFN. In patients with HBV genotype B infection, 43.6% of those with rs7574865 TT achieved a CR, compared with 20.5% of those with rs7574865 GG, and 7.7% had loss of HBsAg, compared with 1.9%. However, in patients treated with NUCs, we found no association of rs7574865 genotype with CR (P = .811) or loss of HBsAg (P = .439). CONCLUSIONS: In a retrospective analysis of data from 4 clinical trials, we found rs7574865 in STAT4 to be associated with functional cure of chronic HBV infection by PegIFN treatment, but not NUCs treatment, in HBeAg-positive patients with HBV genotype B infection.
Subject(s)
Antiviral Agents/therapeutic use , Hepatitis B, Chronic/drug therapy , Hepatitis B, Chronic/genetics , Interferon-alpha/therapeutic use , STAT4 Transcription Factor/genetics , Adult , DNA, Viral/analysis , Female , Genotype , Hepatitis B e Antigens/immunology , Hepatitis B, Chronic/immunology , Humans , Male , Nucleosides/therapeutic use , Nucleotides/therapeutic use , Polyethylene Glycols/therapeutic use , Randomized Controlled Trials as Topic , Recombinant Proteins/therapeutic use , Retrospective Studies , Seroconversion , Young Adult
ABSTRACT
Developing a search strategy for a systematic review is a time-consuming process in which small errors around the formatting and compilation of terms can have large consequences. Microsoft Excel was identified as a potentially useful software to streamline the process and reduce manual errors. Ultimately a spreadsheet was created that largely automates the process of creating a single-line search string with correctly formatted terms, Boolean operators and parentheses.
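The spreadsheet's core operation — combining synonym lists into one correctly parenthesized Boolean string — can be sketched as follows. This is an illustrative reimplementation, not the spreadsheet's actual formulas, and the search terms are hypothetical examples:

```python
# Sketch: build a single-line Boolean search string from concept groups.
# Terms within a group are OR-ed; groups are AND-ed; multi-word terms
# are quoted. Terms below are hypothetical.

def build_search_string(concept_groups):
    """Each group is OR-ed internally; groups are AND-ed together."""
    groups = ["(" + " OR ".join(f'"{t}"' if " " in t else t for t in g) + ")"
              for g in concept_groups]
    return " AND ".join(groups)

groups = [["diabetes mellitus", "diabetic"],
          ["exercise", "physical activity"]]
print(build_search_string(groups))
# ("diabetes mellitus" OR diabetic) AND (exercise OR "physical activity")
```

Automating this assembly is precisely what removes the small formatting errors — a missing parenthesis or misplaced OR — that can silently change what a database search retrieves.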
Subject(s)
Information Storage and Retrieval/standards , Search Engine , Software , Systematic Reviews as Topic , Bibliometrics , Databases, Bibliographic , Humans
ABSTRACT
Global food demand is rising, impelling us to develop strategies for improving the efficiency of photosynthesis. Classical photosynthesis models based on steady-state assumptions are inherently unsuitable for assessing biochemical and stomatal responses to rapid variations in environmental drivers. To identify strategies to increase photosynthetic efficiency, we need models that account for the timing of CO2 assimilation responses to dynamic environmental stimuli. Herein, I present a dynamic process-based photosynthetic model for C3 leaves. The model incorporates both light and dark reactions, coupled with a hydro-mechanical model of stomatal behaviour. The model achieved a stable and realistic rate of light-saturated CO2 assimilation and stomatal conductance. Additionally, it replicated complete typical assimilatory response curves (stepwise change in CO2 and light intensity at different oxygen levels) featuring both short lag times and full photosynthetic acclimation. The model also successfully replicated transient responses to changes in light intensity (light flecks), CO2 concentration, and atmospheric oxygen concentration. This dynamic model is suitable for detailed ecophysiological studies and has potential for superseding the long-dominant steady-state approach to photosynthesis modelling. The model runs as a stand-alone workbook in Microsoft® Excel® and is freely available to download along with a video tutorial.
Subject(s)
Carbon/metabolism , Light , Models, Biological , Photosynthesis/radiation effects , Plant Stomata/radiation effects , Carbon Dioxide/pharmacology , Darkness , Metabolome , Oxygen/pharmacology , Photons , Photosynthesis/drug effects , Plant Stomata/drug effects
ABSTRACT
INTRODUCTION: In metabolomics studies, unwanted variation inevitably arises from various sources. Normalization, that is, the removal of unwanted variation, is an essential step in the statistical analysis of metabolomics data. However, metabolomics normalization is often considered an imprecise science due to the diverse sources of variation and the availability of a number of alternative strategies that may be implemented. OBJECTIVES: We highlight the need for comparative evaluation of different normalization methods and present software strategies to help ease this task for both data-oriented and biological researchers. METHODS: We present NormalizeMets, a joint graphical user interface within the familiar Microsoft Excel and freely-available R software for comparative evaluation of different normalization methods. The NormalizeMets R package along with the vignette describing the workflow can be downloaded from https://cran.r-project.org/web/packages/NormalizeMets/. The Excel interface and the Excel user guide are available at https://metabolomicstats.github.io/ExNormalizeMets. RESULTS: NormalizeMets allows for comparative evaluation of normalization methods using criteria that depend on the given dataset and the ultimate research question. Hence it guides researchers to assess, select and implement a suitable normalization method using either the familiar Microsoft Excel and/or freely-available R software. In addition, the package can be used for visualisation of metabolomics data using interactive graphical displays and to obtain end statistical results for clustering, classification, biomarker identification adjusting for confounding variables, and correlation analysis. CONCLUSION: NormalizeMets is designed for comparative evaluation of normalization methods, and can also be used to obtain end statistical results.
The use of freely-available R software offers an attractive proposition for programming-oriented researchers, and the Excel interface offers a familiar alternative to most biological researchers. The package handles the data locally in the user's own computer allowing for reproducible code to be stored locally.
Subject(s)
Metabolomics/methods , Metabolomics/standards , Reference Standards , Animals , Cluster Analysis , Data Accuracy , Data Interpretation, Statistical , Humans , Mass Spectrometry/methods , Research/standards , Research Design/standards , Software , Surveys and Questionnaires , User-Computer Interface , Workflow
ABSTRACT
BACKGROUND: Genomic datasets accompanying scientific publications show a surprisingly high rate of gene name corruption. This error is generated when files and tables are imported into Microsoft Excel and certain gene symbols are automatically converted into dates. RESULTS: We have developed Truke, a flexible Web tool to detect, tag and fix, if possible, such misconversions. In addition, Truke is language and regional locale-aware, providing file format customization (decimal symbol, field separator, etc.) following the user's preferences. CONCLUSIONS: Truke is a data format conversion tool with a unique corrupted gene symbol detection utility. Truke is freely available without registration at http://maplab.cat/truke .
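The detection side of this problem can be sketched simply: once Excel has converted a symbol such as SEPT2 or MARCH1, the cell contains a date-like string, which a pattern match can flag. This is an illustrative check, not Truke's actual code, and the cell values are made up:

```python
# Sketch: flag spreadsheet cells where a gene symbol appears to have
# been auto-converted to a date (e.g. SEPT2 -> "2-Sep"). Locale-aware
# handling, as in Truke, would also cover numeric date serials and
# other regional date formats; this covers only the simplest case.

import re

DATE_LIKE = re.compile(
    r"^\d{1,2}-(Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)$",
    re.IGNORECASE)

def flag_corrupted(cells):
    """Return the cells whose contents look like day-month dates."""
    return [cell for cell in cells if DATE_LIKE.match(cell.strip())]

cells = ["TP53", "2-Sep", "BRCA1", "1-Mar"]
print(flag_corrupted(cells))  # ['2-Sep', '1-Mar']
```

Fixing, as opposed to flagging, additionally requires mapping each date back to the symbol it most plausibly came from, which is ambiguous without the original file — hence Truke's "fix, if possible" wording.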
Subject(s)
Computational Biology/methods , Genomics/methods , Software , Web Browser
ABSTRACT
Nutritional requirements and responses of all organisms are estimated using various models representing the response to different dietary levels of the nutrient in question. To help nutritionists design experiments for estimating responses and requirements, we developed a simulation workbook using Microsoft Excel. The objective of the present study was to demonstrate the influence of different numbers of nutrient levels, ranges of nutrient levels and replications per nutrient level on the estimates of requirements based on common nutritional response models. The user provides estimates of the shape of the response curve, the requirement and other parameters, and the observation-to-observation variation. The Excel workbook then produces 1-1000 randomly simulated responses based on the given response curve and estimates the standard errors of the requirement (and other parameters) from different models as an indication of the expected power of the experiment. Interpretations are based on the assumption that the smaller the standard error of the requirement, the more powerful the experiment. The user can see the potential effects of using different numbers of subjects, different nutrient levels, etc., on the expected outcome of future experiments. From a theoretical perspective, each organism should have some enzyme-catalysed reaction whose rate is limited by the availability of some limiting nutrient; the response to the limiting nutrient should therefore resemble enzyme kinetics. In conclusion, the workbook eliminates some of the guesswork involved in designing experiments and determining the minimum number of subjects needed to achieve desired outcomes.
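The workbook's Monte Carlo logic can be sketched with one common response model. The following illustration simulates noisy responses from a broken-line (linear-plateau) model, whose breakpoint is the requirement, and measures how tightly repeated experiments recover it. The parameter values and the crude grid-search fit are illustrative simplifications, not the workbook's method:

```python
# Sketch: simulate experiments under a broken-line response model and
# summarize the spread of the fitted requirement across simulations.
# True requirement, slope, plateau, design and noise are all made up.

import random

def broken_line(x, req, slope, plateau):
    return plateau if x >= req else plateau - slope * (req - x)

def fit_requirement(xs, ys, grid):
    """Grid search for the breakpoint minimizing SSE, with simple
    closed-form plateau/slope estimates at each candidate breakpoint."""
    best_req, best_sse = None, float("inf")
    for req in grid:
        above = [y for x, y in zip(xs, ys) if x >= req]
        plateau = sum(above) / len(above) if above else max(ys)
        below = [(x, y) for x, y in zip(xs, ys) if x < req]
        if below:
            slope = sum((plateau - y) / (req - x) for x, y in below) / len(below)
        else:
            slope = 0.0
        sse = sum((y - broken_line(x, req, slope, plateau)) ** 2
                  for x, y in zip(xs, ys))
        if sse < best_sse:
            best_req, best_sse = req, sse
    return best_req

random.seed(1)
xs = [2, 4, 6, 8, 10, 12] * 3             # nutrient levels x 3 replicates
grid = [g / 10 for g in range(20, 121)]   # candidate breakpoints 2.0-12.0
estimates = []
for _ in range(200):                       # 200 simulated experiments
    ys = [broken_line(x, req=7.0, slope=1.5, plateau=20.0)
          + random.gauss(0, 0.5) for x in xs]
    estimates.append(fit_requirement(xs, ys, grid))
mean = sum(estimates) / len(estimates)
sd = (sum((e - mean) ** 2 for e in estimates) / (len(estimates) - 1)) ** 0.5
print(f"requirement estimate: {mean:.2f} +/- {sd:.2f}")
```

Re-running this with more replicates or different nutrient-level spacings shows how the spread of the estimated requirement shrinks or grows, which is exactly the design question the workbook is built to answer.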
Subject(s)
Diet , Models, Biological , Nutritional Requirements/physiology , Research Design , Animals , Enzymes , Female , Humans , Kinetics , Male
ABSTRACT
The fluid mechanics of microfluidics is distinctively simpler than the fluid mechanics of macroscopic systems. In macroscopic systems, effects such as non-laminar flow, convection, and gravity need to be accounted for, all of which can usually be neglected in microfluidic systems. Still, there exists only a very limited selection of channel cross-sections for which the Navier-Stokes equation for pressure-driven Poiseuille flow can be solved analytically. From these solutions, velocity profiles as well as flow rates can be calculated. However, whenever a cross-section is not highly symmetric (rectangular, elliptical or circular), the Navier-Stokes equation can usually not be solved analytically. In all of these cases, numerical methods are required. However, in many instances it is not necessary to turn to complex numerical solver packages for deriving, e.g., the velocity profile of a more complex microfluidic channel cross-section. In this paper, a simple spreadsheet analysis tool (here: Microsoft Excel) will be used to implement a simple numerical scheme which allows solving the Navier-Stokes equation for arbitrary channel cross-sections.
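The scheme itself is compact enough to sketch. For steady pressure-driven flow, the Navier-Stokes equation reduces to a Poisson equation, ∂²u/∂x² + ∂²u/∂y² = −G/μ (G the pressure gradient, μ the viscosity), which Jacobi iteration solves by repeatedly setting each interior grid cell to the average of its neighbours plus a source term — in Excel, a grid of self-referencing cell formulas with iterative calculation enabled does exactly this. The Python version below, on a square cross-section, is an illustration of the scheme, not the paper's spreadsheet:

```python
# Sketch: Jacobi iteration for Poiseuille flow in a square channel
# cross-section. Interior update: u = (sum of 4 neighbours + h^2*G/mu)/4,
# which discretizes d2u/dx2 + d2u/dy2 = -G/mu with no-slip walls (u = 0).

def poiseuille_square(n=21, G_over_mu=1.0, iterations=2000):
    h = 1.0 / (n - 1)                      # grid spacing on the unit square
    u = [[0.0] * n for _ in range(n)]      # boundary rows/cols stay 0
    for _ in range(iterations):
        new = [row[:] for row in u]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                new[i][j] = 0.25 * (u[i - 1][j] + u[i + 1][j] + u[i][j - 1]
                                    + u[i][j + 1] + h * h * G_over_mu)
        u = new
    return u

u = poiseuille_square()
centre = u[10][10]
# The known analytic maximum for the unit square is approximately
# 0.0737 * G/mu, so the converged centre value should be close to that.
print(f"centre velocity: {centre:.5f}")
```

Arbitrary cross-sections are handled the same way: cells outside the channel are simply pinned to zero, which is why the spreadsheet approach generalizes so easily.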
Subject(s)
Computer Simulation , Microfluidics/instrumentation , Microfluidics/methods , Numerical Analysis, Computer-Assisted
ABSTRACT
In the present paper, the novel software GTest is introduced, designed for testing the normality of a user-specified empirical distribution. It has been implemented with two unusual characteristics: the first is the user option of selecting four different versions of the normality test, each suited to a specific dataset or goal, and the second is the inferential paradigm that informs the output of such tests: it is basically graphical and intrinsically self-explanatory. The concept of inference-by-eye is an emerging inferential approach that should find successful application in the near future due to the growing need to widen the audience of users of statistical methods to people with informal statistical skills. For instance, the latest European regulation concerning environmental issues introduced strict protocols for data handling (data quality assurance, outlier detection, etc.) and information exchange (areal statistics, trend detection, etc.) between regional and central environmental agencies. Therefore, more and more frequently, laboratory and field technicians will be requested to utilize complex software applications for subjecting data coming from monitoring, surveying or laboratory activities to specific statistical analyses. Unfortunately, inferential statistics, which actually influence the decision-making processes for the correct management of environmental resources, are often implemented in a way that expresses outcomes in numerical form with brief comments in strict statistical jargon (degrees of freedom, level of significance, accepted/rejected H0, etc.). The interpretation of such outcomes is therefore often really difficult for people with little statistical knowledge. In this framework, the paradigm of visual inference can help fill this gap, providing outcomes in self-explanatory graphical forms with a brief comment in common language.
Indeed, the difficulties experienced by colleagues, and their requests for an effective tool to address them, motivated us to adopt the inference-by-eye paradigm and implement an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which proved to have all the robustness and reliability needed. GTest provides true graphical normality tests which are as reliable as any quantitative statistical approach but much easier to understand. The Q-Q plots have been integrated with the outlining of an acceptance region around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software.
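The decision rule can be sketched numerically. The following illustration builds a pointwise acceptance band from the normal approximation to the standard error of order statistics and passes the test only if every standardized empirical quantile falls inside it; the band construction and plotting positions here are illustrative choices, not GTest's exact definitions:

```python
# Sketch of a graphical normality test: compare standardized empirical
# quantiles with theoretical normal quantiles, and accept normality only
# if every point lies within a pointwise acceptance band (normal
# approximation to the SE of the i-th order statistic).

import math
from statistics import NormalDist

def qq_normality_test(sample, alpha=0.05):
    n = len(sample)
    xs = sorted(sample)
    mean = sum(xs) / n
    sd = (sum((x - mean) ** 2 for x in xs) / (n - 1)) ** 0.5
    z = NormalDist().inv_cdf(1 - alpha / 2)
    for i, x in enumerate(xs, start=1):
        p = (i - 0.5) / n                      # plotting position
        q = NormalDist().inv_cdf(p)            # theoretical quantile
        pdf = math.exp(-q * q / 2) / math.sqrt(2 * math.pi)
        se = math.sqrt(p * (1 - p) / n) / pdf  # SE of the order statistic
        if abs((x - mean) / sd - q) > z * se:
            return False                       # a point escapes the band
    return True

# A perfectly normal-looking sample (normal scores, mean 10, sd 2).
sample = [NormalDist(10, 2).inv_cdf((i + 0.5) / 40) for i in range(40)]
print(qq_normality_test(sample))
```

Plotting the band alongside the points gives exactly the self-explanatory accept/reject picture described above: the verdict is visible at a glance, no jargon required.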
Subject(s)
Environmental Monitoring/methods , Software , Statistics as Topic , Humans , Normal Distribution , Reproducibility of Results
ABSTRACT
Delay discounting describes the process wherein rewards lose value as a function of their delayed receipt; how quickly rewards lose value is termed the rate of delay discounting. Rates of delay discounting are robust predictors of much behavior of societal importance. One efficient approach to obtaining a human subject's rate of delay discounting is via the 21- and 27-item Monetary Choice Questionnaires, brief dichotomous choice tasks that assess preference between small immediate and larger delayed monetary outcomes. Unfortunately, the scoring procedures for the Monetary Choice Questionnaires are rather complex, which may serve as a barrier to their use. This report details a freely available Excel-based spreadsheet tool that automatically scores Monetary Choice Questionnaire response sets, using both traditional and contemporary/advanced approaches. An overview of the Monetary Choice Questionnaire and its scoring algorithm is provided. We conclude with general considerations for using the spreadsheet tool.
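The core of the scoring logic can be sketched briefly. Under hyperbolic discounting, V = A/(1 + kD), each MCQ item implies an indifference k, and a respondent's k is the value most consistent with their pattern of immediate-vs-delayed choices. The items and responses below are hypothetical, and this candidate-search version is a simplification of the questionnaire's actual scoring (which, e.g., takes geometric means between adjacent indifference points):

```python
# Sketch of MCQ-style scoring: each item (immediate, delayed, days)
# implies an indifference k; pick the candidate k that makes the most
# observed choices consistent with hyperbolic discounting.

def indifference_k(immediate, delayed, delay_days):
    """k at which the two options are equally valued."""
    return (delayed - immediate) / (immediate * delay_days)

def estimate_k(items, choices):
    """choices[i] is 'now' or 'later'."""
    candidates = sorted(indifference_k(*item) for item in items)
    best_k, best_score = None, -1
    for k in candidates:
        score = 0
        for item, choice in zip(items, choices):
            predicted = "now" if k > indifference_k(*item) else "later"
            score += (predicted == choice)
        if score > best_score:
            best_k, best_score = k, score
    return best_k

items = [(54, 55, 117), (55, 75, 61), (25, 60, 14)]  # (immediate, delayed, days)
choices = ["now", "now", "later"]                     # hypothetical respondent
k_est = estimate_k(items, choices)
print(f"k ≈ {k_est:.4f}")
```

Automating exactly this kind of bookkeeping — consistency counting across 21 or 27 items — is what the spreadsheet tool removes as a barrier to use.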
ABSTRACT
OBJECTIVES: Developing a search strategy for use in a systematic review is a time-consuming process requiring construction of detailed search strings using complicated syntax, followed by iterative fine-tuning and trial-and-error testing of these strings in online biomedical search engines. METHODS: To address the limitations of existing online-only search builders, a user-friendly computer-based tool was created to expedite search strategy development as part of production of a systematic review. RESULTS: Search Builder 1.0 is a Microsoft Excel®-based tool that automatically assembles search strategy text strings for PubMed (www.pubmed.com) and Embase (www.embase.com), based on a list of user-defined search terms and preferences. With the click of a button, Search Builder 1.0 automatically populates the syntax needed for functional search strings and copies the string to the clipboard for pasting into PubMed or Embase. The offline, file-based interface of Search Builder 1.0 also allows searches to be easily shared and saved for future reference. CONCLUSIONS: This novel, user-friendly tool can save considerable time and streamline a cumbersome step in the systematic review process.
Subject(s)
Review Literature as Topic , Search Engine/methods , Humans , PubMed , User-Computer Interface
ABSTRACT
Effects of the glyphosate-based herbicide Excel Mera 71, at a dose of 17.20 mg/l, on the activities of acetylcholinesterase (AChE), catalase (CAT) and glutathione-S-transferase (GST), on lipid peroxidation (LPO) and on protein content were measured in different tissues of two Indian air-breathing teleosts, Anabas testudineus (Bloch) and Heteropneustes fossilis (Bloch), during an exposure period of 30 days under laboratory conditions. AChE activity was significantly increased in all the investigated tissues of both fish species; the maximum elevation was observed in the brain of H. fossilis, while the spinal cord of A. testudineus showed the minimum increment. Fishes showed significantly increased LPO levels in all the tissues; the highest was observed in the gill of A. testudineus and the lowest in the muscle of H. fossilis. CAT activity was also enhanced in both fishes, while hepatic GST activity diminished substantially, with the minimum observed in the liver of A. testudineus. Total protein content decreased in all the tissues; the maximum reduction was observed in the liver of A. testudineus and the minimum in the brain of H. fossilis. The results indicated that Excel Mera 71 caused serious alterations in the enzyme activities, resulting in severe deterioration of fish health; thus, AChE, LPO, CAT and GST can be used as suitable indicators of herbicidal toxicity.
Subject(s)
Catfishes/metabolism , Glycine/analogs & derivatives , Herbicides/toxicity , Perciformes/metabolism , Seafood/analysis , Acetylcholinesterase/metabolism , Animals , Biomarkers/metabolism , Catalase/metabolism , Fish Proteins/metabolism , Gills/drug effects , Glutathione/metabolism , Glutathione Transferase/metabolism , Glycine/toxicity , India , Lipid Peroxidation/drug effects , Liver/drug effects , Muscles/metabolism , Glyphosate
ABSTRACT
BACKGROUND: Here, we introduce our early experience with a new CUSA Excel ultrasonic aspiration system, provided by Integra LifeSciences, in skull base meningioma resection. METHODS: Ten patients with anterior skull base, middle skull base or sphenoid ridge meningiomas were operated on using the CUSA Excel ultrasonic aspiration system at the Neurosurgery Department of Shanghai Huashan Hospital from August 2014 to October 2014. There were six male and four female patients, aged from 38 to 61 years (mean age 48.5 years). In five cases the tumor was located at the anterior skull base, in three cases at the middle skull base, and in two cases on the sphenoid ridge. RESULTS: All the patients received total resection of their meningiomas with the help of this new tool, and the critical brain vessels and nerves were preserved during the operations. All the patients recovered well after the operation. CONCLUSIONS: This new CUSA Excel ultrasonic aspiration system has the advantage of preserving vital brain arteries and cranial nerves during skull base meningioma resection, which is very important for skull base tumor operations and helps ensure a good prognosis for patients. We hope neurosurgeons will benefit from this technique.
ABSTRACT
Numerous studies have revealed the applicability of optimization techniques in minimizing the costs of reinforced concrete buildings. However, the existing literature has narrowly focused on optimizing buildings with a single function, such as residential or office buildings, hindering the generalization of the results. This paper aims to bridge the gap between optimization and structural engineering by obtaining the minimum-cost design of flat slab buildings with different intended functions. In this context, the optimal designs of 120 alternatives were obtained, considering various spans (4-8 m), live loads (2-10 kPa), and concrete compressive strengths (25-40 MPa). The optimization was executed using the evolutionary algorithm provided in Microsoft Excel's Solver tool. The optimization model permits the utilization of drop panels to resist the punching stresses developed from the slab-column interaction. The objective function is the cost of materials and labor involved in constructing floors and columns. The decision variables are the floor dimensions and the column dimensions and reinforcement. The structural constraints were applied per the Egyptian design code (ECP203-2020). Eventually, guidelines were developed to help designers choose the economical floor system and quantities of materials based on the building's intended function.
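The optimization setup can be sketched in miniature. The toy problem below uses an evolutionary search (mutation plus truncation selection, similar in spirit to Solver's evolutionary engine) to minimize a made-up cost over a slab thickness and a column size, with a penalty for violating a made-up capacity constraint — the real model's cost terms and ECP203-2020 constraints are far richer:

```python
# Sketch of penalty-based evolutionary cost minimization over two
# design variables. Cost and capacity expressions are illustrative
# stand-ins, not actual structural design equations.

import random

def cost(t_slab, c_col):
    material = 120 * t_slab + 40 * c_col ** 2        # toy cost terms
    capacity = 50 * t_slab + 200 * c_col ** 2        # toy capacity check
    penalty = 1e4 * (60 - capacity) if capacity < 60 else 0.0
    return material + penalty

random.seed(7)
bounds = [(0.12, 0.40), (0.25, 0.80)]                # m, plausible ranges
pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(30)]
for _ in range(200):                                  # generations
    pop.sort(key=lambda ind: cost(*ind))
    parents = pop[:10]                                # truncation selection
    children = []
    for p in parents:
        for _ in range(2):                            # two mutants per parent
            children.append([min(max(g + random.gauss(0, 0.02), lo), hi)
                             for g, (lo, hi) in zip(p, bounds)])
    pop = parents + children
best = min(pop, key=lambda ind: cost(*ind))
print(f"t_slab = {best[0]:.3f} m, c_col = {best[1]:.3f} m, "
      f"cost = {cost(*best):.1f}")
```

The penalty formulation is what lets a derivative-free evolutionary search handle hard design-code constraints: infeasible candidates survive long enough to guide the search but never win.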
ABSTRACT
BACKGROUND: The dissolution test is a critical quality control method in the pharmaceutical industry, primarily used to assess drug bioavailability and ensure the consistency of manufactured batches. This test simulates the release of the active ingredient in the body and verifies compliance with specifications through multiple stages (e.g., S1, S2, S3). However, measurement uncertainty can undermine the reliability of test results, potentially leading to erroneous conformity decisions. This study aims to quantify the uncertainties arising from the dissolution, sampling, and quantification steps, as well as to estimate the risk of false conformity decisions in the dissolution test results for tablets. RESULTS: A comprehensive uncertainty evaluation was conducted for the dissolution, sampling, and quantification stages. The Monte Carlo method (MCM) was applied to assess the overall measurement uncertainty, which was determined to be approximately 5.2 %. The study revealed that sampling was the predominant contributor, accounting for 92 % of the total uncertainty, compared to 7 % from quantification and 1 % from dissolution steps. An MS Excel spreadsheet was developed to calculate the total risk value and classify it as either producer or consumer risk. This tool enables the evaluation of uncertainty in both individual tested units and mean values, depending on the stage criteria (e.g., S1, S2, S3). The proposed improved criteria were tested across various scenarios where the risk of false decisions due to measurement uncertainty was considered. These tests demonstrated the effectiveness of the criteria in managing consumer risk, highlighting the critical impact of sampling uncertainty on the decision-making process. SIGNIFICANCE AND NOVELTY: This study introduces novel, improved criteria for the dissolution test that account for the risk of false decisions due to measurement uncertainty. 
The proposed criteria significantly enhance the reliability of drug quality assessments. The study provides a robust framework for minimizing false conformity decisions. The development of a practical MS Excel tool further supports the reliable assessment of dissolution test results, ensuring higher standards of drug safety and efficacy in pharmaceutical quality control.
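The Monte Carlo step described above can be sketched compactly. The stage uncertainties below are illustrative values chosen so that sampling dominates, as reported in the study (combined uncertainty about 5.2% with sampling contributing roughly 92% of the variance); they are not the study's actual uncertainty budget:

```python
# Sketch of Monte Carlo propagation of relative standard uncertainties
# from the dissolution, sampling and quantification stages into a
# combined uncertainty for the % dissolved result. Stage values are
# illustrative, chosen so sampling dominates.

import random

random.seed(42)
u_dissolution = 0.005     # 0.5% relative standard uncertainty (illustrative)
u_sampling = 0.050        # 5.0%
u_quantification = 0.013  # 1.3%

true_value = 85.0         # % dissolved, hypothetical measurand
N = 100_000
results = []
for _ in range(N):
    r = true_value
    r *= 1 + random.gauss(0, u_dissolution)     # stage-by-stage perturbation
    r *= 1 + random.gauss(0, u_sampling)
    r *= 1 + random.gauss(0, u_quantification)
    results.append(r)
mean = sum(results) / N
sd = (sum((x - mean) ** 2 for x in results) / (N - 1)) ** 0.5
print(f"combined relative uncertainty ≈ {sd / mean:.3%}")
# analytic check: sqrt(0.005^2 + 0.050^2 + 0.013^2) ≈ 5.19%
```

Comparing this combined uncertainty with the distance between a result and the stage acceptance limit (S1, S2 or S3) is what turns the test outcome into a quantified producer or consumer risk.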