Search results for: and validation
1374 A Validation Technique for Integrated Ontologies
Authors: Neli P. Zlatareva
Abstract:
Ontology validation is an important part of web applications’ development, where knowledge integration and ontological reasoning play a fundamental role. It aims to ensure the consistency and correctness of ontological knowledge and to guarantee that ontological reasoning is carried out in a meaningful way. Existing approaches to ontology validation address more or less specific validation issues, but the overall process of validating web ontologies has not been formally established yet. As the size and the number of web ontologies continue to grow, the necessity to validate and ensure their consistency and interoperability is becoming increasingly important. This paper presents a validation technique intended to test the consistency of independent ontologies utilized by a common application.
Keywords: knowledge engineering, ontological reasoning, ontology validation, semantic web
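The abstract does not publish its algorithm, so the following is only a minimal sketch of the kind of consistency check it describes: two independently developed ontologies are loaded into a common world and a reasoner is asked whether the merged knowledge is consistent. The ontology IRIs are placeholders, and using owlready2 with its bundled HermiT reasoner is an assumption, not the authors' tooling.

```python
# Minimal sketch: checking that two independent ontologies remain consistent
# when used together by one application (assumed tooling: owlready2).
from owlready2 import get_ontology, sync_reasoner, OwlReadyInconsistentOntologyError

onto_a = get_ontology("http://example.org/ontology_a.owl").load()  # placeholder IRI
onto_b = get_ontology("http://example.org/ontology_b.owl").load()  # placeholder IRI

try:
    sync_reasoner()  # classify everything loaded into the default world
    print("Merged ontologies are consistent.")
except OwlReadyInconsistentOntologyError:
    print("Inconsistency detected between the integrated ontologies.")
```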
Procedia PDF Downloads 322
1373 Invention of Novel Technique of Process Scale Up by Using Solid Dosage Form
Authors: Shashank Tiwari, S. P. Mahapatra
Abstract:
The aim of this technique is to reduce the steps of process scale-up, saving industries time and cost. The new steps are: Novel Lab Scale, Novel Lab Scale Trials, Novel Trial Batches, Novel Exhibit Batches, and Novel Validation Batches. In this scheme, the validation batches are not divided into three separate parts; instead, the data from the trial batches, exhibit batches, and validation batches are used and compiled for production and for validation. The technique also increases the batch sizes of the trial and exhibit batches: the new trial batches are not less than fifty thousand units, the exhibit batches increase up to two lakh, and the validation batches up to five lakh. After preparing the batches, all their data and drug product are used for stability studies, the validation record is maintained, and the compiled data are transferred to the production department for preparing market-size batches.
Keywords: batches, technique, preparation, scale up, validation
Procedia PDF Downloads 357
1372 Efficient Model Selection in Linear and Non-Linear Quantile Regression by Cross-Validation
Authors: Yoonsuh Jung, Steven N. MacEachern
Abstract:
The check loss function is used to define quantile regression. In the context of cross-validation, it is also employed as a validation function when the underlying truth is unknown. However, our empirical study indicates that validation with check loss often leads to choosing overfitted models. In this work, we suggest a modified, L2-adjusted check loss, which rounds the sharp corner in the middle of the check loss. This guards against overfitted models to some extent. Through various simulation settings of linear and non-linear regression, the improvement from the L2 adjustment of check loss is empirically examined. The adjustment is devised to shrink to zero as the sample size grows.
Keywords: cross-validation, model selection, quantile regression, tuning parameter selection
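The abstract does not give the exact form of the L2 adjustment, so the sketch below only illustrates the idea: the standard check loss has a kink at zero, and a quadratic region of half-width delta rounds that corner while matching value and slope at the boundary; letting delta shrink with the sample size matches the stated design. The specific smoothing formula is an assumption, not the authors' definition.

```python
import numpy as np

def check_loss(u, tau):
    """Standard check loss rho_tau(u) = u * (tau - 1{u < 0})."""
    u = np.asarray(u, dtype=float)
    return u * (tau - (u < 0))

def l2_adjusted_check_loss(u, tau, delta):
    """Hypothetical L2-adjusted check loss: quadratic on [-delta, delta],
    matching the plain check loss in value and slope at +/-delta.
    Reduces to the plain check loss as delta -> 0."""
    u = np.asarray(u, dtype=float)
    quad = u**2 / (4 * delta) + (tau - 0.5) * u + delta / 4
    return np.where(np.abs(u) <= delta, quad, check_loss(u, tau))

# Example: a validation score at tau = 0.9, with delta shrinking as n grows.
residuals = np.random.default_rng(0).normal(size=200)
n = residuals.size
print(l2_adjusted_check_loss(residuals, tau=0.9, delta=n**-0.5).mean())
```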
Procedia PDF Downloads 438
1371 Preliminary Study of Standardization and Validation of Micronuclei Technique to Assess the DNA Damage Caused by X-Rays
Authors: L. J. Díaz, M. A. Hernández, A. K. Molina, A. Bermúdez, C. Crane, V. M. Pabón
Abstract:
One of the most important biological indicators of exposure to radiation is the micronucleus (MN). The technique is used to determine radiation effects in blood cultures, as a biological control and as a complement to physical dosimetry. In Colombia, the need for this analysis has emerged because the biological indicator currently most used is chromosomal aberrations (CA); standardization and validation of the MN technique are therefore essential to provide adequate tools for improving radioprotection in the country. In addition, the technique will be applied to the construction of a dose-response curve, which allows an approximate dose to be estimated for irradiated people according to the MN frequency found. The steps carried out to accomplish standardization and validation included statistical analysis of readings of in vitro peripheral blood cultures by different analysts; the best culture medium and the conditions under which MN can be detected most easily were also determined.
Keywords: micronuclei, radioprotection, standardization, validation
Procedia PDF Downloads 493
1370 Analysis of Expression Data Using Unsupervised Techniques
Authors: M. A. I Perera, C. R. Wijesinghe, A. R. Weerasinghe
Abstract:
This study was conducted to review and identify the unsupervised techniques that can be employed to analyze gene expression data in order to identify better subtypes of tumors. Identifying subtypes of cancer helps to improve the efficacy and reduce the toxicity of treatments by providing clues for finding targeted therapeutics. The process of gene expression data analysis is described in three steps: preprocessing, clustering, and cluster validation. Feature selection is important, since genomic data are high dimensional, with a large number of features compared to samples. Hierarchical clustering and K-means are often used in the analysis of gene expression data. Several cluster validation techniques are used to validate the resulting clusters. Heatmaps are an effective external validation method that allows comparing the identified classes with clinical variables and visually analyzing the classes.
Keywords: cancer subtypes, gene expression data analysis, clustering, cluster validation
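As a concrete illustration of the preprocessing, clustering, and validation pipeline the abstract describes, here is a minimal sketch using scikit-learn; the synthetic expression matrix, the choice of K-means, and silhouette scoring as the internal validation step are illustrative assumptions.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(42)
X = rng.normal(size=(60, 2000))        # 60 samples x 2000 genes (synthetic)

# Preprocessing: scale, then reduce dimensionality before clustering.
X_scaled = StandardScaler().fit_transform(X)
X_pca = PCA(n_components=10, random_state=0).fit_transform(X_scaled)

# Clustering + internal validation: inspect silhouette score across k.
for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X_pca)
    print(k, round(silhouette_score(X_pca, labels), 3))
```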
Procedia PDF Downloads 149
1369 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)
Authors: Gule Teri
Abstract:
The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variations within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach distinctly measures variability with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. This method provides a systematic, statistically grounded validation technique that improves the truthfulness of results. It offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this innovative method, pharmaceutical manufacturers can substantially advance their validation processes, subsequently improving the overall quality and safety of their products. This paper delves deeper into the development, application, and advantages of this tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. This paper also discusses the potential implications and future applications of this method in enhancing pharmaceutical manufacturing practices and outcomes.
Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing
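The paper's tolerance-based DoE procedure is not spelled out in the abstract; the sketch below shows only one underlying building block, a two-sided normal tolerance interval computed from replicate HPLC measurements using Howe's k-factor approximation. The data values and the 99%/95% coverage and confidence choices are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def tolerance_interval(x, coverage=0.99, confidence=0.95):
    """Two-sided normal tolerance interval via Howe's k-factor approximation."""
    x = np.asarray(x, dtype=float)
    n = x.size
    dof = n - 1
    z = stats.norm.ppf((1 + coverage) / 2)
    chi2 = stats.chi2.ppf(1 - confidence, dof)      # lower chi-square quantile
    k = np.sqrt(dof * (1 + 1 / n) * z**2 / chi2)
    m, s = x.mean(), x.std(ddof=1)
    return m - k * s, m + k * s

# Illustrative replicate HPLC peak areas for a low-concentration impurity.
areas = np.array([0.102, 0.098, 0.101, 0.105, 0.097, 0.100, 0.103, 0.099])
print(tolerance_interval(areas))
```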
Procedia PDF Downloads 80
1368 The Detection of Implanted Radioactive Seeds on Ultrasound Images Using Convolutional Neural Networks
Authors: Edward Holupka, John Rossman, Tye Morancy, Joseph Aronovitz, Irving Kaplan
Abstract:
A common modality for the treatment of early stage prostate cancer is the implantation of radioactive seeds directly into the prostate. The radioactive seeds are positioned inside the prostate, using transrectal ultrasound imaging, to achieve optimal radiation dose coverage of the prostate. Once all of the planned seeds have been implanted, two-dimensional transaxial transrectal ultrasound images separated by 2 mm are obtained throughout the prostate, beginning at the base and up to and including the apex. A common deep neural network, DetectNet, was trained to automatically determine the positions of the implanted radioactive seeds within the prostate under ultrasound imaging. The network was trained on 950 training ultrasound images and evaluated on 90 validation ultrasound images. The commonly used metrics for successful training were used to evaluate the efficacy and accuracy of the trained deep neural network: loss_bbox (train) = 0.00, loss_coverage (train) = 1.89e-8, loss_bbox (validation) = 11.84, loss_coverage (validation) = 9.70, mAP (validation) = 66.87%, precision (validation) = 81.07%, and recall (validation) = 82.29%, where 'train' refers to the training image set and 'validation' to the validation image set. On the hardware platform used, training took 12.8 seconds per epoch, and the network was trained for over 10,000 epochs. In addition, the seed locations determined by the deep neural network were compared to the seed locations determined by commercial software based on a CT obtained one to three months after implantation. The deep learning approach was within 2.29 mm of the seed locations determined by the commercial software. The deep learning approach to determining radioactive seed locations is robust, accurate, and fast, and in good spatial agreement with the gold standard of CT-determined seed coordinates.
Keywords: prostate, deep neural network, seed implant, ultrasound
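To make the reported precision, recall, and 2.29 mm agreement concrete, here is a minimal sketch of how detected seed coordinates can be scored against reference (CT-derived) coordinates by greedy nearest-neighbour matching under a distance tolerance; the matching rule, the tolerance, and the coordinates are illustrative assumptions, not the authors' evaluation code.

```python
import numpy as np

def match_seeds(pred, ref, tol_mm=2.5):
    """Greedily match predicted seed positions to reference positions and
    return (precision, recall, mean matched distance in mm)."""
    pred, ref = np.asarray(pred, float), np.asarray(ref, float)
    unused = list(range(len(ref)))
    dists = []
    for p in pred:
        if not unused:
            break
        d = np.linalg.norm(ref[unused] - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= tol_mm:
            dists.append(d[j])
            unused.pop(j)               # each reference seed matched once
    tp = len(dists)
    precision = tp / len(pred) if len(pred) else 0.0
    recall = tp / len(ref) if len(ref) else 0.0
    return precision, recall, float(np.mean(dists)) if dists else float("nan")

# Toy 3D seed coordinates (mm) for illustration only.
ref = np.array([[10, 12, 5], [14, 20, 7], [22, 18, 9]])
pred = np.array([[10.5, 12.2, 5.1], [21.0, 18.4, 9.3], [30, 30, 30]])
print(match_seeds(pred, ref))
```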
Procedia PDF Downloads 198
1367 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator
Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty
Abstract:
Verification and validation of simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on verification and validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady state and transient conditions. The verification and validation process helps to qualify the process simulator for its intended purpose, whether that is providing comprehensive training or design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements, to ensure that each step in the model development process completely incorporates all design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A full scope replica operator training simulator for the Prototype Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, with the main participants being engineers/experts from the modeling team and the process design and instrumentation and control design teams. This paper discusses the verification and validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on the experts' comments, final qualification of the simulator for the intended purpose, and the difficulties faced while coordinating the various activities.
Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), steady state, transient state
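A minimal sketch of the kind of validation comparison described above: a simulated process parameter is checked against the corresponding plant trace and samples outside an acceptance band are flagged. The signals, the 2% band, and the pass criterion are illustrative assumptions, not the simulator's actual acceptance criteria.

```python
import numpy as np

def within_tolerance(simulated, plant, rel_tol=0.02):
    """Compare a simulated parameter trace against plant data and report the
    worst relative deviation and whether every sample is inside the band."""
    simulated, plant = np.asarray(simulated, float), np.asarray(plant, float)
    rel_dev = np.abs(simulated - plant) / np.abs(plant)
    return rel_dev.max(), bool((rel_dev <= rel_tol).all())

# Toy steady-state trace: simulated sodium outlet temperature vs plant data (K).
plant = np.full(100, 820.0) + np.random.default_rng(1).normal(0, 0.5, 100)
simulated = plant + np.random.default_rng(2).normal(0, 1.0, 100)
worst, ok = within_tolerance(simulated, plant)
print(f"max relative deviation = {worst:.4%}, pass = {ok}")
```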
Procedia PDF Downloads 266
1366 A Proposal for Systematic Mapping Study of Software Security Testing, Verification and Validation
Authors: Adriano Bessa Albuquerque, Francisco Jose Barreto Nunes
Abstract:
Software vulnerabilities are increasing, and they not only impact the availability of services and processes, as well as the confidentiality, integrity, and privacy of information, but also cause changes that interfere with the development process. Security testing could be a solution to reduce vulnerabilities. However, the variety of test techniques, together with the lack of real case studies applying tests across the software development life cycle, compromises its effective use. This paper offers an overview of how a systematic mapping study (MS) about security verification, validation, and test (VVT) was performed, and presents general results of this study.
Keywords: software test, software security verification validation and test, security test institutionalization, systematic mapping study
Procedia PDF Downloads 409
1365 Modeling Sediment Yield Using the SWAT Model: A Case Study of Upper Ankara River Basin, Turkey
Authors: Umit Duru
Abstract:
The Soil and Water Assessment Tool (SWAT) was tested for prediction of water balance and sediment yield in the gauged Ankara basin, Turkey. The overall objective of this study was to evaluate the performance and applicability of SWAT in this region of Turkey. Thirteen years of monthly streamflow and suspended sediment data were used for calibration and validation. This research assessed model performance based on differences between observed and predicted suspended sediment yields during the calibration (1987-1996) and validation (1982-1984) periods. Statistical comparisons of suspended sediment produced values for NSE (Nash-Sutcliffe efficiency), RE (relative error), and R² (coefficient of determination) of 0.81, -1.55, and 0.93, respectively, during the calibration period, and NSE, RE (%), and R² of 0.77, -2.61, and 0.87, respectively, during the validation period. Based on the analyses, SWAT satisfactorily simulated the observed hydrology and sediment yields and can be used as a tool in decision making for water resources planning and management in the basin.
Keywords: calibration, GIS, sediment yield, SWAT, validation
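The three goodness-of-fit statistics reported above have standard definitions; a small sketch of how they can be computed from observed and simulated series follows. The toy data are placeholders, and the relative-error convention (total simulated vs total observed load) is one common choice, not necessarily the authors'.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_error(obs, sim):
    """Relative error (%) of total simulated vs total observed load."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100 * (sim.sum() - obs.sum()) / obs.sum()

def r_squared(obs, sim):
    """Coefficient of determination (squared Pearson correlation)."""
    return float(np.corrcoef(obs, sim)[0, 1] ** 2)

obs = np.array([12.0, 30.5, 8.2, 44.1, 19.7])   # observed monthly sediment yield
sim = np.array([10.8, 28.9, 9.5, 40.2, 21.3])   # simulated by the model
print(nse(obs, sim), relative_error(obs, sim), r_squared(obs, sim))
```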
Procedia PDF Downloads 282
1364 Semi-Automatic Method to Assist Expert for Association Rules Validation
Authors: Amdouni Hamida, Gammoudi Mohamed Mohsen
Abstract:
In order to help the expert validate association rules extracted from data, several quality measures have been proposed in the literature. We distinguish two categories: objective and subjective measures. The first depends on a fixed threshold and on the quality of the data from which the rules are extracted. The second consists of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high, so manually reviewing the rules is very hard. To solve this problem, we propose in this paper a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers); (ii) we use the generated classifiers for data classification; (iii) we visualize the association rules with their classification quality, to give the expert an overview and to assist him during the validation process.
Keywords: association rules, rule-based classification, classification quality, validation
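A minimal sketch of the transform-then-classify idea: each association rule whose consequent is a class label is applied as a classification rule, and its accuracy on labelled data becomes the quality score shown to the expert. The rule representation and the toy records are illustrative assumptions.

```python
# Minimal sketch: score association rules by using them as classifiers.
# A rule is (antecedent_items, predicted_class); a record is (items, true_class).
records = [
    ({"bread", "butter"}, "buyer"),
    ({"bread"}, "buyer"),
    ({"beer"}, "non_buyer"),
    ({"bread", "beer"}, "non_buyer"),
]
rules = [
    ({"bread"}, "buyer"),
    ({"beer"}, "non_buyer"),
]

for antecedent, predicted in rules:
    covered = [(items, label) for items, label in records if antecedent <= items]
    correct = sum(1 for _, label in covered if label == predicted)
    quality = correct / len(covered) if covered else 0.0
    # This per-rule classification quality is what the expert would inspect.
    print(f"{sorted(antecedent)} => {predicted}: "
          f"coverage={len(covered)}, quality={quality:.2f}")
```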
Procedia PDF Downloads 439
1363 Validation of the Formal Model of Web Services Applications for Digital Reference Service of Library Information System
Authors: Zainab Magaji Musa, Nordin M. A. Rahman, Julaily Aida Jusoh
Abstract:
The web services applications for digital reference service (WSDRS) of LIS model is an informal model intended to reduce the problems of digital reference services in libraries. It uses web services technology to provide an efficient way of satisfying users’ needs in the reference section of libraries. The formal WSDRS model consists of Z specifications of all the informal specifications of the model. This paper discusses the formal validation of the Z specifications of the WSDRS model. The authors formally verify, and thus validate, the properties of the model using the Z/EVES theorem prover.
Keywords: validation, verification, formal, theorem prover
Procedia PDF Downloads 516
1362 Estimation of Uncertainty of Thermal Conductivity Measurement with Single Laboratory Validation Approach
Authors: Saowaluck Ukrisdawithid
Abstract:
The thermal conductivity of thermal insulation materials is measured with a heat flow meter (HFM) apparatus. The components of uncertainty are complex and difficult to evaluate in routine measurement using a modelling approach. In this study, the uncertainty of thermal conductivity measurement was estimated using a single laboratory validation approach. The within-laboratory reproducibility was 1.1%. The standard uncertainty of method and laboratory bias, assessed using the SRM 1453 expanded polystyrene board, was dominant at 1.4%; however, no significant bias was found. For sample measurement, the sources of uncertainty were repeatability, the density of the sample, and the thermal conductivity resolution of the HFM. From this approach to sample measurements, the combined uncertainty was calculated. In summary, the thermal conductivity of the sample, a polystyrene foam, was reported as 0.03367 W/m·K ± 3.5% (k = 2) at a mean temperature of 23.5 °C. The single laboratory validation approach is a simple approach for routine testing laboratories to estimate the uncertainty of thermal conductivity measurement with an HFM, in accordance with ISO/IEC 17025:2017 requirements. The results are meaningful for laboratory competence improvement, quality control of products, and conformity assessment.
Keywords: single laboratory validation approach, within-laboratory reproducibility, method and laboratory bias, certified reference material
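The combined and expanded uncertainty arithmetic reported above follows the usual root-sum-of-squares rule with a coverage factor of k = 2. A small sketch follows; the 1.1% and 1.4% figures come from the abstract, while the remaining component values are placeholders.

```python
import math

# Relative standard uncertainty components (fractions, not %).
components = {
    "within_lab_reproducibility": 0.011,   # 1.1%, from the abstract
    "method_and_lab_bias": 0.014,          # 1.4%, from the abstract
    "repeatability": 0.004,                # placeholder value
    "sample_density": 0.003,               # placeholder value
    "hfm_resolution": 0.002,               # placeholder value
}

u_combined = math.sqrt(sum(u**2 for u in components.values()))
k = 2                                      # coverage factor, ~95% confidence
u_expanded = k * u_combined

thermal_conductivity = 0.03367             # W/(m·K), from the abstract
print(f"combined = {u_combined:.2%}, expanded (k=2) = {u_expanded:.2%}")
print(f"result: {thermal_conductivity} W/m·K ± {u_expanded:.1%} (k = 2)")
```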
Procedia PDF Downloads 153
1361 Efficient Tuning Parameter Selection by Cross-Validated Score in High Dimensional Models
Authors: Yoonsuh Jung
Abstract:
As DNA microarray data contain relatively small sample sizes compared to the number of genes, high dimensional models are often employed. In high dimensional models, the selection of the tuning parameter (or penalty parameter) is often one of the crucial parts of the modeling. Cross-validation is one of the most common methods for tuning parameter selection, choosing the parameter value with the smallest cross-validated score. However, selecting a single value as the "optimal" value for the parameter can be very unstable due to sampling variation, since the sample sizes of microarray data are often small. Our approach is to choose multiple candidates for the tuning parameter first, then average the candidates with different weights depending on their performance. The additional step of estimating the weights and averaging the candidates rarely increases the computational cost, while it can considerably improve on traditional cross-validation. We show, via real data and simulated data sets, that the values selected by the suggested methods often lead to stable parameter selection as well as improved detection of significant genetic variables compared to traditional cross-validation.
Keywords: cross validation, parameter averaging, parameter selection, regularization parameter search
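A minimal sketch of the candidate-averaging idea using ridge regression: instead of keeping only the penalty value with the best cross-validated score, the top candidates are combined with performance-based weights. The inverse-MSE weighting and the top-5 cutoff are illustrative assumptions, not the authors' weighting scheme.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 200))                  # small n, large p
y = X[:, :5] @ np.ones(5) + rng.normal(size=50)

lambdas = np.logspace(-2, 3, 30)
scores = np.array([
    -cross_val_score(Ridge(alpha=lam), X, y,
                     scoring="neg_mean_squared_error", cv=5).mean()
    for lam in lambdas
])

# Traditional CV: the single minimizer (can be unstable for small n).
best = lambdas[np.argmin(scores)]

# Averaging: weight the top candidates by inverse CV error (assumed scheme).
top = np.argsort(scores)[:5]
w = 1.0 / scores[top]
lam_avg = float(np.sum(w * lambdas[top]) / w.sum())
print(f"single best lambda = {best:.3g}, averaged lambda = {lam_avg:.3g}")
```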
Procedia PDF Downloads 415
1360 Validation of Existing Index Properties-Based Correlations for Estimating the Soil–Water Characteristic Curve of Fine-Grained Soils
Authors: Karim Kootahi, Seyed Abolhasan Naeini
Abstract:
The soil-water characteristic curve (SWCC), which represents the relationship between suction and water content (or degree of saturation), is an important property of unsaturated soils. The conventional method for determining the SWCC is through specialized testing procedures. Since these procedures require specialized unsaturated soil testing apparatus and lengthy testing programs, several index properties-based correlations have been developed for estimating the SWCC of fine-grained soils. There are, however, considerable inconsistencies among the published correlations, and there is no validation study on the predictive ability of the existing correlations. In the present study, all existing index properties-based correlations are evaluated using a high quality worldwide database. The performance of the existing correlations is assessed both graphically and quantitatively using statistical measures. The results of the validation indicate that most of the existing correlations provide unacceptable estimates of the degree of saturation, but the most recent model appears to be promising.
Keywords: SWCC, correlations, index properties, validation
Procedia PDF Downloads 176
1359 Design, Fabrication, and Experimental Validation of a Warm Bulge Test System
Authors: Emine Feyza Şükür, Mevlüt Türköz, Murat Dilmeç, Hüseyin Selçuk Halkacı
Abstract:
In this study, a warm bulge test system, with all necessary sub-systems, was designed, built, and experimentally validated. The performance of each sub-system was validated through repeated production and/or test runs as well as through part quality measurements. Validation and performance tests were performed to characterize the repeatability of the system. As a result of these tests, the desired temperature distribution on the sheet metal was obtained by the heating systems, and good repeatability of the bulge tests was achieved. This study is therefore expected to provide other researchers and manufacturers with a set of design and process guidelines for developing similar systems.
Keywords: design, test unit, warm bulge test unit, validation test
Procedia PDF Downloads 491
1358 Decision Tree Analysis of Risk Factors for Intravenous Infiltration among Hospitalized Children: A Retrospective Study
Authors: Soon-Mi Park, Ihn Sook Jeong
Abstract:
This retrospective study aimed to identify risk factors for intravenous (IV) infiltration in hospitalized children. The participants were 1,174 children in the test sample and 424 children in the validation sample who were admitted to a general hospital, received peripheral intravenous injection therapy at least once, and had complete records. Frequencies and percentages, or means and standard deviations, were calculated, and decision tree analysis was used to screen for the most important risk factors for IV infiltration in hospitalized children. The decision tree analysis showed that the most important traditional risk factors for IV infiltration were the use of ampicillin/sulbactam, IV insertion site (lower extremities), and medical department (internal medicine), in both the test sample and the validation sample. The correct classification rate was 92.2% in the test sample and 90.1% in the validation sample. More careful attention should be paid to patients who are administered ampicillin/sulbactam, have an IV site in the lower extremities, and have internal medical problems, in order to prevent or detect infiltration.
Keywords: decision tree analysis, intravenous infiltration, child, validation
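A minimal sketch of the test/validation decision-tree workflow described above, using scikit-learn on synthetic binary risk-factor data; the feature names echo the abstract's risk factors, but the data, effect sizes, and tree settings are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)
features = ["ampicillin_sulbactam", "iv_site_lower_extremity", "internal_medicine"]

def synthesize(n):
    X = rng.integers(0, 2, size=(n, 3))              # binary risk factors
    logit = -2.0 + 1.2 * X[:, 0] + 0.9 * X[:, 1] + 0.7 * X[:, 2]
    y = rng.random(n) < 1 / (1 + np.exp(-logit))     # infiltration occurred?
    return X, y.astype(int)

X_dev, y_dev = synthesize(1174)                      # 'test' (development) sample
X_valid, y_valid = synthesize(424)                   # validation sample

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X_dev, y_dev)
print("correct classification (test):      ", round(tree.score(X_dev, y_dev), 3))
print("correct classification (validation):", round(tree.score(X_valid, y_valid), 3))
```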
Procedia PDF Downloads 176
1357 Measurement Errors and Misclassifications in Covariates in Logistic Regression: Bayesian Adjustment of Main and Interaction Effects and the Sample Size Implications
Authors: Shahadut Hossain
Abstract:
Measurement errors in continuous covariates and/or misclassifications in categorical covariates are common in epidemiological studies. Regression analysis that ignores such mismeasurements seriously biases the estimated main and interaction effects of covariates on the outcome of interest; adjustments for the mismeasurements are therefore necessary. In this research, we propose a Bayesian parametric framework for eliminating the deleterious impacts of covariate mismeasurements in logistic regression. The proposed adjustment method is unified and thus can be applied to any generalized linear or non-linear regression model. Furthermore, adjustment for covariate mismeasurements requires validation data, usually in the form of either gold standard measurements or replicates of the mismeasured covariates on a subset of the study population. Initial investigation shows that the adequacy of such adjustment depends on the sizes of the main and validation samples, especially when the prevalences of the categorical covariates are low. We therefore investigate the impact of the main and validation sample sizes on the adjusted estimates, and provide general guidelines about these sample sizes based on simulation studies.
Keywords: measurement errors, misclassification, mismeasurement, validation sample, Bayesian adjustment
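To see the bias being addressed here, a small simulation sketch: a binary covariate is misclassified with fixed sensitivity and specificity, and a naive logistic regression on the error-prone covariate attenuates the true effect. The simulation settings are illustrative assumptions; the Bayesian adjustment itself is not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n, beta_true = 20000, 1.0
x = rng.random(n) < 0.2                               # true binary covariate
y = (rng.random(n) < 1 / (1 + np.exp(-(-1.0 + beta_true * x)))).astype(int)

# Misclassify x with sensitivity 0.85 and specificity 0.90.
sens, spec = 0.85, 0.90
x_obs = np.where(x, rng.random(n) < sens, rng.random(n) > spec)

def fitted_beta(covariate):
    # Large C makes the L2 penalty negligible (effectively unpenalized fit).
    model = LogisticRegression(C=1e6).fit(covariate.reshape(-1, 1), y)
    return float(model.coef_[0, 0])

print("beta with true covariate:    ", round(fitted_beta(x.astype(float)), 3))
print("beta with misclassified one: ", round(fitted_beta(x_obs.astype(float)), 3))
```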
Procedia PDF Downloads 408
1356 Translation, Cultural Adaptation and Validation of the Hungarian Version of Self-Determination Scale
Authors: E. E. Marschalko, K. Kalcza-Janosi, I. Kotta, B. Bibok
Abstract:
Cultural moderation aspects have been highlighted in the literature on self-determination behavior in some cultures, including the Hungarian population. There is a lack of validated instruments in Hungarian for the assessment of self-determination related behaviors. In order to fill this gap, the aim of this study was the translation, cultural adaptation, and validation of the Self-Determination Scale (Sheldon, 1995) for the Hungarian population. A total of 4,335 adults participated in the study. The mean age of the participants was 27.97 (SD = 9.60). The sample consisted mostly of females; less than 20% were males. Exploratory and confirmatory factor analyses were performed for adequacy checking, and Cronbach's alpha was used to examine the reliability of the factors. Our results revealed that the Hungarian version of the SDS has good psychometric properties and is a reliable tool for psychologists who would like to study or assess self-determination in their clients. The final, adapted and validated SDS items are presented in this paper.
Keywords: self-determination scale, Hungarian, adaptation, validation, reliability
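Cronbach's alpha, the reliability coefficient used above, has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A sketch with placeholder Likert responses follows.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Placeholder Likert responses (5 respondents x 4 items) for illustration.
responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])
print(round(cronbach_alpha(responses), 3))
```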
Procedia PDF Downloads 254
1355 The Development and Validation of the Awareness to Disaster Risk Reduction Questionnaire for Teachers
Authors: Ian Phil Canlas, Mageswary Karpudewan, Joyce Magtolis, Rosario Canlas
Abstract:
This study reports the development and validation of the Awareness to Disaster Risk Reduction Questionnaire for Teachers (ADRRQT). The questionnaire is a combination of Likert-scale and open-ended questions grouped into two parts: the first part includes questions relating to general awareness of disaster risk reduction, while the second part comprises questions regarding the integration of disaster risk reduction into the teaching process. The entire process of developing and validating the ADRRQT is described in this study. Statistical and qualitative findings revealed that the ADRRQT is valid and reliable and has the potential to measure the awareness of disaster risk reduction among stakeholders in the field of teaching. It also shows the potential to be adopted in other fields.
Keywords: awareness, development, disaster risk reduction, questionnaire, validation
Procedia PDF Downloads 228
1354 The Network Relative Model Accuracy (NeRMA) Score: A Method to Quantify the Accuracy of Prediction Models in a Concurrent External Validation
Authors: Carl van Walraven, Meltem Tuna
Abstract:
Background: Network meta-analysis (NMA) quantifies the relative efficacy of 3 or more interventions from studies containing a subgroup of interventions. This study applied the analytical approach of NMA to quantify the relative accuracy of prediction models with distinct inclusion criteria that are evaluated on a common population ('concurrent external validation'). Methods: We simulated binary events in 5,000 patients using a known risk function. We biased the risk function and modified its precision by pre-specified amounts to create 15 prediction models with varying accuracy and distinct patient applicability. Prediction model accuracy was measured using the scaled Brier score (SBS). Overall prediction model accuracy was measured using fixed-effects methods that accounted for model applicability patterns. Prediction model accuracy was summarized as the Network Relative Model Accuracy (NeRMA) score, which ranges from -∞ through 0 (accuracy of random guessing) to 1 (accuracy of the most accurate model in the concurrent external validation). Results: The unbiased prediction model had the highest SBS. The NeRMA score correctly ranked all simulated prediction models by the extent of bias from the known risk function. A SAS macro and an R function were created to implement the NeRMA score. Conclusions: The NeRMA score makes it possible to quantify the accuracy of binomial prediction models having distinct inclusion criteria in a concurrent external validation.
Keywords: prediction model accuracy, scaled brier score, fixed effects methods, concurrent external validation
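The scaled Brier score underlying the NeRMA score has a standard form, SBS = 1 - Brier / Brier of guessing the overall event rate. The sketch below computes it for a few simulated models and then applies one plausible normalization against the best model in the network; that normalization is an assumption, since the abstract does not give the NeRMA formula.

```python
import numpy as np

def scaled_brier(y, p):
    """Scaled Brier score: 1 - Brier / Brier of guessing the event rate."""
    y, p = np.asarray(y, float), np.asarray(p, float)
    brier = np.mean((p - y) ** 2)
    brier_ref = np.mean((y.mean() - y) ** 2)
    return 1 - brier / brier_ref

rng = np.random.default_rng(5)
true_p = rng.uniform(0.05, 0.6, size=5000)
y = (rng.random(5000) < true_p).astype(float)

models = {
    "unbiased": true_p,
    "biased":   np.clip(true_p + 0.15, 0, 1),
    "noisy":    np.clip(true_p + rng.normal(0, 0.2, 5000), 0, 1),
}
sbs = {name: scaled_brier(y, p) for name, p in models.items()}

# Assumed normalization: 0 = random guessing, 1 = best model in the network.
best = max(sbs.values())
nerma_like = {name: s / best for name, s in sbs.items()}
print(sbs, nerma_like, sep="\n")
```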
Procedia PDF Downloads 235
1353 Developing and Evaluating Clinical Risk Prediction Models for Coronary Artery Bypass Graft Surgery
Authors: Mohammadreza Mohebbi, Masoumeh Sanagou
Abstract:
The ability to predict clinical outcomes is of great importance to physicians and clinicians. A number of different methods have been used in an effort to accurately predict these outcomes, including scoring systems based on multivariate statistical modelling and models involving the use of classification and regression trees. The process usually consists of two consecutive phases, namely model development and external validation. The model development phase consists of building a multivariate model, evaluating its predictive performance by examining calibration and discrimination, and internal validation. External validation tests the predictive performance of a model by assessing its calibration and discrimination in different but plausibly related patients. A motivating example focusing on prediction modelling with a sample of patients who underwent coronary artery bypass graft (CABG) surgery is used for illustration, and a set of primary considerations for evaluating prediction model studies, using specific quality indicators as criteria to help stakeholders judge the quality of a prediction model study, is proposed.
Keywords: clinical prediction models, clinical decision rule, prognosis, external validation, model calibration, biostatistics
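Discrimination and calibration, the two performance dimensions named above, can be checked with a few lines of scikit-learn. The synthetic risk scores are placeholders, and the c-statistic (ROC AUC) plus a binned calibration table are one common choice of metrics, not necessarily the authors'.

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(11)
p_pred = rng.uniform(0.01, 0.9, size=2000)             # model risk predictions
y = (rng.random(2000) < p_pred * 0.9).astype(int)      # slightly miscalibrated

# Discrimination: c-statistic / area under the ROC curve.
print("AUC:", round(roc_auc_score(y, p_pred), 3))

# Calibration: observed vs predicted event rates in 10 risk bins.
obs, pred = calibration_curve(y, p_pred, n_bins=10, strategy="quantile")
for o, p in zip(obs, pred):
    print(f"predicted {p:.2f}  observed {o:.2f}")
```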
Procedia PDF Downloads 297
1352 Experimental Validation of Computational Fluid Dynamics Used for Pharyngeal Flow Patterns during Obstructive Sleep Apnea
Authors: Pragathi Gurumurthy, Christina Hagen, Patricia Ulloa, Martin A. Koch, Thorsten M. Buzug
Abstract:
Obstructive sleep apnea (OSA) is a sleep disorder in which the patient suffers disturbed airflow during sleep due to partial or complete occlusion of the pharyngeal airway. Recently, numerical simulations have been used to better understand the mechanism of pharyngeal collapse. However, to gain confidence in the solutions so obtained, experimental validation is required. Therefore, in this study an experimental validation of computational fluid dynamics (CFD) used for the study of human pharyngeal flow patterns during OSA is performed. A stationary incompressible Navier-Stokes equation, solved using the finite element method, was used to numerically study the flow patterns in a computed tomography-based human pharynx model. The inlet flow rate was set to 250 ml/s with a flat profile maintained at the inlet, and the outlet pressure was set to 0 Pa. The experimental technique used for the validation of the CFD flow patterns is phase contrast-MRI (PC-MRI). Using the same computed tomography data of the human pharynx as in the simulations, a phantom for the experiment was 3D printed. Glycerol in water (55.27% by weight) was used as the test fluid at 25 °C. Inflow conditions similar to those of the CFD study were reproduced using an MRI-compatible flow pump (CardioFlow-5000MR, Shelley Medical Imaging Technologies). The entire experiment was performed on a 3 T MR system (Ingenia, Philips) with a 108-channel body coil, using an RF-spoiled gradient echo sequence. A comparison of the axial velocities obtained in the pharynx from the numerical simulations and from PC-MRI shows good agreement, and the regions of jet impingement and recirculation also coincide, thereby validating the numerical simulations. Hence, the experimental validation demonstrates the reliability and correctness of the numerical simulations.
Keywords: computational fluid dynamics, experimental validation, phase contrast-MRI, obstructive sleep apnea
Procedia PDF Downloads 311
1351 Validation of the X-Ray Densitometry Method for Radial Density Pattern Determination of Acacia seyal var. seyal Tree Species
Authors: Hanadi Mohamed Shawgi Gamal, Claus Thomas Bues
Abstract:
Wood density is a variable influencing many of the technological and quality properties of wood, and understanding the pattern of its radial variation is important for end-use decisions. The X-ray technique has traditionally been applied to softwood species to assess wood quality properties, owing to their simple and relatively uniform wood structure. On the other hand, very limited information is available on the validity of using this technique for hardwood species. The suitability of the X-ray technique for determining hardwood density has special significance in countries like Sudan, where only a few timbers are well known. It would not only save the time consumed by traditional methods, but would also support the investigation of the great number of lesser-known species, filling the large gap in information on hardwood species growing in Sudan. The current study aimed to evaluate the validity of using the X-ray densitometry technique to determine the radial variation of wood density of Acacia seyal var. seyal. To this end, a total of thirty trees were collected randomly from four states in Sudan. The radial trend in wood density was determined using basic density as well as density obtained by the X-ray densitometry method, in order to assess the validity of X-ray densitometry for determining radial variation in wood density. The results showed that the radial trend of density obtained by the X-ray technique is very similar to that obtained from basic density. These results confirm the validity of using the X-ray technique for determining the radial density trend of Acacia seyal var. seyal, and support the suitability of this method for other hardwood species.
Keywords: x-ray densitometry, wood density, Acacia seyal var. seyal, radial variation
Procedia PDF Downloads 152
1350 Development of a Predictive Model to Prevent Financial Crisis
Authors: Tengqin Han
Abstract:
Delinquency has been a crucial factor in economics throughout the years. Commonly seen in credit cards and mortgages, it played a crucial role in causing the most recent financial crisis, in 2008. In each case, a delinquency is a sign that the borrower is unable to pay off the debt, and it may thus lead to a loss of property in the end. Individually, one case of delinquency seems unimportant compared to the entire credit system. China, as an emerging economic entity, has grown rapidly in national and economic strength, and its gross domestic product (GDP) growth rate has remained as high as 8% over the past decades. However, potential risks exist behind the appearance of prosperity, and among these risks, the credit system is the most significant one. Due to the long terms and large balances of mortgages, it is critical to monitor risk during the performance period. In this project, about 300,000 mortgage account records are analyzed in order to develop a predictive model of the probability of delinquency. Through univariate analysis, the data are cleaned, and through bivariate analysis, the variables with strong predictive power are detected. The project is divided into two parts. In the first part, the 2005 analysis data are split into two parts: 60% for model development and 40% for in-time model validation. The KS for model development is 31 and the KS for in-time validation is 31, indicating that the model is stable. In addition, the model is further validated by out-of-time validation, which uses 40% of the 2006 data, giving a KS of 33; this indicates the model is still stable and robust. In the second part, the model is improved by the addition of macroeconomic indexes, including GDP, the consumer price index, the unemployment rate, the inflation rate, etc. The data from 2005 to 2010 are used for model development and validation. Compared with the base model (without macroeconomic variables), the KS increases from 41 to 44, indicating that the macroeconomic variables can be used to improve the separation power of the model and make the prediction more accurate.
Keywords: delinquency, mortgage, model development, model validation
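The KS statistic used throughout this abstract is the maximum gap between the cumulative score distributions of delinquent and non-delinquent accounts, conventionally quoted on a 0-100 scale. A sketch with synthetic scores follows; the data are placeholders.

```python
import numpy as np

def ks_statistic(scores, is_bad):
    """KS = max gap between the cumulative distributions of bads and goods,
    scanned over score thresholds; returned on the conventional 0-100 scale."""
    scores, is_bad = np.asarray(scores, float), np.asarray(is_bad, bool)
    order = np.argsort(scores)
    bad = is_bad[order]
    cum_bad = np.cumsum(bad) / bad.sum()
    cum_good = np.cumsum(~bad) / (~bad).sum()
    return 100 * np.max(np.abs(cum_bad - cum_good))

rng = np.random.default_rng(8)
n_good, n_bad = 9000, 1000
scores = np.concatenate([rng.normal(600, 50, n_good), rng.normal(560, 50, n_bad)])
is_bad = np.concatenate([np.zeros(n_good, bool), np.ones(n_bad, bool)])
print(round(ks_statistic(scores, is_bad), 1))
```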
Procedia PDF Downloads 228
1349 Development and Validation of a HPLC Method for Standardization of Methanolic Extract of Hypericum sinaicum Hochst
Authors: Taghreed A. Ibrahim, Atef A. El-Hela, Hala M. El-Hefnawy
Abstract:
The chromatographic profile of the methanol extract of Hypericum sinaicum was determined using HPLC-DAD. Apigenin was used as an external standard in the development and validation of the HPLC method. The proposed method is simple, rapid, and reliable, and can be successfully applied to the standardization of Hypericum sinaicum methanol extract.
Keywords: quality control, standardization, flavonoids, methanol extract
Procedia PDF Downloads 503
1348 Development of a Decision-Making Method by Using Machine Learning Algorithms in the Early Stage of School Building Design
Authors: Pegah Eshraghi, Zahra Sadat Zomorodian, Mohammad Tahsildoost
Abstract:
Over the past decade, energy consumption in educational buildings has steadily increased. The purpose of this research is to provide a method to quickly predict the energy consumption of buildings, using separate evaluation of zones and decomposition of the building to eliminate the complexity of geometry at the early design stage. To produce this framework, machine learning algorithms such as support vector regression (SVR) and artificial neural networks (ANN) are used to predict energy consumption and thermal comfort metrics for a school as a case study. The database consists of more than 55,000 samples from three climates of Iran. Cross-validation and unseen data have been used for validation. For a specific label, cooling energy, the accuracy of prediction is at least 84% and 89% for SVR and ANN, respectively. The results show that the SVR performed much better than the ANN.
Keywords: early stage of design, energy, thermal comfort, validation, machine learning
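A minimal sketch of the SVR-with-cross-validation setup described above, run on synthetic zone-level features; the feature set, kernel, and R² scoring are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(21)
n = 2000
# Placeholder zone features: area, window-to-wall ratio, occupancy, climate id.
X = np.column_stack([
    rng.uniform(40, 120, n), rng.uniform(0.1, 0.6, n),
    rng.integers(15, 35, n), rng.integers(0, 3, n),
])
cooling_energy = (0.8 * X[:, 0] + 60 * X[:, 1] + 1.5 * X[:, 2]
                  + 10 * X[:, 3] + rng.normal(0, 5, n))

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
scores = cross_val_score(model, X, cooling_energy, cv=5, scoring="r2")
print("cross-validated R^2:", scores.round(3), "mean:", scores.mean().round(3))
```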
Procedia PDF Downloads 98
1347 Validation of Mapping Historical Linked Data to International Committee for Documentation (CIDOC) Conceptual Reference Model Using Shapes Constraint Language
Authors: Ghazal Faraj, András Micsik
Abstract:
Shapes Constraint Language (SHACL), a World Wide Web Consortium (W3C) language, provides well-defined shapes in RDF graphs, named "shape graphs". These shape graphs validate other resource description framework (RDF) graphs, which are called "data graphs". The structural features of SHACL permit generating a variety of conditions to evaluate string matching patterns, value types, and other constraints. Moreover, the SHACL framework supports higher-level validation by expressing more complex conditions in languages such as the SPARQL Protocol and RDF Query Language (SPARQL). SHACL includes two parts: SHACL Core, which includes all shapes that cover the most frequent constraint components, and SHACL-SPARQL, an extension that allows SHACL to express more complex, customized constraints. Validating the efficacy of dataset mapping is an essential component of data reconciliation mechanisms, as enhancing the linking of different datasets is an ongoing process. The conventional validation methods are the semantic reasoner and SPARQL queries: the former checks formalization errors and data type inconsistencies, while the latter detects data contradictions. After executing SPARQL queries, the retrieved information needs to be checked manually by an expert; this methodology is time-consuming and inaccurate, as it does not test the mapping model comprehensively. Therefore, there is a serious need for a new methodology that covers all validation aspects of linking and mapping diverse datasets. Our goal is to develop a new approach that achieves optimal validation outcomes. The first step towards this goal is implementing SHACL to validate the mapping between the International Committee for Documentation (CIDOC) conceptual reference model (CRM) and one of its ontologies. To initiate this project successfully, a thorough understanding of both the source and target ontologies was required. Subsequently, the proper environment to run SHACL and its shape graphs was determined. As a case study, we applied SHACL to a CIDOC-CRM dataset after running a Pellet reasoner via the Protégé program. The applied validation falls under multiple categories: a) data type validation, which checks whether the source data are mapped to the correct data type, for instance, checking whether a birthdate is assigned to xsd:dateTime and linked to a Person entity via the crm:P82a_begin_of_the_begin property; b) data integrity validation, which detects inconsistent data, for instance, inspecting whether a person's birthdate occurred before any of the linked event creation dates. The expected results of our work are: 1) highlighting validation techniques and categories, and 2) selecting the most suitable techniques for the various categories of validation tasks. The next step is to establish a comprehensive validation model and generate SHACL shapes automatically.
Keywords: SHACL, CIDOC-CRM, SPARQL, validation of ontology mapping
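A minimal sketch of the datatype check described in category (a), written with the rdflib/pySHACL toolchain; using pySHACL is an assumption (the abstract does not name an implementation), and the person IRI and data values are placeholders.

```python
from rdflib import Graph
from pyshacl import validate

data_ttl = """
@prefix crm: <http://www.cidoc-crm.org/cidoc-crm/> .
@prefix ex:  <http://example.org/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

ex:person1 a crm:E21_Person ;
    crm:P82a_begin_of_the_begin "1854-03-01"^^xsd:date .   # wrong datatype
"""

shapes_ttl = """
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix crm: <http://www.cidoc-crm.org/cidoc-crm/> .
@prefix ex:  <http://example.org/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

ex:PersonBirthShape a sh:NodeShape ;
    sh:targetClass crm:E21_Person ;
    sh:property [
        sh:path crm:P82a_begin_of_the_begin ;
        sh:datatype xsd:dateTime ;        # birthdate must be xsd:dateTime
    ] .
"""

conforms, _, report_text = validate(
    Graph().parse(data=data_ttl, format="turtle"),
    shacl_graph=Graph().parse(data=shapes_ttl, format="turtle"),
)
print(conforms)       # False: the literal above is xsd:date, not xsd:dateTime
print(report_text)
```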
Procedia PDF Downloads 253
1346 Synthesis, Characterization, Validation of Resistant Microbial Strains and Antimicrobial Activity of Substituted Pyrazoles
Authors: Rama Devi Kyatham, D. Ashok, K. S. K. Rao Patnaik, Raju Bathula
Abstract:
We have shown the importance of pyrazoles as antimicrobial chemical entities. These compounds have generally been considered significant due to their wide range of pharmacological activities, and their discovery motivates new avenues of research. The proposed pyrazoles were synthesized and evaluated for their antimicrobial activities, and the synthesized compounds were analyzed by different spectroscopic methods.
Keywords: pyrazoles, validation, resistant microbial strains, anti-microbial activities
Procedia PDF Downloads 172