Search results for: ontology validation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1493

1433 HBTOnto: An Ontology Model for Analyzing Human Behavior Trajectories

Authors: Heba M. Wagih, Hoda M. O. Mokhtar

Abstract:

Social networks have recently played a significant role in both scientific and social communities. The growing adoption of social network applications has made them a relevant source of information nowadays. Due to this popularity, several research trends have emerged to serve the huge volume of users, including Location-Based Social Networks (LBSN), recommendation systems, sentiment analysis applications, and many others. LBSN applications are among the most highly demanded applications; they do not focus only on analyzing the spatiotemporal positions in a given raw trajectory but also on understanding the semantics behind the dynamics of the moving object. LBSNs are possible means of predicting human mobility based on users’ social ties as well as their spatial preferences. LBSNs rely on the efficient representation of users’ trajectories; hence, traditional raw trajectory information is no longer convenient. In our research, we focus on studying human behavior trajectories, which are the major pillar of location recommendation systems. In this paper, we propose ontology design patterns with their underlying description logics to efficiently annotate human behavior trajectories.

Keywords: human behavior trajectory, location-based social network, ontology, social network

Procedia PDF Downloads 425
1432 Deep Learning-Based Object-Classes Semantic Classification of Arabic Texts

Authors: Imen Elleuch, Wael Ouarda, Gargouri Bilel

Abstract:

We propose in this paper a deep learning-based approach to classify text in order to enrich an Arabic ontology based on the object classes of Gaston Gross. Those object classes are defined by taking into account the syntactic and semantic features of the treated language. Thus, our proposed approach is a hybrid one: on the one hand, it is based on the object classes, which represent a knowledge-based approach to text classification, and on the other hand, it uses a deep learning approach that relies on word embeddings to classify text. We have applied our proposed approach to a corpus constructed from an Arabic dictionary. The obtained semantic classification of text will enrich the Arabic object classes ontology: new classes can be added to the ontology, or the features that characterize each object class can be expanded. The obtained results are compared to a similar work that treats the same task with a classical linguistic approach to the semantic classification of text. This comparison highlights our hybrid approach, which can be improved by broadening the dataset used in the deep learning process.
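
As an illustration of the word-embedding side of such a hybrid classifier, the following minimal sketch averages word vectors per document and feeds them to a linear classifier. The toy corpus, labels, and embedding size are hypothetical and stand in for the authors’ Arabic dictionary corpus and object classes.

```python
# Minimal sketch of an embedding-based text classifier (hypothetical toy data,
# not the authors' corpus or object classes).
import numpy as np
from gensim.models import Word2Vec
from sklearn.linear_model import LogisticRegression

# Toy tokenized documents and object-class labels (hypothetical).
docs = [["river", "bank", "water"], ["bank", "money", "loan"],
        ["water", "flow", "stream"], ["loan", "credit", "money"]]
labels = [0, 1, 0, 1]

# Train small word embeddings on the toy corpus itself.
w2v = Word2Vec(sentences=docs, vector_size=50, min_count=1, seed=1)

def doc_vector(tokens):
    """Average the word vectors of a document."""
    return np.mean([w2v.wv[t] for t in tokens], axis=0)

X = np.vstack([doc_vector(d) for d in docs])
clf = LogisticRegression().fit(X, labels)
print(clf.predict([doc_vector(["money", "credit"])]))
```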

Keywords: deep-learning approach, object-classes, semantic classification, Arabic

Procedia PDF Downloads 41
1431 Ontology-Based Systemizing of the Science Information Devoted to Waste Utilizing by Methanogenesis

Authors: Ye. Shapovalov, V. Shapovalov, O. Stryzhak, A. Salyuk

Abstract:

Over the past decades, the amount of scientific information has been growing exponentially, and it has become more complicated to process and systemize this amount of data. An approach to the systematization of scientific information on biogas production, based on the ontological IT platform “T.O.D.O.S.”, has been developed. It has been proposed to select semantic characteristics of each work for their further introduction into the IT platform “T.O.D.O.S.”. An ontological graph with a ranking function for previous scientific research and for a system of microorganism selection has been worked out. These systems provide high-performance management of scientific information.

Keywords: ontology-based analysis, analysis of scientific data, methanogenesis, microorganism hierarchy, 'T.O.D.O.S.'

Procedia PDF Downloads 132
1430 Social Media Idea Ontology: A Concept for Semantic Search of Product Ideas in Customer Knowledge through User-Centered Metrics and Natural Language Processing

Authors: Martin Häusl, Maximilian Auch, Johannes Forster, Peter Mandl, Alexander Schill

Abstract:

In order to survive on the market, companies must constantly develop improved and new products. These products are designed to serve the needs of their customers in the best possible way. The creation of new products is also called innovation and is primarily driven by a company’s internal research and development department. However, a new approach has been taking place for some years now, involving external knowledge in the innovation process. This approach is called open innovation and identifies customer knowledge as the most important source in the innovation process. This paper presents a concept of using social media posts as an external source to support the open innovation approach in its initial phase, the Ideation phase. For this purpose, the social media posts are semantically structured with the help of an ontology, and the authors are evaluated using graph-theoretical metrics such as density. For the structuring and evaluation of relevant social media posts, we also use findings from Natural Language Processing, e.g. Named Entity Recognition, specific dictionaries, a Triple Tagger, and a Part-of-Speech Tagger. The selection and evaluation of the tools used are discussed in this paper. Using our ontology and metrics to structure social media posts enables users to semantically search these posts for new product ideas and thus gain an improved insight into external sources such as customer needs.
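
As an illustration of the graph-theoretical metrics mentioned above, the following minimal sketch computes density and a simple centrality score over a hypothetical author-interaction graph; the graph and its edges are made up, not the paper’s dataset or ontology.

```python
# Minimal sketch: scoring authors of social media posts with graph metrics
# such as density (hypothetical interaction graph, not the paper's data).
import networkx as nx

G = nx.Graph()
# Edges represent interactions (replies, mentions) between authors.
G.add_edges_from([("alice", "bob"), ("bob", "carol"),
                  ("carol", "alice"), ("dave", "alice")])

density = nx.density(G)                      # 2m / (n(n-1)) for undirected graphs
degree_centrality = nx.degree_centrality(G)  # per-author influence proxy

print(f"graph density: {density:.2f}")
print(sorted(degree_centrality.items(), key=lambda kv: -kv[1]))
```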

Keywords: idea ontology, innovation management, semantic search, open information extraction

Procedia PDF Downloads 164
1429 Preliminary Study of Standardization and Validation of the Micronuclei Technique to Assess DNA Damage Caused by X-Rays

Authors: L. J. Díaz, M. A. Hernández, A. K. Molina, A. Bermúdez, C. Crane, V. M. Pabón

Abstract:

One of the most important biological indicators of exposure to radiation is the micronucleus (MN). This technique is used to determine radiation effects in blood cultures as a biological control and as a complement to physical dosimetry. In Colombia, the need to apply this analysis has emerged because the biological indicator currently most used is chromosomal aberrations (CA); that is why the standardization and validation of the MN technique are essential in order to have enough tools to improve radioprotection in the country. Besides, this technique will be applied in the construction of a dose-response curve, which allows estimating the approximate dose received by irradiated people according to the MN frequency found. Among the steps carried out to accomplish the standardization and validation is the statistical analysis of readings of “in vitro” peripheral blood cultures by different analysts; the culture medium and conditions under which MN can be detected most easily were also determined.

Keywords: micronuclei, radioprotection, standardization, validation

Procedia PDF Downloads 464
1428 E-Learning Recommender System Based on Collaborative Filtering and Ontology

Authors: John Tarus, Zhendong Niu, Bakhti Khadidja

Abstract:

In recent years, e-learning recommender systems have attracted great attention as a solution for addressing the problem of information overload in e-learning environments and providing relevant recommendations to online learners. E-learning recommenders continue to play an increasing educational role in aiding learners to find appropriate learning materials to support the achievement of their learning goals. Although general recommender systems have recorded significant success in solving the problem of information overload in e-commerce domains and providing accurate recommendations, e-learning recommender systems, on the other hand, still face some issues arising from differences in learner characteristics such as learning style, skill level, and study level. Conventional recommendation techniques such as collaborative filtering and content-based filtering deal with only two types of entities, namely users and items with their ratings. These conventional recommender systems do not take the learner characteristics into account in their recommendation process; therefore, they cannot make accurate and personalized recommendations in e-learning environments. In this paper, we propose a recommendation technique combining collaborative filtering and ontology to recommend personalized learning materials to online learners. The ontology is used to incorporate the learner characteristics into the recommendation process alongside the ratings, while collaborative filtering predicts ratings and generates recommendations. Furthermore, ontological knowledge is used by the recommender system at the initial stages, in the absence of ratings, to alleviate the cold-start problem. Evaluation results show that our proposed recommendation technique outperforms collaborative filtering on its own in terms of personalization and recommendation accuracy.
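
As a rough illustration of how ontology-derived learner characteristics can be blended into collaborative filtering, the following sketch combines a rating-based similarity with a profile-based similarity before predicting a rating. The ratings, profiles, and the 0.5/0.5 blend are hypothetical and are not the authors’ algorithm.

```python
# Minimal sketch: user-based collaborative filtering whose similarity is
# blended with a learner-profile (ontology-derived) similarity.
# Ratings, profiles and the alpha = 0.5 blend are hypothetical.
import numpy as np

ratings = np.array([      # rows = learners, cols = learning materials, 0 = unrated
    [5, 4, 0, 1],
    [4, 5, 3, 0],
    [1, 0, 5, 4],
])
profiles = [{"style": "visual", "level": "beginner"},
            {"style": "visual", "level": "intermediate"},
            {"style": "textual", "level": "beginner"}]

def rating_sim(a, b):
    """Cosine similarity over co-rated items."""
    mask = (ratings[a] > 0) & (ratings[b] > 0)
    if not mask.any():
        return 0.0
    u, v = ratings[a][mask], ratings[b][mask]
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def profile_sim(a, b):
    """Fraction of shared learner characteristics (ontology-derived)."""
    keys = profiles[a].keys()
    return sum(profiles[a][k] == profiles[b][k] for k in keys) / len(keys)

def predict(user, item, alpha=0.5):
    """Blend the two similarities, then take a weighted average of neighbours."""
    sims, vals = [], []
    for other in range(len(ratings)):
        if other == user or ratings[other, item] == 0:
            continue
        s = alpha * rating_sim(user, other) + (1 - alpha) * profile_sim(user, other)
        sims.append(s)
        vals.append(ratings[other, item])
    return float(np.dot(sims, vals) / sum(sims)) if sims else 0.0

print(predict(user=0, item=2))  # predicted rating of material 2 for learner 0
```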

Keywords: collaborative filtering, e-learning, ontology, recommender system

Procedia PDF Downloads 339
1427 Ontology-Based Fault Detection and Diagnosis System: Querying and Reasoning Examples

Authors: Marko Batic, Nikola Tomasevic, Sanja Vranes

Abstract:

One of the strongholds of the ubiquitous efforts related to energy conservation and energy efficiency improvement is the retrofit of high energy consumers in buildings. In general, HVAC systems represent the highest energy consumers in buildings. However, they usually suffer from mal-operation and/or malfunction, causing even higher energy consumption than necessary. Various Fault Detection and Diagnosis (FDD) systems can be successfully employed for this purpose, especially when it comes to application at a single device/unit level. In the case of more complex systems, where multiple devices are operating in the context of the same building, significant energy efficiency improvements can only be achieved through the application of comprehensive FDD systems relying on additional higher-level knowledge, such as the devices’ geographical location, served area, and their intra- and inter-system dependencies. This paper presents a comprehensive FDD system that relies on the utilization of a common knowledge repository that stores all critical information. The discussed system is deployed as a test-bed platform at the Fiumicino and Malpensa airports in Italy. This paper aims at presenting the advantages of implementing the knowledge base through an ontology and illustrates the improved functionalities of such a system through examples of typical queries and reasoning that enable the derivation of high-level energy conservation measures (ECM). Therefore, key SPARQL queries and SWRL rules, based on the two instantiated airport ontologies, are elaborated. The detection of high-level irregularities in the operation of airport heating/cooling plants is discussed, and an estimation of energy savings is reported.
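
The SPARQL queries elaborated in the paper are based on the two instantiated airport ontologies; the following sketch only illustrates the kind of query such a knowledge base can answer. The namespace, class, and property names (e.g. AirHandlingUnit, servesZone, hasSupplyTemperature) are hypothetical, not the actual Fiumicino/Malpensa ontologies.

```python
# Minimal sketch of the kind of SPARQL query an FDD knowledge base could answer;
# the namespace, class and property names are hypothetical.
from rdflib import Graph, Namespace, Literal, RDF

AP = Namespace("http://example.org/airport#")
g = Graph()

# Toy instantiation: one AHU serving Terminal 1 with an abnormal supply temperature.
g.add((AP.AHU_T1_01, RDF.type, AP.AirHandlingUnit))
g.add((AP.AHU_T1_01, AP.servesZone, AP.Terminal1))
g.add((AP.AHU_T1_01, AP.hasSupplyTemperature, Literal(35.0)))

query = """
PREFIX ap: <http://example.org/airport#>
SELECT ?unit ?zone ?temp WHERE {
    ?unit a ap:AirHandlingUnit ;
          ap:servesZone ?zone ;
          ap:hasSupplyTemperature ?temp .
    FILTER (?temp > 30.0)            # flag units with abnormal supply temperature
}
"""
for unit, zone, temp in g.query(query):
    print(f"possible fault: {unit} serving {zone}, supply temperature {temp}")
```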

Keywords: airport ontology, knowledge management, ontology modeling, reasoning

Procedia PDF Downloads 488
1426 OSEME: A Smart Learning Environment for Music Education

Authors: Konstantinos Sofianos, Michael Stefanidakis

Abstract:

Nowadays, advances in information and communication technologies offer a range of opportunities for new approaches, methods, and tools in the field of education and training. Teacher-centered learning has changed to student-centered learning. E-learning has now matured and enables the design and construction of intelligent learning systems. A smart learning system fully adapts to a student's needs and provides them with an education based on their preferences, learning styles, and learning backgrounds. It acts as a wise friend, available at any time, in any place, and on any digital device. In this paper, we propose an intelligent learning system, which includes an ontology with all elements of the learning process (learning objects, learning activities) and a massive open online course (MOOC) system. This intelligent learning system can be used in music education.

Keywords: intelligent learning systems, e-learning, music education, ontology, semantic web

Procedia PDF Downloads 280
1425 Analysis of Expression Data Using Unsupervised Techniques

Authors: M. A. I. Perera, C. R. Wijesinghe, A. R. Weerasinghe

Abstract:

This study was conducted to review and identify the unsupervised techniques that can be employed to analyze gene expression data in order to identify better subtypes of tumors. Identifying subtypes of cancer helps in improving the efficacy and reducing the toxicity of treatments by providing clues for finding targeted therapeutics. The process of gene expression data analysis is described in three steps: preprocessing, clustering, and cluster validation. Feature selection is important since genomic data are high dimensional, with a large number of features compared to samples. Hierarchical clustering and K-means are often used in the analysis of gene expression data. Several cluster validation techniques are used to validate the clusters. Heatmaps are an effective external validation method that allows comparing the identified classes with clinical variables and visually analyzing the classes.
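
As an illustration of the clustering and cluster-validation steps described above, the following sketch runs K-means and hierarchical clustering on synthetic expression-like data and scores each solution with the silhouette index as an internal validation measure; the data and cluster counts are made up, not the study’s pipeline.

```python
# Minimal sketch of clustering synthetic "expression-like" data and validating
# the cluster count with the silhouette index (illustrative, not the study).
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import silhouette_score

# 60 samples x 200 "genes", grouped into 3 latent subtypes.
X, _ = make_blobs(n_samples=60, n_features=200, centers=3, random_state=0)

for k in (2, 3, 4, 5):
    km_labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    hc_labels = AgglomerativeClustering(n_clusters=k).fit_predict(X)
    print(k,
          round(silhouette_score(X, km_labels), 3),   # internal validation, K-means
          round(silhouette_score(X, hc_labels), 3))   # internal validation, hierarchical
```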

Keywords: cancer subtypes, gene expression data analysis, clustering, cluster validation

Procedia PDF Downloads 112
1424 A Semantic Registry to Support Brazilian Aeronautical Web Services Operations

Authors: Luís Antonio de Almeida Rodriguez, José Maria Parente de Oliveira, Ednelson Oliveira

Abstract:

In the last two decades, the world’s aviation authorities have made several attempts to reach consensus on a global, accepted approach for applying semantics to web services registry descriptions. This problem has led communities to face a bloated and disorganized infrastructure for describing aeronautical web services. It is usual for developers to implement ad-hoc connections among consumers and providers and manually create non-standardized service compositions, which need some particular approach to compose and semantically discover a desired web service. Current practices are not precise and tend to focus on lightweight specifications of some parts of the OWL-S and embed them into syntactic descriptions (SOAP artifacts and OWL language). It is necessary to have the ability to manage the use of both technologies. This paper presents an implementation of the OWL-S ontology that describes a Brazilian Aeronautical Web Service Registry, which makes it possible to publish, advertise, perform multi-criteria semantic discovery aligned with the ideas of the System Wide Information Management (SWIM) Program, and invoke web services within the Air Traffic Management context. The proposal’s best finding is a generic approach to describing semantic web services. The paper also presents a set of functional requirements to guide the ontology development and compares them with the results to validate the implementation of the OWL-S ontology.

Keywords: aeronautical web services, OWL-S, semantic web services discovery, ontologies

Procedia PDF Downloads 58
1423 Application of Ontologies to Contract for Difference Documents

Authors: Renato Figueira Franco

Abstract:

This paper aims to create a representational information system applied to the securities market, particularly the development of an ontology applied to the analysis of the Key Information Documents of Contracts for Difference. The process of obtaining knowledge and its proper formal representation has attracted attention both from the scientific literature and from the capital markets supervisory authorities. The formal knowledge representation is embodied in the construction of ontologies, which are responsible for defining the knowledge base structure of a given scientific domain, facilitating its understanding, and allowing its sharing among the scientific community. The scope of this study is restricted to the analysis of capital markets ontologies in order to capture their structure, semantics, and knowledge sharing between people and systems.

Keywords: ontology, financial markets, CFD, PRIIPs, key information documents

Procedia PDF Downloads 36
1422 Enhancement of Indexing Model for Heterogeneous Multimedia Documents: User Profile Based Approach

Authors: Aicha Aggoune, Abdelkrim Bouramoul, Mohamed Khiereddine Kholladi

Abstract:

Recent research shows that the user profile, as an important element, can improve heterogeneous information retrieval through its content. In this context, we present our indexing model for heterogeneous multimedia documents. This model is based on incorporating the user profile into the indexing process. The general idea of our proposal is to exploit the common concepts between the representation of a document and the definition of a user through his profile. These two elements will be added as additional indexing entities to enrich the indexes of the heterogeneous document corpus. We have developed the IRONTO domain ontology, allowing the annotation of documents. We also present the developed tool validating the proposed model.

Keywords: indexing model, user profile, multimedia document, heterogeneous sources, ontology

Procedia PDF Downloads 321
1421 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)

Authors: Gule Teri

Abstract:

The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variations within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach distinctly measures variability with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. This method provides a systematic, statistically grounded validation technique that improves the truthfulness of results. It offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this innovative method, pharmaceutical manufacturers can substantially advance their validation processes, subsequently improving the overall quality and safety of their products. This paper delves deeper into the development, application, and advantages of this tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. This paper also discusses the potential implications and future applications of this method in enhancing pharmaceutical manufacturing practices and outcomes.

Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing

Procedia PDF Downloads 40
1420 Critical Realism as a Bridge between Critical Pedagogy and Queer Theory

Authors: Mike Seal

Abstract:

This paper explores the traditions of critical and queer pedagogy, their intersections, tensions, and paradoxes. Critical pedagogy, with a materialist realist ontology, and queer theory, which is often post-modern, post-structural, and anti-essentialist, may not seem compatible. Similarly, there are tensions between activist orientations, often enacted through essential sexual identities, and a queer approach that questions such identities and subjectivities. It will argue that critical realism gives us a bridge between critical and queer pedagogy in preserving a realist materialist ontology, where economic forces are real, and independent of consciousness and hermeneutic constructions of them. At the same time, it offers an epistemology that does not necessitate a binary view of the roles of the oppressed, liberator, or even oppressor. It accepts that our knowledge is contingent, partial and contestable, but has the potential, and enough validity, to demand action and potentially inform the actions of others.

Keywords: critical pedagogy, queer pedagogy, critical realism, heteronormativity

Procedia PDF Downloads 153
1419 The Detection of Implanted Radioactive Seeds on Ultrasound Images Using Convolutional Neural Networks

Authors: Edward Holupka, John Rossman, Tye Morancy, Joseph Aronovitz, Irving Kaplan

Abstract:

A common modality for the treatment of early-stage prostate cancer is the implantation of radioactive seeds directly into the prostate. The radioactive seeds are positioned inside the prostate to achieve optimal radiation dose coverage, and are placed using transrectal ultrasound imaging. Once all of the planned seeds have been implanted, two-dimensional transaxial transrectal ultrasound images separated by 2 mm are obtained throughout the prostate, beginning at the base of the prostate up to and including the apex. A common deep neural network, called DetectNet, was trained to automatically determine the position of the implanted radioactive seeds within the prostate under ultrasound imaging. The network was trained using 950 training ultrasound images and 90 validation ultrasound images. The commonly used metrics for successful training were used to evaluate the efficacy and accuracy of the trained deep neural network and resulted in a loss_bbox (train) = 0.00, loss_coverage (train) = 1.89e-8, loss_bbox (validation) = 11.84, loss_coverage (validation) = 9.70, mAP (validation) = 66.87%, precision (validation) = 81.07%, and recall (validation) = 82.29%, where train refers to the training image set and validation refers to the validation image set. On the hardware platform used, the training expended 12.8 seconds per epoch, and the network was trained for over 10,000 epochs. In addition, the seed locations as determined by the deep neural network were compared to the seed locations as determined by a commercial software package based on a CT obtained one to three months after the implant. The deep learning approach was within 2.29 mm of the seed locations determined by the commercial software. The deep learning approach to the determination of radioactive seed locations is robust, accurate, and fast, and is well within spatial agreement with the gold standard of CT-determined seed coordinates.
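
The precision and recall figures reported above can be computed by matching detected seed positions to reference positions within a distance tolerance; the following sketch shows one such matching scheme, with hypothetical coordinates and a 3 mm tolerance that are not taken from the study.

```python
# Minimal sketch of computing detection precision/recall by matching predicted
# seed coordinates to reference coordinates within a distance tolerance.
# Coordinates (mm) and the 3 mm tolerance are hypothetical.
import numpy as np

reference = np.array([[10.0, 12.0, 5.0], [14.0, 20.0, 7.0], [30.0, 31.0, 9.0]])
predicted = np.array([[10.5, 12.2, 5.1], [13.6, 19.5, 7.3], [50.0, 50.0, 9.0]])
tolerance = 3.0  # mm

matched_ref = set()
true_pos = 0
for p in predicted:
    dists = np.linalg.norm(reference - p, axis=1)
    j = int(np.argmin(dists))
    if dists[j] <= tolerance and j not in matched_ref:
        matched_ref.add(j)   # each reference seed may be matched only once
        true_pos += 1

precision = true_pos / len(predicted)   # fraction of detections that are real seeds
recall = true_pos / len(reference)      # fraction of real seeds that were detected
print(f"precision={precision:.2f}, recall={recall:.2f}")
```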

Keywords: prostate, deep neural network, seed implant, ultrasound

Procedia PDF Downloads 166
1418 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator

Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty

Abstract:

Verification and validation of simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on verification and validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady-state and transient conditions. The process of verification and validation helps in qualifying the process simulator for the intended purpose, whether providing comprehensive training or design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A full-scope replica operator training simulator for the Prototype Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, wherein the main participants are engineers/experts belonging to the modeling team, the process design team, and the instrumentation and control design team. This paper discusses the verification and validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on experts’ comments, final qualification of the simulator for the intended purpose, and the difficulties faced while coordinating various activities.

Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), steady state, transient state

Procedia PDF Downloads 228
1417 A Proposal for Systematic Mapping Study of Software Security Testing, Verification and Validation

Authors: Adriano Bessa Albuquerque, Francisco Jose Barreto Nunes

Abstract:

Software vulnerabilities are increasing and not only impact the availability of services and processes as well as information confidentiality, integrity, and privacy, but also cause changes that interfere in the development process. Security testing could be a solution to reduce vulnerabilities. However, the variety of test techniques, together with the lack of real case studies of applying tests throughout the software development life cycle, compromises its effective use. This paper offers an overview of how a Systematic Mapping Study (MS) about security verification, validation and test (VVT) was performed, besides presenting general results of this study.

Keywords: software test, software security verification validation and test, security test institutionalization, systematic mapping study

Procedia PDF Downloads 355
1416 Modeling Sediment Yield Using the SWAT Model: A Case Study of Upper Ankara River Basin, Turkey

Authors: Umit Duru

Abstract:

The Soil and Water Assessment Tool (SWAT) was tested for prediction of water balance and sediment yield in the gauged Ankara basin, Turkey. The overall objective of this study was to evaluate the performance and applicability of SWAT in this region of Turkey. Thirteen years of monthly streamflow and suspended sediment data were used for calibration and validation. This research assessed model performance based on differences between observed and predicted suspended sediment yield during the calibration (1987-1996) and validation (1982-1984) periods. Statistical comparisons of suspended sediment produced values for NSE (Nash-Sutcliffe efficiency), RE (relative error), and R² (coefficient of determination) of 0.81, -1.55, and 0.93, respectively, during the calibration period, and NSE, RE (%), and R² of 0.77, -2.61, and 0.87, respectively, during the validation period. Based on the analyses, SWAT satisfactorily simulated observed hydrology and sediment yields and can be used as a tool in decision making for water resources planning and management in the basin.
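
The performance statistics quoted above have standard definitions; the following sketch implements one common form of each (NSE, relative error of the total load in percent, and R² as the squared Pearson correlation) on made-up observed and simulated values rather than the study’s data.

```python
# Standard definitions of the goodness-of-fit statistics quoted above
# (illustrative implementation; the obs/sim values are made up).
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_error(obs, sim):
    """Relative error (%) of total simulated vs. observed load (one common form)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

def r_squared(obs, sim):
    """Coefficient of determination as the squared Pearson correlation."""
    return float(np.corrcoef(obs, sim)[0, 1] ** 2)

obs = [120.0, 80.0, 45.0, 200.0, 150.0]   # observed monthly sediment yield (toy)
sim = [110.0, 90.0, 50.0, 190.0, 140.0]   # simulated monthly sediment yield (toy)
print(nse(obs, sim), relative_error(obs, sim), r_squared(obs, sim))
```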

Keywords: calibration, GIS, sediment yield, SWAT, validation

Procedia PDF Downloads 246
1415 Semi-Automatic Method to Assist Expert for Association Rules Validation

Authors: Amdouni Hamida, Gammoudi Mohamed Mohsen

Abstract:

In order to help the expert validate association rules extracted from data, some quality measures have been proposed in the literature. We distinguish two categories: objective and subjective measures. The first one depends on a fixed threshold and on the quality of the data from which the rules are extracted. The second one consists of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high; thus, the task of manually mining rules is very hard. To solve this problem, we propose, in this paper, a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers), (ii) we use the generated classifiers for data classification, and (iii) we visualize association rules with their classification quality to give an idea to the expert and to assist him during the validation process.
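
As an illustration of steps (i) and (ii), the following sketch treats association rules whose consequent is a class label as classification rules and applies the highest-confidence applicable rule to each record; the rules, records, and confidence values are hypothetical.

```python
# Minimal sketch of steps (i)-(ii): association rules whose consequent is a
# class label are used as classification rules and applied to records.
# Rules, records and confidence values are hypothetical.
rules = [
    # (antecedent itemset, predicted class, confidence)
    ({"bread", "butter"}, "buys_milk", 0.90),
    ({"beer"},            "buys_chips", 0.75),
    ({"bread"},           "buys_milk", 0.60),
]

def classify(record, rules):
    """Apply the highest-confidence rule whose antecedent is contained in the record."""
    applicable = [r for r in rules if r[0] <= record]   # subset test
    if not applicable:
        return None, 0.0
    best = max(applicable, key=lambda r: r[2])
    return best[1], best[2]

records = [{"bread", "butter", "jam"}, {"beer", "peanuts"}, {"eggs"}]
for rec in records:
    label, conf = classify(rec, rules)
    print(rec, "->", label, conf)   # quality shown to the expert for validation
```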

Keywords: association rules, rule-based classification, classification quality, validation

Procedia PDF Downloads 402
1414 Knowledge Elicitation Approach for Formal Ontology Design: An Exploratory Study Applied in Industry for Knowledge Management

Authors: Ouassila Labbani-Narsis, Christophe Nicolle

Abstract:

Building formal ontologies remains a complex process for companies. In the literature, this process is based on the technical knowledge and expertise of domain experts, without further details on the methodologies used. Several problems currently remain unsolved: disagreements between experts, the expression of tacit knowledge related to high-level know-how that is rarely verbalized, the qualification of results by using cases, or simply the adhesion of the group of experts. This paper proposes a methodological approach based on knowledge elicitation for the conception of formal, consensual, and shared ontologies. The proposed approach is experimentally tested on industrial collaboration projects in the field of manufacturing (associating knowledge sources from multinational companies) and in the field of viticulture (associating explicit knowledge and implicit knowledge acquired through observation).

Keywords: collaborative ontology engineering, knowledge elicitation, knowledge engineering, knowledge management

Procedia PDF Downloads 50
1413 Validation of the Formal Model of Web Services Applications for Digital Reference Service of Library Information System

Authors: Zainab Magaji Musa, Nordin M. A. Rahman, Julaily Aida Jusoh

Abstract:

The web services applications for digital reference service (WSDRS) model of a library information system (LIS) is an informal model that aims to reduce the problems of digital reference services in libraries. It uses web services technology to provide an efficient way of satisfying users’ needs in the reference section of libraries. The formal WSDRS model consists of Z specifications of all the informal specifications of the model. This paper discusses the formal validation of the Z specifications of the WSDRS model. The authors formally verify and thus validate the properties of the model using the Z/EVES theorem prover.

Keywords: validation, verification, formal, theorem prover

Procedia PDF Downloads 476
1412 Estimation of Uncertainty of Thermal Conductivity Measurement with Single Laboratory Validation Approach

Authors: Saowaluck Ukrisdawithid

Abstract:

The thermal conductivity of thermal insulation materials is measured by a Heat Flow Meter (HFM) apparatus. The components of uncertainty are complex and difficult to model for routine measurements. In this study, the uncertainty of thermal conductivity measurement was estimated by a single laboratory validation approach. The within-laboratory reproducibility was 1.1%. The standard uncertainty of method and laboratory bias, estimated using the SRM 1453 expanded polystyrene board, was dominant at 1.4%. However, it was assessed that there was no significant bias. For sample measurement, the sources of uncertainty were repeatability, the density of the sample, and the thermal conductivity resolution of the HFM. From this approach to sample measurements, the combined uncertainty was calculated. In summary, the thermal conductivity of the sample, polystyrene foam, was reported as 0.03367 W/m·K ± 3.5% (k = 2) at a mean temperature of 23.5 °C. The single laboratory validation approach is a simple key for routine testing laboratories to estimate the uncertainty of thermal conductivity measurement by HFM, according to ISO/IEC 17025:2017 requirements. These results are meaningful for laboratory competence improvement, quality control of products, and conformity assessment.
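
The combined uncertainty is typically obtained by combining the standard uncertainty components in quadrature and multiplying by a coverage factor k = 2 to obtain the expanded uncertainty; the following sketch shows this arithmetic with hypothetical component values rather than the laboratory’s actual uncertainty budget.

```python
# Minimal sketch of combining uncertainty components by root-sum-of-squares and
# expanding with a coverage factor k = 2. The component values below are
# hypothetical, not the laboratory's actual uncertainty budget.
import math

components_percent = {
    "method_and_laboratory_bias": 1.2,   # e.g. from a certified reference material
    "within_lab_reproducibility": 0.9,
    "repeatability": 0.5,
    "sample_density": 0.3,
    "hfm_resolution": 0.2,
}

u_combined = math.sqrt(sum(u ** 2 for u in components_percent.values()))
U_expanded = 2.0 * u_combined            # coverage factor k = 2 (~95 % confidence)

print(f"combined standard uncertainty: {u_combined:.2f} %")
print(f"expanded uncertainty (k = 2): {U_expanded:.2f} %")
```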

Keywords: single laboratory validation approach, within-laboratory reproducibility, method and laboratory bias, certified reference material

Procedia PDF Downloads 105
1411 Not Three Gods but One: Why Reductionism Does Not Serve Our Theological Discourse

Authors: Finley Lawson

Abstract:

The triune nature of God is one of the most complex doctrines of Christianity, and its complexity is further compounded when one considers the incarnation. However, many of the difficulties and paradoxes associated with our idea of the divine arise from our adherence to reductionist ontology. In order to move our theological discourse forward, in respect of divine and human nature, a holistic interpretation of our profession of faith is necessary. The challenge of a holistic interpretation is that it questions our ability to make any statement about the genuine, ontological individuation of persons (both divine and human), and in doing so raises the issue of whether we are, ontologically, bound to descend into a form of pan(en)theism. In order to address the ‘inevitable’ slide into pan(en)theism, the impact of two forms of holistic interpretation, Boolean and Non-Boolean, on our concept of personhood will be examined. Whilst a Boolean interpretation allows for a greater understanding of the relational nature of the Trinity, it is the Non-Boolean interpretation which has greater ontological significance. A Non-Boolean ontology, grounded in our scientific understanding of the nature of the world, shows that our quest for individuation rests not in ontological fact but in epistemic need, and that it is our limited epistemology that drives our need to divide that which is ontologically indivisible. This discussion takes place within a ‘methodological’, rather than ‘doctrinal’ approach to science and religion - examining assumptions and methods that have shaped our language and beliefs about key doctrines, rather than seeking to reconcile particular Christian doctrines with particular scientific theories. Concluding that Non-Boolean holism is the more significant for our doctrine is, in itself, not enough. A world without division appears far removed from the distinct places of man and the divine as espoused in our creedal affirmation; to this end, several possible interpretations for understanding Non-Boolean human-divine relations are tentatively put forward for consideration.

Keywords: holism, individuation, ontology, Trinitarian relations

Procedia PDF Downloads 224
1410 Efficient Tuning Parameter Selection by Cross-Validated Score in High Dimensional Models

Authors: Yoonsuh Jung

Abstract:

As DNA microarray data contain a relatively small sample size compared to the number of genes, high dimensional models are often employed. In high dimensional models, the selection of the tuning parameter (or penalty parameter) is often one of the crucial parts of the modeling. Cross-validation is one of the most common methods for tuning parameter selection, which selects a parameter value with the smallest cross-validated score. However, selecting a single value as an “optimal” value for the parameter can be very unstable due to sampling variation, since the sample sizes of microarray data are often small. Our approach is to choose multiple candidates for the tuning parameter first, then average the candidates with different weights depending on their performance. The additional step of estimating the weights and averaging the candidates rarely increases the computational cost, while it can considerably improve traditional cross-validation. We show that the values selected by the suggested methods often lead to stable parameter selection as well as improved detection of significant genetic variables compared to traditional cross-validation, via real data and simulated data sets.
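
The following sketch illustrates the general idea of averaging several tuning-parameter candidates with weights derived from their cross-validated errors instead of keeping only the single best value. The lasso model, synthetic data, and inverse-error weighting are illustrative choices and not necessarily the authors’ weighting scheme.

```python
# Minimal sketch of averaging tuning-parameter candidates weighted by their
# cross-validated errors, instead of picking the single "best" value.
# The model, data and weighting scheme are illustrative.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=40, n_features=200, n_informative=5,
                       noise=5.0, random_state=0)

lambdas = np.logspace(-2, 1, 20)
cv_mse = np.array([
    -cross_val_score(Lasso(alpha=lam, max_iter=10000), X, y,
                     scoring="neg_mean_squared_error", cv=5).mean()
    for lam in lambdas
])

# Keep the few best candidates and average them, weighting by inverse CV error.
top = np.argsort(cv_mse)[:5]
weights = 1.0 / cv_mse[top]
weights /= weights.sum()
lambda_single = lambdas[np.argmin(cv_mse)]       # classical single-value choice
lambda_averaged = float(np.dot(weights, lambdas[top]))

print(f"single CV choice: {lambda_single:.3f}, weighted average: {lambda_averaged:.3f}")
```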

Keywords: cross validation, parameter averaging, parameter selection, regularization parameter search

Procedia PDF Downloads 386
1409 Validation of Existing Index Properties-Based Correlations for Estimating the Soil–Water Characteristic Curve of Fine-Grained Soils

Authors: Karim Kootahi, Seyed Abolhasan Naeini

Abstract:

The soil-water characteristic curve (SWCC), which represents the relationship between suction and water content (or degree of saturation), is an important property of unsaturated soils. The conventional method for determining the SWCC is through specialized testing procedures. Since these procedures require specialized unsaturated soil testing apparatus and lengthy testing programs, several index properties-based correlations have been developed for estimating the SWCC of fine-grained soils. There are, however, considerable inconsistencies among the published correlations, and there is no validation study on the predictive ability of the existing correlations. In the present study, all existing index properties-based correlations are evaluated using a high-quality worldwide database. The performances of the existing correlations are assessed both graphically and quantitatively using statistical measures. The results of the validation indicate that most of the existing correlations provide unacceptable estimates of the degree of saturation, but the most recent model appears promising.

Keywords: SWCC, correlations, index properties, validation

Procedia PDF Downloads 141
1408 Design, Fabrication, and Experimental Validation of a Warm Bulge Test System

Authors: Emine Feyza Şükür, Mevlüt Türköz, Murat Dilmeç, Hüseyin Selçuk Halkacı

Abstract:

In this study, a warm bulge test system, including all necessary sub-systems, was designed, built, and experimentally validated to perform warm bulge tests. In addition, the performance of each sub-system was validated through repeated production and/or test runs as well as through part quality measurements. Validation and performance tests were performed to characterize the repeatability of the system. As a result of these tests, the desired temperature distribution on the sheet metal was obtained by the heating systems, and good repeatability of the bulge tests was achieved. Consequently, this study is expected to provide other researchers and manufacturers with a set of design and process guidelines to develop similar systems.

Keywords: design, test unit, warm bulge test unit, validation test

Procedia PDF Downloads 461
1407 Semantic Platform for Adaptive and Collaborative e-Learning

Authors: Massra M. Sabeima, Myriam Lamolle, Mohamedade Farouk Nanne

Abstract:

Adapting the learning resources of an e-learning system to the characteristics of the learners is an important aspect to consider when designing an adaptive e-learning system. However, this adaptation is not a simple process; it requires the extraction, analysis, and modeling of user information. This implies a good representation of the user's profile, which is the backbone of the adaptation process. Moreover, during the e-learning process, collaboration with similar users (same geographic province or knowledge context) is important. Productive collaboration motivates users to continue rather than abandon the course and increases the assimilation of learning objects. The contributions of this work are the following: we propose an adaptive e-learning semantic platform that recommends learning resources to learners, using an ontology to model the user profile and the course content, together with an implementation of a multi-agent system able to progressively generate the learning graph for each user during the learning process (taking into account the user's progress and the changes that occur) and to synchronize the users who collaborate on a learning object.

Keywords: adaptive learning, collaboration, multi-agent, ontology

Procedia PDF Downloads 145
1406 Decision Tree Analysis of Risk Factors for Intravenous Infiltration among Hospitalized Children: A Retrospective Study

Authors: Soon-Mi Park, Ihn Sook Jeong

Abstract:

This retrospective study aimed to identify risk factors for intravenous (IV) infiltration in hospitalized children. The participants were 1,174 children in the test sample and 424 children in the validation sample who were admitted to a general hospital, received peripheral intravenous injection therapy at least once, and had complete records. Data were analyzed with frequencies and percentages or means and standard deviations, and decision tree analysis was used to screen for the most important risk factors for IV infiltration in hospitalized children. The decision tree analysis showed that the most important traditional risk factors for IV infiltration were the use of ampicillin/sulbactam, the IV insertion site (lower extremities), and the medical department (internal medicine), both in the test sample and in the validation sample. The correct classification rate was 92.2% in the test sample and 90.1% in the validation sample. More careful attention should be paid to patients who are administered ampicillin/sulbactam, have an IV site in the lower extremities, and have internal medical problems, in order to prevent or detect infiltration.
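
As an illustration of the decision-tree step, the following sketch fits a classification tree to synthetic records encoded with the three risk factors named above; the data, encoding, and tree settings are made up and are not the study’s data or parameters.

```python
# Minimal sketch of the decision-tree step on synthetic records with the risk
# factors named above; data, encoding and tree settings are illustrative.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 400
# Binary risk factors: ampicillin/sulbactam use, lower-extremity IV site,
# internal-medicine department.
X = rng.integers(0, 2, size=(n, 3))
# Synthetic outcome: infiltration is more likely when the factors are present.
p = 0.05 + 0.2 * X[:, 0] + 0.15 * X[:, 1] + 0.1 * X[:, 2]
y = rng.random(n) < p

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["ampicillin_sulbactam",
                                       "iv_site_lower_extremity",
                                       "internal_medicine"]))
```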

Keywords: decision tree analysis, intravenous infiltration, child, validation

Procedia PDF Downloads 144
1405 Measurement Errors and Misclassifications in Covariates in Logistic Regression: Bayesian Adjustment of Main and Interaction Effects and the Sample Size Implications

Authors: Shahadut Hossain

Abstract:

Measurement errors in continuous covariates and/or misclassifications in categorical covariates are common in epidemiological studies. Regression analysis ignoring such mismeasurements seriously biases the estimated main and interaction effects of covariates on the outcome of interest. Thus, adjustments for such mismeasurements are necessary. In this research, we propose a Bayesian parametric framework for eliminating the deleterious impacts of covariate mismeasurements in logistic regression. The proposed adjustment method is unified and thus can be applied to any generalized linear or non-linear regression model. Furthermore, adjustment for covariate mismeasurements requires validation data, usually in the form of either gold standard measurements or replicates of the mismeasured covariates on a subset of the study population. An initial investigation shows that the adequacy of such adjustment depends on the sizes of the main and validation samples, especially when the prevalences of the categorical covariates are low. Thus, we investigate the impact of main and validation sample sizes on the adjusted estimates and provide a general guideline about these sample sizes based on simulation studies.
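
The following small simulation is not the proposed Bayesian adjustment; it only illustrates the attenuation bias that ignoring covariate measurement error induces in a logistic-regression coefficient, which is the problem the proposed framework corrects.

```python
# Small simulation (not the paper's Bayesian adjustment) showing how ignoring
# measurement error in a covariate attenuates its estimated logistic-regression
# coefficient.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20000
x_true = rng.normal(0.0, 1.0, n)            # true continuous covariate
x_obs = x_true + rng.normal(0.0, 1.0, n)    # observed with measurement error
beta = 1.0                                   # true log-odds effect
p = 1.0 / (1.0 + np.exp(-beta * x_true))
y = rng.random(n) < p

fit_true = LogisticRegression(C=1e6).fit(x_true.reshape(-1, 1), y)  # ~unpenalized
fit_obs = LogisticRegression(C=1e6).fit(x_obs.reshape(-1, 1), y)

print("coefficient using true covariate:    ", round(float(fit_true.coef_[0, 0]), 2))
print("coefficient using mismeasured value: ", round(float(fit_obs.coef_[0, 0]), 2))
```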

Keywords: measurement errors, misclassification, mismeasurement, validation sample, Bayesian adjustment

Procedia PDF Downloads 384
1404 Translation, Cultural Adaptation and Validation of the Hungarian Version of the Self-Determination Scale

Authors: E. E. Marschalko, K. Kalcza-Janosi, I. Kotta, B. Bibok

Abstract:

Cultural moderation aspects have been highlighted in the literature on self-determination behavior in some cultures, including the Hungarian population. There is a lack of validated instruments in Hungarian for the assessment of self-determination related behaviors. In order to fill this gap, the aim of this study was the translation, cultural adaptation, and validation of the Self-Determination Scale (Sheldon, 1995) for the Hungarian population. A total of 4335 adults participated in the study. The mean age of the participants was 27.97 (SD = 9.60). The sample consisted mostly of females; less than 20% were males. Exploratory and confirmatory factor analyses were performed for adequacy checking. Cronbach’s alpha was used to examine the reliability of the factors. Our results revealed that the Hungarian version of the SDS has good psychometric properties and is a reliable tool for psychologists who would like to study or assess self-determination in their clients. The final, adapted, and validated SDS items are presented in this paper.
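
As an illustration of the reliability statistic used, the following sketch computes Cronbach’s alpha on a hypothetical item-response matrix; it is not the study’s data.

```python
# Minimal sketch of Cronbach's alpha on a hypothetical item-response matrix
# (rows = respondents, columns = scale items); not the study's data.
import numpy as np

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
])
print(round(cronbach_alpha(responses), 3))
```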

Keywords: self-determination scale, Hungarian, adaptation, validation, reliability

Procedia PDF Downloads 221