Search results for: data consistency
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24493

24463 Simulation of a Cost Model Response Requests for Replication in Data Grid Environment

Authors: Kaddi Mohammed, A. Benatiallah, D. Benatiallah

Abstract:

The data grid is a technology whose emergence has brought new challenges, such as the heterogeneity and geographic distribution of resources, availability, fast data access, latency minimization, and fault tolerance. Researchers interested in this technology address problems common to such systems, such as task scheduling, load balancing, and replication. Replication is an effective way to achieve good performance in data access, better use of grid resources, and higher data availability at lower cost. In a system with replication, a coherence protocol is used to impose some degree of synchronization between the various copies and some order on updates. In this project, we present an approach for placing replicas that minimizes the cost of responding to read and write requests, and we implement our model in a simulation environment. The placement technique is based on a cost model that depends on several factors, such as bandwidth, data size, and storage nodes.
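
To make the cost-model idea concrete, the sketch below computes a simple per-node response-cost estimate from bandwidth, data size, and the read/write mix, and picks the cheapest storage node; the formula, the write penalty, and the node names are illustrative assumptions, not the authors' model.

```python
# A minimal sketch of a transfer-cost estimate of the kind the abstract describes
# (bandwidth, data size, storage nodes); coefficients and formula are assumptions.
def response_cost(data_size_mb, bandwidth_mbps, read_fraction, write_penalty=2.0):
    """Estimated cost of serving requests from a candidate replica node."""
    transfer_time = data_size_mb * 8.0 / bandwidth_mbps   # seconds to move the data
    # Writes must also propagate updates to keep the copies coherent,
    # so they are weighted more heavily than reads.
    return read_fraction * transfer_time + (1.0 - read_fraction) * write_penalty * transfer_time

# Pick the storage node that minimizes the estimated response cost.
candidates = {"node_a": 100.0, "node_b": 620.0, "node_c": 300.0}  # hypothetical bandwidths (Mbps)
best = min(candidates, key=lambda n: response_cost(512.0, candidates[n], read_fraction=0.7))
print(best)  # node_b, the highest-bandwidth node in this toy example
```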

Keywords: response time, query, consistency, bandwidth, storage capacity, CERN

Procedia PDF Downloads 247
24462 The Influence of Partial Replacement of Hydrated Lime by Pozzolans on Properties of Lime Mortars

Authors: Przemyslaw Brzyski, Stanislaw Fic

Abstract:

Hydrated lime, because of its life cycle (it returns to its natural form as a result of setting and hardening), has a positive environmental impact. The lime binder is used in mortars. Lime is a slow-setting binder with low mechanical properties. The aim of the study was to evaluate the possibility of improving the properties of the lime binder by using different pozzolanic materials as a partial replacement of the hydrated lime binder. Pozzolanic materials are natural or industrial wastes, so they do not worsen the environmental impact of the lime binder. The following laboratory tests were performed: analysis of the physical characteristics of the tested lime mortar samples (bulk density, porosity), flexural and compressive strength, water absorption and capillary rise of the samples, and consistency of the fresh mortars. Metakaolin, silica fume, and zeolite were used as partial replacements of hydrated lime (in amounts of 10%, 20%, and 30% by weight of lime). The shortest setting and hardening time was shown by the mortars with the addition of metakaolin. All additives noticeably improved the strength characteristics of the lime mortars, and strength increased with the amount of additive. The highest flexural strength was obtained with the addition of metakaolin in an amount of 20% by weight of lime (2.08 MPa). The highest compressive strength was also obtained with metakaolin, but in an amount of 30% by weight of lime (9.43 MPa). The addition of pozzolan increased the tightness of the mortar, which limited its water absorption. Due to their different surface areas, the pozzolanic additives affected the consistency of the fresh mortars. The initial consistency was assumed as plastic; only the addition of silica fume in amounts of 20% and 30% by weight of lime changed the consistency to thick-plastic. The study demonstrated the possibility of producing lime mortars with satisfactory properties. The features of these lime mortars do not differ significantly from the properties of cement-based mortars, and they show a lower environmental impact due to CO₂ absorption during lime hardening. Taking into consideration the setting time, strength, and consistency, the best results can be obtained with the addition of metakaolin to the lime mortar.

Keywords: lime, binder, mortar, pozzolan, properties

Procedia PDF Downloads 168
24461 Generalized Mean-Field Theory of Phase Unwrapping via Multiple Interferograms

Authors: Yohei Saika

Abstract:

On the basis of Bayesian inference using the maximizer of the posterior marginal estimate, we carry out phase unwrapping using multiple interferograms via generalized mean-field theory. Numerical calculations for a typical wave-front in remote sensing using synthetic aperture radar interferometry, together with the phase diagram in hyper-parameter space, clarify that the present method succeeds in phase unwrapping perfectly under the surface-consistency constraint if the interferograms are not corrupted by any noise. We also find that the prior is useful for extending the range of phases for which phase unwrapping succeeds under the surface-consistency constraint. These results are quantitatively confirmed by Monte Carlo simulation.

Keywords: Bayesian inference, generalized mean-field theory, phase unwrapping, multiple interferograms, statistical mechanics

Procedia PDF Downloads 453
24460 Membership Surface and Arithmetic Operations of Imprecise Matrix

Authors: Dhruba Das

Abstract:

In this paper, a method has been developed to construct the membership surfaces of row and column vectors and to define arithmetic operations on imprecise matrices. A matrix with imprecise elements is called an imprecise matrix. The membership surface of an imprecise vector has already been presented based on the Randomness-Impreciseness Consistency Principle. The Randomness-Impreciseness Consistency Principle leads to defining a normal law of impreciseness using two different laws of randomness. In this paper, the author presents the row and column membership surfaces and arithmetic operations of the imprecise matrix and demonstrates them with the help of a numerical example.

Keywords: imprecise number, imprecise vector, membership surface, imprecise matrix

Procedia PDF Downloads 362
24459 Challenges and Opportunities: One Stop Processing for the Automation of Indonesian Large-Scale Topographic Base Map Using Airborne LiDAR Data

Authors: Elyta Widyaningrum

Abstract:

LiDAR data acquisition has been recognized as one of the fastest solutions for providing base data for topographic base mapping in Indonesia. The challenge of accelerating the provision of large-scale topographic base maps as a basis for development planning gives an opportunity to implement an automated scheme in the map production process. One-stop processing will also help accelerate map provision, particularly in conforming with the Indonesian fundamental spatial data catalog derived from ISO 19110 and with geospatial database integration. Thus, automated LiDAR classification, DTM generation, and feature extraction are conducted in one GIS software environment to form all layers of the topographic base maps. The quality of the automated topographic base map is assessed and analyzed based on its completeness, correctness, contiguity, consistency, and possible customization.

Keywords: automation, GIS environment, LiDAR processing, map quality

Procedia PDF Downloads 339
24458 The Consistency of Gerhard Kittel’s “Christian” Antisemitism in His "Die Judenfrage" and "Meine Verteidigung"

Authors: Catherine Harrison

Abstract:

Faced with arrest, imprisonment and the denazification process in 1945, Tübingen University’s Professor of Theology, Gerhard Kittel, refused to abandon the “Christian” antisemitism which he had first expounded in his Die Judenfrage [The Jewish Question] (1933 and 1934). At the heart of this paper is a critical engagement with Die Judenfrage, the first in English. Putting Die Judenfrage into dialogue with Kittel’s Meine Verteidigung [My Defence] (1945-6) exposes the remarkable consistency of Kittel’s idiosyncratic but closely argued Christian theology of antisemitism. Girdling his career as a foremost theologian, antisemite and enthusiastic supporter of Hitler and the NSDAP, the consistency between Die Judenfrage and Meine Verteidigung attests to Kittel’s consistent and authentic intellectual position. In both texts, he claims to be advancing Christian, as opposed to “vulgar” or racial, antisemitism. Yet, in the thirteen years which divide them, Kittel had mediated contact with Nazi illuminati Rudolph Hess, Alfred Rosenberg, Winnifred Wagner, Josef Goebbels and Baldur von Schirach, through his publications in various antisemitic journals. The paper argues: Die Judenfrage, as both a text and as a theme, is axiomatic to Kittel’s defence statement; and that Die Judenfrage constitutes the template of Kittel’s arcane, personal “Christian” antisemitism of which Meine Verteidigung is a faithful impression. Both are constructed on the same theologically chimeric and abstruse hypotheses regarding Volk, Spätjudentum [late Judaism] and Heilgeschichte [salvation history]. Problematising these and other definitional vagaries that make up Kittel’s “Christian” antisemitism highlights the remarkable theoretical consistency between Die Judenfrage and Meine Verteidigung. It is concluded that a deadly synergy of Nazi racial antisemitism and New Testament antisemitism shaped Kittel’s judgement to the degree that, despite the slipstream of concentration camp footage which was shaking the foundations of post-war German academia, Meine Verteidigung is a simple restatement of the antisemitism conveyed in Die Judenfrage.

Keywords: Gerhard Kittel, Third Reich theology, the Jewish Question, Nazi antisemitism

Procedia PDF Downloads 132
24457 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today’s modern age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices, which make use of different protocols, including TCP, UDP, and HTTP/S, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users, but it is mostly in an unreadable format and needs to be processed to provide information and business intelligence. This data is not always current; it is mostly historical data. The data is also not subject to the consistency and redundancy measures that most other data usually is. Most important to the users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers make use of various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One of the techniques generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since the processing of the data usually takes some time, it keeps the database busy and locked for the period that the processing takes place. Because of this, it decreases the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on CPU performance, storage, and processing time.
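
As a minimal sketch of the three-step pull-process-push idea, the snippet below pulls rows into an in-memory list, processes them while the database sits idle, and pushes the results back in one short transaction; it assumes Python's standard sqlite3 module and a hypothetical raw_messages table, not the authors' implementation.

```python
# Pull-process-push in three separate steps: the database is only touched
# briefly at the start and the end, not while decoding.
import sqlite3

def pull(conn):
    # Step 1: pull the unprocessed rows into an in-memory list and release the cursor.
    return conn.execute(
        "SELECT id, payload FROM raw_messages WHERE processed = 0"
    ).fetchall()

def process(rows):
    # Step 2: decode in memory; the database stays free for other clients.
    return [(payload.strip().upper(), row_id) for row_id, payload in rows]

def push(conn, decoded):
    # Step 3: write all results back in one short transaction.
    with conn:
        conn.executemany(
            "UPDATE raw_messages SET payload = ?, processed = 1 WHERE id = ?", decoded
        )

conn = sqlite3.connect("telemetry.db")   # hypothetical database file
push(conn, process(pull(conn)))
```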

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 217
24456 Imprecise Vector: The Case of Subnormality

Authors: Dhruba Das

Abstract:

In this article, the author puts forward the actual mathematical explanation of the subnormal imprecise vector. Every subnormal imprecise vector has to be defined with reference to a membership surface. The membership surface of the normal imprecise vector has already been defined based on the Randomness-Impreciseness Consistency Principle. The Randomness-Impreciseness Consistency Principle leads to defining a normal law of impreciseness using two different laws of randomness. A normal imprecise vector is a special case of a subnormal imprecise vector. Nothing, however, is available in the literature about the membership surface when a subnormal imprecise vector is defined. The author shows here how to construct the membership surface of a subnormal imprecise vector.

Keywords: imprecise vector, membership surface, subnormal imprecise number, subnormal imprecise vector

Procedia PDF Downloads 300
24455 A New Authenticable Steganographic Method via the Use of Numeric Data on Public Websites

Authors: Che-Wei Lee, Bay-Erl Lai

Abstract:

A new steganographic method using numeric data on public websites, with self-authentication capability, is proposed. The proposed technique transforms a secret message into partial shares by Shamir’s (k, n)-threshold secret sharing scheme with n = k + 1. The generated k+1 partial shares are then embedded into selected numeric items on a website as if they were part of the website’s numeric content. Afterward, a receiver links to the website and extracts every combination of k shares among the k+1 from the stego-numeric content to compute k+1 copies of the secret, and the value consistency of the computed k+1 copies is taken as evidence of whether the extracted message is authentic, attaining the goal of self-authentication of the extracted secret message. Experimental results and discussions are provided to show the feasibility and effectiveness of the proposed method.
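
The sketch below illustrates the (k, n)-threshold sharing with n = k + 1 and the value-consistency check used for self-authentication; the prime field and the integer-valued secret are assumptions, and the embedding of shares into website numerals is omitted.

```python
# Shamir (k, n)-threshold sharing with n = k + 1 and the consistency check.
import random
from itertools import combinations

P = 2**61 - 1  # prime modulus for the share arithmetic (an illustrative choice)

def make_shares(secret, k, n):
    # Random polynomial of degree k-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

k = 3
shares = make_shares(secret=123456789, k=k, n=k + 1)
copies = [reconstruct(list(group)) for group in combinations(shares, k)]  # k+1 ways to pick k shares
print(len(set(copies)) == 1)  # value consistency => extracted message accepted as authentic
```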

Keywords: steganography, data hiding, secret authentication, secret sharing

Procedia PDF Downloads 218
24454 Attitude-Behavior Consistency: A Descriptive Study in the Context of Climate Change and Acceptance of Psychological Findings by the Public

Authors: Nita Mitra, Pranab Chanda

Abstract:

In this paper, the issue of attitude-behavior consistency has been addressed in the context of climate change. Scientists (about 98 percent) opine that human behavior has a significant role in climate change, and such climate change is harmful to human life. Thus, it is natural to conclude that only a change in human behavior can avoid harmful consequences. Government and non-government organizations are taking steps to bring in the desired changes in behavior. However, it seems that although these efforts are achieving changes in attitudes to some degree, they are failing to materialize the corresponding behavioral changes. This has been a great concern for environmentalists. Psychologists have noticed the problem as a particular case of the general psychological problem of making attitude and behavior consistent with each other. The present study is a continuation of a previous work of the same author based upon descriptive research on the status of attitude and behavior regarding climate change among the people of a foot-hill region of the Himalayas in India. The observations confirm the mismatch of attitude and behavior of the people of the region with respect to climate change. While doing so, an attitude-behavior mismatch was also noticed with respect to the acceptance of psychological findings by the public: people were found to be interested in psychology as an important subject, but reluctant to take the observations of psychologists seriously. A comparative study in this regard has been made with similar studies done elsewhere. Finally, an attempt has been made to interpret the observations within the frameworks of observational learning due to Bandura and behavior change due to Lewin.

Keywords: acceptance of psychological variables, attitude-behavior consistency, behavior change, climate change, observational learning

Procedia PDF Downloads 122
24453 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
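
As an illustration of the dimensionality-reduction half of the proposed pipeline, the sketch below computes a basic diffusion map: a Gaussian kernel on pairwise distances, row-normalization into a Markov operator, and the leading non-trivial eigenvectors scaled by their eigenvalues. The kernel bandwidth, diffusion time, and sample data are assumptions, and the DPM half of the method is not reproduced.

```python
import numpy as np
from scipy.spatial.distance import cdist

def diffusion_map(X, n_components=2, epsilon=1.0, t=1):
    """Basic diffusion map embedding of the rows of X."""
    D = cdist(X, X, metric="sqeuclidean")
    K = np.exp(-D / epsilon)              # Gaussian kernel on pairwise distances
    P = K / K.sum(axis=1, keepdims=True)  # row-normalized Markov (diffusion) operator
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)        # eigenvalues sorted descending
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial constant eigenvector (eigenvalue 1); scale by eigenvalue**t.
    return (vals[1:n_components + 1] ** t) * vecs[:, 1:n_components + 1]

# Hypothetical usage on random high-dimensional points.
X = np.random.default_rng(0).normal(size=(200, 10))
embedding = diffusion_map(X, n_components=2, epsilon=2.0)
print(embedding.shape)  # (200, 2)
```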

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 67
24452 Research on Dynamic Practical Byzantine Fault Tolerance Consensus Algorithm

Authors: Cao Xiaopeng, Shi Linkai

Abstract:

The practical Byzantine fault tolerance algorithm does not add nodes dynamically, which limits it in practical applications. In order to add nodes dynamically, a Dynamic Practical Byzantine Fault Tolerance algorithm (DPBFT) is proposed. Firstly, a new node sends request information to the other nodes in the network, and the nodes in the network decide on their identities and the request. Then the nodes in the network reverse-connect to the new node and send the block information of the current network, and the new node updates its information. Finally, the new node participates in the next round of consensus, the view is changed, and the master node is selected. This paper abstracts the decisions of the nodes into an undirected connected graph; the final consistency of the graph is used to prove that the proposed algorithm can adapt to the network dynamically. Compared with the PBFT algorithm, DPBFT has better fault tolerance and lower network bandwidth requirements.
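
The sketch below mimics the join handshake described in the abstract (request, reverse connection with block information, state update, participation in the next round) with purely hypothetical message names and in-memory objects; it is not the authors' DPBFT implementation.

```python
# A toy, in-memory sketch of the dynamic join handshake; all names are assumptions.
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: int
    blocks: list = field(default_factory=list)   # chain state known to this node
    peers: set = field(default_factory=set)

    def handle_join_request(self, new_node):
        # Existing node accepts the request, reverse-connects, and ships its chain.
        self.peers.add(new_node.node_id)
        new_node.receive_chain(self.node_id, list(self.blocks))

    def receive_chain(self, sender_id, blocks):
        self.peers.add(sender_id)
        if len(blocks) > len(self.blocks):
            self.blocks = blocks                 # update to the longest reported state

def join_network(network, new_node):
    for node in network:
        node.handle_join_request(new_node)
    network.append(new_node)                     # newcomer joins the next consensus round

network = [Node(i, blocks=[f"block{j}" for j in range(3)]) for i in range(4)]
join_network(network, Node(4))
print(sorted(network[-1].peers), len(network[-1].blocks))  # [0, 1, 2, 3] 3
```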

Keywords: practical byzantine, fault tolerance, blockchain, consensus algorithm, consistency analysis

Procedia PDF Downloads 103
24451 Applying Spanning Tree Graph Theory for Automatic Database Normalization

Authors: Chetneti Srisa-an

Abstract:

In the field of knowledge and data engineering, the relational database is the standard repository for storing real-world data. It has been used around the world for more than five decades. Normalization is the most important process in the analysis and design of relational databases. It aims at creating a set of relational tables with minimum data redundancy that preserve consistency and facilitate correct insertion, deletion, and modification. Normalization is a major task in the design of relational databases. Despite its importance, very few algorithms have been developed for use in commercial automatic normalization tools, and it is still rare to perform normalization automatically rather than manually. Moreover, for the large and complex databases of today, doing it manually is even harder. This paper presents a new, fully automated relational database normalization method. It first produces a directed graph and a spanning tree, and then proceeds with generating the 2NF, 3NF, and BCNF normal forms. The benefit of this new algorithm is that it can cope with a large set of complex functional dependencies.
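
The paper's spanning-tree construction is not detailed in the abstract, but any automatic normalizer rests on attribute closures over the functional dependencies; the sketch below shows that standard closure computation and a BCNF-violation check on a hypothetical schema.

```python
def closure(attrs, fds):
    """Attribute closure under a set of functional dependencies.
    fds: list of (lhs frozenset, rhs frozenset)."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def violates_bcnf(relation, fds):
    """Return the FDs X -> Y (within the relation) whose left side is not a superkey."""
    bad = []
    for lhs, rhs in fds:
        if lhs <= relation and rhs <= relation:
            if not relation <= closure(lhs, fds):  # lhs is not a superkey
                bad.append((lhs, rhs))
    return bad

# Hypothetical schema R(A, B, C, D) with A -> B and B -> C; both FDs violate BCNF,
# so an automatic tool would decompose R.
R = frozenset("ABCD")
fds = [(frozenset("A"), frozenset("B")), (frozenset("B"), frozenset("C"))]
print(violates_bcnf(R, fds))
```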

Keywords: relational database, functional dependency, automatic normalization, primary key, spanning tree

Procedia PDF Downloads 328
24450 Exhaustive Study of Essential Constraint Satisfaction Problem Techniques Based on N-Queens Problem

Authors: Md. Ahsan Ayub, Kazi A. Kalpoma, Humaira Tasnim Proma, Syed Mehrab Kabir, Rakib Ibna Hamid Chowdhury

Abstract:

Constraint Satisfaction Problems (CSPs) are observed in various applications, i.e., scheduling problems, timetabling problems, assignment problems, etc. Researchers adopt a CSP technique to tackle a certain problem; however, each technique follows a different approach and way of solving the problem network. In our exhaustive study, it has been possible to visualize the processes of the essential CSP algorithms on a very concrete constraint satisfaction example, the N-Queens problem, in order to gain a deep understanding of how a particular constraint satisfaction problem will be dealt with by our studied and implemented techniques. Besides, benchmark results - time vs. value of N in N-Queens - have been generated from our implemented approaches, which help show how each algorithm scales as N grows, especially in the N-Queens puzzle. Thus, informed decisions can be made when instantiating a real-life problem within the CSP framework.
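
As a concrete instance of the techniques listed in the keywords, the sketch below solves N-Queens with backtracking, forward checking, and the minimum-remaining-values (MRV) heuristic; it is an illustrative implementation, not the benchmark code used in the study.

```python
def solve_n_queens(n):
    """Backtracking with forward checking and the MRV heuristic.
    domains[r] holds the columns still allowed for the queen in row r."""
    domains = {r: set(range(n)) for r in range(n)}
    assignment = {}

    def prune(row, col):
        removed = []
        for r, dom in domains.items():
            if r in assignment or r == row:
                continue
            for c in list(dom):
                if c == col or abs(c - col) == abs(r - row):  # same column or diagonal
                    dom.remove(c)
                    removed.append((r, c))
        return removed

    def restore(removed):
        for r, c in removed:
            domains[r].add(c)

    def backtrack():
        if len(assignment) == n:
            return dict(assignment)
        # MRV: pick the unassigned row with the fewest remaining values.
        row = min((r for r in domains if r not in assignment), key=lambda r: len(domains[r]))
        for col in sorted(domains[row]):
            assignment[row] = col
            removed = prune(row, col)
            # Forward checking: every still-unassigned row must keep a non-empty domain.
            if all(domains[r] for r in domains if r not in assignment):
                result = backtrack()
                if result:
                    return result
            restore(removed)
            del assignment[row]
        return None

    return backtrack()

print(solve_n_queens(8))  # a dict mapping row -> column for one solution
```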

Keywords: arc consistency (AC), backjumping algorithm (BJ), backtracking algorithm (BT), constraint satisfaction problem (CSP), forward checking (FC), least constrained values (LCV), maintaining arc consistency (MAC), minimum remaining values (MRV), N-Queens problem

Procedia PDF Downloads 330
24449 Multichannel Analysis of the Surface Waves of Earth Materials in Some Parts of Lagos State, Nigeria

Authors: R. B. Adegbola, K. F. Oyedele, L. Adeoti

Abstract:

We present a method that utilizes Multichannel Analysis of Surface Waves (MASW), which was used to measure shear wave velocities with a view to establishing the probable causes of road failure, subsidence, and weakening of structures in some Local Government Areas of Lagos, Nigeria. MASW data were acquired using a 24-channel seismograph. The acquired data were processed and transformed into a two-dimensional (2-D) structure reflecting the depth and surface wave velocity distribution within a depth of 0-15 m beneath the surface using SURFSEIS software. The shear wave velocity data were compared with other geophysical/borehole data acquired along the same profile. The comparison and correlation illustrate the accuracy and consistency of the MASW-derived shear wave velocity profiles. Rigidity modulus and N-value were also generated. The study showed that the low/very low velocities reflect organic clay/peat materials, which are thus likely responsible for the road failure, subsidence, and weakening of structures within the study areas.

Keywords: seismograph, road failure, rigidity modulus, N-value, subsidence

Procedia PDF Downloads 328
24448 Road Transition Design on Freeway Tunnel Entrance and Exit Based on Traffic Capacity

Authors: Han Bai, Tong Zhang, Lemei Yu, Doudou Xie, Liang Zhao

Abstract:

Road transition design at freeway tunnel entrances and exits is one vital factor in realizing smooth transitions and improving traveling safety for vehicles. The goal of this research is to develop a horizontal road transition design tool based on the principle of traffic capacity consistency and to explore its accommodation mechanism. The influencing factors of capacity are synthesized, and a modified capacity calculation model focusing on the influence of road width and lateral clearance is developed based on VISSIM simulation to calculate the width of road transition sections. To keep traffic capacity consistent, the right side of the transition section at the tunnel entrance and exit is divided into three parts: a front arc, an intermediate transition section, and an end arc; an optimization design for each transition part is conducted to improve capacity stability and horizontal alignment transition. A case study on the Panlong Tunnel on the Ji-Qing freeway illustrates the application of the tool.

Keywords: traffic safety, road transition, freeway tunnel, traffic capacity

Procedia PDF Downloads 298
24447 Influence of the Mixer on the Rheological Properties of the Fresh Concrete

Authors: Alexander Nitsche, Piotr-Robert Lazik, Harald Garrecht

Abstract:

The viscosity of concrete has a great influence on the properties of the fresh concrete. Fresh concretes with low viscosity have good flowability, whereas those with high viscosity have lower flowability. Clearly, viscosity is directly linked to other parameters such as the consistency, compaction, and workability of the concrete. The above parameters also depend very much on the energy induced during the mixing process and, of course, on the mixer installation itself. The University of Stuttgart has decided to investigate the influence of different mixing systems on the viscosity of various types of concrete, such as road concrete, self-compacting concrete, and lightweight concrete, using a rheometer and other testing methods. Each type is tested with three different mixers, and the rheological properties, namely consistency and viscosity, are determined. The aim of the study is to show that different types of concrete mixed with different types of mixers reach completely different yield points. Therefore, a three-step procedure is introduced. First, the various types of concrete mixtures and their differences are introduced. Then, the suspension mixer and the conventional mixers used in this paper are discussed. Lastly, the influence of the mixing system on the rheological properties of each of the selected mix designs, as well as on fresh concrete in general, is presented.

Keywords: rheological properties, flowability, suspension mixer, viscosity

Procedia PDF Downloads 114
24446 Integrated On-Board Diagnostic-II and Direct Controller Area Network Access for Vehicle Monitoring System

Authors: Kavian Khosravinia, Mohd Khair Hassan, Ribhan Zafira Abdul Rahman, Syed Abdul Rahman Al-Haddad

Abstract:

The CAN (controller area network) bus is introduced as a multi-master, message-broadcast system. The messages sent on the CAN bus are used to communicate state information, referred to as signals, between different ECUs, which provides data consistency in every node of the system. OBD-II dongles based on the request-response method are the widespread solution for extracting sensor data from cars among researchers. Unfortunately, most past research does not consider the resolution and quantity of the input data extracted through OBD-II technology. The maximum feasible scan rate is only 9 queries per second, which provides 8 data points per second using the well-known ELM327 OBD-II dongle. This study aims to develop and design a programmable, latency-sensitive vehicle data acquisition system that improves modularity and flexibility in order to extract exact, trustworthy, and fresh car sensor data at higher frequency rates. Furthermore, the researcher must break apart, thoroughly inspect, and observe the internal network of the vehicle, which may cause severe damage to the expensive ECUs of the vehicle due to intrinsic vulnerabilities of the CAN bus during initial research. The desired sensor data were collected from various vehicles utilizing a Raspberry Pi 3 as the computing and processing unit, using the OBD (request-response) and direct CAN methods at the same time. Two types of data were collected for this study: first, CAN bus frame data, which represents the data collected for each line of hex data sent by an ECU; and second, OBD data, which represents the limited data that can be requested from an ECU under standard conditions. The proposed system is a reconfigurable, human-readable, and multi-task telematics device that can be fitted into any vehicle with minimum effort and minimum time lag in the data extraction process. A standard operating procedure and an experimental vehicle network test bench are developed and can be used for future vehicle network testing experiments.
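
A minimal sketch of collecting engine data both ways on a Raspberry Pi is shown below, assuming the python-obd and python-can packages, an ELM327-style dongle, and a SocketCAN interface named can0; the channel name and the frame handling are assumptions, not the authors' setup.

```python
# Collecting the same vehicle data two ways: decoded OBD-II queries vs. raw CAN frames.
import can   # direct CAN frame capture (python-can)
import obd   # request-response OBD-II queries (python-obd)

obd_conn = obd.OBD()   # auto-detects the ELM327-style dongle
bus = can.interface.Bus(channel="can0", bustype="socketcan")

# OBD path: one query, one decoded data point (limited to the standard PID rate).
rpm = obd_conn.query(obd.commands.RPM)
if not rpm.is_null():
    print("OBD RPM:", rpm.value)

# Direct CAN path: every raw frame on the bus, at bus rate.
for _ in range(100):
    msg = bus.recv(timeout=1.0)
    if msg is not None:
        print(f"{msg.timestamp:.3f}  id=0x{msg.arbitration_id:X}  data={msg.data.hex()}")
```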

Keywords: CAN bus, OBD-II, vehicle data acquisition, connected cars, telemetry, Raspberry Pi3

Procedia PDF Downloads 168
24445 Characteristics of Oil-In-Water Emulsion Stabilized with Pregelatinized Waxy Rice Starch

Authors: R. Yulianingsih, S. Gohtani

Abstract:

The characteristics of pregelatinized waxy rice starch (PWR) gelatinized at different temperatures (65, 75, and 85 °C, abbreviated as PWR 65, 75, and 85, respectively) and its emulsion-stabilizing properties at different starch concentrations (3, 5, 7, and 9%) were studied. The yield stress and consistency index of the PWR solutions increased with an increase in starch concentration. The pseudoplasticity of the PWR 65 solution increased, and that of both the PWR 75 and 85 solutions decreased, with an increase in starch concentration. Small-angle X-ray scattering (SAXS) profiles analyzed by Kratky plots indicated that PWR 65 consists of natively unfolded particles, while PWR 75 and 85 consist of globular particles. The characteristics of emulsions stabilized with PWR were influenced by the gelatinization temperature and the starch concentration. An elevated concentration of starch decreased the yield stress and increased the consistency index. PWR 65 produced emulsions stable against creaming at starch concentrations above 5%, while PWR 85 was able to produce emulsions stable against both creaming and coalescence of droplets.

Keywords: emulsion, gelatinization temperature, rheology, small-angle X-ray scattering, waxy rice starch

Procedia PDF Downloads 135
24444 Digitalization of Functional Safety - Increasing Productivity while Reducing Risks

Authors: Michael Scott, Phil Jarrell

Abstract:

Digitalization seems to be everywhere these days. So if one were to digitalize Functional Safety, what would that require?
• Ability to directly use data from intelligent P&IDs / process design in a PHA / LOPA
• Ability to directly use data from intelligent P&IDs in the SIS Design to support SIL Verification Calculations, SRS, C&Es, Functional Test Plans
• Ability to create Unit Operation / SIF Libraries to radically reduce engineering manhours while ensuring consistency and improving quality of SIS designs
• Ability to link data directly from a PHA / LOPA to SIS Designs
• Ability to leverage reliability models and SRS details from SIS Designs to automatically program the Safety PLC
• Ability to leverage SIS Test Plans to automatically create Safety PLC application logic Test Plans for a virtual FAT
• Ability to tie real-time data from Process Historians / CMMS to assumptions in the PHA / LOPA and SIS Designs to generate leading indicators on protection layer health
• Ability to flag SIS bad actors for proactive corrective actions prior to a near miss or loss of containment event
What if I told you all of this was available today? This paper will highlight how the digital revolution has transformed the way Safety Instrumented Systems are designed, configured, operated and maintained.

Keywords: IEC 61511, safety instrumented systems, functional safety, digitalization, IIoT

Procedia PDF Downloads 140
24443 Development and Evaluation of a Psychological Adjustment and Adaptation Status Scale for Breast Cancer Survivors

Authors: Jing Chen, Jun-E Liu, Peng Yue

Abstract:

Objective: The objective of this study was to develop a psychological adjustment and adaptation status scale for breast cancer survivors and to examine the reliability and validity of the scale. Method: 37 breast cancer survivors were recruited for the qualitative research; a five-subject theoretical framework and an item pool of 150 items were derived from the interview data. In order to evaluate and select items and establish preliminary validity and reliability for the original scale, the suggestions of study group members, experts, and breast cancer survivors were taken, and statistical methods were applied step by step in a sample of 457 breast cancer survivors. Results: An original 24-item scale was developed. The five dimensions “domestic affections”, “interpersonal relationship”, “attitude of life”, “health awareness”, and “self-control/self-efficacy” explained 58.053% of the total variance. The content validity was assessed by experts; the CVI was 0.92. The construct validity was examined in a sample of 264 breast cancer survivors, and the fit indices of the confirmatory factor analysis (CFA) showed a good fit of the five-dimension model. The criterion-related validity of the total scale with the PTGI was satisfactory (r=0.564, p<0.001). The internal consistency reliability and test-retest reliability were tested: Cronbach’s alpha (0.911) showed good internal consistency reliability, and the intraclass correlation coefficient (ICC=0.925, p<0.001) showed satisfactory test-retest reliability. Conclusions: The scale is brief, easy to understand, and suitable for breast cancer patients whose physical strength and energy are limited.
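
For reference, the internal consistency figure reported above (Cronbach's alpha) can be computed from an item-score matrix as in the sketch below; the five-respondent data set is hypothetical.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5 respondents x 4 items
scores = [[3, 4, 3, 4], [2, 2, 3, 2], [5, 4, 5, 5], [4, 4, 4, 3], [1, 2, 1, 2]]
print(round(cronbach_alpha(scores), 3))
```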

Keywords: breast cancer survivors, rehabilitation, psychological adaption and adjustment, development of scale

Procedia PDF Downloads 490
24442 Statically Fused Unbiased Converted Measurements Kalman Filter

Authors: Zhengkun Guo, Yanbin Li, Wenqing Wang, Bo Zou

Abstract:

The statically fused converted position and Doppler measurements Kalman filter (SF-CMKF) with additive debiased measurement conversion has been previously presented to combine the resulting states of the converted position measurements Kalman filter (CPMKF) and the converted Doppler measurement Kalman filter (CDMKF) to yield the final state estimates under the minimum mean squared error (MMSE) criterion. However, the exact compensations for the bias in the polar-to-Cartesian and spherical-to-Cartesian conversions are multiplicative and depend on the statistics of the cosine of the angle measurement errors. As a result, the consistency and performance of the SF-CMKF may be suboptimal in large-angle-error situations. In this paper, the multiplicative unbiased position and Doppler measurement conversions for 2D (polar-to-Cartesian) tracking are derived, and the SF-CMKF is improved to use those conversions. Monte Carlo simulations are presented to demonstrate the statistical consistency of the multiplicative unbiased conversion and the superior performance of the modified SF-CMKF (SF-UCMKF).
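
For the 2D position part, the multiplicative debiasing commonly used in the literature divides the converted coordinates by the expected cosine of the bearing error, which equals exp(-sigma_theta^2 / 2) under Gaussian bearing noise; the sketch below shows that position conversion only, and the Doppler conversion and covariance terms derived in the paper are not reproduced.

```python
import numpy as np

def unbiased_polar_to_cartesian(r, theta, sigma_theta):
    """Multiplicatively debiased 2D position conversion under Gaussian bearing noise.
    lam = E[cos(bearing error)] = exp(-sigma_theta**2 / 2)."""
    lam = np.exp(-sigma_theta**2 / 2.0)
    x = r * np.cos(theta) / lam
    y = r * np.sin(theta) / lam
    return x, y

# Hypothetical radar measurement: 1000 m range, 30 deg bearing, 2 deg bearing noise.
x, y = unbiased_polar_to_cartesian(r=1000.0, theta=np.deg2rad(30.0), sigma_theta=np.deg2rad(2.0))
print(x, y)
```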

Keywords: measurement conversion, Doppler, Kalman filter, estimation, tracking

Procedia PDF Downloads 172
24441 A Semiparametric Approach to Estimate the Mode of Continuous Multivariate Data

Authors: Tiee-Jian Wu, Chih-Yuan Hsu

Abstract:

Mode estimation is an important task because it has applications to data from a wide variety of sources. We propose a semi-parametric approach to estimating the mode of an unknown continuous multivariate density function. Our approach is based on a weighted average of a parametric density estimate using the Box-Cox transform and a non-parametric kernel density estimate. Our semi-parametric mode estimate improves on both the parametric and non-parametric mode estimates. Specifically, our mode estimate solves the inconsistency problem of parametric mode estimates (at large sample sizes) and reduces the variability of non-parametric mode estimates (at small sample sizes). The performance of our method at practical sample sizes is demonstrated by simulation examples and two real examples from the fields of climatology and image recognition.
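
One plausible one-dimensional reading of the semi-parametric estimator is sketched below: a Box-Cox-normal parametric density (obtained via change of variables) is blended with a kernel density estimate, and the mode of the blend is returned. The equal weighting, the grid search, and the lognormal test sample are assumptions, since the abstract does not give the exact weighting scheme or the multivariate details.

```python
import numpy as np
from scipy import stats
from scipy.special import boxcox

def semiparametric_mode(x, w=0.5, grid_size=2000):
    """Blend a Box-Cox-normal parametric density with a KDE and return the blend's mode."""
    x = np.asarray(x, dtype=float)          # data must be positive for the Box-Cox transform
    y, lam = stats.boxcox(x)                # fitted transform parameter lambda
    mu, sigma = y.mean(), y.std(ddof=1)
    grid = np.linspace(x.min(), x.max(), grid_size)
    # Parametric density of x via change of variables: f(x) = phi(boxcox(x)) * x**(lam - 1)
    f_param = stats.norm.pdf(boxcox(grid, lam), mu, sigma) * grid ** (lam - 1.0)
    f_kde = stats.gaussian_kde(x)(grid)     # non-parametric part
    blend = w * f_param + (1.0 - w) * f_kde
    return grid[np.argmax(blend)]

rng = np.random.default_rng(0)
sample = rng.lognormal(mean=0.0, sigma=0.5, size=300)
print(semiparametric_mode(sample))
```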

Keywords: Box-Cox transform, density estimation, mode seeking, semiparametric method

Procedia PDF Downloads 254
24440 Selecting Graduates for the Interns’ Award by Using Multisource Feedback Process: Does It Work?

Authors: Kathryn Strachan, Sameer Otoom, Amal AL-Gallaf, Ahmed Al Ansari

Abstract:

Introduction: Introducing a reliable method to select graduates for an award in higher education can be challenging, but it is not impossible. Multisource feedback (MSF) is a popular assessment tool that relies on evaluations by different groups of people, including physicians and non-physicians. It is useful for assessing several domains, including professionalism, communication, and collaboration, and may be useful for selecting the best interns to receive a university award. Methods: 16 graduates responded to an invitation to participate in the student award, which was conducted by the Royal College of Surgeons in Ireland - Medical University of Bahrain (RCSI Bahrain) using the MSF process. Five individuals from each of the following categories rated each participant: physicians, nurses, and fellow students. RCSI Bahrain graduates were assessed in the following domains: professionalism, communication, and collaboration. The mean and standard deviation were calculated, and the award was given to the graduate who scored highest among his/her colleagues. Cronbach’s coefficient was used to determine the questionnaire’s internal consistency and reliability, and factor analysis was conducted to examine construct validity. Results: 16 graduates participated in the RCSI Bahrain interns’ award based on the MSF process, giving a 16.5% response rate. The instrument was found to be suitable for factor analysis and showed a three-factor solution representing 79.3% of the total variance. Reliability analysis using Cronbach’s α indicated that the full scale of the instrument had high internal consistency (Cronbach’s α = 0.98). Conclusion: This study found the MSF process to be reliable and valid for selecting the best graduates for the interns’ award. However, the low response rate may suggest that the process is not feasible for allowing the majority of students to participate in the selection process. Further research may be required to support the feasibility of the MSF process in selecting graduates for the university award.

Keywords: MSF, RCSI, validity, Bahrain

Procedia PDF Downloads 314
24439 The Perspective of Waria Transgenders in Singaraja on Their Reproduction Health

Authors: Made Kurnia Widiastuti Giri, Nyoman Kanca, Arie Swastini, Bambang Purwanto

Abstract:

Aim: Waria transgenders are a phenomenon whose existence is undeniable. The sexual behaviours of waria transgenders place them among the groups at high risk of STD infections, especially HIV/AIDS. The present study was aimed at finding out the general picture of the existence of waria transgenders in Singaraja, their sexual transactions, their sexual behaviours, the factors affecting their sexual behaviours, and their participation in regular reproduction health control. Methods: The subjects of the present research were male-to-female transgenders living in the town of Singaraja. The research applied a qualitative approach, and data collection was conducted through in-depth interviews and observation. Results: The results of the study exposed 1) the existence of the waria transgender community in Singaraja, observed from their active participation in social events such as taking the role of counsellors in the campaign for the prevention and control of HIV/AIDS together with the Local Commission of AIDS Control and other foundations; 2) the sexual services provided by waria transgenders, performed as squeeze, oral, and anal sex, which can be categorized as HIV/AIDS high-risk sexual behaviours, while the consistency in practicing safe sex among the transgenders in Singaraja showed that most of them (80%) were aware of the urgency of using condoms during sexual intercourse; and 3) the low participation of the waria transgenders in Singaraja in regular reproduction health check-ups at the local Centre of Public Health Service, which was caused by their negative perception of being examined by female doctors. Conclusions: Waria in Singaraja are categorized as engaging in HIV/AIDS high-risk sexual behaviours, but they do show consistency in practicing safe sex by using condoms. They have a negative psychological perception of being examined by female doctors.

Keywords: waria transgenders, sexual behaviours, reproduction health, hiv/aids

Procedia PDF Downloads 312
24438 Effect of Gum Extracts on the Textural and Bread-Making Properties of a Composite Flour Based on Sour Cassava Starch (Manihot esculenta), Peanut (Arachis hypogaea) and Cowpea Flour (Vigna unguiculata)

Authors: Marie Madeleine Nanga Ndjang, Julie Mathilde Klang, Edwin M. Mmutlane, Derek Tantoh Ndinteh, Eugenie Kayitesi, Francois Ngoufack Zambou

Abstract:

Gluten intolerance and the unavailability of wheat flour in some parts of the world have led to the development of gluten-free bread. However, gluten-free bread generally has a low specific volume, and to remedy this, the use of hydrocolloids and bases has proved to be very successful. Thus, the present study aims to determine the optimal proportions of gum extract of Triumfetta pentandra and sodium bicarbonate in breadmaking with a composite flour based on sour cassava starch, peanut, and cowpea flour. To achieve this, a Box-Behnken design was used, the variables being the amount of gum extract, the amount of bicarbonate, and the amount of water. The responses evaluated were the specific volume and the texture properties (hardness, cohesiveness, consistency, elasticity, and masticability). The specific volume was determined according to standard AACC methods and the textural properties with a texture analyzer. It appears from this analysis that the specific volume is positively influenced by the incorporation of gum extract, bicarbonate, and water. The hardness, consistency, and plasticity increased with the incorporation rate of gum extract but decreased with the incorporation rate of bicarbonate and water. On the other hand, cohesion and elasticity increased with the incorporation rate of bicarbonate and water but decreased with the incorporation of gum extract. The optimal proportions of gum extract, bicarbonate, and water are 0.28, 1.99, and 112.5, respectively. This results in a specific volume of 1.51, a hardness of 38.51, a cohesiveness of 0.88, a consistency of 32.86, an elasticity of 5.57, and a masticability of 162.35. Thus, this analysis suggests that gum extracts and sodium bicarbonate can be used to improve the quality of gluten-free bread.

Keywords: Box-Behnken design, bread-making, gums, texture properties, specific volume

Procedia PDF Downloads 65
24437 Mapping Feature Models to Code Using a Reference Architecture: A Case Study

Authors: Karam Ignaim, Joao M. Fernandes, Andre L. Ferreira

Abstract:

Mapping the artifacts coming from a family of similar products, developed in an ad-hoc manner, to the resulting software product line (SPL) plays a key role in maintaining consistency between requirements and code. This paper presents a feature mapping approach that focuses on tracing the artifact coming from the migration process, the current feature model (FM), to the other artifacts of the resulting SPL: the reference architecture and the code. Thus, our approach relates each feature of the current FM to its locations in the implementation code, using the reference architecture as an intermediate artifact (a central point) to preserve consistency among them during SPL evolution. The approach uses a particular artifact (i.e., a traceability tree) as a solution for managing the mapping process. Tool support is provided by friendlyMapper. We have evaluated the feature mapping approach and tool support by putting the approach into practice (i.e., conducting a case study) in the automotive domain, on the Classical Sensor Variants Family at Bosch Car Multimedia S.A. The evaluation reveals that the mapping approach presented in this paper fits the automotive domain.

Keywords: feature location, feature models, mapping, software product lines, traceability

Procedia PDF Downloads 91
24436 A Network-Theorical Perspective on Music Analysis

Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria

Abstract:

The present paper describes a framework for constructing mathematical networks that encode relevant musical information from a music score for structural analysis. These graphs encode statistical information about music elements such as notes, chords, rhythms, intervals, etc., and the relations among them, and so become helpful in visualizing and understanding important stylistic features of a music fragment. In order to build such networks, musical data is parsed out of a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring the centrality of nodes, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: predominant elements, connectivity between them, and the complexity of the information contained in it. Music pieces in different styles are analyzed, and the results are contrasted with the outcome of traditional analysis in order to show the consistency and potential utility of this method for music analysis.
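
A toy version of the pipeline is sketched below with the networkx package: pitches become nodes, successive-note transitions become weighted edges, and centrality, community structure, and entropy are computed. The ten-note melody and the choice of entropy over pitch frequencies are assumptions for illustration; the authors parse full symbolic scores, which is not reproduced here.

```python
import math
from collections import Counter
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

melody = ["C4", "E4", "G4", "E4", "C4", "F4", "A4", "F4", "C4", "G4"]  # hypothetical fragment

# Nodes are pitches; edges are successive-note transitions weighted by count.
G = nx.Graph()
for a, b in zip(melody, melody[1:]):
    if G.has_edge(a, b):
        G[a][b]["weight"] += 1
    else:
        G.add_edge(a, b, weight=1)

centrality = nx.degree_centrality(G)                  # predominant elements
communities = list(greedy_modularity_communities(G))  # groups of closely related pitches
counts = Counter(melody)
total = sum(counts.values())
entropy = -sum(c / total * math.log2(c / total) for c in counts.values())  # information content

print(centrality, communities, round(entropy, 3))
```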

Keywords: computational musicology, mathematical music modelling, music analysis, style classification

Procedia PDF Downloads 64
24435 A Comparative Study of Self, Peer and Teacher Assessment Based on an English Writing Checklist

Authors: Xiaoting Shi, Xiaomei Ma

Abstract:

In higher education, students' self-assessment and peer assessment of compositions in writing classes can effectively improve their capacity for evaluative judgment. However, self-assessment and peer assessment are not advocated by most teachers because of the significant differences in scoring compared with teacher assessment. This study used a multi-faceted Rasch model to explore whether an English writing checklist containing 30 descriptors can effectively improve rating consistency among self-assessment, peer assessment, and teacher assessment. Meanwhile, a questionnaire was used to survey students' and teachers' attitudes toward self-assessment and peer assessment using the writing checklist. The results of the multi-faceted Rasch model analysis show that the writing checklist can effectively distinguish the students' writing ability (separation coefficient = 2.05, separation reliability = 0.81, chi-square (df = 32) = 123.4). Moreover, the results revealed that the checklist could improve rating consistency among self-assessment, peer assessment, and teacher assessment (separation coefficient = 1.71, separation reliability = 0.75, chi-square (df = 4) = 20.8). The results of the questionnaire showed that more than 85% of students and all teachers believed that the checklist offered a clear advantage for self-assessment and peer assessment, and that they were willing to use the checklist for self-assessment and peer assessment in class in the future.

Keywords: english writing, self-assessment, peer assessment, writing checklist

Procedia PDF Downloads 126
24434 Psychometric Properties of the Social Skills Rating System: Teacher Version

Authors: Amani Kappi, Ana Maria Linares, Gia Mudd-Martin

Abstract:

Children with Attention Deficit Hyperactivity Disorder (ADHD) are more likely to develop social skills deficits that can lead to academic underachievement, peer rejection, and maladjustment. Surveying teachers about the social skills of children with ADHD is a significant factor in identifying whether the children will be diagnosed with social skills deficits. The teacher-specific version of the Social Skills Rating System scale (SSRS-T) has been used as a screening tool for children's social behaviors. The psychometric properties of the SSRS-T have been evaluated in various populations and settings, for example when used by teachers to assess the social skills of children with learning disabilities. However, few studies have examined the psychometric properties of the SSRS-T when used to assess children with ADHD. The purpose of this study was to examine the psychometric properties of the SSRS-T and two SSRS-T subscales, Social Skills and Problem Behaviors. This was a secondary analysis of longitudinal data from the Fragile Families and Child Well-Being Study. The study included a sample of 194 teachers who used the SSRS-T to assess the social skills of children aged 8 to 10 years with ADHD. Exploratory principal components factor analysis was used to assess the construct validity of the SSRS-T scale. Cronbach's alpha was used to assess the internal consistency reliability of the total SSRS-T scale and the subscales. Item analyses included inter-item correlations, item-to-subscale correlations, and changes in Cronbach's alpha with item deletion. The internal consistency reliability results for both the total scale and the subscales were acceptable. The results of the exploratory factor analysis supported the five factors of the SSRS-T (Cooperation, Self-control, Assertion, Internalizing behaviors, and Externalizing behaviors) reported in the original version. Findings indicated that the SSRS-T is a reliable and valid tool for assessing the social behaviors of children with ADHD.

Keywords: ADHD, children, social skills, SSRS-T, psychometric properties

Procedia PDF Downloads 101