Search results for: statistical process control
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27124

26614 Enhance Security in XML Databases: XLog File for Severity-Aware Trust-Based Access Control

Authors: A. Asmawi, L. S. Affendey, N. I. Udzir, R. Mahmod

Abstract:

The topic of enhancing security in XML databases is important, as it includes protecting sensitive data and providing a secure environment for users. In order to improve security and provide dynamic access control for XML databases, we present the XLog file, which calculates user trust values by recording users’ bad transactions, errors and query severities. Severity-aware trust-based access control for XML databases manages the access policy according to users' trust values and prevents unauthorized processes, malicious transactions and insider threats. Privileges are automatically modified and adjusted over time depending on user behaviour and query severity. Logging is an important database process and is used for recovery and security purposes. In this paper, the XLog file is presented as a dynamic and temporary log file for XML databases that enhances the level of security.
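
As an illustration only (the abstract does not give the exact scoring formula), a minimal sketch of a severity-aware trust record might look like the following; the severity weights, decay and threshold are assumptions for demonstration, not the paper's actual values.

```python
from dataclasses import dataclass, field

# Hypothetical severity weights; the paper's actual scoring scheme may differ.
SEVERITY_WEIGHT = {"low": 0.02, "medium": 0.05, "high": 0.15}

@dataclass
class XLogEntry:
    user: str
    event: str          # e.g. "bad_transaction", "error", "severe_query"
    severity: str       # "low" | "medium" | "high"

@dataclass
class TrustTable:
    trust: dict = field(default_factory=dict)   # user -> trust value in [0, 1]

    def record(self, entry: XLogEntry) -> None:
        """Lower a user's trust value according to the severity of the logged event."""
        current = self.trust.get(entry.user, 1.0)
        self.trust[entry.user] = max(0.0, current - SEVERITY_WEIGHT[entry.severity])

    def allowed(self, user: str, required_trust: float = 0.5) -> bool:
        """Access-control decision: grant the privilege only above a trust threshold."""
        return self.trust.get(user, 1.0) >= required_trust

table = TrustTable()
table.record(XLogEntry("alice", "severe_query", "high"))
print(table.allowed("alice"))   # True while trust stays above the assumed threshold
```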

Keywords: XML database, trust-based access control, severity-aware, trust values, log file

Procedia PDF Downloads 300
26613 A Dual-Mode Infinite Horizon Predictive Control Algorithm for Load Tracking in PUSPATI TRIGA Reactor

Authors: Mohd Sabri Minhat, Nurul Adilla Mohd Subha

Abstract:

The PUSPATI TRIGA Reactor (RTP), Malaysia, reached its first criticality on June 28, 1982, with a thermal power capacity of 1 MW. The Feedback Control Algorithm (FCA), a conventional Proportional-Integral (PI) controller, is the present power control method used to regulate the fission process in RTP. It is important to ensure that the core power remains stable and follows the load demand within an acceptable steady-state error and with a minimum settling time to reach steady-state power. At present, the power-tracking performance of the system cannot be considered satisfactory. However, there is still potential to improve it by developing a new generation of core power controllers. In this paper, the dual-mode prediction proposed in Optimal Model Predictive Control (OMPC) is presented in a state-space model to control the core power. The model for core power control was based on mathematical models of the reactor core, OMPC, and a control rod selection algorithm. The mathematical models of the reactor core were based on neutronic, thermal-hydraulic, and reactivity models. The dual-mode prediction in OMPC, covering the transient and terminal modes, was based on the implementation of a Linear Quadratic Regulator (LQR) in the design of the core power controller. The combination of dual-mode prediction and a Lyapunov treatment of the cost-function summation over an infinite horizon is intended to eliminate some of the fundamental weaknesses of MPC. This paper shows how OMPC deals with tracking, the regulation problem, disturbance rejection, and parameter uncertainty. Both tracking and regulating performance are compared between the conventional controller and OMPC by numerical simulations. In conclusion, the proposed OMPC has shown significant performance in load tracking and in regulating core power for a nuclear reactor, with guaranteed closed-loop stability.
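
As a sketch of the LQR ingredient used for the terminal mode of the dual-mode prediction (the matrices below are placeholders, not the paper's neutronic/thermal-hydraulic reactor model), the infinite-horizon gain can be obtained from the discrete algebraic Riccati equation:

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Placeholder discrete-time state-space model x[k+1] = A x[k] + B u[k];
# the actual reactor model (neutronics, thermal hydraulics, reactivity) is not reproduced here.
A = np.array([[1.0, 0.1],
              [0.0, 0.9]])
B = np.array([[0.0],
              [0.1]])
Q = np.diag([10.0, 1.0])     # state weighting (assumed)
R = np.array([[0.1]])        # input weighting (assumed)

# Infinite-horizon LQR gain for the terminal (unconstrained) mode
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Closed-loop dynamics x[k+1] = (A - B K) x[k]; eigenvalues inside the unit circle imply stability
print("terminal-mode gain K:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```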

Keywords: core power control, dual-mode prediction, load tracking, optimal model predictive control

Procedia PDF Downloads 162
26612 Investigating the Effect of Executive Functions on Young Children’s Drawing of Familiar and Unfamiliar Subjects

Authors: Reshaa Alruwaili

Abstract:

This study was inspired by previous studies with young children that found (a) that they need both inhibitory control and working memory when drawing an unfamiliar subject (e.g., animals) by adapting their schema of the human figure, and (b) that when drawing something familiar (e.g., a person), they use inhibitory control mediated through fine motor control to execute their drawing. This study, therefore, systematically investigated whether direct effects of both working memory and inhibitory control and/or effects mediated through fine motor control existed when drawing both familiar and unfamiliar subjects. Participants were 95 children (41-66 months old) who were required to draw both a man and a dog; the drawings were scored, respectively, for how representational they were and for differences from the human figure. Regression and mediation analyses showed that inhibitory control alone predicted drawing a recognizable man, while working memory alone predicted drawing a dog that was not human-like, when fine motor control, age, and gender were controlled. Contrasting with some previous studies, these results suggest that the roles of working memory and inhibitory control are sensitive to the familiarity of the drawing task and are not necessarily mediated through fine motor control. Implications for research on drawing development are discussed.

Keywords: child drawing, inhibitory control, working memory, fine motor control, mediation, familiar and unfamiliar subjects

Procedia PDF Downloads 76
26611 Holistic Approach to Teaching Mathematics in Secondary School as a Means of Improving Students’ Comprehension of Study Material

Authors: Natalia Podkhodova, Olga Sheremeteva, Mariia Soldaeva

Abstract:

Creating favorable conditions for students’ comprehension of mathematical content is one of the primary problems in teaching mathematics in secondary school. Psychology research has demonstrated that comprehension becomes possible when new information becomes part of a student’s subjective experience and when linkages between the attributes of notions and the various ways of presenting them can be established. Comprehension includes the ability to build a working situational model and thus becomes an important means of solving mathematical problems. The article describes the implementation of a holistic approach to teaching mathematics designed to address the primary challenges of such teaching, specifically the challenge of students’ comprehension. This approach consists of (1) establishing links between the attributes of a notion: the sense, the meaning, and the term; (2) taking into account the components of the student’s subjective experience (emotional and value-related, contextual, procedural, communicative) during the educational process; (3) establishing links between different ways of presenting mathematical information; (4) identifying and leveraging the relationships between real, perceptual and conceptual (scientific) mathematical spaces by applying real-life situational modeling. The article describes approaches to the practical use of these foundational concepts. The primary goal of the research was to identify how the proposed methods and technology influence understanding of the material used in teaching mathematics. The research included an experiment in which 256 secondary school students took part: 142 in the experimental group and 114 in the control group. All students in these groups had similar levels of achievement in math and studied math under the same curriculum. In the course of the experiment, comprehension of two topics -'Derivative' and 'Trigonometric functions'- was evaluated. Control group participants were taught using traditional methods. Students in the experimental group were taught using the holistic method: under the teacher’s guidance, they carried out problems designed to establish linkages between a notion’s characteristics, to convert information from one mode of presentation to another, as well as problems that required the ability to operate with all modes of presentation. The use of the technology that forms inter-subject notions based on linkages between perceptual, real, and conceptual mathematical spaces proved to be of special interest to the students. Results of the experiment were analyzed by presenting students in each of the groups with a final test on each of the studied topics. The test included problems that required building real situational models. Statistical analysis was used to aggregate the test results. Pearson's chi-squared criterion was used to reveal the statistical significance of the results (pass/fail on the modeling test). A significant difference in results was revealed (p < 0.001), which allowed the authors to conclude that students in the study group showed better comprehension of mathematical information than those in the control group. Also, Student’s t-test revealed that students in the experimental group solved reliably more problems (p = 0.0001) than those in the control group. The results obtained allow us to conclude that increased comprehension and assimilation of the study material took place as a result of applying the implemented methods and techniques.
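
For reference, the two statistical checks described above can be run as follows; the counts and score arrays are illustrative placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency, ttest_ind

# 2x2 contingency table for the modeling test:
# rows = experimental / control group, columns = passed / failed (placeholder counts)
table = np.array([[30, 10],
                  [18, 22]])
chi2, p_chi2, dof, _ = chi2_contingency(table)
print(f"Pearson chi-squared = {chi2:.2f}, p = {p_chi2:.4g}")

# Number of correctly solved problems per student in each group (placeholder arrays)
experimental_scores = np.array([7, 8, 6, 9, 8, 7, 9, 8])
control_scores      = np.array([5, 6, 4, 6, 5, 5, 6, 5])
t_stat, p_t = ttest_ind(experimental_scores, control_scores)
print(f"t = {t_stat:.2f}, p = {p_t:.4g}")
```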

Keywords: comprehension of mathematical content, holistic approach to teaching mathematics in secondary school, subjective experience, technology of the formation of inter-subject notions

Procedia PDF Downloads 176
26610 DEA-Based Variable Structure Position Control of DC Servo Motor

Authors: Ladan Maijama’a, Jibril D. Jiya, Ejike C. Anene

Abstract:

This paper presents Differential Evolution Algorithm (DEA) based Variable Structure Position Control (VSPC) of a laboratory DC servomotor (LDCSM). DEA is employed for the optimal tuning of the Variable Structure Control (VSC) parameters for position control of the DC servomotor. The VSC incorporates Sliding Mode Control (SMC) techniques, which give the advantages of small overshoot, improved step-response characteristics, faster dynamic response, adaptability to plant parameter variations, and suppressed influence of disturbances and uncertainties on system behavior. The simulations of the DEA-based VSC parameter adjustment were performed on the Matlab R2010a platform and yielded better dynamic performance compared with the untuned VSC design.
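
To make the tuning idea concrete, a minimal sketch of differential evolution searching two sliding-mode gains against an ITAE cost is given below; the motor parameters, control law and cost are simplified assumptions, not the paper's LDCSM model or Matlab implementation.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical second-order DC servomotor model (illustrative parameters only):
#   J * theta_ddot = Km * u - b * theta_dot
J, b, Km = 0.01, 0.1, 0.05
dt, T = 2e-3, 1.0
t = np.arange(0.0, T, dt)
ref = 1.0                                   # 1 rad step position reference

def itae_cost(params):
    """Simulate a boundary-layer sliding-mode position controller; return the ITAE cost."""
    c, K = params                           # sliding-surface slope and switching gain
    theta, omega, cost = 0.0, 0.0, 0.0
    for ti in t:
        e = ref - theta
        s = c * e - omega                   # sliding surface s = c*e + de/dt
        u = K * np.tanh(s / 0.05)           # smoothed switching control
        omega += (Km * u - b * omega) / J * dt
        theta += omega * dt
        cost += ti * abs(e) * dt            # integral of time-weighted absolute error
    return cost

# Differential evolution searches the (c, K) box for gains minimizing the ITAE criterion
result = differential_evolution(itae_cost, bounds=[(0.1, 50.0), (0.1, 10.0)], maxiter=20, seed=0)
print("tuned gains (c, K):", result.x, "ITAE:", result.fun)
```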

Keywords: differential evolution algorithm, laboratory DC servomotor, sliding mode control, variable structure control

Procedia PDF Downloads 415
26609 Determination of Anti-Fungal Activity of Cedrus deodara Oil against Oligoporus placentus, Trametes versicolor and Xylaria acuminata on Populus deltoides

Authors: Sauradipta Ganguly, Akhato Sumi, Sanjeet Kumar Hom, Ajan T. Lotha

Abstract:

Populus deltoides is a hardwood used predominantly for the manufacturing of plywood, matchsticks, and paper in India and hence has high economic significance. Wood-decaying fungi cause serious damage to Populus deltoides products, as the wood itself is perishable and vulnerable to decaying agents, decreasing its aesthetic value, which in turn results in significant monetary loss for the wood industries concerned. The aim of the study was to determine the antifungal activity of Cedrus deodara oil against three primary wood-decaying fungi, namely white-rot fungi (Trametes versicolor), brown-rot fungi (Oligoporus placentus) and soft-rot fungi (Xylaria acuminata), on Populus deltoides samples under optimum laboratory conditions. The susceptibility of Populus deltoides samples to fungal attack and the ability of deodar oil to control colonization of the wood-rotting fungi on the samples were assessed. Three concentrations of deodar oil were considered for the study as treating solutions, i.e., 4%, 5%, and 6%. The Populus deltoides samples were treated with the treating solutions, and the ability of the oil to prevent fungal attack on the samples was assessed using an accelerated laboratory test in a Biochemical Oxygen Demand incubator at a temperature of 25 ± 2°C and a relative humidity of 70 ± 4%. The efficacy test and statistical analysis of deodar oil against Trametes versicolor, Oligoporus placentus, and Xylaria acuminata on P. deltoides samples showed light, minor and negligible mycelial growth at 4%, 5% and 6% concentrations of deodar oil, respectively, whereas moderate to heavy attack was observed on the surface of the control samples. Statistical analysis further established that the treatments were statistically significant and had significantly inhibited the growth of all three fungal species by almost 3 to 5 times.

Keywords: Populus deltoides, Trametes versicolor, Oligoporus placentus, Xylaria acuminata, deodar oil, treatment

Procedia PDF Downloads 125
26608 Control of Listeria monocytogenes ATCC7644 in Fresh Tomato and Carrot with Zinc Oxide Nanoparticles

Authors: Oluwatosin A. Ijabadeniyi, Faith Semwayo

Abstract:

Preference for the consumption of fresh and minimally processed fruits and vegetables continues to trend upward; however, food-borne outbreaks related to them have also been on the increase. In this study, the effect of zinc oxide nanoparticles on controlling Listeria monocytogenes ATCC 7644 in tomatoes and carrots during storage was investigated. Nutrient broth was inoculated with Listeria monocytogenes ATCC 7644 and thereafter treated with 0.3 mg/ml and 1.2 mg/ml nano-zinc oxide solutions; 200 ppm chlorine was used as a control. Whole tomatoes and carrots were also inoculated with Listeria monocytogenes ATCC 7644, after which they were dipped into the zinc oxide nanoparticle solutions and the chlorine solution. In the broth, the 1.2 mg/ml solution gave a 2.40 log reduction and the 0.3 mg/ml nano-zinc oxide solution a log reduction of 2.15. There was, however, a 4.89 log and a 4.46 log reduction by 200 ppm chlorine in tomato and carrot, respectively. Control with 0.3 mg/ml zinc oxide nanoparticles resulted in a log reduction of 5.19 in tomato and 3.66 in carrots, while the 1.2 mg/ml nano-zinc oxide solution resulted in a 5.53 log reduction in tomato and a 4.44 log reduction in carrots. A combination of 50 ppm chlorine and 0.3 mg/ml nano-zinc oxide was also used and resulted in log reductions of 5.76 and 4.84 in tomatoes and carrots, respectively. Treatments were more effective in tomatoes than in carrots, and the combination of 50 ppm chlorine and 0.3 mg/ml ZnO resulted in the highest log reductions in both vegetables. Statistical analysis, however, showed that there was no significant difference between the treatments with chlorine and the nanoparticle solutions. This study therefore indicates that zinc oxide nanoparticles have potential for use as a control agent in the fresh produce industry.
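
For reference, the log reductions quoted above are computed from the surviving counts before and after treatment; a minimal calculation with placeholder plate counts (not the study's data) is:

```python
import math

def log_reduction(n_before: float, n_after: float) -> float:
    """Log10 reduction in viable count, e.g. CFU/ml before vs. after treatment."""
    return math.log10(n_before / n_after)

# Placeholder counts for illustration only
print(round(log_reduction(1.0e7, 4.0e4), 2))   # -> 2.40 log reduction
```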

Keywords: Listeria monocytogenes, nanoparticles, tomato, carrot

Procedia PDF Downloads 501
26607 Methods for Business Process Simulation Based on Petri Nets

Authors: K. Shoylekova, K. Grigorova

Abstract:

Petri nets were the first standard for business process modeling. This is most probably one of the core reasons why every new standard created afterwards has to be transformed so that it can be mapped onto Petri nets. The paper presents a business process repository based on a universal database. The repository allows the data about a given process to be stored in three different ways. The repository is developed with the transformation of a given model into a Petri net in mind, so that the model can be easily simulated. Two different techniques for business process simulation based on Petri nets, Yasper and Woflan, are discussed, and their advantages and drawbacks are outlined. The way business process models stored in the repository are simulated is shown.
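
To make the simulation idea concrete, a minimal token-game sketch of a place/transition net (not tied to Yasper's or Woflan's internal formats, nor to the repository's storage schema) can be written as follows:

```python
# Minimal place/transition Petri net: a transition fires when every input place holds a token.
class PetriNet:
    def __init__(self, marking, transitions):
        self.marking = dict(marking)          # place -> token count
        self.transitions = transitions        # name -> (input places, output places)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        if not self.enabled(name):
            raise RuntimeError(f"transition {name} is not enabled")
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Toy order-handling process: receive -> check -> ship
net = PetriNet(
    marking={"received": 1, "checked": 0, "shipped": 0},
    transitions={"check": (["received"], ["checked"]),
                 "ship":  (["checked"], ["shipped"])},
)
net.fire("check")
net.fire("ship")
print(net.marking)   # {'received': 0, 'checked': 0, 'shipped': 1}
```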

Keywords: business process repository, petri nets, simulation, Woflan, Yasper

Procedia PDF Downloads 370
26606 Control of Oil Content of Fried Zucchini Slices by Partial Predrying and Process Optimization

Authors: E. Karacabey, Ş. G. Özçelik, M. S. Turan, C. Baltacıoğlu, E. Küçüköner

Abstract:

The main concern about deep-fat-fried foods is their high final oil content, absorbed during the frying process and/or the cooling period, since a diet with a high oil content is regarded as unhealthy by consumers. Different methods have been evaluated to decrease the oil content of fried foodstuffs. One promising method is partial drying of the food material before frying. The present study aimed to control and decrease the final oil content of zucchini slices by means of partial pre-drying and to optimize the process conditions. Conventional oven drying was used to decrease the moisture content of the zucchini slices to a certain extent. Process performance in terms of oil uptake was evaluated by comparing the oil content of pre-dried and then fried zucchini slices with that of directly fried ones. For the pre-drying and frying processes, the controlled variables were the oven temperature and weight loss, and the frying oil temperature and time, respectively. Zucchini slices were also fried directly for sensory evaluations, which revealed the preferred properties of the final product in terms of surface color, moisture content, texture and taste. The properties of the directly fried zucchini slices that received the highest score in the sensory evaluation were determined and used as targets in the optimization procedure. Response surface methodology was used for process optimization. The properties determined from the sensory evaluation were selected as targets, while the oil content was to be minimized. The results indicated that the final oil content of zucchini slices could be reduced from 58% to 46% by controlling the conditions of the pre-drying and frying processes. As a result, it is suggested that pre-drying could be one option for reducing the oil content of fried zucchini slices for a healthier diet. This project (113R015) has been supported by TUBITAK.
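
A minimal sketch of the response-surface step is given below: a second-order model is fitted to the responses and then minimized over the coded factor range. The two factors shown and all numeric values are placeholders for illustration, not the study's measured data or its full factor set.

```python
import numpy as np
from scipy.optimize import minimize

def quadratic_design(x1, x2):
    """Design matrix of a second-order (response-surface) model for two coded factors."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

# Placeholder coded settings (e.g. pre-drying weight loss, frying time) and oil contents (%)
x1  = np.array([-1, -1,  1,  1,  0,  0, -1,  1,  0])
x2  = np.array([-1,  1, -1,  1, -1,  1,  0,  0,  0])
oil = np.array([58, 55, 52, 50, 54, 51, 56, 49, 50])

beta, *_ = np.linalg.lstsq(quadratic_design(x1, x2), oil, rcond=None)

# Minimize the fitted surface inside the coded factor range [-1, 1] x [-1, 1]
res = minimize(lambda x: float((quadratic_design([x[0]], [x[1]]) @ beta)[0]),
               x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
print("oil-minimizing coded settings:", res.x.round(2), "predicted oil content:", round(res.fun, 1))
```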

Keywords: health process, optimization, response surface methodology, oil uptake, conventional oven

Procedia PDF Downloads 366
26605 Intrusiveness, Appraisal and Thought Control Strategies in Patients with Obsessive Compulsive Disorder

Authors: T. Arshad

Abstract:

A correlational study was conducted to explore the relationships between intrusiveness, appraisal and thought control strategies in patients with obsessive-compulsive disorder. The theoretical framework for the present study was Salkovskis' (1985) cognitive model of obsessive-compulsive disorder. A sample of 100 patients (men = 48, women = 52) aged 14-62 years (M = 32.13, SD = 10.37) was recruited from hospitals in Lahore, Pakistan. The Revised Obsessional Intrusion Inventory, Stress Appraisal Measure, Thought Control Questionnaire and Symptoms Checklist-R were self-administered. Findings revealed that intrusiveness is correlated with appraisals (controllable by self, controllable by others, uncontrollable, stressfulness) and a thought control strategy (punishment). Furthermore, appraisals (uncontrollable, stressfulness, controllable by others) emerged as strong predictors of different thought control strategies (distraction, punishment and social control). Moreover, men had a higher frequency of intrusions, whereas women more frequently used social control as a thought control strategy. The results imply that intrusiveness, appraisals (controllable by others, uncontrollable, stressfulness) and the punishment thought control strategy are related, which maintains the disorder.

Keywords: appraisal, intrusiveness, obsessive compulsive disorder, thought control strategies

Procedia PDF Downloads 389
26604 Stability of Stochastic Model Predictive Control for Schrödinger Equation with Finite Approximation

Authors: Tomoaki Hashimoto

Abstract:

Recent technological advances have prompted significant interest in developing the control theory of quantum systems. Following the increasing interest in the control of quantum dynamics, this paper examines the control problem of the Schrödinger equation, because quantum dynamics is basically governed by the Schrödinger equation. From a practical point of view, stochastic disturbances cannot be avoided in the implementation of control methods for quantum systems. Thus, we consider here the robust stabilization problem of the Schrödinger equation against stochastic disturbances. In this paper, we adopt a model predictive control method in which control performance over a finite future is optimized with a performance index that has a moving initial and terminal time. The objective of this study is to derive the stability criterion for model predictive control of the Schrödinger equation under stochastic disturbances.

Keywords: optimal control, stochastic systems, quantum systems, stabilization

Procedia PDF Downloads 458
26603 Estimation of Forces Applied to Forearm Using EMG Signal Features for Control of Powered Human Arm Prostheses

Authors: Faruk Ortes, Derya Karabulut, Yunus Ziya Arslan

Abstract:

According to recent experimental research, myoelectric features gathered from the musculature are considered on a preferential basis for perceiving muscle activation and controlling human arm prostheses. In recent years, EMG (electromyography) signal-based human arm prostheses have shown promising performance in providing the basic functional requirements of motion for amputees. However, these assistive devices for neurorehabilitation still have important limitations in enabling amputees to perform rather sophisticated or functional movements. The surface electromyogram (EMG) is used as the control signal to command such devices. This kind of control consists of activating a motion in the prosthetic arm using the muscle activation for that particular motion. Extraction of clear and reliable neural information from EMG signals plays a major role, especially in the fine control of hand prosthesis movements. Many signal processing methods have been utilized for feature extraction from EMG signals. The specific objective of this study was to compare widely used time-domain features of the EMG signal, including integrated EMG (IEMG), root mean square (RMS) and waveform length (WL), for the prediction of forces externally applied to human hands. The obtained features were classified using artificial neural networks (ANN) to predict the forces. The EMG signals supplied to the process were recorded during only one type of muscle contraction, isometric and isotonic. Experiments were performed by three healthy right-handed subjects aged 25-35 years. EMG signals were collected from muscles of the proximal part of the upper body: biceps brachii, triceps brachii, pectoralis major and trapezius. The force prediction results obtained from the ANN were statistically analyzed, and the merits and pitfalls of the extracted features are discussed in detail. The obtained results are anticipated to contribute to the classification of EMG signals and to the motion control of powered human arm prostheses.
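
The three time-domain features compared here have standard definitions; a short sketch of extracting them from a placeholder EMG window is given below, with the ANN regression step left to any standard library.

```python
import numpy as np

def iemg(x):
    """Integrated EMG: sum of absolute amplitudes over the analysis window."""
    return np.sum(np.abs(x))

def rms(x):
    """Root mean square of the window."""
    return np.sqrt(np.mean(np.square(x)))

def waveform_length(x):
    """Waveform length: cumulative absolute difference between consecutive samples."""
    return np.sum(np.abs(np.diff(x)))

# Placeholder EMG window (in practice: a band-pass filtered segment per muscle channel)
window = np.random.default_rng(0).normal(scale=0.1, size=1024)
features = np.array([iemg(window), rms(window), waveform_length(window)])
print(features)   # feature vector fed to the ANN force-prediction model
```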

Keywords: assistive devices for neurorehabilitation, electromyography, feature extraction, force estimation, human arm prosthesis

Procedia PDF Downloads 367
26602 Energetic and Exergetic Evaluation of Box-Type Solar Cookers Using Different Insulation Materials

Authors: A. K. Areamu, J. C. Igbeka

Abstract:

The performance of box-type solar cookers has been reported by several researchers, but little attention has been paid to the effect of the type of insulation material on the energy and exergy efficiency of these cookers. This research aimed at evaluating the energy and exergy efficiencies of box-type cookers containing different insulation materials. Energy and exergy efficiencies of five box-type solar cookers insulated with maize cob, air (control), maize husk, coconut coir and polyurethane foam, respectively, were obtained over a period of three years. The cookers were evaluated using water heating test procedures to determine the energy and exergy efficiencies. The results were subjected to statistical analysis using ANOVA. The results show that the average energy inputs for the five solar cookers were 245.5, 252.2, 248.7, 241.5 and 245.5 J, respectively, while their respective average energy losses were 201.2, 212.7, 208.4, 189.1 and 199.8 J. The average exergy inputs for the five cookers were 228.2, 234.4, 231.1, 224.4 and 228.2 J, respectively, while their respective average exergy losses were 223.4, 230.6, 226.9, 218.9 and 223.0 J. The energy and exergy efficiencies were highest for the cooker with coconut coir (37.35% and 3.90%, respectively) in the first year and lowest for air (11% and 1.07%, respectively) in the third year. Statistical analysis showed a significant difference between the energy and exergy efficiencies over the years. These results reiterate the importance of a good insulating material for a box-type solar cooker.

Keywords: efficiency, energy, exergy, heating insolation

Procedia PDF Downloads 367
26601 An Approach Based on Statistics and Multi-Resolution Representation to Classify Mammograms

Authors: Nebi Gedik

Abstract:

One of the significant and continuing public health problems in the world is breast cancer. Early detection is very important to fight the disease, and mammography has been one of the most common and reliable methods to detect it in the early stages. However, it is a difficult task, and computer-aided diagnosis (CAD) systems are needed to assist radiologists in providing both accurate and uniform evaluation of masses in mammograms. In this study, a multi-resolution statistical method for classifying digitized mammograms as normal or abnormal is used to construct a CAD system. The mammogram images are represented by the wave atom transform, and this representation is organized into certain groups of coefficients, each handled independently. The CAD system is designed by calculating some statistical features from each group of coefficients. The classification is performed using a support vector machine (SVM).
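
Assuming the wave-atom coefficient groups have already been reduced to statistical feature vectors (the transform itself is not part of standard Python libraries and is not shown), the SVM classification stage could look like the following sketch, here with synthetic placeholder features:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: statistical features (e.g. mean, variance, skewness, kurtosis) per coefficient group,
# y: 0 = normal, 1 = abnormal. Random placeholders stand in for the real mammogram features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```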

Keywords: wave atom transform, statistical features, multi-resolution representation, mammogram

Procedia PDF Downloads 222
26600 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective

Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou

Abstract:

The analysis of geographic inequality heavily relies on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and to link to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, where only the overall status for the selected levels of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the outcomes of the analyzed results, well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on spatial partition principles that aim for homogeneity in the number of population and households. Compared to the outcomes of traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select the appropriate dissemination level for publishing statistical data. This paper compares the results of respectively using TGSC and township units on mortality data and examines the spatial characteristics of their outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) ranges from 571 to 1757 per 100,000 persons at the township level, whereas the 2nd dissemination area (TGSC) shows greater variation, ranging from 0 to 2222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment). The management and analysis of the statistical data referring to the TGSC in this research is strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow is developed that consists of the processing of death certificates, the geocoding of street addresses, the quality assurance of geocoded results, the automatic calculation of statistical measures, the standardized encoding of measures and the geo-visualization of statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of the mortality data and justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways to avoid wrong decision making.
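
For reference, the age-standardized death rate quoted above combines age-specific rates with standard population weights; a compact sketch with placeholder age groups and counts (not the Taitung County data) is:

```python
import numpy as np

def asdr_per_100k(deaths, population, std_population):
    """Age-standardized death rate: age-specific rates weighted by a standard population."""
    rates = deaths / population                      # age-specific death rates
    weights = std_population / std_population.sum()  # standard population weights
    return float(np.sum(rates * weights) * 100_000)

# Placeholder counts for three age groups
deaths         = np.array([    5,    20,   120])
population     = np.array([ 8000, 12000,  6000])
std_population = np.array([20000, 50000, 30000])
print(round(asdr_per_100k(deaths, population, std_population), 1))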

Keywords: mortality map, spatial patterns, statistical area, variation

Procedia PDF Downloads 258
26599 Effect of Nutrition Education on the Control and Function of Insulin-Dependent Diabetes Patients

Authors: Rahil Sahragard, Mahmoud Hatami, Rostam Bahadori Khalili

Abstract:

Diabetes is one of the most important health problems in the world and a chronic disease requiring continuous care; therefore, it is necessary for patients to receive self-care and nutrition education. This study was conducted to evaluate the effect of nutrition education on the metabolic control of diabetic patients in Tehran in 2015. An experimental study was conducted on 100 patients whose diabetes had previously been confirmed by a specialist physician and who had had the disease for at least one year. At first, patients without any knowledge of the educational program were selected as the sample; a checklist containing demographic and diabetes-specific information was completed for each of them, and three fasting and three non-fasting (5 p.m.) blood glucose measurements were taken. Then, the patients received face-to-face training under the same conditions for two weeks at Mehregan Hospital in Tehran, followed by three months of training during which they were fully monitored; participants who caught a cold, had a blood-pressure-related illness or were admitted to hospital during this time were excluded from the study. After the end of the study, the checklist was completed again, and three fasting and three non-fasting blood glucose samples were taken; the results were statistically analyzed using McNemar's test. The study included 100 patients, 41.7% male and 58.3% female, aged between 22 and 60 years, with a duration of diabetes ranging from 1 to 15 years. The proportion of patients with abnormal fasting blood glucose decreased from 95% to 48.3% (P < 0.0001), and the proportion with abnormal non-fasting blood glucose decreased from 91.6% to 71.2% (P < 0.001). The research showed that the training on blood glucose control was successful; therefore, it is recommended that more research be done in the field of education to help patients with diabetes live more comfortably.
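
McNemar's test compares the paired before/after classification (abnormal vs. normal blood glucose) of the same patients; a minimal sketch with placeholder cell counts (chosen only to be consistent with the reported percentages, not the study's actual table) is:

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Paired 2x2 table for the same patients before vs. after the education program:
# rows = abnormal/normal before, columns = abnormal/normal after (placeholder counts)
table = np.array([[45, 50],   # abnormal before: 45 stayed abnormal, 50 became normal
                  [ 3,  2]])  # normal before:    3 became abnormal,  2 stayed normal
result = mcnemar(table, exact=False, correction=True)
print(f"chi-squared = {result.statistic:.2f}, p = {result.pvalue:.4g}")
```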

Keywords: nutrition education, diabetes, function, insulin, chronic, metabolic control

Procedia PDF Downloads 137
26598 Direct Translation vs. Pivot Language Translation for Persian-Spanish Low-Resourced Statistical Machine Translation System

Authors: Benyamin Ahmadnia, Javier Serrano

Abstract:

In this paper, we compare two different approaches for translating from Persian to Spanish, a language pair with a scarce parallel corpus. The first approach involves direct transfer using a statistical machine translation system, which is available for this language pair. The second approach involves translation through English as a pivot language, which has more translation resources and more advanced translation systems available. The results show that it is possible to achieve better translation quality using English as a pivot language: the pivot approach outperforms direct translation from Persian to Spanish. Our best result is the pivot system, which scores 1.12 BLEU points higher than direct translation.
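
BLEU gaps such as the 1.12 points reported here are obtained by scoring both systems' outputs against the same references; a minimal sketch using NLTK, with toy tokenized sentences in place of the Persian-Spanish test set and bigram BLEU so the tiny example stays well defined, is:

```python
from nltk.translate.bleu_score import corpus_bleu

# One list of reference translations (token lists) per test sentence
references = [[["el", "gato", "está", "en", "la", "alfombra"]]]
direct_output = [["el", "gato", "en", "la", "alfombra"]]
pivot_output  = [["el", "gato", "está", "en", "la", "alfombra"]]

# weights=(0.5, 0.5) restricts the score to unigrams and bigrams for this toy example
bleu_direct = corpus_bleu(references, direct_output, weights=(0.5, 0.5))
bleu_pivot  = corpus_bleu(references, pivot_output, weights=(0.5, 0.5))
print(f"direct: {bleu_direct:.3f}  pivot: {bleu_pivot:.3f}  "
      f"gap: {100 * (bleu_pivot - bleu_direct):.2f} BLEU points")
```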

Keywords: statistical machine translation, direct translation approach, pivot language translation approach, parallel corpus

Procedia PDF Downloads 487
26597 Acoustic Emission Techniques in Monitoring Low-Speed Bearing Conditions

Authors: Faisal AlShammari, Abdulmajid Addali, Mosab Alrashed

Abstract:

It is widely acknowledged that bearing failures are the primary reason for breakdowns in rotating machinery. These failures are extremely costly, particularly in terms of lost production. Roller bearings are widely used in industrial machinery and need to be maintained in good condition to ensure the continuing efficiency, effectiveness, and profitability of the production process. The research presented here is an investigation of the use of acoustic emission (AE) to monitor bearing conditions at low speeds. Many machines, particularly large, expensive machines, operate at speeds below 100 rpm, and such machines are important to industry. However, the overwhelming proportion of studies have investigated the use of AE techniques for condition monitoring of higher-speed machines (typically several hundred rpm, or even higher). Few researchers have investigated the application of these techniques to low-speed machines (< 100 rpm). This paper addresses this omission and establishes which of the available AE techniques are suitable for the detection of incipient faults and the measurement of fault growth in low-speed bearings. The first objective of this programme was to assess the applicability of AE techniques to monitoring low-speed bearings. It was found that the measured statistical parameters successfully monitored bearing conditions at low speeds (10-100 rpm). The second objective was to identify which commonly used statistical parameters derived from the AE signal (RMS, kurtosis, amplitude and counts) could identify the onset of a fault in the outer race. It was found that these parameters effectively identify the presence of a small fault seeded into the outer race. Also, it is concluded that rotational speed has a strong influence on the measured AE parameters, but that they are entirely independent of the load under the tested load and speed conditions.
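
The four signal parameters named above have standard definitions; a short sketch of computing them from a digitized AE record (a placeholder waveform, not measured data) is:

```python
import numpy as np
from scipy.stats import kurtosis

def ae_parameters(signal, threshold):
    """RMS, kurtosis, peak amplitude and threshold-crossing counts of an AE record."""
    return {
        "rms": float(np.sqrt(np.mean(signal**2))),
        "kurtosis": float(kurtosis(signal, fisher=False)),   # 3.0 for a Gaussian signal
        "amplitude": float(np.max(np.abs(signal))),
        # AE counts taken here as positive-going crossings of the detection threshold
        "counts": int(np.sum((signal[:-1] < threshold) & (signal[1:] >= threshold))),
    }

# Placeholder AE waveform: background noise plus a short burst
rng = np.random.default_rng(0)
signal = rng.normal(scale=0.01, size=50_000)
signal[20_000:20_200] += 0.2 * np.sin(np.linspace(0, 40 * np.pi, 200))
print(ae_parameters(signal, threshold=0.05))
```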

Keywords: acoustic emission, condition monitoring, NDT, statistical analysis

Procedia PDF Downloads 248
26596 Effect of Common Yoga Protocol on Reaction Time of Football Players

Authors: Vikram Singh

Abstract:

The objective of the study was to examine the effectiveness of the common yoga protocol on the reaction time (simple visual reaction time, SVRT, measured in milliseconds/seconds) of male football players aged 15 to 21 years. The 40 boys were randomly assigned to two groups, i.e., control and experimental. SVRT for both groups was measured on day 1, and the post-intervention measurement (the common yoga protocol being the intervention) was taken after 45 days of training given to the experimental group only. One-way ANOVA (univariate analysis) and an independent t-test, using the SPSS 23 statistical package, were applied to obtain and analyze the results. There was a significant difference in the simple visual reaction time of the experimental group after 45 days of the yoga protocol (p = .032), t(33.05) = 3.881, p = .000 (two-tailed). The null hypothesis (that there would be no post-measurement differences in the reaction times of the control and experimental groups) was rejected at p < .05; therefore, the alternative hypothesis was accepted.

Keywords: footballers, t-test, yoga protocol, reaction time

Procedia PDF Downloads 253
26595 Normalizing Logarithms of Realized Volatility in an ARFIMA Model

Authors: G. L. C. Yap

Abstract:

Modelling realized volatility with high-frequency returns is popular, as realized volatility is an unbiased and efficient estimator of return volatility. A computationally simple model is fitting the logarithms of the realized volatilities with a fractionally integrated long-memory Gaussian process. The Gaussianity assumption simplifies the parameter estimation using the Whittle approximation. Nonetheless, this assumption may not be met in finite samples, and there may be a need to normalize the financial series. Based on the empirical indices S&P500 and DAX, this paper examines the performance of the linear volatility model pre-treated with normalization compared to its existing counterpart. The empirical results show that, by including normalization as a pre-treatment procedure, the forecast performance outperforms that of the existing model in terms of statistical and economic evaluations.
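
One common way to push a series toward Gaussianity before Whittle estimation is a rank-based inverse normal transform; whether the paper uses exactly this transform is not stated, so the sketch below (with a placeholder series, not S&P500/DAX data) is illustrative only.

```python
import numpy as np
from scipy.stats import norm, rankdata

def inverse_normal_transform(x):
    """Map a series to approximate standard-normal quantiles via its ranks (Blom-type offsets)."""
    ranks = rankdata(x)
    return norm.ppf((ranks - 0.375) / (len(x) + 0.25))

# Placeholder realized-volatility series (skewed on purpose)
rng = np.random.default_rng(0)
rv = rng.gamma(shape=2.0, scale=1e-4, size=1000)
z = inverse_normal_transform(np.log(rv))
print(round(float(z.mean()), 3), round(float(z.std()), 3))   # approximately 0 and 1 after the transform
```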

Keywords: Gaussian process, long-memory, normalization, value-at-risk, volatility, Whittle estimator

Procedia PDF Downloads 354
26594 Realization of a Temperature Based Automatic Controlled Domestic Electric Boiling System

Authors: Shengqi Yu, Jinwei Zhao

Abstract:

This paper presents an analog-circuit-based temperature control system, which is mainly composed of a threshold control signal circuit, a synchronization signal circuit and a trigger pulse circuit. Firstly, the temperature feedback signal function is realized by the temperature sensor TS503F3950E. Secondly, the main control circuit forms the cycle-controlled pulse signal that drives the thyristor switching model. Finally, two reverse-parallel thyristors regulate the output power through their switching state. As a consequence, this is a modernized and energy-saving domestic electric heating system.

Keywords: time base circuit, automatic control, zero-crossing trigger, temperature control

Procedia PDF Downloads 481
26593 A Study of Evolutional Control Systems

Authors: Ti-Jun Xiao, Zhe Xu

Abstract:

Controllability is one of the fundamental issues in control systems. In this paper, we study the controllability of second-order evolutional control systems in Hilbert spaces with memory and boundary controls, which model the dynamic behavior of some viscoelastic materials. Transferring the control problem into a moment problem and showing the Riesz property of a family of functions related to Cauchy problems for some integrodifferential equations, we obtain a general boundary controllability theorem for these second-order evolutional control systems. This controllability theorem is applicable to various concrete 1D viscoelastic systems and recovers some previous related results. It is worth noting that Riesz sequences can be used for numerical computations of the control functions, and the identification of new Riesz sequences is of independent interest for basis-function theory. Moreover, using the Riesz sequences, we obtain the existence and uniqueness of (weak) solutions to these second-order evolutional control systems in Hilbert spaces. Finally, we derive the exact boundary controllability of a viscoelastic beam equation as an application of our abstract theorem.

Keywords: evolutional control system, controllability, boundary control, existence and uniqueness

Procedia PDF Downloads 222
26592 The Influence of Teachers’ Anxiety-Reducing Strategies on Learners’ Foreign Language Anxiety

Authors: Fakieh Alrabai

Abstract:

This study investigated the effects on learner anxiety of anxiety-reducing strategies utilized by English as a foreign language (EFL) teachers in Saudi Arabia. The study was conducted in two stages. In the first stage, sources of foreign language anxiety for Saudi learners of English (N = 596) were identified using the Foreign Language Classroom Anxiety Scale (FLCAS). In the second stage, 465 learners, divided almost equally into two groups (experimental vs. control), and 12 teachers were recruited. Anxiety-reducing strategies were implemented exclusively in the treatment group for approximately eight weeks. The FLCAS was used to assess learners’ FL anxiety levels before and after treatment. Statistical analyses (e.g., ANOVA and ANCOVA) were used to evaluate the study findings. These findings revealed that the intervention led to significantly decreased levels of FL anxiety for learners in the experimental group, compared with increased levels of anxiety for those in the control group.

Keywords: communication apprehension, EFL teaching/learning, fear of negative evaluation, foreign language anxiety

Procedia PDF Downloads 355
26591 Statistical Optimization of Adsorption of a Harmful Dye from Aqueous Solution

Authors: M. Arun, A. Kannan

Abstract:

Textile industries cater to varied customer preferences and contribute substantially to the economy. However, these textile industries also produce a considerable amount of effluents. Prominent among these are the azo dyes, which impart considerable color and toxicity even at low concentrations. Azo dyes are also used as coloring agents in the food and pharmaceutical industries. Despite their applications, azo dyes are also notorious pollutants and carcinogens. Popular techniques like photo-degradation, biodegradation and the use of oxidizing agents are not applicable to all kinds of dyes, as most of them are stable to these techniques. Chemical coagulation produces a large amount of toxic sludge, which is undesirable, and is also ineffective towards a number of dyes. Most of the azo dyes are stable to UV-visible light irradiation and may even resist aerobic degradation. Adsorption has been the most preferred technique owing to its low cost, high capacity and process efficiency, and the possibility of regenerating and recycling the adsorbent. Adsorption is also preferred because it may produce a high-quality treated effluent and is able to remove different kinds of dyes. However, the adsorption process is influenced by many variables whose inter-dependence makes it difficult to identify optimum conditions. The variables include stirring speed, temperature, initial concentration and adsorbent dosage. Further, the internal diffusional resistance inside the adsorbent particle leads to slow uptake of the solute within the adsorbent. Hence, it is necessary to identify optimum conditions that lead to a high capacity and uptake rate for these pollutants. In this work, commercially available activated carbon was chosen as the adsorbent owing to its high surface area. A typical azo dye found in textile effluent waters, viz. the monoazo Acid Orange 10 dye (CAS: 1936-15-8), was chosen as the representative pollutant. Adsorption studies were mainly focused on obtaining equilibrium and kinetic data for the batch adsorption process at different process conditions. Studies were conducted at different stirring speed, temperature, adsorbent dosage and initial dye concentration settings. The full factorial design was the chosen statistical design framework for carrying out the experiments and identifying the important factors and their interactions. The optimum conditions identified from the experimental model were validated with actual experiments at the recommended settings. The equilibrium and kinetic data obtained were fitted to different models and the model parameters were estimated. This gives more detail about the nature of the adsorption taking place. Critical data required to design batch adsorption systems for the removal of Acid Orange 10 dye and the identification of factors that critically influence the separation efficiency are the key outcomes of this research.
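
A full factorial design simply enumerates every combination of the chosen factor levels; a minimal sketch of building such a run table for the four factors named above (the levels are placeholders, not the study's experimental settings) is:

```python
from itertools import product

import pandas as pd

# Placeholder factor levels; the actual experimental levels are not reproduced here.
factors = {
    "stirring_speed_rpm": [200, 400],
    "temperature_C": [30, 45],
    "adsorbent_dose_g_per_L": [0.5, 1.0],
    "initial_dye_mg_per_L": [50, 100],
}

design = pd.DataFrame(list(product(*factors.values())), columns=list(factors.keys()))
print(len(design), "runs")   # 2^4 = 16 runs for a two-level full factorial
print(design.head())
```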

Keywords: acid orange 10, activated carbon, optimum adsorption conditions, statistical design

Procedia PDF Downloads 169
26590 Space Telemetry Anomaly Detection Based On Statistical PCA Algorithm

Authors: Bassem Nassar, Wessam Hussein, Medhat Mokhtar

Abstract:

The crucial concern of satellite operations is to ensure the health and safety of satellites. The worst case from this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can also result in compromised mission objectives. All the data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e., a status or a measurement) to be checked. As a consequence, there is continuous improvement of TM monitoring systems in order to reduce the time required to respond to changes in a satellite's state of health. A fast assessment of the current state of the satellite is thus very important in order to respond to occurring failures. Statistical multivariate latent-variable techniques are among the vital learning tools used to tackle the aforementioned problem coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, in this paper we present a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm could successfully differentiate between these operating conditions. Furthermore, the algorithm provides competent information for prediction as well as adding more insight and physical interpretation to the ADCS operation.
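
A common way to turn PCA into an anomaly monitor is to fit the model on normal-state telemetry and flag samples whose Hotelling's T² or squared prediction error (SPE/Q) exceeds limits estimated from the training data; whether the paper uses exactly these two statistics is not stated, and the sketch below uses placeholder data rather than the actual ADCS telemetry.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X_normal = rng.normal(size=(500, 8))                         # placeholder normal-state telemetry
X_test = np.vstack([rng.normal(size=(50, 8)),
                    rng.normal(loc=4.0, size=(5, 8))])       # last 5 rows mimic a faulty state

scaler = StandardScaler().fit(X_normal)
pca = PCA(n_components=3).fit(scaler.transform(X_normal))

def t2_and_spe(X):
    """Hotelling's T^2 in the retained subspace and squared prediction error (Q) in the residual."""
    Z = scaler.transform(X)
    scores = pca.transform(Z)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)
    residual = Z - pca.inverse_transform(scores)
    spe = np.sum(residual**2, axis=1)
    return t2, spe

t2_train, spe_train = t2_and_spe(X_normal)
t2_lim, spe_lim = np.percentile(t2_train, 99), np.percentile(spe_train, 99)  # empirical 99% limits
t2, spe = t2_and_spe(X_test)
print("flagged samples:", np.where((t2 > t2_lim) | (spe > spe_lim))[0])
```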

Keywords: space telemetry monitoring, multivariate analysis, PCA algorithm, space operations

Procedia PDF Downloads 415
26589 Linear Quadratic Gaussian/Loop Transfer Recovery Flight Control on a Nonlinear Model

Authors: T. Sanches, K. Bousson

Abstract:

As part of the development of a 4D autopilot system for unmanned aerial vehicles (UAVs), i.e., a time-dependent robust trajectory generation and control algorithm, this work addresses the problem of optimal path control based on flight sensor data output that may be unreliable due to noise in data acquisition and/or transmission under certain circumstances. Although several filtering methods, such as the Kalman-Bucy filter or Linear Quadratic Gaussian/Loop Transfer Recovery (LQG/LTR) control, are available, the sheer complexity of the control system, together with the robustness and reliability required of such a system on a UAV for airworthiness-certifiable autonomous flight, required the development of a proper robust filter for a nonlinear system as a way of further mitigating error propagation to the control system and improving its performance. As such, a nonlinear algorithm based upon LQG/LTR, validated through computational simulation testing, is proposed in this paper.

Keywords: autonomous flight, LQG/LTR, nonlinear state estimator, robust flight control

Procedia PDF Downloads 138
26588 Analysis of the Engineering Judgement Influence on the Selection of Geotechnical Parameters Characteristic Values

Authors: K. Ivandic, F. Dodigovic, D. Stuhec, S. Strelec

Abstract:

A characteristic value of a certain geotechnical parameter results from an engineering assessment. Its selection has to be based on technical principles and standards of engineering practice. It has been shown that the results of engineering assessments by different authors for the same problem and input data are significantly dispersed. A survey was conducted in which participants had to estimate the force that causes a 10 cm displacement at the top of an axially compressed pile in situ. Fifty experts from all over the world took part in it. The lowest estimated force value was 42% and the highest was 133% of the force measured in the aforementioned static pile load test. These extreme values result in significantly different technical solutions to the same engineering task. In the case of selecting a characteristic value of a geotechnical parameter, the influence of the engineering assessment can be reduced by using statistical methods. An informative annex of Eurocode 1 prescribes the method of selecting the characteristic values of material properties. This is followed by Eurocode 7, with certain specificities linked to selecting characteristic values of geotechnical parameters. The paper shows the procedure of selecting characteristic values of a geotechnical parameter by using a statistical method with different initial conditions. The aim of the paper is to quantify an engineering assessment in the example of determining a characteristic value of a specific geotechnical parameter. It is assumed that this assessment is a random variable and that its statistical features are determined. For this purpose, a survey was conducted among relevant experts from the field of geotechnical engineering. In conclusion, the results of the survey and of the application of the statistical method are compared.
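
In the statistical approach referred to, a characteristic value is typically taken as a 5% fractile estimated from the sample; the sketch below assumes a normal distribution and a t-based statistical-uncertainty factor, not the exact Eurocode 7 procedure or factor table, and the expert estimates are placeholder values.

```python
import numpy as np
from scipy.stats import t

def characteristic_value_5pct(samples):
    """5% fractile estimated from n samples, assuming a normal distribution (illustrative form)."""
    x = np.asarray(samples, dtype=float)
    n = len(x)
    k_n = t.ppf(0.95, df=n - 1) * np.sqrt(1.0 / n + 1.0)   # assumed statistical-uncertainty factor
    return x.mean() - k_n * x.std(ddof=1)

# Placeholder expert estimates of the pile force (kN), e.g. survey responses
estimates = [420, 510, 480, 650, 700, 560, 590, 470, 530, 610]
print(round(characteristic_value_5pct(estimates), 1), "kN")
```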

Keywords: characteristic values, engineering judgement, Eurocode 7, statistical methods

Procedia PDF Downloads 296
26587 Study on Safety Management of Deep Foundation Pit Construction Site Based on Building Information Modeling

Authors: Xuewei Li, Jingfeng Yuan, Jianliang Zhou

Abstract:

The 21st century has been called the century of human exploitation of underground space. Due to the characteristics of large quantity, tight schedule, low safety reserve and high uncertainty of deep foundation pit engineering, accidents frequently occur in deep foundation pit engineering, causing huge economic losses and casualties. With the successful application of information technology in the construction industry, building information modeling (BIM) has become a research hotspot in the field of architectural engineering. Therefore, the application of BIM and other information and communication technologies (ICTs) in construction safety management is of great significance for improving the level of safety management. This research summarized the mechanism of deep foundation pit engineering accidents through fault tree analysis in order to identify the control factors of deep foundation pit safety management and the deficiencies of traditional construction site safety management. According to the accident cause mechanism and the specific process of deep foundation pit construction, the hazard information of the deep foundation pit construction site was identified, and a hazard list, including early warning information, was obtained. After that, the system framework was constructed by analyzing the demands for early warning information and early warning functions of the deep foundation pit safety management system. Finally, a BIM-based safety management system for deep foundation pit construction sites was developed by combining the database with Web-BIM technology, so as to realize three functions: real-time positioning of construction site personnel, automatic warning when a dangerous area is entered, and real-time monitoring of deep foundation pit structural deformation with automatic warning. This study can initially improve the current situation of safety management on deep foundation pit construction sites. Additionally, active control before the occurrence of deep foundation pit accidents and dynamic control throughout the construction process can be realized, so as to prevent and control the occurrence of safety accidents in deep foundation pit construction.

Keywords: Web-BIM, safety management, deep foundation pit, construction

Procedia PDF Downloads 153
26586 A Decentralized Application for Secure Data Handling of Wireless Networks Using Ethereum Smart Contracts

Authors: Midhun Xavier

Abstract:

This paper introduces a method to verify multi-agent systems in industrial control systems using blockchain technology. The proposed solution makes it possible to record and verify, using Ethereum-based smart contracts, each process that occurs while generating a customized product. Node-RED software agents are developed with the help of semantic web technologies, and these software agents interact with IEC 61499 function blocks to execute the processes. The agent associated with each mechatronic component and its controller can communicate with the blockchain to record the various events that occur during each process, and the corresponding smart contract helps to verify the process order of the customized product.

Keywords: blockchain, Ethereum, node-red, IEC 61499, multi-agent system, MQTT

Procedia PDF Downloads 94
26585 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods

Authors: Devatha Kalyan Kumar, R. Poovarasan

Abstract:

In this paper, we have taken certain important factors and health parameters of diabetes patients, especially children with congenital (pediatric) diabetes, and, using the three methods above, we assess the importance of each attribute in the dataset, thereby determining the most responsible and correlated attributes causing diabetes among young patients. We use the cost optimization, control chart and Spearman methodologies for the real-time application of finding the data efficiency in this diabetes dataset. The Spearman methodology is a correlation methodology used in the software development process to identify the complexity between the various modules of the software. Identifying the complexity is important because, if the complexity is higher, then there is a higher chance of risk occurring in the software. With the use of the control chart, the mean, variance and standard deviation of the data are calculated. With the use of the cost optimization model, we optimize the variables. Hence, we choose the Spearman, control chart and cost optimization methods to assess the data efficiency in diabetes datasets.
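
As a sketch of the two simpler ingredients (placeholder arrays stand in for the patient attributes; the cost optimization model is not shown), Spearman's rank correlation between an attribute and an outcome, and Shewhart-style control chart limits, can be computed as follows:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
attribute = rng.normal(loc=5.5, scale=1.2, size=100)     # placeholder health parameter
outcome = attribute + rng.normal(scale=1.0, size=100)    # placeholder severity score

rho, p_value = spearmanr(attribute, outcome)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")

# Individuals control chart limits for the attribute: mean +/- 3 standard deviations
mean, sd = attribute.mean(), attribute.std(ddof=1)
print(f"center = {mean:.2f}, UCL = {mean + 3*sd:.2f}, LCL = {mean - 3*sd:.2f}")
```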

Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric

Procedia PDF Downloads 256