Search results for: multivariate measurement system analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 39673


39583 The Use of Boosted Multivariate Trees in Medical Decision-Making for Repeated Measurements

Authors: Ebru Turgal, Beyza Doganay Erdogan

Abstract:

Machine learning aims to model the relationship between a response and its features. Researchers in medical decision-making wish to draw conclusions about patients' course and treatment by examining repeated measurements over time, and boosting has become an influential machine learning tool for this purpose. The aim of this study is to demonstrate the use of multivariate tree boosting in this field. The main reason for adopting this approach in decision-making is the ease with which it models complex relationships. To show how multivariate tree boosting can be used to identify important features and feature-time interactions, we used data collected retrospectively from the records of the Ankara University Chest Diseases Department. The dataset includes repeated PF-ratio measurements, with follow-up planned for 120 hours. A set of different models was tested. In conclusion, classification with a weighted combination of classifiers is a reliable method, as has been shown repeatedly in simulations. Furthermore, time-varying variables can be taken into consideration within this framework, making it possible to make accurate decisions in regression and survival problems.
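To make the method concrete, the sketch below shows one common formulation of multivariate tree boosting: each boosting round fits a single multi-output regression tree to the residuals of all repeated measurements jointly, so the correlation across time points is handled by one tree rather than by separate per-outcome models. The data, tree depth, learning rate and number of rounds are illustrative assumptions, not the authors' settings.

```python
# Minimal multivariate tree boosting sketch (synthetic data; hypothetical
# hyperparameters -- the study's exact algorithm and dataset are not shown here).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))             # patient covariates
Y = rng.normal(size=(200, 4))             # repeated measurements (e.g., PF ratio over time)

n_rounds, lr = 50, 0.1
F = np.tile(Y.mean(axis=0), (len(Y), 1))  # initial prediction: per-time-point mean
trees = []
for _ in range(n_rounds):
    residual = Y - F                                            # negative gradient of squared loss
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)  # one multi-output tree
    trees.append(tree)
    F += lr * tree.predict(X)             # update all outcomes at once
print("training MSE:", float(((Y - F) ** 2).mean()))
```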

Keywords: boosted multivariate trees, longitudinal data, multivariate regression tree, panel data

Procedia PDF Downloads 173
39582 Examining the Modular End of Line Control Unit Design Criteria for Vehicle Sliding Door System Slide Profile

Authors: Orhan Kurtuluş, Cüneyt Yavuz

Abstract:

End-of-line control of finished products is important in the automotive industry. Manual inspection of sliding-door tracks is insufficient: faulty products go undetected and reach the customer. In the scope of this study, the design criteria of a PLC-integrated modular end-of-line control unit were examined, and the unit was designed and manufactured to inspect 10 different track profiles for 2 different vehicles, with the objective of minimizing salvage costs by obtaining more sensitive, repeatable and accurate measurement results. The study began with a literature and patent review; the design inputs were then specified, the technical concept was developed, and computer-aided mechanical design, control-system and automation design, design review and design improvement were carried out. High-sensitivity analog laser sensors, probes and modular blocks were used in the unit. Measurements conducted with the system show that the results are more sensitive than those of the previous methods.

Keywords: control unit design, end of line, modular design, sliding door system

Procedia PDF Downloads 406
39581 Influence of Shading on a BIPV System’s Performance in an Urban Context: Case Study of BIPV Systems of the Science Center of Complexity Building of the National and Autonomous University of Mexico in Mexico City

Authors: Viridiana Edith Ardura Perea, José Luis Bermúdez Alcocer

Abstract:

The purpose of this paper is to establish the influence of shading on a Building Integrated Photovoltaic (BIPV) system's performance in an urban context. The PV systems of the Science Center of Complexity (Centro de Ciencias de la Complejidad) Building, on the main campus of the National and Autonomous University of Mexico (UNAM) in Mexico City, were taken as a case study. The PV systems are placed on the rooftop and on the south façade of the building. The south-façade PV system, operating as sunshades, consists of two strings: one at the ground floor and the other at the first floor. According to the building's facility manager, the south-façade PV system generates 42% less electricity per kilowatt peak (kWp) installed than the one on the roof. The methods applied in this study were Solar Radiation Analysis (SRA) simulations performed with the Insight 360 plug-in for Revit 2018® and on-site measurements using specialized tools. The results of the SRA simulations showed that the shading cast by the first-floor PV system on the ground-floor PV system decreases the latter's incident solar radiation by over 50%. The simulation outcome was compared against and validated by the data obtained from the on-site measurements. In conclusion, the loss factor arising from the shading of the PVs is due to the surroundings and to the PV system's own design. The deficient design of the south-façade BIPV system causes critical performance losses and decreases its profitability.

Keywords: building integrated photovoltaics design, energy analysis software, shading losses, solar radiation analysis

Procedia PDF Downloads 149
39580 A Case Study of Conceptual Framework for Process Performance

Authors: Ljubica Milanović Glavan, Vesna Bosilj Vukšić, Dalia Suša

Abstract:

In order to gain a competitive advantage, many companies are focusing on the reorganization of their business processes and implementing process-based management. In this context, assessing process performance is essential because it enables individuals and groups to assess where they stand in comparison to their competitors. In this paper, it is argued that process performance measurement is a necessity for a modern process-oriented company and should be supported by a holistic process performance measurement system. It seems very unlikely that a universal set of performance indicators can be applied successfully to all business processes. Thus, performance indicators must be process-specific and have to be derived from both the strategic enterprise-wide goals and the process goals. Based on an extensive literature review and interviews conducted in a Croatian company, a conceptual framework for a process performance measurement system was developed. The main objective of such a system is to help process managers by providing comprehensive and timely information on the performance of business processes. This information can be used to communicate goals and current performance of a business process directly to the process team, to improve resource allocation and process output with regard to quantity and quality, to give early warning signals, to diagnose the weaknesses of a business process, to decide whether corrective actions are needed, and to assess the impact of actions taken.

Keywords: Croatia, key performance indicators, performance measurement, process performance

Procedia PDF Downloads 641
39579 Multivariate Analytical Insights into Spatial and Temporal Variation in Water Quality of a Major Drinking Water Reservoir

Authors: Azadeh Golshan, Craig Evans, Phillip Geary, Abigail Morrow, Zoe Rogers, Marcel Maeder

Abstract:

Twenty-two physicochemical variables were determined in water samples collected weekly, from January to December 2013, at three sampling stations located within a major drinking water reservoir. Classical Multivariate Curve Resolution Alternating Least Squares (MCR-ALS) analysis was used to investigate the environmental factors associated with the physicochemical variability of the water samples at each of the sampling stations. Matrix-augmentation MCR-ALS (MA-MCR-ALS) was also applied, and the two sets of results were compared for interpretative clarity. Links between these factors, reservoir inflows and catchment land uses were investigated and interpreted in relation to the chemical composition of the water and the resolved geographical distribution profiles. The results suggested that the major factors affecting reservoir water quality were those associated with agricultural runoff, with evidence of influence on algal photosynthesis within the water column. Water quality variability within the reservoir was also found to be strongly linked to physical parameters such as water temperature and the occurrence of thermal stratification. The two methods applied (MCR-ALS and MA-MCR-ALS) led to similar conclusions; however, MA-MCR-ALS appeared to provide results more amenable to the interpretation of temporal and geographical variation than those obtained through classical MCR-ALS.
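For readers unfamiliar with the technique, a bare-bones MCR-ALS iteration is sketched below: the data matrix D (samples x variables) is factored as D ≈ C S, alternating least-squares solutions for the contribution matrix C and the factor-profile matrix S under a non-negativity constraint. In MA-MCR-ALS, the matrices from the three stations would simply be stacked row-wise before the same loop. The data and number of factors here are placeholders.

```python
# Bare-bones MCR-ALS on synthetic data (the study's 22-variable dataset is not
# reproduced; k = 3 factors is an arbitrary choice for illustration).
import numpy as np

rng = np.random.default_rng(1)
D = np.abs(rng.normal(size=(52, 22)))   # weekly samples x physicochemical variables
k = 3                                   # assumed number of underlying factors

C = np.abs(rng.normal(size=(52, k)))    # initial factor contributions
for _ in range(200):
    S, *_ = np.linalg.lstsq(C, D, rcond=None)       # solve D ≈ C S for S
    S = np.clip(S, 0, None)                         # non-negativity constraint
    Ct, *_ = np.linalg.lstsq(S.T, D.T, rcond=None)  # solve D.T ≈ S.T C.T for C.T
    C = np.clip(Ct.T, 0, None)
print("residual norm:", float(np.linalg.norm(D - C @ S)))
```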

Keywords: drinking water reservoir, multivariate analysis, physico-chemical parameters, water quality

Procedia PDF Downloads 254
39578 Prediction of Slaughter Body Weight in Rabbits: Multivariate Approach through Path Coefficient and Principal Component Analysis

Authors: K. A. Bindu, T. V. Raja, P. M. Rojan, A. Siby

Abstract:

A multivariate path-coefficient approach was employed to study the effects of various production and reproduction traits on the slaughter body weight of rabbits. Information on 562 rabbits maintained at the university rabbit farm attached to the Centre for Advanced Studies in Animal Genetics and Breeding, Kerala Veterinary and Animal Sciences University, Kerala State, India, was utilized. The manifest variables used in the study were age and weight of the dam, birth weight, litter size at birth and at weaning, and weight at the first, second and third months. A linear multiple regression analysis was performed with slaughter weight as the dependent variable and the remaining traits as independent variables. The model explained 48.60 percent of the total variation in the market weight of the rabbits. Although the model was significant, the standardized beta coefficients for the independent variables age and weight of the dam, birth weight, and litter sizes at birth and weaning were small, indicating a negligible influence on slaughter weight. The standardized beta coefficient of the second-month body weight was the largest, followed by that of the first-month weight, indicating their major role in determining market weight; all the other factors act only indirectly through these two variables. Hence it was concluded that slaughter body weight can be predicted using the first- and second-month body weights. Principal components were also derived in order to achieve greater accuracy in the prediction of the market weight of rabbits.
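The path coefficients the abstract compares are the standardized beta coefficients of a multiple regression, i.e., the regression run on z-scored variables. A minimal sketch with stand-in data:

```python
# Standardized betas (direct path coefficients) from a multiple regression;
# the data below are synthetic stand-ins, not the 562-rabbit dataset.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(562, 7))   # dam age/weight, birth weight, litter sizes, monthly weights
y = 2 * X[:, 5] + 3 * X[:, 6] + rng.normal(size=562)   # slaughter weight (simulated)

Xz = (X - X.mean(axis=0)) / X.std(axis=0)   # z-score the predictors
yz = (y - y.mean()) / y.std()               # z-score the response
beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
print("standardized betas:", np.round(beta, 3))  # the largest values mark the dominant paths
```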

Keywords: component analysis, multivariate, slaughter, regression

Procedia PDF Downloads 132
39577 Application of Deep Learning in Top Pair and Single Top Quark Production at the Large Hadron Collider

Authors: Ijaz Ahmed, Anwar Zada, Muhammad Waqas, M. U. Ashraf

Abstract:

We demonstrate the performance of a highly efficient tagger, based on deep neural network algorithms, applied to hadronically decaying top-quark pairs as signal, and compare it with QCD multi-jet background events. A significant enhancement of performance in boosted top-quark events is observed with our limited computing resources. We also compare modern machine learning approaches and perform a multivariate analysis of boosted top-pair as well as single-top-quark production through the weak interaction at √s = 14 TeV in proton-proton collisions. The most relevant known background processes are incorporated. Using Boosted Decision Tree (BDT), likelihood, and Multilayer Perceptron (MLP) techniques, the analysis is trained and its performance is compared with the conventional cut-and-count approach.
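As a toy analogue of the MLP branch of such a multivariate analysis, the sketch below trains a small neural network to separate simulated "signal" from "background" events; the six input features and Gaussian toy distributions are invented for illustration and do not represent the kinematic variables actually used.

```python
# Toy signal/background discrimination with an MLP (simulated events only).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
sig = rng.normal(loc=1.0, size=(5000, 6))   # top-pair-like signal features
bkg = rng.normal(loc=0.0, size=(5000, 6))   # QCD multi-jet-like background
X = np.vstack([sig, bkg])
y = np.array([1] * 5000 + [0] * 5000)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=300, random_state=0).fit(Xtr, ytr)
print("test accuracy:", round(clf.score(Xte, yte), 3))
```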

Keywords: top tagger, multivariate, deep learning, LHC, single top

Procedia PDF Downloads 77
39576 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques

Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah

Abstract:

Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or under-estimation of construction cost. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not totally solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological and machine-readable format, so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges encountered in this exploratory study are also reported, and recommendations for future studies are proposed.
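A machine-readable measurement rule might look like the following rdflib sketch; the class and property names are invented for the example and are not taken from the NRM ontology developed in the paper.

```python
# Illustrative triples for an NRM-style measurement-rule ontology (hypothetical
# vocabulary; requires the rdflib package).
from rdflib import Graph, Literal, Namespace, RDF, RDFS
from rdflib.namespace import OWL, XSD

NRM = Namespace("http://example.org/nrm#")
g = Graph()
g.bind("nrm", NRM)

g.add((NRM.MeasurementRule, RDF.type, OWL.Class))
g.add((NRM.WallFinishRule, RDF.type, OWL.Class))
g.add((NRM.WallFinishRule, RDFS.subClassOf, NRM.MeasurementRule))
g.add((NRM.unitOfMeasurement, RDF.type, OWL.DatatypeProperty))
g.add((NRM.WallFinishRule, NRM.unitOfMeasurement, Literal("m2", datatype=XSD.string)))

print(g.serialize(format="turtle"))  # a form BIM tools could consume directly
```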

Keywords: BIM, construction projects, cost estimation, NRM, ontology

Procedia PDF Downloads 513
39575 Inter-Area Oscillation Monitoring in Maghrebian Power Grid Using Phasor Measurement Unit

Authors: M. Tsebia, H. Bentarzi

Abstract:

In interconnected power systems, a phenomenon called inter-area oscillation may be caused by several defects. In this paper, inter-area oscillations in the interconnected power networks of the Maghreb countries have been investigated. Inter-area oscillation monitoring can be enhanced by integrating Phasor Measurement Unit (PMU) technology installed at different locations. The data provided by the PMUs and recorded by the Phasor Data Concentrator (PDC) are used for monitoring, analysis, and control purposes. The proposed approach has been validated by simulation using MATLAB/Simulink.
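The core computation inside a PMU is a phasor estimate over one nominal cycle; the sketch below shows the standard one-cycle DFT estimator on a simulated 50 Hz bus voltage (amplitude, phase and sampling rate are arbitrary illustrative values).

```python
# One-cycle DFT phasor estimation, the quantity a PMU streams to the PDC.
import numpy as np

f0, fs = 50.0, 1600.0              # nominal grid frequency, sampling rate (assumed)
N = int(fs / f0)                   # samples per cycle
t = np.arange(N) / fs
x = 230.0 * np.cos(2 * np.pi * f0 * t + np.deg2rad(30))  # simulated bus voltage

phasor = (2 / N) * np.sum(x * np.exp(-2j * np.pi * f0 * t))
print("magnitude:", round(abs(phasor), 2))                            # ~230 (peak value)
print("angle (deg):", round(float(np.degrees(np.angle(phasor))), 2))  # ~30
```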

Keywords: PMU, inter-area oscillation, Maghrebian power system, Simulink

Procedia PDF Downloads 323
39574 Blood Oxygen Saturation Measurement System Using Broad-Band Light Source with LabVIEW Program

Authors: Myoung Ah Kim, Dong Ho Sin, Chul Gyu Song

Abstract:

Blood oxygen saturation measurement is a well-established, noninvasive photoplethysmographic method for monitoring vital signs. Conventional measurements based on two LED light sources suffer from ambiguity in the measurement principle, and the results are strongly affected by heat and motion artifacts. To solve these problems, a high-accuracy method for measuring blood oxygen saturation using a broadband light source, with an easily understood algorithm, has been proposed. Measurement based on a broadband light source has the advantages of a simple test setup and easy interpretation. The measurement program proposed in this paper combines LabVIEW and MATLAB. Blood oxygen saturation is measured over the wavelength range of 450-750 nm using the light absorption of oxyhemoglobin and deoxyhemoglobin. To prevent hand movement from affecting the measurement, the probe is fixed to a motor stage so that the interval between the sample and the probe is kept constant. Experimental results show that the proposed method noticeably increases accuracy and saves time compared with conventional methods.
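The underlying computation can be illustrated as a least-squares spectral unmixing: the measured broadband absorbance is modeled as a combination of the oxyhemoglobin and deoxyhemoglobin extinction curves, and saturation is the oxyhemoglobin fraction. The extinction curves below are crude synthetic shapes, not tabulated values.

```python
# Least-squares unmixing sketch: SpO2 = HbO2 / (HbO2 + Hb); synthetic spectra.
import numpy as np

wl = np.linspace(450, 750, 100)                 # wavelength grid (nm)
eps_hbo2 = np.exp(-((wl - 540) / 60.0) ** 2)    # stand-in oxyhemoglobin curve
eps_hb = np.exp(-((wl - 660) / 80.0) ** 2)      # stand-in deoxyhemoglobin curve
A = np.column_stack([eps_hbo2, eps_hb])

true = np.array([0.97, 0.03])                   # simulated 97% saturation
measured = A @ true + 0.005 * np.random.default_rng(4).normal(size=wl.size)

conc, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(f"estimated SpO2: {conc[0] / conc.sum():.1%}")
```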

Keywords: oxygen saturation, broad-band light source, CCD, light reflectance theory

Procedia PDF Downloads 417
39573 Integrating the Athena Vortex Lattice Code into a Multivariate Design Synthesis Optimisation Platform in JAVA

Authors: Paul Okonkwo, Howard Smith

Abstract:

This paper describes a methodology to integrate the Athena Vortex Lattice aerodynamic software for automated operation in a multivariate optimisation of the Blended Wing Body aircraft. The Athena Vortex Lattice code, developed at the Massachusetts Institute of Technology by Mark Drela, allows for the aerodynamic analysis of aircraft using the vortex lattice method. Ordinarily, Athena Vortex Lattice operation requires a text file containing the aircraft geometry to be loaded into the AVL solver in order to determine the aerodynamic forces and moments. However, automated operation is required to enable integration into a multidisciplinary optimisation framework. Automated AVL operation within the JAVA design environment nonetheless requires a modification and recompilation of the AVL source code into an executable file capable of running on Windows and other platforms without the X11 libraries. This paper describes the procedure for integrating the FORTRAN-written AVL software for automated operation within the multivariate design synthesis optimisation framework for the conceptual design of the BWB aircraft.

Keywords: aerodynamics, automation, optimisation, AVL, JNI

Procedia PDF Downloads 558
39572 A Use Case-Oriented Performance Measurement Framework for AI and Big Data Solutions in the Banking Sector

Authors: Yassine Bouzouita, Oumaima Belghith, Cyrine Zitoun, Charles Bonneau

Abstract:

A performance measurement framework (PMF) is an essential tool in any organization to assess the performance of its processes. It guides businesses to stay on track with their objectives and to benchmark themselves against the market. With the growing trend of digital transformation of business processes, led by innovations in artificial intelligence (AI) and Big Data applications, developing a mature system capable of capturing the impact of digital solutions across different industries has become a necessity. Based on the conducted research, no such system has been developed in academia or in industry. In this context, this paper covers a variety of methodologies on performance measurement, overviews the major AI and Big Data applications in the banking sector, and provides an exhaustive list of relevant metrics. Consequently, this paper is of interest to both researchers and practitioners. From an academic perspective, it offers a comparative analysis of the reviewed performance measurement frameworks. From an industry perspective, it offers exhaustive research, drawn from market leaders, on the major applications of AI and Big Data technologies across the different departments of an organization. Moreover, it suggests a standardized classification model with a well-defined structure of intelligent digital solutions. The aforementioned classification is mapped to a centralized library that contains an indexed collection of potential metrics for each application. This library is arranged in a manner that facilitates the rapid search and retrieval of relevant metrics. The proposed framework is meant to guide professionals in identifying the most appropriate AI and Big Data applications to adopt. Furthermore, it will help them meet their business objectives through understanding the potential impact of such solutions on the entire organization.

Keywords: AI and Big Data applications, impact assessment, metrics, performance measurement

Procedia PDF Downloads 173
39571 Classification of Generative Adversarial Network Generated Multivariate Time Series Data Featuring Transformer-Based Deep Learning Architecture

Authors: Thrivikraman Aswathi, S. Advaith

Abstract:

In cases where the use of real data is limited, such as when it is hard to obtain access to a large volume of real data, synthetic data generation is needed. This produces high-quality synthetic data while maintaining the statistical properties of a specific dataset. In the present work, a generative adversarial network (GAN) is trained to produce multivariate time series (MTS) data, since MTS data are now being gathered more often in various real-world systems. Furthermore, the GAN-generated MTS data are fed into a transformer-based deep learning architecture that carries out the categorization of the data into predefined classes. Finally, the model is evaluated across various distinct domains by generating corresponding MTS data.
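A minimal version of the classification stage is sketched below in PyTorch: each time step of the multivariate series is projected to a model dimension, passed through a transformer encoder, pooled over time, and classified. All sizes are arbitrary; the paper's architecture is not specified here.

```python
# Minimal transformer-encoder classifier for multivariate time series
# (hypothetical dimensions; requires PyTorch).
import torch
import torch.nn as nn

class MTSClassifier(nn.Module):
    def __init__(self, n_features, n_classes, d_model=64):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)      # embed each time step
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                # x: (batch, time, features)
        h = self.encoder(self.proj(x))
        return self.head(h.mean(dim=1))  # mean-pool over time, then classify

model = MTSClassifier(n_features=8, n_classes=3)
logits = model(torch.randn(16, 50, 8))   # e.g., a batch of GAN-generated series
print(logits.shape)                      # torch.Size([16, 3])
```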

Keywords: GAN, transformer, classification, multivariate time series

Procedia PDF Downloads 90
39570 Multivariate Data Analysis for Automatic Atrial Fibrillation Detection

Authors: Zouhair Haddi, Stephane Delliaux, Jean-Francois Pons, Ismail Kechaf, Jean-Claude De Haro, Mustapha Ouladsine

Abstract:

Atrial fibrillation (AF) is considered the most common cardiac arrhythmia and a major public health burden associated with significant morbidity and mortality. Nowadays, telemedical approaches targeting cardiac outpatients place AF among the most challenging medical issues. Automatic, early, and fast AF detection remains a major concern for healthcare professionals. Several algorithms based on univariate analysis have been developed to detect atrial fibrillation; however, the published results do not show satisfactory classification accuracy. This work aimed at resolving this shortcoming by proposing multivariate data analysis methods for automatic AF detection. Four publicly accessible sets of clinical data (the AF Termination Challenge Database, MIT-BIH AF, the Normal Sinus Rhythm RR Interval Database, and the MIT-BIH Normal Sinus Rhythm Database) were used for assessment. All time series were segmented into 1-min RR-interval windows, and four specific features were then calculated. Two pattern recognition methods, Principal Component Analysis (PCA) and the Learning Vector Quantization (LVQ) neural network, were used to develop classification models. PCA, as a feature reduction method, was employed to find the features important for discriminating between AF and normal sinus rhythm. Despite its very simple structure, the results show that the LVQ model performs better on the analyzed databases than existing algorithms, with high sensitivity and specificity (99.19% and 99.39%, respectively). The proposed AF detector holds several interesting properties and can be implemented with just a few arithmetical operations, which makes it a suitable choice for telecare applications.
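Since LVQ is not part of the common Python libraries, a tiny LVQ1 training loop of the kind the study uses is sketched below on synthetic two-class data; the features, learning rate and epoch count are placeholders for the four RR-interval features described.

```python
# Minimal LVQ1 classifier (synthetic AF vs. normal-rhythm feature vectors).
import numpy as np

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(2, 1, (100, 4))])
y = np.array([0] * 100 + [1] * 100)      # 0 = normal sinus rhythm, 1 = AF

labels = np.array([0, 1])
protos = np.array([X[y == c].mean(axis=0) for c in labels])  # one prototype per class
lr = 0.05
for _ in range(30):                      # LVQ1: pull matching prototype in, push others away
    for xi, yi in zip(X, y):
        j = int(np.argmin(np.linalg.norm(protos - xi, axis=1)))
        step = lr * (xi - protos[j])
        protos[j] += step if labels[j] == yi else -step

pred = labels[np.argmin(np.linalg.norm(X[:, None] - protos, axis=2), axis=1)]
print("training accuracy:", (pred == y).mean())
```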

Keywords: atrial fibrillation, multivariate data analysis, automatic detection, telemedicine

Procedia PDF Downloads 238
39569 Multivariate Analysis of the Relationship between Professional Burnout, Emotional Intelligence and Health Level in Teachers of the University of Guayaquil

Authors: Viloria Marin Hermes, Paredes Santiago Maritza, Viloria Paredes Jonathan

Abstract:

The aim of this study is to assess the prevalence of burnout syndrome in a sample of 600 professors at the University of Guayaquil (Ecuador) using the Maslach Burnout Inventory (M.B.I.). In addition, the health effects of professional burnout were assessed using the General Health Questionnaire (G.H.Q.-28), and the influence of emotional intelligence on the prevention of its symptoms was assessed using the Spanish version of the Trait Meta-Mood Scale (T.M.M.S.-24). After confirmation of the underlying factor structure, the three measurement tools showed high levels of internal consistency, and specific cut-off points were proposed for this group of Latin American academics in the M.B.I. Statistical analysis showed the syndrome is extensively present, particularly at medium levels, with notably low scores for Professional Self-Esteem. The application of Canonical Correspondence Analysis revealed that low levels of self-esteem are related to depression, and a lack of personal resources is related to anxiety and insomnia, whereas the ability to perceive and control emotions and feelings improves perceptions of professional effectiveness and performance.

Keywords: burnout, academics, emotional intelligence, general health, canonical correspondence analysis

Procedia PDF Downloads 339
39568 Performance Management in Higher Education: Lessons from Germany's New Public Management System

Authors: Patrick Oehler, Nicholas Folger

Abstract:

Following a new public management approach, Germany widely reformed its higher education system around the turn of the millennium. Aimed at preparing the country's publicly funded universities and applied science colleges for a century of glory, the reforms led to the introduction of rigid performance measurement and management practices, which disrupted the inert system on all levels. Yet many of the new policies met significant resistance, and some of them had to be reversed over time. Ever since, Germany has struggled to find a balance between its pre- and post-millennial approaches to performance measurement and management. This contribution combines insights from a joint research project created and funded by the German Federal Ministry of Education and Research with the aim of better understanding the effects of its performance measurement and management policies, including those the ministry had implemented over the previous decades. The project brings together researchers from 17 German research institutions who employed a wide range of theories from various disciplines and very diverse research methods to explain performance measurement and management and their consequences for the behavior of various stakeholders in higher education systems. In these projects, performance measurement and management have been researched from three angles: education, research, and third mission. The collaborative project differentiated functional and dysfunctional elements of common performance measurement and management practices and identified key problems with these practices, such as (1) oversimplification of performance indicators, (2) 'overmeasurement' of performance in general, (3) excessive use of quantitative indicators, and (4) a myopic focus on research-centred indicators and a neglect of measures targeting education and the third mission. To address these issues, the collaborative project developed alternative approaches to performance measurement and management, including suggestions for qualitative performance measures; improved supervision, review and evaluation methods; and recommendations on how to better balance education, research, and third mission. The authors would like to share the rich findings of the joint research project with an international audience and discuss their implications for other higher education systems.

Keywords: performance measurement, performance management, new public management, performance evaluation

Procedia PDF Downloads 247
39567 Subpixel Corner Detection for Monocular Camera Linear Model Research

Authors: Guorong Sui, Xingwei Jia, Fei Tong, Xiumin Gao

Abstract:

Camera calibration is a fundamental issue in high-precision non-contact measurement, and it is necessary to analyze the reliability and application range of the linear camera model that is often used in calibration. According to the imaging features of monocular cameras, a camera model based on image pixel coordinates and three-dimensional space coordinates is built. Using our own customized template, the image pixel coordinates are obtained by a subpixel corner detection method. Without considering the aberration of the optical system, feature extraction and linearity analysis of the line segments in the template are performed. Moreover, the experiment is repeated 11 times while varying the measuring distance. Finally, the linearity of the camera is obtained by fitting the 11 groups of data. The camera model measurement results show that the relative error does not exceed 1% and that the repeated measurement error is no more than 0.1 mm in magnitude. Meanwhile, it is found that the model exhibits some measurement differences across different regions and object distances. The experimental results show that this linear model is simple and practical and has good linearity within a certain object distance. These results provide a powerful basis for the establishment of a linear camera model and have potential value for practical engineering measurement.
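The generic form of the corner-extraction step is available in OpenCV; the sketch below refines detected corners to sub-pixel accuracy, which is what feeds the pixel-coordinate side of the linear camera model. The image path and detector parameters are placeholders, not the authors' custom template pipeline.

```python
# Sub-pixel corner refinement with OpenCV (placeholder image and parameters).
import cv2
import numpy as np

gray = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)   # assumed template image
corners = cv2.goodFeaturesToTrack(gray, maxCorners=50, qualityLevel=0.01, minDistance=10)

criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)
refined = cv2.cornerSubPix(gray, np.float32(corners), (5, 5), (-1, -1), criteria)
print(refined.reshape(-1, 2)[:5])   # sub-pixel image coordinates for the camera model
```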

Keywords: camera linear model, geometric imaging relationship, image pixel coordinates, three dimensional space coordinates, sub-pixel corner detection

Procedia PDF Downloads 255
39566 Automated Method Time Measurement System for Redesigning Dynamic Facility Layout

Authors: Salam Alzubaidi, G. Fantoni, F. Failli, M. Frosolini

Abstract:

The dynamic facility layout problem is a critical issue in the competitive industrial market; solving it requires robust design and effective simulation systems. Sustainable simulation requires feeding reliable and accurate data into the system. This paper therefore describes an automated system, integrated into the real environment, that measures the duration of material handling operations, collects the data in real time, and determines the variances between the actual and estimated operation schedules in order to update the simulation software and redesign the facility layout periodically. The automated method-time measurement system collects real data using Radio Frequency Identification (RFID) and Internet of Things (IoT) technologies. Attaching an RFID antenna reader and RFID tags enables the system to identify the location of objects and gather timing data. The durations gathered are processed by calculating the moving average duration of each material handling operation and choosing the shortest material handling path, and the simulation software is then updated to redesign the facility layout to accommodate the shortest (real) operation schedule. Periodic simulation in real time is more sustainable and reliable than a simulation system relying on the analysis of historical data. The case study for this methodology was conducted in cooperation with a workshop team producing mechanical parts. Although there are some technical limitations, this methodology is promising and can be significantly useful in the redesign of manufacturing layouts.
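The duration-averaging step can be illustrated in a few lines; the event format (operation id plus enter/exit timestamps from tag reads) is an assumption made for the example, not the system's actual data schema.

```python
# Moving-average duration per material handling operation from RFID events.
from collections import defaultdict, deque

WINDOW = 5                                            # assumed moving-average window
history = defaultdict(lambda: deque(maxlen=WINDOW))   # op_id -> recent durations

def record(op_id, t_enter, t_exit):
    """Store one observed duration and return the current moving average."""
    history[op_id].append(t_exit - t_enter)
    d = history[op_id]
    return sum(d) / len(d)

# (operation id, enter time, exit time) in seconds, e.g., from tag reads
events = [("pick_A", 0.0, 42.0), ("pick_A", 100.0, 138.0), ("pick_A", 200.0, 245.0)]
for ev in events:
    avg = record(*ev)
print(f"moving-average duration for pick_A: {avg:.1f} s")  # fed back to the simulator
```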

Keywords: dynamic facility layout problem, internet of things, method time measurement, radio frequency identification, simulation

Procedia PDF Downloads 99
39565 Using Discriminant Analysis to Forecast Crime Rate in Nigeria

Authors: O. P. Popoola, O. A. Alawode, M. O. Olayiwola, A. M. Oladele

Abstract:

This research work is based on using discriminant analysis to forecast crime rates in Nigeria between 1996 and 2008. The work is interested in how gender (male and female) relates to offences committed against the government, offences against property, disturbances in public places, murder/robbery offences and other offences. The data used were collected from the National Bureau of Statistics (NBS), and the statistical package SPSS was used to analyse them. Time plots were produced for all 29 offences obtained from the raw data. Eigenvalues, multivariate tests, Wilks' Lambda, standardized canonical discriminant function coefficients and the predicted classifications were estimated. The analysis shows that the distribution of the scores from each function is standardized to have a mean of 0 and a standard deviation of 1, and the magnitudes of the coefficients indicate how strongly the discriminating variables affect the score. Of the 172 cases predicted to belong to the crime-against-government group, 66 were correctly classified and 106 were incorrectly classified. Reviewing the predicted classifications, we found that for most groups the number of correctly predicted cases was lower than the number of incorrectly predicted ones.
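The classification part of such an analysis can be reproduced in a few lines with scikit-learn's linear discriminant analysis; the counts below are simulated, not the NBS data.

```python
# Linear discriminant analysis sketch on simulated offence counts.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(6)
X = rng.poisson(lam=20, size=(172, 5)).astype(float)   # offence counts per record
y = rng.integers(0, 2, size=172)                       # group labels (e.g., gender)

lda = LinearDiscriminantAnalysis().fit(X, y)
scores = lda.transform(X)        # discriminant function scores
print("correctly classified:", int((lda.predict(X) == y).sum()), "of", len(y))
```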

Keywords: discriminant analysis, DA, multivariate analysis of variance, MANOVA, canonical correlation, and Wilks’ Lambda

Procedia PDF Downloads 440
39564 Use of Multivariate Statistical Techniques for Water Quality Monitoring Network Assessment, Case of Study: Jequetepeque River Basin

Authors: Jose Flores, Nadia Gamboa

Abstract:

Proper water quality management requires the establishment of a monitoring network. Evaluation of the efficiency of water quality monitoring networks is therefore needed to ensure high-quality collection of data on critical chemical quality parameters. Unfortunately, in some Latin American countries, water quality monitoring programs are not sustainable in terms of recording historical data or covering environmentally representative sites, wasting time, money and valuable information. In this study, multivariate statistical techniques, such as principal component analysis (PCA) and hierarchical cluster analysis (HCA), are applied to identify the most significant monitoring sites as well as the critical water quality parameters in the monitoring network of the Jequetepeque River basin in northern Peru. The Jequetepeque River basin, like others in Peru, experiences socio-environmental conflicts due to the economic activities developed in the area. Water pollution by trace elements in the upper part of the basin is mainly related to mining activity, while the loss of agricultural land to salinization in the lower part of the basin is caused by the extensive use of groundwater. Since the 1980s, water quality in the basin has been assessed, though not continuously, by public and private organizations, and recently the National Water Authority has established permanent water quality networks in 45 basins in Peru. Although many countries use multivariate statistical techniques for assessing water quality monitoring networks, these instruments have never been applied for that purpose in Peru. For this reason, the main contribution of this study is to demonstrate that multivariate statistical techniques can serve as an instrument for optimizing monitoring networks, using the smallest number of monitoring sites and the most significant water quality parameters, which would reduce costs and improve water quality management in Peru. The main socio-economic activities developed in the basin and the principal stakeholders involved in water management are also identified. Finally, water quality management programs are discussed in terms of their efficiency and sustainability.
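The two techniques named can be combined in a short script: standardize the sites-by-parameters matrix, inspect the variance explained by the leading principal components, and cluster the sites hierarchically to flag redundant ones. The matrix below is a random stand-in for the Jequetepeque data.

```python
# PCA + hierarchical cluster analysis on a sites x parameters matrix (stand-in data).
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
data = rng.normal(size=(45, 12))          # monitoring sites x water quality parameters

Z = StandardScaler().fit_transform(data)  # standardize parameters first
pca = PCA(n_components=3).fit(Z)
print("variance explained:", pca.explained_variance_ratio_.round(2))

tree = linkage(Z, method="ward")          # HCA over the standardized sites
groups = fcluster(tree, t=4, criterion="maxclust")
print("site clusters:", groups)           # sites sharing a cluster are candidates for pruning
```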

Keywords: PCA, HCA, Jequetepeque, multivariate statistical

Procedia PDF Downloads 329
39563 Immigrant Status and System Justification and Condemnation

Authors: Nancy Bartekian, Kaelan Vazquez, Christine Reyna

Abstract:

Immigrants coming into the United States of America may justify the American system (political, economic, healthcare, criminal justice) and see it as functional. This may be because they come from countries that are even less stable than the U.S. and/or because they came to benefit from the promise of the 'American dream', a narrative they might be more likely to believe in if they were willing to undergo the costly and sometimes dangerous process of immigrating. Conversely, native-born Americans, as well as immigrants who have lived in America for a longer period of time, will have had more experience with the various broken systems in America that are dysfunctional, fail to provide adequate services equitably, and/or are steeped in systemic racism and other biases that disadvantage lower-status groups. Thus, our research expects that system justification will decrease, and condemnation increase, with more time spent in the U.S. by immigrant groups. We predict that (a) those not born in the U.S. will be more likely to justify the system, (b) they will also be less likely to condemn the system, and (c) the longer an immigrant has been in the U.S., the less likely they will be to justify, and the more likely to condemn, the system. We will use a mixed-model multivariate analysis of covariance (MANCOVA) and control for race, income, and education. We will also run linear regression models to test whether, for those not born in the U.S., length of time in the United States is associated with a decrease in system justification and an increase in system condemnation. We will also conduct exploratory analyses to see whether the predicted patterns are stronger within certain systems than others (political, economic, healthcare, criminal justice).

Keywords: immigration, system justification, system condemnation, system qualification

Procedia PDF Downloads 69
39562 Use of In-line Data Analytics and Empirical Model for Early Fault Detection

Authors: Hyun-Woo Cho

Abstract:

Automatic process monitoring schemes are designed to give early warnings of unusual process events or abnormalities as soon as possible. To this end, various techniques have been developed and utilized in many industrial processes, including multivariate statistical methods, representation techniques in reduced spaces, and kernel-based nonlinear techniques. This work presents a nonlinear empirical monitoring scheme for batch-type production processes with incomplete process measurement data. While normal operation data are easy to obtain, fault data occur infrequently and are thus difficult to collect. In this work, noise filtering steps are added in order to enhance monitoring performance by eliminating irrelevant information from the data. The performance of the monitoring scheme was demonstrated using batch process data, and the results showed that monitoring performance improved significantly in terms of the fault detection success rate.
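One standard nonlinear variant of such a scheme is kernel-PCA-based monitoring: fit the kernel model on normal batches and flag new batches whose reconstruction error exceeds an empirical control limit. The data, kernel choice and 99% limit below are illustrative assumptions, not the paper's exact method.

```python
# Kernel PCA monitoring sketch: squared prediction error against a control limit.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(8)
normal = rng.normal(size=(300, 10))            # normal operation measurements
faulty = rng.normal(loc=1.5, size=(20, 10))    # shifted, fault-like measurements

kpca = KernelPCA(n_components=4, kernel="rbf", fit_inverse_transform=True).fit(normal)

def spe(X):
    """Squared prediction error after projecting through the kernel model."""
    return ((X - kpca.inverse_transform(kpca.transform(X))) ** 2).sum(axis=1)

limit = np.quantile(spe(normal), 0.99)         # empirical 99% control limit
print("faults flagged:", int((spe(faulty) > limit).sum()), "of", len(faulty))
```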

Keywords: batch process, monitoring, measurement, kernel method

Procedia PDF Downloads 295
39561 Modification of the Athena Vortex Lattice Code for the Multivariate Design Synthesis Optimisation of the Blended Wing Body Aircraft

Authors: Paul Okonkwo, Howard Smith

Abstract:

This paper describes a methodology to integrate the Athena Vortex Lattice aerodynamic software for automated operation in a multivariate optimisation of the Blended Wing Body aircraft. The Athena Vortex Lattice code, developed at the Massachusetts Institute of Technology by Mark Drela, allows for the aerodynamic analysis of aircraft using the vortex lattice method. Ordinarily, Athena Vortex Lattice operation requires a text file containing the aircraft geometry to be loaded into the AVL solver in order to determine the aerodynamic forces and moments. However, automated operation is required to enable integration into a multidisciplinary optimisation framework. Automated AVL operation within the JAVA design environment nonetheless requires a modification and recompilation of the AVL source code into an executable file capable of running on Windows and other platforms without the X11 libraries. This paper describes the procedure for integrating the FORTRAN-written AVL software for automated operation within the multivariate design synthesis optimisation framework for the conceptual design of the BWB aircraft.

Keywords: aerodynamics, automation, optimisation, AVL

Procedia PDF Downloads 625
39560 A Mathematical Model for 3-DOF Rotary Accuracy Measurement Method Based on a Ball Lens

Authors: Hau-Wei Lee, Yu-Chi Liu, Chien-Hung Liu

Abstract:

A mathematical model is presented for a system that measures rotational errors in a shaft using a ball lens. The geometric optical characteristics of the ball lens mounted on the shaft allow the measurement of rotation axis errors in both the radial and axial directions. The equipment used includes two quadrant detectors (QDs), two laser diodes and a ball lens mounted on the rotating shaft to be evaluated. Rotational errors in the shaft cause changes in the optical geometry of the ball lens. The resulting deflection of the laser beams is detected by the QDs, and their output signals are used to determine the rotational errors. The radial and axial rotational errors can be calculated as explained by the mathematical model. Results from system calibration show that the measurement error is within ±1 μm and the resolution is about 20 nm. Using a direct drive (DD) motor as an example, experimental results show a rotational error of less than 20 μm. The most important features of this system are that it does not require the use of expensive optical components, it is small, it is very easy to set up, and its measurements are highly accurate.

Keywords: ball lens, quadrant detector, axial error, radial error

Procedia PDF Downloads 430
39559 An Assessment of Different Blade Tip Timing (BTT) Algorithms Using an Experimentally Validated Finite Element Model Simulator

Authors: Mohamed Mohamed, Philip Bonello, Peter Russhard

Abstract:

Blade Tip Timing (BTT) is a technology concerned with the estimation of both the frequency and amplitude of vibration of rotating blades. A BTT system comprises two main parts: (a) the arrival time measurement system, and (b) the analysis algorithms. Simulators play an important role in the development of the analysis algorithms since they generate blade tip displacement data from simulated blade vibration under controlled conditions. This enables an assessment of the performance of the different algorithms with respect to their ability to accurately reproduce the original simulated vibration. Such an assessment is usually not possible with real engine data since there is no practical alternative to BTT for blade vibration measurement. Most simulators used in the literature are based on a simple spring-mass-damper model to determine the vibration. In this work, a more realistic, experimentally validated simulator based on the Finite Element (FE) model of a bladed disc (blisk) is first presented. It is then used to generate the data needed for the assessment of different BTT algorithms. The FE model is validated using both a hammer test and two FireWire cameras for the mode shapes. A number of autoregressive methods, fitting methods and state-of-the-art inverse methods (i.e., the Russhard method) are compared. All methods are compared with respect to both synchronous and asynchronous excitations, with both single and simultaneous frequencies. The study assesses the applicability of each method to different conditions of vibration, amounts of sampling data, and testing facilities, according to its performance and efficiency under these conditions.

Keywords: blade tip timing, blisk, finite element, vibration measurement

Procedia PDF Downloads 284
39558 Multivariate Dependent Frequency-Severity Modeling of Insurance Claims: A Vine Copula Approach

Authors: Islem Kedidi, Rihab Bedoui Bensalem, Faysal Manssouri

Abstract:

In traditional models of insurance data, the number and size of claims are assumed to be independent. Relaxing this independence assumption, this article explores vine copulas to model the dependence structure between multivariate claim frequency and average claim severity. To illustrate the approach, we use data from the Wisconsin local government property insurance fund, which offers several insurance protections for motor vehicle, property and contractor's equipment claims. Results show that the C-vine copula can better characterize the multivariate dependence structure between frequency and severity. Furthermore, we find significant dependencies, especially between frequency and average severity, among different coverage types.
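The mechanics of copula-based frequency-severity modeling can be shown with a simpler bivariate Gaussian copula standing in for the paper's C-vine: dependent uniforms are generated on the copula scale and mapped through the marginal distributions. All distributions and the dependence strength are assumed for illustration.

```python
# Gaussian-copula sketch of dependent claim frequency and average severity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
rho = 0.6                                            # assumed dependence strength
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=5000)
u = stats.norm.cdf(z)                                # copula sample on [0, 1]^2

frequency = stats.poisson.ppf(u[:, 0], mu=3)         # claim counts (assumed marginal)
severity = stats.gamma.ppf(u[:, 1], a=2, scale=500)  # average claim size (assumed marginal)
print("rank correlation:", round(stats.spearmanr(frequency, severity).correlation, 2))
```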

Keywords: dependency modeling, government insurance, insurance claims, vine copula

Procedia PDF Downloads 172
39557 Artificial Reproduction System and Imbalanced Dataset: A Mendelian Classification

Authors: Anita Kushwaha

Abstract:

We propose a new evolutionary computational model called the Artificial Reproduction System, which is based on the complex process of meiotic reproduction occurring between the male and female cells of living organisms. The Artificial Reproduction System is an attempt at a new computational intelligence approach inspired by theoretical reproduction mechanisms and observed reproduction functions, principles and mechanisms. A reproductive organism is programmed by genes and can be viewed as an automaton, mapping and reducing so as to create copies of those genes in its offspring. In the Artificial Reproduction System, the binding mechanism between male and female cells is studied, parameters are chosen, a network is constructed, and a feedback system for self-regulation is established. The model then applies Mendel's law of inheritance and allele-allele associations, and can be used to analyse imbalanced, multivariate, multiclass and big data. In the experimental study, the Artificial Reproduction System is compared with state-of-the-art classifiers such as SVM, Radial Basis Function networks, neural networks and K-Nearest Neighbors on several benchmark datasets, and the comparison results indicate good performance.

Keywords: bio-inspired computation, nature- inspired computation, natural computing, data mining

Procedia PDF Downloads 235
39556 Feasibility Study and Experiment of On-Site Nuclear Material Identification in Fukushima Daiichi Fuel Debris by Compact Neutron Source

Authors: Yudhitya Kusumawati, Yuki Mitsuya, Tomooki Shiba, Mitsuru Uesaka

Abstract:

After the Fukushima Daiichi nuclear power reactor incident, a large amount of unaccounted-for nuclear fuel debris remains in the reactor core area, which is subject to safeguards and criticality safety. Before precise analysis is performed, preliminary on-site screening and mapping of nuclear debris activity need to be carried out to provide reliable data for planning the extraction of the nuclear debris mass. Through a collaboration project with the Japan Atomic Energy Agency, an on-site nuclear debris screening system using dual-energy X-ray inspection and neutron energy resonance analysis has been established. Using a compact and mobile pulsed neutron source constructed from a 3.95 MeV X-band electron linac, coupled with tungsten as an electron-to-photon converter and beryllium as a photon-to-neutron converter, short-distance neutron time-of-flight measurements can be performed. Experimental results show that this system can measure the neutron energy spectrum up to the 100 eV range with only a 2.5-meter time-of-flight path, owing to the X-band accelerator's short pulse. On-site neutron time-of-flight measurement can thus be used to identify the isotope contents of the nuclear debris through Neutron Resonance Transmission Analysis (NRTA). Preliminary NRTA experiments have been performed with a tungsten sample as a dummy nuclear debris material, since the isotope tungsten-186 has an energy absorption value close to that of uranium-238 (15 eV). The results obtained show that this system can detect energy absorption in the resonance neutron region within 1-100 eV. It can also detect multiple elements in a material at once: an experiment using a combined sample of indium, tantalum and silver showed that it is feasible to identify debris containing mixed materials. This compact neutron time-of-flight measurement system is a good complement to the dual-energy X-ray Computed Tomography (CT) method, which can identify atomic number quantitatively but with 1-mm spatial resolution and large error bars. The combination of these two measurement methods will make it possible to perform on-site nuclear debris screening in the Fukushima Daiichi reactor core area, providing the data for nuclear debris activity mapping.
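The time-of-flight-to-energy conversion behind these measurements is classical kinematics; for the 2.5 m path quoted in the abstract it works out as below (arrival times chosen to span the 1-100 eV resonance region).

```python
# Neutron time-of-flight to energy: E = (1/2) m (L/t)^2, non-relativistic.
m_n = 1.674927e-27   # neutron mass (kg)
eV = 1.602177e-19    # joules per electronvolt
L = 2.5              # flight path (m), as stated in the abstract

def neutron_energy_eV(t_seconds):
    v = L / t_seconds
    return 0.5 * m_n * v**2 / eV

for t_us in (18, 57, 180):   # arrival times in microseconds
    print(f"t = {t_us:3d} us -> E = {neutron_energy_eV(t_us * 1e-6):6.1f} eV")
# roughly 100 eV, 10 eV and 1 eV: the range the system resolves
```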

Keywords: neutron source, neutron resonance, nuclear debris, time of flight

Procedia PDF Downloads 211
39555 Measurement and Analysis of Human Hand Kinematics

Authors: Tamara Grujic, Mirjana Bonkovic

Abstract:

Measurements and quantitative analysis of the kinematic parameters of human hand movements play an important role in different areas such as hand function rehabilitation, the modeling of multi-digit robotic hands, and the development of man-machine interfaces. In this paper, the assessment and evaluation of the reach-to-grasp movement using a computerized and robot-assisted method are described. The experiment involved measuring the hand positions of seven healthy subjects while they grasped three objects of different shapes and sizes. The results showed that three dominant phases of the reach-to-grasp movement could be clearly identified.

Keywords: human hand, kinematics, measurement and analysis, reach-to-grasp movement

Procedia PDF Downloads 439
39554 On the Impact of Oil Price Fluctuations on Stock Markets: A Multivariate Long-Memory GARCH Framework

Authors: Manel Youssef, Lotfi Belkacem

Abstract:

This paper employs multivariate long-memory GARCH models to simultaneously estimate mean and conditional variance spillover effects between oil prices and different financial markets. Since different financial assets are traded on the basis of these market sector returns, it is important for financial market participants to understand the volatility transmission mechanism over time and across these series in order to make optimal portfolio allocation decisions. We examine weekly returns from January 1, 2003 to November 30, 2012 and find evidence of significant transmission of shocks and volatilities between oil prices and some of the examined financial markets. The findings support the idea of cross-market hedging and the sharing of common information by investors.
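The univariate building block of such a model is easy to reproduce with the Python `arch` package; the sketch below fits a GARCH(1,1) to simulated weekly returns and extracts the conditional volatility series that a DCC-type stage would then correlate across markets. The long-memory (FIGARCH) and multivariate steps of the paper are omitted here.

```python
# Univariate GARCH(1,1) leg of a DCC-style analysis (simulated weekly returns).
import numpy as np
from arch import arch_model

rng = np.random.default_rng(10)
returns = 100 * rng.normal(scale=0.02, size=520)   # ~10 years of weekly returns, in %

res = arch_model(returns, vol="GARCH", p=1, q=1).fit(disp="off")
print(res.params)                                  # mu, omega, alpha[1], beta[1]
cond_vol = res.conditional_volatility              # input series for the DCC stage
print("last conditional volatility:", float(round(cond_vol[-1], 3)))
```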

Keywords: oil prices, stock indices returns, oil volatility, contagion, DCC-multivariate (FI) GARCH

Procedia PDF Downloads 501