Search results for: Statistical process control
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9438

5178 Multiscale Modelization of Multilayered Bi-Dimensional Soils

Authors: I. Hosni, L. Bennaceur Farah, N. Saber, R. Bennaceur

Abstract:

Soil moisture content is a key variable in many environmental sciences. Even though it represents a small proportion of the liquid freshwater on Earth, it modulates interactions between the land surface and the atmosphere, thereby influencing climate and weather. Accurate modeling of the above processes depends on the ability to provide a proper spatial characterization of soil moisture. The measurement of soil moisture content allows assessment of soil water resources in the fields of hydrology and agronomy. The second parameter in interaction with the radar signal is the geometric structure of the soil. Most traditional electromagnetic models consider natural surfaces as single-scale, zero-mean, stationary Gaussian random processes. Roughness behavior is characterized by statistical parameters like the Root Mean Square (RMS) height and the correlation length. The main problem is that the agreement between experimental measurements and theoretical values is usually poor due to the large variability of the correlation function, and as a consequence, backscattering models have often failed to predict backscattering correctly. In this study, surfaces are considered as band-limited fractal random processes corresponding to a superposition of a finite number of one-dimensional Gaussian processes, each one having its own spatial scale. Multiscale roughness is characterized by two parameters: the first is proportional to the RMS height, and the other is related to the fractal dimension. Soil moisture is related to the complex dielectric constant. This multiscale description has been adapted to two-dimensional profiles using the bi-dimensional wavelet transform and the Mallat algorithm to describe natural surfaces more correctly. We characterize the soil surface and sub-surface by a three-layer geo-electrical model. The upper layer is described by its dielectric constant, thickness, a multiscale bi-dimensional surface roughness model obtained using the wavelet transform and the Mallat algorithm, and volume scattering parameters. The lower layer is divided into three fictive layers separated by an assumed plane interface. These three layers are modeled by an effective medium characterized by an apparent effective dielectric constant that takes into account the presence of air pockets in the soil. We have adopted the 2D multiscale three-layer small perturbations model, including first air pockets in the soil sub-structure and then a vegetation canopy in the soil surface structure, to simulate the radar backscattering. A sensitivity analysis of the backscattering coefficient with respect to the multiscale roughness and soil moisture has been performed. We then propose to change the dielectric constant of the multilayer medium so that it takes into account the different moisture values of each layer in the soil. A sensitivity analysis of the backscattering coefficient, including the air pockets in the volume structure, with respect to the multiscale roughness parameters and the apparent dielectric constant was carried out. Finally, we study the behavior of the radar backscattering coefficient for a soil having a vegetation layer in its surface structure.
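For illustration, a minimal sketch of the per-scale roughness idea, assuming PyWavelets (`pywt`) and a synthetic height map; the per-scale RMS of the detail coefficients is an illustrative descriptor, not the authors' exact parameterization:

```python
import numpy as np
import pywt

# Synthetic 2-D surface height map standing in for a measured soil profile.
rng = np.random.default_rng(0)
surface = rng.normal(0.0, 1.0, size=(256, 256))  # heights in arbitrary units

# Bi-dimensional wavelet decomposition (Mallat algorithm) into a few dyadic scales.
coeffs = pywt.wavedec2(surface, wavelet="db2", level=4)

# One RMS roughness value per scale, computed from the detail sub-bands
# (horizontal, vertical, diagonal) of that level.
for level, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
    details = np.concatenate([cH.ravel(), cV.ravel(), cD.ravel()])
    rms = np.sqrt(np.mean(details ** 2))
    print(f"scale {level}: RMS of detail coefficients = {rms:.3f}")
```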

Keywords: Multiscale, bi-dimensional, wavelets, SPM, backscattering, multilayer, air pockets, vegetation.

5177 Analysis and Classification of HIV-1 Subtype Viruses by AR Model through Artificial Neural Networks

Authors: O. Yavuz, L. Ozyilmaz

Abstract:

The HIV-1 genome is highly heterogeneous. Due to this variation, the features of the HIV-1 genome vary over a wide range. For this reason, the ability of the virus to cause infection changes depending on different chemokine receptors: R5 HIV viruses use the CCR5 coreceptor, X4 viruses use CXCR4, and R5X4 viruses can utilize both coreceptors. Recently, in bioinformatics, experiments on the HIV-1 genome have been used to classify R5X4 viruses. In this study, R5X4-type HIV viruses were classified using an Auto Regressive (AR) model through Artificial Neural Networks (ANNs). The statistical data of R5X4, R5 and X4 viruses were analyzed using signal processing methods and ANNs. Accessible residues of these virus sequences were obtained and modeled by an AR model, since the dimension of the residue data is large and differs between sequences. Finally, the pre-processed data were used to evolve various ANN structures for determining R5X4 viruses. Furthermore, ROC analysis was applied to the ANNs to show their real performance. The results indicate that R5X4 viruses were successfully classified, with high sensitivity and specificity values in both training and testing ROC analyses for the RBF network, which gives the best performance among the ANN structures.
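A minimal sketch of the feature pipeline described here, assuming Yule-Walker AR estimation and scikit-learn's `MLPClassifier` as a stand-in ANN; the signals and labels are placeholders, not the HIV sequence data:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def ar_coefficients(x, order):
    """Estimate AR(order) coefficients of a 1-D signal via the Yule-Walker equations."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])  # fixed-length feature vector

# Hypothetical numeric profiles derived from accessible-residue data (placeholders).
rng = np.random.default_rng(1)
signals = [rng.normal(size=int(rng.integers(80, 160))) for _ in range(60)]
labels = rng.integers(0, 2, size=60)            # 1 = R5X4, 0 = other (illustrative)

features = np.array([ar_coefficients(s, order=8) for s in signals])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(features, labels)
print("training accuracy:", clf.score(features, labels))
```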

Keywords: Auto-Regressive Model, HIV, Neural Networks, ROC Analysis.

5176 Key Factors Influencing Individual Knowledge Capability in KIFs

Authors: Salman Iqbal

Abstract:

Knowledge management (KM) literature has mainly focused on the antecedents of KM. The purpose of this study is to investigate the effect of specific human resource management (HRM) practices on employee knowledge sharing and its outcome as individual knowledge capability. Based on previous literature, a model is proposed for the study and hypotheses are formulated. The cross-sectional dataset comes from a sample of 19 knowledge intensive firms (KIFs). This study has run an item parceling technique followed by Confirmatory Factor Analysis (CFA) on the latent constructs of the research model. Employees’ collaboration and their interpersonal trust can help to improve their knowledge sharing behaviour and knowledge capability within organisations. This study suggests that in future, by using a larger sample, better statistical insight is possible. The findings of this study are beneficial for scholars, policy makers and practitioners. The empirical results of this study are entirely based on employees’ perceptions and make a significant research contribution, given there is a dearth of empirical research focusing on the subcontinent.

Keywords: Employees’ collaboration, individual knowledge capability, knowledge sharing, monetary rewards, structural equation modelling.

5175 Rule Based Architecture for Collaborative Multidisciplinary Aircraft Design Optimisation

Authors: Nickolay Jelev, Andy Keane, Carren Holden, András Sóbester

Abstract:

In aircraft design, the jump from the conceptual to preliminary design stage introduces a level of complexity which cannot be realistically handled by a single optimiser, be that a human (chief engineer) or an algorithm. The design process is often partitioned along disciplinary lines, with each discipline given a level of autonomy. This introduces a number of challenges including, but not limited to: coupling of design variables; coordinating disciplinary teams; handling of large amounts of analysis data; reaching an acceptable design within time constraints. A number of classical Multidisciplinary Design Optimisation (MDO) architectures exist in academia specifically designed to address these challenges. Their limited use in the industrial aircraft design process has inspired the authors of this paper to develop an alternative strategy based on well established ideas from Decision Support Systems. The proposed rule based architecture sacrifices possibly elusive guarantees of convergence for an attractive return in simplicity. The method is demonstrated on analytical and aircraft design test cases and its performance is compared to a number of classical distributed MDO architectures.

Keywords: Multidisciplinary design optimisation, rule based architecture, aircraft design, decision support system.

5174 Identification of Disease Causing DNA Motifs in Human DNA Using Clustering Approach

Authors: G. Tamilpavai, C. Vishnuppriya

Abstract:

Studying DNA (deoxyribonucleic acid) sequences is useful for understanding biological processes and is applied in fields such as diagnostic and forensic research. DNA carries the hereditary information in humans and almost all other organisms and is passed on to their offspring. Early-stage detection of defective DNA sequences may lead to many developments in the field of bioinformatics. Nowadays, various tedious techniques are used to identify defective DNA. The proposed work analyzes and identifies cancer-causing DNA motifs in a given sequence. Initially, the human DNA sequence is separated into k-mers using a k-mer separation rule. The separated k-mers are clustered using a Self Organizing Map (SOM). Using the Levenshtein distance measure, the cancer-associated DNA motif is identified from the k-mer clusters. Experimental results of this work indicate the presence or absence of the cancer-causing DNA motif. If the cancer-associated DNA motif is found in the DNA, it is declared a cancer-disease-causing DNA sequence; otherwise, the input human DNA is declared a normal sequence. Finally, the elapsed time for finding the cancer-causing DNA motif using cluster formation is calculated and compared with the normal process of finding the motif. Locating the cancer-associated motif is easier with the cluster formation process than with the other one. The proposed work will be an initial aid for genetic-disease-related research.
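A minimal sketch of the k-mer separation and Levenshtein comparison steps, with a placeholder sequence and motif; the SOM clustering stage is omitted:

```python
def kmers(sequence, k):
    """Split a DNA sequence into overlapping k-mers (the k-mer separation step)."""
    return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

dna = "ATGCGTACGTTAGCCGTACGATCG"          # placeholder input sequence
motif = "CGTACG"                           # placeholder cancer-associated motif
hits = [(i, km) for i, km in enumerate(kmers(dna, len(motif)))
        if levenshtein(km, motif) <= 1]    # allow one edit as a tolerance
print("motif present" if hits else "normal sequence", hits)
```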

Keywords: Bioinformatics, cancer motif, DNA, k-mers, Levenshtein distance, SOM.

5173 A Comparative Study of SVM Classifiers and Artificial Neural Networks Application for Rolling Element Bearing Fault Diagnosis using Wavelet Transform Preprocessing

Authors: Commander Sunil Tyagi

Abstract:

The effectiveness of Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers for fault diagnosis of rolling element bearings is presented in this paper. The characteristic features of vibration signals of a rotating driveline, run in its normal condition and with introduced faults, were used as input to the ANN and SVM classifiers. Simple statistical features such as the standard deviation, skewness and kurtosis of the time-domain vibration signal segments, along with the peaks of the signal and the peak of the power spectral density (PSD), are used as input features for the ANN and SVM classifiers. The effect of preprocessing the vibration signal by the Discrete Wavelet Transform (DWT) prior to feature extraction is also studied. The experimental results show that the performance of the SVM classifier in identifying the bearing condition is better than that of the ANN, and that pre-processing the vibration signal by DWT enhances the effectiveness of both the ANN and SVM classifiers.
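A minimal sketch of the feature extraction with optional DWT preprocessing, assuming `pywt` and `scipy`; the vibration record is synthetic and the PSD-peak feature is omitted:

```python
import numpy as np
import pywt
from scipy.stats import skew, kurtosis

def segment_features(segment):
    """Simple statistical features of one vibration-signal segment."""
    return np.array([np.std(segment),
                     skew(segment),
                     kurtosis(segment),
                     np.max(np.abs(segment))])   # peak value

rng = np.random.default_rng(0)
raw = rng.normal(size=4096)                      # placeholder vibration record

# Optional DWT preprocessing: keep the detail band of one decomposition level.
cA, cD = pywt.dwt(raw, "db4")

features_raw = segment_features(raw)
features_dwt = segment_features(cD)
print("raw-signal features :", np.round(features_raw, 3))
print("DWT-detail features :", np.round(features_dwt, 3))
```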

Keywords: ANN, Artificial Intelligence, Fault Diagnosis, Pattern Recognition, Rolling Element Bearing, SVM, Wavelet Transform.

5172 Monte Carlo Analysis and Fuzzy Sets for Uncertainty Propagation in SIS Performance Assessment

Authors: Fares Innal, Yves Dutuit, Mourad Chebila

Abstract:

The object of this work is the probabilistic performance evaluation of safety instrumented systems (SIS), i.e. the average probability of dangerous failure on demand (PFDavg) and the average frequency of dangerous failure (PFH), taking into account the uncertainties related to the different parameters that come into play: failure rate (λ), common cause failure proportion (β), diagnostic coverage (DC)... This leads to an accurate and safe assessment of the safety integrity level (SIL) inherent to the safety function performed by such systems. This aim is in keeping with the requirement of the IEC 61508 standard with respect to handling uncertainty. To do this, we propose an approach that combines (1) Monte Carlo simulation and (2) fuzzy sets. Indeed, the first method is appropriate where representative statistical data are available (using probability density functions of the related parameters), while the latter applies in cases characterized by vague and subjective information (using membership functions). The proposed approach is fully supported by a suitable computer code.
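A minimal sketch of the Monte Carlo part, assuming a single-channel (1oo1) function, the simplified PFDavg ≈ λ_DU·T_proof/2 approximation, and illustrative input distributions:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                               # number of Monte Carlo trials
T_proof = 8760.0                          # proof-test interval in hours (assumed)

# Uncertain input parameters drawn from assumed distributions.
lam = rng.lognormal(mean=np.log(2e-6), sigma=0.4, size=n)   # failure rate [1/h]
dc = rng.uniform(0.6, 0.9, size=n)                          # diagnostic coverage

# Simplified IEC 61508 approximation for a single-channel (1oo1) function:
# PFDavg ~ lambda_DU * T_proof / 2, with lambda_DU the undetected dangerous rate.
lam_du = lam * (1.0 - dc)
pfd_avg = lam_du * T_proof / 2.0

print("median PFDavg :", np.median(pfd_avg))
print("90% interval  :", np.percentile(pfd_avg, [5, 95]))
```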

Keywords: Fuzzy sets, Monte Carlo simulation, Safety instrumented system, Safety integrity level.

5171 Hand Gesture Detection via EmguCV Canny Pruning

Authors: N. N. Mosola, S. J. Molete, L. S. Masoebe, M. Letsae

Abstract:

Hand gesture recognition is a technique used to locate, detect, and recognize a hand gesture. Detection and recognition are concepts of Artificial Intelligence (AI). AI concepts are applicable in Human Computer Interaction (HCI), Expert Systems (ES), etc. Hand gesture recognition can be used in sign language interpretation. Sign language is a visual communication tool used mostly by deaf communities and those with speech disorders. Communication barriers exist when people with speech disorders interact with others. This research aims to build a hand recognition system for Lesotho’s Sesotho and English language interpretation. The system will help to bridge the communication problems encountered by the mentioned communities. The system has various processing modules, consisting of a hand detection engine, an image processing engine, feature extraction, and sign recognition. Detection is the process of identifying an object. The proposed system uses Haar cascade detection with Canny pruning, which applies Canny edge detection, an optimal image processing algorithm, to detect the edges of an object. The system also employs a skin detection algorithm, which performs background subtraction and computes the convex hull and the centroid to assist in the detection process. Recognition is the process of gesture classification. Template matching classifies each hand gesture in real time. The system was tested using various experiments. The results obtained show that time, distance, and light are factors that affect the rate of detection and, ultimately, recognition. The detection rate is directly proportional to the distance of the hand from the camera. Different lighting conditions were considered: the higher the light intensity, the faster the detection rate. Based on the results obtained from this research, the applied methodologies are efficient and provide a plausible solution towards a lightweight, inexpensive system which can be used for sign language interpretation.
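The paper uses EmguCV (C#); the equivalent call in OpenCV's Python bindings is sketched below, assuming a trained hand-gesture cascade XML and an input frame on disk (both placeholders):

```python
import cv2

# A trained hand-gesture cascade XML is assumed to be available locally;
# OpenCV only ships face/eye cascades, so the path below is a placeholder.
cascade = cv2.CascadeClassifier("hand_gesture_cascade.xml")

frame = cv2.imread("frame.jpg")                      # placeholder input frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Canny pruning discards regions with too few edges before running the cascade.
detections = cascade.detectMultiScale(
    gray,
    scaleFactor=1.1,
    minNeighbors=5,
    flags=cv2.CASCADE_DO_CANNY_PRUNING,
    minSize=(40, 40),
)

for (x, y, w, h) in detections:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detected.jpg", frame)
```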

Keywords: Canny pruning, hand recognition, machine learning, skin tracking.

5170 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency

Authors: Rania Alshikhe, Vinita Jindal

Abstract:

Digital innovation has played a crucial role in managing smart transportation. For this, big trajectory data collected from traveling vehicles such as taxis, through installed global positioning system (GPS)-enabled devices, can be utilized. It offers an unprecedented opportunity to trace the movements of vehicles in fine spatiotemporal granularity. This paper aims to explore big trajectory data to measure the travel efficiency of road networks using the proposed statistical travel efficiency measure (STEM) across an entire city. Further, it identifies the causes of low travel efficiency by the proposed least square approximation network-based causality exploration (LANCE). Finally, the resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve the traffic conditions and thus minimize the average travel time from a given point A to point B in the road network. The obtained results show that our proposed approach outperforms the baseline algorithms for measuring the travel efficiency of the road network.

Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE

5169 A Study of Students’ Perceptions Regarding the Effectiveness of Semester and Annual Examination System at Institute of Education and Research

Authors: Ayesha Batool, Saghir Ahmad, Abid Hussain Ch.

Abstract:

The art of examination is probably the most difficult one in the whole range of educational practices. The semester system is a system of examination set within an institute by its own teachers, whereas the annual system is a system of examination constructed and administered by an agency outside the institute; it enables the teacher to estimate the effectiveness of the instruction and students to estimate the progress made by them. On the other hand, the semester system of examinations requires following the curriculum strictly, while the methods of teaching are employed at the choice of teachers. The main purpose of the study was to investigate university students’ perceptions regarding the effectiveness of the semester system and the annual system. The study was quantitative in nature. The sample consisted of 200 students. A five-point Likert-type scale was used to collect the data. Statistical measures such as frequencies, means, standard deviations, and a one-way ANOVA test were applied to analyze the data. The major findings of the study indicated that in the semester system students do not spend much time on political activities and develop their study habits. It also revealed that the annual system of examination does not satisfy the educational aspirations of the students.

Keywords: Effectiveness, semester system, annual system.

5168 Dynamic Variation in Nano-Scale CMOS SRAM Cells Due to LF/RTS Noise and Threshold Voltage

Authors: M. Fadlallah, G. Ghibaudo, C. G. Theodorou

Abstract:

Dynamic variation in memory devices such as the Static Random Access Memory (SRAM) can cause errors in read or write operations. In this paper, the effect of low-frequency and random telegraph noise on the dynamic variation of one SRAM cell is detailed. The effect on circuit noise, speed, and processing time is examined using the Supply Read Retention Voltage (SRRV) and the Read Static Noise Margin. New test run methods are also developed. The obtained simulation results show the importance of noise caused by dynamic variation, and the impact of random telegraph noise on SRAM variability is examined by evaluating the statistical distributions of random telegraph noise amplitude in the pull-up and pull-down transistors. The threshold voltage mismatch between neighboring cell transistors due to intrinsic fluctuations typically contributes to larger reductions in the static noise margin. The contribution of each SRAM transistor to the total dynamic variation has also been identified.

Keywords: Low-frequency noise, Random Telegraph Noise, Dynamic Variation, SRRV.

5167 Hazardous Waste Management of Transmission Line Tower Manufacturing

Authors: S. P. Gautam, P. S. Bundela, R. K. Jain, V. N. Tripathi

Abstract:

The manufacturing of transmission line tower parts generates hazardous waste that requires proper disposal to protect the land from pollution. In the manufacturing process, steel angles, plates, pipes and channels pass through conventional, semi-automatic and CNC machines for cutting, marking, punching, drilling, notching and bending operations. All fabricated material is coated with a thin layer of zinc in a galvanizing plant, where molten zinc is used for coating. Prior to galvanizing, chemicals such as 33% concentrated HCl acid, ammonium chloride and d-oil are used for the pretreatment of the iron. A bath of water with sodium dichromate is used for cooling and protection of the galvanized steel. Furnace oil burners are used for heating. These processes generate zinc dross, zinc ash, ETP sludge and waste pickled acid as hazardous waste. RPG has operated a captive secured landfill (SLF) site for the disposal of this hazardous waste since 1997. After the SLF site reached capacity, its height was raised from ground level, and, as designed, the raised SLF remains in use for waste disposal and functions without leachate or adverse impacts on the environment.

Keywords: Disposal, Drilling, Fabrication, Hazardous waste, Punching.

5166 Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control

Authors: Oliver Ohneiser, Francesca De Crescenzio, Gianluca Di Flumeri, Jan Kraemer, Bruno Berberian, Sara Bagassi, Nicolina Sciaraffa, Pietro Aricò, Gianluca Borghini, Fabio Babiloni

Abstract:

An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks compared to today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and less situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator as well as 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO’s active work in front of the human machine interface (HMI). The derived vigilance level and attention trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the two VAC’s sub-components.

Keywords: Automation, human factors, air traffic controller, MINIMA, OOTL, Out-Of-The-Loop, EEG, electroencephalography, HMI, human machine interface.

5165 Lean Production to Increase Reproducibility and Work Safety in the Laser Beam Melting Process Chain

Authors: C. Bay, A. Mahr, H. Groneberg, F. Döpper

Abstract:

Additive Manufacturing processes are becoming increasingly established in the industry for the economic production of complex prototypes and functional components. Laser beam melting (LBM), the most frequently used Additive Manufacturing technology for metal parts, has been gaining in industrial importance for several years. The LBM process chain – from material storage to machine set-up and component post-processing – requires many manual operations. These steps often depend on the manufactured component and are therefore not standardized. These operations are often not performed in a standardized manner, but depend on the experience of the machine operator, e.g., levelling of the build plate and adjusting the first powder layer in the LBM machine. This lack of standardization limits the reproducibility of the component quality. When processing metal powders with inhalable and alveolar particle fractions, the machine operator is at high risk due to the high reactivity and the toxic (e.g., carcinogenic) effect of the various metal powders. Faulty execution of the operation or unintentional omission of safety-relevant steps can impair the health of the machine operator. In this paper, all the steps of the LBM process chain are first analysed in terms of their influence on the two aforementioned challenges: reproducibility and work safety. Standardization to avoid errors increases the reproducibility of component quality as well as the adherence to and correct execution of safety-relevant operations. The corresponding lean method 5S will therefore be applied, in order to develop approaches in the form of recommended actions that standardize the work processes. These approaches will then be evaluated in terms of ease of implementation and their potential for improving reproducibility and work safety. The analysis and evaluation showed that sorting tools and spare parts as well as standardizing the workflow are likely to increase reproducibility. Organizing the operational steps and production environment decreases the hazards of material handling and consequently improves work safety.

Keywords: Additive manufacturing, lean production, reproducibility, work safety.

5164 Robust Digital Cinema Watermarking

Authors: Sadi Vural, Hiromi Tomii, Hironori Yamauchi

Abstract:

With the advent of digital cinema and digital broadcasting, copyright protection of video data has become one of the most important issues. We present a novel method of watermarking for video image data based on hardware and digital wavelet transform techniques and name it "traceable watermarking", because the watermarked data is constructed before the transmission process and traced after it has been received by an authorized user. In our method, we embed the watermark into the lowest part of each image frame of the decoded video by using a hardware LSI. Digital cinema is an important application for traceable watermarking, since a digital cinema system makes use of watermarking technology during content encoding, encryption, transmission, decoding and all the intermediate processes. The watermark is embedded into randomly selected movie frames using hash functions. The embedded watermark information can be extracted from the decoded video data without any need to access the original movie data. Our experimental results show that the proposed traceable watermarking method for digital cinema systems is much better than conventional watermarking techniques in terms of robustness, image quality, speed, simplicity and robust structure.
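A minimal sketch of hash-based frame selection plus a simple LSB embed with a CRC-32 check, on toy frames; the hardware LSI and wavelet-domain details of the actual method are not reproduced:

```python
import hashlib
import zlib
import numpy as np

def select_frames(user_id, n_frames, n_select):
    """Pick watermark frames pseudo-randomly from a hash of the recipient ID."""
    seed = int.from_bytes(hashlib.sha256(user_id.encode()).digest()[:8], "big")
    rng = np.random.default_rng(seed)
    return sorted(rng.choice(n_frames, size=n_select, replace=False))

def embed_lsb(frame, payload_bits):
    """Write payload bits into the least-significant bits of the frame's first row."""
    marked = frame.copy()
    marked[0, :len(payload_bits)] = (marked[0, :len(payload_bits)] & 0xFE) | payload_bits
    return marked

user_id = "theatre-0042"                              # hypothetical recipient ID
payload = np.frombuffer(user_id.encode(), dtype=np.uint8)
crc = zlib.crc32(payload.tobytes())                   # integrity check value
bits = np.unpackbits(np.append(payload, np.frombuffer(crc.to_bytes(4, "big"), dtype=np.uint8)))

frames = [np.zeros((64, 256), dtype=np.uint8) for _ in range(500)]  # toy video
for idx in select_frames(user_id, len(frames), 5):
    frames[idx] = embed_lsb(frames[idx], bits)
print("watermarked frames:", select_frames(user_id, len(frames), 5))
```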

Keywords: Decoder, Digital content, JPEG2000 Frame, System-On-Chip, traceable watermark, Hash Function, CRC-32.

5163 Decision Analysis Module for Excel

Authors: Radomir Perzina, Jaroslav Ramik

Abstract:

The Analytic Hierarchy Process (AHP) is a frequently used approach for solving decision making problems. There exists a wide range of software programs utilizing that approach. Their main disadvantages are that they are relatively expensive and do not display intermediate calculations. This work introduces a Microsoft Excel add-in called DAME – Decision Analysis Module for Excel. Compared to other computer programs, DAME is free, can work with scenarios or multiple decision makers, and displays intermediate calculations. Users can structure their decision models into three levels – scenarios/users, criteria and variants. Items on all levels can be evaluated either by weights or by pair-wise comparisons. Three different methods are provided for the evaluation of the weights of the criteria, the variants and the scenarios – Saaty’s Method, the Geometric Mean Method and Fuller’s Triangle Method. Multiplicative and additive syntheses are supported. The proposed software package is demonstrated on a couple of illustrative examples of real-life decision problems.
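A minimal sketch of the Geometric Mean Method on an illustrative pair-wise comparison matrix (values are not from the paper):

```python
import numpy as np

# Pair-wise comparison matrix for three criteria on Saaty's 1-9 scale
# (values are illustrative, not taken from the paper).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Geometric Mean Method: the weight of each item is the geometric mean of its
# row, normalised so that the weights sum to one.
row_gm = np.prod(A, axis=1) ** (1.0 / A.shape[1])
weights = row_gm / row_gm.sum()
print("criteria weights:", np.round(weights, 3))

# Saaty's consistency check from the principal eigenvalue.
lambda_max = float(np.max(np.linalg.eigvals(A).real))
ci = (lambda_max - A.shape[0]) / (A.shape[0] - 1)
print("consistency index:", round(ci, 4))
```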

Keywords: Analytic hierarchy process, multi-criteria decision making, pair-wise comparisons, Microsoft Excel, Scenarios.

5162 The Transfer of Low-Cost Housing in South Africa: Problems and Impediments

Authors: Gert Van Schalkwyk, Chris Cloete

Abstract:

South Africa is experiencing a massive housing backlog in urban low-cost housing. A backlog in the transfer of low-cost housing units is exacerbated by various impediments and delays that exist in the current legal framework. Structured interviews were conducted with 45 practicing conveyancers and 15 deeds office examiners at the Deeds Office in Pretoria, South Africa. One of the largest, the Deeds Office in Pretoria implements a uniform registration process and can be regarded as representative of other deeds offices in South Africa. It was established that a low percentage of low-cost properties are freely transferable. The main economic impediments are the absence of financing and the affordability or payment of rates and taxes to local government. Encroachment of buildings on neighbouring stands caused by enlargement of existing small units on small stands also cause long-term unresolved legal disputes. In addition, as transfer of properties is dependent on the proper functioning of administrative functions of various government departments, the adverse service delivery of government departments hampers transfer. Addressing the identified problems will contribute to a more sustainable process for the transfer of low-cost housing units in South Africa.

Keywords: Conveyancing, low-cost housing, South Africa, tenure, transfer, titling.

5161 Public Economic Efficiency and Case-Based Reasoning: A Theoretical Framework to Police Performance

Authors: Javier Parra-Domínguez, Juan Manuel Corchado

Abstract:

At present, public efficiency is a concept that aims to maximize the return on public investment by minimizing the use of resources and maximizing the outputs. The concept takes into account statistical criteria drawn up according to techniques such as DEA (Data Envelopment Analysis). The purpose of the current work is to consider, more precisely, the theoretical application of CBR (Case-Based Reasoning), from economics and computer science, as a preliminary step to improving the efficiency of law enforcement agencies (public sector). With the aim of increasing the efficiency of the public sector, we have entered a phase whose main objective is the implementation of new technologies. Our main conclusion is that the application of computer techniques, such as CBR, has become key to the efficiency of the public sector, which continues to require economic valuation based on methodologies such as DEA. As a theoretical result and conclusion, the incorporation of CBR systems will reduce the number of inputs and increase, theoretically, the number of outputs generated based on previous computer knowledge.

Keywords: Case-based reasoning, knowledge, police, public efficiency.

5160 Analysis of Web User Identification Methods

Authors: Renáta Iváncsy, Sándor Juhász

Abstract:

Web usage mining has become a popular research area, as a huge amount of data is available online. These data can be used for several purposes, such as web personalization, web structure enhancement, web navigation prediction, etc. However, the raw log files are not directly usable; they have to be preprocessed in order to transform them into a suitable format for different data mining tasks. One of the key issues in the preprocessing phase is to identify web users. Identifying users based on web log files is not a straightforward problem, thus various methods have been developed. There are several difficulties that have to be overcome, such as client-side caching, changing and shared IP addresses and so on. This paper presents three different methods for identifying web users. Two of them are the most commonly used methods in web log mining systems, whereas the third one is our novel approach that uses a complex cookie-based method to identify web users. Furthermore, we also take steps towards identifying the individuals behind the impersonal web users. To demonstrate the efficiency of the new method we developed an implementation called the Web Activity Tracking (WAT) system that aims at a more precise distinction of web users based on log data. We present a statistical analysis created by the WAT on real data about the behavior of Hungarian web users and a comprehensive analysis and comparison of the three methods.
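A minimal sketch contrasting the identification heuristics on a toy log; the cookie-based fallback is a simplification of the paper's method:

```python
# Minimal combined log records: (ip, user_agent, cookie_id or None).
log = [
    ("10.0.0.1", "Firefox/3.0", "c-111"),
    ("10.0.0.1", "Firefox/3.0", "c-111"),
    ("10.0.0.1", "Opera/9.5",   None),      # same (shared) IP, different browser
    ("10.0.0.2", "Firefox/3.0", "c-222"),
]

def users_by_ip(records):
    return {ip for ip, _, _ in records}

def users_by_ip_and_agent(records):
    return {(ip, ua) for ip, ua, _ in records}

def users_by_cookie(records):
    """Cookie first; fall back to IP + user agent when no cookie is present."""
    ids = set()
    for ip, ua, cookie in records:
        ids.add(cookie if cookie is not None else (ip, ua))
    return ids

print("IP only        :", len(users_by_ip(log)))
print("IP + user agent:", len(users_by_ip_and_agent(log)))
print("cookie-based   :", len(users_by_cookie(log)))
```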

Keywords: Data preparation, Tracking individuals, Web user identification, Web usage mining.

5159 Material Concepts and Processing Methods for Electrical Insulation

Authors: R. Sekula

Abstract:

Epoxy composites are broadly used as electrical insulation for high voltage applications, since only such materials can fulfill the particular mechanical, thermal, and dielectric requirements. However, the properties of the final product are strongly dependent on a proper manufacturing process with minimized material failures, such as excessive shrinkage, voids and cracks. Therefore, the application of proper materials (epoxy, hardener, and filler) and process parameters (mold temperature, filling time, filling velocity, initial temperature of internal parts, gelation time), as well as design and geometric parameters, are essential for the final quality of the produced components. In this paper, an approach for three-dimensional modeling of all molding stages, namely filling, curing and post-curing, is presented. The reactive molding simulation tool is based on a commercial CFD package and includes dedicated models describing viscosity and reaction kinetics that have been successfully implemented to simulate the reactive nature of the system with its exothermic effect. A dedicated simulation procedure for stress and shrinkage calculations, as well as simulation results, are also presented in the paper. The second part of the paper is dedicated to recent developments in formulations of functional composites for electrical insulation applications, focusing on thermally conductive materials. Concepts based on filler modifications for epoxy electrical composites are presented, including the resulting properties. Finally, having in mind tough environmental regulations, in addition to current process and design aspects, an approach to product re-design is presented, focusing on the replacement of the epoxy material with a thermoplastic one. Such a “design-for-recycling” method is one of the new directions associated with the development of new material and processing concepts for electrical products and brings many additional research challenges. One successful product is presented to illustrate the methodology.
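As an illustration of the reaction-kinetics part, a sketch that integrates the Kamal autocatalytic cure model, an assumed, commonly used model rather than the tool's actual formulation; all parameter values are illustrative:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Kamal autocatalytic cure model: d(alpha)/dt = (k1 + k2 * alpha**m) * (1 - alpha)**n,
# with Arrhenius rate constants. All parameter values are illustrative.
R = 8.314            # J/(mol K)
T = 273.15 + 140.0   # isothermal mold temperature [K]
A1, E1 = 2.0e5, 7.0e4
A2, E2 = 5.0e5, 6.5e4
m, n = 0.5, 1.5

k1 = A1 * np.exp(-E1 / (R * T))
k2 = A2 * np.exp(-E2 / (R * T))

def cure_rate(t, alpha):
    a = np.clip(alpha[0], 0.0, 1.0)
    return [(k1 + k2 * a**m) * (1.0 - a)**n]

sol = solve_ivp(cure_rate, t_span=(0.0, 3600.0), y0=[1e-4],
                t_eval=np.linspace(0.0, 3600.0, 7))
for t, a in zip(sol.t, sol.y[0]):
    print(f"t = {t:6.0f} s  degree of cure = {a:.3f}")
```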

Keywords: Curing, epoxy insulation, numerical simulations, recycling.

5158 Identity Management in Virtual Worlds Based on Biometrics Watermarking

Authors: S. Bader, N. Essoukri Ben Amara

Abstract:

With the technological development and rise of virtual worlds, these spaces are becoming more and more attractive for cybercriminals, hidden behind avatars and fictitious identities. Since access to these spaces is not restricted or controlled, some impostors take advantage of gaining unauthorized access and practicing cybercrime. This paper proposes an identity management approach for securing access to virtual worlds. The major purpose of the suggested solution is to install a strong security mechanism to protect virtual identities represented by avatars. Thus, only legitimate users, through their corresponding avatars, are allowed to access the platform resources. Access is controlled by integrating an authentication process based on biometrics. In the registration request process, a user fingerprint is enrolled and then encrypted into a watermark utilizing a cancelable and non-invertible algorithm for its protection. After a user personalizes their representative character, the biometric mark is embedded into the avatar through a watermarking procedure. The authenticity of the avatar identity is verified when it requests authorization for access. We have evaluated the proposed approach on a dataset of avatars from various virtual worlds, and we have registered promising performance results in terms of authentication accuracy, acceptance and rejection rates.

Keywords: Identity management, security, biometrics authentication and authorization, avatar, virtual world.

5157 Delaunay Triangulations Efficiency for Conduction-Convection Problems

Authors: Bashar Albaalbaki, Roger E. Khayat

Abstract:

This work is a comparative study on the effect of Delaunay triangulation algorithms on discretization error for conduction-convection conservation problems. A structured triangulation and many unstructured Delaunay triangulations using three popular algorithms for node placement strategies are used. The numerical method employed is the vertex-centered finite volume method. It is found that when the computational domain can be meshed using a structured triangulation, the discretization error is lower for structured triangulations compared to unstructured ones for only low Peclet number values, i.e. when conduction is dominant. However, as the Peclet number is increased and convection becomes more significant, the unstructured triangulations reduce the discretization error. Also, no statistical correlation between triangulation angle extremums and the discretization error is found using 200 samples of randomly generated Delaunay and non-Delaunay triangulations. Thus, the angle extremums cannot be an indicator of the discretization error on their own and need to be combined with other triangulation quality measures, which is the subject of further studies.
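A minimal sketch of generating an unstructured Delaunay triangulation and extracting its angle extremums, assuming SciPy; node placement here is plain random sampling:

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(3)
points = rng.random((200, 2))                 # random nodes in the unit square
tri = Delaunay(points)

def triangle_angles(p0, p1, p2):
    """Interior angles (degrees) of one triangle via the law of cosines."""
    a = np.linalg.norm(p1 - p2)
    b = np.linalg.norm(p0 - p2)
    c = np.linalg.norm(p0 - p1)
    A = np.degrees(np.arccos(np.clip((b**2 + c**2 - a**2) / (2 * b * c), -1.0, 1.0)))
    B = np.degrees(np.arccos(np.clip((a**2 + c**2 - b**2) / (2 * a * c), -1.0, 1.0)))
    return A, B, 180.0 - A - B

angles = [ang for simplex in tri.simplices
          for ang in triangle_angles(*points[simplex])]
print("min angle: %.2f deg, max angle: %.2f deg" % (min(angles), max(angles)))
```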

Keywords: Conduction-convection problems, Delaunay triangulation, discretization error, finite volume method.

5156 Laboratory Indices in Late Childhood Obesity: The Importance of DONMA Indices

Authors: Orkide Donma, Mustafa M. Donma, Muhammet Demirkol, Murat Aydin, Tuba Gokkus, Burcin Nalbantoglu, Aysin Nalbantoglu, Birol Topcu

Abstract:

Obesity in childhood establishes a ground for adulthood obesity. Morbid obesity in particular is an important problem for children because of associated diseases such as diabetes mellitus, cancer and cardiovascular diseases. In this study, body mass index (BMI), body fat ratios, anthropometric measurements and ratios were evaluated together with different laboratory indices in the evaluation of obesity in morbidly obese (MO) children. Children with nutritional problems participated in the study. Written informed consent was obtained from the parents, and the study protocol was approved by the Ethics Committee. Sixty-two MO girls aged 129.5±35.8 months and 75 MO boys aged 120.1±26.6 months were included in the scope of the study. WHO BMI percentiles for age and sex were used to assess the children, with those higher than the 99th percentile classified as morbidly obese. Anthropometric measurements of the children were recorded after their physical examination. Bio-electrical impedance analysis was performed to measure fat distribution. Anthropometric ratios, body fat ratios, Index-I and Index-II as well as insulin sensitivity indices (ISIs) were calculated. Girls as well as boys were binary grouped according to the frequently used cut-off points: homeostasis model assessment-insulin resistance (HOMA-IR) index of <2.5 and >2.5, fasting glucose to insulin ratio (FGIR) of <6 and >6, and quantitative insulin sensitivity check index (QUICKI) of <0.33 and >0.33. They were evaluated based upon their BMIs; arms, legs, trunk and whole body fat percentages; body fat ratios such as fat mass index (FMI), trunk-to-appendicular fat ratio (TAFR) and whole body fat ratio (WBFR); and anthropometric measures and ratios [waist-to-hip, head-to-neck, thigh-to-arm, thigh-to-ankle, height/2-to-waist, height/2-to-hip circumference (C)]. The SPSS/PASW 18 program was used for statistical analyses, with p≤0.05 accepted as the level of statistical significance. All of the fat percentages showed differences between below and above the specified cut-off points in girls when evaluated with HOMA-IR and QUICKI. Differences were observed only in the arms fat percentage for HOMA-IR and the legs fat percentage for QUICKI in boys (p≤0.05). FGIR was unable to detect any differences in the fat percentages of boys. Head-to-neck C was the only anthropometric ratio recommended for use with all ISIs (p≤0.001 for both girls and boys in HOMA-IR; p≤0.001 for girls and p≤0.05 for boys in FGIR and QUICKI). Indices recommended for use in both genders were Index-I, Index-II, HOMA/BMI and log HOMA (p≤0.001). FMI was also a valuable index when evaluated with HOMA-IR and QUICKI (p≤0.001). An important point was that HOMA/BMI and log HOMA remained highly significant when also evaluated with the other indices, FGIR and QUICKI (p≤0.001); these parameters, along with Index-I, were unique at this level of significance for all children. In conclusion, well-accepted ratios or indices may not be valid for the evaluation of both genders. This study has emphasized the limiting properties for boys. This is particularly important for the selection process of some ratios and/or indices during clinical studies. Gender difference should be taken into consideration in the evaluation of the ratios or indices recommended for use, particularly within the scope of obesity studies.
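For reference, the standard ISI formulas used for the grouping (the study-specific DONMA Index-I/Index-II are not reproduced); the input values are illustrative, not study data:

```python
from math import log10

def homa_ir(glucose_mg_dl, insulin_uU_ml):
    """HOMA-IR with glucose in mg/dL and insulin in microU/mL."""
    return glucose_mg_dl * insulin_uU_ml / 405.0

def quicki(glucose_mg_dl, insulin_uU_ml):
    """QUICKI = 1 / (log10(insulin) + log10(glucose))."""
    return 1.0 / (log10(insulin_uU_ml) + log10(glucose_mg_dl))

def fgir(glucose_mg_dl, insulin_uU_ml):
    """Fasting glucose to insulin ratio."""
    return glucose_mg_dl / insulin_uU_ml

# Illustrative fasting values for one child (not study data).
glucose, insulin = 92.0, 14.0
print("HOMA-IR :", round(homa_ir(glucose, insulin), 2), "(cut-off 2.5)")
print("QUICKI  :", round(quicki(glucose, insulin), 3), "(cut-off 0.33)")
print("FGIR    :", round(fgir(glucose, insulin), 2), "(cut-off 6)")
```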

Keywords: Anthropometry, childhood obesity, gender, insulin sensitivity index.

5155 Perceived Ease-of-Use and Intention to Use E-Government Services in Ghana: The Moderating Role of Perceived Usefulness

Authors: Isaac Kofi Mensah

Abstract:

Public sector organizations, ministries, departments and local government agencies are adopting e-government as a means to provide efficient and quality service delivery to citizens. The purpose of this research paper is to examine the extent to which perceived usefulness (PU) of e-government services moderates the relationship between perceived ease-of-use (PEOU) of e-government services and intention to use (IU) e-government services in Ghana. A structured research questionnaire instrument was developed and administered to 700 potential respondents in Ghana, of which 693 responded, representing 99% of the questionnaires distributed. The Technology Acceptance Model (TAM) was used as the theoretical framework for the study. The Statistical Package for Social Science (SPSS) was used to capture and analyze the data. The results indicate that, even though predictors such as PU and PEOU are the main determinants of citizens’ intention to adopt and use e-government services in Ghana, the relationship between PEOU and IU of e-government services in Ghana was not shown to be significantly moderated by the PU of e-government services. The implications of this finding for theory and practice are further discussed.
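A minimal sketch of how such a moderation test can be run as an interaction-term regression, using statsmodels on simulated scores (the study itself used SPSS):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated Likert-style scores standing in for the survey data (n = 693 in the study).
rng = np.random.default_rng(7)
n = 693
peou = rng.normal(3.5, 0.8, n)
pu = rng.normal(3.8, 0.7, n)
iu = 0.4 * peou + 0.5 * pu + 0.05 * peou * pu + rng.normal(0, 0.6, n)

df = pd.DataFrame({"IU": iu, "PEOU": peou, "PU": pu})

# Moderation is tested by the PEOU x PU interaction term: a non-significant
# coefficient on PEOU:PU means PU does not moderate the PEOU -> IU relationship.
model = smf.ols("IU ~ PEOU * PU", data=df).fit()
print(model.params)
print("interaction p-value:", model.pvalues["PEOU:PU"])
```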

Keywords: E-government services, intention to use, moderating role, perceived ease-of-use, perceived usefulness, Ghana, technology acceptance model.

5154 Tagged Grid Matching Based Object Detection in Wavelet Neural Network

Authors: R. Arulmurugan, P. Sengottuvelan

Abstract:

Object detection using a Wavelet Neural Network (WNN) plays a major role in image processing analysis. The existing cluster-based algorithm for co-saliency object detection operates on multiple images, but its co-saliency detection results are not suitable for handling multi-scale image objects in a WNN. The existing Super Resolution (SR) scheme for landmark images identifies the corresponding regions in the images and reduces the mismatching rate, but its structure-aware matching criterion does not attend to detecting multiple regions in SR images and fails to improve the object detection rate. To detect objects in high-resolution remote sensing images, a Tagged Grid Matching (TGM) technique is proposed in this paper. The TGM technique consists of three main components: object determination, object searching and object verification in the WNN. First, object determination in the TGM technique specifies the position and size of objects in the current image; specifying the position and size using a hierarchical grid makes it easy to determine multiple objects. Second, object searching in the TGM technique is carried out using cross-point searching; the cross points of the objects are selected to speed up the searching process and reduce the detection time. The final component performs the object verification process in the TGM technique for detecting the dissimilarity of objects in the current frame. The verification process matches the search-result grid points with the stored grid points to easily detect the objects using the Gabor wavelet transform. The implementation of the TGM technique offers a significant improvement in multi-object detection rate, processing time, precision factor and detection accuracy level.

Keywords: Object Detection, Cross-point Searching, Wavelet Neural Network, Object Determination, Gabor Wavelet Transform, Tagged Grid Matching.

5153 Supplementation of Annatto (Bixa orellana)-Derived δ-Tocotrienol Produced High Number of Morula through Increased Expression of 3-Phosphoinositide-Dependent Protein Kinase-1 (PDK1) in Mice

Authors: S. M. M. Syairah, M. H. Rajikin, A-R. Sharaniza

Abstract:

Several embryonic cellular mechanisms, including the cell cycle, growth and apoptosis, are regulated by the phosphatidylinositol-3-kinase (PI3K)/Akt signaling pathway. The goal of the present study is to determine the effects of annatto (Bixa orellana)-derived δ-tocotrienol (δ-TCT) on the regulation of PI3K/Akt genes in murine morulae. Twenty-four 6-8 week old (23-25 g) female balb/c mice were randomly divided into four groups (G1-G4; n=6). The groups were subjected to the following treatments for 7 consecutive days: G1 (control) received tocopherol-stripped corn oil, G2 was given 60 mg/kg/day of a δ-TCT mixture (containing 90% delta and 10% gamma isomers), G3 was given 60 mg/kg/day of pure δ-TCT (>98% purity) and G4 received 60 mg/kg/day of α-TOC. On Day 8, females were superovulated with 5 IU Pregnant Mare’s Serum Gonadotropin (PMSG) for 48 hours followed by 5 IU human Chorionic Gonadotropin (hCG) before being mated with males at a ratio of 1:1. Females were sacrificed by cervical dislocation for embryo collection 48 hours post-coitum. About fifty morulae from each group were used in the gene expression analyses using the Affymetrix QuantiGene Plex 2.0 Assay. The present data showed a significant increase (p<0.05) in the average number (mean ± SEM) of morulae produced in G2 (27.32 ± 0.23), G3 (25.42 ± 0.21) and G4 (27.21 ± 0.34) compared to the control group (G1: 14.61 ± 0.25). This is parallel with the high expression of the PDK1 gene, with increases of 2.75-fold (G2), 3.07-fold (G3) and 3.59-fold (G4) compared to G1. From the present data, it can be concluded that supplementation with δ-TCT(s) and α-TOC induced high expression of PDK1 in G2-G4, which enhanced PI3K/Akt signaling activity, resulting in the increased number of morulae.

Keywords: Embryonic development, morula, nicotine, vitamin E.

5152 Use of Agricultural Waste for the Removal of Nickel Ions from Aqueous Solutions: Equilibrium and Kinetics Studies

Authors: Manjeet Bansal, Diwan Singh, V. K. Garg, Pawan Rose

Abstract:

The potential of economically cheaper, cellulose-containing natural materials such as rice husk was assessed for nickel adsorption from aqueous solutions. The effects of pH, contact time, sorbent dose, initial metal ion concentration and temperature on the uptake of nickel were studied in a batch process. The removal of nickel was dependent on the physico-chemical characteristics of the adsorbent, the adsorbate concentration and the other studied process parameters. The sorption data were correlated with the Langmuir, Freundlich and Dubinin-Radushkevich (D-R) adsorption models, and it was found that the Freundlich and Langmuir isotherms fitted the data well. Maximum nickel removal was observed at pH 6.0. The efficiency of rice husk for nickel removal was 51.8% for dilute solutions at a 20 g L-1 adsorbent dose. FTIR, SEM and EDAX were recorded before and after adsorption to explore the number and position of the functional groups available for nickel binding on the studied adsorbent and the changes in surface morphology and elemental constitution of the adsorbent. A pseudo-second order model explains the nickel kinetics most effectively. Reusability of the adsorbent was examined by desorption, in which HCl eluted 78.93% of the nickel. The results revealed that nickel is considerably adsorbed on rice husk, and this could be an economic method for the removal of nickel from aqueous solutions.
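A minimal sketch of fitting the Langmuir and Freundlich isotherms with SciPy; the equilibrium data are illustrative, not the paper's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)."""
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    """Freundlich isotherm: qe = KF * Ce**(1/n)."""
    return KF * Ce ** (1.0 / n)

# Illustrative equilibrium data (Ce in mg/L, qe in mg/g), not the paper's measurements.
Ce = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 80.0])
qe = np.array([1.8, 3.1, 4.6, 5.9, 6.4, 6.7])

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[7.0, 0.05])
(KF, n), _ = curve_fit(freundlich, Ce, qe, p0=[1.0, 2.0])
print(f"Langmuir  : qmax = {qmax:.2f} mg/g, KL = {KL:.3f} L/mg")
print(f"Freundlich: KF = {KF:.2f}, n = {n:.2f}")
```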

Keywords: Adsorption, nickel, SEM, EDAX.

5151 Effectiveness of Biopesticides against Insect Pests and Quality of Pomelo (Citrus maxima Merr.)

Authors: U. Pangnakorn, S. Chuenchooklin

Abstract:

The effects of a biopesticide based on wood vinegar and extracted substances from three medicinal plants – non taai yak (Stemona tuberosa Lour), boraphet (Tinospora crispa Mier) and derris (Derris elliptica Roxb) – were tested on five-year-old pomelo trees. The selected pomelo trees were evaluated for insect pest control and fruit quality. The experimental site was located in a farmer’s orchard in Phichit Province, Thailand, and the study was undertaken during the drought season (December to March). The plant extracts and wood vinegar were evaluated in 6 treatments: 1) water as control; 2) wood vinegar; 3) S. tuberosa Lour; 4) T. crispa Mier; 5) D. elliptica Roxb; 6) mixed (wood vinegar + S. tuberosa Lour + T. crispa Mier + D. elliptica Roxb). The experiment was laid out in a randomized complete block (RCB) design with 6 treatments and 3 replications per treatment. The results showed that T. crispa Mier was the most effective in reducing the populations of thrips (Scirtothrips dorsalis Hood) and citrus leaf miner (Phyllocnistis citrella Stainton), at 14.10 and 15.37 respectively, followed by the mixed treatment, D. elliptica Roxb, S. tuberosa Lour and wood vinegar, with significant differences. Additionally, T. crispa Mier promoted high quality of the harvested pomelo in terms of skin thickness at 12.45 mm, and S. tuberosa Lour gave high fruit quality in terms of firmness (276.5 kg/cm2) and brix (11.0%).

Keywords: Wood vinegar, Medicinal plants, Pomelo (Citrus maxima Merr.), Thrips (Scirtothrips dorsalis Hood), Citrus leaf miner (Phyllocnistis citrella Stainton).

5150 An Intelligent Cascaded Fuzzy Logic Based Controller for Controlling the Room Temperature in Hydronic Heating System

Authors: Vikram Jeganathan, A. V. Sai Balasubramanian, N. Ravi Shankar, S. Subbaraman, R. Rengaraj

Abstract:

Heating systems are a necessity for regions which face extreme cold weather throughout the year. To maintain a comfortable temperature inside a given place, heating systems making use of hydronic boilers are used; the principle of a single-pipe system serves as a base for their working. It is mandatory for these heating systems to control the room temperature, thus maintaining a warm environment. In this paper, the concept of regulation of the room temperature over a wide range is established by using an Adaptive Fuzzy Controller (AFC). This fuzzy controller automatically detects changes in the outside temperature and correspondingly maintains the inside temperature at a comfortable value. Two separate AFCs are put to use to carry out this function: one to determine the quantity of heat needed to reach the prospective temperature required and to set the desired temperature; the other to control the position of the valve, which is directly proportional to the error between the present room temperature and the user-desired temperature. The fuzzy logic controls the position of the valve as per the requirement of heat. The amount by which the valve opens or closes is controlled by 5 knob positions, which vary from minimum to maximum, thereby regulating the amount of heat flowing through the valve. For the given test system data, different de-fuzzifier methods have been implemented and the results compared. In order to validate the effectiveness of the proposed approach, a fuzzy controller has been designed by obtaining test data from a real-time system. The simulations are performed in MATLAB and are verified with standard system data. The proposed approach can be implemented for real-time applications.
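A minimal sketch of the valve-position fuzzy logic with five output levels and weighted-average defuzzification, written in Python for illustration (the paper's controller was built in MATLAB); the membership breakpoints and valve levels are assumptions:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Five fuzzy sets on the temperature error (degC) mapped to five valve positions (%).
# Breakpoints and valve levels are illustrative, not taken from the paper.
ERROR_SETS = [(-6, -4, -2), (-4, -2, 0), (-2, 0, 2), (0, 2, 4), (2, 4, 6)]
VALVE_POS = np.array([0.0, 25.0, 50.0, 75.0, 100.0])   # closed ... fully open

def valve_position(setpoint, room_temp):
    error = np.clip(setpoint - room_temp, -6.0, 6.0)
    memberships = np.array([tri(error, *s) for s in ERROR_SETS])
    if memberships.sum() == 0.0:
        return 0.0
    # Weighted-average (centroid-style) defuzzification over the five knob levels.
    return float(memberships @ VALVE_POS / memberships.sum())

for temp in (16.0, 19.0, 21.0, 23.0):
    print(f"room {temp:4.1f} degC -> valve {valve_position(21.0, temp):5.1f} % open")
```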

Keywords: Adaptive fuzzy controller, Hydronic heating system

5149 Condition Monitoring in the Management of Maintenance in a Large Scale Precision CNC Machining Manufacturing Facility

Authors: N. Ahmed, A. J. Day, J. L. Victory, L. Zeall, B. Young

Abstract:

The manufacture of large-scale precision aerospace components using CNC machining requires a highly effective maintenance strategy to ensure that the required accuracy can be achieved over many hours of production. This paper reviews a strategy for a maintenance management system based on Failure Mode Avoidance, which uses advanced techniques and technologies to underpin a predictive maintenance strategy. It is shown how condition monitoring (CM) is important for predicting potential failures in high precision machining facilities and achieving intelligent and integrated maintenance management. There are two distinct ways in which CM can be applied. One is to monitor key process parameters and observe trends which may indicate a gradual deterioration of accuracy in the product. The other is to use CM techniques to monitor high-status machine parameters, enabling trends to be observed which can be corrected before machine failure and downtime occur. It is concluded that the key to developing a flexible and intelligent maintenance framework in any precision manufacturing operation is the ability to evaluate machine tool condition reliably and routinely using condition monitoring techniques within a framework of Failure Mode Avoidance.
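A minimal sketch of the first CM route, monitoring a key process parameter against Shewhart 3-sigma limits with a drift check; the parameter and data are illustrative:

```python
import numpy as np

# Daily readings of one key process parameter (e.g. spindle bearing temperature);
# the values are synthetic and only illustrate the monitoring logic.
rng = np.random.default_rng(5)
baseline = rng.normal(62.0, 0.8, 30)                    # in-control history
recent = baseline[-10:] + np.linspace(0.0, 3.0, 10)     # slow drift upward

mean, sigma = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma           # Shewhart 3-sigma limits

alarms = [(i, round(x, 2)) for i, x in enumerate(recent) if not lcl <= x <= ucl]
trend = np.polyfit(np.arange(len(recent)), recent, 1)[0]

print(f"control limits: [{lcl:.2f}, {ucl:.2f}]")
print("out-of-control points:", alarms)
print(f"trend slope: {trend:.2f} per reading -> schedule maintenance before failure"
      if trend > 0.1 else "no significant drift")
```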

Keywords: Maintenance, Condition Monitoring, CNC, Machining, Accuracy, Capability, Key Process Parameters, Critical Parameters
