Search results for: features engineering methods for forecasting
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6316


5326 Fault Detection via Stability Analysis for the Hybrid Control Unit of HEVs

Authors: Kyogun Chang, Yoon Bok Lee

Abstract:

Fault detection determines fault existence and detection time. This paper discusses a two-layered fault detection method to enhance reliability and safety. The two-layered method consists of fault detection at the component-level controllers and at the system-level controllers. Component-level controllers detect faults by using limit checking, model-based detection, and data-driven detection, while system-level controllers perform detection by stability analysis, which can detect unknown changes. System-level controllers compare the detection results obtained via stability analysis with the fault signals from the lower-level controllers. This paper addresses fault detection methods via stability analysis and suggests fault detection criteria for nonlinear systems. The fault detection method is applied to the hybrid control unit of a military hybrid electric vehicle so that the hybrid control unit can detect faults of the traction motor.
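
As a rough illustration of the component-level checks mentioned in this abstract (limit checking and model-based detection), the sketch below flags a fault in a hypothetical traction-motor speed signal; the first-order model, thresholds, and data are invented for the example and are not taken from the paper.

```python
import numpy as np

def limit_check(signal, low, high):
    """Component-level limit checking: flag samples outside the allowed band."""
    return (signal < low) | (signal > high)

def model_residual_check(u, y, a=0.99, b=0.001, threshold=5.0):
    """Model-based detection: compare measured output y against a first-order
    model y_model[k+1] = a*y_model[k] + b*u[k]; large residuals suggest a fault."""
    y_model = np.zeros_like(y)
    for k in range(len(u) - 1):
        y_model[k + 1] = a * y_model[k] + b * u[k]
    return np.abs(y - y_model) > threshold

# Hypothetical traction-motor data: constant command u and measured speed y,
# with an additive fault injected after t = 6 s.
t = np.arange(0, 10, 0.01)
u = 100.0 * np.ones_like(t)
rng = np.random.default_rng(0)
y = 10.0 * (1 - np.exp(-t)) + rng.normal(0, 0.2, t.size)
y[600:] += 8.0

fault = limit_check(y, low=-1.0, high=15.0) | model_residual_check(u, y)
print("first flagged sample:", np.argmax(fault), "of", fault.size)
```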

Keywords: Two Layered Fault Detection, Stability Analysis, Fault-Tolerant Control

5325 An Efficient Biometric Cryptosystem using Autocorrelators

Authors: R. Bremananth, A. Chitra

Abstract:

Cryptography provides a secure manner of information transmission over an insecure channel. It authenticates messages based on the key, not on the user, and it requires a lengthy key to encrypt and decrypt the messages being sent and received. Such keys, however, can be guessed or cracked. Moreover, maintaining and sharing lengthy, random keys for the enciphering and deciphering process is a critical problem in cryptography systems. A new approach is described for generating a crypto key, which is acquired from a person's iris pattern. In the biometric field, a template created by a biometric algorithm can only be authenticated by the same person. Among biometric templates, iris features can efficiently distinguish individuals and produce fewer false positives in a larger population. This type of iris code distribution provides low intra-class variability, which helps the cryptosystem confidently decrypt messages with an exact match of the iris pattern. In the proposed approach, the iris features are extracted using multi-resolution wavelets, producing a 135-bit iris code for each subject that is used for encrypting and decrypting the messages. Autocorrelators are used to recall the original messages from the partially corrupted data produced by the decryption process. The approach aims to resolve the repudiation and key management problems. Results were analyzed for both a conventional iris cryptography system (CIC) and a non-repudiation iris cryptography system (NRIC), and show that this new approach provides considerably high authentication in the enciphering and deciphering processes.

Keywords: Autocorrelators, biometrics cryptography, iris patterns, wavelets.

5324 A Comparison of Shunt Active Power Filter Control Methods under Non-Sinusoidal and Unbalanced Voltage Conditions

Authors: H. Abaali, M. T. Lamchich, M. Raoufi

Abstract:

A variety of reference current identification methods for the shunt active power filter (SAPF), namely the instantaneous active and reactive power method, the instantaneous active and reactive current method, and the synchronous detection method, are evaluated and compared under ideal, non-sinusoidal, and unbalanced voltage conditions. The SAPF performance for the investigated identification methods is tested with a nonlinear load. The simulation results, obtained with the Matlab Power System Blockset Toolbox on a complete structure, are presented and discussed.
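
For readers unfamiliar with the instantaneous active and reactive power method, a minimal sketch of the underlying p-q computation is given below, assuming balanced sinusoidal test signals; the sign convention for q differs between references, and the full SAPF reference-current extraction (filtering of the oscillating power components) is not reproduced.

```python
import numpy as np

def clarke(xa, xb, xc):
    """Power-invariant Clarke (abc -> alpha-beta) transform."""
    x_alpha = np.sqrt(2.0 / 3.0) * (xa - 0.5 * xb - 0.5 * xc)
    x_beta = np.sqrt(2.0 / 3.0) * (np.sqrt(3.0) / 2.0) * (xb - xc)
    return x_alpha, x_beta

def instantaneous_pq(va, vb, vc, ia, ib, ic):
    """Instantaneous real power p and imaginary power q (p-q theory);
    note that the sign convention for q varies between references."""
    v_alpha, v_beta = clarke(va, vb, vc)
    i_alpha, i_beta = clarke(ia, ib, ic)
    p = v_alpha * i_alpha + v_beta * i_beta
    q = v_beta * i_alpha - v_alpha * i_beta
    return p, q

# Balanced sinusoidal example: p and q come out (nearly) constant; under
# distorted or unbalanced voltages their oscillating parts are what the
# SAPF reference currents are built from.
t = np.linspace(0, 0.04, 800)
w = 2 * np.pi * 50
va, vb, vc = np.cos(w * t), np.cos(w * t - 2 * np.pi / 3), np.cos(w * t + 2 * np.pi / 3)
ia, ib, ic = np.cos(w * t - 0.3), np.cos(w * t - 2 * np.pi / 3 - 0.3), np.cos(w * t + 2 * np.pi / 3 - 0.3)
p, q = instantaneous_pq(va, vb, vc, ia, ib, ic)
print("mean p:", p.mean(), "mean q:", q.mean())
```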

Keywords: Shunt active power filter, current perturbation, non-sinusoidal and unbalanced voltage conditions.

5323 Exploring the Need to Study the Efficacy of VR Training Compared to Traditional Cybersecurity Training

Authors: Shaila Rana, Wasim Alhamdani

Abstract:

Effective cybersecurity training is of the utmost importance, given the plethora of attacks that continue to increase in complexity and ubiquity, and an engaging, interactive platform can support retention of the training material. VR cybersecurity training, however, remains a starkly understudied discipline: measurements of effectiveness vary throughout the literature, with surveys and observations being the two most common forms of evaluation, and studies that directly compare the effectiveness of VR cybersecurity training against traditional methods are still required. This paper proposes a methodology to compare the two cybersecurity training methods and their effectiveness. The proposed framework includes developing both VR and traditional cybersecurity training and delivering them to at least 100 users; a quiz and a survey will then be administered and statistically analyzed to determine whether there is a difference in knowledge retention and user satisfaction. The aim of this paper is to draw attention to the need to study VR cybersecurity training and its effectiveness compared to traditional training methods, and thereby to contribute an effective way to train users for security awareness. If VR training is found to be more effective, this could create a new direction for cybersecurity training practices.
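
A minimal sketch of the kind of between-group statistical analysis the proposed framework calls for (quiz scores from a VR group and a traditional group) might look like the following; the scores are synthetic and the choice of tests is an assumption, not the paper's prescribed analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical quiz scores (0-100) for 50 VR-trained and 50 traditionally trained users
vr_scores = rng.normal(loc=78, scale=10, size=50)
traditional_scores = rng.normal(loc=72, scale=10, size=50)

# Welch's t-test: does mean knowledge retention differ between the two groups?
t_stat, p_value = stats.ttest_ind(vr_scores, traditional_scores, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")

# Non-parametric alternative if the scores are not normally distributed
u_stat, p_u = stats.mannwhitneyu(vr_scores, traditional_scores, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_u:.4f}")
```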

Keywords: Virtual reality cybersecurity training, VR cybersecurity training, traditional cybersecurity training, evaluating efficacy.

5322 Virtual Mechanical Engineering Education – A Case Study

Authors: S. H. R. Lo

Abstract:

Virtual engineering technology has undergone rapid progress in recent years and is being adopted increasingly by manufacturing companies across many engineering disciplines. There is an increasing demand from industry for qualified virtual engineers, who should be able to apply engineering principles and mechanical design methods within a commercial software package environment. This is a challenge for university engineering education, which traditionally tends to lack the integration of knowledge and skills required for solving real-world problems. In this paper, a case study presents recent developments in an MSc Mechanical Engineering course at the Department of Engineering and Technology at MMU, and in particular two units, Simulation of Mechanical Systems (SMS) and Computer Aided Fatigue Analysis (CAFA), that emphasize virtual engineering education and promote the integration of knowledge acquisition, skill training, and industrial application.

Keywords: Computational modelling and simulation, mechanical engineering education.

5321 An IM-COH Algorithm Neural Network Optimization with Cuckoo Search Algorithm for Time Series Samples

Authors: Wullapa Wongsinlatam

Abstract:

The back propagation algorithm (BP) is a widely used technique for training artificial neural networks and has been applied to time series problems; its well-known difficulties include long training times, a tendency to fall into local minima, and sensitivity to the initial weights and bias. This paper proposes an improvement of the BP technique called the IM-COH algorithm (IM-COH). By combining the IM-COH algorithm with the cuckoo search algorithm (CS), the result is the cuckoo search improved control output hidden layer algorithm (CS-IM-COH). This new algorithm handles the sensitivity to the initial weights and bias better than the original BP algorithm. In this research, the CS-IM-COH algorithm is compared with the original BP, the IM-COH, and the original BP with CS (CS-BP). Four selected benchmark time series samples are used for illustration. The research shows that the CS-IM-COH algorithm gives the best forecasting results on the selected samples.

Keywords: Artificial neural networks, back propagation algorithm, time series, local minima problem, metaheuristic optimization.

5320 Bioinformatics Profiling of Missense Mutations

Authors: I. Nassiri, B. Goliaei, M. Tavassoli

Abstract:

The ability to distinguish missense nucleotide substitutions that contribute to a harmful effect from those that do not is a difficult problem, usually addressed through functional in vivo analyses. In this study, instead of current biochemical methods, the effects of missense mutations on protein structure and function were assayed by means of computational methods and information from databases. To this end, the effects of new missense mutations in exon 5 of the PTEN gene on protein structure and function were examined. The gene coding for PTEN has been identified and localized to chromosome region 10q23.3 as a tumor suppressor gene. Applying these methods showed that the c.319G>A and c.341T>G missense mutations, which were recognized in patients with breast cancer and Cowden disease, could be pathogenic. This approach could be used for the analysis of missense mutations in other genes.

Keywords: Bioinformatics, missense mutations, PTEN tumor suppressor gene.

5319 Dynamic Self-Scheduling of Pumped-Storage Power Plant in Energy and Ancillary Service Markets Using Sliding Window Technique

Authors: P. Kanakasabapathy, Radhika S.

Abstract:

In the competitive electricity market environment, the profit of a pumped-storage plant in the energy market can be maximized by operating it as a generator when the market clearing price is high, and as a pump, moving water from the lower reservoir to the upper reservoir, when the price is low. An optimal self-scheduling plan has been developed for a pumped-storage plant, carried out on a weekly basis in order to maximize the profit of the plant, taking into account all the major uncertainties such as sudden ancillary service delivery requests and price forecasting errors. For a pumped-storage power plant to operate in a real-time market, successive self-scheduling has to be performed by considering the forecast of the day-ahead market and the reservoir storage as modified by the ancillary service requests of the previous day. A sliding window technique has been used for this successive self-scheduling to ensure profit for the plant.
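
A toy sketch of sliding-window self-scheduling against a day-ahead price forecast is shown below; it uses a simple price-quantile heuristic rather than the BPSO optimization implied by the keywords, and the prices, plant ratings, and omission of reservoir constraints are all simplifying assumptions.

```python
import numpy as np

def schedule_window(prices, gen_mw=100.0, pump_mw=80.0, low_q=0.25, high_q=0.75):
    """Toy plan for one window of hourly price forecasts: pump when the price is
    in the lowest quartile, generate in the highest, stay idle otherwise."""
    lo, hi = np.quantile(prices, [low_q, high_q])
    plan = np.where(prices >= hi, gen_mw, np.where(prices <= lo, -pump_mw, 0.0))
    # Revenue when generating, cost when pumping (plan is negative there);
    # reservoir limits and round-trip efficiency are ignored for brevity.
    profit = float(np.sum(plan * prices))
    return plan, profit

# Sliding window over a week of hourly forecasts: re-plan each day with the
# latest 24-hour forecast (synthetic prices here).
rng = np.random.default_rng(1)
week_prices = 40 + 15 * np.sin(np.linspace(0, 14 * np.pi, 168)) + rng.normal(0, 3, 168)
for day in range(7):
    window = week_prices[day * 24:(day + 1) * 24]
    plan, profit = schedule_window(window)
    print(f"day {day}: scheduled profit over the window = {profit:.0f}")
```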

Keywords: Ancillary services, BPSO, Power System Economics (Electricity markets), Self-Scheduling, Sliding Window Technique.

5318 Tests for Gaussianity of a Stationary Time Series

Authors: Adnan Al-Smadi

Abstract:

One of the primary uses of higher order statistics in signal processing has been the detection and estimation of non-Gaussian signals in Gaussian noise of unknown covariance. This is motivated by the ability of higher order statistics to suppress additive Gaussian noise. In this paper, several methods to test a given process for non-Gaussianity are presented. These methods include the histogram plot, the kurtosis test, and hypothesis testing using cumulants and the bispectrum of the available sequence. The hypothesis testing is performed by constructing a statistic to test whether the bispectrum of the given signal is non-zero. A zero bispectrum is not a proof of Gaussianity; hence, other tests such as the kurtosis test should also be employed. Examples are given to demonstrate the performance of the presented methods.
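
As a small illustration of the kurtosis test mentioned above, the following sketch compares a Gaussian and a non-Gaussian sequence (the bispectrum-based hypothesis test is not reproduced here); the data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
gaussian = rng.normal(size=2000)
non_gaussian = rng.exponential(size=2000) - 1.0   # zero-mean but skewed, heavy-tailed

for name, x in [("gaussian", gaussian), ("exponential", non_gaussian)]:
    excess_kurtosis = stats.kurtosis(x)            # ~0 for a Gaussian sequence
    k_stat, p_value = stats.kurtosistest(x)        # H0: kurtosis of a normal distribution
    print(f"{name:12s} excess kurtosis = {excess_kurtosis:+.2f}, p = {p_value:.3g}")
```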

Keywords: Non-Gaussian, bispectrum, kurtosis, hypothesis testing, histogram.

5317 Properties of Bacterial Nanocellulose for Scenic Arts

Authors: B. Suárez, G. Forman

Abstract:

Kombucha (a symbiotic culture of bacteria and yeast) produces a material capable of acquiring multiple shapes and textures that change significantly under different environmental or temperature conditions (e.g., when exposed to wet conditions), properties that may be exploited in the scenic industry. This paper presents an analysis of its specific characteristics, exploring it as a non-conventional material for arts and performance. Costume design uses surfaces as a powerful means of expression to represent concepts and stories, and may apply the unique features of nano bacterial cellulose (NBC) as assets in this artistic context. A mix of qualitative and quantitative (interventionist) methodological approaches was used, including a review of relevant literature to deepen knowledge of the research topic (crossing bibliographies from different fields of study: biology, art, costume design, etc.), as well as descriptive methods: laboratory experiments, documentation, and observation to identify material properties and possibilities for expressing multiple narrative ideas, concepts, and feelings. The results confirmed that NBC is an interactive and versatile material viable for use in an alternative scenic context; its unique aesthetic and performative qualities, which change on contact with moisture, are resources that can be used to create a visual and poetic impact on stage.

Keywords: Biotechnological materials, contemporary dance, costume design, nano bacterial cellulose, performing arts.

5316 Variation of Spot Price and Profits of Andhra Pradesh State Grid in Deregulated Environment

Authors: Chava Sunil Kumar, P.S. Subrahmanyan, J. Amarnath

Abstract:

In this paper, the variation of the spot price and the total profits of the generating companies through wholesale electricity trading are discussed with and without the Central Generating Stations (CGS) share, and seasonal variations are also considered. It demonstrates how proper analysis of generators' efficiencies and capabilities, the types of generators owned, fuel costs, transmission losses, and settling price variation using the solutions of Optimal Power Flow (OPF) can allow companies to maximize overall revenue, and it illustrates how OPF solutions can be used to maximize companies' revenue under different scenarios. The analysis is also extended to the computation of the Available Transfer Capability (ATC), which is very important for transmission system security and market forecasting. From these results, it is observed how crucial it is for companies to plan their daily operations, and the approach is certainly useful in an online environment of a deregulated power system. The above tasks are demonstrated on the 124-bus real-life Indian utility power system of the Andhra Pradesh State Grid, and the results are presented and analyzed.

Keywords: OPF, ATC, Electricity Market, Bid, Spot Price

5315 Automatic Detection of Breast Tumors in Sonoelastographic Images Using DWT

Authors: A. Sindhuja, V. Sadasivam

Abstract:

Breast cancer is the most common malignancy in women and the second leading cause of death for women all over the world. The earlier the cancer is detected, the better the treatment. Diagnosis and treatment rely on the segmentation of sonoelastographic images, yet texture features have not previously been considered for sonoelastographic segmentation. Sonoelastographic images of 15 patients containing both benign and malignant tumors are considered for experimentation. The images are enhanced to remove noise, improve contrast, and emphasize the tumor boundary. Each image is then decomposed into sub-bands using single-level Daubechies wavelets with one to six coefficients. Grey Level Co-occurrence Matrix (GLCM) and Local Binary Pattern (LBP) features are extracted from each sub-band and then selected by ranking them using the Sequential Floating Forward Selection (SFFS) technique. The resultant images undergo K-means clustering followed by a few post-processing steps to remove false spots, and the tumor boundary is detected from the segmented image. It is proposed that LBP features from the vertical coefficients of the Daubechies wavelet with two coefficients are best suited for the segmentation of sonoelastographic breast images among the wavelet members using one to six coefficients for decomposition. The results are also quantified with the help of an expert radiologist. The proposed work can be used in the further diagnostic process to decide whether the segmented tumor is benign or malignant.
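
A heavily simplified sketch of this kind of wavelet-plus-texture segmentation pipeline is shown below, using a synthetic image, only LBP and approximation-band intensity as features, and K-means clustering; the GLCM features, SFFS selection, and post-processing described in the abstract are omitted.

```python
import numpy as np
import pywt
from skimage.feature import local_binary_pattern
from sklearn.cluster import KMeans

# Synthetic stand-in for a sonoelastographic image: a darker blob on a noisy background
rng = np.random.default_rng(3)
img = rng.normal(0.6, 0.05, (128, 128))
yy, xx = np.mgrid[:128, :128]
img[(yy - 64) ** 2 + (xx - 64) ** 2 < 20 ** 2] = 0.25

# Single-level Daubechies-2 decomposition; the paper favours the vertical detail band
cA, (cH, cV, cD) = pywt.dwt2(img, "db2")

# LBP texture codes computed on the (rescaled) vertical sub-band
cV_u8 = np.uint8(255 * (cV - cV.min()) / (np.ptp(cV) + 1e-9))
lbp = local_binary_pattern(cV_u8, P=8, R=1, method="uniform")

# Per-pixel features (approximation intensity + LBP code), clustered with K-means
features = np.column_stack([cA.ravel(), lbp.ravel()])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
mask = labels.reshape(cA.shape)
print("cluster sizes:", np.bincount(mask.ravel()))
```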

Keywords: Breast Cancer, Segmentation, Sonoelastography, Tumor Detection.

5314 An Efficient 3D Animation Data Reduction Using Frame Removal

Authors: Jinsuk Yang, Choongjae Joo, Kyoungsu Oh

Abstract:

Existing methods in which the animation data of all frames are stored and reproduced, as with vertex animation, cannot be used in mobile device environments because they require large amounts of memory. 3D animation data reduction methods aimed at solving this problem have therefore been studied extensively, and we propose a new method as follows. First, we find and remove the frames in which motion changes are small and store only the animation data of the remaining frames (those involving large motion changes). When playing the animation, the removed frames are reconstructed by interpolating the remaining frames. Our key contribution is to calculate the accelerations of the joints in individual frames and the standard deviations of those accelerations, using the joint locations in the 3D model, in order to find and delete the frames in which motion changes are small. Our method can reduce data sizes by approximately 50% or more while providing quality that is not much lower than that of the original animations. Therefore, it is expected to be useful in mobile device environments or other environments in which memory is limited.
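
A minimal sketch of the frame-removal idea (drop frames whose joint-acceleration spread is small, then rebuild them by interpolation) is given below; the clip data, threshold, and linear interpolation are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np

def keep_mask(joint_positions, threshold=0.05):
    """joint_positions: (frames, joints, 3) joint locations.
    Keep frames whose motion change (spread of joint accelerations) is large."""
    acc = np.diff(joint_positions, n=2, axis=0)          # per-joint acceleration
    score = acc.reshape(acc.shape[0], -1).std(axis=1)    # spread per frame
    keep = np.ones(joint_positions.shape[0], dtype=bool)
    keep[1:-1] = score > threshold                       # always keep the end frames
    return keep

def reconstruct(joint_positions, keep):
    """Rebuild removed frames by linear interpolation between kept frames."""
    frames = np.arange(joint_positions.shape[0])
    flat = joint_positions.reshape(joint_positions.shape[0], -1)
    rebuilt = np.empty_like(flat)
    for c in range(flat.shape[1]):
        rebuilt[:, c] = np.interp(frames, frames[keep], flat[keep, c])
    return rebuilt.reshape(joint_positions.shape)

# Hypothetical clip: 120 frames, 20 joints; slow drift with a burst of fast motion
rng = np.random.default_rng(4)
clip = np.cumsum(rng.normal(0, 0.001, (120, 20, 3)), axis=0)
clip[60:70] += np.linspace(0, 1, 10)[:, None, None]
keep = keep_mask(clip)
approx = reconstruct(clip, keep)
print(f"kept {keep.sum()} of {keep.size} frames,",
      "max reconstruction error:", np.abs(approx - clip).max())
```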

Keywords: Data Reduction, Interpolation, Vertex Animation, 3D Animation.

5313 Teaching Computer Programming to Diverse Students: A Comparative, Mixed-Methods, Classroom Research Study

Authors: Almudena Konrad, Tomás Galguera

Abstract:

Lack of motivation and interest is a serious obstacle to students’ learning computing skills. A need exists for a knowledge base on effective pedagogy and curricula to teach computer programming. This paper presents results from research evaluating a six-year project designed to teach complex concepts in computer programming collaboratively, while supporting students to continue developing their computer thinking and related coding skills individually. Utilizing a quasi-experimental, mixed methods design, the pedagogical approaches and methods were assessed in two contrasting groups of students with different socioeconomic status, gender, and age composition. Analyses of quantitative data from Likert-scale surveys and an evaluation rubric, combined with qualitative data from reflective writing exercises and semi-structured interviews yielded convincing evidence of the project’s success at both teaching and inspiring students.

Keywords: Computational thinking, computing education, computer programming curriculum, logic, teaching methods.

5312 Attribute Based Comparison and Selection of Modular Self-Reconfigurable Robot Using Multiple Attribute Decision Making Approach

Authors: Manpreet Singh, V. P. Agrawal, Gurmanjot Singh Bhatti

Abstract:

Over the last decades, there has been significant technological advancement in the field of robotics, and a number of modular self-reconfigurable robots have been introduced that can help in space exploration, "bucket of stuff" applications, and search and rescue operations during earthquakes. As there are many self-reconfigurable robots, choosing the optimum one is always a concern for the robot user, since the available features, facilities, and complexity keep increasing. The objective of this research work is to present a multiple attribute decision making (MADM) based methodology for the coding, evaluation, comparison, ranking, and selection of modular self-reconfigurable robots using the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) approach. In total, 86 attributes that affect structure and performance are identified, and a database of modular self-reconfigurable robots based on these pertinent attributes is generated. This database is very useful for users selecting a robot that suits their operational needs. Two visual methods, namely the linear graph and the spider chart, are proposed for ranking modular self-reconfigurable robots. Using five robots (Atron, Smores, Polybot, M-Tran 3, Superbot), an example is illustrated and the ranking of the robots is successfully carried out; it shows that Smores is the best robot for the operational need illustrated, and the methodology is found to be very effective and simple to use.
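
For reference, a compact TOPSIS implementation of the kind of ranking described is sketched below; the attribute values, weights, and benefit/cost designations for the five robots are purely illustrative and not taken from the paper's 86-attribute database.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: (alternatives, attributes); benefit[j] is True if larger is better."""
    norm = matrix / np.sqrt((matrix ** 2).sum(axis=0))           # vector normalisation
    weighted = norm * weights
    ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
    anti = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))
    d_best = np.sqrt(((weighted - ideal) ** 2).sum(axis=1))
    d_worst = np.sqrt(((weighted - anti) ** 2).sum(axis=1))
    return d_worst / (d_best + d_worst)                          # closeness coefficient

# Hypothetical scores for five robots on four attributes
# (e.g. module count, degrees of freedom, module mass, cost); values are illustrative only.
robots = ["Atron", "Smores", "Polybot", "M-Tran 3", "Superbot"]
data = np.array([[100, 3, 0.85, 5.0],
                 [120, 4, 0.52, 4.0],
                 [ 90, 2, 0.41, 3.5],
                 [110, 3, 0.44, 4.5],
                 [130, 6, 1.20, 6.0]], dtype=float)
weights = np.array([0.25, 0.35, 0.20, 0.20])
benefit = np.array([True, True, False, False])                   # mass and cost: lower is better

scores = topsis(data, weights, benefit)
for name, s in sorted(zip(robots, scores), key=lambda t: -t[1]):
    print(f"{name:10s} {s:.3f}")
```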

Keywords: Self-reconfigurable robots, MADM, TOPSIS, morphogenesis, scalability.

5311 Relative Suitability Evaluation of Two Methods of Particle-Size Analysis for Selected Soils of Sudan Savanna of Nigeria

Authors: B. A. Lawal, B. R. Singh, G. A. Babaji, P. A. Tsado

Abstract:

Two widely used methods based on the sedimentation principle (the Bouyoucos hydrometer and the international pipette) for particle-size analysis were comparatively evaluated on soils collected from various locations in the Sudan savanna of Nigeria, particularly from Sokoto and Zamfara States. The hydrometer method under-estimated the silt content and over-estimated the clay content. In addition, the hydrometer proved difficult to read and tended to submerge when floated for the clay reading in suspensions of very sandy soils (900 g kg-1 sand). Furthermore, the results from the two methods were validated by plotting the data on the USDA soil textural triangle to determine the textural class names; 91.67% of the experimental soils retained the same textural class name irrespective of the method. Thus, the Bouyoucos hydrometer method may conveniently find a place in routine work in view of its simplicity, rapidity, and strong correlation with the pipette method.

Keywords: Hydrometer and pipette methods, particle-size analysis, sedimentation.

5310 Comparative Study of Line Voltage Stability Indices for Voltage Collapse Forecasting in Power Transmission System

Authors: H. H. Goh, Q. S. Chua, S. W. Lee, B. C. Kok, K. C. Goh, K. T. K. Teo

Abstract:

At present, voltage stability assessment is a considerable concern for the safe operation of power systems, owing to the complications of a strained power system. With the snowballing of power demand by consumers and a restricted number of power sources, the system has to perform at its maximum capability. Consequently, discovering the maximum loadability boundary prior to voltage collapse is essential, so that a preliminary warning can be given to avoid interruption of the power system's capacity. The effectiveness of line voltage stability indices (LVSI) is compared in this paper. The main purpose of the indices is to predict the proximity of voltage instability in the electric power system; the indices are also able to identify the weakest load buses, which are closest to voltage collapse. The line stability indices are assessed using the IEEE 14-bus test system to validate their practicability. Results demonstrate that the implemented indices are practically relevant in predicting the manifestation of voltage collapse in the system, so that essential actions can be taken to prevent the incident from arising.

Keywords: Critical line, line outage, line voltage stability indices (LVSI), maximum loadability, voltage collapse, voltage instability, voltage stability analysis.

5309 A User Friendly Tool for Performance Evaluation of Different Reference Evapotranspiration Methods

Authors: Vijay Shankar

Abstract:

Evapotranspiration (ET) is a major component of the hydrologic cycle, and its accurate estimation is essential for hydrological studies. In the past, various estimation methods have been developed for different climatological data, and the accuracy of these methods varies with climatic conditions. Reference crop evapotranspiration (ET0) is a key variable in procedures established for estimating the evapotranspiration rates of agricultural crops; values of ET0 are used with crop coefficients for many aspects of irrigation and water resources planning and management. Numerous methods are used for estimating ET0. As per the internationally accepted procedures outlined in the United Nations Food and Agriculture Organization's Irrigation and Drainage Paper No. 56 (FAO-56), the Penman-Monteith equation is recommended for computing ET0 from ground-based climatological observations. In the present study, seven methods have been selected for performance evaluation. User-friendly software has been developed in the Visual Basic programming language, which makes it possible to create a graphical environment with little coding. For the given data availability, the developed software estimates reference evapotranspiration for any area and period for which data are available. The accuracy of the software has been checked against the examples given in FAO-56. The developed software is a user-friendly tool for estimating ET0 under different data availability and climatic conditions.
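
For reference, the FAO-56 Penman-Monteith equation recommended for computing ET0 can be written as

```latex
ET_0 = \frac{0.408\,\Delta\,(R_n - G) + \gamma\,\dfrac{900}{T + 273}\,u_2\,(e_s - e_a)}
            {\Delta + \gamma\,\left(1 + 0.34\,u_2\right)}
```

where ET0 is the reference evapotranspiration (mm day-1), Rn the net radiation and G the soil heat flux (MJ m-2 day-1), T the mean daily air temperature at 2 m (°C), u2 the wind speed at 2 m (m s-1), es - ea the vapour pressure deficit (kPa), Δ the slope of the saturation vapour pressure curve, and γ the psychrometric constant (kPa °C-1).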

Keywords: Crop coefficient, Crop evapotranspiration, Field moisture, Irrigation Scheduling.

5308 Effects of Different Drying Methods on the Properties of Viscose Single Jersey Fabrics

Authors: M. Kucukali Ozturk, Y. Beceren, B. Nergis

Abstract:

The study discussed in this paper was conducted to investigate the effects of different drying methods (line drying and tumble drying) on viscose single jersey fabrics knitted from ring-spun yarn.

Keywords: Color change, dimensional properties, drying method, fabric tightness, physical properties.

5307 Evaluating Hourly Sulphur Dioxide and Ground Ozone Simulated with the Air Quality Model in Lima, Peru

Authors: Odón R. Sánchez-Ccoyllo, Elizabeth Ayma-Choque, Alan Llacza

Abstract:

Sulphur dioxide (SO₂) and surface ozone (O₃) concentrations are associated with diseases. The objective of this research is to evaluate the effectiveness of the Weather Research and Forecasting model coupled with Chemistry (WRF-Chem) air-quality model with a horizontal resolution of 5 km x 5 km. For this purpose, the measurements of hourly SO₂ and O₃ concentrations available at three air quality monitoring stations in Lima, Peru were used to validate the simulations of SO₂ and O₃ concentrations obtained with the WRF-Chem model for February 2018. For the quantitative evaluation of the simulations of these gases, statistical metrics were computed: the average of the simulations, the average of the measurements, the Mean Bias (MeB), the Mean Error (MeE), and the Root Mean Square Error (RMSE). These statistics indicate that the simulated SO₂ and O₃ values over-predicted the measurements. For the SO₂ concentration, the MeB values varied from 0.58 to 26.35 µg/m³, the MeE values from 8.75 to 26.5 µg/m³, and the RMSE values from 13.3 to 31.79 µg/m³, while for O₃ concentrations the MeB varied from 37.52 to 56.29 µg/m³, the MeE from 37.54 to 56.70 µg/m³, and the RMSE from 43.05 to 69.56 µg/m³.
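
The error metrics used above are standard; a minimal sketch of how they might be computed is given below, with the Mean Error taken as the mean absolute error (an assumption, since the paper's exact definition is not reproduced) and hypothetical data.

```python
import numpy as np

def mean_bias(sim, obs):
    """Mean Bias (MeB): average of (model - observation)."""
    return np.mean(sim - obs)

def mean_error(sim, obs):
    """Mean Error (MeE), taken here as the mean absolute error."""
    return np.mean(np.abs(sim - obs))

def rmse(sim, obs):
    """Root Mean Square Error."""
    return np.sqrt(np.mean((sim - obs) ** 2))

# Hypothetical hourly SO2 series (µg/m³) at one station
obs = np.array([12.0, 15.5, 18.2, 22.1, 19.4, 14.3])
sim = np.array([20.1, 25.0, 30.4, 35.2, 28.9, 22.7])
print(f"MeB  = {mean_bias(sim, obs):.2f} µg/m³")
print(f"MeE  = {mean_error(sim, obs):.2f} µg/m³")
print(f"RMSE = {rmse(sim, obs):.2f} µg/m³")
```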

Keywords: Ground-ozone, Lima, Sulphur dioxide, WRF-Chem.

5306 Numerical Study of Iterative Methods for the Solution of the Dirichlet-Neumann Map for Linear Elliptic PDEs on Regular Polygon Domains

Authors: A. G. Sifalakis, E. P. Papadopoulou, Y. G. Saridakis

Abstract:

A generalized Dirichlet to Neumann map is one of the main aspects characterizing a recently introduced method for analyzing linear elliptic PDEs, through which it became possible to couple known and unknown components of the solution on the boundary of the domain without solving in its interior. For its numerical solution, a well-conditioned, quadratically convergent sine-collocation method was developed, which yields a linear system of equations whose associated coefficient matrix has point-diagonal diagonal blocks. This structural property, among others, motivated the use of iterative methods for its solution. In this work we present a conclusive numerical study of the behavior of classical (Jacobi and Gauss-Seidel) and Krylov subspace (GMRES and Bi-CGSTAB) iterative methods when applied to the solution of the Dirichlet to Neumann map associated with Laplace's equation on regular polygons with the same boundary conditions on all edges.
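
A minimal sketch of the iterative solvers compared in the paper, applied to a stand-in sparse system (not the actual collocation matrix of the Dirichlet to Neumann map), might look like the following; Jacobi is hand-rolled and GMRES/Bi-CGSTAB come from SciPy.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres, bicgstab

# Hypothetical diagonally dominant tridiagonal test system
n = 200
A = diags([-1.0, 4.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

def jacobi(A, b, iters=200):
    """Classical Jacobi iteration: x <- D^{-1} (b - (A - D) x)."""
    d = A.diagonal()
    x = np.zeros_like(b)
    for _ in range(iters):
        x = (b - (A @ x - d * x)) / d
    return x

x_jac = jacobi(A, b)
x_gmres, info_g = gmres(A, b)
x_bicg, info_b = bicgstab(A, b)
print("Jacobi residual   :", np.linalg.norm(A @ x_jac - b))
print("GMRES residual    :", np.linalg.norm(A @ x_gmres - b), "info:", info_g)
print("Bi-CGSTAB residual:", np.linalg.norm(A @ x_bicg - b), "info:", info_b)
```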

Keywords: Elliptic PDEs, Dirichlet to Neumann Map, Global Relation, Collocation, Iterative Methods, Jacobi, Gauss-Seidel, GMRES, Bi-CGSTAB.

5305 Volatility Switching between Two Regimes

Authors: Josip Visković, Josip Arnerić, Ante Rozga

Abstract:

Because volatility is time varying in high frequency data and periods of high volatility tend to cluster, the most successful and popular models for modeling time-varying volatility are GARCH-type models. When financial returns exhibit sudden jumps due to structural breaks, standard GARCH models show high volatility persistence, i.e., integrated behavior of the conditional variance. In such situations, models in which the parameters are allowed to change over time are more appropriate. This paper compares different GARCH models in terms of their ability to describe structural changes in returns caused by the financial crisis in the stock markets of six selected Central and East European countries. The empirical analysis demonstrates that the Markov regime switching GARCH model resolves the problem of excessive persistence and outperforms uni-regime GARCH models in forecasting volatility when sudden switching occurs in response to the financial crisis.
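
Markov regime switching GARCH estimation is not sketched here; as a single-regime baseline, a GARCH(1,1) fit with the third-party Python `arch` package on synthetic returns containing a volatility break might look like the following, where the estimated persistence alpha + beta drifting toward one is the symptom of excessive persistence the paper discusses.

```python
import numpy as np
from arch import arch_model

# Synthetic daily returns with a volatility regime change half-way through
rng = np.random.default_rng(5)
returns = np.concatenate([rng.normal(0, 1.0, 1000),    # calm regime
                          rng.normal(0, 3.0, 1000)])   # crisis regime

model = arch_model(returns, vol="GARCH", p=1, q=1, mean="Constant")
result = model.fit(disp="off")
alpha, beta = result.params["alpha[1]"], result.params["beta[1]"]
print(result.params)
print("estimated persistence alpha + beta =", alpha + beta)  # near 1 => 'integrated' look
```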

Keywords: Central and east European countries, financial crisis, Markov switching GARCH model, transition probabilities.

5304 Data Mining Classification Methods Applied in Drug Design

Authors: Mária Stachová, Lukáš Sobíšek

Abstract:

Data mining incorporates a group of statistical methods used to analyze a set of information, or a data set. It operates with models and algorithms, which are powerful tools with great potential: they can help people understand the patterns in a certain chunk of information, so data mining tools obviously have a wide area of application. For example, in theoretical chemistry, data mining tools can be used to predict molecule properties or improve computer-assisted drug design. Classification analysis is one of the major data mining methodologies. The aim of this contribution is to create a classification model that can deal with a huge data set with high accuracy. For this purpose, logistic regression, Bayesian logistic regression, and random forest models were built using the R software; a Bayesian logistic regression model in the Latent GOLD software was created as well. These classification methods belong to the supervised learning methods. It was necessary to reduce the dimension of the data matrix before constructing the models, and thus factor analysis (FA) was used. The models were applied to predict the biological activity of molecules that are potential new drug candidates.
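
A rough Python analogue of the classification comparison described (the original used R and Latent GOLD) is sketched below on synthetic descriptor data; the factor-analysis dimension, model settings, and data are assumptions, and the Bayesian logistic regression variant is omitted.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import FactorAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for a molecular descriptor matrix (active vs. inactive)
X, y = make_classification(n_samples=2000, n_features=200, n_informative=30,
                           random_state=0)

models = {
    "logistic regression": make_pipeline(FactorAnalysis(n_components=30),
                                         LogisticRegression(max_iter=1000)),
    "random forest": RandomForestClassifier(n_estimators=300, random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name:20s} accuracy = {acc.mean():.3f} +/- {acc.std():.3f}")
```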

Keywords: Data mining, classification, drug design, QSAR.

5303 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information

Authors: Haifeng Wang, Haili Zhang

Abstract:

Most movie recommendation systems have been developed for customers to find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) that need a data-based, analytical approach to stock the right movies for local audiences and retain more customers. We used classification models to extract features from thousands of customers' demographic, behavioral, and social information to predict their movie genre preferences. In the implementation, a Gaussian kernel support vector machine (SVM) classification model and a logistic regression model were established to extract features from the sample data, and their in-sample test errors were compared. Out-of-sample error was also compared under different Vapnik-Chervonenkis (VC) dimensions of the machine learning algorithm to detect and prevent overfitting. The Gaussian kernel SVM prediction model correctly predicts movie genre preferences in 85% of positive cases, and the accuracy of the algorithm increases to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how to use a machine learning approach to predict customers' preferences with a small data set and to design prediction tools for these enterprises.
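
A minimal sketch of the SVM-versus-logistic-regression comparison on synthetic customer features is given below; the feature set, sample size, and model settings are placeholders, not the paper's data or tuning.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic customer features (demographic/behavioral/social) and a binary
# label for preference of one genre; the real feature set is not reproduced.
X, y = make_classification(n_samples=3000, n_features=40, n_informative=12,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "RBF-kernel SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale")),
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:20s} in-sample acc = {model.score(X_tr, y_tr):.3f}, "
          f"out-of-sample acc = {model.score(X_te, y_te):.3f}")
```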

Keywords: Computational social science, movie preference, machine learning, SVM.

5302 Design of a Fuzzy Feed-forward Controller for Monitor HAGC System of Cold Rolling Mill

Authors: S. Khosravi, A. Afshar, F. Barazandeh

Abstract:

In this study we propose a novel monitor hydraulic automatic gauge control (HAGC) system based on a fuzzy feed-forward controller, used in the development of a cold rolling mill automation system to improve the quality of the cold strip. According to the features and properties of the entry steel strip, such as its average yield stress, strip width, and desired exit thickness, the controller compensates for the exit thickness error. Traditional methods of adjusting the roller position cannot tolerate the variance in the entry steel strip. The proposed method uses a mathematical model of the system together with expert knowledge to perform this adjustment while minimizing the effect of the stated problem. In order to improve the speed of the controller in rejecting disturbances introduced by entry strip thickness variations, the expert knowledge is added as a feed-forward term to the HAGC system. Simulation results for the application of the proposed controller to a real cold mill show that the exit strip quality is greatly improved.

Keywords: Fuzzy feed-forward controller, monitor HAGC system, dynamic mathematical model, entry strip thickness deviation compensation

5301 Development of a Multi-Factorial Instrument for Accident Analysis Based on Systemic Methods

Authors: C. V. Pietreanu, S. E. Zaharia, C. Dinu

Abstract:

The present research is built on three major pillars. It commences with some considerations on accident investigation methods, pointing out both the defining aspects of and the differences between linear and non-linear analysis: the traditional linear approach describes accidents as a sequence of events, while the latest systemic models outline interdependencies between different factors and define how processes evolve relative to a specific (normal) situation. Since linear and non-linear accident analysis methods have specific limitations, the second point of interest is to identify the drawbacks of systemic models, which becomes a starting point for developing new directions for identifying risks or data closer to the cause of incidents and accidents. Since communication represents a critical issue in the interaction of the human factor, and many problems have been shown to stem from possible breakdowns in different communication procedures, the third pillar elaborates a new error-modeling instrument suitable for risk assessment and accident analysis.

Keywords: Accident analysis, multi-factorial error modeling, risk, systemic methods.

5300 The Research Approaches on Crisis and its Management

Authors: M. Mikušová, P. Horváthová

Abstract:

The paper structures research approaches to crisis and its management, focusing on psychological, sociological, economic, ethical, and technological approaches. Furthermore, it describes the basic features of models chosen according to those approaches. By comparing them, it shows how a crisis influences organizations and individuals and their mutual interaction.

Keywords: approaches, crisis, model.

5299 Object Identification with Color, Texture, and Object-Correlation in CBIR System

Authors: Awais Adnan, Muhammad Nawaz, Sajid Anwar, Tamleek Ali, Muhammad Ali

Abstract:

The need for efficient information retrieval has increased more than ever in recent years because of the frequent use of digital information in our lives. A lot of work can be seen in the area of textual information, but much less progress has been made for multimedia information. For text-based information, data mining and data mart technologies, which grew out of the basic database concepts of the 1960s, are now in operation, whereas in image search, and especially image identification, computerized systems are still at a very early stage. One main reason for this is the widespread roots of image search, in which many areas such as artificial intelligence, statistics, image processing, and pattern recognition play a role; even human psychology, perception, and cultural diversity have their share in the design of a good and efficient image recognition and retrieval system. A new object-based search technique is presented in this paper, in which objects in the image are identified on the basis of their geometrical shapes and other features such as color and texture, while object correlation augments the search process. To keep the focus on object identification, simple images are selected for this work to reduce the role of segmentation in the overall process; however, the same technique can also be applied to other images.
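
A small sketch of the colour and texture features such a CBIR system might extract per image (or per segmented object) is given below; the shape descriptors and object-correlation step described in the abstract are not reproduced, and the image is random placeholder data.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def color_histogram(rgb, bins=8):
    """Joint RGB histogram, normalised to sum to 1."""
    hist, _ = np.histogramdd(rgb.reshape(-1, 3), bins=(bins,) * 3,
                             range=((0, 256),) * 3)
    return (hist / hist.sum()).ravel()

def texture_features(gray):
    """A few GLCM statistics on an 8-bit grey image."""
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return np.array([graycoprops(glcm, p)[0, 0]
                     for p in ("contrast", "homogeneity", "energy")])

# Placeholder image; a real system would add shape descriptors and
# object-correlation scores on top of these colour/texture features.
rng = np.random.default_rng(6)
img = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
gray = img.mean(axis=2).astype(np.uint8)
feature_vector = np.concatenate([color_histogram(img), texture_features(gray)])
print("feature vector length:", feature_vector.size)
```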

Keywords: Object correlation, geometrical shape, color, texture, features, contents.

5298 A Text Mining Technique Using Association Rules Extraction

Authors: Hany Mahgoub, Dietmar Rösner, Nabil Ismail, Fawzy Torkey

Abstract:

This paper describes a text mining technique for automatically extracting association rules from collections of textual documents. The technique is called Extracting Association Rules from Text (EART). It relies on keyword features to discover association rules among the keywords labeling the documents. In this work, the EART system ignores the order in which the words occur and focuses instead on the words and their statistical distributions in the documents. The main contributions of the technique are that it integrates XML technology with an information retrieval scheme (TF-IDF) for keyword/feature selection, which automatically selects the most discriminative keywords for use in association rule generation, and that it uses a data mining technique for association rule discovery. It consists of three phases: a text preprocessing phase (transformation, filtration, stemming, and indexing of the documents), an association rule mining (ARM) phase (applying our designed algorithm for Generating Association Rules based on a Weighting scheme, GARW), and a visualization phase (visualization of the results). Experiments were performed on web-page news documents related to the outbreak of the bird flu disease. The extracted association rules contain important features and describe the informative news included in the document collection. The performance of the EART system was compared with a system that uses the Apriori algorithm in terms of execution time and the quality of the extracted association rules.
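
A toy sketch of the two ingredients named above, TF-IDF keyword selection followed by support/confidence association rules over the selected keywords, is shown below; the corpus, thresholds, and pairwise-only rules are simplifications and do not reproduce the GARW weighting scheme.

```python
from collections import Counter
from itertools import combinations

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "bird flu outbreak reported in poultry farms",
    "vaccine research accelerates after bird flu outbreak",
    "poultry farms culled to contain flu virus",
    "research on flu vaccine shows progress",
]

# TF-IDF keyword selection: keep the top-k terms of each document
vec = TfidfVectorizer(stop_words="english")
tfidf = vec.fit_transform(docs).toarray()
terms = np.array(vec.get_feature_names_out())
k = 3
doc_keywords = [set(terms[np.argsort(row)[-k:]]) for row in tfidf]

# Pairwise association rules (keyword_a -> keyword_b) filtered by support and
# confidence; thresholds are deliberately low for this tiny corpus.
min_support, min_confidence = 0.25, 0.8
pair_counts, item_counts = Counter(), Counter()
for kws in doc_keywords:
    item_counts.update(kws)
    pair_counts.update(combinations(sorted(kws), 2))
n = len(docs)
for (a, b), c in pair_counts.items():
    if c / n >= min_support:
        for x, y in ((a, b), (b, a)):
            conf = c / item_counts[x]
            if conf >= min_confidence:
                print(f"{x} -> {y}  (support={c / n:.2f}, confidence={conf:.2f})")
```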

Keywords: Text mining, data mining, association rule mining

5297 Application of Scanning Electron Microscopy and X-Ray Evaluation of the Main Digestion Methods for Determination of Macroelements in Plant Tissue

Authors: Krasimir I. Ivanov, Penka S. Zapryanova, Stefan V. Krustev, Violina R. Angelova

Abstract:

Three commonly used digestion methods (dry ashing, acid digestion, and microwave digestion), in different variants, were compared for the digestion of tobacco leaves. Three main macroelements (K, Ca, and Mg) were analysed using an AAS spectrometer (Spectra AA 220, Varian, Australia). The accuracy and precision of the measurements were evaluated using the Polish reference material CTR-VTL-2 (Virginia tobacco leaves). To elucidate the problems with elemental recovery, X-ray and SEM-EDS analyses of all residues after digestion were performed. The X-ray investigation showed the formation of KClO4 when HClO4 was used as part of the acid mixture, while the use of HF for Ca and Mg determination led to the formation of CaF2 and MgF2. The results were confirmed by energy dispersive X-ray microanalysis. The SPSS program for Windows was used for statistical data processing.

Keywords: Digestion methods, determination of macroelements, plant tissue.
