Search results for: Independent Component Analysis

8809 A Trends Analysis of Dinghy Yacht Simulator

Authors: Jae-Neung Lee, Sung-Bum Pan, Keun-Chang Kwak

Abstract:

This paper presents an analysis of international trends in dinghy yacht simulators and provides background on yachting. The results are summarized as follows. Sensors attached to the cockpit feed back information on rudder angle, boat heel angle and mainsheet tension to the computer. Energy expenditure of the sailor is measured indirectly using expired gas analysis to obtain VO2 and VCO2. Course configurations and wind conditions at sea can be preset to suit any level of sailor, from complete beginner to advanced.

Keywords: Trends Analysis, Yacht Simulator, Sailing.

8808 Minimizing Fish-feed Loss due to Sea Currents: An Economic Methodology

Authors: V. Vassiliou, M. Charalambides, M. Menicou

Abstract:

Fish-feed is a major cost component of operating expenses for any aquaculture farm. Due to soaring prices of fish-feed ingredients, the need for better feeding-schedule management has become imperative. One factor that influences the utilization rate of fish-feed is sea currents. Up to now, practical monitoring of fish-feed loss due to sea currents has not been exercised. This paper describes an economic methodology that aims at quantifying the amount of fish-feed lost due to sea currents, and draws on data from a Mediterranean aquaculture farm to formulate the associated model.

Keywords: Aquaculture, economic model, fish-feed loss, sea currents.

8807 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict and derive an unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics in order to process and analyze big data. Nevertheless, they have been hampered by the limits of classical methods of predictive analysis when the amount of data is large. In fact, because of their volume, their nature (semi-structured or unstructured) and their variety, big data cannot be analyzed efficiently via classical methods of predictive analysis. This weakness stems from the fact that classical predictive analysis algorithms do not allow the parallelization and distribution of calculation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined in order to make it applicable to huge quantities of data.
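
For orientation, the sketch below shows the classical CART split search (Gini impurity over a single feature). The abstract does not reproduce the authors' extended algorithm; this per-feature scan is simply the kind of sequential work unit that an extension along the lines described would need to parallelize and distribute. All names are illustrative.

```python
import numpy as np

def gini(labels):
    """Gini impurity of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(x, y):
    """Exhaustive search for the best threshold on one feature,
    minimizing the weighted Gini impurity of the two children."""
    best_t, best_score = None, np.inf
    for t in np.unique(x)[:-1]:          # every candidate cut point
        left, right = y[x <= t], y[x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score
```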

Keywords: Predictive analysis, big data, predictive analysis algorithms, CART algorithm.

8806 Turkish Adolescents' Subjective Well-Being with Respect to Age, Gender and SES of Parents

Authors: Ali Eryılmaz

Abstract:

This research investigates the effect of some demographic factors on Turkish adolescents' subjective well-being. A total of 432 adolescents, 247 girls and 185 boys, participated in the study; all were high-school students aged 15-17. The Positive and Negative Affect Scale and the Life Satisfaction Scale were used to measure the adolescents' subjective well-being. ANOVA was used to examine the effect of age, an independent-samples t-test was used for gender differences, and the Pearson correlation method was used to examine the effect of the socio-economic status (SES) of the adolescents' parents. According to the results, there is no gender difference in adolescents' subjective well-being, whereas SES and age have a significant but small effect on it.
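
The three analyses named above map directly onto standard SciPy calls. The sketch below uses synthetic stand-in scores (the study's data are not public) and assumed codings for age and parental SES:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical subjective well-being scores standing in for the real data.
girls = rng.normal(50, 10, 247)
boys  = rng.normal(50, 10, 185)
swb   = np.concatenate([girls, boys])
age   = rng.choice([15, 16, 17], size=432)
ses   = rng.integers(1, 6, size=432)      # assumed 1-5 parental SES coding

t, p_t = stats.ttest_ind(girls, boys)                             # gender difference
f, p_f = stats.f_oneway(*(swb[age == a] for a in (15, 16, 17)))   # age effect
r, p_r = stats.pearsonr(ses, swb)                                 # SES effect
print(f"t={t:.2f} (p={p_t:.3f})  F={f:.2f} (p={p_f:.3f})  r={r:.2f} (p={p_r:.3f})")
```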

Keywords: Subjective well-being, adolescents, age, gender, SES.

8805 Hardware Centric Machine Vision for High Precision Center of Gravity Calculation

Authors: Xin Cheng, Benny Thörnberg, Abdul Waheed Malik, Najeem Lawal

Abstract:

We present a hardware-oriented method for real-time measurement of object positions in video. The targeted application area is light spots used as references for robotic navigation. Different algorithms for dynamic thresholding are explored in combination with component labeling and Center Of Gravity (COG) computation, to obtain the highest possible precision for a given Signal-to-Noise Ratio (SNR). The method was developed with low hardware cost in focus, requiring only one convolution operation for the preprocessing of data.
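
A software analogue of this pipeline (threshold, label connected components, compute an intensity-weighted COG per spot) can be sketched with scipy.ndimage. Note that the paper explores dynamic thresholding schemes in hardware, whereas this illustration uses a fixed threshold:

```python
import numpy as np
from scipy import ndimage

def spot_centroids(image, threshold):
    """Threshold the image, label connected components, and return the
    intensity-weighted center of gravity of each spot (sub-pixel)."""
    mask = image > threshold          # fixed threshold stands in for the
                                      # dynamic schemes explored in the paper
    labels, n = ndimage.label(mask)
    return ndimage.center_of_mass(image, labels, range(1, n + 1))
```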

Keywords: Dynamic thresholding, segmentation, position measurement, sub-pixel precision, center of gravity.

8804 Strategic Regional Identity for Health and Wellness Lodging

Authors: Pongsiri K.

Abstract:

This research aimed to study the competency of health and wellness hotels and resorts (HWHR) in developing the use of local natural resources and wisdom in line with the national health and wellness tourism (HWT) strategy, by comparing two independent samples from Aumpur Muang, Ranong province and Aumpur Muang, Chiangmai province, and to suggest a direct path for leading such organizations to sustainable success. The research used a mixed methodology, drawing on both quantitative and qualitative data. Data on HWHR competency in developing the use of local natural resources for promoting HWT were collected via 300 questionnaires from six hotels and resorts in the two areas, three from Aumpur Muang, Ranong province and three from Aumpur Muang, Chiangmai province. Competency was examined in four main areas: food and beverage service, tourism activity, environmental service, and value adding. The total competency of the Chiangmai sample was scored significantly higher than the Ranong sample (p < 0.01), and in the area of safety Chiangmai's competency was scored significantly higher as well (p < 0.05); the other areas were not rated differently. Since Chiangmai performs better, it can serve as a role model for developing HWHR or HWT destinations. In the qualitative part of the research, a content analysis of the business contents and their environments was carried out. Four stages of strategic development and planning, from the smallest scale up to the national level, were discussed; an HWT evolution model and a strategy for lodging businesses are suggested, and all stages must work together harmoniously. A distinctive result is the need for human resource development as the key to creating an identity of Thainess in health and wellness service provision. This adds value to the services and differentiates them from those of competitors, and a creative Thai health and wellness brand is likely to increase customer loyalty, which supports a path of sustainable development.

Keywords: Health and Wellness Tourism (HWT), Strategic Analysis, Health and Wellness Hotels and Resorts (HWHR), Lodging Firms.

8803 A Real-Time Simulation Environment for Avionics Software Development and Qualification

Authors: U. Tancredi, D. Accardo, M. Grassi, G. Fasano, A. E. Tirri, A. Vitale, N. Genito, F. Montemari, L. Garbarino

Abstract:

The development of guidance, navigation and control algorithms and avionic procedures requires the availability of suitable analysis and verification tools, such as simulation environments, which support the design process and allow detecting potential problems prior to flight testing, in order to make new technologies available at reduced cost, time and risk. This paper presents a simulation environment for avionic software development and qualification, especially aimed at equipment for general aviation aircraft and unmanned aerial systems. The simulation environment includes models for short- and medium-range radio-navigation aids, flight assistance systems, and ground control stations. All the software modules are able to simulate the modeled systems in both fast-time and real-time tests, and were implemented following component-oriented modeling techniques and a requirement-based approach. The paper describes the specific features of the models, the architecture of the implemented software systems, and their validation process. The validation tests performed highlighted the capability of the simulation environment to guarantee in real time the required functionality and performance of the simulated avionics systems, as well as to reproduce the interaction between these systems, thus permitting a realistic and reliable simulation of a complete mission scenario.

Keywords: ADS-B, avionics, NAVAIDs, real time simulation, TCAS, UAS ground control station.

8802 Realization of Sustainable Urban Society by Personal Electric Transporter and Natural Energy

Authors: Yuichi Miyamoto

Abstract:

Regarding the energy sector in the modern period, two points stand out: first, a vast and growing energy demand, and second, the environmental impact associated with it. The enormous consumption of fossil fuel by mobile units is leading to its rapid depletion, and nuclear power is not the only problem. A modal shift that utilizes personal transporters and independent power is very effective for realizing a sustainable society, and the author proposes that the world continue to work toward this. For the energy of the future society, innovation in battery technology and the use of natural energy are a big key, and saving on energy consumption is also necessary.

Keywords: Natural energy, Modal shift, Personal transporter, Battery.

8801 Meta-Classification using SVM Classifiers for Text Documents

Authors: Daniel I. Morariu, Lucian N. Vintan, Volker Tresp

Abstract:

Text categorization is the problem of classifying text documents into a set of predefined classes. In this paper, we investigate three approaches to building a meta-classifier in order to increase classification accuracy. The basic idea is to learn a meta-classifier that optimally selects the best component classifier for each data point. The experimental results show that combining classifiers can significantly improve the accuracy of classification, and that our meta-classification strategy gives better results than each individual classifier. For 7083 Reuters text documents we obtained classification accuracies of up to 92.04%.
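
One plausible reading of "select the best component classifier for each data point" is sketched below on synthetic data: SVMs with different kernels serve as component classifiers, and a k-NN meta-classifier learns which component to trust in each region of the input space. The dataset, kernels and meta-learner are assumptions for illustration, not the authors' setup:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=50, n_informative=20,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Component classifiers: SVMs with different kernels.
components = [SVC(kernel=k).fit(X_tr, y_tr) for k in ("linear", "rbf", "poly")]

# Meta-level target: for each training point, the index of a component that
# classifies it correctly (ties and all-wrong cases default to component 0).
correct = np.array([c.predict(X_tr) == y_tr for c in components])
meta = KNeighborsClassifier(n_neighbors=5).fit(X_tr, correct.argmax(axis=0))

# Prediction: route every test point to the component the meta-classifier picks.
sel = meta.predict(X_te)
all_preds = np.stack([c.predict(X_te) for c in components])
y_pred = all_preds[sel, np.arange(len(X_te))]
print("meta-selection accuracy:", (y_pred == y_te).mean())
```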

Keywords: Meta-classification, Learning with Kernels, Support Vector Machine, Performance Evaluation.

8800 The Enhancement of Target Localization Using Ship-Borne Electro-Optical Stabilized Platform

Authors: Jaehoon Ha, Byungmo Kang, Kilho Hong, Jungsoo Park

Abstract:

Electro-optical (EO) stabilized platforms have been widely used for surveillance and reconnaissance on various types of vehicles, from surface ships to unmanned air vehicles (UAVs). EO stabilized platforms usually consist of an assembly of structure, bearings, and motors called a gimbal, in which a gyroscope is installed. EO elements such as a CCD camera and an IR camera are mounted on the gimbal, which has a range of motion in elevation and azimuth and can designate and track a target. In addition, a laser range finder (LRF) can be added to the gimbal in order to acquire the precise slant range from the platform to the target. Recently, versatile target localization functionality has been needed in order to cooperate with weapon systems mounted on the same platform, and target information, such as location and velocity, needs to be more accurate. The accuracy of the target information depends on diverse component errors and the alignment errors of each component. In particular, the type of moving platform can affect the accuracy of the target information. In the case of flying platforms, or UAVs, the target location error can increase with altitude, so it is important to measure altitude as precisely as possible. In the case of surface ships, the target location error can increase with the obliqueness of the elevation angle of the gimbal, since the altitude of the EO stabilized platform is relatively low. The farther the slant range from the surface ship to the target, the more extreme the obliqueness of the elevation angle, which can hamper the precise acquisition of the target information. So far, there have been many studies on EO stabilized platforms for flying vehicles; however, few researchers have focused on ship-borne EO stabilized platforms. In this paper, we deal with a target localization method for an EO stabilized platform located on the mast of a surface ship, where the limitation caused by the obliqueness of the elevation angle of the gimbal must be overcome. We introduce a well-known approach to target localization using the Unscented Kalman Filter (UKF), present the problem definition showing the above-mentioned limitation, and demonstrate the effectiveness of the approach through computer simulations.
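
The obliqueness limitation is easy to quantify with the flat-earth geometry of a masthead sensor: the ground range to a sea-level target is h/tan(depression), so as the depression angle shrinks, a fixed angular bias produces a rapidly growing range error. The sketch below illustrates this with an assumed mast height of 25 m; the paper's UKF machinery is not reproduced here:

```python
import numpy as np

def ground_range(mast_height, depression):
    """Horizontal range to a sea-level target observed at the given
    depression angle (radians) below horizontal from the masthead."""
    return mast_height / np.tan(depression)

h = 25.0                          # assumed mast height in metres
bias = np.radians(0.01)           # a small, fixed elevation-angle bias
for dep_deg in (1.0, 0.3, 0.15):
    dep = np.radians(dep_deg)
    err = abs(ground_range(h, dep - bias) - ground_range(h, dep))
    print(f"depression {dep_deg:4.2f} deg -> range {ground_range(h, dep):7.0f} m, "
          f"error from 0.01 deg bias {err:6.1f} m")
```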

Keywords: Target localization, ship-borne electro-optical stabilized platform, unscented Kalman filter.

8799 Rigid and Non-rigid Registration of Binary Objects using the Weighted Ratio Image

Authors: Panos Kotsas, Tony Dodd

Abstract:

This paper presents the application of a signal-intensity-independent similarity criterion for rigid and non-rigid body registration of binary objects. The criterion is defined as the weighted ratio image of two images. The ratio is computed on a voxel-per-voxel basis, and weighting is performed by setting the ratios between signal and background voxels to a standard high value. The mean squared value of the weighted ratio is computed over the union of the signal areas of the two images, and it is minimized using Chebyshev polynomial approximation.
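
A minimal sketch of the similarity criterion follows, assuming binary input arrays and an arbitrary choice for the "standard high value". The Chebyshev-polynomial minimization itself is not reproduced; this cost could be handed to any optimizer over the transform parameters:

```python
import numpy as np

def weighted_ratio_cost(fixed, moving, high_weight=1e3):
    """Mean squared value of the weighted ratio image over the union of
    the signal (nonzero) areas of two binary images."""
    union = (fixed > 0) | (moving > 0)
    both = (fixed > 0) & (moving > 0)
    ratio = np.ones_like(fixed, dtype=float)
    ratio[both] = fixed[both] / moving[both]   # signal/signal ratios equal 1
    ratio[union & ~both] = high_weight         # signal/background set high
    # For binary objects the cost is driven entirely by the mismatched
    # (high-weight) voxels, so it is minimized at alignment.
    return np.mean(ratio[union] ** 2)
```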

Keywords: Rigid and non-rigid body registration, binary objects.

8798 A New Splitting H1-Galerkin Mixed Method for Pseudo-hyperbolic Equations

Authors: Yang Liu, Jinfeng Wang, Hong Li, Wei Gao, Siriguleng He

Abstract:

A new numerical scheme based on the H1-Galerkin mixed finite element method is constructed for a class of second-order pseudo-hyperbolic equations. The proposed procedure can be split into three independent differential sub-schemes and does not need to solve a coupled system of equations. Optimal error estimates are derived for both semidiscrete and fully discrete schemes for problems in one space dimension, and the proposed method does not require the LBB consistency condition. Finally, some numerical results are provided to illustrate the efficacy of our method.
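
The abstract does not state the authors' exact equation; a representative second-order pseudo-hyperbolic model problem of the kind such schemes target is:

```latex
\[
  u_{tt} - \Delta u_t - \Delta u = f(x,t), \qquad (x,t) \in \Omega \times (0,T],
\]
\[
  u(x,0) = u_0(x), \quad u_t(x,0) = u_1(x), \qquad u|_{\partial\Omega} = 0 .
\]
```

An H1-Galerkin mixed method typically introduces the flux q = ∇u as a separate unknown, which is what makes a splitting into independent sub-schemes for u and q possible.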

Keywords: Pseudo-hyperbolic equations, splitting system, H1-Galerkin mixed method, error estimates.

8797 Multi-Dimensional Concerns Mining for Web Applications via Concept-Analysis

Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini

Abstract:

Web applications have become very complex and crucial, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering), so the scientific community has focused its attention on Web application design, development, analysis, and testing, by studying and proposing methodologies and tools. This paper proposes an approach to automatic multi-dimensional concern mining for Web applications, based on concept analysis, impact analysis, and token-based concern identification. This approach lets the user analyse and traverse Web software relevant to a particular concern (concept, goal, purpose, etc.) via multi-dimensional separation of concerns, in order to document, understand and test Web applications. The technique was developed in the context of the WAAT (Web Applications Analysis and Testing) project. A semi-automatic tool to support it is currently under development.

Keywords: Concepts Analysis, Concerns Mining, Multi-Dimensional Separation of Concerns, Impact Analysis.

8796 The Performance Analysis of Error Saturation Nonlinearity LMS in Impulsive Noise based on Weighted-Energy Conservation

Authors: T. Panigrahi, G. Panda, Mulgrew

Abstract:

This paper introduces a new approach to the performance analysis of an adaptive filter with an error saturation nonlinearity in the presence of impulsive noise. The performance analysis of adaptive filters includes both transient analysis, which shows how fast a filter learns, and steady-state analysis, which shows how well a filter learns. Recursive expressions for the mean-square deviation (MSD) and excess mean-square error (EMSE) are derived based on weighted-energy conservation arguments, which provide the transient behavior of the adaptive algorithm. The steady-state behavior is analyzed for correlated input regressor data, so this approach leads to new performance results without restricting the input regression data to be white.
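
A minimal sketch of the algorithm under analysis follows, with a hard clip standing in for the saturation nonlinearity (the literature often uses a smooth, erf-type saturation; the particular choice does not affect the idea that large, impulsive errors are bounded before the weight update):

```python
import numpy as np

def saturated_lms(x, d, order=8, mu=0.01, sat=1.0):
    """LMS adaptive filter with an error saturation nonlinearity: the raw
    error is clipped to [-sat, sat] before the weight update, limiting
    the influence of impulsive noise samples."""
    w = np.zeros(order)
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]          # regressor, most recent sample first
        e = d[n] - w @ u
        w += mu * np.clip(e, -sat, sat) * u
    return w

# Usage: with x the filter input and d the desired (noisy) response,
# saturated_lms(x, d) returns the identified weight vector.
```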

Keywords: Error saturation nonlinearity, transient analysis, impulsive noise.

8795 2D Graphical Analysis of Wastewater Influent Capacity Time Series

Authors: Monika Chuchro, Maciej Dwornik

Abstract:

The extraction of meaningful information from images can be an alternative method for time series analysis. In this paper, we propose a graphical analysis of time series grouped into a table with an adjusted colour scale for the numerical values. The advantages of this method are also discussed. The proposed method is easy to understand and is flexible enough to accommodate standard methods of pattern recognition and verification, especially for noisy environmental data.
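
A minimal matplotlib sketch of the proposed presentation, using synthetic data in place of the plant's influent series: each row of the table is one year, each column one day, and the colour scale exposes seasonality and outliers at a glance.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical daily influent volumes for four years; the paper's data
# come from a wastewater treatment plant and are not reproduced here.
rng = np.random.default_rng(1)
days = np.arange(4 * 364)
flow = 100 + 20 * np.sin(2 * np.pi * days / 364) + rng.normal(0, 5, days.size)

# Group the series into a table: one row per year, one column per day.
table = flow.reshape(4, 364)

plt.imshow(table, aspect="auto", cmap="viridis")
plt.colorbar(label="influent volume")
plt.xlabel("day of year"); plt.ylabel("year")
plt.title("Time series as a colour-scaled table")
plt.show()
```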

Keywords: Graphical analysis, time series, seasonality, noisy environmental data.

8794 The Sizes of Large Hierarchical Long-Range Percolation Clusters

Authors: Yilun Shang

Abstract:

We study a long-range percolation model on the hierarchical lattice ΩN of order N, where the probability of connection between two nodes separated by distance k is of the form min{αβ^(-k), 1}, with α ≥ 0 and β > 0. The parameter α is the percolation parameter, while β describes the long-range nature of the model. ΩN is an example of a so-called ultrametric space, which exhibits remarkable qualitative differences from Euclidean-type lattices. In this paper, we characterize the sizes of large clusters for this model along the lines of some prior work. The proof involves a stationary embedding of ΩN into Z. The phase diagram of this long-range percolation is well understood.

Keywords: Percolation, component, hierarchical lattice, phase transition.

8793 Comparative Study of the Static and Dynamic Analysis of Multi-Storey Irregular Building

Authors: Bahador Bagheri, Ehsan Salimi Firoozabad, Mohammadreza Yahyaei

Abstract:

As the world moves toward the adoption of Performance-Based Engineering philosophies in the seismic design of civil engineering structures, new seismic design provisions require structural engineers to perform both static and dynamic analysis for the design of structures. While linear Equivalent Static Analysis is performed for regular buildings up to 90 m in height in zones I and II, Dynamic Analysis should be performed for regular and irregular buildings in zones IV and V. Dynamic Analysis can take the form of a dynamic Time History Analysis or a linear Response Spectrum Analysis. In the present study, multi-storey irregular buildings with 20 stories have been modeled using the software packages ETABS and SAP2000 v.15 for seismic zone V in India. This paper also deals with the effect of the variation of building height on the structural response of the shear wall building. Dynamic responses of the building under the actual earthquakes EL-CENTRO 1949 and CHI-CHI Taiwan 1999 have been investigated. This paper highlights the accuracy and exactness of Time History Analysis in comparison with the most commonly adopted Response Spectrum Analysis and Equivalent Static Analysis.

Keywords: Equivalent Static Analysis, Time history method, Response spectrum method, Reinforced concrete building, displacement.

8792 Economic Factorial Analysis of CO2 Emissions: The Divisia Index with Interconnected Factors Approach

Authors: Alexander Y. Vaninsky

Abstract:

This paper presents a method of economic factorial analysis of CO2 emissions based on the extension of the Divisia index to interconnected factors. This approach, contrary to the Kaya identity, considers the three main factors of CO2 emissions, gross domestic product, energy consumption, and population, as equally important, and allows for accounting of all of them simultaneously. The three factors are included in the analysis together with their carbon intensities, which allows for obtaining a comprehensive picture of the change in CO2 emissions. A computer program in the R language, available for free download, automates the calculations. A case study of U.S. carbon dioxide emissions is used as an example.
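
For orientation, the sketch below shows a standard additive log-mean Divisia (LMDI) decomposition of a product identity in Python, with hypothetical Kaya-style numbers; the paper's interconnected-factors extension and its R implementation go beyond this baseline:

```python
import numpy as np

def logmean(a, b):
    """Logarithmic mean, the weighting used in additive LMDI."""
    return (a - b) / (np.log(a) - np.log(b)) if a != b else a

def lmdi(x0, xT, names):
    """Additive LMDI decomposition of V = x1*x2*...*xn.
    Returns each factor's contribution to the change V_T - V_0."""
    v0, vT = np.prod(x0), np.prod(xT)
    L = logmean(vT, v0)
    return {n: L * np.log(b / a) for n, a, b in zip(names, x0, xT)}

# Hypothetical two-period data; contributions sum exactly to V_T - V_0.
effects = lmdi([300e6, 45e3, 8e-8, 60.0], [330e6, 55e3, 6e-8, 55.0],
               ["population", "GDP per capita", "energy intensity", "carbon intensity"])
print({k: f"{v:.3g}" for k, v in effects.items()},
      "sum:", f"{sum(effects.values()):.3g}")
```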

Keywords: CO2 emissions, Economic analysis, Factorial analysis, Divisia index, Interconnected factors.

8791 Enhanced Gram-Schmidt Process for Improving the Stability in Signal and Image Processing

Authors: Mario Mastriani, Marcelo Naiouf

Abstract:

The Gram-Schmidt Process (GSP) is used to convert a non-orthogonal basis (a set of linearly independent vectors) into an orthonormal basis (a set of orthogonal, unit-length vectors). The process consists of taking each vector and subtracting from it its components along the previously processed vectors. This paper introduces an Enhanced version of the Gram-Schmidt Process (EGSP) with inverse, which is useful for signal and image processing applications.
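
The paper's EGSP itself is not reproduced here; for reference, the sketch below shows the modified Gram-Schmidt variant, which already improves numerical stability over the classical process by removing each projection from the working vectors immediately:

```python
import numpy as np

def modified_gram_schmidt(A):
    """Orthonormalize the columns of A with modified Gram-Schmidt.
    Each new direction is removed from all remaining columns right away,
    which keeps rounding errors from accumulating across projections."""
    A = A.astype(float).copy()
    n = A.shape[1]
    Q = np.empty_like(A)
    for k in range(n):
        Q[:, k] = A[:, k] / np.linalg.norm(A[:, k])
        for j in range(k + 1, n):
            A[:, j] -= (Q[:, k] @ A[:, j]) * Q[:, k]
    return Q
```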

Keywords: Digital filters, digital signal and image processing, Gram-Schmidt Process, orthonormalization.

8790 Validation of Reverse Engineered Web Application Models

Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini

Abstract:

Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused its attention on Web application design, development, analysis and testing, by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. The use of traditional static source code analysis may be very difficult, due to the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.

Keywords: Validation, Dynamic Analysis, Mutation Analysis, Reverse Engineering, Web Applications.

8789 Direct Design of Steel Bridge Using Nonlinear Inelastic Analysis

Authors: Boo-Sung Koh, Seung-Eock Kim

Abstract:

In this paper, a direct design method using nonlinear inelastic analysis is suggested, and the load carrying capacity obtained by nonlinear inelastic analysis is compared with experimental results to verify the accuracy of the method. The allowable stress design results of a railroad through-plate girder bridge and the safety factor of the nonlinear inelastic analysis were compared to examine safety performance. As a result, the load safety factor from the nonlinear inelastic analysis was twice as high as the safety factor required under the allowable stress design standard specified in the civil engineering structure design standards for urban magnetic levitation railways, which further verifies the advantages of the proposed direct design method.

Keywords: Direct design, nonlinear inelastic analysis, residual stress, initial geometric imperfection.

8788 PIL Theory

Authors: A. Peveri

Abstract:

Space-time is curved by the presence of matter, and this deformation must follow a pattern rather than being random. Space is uniform and elastic, and any modification that occurs in one part causes a change in another.

This deformation must have a constant value that is independent of the observer, relating the amount of matter, the force caused by the curvature of space, and the surface of space. The unit of space defined in this study as the PIL represents a constant area of space, deformable in the direction and sense of the center of mass of the body. The PIL is curved and connected to the center of mass of the Earth, passing through all matter to reach that point, and thus forms part of any region between particles at the atomic and subatomic levels. At these levels the space between particles is flat, unlike at the macro scale, where space curves.

Keywords: Flat space, Curved space, Unit of space, Deformation.

8787 Correlational Analysis between Brain Dominances and Multiple Intelligences

Authors: Lakshmi Dhandabani, Rajeev Sukumaran

Abstract:

The aim of this research is to investigate and establish the characteristics of brain dominances (BD) and multiple intelligences (MI). The experiment was conducted with a sample of 552 undergraduate computer-engineering students. In addition, a mathematical formulation was established to express the relation between thinking and intelligence, and its correlation was analyzed. Correlation was statistically measured using Pearson's coefficient. Analysis of the results proves that there is a strong relation between thinking and intelligence. This research was carried out to improve didactic methods in engineering learning and also to improve e-learning strategies.

Keywords: Thinking style assessment, correlational analysis, mathematical model, data analysis, dynamic equilibrium.

8786 Low Resolution Face Recognition Using Mixture of Experts

Authors: Fatemeh Behjati Ardakani, Fatemeh Khademian, Abbas Nowzari Dalini, Reza Ebrahimpour

Abstract:

Human activity is a major concern in a wide variety of applications, such as video surveillance, human computer interfaces and face image database management. Detecting and recognizing faces is a crucial step in these applications. Furthermore, major advancements and initiatives in security applications in the past years have propelled face recognition technology into the spotlight. The performance of existing face recognition systems declines significantly if the resolution of the face image falls below a certain level. This is especially critical in surveillance imagery where, for many reasons, often only low-resolution video of faces is available. If these low-resolution images are passed to a face recognition system, the performance is usually unacceptable. Hence, resolution plays a key role in face recognition systems. In this paper we introduce a new low resolution face recognition system based on mixture of experts neural networks. In order to produce the low resolution input images, we down-sampled the 48 × 48 ORL images to 12 × 12 ones using the nearest neighbor interpolation method; applying the bicubic interpolation method afterwards yields enhanced images, which are given to the Principal Component Analysis feature extractor. Comparison with some of the most related methods indicates that the proposed model yields an excellent recognition rate in low resolution face recognition: 100% for the training set and 96.5% for the test set.
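
The preprocessing chain described above is straightforward to reproduce; a sketch with Pillow and scikit-learn follows (the ORL images are not bundled, and the number of PCA components is an assumption):

```python
import numpy as np
from PIL import Image
from sklearn.decomposition import PCA

def degrade_and_enhance(face48):
    """48x48 -> 12x12 by nearest-neighbour down-sampling, then bicubic
    up-sampling back to 48x48, as in the preprocessing described above."""
    img = Image.fromarray(face48.astype(np.uint8))
    low = img.resize((12, 12), Image.NEAREST)
    return np.asarray(low.resize((48, 48), Image.BICUBIC), dtype=float)

# Quick shape check on a synthetic gradient image:
face = np.tile(np.arange(48, dtype=np.uint8), (48, 1))
print(degrade_and_enhance(face).shape)     # (48, 48)

# With a stack of ORL-style faces, the PCA features fed to the expert
# networks would be obtained along these lines (component count assumed):
# X = np.stack([degrade_and_enhance(f).ravel() for f in faces])
# features = PCA(n_components=40).fit_transform(X)
```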

Keywords: Low resolution face recognition, Multilayered neural network, Mixture of experts neural network, Principal component analysis, Bicubic interpolation, Nearest neighbor interpolation.

8785 Modeling of AC Servomotor Using Genetic Algorithm and Tests for Control of a Robotic Joint

Authors: J. G. Batista, T. S. Santiago, E. A. Ribeiro, G. A. P. Thé

Abstract:

This work deals with parameter identification of permanent magnet motors, a class of AC motor that is particularly important in industrial automation due to its high performance. Such motors are very attractive for applications with limited space, since they have reduced size and volume and can operate over a wide speed range without independent ventilation. By using experimental data and a genetic algorithm, we have been able to extract values for both the motor inductance and the electromechanical coupling constant, which are then compared to measured and/or expected values.
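
A minimal sketch of GA-based parameter identification in the same spirit follows, fitting a first-order R-L stand-in model to synthetic data (the authors' PMSM model, data, and GA settings are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(params, t):
    """Step response of a first-order R-L stand-in model,
    i(t) = (V/R) * (1 - exp(-R t / L)); not the authors' full PMSM model."""
    R, L = params
    return (12.0 / R) * (1.0 - np.exp(-R * t / L))

t = np.linspace(0, 0.05, 200)
measured = simulate((2.0, 0.01), t) + rng.normal(0, 0.02, t.size)  # synthetic data

def fitness(pop):
    return np.array([np.mean((simulate(p, t) - measured) ** 2) for p in pop])

lo, hi = np.array([0.5, 0.001]), np.array([5.0, 0.05])
pop = rng.uniform(lo, hi, size=(40, 2))              # candidate (R, L) pairs
for _ in range(100):
    parents = pop[np.argsort(fitness(pop))[:20]]     # truncation selection
    a = parents[rng.integers(0, 20, 20)]
    b = parents[rng.integers(0, 20, 20)]
    children = (a + b) / 2 + rng.normal(0, [0.05, 0.0005], (20, 2))  # crossover + mutation
    pop = np.vstack([parents, np.clip(children, lo, hi)])
print("identified (R, L):", pop[np.argmin(fitness(pop))])
```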

Keywords: Modeling, AC servomotor, Permanent Magnet Synchronous Motor-PMSM, Genetic Algorithm, Vector Control, Robotic Manipulator, Control.

8784 A Dynamic Mechanical Thermal T-Peel Test Approach to Characterize Interfacial Behavior of Polymeric Textile Composites

Authors: J. R. Büttler, T. Pham

Abstract:

Basic understanding of interfacial mechanisms is of importance for the development of polymer composites. For this purpose, we need techniques to analyze the quality of interphases, their chemical and physical interactions, and their strength and fracture resistance. In order to investigate the interfacial phenomena in detail, advanced characterization techniques are favorable. Dynamic mechanical thermal analysis (DMTA) using a rheological system is a sensitive tool. T-peel tests were performed with this system to investigate the temperature-dependent peel behavior of woven textile composites. A model system was made of polyamide (PA) woven fabric laminated with films of polypropylene (PP) or PP modified by grafting with maleic anhydride (PP-g-MAH). Firstly, control measurements were performed with the PP matrices alone. Polymer melt investigations, as well as the extensional stress, extensional viscosity and extensional relaxation modulus at -10 °C, 100 °C and 170 °C, demonstrate similar viscoelastic behavior for films made of PP-g-MAH and the non-modified PP-control. Frequency sweeps show that PP-g-MAH has a zero phase viscosity of around 1600 Pa·s and PP-control a similar zero phase viscosity of 1345 Pa·s. The gelation points are also similar, at 2.42×10^4 Pa (118 rad/s) and 2.81×10^4 Pa (161 rad/s) for PP-control and PP-g-MAH, respectively. Secondly, the textile composite was analyzed. The extensional stress of PA66 fabric laminated with either PP-control or PP-g-MAH at -10 °C, 25 °C and 170 °C for strain rates of 0.001–1 s^-1 was investigated. The laminates containing the modified PP need more stress for T-peeling. However, the strengthening effect due to the modification decreases with increasing temperature, and at 170 °C, just above the melting temperature of the matrix, the difference disappears. Independent of the matrix used in the textile composite, the extensional stress decreases with increasing temperature. It appears that the more viscous the matrix, the weaker the laminar adhesion. Possibly, the measurement is influenced by the fact that the laminate becomes stiffer at lower temperatures. Adhesive lap-shear testing at room temperature supports the findings obtained with the T-peel test. Additional analysis of the textile composite at the microscopic level ensures that the fibers are well embedded in the matrix. Atomic force microscopy (AFM) imaging of a cross section of the composite shows no gaps between the fibers and the matrix. Measurements of the water contact angle show that the MAH-grafted PP is more polar than the virgin PP, which suggests a more favorable chemical interaction of PP-g-MAH with PA compared to the non-modified PP. In fact, this study indicates that T-peel testing by DMTA is a technique to achieve more insight into polymeric textile composites.

Keywords: Dynamic mechanical thermal analysis, interphase, polyamide, polypropylene, textile composite, T-peel test.

8783 Model Solutions for Performance-Based Seismic Analysis of an Anchored Sheet Pile Quay Wall

Authors: C. J. W. Habets, D. J. Peters, J. G. de Gijt, A. V. Metrikine, S. N. Jonkman

Abstract:

Conventional seismic designs of quay walls in ports are mostly based on pseudo-static analysis. A more advanced alternative is the Performance-Based Design (PBD) method, which evaluates permanent deformations and amounts of (repairable) damage under seismic loading. The aim of this study is to investigate the suitability of this method for anchored sheet pile quay walls that were not purposely designed for seismic loads. A research methodology is developed in which pseudo-static, permanent-displacement and finite element analysis are employed, calibrated with an experimental reference case that considers a typical anchored sheet pile wall. A reduction factor that accounts for deformation behaviour is determined for pseudo-static analysis. A model to apply traditional permanent displacement analysis on anchored sheet pile walls is proposed. Dynamic analysis is successfully carried out. From the research it is concluded that PBD evaluation can effectively be used for seismic analysis and design of this type of structure.

Keywords: Anchored sheet pile quay wall, simplified dynamic analysis, performance-based design, pseudo-static analysis.

8782 Nullity of t-Tupple Graphs

Authors: Khidir R. Sharaf, Didar A. Ali

Abstract:

The nullity η(G) of a graph G is the multiplicity of zero as an eigenvalue in its spectrum. A zero-sum weighting of a graph G is a real-valued function, say f, from the vertices of G to the set of real numbers, such that for each vertex v of G the sum of the weights f(w) over all neighbors w of v is zero. A high zero-sum weighting of G is one that uses the maximum number of non-zero independent variables. If G is a graph with an end vertex, and if H is the induced subgraph of G obtained by deleting this vertex together with the vertex adjacent to it, then η(G) = η(H). In this paper, a high zero-sum weighting technique and the end-vertex procedure are applied to evaluate the nullity of t-tupple and generalized t-tupple graphs, which is determined for some special types of graphs. Also, we introduce and prove some important results about the t-tupple coalescence, and the Cartesian and Kronecker products, of nut graphs.
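
Numerically, the nullity is simply n minus the rank of the adjacency matrix, which gives a quick way to check results on small graphs:

```python
import numpy as np
import networkx as nx

def nullity(G):
    """Nullity = multiplicity of the zero eigenvalue of the adjacency
    matrix, i.e. n - rank(A)."""
    A = nx.to_numpy_array(G)
    return G.number_of_nodes() - np.linalg.matrix_rank(A)

# Sanity checks against classical values:
print(nullity(nx.path_graph(4)))    # P4 has nullity 0
print(nullity(nx.path_graph(5)))    # P5 (odd path) has nullity 1
print(nullity(nx.cycle_graph(4)))   # C4 has nullity 2
```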

Keywords: Graph theory, Graph spectra, Nullity of graphs.

8781 Mix Design Curves for High Volume Fly Ash Concrete

Authors: S. S. Awanti, Aravindakumar B. Harwalkar

Abstract:

Concrete construction in the future has to be environmentally friendly, apart from being safe, so that society at large benefits from the huge investments made in infrastructure projects. To achieve this, the component materials of the concrete system have to be optimized with reference to sustainability. This paper presents a study on the development of mix proportions for high volume fly ash concrete (HFC). A series of HFC mixtures with cement replacement levels varying between 50% and 65% were prepared with water/binder ratios of 0.3 and 0.35. Compressive strength values were obtained at different ages. From the experimental results, pozzolanic efficiency ratios and mix design curves for HFC were established.

Keywords: Age factor, compressive strength, high volume fly ash concrete, pozzolanic efficiency ratio.

8780 Analysis of Medical Data using Data Mining and Formal Concept Analysis

Authors: Anamika Gupta, Naveen Kumar, Vasudha Bhatnagar

Abstract:

This paper focuses on analyzing medical diagnostic data using classification rules from data mining and context reduction from formal concept analysis. This helps in finding redundancies among the various medical examination tests used in the diagnosis of a disease. Classification rules have been derived from positive and negative association rules using the concept lattice structure of Formal Concept Analysis. The context reduction technique of Formal Concept Analysis, along with the classification rules, has been used to find redundancies among the medical examination tests. It also determines whether expensive medical tests can be replaced by cheaper ones.
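
As a toy illustration of how high-confidence rules expose redundant tests, the sketch below computes rule confidences directly on a hypothetical one-hot table; the paper derives its rules from the concept lattice, for which plain confidence computation stands in here:

```python
import pandas as pd
from itertools import permutations

# Hypothetical one-hot table: rows = patients, columns = positive test
# results and the diagnosis. The paper's real medical data are not shown.
df = pd.DataFrame({
    "test_A_pos": [1, 1, 0, 1, 1, 0],
    "test_B_pos": [1, 1, 0, 1, 1, 0],   # mirrors test A -> candidate redundancy
    "test_C_pos": [0, 1, 1, 0, 1, 1],
    "disease":    [1, 1, 0, 1, 1, 0],
}).astype(bool)

# Confidence of the association rule a -> b is P(b | a).
for a, b in permutations(df.columns, 2):
    support_a = df[a].mean()
    conf = (df[a] & df[b]).mean() / support_a
    if conf == 1.0 and a != "disease":
        print(f"{a} -> {b}  (confidence 1.0, support {support_a:.2f})")
# A rule like test_A_pos -> test_B_pos with confidence 1.0 flags test B as
# redundant on this data: its outcome is implied by the (cheaper) test A.
```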

Keywords: Data Mining, Formal Concept Analysis, Medical Data, Negative Classification Rules.
