Search results for: Complexity reduction approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6778

6538 An Approach for Transient Response Calculation of Large Nonproportionally Damped Structures Using Component Mode Synthesis

Authors: Alexander A. Muravyov

Abstract:

A minimal-complexity version of component mode synthesis is presented that requires only simplified computer programming, yet still provides adequate accuracy for modeling the lower eigenproperties of large structures and their transient responses. The novelty is that the structural separation into components is done along a plane/surface that exhibits rigid-like behavior, so that the normal modes of each component are sufficient, without computing any constraint, attachment, or residual-attachment modes. The approach requires as input only a few (lower) natural frequencies and the corresponding undamped normal modes of each component. A novel technique for formulating the equations of motion is presented, in which a double transformation to generalized coordinates is employed, and the formulation of the nonproportional damping matrix in generalized coordinates is derived.

Keywords: component mode synthesis, finite element models, transient response, nonproportional damping
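
To make the role of the generalized-coordinate transformation concrete, the following minimal sketch (not the paper's implementation) projects the mass, stiffness, and a nonproportional damping matrix of one hypothetical finite element component onto a few of its lowest mass-normalized normal modes; the projected damping matrix stays fully populated, which is what the nonproportional formulation must handle. All matrices and sizes are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical small FE component: tridiagonal stiffness, lumped unit mass.
n = 20
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
M = np.eye(n)

# Keep only a few lower undamped normal modes (the only input the method needs).
n_modes = 4
w2, Phi = eigh(K, M)                 # generalized eigenproblem K*phi = w^2 * M*phi
Phi = Phi[:, :n_modes]               # mass-normalized mode shapes, lowest modes

# Transform to generalized (modal) coordinates q, with x = Phi @ q.
M_g = Phi.T @ M @ Phi                # -> identity for mass-normalized modes
K_g = Phi.T @ K @ Phi                # -> diag(natural frequencies squared)

# A nonproportional physical damping matrix (not a combination of M and K),
# e.g. a single discrete damper at one DOF, stays non-diagonal after the
# transformation, which is the point of the formulation discussed above.
C = np.zeros((n, n)); C[0, 0] = 0.5
C_g = Phi.T @ C @ Phi                # generalized damping: fully populated in general

print(np.round(C_g, 4))
```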

6537 Multi Switched Split Vector Quantization of Narrowband Speech Signals

Authors: M. Satya Sai Ram, P. Siddaiah, M. Madhavi Latha

Abstract:

Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called Multi Switched Split Vector Quantization (MSSVQ), which is a hybrid of the multi-stage, switched, and split vector quantization techniques. The spectral distortion performance, computational complexity, and memory requirements of MSSVQ are compared to those of split vector quantization (SVQ), multi-stage vector quantization (MSVQ) and switched split vector quantization (SSVQ). The results show that MSSVQ has better spectral distortion performance, lower computational complexity and lower memory requirements than all of the above product-code vector quantization techniques. Computational complexity is measured in floating point operations (flops), and memory requirements are measured in floats.

Keywords: Linear predictive coding, Multi stage vector quantization, Switched split vector quantization, Split vector quantization, Line Spectral Frequencies (LSF).
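
The split-VQ building block that MSSVQ is built on can be illustrated with a small sketch. The code below is an assumption-laden toy, not the authors' coder: it trains one LBG/k-means codebook per split of a hypothetical 10-dimensional LSF vector and quantizes a vector by concatenating the nearest codewords of each split.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_codebook(vectors, bits, iters=20):
    """Plain k-means (LBG-style) codebook for one split."""
    size = 2 ** bits
    codebook = vectors[rng.choice(len(vectors), size, replace=False)]
    for _ in range(iters):
        # nearest-codeword assignment, then centroid update
        d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        idx = d.argmin(1)
        for k in range(size):
            if np.any(idx == k):
                codebook[k] = vectors[idx == k].mean(0)
    return codebook

# Hypothetical 10-dimensional LSF training vectors, split into two 5-dim parts.
lsf = np.sort(rng.uniform(0, np.pi, (2000, 10)), axis=1)
splits = [lsf[:, :5], lsf[:, 5:]]
codebooks = [train_codebook(s, bits=6) for s in splits]   # 6 bits per split

def quantize(vec):
    parts = [vec[:5], vec[5:]]
    out = []
    for part, cb in zip(parts, codebooks):
        out.append(cb[((cb - part) ** 2).sum(1).argmin()])
    return np.concatenate(out)

print(quantize(lsf[0]))
```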

6536 Vehicle Aerodynamics: Drag Reduction by Surface Dimples

Authors: C. K. Chear, S. S. Dol

Abstract:

For a bluff body, dimples behave like roughness elements, tripping the boundary layer to turbulence and thereby delaying flow separation, producing a smaller wake and lower form drag. This is very different in principle from the application of dimples to a streamlined body, where any reduction in drag would be predominantly due to a reduction in skin friction. In the present work, a car model with different dimple geometries is simulated using k-ε turbulence modeling to determine their effect on aerodynamic performance. Overall, the results show that the application of dimples reduces the drag coefficient of the car model.

Keywords: Aerodynamics, Boundary Layer, Dimple, Drag, Kinetic Energy, Turbulence.
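
The comparison between the smooth and dimpled models reduces to the standard drag-coefficient definition Cd = 2F / (ρ v² A). The short sketch below uses made-up force values only to illustrate how a percentage drag reduction would be computed from simulated drag forces.

```python
# Drag coefficient from a simulated drag force, as used to compare the
# dimpled and smooth car models (values below are invented for illustration).
rho = 1.225          # air density, kg/m^3
v = 30.0             # free-stream velocity, m/s
A = 2.1              # frontal area of the car model, m^2
F_smooth = 0.35 * 0.5 * rho * v**2 * A   # hypothetical drag forces, N
F_dimpled = 0.33 * 0.5 * rho * v**2 * A

def drag_coefficient(F, rho, v, A):
    return 2.0 * F / (rho * v**2 * A)

cd_s = drag_coefficient(F_smooth, rho, v, A)
cd_d = drag_coefficient(F_dimpled, rho, v, A)
print(f"Cd smooth  = {cd_s:.3f}")
print(f"Cd dimpled = {cd_d:.3f}  ({100*(cd_s-cd_d)/cd_s:.1f}% reduction)")
```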

6535 Improving the Effectiveness of Software Testing through Test Case Reduction

Authors: R. P. Mahapatra, Jitendra Singh

Abstract:

This paper proposes a new technique for improving the efficiency of software testing, based on reducing the test cases that have to be executed for a given piece of software. The approach exploits the advantage of regression testing, where fewer test cases reduce the overall testing time. The technique also offers a means to generate test cases automatically. Compared to a technique in the literature in which the tester has no option but to generate the test cases manually, the proposed technique provides a better option. For test case reduction, the technique uses simple algebraic conditions to assign fixed values to variables (maximum, minimum and constant values). In this way, the variables' values are limited to a definite range, resulting in fewer possible test cases to process. The technique can also be applied to program loops and arrays.

Keywords: Software Testing, Test Case Generation, Test Case Reduction
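
A minimal sketch of the boundary-value idea described above: each numeric input variable is fixed to its minimum and maximum (plus any constant-valued variables), so the cartesian product of test inputs shrinks drastically. Variable names and ranges are hypothetical.

```python
from itertools import product

# Hypothetical input variables with their valid ranges.
RANGES = {"x": (0, 100), "y": (-50, 50)}
CONSTANTS = {"mode": [1]}            # a variable fixed to a constant value

def reduced_test_cases(ranges, constants):
    """Restrict each numeric variable to its minimum and maximum only,
    instead of enumerating every value in its range."""
    domains = {name: [lo, hi] for name, (lo, hi) in ranges.items()}
    domains.update(constants)
    names = list(domains)
    return [dict(zip(names, combo)) for combo in product(*domains.values())]

for case in reduced_test_cases(RANGES, CONSTANTS):
    print(case)   # 2 * 2 * 1 = 4 cases instead of one per value in each range
```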

6534 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm

Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn

Abstract:

Feature selection and attribute reduction are crucial problems and widely used techniques in machine learning, data mining and pattern recognition for overcoming the well-known Curse of Dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. As the evaluation measure, we employ the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers are employed to compare the proposed approach with several existing methods over twenty-two datasets from the UCI repository, including nine high-dimensional and large ones. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.

Keywords: Binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct.
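
A highly simplified sketch of a binary shuffled frog leaping search for feature subsets is given below. The fitness function is only a stand-in for the fuzzy-rough dependency degree used in the paper (it rewards covering a hidden set of informative features while penalizing subset size), and the memeplex update rule is reduced to its essentials; all sizes and probabilities are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_features = 20
relevant = rng.choice(n_features, 5, replace=False)   # hidden "informative" features

def fitness(mask):
    # Stand-in for the fuzzy-rough dependency degree of the selected subset:
    # reward covering the informative features, penalise subset size.
    return np.isin(relevant, np.flatnonzero(mask)).mean() - 0.02 * mask.sum()

def sfla_binary(n_frogs=30, n_memeplexes=3, iters=50):
    frogs = rng.integers(0, 2, (n_frogs, n_features))
    for _ in range(iters):
        scores = np.array([fitness(f) for f in frogs])
        frogs = frogs[np.argsort(-scores)]            # shuffle: best frogs first
        best_global = frogs[0].copy()
        for m in range(n_memeplexes):
            idx = np.arange(m, n_frogs, n_memeplexes) # partition into memeplexes
            worst, best = idx[-1], idx[0]
            # move the worst frog toward the memeplex best (bit-wise, probabilistic)
            take = rng.random(n_features) < 0.5
            candidate = np.where(take, frogs[best], frogs[worst])
            if fitness(candidate) <= fitness(frogs[worst]):
                # if no improvement, jump toward the global best instead
                candidate = np.where(rng.random(n_features) < 0.5,
                                     best_global, frogs[worst])
            if fitness(candidate) > fitness(frogs[worst]):
                frogs[worst] = candidate
    scores = np.array([fitness(f) for f in frogs])
    return frogs[scores.argmax()]

print("selected features:", np.flatnonzero(sfla_binary()))
```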

6533 Carbothermic Reduction of Phosphoric Acid Extracted from Dephosphorization Slags to Produce Yellow Phosphorus

Authors: Ryoko Yoshida, Jyunpei Yoshida, Hua Fang Yu, Yasushi Sasaki, Tetsuya Nagasaka

Abstract:

Phosphorus is an important element for agriculture and industry and is a non-renewable resource. Yellow phosphorus in particular is an essential material in advanced industrial technology, but phosphorus resources are not produced in Japan at all and depend entirely on imports. It has been suggested, however, that the remaining accessible reserves of phosphate ore will be depleted within 50 years. Therefore, alternative resources for phosphate ore must be found. In this research, we have developed a process that enables the production of high-purity yellow phosphorus from domestic unused phosphorus resources such as steelmaking slags. The process consists of two parts: (1) the production of crude phosphoric acid from wastes such as steelmaking slag; (2) the production of high-purity yellow phosphorus by low-temperature carbothermic reduction of phosphoric acid (H3PO4). The details of the carbothermic reduction of phosphoric acid are presented in this paper. Yellow phosphorus is commercially produced by carbothermic reduction of phosphate ore in an electric arc furnace at more than 1673 K. In the newly developed system, gaseous P4O10 evaporated from H3PO4 is successfully reduced to yellow phosphorus using a carbon packed bed at less than 1273 K. To address the depletion of phosphate ore, the process proposed in this study, producing yellow phosphorus by carbothermic reduction of H3PO4 extracted from dephosphorization slags, will be an effective and economical solution.

Keywords: Carbothermic reduction, dephosphorization slags, phosphoric acid, yellow phosphorus.
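
Assuming the overall route of dehydrating phosphoric acid to P4O10 and then reducing it with carbon (4 H3PO4 → P4O10 + 6 H2O, followed by P4O10 + 10 C → P4 + 10 CO), a quick stoichiometric estimate of the ideal feed requirements per kilogram of yellow phosphorus can be made as follows; the numbers ignore yield losses and impurities.

```python
# Overall stoichiometry assumed here (dehydration, then carbothermic reduction):
#   4 H3PO4 -> P4O10 + 6 H2O
#   P4O10 + 10 C -> P4 + 10 CO
M_H3PO4 = 97.994          # g/mol
M_C     = 12.011          # g/mol
M_P4    = 4 * 30.974      # g/mol

def feed_per_kg_P4(kg_P4=1.0):
    mol_P4 = kg_P4 * 1000 / M_P4
    acid_kg   = 4  * mol_P4 * M_H3PO4 / 1000   # phosphoric acid required
    carbon_kg = 10 * mol_P4 * M_C     / 1000   # reductant carbon required
    return acid_kg, carbon_kg

acid, carbon = feed_per_kg_P4()
print(f"per kg P4: {acid:.2f} kg H3PO4 and {carbon:.2f} kg C (ideal, 100% yield)")
```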

6532 Feeder Reconfiguration for Loss Reduction in Unbalanced Distribution System Using Genetic Algorithm

Authors: Ganesh Vulasala, Sivanagaraju Sirigiri, Ramana Thiruveedula

Abstract:

This paper presents an efficient approach to feeder reconfiguration for power loss reduction and voltage profile improvement in unbalanced radial distribution systems (URDS). A genetic algorithm (GA) is used to obtain the reconfiguration of the radial distribution system that minimizes the losses, and a forward and backward algorithm is used to calculate load flows in the unbalanced distribution system. By simulating the survival of the fittest among the strings, the optimum string is searched for through randomized information exchange between strings, performed by crossover and mutation. Results show that the proposed algorithm has advantages over previous algorithms. The proposed method is effectively tested on 19-node and 25-node unbalanced radial distribution systems.

Keywords: Distribution system, Load flows, Reconfiguration, Genetic Algorithm.
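
The GA loop described above (fitness-based selection, crossover, mutation over switch-status strings) can be sketched as follows. The power_loss function is only a placeholder for the forward/backward load-flow evaluation of a switch configuration; the switch count, weights and penalty are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_switches = 12                    # tie/sectionalizing switch states (toy size)

def power_loss(config):
    # Placeholder for the forward/backward-sweep unbalanced load flow that
    # would evaluate losses for a given switch configuration.
    weights = np.linspace(1.0, 2.0, n_switches)
    return float(config @ weights) + 5.0 * abs(config.sum() - n_switches // 2)

def ga(pop_size=40, generations=100, p_mut=0.05):
    pop = rng.integers(0, 2, (pop_size, n_switches))
    for _ in range(generations):
        fit = np.array([1.0 / (1.0 + power_loss(c)) for c in pop])  # fitness
        # roulette-wheel selection, single-point crossover, bit-flip mutation
        parents = pop[rng.choice(pop_size, pop_size, p=fit / fit.sum())]
        children = parents.copy()
        cuts = rng.integers(1, n_switches, pop_size // 2)
        for i, c in enumerate(cuts):
            a, b = 2 * i, 2 * i + 1
            children[a, c:], children[b, c:] = parents[b, c:], parents[a, c:]
        flip = rng.random(children.shape) < p_mut
        pop = np.where(flip, 1 - children, children)
    losses = np.array([power_loss(c) for c in pop])
    return pop[losses.argmin()], losses.min()

best, loss = ga()
print("best switch configuration:", best, "loss:", round(loss, 2))
```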

6531 Variable Rough Set Model and Its Knowledge Reduction for Incomplete and Fuzzy Decision Information Systems

Authors: Da-kuan Wei, Xian-zhong Zhou, Dong-jun Xin, Zhi-wei Chen

Abstract:

Information systems with incomplete attribute values and fuzzy decisions commonly arise in practical problems. On the basis of the variable precision rough set model for incomplete information systems and the rough set model for incomplete and fuzzy decision information systems, a variable rough set model for incomplete and fuzzy decision information systems is constructed; it generalizes both of these models. Knowledge reduction and a heuristic algorithm, built on the method and theory of precision reduction, are then proposed.

Keywords: Rough set, Incomplete and fuzzy decision information system, Limited valued tolerance relation, Knowledge reduction, Variable rough set model

6530 Effective Methodology for Security Risk Assessment of Computer Systems

Authors: Daniel F. García, Adrián Fernández

Abstract:

Today, computer systems are increasingly complex and face growing security risks. Security managers need effective security risk assessment methodologies that can model the increasing complexity of current computer systems while keeping the complexity of the assessment procedure low. This paper provides a brief analysis of common security risk assessment methodologies, leading to the selection of a methodology that fulfills these requirements. Then, a detailed analysis of the most effective methodology is carried out, with numerical examples to demonstrate how easy it is to use.

Keywords: Computer security, qualitative and quantitative methods, risk assessment methodologies, security risk assessment.
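
As a generic illustration of the kind of quantitative estimate such methodologies produce (not the specific methodology selected in the paper), the sketch below ranks hypothetical threats by annualized loss expectancy, i.e. likelihood times impact.

```python
# A minimal quantitative risk estimate: risk = likelihood per year x impact.
threats = {
    # name: (annual likelihood, impact in USD) -- illustrative values only
    "malware infection":  (0.60, 20_000),
    "insider data leak":  (0.05, 150_000),
    "denial of service":  (0.30, 40_000),
}

for name, (likelihood, impact) in sorted(
        threats.items(), key=lambda kv: -kv[1][0] * kv[1][1]):
    ale = likelihood * impact          # annualized loss expectancy
    print(f"{name:20s} ALE = {ale:10,.0f} USD")
```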

6529 Resistance Training as a Powerful Tool in the Prevention and Treatment of Cardiovascular Diseases

Authors: I. Struhár, L. Dovrtělová, M. Kumstát

Abstract:

Regular exercise promotes reductions in blood pressure and body weight and also helps to increase insulin sensitivity. Participation in physical activity should always be linked to medical screening, which can reveal serious medical problems. One of them is high blood pressure. Hypertension is a risk factor for one billion people worldwide, and its highest prevalence is found in Africa. A further characteristic of hypertension is that many people who suffer from it have no symptoms. It is estimated that a reduction of 3 mm Hg in systolic blood pressure decreases cardiac morbidity by at least 5%. Most guidelines suggest aerobic exercise for the prevention of cardiovascular diseases. On the other hand, it is important to emphasize the impact of resistance training; indeed, it was found to have an even greater effect on reducing systolic blood pressure than aerobic exercise.

Keywords: Coronary artery disease, physical activity, prevention, resistance training.

6528 Speech Data Compression using Vector Quantization

Authors: H. B. Kekre, Tanuja K. Sarode

Abstract:

Transforms, which are lossy algorithms, are mostly used for speech data compression. Such algorithms are tolerable for speech data compression since the loss in quality is not perceived by the human ear. However, vector quantization (VQ) has the potential to give more data compression while maintaining the same quality. In this paper we propose a speech data compression algorithm using the vector quantization technique. We have used the VQ algorithms LBG, KPE and FCG. The results table shows the computational complexity of these three algorithms. We also introduce a new performance parameter, the Average Fractional Change in Speech Sample (AFCSS). Our FCG algorithm gives far better performance than the others in terms of mean absolute error, AFCSS and complexity.

Keywords: Vector Quantization, Data Compression, Encoding, Speech Coding.

6527 A Holistic Workflow Modeling Method for Business Process Redesign

Authors: Heejung Lee

Abstract:

In a highly competitive environment, it becomes more important to shorten the whole business process while delivering or even enhancing the business value to customers and suppliers. Although workflow management systems receive much attention for their capacity to practically support business process enactment, effective workflow modeling methods remain challenging, and a high degree of process complexity makes it more difficult to achieve short lead times. This paper presents a workflow structuring method that holistically reduces process complexity using activity needs and formal concept analysis, and thereby enhances key performance measures such as quality, delivery, and cost in the business process.

Keywords: Workflow management, reengineering, formal concept analysis.

6526 Power Efficient OFDM Signals with Reduced Symbol's Aperiodic Autocorrelation

Authors: Ibrahim M. Hussain

Abstract:

Three new algorithms, based on minimizing the aperiodic autocorrelation of transmitted symbols and on the SLM approach, are proposed that are computationally less demanding. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which results in a reduction of the PAPR. The second algorithm generates multiple random sequences from the sequence produced by the first algorithm, all with the same autocorrelation value of 1; of these, the sequence with the minimum PAPR is transmitted. The third algorithm is an extension of the second and requires minimal side information to be transmitted. Multiple sequences are generated by modifying a fixed number of complex values in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving the minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that a significant reduction in PAPR is achieved using the proposed algorithms.

Keywords: Aperiodic autocorrelation, OFDM, PAPR, SLM, wireless communication.
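
The PAPR metric and the SLM-style selection among candidate sequences can be sketched in a few lines. The snippet below, with an assumed 256-subcarrier QPSK symbol and random candidate phase rotations, is only meant to show how the lowest-PAPR candidate would be picked; it is not the proposed autocorrelation-minimizing algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 256                                   # subcarriers, as in the simulation above

def papr_db(freq_symbols):
    x = np.fft.ifft(freq_symbols)         # OFDM time-domain signal
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# QPSK data on 256 subcarriers
data = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], N)

# SLM-style selection: generate U candidate phase-rotated sequences carrying the
# same data and transmit the one with the lowest PAPR.
U = 8
candidates = [data * np.exp(1j * rng.uniform(0, 2 * np.pi, N)) for _ in range(U)]
paprs = [papr_db(c) for c in candidates]
print(f"original PAPR: {papr_db(data):.2f} dB, best of {U} candidates: {min(paprs):.2f} dB")
```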

6525 Supervisor Controller-Based Colored Petri Nets for Deadlock Control and Machine Failures in Automated Manufacturing Systems

Authors: Husam Kaid, Abdulrahman Al-Ahmari, Zhiwu Li

Abstract:

This paper develops a robust deadlock control technique for shared and unreliable resources in automated manufacturing systems (AMSs) based on structural analysis and colored Petri nets, consisting of three steps. The first step uses strict minimal siphon control to create a live (deadlock-free) system that does not consider resource failures. The second step uses a colored Petri net-based approach in which all monitors designed in the first step are merged into a single monitor. The third step addresses the deadlock control problems caused by resource failures: for all resource failures in the Petri net model, a common recovery subnet based on colored Petri nets is proposed and added to the system obtained in the second step to make it reliable. The proposed approach is evaluated using an AMS from the literature. The results show that the proposed approach can be applied to an unreliable complex Petri net model, has a simpler structure and less computational complexity, and can obtain one common recovery subnet to model all resource failures.

Keywords: Automated manufacturing system, colored Petri net, deadlock, siphon.

6524 Effect of Reynolds Number and Concentration of Biopolymer (Gum Arabic) on Drag Reduction of Turbulent Flow in Circular Pipe

Authors: Kamaljit Singh Sokhal, Gangacharyulu Dasoraju, Vijaya Kumar Bulasara

Abstract:

Biopolymers are popular in many areas, such as petrochemicals, the food industry and agriculture, due to favorable properties such as environmental friendliness, availability, and cost. In this study, the biopolymer gum Arabic was used to determine its effect on the pressure drop at various concentrations (100 ppm – 300 ppm) and Reynolds numbers (10000 – 45000). A rheological study was also performed at the same concentrations to find the effect of the shear rate on the shear viscosity. Experiments were carried out to study the effect of injecting gum Arabic directly near the boundary layer and to investigate its effect on the maximum possible drag reduction. Experiments were performed on a test section with an inner diameter of 19.50 mm and a length of 3045 mm. The polymer solution was injected at the top of the test section using a peristaltic pump. The concentration of the polymer solution and the Reynolds number were used as parameters to obtain the maximum possible drag reduction. Water was circulated by a centrifugal pump with a maximum speed of 3000 rpm, and the flow rate was measured with a rotameter. Results were validated against Virk's maximum drag reduction asymptote. A maximum drag reduction of 62.15% was observed at the highest gum Arabic concentration, 300 ppm. The solution was circulated in a closed loop to find the effect of polymer degradation over a number of cycles on the drag reduction percentage. It was observed that injection of the polymer solution into the boundary layer gave better results than premixed solutions.

Keywords: Drag reduction, shear viscosity, gum Arabic, injection point.
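
The drag reduction percentage follows directly from the pressure drops measured with and without polymer at the same flow rate; a tiny sketch with illustrative (not measured) values:

```python
# Drag reduction from pressure drops at the same flow rate
# (illustrative numbers; the study reports up to 62.15% at 300 ppm).
dp_water = 12.5      # kPa over the 3045 mm test section, pure water
dp_polymer = 4.9     # kPa with gum Arabic solution injected

dr_percent = 100.0 * (dp_water - dp_polymer) / dp_water
print(f"drag reduction = {dr_percent:.1f} %")
```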

6523 Efficient CNC Milling by Adjusting Material Removal Rate

Authors: Majid Tolouei-Rad

Abstract:

This paper describes a combined mathematical-graphical approach to optimum tool path planning in order to improve machining efficiency. The methodology stabilizes machining operations by adjusting the material removal rate in pocket milling operations while keeping cutting forces within limits. This increases the life of the cutting tool and reduces the risk of tool breakage, machining vibration, and chatter. Case studies reveal that applying this approach can result in a slight increase in machining time; however, a considerable reduction in tooling cost, machining vibration, noise and chatter can be achieved, in addition to a better surface finish.

Keywords: CNC machines, milling, optimization, removal rate.
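
The quantity being adjusted is the material removal rate, MRR = axial depth × radial width × feed. The toy calculation below (illustrative numbers only) shows how raising the feed where the radial engagement drops keeps the MRR, and hence roughly the cutting load, constant.

```python
# Material removal rate in pocket milling: MRR = axial depth x radial width x feed.
def mrr(axial_depth_mm, radial_width_mm, feed_mm_per_min):
    return axial_depth_mm * radial_width_mm * feed_mm_per_min   # mm^3/min

# Illustrative values: doubling the feed at half engagement keeps MRR constant.
print(mrr(3.0, 10.0, 400.0))   # full-width cut
print(mrr(3.0, 5.0, 800.0))    # half engagement, doubled feed, same MRR
```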

6522 The Effects of Yield and Yield Components of Some Quality Increase Applications on Razakı Grape Variety

Authors: Şehri Çınar, Aydın Akın

Abstract:

This study was conducted on the Razakı grape variety (Vitis vinifera L.); the 19-year-old vines were grown on 5 BB rootstock during the 2014 vegetation period in Afyon province, Turkey. In this research, the effects of the applications Control (C), 1/3 Cluster Tip Reduction (1/3 CTR), Shoot Tip Reduction (STR), 1/3 CTR + STR, Boric Acid (BA), 1/3 CTR + BA, STR + BA, and 1/3 CTR + STR + BA on yield and yield components of the Razakı grape variety were investigated. The highest fresh grape yield (7.74 kg/vine) was obtained with the C application; the highest cluster weight (244.62 g) with the STR application; the highest 100-berry weight (504.08 g) with the C application; the highest maturity index (36.89) with the BA application; the highest must yield (695.00 ml) with the BA and 1/3 CTR + STR + BA applications; the highest L* color intensity (46.93 and 46.10) with the STR and 1/3 CTR + STR + BA applications; the highest a* color intensity (-5.37 and -5.01) with the 1/3 CTR + STR and STR applications; and the highest b* color intensity (12.59) with the STR application. Shoot tip reduction can be recommended to increase cluster weight, and boric acid application to increase the maturity index, of the Razakı grape variety.

Keywords: Razakı, 1/3 cluster tip reduction, shoot tip reduction, boric acid, yield and yield components.

6521 Blind Channel Estimation Based on URV Decomposition Technique for Uplink of MC-CDMA

Authors: Pradya Pornnimitkul, Suwich Kunaruttanapruk, Bamrung Tau Sieskul, Somchai Jitapunkul

Abstract:

In this paper, we investigate a blind channel estimation method for multi-carrier CDMA systems that uses a subspace decomposition technique. This technique exploits the orthogonality between the noise subspace and the received user codes to obtain the channel of each user. In the past, the singular value decomposition (SVD) was used, but the SVD has high computational complexity; in this paper a different algorithm, the URV decomposition, which serves as an intermediary between the QR decomposition and the SVD, replaces the SVD for tracking the noise subspace of the received data. The URV decomposition has almost the same estimation performance as the SVD but lower computational complexity.

Keywords: Channel estimation, MC-CDMA, SVD, URV.

6520 Monomial Form Approach to Rectangular Surface Modeling

Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong

Abstract:

Geometric modeling plays an important role in the construction and manufacturing of curve, surface and solid models. Its algorithms are critically important not only in the automobile, ship and aircraft manufacturing business, but also in a wide variety of modern applications, e.g., robotics, optimization, computer vision, data analytics and visualization. The calculation and display of geometric objects can be accomplished by six techniques: polynomial basis, recursive, iterative, coefficient matrix, polar form approach and pyramidal algorithms. In this research, the coefficient matrix (simply called the monomial form approach) is used to model polynomial rectangular patches, i.e., Said-Ball, Wang-Ball, DP, Dejdumrong and NB1 surfaces. Some examples of the monomial forms for these surface models are illustrated in many aspects, e.g., construction, derivatives, model transformation, degree elevation and degree reduction.

Keywords: Monomial form, rectangular surfaces, CAGD curves, monomial matrix applications.
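
A minimal sketch of the coefficient-matrix (monomial) form for a rectangular patch: once the basis-to-monomial conversion matrix M and the control net P are combined into C = M P M^T, every surface point is just U C V^T. The example below uses the cubic Bernstein (Bézier) basis for concreteness; the paper applies the same idea to the Said-Ball, Wang-Ball, DP, Dejdumrong and NB1 bases.

```python
import numpy as np

# Bernstein-to-monomial conversion matrix for a cubic patch (rows: u^0..u^3).
M = np.array([[ 1,  0,  0, 0],
              [-3,  3,  0, 0],
              [ 3, -6,  3, 0],
              [-1,  3, -3, 1]], dtype=float)

# Hypothetical 4x4 grid of control-point z-values (one coordinate shown).
P = np.arange(16, dtype=float).reshape(4, 4)

# Monomial ("coefficient matrix") form: S(u, v) = U C V^T with C precomputed once.
C = M @ P @ M.T

def surface_point(u, v):
    U = np.array([1.0, u, u**2, u**3])
    V = np.array([1.0, v, v**2, v**3])
    return U @ C @ V

print(surface_point(0.5, 0.5))
```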

6519 Gene Expression Data Classification Using Discriminatively Regularized Sparse Subspace Learning

Authors: Chunming Xu

Abstract:

Sparse representation, which can represent high-dimensional data effectively, has been successfully used in computer vision and pattern recognition problems. However, it does not consider the label information of data samples. To overcome this limitation, we develop a novel dimensionality reduction algorithm, namely discriminatively regularized sparse subspace learning (DR-SSL). The proposed DR-SSL algorithm not only uses sparse representation to model the data, but also effectively employs the label information to guide the dimensionality reduction procedure. In addition, the presented algorithm can effectively deal with the out-of-sample problem. The experiments on gene-expression datasets show that the proposed algorithm is an effective tool for dimensionality reduction and gene-expression data classification.

Keywords: sparse representation, dimensionality reduction, label information, sparse subspace learning, gene-expression data classification.

6518 Similarity Detection in Collaborative Development of Object-Oriented Formal Specifications

Authors: Fathi Taibi, Fouad Mohammed Abbou, Md. Jahangir Alam

Abstract:

The complexity of today's software systems makes collaborative development necessary to accomplish tasks. Frameworks are necessary to allow developers to perform their tasks independently yet collaboratively. Similarity detection is one of the major issues to consider when developing such frameworks. It allows developers to mine existing repositories when developing their own views of a software artifact, and it is necessary for identifying the correspondences between the views so that they can be merged and checked for consistency. Given the importance of the requirements specification stage in software development, this paper proposes a framework for collaborative development of object-oriented formal specifications, along with a similarity detection approach to support the creation, merging and consistency checking of specifications. The paper also explores the impact of using additional concepts on improving the matching results. Finally, the proposed approach is empirically evaluated.

Keywords: Collaborative Development, Formal methods, Object-Oriented, Similarity detection

6517 Invariant Characters of Tolerance Class and Reduction under Homomorphism in IIS

Authors: Chen Wu, Lijuan Wang

Abstract:

Some invariant properties of incomplete information system homomorphisms are studied in this paper. Conditions under which tolerance classes, attribute reductions, indispensable attributes and dispensable attributes remain invariant under homomorphism in incomplete information systems are revealed and discussed. The existence condition of an endohomomorphism on an incomplete information system is also explored. This establishes some theoretical foundations for further investigations of incomplete information systems in rough set theory, as for complete information systems.

Keywords: Attribute reduction, homomorphism, incomplete information system, rough set, tolerance relation.

6516 Efficient Boosting-Based Active Learning for Specific Object Detection Problems

Authors: Thuy Thi Nguyen, Nguyen Dang Binh, Horst Bischof

Abstract:

In this work, we present a novel active learning approach for learning a visual object detection system. Our system is composed of an active learning mechanism wrapped around a sub-algorithm that implements an online boosting-based object detector. At its core is a combination of a bootstrap procedure and a semi-automatic learning process based on online boosting. The idea is to exploit the availability of the classifier during learning to automatically label training samples and incrementally improve the classifier. This addresses the issue of reducing labeling effort while obtaining better performance. In addition, we propose a verification process for further improvement of the classifier, allowing re-updates on previously seen data during learning to stabilize the detector. The main contribution of this empirical study is a demonstration that active learning based on an online boosting approach trained in this manner can achieve results comparable to, or even outperforming, a framework trained in the conventional manner with much more labeling effort. Empirical experiments on challenging datasets for specific object detection problems show the effectiveness of our approach.

Keywords: Computer vision, object detection, online boosting, active learning, labeling complexity.

6515 A Novel Frequency Offset Estimation Scheme for OFDM Systems

Authors: Youngpo Lee, Seokho Yoon

Abstract:

In this paper, we propose a novel frequency offset estimation scheme for orthogonal frequency division multiplexing (OFDM) systems. By correlating the OFDM signals within the coherence phase bandwidth and employing a threshold in the frequency offset estimation process, the proposed scheme is not only robust to the timing offset but also has a reduced complexity compared with that of the conventional scheme. Moreover, a timing offset estimation scheme is also proposed as the next stage of the proposed frequency offset estimation. Numerical results show that the proposed scheme can estimate frequency offset with lower computational complexity and does not require additional memory while maintaining the same level of estimation performance.

Keywords: OFDM, frequency offset estimation, threshold.
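
For context, a classic correlation-based frequency offset estimator (the Schmidl-Cox half-symbol correlation, not necessarily the scheme proposed in the paper) can be sketched as follows: the phase of the correlation between two identical halves of a training symbol gives the offset in subcarrier spacings.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 256
eps_true = 0.12                       # frequency offset in subcarrier spacings

# Training symbol with two identical halves (repeated QPSK block).
half = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], N // 2) / np.sqrt(2)
s = np.concatenate([half, half])

n = np.arange(N)
r = s * np.exp(2j * np.pi * eps_true * n / N)                        # apply the offset
r = r + 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))  # add noise

# Correlate the two halves; the phase of the correlation reveals the offset.
P = np.sum(np.conj(r[: N // 2]) * r[N // 2:])
eps_hat = np.angle(P) / np.pi
print(f"true offset {eps_true:.3f}, estimated {eps_hat:.3f}")
```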

6514 A New Fuzzy Mathematical Model in Recycling Collection Networks: A Possibilistic Approach

Authors: B. Vahdani, R. Tavakkoli-Moghaddam, A. Baboli, S. M. Mousavi

Abstract:

Focusing on environmental issues, including the reduction of scrap and consumer residuals, along with benefiting from the economic value during the life cycle of goods/products, leads companies to an important competitive approach. The aim of this paper is to present a new mixed nonlinear facility location-allocation model for recycling collection networks that considers multiple echelons, suppliers, collection centers and facilities in the recycling network. To make appropriate decisions in reality, demands, returns, capacities, costs and distances are regarded as uncertain in our model. For this purpose, a fuzzy mathematical programming-based possibilistic approach from the recent literature is introduced as the solution methodology to solve the proposed mixed nonlinear programming model (MNLP). Computational experiments are provided to illustrate the applicability of the designed model in a supply chain environment and to help decision makers facilitate their analyses.

Keywords: Location-allocation model, recycling collection networks, fuzzy mathematical programming.

6513 A Hybrid Approach for Quantification of Novelty in Rule Discovery

Authors: Vasudha Bhatnagar, Ahmed Sultan Al-Hegami, Naveen Kumar

Abstract:

Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although one of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper we study the novelty of discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses objective and subjective measures to quantify the novelty of discovered rules in terms of their deviations from known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with some public datasets, and the experimental results are quite promising.

Keywords: Knowledge Discovery in Databases (KDD), Data Mining, Rule Discovery, Interestingness, Subjective Measures, Novelty Measure.

6512 A Life Cycle Assessment (LCA) of Aluminum Production Process

Authors: Alaa Al Hawari, Mohammad Khader, Wael El Hasan, Mahmoud Alijla, Ammar Manawi, Abdelbaki Benamour

Abstract:

The production of aluminum alloys and ingots – starting from the processing of alumina into aluminum through to the final cast product – was studied using a Life Cycle Assessment (LCA) approach. The studied aluminum supply chain consisted of a carbon plant, a reduction plant, a casting plant, and a power plant. In the LCA model, the environmental loads of the different plants for the production of 1 ton of aluminum metal were investigated, and the impact of aluminum production was assessed in eight impact categories. The results showed that the power plant had the highest impact in all of the impact categories, except for Human Toxicity Potential (HTP), where the reduction plant had the highest impact, and Marine Aquatic Eco-Toxicity Potential (MAETP), where the carbon plant had the highest impact. Furthermore, the combined impact of the carbon plant and the reduction plant was almost the same as the impact of the power plant for the Acidification Potential (AP). The carbon plant had a positive impact on the environment with respect to the Eutrophication Potential (EP) due to the production of clean water in the process. The natural gas based power plant used in the case study had 8.4 times less negative impact on the environment than a heavy fuel based power plant and 10.7 times less than a hard coal based power plant.

Keywords: Life cycle assessment, aluminum production, Supply chain.

6511 An Approach to Secure Mobile Agent Communication in Multi-Agent Systems

Authors: Olumide Simeon Ogunnusi, Shukor Abd Razak, Michael Kolade Adu

Abstract:

The inter-agent communication manager facilitates communication among mobile agents via a message-passing mechanism. Until now, all Foundation for Intelligent Physical Agents (FIPA) compliant agent systems have been capable of exchanging messages following the standard format for sending and receiving messages. Previous works tend to secure messages exchanged among a community of collaborative agents commissioned to perform specific tasks by using cryptosystems. However, that approach is characterized by computational complexity due to the encryption and decryption processes required at the two ends. The proposed approach to secure agent communication allows only agents that are created by the host agent server to communicate via the agent communication channel provided by the host agent platform; these agents are assumed to be harmless. Therefore, to secure the communication of legitimate agents from intrusion by external agents, a two-phase policy enforcement system was developed. The first phase constrains an external agent to run only on the network server, while the second phase confines the activities of the external agent to its execution environment. To implement the proposed policy, a controller agent is charged with screening any external agent entering the local area network and preventing it from migrating to the agent execution host where the legitimate agents are running. On arrival of the external agent at the host network server, an introspector agent monitors and restrains its activities. This approach secures legitimate agent communication from man-in-the-middle and replay attacks.

Keywords: Agent communication, introspective agent, isolation of agent, policy enforcement system.

6510 Gender Based Variability Time Series Complexity Analysis

Authors: Ramesh K. Sunkaria, Puneeta Marwaha

Abstract:

Nonlinear methods of heart rate variability (HRV) analysis are becoming more popular. It has been observed that complexity measures quantify the regularity and uncertainty of cardiovascular RR-interval time series. In the present work, SampEn has been evaluated in healthy normal sinus rhythm (NSR) male and female subjects for different data lengths and tolerance levels r. It is demonstrated that SampEn is small for higher values of the tolerance r. Also, the SampEn value of the healthy female group is higher than that of the healthy male group for short data lengths; with increasing data length the two groups overlap and become difficult to distinguish. SampEn gives inaccurate results by assigning a higher value to the female group, even though male subjects have a more complex HRV pattern than female subjects. Therefore, this traditional algorithm exhibits higher complexity for healthy female subjects than for healthy male subjects, which is a misleading observation. This may be because SampEn does not account for the multiple time scales inherent in physiologic time series, and the hidden spatial and temporal fluctuations remain unexplored.

Keywords: Heart rate variability, normal sinus rhythm group, RR interval time series, sample entropy.
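
SampEn(m, r) as used above is well defined: count template matches of length m and m+1 within tolerance r (Chebyshev distance, self-matches excluded) and take the negative logarithm of their ratio. A straightforward reference sketch, run here on a synthetic RR-like series rather than real HRV data:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) of a 1-D series; r is a fraction of the standard deviation,
    Chebyshev distance, self-matches excluded (the standard definition)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    tol = r * x.std()

    def count_matches(length):
        # Use the same number (N - m) of templates for both lengths.
        templates = np.array([x[i:i + length] for i in range(N - m)])
        count = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= tol)
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

# Illustrative use on a synthetic RR-interval-like series (seconds).
rng = np.random.default_rng(5)
rr = 0.8 + 0.05 * rng.standard_normal(1000)
print(round(sample_entropy(rr, m=2, r=0.2), 3))
```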

6509 Fuzzy Population-Based Meta-Heuristic Approaches for Attribute Reduction in Rough Set Theory

Authors: Mafarja Majdi, Salwani Abdullah, Najmeh S. Jaddi

Abstract:

One of the global combinatorial optimization problems in machine learning is feature selection. It is concerned with removing irrelevant, noisy, and redundant data while keeping the original meaning of the data. Attribute reduction in rough set theory is an important feature selection method. Since attribute reduction is an NP-hard problem, it is necessary to investigate fast and effective approximate algorithms. In this paper, we propose two feature selection mechanisms based on memetic algorithms (MAs), which combine the genetic algorithm with a fuzzy record-to-record travel algorithm and a fuzzy controlled great deluge algorithm, to identify a good balance between local search and genetic search. To verify the proposed approaches, numerical experiments are carried out on thirteen datasets. The results show that the MA approaches are efficient in solving attribute reduction problems when compared with other metaheuristic approaches.

Keywords: Rough Set Theory, Attribute Reduction, Fuzzy Logic, Memetic Algorithms, Record to Record Algorithm, Great Deluge Algorithm.
