Search results for: Weighted Class Complexity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2001

1761 An Efficient Multi Join Algorithm Utilizing a Lattice of Double Indices

Authors: Hanan A. M. Abd Alla, Lilac A. E. Al-Safadi

Abstract:

In this paper, a novel multi-join algorithm for joining multiple relations is introduced. The algorithm is based on a hash-based join of two relations that produces a double index. This is done by scanning the two relations once; but instead of moving the records into buckets, a double index is built, which eliminates the collisions that can occur in a complete hash algorithm. The double index is divided into join buckets of similar categories from the two relations. The algorithm then joins buckets with similar keys to produce joined buckets, leading at the end to a complete join index of the two relations without actually joining the relations themselves. The time complexity required to build the join index of two categories is O(m log m), where m is the size of each category, giving a total time complexity of O(n log m) over all buckets. The join index is used to materialize the joined relation if required; otherwise, it is used along with the join indices of other relations to build a lattice for multi-join operations with minimal I/O requirements. The lattice of join indices can be fitted into main memory to reduce the time complexity of the multi-join algorithm.
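
As a rough illustration of the double-index step for two relations, the sketch below (hypothetical field names, not the authors' implementation) records the row ids from both relations per join key in one scan each and then emits a join index of row-id pairs without materializing the joined tuples.

```python
# Minimal sketch of the double-index idea: one scan of each relation records,
# per join key, the row ids from R and S; pairing them yields a join index
# (row-id pairs) without materializing the joined tuples.
from collections import defaultdict

def build_join_index(R, S, key_r, key_s):
    """R, S: lists of dicts; key_r/key_s: join attribute names (hypothetical API)."""
    double_index = defaultdict(lambda: ([], []))   # key -> ([rows of R], [rows of S])
    for i, rec in enumerate(R):
        double_index[rec[key_r]][0].append(i)
    for j, rec in enumerate(S):
        double_index[rec[key_s]][1].append(j)
    # Join buckets with matching keys into a join index of (R row id, S row id) pairs.
    return [(i, j) for r_ids, s_ids in double_index.values() for i in r_ids for j in s_ids]

R = [{"id": 1, "dept": "A"}, {"id": 2, "dept": "B"}]
S = [{"emp": 10, "dept": "A"}, {"emp": 11, "dept": "A"}]
print(build_join_index(R, S, "dept", "dept"))   # [(0, 0), (0, 1)]
```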

Keywords: Multi join, Relation, Lattice, Join indices.

1760 Estimation of Component Reusability through Reusability Metrics

Authors: Aditya Pratap Singh, Pradeep Tomar

Abstract:

Software reusability is an essential characteristic of Component-Based Software (CBS). Component reusability is an important measure for the effective reuse of components in CBS. The attributes of reusability proposed by various researchers are studied, and four of them are identified as potential factors affecting reusability. This paper proposes a metric for reusability estimation of black-box software components along with metrics for Interface Complexity, Understandability, Customizability and Reliability. An experiment is performed to estimate reusability through a case study on a sample web application using a real-world component.
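
The abstract names the four contributing factors but not the aggregation itself; the toy sketch below simply combines hypothetical 0-1 factor scores with equal weights to illustrate the kind of composite estimate involved. All names and weights are assumptions, not the authors' metric.

```python
# Hypothetical aggregation of the four factor scores into a single reusability
# estimate; equal weights and a 0-1 scale are assumed purely for illustration.
def reusability(interface_complexity, understandability, customizability, reliability,
                weights=(0.25, 0.25, 0.25, 0.25)):
    # Higher interface complexity lowers reusability, so it enters inverted.
    factors = (1.0 - interface_complexity, understandability, customizability, reliability)
    return sum(w * f for w, f in zip(weights, factors))

print(round(reusability(0.3, 0.8, 0.6, 0.9), 3))  # 0.75
```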

Keywords: Component-based software, component reusability, customizability, interface complexity, reliability, understandability.

1759 Model Predictive Control with Unscented Kalman Filter for Nonlinear Implicit Systems

Authors: Takashi Shimizu, Tomoaki Hashimoto

Abstract:

A class of implicit systems is known to be a more general class of systems than the class of explicit systems. To establish a control method for such a generalized class of systems, we adopt model predictive control, a kind of optimal feedback control with a performance index that has a moving initial time and terminal time. However, model predictive control is inapplicable to systems whose state variables are not all exactly known; in other words, it is inapplicable to systems with limited measurable states. In practice, the state variables of a system are usually measured through its outputs, so only limited parts of them can be used directly, and the output signals are disturbed by process and sensor noise. Hence, it is important to establish a state estimation method for nonlinear implicit systems that takes process noise and sensor noise into consideration. For this purpose, we apply the model predictive control method and the unscented Kalman filter to the optimization and estimation problems of nonlinear implicit systems, respectively. The objective of this study is to establish a model predictive control with unscented Kalman filter for nonlinear implicit systems.
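
At the core of the unscented Kalman filter is the unscented transform; a minimal sketch is given below for a generic nonlinear map, with kappa and the example map chosen arbitrarily for illustration. It is not the authors' implicit-system formulation.

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    """Propagate (mean, cov) through a nonlinear map f via 2n+1 sigma points."""
    n = mean.size
    S = np.linalg.cholesky((n + kappa) * cov)          # square root of scaled covariance
    sigma = np.vstack([mean, mean + S.T, mean - S.T])  # sigma points as rows
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    y = np.array([f(p) for p in sigma])                # propagate each sigma point
    y_mean = w @ y
    y_cov = sum(wi * np.outer(yi - y_mean, yi - y_mean) for wi, yi in zip(w, y))
    return y_mean, y_cov

f = lambda x: np.array([np.sin(x[0]), x[0] * x[1]])    # toy nonlinear map
m, P = unscented_transform(np.array([0.5, 1.0]), np.eye(2) * 0.01, f)
print(m, P)
```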

Keywords: Model predictive control, unscented Kalman filter, nonlinear systems, implicit systems.

1758 Spatial Pattern and GIS-Based Model for Risk Assessment – A Case Study of Dusit District, Bangkok

Authors: Morakot Worachairungreung

Abstract:

The objectives of the research are to study the spatial pattern of fire locations and to develop techniques for applying a Geographic Information System (GIS) to fire risk assessment for fire planning and management. Fire risk assessment was based on two groups of factors: vulnerability factors, such as building material type, building height and building density, and mitigation-capacity factors, such as accessibility by road, distance to the nearest fire station and distance to hydrants. Factor ratings were obtained from four groups of stakeholders: firemen, city planners, local government officers and local residents. The factors were converted into GIS raster data and superimposed to prepare a fire risk map of the area showing levels of fire risk ranging from high to low. The level of fire risk was obtained from the weighted mean of the factors, with the weight of each factor derived from the stakeholders by Analytical Hierarchy Analysis.
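
A minimal sketch of the weighted-mean raster overlay, assuming the factor rasters are already scored on a common risk scale; the factor names and weights are illustrative, not the study's results.

```python
import numpy as np

# Hypothetical AHP-style weights for four raster factors (illustrative values);
# each raster is assumed to be pre-scored on a common 1-5 risk scale.
weights = {"building_material": 0.35, "building_height": 0.15,
           "road_access": 0.30, "distance_to_hydrant": 0.20}

rng = np.random.default_rng(0)
rasters = {name: rng.integers(1, 6, size=(4, 4)) for name in weights}   # stand-in grids

# Weighted-mean overlay: cell-by-cell sum of weight * factor score.
risk = sum(w * rasters[name] for name, w in weights.items())
print(np.round(risk, 2))   # higher values indicate higher fire risk
```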

Keywords: Fire Risk Assessment, Geographic Information System (GIS), Raster Analysis, Analytical Hierarchy Analysis.

1757 Parametric Design as an Approach to Respond to Complexity

Authors: Sepideh Jabbari Behnam, Zahrasadat Saide Zarabadi

Abstract:

A city is an intertwined texture formed by the relationships of its different components within a whole that is united as one, so designing and planning this complex whole is not an easy matter. Considering that a city is a complex system with a vast number of components and communications, providing flexible layouts that can respond to the unpredictable character of the city, which is a result of its complexity, is inevitable. The parametric design approach, as a new approach, can produce flexible and transformable layouts at any stage of design. This study aims to introduce parametric design as a modern approach to responding to complex urban issues, using descriptive and analytical methods. The paper first introduces complex systems and then gives a brief account of their characteristics. Flexibility of design and layout is another matter that should be considered in responding to and simulating complex urban systems, and it is discussed in this study. In this regard, after describing the nature of the parametric approach as a flexible approach and as an appropriate tool for responding to features such as limited predictability, reciprocal relations, complex communications, sensitivity to initial conditions and hierarchy, the paper introduces parametric design.

Keywords: Complexity theory, complex system, flexibility, parametric design.

1756 A Method under Uncertain Information for the Selection of Students in Interdisciplinary Studies

Authors: José M. Merigó, Pilar López-Jurado, M. Carmen Gracia, Montserrat Casanovas

Abstract:

We present a method for the selection of students in interdisciplinary studies based on the hybrid averaging operator. We assume that the information available in the problem is uncertain, so it is necessary to use interval numbers. Therefore, we suggest a new type of hybrid aggregation called the uncertain induced generalized hybrid averaging (UIGHA) operator. It is an aggregation operator that considers the weighted average (WA) and the ordered weighted averaging (OWA) operator in the same formulation. Therefore, we are able to consider the degree of optimism of the decision maker and grades of importance in the same approach. By using interval numbers, we are able to represent the information considering the best and worst possible results, so the decision maker gets a more complete view of the decision problem. We develop an illustrative example of the proposed scheme for the selection of students in interdisciplinary studies. We see that with the use of the UIGHA operator we get a more complete representation of the selection problem, and the decision maker is able to consider a wide range of alternatives depending on his interests. We also outline other potential applications of the UIGHA operator in educational problems concerning the selection of different types of resources, such as students and professors.
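
The exact UIGHA formulation is not reproduced in the abstract; the sketch below shows only a simplified hybrid of a weighted average and an OWA aggregation over interval scores, with all weights and the mixing parameter chosen for illustration.

```python
# Simplified interval hybrid averaging: combine a weighted average (importance
# weights) with an OWA aggregation (ordering weights) of interval scores.
# The mixing parameter beta and all weights below are illustrative assumptions.
def interval_wa(scores, weights):
    lo = sum(w * a for w, (a, b) in zip(weights, scores))
    hi = sum(w * b for w, (a, b) in zip(weights, scores))
    return (lo, hi)

def interval_owa(scores, owa_weights):
    # Order intervals from best to worst by their midpoint, then weight by position.
    ordered = sorted(scores, key=lambda ab: -(ab[0] + ab[1]) / 2)
    return interval_wa(ordered, owa_weights)

def hybrid(scores, weights, owa_weights, beta=0.5):
    wa, owa = interval_wa(scores, weights), interval_owa(scores, owa_weights)
    return tuple(beta * x + (1 - beta) * y for x, y in zip(wa, owa))

student = [(60, 70), (75, 85), (50, 65)]        # interval marks on three criteria
print(hybrid(student, [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]))
```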

Keywords: Decision making, Selection of students, Uncertainty, Aggregation operators.

1755 An Optimal Unsupervised Satellite Image Segmentation Approach Based on Pearson System and k-Means Clustering Algorithm Initialization

Authors: Ahmed Rekik, Mourad Zribi, Ahmed Ben Hamida, Mohamed Benjelloun

Abstract:

This paper presents an optimal and unsupervised satellite image segmentation approach based on the Pearson system and k-means clustering algorithm initialization. The method can be considered original in that it uses the k-means clustering algorithm for an optimal initialisation of the number of image classes on the one hand, and exploits the Pearson system for an optimal assignment of statistical distributions to each considered class on the other hand. Satellite image exploitation requires the use of different approaches, especially those founded on the principle of unsupervised statistical segmentation. Such approaches require the definition of several parameters, such as the number of image classes, the estimation of class variables and generalised mixture distributions. The use of statistical image attributes gives convincing and promising results, provided that the initialisation step is optimal and the statistical distributions are appropriately assigned. The Pearson system, associated with a k-means clustering algorithm and the Stochastic Expectation-Maximization (SEM) algorithm, can be adapted to this problem. For each image class, the Pearson system assigns one distribution type according to different parameters, especially the skewness (β1) and the kurtosis (β2). The adapted algorithms, namely the k-means clustering algorithm, the SEM algorithm and the Pearson system algorithm, are then applied to the satellite image segmentation problem. The efficiency of the combined algorithms was validated first by the Mean Quadratic Error (MQE) and second by visual inspection across several comparisons of the unsupervised image segmentations.
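
A minimal sketch of the initialisation idea: partition the data with k-means and compute Pearson's β1 and β2 per class. The mapping from (β1, β2) to a concrete Pearson distribution type is left as a placeholder, and the synthetic grey-level data are stand-ins.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
pixels = np.concatenate([rng.normal(40, 5, 2000),     # stand-in for image grey levels
                         rng.gamma(4.0, 10.0, 2000)])

k = 2                                                 # assumed class count
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pixels.reshape(-1, 1))

for c in range(k):
    x = pixels[labels == c]
    beta1 = skew(x) ** 2                              # Pearson's beta_1
    beta2 = kurtosis(x, fisher=False)                 # Pearson's beta_2 (non-excess)
    # A full implementation would map (beta1, beta2) to a Pearson distribution type here.
    print(f"class {c}: beta1={beta1:.3f}, beta2={beta2:.3f}")
```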

Keywords: Unsupervised classification, Pearson system, Satellite image, Segmentation.

1754 Multi Switched Split Vector Quantization of Narrowband Speech Signals

Authors: M. Satya Sai Ram, P. Siddaiah, M. Madhavi Latha

Abstract:

Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called Multi Switched Split Vector Quantization (MSSVQ), which is a hybrid of multi-stage, switched and split vector quantization techniques. The spectral distortion performance, computational complexity and memory requirements of MSSVQ are compared with those of split vector quantization (SVQ), multi-stage vector quantization (MSVQ) and switched split vector quantization (SSVQ). The results show that MSSVQ has better spectral distortion performance, lower computational complexity and lower memory requirements than all of the above product code vector quantization techniques. Computational complexity is measured in floating point operations (flops), and memory requirements are measured in floats.
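
The split-VQ building block can be sketched as follows, with illustrative split sizes and random codebooks; the switch and multi-stage layers that complete MSSVQ are omitted.

```python
import numpy as np

def split_vq_encode(vec, codebooks, splits):
    """Quantize each sub-vector of `vec` with its own codebook (split VQ only;
    the switch and multi-stage layers of MSSVQ are not shown)."""
    indices, out = [], []
    start = 0
    for size, cb in zip(splits, codebooks):
        sub = vec[start:start + size]
        idx = int(np.argmin(np.sum((cb - sub) ** 2, axis=1)))   # nearest codeword
        indices.append(idx)
        out.append(cb[idx])
        start += size
    return indices, np.concatenate(out)

rng = np.random.default_rng(0)
lsf = np.sort(rng.uniform(0, np.pi, 10))             # a stand-in 10-dimensional LSF vector
splits = [3, 3, 4]                                   # 3-3-4 split (illustrative)
codebooks = [rng.uniform(0, np.pi, (16, s)) for s in splits]
idx, quantized = split_vq_encode(lsf, codebooks, splits)
print(idx)
```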

Keywords: Linear Predictive Coding, Multi stage vector quantization, Switched split vector quantization, Split vector quantization, Line Spectral Frequencies (LSF).

1753 Facial Expressions Recognition from Complex Background using Face Context and Adaptively Weighted sub-Pattern PCA

Authors: Md. Zahangir Alom, Mei-Lan Piao, Md. Ashraful Alam, Nam Kim, Jae-Hyeung Park

Abstract:

A new approach for facial expression recognition based on face context and adaptively weighted sub-pattern PCA (Aw-SpPCA) is presented in this paper. The facial region and other parts of the body are segmented from the complex environment based on a skin colour model. An algorithm is proposed for accurate detection of the face region from the segmented image based on the constant ratio of face height to width (δ = 1.618). The paper also discusses a new concept for detecting the eye and mouth positions. The desired part of the face is cropped to analyse the expression of a person. Unlike PCA operating on a whole image pattern, Aw-SpPCA operates directly on sub-patterns partitioned from the original whole pattern and extracts features from them separately. Aw-SpPCA can adaptively compute the contribution of each part to the classification task in order to enhance robustness to both expression and illumination variations. Experiments on a standard single-face database with five types of facial expression show that the proposed method is competitive.

Keywords: Aw-SpPCA, Expression Recognition, Face Context, Face Detection, PCA.

1752 Influence of City Environment to the Regional Development in Baltic Countries

Authors: Ilze Stokmane

Abstract:

Economic processes underway in a country directly and indirectly affect the welfare of the people and the social environment, starting with job security and having a direct impact on the quality and safety of the living environment.

The paper describes the existing situation and analyses how regional development policy is determined and implemented in all three Baltic countries. According to statistical indicators, there are differences in the implementation of regional development activities between the Baltic countries and between the regions inside each country.

The differences between regions in Latvia, Lithuania and Estonia are analysed in more detail in order to evaluate the success of development processes in the regions of the Baltic countries. Descriptive analysis of documents and of statistical indicators at the national and regional levels was used in the research.

Keywords: Baltic countries, city environment, regional development, urban areas.

1751 Fuzzy Multiple Criteria Decision Making for Unmanned Combat Aircraft Selection Using Proximity Measure Method

Authors: C. Ardil

Abstract:

Intuitionistic fuzzy sets (IFS), Pythagorean fuzzy sets (PyFS), picture fuzzy sets (PFS), q-rung orthopair fuzzy sets (q-ROF), spherical fuzzy sets (SFS), T-spherical fuzzy sets, and neutrosophic sets (NS) are reviewed as multidimensional extensions of fuzzy sets that describe the opinions of decision-making experts under uncertainty more explicitly and informatively. To handle operations with standard fuzzy sets (SFS), the necessary operators are defined: the weighted arithmetic mean (WAM), the weighted geometric mean (WGM), and the Minkowski distance function. The algorithm of the proposed proximity measure method (PMM) is provided as a multiple criteria group decision making (MCDM) method for use in a standard fuzzy set environment. To demonstrate the feasibility of the proposed method, the problem of selecting the best drone for an Air Force procurement request is used. The proximity measure method (PMM) based on multidimensional standard fuzzy sets (SFS) is thus introduced through a problem involving unmanned combat aircraft selection.
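
The three operators named in the abstract are easy to state directly; the scores, criterion weights and ideal alternative below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def wam(x, w):                       # weighted arithmetic mean
    return float(np.dot(w, x))

def wgm(x, w):                       # weighted geometric mean
    return float(np.prod(np.asarray(x, dtype=float) ** np.asarray(w)))

def minkowski(x, y, w, p=2):         # weighted Minkowski distance of order p
    return float(np.sum(w * np.abs(np.asarray(x) - np.asarray(y)) ** p) ** (1.0 / p))

# Illustrative membership scores of one aircraft on three criteria, with
# criterion weights and an ideal alternative (all numbers are assumptions).
scores, weights, ideal = [0.7, 0.9, 0.6], [0.5, 0.3, 0.2], [1.0, 1.0, 1.0]
print(wam(scores, weights), wgm(scores, weights), minkowski(scores, ideal, weights))
```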

Keywords: standard fuzzy sets (SFS), unmanned combat aircraft selection, multiple criteria decision making (MCDM), proximity measure method (PMM).

1750 Low Complexity Peak-to-Average Power Ratio Reduction in Orthogonal Frequency Division Multiplexing System by Simultaneously Applying Partial Transmit Sequence and Clipping Algorithms

Authors: V. Sudha, D. Sriram Kumar

Abstract:

Orthogonal Frequency Division Multiplexing (OFDM) has been used in many advanced wireless communication systems due to its high spectral efficiency and robustness to frequency-selective fading channels. However, the major concern with OFDM systems is the high peak-to-average power ratio (PAPR) of the transmitted signal. Two popular techniques for PAPR reduction in OFDM systems are conventional partial transmit sequences (CPTS) and clipping. In this paper, a parallel combination/hybrid scheme for PAPR reduction using the clipping and CPTS algorithms is proposed. The proposed method applies both algorithms in a way that reduces both PAPR and computational complexity. The scheme slightly degrades bit error rate (BER) performance due to the clipping operation; this degradation can be limited by selecting an appropriate value of the clipping ratio (CR). The simulation results show that the proposed algorithm achieves significant PAPR reduction with much reduced computational complexity.
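
A minimal sketch of the clipping half of the scheme, assuming the clipping ratio is defined relative to the RMS envelope; the PTS phase search is omitted and the QPSK symbol is a stand-in.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

def clip(x, cr_db):
    """Clip the envelope at CR (dB) above the RMS level, keeping the phase."""
    a_max = np.sqrt(np.mean(np.abs(x) ** 2)) * 10 ** (cr_db / 20)
    mag = np.minimum(np.abs(x), a_max)
    return mag * np.exp(1j * np.angle(x))

rng = np.random.default_rng(0)
N = 256                                            # number of subcarriers (illustrative)
X = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)   # QPSK symbols
x = np.fft.ifft(X) * np.sqrt(N)                    # OFDM time-domain symbol
print(papr_db(x), papr_db(clip(x, cr_db=3.0)))     # PAPR before/after clipping
```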

Keywords: CCDF, OFDM, PAPR, PTS.

1749 Modeling and Analysis for Effective Capacity of a Cross-Layer Optimized Wireless Networks

Authors: Reham A. El-mayet, Hesham M. El-Badawy, Salwa H. Elramly

Abstract:

New-generation mobile communication networks have the ability to support triple play. For this purpose, Orthogonal Frequency Division Multiplexing (OFDM) access techniques have been chosen to extend the system's capability for high-data-rate networks. Many cross-layer modeling and optimization schemes for the Quality of Service (QoS) and capacity of downlink multiuser OFDM systems have been proposed. In this paper, Maximum Weighted Capacity (MWC) based resource allocation at the physical (PHY) layer is used. This resource allocation scheme provides much better QoS than previous schemes, while maintaining the highest or nearly highest capacity at similar complexity. In addition, Delay Satisfaction (DS) scheduling at the Medium Access Control (MAC) layer, which allows more than one connection to be served in each slot, is used. This scheduling technique is more efficient than conventional scheduling for investigating both the number of users and the number of subcarriers against system capacity. The system is optimized for different operational environments: both outdoor and indoor deployment scenarios are investigated, as well as different channel models. In addition, the effective capacity approach [1] is used not only to provide QoS for different mobile users, but also to increase the total throughput of the wireless network.
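
To illustrate the maximum-weighted-capacity idea, the sketch below greedily assigns each subcarrier to the user with the largest weight-times-rate product; the paper's exact MWC formulation and the DS scheduler are not reproduced, and all channel numbers and QoS weights are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n_users, n_subcarriers = 4, 16
gains = rng.rayleigh(1.0, (n_users, n_subcarriers))        # per-user channel gains
snr = 10.0
rates = np.log2(1 + snr * gains ** 2)                      # achievable rate per subcarrier
qos_weights = np.array([1.0, 1.5, 0.8, 1.2])               # illustrative QoS weights

# Greedy weighted-capacity rule: each subcarrier goes to the user
# maximising weight * rate on that subcarrier.
assignment = np.argmax(qos_weights[:, None] * rates, axis=0)
per_user_capacity = [rates[u, assignment == u].sum() for u in range(n_users)]
print(assignment, np.round(per_user_capacity, 2))
```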

Keywords: Cross-layer, effective capacity, LTE, OFDM, QoS, resource allocation, wireless networks.

1748 Enhanced Face Recognition with Daisy Descriptors Using 1BT Based Registration

Authors: Sevil Igit, Merve Meric, Sarp Erturk

Abstract:

In this paper, it is proposed to improve DAISY descriptor based face recognition using a novel One-Bit Transform (1BT) based pre-registration approach. The 1BT based pre-registration procedure is fast and has low computational complexity. It is shown that face recognition accuracy is improved with the proposed approach, which enables highly accurate face recognition using the DAISY descriptor with simple matching while keeping the overall complexity low.
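
A minimal sketch of the pre-registration idea: a one-bit transform (here simply thresholding against a local mean, an assumed filter choice rather than the authors') followed by an exhaustive integer-shift search minimising the XOR mismatch. The DAISY matching stage is not shown.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def one_bit_transform(img, k=9):
    """1 where the pixel exceeds its local mean (an assumed 1BT filter choice)."""
    return (img > uniform_filter(img.astype(float), size=k)).astype(np.uint8)

def register(ref_bt, tgt_bt, max_shift=4):
    """Exhaustive search for the integer shift minimising the XOR mismatch count."""
    best, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(tgt_bt, dy, axis=0), dx, axis=1)
            cost = int(np.count_nonzero(ref_bt ^ shifted))
            if best is None or cost < best:
                best, best_shift = cost, (dy, dx)
    return best_shift

rng = np.random.default_rng(0)
face = rng.integers(0, 256, (64, 64))
moved = np.roll(face, (2, -3), axis=(0, 1))                 # simulated misalignment
print(register(one_bit_transform(face), one_bit_transform(moved)))   # expected near (-2, 3)
```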

Keywords: Face Recognition, Daisy Descriptor, One-Bit Transform, Image Registration.

1747 Unsupervised Outlier Detection in Streaming Data Using Weighted Clustering

Authors: Yogita, Durga Toshniwal

Abstract:

Outlier detection in streaming data is very challenging because streaming data cannot be scanned multiple times and new concepts may keep evolving. Irrelevant attributes can be termed noisy attributes, and such attributes further magnify the challenge of working with data streams. In this paper, we propose an unsupervised outlier detection scheme for streaming data. The scheme is based on clustering, since clustering is an unsupervised data mining task that does not require labeled data; both density-based and partitioning clustering are combined for outlier detection. Partitioning clustering is also used to assign weights to attributes depending upon their respective relevance, and the weights are adaptive. Weighted attributes help to reduce or remove the effect of noisy attributes. Keeping in view the challenges of streaming data, the proposed scheme is incremental and adaptive to concept evolution. Experimental results on synthetic and real-world data sets show that our proposed approach outperforms an existing approach (CORM) in terms of outlier detection rate and false alarm rate at increasing percentages of outliers.
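
One way to realise the attribute-weighting idea is to weight each attribute inversely to its within-cluster dispersion, so noisy attributes contribute less to the outlier score; the sketch below does this in a batch setting with synthetic data and is not the paper's incremental scheme.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (300, 2)), rng.normal(6, 1, (300, 2))])
X = np.hstack([X, rng.uniform(-10, 10, (600, 1))])          # third column = noisy attribute

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
# Attribute weights: inverse of mean within-cluster variance per attribute,
# normalised to sum to 1, so irrelevant (high-variance) attributes are down-weighted.
within_var = np.mean([X[km.labels_ == c].var(axis=0) for c in range(2)], axis=0)
w = (1.0 / within_var) / np.sum(1.0 / within_var)

centres = km.cluster_centers_
d = np.sqrt(np.min(((X[:, None, :] - centres) ** 2 * w).sum(axis=2), axis=1))
outliers = d > np.percentile(d, 99)                          # flag the top 1% as outliers
print(w.round(3), int(outliers.sum()))
```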

Keywords: Concept Evolution, Irrelevant Attributes, Streaming Data, Unsupervised Outlier Detection.

1746 Determination of Post-Failure Characteristic Behaviour of Rocks under Conventional Method Based on the Mechanism of Rock Deformation Process

Authors: Victor Abioye Akinbinu

Abstract:

This work studies the post-failure characteristic behaviour of rocks and techniques for controlling the post-failure regime based on the mechanism of the rock deformation process. It is impossible to determine the post-failure regime of rocks using conventional laboratory testing equipment, because most testing machines are soft and therefore no information can be obtained after the peak load. Stress-strain deformation tests were conducted using both the conventional method and an unconventional method (a closed-loop servo-controlled testing machine) in accordance with the ISRM standard. Normalised pre-failure curves were constructed to show the stages in the deformation process. The first type contains Class I behaviour and progresses to Class II with low-strength, soft, brittle rocks. The second type shows entirely Class II characteristic behaviour. The third type is extremely brittle under axial loading and resulted in explosive failure, so its class could not be determined. The difficulty in obtaining the post-failure curves increases as the total volumetric strain approaches a positive value. The author's use of normalised pre-failure curves enables identification of an additional type of deformation process with a very brittle response under axial loading. Testing the third type without confinement could cause equipment damage. Identification of the deformation process and the rock classes using conventional tests can guide personnel conducting tests on closed-loop servo-controlled systems, helping them avoid equipment damage when testing rocks with the third type of deformation process so that testing is performed safely. It has also improved our understanding of total specimen failure and the brittleness of rocks (e.g. brittle for Class II and less brittle or ductile for Class I).

Keywords: Closed-loop servo-controlled system, conventional testing equipment, deformation process, post-failure, pre-failure normalised curves, rock classes.

1745 Nonparametric Control Chart Using Density Weighted Support Vector Data Description

Authors: Myungraee Cha, Jun Seok Kim, Seung Hwan Park, Jun-Geol Baek

Abstract:

In manufacturing industries, advances in measurement have increased the number of monitored variables, and the importance of multivariate control has consequently come to the fore. Statistical process control (SPC) is the most widely used basis for multivariate control charts. Nevertheless, SPC is restricted in its application because it assumes that the data follow a specific distribution, whereas process data are often composed of a mixture of several processes and are hard to model with a single distribution. As an alternative to conventional SPC, nonparametric control charts come into the picture because of their key strength: the absence of parameter estimation. The SVDD-based control chart is a nonparametric control chart with the advantage of a flexible control boundary. However, the basic concept of SVDD overlooks an important data characteristic, the density distribution. We therefore propose DW-SVDD (Density-Weighted SVDD) to make up for this weakness of conventional SVDD. DW-SVDD takes the local density of the data into account by introducing the notion of a density weight. We extend the proposed SVDD into a control chart, and a simulation study on data with various distributions is conducted to demonstrate the improvement in performance.
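
SVDD with an RBF kernel is closely related to the one-class SVM, so the sketch below uses scikit-learn's OneClassSVM with k-nearest-neighbour density estimates passed as per-sample weights, as a rough stand-in for the density-weighting idea. It is not the authors' DW-SVDD formulation, and all data and parameters are illustrative.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(4)
train = np.vstack([rng.normal(0, 1, (400, 2)), rng.normal(5, 0.5, (100, 2))])  # mixed process

# Density weight: inverse of the mean distance to the k nearest neighbours,
# so samples in dense regions get more influence on the boundary.
k = 10
dist, _ = NearestNeighbors(n_neighbors=k + 1).fit(train).kneighbors(train)
density_w = 1.0 / dist[:, 1:].mean(axis=1)
density_w /= density_w.mean()

model = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.05)
model.fit(train, sample_weight=density_w)

new_batch = rng.normal(0, 1, (5, 2))
print(model.decision_function(new_batch))   # negative values fall outside the boundary
```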

Keywords: Density estimation, Multivariate control chart, One-class classification, Support vector data description (SVDD).

1744 Effective Methodology for Security Risk Assessment of Computer Systems

Authors: Daniel F. García, Adrián Fernández

Abstract:

Today, computer systems are increasingly complex and face growing security risks. Security managers need effective security risk assessment methodologies that can model the increasing complexity of current computer systems while keeping the complexity of the assessment procedure itself low. This paper provides a brief analysis of common security risk assessment methodologies, leading to the selection of a methodology that fulfills these requirements. Then, a detailed analysis of the most effective methodology is carried out, with numerical examples to demonstrate how easy it is to use.

Keywords: Computer security, qualitative and quantitative methods, risk assessment methodologies, security risk assessment.

1743 Effect of Strength Class of Concrete and Curing Conditions on Capillary Water Absorption of Self-Compacting and Conventional Concrete

Authors: Emine Ebru Demirci, Remzi Sahin

Abstract:

The purpose of this study is to compare Self Compacting Concrete (SCC) and Conventional Concrete (CC) in terms of their capillary water absorption. During the comparison of SCC and CC, the effects of two different factors were also investigated: concrete strength class and curing condition. In the study, both SCC and CC were produced in three different concrete classes (C25, C50 and C70) and the other parameter (i.e. curing condition) was determined as two levels: moisture and air curing. It was observed that, for both curing environments and all strength classes of concrete, SCCs had lower capillary water absorption values than that of CCs. It was also detected that, for both SCC and CC, capillary water absorption values of samples kept in moisture curing were significantly lower than that of samples stored in air curing. Additionally, it was determined that capillary water absorption values for both SCC and CC decrease with increasing strength class of concrete for both curing environments.

Keywords: Capillary water absorption, curing condition, reinforced concrete beam, self-compacting concrete.

1742 Speech Data Compression using Vector Quantization

Authors: H. B. Kekre, Tanuja K. Sarode

Abstract:

Transforms are mostly used for speech data compression, and these are lossy algorithms. Such algorithms are tolerable for speech data compression since the loss in quality is not perceived by the human ear. However, vector quantization (VQ) has the potential to give higher compression while maintaining the same quality. In this paper we propose a speech data compression algorithm using the vector quantization technique. We have used the VQ algorithms LBG, KPE and FCG. A results table shows the computational complexity of these three algorithms. We also introduce a new performance parameter, the Average Fractional Change in Speech Sample (AFCSS). Our FCG algorithm gives far better performance than the others in terms of mean absolute error, AFCSS and complexity.
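
Of the three codebook-design algorithms named, LBG is the most standard; a compact sketch is given below on stand-in frame vectors. KPE, FCG and the AFCSS measure are not reproduced here.

```python
import numpy as np

def lbg(train_vectors, codebook_size, eps=0.01, iters=10):
    """Generalised Lloyd / LBG codebook training by repeated codeword splitting."""
    codebook = train_vectors.mean(axis=0, keepdims=True)
    while len(codebook) < codebook_size:
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])   # split step
        for _ in range(iters):                                               # Lloyd refinement
            d = ((train_vectors[:, None, :] - codebook) ** 2).sum(axis=2)
            nearest = d.argmin(axis=1)
            for c in range(len(codebook)):
                members = train_vectors[nearest == c]
                if len(members):
                    codebook[c] = members.mean(axis=0)
    return codebook

rng = np.random.default_rng(0)
frames = rng.normal(0, 1, (1000, 8))          # stand-in 8-sample speech frames
cb = lbg(frames, codebook_size=16)
codes = ((frames[:, None, :] - cb) ** 2).sum(axis=2).argmin(axis=1)   # encoded indices
print(codes[:10])
```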

Keywords: Vector Quantization, Data Compression, Encoding, Speech coding.

1741 Bio-Ecological Monitoring of Potato Stem Nematodes (Ditylenchus destructor Thorne, 1945) in Four Major Potato-Planting Municipalities of Kvemo Kartli (Eastern Georgia) and Accompanying Fauna Biodiversity

Authors: E. Tskitishvili, L. Jgenti, I. Eliava, T. Tskitishvili, N. Bagathuria, M. Gigolashvili

Abstract:

The distribution pattern of the potato stem nematode (Ditylenchus destructor Thorne, 1945) was studied on potato fields in four municipalities (Tsalka, Bolnisi, Marneuli, Gardabani) of Kvemo Kartli (Eastern Georgia).

The research established the extent of the pathogen's invasion, the accompanying composition of fauna species, the environmental groups of the populations and their quantities.

During the research, 160 forms of free-living and phyto-parasitic nematodes were recorded in the studied ecosystems, of which 118 were determined to species level and 42 to genus level.

It was found that the pathogenic nematode Ditylenchus destructor dominates in almost all of the studied ecosystems. Very large numbers of specimens (almost uncountable) were found in tuber material from Bolnisi and Gardabani.

Keywords: Nematoda, potato, stem, bioecological, monitoring.

1740 Weak Measurement Theory for Discrete Scales

Authors: Jan Newmarch

Abstract:

With the increasing spread of computers and the internet among culturally, linguistically and geographically diverse communities, issues of internationalization and localization are becoming increasingly important. For some of these issues, such as different scales for length and temperature, there is a well-developed measurement theory. For others, such as date formats, no such theory will be possible. This paper fills a gap by developing a measurement theory for a class of scales previously overlooked, based on discrete and interval-valued scales such as spanner and shoe sizes. The paper gives a theoretical foundation for a class of data representation problems.

Keywords: Data representation, internationalisation, localisation, measurement theory.

1739 A Holistic Workflow Modeling Method for Business Process Redesign

Authors: Heejung Lee

Abstract:

In a highly competitive environment, it becomes more important to shorten the whole business process while delivering, or even enhancing, the business value to customers and suppliers. Although workflow management systems receive much attention for their capacity to practically support business process enactment, effective workflow modeling methods remain challenging, and a high degree of process complexity makes it more difficult to achieve short lead times. This paper presents a holistic workflow structuring method that can reduce process complexity using activity needs and formal concept analysis, which ultimately enhances key performance measures such as quality, delivery and cost in the business process.

Keywords: Workflow management, reengineering, formal concept analysis.

1738 Multiobjective Optimization Solution for Shortest Path Routing Problem

Authors: C. Chitra, P. Subbaraj

Abstract:

The shortest path routing problem is a multiobjective nonlinear optimization problem with constraints. This problem has been addressed by considering the Quality of Service parameters, delay and cost, as separate objectives or as a weighted sum of both objectives. Multiobjective evolutionary algorithms can find multiple Pareto-optimal solutions in one single run, and this ability makes them attractive for solving problems with multiple and conflicting objectives. This paper uses an elitist multiobjective evolutionary algorithm based on the Non-dominated Sorting Genetic Algorithm (NSGA) for solving the dynamic shortest path routing problem in computer networks. A priority-based encoding scheme is proposed for population initialization. Elitism ensures that the best solution does not deteriorate in the next generations. Results for a sample test network are presented to demonstrate the capability of the proposed approach to generate well-distributed Pareto-optimal solutions of the dynamic routing problem in one single run. The results obtained by NSGA are compared with a single objective weighting factor method for which a Genetic Algorithm (GA) was applied.
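
The abstract mentions a priority-based encoding scheme; a common way to decode such a chromosome is to step repeatedly to the unvisited neighbour with the highest priority, as sketched below. The decoding rule is an assumption and not necessarily the authors' exact operator.

```python
def decode_path(priorities, adjacency, source, destination):
    """Decode a priority vector into a path: from the current node, always move to
    the unvisited neighbour with the highest priority (priority-based encoding)."""
    path, current, visited = [source], source, {source}
    while current != destination:
        candidates = [n for n in adjacency[current] if n not in visited]
        if not candidates:
            return None                      # dead end: this chromosome decodes to no path
        current = max(candidates, key=lambda n: priorities[n])
        visited.add(current)
        path.append(current)
    return path

adjacency = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
priorities = [0.1, 0.9, 0.4, 0.7]            # one chromosome (illustrative)
print(decode_path(priorities, adjacency, source=0, destination=3))   # [0, 1, 3]
```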

Keywords: Multiobjective optimization, Non-dominated Sorting Genetic Algorithm, Routing, Weighted sum.

1737 Impact of the Existence of One-Way Functions on the Conceptual Difficulties of Quantum Measurements

Authors: Arkady Bolotin

Abstract:

One-way functions are functions that are easy to compute but hard to invert. Their existence is an open conjecture; it would imply the existence of intractable problems (i.e. NP-problems which are not in the P complexity class). If true, the existence of one-way functions would have an impact on the theoretical framework of physics, in particular quantum mechanics. This aspect of one-way functions has never been shown before. In the present work, we put forward the following. We can calculate the microscopic state (say, the particle spin in the z direction) of a macroscopic system (a measuring apparatus registering the particle z-spin) from the system's macroscopic state (the apparatus output); let us call this association the function F. The question is: can we compute the function F in the inverse direction? In other words, can we compute the macroscopic state of the system from its microscopic state (the preimage F⁻¹)? In the paper, we assume that the function F is a one-way function. The assumption implies that at the macroscopic level the Schrödinger equation becomes unfeasible to compute. This unfeasibility plays the role of a limit on the validity of the linear Schrödinger equation.

Keywords: One-way functions, P versus NP problem, quantum measurements.

1736 Blind Channel Estimation Based on URV Decomposition Technique for Uplink of MC-CDMA

Authors: Pradya Pornnimitkul, Suwich Kunaruttanapruk, Bamrung Tau Sieskul, Somchai Jitapunkul

Abstract:

In this paper, we investigate a blind channel estimation method for multi-carrier CDMA systems that uses a subspace decomposition technique. This technique exploits the orthogonality between the noise subspace and the received user codes to obtain the channel of each user. Previously, the Singular Value Decomposition (SVD) technique was used, but the SVD has high computational complexity. In this paper we therefore use the URV decomposition, which serves as an intermediary between the QR decomposition and the SVD, in place of the SVD to track the noise subspace of the received data. The URV decomposition has almost the same estimation performance as the SVD, but lower computational complexity.

Keywords: Channel estimation, MC-CDMA, SVD, URV.

1735 An Approach for Reducing the Computational Complexity of LAMSTAR Intrusion Detection System using Principal Component Analysis

Authors: V. Venkatachalam, S. Selvan

Abstract:

The security of computer networks plays a strategic role in modern computer systems. Intrusion Detection Systems (IDS) act as a 'second line of defense' placed inside a protected network, looking for known or potential threats in network traffic and/or audit data recorded by hosts. We developed an Intrusion Detection System using the LAMSTAR neural network to learn patterns of normal and intrusive activities and to classify observed system activities, and compared the performance of the LAMSTAR IDS with other classification techniques using five classes of KDDCup99 data. The LAMSTAR IDS gives better performance at the cost of high computational complexity, training time and testing time compared with other classification techniques (Binary Tree classifier, RBF classifier, Gaussian Mixture classifier). We further reduced the computational complexity of the LAMSTAR IDS by reducing the dimension of the data using principal component analysis, which in turn reduces the training and testing time with almost the same performance.
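
LAMSTAR itself is not available in common libraries, so the sketch below shows only the PCA reduction step on synthetic stand-in data, with an RBF classifier in place of the LAMSTAR network; the feature count and reduced dimension are illustrative choices.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for 41-feature KDDCup99-style records with 5 classes.
X, y = make_classification(n_samples=3000, n_features=41, n_informative=12,
                           n_classes=5, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=12).fit(X_tr)            # reduced dimension (illustrative choice)
clf = SVC(kernel="rbf").fit(pca.transform(X_tr), y_tr)
print(clf.score(pca.transform(X_te), y_te))     # accuracy on the reduced features
```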

Keywords: Binary Tree Classifier, Gaussian Mixture, Intrusion Detection System, LAMSTAR, Radial Basis Function.

1734 Using Data Mining Technique for Scholarship Disbursement

Authors: J. K. Alhassan, S. A. Lawal

Abstract:

This work is on decision tree-based classification for the disbursement of scholarships. A tree-based data mining classification technique is used to determine the generic rules for disbursing the scholarship. Based on the rules derived from the tree, the system determines the class (status) to which an applicant belongs: Granted or Not Granted. Applicants that fall into the granted class successfully acquire the scholarship, while those in the not-granted class are unsuccessful in the scheme. An algorithm that can be used to classify applicants based on the rules from tree-based classification was also developed. Tree-based classification was adopted because of its efficiency, effectiveness and easy-to-comprehend features. The system was tested with data from the National Information Technology Development Agency (NITDA), Abuja, a parastatal of the Federal Ministry of Communication Technology that is mandated to develop and regulate information technology in Nigeria. The system was found to work according to the specification. It is therefore recommended for all scholarship disbursement organizations.
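
A minimal sketch of the tree-based approach on toy applicant records (the attributes and values are hypothetical, not the NITDA data): the derived rules are printed and a new applicant is classified as Granted (1) or Not Granted (0).

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy applicant records (hypothetical attributes, for illustration only).
data = pd.DataFrame({
    "score":           [72, 45, 88, 60, 30, 95, 55, 81],
    "income_low":      [1, 1, 0, 1, 0, 1, 0, 1],       # 1 = low household income
    "state_quota_met": [0, 1, 0, 0, 1, 0, 1, 0],
    "granted":         [1, 0, 1, 1, 0, 1, 0, 1],
})

X, y = data.drop(columns="granted"), data["granted"]
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))     # the generic rules
new_applicant = pd.DataFrame([[65, 1, 0]], columns=X.columns)
print(tree.predict(new_applicant))                          # 1 = Granted, 0 = Not Granted
```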

Keywords: Decision tree, classification, data mining, scholarship.

1733 A Type-2 Fuzzy Adaptive Controller of a Class of Nonlinear System

Authors: A. El Ougli, I. Lagrat, I. Boumhidi

Abstract:

In this paper, we propose a robust adaptive fuzzy controller for a class of nonlinear systems with unknown dynamics. The method is based on a type-2 fuzzy logic system used to approximate the unknown nonlinear function. The design of the on-line adaptive scheme of the proposed controller is based on the Lyapunov technique. Simulation results are given to illustrate the effectiveness of the proposed approach.

Keywords: Fuzzy set type-2, Adaptive fuzzy control, Nonlinear system.

1732 Bee Parameter Determination via Weighted Centroid Modified Simplex and Constrained Response Surface Optimisation Methods

Authors: P. Luangpaiboon

Abstract:

Various forms of intelligence and inspiration have been adopted into iterative searching processes known as meta-heuristics, which intelligently perform exploration and exploitation of the solution space in order to seek near-optimal solutions efficiently. In this work, the bee algorithm, inspired by the natural foraging behaviour of honey bees, was adapted to find near-optimal solutions for a transportation management system, dynamic multi-zone dispatching. This problem must cope with uncertain and changing customers' demand. To remain competitive, the transportation system should therefore be flexible in order to cope with changes in customers' demand in terms of inbound and outbound goods and technological innovations. To keep the service level high but the management cost low via a minimal-imbalance scenario, the rearrangement penalty of the area in each zone, including time periods, is also included. However, the performance of the algorithm depends on appropriate parameter settings, which need to be determined and analysed before implementation. The bee algorithm parameters are determined through the linear constrained response surface optimisation method (LCRSOM) and the weighted centroid modified simplex method (WCMSM). Experimental results were analysed in terms of the best solutions found so far, the mean and standard deviation of the imbalance values, and the convergence of the solutions obtained. It was found that the results obtained with the LCRSOM were better than those obtained with the WCMSM; however, the average execution time of an experimental run using the LCRSOM was longer than with the WCMSM. Finally, a recommendation of proper level settings of the bee algorithm parameters for selected problem sizes is given as a guideline for future applications.
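
Below is a minimal Bees Algorithm loop on a toy continuous objective, to make explicit the parameters that the LCRSOM and WCMSM tune (numbers of scout bees, selected and elite sites, recruited bees, patch size, iterations); the dispatching objective and the tuning procedures themselves are not reproduced, and all parameter values shown are arbitrary.

```python
import numpy as np

def bees_algorithm(objective, dim, n_scouts=20, n_sites=5, n_elite=2,
                   bees_elite=7, bees_other=3, patch=0.5, iters=50, bounds=(-5, 5)):
    """Basic Bees Algorithm: scouts explore, the best sites recruit bees for local search."""
    rng = np.random.default_rng(0)
    scouts = rng.uniform(*bounds, (n_scouts, dim))
    for _ in range(iters):
        scouts = scouts[np.argsort([objective(s) for s in scouts])]   # rank sites by fitness
        new = []
        for i in range(n_sites):                      # neighbourhood (local) search
            recruits = bees_elite if i < n_elite else bees_other
            cands = scouts[i] + rng.uniform(-patch, patch, (recruits, dim))
            cands = np.vstack([cands, scouts[i]])
            new.append(min(cands, key=objective))     # keep the best bee at each site
        rest = rng.uniform(*bounds, (n_scouts - n_sites, dim))        # global scouting
        scouts = np.vstack([new, rest])
    return min(scouts, key=objective)

sphere = lambda x: float(np.sum(x ** 2))              # toy objective
print(bees_algorithm(sphere, dim=3).round(3))
```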

Keywords: Meta-heuristic, Bee Algorithm, Dynamic Multi-Zone Dispatching, Linear Constrained Response Surface Optimisation Method, Weighted Centroid Modified Simplex Method.
