Search results for: labeling complexity.
772 Feature Selection Approaches with Missing Values Handling for Data Mining - A Case Study of Heart Failure Dataset
Authors: N. Poolsawad, C. Kambhampati, J. G. F. Cleland
Abstract:
In this paper, we investigate the characteristics of a clinical dataset with respect to feature selection and classification measurements that must deal with missing values, and we propose appropriate techniques to achieve the aim of the activity; this research aims to find features that have a strong effect on mortality and on the mortality time frame. We quantify the complexity of the clinical dataset and, according to that complexity, propose a data mining process that copes with its difficulties, namely missing values, high dimensionality, and the prediction problem, by using methods for missing value replacement, feature selection, and classification. The experimental results will be extended to develop a prediction model for cardiology.
Keywords: feature selection, missing values, classification, clinical dataset, heart failure.
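For illustration, here is a minimal sketch of the two preprocessing stages the abstract names, with mean imputation and a correlation-based ranking standing in for the paper's own (unspecified) methods; the data below is synthetic:

```python
import numpy as np

rng = np.random.default_rng(8)
X = rng.random((200, 6))
X[rng.random(X.shape) < 0.1] = np.nan                  # 10% missing values
y = (np.nan_to_num(X[:, 0], nan=0.0) > 0.5).astype(float)  # synthetic label

# 1) Missing value replacement: column means.
col_means = np.nanmean(X, axis=0)
X_filled = np.where(np.isnan(X), col_means, X)

# 2) Feature selection: rank features by |correlation| with the label.
corr = [abs(np.corrcoef(X_filled[:, j], y)[0, 1]) for j in range(X.shape[1])]
print(np.argsort(corr)[::-1])                          # feature 0 ranks first
```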
771 Iterative Joint Power Control and Partial Crosstalk Cancellation in Upstream VDSL
Authors: H. Bagheri, H. Emami, M. R. Pakravan
Abstract:
Crosstalk is the major limiting issue in very high bit-rate digital subscriber line (VDSL) systems in terms of bit rate and service coverage. At the central office side, joint signal processing accompanied by appropriate power allocation enables complex multiuser processors to provide near-capacity rates. Unfortunately, complexity grows with the square of the number of lines within a binder; however, since only a few dominant crosstalkers contribute most of the crosstalk power, the canceller structure can be simplified, resulting in much lower run-time complexity. In this paper, a multiuser power control scheme, namely iterative waterfilling, is combined with previously proposed partial crosstalk cancellation approaches, and the resulting performance, the best achieved to date, is verified by simulation results.
Keywords: iterative waterfilling, partial crosstalk cancellation, run-time complexity, VDSL.
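For illustration, a minimal sketch of the waterfilling step at the core of iterative waterfilling: given per-tone noise-plus-interference levels and a power budget, power is poured so that power plus noise is constant on the active tones. In the iterative scheme each user repeats this against the other users' current spectra; only the single-user step is shown here.

```python
import numpy as np

def waterfill(noise, budget):
    """Allocate `budget` over tones with the given noise levels."""
    order = np.argsort(noise)
    for k in range(len(noise), 0, -1):          # try k active tones
        mu = (budget + noise[order[:k]].sum()) / k   # water level
        if mu > noise[order[k - 1]]:            # all k tones get power > 0
            p = np.zeros_like(noise)
            p[order[:k]] = mu - noise[order[:k]]
            return p
    return np.zeros_like(noise)

noise = np.array([0.1, 0.5, 0.2, 1.0])
p = waterfill(noise, budget=1.0)
print(p, p.sum())   # powers sum to the budget; the worst tone gets none
```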
770 Low-Complexity Channel Estimation Algorithm for MIMO-OFDM Systems
Authors: Ali Beydoun, Hamzé H. Alaeddine
Abstract:
One of the main challenges in a MIMO-OFDM system seeking the expected performance in terms of data rate and robustness against multi-path fading channels is channel estimation. Several methods have been proposed in the literature based on either least squares (LS) or minimum mean squared error (MMSE) estimators. These methods have high implementation complexity, as they require the inversion of large matrices. To overcome this problem and reduce the complexity, this paper presents a solution that benefits from the STBC encoder and transforms the channel estimation process into a set of simple linear operations. The proposed method is evaluated via simulation over an AWGN-Rayleigh fading channel. Simulation results show a maximum degradation of 6.85% in bit error rate (BER) relative to the ideal case, in which the receiver has perfect knowledge of the channel.
Keywords: Channel estimation, MIMO, OFDM, STBC, CAZAC sequence.
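The complexity contrast the abstract draws can be made concrete: per-tone LS estimation needs only element-wise division, O(N), while linear MMSE smoothing inverts an N x N matrix, O(N^3). The sketch below assumes BPSK pilots and an exponential channel correlation purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(9)
N = 64
idx = np.arange(N)
R = np.exp(-0.5 * np.abs(idx[:, None] - idx[None, :]))   # channel correlation
H = np.linalg.cholesky(R) @ (rng.standard_normal(N)
                             + 1j * rng.standard_normal(N)) / np.sqrt(2)
X = rng.choice([1.0, -1.0], N)                           # known BPSK pilots
noise_var = 0.01
Y = X * H + np.sqrt(noise_var / 2) * (rng.standard_normal(N)
                                      + 1j * rng.standard_normal(N))

H_ls = Y / X                                             # LS estimate, O(N)
H_mmse = R @ np.linalg.solve(R + noise_var * np.eye(N), H_ls)  # MMSE, O(N^3)

print(np.mean(np.abs(H - H_ls) ** 2), np.mean(np.abs(H - H_mmse) ** 2))
```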
769 Paradigms Shift in Sport Sciences: Body's Focus
Authors: Michele V. Carbinatto, Wagner Wey Moreira, Myrian Nunomura, Mariana H. C. Tsukamoto, Vilma Leni Nista-Piccolo
Abstract:
Sport sciences have historically been supported by the positivist idea of science, especially the mechanistic/reductionist view, and have become a field that treats experimentation and measurement as the major research domains. The disposition to simplify nature and the world into parts has fragmented and reduced the idea of the body-athlete to that of a machine. In this paper we intend to re-think this perception through the lens of Complexity Theory. We advance the idea of the athlete as a reflexive and active being (corporeity-body). Therefore, the construction of a training program that considers the cultural, biological, and psychological elements of the experience of human corporal movements in a circumspect and responsible way could bring better chances of accomplishment. In the end, we hope to help coaches understand the intrinsic complexity of the body they are training and how better to deal with it and, in a field of deep globalization among different types of knowledge, to respect and accept the peculiarities of the knowledge that comprises this area.
Keywords: Sport science, body, complexity theory, corporeity.
768 Complexity of Mathematical Expressions in Adaptive Multimodal Multimedia System Ensuring Access to Mathematics for Visually Impaired Users
Authors: Ali Awde, Yacine Bellik, Chakib Tadj
Abstract:
Our adaptive multimodal system aims at correctly presenting a mathematical expression to visually impaired users. Given an interaction context (i.e., a combination of user, environment, and system resources), the complexity of the expression itself, and the user's preferences, the suitability scores of different presentation formats are calculated. Unlike current state-of-the-art solutions, our approach takes the user's situation into account rather than imposing a solution that is unsuitable to his context and capacity. In this work, we present our methodology for calculating mathematical expression complexity and the results of our experiment. Finally, this paper discusses the concepts and principles applied in our system, as well as their validation through case studies. This work is our original contribution to ongoing research on making informatics more accessible to handicapped users.
Keywords: Adaptive system, intelligent multi-agent system, mathematics for visually-impaired users.
767 Classification and Resolving Urban Problems by Means of Fuzzy Approach
Authors: F. Habib, A. Shokoohi
Abstract:
Urban problems are problems of organized complexity. Thus, many models and scientific methods for resolving urban problems have failed. This study proposes a fuzzy-system-driven approach for classifying and solving urban problems. The study mainly investigated the selection of the inputs and outputs of urban systems for the classification of urban problems. In this research, five categories of urban problems were recognized with respect to the fuzzy system approach: control, polytely, optimizing, open, and decision-making problems. Grounded Theory techniques were then applied to analyze the data and develop a new solving method for each category. The findings indicate that fuzzy system methods are powerful processes and analytic tools for helping planners resolve complex urban problems. These tools can succeed where others have failed because they incorporate and address uncertainty and risk, complexity, and systems interacting with other systems.
Keywords: Classification, complexity, Fuzzy theory, urban problems.
766 Bit Model Based Key Management Scheme for Secure Group Communication
Authors: R. Varalakshmi
Abstract:
For the last decade, researchers have focused their interest on multicast group key management frameworks. The central research challenge is secure and efficient group key distribution. The present paper describes a bit-model-based secure multicast group key distribution scheme using the most popular absolute-encoder output code, the Gray code. The focus is twofold. The first fold deals with reducing computation complexity, which our scheme achieves by performing fewer multiplication operations during the key updating process. To optimize the number of multiplications, an O(1)-time algorithm for multiplying two N-bit binary numbers on an N x N bit-model reconfigurable mesh is used in the proposed work. The second fold aims at reducing the amount of information stored in the group center and the group members while performing the key update operation. A comparative analysis illustrating the performance of various key distribution schemes is presented, and it shows that the proposed algorithm reduces computation and storage complexity significantly. The proposed algorithm is suitable for high-performance computing environments.
Keywords: Multicast Group key distribution, Bit model, Integer Multiplications, reconfigurable mesh, optimal algorithm, Gray Code, Computation Complexity, Storage Complexity.
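For reference, the Gray-code building block of the scheme: consecutive integers differ in exactly one bit of their Gray codes, which is what makes encoder-style updates cheap. Only the code itself is shown; the paper's key update protocol built on it is not reproduced here.

```python
def to_gray(n: int) -> int:
    """Binary-reflected Gray code of n."""
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    """Inverse transform: fold the bits back down."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

for i in range(8):
    print(i, format(to_gray(i), '03b'), from_gray(to_gray(i)))
# successive codes: 000 001 011 010 110 111 101 100, one bit flips per step
```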
765 Functional Decomposition Based Effort Estimation Model for Software-Intensive Systems
Authors: Nermin Sökmen
Abstract:
An effort estimation model is needed for software-intensive projects that consist of hardware, embedded software, or some combination of the two, as well as high-level software solutions. This paper first focuses on functional decomposition techniques for measuring the functional complexity of a computer system and investigates its impact on system development effort. It then examines the effects of technical difficulty and design team capability in order to construct the best effort estimation model. Using traditional regression analysis, the study develops a system development effort estimation model that takes functional complexity, technical difficulty, and design team capability as input parameters. Finally, the assumptions of the model are tested.
Keywords: Functional complexity, functional decomposition, development effort, technical difficulty, design team capability, regression analysis.
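A minimal sketch of the kind of regression model described, with effort fit as a linear function of functional complexity (FC), technical difficulty (TD), and design team capability (TC); the data points below are hypothetical stand-ins for the paper's project data:

```python
import numpy as np

FC = np.array([120, 300, 210, 450, 180, 390], dtype=float)
TD = np.array([2, 4, 3, 5, 2, 4], dtype=float)
TC = np.array([4, 3, 4, 2, 5, 3], dtype=float)
effort = np.array([310, 900, 560, 1500, 380, 1150], dtype=float)  # person-days

# Ordinary least squares: effort ~ b0 + b1*FC + b2*TD + b3*TC
A = np.column_stack([np.ones_like(FC), FC, TD, TC])
coef, *_ = np.linalg.lstsq(A, effort, rcond=None)
print(coef)                 # intercept and the three slopes
print(A @ coef - effort)    # residuals of the fitted model
```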
764 Performance Analysis of HSDPA Systems using Low-Density Parity-Check (LDPC) Coding as Compared to Turbo Coding
Authors: K. Anitha Sheela, J. Tarun Kumar
Abstract:
HSDPA is a feature introduced in the Release 5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. The HSDPA concept offers an outstanding improvement in packet throughput and significantly reduces the packet call transfer delay compared with the Release 99 DSCH. Until now, the HSDPA system has used turbo coding, the coding technique that comes closest to the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, where the transmission distance itself introduces latency due to the finite speed of light. Hence, this paper proposes to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity at the price of higher encoding complexity. Although the transmitter complexity increases at the Node B, the end user gains in terms of receiver complexity and bit error rate. The LDPC encoder is implemented using a sparse parity-check matrix H to generate codewords, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases, for a small increase in Eb/No, which is not possible with turbo coding. The same BER was also achieved with fewer iterations, so latency and receiver complexity decrease with LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better-quality, more reliable, and more robust data services: while realistic data rates are only a few Mbps, the quality and the number of users served improve significantly.
Keywords: AMC, HSDPA, LDPC, WCDMA, 3GPP.
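To illustrate the iterative decoding principle, here is a hard-decision bit-flipping decoder on a toy parity-check matrix; the paper uses the soft-decision belief propagation algorithm, for which this is only a simplified stand-in:

```python
import numpy as np

# Toy sparse parity-check matrix H of a (7,4) code; HSDPA-scale LDPC
# matrices are far larger and sparser.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def bit_flip_decode(H, r, max_iter=20):
    """Hard-decision bit flipping: flip the bit failing the most checks."""
    c = r.copy()
    for _ in range(max_iter):
        syndrome = H @ c % 2
        if not syndrome.any():            # all parity checks satisfied
            break
        fails = H.T @ syndrome            # failed checks per bit
        c[np.argmax(fails)] ^= 1
    return c

# Codeword [1,1,1,0,0,0,0] with one bit flipped during transmission:
r = np.array([1, 1, 1, 1, 0, 0, 0])
print(bit_flip_decode(H, r))              # recovers [1 1 1 0 0 0 0]
```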
763 Theoretical Considerations for Software Component Metrics
Authors: V. Lakshmi Narasimhan, Bayu Hendradjaya
Abstract:
We have defined two suites of metrics, which cover static and dynamic aspects of component assembly. The static metrics measure the complexity and criticality of component assembly, wherein complexity is measured using the Component Packing Density and Component Interaction Density metrics. Further, four criticality conditions, namely link, bridge, inheritance, and size criticalities, have been identified and quantified. The complexity and criticality metrics are combined to form a Triangular Metric, which can be used to classify the type and nature of applications. Dynamic metrics are collected during the runtime of a complete application; they are useful for identifying super-components and for evaluating the degree of utilisation of various components. In this paper both static and dynamic metrics are evaluated using Weyuker's set of properties. The result shows that the metrics provide a valid means to measure issues in component assembly. We relate our metrics suite to McCall's Quality Model and illustrate its impact on product quality and on the management of component-based product development.
Keywords: Component Assembly, Component Based Software Engineering, CORBA Component Model, Software Component Metrics.
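A sketch of the two static complexity metrics under assumed definitions (the paper's exact formulas may differ): packing density as constituents per component, interaction density as the fraction of available interactions actually used.

```python
def packing_density(n_constituents: int, n_components: int) -> float:
    """Assumed CPD: constituents (e.g. classes) per component in the assembly."""
    return n_constituents / n_components

def interaction_density(actual: int, available: int) -> float:
    """Assumed CID: actual interactions over maximum available interactions."""
    return actual / available

# Hypothetical assembly: 120 classes packed into 8 components,
# 14 of 40 possible component interactions actually used.
print(packing_density(120, 8))        # 15.0
print(interaction_density(14, 40))    # 0.35
```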
762 Climate Change and Environmental Education: The Application of Concept Map for Representing the Knowledge Complexity of Climate Change
Authors: Hsueh-Chih Chen, Yau-Ting Sung, Tsai-Wen Lin, Hung-Teng Chou
Abstract:
Climate change, a topic of high knowledge complexity, has become an essential issue through its significant impact on human existence, and specific national policies, some with educational components, have been published to overcome this imperative problem. Accordingly, this study aims to analyze and integrate the relationship between climate change and environmental education, and to apply the concept map perspective to represent the knowledge contents and structures of climate change; in this way, the knowledge contents of climate change can be represented more comprehensively and used as a tool for environmental education. The method adopted for this study is a knowledge conversion model comprising a platform on which the experts and teachers who participated in the study cooperated to combine each participant's standpoint into a complete knowledge framework, the foundation for structuring the concept map. The result of this research contains the important concepts, the precise propositions, and the complete concept map representing the robust concepts of climate change.
Keywords: Climate Change, knowledge complexity, concept map.
761 Fast and Efficient Algorithms for Evaluating Uniform and Nonuniform Lagrange and Newton Curves
Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong
Abstract:
Newton-Lagrange interpolation is widely used in numerical analysis. However, it requires quadratic computational time for its construction. In computer-aided geometric design (CAGD), there are polynomial curves, namely the Wang-Ball, DP, and Dejdumrong curves, which have linear-time evaluation algorithms. Thus, the computational time for Newton-Lagrange interpolation can be reduced by applying the algorithms for Wang-Ball, DP, and Dejdumrong curves. In order to use these algorithms, it is first necessary to convert Newton-Lagrange polynomials into Wang-Ball, DP, or Dejdumrong polynomials. In this work, algorithms for converting both uniform and non-uniform Newton-Lagrange polynomials into Wang-Ball, DP, and Dejdumrong polynomials are investigated. The computational time for representing Newton-Lagrange polynomials can thereby be reduced to linear complexity. In addition, other CAGD-curve techniques can be applied to modify the Newton-Lagrange curves.
Keywords: Newton interpolation, Lagrange interpolation, linear complexity.
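For context, the standard Newton divided-difference construction is O(n^2), while a single evaluation is O(n) by Horner-like nesting, as sketched below; the Wang-Ball, DP, and Dejdumrong conversions themselves are not reproduced here.

```python
import numpy as np

def newton_coefficients(x, y):
    """Divided-difference coefficients of the Newton form.
    Construction is O(n^2), the quadratic cost the abstract refers to."""
    c = np.array(y, dtype=float)
    n = len(x)
    for j in range(1, n):
        c[j:] = (c[j:] - c[j-1:-1]) / (x[j:] - x[:n-j])
    return c

def newton_eval(c, x_nodes, t):
    """Evaluate the Newton form at t in O(n) by Horner-like nesting."""
    result = c[-1]
    for k in range(len(c) - 2, -1, -1):
        result = result * (t - x_nodes[k]) + c[k]
    return result

x = np.array([0.0, 1.0, 2.0, 4.0])   # non-uniform nodes
y = np.array([1.0, 3.0, 2.0, 5.0])
c = newton_coefficients(x, y)
print(newton_eval(c, x, 1.0))        # 3.0: interpolates the data
```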
760 Computer Proven Correctness of the Rabin Public-Key Scheme
Authors: Johannes Buchmann, Markus Kaiser
Abstract:
We describe a formal specification and verification of the Rabin public-key scheme in the formal proof system Isabelle/HOL. The idea is to combine the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. The analysis presented uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. This yields the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Consequently, we obtain reliable proofs with a minimal error rate, augmenting the used database. This provides a formal basis for more computer-proof constructions in this area.
Keywords: public-key encryption, Rabin public-key scheme, formal proof system, higher-order logic, formal verification.
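The Rabin scheme being verified is simple enough to sketch; the paper's artifact is an Isabelle/HOL formalization, so the following Python version only illustrates the underlying arithmetic, with toy primes both congruent to 3 mod 4:

```python
p, q = 7, 11
n = p * q

def encrypt(m, n):
    return (m * m) % n            # c = m^2 mod n

def decrypt(c, p, q):
    """Square roots mod n via CRT; returns all four candidate plaintexts."""
    mp = pow(c, (p + 1) // 4, p)  # sqrt mod p, valid since p % 4 == 3
    mq = pow(c, (q + 1) // 4, q)  # sqrt mod q, valid since q % 4 == 3
    yp, yq = pow(p, -1, q), pow(q, -1, p)   # CRT coefficients
    r = (yp * p * mq + yq * q * mp) % n
    s = (yp * p * mq - yq * q * mp) % n
    return {r, n - r, s, n - s}

c = encrypt(20, n)
print(c, decrypt(c, p, q))        # 20 is among the four roots
```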
759 Visual Analytics in K-12 Education - Emerging Dimensions of Complexity
Authors: Linnea Stenliden
Abstract:
The aim of this paper is to understand the learning conditions that emerge when visual analytics is implemented and used in K-12 education. To date, little attention has been paid to the role that visual analytics (digital media and technology that highlight visual data communication in order to support analytical tasks) can play in education, and to the extent to which these tools can process actionable data for young students. This study was conducted in three public K-12 schools, in four social science classes with students aged 10 to 13, over a period of two to four weeks at each school. Empirical data were generated using video observations and analyzed with the help of metaphors from Actor-Network Theory (ANT). The learning conditions are found to be distinguished by broad complexity, characterized by four dimensions that emerge from the actors' deeply intertwined relations in the activities. In relation to these dimensions, the paper argues that novel approaches to teaching and learning could benefit students' knowledge building as they work with visual analytics, analyzing visualized data.
Keywords: Analytical reasoning, complexity, data use, problem space, visual analytics, visual storytelling, translation.
758 A Time-Reducible Approach to Compute Determinant |I-X|
Authors: Wang Xingbo
Abstract:
Computation of determinants of the form |I-X| is primary and fundamental because it can help to compute many other determinants. This article puts forward a time-reducible approach to computing the determinant |I-X|. The approach is derived from Newton's identity, and its time complexity is no more than that of computing the eigenvalues of the square matrix X. Mathematical deductions and a numerical example are presented in detail. By comparison with classical approaches, the new approach is shown to be superior, and its computational time naturally decreases as the efficiency of computing the eigenvalues of the square matrix improves.
Keywords: Algorithm, determinant, computation, eigenvalue, time complexity.
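The underlying identity is easy to check numerically: if X has eigenvalues λ1, ..., λn, then |I - X| = (1 - λ1)...(1 - λn), so the determinant costs no more than one eigenvalue computation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))

# Eigenvalue route: complex eigenvalues come in conjugate pairs for a real
# matrix, so the product is real up to round-off.
via_eigs = np.prod(1.0 - np.linalg.eigvals(X)).real
direct = np.linalg.det(np.eye(5) - X)
print(via_eigs, direct)   # the two values agree to floating-point accuracy
```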
757 Towards a Simulation Model to Ensure the Availability of Machines in Maintenance Activities
Authors: Maryam Gallab, Hafida Bouloiz, Youness Chater, Mohamed Tkiouat
Abstract:
The aim of this paper is to present a model based on multi-agent systems for managing maintenance activities and ensuring the reliability and availability of machines with only the required resources (operators, tools). The interest of simulation is that it copes with the complexity of the system and yields results without cost or wasted time. An implementation of the model is carried out on the AnyLogic platform to display the defined performance indicators.
Keywords: Maintenance, complexity, simulation, multi-agent systems, AnyLogic platform.
756 Promoting Complex Systems Learning through the Use of Computer Modeling
Authors: Kamel Hashem, David Mioduser
Abstract:
This paper describes part of a project on Learning by Modeling (LbM). Studying complex systems is increasingly important in teaching and learning many science domains, and many features of complex systems make it difficult for students to develop deep understanding. Previous research indicates that involvement in modeling scientific phenomena and complex systems can play a powerful role in science learning. Some researchers dispute this view, arguing that models and modeling do not contribute to the understanding of complexity concepts, since they increase the cognitive load on students. This study investigates the effect of different modes of involvement in exploring scientific phenomena using computer simulation tools on students' mental models, from the perspective of structure, behavior, and function. Quantitative and qualitative methods are used to report on 121 freshman students who engaged in participatory simulations of complex phenomena exhibiting emergent, self-organized, and decentralized patterns. Results show that LbM plays a major role in students' formation of complexity concepts.
Keywords: Complexity, Educational technology, Learning by modeling, Mental models
755 Evaluating Complexity – Ethical Challenges in Computational Design Processes
Authors: J. Partanen
Abstract:
Complexity, as a theoretical background, has made it easier to understand and explain the features and dynamic behavior of various complex systems. As this common theoretical background has confirmed, borrowing terminology from the natural sciences has helped designers control and understand urban complexity. Phenomena like self-organization, evolution, and adaptation are appropriate for describing the formerly inaccessible characteristics of the complex environment as an unpredictable bottom-up system. Increased computing capacity has been a key element in capturing the chaotic nature of these systems. A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control over the urban environment and, consequently, to seek novel methods for steering its development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew design processes so that we get a better grip on the complex world through more flexible processes, support urban environmental diversity, and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism. A complex system and its features are, as such, beyond human ethics. Self-organization or evolution is neither good nor bad; their mechanisms are by nature devoid of reason. They are common to urban dynamics and natural processes alike; they are features of a complex system, and they cannot be prevented, yet their dynamics can be studied and supported. The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the (unexciting) ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on the grounds of being "non-human". In this paper, the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models, and the literature. It is suggested that ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.
Keywords: urban planning, architecture, dynamic modeling, ethics, complexity theory.
754 Low Complexity Multi Mode Interleaver Core for WiMAX with Support for Convolutional Interleaving
Authors: Rizwan Asghar, Dake Liu
Abstract:
A hardware-efficient, multi-mode, reconfigurable architecture of an interleaver/de-interleaver for multiple standards, such as DVB, WiMAX, and WLAN, is presented. Interleavers consume a large part of the silicon area when implemented with conventional methods, as they use memories to store permutation patterns. In addition, different types of interleavers in different standards cannot share hardware, due to their different construction methodologies. The novelty of the work presented in this paper is threefold: 1) vital types of interleavers, including the convolutional interleaver, are mapped onto a single architecture with the flexibility to change the interleaver size; 2) the hardware complexity for channel interleaving in WiMAX is reduced by using a 2-D realization of the interleaver functions; and 3) silicon cost overheads are reduced by avoiding the use of small memories. The proposed architecture consumes 0.18 mm2 of silicon area in a 0.12 μm process and can operate at a frequency of 140 MHz. The reduced complexity minimizes memory utilization and, at the same time, provides strong support for on-the-fly computation of permutation patterns.
Keywords: Hardware interleaver implementation, WiMAX, DVB, block interleaver, convolutional interleaver, hardware multiplexing.
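For intuition, a minimal rows-by-columns block interleaver (write row-wise, read column-wise); the actual WiMAX/DVB permutations realized by the core are more elaborate.

```python
import numpy as np

def block_interleave(data, rows, cols):
    """Write row-wise, read column-wise."""
    assert len(data) == rows * cols
    return np.asarray(data).reshape(rows, cols).T.reshape(-1)

def block_deinterleave(data, rows, cols):
    """Inverse permutation: write column-wise, read row-wise."""
    return np.asarray(data).reshape(cols, rows).T.reshape(-1)

x = np.arange(12)
y = block_interleave(x, 3, 4)
print(y)                              # [0 4 8 1 5 9 2 6 10 3 7 11]
print(block_deinterleave(y, 3, 4))    # recovers x
```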
753 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution, and not all numerical methods are efficient for solving them, because of their non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve the American option under jump diffusion models by using efficient time-dependent numerical methods, with several techniques integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M^2) to O(M log M). A partial fraction decomposition technique is applied to the rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
Keywords: Integral differential equations, American options, jump-diffusion model, rational approximation.
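The FFT trick named in the abstract can be sketched directly: multiplying by a circulant matrix with first column c costs O(M log M) instead of O(M^2); Toeplitz operators arising from jump integrals are handled by circulant embedding in the same spirit.

```python
import numpy as np

def circulant_matvec(c, x):
    """y = C @ x for the circulant matrix C with first column c,
    via circular convolution: O(M log M) instead of O(M^2)."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

rng = np.random.default_rng(1)
c = rng.standard_normal(8)
x = rng.standard_normal(8)

# Dense reference: the full circulant matrix built column by column.
C = np.column_stack([np.roll(c, k) for k in range(8)])
print(np.allclose(C @ x, circulant_matvec(c, x)))   # True
```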
752 Optimal Solution of Constraint Satisfaction Problems
Authors: Jeffrey L. Duffany
Abstract:
An optimal solution for a large number of constraint satisfaction problems can be found using a technique of substitution and elimination of variables, analogous to the technique used to solve systems of equations. A decision function f(A) = max(A^2) is used to determine which variables to eliminate. The algorithm can be expressed in six lines and is remarkable both for its simplicity and for its ability to find an optimal solution. However, it is inefficient in that it needs to square the updated A matrix after each variable elimination. To overcome this inefficiency, the algorithm is analyzed, and it is shown that the A matrix only needs to be squared once, at the first step of the algorithm, and then incrementally updated in subsequent steps, resulting in a significant improvement and an algorithm complexity of O(n^3).
Keywords: Algorithm, complexity, constraint, NP-complete.
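The incremental-update idea can be verified in a few lines: zeroing variable k's row and column in A changes A^2 by an outer-product correction, an O(n^2) update instead of an O(n^3) re-squaring. The selection rule below (eliminate the variable attaining max(A^2)) is an illustrative reading of the paper's decision function.

```python
import numpy as np

rng = np.random.default_rng(2)
A = (rng.random((6, 6)) < 0.5).astype(float)

A2 = A @ A                        # squared once, at the first step
k = int(np.unravel_index(np.argmax(A2), A2.shape)[0])  # variable to eliminate

# O(n^2) incremental update of A2 after eliminating variable k:
A2 -= np.outer(A[:, k], A[k, :])
A[k, :] = 0.0
A[:, k] = 0.0
A2[k, :] = 0.0
A2[:, k] = 0.0
print(np.allclose(A2, A @ A))     # True: the cheap update matches re-squaring
```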
751 An Efficient Multi Join Algorithm Utilizing a Lattice of Double Indices
Authors: Hanan A. M. Abd Alla, Lilac A. E. Al-Safadi
Abstract:
In this paper, a novel multi-join algorithm for joining multiple relations is introduced. The algorithm is based on a hash-based join of two relations that produces a double index. This is done by scanning the two relations once; but instead of moving the records into buckets, a double index is built, which eliminates the collisions that can occur with a complete hash algorithm. The double index is divided into join buckets of similar categories from the two relations, and buckets with similar keys are joined to produce joined buckets. This leads, in the end, to a complete join index of the two relations, without actually joining the actual relations. The time complexity required to build the join index of two categories is O(m log m), where m is the size of each category, for a total time complexity of O(n log m) over all buckets. The join index is used to materialize the joined relation if required; otherwise, it is used along with the join indices of other relations to build a lattice for multi-join operations with minimal I/O requirements. The lattice of join indices can be fitted into main memory to reduce the time complexity of the multi-join algorithm.
Keywords: Multi join, Relation, Lattice, Join indices.
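A minimal reading of the join-index idea: one scan per relation records the positions of each key, so the join is represented without being materialized; the paper's double-index layout and lattice structure are more involved.

```python
from collections import defaultdict

R = [("k1", "a"), ("k2", "b"), ("k1", "c")]
S = [("k2", "x"), ("k1", "y")]

def positions_by_key(rel):
    """One scan: record row positions per key instead of moving records."""
    idx = defaultdict(list)
    for pos, (key, _) in enumerate(rel):
        idx[key].append(pos)
    return idx

r_idx, s_idx = positions_by_key(R), positions_by_key(S)

# The join index: pairs of row positions, one pair per joined tuple.
join_index = [(i, j) for key in r_idx.keys() & s_idx.keys()
              for i in r_idx[key] for j in s_idx[key]]
print(join_index)

# Materialize the joined relation only if required.
print([(R[i], S[j]) for i, j in join_index])
```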
750 Estimation of Component Reusability through Reusability Metrics
Authors: Aditya Pratap Singh, Pradeep Tomar
Abstract:
Software reusability is an essential characteristic of Component-Based Software (CBS), and component reusability is an important measure of the effective reuse of components in CBS. The attributes of reusability proposed by various researchers are studied, and four of them are identified as potential factors affecting reusability. This paper proposes a metric for estimating the reusability of a black-box software component, along with metrics for interface complexity, understandability, customizability, and reliability. An experiment estimating reusability is performed through a case study on a sample web application using a real-world component.
Keywords: Component-based software, component reusability, customizability, interface complexity, reliability, understandability.
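A hedged sketch of combining the four factor metrics into a single reusability estimate; the equal weights and the linear form are assumptions for illustration, not the paper's calibrated metric (note that higher interface complexity should lower reusability).

```python
def reusability(interface_complexity, understandability,
                customizability, reliability,
                weights=(0.25, 0.25, 0.25, 0.25)):
    """All factor scores normalized to [0, 1]; weights are assumed."""
    w1, w2, w3, w4 = weights
    return (w1 * (1.0 - interface_complexity)   # complexity reduces reuse
            + w2 * understandability
            + w3 * customizability
            + w4 * reliability)

# Hypothetical component scores:
print(reusability(0.3, 0.8, 0.6, 0.9))   # 0.75
```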
749 Parametric Design as an Approach to Respond to Complexity
Authors: Sepideh Jabbari Behnam, Zahrasadat Saide Zarabadi
Abstract:
A city is an intertwined texture arising from the relationships of different components within a whole that is united as one, so designing and planning this complex whole is no easy matter. Considering that a city is a complex system with infinite components and communications, providing flexible layouts that can respond to the unpredictable character of the city, which results from its complexity, is inevitable. The parametric design approach, as a new approach, can produce flexible and transformative layouts at any stage of design. This study aimed to introduce parametric design as a modern approach to responding to complex urban issues, using descriptive and analytical methods. The paper first introduces complex systems and then gives a brief characterization of them. Flexible design and layout flexibility are further matters in the response to and simulation of complex urban systems that should be considered in design, and they are discussed in this study. In this regard, after describing the nature of the parametric approach as a flexible approach, as well as a tool and an appropriate way to respond to features such as limited predictability, reciprocating nature, complex communications, sensitivity to initial conditions, and hierarchy, this paper introduces parametric design.
Keywords: Complexity theory, complex system, flexibility, parametric design.
748 Multi Switched Split Vector Quantization of Narrowband Speech Signals
Authors: M. Satya Sai Ram, P. Siddaiah, M. Madhavi Latha
Abstract:
Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called Multi Switched Split Vector Quantization (MSSVQ), a hybrid of multi-stage, switched, and split vector quantization techniques. The spectral distortion performance, computational complexity, and memory requirements of MSSVQ are compared to those of split vector quantization (SVQ), multi-stage vector quantization (MSVQ), and switched split vector quantization (SSVQ). The results show that MSSVQ has better spectral distortion performance, lower computational complexity, and lower memory requirements than all the above product-code vector quantization techniques. Computational complexity is measured in floating-point operations (flops), and memory requirements are measured in floats.
Keywords: Linear predictive coding, Multi stage vector quantization, Switched split vector quantization, Split vector quantization, Line Spectral Frequencies (LSF).
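The "split" idea common to SVQ, SSVQ, and MSSVQ can be sketched briefly: an LSF vector is split into sub-vectors, each quantized against its own small codebook, shrinking search complexity and memory compared with one large codebook. The codebooks below are random stand-ins for trained ones.

```python
import numpy as np

rng = np.random.default_rng(3)
codebooks = [rng.random((16, 5)), rng.random((16, 5))]   # 2 splits of dim 5

def split_vq(lsf, codebooks):
    """Quantize each split independently; transmit one index per split."""
    indices, parts = [], np.split(lsf, len(codebooks))
    for part, cb in zip(parts, codebooks):
        indices.append(int(np.argmin(((cb - part) ** 2).sum(axis=1))))
    return indices

lsf = rng.random(10)                      # stand-in 10-dim LSF vector
idx = split_vq(lsf, codebooks)
recon = np.concatenate([cb[i] for cb, i in zip(codebooks, idx)])
print(idx, float(((lsf - recon) ** 2).mean()))
```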
747 Low Complexity Peak-to-Average Power Ratio Reduction in Orthogonal Frequency Division Multiplexing System by Simultaneously Applying Partial Transmit Sequence and Clipping Algorithms
Authors: V. Sudha, D. Sriram Kumar
Abstract:
Orthogonal Frequency Division Multiplexing (OFDM) has been used in many advanced wireless communication systems due to its high spectral efficiency and robustness to frequency-selective fading channels. However, the major concern with OFDM systems is the high peak-to-average power ratio (PAPR) of the transmitted signal. Two popular techniques for PAPR reduction in OFDM systems are conventional partial transmit sequences (CPTS) and clipping. In this paper, a parallel combination/hybrid scheme for PAPR reduction using the clipping and CPTS algorithms is proposed. The proposed method applies both algorithms intelligently in order to reduce both the PAPR and the computational complexity. The scheme slightly degrades bit error rate (BER) performance due to the clipping operation, but this degradation can be limited by selecting an appropriate value of the clipping ratio (CR). The simulation results show that the proposed algorithm achieves significant PAPR reduction with much lower computational complexity.
Keywords: CCDF, OFDM, PAPR, PTS.
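The clipping half of the hybrid scheme is easy to sketch: amplitudes above CR times the RMS level are capped with phases preserved; the PTS half, which searches phase rotations of sub-blocks, is omitted here.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 256
X = rng.choice([1+1j, 1-1j, -1+1j, -1-1j], N)   # QPSK symbols
x = np.fft.ifft(X) * np.sqrt(N)                 # time-domain OFDM symbol

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def clip(x, cr=1.4):
    a = cr * np.sqrt(np.mean(np.abs(x) ** 2))   # clipping level
    mag = np.abs(x)
    return np.where(mag > a, a * x / mag, x)    # keep phase, cap amplitude

print(papr_db(x), papr_db(clip(x)))             # PAPR drops after clipping
```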
746 Enhanced Face Recognition with Daisy Descriptors Using 1BT Based Registration
Authors: Sevil Igit, Merve Meric, Sarp Erturk
Abstract:
In this paper, it is proposed to improve DAISY-descriptor-based face recognition using a novel One-Bit Transform (1BT) based pre-registration approach. The 1BT-based pre-registration procedure is fast and has low computational complexity. It is shown that face recognition accuracy is improved with the proposed approach, which facilitates highly accurate face recognition using the DAISY descriptor with simple matching, and thereby a low-complexity system overall.
Keywords: Face Recognition, Daisy Descriptor, One-Bit Transform, Image Registration.
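A sketch of the 1BT idea under simplifying assumptions: real 1BT binarizes each pixel against a band-pass filtered copy of the frame, and a global median threshold stands in for that filter here. Two binarized frames are then aligned by minimizing the XOR count over candidate shifts, which is why the pre-registration is cheap.

```python
import numpy as np

def one_bit_transform(img):
    """1-bit version of a frame (median threshold as a stand-in filter)."""
    return (img > np.median(img)).astype(np.uint8)

def best_shift(b1, b2, max_shift=4):
    """Exhaustive search for the shift minimizing the XOR mismatch count."""
    best, best_cost = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            cost = np.count_nonzero(b1 ^ np.roll(b2, (dy, dx), axis=(0, 1)))
            if cost < best_cost:
                best, best_cost = (dy, dx), cost
    return best

rng = np.random.default_rng(5)
img = rng.random((64, 64))
shifted = np.roll(img, (2, -3), axis=(0, 1))    # simulated misalignment
print(best_shift(one_bit_transform(img), one_bit_transform(shifted)))  # (-2, 3)
```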
745 Effective Methodology for Security Risk Assessment of Computer Systems
Authors: Daniel F. García, Adrián Fernández
Abstract:
Today, computer systems are increasingly complex and face growing security risks. Security managers need effective security risk assessment methodologies that can model the increasing complexity of current computer systems while keeping the complexity of the assessment procedure low. This paper provides a brief analysis of common security risk assessment methodologies, leading to the selection of a methodology that fulfills these requirements. A detailed analysis of the most effective methodology is then presented, with numerical examples demonstrating its ease of use.
Keywords: Computer security, qualitative and quantitative methods, risk assessment methodologies, security risk assessment.
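As an example of the quantitative style of analysis such methodologies employ, the standard annualized-loss calculation is sketched below; whether the paper's selected methodology uses exactly this formula is an assumption.

```python
def annualized_loss_expectancy(asset_value, exposure_factor, annual_rate):
    """ALE = SLE x ARO, with SLE = asset value x exposure factor."""
    sle = asset_value * exposure_factor    # expected loss per incident
    return sle * annual_rate               # expected yearly loss

# Hypothetical numbers: a $200,000 server, 25% damaged per incident,
# incidents expected once every two years.
print(annualized_loss_expectancy(200_000, 0.25, 0.5))   # 25000.0
```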
744 Speech Data Compression using Vector Quantization
Authors: H. B. Kekre, Tanuja K. Sarode
Abstract:
Mostly, transforms are used for speech data compression; these are lossy algorithms. Such algorithms are tolerable for speech data compression, since the loss in quality is not perceived by the human ear. However, vector quantization (VQ) has the potential to give more data compression while maintaining the same quality. In this paper, we propose a speech data compression algorithm using the vector quantization technique, with the VQ algorithms LBG, KPE, and FCG. The results table shows the computational complexity of these three algorithms. We also introduce a new performance parameter, Average Fractional Change in Speech Sample (AFCSS). Our FCG algorithm gives far better performance considering mean absolute error, AFCSS, and complexity, as compared to the others.
Keywords: Vector Quantization, Data Compression, Encoding, Speech coding.
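For illustration, an LBG-style codebook training loop (nearest-neighbour assignment followed by centroid update); real speech frames replace the random training vectors used here, and the paper's KPE and FCG variants are not reproduced.

```python
import numpy as np

def lbg(train, codebook_size=8, iters=20):
    """Train a VQ codebook: assign vectors to nearest codeword, re-center."""
    rng = np.random.default_rng(6)
    cb = train[rng.choice(len(train), codebook_size, replace=False)]
    for _ in range(iters):
        d = ((train[:, None, :] - cb[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for k in range(codebook_size):
            if (labels == k).any():
                cb[k] = train[labels == k].mean(axis=0)
    return cb

train = np.random.default_rng(7).standard_normal((500, 4))  # stand-in frames
cb = lbg(train)
# Compression: each 4-sample vector is replaced by a 3-bit codebook index.
idx = ((train[:, None, :] - cb[None, :, :]) ** 2).sum(axis=2).argmin(axis=1)
print(cb.shape, idx[:10])
```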
743 A Holistic Workflow Modeling Method for Business Process Redesign
Authors: Heejung Lee
Abstract:
In a highly competitive environment, it becomes ever more important to shorten the whole business process while delivering, or even enhancing, the business value to customers and suppliers. Although workflow management systems receive much attention for their capacity to practically support business process enactment, effective workflow modeling methods remain challenging, and a high degree of process complexity makes short lead times harder to attain. This paper presents a holistic workflow structuring method that can reduce process complexity using activity needs and formal concept analysis, which ultimately enhances key performance measures such as quality, delivery, and cost in the business process.
Keywords: Workflow management, reengineering, formal concept analysis.
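The formal-concept-analysis building block the method relies on can be sketched with the two derivation operators over a binary activity/need context; the activities and needs below are hypothetical placeholders.

```python
context = {
    "receive_order": {"customer_data", "order_data"},
    "check_credit":  {"customer_data"},
    "ship_goods":    {"order_data", "inventory_data"},
}

def common_needs(activities):
    """Derivation A -> A': needs shared by all given activities."""
    sets = [context[a] for a in activities]
    return set.intersection(*sets) if sets else set()

def activities_with(needs):
    """Derivation B -> B': activities that have all given needs."""
    return {a for a, ns in context.items() if needs <= ns}

# Closure of {receive_order, check_credit} yields a formal concept:
intent = common_needs({"receive_order", "check_credit"})
extent = activities_with(intent)
print(intent, extent)   # {'customer_data'} {'receive_order', 'check_credit'}
```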