Search results for: Complexity theory
Paper Count: 2302

2272 A Review of Existing Turnover Intention Theories

Authors: Pauline E. Ngo-Henha

Abstract:

Existing turnover intention theories are reviewed in this paper. The review was conducted using the search keyword “turnover intention theories” in Google Scholar in July 2017. These theories include: the Theory of Organizational Equilibrium (TOE), Social Exchange Theory, Job Embeddedness Theory, Herzberg’s Two-Factor Theory, the Resource-Based View, Equity Theory, Human Capital Theory, and Expectancy Theory. One limitation of this review is that data were collected only from Google Scholar, where many papers were not freely accessible. Nevertheless, the paper attempts to contribute to the research by clarifying the distinction between theories and models in the context of turnover intention.

Keywords: Job embeddedness theory, theory of organizational equilibrium (TOE), Herzberg’s two-factor theory, turnover intention theories, theories and models.

2271 Selective Intra Prediction Mode Decision for H.264/AVC Encoders

Authors: Jun Sung Park, Hyo Jung Song

Abstract:

H.264/AVC offers considerably higher coding efficiency than other compression standards such as MPEG-2, but its computational complexity is significantly increased. In this paper, we propose selective mode decision schemes for fast intra prediction mode selection. The objective is to reduce the computational complexity of the H.264/AVC encoder without significant rate-distortion performance degradation. In the proposed schemes, the intra prediction complexity is reduced by limiting the luma and chroma prediction modes using the directional information of the 16×16 prediction mode. Experimental results show that the proposed schemes reduce the complexity by up to 78% while maintaining similar PSNR quality, with an average bit rate increase of about 1.46%.

Keywords: Video encoding, H.264, Intra prediction.
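
The abstract gives no pseudocode, but the core idea (the direction of the winning 16×16 mode prunes the 4×4 candidate list) can be sketched as below. The candidate tables and the SAD cost are illustrative assumptions, not the paper's actual mapping.

```python
import numpy as np

# Hypothetical pruning tables: the best 16x16 mode (0=V, 1=H, 2=DC,
# 3=Plane, per H.264) limits which of the nine 4x4 luma modes are tried.
CANDIDATES_4X4 = {
    0: [0, 2, 5, 7],     # vertical 16x16 -> vertical-leaning 4x4 modes + DC
    1: [1, 2, 6, 8],     # horizontal -> horizontal-leaning modes + DC
    2: [0, 1, 2],        # DC -> only the three cheapest modes
    3: [2, 3, 4, 5, 6],  # plane -> diagonal modes + DC
}

def sad(block, pred):
    """Sum of absolute differences, the usual low-cost distortion proxy."""
    return int(np.abs(block.astype(np.int32) - pred.astype(np.int32)).sum())

def select_modes(mb, preds16):
    """preds16 maps each 16x16 mode to its predicted block; returns the
    winning 16x16 mode and the pruned 4x4 candidate list."""
    m16 = min(preds16, key=lambda m: sad(mb, preds16[m]))
    return m16, CANDIDATES_4X4[m16]
```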

2270 Dynamically Monitoring Production Methods for Identifying Structural Changes relevant to Logistics

Authors: Marco Kennemann, Steffen C. Eickemeyer, Peter Nyhuis

Abstract:

Due to the growing dynamics and complexity of the market environment, production enterprises in particular are faced with new logistic challenges. Moreover, it is in this dynamic environment that the Logistic Operating Curves Theory reaches its limits as a method for describing the correlations between the logistic objectives. In order to extend this theory into a method for dynamically monitoring production, this paper introduces methods for reliably and quickly identifying structural changes relevant to logistics.

Keywords: Dynamics, Logistic Operating Curves, Production Logistics, Production Planning and Control

2269 A New H.264-Based Rate Control Algorithm for Stereoscopic Video Coding

Authors: Yi Liao, Wencheng Yang, Gangyi Jiang

Abstract:

Based on an investigation of the impact of the complexity of stereoscopic frame pairs on stereoscopic video coding and transmission, a new rate control algorithm is presented. The proposed algorithm operates on three levels: the stereoscopic group of pictures (SGOP) level, the stereoscopic frame (SFrame) level, and the frame level. A temporal-spatial frame complexity model is first established; in the bit allocation stage, the frame complexity, the position significance, and the reference relationship between the left and right frames are taken into account. Meanwhile, the target buffer level is set according to the frame complexity. Experimental results show that the proposed method controls the bit rate efficiently and outperforms the fixed quantization parameter method from a rate-distortion perspective, with an average PSNR gain between rate-distortion curves (BDPSNR) of 0.21 dB.

Keywords: Stereoscopic video coding, rate control, stereoscopic group of pictures, complexity of stereoscopic frame pairs.
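
A minimal sketch of the complexity-weighted bit split at the SFrame level, under our own assumption of a fixed bias toward the reference (left) view; the weighting used in the paper may differ.

```python
def allocate_sframe_bits(budget, c_left, c_right, ref_weight=1.2):
    """Split an SFrame's bit budget between the left (reference) and
    right views in proportion to their measured complexities.
    ref_weight is an assumed bias toward the reference view."""
    w_l = ref_weight * c_left
    w_r = c_right
    bits_left = budget * w_l / (w_l + w_r)
    return bits_left, budget - bits_left

print(allocate_sframe_bits(48_000, c_left=3.1, c_right=2.4))
```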

2268 Urdu Nastaleeq Optical Character Recognition

Authors: Zaheer Ahmad, Jehanzeb Khan Orakzai, Inam Shamsher, Awais Adnan

Abstract:

This paper discusses the characteristics of the Urdu script, the Nastaleeq style, and a simple yet novel and robust technique for recognizing printed Urdu script without a lexicon. Urdu, belonging to the family of Arabic scripts, is cursive and complex by nature; the main complexity of Urdu compound/connected text lies not in the connections themselves but in the forms/shapes the characters take when placed at the initial, middle, or final position of a word. The character recognition technique presented here uses this inherent complexity of the Urdu script to solve the problem: a word is scanned and analyzed for its level of complexity, and the point where the level of complexity changes is marked as a character boundary, segmented, and fed to a neural network. A prototype of the system has been tested on Urdu text and currently achieves an average accuracy of 93.4%.

Keywords: Cursive Script, OCR, Urdu.
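
One plausible reading of the segmentation step is sketched below on a binarized word image; the complexity measure (black/white transitions per pixel column) and the jump threshold are our assumptions, not the paper's definitions.

```python
import numpy as np

def complexity_profile(img):
    """Per-column stroke complexity of a binarized word image (1 = ink):
    the number of black/white transitions down each pixel column."""
    return np.abs(np.diff(img.astype(np.int8), axis=0)).sum(axis=0)

def boundary_columns(img, jump=2):
    """Columns where the complexity level changes sharply are marked as
    candidate character boundaries for the neural-network stage."""
    prof = complexity_profile(img)
    return [i + 1 for i in range(len(prof) - 1)
            if abs(int(prof[i + 1]) - int(prof[i])) >= jump]

word = np.zeros((24, 40), dtype=np.uint8)
word[10:14, 2:20] = 1                        # one stroke: 2 transitions
word[4:8, 20:30] = word[14:20, 20:30] = 1    # two strokes: 4 transitions
print(boundary_columns(word))                # boundary near column 20
```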

2267 Performance Complexity Measurement of Tightening Equipment Based on Kolmogorov Entropy

Authors: Guoliang Fan, Aiping Li, Xuemei Liu, Liyun Xu

Abstract:

The performance of tightening equipment declines over the course of the working process in a manufacturing system, manifesting mainly as increasing randomness and discretization of the tightening performance. To evaluate the degradation tendency of the tightening performance accurately, a complexity measurement approach based on Kolmogorov entropy is presented. First, the states of the performance index are partitioned to calibrate the degree of discretization. Then the complexity measurement model based on Kolmogorov entropy is built; the model describes the performance degradation tendency of the tightening equipment quantitatively. Finally, a case study is used to verify the efficiency and validity of the approach. The results show that the presented complexity measurement can effectively evaluate the degradation tendency of tightening equipment and can provide a theoretical basis for preventive maintenance and equipment life prediction.

Keywords: Complexity measurement, Kolmogorov entropy, manufacturing system, performance evaluation, tightening equipment.
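
A rough sketch of the approach under stated assumptions: partition the performance index into discrete states, then estimate an entropy rate in the Kolmogorov sense from block entropies. Binning, block length, and the synthetic data below are illustrative.

```python
import numpy as np
from collections import Counter

def block_entropy(symbols, m):
    """Shannon entropy of length-m blocks of the state sequence."""
    blocks = Counter(tuple(symbols[i:i + m]) for i in range(len(symbols) - m + 1))
    n = sum(blocks.values())
    p = np.array(list(blocks.values())) / n
    return float(-(p * np.log(p)).sum())

def complexity_measure(x, n_states=8, m=3, tau=1.0):
    """Partition the performance index into n_states, then estimate the
    entropy rate K ~ (H_{m+1} - H_m) / tau: a higher K means the
    tightening behaviour has become more random, i.e. more degraded."""
    x = np.asarray(x, dtype=float)
    edges = np.linspace(x.min(), x.max(), n_states + 1)
    states = np.clip(np.digitize(x, edges) - 1, 0, n_states - 1)
    return (block_entropy(states, m + 1) - block_entropy(states, m)) / tau

rng = np.random.default_rng(1)
healthy = np.sin(np.linspace(0, 60, 600)) + 0.05 * rng.standard_normal(600)
worn = np.sin(np.linspace(0, 60, 600)) + 0.50 * rng.standard_normal(600)
print(complexity_measure(healthy), complexity_measure(worn))  # worn is higher
```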

2266 Capacity Flexibility within Production

Authors: Johannes Nywlt, Julian Becker, Sebastian Bertsch

Abstract:

Due to the high dynamics of current markets, the expectations regarding logistics increase steadily. However, the complexity and variety of products and production processes make it difficult to understand the interdependencies between logistical objectives and their determining factors, so specific models are needed to meet this challenge. The Logistic Operating Curves Theory is such a model: with its aid, the basic correlations between the logistic objectives can be described. Within this model, capacity flexibility represents an important parameter, but a proper mathematical description of it is still missing. This paper develops such a description in order to make the Logistic Operating Curves Theory more accurate.

Keywords: Capacity flexibility, Production controlling, Production logistics, Production management.

2265 Clustering Categorical Data Using Hierarchies (CLUCDUH)

Authors: Gökhan Silahtaroğlu

Abstract:

Clustering large populations is an important problem when the data contain noise and clusters of different shapes. A good clustering algorithm or approach should be efficient enough to detect clusters sensitively. Besides space complexity, time complexity also gains importance as the data size grows. Using hierarchies, we developed a new algorithm that splits attributes according to the values they take, choosing the splitting dimension so as to divide the database into roughly equal parts as far as possible. At each node we calculate certain descriptive statistical features of the data residing there, and by pruning we generate the natural clusters with a complexity of O(n).

Keywords: Clustering, tree, split, pruning, entropy, gini.
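
A minimal sketch of the splitting idea on categorical records: at each node pick the attribute/value pair that divides the records most evenly, and prune undersized nodes as noise. The stopping rule is a simplified assumption, and the descriptive statistics kept at each node are omitted here.

```python
from collections import Counter

def split_tree(rows, min_size=5):
    """Recursively split categorical records on the (attribute, value)
    pair dividing the node most evenly; undersized nodes are pruned as
    noise, and unsplittable nodes become clusters (leaves)."""
    n = len(rows)
    if n < min_size:
        return []                                  # prune: noise
    _, j, v = min(((abs(n / 2 - c), j, v)
                   for j in range(len(rows[0]))
                   for v, c in Counter(r[j] for r in rows).items()),
                  key=lambda t: t[0])
    left = [r for r in rows if r[j] == v]
    right = [r for r in rows if r[j] != v]
    if not right:                                  # every attribute constant
        return [rows]
    return split_tree(left, min_size) + split_tree(right, min_size)

rows = [("red", "s"), ("red", "s"), ("red", "m"),
        ("blue", "l"), ("blue", "l"), ("blue", "l")] * 3
print([len(c) for c in split_tree(rows)])          # -> [6, 9]
```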

2264 Fundamental Concepts of Theory of Constraints: An Emerging Philosophy

Authors: Ajay Gupta, Arvind Bhardwaj, Arun Kanda

Abstract:

Dr. Eliyahu Goldratt did the pioneering work in the development of the Theory of Constraints, and many researchers around the globe have since worked to enhance this body of knowledge. In this paper, an attempt is made to compile the salient features of the theory from the work done by Goldratt and other researchers. The paper provides a good starting point for potential researchers interested in working on the Theory of Constraints. It will also help practicing managers by clarifying their concepts of the theory and will facilitate its successful implementation in their work areas.

Keywords: Drum-Buffer-Rope, Goldratt, Production Scheduling, Theory of Constraints.

2263 Application of Hardware Efficient CIC Compensation Filter in Narrow Band Filtering

Authors: Vishal Awasthi, Krishna Raj

Abstract:

In many communication and signal processing systems, it is highly desirable to implement an efficient narrow-band filter that decimates or interpolates the incoming signal. This paper presents a hardware-efficient compensated CIC filter for narrow-band decimation that increases the speed of down-sampling by using multiplierless decimation filters with a polyphase FIR filter structure. The proposed work analyzes the performance of the compensated CIC filter in terms of the improvement in frequency response and the reduction in hardware complexity, measured by the number of adders and multipliers, while producing the filtered results without alteration. The CIC compensator improves the frequency response in the passband of interest by 26.57%, with reductions in hardware complexity of 12.25% in multiplications per input sample (MPIS) and 23.4% in additions per input sample (APIS) with respect to an FIR filter.

Keywords: Multirate filtering, Narrow-band Signaling, Compensation Theory, CIC filter, Decimation, Compensation filter.
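
The percentages above come from the authors' specific design, but the mechanism (a droopy CIC decimator followed by a small shift-and-add FIR that lifts the passband edge) can be sanity-checked numerically. The 3-tap compensator below is a generic textbook-style choice, not the paper's filter.

```python
import numpy as np

def cic_droop(f, R, N):
    """Normalized |H| of an N-stage, R-fold CIC decimator at input-rate
    frequency f (cycles/sample); equals 1 at DC."""
    return np.abs(np.sinc(f * R) / np.sinc(f)) ** N

R, N = 8, 3
f = np.linspace(1e-6, 0.25 / R, 512)          # passband of interest
# 3-tap compensator [-1/16, 9/8, -1/16]: taps are sums of powers of two,
# so the filter needs only shifts and adds; it runs at the low rate.
comp_gain = 9 / 8 - (1 / 8) * np.cos(2 * np.pi * f * R)

for name, mag in [("CIC alone  ", cic_droop(f, R, N)),
                  ("compensated", cic_droop(f, R, N) * comp_gain)]:
    droop_db = 20 * np.log10(mag.min() / mag.max())
    print(f"{name}: passband droop {droop_db:.2f} dB")
```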

2262 Stochastic Control of Decentralized Singularly Perturbed Systems

Authors: Walid S. Alfuhaid, Saud A. Alghamdi, John M. Watkins, M. Edwin Sawan

Abstract:

Designing a controller for stochastic decentralized interconnected large-scale systems usually involves a high degree of complexity and computational effort. Noise, observability and controllability of all system states, connectivity, and channel bandwidth are further constraints on design procedures for distributed large-scale systems. The quasi-steady-state model investigated in this paper is a reduced-order model of the original system obtained by singular perturbation techniques. The paper develops an optimal control synthesis for an observer-based feedback controller using standard stochastic control theory, namely the Linear Quadratic Gaussian (LQG) approach and Kalman filter design, with lower complexity and computation requirements. A numerical example is given at the end to demonstrate the efficiency of the proposed method.

Keywords: Decentralized, optimal control, output, singular perturbation.
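
The LQG synthesis applied to the reduced quasi-steady-state model follows the standard separation principle; a minimal sketch on a toy second-order system (our own example, not the paper's) looks like this:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqg_gains(A, B, C, Q, R, W, V):
    """Standard LQG separation: LQR gain K (u = -K x_hat) from the
    control Riccati equation, Kalman gain L from its dual, for
    dx = Ax + Bu + w, y = Cx + v, noise covariances W and V."""
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)
    S = solve_continuous_are(A.T, C.T, W, V)
    L = S @ C.T @ np.linalg.inv(V)
    return K, L

# Toy reduced-order (quasi-steady-state) model, our assumption:
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
K, L = lqg_gains(A, B, C, Q=np.eye(2), R=np.eye(1),
                 W=0.1 * np.eye(2), V=0.01 * np.eye(1))
print("state-feedback gain K:", K, "\nobserver gain L:", L.ravel())
```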

2261 Comparative Study of Complexity in Streetscape Composition

Authors: Ahmed Mansouri, Naoji Matsumoto

Abstract:

This research is a comparative study of complexity, as a multidimensional concept, in the context of streetscape composition in Algeria and Japan. Eighty streetscape visual arrays were collected and presented to 20 participants with different cultural backgrounds, to be categorized and classified according to their degree of complexity. Three analysis methods were used: cluster analysis, a ranking method, and Hayashi's Quantification Method III. The results showed that complexity, disorder, irregularity, and disorganization are often conflicting concepts in the urban context. Algerian daytime streetscapes appear balanced, ordered, and regular, while Japanese daytime streetscapes appear unbalanced, regular, and vivid. Variety, richness, and irregularity, with some aspects of order and organization, characterize Algerian night streetscapes; Japanese night streetscapes relate more to balance, regularity, order, and organization, with some aspects of confusion and ambiguity. Complexity mainly characterized Algerian avenues with green infrastructure. Accordingly, for Japanese participants, Japanese traditional night streetscapes were complex, while for foreign participants, Algerian and Japanese avenue nightscapes were the most complex visual arrays.

Keywords: Streetscape, Nightscape, Complexity, Visual Array, Affordance, Cluster Analysis, Hayashi Quantification Method.

2260 Complexity Reduction Approach with Jacobi Iterative Method for Solving Composite Trapezoidal Algebraic Equations

Authors: Mohana Sundaram Muthuvalu, Jumat Sulaiman

Abstract:

In this paper, the application of a complexity reduction approach based on half- and quarter-sweep iteration concepts with the Jacobi iterative method for solving composite trapezoidal (CT) algebraic equations is discussed. The performance of the methods for CT algebraic equations is studied comparatively through their application to solving linear Fredholm integral equations of the second kind. Furthermore, a computational complexity analysis and numerical results for three test problems are included in order to verify the performance of the methods.

Keywords: Complexity reduction approach, Composite trapezoidal scheme, Jacobi method, Linear Fredholm integral equations
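
A compact sketch of the pipeline under illustrative choices of kernel, right-hand side, and tolerance: discretize the second-kind Fredholm equation with the composite trapezoidal rule, solve with Jacobi iteration, and realize the half-sweep idea by iterating on every second node only and interpolating the rest.

```python
import numpy as np

def trapezoidal_system(kernel, a, b, n, lam):
    """Discretize u(x) - lam * int_a^b K(x,t) u(t) dt = f(x) on n+1
    nodes with composite trapezoidal weights; returns nodes and matrix."""
    x = np.linspace(a, b, n + 1)
    w = np.full(n + 1, (b - a) / n)
    w[0] = w[-1] = 0.5 * (b - a) / n
    return x, np.eye(n + 1) - lam * kernel(x[:, None], x[None, :]) * w

def jacobi(A, f, tol=1e-10, max_iter=20_000):
    d, R = np.diag(A), A - np.diag(np.diag(A))
    u = np.zeros_like(f)
    for _ in range(max_iter):
        u_new = (f - R @ u) / d
        if np.max(np.abs(u_new - u)) < tol:
            break
        u = u_new
    return u_new

kernel, f = (lambda s, t: 0.5 * s * t), (lambda x: x)
x, A = trapezoidal_system(kernel, 0.0, 1.0, 64, lam=1.0)
u_full = jacobi(A, f(x))                       # full-sweep reference
xh, Ah = trapezoidal_system(kernel, 0.0, 1.0, 32, lam=1.0)
u_half = np.interp(x, xh, jacobi(Ah, f(xh)))   # half-sweep: half the system
print(np.max(np.abs(u_full - u_half)))         # grids agree up to O(h^2)
```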

2259 Low Complexity Hybrid Scheme for PAPR Reduction in OFDM Systems Based on SLM and Clipping

Authors: V. Sudha, D. Sriram Kumar

Abstract:

In this paper, we present a low-complexity hybrid scheme using conventional selective mapping (C-SLM) and clipping algorithms to reduce the high peak-to-average power ratio (PAPR) of the orthogonal frequency division multiplexing (OFDM) signal. In the proposed scheme, the input data sequence (X) is divided into two sub-blocks; the clipping algorithm is applied to the first sub-block, whereas the C-SLM algorithm is applied to the second, in order to reduce both computational complexity and PAPR. The resultant time-domain OFDM signal is obtained by combining the outputs of the two sub-blocks. Simulation results show that the proposed hybrid scheme provides a 0.45 dB PAPR reduction gain at a CCDF value of 10^-2 and a 52% reduction in computational complexity compared to the C-SLM scheme, at the expense of a slight degradation in bit error rate (BER) performance.

Keywords: CCDF, Clipping, OFDM, PAPR, SLM.
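
One plausible reading of the scheme is sketched below: split the subcarriers into two halves, clip the time signal of the first, run C-SLM on the second, and sum. The sub-block split, clipping ratio, and phase-set size are our assumptions.

```python
import numpy as np
rng = np.random.default_rng(0)

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def ofdm_time(X):
    return np.fft.ifft(X) * np.sqrt(len(X))

def clip(x, cr_db=4.0):
    """Amplitude clipping cr_db above the RMS level."""
    a = 10 ** (cr_db / 20) * np.sqrt((np.abs(x) ** 2).mean())
    mag = np.maximum(np.abs(x), 1e-12)
    return np.where(mag > a, a * x / mag, x)

def c_slm(X, n_cand=4):
    """Conventional SLM: try n_cand random +-1/+-j phase vectors and
    keep the candidate with the lowest PAPR (side info ignored here)."""
    cands = (ofdm_time(X * rng.choice([1, -1, 1j, -1j], len(X)))
             for _ in range(n_cand))
    return min(cands, key=papr_db)

N = 256
X = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], N)
X1 = np.concatenate([X[:N // 2], np.zeros(N // 2)])   # first sub-block
X2 = np.concatenate([np.zeros(N // 2), X[N // 2:]])   # second sub-block
hybrid = clip(ofdm_time(X1)) + c_slm(X2)
print("plain : %.2f dB" % papr_db(ofdm_time(X)))
print("hybrid: %.2f dB" % papr_db(hybrid))
```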

2258 Modeling the Symptom-Disease Relationship by Using Rough Set Theory and Formal Concept Analysis

Authors: Mert Bal, Hayri Sever, Oya Kalıpsız

Abstract:

Medical Decision Support Systems (MDSSs) are sophisticated, intelligent systems that can provide inference under lack of information and uncertainty. In such systems, various soft computing methods are used to model the uncertainty, including Bayesian networks, rough sets, artificial neural networks, fuzzy logic, inductive logic programming, genetic algorithms, and hybrids formed from combinations of these methods. In this study, symptom-disease relationships are represented by a framework modeled with formal concept analysis and rough set theory, with diseases as objects and symptoms as attributes. After a concept lattice is formed, Bayes' theorem can be used to determine the relationships between attributes and objects. A discernibility relation, which forms the basis of rough sets, can be applied to the attribute data sets in order to reduce the attributes and decrease the complexity of computation.

Keywords: Formal Concept Analysis, Rough Set Theory, Granular Computing, Medical Decision Support System.
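
The discernibility-based reduction step can be illustrated on a toy symptom table: keep the smallest symptom subset that still distinguishes every pair of patients with different diagnoses. The table and the brute-force search are ours; real data would call for a proper reduct algorithm.

```python
from itertools import combinations

def discerns(rows, labels, attrs):
    """True if the attribute subset still discerns every pair of objects
    that carry different decision labels (the rough-set criterion)."""
    for (r1, l1), (r2, l2) in combinations(zip(rows, labels), 2):
        if l1 != l2 and all(r1[a] == r2[a] for a in attrs):
            return False
    return True

def reduct(rows, labels):
    """Smallest attribute subset preserving discernibility (brute force,
    fine for a handful of symptoms)."""
    n = len(rows[0])
    for k in range(1, n + 1):
        for attrs in combinations(range(n), k):
            if discerns(rows, labels, attrs):
                return attrs
    return tuple(range(n))

# Toy symptom table: rows = symptom values, labels = diagnosed disease.
rows = [(1, 0, 1), (1, 1, 0), (0, 1, 1), (0, 0, 0)]
labels = ["flu", "flu", "cold", "none"]
print(reduct(rows, labels))    # -> (0, 1): symptom 2 is redundant
```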

2257 Learner Autonomy Based On Constructivism Learning Theory

Authors: Haiyan Wang

Abstract:

Constructivism learning theory lays emphasis on learners' active learning, including learning initiative, sociality, and context. By analyzing the relationship between constructivism learning theory and learner autonomy, this paper explores how to cultivate learner autonomy under the guidance of constructivism learning theory.

Keywords: Constructivism learning theory, learner autonomy, relationship, cultivation.

2256 A Low Complexity Frequency Offset Estimation for MB-OFDM based UWB Systems

Authors: Wang Xue, Liu Dan, Liu Ying, Wang Molin, Qian Zhihong

Abstract:

A low-complexity, high-accuracy frequency offset estimator for multi-band orthogonal frequency division multiplexing (MB-OFDM) based ultra-wideband systems is presented, covering different carrier frequency offsets, different channel frequency responses, and different preamble patterns in the different bands. Utilizing a half-cycle Constant Amplitude Zero Autocorrelation (CAZAC) sequence as the preamble, an estimator with a semi-cross contrast scheme between two successive OFDM symbols is proposed. The CRLB and the complexity of the proposed algorithm are derived. Compared to the reference estimators, the proposed method achieves significantly lower complexity (about 50% less) for all preamble patterns of the MB-OFDM systems, while its accuracy remains close to the CRLB.

Keywords: CAZAC, Frequency Offset, Semi-cross Contrast, MB-OFDM, UWB
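
The semi-cross contrast scheme itself is not specified in the abstract; the sketch below shows the underlying principle (a Moose-style estimate from the phase drift between two repeated CAZAC preamble symbols), with the sample rate and the offset as assumed values.

```python
import numpy as np
rng = np.random.default_rng(3)

def estimate_cfo(r1, r2, t_sep):
    """Frequency offset from two received copies of the same preamble
    symbol t_sep seconds apart: the phase of their correlation divided
    by 2*pi*t_sep (the paper's scheme refines this basic idea)."""
    return np.angle(np.vdot(r1, r2)) / (2 * np.pi * t_sep)

N, u = 128, 7
n = np.arange(N)
zc = np.exp(-1j * np.pi * u * n ** 2 / N)      # even-length Zadoff-Chu (CAZAC)
fs, f_off = 528e6, 40e3                        # assumed MB-OFDM sample rate
t = np.arange(2 * N) / fs
rx = (np.tile(zc, 2) * np.exp(2j * np.pi * f_off * t)
      + 0.05 * (rng.standard_normal(2 * N) + 1j * rng.standard_normal(2 * N)))
print("estimated CFO: %.0f Hz" % estimate_cfo(rx[:N], rx[N:], N / fs))
```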

2255 Data-driven Multiscale Tsallis Complexity: Application to EEG Analysis

Authors: Young-Seok Choi

Abstract:

This work proposes data-driven multiscale quantitative measures to reveal the underlying complexity of the electroencephalogram (EEG), applied to a rodent model of hypoxic-ischemic brain injury and recovery. Motivated by the fact that real EEG recordings are nonlinear and non-stationary over different frequencies or scales, an approach more suitable than conventional single-scale tools is needed for analyzing EEG data. Here we present a new framework of complexity measures that considers changing dynamics over multiple oscillatory scales. The proposed multiscale complexity is obtained by calculating entropies of the probability distributions of the intrinsic mode functions extracted by empirical mode decomposition (EMD) of the EEG. To quantify EEG recordings of a rat model of hypoxic-ischemic brain injury following cardiac arrest, the multiscale version of Tsallis entropy is examined. To validate the proposed complexity measure, actual EEG recordings from rats (n=9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Experimental results demonstrate that the multiscale Tsallis entropy leads to better discrimination of injury levels and improved correlation with the neurological deficit evaluation 72 hours after cardiac arrest, suggesting an effective metric for prognosis.

Keywords: Electroencephalogram (EEG), multiscale complexity, empirical mode decomposition, Tsallis entropy.
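
A sketch of the measure under stated assumptions: IMFs come from the third-party PyEMD package (pip: EMD-signal), and the per-scale probability distribution is a normalized amplitude histogram of each IMF. The paper's exact probability construction may differ.

```python
import numpy as np
from PyEMD import EMD   # third-party package, assumed available

def tsallis(p, q=2.0):
    """Tsallis entropy S_q = (1 - sum p_i**q) / (q - 1)."""
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def multiscale_tsallis(signal, q=2.0, bins=64):
    """One Tsallis entropy value per oscillatory scale (IMF)."""
    imfs = EMD().emd(np.asarray(signal, dtype=float))
    out = []
    for imf in imfs:
        hist, _ = np.histogram(imf, bins=bins)
        out.append(tsallis(hist / hist.sum(), q))
    return np.array(out)

# Synthetic two-scale example standing in for an EEG trace:
t = np.linspace(0, 1, 2000)
x = (np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
     + 0.1 * np.random.default_rng(2).standard_normal(t.size))
print(multiscale_tsallis(x))
```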

2254 A P-SPACE Algorithm for Groebner Bases Computation in Boolean Rings

Authors: Quoc-Nam Tran

Abstract:

The theory of Groebner bases, which has recently been honored with the ACM Paris Kanellakis Theory and Practice Award, has become a crucial building block of computer algebra and is widely used in science, engineering, and computer science. It is well known that Groebner bases computation is EXP-SPACE in the general setting. In this paper, we give an algorithm showing that Groebner bases computation is P-SPACE in Boolean rings. We also show that, with this discovery, the Groebner bases method can theoretically be as efficient as other methods for the automated verification of hardware and software. Additionally, Groebner bases have many useful and interesting properties, including the ability to efficiently convert bases between different variable orders, which make them a promising method in automated verification.

Keywords: Algorithm, Complexity, Groebner basis, Applications of Computer Science.
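
A small illustration of Groebner bases in a Boolean ring using sympy, assuming its groebner() accepts the modulus option for coefficients in GF(2); the field equations v**2 + v keep every polynomial multilinear, which is what makes the Boolean case special.

```python
from sympy import symbols, groebner

x, y, z = symbols("x y z")
# Boolean ring: work mod 2 and adjoin the field equations v**2 + v = 0.
field = [x**2 + x, y**2 + y, z**2 + z]
# Tiny circuit check: z = x AND y, plus the assertion z = x.
system = [z + x * y,     # z - x*y = 0 over GF(2) (AND gate)
          z + x]         # z + x = 0 (assertion)
G = groebner(system + field, x, y, z, modulus=2, order="lex")
print(list(G.exprs))     # the reduced basis makes the hidden
                         # constraint between the inputs explicit
```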

2253 High Accuracy Eigensolutions in Elasticity for Boundary Integral Equations by Nyström Method

Authors: Pan Cheng, Jin Huang, Guang Zeng

Abstract:

Elastic boundary eigensolution problems are converted into boundary integral equations by potential theory. The kernels of the boundary integral equations have both logarithmic and Hilbert singularities simultaneously. We present mechanical quadrature methods for solving eigensolutions of the boundary integral equations that deal with both kinds of singularities at the same time. The methods possess high accuracy O(h^3) and low computational complexity. Convergence and stability are proved based on Anselone's collectively compact operator theory. Based on the asymptotic error expansion with odd powers, we can greatly improve the accuracy of the approximation and derive an a posteriori error estimate that can be used for constructing self-adaptive algorithms. The efficiency of the algorithms is illustrated by numerical examples.

Keywords: Boundary integral equation, extrapolation algorithm, a posteriori error estimate, elasticity.
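
The odd-power error expansion is what makes extrapolation pay: combining the h and h/2 solutions cancels the leading h^3 term. A generic sketch with a synthetic error model:

```python
def richardson(u_h, u_h2, p=3):
    """Extrapolate a method with error c*h**p + O(h**(p+2)): the weighted
    combination of the h and h/2 results cancels the leading term."""
    return (2 ** p * u_h2 - u_h) / (2 ** p - 1)

# Synthetic check against a known limit L with an h^3 + h^5 error model:
L = 2.0
approx = lambda h: L + 0.7 * h ** 3 + 0.01 * h ** 5
print(abs(approx(0.05) - L))                        # plain h/2 error
print(abs(richardson(approx(0.1), approx(0.05)) - L))  # extrapolated
```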

2252 An Enhanced Distributed System to Improve the Time Complexity of Binary Indexed Trees

Authors: Ahmed M. Elhabashy, A. Baes Mohamed, Abou El Nasr Mohamad

Abstract:

Distributed computing systems are usually considered the most suitable model for the practical implementation of many parallel algorithms. In this paper an enhanced distributed system is presented to improve the time complexity of Binary Indexed Trees (BIT). The proposed system uses multiple uniform processors with identical architectures and a specially designed distributed memory system. The analysis of this system shows that it reduces the time complexity of the read query to O(log log N) and of the update query to constant complexity, while the naive solution has a time complexity of O(log N) for both queries. The system was implemented and simulated using the VHDL and Verilog hardware description languages, with Xilinx ISE 10.1 as the development environment and ModelSim 6.1c as the simulation tool. The simulation showed that the overhead resulting from the wiring and communication between the system fragments can be fairly neglected, making it possible in practice to reach the maximum speed-up offered by the proposed model.

Keywords: Binary Indexed Tree (BIT), Least Significant Bit (LSB), Parallel Adder (PA), Very High Speed Integrated Circuits Hardware Description Language (VHDL), Distributed Parallel Computing System (DPCS).
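
For reference, the classic single-processor baseline the paper improves on is the Fenwick/Binary Indexed Tree with O(log N) reads and updates, both driven by the least significant bit:

```python
class BIT:
    """Classic Fenwick/Binary Indexed Tree: O(log N) prefix-sum reads
    and point updates. The paper's distributed design lowers these to
    O(log log N) reads and O(1) updates by spreading the tree levels
    across parallel processors."""
    def __init__(self, n):
        self.n = n
        self.t = [0] * (n + 1)

    def update(self, i, delta):        # add delta at index i (1-based)
        while i <= self.n:
            self.t[i] += delta
            i += i & (-i)              # climb via the least significant bit

    def query(self, i):                # prefix sum of [1, i]
        s = 0
        while i > 0:
            s += self.t[i]
            i -= i & (-i)
        return s

bit = BIT(16)
bit.update(3, 5); bit.update(7, 2)
print(bit.query(10))                   # -> 7
```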

2251 On Generalizing Rough Set Theory via using a Filter

Authors: Serkan Narlı, Ahmet Z. Ozcelik

Abstract:

The theory of rough sets is generalized by using a filter. The filter is induced by binary relations and is used to generalize the basic rough set concepts. Knowledge representation and processing of binary relations in the style of rough set theory are investigated.

Keywords: Rough set, fuzzy set, membership function, knowledge representation and processing, information theory

2250 A Family of Minimal Residual Based Algorithm for Adaptive Filtering

Authors: Noor Atinah Ahmad

Abstract:

The Minimal Residual (MR) method is modified for adaptive filtering applications. Three forms of MR-based algorithm are presented: (i) the low-complexity SPCG, (ii) MREDSI, and (iii) MREDSII. The low-complexity SPCG is a reduced-complexity version of a previously proposed SPCG algorithm: the approximations introduced reduce it to an LMS-type algorithm while maintaining the superior convergence of the original SPCG. Both MREDSI and MREDSII are MR-based methods with Euclidean directions of search. The choice of Euclidean directions is shown via simulation to give better misadjustment than their gradient-search counterparts.

Keywords: Adaptive filtering, Adaptive least square, Minimal residual method.
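
The Euclidean-direction idea (update along one coordinate axis per step rather than along the gradient) can be sketched as a generic member of that family; this is our own sketch, not the paper's SPCG/MREDSI/MREDSII recursions.

```python
import numpy as np

def eds_filter(x, d, m=8, lam=0.99, eps=1e-6):
    """Euclidean Direction Search sketch: keep exponentially weighted
    estimates of the autocorrelation matrix A and cross-correlation b,
    and at each sample do an exact line search along one coordinate
    direction e_k: the 1-D minimizer is (b[k] - A[k] @ w) / A[k, k]."""
    w = np.zeros(m)
    A = eps * np.eye(m)
    b = np.zeros(m)
    for n in range(m - 1, len(x)):
        u = x[n - m + 1:n + 1][::-1]       # regressor, newest sample first
        A = lam * A + np.outer(u, u)
        b = lam * b + d[n] * u
        k = n % m                          # cycle the search direction
        w[k] += (b[k] - A[k] @ w) / A[k, k]
    return w

rng = np.random.default_rng(5)
x = rng.standard_normal(4000)
h_true = np.array([1.0, -0.5, 0.25, 0.0, 0.1, 0.0, 0.0, 0.0])
d = np.convolve(x, h_true)[:len(x)] + 0.01 * rng.standard_normal(len(x))
print(np.round(eds_filter(x, d), 2))       # approaches h_true
```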

2249 Weak Measurement Theory for Discrete Scales

Authors: Jan Newmarch

Abstract:

With the increasing spread of computers and the internet among culturally, linguistically, and geographically diverse communities, issues of internationalization and localization are becoming increasingly important. For some of these issues, such as different scales for length and temperature, there is a well-developed measurement theory; for others, such as date formats, no such theory is possible. This paper fills a gap by developing a measurement theory for a class of scales previously overlooked, based on discrete and interval-valued scales such as spanner and shoe sizes. The paper gives a theoretical foundation for a class of data representation problems.

Keywords: Data representation, internationalisation, localisation, measurement theory.
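
A minimal sketch of an interval-valued discrete scale as the abstract describes it, with a hypothetical spanner scale as the example; the representation (a label covering a half-open interval of an underlying unit) is our reading.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DiscreteScale:
    """Interval-valued discrete scale: label i covers the half-open
    interval [lo + i*step, lo + (i+1)*step) in an underlying unit."""
    lo: float
    step: float

    def label(self, value):            # measure: unit value -> size label
        return int((value - self.lo) // self.step)

    def interval(self, label):         # interpret: label -> covered interval
        return (self.lo + label * self.step,
                self.lo + (label + 1) * self.step)

# Hypothetical spanner scale in millimetres, 1 mm granularity from 4 mm:
spanner = DiscreteScale(lo=4.0, step=1.0)
print(spanner.label(6.3), spanner.interval(2))   # -> 2 (6.0, 7.0)
```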

2248 Assessing the Relation between Theory of Multiple Algebras and Universal Algebras

Authors: Mona Taheri

Abstract:

In this study, we examine multiple algebras and the algebraic structures derived from them, and by stating a characteristic theorem on multiple algebras we show that the theory of multiple algebras is a natural extension of the theory of universal algebras. We also treat equivalence relations on multiple algebras for which the quotient constructed modulo them is a universal algebra, and study the basic relation and the fundamental algebra in question.

Keywords: Multiple algebras, universal algebras.

2247 Evaluating Sinusoidal Functions by a Low Complexity Cubic Spline Interpolator with Error Optimization

Authors: Abhijit Mitra, Harpreet Singh Dhillon

Abstract:

We present a novel scheme to evaluate sinusoidal functions with low complexity and high precision using cubic spline interpolation. To this end, two different approaches are proposed to find the interpolating polynomial of sin(x) within the range [-π, π]. The first deals with only a single data point, while the other uses two, to keep the realization cost as low as possible. An approximation error optimization technique for cubic spline interpolation is introduced next and is shown to increase the interpolator accuracy without increasing the complexity of the associated hardware. Architectures for the proposed approaches are also developed, and they exhibit implementation flexibility with low power requirements.

Keywords: Arithmetic, spline interpolator, hardware design, error analysis, optimization methods.
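
For context, the accuracy-versus-knots trade-off can be reproduced with an off-the-shelf cubic spline; the paper's hardware-oriented construction (one or two data points per segment plus error optimization) will behave differently in detail.

```python
import numpy as np
from scipy.interpolate import CubicSpline

xs = np.linspace(-np.pi, np.pi, 10_001)
for n_knots in (5, 9, 17):
    xk = np.linspace(-np.pi, np.pi, n_knots)
    cs = CubicSpline(xk, np.sin(xk))       # not-a-knot ends by default
    err = np.max(np.abs(cs(xs) - np.sin(xs)))
    print(f"{n_knots:2d} knots: max |sin(x) - spline(x)| = {err:.2e}")
```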

2246 Handling Complexity of a Complex System Design: Paradigm, Formalism and Transformations

Authors: Hycham Aboutaleb, Bruno Monsuez

Abstract:

Current systems' complexity has reached a degree that requires addressing conception and design issues while taking into account environmental, operational, social, legal, and financial aspects. Therefore, one of the main challenges is the way complex systems are specified and designed. The exponentially growing effort, cost, and time investment of complex systems in the modeling phase emphasize the need for a paradigm, a framework, and an environment to handle system model complexity. For that, it is necessary to understand the expectations of the human user of the model and his limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model needs to fulfill to meet human user expectations, and suggests a graph-based formalism for modeling complex systems. Finally, a set of transformations is defined to handle the model complexity.

Keywords: Higraph-based, formalism, system engineering paradigm, modeling requirements, graph-based transformations.

2245 A Cognitive Measurement of Complexity and Comprehension for Object-Oriented Code

Authors: Amit Kumar Jakhar, Kumar Rajnish

Abstract:

Measuring inherent complexity is one of the difficult tasks in the software engineering field. Further, it is said that there are no physical laws or standard guidelines suited to designing different types of software. Hence, to make software engineering a mature engineering discipline like others, it needs its own theoretical frameworks and laws. Software design and development is a human effort that takes a lot of time and considers various parameters for the successful completion of the software. Cognitive informatics plays an important role in understanding the essential characteristics of software. The aim of this work is to consider the fundamental characteristics of the source code of object-oriented software, i.e., complexity and understandability. The complexity of programs is analyzed with the help of important attributes extracted from the source code, which are further utilized to evaluate the understandability factor. These characteristics are analyzed on the basis of 16 C++ programs distributed to forty MCA students, who each tried to understand the source code of the given programs; the mean time is taken as the actual time needed to understand a program. For validation of this work, Briand's framework is used, and the presented metric is also evaluated comparatively against an existing metric, which demonstrates its robustness.

Keywords: Software metrics, object-oriented, complexity, cognitive weight, understandability, basic control structures.
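
A crude sketch of weighting basic control structures by cognitive weights, following Wang's cognitive-informatics scale (sequence 1, branch 2, iteration 3); the paper's attribute set is richer and also accounts for measured comprehension time.

```python
import re

# Assumed weights for basic control structures in C++ source:
WEIGHTS = {r"\bif\b": 2, r"\belse\b": 2, r"\bfor\b": 3,
           r"\bwhile\b": 3, r"\bswitch\b": 3}

def cognitive_weight(src):
    """Linear cognitive-weight estimate: 1 for the sequential body plus
    weighted counts of control structures (a full metric would also
    multiply weights for nesting, which this sketch ignores)."""
    return 1 + sum(w * len(re.findall(p, src)) for p, w in WEIGHTS.items())

cpp = """
int sum_pos(const int *a, int n) {
    int s = 0;
    for (int i = 0; i < n; ++i)
        if (a[i] > 0) s += a[i];
    return s;
}
"""
print(cognitive_weight(cpp))   # 1 + 3 (for) + 2 (if) = 6
```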

2244 On the Solution of the Towers of Hanoi Problem

Authors: Hayedeh Ahrabian, Comfar Badamchi, Abbass Nowzari-Dalini

Abstract:

In this paper, two versions of an iterative loopless algorithm for the classical Towers of Hanoi problem with O(1) storage complexity and O(2^n) time complexity are presented. Based on this algorithm, the number of moves of each disc between the pegs, together with its direction, is formulated.

Keywords: Loopless algorithm, Binary tree, Towers of Hanoi.
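
The move rule behind such formulations is classical: at move k, the disc to move is given by the lowest set bit of k, and each disc cycles among the pegs in a fixed direction. A compact sketch (ours, not the paper's two versions):

```python
def hanoi_moves(n):
    """Iterative Towers of Hanoi: at move k the disc moved is the index
    of the lowest set bit of k, and each disc always cycles through the
    pegs in a fixed direction. Moves the whole tower from peg 0 to peg 2
    in 2**n - 1 moves. (We keep an O(n) table of disc positions only to
    report src/dst; the move rule itself needs no history.)"""
    pos = {d: 0 for d in range(1, n + 1)}
    for k in range(1, 2 ** n):
        disc = (k & -k).bit_length()           # 1 + trailing zeros of k
        step = 2 if (n - disc) % 2 == 0 else 1 # fixed direction per disc
        src = pos[disc]
        dst = (src + step) % 3
        pos[disc] = dst
        yield disc, src, dst

for disc, src, dst in hanoi_moves(3):
    print(f"move disc {disc}: peg {src} -> peg {dst}")
```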

2243 A Deterministic Polynomial-time Algorithm for the Clique Problem and the Equality of P and NP Complexity Classes

Authors: Zohreh O. Akbari

Abstract:

In this paper a deterministic polynomial-time algorithm is presented for the Clique problem. The problem is treated as that of omitting the minimum number of vertices from the input graph so that none of the zeroes of the graph's adjacency matrix (except the main diagonal entries) remain in the adjacency matrix of the resulting subgraph. The existence of a deterministic polynomial-time algorithm for the Clique problem, an NP-complete problem, would prove the equality of the P and NP complexity classes.

Keywords: Clique problem, Deterministic Polynomial-time Algorithm, Equality of P and NP Complexity Classes.
