Search results for: low complexity
731 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models, because the models have non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve the American option under jump diffusion models by using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction decomposition technique is applied to rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
Keywords: Integral differential equations, American options, jump-diffusion model, rational approximation.
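The FFT-based matrix-vector product hinges on the fact that the dense matrices arising from discretized jump integrals are Toeplitz, and a Toeplitz matrix can be embedded in a circulant one whose product with a vector is a convolution. A minimal sketch in NumPy, assuming real data; the matrix entries below are placeholders, not the paper's actual discretization:

```python
import numpy as np

def toeplitz_matvec_fft(c, r, x):
    """Multiply a Toeplitz matrix T by x in O(M log M) via circulant embedding.
    c: first column of T, r: first row of T (c[0] == r[0]), x: vector."""
    M = len(x)
    # First column of the 2M x 2M circulant matrix that embeds T.
    v = np.concatenate([c, [0.0], r[:0:-1]])
    x_pad = np.concatenate([x, np.zeros(M)])
    # Circulant product = inverse FFT of the elementwise product of FFTs.
    y = np.fft.ifft(np.fft.fft(v) * np.fft.fft(x_pad))
    return y[:M].real

# Quick check against the O(M^2) dense product (placeholder kernel values).
M = 256
c = np.random.rand(M)
r = np.concatenate([[c[0]], np.random.rand(M - 1)])
x = np.random.rand(M)
T = np.array([[c[i - j] if i >= j else r[j - i] for j in range(M)] for i in range(M)])
assert np.allclose(T @ x, toeplitz_matvec_fft(c, r, x))
```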
730 Optimal Solution of Constraint Satisfaction Problems
Authors: Jeffrey L. Duffany
Abstract:
An optimal solution for a large number of constraint satisfaction problems can be found using the technique of substitution and elimination of variables, analogous to the technique used to solve systems of equations. A decision function f(A) = max(A²) is used to determine which variables to eliminate. The algorithm can be expressed in six lines and is remarkable in both its simplicity and its ability to find an optimal solution. However, it is inefficient in that it needs to square the updated A matrix after each variable elimination. To overcome this inefficiency, the algorithm is analyzed and it is shown that the A matrix only needs to be squared once, at the first step of the algorithm, and then incrementally updated at subsequent steps, resulting in a significant improvement and an algorithm complexity of O(n³).
Keywords: Algorithm, complexity, constraint, NP-complete.
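The incremental update can be seen directly from the definition of the matrix square: removing variable k from A removes exactly the terms A_ik·A_kj from each entry of A². A minimal sketch under the assumption that A is a real matrix (the paper's decision function and elimination rule are not reproduced here):

```python
import numpy as np

def eliminate(A, A2, k):
    """Remove variable k, updating the precomputed square A2 in O(n^2)
    instead of re-squaring in O(n^3): (A'^2)_ij = (A^2)_ij - A_ik * A_kj."""
    A2 = A2 - np.outer(A[:, k], A[k, :])
    keep = [i for i in range(A.shape[0]) if i != k]
    return A[np.ix_(keep, keep)], A2[np.ix_(keep, keep)]

A = np.random.rand(6, 6)
A2 = A @ A                                   # squared once, at the first step
A_red, A2_red = eliminate(A, A2, k=2)
assert np.allclose(A2_red, A_red @ A_red)    # the incremental update is exact
```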
729 An Efficient Multi Join Algorithm Utilizing a Lattice of Double Indices
Authors: Hanan A. M. Abd Alla, Lilac A. E. Al-Safadi
Abstract:
In this paper, a novel multi join algorithm to join multiple relations is introduced. The algorithm is based on a hash-based join algorithm of two relations that produces a double index. This is done by scanning the two relations once. Instead of moving the records into buckets, a double index is built, which eliminates the collisions that can occur in a complete hash algorithm. The double index is divided into join buckets of similar categories from the two relations. The algorithm then joins buckets with similar keys to produce joined buckets. This leads, in the end, to a complete join index of the two relations without actually joining the actual relations. The time complexity required to build the join index of two categories is O(m log m), where m is the size of each category, giving a total time complexity of O(n log m) over all buckets. The join index is used to materialize the joined relation if required. Otherwise, it is used along with the join indices of other relations to build a lattice for use in multi-join operations with minimal I/O requirements. The lattice of join indices can be fitted into main memory to reduce the time complexity of the multi join algorithm.
Keywords: Multi join, relation, lattice, join indices.
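As an illustration of the underlying idea, this hedged sketch builds a join index, a list of (row-id, row-id) pairs, for two relations by bucketing on the join key, without materializing the joined tuples; the relation layout and names are invented for the example, not taken from the paper:

```python
from collections import defaultdict

def build_join_index(R, S, key_r, key_s):
    """Return [(rid_R, rid_S), ...] for R join S on the given key columns,
    scanning each relation once and joining bucket-by-bucket."""
    buckets_r, buckets_s = defaultdict(list), defaultdict(list)
    for rid, row in enumerate(R):
        buckets_r[row[key_r]].append(rid)
    for sid, row in enumerate(S):
        buckets_s[row[key_s]].append(sid)
    # Only buckets with matching keys contribute to the join index.
    return [(rid, sid)
            for k in buckets_r.keys() & buckets_s.keys()
            for rid in buckets_r[k] for sid in buckets_s[k]]

R = [("alice", 1), ("bob", 2), ("carol", 1)]     # hypothetical rows
S = [(1, "engineering"), (2, "sales")]
print(build_join_index(R, S, key_r=1, key_s=0))  # e.g. [(0, 0), (2, 0), (1, 1)]
```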
728 Estimation of Component Reusability through Reusability Metrics
Authors: Aditya Pratap Singh, Pradeep Tomar
Abstract:
Software reusability is an essential characteristic of Component-Based Software (CBS). Component reusability is an important measure of the effective reuse of components in CBS. The attributes of reusability proposed by various researchers are studied, and four of them are identified as potential factors affecting reusability. This paper proposes a metric for the reusability estimation of black-box software components, along with metrics for interface complexity, understandability, customizability, and reliability. An experiment is performed to estimate reusability through a case study on a sample web application using a real-world component.
Keywords: Component-based software, component reusability, customizability, interface complexity, reliability, understandability.
727 Parametric Design as an Approach to Respond to Complexity
Authors: Sepideh Jabbari Behnam, Zahrasadat Saide Zarabadi
Abstract:
A city is an intertwined texture formed by the relationships among different components that are united into a whole, so designing and planning such a complex is not an easy matter. Considering that a city is a complex system with infinite components and communications, providing flexible layouts that can respond to the unpredictable character of the city, which results from its complexity, is inevitable. The parametric design approach, as a new approach, can produce flexible and transformative layouts at any stage of design. This study aims to introduce parametric design as a modern approach to responding to complex urban issues, using descriptive and analytical methods. The paper first introduces complex systems and then gives a brief characterization of them. Flexibility of design and layout is another matter to be considered in responding to and simulating complex urban systems, and it is also discussed in this study. In this regard, after describing the nature of the parametric approach as a flexible approach, and as a tool and appropriate way to respond to features such as limited predictability, reciprocating nature, complex communications, sensitivity to initial conditions, and hierarchy, this paper introduces parametric design.
Keywords: Complexity theory, complex system, flexibility, parametric design.
726 Multi Switched Split Vector Quantization of Narrowband Speech Signals
Authors: M. Satya Sai Ram, P. Siddaiah, M. Madhavi Latha
Abstract:
Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called Multi Switched Split Vector Quantization (MSSVQ), which is a hybrid of the multistage, switched, and split vector quantization techniques. The spectral distortion performance, computational complexity, and memory requirements of MSSVQ are compared to the split vector quantization (SVQ), multistage vector quantization (MSVQ), and switched split vector quantization (SSVQ) techniques. The results show that MSSVQ has better spectral distortion performance, lower computational complexity, and lower memory requirements than all of the above-mentioned product code vector quantization techniques. Computational complexity is measured in floating point operations (flops), and memory requirements are measured in floats.
Keywords: Linear predictive coding, multistage vector quantization, switched split vector quantization, split vector quantization, line spectral frequencies (LSF).
725 Low Complexity Peak-to-Average Power Ratio Reduction in Orthogonal Frequency Division Multiplexing System by Simultaneously Applying Partial Transmit Sequence and Clipping Algorithms
Authors: V. Sudha, D. Sriram Kumar
Abstract:
Orthogonal Frequency Division Multiplexing (OFDM) has been used in many advanced wireless communication systems due to its high spectral efficiency and robustness to frequency-selective fading channels. However, the major concern with OFDM systems is the high peak-to-average power ratio (PAPR) of the transmitted signal. Two popular techniques for PAPR reduction in OFDM systems are conventional partial transmit sequences (CPTS) and clipping. In this paper, a parallel combination/hybrid scheme for PAPR reduction using the clipping and CPTS algorithms is proposed. The proposed method intelligently applies both algorithms in order to reduce both PAPR and computational complexity. The scheme slightly degrades bit error rate (BER) performance due to the clipping operation, and this degradation can be reduced by selecting an appropriate value of the clipping ratio (CR). Simulation results show that the proposed algorithm achieves significant PAPR reduction with much reduced computational complexity.
Keywords: CCDF, OFDM, PAPR, PTS.
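To make the quantities concrete, here is a small hedged sketch under assumed QPSK modulation and an assumed clipping-ratio definition (amplitude threshold = CR times the RMS amplitude); the paper's exact hybrid scheduling of PTS and clipping is not reproduced:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def clip(x, cr):
    """Amplitude clipping: limit |x| to cr times the RMS amplitude."""
    a_max = cr * np.sqrt(np.mean(np.abs(x) ** 2))
    mag = np.abs(x)
    scale = np.minimum(1.0, a_max / np.maximum(mag, 1e-12))
    return x * scale

N = 256                                          # number of subcarriers
bits = np.random.randint(0, 2, (2, N))
sym = ((2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)) / np.sqrt(2)  # QPSK
x = np.fft.ifft(sym) * np.sqrt(N)                # OFDM time-domain symbol
print(f"PAPR before: {papr_db(x):.2f} dB, after: {papr_db(clip(x, 1.4)):.2f} dB")
```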
724 Enhanced Face Recognition with Daisy Descriptors Using 1BT Based Registration
Authors: Sevil Igit, Merve Meric, Sarp Erturk
Abstract:
In this paper, we propose to improve Daisy descriptor based face recognition using a novel One-Bit Transform (1BT) based pre-registration approach. The 1BT based pre-registration procedure is fast and has low computational complexity. It is shown that face recognition accuracy is improved with the proposed approach. The proposed approach can facilitate highly accurate face recognition using the Daisy descriptor with simple matching, and thereby provides a low-complexity approach.
Keywords: Face Recognition, Daisy Descriptor, One-Bit Transform, Image Registration.
723 Effective Methodology for Security Risk Assessment of Computer Systems
Authors: Daniel F. García, Adrián Fernández
Abstract:
Today, computer systems are more and more complex and are exposed to growing security risks. Security managers need effective security risk assessment methodologies that can model the increasing complexity of current computer systems well while keeping the complexity of the assessment procedure low. This paper provides a brief analysis of common security risk assessment methodologies, leading to the selection of a proper methodology that fulfills these requirements. Then, a detailed analysis of the most effective methodology is carried out, presenting numerical examples to demonstrate how easy it is to use.
Keywords: Computer security, qualitative and quantitative methods, risk assessment methodologies, security risk assessment.
722 Speech Data Compression using Vector Quantization
Authors: H. B. Kekre, Tanuja K. Sarode
Abstract:
Transforms, which are lossy algorithms, are mostly used for speech data compression. Such algorithms are tolerable for speech data compression since the loss in quality is not perceived by the human ear. However, vector quantization (VQ) has the potential to give more data compression while maintaining the same quality. In this paper we propose a speech data compression algorithm using the vector quantization technique. We have used the VQ algorithms LBG, KPE, and FCG. The results table shows the computational complexity of these three algorithms. We also introduce a new performance parameter, the Average Fractional Change in Speech Sample (AFCSS). Our FCG algorithm gives far better performance considering mean absolute error, AFCSS, and complexity as compared to the others.
Keywords: Vector quantization, data compression, encoding, speech coding.
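For reference, a compact hedged sketch of the LBG (Linde-Buzo-Gray) codebook training loop, showing the splitting-plus-refinement structure only; KPE, FCG, and the AFCSS measure are not reproduced here, and the training vectors are placeholders:

```python
import numpy as np

def lbg(train, n_codevectors, eps=0.01, n_iters=10):
    """Train a VQ codebook by repeatedly splitting centroids and refining
    them with k-means-style iterations (n_codevectors: a power of two)."""
    codebook = train.mean(axis=0, keepdims=True)
    while len(codebook) < n_codevectors:
        # Split every codevector into a perturbed pair.
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        for _ in range(n_iters):
            # Assign each training vector to its nearest codevector.
            d = ((train[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
            nearest = d.argmin(axis=1)
            for i in range(len(codebook)):        # update non-empty cells only
                if np.any(nearest == i):
                    codebook[i] = train[nearest == i].mean(axis=0)
    return codebook

frames = np.random.randn(1000, 8)     # placeholder speech feature vectors
print(lbg(frames, n_codevectors=16).shape)   # (16, 8)
```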
721 A Holistic Workflow Modeling Method for Business Process Redesign
Authors: Heejung Lee
Abstract:
In a highly competitive environment, it becomes more important to shorten the whole business process while delivering or even enhancing the business value to customers and suppliers. Although workflow management systems receive much attention for their capacity to practically support business process enactment, effective workflow modeling methods remain challenging, and the high degree of process complexity makes it more difficult to achieve short lead times. This paper presents a workflow structuring method, holistic in nature, that can reduce process complexity using activity-needs and formal concept analysis, which eventually enhances key performance measures such as quality, delivery, and cost in the business process.
Keywords: Workflow management, reengineering, formal concept analysis.
720 Blind Channel Estimation Based on URV Decomposition Technique for Uplink of MC-CDMA
Authors: Pradya Pornnimitkul, Suwich Kunaruttanapruk, Bamrung Tau Sieskul, Somchai Jitapunkul
Abstract:
In this paper, we investigate a blind channel estimation method for multi-carrier CDMA systems that uses a subspace decomposition technique. This technique exploits the orthogonality between the noise subspace and the received user codes to obtain the channel of each user. In the past, the Singular Value Decomposition (SVD) technique was used, but the SVD has high computational complexity. In this paper, a new algorithm called the URV decomposition, which serves as an intermediary between the QR decomposition and the SVD, replaces the SVD to track the noise subspace of the received data. The URV decomposition has almost the same estimation performance as the SVD but less computational complexity.
Keywords: Channel estimation, MC-CDMA, SVD, URV.
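The subspace step can be illustrated with the SVD baseline that the URV decomposition replaces: take the right singular vectors beyond the signal rank as a basis of the noise subspace. A hedged sketch with an invented signal model and dimensions:

```python
import numpy as np

def noise_subspace_svd(X, signal_rank):
    """Return an orthonormal basis of the noise subspace of the rows of X.
    The paper swaps this SVD step for a URV decomposition, which tracks
    the same subspace at lower computational cost."""
    _, _, Vh = np.linalg.svd(X, full_matrices=True)
    return Vh[signal_rank:].conj().T       # columns span the noise subspace

# Synthetic check: data confined to a known 3-dimensional signal subspace.
rng = np.random.default_rng(0)
basis = rng.standard_normal((8, 3))
X = rng.standard_normal((100, 3)) @ basis.T     # 100 snapshots in an 8-dim space
En = noise_subspace_svd(X, signal_rank=3)
print(np.abs(basis.T @ En).max())               # ~0: signal is orthogonal to it
```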
719 An Approach for Reducing the Computational Complexity of LAMSTAR Intrusion Detection System using Principal Component Analysis
Authors: V. Venkatachalam, S. Selvan
Abstract:
The security of computer networks plays a strategic role in modern computer systems. Intrusion Detection Systems (IDS) act as the 'second line of defense' placed inside a protected network, looking for known or potential threats in network traffic and/or audit data recorded by hosts. We developed an Intrusion Detection System using the LAMSTAR neural network to learn patterns of normal and intrusive activities and to classify observed system activities, and compared the performance of the LAMSTAR IDS with other classification techniques using five classes of KDDCup99 data. The LAMSTAR IDS gives better performance, at the cost of high computational complexity and long training and testing times, when compared to the other classification techniques (binary tree, RBF, and Gaussian mixture classifiers). We further reduced the computational complexity of the LAMSTAR IDS by reducing the dimension of the data using principal component analysis, which in turn reduces the training and testing time with almost the same performance.
Keywords: Binary tree classifier, Gaussian mixture, intrusion detection system, LAMSTAR, radial basis function.
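The dimensionality-reduction step is standard; a hedged sketch with scikit-learn on placeholder data (the actual KDDCup99 preprocessing and the LAMSTAR network are not shown; a decision tree stands in for the classifier):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 41))      # 41 features, as in KDDCup99
y = rng.integers(0, 5, 2000)             # 5 traffic classes (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pca = PCA(n_components=10).fit(X_tr)     # keep 10 principal components
clf = DecisionTreeClassifier().fit(pca.transform(X_tr), y_tr)
print("accuracy:", clf.score(pca.transform(X_te), y_te))
# Training and testing now run on 10-dimensional inputs instead of 41,
# which is where the reported time savings come from.
```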
718 A Novel Frequency Offset Estimation Scheme for OFDM Systems
Authors: Youngpo Lee, Seokho Yoon
Abstract:
In this paper, we propose a novel frequency offset estimation scheme for orthogonal frequency division multiplexing (OFDM) systems. By correlating the OFDM signals within the coherence phase bandwidth and employing a threshold in the frequency offset estimation process, the proposed scheme is not only robust to the timing offset but also has a reduced complexity compared with that of the conventional scheme. Moreover, a timing offset estimation scheme is also proposed as the next stage of the proposed frequency offset estimation. Numerical results show that the proposed scheme can estimate frequency offset with lower computational complexity and does not require additional memory while maintaining the same level of estimation performance.
Keywords: OFDM, frequency offset estimation, threshold.
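The core of such correlation-based estimators can be sketched as follows: if a training symbol repeats with period D samples, a carrier frequency offset of ε subcarrier spacings rotates the autocorrelation phase by 2πεD/N. A hedged sketch (the threshold logic and the paper's exact correlator are omitted):

```python
import numpy as np

def frac_cfo_estimate(r, D, nfft):
    """Estimate fractional CFO (in subcarrier spacings) from a received
    signal whose underlying training symbol satisfies s[n + D] = s[n]."""
    P = np.sum(np.conj(r[:D]) * r[D:2 * D])   # autocorrelation at lag D
    return np.angle(P) * nfft / (2 * np.pi * D)

nfft = 64
half = np.random.randn(nfft // 2) + 1j * np.random.randn(nfft // 2)
s = np.tile(half, 2)                          # repeated-halves training symbol
eps = 0.31                                    # true CFO; |eps| < 1 resolvable here
n = np.arange(nfft)
r = s * np.exp(2j * np.pi * eps * n / nfft)
print(frac_cfo_estimate(r, D=nfft // 2, nfft=nfft))   # ~0.31
```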
717 Gender Based Variability Time Series Complexity Analysis
Authors: Ramesh K. Sunkaria, Puneeta Marwaha
Abstract:
Nonlinear methods of heart rate variability (HRV) analysis are becoming more popular. It has been observed that complexity measures quantify the regularity and uncertainty of cardiovascular RR-interval time series. In the present work, SampEn has been evaluated in healthy normal sinus rhythm (NSR) male and female subjects for different data lengths and tolerance levels r. It is demonstrated that SampEn is small for higher values of the tolerance r. Also, the SampEn value of the healthy female group is higher than that of the healthy male group for short data lengths, and with increasing data length the two groups overlap each other and it becomes difficult to distinguish them. SampEn gives inaccurate results by assigning a higher value to the female group, because male subjects have a more complex HRV pattern than female subjects. Therefore, this traditional algorithm exhibits higher complexity for healthy female subjects than for healthy male subjects, which is a misleading observation. This may be due to the fact that SampEn does not account for the multiple time scales inherent in physiologic time series, and the hidden spatial and temporal fluctuations remain unexplored.
Keywords: Heart rate variability, normal sinus rhythm group, RR interval time series, sample entropy.
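For reference, a direct O(N²) sketch of sample entropy as commonly defined (template length m, tolerance r as a fraction of the standard deviation, Chebyshev distance, self-matches excluded); the parameter choices are illustrative:

```python
import numpy as np

def sampen(u, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B counts template pairs of length m
    within tolerance r (Chebyshev distance) and A does the same for m+1."""
    u = np.asarray(u, dtype=float)
    tol = r * u.std()

    def count(mm):
        # Use N - m templates for both lengths, as in the usual definition.
        templates = np.array([u[i:i + mm] for i in range(len(u) - m)])
        hits = 0
        for i in range(len(templates) - 1):   # i < j excludes self-matches
            d = np.abs(templates[i + 1:] - templates[i]).max(axis=1)
            hits += np.sum(d < tol)
        return hits

    A, B = count(m + 1), count(m)
    return -np.log(A / B)

rr = np.random.randn(500)                     # placeholder RR-interval series
print(sampen(rr, m=2, r=0.2))
```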
716 Mitigating the Cost of Empty Container Repositioning through the Virtual Container Yard: An Appraisal of Carriers’ Perceptions
Authors: L. Edirisinghe, Z. Jin, A. W. Wijeratne, R. Mudunkotuwa
Abstract:
Empty container repositioning is a fundamental problem faced by the shipping industry. The virtual container yard is a novel strategy, underpinning container interchange between carriers, that could substantially reduce this ever-increasing shipping cost. This paper evaluates the shipping industry's perception of the virtual container yard using chi-square tests. It examines whether carriers perceive that the selected independent variables, namely culture, organization, decision, marketing, attitudes, legal, independence, complexity, and stakeholders, impact the efficiency and benefits of the virtual container yard. There are two major findings. Firstly, carriers view that complexity, attitudes, and stakeholders may impact the effectiveness of container interchange and may influence the perceived benefits of the virtual container yard. Secondly, the three factors of legal, organization, and decision influence only the perceived benefits of the virtual container yard. Accordingly, the implementation of the virtual container yard will be influenced by six key factors: complexity, attitudes, stakeholders, legal, organization, and decision. Since the virtual container yard could reduce overall shipping costs, it is vital to examine carriers' perception of this concept.
Keywords: Virtual container yard, imbalance, management, inventory.
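The chi-square test of independence used for such perception data can be illustrated with SciPy on an invented contingency table (the counts below are placeholders, not the survey's results):

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: perceived complexity vs. perceived benefit level.
#             benefit: low  medium  high
table = [[14,  9,  4],     # complexity seen as low
         [ 6, 12, 15]]     # complexity seen as high

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
# p < 0.05 would suggest the two perceptions are not independent.
```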
715 Multi Switched Split Vector Quantizer
Authors: M. Satya Sai Ram, P. Siddaiah, M. Madhavi Latha
Abstract:
Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called Multi Switched Split Vector Quantization, a hybrid of two product code vector quantization techniques, namely the multistage vector quantization technique and the switched split vector quantization technique. Multi Switched Split Vector Quantization quantizes the linear predictive coefficients in terms of line spectral frequencies. The results show that Multi Switched Split Vector Quantization provides a better trade-off between bitrate and spectral distortion performance, computational complexity, and memory requirements when compared to the switched split vector quantization, multistage vector quantization, and split vector quantization techniques. By employing the switching technique at each stage of the vector quantizer, the spectral distortion, computational complexity, and memory requirements are greatly reduced. Spectral distortion is measured in dB, computational complexity in floating point operations (flops), and memory requirements in floats.
Keywords: Unconstrained vector quantization, linear predictive coding, split vector quantization, multistage vector quantization, switched split vector quantization, line spectral frequencies.
714 Fuzzy Based Problem-Solution Data Structure as a Data Oriented Model for ABS Controlling
Authors: Ahmad Habibizad Navin, Mehdi Naghian Fesharaki, Mohamad Teshnelab, Ehsan Shahamatnia
Abstract:
The anti-lock braking systems installed on vehicles for safe and effective braking are high-order, nonlinear, and time-variant. Using fuzzy logic controllers increases the efficiency of such systems, but imposes a high computational complexity as well. The main concept introduced by this paper is reducing the computational complexity of fuzzy controllers by deploying a problem-solution data structure. Unlike conventional methods that are based on calculations, this approach is based on data-oriented modeling.
Keywords: ABS, fuzzy controller, PSDS, time-memory trade-off, data-oriented modeling.
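The time-memory trade-off at the heart of a problem-solution data structure can be sketched generically: evaluate the (expensive) fuzzy controller offline over a quantized input grid, then answer online queries with an O(1) table lookup. The controller below is a stand-in, not the paper's ABS controller:

```python
import numpy as np

def fuzzy_controller(slip, decel):
    """Placeholder for an expensive fuzzy inference step."""
    return np.tanh(3 * (0.2 - slip)) * (1 + 0.1 * decel)

# Offline: precompute the problem-solution table over a quantized input grid.
slips = np.linspace(0.0, 1.0, 101)
decels = np.linspace(0.0, 10.0, 51)
table = np.array([[fuzzy_controller(s, d) for d in decels] for s in slips])

def lookup(slip, decel):
    """Online: O(1) nearest-grid-point lookup instead of fuzzy inference."""
    i = int(round(slip * 100))
    j = int(round(decel / 10.0 * 50))
    return table[min(max(i, 0), 100), min(max(j, 0), 50)]

print(lookup(0.18, 4.2), fuzzy_controller(0.18, 4.2))  # near-identical values
```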
713 Formal Analysis of a Public-Key Algorithm
Authors: Markus Kaiser, Johannes Buchmann
Abstract:
In this article, a formal specification and verification of the Rabin public-key scheme in a formal proof system is presented. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. A major objective of this article is the presentation of the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Moreover, we explicate a (computer-proven) formalization of correctness as well as a computer verification of security properties using a straightforward computation model in Isabelle/HOL. The analysis uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness, as well as efficient computer proofs of security properties, is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. Consequently, we obtain reliable proofs with a minimal error rate that augment the used database, which provides a formal basis for more computer proof constructions in this area.
Keywords: Public-key encryption, Rabin public-key scheme, formal proof system, higher-order logic, formal verification.
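For readers unfamiliar with the scheme being verified: Rabin encryption squares the message modulo n = pq, and decryption computes the four modular square roots via the Chinese remainder theorem. A hedged executable sketch with toy parameters (real deployments need large primes and message redundancy to pick the right root):

```python
def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

p, q = 10007, 10079               # toy Blum primes (both are 3 mod 4)
n = p * q

m = 424242                        # plaintext, 0 <= m < n
c = pow(m, 2, n)                  # encryption: c = m^2 mod n

# Decryption: square roots mod p and mod q (easy since p, q = 3 mod 4) ...
rp = pow(c, (p + 1) // 4, p)
rq = pow(c, (q + 1) // 4, q)
# ... combined with the CRT into the four square roots of c mod n.
_, yp, yq = egcd(p, q)            # yp*p + yq*q = 1
roots = {(sp * rp * yq * q + sq * rq * yp * p) % n
         for sp in (1, -1) for sq in (1, -1)}
assert m in roots                 # the plaintext is one of the four roots
print(sorted(roots))
```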
712 An Improved Integer Frequency Offset Estimator using the P1 Symbol for OFDM System
Authors: Yong-An Jung, Young-Hwan You
Abstract:
This paper suggests an improved integer frequency offset (IFO) estimation scheme using the P1 symbol for the orthogonal frequency division multiplexing (OFDM) based second-generation terrestrial digital video broadcasting (DVB-T2) system. The proposed IFO estimator is designed as a low-complexity blind IFO estimation scheme, implemented with complex additions. We also propose an active carriers (ACs) selection scheme in order to prevent performance degradation in blind IFO estimation. The simulation results show that, under the AWGN and TU6 channels, the proposed method has lower complexity than the conventional method with almost the same performance.
Keywords: OFDM, DVB-T2, P1 symbol, ACs, IFO.
711 Digital Paradoxes in Learning Theories
Authors: Marcello Bettoni
Abstract:
As a learning theory tries to borrow from science a framework to found its method, it shows paradoxes and paralysing contradictions. These result, on the one hand, from adopting a learning/teaching model as if it were a mere "transfer of data" (the mechanical learning approach), and on the other hand from borrowing complexity theory (an indeterministic and non-linear model), which risks nullifying every educational effort. This work aims to describe the existing criticism, unveil the antinomic nature of such paradoxes, and focus on a view in which neither the mechanical learning perspective nor the chaotic and nonlinear model can threaten and jeopardize the educational work. The author goes back over the steps that led to these paradoxes and unveils their antinomic nature. This could serve the purpose of explaining some current misunderstandings about the real usefulness of ICT within young people's learning process and growth.
Keywords: Antinomy, complexity, Leibniz, paradox.
710 Efficient and Effective Gabor Feature Representation for Face Detection
Authors: Yasuomi D. Sato, Yasutaka Kuriya
Abstract:
We propose an improved version of elastic graph matching (EGM) as a face detector, called multi-scale EGM (MS-EGM). In this improvement, a Gabor wavelet-based pyramid reduces the computational complexity of the feature representation often used in conventional EGM while preserving a critical amount of information about the image. The MS-EGM gives higher detection performance than the Viola-Jones object detection algorithm with its AdaBoost cascade of Haar-like features. We also show that the detection speed of the MS-EGM is comparable to that of the Viola-Jones method. We find fruitful benefits in the MS-EGM in terms of topological feature representation of a face.
Keywords: Face detection, Gabor wavelet based pyramid, elastic graph matching, topological preservation, redundancy of computational complexity.
709 A Comparative Study of High Order Rotated Group Iterative Schemes on Helmholtz Equation
Authors: Norhashidah Hj. Mohd Ali, Teng Wai Ping
Abstract:
In this paper, we present a high-order group explicit method for solving the two-dimensional Helmholtz equation. The presented method is derived from a nine-point fourth-order finite difference approximation formula obtained from a 45-degree rotation of the standard grid, which makes possible the construction of an iterative procedure with reduced complexity. The developed method is compared with the existing group iterative schemes available in the literature in terms of computational time, iteration counts, and computational complexity. The comparative performance of the methods is discussed and reported.
Keywords: Explicit group method, finite difference, Helmholtz equation, rotated grid, standard grid.
708 Computational Aspects of Regression Analysis of Interval Data
Authors: Michal Cerny
Abstract:
We consider linear regression models where both the input data (the values of the independent variables) and the output data (the observations of the dependent variable) are interval-censored. We introduce a possibilistic generalization of the least squares estimator, the so-called OLS-set for the interval model. This set captures the impact of the loss of information on the OLS estimator caused by interval censoring and provides a tool for quantifying this effect. We study the complexity-theoretic properties of the OLS-set. We also deal with restricted versions of the general interval linear regression model, in particular the crisp input – interval output model. We give an argument that natural descriptions of the OLS-set in the crisp input – interval output model cannot be computed in polynomial time. We then derive easily computable approximations of the OLS-set which can be used instead of the exact description. We illustrate the approach with an example.
Keywords: Linear regression, interval-censored data, computational complexity.
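For the crisp input – interval output case, one easily computable approximation (not necessarily the paper's) is a coordinate-wise interval enclosure: since the OLS estimator β(y) = Hy with H = (XᵀX)⁻¹Xᵀ is linear in y, each coefficient attains its extremes over the output box at endpoints chosen by the sign of the corresponding weight. A hedged sketch with invented data:

```python
import numpy as np

def ols_set_enclosure(X, y_lo, y_hi):
    """Coordinate-wise interval enclosure of the OLS-set when inputs are
    crisp and outputs are intervals: beta(y) = H y is linear in y, so each
    coefficient's min/max over the box is attained at endpoints of y."""
    H = np.linalg.pinv(X)                       # H = (X'X)^{-1} X'
    lo = np.where(H > 0, H * y_lo, H * y_hi).sum(axis=1)
    hi = np.where(H > 0, H * y_hi, H * y_lo).sum(axis=1)
    return lo, hi

X = np.column_stack([np.ones(5), np.arange(5.0)])   # intercept + slope model
y_lo = np.array([0.9, 1.8, 3.1, 3.9, 5.0])          # interval-censored outputs
y_hi = y_lo + 0.4
print(ols_set_enclosure(X, y_lo, y_hi))   # per-coefficient OLS bounds
```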
707 Improvement over DV-Hop Localization Algorithm for Wireless Sensor Networks
Authors: Shrawan Kumar, D. K. Lobiyal
Abstract:
In this paper, we propose improved versions of the DV-Hop algorithm, the QDV-Hop and UDV-Hop algorithms, for better localization without the need for additional range-measurement hardware. The proposed algorithms focus on the third step of DV-Hop: first, the error terms in the estimated distances between an unknown node and the anchor nodes are separated, and then they are minimized. In the QDV-Hop algorithm, quadratic programming is used to minimize the error to obtain better localization. However, quadratic programming requires a special optimization toolbox, which increases computational complexity. On the other hand, the UDV-Hop algorithm achieves localization accuracy similar to that of QDV-Hop by solving an unconstrained optimization problem, which amounts to solving a system of linear equations without much increase in computational complexity. Simulation results show that the performance of our proposed schemes (QDV-Hop and UDV-Hop) is superior to DV-Hop and DV-Hop based algorithms in all considered scenarios.
Keywords: Wireless sensor networks, error term, DV-Hop algorithm, localization.
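The third DV-Hop step that both variants refine is a least-squares position fix from hop-based distance estimates. A hedged sketch of that baseline step (linearized trilateration; the QDV/UDV error-separation terms are not reproduced):

```python
import numpy as np

def dv_hop_position(anchors, d):
    """Least-squares position from anchor coordinates and estimated
    distances d (hop count x average hop size). Subtracting the last
    anchor's equation linearizes ||p - a_i||^2 = d_i^2."""
    a_k, d_k = anchors[-1], d[-1]
    A = 2 * (a_k - anchors[:-1])
    b = (d[:-1] ** 2 - d_k ** 2
         - (anchors[:-1] ** 2).sum(axis=1) + (a_k ** 2).sum())
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_p = np.array([3.0, 6.0])
d = np.linalg.norm(anchors - true_p, axis=1) + np.random.normal(0, 0.3, 4)
print(dv_hop_position(anchors, d))    # close to (3, 6)
```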
706 Dynamic Data Partition Algorithm for a Parallel H.264 Encoder
Authors: Juntae Kim, Jaeyoung Park, Kyoungkun Lee, Jong Tae Kim
Abstract:
The H.264/AVC standard is a highly efficient video codec providing high-quality video at low bit-rates. Because it employs advanced techniques, its computational complexity has increased. This complexity is the major problem in the implementation of a real-time encoder and decoder. Parallelism is one approach, which can be implemented on a multi-core system. We analyze macroblock-level parallelism, which ensures the same bit rate with high concurrency across processors. In order to reduce the encoding time, a dynamic data partition based on macroblock regions is proposed. This data partition has advantages in load balancing and data communication overhead. Using the data partition, the encoder obtains more than a 3.59x speed-up on a four-processor system. This work can be applied to other multimedia processing applications.
Keywords: H.264/AVC, video coding, thread-level parallelism, OpenMP, multimedia.
705 Computing Entropy for Ortholog Detection
Authors: Hsing-Kuo Pao, John Case
Abstract:
Biological sequences from different species are called orthologs if they evolved from a sequence of a common ancestor species and they have the same biological function. Approximations of the Kolmogorov complexity or entropy of biological sequences are already well known to be useful in extracting similarity information between such sequences, in the interest, for example, of ortholog detection. As is well known, the exact Kolmogorov complexity is not algorithmically computable. In practice one can approximate it by computable compression methods. However, such compression methods do not provide a good approximation to Kolmogorov complexity for short sequences. Herein is suggested a new approach to overcome the problem that compression approximations may not work well on short sequences. This approach is inspired by new, conditional computations of Kolmogorov entropy. A main contribution of the empirical work described shows that the new set of entropy-based machine learning attributes provides good separation between positive (ortholog) and negative (non-ortholog) data, better than with good, previously known alternatives (which do not employ some means to handle short sequences well). Also empirically compared are the new entropy-based attribute set and a number of other, more standard similarity attribute sets commonly used in genomic analysis. The various similarity attributes are evaluated by cross validation, through boosted decision tree induction C5.0, and by Receiver Operating Characteristic (ROC) analysis. The results point to the conclusion that the new, entropy-based attribute set by itself is not the one giving the best prediction; however, it is the best attribute set for use in improving the other, standard attribute sets when conjoined with them.
Keywords: compression, decision tree, entropy, ortholog, ROC.
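A standard computable stand-in for the Kolmogorov-complexity-based similarity mentioned here is the normalized compression distance (NCD); a hedged sketch using zlib (the paper's conditional-entropy attributes are more elaborate, and the sequences below are toy strings):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: approximates Kolmogorov-based
    similarity with a real compressor; smaller means more similar."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

s1 = b"ATGGCGTACGTTAGC" * 20          # toy sequences, not real orthologs
s2 = b"ATGGCGTACGTTAGG" * 20
s3 = b"TTTACCGGAAACCTG" * 20
print(ncd(s1, s2), ncd(s1, s3))       # the related pair scores lower
```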
704 Modeling Hybrid Systems with MLD Approach and Analysis of the Model Size and Complexity
Authors: H. Mahboubi, B. Moshiri, A. Khaki Seddigh
Abstract:
Recently, a great amount of interest has been shown in the field of modeling and controlling hybrid systems. One of the efficient and common methods in this area utilizes mixed logical dynamical (MLD) systems in the modeling. In this method, the system constraints are transformed into mixed-integer inequalities by defining some logic statements. In this paper, a system containing three tanks is modeled as a nonlinear switched system using the MLD framework. Comparing the model size of the three-tank system with that of a two-tank system, it is deduced that the number of binary variables, the size of the system, and its complexity increase tremendously with the number of tanks, which makes the control of the system more difficult. Therefore, methods should be found which result in fewer mixed-integer inequalities.
Keywords: Hybrid systems, mixed-integer inequalities, mixed logical dynamical systems, multi-tank system.
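The transformation referred to is the standard big-M construction: each logic statement introduces a binary variable δ ∈ {0, 1} and a pair of mixed-integer inequalities. As a worked example under the usual assumptions (m and M are lower and upper bounds of f(x) over the operating domain, ε a small positive tolerance), the equivalence [f(x) ≤ 0] ↔ [δ = 1] is encoded as

f(x) ≤ M(1 − δ),    f(x) ≥ ε + (m − ε)δ.

If δ = 1, the first inequality forces f(x) ≤ 0; if δ = 0, the second forces f(x) ≥ ε > 0. Every switching condition contributes such a pair, which is one way to see why the number of binary variables and inequalities grows rapidly with the number of tanks.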
703 Extended Constraint Mask Based One-Bit Transform for Low-Complexity Fast Motion Estimation
Authors: Oğuzhan Urhan
Abstract:
In this paper, an improved motion estimation (ME) approach based on the weighted constrained one-bit transform is proposed for the block-based ME employed in video encoders. Binary ME approaches utilize a low bit-depth representation of the original image frames with a Boolean exclusive-OR based, hardware-efficient matching criterion to decrease the computational burden of the ME stage. The weighted constrained one-bit transform (WC-1BT) based approach improves the performance of the conventional C-1BT based ME by employing a 2-bit depth constraint mask instead of a 1-bit depth mask. In this work, the range of the constraint mask is further extended to increase the ME performance of the WC-1BT approach. Experiments reveal that the proposed method provides better ME accuracy compared with existing similar ME methods in the literature.
Keywords: Fast motion estimation, low-complexity motion estimation, video coding.
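A hedged sketch of the baseline the paper builds on: a one-bit transform obtained by comparing each pixel with a filtered version of the frame, and block matching by XOR bit counts (the constraint-mask weighting itself is not reproduced here):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def one_bit_transform(frame, size=17):
    """1 bit per pixel: is the pixel brighter than its local average?"""
    return frame > uniform_filter(frame.astype(float), size=size)

def best_match(cur_bits, ref_bits, y, x, block=16, search=7):
    """Full search over a +/-search window using the number of
    mismatching bits (XOR count) as the matching cost."""
    cur = cur_bits[y:y + block, x:x + block]
    best = (0, 0, block * block + 1)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ry, rx = y + dy, x + dx
            if (0 <= ry and 0 <= rx and ry + block <= ref_bits.shape[0]
                    and rx + block <= ref_bits.shape[1]):
                cost = np.count_nonzero(cur ^ ref_bits[ry:ry + block, rx:rx + block])
                if cost < best[2]:
                    best = (dy, dx, cost)
    return best

ref = np.random.randint(0, 256, (64, 64))
cur = np.roll(ref, (2, -3), axis=(0, 1))      # synthetic motion
print(best_match(one_bit_transform(cur), one_bit_transform(ref), 24, 24))
# expect (dy, dx) = (-2, 3) with a near-zero cost
```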
702 Interbank Networks and the Benefits of Using Multilayer Structures
Authors: Danielle Sandler dos Passos, Helder Coelho, Flávia Mori Sarti
Abstract:
Complexity science seeks the understanding of systems by adopting diverse theories from various areas. Network analysis has been gaining space and credibility, namely with biological, social, and economic systems. A significant part of the literature focuses only on monolayer representations of connections among agents, which consider one level of their relationships and exclude other levels of interaction, leading to simplistic results in network analysis. Therefore, this work aims to demonstrate the advantages of using multilayer networks for the representation and analysis of networks. To this end, we analyzed an interbank network composed of 42 banks, comparing the centrality measures of the agents (degree and PageRank) resulting from each method (monolayer vs. multilayer). The multilayer analysis proved to be the more reliable and efficient for the study of such networks, and it highlighted JP Morgan and Deutsche Bank as the most important banks of the analyzed network.
Keywords: Complexity, interbank networks, multilayer networks, network analysis.
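As a minimal illustration of the monolayer-versus-multilayer contrast, the sketch below computes PageRank per layer and on the collapsed graph with NetworkX; the banks, layers, and exposures are invented, not the paper's 42-bank data, and simple averaging stands in for whatever multilayer aggregation the study uses:

```python
import networkx as nx

# Two layers over the same node set, e.g. short-term vs. long-term exposures.
layers = {
    "short_term": [("JPM", "DB"), ("DB", "HSBC"), ("HSBC", "JPM")],
    "long_term":  [("DB", "JPM"), ("JPM", "BNP"), ("BNP", "DB")],
}

# Monolayer view: collapse all layers into one graph first.
mono = nx.DiGraph([e for edges in layers.values() for e in edges])
print("monolayer:", nx.pagerank(mono))

# Multilayer view: rank within each layer, then aggregate (here: average).
per_layer = {name: nx.pagerank(nx.DiGraph(edges)) for name, edges in layers.items()}
multi = {v: sum(pr.get(v, 0.0) for pr in per_layer.values()) / len(layers)
         for v in mono.nodes}
print("multilayer:", multi)
```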