Search results for: time complexity measurements.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7673

7673 A Technique for Reachability Graph Generation for the Petri Net Models of Parallel Processes

Authors: Farooq Ahmad, Hejiao Huang, Xiaolong Wang

Abstract:

Reachability graph (RG) generation suffers from exponential space and time complexity. To alleviate the more critical problem of time complexity, this paper presents a new approach to RG generation for Petri net (PN) models of parallel processes. Independent RGs for each parallel process in the PN structure are generated in parallel, and the cross-product of these RGs yields the exhaustive state space from which the RG of the given parallel system is determined. Complexity analysis of the presented algorithm shows a significant decrease in the time cost of RG generation. The proposed technique is applicable to parallel programs having multiple threads with synchronization.

Keywords: Parallel processes, Petri net, reachability graph, time complexity.
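
The construction the abstract describes can be illustrated in miniature. Below is a hedged sketch, not the paper's Petri-net algorithm: two tiny per-process reachability graphs are composed by an interleaving cross-product and then pruned by a made-up synchronization predicate; the graphs and the `allowed` rule are illustrative assumptions.

```python
from itertools import product

# Each per-process RG: state -> set of successor states (toy examples).
rg_a = {"a0": {"a1"}, "a1": {"a0"}}
rg_b = {"b0": {"b1"}, "b1": {"b0"}}

def cross_product(*rgs):
    """Interleaving product: exactly one component advances per transition."""
    states = list(product(*rgs))
    graph = {s: set() for s in states}
    for state in states:
        for i, rg in enumerate(rgs):
            for nxt in rg[state[i]]:
                graph[state].add(state[:i] + (nxt,) + state[i + 1:])
    return graph

def prune(graph, allowed):
    """Keep only states/transitions satisfying a synchronization predicate."""
    return {s: {t for t in ts if allowed(t)}
            for s, ts in graph.items() if allowed(s)}

# Hypothetical mutual-exclusion constraint: both processes never in "*1".
full = cross_product(rg_a, rg_b)
rg = prune(full, lambda s: not (s[0] == "a1" and s[1] == "b1"))
print(len(full), "states before pruning,", len(rg), "after")  # 4, 3
```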

Downloads: 1959
7672 Complexity of Component-based Development of Embedded Systems

Authors: M. Zheng, V. S. Alagar

Abstract:

The paper discusses the complexity of component-based development (CBD) of embedded systems. Although CBD has its merits, it must be augmented with methods to control the complexities that arise due to resource constraints, timeliness, and run-time deployment of components in embedded system development. Software component specification, system-level testing, and run-time reliability measurement are some ways to control the complexity.

Keywords: Components, embedded systems, complexity, software development, traffic controller system.

Downloads: 1446
7671 A Survey on Metrics of Software Cognitive Complexity for OO Design

Authors: A. Aloysius, L. Arockiam

Abstract:

In the modern era, the biggest challenge facing the software industry is the arrival of new technologies, so software engineers are gearing up to meet and manage change in large software systems. They also find it difficult to deal with software cognitive complexity. In the last few years many metrics were proposed to measure the cognitive complexity of software. This paper aims at a comprehensive survey of metrics of software cognitive complexity. Some classic and efficient software cognitive complexity metrics, such as Class Complexity (CC), Weighted Class Complexity (WCC), Extended Weighted Class Complexity (EWCC), Class Complexity due to Inheritance (CCI) and Average Complexity of a program due to Inheritance (ACI), are discussed and analyzed. The comparison and the relationship of these software complexity metrics are also presented.

Keywords: Software Metrics, Software Complexity, Cognitive Informatics, Cognitive Complexity, Software measurement
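
As a concrete illustration of the cognitive-weight idea behind metrics such as WCC and EWCC, here is a hedged sketch that sums Wang-style weights of basic control structures (sequence 1, branch 2, iteration 3, call 2) over a Python function's AST; the weight table and the flat summation are simplifying assumptions, not any single surveyed metric.

```python
import ast

# Assumed Wang-style weights for basic control structures.
WEIGHTS = {ast.If: 2, ast.For: 3, ast.While: 3, ast.Call: 2}

def cognitive_weight(source):
    total = 1                                  # base sequence weight
    for node in ast.walk(ast.parse(source)):
        total += WEIGHTS.get(type(node), 0)
    return total

src = """
def f(xs):
    s = 0
    for x in xs:
        if x > 0:
            s += abs(x)
    return s
"""
print(cognitive_weight(src))   # 1 + for(3) + if(2) + call(2) = 8
```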

Downloads: 2954
7670 Feature Selection Approaches with Missing Values Handling for Data Mining - A Case Study of Heart Failure Dataset

Authors: N. Poolsawad, C. Kambhampati, J. G. F. Cleland

Abstract:

In this paper, we investigate the characteristics of a clinical dataset with respect to feature selection and classification measurements that deal with the missing values problem, and we propose appropriate techniques to achieve the aim of the activity; this research aims to find features that have a high effect on mortality and on the mortality time frame. We quantify the complexity of a clinical dataset. According to the complexity of the dataset, we propose a data mining process to cope with its complexity: missing values, high dimensionality, and the prediction problem, by using methods of missing value replacement, feature selection, and classification. The experimental results will be extended to develop a prediction model for cardiology.

Keywords: feature selection, missing values, classification, clinical dataset, heart failure.
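
The generic pipeline the abstract describes (impute missing values, select features, then classify) can be sketched with scikit-learn. This is a hedged illustration under assumed synthetic data and an arbitrary k=5, not the paper's methods or dataset.

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = (X[:, 0] > 0).astype(int)              # label depends on feature 0
X[rng.random(X.shape) < 0.1] = np.nan      # then knock out ~10% of entries

clf = make_pipeline(
    SimpleImputer(strategy="mean"),        # missing-value replacement
    SelectKBest(f_classif, k=5),           # feature selection
    RandomForestClassifier(random_state=0),
)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```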

Downloads: 3155
7669 A Deterministic Polynomial-time Algorithm for the Clique Problem and the Equality of P and NP Complexity Classes

Authors: Zohreh O. Akbari

Abstract:

In this paper a deterministic polynomial-time algorithm is presented for the Clique problem. The problem is treated as that of omitting the minimum number of vertices from the input graph so that none of the zeroes of the graph's adjacency matrix (except the main diagonal entries) remain in the adjacency matrix of the resulting subgraph. The existence of a deterministic polynomial-time algorithm for the Clique problem, as an NP-complete problem, will prove the equality of the P and NP complexity classes.

Keywords: Clique problem, Deterministic Polynomial-time Algorithm, Equality of P and NP Complexity Classes.
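
The reformulation in the abstract can be checked on a toy graph: deleting a minimum set of vertices so no off-diagonal zeros remain in the adjacency matrix is exactly finding a maximum clique. The hedged sketch below is a brute-force check of that equivalence, emphatically not the paper's claimed polynomial-time algorithm.

```python
from itertools import combinations

adj = [[0, 1, 1, 0],
       [1, 0, 1, 0],
       [1, 1, 0, 1],
       [0, 0, 1, 0]]
n = len(adj)

def is_clique(vs):
    # No off-diagonal zeros among the kept vertices.
    return all(adj[u][v] for u, v in combinations(vs, 2))

best = max((vs for r in range(n + 1)
            for vs in combinations(range(n), r) if is_clique(vs)),
           key=len)
print("maximum clique:", best)  # -> (0, 1, 2)
```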

Downloads: 1757
7668 Control-flow Complexity Measurement of Processes and Weyuker's Properties

Authors: Jorge Cardoso

Abstract:

Process measurement is the task of empirically and objectively assigning numbers to the properties of business processes in such a way as to describe them. Desirable attributes to study and measure include complexity, cost, maintainability, and reliability. In our work we will focus on investigating process complexity. We define process complexity as the degree to which a business process is difficult to analyze, understand or explain. One way to analyze a process's complexity is to use a process control-flow complexity measure. In this paper, an attempt has been made to evaluate the control-flow complexity measure in terms of Weyuker's properties. Weyuker's properties must be satisfied by any complexity measure to qualify as a good and comprehensive one.

Keywords: Business process measurement, workflow, complexity.
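
For flavor, here is a hedged sketch of a control-flow complexity (CFC) calculation in the spirit of Cardoso's measure: XOR-splits contribute their fan-out, OR-splits contribute 2^fan_out - 1 (any non-empty subset of branches), and AND-splits contribute 1. The process encoding as a list of splits is an illustrative simplification.

```python
splits = [("XOR", 3), ("AND", 2), ("OR", 2)]  # (split type, fan-out)

def cfc(splits):
    total = 0
    for kind, fan_out in splits:
        if kind == "XOR":
            total += fan_out            # one mental state per branch
        elif kind == "OR":
            total += 2 ** fan_out - 1   # any non-empty branch subset
        elif kind == "AND":
            total += 1                  # all branches taken together
    return total

print(cfc(splits))  # 3 + 1 + 3 = 7
```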

Downloads: 2629
7667 Basic Calibration and Normalization Techniques for Time Domain Reflectometry Measurements

Authors: Shagufta Tabassum

Abstract:

The study of dielectric properties in a binary mixture of liquids is very useful to understand the liquid structure, molecular interaction, dynamics, and kinematics of the mixture. Time-domain reflectometry (TDR) is a powerful tool for studying the cooperation and molecular dynamics of the H-bonded system. Here we discuss the basic calibration and normalization procedure for TDR measurements. Our aim is to explain the different types of errors that occur during TDR measurements and how to minimize them.

Keywords: time domain reflectometry measurement technique, cable and connector loss, oscilloscope loss, normalization technique

Downloads: 419
7666 Project Complexity Indices based on Topology Features

Authors: Amer A. Boushaala

Abstract:

The heuristic decision rules used for project scheduling will vary depending upon the project's size, complexity, duration, personnel, and owner requirements. The concept of project complexity has received little detailed attention. The need to differentiate between easy and hard problem instances, and the interest in isolating the fundamental factors that determine the computing effort required by these procedures, inspired a number of researchers to develop various complexity measures. In this study, the most common measures of project complexity are presented, and a new measure of project complexity is developed. The main advantage of the proposed measure is that it considers size, shape and logic characteristics, time characteristics, resource demands and availability characteristics, as well as the number of critical activities and critical paths. The sensitivity of the proposed measure to the complexity of project networks has been tested and evaluated against the other complexity measures over the fifty project networks considered in this study. The developed measure showed more sensitivity to changes in the network data and gives accurate quantified results when comparing the complexities of networks.

Keywords: Activity networks, Complexity index, Network complexity measure, Network topology, Project network.

Downloads: 1616
7665 A Time-Reducible Approach to Compute Determinant |I-X|

Authors: Wang Xingbo

Abstract:

Computation of determinants of the form |I-X| is primary and fundamental because it can help to compute many other determinants. This article puts forward a time-reducible approach to compute the determinant |I-X|. The approach is derived from Newton's identity, and its time complexity is no more than that of computing the eigenvalues of the square matrix X. Mathematical deductions and a numerical example are presented in detail for the approach. By comparison, the new approach is proved to be superior to the classical ones, and it naturally reduces the computational time as the efficiency of computing the eigenvalues of the square matrix improves.

Keywords: Algorithm, determinant, computation, eigenvalue, time complexity.
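
The eigenvalue route the abstract alludes to rests on the identity det(I - X) = prod(1 - lambda_i) over the eigenvalues lambda_i of X, so the cost is dominated by the eigenvalue computation. A hedged numerical sketch (the paper derives this via Newton's identities; here numpy does the eigenvalue work):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 5))

det_direct = np.linalg.det(np.eye(5) - X)          # direct evaluation
det_eig = np.prod(1.0 - np.linalg.eigvals(X)).real  # product over spectrum
print(det_direct, det_eig)  # agree up to floating-point error
```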

Downloads: 1103
7664 Sequential Straightforward Clustering for Local Image Block Matching

Authors: Mohammad Akbarpour Sekeh, Mohd. Aizaini Maarof, Mohd. Foad Rohani, Malihe Motiei

Abstract:

Duplicated region detection is a technique to expose copy-paste forgeries in digital images. Copy-paste is one of the common types of forgery, cloning a portion of an image in order to conceal or duplicate a particular object. In this type of forgery detection, extracting robust block features and the high time complexity of the matching step are two main open problems. This paper concentrates on computational time and proposes a local block matching algorithm based on block clustering to improve time complexity. The time complexity of the proposed algorithm is formulated, and the effects of two parameters, block size and number of clusters, on its efficiency are considered. The experimental results and mathematical analysis demonstrate that this algorithm is more cost-effective than lexicographic algorithms in terms of time complexity when the image is complex.

Keywords: Copy-paste forgery detection, Duplicated region, Time complexity, Local block matching, Sequential block clustering.

Downloads: 1772
7663 Static and Dynamic Complexity Analysis of Software Metrics

Authors: Kamaljit Kaur, Kirti Minhas, Neha Mehan, Namita Kakkar

Abstract:

Software complexity metrics are used to predict critical information about the reliability and maintainability of software systems. Object-oriented software development requires a different approach to software complexity metrics. Object-oriented software metrics can be broadly classified into static and dynamic metrics. Static metrics give information at the code level, whereas dynamic metrics provide information on the actual runtime. In this paper we discuss the various complexity metrics and the comparison between static and dynamic complexity.

Keywords: Static Complexity, Dynamic Complexity, Halstead Metric, McCabe's Metric.
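
As a pocket example of one classic static metric named in the keywords, here is McCabe's cyclomatic complexity, V(G) = E - N + 2P for a control-flow graph with E edges, N nodes, and P connected components; the toy graph numbers are illustrative.

```python
def cyclomatic(edges, nodes, components=1):
    # McCabe: V(G) = E - N + 2P
    return edges - nodes + 2 * components

# Toy control-flow graph: 7 edges, 6 nodes, one component -> V(G) = 3,
# i.e. two decision points.
print(cyclomatic(edges=7, nodes=6))
```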

Downloads: 3156
7662 An Enhanced Distributed System to Improve the Time Complexity of Binary Indexed Trees

Authors: Ahmed M. Elhabashy, A. Baes Mohamed, Abou El Nasr Mohamad

Abstract:

Distributed computing systems are usually considered the most suitable model for practical solutions of many parallel algorithms. In this paper an enhanced distributed system is presented to improve the time complexity of binary indexed trees (BIT). The proposed system uses multiple uniform processors with identical architectures and a specially designed distributed memory system. The analysis of this system has shown that it reduces the time complexity of the read query to O(log(log(N))) and the update query to constant complexity, while the naive solution has a time complexity of O(log(N)) for both queries. The system was implemented and simulated using the VHDL and Verilog hardware description languages, with Xilinx ISE 10.1 as the development environment and ModelSim 6.1c as the simulation tool. The simulation has shown that the overhead resulting from the wiring and communication between the system fragments can be fairly neglected, which makes it applicable for practically reaching the maximum speed-up offered by the proposed model.

Keywords: Binary Indexed Tree (BIT), Least Significant Bit (LSB), Parallel Adder (PA), Very High Speed Integrated Circuits Hardware Description Language (VHDL), Distributed Parallel Computing System (DPCS).
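
For reference, here is a hedged sketch of the baseline the abstract improves on: a classical binary indexed tree with O(log N) point update and prefix-sum query. The paper's distributed O(log(log(N)))/O(1) design is hardware-level and not reproduced here.

```python
class BIT:
    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)

    def update(self, i, delta):          # 1-based index, O(log N)
        while i <= self.n:
            self.tree[i] += delta
            i += i & (-i)                # climb by least significant bit

    def query(self, i):                  # prefix sum of [1..i], O(log N)
        s = 0
        while i > 0:
            s += self.tree[i]
            i -= i & (-i)
        return s

bit = BIT(16)
bit.update(3, 5)
bit.update(7, 2)
print(bit.query(10))  # -> 7
```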

Downloads: 1711
7661 Object-Oriented Cognitive-Spatial Complexity Measures

Authors: Varun Gupta, Jitender Kumar Chhabra

Abstract:

Software maintenance, and mainly software comprehension, poses the largest cost in the software lifecycle. In order to assess the cost of software comprehension, various complexity measures have been proposed in the literature. This paper proposes new cognitive-spatial complexity measures, which combine the impact of the spatial as well as the architectural aspects of the software to compute software complexity. The spatial aspect of software complexity is taken into account using the lexical distances (in number of lines of code) between different program elements, and the architectural aspect is taken into consideration using the cognitive weights of control structures present in the control flow of the program. The proposed measures are evaluated using standard axiomatic frameworks and then compared with the corresponding existing cognitive complexity measures as well as the spatial complexity measures for object-oriented software. This study establishes that the proposed measures are better indicators of the cognitive effort required for software comprehension than the other existing complexity measures for object-oriented software.

Keywords: cognitive complexity, software comprehension, software metrics, spatial complexity, Object-oriented software
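
To make the "spatial" ingredient concrete, here is a hedged sketch of a lexical-distance measure: the distance, in lines of code, between where a function is defined and where it is called, averaged over all calls. Combining this with cognitive weights, as the paper proposes, is left out, and the traversal details are illustrative assumptions.

```python
import ast

def mean_spatial_distance(source):
    tree = ast.parse(source)
    defs, dists = {}, []
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            defs[node.name] = node.lineno            # definition site
        elif isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in defs:                 # use site
                dists.append(abs(node.lineno - defs[node.func.id]))
    return sum(dists) / len(dists) if dists else 0.0

src = "def f():\n    return 1\n\n\ndef g():\n    return f() + f()\n"
print(mean_spatial_distance(src))   # both calls are 5 lines below f's def
```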

Downloads: 2086
7660 AC Signals Estimation from Irregular Samples

Authors: Predrag B. Petrović

Abstract:

The paper deals with the estimation of amplitude and phase of an analogue multi-harmonic band-limited signal from irregularly spaced sampling values. To this end, assuming the signal fundamental frequency is known in advance (i.e., estimated at an independent stage), a complexity-reduced algorithm for signal reconstruction in the time domain is proposed. The reduction in complexity is achieved owing to completely new analytical and summarized expressions that enable a quick estimation at a low numerical error. The proposed algorithm for the calculation of the unknown parameters requires O((2M+1)^2) flops, while the straightforward solution of the obtained equations takes O((2M+1)^3) flops (M is the number of harmonic components). It is applied in signal reconstruction, spectral estimation, system identification, as well as in other important signal processing problems. The proposed method of processing can be used for precise RMS measurements (for power and energy) of a periodic signal based on the presented signal reconstruction. The paper investigates the errors related to the signal parameter estimation, and a computer simulation demonstrates the accuracy of these algorithms.

Keywords: Band-limited signals, Fourier coefficient estimation, analytical solutions, signal reconstruction, time.
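
The generic version of this estimation problem can be sketched as a linear least-squares fit: with the fundamental f0 known (as the abstract assumes), amplitudes and phases of M harmonics follow from cos/sin regressors at the irregular sample times. This is a hedged, straightforward O((2M+1)^3) formulation, not the paper's complexity-reduced algorithm; the signal and numbers are made up.

```python
import numpy as np

f0, M = 50.0, 3
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 0.04, 64))          # irregular sample times
x = (2.0 * np.sin(2 * np.pi * f0 * t + 0.3)
     + 0.5 * np.sin(2 * np.pi * 3 * f0 * t - 1.0))

# Design matrix with cos/sin columns per harmonic: 2M+1 unknowns.
cols = [np.ones_like(t)]
for k in range(1, M + 1):
    cols += [np.cos(2 * np.pi * k * f0 * t), np.sin(2 * np.pi * k * f0 * t)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, x, rcond=None)

amp = np.hypot(coef[1::2], coef[2::2])         # per-harmonic amplitude
print(np.round(amp, 3))                        # ~[2.0, 0.0, 0.5]
```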

Downloads: 1695
7659 A Cognitive Measurement of Complexity and Comprehension for Object-Oriented Code

Authors: Amit Kumar Jakhar, Kumar Rajnish

Abstract:

Inherited complexity is one of the difficult issues in the software engineering field. Further, it is said that there are no physical laws or standard guidelines suited to designing different types of software. Hence, to make software engineering a mature engineering discipline like others, it is necessary that it has its own theoretical frameworks and laws. Software design and development is a human effort which takes a lot of time and considers various parameters for successful completion of the software. Cognitive informatics plays an important role in understanding the essential characteristics of software. The aim of this work is to consider the fundamental characteristics of the source code of object-oriented software, i.e. complexity and understandability. The complexity of the programs is analyzed with the help of important attributes extracted from the source code, which are further utilized to evaluate the understandability factor. The aforementioned characteristics are analyzed on the basis of 16 C++ programs distributed to forty MCA students. They all tried to understand the source code of the given programs, and the mean time taken is used as the actual time needed to understand the program. For validation of this work, Briand's framework is used, and the presented metric is also evaluated comparatively with an existing metric, which proves its robustness.

Keywords: Software metrics, object-oriented, complexity, cognitive weight, understandability, basic control structures.

Downloads: 1062
7658 Hopf Bifurcation of a Predator-Prey Model with Time Delay and Habitat Complexity

Authors: Li Hongwei

Abstract:

In this paper, a predator-prey model with time delay and habitat complexity is investigated. By analyzing the characteristic equations, the local stability of each feasible equilibrium of the system is discussed, and the existence of a Hopf bifurcation at the coexistence equilibrium is established. By choosing the sum of the two delays as a bifurcation parameter, we show that Hopf bifurcations can occur as this parameter crosses some critical values. By deriving the equation describing the flow on the center manifold, we can determine the direction of the Hopf bifurcations and the stability of the bifurcating periodic solutions. Numerical simulations are carried out to illustrate the main theoretical results.

Keywords: Predator-prey system, delay, habitat complexity, Hopf bifurcation.

Downloads: 1816
7657 Attribute Weighted Class Complexity: A New Metric for Measuring Cognitive Complexity of OO Systems

Authors: Dr. L. Arockiam, A. Aloysius

Abstract:

In general, class complexity is measured based on any one of these factors: Lines of Code (LOC), Function Points (FP), Number of Methods (NOM), Number of Attributes (NOA), and so on. There are several new techniques, methods and metrics with different factors that have been developed by researchers for calculating the complexity of a class in Object Oriented (OO) software. Earlier, Arockiam et al. proposed a new complexity measure, namely Extended Weighted Class Complexity (EWCC), which is an extension of the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of the class and of the classes derived. In EWCC, the cognitive weight of each attribute is considered to be 1. The main problem with the EWCC metric is that every attribute holds the same value, but in general, the cognitive load in understanding different types of attributes cannot be the same. So here, we propose a new metric, namely Attribute Weighted Class Complexity (AWCC). In AWCC, cognitive weights are assigned to the attributes based on the effort needed to understand their data types. The proposed metric has been proved to be a better measure of the complexity of a class with attributes through case studies and experiments.

Keywords: Software Complexity, Attribute Weighted Class Complexity, Weighted Class Complexity, Data Type

Downloads: 2055
7656 Dynamic Metrics for Polymorphism in Object Oriented Systems

Authors: Parvinder Singh Sandhu, Gurdev Singh

Abstract:

Measurement is the process by which numbers or symbols are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules. Software metrics are instruments or ways of measuring all aspects of a software product. These metrics are used throughout a software project to assist in estimation, quality control, productivity assessment, and project control. Object-oriented software metrics focus on measurements that are applied to classes and other characteristics. These measurements convey to the software engineer the behavior of the software and how changes can be made that will reduce complexity and improve the continuing capability of the software. Object-oriented software metrics can be classified into two types: static and dynamic. Static metrics are concerned with all aspects of measurement based on static analysis of software, and dynamic metrics are concerned with all measurement aspects of the software at run time. Major prior work focused on static metrics, and some work has also been done on the dynamic nature of software measurement, but research in this area demands more work. In this paper we give a set of dynamic metrics specifically for polymorphism in object-oriented systems.

Keywords: Metrics, Software, Quality, Object oriented system, Polymorphism.

Downloads: 1715
7655 Cognitive Weighted Polymorphism Factor: A Comprehension Augmented Complexity Metric

Authors: T. Francis Thamburaj, A. Aloysius

Abstract:

Polymorphism is one of the main pillars of the object-oriented paradigm. It induces hidden forms of class dependencies which may impact software quality, resulting in a higher cost factor for comprehending, debugging, testing, and maintaining the software. In this paper, a new cognitive complexity metric called Cognitive Weighted Polymorphism Factor (CWPF) is proposed. Apart from the software structural complexity, it includes the cognitive complexity on the basis of type. The cognitive weights are calibrated based on 27 empirical studies with 120 persons. A case study and experimentation with the new software metric show positive results. Further, a comparative study is made, and the correlation test has proved that the CWPF complexity metric is a better, more comprehensive, and more realistic indicator of software complexity than Abreu's Polymorphism Factor (PF) complexity metric.

Keywords: Cognitive complexity metric, cognitive weighted polymorphism factor, object-oriented metrics, polymorphism factor, software metrics.

Downloads: 2170
7654 String Matching using Inverted Lists

Authors: Chouvalit Khancome, Veera Boonjing

Abstract:

This paper proposes a new solution to the string matching problem. This solution constructs an inverted list representing a string pattern to be searched for. It then uses a new algorithm to process an input string in a single pass. The preprocessing phase takes (1) time complexity O(m) and (2) space complexity O(1), where m is the length of the pattern. The searching phase takes (1) O(m+α) time in the average case, (2) O(n/m) in the best case, and (3) O(n) in the worst case, where α is the number of comparisons leading to mismatch and n is the length of the input text.

Keywords: String matching, inverted list, inverted index, pattern, algorithm.
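
The inverted-list idea can be sketched directly: map each pattern character to its positions in the pattern, then in one left-to-right pass over the text vote for candidate alignments. This hedged sketch uses alignment counting for illustration and is not necessarily the paper's exact matching procedure or complexity.

```python
from collections import defaultdict

def build_inverted_list(pattern):
    inv = defaultdict(list)            # O(m) preprocessing
    for i, ch in enumerate(pattern):
        inv[ch].append(i)
    return inv

def search(text, pattern):
    inv, m = build_inverted_list(pattern), len(pattern)
    hits = defaultdict(int)            # candidate start -> matched count
    for j, ch in enumerate(text):      # single left-to-right pass
        for i in inv.get(ch, ()):
            start = j - i
            if start >= 0:
                hits[start] += 1
    # An alignment matches iff all m pattern positions voted for it.
    return [s for s, c in hits.items() if c == m]

print(search("abracadabra", "abra"))   # -> [0, 7]
```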

Downloads: 1502
7653 Clustering Categorical Data Using Hierarchies (CLUCDUH)

Authors: Gökhan Silahtaroğlu

Abstract:

Clustering large populations is an important problem when the data contain noise and different shapes. A good clustering algorithm or approach should be efficient enough to detect clusters sensitively. Besides space complexity, time complexity also gains importance as the size grows. Using hierarchies, we developed a new algorithm that splits attributes according to the values they have, choosing the dimension for splitting so as to divide the database into roughly equal parts as much as possible. At each node we calculate certain descriptive statistical features of the data which reside there, and by pruning we generate the natural clusters with a complexity of O(n).

Keywords: Clustering, tree, split, pruning, entropy, gini.
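
Two of the split criteria named in the keywords are easy to show in isolation: entropy and Gini impurity of the label distribution at a tree node. A hedged toy calculation, not the CLUCDUH algorithm itself:

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def gini(labels):
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

node = ["red", "red", "blue", "green"]
print(round(entropy(node), 3), round(gini(node), 3))  # 1.5, 0.625
```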

Downloads: 1493
7652 Fast and Efficient Algorithms for Evaluating Uniform and Nonuniform Lagrange and Newton Curves

Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong

Abstract:

Newton-Lagrange interpolations are widely used in numerical analysis. However, they require quadratic computational time for their construction. In computer aided geometric design (CAGD), there are some polynomial curves, Wang-Ball, DP and Dejdumrong curves, which have linear time complexity algorithms. Thus, the computational time for Newton-Lagrange interpolations can be reduced by applying the algorithms of Wang-Ball, DP and Dejdumrong curves. In order to use the Wang-Ball, DP and Dejdumrong algorithms, it is first necessary to convert Newton-Lagrange polynomials into Wang-Ball, DP or Dejdumrong polynomials. In this work, the algorithms for converting both uniform and non-uniform Newton-Lagrange polynomials into Wang-Ball, DP and Dejdumrong polynomials are investigated. Thus, the computational time for representing Newton-Lagrange polynomials can be reduced to linear complexity. In addition, other applications of CAGD curves to modify Newton-Lagrange curves can be undertaken.

Keywords: Newton interpolation, Lagrange interpolation, linear complexity.
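
For context on the complexities involved: constructing the divided-difference coefficients of the Newton form is quadratic, but evaluating the polynomial is linear per point via Horner-style nesting. A hedged sketch of that baseline (the paper's basis conversions to Wang-Ball, DP and Dejdumrong forms are not reproduced here):

```python
def divided_differences(xs, ys):
    coef = list(ys)                        # O(n^2) construction
    for j in range(1, len(xs)):
        for i in range(len(xs) - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef

def newton_eval(coef, xs, x):
    acc = coef[-1]                         # O(n) evaluation per point
    for c, x0 in zip(reversed(coef[:-1]), reversed(xs[:-1])):
        acc = acc * (x - x0) + c
    return acc

xs, ys = [0, 1, 2, 4], [1, 3, 2, 5]
coef = divided_differences(xs, ys)
print([newton_eval(coef, xs, x) for x in xs])  # reproduces ys
```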

Downloads: 547
7651 The Autoregressive Analysis for Wind Turbine Signal Postprocessing

Authors: Daniel Pereiro, Felix Martinez, Iker Urresti, Ana Gomez Gonzalez

Abstract:

Today, modern simulation solutions in the wind turbine industry have achieved a high degree of complexity and detail in their results. Limitations exist when it is time to validate model results against measurements. Regarding model validation, it is of special interest to identify mode frequencies and to differentiate them from the different excitations. A wind turbine is a complex device, and measurements of any part of the assembly show a lot of noise. Input excitations are difficult or even impossible to measure due to the stochastic nature of the environment. Traditional techniques for frequency analysis or feature extraction are widely used to analyze wind turbine sensor signals, but have several limitations, especially for non-stationary signals (events). A new technique based on autoregressive analysis is introduced here for a specific application; a comparison and examples related to different events in wind turbine operations are presented.

Keywords: Wind turbine, signal processing, mode extraction.
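
The autoregressive idea for mode extraction can be sketched generically: fit AR coefficients by least squares and read candidate mode frequencies off the angles of the AR polynomial roots. A hedged sketch with an assumed order p=4 and a synthetic signal, not the paper's specific processing chain:

```python
import numpy as np

fs = 100.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(5)
x = np.sin(2 * np.pi * 3.0 * t) + 0.3 * rng.normal(size=t.size)

p = 4
# Lag matrix: x[n] ~ sum_k a[k] * x[n-1-k], solved in least squares.
A = np.column_stack([x[p - k - 1:-k - 1] for k in range(p)])
a, *_ = np.linalg.lstsq(A, x[p:], rcond=None)

roots = np.roots(np.r_[1.0, -a])                 # AR characteristic roots
freqs = np.abs(np.angle(roots)) * fs / (2 * np.pi)
print(np.round(np.unique(freqs), 2))             # a ~3 Hz mode appears
```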

Downloads: 1492
7650 Gender Based Variability Time Series Complexity Analysis

Authors: Ramesh K. Sunkaria, Puneeta Marwaha

Abstract:

Nonlinear methods of heart rate variability (HRV) analysis are becoming more popular. It has been observed that complexity measures quantify the regularity and uncertainty of cardiovascular RR-interval time series. In the present work, SampEn has been evaluated in healthy normal sinus rhythm (NSR) male and female subjects for different data lengths and tolerance levels r. It is demonstrated that SampEn is small for higher values of tolerance r. Also, the SampEn value of the healthy female group is higher than that of the healthy male group for short data lengths, and with increasing data length both groups overlap each other and it is difficult to distinguish them. SampEn gives inaccurate results by assigning a higher value to the female group, because male subjects have a more complex HRV pattern than female subjects. Therefore, this traditional algorithm exhibits higher complexity for healthy female subjects than for healthy male subjects, which is a misleading observation. This may be due to the fact that SampEn does not account for the multiple time scales inherent in physiologic time series, and the hidden spatial and temporal fluctuations remain unexplored.

Keywords: Heart rate variability, normal sinus rhythm group, RR interval time series, sample entropy.
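
For readers unfamiliar with the measure, here is a hedged sketch of sample entropy, SampEn(m, r): the negative log of the ratio of length-(m+1) to length-m template matches under a Chebyshev tolerance r (commonly r = 0.2 times the series standard deviation). Template-count conventions vary slightly between implementations; this one is illustrative.

```python
import numpy as np

def sampen(x, m=2, r=0.2):
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def count(mm):
        tpl = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(tpl) - 1):        # exclude self-matches
            d = np.abs(tpl[i + 1:] - tpl[i]).max(axis=1)
            c += int((d <= tol).sum())
        return c
    a, b = count(m + 1), count(m)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(3)
print(sampen(rng.normal(size=500)))   # white noise: relatively high SampEn
```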

Downloads: 1714
7649 Application of Quality Index Method, Texture Measurements and Electronic Nose to Assess the Freshness of Atlantic Herring (Clupea harengus) Stored in Ice

Authors: Nga T.T. Mai, Emilía Martinsdóttir, Kolbrún Sveinsdóttir, Gudrun Olafsdóttir, Sigurjón Arason

Abstract:

Atlantic herring (Clupea harengus) is an important commercial fish and is increasingly in demand for human consumption. Therefore, it is very important to find good methods for monitoring the freshness of the fish in order to keep it at the best quality for human consumption. In this study, the fish was stored in ice for up to 2 weeks. Quality changes during storage were assessed by the Quality Index Method (QIM), quantitative descriptive analysis (QDA) and the Torry scheme; by texture measurements: puncture tests and Texture Profile Analysis (TPA) tests on a TA.XT2i texture analyzer; and by electronic nose (e-nose) measurements using the FreshSense instrument. The storage time of herring in ice could be estimated by QIM with ±2 days using 5 herring per lot. No correlation between instrumental texture parameters and storage time, or between sensory and instrumental texture variables, was found. E-nose measurements could be used to detect the onset of spoilage.

Keywords: Herring, Quality Index Method (QIM), freshness, storage time.

Downloads: 1590
7648 On the Solution of the Towers of Hanoi Problem

Authors: Hayedeh Ahrabian, Comfar Badamchi, Abbass Nowzari-Dalini

Abstract:

In this paper, two versions of an iterative loopless algorithm for the classical Towers of Hanoi problem with O(1) storage complexity and O(2^n) time complexity are presented. Based on this algorithm, the number of different moves in each peg, with its direction, is formulated.

Keywords: Loopless algorithm, Binary tree, Towers of Hanoi.
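
As background, here is a hedged sketch of the classic iterative Hanoi enumeration with O(1) extra storage: on move number k (1-based), the disk moved is the number of trailing zeros of k, and each disk cycles among the pegs in a fixed direction determined by parity. This is the well-known iteration, shown for illustration rather than as the paper's exact algorithm.

```python
def hanoi_moves(n):
    """Yield (disk, source, destination) for moving n disks from A to C."""
    for k in range(1, 2 ** n):
        disk = (k & -k).bit_length() - 1        # trailing zeros of k
        step = k >> (disk + 1)                  # how often this disk moved
        if (n - disk) % 2:                      # cycle A -> C -> B
            src, dst = (step * 2) % 3, (step * 2 + 2) % 3
        else:                                   # cycle A -> B -> C
            src, dst = step % 3, (step + 1) % 3
        yield disk, "ABC"[src], "ABC"[dst]

for move in hanoi_moves(3):
    print(move)                                 # 2**3 - 1 = 7 moves
```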

Downloads: 4768
7647 Sparse Frequencies Extracting from Partial Phase-Only Measurements

Authors: R. Fan, Q. Wan, H. Chen, Y.L. Liu, Y.P. Liu

Abstract:

This paper considers robust recovery of sparse frequencies from partial phase-only measurements. With the proposed method, sparse frequencies can be reconstructed, making full use of the sparse distribution in the Fourier representation of the complex-valued time signal. Simulation experiments illustrate the proposed method's advantages over conventional methods in both the noiseless and additive white Gaussian noise cases.

Keywords: Sparse signal recovery, phase-only measurements, Compressive sensing, convex relaxation.

Downloads: 1413
7646 Iterative Joint Power Control and Partial Crosstalk Cancellation in Upstream VDSL

Authors: H. Bagheri, H. Emami, M. R. Pakravan

Abstract:

Crosstalk is the major limiting issue in very high bit-rate digital subscriber line (VDSL) systems in terms of bit rate and service coverage. At the central office side, joint signal processing accompanied by appropriate power allocation enables complex multiuser processors to provide near-capacity rates. Unfortunately, complexity grows with the square of the number of lines within a binder. By taking into account that there are only a few dominant crosstalkers who contribute the main part of the crosstalk power, the canceller structure can be simplified, which results in a much lower run-time complexity. In this paper, a multiuser power control scheme, namely iterative waterfilling, is combined with previously proposed partial crosstalk cancellation approaches to demonstrate the best performance achieved so far, which is verified by simulation results.

Keywords: iterative waterfilling, partial crosstalk cancellation, run-time complexity, VDSL.
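
The building block that iterative waterfilling repeats per user, treating the other users' crosstalk as noise, is single-user waterfilling. A hedged sketch using bisection on the water level for a total power budget; the per-tone noise-to-gain values and the budget are illustrative numbers, not VDSL parameters.

```python
import numpy as np

def waterfill(inv_gain, budget, iters=60):
    lo, hi = 0.0, budget + inv_gain.max()
    for _ in range(iters):                     # bisection on water level mu
        mu = 0.5 * (lo + hi)
        power = np.maximum(mu - inv_gain, 0.0)
        lo, hi = (mu, hi) if power.sum() < budget else (lo, mu)
    return np.maximum(mu - inv_gain, 0.0)

inv_gain = np.array([0.5, 1.0, 2.0, 4.0])      # noise-to-gain per tone
p = waterfill(inv_gain, budget=4.0)
print(np.round(p, 3), p.sum())                 # fills the "cheap" tones first
```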

Downloads: 1349
7645 Low Complexity Hybrid Scheme for PAPR Reduction in OFDM Systems Based on SLM and Clipping

Authors: V. Sudha, D. Sriram Kumar

Abstract:

In this paper, we present a low complexity hybrid scheme using conventional selective mapping (C-SLM) and clipping algorithms to reduce the high peak-to-average power ratio (PAPR) of the orthogonal frequency division multiplexing (OFDM) signal. In the proposed scheme, the input data sequence (X) is divided into two sub-blocks; the clipping algorithm is applied to the first sub-block, whereas the C-SLM algorithm is applied to the second sub-block, in order to reduce both computational complexity and PAPR. The resultant time-domain OFDM signal is obtained by combining the outputs of the two sub-blocks. The simulation results show that the proposed hybrid scheme provides 0.45 dB PAPR reduction gain at a CCDF value of 10^-2 and a 52% reduction in computational complexity compared to the C-SLM scheme, at the expense of slight degradation in bit error rate (BER) performance.

Keywords: CCDF, Clipping, OFDM, PAPR, SLM.
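
The clipping half of such hybrid schemes is easy to demonstrate: generate an OFDM symbol by IFFT, clip its envelope at a chosen ratio over the RMS level, and compare PAPR before and after. A hedged sketch with made-up parameters; the C-SLM half, which searches phase sequences, is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 256
qpsk = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
x = np.fft.ifft(qpsk) * np.sqrt(N)             # time-domain OFDM symbol

def papr_db(sig):
    p = np.abs(sig) ** 2
    return 10 * np.log10(p.max() / p.mean())

clip_ratio = 1.4                                # clip level over RMS (assumed)
level = clip_ratio * np.sqrt(np.mean(np.abs(x) ** 2))
clipped = np.where(np.abs(x) > level, level * x / np.abs(x), x)

print(f"PAPR before: {papr_db(x):.2f} dB, after: {papr_db(clipped):.2f} dB")
```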

Downloads: 1206
7644 Computationally Efficient Adaptive Rate Sampling and Adaptive Resolution Analysis

Authors: Saeed Mian Qaisar, Laurent Fesquet, Marc Renaudin

Abstract:

Most real-life signals are time-varying in nature. For proper characterization of such signals, a time-frequency representation is required. The STFT (short-time Fourier transform) is a classical tool used for this purpose. The limitation of the STFT is its fixed time-frequency resolution. Thus, an enhanced version of the STFT, based on cross-level sampling, is devised. It can adapt the sampling frequency and the window function length by following the input signal's local variations. Therefore, it provides an adaptive-resolution time-frequency representation of the input. The computational complexity of the proposed STFT is deduced and compared to the classical one. The results show a significant gain in computational efficiency and hence in processing power. The processing error of the proposed technique is also discussed.

Keywords: Level Crossing Sampling, Activity Selection, Adaptive Resolution Analysis, Computational Complexity
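
The sampling mechanism the abstract builds on can be sketched in a few lines: in level-crossing sampling, a sample is taken only when the signal crosses one of a set of uniformly spaced amplitude levels, so signal activity drives the rate. A hedged sketch with an assumed quantum q; the adaptive-window STFT built on top of it is not reproduced.

```python
import numpy as np

def level_crossing_sample(t, x, q=0.25):
    levels = np.round(x / q)                  # nearest level index
    idx = np.nonzero(np.diff(levels))[0] + 1  # indices where level changes
    return t[idx], x[idx]

t = np.linspace(0, 1, 2000)
x = np.sin(2 * np.pi * 3 * t) * np.exp(-2 * t)  # decaying tone
ts, xs = level_crossing_sample(t, x)
print(f"{len(ts)} samples instead of {len(t)}")  # fewer where signal is quiet
```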

Downloads: 1211