Search results for: estimation for the number of the blind sources


5169 Estimating Cost of R&D Activities for Feasibility Study of Public R&D Investment

Authors: Ie-jung Choi

Abstract:

Since feasibility studies of R&D programs were introduced in 2008 to support efficient public R&D investment, their precision has gradually improved. Although experience with these studies has accumulated to a certain point, methodological improvement is still required. Feasibility studies of R&D programs cover various viewpoints, such as technology, policy, and economics. This research provides improvements to the economic perspective, in particular the cost estimation process for R&D activities. First, the fundamental concept of cost estimation is reviewed. A statistical and econometric analysis method is then applied as an empirical analysis. Finally, limitations and directions for further research are provided.

Keywords: Cost Estimation, R&D Program, Feasibility Study.

5168 Low-complexity Integer Frequency Offset Synchronization for OFDMA System

Authors: Young-Jae Kim, Young-Hwan You

Abstract:

This paper presents an integer frequency offset (IFO) estimation scheme for the 3GPP Long Term Evolution (LTE) downlink. First, the conventional joint detection method for the IFO and the sector cell identity (CID) is introduced. Second, an IFO estimation scheme that does not require explicit sector CID information is proposed, which reduces the processing delay in comparison with the conventional joint method. The proposed method is also computationally efficient and achieves nearly the same performance as the conventional method over the Pedestrian and Vehicular channel models.

Keywords: LTE, OFDMA, primary synchronization signal (PSS), IFO, CID
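
As a rough illustration of the general idea (not the paper's algorithm), an integer frequency offset can be estimated by correlating the received frequency-domain symbol against cyclically shifted copies of a known reference sequence such as the PSS; the search range, sequence length and toy data below are assumptions.

```python
import numpy as np

def estimate_ifo(rx_symbol, pss, search_range=7):
    """Pick the integer frequency offset whose cyclic shift of the known
    reference sequence best matches the received frequency-domain symbol."""
    best_ifo, best_metric = 0, -np.inf
    for ifo in range(-search_range, search_range + 1):
        shifted = np.roll(pss, ifo)            # candidate IFO = cyclic shift in frequency
        metric = np.abs(np.vdot(shifted, rx_symbol))
        if metric > best_metric:
            best_ifo, best_metric = ifo, metric
    return best_ifo

# toy usage: a random "PSS", received with a true IFO of +3 plus noise
rng = np.random.default_rng(0)
pss = np.exp(1j * rng.uniform(0, 2 * np.pi, 62))
rx = np.roll(pss, 3) + 0.1 * (rng.standard_normal(62) + 1j * rng.standard_normal(62))
print(estimate_ifo(rx, pss))  # expected: 3
```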

5167 Functional Decomposition Based Effort Estimation Model for Software-Intensive Systems

Authors: Nermin Sökmen

Abstract:

An effort estimation model is needed for software-intensive projects that consist of hardware, embedded software, or some combination of the two, as well as high-level software solutions. This paper first focuses on functional decomposition techniques for measuring the functional complexity of a computer system and investigates their impact on system development effort. It then examines the effects of technical difficulty and design team capability in order to construct the best effort estimation model. Using traditional regression analysis, the study develops a system development effort estimation model that takes functional complexity, technical difficulty, and design team capability as input parameters. Finally, the assumptions of the model are tested.

Keywords: Functional complexity, functional decomposition, development effort, technical difficulty, design team capability, regression analysis.
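
A minimal sketch of the kind of regression model described above, with made-up project data; the three inputs (functional complexity, technical difficulty, team capability) follow the abstract, but the numbers and the plain ordinary-least-squares fit are illustrative only.

```python
import numpy as np

# Illustrative project data: functional complexity (FC), technical difficulty (TD),
# design team capability (TC) and observed effort in person-hours (all values made up).
X = np.array([[120, 3, 7], [300, 5, 6], [80, 2, 8], [450, 7, 5], [220, 4, 7]], dtype=float)
y = np.array([400, 1250, 260, 2300, 900], dtype=float)

# Ordinary least squares: effort = b0 + b1*FC + b2*TD + b3*TC
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and coefficients:", coef)

new_project = np.array([1, 200, 4, 6])   # predict effort for a hypothetical project
print("predicted effort:", new_project @ coef)
```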

5166 Learning Monte Carlo Data for Circuit Path Length

Authors: Namal A. Senanayake, A. Beg, Withana C. Prasad

Abstract:

This paper analyzes the patterns of Monte Carlo data for a large number of variables and minterms in order to characterize circuit path length behavior. We propose models determined by training on shortest path lengths derived from a wide range of binary decision diagram (BDD) simulations. The models were created using a feed-forward neural network (NN) modeling methodology. Experimental results for ISCAS benchmark circuits show an RMS error of 0.102 for the shortest path length complexity predicted by the NN model (NNM). Such a model can help reduce the time complexity of very large scale integration (VLSI) circuit design and of related computer-aided design (CAD) tools that use BDDs.

Keywords: Monte Carlo data, Binary decision diagrams, Neural network modeling, Shortest path length estimation.
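
A minimal sketch of fitting a feed-forward neural network to Monte-Carlo-style data, in the spirit of the abstract; the synthetic features, target function and network size are assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for Monte Carlo BDD data: features are (number of variables,
# number of minterms); the target is a shortest-path-length statistic (values made up).
rng = np.random.default_rng(1)
X = rng.integers(5, 60, size=(200, 2)).astype(float)
y = 0.8 * np.log2(X[:, 0]) + 0.002 * X[:, 1] + rng.normal(0, 0.05, 200)

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X, y)

pred = model.predict(X)
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"training RMS error: {rmse:.3f}")
```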

5165 Human Pose Estimation using Active Shape Models

Authors: Changhyuk Jang, Keechul Jung

Abstract:

Human pose estimation can be performed using Active Shape Models. Existing techniques that apply Active Shape Models to human-body research, such as human detection, primarily model the silhouette of the human body. Such techniques cannot accurately estimate poses involving the two arms and legs, because the silhouette represents only the rounded outline of the body. To solve this problem, we model the human body as a stick figure, or "skeleton". The skeleton model can represent a wide variety of human poses. To obtain effective estimation results, we apply background subtraction and a modified matching algorithm of the original Active Shape Models in the fitting process. The model was built from 600 human body images and has 17 landmark points that indicate body joints and key features of human pose. The fitting process ran for at most 30 iterations, and the execution time was less than 0.03 s.

Keywords: Active shape models, skeleton, pose estimation.

5164 Estimation of the Mean of the Selected Population

Authors: Kalu Ram Meena, Aditi Kar Gangopadhyay, Satrajit Mandal

Abstract:

Two normal populations with different means and the same known variance are considered. The population with the smaller sample mean is selected. Various estimators are constructed for the mean of the selected normal population. Finally, they are compared with respect to bias and MSE risk by the method of Monte Carlo simulation, and their performances are analysed with the help of graphs.

Keywords: Estimation after selection, Brewster-Zidek technique.
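
A small Monte Carlo sketch of the setting described above: two normal populations with known common variance, selection of the smaller sample mean, and evaluation of the naive estimator's bias and MSE. All numerical settings are illustrative.

```python
import numpy as np

# Bias and MSE of the naive estimator (the smaller sample mean) for the mean
# of the selected population; all settings below are illustrative.
rng = np.random.default_rng(0)
mu1, mu2, sigma, n, reps = 0.0, 0.5, 1.0, 20, 100_000

xbar1 = rng.normal(mu1, sigma / np.sqrt(n), reps)
xbar2 = rng.normal(mu2, sigma / np.sqrt(n), reps)

selected_mean = np.minimum(xbar1, xbar2)            # estimator: smaller sample mean
true_mean = np.where(xbar1 <= xbar2, mu1, mu2)      # mean of the selected population

bias = np.mean(selected_mean - true_mean)
mse = np.mean((selected_mean - true_mean) ** 2)
print(f"bias: {bias:.4f}, MSE: {mse:.4f}")   # the naive estimator is negatively biased
```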

5163 Applying Gibbs Sampler for Multivariate Hierarchical Linear Model

Authors: Satoshi Usami

Abstract:

Among the various HLM techniques, the Multivariate Hierarchical Linear Model (MHLM) is desirable, particularly when multivariate criterion variables are collected and the covariance structure carries information valuable for data analysis. In order to reflect prior information or to obtain stable results when the sample size and the number of groups are not sufficiently large, the Bayesian method has often been employed in hierarchical data analysis. Although the Markov Chain Monte Carlo (MCMC) method is a powerful tool for parameter estimation in these cases, MCMC procedures have not been formulated for the MHLM. For this reason, this research presents concrete procedures for parameter estimation using the Gibbs sampler. Lastly, several future topics for the use of the MCMC approach in HLM are discussed.

Keywords: Gibbs sampler, Hierarchical Linear Model, Markov Chain Monte Carlo, Multivariate Hierarchical Linear Model
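
A minimal Gibbs-sampler sketch for a toy univariate hierarchical normal model, just to illustrate the mechanics of the conditional draws; the multivariate (MHLM) case in the paper additionally requires draws for covariance matrices, which are not shown here.

```python
import numpy as np

# Toy model: y_ij ~ N(theta_j, sigma2), theta_j ~ N(mu, tau2), flat prior on mu,
# with sigma2 and tau2 treated as known for brevity.
rng = np.random.default_rng(0)
sigma2, tau2 = 1.0, 0.5
groups = [rng.normal(m, np.sqrt(sigma2), 30) for m in (0.0, 1.0, 2.0)]  # toy data

n_iter, mu = 2000, 0.0
theta = np.zeros(len(groups))
draws = []
for _ in range(n_iter):
    # theta_j | mu, y ~ N(precision-weighted mean, combined variance)
    for j, y in enumerate(groups):
        prec = len(y) / sigma2 + 1.0 / tau2
        mean = (y.sum() / sigma2 + mu / tau2) / prec
        theta[j] = rng.normal(mean, np.sqrt(1.0 / prec))
    # mu | theta ~ N(mean(theta), tau2 / J)
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / len(groups)))
    draws.append(mu)

burn = n_iter // 2
print("posterior mean of mu:", np.mean(draws[burn:]))
```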

5162 Numerical Optimization within Vector of Parameters Estimation in Volatility Models

Authors: J. Arneric, A. Rozga

Abstract:

This paper presents the usefulness of the quasi-Newton iteration procedure for parameter estimation of the conditional variance equation within the BHHH algorithm. Analytical maximization of the likelihood function using first and second derivatives is too complex when the variance is time-varying. The advantage of the BHHH algorithm over other optimization algorithms is that it requires no third derivatives and has assured convergence. To simplify the optimization procedure, the BHHH algorithm approximates the matrix of second derivatives using the information identity. However, parameter estimation in an (a)symmetric GARCH(1,1) model assuming normally distributed returns is not simple, i.e. it is difficult to solve analytically. The maximum of the likelihood function can be found by iterating until no further increase is achieved. Because the solutions of numerical optimization are very sensitive to the initial values, starting parameters for the GARCH(1,1) model are defined. The number of iterations can be reduced by using starting values close to the global maximum. The optimization procedure is illustrated in the framework of modelling daily volatility of the most liquid stocks on the Croatian capital market: Podravka (food industry), Petrokemija (fertilizer industry) and Ericsson Nikola Tesla (information and communications industry).

Keywords: Heteroscedasticity, Log-likelihood Maximization, Quasi-Newton iteration procedure, Volatility.
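
A compact sketch of Gaussian GARCH(1,1) likelihood maximization with a quasi-Newton optimizer (SciPy's BFGS here, standing in for the BHHH iteration described above); the toy return series and starting values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def garch11_negloglik(params, r):
    """Negative Gaussian log-likelihood of GARCH(1,1): h_t = w + a*r_{t-1}^2 + b*h_{t-1}."""
    w, a, b = params
    if w <= 0 or a < 0 or b < 0 or a + b >= 1:
        return 1e10                          # crude penalty to keep the search feasible
    h = np.empty_like(r)
    h[0] = r.var()                           # common choice for the starting variance
    for t in range(1, len(r)):
        h[t] = w + a * r[t - 1] ** 2 + b * h[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * h) + r ** 2 / h)

# toy return series and starting values close to typical GARCH estimates
rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_normal(1500)
start = np.array([1e-6, 0.05, 0.90])
res = minimize(garch11_negloglik, start, args=(returns,), method="BFGS")
print("estimated (omega, alpha, beta):", res.x)
```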

5161 Development of Cooling Load Demand Program for Building in Malaysia

Authors: Zamri Noranai, Dayang Siti Zainab Abang Bujang, Rosli Asmawi, Hamidon Salleh, Mohammad Zainal Md Yusof

Abstract:

Air conditioning is mainly used to provide human comfort, and it is used heavily in countries where daily temperatures are high. Scientifically, air conditioning is defined as the process of controlling the moisture, cooling, heating and cleaning of air. Without proper estimation of the cooling load, a large amount of energy is wasted because the air conditioning system does not match the heat gains from the surroundings: the unit may be oversized for the room, or undersized and therefore forced to consume more energy to cool it. This study develops a program to calculate the cooling load. The program makes the cooling load easy to estimate and allows hourly and yearly estimates to be compared. Software developed in earlier studies was not user-friendly, which is a problem for individuals without proper knowledge of cooling load calculation; easy access and user-friendliness should therefore be the main design objectives. The program allows the cooling load to be estimated by any user rather than by rule of thumb. Several limitations of the case study are assessed to ensure that it meets Malaysian building specifications. Finally, validation is performed by comparing manual calculations with the developed program.

Keywords: Building, Energy, Cooling Load.
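
A highly simplified sensible cooling load calculation in the spirit of such a program, covering only conduction gains and internal gains; the U-values, areas, temperatures and occupancy figures are assumed, not taken from the paper.

```python
# Simplified sensible cooling load sketch (conduction + internal gains only).
def conduction_gain(u_value, area, t_out, t_in):
    """Q = U * A * (T_out - T_in), in watts."""
    return u_value * area * (t_out - t_in)

envelope = [
    {"name": "wall",   "u": 2.0, "area": 40.0},   # U in W/m^2.K, area in m^2
    {"name": "roof",   "u": 1.5, "area": 25.0},
    {"name": "window", "u": 5.8, "area": 6.0},
]
t_outdoor, t_indoor = 33.0, 24.0                  # deg C, assumed design values
people, w_per_person = 4, 75                      # assumed sensible gain per occupant
equipment_w = 600                                 # assumed lights and appliances

q_envelope = sum(conduction_gain(e["u"], e["area"], t_outdoor, t_indoor) for e in envelope)
q_total = q_envelope + people * w_per_person + equipment_w
print(f"estimated sensible cooling load: {q_total:.0f} W")
```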

5160 Verified Experiment: Intelligent Fuzzy Weighted Input Estimation Method to Inverse Heat Conduction Problem

Authors: Chen-Yu Wang, Tsung-Chien Chen, Ming-Hui Lee, Jen-Feng Huang

Abstract:

In this paper, an intelligent fuzzy weighted input estimation method (FWIEM) is applied to the inverse heat conduction problem (IHCP) to estimate an unknown time-varying heat flux efficiently. The feasibility of the method is verified with a temperature measurement experiment. We focus on heat flux estimation for three kinds of samples (copper, iron and AISI 304 steel) with the same 3 mm thickness. The temperature measurements are fed into the FWIEM to estimate the heat flux. The experimental results show that the proposed algorithm can estimate the unknown time-varying heat flux on-line.

Keywords: Fuzzy Weighted Input Estimation Method, IHCP, Heat Flux.

5159 An Efficient Separation for Convolutive Mixtures

Authors: Salah Al-Din I. Badran, Samad Ahmadi, Dylan Menzies, Ismail Shahin

Abstract:

This paper describes a new, efficient blind source separation method that uses a non-uniform filter bank and a novel structure with different sub-bands. The method reduces permutation ambiguity and increases convergence speed compared with the full-band algorithm. Recently, several structures have been suggested to deal with two problems: reducing permutation and increasing the convergence speed of the adaptive algorithm for correlated input signals. The permutation problem is avoided by using adaptive filters of lower order than the full-band adaptive filter, operating at a sampling rate lower than that of the input signal. The signals decomposed by the analysis filter bank are less correlated within each sub-band than the full-band input signal, which promotes better convergence rates.

Keywords: Blind source separation (BSS), estimates, full-band, mixtures, Sub-band.
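
A structural sketch of sub-band separation: each mixture is split into bands, a separator is run per band, and the bands are summed back. FastICA stands in for the paper's adaptive sub-band filters, the mixing here is instantaneous rather than convolutive, and the cross-band alignment step is omitted.

```python
import numpy as np
from scipy.signal import butter, sosfilt
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 20_000
s = np.vstack([np.sin(2 * np.pi * 0.01 * np.arange(n)),      # two toy sources
               np.sign(np.sin(2 * np.pi * 0.003 * np.arange(n)))])
A = np.array([[1.0, 0.6], [0.5, 1.0]])                        # instantaneous mixing
x = A @ s + 0.01 * rng.standard_normal((2, n))

bands = [butter(4, 0.1, "lowpass", output="sos"),
         butter(4, 0.1, "highpass", output="sos")]
estimates = np.zeros_like(x)
for sos in bands:
    xb = sosfilt(sos, x, axis=1)                              # analysis filtering per band
    yb = FastICA(n_components=2, random_state=0).fit_transform(xb.T).T
    estimates += yb                                           # synthesis by summation
print(estimates.shape)  # (2, n); a real system would align band permutations before summing
```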

5158 An Interlacing Technique-Based Blind Video Watermarking Using Wavelet

Authors: B. Sridhar, C. Arun

Abstract:

The rapid growth of multimedia technology demands secure and efficient access to information, and this growth raises concerns about unauthorized duplication. The protection of multimedia content is therefore becoming more important. Watermarking addresses the problem of unlawful copying of digital data. In this paper, a blind video watermarking technique is proposed. The luminance layer of selected frames is interlaced into two shares of even and odd rows, which are then de-interlaced and their coefficients equalized. The colour watermark is split into blocks, and the pieces of each block are concealed in one of the shares under the wavelet transform. The two shares are stacked back into a single image by re-interlacing their even and odd rows. Finally, the chrominance bands are concatenated with the watermarked luminance band. The security level of the hidden information is high, and it is undetectable. Results show that the quality of the video is unchanged and that the method yields good PSNR values.

Keywords: Authentication, data security, deinterlaced, wavelet transform, watermarking.

5157 Parameter Estimation of Diode Circuit Using Extended Kalman Filter

Authors: Amit Kumar Gautam, Sudipta Majumdar

Abstract:

This paper presents parameter estimation of a single-phase rectifier using an extended Kalman filter (EKF). The state-space model has been obtained using Kirchhoff's current law (KCL) and Kirchhoff's voltage law (KVL). The capacitor voltage and diode current of the circuit have been estimated using the EKF. Simulation results show better accuracy for the proposed method compared to the least mean squares (LMS) method. Furthermore, the EKF has the advantage that it can be used for nonlinear systems.

Keywords: Extended Kalman filter, parameter estimation, single phase rectifier, state space modelling.
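
A generic discrete-time EKF skeleton with joint state and parameter estimation on a toy first-order system; the model functions, Jacobians and noise covariances are placeholders, not the rectifier model of the paper.

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One predict/update cycle of an extended Kalman filter."""
    x_pred = f(x)                                   # predict state
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    H = H_jac(x_pred)                               # update with measurement z
    y = z - h(x_pred)                               # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)             # Kalman gain
    return x_pred + K @ y, (np.eye(len(x)) - K @ H) @ P_pred

# toy system: x_{k+1} = a*x_k + 1 with unknown parameter a and a noisy observation of x_k;
# the augmented state is [x, a], so the EKF estimates both jointly.
f = lambda x: np.array([x[1] * x[0] + 1.0, x[1]])
F_jac = lambda x: np.array([[x[1], x[0]], [0.0, 1.0]])
h = lambda x: np.array([x[0]])
H_jac = lambda x: np.array([[1.0, 0.0]])
Q = np.diag([1e-4, 1e-6])
R = np.array([[1e-2]])

rng = np.random.default_rng(0)
true_a, state = 0.9, 0.0
x, P = np.array([0.0, 0.5]), np.eye(2)
for _ in range(300):
    state = true_a * state + 1.0
    z = np.array([state + 0.1 * rng.standard_normal()])
    x, P = ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R)
print("estimated parameter a:", x[1])   # should move close to the true value 0.9
```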

5156 Contour Estimation in Synthetic and Real Weld Defect Images based on Maximum Likelihood

Authors: M. Tridi, N. Nacereddine, N. Oucief

Abstract:

This paper describes a novel method for automatic estimation of the contours of weld defects in radiographic images. Contour detection is generally the first operation applied in a visual recognition system. Our approach can be described as a region-based maximum likelihood formulation of parametric deformable contours. This formulation provides robustness against poor image quality and allows simultaneous estimation of the contour parameters together with other parameters of the model. Implementation is performed by a deterministic iterative algorithm with minimal user intervention. Results demonstrate the very good performance of the approach, especially on synthetic weld defect images.

Keywords: Contour, Gaussian, likelihood, Rayleigh.

5155 An Integrated Software Architecture for Bandwidth Adaptive Video Streaming

Authors: T. Arsan

Abstract:

Video streaming over lossy IP networks is a very important issue, owing to the heterogeneous structure of networks. The infrastructure of the Internet exhibits variable bandwidths, delays, congestion and time-varying packet losses. Because of these variable attributes, video streaming applications should not only achieve good end-to-end transport performance but also provide robust rate control and a multipath rate allocation mechanism. To maintain video streaming service quality, additional components such as bandwidth estimation and an adaptive rate controller should also be taken into consideration. This paper gives an overview of the video streaming concept and bandwidth estimation tools and then introduces special architectures for bandwidth-adaptive video streaming. A bandwidth estimation algorithm (pathChirp), optimized rate controllers and a multipath rate allocation algorithm are considered as an all-in-one solution to the video streaming problem. This solution is directed and optimized by a decision center designed to obtain the maximum quality at the receiving side.

Keywords: Adaptive Video Streaming, Bandwidth Estimation, QoS, Software Architecture.

5154 Implementation of SU-MIMO and MU-MIMO GTD Systems under Imperfect CSI Knowledge

Authors: Parit Kanjanavirojkul, Kiatwarakorn Keeratishananond, Prapun Suksompong

Abstract:

We study the performance of a compressed beamforming-weight feedback technique in a generalized triangular decomposition (GTD) based MIMO system. GTD is a beamforming technique that offers QoS flexibility. The technique, however, performs optimally only when full channel state information (CSI) is available at the transmitter, which is impossible in a real system, where there are channel estimation errors and limited feedback. We suggest a way to implement quantized beamforming-weight feedback, which can significantly reduce the feedback data, in a GTD-based MIMO system, and we investigate the performance of the system. Interestingly, we find that compressed beamforming-weight feedback does not degrade the BER performance of the system at low input power, while channel estimation error and quantization do. For comparison, GTD is more sensitive to compression and quantization, while SVD is more sensitive to channel estimation error. We also explore the performance of a GTD-based MU-MIMO system and find that the BER performance starts to degrade significantly at around -20 dB channel estimation error.

Keywords: MIMO, MU-MIMO, GTD, Imperfect CSI.

5153 On the Modeling and State Estimation for Dynamic Power System

Authors: A. Thabet, M. Boutayeb, M. N. Abdelkrim

Abstract:

This paper investigates a method for the state estimation of nonlinear systems described by a class of differential-algebraic equation (DAE) models using the extended Kalman filter. The method involves a transformation from the DAE to an ordinary differential equation (ODE). A relevant dynamic power system model using decoupled techniques is proposed. The estimation technique consists of a state estimator based on the EKF together with a local stability analysis. High performance is illustrated through a simulation study applied to the IEEE 13-bus test system.

Keywords: Power system, Dynamic decoupled model, Extended Kalman Filter, Convergence analysis, Time computing.

5152 Estimation of Skew Angle in Binary Document Images Using Hough Transform

Authors: Nandini N., Srikanta Murthy K., G. Hemantha Kumar

Abstract:

This paper presents two novel techniques for skew estimation in binary document images. The algorithms are based on connected component analysis and the Hough transform, and both focus on reducing the amount of input data provided to the Hough transform. In the first method, referred to as the word centroid approach, the centroids of selected words are used for skew detection. In the second method, referred to as the dilate-and-thin approach, selected characters are blocked and dilated to obtain word blocks, and thinning is then applied. The final image fed to the Hough transform contains the thinned coordinates of the word blocks in the image. The methods successfully reduce the computational complexity of Hough transform based skew estimation algorithms. Promising experimental results are provided to demonstrate the effectiveness of the proposed methods.

Keywords: Dilation, Document processing, Hough transform, Optical Character Recognition, Skew estimation, Thinning.
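
A self-contained sketch of Hough-style skew estimation from word centroids: for each candidate angle, the centroids are projected onto the direction normal to the candidate text-line orientation, and the angle whose projection histogram is most sharply peaked is taken as the skew. The synthetic centroids and parameter choices are illustrative; a real pipeline would obtain the centroids from connected-component analysis of the binarized page.

```python
import numpy as np

def estimate_skew(points, angle_range=15.0, angle_step=0.1, bin_width=2.0):
    """Return the skew angle (degrees) whose projection histogram is most peaked."""
    xs, ys = points[:, 0], points[:, 1]
    best_angle, best_score = 0.0, -np.inf
    for deg in np.arange(-angle_range, angle_range + angle_step, angle_step):
        theta = np.deg2rad(deg)
        rho = -xs * np.sin(theta) + ys * np.cos(theta)   # distance normal to candidate lines
        hist, _ = np.histogram(rho, bins=np.arange(rho.min(), rho.max() + bin_width, bin_width))
        score = np.sum(hist.astype(float) ** 2)          # peaky histogram => aligned text rows
        if score > best_score:
            best_angle, best_score = deg, score
    return best_angle

# synthetic page: word centroids on five text lines rotated by 3 degrees
rng = np.random.default_rng(0)
base = np.array([[x, y] for y in range(0, 500, 100) for x in range(0, 1000, 40)], float)
a = np.deg2rad(3.0)
rot = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
pts = base @ rot.T + rng.normal(0, 1.5, base.shape)
print(f"estimated skew: {estimate_skew(pts):.1f} degrees")   # expected near 3.0
```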

5151 Blind Spot Area Tracking Solution Using 1x12 POF-Based Optical Couplers

Authors: Mohammad Syuhaimi Ab-Rahman, Mohd Hadi Guna Safnal, Mohd Hazwan Harun, Mohd.Saiful Dzulkefly Zan, Kasmiran Jumari

Abstract:

Optical 1x12 fused-taper-twisted polymer optical fiber (POF) couplers have been fabricated using a preform technique. The characterization of the coupler, which is proposed for use in a passive night vision application for tracking a blind spot area, is reported. During the development of the fused-taper-twisted POF couplers, a red LED was injected into the couplers to test the quality of the fabricated devices. Characterization parameters such as optical output power, POF attenuation characteristics and power losses on the network were observed. The maximum output power efficiency of the coupler is about 40%, but this can be improved gradually with experience and practice.

Keywords: polymer optical fiber (POF), custom-made, fused-taper-twisted fiber, optical coupler, small world communication, home network.

5150 A Proposed Trust Model for the Semantic Web

Authors: Hoda Waguih

Abstract:

A serious problem on the WWW is finding reliable information. Not everything found on the Web is true, and the Semantic Web does not change that in any way. The problem will be even more crucial for the Semantic Web, where agents will integrate and use information from multiple sources. If an incorrect premise is used because of a single faulty source, any conclusions drawn may be in error. Statements published on the Semantic Web therefore have to be seen as claims rather than as facts, and there should be a way to decide which among many possibly inconsistent sources is most reliable. In this work, we propose a trust model for the Semantic Web. The proposed model is inspired by the use of trust in human society. Trust is a type of social knowledge that encodes evaluations about which agents can be taken as reliable sources of information or services. Our proposed model allows agents to decide which among different sources of information to trust and thus to act rationally on the Semantic Web.

Keywords: Semantic Web, Trust, Web of Trust, WWW.

5149 Model Based Monitoring Using Integrated Data Validation, Simulation and Parameter Estimation

Authors: Reza Hayati, Maryam Sadi, Saeid Shokri, Mehdi Ahmadi Marvast, Saeid Hassan Boroojerdi, Amin Hamzavi Abedi

Abstract:

Efficient and safe plant operation can only be achieved if the operators are able to monitor all key process parameters. Instrumentation is used to measure many process variables, such as temperatures, pressures, flow rates, compositions or other product properties; performance monitoring is therefore a suitable tool for operators. In this paper, we integrate a rigorous simulation model, data reconciliation and parameter estimation to monitor process equipment and determine its key performance indicators (KPIs). The method has been implemented in two case studies.

Keywords: Data Reconciliation, Measurement, Optimization, Parameter Estimation, Performance Monitoring.

5148 Density Estimation using Generalized Linear Model and a Linear Combination of Gaussians

Authors: Aly Farag, Ayman El-Baz, Refaat Mohamed

Abstract:

In this paper we present a novel approach for density estimation. The proposed approach is based on using the logistic regression model to obtain an initial density estimate for the given empirical density. The empirical data do not exactly follow the logistic regression model, so there will be a deviation between the empirical density and the density estimated with the logistic regression model. This deviation may be positive and/or negative. We use a linear combination of Gaussians (LCG) with positive and negative components as a model for this deviation, and we use the expectation-maximization (EM) algorithm to estimate the parameters of the LCG. Experiments on real images demonstrate the accuracy of our approach.

Keywords: Logistic regression model, Expectation-maximization, Segmentation.

5147 Issues in Spectral Source Separation Techniques for Plant-wide Oscillation Detection and Diagnosis

Authors: A.K. Tangirala, S. Babji

Abstract:

In the last few years, three multivariate spectral analysis techniques, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-negative Matrix Factorization (NMF), have emerged as effective tools for oscillation detection and isolation. While the first method is used to determine the number of oscillatory sources, the latter two methods are used to identify source signatures by formulating the detection problem as a source identification problem in the spectral domain. In this paper, we present a critical drawback of the underlying linear (mixing) model which strongly limits the ability of the associated source separation methods to determine the number of sources and/or identify the physical source signatures. It is shown that the assumed mixing model is only valid if each unit of the process gives equal weighting (all-pass filter) to all oscillatory components in its inputs. This is in contrast to the fact that each unit, in general, acts as a filter with a non-uniform frequency response. Thus, the model can only facilitate correct identification of a source with a single frequency component, which is again unrealistic. To overcome this deficiency, an iterative post-processing algorithm that correctly identifies the physical source(s) is developed. An additional issue with the existing methods is that they lack a procedure for pre-screening non-oscillatory/noisy measurements, which obscure the identification of oscillatory sources. In this regard, a pre-screening procedure based on the notion of a sparseness index is prescribed to eliminate the noisy and non-oscillatory measurements from the data set used for analysis.

Keywords: non-negative matrix factorization, PCA, source separation, plant-wide diagnosis
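
A numpy sketch of the spectral PCA idea behind determining the number of oscillatory sources: stack the power spectra of the measurements and count the dominant singular values. The toy sources, mixing matrix and 90% energy threshold are illustrative simplifications, not the procedure analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
t = np.arange(n)
sources = np.vstack([np.sin(2 * np.pi * 0.010 * t),          # two hidden oscillations
                     np.sin(2 * np.pi * 0.037 * t)])
mixing = np.array([[1.0, 0.2], [0.8, -0.5], [0.3, 1.0],
                   [-0.6, 0.7], [0.5, 0.5], [0.2, -0.9]])    # six plant measurements
measurements = mixing @ sources + 0.05 * rng.standard_normal((6, n))

spectra = np.abs(np.fft.rfft(measurements, axis=1)) ** 2     # one power spectrum per tag
_, svals, _ = np.linalg.svd(spectra, full_matrices=False)
explained = np.cumsum(svals ** 2) / np.sum(svals ** 2)
n_sources = int(np.searchsorted(explained, 0.90) + 1)        # simple 90% energy criterion
print("estimated number of oscillatory sources:", n_sources)  # 2 for this toy data
```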

5146 Factors of Effective Business Software Systems Development and Enhancement Projects Work Effort Estimation

Authors: Beata Czarnacka-Chrobot

Abstract:

The majority of Business Software Systems (BSS) Development and Enhancement Projects (D&EP) fail to meet the criteria for their effectiveness, which leads to considerable financial losses. One of the fundamental reasons for the exceptionally low success rate of such projects is improperly derived estimates of their costs and time. In the case of BSS D&EP these attributes are determined by the work effort, yet reliable and objective effort estimation still appears to be a great challenge for software engineering. This paper therefore presents the most important synthetic conclusions from the author's own studies concerning the main factors of effective BSS D&EP work effort estimation. Thanks to rational investment decisions made on the basis of reliable and objective criteria, it is possible to reduce losses caused not only by abandoned projects but also by large overruns in the time and cost of BSS D&EP execution.

Keywords: Benchmarking data, business software systems development and enhancement projects, effort estimation, software engineering economics, software functional size measurement.

5145 New Concept for the Overall use of Renewable Energy

Authors: Chang-Hsien Tai, Uzu-Kuei Hsu, Jr-Ming Miao, Yong-Jhou Lin

Abstract:

The development and application of wind power as a renewable energy source has attracted growing interest in recent years. Renewable energy sources are attracting much attention as they can reduce both environmental damage and dependence on fossil fuels. With the growing need for sustainable energy supplies, a case is made for decentralized, stand-alone power supplies (SAPS) as an alternative to power grids. In an era in which traditional petroleum resources are decreasing and the greenhouse effect is increasing significantly, the development and use of renewable resources is inevitable. Thanks to the contribution of pioneers, the development of renewable resources has already reached a remarkable level; however, in terms of economy and capacity, there is still a long road ahead before renewable energy can replace traditional petroleum energy. In our perspective, instead of investing in ever larger renewable energy equipment, it is much wiser to think about the blind spots and potential breakthroughs of current techniques.

Keywords: regenerative resources, hybrid system, transfer, storage, phase change

5144 A Robust Frequency Offset Estimator for Orthogonal Frequency Division Multiplexing

Authors: Keunhong Chae, Seokho Yoon

Abstract:

We address integer frequency offset (IFO) estimation under the influence of a timing offset (TO) in orthogonal frequency division multiplexing (OFDM) systems. Incorporating the IFO and TO into the symbol set used to represent the received OFDM symbol, we investigate the influence of the TO on the IFO and then propose a method of combining two consecutive OFDM correlations that reduces this influence. The proposed scheme has almost the same complexity as the conventional schemes, whereas, unlike the conventional schemes, it does not require knowledge of the TO. Numerical results confirm that the proposed scheme is insensitive to the TO and consequently yields improved IFO estimation performance over the conventional schemes when a TO exists.

Keywords: Estimation, integer frequency offset, OFDM, timing offset.

5143 A Linear Use Case Based Software Cost Estimation Model

Authors: Hasan.O. Farahneh, Ayman A. Issa

Abstract:

Software development is moving towards agility, with use cases and scenarios being used for requirements stories. Estimates of software costs are becoming even more important than before, as the effect of delays is much larger in the context of successive short releases in agile development. This paper therefore reports on the development of a new linear use case based software cost estimation model that is applicable in the very early stages of software development and is based on a simple metric. Evaluation showed that the accuracy of estimates varies between 43% and 55% of the actual effort of historical test projects. These results outperformed those of well-known models applied in the same context. Further work is being carried out to improve the performance of the proposed model by considering the effect of non-functional requirements.

Keywords: Metrics, Software Cost Estimation, Use Cases

5142 Towards the Use of Renewable Energy Sources in the Home

Authors: Adriana Alexandru, Elena Jitaru, Rayner Mayer

Abstract:

The paper presents the results of the European EIE project "Realising the potential for small scale renewable energy sources in the home – Kyotointhehome". The project's overall aim is to inform and educate teachers, students and their families so that they can recognize the need for, and assess the potential of, energy efficiency (EE) measures and renewable energy sources (RES) in their homes. The project resources were translated and trialled by 16 partners in 10 European countries. A web-based methodology that enables families to assess how RES can be incorporated into energy-efficient homes was developed. The web application "KYOTOINHOME" helps citizens to identify what they can do to help their community meet the Kyoto target for greenhouse gas reductions and prevent global warming. This application provides useful information on how citizens can use renewable energy sources in their home for space heating and cooling, hot water and electricity. A methodology for assessing heat loss in a dwelling and the application of a heat pump system was elaborated and will be implemented this year. For schools, we developed a set of practical activities concerned with preventing climate change through the use of renewable energy sources. Complementary resources will also be developed in the Romanian research project "Romania Contribution to the European Targets Regarding the Development of Renewable Energy Sources" (PROMES).

Keywords: Education, energy policy, Internet, renewable energy sources.

5141 High Performance of Direct Torque and Flux Control of a Double Stator Induction Motor Drive with a Fuzzy Stator Resistance Estimator

Authors: K. Kouzi

Abstract:

In order to obtain stable, high-performance direct torque and flux control (DTFC) of a double star induction motor (DSIM) drive, proper on-line adaptation of the stator resistance is very important. This is due to the variation of the stator resistance under operating conditions, which introduces error in the estimated flux position and magnitude. Error in the estimated stator flux deteriorates the performance of the DTFC drive, and the effect of this error is especially important at low speed. Our aim is therefore to overcome the sensitivity of DTFC to stator resistance variation by proposing on-line fuzzy estimation of the stator resistance. The fuzzy estimation method corrects the stator resistance on-line from the stator current estimation error and its variation, with the fuzzy logic controller giving the stator resistance increment at its output. The main advantage of the suggested control algorithm is that it avoids the drive instability that may occur in certain situations and ensures tracking of the actual stator resistance. The validity of the technique and the improvement in overall system performance are proved by the results.

Keywords: Direct torque control, dual stator induction motor, fuzzy logic estimation, stator resistance adaptation.

5140 Generalized Maximal Ratio Combining as a Supra-optimal Receiver Diversity Scheme

Authors: Jean-Pierre Dubois, Rania Minkara, Rafic Ayoubi

Abstract:

Maximal Ratio Combining (MRC) is considered the most complex combining technique, as it requires estimation of the channel coefficients. It results in the lowest bit error rate (BER) of all combining techniques; however, the BER starts to deteriorate as errors are introduced into the channel coefficient estimates. A novel combining technique, termed Generalized Maximal Ratio Combining (GMRC) with a polynomial kernel, yields the same BER as MRC with perfect channel estimation and a lower BER in the presence of channel estimation errors. We show that GMRC outperforms the optimal MRC scheme in general, and we hereby introduce it to the scientific community as a new "supra-optimal" algorithm. Since diversity combining is especially effective in small femto- and pico-cells, Internet-connected wireless peripheral systems stand to benefit most from GMRC. As a result, many spin-off applications can be developed for IP-based fourth-generation networks.

Keywords: Bit error rate, femto-internet cells, generalized maximal ratio combining, signal-to-scattering noise ratio.
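
For reference, a short simulation of conventional MRC over flat Rayleigh branches with BPSK, showing the effect of channel estimation error on BER; the GMRC polynomial-kernel combiner discussed above is not implemented here, and all simulation parameters are assumptions.

```python
import numpy as np

# Conventional MRC baseline: combine branches with the conjugate channel estimates.
rng = np.random.default_rng(0)
L, n_bits, snr_db = 2, 200_000, 10
noise_std = np.sqrt(0.5 / 10 ** (snr_db / 10))        # per-dimension noise std for given SNR

bits = rng.integers(0, 2, n_bits)
s = 2.0 * bits - 1.0                                   # BPSK symbols
h = (rng.standard_normal((L, n_bits)) + 1j * rng.standard_normal((L, n_bits))) / np.sqrt(2)
noise = noise_std * (rng.standard_normal((L, n_bits)) + 1j * rng.standard_normal((L, n_bits)))
r = h * s + noise                                      # per-branch received signal

combined = np.sum(np.conj(h) * r, axis=0)              # MRC with perfect CSI
print("BER with perfect CSI:", np.mean((combined.real > 0).astype(int) != bits))

h_err = h + 0.3 * (rng.standard_normal(h.shape) + 1j * rng.standard_normal(h.shape)) / np.sqrt(2)
combined_err = np.sum(np.conj(h_err) * r, axis=0)      # same combiner, noisy CSI
print("BER with estimation error:", np.mean((combined_err.real > 0).astype(int) != bits))
```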
