Search results for: error information
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5018

4568 Role of Membership Functions in Fuzzy Logic for Prediction of Shoot Length of Mustard Plant Based on Residual Analysis

Authors: Satyendra Nath Mandal, J. Pal Choudhury, Dilip De, S. R. Bhadra Chaudhuri

Abstract:

The selection of a particular type of mustard plant for plantation depends on its productivity (pod yield) at maturity. The growth of a mustard plant depends on several parameters of the plant, such as shoot length, number of leaves, number of roots and root length. As the plant grows, some leaves may fall and new leaves may appear, so leaf count gives no reliable basis for relating growth to seed weight at the mature stage. The number of roots and the root length cannot be measured while the plant is growing, since uncovering the roots, which go deeper and deeper into the soil, would harm the plant. Only the shoot length, which increases over time, can be measured at different time instances. Weather parameters such as maximum and minimum humidity, rainfall, and maximum and minimum temperature may affect the growth of the plant, and pollution, water, soil, distance and crop management may also be dominant factors for growth and productivity. Considering all these parameters, the growth of the plant is highly uncertain, so a fuzzy environment can be adopted for predicting the shoot length at maturity. Fuzzification of the data plays a central role and is based on particular membership functions. Here an effort has been made to fuzzify the original data using the Gaussian, triangular, S-, trapezoidal and L-functions. All fuzzified data are then defuzzified back to crisp form. Finally, an error analysis (calculation of forecasting error and average error) indicates which membership function is most appropriate for fuzzifying the data and predicting the shoot length at maturity. The result is also verified using residual analysis (absolute residual, maximum of absolute residual, mean absolute residual, mean of mean absolute residual, median of absolute residual and standard deviation).
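As a rough, hypothetical illustration of the fuzzification and defuzzification steps described in this abstract (not the authors' implementation; all membership-function parameters and shoot-length values below are invented for the example), a minimal sketch in Python might look like this:

```python
import numpy as np

def gaussian_mf(x, c, sigma):
    """Gaussian membership: peaks at centre c with spread sigma."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def triangular_mf(x, a, b, c):
    """Triangular membership with feet at a, c and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Illustrative shoot-length readings (cm) at successive time instances.
shoot_length = np.array([4.2, 7.9, 12.5, 18.3, 23.1])

# Fuzzify with a Gaussian function centred on the sample mean.
mu = gaussian_mf(shoot_length, c=shoot_length.mean(), sigma=shoot_length.std())

# Simple centroid defuzzification back to a crisp value, so a forecasting
# error (and hence the residual statistics) can be computed against the data.
crisp = np.sum(mu * shoot_length) / np.sum(mu)
print(crisp, np.mean(np.abs(shoot_length - crisp)))  # mean absolute residual
```

The same fuzzify/defuzzify/error loop would be repeated for the triangular, S-, trapezoidal and L-functions, and the function giving the smallest residuals retained.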

Keywords: Fuzzification, defuzzification, Gaussian function, triangular function, trapezoidal function, S-function, membership function, residual analysis.

4567 Categorical Clustering By Converting Associated Information

Authors: Dongmin Cai, Stephen S-T Yau

Abstract:

The lack of an inherent "natural" dissimilarity measure between objects in a categorical dataset presents special difficulties for clustering analysis. However, each categorical attribute of a given dataset provides natural probability and information in the sense of Shannon. In this paper, we propose a novel method that heuristically converts categorical attributes to numerical values by exploiting this associated information. We conduct an experimental study with a real-life categorical dataset. The experiment demonstrates the effectiveness of our approach.
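One plausible reading of the conversion step (an assumption for illustration only; the paper's exact heuristic may differ) is to replace each categorical value by its Shannon self-information estimated from the column's empirical frequencies:

```python
import numpy as np
from collections import Counter

def to_information(column):
    """Map each categorical value to its self-information -log2(p), where p is
    the empirical frequency of that value in the column."""
    counts = Counter(column)
    n = len(column)
    return np.array([-np.log2(counts[v] / n) for v in column])

colours = ["red", "red", "blue", "green", "red", "blue"]
print(to_information(colours))
# Rare categories map to larger values; the resulting numeric columns can then
# be fed to any ordinary distance-based clustering algorithm such as k-means.
```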

Keywords: Categorical, Clustering, Converting, Information

4566 On Adaptive Optimization of Filter Performance Based on Markov Representation for Output Prediction Error

Authors: Hong Son Hoang, Remy Baraille

Abstract:

This paper addresses the problem of how one can improve the performance of a non-optimal filter. First, the theoretical question of a dynamical representation for a given time-correlated random process is studied. It is demonstrated that for a wide class of random processes having a canonical form, there exists an equivalent dynamical system in the sense that its output has the same covariance function. It is shown that the dynamical approach is more effective for simulating and estimating Markov and non-Markovian random processes and is computationally less demanding, especially as the dimension of the simulated processes increases. Numerical examples and estimation problems in low-dimensional systems are given to illustrate the advantages of the approach. A very useful application of the proposed approach is shown for the problem of state estimation in very high dimensional systems. Here a modified filter for data assimilation in an oceanic numerical model is presented, which proves very efficient owing to the introduction of a simple Markovian structure for the output prediction error process and adaptive tuning of some parameters of the Markov equation.
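For the simplest canonical-form case, a first-order dynamical system already reproduces the covariance of an exponentially correlated stationary process; the sketch below (illustrative only, with made-up parameters) checks this numerically:

```python
import numpy as np

# x_{k+1} = a * x_k + w_k reproduces the covariance R(tau) = sigma^2 * a^|tau|
# of a stationary, exponentially correlated process.
sigma, a, n = 1.0, 0.8, 100_000
q = sigma**2 * (1 - a**2)                 # driving-noise variance matching R(0)
w = np.random.normal(0.0, np.sqrt(q), n)
x = np.zeros(n)
for k in range(n - 1):
    x[k + 1] = a * x[k] + w[k]

# The empirical lag-1 covariance should approach sigma^2 * a = 0.8.
print(np.mean(x[:-1] * x[1:]))
```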

Keywords: Statistical simulation, canonical form, dynamical system, Markov and non-Markovian processes, data assimilation.

4565 Comparison of Alternative Models to Predict Lean Meat Percentage of Lamb Carcasses

Authors: Vasco A. P. Cadavez, Fernando C. Monteiro

Abstract:

The objective of this study was to develop and compare alternative prediction equations for the lean meat proportion (LMP) of lamb carcasses. Forty (40) male lambs, 22 of the Portuguese local breed Churra Galega Bragançana and 18 of the Suffolk breed, were used. Lambs were slaughtered, and carcasses were weighed approximately 30 min later in order to obtain the hot carcass weight (HCW). After cooling at 4 °C for 24 h, a set of seventeen carcass measurements was recorded. The left side of each carcass was dissected into muscle, subcutaneous fat, inter-muscular fat, bone, and remainder (major blood vessels, ligaments, tendons, and thick connective tissue sheets associated with muscles), and the LMP was evaluated as the dissected muscle percentage. Prediction equations for LMP were developed, and fitting quality was evaluated through the coefficient of determination of estimation (R²e) and the standard error of estimate (SEE). Model validation was performed by k-fold cross-validation, and the coefficient of determination of prediction (R²p) and standard error of prediction (SEP) were computed. The BT2 measurement was the best single predictor and accounted for 37.8% of the LMP variation with a SEP of 2.30%. The prediction of the LMP of lamb carcasses can be based on simple models, using the HCW and one fat thickness measurement as predictors.
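A minimal sketch of the fit-then-cross-validate procedure (synthetic data and coefficients invented for illustration; HCW and BT2 are simply the predictor names used above):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
n = 40
hcw = rng.normal(14.0, 2.5, n)    # hot carcass weight, kg (synthetic)
bt2 = rng.normal(4.0, 1.2, n)     # fat thickness measurement, mm (synthetic)
lmp = 62.0 + 0.4 * hcw - 2.1 * bt2 + rng.normal(0.0, 2.0, n)  # lean meat %

X = np.column_stack([hcw, bt2])
model = LinearRegression().fit(X, lmp)

# Fit quality on the estimation data (R2_e, SEE).
pred_e = model.predict(X)
print("R2_e:", r2_score(lmp, pred_e),
      "SEE:", np.sqrt(mean_squared_error(lmp, pred_e)))

# k-fold cross-validation gives the prediction-side statistics (R2_p, SEP).
pred_p = cross_val_predict(LinearRegression(), X, lmp,
                           cv=KFold(5, shuffle=True, random_state=0))
print("R2_p:", r2_score(lmp, pred_p),
      "SEP:", np.sqrt(mean_squared_error(lmp, pred_p)))
```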

Keywords: Bootstrap, Carcass, Lambs, Lean meat

4564 Robust Image Registration Based on an Adaptive Normalized Mutual Information Metric

Authors: Huda Algharib, Amal Algharib, Hanan Algharib, Ali Mohammad Alqudah

Abstract:

Image registration is an important topic for many imaging systems and computer vision applications. Standard image registration techniques such as mutual information and normalized mutual information based methods have limited performance because they do not consider the spatial information or the relationships between neighbouring pixels or voxels. In addition, the amount of image noise may significantly affect registration accuracy. Therefore, this paper proposes an efficient method that explicitly considers the relationships between adjacent pixels: the gradient information of the reference and scene images is extracted first, and then the cosine similarity of the extracted gradient information is computed and used to improve the accuracy of the standard normalized mutual information measure. Our experimental results on different data types (i.e. CT, MRI and thermal images) show that the proposed method outperforms a number of image registration techniques in terms of accuracy.
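A compact sketch of the idea (the exact rule for combining the gradient term with NMI is an assumption here; the paper may weight the two terms differently):

```python
import numpy as np

def normalized_mutual_information(a, b, bins=32):
    """Standard NMI, (H(A) + H(B)) / H(A, B), from a joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    hxy = -np.sum(pxy[pxy > 0] * np.log(pxy[pxy > 0]))
    return (hx + hy) / hxy

def gradient_cosine(a, b):
    """Cosine similarity between the flattened gradient fields of two images."""
    ga = np.concatenate([g.ravel() for g in np.gradient(a.astype(float))])
    gb = np.concatenate([g.ravel() for g in np.gradient(b.astype(float))])
    return ga @ gb / (np.linalg.norm(ga) * np.linalg.norm(gb) + 1e-12)

def adaptive_nmi(reference, scene):
    # Product combination is an illustrative choice, not the paper's rule.
    return normalized_mutual_information(reference, scene) * gradient_cosine(reference, scene)
```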

Keywords: Image registration, mutual information, image gradients, Image transformations.

4563 Image Transmission: A Case Study on Combined Scheme of LDPC-STBC in Asynchronous Cooperative MIMO Systems

Authors: Shan Ding, Lijia Zhang, Hongming Xu

Abstract:

This paper presents a novel scheme that reduces the error rate and improves transmission performance in asynchronous cooperative MIMO systems. A case study of image transmission is used to demonstrate the efficiency of the scheme. A linear dispersion structure is employed to accommodate the cooperative wireless communication network under a dynamic topology, as well as to achieve higher throughput than conventional space-time codes based on orthogonal designs. An LDPC encoder free of girth-4 cycles and an STBC encoder with guard intervals are introduced. The experimental results show that the combined LDPC-STBC coder with guard intervals provides good error correction and BER performance in asynchronous cooperative communication. In the image transmission case study, the image quality obtained with the combined scheme is much better than that obtained without it in the asynchronous cooperative MIMO systems.

Keywords: Cooperative MIMO, image transmission, linear dispersion codes, Low-Density Parity-Check (LDPC)

4562 The Characteristics of a Fair and Efficient Tax Auditing Information System as a Tool against Tax Evasion: A Theoretical Framework

Authors: Dimitris Balios, Stefanos Tantos

Abstract:

Economic growth and social evolution are connected to trust relationships in a society. The quality of accounting information, the tax information system and the tax audit mechanism bring multiple benefits to an economy. Tax evasion, the illegal practice whereby people and companies do not pay taxes, is a crime because of its negative effects on the economy and society. In this paper, we describe a theoretical framework on the characteristics of a fair and efficient tax auditing information system which could serve as a tool against tax evasion and as a tool for an economy to grow, especially in countries that face fluctuations in economic activity. We conclude that a fair and efficient tax auditing information system increases the reliability of tax administration, improves taxpayers' tax compliance and sets the economy on a developmental trajectory.

Keywords: Auditing information system, auditing mechanism, tax evasion, taxation, quality of accounting information.

4561 A Utilitarian Approach to Modeling Information Flows in Social Networks

Authors: Usha Sridhar, Sridhar Mandyam

Abstract:

We propose a multi-agent based utilitarian approach to model and understand information flows in social networks that lead to Pareto optimal informational exchanges. We model the individual expected utility functions of the agents to reflect the net value of information received. We show how this model, adapted from a theorem by Karl Borch dealing with an actuarial risk exchange concept in the insurance industry, can be used for social network analysis. We develop a utilitarian framework that allows us to interpret Pareto optimal exchanges of value as potential information flows while maximizing the sum of the expected utilities of information over the group of agents. We examine some interesting conditions on the utility function under which the flows are optimal. We illustrate the promise of this new approach to attaching economic value to information in networks with a synthetic example.

Keywords: Borch's Theorem, Economic value of information, Information Exchange, Pareto Optimal Solution, Social Networks, Utility Functions

4560 Impact of Government Spending on Private Consumption and on the Economy: Case of Thailand

Authors: Paitoon Kraipornsak

Abstract:

The recent global financial problem urges the government to play a role in stimulating the economy, since the private sector has little capacity to purchase during a recession. A key question is whether increased government spending crowds out private consumption and whether it helps stimulate the economy. If the government spending policy is effective, private consumption should increase and can compensate for the recent extra government expense. In this study, government spending is categorized into government consumption spending and government capital spending. The study first examines consumer consumption along the lines of the demand function in microeconomic theory. Three categories of private consumption are used: food consumption, non-food consumption, and services consumption. A dynamic Almost Ideal Demand System for the three categories of private consumption is estimated using a Vector Error Correction Mechanism model. The estimated model indicates substitution effects (negative impacts) of government consumption spending on the budget share of private non-food consumption and of government capital spending on the budget share of private food consumption, respectively. Nevertheless, the negative effects on the budget shares of non-food and food consumption do not necessarily mean that total private consumption falls. The microeconomic consumer demand analysis clearly indicates changes in the component structure of aggregate expenditure in the economy as a result of the government spending policy. The macroeconomic concept of aggregate demand, comprising consumption, investment, government spending (government consumption spending and government capital spending), exports, and imports, is then used to estimate the corresponding relationships with a Vector Error Correction Mechanism model. The macroeconomic study found no effect of government capital spending on either private consumption or the growth of GDP, while government consumption spending has a negative effect on the growth of GDP. Therefore, no crowding-out effect of government spending on private consumption is found, but the spending is ineffective and even inefficient, as it is found to reduce the growth of GDP in the context of Thailand.
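A small sketch of estimating a vector error correction model on three such series (synthetic data standing in for the Thai national-accounts series; the lag order and cointegration rank are illustrative assumptions):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(1)
t = 120
trend = np.cumsum(rng.normal(0.2, 1.0, t))          # shared stochastic trend
data = pd.DataFrame({
    "private_consumption": trend + rng.normal(0, 1, t),
    "gov_consumption":     0.5 * trend + rng.normal(0, 1, t),
    "gov_capital":         0.3 * trend + rng.normal(0, 1, t),
})

# One cointegrating relation, one lagged difference: short-run dynamics adjust
# toward the long-run equilibrium through the error correction term.
res = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print(res.alpha)   # loading (adjustment) coefficients
print(res.beta)    # cointegrating vector
```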

Keywords: government consumption spending, government capital spending, private consumption on food, non-food, and services, Vector Error Correction Mechanism, Almost Ideal Demand System, substitution effect, complementary effect, consumer demand, aggregate demand

4559 Predictive Clustering Hybrid Regression (pCHR) Approach and Its Application to Sucrose-Based Biohydrogen Production

Authors: Nikhil, Ari Visa, Chin-Chao Chen, Chiu-Yue Lin, Jaakko A. Puhakka, Olli Yli-Harja

Abstract:

A predictive clustering hybrid regression (pCHR) approach was developed and evaluated using a dataset from an H2-producing sucrose-based bioreactor operated for 15 months. The aim was to model and predict the H2-production rate using information available about the envirome and metabolome of the bioprocess. Self-organizing maps (SOM) and the Sammon map were used to visualize the dataset and to identify the main metabolic patterns and clusters in the bioprocess data. Three metabolic clusters were detected: acetate coupled with other metabolites, butyrate only, and transition phases. The developed pCHR model combines the principles of k-means clustering, kNN classification and regression techniques. The model performed well in modeling and predicting the H2-production rate, with mean square error values of 0.0014 and 0.0032, respectively.
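A simplified sketch of the clustering-then-local-regression idea (it does not reproduce the paper's exact pCHR combination of k-means, kNN and regression; data are synthetic):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))     # envirome/metabolome features (synthetic)
y = np.where(X[:, 0] > 0, 2.0, -1.0) * X[:, 1] + rng.normal(0, 0.1, 300)

# 1. Cluster the feature space (k=3, mirroring the three metabolic clusters).
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

# 2. Fit one local regression model per cluster.
models = {c: LinearRegression().fit(X[km.labels_ == c], y[km.labels_ == c])
          for c in range(3)}

# 3. Predict a new sample with the model belonging to its nearest cluster.
x_new = rng.normal(size=(1, 4))
print(models[km.predict(x_new)[0]].predict(x_new))
```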

Keywords: Biohydrogen, bioprocess modeling, clustering hybrid regression.

4558 A Proposal for U-City (Smart City) Service Method Using Real-Time Digital Map

Authors: SangWon Han, MuWook Pyeon, Sujung Moon, DaeKyo Seo

Abstract:

Recently, technologies based on three-dimensional (3D) space information are being developed, and quality of life is improving as a result. Research on the real-time digital map (RDM) is now being conducted to provide 3D space information. RDM is a service that creates and supplies 3D space information in real time based on location/shape detection. Research subjects on RDM include the construction of 3D space information with matching image data, complementing the weaknesses of image acquisition using multi-source data, and data collection methods using big data. RDM will be effective for spatial analysis using 3D space information in a U-City and for other space information utilization technologies.

Keywords: RDM, multi-source data, big data, U-City.

4557 Managing User Expectations in Information Systems Development

Authors: Linda Sau-ling Lai

Abstract:

This paper provides new ways to explore the old problem of failure in information systems development in organisations. Based on the theory of cognitive dissonance, information systems (IS) failure is defined as a gap between what the users expect from an information system and how well these expectations are met by the perceived performance of the delivered system. Bridging the expectation-perception gap requires that IS professionals make a radical change from being proprietors of information systems and products to being service providers. In order to deliver systems and services that IS users perceive as valuable, IS people must become expert in determining and assessing users' expectations and perceptions. It is also suggested that the IS community, in general, has given relatively little attention to the front-end process of requirements specification for IS development. There is a simplistic belief that requirements are obtainable from users and can then be translated into a formal specification. The process of information needs analysis is problematic and worthy of investigation.

Keywords: Information Systems Development, Cognitive Dissonance, Expectation-Perception Gap, Requirements Analysis.

4556 Capacity of Overloaded DS-CDMA System on Rayleigh Fading Channel with Timing Error

Authors: Preetam Kumar

Abstract:

The number of users supported in a DS-CDMA cellular system is typically less than the spreading factor (N), and the system is said to be underloaded. Overloading is a technique to accommodate more users than the spreading factor N. In the O/O overloading scheme, the first set of codes is assigned to N synchronous users and the second set is assigned to the additional synchronous users. An iterative multistage soft decision interference cancellation (SDIC) receiver is used to remove the high level of interference between the two sets. Performance is evaluated in terms of the maximum number of acceptable users such that the system performance is only slightly degraded compared to the single-user performance at a specified BER. In this paper, the capacity of the CDMA-based O/O overloading scheme is evaluated with the SDIC receiver. It is observed that the O/O scheme using orthogonal Gold codes provides 25% channel overloading (N=64) for a synchronous DS-CDMA system on an AWGN channel in the uplink at a BER of 1e-5. For a Rayleigh faded channel, the critical capacity is 40% at a BER of 5e-5 assuming synchronous users. In practical systems, however, perfect chip timing is very difficult to maintain in the uplink. We show that the overloading performance reduces to 11% for a timing synchronization error of 0.02Tc at a BER of 1e-5.

Keywords: DS-CDMA, Interference Cancellation, Multiuser Detection, Orthogonal codes, Overloading.

4555 Predicting the Impact of the Defect on the Overall Environment in Function Based Systems

Authors: Parvinder S. Sandhu, Urvashi Malhotra, E. Ardil

Abstract:

Much work has been done on predicting the fault proneness of software systems. However, the severity of the faults matters more than the number of faults in a developed system, since the major faults matter most to a developer and need immediate attention, especially in critical applications. In this paper, we attempt to predict the level of impact of the faults existing in software systems. Neuro-fuzzy based predictor models are applied to NASA's public domain defect dataset, coded in the C programming language. Correlation-based Feature Selection (CFS) evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them, so CFS is used to select the metrics that are most highly correlated with the level of severity of faults. The results are compared with the prediction results of Logistic Model Trees (LMT), earlier quoted as the best technique in [17]. The results are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). They show that the neuro-fuzzy based model provides relatively better prediction accuracy than the other models and hence can be used for modeling the level of impact of faults in function based systems.
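Two of the building blocks mentioned above, sketched in simplified form (real CFS also penalises redundancy between the selected features; this ranking keeps only the relevance part):

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

def rank_features_by_correlation(X, y):
    """Rank software metrics by |correlation| with the severity label."""
    corrs = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    return np.argsort(corrs)[::-1]

def report(y_true, y_pred):
    """Accuracy, MAE and RMSE as used for comparing the predictor models."""
    mae = mean_absolute_error(y_true, y_pred)
    rmse = np.sqrt(mean_squared_error(y_true, y_pred))
    acc = np.mean(np.round(y_pred) == np.asarray(y_true))
    return acc, mae, rmse
```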

Keywords: Software Metrics, Fuzzy, Neuro-Fuzzy, Software Faults, Accuracy, MAE, RMSE.

4554 Application of Geographic Information Systems (GIS) in the History of Cartography

Authors: Bangbo Hu

Abstract:

This paper discusses applications of a revolutionary information technology, Geographic Information Systems (GIS), in the field of the history of cartography through examples, including assessing the accuracy of early maps, establishing databases of places and historical administrative units, integrating early maps into GIS or digital images, and analyzing social, political, and economic information related to the production of early maps. GIS provides a new means to evaluate the accuracy of early maps; four basic steps for using GIS in this type of study are discussed. In addition, several historical geographical information systems are introduced, including the China Historical Geographic Information System (CHGIS), the United States National Historical Geographic Information System (NHGIS), and the Great Britain Historical Geographical Information System. GIS also provides digital means to display and analyze the spatial information on early maps or to layer them with modern spatial data. How the GIS relational data structure may be used to analyze social, political, and economic information related to the production of early maps is also discussed. Through these examples, the paper reveals the value of GIS applications in this field.

Keywords: Cartography, GIS, history, maps.

4553 A Framework for Personalized Multi-Device Information Communicating System

Authors: Rohiza Ahmad, Rozana Kasbon, Eliza Mazmee Mazlan, Aliza Sarlan

Abstract:

Due to the mobility of users, many information systems are now developed with the capability of supporting information retrieval by both static and mobile users. Hence, the amount, content and format of the information retrieved need to be tailored to the device and the user who requested it. This paper presents a framework for the design and implementation of such a system, to be developed for communicating final examination related information to the academic community at one university in Malaysia. The concept of personalization will be implemented in the system so that only highly relevant information is delivered to the users. The personalization will be based on user profiling as well as context. The system in its final state will be accessible through cell phones as well as intranet-connected personal computers.

Keywords: System framework, personalization, information communicating system, multi-device.

4552 Analysis of Message Authentication in Turbo Coded Halftoned Images using Exit Charts

Authors: Andhe Dharani, P. S. Satyanarayana, Andhe Pallavi

Abstract:

Considering payload, reliability, security and operational lifetime as major constraints in the transmission of images, we put forward in this paper a steganographic technique implemented at the physical layer. We suggest transmission of halftoned images (payload constraint) in wireless sensor networks to reduce the amount of transmitted data. For low-power and interference-limited applications, Turbo codes provide suitable reliability. Ensuring security is one of the highest priorities in many sensor networks. The Turbo code structure, apart from providing forward error correction, can be utilized to provide encryption. We first consider the halftoned image, and then present the method of embedding a block of data (called the secret) in this halftoned image during the turbo encoding process. The small modifications required at the turbo decoder end to extract the embedded data are presented next. The implementation complexity and the degradation of the BER (bit error rate) in the Turbo-based stego system are analyzed. Using some entropy-based cryptanalytic techniques, we show that the strength of our Turbo-based stego system approaches that found in OTPs (one-time pads).
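The payload-reduction step (halftoning) can be illustrated with classic Floyd-Steinberg error diffusion; this sketch covers only that step, not the turbo encoding or the embedding of the secret block:

```python
import numpy as np

def floyd_steinberg_halftone(gray):
    """Binarize a grayscale image (floats in [0, 1]) by diffusing the
    quantization error to not-yet-processed neighbours."""
    img = gray.astype(float).copy()
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            out[y, x] = 1.0 if img[y, x] >= 0.5 else 0.0
            err = img[y, x] - out[y, x]
            if x + 1 < w:               img[y, x + 1]     += err * 7 / 16
            if y + 1 < h and x > 0:     img[y + 1, x - 1] += err * 3 / 16
            if y + 1 < h:               img[y + 1, x]     += err * 5 / 16
            if y + 1 < h and x + 1 < w: img[y + 1, x + 1] += err * 1 / 16
    return out.astype(np.uint8)
```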

Keywords: Halftoning, Turbo codes, security, operational lifetime, Turbo based stego system.

4551 Optimization of Transmitter Aperture by Genetic Algorithm in Optical Satellite

Authors: Karim Kemih, Yacine Yaiche, Malek Benslama

Abstract:

To establish optical communication between any two satellites, the transmitter satellite must track the beacon of the receiver satellite and point the information optical beam in its direction. Optical tracking and pointing systems for free space suffer during tracking from high-amplitude vibration because of background radiation from interstellar objects such as the Sun, Moon, Earth, and stars in the tracking field of view, or because of mechanical impacts from satellite internal and external sources. The vibrations of beam pointing increase the bit error rate and jam communication between the two satellites. One way to overcome this problem is to use very small transmitter beam divergence angles; the drawback of too narrow a divergence angle is that the transmitter beam may sometimes miss the receiver satellite due to pointing vibrations. In this paper we propose the use of a genetic algorithm to optimize the BER as a function of the transmitter optics aperture.
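A toy version of the optimization loop (the BER-versus-aperture cost below is a placeholder, not the paper's vibration-dependent link model):

```python
import numpy as np

def ber(aperture_cm):
    """Placeholder convex proxy for BER as a function of transmitter aperture."""
    return (aperture_cm - 7.5) ** 2 / 100 + 1e-6

def genetic_minimise(cost, low, high, pop=30, gens=50, mut=0.3):
    rng = np.random.default_rng(0)
    x = rng.uniform(low, high, pop)
    for _ in range(gens):
        fitness = -np.array([cost(v) for v in x])
        # Tournament selection between random pairs.
        idx = rng.integers(0, pop, (pop, 2))
        parents = np.where(fitness[idx[:, 0]] > fitness[idx[:, 1]],
                           x[idx[:, 0]], x[idx[:, 1]])
        # Arithmetic crossover followed by Gaussian mutation.
        children = 0.5 * (parents + rng.permutation(parents)) + rng.normal(0, mut, pop)
        x = np.clip(children, low, high)
    return x[np.argmin([cost(v) for v in x])]

print(genetic_minimise(ber, low=2.0, high=20.0))   # best aperture found (cm)
```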

Keywords: Optical Satellite Communication, Genetic Algorithm, Transmitter Optics Aperture

4550 Modeling the Impact of Controls on Information System Risks

Authors: M. Ndaw, G. Mendy, S. Ouya

Abstract:

Information system risk management helps to reduce or eliminate risk by implementing appropriate controls. In this paper, we propose a model for quantifying the impact of controls on information system risks by automating the residual criticality estimation step of FMECA, which is based on inductive reasoning. For this, we define three equations based on the type and maturity of controls. For testing, the values obtained with the model were compared to values estimated by interlocutors during different working sessions, and the result is satisfactory. This model allows an optimal assessment of control maturity and facilitates risk analysis of an information system.
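The paper's three equations are not reproduced here; as a purely illustrative stand-in, a control-adjusted FMECA criticality could take the following form, with the adjustment factor driven by control maturity:

```python
def residual_criticality(severity, occurrence, detection,
                         control_maturity, control_weight=0.5):
    """Illustrative estimate: base criticality S*O*D reduced by a factor that
    grows with control maturity in [0, 1] (assumed form, not the paper's)."""
    base = severity * occurrence * detection
    return base * (1.0 - control_weight * control_maturity)

# A risk rated S=8, O=5, D=4 with a mature control (0.8) at weight 0.5:
print(residual_criticality(8, 5, 4, control_maturity=0.8))   # 160 -> 96
```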

Keywords: Information System, Risk, Control, FMECA.

4549 Validation and Selection between Machine Learning Technique and Traditional Methods to Reduce Bullwhip Effects: a Data Mining Approach

Authors: Hamid R. S. Mojaveri, Seyed S. Mousavi, Mojtaba Heydar, Ahmad Aminian

Abstract:

The aim of this paper is to present a three-step methodology to forecast supply chain demand. In the first step, various data mining techniques are applied in order to prepare data for entering the forecasting models. In the second step, the modeling step, an artificial neural network and a support vector machine are presented, after defining the Mean Absolute Percentage Error index for measuring error. The structure of the artificial neural network is selected based on previous researchers' results, and in this article the accuracy of the network is increased by using sensitivity analysis. The best forecast for the classical forecasting methods (Moving Average, Exponential Smoothing, and Exponential Smoothing with Trend) is obtained on the prepared data, and this forecast is compared with the results of the support vector machine and the proposed artificial neural network. The results show that the artificial neural network can forecast more precisely than the other methods. Finally, the stability of the forecasting methods is analyzed using the raw data, and the effectiveness of clustering analysis is also measured.
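For reference, the error index and two of the classical baselines can be sketched as follows (demand figures are invented for the example):

```python
import numpy as np

def mape(actual, forecast):
    """Mean Absolute Percentage Error used to compare the candidate models."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

def moving_average_forecast(series, window=3):
    return float(np.mean(series[-window:]))

def exponential_smoothing_forecast(series, alpha=0.4):
    level = series[0]
    for v in series[1:]:
        level = alpha * v + (1 - alpha) * level
    return level

demand = [120, 135, 128, 150, 162, 158]
history, actual_next = demand[:-1], demand[-1]
for name, f in [("moving average", moving_average_forecast(history)),
                ("exp. smoothing", exponential_smoothing_forecast(history))]:
    print(name, round(f, 1), "MAPE:", round(mape([actual_next], [f]), 2))
```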

Keywords: Artificial Neural Networks (ANN), bullwhip effect, demand forecasting, Support Vector Machine (SVM).

4548 Forecasting 24-Hour Ahead Electricity Load Using Time Series Models

Authors: Ramin Vafadary, Maryam Khanbaghi

Abstract:

Forecasting electricity load is important for various purposes such as planning, operation and control. Forecasts can save operating and maintenance costs, increase the reliability of power supply and delivery systems, and support correct decisions for future development. This paper compares various time series methods for forecasting electricity load 24 hours ahead. The methods considered are Holt-Winters smoothing, SARIMA modeling, an LSTM network, Fbprophet and TensorFlow Probability. The performance of each method is evaluated using the forecasting accuracy criteria, namely the Mean Absolute Error and Root Mean Square Error. The National Renewable Energy Laboratory (NREL) residential energy consumption data are used to train the models. The results of this study show that the SARIMA model is superior to the others for 24-hour-ahead forecasts. Furthermore, a bagging technique is used to make the predictions more robust. The obtained results show that by bagging multiple time-series forecasts we can improve the robustness of the models for 24-hour-ahead electricity load forecasting.
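A minimal sketch of a 24-hour-ahead SARIMA forecast with a daily seasonal period (synthetic hourly load standing in for the NREL data; the (p,d,q)(P,D,Q,24) orders are illustrative assumptions):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(0)
hours = pd.date_range("2023-01-01", periods=30 * 24, freq="H")
load = 1.0 + 0.3 * np.sin(2 * np.pi * np.arange(len(hours)) / 24) \
       + rng.normal(0, 0.05, len(hours))
series = pd.Series(load, index=hours)

train, test = series[:-24], series[-24:]            # hold out the last day
model = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 1, 1, 24)).fit(disp=False)
forecast = model.forecast(steps=24)                 # 24 hours ahead

print("MAE:",  mean_absolute_error(test, forecast))
print("RMSE:", np.sqrt(mean_squared_error(test, forecast)))
```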

Keywords: Bagging, Fbprophet, Holt-Winters, LSTM, load forecast, SARIMA, TensorFlow Probability, time series.

4547 A Novel Convergence Accelerator for the LMS Adaptive Algorithm

Authors: Jeng-Shin Sheu, Jenn-Kaie Lain, Tai-Kuo Woo, Jyh-Horng Wen

Abstract:

The least mean square (LMS) algorithm is one of the most well-known algorithms for mobile communication systems due to its implementation simplicity. However, its main limitation is a relatively slow convergence rate. In this paper, a booster using the concept of Markov chains is proposed to speed up the convergence rate of LMS algorithms. The nature of Markov chains makes it possible to exploit past information in the updating process. Moreover, since the transition matrix has a smaller variance than the weight itself by the central limit theorem, the weight transition matrix converges faster than the weight itself. Accordingly, the proposed Markov-chain based booster has the ability to track variations in signal characteristics while accelerating the rate of convergence of LMS algorithms. Simulation results show that the LMS algorithm can effectively increase its convergence rate and further approach the Wiener solution when the Markov-chain based booster is applied. The mean square error is also remarkably reduced while the convergence rate is improved.
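For context, the baseline LMS update that the booster is meant to accelerate looks like this (the Markov-chain booster itself is not reproduced; channel taps and step size are illustrative):

```python
import numpy as np

def lms_identify(x, d, taps=4, mu=0.05):
    """Baseline LMS: adapt an FIR weight vector w so that w . x_k tracks d_k."""
    w = np.zeros(taps)
    for k in range(taps - 1, len(x)):
        xk = x[k - taps + 1:k + 1][::-1]   # [x_k, x_{k-1}, ..., x_{k-taps+1}]
        e = d[k] - w @ xk                  # a-priori error
        w = w + mu * e * xk                # stochastic-gradient step
    return w

# Identify an unknown 4-tap channel h from noisy observations.
rng = np.random.default_rng(0)
h = np.array([0.7, -0.3, 0.2, 0.1])
x = rng.normal(size=5000)
d = np.convolve(x, h, mode="full")[:len(x)] + rng.normal(0, 0.01, len(x))
print(lms_identify(x, d))   # should approach h as the algorithm converges
```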

Keywords: LMS, Markov chain, convergence rate, accelerator.

4546 Biometric Steganography Using Variable Length Embedding

Authors: Souvik Bhattacharyya, Indradip Banerjee, Anumoy Chakraborty, Gautam Sanyal

Abstract:

Recent growth in digital multimedia technologies has provided many facilities for information transmission, reproduction and manipulation. Therefore, the concept of information security is one of the prime concerns of the present day. Biometric information security is one information security mechanism; it has advantages as well as disadvantages. A biometric system is at risk from a range of attacks, which aim to bypass the security system or suspend its normal functioning, and various hazards have been discovered in the use of biometric systems. Proper use of steganography greatly reduces the risk to biometric systems from hackers. Steganography is one of the fashionable information hiding techniques; its goal is to hide information inside a cover medium such as text, image, audio or video in such a way that the existence of the secret information cannot be detected. In this paper a new security concept is established by making the system more secure with the help of steganography combined with biometric security: the biometric information is embedded in the skin tone portion of an image with the help of the proposed steganographic technique.

Keywords: Biometrics, Skin tone detection, Series, Polynomial, Cover Image, Stego Image.

4545 A Finite Precision Block Floating Point Treatment to Direct Form, Cascaded and Parallel FIR Digital Filters

Authors: Abhijit Mitra

Abstract:

This paper proposes an efficient finite precision block floating point (BFP) treatment to the fixed coefficient finite impulse response (FIR) digital filter. The treatment includes effective implementation of all three forms of the conventional FIR filter, namely, direct form, cascaded and parallel, and a roundoff error analysis of them in the BFP format. An effective block formatting algorithm together with an adaptive scaling factor is proposed to make the realizations simpler from a hardware viewpoint. To this end, a generic relation between the tap weight vector length and the input block length is deduced. The implementation scheme also emphasises a simple block exponent update technique to prevent overflow even during the block to block transition phase. The roundoff noise is also investigated along analogous lines, taking these implementation issues into consideration. The simulation results show that the BFP roundoff errors depend on the signal level almost in the same way as floating point roundoff noise, resulting in an approximately constant signal to noise ratio over a relatively large dynamic range.
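A minimal sketch of the block formatting step (a shared exponent plus fixed-point mantissas; the word lengths are illustrative assumptions):

```python
import numpy as np

def to_bfp(block, mantissa_bits=12):
    """Represent a block with one shared exponent and quantised mantissas,
    scaled so the largest-magnitude sample just fits the mantissa word."""
    block = np.asarray(block, dtype=float)
    peak = np.max(np.abs(block))
    exponent = int(np.ceil(np.log2(peak))) if peak > 0 else 0
    scale = 2.0 ** (mantissa_bits - 1 - exponent)
    return np.round(block * scale).astype(int), exponent

def from_bfp(mantissas, exponent, mantissa_bits=12):
    return mantissas / 2.0 ** (mantissa_bits - 1 - exponent)

x = np.array([0.31, -0.72, 0.05, 0.44])
m, e = to_bfp(x)
print(np.max(np.abs(x - from_bfp(m, e))))   # block roundoff error
```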

Keywords: Finite impulse response digital filters, Cascade structure, Parallel structure, Block floating point arithmetic, Roundoff error.

4544 Software Maintenance Severity Prediction for Object Oriented Systems

Authors: Parvinder S. Sandhu, Roma Jaswal, Sandeep Khimta, Shailendra Singh

Abstract:

Since the majority of faults are found in only a few of a system's modules, there is a need to investigate the modules that are affected more severely than others, and proper maintenance needs to be done in time, especially for critical applications. Neural networks have already been applied in software engineering to build reliability growth models and to predict gross change or reusability metrics. Neural networks are non-linear, sophisticated modeling techniques that are able to model complex functions; they are used when the exact nature of the inputs and outputs is not known, and a key feature is that they learn the relationship between input and output through training. In the present work, various neural network based techniques are explored and a comparative analysis is performed for predicting the level of need for maintenance by predicting the level of severity of faults present in NASA's public domain defect dataset. The comparison of the different algorithms is made on the basis of Mean Absolute Error, Root Mean Square Error and accuracy values. It is concluded that the Generalized Regression Neural Network is the best algorithm for classifying software components into the different levels of severity of the impact of faults. The algorithm can be used to develop a model for identifying modules that are heavily affected by faults.

Keywords: Neural Network, Software faults, Software Metric.

4543 High Securing Cover-File of Hidden Data Using Statistical Technique and AES Encryption Algorithm

Authors: A. A. Zaidan, Anas Majeed, B. B. Zaidan

Abstract:

Nowadays, the rapid development of multimedia and the internet allows for wide distribution of digital media data. It has become much easier to edit, modify and duplicate digital information; digital documents are also easy to copy and distribute, and therefore face many threats. Security and privacy are major issues with the large flood of information and the development of digital formats, so it has become necessary to find appropriate protection because of the significance, accuracy and sensitivity of the information. Protection systems are classified, more specifically, into information hiding, information encryption, and combinations of hiding and encryption to increase information security. The strength of the information hiding science is due to the non-existence of standard algorithms to be used in hiding secret messages, and to the randomness in hiding methods, such as combining several media (covers) with different methods to pass a secret message. In addition, there are no formal methods to be followed to discover hidden data. For this reason, the task of this research becomes difficult. In this paper, a new information hiding system is presented. The proposed system aims to hide information (a data file) in any execution file (EXE) and to detect the hidden file; an implementation of a steganography system which embeds information in an execution file is presented, and EXE files are investigated. The system tries to find a solution to the size of the cover file and to make the result undetectable by anti-virus software. The system includes two main functions: the first is hiding the information in a Portable Executable file (EXE) through four processes (specify the cover file, specify the information file, encrypt the information, and hide the information); the second is extracting the hidden information through three processes (specify the stego file, extract the information, and decrypt the information). The system achieves its main goals, such as making the size of the cover file independent of the size of the information, while the resulting file does not cause any conflict with anti-virus software.

Keywords: Cryptography, Steganography, Portable Executable File.

4542 Economic Factors Affecting Rice Export of Thailand

Authors: Somphoom Sawaengkun

Abstract:

The purpose of this study was primarily to assess how important economic factors, namely the Thai export price of white rice, the exchange rate, and world rice consumption, affect overall Thai white rice exports, using historical data for the period 1989-2013 from the Thai Rice Exporters Association and the Food and Agriculture Organization of the United Nations. The co-integration method, regression analysis, and an error correction model were applied to investigate the econometric model. The findings indicated that in the long run, world rice consumption, the exchange rate, and the Thai export price of white rice were, respectively, the important factors affecting the export quantity of Thai white rice, as indicated by their significant coefficients. Meanwhile, the rice export price was the important factor affecting the export quantity of Thai white rice in the short run. This information is useful for business, export opportunities, price competitiveness, and policymakers in Thailand.

Keywords: Economic Factors, Rice Export, White Rice.

4541 Social Aspects and Successfully Funding a Crowd-Funding Project: The Impact of Social Information

Authors: Peggy S. C. van Teunenbroek

Abstract:

Recently, philanthropic crowd-funding (the raising of external funding from a large audience via social networks or social media) emerged as a new funding instrument for the Dutch cultural sector. However, such philanthropic crowd-funding in the US and the Netherlands is less successful than any other form of crowd-funding. We argue that social aspects are an important stimulus in philanthropic crowd-funding, since previous research has shown that crowd-funding is stimulated by something beyond financial merits. Put simply, crowd-funding seems to be a socially motivated activity. In this paper we focus on the effect of social information, described as information about the donation behavior of previous donors. Using a classroom experiment, we demonstrate a positive effect of social information on donation behavior in crowd-funding campaigns. Our study extends previous research by showing who is affected by social information and why, and highlights how social information can be used to stimulate individuals to donate more to crowd-funding projects.

Keywords: Online donation behavior, philanthropic crowd-funding, social information, social influence, social motivation.

4540 Monitoring and Fault-Recovery Capacity with Waveguide Grating-based Optical Switch over WDM/OCDMA-PON

Authors: Yao-Tang Chang, Chuen-Ching Wang, Shu-Han Hu

Abstract:

In order to provide flexibility as well as survivability over a passive optical network (PON), a new automatic random fault-recovery mechanism with an array-waveguide-grating based (AWG-based) optical switch (OSW) is presented. First, a combined wavelength-division-multiplexing and optical code-division multiple-access (WDM/OCDMA) scheme is configured to meet the requirements of the various geographical locations between the optical network units (ONUs) and the optical line terminal (OLT). The AWG-based optical switch is designed and viewed as a central star-mesh topology to avoid or reduce duplicated redundant elements such as fiber and transceivers. Hence, with a simple monitoring and routing switch algorithm, random fault-recovery capacity is achieved over the bi-directional (up/downstream) WDM/OCDMA scheme. When a distribution fiber (DF) fails or the bit-error-rate (BER) exceeds the 10-9 requirement, the primary/slave AWG-based OSWs are adjusted and controlled dynamically to restore the affected ONU groups via the other working DFs immediately.

Keywords: Random fault recovery mechanism, array-waveguide-grating based optical switch (AWG-based OSW), wavelength-division-multiplexing and optical code-division multiple-access (WDM/OCDMA)

4539 Discrete Polyphase Matched Filtering-based Soft Timing Estimation for Mobile Wireless Systems

Authors: Thomas O. Olwal, Michael A. van Wyk, Barend J. van Wyk

Abstract:

In this paper we present a soft timing phase estimation (STPE) method for wireless mobile receivers operating at low signal-to-noise ratios (SNRs). Discrete polyphase matched (DPM) filters, a log-maximum a posteriori probability (MAP) algorithm and/or a soft-output Viterbi algorithm (SOVA) are combined to derive a new timing recovery (TR) scheme. We apply this scheme to a wireless cellular communication system model that comprises a raised cosine filter (RCF) and a bit-interleaved turbo-coded multi-level modulation (BITMM) scheme, and the channel is assumed to be memoryless. Furthermore, no clock signals are transmitted to the receiver, contrary to the classical data aided (DA) models. This new model ensures that both the bandwidth and the power of the communication system are conserved. However, the computational complexity of ideal turbo synchronization is increased by 50%. Several simulation tests on bit error rate (BER) and block error rate (BLER) versus low SNR reveal that the proposed iterative soft timing recovery (ISTR) scheme outperforms the conventional schemes.

Keywords: discrete polyphase matched filters, maximum likelihood estimators, soft timing phase estimation, wireless mobile systems.
