Search results for: Bayesian estimation theory
2302 Design of Nonlinear Observer by Using Chebyshev Interpolation based on Formal Linearization
Authors: Kazuo Komatsu, Hitoshi Takata
Abstract:
This paper discusses the design of a nonlinear observer by a formal linearization method that applies Chebyshev interpolation, in order to simplify the synthesis of the observer and to improve the precision of the linearization. A dynamic nonlinear system is linearized with respect to a linearization function, and the measurement equation is transformed into an augmented linear one by the formal linearization method based on Chebyshev interpolation. Linear estimation theory is then applied to the linearized system, and a nonlinear observer is derived. Numerical experiments illustrate the effectiveness of the design and indicate that it performs remarkably well for nonlinear systems.
Keywords: nonlinear system, nonlinear observer, formal linearization, Chebyshev interpolation.
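As a minimal sketch of the interpolation step only (not the observer synthesis described in the abstract), the snippet below fits a Chebyshev series to a scalar nonlinearity with numpy; the function, degree, and operating range are assumptions made for the example.

```python
import numpy as np
from numpy.polynomial import Chebyshev

# Hypothetical scalar nonlinearity (not from the paper).
f = lambda x: np.sin(x) + 0.5 * x**3

# Interpolate f at Chebyshev points over an assumed operating range;
# this approximation is the kind of building block a formal
# linearization with respect to a linearization function relies on.
approx = Chebyshev.interpolate(f, deg=7, domain=[-2.0, 2.0])

# Check the approximation error on a dense grid.
x = np.linspace(-2.0, 2.0, 400)
print("max abs error:", np.max(np.abs(approx(x) - f(x))))
```

Raising the degree shrinks the error rapidly for smooth nonlinearities, which is the precision benefit the abstract attributes to Chebyshev interpolation.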
2301 Zero Inflated Strict Arcsine Regression Model
Authors: Y. N. Phang, E. F. Loh
Abstract:
The zero-inflated strict arcsine model is a newly developed model that has been found appropriate for modeling overdispersed count data. In this study, we extend the zero-inflated strict arcsine model to a zero-inflated strict arcsine regression model by taking into consideration the extra variability caused by excess zeros and by covariates in count data. The maximum likelihood method is used to estimate the parameters of this zero-inflated strict arcsine regression model.
Keywords: Overdispersed count data, maximum likelihood estimation, simulated annealing.
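For illustration, here is a sketch of zero-inflated maximum likelihood fitting. The strict arcsine pmf is not reproduced here, so a zero-inflated Poisson serves as a stand-in, and Nelder-Mead replaces the simulated annealing mentioned in the keywords; both substitutions are assumptions of the example.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

def zip_negloglik(params, y):
    """Negative log-likelihood of a zero-inflated Poisson model
    (stand-in for the strict arcsine count distribution)."""
    pi = 1 / (1 + np.exp(-params[0]))   # zero-inflation probability (logit scale)
    lam = np.exp(params[1])             # count-model mean (log scale)
    ll_zero = np.log(pi + (1 - pi) * np.exp(-lam))       # structural or sampling zero
    ll_pos = np.log(1 - pi) + poisson.logpmf(y, lam)     # ordinary count
    return -np.sum(np.where(y == 0, ll_zero, ll_pos))

rng = np.random.default_rng(0)
y = np.where(rng.random(500) < 0.3, 0, rng.poisson(2.5, 500))  # simulated counts
fit = minimize(zip_negloglik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
print("pi:", 1 / (1 + np.exp(-fit.x[0])), "lambda:", np.exp(fit.x[1]))
```

A regression version would replace the constant log-mean with a linear predictor in the covariates.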
2300 The Truth about Good and Evil: A Mixed-Methods Approach to Color Theory
Authors: Raniya Alsharif
Abstract:
The color theory of good and evil is the association of colors with the omnipresent concepts of good and evil; human behavior and perception can be strongly influenced by seeing black and white, and these connotations are so deeply ingrained that they can be hard to disentangle. The theory is a human construct that dates back to ancient Egypt and has been used since then in almost all forms of communication and expression, such as art, fashion, literature, and religious manuscripts, helping to implant preconceived ideas that influence behavior and society. This mixed-methods study uses surveys to collect quantitative data related to the theory and a vignette to collect qualitative data through a scenario in which participants aged 18-25 style two characters of good and bad character in color-contrasting clothes. Both instruments yield results about the nature of the preconceived perceptions associating ‘black and white’ with ‘good and evil’, illustrate the important role of media and communications in human behavior and the subconscious, and uncover how far this theory reaches in the age of social media enlightenment.
Keywords: Color perception, interpretivism, thematic analysis, vignettes.
2299 Decision-Making Criteria of PPP Projects: Stakeholder Theoretic Perspective
Authors: Xueqin Shan, Wenhua Hou, Xiaosu Ye, Chuanming Wu
Abstract:
Any decision-making is based on some underlying theory. Taking public rental housing in Chongqing municipality as an example, this essay argues that stakeholder theory can provide innovative criteria and evaluation methods for Public Private Partnership (PPP) projects. It analyzes how to choose decision-making criteria for the different stakeholders in the PPP model and what measures to take to satisfy those criteria, so as to form a “symbiotic” decision-making mode through contracts and to boost the application of the PPP model in large-scale public programs in China.
Keywords: PPP, Stakeholder Theory, Stakeholders, Decision-making Criteria
2298 A New Measurable Definition of Knowledge in New Growth Theory
Authors: Mohammad Ali Molaei
Abstract:
New Growth Theory helps us make sense of the ongoing shift from a resource-based economy to a knowledge-based economy. It underscores the point that the economic processes which create and diffuse new knowledge are critical to shaping the growth of nations, communities and individual firms. In all too many contributions to New (Endogenous) Growth Theory – though not in all – central reference is made to 'a stock of knowledge', a 'stock of ideas', etc., with this variable featuring centre-stage in the analysis. Yet it is immediately apparent that this is far from being a crystal clear concept, and the difficulty of capturing the value associated with knowledge is a real problem. The intent of this paper is to introduce new thinking and theorizing about knowledge and its measurability in New Growth Theory. Moreover, the study aims to synthesize various strains of the literature with a practical bearing on the knowledge concept. Through the institutional framework found within NGT, the knowledge concept can be measured indirectly: institutions matter because they shape the environment for the production and employment of new knowledge.
Keywords: Institution Framework, Knowledge, New Growth Theory (NGT)
2297 Language Learning, Drives, and Context: A Grounded Theory of Learning Behavior
Authors: Julian Pigott
Abstract:
This paper presents the Language Learning as a Means of Drive Engagement (LLMDE) theory, derived from a grounded theory analysis of interviews with Japanese university students. According to LLMDE theory, language learning can be understood as a means of engaging one or more of four self-fulfillment drives: the drive to expand one’s horizons (perspective drive); the drive to make a success of oneself (status drive); the drive to engage in interaction with others (communication drive); and the drive to obtain intellectual and affective stimulation (entertainment drive). While many theories of learner psychology focus on conscious agency, LLMDE theory addresses the role of the unconscious. In addition, supplementary thematic analysis of the data revealed the role of context in mediating drive engagement. Unexpected memorable events, for example, play a key role in instigating and, indirectly, in regulating learning, as do institutional and cultural contexts. Given the apparent importance of such factors beyond the immediate control of the learner, and given the pervasive role of habit and drives, it is argued that the concept of motivation merits theoretical reappraisal. Rather than an underlying force determining language learning success or failure, it can be understood to emerge sporadically in consciousness to promote behavioral change, or to protect habitual behavior from disruption.
Keywords: Drives, grounded theory, motivation, significant events.
2296 Recursive Filter for Coastal Displacement Estimation
Authors: Efstratios Doukakis, Nikolaos Petrelis
Abstract:
All climate models agree that the temperature in Greece will increase by 1° to 2°C by the year 2030, and the mean sea level in the Mediterranean is expected to rise at a rate of 5 cm/decade. The aim of the present paper is the estimation of the coastline displacement driven by climate change and sea level rise. To achieve this, all known statistical and non-statistical computational methods are employed on several Greek coastal areas. Furthermore, Kalman filtering techniques are introduced, formulated and tested for the first time. Based on all the above, shoreline change signals and noises are computed, and an inter-comparison between the different methods helps evaluate which is most promising for retrieving the shoreline change rate.
Keywords: Climate Change, Coastal Displacement, Kalman Filter
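A minimal sketch of the Kalman filtering idea for shoreline change is given below: a 1-D constant-rate state-space model tracking position and displacement rate. The dynamics, noise levels, and data are hypothetical, not the paper's formulation.

```python
import numpy as np

# State: [shoreline position (m), displacement rate (m/yr)]; yearly observations.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-rate dynamics
H = np.array([[1.0, 0.0]])              # only position is observed
Q = np.diag([0.01, 0.001])              # process noise covariance (assumed)
R = np.array([[4.0]])                   # measurement noise variance (assumed)

x = np.array([0.0, 0.0])                # initial state
P = np.eye(2) * 10.0                    # initial covariance

rng = np.random.default_rng(1)
true_rate = -0.5                        # hypothetical retreat of 0.5 m/yr
z = true_rate * np.arange(30) + rng.normal(0.0, 2.0, 30)  # noisy yearly positions

for zk in z:
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new shoreline measurement.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.atleast_1d(zk) - H @ x)
    P = (np.eye(2) - K @ H) @ P

print("estimated displacement rate (m/yr):", x[1])
```

The filtered rate estimate is the "shoreline change rate" signal that the abstract compares across methods.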
2295 A Simulation for Estimation of the Blood Pressure using Arterial Pressure-volume Model
Authors: Gye-rok Jeon, Jae-hee Jung, In-cheol Kim, Ah-young Jeon, Sang-hwa Yoon, Jung-man Son, Jae-hyung Kim, Soo-young Ye, Jung-hoon Ro, Dong-hyun Kim, Chul-han Kim
Abstract:
An analysis of the conventional blood pressure estimation method using an oscillometric sphygmomanometer was performed through a computer simulation using an arterial pressure-volume (APV) model. Traditionally, the maximum amplitude algorithm (MAA) is applied to the oscillation waveforms of the APV model to obtain the mean arterial pressure and the characteristic ratios. The estimation of the mean arterial pressure and characteristic ratios was significantly affected by the shape of the blood pressure waveforms and the cutoff frequency of the high-pass filter (HPF) circuitry; experimental errors arise from these effects when estimating blood pressure. To find an algorithm independent of the waveform shapes and the HPF parameters, the volume oscillation of the APV model and the phase shift of the oscillation were examined with the fast Fourier transform (FFT) while increasing the cuff pressure from 1 mmHg to 200 mmHg (1 mmHg per second). A phase shift in the volume oscillation was then observed only between the systolic and the diastolic blood pressures. The same results were obtained from simulations performed on two different arterial blood pressure waveforms and one hyperthermia waveform.
Keywords: Arterial blood pressure, oscillometric method
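For reference, a sketch of the conventional maximum amplitude algorithm that the abstract analyzes is shown below. The characteristic ratios and the synthetic oscillation envelope are assumed values for illustration, not the ones derived in the paper.

```python
import numpy as np

def maa(p_cuff, amp, r_s=0.55, r_d=0.75):
    """Maximum amplitude algorithm on a rising cuff-pressure ramp.
    r_s, r_d are characteristic ratios (assumed typical values)."""
    i = int(np.argmax(amp))
    map_est, a_max = p_cuff[i], amp[i]
    # Diastolic: on the rising side (below MAP), pressure where the
    # amplitude reaches r_d * a_max.
    dbp = np.interp(r_d * a_max, amp[:i + 1], p_cuff[:i + 1])
    # Systolic: on the falling side (above MAP); reverse so that the
    # amplitude axis is increasing for interpolation.
    sbp = np.interp(r_s * a_max, amp[i:][::-1], p_cuff[i:][::-1])
    return sbp, map_est, dbp

# Synthetic Gaussian-shaped oscillation envelope over a 1-200 mmHg ramp.
p = np.arange(1.0, 201.0)                      # cuff pressure, 1 mmHg steps
amp = np.exp(-0.5 * ((p - 95.0) / 25.0) ** 2)  # peak oscillation at 95 mmHg
print("SBP, MAP, DBP:", maa(p, amp))
```

The dependence of the result on r_s and r_d is exactly the waveform-shape sensitivity the abstract sets out to remove.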
2294 Human Action Recognition System Based on Silhouette
Authors: S. Maheswari, P. Arockia Jansi Rani
Abstract:
Human actions are recognized directly from video sequences. The objective of this work is to recognize various human actions such as running, jumping and walking. Human action recognition requires some prior knowledge about the actions, namely motion estimation and foreground and background estimation. A region of interest (ROI) is extracted to identify the human in each frame. Then, an optical flow technique is used to extract the motion vectors. Using the extracted features, similarity-measure-based classification is performed to recognize the action. Experiments on the Weizmann database show that the proposed method offers high accuracy.
Keywords: Background subtraction, human silhouette, optical flow, classification.
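A rough sketch of the motion-vector extraction step is given below, using OpenCV's dense Farnebäck flow as a stand-in (the abstract does not name a specific flow algorithm), a hypothetical clip filename, and a simple orientation histogram as the per-frame feature; ROI extraction is omitted.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("walk.avi")           # hypothetical Weizmann-style clip
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

features = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense optical flow between consecutive frames.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    # Per-frame descriptor: magnitude-weighted histogram of flow directions.
    hist, _ = np.histogram(ang, bins=8, range=(0, 2 * np.pi), weights=mag)
    features.append(hist / (hist.sum() + 1e-9))
    prev_gray = gray

clip_descriptor = np.mean(features, axis=0)
# Classify by comparing clip_descriptor to per-action templates with a
# similarity measure (e.g. cosine similarity), as the abstract describes.
```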
2293 Coherence Analysis between Respiration and PPG Signal by Bivariate AR Model
Authors: Yue-Der Lin, Wei-Ting Liu, Ching-Che Tsai, Wen-Hsiu Chen
Abstract:
PPG is a potential tool in clinical applications. Among these, the relationship between respiration and the PPG signal has attracted attention over the past decades. In this research, a bivariate AR spectral estimation method was utilized for the coherence analysis between these two signals. Ten healthy subjects participated, with signals measured at different respiratory rates. The results demonstrate that high coherence exists between respiration and the PPG signal, whereas the coherence disappears in breath-holding experiments. These results imply that the PPG signal reveals respiratory information. The utilized method may provide an attractive alternative approach for related research.
Keywords: Coherence analysis, photoplethysmography (PPG), bivariate AR spectral estimation.
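As a minimal illustration of the quantity being estimated, the sketch below computes magnitude-squared coherence between a surrogate respiration signal and a PPG-like signal. It uses Welch's nonparametric estimator from scipy as a stand-in for the paper's bivariate AR spectral estimator; the sampling rate and signal models are assumptions.

```python
import numpy as np
from scipy.signal import coherence

fs = 100.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)         # 15 breaths/min surrogate respiration
# PPG-like signal: cardiac component plus a respiration-coupled component.
ppg = 0.3 * resp + np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.random.randn(t.size)

# High coherence near 0.25 Hz indicates the respiratory information
# carried by the PPG signal; it would vanish under breath holding.
f, Cxy = coherence(resp, ppg, fs=fs, nperseg=1024)
print("coherence at ~0.25 Hz:", Cxy[np.argmin(np.abs(f - 0.25))])
```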
2292 Mega Projects and Governmentality
Authors: Sophie Sturup
Abstract:
Mega urban transport projects (MUTPs) are increasingly being used in urban environments to ameliorate the problem of congestion. However, a number of problems with regard to mega projects have been identified, in particular the seemingly institutionalised overestimation of economic benefits and persistent cost overruns, which could mean that the wrong projects are selected and that the projects that are selected cost more than they should. Studies to date have produced a number of solutions to these problems, perhaps most notably the various methods for including the private sector in project provision. However, the problems have shown significant intractability in the face of these solutions. This paper provides a detailed examination of some of the problems facing mega projects and then examines Foucault's theory of 'governmentality' as a possible frame of analysis which might shed light on the intractability of the identified problems, through an identification of the art of government in which MUTPs occur.
Keywords: Michel Foucault, Governmentality, Mega projects, Transport.
2291 Unscented Grid Filtering and Smoothing for Nonlinear Time Series Analysis
Authors: Nikolay Nikolaev, Evgueni Smirnov
Abstract:
This paper develops an unscented grid-based filter and a smoother for accurate nonlinear modeling and analysis of time series. The filter uses unscented deterministic sampling during both the time- and measurement-updating phases to approximate directly the distributions of the latent state variable. A complementary grid smoother is also derived to enable computation of the likelihood. This allows us to formulate an expectation-maximisation algorithm for maximum likelihood estimation of the state noise and the observation noise. Empirical investigations show that the proposed unscented grid filter/smoother compares favourably to other similar filters on nonlinear estimation tasks.
2290 Application of Seismic Wave Method in Early Estimation of Wencheng Earthquake
Authors: Wenlong Liu, Yucheng Liu
Abstract:
This paper introduces the application of the seismic wave method to earthquake prediction and early estimation, and demonstrates its advantages over traditional earthquake prediction methods. An example is presented to show the accuracy and efficiency of the seismic wave method in predicting a medium-sized earthquake swarm that occurred in Wencheng, Zhejiang, China. By applying this method, correct predictions were made on the day after the swarm started and on the day the maximum earthquake occurred, which provided a scientific basis for governmental decision-making.
Keywords: earthquake prediction, earthquake swarm, seismic activity method, seismic wave method, Wencheng earthquake
2289 Dengue Disease Mapping with Standardized Morbidity Ratio and Poisson-gamma Model: An Analysis of Dengue Disease in Perak, Malaysia
Authors: N. A. Samat, S. H. Mohd Imam Ma’arof
Abstract:
Dengue is an infectious vector-borne viral disease that is commonly found in tropical and sub-tropical regions around the world, especially in urban and semi-urban areas, including in Malaysia. There is currently no vaccine or chemotherapy for the prevention or treatment of dengue, so prevention and treatment depend on vector surveillance and control measures. Disease risk mapping has been recognized as an important tool in prevention and control strategies. The choice of statistical model used for relative risk estimation is important, as a good model will subsequently produce a good disease risk map. The aim of this study is therefore to estimate the relative risk of dengue, initially with the most common statistic used in disease mapping, the Standardized Morbidity Ratio (SMR), and then with one of the earliest applications of Bayesian methodology, the Poisson-gamma model. This paper begins with a review of the SMR method, which we apply to dengue data from Perak, Malaysia. We then fit an extension of the SMR method, the Poisson-gamma model. Both results are displayed and compared using graphs, tables and maps. The analysis shows that the latter method gives better relative risk estimates than the SMR. The Poisson-gamma model is demonstrated to overcome the problem that arises with the SMR when there are no observed dengue cases in certain regions. However, covariate adjustment in this model is difficult, and it does not allow for spatial correlation between risks in adjacent areas. These drawbacks have motivated many researchers to propose other alternative methods for estimating the risk.
Keywords: Dengue disease, Disease mapping, Standardized Morbidity Ratio, Poisson-gamma model, Relative risk.
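The two estimators compared in the abstract reduce to short formulas, sketched below with hypothetical district counts and assumed prior hyperparameters (in practice the Gamma hyperparameters are estimated from the data).

```python
import numpy as np

obs = np.array([12.0, 0.0, 45.0, 7.0, 23.0])   # observed dengue cases per district
exp = np.array([10.2, 3.1, 38.5, 9.0, 20.4])   # expected cases (hypothetical)

# Standardized Morbidity Ratio: observed over expected; it collapses to
# zero (or is undefined) for districts with no observed cases.
smr = obs / exp

# Poisson-gamma model: with a Gamma(a, b) prior on the relative risk,
# the posterior mean is (obs + a) / (exp + b), which smooths the SMR and
# stays positive even for zero-count districts.
a, b = 1.0, 1.0                                # assumed prior hyperparameters
rr_pg = (obs + a) / (exp + b)

print("SMR:          ", smr)
print("Poisson-gamma:", rr_pg)
```

The second district (zero observed cases) is exactly the situation where the abstract notes the Poisson-gamma estimate remains usable while the SMR does not.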
2288 Time-Delay Estimation Using Cross-ΨB-Energy Operator
Authors: Z. Saidi, A.O. Boudraa, J.C. Cexus, S. Bourennane
Abstract:
In this paper, a new time-delay estimation technique based on the cross ΨB-energy operator [5] is introduced. This quadratic energy detector measures how much of one signal is present in another. The location of the peak of the energy operator, corresponding to the maximum interaction between the two signals, is the estimate of the delay. The method is a fully data-driven approach. The discrete version of the continuous-time form of the cross ΨB-energy operator is presented for implementation. The effectiveness of the proposed method is demonstrated on real underwater acoustic signals arriving from targets, and the results are compared to the cross-correlation method.
Keywords: Teager-Kaiser energy operator, Cross-energy operator, Time-Delay, Underwater acoustic signals.
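For context, the baseline the paper compares against, cross-correlation time-delay estimation, is sketched below on synthetic data (the ΨB operator itself is not reproduced here); the sampling rate and signals are assumptions.

```python
import numpy as np
from scipy.signal import correlate, correlation_lags

fs = 10_000.0                        # sampling rate in Hz (assumed)
rng = np.random.default_rng(2)
x = rng.standard_normal(4096)        # reference signal
true_delay = 37                      # delay in samples
y = np.roll(x, true_delay) + 0.5 * rng.standard_normal(x.size)  # delayed, noisy copy

# The lag at the peak of the cross-correlation is the delay estimate.
lags = correlation_lags(y.size, x.size)
delay_hat = lags[np.argmax(correlate(y, x))]
print("estimated delay:", delay_hat, "samples =", delay_hat / fs, "s")
```

The ΨB-based estimator locates a peak of an interaction-energy measure instead of the correlation peak, but the peak-picking structure is analogous.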
2287 Applying Complex Network Theory to Software Structure Analysis
Authors: Weifeng Pan
Abstract:
Complex networks have been intensively studied across many fields, especially in Internet technology, biological engineering, and nonlinear science. Software, built up out of many interacting components at various levels of granularity such as functions, classes, and packages, represents another important class of complex networks and can also be studied using complex network theory. Over the last decade, many papers have been published on the interdisciplinary research between software engineering and complex networks; this work provides a different dimension to our understanding of software and is also very useful for the design and development of software systems. This paper explores how to use complex network theory to analyze software structure and briefly reviews the main advances in the corresponding aspects.
Keywords: Metrics, measurement, complex networks, software.
2286 Orthogonal Polynomial Density Estimates: Alternative Representation and Degree Selection
Authors: Serge B. Provost, Min Jiang
Abstract:
The density estimates considered in this paper comprise a base density and an adjustment component consisting of a linear combination of orthogonal polynomials. It is shown that, in the context of density approximation, the coefficients of the linear combination can be determined either from a moment-matching technique or a weighted least-squares approach. A kernel representation of the corresponding density estimates is obtained. Additionally, two refinements of the Kronmal-Tarter stopping criterion are proposed for determining the degree of the polynomial adjustment. By way of illustration, the density estimation methodology advocated herein is applied to two data sets.
Keywords: kernel density estimation, orthogonal polynomials, moment-based methodologies, density approximation.
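One concrete instance of a base density with a moment-matched orthogonal-polynomial adjustment is the Gram-Charlier expansion sketched below, using a standard normal base and probabilists' Hermite polynomials; the degree is fixed here rather than selected by a stopping criterion, and this is an illustrative assumption, not the paper's estimator.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from scipy.stats import norm

def gram_charlier(data, degree=4):
    """N(0,1) base density times a moment-matched series in probabilists'
    Hermite polynomials He_k; returns a density estimate for the
    standardized variable z = (x - mean) / sd."""
    z = (data - data.mean()) / data.std()
    # Orthogonality E[He_j(Z) He_k(Z)] = k! * delta_jk under N(0,1)
    # gives the moment-matched coefficients c_k = E[He_k(Z)] / k!.
    c = np.zeros(degree + 1)
    for k in range(degree + 1):
        e_k = np.zeros(k + 1); e_k[k] = 1.0
        c[k] = hermeval(z, e_k).mean() / math.factorial(k)
    return lambda t: norm.pdf(t) * hermeval(t, c)

rng = np.random.default_rng(3)
sample = rng.gamma(4.0, 1.0, 2000)          # skewed sample
f_hat = gram_charlier(sample)
print(f_hat(np.linspace(-3, 4, 8)))         # note: adjustments can dip below zero
```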
2285 Code-Aided Turbo Channel Estimation for OFDM Systems with NB-LDPC Codes
Authors: Ł. Januszkiewicz, G. Bacci, H. Gierszal, M. Luise
Abstract:
In this paper, channel estimation techniques are considered as support methods for OFDM transmission systems based on non-binary LDPC (Low-Density Parity-Check) codes. Standard frequency-domain pilot-aided LS (Least Squares) and LMMSE (Linear Minimum Mean Square Error) estimators are investigated. Furthermore, an iterative algorithm is proposed that exploits the NB-LDPC channel decoder to improve the performance of the LMMSE estimator. Simulation results for signals transmitted through fading mobile channels are presented to compare the performance of the proposed channel estimators.
Keywords: LDPC codes, LMMSE, OFDM, turbo channel estimation.
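The LS and LMMSE estimators compared in the abstract can be sketched in a few lines. The snippet below assumes unit-energy pilots on all subcarriers, a synthetic exponential power-delay profile, and a known channel correlation matrix; it is a textbook-style illustration, not the paper's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(4)
N, L = 64, 8                                # pilot subcarriers, channel taps (assumed)
x = np.ones(N, dtype=complex)               # known unit-energy pilot symbols

# Synthetic channel with an exponential power-delay profile.
m = np.arange(L)
h = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) * np.exp(-0.5 * m)
H = np.fft.fft(h, N)                        # true channel frequency response

snr_db = 10.0
sigma2 = 10 ** (-snr_db / 10)
noise = np.sqrt(sigma2 / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
y = H * x + noise

H_ls = y / x                                # LS estimate: divide out the pilots

# LMMSE: smooth the LS estimate with the channel correlation matrix
# R_hh = F diag(p) F^H (assumed known here; in practice it is modeled).
p = 2.0 * np.exp(-m)                        # tap powers matching the profile above
F = np.exp(-2j * np.pi * np.outer(np.arange(N), m) / N)
R_hh = (F * p) @ F.conj().T
H_lmmse = R_hh @ np.linalg.solve(R_hh + sigma2 * np.eye(N), H_ls)

print("LS MSE:   ", np.mean(np.abs(H_ls - H) ** 2))
print("LMMSE MSE:", np.mean(np.abs(H_lmmse - H) ** 2))
```

The code-aided turbo scheme of the paper would additionally feed decoder soft information back into the estimator on each iteration.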
2284 Enhancing the Performance of H.264/AVC in Adaptive Group of Pictures Mode Using Octagon and Square Search Pattern
Authors: S. Sowmyayani, P. Arockia Jansi Rani
Abstract:
This paper integrates the Octagon and Square Search pattern (OCTSS) motion estimation algorithm into the H.264/AVC (Advanced Video Coding) video codec in Adaptive Group of Pictures (AGOP) mode. The AGOP structure is computed based on scene changes in the video sequence, and the octagon-and-square-search-pattern block-based motion estimation method is implemented in the inter-prediction process of H.264/AVC. These two methods reduce the bit rate and the computational complexity, respectively, while maintaining the quality of the video sequence. Experiments were conducted on different types of video sequences. The results show that the bit rate, computation time and PSNR gain achieved by the proposed method are better than those of the existing H.264/AVC with fixed GOP and AGOP: with a marginal gain in quality of 0.28 dB and an average gain in bit rate of 132.87 kbps, the proposed method reduces the average computation time by 27.31 minutes compared to the existing state-of-the-art H.264/AVC video codec.
Keywords: Block Distortion Measure, Block Matching Algorithms, H.264/AVC, Motion estimation, Search patterns, Shot cut detection.
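To make the block-matching step concrete, here is a sketch of SAD-based block matching with an exhaustive square search window; this is a simplified stand-in for the octagon-plus-square refinement pattern of OCTSS, with block size and search range chosen arbitrarily.

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences, the block distortion measure."""
    return np.abs(a.astype(np.int32) - b.astype(np.int32)).sum()

def block_match(ref, cur, bx, by, bs=16, sr=8):
    """Find the motion vector of the bs x bs block of `cur` at (bx, by)
    by exhaustively searching a +/-sr square window in `ref`.
    (OCTSS would visit far fewer candidate points.)"""
    block = cur[by:by + bs, bx:bx + bs]
    best, best_mv = np.inf, (0, 0)
    for dy in range(-sr, sr + 1):
        for dx in range(-sr, sr + 1):
            y, x = by + dy, bx + dx
            if 0 <= y <= ref.shape[0] - bs and 0 <= x <= ref.shape[1] - bs:
                cost = sad(ref[y:y + bs, x:x + bs], block)
                if cost < best:
                    best, best_mv = cost, (dx, dy)
    return best_mv, best

rng = np.random.default_rng(5)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, (2, -3), axis=(0, 1))        # shift the frame content
print(block_match(ref, cur, bx=24, by=24))      # expect motion vector (3, -2)
```

Structured patterns such as OCTSS cut the candidate count from (2·sr+1)² points to a handful per step, which is where the reported computation-time savings come from.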
2283 Chaotic Dynamics of Cost Overruns in Oil and Gas Megaprojects: A Review
Authors: O. J. Olaniran, P. E. D. Love, D. J. Edwards, O. Olatunji, J. Matthews
Abstract:
Cost overruns are a persistent problem in oil and gas megaprojects. Whilst the extant literature is filled with studies on the incidence and causes of cost overruns, underlying theories to explain their emergence in oil and gas megaprojects are few. Yet a way to contain the syndrome of cost overruns is to understand the bases of ‘how and why’ they occur, and such knowledge will also help to develop pragmatic techniques for better overall management of oil and gas megaprojects. The aim of this paper is to explain the development of cost overruns in hydrocarbon megaprojects through the perspective of chaos theory. The underlying principles of chaos theory and their implications for cost overruns are examined and practical recommendations proposed. In addition, directions for future research in this fertile area are provided.
Keywords: Chaos theory, oil and gas, cost overruns, megaprojects.
2282 Modality and Redundancy Effects on Music Theory Learning Among Pupils of Different Anxiety Levels
Authors: Soon Fook Fong, Aldalalah, M. Osamah
Abstract:
The purpose of this study was to investigate the effects of the modality and redundancy principles on music theory learning among pupils of different anxiety levels. The music theory lesson was developed in three different modes: audio and image (AI), text with image (TI), and audio with image and text (AIT). The independent variables were the three modes of courseware, the moderator variable was the anxiety level, and the dependent variable was the post-test score. The study sample consisted of 405 third-grade pupils. Descriptive and inferential statistics were used to analyze the collected data: analyses of covariance (ANCOVA) and post hoc tests were carried out to examine the main effects as well as the interaction effects of the independent variables on the dependent variable. The findings show that medium-anxiety pupils performed significantly better than low- and high-anxiety pupils in all three treatment modes, and that the AI mode helped pupils with high anxiety significantly more than the TI and AIT modes.
Keywords: Modality, Redundancy, Music theory, Cognitive theory of multimedia learning, Cognitive load theory, Anxiety.
2281 Maximum Likelihood Estimation of Burr Type V Distribution under Left Censored Samples
Abstract:
The paper deals with the maximum likelihood estimation of the parameters of the Burr type V distribution based on left censored samples. The maximum likelihood estimators (MLE) of the parameters have been derived and the Fisher information matrix for the parameters of the said distribution has been obtained explicitly. The confidence intervals for the parameters have also been discussed. A simulation study has been conducted to investigate the performance of the point and interval estimates.
Keywords: Fisher information matrix, confidence intervals, censoring.
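The structure of a left-censored likelihood, density terms for fully observed points and CDF terms for censored ones, is sketched below with a normal distribution as a stand-in (the paper derives this for the Burr type V density, which is not reproduced here).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def left_censored_negloglik(theta, x, censored, c):
    """Negative log-likelihood with left censoring at c:
    log f(x_i) for observed points, log F(c) for censored ones.
    (Normal stand-in for the Burr type V distribution.)"""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                 # log-parameterized to stay positive
    ll = np.where(censored,
                  norm.logcdf(c, mu, sigma),
                  norm.logpdf(x, mu, sigma))
    return -ll.sum()

rng = np.random.default_rng(6)
x_full = rng.normal(2.0, 1.5, 400)
c = 0.5                                       # left-censoring threshold
censored = x_full < c
x = np.where(censored, c, x_full)             # censored values recorded as c

fit = minimize(left_censored_negloglik, x0=[0.0, 0.0],
               args=(x, censored, c), method="Nelder-Mead")
print("mu, sigma:", fit.x[0], np.exp(fit.x[1]))
```

The observed Fisher information (the Hessian of the negative log-likelihood at the MLE) then yields the confidence intervals discussed in the abstract.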
2280 Building Information Modeling-Based Approach for Automatic Quantity Take-off and Cost Estimation
Authors: Lo Kar Yin, Law Ka Mei
Abstract:
Architectural, engineering, construction and operations (AECO) industry practitioners have been adapting well to the dynamic construction market through the fundamental training of their disciplines. Further prompted by the pandemic since 2019, great steps have been taken toward virtual working environments, and the best possible collaboration is sought with project teams across boundaries. Adopting a Building Information Modeling-based approach and qualitative analysis, this paper reviews the quantity take-off (QTO) and cost estimation process through modeling techniques, in liaison with suppliers, fabricators, subcontractors, contractors, designers, consultants and service providers in the construction industry value chain. The review covers automatic project cost budgeting, project cost control, and cost evaluation of design options for in-situ reinforced-concrete construction and Modular Integrated Construction (MiC) at the design stage, as well as variation of works and cash flow/spending analysis at the construction stage as far as practicable, with a view to sharing the findings to enhance mutual trust and co-operation among AECO industry practitioners. The aim is to foster development through a common prototype of the design-and-build project delivery method under NEC4 Engineering and Construction Contract (ECC) Options A and C.
Keywords: Building Information Modeling, cost estimation, quantity take-off, modeling techniques.
2279 Analysis of Distribution of Thrust, Torque and Efficiency of a Constant Chord, Constant Pitch C.R.P. Fan by H.E.S. Method
Authors: Morteza Abbaszadeh, Parvin Nikpoorparizi, Mina Shahrooz
Abstract:
For the first time since 1940 and the presentation of Theodorsen's theory, the distribution of thrust, torque and efficiency along the blade of a counter-rotating propeller axial fan was studied with a novel method in this research. A constant chord, constant pitch symmetric fan was investigated with the Reynolds stress turbulence model, and the H.E.S. method was utilized to obtain distribution profiles from the C.F.D. test outcomes. The C.F.D. test results were validated against estimates from Playlic's analytical method. The final results prove the ability of the H.E.S. method to obtain distribution profiles from C.F.D. test results and demonstrate interesting facts about the effects of solidity and the differences between the distributions in the front and rear sections.
Keywords: C.F.D. Test, Counter Rotating Propeller, H.E.S. Method, R.S.M. Method
2278 Fast Algorithm of Infrared Point Target Detection in Fluctuant Background
Authors: Yang Weiping, Zhang Zhilong, Li Jicheng, Chen Zengping, He Jun
Abstract:
A background estimation approach using a small-window median filter is presented on the basis of analyzing models of the IR point target, noise and clutter. After simplifying the two-dimensional filter, a simple method adopting a one-dimensional median filter is illustrated for estimating the background according to the characteristics of the IR scanning system. An adaptive threshold is used to segment the background-cancelled image. Experimental results show that the algorithm achieves good performance and satisfies the requirement of real-time processing of large images.
Keywords: Point target, background estimation, median filter, adaptive threshold, target detection.
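A minimal sketch of the one-dimensional median-filter pipeline on a single scan line follows; window size, threshold multiplier, and the synthetic data are assumptions for the example.

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(7)
row = rng.normal(100.0, 3.0, 512)     # one IR scan line: background + noise
row[250] += 40.0                      # small point target

# One-dimensional median filter as the background estimate; the window
# is kept small enough that the point target cannot pull it upward.
background = median_filter(row, size=9)
residual = row - background           # background-cancelled scan line

# Adaptive threshold derived from the residual statistics.
thr = residual.mean() + 5.0 * residual.std()
print("detections at:", np.flatnonzero(residual > thr))
```

Because the median is insensitive to a few bright pixels, the target survives background cancellation while the fluctuant clutter is suppressed.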
2277 Long Term Examination of the Profitability Estimation Focused on Benefits
Authors: Stephan Printz, Kristina Lahl, René Vossen, Sabina Jeschke
Abstract:
Strategic investment decisions are characterized by high innovation potential and long-term effects on the competitiveness of enterprises. Due to the uncertainty and risks involved in this complex decision-making process, the need arises for well-structured support activities. A method that considers costs and the long-term added value is the cost-benefit effectiveness estimation. One such method is the “profitability estimation focused on benefits” (PEFB) method developed at the Institute of Management Cybernetics at RWTH Aachen University. The method copes with the challenges associated with strategic investment decisions by integrating long-term non-monetary aspects whilst also mapping the chronological sequence of an investment within the organization's target system. It is thus a holistic approach for the evaluation of the costs and benefits of an investment. This participation-oriented method was applied to business environments in many workshops. The results of the workshops are a library of more than 96 cost aspects and 122 benefit aspects. These aspects are preprocessed and comparatively analyzed with regard to their alignment to a series of risk levels. For the first time, an accumulation and a distribution of cost and benefit aspects regarding their impact and probability of occurrence are given. The results give evidence that the PEFB method combines precise measures of financial accounting with the incorporation of benefits. Finally, the results constitute the basis for using information technology and data science for decision support when applying the PEFB method.
Keywords: Cost-benefit analysis, multi-criteria decision, profitability estimation focused on benefits, risk and uncertainty analysis.
2276 A Self Configuring System for Object Recognition in Color Images
Authors: Michela Lecca
Abstract:
System MEMORI automatically detects and recognizes rotated and/or rescaled versions of the objects of a database within digital color images with cluttered background. This task is accomplished by means of a region grouping algorithm guided by heuristic rules, whose parameters concern some geometrical properties and the recognition score of the database objects. This paper focuses on the strategies implemented in MEMORI for the estimation of the heuristic rule parameters. This estimation, being automatic, makes the system a highly user-friendly tool.
Keywords: Automatic object recognition, clustering, content based image retrieval system, image segmentation, region adjacency graph, region grouping.
2275 An Alternative Proof for the Topological Entropy of the Motzkin Shift
Authors: Fahad Alsharari, Mohd Salmi Md Noorani
Abstract:
A Motzkin shift is a mathematical model for constraints on genetic sequences. In terms of the theory of symbolic dynamics, the Motzkin shift is nonsofic, and therefore we cannot use Perron-Frobenius theory to calculate its topological entropy. The Motzkin shift M(M,N), which comes from language theory, is defined to be the shift system over an alphabet A that consists of N negative symbols, N positive symbols and M neutral symbols. For an x in the full shift, x will be in the Motzkin subshift M(M,N) if and only if every finite block appearing in x has a non-zero reduced form; the constraint on x therefore cannot be bounded in length. K. Inoue has shown that the entropy of the Motzkin shift M(M,N) is log(M + N + 1). In this paper, a new direct method of calculating the topological entropy of the Motzkin shift is given without any measure-theoretical discussion.
Keywords: Motzkin shift, topological entropy.
2274 Distributed Frequency Synchronization for Global Synchronization in Wireless Mesh Networks
Authors: Jung-Hyun Kim, Jihyung Kim, Kwangjae Lim, Dong Seung Kwon
Abstract:
In this paper, our focus is to assure global frequency synchronization in OFDMA-based wireless mesh networks using only local information. To acquire global synchronization in a distributed manner, we propose a novel distributed frequency synchronization (DFS) method. DFS is a method in which the carrier frequencies of distributed nodes converge to a common value through repeated estimation, averaging and sharing steps. Experimental results show that DFS achieves notably better synchronization success probability than existing schemes in OFDMA-based mesh networks in which estimation error is present.
Keywords: OFDMA systems, Frequency synchronization, Distributed networks, Multiple groups.
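The convergence mechanism behind the repeated estimate-average-share steps can be illustrated by plain consensus averaging over a local-neighbor topology. The sketch below assumes a noiseless ring network and arbitrary initial offsets; the OFDMA estimation details of DFS are not modeled.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 10
freq = rng.normal(2.4e9, 5e3, n)   # initial per-node carrier frequencies (Hz)

# Ring topology: each node averages its own estimate with those of its
# two neighbours, i.e. it uses only locally shared information.
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = A[i, (i - 1) % n] = A[i, (i + 1) % n] = 1 / 3

for _ in range(200):               # repeated estimation/averaging/sharing steps
    freq = A @ freq

print("frequency spread after averaging (Hz):", freq.max() - freq.min())
```

Because the averaging matrix is doubly stochastic, all nodes converge to the common mean frequency, which is the global synchronization the abstract targets.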
2273 Examining Herzberg's Two Factor Theory in a Large Chinese Chemical Fiber Company
Authors: Ju-Chun Chien
Abstract:
The validity of Herzberg's Two-Factor Theory of Motivation was tested empirically by surveying 2372 chemical fiber employees in 2012. In the valid sample of 1875 respondents, the degree of overall job satisfaction was more than moderate. The most highly valued components of job satisfaction were “corporate image,” “collaborative working atmosphere,” and “supervisor's expertise,” whereas the lowest mean score, 34.65, was for “job rotation and promotion.” The top three job retention options rated by the participants were “good image of the enterprise,” “good compensation,” and “workplace is close to my residence.” The overall evaluation of the level of workplace thriving facilitation almost reached “mostly agree.” Participants who chose at least one motivator among their job retention options had significantly greater job satisfaction than those who chose only hygiene factors as their retention options. Therefore, Herzberg's Two-Factor Theory of Motivation was found valid in this study.
Keywords: Employee job satisfaction, Job retention, Traditional business, Two-factor theory of motivation.