Search results for: Quantitative Precipitation Estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1867

1567 New Enhanced Hexagon-Based Search Using Point-Oriented Inner Search for Fast Block Motion Estimation

Authors: Lai-Man Po, Chi-Wang Ting, Ka-Ho Ng

Abstract:

Recently, an enhanced hexagon-based search (EHS) algorithm was proposed to speed up the original hexagon-based search (HS) by exploiting the group-distortion information of some evaluated points. In this paper, a second version of the EHS is proposed with a new point-oriented inner search technique which can further speed up the HS in both large and small motion environments. Experimental results show that the enhanced hexagon-based search version-2 (EHS2) is up to 34% faster than the HS with negligible PSNR degradation.
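
The EHS2 refinements are only summarized here, so the sketch below illustrates just the baseline hexagon-based search it builds on: the sum of absolute differences (SAD) is minimized by moving a large hexagon pattern until its centre is the best point, followed by a small inner-pattern refinement. The block size, pattern coordinates and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sad(cur_blk, ref, y, x, bs):
    """Sum of absolute differences between the current block and a candidate reference block."""
    h, w = ref.shape
    if y < 0 or x < 0 or y + bs > h or x + bs > w:
        return np.inf  # candidate falls outside the reference frame
    return np.abs(cur_blk - ref[y:y + bs, x:x + bs]).sum()

# Large hexagon pattern (6 points) and small inner pattern (4 points), as (dy, dx)
LHP = [(0, -2), (-2, -1), (-2, 1), (0, 2), (2, 1), (2, -1)]
SHP = [(0, -1), (-1, 0), (0, 1), (1, 0)]

def hexagon_search(cur, ref, by, bx, bs=16):
    """Classic hexagon-based search for one block whose top-left corner is (by, bx)."""
    cur_blk = cur[by:by + bs, bx:bx + bs].astype(np.int32)
    ref = ref.astype(np.int32)
    cy, cx = by, bx
    best = sad(cur_blk, ref, cy, cx, bs)
    # Coarse stage: move the large hexagon until its centre is the minimum-SAD point
    while True:
        cands = [(sad(cur_blk, ref, cy + dy, cx + dx, bs), cy + dy, cx + dx)
                 for dy, dx in LHP]
        m = min(cands)
        if m[0] >= best:
            break
        best, cy, cx = m
    # Fine stage: one small-pattern refinement around the final centre
    for dy, dx in SHP:
        d = sad(cur_blk, ref, cy + dy, cx + dx, bs)
        if d < best:
            best, cy, cx = d, cy + dy, cx + dx
    return (cy - by, cx - bx), best  # motion vector and its SAD
```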

Keywords: Inner search, fast motion estimation, block-matching, hexagon search

1566 Exponential State Estimation for Neural Networks with Leakage, Discrete and Distributed Delays

Authors: Liyuan Wang, Shouming Zhong

Abstract:

In this paper, the design of a state estimator for neural networks with mixed time-varying delays is investigated by constructing appropriate Lyapunov-Krasovskii functionals and using some effective mathematical techniques. In order to derive conditions that guarantee the estimation error systems to be globally exponentially stable, we transform the considered systems into neutral-type time-delay systems. Then, with a set of linear matrix inequalities (LMIs), we obtain the stability criteria. Finally, three numerical examples are given to show the effectiveness and reduced conservatism of the proposed criterion.

Keywords: State estimator, Neural networks, Globally exponential stability.

1565 A Graph Theoretic Approach for Quantitative Evaluation of NAAC Accreditation Criteria for the Indian University

Authors: Nameesh Miglani, Rajeev Saha, R. S. Parihar

Abstract:

Estimating the quality of higher education within a university is a long-drawn-out process and is difficult to measure, primarily due to the lack of a standard scale. The National Assessment and Accreditation Council (NAAC) evolved an assessment methodology which involves self-appraisal by each university/college and an assessment of performance by an expert committee. The attributes involved in assessing a university may not be totally independent of each other, thereby necessitating the consideration of interdependencies. The present study focuses on the evaluation of assessment criteria using a graph theoretic approach and a fuzzy treatment of data collected from students. The technique provides a suitable platform for the university management team to cross-check the assessment of education quality by considering the interdependencies of the attributes using graph theory.

Keywords: Graph theory, NAAC accreditation criteria, Indian University accreditation process.

1564 Estimation of Shock Velocity and Pressure of Detonations and Finding Their Flow Parameters

Authors: Mahmoud Zarrini, R. N. Pralhad

Abstract:

In this paper, mathematical modeling of detonation in the ground is studied. Flow parameters such as velocity, maximum velocity, acceleration, maximum acceleration, and shock pressure resulting from an explosion in the ground have been computed with an appropriate dynamic model. The variation of these parameters with the diameter of the detonation site (L), the density of earth or stone (ρ), the time decay of the detonation (T), the peak pressure (Pm), and time (t) has been analyzed. The model has been developed from the concept of underwater explosions [1]-[3] with appropriate changes to the present model requirements.

Keywords: Shock velocity, detonation, shock acceleration, shock pressure.

1563 Zero Inflated Strict Arcsine Regression Model

Authors: Y. N. Phang, E. F. Loh

Abstract:

The zero-inflated strict arcsine model is a newly developed model found to be appropriate for modeling overdispersed count data. In this study, we extend the zero-inflated strict arcsine model to a zero-inflated strict arcsine regression model by taking into consideration the extra variability caused by excess zeros and covariates in count data. The maximum likelihood estimation method is used to estimate the parameters of this zero-inflated strict arcsine regression model.

Keywords: Overdispersed count data, maximum likelihood estimation, simulated annealing.

1562 Evaluation of Best-Fit Probability Distribution for Prediction of Extreme Hydrologic Phenomena

Authors: Karim Hamidi Machekposhti, Hossein Sedghi

Abstract:

Probability distributions are the standard method for forecasting extreme hydrologic phenomena such as rainfall and flood flows. In this research, in order to determine a suitable probability distribution for estimating annual extreme rainfall and flood flow (discharge) series with different return periods, precipitation records of 40 years and discharge records of 58 years were collected from the Karkheh River in Iran. After homogeneity and adequacy tests, the data were analyzed with the Stormwater Management and Design Aid (SMADA) software and the residual sum of squares (R.S.S.). The best probability distribution was Log Pearson Type III, with R.S.S. values of 145.91 and 13.67 for peak discharge and R.S.S. values of 141.08 and 8.95 for maximum discharge, at the Jelogir Majin and Pole Zal stations, respectively. The best distribution for maximum precipitation at the Jelogir Majin and Pole Zal stations was the Log Pearson Type III distribution with R.S.S. values of 1.74 and 1.90, followed by the Pearson Type III distribution with R.S.S. values of 1.53 and 1.69. Overall, the Log Pearson Type III distribution is an acceptable distribution type for representing the statistics of extreme hydrologic phenomena in the Karkheh River in Iran, with the Pearson Type III distribution as a potential alternative.
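
The abstract does not spell out the fitting procedure; the sketch below shows, under stated assumptions, how a residual-sum-of-squares comparison between a Pearson Type III fit and a Log Pearson Type III fit could be carried out with SciPy (SMADA itself is a separate tool and is not reproduced). The Weibull plotting positions and the discharge values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def rss_of_fit(series, dist, log_transform=False):
    """Fit `dist` to an annual-maximum series and return the residual sum of
    squares between observed values and the quantiles predicted at the same
    empirical (Weibull plotting-position) probabilities."""
    x = np.sort(np.asarray(series, dtype=float))
    data = np.log10(x) if log_transform else x
    n = len(x)
    p = np.arange(1, n + 1) / (n + 1.0)          # Weibull plotting positions
    params = dist.fit(data)                       # maximum-likelihood fit
    q = dist.ppf(p, *params)                      # fitted quantiles
    fitted = 10.0 ** q if log_transform else q
    return np.sum((x - fitted) ** 2)

# Hypothetical annual peak-discharge series (m^3/s); real data would come from
# the Jelogir Majin or Pole Zal gauging records.
peaks = np.array([310., 520., 805., 430., 660., 290., 980., 540., 415., 730.])

rss_p3  = rss_of_fit(peaks, stats.pearson3)                      # Pearson Type III
rss_lp3 = rss_of_fit(peaks, stats.pearson3, log_transform=True)  # Log Pearson Type III
print(f"Pearson III RSS = {rss_p3:.1f}, Log Pearson III RSS = {rss_lp3:.1f}")
```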

Keywords: Karkheh River, Log Pearson Type III, probability distribution, residual sum of squares.

1561 Groundwater Seepage Estimation into Amirkabir Tunnel Using Analytical Methods and DEM and SGR Method

Authors: Hadi Farhadian, Homayoon Katibeh

Abstract:

In this paper, groundwater seepage into the Amirkabir tunnel has been estimated using analytical and numerical methods for 14 different sections of the tunnel. The Site Groundwater Rating (SGR) method has also been applied for qualitative and quantitative classification of the tunnel sections. The results of the above methods were compared with one another. The study shows reasonable agreement among all the methods except for two sections of the tunnel. In these two sections there are significant discrepancies between the numerical and analytical results, originating mainly from the model geometry and the high overburden. The SGR method and the analytical and numerical calculations all confirm a high concentration of seepage inflow in fault zones. The maximum seepage flow into the tunnel, occurring in the crushed zone, has been estimated at 0.425 lit/sec/m using the analytical method and 0.628 lit/sec/m using the numerical method. Based on the SGR method, six of the 14 sections along the Amirkabir tunnel axis are found to be in the "No Risk" class, which is supported by analytical and numerical seepage values of less than 0.04 lit/sec/m.
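
The abstract does not state which analytical expression was used; a commonly cited closed-form estimate for steady-state inflow into a circular tunnel below the water table is Goodman's equation, sketched below. The hydraulic conductivity K, head H above the tunnel axis and tunnel radius r used here are illustrative assumptions, not values from the Amirkabir case study.

```python
import math

def goodman_inflow(K, H, r):
    """Goodman's steady-state estimate of groundwater inflow per unit tunnel
    length (m^3/s per m): Q = 2*pi*K*H / ln(2H/r).
    K: hydraulic conductivity (m/s), H: head above the tunnel axis (m), r: tunnel radius (m)."""
    return 2.0 * math.pi * K * H / math.log(2.0 * H / r)

# Illustrative numbers only
K, H, r = 1e-7, 150.0, 3.0
q = goodman_inflow(K, H, r)                 # m^3/s per metre of tunnel
print(f"Estimated inflow: {q * 1000:.3f} lit/sec/m")
```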

Keywords: Water Seepage, Amirkabir Tunnel, Analytical Method, DEM, SGR.

1560 Recursive Filter for Coastal Displacement Estimation

Authors: Efstratios Doukakis, Nikolaos Petrelis

Abstract:

All climate models agree that the temperature in Greece will increase in the range of 1° to 2°C by the year 2030, and the mean sea level in the Mediterranean is expected to rise at a rate of 5 cm/decade. The aim of the present paper is the estimation of the coastline displacement driven by climate change and sea level rise. To achieve this, all known statistical and non-statistical computational methods are employed on several Greek coastal areas. Furthermore, Kalman filtering techniques are introduced, formulated and tested for the first time. Based on all the above, shoreline change signals and noises are computed, and an inter-comparison between the different methods can be made to help evaluate which method is most promising for retrieving the shoreline change rate.
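
The abstract does not specify the state-space formulation; the following minimal sketch assumes a constant-rate model in which the state holds the shoreline position and its change rate, and the yearly surveyed position is the noisy measurement. All matrices, noise levels and data values are illustrative assumptions.

```python
import numpy as np

def shoreline_kalman(measurements, dt=1.0, q=0.01, r=4.0):
    """1-D constant-velocity Kalman filter: state = [position (m), rate (m/yr)]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition
    H = np.array([[1.0, 0.0]])                 # only the position is observed
    Q = q * np.array([[dt**4 / 4, dt**3 / 2],  # process noise covariance
                      [dt**3 / 2, dt**2]])
    R = np.array([[r]])                        # measurement noise variance
    x = np.array([[measurements[0]], [0.0]])   # initial state
    P = np.eye(2) * 10.0                       # initial covariance
    estimates = []
    for z in measurements:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the new survey
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.ravel().copy())
    return np.array(estimates)                 # filtered position and rate per survey

# Hypothetical yearly shoreline positions (m) relative to a baseline
obs = [0.0, -0.4, -0.9, -1.1, -1.8, -2.3, -2.6]
print(shoreline_kalman(obs)[-1])               # final [position, rate] estimate
```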

Keywords: Climate Change, Coastal Displacement, Kalman Filter

1559 A Simulation for Estimation of the Blood Pressure using Arterial Pressure-volume Model

Authors: Gye-rok Jeon, Jae-hee Jung, In-cheol Kim, Ah-young Jeon, Sang-hwa Yoon, Jung-man Son, Jae-hyung Kim, Soo-young Ye, Jung-hoon Ro, Dong-hyun Kim, Chul-han Kim

Abstract:

An analysis of the conventional blood pressure estimation method using an oscillometric sphygmomanometer was performed through a computer simulation using an arterial pressure-volume (APV) model. Traditionally, the maximum amplitude algorithm (MAA) is applied to the oscillation waveforms of the APV model to obtain the mean arterial pressure and the characteristic ratio. The estimation of the mean arterial pressure and the characteristic ratio was significantly affected by the shape of the blood pressure waveforms and the cutoff frequency of the high-pass filter (HPF) circuitry, and these effects introduce errors when estimating blood pressure. To find an algorithm independent of the influence of waveform shape and HPF parameters, the volume oscillation of the APV model and the phase shift of the oscillation obtained with the fast Fourier transform (FFT) were examined while increasing the cuff pressure from 1 mmHg to 200 mmHg (1 mmHg per second). The phase shift of the volume oscillation was then observed only between the systolic and the diastolic blood pressures. The same results were also obtained from simulations performed on two different arterial blood pressure waveforms and one hyperthermia waveform.
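
For context, a minimal sketch of the conventional maximum amplitude algorithm is given below: the oscillation amplitude envelope is extracted beat by beat, the cuff pressure at the envelope maximum is taken as the mean arterial pressure, and the systolic and diastolic pressures are read off where the envelope reaches fixed characteristic ratios of that maximum. The 0.55/0.85 ratios, the peak-detection envelope and the deflation-recording assumption are common textbook choices, not values from this paper.

```python
import numpy as np
from scipy.signal import find_peaks

def maximum_amplitude_algorithm(cuff_pressure, oscillation,
                                sys_ratio=0.55, dia_ratio=0.85):
    """Estimate systolic, mean and diastolic pressure from a cuff deflation record.
    cuff_pressure: slowly falling cuff pressure (mmHg); oscillation: band-passed
    oscillometric signal of the same length."""
    cuff_pressure = np.asarray(cuff_pressure, dtype=float)
    oscillation = np.asarray(oscillation, dtype=float)
    beats, _ = find_peaks(oscillation)              # one peak per heartbeat
    env = oscillation[beats]                        # oscillation amplitude envelope
    cp = cuff_pressure[beats]                       # cuff pressure at each beat
    i_max = int(np.argmax(env))
    map_est = cp[i_max]                             # MAP at the envelope maximum
    a_max = env[i_max]
    # Systolic: high-pressure side of the envelope, where it reaches sys_ratio * a_max
    i_sys = int(np.argmin(np.abs(env[:i_max + 1] - sys_ratio * a_max)))
    # Diastolic: low-pressure side of the envelope, where it reaches dia_ratio * a_max
    i_dia = i_max + int(np.argmin(np.abs(env[i_max:] - dia_ratio * a_max)))
    return cp[i_sys], map_est, cp[i_dia]
```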

Keywords: Arterial blood pressure, oscillometric method

1558 Human Action Recognition System Based on Silhouette

Authors: S. Maheswari, P. Arockia Jansi Rani

Abstract:

Human actions are recognized directly from video sequences. The objective of this work is to recognize various human actions such as running, jumping and walking. Human action recognition requires some prior knowledge about the actions, namely motion estimation and foreground/background estimation. A region of interest (ROI) is extracted to identify the human in the frame. Then, an optical flow technique is used to extract the motion vectors. Using the extracted features, similarity-measure-based classification is performed to recognize the action. From experiments on the Weizmann database, it is found that the proposed method offers high accuracy.
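
The abstract names the pipeline stages without detail; as a rough illustration of the motion-vector step, the sketch below extracts a foreground silhouette by background subtraction and computes dense optical flow between consecutive frames with OpenCV's Farnebäck method. The classifier stage is omitted, and all parameter values are assumptions rather than the authors' settings.

```python
import cv2
import numpy as np

def motion_features(video_path):
    """Yield a crude per-frame motion descriptor: mean flow magnitude and angle
    inside the foreground region of interest."""
    cap = cv2.VideoCapture(video_path)
    bg = cv2.createBackgroundSubtractorMOG2(history=100, varThreshold=25)
    ok, frame = cap.read()
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        mask = bg.apply(frame)                          # foreground silhouette
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        roi = mask > 0
        if roi.any():
            yield float(mag[roi].mean()), float(ang[roi].mean())
        prev = gray
    cap.release()
```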

Keywords: Background subtraction, human silhouette, optical flow, classification.

1557 Coherence Analysis between Respiration and PPG Signal by Bivariate AR Model

Authors: Yue-Der Lin, Wei-Ting Liu, Ching-Che Tsai, Wen-Hsiu Chen

Abstract:

Photoplethysmography (PPG) is a potential tool in clinical applications. In particular, the relationship between respiration and the PPG signal has attracted attention over the past decades. In this research, a bivariate AR spectral estimation method was utilized for the coherence analysis between these two signals. Ten healthy subjects participated in this research, with signals measured at different respiratory rates. The results demonstrate that high coherence exists between respiration and the PPG signal, whereas the coherence disappears in breath-holding experiments. These results imply that the PPG signal carries respiratory information. The utilized method may provide an attractive alternative approach for related research.
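
The paper's estimator is parametric (bivariate AR); as a simpler nonparametric stand-in, the sketch below computes the magnitude-squared coherence between a respiration signal and a PPG signal with Welch's method, which is enough to show the kind of respiration-band coherence the abstract describes. The sampling rate and both signals are synthetic assumptions.

```python
import numpy as np
from scipy.signal import coherence

fs = 100.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)          # 15 breaths/min respiration surrogate
ppg = (np.sin(2 * np.pi * 1.2 * t)           # cardiac component (~72 bpm)
       * (1 + 0.3 * resp)                    # respiration-induced amplitude modulation
       + 0.2 * resp                          # baseline wander
       + 0.05 * np.random.randn(t.size))     # measurement noise

f, Cxy = coherence(resp, ppg, fs=fs, nperseg=1024)
band = (f > 0.15) & (f < 0.35)               # around the respiratory frequency
print(f"Mean coherence in the respiratory band: {Cxy[band].mean():.2f}")
```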

Keywords: Coherence analysis, photoplethysmography (PPG), bivariate AR spectral estimation.

1556 Unscented Grid Filtering and Smoothing for Nonlinear Time Series Analysis

Authors: Nikolay Nikolaev, Evgueni Smirnov

Abstract:

This paper develops an unscented grid-based filter and a smoother for accurate nonlinear modeling and analysis of time series. The filter uses unscented deterministic sampling during both the time- and measurement-updating phases to approximate directly the distributions of the latent state variable. A complementary grid smoother is also derived to enable computation of the likelihood. This helps us to formulate an expectation-maximisation algorithm for maximum likelihood estimation of the state noise and the observation noise. Empirical investigations show that the proposed unscented grid filter/smoother compares favourably to other similar filters on nonlinear estimation tasks.

Keywords:

1555 Application of Seismic Wave Method in Early Estimation of Wencheng Earthquake

Authors: Wenlong Liu, Yucheng Liu

Abstract:

This paper introduces the application of the seismic wave method in earthquake prediction and early estimation. The advantages of the seismic wave method over traditional earthquake prediction methods are demonstrated. An example is presented to show the accuracy and efficiency of the seismic wave method in predicting a medium-sized earthquake swarm that occurred in Wencheng, Zhejiang, China. By applying this method, correct predictions were made on the day after the earthquake swarm started and on the day the maximum earthquake occurred, which provided a scientific basis for governmental decision-making.

Keywords: earthquake prediction, earthquake swarm, seismic activity method, seismic wave method, Wencheng earthquake

1554 Time-Delay Estimation Using Cross-ΨB-Energy Operator

Authors: Z. Saidi, A.O. Boudraa, J.C. Cexus, S. Bourennane

Abstract:

In this paper, a new time-delay estimation technique based on the cross ΨB-energy operator [5] is introduced. This quadratic energy detector measures how much of a signal is present in another one. The location of the peak of the energy operator, corresponding to the maximum interaction between the two signals, is the estimate of the delay. The method is a fully data-driven approach. The discrete version of the continuous-time form of the cross ΨB-energy operator is presented for its implementation. The effectiveness of the proposed method is demonstrated on real underwater acoustic signals arriving from targets, and the results are compared to the cross-correlation method.
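
The cross ΨB-energy operator itself is defined in the cited reference rather than in this abstract, so the sketch below only illustrates the cross-correlation baseline the method is compared against: the delay is taken as the lag that maximizes the cross-correlation of the two received signals. The signals and sampling rate are synthetic assumptions.

```python
import numpy as np
from scipy.signal import correlate, correlation_lags

def tde_cross_correlation(x, y, fs):
    """Estimate the delay of y relative to x (in seconds) from the cross-correlation peak."""
    c = correlate(y, x, mode="full")
    lags = correlation_lags(len(y), len(x), mode="full")
    return lags[np.argmax(c)] / fs

# Synthetic example: y is x delayed by 25 samples plus noise
fs = 1000.0
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.concatenate([np.zeros(25), x[:-25]]) + 0.1 * rng.standard_normal(2000)
print(f"Estimated delay: {tde_cross_correlation(x, y, fs) * 1e3:.1f} ms")  # ~25 ms
```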

Keywords: Teager-Kaiser energy operator, Cross-energy operator, Time-Delay, Underwater acoustic signals.

1553 Orthogonal Polynomial Density Estimates: Alternative Representation and Degree Selection

Authors: Serge B. Provost, Min Jiang

Abstract:

The density estimates considered in this paper comprise a base density and an adjustment component consisting of a linear combination of orthogonal polynomials. It is shown that, in the context of density approximation, the coefficients of the linear combination can be determined either from a moment-matching technique or a weighted least-squares approach. A kernel representation of the corresponding density estimates is obtained. Additionally, two refinements of the Kronmal-Tarter stopping criterion are proposed for determining the degree of the polynomial adjustment. By way of illustration, the density estimation methodology advocated herein is applied to two data sets.
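
As a hedged illustration of the general idea (not the authors' specific estimator or their Kronmal-Tarter-based degree selection), the sketch below builds a density estimate from a standard normal base density adjusted by a linear combination of probabilists' Hermite polynomials, with coefficients obtained by moment matching on standardized data; the degree and the example sample are arbitrary choices.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from math import factorial

def hermite_density_estimate(data, degree=4):
    """Density estimate f(x) ~= phi(x) * sum_k c_k He_k(x) on standardized data,
    with moment-matched coefficients c_k = mean(He_k(z)) / k!."""
    data = np.asarray(data, dtype=float)
    mu, sigma = data.mean(), data.std(ddof=1)
    z = (data - mu) / sigma
    coeffs = []
    for k in range(degree + 1):
        e_k = np.zeros(k + 1)
        e_k[k] = 1.0                               # selects He_k
        coeffs.append(hermeval(z, e_k).mean() / factorial(k))
    coeffs = np.array(coeffs)

    def pdf(x):
        t = (np.asarray(x, dtype=float) - mu) / sigma
        phi = np.exp(-0.5 * t**2) / np.sqrt(2 * np.pi)
        return phi * hermeval(t, coeffs) / sigma   # back-transform to the original scale
    return pdf

# Example on a skewed sample
rng = np.random.default_rng(1)
sample = rng.gamma(shape=3.0, scale=1.0, size=2000)
f_hat = hermite_density_estimate(sample, degree=4)
print(f_hat(np.array([1.0, 3.0, 6.0])))            # estimated densities at a few points
```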

Keywords: kernel density estimation, orthogonal polynomials, moment-based methodologies, density approximation.

1552 A Simple Chemical Precipitation Method of Titanium Dioxide Nanoparticles Using Polyvinyl Pyrrolidone as a Capping Agent and Their Characterization

Authors: V. P. Muhamed Shajudheen, K. Viswanathan, K. Anitha Rani, A. Uma Maheswari, S. Saravana Kumar

Abstract:

In this paper, a simple chemical precipitation route for the preparation of titanium dioxide nanoparticles, synthesized using titanium tetraisopropoxide as the precursor and polyvinyl pyrrolidone (PVP) as a capping agent, is reported. Differential Scanning Calorimetry (DSC) and Thermogravimetric Analysis (TGA) of the samples were recorded, and the phase transformation temperature of titanium hydroxide, Ti(OH)4, to titanium oxide, TiO2, was investigated. The as-prepared Ti(OH)4 precipitate was annealed at 800°C to obtain TiO2 nanoparticles. The thermal, structural, morphological and textural characterizations of the TiO2 nanoparticle samples were carried out by different techniques such as DSC-TGA, X-Ray Diffraction (XRD), Fourier Transform Infrared spectroscopy (FTIR), micro Raman spectroscopy, UV-Visible absorption spectroscopy (UV-Vis), Photoluminescence spectroscopy (PL) and Field Emission Scanning Electron Microscopy (FESEM). DSC-TGA of the as-prepared precipitate confirmed a mass loss of around 30%. XRD results exhibited no diffraction peaks attributable to the anatase phase in the reaction products after solvent removal, indicating that the product is purely rutile. The vibrational frequencies of the two main absorption bands of the prepared samples are discussed based on the FTIR analysis. The formation of nanospheres with diameters of the order of 10 nm has been confirmed by FESEM. The optical band gap was determined from the UV-Visible spectrum. From the photoluminescence spectra, a strong emission was observed. The obtained results suggest that this method provides a simple, efficient and versatile technique for preparing TiO2 nanoparticles, with the potential to be applied to other systems for photocatalytic activity.

Keywords: TiO2 nanoparticles, chemical precipitation route, phase transition, Fourier Transform Infrared spectroscopy, micro Raman spectroscopy, UV-Visible absorption spectroscopy, Photoluminescence spectroscopy, Field Emission Scanning Electron Microscopy.

1551 Code-Aided Turbo Channel Estimation for OFDM Systems with NB-LDPC Codes

Authors: Ł. Januszkiewicz, G. Bacci, H. Gierszal, M. Luise

Abstract:

In this paper, channel estimation techniques are considered as support methods for OFDM transmission systems based on Non-Binary LDPC (Low-Density Parity-Check) codes. Standard frequency-domain pilot-aided LS (Least Squares) and LMMSE (Linear Minimum Mean Square Error) estimators are investigated. Furthermore, an iterative algorithm exploiting the NB-LDPC channel decoder is proposed to improve the performance of the LMMSE estimator. Simulation results for signals transmitted through fading mobile channels are presented to compare the performance of the proposed channel estimators.
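
As background for the estimators named in the abstract, the sketch below shows a frequency-domain pilot-aided least-squares estimate for one OFDM symbol: the channel is estimated at pilot subcarriers as Y/X and interpolated across the data subcarriers. The pilot spacing, channel model and noise level are illustrative assumptions; the LMMSE and turbo refinements of the paper are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                                   # subcarriers in one OFDM symbol
pilot_idx = np.arange(0, N, 8)           # comb-type pilots every 8th subcarrier
pilots = np.ones(pilot_idx.size, dtype=complex)          # known pilot symbols

# Assumed 4-tap multipath channel and its frequency response
h = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(8)
H = np.fft.fft(h, N)

# Transmit QPSK data with pilots inserted, pass through the channel plus noise
X = (rng.choice([1, -1], N) + 1j * rng.choice([1, -1], N)) / np.sqrt(2)
X[pilot_idx] = pilots
Y = H * X + 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

# LS estimate at pilot positions, then linear interpolation of real/imag parts
H_ls_pilots = Y[pilot_idx] / pilots
H_hat = (np.interp(np.arange(N), pilot_idx, H_ls_pilots.real)
         + 1j * np.interp(np.arange(N), pilot_idx, H_ls_pilots.imag))

print("Mean squared estimation error:", np.mean(np.abs(H - H_hat) ** 2))
```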

Keywords: LDPC codes, LMMSE, OFDM, turbo channel estimation.

1550 Enhancing the Performance of H.264/AVC in Adaptive Group of Pictures Mode Using Octagon and Square Search Pattern

Authors: S. Sowmyayani, P. Arockia Jansi Rani

Abstract:

This paper integrates the Octagon and Square Search pattern (OCTSS) motion estimation algorithm into the H.264/AVC (Advanced Video Coding) video codec in Adaptive Group of Pictures (AGOP) mode. The AGOP structure is computed based on scene changes in the video sequence. The octagon and square search pattern block-based motion estimation method is implemented in the inter-prediction process of H.264/AVC. Together, these methods reduce bit rate and computational complexity while maintaining the quality of the video sequence. Experiments are conducted on different types of video sequences. The results show that the bit rate, computation time and PSNR gains achieved by the proposed method are better than those of the existing H.264/AVC with fixed GOP and AGOP. With a marginal quality gain of 0.28 dB and an average bit-rate gain of 132.87 kbps, the proposed method reduces the average computation time by 27.31 minutes compared to the existing state-of-the-art H.264/AVC video codec.

Keywords: Block Distortion Measure, Block Matching Algorithms, H.264/AVC, Motion estimation, Search patterns, Shot cut detection.

1549 Development of a Speed Sensorless IM Drives

Authors: Dj. Cherifi, Y. Miloud, A. Tahri

Abstract:

The primary objective of this paper is the elimination of the sensitivity of the induction motor drive to parameter variation. The proposed sensorless strategy is based on an algorithm permitting better simultaneous estimation of the rotor speed and the stator resistance, including an adaptive mechanism based on Lyapunov theory. To study the reliability and the robustness of the sensorless technique under abnormal operation, simulation tests have been performed for several cases.

The proposed sensorless vector control scheme showed good performance in the transient and steady states, with excellent rejection of load torque disturbances.

Keywords: Induction Motor Drive, field-oriented control, adaptive speed observer, stator resistance estimation.

1548 Maximum Likelihood Estimation of Burr Type V Distribution under Left Censored Samples

Authors: N. Feroze, M. Aslam

Abstract:

The paper deals with the maximum likelihood estimation of the parameters of the Burr type V distribution based on left censored samples. The maximum likelihood estimators (MLE) of the parameters have been derived and the Fisher information matrix for the parameters of the said distribution has been obtained explicitly. The confidence intervals for the parameters have also been discussed. A simulation study has been conducted to investigate the performance of the point and interval estimates.

Keywords: Fisher information matrix, confidence intervals, censoring.

1547 Characterization of the Microbial Induced Carbonate Precipitation Technique as a Biological Cementing Agent for Sand Deposits

Authors: Sameh Abu El-Soud, Zahra Zayed, Safwan Khedr, Adel M. Belal

Abstract:

The population increase in Egypt is driving a demand for horizontal land development, to benefit from different natural resources and to expand beyond the narrow Nile valley. However, this development faces challenges that hinder land and agricultural development. Desertification and moving sand dunes in the western sector of Egypt are considered the major obstacles blocking ideal land use and development. In the proposed research, sandy soil is treated biologically using Bacillus pasteurii bacteria, as these bacteria have the ability to bond the sand particles, changing loose sand into cemented sand and thereby reducing the mobility of the sand dunes. The procedure for implementing the Microbial Induced Carbonate Precipitation (MICP) technique is examined, and the different factors affecting this process, such as the medium used for bacteria sample preparation, the optical density (OD600), the reactant concentration, and the injection rates and intervals, are highlighted. Based on the findings of the MICP treatment of sandy soil, conclusions and future recommendations are presented.

Keywords: Soil stabilization, biological treatment, MICP, sand cementation.

1546 Building Information Modeling-Based Approach for Automatic Quantity Take-off and Cost Estimation

Authors: Lo Kar Yin, Law Ka Mei

Abstract:

Architectural, engineering, construction and operations (AECO) industry practitioners have adapted well to the dynamic construction market through the fundamental training of their disciplines. Further driven by the pandemic since 2019, great steps have been taken in virtual environments, and close collaboration is pursued with project teams without boundaries. Adopting a Building Information Modeling-based approach and qualitative analysis, this paper reviews the quantity take-off (QTO) and cost estimation process through modeling techniques, in liaison with suppliers, fabricators, subcontractors, contractors, designers, consultants and service providers in the construction industry value chain. The aim is automatic project cost budgeting, project cost control, and cost evaluation of design options for in-situ reinforced-concrete construction and Modular Integrated Construction (MiC) at the design stage, together with variation of works and cash flow/spending analysis at the construction stage as far as practicable, with a view to sharing the findings for enhancing mutual trust and co-operation among AECO industry practitioners. It also seeks to foster development through a common prototype of the design-and-build project delivery method under NEC4 Engineering and Construction Contract (ECC) Options A and C.

Keywords: Building Information Modeling, cost estimation, quantity take-off, modeling techniques.

1545 Fast Algorithm of Infrared Point Target Detection in Fluctuant Background

Authors: Yang Weiping, Zhang Zhilong, Li Jicheng, Chen Zengping, He Jun

Abstract:

A background estimation approach using a small-window median filter is presented on the basis of an analysis of the IR point target, noise and clutter models. After simplifying the two-dimensional filter, a simple method adopting a one-dimensional median filter is illustrated for estimating the background according to the characteristics of the IR scanning system. An adaptive threshold is then used to segment the background-cancelled image. Experimental results show that the algorithm achieves good performance and satisfies the requirement of real-time processing of large images.
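
A minimal sketch of this kind of detector is given below, under stated assumptions: the background of each scan line is estimated with a one-dimensional median filter, subtracted to leave the point-target residual, and an adaptive threshold of the form mean + k·std is applied. The window size and k are illustrative choices, not the paper's values.

```python
import numpy as np
from scipy.ndimage import median_filter

def detect_point_targets(image, window=9, k=4.0):
    """Return a boolean detection mask for small bright targets in an IR image.
    The background is estimated per scan line with a 1-D median filter."""
    img = np.asarray(image, dtype=float)
    background = median_filter(img, size=(1, window))   # 1-D filter along each row
    residual = img - background                          # background-cancelled image
    threshold = residual.mean() + k * residual.std()     # adaptive global threshold
    return residual > threshold

# Synthetic example: noisy background with two point targets
rng = np.random.default_rng(0)
frame = 10.0 + rng.standard_normal((128, 128))
frame[40, 60] += 15.0
frame[90, 20] += 12.0
mask = detect_point_targets(frame)
print("Detected target pixels:", np.argwhere(mask))
```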

Keywords: Point target, background estimation, median filter, adaptive threshold, target detection.

1544 Long Term Examination of the Profitability Estimation Focused on Benefits

Authors: Stephan Printz, Kristina Lahl, René Vossen, Sabina Jeschke

Abstract:

Strategic investment decisions are characterized by high innovation potential and long-term effects on the competitiveness of enterprises. Due to the uncertainty and risks involved in this complex decision-making process, the need arises for well-structured support activities. A method that considers cost and long-term added value is cost-benefit effectiveness estimation. One such method is the “profitability estimation focused on benefits – PEFB”-method developed at the Institute of Management Cybernetics at RWTH Aachen University. The method copes with the challenges associated with strategic investment decisions by integrating long-term non-monetary aspects whilst also mapping the chronological sequence of an investment within the organization’s target system. Thus, this method is characterized as a holistic approach to the evaluation of the costs and benefits of an investment. This participation-oriented method was applied to business environments in many workshops. The results of the workshops are a library of more than 96 cost aspects, as well as 122 benefit aspects. These aspects are preprocessed and comparatively analyzed with regard to their alignment with a series of risk levels. For the first time, an accumulation and a distribution of cost and benefit aspects regarding their impact and probability of occurrence are given. The results give evidence that the PEFB-method combines precise measures of financial accounting with the incorporation of benefits. Finally, the results constitute the basis for using information technology and data science for decision support when applying the PEFB-method.

Keywords: Cost-benefit analysis, multi-criteria decision, profitability estimation focused on benefits, risk and uncertainty analysis.

1543 A Self Configuring System for Object Recognition in Color Images

Authors: Michela Lecca

Abstract:

System MEMORI automatically detects and recognizes rotated and/or rescaled versions of the objects of a database within digital color images with cluttered backgrounds. This task is accomplished by means of a region grouping algorithm guided by heuristic rules, whose parameters concern some geometrical properties and the recognition score of the database objects. This paper focuses on the strategies implemented in MEMORI for the estimation of the heuristic rule parameters. This estimation, being automatic, makes the system a highly user-friendly tool.

Keywords: Automatic object recognition, clustering, content based image retrieval system, image segmentation, region adjacency graph, region grouping.

1542 Distributed Frequency Synchronization for Global Synchronization in Wireless Mesh Networks

Authors: Jung-Hyun Kim, Jihyung Kim, Kwangjae Lim, Dong Seung Kwon

Abstract:

In this paper, our focus is to assure global frequency synchronization in OFDMA-based wireless mesh networks using only local information. To acquire global synchronization in a distributed manner, we propose a novel distributed frequency synchronization (DFS) method. In DFS, the carrier frequencies of distributed nodes converge to a common value through repeated estimation, averaging and sharing steps. Experimental results show that DFS achieves a noticeably better synchronization success probability than existing schemes in OFDMA-based mesh networks in which estimation error is present.
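
The abstract describes the method only at the level of "estimate, average, share"; the sketch below illustrates that idea with a plain consensus-averaging iteration in which each node repeatedly replaces its carrier frequency by the average of its own value and its neighbours' noisily estimated values. The topology, noise level and update rule are illustrative assumptions rather than the DFS algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes = 6
# Symmetric neighbour lists of a small mesh (assumed topology)
neighbours = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 5], 4: [2, 5], 5: [3, 4]}
freq = 2.4e9 + rng.uniform(-5e3, 5e3, n_nodes)      # initial carrier frequencies (Hz)

for it in range(30):
    new = freq.copy()
    for i, nbrs in neighbours.items():
        # Each node estimates its neighbours' frequencies with some error,
        # then averages them together with its own value.
        est = freq[nbrs] + rng.normal(0, 50.0, len(nbrs))
        new[i] = (freq[i] + est.sum()) / (1 + len(nbrs))
    freq = new

print("Frequency spread after averaging (Hz):", freq.max() - freq.min())
```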

Keywords: OFDMA systems, Frequency synchronization, Distributed networks, Multiple groups.

1541 Comprehensive Regional Drought Assessment Index

Authors: A. Zeynolabedin, M. A. Olyaei, B. Ghiasi

Abstract:

Drought is an inevitable part of the earth’s climate. It occurs regularly with no clear warning and without recognizing borders. In addition, its impact is cumulative and not immediately discernible. Iran is located in a semi-arid region where droughts occur periodically as a natural hazard. The Standardized Precipitation Index (SPI), Surface Water Supply Index (SWSI), and Palmer Drought Severity Index (PDSI) are three well-known indices which describe drought severity; each has its own advantages and disadvantages and can be used for specific types of drought. These indices take into account factors such as precipitation, reservoir storage and discharge, temperature, and potential evapotranspiration in determining drought severity. In this paper, all three indices are first calculated for the Aharchay River watershed, located in the northwestern part of Iran in East Azarbaijan province. Next, based on two other important parameters, groundwater level and solar radiation, two new indices are defined. Finally, considering all five aforementioned indices, a combined drought index (CDI) is presented and calculated for the region. This combined index is based on all the meteorological, hydrological, and agricultural features of the region. The results show that the most severe drought condition in the Aharchay watershed occurred in June 2004. The results of this study can be used for drought monitoring and for drought mitigation planning.
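
Of the indices listed, the SPI has the most self-contained definition, so a minimal sketch of it is given below: precipitation totals are fitted with a gamma distribution and the cumulative probabilities are mapped through the inverse standard normal. The zero-precipitation handling, time scale and data values are simplifying assumptions, and the SWSI, PDSI and the paper's combined index are not reproduced.

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardized Precipitation Index for a series of (e.g. monthly) precipitation
    totals: fit a gamma distribution, then transform probabilities to z-scores."""
    precip = np.asarray(precip, dtype=float)
    nonzero = precip[precip > 0]
    shape, loc, scale = stats.gamma.fit(nonzero, floc=0)      # fit on wet months only
    p_zero = (precip == 0).mean()                              # probability of a dry month
    # Mixed CDF: P(X <= x) = p_zero + (1 - p_zero) * G(x) for x > 0
    cdf = p_zero + (1 - p_zero) * stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    cdf = np.clip(cdf, 1e-6, 1 - 1e-6)                         # keep the ppf finite
    return stats.norm.ppf(cdf)                                  # SPI values

# Hypothetical monthly precipitation totals (mm)
rain = np.array([12., 0., 35., 60., 5., 0., 22., 80., 15., 3., 48., 10.])
print(np.round(spi(rain), 2))
```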

Keywords: Drought, index variation, regional assessment, monitoring.

1540 Reliability of Digital FSO Links in Europe

Authors: Zdenek Kolka, Otakar Wilfert, Viera Biolkova

Abstract:

The paper deals with an analysis of visibility records collected from 210 European airports to obtain a realistic estimation of the availability of Free Space Optical (FSO) data links. Commercially available optical links usually operate in the 850 nm waveband; thus, the influence of the atmosphere on the optical beam is similar to its influence on visible light. Long-term visibility records represent an invaluable source of data for the estimation of the quality of service of FSO links. The model used characterizes both the statistical properties of fade depths and the statistical properties of individual fade durations. Results are presented for Italy, France, and Germany.

Keywords: Computer networks, free-space optical links, meteorology, quality of service.

1539 Color Constancy using Superpixel

Authors: Xingsheng Yuan, Zhengzhi Wang

Abstract:

Color constancy algorithms are generally based on simplified assumptions about the spectral distribution or the reflection attributes of the scene surface. However, in reality, these assumptions are too restrictive. A methodology is proposed to extend existing algorithms by applying color constancy locally to image patches rather than globally to the entire image. In this paper, a method based on low-level image features using superpixels is proposed. Superpixel segmentation partitions an image into regions that are approximately uniform in size and shape. Instead of using the entire pixel set for estimating the illuminant, only the superpixels with the most valuable information are used. Large-scale experiments on real-world scenes show that the estimation is more accurate using superpixels than when using the entire image.
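
The abstract does not define which superpixels carry the "most valuable information"; as a rough, hedged sketch of the overall idea, the snippet below segments an image with SLIC superpixels, keeps the superpixels with the highest internal variance as a simple stand-in for informativeness, and applies a gray-world illuminant estimate to those pixels only. All thresholds and the informativeness criterion are assumptions, not the paper's method.

```python
import numpy as np
from skimage.segmentation import slic
from skimage.util import img_as_float

def illuminant_from_superpixels(image, n_segments=200, keep_fraction=0.25):
    """Gray-world illuminant estimate restricted to the most 'informative' superpixels,
    here taken (as a simple proxy) to be the ones with the highest internal variance."""
    img = img_as_float(image)                              # RGB in [0, 1]
    labels = slic(img, n_segments=n_segments, compactness=10, start_label=0)
    ids = np.unique(labels)
    variances = np.array([img[labels == i].var() for i in ids])
    n_keep = max(1, int(keep_fraction * ids.size))
    keep = ids[np.argsort(variances)[::-1][:n_keep]]       # most varied superpixels
    mask = np.isin(labels, keep)
    illum = img[mask].mean(axis=0)                         # per-channel mean (gray-world)
    return illum / np.linalg.norm(illum)                   # unit-norm illuminant estimate
```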

Keywords: color constancy, illuminant estimation, superpixel

1538 Estimating Development Time of Software Projects Using a Neuro Fuzzy Approach

Authors: Venus Marza, Amin Seyyedi, Luiz Fernando Capretz

Abstract:

Software estimation accuracy is among the greatest challenges for software developers. This study aimed at building and evaluating a neuro-fuzzy model to estimate software project development time. Forty-one modules developed from ten programs were used as the dataset. Our proposed approach is compared with fuzzy logic and neural network models, and the results show that the MMRE (Mean Magnitude of Relative Error) obtained with the neuro-fuzzy model was substantially lower than the MMRE obtained with the fuzzy logic and neural network models.
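
The evaluation metric mentioned here has a simple standard definition; the snippet below computes MMRE as the mean of |actual − estimated| / actual over the project modules. The example numbers are invented for illustration.

```python
import numpy as np

def mmre(actual, estimated):
    """Mean Magnitude of Relative Error over a set of effort/duration estimates."""
    actual = np.asarray(actual, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    return np.mean(np.abs(actual - estimated) / actual)

# Hypothetical development times (person-days): actual vs. model estimates
actual    = [12.0, 30.0, 7.5, 45.0, 20.0]
predicted = [10.0, 33.0, 9.0, 40.0, 21.5]
print(f"MMRE = {mmre(actual, predicted):.3f}")   # lower is better
```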

Keywords: Artificial Neural Network, Fuzzy Logic, Neuro-Fuzzy, Software Estimation
