Search results for: Fuzzy Logic estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2978

2108 Non-Parametric, Unconditional Quantile Estimation of Efficiency in Microfinance Institutions

Authors: Komlan Sedzro

Abstract:

We apply the non-parametric, unconditional, hyperbolic order-α quantile estimator to appraise the relative efficiency of Microfinance Institutions in Africa in terms of outreach. Our purpose is to verify whether these institutions, which must constantly try to strike a compromise between their social role and financial sustainability, are operationally efficient. Using data on African MFIs extracted from the Microfinance Information eXchange (MIX) database and covering the 2004 to 2006 period, we find that the more efficient MFIs are also the most profitable. This result is in line with the view that social performance is not in contradiction with the pursuit of excellent financial performance. Our results also show that MFIs that are large in terms of assets and those charging the highest fees are not necessarily the most efficient.

Keywords: data envelopment analysis, microfinance institutions, quantile estimation of efficiency, social and financial performance

Procedia PDF Downloads 311
2107 The New Propensity Score Method and Assessment of Propensity Score: A Simulation Study

Authors: Azam Najafkouchak, David Todem, Dorothy Pathak, Pramod Pathak, Joseph Gardiner

Abstract:

Propensity score (PS) methods have recently become the standard analysis tool for causal inference in observational studies where exposure is not randomly assigned and confounding can therefore impact the estimation of the treatment effect on the outcome. Due to the dangers of discretizing continuous variables, the focus of this paper is on how the variation in cut-points or boundaries affects the average treatment effect when utilizing the stratification PS method. In this study, we develop a new methodology to improve the efficiency of PS analysis through stratification, evaluated by a simulation study. We also explore the properties of the empirical distribution of the average treatment effect theoretically, including its asymptotic distribution, variance estimation, and 95% confidence intervals.
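
Below is a minimal sketch of the stratification step described above, on simulated data: a propensity score is fitted, units are grouped by PS quantiles, and the average treatment effect (ATE) is the size-weighted mean of within-stratum outcome differences. The data-generating process, the sklearn logistic fit, and the stratum counts are illustrative assumptions; the loop over several stratum counts mirrors the paper's interest in how cut-points change the estimate.

```python
# Sketch: ATE via propensity score stratification on hypothetical data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=(n, 2))                      # confounders
p_treat = 1 / (1 + np.exp(-(x[:, 0] - 0.5 * x[:, 1])))
t = rng.binomial(1, p_treat)                     # non-random exposure
y = 2.0 * t + x[:, 0] + rng.normal(size=n)       # true ATE = 2.0

ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]

def stratified_ate(ps, t, y, n_strata=5):
    """ATE as the size-weighted mean of within-stratum mean differences."""
    edges = np.quantile(ps, np.linspace(0, 1, n_strata + 1))
    stratum = np.digitize(ps, edges[1:-1])       # labels 0..n_strata-1
    ate, total = 0.0, 0
    for k in range(n_strata):
        s = stratum == k
        if s.sum() == 0 or t[s].min() == t[s].max():
            continue                             # stratum lacks both groups
        ate += s.sum() * (y[s & (t == 1)].mean() - y[s & (t == 0)].mean())
        total += s.sum()
    return ate / total

for k in (3, 5, 10):                             # vary the boundaries
    print(k, "strata:", round(stratified_ate(ps, t, y, k), 3))
```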

Keywords: propensity score, stratification, empirical distribution, average treatment effect

Procedia PDF Downloads 99
2106 Performance Evaluation of Microcontroller-Based Fuzzy Controller for Fruit Drying System

Authors: Salisu Umar

Abstract:

Fruits are a seasonal crop and spoil quickly. They are dried so they can be preserved for a long period. The natural drying process requires more time, and the investment in space and infrastructure is large and cannot be afforded by a middle-class farmer. Therefore, there is a need for a comparatively small unit with reduced drying times that a middle-class farmer can afford. A controlled environment suitable for fruit drying is developed within a closed chamber, using a three-step process. Firstly, infrared light is used internally to preheat the fruit so that the water content inside the fruit is removed quickly for fast drying. Secondly, hot air of a specified temperature is blown inside the chamber to keep the humidity below a specified level and to exhaust the humid air from the chamber. Thirdly, the microcontroller disconnects the power to the chamber and idles once the weight of the fruits has been reduced to a known fraction of the original weight; this activates a buzzer for a duration of ten seconds to indicate the end of the drying process. The results obtained indicate that the system significantly reduces the drying time without affecting the quality of the fruits compared with existing dryers.
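
The abstract does not detail the fuzzy controller named in the title, so the following is only a generic sketch of the kind of rule evaluation such a controller might perform on the microcontroller; the humidity membership ranges, the three rules, and the heater-duty outputs are all invented for illustration.

```python
# Sketch of fuzzy rule evaluation for a drying-chamber heater.
# Membership ranges and rules are illustrative assumptions, not the
# paper's actual rule base.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def heater_duty(humidity_pct):
    # Fuzzify chamber humidity into three sets.
    low = tri(humidity_pct, -1, 0, 40)
    med = tri(humidity_pct, 20, 50, 80)
    high = tri(humidity_pct, 60, 100, 101)
    # Rules: low humidity -> 20% duty, medium -> 50%, high -> 90%.
    # Defuzzify by the weighted average (a common Sugeno-style step).
    w = low + med + high
    return (20 * low + 50 * med + 90 * high) / w if w else 0.0

for h in (15, 50, 85):
    print(f"humidity {h}% -> heater duty {heater_duty(h):.0f}%")
```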

Keywords: fruit, fuzzy controller, microcontroller, temperature, weight and humidity

Procedia PDF Downloads 445
2105 Asymmetrical Informative Estimation for Macroeconomic Model: Special Case in the Tourism Sector of Thailand

Authors: Chukiat Chaiboonsri, Satawat Wannapan

Abstract:

This paper applies an asymmetric information concept to the estimation of a macroeconomic model of the tourism sector in Thailand. The variables analyzed are Thailand's international and domestic tourism revenues, the expenditures of foreign and domestic tourists, service investments by private sectors, service investments by the government of Thailand, Thailand's service imports and exports, and net service income transfers. All data are time-series indices observed between 2002 and 2015. Empirically, the tourism multiplier and accelerator were estimated by two statistical approaches. The first was the Generalized Method of Moments model (GMM), based on the assumption that the tourism market in Thailand has perfect information (symmetrical data). The second was the Maximum Entropy Bootstrapping approach (MEboot), based on a process that attempts to deal with imperfect information and reduce uncertainty in data observations (asymmetrical data). In addition, the tourism leakages were investigated with a simple model based on the injections and leakages concept. The empirical findings show that the parameters computed from the MEboot approach differ from those of the GMM method. However, both the MEboot estimation and the GMM model suggest that Thailand's tourism sector is in a period in which it can stimulate the economy.
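
For readers unfamiliar with MEboot, the following is a simplified sketch of how one maximum entropy bootstrap replicate is generated (after Vinod's algorithm): resample from a piecewise-linear quantile function built on the order statistics, then restore the original time ordering so the replicate retains the series' dependence structure. Tail handling is simplified here relative to the full published procedure.

```python
# Simplified sketch of one MEboot replicate.
import numpy as np

def meboot_replicate(x, rng):
    x = np.asarray(x, float)
    order = np.argsort(x)               # ranks of the original observations
    xs = x[order]
    # Interval limits: midpoints between successive order statistics, with
    # the extremes reused as (simplified) outer limits.
    z = np.concatenate(([xs[0]], (xs[:-1] + xs[1:]) / 2, [xs[-1]]))
    u = np.sort(rng.uniform(size=x.size))
    # Map sorted uniforms through the piecewise-linear quantile function.
    grid = np.linspace(0, 1, z.size)
    ys = np.interp(u, grid, z)
    out = np.empty_like(ys)
    out[order] = ys                     # reorder to the original ranks
    return out

rng = np.random.default_rng(1)
series = np.cumsum(rng.normal(size=14))          # a toy time series
print(meboot_replicate(series, rng).round(2))
```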

Keywords: Thailand tourism, Maximum Entropy Bootstrapping approach, macroeconomic model, asymmetric information

Procedia PDF Downloads 295
2104 Performance Comparison of Wideband Covariance Matrix Sparse Representation (W-CMSR) with Other Wideband DOA Estimation Methods

Authors: Sandeep Santosh, O. P. Sahu

Abstract:

In this paper, a performance comparison of the wideband covariance matrix sparse representation (W-CMSR) method with other existing wideband direction of arrival (DOA) estimation methods is made. W-CMSR relies less on a priori information about the number of incident signals than ordinary subspace-based methods. Consider the perturbation-free covariance matrix of the wideband array output. The diagonal covariance elements are contaminated by unknown noise variance. The covariance matrix of the array output is conjugate symmetric, i.e., its upper right triangular elements can be represented by the lower left triangular ones. As the main diagonal elements are contaminated by unknown noise variance, slide over them and align the lower left triangular elements column by column to obtain a measurement vector. Simulation results for W-CMSR are compared with those of other wideband DOA estimation methods such as the coherent signal subspace method (CSSM), Capon, l1-SVD, and JLZA-DOA. W-CMSR separates two signals very clearly, whereas CSSM, Capon, l1-SVD, and JLZA-DOA fail to separate the two signals clearly, and a number of pseudo peaks appear in the spectrum of l1-SVD.
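
A minimal sketch of the measurement-vector construction just described: skip the noise-contaminated diagonal of the array covariance matrix and stack the strictly lower-triangular elements column by column. The 4x4 Hermitian matrix is a toy example.

```python
import numpy as np

def cmsr_measurement_vector(R):
    """Stack the below-diagonal entries of R column by column."""
    m = R.shape[0]
    cols = [R[j + 1:, j] for j in range(m - 1)]
    return np.concatenate(cols)

# Toy example with a 4x4 conjugate-symmetric (Hermitian) matrix.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 6)) + 1j * rng.normal(size=(4, 6))
R = A @ A.conj().T                  # Hermitian by construction
y = cmsr_measurement_vector(R)
print(y.shape)                      # (6,) = m(m-1)/2 entries for m = 4
```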

Keywords: W-CMSR, wideband direction of arrival (DOA), covariance matrix, electrical and computer engineering

Procedia PDF Downloads 471
2103 A System Dynamics Approach to Technological Learning Impact for Cost Estimation of Solar Photovoltaics

Authors: Rong Wang, Sandra Hasanefendic, Elizabeth von Hauff, Bart Bossink

Abstract:

Technological learning and learning curve models have been used continuously to estimate photovoltaic (PV) cost development over time for climate mitigation targets. They can integrate a number of technological learning sources that influence the learning process. Yet accurate and realistic cost estimates of PV development are still difficult to achieve. This paper develops four hypothetical-alternative learning curve models by proposing different combinations of technological learning sources, including both local and global technology experience and the knowledge stock. The paper specifically focuses on the non-linear relationship between costs and technological learning sources and their dynamic interaction, and uses the system dynamics approach to produce more accurate PV cost estimates for future development. As the case study, data from China are gathered to illustrate that the learning curve model incorporating both global and local experience is more accurate and realistic than the other three models for PV cost estimation. Further, absorbing and integrating global experience into the local industry has a positive impact on PV cost reduction. Although the learning curve model incorporating knowledge stock is not realistic for current PV cost deployment in China, it still plays an effective positive role in future PV cost reduction.
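
A minimal sketch of a two-factor learning curve of the general form such studies use, C = C0 · X^(-a) · G^(-b), with local experience X and global experience G, fitted by least squares; the functional form, series, and parameter values below are illustrative assumptions, not the paper's model or its Chinese data.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_factor(X, c0, a, b):
    local, global_ = X
    return c0 * local**(-a) * global_**(-b)

local = np.array([1, 2, 4, 8, 16, 32, 64], float)         # cumulative local experience
global_ = np.array([10, 18, 30, 55, 90, 150, 240], float) # cumulative global experience
cost = 5.0 * local**(-0.15) * global_**(-0.10)            # synthetic "truth"
cost *= 1 + 0.02 * np.random.default_rng(0).normal(size=cost.size)

(c0, a, b), _ = curve_fit(two_factor, (local, global_), cost, p0=(1, 0.1, 0.1))
print(f"c0={c0:.2f}, local exponent={a:.3f}, global exponent={b:.3f}")
print(f"local learning rate = {1 - 2**(-a):.1%}")  # cost drop per doubling
```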

Keywords: photovoltaic, system dynamics, technological learning, learning curve

Procedia PDF Downloads 97
2102 Characterising Stable Model by Extended Labelled Dependency Graph

Authors: Asraful Islam

Abstract:

Extended dependency graph (EDG) is a state-of-the-art isomorphic graph representation of normal logic programs (NLPs) that can characterize the consistency of NLPs by graph analysis. To construct the vertices and arcs of an EDG, additional renaming atoms and rules beyond those provided by the given program are used, resulting in higher space complexity compared to the corresponding traditional dependency graph (TDG). In this article, we propose an extended labelled dependency graph (ELDG) to represent an NLP that shares an equal number of nodes and arcs with the TDG, and we prove that it is isomorphic to the domain program. The numbers of nodes and arcs used in the underlying dependency graphs are formulated to compare the space complexity. Results show that the ELDG uses less memory to store nodes, arcs, and cycles compared to the EDG. To exhibit the desirability of the ELDG, firstly, the stable models of the kernel form of an NLP are characterized by the admissible coloring of the ELDG; secondly, a relation between the stable models of a kernel program and the handles of the minimal odd cycles appearing in the corresponding ELDG is established; thirdly, to the best of our knowledge, an inverse transformation from a dependency graph to the represented NLP w.r.t. the ELDG is defined for the first time, enabling analytical results to be transferred from the graph to the program straightforwardly.

Keywords: normal logic program, isomorphism of graph, extended labelled dependency graph, inverse graph transformation, graph colouring

Procedia PDF Downloads 215
2101 Development of Fault Diagnosis Technology for Power System Based on Smart Meter

Authors: Chih-Chieh Yang, Chung-Neng Huang

Abstract:

In power systems, improving fault diagnosis technology for transmission lines has always been a primary goal of grid operators. In recent years, due to the rise of green energy, the addition of various kinds of distributed generation also has an impact on the stability of the power system. Smart meters provide data recording and bidirectional transmission, while the adaptive neuro-fuzzy inference system (ANFIS) is an artificial intelligence technique with learning and estimation capabilities. For the transmission network, in order to avoid misjudgment of the fault type and location due to the input of these unstable power sources, this study combines the above advantages of smart meters and ANFIS and proposes a method for identifying fault types and fault locations. In ANFIS training, the bus voltage and current information collected by smart meters can be trained through the ANFIS tool in MATLAB to generate fault codes that identify different types of faults and the locations of faults. In addition, due to the uncertainty of distributed generation, a wind power system is added to the transmission network to verify the diagnostic correctness of the study. Simulation results show that the method proposed in this study can correctly and efficiently identify the fault type and location, and can deal with the interference caused by the addition of unstable power sources.

Keywords: ANFIS, fault diagnosis, power system, smart meter

Procedia PDF Downloads 140
2097 Future of Nanotechnology in Digital Design

Authors: Pejman Hosseinioun, Abolghasem Ghasempour, Elham Gholami, Hamed Sarbazi

Abstract:

Considering the development of global semiconductor technology, it is anticipated that devices such as resonant tunneling diodes and transistors (RTD/RTT), single-electron transistors (SET) and quantum cellular automata (QCA) will substitute for CMOS (complementary metal-oxide-semiconductor) devices in many applications. Unfortunately, these new technologies cannot implement common Boolean logic efficiently and are only appropriate for threshold logic, so there is no doubt that with the development of these new devices it is necessary to find new design technologies that are compatible with them. Resonant tunneling diode/transistor (RTD/RTT) circuits with enhanced computing abilities are candidates for realizing nanoscale standards in the future. Quantum cellular automata (QCA) are also emerging nanotechnological devices for electrical circuits; their advantages, such as higher speed, smaller dimensions and lower power consumption, are of great interest. QCA are basic devices for manufacturing gates, wires and memories. Given the complexity of the underlying nanoscale physics, circuit designers can focus on logical and structural design to decrease the complexity of the design. Moreover, single-electron technology (SET) is another noteworthy device considered in nanotechnology. This article is a survey of the future of nanotechnology in digital design.

Keywords: nanotechnology, resonant tunneling diodes/transistors, quantum cellular automata, semiconductor

Procedia PDF Downloads 266
2099 Estimation of a Finite Population Mean under Random Non Response Using Improved Nadaraya and Watson Kernel Weights

Authors: Nelson Bii, Christopher Ouma, John Odhiambo

Abstract:

Non-response is a potential source of error in sample surveys. It introduces bias and large variance into the estimation of finite population parameters. Regression models have been recognized as one of the techniques for reducing bias and variance due to random non-response using auxiliary data. In this study, it is assumed that random non-response occurs in the survey variable in the second stage of cluster sampling, with full auxiliary information assumed available throughout. Auxiliary information is used at the estimation stage via a regression model to address the problem of random non-response. In particular, the auxiliary information is used via an improved Nadaraya-Watson kernel regression technique to compensate for random non-response. The asymptotic bias and mean squared error of the proposed estimator are derived. In addition, a simulation study indicates that the proposed estimator has smaller bias and mean squared error values than existing estimators of the finite population mean. The proposed estimator is also shown to have tighter confidence interval lengths at a 95% coverage rate. The results obtained in this study are useful, for instance, in choosing efficient estimators of the finite population mean in demographic sample surveys.
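
For context, the following is a minimal implementation of the classical Nadaraya-Watson estimator that the proposed improved technique builds upon; the Gaussian kernel, bandwidth, and simulated survey variable are illustrative choices.

```python
import numpy as np

def nadaraya_watson(x_query, x, y, h):
    """m(x0) = sum_i K((x0-xi)/h) * yi / sum_i K((x0-xi)/h)."""
    x_query = np.atleast_1d(x_query)
    u = (x_query[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u**2)            # Gaussian kernel (unnormalized is fine)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(0)
x = rng.uniform(0, 2 * np.pi, 300)              # auxiliary variable
y = np.sin(x) + 0.3 * rng.normal(size=x.size)   # noisy survey variable
grid = np.linspace(0, 2 * np.pi, 5)
print(np.c_[grid, nadaraya_watson(grid, x, y, h=0.3)].round(2))
```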

Keywords: mean squared error, random non-response, two-stage cluster sampling, confidence interval lengths

Procedia PDF Downloads 141
2098 RP-HPLC Method Development and Its Validation for Simultaneous Estimation of Metoprolol Succinate and Olmesartan Medoxomil Combination in Bulk and Tablet Dosage Form

Authors: S. Jain, R. Savalia, V. Saini

Abstract:

A simple, accurate, precise, sensitive and specific RP-HPLC method was developed and validated for the simultaneous estimation of Metoprolol Succinate and Olmesartan Medoxomil in bulk and tablet dosage form. The RP-HPLC method showed adequate separation of Metoprolol Succinate and Olmesartan Medoxomil from their degradation products. The separation was achieved on a Phenomenex Luna ODS C18 column (250 mm x 4.6 mm i.d., 5 μm particle size) with an isocratic mixture of acetonitrile and 50 mM phosphate buffer (pH 4.0, adjusted with glacial acetic acid) in the ratio of 55:45 v/v. The mobile phase flow rate was 1.0 ml/min, the injection volume was 20 μl, and the detection wavelength was 225 nm. The retention times for Metoprolol Succinate and Olmesartan Medoxomil were 2.451±0.1 min and 6.167±0.1 min, respectively. The linearity of the proposed method was investigated in the range of 5-50 μg/ml for Metoprolol Succinate and 2-20 μg/ml for Olmesartan Medoxomil, with correlation coefficients of 0.999 and 0.9996, respectively. The limit of detection was 0.2847 μg/ml and 0.1251 μg/ml, and the limit of quantification was 0.8630 μg/ml and 0.3793 μg/ml, for Metoprolol Succinate and Olmesartan Medoxomil, respectively. The proposed method was validated as per ICH guidelines for linearity, accuracy, precision, specificity and robustness for the estimation of Metoprolol Succinate and Olmesartan Medoxomil in a commercially available tablet dosage form, and the results were found to be satisfactory. Thus, the developed and validated stability-indicating method can be used successfully for marketed formulations.
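
A small sketch of how LOD and LOQ figures of this kind are typically computed from a calibration line under the ICH Q2 approach (LOD = 3.3·σ/S, LOQ = 10·σ/S, with σ the residual standard deviation and S the slope); the calibration points below are synthetic, not the paper's data.

```python
import numpy as np

conc = np.array([5, 10, 20, 30, 40, 50], float)          # ug/ml
area = np.array([102, 205, 398, 612, 795, 1001], float)  # peak areas

slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
sigma = resid.std(ddof=2)              # ddof=2: two fitted parameters

print(f"slope={slope:.2f}, r={np.corrcoef(conc, area)[0, 1]:.4f}")
print(f"LOD = {3.3 * sigma / slope:.3f} ug/ml")
print(f"LOQ = {10 * sigma / slope:.3f} ug/ml")
```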

Keywords: metoprolol succinate, olmesartan medoxomil, RP-HPLC method, validation, ICH

Procedia PDF Downloads 316
2097 Stature and Gender Estimation Using Foot Measurements in South Indian Population

Authors: Jagadish Rao Padubidri, Mehak Bhandary, Sowmya J. Rao

Abstract:

Introduction: The significance of the human foot and its measurements in identifying an individual has been demonstrated many times by studies in different geographical areas, and its association with the stature and gender of the individual has been justified by much research. In our study, we used different foot measurements, including the length, width, malleol height and navicular height, to establish their association with stature and gender and to determine their accuracy. The purpose of this study is to show the relation of foot measurements to stature and gender, and to derive multiple and logistic regression equations for stature and gender estimation in the South Indian population. Materials and Methods: The subjects for this study were 200 South Indian students, of which 100 were females and 100 were males, aged between 18 and 24 years. The data for the present study included the stature, foot length, foot breadth, foot malleol height, and foot navicular height of both the right and left foot. Descriptive statistics, T-tests and Pearson correlation coefficients were derived between stature, gender and foot measurements. Stature was estimated from right and left foot measurements for both the male and female South Indian populations using multiple regression analysis, and logistic regression analysis was used for gender estimation. Results: The mean values of stature and of the right and left foot measurements were higher in the male population than in the female population. LFL (left foot length) is greater than RFL (right foot length) in the male group, but in the female group the lengths of both feet are almost equal [RFL=226.6, LFL=227.1]. There is not much difference in the means of RFW (right foot width) and LFW (left foot width) in either gender. Significant differences were seen in the mean values of malleol and navicular height of the right and left feet in the male gender; no such difference was seen in female subjects. Conclusions: The study has successfully demonstrated the correlation of foot length with stature in all three study groups for both right and left feet. Next in importance are foot width and malleol height in estimating stature among male and female groups. Navicular height of both the right and left foot showed a poor relationship with stature estimation in both male and female groups. Multiple regression equations for both right and left foot measurements to estimate stature were derived, with standard errors ranging from 11-12 cm in males and 10-11 cm in females; the SEE was 5.8 when the male and female groups were pooled together. The logistic regression model derived to determine gender showed 85% accuracy using right foot measurements and 92.5% accuracy using left foot measurements. We believe that stature and gender can be estimated from foot measurements in the South Indian population.
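
A minimal sketch of the two models described above, multiple regression for stature and logistic regression for gender from foot measurements; the simulated measurements and sklearn fits only illustrate the procedure, not the paper's derived equations.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)
n = 200
male = np.repeat([1, 0], n // 2)
foot_len = 245 + 15 * male + rng.normal(0, 8, n)      # mm, simulated
foot_wid = 92 + 6 * male + rng.normal(0, 4, n)
stature = 4.2 * foot_len + 2.0 * foot_wid + 480 + rng.normal(0, 40, n)

X = np.c_[foot_len, foot_wid]
stat_model = LinearRegression().fit(X, stature)       # multiple regression
sex_model = LogisticRegression().fit(X, male)         # gender classification

pred = stat_model.predict(X)
see = np.sqrt(((stature - pred) ** 2).mean())         # standard error of estimate
acc = (sex_model.predict(X) == male).mean()
print(f"SEE = {see:.1f} mm, gender classification accuracy = {acc:.1%}")
```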

Keywords: foot length, gender, stature, South Indian

Procedia PDF Downloads 335
2096 State Estimation Based on Unscented Kalman Filter for Burgers’ Equation

Authors: Takashi Shimizu, Tomoaki Hashimoto

Abstract:

Controlling the flow of fluids is a challenging problem that arises in many fields. Burgers' equation is a fundamental equation for several flow phenomena such as traffic, shock waves, and turbulence. The optimal feedback control method, the so-called model predictive control, has been proposed for Burgers' equation. However, the model predictive control method is inapplicable to systems whose state variables are not all exactly known. From a practical point of view, it is unusual for all the state variables of a system to be exactly known, because the state variables are measured through output sensors and only limited parts of them are available. In fact, the flow velocities of fluid systems usually cannot be measured over the whole spatial domain. Hence, any practical feedback controller for fluid systems must incorporate some type of state estimator. To apply model predictive control to fluid systems described by Burgers' equation, it is necessary to establish a state estimation method for Burgers' equation with limited measurable state variables. For this purpose, we apply the unscented Kalman filter to estimate the state variables of fluid systems described by Burgers' equation. The objective of this study is to establish a state estimation method based on the unscented Kalman filter for Burgers' equation. The effectiveness of the proposed method is verified by numerical simulations.
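
A compact sketch of the setting described above: an unscented Kalman filter estimating the full velocity field of a finite-difference discretization of viscous Burgers' equation from a few measured grid points. The grid size, viscosity, noise covariances, and sensor locations are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

N, dx, dt, nu = 20, 0.05, 0.001, 0.05
obs_idx = [3, 10, 16]                       # the few measurable grid points

def step(u):
    """One explicit step of u_t + u*u_x = nu*u_xx (periodic boundaries)."""
    up, um = np.roll(u, -1), np.roll(u, 1)
    return u + dt * (-u * (up - um) / (2 * dx) + nu * (up - 2 * u + um) / dx**2)

def sigma_points(m, P, kappa=1.0):
    n = m.size
    S = np.linalg.cholesky((n + kappa) * P)
    pts = [m] + [m + S[:, i] for i in range(n)] + [m - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 0.5 / (n + kappa))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def ukf_step(m, P, z, Q, R):
    X, w = sigma_points(m, P)
    Xp = np.array([step(x) for x in X])               # propagate sigma points
    mp = w @ Xp
    Pp = (w[:, None] * (Xp - mp)).T @ (Xp - mp) + Q
    Z = Xp[:, obs_idx]                                # predicted measurements
    zm = w @ Z
    Pzz = (w[:, None] * (Z - zm)).T @ (Z - zm) + R
    Pxz = (w[:, None] * (Xp - mp)).T @ (Z - zm)
    K = Pxz @ np.linalg.inv(Pzz)
    Pu = Pp - K @ Pzz @ K.T
    return mp + K @ (z - zm), (Pu + Pu.T) / 2         # keep P symmetric

rng = np.random.default_rng(0)
u_true = np.sin(2 * np.pi * dx * np.arange(N))        # true initial field
m, P = np.zeros(N), np.eye(N)                         # uninformative start
Q, R = 1e-6 * np.eye(N), 1e-4 * np.eye(len(obs_idx))
for _ in range(300):
    u_true = step(u_true)
    z = u_true[obs_idx] + 0.01 * rng.normal(size=len(obs_idx))
    m, P = ukf_step(m, P, z, Q, R)
print("RMS estimation error:", round(float(np.sqrt(((m - u_true) ** 2).mean())), 4))
```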

Keywords: observer systems, unscented Kalman filter, nonlinear systems, Burgers' equation

Procedia PDF Downloads 153
2095 An Ontology-Based Framework to Support Asset Integrity Modeling: Case Study of Offshore Riser Integrity

Authors: Mohammad Sheikhalishahi, Vahid Ebrahimipour, Amir Hossein Radman-Kian

Abstract:

This paper proposes an ontology framework for knowledge modeling and representation of the equipment integrity process in a typical oil and gas production plant. Our aim is to construct a knowledge model that facilitates translation, interpretation, and conversion of human-readable integrity interpretations into computer-readable representations. The framework provides a function structure related to fault propagation using ISO 14224 and ISO 15926 OWL-Lite/Resource Description Framework (RDF) to obtain a generic system-level model of asset integrity that can be utilized in the integrity engineering process during the equipment life cycle. It employs the standard terminology developed by ISO 15926 and ISO 14224 to map textual descriptions of equipment failure and then converts them into causality-driven logic by semantic interpretation and computer-based representation using OWL-Lite/RDF. The framework was applied to an offshore gas riser. The results show that the approach can cross-link failure-related integrity terms and domain-specific logic to obtain a representation structure of equipment integrity with causality inference based on semantic extraction of inspection report context.

Keywords: asset integrity modeling, interoperability, OWL, RDF/XML

Procedia PDF Downloads 189
2094 A Digital Filter for Symmetrical Components Identification

Authors: Khaled M. El-Naggar

Abstract:

This paper presents a fast and efficient technique for monitoring and supervising power system disturbances generated by the dynamic performance of power systems or by faults. Monitoring power system quantities involves monitoring the fundamental voltage and current magnitudes and their frequencies, as well as their negative- and zero-sequence components, under different operating conditions. The proposed technique is based on the simulated annealing (SA) optimization technique. The method uses a digital set of measurements of the voltage or current waveforms at a power system bus to perform the estimation process digitally. The algorithm is tested using different simulated data to monitor the symmetrical components of power system waveforms. Different study cases are considered in this work, and the effects of the number of samples, the sampling frequency and the sample window size are studied. Results are reported and discussed.
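
For reference, the symmetrical-components decomposition that such an estimator monitors is the Fortescue transform; a minimal sketch on phasor inputs:

```python
import numpy as np

a = np.exp(2j * np.pi / 3)                      # 120-degree rotation operator
A_inv = np.array([[1, 1, 1],
                  [1, a, a**2],
                  [1, a**2, a]]) / 3

def symmetrical_components(va, vb, vc):
    """Return (zero, positive, negative) sequence phasors."""
    return A_inv @ np.array([va, vb, vc])

# A balanced set has only a positive-sequence component...
v0, v1, v2 = symmetrical_components(1, a**2, a)   # vb lags va by 120 deg
print(np.round([abs(v0), abs(v1), abs(v2)], 3))   # -> [0. 1. 0.]
# ...while an unbalanced (faulted) set also shows negative/zero sequence.
print(np.round(np.abs(symmetrical_components(1, 0.4 * a**2, a)), 3))
```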

Keywords: estimation, faults, measurement, symmetrical components

Procedia PDF Downloads 466
2093 A Bathtub Curve from Nonparametric Model

Authors: Eduardo C. Guardia, Jose W. M. Lima, Afonso H. M. Santos

Abstract:

This paper presents a nonparametric method to obtain the hazard rate "bathtub curve" for power system components. The model is a mixture of the three known phases of a component's life, the decreasing failure rate (DFR), the constant failure rate (CFR) and the increasing failure rate (IFR), represented by three parametric Weibull models. The parameters are obtained from a simultaneous fitting process of the model to the kernel nonparametric hazard rate curve. From the Weibull parameters and failure rate curves, the useful lifetime and the characteristic lifetime were defined. To demonstrate the model, the historical times to failure of distribution transformers were used as an example. The resulting "bathtub curve" shows the failure rate over the equipment lifetime, which can be applied in economic and replacement decision models.
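
A minimal sketch of the three-phase construction described above: a Weibull hazard with shape < 1 (DFR), one with shape = 1 (CFR), and one with shape > 1 (IFR). Here the three hazards are simply summed, one common way of combining the phases into a bathtub shape; the parameter values are illustrative, not the fitted transformer values.

```python
import numpy as np

def weibull_hazard(t, shape, scale):
    """h(t) = (k/lam) * (t/lam)**(k-1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

t = np.linspace(0.1, 40, 9)                     # years in service
h = (weibull_hazard(t, 0.5, 2.0)                # DFR: early failures
     + weibull_hazard(t, 1.0, 25.0)             # CFR: random failures
     + weibull_hazard(t, 5.0, 45.0))            # IFR: ageing / wear-out
for ti, hi in zip(t, h):
    print(f"t = {ti:5.1f}  h(t) = {hi:.4f}")    # high - flat - rising
```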

Keywords: bathtub curve, failure analysis, lifetime estimation, parameter estimation, Weibull distribution

Procedia PDF Downloads 446
2092 Flame Volume Prediction and Validation for Lean Blowout of Gas Turbine Combustor

Authors: Ejaz Ahmed, Huang Yong

Abstract:

The operation of aero engines is of critical importance in the vicinity of lean blowout (LBO) limits. Lefebvre's model of LBO, based on empirical correlation, has been extended by the authors to a flame volume concept. The flame volume takes into account the effects of geometric configuration and the complex spatial interaction of mixing, turbulence, heat transfer and combustion processes inside the gas turbine combustion chamber. For these reasons, flame volume based LBO predictions are more accurate. Although LBO prediction accuracy has improved, the approach poses a challenge associated with flame volume (Vf) estimation in real gas turbine combustors. This work extends the flame volume prediction approach, previously based on fuel iterative approximation with cold flow simulations, to reactive flow simulations. The flame volume for 11 combustor configurations has been simulated and validated against experimental data. To make the prediction methodology robust, as required in the preliminary design stage, reactive flow simulations were carried out with a combination of the probability density function (PDF) and discrete phase model (DPM) in FLUENT 15.0. A criterion for flame identification was defined, and two important parameters, the critical injection diameter (Dp,crit) and the critical temperature (Tcrit), were identified; their influence on the reactive flow simulation was studied for Vf estimation. The obtained results exhibit ±15% error in Vf estimation with respect to experimental data.

Keywords: CFD, combustion, gas turbine combustor, lean blowout

Procedia PDF Downloads 268
2091 On Modeling Data Sets by Means of a Modified Saddlepoint Approximation

Authors: Serge B. Provost, Yishan Zhang

Abstract:

A moment-based adjustment to the saddlepoint approximation is introduced in the context of density estimation. First applied to univariate distributions, this methodology is extended to the bivariate case. It then entails estimating the density function associated with each marginal distribution by means of the saddlepoint approximation and applying a bivariate adjustment to the product of the resulting density estimates. The connection to the distribution of empirical copulas is pointed out. As well, a novel approach is proposed for estimating the support of a distribution. As these results rely solely on sample moments and empirical cumulant-generating functions, they are particularly well suited for modeling massive data sets. Several illustrative applications are presented.
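
As background, the unadjusted saddlepoint density approximation built from the empirical cumulant-generating function, the sample-moment ingredient that the paper's moment-based adjustment starts from, can be sketched as follows; the gamma sample is illustrative.

```python
# K(t) = log(mean(exp(t*X))); solve K'(s) = x;
# f(x) ~ exp(K(s) - s*x) / sqrt(2*pi*K''(s)).
import numpy as np
from scipy.optimize import brentq
from scipy.special import logsumexp

def saddlepoint_density(x, data):
    """Unadjusted saddlepoint density from the empirical CGF of `data`."""
    def K(t):
        return logsumexp(t * data) - np.log(data.size)
    def tilted(t):
        w = np.exp(t * data - np.max(t * data))       # stable tilting weights
        m = np.sum(w * data) / np.sum(w)              # K'(t)
        v = np.sum(w * (data - m) ** 2) / np.sum(w)   # K''(t)
        return m, v
    s = brentq(lambda t: tilted(t)[0] - x, -20, 20)   # solve K'(s) = x
    _, K2 = tilted(s)
    return np.exp(K(s) - s * x) / np.sqrt(2 * np.pi * K2)

rng = np.random.default_rng(0)
sample = rng.gamma(3.0, 1.0, size=2000)
for x in (1.0, 3.0, 6.0):
    print(x, round(saddlepoint_density(x, sample), 4))
```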

Keywords: empirical cumulant-generating function, endpoints identification, saddlepoint approximation, sample moments, density estimation

Procedia PDF Downloads 162
2090 An Efficient Propensity Score Method for Causal Analysis With Application to Case-Control Study in Breast Cancer Research

Authors: Azam Najafkouchak, David Todem, Dorothy Pathak, Pramod Pathak, Joseph Gardiner

Abstract:

Propensity score (PS) methods have recently become the standard analysis tool for causal inference in observational studies where exposure is not randomly assigned and confounding can therefore impact the estimation of the treatment effect on the outcome. For a binary outcome, the effect of treatment can be estimated by odds ratios, relative risks, and risk differences. However, different PS methods may give different estimates of the treatment effect. The PS methods mainly used include matching, inverse probability weighting, stratification, and covariate adjustment on the PS. Due to the dangers of discretizing continuous variables (exposure, covariates), the focus of this paper is on how the variation in cut-points or boundaries affects the average treatment effect (ATE) when utilizing the stratification PS method. We therefore avoid choosing arbitrary cut-points; instead, we continuously discretize the PS and accumulate information across all cut-points for inference. We use Monte Carlo simulation to evaluate the ATE, focusing on two PS methods, stratification and covariate adjustment on the PS. We then show how this works in practice based on analyses of data from a case-control study of breast cancer, the Polish Women's Health Study.

Keywords: average treatment effect, propensity score, stratification, covariate adjusted, Monte Carlo estimation, breast cancer, case-control study

Procedia PDF Downloads 107
2089 On Confidence Intervals for the Difference between Inverse of Normal Means with Known Coefficients of Variation

Authors: Arunee Wongkhao, Suparat Niwitpong, Sa-aat Niwitpong

Abstract:

In this paper, we propose two new confidence intervals for the difference between the inverses of normal means with known coefficients of variation. One is constructed based on the generalized confidence interval, and the other is based on the closed-form method of variance estimation. We examine the performance of these confidence intervals in terms of coverage probabilities and expected lengths via Monte Carlo simulation.
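
A minimal sketch of the evaluation procedure described in the last sentence: estimating coverage probability and expected length by Monte Carlo. The interval evaluated here is the ordinary t-interval for a mean, a simple stand-in for the paper's more involved intervals.

```python
import numpy as np
from scipy import stats

def coverage_and_length(ci_fn, true_value, n, reps=5000, seed=0):
    rng = np.random.default_rng(seed)
    hits, lengths = 0, 0.0
    for _ in range(reps):
        sample = rng.normal(loc=true_value, scale=1.0, size=n)
        lo, hi = ci_fn(sample)
        hits += lo <= true_value <= hi     # did the interval cover?
        lengths += hi - lo
    return hits / reps, lengths / reps

def t_interval(x, conf=0.95):
    m, se = x.mean(), x.std(ddof=1) / np.sqrt(x.size)
    q = stats.t.ppf(0.5 + conf / 2, df=x.size - 1)
    return m - q * se, m + q * se

cov, length = coverage_and_length(t_interval, true_value=2.0, n=25)
print(f"coverage = {cov:.3f}, expected length = {length:.3f}")
```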

Keywords: coverage probability, expected length, inverse of normal mean, coefficient of variation, generalized confidence interval, closed form method of variance estimation

Procedia PDF Downloads 309
2088 Data-Driven Infrastructure Planning for Offshore Wind Farms

Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree

Abstract:

The calculations done at the beginning of the life of a wind farm are rarely reliable, which makes it important to study the failure and repair rates of the wind turbines under various conditions. The miscalculation happens because current models make the simplifying assumption that the failure/repair rate remains constant over time, which means that the reliability function is exponential in nature. This research aims to create a more accurate model using sensor data and a data-driven approach. The data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data. These are then converted to time-to-repair and time-to-failure time-series data. Several different mathematical functions are fitted to the times to failure and times to repair of the wind turbine components using maximum likelihood estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical results. Further analysis is being done using complex system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease the downtime of the turbine.
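
A minimal sketch of the distribution-fitting step described above: fitting a two-parameter Weibull and an exponential to times-to-failure by maximum likelihood and comparing log-likelihoods; the failure times below are synthetic stand-ins for the SCADA-derived series.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ttf = stats.weibull_min.rvs(1.1, scale=1200, size=200, random_state=rng)

shape, _, scale = stats.weibull_min.fit(ttf, floc=0)   # 2-parameter Weibull MLE
exp_mean = ttf.mean()                                  # exponential MLE

ll_weib = stats.weibull_min.logpdf(ttf, shape, scale=scale).sum()
ll_exp = stats.expon.logpdf(ttf, scale=exp_mean).sum()
print(f"Weibull shape={shape:.2f}, scale={scale:.0f}, logL={ll_weib:.1f}")
print(f"Exponential mean={exp_mean:.0f}, logL={ll_exp:.1f}")
# A Weibull shape close to 1 makes the two fits nearly indistinguishable,
# matching the initial results reported above.
```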

Keywords: reliability, Bayesian parameter inference, maximum likelihood estimation, Weibull function, SCADA data

Procedia PDF Downloads 87
2087 Enhancing Project Management Performance in Prefabricated Building Construction under Uncertainty: A Comprehensive Approach

Authors: Niyongabo Elyse

Abstract:

Prefabricated building construction is a pioneering approach that combines design, production, and assembly to attain energy efficiency, environmental sustainability, and economic feasibility. Despite continuous development of the industry in China, the low technical maturity of standardized design, factory production, and construction assembly introduces uncertainties affecting prefabricated component production and on-site assembly processes. This research focuses on enhancing project management performance under uncertainty to help enterprises navigate these challenges and optimize project resources. The study introduces a perspective on how uncertain factors influence the implementation of prefabricated building construction projects. It proposes a theoretical model considering project process management ability, adaptability to uncertain environments, and the collaboration ability of project participants. The impact of uncertain factors is demonstrated through case studies and quantitative analysis, revealing constraints on implementation time, cost, quality, and safety. To address uncertainties in prefabricated component production scheduling, a fuzzy model is presented, expressing processing times as interval values. The model utilizes a cooperative co-evolutionary algorithm (CCEA) to optimize scheduling, demonstrated through a real case study showcasing reduced project duration and minimized effects of processing time disturbances. Additionally, the research addresses on-site assembly construction scheduling, considering the relationship between task processing times and assigned resources. A multi-objective model with fuzzy activity durations is proposed, employing a hybrid cooperative co-evolutionary algorithm (HCCEA) to optimize project scheduling. Results from real case studies indicate improved project performance in terms of duration, cost, and resilience to processing time delays and resource changes. The study also introduces a multi-stage dynamic process control model, utilizing IoT technology for real-time monitoring during component production and construction assembly. This approach dynamically adjusts schedules when constraints arise, leading to enhanced project management performance, as demonstrated in a real prefabricated housing project. Key contributions include a fuzzy prefabricated component production scheduling model, a multi-objective multi-mode resource-constrained construction project scheduling model with fuzzy activity durations, a multi-stage dynamic process control model, and a cooperative co-evolutionary algorithm. The integrated mathematical model addresses the complexity of prefabricated building construction project management, providing a theoretical foundation for practical decision-making in the field.

Keywords: prefabricated construction, project management performance, uncertainty, fuzzy scheduling

Procedia PDF Downloads 51
2086 Improvement of Process Competitiveness Using Intelligent Reference Models

Authors: Julio Macedo

Abstract:

Several methodologies are now available for conceiving improvements to a process so that it becomes competitive, for example total quality management, process reengineering, six sigma, and the define-measure-analyze-improve-control (DMAIC) method. These improvements are of different natures and can be external to the process, which may be represented by an optimization model or a discrete simulation model. In addition, the process stakeholders are many and have different desired performances for the process. Hence, the methodologies above lack a tool to aid in the conception of the required improvements. In order to fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and desired performance indexes of the process. The reference models are intelligent, so when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for the problematic process. The reference models are fuzzy cognitive maps augmented with an objective function and trained using the improvements implemented by high-performance firms. Experiments with a group of students show that the reference models allow them to conceive more improvements than students who do not use these models.
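
A minimal sketch of the inference step of a fuzzy cognitive map, the core of the reference models described above: concept activations are repeatedly pushed through the weighted causal links until they stabilize. The three concepts and the weight matrix are invented for illustration; the paper's models additionally carry an objective function and training on high-performance firms.

```python
import numpy as np

def fcm_run(A0, W, steps=50, tol=1e-6):
    """Iterate A <- sigmoid(A + A @ W) to a fixed point."""
    A = np.asarray(A0, float)
    for _ in range(steps):
        A_new = 1 / (1 + np.exp(-(A + A @ W)))
        if np.max(np.abs(A_new - A)) < tol:
            break
        A = A_new
    return A

# Concepts: 0 = process quality, 1 = rework, 2 = customer satisfaction.
W = np.array([[0.0, -0.7, 0.6],   # quality reduces rework, raises satisfaction
              [0.0, 0.0, -0.5],   # rework lowers satisfaction
              [0.0, 0.0, 0.0]])
print(fcm_run([0.8, 0.3, 0.5], W).round(3))
```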

Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics

Procedia PDF Downloads 88
2085 Material Handling Equipment Selection Using Fuzzy AHP Approach

Authors: Priyanka Verma, Vijaya Dixit, Rishabh Bajpai

Abstract:

This research paper is aimed at selecting the appropriate material handling equipment among the given choices so that the automation level in material handling can be enhanced. This work is a practical case scenario of material handling systems in a consumer electronic appliances manufacturing organization. The choices of material handling equipment among which the decision has to be made are Automated Guided Vehicles (AGVs), Autonomous Mobile Robots (AMRs), Overhead Conveyors (OCs) and Battery Operated Trucks/Vehicles (BOTs). There is a need to attain a certain level of automation in order to reduce human interventions in the organization, and this requirement can be met by the material handling equipment mentioned above. The main motive for selecting the above equipment for study was based solely on the corporate financial strategy of investment and the return obtained through that investment within a stipulated time framework. Since low-cost automation with respect to material handling devices has to be achieved, these types of equipment were selected: the investment for each unit is less than 20 lakh rupees (INR), and the recovery period is less than five years. The fuzzy analytic hierarchy process (FAHP) is applied here for selecting equipment, where the four choices are evaluated on the basis of four major criteria and 13 sub-criteria and are prioritized on the basis of the weights obtained. The FAHP used here makes use of triangular fuzzy numbers (TFNs). The inability of traditional AHP to deal with subjectiveness and impreciseness in the pair-wise comparison process is improved in the FAHP. The range of values for general rating purposes for all decision-making parameters is kept between 0 and 1 on the basis of expert opinions captured on the shop floor; these experts were familiar with the operating environment and shop floor activity control. Instead of generating exact values, the FAHP generates ranges of values to accommodate the uncertainty in the decision-making process. The four major criteria selected for the evaluation of the available material handling equipment are materials, technical capabilities, cost and other features. The thirteen sub-criteria listed under these four major criteria are weighing capacity, load per hour, material compatibility, capital cost, operating cost, maintenance cost, speed, distance moved, space required, frequency of trips, control required, safety and reliability issues. The key finding is that among the four major criteria, cost emerged as the most important criterion and is one of the key decision-making aspects on which material handling equipment selection is based. On further evaluating the choices of equipment for each sub-criterion, it is found that the AGV scores the highest weight in most of the sub-criteria. The complete analysis shows that the AGV is the material handling equipment best suited to all the decision criteria selected in the FAHP, and it is therefore beneficial for the organization to carry out automated material handling in the facility using AGVs.
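
A minimal sketch of the FAHP weighting step with triangular fuzzy numbers, here using Buckley's geometric-mean method as one standard variant: fuzzy pairwise comparisons are aggregated into fuzzy weights and defuzzified by the centroid. The 3x3 comparison matrix is illustrative, not the study's actual judgments.

```python
import numpy as np

# Each entry is a TFN (l, m, u); the diagonal is exact 1.
M = np.array([
    [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
])  # shape (3, 3, 3): rows x cols x (l, m, u)

g = M.prod(axis=1) ** (1 / M.shape[0])       # fuzzy geometric mean per row
total = g.sum(axis=0)                        # TFN sum over criteria
# Fuzzy weight: g_i / total, with (l,m,u)/(l',m',u') ~ (l/u', m/m', u/l').
w_fuzzy = np.c_[g[:, 0] / total[2], g[:, 1] / total[1], g[:, 2] / total[0]]
w = w_fuzzy.mean(axis=1)                     # centroid defuzzification
print((w / w.sum()).round(3))                # normalized crisp weights
```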

Keywords: fuzzy analytic hierarchy process (FAHP), material handling equipment, subjectiveness, triangular fuzzy number (TFN)

Procedia PDF Downloads 434
2084 A Carrier Phase High Precision Ranging Theory Based on Frequency Hopping

Authors: Jie Xu, Zengshan Tian, Ze Li

Abstract:

Previous indoor ranging or localization systems achieving high-accuracy time of flight (ToF) estimation relied on two key points. One is strict time and frequency synchronization between the transmitter and receiver to eliminate equipment asynchrony errors such as carrier frequency offset (CFO), but this is difficult to achieve in a practical communication system. The other is extending the total bandwidth of the communication, because the accuracy of ToF estimation is proportional to the bandwidth: the larger the total bandwidth, the higher the accuracy of the ToF estimate. Ultra-wideband (UWB) technology, for example, is implemented based on this principle, but high-precision ToF estimation is difficult to achieve in common WiFi or Bluetooth systems, whose bandwidth is lower than UWB's. Therefore, it is meaningful to study how to achieve high-precision ranging with lower bandwidth when the transmitter and receiver are asynchronous. To tackle the above problems, we propose a two-way channel error elimination theory and a frequency hopping-based carrier phase ranging algorithm to achieve high-accuracy ranging under asynchronous conditions. The two-way channel error elimination theory uses the symmetry of the two-way channel to remove the asynchronous phase error caused by the asynchronous transmitter and receiver, and we also study the effect of the two-way channel generation time difference on the phase according to the characteristics of different hardware devices. The frequency hopping-based carrier phase ranging algorithm uses frequency hopping to extend the equivalent bandwidth and incorporates a carrier phase ranging algorithm with multipath resolution to achieve, in the typical 80 MHz bandwidth of commercial WiFi, a ranging accuracy comparable to that of UWB at 400 MHz bandwidth. Finally, to verify the validity of the algorithm, we implement this theory on a software radio platform; the experimental results show that the method proposed in this paper has a median ranging error of 5.4 cm at the 5 m range, 7 cm at the 10 m range, and 10.8 cm at the 20 m range for a total bandwidth of 80 MHz.
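
A minimal sketch of the core frequency-hopping idea: the carrier phase measured at each hopped frequency lies on a line whose slope in frequency is proportional to the propagation delay, so a linear fit across the hopped channels recovers the range. The hop plan and noise level are illustrative, and the hardware phase offsets that the paper's two-way exchange cancels are omitted here for simplicity.

```python
import numpy as np

c = 3e8
d_true = 7.45                                   # metres, one-way
freqs = 5.18e9 + 2e6 * np.arange(40)            # 40 hops spanning ~80 MHz
tau = 2 * d_true / c                            # two-way delay

rng = np.random.default_rng(0)
phase = (-2 * np.pi * freqs * tau) % (2 * np.pi)   # wrapped measurements
phase += 0.05 * rng.normal(size=freqs.size)        # phase noise

unwrapped = np.unwrap(phase)
slope = np.polyfit(freqs - freqs[0], unwrapped, 1)[0]   # d(phi)/d(f)
d_est = -slope / (2 * np.pi) * c / 2                    # halve: two-way path
print(f"estimated distance = {d_est:.3f} m (true {d_true} m)")
```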

Keywords: frequency hopping, phase error elimination, carrier phase, ranging

Procedia PDF Downloads 124
2083 The Use of a Rabbit Model to Evaluate the Influence of Age on Excision Wound Healing

Authors: S. Bilal, S. A. Bhat, I. Hussain, J. D. Parrah, S. P. Ahmad, M. R. Mir

Abstract:

Background: Wound healing involves a highly coordinated cascade of cellular and immunological responses over time, including coagulation, inflammation, granulation tissue formation, epithelialization, collagen synthesis and tissue remodeling. Wounds in aged individuals heal more slowly than those in younger ones, mainly because of the comorbidities that occur as one ages. The present study concerns the influence of age on wound healing. Wounds of 1 x 1 cm (100 mm²) were created on the back of each animal. The animals were divided into two groups: one group contained animals aged 3-9 months, while the other contained animals aged 15-21 months. Materials and Methods: 24 clinically healthy rabbits aged 3-21 months were used as experimental animals and divided into two groups, viz., A and B. All experimental procedures, i.e., the excision wound model, measurement of wound area, protein extraction and estimation, and DNA extraction and estimation, were performed by standard methods. Results: The parameters studied were wound contraction, hydroxyproline, glucosamine, protein, and DNA. A significant increase (p<0.005) in hydroxyproline, glucosamine, protein and DNA and a significant decrease in wound area (p<0.005) were observed in the 3-9 month age group compared with animals in the 15-21 month age group. Wound contraction together with the hydroxyproline, glucosamine, protein and DNA estimations suggests that advanced age results in retarded wound healing. Conclusion: The decreased wound contraction and accumulation of hydroxyproline, glucosamine, protein and DNA in group B animals may be associated with a reduction or delay in growth factors because of advancing age.

Keywords: age, wound healing, excision wound, hydroxyproline, glucosamine

Procedia PDF Downloads 660
2082 In Agile Projects, the Arithmetic Sequence Is More Effective than the Fibonacci Sequence for Estimating the Implementation Effort of User Stories

Authors: Khaled Jaber

Abstract:

The estimation of effort in software development is a complex task. The traditional Waterfall approach used to develop software systems requires a lot of time to estimate the effort needed to implement user requirements. The Agile manifesto, however, is currently used more in industry than Waterfall to develop software systems. In Agile, a user requirement is referred to as a user story. Agile teams mostly use the Fibonacci sequence 1, 2, 3, 5, 8, 13, etc., in estimating the effort needed to implement a user story. This work shows through analysis that an arithmetic sequence, e.g., 3, 6, 9, 12, etc., is more effective than the Fibonacci sequence in estimating user stories. The paper demonstrates mathematically and visually the effectiveness of the arithmetic sequence over the Fibonacci sequence.
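
A small sketch of the comparison the paper makes: the relative gap between consecutive scale values, which bounds the rounding error when a true estimate falls between two allowed values. The specific sequences below are those quoted above.

```python
fib = [1, 2, 3, 5, 8, 13, 21]
arith = [3, 6, 9, 12, 15, 18, 21]

def relative_gaps(seq):
    """Gap between consecutive values as a fraction of the larger value."""
    return [round((b - a) / b, 2) for a, b in zip(seq, seq[1:])]

print("Fibonacci gaps: ", relative_gaps(fib))    # stays near 0.38
print("Arithmetic gaps:", relative_gaps(arith))  # shrinks as values grow
```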

Keywords: agile, scrum, estimation, Fibonacci sequence

Procedia PDF Downloads 207
2081 Item Response Calibration/Estimation: An Approach to Adaptive E-Learning System Development

Authors: Adeniran Adetunji, Babalola M. Florence, Akande Ademola

Abstract:

In this paper, we provide an overview of the concept of an adaptive e-learning system and enumerate the elements of adaptive learning concepts, e.g., a pedagogical framework, multiple learning strategies and pathways, continuous monitoring of and feedback on student performance, and statistical inference to reach the final learning strategy that works for an individual learner through "mass-customization". We briefly highlight the motivation for this new system, which is proposed for effective learning and teaching. We review the literature on the concept of adaptive e-learning systems, with an emphasis on item response calibration, an important approach to developing an adaptive e-learning system. The paper concludes with a justification of item response calibration/estimation for designing a successful and effective adaptive e-learning system.
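
As a concrete illustration of item response calibration, the following sketch fits one item's parameters under the two-parameter logistic (2PL) model, assuming provisional learner ability estimates are available; in practice abilities and item parameters are estimated jointly (e.g., by EM), and the data here are simulated.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
theta = rng.normal(size=1000)                    # learners' abilities
a_true, b_true = 1.4, 0.3                        # discrimination, difficulty
p = 1 / (1 + np.exp(-a_true * (theta - b_true)))
resp = rng.binomial(1, p)                        # observed right/wrong

def neg_log_lik(params):
    """2PL: P(correct) = 1 / (1 + exp(-a*(theta - b)))."""
    a, b = params
    p = 1 / (1 + np.exp(-a * (theta - b)))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -(resp * np.log(p) + (1 - resp) * np.log(1 - p)).sum()

fit = minimize(neg_log_lik, x0=[1.0, 0.0])
print("calibrated (a, b) =", fit.x.round(2), " true:", (a_true, b_true))
```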

Keywords: adaptive e-learning system, pedagogical framework, item response, computer applications

Procedia PDF Downloads 597
2080 Modified Fuzzy Delphi Method to Incorporate Healthcare Stakeholders’ Perspectives in Selecting Quality Improvement Projects’ Criteria

Authors: Alia Aldarmaki, Ahmad Elshennawy

Abstract:

There is a global shift in healthcare systems toward engaging different stakeholders in selecting quality improvement initiatives and incorporating their preferences to improve healthcare efficiency and outcomes. Although experts bring scientific knowledge based on the scientific model and their personal experience, other stakeholders can bring new insights and information into the decision-making process. This study explores the impact of incorporating different stakeholders' preferences when identifying the most significant criteria that should be considered in healthcare for selecting improvement projects. A framework based on a modified Fuzzy Delphi Method (FDM) was built. In addition to subject matter experts, groups of doctors/physicians, nurses, administrators, and managers contribute to the selection process. The research identifies potential criteria for evaluating projects in healthcare and then utilizes FDM to capture expert knowledge. The first round of FDM is intended to validate the identified list of criteria with the experts, which includes collecting additional criteria that the literature might have overlooked. When an acceptable level of consensus has been reached, a second round is conducted to obtain experts' and other stakeholders' opinions on the appropriate weight of each criterion's importance using linguistic variables. The FDM analysis eliminates or retains criteria to produce a final list of the critical criteria for selecting improvement projects in healthcare. Finally, reliability and validity were investigated using Cronbach's alpha and factor analysis, respectively. Two case studies were carried out in a public hospital in the United Arab Emirates to test the framework. Both cases demonstrate that even though there were common criteria between the experts and the other stakeholders, the stakeholders' perceptions still bring additional critical criteria into the evaluation process, which can impact the outcomes. Experts selected criteria related to strategic and managerial aspects, while the other participants preferred criteria related to social aspects such as health and safety and patient satisfaction. The health and safety criterion had the highest importance weight in both cases. The analysis showed that the Cronbach's alpha value is 0.977 and that all criteria have factor loadings greater than 0.3. In conclusion, the inclusion of stakeholders' perspectives is intended to enhance stakeholder engagement, improve transparency throughout the decision process, and support robust decisions.
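
A minimal sketch of the FDM aggregation step described above: each participant rates a criterion with a triangular fuzzy number (l, m, u); the ratings are aggregated, defuzzified by the centroid, and the criterion is retained if the score clears a threshold. The linguistic-scale mapping and the 0.6 threshold are illustrative assumptions, not the study's values.

```python
import numpy as np

def fdm_screen(ratings, threshold=0.6):
    """Aggregate TFN ratings as (min l, mean m, max u), then defuzzify."""
    r = np.asarray(ratings, float)               # shape (n_raters, 3)
    l, m, u = r[:, 0].min(), r[:, 1].mean(), r[:, 2].max()
    score = (l + m + u) / 3                      # centroid defuzzification
    return score, score >= threshold

# Linguistic scale mapped to TFNs, e.g. "high" = (0.5, 0.75, 1.0).
health_safety = [(0.5, 0.75, 1.0), (0.75, 1.0, 1.0), (0.5, 0.75, 1.0)]
print(fdm_screen(health_safety))                 # high consensus -> retained
```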

Keywords: Fuzzy Delphi Method, fuzzy number, healthcare, stakeholders

Procedia PDF Downloads 129
2079 Formal Group Laws and Toposes in Gauge Theory

Authors: Patrascu Andrei Tudor

Abstract:

One of the main problems in high energy physics is that we do not have a complete understanding of the interaction between local and global effects in gauge theory. This has an increasing impact on our ability to access the non-perturbative regime of most of our theories. Our theories, while based on gauge groups considered to be simple or semi-simple and connected, are expected to be described by their simple local linear approximation, namely the Lie algebras. However, higher homotopy properties resulting in gauge anomalies appear frequently in theories of physical interest. Our assumption that the groups we deal with are simple and simply connected is probably not suitable, and ways to go beyond such assumptions, particularly in gauge theories, where the Lie algebra linear approximation is prevalent, are not known. We approach this problem from two directions: on one side, we explain the potential role of formal group laws in describing certain higher homotopical properties and their interference with local or perturbative effects; on the other side, we employ a categorical approach leading to synthetic theory and a new way of looking at gauge theories. The topos approach is based on a geometry whose underlying logic is intuitionistic, and hence the "tertium non datur" principle is abandoned. This has a remarkable impact on the understanding of conformal symmetry and its anomalies in string theory in various dimensions.

Keywords: gauge theory, formal group laws, topos theory, conformal symmetry

Procedia PDF Downloads 41