Search results for: regression models.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3050

2420 Masonry CSEB Building Models under Shaketable Testing-An Experimental Study

Authors: Lakshmi Keshav, V. G. Srisanthi

Abstract:

In this experimental investigation, shake table tests were conducted on two reduced-scale models representing a typical single-room building constructed with Compressed Stabilized Earth Blocks (CSEB) made from locally available soil. One model was built with earthquake resisting features (EQRF), namely sill, lintel and vertical bands to control building vibration, while the other was built without them. To examine the seismic capacity of the models, particularly under long-period, large-amplitude ground motion with many cycles of repeated loading, each test specimen was shaken repeatedly until failure. The test results from a high-end data acquisition system show that the model with EQRF behaves better than the one without. This modified masonry model, combining the new material with the new bands, can be used to improve the behavior of masonry buildings.

Keywords: Earthquake Resisting Features, Compressed Stabilized Earth Blocks, Masonry structures, Shake table testing, Horizontal and vertical bands.

Downloads: 2707
2419 An Improved Prediction Model of Ozone Concentration Time Series Based On Chaotic Approach

Authors: N. Z. A. Hamid, M. S. M. Noorani

Abstract:

This study focuses on the development of prediction models for ozone concentration time series. The prediction model is built using a chaotic approach. First, the chaotic nature of the time series is detected by means of the phase space plot and the Cao method. Then, the prediction model is built and the local linear approximation method is used for forecasting. A traditional autoregressive linear prediction model is also built. Moreover, an improvement to the local linear approximation method is proposed. The prediction models are applied to the hourly ozone time series observed at a benchmark station in Malaysia. Comparison of all models through the mean absolute error, root mean squared error and correlation coefficient shows that the model with the improved prediction method is the best. Thus, the chaotic approach is a suitable approach for developing a prediction model for ozone concentration time series.
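
To make the forecasting step concrete, here is a minimal, generic sketch of local linear approximation on a delay-embedded series (not the authors' code); in practice the embedding dimension and delay would come from the phase space analysis and the Cao method, whereas the values below are illustrative.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Build the delay-embedding matrix of a scalar series x."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def local_linear_forecast(x, dim=4, tau=1, k=10):
    """Predict the next value of x with a local linear map fitted on the
    k nearest neighbours of the last embedded point."""
    x = np.asarray(x, dtype=float)
    emb = delay_embed(x, dim, tau)
    targets = x[(dim - 1) * tau + 1:]            # value following each embedded point
    X, query = emb[:-1], emb[-1]                 # the last point has no known successor
    idx = np.argsort(np.linalg.norm(X - query, axis=1))[:k]
    A = np.column_stack([X[idx], np.ones(k)])    # affine (local linear) model
    coef, *_ = np.linalg.lstsq(A, targets[idx], rcond=None)
    return np.append(query, 1.0) @ coef

if __name__ == "__main__":
    t = np.arange(2000)
    series = np.sin(0.07 * t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
    print("one-step forecast:", local_linear_forecast(series))
```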

Keywords: Chaotic approach, phase space, Cao method, local linear approximation method.

Downloads: 1751
2418 Data Collection with Bounded-Sized Messages in Wireless Sensor Networks

Authors: Min Kyung An

Abstract:

In this paper, we study the data collection problem in Wireless Sensor Networks (WSNs) under two interference models: the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise Ratio (SINR). The main issue is to compute schedules with the minimum number of timeslots, that is, minimum latency schedules, such that data from every node can be collected at a sink node without any collision or interference. While existing works studied the problem with unit-sized and unbounded-sized message models, we investigate the problem with the bounded-sized message model and introduce a constant-factor approximation algorithm. To the best of our knowledge, ours is the first result for the data collection problem with the bounded-sized message model under both interference models.
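
For intuition about what a collision-free schedule is, the sketch below greedily assigns timeslots on a conflict graph, i.e. under the simpler graph interference model only; it is an illustrative baseline, not the paper's constant-factor algorithm for the bounded-sized message model under SINR.

```python
def greedy_schedule(links, conflicts):
    """Assign each link the smallest timeslot not used by a conflicting link.

    links     : iterable of link ids
    conflicts : dict mapping a link id to the set of link ids it interferes with
    Returns a dict link -> timeslot (0-based); the number of distinct slots is
    the schedule latency under the graph interference model.
    """
    slot = {}
    for link in links:                      # fixed order; smarter orders give better bounds
        used = {slot[v] for v in conflicts.get(link, ()) if v in slot}
        t = 0
        while t in used:
            t += 1
        slot[link] = t
    return slot

if __name__ == "__main__":
    # four links towards a sink; links sharing a receiver conflict
    conflicts = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}, "d": set()}
    sched = greedy_schedule(["a", "b", "c", "d"], conflicts)
    print(sched, "latency =", max(sched.values()) + 1)
```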

Keywords: Data collection, collision-free, interference-free, physical interference model, SINR, approximation, bounded-sized message model, wireless sensor networks, WSN.

Downloads: 1204
2417 Facilitating Cooperative Knowledge Support by Role-Based Knowledge-Flow Views

Authors: Chih-Wei Lin, Duen-Ren Liu, Hui-Fang Chen

Abstract:

Effective knowledge support relies on providing operation-relevant knowledge to workers promptly and accurately. A knowledge flow represents an individual's or a group's knowledge needs and referencing behavior with respect to codified knowledge during operation performance. Knowledge flows have been utilized to facilitate organizational knowledge support by illustrating workers' knowledge needs systematically and precisely. However, conventional knowledge-flow models do not work well in cooperative teams, in which team members usually have diverse knowledge needs depending on their roles. The reason is that those models provide only a single view to all participants and do not reflect individual knowledge needs in the flows. Hence, we propose a role-based knowledge-flow view model in this work. The model builds knowledge-flow views (or virtual knowledge flows) by creating appropriate virtual knowledge nodes and generalizing knowledge concepts to the required concept levels. The customized views represent an individual role's knowledge needs in a teamwork context. The proposed model indicates knowledge needs in a condensed representation from a role's perspective and enhances the efficiency of cooperative knowledge support in organizations.

Keywords: cooperative knowledge support, knowledge flow, knowledge-flow view, role-based models

Downloads: 1267
2416 Application of Transportation Models for Analysing Future Intercity and Intracity Travel Patterns in Kuwait

Authors: Srikanth Pandurangi, Basheer Mohammed, Nezar Al Sayegh

Abstract:

In order to meet the increasing demand for housing care for Kuwaiti citizens, the government authorities in Kuwait are undertaking a series of projects in the form of new large cities outside the current urban area. Al Mutlaa City, located to the north-west of the Kuwait Metropolitan Area, is one such project out of the 15 planned new cities. The city accommodates a wide variety of residential developments, employment opportunities, and commercial, recreational, health care and institutional uses. This paper examines the application of comprehensive transportation demand modeling undertaken on the VISUM platform to understand future intracity and intercity travel distribution patterns in Kuwait. The models developed varied in level of detail: a strategic model update, sub-area models representing the future demand of Al Mutlaa City, and sub-area models built to estimate the demand in the residential neighborhoods of the city. This paper aims at offering a model update framework that facilitates easy integration between sub-area models and strategic national models for unified traffic forecasts. It presents the transportation demand modeling results used to inform the planning of a multi-modal transportation system for Al Mutlaa City. It also presents the household survey data collection effort undertaken using GPS devices (for the first time in Kuwait) and notebook-computer-based digital survey forms for interviewing a representative sample of citizens and residents. The survey results formed the basis for estimating the trip generation rates and trip distribution coefficients used in the strategic base-year model calibration and validation process.

Keywords: GPS based household surveys, transportation infrastructure, origin-destination trip matrices, traffic forecasts, transportation demand modeling, travel behavior patterns.

Downloads: 1675
2415 Performance Improvement in the Bivariate Models by using Modified Marginal Variance of Noisy Observations for Image-Denoising Applications

Authors: R. Senthilkumar

Abstract:

Most simple nonlinear thresholding rules for wavelet-based denoising assume that the wavelet coefficients are independent. However, the wavelet coefficients of natural images have significant dependencies. This paper attempts to give a recipe for selecting among the popular image-denoising algorithms based on VisuShrink, SureShrink, OracleShrink, BayesShrink and BiShrink, and also compares different bivariate models used in image-denoising applications. The first part of the paper compares different shrinkage functions used for image denoising. The second part compares different bivariate models, and the third part uses the bivariate model with modified marginal variance, which is based on a Laplacian assumption. The paper gives an experimental comparison on six commonly used 512x512 images: Lenna, Barbara, Goldhill, Clown, Boat and Stonehenge. Noise levels of 25 dB, 26 dB, 27 dB, 28 dB and 29 dB are added to the six standard images, and the corresponding Peak Signal to Noise Ratio (PSNR) values are calculated for each noise level.
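
As a point of reference for the shrinkage functions compared above, here is a generic VisuShrink-style soft-thresholding sketch using the PyWavelets package together with the PSNR metric; the bivariate BiShrink rule and the modified marginal variance are not reproduced here, and the image and noise level are synthetic.

```python
import numpy as np
import pywt

def psnr(ref, test, peak=255.0):
    """Peak Signal to Noise Ratio in dB."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def soft_threshold_denoise(noisy, wavelet="db8", level=3):
    """VisuShrink-style denoising: universal threshold + soft shrinkage."""
    coeffs = pywt.wavedec2(noisy, wavelet, level=level)
    # noise std estimated from the finest diagonal subband (robust MAD estimate)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(noisy.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(c, thr, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.kron(rng.integers(0, 256, (64, 64)).astype(float), np.ones((8, 8)))
    noisy = clean + rng.normal(0.0, 20.0, clean.shape)
    print("PSNR noisy    :", round(psnr(clean, noisy), 2), "dB")
    print("PSNR denoised :", round(psnr(clean, soft_threshold_denoise(noisy)), 2), "dB")
```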

Keywords: BiShrink, Image-Denoising, PSNR, Shrinkage function

Downloads: 1321
2414 Fuzzy EOQ Models for Deteriorating Items with Stock Dependent Demand and Non-Linear Holding Costs

Authors: G. C. Mahata, A. Goswami

Abstract:

This paper deals with infinite-time-horizon fuzzy Economic Order Quantity (EOQ) models for deteriorating items with a stock-dependent demand rate and non-linear holding costs, taking the deterioration rate θ0 as a triangular fuzzy number (θ0 − δ1, θ0, θ0 + δ2), where 0 < δ1, δ2 < θ0 are fixed real numbers. The traditional parameters such as unit cost and ordering cost have been kept constant, but the holding cost is allowed to vary. Two forms of variation in the holding cost function, namely a non-linear function of the length of time for which the item is held in stock and a non-linear function of the amount of on-hand inventory, are used in the models. The approximate optimal solution for the fuzzy cost functions in both cases has been obtained, and the effect of non-linearity in holding costs is studied with the help of a numerical example.
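
To illustrate how a triangular fuzzy parameter propagates into a fuzzy cost via the extension principle, the sketch below pushes a triangular fuzzy deterioration rate through a monotonically increasing crisp function using alpha-cuts; the cost function shown is a placeholder, not the paper's EOQ cost expression.

```python
import numpy as np

def alpha_cut(tri, alpha):
    """Alpha-cut [left, right] of a triangular fuzzy number tri = (a, m, b)."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

def fuzzify(f, tri, alphas=np.linspace(0.0, 1.0, 11)):
    """Extension principle for a monotonically increasing crisp function f:
    map each alpha-cut of the input through f to get alpha-cuts of f(tri)."""
    cuts = []
    for alpha in alphas:
        lo, hi = alpha_cut(tri, alpha)
        cuts.append((alpha, f(lo), f(hi)))
    return cuts

if __name__ == "__main__":
    theta = (0.05, 0.08, 0.10)                 # fuzzy deterioration rate (a, m, b)
    cost = lambda th: 1200.0 + 5000.0 * th     # placeholder increasing cost in theta
    for alpha, lo, hi in fuzzify(cost, theta):
        print(f"alpha={alpha:.1f}  cost in [{lo:.1f}, {hi:.1f}]")
```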

Keywords: Inventory Model, Deterioration, Holding Cost, Fuzzy Total Cost, Extension Principle.

Downloads: 1789
2413 Kinetic Studies on Microbial Production of Tannase Using Redgram Husk

Authors: S. K. Mohan, T. Viruthagiri, C. Arunkumar

Abstract:

Tannase (tannin acyl hydrolase, E.C. 3.1.1.20) is an important hydrolytic enzyme with innumerable applications and industrial potential. In the present study, a kinetic model has been developed for the batch fermentation used for the production of tannase by A. flavus MTCC 3783. A maximum tannase activity of 143.30 U/ml was obtained at 96 hours under optimum operating conditions of 35 °C, an initial pH of 5.5 and an inducer (tannic acid) concentration of 3% (w/v), for a fermentation period of 120 hours. The biomass concentration reached a maximum of 6.62 g/l at 96 hours, with no further increase in biomass concentration until the end of the fermentation. Various unstructured kinetic models were analyzed to simulate the experimental values of microbial growth, tannase activity and substrate concentration. The logistic model for microbial growth, the Luedeking-Piret model for tannase production and a substrate utilization kinetic model were capable of predicting the fermentation profile with high coefficient of determination (R2) values of 0.980, 0.942 and 0.983, respectively. The results indicate that the unstructured models are able to describe the fermentation kinetics effectively.
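
As an example of fitting one of the unstructured models mentioned above, the sketch below fits a logistic growth curve to an illustrative biomass time course with nonlinear least squares; the data points are made up, not the experimental values.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, x0, xmax, mu):
    """Logistic growth: x(t) = x0*xmax*exp(mu*t) / (xmax - x0 + x0*exp(mu*t))."""
    return x0 * xmax * np.exp(mu * t) / (xmax - x0 + x0 * np.exp(mu * t))

if __name__ == "__main__":
    t = np.array([0, 12, 24, 36, 48, 60, 72, 84, 96, 108, 120], dtype=float)   # hours
    x = np.array([0.2, 0.5, 1.1, 2.2, 3.6, 4.9, 5.8, 6.3, 6.6, 6.6, 6.6])      # g/l, illustrative
    popt, _ = curve_fit(logistic, t, x, p0=[0.2, 6.6, 0.1])
    pred = logistic(t, *popt)
    r2 = 1.0 - np.sum((x - pred) ** 2) / np.sum((x - np.mean(x)) ** 2)
    print("x0, xmax, mu =", np.round(popt, 3), " R2 =", round(r2, 3))
```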

Keywords: Aspergillus flavus, Batch fermentation, Kinetic model, Tannase, Unstructured models.

Downloads: 1538
2412 Estimation of Natural Frequency of the Bearing System under Periodic Force Based on Principal of Hydrodynamic Mass of Fluid

Authors: M. H. Pol, A. Bidi, A. V. Hoseini

Abstract:

Estimating the natural frequency of structures is very important, yet it is not always simple and is sometimes complicated, and lack of knowledge about it has caused severe damage and hazardous effects. In this paper, using two different finite element models and the principle of the hydrodynamic mass of fluid, the natural frequency of a particular bearing (Fig. 1) in an electric field (or under a periodic force) is calculated for different stiffnesses and geometries. Finally, the results of the two models and the analytical solution are compared.
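
The hydrodynamic-mass idea is that the surrounding fluid adds an effective mass to the vibrating structure and thereby lowers its natural frequency; below is a single-degree-of-freedom sketch with illustrative stiffness and mass values only.

```python
import math

def natural_frequency_hz(stiffness, structural_mass, hydrodynamic_mass=0.0):
    """f_n = (1 / (2*pi)) * sqrt(k / (m_structure + m_hydrodynamic))."""
    return math.sqrt(stiffness / (structural_mass + hydrodynamic_mass)) / (2.0 * math.pi)

if __name__ == "__main__":
    k = 2.0e6          # N/m, illustrative bearing stiffness
    m = 15.0           # kg, structural mass
    m_fluid = 4.0      # kg, hydrodynamic (added) mass of the surrounding fluid
    print("dry :", round(natural_frequency_hz(k, m), 2), "Hz")
    print("wet :", round(natural_frequency_hz(k, m, m_fluid), 2), "Hz")
```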

Keywords: Natural frequency of the bearing, Hydrodynamic mass of fluid method.

Downloads: 2623
2411 Application of Differential Transformation Method for Solving Dynamical Transmission of Lassa Fever Model

Authors: M. A. Omoloye, M. I. Yusuff, O. K. S. Emiola

Abstract:

The use of mathematical models for solving biological problems ranges from simple to complex analyses, depending on the nature of the research problem and the applicability of the models, and is increasingly common. Many complex models become impractical to treat analytically; however, alternative approaches such as numerical methods can be employed. The Differential Transformation Method (DTM), which is based on the Taylor series, is appropriate for solving linear and non-linear model equations. Hence, this study investigates the application of DTM to solve a model of the dynamical transmission of Lassa fever in a population. The mathematical model was formulated using first-order differential equations. First, existence and uniqueness of the solution were established to show that the model is mathematically well posed for the application of DTM. Numerical simulations were then conducted to compare the results obtained by DTM with those of the fourth-order Runge-Kutta method. As shown, DTM is very effective in predicting the solution of the Lassa fever epidemic model.
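
The sketch below shows the DTM recurrence for a simple linear test equation y' = -λy (a stand-in for the Lassa fever system, which is not reproduced here) and compares the truncated series with a classical fourth-order Runge-Kutta integration and the exact solution.

```python
import numpy as np

def dtm_coefficients(y0, lam, order=15):
    """Differential transform of y' = -lam*y:  (k+1) Y(k+1) = -lam Y(k)."""
    Y = [y0]
    for k in range(order):
        Y.append(-lam * Y[k] / (k + 1))
    return np.array(Y)

def dtm_eval(Y, t):
    """Evaluate the truncated series y(t) ~ sum_k Y(k) t^k."""
    return sum(Yk * t ** k for k, Yk in enumerate(Y))

def rk4(f, y0, t_end, n_steps):
    """Classical fourth-order Runge-Kutta integration from 0 to t_end."""
    h, y, t = t_end / n_steps, y0, 0.0
    for _ in range(n_steps):
        k1 = f(t, y); k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2); k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        t += h
    return y

if __name__ == "__main__":
    lam, y0, t = 0.8, 1.0, 2.0
    print("DTM   :", dtm_eval(dtm_coefficients(y0, lam), t))
    print("RK4   :", rk4(lambda t, y: -lam * y, y0, t, 200))
    print("exact :", y0 * np.exp(-lam * t))
```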

Keywords: Differential Transform Method, Existence and uniqueness, Lassa fever, Runge-Kutta Method.

Downloads: 449
2410 The Effect of Modification and Initial Concentration on Ammonia Removal from Leachate by Zeolite

Authors: Fulya Aydın, Ayşe Kuleyin

Abstract:

The purpose of this study is to investigate the capacity of a natural Turkish zeolite for NH4-N removal from landfill leachate. The effects of modification and initial concentration on the removal of NH4-N from leachate were also investigated. The kinetics of adsorption of NH4-N are discussed using three kinetic models: the pseudo-second-order model, the Elovich equation and the intraparticle diffusion model. Kinetic parameters and correlation coefficients were determined. Equilibrium isotherms for the adsorption of NH4-N were analyzed using the Langmuir, Freundlich and Temkin isotherm models. The Langmuir isotherm model was found to best represent the data for NH4-N.
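
As an illustration of the kinetic and isotherm fitting described above, the sketch below fits the integrated pseudo-second-order model and the Langmuir isotherm to made-up adsorption data with nonlinear least squares.

```python
import numpy as np
from scipy.optimize import curve_fit

def pseudo_second_order(t, qe, k2):
    """Integrated pseudo-second-order model: q(t) = k2*qe^2*t / (1 + k2*qe*t)."""
    return k2 * qe ** 2 * t / (1.0 + k2 * qe * t)

def langmuir(ce, qmax, kl):
    """Langmuir isotherm: qe = qmax*KL*Ce / (1 + KL*Ce)."""
    return qmax * kl * ce / (1.0 + kl * ce)

if __name__ == "__main__":
    # illustrative data only (mg NH4-N per g zeolite)
    t = np.array([5, 10, 20, 40, 60, 120, 240], dtype=float)      # contact time, min
    qt = np.array([2.1, 3.4, 4.8, 6.1, 6.8, 7.6, 7.9])
    ce = np.array([5, 10, 25, 50, 100, 200], dtype=float)         # equilibrium conc., mg/l
    qe = np.array([3.2, 5.4, 9.1, 12.3, 15.0, 16.8])

    (qe_fit, k2), _ = curve_fit(pseudo_second_order, t, qt, p0=[8.0, 0.01])
    (qmax, kl), _ = curve_fit(langmuir, ce, qe, p0=[18.0, 0.02])
    print(f"pseudo-second-order: qe={qe_fit:.2f}, k2={k2:.4f}")
    print(f"Langmuir: qmax={qmax:.2f}, KL={kl:.4f}")
```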

Keywords: Leachate, Ammonium, zeolite

Downloads: 2336
2409 A Novel Approach to Handle Uncertainty in Health System Variables for Hospital Admissions

Authors: Manisha Rathi, Thierry Chaussalet

Abstract:

Hospital staff and managers are under pressure and concerned about the effective use and management of scarce resources. Hospital admissions require many decisions that have complex and uncertain consequences for hospital resource utilization and patient flow. It is challenging to predict the risk of admission and the length of stay (LOS) of a patient due to their vague nature, and there is no method to capture the vague definition of the admission of a patient. Moreover, current methods and tools used to predict patients at risk of admission fail to deal with uncertainty in unplanned admissions, LOS and patients' characteristics. The main objective of this paper is to deal with uncertainty in health system variables and to handle uncertain relationships among variables. Machine learning techniques, along with statistical methods such as regression, are proposed as a solution approach to handle uncertainty in health system variables. A model that adapts fuzzy methods to handle uncertain data and uncertain relationships can be an efficient solution to capture the vague definition of the admission of a patient.

Keywords: Admission, Fuzzy, Regression, Uncertainty

Downloads: 1397
2408 Classifying and Predicting Efficiencies Using Interval DEA Grid Setting

Authors: Yiannis G. Smirlis

Abstract:

The classification and prediction of efficiencies in Data Envelopment Analysis (DEA) is an important issue, especially in large-scale problems or when new units frequently enter the set under assessment. In this paper, we contribute to the subject by proposing a grid structure based on interval segmentations of the range of values for the inputs and outputs. Such intervals, combined, define hyper-rectangles that partition the space of the problem. This structure, exploited by interval DEA models and a dominance relation, acts as a DEA pre-processor, enabling the classification and prediction of efficiency scores without applying any DEA models.
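
One simple way to exploit hyper-rectangles as a pre-processor is a conservative dominance test on interval data: unit A dominates unit B if A's worst case (largest inputs, smallest outputs) is still at least as good as B's best case. The relation below is an illustrative assumption; the paper's exact dominance definition may differ.

```python
def dominates(a, b):
    """Conservative interval dominance.

    a, b: dicts with 'in' and 'out' lists of (low, high) intervals.
    A dominates B if A's upper inputs <= B's lower inputs and
    A's lower outputs >= B's upper outputs.
    """
    inputs_ok = all(ahi <= blo for (_, ahi), (blo, _) in zip(a["in"], b["in"]))
    outputs_ok = all(alo >= bhi for (alo, _), (_, bhi) in zip(a["out"], b["out"]))
    return inputs_ok and outputs_ok

if __name__ == "__main__":
    unit_a = {"in": [(2, 3)], "out": [(9, 10)]}
    unit_b = {"in": [(4, 6)], "out": [(5, 7)]}
    # True: every realization of A is at least as good as every realization of B
    print(dominates(unit_a, unit_b))
```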

Keywords: Data envelopment analysis, interval DEA, efficiency classification, efficiency prediction.

Downloads: 910
2407 Analyzing Preservice Teachers’ Attitudes towards Technology

Authors: Ahmet Oguz Akturk, Kemal Izci, Gurbuz Caliskan, Ismail Sahin

Abstract:

Rapid developments in technology in the present age have made it necessary for communities to follow technological developments and adapt to them. One of the fields most rapidly affected by these developments is undoubtedly education. Determining the attitudes of preservice teachers, who live in an age of technology and are preparing to raise the individuals of the future, is of paramount importance both educationally and professionally. The purpose of this study was to analyze the attitudes of preservice teachers towards technology and some variables that predict these attitudes (gender, daily duration of internet use, and the number of technical devices owned). A total of 329 preservice teachers attending the education faculty of a large university in central Turkey participated in this study on a volunteer basis, and the relational survey model was used as the research method. The research findings reveal that preservice teachers' attitudes towards technology are positive and that the attitudes of male preservice teachers towards technology are more positive than those of their female counterparts. A stepwise multiple regression analysis of the factors predicting preservice teachers' attitudes towards technology found that the duration of daily internet use was the strongest predictor of attitudes towards technology.
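
Stepwise multiple regression of the kind used above can be sketched as a forward-selection loop over candidate predictors; the example below uses statsmodels on synthetic data with hypothetical variable names, not the study's survey responses.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(X, y, alpha=0.05):
    """Forward selection: repeatedly add the candidate predictor with the
    smallest p-value, as long as that p-value is below alpha."""
    selected = []
    while True:
        remaining = [c for c in X.columns if c not in selected]
        if not remaining:
            break
        pvals = {}
        for c in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [c]])).fit()
            pvals[c] = model.pvalues[c]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break
        selected.append(best)
    return selected, sm.OLS(y, sm.add_constant(X[selected])).fit()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 329
    X = pd.DataFrame({
        "daily_internet_hours": rng.normal(3, 1.2, n),
        "num_devices": rng.integers(1, 6, n).astype(float),
        "gender": rng.integers(0, 2, n).astype(float),
    })
    # synthetic attitude score mainly driven by internet use, for illustration only
    y = 50 + 4 * X["daily_internet_hours"] + 1.5 * X["gender"] + rng.normal(0, 5, n)
    chosen, fit = forward_stepwise(X, y)
    print("selected predictors:", chosen)
    print(fit.params.round(2))
```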

Keywords: Attitudes towards technology, preservice teachers, gender, stepwise multiple regression analysis.

Downloads: 1725
2406 Prediction of Computer and Video Game Playing Population: An Age Structured Model

Authors: T. K. Sriram, Joydip Dhar

Abstract:

Models based on stage structure have found varied applications in population modeling. This paper proposes a stage-structured model to study trends in the computer and video game playing population of the US. The game playing population is divided into three compartments based on age group. After simulating the mathematical model, a forecast of the number of game players in each stage, as well as an approximation of the average age of game players in the future, has been made.
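
A compact way to express a three-compartment stage structure is a linear stage-transition projection; the sketch below uses made-up recruitment, aging and retention rates purely for illustration.

```python
import numpy as np

# compartments: young, adult, older players (illustrative rates only)
A = np.array([
    [0.80, 0.00, 0.00],   # young players retained (new recruits added separately)
    [0.15, 0.85, 0.00],   # young aging into adult, adult retention
    [0.00, 0.10, 0.90],   # adult aging into older, older retention
])
recruits = np.array([5.0, 0.0, 0.0])        # millions of new young players per year

def project(x0, years):
    """Iterate x(t+1) = A x(t) + recruits and return the full trajectory."""
    x = np.asarray(x0, dtype=float)
    out = [x]
    for _ in range(years):
        x = A @ x + recruits
        out.append(x)
    return np.array(out)

if __name__ == "__main__":
    trajectory = project([60.0, 90.0, 30.0], years=10)    # millions, illustrative
    print("population per stage after 10 years:", trajectory[-1].round(1))
    ages = np.array([15.0, 35.0, 55.0])                   # representative stage ages
    print("approx. average age:", round(ages @ trajectory[-1] / trajectory[-1].sum(), 1))
```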

Keywords: Age structure, Forecasting, Mathematical modeling, Stage structure.

Downloads: 1876
2405 Hybrid Equity Warrants Pricing Formulation under Stochastic Dynamics

Authors: Teh Raihana Nazirah Roslan, Siti Zulaiha Ibrahim, Sharmila Karim

Abstract:

A warrant is a financial contract that confers the right, but not the obligation, to buy or sell a security at a certain price before expiration. The standard procedure of valuing equity warrants using call option pricing models such as the Black-Scholes model has been shown to contain many flaws, such as the assumptions of constant interest rate and constant volatility. In fact, existing alternative models were found to focus more on demonstrating pricing techniques than on empirical testing. Therefore, a mathematical model for pricing and analyzing equity warrants which comprises stochastic interest rates and stochastic volatility is essential to incorporate the dynamic relationships between the identified variables and reflect the real market. Here, the aim is to develop dynamic pricing formulations for hybrid equity warrants by incorporating stochastic interest rates from the Cox-Ingersoll-Ross (CIR) model along with stochastic volatility from the Heston model. The development of the model involves the derivation of the stochastic differential equations that govern the model dynamics. The resulting equations, which involve a Cauchy problem and heat equations, are then solved using partial differential equation approaches. The analytical pricing formulas obtained in this study comply with the form of the analytical expressions embedded in the Black-Scholes model and other existing pricing models for equity warrants. This facilitates the practicality of the proposed formula for comparison purposes and further empirical study.
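
As a numerical companion to the analytical formulas described above, the sketch below prices a plain European call by Euler Monte Carlo under Heston volatility with a CIR short rate assumed independent of the equity shocks; warrant-specific features such as dilution, and the paper's closed-form solution, are not reproduced, and all parameter values are placeholders.

```python
import numpy as np

def heston_cir_call_mc(s0=100.0, k_strike=100.0, t_mat=1.0,
                       v0=0.04, kappa=2.0, theta=0.04, xi=0.3, rho=-0.7,
                       r0=0.03, a=1.5, b=0.03, sigma_r=0.1,
                       n_paths=100_000, n_steps=200, seed=0):
    """Euler (full-truncation) Monte Carlo price of a European call under
    Heston stochastic volatility and an independent CIR short rate."""
    rng = np.random.default_rng(seed)
    dt = t_mat / n_steps
    s = np.full(n_paths, s0)
    v = np.full(n_paths, v0)
    r = np.full(n_paths, r0)
    integral_r = np.zeros(n_paths)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_paths)
        z3 = rng.standard_normal(n_paths)            # rate shocks assumed independent
        v_pos = np.maximum(v, 0.0)
        r_pos = np.maximum(r, 0.0)
        s *= np.exp((r_pos - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * z1)
        v += kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2
        r += a * (b - r_pos) * dt + sigma_r * np.sqrt(r_pos * dt) * z3
        integral_r += r_pos * dt
    payoff = np.maximum(s - k_strike, 0.0)
    return float(np.mean(np.exp(-integral_r) * payoff))

if __name__ == "__main__":
    print("MC call price:", round(heston_cir_call_mc(), 3))
```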

Keywords: Cox-Ingersoll-Ross model, equity warrants, Heston model, hybrid models, stochastic.

Downloads: 545
2404 Specialized Reduced Models of Dynamic Flows in 2-Stroke Engines

Authors: S. Cagin, X. Fischer, E. Delacourt, N. Bourabaa, C. Morin, D. Coutellier, B. Carré, S. Loumé

Abstract:

The complexity of scavenging by ports and its impact on engine efficiency create the need to understand and model it as realistically as possible. However, there are few empirical scavenging models, and these are highly specialized; in a design optimization process they appear very restricted and their field of use is limited. This paper presents a comparison of two methods to establish and reduce a model of the scavenging process in 2-stroke diesel engines. To address the lack of scavenging models, a CFD model has been developed and is used as the reference case. However, its large size requires a reduction. Two techniques have been tested, depending on their fields of application: the NTF method and neural networks. Both appear highly appropriate, drastically reducing the model's size (over 90% reduction) with a low relative error rate (under 10%). Furthermore, each method produces a reduced model which can be used in a distinct specialized field of application: the distribution of a quantity (mass fraction, for example) in the cylinder at each time step (pseudo-dynamic model), or the qualification of scavenging at the end of the process (pseudo-static model).

Keywords: Diesel engine, Design optimization, Model reduction, Neural network, NTF algorithm, Scavenging.

Downloads: 1294
2403 On the Performance of Information Criteria in Latent Segment Models

Authors: Jaime R. S. Fonseca

Abstract:

Notwithstanding the widespread application of finite mixture models in segmentation, finite mixture model selection is still an important issue. In fact, the selection of an adequate number of segments is a key issue in deriving latent segment structures, and it is desirable that the selection criteria used for this end are effective. In order to select among several information criteria which may support the selection of the correct number of segments, we conduct a simulation study. In particular, this study is intended to determine which information criteria are more appropriate for mixture model selection when considering data sets with only categorical segmentation base variables. The generation of mixtures of multinomial data supports the proposed analysis. As a result, we establish a relationship between the level of measurement of the segmentation variables and the performance of eleven information criteria. The criterion AIC3 shows better performance (it indicates the correct number of segments of the simulated structure more often) when referring to mixtures of multinomial segmentation base variables.

Keywords: Quantitative Methods, Multivariate Data Analysis, Clustering, Finite Mixture Models, Information Theoretical Criteria, Simulation experiments.

Downloads: 1493
2402 Mapping Knowledge Model Onto Java Codes

Authors: B.A.Gobin, R.K.Subramanian

Abstract:

This paper gives an overview of the mapping mechanism of SEAM, a methodology for the automatic generation of knowledge models, and its mapping onto Java code. It discusses the rules that will be used to map the different components of the knowledge model automatically onto Java classes, properties and methods. The aim of developing this mechanism is to help create a prototype which will be used to validate the knowledge model that has been generated automatically. It will also help to link the modeling phase with the implementation phase, as existing knowledge engineering methodologies do not provide proper guidelines for the transition from the knowledge modeling phase to the development phase. This will decrease the development overheads associated with the development of Knowledge-Based Systems.

Keywords: KBS, OWL, ontology, knowledge models

Downloads: 1360
2401 Feature Analysis of Predictive Maintenance Models

Authors: Zhaoan Wang

Abstract:

Research in predictive maintenance modeling has improved in recent years to predict failures and needed maintenance with high accuracy, saving cost and improving manufacturing efficiency. However, classic prediction models provide little valuable insight into the features that contribute most to a failure. By analyzing and quantifying feature importance in predictive maintenance models, cost saving can be optimized based on business goals. First, multiple classifiers are evaluated with cross-validation to predict the multi-class failures. Second, predictive performance with features provided by different feature selection algorithms is further analyzed. Third, features selected by different algorithms are ranked and combined based on their predictive power. Finally, the linear explainer SHAP (SHapley Additive exPlanations) is applied to interpret classifier behavior and provide further insight into the specific roles of features in both local predictions and global model behavior. The results of the experiments suggest that certain features play dominant roles in the predictive models while others have significantly less impact on the overall performance. Moreover, for multi-class prediction of machine failures, the most important features vary with the type of machine failure. The results may lead to improved productivity and cost saving by prioritizing sensor deployment, data collection and data processing of the more important features over the less important ones.
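
Where the paper applies SHAP, the sketch below uses scikit-learn's model-agnostic permutation importance instead (a related but different technique), together with cross-validation, on synthetic multi-class data standing in for machine-failure records.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import cross_val_score, train_test_split

# synthetic multi-class "machine failure" data, for illustration only
X, y = make_classification(n_samples=2000, n_features=10, n_informative=4,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)

# 1) evaluate the classifier with cross-validation
print("CV accuracy:", round(cross_val_score(clf, X, y, cv=5).mean(), 3))

# 2) quantify feature importance on a held-out split (model-agnostic permutation test)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf.fit(X_tr, y_tr)
imp = permutation_importance(clf, X_te, y_te, n_repeats=20, random_state=0)
for i in np.argsort(imp.importances_mean)[::-1][:5]:
    print(f"feature {i}: importance {imp.importances_mean[i]:.3f}")
```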

Keywords: Automated supply chain, intelligent manufacturing, predictive maintenance, machine learning, feature engineering, model interpretation.

Downloads: 1955
2400 An Internet of Things-Based Weight Monitoring System for Honey

Authors: Zheng-Yan Ruan, Chien-Hao Wang, Hong-Jen Lin, Chien-Peng Huang, Ying-Hao Chen, En-Cheng Yang, Chwan-Lu Tseng, Joe-Air Jiang

Abstract:

Bees play a vital role in pollination. This paper focuses on the weighing process of honey. Honey is usually stored in the comb in a hive. Bee farmers brush bees away from the comb and then collect the honey, and the collected honey is weighed afterward. However, such a process has strong negative effects on bees and can even lead to their death. This paper therefore presents an Internet of Things-based weight monitoring system which uses weight sensors to measure the weight of honey and simplifies the whole weighing procedure. To verify the system, the weight measured by the system is compared to the weight of standard weights used for calibration by employing a linear regression model. The R2 of the regression model is 0.9788, which suggests that the weighing system is highly reliable and able to be applied to obtain the actual weight of honey. In the future, the weight data of honey can be used to find the relationship between honey production and different ecological parameters, such as bees' foraging behavior and weather conditions. It is expected that the findings can serve as critical information for improving honey production.
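
The calibration check described above amounts to a simple linear regression of measured weight against standard weights plus an R² computation; below is a minimal sketch with made-up readings (the study reports R² = 0.9788 on its own data).

```python
import numpy as np

# illustrative calibration data: sensor reading (raw) vs. standard weights (g)
standard = np.array([0, 200, 400, 600, 800, 1000, 1500, 2000], dtype=float)
measured = np.array([3, 205, 398, 610, 795, 1008, 1502, 1994], dtype=float)

slope, intercept = np.polyfit(standard, measured, 1)     # simple linear regression
pred = slope * standard + intercept
ss_res = np.sum((measured - pred) ** 2)
ss_tot = np.sum((measured - measured.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"measured ~ {slope:.4f} * standard + {intercept:.2f},  R2 = {r2:.4f}")
```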

Keywords: Internet of Things, weight, honey, bee.

Downloads: 1266
2399 Identification, Prediction and Detection of the Process Fault in a Cement Rotary Kiln by Locally Linear Neuro-Fuzzy Technique

Authors: Masoud Sadeghian, Alireza Fatehi

Abstract:

In this paper, we use a nonlinear system identification method to predict and detect process faults of a cement rotary kiln. After selecting proper inputs and outputs, an input-output model is identified for the plant. To identify the various operating points of the kiln, a Locally Linear Neuro-Fuzzy (LLNF) model is used. This model is trained by the LOLIMOT algorithm, which is an incremental tree-structure algorithm. Using this method, we obtained three distinct models for the normal and faulty situations in the kiln. One of the models is for the normal condition of the kiln, with a 15-minute prediction horizon. The other two models, for the two faulty situations in the kiln, have a 7-minute prediction horizon. Finally, we detect these faults in validation data. The data collected from the White Saveh Cement Company are used in this study.

Keywords: Cement Rotary Kiln, Fault Detection, Delay Estimation Method, Locally Linear Neuro Fuzzy Model, LOLIMOT.

Downloads: 1652
2398 Performance Evaluation of Minimum Quantity Lubrication on EN3 Mild Steel Turning

Authors: Swapnil Rajan Jadhav, Ajay Vasantrao Kashikar

Abstract:

Lubrication, cooling and chip removal are the desired functions of any cutting fluid. Conventional or flood lubrication requires a high volume flow rate, and the associated cost is higher. In addition, flood lubrication poses health risks to the machine operator. To avoid these consequences, dry machining and minimum quantity lubrication (MQL) are two alternatives. Dry machining is not a suitable alternative as it can generate greater heat and a poor surface finish. Here, turning work is carried out on a lathe using EN3 mild steel. Variable cutting speeds and depths of cut are applied, and the corresponding temperatures and surface roughness values are recorded. The experimental results are analyzed with Minitab software; conclusions are drawn from regression analysis, main effect plots and interaction plots using ANOVA. There is a 95.83% reduction in the use of cutting fluid. MQL gives a 9.88% reduction in tool temperature, which will improve tool life, and produces a 17.64% improvement in surface finish. MQL appears to be an economical and environmentally compatible lubrication technique for sustainable manufacturing.
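
Regression and ANOVA of the kind run in Minitab can also be reproduced in code; the sketch below fits an ordinary least-squares model and prints an ANOVA table with statsmodels, using illustrative (not the study's) turning data.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# illustrative turning data (not the experimental values from the study)
data = pd.DataFrame({
    "speed":     [450, 450, 450, 710, 710, 710, 1120, 1120, 1120],   # rpm
    "depth":     [0.5, 1.0, 1.5, 0.5, 1.0, 1.5, 0.5, 1.0, 1.5],      # mm
    "roughness": [3.1, 3.6, 4.2, 2.6, 3.0, 3.7, 2.2, 2.7, 3.3],      # Ra, micron
})

model = ols("roughness ~ speed + depth", data=data).fit()
print(model.summary().tables[1])          # regression coefficients
print(sm.stats.anova_lm(model, typ=2))    # ANOVA table: factor significance
```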

Keywords: ANOVA, MQL, regression analysis, surface roughness

Downloads: 441
2396 Evaluating Factors Influencing Information Quality in Large Firms

Authors: B. E. Narkhede, S. K. Mahajan, B. T. Patil, R. D. Raut

Abstract:

Information quality is a major performance measure for the Enterprise Resource Planning (ERP) system of any firm. This study identifies various critical success factors of information quality. The effects of critical success factors such as project management, reengineering efforts and interdepartmental communications on information quality are analyzed using a multiple regression model. Quantitative data were collected from respondents in various firms through a structured questionnaire assessing information quality, project management, reengineering efforts and interdepartmental communications. The validity and reliability of the data are ensured using techniques such as factor analysis and the computation of Cronbach's alpha. This study gives the relative importance of each of the critical success factors. The findings suggest that, among the various factors influencing information quality, careful reengineering efforts are the most influential. This paper gives clear insight to managers and practitioners regarding the relative importance of the critical success factors influencing information quality so that they can formulate a strategy at the beginning of an ERP system implementation.
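
The reliability check mentioned above, Cronbach's alpha, can be computed directly from item scores; below is a minimal sketch with illustrative Likert responses.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) array of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

if __name__ == "__main__":
    # illustrative 5-point Likert responses for a 4-item information-quality scale
    responses = np.array([
        [4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
        [2, 3, 2, 2], [4, 4, 5, 4], [3, 4, 3, 3],
    ])
    print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))
```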

Keywords: Enterprise resource planning, information systems, multiple regression, information quality.

Downloads: 2090
2395 Investigations Into the Turning Parameters Effect on the Surface Roughness of Flame Hardened Medium Carbon Steel with TiN-Al2O3-TiCN Coated Inserts based on Taguchi Techniques

Authors: Samir Khrais, Adel Mahammod Hassan , Amro Gazawi

Abstract:

The aim of this research is to evaluate surface roughness and develop a multiple regression model for surface roughness as a function of cutting parameters during the turning of flame-hardened medium carbon steel with TiN-Al2O3-TiCN coated inserts. An experimental plan of work and the signal-to-noise (S/N) ratio were used to relate the influence of the turning parameters to the workpiece surface finish, utilizing the Taguchi methodology. The effects of the turning parameters were studied using the analysis of variance (ANOVA) method. The evaluated parameters were feed, cutting speed and depth of cut. It was found that the most significant interaction among the considered turning parameters was between depth of cut and feed. The average surface roughness (Ra) obtained with the TiN-Al2O3-TiCN coated inserts was about 2.44 μm and the minimum value was 0.74 μm. In addition, the regression model was able to predict values of surface roughness in comparison with the experimental values within a reasonable limit.
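
The Taguchi signal-to-noise ratio for a "smaller-the-better" response such as surface roughness is S/N = -10·log10(mean(y²)); below is a minimal sketch with illustrative Ra replicates.

```python
import numpy as np

def sn_smaller_is_better(values):
    """Taguchi signal-to-noise ratio for a 'smaller-the-better' response
    (e.g. surface roughness): S/N = -10*log10(mean(y^2))."""
    y = np.asarray(values, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

if __name__ == "__main__":
    # illustrative Ra replicates (micron) for two parameter combinations
    run_a = [0.92, 0.88, 0.95]
    run_b = [2.35, 2.48, 2.41]
    print("S/N run A:", round(sn_smaller_is_better(run_a), 2), "dB")
    print("S/N run B:", round(sn_smaller_is_better(run_b), 2), "dB")  # lower Ra -> higher S/N
```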

Keywords: Medium carbon steel, Prediction, Surface roughness, Taguchi method

Downloads: 1749
2394 Multi-models Approach for Describing and Verifying Constraints Based Interactive Systems

Authors: Mamoun Sqali, Mohamed Wassim Trojet

Abstract:

Requirements analysis, modeling and simulation have consistently been among the main challenges during the development of complex systems. Scenarios and state machines are two successful models for describing the behavior of an interactive system. Scenarios represent examples of system execution in the form of sequences of messages exchanged between objects and give a partial view of the system. In contrast, state machines can represent the overall system behavior. Automating the processing of scenarios into state machines provides some answers to various problems such as system behavior validation and scenario consistency checking. In this paper, we propose a method for translating scenarios into state machines represented by the Discrete Event System Specification (DEVS), together with a procedure to detect implied scenarios. Each induced DEVS model represents the behavior of one object of the system. The global system behavior is described by coupling the atomic DEVS models and is validated through simulation. We improve the validation process by integrating formal methods to eliminate logical inconsistencies in the global model. To that end, we use the Z notation.

Keywords: Scenarios, DEVS, synthesis, validation and verification, simulation, formal verification, z notation.

Downloads: 1362
2393 Hybrid Project Management Model Based on Lean and Agile Approach

Authors: F. Z. Eddoug, J. Benhra, R. Benabbou

Abstract:

Excellence and success are the ultimate goals for any project, and in order to achieve them, every project manager looks for convenient tools and methods. This work proposes a framework for efficient management of a general project through a lean and agile approach. To reach this objective, the work was divided into two stages: the first focused on exploring and analyzing the existing project management models, and in the second the desired framework was created, beginning with a focus on seven existing models and then proposing, for each phase of the framework, the convenient lean and agile tools.

Keywords: Agility, hybrid project management, lean, scrum.

Downloads: 374
2392 Co-payment Strategies for Chronic Medications: A Qualitative and Comparative Analysis at European Level

Authors: Pedro M. Abreu, Bruno R. Mendes

Abstract:

The management of pharmacotherapy and the process of dispensing medicines are becoming critical in clinical pharmacy due to the increase in the incidence and prevalence of chronic diseases, the complexity and customization of therapeutic regimens, the introduction of innovative and more expensive medicines, the unbalanced relation between expenditure and revenue, as well as the lack of rationalization associated with medication use. For these reasons, co-payments emerged in Europe in the 1970s and have been applied over the past few years in healthcare. Co-payments lead to rationing and rationalization of users' access to healthcare services and products and, simultaneously, to a qualification and improvement of the services and products for the end-user. This analysis, covering hospital practices in particular and co-payment strategies in general, was carried out across the European regions and identified four reference countries that apply this tool repeatedly and with different approaches. The structure, content and adaptation of European co-payments were analyzed through 7 qualitative attributes and 19 performance indicators, and the results were expressed in a scorecard, allowing the conclusion that the German models (total scores of 68.2% and 63.6% for the two elected co-payments) can achieve more compliance and effectiveness, the English models (total score of 50%) can be more accessible, and the French models (total score of 50%) can be more adequate to the socio-economic and legal framework. Other European models did not show the same quality and/or performance, and so were not taken as a standard in the future design of co-payment strategies. In this sense, co-payments can be seen not only as a strategy to moderate the consumption of healthcare products and services, but especially as a strategy to improve them and to increase the value that the end-user assigns to these services and products, such as medicines.

Keywords: Clinical pharmacy, co-payments, healthcare, medicines.

Downloads: 1318
2391 On Hyperbolic Gompertz Growth Model

Authors: Angela Unna Chukwu, Samuel Oluwafemi Oyamakin

Abstract:

We propose a Hyperbolic Gompertz Growth Model (HGGM), developed by introducing a shape (allometric) parameter. This was achieved by convoluting a hyperbolic sine function on the intrinsic rate of growth in the classical Gompertz growth equation. The resulting integral solution, obtained deterministically, was reprogrammed into a statistical model and used in modeling the height and diameter of pines (Pinus caribaea). Its predictive ability was compared with that of the classical Gompertz growth model using goodness-of-fit tests and model selection criteria, an approach which mimics the natural variability of height/diameter increment with respect to age and therefore provides more realistic height/diameter predictions. The Kolmogorov-Smirnov and Shapiro-Wilk tests were used to test the compliance of the error term with normality assumptions, while the independence of the error term was confirmed using the runs test. The mean function of top height/Dbh over age predicted the observed values of top height/Dbh more closely under the hyperbolic Gompertz growth model than under the source model (the classical Gompertz growth model), while the results of R2, adjusted R2, MSE and AIC confirmed the predictive power of the hyperbolic Gompertz growth model over its source model.
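
For comparison purposes, the classical Gompertz curve (the source model above) can be fitted and scored with the same statistics; the sketch below uses synthetic height-age data and the least-squares form of AIC. The hyperbolic modification itself is defined in the paper and not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, a, b, c):
    """Classical Gompertz growth curve H(t) = a * exp(-b * exp(-c*t))."""
    return a * np.exp(-b * np.exp(-c * t))

def fit_stats(y, y_hat, n_params):
    """R2, adjusted R2, MSE and the least-squares form of AIC."""
    n = len(y)
    sse = np.sum((y - y_hat) ** 2)
    sst = np.sum((y - y.mean()) ** 2)
    r2 = 1 - sse / sst
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - n_params - 1)
    aic = n * np.log(sse / n) + 2 * n_params
    return r2, adj_r2, sse / n, aic

if __name__ == "__main__":
    age = np.arange(1, 16, dtype=float)                       # years, illustrative
    height = 25 * np.exp(-2.3 * np.exp(-0.25 * age)) \
             + np.random.default_rng(2).normal(0, 0.4, age.size)
    popt, _ = curve_fit(gompertz, age, height, p0=[25, 2, 0.3])
    r2, adj_r2, mse, aic = fit_stats(height, gompertz(age, *popt), n_params=3)
    print("a, b, c =", np.round(popt, 3))
    print(f"R2={r2:.3f}  adj R2={adj_r2:.3f}  MSE={mse:.3f}  AIC={aic:.1f}")
```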

Keywords: Height, Dbh, forest, Pinus caribaea, hyperbolic, Gompertz.

Downloads: 2680