Search results for: stock price prediction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3928

3148 Motives and Barriers of Using Airbnb: Findings from Mixed Method Approach

Authors: Ghada Mohammed, Mohamed Abdel Salam, Passent Tantawi

Abstract:

The study aimed to investigate the impact of motives and barriers on Egyptian users' overall attitude towards Airbnb as a peer-to-peer accommodation platform used instead of hotels. A sequential mixed-methods approach was adopted: a comprehensive research model, adapted from both the literature and the results of the qualitative phase, was proposed and then tested via an online questionnaire. The findings revealed that, among the motives, price, home benefits, privacy, and online reviews significantly explained the overall attitude towards Airbnb, while the main barriers, respectively perceived risk and distrust, also predicted the overall attitude. Among the subjective norms, only social influence predicted behavioral intention to use Airbnb. The study may serve as a practical reference for practitioners as well as researchers when developing programs and strategies to manage Airbnb consumers' needs and decision process. Two of the main conclusions drawn from this study are that variety was one of the features users liked most about Airbnb, and that the most important motives are the functional ones, such as price, rather than the experiential ones, such as authenticity.

Keywords: airbnb, barriers, disruptive innovation, motives, sharing economy

Procedia PDF Downloads 140
3147 An Implementation of Incentive Systems within Property Life Cycles Will Reward Investors, Planners and Users

Authors: Nadine Wills

Abstract:

The whole-life thinking of buildings (whether commercial or residential properties) will improve if incentive systems are provided to investors, planners and users. The use of Building Information Modelling (BIM) systems offers planners the possibility to plan and re-plan buildings for decades after a period of utilization without expending large capacities. The strategic incentive should be to plan the building in a way that makes rescheduling possible by changing just parameters in the system rather than re-planning the whole building. If users are given the chance to participate in incentive systems, the building stock will have a long life period. Business models such as tenant electricity or self-controlled operating costs are incentive systems for building users that let fixed running costs decline without producing damage through misuse. BIM is the controlling body that ensures users do not abuse the incentive solution and exert a negative influence on the building stock. The investor benefits from the planner's and user's incentives: the fact that the building remains useful for its whole life without unnecessary investments creates opportunities to invest in other assets. Moreover, the investor gains the ability to achieve higher rents by marketing the property with low operating costs. Executing BIM thus supports whole property life cycles.

Keywords: BIM, incentives, life cycle, sustainability

Procedia PDF Downloads 294
3146 A Multilayer Perceptron Neural Network Model Optimized by Genetic Algorithm for Significant Wave Height Prediction

Authors: Luis C. Parra

Abstract:

The prediction of significant wave height is an issue of great interest in the field of coastal activities because of the non-linear behavior of the wave height and the complexity of its prediction. This study presents a machine learning model to forecast the significant wave height of the oceanographic wave-measuring buoys anchored at Mooloolaba, using Queensland Government data. Modeling was performed by a multilayer perceptron neural network optimized by a genetic algorithm (GA-MLP), with ReLU as the activation function of the MLP. The GA is in charge of optimizing the MLP hyperparameters (learning rate, hidden layers, neurons, and activation functions) and of wrapper feature selection for the window width size. Results are assessed using the Mean Square Error (MSE), Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE). The GA-MLP algorithm was run with a population size of thirty individuals for eight generations for the optimization of the 5-steps-ahead prediction, obtaining a performance of 0.00104 MSE, 0.03222 RMSE, 0.02338 MAE, and 0.71163% MAPE. The results of the analysis suggest that the GA-MLP model is effective in predicting significant wave height in a one-step forecast with distant time windows, presenting 0.00014 MSE, 0.01180 RMSE, 0.00912 MAE, and 0.52500% MAPE with a correlation factor of 0.99940. The GA-MLP algorithm was also compared with an ARIMA forecasting model and performed better on all criteria, validating the potential of this algorithm.
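
A minimal Python sketch of the idea described above (not the authors' code): a small genetic algorithm that tunes MLP hyperparameters by validation error. The data arrays, GA settings, and genome layout are placeholders chosen for illustration.

```python
# GA-tuned MLP sketch: evolve (learning rate, layer width, depth) genomes
# and score each by validation MSE of a scikit-learn MLPRegressor.
import random
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = random.Random(42)
X, y = np.random.rand(500, 6), np.random.rand(500)            # placeholder data
X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

def random_genome():
    # genome = (learning rate, neurons per hidden layer, number of hidden layers)
    return (10 ** rng.uniform(-4, -1), rng.choice([8, 16, 32, 64]), rng.randint(1, 3))

def fitness(genome):
    lr, width, depth = genome
    model = MLPRegressor(hidden_layer_sizes=(width,) * depth, activation="relu",
                         learning_rate_init=lr, max_iter=500, random_state=0)
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_va, model.predict(X_va))       # lower is better

def crossover(a, b):
    return tuple(rng.choice(pair) for pair in zip(a, b))

def mutate(g):
    return random_genome() if rng.random() < 0.2 else g

population = [random_genome() for _ in range(30)]              # 30 individuals
for generation in range(8):                                    # 8 generations
    parents = sorted(population, key=fitness)[:10]             # truncation selection
    population = parents + [mutate(crossover(rng.choice(parents), rng.choice(parents)))
                            for _ in range(20)]

best = min(population, key=fitness)
print("best (learning rate, width, depth):", best)
```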

Keywords: significant wave height, machine learning optimization, multilayer perceptron neural networks, evolutionary algorithms

Procedia PDF Downloads 103
3145 Prediction of Compressive Strength in Geopolymer Composites by Adaptive Neuro Fuzzy Inference System

Authors: Mehrzad Mohabbi Yadollahi, Ramazan Demirboğa, Majid Atashafrazeh

Abstract:

Geopolymers are highly complex materials that involve many variables, which makes modeling their properties very difficult, and there is no systematic approach to mix design for geopolymers. Since the silica modulus, Na2O content, w/b ratio and curing time have a great influence on the compressive strength, an ANFIS (adaptive neuro-fuzzy inference system) model has been established for predicting the compressive strength of ground-pumice-based geopolymers, and the potential of ANFIS for predicting the compressive strength has been studied. The results show that ANFIS can be used for geopolymer compressive strength prediction with acceptable accuracy.
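
To make the model family concrete, the sketch below shows the forward pass of a generic first-order Sugeno ANFIS in plain NumPy. It is illustrative only: the inputs stand in for (silica modulus, Na2O content, w/b ratio, curing time), and the membership and consequent parameters are random placeholders rather than fitted values.

```python
# Generic first-order Sugeno ANFIS forward pass with Gaussian membership functions.
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_mfs = 4, 2                  # 2 fuzzy sets per input -> 2**4 = 16 rules
centers = rng.random((n_inputs, n_mfs))
sigmas = np.full((n_inputs, n_mfs), 0.5)
n_rules = n_mfs ** n_inputs
consequents = rng.random((n_rules, n_inputs + 1))   # linear Sugeno consequents

def anfis_predict(x):
    # Layer 1: membership degree of each input in each fuzzy set
    mu = np.exp(-((x[:, None] - centers) ** 2) / (2 * sigmas ** 2))
    # Layer 2: rule firing strengths = product over one membership function per input
    idx = np.array(np.meshgrid(*[range(n_mfs)] * n_inputs)).T.reshape(-1, n_inputs)
    w = np.prod(mu[np.arange(n_inputs), idx], axis=1)
    # Layer 3: normalised firing strengths
    w_bar = w / w.sum()
    # Layers 4-5: weighted sum of the linear consequent outputs
    f = consequents[:, :-1] @ x + consequents[:, -1]
    return float(w_bar @ f)

x = np.array([2.0, 0.08, 0.35, 28.0])   # hypothetical mix-design inputs
print("predicted compressive strength (arbitrary units):", anfis_predict(x))
```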

Keywords: geopolymer, ANFIS, compressive strength, mix design

Procedia PDF Downloads 847
3144 Customer Experience Management in Food and Beverage Outlet at Indian School of Business: Methodology and Recommendations

Authors: Anupam Purwar

Abstract:

In the conventional consumer-product industry, stockouts are handled by carrying buffer stock to guard against under-serving caused by changes in customer demand, incorrect forecasts or variability in lead times. For food outlets, however, the alternative of carrying buffer stock is unviable because of the indispensable need to serve freshly cooked meals. Besides, the food outlet, being the sole provider, has no incentive to reduce stockouts, as it has no fear of losing revenue, gross profit, customers or market share. Hence, innovative, easy-to-implement and practical ways of addressing the twin problems of long queues and poor customer experience need to be investigated. The current work analyses the demand pattern of 11 different food items across a routine day. Based on this, optimal resource allocation for all food items has been carried out by solving a linear programming problem with cost minimization as the objective. Concurrently, recommendations have been devised to address this demand- and supply-side problem, keeping their practicability in mind. The recommendations are currently being discussed and implemented at the ISB (Indian School of Business) Hyderabad campus.
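
A minimal sketch of the kind of cost-minimising linear programme described above, reduced to three food items instead of eleven. All costs, preparation times, demands and capacities are made-up numbers, not the study's data.

```python
# Cost-minimising allocation: prepare at least the forecast demand of each item
# while the total preparation time fits within the available staff-minutes.
import numpy as np
from scipy.optimize import linprog

cost = np.array([30.0, 45.0, 25.0])        # preparation cost per serving (assumed)
prep_min = np.array([2.0, 3.5, 1.5])       # staff-minutes per serving (assumed)
demand = np.array([120, 80, 150])          # forecast servings for the slot (assumed)
staff_minutes = 900                        # total staff-minutes available (assumed)

res = linprog(c=cost,
              A_ub=[prep_min],                       # total prep time <= roster capacity
              b_ub=[staff_minutes],
              bounds=[(d, None) for d in demand],    # serve at least the demand
              method="highs")

print("status:", res.message)
print("servings to prepare per item:", res.x)
print("minimum total cost:", res.fun)
```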

Keywords: F&B industry, resource allocation, demand management, linear programming, LP, queuing analysis

Procedia PDF Downloads 131
3143 Forecasting Electricity Spot Price with Generalized Long Memory Modeling: Wavelet and Neural Network

Authors: Souhir Ben Amor, Heni Boubaker, Lotfi Belkacem

Abstract:

The aim of this paper is to forecast electricity spot prices. First, we focus on modeling the conditional mean of the series, for which we adopt a generalized fractional k-factor Gegenbauer process (k-factor GARMA). Secondly, the residuals from the k-factor GARMA model are used as a proxy for the conditional variance, and these residuals are predicted using two different approaches. In the first approach, a local linear wavelet neural network (LLWNN) model is developed to predict the conditional variance using the back-propagation learning algorithm. In the second approach, the Gegenbauer generalized autoregressive conditional heteroscedasticity process (G-GARCH) is adopted, and the parameters of the k-factor GARMA-G-GARCH model are estimated using a wavelet methodology based on the discrete wavelet packet transform (DWPT). The empirical results show that the k-factor GARMA-G-GARCH model outperforms the hybrid k-factor GARMA-LLWNN model and is more appropriate for forecasting.
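
A simplified two-stage sketch of the workflow above: model the conditional mean first, then model the conditional variance of its residuals. A plain ARMA-GARCH pair is used here as a stand-in for the paper's k-factor GARMA, G-GARCH and LLWNN models, and the price series is a random placeholder.

```python
# Stage 1: conditional mean; Stage 2: conditional variance of the residuals.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

prices = pd.Series(np.random.default_rng(1).normal(50, 5, 1000))  # placeholder spot prices

# Stage 1: ARMA(2,1) as a simple stand-in for the k-factor GARMA mean model
mean_fit = ARIMA(prices, order=(2, 0, 1)).fit()
residuals = mean_fit.resid

# Stage 2: GARCH(1,1) on the residuals as a stand-in for G-GARCH (or the LLWNN)
vol_fit = arch_model(residuals, vol="Garch", p=1, q=1).fit(disp="off")

# Combine both stages into a one-step-ahead forecast with a volatility band
mean_fc = mean_fit.forecast(steps=1).iloc[0]
var_fc = vol_fit.forecast(horizon=1).variance.iloc[-1, 0]
print(f"next-step price forecast: {mean_fc:.2f} +/- {np.sqrt(var_fc):.2f}")
```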

Keywords: electricity price, k-factor GARMA, LLWNN, G-GARCH, forecasting

Procedia PDF Downloads 227
3142 Prediction of Deformations of Concrete Structures

Authors: A. Brahma

Abstract:

Drying is a phenomenon that accompanies the hardening of hydraulic materials. If it is not prevented, it can lead to significant spontaneous dimensional variations, of which cracking is one consequence. In this context, cracking promotes the transport of aggressive agents into the material, which can affect the durability of concrete structures. Drying shrinkage develops over a long period of almost 30 years, although most of it occurs during the first three years, and it stabilizes when the material reaches water balance with the external environment. The drying shrinkage of cementitious materials is due to the formation of capillary tensions in the pores of the material, which has the consequence of drawing the solid walls closer to each other. Knowledge of the shrinkage characteristics of concrete is a necessary starting point in the design of structures for crack control. Such knowledge enables the designer to estimate the probable shrinkage movement in reinforced or prestressed concrete, so that appropriate steps can be taken in design to accommodate this movement. This study is concerned with the modelling of the drying shrinkage of hydraulic materials and the prediction of the rate of spontaneous deformation of hydraulic materials during hardening. The model developed takes into consideration the main factors affecting drying shrinkage. There was agreement between the drying shrinkage predicted by the developed model and experimental results. Finally, we show that the developed model correctly describes the evolution of drying shrinkage in high-performance concretes.

Keywords: drying, hydraulic concretes, shrinkage, modeling, prediction

Procedia PDF Downloads 330
3141 Landslide Susceptibility Mapping: A Comparison between Logistic Regression and Multivariate Adaptive Regression Spline Models in the Municipality of Oudka, Northern of Morocco

Authors: S. Benchelha, H. C. Aoudjehane, M. Hakdaoui, R. El Hamdouni, H. Mansouri, T. Benchelha, M. Layelmam, M. Alaoui

Abstract:

Logistic regression (LR) and multivariate adaptive regression splines (MarSpline) are applied and verified for landslide susceptibility mapping in Oudka, Morocco, using a geographical information system. From a spatial database containing data such as landslide mapping, topography, soil, hydrology and lithology, eight factors related to landslides, namely elevation, slope, aspect, distance to streams, distance to roads, distance to faults, lithology and the Normalized Difference Vegetation Index (NDVI), were calculated or extracted. Using these factors, landslide susceptibility indexes were calculated by the two methods. Before the calculation, the database was divided into two parts, the first for training the model and the second for validation. The results of the landslide susceptibility analysis were verified using success and prediction rates to evaluate the quality of these probabilistic models. This verification showed that the MarSpline model is the better model, with a success rate (AUC = 0.963) and a prediction rate (AUC = 0.951) higher than those of the LR model (success rate AUC = 0.918, prediction rate AUC = 0.901).
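
A hedged sketch of the validation scheme above for the logistic-regression side: split the landslide inventory, fit on one part, and report AUC on both parts (success rate on the training set, prediction rate on the held-out set). The eight conditioning factors are represented here by synthetic placeholder data, and the MarSpline model is omitted.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))                     # elevation, slope, aspect, ..., NDVI
y = (X[:, 1] + 0.5 * X[:, 3] + rng.normal(size=1000) > 0).astype(int)   # landslide / no landslide

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

lr = LogisticRegression(max_iter=1000).fit(X_train, y_train)
success_auc = roc_auc_score(y_train, lr.predict_proba(X_train)[:, 1])     # success rate
prediction_auc = roc_auc_score(y_test, lr.predict_proba(X_test)[:, 1])    # prediction rate
print(f"success rate AUC = {success_auc:.3f}, prediction rate AUC = {prediction_auc:.3f}")
```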

Keywords: landslide susceptibility mapping, logistic regression, multivariate adaptive regression spline, Oudka, Taounate

Procedia PDF Downloads 184
3140 The Impact of Dispatching with Rolling Horizon Control in Sizing Thermal Storage for Solar Tower Plant Participating in Wholesale Spot Electricity Market

Authors: Navid Mohammadzadeh, Huy Truong-Ba, Michael Cholette

Abstract:

The solar tower (ST) plant is a promising technology for exploiting large-scale solar irradiation. With thermal energy storage, an ST plant has the potential to shift generation to high electricity price periods. However, the size of the storage limits the dispatchability of the plant, particularly when it must contend with uncertainty in forecasts of solar irradiation and electricity prices. The purpose of this study is to explore the required storage size when Rolling Horizon Control (RHC) is employed for dispatch scheduling. To this end, RHC is benchmarked against a perfect knowledge (PK) forecast and two day-ahead dispatching policies. By optimising dispatch planning under the PK policy, the optimal achievable profit for a specific storage size is determined. A sensitivity analysis using Monte Carlo simulation is conducted, and the storage size for the RHC and day-ahead policies is determined with the objective of reaching the profit obtained under the PK policy. A case study is conducted for a hypothetical ST plant with thermal storage located in South Australia that intends to dispatch under two market scenarios: 1) fixed price and 2) wholesale spot price. The impact of each individual source of uncertainty on storage size is examined for January and August. The results show that dispatching with the RH controller reaches the optimal achievable profit with ~15% smaller storage than the day-ahead policies. The results of this study may be applied to the CSP plant design procedure.
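
A toy sketch of rolling-horizon dispatch for a storage-backed plant: at each hour, re-plan over the next H hours using the latest (imperfect) price forecast, then commit only the first hour of the plan. The greedy dispatch rule and all numbers are illustrative placeholders, not the RHC formulation used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)
T, H = 48, 6                          # simulated hours and rolling horizon length
cap, power = 8.0, 2.0                 # storage capacity (MWh) and max discharge (MW), assumed
true_price = 40 + 30 * np.sin(np.arange(T) * 2 * np.pi / 24) + rng.normal(0, 5, T)
solar_in = np.clip(np.sin(np.arange(T) * np.pi / 24), 0, None) * 3.0   # MWh charged per hour

storage, revenue = 0.0, 0.0
for t in range(T):
    storage = min(cap, storage + solar_in[t])                  # charge from the solar field
    horizon = min(H, T - t)
    forecast = true_price[t:t + horizon] + rng.normal(0, 5, horizon)   # imperfect forecast
    # Greedy re-plan: discharge this hour only if it ranks among the best hours
    # needed to empty the storage within the current horizon.
    hours_needed = min(int(np.ceil(storage / power)), horizon)
    if hours_needed and forecast[0] >= np.sort(forecast)[-hours_needed]:
        dispatched = min(power, storage)
        storage -= dispatched
        revenue += dispatched * true_price[t]                  # settle at the realised price
print(f"revenue over {T} hours: {revenue:.1f}")
```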

Keywords: solar tower plant, spot market, thermal storage system, optimized dispatch planning, sensitivity analysis, Monte Carlo simulation

Procedia PDF Downloads 120
3139 Scour Depth Prediction around Bridge Piers Using Neuro-Fuzzy and Neural Network Approaches

Authors: H. Bonakdari, I. Ebtehaj

Abstract:

The prediction of scour depth around bridge piers is frequently considered in river engineering, and scour depth estimation around bridge piers is regarded as one of the key aspects of efficient and optimum bridge structure design. In this study, scour depth around bridge piers is estimated using two methods, namely the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Artificial Neural Network (ANN). The parameters effective in scour depth prediction are determined via dimensional analysis, and the scour depth is subsequently predicted using the ANN and ANFIS methods. The performance of these methods is compared with that of the nonlinear regression (NLR) method. The results show that both methods presented in this study outperform existing methods. Moreover, using the ratio of pier length to flow depth, the ratio of the median particle diameter to flow depth, the ratio of pier width to flow depth, the Froude number and the standard deviation of the bed grain size as parameters leads to optimal performance in scour depth estimation.

Keywords: adaptive neuro-fuzzy inference system (ANFIS), artificial neural network (ANN), bridge pier, scour depth, nonlinear regression (NLR)

Procedia PDF Downloads 215
3138 An Application for Risk of Crime Prediction Using Machine Learning

Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento

Abstract:

The growth of the world population, especially in large urban centers, has resulted in new challenges, particularly in the control and optimization of public safety. Thus, in the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical incident data and demographic information. The entire research and implementation is presented, starting with the data collection from its original source, the treatment and transformations applied to the data, and the choice, evaluation and implementation of the machine learning model, up to the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformation used. The results show that the use of machine learning techniques helps to anticipate criminal occurrences, which contributes to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API enabling other entities to make requests for predictions in real time. An application is also presented where criminal predictions can be shown visually.
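
A sketch of the model comparison described above, using the same four classifier families on a held-out split. The synthetic dataset stands in for the engineered features (location grid cell, time of day, demographics, etc.); it is not the study's data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Placeholder features and an imbalanced "crime / no crime" label
X, y = make_classification(n_samples=4000, n_features=12, weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "Neural Network": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0),
    "K-Nearest Neighbors": KNeighborsClassifier(n_neighbors=15),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:20s} F1 = {f1_score(y_te, model.predict(X_te)):.3f}")
```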

Keywords: crime prediction, machine learning, public safety, smart city

Procedia PDF Downloads 107
3137 Analysis of Brain Signals Using Neural Networks Optimized by Co-Evolution Algorithms

Authors: Zahra Abdolkarimi, Naser Zourikalatehsamad

Abstract:

Until about 40 years ago, after the recognition of epilepsy, it was generally believed that these attacks occurred randomly and suddenly. However, thanks to advances in mathematics and engineering, such attacks can now be predicted within a few minutes or hours, and various algorithms for long-term prediction of the time and frequency of the first attack have been presented. In this paper, considering the nonlinear and dynamic nature of recorded brain signals, an ANFIS model is presented to predict the brain signals, since, given the physiologic structure of the onset of attacks, more complex neural structures can better model the signal during attacks. The contribution of this work is a co-evolution algorithm for the optimization of the ANFIS network parameters. Our objective is to predict brain signals based on time series obtained from the brain signals of people suffering from epilepsy using ANFIS. Results reveal that, compared to other methods, this method is less sensitive to uncertainties such as the presence of noise and interruptions in the recorded brain signals, and is more accurate. The long-term prediction capacity of the model illustrates the potential of implanted systems for medication warnings and seizure prevention.

Keywords: co-evolution algorithms, brain signals, time series, neural networks, ANFIS model, physiologic structure, time prediction, epilepsy suffering, illustrates model

Procedia PDF Downloads 275
3136 Rainfall-Runoff Forecasting Utilizing Genetic Programming Technique

Authors: Ahmed Najah Ahmed Al-Mahfoodh, Ali Najah Ahmed Al-Mahfoodh, Ahmed Al-Shafie

Abstract:

In this study, the genetic programming (GP) technique is investigated for the prediction of a set of rainfall-runoff data. To assess the effect of the input parameters on the model, a sensitivity analysis was adopted. To evaluate the performance of the proposed model, three statistical indexes were used, namely the Correlation Coefficient (CC), Mean Square Error (MSE) and Coefficient of Efficiency (CE). The principal aim of this study is to develop a computationally efficient and robust approach for the prediction of rainfall-runoff, which could reduce the cost and labour of measuring these parameters. This research concentrates on the Johor River in Johor State, Malaysia.
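
The three evaluation indexes named above, written out explicitly in Python (CE is taken here to mean the Nash-Sutcliffe coefficient of efficiency, the usual choice in rainfall-runoff studies). The observed and simulated runoff values are made-up illustrative numbers.

```python
import numpy as np

def correlation_coefficient(obs, sim):
    # Pearson correlation between observed and simulated runoff
    return np.corrcoef(obs, sim)[0, 1]

def mean_square_error(obs, sim):
    return np.mean((obs - sim) ** 2)

def coefficient_of_efficiency(obs, sim):
    # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the mean
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([12.0, 18.5, 25.1, 30.0, 22.4])   # hypothetical observed runoff
sim = np.array([11.2, 19.0, 24.0, 31.5, 21.0])   # hypothetical GP-simulated runoff
print("CC :", round(correlation_coefficient(obs, sim), 3))
print("MSE:", round(mean_square_error(obs, sim), 3))
print("CE :", round(coefficient_of_efficiency(obs, sim), 3))
```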

Keywords: genetic programming, prediction, rainfall-runoff, Malaysia

Procedia PDF Downloads 474
3135 A Study for Area-level Mosquito Abundance Prediction by Using Supervised Machine Learning Point-level Predictor

Authors: Theoktisti Makridou, Konstantinos Tsaprailis, George Arvanitakis, Charalampos Kontoes

Abstract:

In the literature, data-driven approaches to mosquito abundance prediction rely on supervised machine learning models trained with historical in-situ measurements. The drawback of this approach is that once the model is trained on point-level (specific x, y coordinates) measurements, its predictions also refer to the point level. These point-level predictions reduce the applicability of such solutions, since many early-warning and mitigation applications need predictions at an area level, such as a municipality or village. In this study, we apply a data-driven predictive model, which relies on publicly available satellite Earth Observation and geospatial data and is trained with historical point-level in-situ measurements of mosquito abundance. We then propose a methodology to extend a point-level predictive model to a broader area-level prediction. Our methodology relies on random spatial sampling of the area of interest (similar to a Poisson hard-core process), obtaining the EO and geomorphological information for each sample, making a point-wise prediction for each sample, and aggregating the predictions to represent the average mosquito abundance of the area. We quantify the performance of the transformation from point-level to area-level predictions and analyse it in order to understand which parameters have a positive or negative impact on it. The goal of this study is to propose a methodology that predicts the mosquito abundance of a given area by relying on point-level predictions and to provide qualitative insights into the expected performance of the area-level prediction. We applied our methodology to historical data (of Culex pipiens) from two areas of interest (the Veneto region of Italy and Central Macedonia in Greece). In both cases, the results were consistent: the mean mosquito abundance of a given area can be estimated with an accuracy similar to that of the point-level predictor, sometimes even better. The density of the samples used to represent an area has a positive effect on performance, whereas the raw number of sampling points is not informative about performance without the size of the area. Additionally, we saw that the distance between the sampling points and the real in-situ measurements used for training did not strongly affect the performance.
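
A sketch of the point-to-area aggregation described above: sample random locations inside the area of interest, run the point-level predictor at each location, and average the predictions. The polygon, the sample count and the `point_predictor` function are placeholders for the real geometries and the trained EO-driven model.

```python
import numpy as np
from shapely.geometry import Point, Polygon

rng = np.random.default_rng(0)
area = Polygon([(22.0, 40.0), (23.0, 40.0), (23.0, 41.0), (22.0, 41.0)])  # toy municipality

def point_predictor(lon, lat):
    # Stand-in for the EO/geomorphology-driven abundance model at a single point
    return 10 + 5 * np.sin(lon) + 3 * np.cos(lat)

def sample_points_in(polygon, n):
    # Rejection sampling: draw points in the bounding box, keep those inside the polygon
    minx, miny, maxx, maxy = polygon.bounds
    pts = []
    while len(pts) < n:
        p = Point(rng.uniform(minx, maxx), rng.uniform(miny, maxy))
        if polygon.contains(p):
            pts.append(p)
    return pts

samples = sample_points_in(area, n=200)
area_abundance = np.mean([point_predictor(p.x, p.y) for p in samples])
print(f"estimated mean mosquito abundance for the area: {area_abundance:.2f}")
```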

Keywords: mosquito abundance, supervised machine learning, culex pipiens, spatial sampling, west nile virus, earth observation data

Procedia PDF Downloads 143
3134 Application of Latent Class Analysis and Self-Organizing Maps for the Prediction of Treatment Outcomes for Chronic Fatigue Syndrome

Authors: Ben Clapperton, Daniel Stahl, Kimberley Goldsmith, Trudie Chalder

Abstract:

Chronic fatigue syndrome (CFS) is a condition characterised by chronic disabling fatigue and other symptoms that currently cannot be explained by any underlying medical condition. Although clinical trials support the effectiveness of cognitive behaviour therapy (CBT), the success rate for individual patients is modest. Patients vary in their response, and little is known about which factors predict or moderate treatment outcomes. The aim of the project is to develop a prediction model from baseline characteristics of patients, such as demographic, clinical and psychological variables, which may predict the likely treatment outcome, provide guidance for clinical decision making and help clinicians recommend the best treatment. The project is aimed at identifying subgroups of patients with similar baseline characteristics that are predictive of treatment effects, using modern cluster analyses and data-mining machine learning algorithms. The characteristics of these groups will then be used to inform the types of individuals who benefit from a specific treatment. In addition, the results will provide a better understanding of for whom the treatment works. The suitability of different clustering methods for identifying subgroups of CFS patients and their response to different treatments is compared.

Keywords: chronic fatigue syndrome, latent class analysis, prediction modelling, self-organizing maps

Procedia PDF Downloads 221
3133 Time and Cost Efficiency Analysis of Quick Die Change System on Metal Stamping Industry

Authors: Rudi Kurniawan Arief

Abstract:

Manufacturing cost and setup time are key improvement targets in the metal stamping industry, because material and component prices keep rising while customers require the component price to be cut down year by year. The Single Minute Exchange of Die (SMED) is one of many methods for reducing waste in the stamping industry, and the Japanese Quick Die Change (QDC) die system is one of the SMED systems that can reduce both setup time and manufacturing cost. However, this system is rarely used in stamping industries. This paper analyzes how far the QDC die system can reduce setup time and manufacturing cost. The research was conducted by direct observation and by simulating and comparing the QDC die system with a conventional die system. In this research, we found that the QDC die system can save up to 35% of manufacturing cost and reduce setup time by 70%. The simulation proved that the QDC die system is effective for cost reduction but must be applied across several parallel production processes.

Keywords: press die, metal stamping, QDC system, single minute exchange die, manufacturing cost saving, SMED

Procedia PDF Downloads 166
3132 The Combination of the Mel Frequency Cepstral Coefficients, Perceptual Linear Prediction, Jitter and Shimmer Coefficients for the Improvement of Automatic Recognition System for Dysarthric Speech

Authors: Brahim Fares Zaidi

Abstract:

Our work aims to improve our automatic recognition system for dysarthric speech, based on hidden Markov models and the Hidden Markov Model Toolkit (HTK), to help people who suffer from pronunciation problems. We applied two speech parameterization techniques, based on Mel Frequency Cepstral Coefficients (MFCC) and Perceptual Linear Prediction (PLP), and concatenated them with jitter and shimmer coefficients in order to increase the recognition rate for dysarthric speech. For our tests, we used the NEMOURS database, which contains speakers with dysarthria and normal speakers.
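
A hedged sketch of the feature side of such a system: extract MFCCs with librosa and append a simple jitter-style measure derived from the frame-level F0 track. PLP features and a proper shimmer estimate are omitted, and the synthetic tone below stands in for a recorded utterance (a real system would load a wav file).

```python
import numpy as np
import librosa

# Synthetic "voiced" signal: a 150 Hz tone with slight pitch and amplitude wobble
sr = 16000
t = np.arange(0, 2.0, 1 / sr)
f0_true = 150 + 3 * np.sin(2 * np.pi * 2 * t)
y = (0.5 + 0.05 * np.sin(2 * np.pi * 5 * t)) * np.sin(2 * np.pi * np.cumsum(f0_true) / sr)

# 13 MFCCs per frame, as is typical for HTK-style HMM recognisers
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

# Frame-level F0 track, then a crude jitter-style proxy: mean absolute change of
# consecutive voiced pitch periods relative to the mean period.
f0, voiced, _ = librosa.pyin(y, fmin=60, fmax=400, sr=sr)
periods = 1.0 / f0[voiced & ~np.isnan(f0)]
jitter = np.mean(np.abs(np.diff(periods))) / np.mean(periods)

# Append the per-utterance perturbation measure to every MFCC frame
features = np.vstack([mfcc, np.full((1, mfcc.shape[1]), jitter)])
print("feature matrix shape (coefficients x frames):", features.shape, "jitter:", round(jitter, 4))
```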

Keywords: ARSDS, HTK, HMM, MFCC, PLP

Procedia PDF Downloads 106
3131 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models

Authors: Jay L. Fu

Abstract:

Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow down its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models, including logistic regression, k-nearest neighbor, support vector machine, random forest, and neural network, were developed. The data were divided into training and testing sets, where the training sets were used to build the predictive models and the testing sets were used to assess the accuracy of prediction. Key risk factors were identified, and the various models were compared to arrive at the best prediction model. Among these models, the random forest model appeared to be the best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important contributing factors to the detection of Alzheimer's. Across all the models used, the percentage of testing inputs for which at least 4 of the 5 models shared the same diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer's with good accuracy, which ultimately leads to early treatment of these patients.
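
A sketch of the agreement statistic mentioned above: train the five model families and count the test cases on which at least four of the five give the same diagnosis. The synthetic dataset is a placeholder for the MRI-derived and demographic features (MMSE, nWBV, gender, etc.).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=373, n_features=8, random_state=0)   # placeholder data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = [LogisticRegression(max_iter=1000),
          KNeighborsClassifier(),
          SVC(),
          RandomForestClassifier(random_state=0),
          MLPClassifier(max_iter=1000, random_state=0)]

# One row of predicted diagnoses per model, shape (5, n_test)
predictions = np.array([m.fit(X_tr, y_tr).predict(X_te) for m in models])
votes_for_majority = np.max([np.sum(predictions == c, axis=0) for c in (0, 1)], axis=0)
agreement_rate = np.mean(votes_for_majority >= 4)
print(f"share of test cases where >=4 of 5 models agree: {agreement_rate:.2%}")
```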

Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction

Procedia PDF Downloads 141
3130 Shedding Light on the Black Box: Explaining Deep Neural Network Prediction of Clinical Outcome

Authors: Yijun Shao, Yan Cheng, Rashmee U. Shah, Charlene R. Weir, Bruce E. Bray, Qing Zeng-Treitler

Abstract:

Deep neural network (DNN) models are being explored in the clinical domain, following their recent success in other domains such as image recognition. For clinical adoption, outcome prediction models require explanation, but due to their multiple non-linear inner transformations, DNN models are viewed by many as a black box. In this study, we developed a deep neural network model for predicting the 1-year mortality of patients who underwent major cardiovascular procedures (MCVPs), using a temporal image representation of past medical history as input. The dataset was obtained from the electronic medical data warehouse administered by the Veterans Affairs Information and Computing Infrastructure (VINCI). We identified 21,355 veterans who had their first MCVP in 2014. Features for prediction included demographics, diagnoses, procedures, medication orders, hospitalizations, and frailty measures extracted from clinical notes. Temporal variables were created based on the patient history data in the 2-year window prior to the index MCVP, and a temporal image was created from these variables for each individual patient. To generate the explanation for the DNN model, we defined a new concept called the impact score, based on the impact of the presence/value of clinical conditions on the predicted outcome. Like the (log) odds ratios reported by the logistic regression (LR) model, impact scores are continuous variables intended to shed light on the black-box model. For comparison, a logistic regression model was fitted on the same dataset. In our cohort, about 6.8% of patients died within one year. The DNN model achieved an area under the curve (AUC) of 78.5%, while the LR model achieved an AUC of 74.6%. A strong but not perfect correlation was found between the aggregated impact scores and the log odds ratios (Spearman's rho = 0.74), which helped validate our explanation.
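
A sketch of the validation step described above: rank-correlate DNN-derived impact scores with the log odds ratios from a logistic regression model. The two score vectors below are synthetic placeholders for the per-feature values reported in the paper.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
log_odds_ratios = rng.normal(size=40)                       # one value per clinical feature (LR)
impact_scores = 0.8 * log_odds_ratios + rng.normal(scale=0.4, size=40)   # DNN impact scores

# Spearman's rank correlation measures how consistently the two explanations
# order the clinical features, regardless of their scale.
rho, p_value = spearmanr(impact_scores, log_odds_ratios)
print(f"Spearman's rho = {rho:.2f} (p = {p_value:.3g})")
```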

Keywords: deep neural network, temporal data, prediction, frailty, logistic regression model

Procedia PDF Downloads 152
3129 Prediction of Rotating Machines with Rolling Element Bearings and Its Components Deterioration

Authors: Marimuthu Gurusamy

Abstract:

In vibration analysis (with accelerometers) of rotating machines with rolling element bearings, customers want to know about machine failure well in advance in order to plan spare inventory and maintenance. In the real world, however, most machines fail before the vibration analyst or expert analysis software predicts the failure. Presently, the prediction of failure is based only on ISO 10816 vibration limits, but this is not enough to anticipate failure well in advance, because more than 50% of machines fail even when the vibration readings are within the acceptable zone per ISO 10816. Hence, more detailed analysis and different techniques are required to predict failure well in advance. In vibration analysis, the velocity spectrum is used to analyse the root cause of mechanical problems such as unbalance, misalignment and looseness, while the envelope spectrum is used to analyse the bearing frequency components, so that failures of the inner race, outer race and rolling elements can be identified. So far, however, no correlation has been made between these two concepts. The author used both the velocity spectrum and the envelope spectrum to analyse machine behaviour and bearing condition, correlating the changes in dynamic load (due to unbalance, misalignment, looseness, etc.) with the effect of impacts on the bearing. In this way, the expected life of the machine and bearings in rotating equipment (with rolling element bearings) could be predicted. Process parameters such as temperature, flow and pressure were also used to correlate abnormal vibration with flow-induced vibration and load variations when it occurs due to changes in process parameters. Hence, by correlating the velocity spectrum, the envelope spectrum and process data, together with 20 years of experience in vibration analysis, the author was able to predict the deterioration of rotating equipment and its components and the expected time to maintenance.
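
A sketch of the envelope-spectrum step referred to above: band-limited impacts from a bearing defect amplitude-modulate the vibration signal, so the spectrum of the Hilbert envelope exposes the defect repetition frequency. The signal, resonance frequency and defect frequency below are synthetic illustrative values.

```python
import numpy as np
from scipy.signal import hilbert

fs = 10_000                                    # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
carrier = np.sin(2 * np.pi * 3000 * t)         # structural resonance excited by impacts
bpfo = 87.0                                    # hypothetical outer-race defect frequency (Hz)
modulation = 1 + 0.5 * (np.sin(2 * np.pi * bpfo * t) > 0.95)    # periodic impact train
signal = modulation * carrier + 0.1 * np.random.default_rng(0).normal(size=t.size)

envelope = np.abs(hilbert(signal))             # demodulate with the Hilbert transform
envelope -= envelope.mean()
spectrum = np.abs(np.fft.rfft(envelope)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

band = (freqs > 5) & (freqs < 500)             # look for the defect tone at low frequency
peak = freqs[band][np.argmax(spectrum[band])]
print(f"dominant envelope frequency: {peak:.1f} Hz (expected near {bpfo} Hz)")
```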

Keywords: vibration analysis, velocity spectrum, envelope spectrum, prediction of deterioration

Procedia PDF Downloads 448
3128 Profitability Assessment of Granite Aggregate Production and the Development of a Profit Assessment Model

Authors: Melodi Mbuyi Mata, Blessing Olamide Taiwo, Afolabi Ayodele David

Abstract:

The purpose of this research is to create empirical models for assessing the profitability of granite aggregate production in aggregate quarries in Akure, Ondo State. In addition, an artificial neural network (ANN) model and multivariate predictive models for granite profitability were developed in the study. A formal survey questionnaire was used to collect data. The data extracted from the case study mine include granite marketing operations, royalties, production costs, and mine production information. The following tools were used to achieve the goal of this study: descriptive statistics, MATLAB 2017, and SPSS 16.0 software for analyzing and modeling the data collected from granite traders in the study areas. The prediction accuracy of the ANN and multivariate regression models was compared using the coefficient of determination (R²), root mean square error (RMSE), and mean square error (MSE). Given the high prediction error of the regression model, the evaluation indices revealed that the ANN model was the more suitable for predicting the profit generated in a typical quarry. More quarries in Nigeria's southwest region and other geopolitical zones should be considered to improve the ANN prediction accuracy.

Keywords: national development, granite, profitability assessment, ANN models

Procedia PDF Downloads 93
3127 Is More Inclusive More Effective? The 'New Style' Public Distribution System in India

Authors: Avinash Kishore, Suman Chakrabarti

Abstract:

In September 2013, the parliament of India enacted the National Food Security Act (NFSA) which entitles two-thirds of India’s population to five kilograms of rice, wheat or coarse cereals per person per month at one to three rupees per kilogram. Five states in India—Andhra Pradesh, Chhattisgarh, Tamil Nadu, Odisha and West Bengal—had already implemented somewhat similar changes in the TPDS a few years earlier using their own budgetary resources. They made rice—coincidentally, all five states are predominantly rice-eating—available in fair price shops to a majority of their population at very low prices (less than Rs.3/kg). This paper tries to account for the changes in household consumption patterns associated with the change in TPDS policy in these states using data from household consumption surveys by the National Sample Survey Organization (NSSO). NSS data show improvement in the coverage of TPDS and average off-take of grains from fair price shops between 2004-05 and 2009-10 across all states of India. However, the increase in coverage and off-take was significantly higher in four out of these five states than in the rest of India. An average household in these states purchased three kilos more rice per month from fair price shops than its counterpart in non-treated states as a result of more generous TPDS policies backed by administrative reforms. The increase in consumption of PDS rice was the highest in Chhattisgarh, the poster state of PDS reforms. Households in Chhattisgarh used money saved on rice to spend more on pulses, edible oil, vegetables and sugar and other non-food items. We also find evidence that making TPDS more inclusive and more generous is not enough unless it is supported by administrative reforms to improve grain delivery and control diversion to open markets.

Keywords: public distribution system, social safety-net, national food security act, diet quality, Chhattisgarh

Procedia PDF Downloads 372
3126 Earthquake Vulnerability and Repair Cost Estimation of Masonry Buildings in the Old City Center of Annaba, Algeria

Authors: Allaeddine Athmani, Abdelhacine Gouasmia, Tiago Ferreira, Romeu Vicente

Abstract:

Seismic risk mitigation from the perspective of the old building stock is truly essential in Algerian urban areas, particularly those located in seismic-prone regions such as Annaba city, where the old buildings present high levels of degradation together with an absence of seismic strengthening and/or rehabilitation. In this sense, the present paper approaches the issue of the seismic vulnerability assessment of old masonry building stocks through the adaptation of a simplified methodology developed for a European context similar to that of Annaba city, Algeria. This method is used for a first-level seismic vulnerability assessment of the masonry building stock of the old city center of Annaba. The methodology is based on a vulnerability index that is suitable for the evaluation of damage and for the creation of large-scale loss scenarios. Over 380 buildings were evaluated in accordance with this methodology, and the results obtained were then integrated into a Geographical Information System (GIS) tool. Such results can be used by the Annaba city council to support management decisions based on a global view of the site under analysis, leading to more accurate and faster decisions on risk mitigation strategies and rehabilitation plans.

Keywords: damage scenarios, masonry buildings, old city center, seismic vulnerability, vulnerability index

Procedia PDF Downloads 447
3125 Impact of Ethnomedicinal Plants on Toothpaste Improvement

Authors: Muna Jalal Ali, Essam A. Makky, Mashitah M. Yusoff

Abstract:

Objectives: The aim of this study was to evaluate the antimicrobial susceptibility of toothpaste combined with medicinal plants, as well as the relationship between commercial toothpaste, its price and patient age. Materials and Methods: Oral isolates from patients aged 3 to 60 years were obtained, purified, and tested against four different ethnomedicinal plant extracts for antimicrobial activity. A total of 10 different commercial toothpastes (of different brands and prices) were collected from the market, and the combined action of the medicinal plants and toothpaste was studied. Results: We found a higher bacterial population in the age group of 3–40 years than in the group of 40–60 years, with approximately 44% and 32%, respectively. The combined action of the ethanolic extracts (alone) against oral isolates showed a synergistic effect, with 32.20, 30.50, and 25.42% for combinations A (Ci/Ca), B (Ci/Ca/P), and C (Ci/Ca/P/N), respectively. By contrast, the combined action of the ethnomedicinal plants with the 10 different toothpastes improved the antimicrobial sensitivity by 60, 100, and 0% for combinations A, B, and C, respectively. Clinical relevance: Only the ethanolic extracts of combinations A and B with commercial toothpaste showed high antibacterial activity against oral isolates, and the effectiveness of toothpaste is not related to its price.

Keywords: microbial evolution, oral isolates, ethnomedicinal plants, antimicrobial activity, toothpaste

Procedia PDF Downloads 306
3124 Prediction of Coronary Heart Disease Using Fuzzy Logic

Authors: Elda Maraj, Shkelqim Kuka

Abstract:

Coronary heart disease causes many deaths in the world, and unfortunately this problem will continue to grow in the future. In this paper, a fuzzy logic model to predict coronary heart disease is presented. The model was developed with seven input variables and one output variable and was applied to 30 patients in Albania, using the Fuzzy Logic Toolbox of MATLAB. The fuzzy model inputs are cholesterol, blood pressure, physical activity, age, BMI, smoking, and diabetes, whereas the output is the disease classification. The fuzzy sets and membership functions were chosen in an appropriate manner, and the centroid method is used for defuzzification. The database was taken from the University Hospital Center "Mother Teresa" in Tirana, Albania.
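
A tiny illustration (not the paper's MATLAB system) of the centroid defuzzification step mentioned above, using triangular membership functions on a 0-10 risk scale and two toy rules driven by a single input (cholesterol). All membership ranges are assumed for illustration.

```python
import numpy as np

def trimf(x, a, b, c):
    # Triangular membership function with feet at a, c and peak at b
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0, 1)

risk = np.linspace(0, 10, 1001)               # output universe: CHD risk score
low_risk = trimf(risk, -5, 0, 5)              # "low risk", peaked at 0
high_risk = trimf(risk, 5, 10, 15)            # "high risk", peaked at 10

cholesterol = 232.0                            # hypothetical patient value (mg/dL)
mu_normal = trimf(cholesterol, 100, 160, 220)  # degree to which cholesterol is "normal"
mu_high = trimf(cholesterol, 180, 260, 340)    # degree to which cholesterol is "high"

# Mamdani-style inference: clip each consequent by its rule's firing strength, then aggregate
aggregated = np.maximum(np.minimum(mu_normal, low_risk), np.minimum(mu_high, high_risk))

# Centroid defuzzification: centre of gravity of the aggregated membership curve
crisp_risk = np.sum(risk * aggregated) / np.sum(aggregated)
print(f"defuzzified CHD risk score: {crisp_risk:.2f} / 10")
```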

Keywords: coronary heart disease, fuzzy logic toolbox, membership function, prediction model

Procedia PDF Downloads 155
3123 Prediction of Scour Profile Caused by Submerged Three-Dimensional Wall Jets

Authors: Abdullah Al Faruque, Ram Balachandar

Abstract:

A series of laboratory tests was carried out to study the extent of scour caused by three-dimensional wall jets exiting from a square cross-section nozzle into non-cohesive sand beds. Previous observations have indicated that the effect of the tailwater depth is significant for densimetric Froude numbers greater than ten. However, the present results indicate that the cut-off value could be lower, depending on the value of the grain size-to-nozzle width ratio. A number of equations are derived for better scaling of the various scour parameters. Empirical predictions are also suggested for the scour centreline profile and the plan view of the scour profile at any particular time.
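
A small worked sketch of the two governing ratios discussed above: the densimetric Froude number (in its standard form for submerged-jet scour) and the grain size-to-nozzle width ratio. The jet velocity, grain size and nozzle width are hypothetical values, not data from the experiments.

```python
g = 9.81            # gravitational acceleration, m/s^2
rho_w = 1000.0      # water density, kg/m^3
rho_s = 2650.0      # sand grain density, kg/m^3
U0 = 1.2            # jet exit velocity, m/s (assumed)
d50 = 0.0011        # median grain size, m (assumed)
b = 0.025           # square nozzle width, m (assumed)

# Densimetric Froude number: jet velocity scaled by the reduced-gravity grain velocity
froude_densimetric = U0 / (g * d50 * (rho_s - rho_w) / rho_w) ** 0.5
grain_to_nozzle = d50 / b

print(f"densimetric Froude number F0 = {froude_densimetric:.1f}")
print(f"grain size-to-nozzle width ratio d50/b = {grain_to_nozzle:.3f}")
```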

Keywords: densimetric froude number, jets, nozzle, sand, scour, tailwater, time

Procedia PDF Downloads 431
3122 A Linear Autoregressive and Non-Linear Regime Switching Approach in Identifying the Structural Breaks Caused by Anti-Speculation Measures: The Case of Hong Kong

Authors: Mengna Hu

Abstract:

This paper examines the impact of an anti-speculation tax policy on trading activities and home price movements in the housing market in Hong Kong. The study focuses on the secondary residential property market, where transactions dominate. The policy intervention substantially raised the transaction cost for speculators as well as for genuine homeowners who dispose of their homes within a certain period. Through the demonstration of structural breaks, our empirical results show that the rise in transaction cost effectively reduced speculative trading activities. However, it accelerated price increases in the small-sized segment by strongly demotivating existing homeowners from trading up to better homes, causing congestion in the lower-end market, where demand from first-time buyers is still strong. In addition, by employing a regime-switching approach, we further show that these unintended consequences are likely to be persistent under this policy together with other strengthened cooling measures.

Keywords: transaction costs, housing market, structural breaks, regime switching

Procedia PDF Downloads 255
3121 The Application of Data Mining Technology in Building Energy Consumption Data Analysis

Authors: Liang Zhao, Jili Zhang, Chongquan Zhong

Abstract:

Energy consumption data, in particular those involving public buildings, are affected by many factors: the building structure, climate/environmental parameters, construction, system operating conditions, and user behavior patterns, and traditional methods of data analysis are insufficient. This paper delves into data mining technology to determine its application in the analysis of building energy consumption data, including energy consumption prediction, fault diagnosis, and optimal operation. Recent literature is reviewed and summarized, the problems faced by data mining technology in the area of energy consumption data analysis are enumerated, and research points for future studies are given.

Keywords: data mining, data analysis, prediction, optimization, building operational performance

Procedia PDF Downloads 848
3120 Effect of Management Compensation and Auditor Reputation on Tax Management in the Listed Banking Companies in Indonesia

Authors: Fahreza, Yudhi Herliansyah, Harnovinsah

Abstract:

This study aims to examine how management compensation and auditor reputation affect corporate tax management in banking, using a sample of banking companies listed on the Indonesia Stock Exchange. First, the study examines the influence of management compensation on the tax management that management may undertake in order to improve the company's performance. Second, it examines the effect of the reputation of the auditor conducting the audit on the implementation of tax management. The population used in this study is the banking companies listed on the Indonesia Stock Exchange. Purposive sampling was used because the samples of this study had to meet certain criteria tailored to the purpose of the study; based on this method, the number of samples in this study is 28. Hypotheses were tested using multiple regression analysis. The results indicate that, at the 5% significance level, management compensation significantly influenced tax management as measured by the book-tax gap proxy, whereas management compensation does not significantly affect tax management as measured by the GAAP effective tax rate proxy. In addition, the auditor's reputation significantly influences tax management as measured by both the book-tax gap and the GAAP effective tax rate.

Keywords: tax management, management compensation, auditor reputation, corporate characteristic

Procedia PDF Downloads 299
3119 Reducing the Negative Effects of Infrastructure Deficit through Continuity in Governance

Authors: Edoghogho Ogbeifun, Charles Mbohwa, J. H. C. Pretorius

Abstract:

An effective infrastructure development scheme, properly planned and executed, has a positive influence on the quantity of infrastructure stock available to meet the immediate and expansion needs of an organization, and contributes to the overall economic development of the nation, community or local entity where the infrastructure is hosted. It is noteworthy, however, that an infrastructure development scheme spans a long time frame, usually longer than the political life of the administration that initiates it. In the majority of circumstances, execution may start and reach different levels of completion; at best, only a limited number of projects are completed and put into functional use during the life of the administration that initiated the scheme. When there is a change in leadership, many of the uncompleted projects are abandoned. The new administration repeats the cycle of its predecessors and develops another set of infrastructure schemes, which suffer a similar fate, thus dotting the landscape with many uncompleted projects and leading to an infrastructure deficit. This cycle will continue unless each succeeding leader sees governance as a single continuum. Therefore, infrastructure projects not completed by one administration should be continued by the succeeding administration in order to increase the stock of relevant infrastructure available for the smooth operation of the organization, enhance the needed development, and reduce the negative effects of the infrastructure deficit. A qualitative single case study method was adopted to investigate the actions of the administrations of three successive Vice-Chancellors of a higher education institution in Nigeria over a longitudinal period of twelve years, with a view to exploring the effects of each administration on the development and execution of infrastructure projects, with particular interest in abandoned projects. The findings revealed that, although two of the Vice-Chancellors were committed to infrastructure upgrades, they executed more new projects than they completed abandoned ones, while the current leader has shown more pragmatism in completing abandoned projects alongside constructing new ones, thus demonstrating the importance of continuity of governance. In this regard, there is a steady increase in the stock of infrastructure to accommodate the expansion of existing academic programmes and host new ones, as well as to reduce the negative effects of the infrastructure deficit caused by abandoned projects.

Keywords: abandoned projects, continuity of governance, infrastructure development scheme, long time frame

Procedia PDF Downloads 178