Search results for: Gaussian process for regression
17582 Linearization and Process Standardization of Construction Design Engineering Workflows
Authors: T. R. Sreeram, S. Natarajan, C. Jena
Abstract:
Civil engineering construction is a network of tasks of varying complexity, and streamlining and standardization are the only way to establish a systemic approach to design. While off-the-shelf tools such as AutoCAD play a role in the realization of a design, the repeatable process in which these tools are deployed is often ignored. The present paper addresses this challenge through a sustainable design process and effective standardization at all stages of the design workflow. This is demonstrated through a case study in the context of construction, and further improvement points are highlighted.
Keywords: system, lean, value stream, process improvement
Procedia PDF Downloads 119
17581 An Empirical Research on Customer Knowledge Management in the Iranian Banks
Authors: Ebrahim Gharleghi
Abstract:
This paper aims to examine how customer knowledge management (CKM) can be implemented in Iranian banks in practice, with a focus on human resources (people), technology, and processes as important factors of CKM. A conceptual model of an analytical CKM strategy for Iranian banks is developed from the findings and a literature review. The article is based on interviews and a distributed questionnaire; data were collected from 260 bank managers. Hypotheses were tested using the one-sample t-test, Pearson correlation analysis, and regression analysis. The tests revealed that the human, technology, and process factors positively and significantly influenced the implementation of CKM practices. These findings tend to corroborate our conceptual model. The human factor was found to affect appropriate CKM implementation more significantly than the other CKM factors, indicating that it is the most important aspect of CKM, and this factor is well developed in Iranian banks. The process factor ranks second and technology last, indicating that technology infrastructure in Iranian banks is weak with respect to CKM implementation. To date there has been little or no empirical evidence on the extent of CKM implementation in Iranian banks; this paper rectifies that imbalance by clarifying the significance of the human, technology, and process factors in CKM implementation.
Keywords: knowledge management, customer relationship management, customer knowledge management, integration, people, technology, process
Procedia PDF Downloads 271
17580 Naïve Bayes: A Classical Approach for the Epileptic Seizures Recognition
Authors: Bhaveek Maini, Sanjay Dhanka, Surita Maini
Abstract:
Electroencephalography (EEG) is used worldwide to identify epileptic seizures. Identifying a seizure through manual EEG analysis is a crucial and demanding task for the neurologist, as it takes considerable effort and time, and the risk of human error is high because signal acquisition requires manual intervention. Disease diagnosis using machine learning (ML) has been explored continuously since the field's inception, and where large numbers of datasets have to be analyzed, ML is a boon for doctors. In this research paper, the authors propose two ML models, logistic regression (LR) and Naïve Bayes (NB), to predict epileptic seizures from general parameters. The two techniques are applied to the epileptic seizures recognition dataset available in the UCI ML repository. The algorithms are implemented with an 80:20 train-test ratio (80% for training and 20% for testing), and model performance is validated by 10-fold cross-validation. The study reports accuracies of 81.87% and 95.49% for LR and NB, respectively.
Keywords: epileptic seizure recognition, logistic regression, Naïve Bayes, machine learning
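As a concrete illustration of the protocol described above, the sketch below applies LR and Gaussian NB with an 80:20 split and 10-fold cross-validation; a synthetic dataset stands in for the UCI epileptic-seizure data, so the numbers will differ from the paper's reported accuracies.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for the UCI epileptic seizure recognition dataset
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# 80:20 train-test ratio, as in the paper
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

results = {}
for name, model in [("LR", LogisticRegression(max_iter=1000)),
                    ("NB", GaussianNB())]:
    model.fit(X_tr, y_tr)
    test_acc = model.score(X_te, y_te)
    cv_acc = cross_val_score(model, X_tr, y_tr, cv=10).mean()  # 10-fold CV
    results[name] = (test_acc, cv_acc)
    print(f"{name}: test accuracy={test_acc:.3f}, 10-fold CV={cv_acc:.3f}")
```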
Procedia PDF Downloads 53
17579 Regional Flood Frequency Analysis in Narmada Basin: A Case Study
Authors: Ankit Shah, R. K. Shrivastava
Abstract:
Floods and droughts are the two main hydrological extremes that affect human life. Floods are natural disasters that cause millions of rupees' worth of damage each year in India and across the world, destroying both life and property. An accurate estimate of the flood damage potential is a key element of an effective, nationwide flood damage abatement program. Moreover, the increase in demand for water due to growth in population, industry, and agriculture has taught us that, though water is a renewable resource, it cannot be taken for granted: its use must be optimized according to circumstances and conditions, and it must be harnessed, which can be done by constructing hydraulic structures. For the safe and proper functioning of hydraulic structures, we need to predict the flood magnitude and its impact. Hydraulic structures play a key role in harnessing and optimizing flood water, which in turn results in the safe and maximum use of the water available. Hydraulic structures are mostly constructed at ungauged sites. Floods can be estimated by two approaches, viz. generation of unit hydrographs and flood frequency analysis; in this study, regional flood frequency analysis is employed. There are many methods for regional flood frequency analysis, viz. the Index Flood Method, the Natural Environment Research Council (NERC) methods, the Multiple Regression Method, etc., but none of them can be considered universal for every situation and location. The Narmada basin is located in Central India and is drained by many tributaries, most of which are ungauged, so it is very difficult to estimate floods on these tributaries and in the main river. In this study, Artificial Neural Networks (ANNs) and the Multiple Regression Method are used to determine the regional flood frequency.
The annual peak flood data of 20 gauging sites in the Narmada basin are used in the present study to determine the regional flood relationships. Homogeneity of the considered sites is determined using the Index Flood Method. The flood relationships obtained by the two methods are compared with each other, and it is found that the ANN is more reliable than the Multiple Regression Method for the present study area.
Keywords: artificial neural network, index flood method, multi-layer perceptrons, multiple regression, Narmada basin, regional flood frequency
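The ANN-versus-multiple-regression comparison can be sketched as follows; the catchment attributes and the power-law flood relation are invented for illustration, not taken from the Narmada data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical attributes for 200 catchments: area (km^2), mean annual rainfall (mm)
area = rng.uniform(50, 5000, 200)
rain = rng.uniform(600, 1600, 200)
# Synthetic annual peak flood following a power law with lognormal scatter
q_peak = 0.5 * area**0.7 * (rain / 1000.0) ** 1.2 * rng.lognormal(0.0, 0.1, 200)

X = np.column_stack([np.log(area), np.log(rain)])
y = np.log(q_peak)

mlr = LinearRegression().fit(X, y)  # multiple regression in log space
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                                 random_state=0)).fit(X, y)
print("MLR R^2:", round(mlr.score(X, y), 3))
print("ANN R^2:", round(ann.score(X, y), 3))
```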
Procedia PDF Downloads 415
17578 New Approach for Load Modeling
Authors: Slim Chokri
Abstract:
Load forecasting is one of the central functions in power system operations. Electricity cannot be stored, which means that an electric utility must estimate future demand in order to manage production and purchasing in an economically reasonable way. A majority of recently reported approaches are based on neural networks, their attraction lying in the assumption that neural networks are able to learn properties of the load. However, the development of these methods is not finished, and the lack of comparative results on different model variations is a problem. This paper presents a new approach to predicting the daily peak load of Tunisia. The proposed method employs a computational intelligence scheme based on a fuzzy neural network (FNN) and support vector regression (SVR). Experimental results indicate that the proposed FNN-SVR technique gives significantly better prediction accuracy than some classical techniques.
Keywords: neural network, load forecasting, fuzzy inference, machine learning, fuzzy modeling and rule extraction, support vector regression
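The SVR stage can be sketched on synthetic daily peak-load data (the fuzzy neural network stage is omitted); the weekly and annual cycles, the lag features, and all parameter values here are assumptions for illustration only.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
days = np.arange(730)
# Synthetic daily peak load (MW): weekly cycle + annual cycle + noise
load = (1000.0 + 100.0 * np.sin(2 * np.pi * days / 7)
        + 200.0 * np.sin(2 * np.pi * days / 365)
        + rng.normal(0.0, 20.0, days.size))

# Features: the peak loads of the previous 7 days predict today's peak
lags = 7
X = np.array([load[t - lags:t] for t in range(lags, load.size)])
y = load[lags:]

split = 600
y_mean, y_std = y[:split].mean(), y[:split].std()
svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
svr.fit(X[:split], (y[:split] - y_mean) / y_std)  # train on standardized target

pred = svr.predict(X[split:]) * y_std + y_mean
ss_res = ((y[split:] - pred) ** 2).sum()
ss_tot = ((y[split:] - y[split:].mean()) ** 2).sum()
print("out-of-sample R^2:", round(1 - ss_res / ss_tot, 3))
```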
Procedia PDF Downloads 431
17577 Agriculture Yield Prediction Using Predictive Analytic Techniques
Authors: Nagini Sabbineni, Rajini T. V. Kanth, B. V. Kiranmayee
Abstract:
India’s economy depends primarily on agricultural yield growth and the allied agro-industry products. Yield prediction is among the toughest tasks facing agricultural departments across the globe, because yield depends on many factors; in countries like India in particular, the majority of agricultural growth depends on rainwater, which is highly unpredictable. Relevant parameters include water, nitrogen, weather, soil characteristics, crop rotation, soil moisture, surface temperature, and rainfall. In our paper, extensive exploratory data analysis is performed and various predictive models are designed. Regression models, namely linear, multiple linear, and non-linear models, are then tested for effective prediction or forecasting of the agricultural yield of various crops in the states of Andhra Pradesh and Telangana.
Keywords: agriculture yield growth, agriculture yield prediction, explorative data analysis, predictive models, regression models
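A minimal sketch of the linear versus non-linear regression comparison, using invented rainfall and nitrogen inputs with a diminishing-returns yield response; none of these numbers come from the Andhra Pradesh or Telangana data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)
# Hypothetical inputs: seasonal rainfall (mm) and nitrogen applied (kg/ha)
rain = rng.uniform(300, 1200, 300)
nitro = rng.uniform(0, 200, 300)
# Synthetic yield (t/ha) with diminishing returns to nitrogen
yld = (1.5 + 0.002 * rain + 0.02 * nitro - 5e-5 * nitro**2
       + rng.normal(0.0, 0.2, 300))

X = np.column_stack([rain, nitro])
linear = LinearRegression().fit(X, yld)           # multiple linear regression
quad = make_pipeline(PolynomialFeatures(degree=2),
                     LinearRegression()).fit(X, yld)  # simple non-linear model
print("linear    R^2:", round(linear.score(X, yld), 3))
print("quadratic R^2:", round(quad.score(X, yld), 3))
```

Because the quadratic feature set contains the linear one, its training fit can only improve on the linear model; the gap indicates how much non-linearity the yield response carries.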
Procedia PDF Downloads 308
17576 Using Linear Logistic Regression to Evaluate the Patient and System Delay and Effective Factors in Mortality of Patients with Acute Myocardial Infarction
Authors: Firouz Amani, Adalat Hoseinian, Sajjad Hakimian
Abstract:
Background: Mortality due to myocardial infarction (MI) often occurs during the first hours after symptom onset, so a timely visit to the hospital to receive the necessary treatment can be effective in decreasing the mortality rate. The aim of this study was to investigate the impact of the relevant factors on the mortality of MI patients using linear logistic regression. Materials and Methods: In this case-control study, all patients with acute MI who were referred to the Ardabil city hospital were studied. All patients who died were considered the case group (n=27), and 27 matched patients without acute MI were selected as the control group. Data were collected for all patients in both groups with the same checklist and then analyzed with SPSS version 24 software using statistical methods; a linear logistic regression model was used to determine the factors affecting the mortality of MI patients. Results: The mean age of patients in the case group was significantly higher than in the control group (75.1±11.7 vs. 63.1±11.6, p=0.001). The history of non-cardiac diseases in the case group, at 44.4%, was significantly higher than in the control group, at 7.4% (p=0.002). The number of performed PCIs in the case group, at 40.7%, was significantly lower than in the control group, at 74.1% (p=0.013). The time between hospital admission and PCI in the case group, at 110.9 min, was significantly longer than in the control group, at 56 min (p=0.001). The mean delay from symptom onset to hospital admission (patient delay) and the mean delay from hospital admission to treatment (system delay) were similar between the two groups. Using the logistic regression model, we found that a history of non-cardiac diseases (OR=283) and the number of performed PCIs (OR=24.5) had a significant impact on the mortality of MI patients compared to the other factors.
Conclusion: The results of this study showed that, of all the studied factors, the number of performed PCIs, history of non-cardiac illness, and the interval between symptom onset and PCI have a significant relation with the mortality of MI patients, while the other factors were not meaningful. Further studies with a large sample investigating other factors such as smoking, weather, etc. are recommended.
Keywords: acute MI, mortality, heart failure, arrhythmia
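The odds-ratio logic of such a logistic model can be sketched with simulated (not patient) data; the predictor names and effect sizes below are invented to mimic only the direction of the reported findings.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2000
# Hypothetical binary predictors: disease history (1=yes) and PCI performed (1=yes)
history = rng.integers(0, 2, n)
pci = rng.integers(0, 2, n)
# Simulated mortality: more likely with disease history, less likely after PCI
logit = -1.0 + 1.5 * history - 1.2 * pci
death = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([history, pci])
clf = LogisticRegression(C=1e6).fit(X, death)  # large C: effectively no penalty
odds_ratio = np.exp(clf.coef_[0])              # OR = exp(coefficient)
print("OR(history) =", round(odds_ratio[0], 2),
      " OR(PCI) =", round(odds_ratio[1], 2))
```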
Procedia PDF Downloads 118
17575 Crude Distillation Process Simulation Using Unisim Design Simulator
Authors: C. Patrascioiu, M. Jamali
Abstract:
The paper deals with the simulation of the crude distillation process using the Unisim Design simulator. The necessity of simulating this process is argued both by considerations related to the design of the crude distillation column and by considerations related to the design of advanced control systems. In order to use Unisim Design to simulate the crude distillation process, the simulators used in Romania were identified and the PRO/II, HYSYS, and Aspen HYSYS simulators were analyzed. This analysis allowed the authors to draw conclusions about the successful modelling of crude oil. A first contribution of the authors is the implementation of specific petroleum liquid-vapor equilibrium problems in the Unisim Design simulator. The second major element of the article is the development of the methodology and the elaboration of the simulation program for the crude distillation process using Unisim Design resources. The results obtained validate the proposed methodology and will allow dynamic simulation of the process.
Keywords: crude oil, distillation, simulation, Unisim Design, simulators
Procedia PDF Downloads 242
17574 Islamic Equity Markets Response to Volatility of Bitcoin
Authors: Zakaria S. G. Hegazy, Walid M. A. Ahmed
Abstract:
This paper examines the dependence structure of Islamic stock markets on Bitcoin’s realized volatility components in bear, normal, and bull market periods. A quantile regression approach is employed, after adjusting raw returns with respect to a broad set of relevant global factors and accounting for structural breaks in the data. The results reveal that upside volatility tends to exert negative influences on Islamic developed-market returns more in bear than in bull market conditions, while downside volatility positively affects returns during bear and bull conditions. For emerging markets, we find that the upside (downside) component exerts lagged negative (positive) effects on returns in bear (all) market regimes. By and large, the dependence structures turn out to be asymmetric. Our evidence provides essential implications for investors.
Keywords: cryptocurrency markets, bitcoin, realized volatility measures, asymmetry, quantile regression
Procedia PDF Downloads 183
17573 Establishing a Surrogate Approach to Assess the Exposure Concentrations during Coating Process
Authors: Shan-Hong Ying, Ying-Fang Wang
Abstract:
A surrogate approach was deployed for assessing exposures to multiple chemicals in a selected working area of a coating process, and it was then applied to assess the exposure concentrations of similarly exposed groups using the same chemicals but different formula ratios. In the selected area, 6 to 12 portable photoionization detectors (PIDs) were placed uniformly in the workplace to measure total VOC concentrations (CT-VOC) over 6 randomly selected work shifts. Simultaneously, a sampling train was placed beside one of these portable PIDs, and the collected air sample was analyzed for the individual concentrations (CVOCi) of 5 VOCs (xylene, butanone, toluene, butyl acetate, and dimethylformamide). Predictive models were established by relating CT-VOC to the CVOCi of each individual compound via simple regression analysis. The established predictive models were then employed to predict each CVOCi from the CT-VOC measured with the same portable PID in each similar working area. Results show that the predictive models obtained from simple linear regression analyses had R² = 0.83–0.99, indicating that CT-VOC was adequate for predicting CVOCi. In order to verify the validity of the exposure prediction model, sampling analysis of the above chemical substances was further carried out, and the correlation between the measured value (Cm) and the predicted value (Cp) was analyzed. A good correlation was found between the predicted and measured values of each chemical substance (R² = 0.83–0.98). Therefore, the surrogate approach can be used to assess the exposure concentrations of similarly exposed groups using the same chemicals but different formula ratios.
However, it is recommended to establish the prediction model between the chemical substances used with each coater and the direct-reading PID, which is more representative of the actual exposure situation and more accurate for estimating the long-term exposure concentrations of operators.
Keywords: exposure assessment, exposure prediction model, surrogate approach, TVOC
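The per-compound predictive models reduce to simple linear regressions of CVOCi on CT-VOC; the sketch below uses fabricated PID readings and formula shares (the real shares would come from the coating formula, not these assumed values).

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
# Synthetic total-VOC readings from a portable PID (ppm), 40 samples
ct_voc = rng.uniform(1.0, 20.0, 40).reshape(-1, 1)

# Hypothetical formula shares: each VOC is a roughly fixed fraction of the total
shares = {"toluene": 0.30, "xylene": 0.20, "butanone": 0.15}

r2 = {}
for compound, share in shares.items():
    c_i = share * ct_voc.ravel() * rng.normal(1.0, 0.05, ct_voc.size)  # CVOCi
    model = LinearRegression().fit(ct_voc, c_i)  # CVOCi ~ CT-VOC
    r2[compound] = model.score(ct_voc, c_i)
    print(f"{compound}: slope={model.coef_[0]:.3f}, R^2={r2[compound]:.3f}")
```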
Procedia PDF Downloads 146
17572 Parallel Fuzzy Rough Support Vector Machine for Data Classification in Cloud Environment
Authors: Arindam Chaudhuri
Abstract:
Classification of data has been actively used as one of the most effective and efficient means of conveying knowledge and information to users. The primary focus has always been on techniques for extracting useful knowledge from data such that returns are maximized. With the emergence of huge datasets, existing classification techniques often fail to produce desirable results; the challenge lies in analyzing and understanding the characteristics of massive data sets by retrieving useful geometric and statistical patterns. We propose a supervised parallel fuzzy rough support vector machine (PFRSVM) for data classification in a cloud environment. Classification is performed by PFRSVM using a hyperbolic tangent kernel. The fuzzy rough set model takes care of the sensitivity of noisy samples and handles impreciseness in the training samples, bringing robustness to the results. The membership function is a function of the center and radius of each class in feature space and is represented with the kernel; it plays an important role in sampling the decision surface. The success of PFRSVM is governed by choosing appropriate parameter values. The training samples are either linearly or nonlinearly separable, and different input points make unique contributions to the decision surface. The algorithm is parallelized with a view to reducing training times: the system is built on a support vector machine library using the Hadoop implementation of MapReduce. The algorithm is tested on large data sets to check its feasibility and convergence, and the performance of the classifier is also assessed in terms of the number of support vectors. The challenges encountered in implementing big data classification in machine learning frameworks are also discussed. The experiments were done in the cloud environment available at the University of Technology and Management, India, and the results are illustrated for Gaussian RBF and Bayesian kernels.
The effect of variability in prediction and generalization of PFRSVM is examined with respect to values of the parameter C. The method effectively resolves the effects of outliers and the problems of imbalanced and overlapping classes, generalizes to unseen data, and relaxes the dependency between features and labels. The average classification accuracy of PFRSVM is better than that of other classifiers for both Gaussian RBF and Bayesian kernels. The experimental results on both synthetic and real data sets clearly demonstrate the superiority of the proposed technique.
Keywords: FRSVM, Hadoop, MapReduce, PFRSVM
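A single-machine sketch of the fuzzy-membership idea (not the Hadoop/MapReduce parallelization): memberships computed from each class's center and radius are passed to a kernel SVM as sample weights, so noisy points far from their class center contribute less to the decision surface. An RBF kernel is used here for stability in place of the paper's hyperbolic tangent kernel.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Toy dataset with 10% label noise to mimic noisy training samples
X, y = make_classification(n_samples=400, n_features=5, flip_y=0.1,
                           random_state=6)

# Fuzzy membership from each class's center and radius, per the abstract
weights = np.empty(y.size)
for label in np.unique(y):
    idx = y == label
    center = X[idx].mean(axis=0)
    dist = np.linalg.norm(X[idx] - center, axis=1)
    weights[idx] = 1.0 - dist / (dist.max() + 1e-3)  # close to center -> near 1

clf = SVC(kernel="rbf").fit(X, y, sample_weight=weights)
print("training accuracy:", round(clf.score(X, y), 3))
print("support vectors:", clf.n_support_.sum())
```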
Procedia PDF Downloads 486
17571 Estimation of Dynamic Characteristics of a Middle Rise Steel Reinforced Concrete Building Using Long-Term
Authors: Fumiya Sugino, Naohiro Nakamura, Yuji Miyazu
Abstract:
In the earthquake-resistant design of buildings, evaluation of vibration characteristics is important. In recent years, with the increase in super high-rise buildings, evaluating the response of not only the first mode but also the higher modes has become important, yet knowledge of the vibration characteristics of buildings is mostly limited to the first mode and knowledge of the higher modes is still insufficient. In this paper, the characteristics of the first and second modes of an SRC building were studied from earthquake observation records by applying a frequency filter to an ARX model. First, we studied the change of the eigenfrequency and the damping ratio during the 3.11 earthquake. The eigenfrequency gradually decreases from the time of earthquake occurrence and becomes almost stable after about 150 seconds; at this point, the decreasing rates of the 1st and 2nd eigenfrequencies are both about 0.7. Although the damping ratio has a larger error than the eigenfrequency, both the 1st and 2nd damping ratios are 3 to 5%. There is also a strong correlation between the 1st and 2nd eigenfrequencies, with regression line y=3.17x; for the damping ratio, the regression line is y=0.90x, so the 1st and 2nd damping ratios are approximately the same. Next, we studied the eigenfrequency and damping ratio over the long term, from 1998 to 2014, with all the considered earthquakes connected in order of occurrence. The eigenfrequency slowly declined from immediately after completion of the building and tended to stabilize after several years, although it declined greatly after the 3.11 earthquake; the decreasing rates of both the 1st and 2nd eigenfrequencies until about 7 years later are about 0.8. The 1st and 2nd damping ratios are both about 1 to 6%; after the 3.11 earthquake, the 1st increased by about 1% and the 2nd by less than 1%.
For the long-term eigenfrequency, there is again a strong correlation between the 1st and 2nd, with regression line y=3.17x; for the damping ratio, the regression line is y=1.01x, so the 1st and 2nd damping ratios are approximately the same. Based on the above results, the changes in eigenfrequency and damping ratio are summarized as follows. Over the long term, both the 1st and 2nd eigenfrequencies gradually declined from immediately after completion, tended to stabilize after a few years, and declined further after the 3.11 earthquake; there is a strong correlation between the 1st and 2nd, and their declining times and decreasing rates are of the same degree. The 1st and 2nd damping ratios are both about 1 to 6% over the long term; after the 3.11 earthquake, the 1st increased by about 1% and the 2nd by less than 1%, and the two remain approximately the same.
Keywords: eigenfrequency, damping ratio, ARX model, earthquake observation records
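The modal-identification step can be sketched with the autoregressive part of an ARX model: fit AR(2) coefficients to a simulated response by least squares, then recover the eigenfrequency and damping ratio from the fitted pole. The true values (2 Hz, 3% damping) and the noise-driven response are invented for the demonstration, not taken from the building's records.

```python
import numpy as np

# Simulate a 1-DOF structural response (true f = 2.0 Hz, zeta = 0.03)
dt, f_true, z_true = 0.02, 2.0, 0.03
w = 2 * np.pi * f_true
wd = w * np.sqrt(1 - z_true**2)
a1 = 2 * np.exp(-z_true * w * dt) * np.cos(wd * dt)  # exact AR(2) coefficients
a2 = -np.exp(-2 * z_true * w * dt)

rng = np.random.default_rng(7)
y = np.zeros(5000)
e = rng.normal(0.0, 1e-3, y.size)  # innovation (excitation) sequence
for t in range(2, y.size):
    y[t] = a1 * y[t - 1] + a2 * y[t - 2] + e[t]

# Fit AR(2) coefficients by least squares (the AR part of an ARX model)
A = np.column_stack([y[1:-1], y[:-2]])
coef, *_ = np.linalg.lstsq(A, y[2:], rcond=None)

# Recover frequency and damping from the complex pole of the fitted model
pole = np.roots([1.0, -coef[0], -coef[1]])[0]
s = np.log(pole) / dt              # continuous-time pole s = -zeta*w +/- i*wd
f_est = abs(s) / (2 * np.pi)
z_est = -s.real / abs(s)
print(f"estimated f = {f_est:.3f} Hz, zeta = {z_est:.4f}")
```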
Procedia PDF Downloads 213
17570 Impact of Positive Psychology Education and Interventions on Well-Being: A Study of Students Engaged in Pastoral Care
Authors: Inna R. Edara, Haw-Lin Wu
Abstract:
Positive psychology investigates human strengths and virtues and promotes well-being. Relying on this assumption, positive interventions have been continuously designed to build pleasure and happiness, joy and contentment, engagement and meaning, hope and optimism, satisfaction and gratitude, spirituality, and various other positive measures of well-being. In line with this model of positive psychology and interventions, this study investigated certain measures of well-being in a group of 45 students enrolled in an 18-week positive psychology course and simultaneously engaged in service-oriented interventions that they chose for themselves based on the course content and individual interests. Students’ well-being was measured at the beginning and end of the course; the indicators included positive automatic thoughts, optimism and hope, satisfaction with life, and spirituality. A paired-samples t-test conducted to evaluate the impact of the class content and service-oriented interventions on students’ well-being indicators showed a statistically significant increase from pre-class to post-class scores. There were also significant gender differences in post-course well-being scores, with females having higher levels of well-being than males, and a two-way between-groups analysis of variance indicated a significant age-by-gender interaction effect on the post-course well-being scores, with females in the 56-65 age group having the highest well-being scores in comparison to males in the same age group. Regression analyses indicated that positive automatic thoughts significantly predicted hope and satisfaction with life in the pre-course analysis; in the post-course analysis, spiritual transcendence made a significant contribution to optimism, and positive automatic thoughts made significant contributions to both hope and satisfaction with life.
Finally, a test between pre-course and post-course regression coefficients indicated that the two sets of coefficients were significantly different, suggesting that the positive psychology course and the interventions were helpful in raising levels of well-being. The overall results suggest a substantial increase in the participants’ well-being scores after engaging in the positive-oriented interventions, implying a need to design more positive interventions in education to promote well-being.
Keywords: hope, optimism, positive automatic thoughts, satisfaction with life, spirituality, well-being
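The pre/post comparison at the heart of the design can be sketched as a paired-samples t-test; the scores below are simulated with an assumed post-course gain, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n = 45  # number of students in the study
# Hypothetical well-being scores on a 1-7 scale, before and after the course
pre = rng.normal(4.2, 0.8, n)
post = pre + rng.normal(0.5, 0.6, n)  # assumed average gain of 0.5

t_stat, p_value = stats.ttest_rel(post, pre)  # paired-samples t-test
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```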
Procedia PDF Downloads 212
17569 The Effect of Sustainable Land Management Technologies on Food Security of Farming Households in Kwara State, Nigeria
Authors: Shehu A. Salau, Robiu O. Aliu, Nofiu B. Nofiu
Abstract:
Nigeria is among the countries of the world confronted with the problem of food insecurity, and the agricultural production systems that produce food for its teeming population are not sustainable. Attention is thus being given to alternative approaches to intensification, such as the use of Sustainable Land Management (SLM) technologies. This study therefore assessed the effect of SLM technologies on the food security of farming households in Kwara State, Nigeria. A three-stage sampling technique was used to select a sample of 200 farming households. Descriptive statistics, the Shriar index, a Likert scale, a food security index, and logistic regression were employed for the analysis. The results indicate that the majority (41%) of household heads were between 51 and 70 years of age, with an average of 60.5 years. The food security index revealed that 35% of the households were food secure and 65% food insecure. The logistic regression showed that SLM technologies, estimated income, household size, and the gender and age of the household head were the critical determinants of food security among farming households. The most effective coping strategies adopted by households to lessen the effects of food insecurity were reducing the quality of food consumed, taking off-farm jobs to raise household income, and diverting money budgeted for other uses to purchase food. Governments should encourage the adoption and use of SLM technologies at all levels, and policies and strategies that reduce household size should be pursued enthusiastically to reduce food insecurity.
Keywords: agricultural practices, coping strategies, farming households, food security, SLM technologies, logistic regression
Procedia PDF Downloads 167
17568 Guidelines for the Management Process Development of Research Journals in Order to Develop Suan Sunandha Rajabhat University to International Standards
Authors: Araya Yordchim, Rosjana Chandhasa, Suwaree Yordchim
Abstract:
This research aims to study guidelines for the development of the management process for research journals in order to develop Suan Sunandha Rajabhat University to international standards. It investigated influencing elements ranging from the format of the article, the evaluation form for research article quality, and the process of creating a scholarly journal, to the satisfaction level of those with the knowledge and competency to conduct research, the problems that arose, and their solutions. Drawing on a sample of 40 persons who had the knowledge and competency to conduct research and create scholarly journal articles at an international level, data were collected using questionnaires and analyzed with computer software using frequency, percentage, mean, standard deviation, and multiple regression analysis. The majority of participants were civil servants with a doctorate degree, followed by civil servants with a master's degree. The suitability of the article format was rated at a good level, as was the evaluation form for research article quality. Based on participants' viewpoints, the process of creating scholarly journals was at a good level, while the satisfaction of those with the knowledge and competency to conduct research was at a satisfactory level. The main problem encountered was difficulty in accessing the website; the proposed solution was to develop a website with user-friendly accessibility, including setting up a Google Scholar profile so that references can be counted and articles used for reference in real time. The research article format influenced the satisfaction of those with the knowledge and competency to conduct research, with statistical significance at the 0.01 level.
The research article quality assessment form (preface section, research article writing section, preparation of research article manuscripts section, and the original article evaluation form for the author) affected the satisfaction of those with the knowledge and competency to conduct research, with statistical significance at the 0.01 level. The process of establishing journals had an impact on the satisfaction of those with the knowledge and ability to conduct research, with statistical significance at the 0.05 level.
Keywords: guidelines, development of management, research journals, international standards
Procedia PDF Downloads 120
17567 The Factors Predicting Credibility of News in Social Media in Thailand
Authors: Ekapon Thienthaworn
Abstract:
This research aims to study the factors predicting the credibility of news in social media, using survey research methods with questionnaires. The sample was 400 undergraduate students in Bangkok, selected by multi-stage random sampling, and the data were analyzed with descriptive statistics and multivariate regression analysis. The research found overall trust in news read on social media to be at an intermediate level. The multivariate regression analysis showed that content was the only factor that significantly predicted the credibility of news for undergraduate students in Bangkok reading news on social media (significance level 0.05). Content thus has the highest influence among the factors predicting the credibility of news in social media, while speed is also important to the credibility of the news.
Keywords: credibility of news, behaviors and attitudes, social media, web board
Procedia PDF Downloads 465
17566 Evaluation of the Effect of IMS on the Social Responsibility in the Oil and Gas Production Companies of National Iranian South Oil Fields Company (NISOC)
Authors: Kamran Taghizadeh
Abstract:
This study aimed to evaluate the effect of an integrated management system (IMS), comprising the occupational health, environmental management, and safety and health systems, on social responsibility (a case study of NISOC's oil and gas production companies). The objectives include evaluating the IMS situation and its effect on social responsibility, and providing appropriate solutions based on the study's hypotheses as a basis for future work. Data collection was carried out through library and field studies as well as a questionnaire. Stratified random sampling was used, and the sample of 285 employees together with the questionnaire data was analyzed with inferential statistics using SPSS software. The results of the regression and the fitted model, at a significance level of 5%, confirmed all hypotheses, meaning that the IMS and its components have a significant effect on social responsibility.
Keywords: social responsibility, integrated management, oil and gas production companies, regression
Procedia PDF Downloads 250
17565 Machine Learning Techniques for Estimating Ground Motion Parameters
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site condition. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and, potentially, subsequent risk assessment of different types of structures. Typically, linear regression-based models with pre-defined equations and coefficients are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques, such as Artificial Neural Networks, Random Forests, and Support Vector Machines, as statistical methods in ground motion prediction. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events with magnitudes 3 to 5.8, recorded over a hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The choice of this database stems from the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states.
Accuracy of the models in predicting intensity measures, generalization capability of the models for future data, as well as usability of the models are discussed in the evaluation process. The results indicate the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and, in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data are available. Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine
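A minimal sketch of the random forest approach the abstract compares against linear regression is shown below. The functional form used to generate the synthetic ln(PGA) values is a made-up stand-in for a real ground-motion model (magnitude scaling, geometric spreading, a site term), not the authors' actual data or pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Illustrative sketch: predict a ground-motion intensity measure from
# magnitude, hypocentral distance, and an assumed site-condition proxy.
rng = np.random.default_rng(42)
n = 2000

mag = rng.uniform(3.0, 5.8, n)       # magnitude range from the abstract
dist = rng.uniform(4.0, 500.0, n)    # hypocentral distance, km
vs30 = rng.uniform(200.0, 800.0, n)  # hypothetical Vs30 site proxy

# Synthetic ln(PGA): magnitude scaling, geometric spreading, site effect,
# plus noise standing in for aleatory variability.
ln_pga = (1.2 * mag - 1.1 * np.log(dist) - 0.002 * dist
          - 0.3 * np.log(vs30 / 500.0) + rng.normal(0.0, 0.2, n))

X = np.column_stack([mag, dist, vs30])
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, ln_pga)
print(f"training R^2: {model.score(X, ln_pga):.3f}")
```

Note that the forest learns the magnitude and distance dependencies from the data alone, with no pre-defined equation, which is the property the abstract highlights.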
Procedia PDF Downloads 121
17564 Using Machine Learning to Classify Different Body Parts and Determine Healthiness
Authors: Zachary Pan
Abstract:
Our general mission is to solve the problem of classifying images into different body part types and deciding if each of them is healthy or not. However, for now, we will determine healthiness for only one-sixth of the body parts, specifically the chest. We will detect pneumonia in X-ray scans of those chest images. With this type of AI, doctors can use it as a second opinion when they are taking CT or X-ray scans of their patients. Another advantage of using this machine learning classifier is that it has no human weaknesses like fatigue. The overall approach to this problem is to split the problem into two parts: first, classify the image, then determine if it is healthy. In order to classify the image into a specific body part class, the body parts dataset must be split into test and training sets. We can then use many models, like neural networks or logistic regression models, and fit them using the training set. Then, using the test set, we can obtain a realistic estimate of the accuracy the models will have on images in the real world, since these testing images have never been seen by the models before. In order to increase this testing accuracy, we can also apply many complex algorithms to the models, like multiplicative weight update. For the second part of the problem, to determine if the body part is healthy, we can have another dataset consisting of healthy and non-healthy images of the specific body part and once again split that into test and training sets. We then use another neural network to train on those training set images and use the testing set to figure out its accuracy. We will do this process only for the chest images. A major conclusion reached is that convolutional neural networks are the most reliable and accurate at image classification.
In classifying the images, the logistic regression model, the neural network, neural networks with multiplicative weight update, neural networks with the black box algorithm, and the convolutional neural network achieved 96.83 percent accuracy, 97.33 percent accuracy, 97.83 percent accuracy, 96.67 percent accuracy, and 98.83 percent accuracy, respectively. On the other hand, the overall accuracy of the model that determines if the images are healthy or not is around 78.37 percent. Keywords: body part, healthcare, machine learning, neural networks
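The train/test workflow the abstract describes (fit on a training split, report accuracy on images the model has never seen) can be sketched with logistic regression on scikit-learn's bundled 8x8 digit images, used here only as a public stand-in for the body-part dataset, which is not available.

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in image classification: 8x8 digit images instead of body parts.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# Fit on the training split only; the held-out split estimates real-world
# accuracy, since the model has never seen those images.
clf = LogisticRegression(max_iter=5000)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")
```

A convolutional network, which the paper found most accurate, would replace the logistic regression model here but follow the same split-fit-evaluate pattern.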
Procedia PDF Downloads 98
17563 Optimal Control of Volterra Integro-Differential Systems Based on Legendre Wavelets and Collocation Method
Authors: Khosrow Maleknejad, Asyieh Ebrahimzadeh
Abstract:
In this paper, the numerical solution of the optimal control problem (OCP) for systems governed by a Volterra integro-differential (VID) equation is considered. The method is developed by means of Legendre wavelet approximation and the collocation method. The properties of Legendre wavelets, together with the Gaussian integration method, are utilized to reduce the problem to the solution of a nonlinear programming one. Some numerical examples are given to confirm the accuracy and ease of implementation of the method. Keywords: collocation method, Legendre wavelet, optimal control, Volterra integro-differential equation
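The Gaussian integration step the abstract relies on uses quadrature at Legendre nodes. As a minimal, self-contained illustration (not the paper's full wavelet scheme), an 8-point Gauss-Legendre rule on [-1, 1] integrates cos(x) essentially exactly; the true value is 2 sin(1).

```python
import numpy as np

# 8-point Gauss-Legendre quadrature on [-1, 1]: exact for polynomials up to
# degree 15, and extremely accurate for smooth integrands like cos(x).
nodes, weights = np.polynomial.legendre.leggauss(8)
approx = np.sum(weights * np.cos(nodes))
exact = 2.0 * np.sin(1.0)
print(approx, exact)
```

In the paper's setting, the same rule evaluates the Volterra integral terms at collocation points, turning the OCP into a finite nonlinear program.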
Procedia PDF Downloads 385
17562 Determining the Factors Affecting Social Media Addiction (Virtual Tolerance, Virtual Communication), Phubbing, and Perception of Addiction in Nurses
Authors: Fatima Zehra Allahverdi, Nukhet Bayer
Abstract:
Objective: Three questions were formulated to examine nurses in stressful working units (intensive care units, emergency units), utilizing self-perception theory and social support theory. This study provides a distinctive input by inspecting the combination of variables regarding stressful working environments. Method: The descriptive research was conducted with the participation of 400 nurses working at Ankara City Hospital. The study used Multivariate Analysis of Variance (MANOVA), regression analysis, and a mediation model. Hypothesis one used MANOVA followed by a Scheffe post hoc test. Hypothesis two utilized regression analysis using a hierarchical linear regression model. Hypothesis three used a mediation model. Results: Findings supported the hypotheses that intensive care units have significantly high scores in virtual communication and virtual tolerance. The number of years on the job, virtual communication, virtual tolerance, and phubbing significantly predicted 51% of the variance in perception of addiction. Interestingly, the number of years on the job, while significant, was negatively related to perception of addiction. Conclusion: The reasoning behind these findings, and the lack of significance in the emergency unit, is discussed. Around 7% of the variance in phubbing was accounted for by working in intensive care units. The model accounted for 26.80% of the differences in the perception of addiction. Keywords: phubbing, social media, working units, years on the job, stress
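The hierarchical (blockwise) regression used for hypothesis two can be sketched as follows: enter years on the job as a first block, then add the social-media variables and compare R-squared. All numbers below are simulated under the sign pattern the study reports (a negative years-on-the-job effect); none are the authors' estimates.

```python
import numpy as np

# Synthetic data for a hierarchical regression sketch; variable names follow
# the abstract, values are invented.
rng = np.random.default_rng(1)
n = 400  # matching the study's 400 nurses

years = rng.uniform(1, 30, n)
virtual_comm = rng.normal(0, 1, n)
virtual_tol = rng.normal(0, 1, n)
phubbing = rng.normal(0, 1, n)

# Assumed process: perception of addiction rises with the social-media
# variables and falls with years on the job, as the study found.
perception = (-0.05 * years + 0.6 * virtual_comm + 0.5 * virtual_tol
              + 0.4 * phubbing + rng.normal(0, 1, n))

def r_squared(X, y):
    """R^2 of an OLS fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

# Block 1: years on the job alone; block 2: add the social-media predictors.
r2_block1 = r_squared(years.reshape(-1, 1), perception)
r2_block2 = r_squared(
    np.column_stack([years, virtual_comm, virtual_tol, phubbing]), perception)
print(f"block 1 R^2: {r2_block1:.3f}, block 2 R^2: {r2_block2:.3f}, "
      f"delta R^2: {r2_block2 - r2_block1:.3f}")
```

The increase in R-squared from block 1 to block 2 is the quantity a hierarchical regression reports as the incremental contribution of the added variables.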
Procedia PDF Downloads 51
17561 The Role Played by Awareness and Complexity through the Use of a Logistic Regression Analysis
Authors: Yari Vecchio, Margherita Masi, Jorgelina Di Pasquale
Abstract:
Adoption of Precision Agriculture (PA) takes place in a multidimensional and complex scenario. The process of adopting innovations is inherently complex and social, influenced by other producers, change agents, social norms, and organizational pressure. Complexity depends on factors that interact and influence the decision to adopt. Farm and operator characteristics, as well as the organizational, informational, and agro-ecological context, directly affect adoption. This influence has been studied to measure drivers and to clarify 'bottlenecks' in the adoption of agricultural innovation. The decision-making process involves a multistage procedure, in which an individual passes from first hearing about the technology to final adoption. Awareness is the initial stage and represents the moment at which an individual learns of the existence of the technology. The 'static' concept of adoption has thus been overcome: awareness is a precondition to adoption. Recognizing this avoids the erroneous evaluations that arise from carrying out the analysis on a population that is only partly aware of the technologies. In support of this, the present study puts forward an empirical analysis among Italian farmers, considering awareness as a prerequisite for adoption. The purpose of the present work is to analyze both the factors that affect the probability to adopt and the determinants that drive an aware individual not to adopt. Data were collected through a questionnaire submitted in November 2017. A preliminary descriptive analysis has shown that high levels of adoption were found among younger farmers, better educated, with high intensity of information, with large farm size and high labor intensity, and whose perception of the complexity of the adoption process is lower. The use of a logit model makes it possible to appreciate the weight played by labor intensity and the complexity perceived by the potential adopter in the PA adoption process.
All these findings suggest important policy implications: measures dedicated to promoting innovation will need to be more specific for each phase of the adoption process. Specifically, they should increase awareness of PA tools and foster the dissemination of information to reduce the degree of perceived complexity of the adoption process. These implications are particularly important in Europe, where a reform of the Common Agricultural Policy oriented to innovation has been pre-announced. In this context, they suggest that measures supporting innovation should consider the relationship between the various organizational and structural dimensions of European agriculture and innovation approaches. Keywords: adoption, awareness, complexity, precision agriculture
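The logit model at the heart of the analysis can be sketched as below: a binary adoption outcome regressed on labor intensity and perceived complexity. The data are simulated under assumed coefficient signs (positive for labor intensity, negative for perceived complexity, matching the abstract's findings), not the authors' estimates.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated adoption data among aware farmers; coefficients are invented.
rng = np.random.default_rng(7)
n = 500

labor_intensity = rng.normal(0, 1, n)
perceived_complexity = rng.normal(0, 1, n)

# Assumed logit: adoption is more likely with higher labor intensity and
# less likely with higher perceived complexity.
logit = 1.5 * labor_intensity - 1.2 * perceived_complexity
p_adopt = 1.0 / (1.0 + np.exp(-logit))
adopted = rng.binomial(1, p_adopt)

X = np.column_stack([labor_intensity, perceived_complexity])
model = LogisticRegression().fit(X, adopted)
print(np.round(model.coef_, 2))  # signs should match the simulated effects
```

Restricting the estimation sample to aware individuals, as the paper advocates, is what distinguishes this from a naive logit over the whole population.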
Procedia PDF Downloads 136
17560 Simulation-Based Optimization Approach for an Electro-Plating Production Process Based on Theory of Constraints and Data Envelopment Analysis
Authors: Mayada Attia Ibrahim
Abstract:
Evaluating and developing the electroplating production process is a key challenge for this type of process. The process is influenced by several factors, such as process parameters, process costs, and production environments. Analyzing and optimizing all these factors together requires extensive analytical techniques that are not available in real-case industrial entities. This paper presents a practice-based framework for the evaluation and optimization of some of the crucial factors that affect the costs and production times associated with this type of process: energy costs, material costs, and product flow times. In the proposed approach, Design of Experiments, Discrete-Event Simulation, and Theory of Constraints are used, respectively, to identify the most significant factors affecting the production process, to simulate a real production line and recognize the effect of these factors, and to assign possible bottlenecks. Several scenarios are generated as corrective strategies for improving the production line. Following that, the input-oriented CCR data envelopment analysis (DEA) model is used to evaluate and optimize the suggested scenarios. Keywords: electroplating process, simulation, design of experiment, performance optimization, theory of constraints, data envelopment analysis
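The input-oriented CCR model used to rank the scenarios can be sketched as a small linear program. For each decision-making unit (DMU) o, it minimizes theta subject to a convex combination of peers using no more than theta times o's inputs while producing at least o's outputs. The two single-input, single-output DMUs below are invented toy data, not the paper's scenarios.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: two DMUs, one input, one output. DMU A produces 4 units from
# 2 inputs; DMU B produces the same 4 units from 4 inputs.
inputs = np.array([[2.0], [4.0]])
outputs = np.array([[4.0], [4.0]])

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o (envelopment form)."""
    n, m = inputs.shape        # n DMUs, m inputs
    s = outputs.shape[1]       # s outputs
    # Decision variables: [theta, lam_1, ..., lam_n]; minimize theta.
    c = np.zeros(1 + n)
    c[0] = 1.0
    # Input constraints: sum_j lam_j * x_ji - theta * x_oi <= 0
    A_in = np.hstack([-inputs[o].reshape(-1, 1), inputs.T])
    b_in = np.zeros(m)
    # Output constraints: -sum_j lam_j * y_jr <= -y_or
    A_out = np.hstack([np.zeros((s, 1)), -outputs.T])
    b_out = -outputs[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

theta_a = ccr_efficiency(0)
theta_b = ccr_efficiency(1)
print(theta_a, theta_b)  # A is efficient (1.0); B scores 0.5
```

In the paper's framework, each simulated improvement scenario would be one DMU, with its energy cost, material cost, and flow time as inputs.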
Procedia PDF Downloads 96
17559 Treatment of Cutting Oily-Wastewater by Sono-Fenton Process: Experimental Approach and Combined Process
Authors: Pisut Painmanakul, Thawatchai Chintateerachai, Supanid Lertlapwasin, Nusara Rojvilavan, Tanun Chalermsinsuwan, Nattawin Chawaloesphonsiya, Onanong Larpparisudthi
Abstract:
Conventional coagulation, advanced oxidation processes (AOPs), and the combined process were evaluated and compared for their suitability to treat stabilized cutting-oil wastewater. A 90% efficiency was obtained from coagulation at an Al2(SO4)3 dosage of 150 mg/L and pH 7. On the other hand, the efficiencies of the AOPs for a 30-minute oxidation time were 10% for acoustic oxidation, 12% for acoustic oxidation with hydrogen peroxide, 76% for the Fenton process, and 92% for the sono-Fenton process. Achieving the highest oil-removal efficiency with the AOPs required a large amount of chemicals. Therefore, the AOPs were studied as a post-treatment after the conventional separation process. The efficiency was considerable, as the effluent COD could meet the standard required for industrial wastewater discharge with less chemical and energy consumption. Keywords: cutting oily-wastewater, advance oxidation process, sono-fenton, combined process
Procedia PDF Downloads 352
17558 Unlocking E-commerce: Analyzing User Behavior and Segmenting Customers for Strategic Insights
Authors: Aditya Patil, Arun Patil, Vaishali Patil, Sudhir Chitnis, Anjum Patel
Abstract:
Rapid growth has given e-commerce platforms a wealth of customer behavior and spending data. To optimize their strategies, businesses must understand how customers use online shopping platforms and what influences their purchases. Our research focuses on e-commerce user behavior and purchasing trends. This extensive study examines spending and user behavior; regression and clustering disclose relevant patterns in the dataset. Multilevel regression lets us understand user spending trends and analyze how pricing, user demographics, and product categories affect customer purchase decisions. Clustering groups consumers by spending, and important information was found: purchase habits vary by user group. Our analysis illuminates the complex world of e-commerce consumer behavior and purchase trends. Understanding user behavior helps create effective marketing strategies, and this market can benefit from K-means clustering. This study focuses on tailoring strategies to user groups and improving product and price effectiveness. K-means clusters revealed customer buying behaviors across categories: average spending is highest in Cluster 4 and lowest in Cluster 3, and clothing is less popular than gadgets and appliances around the holidays. The cluster spending distribution is examined using average variables. Our research enhances e-commerce analytics; companies can improve customer service and decision-making with this data. Keywords: e-commerce, regression, clustering, k-means
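The K-means segmentation step can be sketched as below: grouping customers by their spending across product categories. The three synthetic segments (low, mid, and high spenders across two invented categories) are for illustration only; the study used real e-commerce data and more clusters.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic customers: spending on [electronics, clothing] for three
# well-separated segments (low, mid, high spenders).
rng = np.random.default_rng(3)
low = rng.normal(loc=[20, 10], scale=3, size=(100, 2))
mid = rng.normal(loc=[60, 40], scale=3, size=(100, 2))
high = rng.normal(loc=[120, 90], scale=3, size=(100, 2))
spending = np.vstack([low, mid, high])

# Cluster into three segments and inspect the segment centroids.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(spending)
centers = km.cluster_centers_[np.argsort(km.cluster_centers_[:, 0])]
print(np.round(centers))  # roughly the three segment means
```

Each centroid summarizes a segment's average spending profile, which is the "average variables" view of cluster spending the abstract mentions.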
Procedia PDF Downloads 10
17557 Institutional and Technological Factors Influencing the Adoption of Tenera Oil Palm Practices: Gender Analysis Smallholder Farmers in Edo State, Nigeria
Authors: Cornelius Michael Ekenta
Abstract:
The study determined the institutional and technological factors that influence the adoption of tenera oil palm production practices, with a gender dimension, among smallholder farmers in Edo State, Nigeria. Primary data were generated with a questionnaire administered to 155 male and 137 female respondents. Results show that the level of adoption of tenera oil palm production practices was low for both males and females. Tobit regression results show that land ownership structure and affordability, significant at 1%, influenced male adoption of tenera oil palm production practices, while age and level of income, significant at 1%, influenced female adoption. The major roles of males reported in the adoption process were the purchase of seedlings, clearing of bush for planting, and selling of cut bunches, while the major roles of females were periodic weeding, gathering of cut bunches, and mulching of the palm field. The major constraint faced by males in the adoption process was the high cost of labour, while for females it was the drudgery of the work. The study recommended that the Land Use Act of 1978 should be enforced to help women and non-indigenes acquire sizeable farmlands, and that the government should empower the Agricultural Development Programme (ADP) by employing more extension personnel to increase their contact with farmers. Keywords: gender, adoption, variety, oil, tenera, Edo
Procedia PDF Downloads 78
17556 Parametric Studies of Ethylene Dichloride Purification Process
Authors: Sh. Arzani, H. Kazemi Esfeh, Y. Galeh Zadeh, V. Akbari
Abstract:
Ethylene dichloride (EDC) is a colorless liquid with a smell like chloroform. EDC is a simple chlorinated hydrocarbon obtained by chlorinating ethylene gas. Its chemical formula is C2H4Cl2, and it is used as the main intermediate in VCM (vinyl chloride monomer) production. Therefore, the purification of EDC is an important petrochemical process. In this study, the EDC purification unit was simulated and then validated. Finally, the impact of process parameters on the degree of EDC purity was studied. The results showed that increasing the feed flow increases the impure compounds in the reflux, resulting in a decrease in EDC purity. Keywords: ethylene dichloride, purification, edc, simulation
Procedia PDF Downloads 312
17555 Audit Committee Characteristics and Earnings Quality of Listed Food and Beverages Firms in Nigeria
Authors: Hussaini Bala
Abstract:
Opinions in the literature differ on the relationship between audit committee characteristics and earnings management, and this mix of opinions makes the direction of the relationship ambiguous. This study investigated the relationship between audit committee characteristics and the earnings management of listed food and beverages firms in Nigeria. The study covered a period of six years, from 2007 to 2012. Data for the study were extracted from the firms' annual reports and accounts. After running the OLS regression, a robustness test was conducted to validate the statistical inferences. The dependent variable was generated using a two-step regression in order to determine the discretionary accruals of the sample firms. Multiple regression was employed to analyze the data using a random-effects model. The results revealed a significant association between audit committee characteristics and the earnings management of the firms. While audit committee size and the committee's financial expertise showed an inverse relationship with earnings management, the committee's independence and frequency of meetings were positively and significantly related to earnings management. In line with the findings, the study recommended, among others, that listed food and beverages firms in Nigeria should strictly comply with the provisions of the Companies and Allied Matters Act (CAMA) and the SEC Code of Corporate Governance on issues regarding audit committees. Regulators such as the SEC should increase the minimum number of audit committee members with financial expertise and also take a statutory position on the maximum number of audit committee meetings, which should not be greater than four in a year, as the SEC Code of Corporate Governance is silent on this. Keywords: audit committee, earnings management, listed food and beverages, size, leverage, Nigeria
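The two-step procedure for the dependent variable can be sketched as follows: first regress total accruals on their normal determinants, then take the residuals as discretionary accruals. A Jones-type specification is assumed here; the abstract does not name the exact model, and all firm-year data below are simulated.

```python
import numpy as np

# Simulated firm-year data for a Jones-type accruals regression (assumed
# specification; not the authors' actual model or data).
rng = np.random.default_rng(5)
n = 300  # hypothetical firm-year observations

inv_assets = rng.uniform(0.001, 0.01, n)  # 1 / lagged total assets
d_revenue = rng.normal(0.1, 0.05, n)      # change in revenue, scaled
ppe = rng.uniform(0.2, 0.8, n)            # gross PPE, scaled

total_accruals = (0.5 * inv_assets + 0.3 * d_revenue - 0.1 * ppe
                  + rng.normal(0, 0.02, n))  # noise stands in for discretion

# Step 1: estimate normal accruals by OLS.
X = np.column_stack([np.ones(n), inv_assets, d_revenue, ppe])
beta, *_ = np.linalg.lstsq(X, total_accruals, rcond=None)

# Step 2: discretionary accruals are the residuals from step 1.
discretionary = total_accruals - X @ beta
print(f"mean |discretionary accruals|: {np.abs(discretionary).mean():.4f}")
```

The residuals then serve as the earnings-management proxy in the second-stage regression on the audit committee characteristics.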
Procedia PDF Downloads 263
17554 Impact of the Simplification of Licensing Procedures for Industrial Complexes on Supply of Industrial Complexes and Regional Policies
Authors: Seung-Seok Bak, Chang-Mu Jung
Abstract:
An adequate supply of industrial complexes is an important national policy goal in South Korea, which is highly dependent on foreign trade. The development process of an industrial complex can be divided into a planning stage and a construction stage. The planning stage consists of consulting with the many stakeholders on the contents of the development, the feasibility study, compliance with regional policies, and so on. The industrial complex planning stage, including the licensing procedure, usually takes about three years in South Korea. The government determined that the appropriate supply of industrial complexes had been delayed due to the long licensing period, and drafted a law in 2008 to shorten it. The law was expected to shorten the licensing period from about three years to six months. This paper attempts to show that shortening the licensing period does not positively affect the appropriate supply of industrial complexes. To do this, we used interrupted time series designs. As a result, it was found that the supply of industrial complexes was influenced more by other factors, such as the actual demand for industrial complexes from the private sector and macro-level economic variables. In addition, specific provisions of the law conflict with local policy and cause problems such as damage to nature and agricultural land and traffic congestion. Keywords: development of industrial complexes, industrial complexes, interrupted time series designs, simplification of licensing procedures for industrial complexes, time series regression
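An interrupted time series design is typically estimated as a segmented regression: supply_t = b0 + b1*t + b2*post_t + b3*(t - t0)*post_t, where post_t flags periods after the 2008 law, b2 is the level change, and b3 the trend change. The sketch below simulates a series with known changes so the fit recovers them; the real analysis used observed supply data.

```python
import numpy as np

# Simulated quarterly series with a known level and trend change at t0.
t = np.arange(40)                    # e.g. 40 quarters
t0 = 20                              # hypothetical intervention point
post = (t >= t0).astype(float)

b0, b1, b2, b3 = 100.0, 2.0, 5.0, -1.0
supply = b0 + b1 * t + b2 * post + b3 * (t - t0) * post

# Segmented regression: intercept, pre-trend, level change, trend change.
X = np.column_stack([np.ones_like(t), t, post, (t - t0) * post])
coef, *_ = np.linalg.lstsq(X.astype(float), supply, rcond=None)
print(np.round(coef, 2))  # recovers [100, 2, 5, -1]
```

A non-significant b2 and b3 in this setup is the kind of evidence the paper uses to argue the law did not shift the supply of complexes.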
Procedia PDF Downloads 291
17553 Forecasting Stock Indexes Using Bayesian Additive Regression Tree
Authors: Darren Zou
Abstract:
Forecasting the stock market is a very challenging task. Various economic indicators, such as GDP, exchange rates, interest rates, and unemployment, have a substantial impact on the stock market. Time series models are the traditional methods used to predict stock market changes. In this paper, a machine learning method, the Bayesian Additive Regression Tree (BART), is used to predict stock market indexes based on multiple economic indicators. BART can be used to model heterogeneous treatment effects and thereby works well when models are misspecified. It also has the capability to handle nonlinear main effects and multi-way interactions without much input from financial analysts. In this research, BART is proposed to provide a reliable prediction of day-to-day stock market activity. A comparison of the analysis results from BART with those of the time series method shows that BART performs well and has better predictive capability than the traditional methods. Keywords: BART, Bayesian, predict, stock
Procedia PDF Downloads 125