Search results for: empirical Bayes method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20468

20438 Early Gastric Cancer Prediction from Diet and Epidemiological Data Using Machine Learning in Mizoram Population

Authors: Brindha Senthil Kumar, Payel Chakraborty, Senthil Kumar Nachimuthu, Arindam Maitra, Prem Nath

Abstract:

Gastric cancer is predominantly caused by demographic and dietary factors compared to other cancer types. The aim of this study is to predict Early Gastric Cancer (EGC) from diet and lifestyle factors using supervised machine learning algorithms. For this study, 160 healthy individuals and 80 cases were selected who had been followed for 3 years (2016-2019) at Civil Hospital, Aizawl, Mizoram. A dataset containing 11 features that are core risk factors for gastric cancer was extracted. Supervised machine learning algorithms: Logistic Regression, Naive Bayes, Support Vector Machine (SVM), Multilayer Perceptron, and Random Forest were used to analyze the dataset using Python Jupyter Notebook Version 3. The classification results obtained were evaluated using the metrics: minimum false positives, Brier score, accuracy, precision, recall, F1 score, and the Receiver Operating Characteristic (ROC) curve. With respect to accuracy (in percent) and Brier score, the data analysis results were: Naive Bayes - 88, 0.11; Random Forest - 83, 0.16; SVM - 77, 0.22; Logistic Regression - 75, 0.25; and Multilayer Perceptron - 72, 0.27. The Naive Bayes algorithm outperforms the others, with a very low false positive rate, a low Brier score, and good accuracy. The Naive Bayes classification results in predicting EGC were very satisfactory using only diet and lifestyle factors, which will be very helpful for physicians to educate patients and the public, so that the mortality of gastric cancer can be reduced or avoided with this knowledge mining work.
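
The evaluation protocol described above can be sketched with scikit-learn. The snippet below is a minimal illustration only: the synthetic dataset stands in for the study's 240 subjects and 11 diet/lifestyle features, and the scores it prints will not reproduce the reported figures.

```python
# Minimal sketch of the evaluation protocol: five classifiers compared
# on accuracy and Brier score. Data here is synthetic, not the study's.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, brier_score_loss

# Stand-in for the 240-subject, 11-feature diet/lifestyle dataset.
X, y = make_classification(n_samples=240, n_features=11, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "Naive Bayes": GaussianNB(),
    "Random Forest": RandomForestClassifier(random_state=0),
    "SVM": SVC(probability=True, random_state=0),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Multilayer Perceptron": MLPClassifier(max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    prob = model.predict_proba(X_te)[:, 1]        # P(case) per subject
    acc = accuracy_score(y_te, model.predict(X_te))
    brier = brier_score_loss(y_te, prob)          # mean squared probability error
    print(f"{name}: accuracy={acc:.2f}, Brier score={brier:.2f}")
```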

Keywords: Early Gastric cancer, Machine Learning, Diet, Lifestyle Characteristics

Procedia PDF Downloads 112
20437 Empirical Orthogonal Functions Analysis of Hydrophysical Characteristics in the Shira Lake in Southern Siberia

Authors: Olga S. Volodko, Lidiya A. Kompaniets, Ludmila V. Gavrilova

Abstract:

The method of empirical orthogonal functions is a method for analyzing data with a complex spatial-temporal structure. This method allows us to decompose the data into a finite number of modes determined by empirically finding the eigenfunctions of the data correlation matrix. The modes have different scales and can be associated with various physical processes. The empirical orthogonal function method has been widely used for the analysis of hydrophysical characteristics, for example, the analysis of sea surface temperatures in the Western North Atlantic, ocean surface currents off North Carolina, the study of tropical wave disturbances, etc. In this study, the method was applied to the analysis of temperature and velocity measurements in saline Lake Shira (Southern Siberia, Russia). Shira is a shallow lake with a maximum depth of 25 m. Lake Shira can be considered a closed water body because it has one small inflowing river and no outflows. The main factor that causes the motion of fluid is variable wind forcing. In summer, the lake is strongly stratified by temperature and salinity. Long-term measurements of temperatures and currents were conducted at several points during the summers of 2014-2015. The temperature was measured with an accuracy of 0.1 ºC. The temperature data were analyzed using the real-valued version of the empirical orthogonal function method. The first empirical eigenmode accounts for 70-80% of the energy and can be interpreted as a temperature distribution with a thermocline. A thermocline is a thermal layer in which the temperature decreases rapidly from the mixed upper layer of the lake to the much colder deep water. The higher-order modes can be interpreted as oscillations induced by internal waves. The current measurements were recorded using 600 kHz and 1200 kHz Acoustic Doppler Current Profilers. These data were analyzed using the complex-valued version of the empirical orthogonal function method. The first empirical eigenmode accounts for about 40% of the energy and corresponds to the Ekman spiral occurring in the case of a stationary homogeneous fluid. The other modes describe effects associated with the stratification of the fluid. The second and subsequent empirical eigenmodes were associated with dynamical modes. These modes were obtained for a simplified model of an inhomogeneous three-layer fluid at a water site with a flat bottom.
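
For readers unfamiliar with the technique, the core of a real-valued EOF analysis is an eigendecomposition of the data covariance matrix. A minimal sketch, using a synthetic field in place of the Lake Shira measurements:

```python
# Real-valued EOF analysis: decompose a space-time field into orthogonal
# modes via the eigenvectors of the data covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
n_time, n_space = 500, 30               # e.g. time samples x sensor depths
field = rng.standard_normal((n_time, n_space))   # synthetic stand-in data

anom = field - field.mean(axis=0)       # remove the time mean at each point
cov = anom.T @ anom / (n_time - 1)      # spatial covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]       # sort modes by explained variance
eigvals, eofs = eigvals[order], eigvecs[:, order]

pcs = anom @ eofs                       # principal components (time series)
explained = eigvals / eigvals.sum()
print("variance explained by mode 1: %.1f%%" % (100 * explained[0]))
```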

Keywords: Ekman spiral, empirical orthogonal functions, data analysis, stratified fluid, thermocline

Procedia PDF Downloads 112
20436 Empirical Green’s Function Technique for Accelerogram Synthesis: The Problem of Its Use for Marine Seismic Hazard Assessment

Authors: Artem A. Krylov

Abstract:

Instrumental seismological research in offshore areas is complicated and expensive, which leads to a lack of strong-motion records in most offshore regions. At the same time, the number of offshore industrial infrastructure objects, such as oil rigs and subsea pipelines, is constantly increasing. The empirical Green’s function technique has proved to be very effective for accelerogram synthesis under the conditions of a poorly described seismic wave propagation medium. However, the selection of a suitable small-earthquake record to serve as an empirical Green’s function is a problem in offshore regions, because seafloor instrumental investigations are usually short and yield only weak micro-earthquake recordings. An approach based on moving-average smoothing in the frequency domain is presented for the preliminary processing of weak micro-earthquake records before they are used as empirical Green’s functions. The method results in a significant waveform correction for the modeled event. A case study of the 2009 L’Aquila earthquake is used to demonstrate the suitability of the method. This work was supported by the Russian Foundation for Basic Research (project № 18-35-00474 mol_a).
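
The preprocessing idea can be sketched as follows: smooth the amplitude spectrum of the record with a moving average while preserving the phase, then transform back. The window length below is an assumption for illustration, not a value taken from the paper.

```python
# Moving-average smoothing of the amplitude spectrum of a (synthetic)
# weak record; the phase is preserved and the signal reconstructed.
import numpy as np

def smooth_spectrum(record, window=11):
    """Frequency-domain moving-average smoothing of the amplitude spectrum."""
    spec = np.fft.rfft(record)
    amp, phase = np.abs(spec), np.angle(spec)
    kernel = np.ones(window) / window
    amp_smooth = np.convolve(amp, kernel, mode="same")
    return np.fft.irfft(amp_smooth * np.exp(1j * phase), n=len(record))

# Example: a noisy synthetic record standing in for a real seismogram.
t = np.linspace(0, 10, 2000)
record = (np.sin(2 * np.pi * 2.0 * t)
          + 0.5 * np.random.default_rng(1).standard_normal(t.size))
cleaned = smooth_spectrum(record)
```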

Keywords: accelerogram synthesis, empirical Green's function, marine seismology, microearthquakes

Procedia PDF Downloads 296
20435 A Simple and Empirical Refraction Correction Method for UAV-Based Shallow-Water Photogrammetry

Authors: I GD Yudha Partama, A. Kanno, Y. Akamatsu, R. Inui, M. Goto, M. Sekine

Abstract:

The aerial photogrammetry of shallow-water bottoms has the potential to be an efficient high-resolution survey technique for shallow-water topography, thanks to the advent of convenient UAVs and automatic image processing techniques (Structure-from-Motion (SfM) and Multi-View Stereo (MVS)). However, it suffers from a systematic overestimation of the bottom elevation due to light refraction at the air-water interface. In this study, we present an empirical method to correct for the effect of refraction after the usual SfM-MVS processing, using common software. The presented method utilizes the empirical relation between the measured true depth and the estimated apparent depth to generate an empirical correction factor. This correction factor is then used to convert the apparent water depth into a refraction-corrected (real-scale) water depth. To examine its effectiveness, we applied the method to two river sites and compared the RMS errors in the corrected bottom elevations with those obtained by three existing methods. The result shows that the presented method is more effective than two of the existing methods: the method that applies no correction factor and the method that uses the refractive index of water (1.34) as the correction factor. In comparison with the remaining existing method, which adds an offset term after calculating the correction factor, the presented method performs better at Site 2 and worse at Site 1. However, we found this linear regression method to be unstable when the training data used for calibration are limited. It also suffers from a large negative bias in the correction factor when the estimated apparent water depth is affected by noise, according to our numerical experiment. Overall, the accuracy of a refraction correction method depends on various factors such as the location, image acquisition, and GPS measurement conditions. The most effective method can be selected by statistical model selection (e.g., leave-one-out cross-validation).
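
The core of the correction can be sketched in a few lines: regress the surveyed true depths on the SfM-MVS apparent depths, with no offset term, to obtain the empirical factor. The depth values below are invented for illustration.

```python
# Empirical refraction correction factor: true = k * apparent, fitted by
# least squares through the origin; apparent depths are then rescaled.
import numpy as np

apparent = np.array([0.30, 0.55, 0.80, 1.10, 1.40])   # from SfM-MVS (m)
true = np.array([0.41, 0.74, 1.07, 1.49, 1.88])       # field-surveyed (m)

k = (apparent @ true) / (apparent @ apparent)   # regression through origin
corrected = k * apparent
print(f"empirical correction factor: {k:.3f}")  # refraction theory suggests ~1.34
```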

Keywords: bottom elevation, MVS, river, SfM

Procedia PDF Downloads 282
20434 Hydrological Modeling of Watersheds Using the Only Corresponding Competitor Method: The Case of M’Zab Basin, South East Algeria

Authors: Oulad Naoui Noureddine, Cherif ELAmine, Djehiche Abdelkader

Abstract:

Water resources management includes several disciplines; among them, the modeling of the rainfall-runoff relationship is the most important for preventing natural hazards. There are several models for studying the rainfall-runoff relationship in watersheds. However, the majority of these models are not applicable in all basins of the world. In this study, a new stochastic method called the Only Corresponding Competitor (OCC) method was used for the hydrological modeling of the M’Zab watershed (South East Algeria) to adapt a few empirical models to any hydrological regime. The results obtained open up a number of perspectives in which it would be interesting to experiment with hydrological models that, collectively or separately, improve the representation of a catchment's data through the OCC method.

Keywords: modelling, optimization, rainfall-runoff relationship, empirical model, OCC

Procedia PDF Downloads 236
20433 A Comparison between Empirical and Theoretical OC Curves Related to Acceptance Sampling for Attributes

Authors: Encarnacion Alvarez, Noemı Hidalgo-Rebollo, Juan F. Munoz, Francisco J. Blanco-Encomienda

Abstract:

Many companies use the technique known as acceptance sampling, which consists of the inspection of, and decision making regarding, products. According to the results derived from this method, the company decides whether to accept or reject a product. Acceptance sampling can be applied to technology management, since it can be seen as a tool to improve the design, planning, operation, and control of technological products. Theoretical operating characteristic (OC) curves are widely used when dealing with acceptance sampling. In this paper, we carry out Monte Carlo simulation studies to numerically compare the empirical OC curves derived from simulation results with the customary theoretical OC curves. We analyze various possible scenarios so that the differences between the empirical and theoretical curves can be observed under different situations.
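
A minimal sketch of the comparison for a single-sampling plan (n, c): the theoretical OC curve is the binomial probability of acceptance, while the empirical curve is estimated by Monte Carlo simulation of lots. The plan parameters below are illustrative, not those of the paper's scenarios.

```python
# Theoretical vs empirical OC curve for a single-sampling plan (n, c):
# a lot is accepted when the number of defectives in the sample is <= c.
import numpy as np
from scipy.stats import binom

n, c = 50, 2            # sample size and acceptance number (illustrative plan)
n_lots = 10_000         # simulated lots per quality level
rng = np.random.default_rng(0)

for p in (0.01, 0.03, 0.05, 0.08, 0.12):    # lot fraction defective
    theoretical = binom.cdf(c, n, p)        # exact acceptance probability
    defects = rng.binomial(n, p, size=n_lots)
    empirical = np.mean(defects <= c)       # Monte Carlo estimate
    print(f"p={p:.2f}  theoretical Pa={theoretical:.3f}  empirical Pa={empirical:.3f}")
```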

Keywords: single-sampling plan, lot, Monte Carlo simulation, quality control

Procedia PDF Downloads 437
20432 Performance Analysis with the Combination of Visualization and Classification Technique for Medical Chatbot

Authors: Shajida M., Sakthiyadharshini N. P., Kamalesh S., Aswitha B.

Abstract:

Natural Language Processing (NLP) continues to play a strategic part in complaint discovery and drug discovery during the current epidemic. This abstract provides an overview of a performance analysis combining visualization and classification techniques of NLP for a medical chatbot. Sentiment analysis is an important aspect of NLP that is used to determine the emotional tone behind a piece of text. This technique has been applied to various domains, including medical chatbots. In this work, we compared the combination of a decision tree with a heatmap and Naïve Bayes with a word cloud. The performance of the chatbot was evaluated using accuracy, and the results indicate that the combination of visualization and classification techniques significantly improves the chatbot's performance.

Keywords: sentiment analysis, NLP, medical chatbot, decision tree, heatmap, naïve bayes, word cloud

Procedia PDF Downloads 45
20431 A Supervised Approach for Detection of Singleton Spam Reviews

Authors: Atefeh Heydari, Mohammadali Tavakoli, Naomie Salim

Abstract:

In recent years, we have witnessed that online reviews have become the most important source of customers’ opinions. They are increasingly used by individuals and organisations to make purchase and business decisions. Unfortunately, for profit or fame, fraudsters produce deceptive reviews to hoodwink potential customers. Their activities not only mislead potential customers in making appropriate purchasing decisions and organisations in reshaping their business, but also hinder opinion mining techniques by preventing them from reaching accurate results. Spam reviews can be divided into two main groups, i.e., multiple and singleton spam reviews. Detecting a singleton spam review, that is, the only review written by a user ID, is extremely challenging due to the lack of clues for detection purposes. Singleton spam reviews are very harmful, and the various features and proofs used in the detection of multiple spam reviews are not applicable in this case. The current research aims to propose a novel supervised technique to detect singleton spam reviews. To achieve this, various features are proposed in this study and combined with the most appropriate features extracted from the literature, then employed in a classifier. In order to compare the performance of different classifiers, the SVM and naive Bayes classification algorithms were used for model building. The results revealed that SVM was more accurate than naive Bayes and that our proposed technique is capable of detecting singleton spam reviews effectively.

Keywords: classification algorithms, Naïve Bayes, opinion review spam detection, singleton review spam detection, support vector machine

Procedia PDF Downloads 278
20430 Social Innovation Rediscovered: An Analysis of Empirical Research

Authors: Imen Douzi, Karim Ben Kahla

Abstract:

In spite of the growing attention paid to social innovation, it is still considered to be in a stage of infancy, with minimal progress in theory development. Upon examining the field of study, one would have to conclude that, over the past two decades, academic research has focused primarily on establishing a conceptual foundation. This has resulted in a considerable stream of conceptual papers which have outnumbered empirical articles. Nevertheless, despite its growing popularity, scholars and practitioners are far from reaching a consensus as to what social innovation actually means, which has resulted in competing definitions and approaches within the field of social innovation and the lack of a unifying conceptual framework. This paper reviews empirical research studies on social innovation, classifies them along three dimensions, and summarizes the research findings for each of these dimensions. As a preliminary to the analysis of empirical research, an overview of the different perspectives on social innovation is presented.

Keywords: analysis of empirical research, definition, empirical research, social innovation perspectives

Procedia PDF Downloads 349
20429 Comparative Settlement Analysis under Embankments Using Empirical Formulas and Settlement Plate Measurements for Reducing Building Cracks around Embankments

Authors: Safitri Nur Wulandari, M. Ivan Adi Perdana, Prathisto L. Panuntun Unggul, R. Dary Wira Mahadika

Abstract:

In road construction on soft soil, a soil improvement method is needed to improve the bearing capacity of the subgrade so that it can withstand traffic loads. Most of the land in Indonesia consists of soft soil, a type of clay with a consistency of very soft to medium stiff, an undrained shear strength Cu < 0.25 kg/cm², or an estimated NSPT value < 5 blows/ft. This study focuses on analyzing the effect of the preloading load (embankment) on the settlement ratio under the embankment, which has an impact on building cracks around the embankment. The method used in this research is a superposition method for the embankment load distribution at 27 locations, with undisturbed soil samples from borehole points in Java and Kalimantan, Indonesia. The results were then correlated with settlement plate monitoring in the field using the Asaoka method. The settlement plate monitoring results were taken from an embankment at Ahmad Yani Airport in Semarang at 32 points. The values of Cc (compression index) were based on laboratory test results, while untested Cc values were obtained from the empirical formula of Ardhana and Mochtar (1999). The field monitoring results agreed closely with the empirical formulation, with a standard deviation of 4%; the empirical result of this analysis is a linear formula. For a 4.25 m high embankment with a slope of 1:8, the empirical linear formula describing the effect of the compression area is y = 3.1209x + 0.0026, for the same analysis with the initial embankment height in the field. Note that the settlement at the edge of the embankment is not zero: at a quarter of the embankment width, the average settlement ratio is 0.951, while at the edge it is 0.049. The influence area around the embankment is approximately 1 meter for a slope of 1:8 and 7 meters for a slope of 1:2, within which building cracks can occur; this should be accounted for in sustainable development.
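
For reference, the Asaoka method mentioned above fits successive equal-interval settlement readings s_i against s_(i-1) with a straight line and extrapolates the ultimate settlement. A minimal sketch with invented readings:

```python
# Asaoka method: regress s_i on s_(i-1) for readings at equal time
# intervals; the ultimate settlement is b0 / (1 - b1).
import numpy as np

s = np.array([12.0, 21.0, 28.2, 34.0, 38.6, 42.3, 45.3])  # mm, equal intervals

b1, b0 = np.polyfit(s[:-1], s[1:], 1)    # linear fit of s_i against s_(i-1)
s_ultimate = b0 / (1.0 - b1)
degree_of_consolidation = s[-1] / s_ultimate
print(f"predicted ultimate settlement: {s_ultimate:.1f} mm "
      f"({100 * degree_of_consolidation:.0f}% reached)")
```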

Keywords: building cracks, influence area, settlement plate, soft soil, empirical formula, embankment

Procedia PDF Downloads 316
20428 Orthogonal Metal Cutting Simulation of Steel AISI 1045 via Smoothed Particle Hydrodynamic Method

Authors: Seyed Hamed Hashemi Sohi, Gerald Jo Denoga

Abstract:

Machining, or metal cutting, is one of the most widely used production processes in industry. The quality of the process and the resulting machined product depends on parameters like tool geometry, material, and cutting conditions. However, the relationships of these parameters to the cutting process are often based mostly on empirical knowledge. In this study, computer modeling and simulation using LS-DYNA software and a Smoothed Particle Hydrodynamics (SPH) methodology were performed on the orthogonal metal cutting process to analyze the three-dimensional deformation of AISI 1045 medium carbon steel during machining. The simulation was performed using the following constitutive models: the Power Law model, the Johnson-Cook model, and the Zerilli-Armstrong (Z-A) model. The outcomes were compared against the simulated results obtained by Cenk Kiliçaslan using the Finite Element Method (FEM) and the empirical results of Jaspers and Filice. The analysis shows that the SPH method combined with the Zerilli-Armstrong constitutive model is a viable alternative for simulating the metal cutting process. The tangential force was overestimated by 7%, and the normal force was underestimated by 16% when compared with empirical values. The simulated values for flow stress versus strain at various temperatures were also validated against empirical values. The SPH method using the Z-A model has also proven to be robust against issues of time-scaling. Experimental work was also done to investigate the effects of friction, rake angle, and tool tip radius on the simulation.
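
As a point of reference, the Johnson-Cook model named above gives the flow stress as sigma = (A + B*eps^n)(1 + C*ln(rate/rate0))(1 - T*^m). The sketch below evaluates it with illustrative literature-style constants for a medium carbon steel; these are not the calibrated values used in the paper.

```python
# Johnson-Cook flow stress: strain hardening x strain-rate sensitivity
# x thermal softening. Parameter values are illustrative only.
import numpy as np

def johnson_cook(strain, strain_rate, T,
                 A=553.1, B=600.8, n=0.234, C=0.0134, m=1.0,
                 rate0=1.0, T_room=293.0, T_melt=1733.0):
    """Flow stress in MPa for plastic strain, strain rate (1/s), and T (K)."""
    T_star = (T - T_room) / (T_melt - T_room)   # homologous temperature
    return ((A + B * strain**n)
            * (1.0 + C * np.log(strain_rate / rate0))
            * (1.0 - T_star**m))

# Flow stress at 20% strain, 1000 1/s, 500 K:
print(f"{johnson_cook(0.2, 1e3, 500.0):.0f} MPa")
```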

Keywords: metal cutting, smoothed particle hydrodynamics, constitutive models, experimental, cutting forces analyses

Procedia PDF Downloads 234
20427 Early Stage Suicide Ideation Detection Using Supervised Machine Learning and Neural Network Classifier

Authors: Devendra Kr Tayal, Vrinda Gupta, Aastha Bansal, Khushi Singh, Sristi Sharma, Hunny Gaur

Abstract:

In today's world, suicide is a serious problem. In order to save lives, early detection and prevention of suicide attempts should be addressed. A good number of at-risk people utilize social media platforms to talk about their issues or find knowledge on related topics. Twitter and Reddit are two of the most common platforms used for expressing oneself. Extensive research has already been done in this field. Through supervised classification techniques like Naïve Bayes, Bernoulli Naïve Bayes, and Multilayer Perceptron on a Reddit dataset, we demonstrate the early recognition of suicidal ideation. We also performed a comparative analysis of these approaches and used accuracy, recall, F1 score, and precision for the analysis.

Keywords: machine learning, suicide ideation detection, supervised classification, natural language processing

Procedia PDF Downloads 61
20426 Estimation of Stress-Strength Parameter for Burr Type XII Distribution Based on Progressive Type-II Censoring

Authors: A. M. Abd-Elfattah, M. H. Abu-Moussa

Abstract:

In this paper, the estimation of the stress-strength parameter R = P(Y < X) is considered when X and Y, the strength and stress respectively, are two independent random variables following the Burr Type XII distribution. The samples taken for X and Y are progressively Type-II censored. The maximum likelihood estimator (MLE) of R is obtained when the common parameter is unknown. When the common parameter is known, the MLE, the uniformly minimum variance unbiased estimator (UMVUE), and the Bayes estimator of R = P(Y < X) are obtained. The exact confidence interval of R based on the MLE is obtained. The performance of the proposed estimators is compared using computer simulation.
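
When X and Y share the common Burr XII parameter c, the stress-strength parameter has the closed form R = kY/(kX + kY), which a quick Monte Carlo check can confirm. The sketch below uses arbitrary parameter values and complete (uncensored) samples; progressive Type-II censoring is not modeled.

```python
# Stress-strength R = P(Y < X) for Burr XII variables with a common
# shape c: Monte Carlo estimate vs the closed form kY / (kX + kY).
import numpy as np
from scipy.stats import burr12

c, kX, kY = 2.0, 1.5, 3.0          # common parameter c, shapes of X and Y
rng = np.random.default_rng(0)

x = burr12.rvs(c, kX, size=100_000, random_state=rng)   # strength
y = burr12.rvs(c, kY, size=100_000, random_state=rng)   # stress

R_mc = np.mean(y < x)
R_exact = kY / (kX + kY)
print(f"Monte Carlo R={R_mc:.3f}, closed form R={R_exact:.3f}")
```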

Keywords: Burr Type XII distribution, progressive type-II censoring, stress-strength model, unbiased estimator, maximum-likelihood estimator, uniformly minimum variance unbiased estimator, confidence intervals, Bayes estimator

Procedia PDF Downloads 426
20425 The Use of Psychological Tests in Polish Organizations - Empirical Evidence

Authors: Milena Gojny-Zbierowska

Abstract:

In recent decades, psychological tests have been gaining popularity as a method of evaluating personnel, and they bring consulting companies solid profits, rising by up to 10% each year. The market offers a growing range of tools for the assessment of personality. In organizations, tests are used mainly in the recruitment and selection of staff. This paper is an attempt at an initial diagnosis of the state of the use of psychological tests in Polish companies on the basis of empirical research.

Keywords: psychological tests, personality, content analysis, NEO FFI, big five personality model

Procedia PDF Downloads 315
20424 A Study of Permission-Based Malware Detection Using Machine Learning

Authors: Ratun Rahman, Rafid Islam, Akin Ahmed, Kamrul Hasan, Hasan Mahmud

Abstract:

Malware is becoming more prevalent, and several threat categories have risen dramatically in recent years. This paper provides a bird's-eye view of the world of malware analysis. The efficiency of five different machine learning methods (Naive Bayes, K-Nearest Neighbor, Decision Tree, Random Forest, and TensorFlow Decision Forest), combined with features picked from the retrieval of Android permissions, in categorizing applications as harmful or benign is investigated in this study. The dataset consists of 1,168 samples (602 malware and 566 benign Android applications), each described by 948 features (permissions). Using this permission-based dataset, the machine learning algorithms produce accuracy rates above 80%, except for the Naive Bayes algorithm, which reaches 65% accuracy. Of the considered algorithms, TensorFlow Decision Forest performed the best, with an accuracy of 90%.

Keywords: android malware detection, machine learning, malware, malware analysis

Procedia PDF Downloads 121
20423 An Integrated Lightweight Naïve Bayes Based Webpage Classification Service for Smartphone Browsers

Authors: Mayank Gupta, Siba Prasad Samal, Vasu Kakkirala

Abstract:

The internet world and its priorities have changed considerably in the last decade. Browsing on smartphones has increased manifold and is set to grow much more. Users spend considerable time browsing different websites, which gives a great deal of insight into users’ preferences. Instead of presenting plain information, classifying different aspects of browsing, such as Bookmarks, History, and the Download Manager, into useful categories would improve and enhance the user’s experience. Most classification solutions are server-side, which involves maintaining servers and other heavy resources; they have security constraints and may miss contextual data during classification. On-device classification solves many such problems, but the challenge is to achieve classification accuracy under resource constraints. On-device classification can be much more useful for personalization, reducing dependency on cloud connectivity, and improving privacy and security. This approach provides more relevant results than current standalone solutions because it uses the content rendered by the browser, which is customized by the content provider based on the user’s profile. This paper proposes a Naive Bayes based lightweight classification engine targeted at resource-constrained devices. Our solution integrates with the web browser, which in turn triggers the classification algorithm. Whenever a user browses a webpage, the solution extracts DOM tree data from the browser’s rendering engine. This DOM data is dynamic, contextual, and secure data that cannot be replicated. The proposal extracts different features of the webpage and runs an algorithm to classify it into multiple categories. A Naive Bayes based engine is chosen in this solution for its inherent advantages in using limited resources compared to other classification algorithms like Support Vector Machines, Neural Networks, etc. Naive Bayes classification requires a small memory footprint and little computation, suitable for the smartphone environment. The solution can also partition the model into multiple chunks, which reduces memory usage by avoiding loading the complete model. Classification of webpages through the integrated engine is faster, more relevant, and more energy efficient than other standalone on-device solutions. The classification engine has been tested on Samsung Z3 Tizen hardware. The engine is integrated into the Tizen Browser, which uses the Chromium rendering engine. For this solution, an extensive dataset was sourced from dmoztools.net and cleaned. This cleaned dataset has 227.5K webpages, which are divided into 8 generic categories ('education', 'games', 'health', 'entertainment', 'news', 'shopping', 'sports', 'travel'). Our browser-integrated solution resulted in 15% less memory usage (due to the partition method) and 24% less power consumption in comparison with a standalone solution. The solution used 70% of the dataset for training the data model and the remaining 30% for testing. An average accuracy of ~96.3% is achieved across the above-mentioned 8 categories. The engine can be further extended to suggest dynamic tags and to use the classification in other use cases to enhance the browsing experience.
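
The classification core can be sketched with a multinomial Naive Bayes text classifier. In the sketch below, plain page text stands in for the DOM-derived features, and the tiny training set is invented; the categories mirror those listed above.

```python
# Multinomial Naive Bayes webpage classification on bag-of-words
# features; a toy stand-in for the DOM-feature pipeline described above.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

pages = [
    "course lecture university exam homework syllabus",
    "score league match player goal tournament",
    "flight hotel itinerary destination visa booking",
    "cart checkout discount order shipping price",
]
labels = ["education", "sports", "travel", "shopping"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(pages, labels)   # in the described system, training happens offline

print(clf.predict(["cheap flight and hotel booking deals"]))  # -> ['travel']
```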

Keywords: chromium, lightweight engine, mobile computing, Naive Bayes, Tizen, web browser, webpage classification

Procedia PDF Downloads 135
20422 Estimation of Shear Wave Velocity from Cone Penetration Test for Structured Busan Clays

Authors: Vinod K. Singh, S. G. Chung

Abstract:

The degree of structuration of Busan clays at the mouth of the Nakdong River was highly influenced by the depositional environment, i.e., the flow of the river stream and marine regression and transgression during the sedimentation process. As a result, the geotechnical properties also vary with depth as the degree of structuration changes. Thus, in-situ tests such as the cone penetration test (CPT) could not properly predict various geotechnical properties using conventional empirical methods. In this paper, the shear wave velocity (Vs) was measured in the field using the seismic dilatometer. Vs was also measured in the laboratory on high-quality undisturbed and remolded samples using the bender element method to evaluate the degree of structuration. The degree of structuration was quantitatively defined by the modulus ratio of undisturbed to remolded soil samples, which was found to correlate well with the normalized void ratio (e0/eL), where eL is the void ratio at the liquid limit. It is revealed that the empirical method based on laboratory results incorporating e0/eL can predict Vs in the field more accurately. Thereafter, a CPT-based empirical method was developed to estimate the shear wave velocity, taking the effect of structuration into consideration. The developed method was found to predict the shear wave velocity reasonably well for Busan clays.

Keywords: level of structuration, normalized modulus, normalized void ratio, shear wave velocity, site characterization

Procedia PDF Downloads 207
20421 Empirical Modeling of Air Dried Rubberwood Drying System

Authors: S. Khamtree, T. Ratanawilai, C. Nuntadusit

Abstract:

Rubberwood is a crucial commercial timber in Southern Thailand. All processes in rubberwood production depend on the knowledge and expertise of the technicians, especially the drying process. This research aims to develop an empirical model for the drying kinetics of rubberwood. During the experiment, the temperature of the hot air and the average air flow velocity were kept at 80-100 °C and 1.75 m/s, respectively. Drying was considered complete when the moisture content of the samples fell below 12% (dry basis). The drying kinetics were simulated using an empirical solver. The experimental results illustrated that the moisture content decreased as the drying temperature and time increased. The agreement of the moisture ratio between the empirical models and the experimental data was tested with three statistical parameters, R-square (R²), Root Mean Square Error (RMSE), and Chi-square (χ²), to assess the accuracy of the fits. The experimental moisture ratio had a good fit with the empirical model. Additionally, the results indicated that drying rubberwood according to the Henderson and Pabis model showed a suitable level of agreement. This model presented an excellent estimation (R² = 0.9963) of the moisture movement compared to the other models. Therefore, the empirical results are valid and can be implemented in future experiments.
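
The Henderson and Pabis model referred to above is the single-term exponential MR(t) = a·exp(−kt). A minimal sketch of fitting it and computing R² and RMSE, with invented moisture-ratio data:

```python
# Fit the Henderson and Pabis thin-layer drying model MR = a*exp(-k*t)
# to measured moisture-ratio data and report goodness-of-fit statistics.
import numpy as np
from scipy.optimize import curve_fit

def henderson_pabis(t, a, k):
    return a * np.exp(-k * t)

t = np.array([0, 2, 4, 8, 12, 18, 24], dtype=float)        # drying time (h)
mr = np.array([1.00, 0.78, 0.61, 0.38, 0.24, 0.12, 0.07])  # moisture ratio

(a, k), _ = curve_fit(henderson_pabis, t, mr, p0=(1.0, 0.1))
residuals = mr - henderson_pabis(t, a, k)
r_squared = 1 - np.sum(residuals**2) / np.sum((mr - mr.mean())**2)
rmse = np.sqrt(np.mean(residuals**2))
print(f"a={a:.3f}, k={k:.3f} 1/h, R^2={r_squared:.4f}, RMSE={rmse:.4f}")
```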

Keywords: empirical models, rubberwood, moisture ratio, hot air drying

Procedia PDF Downloads 239
20420 An Approximate Formula for Calculating the Fundamental Mode Period of Vibration of Practical Buildings

Authors: Abdul Hakim Chikho

Abstract:

Most international codes allow the use of an equivalent lateral load method for designing practical buildings to withstand earthquake actions. This method requires calculating an approximation to the fundamental mode period of vibration of these buildings. Several empirical equations have been suggested for calculating approximations to the fundamental periods of different types of structures. Most of these equations are known to provide only a crude approximation to the required fundamental period, and repeating the calculation with a more accurate formula is usually required. In this paper, a new formula to calculate a satisfactory approximation of the fundamental period of a practical building is proposed. This formula takes into account the mass and the stiffness of the building; therefore, it is more logical than the conventional empirical equations. In order to verify the accuracy of the proposed formula, several examples have been solved, in which the fundamental mode periods of several framed buildings were calculated using both the proposed formula and the conventional empirical equations. Comparing the obtained results with those obtained from dynamic computer analysis has shown that the proposed formula provides a more accurate estimation of the fundamental periods of practical buildings. Since the proposed method is simple to use and requires only minimal computing effort, it is believed to be ideally suited for design purposes.
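
The abstract does not state the proposed formula. As a reference point for a mass-and-stiffness-based approximation, the sketch below implements the classical Rayleigh method, T = 2π√(Σwᵢδᵢ² / gΣfᵢδᵢ), with illustrative story weights, forces, and deflections:

```python
# Classical Rayleigh method for the fundamental period of a framed
# building (not the paper's formula); all numbers are illustrative.
import numpy as np

g = 9.81                                     # m/s^2
w = np.array([800.0, 800.0, 600.0]) * 1e3    # story weights (N)
f = np.array([100.0, 200.0, 300.0]) * 1e3    # applied lateral forces (N)
d = np.array([0.010, 0.022, 0.030])          # resulting elastic deflections (m)

T = 2 * np.pi * np.sqrt(np.sum(w * d**2) / (g * np.sum(f * d)))
print(f"approximate fundamental period: {T:.2f} s")
```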

Keywords: earthquake, fundamental mode period, design, building

Procedia PDF Downloads 251
20419 Forecast Based on an Empirical Probability Function with an Adjusted Error Using Propagation of Error

Authors: Oscar Javier Herrera, Manuel Angel Camacho

Abstract:

This paper addresses a cutting-edge method of business demand forecasting based on an empirical probability function, applicable when the historical behavior of the data is random. Additionally, it presents error determination based on the numerical technique of propagation of errors. The methodology began with the characterization and diagnostics of the demand planning process as part of production management; new ways to predict demand through probability techniques and to calculate the associated error were then investigated using numerical methods, all based on the behavior of the data. The analysis was carried out considering the specific business circumstances of a company in the communications sector, located in the city of Bogota, Colombia. In conclusion, using this application it was possible to obtain the adequate stock of the products required by the company to provide its services, helping the company reduce its service time, increase the client satisfaction rate, reduce stock which had not been in rotation for a long time, code its inventory, and plan reorder points for the replenishment of stock.
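
The two ingredients named above can be sketched together: a forecast drawn from the empirical probability function of the historical demand, and an error bar obtained by propagation of errors. The demand history below is invented, and the simple σ/√n propagation is an illustrative special case:

```python
# (1) Forecast by inverse-transform sampling from the empirical
# distribution of historical demand; (2) propagate the uncertainty.
import numpy as np

history = np.array([120, 95, 130, 110, 150, 98, 140, 125, 105, 135])
rng = np.random.default_rng(0)

# (1) Empirical probability function: draw forecasts as quantiles of
# the history at uniformly random probability levels.
u = rng.uniform(0, 1, size=5000)
forecasts = np.quantile(history, u)
point_forecast = forecasts.mean()

# (2) Propagation of error for the mean of n i.i.d. observations:
# each partial derivative is 1/n, so sigma_mean = sigma / sqrt(n).
sigma = history.std(ddof=1)
sigma_mean = sigma / np.sqrt(len(history))
print(f"forecast: {point_forecast:.1f} +/- {sigma_mean:.1f} units")
```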

Keywords: demand forecasting, empirical distribution, propagation of error, Bogota

Procedia PDF Downloads 587
20418 Research on Straightening Process Model Based on Iteration and Self-Learning

Authors: Hong Lu, Xiong Xiao

Abstract:

Shaft parts are widely used in the machinery industry; however, bending deformation often occurs when this kind of part is heat treated. These parts need to be straightened to meet the straightness requirement. In the pressure straightening process, a good straightening stroke algorithm determines the precision and efficiency of the straightening process. In this paper, the relationship between the straightening load and the deflection during the straightening process is analyzed, and a mathematical model of the straightening process is established. Based on the mathematical model, an iterative method is used to solve for the straightening stroke. Compared to the traditional straightening stroke algorithm, the straightening stroke calculated by this method is much more precise, because it can adapt to changes in the material performance parameters. Considering that the straightening method is widely used in the mass production of shaft parts, a knowledge base is used to store the data of the straightening process, and a straightening stroke algorithm based on empirical data is set up. In this paper, a straightening process control model that combines the iteration-based straightening stroke method with the straightening stroke algorithm based on empirical data is set up. Finally, an experiment is designed to verify the straightening process control model.
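
A minimal sketch of the iterative idea, under an assumed linear load-deflection model with a fixed springback ratio; both the model and its parameter are placeholders, not the paper's model:

```python
# Iteratively update the press stroke until the predicted residual
# deflection vanishes, under a toy elastic-springback model.
def residual_deflection(stroke, initial_deflection, springback_ratio=0.35):
    """Deflection remaining after pressing: plastic set minus target."""
    plastic_set = (1.0 - springback_ratio) * stroke
    return initial_deflection - plastic_set

def solve_stroke(initial_deflection, tol=1e-4, max_iter=50):
    stroke = initial_deflection          # first guess: stroke = deflection
    for _ in range(max_iter):
        r = residual_deflection(stroke, initial_deflection)
        if abs(r) < tol:
            break
        stroke += r / (1.0 - 0.35)       # Newton step for this linear model
    return stroke

print(f"required stroke: {solve_stroke(0.80):.3f} mm")  # for 0.80 mm deflection
```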

Keywords: straightness, straightening stroke, deflection, shaft parts

Procedia PDF Downloads 297
20417 Key Success Factors of Customer Relationship Management: An Empirical Study of Tunisian Firms

Authors: Khlif Hamadi

Abstract:

Customer Relationship Management (CRM) has become a main interest of researchers and practitioners, especially in the domains of Management and Information Systems (IS). This paper is an overview of success factors that could facilitate the successful adoption of CRM. Two factors are considered: the organizational climate and the capacity for innovation. The survey was conducted with 200 CRM users. The empirical research follows the positivist paradigm, based on the hypothetico-deductive method. Indeed, the approach adopted is a quantitative approach based on a questionnaire completed by Tunisian companies operating in different sectors of activity. For the data analyses, the structural equations method was used to conduct our exploratory and confirmatory analyses. The results revealed that a creative organizational climate and a high innovation capacity positively influence the success of CRM practice.

Keywords: CRM practices, innovation capacity, organizational climate, the structural equation

Procedia PDF Downloads 92
20416 Sentiment Analysis of Ensemble-Based Classifiers for E-Mail Data

Authors: Muthukumarasamy Govindarajan

Abstract:

The detection of unwanted, unsolicited emails, called spam, is an interesting area of research. It is necessary to evaluate the performance of any new spam classifier using standard data sets. Recently, ensemble-based classifiers have gained popularity in this domain. In this research work, an efficient email filtering approach based on ensemble methods is addressed for developing an accurate and sensitive spam classifier. The proposed approach employs Naive Bayes (NB), Support Vector Machine (SVM), and Genetic Algorithm (GA) as base classifiers along with different ensemble methods. The experimental results show that the ensemble classifiers performed with greater accuracy than the individual classifiers, and the hybrid model results were found to be better than those of the combined models for the e-mail dataset. The proposed ensemble-based classifiers turn out to be good in terms of classification accuracy, which is considered to be an important criterion for building a robust spam classifier.

Keywords: accuracy, arcing, bagging, genetic algorithm, Naive Bayes, sentiment mining, support vector machine

Procedia PDF Downloads 111
20415 Stating Best Commercialization Method: An Unanswered Question from Scholars and Practitioners

Authors: Saheed A. Gbadegeshin

Abstract:

A commercialization method is a means of making inventions available on the market for final consumption. It is described as an important tool for keeping business enterprises sustainable and for improving national economic growth. Thus, there are several scholarly publications on it, either presenting or testing different methods of commercialization. However, young entrepreneurs, technologists, and scientists would like to know the best method to commercialize their innovations. Then, this question arises: What is the best commercialization method? To answer the question, a systematic literature review was conducted, and practitioners were interviewed. The literature results revealed that there are many methods, but that new methods are needed to improve commercialization, especially during these times of economic crisis and political uncertainty. Similarly, the empirical results showed that there are several methods, but the best method is the one that reduces costs, reduces the risks associated with uncertainty, and improves customer participation and acceptability. Therefore, it was concluded that a new commercialization method is essential for today's high technologies, and such a method was presented.

Keywords: commercialization method, technology, knowledge, intellectual property, innovation, invention

Procedia PDF Downloads 301
20414 Applying Critical Realism to Qualitative Social Work Research: A Critical Realist Approach for Social Work Thematic Analysis Method

Authors: Lynne Soon-Chean Park

Abstract:

Critical Realism (CR) has emerged as an alternative to both the positivist and constructivist perspectives that have long dominated social work research. By unpacking the epistemic weaknesses of the two dogmatic perspectives, CR provides a useful philosophical approach that incorporates both the ontological objectivist and subjectivist stances. The CR perspective suggests an alternative approach for social work researchers who have long been looking to engage with the complex interplay between perceived reality at the empirical level and the objective reality that lies behind empirical events as a causal mechanism. However, despite the usefulness of CR in informing social work research, little practical guidance is available about how CR can inform methodological considerations in social work research studies. This presentation aims to provide a detailed description of CR-informed thematic analysis by drawing examples from doctoral social work research on Korean migrants' experiences and understanding of trust in relation to their settlement experience in New Zealand. Because of its theoretical flexibility and accessibility as a qualitative analysis method, thematic analysis can be applied as a method that works both to search for the demi-regularities of the collected data and to identify the causal mechanisms that lie behind the empirical data. In so doing, this presentation seeks to provide a concrete and detailed exemplar for social work researchers wishing to employ CR in their qualitative thematic analysis process.

Keywords: critical Realism, data analysis, epistemology, research methodology, social work research, thematic analysis

Procedia PDF Downloads 186
20413 Analysis of Dynamics Underlying the Observation Time Series by Using a Singular Spectrum Approach

Authors: O. Delage, H. Bencherif, T. Portafaix, A. Bourdier

Abstract:

The main purpose of time series analysis is to learn about the dynamics behind time-ordered measurement data. Two approaches are used in the literature to gain a better knowledge of the dynamics contained in observational data sequences. The first of these approaches concerns time series decomposition, an important analysis step allowing patterns and behaviors to be extracted as components, providing insight into the mechanisms producing the time series. In many cases, time series are short, noisy, and non-stationary. To provide components which are physically meaningful, methods such as Empirical Mode Decomposition (EMD), the Empirical Wavelet Transform (EWT) or, more recently, Empirical Adaptive Wavelet Decomposition (EAWD) have been proposed. The second approach is to reconstruct the dynamics underlying the time series as a trajectory in state space by mapping the time series into a set of lag vectors in Rᵐ using the method of delays (MOD). Takens proved that the trajectory obtained with the MOD technique is equivalent to the trajectory representing the dynamics behind the original time series. This work introduces singular spectrum decomposition (SSD), a new adaptive method for decomposing non-linear and non-stationary time series into narrow-banded components. This method takes its origin from singular spectrum analysis (SSA), a nonparametric spectral estimation method used for the analysis and prediction of time series. As the first step of SSD is to construct a trajectory matrix by embedding a one-dimensional time series into a set of lagged vectors, SSD can also be seen as a reconstruction method like MOD. We first give a brief overview of the existing decomposition methods (EMD, EWT, EAWD). The SSD method is then described in detail and applied to experimental time series of observations resulting from total column ozone measurements. The results obtained are compared with those provided by the previously mentioned decomposition methods. We also compare the reconstruction quality of the observed dynamics obtained from the SSD and MOD methods.
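
The shared first step of SSD, SSA, and the MOD can be sketched directly: embed the series into a trajectory (Hankel) matrix of lagged vectors and split it with an SVD. The window length, toy series, and two-mode grouping below are illustrative choices:

```python
# SSA-style first step: embed a scalar series into a trajectory (Hankel)
# matrix, decompose it by SVD, and reconstruct one component group by
# diagonal (anti-diagonal) averaging.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(400)
series = np.sin(2 * np.pi * t / 50) + 0.3 * rng.standard_normal(t.size)

L = 80                                   # embedding window (lag-vector length)
K = series.size - L + 1
traj = np.column_stack([series[i:i + L] for i in range(K)])  # L x K Hankel matrix

U, s, Vt = np.linalg.svd(traj, full_matrices=False)
print("share of variance in first two modes: %.2f" % (np.sum(s[:2]**2) / np.sum(s**2)))

# Reconstruct the component for mode group {0, 1} by diagonal averaging.
X01 = (U[:, :2] * s[:2]) @ Vt[:2]
component = np.array([np.mean(X01[::-1].diagonal(k)) for k in range(-L + 1, K)])
```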

Keywords: time series analysis, adaptive time series decomposition, wavelet, phase space reconstruction, singular spectrum analysis

Procedia PDF Downloads 77
20412 Landslide Susceptibility Mapping Using Soft Computing in Amhara Saint

Authors: Semachew M. Kassa, Africa M Geremew, Tezera F. Azmatch, Nandyala Darga Kumar

Abstract:

Frequency ratio (FR) and analytical hierarchy process (AHP) methods have been developed based on past landslide failure points to produce landslide susceptibility maps, because landslides can seriously harm both the environment and society. However, it is still difficult to select the most efficient method and correctly identify the main driving factors for particular regions. In this study, we used fourteen landslide conditioning factors (LCFs) and five soft computing algorithms, including Random Forest (RF), Support Vector Machine (SVM), Logistic Regression (LR), Artificial Neural Network (ANN), and Naïve Bayes (NB), to predict landslide susceptibility at a 12.5 m spatial scale. The performance of the RF (F1-score: 0.88, AUC: 0.94), ANN (F1-score: 0.85, AUC: 0.92), and SVM (F1-score: 0.82, AUC: 0.86) methods was significantly better than that of the LR (F1-score: 0.75, AUC: 0.76) and NB (F1-score: 0.73, AUC: 0.75) methods, according to the classification results based on inventory landslide points. The findings also showed that around 35% of the study region consists of areas with high or very high landslide risk (susceptibility greater than 0.5). The very high-risk locations were primarily found in the western and southeastern regions, and all five models showed good agreement and similar geographic distribution patterns in landslide susceptibility. The areas with the highest landslide risk include the western and northern parts of Amhara Saint Town and the St. Gebreal Church villages, with mean susceptibility values greater than 0.5. Rainfall, distance to roads, and slope were typically among the leading factors for most villages, although the primary contributing factors to landslide vulnerability varied slightly across the five models. Decision-makers and policy planners can use the information from our study to make informed decisions and establish policies. It also suggests that different places should adopt different safeguards to reduce or prevent serious damage from landslide events.

Keywords: artificial neural network, logistic regression, landslide susceptibility, naïve Bayes, random forest, support vector machine

Procedia PDF Downloads 36
20411 Improved 3D Structure Prediction of Beta-Barrel Membrane Proteins by Using Evolutionary Coupling Constraints, Reduced State Space and an Empirical Potential Function

Authors: Wei Tian, Jie Liang, Hammad Naveed

Abstract:

Beta-barrel membrane proteins are found in the outer membrane of gram-negative bacteria, mitochondria, and chloroplasts. They carry out diverse biological functions, including pore formation, membrane anchoring, enzyme activity, and bacterial virulence. In addition, beta-barrel membrane proteins increasingly serve as scaffolds for bacterial surface display and nanopore-based DNA sequencing. Due to difficulties in experimental structure determination, they are sparsely represented in the protein structure databank, and computational methods can help in understanding their biophysical principles. We have developed a novel computational method to predict the 3D structure of beta-barrel membrane proteins using evolutionary coupling (EC) constraints and a reduced state space. Combining these with an empirical potential function, we can successfully predict the strand register at >80% accuracy for a set of 49 non-homologous proteins with known structures. This is a significant improvement over previous results using EC alone (44%) or the empirical potential function alone (73%). Our method is general and can be applied to genome-wide structural prediction.

Keywords: beta-barrel membrane proteins, structure prediction, evolutionary constraints, reduced state space

Procedia PDF Downloads 578
20410 Empirical Mode Decomposition Based Denoising by Customized Thresholding

Authors: Wahiba Mohguen, Raïs El’hadi Bekka

Abstract:

This paper presents a denoising method called EMD-Custom that is based on Empirical Mode Decomposition (EMD) and a modified customized thresholding function (Custom). EMD is applied to adaptively decompose a noisy signal into intrinsic mode functions (IMFs). Then, all the noisy IMFs are thresholded by applying the presented thresholding function to suppress noise and to improve the signal-to-noise ratio (SNR). The method was tested on simulated data and a real ECG signal, and the results were compared to those of EMD-based signal denoising methods using soft and hard thresholding. The results showed the superior performance of the proposed EMD-Custom denoising over the traditional approaches. The performances were evaluated in terms of SNR (in dB) and Mean Square Error (MSE).
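
A minimal sketch of the general EMD-thresholding scheme, assuming the PyEMD package (installable as EMD-signal); note that plain soft thresholding with the universal threshold is used here in place of the paper's customized thresholding function:

```python
# EMD-based denoising: decompose into IMFs, soft-threshold each IMF
# with a MAD-based universal threshold, and sum the results.
import numpy as np
from PyEMD import EMD

def emd_soft_denoise(signal):
    imfs = EMD()(signal)                            # adaptive decomposition
    out = np.zeros_like(signal)
    for imf in imfs:
        sigma = np.median(np.abs(imf)) / 0.6745     # robust noise estimate (MAD)
        thr = sigma * np.sqrt(2 * np.log(len(signal)))   # universal threshold
        out += np.sign(imf) * np.maximum(np.abs(imf) - thr, 0.0)  # soft rule
    return out

t = np.linspace(0, 1, 1000)
noisy = (np.sin(2 * np.pi * 5 * t)
         + 0.4 * np.random.default_rng(2).standard_normal(t.size))
denoised = emd_soft_denoise(noisy)
```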

Keywords: customized thresholding, ECG signal, EMD, hard thresholding, soft-thresholding

Procedia PDF Downloads 279
20409 Identification of EEG Attention Level Using Empirical Mode Decompositions for BCI Applications

Authors: Chia-Ju Peng, Shih-Jui Chen

Abstract:

This paper proposes a method to discriminate between electroencephalogram (EEG) signals recorded in different concentration states using empirical mode decomposition (EMD). A brain-computer interface (BCI), also called a brain-machine interface, is a direct communication pathway between the brain and an external device that bypasses the usual pathways such as the peripheral nervous system and skeletal muscles. The attention level is a common index used as a control signal in BCI systems. The EEG signals, acquired from people paying attention or relaxing, respectively, are decomposed into a set of intrinsic mode functions (IMFs) by EMD. Fast Fourier transform (FFT) analysis is then applied to each IMF to obtain its frequency spectrum. By observing the power spectra of the IMFs, the proposed method identifies the EEG attention level between different concentration states better than the original EEG signals do. The band power of IMF3 is the most discriminative, especially in the β band, which corresponds to being fully awake and generally alert. The signal processing method and results of this experiment pave a new way for BCI robotic systems using the attention-level control strategy. The integrated signal processing method reveals appropriate information for discriminating between attention and relaxation, contributing to enhanced BCI performance.
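
The discrimination step can be sketched as follows: FFT one IMF and measure the share of its power falling in the β band (~13-30 Hz). The sampling rate, band edges, and toy IMF below are assumptions:

```python
# Relative beta-band power of a single IMF via FFT; `imf` would come
# from an EMD of the EEG recording in a full pipeline.
import numpy as np

def beta_band_power(imf, fs=256.0, band=(13.0, 30.0)):
    """Fraction of one IMF's spectral power inside the beta band."""
    spec = np.abs(np.fft.rfft(imf)) ** 2
    freqs = np.fft.rfftfreq(imf.size, d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spec[in_band].sum() / spec.sum()

# Toy IMF: a 20 Hz oscillation standing in for IMF3 of an attentive recording.
t = np.arange(0, 2, 1 / 256.0)
imf3 = np.sin(2 * np.pi * 20 * t)
print(f"beta-band share of IMF3 power: {beta_band_power(imf3):.2f}")
```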

Keywords: biomedical engineering, brain computer interface, electroencephalography, rehabilitation

Procedia PDF Downloads 367