Search results for: data mining techniques
28544 Effective Counseling Techniques Working with At-Risk Youth in Residential and Outpatient Settings
Authors: David A. Scott, Michelle G. Scott
Abstract:
The problem of juvenile crime, school suspensions and oppositional behaviors indicates a need for a wide range of intervention programs for at-risk youth. Juvenile court systems and mental health agencies are examining alternative ways to deal with at-risk youth that will allow the adolescent to live within their home community. The previous trend that treatment away from home is more effective than treatment near one's community has shifted. Research now suggests that treatment be close to home for several reasons, such as increased treatment success, parental involvement, and reduced costs. Treatment options consist of a wide range of interventions, including outpatient, inpatient, and community-based services (therapeutic group homes, foster care and in-home preservation services). The juvenile justice system, families and other mental health agencies continue to seek the most effective treatment for at-risk youth in their communities. This research examines two possible treatment modalities, a multi-systemic outpatient program and a residential program. Research examining effective, evidence-based counseling will be discussed during this presentation. The presenter recently completed a three-year research grant examining effective treatment modalities for at-risk youth participating in a multi-systemic program. The presenter has also been involved in several research activities gathering data on effective techniques used in residential programs. The data and discussion will be broken down into two parts, each discussing one of the treatment modalities mentioned above. Data on the residential programs was collected both on a sample of 740 at-risk youth over a five-year period and on a sample of 63 participants residing in a residential program during a one-year period.
The effectiveness of these residential services was measured in three ways: services are evaluated by primary referral sources; follow-up data is obtained at various intervals after program participation to measure recidivism (what percentage got back into trouble with the Department of Juvenile Justice); and a more sensitive "Offense Seriousness Score" has been computed and analyzed prior to, during and after treatment in the residential program. Data on the multi-systemic program was gathered over the past three years on 190 participants. Research will discuss pre- and post-test results, recidivism rates, academic performance, parental involvement, and effective counseling treatment modalities.
Keywords: at-risk youth, group homes, therapeutic group homes, recidivism rates
Procedia PDF Downloads 82
28543 Electrospray Deposition Technique of Dye Molecules in the Vacuum
Authors: Nouf Alharbi
Abstract:
The electrospray deposition technique has become an important method that enables fragile, nonvolatile molecules to be deposited in situ in high vacuum environments. Furthermore, it is considered one of the ways to close the gap between basic surface science and molecular engineering, representing a gradual change in the scope of scientific research. This paper also discusses one of the most important techniques developed to help further characterize the electrospray by providing data collected using an image charge detection instrument. Image charge detection mass spectrometry (CDMS) is used to measure speed and charge distributions of the molecular ions. In addition, some data obtained with SIMION simulations are included, simulating the energies and masses of the molecular ions through the system in order to refine the mass-selection process.
Keywords: charge, deposition, electrospray, image, ions, molecules, SIMION
Procedia PDF Downloads 133
28542 The Functional Magnetic Resonance Imaging and the Consumer Behaviour: Reviewing Recent Research
Authors: Mikel Alonso López
Abstract:
In the first decade of the twenty-first century, advanced imaging techniques began to be applied to neuroscience research. Functional Magnetic Resonance Imaging (fMRI) is one of the most important and most used research techniques for the investigation of emotions, because of the ease with which it shows the brain areas that oxygenate when performing certain tasks. In this research, we review the main fMRI studies on the influence of emotions in the decision-making process.
Keywords: decision making, emotions, fMRI, consumer behaviour
Procedia PDF Downloads 479
28541 Performance of Environmental Efficiency of Energy Consumption in OPEC Countries
Authors: Bahram Fathi, Mahdi Khodaparast Mashhadi, Masuod Homayounifar
Abstract:
Global awareness of energy security and climate change has created much interest in assessing energy efficiency performance. A number of previous studies have contributed to evaluating energy efficiency performance using different analytical techniques, among which data envelopment analysis (DEA) has recently received increasing attention. Most DEA-related energy efficiency studies do not consider undesirable outputs such as CO2 emissions in their modeling framework, which may lead to biased energy efficiency values. Within a joint production framework of desirable and undesirable outputs, in this paper we construct an energy efficiency performance index by using an environmental DEA model with CO2 emissions. We finally apply the proposed index to assess energy efficiency performance in OPEC countries over time.
Keywords: energy efficiency, environmental, OPEC, data envelopment analysis
Procedia PDF Downloads 387
28540 Modern Proteomics and the Application of Machine Learning Analyses in Proteomic Studies of Chronic Kidney Disease of Unknown Etiology
Authors: Dulanjali Ranasinghe, Isuru Supasan, Kaushalya Premachandra, Ranjan Dissanayake, Ajith Rajapaksha, Eustace Fernando
Abstract:
Proteomics studies of organisms are considered to be significantly information-rich compared to their genomic counterparts because the proteome of an organism represents the expressed state of all proteins of that organism at a given time. In modern top-down and bottom-up proteomics workflows, the primary analysis methods employed are gel-based methods such as two-dimensional (2D) electrophoresis and mass spectrometry based methods. Machine learning (ML) and artificial intelligence (AI) have been used increasingly in modern biological data analyses. In particular, the fields of genomics, DNA sequencing, and bioinformatics have seen an incremental trend in the usage of ML and AI techniques in recent years. The use of these techniques in the field of proteomics is only now beginning to materialise. Although there is a wealth of information available in the scientific literature pertaining to proteomics workflows, no comprehensive review addresses the various aspects of the combined use of proteomics and machine learning. The objective of this review is to provide a comprehensive outlook on the application of machine learning to the known proteomics workflows in order to extract more meaningful information that could be useful in a plethora of applications such as medicine, agriculture, and biotechnology.
Keywords: proteomics, machine learning, gel-based proteomics, mass spectrometry
Procedia PDF Downloads 151
28539 Landslide Hazard Zonation Using Satellite Remote Sensing and GIS Technology
Authors: Ankit Tyagi, Reet Kamal Tiwari, Naveen James
Abstract:
Landslides are the major geo-environmental problem of the Himalaya because of its high ridges, steep slopes, deep valleys, and complex system of streams. They are mainly triggered by rainfall and earthquakes, causing severe damage to life and property. In Uttarakhand, the Tehri reservoir rim area, which is situated in the lesser Himalaya of the Garhwal hills, was selected for landslide hazard zonation (LHZ). The study utilized different types of data, including geological maps, topographic maps from the Survey of India, Landsat 8 imagery, and Cartosat DEM data. This paper presents the use of a weighted overlay method in LHZ using fourteen causative factors. The data layers generated and co-registered were slope, aspect, relative relief, soil cover, intensity of rainfall, seismic ground shaking, seismic amplification at surface level, lithology, land use/land cover (LULC), normalized difference vegetation index (NDVI), topographic wetness index (TWI), stream power index (SPI), drainage buffer, and reservoir buffer. Seismic analysis is performed using peak horizontal acceleration (PHA) intensity and amplification factors in the evaluation of the landslide hazard index (LHI). Several digital image processing techniques, such as topographic correction, NDVI, and supervised classification, were widely used in the process of terrain factor extraction. Lithological features, LULC, drainage patterns, lineaments, and structural features are extracted using digital image processing techniques. Colour, tone, topography, and stream drainage patterns from the imagery are used to analyse geological features. The slope map, aspect map, and relative relief map are created using Cartosat DEM data. The DEM data are also used for the detailed drainage analysis, which includes TWI, SPI, drainage buffer, and reservoir buffer. In the weighted overlay method, the comparative importance of the causative factors is obtained from experience.
In this method, the rating of each class of a causative factor is multiplied by that factor's influence weight, the weighted layers are combined and reclassified, and the LHZ map is prepared. Further, based on the land-use map developed from remote sensing images, a landslide vulnerability study for the study area is carried out and presented in this paper.
Keywords: weighted overlay method, GIS, landslide hazard zonation, remote sensing
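The weighted overlay step described above can be sketched numerically. The rating maps, influence weights, and zone thresholds below are illustrative assumptions, not values from the study:

```python
import numpy as np

# hypothetical rating maps (1 = low susceptibility ... 5 = high), one per causative factor
slope     = np.array([[1, 3], [4, 5]])
rainfall  = np.array([[2, 2], [3, 5]])
lithology = np.array([[1, 4], [4, 4]])

# assumed influence weights (must sum to 1); in practice set from expert judgement
layers  = [slope, rainfall, lithology]
weights = [0.5, 0.3, 0.2]

# weighted overlay: multiply each rating map by its influence and sum
lhi = sum(w * layer for w, layer in zip(weights, layers))

# reclassify the landslide hazard index into zones: 0 = low, 1 = moderate, 2 = high
zones = np.digitize(lhi, bins=[2.0, 3.5])
```

The zone thresholds are where a real study would calibrate against observed landslide inventories.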
Procedia PDF Downloads 133
28538 Exploring the Synergistic Effects of Aerobic Exercise and Cinnamon Extract on Metabolic Markers in Insulin-Resistant Rats through Advanced Machine Learning and Deep Learning Techniques
Authors: Masoomeh Alsadat Mirshafaei
Abstract:
The present study aims to explore the effect of an 8-week aerobic training regimen combined with cinnamon extract on serum irisin and leptin levels in insulin-resistant rats. Additionally, this research leverages various machine learning (ML) and deep learning (DL) algorithms to model the complex interdependencies between exercise, nutrition, and metabolic markers, offering a groundbreaking approach to obesity and diabetes research. Forty-eight Wistar rats were selected and randomly divided into four groups: control, training, cinnamon, and training cinnamon. The training protocol was conducted over 8 weeks, with sessions 5 days a week at 75-80% VO2 max. The cinnamon and training-cinnamon groups were injected with 200 ml/kg/day of cinnamon extract. Data analysis included serum data, dietary intake, exercise intensity, and metabolic response variables, with blood samples collected 72 hours after the final training session. The dataset was analyzed using one-way ANOVA (P<0.05) and fed into various ML and DL models, including Support Vector Machines (SVM), Random Forest (RF), and Convolutional Neural Networks (CNN). Traditional statistical methods indicated that aerobic training, with and without cinnamon extract, significantly increased serum irisin and decreased leptin levels. Among the algorithms, the CNN model provided superior performance in identifying specific interactions between cinnamon extract concentration and exercise intensity, optimizing the increase in irisin and the decrease in leptin. The CNN model achieved an accuracy of 92%, outperforming the SVM (85%) and RF (88%) models in predicting the optimal conditions for metabolic marker improvements. The study demonstrated that advanced ML and DL techniques could uncover nuanced relationships and potential cellular responses to exercise and dietary supplements, which is not evident through traditional methods. 
These findings advocate for the integration of advanced analytical techniques in nutritional science and exercise physiology, paving the way for personalized health interventions in managing obesity and diabetes.
Keywords: aerobic training, cinnamon extract, insulin resistance, irisin, leptin, convolutional neural networks, exercise physiology, support vector machines, random forest
Procedia PDF Downloads 38
28537 Resident-Aware Green Home
Authors: Ahlam Elkilani, Bayan Elsheikh Ali, Rasha Abu Romman, Amjed Al-mousa, Belal Sababha
Abstract:
The amount of energy the world uses doubles every 20 years. Green homes play an important role in reducing residential energy demand. This paper presents a platform that is intended to learn the behavior of home residents and build a profile of their habits and actions. The proposed resident-aware home controller intervenes in the operation of home appliances in order to save energy without compromising the convenience of the residents. The presented platform can be used to simulate the actions and movements happening inside a home. The paper includes several optimization techniques that are meant to save energy in the home. In addition, several test scenarios are presented that show how the controller works. Moreover, this paper reports the computed actual savings when each of the presented techniques is implemented in a typical home. The test scenarios have validated that the techniques developed are capable of effectively saving energy in homes.
Keywords: green home, resident aware, resident profile, activity learning, machine learning
Procedia PDF Downloads 389
28536 Using Multi-Arm Bandits to Optimize Game Play Metrics and Effective Game Design
Authors: Kenny Raharjo, Ramon Lawrence
Abstract:
Game designers have the challenging task of building games that engage players to spend their time and money on the game. There is an infinite number of game variations and design choices, and it is hard to systematically determine the game design choices that will produce positive experiences for players. In this work, we demonstrate how multi-arm bandits can be used to automatically explore game design variations to achieve improved player metrics. The advantage of multi-arm bandits is that they allow for continuous experimentation and variation, intrinsically converge to the best solution, and require no special infrastructure to use beyond allowing minor game variations to be deployed to users for evaluation. A user study confirms that applying multi-arm bandits was successful in determining the preferred game variation with the highest play time metrics and can be a useful technique in a game designer's toolkit.
Keywords: game design, multi-arm bandit, design exploration and data mining, player metric optimization and analytics
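The abstract does not specify which bandit policy was used; a minimal epsilon-greedy sketch, with hypothetical per-variant reward probabilities, illustrates the continuous explore/exploit loop that converges on the best game variation:

```python
import random

def epsilon_greedy(arm_probs, pulls=10000, epsilon=0.1, seed=42):
    """Epsilon-greedy bandit over Bernoulli 'game variants': with probability
    epsilon serve a random variant (explore), otherwise the best-looking one."""
    rng = random.Random(seed)
    n = len(arm_probs)
    counts = [0] * n      # times each variant was served
    values = [0.0] * n    # running mean reward per variant
    for _ in range(pulls):
        if rng.random() < epsilon:
            arm = rng.randrange(n)                          # explore
        else:
            arm = max(range(n), key=lambda a: values[a])    # exploit
        reward = 1.0 if rng.random() < arm_probs[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
    return counts, values

# hypothetical probability that a session on each variant hits a play-time target
counts, values = epsilon_greedy([0.1, 0.2, 0.6])
```

After enough sessions the third (best) variant is served far more often than the others, which is exactly the "intrinsic convergence" property the abstract highlights.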
Procedia PDF Downloads 510
28535 Identifying Factors Contributing to the Spread of Lyme Disease: A Regression Analysis of Virginia’s Data
Authors: Fatemeh Valizadeh Gamchi, Edward L. Boone
Abstract:
This research focuses on Lyme disease, a widespread infectious condition in the United States caused by the bacterium Borrelia burgdorferi sensu stricto. It is critical to identify the environmental and economic factors that are contributing to the spread of the disease. This study examined data from Virginia to identify a subset of explanatory variables significant for Lyme disease case numbers. To identify relevant variables and avoid overfitting, linear Poisson regression and regularized regression methods with ridge, lasso, and elastic net penalties were employed. Cross-validation was performed to acquire the tuning parameters. The methods proposed can automatically identify relevant disease count covariates. The efficacy of the techniques was assessed using four criteria on three simulated datasets. Finally, using the Virginia Department of Health's Lyme disease data set, the study successfully identified key factors, and the results were consistent with previous studies.
Keywords: Lyme disease, Poisson generalized linear model, ridge regression, lasso regression, elastic net regression
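As a sketch of the penalised count model described above, here is a minimal L2-penalised (ridge) Poisson regression fitted by gradient ascent on synthetic data. The data, penalty strength, and learning rate are illustrative assumptions; the study additionally used lasso and elastic net penalties and cross-validated the tuning parameters:

```python
import numpy as np

def poisson_ridge(X, y, alpha=0.01, lr=0.01, iters=5000):
    """Ridge-penalised Poisson GLM (log link) fit by gradient ascent on the
    mean penalised log-likelihood. A sketch; real analyses would also try
    lasso/elastic-net penalties and pick alpha by cross-validation."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        mu = np.exp(X @ beta)                      # Poisson mean under log link
        grad = X.T @ (y - mu) / n - alpha * beta   # score minus ridge gradient
        beta += lr * grad
    return beta

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))                      # three hypothetical covariates
true_beta = np.array([0.8, 0.0, -0.5])             # the second covariate is irrelevant
y = rng.poisson(np.exp(X @ true_beta))             # synthetic case counts
beta_hat = poisson_ridge(X, y)
```

On this synthetic data the fitted coefficients recover the informative covariates while the irrelevant one stays near zero, which is the variable-identification behaviour the abstract describes.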
Procedia PDF Downloads 137
28534 The Effectiveness of Electronic Local Financial Management Information System (ELFMIS) in Mempawah Regency, West Borneo Province, Indonesia
Authors: Muhadam Labolo, Afdal R. Anwar, Sucia Miranti Sipisang
Abstract:
The Electronic Local Finance Management Information System (ELFMIS) is an integrated application used as a tool by local governments to improve the effectiveness of implementing financial management regulations in various areas. The 'Appropriate With Exceptions' (WDP) opinion issued by the Indonesia Audit Agency (BPK) for the Mempawah local government points to a financial management problem that must be resolved to avoid mistakes in decision-making. The use of ELFMIS by the Mempawah authority has not yet reached its full potential. These problems became the basis for this research, which measures the effectiveness of ELFMIS in Mempawah regency. This research uses the indicator variables for measuring information system effectiveness proposed by Bodnar. It employs a descriptive method with an inductive approach. Data collection combined qualitative and quantitative techniques, using questionnaires, interviews and documentation. The obstacles to the application of ELFMIS in the Local Finance Board (LFB) include connectivity, the quality and quantity of human resources, the realization of financial resources, the absence of maintenance and other ELFMIS facilities, and the verification of financial information.
Keywords: effectiveness, E-LFMIS, finance, local government, system
Procedia PDF Downloads 219
28533 Feature Selection Approach for the Classification of Hydraulic Leakages in Hydraulic Final Inspection using Machine Learning
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
Manufacturing companies are facing global competition and enormous cost pressure. The use of machine learning applications can help reduce production costs and create added value. Predictive quality enables the securing of product quality through data-supported predictions, using machine learning models as a basis for decisions on test results. Furthermore, machine learning methods are able to process large amounts of data, deal with unfavourable row-column ratios, and detect dependencies between the covariates and the given target, as well as assess the multidimensional influence of all input variables on the target. Real production data are often subject to highly fluctuating boundary conditions and unbalanced data sets. Changes in production data manifest themselves in trends, systematic shifts, and seasonal effects. Thus, machine learning applications require intensive pre-processing and feature selection. Data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets. Within the real data set of Bosch hydraulic valves used here, the comparability of the same production conditions within certain time periods can be identified by applying the concept drift method. Furthermore, a classification model is developed to evaluate the feature importance in different subsets within the identified time periods. By selecting comparable and stable features, the number of features used can be significantly reduced without a strong decrease in predictive power. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predict the quality characteristics of workpieces. In this research, the AdaBoost classifier is used to predict the leakage of hydraulic valves based on geometric gauge blocks from machining, mating data from the assembly, and hydraulic measurement data from end-of-line testing.
In addition, the most suitable methods are selected and accurate quality predictions are achieved.
Keywords: classification, machine learning, predictive quality, feature selection
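The paper ranks features by their importance in an AdaBoost model; as a simplified stand-in, the sketch below ranks features by absolute correlation with a binary leakage label and keeps the top k. The variable names and synthetic data are illustrative assumptions, not Bosch data:

```python
import numpy as np

def select_top_features(X, y, k=2):
    """Filter-style feature selection: rank columns of X by absolute
    correlation with the leakage label y and keep the k best
    (a stand-in for the AdaBoost feature-importance ranking in the paper)."""
    scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(7)
n = 400
gauge = rng.normal(size=n)    # hypothetical geometric gauge-block measurement (informative)
mating = rng.normal(size=n)   # hypothetical assembly mating value (informative)
noise = rng.normal(size=n)    # an irrelevant covariate
# synthetic leakage label driven by the two informative features
y = (gauge + 0.5 * mating + 0.3 * rng.normal(size=n) > 0).astype(int)
X = np.column_stack([gauge, mating, noise])
top = select_top_features(X, y, k=2)
```

A stability check across the concept-drift time windows described above would then keep only features that rank highly in every window.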
Procedia PDF Downloads 162
28532 Heuristic Classification of Hydrophone Recordings
Authors: Daniel M. Wolff, Patricia Gray, Rafael de la Parra Venegas
Abstract:
An unsupervised machine listening system is constructed and applied to a dataset of 17,195 30-second marine hydrophone recordings. The system is then heuristically supplemented with anecdotal listening, contextual recording information, and supervised learning techniques to reduce the number of false positives. Features for classification are assembled by extracting the following data from each of the audio files: the spectral centroid, root-mean-squared values for each frequency band of a 10-octave filter bank, and mel-frequency cepstral coefficients in 5-second frames. In this way both time- and frequency-domain information are contained in the features to be passed to a clustering algorithm. Classification is performed using the k-means algorithm and then a k-nearest neighbors search. Different values of k are experimented with, in addition to different combinations of the available feature sets. Hypothesized class labels are 'primarily anthrophony' and 'primarily biophony', where the best class result conforming to the former label has 104 members after heuristic pruning. This demonstrates how a large audio dataset has been made more tractable with machine learning techniques, forming the foundation of a framework designed to acoustically monitor and gauge biological and anthropogenic activity in a marine environment.
Keywords: anthrophony, hydrophone, k-means, machine learning
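A minimal version of the clustering step can be sketched as follows; the two-dimensional features and cluster locations are synthetic stand-ins for the spectral-centroid/filter-bank/MFCC vectors described above:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means (Lloyd's algorithm) on pre-extracted audio feature vectors."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]  # init at k data points
    for _ in range(iters):
        # assign each recording to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# toy 2-D feature vectors for two hypothetical sound classes
rng = np.random.default_rng(1)
anthro = rng.normal([0.2, 0.8], 0.05, size=(30, 2))  # e.g. boat-noise-like features
bio = rng.normal([0.8, 0.2], 0.05, size=(30, 2))     # e.g. biophony-like features
X = np.vstack([anthro, bio])
labels, centers = kmeans(X, k=2)
```

In the paper's pipeline a k-nearest-neighbors search over these cluster assignments then supports the heuristic labelling of 'primarily anthrophony' versus 'primarily biophony'.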
Procedia PDF Downloads 170
28531 Relationship between the Ability of Accruals and Non-Systematic Risk of Shares for Companies Listed in Stock Exchange: Case Study, Tehran
Authors: Lina Najafian, Hamidreza Vakilifard
Abstract:
The present study focused on the relationship between the quality of accruals and non-systematic risk. The independent variables, considered as accruals quality measures, were the ability of accruals, the information content of accruals, and the amount of discretionary accruals. The dependent variable was non-systematic risk based on the Fama-French three-factor model (FFTFM) and the capital asset pricing model (CAPM). The control variables were firm size, financial leverage, stock return, cash flow fluctuations, and book-to-market ratio. The data collection method was based on library research and document mining, including financial statements. Multiple regression analysis was used to analyze the data. The study results showed that there is a significant direct relationship between financial leverage and discretionary accruals and non-systematic risk based on both FFTFM and CAPM. There is also a significant direct relationship between the ability of accruals, the information content of accruals, firm size, and stock return and non-systematic risk based on both models. It was also found that there is no relationship between book-to-market ratio or cash flow fluctuations and non-systematic risk.
Keywords: accruals quality, non-systematic risk, CAPM, FFTFM
Procedia PDF Downloads 159
28530 A Survey of Feature-Based Steganalysis for JPEG Images
Authors: Syeda Mainaaz Unnisa, Deepa Suresh
Abstract:
Due to the increase in usage of public domain channels, such as the internet, and of communication technology, there is concern about the protection of intellectual property and about security threats. This interest has led to growth in researching and implementing techniques for information hiding. Steganography is the art and science of hiding information in a private manner such that its existence cannot be recognized. Communication using steganographic techniques makes not only the secret message but also the very presence of hidden communication invisible. Steganalysis is the art of detecting the presence of this hidden communication. Parallel to steganography, steganalysis is also gaining prominence, since the detection of hidden messages can prevent catastrophic security incidents from occurring. Steganalysis can also be incredibly helpful in identifying and revealing holes in current steganographic techniques that make them vulnerable to attacks. The formulation of new, effective steganalysis methods can in turn drive further research into improving the resistance of the tested steganography techniques. The feature-based steganalysis method for JPEG images calculates the features of an image using the L1 norm of the difference between a stego image and the calibrated version of the image. This calibration can help retrieve some of the parameters of the cover image, revealing the variations between the cover and stego image and enabling a more accurate detection. Applying this method to various steganographic schemes, experimental results were compared and evaluated to derive conclusions and principles for more protected JPEG steganography.
Keywords: cover image, feature-based steganalysis, information hiding, steganalysis, steganography
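The calibration idea can be sketched as follows. Real feature-based JPEG steganalysis works on DCT-coefficient statistics of a decompressed, cropped, and recompressed image; here a plain intensity histogram stands in for those features and a border crop stands in for the calibration step, so everything below is an illustrative assumption rather than the surveyed method:

```python
import numpy as np

def feature_vec(img):
    # stand-in global feature: normalized 16-bin intensity histogram
    h, _ = np.histogram(img, bins=16, range=(0, 256))
    return h / h.sum()

def calibrated(img):
    # crop a few border pixels to approximate the "calibrated" cover estimate
    return img[4:-4, 4:-4]

def steganalysis_feature(img):
    # L1 norm of the difference between the image's features
    # and those of its calibrated version
    return np.abs(feature_vec(img) - feature_vec(calibrated(img))).sum()

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64)).astype(np.int64)
# naive "embedding": flip the least-significant bit of roughly half the pixels
mask = (rng.random(cover.shape) < 0.5).astype(np.int64)
stego = cover ^ mask
f_cover = steganalysis_feature(cover)
f_stego = steganalysis_feature(stego)
```

A classifier trained on such features (for cover and stego examples) is what turns the calibration difference into a detector.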
Procedia PDF Downloads 216
28529 Pre-Service Teachers’ Opinions on Disabled People
Authors: Sinem Toraman, Aysun Öztuna Kaplan, Hatice Mertoğlu, Esra Macaroğlu Akgül
Abstract:
This study aims to examine pre-service teachers’ opinions on disabled people, taking into consideration various variables. The participants of the study are 170 pre-service teachers who are first-year students in different branches of the Education Departments of Yıldız Technical, Yeditepe, Marmara and Sakarya Universities. The data of the research were collected in the 2013-2014 fall term. This study was designed as a phenomenological study in line with the qualitative research paradigm. Pre-service teachers’ opinions about disabled people were examined; an open-ended question form prepared by the researchers and focus group interview techniques were used as data collection tools. The study presents pre-service teachers’ opinions about disabled people and offers suggestions for teacher education.
Keywords: pre-service teachers, disabled people, teacher education, teachers' opinions
Procedia PDF Downloads 458
28528 Video Summarization: Techniques and Applications
Authors: Zaynab El Khattabi, Youness Tabii, Abdelhamid Benkaddour
Abstract:
Nowadays, the huge volume of multimedia repositories makes the browsing, retrieval and delivery of video content very slow and even difficult tasks. Video summarization has been proposed to enable faster browsing of large video collections and more efficient content indexing and access. In this paper, we focus on approaches to video summarization. Video summaries can be generated in many different forms; however, the two fundamental ways to generate summaries are static and dynamic. We present different techniques for each mode from the literature and describe some features used for generating video summaries. We conclude with perspectives for further research.
Keywords: video summarization, static summarization, video skimming, semantic features
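A static summary in its simplest form selects keyframes. One common sketch, assuming a shot change shows up as a jump in the frame histogram (the threshold and the synthetic "video" below are illustrative assumptions):

```python
import numpy as np

def frame_histogram(frame, bins=8):
    # normalized intensity histogram of one frame (values assumed in [0, 1])
    h, _ = np.histogram(frame, bins=bins, range=(0.0, 1.0))
    return h / h.sum()

def static_summary(frames, threshold=0.2):
    """Keep a frame as a keyframe whenever its histogram differs from the
    last keyframe's histogram by more than `threshold` (L1 distance)."""
    keyframes = [0]
    last_hist = frame_histogram(frames[0])
    for i in range(1, len(frames)):
        h = frame_histogram(frames[i])
        if np.abs(h - last_hist).sum() > threshold:
            keyframes.append(i)
            last_hist = h
    return keyframes

# synthetic "video": 10 dark frames, then 10 bright frames (one shot change)
dark = [np.full((16, 16), 0.1) for _ in range(10)]
bright = [np.full((16, 16), 0.9) for _ in range(10)]
keys = static_summary(dark + bright)
```

Dynamic summarization (video skimming) would instead keep short sub-clips around such change points rather than single frames.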
Procedia PDF Downloads 401
28527 Inferring Human Mobility in India Using Machine Learning
Authors: Asra Yousuf, Ajaykumar Tannirkulum
Abstract:
Inferring rural-urban migration trends can help design effective policies that promote better urban planning and rural development. In this paper, we describe how machine learning algorithms can be applied to predict the internal migration decisions of people. We consider data collected from household surveys in Tamil Nadu to train our model. To measure the performance of the model, we use data on past migration from the National Sample Survey Organisation of India. The factors for training the model include socioeconomic characteristics of each individual, such as age, gender, place of residence, outstanding loans, strength of the household, etc., and his or her past migration history. We perform a comparative analysis of the performance of a number of machine learning algorithms to determine their prediction accuracy. Our results show that machine learning algorithms provide stronger prediction accuracy compared to statistical models. Our goal through this research is to propose the use of data science techniques in understanding human decisions and behaviour in developing countries.
Keywords: development, migration, internal migration, machine learning, prediction
Procedia PDF Downloads 271
28526 Prediction of Formation Pressure Using Artificial Intelligence Techniques
Authors: Abdulmalek Ahmed
Abstract:
Formation pressure is the main factor that affects the economics and efficiency of a drilling operation. Knowing the pore pressure and the parameters that affect it will help to reduce the cost of the drilling process. Many empirical models reported in the literature have been used to calculate the formation pressure based on different parameters. Some of these models used only drilling parameters to estimate pore pressure. Other models predicted the formation pressure based on log data. All of these models required different trends, such as normal or abnormal, to predict the pore pressure. Few researchers have applied artificial intelligence (AI) techniques to predict the formation pressure, and then with only one method or a maximum of two methods of AI. The objective of this research is to predict the pore pressure based on both drilling parameters and log data, namely: weight on bit, rotary speed, rate of penetration, mud weight, bulk density, porosity and delta sonic time. Real field data are used to predict the formation pressure using five different artificial intelligence (AI) methods: artificial neural networks (ANN), radial basis function (RBF), fuzzy logic (FL), support vector machine (SVM) and functional networks (FN). All AI tools were compared with different empirical models. The AI methods estimated the formation pressure with high accuracy (a high correlation coefficient and a low average absolute percentage error) and outperformed all previous models. The advantage of the new technique is its simplicity: it estimates pore pressure without the need for different trends, as compared to other models, which require two different trends (normal or abnormal pressure). Moreover, by comparing the AI tools with each other, the results indicate that SVM has the advantage in pore pressure prediction through its fast processing speed and high performance (a high correlation coefficient of 0.997 and a low average absolute percentage error of 0.14%).
In the end, a new empirical correlation for formation pressure was developed using the ANN method that can estimate pore pressure with high precision (correlation coefficient of 0.998 and average absolute percentage error of 0.17%).
Keywords: artificial intelligence (AI), formation pressure, artificial neural networks (ANN), fuzzy logic (FL), support vector machine (SVM), functional networks (FN), radial basis function (RBF)
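The two accuracy measures quoted above can be computed as follows; the pressure values are hypothetical and only illustrate the calculation:

```python
import math

def correlation_coefficient(actual, pred):
    """Pearson correlation coefficient between measured and predicted values."""
    n = len(actual)
    ma = sum(actual) / n
    mp = sum(pred) / n
    cov = sum((a - ma) * (p - mp) for a, p in zip(actual, pred))
    var_a = sum((a - ma) ** 2 for a in actual)
    var_p = sum((p - mp) ** 2 for p in pred)
    return cov / math.sqrt(var_a * var_p)

def aape(actual, pred):
    """Average absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, pred)) / len(actual)

actual = [5000.0, 5200.0, 5400.0, 5600.0]  # hypothetical measured pore pressures, psi
pred   = [5010.0, 5190.0, 5420.0, 5580.0]  # hypothetical model predictions
r = correlation_coefficient(actual, pred)
err = aape(actual, pred)
```

A model such as the one in the abstract would report r close to 1 and an AAPE well below 1% on these scales.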
Procedia PDF Downloads 149
28525 Ecotourism Sites in Central Visayas, Philippines: A Green Business Profile
Authors: Ivy Jumao-As, Randy Lupango, Clifford Villaflores, Marites Khanser
Abstract:
Alongside inadequate implementation of ecotourism standards and other pressing issues of sustainable development is the lack of business plans and formal business structures at various ecotourism sites in Central Visayas, Philippines, and other parts of the country. Addressing these issues plays a key role in boosting ecotourism, which is a sustainability tool for the country’s economic development. A three-phase research project was designed to investigate the green business practices of selected ecotourism sites in the region in order to propose a business model for ecotourism destinations in the region and beyond. This paper reports the initial phase of the study, which describes the profiles of the sites as well as the operators of the following selected destinations: Cebu City Protected Landscape and Olango Island Wildlife Bird Sanctuary in Cebu, and Rajah Sikatuna Protected Landscape in Bohol. Interviews, a self-administered questionnaire with key informants, and data mining were employed in the data collection. Findings highlighted similarities and differences in terms of ecotourism products, type and number of visitors, manpower composition, cultural and natural resources, complementary services and products, awards and accreditation, and peak and off-peak seasons, among others. Recommendations based on common issues initially identified in this study are also highlighted.
Keywords: ecotourism, ecotourism sites, green business, sustainability
Procedia PDF Downloads 272
28524 Improving Trainings of Mineral Processing Operators Through Gamification and Modelling and Simulation
Authors: Pedro A. S. Bergamo, Emilia S. Streng, Jan Rosenkranz, Yousef Ghorbani
Abstract:
Within the often-hazardous mineral industry, simulation training has rapidly gained appreciation as an important method of increasing site safety and productivity through enhanced operator skill and knowledge. Performance calculations related to froth flotation, one of the most important concentration methods, are probably the hardest topic taught during the training of plant operators. Currently, most trainings teach those skills by traditional methods such as slide presentations and hand-written exercises, with a heavy focus on memorization. To optimize certain aspects of these trainings, we developed “MinFloat”, which teaches the operating formulas of the froth flotation process with the help of gamification. The simulation core, based on a first-principles flotation model, was implemented in Unity3D, and an instructor tutoring system was developed which presents didactic content and reviews the selected answers. The game was tested by 25 professionals with extensive experience in the mining industry, based on a questionnaire formulated for training evaluations. According to their feedback, the game scored well in terms of quality, didactic efficacy, and inspiring character. The feedback of the testers on the main target audience and the outlook for the presented solution are discussed. This paper aims to provide technical background on the construction of educational games for the mining industry, besides showing how feedback from experts can be gathered more efficiently thanks to new technologies such as online forms.
Keywords: training evaluation, simulation-based training, modelling and simulation, froth flotation
Procedia PDF Downloads 113
28523 Spatial Data Science for Data Driven Urban Planning: The Youth Economic Discomfort Index for Rome
Authors: Iacopo Testi, Diego Pajarito, Nicoletta Roberto, Carmen Greco
Abstract:
Today, a consistent segment of the world’s population lives in urban areas, and this proportion will vastly increase in the coming decades. Therefore, understanding the key trends in urbanization likely to unfold over the coming years is crucial to the implementation of sustainable urban strategies. In parallel, the daily amount of digital data produced will expand at an exponential rate over the following years. The analysis of various types of data sets and their derived applications has incredible potential across crucial sectors such as healthcare, housing, transportation, energy, and education. Nevertheless, in city development, architects and urban planners appear to rely mostly on traditional and analogical techniques of data collection. This paper investigates the prospects of the data science field, which appears to be a formidable resource for assisting city managers in identifying strategies to enhance the social, economic, and environmental sustainability of our urban areas. The collection of new layers of information would definitely enhance planners’ capabilities to comprehend in more depth urban phenomena such as gentrification, land use definition, mobility, or critical infrastructural issues. Specifically, the research results correlate economic, commercial, demographic, and housing data with the purpose of defining the youth economic discomfort index. The statistical composite index provides insights regarding the economic disadvantage of citizens aged between 18 and 29 years, and the results clearly display that central urban zones are more disadvantaged than peripheral ones. The experimental setup selected the city of Rome as the testing ground for the whole investigation.
The methodology aims at applying statistical and spatial analysis to construct a composite index supporting informed, data-driven decisions for urban planning.
Keywords: data science, spatial analysis, composite index, Rome, urban planning, youth economic discomfort index
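A minimal sketch of how such a statistical composite index can be assembled, assuming z-score standardization of each indicator followed by a weighted sum; the indicator names, values, and weights below are hypothetical, not the study's actual data or aggregation method:

```python
import math

# Hypothetical indicators per urban zone (invented, not the study's data):
# [youth unemployment rate, rent-to-income ratio, NEET share]
zones = {
    "Zone A": [0.32, 0.45, 0.21],
    "Zone B": [0.18, 0.30, 0.12],
    "Zone C": [0.25, 0.38, 0.17],
}
weights = [0.4, 0.35, 0.25]  # assumed weights; any weighting scheme could be used

def zscores(values):
    # Standardise one indicator across all zones (population standard deviation).
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / sd for v in values]

# Standardise each indicator column, then take a weighted sum per zone.
columns = [zscores(list(col)) for col in zip(*zones.values())]
index = {zone: sum(w * columns[i][j] for i, w in enumerate(weights))
         for j, zone in enumerate(zones)}
print(index)  # higher index = greater youth economic discomfort
```

Standardizing before aggregation keeps indicators on different scales from dominating the composite.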
Procedia PDF Downloads 135
28522 Corrosion Protection of Steel 316 by Electrochemically Synthesized Conductive Poly (O-Toluidine)
Authors: H. Acar, M. Karakışla, L. Aksu, M. Saçak
Abstract:
The corrosion protection effect of poly(o-toluidine) (POT) coated on a steel 316 electrode was determined in corrosive media such as NaCl, H2SO4, and HCl using Tafel curves and electrochemical impedance spectroscopy techniques. The POT coatings were prepared by the cyclic voltammetry technique in an aqueous solution of oxalic acid, and they were characterized by FTIR and UV-visible absorption spectroscopy. The Tafel curves revealed that the POT coating provides the most effective protection, compared to the bare steel 316 electrode, in NaCl as the corrosive medium. The results were evaluated based on the decrease of the corrosion current and the shift to positive potentials with the increasing number of scans. Electrochemical impedance spectroscopy measurements were found to support the Tafel data for the POT coating.
Keywords: corrosion, impedance spectroscopy, steel 316, poly(o-toluidine)
Procedia PDF Downloads 319
28521 A Survey on Types of Noises and De-Noising Techniques
Authors: Amandeep Kaur
Abstract:
Digital image processing is a fundamental tool for performing various operations on digital images for pattern recognition, noise removal, and feature extraction. In this paper, noise removal techniques are described for various types of noises. The paper discusses the various noises that appear in an image due to different environmental and accidental factors. Various de-noising approaches are discussed that utilize different wavelets and filters for de-noising. By analyzing various papers on image de-noising, we conclude that wavelet-based de-noising approaches are more effective than the others.
Keywords: de-noising techniques, edges, image, image processing
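As a hedged sketch of the wavelet-based de-noising idea discussed above, the following applies a one-level Haar transform with soft thresholding of the detail coefficients; it is a deliberately minimal stand-in for the richer wavelet families a full survey covers:

```python
import math

def haar_denoise(signal, threshold):
    # One-level Haar wavelet decomposition (even-length signal assumed),
    # soft-thresholding of detail coefficients, then reconstruction.
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    # Soft threshold: shrink small details toward zero, keep their sign.
    soft = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, soft):
        out.extend([(a + d) / s, (a - d) / s])
    return out

noisy = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]  # step signal with small noise
print(haar_denoise(noisy, 0.2))
```

With a zero threshold the transform reconstructs the input exactly; a positive threshold suppresses small fluctuations while preserving the large step.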
Procedia PDF Downloads 336
28520 Computational Chemical-Composition of Carbohydrates in the Context of Healthcare Informatics
Authors: S. Chandrasekaran, S. Nandita, M. Shivathmika, Srikrishnan Shivakumar
Abstract:
The objective of this research work is to analyze the computational chemical composition of carbohydrates in the context of healthcare informatics. The computation involves representing the complex molecular structure of carbohydrates using graph theory and a deployable Chemical Markup Language (CML). The parallel molecular structure of chemical molecules, with or without adulterants added for the sake of business profit, can be analyzed in terms of robustness and derivatization measures. Rural healthcare programs should create awareness of malnutrition to reduce the ill effects of decomposition and help consumers know the level of such energy-storage mixtures in a quantitative way. Earlier works were based on empirical wet-lab data, which can vary from time to time and whose mining results cannot be reused. This work is carried out on the quantitative computational chemistry of carbohydrates to support a safe and secure right-to-food act and its regulations.
Keywords: carbohydrates, chemical-composition, chemical markup, robustness, food safety
Procedia PDF Downloads 374
28519 An Approach for Association Rules Ranking
Authors: Rihab Idoudi, Karim Saheb Ettabaa, Basel Solaiman, Kamel Hamrouni
Abstract:
Medical association rule induction is used to discover useful correlations between pertinent concepts in large medical databases. Nevertheless, association rule (AR) algorithms produce a huge number of rules and do not guarantee the usefulness and interestingness of the generated knowledge. To overcome this drawback, we propose an ontology-based interestingness measure for AR ranking. According to domain experts, the goal of using ARs is to discover implicit relationships between items of different categories, such as ‘clinical features and disorders’ or ‘clinical features and radiological observations’; that is to say, itemsets composed of ‘similar’ items are uninteresting. Therefore, the dissimilarity between a rule’s items can be used to judge the interestingness of association rules: the more different the items, the more interesting the rule. In this paper, we design a distinct approach for ranking semantically interesting association rules involving the use of an ontology knowledge mining approach. The basic idea is to organize the ontology’s concepts into a hierarchical structure of conceptual clusters of targeted subjects, where each cluster encapsulates ‘similar’ concepts suggesting a specific category of the domain knowledge. The interestingness of an association rule is then defined as the dissimilarity between the corresponding clusters: the further apart the clusters of the rule’s items, the more interesting the rule. We apply the method in our domain of interest, the mammographic domain, using an existing mammographic ontology called Mammo, with the goal of deriving interesting rules from past experiences and discovering implicit relationships between the concepts modeling the domain.
Keywords: association rule, conceptual clusters, interestingness measures, ontology knowledge mining, ranking
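A minimal sketch of the cluster-dissimilarity idea, assuming a toy concept-to-cluster mapping and a Jaccard-style dissimilarity between the clusters covered by a rule's two sides; the concept and cluster names are illustrative, not taken from the actual Mammo ontology:

```python
# Toy concept-to-cluster mapping (illustrative names only).
cluster_of = {
    "mass": "clinical_features",
    "pain": "clinical_features",
    "calcification": "radiological_observations",
    "carcinoma": "disorders",
}

def interestingness(antecedent, consequent):
    # Jaccard-style dissimilarity between the cluster sets of the rule's
    # two sides: 0 = same clusters, 1 = fully disjoint clusters.
    a = {cluster_of[item] for item in antecedent}
    c = {cluster_of[item] for item in consequent}
    return 1.0 - len(a & c) / len(a | c)

rules = [({"mass"}, {"pain"}),        # same cluster -> uninteresting
         ({"mass"}, {"carcinoma"})]   # different clusters -> interesting
ranked = sorted(rules, key=lambda r: interestingness(*r), reverse=True)
print(ranked)
```

A real implementation would replace the flat mapping with a distance computed over the ontology's cluster hierarchy, but the ranking principle is the same.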
Procedia PDF Downloads 322
28518 Comparison of Different Data Acquisition Techniques for Shape Optimization Problems
Authors: Attila Vámosi, Tamás Mankovits, Dávid Huri, Imre Kocsis, Tamás Szabó
Abstract:
Non-linear FEM calculations are indispensable when important technical information, such as the operating performance of a rubber component, is desired. Rubber bumpers built into air-spring structures may undergo large deformations under load, which in itself shows non-linear behavior. The changing contact range between the parts and the incompressibility of the rubber increase this non-linear behavior further. The material characterization of an elastomeric component is also a demanding engineering task. The shape optimization problem of rubber parts led to the study of FEM-based calculation processes. This type of problem has been posed and investigated by several authors. In this paper, the time demand of certain calculation methods is studied and the possibilities for time reduction are presented.
Keywords: rubber bumper, data acquisition, finite element analysis, support vector regression
Procedia PDF Downloads 471
28517 From Sampling to Sustainable Phosphate Recovery from Mine Waste Rock Piles
Authors: Hicham Amar, Mustapha El Ghorfi, Yassine Taha, Abdellatif Elghali, Rachid Hakkou, Mostafa Benzaazoua
Abstract:
Phosphate mine waste rock (PMWR) generated during ore extraction is continuously increasing, resulting in a significant environmental footprint. The main objectives of this study are i) elaboration of a sampling strategy for the PMWR piles, ii) mineralogical and chemical characterization of the PMWR piles, and iii) creation of a 3D block model to evaluate the potential valorization of the existing PMWR. Destructive drilling using reverse circulation from 13 drill holes was used to collect samples for chemical (X-ray fluorescence) and mineralogical assays. The 3D block model was created from the data set, including the chemical data of the realized drill holes, using Datamine RM software. Optical microscopy observations showed that the sandy phosphate from the drill holes in the PMWR piles is characterized by an abundance of carbonate fluorapatite with the presence of calcite, dolomite, and quartz. The mean grade of the composite samples was around 19.5±2.7% P₂O₅. The mean P₂O₅ grade exhibited an increasing tendency along the depth profile from the bottom to the top of the PMWR piles. The 3D block model generated with the chemical data confirmed this tendency in the mean grades and may allow selective extraction according to %P₂O₅. The 3D block model of the P₂O₅ grade is an efficient sampling approach that confirmed the variation of the P₂O₅ grade. This integrated approach to PMWR management will be a helpful decision-making tool for recovering the residual phosphate, adopting circular economy and sustainability principles in the phosphate mining industry.
Keywords: 3D modelling, reverse circulation drilling, circular economy, phosphate mine waste rock, sampling
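A hedged sketch of how a grade block model supports bench-wise mean grades and cutoff-based selective extraction; the block positions and %P₂O₅ values below are invented for illustration and are not the study's Datamine model:

```python
# Illustrative block model: (x, y, z-bench) -> P2O5 grade in %.
# Grades rise with z to mimic the bottom-to-top tendency described above.
blocks = {
    (0, 0, 0): 16.8, (1, 0, 0): 17.4,   # bottom bench
    (0, 0, 1): 19.2, (1, 0, 1): 19.9,   # middle bench
    (0, 0, 2): 21.5, (1, 0, 2): 22.3,   # top bench
}

def mean_grade_by_bench(blocks):
    # Average the block grades on each z-bench.
    benches = {}
    for (x, y, z), grade in blocks.items():
        benches.setdefault(z, []).append(grade)
    return {z: sum(g) / len(g) for z, g in benches.items()}

def selectable(blocks, cutoff):
    # Blocks eligible for selective extraction at a given %P2O5 cutoff.
    return [pos for pos, grade in blocks.items() if grade >= cutoff]

print(mean_grade_by_bench(blocks))
print(selectable(blocks, 19.5))
```

Grouping blocks by bench and filtering by cutoff is the elementary operation behind the selective-extraction decision the model enables.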
Procedia PDF Downloads 78
28516 Supply Chain Risk Management: A Meta-Study of Empirical Research
Authors: Shoufeng Cao, Kim Bryceson, Damian Hine
Abstract:
The existing supply chain risk management (SCRM) research is currently chaotic and somewhat disorganized, and the topic has been addressed conceptually more often than empirically. This paper, using both qualitative and quantitative data, employs a modified meta-study method to investigate the SCRM empirical research published in quality journals over a period of 12 years (2004-2015). The purpose is to outline the extant research trends and the employed research methodologies (i.e., research method, data collection, and data analysis) across the sub-field, which will guide future research. The synthesized findings indicate that empirical study of the risk ripple effect along an entire supply chain, industry-specific supply chain risk management, and global/export supply chain risk management has not yet received as much attention as it deserves in the SCRM field. Besides, it is suggested that future empirical research should employ multiple and/or mixed methods and multi-source data collection techniques to reduce common-method bias and single-source bias, thus improving research validity and reliability. In conclusion, this paper helps to stimulate more quality empirical research in the SCRM field by identifying promising research directions and providing some methodology guidelines.
Keywords: empirical research, meta-study, methodology guideline, research direction, supply chain risk management
Procedia PDF Downloads 317
28515 Power Quality Modeling Using Recognition Learning Methods for Waveform Disturbances
Authors: Sang-Keun Moon, Hong-Rok Lim, Jin-O Kim
Abstract:
This paper presents Power Quality (PQ) modeling and filtering processes for distribution system disturbances using recognition learning methods. Typical PQ waveforms with mathematical applications and gathered field data are applied to the proposed models. The objective of this paper is to analyze PQ data with respect to monitoring, discriminating, and evaluating the waveforms of power disturbances, in order to support preventative protection against system failures and the estimation of complex system problems. The examined signal filtering techniques are used for field waveform noise removal and feature extraction. Using extraction and learning classification techniques, the efficiency of the recognition of PQ disturbances was verified, with a focus on interactive modeling methods. The waveforms of eight selected disturbances are modeled with randomized parameters within the IEEE 1159 PQ ranges. The ranges, parameters, and weights are updated according to the field waveforms obtained. Along with voltages, currents undergo the same process to obtain waveform features, apart from some of the ratings and filters. Changing loads cause distortion in the voltage waveform due to the drawing of different patterns of current variation. In conclusion, PQ disturbances in the voltage and current waveforms indicate different types of patterns of variation and disturbance, and a modified technique based on symmetrical components in the time domain is proposed in this paper for PQ disturbance detection and subsequent classification. Our method is based on the fact that waveforms obtained from the suggested trigger conditions contain potential information for abnormality detection. The extracted features are sequentially applied to estimation and recognition learning modules for further studies.
Keywords: power quality recognition, PQ modeling, waveform feature extraction, disturbance trigger condition, PQ signal filtering
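As a hedged sketch of what a waveform trigger condition can look like, the following flags voltage sags via a per-cycle RMS check; this is a simplified illustration, not the paper's modified symmetrical-components technique, and the 0.9 p.u. threshold is only an assumed sag boundary:

```python
import math

def rms(window):
    # Root-mean-square value of one window of samples.
    return math.sqrt(sum(v * v for v in window) / len(window))

def detect_sag(samples, samples_per_cycle, nominal_rms, threshold=0.9):
    # Per-cycle RMS trigger: flag cycles whose RMS drops below
    # threshold * nominal (an illustrative trigger condition).
    flags = []
    for i in range(0, len(samples) - samples_per_cycle + 1, samples_per_cycle):
        flags.append(rms(samples[i:i + samples_per_cycle]) < threshold * nominal_rms)
    return flags

n = 32  # samples per cycle
normal = [230.0 * math.sqrt(2) * math.sin(2 * math.pi * k / n) for k in range(n)]
sagged = [0.5 * v for v in normal]  # 50% voltage sag for one cycle
print(detect_sag(normal + sagged + normal, n, 230.0))
```

Cycles flagged by such a trigger are the ones whose extracted features would then be passed to the classification stage.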
Procedia PDF Downloads 186