Search results for: WEKA data mining tool
27734 A Similarity Measure for Classification and Clustering in Image Based Medical and Text Based Banking Applications
Authors: K. P. Sandesh, M. H. Suman
Abstract:
Text processing plays an important role in information retrieval, data mining, and web search. Measuring the similarity between documents is an important operation in the text processing field. In this project, a new similarity measure is proposed. To compute the similarity between two documents with respect to a feature, the proposed measure takes the following three cases into account: (1) the feature appears in both documents; (2) the feature appears in only one document; and (3) the feature appears in neither document. The proposed measure is extended to gauge the similarity between two sets of documents. The effectiveness of our measure is evaluated on several real-world data sets for text classification and clustering problems, especially in the banking and health sectors. The results show that the performance obtained by the proposed measure is better than that achieved by the other measures.
Keywords: document classification, document clustering, entropy, accuracy, classifiers, clustering algorithms
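The three-case scheme described in the abstract can be sketched in a few lines. The per-case weights below are illustrative assumptions (the abstract does not give the paper's actual formula): shared presence counts fully, presence in only one document counts as dissimilarity, and joint absence contributes a reduced weight.

```python
def feature_similarity(doc_a, doc_b, vocabulary, absent_weight=0.5):
    """Similarity of two documents (sets of terms) over a shared vocabulary,
    scoring the three cases per feature named in the abstract.
    The weights are illustrative assumptions, not the paper's formula."""
    score = 0.0
    for feature in vocabulary:
        in_a, in_b = feature in doc_a, feature in doc_b
        if in_a and in_b:      # case 1: feature appears in both documents
            score += 1.0
        elif in_a or in_b:     # case 2: feature appears in only one document
            score += 0.0
        else:                  # case 3: feature appears in neither document
            score += absent_weight
    return score / len(vocabulary)
```

With documents represented as term sets, identical documents score highest and documents sharing no vocabulary terms score lowest, while joint absences still contribute some evidence of similarity.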
Procedia PDF Downloads 518
27733 The Development of a Cyber Violence Measurement Tool for Youths: A Multi-Reporting of Ecological Factors
Authors: Jong-Hyo Park, Eunyoung Choi, Jae-Yeon Lim, Seon-Suk Lee, Yeong-Rong Koo, Ji-Ung Kwon, Kyung-Sung Kim, Jong-Ik Lee, Juhan Park, Hyun-Kyu Lee, Won-Kyoung Oh, Jisang Lee, Jiwon Choe
Abstract:
Due to COVID-19, cyber violence among youths has soared as they spend more time online than before. Despite these deepening concerns, measurement tools that can assess individual youths' vulnerability to cyber violence remain in need of improvement: existing tools give insufficient consideration to the various factors related to cyber violence among youths. Most are self-report questionnaires, and adolescents' self-reports can underestimate harmful behavior and overestimate the experience of harm. Therefore, this study aims to develop a multi-report measurement tool for youths that can reliably measure the ecological factors related to cyber violence in individuals. A literature review explored factors related to cyber violence, and questions were constructed. The face validity of the questions was confirmed through focus group interviews. Exploratory and confirmatory factor analyses (N=671) were also conducted for statistical validation. The resulting multi-report measurement tool for cyber violence has 161 questions in six domains: online behavior, cyber violence awareness, victimization-perpetration-witness experience, coping efficacy (individuals, peers, teachers, and parents), psychological characteristics, and pro-social capabilities. In addition to a youth respondent's self-report, the tool includes reports on the respondent from peers, teachers, and parents. It makes it possible to reliably measure the ecological factors of individual youths who are vulnerable or highly resistant to cyber violence. In schools, teachers could refer to the measurement results for guiding students, better understanding their cyber violence conditions, and assessing their pro-social capabilities. With the measurement results, teachers and police officers could detect perpetrators or victims and intervene immediately.
In addition, this measurement tool could be used to analyze the effects of prevention and intervention programs for cyber violence and to draw appropriate suggestions.
Keywords: adolescents, cyber violence, cyber violence measurement tool, measurement tool, multi-report measurement tool, youths
Procedia PDF Downloads 101
27732 Data Analysis for Taxonomy Prediction and Annotation of 16S rRNA Gene Sequences from Metagenome Data
Authors: Suchithra V., Shreedhanya, Kavya Menon, Vidya Niranjan
Abstract:
Skin metagenomics has a wide range of applications with direct relevance to the health of the organism. It gives us insight into the diverse community of microorganisms (the microbiome) harbored on the skin. In recent years, it has become increasingly apparent that the interaction between the skin microbiome and the human body plays a prominent role in immune system development, cancer development, disease pathology, and many other biological processes. Next Generation Sequencing has led to a faster and better understanding of environmental organisms and their mutual interactions. This project studies the human skin microbiome of different individuals with varied skin conditions. Bacterial 16S rRNA data of the skin microbiome are downloaded with the SRA Toolkit provided by NCBI to perform metagenomics analysis. Twelve samples are selected, with two controls and three different categories: sex (male/female), skin type (moist/intermittently moist/sebaceous), and occlusion (occluded/intermittently occluded/exposed). Data quality is improved using Cutadapt and assessed using FastQC. USearch, a tool for analyzing NGS data, provides a suitable platform to obtain taxonomy classification and bacterial abundance from the metagenome data. The statistical tool used for analyzing the USearch results is METAGENassist. The results revealed that the three most abundant organisms were Prevotella, Corynebacterium, and Anaerococcus. Prevotella is known to be an infectious bacterium found in wounds, tooth cavities, etc. Corynebacterium and Anaerococcus are opportunistic bacteria responsible for skin odor. This result suggests that Prevotella thrives easily in sebaceous skin conditions. For sebaceous skin types it is therefore better to treat wounds with intermittently occluded treatments such as ointments and creams. Exposing the wound should be avoided, as it leads to an increase in Prevotella abundance.
Individuals with moist skin types can opt for occluded or intermittently occluded treatment, as these have been shown to decrease bacterial abundance during treatment.
Keywords: bacterial 16S rRNA, next generation sequencing, skin metagenomics, skin microbiome, taxonomy
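The abundance ranking reported above (Prevotella, Corynebacterium, Anaerococcus) reduces to tallying per-read taxonomy assignments into relative abundances. A minimal sketch, assuming the upstream pipeline (USearch or similar) has already assigned a genus to each read; the input format is an assumption for illustration:

```python
from collections import Counter

def relative_abundance(assignments):
    """Tally per-read genus assignments into relative abundances (fractions)."""
    counts = Counter(assignments)
    total = sum(counts.values())
    return {genus: n / total for genus, n in counts.items()}

def top_taxa(assignments, k=3):
    """Return the k most abundant genera, mirroring the top-three ranking above."""
    abundance = relative_abundance(assignments)
    return sorted(abundance, key=abundance.get, reverse=True)[:k]
```

For a real sample the assignments list would come from the classifier's per-read output table rather than being built by hand.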
Procedia PDF Downloads 172
27731 The Impact of Access to Microcredit Programme on Women Empowerment: A Case Study of Cowries Microfinance Bank in Lagos State, Nigeria
Authors: Adijat Olubukola Olateju
Abstract:
Women empowerment is an essential developmental tool in every economy, especially in less developed countries, as it helps to enhance women's socio-economic well-being. Some empirical evidence has shown that microcredit is an effective tool for enhancing women empowerment, especially in developing countries. This paper, therefore, investigates the impact of a microcredit programme on women empowerment in Lagos State, Nigeria. The study used Cowries Microfinance Bank (CMB) as a case study bank, and a total of 359 women entrepreneurs were selected by simple random sampling from the bank's list. Selection bias, which could arise from non-random selection of participants or non-random placement of the programme, was adjusted for by dividing the data into participant and non-participant women entrepreneurs. The data were analyzed with a Propensity Score Matching (PSM) technique. The Average Treatment Effect on the Treated (ATT) obtained from the PSM indicates that the credit programme has a significant effect on the empowerment of women in the study area. It is, therefore, recommended that microfinance banks be encouraged to lend to women and that, for the loan to have more impact on beneficiaries, the loan programme be complemented with other measures such as training, grants, and periodic monitoring of the programme.
Keywords: empowerment, microcredit, socio-economic wellbeing, development
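The ATT computation described above can be sketched with 1:1 nearest-neighbour matching on propensity scores. This is a toy illustration, not the study's estimator: the propensity scores are assumed to come from an upstream model (e.g., a logistic regression of participation on covariates, not shown), and each unit is reduced to a (score, outcome) pair.

```python
def att_nearest_neighbour(treated, control):
    """Average Treatment effect on the Treated (ATT) via 1:1 nearest-neighbour
    matching on precomputed propensity scores.
    `treated` and `control` are lists of (propensity_score, outcome) pairs."""
    diffs = []
    for score_t, outcome_t in treated:
        # match each treated unit to the control unit with the closest score
        _, outcome_c = min(control, key=lambda c: abs(c[0] - score_t))
        diffs.append(outcome_t - outcome_c)
    return sum(diffs) / len(diffs)
```

Matching with replacement, as here, lets several treated units share a control; production analyses typically also enforce a caliper and check covariate balance.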
Procedia PDF Downloads 304
27730 Technological Tool-Use as an Online Learner Strategy in a Synchronous Speaking Task
Authors: J. Knight, E. Barberà
Abstract:
Language learning strategies have been defined as thoughts and actions, consciously chosen and operationalized by language learners, to help them carry out a multiplicity of tasks from the very outset of learning to the most advanced levels of target language performance. While research in the field of Second Language Acquisition has focused on 'good' language learners and the effectiveness of strategy use and orchestration by effective learners in face-to-face classrooms, much less research has attended to learner strategies in online contexts, particularly strategies relating to the technological tools that can be part of a task design. In addition, much research on learner strategies has focused on cognitive, attitudinal, and metacognitive behaviour, with less attention to the social aspect of strategies. This study focuses on how learners mediate with a technological tool designed to support synchronous spoken interaction and how this shapes their spoken interaction in the openings of their talk. A case study approach is used, incorporating notions from communities of practice theory, to analyse and understand the learner strategies of dyads carrying out a role play task. The study employs analysis of transcripts of spoken interaction in the openings of the talk along with log files of tool use, and draws on results of previous studies pertaining to the same tool as a form of triangulation. Findings show how learners gain pre-task planning time through control of the technological tool. The strategies involved in learners' choices to enter and exit the tool shape their spoken interaction qualitatively, with some cases demonstrating long silences while others appear to start the pedagogical task immediately. Who or what learners orientate to in the opening moments of the talk, whether an audience (i.e., the teacher), each other, and/or screen-based signifiers, also becomes a focus.
The study highlights how tool use as a social practice should be considered a learning strategy in online contexts, whereby different usages may be understood in the light of the more usual asynchronous social practices of the online community. The teacher's role in the community as the evaluator of its practices is also problematised. The results are pertinent for the design of synchronous speaking tasks. The use of community of practice theory supports an understanding of strategy use that involves both metacognition and social context, revealing how tool-use strategies may need to be orally (socially) negotiated by learners and may differ from one online language community to another.
Keywords: learner strategy, tool use, community of practice, speaking task
Procedia PDF Downloads 341
27729 The Effect of Paper Based Concept Mapping on Students' Academic Achievement and Attitude in Science Education
Authors: Orhan Akınoğlu, Arif Çömek, Ersin Elmacı, Tuğba Gündoğdu
Abstract:
The concept map is known to be a powerful tool for organizing the ideas and concepts in an individual's mind. It is a kind of visual map that illustrates the relationships between the concepts of a certain subject. The effect of concept mapping on cognitive and affective qualities has been a research topic among educational researchers for decades, and educators want to utilize it both as an instructional tool and as an assessment tool in classes. For that reason, this study aimed to determine the effect of concept mapping as a learning strategy in science classes on students' academic achievement and attitude. The research employed a randomized pre-test post-test control group design. Data were collected from 60 sixth-grade students in a randomly selected primary school in Turkey. The school's sixth-grade classes were analyzed according to students' academic achievement, science attitude, gender, mathematics and science course grades, and GPAs before the implementation. Two of the classes were found to be equivalent (t=0.983, p>0.05); one was randomly assigned as the experimental group and the other as the control group. During a 5-week period, the experimental group students (N=30) used the paper-based concept mapping method while the control group students (N=30) were taught with the traditional approach, following the science and technology education curriculum for the light and sound subject. Both groups were taught by the same teacher, who is experienced in using concept mapping in science classes. Before the implementation, the teacher spent two hours explaining the theory of concept maps to the experimental group students and showing them how to create paper-based concept maps individually. For the following two hours, she asked them to create concept maps on their former science subjects and gave them feedback by reviewing their maps, to be sure they could create them during the implementation.
The data were collected with a science achievement test, a science attitude scale, and a personal information form. The achievement test and attitude scale were administered as pre-test and post-test, while the personal information form was administered only once. The reliability coefficient of the achievement test was KR-20=0.76, and Cronbach's alpha for the attitude scale was 0.89. SPSS statistical software was used to analyze the data. According to the results, there was a statistically significant difference between the experimental and control groups for academic achievement but not for attitude. The experimental group had significantly greater gains on the academic achievement test than the control group (t=0.02, p<0.05). The findings showed that paper-and-pencil concept mapping can be an effective method for improving students' academic achievement in science classes. The results have implications for further research.
Keywords: concept mapping, science education, constructivism, academic achievement, science attitude
Procedia PDF Downloads 408
27728 Shopping Tourism for Emerging Markets: Examining Shopping Tourism in the UK as an Attraction Tool for Wealthy Tourists
Authors: Ali Abdallah, Shaima Al Mohannadi
Abstract:
This study explores shopping tourism in the UK and examines it as a tool for attracting wealthy tourists to the UK's capital city, London. The study aims to identify the scope of shopping tourism used by countries such as the UK as a tool for attracting wealthy tourists. It adopts a quantitative research approach, using surveys to attain the required results. The results demonstrate that the UK tourism market is experience-based and has recently become an attraction for luxury brand shoppers. The term Trexit is introduced for a new form of tourism generated by Brexit; if addressed appropriately, Trexit could help offset any negative economic repercussions of Brexit. The study concludes that shopping tourism is set to grow further in the years to come; however, government support and cooperative planning with the retail industry are required to further strengthen this developing sector.
Keywords: Brexit tourism, luxury shopping, UK tourism, wealthy tourists
Procedia PDF Downloads 163
27727 Productivity and Structural Design of Manufacturing Systems
Authors: Ryspek Usubamatov, Tan San Chin, Sarken Kapaeva
Abstract:
The productivity of manufacturing systems depends on the technological processes, the technical data of the machines, and the structure of the systems. Technology is represented by the machining modes and data; the technical data comprise reliability parameters and auxiliary times for discrete production processes. The structure of a manufacturing system includes the number of serial and parallel production machines and the links between them, and structures depend on the complexity of the technological processes. Mathematical models of the productivity rate for manufacturing systems are important attributes that make it possible to define the best structure by the criterion of productivity rate. These models are an important tool in evaluating the economic efficiency of production systems.
Keywords: productivity, structure, manufacturing systems, structural design
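The kind of productivity-rate model described above can be sketched with a simplified textbook-style formula; this is an illustrative assumption, not the paper's actual model. Machining work is split across serial stations, auxiliary time is fixed per part, and reliability enters through availability, with any station failure stopping the whole serial line.

```python
def productivity_rate(machining_time, aux_time, stations, mtbf, mttr):
    """Productivity rate (parts per time unit) for a serial line.
    Simplified sketch: machining time is divided across `stations`,
    auxiliary time is per part, and availability = MTBF / (MTBF + MTTR)
    with line MTBF shrinking in proportion to the station count,
    since a failure at any serial station stops the line."""
    cycle_time = machining_time / stations + aux_time
    line_mtbf = mtbf / stations
    availability = line_mtbf / (line_mtbf + mttr)
    return availability / cycle_time
```

Comparing the rate over the station count shows the structural trade-off the abstract refers to: more serial stations shorten the cycle but reduce line availability.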
Procedia PDF Downloads 584
27726 A Heart Arrhythmia Prediction Using Machine Learning's Classification Approach and the Concept of Data Mining
Authors: Roshani S. Golhar, Neerajkumar S. Sathawane, Snehal Dongre
Abstract:
Background and objectives: Cardiovascular illnesses are increasing and have become a leading cause of mortality worldwide, killing a great many people each year. Arrhythmia is a type of cardiac illness characterized by a change in the regularity of the heartbeat. The goal of this study is to develop novel deep learning algorithms for successfully interpreting arrhythmia from a single one-second segment. Because the ECG signal reflects unique electrical heart activity over time, considerable changes between time intervals are detected. Such variance, together with the limited amount of learning data available for each arrhythmia, makes standard learning methods difficult and hinders their generalization. Conclusions: The proposed method was able to outperform several state-of-the-art methods. The proposed technique is also an effective and convenient deep learning approach to heartbeat interpretation that could be used in real-time healthcare monitoring systems.
Keywords: electrocardiogram, ECG classification, neural networks, convolutional neural networks, portable document format
Procedia PDF Downloads 69
27725 A Novel Probabilistic Spatial Locality of Reference Technique for Automatic Cleansing of Digital Maps
Authors: A. Abdullah, S. Abushalmat, A. Bakshwain, A. Basuhail, A. Aslam
Abstract:
GIS (Geographic Information System) applications require geo-referenced data; these data may be available as databases or in the form of digital or hard-copy agro-meteorological maps. Such parameter maps are color-coded, with different regions corresponding to different parameter values, and converting them into a database is not very difficult. However, the text and various planimetric elements overlaid on these maps make an accurate image-to-database conversion a challenging problem, because it is almost impossible to exactly restore what lay underneath the text or icons; hence the need for inpainting. In this paper, we propose a probabilistic inpainting approach that uses the probability of spatial locality of colors in the map to replace overlaid elements with the underlying color. We tested the limits of our proposed technique using non-textual simulated data and compared text-removal results with a popular image editing tool using public domain data, with promising results.
Keywords: noise, image, GIS, digital map, inpainting
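The spatial-locality idea can be illustrated with a toy inpainting pass: a masked (overlaid) pixel is replaced by the most frequent colour among its unmasked 4-neighbours, on the assumption that nearby pixels of a colour-coded parameter map very likely share a colour. This majority-vote rule is a simplified stand-in for the paper's probabilistic model.

```python
from collections import Counter

def inpaint(grid, mask):
    """Replace masked pixels (coords in `mask`) with the most frequent colour
    among their unmasked 4-neighbours. `grid` is a list of rows of colour
    labels; pixels with no unmasked neighbour are left unchanged."""
    h, w = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for y in range(h):
        for x in range(w):
            if (y, x) in mask:
                neighbours = [grid[ny][nx]
                              for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                              if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in mask]
                if neighbours:
                    out[y][x] = Counter(neighbours).most_common(1)[0][0]
    return out
```

A full implementation would iterate until large overlaid regions are filled from their borders inward, rather than making the single pass shown here.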
Procedia PDF Downloads 352
27724 Literature Review on Text Comparison Techniques: Analysis of Text Extraction, Main Comparison and Visual Representation Tools
Authors: Andriana Mkrtchyan, Vahe Khlghatyan
Abstract:
The choice of a profession is one of the most important decisions people make in their lives. With the development of modern science, technologies, and all the spheres of the modern world, more and more professions are arising, which complicates the choice even further. Hence, there is a need for a guiding platform to help people choose a profession and the right career path based on their interests, skills, and personality. This review analyzes existing methods of comparing PDF documents and suggests a 3-stage approach to the comparison: 1. text extraction from PDF documents; 2. comparison of the extracted text via NLP algorithms; 3. representation of the comparison using a special shape and color psychology methodology.
Keywords: color psychology, data acquisition/extraction, data augmentation, disambiguation, natural language processing, outlier detection, semantic similarity, text-mining, user evaluation, visual search
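Stage 2 of the approach above, comparing extracted text, can be sketched with cosine similarity over bag-of-words vectors, one of the simplest NLP comparison baselines. Stage 1 (PDF extraction) and stage 3 (visual representation) are out of scope here, and the tokenizer is a deliberately crude assumption.

```python
import math
import re
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity of bag-of-words vectors built from two texts.
    Tokenization is a crude lowercase word split, for illustration only."""
    tokenize = lambda t: Counter(re.findall(r"[a-z]+", t.lower()))
    a, b = tokenize(text_a), tokenize(text_b)
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0
```

Identical texts score 1.0 and texts sharing no words score 0.0; real systems would add stemming, stop-word removal, and TF-IDF weighting on top of this skeleton.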
Procedia PDF Downloads 76
27723 The Implementation of the Multi-Agent Classification System (MACS) in Compliance with FIPA Specifications
Authors: Mohamed R. Mhereeg
Abstract:
The paper discusses the implementation of the Multi-Agent Classification System (MACS), utilizing it to provide automated and accurate classification of end users developing applications in the spreadsheet domain. Several technologies were brought together to build MACS. The strength of the system is the integration of agent technology and the FIPA specifications with other technologies: .NET Windows-service-based agents, Windows Communication Foundation (WCF) services, Service Oriented Architecture (SOA), and Oracle Data Mining (ODM). Microsoft's .NET Windows-service-based agents were utilized to develop the monitoring agents of MACS; the .NET WCF services, together with the SOA approach, allowed distribution of and communication between agents over the WWW. The Monitoring Agents (MAs) were configured to execute automatically to monitor Excel spreadsheet development activities by content. Data gathered by the Monitoring Agents from various resources over a period of time is collected and filtered by a Database Updater Agent (DUA) residing in the system's .NET client application. This agent then transfers and stores the data in an Oracle server database via Oracle stored procedures for further processing, leading to the classification of the end-user developers.
Keywords: MACS, implementation, multi-agent, SOA, autonomous, WCF
Procedia PDF Downloads 274
27722 Implementation of Data Science in Field of Homologation
Authors: Shubham Bhonde, Nekzad Doctor, Shashwat Gawande
Abstract:
For the use and import of keys and ID transmitters, as well as body control modules with radio transmission, homologation is required in many countries. The final deliverables of product homologation are certificates. In the world of homologation, there are approximately 200 certificates per product, most of them in local languages. It is challenging to manually investigate each certificate and extract relevant data, such as the expiry date and the approval date. Accuracy is critical, as inaccurate data may lead to missed re-homologation of certificates and thus to non-compliance. There is scope for automating the reading of certificate data in the field of homologation, and we use deep learning as the tool for this automation. We first trained a model, providing every country's basic data and feeding in PDF and JPG files through an ETL process; the model is trained only once and gives increasingly accurate results thereafter. As an outcome, we obtain the expiry date and approval date of a certificate with a single click. This will help implement automation features more broadly in the database where certificates are stored and will reduce human error to an almost negligible level.
Keywords: homologation, re-homologation, data science, deep learning, machine learning, ETL (extract transform loading)
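The extraction target described above, pulling approval and expiry dates out of certificate text, can be sketched with a rule-based regex pass. This is a stand-in for the trained model in the abstract; the field labels and ISO date format are assumptions, since real certificates vary by country and language.

```python
import re
from datetime import date

def extract_dates(certificate_text):
    """Pull 'approval date' / 'expiry date' fields out of certificate text.
    Assumes English labels and YYYY-MM-DD dates, purely for illustration."""
    pattern = r"(approval|expiry)\s+date\s*[:\-]\s*(\d{4})-(\d{2})-(\d{2})"
    found = {}
    for label, y, m, d in re.findall(pattern, certificate_text, flags=re.IGNORECASE):
        found[label.lower()] = date(int(y), int(m), int(d))
    return found
```

The appeal of the deep learning approach in the abstract is precisely that it avoids maintaining one such rule set per country and language.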
Procedia PDF Downloads 163
27721 Exploration of Building Information Modelling Software to Develop Modular Coordination Design Tool for Architects
Authors: Muhammad Khairi bin Sulaiman
Abstract:
The utilization of Building Information Modelling (BIM) in the construction industry has given designers in the Architecture, Engineering and Construction (AEC) industry the opportunity to move from conventional manual drafting to a way of working that creates alternative designs quickly and produces more accurate, reliable, and consistent outputs. With BIM software, designers can create digital content that manipulates data through BIM's parametric model, generate more design alternatives quickly, and explore design problems further to produce a better design faster than with conventional design methods. Generally, however, BIM has been used as a documentation mechanism, and its capabilities as a design tool have not been fully explored and utilised. In this context, Modular Coordination (MC) design is encouraged as a sustainable design practice, since it reduces material wastage through standard dimensioning, pre-fabrication, and repetitive, modular construction and components. However, MC design involves a complex process of rules and dimensions, so a tool is needed to make the process easier. Since the parameters in BIM can easily be manipulated to follow MC rules and dimensioning, the integration of BIM software with MC design is proposed for architects during the design stage. With such a tool, acceptance and effective practice of MC design will improve. Consequently, this study will analyse and explore the function and customization of BIM objects and the capability of BIM software to expedite the application of MC design during the design stage. With this application, architects will be able to create building models and locate objects within reference modular grids that adhere to MC rules and dimensions.
The parametric modeling capabilities of BIM will also act as a visual tool that further enhances the automation of the 3-dimensional space planning modeling process. (Method) The study will first analyze and explore the parametric modeling capabilities of rule-based BIM objects, eventually customizing a reference grid within the rules and dimensions of MC. The approach will enhance the architect's overall design process and enable architects to automate complex modeling, which was nearly impossible before. A prototype based on a residential quarter will be modeled, and a set of reference grids guided by specific MC rules and dimensions will be used to develop a variety of space planning configurations. The tool will expedite the design process and encourage the use of MC design in the construction industry.
Keywords: building information modeling, modular coordination, space planning, customization, BIM application, MC space planning
Procedia PDF Downloads 84
27720 A Multifactorial Algorithm to Automate Screening of Drug-Induced Liver Injury Cases in Clinical and Post-Marketing Settings
Authors: Osman Turkoglu, Alvin Estilo, Ritu Gupta, Liliam Pineda-Salgado, Rajesh Pandey
Abstract:
Background: Hepatotoxicity can be linked to a variety of clinical symptoms and histopathological signs, posing a great challenge in the surveillance of suspected drug-induced liver injury (DILI) cases in a safety database. Additionally, the majority of such cases are rare, idiosyncratic, highly unpredictable, and tend to demonstrate unique individual susceptibility; these qualities, in turn, make the pharmacovigilance monitoring process tedious and time-consuming. Objective: Develop a multifactorial algorithm to assist pharmacovigilance physicians in identifying high-risk hepatotoxicity cases associated with DILI in the sponsor's safety database (Argus). Methods: Multifactorial selection criteria were established using Structured Query Language (SQL) and the TIBCO Spotfire® visualization tool, via a combination of word fragments, wildcard strings, and mathematical constructs based on Hy's law criteria and the pattern of injury (R-value). These criteria excluded non-eligible cases from monthly line listings mined from the Argus safety database. The capabilities and limitations of the criteria were verified by comparing a manual review of all monthly cases with the system-generated monthly listings over six months. Results: On average, over a period of six months, the algorithm accurately identified 92% of DILI cases meeting the established criteria. The automated process easily compared liver enzyme elevations with baseline values, reducing the screening time to under 15 minutes, as opposed to the multiple hours consumed by a cognitively laborious manual process. Limitations of the algorithm include its inability to identify cases associated with non-standard laboratory tests, naming conventions, and/or incomplete or incorrectly entered laboratory values. Conclusions: The newly developed multifactorial algorithm proved extremely useful in detecting potential DILI cases while heightening the vigilance of the drug safety department.
Additionally, the algorithm may be useful in identifying a potential signal for DILI in drugs not yet known to cause liver injury (e.g., drugs in the initial phases of development). It also carries the potential for universal application, owing to its product-agnostic data and keyword mining features. Plans for the tool include developing it into a fully automated application, thereby completely eliminating the manual screening process.
Keywords: automation, drug-induced liver injury, pharmacovigilance, post-marketing
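The two clinical constructs named in the abstract, the pattern-of-injury R-value and Hy's law, have well-defined published forms that can be sketched directly. This is a simplified screen, not the sponsor's full SQL logic; the cutoffs follow the commonly published criteria (R >= 5 hepatocellular, R <= 2 cholestatic; Hy's law as transaminase >= 3x ULN with bilirubin >= 2x ULN and without marked ALP elevation).

```python
def r_value(alt, alt_uln, alp, alp_uln):
    """Pattern-of-injury R-value: (ALT / ULN) / (ALP / ULN).
    R >= 5 suggests hepatocellular injury, R <= 2 cholestatic, else mixed."""
    return (alt / alt_uln) / (alp / alp_uln)

def meets_hys_law(alt, alt_uln, bilirubin, bili_uln, alp, alp_uln):
    """Simplified Hy's law screen: ALT >= 3x ULN together with total
    bilirubin >= 2x ULN and without marked ALP elevation (< 2x ULN)."""
    return (alt >= 3 * alt_uln
            and bilirubin >= 2 * bili_uln
            and alp < 2 * alp_uln)
```

In the described workflow, predicates like these would sit inside the SQL selection criteria that filter the monthly line listings, rather than being applied case by case in application code.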
Procedia PDF Downloads 152
27719 Using Geospatial Analysis to Reconstruct the Thunderstorm Climatology for the Washington DC Metropolitan Region
Authors: Mace Bentley, Zhuojun Duan, Tobias Gerken, Dudley Bonsal, Henry Way, Endre Szakal, Mia Pham, Hunter Donaldson, Chelsea Lang, Hayden Abbott, Leah Wilcynzski
Abstract:
Air pollution has the potential to modify the lifespan and intensity of thunderstorms and the properties of lightning. Using data mining and geovisualization, we investigate how background climate and weather conditions shape variability in urban air pollution and how this, in turn, shapes thunderstorms as measured by the intensity, distribution, and frequency of cloud-to-ground lightning. A spatiotemporal analysis was conducted to identify thunderstorms using high-resolution lightning detection network data. Over seven million lightning flashes were used to identify more than 196,000 thunderstorms that occurred between 2006 and 2020 in the Washington, DC Metropolitan Region. Each lightning flash in the dataset was grouped into thunderstorm events by means of a temporal and spatial clustering algorithm. Once the thunderstorm event database was constructed, hourly wind direction, wind speed, and atmospheric thermodynamic data were added to the initiation and dissipation times and locations of the 196,000 identified thunderstorms. Hourly aerosol and air quality data for the thunderstorm initiation times and locations were also incorporated into the dataset. Developing thunderstorm climatologies using a lightning tracking algorithm and lightning detection network data proved useful for visualizing the spatial and temporal distribution of urban-augmented thunderstorms in the region.
Keywords: lightning, urbanization, thunderstorms, climatology
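The temporal-and-spatial clustering step described above can be sketched as a greedy grouping of time-sorted flashes: a flash joins an existing storm event if it falls within a time gap and a distance of that event's most recent flash, otherwise it starts a new event. The thresholds and the flat-plane distance are illustrative assumptions; the study's actual clustering parameters are not given in the abstract.

```python
def group_flashes(flashes, max_gap_minutes=30.0, max_dist_km=20.0):
    """Group lightning flashes, given as (time_minutes, x_km, y_km) tuples,
    into storm events by temporal gap and spatial distance to the most
    recent flash of each event. Thresholds are illustrative, not the study's."""
    storms = []
    for t, x, y in sorted(flashes):
        for storm in storms:
            last_t, last_x, last_y = storm[-1]
            if (t - last_t <= max_gap_minutes
                    and ((x - last_x) ** 2 + (y - last_y) ** 2) ** 0.5 <= max_dist_km):
                storm.append((t, x, y))
                break
        else:
            storms.append([(t, x, y)])
    return storms
```

At the seven-million-flash scale of the study, a production version would need spatial indexing rather than the linear scan over open storms shown here.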
Procedia PDF Downloads 75
27718 A Research and Application of Feature Selection Based on IWO and Tabu Search
Authors: Laicheng Cao, Xiangqian Su, Youxiao Wu
Abstract:
Feature selection is one of the important problems in network security, pattern recognition, data mining, and other fields. To remove redundant features and effectively improve the detection speed of intrusion detection systems, this paper proposes a new feature selection method based on the invasive weed optimization (IWO) algorithm and the tabu search (TS) algorithm. IWO is used for global search and tabu search for local search, improving on the results of the IWO algorithm alone. The experimental results show that the method can effectively remove redundant features from network data, reduce processing time, and, while guaranteeing an accurate detection rate, effectively improve the speed of the detection system.
Keywords: intrusion detection, feature selection, IWO, tabu search
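The tabu search (local) phase of the hybrid method can be sketched as follows: flip one feature at a time, take the best non-tabu move, and remember recent flips in a tabu list so the search cannot immediately undo them. The IWO global phase that would seed the starting subset is omitted here (we start from all features selected), and the move/aspiration rules are simplified assumptions.

```python
def tabu_search(n_features, score, iterations=50, tabu_size=5):
    """Tabu search over boolean feature masks. `score` maps a mask (list of
    bools) to a fitness value; higher is better. Returns (best_mask, best_score).
    Sketch of the local phase only; the IWO global phase is not shown."""
    current = [True] * n_features          # stand-in for an IWO-produced seed
    best, best_score = current[:], score(current)
    tabu = []                              # indices of recently flipped features
    for _ in range(iterations):
        candidates = []
        for i in range(n_features):
            if i in tabu:                  # tabu moves are skipped outright
                continue
            neighbour = current[:]
            neighbour[i] = not neighbour[i]
            candidates.append((score(neighbour), i, neighbour))
        if not candidates:
            break
        s, i, current = max(candidates)    # best admissible single-flip move
        tabu.append(i)
        if len(tabu) > tabu_size:
            tabu.pop(0)                    # fixed-length tabu tenure
        if s > best_score:
            best, best_score = current[:], s
    return best, best_score
```

For intrusion detection, `score` would typically combine a classifier's detection rate with a penalty on the number of selected features, so that redundant features are driven out of the mask.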
Procedia PDF Downloads 530
27717 Prediction of Remaining Life of Industrial Cutting Tools with Deep Learning-Assisted Image Processing Techniques
Authors: Gizem Eser Erdek
Abstract:
This study investigates predicting the remaining life of industrial cutting tools used in the production process with deep learning methods. As cutting tools wear out, they damage the raw material they are processing. The study therefore aims to predict the remaining life of a cutting tool from the damage it causes to the raw material. To this end, hole photos were collected from a hole-drilling machine for 8 months. The photos were labeled into 5 classes according to hole quality, transforming the task into a classification problem. Using the prepared data set, a model was built with convolutional neural networks, a deep learning method. In addition, the VGGNet and ResNet architectures, which have been successful in the literature, were tested on the data set. A hybrid model combining convolutional neural networks and support vector machines was also used for comparison. When all models were compared, the convolutional neural network model gave the best result, with a 74% accuracy rate. In preliminary studies, the data set was reduced to only the best and worst classes, and the resulting binary classification model achieved ~93% accuracy. The results of this study show that the remaining life of cutting tools can be predicted by deep learning methods from the damage to the raw material, and the experiments demonstrate that deep learning can serve as an alternative for cutting tool life estimation.
Keywords: classification, convolutional neural network, deep learning, remaining life of industrial cutting tools, ResNet, support vector machine, VGGNet
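The preliminary binary experiment (keeping only the best and worst hole-quality classes) amounts to a dataset filter before training; a minimal sketch, assuming integer quality labels 1 (worst) to 5 (best):

```python
def to_binary_subset(samples, best_label, worst_label):
    """Keep only the two extreme hole-quality classes and relabel them
    0 (worst) / 1 (best), turning the 5-class problem into a binary one.

    samples: list of (image_ref, label) pairs; image_ref can be a path
    or an array, it is passed through untouched.
    """
    mapping = {worst_label: 0, best_label: 1}
    return [(x, mapping[y]) for x, y in samples if y in mapping]
```

Dropping the ambiguous middle classes is a common reason binary accuracy (~93% here) exceeds the full 5-class accuracy (74%).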
Procedia PDF Downloads 77
27716 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria
Authors: Isaac Kayode Ogunlade
Abstract:
Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed around a PIC18F4550 microcontroller, communicating with a personal computer (PC) through USB (Universal Serial Bus). The research applied knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device using an LM35 sensor to measure weather parameters, together with an artificial intelligence approach (Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) to evaluate its performance. Both devices (standard and designed) were operated for 180 days under the same atmospheric conditions for data mining (temperature, relative humidity, and pressure). The acquired data were trained in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the coefficient of determination (R²), and Mean Percentage Error (MPE) were used as standard metrics to evaluate the models' precipitation predictions. The results show that the developed device has an efficiency of 96% and is compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) had a disparity error of 1.59%, while ARIMA's was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
Keywords: data acquisition system, device design, weather data, precipitation prediction, (FUTA) standard device
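The four evaluation metrics named above have standard closed forms; a minimal sketch, assuming simple paired lists of observed and predicted rainfall values:

```python
import math

def rmse(obs, pred):
    """Root Mean Square Error."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    """Mean Absolute Error."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def mpe(obs, pred):
    """Mean Percentage Error (signed, in percent; undefined if any obs is 0)."""
    return 100.0 / len(obs) * sum((o - p) / o for o, p in zip(obs, pred))

def r_squared(obs, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_o = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot
```

RMSE penalises large misses more heavily than MAE, while MPE keeps the sign of the bias, which is why studies like this one report several metrics side by side.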
Procedia PDF Downloads 91
27715 Socioterritorial Inequalities in a Region of Chile. Beyond the Geography
Authors: Javier Donoso-Bravo, Camila Cortés-Zambrano
Abstract:
In this paper, we analyze socioterritorial inequalities in the region of Valparaiso (Chile), using secondary data to account for these inequalities along economic, social, educational, and environmental dimensions across the thirty-six municipalities of the region. We examined a wide-ranging set of secondary data from public sources on economic activities, poverty, employment, income, years of education, post-secondary education access, green areas, access to potable water, and other indicators. We found sharp socioterritorial inequalities, especially rooted in the economic performance of each territory. The analyses show, on the one hand, a dual and disorganized development model in some territories with strong economic activity, especially in finance, real estate, mining, and vineyards, but at the same time with poor social indicators. On the other hand, most territories show a dispersed model with few dynamic economic activities and very poor social development. Finally, we discuss how socioterritorial inequalities in the region of Valparaiso reflect the degree of globalization of the economic activities carried out in each territory.
Keywords: socioterritorial inequalities, development model, Chile, secondary data, Region of Valparaiso
Procedia PDF Downloads 100
27714 EEG-Based Screening Tool for School Student’s Brain Disorders Using Machine Learning Algorithms
Authors: Abdelrahman A. Ramzy, Bassel S. Abdallah, Mohamed E. Bahgat, Sarah M. Abdelkader, Sherif H. ElGohary
Abstract:
Attention-Deficit/Hyperactivity Disorder (ADHD), epilepsy, and autism affect millions of children worldwide, many of whom remain undiagnosed even though all of these disorders are detectable in early childhood. Late diagnosis can cause severe problems due to delayed treatment and to the misconceptions and general lack of awareness surrounding these disorders. Electroencephalography (EEG) has played a vital role in the assessment of neural function in children, so quantitative EEG measurement is utilized here as a tool for evaluating patients who may have ADHD, epilepsy, or autism. We propose a screening tool that uses EEG signals and machine learning algorithms to detect these disorders at an early age in an automated manner. As a first step, the proposed classifiers were applied to epilepsy, providing an accuracy of approximately 97% using SVM, Naïve Bayes, and decision trees, and 98% using KNN, which is promising for the work yet to be conducted.
Keywords: ADHD, autism, epilepsy, EEG, SVM
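Of the classifiers listed, KNN is the simplest to illustrate; a minimal majority-vote sketch over toy feature vectors (the data layout is an assumption, real EEG features would come from a preprocessing pipeline):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify a feature vector by majority vote of its k nearest
    training samples under Euclidean distance.

    train: list of (feature_vector, label) pairs.
    """
    neighbors = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]
```

KNN's strong showing here (98%) is plausible for well-separated EEG feature spaces, since it makes no assumption about class boundaries beyond local smoothness.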
Procedia PDF Downloads 190
27713 An Experimental Study on Ultrasonic Machining of Pure Titanium Using Full Factorial Design
Authors: Jatinder Kumar
Abstract:
Ultrasonic machining is one of the most widely used non-traditional machining processes for materials that are relatively brittle, hard, and fragile, such as advanced ceramics, refractories, crystals, and quartz. There is a considerable lack of research on its application to the cost-effective machining of tough materials such as titanium. In this investigation, the application of the USM process to machining titanium (ASTM Grade I) has been explored. Experiments were conducted to assess the effect of different USM process parameters on machining rate and tool wear rate as response characteristics. The process parameters included in this study are abrasive grit size, tool material, and power rating of the ultrasonic machine. It is concluded that titanium is fairly machinable with the USM process, and that significant improvement in the machining rate can be realized by manipulating the process parameters and obtaining their optimum combination.
Keywords: abrasive grit size, tool material, titanium, ultrasonic machining
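A full factorial design enumerates every combination of the chosen parameter levels; with the three factors named above it can be sketched as follows (the level values are hypothetical):

```python
from itertools import product

def full_factorial(levels):
    """Enumerate every treatment combination of the given factors.

    levels: dict mapping factor name -> list of levels. Returns a list
    of dicts, one per experimental run (the full factorial design matrix).
    """
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]
```

With three grit sizes, two tool materials, and two power settings this yields 3 x 2 x 2 = 12 runs, so every main effect and interaction can be estimated.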
Procedia PDF Downloads 359
27712 Design of an Instrumentation Setup and Data Acquisition System for a Gas Turbine Engine Using Suitable DAQ Software
Authors: Syed Nauman Bin Asghar Bukhari, Mohtashim Mansoor, Mohammad Nouman
Abstract:
An engine test bed is a fundamental tool for measuring the dynamic parameters, economic performance, and reliability of an aircraft engine, and its automation and accuracy directly influence the precision of the acquired and analysed data. In this paper, we present the design of a digital data acquisition (DAQ) system for a vintage aircraft engine test bed that lacks the capability of displaying all the analysed parameters at one convenient location (one panel, one screen). Recording such measurements on the vintage test bed is not only time-consuming but also prone to human error. Digitizing such a measurement system requires a DAQ system capable of recording these parameters and displaying them on a single-screen, single-panel monitor. The challenge in designing an upgrade to a vintage system lies in building and integrating a digital measurement system from scratch with a minimal budget and minimal modifications to the existing system. The proposed design not only displays all the key performance and maintenance parameters of the gas turbine engine for the operator as well as the quality inspector on separate screens but also records the data for further processing and archiving.
Keywords: gas turbine engine, engine test cell, data acquisition, instrumentation
Procedia PDF Downloads 123
27711 Revolutionizing Geographic Data: CADmapper's Automated Precision in CAD Drawing Transformation
Authors: Toleen Alaqqad, Kadi Alshabramiy, Suad Zaafarany, Basma Musallam
Abstract:
CADmapper is a software tool for transforming geographic data into accurate CAD drawings. It speeds up and simplifies the conversion process by automating it, allowing architects, urban planners, engineers, and geographic information system (GIS) experts to concentrate on the creative and scientific parts of their projects. While future incorporation of AI has the potential for further improvements, CADmapper's current capabilities already make it an indispensable asset in the business. It covers a combination of 2D and 3D city and urban area models. The user selects a square section of the map to view, and the fee is based on the dimensions of the selected area. The procedure is straightforward: choose the area, decide whether to include topography and 3D architectural data (if available), then select the design program or CAD format in which to publish the document. The service offers more than 200 free broad town plans in DXF format, and a bespoke area is free up to 1 km².
Keywords: CADmapper, geographic data, 2D and 3D data conversion, automated CAD drawing, urban planning software
Procedia PDF Downloads 68
27710 Evaluating the Feasibility of Magnetic Induction to Cross an Air-Water Boundary
Authors: Mark Watson, J.-F. Bousquet, Adam Forget
Abstract:
A magnetic induction based underwater communication link is evaluated using an analytical model and a custom Finite-Difference Time-Domain (FDTD) simulation tool. The analytical model is based on the Sommerfeld integral, while the full-wave simulation tool evaluates Maxwell’s equations using the FDTD method in cylindrical coordinates. The analytical model and FDTD simulation tool are then compared and used to predict the system performance for various transmitter depths and optimum frequencies of operation. To this end, the system bandwidth, signal-to-noise ratio, and the magnitude of the induced voltage are used to estimate the expected channel capacity. The models show that in seawater, relatively low-power, small coils may be capable of obtaining a throughput of 40 to 300 kbps when the transmitter is at a depth of 1 to 3 m and the receiver is at a height of 1 m.
Keywords: magnetic induction, FDTD, underwater communication, Sommerfeld
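Estimating channel capacity from bandwidth and signal-to-noise ratio is conventionally done with the Shannon formula C = B·log₂(1 + SNR); a small sketch (the paper's exact link-budget model is not reproduced here):

```python
import math

def channel_capacity_kbps(bandwidth_hz, snr_db):
    """Shannon capacity C = B * log2(1 + SNR), returned in kbit/s.

    snr_db is converted from decibels to a linear power ratio first.
    """
    snr = 10 ** (snr_db / 10.0)
    return bandwidth_hz * math.log2(1 + snr) / 1000.0
```

For example, a 10 kHz induction channel at 10 dB SNR caps out near 35 kbps, which is consistent with the lower end of the 40 to 300 kbps range reported above.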
Procedia PDF Downloads 125
27709 Effect of Taper Pin Ratio on Microstructure and Mechanical Property of Friction Stir Welded AZ31 Magnesium Alloy
Authors: N. H. Othman, N. Udin, M. Ishak, L. H. Shah
Abstract:
This study focuses on the effect of the taper pin ratio of the tool on friction stir welding of magnesium alloy AZ31. Two pieces of AZ31 alloy with a thickness of 6 mm were friction stir welded using a conventional milling machine. The shoulder diameter used in this experiment was fixed at 18 mm. The taper pin ratios were varied at 6:6, 6:5, 6:4, 6:3, 6:2, and 6:1. The rotational speeds used in this study were 500 rpm, 1000 rpm, and 1500 rpm, and the welding speeds were 150 mm/min, 200 mm/min, and 250 mm/min. The microstructure of the welded area was studied using an optical microscope. Equiaxed grains were observed in the TMAZ and stir zone, indicating full plastic deformation. A taper pin ratio of 6:1 results in low heat input to the material because of the small contact surface between the tool and the stirred material compared with the other ratios. The grain size of the stir zone increased with an increasing ratio of rotational speed to traverse speed due to higher heat input, and wormholes were observed when excessive heat input was applied. Tensile tests were used to evaluate the mechanical properties of the specimens. Specimens welded with a taper pin ratio of 6:1 showed higher tensile strength than the other ratios, up to 204 MPa; moreover, these specimens showed the best tensile strength at a rotational speed of 500 rpm and a welding speed of 150 mm/min.
Keywords: friction stir welding, magnesium AZ31, cylindrical taper tool, taper pin ratio
Procedia PDF Downloads 286
27708 Risk Based Maintenance Planning for Loading Equipment in Underground Hard Rock Mine: Case Study
Authors: Sidharth Talan, Devendra Kumar Yadav, Yuvraj Singh Rajput, Subhajit Bhattacharjee
Abstract:
The mining industry is known for its appetite to spend sizeable capital on mine equipment. In the current scenario, however, the industry is challenged by the daunting factors of non-uniform geological conditions, uneven ore grade, uncontrollable and volatile mineral commodity prices, and the ever-increasing quest to optimize capital and operational costs. Thus, equipment reliability and maintenance planning play a significant role in augmenting equipment availability for the operation and, in turn, boosting mine productivity. This paper presents Risk Based Maintenance (RBM) planning conducted on mine loading equipment, namely Load Haul Dumpers (LHDs), at Sindesar Khurd Mines, an underground zinc and lead mine situated in Dariba, Rajasthan, India, operated by Hindustan Zinc Limited, a subsidiary of Vedanta Resources Ltd. The mining equipment at the location is maintained by the Original Equipment Manufacturers (OEMs), namely Sandvik and Atlas Copco, who carry out the maintenance and inspection operations for the equipment. Downtime data extracted for the equipment fleet over the 6 months from 1st January 2017 until 30th June 2017 revealed that three downtime issues, namely engine, hydraulics, and transmission, contributed significantly across the entire loading fleet, as substantiated by Pareto analysis. Further scrutiny through bubble matrix analysis of these factors revealed the major influence of selected factors, namely overheating, no load taken (NTL) issues, gear changing issues, and hose puncture and leakage issues. Based on the equipment-wise analysis of all the downtime factors, the spares consumed, and the alarm logs extracted from the machines, technical design changes to the equipment and a pre-shift critical alarms checklist were proposed for equipment maintenance. The analysis allows OEMs and mine management to focus on the critical issues hampering the reliability of mine equipment and to design the maintenance strategies needed to mitigate them.
Keywords: bubble matrix analysis, LHDs, OEMs, Pareto chart analysis, spares consumption matrix, critical alarms checklist
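The Pareto analysis mentioned above ranks downtime causes and isolates the vital few; a minimal sketch with hypothetical downtime figures (not the study's actual data):

```python
def pareto_critical(downtime_hours, threshold=0.8):
    """Rank downtime causes and return the vital few that together
    account for at least `threshold` of total downtime (the classic
    80/20 cut used in Pareto chart analysis).

    downtime_hours: dict mapping cause -> hours of downtime.
    """
    total = sum(downtime_hours.values())
    ranked = sorted(downtime_hours.items(), key=lambda kv: kv[1], reverse=True)
    critical, cum = [], 0.0
    for cause, hours in ranked:
        critical.append(cause)
        cum += hours / total
        if cum >= threshold:
            break
    return critical
```

Applied fleet-wide, a cut like this is how categories such as engine, hydraulics, and transmission would surface as the dominant contributors.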
Procedia PDF Downloads 153
27707 Longitudinal Analysis of Internet Speed Data in the Gulf Cooperation Council Region
Authors: Musab Isah
Abstract:
This paper presents a longitudinal analysis of Internet speed data in the Gulf Cooperation Council (GCC) region, focusing on the most populous cities of each of the six countries – Riyadh, Saudi Arabia; Dubai, UAE; Kuwait City, Kuwait; Doha, Qatar; Manama, Bahrain; and Muscat, Oman. The study utilizes data collected from the Measurement Lab (M-Lab) infrastructure over a five-year period from January 1, 2019, to December 31, 2023. The analysis includes downstream and upstream throughput data for the cities, covering significant events such as the launch of 5G networks in 2019, COVID-19-induced lockdowns in 2020 and 2021, and the subsequent recovery period and return to normalcy. The results showcase substantial increases in Internet speeds across the cities, highlighting improvements in both download and upload throughput over the years. All the GCC countries have achieved above-average Internet speeds that can conveniently support various online activities and applications with excellent user experience.
Keywords: internet data science, internet performance measurement, throughput analysis, internet speed, measurement lab, network diagnostic tool
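Longitudinal throughput summaries of this kind are typically built by aggregating raw speed-test samples into a robust per-period statistic; a sketch assuming simple (year, Mbps) pairs rather than the actual M-Lab schema:

```python
from collections import defaultdict
from statistics import median

def yearly_medians(measurements):
    """Aggregate raw speed-test samples into a per-year median, a robust
    summary for skewed throughput distributions.

    measurements: iterable of (year, throughput_mbps) pairs.
    """
    by_year = defaultdict(list)
    for year, mbps in measurements:
        by_year[year].append(mbps)
    return {year: median(vals) for year, vals in sorted(by_year.items())}
```

The median is preferred over the mean for throughput data because a handful of very fast (or stalled) tests would otherwise dominate the yearly figure.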
Procedia PDF Downloads 62
27706 Research and Innovations in Music Teacher Training Programme in Hungary
Authors: Monika Benedek
Abstract:
Improvisation is an integral part of music education programmes worldwide, since teachers recognize that improvisation helps to broaden stylistic knowledge, develops creativity and various musical skills, in particular aural skills, and also motivates students to learn music theory. In Hungary, where the Kodály concept is a core element of music teacher education, improvisation has been a relatively neglected subject in both primary school and classical music school curricula. Improvisation was therefore an important theme of a year-long research project carried out at the Liszt Academy of Music in Budapest. The project aimed to develop the music teacher training programme and, among other things, focused on testing how improvisation could be used as a teaching tool to improve students’ musical reading and writing skills and creative musical skills. Teacher-researchers first tested various approaches to teaching improvisation with numerous teaching modules in music lessons at public schools and music schools. Data were collected from videos of lessons and from teachers’ reflective notes. After the data were analysed and the teaching modules developed, all modules were tested again in a pilot course of 30 contact lessons for music teachers. Teachers gave written feedback on the pilot programme, tested two modules of their choice in their own teaching, and wrote reflective comments about their experiences in applying the improvisation teaching modules. The overall results indicated that improvisation can be an innovative approach to teaching various musical subjects, in particular solfège, music theory, and instrument, in either individual or group instruction. Improvisation, especially with the application of relative solmisation and singing, appeared to be a beneficial tool for developing various musicianship skills of students and teachers, in particular aural, musical reading and writing, and creative musical skills.
Furthermore, improvisation seemed to be a motivating teaching tool for learning music theory by creating a bridge between various musical styles. This paper reports on the results of the research project.
Keywords: improvisation, Kodály concept, music school, public school, teacher training
Procedia PDF Downloads 144
27705 Functional and Efficient Query Interpreters: Principle, Application and Performances’ Comparison
Authors: Laurent Thiry, Michel Hassenforder
Abstract:
This paper presents a general approach to implementing efficient query interpreters in a functional programming language. Most of the standard tools currently available use an imperative and/or object-oriented language for the implementation (e.g. Java for Jena-Fuseki), but other paradigms are possible, potentially with better performance. To proceed, the paper first explains how to model data structures and queries from a functional point of view. It then proposes a general methodology for measuring performance (i.e. the number of computation steps needed to answer a query) and explains how to integrate optimization techniques (short-cut fusion and, more importantly, data transformations). Finally, it compares the proposed functional server to a standard tool (Fuseki), demonstrating that the former can be two to ten times faster at answering queries.
Keywords: data transformation, functional programming, information server, optimization
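The functional modelling of queries described here can be sketched in Python with generator-based stages, where lazy composition gives the single-pass traversal that short-cut fusion formalises (the combinator names are illustrative, not the paper's API):

```python
from functools import reduce

def q_filter(pred):
    """A query stage that keeps rows satisfying pred (lazy)."""
    return lambda rows: (r for r in rows if pred(r))

def q_map(f):
    """A query stage that projects each row through f (lazy)."""
    return lambda rows: (f(r) for r in rows)

def pipeline(*stages):
    """Compose query stages left to right. Because every stage returns a
    generator, the composed query traverses the data once instead of
    materialising an intermediate collection per stage."""
    return lambda rows: reduce(lambda acc, stage: stage(acc), stages, rows)

# example query: select the names of adult rows
query = pipeline(
    q_filter(lambda r: r["age"] >= 18),
    q_map(lambda r: r["name"]),
)
```

Deforestation of intermediates like this is the imperative-language analogue of what short-cut fusion achieves by rewriting in a functional compiler.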
Procedia PDF Downloads 157