Search results for: WEKA data mining tool
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28612

27142 A Decision-Support Tool for Humanitarian Distribution Planners in the Face of Congestion at Security Checkpoints: A Real-World Case Study

Authors: Mohanad Rezeq, Tarik Aouam, Frederik Gailly

Abstract:

In times of armed conflict, authorities place security checkpoints to control the flow of merchandise into and within areas of conflict. The flow of humanitarian trucks, added to the regular flow of commercial trucks, together with the complex security procedures, creates congestion and long waiting times at the security checkpoints. This increases distribution costs and causes shortages of relief aid for the affected people. Our research proposes a decision-support tool to assist planners and policymakers in building efficient plans for the distribution of relief aid, taking into account congestion at security checkpoints. The proposed tool is built around a multi-item humanitarian distribution planning model, developed through a multi-phase design science methodology, whose objective is to minimize distribution and backordering costs subject to capacity constraints that capture congestion effects through nonlinear clearing functions. Using the 2014 Gaza War as a case study, we illustrate the application of the proposed tool, model the underlying relief-aid humanitarian supply chain, estimate clearing functions at different security checkpoints, and conduct computational experiments. The decision-support tool generated a shipment plan that was compared to two benchmarks in terms of total distribution cost, average lead time and work in progress (WIP) at security checkpoints, and average inventory and backorders at distribution centers. The first benchmark is the shipment plan generated by the fixed-capacity model, and the second is the actual shipment plan implemented by the planners during the armed conflict. According to our findings, modeling and optimizing supply chain flows reduce total distribution costs, average truck wait times at security checkpoints, and average backorders compared to the executed plan and the fixed-capacity model. Finally, scenario analysis concludes that increasing capacity at security checkpoints can lower total operating costs by reducing the average lead time.
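
The congestion effect at the heart of this model is captured by the nonlinear clearing function. As a minimal sketch of the idea, the Python snippet below uses a saturating form f(W) = C·W/(k + W); this functional form and all parameter values are illustrative assumptions, not the authors' calibrated model.

```python
# Illustrative sketch of a saturating clearing function: expected checkpoint
# throughput rises with work in progress (WIP) but levels off under congestion.
# The form f(W) = C * W / (k + W) and the parameters are assumptions for
# illustration, not the authors' calibrated model.

def clearing_function(wip: float, capacity: float = 40.0, k: float = 25.0) -> float:
    """Expected trucks cleared per day as a function of WIP at the checkpoint."""
    return capacity * wip / (k + wip)

if __name__ == "__main__":
    for wip in (5, 20, 50, 100, 200):
        cleared = clearing_function(wip)
        print(f"WIP={wip:4d} trucks -> ~{cleared:5.1f} cleared/day, "
              f"~{wip / cleared:4.1f} days average wait (Little's law)")
```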

Keywords: humanitarian distribution planning, relief-aid distribution, congestion, clearing functions

Procedia PDF Downloads 82
27141 Learning about the Strengths and Weaknesses of Urban Climate Action Plans

Authors: Prince Dacosta Aboagye, Ayyoob Sharifi

Abstract:

Cities respond to climate concerns mainly through their climate action plans (CAPs). A comprehensive content analysis of the dynamics in existing urban CAPs is not well represented in the literature, and this gap makes it difficult to appreciate the strengths and weaknesses of urban CAPs. Here, we perform a qualitative content analysis (QCA) on CAPs from 278 cities worldwide and use text-mining tools to map and visualize the relevant data. Our analysis showed a decline in the number of CAPs developed and published following the global COVID-19 lockdown period. Evidently, megacities are leading the deep decarbonisation agenda. We also observed a transition from developing mainly mitigation-focused CAPs pre-COP21 to developing both mitigation and adaptation CAPs. A lack of inclusiveness in local climate planning was common among European and North American cities. This evidence is a catalyst for understanding the trends in existing urban CAPs and shaping future urban climate planning.

Keywords: urban, climate action plans, strengths, weaknesses

Procedia PDF Downloads 97
27140 Hydrogeophysical Investigations and Mapping of Ingress Channels along the Blesbokspruit Stream in the East Rand Basin of the Witwatersrand, South Africa

Authors: Melvin Sethobya, Sithule Xanga, Sechaba Lenong, Lunga Nolakana, Gbenga Adesola

Abstract:

Mining has been the cornerstone of the South African economy for the last century. Most of the gold mining in South Africa was conducted within the Witwatersrand basin, which contributed to the rapid growth of the city of Johannesburg and catapulted the city into becoming the business and wealth capital of the country. But with the gradual depletion of resources, the stoppage of underground water extraction from mines, and other factors relating to the survival of the mining operations over a lengthy period, most of the mines were abandoned and left to pollute the local waterways and groundwater with toxins and heavy metal residue, and increased acid mine drainage ensued. The Department of Mineral Resources and Energy commissioned a project whose aim is to monitor, maintain, and mitigate the adverse environmental impacts of polluted mine water flowing into local streams, affecting local ecosystems and livelihoods downstream. As part of mitigation efforts, the diagnosis and monitoring of sites with polluted groundwater or surface water has become important. Geophysical surveys, in particular resistivity and magnetics surveys, were selected as some of the most suitable techniques for investigating local ingress points along one of the major streams cutting through the Witwatersrand basin, namely the Blesbokspruit, which is found in the eastern part of the basin. The aim of the surveys was to provide information that could assist in determining possible water loss/ingress from the Blesbokspruit stream. Modelling of the geophysical survey results offered an in-depth insight into the interaction and pathways of polluted water through the mapping of possible ingress channels near the Blesbokspruit. The resistivity-depth profile of the surveyed site exhibits a three-layered model: a low-resistivity overburden (10 to 200 Ω.m), underlain by a moderate-resistivity weathered layer (>300 Ω.m), which sits on a more resistive crystalline bedrock (>500 Ω.m). Two locations of potential ingress channels were mapped across the two traverses at the site. The magnetic survey conducted at the site mapped a major NE-SW trending regional lineament with a strong magnetic signature, modeled to a depth beyond 100 m, with the potential to act as a conduit for dispersion of stream water away from the stream, as it shared a similar orientation with the potential ingress channels mapped using the resistivity method.

Keywords: electrical resistivity, magnetics survey, Blesbokspruit, ingress

Procedia PDF Downloads 63
27139 Application of Blockchain Technology in Geological Field

Authors: Mengdi Zhang, Zhenji Gao, Ning Kang, Rongmei Liu

Abstract:

Management and application of geological big data is an important part of China's national big data strategy, and as the strategy is implemented, geological big data management becomes more and more critical. At present, there are still many technical barriers as well as conceptual confusion in many aspects of geological big data management and application, such as data sharing, intellectual property protection, and application technology. Therefore, it is a key task to make better use of new technologies for deeper exploration and wider application of geological big data. In this paper, we briefly introduce the basic principle of blockchain technology and then analyze the application dilemmas of geological data. Based on this analysis, we bring forward some feasible patterns and scenarios for applying blockchain to geological big data and put forward several suggestions for future work in geological big data management.
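
Since the paper's starting point is the basic principle of blockchain, a minimal hash-chain sketch may help: each block commits to the previous block's hash, so any later alteration of a stored geological record is detectable. All field names and records below are hypothetical illustrations, not the authors' design.

```python
# Minimal hash-chain sketch of the blockchain principle: each block commits to
# the previous block's hash, so altering any stored geological record breaks
# verification of every later block. All field names are hypothetical.
import hashlib
import json

def block_hash(block: dict) -> str:
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, record: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "prev_hash": prev, "record": record}
    block["hash"] = block_hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)

def verify(chain: list) -> bool:
    for i, block in enumerate(chain):
        expected = block_hash({k: v for k, v in block.items() if k != "hash"})
        prev_ok = block["prev_hash"] == (chain[i - 1]["hash"] if i else "0" * 64)
        if block["hash"] != expected or not prev_ok:
            return False
    return True

chain: list = []
append_block(chain, {"dataset": "borehole_logs_2021", "owner": "survey_team_a"})
append_block(chain, {"dataset": "gravity_grid_v2", "owner": "survey_team_b"})
print(verify(chain))                      # True
chain[0]["record"]["owner"] = "someone_else"
print(verify(chain))                      # False: tampering detected
```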

Keywords: blockchain, intellectual property protection, geological data, big data management

Procedia PDF Downloads 91
27138 Understanding the Qualitative Nature of Product Reviews by Integrating Text Processing Algorithm and Usability Feature Extraction

Authors: Cherry Yieng Siang Ling, Joong Hee Lee, Myung Hwan Yun

Abstract:

Usability has become a basic requirement of a product from the consumer's perspective, and a product that fails this requirement ends up not being used. Identifying usability issues by analyzing quantitative and qualitative data collected from usability testing and evaluation activities aids the product design process, yet the lack of studies on analysis methodologies for qualitative text data in the usability field inhibits the potential of these data for more useful applications. The possibility of analyzing qualitative text data has grown with the rapid development of data analysis fields such as natural language processing, which enables computers to understand human language, and machine learning, which provides predictive models and clustering tools. Therefore, this research aims to study the applicability of text-processing algorithms in the analysis of qualitative text data collected from usability activities. This research utilized datasets collected from an LG neckband headset usability experiment, consisting of headset survey text data, subject data, and product physical data. The analysis procedure, integrated with the text-processing algorithm, includes training the comments onto a vector space, labeling them with the subject and product physical feature data, and clustering to validate the result of comment vector clustering. The result shows 'volume and music control button' as the usability feature that matches best with the cluster of comment vectors: centroid comments of one cluster emphasized button positions, while centroid comments of the other cluster emphasized button interface issues. When the volume and music control buttons were designed separately, the participants experienced less confusion, and thus the comments mentioned only the buttons' positions. When the volume and music control buttons were designed as a single button, the participants experienced interface issues regarding the buttons, such as the operating methods of functions and confusion between the functions' buttons. The relevance of the cluster centroid comments to the extracted feature demonstrates the capability of text-processing algorithms in analyzing qualitative text data from usability testing and evaluations.
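
As a small illustration of the vectorize-and-cluster step described above, the sketch below uses scikit-learn's TF-IDF vectorizer and K-Means in place of the authors' trained vector space; the sample comments are invented stand-ins for the headset survey data.

```python
# Sketch of the vectorize-then-cluster step, using scikit-learn. TF-IDF stands
# in for the authors' trained vector space; the comments are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
import numpy as np

comments = [
    "volume button is too close to the music button",
    "hard to find the volume control position by touch",
    "pressing the single button toggles the wrong function",
    "confusing which press changes volume and which skips tracks",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(comments)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# The comment nearest each centroid summarizes its cluster's usability theme.
for c in range(kmeans.n_clusters):
    idx = np.where(kmeans.labels_ == c)[0]
    dists = np.linalg.norm(X[idx].toarray() - kmeans.cluster_centers_[c], axis=1)
    print(f"cluster {c} centroid comment: {comments[idx[np.argmin(dists)]]}")
```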

Keywords: usability, qualitative data, text-processing algorithm, natural language processing

Procedia PDF Downloads 285
27137 De-Novo Structural Elucidation from Mass/NMR Spectra

Authors: Ismael Zamora, Elisabeth Ortega, Tatiana Radchenko, Guillem Plasencia

Abstract:

Structure elucidation of unknown substances based on mass spectra (MS) data is an unresolved problem that affects many different fields of application. A recent overview of software available for structure elucidation of small molecules has shown the demand for an efficient computational tool able to perform structure elucidation of unknown small molecules and peptides. We developed an algorithm for De-Novo fragment analysis based on MS data that proposes a set of scored and ranked structures compatible with the MS and MSMS spectra. Several different algorithms were developed depending on the initial set of fragments and the structure-building processes, and in all cases several scores for the final molecule ranking were computed. They were validated with small and medium databases (DB) using the eleven test-set compounds. Similar results were obtained from any of the databases that contained the fragments of the expected compound. In summary, we presented an algorithm for De-Novo fragment analysis based on mass spectrometry (MS) data only that proposes a set of scored and ranked structures; it was validated on different types of databases and showed good results as a proof of concept. Moreover, the solutions proposed from the mass spectrometry data were submitted to NMR spectra prediction in order to elucidate which of the proposed structures was compatible with the NMR spectra collected.
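
A toy sketch of the scoring idea behind such De-Novo tools: candidate structures are ranked by how many observed MS/MS peaks their theoretical fragment masses explain within a tolerance. The masses and candidates below are invented, and the authors' actual fragment-building and scoring are far richer.

```python
# Toy sketch: rank candidate structures by the fraction of observed MS/MS peaks
# explained by their theoretical fragment masses. All values are invented.

def score(candidate_fragments, observed_peaks, tol=0.01):
    """Fraction of observed peaks matched by some theoretical fragment mass."""
    matched = sum(
        any(abs(peak - frag) <= tol for frag in candidate_fragments)
        for peak in observed_peaks
    )
    return matched / len(observed_peaks)

observed = [77.04, 105.03, 122.06]            # hypothetical MS/MS peak m/z values
candidates = {
    "structure_A": [77.04, 105.03, 122.06],
    "structure_B": [77.04, 91.05, 120.00],
}
ranked = sorted(candidates, key=lambda c: score(candidates[c], observed), reverse=True)
for name in ranked:
    print(name, round(score(candidates[name], observed), 2))
```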

Keywords: De Novo, structure elucidation, mass spectrometry, NMR

Procedia PDF Downloads 295
27136 A Ratio-Weighted Decision Tree Algorithm for Imbalance Dataset Classification

Authors: Doyin Afolabi, Phillip Adewole, Oladipupo Sennaike

Abstract:

Most well-known classifiers, including the decision tree algorithm, can make predictions on balanced datasets efficiently. However, the decision tree algorithm tends to be biased towards the majority class on imbalanced datasets because of the skewness of their class distribution. To overcome this problem, this study proposes a weighted decision tree algorithm that aims to remove the bias toward the majority class and avoids reducing the number of majority observations in imbalanced dataset classification. The proposed weighted decision tree algorithm was tested on three imbalanced datasets: a cancer dataset, the German credit dataset, and a banknote dataset. The specificity, sensitivity, and accuracy metrics were used to evaluate the performance of the proposed decision tree algorithm on the datasets. The evaluation results show that for some of the weights of our proposed decision tree, the specificity, sensitivity, and accuracy metrics gave better results compared to those of the ID3 decision tree and the decision tree induced with minority entropy for all three datasets.
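
scikit-learn's class_weight option gives a rough feel for ratio-based weighting on an imbalanced problem; the sketch below is in the same spirit but is not the authors' proposed algorithm, and the built-in breast cancer dataset stands in for their cancer dataset.

```python
# Sketch of class weighting on a decision tree, in the spirit of the approach
# above (the authors' own weighting scheme differs in detail). The built-in
# breast cancer data stands in for the paper's cancer dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for weight in (None, "balanced"):
    tree = DecisionTreeClassifier(class_weight=weight, random_state=0).fit(X_tr, y_tr)
    tn, fp, fn, tp = confusion_matrix(y_te, tree.predict(X_te)).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tn + fp + fn + tp)
    print(f"class_weight={weight}: sens={sensitivity:.3f} "
          f"spec={specificity:.3f} acc={accuracy:.3f}")
```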

Keywords: data mining, decision tree, classification, imbalance dataset

Procedia PDF Downloads 137
27135 Dry High Speed Orthogonal Turning of Ti-6Al-4V Titanium Alloy

Authors: M. Benghersallah, G. List, G. Sutter

Abstract:

The present work is an experimental study of the dry high-speed turning of Ti-6Al-4V titanium alloy. The objective of this study is to see, at high cutting speeds, how wear occurs on the face of the insert and how the cutting forces and chip formation evolve. The cutting speeds tested are 600, 800, 1000, and 1200 m/min in orthogonal turning of a cylindrical titanium alloy part with an uncoated H13A carbide insert. Investigation of the worn inserts with a 3D scanning microscope revealed that crater formation is instantaneous and that chip adhesion (welded chip) causes detachment of carbide particles. Cutting forces increase and stabilize before the tool is withdrawn. The chip reaches a very high temperature.

Keywords: titanium alloy, dry high speed turning, wear insert, MQL technique

Procedia PDF Downloads 555
27134 Application of Nonlinear Model to Optimize the Coagulant Dose in Drinking Water Treatment

Authors: M. Derraz, M. Farhaoui

Abstract:

In water treatment processes, the determination of the optimal dose of coagulant is an issue of particular concern. Coagulant dosing is correlated with raw water quality, which depends on several parameters (turbidity, pH, temperature, conductivity…). The objective of this study is to provide water treatment operators with a tool that enables them to predict the optimum coagulant dose and, at times, replace the manual method (jar testing) used in this plant for that purpose. The model is constructed using actual process data from a water treatment plant located in the middle of Morocco (Meknes).
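
As a hedged illustration of fitting such a nonlinear dose model, the sketch below fits an assumed power-law form to invented jar-test records with scipy; the paper's actual model is built from real plant data and more raw-water parameters than turbidity alone.

```python
# Sketch of fitting a nonlinear dose model from jar-test records. The power-law
# form and the data points are invented placeholders, not the plant's model.
import numpy as np
from scipy.optimize import curve_fit

def dose_model(turbidity, a, b, c):
    """Alum dose (mg/L) as a nonlinear function of raw-water turbidity (NTU)."""
    return a * np.power(turbidity, b) + c

turbidity = np.array([5, 12, 30, 60, 120, 250], dtype=float)     # NTU
jar_test_dose = np.array([8, 12, 18, 24, 33, 46], dtype=float)   # mg/L (invented)

params, _ = curve_fit(dose_model, turbidity, jar_test_dose, p0=(1.0, 0.5, 5.0))
print("fitted a, b, c:", np.round(params, 3))
print("predicted dose at 80 NTU:", round(dose_model(80.0, *params), 1), "mg/L")
```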

Keywords: coagulation process, aluminum sulfate, model, coagulant dose

Procedia PDF Downloads 278
27133 Research on the Risks of Railroad Receiving and Dispatching Trains Operators: Natural Language Processing Risk Text Mining

Authors: Yangze Lan, Ruihua Xv, Feng Zhou, Yijia Shan, Longhao Zhang, Qinghui Xv

Abstract:

Receiving and dispatching trains is an important part of railroad organization, yet the risk evaluation of operating personnel is still expressed as simple scores, without deeper mining of wrong answers and operating accidents. Using natural language processing (NLP) technology, this study extracts the keywords and key phrases of 40 relevant risk events concerning receiving and dispatching trains and reclassifies the risk events into 8 categories, such as train approach and signal risks, dispatching command risks, and so on. Based on the historical risk data of personnel, the K-Means clustering method is used to classify the risk level of personnel. The results indicate that high-risk operating personnel need strengthened training in train receiving and dispatching operations for essential trains and abnormal situations.
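
A minimal sketch of the K-Means risk-grading step: personnel are clustered on numeric risk features and the clusters are ordered into risk levels. The two features and the records below are invented; the study uses richer historical risk data.

```python
# Sketch of grading personnel into risk levels with K-Means, as described
# above. The features and records are invented placeholders.
import numpy as np
from sklearn.cluster import KMeans

# columns: [wrong answers on assessments, operating incidents]
history = np.array([[1, 0], [2, 0], [3, 1], [8, 2], [9, 3], [15, 5], [14, 4]])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(history)
# order clusters so that a higher label means higher risk
order = np.argsort(kmeans.cluster_centers_.sum(axis=1))
level = {cluster: rank for rank, cluster in enumerate(order)}
for person, label in zip(history, kmeans.labels_):
    print(person, "-> risk level", level[label])
```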

Keywords: receiving and dispatching trains, natural language processing, risk evaluation, K-means clustering

Procedia PDF Downloads 91
27132 Friction Stir Welding of Al-Mg-Mn Aluminum Alloy Plates: A Review

Authors: K. Subbaiah, C. V. Jayakumar

Abstract:

Friction stir welding is a solid-state welding process that eliminates the defects found in fusion welding processes, and it is an environmentally friendly process. 5000 and 6000 series aluminum alloys are widely used in the transportation industries, and the Al-Mg-Mn (5000) and Al-Mg-Si (6000) alloys offer the best combination of properties for marine construction. The medium-strength, highly corrosion-resistant 5000 series alloys are among the most widely used aluminum alloys in the world. In this review, the tool pin profile, process parameters, resulting mechanical properties such as hardness, yield strength, and tensile strength, and the microstructural evolution of friction stir welded Al-Mg-Mn alloys (5000 series) are discussed.

Keywords: Al-Mg-Mn alloys, friction stir welding, tool pin profile, microstructure and mechanical properties

Procedia PDF Downloads 441
27131 Julia-Based Computational Tool for Composite System Reliability Assessment

Authors: Josif Figueroa, Kush Bubbar, Greg Young-Morris

Abstract:

The reliability evaluation of composite generation and bulk transmission systems is crucial for ensuring a reliable supply of electrical energy to significant system load points. However, evaluating adequacy indices using probabilistic methods like sequential Monte Carlo simulation can be computationally expensive. Despite this, it is necessary when time-varying and interdependent resources, such as renewables and energy storage systems, are involved. Recent advances in solving power network optimization problems and in parallel computing have improved runtime performance while maintaining solution accuracy. This work introduces CompositeSystems, an open-source composite system reliability evaluation tool developed in Julia™, to address the current deficiencies of commercial and non-commercial tools. It describes the tool's design, validation, and effectiveness, including an analysis of two different formulations of the Optimal Power Flow problem. The simulations demonstrate excellent agreement with existing published studies while improving replicability and reproducibility. Overall, the proposed tool can provide valuable insights into the performance of transmission systems, making it an important addition to the existing toolbox for power system planning.
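
For readers unfamiliar with adequacy assessment, the sketch below estimates a loss-of-load index by simple hourly state sampling; a true sequential Monte Carlo simulation, as in CompositeSystems, additionally tracks chronological up/down cycles and network constraints. All system numbers are invented, and the sketch is in Python rather than Julia.

```python
# Sketch of a Monte Carlo adequacy estimate (loss-of-load hours per year) by
# hourly state sampling of a toy two-unit system. Sequential Monte Carlo, as
# used by the tool above, also models chronological up/down cycles and the
# transmission network. All numbers are invented.
import random

random.seed(0)
UNITS = [(100.0, 0.02), (80.0, 0.04)]   # (capacity MW, forced outage rate)
LOAD = 90.0                             # constant load, MW (toy value)
YEARS, HOURS = 100, 8760

lole_hours = 0
for _ in range(YEARS):
    for _ in range(HOURS):
        available = sum(cap for cap, forced in UNITS if random.random() > forced)
        if available < LOAD:
            lole_hours += 1
print("estimated LOLE:", lole_hours / YEARS, "hours/year")
```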

Keywords: open-source software, composite system reliability, optimization methods, Monte Carlo methods, optimal power flow

Procedia PDF Downloads 73
27130 Mobile Application Tool for Individual Maintenance Users on High-Rise Residential Buildings in South Korea

Authors: H. Cha, J. Kim, D. Kim, J. Shin, K. Lee

Abstract:

Since the 1980s, rapid economic growth has left South Korea with many aged apartment buildings, yet building maintenance practice remains insufficient. In this study, to facilitate building maintenance, the authors classified building defects into three levels according to their level of performance and developed a mobile application tool based on the appropriate feedback for each level. The feedback structure consists of a 'Maintenance manual phase', an 'Online feedback phase', and a 'Repair work phase of the specialty contractors'. To implement each phase, the authors devised the necessary database for that phase and created a prototype system that can be developed further on its own. The authors expect that building users can easily maintain their buildings by using this application.

Keywords: building defect, maintenance practice, mobile application, system algorithm

Procedia PDF Downloads 188
27129 Experimental Parameters’ Effects on the Electrical Discharge Machining Performances (µEDM)

Authors: Asmae Tafraouti, Yasmina Layouni, Pascal Kleimann

Abstract:

The growing market for Microsystems (MST) and Micro-Electromechanical Systems (MEMS) is driving the research for alternative manufacturing techniques to microelectronics-based technologies, which are generally expensive and time-consuming. Hot embossing and micro-injection molding of thermoplastics appear to be industrially viable processes. However, both require the use of master models, usually made of hard materials such as steel. These master models cannot be fabricated using standard microelectronics processes; thus, other micromachining processes are used, such as laser machining or micro-electrical discharge machining (µEDM). In this work, µEDM has been used. The principle of µEDM is based on the use of a thin cylindrical micro-tool that erodes the workpiece surface. The two electrodes are immersed in a dielectric with a distance of a few micrometers (the gap). When an electrical voltage is applied between the two electrodes, electrical discharges are generated, which machine the material. In order to produce master models with high resolution and smooth surfaces, it is necessary to control the discharge mechanism well. However, several problems are encountered, such as the randomness of the electrical discharge process, fluctuation of the discharge energy, inversion of the electrodes' polarity, and wear of the micro-tool. The effect of different parameters, such as the applied voltage, the working capacitor, the micro-tool diameter, and the initial gap, has been studied. This analysis helps to improve machining performance, such as the workpiece surface condition and the lateral gap of the craters.
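
The emphasis on the applied voltage and working capacitor follows from the energy balance of RC (relaxation) discharge circuits, where the energy released per spark is commonly estimated as E = ½CU². A short sketch:

```python
# For an RC (relaxation) discharge circuit, the energy stored and released per
# spark is commonly estimated as E = 1/2 * C * U^2, which is why the working
# capacitor and the applied voltage dominate crater size. Values are examples.

def discharge_energy(capacitance_f: float, voltage_v: float) -> float:
    return 0.5 * capacitance_f * voltage_v ** 2

for c_nf in (0.1, 1.0, 10.0):            # working capacitor, nF
    for u in (80, 120, 160):              # applied voltage, V
        e_uj = discharge_energy(c_nf * 1e-9, u) * 1e6
        print(f"C={c_nf:5.1f} nF, U={u:3d} V -> E={e_uj:8.3f} µJ")
```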

Keywords: craters, electrical discharges, micro-electrical discharge machining (µEDM), microsystems

Procedia PDF Downloads 96
27128 Credit Risk Assessment Using Rule Based Classifiers: A Comparative Study

Authors: Salima Smiti, Ines Gasmi, Makram Soui

Abstract:

Credit risk is the most important issue for financial institutions. Its assessment has become an important task used to predict defaulting customers and classify customers as good or bad payers. To this end, numerous techniques have been applied for credit risk assessment. However, to our knowledge, many evaluation techniques are black-box models, such as neural networks, SVMs, etc., which generate applicants' classes without any explanation. In this paper, we propose to assess credit risk using a rule-based classification method whose output is a set of rules that describe and explain the decision. To this end, we compare seven classification algorithms (JRip, Decision Table, OneR, ZeroR, Fuzzy Rule, PART, and Genetic Programming (GP)), where the goal is to find the best rules satisfying several criteria: accuracy, sensitivity, and specificity. The obtained results confirm the efficiency of the GP algorithm on the German and Australian datasets compared to the other rule-based techniques for predicting credit risk.
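
The appeal of rule-based output can be seen with a small sketch: a shallow decision tree's paths printed as human-readable rules. scikit-learn's iris data stands in for the German and Australian credit datasets, and this is not one of the seven algorithms compared in the paper.

```python
# Sketch of extracting human-readable rules, the kind of explainable output the
# study argues for. A shallow decision tree's paths serve as the rule set here;
# the iris data is a stand-in for the credit datasets used in the paper.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(iris.data, iris.target)
print(export_text(tree, feature_names=iris.feature_names))
```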

Keywords: credit risk assessment, classification algorithms, data mining, rule extraction

Procedia PDF Downloads 181
27127 Investigating the Effectiveness of a 3D Printed Composite Mold

Authors: Peng Hao Wang, Garam Kim, Ronald Sterkenburg

Abstract:

In composite manufacturing, the fabrication and maintenance of tooling contribute a large portion of the total cost. As the applications of composite materials continue to increase, there is also a growing demand for more tooling, which places heavy emphasis on the industry's ability to fabricate high-quality tools while maintaining cost-effectiveness. One popular tool fabrication technique currently being developed utilizes the additive manufacturing technology known as 3D printing, whose popularity is due to its low material waste, low cost, and quick fabrication time. In this study, a team of Purdue University School of Aviation and Transportation Technology (SATT) faculty and students investigated the effectiveness of a 3D printed composite mold. A steel valve cover from an aircraft reciprocating engine was modeled utilizing 3D scanning and computer-aided design (CAD) to create a 3D printed composite mold. The mold was used to fabricate carbon fiber versions of the aircraft reciprocating engine valve cover. The carbon fiber valve covers were evaluated for dimensional accuracy and quality, while the 3D printed composite mold was evaluated for durability and dimensional stability. The data collected from this study provided valuable information on 3D printed composite molds, potential improvements for the molds, and considerations for future tooling design.

Keywords: additive manufacturing, carbon fiber, composite tooling, molds

Procedia PDF Downloads 114
27126 Reliability and Construct Validity of the Early Dementia Questionnaire (EDQ)

Authors: A. Zurraini, Syed Alwi Sar, H. Helmy, H. Nazeefah

Abstract:

The Early Dementia Questionnaire (EDQ) was developed as a screening tool to detect patients with early dementia in primary care. It was developed based on 20 symptoms of dementia. In a preliminary study, the EDQ was shown to be a promising alternative for screening for early dementia. This study was done to further test the EDQ's reliability and validity. Using systematic random sampling, 200 elderly patients attending primary health care centers in Kuching, Sarawak consented to participate in the study and were administered the EDQ. The Geriatric Depression Scale (GDS) was used to exclude patients with depression. Those who scored >21 on the MMSE were retested using the EDQ. Reliability was determined by Cronbach's alpha for internal consistency, and construct validity was assessed using confirmatory factor analysis (principal component with varimax rotation). The results showed that the overall Cronbach's alpha coefficient was good at 0.874. Confirmatory factor analysis on 4 factors indicated that the Cronbach's alpha for each domain was acceptable: memory (0.741), concentration (0.764), emotional and physical symptoms (0.754), and lastly sleep and environment (0.720). The Pearson correlation coefficient between the first EDQ score and the retest EDQ score among those with an MMSE of >21 showed a very strong, positive correlation between the two variables, r = 0.992, N = 160, P < 0.001. The results of the validation study showed that the Early Dementia Questionnaire (EDQ) is a valid and reliable tool for screening for early dementia in primary care.
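
For reference, Cronbach's alpha is computed as k/(k−1)·(1 − Σ item variances / variance of the total score). The sketch below implements this formula on an invented response matrix; it is not the study's data.

```python
# Sketch of the reliability statistic used above: Cronbach's alpha =
# k/(k-1) * (1 - sum of item variances / variance of total score).
# The response matrix (respondents x items) is invented.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

responses = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 1],
    [3, 3, 4, 3],
])
print(round(cronbach_alpha(responses), 3))
```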

Keywords: Early Dementia Questionnaire (EDQ), screening, primary care, construct validity

Procedia PDF Downloads 436
27125 Students' Perception of a Gamified Student Engagement Platform as Supportive Technology in Learning

Authors: Pinn Tsin Isabel Yee

Abstract:

Students are increasingly turning towards online learning materials to supplement their education. One such approach is the gamified student engagement platform (GSEP), which instills a new learning culture. Data were collected from closed-ended questions via content analysis techniques. About 81.8% of college students from the Monash University Foundation Year agreed that the GSEP (Quizizz) was an effective tool for learning, and approximately 85.5% of students disagreed that games were a waste of time. GSEPs were highly effective among students in facilitating the learning process.

Keywords: engagement, gamified, Quizizz, technology

Procedia PDF Downloads 107
27124 Catchment Yield Prediction in an Ungauged Basin Using PyTOPKAPI

Authors: B. S. Fatoyinbo, D. Stretch, O. T. Amoo, D. Allopi

Abstract:

This study extends the use of the Drainage Area Regionalization (DAR) method in generating synthetic data and calibrating PyTOPKAPI stream yield for an ungauged basin at a daily time scale. The generation of runoff that determines a river's yield is subject to various topographic and spatial meteorological variables, which together form the Catchment Characteristics Model (CCM). Many of the conventional CCM models adopted in Africa have been challenged by a paucity of adequate, relevant, and accurate data for parameterization and validation. The purpose of generating synthetic flow is to test a hydrological model that will not suffer from the impact of very low or very high flows, thus allowing one to check whether the model is structurally sound. The employed physically-based, watershed-scale hydrologic model (PyTOPKAPI) was parameterized with GIS pre-processing parameters and remote-sensing hydro-meteorological variables. Validation with the mean annual runoff ratio shows decent graphical agreement between observed and simulated discharge. The Nash-Sutcliffe efficiency and coefficient of determination (R²) values of 0.704 and 0.739 indicate strong model efficiency. Given the current impact of climate variability, water planners now have a tool for flow quantification and sustainable planning purposes.
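
The two goodness-of-fit scores quoted above can be computed directly; the sketch below shows the Nash-Sutcliffe efficiency, 1 − Σ(obs − sim)²/Σ(obs − mean)², and R² on invented discharge series.

```python
# Sketch of the two goodness-of-fit scores quoted above. The discharge series
# are invented, not the study's data.
import numpy as np

def nash_sutcliffe(obs: np.ndarray, sim: np.ndarray) -> float:
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([1.2, 3.4, 8.9, 5.1, 2.2, 1.0])   # observed daily flow, m3/s
sim = np.array([1.0, 3.9, 8.1, 5.6, 2.5, 1.3])   # simulated daily flow, m3/s

r = np.corrcoef(obs, sim)[0, 1]
print("NSE =", round(nash_sutcliffe(obs, sim), 3), " R^2 =", round(r ** 2, 3))
```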

Keywords: catchment characteristics model, GIS, synthetic data, ungauged basin

Procedia PDF Downloads 327
27123 A Single-Use Endoscopy System for Identification of Abnormalities in the Distal Oesophagus of Individuals with Chronic Reflux

Authors: Nafiseh Mirabdolhosseini, Jerry Zhou, Vincent Ho

Abstract:

The dramatic global rise in acid reflux has led to oesophageal adenocarcinoma (OAC) becoming the fastest-growing cancer in developed countries. While gastroscopy with biopsy is used to diagnose OAC patients, this labour-intensive and expensive process is not suitable for population screening. This study aims to design, develop, and implement a minimally invasive system to capture optical data of the distal oesophagus for rapid screening of potential abnormalities. To develop the system and understand user requirements, a user-centric approach was employed utilising co-design strategies. Target user segments were identified, and 38 patients and 14 health providers were interviewed. Next, the technical requirements were developed based on consultations with industry. A minimally invasive optical system was designed and developed considering patient comfort. The system consists of a sensing catheter, a controller unit, and an analysis program. Its procedure takes only 10 minutes to perform and does not require cleaning afterward since the catheter is single-use. A prototype system was evaluated for safety and efficacy in both laboratory and clinical performance. The prototype performed successfully when submerged in simulated gastric fluid, showing no evidence of erosion after 24 hours, and the system effectively recorded a video of the mid-distal oesophagus of a healthy volunteer (34-year-old male). The recorded images were used to develop an automated program to identify abnormalities in the distal oesophagus, and further data from a larger clinical study will be used to train the automated program. This system allows for quick visual assessment of the lower oesophagus in primary care settings and can serve as a screening tool for oesophageal adenocarcinoma. In addition, the system can be coupled with 24-hour ambulatory pH monitoring to better correlate oesophageal physiological changes with reflux symptoms, and it can provide additional information on lower oesophageal sphincter functions, such as opening times and bolus retention.

Keywords: endoscopy, MedTech, oesophageal adenocarcinoma, optical system, screening tool

Procedia PDF Downloads 88
27122 To Handle Data-Driven Software Development Projects Effectively

Authors: Shahnewaz Khan

Abstract:

Machine learning (ML) techniques are often used in projects for creating data-driven applications. These tasks typically demand additional research and analysis, and the proper technique and strategy must be chosen to ensure the success of data-driven projects. Otherwise, even with a lot of effort, the necessary development might not always be possible. In this paper, an effort is made to examine the workflow of data-driven software development projects and their implementation process in order to describe how to manage such a project successfully, which will assist in minimizing the added workload.

Keywords: data, data-driven projects, data science, NLP, software project

Procedia PDF Downloads 83
27121 Traditional Drawing, BIM and Erudite Design Process

Authors: Maryam Kalkatechi

Abstract:

Nowadays, parametric design, scientific analysis, and digital fabrication are dominant, and many architectural practices increasingly seek to incorporate advanced digital software and fabrication in their projects. An erudite design process that combines digital and practical aspects in a strong frame resulted from the dissertation research. The digital aspects are the progressive advancements in algorithm design and simulation software; these have assisted firms in developing more holistic concepts at the early stage and in maintaining collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to implement construction and architecture knowledge within the algorithm to make design processes successful. The erudite design process also involves ongoing improvements in applying the new method of 3D printing in construction. This is achieved through 'data-sketches'. The term 'data-sketch' was developed by the author in the recently completed dissertation; it accommodates the architect's decisions on the algorithm. This paper introduces the erudite design process and its components and summarizes the application of this process in the development of the '3D printed construction unit'. The paper contributes to bridging academia and practice with advanced technology by presenting a design process that transfers dominance from the tool to the learned architect and encourages innovation in design processes.

Keywords: erudite, data-sketch, algorithm design in architecture, design process

Procedia PDF Downloads 275
27120 Reflective and Collaborative Professional Development Program in Secondary Education to Improve Student’s Oral Language

Authors: Marta Gràcia, Ana Luisa Adam-Alcocer, Jesús M. Alvarado, Verónica Quezada, Tere Zarza, Priscila Garza

Abstract:

In secondary education, integrating linguistic content and reflection on it is a crucial challenge that should be included in course plans to enhance students' oral communication competence. In secondary education classrooms, a continuum can be identified in teaching methodologies: 1) the traditional teacher-dominated transmission approach, in which teachers transmit content to students unidirectionally; 2) the dialogical, bidirectional teaching approach, which encourages students to adopt a critical view of the information provided by the teacher or generated through student discussion. In this context, the EVALOE-DSS (Assessment Scale of Oral Language Teaching in the School Context-Decision Support System) digital instrument has emerged to help teachers transform their classes into spaces for communication, dialogue, reflection, and evaluation of the learning process, for teaching linguistic contents, and for developing curricular competencies. The tool includes various resources, such as a tutorial with the objectives and an initial screen for teachers to describe the class to be evaluated. One of the main resources of the digital instrument consists of 30 items (actions) with three qualitative response options (green, orange, and red face emojis) grouped into five dimensions. In the context of the participation of secondary education teachers in a professional development program using EVALOE-DSS, a digital tool aimed at generating more participatory, interactive, dialogic classes, the objectives of the study were: 1) understanding the changes in classroom dynamics and in the teachers' strategies during their participation in the professional development program; 2) analyzing the impact of these changes on students' oral language development according to their teachers; 3) deepening understanding of the impact of these changes on the students' assessment of the classes and their self-assessment of oral competence; 4) knowing the teachers' assessment of, and reflections on, their participation in the professional development program. Participants were ten teachers of different subjects and 250 secondary education students (16-18 years) at schools in Spain. The principal instrument used was the digital tool EVALOE-DSS. For 6 months, teachers used the digital tool to reflect on their classes, assess them (their actions and their students' actions), make decisions, and introduce changes to make their classes more participatory, interactive, and reflective about linguistic contents. Other data collection instruments and techniques used during the study were: 1) a questionnaire to assess students' oral language competence before and at the end of the study; 2) a questionnaire for students' assessment of the characteristics of the classes; 3) teachers' meetings during the professional development program to reflect collaboratively on their experience; 4) a questionnaire to assess the teachers' experience during their participation in the professional development program; 5) focus group meetings between the teachers and two researchers at the end of the study. The results showed relevant changes in teaching strategies and in the dynamics of the classes, which became more interactive, participatory, dialogic, and self-managed by the students. Both teachers and students agree on the progressive transformation of the classes into spaces for communication, discussion, and reflection on language, its development, and its use as an essential instrument for developing curricular competencies.

Keywords: digital tool, individual and collaborative reflection, oral language competence, professional development program, secondary education

Procedia PDF Downloads 36
27119 An Automated Approach to the Nozzle Configuration of Polycrystalline Diamond Compact Drill Bits for Effective Cuttings Removal

Authors: R. Suresh, Pavan Kumar Nimmagadda, Ming Zo Tan, Shane Hart, Sharp Ugwuocha

Abstract:

Polycrystalline diamond compact (PDC) drill bits are extensively used in the oil and gas industry as well as the mining industry, and industry engineers continually improve PDC drill bit designs and hydraulic conditions. Optimized injection nozzles play a key role in improving the drilling performance and efficiency of these ever-changing PDC drill bits. In the first part of this study, computational fluid dynamics (CFD) modelling is performed to investigate the hydrodynamic characteristics of drilling fluid flow around the PDC drill bit. The open-source CFD software OpenFOAM simulates the flow around the drill bit based on field input data. A specifically developed console application integrates the entire CFD process, including domain extraction, meshing, solving the governing equations, and post-processing. The results from the OpenFOAM solver are then compared with those of the ANSYS Fluent software, and the data from both software programs agree. The second part of the paper describes a parametric study of the PDC drill bit nozzle to determine the effect of parameters such as the number of nozzles, nozzle velocity, nozzle radial position, and orientation on the flow-field characteristics and bit washing patterns. After analyzing a series of nozzle configurations, the best configuration is identified, and recommendations are made for modifying the PDC bit design.
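
A console application of the kind described can drive OpenFOAM's standard utilities through subprocess calls. The sketch below is a guess at the general shape, assuming a prepared case directory and the stock blockMesh/snappyHexMesh/simpleFoam utilities; the authors' actual application also handles domain extraction and post-processing.

```python
# Hypothetical orchestration of an OpenFOAM case, in the spirit of the console
# application described above. Assumes OpenFOAM is installed and `case_dir`
# already contains the usual 0/, constant/, and system/ dictionaries.
import subprocess

def run_case(case_dir: str) -> None:
    steps = [
        ["blockMesh"],                      # background mesh
        ["snappyHexMesh", "-overwrite"],    # mesh refinement around the bit
        ["simpleFoam"],                     # steady incompressible solver
    ]
    for cmd in steps:
        print("running:", " ".join(cmd))
        subprocess.run(cmd, cwd=case_dir, check=True)

if __name__ == "__main__":
    run_case("./pdc_bit_case")              # hypothetical case directory
```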

Keywords: ANSYS Fluent, computational fluid dynamics, nozzle configuration, OpenFOAM, PDC drill bit

Procedia PDF Downloads 420
27118 Factors Impacting Technology Integration in EFL Classrooms: A Study of Qatari Independent Schools

Authors: Youmen Chaaban, Maha Ellili-Cherif

Abstract:

The purpose of this study was to examine the effects of teachers' individual characteristics and perceptions of environmental factors on technology integration in their EFL (English as a Foreign Language) classrooms. To this end, a national survey of EFL teachers' perceptions was conducted at Qatari Independent schools. 263 EFL teachers responded to the survey, which investigated several factors known to impact technology integration. These factors included technology availability and support, EFL teachers' perceptions of importance, obstacles facing technology integration, competency with technology use, and formal technology preparation. The impact of these factors on teachers' and students' educational technology use was further measured. The analysis of the data included descriptive statistics and a chi-square test to examine the relationships between these factors. The results revealed important cultural factors that impact teachers' practices and attitudes towards technology in the Qatari context. EFL teachers were found to integrate technology most prominently for instructional delivery and preparation, while the use of technology as a learning tool received less emphasis. Teachers further revealed consistent perceptions about obstacles to integration, high levels of confidence in using technology, and consistent beliefs about the importance of using technology as a learning tool. Further analysis of the factors impacting technology integration can assist Qatar's technology advancement and development efforts by indicating areas of strength and areas where additional efforts are needed. The results will lay the foundation for context-specific professional development suitable for the needs of EFL teachers in Qatari Independent Schools.
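
The chi-square analysis mentioned above tests whether two categorical factors are independent. A sketch with scipy, on an invented contingency table of perceived importance versus frequency of technology use:

```python
# Sketch of the chi-square test of independence used in the analysis, via
# scipy. The contingency table is invented for illustration only.
from scipy.stats import chi2_contingency

#                 use rarely  use sometimes  use often
table = [
    [20, 15, 5],    # teachers rating technology "low importance"
    [10, 40, 30],   # "medium importance"
    [5, 25, 60],    # "high importance"
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4g}")
```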

Keywords: educational technology integration, Qatar, EFL, independent schools, ICT

Procedia PDF Downloads 383
27117 Geological Structure Identification in the Semilir Formation: Correlating Geological and Geophysical (Very Low Frequency) Data for Disaster Zonation with Current Density Parameters and Geological Surface Information

Authors: E. M. Rifqi Wilda Pradana, Bagus Bayu Prabowo, Meida Riski Pujiyati, Efraim Maykhel Hagana Ginting, Virgiawan Arya Hangga Reksa

Abstract:

The VLF (Very Low Frequency) method is an electromagnetic method that uses low frequencies between 10 and 30 kHz, which results in fairly deep penetration. In this study, the VLF method was used for zonation of disaster-prone areas by identifying geological structures in the form of faults. Data acquisition was carried out in the Trimulyo Region, Jetis District, Bantul Regency, Special Region of Yogyakarta, Indonesia, with 8 measurement lines. The study uses wave transmitters in Japan and Australia to obtain Tilt and Elipt (ellipticity) values, which are used to create RAE (Rapat Arus Ekuivalen, or equivalent current density) sections that identify areas easily crossed by electric current. These sections indicate the existence of geological structures, in the form of faults, in the study area, characterized by high RAE values. Processing of the VLF data yields a Tilt vs. Elipt graph and a Moving Average (MA) Tilt vs. MA Elipt graph for each line, showing a fluctuating pattern with no intersections at all. Data processing uses Matlab software. Areas with low RAE values of 0%-6% indicate a medium with low conductivity and high resistivity, interpreted as sandstone, claystone, and tuff lithologies belonging to the Semilir Formation. High RAE values of 10%-16% indicate a medium with high conductivity and low resistivity, interpreted as a fault zone filled with fluid. The existence of the fault zone is supported by the discovery of a normal fault on the surface with strike N55°W and dip 63°E at coordinates X = 433256 and Y = 9127722, so that residents' activities in the zone, such as housing, mining, and other activities, can be avoided to reduce the risk of natural disasters.
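
The moving-average (MA) curves referred to above are plain smoothing of the tilt and ellipticity profiles along each line. A minimal sketch, with invented station spacing, window length, and data:

```python
# Sketch of the moving-average smoothing applied to the tilt profile along a
# measurement line before interpretation. Window length and data are invented.
import numpy as np

def moving_average(signal: np.ndarray, window: int = 5) -> np.ndarray:
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

station_m = np.arange(0, 200, 10)           # distance along the line, m
noise = np.random.default_rng(0).normal(0, 1, station_m.size)
tilt_pct = 4 * np.sin(station_m / 30.0) + noise   # synthetic tilt profile, %
print(np.round(moving_average(tilt_pct), 2))
```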

Keywords: current density, faults, very low frequency, zonation

Procedia PDF Downloads 175
27116 Impact of Newspaper Coverage of 2015 General Elections in Nigeria

Authors: Shola H. Adeosun, Lekan M. Togunwa, Kolawole Z. Amos

Abstract:

This paper appraises 'Newspaper Coverage of the 2015 General Election: A Study of The Punch and The Guardian Newspapers'. The objectives of the study were to examine how credible newspaper reports of the 2015 election were and to examine the significant role Nigerian newspapers played in the 2015 general elections. This study also examined the extent to which the print media contributed to the success of the 2015 general election and ascertained the extent to which print media reports served as a tool for sensitizing the masses. The research questions that guided this research include: How credible were newspaper reports of the 2015 general election? To what extent did the print media contribute to the success of the 2015 general elections? To what extent did print media reports serve as a tool for sensitizing the masses? The research work was given a solid theoretical foundation with a review of agenda-setting theory, media system dependency theory, and normative theories. The study was conducted using the content analysis method of research, and 30 publications each of The Guardian and The Punch between January 1st and March 30th, 2015 form the population for this research work. Selection of the dates and editions of the newspapers under study was done using the composite week sampling technique. All the days of the week were used because both newspapers (The Punch and The Guardian) are published every day of the week. A coding sheet was the data collection tool for the content analysis in this study. Findings of the study revealed that The Punch and The Guardian have played a significant role in eradicating election malpractice in Nigeria. The study therefore concludes that the media live up to the metaphor of the watchdog of the nation, as well as the mirror through which the nation sees and recognizes itself. The study also recommends that the Nigerian media should strike a balance among entertainment, crisis, economic, law, education, terrorism, health, sport, and metropolitan stories instead of portraying the country as crime-oriented.

Keywords: newspaper, coverage, general elections, impact

Procedia PDF Downloads 336
27115 Technology in the Calculation of People's Health Level: Design of a Computational Tool

Authors: Sara Herrero Jaén, José María Santamaría García, María Lourdes Jiménez Rodríguez, Jorge Luis Gómez González, Adriana Cercas Duque, Alexandra González Aguna

Abstract:

Background: The concept of health has evolved throughout history. The health level is determined by the individual's own perception and is a dynamic process over time, so variations can be seen from one moment to the next. In this way, knowing the health of the patients one cares for will facilitate decision making in the delivery of care. Objective: To design a technological tool that calculates people's health level sequentially over time. Material and Methods: Deductive methodology through text analysis, extraction and logical formalization of knowledge, and work with an expert group. Study period: September 2015 to the present. Results: A computational tool for the use of health personnel has been designed. It has 11 variables, and each variable can be given a value from 1 to 5, with 1 being the minimum value and 5 being the maximum value. By adding the results of the 11 variables, we obtain a magnitude at a given time: the person's health level. The health calculator makes it possible to represent a person's health level at a given moment, and establishing temporal cuts is useful for determining the individual's evolution over time. Conclusion: Information and Communication Technologies (ICT) enable training and assistance in various disciplinary areas, and their relevance in the field of health is worth highlighting. Based on the formalization of health, care acts can be directed towards some of the propositional elements of the concept above, and these care acts will modify the person's health level. The health calculator allows the prioritization and prediction of different health care strategies in hospital units.
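
The scoring rule is simple enough to state in code: eleven variables, each rated 1 to 5, summed into a health level between 11 and 55. The variable names in the sketch below are hypothetical, since the abstract does not list them.

```python
# Sketch of the scoring rule described above: eleven variables, each rated
# 1-5, summed into a health level between 11 and 55. The variable names are
# hypothetical; the abstract does not enumerate them.
VARIABLES = [
    "breathing", "nutrition", "elimination", "movement", "sleep",
    "dressing", "temperature", "hygiene", "safety", "communication", "beliefs",
]

def health_level(ratings: dict) -> int:
    assert set(ratings) == set(VARIABLES)
    assert all(1 <= v <= 5 for v in ratings.values())
    return sum(ratings.values())

snapshot = {name: 4 for name in VARIABLES}   # one temporal cut for one person
snapshot["sleep"] = 2
print(health_level(snapshot), "of", 5 * len(VARIABLES))
```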

Keywords: calculator, care, eHealth, health

Procedia PDF Downloads 264
27114 Identification of Conserved Domains and Motifs for GRF Gene Family

Authors: Jafar Ahmadi, Nafiseh Noormohammadi, Sedegeh Fabriki Ourang

Abstract:

GRF (growth-regulating factor) genes encode a novel class of plant-specific transcription factors. The GRF proteins play a role in the regulation of cell numbers in young and growing tissues and may act as transcription activators in the growth and development of plants. Identification of GRF genes and their expression is important for understanding the growth and development of various plant organs. In this study, to better understand the structural and functional differences within the GRF family, 45 GRF protein sequences from A. thaliana, Z. mays, O. sativa, B. napus, B. rapa, H. vulgare, and S. bicolor were collected and analyzed through bioinformatics data mining. As a result, in the secondary structure of the GRFs, alpha helices outnumbered beta sheets, and in all of them, the QLQ domains lay entirely within the largest alpha helix. In all GRFs, the QLQ and WRC domains were completely conserved, except in AtGRF9. These proteins have no transmembrane domain; because they carry nuclear localization signals, they act in the nucleus, and they are classified as unstable proteins in the test tube.
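
As a toy illustration of the domain and motif scanning used in such bioinformatics data mining, the sketch below runs simplified regular-expression stand-ins for the QLQ and WRC signatures over invented sequence fragments; real analyses use curated profile models rather than these patterns.

```python
# Toy sketch of scanning protein sequences for conserved regions. The regex
# patterns are simplified stand-ins for the QLQ and WRC domain signatures,
# and the sequences are invented fragments, not real GRF proteins.
import re

PATTERNS = {
    "QLQ-like": re.compile(r"Q.?LQ"),
    "WRC-like": re.compile(r"WRC"),
}

sequences = {
    "GRF_candidate_1": "MDLQLQGGSSAAWRCKKTSPLD",
    "GRF_candidate_2": "MSSPQNNGGGAKLPFT",
}

for name, seq in sequences.items():
    hits = {label: [m.start() for m in pat.finditer(seq)]
            for label, pat in PATTERNS.items()}
    print(name, hits)
```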

Keywords: domain, gene family, GRF, motif

Procedia PDF Downloads 457
27113 Smart in Performance: More to Practical Life than Hardware and Software

Authors: Faten Hatem

Abstract:

This paper promotes the importance of focusing on the spatial aspects and affective factors that impact smart urbanism. This helps to better inform city governance, spatial planning, and policymaking, which should focus on what smart does and what it can achieve for cities in terms of performance, rather than on using the notion for prestige in a worldwide trend towards becoming a smart city. By illustrating, through an interdisciplinary comparative approach, how this style of practice compromises the social aspects and related elements of space making, the paper clarifies the impact of this compromise on overall smart city performance. In response, this paper recognizes the importance of establishing a new meaning for urban progress by moving beyond improving the basic services of the city to enhancing the actual human experience, which is essential for the development of authentic smart cities. The topic is presented under five overlooked areas that discuss the relation between smart cities' potential and the efficiency paradox, the social aspect, connectedness with nature, the human factor, and untapped resources. However, these themes are not meant to be discussed in silos; instead, they are presented to collectively examine smart cities in performance, arguing that there is more to the practical life of smart cities than software and hardware inventions. The study is based on a case study approach, presenting Milton Keynes as a living example to learn from, while engaging with various methods of data collection, including multi-disciplinary semi-structured interviews, field observations, and data mining.

Keywords: smart design, the human in the city, human needs and urban planning, sustainability, smart cities, smart

Procedia PDF Downloads 99