Search results for: mining legacy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1222

262 Foreign Exchange Volatilities and Stock Prices: Evidence from London Stock Exchange

Authors: Mahdi Karazmodeh, Pooyan Jafari

Abstract:

One of the most interesting topics in finance is the relation between stock prices and exchange rates. Over the past decades, stock markets in different countries have been the subject of study for researchers, and the volatility of exchange rates and its effect on stock prices has continued to be an attractive research topic. The subject of this study is one of the most important indices, the FTSE 100. Twenty firms with the highest market capitalization in 5 different industries are chosen: oil and gas, mining, pharmaceuticals, banking, and food-related industries. Five different criteria are introduced to evaluate the relationship between stock markets and exchange rates; the return of the market portfolio and returns on a broad index of Sterling are also introduced. The results indicate that not all firms are sensitive to changes in exchange rates. Furthermore, a Granger causality test was run to observe the direction of changes between stock prices and foreign exchange rates. The results are consistent, to some degree, with previous studies; however, since the sample is not large, it is suggested that a larger number of firms be used in future work. After testing for Granger causality, this study found that in some industries (oil and gas, pharmaceuticals) changes in the foreign exchange rate do not cause changes in stock prices (or vice versa), whereas the banking sector showed more reaction to these changes. The results are similar to those of Richards and Noel, where a variety of firms in different industries were evaluated.

Keywords: stock prices, foreign exchange rate, exchange rate exposure, Granger Causality

Procedia PDF Downloads 433
261 Information Communication Technology Based Road Traffic Accidents’ Identification, and Related Smart Solution Utilizing Big Data

Authors: Ghulam Haider Haidaree, Nsenda Lukumwena

Abstract:

Today the world of research enjoys abundant data, available in virtually every field: technology, science, business, politics, etc. This is commonly referred to as big data. It offers a great deal of precision and accuracy, supporting an in-depth look at any decision-making process, and when well used, big data affords its users the opportunity to produce substantially well-supported results. This paper leans extensively on big data to investigate possible smart solutions to urban mobility and related issues, namely road traffic accidents, their casualties, and fatalities, based on multiple factors including age, gender, and the locations where accidents occur. Multiple technologies were used in combination to produce an Information Communication Technology (ICT) based solution with embedded technology, principally Geographic Information System (GIS) software, the Orange data mining toolkit, and Bayesian statistics, to name a few. The study uses the Leeds accident data of 2016 to illustrate the thinking process and extracts from it a model that can be tested, evaluated, and replicated. The authors optimistically believe that the proposed model will significantly and smartly help to flatten the curve of road traffic accidents in fast-growing population centres, where density considerably increases motor-based mobility.
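The kind of factor-based summary the abstract describes can be sketched with pandas; the rows and column names below are illustrative assumptions mimicking an accident record, not the official schema of the Leeds dataset:

```python
import pandas as pd

# Hypothetical rows in the style of a road-accident record; the column
# names are assumptions for illustration only.
accidents = pd.DataFrame({
    "age_band": ["18-25", "18-25", "26-35", "36-45", "18-25", "26-35"],
    "sex": ["M", "F", "M", "M", "F", "F"],
    "severity": ["Slight", "Serious", "Slight", "Fatal", "Slight", "Serious"],
})

# Cross-tabulate accident counts by age band and severity: the kind of
# summary that could feed a GIS layer or a Bayesian risk model downstream.
summary = pd.crosstab(accidents["age_band"], accidents["severity"])
print(summary)
```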

Keywords: accident factors, geographic information system, information communication technology, mobility

Procedia PDF Downloads 197
260 Mitigating Acid Mine Drainage Pollution: A Case Study In the Witwatersrand Area of South Africa

Authors: Elkington Sibusiso Mnguni

Abstract:

In South Africa, mining has been a key economic sector since the discovery of gold in 1886 in the Witwatersrand region, where the city of Johannesburg is located. However, some mines have since been decommissioned, and the continuous pumping of acid mine drainage (AMD) also stopped, causing the AMD to rise towards the ground surface. This posed a serious environmental risk to the groundwater resources and river systems in the region. This paper documents the development and extent of the environmental damage as well as the measures implemented by the government to alleviate such damage. The study adds to the body of knowledge on AMD treatment to prevent environmental degradation. The method used to gather and collate relevant data and information was a desktop study. The key findings include the social and environmental impacts of the AMD, among them the pollution of water sources for domestic use, leading to skin and other health problems, and the loss of biodiversity in some areas. It was also found that the technical intervention of constructing a plant to pump and treat the AMD using high-density sludge technology was the most effective short-term solution available while a long-term solution was being explored. Some successes and challenges experienced during the implementation of the project are also highlighted. The study is a useful record of the current status of AMD treatment interventions in the region.

Keywords: acid mine drainage, groundwater resources, pollution, river systems, technical intervention, high density sludge

Procedia PDF Downloads 173
259 Improved Classification Procedure for Imbalanced and Overlapped Situations

Authors: Hankyu Lee, Seoung Bum Kim

Abstract:

The issue of imbalance and overlap in the class distribution is important in various applications of data mining. An imbalanced dataset is a special case in classification problems in which the number of observations of one class (the majority class) heavily exceeds the number of observations of the other class (the minority class). An overlapped dataset is the case where many observations are shared between the two classes. Imbalanced and overlapped data can frequently be found in many real examples, including fraud and abuse detection in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. The combined class imbalance and overlap problem is challenging because it degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts: non-overlapping, lightly overlapping, and severely overlapping, and applying a classification algorithm in each part. These three parts are determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
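A minimal sketch of the splitting idea: the Hausdorff distance gauges how far apart the class clouds are, and the SVM decision value locates each point relative to the margin. The thresholds of 0.5 and 1.0 and the plain linear SVM are illustrative assumptions; the paper's modified SVM and actual cut-offs are not reproduced here:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Imbalanced, overlapping two-class data: 200 majority vs 20 minority.
X_maj = rng.normal(0.0, 1.0, (200, 2))
X_min = rng.normal(1.0, 1.0, (20, 2))
X = np.vstack([X_maj, X_min])
y = np.array([0] * 200 + [1] * 20)

# Symmetric Hausdorff distance between the two classes: a large value
# indicates the clouds are far apart overall (less overlap).
h = max(directed_hausdorff(X_maj, X_min)[0],
        directed_hausdorff(X_min, X_maj)[0])

# An SVM's decision values f(x) place each point relative to the margin;
# |f(x)| < 1 means the point lies inside the margin (the overlap region).
svm = SVC(kernel="linear", class_weight="balanced").fit(X, y)
f = svm.decision_function(X)
severe = np.abs(f) < 0.5                      # deep inside the margin
light = (np.abs(f) >= 0.5) & (np.abs(f) < 1.0)  # near the margin edge
clean = np.abs(f) >= 1.0                      # outside the margin
print(h, severe.sum(), light.sum(), clean.sum())
```

A separate classifier could then be trained on each of the three regions, which is the core of the procedure described above.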

Keywords: classification, imbalanced data with class overlap, split data space, support vector machine

Procedia PDF Downloads 294
258 Laboratory Scale Experimental Studies on CO₂ Based Underground Coal Gasification in Context of Clean Coal Technology

Authors: Geeta Kumari, Prabu Vairakannu

Abstract:

Coal is the most abundant fossil fuel. In India, around 37% of coal resources are found at depths of more than 300 meters, and more than 70% of electricity production depends on coal. Coal combustion produces greenhouse and pollutant gases such as CO₂, SOₓ, NOₓ, and H₂S. Underground coal gasification (UCG) is an efficient and economic in-situ clean coal technology, which converts these unmineable coals into valuable calorific gases. The UCG syngas (mainly H₂, CO, CH₄, and some lighter hydrocarbons) can be utilized for the production of electricity and the manufacture of various chemical feedstocks. It is an inherently clean coal technology, as it avoids ash disposal, mining, transportation, and storage problems. Gasifying underground coal using steam as the gasifying medium is difficult, because sending superheated steam to deep underground coal leads to major transportation difficulties and high costs. To reduce this problem, we used CO₂, a major greenhouse gas, as the gasifying medium. This paper focuses on a laboratory-scale underground coal gasification experiment on a coal block using CO₂ as the gasifying medium. In the present experiment, oxygen was first injected for combustion for 1 hour; when the temperature of the zones reached more than 1000 ºC, the supply of CO₂ as the gasifying medium was started. The gasification experiment was performed at atmospheric pressure of CO₂, and it was found that the amount of CO produced by the Boudouard reaction (C + CO₂ → 2CO) was around 35%. The experiment ran for almost 5 hours. The maximum gas composition observed was 35% CO, 22% H₂, and 11% CH₄, with an LHV of 248.1 kJ/mol at a CO₂/O₂ ratio of 0.4 by volume.
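The reported heating value can be roughly cross-checked from the gas composition using textbook molar lower heating values (CO ≈ 283.0, H₂ ≈ 241.8, CH₄ ≈ 802.3 kJ/mol); the estimate lands close to the reported 248.1 kJ/mol, with the gap plausibly due to the lighter hydrocarbons not itemized in the composition:

```python
# Rough cross-check of the reported syngas heating value. Component LHVs
# are standard textbook figures (kJ/mol), and any lighter hydrocarbons
# in the remaining gas fraction are ignored.
lhv = {"CO": 283.0, "H2": 241.8, "CH4": 802.3}
fractions = {"CO": 0.35, "H2": 0.22, "CH4": 0.11}  # mole fractions

# Mixture LHV is the mole-fraction-weighted sum of component LHVs.
mix_lhv = sum(fractions[g] * lhv[g] for g in fractions)
print(f"estimated mixture LHV = {mix_lhv:.1f} kJ/mol")
```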

Keywords: underground coal gasification, clean coal technology, calorific value, syngas

Procedia PDF Downloads 211
257 Spatial Information and Urbanizing Futures

Authors: Mohammad Talei, Neda Ranjbar Nosheri, Reza Kazemi Gorzadini

Abstract:

Today municipalities are searching for new tools to increase public participation at different levels of urban planning. This approach involves the community in the planning process through participatory methods instead of the traditional top-down planning methods, and such tools can be used to capture particular problems of urban furniture from the residents’ point of view. One tool designed with this goal is public participation GIS (PPGIS), which enables citizens to record and follow up their observations and spatial knowledge regarding the main problems of the city, specifically urban furniture, in the form of maps. However, despite the good intentions of PPGIS, its practical implementation in developing countries faces many problems, including the lack of basic supporting infrastructure and services and the unavailability of sophisticated public participatory models. In this research, we develop a PPGIS using Web 2.0 to collect volunteered geodata and to perform spatial analysis based on Spatial OnLine Analytical Processing (SOLAP) and Spatial Data Mining (SDM). These tools provide urban planners with proper information regarding the type, spatial distribution, and clusters of reported problems. The system is implemented in a case study area in Tehran, Iran, and the challenges of making it applicable, as well as its potential for real urban planning, have been evaluated. It helps decision makers to better understand, plan, and allocate scarce resources for providing the most requested urban furniture.
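One common spatial data mining step for citizen reports is density-based clustering, which turns scattered complaint points into hotspots. The sketch below uses DBSCAN on hypothetical report coordinates; the hotspot locations, the 100 m radius, and the minimum cluster size are illustrative assumptions, not the system's configuration:

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(2)
# Hypothetical citizen reports (x, y offsets in metres from a city datum):
# two dense hotspots of urban-furniture complaints plus scattered noise.
hotspot_a = rng.normal([0, 0], 30, (40, 2))
hotspot_b = rng.normal([500, 500], 30, (40, 2))
noise = rng.uniform(-1000, 1500, (10, 2))
points = np.vstack([hotspot_a, hotspot_b, noise])

# DBSCAN groups reports within ~100 m of each other into clusters;
# the label -1 marks isolated reports that belong to no hotspot.
labels = DBSCAN(eps=100, min_samples=5).fit_predict(points)
n_clusters = len(set(labels) - {-1})
print(n_clusters, "hotspots found")
```

Each resulting cluster could then be mapped as a polygon layer for planners, which matches the "clusters of reported problems" output described above.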

Keywords: PPGIS, spatial information, urbanizing futures, urban planning

Procedia PDF Downloads 709
256 Investigating Links in Achievement and Deprivation (ILiAD): A Case Study Approach to Community Differences

Authors: Ruth Leitch, Joanne Hughes

Abstract:

This paper presents the findings of a three-year government-funded study (ILiAD) that aimed to understand the reasons for differential educational achievement within and between socially and economically deprived areas in Northern Ireland. Previous international studies have concluded that there is a positive correlation between deprivation and underachievement. Our preliminary secondary data analysis suggested that the factors involved in educational achievement within multiple deprived areas may be more complex than this, with some areas of high multiple deprivation having high levels of student attainment, whereas other less deprived areas demonstrated much lower levels of student attainment, as measured by outcomes on high stakes national tests. The study proposed that no single explanation or disparate set of explanations could easily account for the linkage between levels of deprivation and patterns of educational achievement. Using a social capital perspective that centralizes the connections within and between individuals and social networks in a community as a valuable resource for educational achievement, the ILiAD study involved a multi-level case study analysis of seven community sites in Northern Ireland, selected on the basis of religious composition (housing areas are largely segregated by religious affiliation), measures of multiple deprivation and differentials in educational achievement. The case study approach involved three (interconnecting) levels of qualitative data collection and analysis - what we have termed Micro (or community/grassroots level) understandings, Meso (or school level) explanations and Macro (or policy/structural) factors. The analysis combines a statistical mapping of factors with qualitative, in-depth data interpretation which, together, allow for deeper understandings of the dynamics and contributory factors within and between the case study sites. 
Thematic analysis of the qualitative data reveals cross-cutting factors (e.g. demographic shifts and loss of community, the place of the school in the community, parental capacity), while analytic case studies of the explanatory factors associated with each community site also permit a comparative element. Issues arising from the qualitative analysis are classified either as drivers or inhibitors of educational achievement within and between communities. Key issues emerging as inhibitors/drivers of attainment include: the legacy of the community conflict in Northern Ireland, not least in terms of inter-generational stress linked to substance abuse and mental health issues; differing discourses on notions of ‘community’ and ‘achievement’ within/between community sites; inter-agency and intra-agency levels of collaboration and joined-up working; the relationship between the home/school/community triad; and school leadership and school ethos. At this stage, the balance of these factors can be conceptualized in terms of bonding social capital (or lack of it) within families, within schools, within each community, and within agencies, and also bridging social capital between the home/school/community, between different communities, and between key statutory and voluntary organisations. The presentation will outline the study rationale and methodology, present some cross-cutting findings, and use an illustrative case study of the findings from a community site to underscore the importance of attending to community differences when trying to engage in research to understand and improve educational attainment for all.

Keywords: educational achievement, multiple deprivation, community case studies, social capital

Procedia PDF Downloads 364
255 Educational Leadership and Artificial Intelligence

Authors: Sultan Ghaleb Aldaihani

Abstract:

1. The Changing Environment of Educational Leadership:
- The environment in which educational leadership takes place is becoming increasingly complex due to factors like globalization and rapid technological change.
- This is creating a "leadership gap" in which the complexity of the environment outpaces the ability of leaders to respond effectively.
- Educational leadership involves guiding teachers and the broader school system towards improved student learning and achievement.
2. Implications of Artificial Intelligence (AI) in Educational Leadership:
- AI has great potential to enhance education, such as through intelligent tutoring systems and by automating routine tasks to free up teachers.
- AI can also have significant implications for educational leadership by providing better information and data-driven decision-making capabilities.
- Computer-adaptive testing can provide detailed, individualized data on student learning that leaders can use for instructional decisions and accountability.
3. Enhancing Decision-Making Processes:
- Statistical models and data mining techniques can help identify at-risk students earlier, allowing for targeted interventions.
- Probability-based models can diagnose students likely to drop out, enabling proactive support.
- These data-driven approaches can make resource allocation and decision-making more effective.
4. Improving Efficiency and Productivity:
- AI systems can automate tasks and change processes to improve the efficiency of educational leadership and administration.
- Integrating AI can free up leaders to focus more on the human, interactive elements of their role.
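The probability-based dropout models mentioned above can be sketched with a simple logistic regression; the features (attendance rate, mean test score) and the synthetic data-generating rule are illustrative assumptions, not the abstract's own model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
# Hypothetical student records: attendance rate in [0.4, 1.0] and a mean
# test score in [30, 100], with a dropout flag generated so that low
# attendance and low scores raise the risk.
n = 300
attendance = rng.uniform(0.4, 1.0, n)
score = rng.uniform(30, 100, n)
risk = 4.0 * (0.8 - attendance) + 0.04 * (60 - score)
dropout = (risk + rng.normal(0, 0.5, n) > 0).astype(int)

X = np.column_stack([attendance, score])
model = LogisticRegression(max_iter=1000).fit(X, dropout)

# A leader can rank students by predicted dropout probability and target
# interventions at the highest-risk group.
p = model.predict_proba([[0.5, 45.0]])[0, 1]
print(f"predicted dropout probability: {p:.2f}")
```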

Keywords: education, leadership, technology, artificial intelligence

Procedia PDF Downloads 13
254 Design Thinking and Project-Based Learning: Opportunities, Challenges, and Possibilities

Authors: Shoba Rathilal

Abstract:

High unemployment rates alongside a shortage of experienced and qualified employees appear to be a paradox that currently plagues most countries worldwide. In a developing country like South Africa, the unemployment rate is reported to be approximately 35%, the highest recorded globally. At the same time, a countrywide deficit of experienced and qualified potential employees is reported in South Africa, causing fierce rivalry among firms. Employers report that graduates are rarely able to meet the demands of the job, as there are gaps in their knowledge, conceptual understanding, and the other 21st-century competencies, attributes, and dispositions required to successfully negotiate the multiple responsibilities of employees in organizations. In addition, the rates of unemployment and suitability of graduates appear to be skewed by race and social class, the continued effects of a legacy of inequitable educational access. Higher education in the current technologically advanced and dynamic world needs to serve as an agent of transformation, aspiring to develop graduates who are creative, flexible, critical, and possessed of entrepreneurial acumen. This requires a re-envisioning of the selection, sequencing, and pacing of learning, teaching, and assessment in higher education curricula and pedagogy. At a particular higher education institution in South Africa, Design Thinking (DT) and Project-Based Learning (PBL) are being adopted as two approaches that aim to enhance the student experience through the provision of a “distinctive education” that brings together disciplinary knowledge, professional engagement, technology, innovation, and entrepreneurship. Using these methodologies requires students to solve real-world applied problems using various forms of knowledge and to find innovative solutions that can result in new products and services.
The intention is to promote the development of skills for self-directed learning, facilitate the development of self-awareness, and contribute to students being active partners in the application and production of knowledge. These approaches emphasize active and collaborative learning, teamwork, conflict resolution, and problem-solving through effective integration of theory and practice. In principle, both approaches are extremely impactful. However, at the institution in this study, the implementation of PBL and DT was not as “smooth” as anticipated. This presentation reports on an analysis of the implementation of these two approaches within higher education curricula at a particular university in South Africa. The study adopts a qualitative case study design. Data were generated through surveys, evaluation feedback at workshops, and project reports, and were analyzed using document, content, and thematic analysis. Initial analysis shows that the forces constraining the implementation of PBL and DT range from the capacity of both staff and students to engage with them, to the contextual realities of higher education institutions, administrative processes, and resources. At the same time, implementation was enabled through the allocation of strategic funding and capacity development workshops; these factors, however, could not achieve maximum impact. The presentation will also include recommendations on how DT and PBL could be adapted for differing contexts.

Keywords: design thinking, project based learning, innovative higher education pedagogy, student and staff capacity development

Procedia PDF Downloads 60
253 The “Bright Side” of COVID-19: Effects of Livestream Affordances on Consumer Purchase Willingness: Explicit IT Affordances Perspective

Authors: Isaac Owusu Asante, Yushi Jiang, Hailin Tao

Abstract:

Live streaming marketing, the new electronic commerce element, became an optional marketing channel following the COVID-19 pandemic, and many sellers have leveraged the features it presents to increase sales. Studies on live streaming have focused on gaming and on consumers’ loyalty to brands through live streaming, using interviews and questionnaires. This study, however, was conducted to measure real-time, observable interactions between consumers and sellers. Based on affordance theory, it conceptualized constructs representing the interactive features and examined how they drive consumers’ purchase willingness during live streaming sessions, using 1238 observations from Amazon Live collected through manual observation of transaction records. Ordinary least squares regression within a structural equation modeling framework suggests that live viewers, new followers, live chats, and likes positively affect purchase willingness. Sobel and Monte Carlo tests show that new followers, live chats, and likes significantly mediate the relationship between live viewers and purchase willingness. The study introduces a new way of measuring interactions in live streaming commerce and proposes a way to gather data manually on consumer behavior in live streaming platforms when the application programming interface (API) of such platforms does not support data mining algorithms.

Keywords: livestreaming marketing, live chats, live viewers, likes, new followers, purchase willingness

Procedia PDF Downloads 59
252 A Supervised Approach for Detection of Singleton Spam Reviews

Authors: Atefeh Heydari, Mohammadali Tavakoli, Naomie Salim

Abstract:

In recent years, online reviews have become the most important source of customers’ opinions, and they are used increasingly by individuals and organisations to make purchase and business decisions. Unfortunately, for profit or fame, fraudsters produce deceptive reviews to hoodwink potential customers. Their activities mislead not only potential customers, who rely on reviews to make purchasing decisions, and organisations, which use them to reshape their business, but also opinion mining techniques, by preventing them from reaching accurate results. Spam reviews can be divided into two main groups: multiple and singleton spam reviews. Detecting a singleton spam review, i.e. the only review written under a user ID, is extremely challenging due to the lack of clues for detection purposes. Singleton spam reviews are very harmful, and the various features and evidence used to detect multiple spam reviews are not applicable in this case. The current research aims to propose a novel supervised technique to detect singleton spam reviews. To achieve this, various features are proposed in this study and combined with the most appropriate features extracted from the literature for use in a classifier. To compare the performance of different classifiers, SVM and naive Bayes classification algorithms were used for model building. The results revealed that SVM was more accurate than naive Bayes and that the proposed technique is capable of detecting singleton spam reviews effectively.
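The SVM-versus-naive-Bayes comparison can be sketched on a toy review set; the texts and labels below are invented for illustration, and the paper's actual features for singleton spam go beyond plain text (e.g. reviewer-level signals):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC

# Toy labelled reviews (1 = spam, 0 = genuine), invented for this sketch.
reviews = [
    "best product ever buy now amazing deal",
    "five stars perfect perfect perfect must buy",
    "the zipper broke after two weeks, disappointing",
    "decent battery life but the screen scratches easily",
    "unbelievable discount click here limited offer",
    "solid build quality, arrived on time, works as described",
]
labels = [1, 1, 0, 0, 1, 0]

# TF-IDF text features feed both classifiers.
vec = TfidfVectorizer()
X = vec.fit_transform(reviews)

svm = LinearSVC().fit(X, labels)
nb = MultinomialNB().fit(X, labels)

# Score an unseen review with both models.
test = vec.transform(["amazing deal must buy now"])
print("SVM:", svm.predict(test)[0], "NB:", nb.predict(test)[0])
```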

Keywords: classification algorithms, Naïve Bayes, opinion review spam detection, singleton review spam detection, support vector machine

Procedia PDF Downloads 288
251 Developing a Place-Name Gazetteer for Singapore by Mining Historical Planning Archives and Selective Crowd-Sourcing

Authors: Kevin F. Hsu, Alvin Chua, Sarah X. Lin

Abstract:

In Singapore’s multilingual society, names for different parts of the city have changed over time. Residents have included Indigenous Malays, dialect-speakers from China, European settler-colonists, and Tamil-speakers from South India, and each group named locations in its own language. Today, as ancestral tongues are increasingly supplanted by English, contemporary Singaporeans’ understanding of once-common place names is disappearing. After demolition or redevelopment, some urban places will exist only in archival records or in human memory. United Nations conferences on the standardization of geographical names have called attention to how place names relate to identity, well-being, and a sense of belonging. The Singapore Place-Naming Project responds to these imperatives by capturing past and present place names through digitizing historical maps, mining archival records, and applying selective crowd-sourcing to trace the evolution of place names throughout the city. The project ensures that both formal and vernacular geographical names remain accessible to historians, city planners, and the public. The project is compiling a gazetteer, a geospatial archive of place names, covering streets, buildings, landmarks, and other points of interest (POI) appearing in the historic maps and planning documents of Singapore, currently held by the National Archives of Singapore, the National Library Board, university departments, and the Urban Redevelopment Authority. To create a spatial layer of information, the project links each place name to a geo-referenced point, line segment, or polygon, along with the original source material in which the name appears.
This record is supplemented by crowd-sourced contributions from civil service officers and heritage specialists, drawing on their collective memory to (1) define the geospatial boundaries of historic places that appear in past documents but may be unfamiliar to users today, and (2) identify and record vernacular place names not captured in formal planning documents. An intuitive interface allows participants to demarcate feature classes, vernacular phrasings, time periods, and other knowledge related to historical or forgotten spaces. Participants are stratified by age band and ethnicity to improve representativeness, and future iterations could allow additional public contributions. Names reveal the meanings that communities assign to each place. While existing historical maps of Singapore allow users to toggle between present-day and historical raster files, this project goes a step further by adding layers of social understanding and planning documents. Tracking place names illuminates linguistic, cultural, commercial, and demographic shifts in Singapore, in the context of transformations of the urban environment. The project also demonstrates how a moderated, selectively crowd-sourced effort can solicit useful geospatial data at scale, sourced from different generations and at higher granularity than traditional surveys, while mitigating the negative impacts of unmoderated crowd-sourcing. Stakeholder agencies believe the project will achieve several objectives: supporting heritage conservation and public education; safeguarding intangible cultural heritage; providing historical context for street, place, or development-renaming requests; enhancing place-making with deeper historical knowledge; facilitating emergency and social services by tagging legal addresses to vernacular place names; and encouraging public engagement with heritage by eliciting multi-stakeholder input.
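A gazetteer entry that links a name, its variants, a geometry, and its source can be sketched as a GeoJSON feature; the attribute names (variants, language, source, period) and the example coordinates are illustrative assumptions about the project's schema, not its published specification:

```python
import json

# A minimal gazetteer entry in GeoJSON form; the schema below is an
# illustrative assumption, not the project's published format.
entry = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [103.8480, 1.2806]},
    "properties": {
        "name": "Telok Ayer",
        "variants": [
            {"name": "直落亚逸", "language": "zh"},
            {"name": "Teluk Ayer", "language": "ms"},
        ],
        "source": "historical street map (crowd-sourced annotation)",
        "period": "19th century to present",
    },
}

# GeoJSON serializes directly, so each entry can be stored or served
# as a spatial layer alongside its source citation.
print(json.dumps(entry, ensure_ascii=False)[:80])
```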

Keywords: collective memory, crowd-sourced, digital heritage, geospatial, geographical names, linguistic heritage, place-naming, Singapore, Southeast Asia

Procedia PDF Downloads 112
250 Characterization of Aluminosilicates and Verification of Their Impact on Quality of Ceramic Proppants Intended for Shale Gas Output

Authors: Joanna Szymanska, Paulina Wawulska-Marek, Jaroslaw Mizera

Abstract:

Nowadays, the rapid growth of global energy consumption and the uncontrolled depletion of natural resources have become a serious problem. Shale rocks are among the largest potential global reservoirs of hydrocarbons, which are trapped in the closed pores of the shale matrix. Regardless of the shales’ origin, mining conditions are extremely unfavourable due to high reservoir pressure, great depth, increased clay mineral content, and the limited permeability (nanodarcy) of the rocks. Given such geomechanical barriers, effective extraction of natural gas from shales with plastic zones demands effective operations. Currently, hydraulic fracturing is the most developed technique, based on the injection of pressurized fluid into a wellbore to initiate fracture propagation. However, a rapid drop in pressure after fluid suction to the ground induces fracture closure and a reduction in conductivity. To minimize this risk, proppants are applied: solid granules transported with the hydraulic fluids to lodge inside the rock. Proppants act as a prop for the closing fracture, so that gas migration to the borehole remains effective. Quartz sands are commonly applied as proppants only in shallow deposits (USA), whereas ceramic proppants are designed to meet rigorous downhole conditions and intensify output. Ceramic granules offer higher mechanical strength, stability in strongly acidic environments, spherical shape, and homogeneity. The quality of ceramic proppants is conditioned by the selection of raw materials. The aim of this study was to obtain proppants from aluminosilicates (the kaolinite subgroup) and mixes of minerals with a high alumina content. These loamy minerals exhibit a tubular and platy morphology that improves mechanical properties and reduces their specific weight.
Moreover, they are distinguished by a well-developed surface area, high porosity, fine particle size, superb dispersion, and nontoxic properties, all crucial for consolidating particles into spherical, crush-resistant granules in the mechanical granulation process. The aluminosilicates were mixed with water and a natural organic binder to improve the formation of liquid bridges and pores between particles. Afterward, the green proppants were sintered at high temperatures. Evaluation of the minerals’ utility was based on their particle size distribution (laser diffraction) and thermal stability (thermogravimetry). Scanning Electron Microscopy was used for morphology and shape identification, combined with specific surface area measurement (BET). Chemical composition was verified by Energy Dispersive Spectroscopy and X-ray Fluorescence, and bulk density and specific weight were measured. This comprehensive characterization of the loamy materials confirmed their favourable impact on proppant granulation. The sintered granules were analyzed by SEM to examine surface topography and phase transitions after sintering. Pore distribution was identified by X-ray tomography, a method that also enabled simulation of proppant settlement in a fracture, while measurement of bulk density was essential to predict the amount of material needed to fill a well. The roundness coefficient was also evaluated, and the impact on the mining environment was assessed through turbidity and solubility in acid, to indicate the risk of material decay in a well. The obtained outcomes confirmed a positive influence of the loamy minerals on the properties of ceramic proppants with respect to the strict norms. This research holds promise for producing higher-quality proppants at reduced cost.

Keywords: aluminosilicates, ceramic proppants, mechanical granulation, shale gas

Procedia PDF Downloads 152
249 Comparative Evaluation of Accuracy of Selected Machine Learning Classification Techniques for Diagnosis of Cancer: A Data Mining Approach

Authors: Rajvir Kaur, Jeewani Anupama Ginige

Abstract:

With recent trends in Big Data and advancements in Information and Communication Technologies, the healthcare industry is transitioning from being clinician-oriented to technology-oriented. Many people around the world die of cancer because the disease was not diagnosed at an early stage. Nowadays, computational methods in the form of Machine Learning (ML) are used to develop automated decision support systems that can diagnose cancer with high confidence in a timely manner. This paper carries out a comparative evaluation of a selected set of ML classifiers on two existing datasets: breast cancer and cervical cancer. The ML classifiers compared in this study are Decision Tree (DT), Support Vector Machine (SVM), k-Nearest Neighbor (k-NN), Logistic Regression, Ensemble (Bagged Tree) and Artificial Neural Networks (ANN). The evaluation is based on the standard metrics Precision (P), Recall (R), F1-score and Accuracy. The experimental results show that ANN achieved the highest accuracy (99.4%) when tested on the breast cancer dataset. On the cervical cancer dataset, the Ensemble (Bagged Tree) technique gave better accuracy (93.1%) than the other classifiers.
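To make the evaluation protocol concrete, here is a minimal stdlib-only sketch of one of the compared classifiers (k-NN) together with the four reported metrics; the helper names and toy data are illustrative, not the authors' code or datasets:

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    nearest = sorted(range(len(train_X)),
                     key=lambda i: math.dist(train_X[i], x))[:k]
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]

def precision_recall_f1_accuracy(y_true, y_pred, positive=1):
    """Compute the four metrics used in the paper for a binary task."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    return precision, recall, f1, accuracy
```

In practice each classifier would be trained and scored the same way, so the metrics, not the model internals, drive the comparison.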

Keywords: artificial neural networks, breast cancer, classifiers, cervical cancer, f-score, machine learning, precision, recall

Procedia PDF Downloads 263
248 Pattern Discovery from Student Feedback: Identifying Factors to Improve Student Emotions in Learning

Authors: Angelina A. Tzacheva, Jaishree Ranganathan

Abstract:

Interest in Science, Technology, Engineering, and Mathematics (STEM) education, especially Computer Science education, has seen a drastic increase across the country. This fuels efforts toward recruiting and admitting a diverse population of students. The changing conditions in terms of student population, diversity and expected teaching and learning outcomes thus provide a platform for the use of innovative teaching models and technologies. The methods adopted should also concentrate on raising the quality of such innovations and on having a positive impact on student learning. The Light-Weight Team is an active learning pedagogy, considered a low-stakes activity with little or no direct impact on student grades. Emotion plays a major role in students' motivation to learn. In this work, we use student feedback data, collected through surveys at a public research institution in the United States, with emotion classification. We use an Actionable Pattern Discovery method for this purpose. Actionable patterns are patterns that provide suggestions, in the form of rules, to help the user achieve better outcomes. The proposed method provides meaningful insight into changes that can be incorporated in the Light-Weight Team activities and the resources utilized in the course. The results suggest how to shift student emotions to a more positive state, with a particular focus on the emotions 'Trust' and 'Joy'.

Keywords: actionable pattern discovery, education, emotion, data mining

Procedia PDF Downloads 78
247 Health and Safety Risk Assessment of Electromagnetic Field Exposure for Call Center Workers

Authors: Dilsad Akal

Abstract:

Aim: Companies communicate with each other and with their customers via call centers. Call centers are considered stressful because of their uncertain working hours, inadequate relief time, performance-based systems and heavy workload. In the literature, this sector is regarded as being as risky as the mining sector in terms of health and safety. The aim of this research is to shed light on this relatively under-studied area. Subject and Methods: Data collection for this study was completed during April-May 2015 at two selected call centers in different parts of Turkey. The questionnaire applied mostly investigated the health conditions of call center workers. Electromagnetic field measurements were completed at the same time the questionnaire was administered. The employee response rate was 73% for the first call center and 87% for the second. Results: The electromagnetic field measurements ranged between 32 V/m and 371 V/m for the first location and between 61 V/m and 370 V/m for the second. The most common complaints of employees at both workplaces were inadequate relief time, inadequate air conditioning, disturbance, poor thermal conditions, and inadequate or extreme lighting. Furthermore, musculoskeletal discomfort, stress, and ear and eye discomfort were the main health problems of employees. Conclusion: The measured values and the questionnaire responses were consistent with similar research results in the literature. At the end of this survey, a risk map of the workplace was prepared in terms of safety and health at work, and some suggestions for remediation were provided.

Keywords: call center, health and safety, electromagnetic field, risk map

Procedia PDF Downloads 167
246 Design and Fabrication of a Smart Quadruped Robot

Authors: Shivani Verma, Amit Agrawal, Pankaj Kumar Meena, Ashish B. Deoghare

Abstract:

Over the past decade, robotics has been a major area of interest among researchers and scientists seeking to reduce human effort. The need for robots to replace human work in dangerous fields such as underground mining, nuclear power stations and counter-terrorism operations has gained huge attention. Many robot designs are based on the human structure and are popularly known as humanoid robots. However, the problems encountered with humanoid robots include low speed of movement, structural imbalance, poor load-carrying capacity, etc. The simplification and adaptation of fundamental design principles seen in animals have led to the creation of bio-inspired robots, but the major challenges of naturally inspired robots include structural complexity, many degrees of freedom and energy storage. The present work focuses on the design and fabrication of a bionic quadruped walking robot based on the joints of quadruped mammals such as dogs and cheetahs. The design focuses on the structure of the robot body, which consists of four legs with three degrees of freedom per leg, and on the electronics system involved. The robot is built using readily available plastics and metals. The proposed robot is simple in construction and is able to move through uneven terrain, detect and locate obstacles, and take images while carrying additional loads, which may include hardware and sensors. The robot will find possible applications in the artificial intelligence sector.

Keywords: artificial intelligence, bionic, quadruped robot, degree of freedom

Procedia PDF Downloads 200
245 Identification of Workplace Hazards of Underground Coal Mines

Authors: Madiha Ijaz, Muhammad Akram, Sima Mir

Abstract:

Underground mining of coal is carried out manually in Pakistan. Exposure to ergonomic hazards (musculoskeletal disorders) is very common among the coal cutters in these mines. Cutting coal in narrow spaces poses a great threat to both the upper and lower limbs of these workers. To observe the prevalence of such hazards, a thorough study was conducted on 600 workers from 30 mines (20 workers per mine) located in two districts of Punjab province, Pakistan. The Rapid Upper Limb Assessment (RULA) and Rapid Entire Body Assessment (REBA) sheets were used for the study, along with the standard Nordic Musculoskeletal disorder questionnaire. SPSS 25 software was used for data analysis of upper and lower limb disorders, and regression models were run for upper and lower back pain. According to the results, work stage (drilling and blasting, coal cutting, timbering and supporting, etc.), work experience and number of repetitions performed per minute were significant predictors of discomfort in the upper and lower limbs (p-values 0.00, 0.004 and 0.009, respectively). Age had a p-value of 0.00 for upper limb and 0.012 for lower limb disorders. The task of coal cutting was strongly associated with pain in the upper back (odds ratio 13.21, 95% confidence interval (CI) 14.0-21.64) and lower back (odds ratio 3.7, 95% CI 1.3-4.2). As scored on the RULA and REBA sheets, every work stage was ranked at 7, the highest level of risk. Workers were young (mean age 28.7 years), with a mean BMI of 28.1 kg/m².
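The odds ratios and confidence intervals reported above follow the standard 2x2-table form; a small illustrative sketch of how an odds ratio and its Wald confidence interval are computed (the counts below are made up, not the study's data):

```python
import math

def odds_ratio_ci(exposed_cases, exposed_controls,
                  unexposed_cases, unexposed_controls, z=1.96):
    """Odds ratio with a Wald (Woolf) 95% confidence interval
    from a 2x2 exposure/outcome table."""
    a, b = exposed_cases, exposed_controls
    c, d = unexposed_cases, unexposed_controls
    odds_ratio = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of the summed reciprocal counts.
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(odds_ratio) - z * se)
    hi = math.exp(math.log(odds_ratio) + z * se)
    return odds_ratio, (lo, hi)
```

With hypothetical counts of 30/10 exposed cases/controls and 10/30 unexposed, this yields an odds ratio of 9.0 with a CI straddling it.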

Keywords: workplace hazards, ergonomic disorders, limb disorders, MSDs

Procedia PDF Downloads 68
244 Deep-Learning Coupled with Pragmatic Categorization Method to Classify the Urban Environment of the Developing World

Authors: Qianwei Cheng, A. K. M. Mahbubur Rahman, Anis Sarker, Abu Bakar Siddik Nayem, Ovi Paul, Amin Ahsan Ali, M. Ashraful Amin, Ryosuke Shibasaki, Moinul Zaber

Abstract:

Thomas Friedman, in his famous book, argued that the world in the 21st century is flat and will continue to flatten. This is attributed to rapid globalization and the interdependence of humanity, which have engendered a tremendous in-flow of human migration toward urban spaces. In order to keep the urban environment sustainable, policy makers need to plan based on extensive analysis of that environment. With the advent of high-definition satellite images, high-resolution data, computational methods such as deep neural network analysis, and hardware capable of high-speed analysis, urban planning is seeing a paradigm shift. Legacy data on urban environments are now being complemented with high-volume, high-frequency data. However, the first step in understanding urban space lies in a useful categorization of the space that is usable for data collection, analysis, and visualization. In this paper, we propose a pragmatic categorization method that is readily usable for machine analysis and show the applicability of the methodology in a developing world setting. Categorization meant to support planning sustainable urban spaces should encompass buildings and their surroundings. However, the state of the art is mostly dominated by classification of building structures, building types, etc., and largely represents the developed world. Hence, these methods and models are not sufficient for developing countries such as Bangladesh, where the surrounding environment is crucial for the categorization. Moreover, existing categorizations propose small-scale classifications, which give limited information, have poor scalability and are slow to compute in real time. Our proposed method is divided into two steps: categorization and automation. We categorize the urban area in terms of informal and formal spaces, taking the surrounding environment into account. A 50 km × 50 km Google Earth image of Dhaka, Bangladesh was visually annotated and categorized by an expert, and a map was drawn accordingly.
The categorization is based broadly on two dimensions: the state of urbanization and the architectural form of the urban environment. Consequently, the urban space is divided into four categories: 1) highly informal area; 2) moderately informal area; 3) moderately formal area; and 4) highly formal area. In total, sixteen sub-categories were identified. For semantic segmentation and automatic categorization, Google's DeepLabV3+ model was used. The model uses the atrous convolution operation to analyze different layers of texture and shape, which enlarges the field of view of the filters to incorporate larger context. Imagery encompassing 70% of the urban space was used to train the model, and the remaining 30% was used for testing and validation. The model is able to segment with 75% accuracy and 60% Mean Intersection over Union (mIoU). The proposed categorization method is readily applicable for automatic use in both developing and developed world contexts, and it can be augmented for real-time socio-economic comparative analysis among cities. It can be an essential tool for policy makers planning future sustainable urban spaces.
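The atrous convolution mentioned above spaces the kernel taps apart so the receptive field grows without adding parameters. A minimal 1-D pure-Python sketch of the operation (the real model applies it in 2-D inside a deep CNN; the function name and values here are illustrative):

```python
def dilated_conv1d(signal, kernel, dilation=1):
    """Sliding dot product with 'holes': kernel taps are spaced
    `dilation` samples apart, so a length-3 kernel with dilation 2
    covers 5 input samples instead of 3."""
    span = (len(kernel) - 1) * dilation  # receptive field minus one
    out = []
    for start in range(len(signal) - span):
        out.append(sum(kernel[j] * signal[start + j * dilation]
                       for j in range(len(kernel))))
    return out
```

A difference kernel [1, 0, -1] with dilation 2 compares samples four positions apart, illustrating the enlarged context window that DeepLabV3+ exploits at several dilation rates in parallel.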

Keywords: semantic segmentation, urban environment, deep learning, urban building, classification

Procedia PDF Downloads 165
243 Applying Theory of Inventive Problem Solving to Develop Innovative Solutions: A Case Study

Authors: Y. H. Wang, C. C. Hsieh

Abstract:

Good service design can increase organizational revenue and consumer satisfaction while reducing labor and time costs. The problems facing consumers in the original service model for the eyewear and optical industry include the following: 1. insufficient information on eyewear products; 2. passive dependence on recommendations and insufficient selection; 3. incomplete records on the progression of vision conditions; 4. lack of complete customer records. This study investigates the case of Kobayashi Optical, applying the Theory of Inventive Problem Solving (TRIZ) to develop innovative solutions for the eyewear and optical industry. The analysis yields the following conclusions and management implications: in order to provide customers with better professional information and recommendations, Kobayashi Optical should establish customer purchasing records. Overall service efficiency can be enhanced by applying data mining techniques to analyze past consumer preferences and purchase histories. Furthermore, Kobayashi Optical should continue to develop a 3D virtual trial service that allows customers to easily browse different frame styles and colors. This 3D virtual trial service will save customers waiting time during peak service hours at stores.

Keywords: theory of inventive problem solving (TRIZ), service design, augmented reality (AR), eyewear and optical industry

Procedia PDF Downloads 269
242 Determination of Safe Ore Extraction Methodology beneath Permanent Extraction in a Lead Zinc Mine with the Help of FLAC3D Numerical Model

Authors: Ayan Giri, Lukaranjan Phukan, Shantanu Karmakar

Abstract:

Structure and tectonics play a vital role in ore genesis and deposition. The existence of a swelled structure below the current level of a mine led to the discovery of ore below some permanent developments of the mine. The discovery and extraction of this ore body are critical to sustaining the business requirements of the mine. The challenge was to extract the ore without hampering the global stability of the mine. To do so, different mining options were considered and analysed by numerical modelling in FLAC3D software. The constitutive model prepared for this simulation is the Improved Unified Constitutive Model (IUCM), which more accurately predicts stress-strain relationships in a continuum model. The IUCM employs the Hoek-Brown criterion to determine the instantaneous Mohr-Coulomb parameters cohesion (c) and friction (ɸ) at each level of confining stress. The extra swelled part can be dimensioned as 50 m north-south strike width and 50 m east-west strike width. On the north side, a stope (P1) of 25 m NS width has already been excavated. The different options considered were (a) open stoping extraction of the southern part (P0) of 50 m to its full extent; (b) extraction of the southern part in 25 m, then filling of both primaries and extraction of the secondary (S0) of 25 m in between; (c) complete extraction of the southern part (P0), preceded by backfill, with a modified design of the secondary (S0) for the overall stability of the permanent excavation above the stoping.

Keywords: extraction, IUCM, FLAC 3D, stoping, tectonics

Procedia PDF Downloads 202
241 Using Geospatial Analysis to Reconstruct the Thunderstorm Climatology for the Washington DC Metropolitan Region

Authors: Mace Bentley, Zhuojun Duan, Tobias Gerken, Dudley Bonsal, Henry Way, Endre Szakal, Mia Pham, Hunter Donaldson, Chelsea Lang, Hayden Abbott, Leah Wilcynzski

Abstract:

Air pollution has the potential to modify the lifespan and intensity of thunderstorms and the properties of lightning. Using data mining and geovisualization, we investigate how background climate and weather conditions shape variability in urban air pollution and how this, in turn, shapes thunderstorms as measured by the intensity, distribution, and frequency of cloud-to-ground lightning. A spatiotemporal analysis was conducted to identify thunderstorms using high-resolution lightning detection network data. Over seven million lightning flashes were used to identify more than 196,000 thunderstorms that occurred between 2006 and 2020 in the Washington, DC Metropolitan Region. Each lightning flash in the dataset was grouped into thunderstorm events by means of a temporal and spatial clustering algorithm. Once the thunderstorm event database was constructed, hourly wind direction, wind speed, and atmospheric thermodynamic data were added to the initiation and dissipation times and locations of the 196,000 identified thunderstorms. Hourly aerosol and air quality data for the thunderstorm initiation times and locations were also incorporated into the dataset. Developing thunderstorm climatologies using a lightning tracking algorithm and lightning detection network data proved useful for visualizing the spatial and temporal distribution of urban-augmented thunderstorms in the region.
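The abstract does not detail the temporal and spatial clustering algorithm; one common greedy approach to grouping flashes into storm events can be sketched as follows (the distance and time thresholds, and the flat x/y coordinates, are illustrative assumptions, not the authors' parameters):

```python
import math

def cluster_flashes(flashes, max_gap_s=900, max_dist_km=15.0):
    """Group lightning flashes (t_seconds, x_km, y_km) into storm events:
    a flash joins an existing storm if it occurs within max_gap_s of that
    storm's most recent flash and within max_dist_km of its position;
    otherwise it starts a new storm."""
    storms = []  # each storm is a time-ordered list of flashes
    for t, x, y in sorted(flashes):
        placed = False
        for storm in storms:
            last_t, last_x, last_y = storm[-1]
            if (t - last_t <= max_gap_s
                    and math.hypot(x - last_x, y - last_y) <= max_dist_km):
                storm.append((t, x, y))
                placed = True
                break
        if not placed:
            storms.append([(t, x, y)])
    return storms
```

Each storm's first and last flash then give the initiation and dissipation times and locations to which the hourly meteorological and air quality data are joined.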

Keywords: lightning, urbanization, thunderstorms, climatology

Procedia PDF Downloads 58
240 Fuzzy Optimization Multi-Objective Clustering Ensemble Model for Multi-Source Data Analysis

Authors: C. B. Le, V. N. Pham

Abstract:

In modern data analysis, multi-source data appears more and more in real applications. Multi-source data clustering has emerged as an important issue in the data mining and machine learning community. Different data sources provide different information about the data, so linking multiple sources is essential to improve clustering performance. However, in practice, multi-source data is often heterogeneous, uncertain, and large, which is considered a major challenge. Ensemble learning is a versatile machine learning model in which learning techniques can work in parallel on big data. Clustering ensembles have been shown to outperform any standard clustering algorithm in terms of accuracy and robustness. However, most traditional clustering ensemble approaches are based on a single-objective function and single-source data. This paper proposes a new clustering ensemble method for multi-source data analysis, the Fuzzy Optimized Multi-Objective Clustering Ensemble (FOMOCE) method. Firstly, a clustering ensemble mathematical model is introduced, based on the structure of the multi-objective clustering function, multi-source data, and dark knowledge. Then, rules for extracting dark knowledge from the input data, clustering algorithms, and base clusterings are designed and applied. Finally, a clustering ensemble algorithm is proposed for multi-source data analysis. Experiments were performed on standard sample data sets. The experimental results demonstrate the superior performance of the FOMOCE method compared to existing clustering ensemble methods and multi-source clustering methods.
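FOMOCE itself is not specified here, but the base idea of a clustering ensemble, combining several base clusterings into a consensus, can be sketched with the classic co-association approach (a simplified stand-in for illustration, not the authors' multi-objective fuzzy method):

```python
def co_association(partitions):
    """Fraction of base clusterings in which each pair of items
    falls in the same cluster."""
    n = len(partitions[0])
    m = [[0.0] * n for _ in range(n)]
    for labels in partitions:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    m[i][j] += 1.0 / len(partitions)
    return m

def consensus_clusters(partitions, threshold=0.5):
    """Single-link consensus: merge items whose co-association
    exceeds the threshold (union-find over the pair graph)."""
    m = co_association(partitions)
    n = len(m)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if m[i][j] > threshold:
                parent[find(i)] = find(j)
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())
```

Because each base clustering can come from a different data source or objective, the co-association matrix is one natural place where multi-source information gets fused.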

Keywords: clustering ensemble, multi-source, multi-objective, fuzzy clustering

Procedia PDF Downloads 166
239 Atomic Town: History and Vernacular Heritage at the Mary Kathleen Uranium Mine in Australia

Authors: Erik Eklund

Abstract:

Mary Kathleen was a purpose-built company town located in northwest Queensland, Australia. It was created to work a rich uranium deposit discovered in the area in July 1954. The town was completed by 1958, possessing curved streets, modern materials, and a progressive urban planning scheme. Formed in the minds of corporate executives and architects and made manifest in arid zone country between Cloncurry and Mount Isa, Mary Kathleen was a modern marvel in the outback, a town that tamed the wild country of northwest Queensland, or so it seemed. The town was also a product of the Cold War. In the context of a nuclear arms race between the Soviet Union and its allies and the United States of America (USA) and its allies, a rapid rush to locate, mine, and process uranium after 1944 led to the creation of uranium towns in Czechoslovakia, Canada, the Soviet Union, the USA, and Australia, of which Mary Kathleen was one example. Mary Kathleen closed in 1981, and most of the town's infrastructure was removed. Since then, the town's ghostly remains have attracted travellers and tourists. Never an officially sanctioned tourist site, the area has nevertheless become a regular stop for campers and day trippers, who often engage with the site without formal interpretation. This paper explores the status of this vernacular heritage and asks why it has not gained any official status and what visitors might see in the place despite its uncertain status.

Keywords: uranium mining, planned communities, official heritage, vernacular heritage, Australian history

Procedia PDF Downloads 71
238 Lipidomic Response to Neoadjuvant Chemoradiotherapy in Rectal Cancer

Authors: Patricia O. Carvalho, Marcia C. F. Messias, Salvador Sanchez Vinces, Caroline F. A. Gatinoni, Vitor P. Iordanu, Carlos A. R. Martinez

Abstract:

Lipidomics methods are widely used in the identification and validation of disease-specific biomarkers and in therapy response evaluation. The present study aimed to identify a panel of potential lipid biomarkers for evaluating the response to neoadjuvant chemoradiotherapy in rectal adenocarcinoma (RAC). Liquid chromatography-mass spectrometry (LC-MS)-based untargeted lipidomics was used to profile human serum samples from patients with clinical stage T2 or T3 resectable RAC, before and after chemoradiotherapy treatment. A total of 28 blood plasma samples were collected from 14 patients with RAC who were recruited at the São Francisco University Hospital (HUSF/USF). The study was approved by the ethics committee (CAAE 14958819.8.0000.5514). Univariate and multivariate statistical analyses were applied to explore dysregulated metabolic pathways using untargeted lipidomic profiling and data mining approaches. A total of 36 significantly altered lipids were identified, and the subsequent partial least-squares discriminant analysis model was both cross-validated (R², Q²) and permuted. Lysophosphatidylcholine (LPC) plasmalogens containing palmitoleic and oleic acids, with high variable importance in projection scores, showed a tendency to be lower after completion of chemoradiotherapy. Chemoradiotherapy thus seems to change plasmanyl-phospholipid levels, indicating that these lipids play an important role in RAC pathogenesis.

Keywords: lipidomics, neoadjuvant chemoradiotherapy, plasmalogens, rectal adenocarcinoma

Procedia PDF Downloads 115
237 A Methodology for Automatic Diversification of Document Categories

Authors: Dasom Kim, Chen Liu, Myungsu Lim, Su-Hyeon Jeon, ByeoungKug Jeon, Kee-Young Kwahk, Namgyu Kim

Abstract:

Recently, numerous documents, including unstructured data and text, have been created due to the rapid increase in the usage of social media and the Internet. Each document is usually assigned a specific category for the convenience of users. In the past, this categorization was performed manually. However, manual categorization not only cannot guarantee accuracy but also requires a large amount of time and incurs huge costs. Many studies have been conducted on the automatic creation of categories to overcome these limitations. Unfortunately, most of these methods cannot be applied to categorizing complex documents with multiple topics because they assume that one document can be assigned to only one category. To overcome this limitation, some studies have attempted to assign each document to multiple categories. However, they are also limited in that their learning process requires training on a multi-categorized document set. These methods therefore cannot be applied to the multi-categorization of most documents unless multi-categorized training sets are provided. To remove this requirement of traditional multi-categorization algorithms, we previously proposed a new methodology that can extend the category of a single-categorized document to multiple categories by analyzing relationships among categories, topics, and documents. In this paper, we design a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.

Keywords: big data analysis, document classification, multi-category, text mining, topic analysis

Procedia PDF Downloads 257
236 Children and Migration in Ghana: Unveiling the Realities of Vulnerability and Social Exclusion

Authors: Thomas Yeboah

Abstract:

In contemporary times, the incessant movement of northern children, especially girls, to southern Ghana to the detriment of their education is worrisome. Due to misplaced expectations about southern Ghana, the majority of them move without any idea of where to stay or what to do, exposing them to harsh living conditions. Most find menial work on cocoa farms, in illegal mining, and in the head porterage business. This study was conducted in the Kumasi Metropolis to ascertain the major causes of child migration from the northern part of Ghana to the south and the children's living conditions. Both qualitative and quantitative tools of data collection and analysis were employed. Purposive sampling was used to select 90 migrants below 18 years of age. Specifically, interviews, focus group discussions and questionnaires were used to elicit responses from the units of analysis. The study revealed that the major cause of child migration from northern Ghana to the south is poverty. It was evident that respondents were vulnerable in their new environment: they are exposed to harsh environmental conditions; sexual, verbal and physical assault; and harassment from armed robbers. The paper recommends that policy decisions create an enabling environment for the labour force in the north to ameliorate the compelling effects poverty has on child migration. Efforts should also be made to create a proper psychological climate in the minds of the children regarding their destination areas through sensitization and education.

Keywords: child migration, vulnerability, social exclusion, child labour, Ghana

Procedia PDF Downloads 428
235 Petrology Investigation of Apatite Minerals in the Esfordi Mine

Authors: Haleh Rezaei Zanjirabadi, Fatemeh Saberi, Bahman Rahimzadeh, Fariborz Masoudi, Mohammad Rahgosha

Abstract:

In this study, apatite minerals from the iron-phosphate deposit of Yazd have been investigated within the microcontinent zone of Iran in the Zagros structural zone. The geological units in the Esfordi area are of pre-Cambrian to lower-Cambrian age, consisting of a succession of carbonate rocks (dolomite), shale, tuff, sandstone, and volcanic rocks. In addition to these sedimentary and volcanic rocks, the Bahabad granitoid mass, the largest intrusive mass in the region, has intruded into the eastern part of this series and caused its metamorphism and alteration. After collecting the available data, various samples of Esfordi's apatite were prepared, and their mineralogy and crystallography were investigated using laboratory methods such as petrographic microscopy, Raman spectroscopy, EDS, and SEM. In non-destructive Raman spectroscopy, the molecular structure of the apatite minerals was revealed in four distinct spectral ranges. Initially, the spectra of phosphate and aluminum bonds with H2O and OH were observed, followed by the identification of Cl, OH, Al, Na and Ca units, depending on the apatite mineral family. In SEM analysis, based on the various shapes and different phases of the apatites, their major constituent elements were identified through EDS, indicating that the samples from the Esfordi mining area exhibit a dense and coherent texture with smooth surfaces. Based on the elemental analysis results by EDS, the apatites of the Esfordi area are classified into the calcic apatite group.

Keywords: petrology, apatite, Esfordi, EDS, SEM, Raman spectroscopy

Procedia PDF Downloads 44
234 Graph-Based Semantical Extractive Text Analysis

Authors: Mina Samizadeh

Abstract:

In the past few decades, there has been an explosion in the amount of available data produced from various sources on different topics. The availability of this enormous data requires us to adopt effective computational tools to explore it, which has led to intense and growing interest in the research community in developing computational methods for processing text data. One line of study focuses on condensing text so that a higher level of understanding can be reached in a shorter time. The two important tasks here are keyword extraction and text summarization. In keyword extraction, we are interested in finding the most important words in a text, which convey its general topic. In text summarization, we are interested in producing a short text that includes the document's important information. The TextRank algorithm, an unsupervised learning method extending PageRank (the algorithm underlying Google's original page ranking), has shown its efficacy in large-scale text mining, especially for text summarization and keyword extraction. This algorithm can automatically extract the important parts of a text (keywords or sentences) and return them as a result. However, it neglects the semantic similarity between the different parts. In this work, we improved the results of the TextRank algorithm by incorporating the semantic similarity between parts of the text. Aside from keyword extraction and text summarization, we develop a topic clustering algorithm based on our framework, which can be used individually or as part of generating the summary to overcome coverage problems.
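A minimal stdlib sketch of the base TextRank idea, PageRank run over a word co-occurrence graph, is shown below; the improvement described above would replace the unweighted edges with semantic similarity weights (the function and parameter names are illustrative):

```python
from collections import defaultdict

def textrank_keywords(words, window=2, damping=0.85, iters=50):
    """Rank words by PageRank over a co-occurrence graph in which
    words appearing within `window` positions of each other are linked."""
    graph = defaultdict(set)
    for i, w in enumerate(words):
        for j in range(i + 1, min(i + window + 1, len(words))):
            if words[j] != w:
                graph[w].add(words[j])
                graph[words[j]].add(w)
    # Iterate the PageRank update: each node shares its score
    # equally among its neighbours.
    score = {w: 1.0 for w in graph}
    for _ in range(iters):
        score = {w: (1 - damping) + damping * sum(
                     score[u] / len(graph[u]) for u in graph[w])
                 for w in graph}
    return sorted(score, key=score.get, reverse=True)
```

In the semantic variant, each edge would instead carry a similarity weight (e.g., from word embeddings), and the score a node passes to a neighbour would be proportional to that weight rather than split uniformly.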

Keywords: keyword extraction, n-gram extraction, text summarization, topic clustering, semantic analysis

Procedia PDF Downloads 52
233 Mitigation Measures for the Acid Mine Drainage Emanating from the Sabie Goldfield: Case Study of the Nestor Mine

Authors: Rudzani Lusunzi, Frans Waanders, Elvis Fosso-Kankeu, Robert Khashane Netshitungulwana

Abstract:

The Sabie Goldfield has a history of gold mining dating back more than a century. Acid mine drainage (AMD) from the Nestor mine tailings storage facility (MTSF) poses a serious threat to the nearby ecosystem, specifically the Sabie River system. This study aims to develop mitigation measures for the AMD emanating from the Nestor MTSF using materials from the Glynns Lydenburg MTSF. The Nestor MTSF (NM) and the Glynns Lydenburg MTSF (GM) each provided about 20 kg of bulk composite samples, from which two mixtures were created: MIX-A contains 25 wt% GM and 75 wt% NM, and MIX-B contains 50 wt% NM and 50 wt% GM. The same static tests used for samples NM and GM, i.e., acid-base accounting (ABA), net acid generation (NAG), and the acid buffering characteristics curve (ABCC), were applied to estimate the acid-generating probabilities of MIX-A and MIX-B. The mineralogy of the Nestor MTSF samples includes the primary acid-producing mineral pyrite as well as the secondary minerals ferricopiapite and jarosite, which are common in acidic conditions. The Glynns Lydenburg MTSF samples, on the other hand, contain the primary acid-neutralizing minerals calcite and dolomite. Based on the assessment conducted, materials from the Glynns Lydenburg MTSF are capable of neutralizing AMD from the Nestor MTSF. Therefore, the alkaline tailings materials from the Glynns Lydenburg MTSF can be used to rehabilitate the acidic Nestor MTSF.

Keywords: Nestor Mine, acid mine drainage, mitigation, Sabie River system

Procedia PDF Downloads 67