Search results for: data mining applications and discovery
30178 Application of Computer Aided Engineering Tools in Performance Prediction and Fault Detection of Mechanical Equipment of Mining Process Line
Abstract:
Predictive maintenance is crucial for decreasing the number of downtimes in industries such as metal mining, petroleum and chemicals. Efficient predictive maintenance requires knowing the performance of critical production-line equipment such as pumps and hydro-cyclones under variable operating parameters, selecting the best indicators of equipment health, choosing the best locations for instrumentation, and measuring these indicators. In this paper, computer aided engineering (CAE) tools are implemented to study some important elements of a copper process line, namely slurry pumps and a cyclone, to predict the performance of these components under different working conditions. The models and simulations can be used to predict, for example, the damage tolerance of the main shaft of the slurry pump or the wear rate and location on the cyclone wall or the pump case and impeller. The simulations can also suggest the best measuring parameters, measuring intervals, and their locations.
Keywords: computer aided engineering, predictive maintenance, fault detection, mining process line, slurry pump, hydrocyclone
Procedia PDF Downloads 406
30177 Towards the Use of Software Product Metrics as an Indicator for Measuring Mobile Applications Power Consumption
Authors: Ching Kin Keong, Koh Tieng Wei, Abdul Azim Abd. Ghani, Khaironi Yatim Sharif
Abstract:
Maintaining the factory-default battery endurance rate over time while supporting a huge number of running applications on energy-restricted mobile devices has created a new challenge for mobile application developers. While delivering on customers' unlimited expectations, developers are barely aware of the efficient use of energy within the application itself. Developers therefore need a set of valid energy consumption indicators to assist them in developing energy-saving applications. In this paper, we present a few software product metrics that can be used as indicators to measure the energy consumption of Android-based mobile applications early in the design stage. In particular, Trepn Profiler (a power profiling tool for Qualcomm processors) was used to collect mobile application power consumption data, which were then analyzed against the 23 software metrics in this preliminary study. The results show that McCabe cyclomatic complexity, number of parameters, nested block depth, number of methods, weighted methods per class, number of classes, total lines of code and method lines have a direct relationship with the power consumption of a mobile application.
Keywords: battery endurance, software metrics, mobile application, power consumption
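To make the analysis concrete, the following is a minimal sketch of the kind of metric-versus-power correlation study described above; the CSV file, column names and the choice of Pearson correlation are assumptions for illustration, not the study's actual data or tooling.

```python
# A minimal sketch of the metric-vs-power correlation analysis described above.
# File name, column names, and the use of Pearson correlation are assumptions;
# the study itself examined 23 software product metrics.
import pandas as pd
from scipy import stats

METRICS = [
    "cyclomatic_complexity", "num_parameters", "nested_block_depth",
    "num_methods", "weighted_methods_per_class", "num_classes",
    "total_lines_of_code", "method_lines",
]

def correlate_metrics_with_power(csv_path="app_metrics_power.csv"):
    """Report how strongly each software metric tracks measured power draw."""
    df = pd.read_csv(csv_path)          # one row per application
    results = []
    for metric in METRICS:
        r, p = stats.pearsonr(df[metric], df["avg_power_mw"])
        results.append({"metric": metric, "pearson_r": r, "p_value": p})
    return pd.DataFrame(results).sort_values("pearson_r", ascending=False)

if __name__ == "__main__":
    print(correlate_metrics_with_power())
```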
Procedia PDF Downloads 395
30176 Heat Transfer Studies on CNT Nanofluids in a Turbulent Flow Heat Exchanger
Authors: W. Rashmi, M. Khalid, O. Seiksan, R. Saidur, A. F. Ismail
Abstract:
Nanofluids have received much attention since their discovery. They are believed to be promising coolants in heat transfer applications due to their enhanced thermal conductivity and heat transfer characteristics. In this study, the enhancement in heat transfer of CNT nanofluids under turbulent flow conditions is investigated experimentally. Carbon nanotube (CNT) concentration was varied between 0.051-0.085 wt%. The nanofluid suspension was stabilized by gum arabic (GA) through a process of homogenisation and sonication. The flow rate of the cold fluid (water) was varied from 1.7-3 L/min and the flow rate of the hot fluid was varied between 2-3.5 L/min. Thermal conductivity, density and viscosity of the nanofluids were also measured as a function of temperature and CNT concentration. The experimental results are validated with theoretical correlations for turbulent flow available in the literature. Results showed a heat transfer enhancement in the range of 9-67% as a function of temperature and CNT concentration.
Keywords: nanofluids, carbon nanotubes (CNT), heat transfer enhancement, heat transfer
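For readers wanting to reproduce the comparison against turbulent-flow theory, the sketch below evaluates one commonly used correlation (Dittus-Boelter); the paper does not state which correlations were applied, and the fluid properties are placeholder values.

```python
# Illustrative check against a standard turbulent-flow correlation.
# Dittus-Boelter is used here only as an example, and the property values
# for both fluids are placeholders, not measured data from the study.
def dittus_boelter_h(rho, mu, cp, k, velocity, diameter, heating=True):
    """Return the convective heat transfer coefficient h [W/m^2.K]."""
    re = rho * velocity * diameter / mu          # Reynolds number
    pr = cp * mu / k                             # Prandtl number
    n = 0.4 if heating else 0.3
    nu = 0.023 * re**0.8 * pr**n                 # Nusselt number, turbulent pipe flow
    return nu * k / diameter

# Base fluid (water-like) versus a CNT nanofluid with modestly enhanced
# thermal conductivity and viscosity (hypothetical values).
h_water = dittus_boelter_h(rho=998, mu=1.0e-3, cp=4182, k=0.60, velocity=1.2, diameter=0.01)
h_nano  = dittus_boelter_h(rho=1005, mu=1.1e-3, cp=4100, k=0.66, velocity=1.2, diameter=0.01)
print(f"h_water = {h_water:.0f} W/m2K, h_nanofluid = {h_nano:.0f} W/m2K, "
      f"enhancement = {(h_nano / h_water - 1) * 100:.1f}%")
```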
Procedia PDF Downloads 501
30175 Use of Fine Recycled Aggregates in Normal Concrete Production
Authors: Vignesh Pechiappan Ayyathurai, Mukesh Limbachiya, Hsein Kew
Abstract:
There is a growing interest in using recycled, secondary-use and industrial by-product materials in high-value commercial applications. Potential high-volume applications include the use of fine aggregate in flowable fill or as a component in manufactured aggregates. However, much scientific as well as applied research is needed in this area due to the lack of available data on the mechanical and environmental properties of elements or products produced using fine recycled aggregates. The principal objectives of this research are to synthesize existing data on the beneficial reuse of fine recycled materials and to develop an extensive testing programme for assessing and establishing the engineering and long-term durability properties of concrete and other construction products produced using such material, for wide use in practical applications. This paper is a research proposal for PhD admission. The proposed research aims to supply the necessary technical as well as practical information on fine recycled aggregate concrete to the construction industry to promote its wider use, and to disseminate research outcomes to local authorities for consideration of fine recycled aggregate concrete in various applications.
Keywords: FRA, fine aggregate, recycling, concrete
Procedia PDF Downloads 323
30174 Assessing Online Learning Paths in a Learning Management System Using a Data Mining and Machine Learning Approach
Authors: Alvaro Figueira, Bruno Cabral
Abstract:
Nowadays, students are used to being assessed through an online platform. Educators have stepped up from a period in which they endured the transition from paper to digital. The use of a diversified set of question types that range from quizzes to open questions is currently common in most university courses. In many courses today, the evaluation methodology also fosters the students' online participation in forums, the download and upload of modified files, or even participation in group activities. At the same time, new pedagogical theories that promote the active participation of students in the learning process, and the systematic use of problem-based learning, are being adopted using an eLearning system for that purpose. However, although these activities can generate a lot of feedback for students, it is usually restricted to the assessment of well-defined online tasks. In this article, we propose an automatic system that informs students of abnormal deviations from a 'correct' learning path in the course. Our approach is based on the fact that obtaining this information earlier in the semester may provide students and educators an opportunity to resolve an eventual problem regarding the student's current online actions towards the course. Our goal is to prevent situations that have a significant probability of leading to a poor grade and, eventually, to failing. In the major learning management systems (LMS) currently available, the interaction between the students and the system itself is registered in log files in the form of records that mark the beginning of actions performed by the user. Our proposed system uses that logged information to derive new information: the time each student spends on each activity, the time and order of the resources used by the student and, finally, the online resource usage pattern. Then, using the grades assigned to students in previous years, we build a learning dataset that is used to feed a machine learning meta classifier. The produced classification model is then used to predict the grade a learning path is heading to in the current year. This approach serves not only the teacher but also the student, who receives automatic feedback on her current situation with past years as a perspective. Our system can be applied to online courses that integrate the use of an online platform that stores user actions in a log file, and that have access to other students' evaluations. The system is based on a data mining process on the log files and on a self-feedback machine learning algorithm that works paired with the Moodle LMS.
Keywords: data mining, e-learning, grade prediction, machine learning, student learning path
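A minimal sketch of the log-to-features-to-classifier pipeline described above is given below; the log column names, the derived features and the use of a random forest as a stand-in for the paper's meta classifier are illustrative assumptions.

```python
# A minimal sketch of deriving per-student features from a Moodle-style event
# log and training a classifier on previous years' outcomes. The input files,
# column names, and the random forest stand-in are assumptions for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def build_features(log_df):
    """Derive per-student features from a Moodle-style event log."""
    log_df = log_df.sort_values(["student_id", "timestamp"])
    # Time spent on an activity ~ gap until the student's next logged event.
    log_df["time_spent"] = (
        log_df.groupby("student_id")["timestamp"].diff().shift(-1).dt.total_seconds()
    )
    return log_df.groupby("student_id").agg(
        total_time=("time_spent", "sum"),
        n_events=("event", "count"),
        n_resources=("resource_id", "nunique"),
    )

logs = pd.read_csv("moodle_log.csv", parse_dates=["timestamp"])
grades = pd.read_csv("previous_grades.csv", index_col="student_id")  # column: "at_risk"
data = build_features(logs).join(grades["at_risk"], how="inner")
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, data.drop(columns="at_risk"), data["at_risk"], cv=5).mean())
```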
Procedia PDF Downloads 123
30173 Impact of Collieries on Groundwater in Damodar River Basin
Authors: Rajkumar Ghosh
Abstract:
The industrialization of coal mining and related activities has a significant impact on groundwater in the surrounding areas of the Damodar River. The Damodar River basin, located in eastern India, is known as the "Ruhr of India" due to its abundant coal reserves and extensive coal mining and industrial operations. One of the major consequences of collieries on groundwater is the contamination of water sources. Coal mining activities often involve the excavation and extraction of coal through underground or open-pit mining methods. These processes can release various pollutants and chemicals into the groundwater, including heavy metals, acid mine drainage, and other toxic substances. As a result, the quality of groundwater in the Damodar River region has deteriorated, making it unsuitable for drinking, irrigation, and other purposes. The high concentration of heavy metals, such as arsenic, lead, and mercury, in the groundwater has posed severe health risks to the local population. Prolonged exposure to contaminated water can lead to various health problems, including skin diseases, respiratory issues, and even long-term ailments like cancer. The contamination has also affected the aquatic ecosystem, harming fish populations and other organisms dependent on the river's water. Moreover, the excessive extraction of groundwater for industrial processes, including coal washing and cooling systems, has resulted in a decline in the water table and depletion of aquifers. This has led to water scarcity and reduced availability of water for agricultural activities, impacting the livelihoods of farmers in the region. Efforts have been made to mitigate these issues through the implementation of regulations and improved industrial practices. However, the historical legacy of coal industrialization continues to impact the groundwater in the Damodar River area. Remediation measures, such as the installation of water treatment plants and the promotion of sustainable mining practices, are essential to restore the quality of groundwater and ensure the well-being of the affected communities. In conclusion, coal industrialization in the Damodar River surroundings has had a detrimental impact on groundwater. This research focuses on soil subsidence induced by the over-exploitation of groundwater for dewatering open-pit coal mines. Soil degradation happens in arid and semi-arid regions as a result of land subsidence in coal mining regions, which reduces soil fertility. Depletion of aquifers, contamination, and water scarcity are some of the key challenges resulting from these activities. It is crucial to prioritize sustainable mining practices, environmental conservation, and the provision of clean drinking water to mitigate the long-lasting effects of collieries on the groundwater resources in the region.
Keywords: coal mining, groundwater, soil subsidence, water table, damodar river
Procedia PDF Downloads 82
30172 Broadband Platinum Disulfide Based Saturable Absorber Used for Optical Fiber Mode Locking Lasers
Authors: Hui Long, Chun Yin Tang, Ping Kwong Cheng, Xin Yu Wang, Wayesh Qarony, Yuen Hong Tsang
Abstract:
Two dimensional (2D) materials have recently attained substantial research interest since the discovery of graphene. However, the zero-bandgap feature of graphene limits its nonlinear optical applications, e.g., saturable absorption, since these applications require strong light-matter interaction. Nevertheless, the excellent optoelectronic properties, such as the broad tunable bandgap energy and high carrier mobility of Group 10 transition metal dichalcogenide 2D materials, e.g., PtS2, introduce new degrees of freedom in optoelectronic applications. This work reports our recent research findings regarding the saturable absorption property of PtS2 layered 2D material and its possibility to be used as a saturable absorber (SA) for ultrafast mode locking fiber lasers. The demonstration of mode locking operation by using the fabricated PtS2 as SA will be discussed. The PtS2/PVA SA used in this experiment is made up of few-layered PtS2 nanosheets fabricated via a simple ultrasonic liquid exfoliation. An operational wavelength located at ~1 micron is demonstrated from a Yb-doped mode locking fiber laser ring cavity by using the PtS2 SA. The fabricated PtS2 saturable absorber offers strong nonlinear properties, and it is capable of producing regular mode locking laser pulses with a pulse-to-pulse interval matched to the round-trip cavity time. The results confirm successful mode locking operation achieved by the fabricated PtS2 material. This work opens new opportunities for PtS2 materials in ultrafast laser generation. Acknowledgments: This work is financially supported by Shenzhen Science and Technology Innovation Commission (JCYJ20170303160136888) and the Research Grants Council of Hong Kong, China (GRF 152109/16E, PolyU code: B-Q52T).
Keywords: platinum disulfide, PtS2, saturable absorption, saturable absorber, mode locking laser
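As a worked illustration of the round-trip-time relation mentioned above, the snippet below computes the repetition rate of a fundamentally mode-locked fiber ring cavity; the cavity length and effective refractive index are assumed values, not the parameters of the laser reported here.

```python
# Worked example of the round-trip-time / repetition-rate relation.
# The cavity length and effective refractive index are assumed values.
C = 299_792_458.0          # speed of light in vacuum, m/s

def ring_cavity_rep_rate(fiber_length_m, n_eff=1.45):
    """Repetition rate (Hz) of a fundamentally mode-locked fiber ring cavity."""
    round_trip_time = n_eff * fiber_length_m / C      # seconds per round trip
    return 1.0 / round_trip_time

f_rep = ring_cavity_rep_rate(fiber_length_m=10.0)
print(f"round-trip time = {1e9 / f_rep:.1f} ns, repetition rate = {f_rep / 1e6:.2f} MHz")
```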
Procedia PDF Downloads 189
30171 A Comparative Study of GTC and PSP Algorithms for Mining Sequential Patterns Embedded in Database with Time Constraints
Authors: Safa Adi
Abstract:
This paper considers the problem of mining sequential patterns embedded in a database while handling the time constraints as defined in the GSP algorithm (a level-wise algorithm). We compare two previous approaches, GTC and PSP, which resume the general principles of GSP. Furthermore, this paper discusses the PG-hybrid algorithm, which combines PSP and GTC. The results show that PSP and GTC are more efficient than GSP. On the other hand, the GTC algorithm performs better than PSP. The PG-hybrid algorithm uses the PSP algorithm for the first two passes over the database, and the GTC approach for the following scans. Experiments show that the hybrid approach is very efficient for short, frequent sequences.
Keywords: database, GTC algorithm, PSP algorithm, sequential patterns, time constraints
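For orientation, the sketch below shows GSP-style support counting under a maximum-gap time constraint; elements are simplified to single items and candidates are supplied by hand, so it is not the actual GTC or PSP implementation.

```python
# A minimal sketch of support counting for sequential patterns under a
# maximum-gap time constraint (the kind of constraint GSP, GTC and PSP handle).
# Elements are simplified to single items and candidates are given by hand.
def occurs(candidate, sequence, max_gap):
    """True if `candidate` (tuple of items) occurs in `sequence` of (time, item)
    pairs, in order, with consecutive matched events no more than max_gap apart."""
    def search(ci, si, last_time):
        if ci == len(candidate):
            return True
        for j in range(si, len(sequence)):
            t, item = sequence[j]
            if last_time is not None and t - last_time > max_gap:
                return False      # events are time-ordered, later ones only widen the gap
            if item == candidate[ci] and search(ci + 1, j + 1, t):
                return True
        return False
    return search(0, 0, None)

def support(candidate, database, max_gap):
    """Number of data-sequences in `database` containing `candidate`."""
    return sum(occurs(candidate, seq, max_gap) for seq in database)

db = [
    [(1, "a"), (3, "b"), (9, "c")],
    [(1, "a"), (2, "c"), (4, "b")],
    [(2, "b"), (5, "a"), (6, "b")],
]
print(support(("a", "b"), db, max_gap=3))   # a followed by b within 3 time units -> 3
```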
Procedia PDF Downloads 390
30170 A Bayesian Classification System for Facilitating an Institutional Risk Profile Definition
Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan
Abstract:
This paper presents an approach for the easy creation and classification of institutional risk profiles supporting endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to support the identification of the most important risk factors. Subsequently, risk profiles employ a risk factor classifier and associated configurations to support digital preservation experts with a semi-automatic estimation of the endangerment group for file format risk profiles. Our goal is to make use of an expert knowledge base, acquired through a digital preservation survey, in order to detect preservation risks for a particular institution. Another contribution is support for the visualisation of risk factors for a required dimension of analysis. Using the naive Bayes method, the decision support system recommends to an expert the matching risk profile group for the previously selected institutional risk profile. The proposed methods improve the visibility of risk factor values and the quality of a digital preservation process. The presented approach is designed to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and the values of file format risk profiles. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector. The goal is to visualise particular dimensions of this vector for analysis by an expert and to define its profile group. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.
Keywords: linked open data, information integration, digital libraries, data mining
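A minimal sketch of the recommendation step is shown below: a naive Bayes classifier maps an institution's risk-factor vector to an endangerment group; the risk factors, their encoding and the group labels are hypothetical.

```python
# A minimal sketch of the decision-support step described above: a naive Bayes
# classifier maps an institution's risk-factor vector to an endangerment group.
# The risk factors, their encoding, and the group labels are hypothetical.
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Each row: [format_obsolescence, software_dependency, community_support, docs_quality]
X_train = np.array([
    [0.9, 0.8, 0.1, 0.2],
    [0.8, 0.7, 0.2, 0.3],
    [0.2, 0.3, 0.9, 0.8],
    [0.1, 0.2, 0.8, 0.9],
    [0.5, 0.5, 0.5, 0.5],
])
y_train = ["high", "high", "low", "low", "medium"]

model = GaussianNB().fit(X_train, y_train)

new_profile = np.array([[0.7, 0.6, 0.3, 0.4]])       # survey answers for one institution
print("recommended group:", model.predict(new_profile)[0])
print("class probabilities:",
      dict(zip(model.classes_, model.predict_proba(new_profile)[0].round(3))))
```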
Procedia PDF Downloads 428
30169 The Effect of Artificial Intelligence on the Production of Agricultural Lands and Labor
Authors: Ibrahim Makram Ibrahim Salib
Abstract:
Agriculture plays an essential role in providing food for the world's population. It also offers numerous benefits to countries, including non-food products, transportation, and environmental balance. Precision agriculture, which employs advanced tools to monitor variability and manage inputs, can help achieve these benefits. The increasing demand for food security puts pressure on decision-makers to ensure sufficient food production worldwide. To support sustainable agriculture, unmanned aerial vehicles (UAVs) can be utilized to manage farms and increase yields. This paper aims to provide an understanding of UAV usage and its applications in agriculture. The objective is to review the various applications of UAVs in agriculture. Based on a comprehensive review of existing research, it was found that different sensors provide varying analyses for agriculture applications. Therefore, the purpose of the project must be determined before using UAV technology for better data quality and analysis. In conclusion, identifying a suitable sensor and UAV is crucial to gather accurate data and precise analysis when using UAVs in agriculture.
Keywords: agriculture land, agriculture land loss, Kabul city, urban land expansion, urbanization agriculture yield growth, agriculture yield prediction, explorative data analysis, predictive models, regression models drone, precision agriculture, farmer income
Procedia PDF Downloads 76
30168 Field Trial of Resin-Based Composite Materials for the Treatment of Surface Collapses Associated with Former Shallow Coal Mining
Authors: Philip T. Broughton, Mark P. Bettney, Isla L. Smail
Abstract:
Effective treatment of ground instability is essential when managing the impacts associated with historic mining. A field trial was undertaken by the Coal Authority to investigate the geotechnical performance and potential use of composite materials comprising resin and fill or stone to safely treat surface collapses, such as crown-holes, associated with shallow mining. Test pits were loosely filled with various granular fill materials. The fill material was injected with commercially available silicate and polyurethane resin foam products. In situ and laboratory testing was undertaken to assess the geotechnical properties of the resultant composite materials. The test pits were subsequently excavated to assess resin permeation. Drilling and resin injection was easiest through clean limestone fill materials. Recycled building waste fill material proved difficult to inject with resin; this material is thus considered unsuitable for use in resin composites. Incomplete resin permeation in several of the test pits created irregular ‘blocks’ of composite. Injected resin foams significantly improve the stiffness and resistance (strength) of the un-compacted fill material. The stiffness of the treated fill material appears to be a function of the stone particle size, its associated compaction characteristics (under loose tipping) and the proportion of resin foam matrix. The type of fill material is more critical than the type of resin to the geotechnical properties of the composite materials. Resin composites can effectively support typical design imposed loads. Compared to other traditional treatment options, such as cement grouting, the use of resin composites is potentially less disruptive, particularly for sites with limited access, and thus likely to achieve significant reinstatement cost savings. The use of resin composites is considered a suitable option for the future treatment of shallow mining collapses.
Keywords: composite material, ground improvement, mining legacy, resin
Procedia PDF Downloads 355
30167 Algorithmic Obligations: Proactive Liability for AI-Generated Content and Copyright Compliance
Authors: Aleksandra Czubek
Abstract:
As AI systems increasingly shape content creation, existing copyright frameworks face significant challenges in determining liability for AI-generated outputs. Current legal discussions largely focus on who bears responsibility for infringing works, be it developers, users, or entities benefiting from AI outputs. This paper introduces a novel concept of algorithmic obligations, proposing that AI developers be subject to proactive duties that ensure their models prevent copyright infringement before it occurs. Building on principles of obligations law traditionally applied to human actors, the paper suggests a shift from reactive enforcement to proactive legal requirements. AI developers would be legally mandated to incorporate copyright-aware mechanisms within their systems, turning optional safeguards into enforceable standards. These obligations could vary in implementation across international, EU, UK, and U.S. legal frameworks, creating a multi-jurisdictional approach to copyright compliance. This paper explores how the EU’s existing copyright framework, exemplified by the Copyright Directive (2019/790), could evolve to impose a duty of foresight on AI developers, compelling them to embed mechanisms that prevent infringing outputs. By drawing parallels to GDPR’s “data protection by design,” a similar principle could be applied to copyright law, where AI models are designed to minimize copyright risks. In the UK, post-Brexit text and data mining exemptions are seen as pro-innovation but pose risks to copyright protections. This paper proposes a balanced approach, introducing algorithmic obligations to complement these exemptions. AI systems benefiting from text and data mining provisions should integrate safeguards that flag potential copyright violations in real time, ensuring both innovation and protection. In the U.S., where copyright law focuses on human-centric works, this paper suggests an evolution toward algorithmic due diligence. AI developers would have a duty similar to product liability, ensuring that their systems do not produce infringing outputs, even if the outputs themselves cannot be copyrighted. This framework introduces a shift from post-infringement remedies to preventive legal structures, where developers actively mitigate risks. The paper also breaks new ground by addressing obligations surrounding the training data of large language models (LLMs). Currently, training data is often treated under exceptions such as the EU’s text and data mining provisions or U.S. fair use. However, this paper proposes a proactive framework where developers are obligated to verify and document the legal status of their training data, ensuring it is licensed or otherwise cleared for use. In conclusion, this paper advocates for an obligations-centered model that shifts AI-related copyright law from reactive litigation to proactive design. By holding AI developers to a heightened standard of care, this approach aims to prevent infringement at its source, addressing both the outputs of AI systems and the training processes that underlie them.
Keywords: ip, technology, copyright, data, infringement, comparative analysis
Procedia PDF Downloads 20
30166 Research on Spatial Distribution of Service Facilities Based on Innovation Function: A Case Study of Zhejiang University Zijin Co-Maker Town
Authors: Zhang Yuqi
Abstract:
Service facilities are boosters for the cultivation and development of innovative functions in innovation cluster areas. At the same time, reasonable service facility planning can better link the internal functional blocks. This paper takes Zhejiang University Zijin Co-Maker Town as the research object and, based on a combination of network data mining with field research and verification, together with the needs of its internal innovation groups, studies the distribution characteristics and existing problems of service facilities and then proposes targeted planning suggestions. The main conclusions are as follows: (1) In terms of type, the town is rich in general life-supporting services but lacks targeted and distinctive service facilities for innovation groups; (2) In terms of scale structure, small-scale street shops are the main business form, and a large-scale service center is lacking; (3) In terms of spatial structure, the service facility layout of each functional block is too fragile to fit the 'aggregation-distribution' characteristics of innovation and entrepreneurial activities; (4) The goal of optimizing service facility planning should be to foster the function of innovation and entrepreneurship and to meet the actual needs of the innovation and entrepreneurial groups.
Keywords: the cultivation of innovative function, Zhejiang University Zijin Co-Maker Town, service facilities, network data mining, space optimization advice
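One possible way to quantify the 'aggregation-distribution' pattern from mined point-of-interest data is sketched below; the input columns and the clustering parameters are assumptions, not the data actually used in the study.

```python
# A minimal sketch of examining spatial aggregation of mined point-of-interest
# (POI) data with density-based clustering. The file, columns, and the DBSCAN
# parameters are assumptions for illustration only.
import pandas as pd
from sklearn.cluster import DBSCAN

poi = pd.read_csv("zijin_pois.csv")           # columns: name, category, x_m, y_m (projected metres)
coords = poi[["x_m", "y_m"]].to_numpy()

# Facilities within ~150 m of at least 4 neighbours form a cluster ("aggregation");
# label -1 marks isolated facilities ("distribution").
labels = DBSCAN(eps=150, min_samples=5).fit_predict(coords)
poi["cluster"] = labels

print("number of clusters:", len(set(labels)) - (1 if -1 in labels else 0))
print(poi.groupby(["cluster", "category"]).size().unstack(fill_value=0))
```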
Procedia PDF Downloads 117
30165 Recurrent Neural Networks for Classifying Outliers in Electronic Health Record Clinical Text
Authors: Duncan Wallace, M-Tahar Kechadi
Abstract:
In recent years, Machine Learning (ML) approaches have been successfully applied to the analysis of patient symptom data in the context of disease diagnosis, at least where such data is well codified. However, much of the data present in Electronic Health Records (EHR) are unlikely to prove suitable for classic ML approaches. Furthermore, as such data are widely spread across both hospitals and individuals, a decentralized, computationally scalable methodology is a priority. The focus of this paper is to develop a method to predict outliers in an out-of-hours healthcare provision center (OOHC). In particular, our research is based upon the early identification of patients who have underlying conditions which will cause them to repeatedly require medical attention. An OOHC acts as an ad-hoc delivery point for triage and treatment, where interactions occur without recourse to a full medical history of the patient in question. Medical histories relating to patients contacting an OOHC may reside in several distinct EHR systems in multiple hospitals or surgeries, which are unavailable to the OOHC in question. As such, although a local solution is optimal for this problem, it follows that the data under investigation is incomplete, heterogeneous, and comprised mostly of noisy textual notes compiled during routine OOHC activities. Through the use of Deep Learning methodologies, the aim of this paper is to provide the means to identify patient cases, upon initial contact, which are likely to relate to such outliers. To this end, we compare the performance of Long Short-Term Memory, Gated Recurrent Units, and combinations of both with Convolutional Neural Networks. A further aim of this paper is to elucidate the discovery of such outliers by examining the exact terms which provide a strong indication of positive and negative case entries. While free text is the principal data extracted from EHRs for classification, EHRs also contain normalized features. Although the specific demographic features treated within our corpus are relatively limited in scope, we examine whether it is beneficial to include such features among the inputs to our neural network, or whether these features are more successfully exploited in conjunction with a different form of classifier. In this section, we compare the performance of randomly generated regression trees and support vector machines and determine the extent to which our classification program can be improved upon by using either of these machine learning approaches in conjunction with the output of our Recurrent Neural Network application. The output of our neural network is also used to help determine the most significant lexemes present within the corpus for determining high-risk patients. By combining the confidence of our classification program in relation to lexemes within true positive and true negative cases with an inverse document frequency of the lexemes related to these cases, we can determine what features act as the primary indicators of frequent-attender and non-frequent-attender cases, providing a human-interpretable appreciation of how our program classifies cases.
Keywords: artificial neural networks, data-mining, machine learning, medical informatics
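A minimal sketch of one of the architectures compared above (an LSTM classifier over free-text notes predicting frequent-attender status) is given below; the vocabulary size, sequence length, layer sizes and input data are assumptions rather than the authors' exact configuration.

```python
# A minimal sketch of an LSTM text classifier over free-text triage notes,
# predicting frequent-attender status. Vocabulary size, sequence length,
# layer sizes, and the training data are assumptions for illustration.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

VOCAB_SIZE, MAX_LEN = 20_000, 200

def build_lstm_classifier():
    model = tf.keras.Sequential([
        layers.Input(shape=(MAX_LEN,)),
        layers.Embedding(VOCAB_SIZE, 128, mask_zero=True),
        layers.LSTM(64),                        # swap for layers.GRU(64), or add Conv1D, to compare
        layers.Dense(32, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # P(frequent attender)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=[tf.keras.metrics.AUC()])
    return model

def train(texts, labels):
    """texts: list of clinical notes; labels: 0/1 frequent-attender flags."""
    tok = tf.keras.preprocessing.text.Tokenizer(num_words=VOCAB_SIZE, oov_token="<unk>")
    tok.fit_on_texts(texts)
    X = tf.keras.preprocessing.sequence.pad_sequences(
        tok.texts_to_sequences(texts), maxlen=MAX_LEN)
    model = build_lstm_classifier()
    model.fit(X, np.asarray(labels), validation_split=0.2, epochs=5, batch_size=32)
    return model, tok
```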
Procedia PDF Downloads 132
30164 Digital Transformation and Digitalization of Public Administration
Authors: Govind Kumar
Abstract:
The concept of ‘e-governance’ that was brought about by the new wave of reforms, namely ‘LPG’ in the early 1990s, has been enabling governments across the globe to digitally transform themselves. Digital transformation is providing governments with qualitative decisions, optimization in the rational use of resources, facilitation of cost-benefit analyses, and elimination of redundancy and corruption with the help of ICT-based application interfaces. ICT-based applications/technologies have enormous potential for impacting positive change in the social lives of the global citizenry. Supercomputers test and analyze millions of drug molecules for developing candidate vaccines to combat the global pandemic. Further, e-commerce portals help distribute and supply household items and medicines, while videoconferencing tools provide a visual interface between the clients and hosts. Besides, crop yields are being maximized with the help of drones and machine learning, whereas satellite data, artificial intelligence, and cloud computing help governments with the detection of illegal mining, tackling deforestation, and managing freshwater resources. Such e-applications have the potential to take governance an extra mile by achieving the 5 Es (effective, efficient, easy, empower, and equity) of e-governance and the six Rs (reduce, reuse, recycle, recover, redesign and remanufacture) of sustainable development. If such digital transformation gains traction within the government framework, it will replace traditional administration with the digitalization of public administration. On the other hand, it has brought in a new set of challenges, like the digital divide, e-illiteracy, technological divide, etc., and problems like handling e-waste, technological obsolescence, cyber terrorism, e-fraud, hacking, phishing, etc. for governments. Therefore, it would be essential to bring in a rightful mixture of technological and humanistic interventions for addressing the above issues. This is because technology lacks an emotional quotient, and administration does not work like technology. Both are self-effacing unless a blend of technology and a humane face is brought into the administration. The paper will empirically analyze the significance of the technological framework of digital transformation within the government setup for the digitalization of public administration, on the basis of a synthesis of two case studies undertaken in two diverse fields of administration, and present a future framework for the study.
Keywords: digital transformation, electronic governance, public administration, knowledge framework
Procedia PDF Downloads 101
30163 Assessing the High Rate of Deforestation Caused by the Operations of Timber Industries in Ghana
Authors: Obed Asamoah
Abstract:
Forests are vital for human survival and our well-being. During the past years, the world has taken an increasingly significant role in the modification of the global environment. The high rate of deforestation in Ghana is of primary national concern, as the forests provide many ecosystem services and functions that support the country's predominantly agrarian economy and foreign earnings. Ghana's forests are currently a major carbon sink that helps to mitigate climate change. Ghana's forests, both reserves and off-reserves, are under pressure from deforestation. The causes of deforestation are varied but can broadly be categorized into anthropogenic and natural factors. Among the anthropogenic factors, increased wood fuel collection, clearing of forests for agriculture, illegal and poorly regulated timber extraction, social and environmental conflicts, and increasing urbanization and industrialization are the primary known causes of the loss of forests and woodlands. Mineral exploitation in forest areas is considered one of the major causes of deforestation in Ghana. Mining activities, especially the mining of gold by both licensed mining companies and illegal mining groups locally known as "galamsey", also cause damage to the nation's forest reserves. Several works have been conducted regarding the causes of the high rate of deforestation in Ghana; major attention has been placed on illegal logging and the use of forest lands for illegal farming and mining activities. Less emphasis has been placed on the harvesting methods of the timber production companies in the forests of Ghana and the other activities they carry out in the forest. The main objective of this work is to find out the harvesting methods and activities of the timber production companies and their effects on the forests in Ghana. Both qualitative and quantitative research methods were employed. The study population comprised 20 timber industries (sawmills) operating in forest areas of Ghana. These companies were selected randomly. The cluster sampling technique was used to select the respondents. Both primary and secondary data were employed. It was observed that most of the timber production companies do not know the age, the weight, or the distance covered from the harvesting site to the loading site in the forest. It was also observed that old and heavy machines are used by timber production companies in their operations in the forest, which compacts the soil, prevents regeneration and enhances soil erosion. It was further observed that timber production companies do not abide by the rules and regulations governing their operations in the forest. The high rate of corruption among officials of the Ghana Forestry Commission makes the officials relax and not embark on proper monitoring of the operations of the timber production companies, which allows the timber companies to cause more harm to the forest. In order to curb this situation, the Ghana Forestry Commission, together with the Ministry of Lands and Natural Resources, should monitor the activities of the timber production companies and sanction all companies that engage in foul play in their activities in the forest. The Commission should also pay more attention to the policy of "fell one, plant 10" to enhance regeneration in both reserve and off-reserve forests.
Keywords: companies, deforestation, forest, Ghana, timber
Procedia PDF Downloads 200
30162 Establishing a Drug Discovery Platform to Progress Compounds into the Clinic
Authors: Sheraz Gul
Abstract:
The requirements for progressing a compound to clinical trials are well established and rely on the results from in-vitro and in-vivo animal tests to indicate that it is likely to be safe and efficacious when tested in humans. The typical data package required will include demonstrating compound safety, toxicity, bioavailability, pharmacodynamics (potential effects of the compound on body systems) and pharmacokinetics (how the compound is potentially absorbed, distributed, metabolised and eliminated after dosing in humans). If the desired criteria are met and the compound meets the clinical Candidate criteria and is deemed worthy of further development, a submission to regulatory bodies such as the US Food & Drug Administration for an exploratory Investigational New Drug Study can be made. The purpose of this study is to collect data to establish that the compound will not expose humans to unreasonable risks when used in limited, early-stage clinical studies in patients or normal volunteer subjects (Phase I). These studies are also designed to determine the metabolism and pharmacologic actions of the drug in humans, the side effects associated with increasing doses, and, if possible, to gain early evidence on effectiveness. In order to reach the above goals, we have developed a pre-clinical high throughput Absorption, Distribution, Metabolism and Excretion–Toxicity (ADME–Toxicity) panel of assays to identify compounds that are likely to meet the Lead and Candidate compound acceptance criteria. This panel includes solubility studies in a range of biological fluids, cell viability studies in cancer and primary cell lines, mitochondrial toxicity, off-target effects (across the kinase, protease, histone deacetylase, phosphodiesterase and GPCR protein families), CYP450 inhibition (5 different CYP450 enzymes), CYP450 induction, cardiotoxicity (hERG) and genotoxicity. This panel of assays has been applied to multiple compound series developed in a number of projects delivering Lead and clinical Candidates, and examples from these will be presented.
Keywords: absorption, distribution, metabolism and excretion–toxicity, drug discovery, food and drug administration, pharmacodynamics
Procedia PDF Downloads 173
30161 The Early Discovery and Confirmation of the Indus Valley Civilization
Authors: Muhammad Ishaq, Quanchao Zhang, Qian Wang
Abstract:
The Indus Valley Civilization is predominantly found in the northeast of Afghanistan, Pakistan, and the northwest of India and is considered one of the four ancient civilizations of the Old World, as well as the first urban civilization in South Asia. In 1920, John Marshall and other archaeologists established the existence of this civilization. Over the course of a century, India and Pakistan have made significant advancements in their joint archaeological investigation and excavation, contributing to the study of the Indus Valley Civilization. Given the importance of early discovery and confirmation of this civilization, our research focuses on the academic history of its archaeology by gathering published research material. Our research begins by collecting research data associated with the Indus Valley Civilization and documenting the process of archaeological investigations and excavations from the 19th century until the present day. We also summarize the archaeological works conducted during different periods. Furthermore, we present the primary academic views on the Indus Civilization from the 19th century until the present, explaining their developmental process and highlighting recent research. This forms a foundation for further study. We discovered that the archaeological research of the Indus Civilization is significantly influenced by Western archaeology and has yet to establish an independent, local research system. We delve into the three primary sites of the Indus Valley Civilization - Harappa, Mohenjo-Daro, and Chanhudaro - discussing their history and archaeological excavation records. Our findings indicate that the Indus Civilization is solely dependent on archaeology, distinguishing it from the Sumerian Civilization and verifying that it originates from the Bronze Age of the Indus Valley. Lastly, we examine the primary academic issues associated with the Indus Civilization in greater depth. These issues include climate environment, political system, primitive religion, and academic contribution.
Keywords: Indus Valley civilization, archaeology, Harappa, Mohenjo-Daro
Procedia PDF Downloads 53
30160 In-situ Oxygen Enrichment for UCG
Authors: Adesola O. Orimoloye, Edward Gobina
Abstract:
Membrane separation technology is still considered an emerging technology in the mining sector and does not yet have the widespread acceptance that it has in other industrial sectors. Underground Coal Gasification (UCG), wherein coal is converted to gas in-situ, is a safer alternative to conventional mining that retains all pollutants underground, making the process environmentally friendly. In-situ combustion of coal for power generation allows access to more of the physical global coal resource than would be included in current economically recoverable reserve estimates. Where mining is no longer taking place, for economic or geological reasons, controlled gasification permits exploitation of the deposit by reacting the coal seams in situ to form a synthesis gas. The oxygen supply stage is one of the most expensive parts of any gasification project, but the use of membranes is a potentially attractive approach for producing oxygen-enriched air. In this study, a variety of cost-effective membrane materials that give an optimal range of oxygen concentrations of interest were designed and tested under diverse operating conditions. An oxygen-enriched atmosphere improves the combustion temperature, but a decline is observed if the oxygen concentration exceeds the optimum. Experimental results also reveal the preparatory method, apparatus and performance of the fabricated membrane.
Keywords: membranes, oxygen-enrichment, gasification, coal
Procedia PDF Downloads 325
30159 Growth and Development of Membranes in Gas Sequestration
Authors: Sreevalli Bokka
Abstract:
The process of reducing the carbon intensity of a process or stream released into the atmosphere is termed decarbonization. Among the various technologies emerging to capture or reduce carbon intensity, membranes are becoming a key player in separating carbon from a gas stream, such as industrial effluent air. Due to the advantages of high surface area and low flow resistance, fiber membranes are being widely adopted for gas separation applications. A fiber membrane is a semipermeable barrier that is increasingly used for filtration and separation applications needing high packing density. A few of the many applications are in water desalination, medical applications, bioreactors, and gas separation. Only a few polymeric materials, such as cellulose acetate, polysulfone, and polyvinylidene fluoride, have been studied for fabricating fiber membranes. A few of the challenges of using fiber membranes are fouling and weak mechanical properties, leading to the breakage of membranes. In this work, the growth of fiber membranes and the challenges for future developments in filtration and gas separation applications are presented.
Keywords: membranes, filtration, separations, polymers, carbon capture
Procedia PDF Downloads 57
30158 Commercial Automobile Insurance: A Practical Approach of the Generalized Additive Model
Authors: Nicolas Plamondon, Stuart Atkinson, Shuzi Zhou
Abstract:
The insurance industry is usually not the first topic one has in mind when thinking about applications of data science. However, the use of data science in the finance and insurance industry is growing quickly for several reasons, including an abundance of reliable customer data, ferocious competition requiring more accurate pricing, etc. Among the top use cases of data science, we find pricing optimization, customer segmentation, customer risk assessment, fraud detection, marketing, and triage analytics. The objective of this paper is to present an application of the generalized additive model (GAM) to a commercial automobile insurance product: the individually rated commercial automobile. These are vehicles used for commercial purposes, but for which there is not enough volume to apply pricing to several vehicles at the same time. The GAM model was selected as an improvement over the GLM for its ease of use and its wide range of applications. The model was trained using the largest split of the data to determine model parameters. The remaining part of the data was used as testing data to verify the quality of the modeling activity. We used the Gini coefficient to evaluate the performance of the model. For long-term monitoring, commonly used metrics such as RMSE and MAE will be used. Another topic of interest in the insurance industry is the process of producing the model. We will discuss at a high level the interactions between the different teams within an insurance company that need to work together to produce a model and then monitor its performance over time. Moreover, we will discuss the regulations in place in the insurance industry. Finally, we will discuss the maintenance of the model and the fact that new data do not arrive constantly and that some metrics can take a long time to become meaningful.
Keywords: insurance, data science, modeling, monitoring, regulation, processes
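A minimal sketch of fitting a GAM and scoring it with a normalized Gini coefficient is shown below; the pyGAM library, the rating variables and the loss-cost target are illustrative assumptions, not the insurer's actual rating model.

```python
# A minimal sketch of fitting a GAM to individually rated commercial auto data
# and scoring it with a normalized Gini coefficient. The file name, rating
# variables and loss-cost target are hypothetical.
import numpy as np
import pandas as pd
from pygam import LinearGAM, s, f
from sklearn.model_selection import train_test_split

def gini(actual, pred):
    order = np.argsort(pred)[::-1]              # sort policies from riskiest to safest
    lorentz = np.cumsum(actual[order]) / actual.sum()
    return (lorentz - np.arange(1, len(actual) + 1) / len(actual)).sum() / len(actual)

def gini_normalized(actual, pred):
    """How well predictions rank observed losses, scaled to the perfect ranking."""
    return gini(actual, pred) / gini(actual, actual)

df = pd.read_csv("ind_rated_commercial_auto.csv")        # hypothetical policy-level extract
df["territory_code"] = df["territory"].astype("category").cat.codes
X = df[["vehicle_age", "gross_vehicle_weight", "radius_of_operation", "territory_code"]].to_numpy()
y = df["incurred_loss_cost"].to_numpy()
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Smooth terms for the continuous rating variables, a factor term for territory.
gam = LinearGAM(s(0) + s(1) + s(2) + f(3)).gridsearch(X_tr, y_tr)
print("test Gini:", round(float(gini_normalized(y_te, gam.predict(X_te))), 3))
```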
Procedia PDF Downloads 76
30157 Identity Verification Using k-NN Classifiers and Autistic Genetic Data
Authors: Fuad M. Alkoot
Abstract:
DNA data have been used in forensics for decades. However, current research looks at using DNA as a biometric identity verification modality. The goal is to improve the speed of identification. We aim at using gene data that was initially used for autism detection to find whether, and how accurately, these data can be used for identification applications. Mainly, our goal is to find if our data preprocessing technique yields data useful as a biometric identification tool. We experiment with using the nearest neighbor classifier to identify subjects. Results show that the optimal classification rate is achieved when the test set is corrupted by normally distributed noise with zero mean and a standard deviation of 1. The classification rate remains close to optimal at higher noise standard deviations, up to 3. This shows that the data can be used for identity verification with high accuracy using a simple classifier such as the k-nearest neighbor (k-NN).
Keywords: biometrics, genetic data, identity verification, k nearest neighbor
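The noise-robustness experiment described above can be sketched as follows; synthetic data stands in for the actual autism genetic dataset, and k = 1 is an assumed setting.

```python
# A minimal sketch of the noise-robustness experiment: a k-NN classifier is
# trained on preprocessed feature vectors and evaluated on copies of the test
# set corrupted by zero-mean Gaussian noise. Synthetic data replaces the
# actual genetic dataset, and k = 1 is an assumed setting.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=40, n_informative=20,
                           n_classes=6, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

knn = KNeighborsClassifier(n_neighbors=1).fit(X_tr, y_tr)
for sigma in [0.0, 0.5, 1.0, 2.0, 3.0]:
    noisy = X_te + rng.normal(0.0, sigma, size=X_te.shape) if sigma else X_te
    print(f"sigma={sigma:>3}: accuracy={knn.score(noisy, y_te):.3f}")
```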
Procedia PDF Downloads 258
30156 Beyond Voluntary Corporate Social Responsibility: Examining the Impact of the New Mandatory Community Development Agreement in the Mining Sector of Sierra Leone
Authors: Wusu Conteh
Abstract:
Since the 1990s, neo-liberalization has become a global agenda. The free market ushered in an unprecedented drive by Multinational Corporations (MNCs) to secure mineral rights in resource-rich countries. Several governments in the Global South implemented a liberalized mining policy with support from the International Financial Institutions (IFIs). MNCs have maintained that voluntary Corporate Social Responsibility (CSR) has engendered socio-economic development in mining-affected communities. However, most resource-rich countries are struggling to transform the resources into sustainable socio-economic development. They are trapped in what has been widely described as the 'resource curse.' In an attempt to address this resource conundrum, the African Mining Vision (AMV) of 2009 developed a model on resource governance. The advent of the AMV has engendered the introduction of mandatory community development agreements (CDAs) into the legal framework of many countries in Africa. In 2009, Sierra Leone enacted the Mines and Minerals Act that obligates mining companies to invest in Primary Host Communities. The study employs interviews and field observation techniques to explicate the dynamics of the CDA program. A total of 25 respondents (government officials, NGOs/CSOs and community stakeholders) were interviewed. The study focuses on a case study of the Sierra Rutile CDA program in Sierra Leone. Extant scholarly works have extensively explored the resource curse and voluntary CSR, but there are limited studies uncovering the mandatory CDA and its impact on socio-economic development in mining-affected communities. Thus, the purpose of this study is to explicate the impact of the CDA in Sierra Leone. Using the theory of change helps to understand how the availability of mandatory funds can empower communities to take an active part in decision making related to the development of the communities. The results show that the CDA has engendered a predictable fund for community development. It has also empowered ordinary members of the community to determine the development program. However, the CDA has created new ground for contestation between the pre-existing local governance structure (traditional authority) and the newly created community development committee (CDC), which is headed by an ordinary member of the community.
Keywords: community development agreement, impact, mandatory, participation
Procedia PDF Downloads 125
30155 Increasing the Capacity of Plant Bottlenecks by Improving the Ratio of Mean Time between Failures to Mean Time to Repair
Authors: Jalal Soleimannejad, Mohammad Asadizeidabadi, Mahmoud Koorki, Mojtaba Azarpira
Abstract:
A significant percentage of production costs is maintenance costs, and analysis of maintenance costs can help achieve greater productivity and competitiveness. With this in mind, the maintenance of machines and installations is considered an essential part of organizational functions, and applying effective strategies creates significant added value in manufacturing activities. Organizations are trying to achieve performance levels on a global scale, with emphasis on creating competitive advantage through different methods such as RCM (Reliability-Centered Maintenance), TPM (Total Productive Maintenance), etc. In this study, increasing the capacity of the Concentration Plant of Golgohar Iron Ore Mining & Industrial Company (GEG) was examined using reliability and maintainability analyses. The results of this research showed that instead of increasing the number of machines (in order to solve the bottleneck problems), improving reliability and maintainability would solve the bottleneck problems in the best way. It should be mentioned that in the abovementioned study, the data set of the Concentration Plant of GEG was applied and analyzed as a case study.
Keywords: bottleneck, golgohar iron ore mining & industrial company, maintainability, maintenance costs, reliability
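As a worked illustration of why the MTBF/MTTR ratio matters, the snippet below computes inherent availability A = MTBF / (MTBF + MTTR) and the resulting effective capacity; the figures are illustrative and are not the Golgohar concentration plant's actual data.

```python
# Worked example of why improving MTBF/MTTR raises effective capacity.
# Inherent availability A = MTBF / (MTBF + MTTR); all figures are illustrative
# and are not the Golgohar concentration plant's actual data.
def availability(mtbf_h, mttr_h):
    return mtbf_h / (mtbf_h + mttr_h)

nominal_rate_tph = 1_000          # hypothetical nominal throughput, tonnes/hour

before = availability(mtbf_h=80, mttr_h=8)    # MTBF/MTTR ratio = 10
after  = availability(mtbf_h=120, mttr_h=5)   # MTBF/MTTR ratio = 24

for label, a in [("before", before), ("after", after)]:
    print(f"{label}: A = {a:.3f}, effective capacity = {a * nominal_rate_tph:.0f} t/h")
print(f"capacity gain without new machines: {(after / before - 1) * 100:.1f}%")
```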
Procedia PDF Downloads 365
30154 Gold, Power, Protest: Examining How Digital Media and PGIS are Used to Protest the Mining Industry in Colombia
Authors: Doug Specht
Abstract:
This research project sought to explore the links between digital media, PGIS and social movement organisations in Tolima, Colombia. The primary aim of the research was to examine how knowledge is created and disseminated through digital media and GIS in the region, and whether there exists the infrastructure to allow for this. The second strand was to ascertain if this has had a significant impact on the way grassroots movements work and produce collective actions. The third element is a hypothesis about how digital media and PGIS could play a larger role in activist activities, particularly in reference to the extractive industries. Three theoretical strands have been brought together to provide a basis for this research, namely (a) the politics of knowledge, (b) spatial management and inclusion, and (c) digital media and political engagement. Quantitative data relating to digital media and mobile internet use was collated alongside qualitative data relating to the likelihood of using digital media in activist campaigns, with particular attention being given to grassroots movements working against extractive industries in the Tolima region of Colombia. Through interviews, surveys and GIS analysis it has been possible to build a picture of online activism and the role of PPGIS within protest movements in the region of Tolima, Colombia. Results show a gap between the desires of social movements to use digital media and the skills and finances required to implement programs that utilise it. Maps and GIS are generally reserved for legal cases rather than for informing the lay person. However, it became apparent that the combination of digital/social media and PPGIS could play a significant role in supporting the work of grassroots movements.
Keywords: PGIS, GIS, social media, digital media, mining, colombia, social movements, protest
Procedia PDF Downloads 427
30153 Advanced Simulation and Enhancement for Distributed and Energy Efficient Scheduling for IEEE802.11s Wireless Enhanced Distributed Channel Access Networks
Authors: Fisayo G. Ojo, Shamala K. Subramaniam, Zuriati Ahmad Zukarnain
Abstract:
As technology advances and wireless applications become dependable sources, the physical layer of these applications is being embedded into ever smaller devices, which compounds the problems of energy efficiency and consumption. This paper reviews work done in recent years in wireless applications and distributed computing; we found that applications are becoming dependable and that resources are increasingly allocated and shared with other applications in distributed computing. Applications embedded in distributed systems suffer from problems of power stability and efficiency. In the review, we also show that discrete event simulation has been left largely untouched and has not been adopted in distributed systems as a technique for simulating the scheduling of each event that takes place in the development of distributed computing applications. We shed more light on some techniques and results proposed by other researchers to demonstrate their unsatisfactory results, and to show that more work still has to be done on issues of energy efficiency in wireless applications and congestion in distributed computing.
Keywords: discrete event simulation (DES), distributed computing, energy efficiency (EE), internet of things (IOT), quality of service (QOS), user equipment (UE), wireless mesh network (WMN), wireless sensor network (WSN), worldwide interoperability for microwave access x (WiMAX)
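A minimal sketch of applying discrete event simulation to channel-access scheduling is given below (using the simpy library, an assumption of this example); the timing and energy figures are placeholders and do not model the full IEEE 802.11s EDCA mechanism.

```python
# A minimal discrete event simulation of stations contending for a shared
# channel, tracking per-station energy. The timing and power figures are
# placeholders; this does not model the full IEEE 802.11s EDCA mechanism.
import random
import simpy

TX_TIME, TX_POWER_W, IDLE_POWER_W = 0.002, 1.2, 0.1   # seconds, watts (assumed)

def station(env, name, channel, energy):
    while True:
        yield env.timeout(random.expovariate(50))      # next packet arrival (mean 20 ms)
        arrival = env.now
        with channel.request() as req:                 # contend for the shared channel
            yield req
            energy[name] += IDLE_POWER_W * (env.now - arrival)   # energy spent waiting
            yield env.timeout(TX_TIME)                 # transmit one frame
            energy[name] += TX_POWER_W * TX_TIME

random.seed(1)
env = simpy.Environment()
channel = simpy.Resource(env, capacity=1)
energy = {f"sta{i}": 0.0 for i in range(4)}
for name in energy:
    env.process(station(env, name, channel, energy))
env.run(until=10)                                      # simulate 10 seconds
for name, joules in energy.items():
    print(f"{name}: {joules * 1000:.1f} mJ")
```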
Procedia PDF Downloads 193
30152 Applications of Drones in Infrastructures: Challenges and Opportunities
Authors: Jin Fan, M. Ala Saadeghvaziri
Abstract:
Unmanned aerial vehicles (UAVs), also referred to as drones, equipped with various kinds of advanced detecting or surveying systems, are effective and low-cost in data acquisition, data delivery and sharing, which can benefit the building of infrastructure. This paper will give an overview of the applications of drones in the planning, design, construction and maintenance of infrastructure. The drone platform, detecting and surveying systems, and post-data processing systems will be introduced, followed by cases with details of the applications. Challenges from different aspects will be addressed. The opportunities of drones in infrastructure include, but are not limited to, the following. Firstly, UAVs equipped with high-definition cameras or other detecting equipment are capable of inspecting hard-to-reach infrastructure assets. Secondly, UAVs can be used as effective tools to survey and map the landscape to collect necessary information before infrastructure construction. Furthermore, a single UAV or multiple UAVs are useful in construction management. UAVs can also be used in collecting road and building information by taking high-resolution photos for future infrastructure planning. UAVs can be used to provide reliable and dynamic traffic information, which is potentially helpful in building smart cities. The main challenges are: limited flight time, the robustness of the signal, post-data analysis, multi-drone collaboration, weather conditions, and distractions to traffic caused by drones. This paper aims to help owners, designers, engineers and architects improve the building process of infrastructure for higher efficiency and better performance.
Keywords: bridge, construction, drones, infrastructure, information
Procedia PDF Downloads 124
30151 Feasibility of Washing/Extraction Treatment for the Remediation of Deep-Sea Mining Tailings
Authors: Kyoungrean Kim
Abstract:
The importance of deep-sea mineral resources is dramatically increasing due to the depletion of land mineral resources driven by increasing human economic activity. Korea has acquired exclusive exploration licenses in four areas: the Clarion-Clipperton Fracture Zone in the Pacific Ocean (2002), Tonga (2008), Fiji (2011) and the Indian Ocean (2014). Preparation for commercial mining by Nautilus Minerals (Canada) and Lockheed Martin Minerals (USA) is expected by 2020. The London Protocol 1996 (LP) under the International Maritime Organization (IMO) and the International Seabed Authority (ISA) will set environmental guidelines for deep-sea mining until 2020, to protect the marine environment. In this research, the applicability of washing/extraction treatment for the remediation of deep-sea mining tailings was mainly evaluated in order to present preliminary data for developing a practical remediation technology in the near future. Polymetallic nodule samples were collected at the Clarion-Clipperton Fracture Zone in the Pacific Ocean and then stored at room temperature. Samples were pulverized using a jaw crusher and ball mill and then classified into 3 particle sizes (> 63 µm, 63-20 µm, < 20 µm) using vibratory sieve shakers (Analysette 3 Pro, Fritsch, Germany) with 63 µm and 20 µm sieves. Only the particle size 63-20 µm was used as the sample for investigation, considering the lower limit of the ore dressing process, which is tens to 100 µm. Rhamnolipid and sodium alginate as biosurfactants and aluminum sulfate, which is mainly used as a flocculant, were used as environmentally friendly additives. Samples were adjusted to 2% liquid with deionized water and then mixed with various concentrations of additives. The mixture was stirred with a magnetic bar for specific reaction times and then the liquid phase was separated by a centrifugal separator (Thermo Fisher Scientific, USA) at 4,000 rpm for 1 h. The separated liquid was filtered with a syringe and an acrylic-based filter (0.45 µm). The extracted heavy metals in the filtered liquid were then determined using a UV-Vis spectrometer (DR-5000, Hach, USA) and a heat block (DBR 200, Hach, USA) following US EPA methods (8506, 8009, 10217 and 10220). The polymetallic nodules were mainly composed of manganese (27%), iron (8%), nickel (1.4%), copper (1.3%), cobalt (1.3%) and molybdenum (0.04%). Based on the remediation standards of various countries, nickel (Ni), copper (Cu), cadmium (Cd) and zinc (Zn) were selected as primary target materials. Throughout this research, the use of rhamnolipid was shown to be an effective approach for removing heavy metals from samples originating from manganese nodules. Sodium alginate might also be one of the effective additives for the remediation of deep-sea mining tailings such as polymetallic nodules. Compared to rhamnolipid and sodium alginate, aluminum sulfate was a more effective additive at short reaction times within 4 h. Based on these results, sequencing particle separation, selective extraction/washing, advanced filtration of the liquid phase, water treatment without dewatering, and solidification/stabilization may be considered as candidate technologies for the remediation of deep-sea mining tailings.
Keywords: deep-sea mining tailings, heavy metals, remediation, extraction, additives
Procedia PDF Downloads 157
30150 A Note on Metallurgy at Khanak: An Indus Site in Tosham Mining Area, Haryana
Authors: Ravindra N. Singh, Dheerendra P. Singh
Abstract:
Recent discoveries of Bronze Age artefacts, tin slag, furnaces and crucibles, together with new geological evidence on tin deposits in the Tosham area of Bhiwani district in Haryana (India), provide the opportunity to survey the evidence for possible sources of tin and the use of bronze in the Harappan sites of north-western India. Earlier, Afghanistan emerged as the most promising eastern source of tin utilized by Indus Civilization copper-smiths. Our excavations conducted at Khanak near the Tosham mining area during 2014 and 2016 revealed ample evidence of metallurgical activities, as attested by the occurrence of slag, ores and evidence of ashes and fragments of furnaces, in addition to the bronze objects. We have conducted petrological, XRD, EDAX, TEM, SEM and metallographic analyses on the slag, ore, crucible fragment and bronze object samples recovered from the Khanak excavations. This has given a positive indication of the mining and metallurgy of polymetallic tin at the site; however, it can only be ascertained after the detailed scientific examination of the materials, which is underway. In view of the importance of the site, we intend to excavate it horizontally in future so as to obtain more samples for scientific studies.
Keywords: archaeometallurgy, problem of tin, metallography, indus civilization
Procedia PDF Downloads 301
30149 A Case Study: Beginning Teacher's Experiences of Mentoring in Secondary Education
Authors: Abdul Rofiq Badril Rizal M. Z.
Abstract:
This case study examines the experiences of four beginning teachers currently working in New South Wales secondary schools. Data were collected from semi-structured interviews conducted one-on-one over the period of one month. The data were coded, with findings reported through key areas of discovery linked to the research presented in the literature review. The participants involved in the case study all reported positive experiences with mentoring, though none were given the opportunity to take part in a formal mentoring program, and all the mentors offered their time voluntarily. The mentoring took different forms, but the support most valued by the participants was the emotional and curriculum-related support received. All participants wished they had greater access to mentoring and felt it would have benefits for most beginning teachers. The study highlights ongoing issues around the lack of access to mentoring, which could be due to factors such as funding, time and training.
Keywords: mentor, mentee, pre-service teacher, beginning teacher
Procedia PDF Downloads 108