Search results for: sentiment mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1214


224 Petrology Investigation of Apatite Minerals in the Esfordi Mine

Authors: Haleh Rezaei Zanjirabadi, Fatemeh Saberi, Bahman Rahimzadeh, Fariborz Masoudi, Mohammad Rahgosha

Abstract:

In this study, apatite minerals from the iron-phosphate deposit of Yazd were investigated within the microcontinental zone of Iran in the Zagros structural zone. The geological units in the Esfordi area belong to the Precambrian to Lower Cambrian age and consist of a succession of carbonate rocks (dolomite), shale, tuff, sandstone, and volcanic rocks. In addition to these sedimentary and volcanic rocks, the Bahabad granitoid mass, the largest intrusive body in the region, has intruded into the eastern part of this series and caused its metamorphism and alteration. After collecting the available data, various samples of Esfordi’s apatite were prepared, and their mineralogy and crystallography were investigated using laboratory methods such as petrographic microscopy, Raman spectroscopy, EDS, and SEM. Non-destructive Raman spectroscopy revealed the molecular structure of the apatite minerals in four distinct spectral ranges. Initially, the spectra of phosphate and aluminum bonds with O2HO and OH were observed, followed by the identification of Cl, OH, Al, Na, Ca, and hydroxyl units, depending on the apatite mineral family. In SEM analysis, based on the various shapes and phases of the apatites, their major constituent elements were identified through EDS, indicating that the samples from the Esfordi mining area exhibit a dense, coherent texture with smooth surfaces. Based on the EDS elemental analysis results, the apatites of the Esfordi area are classified in the calcic apatite group.

Keywords: petrology, apatite, Esfordi, EDS, SEM, Raman spectroscopy

Procedia PDF Downloads 29
223 Graph-Based Semantical Extractive Text Analysis

Authors: Mina Samizadeh

Abstract:

In the past few decades, there has been an explosion in the amount of data produced from various sources on different topics. The availability of this enormous amount of data makes effective computational tools necessary to explore it, which has led to intense and growing interest in the research community in developing computational methods for processing such text data. One line of study focuses on condensing text so that a higher-level understanding can be reached in a shorter time. The two key tasks here are keyword extraction and text summarization. In keyword extraction, we are interested in finding the most important words in a text, which acquaints us with its general topic. In text summarization, we are interested in producing a short text that includes the important information in the document. The TextRank algorithm, an unsupervised method extending PageRank (the algorithm underlying the Google search engine's page ranking), has shown its efficacy in large-scale text mining, especially for text summarization and keyword extraction. It can automatically extract the important parts of a text (keywords or sentences) and return them as the result. However, it neglects the semantic similarity between different parts of the text. In this work, we improve the results of the TextRank algorithm by incorporating the semantic similarity between parts of the text. Aside from keyword extraction and text summarization, we develop a topic clustering algorithm based on our framework, which can be used individually or as part of generating the summary to overcome coverage problems.
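
As a rough sketch of the baseline, a TextRank-style keyword extractor builds a word co-occurrence graph and runs a PageRank-style iteration over it; the window size, damping factor, and toy sentence below are illustrative, and the authors' semantic extension would additionally weight the edges by semantic similarity:

```python
from collections import defaultdict
from itertools import combinations

def textrank_keywords(words, window=2, damping=0.85, iters=50):
    """Rank words by a PageRank-style iteration over a co-occurrence graph."""
    # Build an undirected co-occurrence graph over a sliding window.
    graph = defaultdict(set)
    for i in range(len(words) - window + 1):
        for a, b in combinations(words[i:i + window], 2):
            if a != b:
                graph[a].add(b)
                graph[b].add(a)
    # Iterate the PageRank update until the scores stabilise.
    scores = {w: 1.0 for w in graph}
    for _ in range(iters):
        scores = {w: (1 - damping) + damping *
                     sum(scores[n] / len(graph[n]) for n in graph[w])
                  for w in graph}
    return sorted(scores, key=scores.get, reverse=True)

tokens = "graph based text analysis ranks words by graph structure".split()
print(textrank_keywords(tokens)[0])  # the most connected word ranks first
```

In a full pipeline the same iteration runs over sentence nodes (with overlap-based edges) to pick summary sentences.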

Keywords: keyword extraction, n-gram extraction, text summarization, topic clustering, semantic analysis

Procedia PDF Downloads 45
222 Mitigation Measures for the Acid Mine Drainage Emanating from the Sabie Goldfield: Case Study of the Nestor Mine

Authors: Rudzani Lusunzi, Frans Waanders, Elvis Fosso-Kankeu, Robert Khashane Netshitungulwana

Abstract:

The Sabie Goldfield has a history of gold mining dating back more than a century. Acid mine drainage (AMD) from the Nestor mine tailings storage facility (MTSF) poses a serious threat to the nearby ecosystem, specifically the Sabie River system. This study aims to develop mitigation measures for the AMD emanating from the Nestor MTSF using materials from the Glynns Lydenburg MTSF. The Nestor MTSF (NM) and the Glynns Lydenburg MTSF (GM) each provided about 20 kg of bulk composite sample, from which two mixtures were created: MIX-A contains 25 wt% GM and 75 wt% NM, and MIX-B contains 50 wt% GM and 50 wt% NM. The same static tests, i.e., acid-base accounting (ABA), net acid generation (NAG), and the acid buffering characteristics curve (ABCC), were used to estimate the acid-generating probabilities of samples NM and GM as well as of MIX-A and MIX-B. Mineralogically, the Nestor MTSF samples contain the primary acid-producing mineral pyrite as well as the secondary minerals ferricopiapite and jarosite, which are common under acidic conditions. The Glynns Lydenburg MTSF samples, on the other hand, contain the primary acid-neutralizing minerals calcite and dolomite. Based on the assessment conducted, materials from the Glynns Lydenburg MTSF are capable of neutralizing AMD from the Nestor MTSF. The alkaline tailings materials from the Glynns Lydenburg MTSF can therefore be used to rehabilitate the acidic Nestor MTSF.
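
The blending logic behind MIX-A and MIX-B can be illustrated with a mass-weighted net neutralisation potential (NNP = NP − AP, in kg CaCO3/t); the static-test values below are hypothetical placeholders for illustration, not the study's measurements:

```python
def blend_nnp(fractions_and_samples):
    """Net neutralisation potential of a tailings blend, as the
    mass-weighted average of each component's NP - AP."""
    return sum(f * (np_ - ap) for f, (np_, ap) in fractions_and_samples)

# Hypothetical static-test values (NP, AP) in kg CaCO3/t, illustrative only:
NM = (2.0, 60.0)   # acid-generating Nestor tailings
GM = (90.0, 5.0)   # carbonate-rich Glynns Lydenburg tailings

mix_a = blend_nnp([(0.25, GM), (0.75, NM)])  # 25 wt% GM, 75 wt% NM
mix_b = blend_nnp([(0.50, GM), (0.50, NM)])  # 50 wt% GM, 50 wt% NM
print(mix_a, mix_b)  # a higher GM fraction pushes the blend towards net-neutralising
```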

Keywords: Nestor Mine, acid mine drainage, mitigation, Sabie River system

Procedia PDF Downloads 52
221 Light-Weight Network for Real-Time Pose Estimation

Authors: Jianghao Hu, Hongyu Wang

Abstract:

An effective and efficient human pose estimation algorithm is essential for real-time pose estimation on mobile devices. This paper proposes a light-weight human key-point detection algorithm, Light-Weight Network for Real-Time Pose Estimation (LWPE). LWPE uses a light-weight backbone network and depthwise separable convolutions to reduce parameters and lower latency. It uses a feature pyramid network (FPN) to fuse high-resolution, semantically weak features with low-resolution, semantically strong features. With multi-scale prediction, the result predicted from a low-resolution feature map is stacked onto the adjacent higher-resolution feature map, providing intermediate supervision of the network and continuously refining the results. In the last step, the key-point coordinates predicted at the highest resolution are used as the final output of the network. For key points that are difficult to predict, LWPE adopts an online hard key-point mining strategy to focus on them. The proposed algorithm achieves excellent performance on the single-person dataset selected from the AI (artificial intelligence) challenge dataset. It maintains high-precision performance even though the model contains only 3.9M parameters, and it runs at 225 frames per second (FPS) on a generic graphics processing unit (GPU).
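
The parameter saving from depthwise separable convolutions can be checked with simple arithmetic; the layer sizes below are illustrative, not LWPE's actual configuration:

```python
def conv_params(c_in, c_out, k):
    """Weights in a standard k x k convolution (biases ignored)."""
    return k * k * c_in * c_out

def dw_separable_params(c_in, c_out, k):
    """Depthwise k x k filter per input channel, then a 1x1 pointwise
    projection to c_out channels."""
    return k * k * c_in + c_in * c_out

# Illustrative 3x3 layer with 128 input and 128 output channels:
std = conv_params(128, 128, 3)
sep = dw_separable_params(128, 128, 3)
print(std, sep, round(std / sep, 1))  # roughly an 8x parameter reduction
```

The same factorisation also cuts multiply-accumulate operations by a similar ratio, which is where the latency saving comes from.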

Keywords: depthwise separable convolutions, feature pyramid network, human pose estimation, light-weight backbone

Procedia PDF Downloads 122
220 Assessment of the Effect of Cu and Zn on the Growth of Two Chlorophytic Microalgae

Authors: Medina O. Kadiri, John E. Gabriel

Abstract:

Heavy metals are metallic elements with a relatively high density, at least five times that of water. The sources of heavy metal pollution in the environment include industrial, medical, agricultural, pharmaceutical, and domestic effluents, atmospheric deposition, mining, foundries, smelting, and other heavy metal-based operations. Although some heavy metals are required in trace quantities for biological metabolism, higher concentrations elicit toxicity; others are distinctly toxic and have no biological function. Microalgae are the primary producers of aquatic ecosystems and therefore the foundation of the aquatic food chain. A laboratory study investigated the effects of copper and zinc on two chlorophytes, Chlorella vulgaris and Dictyosphaerium pulchellum, at concentrations of 0 mg/l, 2 mg/l, 4 mg/l, 6 mg/l, 8 mg/l, 10 mg/l, and 20 mg/l. The growth of the test microalgae was determined every two days for 14 days. The results showed that the effects of the test heavy metals were concentration-dependent. Of the two species tested, Chlorella vulgaris showed appreciable growth up to an 8 mg/l concentration of zinc. Dictyosphaerium pulchellum showed only minimal growth at the different copper concentrations except 2 mg/l, at which growth was relatively higher. Growth in the control was remarkably higher than at the other concentrations. Generally, the growth of both test algae was consistently inhibited by the heavy metals, and copper inhibited growth more strongly than zinc. Chlorella vulgaris could be used for the bioremediation of high concentrations of zinc, and the potential of many other microalgae for heavy metal bioremediation can be explored.

Keywords: heavy metals, green algae, microalgae, pollution

Procedia PDF Downloads 167
219 An Automated Approach to the Nozzle Configuration of Polycrystalline Diamond Compact Drill Bits for Effective Cuttings Removal

Authors: R. Suresh, Pavan Kumar Nimmagadda, Ming Zo Tan, Shane Hart, Sharp Ugwuocha

Abstract:

Polycrystalline diamond compact (PDC) drill bits are extensively used in the oil and gas industry as well as the mining industry. Industry engineers continually improve PDC drill bit designs and hydraulic conditions, and optimized injection nozzles play a key role in improving the drilling performance and efficiency of these ever-changing bits. In the first part of this study, computational fluid dynamics (CFD) modelling is performed to investigate the hydrodynamic characteristics of drilling fluid flow around the PDC drill bit. An open-source CFD package, OpenFOAM, simulates the flow around the drill bit based on the field input data. A specifically developed console application integrates the entire CFD process, including domain extraction, meshing, solving of the governing equations, and post-processing. The results from the OpenFOAM solver are then compared with those from the ANSYS Fluent software, and the data from the two programs agree. The second part of the paper describes a parametric study of the PDC drill bit nozzles to determine the effect of parameters such as the number of nozzles, nozzle velocity, nozzle radial position, and orientation on the flow field characteristics and bit washing patterns. After analyzing a series of nozzle configurations, the best configuration is identified and recommendations are made for modifying the PDC bit design.
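
A console application driving such a parametric study might begin by enumerating the nozzle design space before handing each case to the CFD solver; the parameter names and levels below are illustrative, not those used in the study:

```python
from itertools import product

# Hypothetical parameter levels for the nozzle study (illustrative values):
n_nozzles   = [5, 6, 7]
velocity    = [80.0, 100.0, 120.0]   # jet velocity, m/s
radial_pos  = [0.3, 0.5, 0.7]        # fraction of bit radius
orientation = [10, 20, 30]           # degrees from the bit axis

# Full-factorial case list; each dict would parameterise one CFD run.
cases = [dict(zip(("n", "v", "r", "theta"), combo))
         for combo in product(n_nozzles, velocity, radial_pos, orientation)]
print(len(cases))  # 3^4 = 81 candidate configurations
```

In practice the case list would be filtered by design constraints before meshing and solving, since 81 full CFD runs is already a substantial campaign.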

Keywords: ANSYS Fluent, computational fluid dynamics, nozzle configuration, OpenFOAM, PDC drill bit

Procedia PDF Downloads 394
218 Cultivation of High-value Patent from the Perspective of Knowledge Diffusion: A Case Study of the Power Semiconductor Field

Authors: Lin Qing

Abstract:

[Objective/Significance] The cultivation of high-value patents is both the focus and the difficulty of patent work, and it is of great significance to building a nation strong in intellectual property rights. This work should not only attend to existing patent applications but also begin before application, by identifying the high-value technical solutions that form the core of high-value patents. [Methods/Processes] Following the principle of scientific and technological knowledge diffusion, this study examines top academic conference papers and the patent applications that cite them, taking the power semiconductor field as an example. Factual data demonstrate the feasibility and rationality of mining technical solutions from high-quality research results to cultivate high-value patents; the actual benefits of these achievements to the industry are stated, and patent protection suggestions are given for Chinese applicants in comparison with the state of the field. [Results/Conclusion] The research shows that the quality of the citing applications of ISPSD papers is significantly higher than the field average, and that the ability of Chinese applicants to protect related achievements with patents needs to be improved. This study provides a practical and highly targeted reference for patent administrators and researchers, and it is also a positive exploration in practicing the spirit of "breaking the five rules".

Keywords: high-value patents cultivation, technical solutions, knowledge diffusion, top academic conference papers, intellectual property information analysis

Procedia PDF Downloads 99
217 Big Data in Construction Project Management: The Colombian Northeast Case

Authors: Sergio Zabala-Vargas, Miguel Jiménez-Barrera, Luz Vargas-Sánchez

Abstract:

In recent years, information related to project management in organizations has been increasing exponentially. Performance data, management statistics, and indicator results have made collection, analysis, traceability, and dissemination essential tasks for project managers. In this sense, current trends facilitate efficient decision-making through emerging technologies such as machine learning, data analytics, data mining, and big data; the latter is the focus of this project. This research is part of the thematic line of construction methods and project management. Many authors note the relevance that emerging technologies such as big data have gained in recent years in project management in the construction sector, with the main focus on optimizing time, scope, and budget and, in general, on mitigating risks. This research was developed in the northeastern region of Colombia, South America. The first phase was aimed at diagnosing the use of emerging technologies (big data) in the construction sector. In Colombia, the construction sector represents more than 50% of the productive system, and more than 2 million people participate in this economic segment. A quantitative approach was used: a survey was applied to a sample of 91 companies in the construction sector. Preliminary results indicate that the use of big data and other emerging technologies is very low, and also that there is interest in modernizing project management. There is evidence of a correlation between interest in using new data management technologies and the incorporation of Building Information Modeling (BIM). The next phase of the research will allow the generation of guidelines and strategies for the incorporation of technological tools in the construction sector in Colombia.

Keywords: big data, building information modeling, technology, project management

Procedia PDF Downloads 101
216 Smart in Performance: More to Practical Life than Hardware and Software

Authors: Faten Hatem

Abstract:

This paper promotes the importance of focusing on the spatial aspects and affective factors that impact smart urbanism. This helps to better inform city governance, spatial planning, and policymaking to focus on what Smart does and what it can achieve for cities in terms of performance, rather than on using the notion for prestige in a worldwide trend towards becoming a smart city. By illustrating how this style of practice compromises the social aspects and related elements of space making, through an interdisciplinary comparative approach, the paper clarifies the impact of this compromise on overall smart city performance. In response, this paper recognizes the importance of establishing a new meaning for urban progress by moving beyond improving the basic services of the city to enhancing the actual human experience, which is essential for the development of authentic smart cities. The topic is presented under five overlooked areas that discuss the relations between smart cities' potential and the efficiency paradox, the social aspect, connectedness with nature, the human factor, and untapped resources. These themes are not meant to be discussed in silos; instead, they are presented to collectively examine smart cities in performance, arguing that there is more to the practical life of smart cities than software and hardware inventions. The study is based on a case study approach, presenting Milton Keynes as a living example to learn from while engaging various methods of data collection, including multi-disciplinary semi-structured interviews, field observations, and data mining.

Keywords: smart design, the human in the city, human needs and urban planning, sustainability, smart cities, smart

Procedia PDF Downloads 65
215 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores

Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay

Abstract:

Automated product recognition in retail stores is an important real-world application in the domain of computer vision and pattern recognition. In this paper, we consider the problem of automatically identifying the classes of products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following an online hard-negative-mining strategy for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
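
The online hard-negative-mining idea can be sketched as picking, for each anchor, the closest in-batch negative when computing the triplet loss; the toy 2-D embeddings and margin below are illustrative, not the paper's actual training setup:

```python
import math

def dist(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def hard_triplet_loss(anchor, positive, negatives, margin=0.3):
    """Triplet loss using the hardest (closest-to-anchor) negative
    in the batch, the core of online hard negative mining."""
    hardest = min(negatives, key=lambda n: dist(anchor, n))
    return max(0.0, dist(anchor, positive) - dist(anchor, hardest) + margin)

anchor    = [1.0, 0.0]              # embedding of a gallery crop
positive  = [0.9, 0.1]              # same product class
negatives = [[0.0, 1.0], [0.8, 0.3]]  # the second one is the hard negative
loss = hard_triplet_loss(anchor, positive, negatives)
```

Only the hardest negative contributes a gradient, which concentrates training effort on visually similar but wrong products.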

Keywords: retail stores, faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition

Procedia PDF Downloads 117
214 Consequential Effects of Coal Utilization on Urban Water Supply Sources – a Study of Ajali River in Enugu State Nigeria

Authors: Enebe Christian Chukwudi

Abstract:

Water bodies around the world, notably groundwater, rivers, streams, and seas, face degradation of their water quality as a result of activities associated with coal utilization, including coal mining, coal processing, coal burning, waste storage, and thermal pollution from coal plants, all of which tend to contaminate these water bodies. This contamination results from heavy metals, sulphate and iron, dissolved solids, mercury, and other toxins contained in coal ash, sludge, and coal waste. These wastes sometimes find their way into sources of urban water supply and contaminate them. A major problem encountered in the supply of potable water to the Enugu municipality is the contamination of the Ajali River, the source of the municipal water supply, by coal waste. Hydrogeochemical analysis of Ajali water samples indicates high sulphate and iron content, high total dissolved solids (TDS), low pH (acidic values), and significant hardness, in addition to the presence of heavy metals, mercury, and other toxins. The following remedial measures are indicated: I. proper disposal of mine wastes at suitably prepared, designated disposal sites; II. proper water treatment; and III. reduction of coal-related contaminants by taking advantage of clean coal technology.

Keywords: effects, coal, utilization, water quality, sources, waste, contamination, treatment

Procedia PDF Downloads 398
213 Comparative Study Using WEKA for Red Blood Cells Classification

Authors: Jameela Ali, Hamid A. Jalab, Loay E. George, Abdul Rahim Ahmad, Azizah Suliman, Karim Al-Jashamy

Abstract:

Red blood cells (RBC) are the most common type of blood cell and are the most intensively studied in cell biology. A lack of RBCs is a condition in which the hemoglobin level is lower than normal and is referred to as "anemia". Abnormalities in RBCs affect the exchange of oxygen. This paper presents a comparative study of various techniques for classifying RBCs as normal or abnormal (anemic) using WEKA. WEKA is an open-source workbench consisting of different machine learning algorithms for data mining applications. The algorithms tested are the radial basis function neural network, the support vector machine, and the k-nearest neighbors algorithm. Two sets of combined features were utilized for the classification of blood cell images. The first set, consisting exclusively of geometrical features, was used to identify whether the tested blood cell has a spherical or non-spherical shape, while the second set, consisting mainly of textural features, was used to recognize the types of the spherical cells. We provide an evaluation based on applying these classification methods to our RBC image dataset, which was obtained from Serdang Hospital, Malaysia, and measuring the accuracy of the test results. The best achieved classification rates are 97%, 98%, and 79% for the support vector machine, radial basis function neural network, and k-nearest neighbors algorithm, respectively.
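
For illustration, the k-nearest neighbors classifier (one of the three methods compared) reduces to a majority vote over the nearest feature vectors; the toy geometric features and labels below stand in for those extracted from real RBC images:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify a feature vector by majority vote of its k nearest
    neighbours in the training set."""
    neighbours = sorted(train, key=lambda t: math.dist(t[0], query))[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

# Toy (circularity, area-ratio) features, illustrative only:
train = [((0.95, 1.00), "normal"), ((0.92, 0.98), "normal"),
         ((0.60, 0.70), "anemic"), ((0.55, 0.65), "anemic")]
print(knn_predict(train, (0.90, 0.97)))  # a round, full cell votes "normal"
```

WEKA's IBk classifier implements this same scheme with configurable k and distance weighting.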

Keywords: K-nearest neighbors algorithm, radial basis function neural network, red blood cells, support vector machine

Procedia PDF Downloads 374
212 Gas Phase Extraction: An Environmentally Sustainable and Effective Method for The Extraction and Recovery of Metal from Ores

Authors: Kolela J Nyembwe, Darlington C. Ashiegbu, Herman J. Potgieter

Abstract:

Over the past few decades, the demand for metals has increased significantly. This has led to a decline in high-grade ore over time and an increase in mineral complexity and matrix heterogeneity, alongside rising concerns about greener processes and a sustainable environment. Due to these challenges, the mining and metal industry has been forced to develop new technologies that can economically process and recover metallic values from low-grade ores, from materials with metal content locked up in industrially processed residues (tailings and slag), and from complex-matrix mineral deposits. Several methods have been developed to address these issues, among them ionic liquids (ILs), heap leaching, and bioleaching. Recently, the gas-phase extraction technique has been gaining interest because it eliminates many of the problems encountered in conventional mineral processing methods. The technique relies on the formation of volatile metal complexes, which can be removed from the residual solids by a carrier gas. The complexes can then be reduced by an appropriate method to obtain the metal and to regenerate and recover the organic extractant. Laboratory work on gas-phase extraction has been conducted for the extraction and recovery of aluminium (Al), iron (Fe), copper (Cu), chromium (Cr), nickel (Ni), lead (Pb), and vanadium (V). In all cases, the extraction was found to depend on temperature and mineral surface area. The process technology appears very promising, offers the feasibility of recirculation and organic reagent regeneration, and has the potential to deliver on all the promises of a "greener" process.

Keywords: gas-phase extraction, hydrometallurgy, low-grade ore, sustainable environment

Procedia PDF Downloads 98
211 Network Analysis of Genes Involved in the Biosynthesis of Medicinally Important Naphthodianthrone Derivatives of Hypericum perforatum

Authors: Nafiseh Noormohammadi, Ahmad Sobhani Najafabadi

Abstract:

Hypericins (hypericin and pseudohypericin) are natural naphthodianthrone derivatives produced by Hypericum perforatum (St. John’s Wort) that have many medicinal properties, such as antitumor, antineoplastic, antiviral, and antidepressant activities. The production and accumulation of hypericin in the plant are influenced by both genetic and environmental conditions. Despite the existence of various high-throughput data on the plant, the genetic dimensions of hypericin biosynthesis have not yet been completely understood. In this research, 21 high-quality RNA-seq data sets from different parts of the plant were integrated with metabolic data to reconstruct a coexpression network. The results showed that a cluster of 30 transcripts was correlated with total hypericin. The identified transcripts fell into three main functional groups: hypericin biosynthesis genes, transporters and detoxification genes, and transcription factors (TFs). In the biosynthetic group, different isoforms of polyketide synthases (PKSs) and phenolic oxidative coupling proteins (POCPs) were identified. Phylogenetic analysis of the protein sequences, integrated with gene expression analysis, showed that some of the POCPs appear to be very important in the hypericin biosynthetic pathway. In the TF group, six TFs were correlated with total hypericin, and qPCR analysis confirmed that three of them were highly correlated. The genes identified in this research are a rich resource for further studies on the molecular breeding of H. perforatum to obtain varieties with high hypericin production.
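
A coexpression network of this kind is typically built by thresholding pairwise Pearson correlations between gene expression profiles and the metabolite level; the gene names, profiles, and threshold below are illustrative, not the study's RNA-seq data:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy expression profiles across four samples (illustrative only):
profiles = {
    "PKS1":  [1.0, 2.0, 3.0, 4.0],
    "POCP3": [1.1, 2.1, 2.9, 4.2],   # tracks hypericin closely
    "TF_A":  [4.0, 1.0, 3.5, 0.5],   # unrelated profile
}
hypericin = [0.9, 2.0, 3.1, 4.0]     # toy total-hypericin levels

# Genes whose profile correlates with hypericin above the threshold:
correlated = [g for g, p in profiles.items() if pearson(p, hypericin) > 0.9]
print(correlated)
```

Edges between gene pairs are formed the same way, and the resulting graph is clustered to find modules like the 30-transcript cluster reported here.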

Keywords: hypericin, St. John’s Wort, data mining, transcription factors, secondary metabolites

Procedia PDF Downloads 55
210 A Fast Community Detection Algorithm

Authors: Chung-Yuan Huang, Yu-Hsiang Fu, Chuen-Tsai Sun

Abstract:

Community detection represents an important data-mining tool for analyzing and understanding real-world complex network structures and functions. We believe that at least four criteria determine the appropriateness of a community detection algorithm: (a) it produces useable normalized mutual information (NMI) and modularity results for social networks, (b) it overcomes resolution limitation problems associated with synthetic networks, (c) it produces good NMI results and performance efficiency for Lancichinetti-Fortunato-Radicchi (LFR) benchmark networks, and (d) it produces good modularity and performance efficiency for large-scale real-world complex networks. To our knowledge, no existing community detection algorithm meets all four criteria. In this paper, we describe a simple hierarchical arc-merging (HAM) algorithm that uses network topologies and rule-based arc-merging strategies to identify community structures that satisfy the criteria. We used five well-studied social network datasets and eight sets of LFR benchmark networks to validate the ground-truth community correctness of HAM, eight large-scale real-world complex networks to measure its performance efficiency, and two synthetic networks to determine its susceptibility to resolution limitation problems. Our results indicate that the proposed HAM algorithm is capable of providing satisfactory performance efficiency and that HAM-identified communities were close to ground-truth communities in social and LFR benchmark networks while overcoming resolution limitation problems.
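
The modularity criterion used to evaluate such algorithms can be computed directly from Newman's definition, Q = (1/2m) Σᵢⱼ [Aᵢⱼ − kᵢkⱼ/2m] δ(cᵢ, cⱼ); the small two-community graph below is illustrative:

```python
def modularity(edges, community):
    """Newman modularity Q of a node->community assignment for an
    undirected, unweighted edge list."""
    m = len(edges)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    # Fraction of edges falling within communities.
    within = sum(1 for u, v in edges if community[u] == community[v]) / m
    # Expected such fraction under the degree-preserving null model.
    expected = sum(degree[u] * degree[v]
                   for u in degree for v in degree
                   if community[u] == community[v]) / (4 * m * m)
    return within - expected

# Two triangles joined by one bridge edge (c-d):
edges = [("a", "b"), ("b", "c"), ("a", "c"),
         ("c", "d"),
         ("d", "e"), ("e", "f"), ("d", "f")]
community = {"a": 0, "b": 0, "c": 0, "d": 1, "e": 1, "f": 1}
Q = modularity(edges, community)
```

Merging-based heuristics such as the HAM algorithm described here greedily grow communities while tracking a quality score of this kind.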

Keywords: complex network, social network, community detection, network hierarchy

Procedia PDF Downloads 192
209 Predictive Analytics Algorithms: Mitigating Elementary School Drop Out Rates

Authors: Bongs Lainjo

Abstract:

Educational institutions and the authorities mandated to run education systems in various countries need to implement curricula that take account of the possibility and existence of elementary school dropouts. This research focuses on elementary school dropout rates and on the ability to replicate, for selected elementary schools, various predictive models applied globally. The study was carried out by comparing classical case studies in Africa, North America, South America, Asia, and Europe. Among the reasons put forward for children dropping out is the notion of being successful in life without necessarily going through the education process, a mentality coupled with a tough curriculum that does not take care of all students. This has led to poor school attendance (truancy), which continually leads to dropouts. This study focuses on developing a model that school administrations can systematically implement to prevent possible dropout scenarios. At the elementary level, especially in the lower grades, a child's perception of education can be changed relatively easily so that they focus on the better future their parents desire. To deal effectively with the elementary school dropout problem, the strategies that are put in place need to be studied, and predictive models should be installed in every educational system with a view to helping prevent an imminent dropout just before it happens. In the competency-based curricula that most advanced nations are trying to implement, education systems adopt wholesome ideas of learning that reduce the dropout rate.
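
One common form for such a predictive model is a logistic risk score over standardized risk factors; the feature names, weights, and inputs below are purely illustrative, not fitted to any dataset discussed in the study:

```python
import math

def dropout_risk(features, weights, bias):
    """Logistic risk score in [0, 1] from standardized risk factors,
    e.g. absences, grade trend, household income (illustrative only)."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical weights: more absences raise risk; better grades and
# higher household income lower it.
weights = [1.2, -0.8, -0.5]
bias = -1.0

high = dropout_risk([2.0, -1.0, -0.5], weights, bias)  # many absences, falling grades
low  = dropout_risk([-0.5, 1.0, 0.8], weights, bias)   # good attendance and grades
```

An administration would flag pupils whose score crosses a chosen threshold for early intervention, which is exactly the "predict just before it happens" use the abstract describes.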

Keywords: elementary school, predictive models, machine learning, risk factors, data mining, classifiers, dropout rates, education system, competency-based curriculum

Procedia PDF Downloads 141
208 Using Data Mining in Automotive Safety

Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler

Abstract:

Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupants in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance. Sled tests and other types of tests are then carried out by car makers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on the data emerging from sled tests according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts' opinions. A procedure is proposed to manage missing data and is validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive safety system before testing, with the aim of reducing the number of tests.
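
A simple baseline for the missing-data step is mean imputation over the observed sled-test parameters; this sketch illustrates the idea only and is not the authors' validated procedure, and the toy values below are placeholders:

```python
def impute_means(records):
    """Replace None entries with the column mean computed over the
    observed rows, a common KDD preprocessing baseline."""
    n_cols = len(records[0])
    means = []
    for j in range(n_cols):
        observed = [r[j] for r in records if r[j] is not None]
        means.append(sum(observed) / len(observed))
    return [[means[j] if r[j] is None else r[j] for j in range(n_cols)]
            for r in records]

# Toy sled-test parameter rows with gaps (illustrative values):
data = [[35.0, 1.2, None],
        [40.0, None, 0.8],
        [45.0, 1.0, 0.6]]
filled = impute_means(data)
```

More careful schemes (regression or model-based imputation) follow the same pattern of estimating each gap from the observed distribution, which is why validation on real data sets matters.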

Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact

Procedia PDF Downloads 354
207 A Comparative Study for Various Techniques Using WEKA for Red Blood Cells Classification

Authors: Jameela Ali, Hamid A. Jalab, Loay E. George, Abdul Rahim Ahmad, Azizah Suliman, Karim Al-Jashamy

Abstract:

Red blood cells (RBC) are the most common type of blood cell and are the most intensively studied in cell biology. A lack of RBCs is a condition in which the hemoglobin level is lower than normal and is referred to as "anemia". Abnormalities in RBCs affect the exchange of oxygen. This paper presents a comparative study of various techniques for classifying red blood cells as normal or abnormal (anemic) using WEKA. WEKA is an open-source workbench consisting of different machine learning algorithms for data mining applications. The algorithms tested are the radial basis function neural network, the support vector machine, and the k-nearest neighbors algorithm. Two sets of combined features were utilized for the classification of blood cell images. The first set, consisting exclusively of geometrical features, was used to identify whether the tested blood cell has a spherical or non-spherical shape, while the second set, consisting mainly of textural features, was used to recognize the types of the spherical cells. We provide an evaluation based on applying these classification methods to our RBC image dataset, which was obtained from Serdang Hospital, Malaysia, and measuring the accuracy of the test results. The best achieved classification rates are 97%, 98%, and 79% for the support vector machine, radial basis function neural network, and k-nearest neighbors algorithm, respectively.

Keywords: red blood cells, classification, radial basis function neural networks, support vector machine, k-nearest neighbors algorithm

Procedia PDF Downloads 439
206 Efficient Fuzzy Classified Cryptographic Model for Intelligent Encryption Technique towards E-Banking XML Transactions

Authors: Maher Aburrous, Adel Khelifi, Manar Abu Talib

Abstract:

Transactions performed by financial institutions on a daily basis require XML encryption on a large scale. Encrypting large volumes of messages in full results in both performance and resource issues. In this paper, a novel approach is presented for securing financial XML transactions using classification data mining (DM) algorithms. Our strategy defines the complete process of classifying XML transactions using a set of classification algorithms; the classified XML documents are then processed at a later stage using element-wise encryption. Classification algorithms were used to identify the XML transaction rules and factors in order to classify the message content and fetch the important elements within it. We implemented four classification algorithms to determine the importance level value within each XML document. The classified content is processed using element-wise encryption for the selected parts with "High", "Medium" or "Low" importance level values. Element-wise encryption is performed using the AES symmetric encryption algorithm and a proposed modified AES algorithm to overcome the problem of computational overhead, in which the substitute-byte and shift-row steps remain as in the original AES, while the mix-column operation is replaced by a 128 permutation operation followed by the add-round-key operation. An implementation has been conducted using a data set fetched from an e-banking service to demonstrate system functionality and efficiency. Results from our implementation showed a clear improvement in processing time when encrypting XML documents.
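A hedged sketch of the element-wise selection logic: walk an XML transaction and encrypt only the elements classified as High or Medium importance. The cipher here is a trivial placeholder (base64 of a byte-wise XOR) standing in for AES, and the element names and importance labels are hypothetical, not the paper's classification output.

```python
# Sketch of importance-driven element-wise encryption. toy_encrypt is a
# placeholder, NOT AES; it only marks which elements would be ciphered.
import base64
import xml.etree.ElementTree as ET

def toy_encrypt(text, key=0x5A):
    """Placeholder cipher: XOR each byte with `key`, then base64-encode."""
    return base64.b64encode(bytes(b ^ key for b in text.encode())).decode()

def encrypt_important(xml_text, importance):
    """Encrypt text of elements whose tag is classified High or Medium."""
    root = ET.fromstring(xml_text)
    for elem in root.iter():
        if importance.get(elem.tag) in ("High", "Medium") and elem.text:
            elem.text = toy_encrypt(elem.text)
    return ET.tostring(root, encoding="unicode")

# Hypothetical transaction and classification result
doc = "<txn><account>12345</account><amount>9.99</amount><memo>rent</memo></txn>"
levels = {"account": "High", "amount": "Medium", "memo": "Low"}
out = encrypt_important(doc, levels)
```

Encrypting only the sensitive subset is precisely where the performance gain over whole-document encryption comes from.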

Keywords: XML transaction, encryption, Advanced Encryption Standard (AES), XML classification, e-banking security, fuzzy classification, cryptography, intelligent encryption

Procedia PDF Downloads 378
205 Analyzing Factors Impacting COVID-19 Vaccination Rates

Authors: Dongseok Cho, Mitchell Driedger, Sera Han, Noman Khan, Mohammed Elmorsy, Mohamad El-Hajj

Abstract:

Since the approval of the COVID-19 vaccines in late 2020, vaccination rates have varied around the globe. Access to a vaccine supply, mandated vaccination policy, and vaccine hesitancy all contribute to these rates. This study used COVID-19 vaccination data from Our World in Data and the Multilateral Leaders Task Force on COVID-19 to create two COVID-19 vaccination indices. The first index is the Vaccine Utilization Index (VUI), which measures how effectively each country has utilized its vaccine supply to doubly vaccinate its population. The second index is the Vaccination Acceleration Index (VAI), which evaluates how efficiently each country vaccinated its population within its first 150 days. Pearson correlations were computed between these indices and country indicators obtained from the World Bank. The results of these correlations show that countries with stronger health indicators, such as lower mortality rates, lower age dependency ratios, and higher rates of immunization against other diseases, display higher VUI and VAI scores than countries with weaker values. VAI scores are also positively correlated with governance and economic indicators, such as regulatory quality, control of corruption, and GDP per capita. As represented by the VUI, proper utilization of the COVID-19 vaccine supply is observed in countries that display excellence in health practices. A country's motivation to accelerate its vaccination rates within the first 150 days of vaccinating, as represented by the VAI, was largely a product of the governing body's effectiveness and economic status, as well as overall excellence in health practices.
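The correlation step above can be sketched in a few lines. The Pearson coefficient below is the standard formula; the index and indicator values are made up for illustration, not the study's data.

```python
# Pure-Python Pearson correlation between a vaccination index and a
# country indicator. The sample values are hypothetical.
import math

def pearson(x, y):
    """Pearson product-moment correlation coefficient of two sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

vui = [0.9, 0.7, 0.5, 0.3]                  # hypothetical index values
gdp_per_capita = [50e3, 30e3, 20e3, 5e3]    # hypothetical indicator values
r = pearson(vui, gdp_per_capita)            # close to +1 for this toy data
```

A positive r of this kind is what the abstract reports between VAI scores and economic indicators such as GDP per capita.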

Keywords: data mining, Pearson correlation, COVID-19, vaccination rates and hesitancy

Procedia PDF Downloads 86
204 Phytoextraction of Copper and Zinc by Willow Varieties in a Pot Experiment

Authors: Muhammad Mohsin, Mir Md Abdus Salam, Pertti Pulkkinen, Ari Pappinen

Abstract:

Soil and water contamination by heavy metals is a major environmental challenge. Phytoextraction is an emerging, environmentally friendly and cost-efficient technology in which plants are used to eliminate pollutants from soil and water. We aimed to assess the copper (Cu) and zinc (Zn) removal efficiency of two willow varieties, Klara (S. viminalis x S. schwerinii x S. dasyclados) and Karin ((S. schwerinii x S. viminalis) x (S. viminalis x S. burjatica)), under different soil treatments (control/unpolluted, polluted, lime with polluted, wood ash with polluted). In a 180-day pot experiment, these willow varieties were grown in highly polluted soil collected from the Pyhasalmi mining area in Finland. Lime and wood ash were added to the polluted soil to improve the soil pH and to observe their effects on metal accumulation in plant biomass. An Inductively Coupled Plasma Optical Emission Spectrometer (ELAN 6000 ICP-OES, Perkin-Elmer Corporation) was used in this study to assess the heavy metal concentrations in the plant biomass. The results show that both willow varieties can accumulate considerable amounts of Cu and Zn, varying from 36.95 to 314.80 mg kg⁻¹ and 260.66 to 858.70 mg kg⁻¹, respectively. The application of lime and wood ash substantially stimulated plant height, dry biomass and the deposition of Cu and Zn into total plant biomass. In addition, the lime application appeared to increase Cu and Zn concentrations in the shoots and leaves of both willow varieties when planted in polluted soil, whereas the wood ash application was found to be more efficient at mobilizing the metals into the roots of both varieties. The study recommends willow plantations for rehabilitating Cu- and Zn-polluted soils.
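As a hedged back-of-envelope illustration of what the reported concentrations mean for removal, the total metal extracted per plant is the tissue concentration times the dry biomass. The biomass figure below is invented; only the concentrations come from the ranges in the abstract.

```python
# Illustrative only: converts a tissue concentration (mg/kg dry weight)
# and a hypothetical dry biomass (g) into metal removed (mg).
def metal_uptake_mg(conc_mg_per_kg, dry_biomass_g):
    """Metal mass (mg) accumulated in the given dry biomass."""
    return conc_mg_per_kg * dry_biomass_g / 1000.0  # g -> kg

# Upper concentrations from the abstract; 12 g dry biomass is assumed.
cu = metal_uptake_mg(314.80, 12.0)
zn = metal_uptake_mg(858.70, 12.0)
```

Scaling such per-plant figures by planting density is how phytoextraction studies typically estimate per-hectare removal rates.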

Keywords: heavy metals, lime, phytoextraction, wood ash, willow

Procedia PDF Downloads 205
203 Supply Chain Optimisation through Geographical Network Modeling

Authors: Cyrillus Prabandana

Abstract:

Supply chain optimisation requires multiple factors as considerations or constraints. These factors include, but are not limited to, demand forecasting, raw material fulfilment, production capacity, inventory level, facility locations, transportation means, and manpower availability. By knowing all manageable factors involved and modelling the uncertainty with pre-defined percentage factors, an integrated supply chain model can be developed to manage various business scenarios. This paper analyses the use of a geographical point of view to develop an integrated supply chain network model that optimises the distribution of finished products according to forecasted demand and available supply. The supply chain optimisation model shows that a small change in one supply chain constraint can have a large impact on other constraints, and the new information from the model should be able to support the decision-making process. The model focused on three areas, i.e. raw material fulfilment, production capacity and finished product transportation. To validate the model's suitability, it was implemented in a project aimed at optimising the concrete supply chain in a mining location. The high level of operational complexity and the involvement of multiple stakeholders in the concrete supply chain are believed to be sufficient to illustrate the larger scope. The implementation of this geographical supply chain network modeling resulted in an optimised concrete supply chain, from raw material fulfilment to the distribution of finished products to each customer, as indicated by a lower percentage of missed concrete order fulfilments.
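A toy version of the geographical-network idea can be sketched as a capacitated assignment: serve each demand point from the cheapest plant with remaining capacity. The plants, sites, capacities, and transport costs below are hypothetical, and this greedy sketch ignores the raw material and manpower constraints the paper's full model includes.

```python
# Hedged sketch: greedy cheapest-plant-first allocation of concrete orders.
# All names and numbers are hypothetical illustrations.
def assign_orders(capacity, cost, demand):
    """capacity: plant -> m3 available; cost: (plant, site) -> unit cost;
    demand: site -> m3 needed. Returns (allocations, unmet demand)."""
    alloc, unmet = [], {}
    for site, need in demand.items():
        for plant in sorted(capacity, key=lambda p: cost[(p, site)]):
            take = min(need, capacity[plant])
            if take > 0:
                alloc.append((plant, site, take))
                capacity[plant] -= take
                need -= take
            if need == 0:
                break
        if need > 0:
            unmet[site] = need  # contributes to missed order fulfilment
    return alloc, unmet

cap = {"plant_A": 100, "plant_B": 60}
cost = {("plant_A", "pit_1"): 5, ("plant_B", "pit_1"): 9,
        ("plant_A", "pit_2"): 8, ("plant_B", "pit_2"): 4}
alloc, unmet = assign_orders(cap, cost, {"pit_1": 80, "pit_2": 70})
```

The unmet dictionary corresponds directly to the "missed concrete order fulfilment" percentage the abstract uses as its success indicator; a production model would replace the greedy rule with a proper network-flow or linear-programming solve.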

Keywords: decision making, geographical supply chain modeling, supply chain optimisation, supply chain

Procedia PDF Downloads 326
202 Assessment of Drinking Water Quality in Relation to Arsenic Contamination in Drinking Water in Liberia: Achieving the Sustainable Development Goal of Ensuring Clean Water and Sanitation

Authors: Victor Emery David Jr., Jiang Wenchao, Daniel Mmereki, Yasinta John

Abstract:

Access to safe and clean drinking water is fundamental to public health. The presence of arsenic and other contaminants in drinking water poses a potential risk to public health and the environment, particularly in developing countries where there is inadequate access to safe and clean water and adequate sanitation. Liberia has taken steps to improve its drinking water status so as to achieve the Sustainable Development Goals (SDGs) target of ensuring clean water and effective sanitation, but there is still a lot to be done. The Sustainable Development Goals are a United Nations initiative, also known as Transforming Our World: The 2030 Agenda for Sustainable Development, containing seventeen goals with 169 targets to be met by the respective countries. Liberia is situated within the gold belt region, where arsenic and other contaminants are present in the groundwater due to mining and other related activities. While there are limited or no epidemiological studies conducted in Liberia to confirm illness or death as a result of arsenic contamination, it remains a public health concern. This paper assesses drinking water quality and the presence of arsenic in groundwater/drinking water in Liberia, proposes strategies for mitigating contaminants in drinking water, and suggests options for improvement with regard to achieving the Sustainable Development Goal of ensuring clean water and effective sanitation in Liberia by 2030.

Keywords: arsenic, action plan, contaminants, environment, groundwater, sustainable development goals (SDGs), Monrovia, Liberia, public health, drinking water

Procedia PDF Downloads 228
201 A Case Study of Coalface Workers' Attitude towards Occupational Health and Safety Key Performance Indicators

Authors: Gayan Mapitiya

Abstract:

Maintaining good occupational health and safety (OHS) performance is significant at the coalface, especially in industries such as mining, power, and construction. Coalface workers are exposed to high OHS risks in everyday work, such as working at heights, working with mobile plant and vehicles, working with underground and above-ground services, chemical emissions, radiation hazards and explosions. To improve the OHS performance of workers, OHS key performance indicators (KPIs) (for example, lost time injuries (LTI), serious injury frequency rate (SIFR), total reportable injury frequency rate (TRIFR) and the number of near misses) are widely used by managers in making OHS business decisions, such as investing in safety equipment and training programs. However, in many organizations, workers at the coalface see little relevance or added value of OHS KPIs to their everyday work. Therefore, the aim of this study was to understand why coalface workers perceive OHS KPIs as not practically relevant to their jobs. Accordingly, this study was conducted as a qualitative case study focusing on a large electricity and gas firm in Australia. Semi-structured face-to-face interviews were conducted with selected coalface workers to gather data on their attitudes towards OHS KPIs. The findings of the study revealed that workers at the coalface generally have no understanding of the purpose of KPIs, the meaning of each KPI, the origin of KPIs, or how KPIs are correlated with organizational performance. Indeed, KPIs are perceived as ‘meaningless obstacles’ imposed on workers by managers without a rationale. It is recommended to engage coalface workers (a fair number of representatives) in both the KPI setting and revising processes, while maintaining a continuous dialogue between workers and managers regarding OHS KPIs.
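The frequency-rate KPIs named above follow a common pattern: injuries normalized per hours worked. As a hedged example (the exact multiplier and injury definitions vary by jurisdiction and organization; one million hours is assumed here), a TRIFR-style rate can be computed as:

```python
# Illustrative KPI calculation; the per-million-hours convention is an
# assumption, not a definition taken from the case-study firm.
def injury_frequency_rate(injuries, hours_worked, per_hours=1_000_000):
    """Injuries per `per_hours` hours worked (TRIFR/LTIFR-style rate)."""
    return injuries * per_hours / hours_worked

# Hypothetical year: 12 reportable injuries over 2.4 million worked hours
rate = injury_frequency_rate(injuries=12, hours_worked=2_400_000)
```

Making the arithmetic behind such numbers visible to coalface workers is one concrete way to address the "meaningless obstacles" perception the study reports.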

Keywords: KPIs, coalface, OHS risks, case study

Procedia PDF Downloads 86
200 Depletion Behavior of Potassium by Continuous Cropping Using Rice as a Test Crop

Authors: Rafeza Begum, Mohammad Mokhlesur Rahman, Safikul Moula, Rafiqul Islam

Abstract:

Potassium (K) is crucial for healthy soil and plant growth. However, K fertilization is either disregarded or underutilized in Bangladeshi agriculture, despite the great demand from crops. This could eventually result in a significant depletion of the soil's potassium reserves, irreversible alteration of the potassium-bearing minerals, and detrimental effects on crop productivity. Soil K mining in Bangladesh is a worrying problem that needs to be evaluated thoroughly so that remedies can be found. A pot culture experiment was conducted in the greenhouse of the Bangladesh Institute of Nuclear Agriculture (BINA) using eleven soil series of Bangladesh in order to examine the depletion behaviour of potassium (K) under continuous cropping, using rice (var. Iratom-24) as the test crop. The soil series were Ranishankhail, Kaonia, Sonatala, Silmondi, Gopalpur, Ishurdi, Sara, Kongsha, Nunni, Lauta and Amnura, on which four successive rice crops (45 days duration) were raised with (100 ppm K) or without the addition of potassium. Nitrogen, phosphorus, sulfur and zinc were applied as basal doses to all pots. Potassium application resulted in higher dry matter yield and increased K concentration and uptake in all the soils compared with the no-K treatment, which gradually decreased in the subsequent harvests. Furthermore, plants take up K not only from the exchangeable pool but also from non-exchangeable sites, and only a minimal replenishment of K from the soil reserve was observed. Continuous cropping resulted in the depletion of the available K of the soil. The results indicate that in order to sustain higher crop yields under intensive cultivation, the addition of potash fertilizer is necessary.

Keywords: potassium, exchangeable pool, depletion behavior, soil series

Procedia PDF Downloads 91
199 Co-Disposal of Coal Ash with Mine Tailings in Surface Paste Disposal Practices: A Gold Mining Case Study

Authors: M. L. Dinis, M. C. Vila, A. Fiúza, A. Futuro, C. Nunes

Abstract:

The present paper describes the study of paste tailings prepared in the laboratory using gold tailings, produced in a Finnish gold mine, with the incorporation of coal ash. Natural leaching tests were conducted with the original materials (tailings, fly and bottom ashes) and also with paste mixtures prepared with different percentages of tailings and ashes. After leaching, the solid wastes were physically and chemically characterized, and the results were compared to those selected as blank, the unleached samples. The tailings and the coal ash, as well as the prepared mixtures, were characterized, in addition to the textural parameters, by the following measurements: grain size distribution, chemical composition and pH. The mixtures were also tested in order to characterize their mechanical behavior by measuring the flexural strength, the compressive strength and the consistency. The original tailing samples presented an alkaline pH because during processing they had been submitted to pressure oxidation with destruction of the sulfides. Therefore, it was not possible to ascertain the effect of the coal ashes on acid mine drainage. However, it was possible to verify that the paste reactivity was affected mostly by the bottom ash, and that the tailings blended with bottom ash present lower mechanical strength than when blended with a combination of fly and bottom ash. Surface paste disposal offers an attractive alternative to traditional methods, in addition to the environmental benefits of incorporating large-volume wastes (e.g. bottom ash). However, a comprehensive characterization of the paste mixtures is crucial to optimize the paste design in order to enhance its engineering and environmental properties.

Keywords: coal ash, mine tailings, paste blends, surface disposal

Procedia PDF Downloads 269
198 Factors That Contribute to Noise Induced Hearing Loss Amongst Employees at the Platinum Mine in Limpopo Province, South Africa

Authors: Livhuwani Muthelo, R. N. Malema, T. M. Mothiba

Abstract:

Long-term exposure to excessive noise in the mining industry increases the risk of noise-induced hearing loss (NIHL), with consequences for employees' health, productivity and overall quality of life. Objective: The objective of this study was to investigate the factors that contribute to noise-induced hearing loss amongst employees at the Platinum mine in the Limpopo Province, South Africa. Study method: A qualitative, phenomenological, exploratory, descriptive, contextual design was applied in order to explore and describe the contributory factors. Purposive non-probability sampling was used to select 10 male employees who were diagnosed with NIHL in the year 2014 in four mine shafts, and 10 managers who were involved in a Hearing Conservation Programme. The data were collected using semi-structured one-on-one interviews. Qualitative data analysis followed Tesch's approach. Results: The following themes emerged: experiences and challenges faced by employees in the work environment, hearing protective device factors, and management and leadership factors. Hearing loss was caused by the partial application of guidelines, policies, and procedures from the Department of Minerals and Energy. Conclusion: The study results indicate that although guidelines, policies, and procedures are available, failure in the implementation of one element will affect the development and maintenance of employees' hearing. It is recommended that mine management apply the guidelines, policies, and procedures and promptly repair broken hearing protective devices.

Keywords: employees, factors, noise induced hearing loss, noise exposure

Procedia PDF Downloads 94
197 An Architectural Study on the Railway Station Buildings in Malaysia during British Era, 1885-1957

Authors: Nor Hafizah Anuar, M. Gul Akdeniz

Abstract:

This paper emphasizes the façade elements of station buildings. Station buildings were an essential part of the transportation system and reflected the technology of their time. A comparative analysis of architectural styles is also made between the railway station buildings of Malaysia and railway station buildings elsewhere that share similarities. The Malay Peninsula, strategically situated between the Straits of Malacca and the South China Sea, is an ideal location for trade. Malacca became an important trading port where merchants from around the world stopped over to exchange various products. The Portuguese ruled Malacca for 130 years (1511–1641), and for the next century and a half (1641–1824) the Dutch endeavoured to maintain an economic monopoly along the coasts of Malaya. Malacca came permanently under British rule through the Anglo-Dutch Treaty of 1824. Up to Malaysian independence in 1957, Malaya saw a great influx of Chinese and Indian migrants as workers to support its growing industrial needs, facilitated by the British. The growing tin-ore mining and rubber industries drove the development of the railways, which were urgently needed to transport these goods from one place to another. Railway transportation became more significant as cities started to bloom and the British began to build grand buildings with different functions: administrative buildings, town and city halls, railway stations, public works departments, courts, and post offices.

Keywords: Malaysia, station building, architectural styles, facade elements

Procedia PDF Downloads 141
196 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. Time series analysis is one approach to complex systems in which the dynamics of the system are inferred by a statistical analysis of the fluctuations in time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches have emerged. The quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos was to find a model that explains chaotic behaviour. This model may reveal further insight into quantum chaos.
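The classical starting point of the transition described above is the autoregressive model. As a sketch of that baseline (not the quantum model itself), the code below simulates an AR(1) series x_t = φ·x_{t−1} + ε_t and recovers φ by lag-1 least squares; the chosen φ and sample size are illustrative.

```python
# Classical AR(1) baseline: simulate, then estimate phi by least squares.
# This illustrates the autoregressive model only, not the paper's QTS model.
import random

def simulate_ar1(phi, n, seed=0, sigma=1.0):
    """Generate n points of x_t = phi * x_{t-1} + N(0, sigma) noise."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, sigma))
    return x

def estimate_phi(x):
    """Lag-1 least-squares estimate: sum(x_t * x_{t-1}) / sum(x_{t-1}^2)."""
    num = sum(a * b for a, b in zip(x[1:], x[:-1]))
    den = sum(a * a for a in x[:-1])
    return num / den

series = simulate_ar1(phi=0.8, n=2000)
phi_hat = estimate_phi(series)  # close to 0.8 for this sample size
```

The paper's contribution is to lift exactly this kind of scalar recursion into the matrix formalism of quantum theory, with the Kalman filter and MCMC handling parameter estimation.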

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 434
195 In-Situ Determination of Radioactivity Levels and Radiological Hazards in and around the Gold Mine Tailings of the West Rand Area, South Africa

Authors: Paballo M. Moshupya, Tamiru A. Abiye, Ian Korir

Abstract:

Mining and processing of naturally occurring radioactive materials can result in elevated levels of natural radionuclides in the environment. The aim of this study was to evaluate the radioactivity levels on a large scale in the West Rand District of South Africa, which is dominated by abandoned gold mine tailings, and the consequent radiological exposures to members of the public. The activity concentrations of ²³⁸U, ²³²Th and ⁴⁰K in mine tailings, soil and rocks were assessed using the BGO Super-Spec (RS-230) gamma spectrometer. The measured activity concentrations of ²³⁸U, ²³²Th and ⁴⁰K in the studied mine tailings were found to range from 209.95 to 2578.68 Bq/kg, 19.49 to 108.00 Bq/kg and 31.30 to 626.00 Bq/kg, respectively. In surface soils, the overall average activity concentrations were found to be 59.15 Bq/kg, 34.91 Bq/kg and 245.64 Bq/kg for ²³⁸U, ²³²Th and ⁴⁰K, respectively. For the rock samples analyzed, the mean activity concentrations were 32.97 Bq/kg, 32.26 Bq/kg and 351.52 Bq/kg for ²³⁸U, ²³²Th and ⁴⁰K, respectively. High radioactivity levels were found in the mine tailings, with ²³⁸U contributing significantly to the overall activity concentration. The external gamma radiation received from surface soil in the area is generally low, with an average of 0.07 mSv/y. The highest annual effective doses were estimated from the tailings dams, where the levels varied between 0.14 mSv/y and 1.09 mSv/y, with an average of 0.51 mSv/y. In certain locations, the recommended dose constraint of 0.25 mSv/y from a single source to the average member of the public was exceeded, indicating the need for further monitoring and regulatory control measures specific to these areas to ensure the protection of resident members of the public.
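The dose figures above can be reproduced approximately from the reported activity concentrations. The sketch below assumes the widely used UNSCEAR-style conversion coefficients and an outdoor occupancy factor of 0.2; the paper does not state its exact factors, so treat this as an illustration rather than the study's calculation.

```python
# Hedged outdoor dose estimate from soil activity concentrations (Bq/kg).
# Coefficients (0.462, 0.604, 0.0417 nGy/h per Bq/kg) and the occupancy
# and Sv/Gy factors are assumed UNSCEAR-style values.
def absorbed_dose_rate(c_u238, c_th232, c_k40):
    """Outdoor absorbed dose rate in air, nGy/h."""
    return 0.462 * c_u238 + 0.604 * c_th232 + 0.0417 * c_k40

def annual_effective_dose_msv(d_ngy_h, occupancy=0.2, sv_per_gy=0.7):
    """Annual effective dose, mSv/y: 8760 h/y * occupancy * conversion."""
    return d_ngy_h * 8760 * occupancy * sv_per_gy * 1e-6

# Average surface-soil values from the abstract (238U, 232Th, 40K)
d = absorbed_dose_rate(59.15, 34.91, 245.64)
aed = annual_effective_dose_msv(d)  # close to the reported 0.07 mSv/y
```

Plugging in the soil averages recovers a value consistent with the 0.07 mSv/y quoted above, which supports the assumed coefficients.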

Keywords: activity concentration, gold mine tailings, in-situ gamma spectrometry, radiological exposures

Procedia PDF Downloads 102