Search results for: agent based web content mining
33269 A U-Net Based Architecture for Fast and Accurate Diagram Extraction
Authors: Revoti Prasad Bora, Saurabh Yadav, Nikita Katyal
Abstract:
In the context of educational data mining, the use case of extracting information from images containing both text and diagrams is of high importance. Hence, document analysis requires extracting the diagrams from such images and processing the text and diagrams separately. To the authors' best knowledge, none of the many approaches for extracting tables, figures, etc., satisfies the need for real-time processing with high accuracy, as required in multiple applications. In the education domain, diagrams can have varied characteristics, viz. line-based diagrams, i.e., geometric diagrams, chemical bonds, mathematical formulas, etc. There are two broad categories of approaches that try to solve similar problems, viz. traditional computer vision based approaches and deep learning approaches. The traditional computer vision based approaches mainly leverage connected components and distance transform based processing and hence perform well only in very limited scenarios. The existing deep learning approaches leverage either YOLO or faster-RCNN architectures. These approaches suffer from a performance-accuracy tradeoff. This paper proposes a U-Net based architecture that formulates diagram extraction as a segmentation problem. The proposed method provides similar accuracy with much faster extraction time compared to the mentioned state-of-the-art approaches. Further, the segmentation mask in this approach allows the extraction of diagrams of irregular shapes.
Keywords: computer vision, deep-learning, educational data mining, faster-RCNN, figure extraction, image segmentation, real-time document analysis, text extraction, U-Net, YOLO
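To make the segmentation formulation concrete, the sketch below shows a minimal U-Net-style encoder-decoder in PyTorch. It is an illustration only, assuming a grayscale page input and tiny channel counts; the paper's actual depth, channel widths, and training setup are not specified here.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the basic U-Net building block
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(1, 16)        # grayscale page image in
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec1 = conv_block(32, 16)       # 32 = 16 upsampled + 16 skip
        self.head = nn.Conv2d(16, 1, 1)      # per-pixel "diagram" logit

    def forward(self, x):
        e1 = self.enc1(x)                    # kept as the skip connection
        e2 = self.enc2(self.pool(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        return self.head(d1)

model = TinyUNet()
page = torch.randn(1, 1, 256, 256)           # dummy 256x256 page image
print(model(page).shape)                     # torch.Size([1, 1, 256, 256])
```

Thresholding the sigmoid of the output logits yields the segmentation mask from which irregularly shaped diagram regions can be cropped.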
33268 Data Mining of Students' Performance Using Artificial Neural Network: Turkish Students as a Case Study
Authors: Samuel Nii Tackie, Oyebade K. Oyedotun, Ebenezer O. Olaniyi, Adnan Khashman
Abstract:
Artificial neural networks have been used in different fields of artificial intelligence, and more specifically in machine learning. Although other machine learning options are feasible in most situations, the ease with which neural networks lend themselves to different problems, including pattern recognition, image compression, classification, computer vision, regression, etc., has earned them a remarkable place in the machine learning field. This research exploits neural networks as a data mining tool in predicting the number of times a student repeats a course, considering some attributes relating to the course itself, the teacher, and the particular student. Neural networks were used in this work to map the relationship between some attributes related to students' course assessment and the number of times a student will possibly repeat a course before passing. The hope is that the possibility to predict students' performance from such complex relationships can help facilitate the fine-tuning of academic systems and policies implemented in learning environments. To validate the power of neural networks in data mining, a Turkish students' performance database was used; feedforward and radial basis function networks were trained for this task, and the performances obtained from these networks were evaluated in terms of achieved recognition rates and training time.
Keywords: artificial neural network, data mining, classification, students' evaluation
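As a hedged illustration of the prediction task, the sketch below trains a small feedforward network on synthetic stand-in data; the six attributes and the pass/repeat target are assumptions, not the Turkish database used in the study.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((500, 6))                      # 6 assumed attributes, scaled to [0, 1]
y = (X[:, 0] + X[:, 3] > 1.2).astype(int)     # synthetic target: 0 = pass, 1 = repeat

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print("recognition rate:", net.score(X_te, y_te))  # held-out accuracy
```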
33267 Bounded Rational Heterogeneous Agents in Artificial Stock Markets: Literature Review and Research Direction
Authors: Talal Alsulaiman, Khaldoun Khashanah
Abstract:
In this paper, we provide a literature survey on the artificial stock market (ASM) problem. The paper begins by exploring the complexity of the stock market and the need for ASMs. ASMs aim to investigate the link between individual behaviors (micro level) and financial market dynamics (macro level). The variety of patterns at the macro level is a function of the ASM complexity. The financial market system is a complex system where the relationship between the micro and macro levels cannot be captured analytically. Computational approaches, such as simulation, are expected to comprehend this connection. Agent-based simulation is a simulation technique commonly used to build ASMs. The paper proceeds by discussing the components of the ASM. We consider the role of behavioral finance (BF) alongside the traditional risk-aversion assumption in the construction of agents' attributes. Also, the influence of social networks on the development of agents' interactions is addressed. Network topologies such as small-world, distance-based, and scale-free networks may be utilized to outline economic collaborations. In addition, the primary methods for developing agents' learning and adaptive abilities are summarized; these include approaches such as genetic algorithms, genetic programming, artificial neural networks, and reinforcement learning. The most common statistical properties (the stylized facts) of stocks that are used for calibration and validation of ASMs are also discussed. We then review the major related previous studies and categorize the approaches they utilize. Finally, research directions and potential research questions are discussed. The research directions of ASMs may focus on the macro level by analyzing the market dynamics or on the micro level by investigating the wealth distributions of the agents.
Keywords: artificial stock markets, market dynamics, bounded rationality, agent based simulation, learning, interaction, social networks
33266 Use of Cold In-Place Asphalt Mixtures Technique in Road Maintenance in Egypt
Authors: Mohammed Mamdouh Mohammed, Ali Zain Elabdeen Heikal, Hassan Mahdy, Sherif El-Badawy
Abstract:
The main purpose of this research is to assess the effectiveness of the Cold In-Place Recycling (CIR) technique in asphalt maintenance by analyzing performance outcomes. To achieve this, fifteen CIR mixtures were prepared using slow-setting emulsified asphalt as the recycling agent, with percentages ranging from 2% to 4% in 0.5% increments. Additionally, pure water was incorporated in percentages ranging from 2% to 4% in 1% increments, and Portland cement was added at a constant content of 1%. The components were mixed at room temperature and subsequently compacted using a gyratory compactor with 150 gyrations. Prior to testing, the samples underwent a two-stage treatment process: initially, they were placed in an oven at 60°C for 48 hours, followed by a 24-hour period of air curing. The Hamburg wheel tracking test was performed to evaluate the samples' resistance to rutting. Additionally, the Indirect Tensile Strength (ITS) test and the Semi-Circular Beam (SCB) test were conducted to assess their resistance to cracking. Upon analyzing the test results, it was observed that the samples' resistance to rutting decreased with higher asphalt and moisture content. In contrast, ITS and SCB tests revealed that the samples' resistance to cracking initially increased with higher asphalt and moisture content, peaking at a certain point, and then decreased, forming a bell-curve pattern.
Keywords: cold in-place, indirect tensile strength, recycling, emulsified asphalt, semi-circular beam
33265 Integrating Data Mining with Case-Based Reasoning for Diagnosing Sorghum Anthracnose
Authors: Mariamawit T. Belete
Abstract:
Cereal production and marketing are the means of livelihood for millions of households in Ethiopia. However, cereal production is constrained by technical and socio-economic factors. Among the technical factors, cereal crop diseases are major contributors to the low yield. The aim of this research is to develop an integration of data mining and a knowledge-based system for sorghum anthracnose disease diagnosis that assists agriculture experts and development agents to make timely decisions. The anthracnose diagnosis system gathers information from the Melkassa agricultural research center and attempts to score the anthracnose severity scale. Empirical research is designed for data exploration, modeling, and confirmatory procedures for testing hypotheses and prediction to draw a sound conclusion. WEKA (Waikato Environment for Knowledge Analysis) was employed for the modeling. Knowledge-based systems have come across a variety of approaches based on the knowledge representation method; case-based reasoning (CBR) is one of the popular approaches used in knowledge-based systems. CBR is a problem-solving strategy that uses previous cases to solve new problems. The system utilizes hidden knowledge extracted by employing clustering algorithms, specifically K-means clustering, on a sampled anthracnose dataset. Clustered cases with centroid values are mapped to jCOLIBRI, and then the integrator application is created using NetBeans with JDK 8.0.2. The important parts of a case-based reasoning model include case retrieval, the similarity-measuring stage; reuse, which allows the domain expert to adapt the retrieved case solution to suit the current case; revise, to test the solution; and retain, to store the confirmed solution in the case base for future use. Evaluation of the system was done for both system performance and user acceptance. For testing the prototype, seven test cases were used. Experimental results show that the system achieves average precision and recall values of 70% and 83%, respectively. User acceptance testing was also performed with five domain experts, and an average acceptance of 83% was achieved. Although the results of this study are promising, further study should be done; an investigation into hybrid approaches, such as rule-based reasoning, and a pictorial retrieval process are recommended.
Keywords: sorghum anthracnose, data mining, case based reasoning, integration
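The retrieval step described above can be sketched as follows: cluster the case base with K-means, route a new case to its cluster, and return the most similar stored case. The features, values, and solutions are hypothetical, not the Melkassa data, and jCOLIBRI itself is not used here.

```python
import numpy as np
from sklearn.cluster import KMeans

cases = np.array([
    # [leaf_lesion_pct, humidity, temperature]; severity stored with each case
    [10, 60, 22], [15, 65, 24], [55, 80, 27], [60, 85, 28], [90, 90, 30],
])
solutions = ["mild", "mild", "moderate", "moderate", "severe"]

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(cases)

new_case = np.array([[58, 82, 27]])
cluster = km.predict(new_case)[0]                 # route query to a cluster
members = np.where(km.labels_ == cluster)[0]
nearest = members[np.argmin(np.linalg.norm(cases[members] - new_case, axis=1))]
print("retrieved solution:", solutions[nearest])  # reuse/revise/retain follow
```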
33264 Some Trace and Toxic Metal Content of Crude Ethanol Leaf Extract of Globimetula Oreophila (Hook. F) Danser Azadirachta Indica Using Atomic Absorption Spectroscopy
Authors: Dauda G., Bila H. A., Sani Y. M., Magaji M. G., Musa A. M., Hassan H. S.
Abstract:
Introduction: Globimetula oreophila is a parasitic plant with a known therapeutic value that is widely used in the treatment of various ailments, including malaria, hypertension, cancer, diabetes, and epilepsy, and as a diuretic agent. Objectives: The present study is aimed at analyzing and documenting the levels of trace and toxic metals in the crude ethanol leaf extract of G. oreophila. Methods: After collection and authentication, the leaves were air-dried, mashed into powder, weighed and extracted using aqueous ethanol (70%). The crude extract (0.5 g) was digested with HNO₃:HCl (3:1), then heated to 200°C and analyzed for its metal content by atomic absorption spectroscopy (AAS). Results: Fe had the highest concentration (32.73 mg/kg), while Pb was not detected. The concentrations of Co, Cu, Ni, Zn and Cd detected were 5.97, 10.8, 8.01 and 0.9 mg/kg, respectively. The concentrations of Cd, Fe and Ni were above the permissible limits of the FAO/WHO. Conclusion: The results also show that the analyzed plant is a beneficial source of appropriate and essential trace metals. However, the leaf of G. oreophila in the present study was probably unsafe for long-term use because of the levels of Fe, Ni, and Cd.
Keywords: Globimetula oreophila, minerals, trace element, crude extract
33263 Development of New Technology Evaluation Model by Using Patent Information and Customers' Review Data
Authors: Kisik Song, Kyuwoong Kim, Sungjoo Lee
Abstract:
Many global firms and corporations derive new technology and opportunity by identifying vacant technology from patent analysis. However, previous studies failed to focus on technologies that promised continuous growth in industrial fields, and most studies that derive new technology opportunities do not test practical effectiveness. Since previous studies depended on expert judgment, it became costly and time-consuming to evaluate new technologies based on patent analysis. Therefore, this research suggests a quantitative and systematic approach to technology evaluation indicators that uses patent data together with review data from customer communities. The first step involves collecting these two types of data, which are then used to construct evaluation indicators and to apply those indicators to the evaluation of new technologies. This type of data mining allows a new method of technology evaluation and a better prediction of how new technologies will be adopted.
Keywords: data mining, evaluating new technology, technology opportunity, patent analysis
33262 Antibacterial Activity and Cytotoxicity of Silver Nanoparticles Synthesized by Moringa oleifera Extract as Reducing Agent
Authors: Temsiri Suwan, Penpicha Wanachantararak, Sakornrat Khongkhunthian, Siriporn Okonogi
Abstract:
In the present study, silver nanoparticles (AgNPs) were synthesized by a green synthesis approach using Moringa oleifera aqueous extract (ME) as a reducing agent and silver nitrate as a precursor. The obtained AgNPs were characterized using UV-Vis spectroscopy (UV-Vis), dynamic light scattering (DLS), scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDX), and X-ray diffractometry (XRD). The results from UV-Vis revealed that the maximum absorption of the AgNPs was at 430 nm, and the EDX spectrum confirmed the Ag element. The results from DLS indicated that the amount of ME played an important role in the particle size, size distribution, and zeta potential of the obtained AgNPs. The smallest size (62.4 ± 1.8 nm) with a narrow distribution (0.18 ± 0.02) of AgNPs was obtained after using 1% w/v of ME. This system gave a high negative zeta potential of -36.5 ± 2.8 mV. SEM results indicated that the obtained AgNPs were spherical in shape. Antibacterial activity testing using the dilution method revealed that the minimum inhibitory and minimum bactericidal concentrations of the obtained AgNPs against Streptococcus mutans were 0.025 and 0.1 mg/mL, respectively. A cytotoxicity test of the AgNPs on adenocarcinomic human alveolar basal epithelial cells (A549) indicated that the particles acted against A549 cells: the percentage of cell growth inhibition was 87.5 ± 3.6% when only 0.1 mg/mL AgNPs was used. These results suggest that ME is a potential reducing agent for the green synthesis of AgNPs.
Keywords: antibacterial activity, Moringa oleifera extract, reducing agent, silver nanoparticles
33261 Mechanical Investigation Approach to Optimize the High-Velocity Oxygen Fuel Fe-Based Amorphous Coatings Reinforced by B4C Nanoparticles
Authors: Behrooz Movahedi
Abstract:
Fe-based amorphous feedstock powders were used as the matrix into which various ratios of hard B4C nanoparticles (0, 5, 10, 15, and 20 vol.%) were incorporated as reinforcing agents using planetary high-energy mechanical milling. The ball-milled nanocomposite feedstock powders were then sprayed by means of the high-velocity oxygen fuel (HVOF) technique. The characteristics of the powder particles and the prepared coatings, depending on their microstructures and nanohardness, were examined in detail using a nanoindentation tester. The results showed that the Fe-based amorphous phase formed over the course of high-energy ball milling. It is interesting to note that the nanocomposite coating is divided into two regions, namely, a fully amorphous phase region and a homogeneous dispersion of B4C nanoparticles on a scale of 10–50 nm in a residual amorphous matrix. As the B4C content increases, the nanohardness of the composite coatings increases, but the fracture toughness begins to decrease at B4C contents higher than 20 vol.%. The optimal mechanical properties are obtained with 15 vol.% B4C due to the suitable content and uniform distribution of nanoparticles. Consequently, the changes in the mechanical properties of the coatings were attributed to the changes in the brittle-to-ductile transition induced by adding B4C nanoparticles.
Keywords: Fe-based amorphous, B₄C nanoparticles, nanocomposite coating, HVOF
33260 An Examination of Teachers' Interactive Whiteboards Use within the Scope of Technological, Pedagogical, and Content Knowledge (TPACK)
Authors: Ismail Celik, Pavlo Antonenko, Seyit Ahmet Kiray, Ismail Sahin
Abstract:
The aim of the present study was to thoroughly investigate teachers' interactive whiteboard (IWB) use within the scope of the technological, pedagogical, and content knowledge (TPACK) framework, based on school practice observations of in-service teachers collected by pre-service teachers. In this study, teachers' use of IWBs in their classes was investigated by using phenomenography, a qualitative research design. The participants of this study consisted of teachers working in a province of Turkey. Within the scope of the study, 337 teachers from 61 different schools were observed by pre-service teachers during School Experience classes. The teachers use the IWBs to review points not understood by the students, to share knowledge, to enhance motivation, to maintain student participation/practice, and for in-process, formative assessment. The problems teachers face while using the IWBs can be IWB-based (touchscreen problems/frozen image/lack of software), administration-based, student-based, and teacher-based (lack of knowledge of use, need for technical support). It is considered that technological knowledge (TK) is important in solving the problems experienced with IWBs, while technological pedagogical knowledge (TPK) and technological content knowledge (TCK) are important in using the IWBs in an interactive and pedagogically meaningful way that exploits IWB affordances and is relevant to the instructional objectives.
Keywords: TPACK, technology integration, interactive whiteboard, technology in education
33259 Preparation and Flame-Retardant Properties of Epoxy Resins Containing Organophosphorus Compounds
Authors: Tachita Vlad-Bubulac, Ionela-Daniela Carja, Diana Serbezeanu, Corneliu Hamciuc, Vicente Javier Forrat Perez
Abstract:
The present work describes the preparation of new organophosphorus compounds with a high phosphorus content, followed by the incorporation of these compounds into epoxy resin systems in order to investigate the effect of phosphorus on the thermal stability, flame-retardant, and mechanical properties of the modified epoxy resins. Thus, two new organophosphorus compounds have been synthesized and fully characterized. 6-Oxido-6H-dibenz[c,e][1,2]oxaphosphorinyl-phenylcarbinol has been prepared by the addition reaction of the P–H group of 9,10-dihydro-9-oxa-10-phosphaphenanthrene-10-oxide to the carbonyl group of benzaldehyde. By treating the phenylcarbinol derivative with POCl3, a new phosphorus compound was obtained, having a phosphorus content of 12.227%. The organophosphorus compounds have been purified by recrystallization, while their chemical structures have been confirmed by melting point measurements and FTIR and ¹H NMR spectroscopies. In the next step, various flame-retardant epoxy resins with different phosphorus contents have been prepared, starting from a commercial epoxy resin and using dicyandiamide (DICY) as a latent curing agent in the presence of an accelerator. Differential scanning calorimetry (DSC) has been applied to investigate the behavior and kinetics of the curing process of the thermosetting systems. The results showed that the best curing characteristics and glass transition temperature are obtained at an epoxy resin:DICY:accelerator ratio of 94:5:1. The thermal stability of the phosphorus-containing epoxy resins was investigated by thermogravimetric analysis in nitrogen and air, DSC, SEM, and LOI test measurements.
Keywords: epoxy resins, flame retardant properties, phosphorus-containing compounds, thermal stability
33258 Synthesis and Characterization of Renewable Resource Based Green Epoxy Coating
Authors: Sukanya Pradhan, Smita Mohanty, S. K. Nayak
Abstract:
Plant oils are a great renewable source and a reliable starting material for accessing new products with a wide spectrum of structural and functional variations. Even though petroleum products might render the same, they would also impose a high risk of environmental and health hazards. Since epoxidized vegetable oils are easily available, eco-compatible, non-toxic, and renewable, they have drawn much attention in the polymer industrial sector, especially for the development of eco-friendly coating materials. In this study, a waterborne epoxy coating was prepared from epoxidized soybean oil by using triethanolamine. Because of its hydrophobic nature, it was a tough and tedious task to make it hydrophilic. The hydrophobic biobased epoxy was modified into a waterborne epoxy with the help of a plant-based anhydride as the curing agent. Physico-mechanical tests, chemical resistance tests, and thermal analysis of the green coating material were carried out, which showed good physico-mechanical and chemical resistance properties as well as environmental friendliness. The complete characterization of the final material was done in terms of scratch hardness, gloss, impact resistance, adhesion, and bend tests.
Keywords: epoxidized soybean oil, waterborne, curing agent, green coating
33257 Arabic Light Stemmer for Better Search Accuracy
Authors: Sahar Khedr, Dina Sayed, Ayman Hanafy
Abstract:
Arabic is one of the most ancient and critical languages in the world. It has over 250 million native speakers, and more than twenty countries have Arabic as one of their official languages. In the past decade, we have witnessed a rapid evolution in smart devices, social networks, and the technology sector, which has led to the need for tools and libraries that properly tackle the Arabic language in different domains. Stemming is one of the most crucial linguistic fundamentals. It is used in many applications, especially in the information extraction and text mining fields. The motivation behind this work is to enhance the Arabic light stemmer to serve the data mining industry and to leverage it in an open-source community. The presented implementation enhances the Arabic light stemmer by utilizing and extending an algorithm with a new set of rules and patterns accompanied by an adjusted procedure. This study has shown a significant enhancement in search accuracy, with an average 10% improvement in comparison with previous works.
Keywords: Arabic data mining, Arabic information extraction, Arabic light stemmer, Arabic stemmer
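A toy version of light stemming is sketched below: strip one common prefix and one common suffix, with a length guard so the stem is not destroyed. The rule lists are a small illustrative subset, not the extended rule set the paper contributes.

```python
# Illustrative prefix/suffix lists; a real light stemmer uses a much richer set.
PREFIXES = ["ال", "وال", "بال", "كال", "فال", "لل", "و"]
SUFFIXES = ["ها", "ان", "ات", "ون", "ين", "ية", "ه", "ة", "ي"]

def light_stem(word: str) -> str:
    for p in sorted(PREFIXES, key=len, reverse=True):   # try longest prefix first
        if word.startswith(p) and len(word) - len(p) >= 3:
            word = word[len(p):]
            break
    for s in sorted(SUFFIXES, key=len, reverse=True):   # try longest suffix first
        if word.endswith(s) and len(word) - len(s) >= 3:
            word = word[:-len(s)]
            break
    return word

print(light_stem("المدرسة"))  # definite article and feminine marker removed
```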
33256 Feature Selection for Production Schedule Optimization in Transition Mines
Authors: Angelina Anani, Ignacio Ortiz Flores, Haitao Li
Abstract:
The use of underground mining methods has increased significantly over the past decades. This increase has also been spurred on by several mines transitioning from surface to underground mining. However, determining the transition depth can be a challenging task, especially when coupled with production schedule optimization. Several researchers have simplified the problem by excluding operational features relevant to production schedule optimization. Our research objective is to investigate the extent to which the operational features of transition mines that are accounted for affect the optimal production schedule. We also provide a framework for the factors to consider in production schedule optimization for transition mines. An integrated mixed-integer linear programming (MILP) model is developed that maximizes the NPV as a function of production schedule and transition depth. A case study is performed to validate the model, with a comparative sensitivity analysis to obtain operational insights.
Keywords: underground mining, transition mines, mixed-integer linear programming, production schedule
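To illustrate the flavor of such a model, the toy MILP below (using the PuLP library) selects one transition depth to maximize a combined NPV. All numbers are invented, and the real model adds the production schedule and operational constraints this sketch omits.

```python
import pulp

depths = [100, 150, 200]                               # candidate transition depths (m)
npv_surface = {100: 50.0, 150: 70.0, 200: 80.0}        # $M from the open-pit portion
npv_underground = {100: 65.0, 150: 60.0, 200: 40.0}    # $M from the underground portion

prob = pulp.LpProblem("transition_depth", pulp.LpMaximize)
y = pulp.LpVariable.dicts("transition_at", depths, cat="Binary")

# Objective: total NPV of mining above the depth by open pit and below it underground.
prob += pulp.lpSum(y[d] * (npv_surface[d] + npv_underground[d]) for d in depths)
# Exactly one transition depth is selected.
prob += pulp.lpSum(y[d] for d in depths) == 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
best = [d for d in depths if y[d].value() == 1][0]
print("optimal transition depth:", best, "m; NPV:", pulp.value(prob.objective), "$M")
```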
33255 Improved Performance in Content-Based Image Retrieval Using Machine Learning Approach
Authors: B. Ramesh Naik, T. Venugopal
Abstract:
This paper presents a novel approach which improves the high-level semantics of images based on a machine learning approach. The contemporary approaches for image retrieval and object recognition include Fourier transforms, wavelets, SIFT, and HoG. Though these descriptors are helpful in a wide range of applications, they exploit zero-order statistics, which lack high descriptiveness of image features. These descriptors usually take advantage of primitive visual features such as shape, color, texture, and spatial location to describe images. Such features are not adequate to describe the high-level semantics of images. This leads to a semantic gap that causes unacceptable performance in image retrieval systems. A novel method referred to as discriminative learning, derived from a machine learning approach, is proposed to efficiently discriminate image features. The analysis and results of the proposed approach were validated thoroughly on the WANG and Caltech-101 databases. The results proved that this approach is very competitive in content-based image retrieval.
Keywords: CBIR, discriminative learning, region weight learning, scale invariant feature transforms
33254 Customer Preference in the Textile Market: Fabric-Based Analysis
Authors: Francisca Margarita Ocran
Abstract:
Underwear, and more particularly bras and panties, are defined as intimate clothing. Strictly speaking, they reflect the place of women in the public or private sphere. Women's lingerie is therefore a complex garment with a high involvement profile, motivating consumers to buy it not only for its functional utility but also for the multisensory experience it provides them. Customer behavior models are generally based on customer data mining, and each model is designed to answer questions at a specific time. Predicting the customer experience is uncertain and difficult. Thus, lingerie deserves to be treated as an experiential product, whose experience dimensions, which motivate consumers to buy a lingerie product and remain faithful to it, must be analyzed in detail by manufacturers and retailers to engage and retain consumers. This research therefore aims to identify the variables that push consumers to choose their lingerie product, based on an in-depth analysis of the types of fabrics used to make lingerie. The data used in this study come from online purchases. A machine learning approach using the Python programming language and PyCaret gives precisions of 86.34%, 85.98%, and 84.55% for the three algorithms used to classify a buyer's preference across a range of lingerie. Gradient boosting, random forest, and k-nearest neighbors were used in this study; they are very promising for the classification of preference in the textile industry.
Keywords: consumer behavior, data mining, lingerie, machine learning, preference
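The comparison reported here can be sketched with scikit-learn equivalents of the three algorithms (the study itself used PyCaret); the fabric-attribute data below are synthetic stand-ins.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.random((400, 5))                        # e.g., encoded fabric attributes
y = (X[:, 1] * 2 + X[:, 4] > 1.4).astype(int)   # synthetic "preferred" label

models = {
    "gradient boosting": GradientBoostingClassifier(random_state=0),
    "random forest": RandomForestClassifier(random_state=0),
    "k-nearest neighbors": KNeighborsClassifier(),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()   # 5-fold cross-validation
    print(f"{name}: {acc:.3f}")
```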
33253 Deep Reinforcement Learning with Leonard-Ornstein Processes Based Recommender System
Authors: Khalil Bachiri, Ali Yahyaouy, Nicoleta Rogovschi
Abstract:
Improved user experience is a goal of contemporary recommender systems. Recommender systems are starting to incorporate reinforcement learning since it easily satisfies this goal of increasing a user's reward every session. In this paper, we examine the most effective reinforcement learning agent tactics on the MovieLens (1M) dataset, balancing precision and variety of recommendations. The absence of variability in final predictions makes simplistic techniques, although able to optimize ranking quality criteria, worthless for consumers of the recommendation system. Utilizing the stochasticity of Leonard-Ornstein processes, our suggested strategy encourages the agent to investigate its surroundings. The research demonstrates that raising the NDCG (Normalized Discounted Cumulative Gain) and HR (Hit Rate) criteria without lowering the Ornstein-Uhlenbeck process drift coefficient enhances the diversity of suggestions.
Keywords: recommender systems, reinforcement learning, deep learning, DDPG, Leonard-Ornstein process
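The exploration mechanism the abstract refers to is the classic Ornstein-Uhlenbeck noise added to a DDPG actor's actions for temporally correlated exploration. A minimal sketch follows, using the usual default parameters rather than the paper's tuned values; theta is the drift coefficient the abstract mentions.

```python
import numpy as np

class OUNoise:
    def __init__(self, size, mu=0.0, theta=0.15, sigma=0.2, dt=1e-2, seed=0):
        self.mu, self.theta, self.sigma, self.dt = mu, theta, sigma, dt
        self.rng = np.random.default_rng(seed)
        self.x = np.full(size, mu, dtype=float)

    def sample(self):
        # dx = theta * (mu - x) dt + sigma * sqrt(dt) * N(0, 1)
        dx = self.theta * (self.mu - self.x) * self.dt \
             + self.sigma * np.sqrt(self.dt) * self.rng.standard_normal(self.x.shape)
        self.x = self.x + dx
        return self.x

noise = OUNoise(size=3)
action = np.array([0.1, -0.2, 0.4])                 # deterministic actor output
noisy_action = np.clip(action + noise.sample(), -1.0, 1.0)
print(noisy_action)
```

Because consecutive samples are correlated through the mean-reverting drift, the perturbed recommendations drift smoothly rather than jumping randomly each step.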
33252 Using Differentiation Instruction to Create a Personalized Experience
Authors: Valerie Yocco Rossi
Abstract:
Objective: The author will share why differentiation is necessary in all classrooms, as well as strategies for differentiating content, process, and product. Through learning how to differentiate, teachers will be able to create activities and assessments that meet the abilities, readiness levels, and interests of all learners. Content and Purpose: This work will focus on how to create a learning experience for students that recognizes their different interests, abilities, and readiness levels by differentiating content, process, and product. Likewise, the best learning environments allow for choice. Choice boards allow students to select tasks based on interests. There can be challenging and basic tasks to meet the needs of various abilities. Equally, rubrics allow for personalized and differentiated assessments based on readiness levels and cognitive abilities. The principles of DI help to create a classroom where all students are learning to the best of their abilities. Outcomes: After reviewing the work, readers will be able to (1) identify the benefits of differentiated instruction; (2) convert traditional learning activities to differentiated ones; (3) differentiate writing-based assessments.
Keywords: differentiation, personalized learning, design, instructional strategies
33251 Probabilistic Approach of Dealing with Uncertainties in Distributed Constraint Optimization Problems and Situation Awareness for Multi-agent Systems
Authors: Sagir M. Yusuf, Chris Baber
Abstract:
In this paper, we describe how Bayesian inferential reasoning contributes to obtaining well-satisfied predictions for Distributed Constraint Optimization Problems (DCOPs) with uncertainties. We also demonstrate how DCOPs can be merged with multi-agent knowledge understanding and prediction (i.e., situation awareness). The DCOP functions were merged with a Bayesian belief network (BBN) in the form of situation, awareness, and utility nodes. We describe how the uncertainties can be represented in the BBN to make an effective prediction using the expectation-maximization algorithm or the conjugate gradient descent algorithm. The idea of variable prediction using Bayesian inference may reduce the number of variables in the agents' sampling domain and also allow the estimation of missing variables. Experimental results showed that the BBN performs more compelling predictions on samples containing uncertainties than on the perfect samples. That is, Bayesian inference can help in handling the uncertainties and dynamism of DCOPs, which is the current issue in the DCOP community. We show how Bayesian inference can be formalized with Distributed Situation Awareness (DSA) using uncertain and missing agents' data. The whole framework was tested on a multi-UAV mission for forest fire searching. Future work focuses on augmenting the existing architecture to deal with dynamic DCOP algorithms and multi-agent information merging.
Keywords: DCOP, multi-agent reasoning, Bayesian reasoning, swarm intelligence
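A minimal numeric sketch of the underlying Bayesian update follows: an agent revises its belief about a situation node ("fire ahead") from a noisy observation. The probabilities are invented for illustration and stand in for the BBN's situation and awareness nodes.

```python
import numpy as np

prior = np.array([0.9, 0.1])                 # P(no fire), P(fire)
likelihood_obs_fire = np.array([0.2, 0.7])   # P(sensor says "fire" | state)

# Bayes rule: posterior is proportional to likelihood * prior
unnorm = likelihood_obs_fire * prior
posterior = unnorm / unnorm.sum()
print("P(fire | sensor said fire) =", posterior[1])   # 0.07 / 0.25 = 0.28
```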
33250 The Human Right to a Safe, Clean and Healthy Environment in Corporate Social Responsibility's Strategies: An Approach to Understanding Mexico's Mining Sector
Authors: Thalia Viveros-Uehara
Abstract:
The virtues of Corporate Social Responsibility (CSR) are explored widely in the academic literature. However, few studies address its link to human rights per se, specifically the right to a safe, clean and healthy environment. Fewer still are the research works in this area that relate to developing countries, where a number of areas are biodiversity hotspots. In Mexico, despite the rise and evolution of CSR schemes, grave episodes of pollution persist, especially those caused by the mining industry. These cases raise the question of the correspondence between the current CSR practices of mining companies in the country and their responsibility to respect the right to a safe, clean and healthy environment. The present study approaches precisely this bridge, which until now has not been fully tackled in light of Mexico's 2011 constitutional human rights amendment and the United Nations' Guiding Principles on Business and Human Rights (UN Guiding Principles), adopted by the Human Rights Council in 2011. To that aim, it initially presents a contextual framework; it then explores qualitatively the adoption of human rights language in the CSR strategies of the three main mining companies in Mexico, and finally, it examines their standing with respect to the UN Guiding Principles. The results reveal that human rights are included in the CSR strategies of the analysed businesses, at least at the rhetorical level; however, they do not embrace the right to a safe, clean and healthy environment as such. Moreover, we conclude that although corporations publicly express their commitment to respect human rights, some operational weaknesses that hamper the exercise of such responsibility persist; for example, the systematic lack of human rights impact assessments per mining unit, the denial of actual and publicly known negative environmental episodes linked directly to their operations, and the absence of effective mechanisms to remediate adverse impacts.
Keywords: corporate social responsibility, environmental impacts, human rights, right to a safe, clean and healthy environment, mining industry
33249 Progressive Multimedia Collection Structuring via Scene Linking
Authors: Aman Berhe, Camille Guinaudeau, Claude Barras
Abstract:
In order to facilitate information seeking in large collections of multimedia documents with long and progressive content (such as broadcast news or TV series), one can extract the semantic links that exist between semantically coherent parts of documents, i.e., scenes. The links can then create a coherent collection of scenes from which it is easier to perform content analysis, topic extraction, or information retrieval. In this paper, we focus on TV series structuring and propose two approaches for scene linking at different levels of granularity (episode and season): a fuzzy online clustering technique and a graph-based community detection algorithm. When evaluated on the first two seasons of the TV series Game of Thrones, we found that the fuzzy online clustering approach performed better than graph-based community detection at the episode level, while graph-based approaches showed better performance at the season level.
Keywords: multimedia collection structuring, progressive content, scene linking, fuzzy clustering, community detection
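The graph-based variant can be sketched as follows: scenes become nodes, edges carry similarity weights, and a community detection algorithm groups linked scenes. The similarity values below are made up; the paper's actual scene-similarity features are not reproduced here.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
# (scene_a, scene_b, similarity) triples, e.g., from shared characters/terms
edges = [("s1", "s2", 0.9), ("s2", "s3", 0.8), ("s3", "s1", 0.7),
         ("s4", "s5", 0.9), ("s5", "s6", 0.8), ("s3", "s4", 0.1)]
G.add_weighted_edges_from(edges)

# Modularity-based communities approximate coherent storylines of scenes
communities = greedy_modularity_communities(G, weight="weight")
for i, com in enumerate(communities):
    print(f"storyline {i}: {sorted(com)}")
```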
33248 Surviral: An Agent-Based Simulation Framework for SARS-CoV-2 Outcome Prediction
Authors: Sabrina Neururer, Marco Schweitzer, Werner Hackl, Bernhard Tilg, Patrick Raudaschl, Andreas Huber, Bernhard Pfeifer
Abstract:
History and the current outbreak of Covid-19 have shown the deadly potential of infectious diseases. However, infectious diseases also have a serious impact on areas other than health and healthcare, such as the economy or social life. These areas are strongly codependent. Therefore, disease control measures, such as social distancing, quarantines, curfews, or lockdowns, have to be adopted in a very considerate manner. Infectious disease modeling can support policy- and decision-makers with adequate information regarding the dynamics of the pandemic and therefore assist in planning and enforcing appropriate measures that will prevent the healthcare system from collapsing. In this work, an agent-based simulation package named "survival" for simulating infectious diseases is presented, with a special focus on SARS-CoV-2. The presented simulation package was used in Austria to model the SARS-CoV-2 outbreak from the beginning of 2020. Agent-based modeling is a relatively recent modeling approach. Since our world is getting more and more complex, the complexity of the underlying systems is also increasing; the development of tools and frameworks and increasing computational power advance the application of agent-based models. For parametrizing the presented model, different data sources, such as known infections, wastewater virus load, blood donor antibodies, circulating virus variants, and the used hospitalization capacity, as well as the availability of medical materials like ventilators, were integrated with a database system and used. The simulation results of the model were used for predicting the dynamics and possible outcomes and were used by the health authorities to decide on the measures to be taken in order to control the pandemic situation. The simulation package was implemented in the programming language Java, and the analytics were performed with R Studio. During the first run in March 2020, the simulation showed that without measures other than individual personal behavior and appropriate medication, the death toll would have been about 27 million people worldwide within the first year. The model predicted the hospitalization rates (standard and intensive care) for Tyrol and South Tyrol with an average error of about 1.5%, calculated as 10-day forecasts. The state government and the hospitals were provided with the 10-day forecasts to support their decision-making. This ensured that standard care was maintained for as long as possible without restrictions. Furthermore, various measures were estimated and thereafter enforced; among other things, communities were quarantined based on the calculations while, in accordance with the calculations, the curfews for the entire population were reduced. With this framework, which is used in the national crisis team of the Austrian province of Tyrol, a very accurate model could be created at the federal state level as well as at the district and municipal levels, which was able to provide decision-makers with a solid information basis. This framework can be transferred to various infectious diseases and thus can be used as a basis for future monitoring.
Keywords: modelling, simulation, agent-based, SARS-CoV-2, COVID-19
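As a deliberately small illustration of agent-based epidemic dynamics, the S-I-R sketch below captures the core contact-infection-recovery loop; the real package models far more (variants, hospital capacity, measures), and all parameters here are invented.

```python
import random

random.seed(42)
N, DAYS = 300, 60
P_CONTACT_INFECT = 0.0006   # per-pair, per-day infection probability
P_RECOVER = 0.1             # per-day recovery probability

state = ["S"] * N
state[0] = "I"                                   # one seeded infection

for day in range(DAYS):
    infected = [i for i in range(N) if state[i] == "I"]
    for i in infected:
        for j in range(N):                       # random mixing between agents
            if state[j] == "S" and random.random() < P_CONTACT_INFECT:
                state[j] = "I"
        if random.random() < P_RECOVER:
            state[i] = "R"

print({s: state.count(s) for s in "SIR"})        # final S/I/R counts
```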
33247 Safety-critical Alarming Strategy Based on Statistically Defined Slope Deformation Behaviour Model Case Study: Upright-dipping Highwall in a Coal Mining Area
Authors: Lintang Putra Sadewa, Ilham Prasetya Budhi
Abstract:
A slope monitoring program has now become a mandatory campaign for any open pit mine around the world to operate safely. Utilizing various slope monitoring instruments and strategies, miners are now able to deliver precise decisions in mitigating the risk of slope failures, which can be catastrophic. Currently, the most sophisticated slope monitoring technology available is the Slope Stability Radar (SSR), which can measure wall deformation with submillimeter accuracy. One of its eminent features is that the SSR can provide a timely warning by automatically raising an alarm when a predetermined rate-of-movement threshold is reached. However, establishing proper alarm thresholds is arguably one of the most onerous challenges faced in any slope monitoring program. The difficulty mainly lies in the number of considerations that must be taken into account when generating a threshold, because an alarm must be effective: it should limit the occurrence of false alarms while also being able to capture any real wall deformation. In this sense, experience shows that a site-specific alarm threshold tends to produce more reliable results because it considers site-distinctive variables. This study attempts to determine alarming thresholds for safety-critical monitoring based on an empirical model of slope deformation behaviour that is defined statistically from deformation data captured by the Slope Stability Radar (SSR). The study area comprises an upright-dipping highwall setting in a coal mining area with intense mining activities, and the deformation data used for the study were recorded by the SSR throughout the year 2022. The model is site-specific in nature; thus, valuable information extracted from the model (e.g., time-to-failure, onset-of-acceleration, and velocity) will be applicable in setting up site-specific alarm thresholds and will give a clear understanding of how deformation trends evolve over the area.
Keywords: safety-critical monitoring, alarming strategy, slope deformation behaviour model, coal mining
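A minimal sketch of a rate-of-movement alarm follows: the deformation velocity is estimated over a moving window of SSR displacement readings and compared with a threshold. The readings, window, and threshold are illustrative assumptions, not the site-specific values the study derives.

```python
import numpy as np

displacement_mm = np.array([0.0, 0.1, 0.2, 0.4, 0.9, 1.8, 3.2])  # hourly readings
hours = np.arange(len(displacement_mm))

WINDOW = 3                      # hours used for the velocity estimate
THRESHOLD_MM_PER_H = 0.5        # assumed site-specific alarm threshold

# Least-squares slope over the latest window = current velocity (mm/h)
v = np.polyfit(hours[-WINDOW:], displacement_mm[-WINDOW:], 1)[0]
print(f"velocity = {v:.2f} mm/h")
if v > THRESHOLD_MM_PER_H:
    print("ALARM: rate-of-movement threshold exceeded")
```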
33246 Leveraging Power BI for Advanced Geotechnical Data Analysis and Visualization in Mining Projects
Authors: Elaheh Talebi, Fariba Yavari, Lucy Philip, Lesley Town
Abstract:
The mining industry generates vast amounts of data, necessitating robust data management systems and advanced analytics tools to achieve better decision-making in developing mining production and maintaining safety. This paper highlights the advantages of Power BI, a powerful business intelligence tool, over traditional Excel-based approaches for effectively managing and harnessing mining data. Power BI enables professionals to connect and integrate multiple data sources, ensuring real-time access to up-to-date information. Its interactive visualizations and dashboards offer an intuitive interface for exploring and analyzing geotechnical data. Advanced analytics is a collection of data analysis techniques used to improve decision-making; leveraging some of the most complex techniques in data science, it is used to do everything from detecting data errors and ensuring data accuracy to directing the development of future project phases. However, while Power BI is a robust tool, the specific visualizations required by geotechnical engineers may face limitations. This paper studies the capability to use Python or R programming within the Power BI dashboard to enable advanced analytics, additional functionalities, and customized visualizations. The dashboard provides comprehensive tools for analyzing and visualizing key geotechnical data metrics, including spatial representation on maps, field and lab test results, and subsurface rock and soil characteristics. Advanced visualizations like borehole logs and stereonets were implemented using Python programming within the Power BI dashboard, enhancing the understanding and communication of geotechnical information. Moreover, the dashboard's flexibility allows for the incorporation of additional data and visualizations based on the project scope and available data, such as pit design, rock fall analyses, rock mass characterization, and drone data. This further enhances the dashboard's usefulness in future projects, including the operation, development, closure, and rehabilitation phases. Additionally, it helps minimize the need to use multiple software programs across projects. This geotechnical dashboard in Power BI serves as a user-friendly solution for analyzing, visualizing, and communicating both new and historical geotechnical data, aiding informed decision-making and efficient project management throughout the various project stages. Its ability to generate dynamic reports and share them with clients in a collaborative manner further enhances decision-making processes and facilitates effective communication within geotechnical projects in the mining industry.
Keywords: geotechnical data analysis, power BI, visualization, decision-making, mining industry
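For illustration, a Python visual inside a Power BI dashboard might look like the sketch below. Power BI injects the selected fields as a pandas DataFrame named `dataset`; here a stand-in frame with assumed column names is defined so the script also runs outside Power BI.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Stand-in for the `dataset` DataFrame that Power BI would inject:
dataset = pd.DataFrame({
    "depth_m": [5, 10, 15, 20, 25],
    "ucs_mpa": [40, 55, 60, 75, 90],      # uniaxial compressive strength
})

plt.scatter(dataset["depth_m"], dataset["ucs_mpa"])
plt.xlabel("Depth (m)")
plt.ylabel("UCS (MPa)")
plt.title("Strength vs. depth from borehole lab tests")
plt.show()
```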
33245 Modelling the Physicochemical Properties of Papaya-Based Cookies Using Response Surface Methodology
Authors: Mayowa Saheed Sanusi, Musiliu Olushola Sunmonu, Abdulquadri Alaka, Owolabi Raheem, Adeyemi Ikimot Adejoke
Abstract:
The development of healthy cookies for health-conscious consumers cannot be overemphasized in the present global health crisis. This study aimed to evaluate and model the influence of the ripeness level of papaya puree (unripe, ripe, and overripe), oven temperature (130°C, 150°C, and 170°C), and oven rack speed (stationary, 10, and 20 rpm) on the physicochemical properties of papaya-based cookies using Response Surface Methodology (RSM). The physicochemical properties (baking time, cookie mass, cookie thickness, spread ratio, proximate composition, calcium, vitamin C, and Total Phenolic Content) were determined using standard procedures. The data obtained were statistically analysed at p≤0.05 using ANOVA. The polynomial regression model of response surface methodology was used to model the physicochemical properties. The adequacy of the models was determined using the coefficient of determination (R²), and the response optimizer of RSM was used to determine the optimum physicochemical properties for the papaya-based cookies. Cookies produced from overripe papaya puree were observed to have the shortest baking time; ripe papaya puree favors the cookie spread ratio, while unripe papaya puree gives cookies with the highest mass and thickness. The highest crude protein content, fiber content, calcium content, vitamin C, and Total Phenolic Content (TPC) were observed in papaya-based cookies produced from overripe puree. The models for baking time, cookie mass, cookie thickness, spread ratio, moisture content, crude protein, and TPC were significant, with R² ranging from 0.73 to 0.95. The optimum condition for producing papaya-based cookies with desirable physicochemical properties was obtained at a 149°C oven temperature and 17 rpm oven rack speed with the use of overripe papaya puree. Information on the use of puree from unripe, ripe, and overripe papaya can help increase the use of underutilized unripe or overripe papaya and also serve as a strategic means of obtaining a fat substitute to produce new products with lower production costs and health benefits.
Keywords: papaya based-cookies, modeling, response surface methodology, physicochemical properties
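The RSM idea can be sketched as fitting a second-order polynomial to coded factors and reading off predictions; the design points and responses below are invented, not the study's measurements.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Factors coded to [-1, 0, 1]: ripeness, oven temperature, rack speed.
X = np.array([[-1, -1, -1], [-1, -1, 1], [-1, 1, -1], [-1, 1, 1],
              [1, -1, -1], [1, -1, 1], [1, 1, -1], [1, 1, 1],
              [0, 0, 0], [0, 0, 0], [-1, 0, 0], [1, 0, 0]])
# Invented responses, e.g., total phenolic content of the cookies.
y = np.array([2.1, 2.4, 2.6, 2.9, 3.0, 3.6, 3.2, 3.9, 3.1, 3.0, 2.5, 3.4])

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
rsm.fit(X, y)
print("R^2 of the quadratic model:", round(rsm.score(X, y), 2))
# Predicted response for overripe puree, mid temperature, high rack speed:
print("prediction at (1, 0, 1):", round(rsm.predict([[1, 0, 1]])[0], 2))
```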
33244 Defining Processes of Gender Restructuring: The Case of Displaced Tribal Communities of North East India
Authors: Bitopi Dutta
Abstract:
Development Induced Displacement (DID) of subaltern groups has been an issue of intense debate in India. This research will conduct a gender analysis of displacement induced by mining projects in the tribal indigenous societies of North East India, centering on the primary research question: 'How does DID reorder gendered relationships in tribal matrilineal societies?' This paper will not focus primarily on the impacts of displacement induced by coal mining on indigenous tribal women in North East India; rather, it will study 'what' the processes are that lead to these transformations and 'how' they operate. In doing so, the paper will locate the cracks in traditional social systems that the discourse of displacement manipulates for its own benefit. DID in this sense will be understood not only as physical displacement but also as social and cultural displacement. The study will cover one matrilineal tribe in the state of Meghalaya in North East India affected by several coal mining projects over the last 30 years. In-depth unstructured interviews used to collect life narratives will be the primary mode of data collection, because the indigenous culture of the tribes in Meghalaya, including the matrilineal tribes, is based on oral history, where knowledge and experiences exist in a continuum. This is unlike modern societies, which produce knowledge in a compartmentalized system. An interview guide designed around specific themes will be used rather than specific questions, to ensure the flow of narratives from the interviewee. In addition to this, a number of focus groups will be held. The data collected through the life narratives will be supplemented and contextualized through documentary research using government data and local media sources of the region.
Keywords: displacement, gender-relations, matriliny, mining
33243 Reconfigurable Consensus Achievement of Multi Agent Systems Subject to Actuator Faults in a Leaderless Architecture
Authors: F. Amirarfaei, K. Khorasani
Abstract:
In this paper, reconfigurable consensus achievement of a team of agents with marginally stable linear dynamics and a single input channel is considered. The control algorithm is based on a first-order linear protocol. After the occurrence of a loss-of-effectiveness (LOE) fault in one of the actuators, using the imperfect information on the effectiveness of the actuators provided by the fault detection and identification module, the control gain is redesigned in a way to still reach consensus. The idea is based on modeling the change in effectiveness as a change of the Laplacian matrix. Then, as special cases of this class of systems, a team of single integrators as well as double integrators is considered, and its behavior subject to an LOE fault is analyzed. The well-known relative-measurements consensus protocol is applied to a leaderless team of single-integrator as well as double-integrator systems, and the Gersgorin disk theorem is employed to determine whether fault occurrence has an effect on system stability and team consensus achievement. The analyses show that a loss-of-effectiveness fault in the actuator(s) of integrator systems affects neither system stability nor consensus achievement.
Keywords: multi-agent system, actuator fault, stability analysis, consensus achievement
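A minimal simulation of the first-order protocol follows, with the LOE fault modeled as a diagonal effectiveness scaling of the Laplacian, as the abstract describes; the graph, fault level, and initial states are illustrative.

```python
import numpy as np

L = np.array([[ 2, -1, -1],
              [-1,  2, -1],
              [-1, -1,  2]], dtype=float)     # Laplacian of a complete 3-agent graph

alpha = np.diag([1.0, 0.4, 1.0])              # agent 2 retains 40% actuator effectiveness
L_fault = alpha @ L                           # effectiveness change = changed Laplacian

x = np.array([1.0, 5.0, -2.0])                # initial agent states
dt = 0.01
for _ in range(5000):                         # Euler integration of x' = -L_fault x
    x = x - dt * (L_fault @ x)
print("states after fault:", x)               # agents still agree on a common value

# The Gersgorin disks of -L_fault all lie in the closed left half-plane,
# so the LOE fault destabilizes neither the system nor consensus.
```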
33242 Planning for Environmental and Social Sustainability in Coastal Areas: A Case of Alappad
Authors: K. Vrinda
Abstract:
Coastal ecosystems across the world are facing a lot of challenges due to natural phenomena as well as uncontrolled human interventions. Alappad, a coastal island situated in Kerala, India, is undergoing significant damage and is gradually losing its environmental and social sustainability. The area is blessed with very rare and precious black mineral sand deposits. Sand mining for these minerals started in 1911 and is still continuing. Unfortunately, all the problems that Alappad now faces have their root in the mining of this mineral sand. The land area is continuously diminishing due to sea erosion. The mining has also caused displacement of people and environmental degradation. Marine life is also affected by beach mining and pollution. The inhabitants are fishermen who are largely dependent on the ecosystem for a living, so the loss of environmental sustainability subsequently affects social sustainability too. Now the damage has reached a point beyond which our actions may not be able to make any impact. This was one of the areas most affected by the 2004 tsunami, and the environmental degradation has further increased its vulnerability. This study therefore focuses on understanding the concerns related to resource utilization, the environment, and the indigenous community staying there, and on formulating suitable strategies to restore the sustainability of the area. An extensive on-site study was conducted to find out the physical, social, and economic characteristics of the area. A focus group discussion with the inhabitants shed light on the different issues they face in their day-to-day life. The analysis of all these data led to the formation of a new development vision for the area, which focuses on environmental restoration and socio-economic development while allowing controlled exploitation of resources. A participatory approach is formulated which enables these three aspects through community-based programs.
Keywords: community development, disaster resilience, ecological restoration, environmental sustainability, social-environmental planning, social sustainability
33241 A Relationship Extraction Method from Literary Fiction Considering Korean Linguistic Features
Authors: Hee-Jeong Ahn, Kee-Won Kim, Seung-Hoon Kim
Abstract:
The knowledge of the relationships between characters can help readers to understand the overall story or plot of a literary fiction. In this paper, we present a method for extracting the specific relationships between characters from a Korean literary fiction. Generally, methods for extracting relationships between characters in text are statistical or computational methods based on the sentence distance between characters, without considering Korean linguistic features. Furthermore, it is difficult for such methods to extract directed relationships, such as one-sided love, because they consider only the weight of a relationship, not its direction. Therefore, in order to identify specific relationships between characters, we propose a statistical method that considers linguistic features, such as syntactic patterns and speech verbs in Korean. The result of our method is represented by a weighted directed graph of the relationships between the characters. Furthermore, we expect that the proposed method could be applied to relationship analysis between characters in other content, such as movies or TV dramas.
Keywords: data mining, Korean linguistic feature, literary fiction, relationship extraction
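The output representation can be sketched as a weighted directed graph in which asymmetric edge weights capture one-sided relationships; the character names, triples, and weights below are invented for illustration.

```python
import networkx as nx

G = nx.DiGraph()
# (source, target, weight) triples extracted from, e.g., "A confessed to B"
extracted = [("Minho", "Jisoo", 5), ("Jisoo", "Minho", 1), ("Minho", "Hana", 2)]
for src, dst, w in extracted:
    G.add_edge(src, dst, weight=w)

# Asymmetric weights capture directed relationships such as unrequited love.
print(G["Minho"]["Jisoo"]["weight"], "vs", G["Jisoo"]["Minho"]["weight"])
```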
33240 Investigation of Topic Modeling-Based Semi-Supervised Interpretable Document Classifier
Authors: Dasom Kim, William Xiu Shun Wong, Yoonjin Hyun, Donghoon Lee, Minji Paek, Sungho Byun, Namgyu Kim
Abstract:
There have been many studies on document classification aiming to classify voluminous documents automatically. Through document classification, we can assign a specific category to each unlabeled document on the basis of various machine learning algorithms. However, providing labeled documents manually requires considerable time and effort. To overcome this limitation, semi-supervised learning, which uses unlabeled documents as well as labeled documents, was invented. However, traditional document classifiers, whether supervised or semi-supervised, cannot sufficiently explain the reason for or the process of the classification. Thus, in this paper, we propose a methodology to visualize the major topics and class components of each document. We believe that our methodology for visualizing the topics and classes of each document can enhance the reliability and explanatory power of document classifiers.
Keywords: data mining, document classifier, text mining, topic modeling
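One hedged way to realize a topic-based, interpretable classifier is sketched below: represent each document by its LDA topic mixture and train a linear classifier on those features, so a document's dominant topics help explain its predicted class. This is an illustration with a public dataset, not the authors' method.

```python
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

data = fetch_20newsgroups(subset="train",
                          categories=["sci.space", "rec.autos"],
                          remove=("headers", "footers", "quotes"))

counts = CountVectorizer(max_features=2000, stop_words="english").fit_transform(data.data)
lda = LatentDirichletAllocation(n_components=10, random_state=0)
topics = lda.fit_transform(counts)            # document-topic proportions

clf = LogisticRegression(max_iter=1000).fit(topics, data.target)
print("train accuracy:", round(clf.score(topics, data.target), 2))
print("doc 0 top topic:", topics[0].argmax())  # the topic that "explains" doc 0
```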