Search results for: autonomous mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1615

385 Relevance Feedback within CBIR Systems

Authors: Mawloud Mosbah, Bachir Boucheham

Abstract:

We present the results of a comparative study of techniques from the literature related to the relevance feedback mechanism in the case of short-term learning. Only one of the methods considered belongs to the data mining field, namely the K-Nearest Neighbours algorithm (KNN); the remaining methods come purely from the information retrieval field and fall under three major axes: query shifting, feature weighting, and optimization of the parameters of the similarity metric. As a contribution, and in addition to the comparative purpose, we propose a new version of the KNN algorithm, referred to as incremental KNN, which is distinct from the original version in that the rating of the current target image is influenced not only by the seeds but also by the images already rated. The results presented here were obtained after experiments conducted on the Wang database for one iteration, using colour moments in the RGB space. This compact descriptor, colour moments, is adequate for the efficiency required by interactive systems. The results obtained allow us to claim that the proposed algorithm performs well; it even outperforms a wide range of techniques available in the literature.
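
As an illustration of the incremental idea, here is a minimal sketch with hypothetical data (the 9-dimensional vectors stand in for three colour moments per RGB channel). A candidate's score comes from a KNN vote over the labelled set, and each newly scored image is folded into that set before the next candidate is scored, which is the distinguishing feature of the incremental variant.

```python
import numpy as np

def knn_score(candidate, rated, k=3):
    """KNN relevance vote; rated is a list of (features, label), label 1=relevant."""
    nearest = sorted(rated, key=lambda r: np.linalg.norm(candidate - r[0]))[:k]
    return float(np.mean([label for _, label in nearest]))

rng = np.random.default_rng(0)
seeds = [(rng.random(9), 1) for _ in range(3)] + [(rng.random(9), 0) for _ in range(3)]
candidates = [rng.random(9) for _ in range(5)]

rated = list(seeds)                      # start from the user-rated seeds
for img in candidates:
    score = knn_score(img, rated)        # incremental variant: already-rated
    rated.append((img, round(score)))    # images influence later candidates
    print(f"relevance score: {score:.2f}")
```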

Keywords: CBIR, category search, relevance feedback, query point movement, standard Rocchio’s formula, adaptive shifting query, feature weighting, original KNN, incremental KNN

Procedia PDF Downloads 258
384 Particle Size Analysis of Itagunmodi, Southwestern Nigeria, Alluvial Gold Ore Sample by the Gaudin-Schuhmann Method

Authors: Olaniyi Awe, Adelana R. Adetunji, Abraham Adeleke

Abstract:

Mining of alluvial gold ore by artisanal miners has been going on for decades at Itagunmodi, Southwestern Nigeria. In order to optimize the traditional panning gravity separation method commonly used in the area, a mineral particle size analysis study is critical. This study analyzed alluvial gold ore samples collected at five identified locations in the area, with a view to determining the ore particle size distributions. A measured 500 g of as-received alluvial gold ore sample was introduced into the uppermost sieve of an electric sieve shaker consisting of sieves arranged in order of decreasing nominal aperture (5600 μm, 3350 μm, 2800 μm, 355 μm, 250 μm, 125 μm and 90 μm) and operated for 20 minutes. The amount of material retained on each sieve was measured and tabulated for analysis. A screen analysis graph using the Gaudin-Schuhmann method was drawn for each of the screen tests on the alluvial samples. The study showed that the percentages of the fine particle size fraction (-125+90 μm) were 45.00%, 36.00%, 39.60%, 43.00% and 36.80% for the selected samples. These primary ore characterization results provide reference data for alluvial gold ore processing method selection, process performance measurement and optimization.
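
The Gaudin-Schuhmann distribution models the cumulative percentage passing as Y = 100·(x/k)^m, a straight line in log-log space. The sketch below fits m (distribution modulus) and k (size modulus) to hypothetical sieve masses; the study's actual measurements are not reproduced here.

```python
import numpy as np

apertures = np.array([5600, 3350, 2800, 355, 250, 125, 90])  # sieve sizes, um
retained_g = np.array([40, 55, 60, 120, 75, 60, 70])         # mass on each sieve
pan_g = 20.0                                                 # mass in the pan

total = retained_g.sum() + pan_g
cum_passing = 100 * (1 - np.cumsum(retained_g) / total)      # % finer than sieve

# log Y = m*log x + (2 - m*log k): a linear fit gives both parameters
slope, intercept = np.polyfit(np.log10(apertures), np.log10(cum_passing), 1)
m, k = slope, 10 ** ((2 - intercept) / slope)
print(f"distribution modulus m = {m:.2f}, size modulus k = {k:.0f} um")
```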

Keywords: alluvial gold ore, sieve shaker, particle size, Gaudin-Schuhmann

Procedia PDF Downloads 6
383 Geochemical Evaluation of Weathering-Induced Release of Trace Metals from the Maastrichtian Shales in Parts of Bida and Anambra Basins, Nigeria

Authors: Adetunji Olusegun Aderigibigbe

Abstract:

Shales, especially black shales, are of great geological significance in the study of heavy/trace metal contamination. This is due to their abundance and to the high concentration of heavy metals embedded in them, which are released during weathering. Heavy metals constitute one of the most dangerous forms of pollution known to humans because they are toxic (e.g., carcinogenic), non-biodegradable and can enter the global eco-biological cycle. In the past, heavy metal contamination in aquatic environments and agricultural topsoil has been attributed to industrial wastes, mining extraction and pollution from road traffic; only a few studies have focused on the weathering of shale as a possible source of heavy metal contamination. Against this background, this study attempts to establish the weathering of shale as a possible source of trace/heavy metal contamination. This was done by carefully selecting fresh shale samples, and their corresponding weathered samples, from selected localities in the Bida and Anambra Basins. The samples were analysed for trace/heavy metals at Activation Laboratories Ltd., Ontario, Canada. It was observed that some major and trace metals were released during weathering, i.e., some were depleted and some enriched. Contamination of water zones and agricultural topsoil is therefore traceable not only to biogenic processes but to geogenic inputs (the weathering of shale) as well.

Keywords: contamination, fresh samples, heavy metals, pollution, shales, trace metals, weathered samples

Procedia PDF Downloads 106
382 Improved Diver Tracking and Classification in Sonar Images Using a Robust Diver Wake Detection Algorithm

Authors: Mohammad Tarek Al Muallim, Ozhan Duzenli, Ceyhun Ilguy

Abstract:

Harbor protection systems are of great importance, and the need for automatic protection systems has increased over recent years. Diver detection active sonar is used to detect underwater threats such as divers and autonomous underwater vehicles. To detect such threats automatically, the sonar image is processed by algorithms that detect, track and classify underwater objects. In this work, a diver tracking and classification algorithm is improved by proposing a robust wake detection method. To detect objects, the sonar image is normalized and then segmented based on a fixed threshold. Next, the centroids of the segments are found and clustered based on a distance metric. A linear Kalman filter is then applied to track the objects. To reduce the effect of noise and the creation of false tracks, the Kalman tracker is fine-tuned based on our active sonar specifications. After the tracks are initialized and updated, they are subjected to a filtering stage to eliminate noisy and unstable tracks, as well as objects with a speed outside the diver speed range, such as buoys and fast boats. The resulting tracks are then subjected to a classification stage to decide whether the tracked object is an open-circuit or a closed-circuit diver. At the classification stage, a small area around the object is extracted and a novel wake detection method is applied; the morphological features of the object together with its wake are extracted. We used a support vector machine to find the best classifier. The training and test sonar images were collected by ARMELSAN Defense Technologies Company using the portable diver detection sonar ARAS-2023. After applying the algorithm to the test sonar data, we obtain fine and stable tracks of the divers, and the total classification accuracy achieved for the diver type is 97%.
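
A minimal sketch of the tracking-stage core (parameters are assumed, not the paper's tuned values): a linear constant-velocity Kalman filter over segment centroids, followed by the speed gate that rejects objects moving faster than a diver plausibly swims.

```python
import numpy as np

dt = 1.0                                           # sonar frame interval [s]
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
              [0, 0, 1, 0], [0, 0, 0, 1]])         # constant-velocity model
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])         # only position is measured
Q, R = np.eye(4) * 0.01, np.eye(2) * 0.5           # process / measurement noise

def kalman_step(x, P, z):
    x, P = F @ x, F @ P @ F.T + Q                  # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    return x + K @ (z - H @ x), (np.eye(4) - K @ H) @ P   # update

x, P = np.zeros(4), np.eye(4)
for z in [np.array([0.4, 0.1]), np.array([0.9, 0.2]), np.array([1.5, 0.3])]:
    x, P = kalman_step(x, P, z)                    # feed segment centroids

speed = np.hypot(x[2], x[3])                       # estimated track speed
print("diver-like" if speed < 2.0 else "rejected (outside diver speed range)")
```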

Keywords: harbor protection, diver detection, active sonar, wake detection, diver classification

Procedia PDF Downloads 212
381 Cognitive Performance and Physiological Stress during an Expedition in Antarctica

Authors: Andrée-Anne Parent, Alain-Steve Comtois

Abstract:

The Antarctic environment can be a great challenge for human exploration. Explorers need to be focused on the task and require the physical abilities to succeed and survive in complete autonomy in this hostile environment. The aim of this study was to observe cognitive performance and physiological stress, using a biomarker (cortisol) and handgrip strength, during an expedition in Antarctica. A total of 6 explorers were in completely autonomous exploration on the Forbidden Plateau in Antarctica, attempting to reach unknown summits during a 30-day period. The Stroop test, a simple reaction time test, and a mood scale (PANAS) were administered every week during the expedition. Saliva samples were taken before sailing to Antarctica, on the first day on the continent, after the mission on the continent and on the boat return trip. Furthermore, hair samples were taken before and after the expedition. The results were analyzed with SPSS using repeated-measures ANOVA. The Stroop and mood scale results are presented in the following order: 1) before sailing to Antarctica, 2) the first day on the continent, 3) after the mission on the continent and 4) on the boat return trip. No significant difference was observed with the Stroop (759±166 ms, 850±114 ms, 772±179 ms and 833±105 ms, respectively) or the PANAS (39.5±5.7, 40.5±5, 41.8±6.9, 37.3±5.8 for positive emotions, and 17.5±2.3, 18.2±5, 18.3±8.6, 15.8±5.4 for negative emotions, respectively) (p>0.05). However, there appeared to be an improvement at the end of the second week. Furthermore, the simple reaction time was significantly lower at the end of the second week, a moment when important decisions were taken about the mission, than the week before (416±39 ms vs 459.8±39 ms; p=0.030). Salivary cortisol was not significantly different (p>0.05), possibly due to large variations, and seemed to reach a peak on the first day on the continent. However, hair cortisol increased significantly from pre- to post-expedition (2.4±0.5 pg/mg pre-expedition vs 16.7±9.2 pg/mg post-expedition, p=0.013), showing substantial stress during the expedition. Moreover, no significant difference was observed in grip strength except between after the mission on the continent and after the boat return trip (91.5±21 kg vs 85±19 kg, p=0.20). In conclusion, cognitive performance does not seem to be affected during the expedition, and it even seems to increase around specific important events, when the crew focused on the task at hand. Physiological stress did not change significantly at specific moments; however, a global pre-post mission measure can be important, and for this reason, for long-term missions, a pre-expedition baseline measure is important for crew members.

Keywords: Antarctica, cognitive performance, expedition, physiological adaptation, reaction time

Procedia PDF Downloads 222
380 Performance Demonstration of Extendable NSPO Space-Borne GPS Receiver

Authors: Hung-Yuan Chang, Wen-Lung Chiang, Kuo-Liang Wu, Chen-Tsung Lin

Abstract:

The National Space Organization (NSPO) completed in 2014 the development of a space-borne GPS receiver, including design, manufacture, comprehensive functional testing, environmental qualification testing and so on. The main performance figures of this receiver include 8-meter positioning accuracy, 0.05 m/s velocity accuracy, a cold start time of at most 90 seconds, and operation in high-dynamic scenarios of up to 15 g. The receiver will be integrated into the autonomous FORMOSAT-7 NSPO-built satellite, scheduled to be launched in 2019 to execute pre-defined scientific missions. The flight model of this receiver, manufactured in early 2015, will undergo comprehensive functional tests and environmental acceptance tests, which are expected to be completed by the end of 2015. The space-borne GPS receiver is a pure software design in which all GPS baseband signal processing is executed by a digital signal processor (DSP), of whose throughput currently only 50% is used. In response to the booming global navigation satellite systems, NSPO will gradually expand this receiver into a multi-mode, multi-band, high-precision navigation receiver, and even a science payload, such as a reflectometry receiver for a global navigation satellite system. The fundamental purpose of this extension study is to port heavily reused software algorithms with a large computational load, such as signal acquisition and correlation, to an FPGA, while the processor remains responsible for operational control, the navigation solution, orbit propagation and so on. Since FPGA technology is developing and evolving rapidly, the new system architecture upgraded via an FPGA should be able to achieve the goal of a multi-mode, multi-band, high-precision navigation receiver, or a scientific receiver. Finally, the test results show that the new system architecture not only retains the original overall performance but also sets aside more resources for future expansion. This paper explains the detailed DSP/FPGA architecture, its development, test results, and the goals of the next development stage of this receiver.
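
For readers unfamiliar with the acquisition-and-correlation step being ported to the FPGA, here is a minimal sketch of FFT-based circular correlation, a standard way to search code phase in software receivers. A random ±1 sequence stands in for a real C/A Gold code, so this is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=1023)       # stand-in spreading code
true_delay = 357                                # unknown code phase (samples)
received = np.roll(code, true_delay) + 0.5 * rng.standard_normal(1023)

# Circular correlation in one shot: IFFT(FFT(rx) * conj(FFT(code)))
corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code)))
est_delay = int(np.argmax(np.abs(corr) ** 2))   # peak marks the code phase
print(f"estimated code phase: {est_delay} (true: {true_delay})")
```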

Keywords: space-borne, GPS receiver, DSP, FPGA, multi-mode multi-band

Procedia PDF Downloads 347
379 Application of Enzyme-Mediated Calcite Precipitation for Surface Control of Gold Mining Tailing Waste

Authors: Yogi Priyo Pradana, Heriansyah Putra, Regina Aprilia Zulfikar, Maulana Rafiq Ramadhan, Devyan Meisnnehr, Zalfa Maulida Insani

Abstract:

This paper studies the effects and mechanisms of treating fine-grained tailings by Enzyme-Mediated Calcite Precipitation (EMCP). The grouting solution consists of reagents (CaCl₂ and urea, CO(NH₂)₂) and urease enzyme, which react to produce CaCO₃. In sample preparation, test tubes were used to investigate the precipitation rate of calcite. 75 mL of grouting solution was added per mold sample; the solution was poured into the mold up to 5 mm above the top surface of the tailings, to ensure the entire surface was submerged. The sample was left open in the cylinder for up to 3 days of curing. The direct mixing method was used so that the cementation process occurred evenly. The relationship between the results of the UCS test and the calcite precipitation rate likely indicates that the amount of calcite deposited in the treated tailings controls their strength. The samples were analyzed using atomic absorption spectroscopy (AAS) to evaluate their metal and metalloid content. The calcium carbonate deposited in the tailings is expected to strengthen the bonds between tailing grains, which slip easily on the banks of the tailings dam. The EMCP method is thus expected to strengthen tailings surfaces for erosion control.
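
As a rough orientation, the sketch below estimates the maximum calcite yield from the two reagents, assuming the idealized urea hydrolysis and precipitation reactions go to completion; the concentrations are hypothetical, not the study's recipe.

```python
# Idealized reactions: CO(NH2)2 + 2 H2O -> 2 NH4+ + CO3^2-, then
# Ca^2+ + CO3^2- -> CaCO3, i.e. a 1:1 molar ratio of CaCl2 to urea.
M_CACL2, M_UREA, M_CACO3 = 110.98, 60.06, 100.09   # molar masses, g/mol

def max_calcite_g(cacl2_g: float, urea_g: float) -> float:
    """Calcite yield limited by the scarcer reagent."""
    return min(cacl2_g / M_CACL2, urea_g / M_UREA) * M_CACO3

vol_l, conc = 0.075, 0.5   # 75 mL of grout per mold, assumed 0.5 mol/L each
print(f"max calcite: {max_calcite_g(conc*vol_l*M_CACL2, conc*vol_l*M_UREA):.2f} g")
```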

Keywords: tailing, EMCP, UCS, AAS

Procedia PDF Downloads 114
378 Investigating the Biosorption Potential of Indigenous Filamentous Fungi from Copperbelt Tailing Dams in Zambia with Copper and Cobalt Tolerance

Authors: Leonce Dusengemungu

Abstract:

Filamentous fungi indigenous to heavy metal (HM)-contaminated environments have considerable biosorption potential, yet they are currently under-investigated in developing countries. In the work presented herein, the biosorption potential of three indigenous filamentous fungi (Aspergillus transmontanensis, Cladosporium cladosporioides, and Geotrichum candidum) isolated from copper and cobalt mining wasteland sites in Zambia's Copperbelt province was investigated. In Cu and Co tolerance tests, all the fungal isolates were shown to be tolerant, with mycelial growth at HM concentrations of up to 7000 ppm. However, exposure to high Cu and Co concentrations hindered the growth of the three strains to varying degrees, resulting in reduced mycelial biomass (evidenced by the loss of the infrared bands at 887 and 930 cm⁻¹ of the 1,3-glucan backbone) as well as in altered morphology, sporulation, and pigment synthesis. In addition, gas chromatography-mass spectrometry characterization of the fungal biomass extracts allowed the detection of changes in the chemical constituents upon exposure to HMs, with profiles poorer in maltol, 1,2-cyclopentadione, and n-hexadecanoic acid, and richer in furaldehydes. Biosorption tests showed that A. transmontanensis and G. candidum performed better as bioremediators than C. cladosporioides, with biosorption efficiencies of 1645, 1853 and 1253 ppm at pH 3, respectively, and they may deserve further research under field conditions.

Keywords: bioremediation, fungi, biosorption, heavy metal

Procedia PDF Downloads 44
377 Recommender Systems Using Ensemble Techniques

Authors: Yeonjeong Lee, Kyoung-jae Kim, Youngtae Kim

Abstract:

This study proposes a novel recommender system that uses data mining and multi-model ensemble techniques to enhance recommendation performance by reflecting the user's precise preferences. The proposed model consists of two steps. In the first step, this study uses logistic regression, decision trees, and artificial neural networks to predict customers who have a high likelihood of purchasing products in each product group. It then combines the results of each predictor using multi-model ensemble techniques such as bagging and bumping. In the second step, this study uses market basket analysis to extract association rules for co-purchased products. Finally, through the above two steps, the system selects customers with a high likelihood of purchasing products in each product group and recommends appropriate products from the same or different product groups to them. We test the usability of the proposed system using a prototype and real-world transaction and profile data. In addition, we survey user satisfaction with the product list recommended by the proposed system versus randomly selected product lists. The results show that the proposed system may be useful in a real-world online shopping store.
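
A minimal sketch of the first step's idea (toy data; this is not the authors' exact pipeline): three base learners each estimate the purchase likelihood for a product group, and their probabilities are combined by simple averaging, a soft-vote stand-in for the bagging/bumping combination.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))           # customer profile features
y = (X[:, 0] + X[:, 1] > 0).astype(int)     # toy "buys in this group" label

models = [LogisticRegression(),
          DecisionTreeClassifier(max_depth=4),
          MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000)]
probas = [m.fit(X, y).predict_proba(X)[:, 1] for m in models]
ensemble_score = np.mean(probas, axis=0)    # combine the three predictors

print(np.argsort(ensemble_score)[-10:])     # ten likeliest buyers
```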

Keywords: product recommender system, ensemble technique, association rules, decision tree, artificial neural networks

Procedia PDF Downloads 273
376 Malware Beaconing Detection by Mining Large-scale DNS Logs for Targeted Attack Identification

Authors: Andrii Shalaginov, Katrin Franke, Xiongwei Huang

Abstract:

One of the leading problems in cyber security today is the emergence of targeted attacks conducted by adversaries with access to sophisticated tools. These attacks usually steal senior-level employees' system privileges in order to gain unauthorized access to confidential knowledge and valuable intellectual property. The malware used for the initial compromise of the systems is sophisticated and may target zero-day vulnerabilities. In this work, we utilize a common behaviour of malware called 'beaconing', which implies that infected hosts communicate with their Command and Control (C2) servers at regular intervals with relatively small time variations. By analysing such beacon activity through passive network monitoring, it is possible to detect potential malware infections. We therefore focus on time gaps as indicators of possible C2 activity in targeted enterprise networks. We represent DNS log files as a graph whose vertices are destination domains and whose edges carry timestamps. Then, using four periodicity detection algorithms for each pair of internal-external communications, we check the timestamp sequences to identify beaconing activities. Finally, based on the graph structure, we infer the existence of other infected hosts and of malicious domains enrolled in the attack activities.
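
A minimal sketch of one such periodicity check (an assumed approach, not one of the paper's four algorithms): a (host, domain) pair is flagged as beacon-like when its inter-query time gaps are nearly constant, i.e. have a small coefficient of variation.

```python
from collections import defaultdict

def beacon_candidates(events, max_cv=0.1, min_queries=10):
    """events: iterable of (host, domain, unix_timestamp) DNS records."""
    series = defaultdict(list)
    for host, domain, ts in events:
        series[(host, domain)].append(ts)
    hits = []
    for key, stamps in series.items():
        stamps.sort()
        if len(stamps) < min_queries:
            continue
        gaps = [b - a for a, b in zip(stamps, stamps[1:])]
        mean = sum(gaps) / len(gaps)
        var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
        if mean > 0 and (var ** 0.5) / mean < max_cv:   # small time variation
            hits.append((key, mean))
    return hits

# Hypothetical log: a host querying evil.example every ~60 s with jitter
log = [("10.0.0.5", "evil.example", 60.0 * i + 0.3 * (i % 2)) for i in range(20)]
print(beacon_candidates(log))
```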

Keywords: malware detection, network security, targeted attack, computational intelligence

Procedia PDF Downloads 236
375 Evaluation of Drilling Performance through Bit-Rock Interaction Using Passive Vibration Assisted Rotation Drilling (PVARD) Tool

Authors: Md. Shaheen Shah, Abdelsalam Abugharara, Dipesh Maharjan, Syed Imtiaz, Stephen Butt

Abstract:

Drilling performance is an essential goal in the petroleum and mining industries. The drilling rate of penetration (ROP), which is inversely proportional to the mechanical specific energy (MSE), is influenced by numerous factors, among which are the applied parameters (torque T, weight on bit WOB, fluid flow rate, revolutions per minute rpm), rock-related parameters (rock type, rock homogeneity, rock anisotropy orientation), and mechanical parameters (bit type, configuration of the bottom hole assembly BHA). This paper focuses on studying drilling performance by implementing a passive vibration-assisted rotary drilling tool (pVARD) as part of the BHA, using different bit types (coring bit, roller-cone bit, and PDC bit) and various rock types (rock-like material, granite, sandstone, etc.). The results of this study aim to produce a pVARD index for optimal drilling performance, considering the recommendations of the pVARD spring compression tests and the stress-strain analysis of rock samples conducted prior to the drilling experiments, analyzing the cutting size distribution, and evaluating the applied drilling parameters as a function of WOB. These results are compared with those obtained from drilling without pVARD, which represents the typical rigid BHA of conventional drilling.
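
For reference, the MSE mentioned above is commonly computed with Teale's classic formulation (the abstract does not spell out its exact expression, so this is the textbook version, with hypothetical lab-scale numbers):

```python
import math

def mse_pa(wob_n, torque_nm, rpm, bit_area_m2, rop_m_s):
    """MSE = WOB/A + 2*pi*N*T / (A*ROP), all in SI units (result in Pa)."""
    thrust_term = wob_n / bit_area_m2
    rotary_term = 2 * math.pi * (rpm / 60) * torque_nm / (bit_area_m2 * rop_m_s)
    return thrust_term + rotary_term

area = math.pi * (0.035 / 2) ** 2          # 35 mm coring bit
print(f"MSE = {mse_pa(5e3, 40.0, 300, area, 2.0 / 3600):.3e} Pa")
# 5 kN WOB, 40 N*m torque, 300 rpm, 2 m/h penetration
```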

Keywords: BHA, drilling performance, MSE, pVARD, rate of penetration, ROP, tensile and shear fractures, unconfined compressive strength

Procedia PDF Downloads 126
374 Mathematical Modeling of the Calculation of the Absorbed Dose in Uranium Production Workers with Genetic Effects

Authors: P. Kazymbet, G. Abildinova, K.Makhambetov, M. Bakhtin, D. Rybalkina, K. Zhumadilov

Abstract:

Cytogenetic research was conducted on workers of the Stepnogorsk Mining-Chemical Combine (Akmola region), with the study of 26341 chromosomal metaphases. Using regression analysis with the program DataFit, version 5.0, the dependence between exposure dose and the following cytogenetic indices was studied: frequency of aberrant cells, frequency of chromosomal aberrations, and frequency of the sums of dicentric chromosomes and centric rings. Experimental data on 'dose-effect' calibration curves enabled the development of a mathematical model allowing the absorbed dose at the time of the study to be calculated from the frequency of aberrant cells, of chromosome aberrations, and of the sums of dicentric chromosomes and centric rings. In the dose range of 0.1 Gy to 5.0 Gy, the dependence of the cytogenetic parameters on dose had the following form: Y = 0.0067e^(0.3307x) (R² = 0.8206) for the frequency of chromosomal aberrations; Y = 0.0057e^(0.3161x) (R² = 0.8832) for the frequency of cells with chromosomal aberrations; Y = 5×10⁻⁵e^(0.6383x) (R² = 0.6321) for the frequency of the sums of dicentric chromosomes and centric rings per cell. On the basis of the cytogenetic parameters and the regression equations, the absorbed dose calculated for the uranium production workers at the time of the study did not exceed 0.3 Gy.
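
Since the calibration curves have the form Y = a·e^(bx), the absorbed dose follows by inverting them: x = ln(Y/a)/b. A minimal sketch using the reported coefficients for chromosomal aberrations (the example frequency is hypothetical):

```python
import math

def dose_from_frequency(y: float, a: float, b: float) -> float:
    """Invert Y = a*exp(b*x) to get the dose x in Gy."""
    return math.log(y / a) / b

a, b = 0.0067, 0.3307             # coefficients for chromosomal aberrations
print(f"{dose_from_frequency(0.0074, a, b):.2f} Gy")   # ~0.30 Gy
```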

Keywords: Stepnogorsk, mathematical modeling, cytogenetic, dicentric chromosomes

Procedia PDF Downloads 449
373 Comparative Sustainability Performance Analysis of Australian Companies Using Composite Measures

Authors: Ramona Zharfpeykan, Paul Rouse

Abstract:

Organizational sustainability is important both to organizations themselves and to their stakeholders. Despite its increasing popularity and the increasing number of organizations reporting on sustainability, research on evaluating and comparing the sustainability performance of companies is limited. The aim of this study was to develop models to measure sustainability performance for both cross-sectional and longitudinal comparisons across companies in the same or different industries. A secondary aim was to see whether sustainability reports can be used to evaluate sustainability performance. The study used both a content analysis of Australian sustainability reports in mining and metals and in financial services for 2011-2014 and a survey of Australian and New Zealand organizations. Two methods, ranging from a composite index using uniform weights to data envelopment analysis (DEA), were employed to analyze the data and develop the models. The results show strong, statistically significant relationships between the developed models, which suggests that each model provides a consistent, systematic and reasonably robust analysis. The results of the models show that, for both industries, companies whose sustainability scores were above or below the industry average stayed almost the same during the study period. These indices and models can be used by companies to evaluate their sustainability performance and compare it with previous years, or with other companies in the same or different industries. These methods can also be used by various stakeholders and by sustainability ranking bodies such as the Global Reporting Initiative (GRI).
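
A minimal sketch of the uniform-weight composite index (the scores are hypothetical): each indicator is min-max normalized across companies, then averaged with equal weights.

```python
import numpy as np

scores = np.array([[80, 60, 70],    # company A: three GRI-style indicators
                   [50, 90, 40],    # company B
                   [65, 55, 85]])   # company C

lo, hi = scores.min(axis=0), scores.max(axis=0)
normalized = (scores - lo) / (hi - lo)      # per-indicator min-max scaling
composite = normalized.mean(axis=1)         # uniform weights

for name, value in zip("ABC", composite):
    print(f"company {name}: {value:.2f}")
```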

Keywords: data envelopment analysis, sustainability, sustainability performance measurement system, sustainability performance index, global reporting initiative

Procedia PDF Downloads 151
372 Strategic Mine Planning: A SWOT Analysis Applied to KOV Open Pit Mine in the Democratic Republic of Congo

Authors: Patrick May Mukonki

Abstract:

KOV pit (Kamoto Oliveira Virgule) is located 10 km from Kolwezi, one of the mineral-rich towns in the Lualaba province of the Democratic Republic of Congo. The KOV pit currently operates under Katanga Mining Limited (KML), a joint venture between Glencore and Gécamines (a state-owned company). Recently, the mine optimization process produced a life of mine of approximately 10 years with nine pushbacks, using the Datamine NPV Scheduler software. In previous KOV pit studies, we outlined the impact of the accuracy of the geological information on the long-term mine plan of a big copper mine such as KOV pit. The approach taken discussed three main scenarios and outlined some weaknesses on the geological information side. In this paper, we highlight, as an overview, those weaknesses, strengths and opportunities in a global SWOT analysis. The approach taken here is essentially descriptive in terms of the steps followed to optimize KOV pit; at every step, we categorized the challenges we faced in order to achieve a better trade-off between what we called strengths and what we called weaknesses. The same logic is applied to the opportunities and threats. The SWOT analysis conducted in this paper demonstrates that, despite a generally poor ore body definition and very harsh groundwater conditions, there is room for improvement for such a high-grade ore body.

Keywords: mine planning, mine optimization, mine scheduling, SWOT analysis

Procedia PDF Downloads 202
371 Short Answer Grading Using Multi-Context Features

Authors: S. Sharan Sundar, Nithish B. Moudhgalya, Nidhi Bhandari, Vineeth Vijayaraghavan

Abstract:

Automatic short answer grading is one of the prime applications of artificial intelligence in education. Several approaches have been explored over the years, involving selective handcrafted features, graphical matching techniques, concept identification and mapping, complex deep frameworks, sentence embeddings, etc. However, keeping in mind the real-world application of the task, these solutions carry an overhead in terms of computation and resources to achieve high performance. In this work, a simple and effective solution is proposed, making use of elemental features based on statistical and linguistic properties and on word-based similarity measures, in conjunction with tree-based classifiers and regressors. The results for the classification tasks show improvements ranging from 1% to 30%, while the regression task shows a stark improvement of 35%. The authors attribute these improvements to the addition of multiple similarity scores, which provide an ensemble of scoring criteria to the models. The authors also believe the work demonstrates that classical natural language processing techniques and simple machine learning models can achieve high results for short answer grading.
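
A minimal sketch of the feature-plus-tree idea (the features and training pairs are hypothetical, not the paper's feature set): token-overlap statistics between reference and student answer feed a tree-based regressor that outputs a grade.

```python
from sklearn.ensemble import RandomForestRegressor

def features(reference: str, answer: str):
    ref, ans = set(reference.lower().split()), set(answer.lower().split())
    overlap = len(ref & ans)
    jaccard = overlap / len(ref | ans) if ref | ans else 0.0
    return [overlap, jaccard, len(ans), abs(len(ans) - len(ref))]

data = [("photosynthesis converts light to chemical energy",
         "it converts light into chemical energy", 5.0),
        ("photosynthesis converts light to chemical energy",
         "plants are green", 1.0),
        ("photosynthesis converts light to chemical energy",
         "light energy becomes chemical energy in plants", 4.5)]

X = [features(r, a) for r, a, _ in data]
y = [grade for _, _, grade in data]
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
print(model.predict([features(data[0][0], "converts light to chemical energy")]))
```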

Keywords: artificial intelligence, intelligent systems, natural language processing, text mining

Procedia PDF Downloads 115
370 Synergy Effect of Energy and Water Saving in China's Energy Sectors: A Multi-Objective Optimization Analysis

Authors: Yi Jin, Xu Tang, Cuiyang Feng

Abstract:

The ‘11th Five-Year’ and ‘12th Five-Year’ plans clearly put forward strict controls on the total amount and intensity of energy and water consumption. The synergy effect of energy and water has rarely been considered in the process of energy and water saving in China, so its contribution cannot be maximized. Energy sectors consume large amounts of energy and water when producing massive amounts of energy, which makes them both energy- and water-intensive. The synergy effect in these sectors is therefore significant. This paper assesses and optimizes the synergy effect in three energy sectors against the background of promoting energy and water saving. The results show that, from the perspective of the critical path, the chemical industry, the mining and processing of non-metal ores, and the smelting and pressing of metals are coupling points in the process of energy and water flowing to the energy sectors, in which the implementation of energy- and water-saving policies can bring a significant synergy effect. Multi-objective optimization shows that increasing efforts on input restructuring can effectively improve the synergy effect; relatively large synergetic energy savings and little water saving are obtained after solely reducing the energy and water intensity of the coupling sectors. By optimizing the input structure of the sectors, especially the coupling sectors, the synergy effect of energy and water saving can be improved in the energy sectors under the premise of keeping the economy running stably.
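
A minimal sketch of how two such objectives can be scalarized and optimized jointly (coefficients are hypothetical, not the paper's input-output model): energy and water use of two inputs are combined with a weight and minimized subject to an output floor.

```python
from scipy.optimize import linprog

energy = [3.0, 1.0]        # energy intensity of inputs x1, x2
water = [1.0, 2.5]         # water intensity of inputs x1, x2
w = 0.5                    # trade-off weight between the two objectives

c = [w * e + (1 - w) * q for e, q in zip(energy, water)]
# keep output up: 2*x1 + 1.5*x2 >= 10, written as -2*x1 - 1.5*x2 <= -10
res = linprog(c, A_ub=[[-2.0, -1.5]], b_ub=[-10.0], bounds=[(0, None)] * 2)
print(res.x, res.fun)      # optimal input mix and combined resource cost
```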

Keywords: critical path, energy sector, multi-objective optimization, synergy effect, water

Procedia PDF Downloads 339
369 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-

Authors: Nieto Bernal Wilson, Carmona Suarez Edgar

Abstract:

Organizations have structured and unstructured information in different formats, sources, and systems. Part of this comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level, these organizations present some deficiencies. Part of this problem lies in a lack of interest in extracting knowledge from their data sources, as well as in the absence of the operational capabilities needed to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools, which are of great interest to business intelligence, since they are repositories that serve as a basis for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and facilitate corporate decision making and research. The following paper presents a structured, simple methodology inspired by agile development models such as Scrum, XP and AUP, together with object-relational and spatial data models and a baseline of data modeling under UML and Big Data. In this way, we sought to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for the generation of patterns and derived models from the structured fact objects.

Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse

Procedia PDF Downloads 386
368 The Development of a Precision Irrigation System for Durian

Authors: Chatrabhuti Pipop, Visessri Supattra, Charinpanitkul Tawatchai

Abstract:

Durian is one of the top agricultural products exported by Thailand, and there is massive market potential for the durian industry. While the global demand for Thai durians, especially from China, is very high, Thailand's durian supply is far from satisfying it. Poor agricultural practices result in low yields and poor fruit quality. Most irrigation systems currently used by the farmers run on fixed schedules or fixed rates that ignore actual weather conditions and crop water requirements. In addition, the emerging technologies are too difficult and complex, and their prices too high, for the farmers to adopt and afford; many farmers leave the durian trees to grow naturally. With improper irrigation and nutrient management, durians are vulnerable to a variety of issues, including stunted growth, failure to flower, disease, and death. Technical development and research on durian are much needed to support the wellbeing of the farmers and the economic development of the country. However, there are a limited number of studies or development projects for durian, because durian is a perennial crop requiring a long time to obtain reportable results. This study therefore aims to address the problem of durian production by developing an autonomous, precision irrigation system. The system is designed and equipped with an industrial programmable controller, a weather station, and a digital flow meter. Daily water requirements are computed from weather data such as rainfall and evapotranspiration for daily irrigation with variable flow rates. A prediction model is also designed as part of the system to enhance the irrigation schedule. Before the system was installed in the field, a simulation model was built and tested in a laboratory setting to ensure its accuracy. Water consumption was measured daily before and after the experiment for further analysis. With this system, the crop water requirement is precisely estimated and optimized based on data from the weather station; durian is irrigated with the right amount at the right time, offering the opportunity of higher yield and higher income to the farmers.
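
A minimal sketch of the daily water-requirement computation, using the standard FAO-56 crop-coefficient approach (the paper's exact model is not specified; the Kc and weather values below are assumptions):

```python
def daily_irrigation_mm(et0_mm: float, kc: float, rain_mm: float) -> float:
    """Crop water need ETc = Kc * ET0, less the day's rainfall."""
    return max(kc * et0_mm - rain_mm, 0.0)

def irrigation_liters(depth_mm: float, canopy_area_m2: float) -> float:
    return depth_mm * canopy_area_m2     # 1 mm over 1 m^2 equals 1 liter

# Hypothetical day: ET0 = 4.5 mm, mid-season durian Kc ~ 0.9, 1.2 mm of rain
depth = daily_irrigation_mm(4.5, 0.9, 1.2)
print(f"{irrigation_liters(depth, 25.0):.0f} L per tree (25 m^2 canopy)")
```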

Keywords: Durian, precision irrigation, precision agriculture, smart farm

Procedia PDF Downloads 90
367 Multi-Source Data Fusion for Urban Comprehensive Management

Authors: Bolin Hua

Abstract:

In city governance, various data are involved, including city component data, demographic data, housing data and all kinds of business data. These data reflect different aspects of people, events and activities. Data generated by the various systems differ in form, and the data sources differ because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data from different sources, collected in different ways, raise several issues which need to be resolved, including data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data and space data. Metadata needs to be referred to and read when an application accesses, manipulates and displays the data; uniform metadata management ensures the effectiveness and consistency of the data in the processes of data exchange, data modeling, data cleansing, data loading, data storing, data analysis, data search and data delivery.

Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data

Procedia PDF Downloads 357
366 Mining the Proteome of Fusobacterium nucleatum for Potential Therapeutics Discovery

Authors: Abdul Musaweer Habib, Habibul Hasan Mazumder, Saiful Islam, Sohel Sikder, Omar Faruk Sikder

Abstract:

The plethora of bacterial genome sequence information available in recent times has ushered in many novel strategies for antibacterial drug discovery and facilitated medical science in taking up the challenge of the increasing resistance of pathogenic bacteria to current antibiotics. In this study, we adopted a subtractive genomics approach to analyze the whole genome sequence of Fusobacterium nucleatum, a human oral pathogen associated with colorectal cancer. Our study identified 1499 proteins of F. nucleatum that have no homolog in the human genome. These proteins were screened further using the Database of Essential Genes (DEG), which resulted in the identification of 32 vitally important proteins for the bacterium. Subsequent analysis of the identified pivotal proteins using the KEGG Automatic Annotation Server (KAAS) resulted in sorting out 3 key enzymes of F. nucleatum that may be good candidates as potential drug targets, since they are unique to the bacterium and absent in humans. In addition, we demonstrate the 3-D structures of these three proteins. Finally, the ligand binding sites of the key proteins were determined, and functional inhibitors that best fitted the ligand sites were screened, in order to discover effective novel therapeutic compounds against F. nucleatum.
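
A minimal sketch of the subtractive step (assumed file layout: BLASTp tabular output, -outfmt 6, where column 11 is the e-value): pathogen proteins with a significant hit against the human proteome are subtracted, leaving the non-homologous candidates.

```python
def non_homologous(all_proteins, blast_tsv_path, evalue_cutoff=1e-3):
    """Return proteins with no significant BLASTp hit in the report."""
    human_homologs = set()
    with open(blast_tsv_path) as fh:
        for line in fh:
            cols = line.rstrip("\n").split("\t")
            if float(cols[10]) <= evalue_cutoff:    # e-value column
                human_homologs.add(cols[0])         # query protein ID
    return [p for p in all_proteins if p not in human_homologs]

# Hypothetical usage with F. nucleatum protein IDs and a BLASTp report:
# candidates = non_homologous(["FN0001", "FN0002"], "fn_vs_human.blastp.tsv")
```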

Keywords: colorectal cancer, drug target, Fusobacterium nucleatum, homology modeling, ligands

Procedia PDF Downloads 361
365 Numerical Simulation of Fracturing Behaviour of Pre-Cracked Crystalline Rock Using a Cohesive Grain-Based Distinct Element Model

Authors: Mahdi Saadat, Abbas Taheri

Abstract:

Understanding the cracking response of crystalline rocks at the mineralogical scale is of great importance during the design of mining structures. A grain-based distinct element model (GBM) is employed to numerically study the cracking response of Barre granite at micro- and macro-scales. The GBM framework is augmented with a proposed distinct element-based cohesive model to reproduce the micro-cracking response of the inter- and intra-grain contacts. The cohesive GBM framework is implemented in the PFC2D distinct element code. The microstructural properties of Barre granite are imported into PFC2D to generate synthetic specimens. The micro-properties of the model are calibrated against laboratory uniaxial compressive and Brazilian split tensile tests. The calibrated model is then used to simulate the fracturing behaviour of pre-cracked Barre granite with different flaw configurations. The numerical results of the proposed model demonstrate a good agreement with their experimental counterparts. The proposed GBM framework thus appears promising for further investigation of the influence of grain microstructure and mineralogical properties on the cracking behaviour of crystalline rocks.

Keywords: discrete element modelling, cohesive grain-based model, crystalline rock, fracturing behavior

Procedia PDF Downloads 107
364 The Human Process of Trust in Automated Decisions and Algorithmic Explainability as a Fundamental Right in the Exercise of Brazilian Citizenship

Authors: Paloma Mendes Saldanha

Abstract:

Access to information is a prerequisite for democracy and also guides the material construction of fundamental rights. The exercise of citizenship requires knowing, understanding, questioning, advocating for, and securing rights and responsibilities. In other words, it goes beyond mere active electoral participation and materializes through awareness of, and the struggle for, rights and responsibilities in the various spaces occupied by the population in their daily lives. In times of hyper-cultural connectivity, active citizenship is shaped through ethical trust processes, most often established between humans and algorithms. Automated decisions, so prevalent in everyday situations such as purchase preference prediction, virtual voice assistants, accident reduction in autonomous vehicles, content removal, and resume selection, have already found their place in a normalized discourse that sometimes does not reveal, or make clear, what violations of fundamental rights may occur when algorithmic explainability is lacking. In other words, technological and market development promotes the normalization of automated decisions while silencing possible restrictions and breaches of rights, through a culturally modeled, unethical, and unexplained trust process that hinders a healthy, transparent, and complete exercise of citizenship. In this context, the article aims to identify the violations caused by the absence of algorithmic explainability in the exercise of citizenship, through the construction of an unethical and silent trust process between humans and algorithms in automated decisions. As a result, it is expected to find violations of constitutionally protected rights such as privacy, data protection, and transparency, as well as grounds for establishing algorithmic explainability as a fundamental right in the exercise of Brazilian citizenship in the era of virtualization, based on a threefold foundation of trust: culture, rules, and systems. To do so, the author will use a bibliographic review in the legal and information technology fields, as well as the analysis of legal and official documents, including national documents such as the Brazilian Federal Constitution, and international guidelines and resolutions that address the topic in the specific manner necessary for appropriate regulation based on a sustainable trust process for a hyperconnected world.

Keywords: artificial intelligence, ethics, citizenship, trust

Procedia PDF Downloads 37
363 A Comparative Study on Supercritical CO2 and Water as Working Fluids in a Heterogeneous Geothermal Reservoir

Authors: Musa D. Aliyu, Ouahid Harireche, Colin D. Hills

Abstract:

The capability of supercritical CO2 to avoid transporting and dissolving mineral species from the geothermal reservoir into the fracture apertures, together with other important parameters in heat mining, makes it an attractive substance for heat extraction from hot dry rock. In other words, the thermodynamic efficiency of hot dry rock (HDR) reservoirs also increases if supercritical CO2 is circulated at temperatures in excess of 374 °C, without the drawbacks connected with silica dissolution. Studies have shown that the circulation of supercritical CO2 in homogeneous geothermal reservoirs is quite encouraging in comparison to that of water. This paper aims at investigating the aforementioned processes in the case of the heterogeneous geothermal reservoir located at the Soultz site (France). The multiphysics finite element package COMSOL, with an interface for coupling the different processes encountered in geothermal reservoir stimulation, is used. A fully coupled numerical model is developed to study the thermal and hydraulic processes, in order to predict the long-term behaviour of the basic reservoir parameters that give optimum energy production. The results reveal that the temperature of the SCCO2 at the production outlet is higher than that of water in long-term stimulation, the temperature being an essential ingredient in rating the energy production. It is also observed that the mass flow rate of the SCCO2 is far more favourable compared to that of water.

Keywords: FEM, HDR, heterogeneous reservoir, stimulation, supercritical CO2

Procedia PDF Downloads 360
362 A Framework for Event-Based Monitoring of Business Processes in the Supply Chain Management of Industry 4.0

Authors: Johannes Atug, Andreas Radke, Mitchell Tseng, Gunther Reinhart

Abstract:

In modern supply chains, large numbers of SKUs (stock-keeping units) need to be managed in a timely manner, and delays in noticing disruptions of items often limit the ability to avert the impact on customer order fulfillment. However, in the supply chains of IoT-connected enterprises, the ERP (enterprise resource planning), MES (manufacturing execution system) and SCADA (supervisory control and data acquisition) systems generate large amounts of data, which generally give much earlier notice of deviations in the business process steps. That is, analyzing these data streams with process mining techniques allows the monitoring of supply chain business processes and thus the identification of items that deviate from the standard order fulfillment process. In this paper, a framework to enable event-based SCM (supply chain management) processes, including an overview of the core enabling technologies, is presented, based on the RAMI 4.0 (Reference Architecture Model for Industrie 4.0). The application of this framework in industry is presented, and implications for SCM in Industry 4.0 and for further research are outlined.
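
A minimal sketch of the monitoring idea (hypothetical event log and standard process, not the framework's actual implementation): each order's observed event trace is checked against the standard fulfillment sequence, and deviating items are flagged early.

```python
STANDARD = ["ordered", "picked", "packed", "shipped", "delivered"]

def deviating_orders(event_log):
    """event_log: dict order_id -> list of observed activities, in order."""
    return [order_id for order_id, trace in event_log.items()
            if trace != STANDARD[:len(trace)]]       # must be a prefix

log = {
    "SKU-001": ["ordered", "picked", "packed"],             # on track
    "SKU-002": ["ordered", "packed", "picked", "shipped"],  # swapped steps
    "SKU-003": ["ordered", "returned"],                     # unexpected event
}
print(deviating_orders(log))   # -> ['SKU-002', 'SKU-003']
```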

Keywords: cyber-physical production systems, event-based monitoring, supply chain management, RAMI (Reference-Architecture-Model for Industrie 4.0)

Procedia PDF Downloads 208
361 Optimised Path Recommendation for a Real Time Process

Authors: Likewin Thomas, M. V. Manoj Kumar, B. Annappa

Abstract:

The traditional execution process follows the path of execution drawn by the process analyst, without observing the behaviour of resources and other real-time constraints. Identifying the process model, predicting the behaviour of resources and recommending the optimal path of execution for a real-time process is challenging. The proposed AlfyMiner (αyMiner) adds a new dimension to process execution with two novel techniques, a Process Model Analyser (PMAMiner) and a Resource Behaviour Analyser (RBAMiner), for recommending the probable path of execution. PMAMiner discovers the next probable activity for the currently executing activity in an online process, using a variant matching technique to identify the set of next probable activities, among which the next probable activity is discovered using a decision tree model. RBAMiner identifies the resource suitable for performing the discovered next probable activity and observes its behaviour based on load and performance, using a polynomial regression model, and on waiting time, using queueing theory. Based on the observed behaviour, αyMiner recommends the probable path of execution, with the next probable activity and the resource best suited to performing it. Experiments were conducted on process logs of the CoSeLoG project; an accuracy of 72% was obtained in identifying and recommending the next probable activity, and the efficiency of resource performance was improved by 59% by decreasing resource load.
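
A minimal sketch of the next-activity prediction step (a toy log; the full variant matching and resource analysis are omitted): observed (activity, successor) pairs train a decision tree that predicts the most probable next activity.

```python
from sklearn.tree import DecisionTreeClassifier

traces = [["register", "check", "approve", "notify"],
          ["register", "check", "reject", "notify"],
          ["register", "check", "approve", "notify"]]

acts = sorted({a for t in traces for a in t})
idx = {a: i for i, a in enumerate(acts)}
X = [[idx[t[i]]] for t in traces for i in range(len(t) - 1)]
y = [idx[t[i + 1]] for t in traces for i in range(len(t) - 1)]

clf = DecisionTreeClassifier().fit(X, y)
nxt = clf.predict([[idx["check"]]])[0]      # most frequent successor wins
print(acts[nxt])                            # -> 'approve'
```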

Keywords: cross-organization process mining, process behaviour, path of execution, polynomial regression model

Procedia PDF Downloads 312
360 Visual Text Analytics Technologies for Real-Time Big Data: Chronological Evolution and Issues

Authors: Siti Azrina B. A. Aziz, Siti Hafizah A. Hamid

Abstract:

New approaches to analyzing and visualizing data streams on a real-time basis are important for enabling decision makers to make prompt decisions. Financial market trading and surveillance, large-scale emergency response and crowd control are some example scenarios that require real-time analytics and data visualization. This situation has led to the development of techniques and tools that support humans in analyzing the source data. With the emergence of Big Data and social media, new techniques and tools are required in order to process the streaming data. Today, a range of tools implementing some of these functionalities is available. In this paper, we present a chronological evaluation of the evolution of technologies supporting real-time analytics and visualization of data streams. Based on research papers published from 2002 to 2014, we gathered the general information, main techniques, challenges and open issues. The techniques for streaming text visualization are identified, in chronological order, based on the Text Visualization Browser. This paper aims to review the evolution of streaming text visualization techniques and tools, as well as to discuss the problems and challenges of each of the identified tools.

Keywords: information visualization, visual analytics, text mining, visual text analytics tools, big data visualization

Procedia PDF Downloads 379
359 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms

Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao

Abstract:

Earth's environment and its evolution can be seen through satellite images in near real-time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and climate change monitoring. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format; the data are then pre-processed using data pre-processing techniques, fed into the proposed algorithm, and the obtained results are analyzed. Some of the algorithms used in satellite imagery classification are U-Net, Random Forest, DeepLabv3, CNN, ANN, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification, with the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates, in cascade or in parallel, to determine the scale of segments.
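
A minimal sketch of DeepLabv3 inference with torchvision (generic pretrained weights; a model fine-tuned on the DeepGlobe classes would be loaded the same way):

```python
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

model = deeplabv3_resnet50(weights="DEFAULT").eval()

image = torch.rand(1, 3, 512, 512)      # stand-in for a satellite tile
with torch.no_grad():
    logits = model(image)["out"]        # shape (1, num_classes, H, W)
mask = logits.argmax(dim=1)             # per-pixel land cover class index
print(mask.shape)                       # torch.Size([1, 512, 512])
```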

Keywords: area calculation, atrous convolution, deep globe land cover classification, deepLabv3, land cover classification, resnet 50

Procedia PDF Downloads 120
358 Applications of Artificial Intelligence (AI) in Cardiac Imaging

Authors: Angelis P. Barlampas

Abstract:

The purpose of this study is to inform the reader about the various applications of artificial intelligence (AI) in cardiac imaging. AI is growing fast, and its role is crucial in medical specialties that use large amounts of digital data, which are very difficult or even impossible for human beings, and especially doctors, to manage. Artificial intelligence refers to the ability of computers to mimic human cognitive function, performing tasks such as learning, problem-solving, and autonomous decision-making based on digital data. Whereas AI describes the concept of using computers to mimic human cognitive tasks, machine learning (ML) describes the category of algorithms that enable most current applications described as AI. Some current applications of AI in cardiac imaging are the following. Ultrasound: automated segmentation of the cardiac chambers across the five common views, and from it quantification of chamber volumes/mass, determination of the ejection fraction and of longitudinal strain through speckle tracking; grading of the severity of mitral regurgitation (accuracy > 99% for every degree of severity); identification of myocardial infarction; distinguishing athlete's heart from hypertrophic cardiomyopathy, and restrictive cardiomyopathy from constrictive pericarditis; prediction of all-cause mortality. CT: reduction of radiation doses; calculation of the calcium score; diagnosis of coronary artery disease (CAD); prediction of all-cause 5-year mortality; prediction of major cardiovascular events in patients with suspected CAD. MRI: segmentation of cardiac structures and infarct tissue; calculation of cardiac mass and function parameters; distinguishing patients with myocardial infarction from control subjects, which could potentially reduce costs by precluding the need for gadolinium-enhanced CMR; prediction of 4-year survival in patients with pulmonary hypertension. Nuclear imaging: classification of normal and abnormal myocardium in CAD; detection of locations with abnormal myocardium; prediction of cardiac death; ML proved comparable to or better than two experienced readers in predicting the need for revascularization. AI is emerging as a helpful tool in cardiac imaging and for doctors who cannot manage the ever-increasing demand for examinations such as ultrasound, computed tomography, MRI and nuclear imaging studies.

Keywords: artificial intelligence, cardiac imaging, ultrasound, MRI, CT, nuclear medicine

Procedia PDF Downloads 52
357 Pioneering Technology of Night Photo-Stimulation of the Brain Lymphatic System: Therapy of Brain Diseases during Sleep

Authors: Semyachkina-Glushkovskaya Oxana, Fedosov Ivan, Blokhina Inna, Terskov Andrey, Evsukova Arina, Elovenko Daria, Adushkina Viktoria, Dubrovsky Alexander, Jürgen Kurths

Abstract:

In modern neurobiology, sleep is considered a novel biomarker and a promising therapeutic target for brain diseases. This is due to recent discoveries of the nighttime activation of the brain lymphatic system (BLS), which plays an important role in the removal of wastes and toxins from the brain and contributes to the neuroprotection of the central nervous system (CNS). In our review, we discuss how night stimulation of the BLS might be a breakthrough strategy in the new treatment of Alzheimer's and Parkinson's disease, stroke, brain trauma, and oncology. Although this research is in its infancy, there are pioneering and promising results suggesting that nighttime transcranial photobiomodulation (tPBM) stimulates the lymphatic removal of amyloid-beta from the mouse brain more effectively than daytime tPBM, which is associated with a greater improvement of the neurological status and recognition memory of the animals. In our previous study, we discovered that tPBM modulates the tone and permeability of the lymphatic endothelium by stimulating NO formation, promoting the lymphatic clearance of wastes and toxins from brain tissues. We also demonstrate that tPBM can lead to angio- and lymphangiogenesis, which is another mechanism underlying tPBM-mediated stimulation of the BLS. Thus, photo-augmentation of the BLS might be a promising therapeutic approach for preventing or delaying brain diseases associated with BLS dysfunction. Here we present a pioneering technology for simultaneous tPBM in humans and sleep monitoring, to stimulate the BLS to remove toxins from the CNS and to modulate brain immunity. The wireless-controlled gadget includes a flexible organic light-emitting diode (LED) source that is controlled directly by a sleep-tracking device via a mobile application. The designed autonomous LED source is capable of providing the required therapeutic dose of light radiation to a certain region of the patient's head without disturbing the sleeping patient. To minimize patients' discomfort, advanced materials such as flexible organic LEDs were used. Acknowledgment: This study was supported by RSF project No. 23-75-30001.

Keywords: brain diseases, brain lymphatic system, phototherapy, sleep

Procedia PDF Downloads 54
356 Visual Template Detection and Compositional Automatic Regular Expression Generation for Business Invoice Extraction

Authors: Anthony Proschka, Deepak Mishra, Merlyn Ramanan, Zurab Baratashvili

Abstract:

Small and medium-sized businesses receive over 160 billion invoices every year. Since these documents exhibit many subtle differences in layout and text, automatically extracting structured fields such as sender name, amount, and VAT rate from them is an open research question. In this paper, existing work in template-based document extraction is extended, and a system is devised that reliably extracts all required fields for up to 70% of all documents in the data set, more than any previously reported method. Approaches are described for 1) detecting, through visual features, which template a given document belongs to, 2) automatically generating extraction rules for a given new template by composing regular expressions from multiple components, and 3) computing confidence scores that indicate the accuracy of the automatic extractions. The system can generate templates from as little as one training sample and only requires the ground-truth field values instead of detailed annotations, such as bounding boxes, that are hard to obtain. The system is deployed and used inside a commercial accounting software product.
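
A minimal sketch of the compositional idea (the component patterns below are illustrative, not the system's actual rule templates): a field's extraction rule is assembled from a label component, a separator component, and a value component.

```python
import re

LABELS = {"amount": r"(?:Total|Amount\s+Due)", "vat": r"VAT\s*rate"}
SEP = r"\s*[:\-]?\s*"
VALUES = {"amount": r"(?P<value>\d{1,3}(?:,\d{3})*\.\d{2})",
          "vat": r"(?P<value>\d{1,2}(?:\.\d+)?)\s*%"}

def build_rule(field: str) -> re.Pattern:
    """Compose a full extraction regex from reusable components."""
    return re.compile(LABELS[field] + SEP + VALUES[field], re.IGNORECASE)

doc = "Invoice 2024-117\nVAT rate: 19 %\nAmount Due: 1,234.50"
for field in ("amount", "vat"):
    match = build_rule(field).search(doc)
    print(field, "->", match.group("value") if match else None)
```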

Keywords: data mining, information retrieval, business, feature extraction, layout, business data processing, document handling, end-user trained information extraction, document archiving, scanned business documents, automated document processing, F1-measure, commercial accounting software

Procedia PDF Downloads 103