Search results for: data mining technique
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30147

29637 Effect of Particle Size and Volume Fraction Concentration on the Thermal Conductivity and Thermal Diffusivity of Al2O3 Nanofluids Measured Using Transient Hot-Wire Laser Beam Deflection Technique

Authors: W. Mahmood Mat Yunus, Faris Mohammed Ali, Zainal Abidin Talib

Abstract:

In this study, we present new data for the thermal conductivity enhancement in four nanofluids containing 11, 25, 50, and 63 nm diameter aluminum oxide (Al2O3) nanoparticles in distilled water. The nanofluids were prepared using the single-step method (i.e., by dispersing the nanoparticles directly in the base fluid), with the suspensions sonicated in an ultrasonic device for approximately 7 hours. The transient hot-wire laser beam deflection technique was used to measure the thermal conductivity and thermal diffusivity of the prepared nanofluids. The thermal conductivity and thermal diffusivity were obtained by fitting the experimental data to numerical data simulated for aluminum oxide in distilled water. The results show that the thermal conductivity and thermal diffusivity of the nanofluids increase non-linearly as the particle size increases, while both quantities increase linearly with the volume fraction concentration. We believe that the interfacial layer at the solid/fluid boundary is the main factor behind the enhancement of thermal conductivity and thermal diffusivity of the Al2O3 nanofluids in the present work.
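As a sketch of the fitting step: the ideal line-source model for a transient hot wire predicts a temperature rise proportional to ln(t), so the thermal conductivity can be recovered from the slope of the temperature rise versus ln(t). The heat input and the "true" conductivity below are invented for illustration and are not values from the paper:

```python
import numpy as np

# Hypothetical parameters (not from the paper): heat input q per unit
# wire length and a "true" conductivity used to synthesize data.
q = 1.0          # W/m, heat dissipated per unit length of the hot wire
k_true = 0.6     # W/(m K), thermal conductivity to recover

# Ideal line-source model: dT(t) = q/(4*pi*k) * ln(t) + const,
# so the slope of dT versus ln(t) yields k.
t = np.linspace(1.0, 10.0, 50)                     # s, measurement times
dT = q / (4 * np.pi * k_true) * np.log(t) + 0.25   # K, synthetic rise

slope, intercept = np.polyfit(np.log(t), dT, 1)
k_fit = q / (4 * np.pi * slope)
print(round(k_fit, 3))  # recovers the assumed k_true, 0.6
```

In the real measurement the intercept additionally encodes the thermal diffusivity, which is why the authors fit against simulated numerical data rather than the ideal model alone.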

Keywords: transient hot wire-laser beam technique, Al2O3 nanofluid, particle size, volume fraction concentration

Procedia PDF Downloads 553
29636 Hydrogeological Appraisal of Karacahisar Coal Field (Western Turkey): Impacts of Mining on Groundwater Resources Utilized for Water Supply

Authors: Sukran Acikel, Mehmet Ekmekci, Otgonbayar Namkhai

Abstract:

Lignite coal fields in western Turkey generally occur in tensional Neogene basins bordered by major faults. The Karacahisar coal field in the Mugla province of western Turkey is a large Neogene basin filled with an alternation of siliceous and calcareous layers. The basement of the basin is composed mainly of karstified carbonate rocks of Mesozoic age and schists of Paleozoic age. The basement rocks are exposed at the highlands surrounding the basin. The basin-fill deposits form shallow, low-yield, local aquifers, whereas the karstic carbonate rock masses form the major aquifer in the region. The karstic aquifer discharges through a spring zone issuing at the intersection of two major faults. The municipal water demand of Bodrum city, a tourist attraction, is almost entirely supplied by boreholes tapping the karstic aquifer. A well field has been constructed on the eastern edge of the coal basin, which forms a ridge separating two Neogene basins. A major concern was raised about the possible impact of mining activities on the groundwater system in general and on the water supply well field in particular. The hydrogeological studies carried out in the area revealed that the coal seam is located below the groundwater level. Mining operations will therefore be affected by groundwater inflow to the pits, which will require dewatering measures. Dewatering activities at mine sites have two-sided effects: a) they lower the groundwater level at and around the pit for a safe and effective mining operation, and b) continuous dewatering expands the cone of depression until it reaches a spring, stream, and/or well utilized by local people, capturing their water. The possible effect of mining operations on the flow of the spring zone was another issue of concern. Therefore, a detailed representative hydrogeological conceptual model of the site was developed on the basis of available data and field work.
According to the hydrogeological conceptual model, dewatering of the Neogene layers will not hydraulically affect the water supply wells; however, the ultimate perimeter of the open pit will expand to intersect the well field. According to the conceptual model, the coal seam is separated from the bottom by a thick impervious clay layer sitting on the carbonate basement. Therefore, the hydrostratigraphy does not allow a hydraulic interaction between the mine pit and the karstic carbonate rock aquifer. However, the structural setting of the basin suggests that deep faults intersecting the basement and the Neogene sequence will most probably carry the deep groundwater up to a level above the bottom of the pit. This will require measures to lower the piezometric level of the carbonate rock aquifer along the faults. Dewatering the carbonate rock aquifer will reduce the flow to the spring zone. All findings were put together to recommend a strategy for a safe and effective mining operation.

Keywords: conceptual model, dewatering, groundwater, mining operation

Procedia PDF Downloads 401
29635 Analysis and Design Modeling for Next Generation Network Intrusion Detection and Prevention System

Authors: Nareshkumar Harale, B. B. Meshram

Abstract:

The continued exponential growth of successful cyber intrusions against today’s businesses has made it abundantly clear that traditional perimeter security measures are no longer adequate or effective. The network trust architecture has evolved from trust/untrust to Zero-Trust, in which essential security capabilities are deployed in a way that provides policy enforcement and protection for all users, devices, applications, data resources, and the communications traffic between them, regardless of their location. Information exchange over the Internet, in spite of the inclusion of advanced security controls, remains prone to innovative and inventive cyberattacks. The TCP/IP protocol stack, the adopted standard for communication over networks, suffers from inherent design vulnerabilities: its communication and session management protocols, routing protocols, and security protocols are the cause of many major attacks. With the explosion of cyber security threats such as viruses, worms, rootkits, malware, and Denial of Service attacks, accomplishing efficient and effective intrusion detection and prevention has become crucial and challenging. In this paper, we propose a design and analysis model for a next generation network intrusion detection and protection system as part of a layered security strategy. The proposed system design provides intrusion detection for a wide range of attacks with a layered architecture and framework. The proposed network intrusion classification framework deals with cyberattacks on the standard TCP/IP protocol, routing protocols, and security protocols. It thereby forms the basis for detection of attack classes and applies signature-based matching for known cyberattacks and data mining based machine learning approaches for unknown cyberattacks. Our implemented software can effectively detect attacks even when malicious connections are hidden within normal events.
The unsupervised learning algorithm applied to network audit data trails results in the detection of unknown intrusions. Association rule mining algorithms generate new rules from the collected audit trail data, resulting in increased intrusion prevention through integrated firewall systems. Intrusion response mechanisms can be initiated in real time, thereby minimizing the impact of network intrusions. Finally, we show that our approach can be validated and how the analysis results can be used for detecting and protecting against new network anomalies.
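As an illustration of the association rule mining step, here is a minimal sketch of frequent-pair counting over audit-trail records; the event names and support threshold are invented for illustration and are not from the paper:

```python
from itertools import combinations
from collections import Counter

# Each audit-trail record is modeled as a set of features observed in
# one network connection (feature names are hypothetical).
trails = [
    {"tcp", "port_80", "long_session"},
    {"tcp", "port_80", "short_session"},
    {"udp", "port_53", "short_session"},
    {"tcp", "port_80", "long_session"},
]

min_support = 0.5  # a pair must appear in at least half of the records

# Count co-occurring feature pairs across all audit records.
pair_counts = Counter(
    pair for t in trails for pair in combinations(sorted(t), 2)
)
n = len(trails)
frequent = {p: c / n for p, c in pair_counts.items() if c / n >= min_support}
print(frequent)  # ('port_80', 'tcp') appears in 3 of 4 records
```

From such frequent pairs, rules like port_80 → tcp would then be derived by dividing the pair count by the antecedent count to obtain a confidence value.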

Keywords: network intrusion detection, network intrusion prevention, association rule mining, system analysis and design

Procedia PDF Downloads 228
29634 An Automatic Bayesian Classification System for File Format Selection

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the classification of unstructured format descriptions for the identification of file formats. The main contribution of this work is the employment of data mining techniques to support file format selection using just the unstructured text description that comprises the most important format features for a particular organisation. Subsequently, the file format identification method employs a file format classifier and associated configurations to support digital preservation experts with an estimation of the required file format. Our goal is to make use of a format specification knowledge base aggregated from different Web sources in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends a file format to an expert for his or her institution. The proposed methods facilitate the selection of file formats and improve the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and specifications of file formats. To facilitate decision-making, the aggregated information about the file formats is presented as a file format vocabulary that comprises the most common terms characteristic of all researched formats. The goal is to suggest a particular file format based on this vocabulary for analysis by an expert. A sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
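A minimal sketch of the naive Bayes recommendation step, using invented format descriptions and labels rather than the paper's actual knowledge base:

```python
import math
from collections import Counter, defaultdict

# Toy training data (descriptions and labels are invented): unstructured
# format descriptions paired with a recommended format.
train = [
    ("lossless raster image with transparency", "PNG"),
    ("compressed photographic raster image", "JPEG"),
    ("page layout document for archiving", "PDF"),
    ("archival page description with embedded fonts", "PDF"),
]

# Count word frequencies per class and class priors.
word_counts = defaultdict(Counter)
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for c in word_counts.values() for w in c}

def predict(text):
    """Return the class with the highest naive Bayes log-posterior,
    using Laplace (add-one) smoothing for unseen words."""
    best, best_lp = None, float("-inf")
    for label in class_counts:
        lp = math.log(class_counts[label] / len(train))
        total = sum(word_counts[label].values())
        for w in text.split():
            lp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

print(predict("raster image with transparency"))  # PNG
```

On a real vocabulary aggregated from Web sources, the same posterior computation yields the per-format probabilities presented to the expert.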

Keywords: data mining, digital libraries, digital preservation, file format

Procedia PDF Downloads 499
29633 Application of Computer Aided Engineering Tools in Performance Prediction and Fault Detection of Mechanical Equipment of Mining Process Line

Authors: K. Jahani, J. Razavi

Abstract:

Nowadays, predictive maintenance is crucial for decreasing the number of downtimes in industries such as metal mining and the petroleum and chemical industries. Efficient predictive maintenance requires knowing the performance of critical production line equipment, such as pumps and hydro-cyclones, under variable operating parameters; selecting the best indicators of this equipment's health; choosing the best locations for instrumentation; and measuring these indicators. In this paper, computer aided engineering (CAE) tools are implemented to study some important elements of a copper process line, namely slurry pumps and a hydro-cyclone, to predict the performance of these components under different working conditions. These models and simulations can be used to predict, for example, the damage tolerance of the main shaft of the slurry pump or the wear rate and location on the cyclone wall, pump case, or impeller. The simulations can also suggest the best measuring parameters, measuring intervals, and their locations.

Keywords: computer aided engineering, predictive maintenance, fault detection, mining process line, slurry pump, hydrocyclone

Procedia PDF Downloads 405
29632 Impact of Audit Committee on Earning Quality of Listed Consumer Goods Companies in Nigeria

Authors: Usman Yakubu, Muktar Haruna

Abstract:

This paper examines the impact of the audit committee on the earning quality of the listed consumer goods sector in Nigeria. The study used data collected from the annual reports and accounts of the 13 sampled companies for the period 2007 to 2018. Data were analyzed by means of descriptive statistics to provide summary statistics for the variables; correlation analysis between the dependent and independent variables was carried out using the Pearson correlation technique. Regression was performed using the Generalized Least Squares technique, since the data have both time series and cross-sectional attributes (panel data). It was found that the audit committee had a positive and significant influence on earning quality in the listed consumer goods companies in Nigeria. Thus, the study recommends that competency and personal integrity should be the attributes considered when constituting the committee, as this could enhance the quality of accounting information. In addition, the majority of committee members should be independent directors in order to allow a high level of independence to be exercised.
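The Pearson correlation used in the analysis can be sketched directly; the firm-year figures below are invented for illustration and are not the study's data:

```python
import math

def pearson(x, y):
    """Pearson product-moment correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented illustrative data: an audit-committee attribute score
# against an earnings-quality proxy for a few firm-years.
committee_score = [3, 4, 5, 6, 7]
earning_quality = [2.1, 2.9, 3.8, 4.2, 5.1]
print(round(pearson(committee_score, earning_quality), 3))  # near +1
```

A value near +1, as here, is the kind of positive association the study reports, although significance testing and the GLS panel regression require additional machinery.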

Keywords: earning quality, corporate governance, audit committee, financial reporting

Procedia PDF Downloads 176
29631 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique; its aim is to extract the data structure, grouping similar data objects in the same cluster and dissimilar objects in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function whose values lie in the interval [0, 1]. In FCM clustering, the membership degree is constrained by the condition that the sum of a data object’s memberships in all clusters must equal one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization has been applied to the fuzzy c-means technique; this process introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where our optimization problem aims to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that our proposed model achieves good accuracy.
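A minimal sketch of one standard (unregularized) fuzzy c-means iteration, showing the membership and center updates the paper builds on; the data and parameters are synthetic, and the relative entropy term is not included:

```python
import numpy as np

# Two well-separated synthetic Gaussian clusters in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(3, 0.1, (20, 2))])
m = 2.0                                   # fuzzifier
centers = np.array([[0.5, 0.5], [2.5, 2.5]])

for _ in range(20):
    # Distances from every point to every center (n_points x n_clusters).
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)).
    u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
    # Center update: weighted mean with weights u^m.
    w = u ** m
    centers = (w.T @ X) / w.sum(axis=0)[:, None]

print(np.round(centers))  # approximately [[0, 0], [3, 3]]
```

Each row of u sums to one, which is exactly the sum-to-one constraint the abstract discusses; a relative entropy regularizer modifies the membership update while keeping this overall alternating scheme.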

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 259
29630 A Digital Filter for Symmetrical Components Identification

Authors: Khaled M. El-Naggar

Abstract:

This paper presents a fast and efficient technique for monitoring and supervising power system disturbances generated by the dynamic performance of power systems or by faults. Monitoring power system quantities involves monitoring the fundamental voltage and current magnitudes and their frequencies, as well as their negative- and zero-sequence components, under different operating conditions. The proposed technique is based on the simulated annealing (SA) optimization technique. The method uses a digital set of measurements of the voltage or current waveforms at a power system bus to perform the estimation process digitally. The algorithm is tested using different simulated data to monitor the symmetrical components of power system waveforms. Different study cases are considered in this work, and the effects of the number of samples, the sampling frequency, and the sample window size are studied. Results are reported and discussed.
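The symmetrical components being estimated are defined by the Fortescue transform, which can be sketched directly from three phase phasors; the phasor values below are illustrative:

```python
import cmath

# Symmetrical-component (Fortescue) transform: phasors Va, Vb, Vc
# -> zero-, positive-, and negative-sequence components.
a = cmath.exp(2j * cmath.pi / 3)   # 120-degree rotation operator

def symmetrical_components(va, vb, vc):
    v0 = (va + vb + vc) / 3                # zero sequence
    v1 = (va + a * vb + a * a * vc) / 3    # positive sequence
    v2 = (va + a * a * vb + a * vc) / 3    # negative sequence
    return v0, v1, v2

# A perfectly balanced set has only a positive-sequence component.
va = 1 + 0j
vb = a * a * va   # Vb lags Va by 120 degrees
vc = a * va       # Vc lags Va by 240 degrees
v0, v1, v2 = symmetrical_components(va, vb, vc)
print(abs(v0), abs(v1), abs(v2))  # ~0, 1, ~0
```

The SA-based estimator in the paper works upstream of this transform, extracting the phasors from sampled waveforms before the sequence components are formed.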

Keywords: estimation, faults, measurement, symmetrical components

Procedia PDF Downloads 466
29629 The Crisis of Turkey's Downing the Russian Warplane within the Concept of Country Branding: The Examples of BBC World, and Al Jazeera English

Authors: Derya Gül Ünlü, Oguz Kuş

Abstract:

Branding a country means that the country has its own position, distinct from other countries in its region, and is thus perceived more specifically. This is made possible by the branding efforts of a country and by the uniqueness of its national structures: presenting the country in a specific way, creating the desired image, and attracting tourists and foreign investors. Establishing a national brand involves, in a sense, managing the perceptions that citizens of other countries hold about the target country by structuring the country's image permanently and holistically. By this means, countries are not easily affected by international relations crises. Therefore, the research carried out here aims to show how the warplane downing crisis between Turkey and Russia was perceived on social media. A Russian warplane was downed by Turkey on November 24, 2015, on the grounds that it violated Turkish airspace on the Syrian border. Relations between the two countries subsequently became tense, and Russia called on its citizens not to go to Turkey and on its citizens in Turkey to return home. Relations weakened further: for example, tourism tours from Russia to Turkey and visa-free travel were canceled, and all military dialogue was cut off. After the event, various news sites on social media published a great deal of news related to the topic, and readers made various comments about the event and about Turkey. In this context, an investigation of the perception of Turkey's national brand before and after the warplane downing crisis was conducted through comments fetched from the reports of the BBC World and Al Jazeera English news sites on their Facebook accounts, which are widely followed on social media.
To carry out the study, user comments were fetched from the jet-downing-related news published on the Facebook fan pages of BBC World Service and Al Jazeera English. All the news published between 24.10.2015 and 24.12.2015 containing the keywords 'Turk' and 'Turkey' in their titles composed the data set of our study. The comments written on these news items were then analyzed via a text mining technique. Furthermore, sentiment analysis was used to reveal readers' emotions before and after the crisis.
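A minimal sketch of lexicon-based sentiment scoring of the kind applied to such comments; the word lists and example comments are invented, and the study's actual text mining pipeline may differ:

```python
# Tiny hand-built sentiment lexicons (invented for illustration).
positive = {"beautiful", "friendly", "love", "great", "support"}
negative = {"dangerous", "boycott", "avoid", "angry", "crisis"}

def sentiment(comment):
    """Score a comment as positive lexicon hits minus negative hits."""
    words = comment.lower().split()
    return sum(w in positive for w in words) - sum(w in negative for w in words)

comments = [
    "Turkey is a beautiful and friendly country",
    "Avoid travel now, the crisis is dangerous",
]
scores = [sentiment(c) for c in comments]
print(scores)  # [2, -3]
```

Averaging such scores over comments posted before and after 24.11.2015 would expose the shift in reader emotion that the sentiment analysis targets.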

Keywords: Al Jazeera English, BBC World, country branding, social media, text mining

Procedia PDF Downloads 225
29628 Identity Verification Using k-NN Classifiers and Autistic Genetic Data

Authors: Fuad M. Alkoot

Abstract:

DNA data have been used in forensics for decades. However, current research looks at using DNA as a biometric identity verification modality, with the goal of improving the speed of identification. We aim at using gene data that was initially collected for autism detection to find whether, and how accurately, these data can serve identification applications. Mainly, our goal is to find whether our data preprocessing technique yields data useful as a biometric identification tool. We experiment with using the nearest neighbor classifier to identify subjects. Results show that the optimal classification rate is achieved when the test set is corrupted by normally distributed noise with zero mean and a standard deviation of 1. The classification rate remains close to optimal as the noise standard deviation increases to 3. This shows that the data can be used for identity verification with high accuracy using a simple classifier such as the k-nearest neighbor (k-NN).
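The noise experiment can be sketched as follows, with synthetic stand-ins for the preprocessed genetic feature vectors (the gallery vectors, noise levels, and trial counts are all illustrative):

```python
import math
import random

random.seed(42)

# Two enrolled subjects, each represented by a reference feature vector
# (synthetic stand-ins for the preprocessed genetic data).
gallery = {"subject_A": [0.0, 0.0, 0.0], "subject_B": [5.0, 5.0, 5.0]}

def nearest_neighbor(probe):
    """Return the enrolled identity whose vector is closest to the probe (1-NN)."""
    return min(gallery, key=lambda s: math.dist(gallery[s], probe))

def accuracy(sigma, trials=200):
    """Identification rate when probes are references plus Gaussian noise."""
    correct = 0
    for _ in range(trials):
        for identity, vec in gallery.items():
            noisy = [v + random.gauss(0.0, sigma) for v in vec]
            correct += nearest_neighbor(noisy) == identity
    return correct / (trials * len(gallery))

print(accuracy(sigma=1.0))   # high accuracy at small sigma
print(accuracy(sigma=10.0))  # accuracy degrades at large sigma
```

The qualitative pattern mirrors the abstract's finding: identification stays near-perfect while the noise is small relative to the separation between subjects, then degrades as the noise grows.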

Keywords: biometrics, genetic data, identity verification, k nearest neighbor

Procedia PDF Downloads 258
29627 Imp_hist-Si: Improved Hybrid Image Segmentation Technique for Satellite Imagery to Decrease the Segmentation Error Rate

Authors: Neetu Manocha

Abstract:

Image segmentation is a technique in which an image is partitioned into distinct regions with similar features that belong to the same objects. Various segmentation strategies have been proposed recently by prominent researchers, but thorough analysis shows that, in general, the older methods do not decrease the segmentation error rate. The authors therefore developed the HIST-SI technique, in which cluster-based and threshold-based segmentation techniques are merged. Then, to improve the results of HIST-SI, the authors added filtering and linking steps, yielding a technique named Imp_HIST-SI that further decreases the segmentation error rate. The goal of this research is to find a new technique that decreases segmentation error rates and produces much better results than the HIST-SI technique. For testing the proposed technique, a dataset from Bhuvan, a national geoportal developed and hosted by ISRO (Indian Space Research Organisation), is used. Experiments are conducted using the Scikit-image and OpenCV tools in Python, and performance is evaluated and compared with various existing image segmentation techniques on several metrics, i.e., Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR).
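The two evaluation metrics can be sketched directly; the toy flattened masks below are illustrative, standing in for full satellite segmentations:

```python
import math

def mse(a, b):
    """Pixelwise mean squared error between two equal-sized images."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255.0):
    """PSNR in dB for 8-bit images; higher means closer to ground truth."""
    e = mse(a, b)
    return float("inf") if e == 0 else 10 * math.log10(peak ** 2 / e)

reference = [0, 0, 255, 255]       # toy ground-truth mask (flattened)
segmented = [0, 0, 255, 250]       # one pixel differs by 5
print(mse(reference, segmented))   # 6.25
print(round(psnr(reference, segmented), 2))  # 40.17 dB
```

A segmentation technique that lowers the error rate shows up here as a smaller MSE and a correspondingly larger PSNR against the reference mask.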

Keywords: satellite image, image segmentation, edge detection, error rate, MSE, PSNR, HIST-SI, linking, filtering, imp_HIST-SI

Procedia PDF Downloads 141
29626 Assessing Online Learning Paths in a Learning Management System Using a Data Mining and Machine Learning Approach

Authors: Alvaro Figueira, Bruno Cabral

Abstract:

Nowadays, students are accustomed to being assessed through an online platform. Educators have moved on from a period in which they endured the transition from paper to digital. The use of a diversified set of question types, ranging from quizzes to open questions, is currently common in most university courses. In many courses today, the evaluation methodology also fosters the students’ online participation in forums, the download and upload of modified files, and even participation in group activities. At the same time, new pedagogical theories that promote the active participation of students in the learning process, and the systematic use of problem-based learning, are being adopted using an eLearning system for that purpose. However, although these activities can generate a lot of feedback for students, it is usually restricted to the assessment of well-defined online tasks. In this article, we propose an automatic system that informs students of abnormal deviations from a 'correct' learning path in the course. Our approach is based on the premise that obtaining this information early in the semester may provide students and educators an opportunity to resolve an eventual problem regarding the student’s current online actions towards the course. Our goal is to prevent situations that have a significant probability of leading to a poor grade and, eventually, to failing. In the major learning management systems (LMS) currently available, the interaction between the students and the system itself is registered in log files, in the form of records that mark the beginning of actions performed by the user. Our proposed system uses that logged information to derive new information: the time each student spends on each activity, the time and order of the resources used by the student and, finally, the online resource usage pattern.
Then, using the grades assigned to the students in previous years, we build a learning dataset that is used to feed a machine learning meta-classifier. The produced classification model is then used to predict the grades a learning path is heading to in the current year. This approach serves not only the teacher but also the student, who receives automatic feedback on his or her current situation with past years as a reference. Our system can be applied to online courses that use an online platform that stores user actions in a log file and that has access to other students’ evaluations. The system is based on a data mining process over the log files and on a self-feedback machine learning algorithm that works paired with the Moodle LMS.
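The derivation of time-per-activity features from start-of-action log records can be sketched as follows; the log format and field names are hypothetical and differ from Moodle's actual log schema:

```python
from datetime import datetime

# Hypothetical log records: each entry marks when a student started an
# action on a resource (user, resource, start timestamp).
log = [
    ("alice", "quiz_1", "2023-03-01 10:00:00"),
    ("alice", "forum",  "2023-03-01 10:20:00"),
    ("alice", "quiz_2", "2023-03-01 10:25:00"),
]

def time_per_activity(records):
    """Derive minutes spent on each activity from consecutive start times.

    Since the log only marks the *beginning* of each action, time spent
    on one activity is inferred as the gap until the next action starts.
    """
    out = {}
    times = [datetime.strptime(t, "%Y-%m-%d %H:%M:%S") for _, _, t in records]
    for (user, activity, _), start, end in zip(records, times, times[1:]):
        out[(user, activity)] = (end - start).total_seconds() / 60
    return out

features = time_per_activity(log)
print(features)  # {('alice', 'quiz_1'): 20.0, ('alice', 'forum'): 5.0}
```

Feature vectors built this way, labeled with past years' grades, are what the meta-classifier would consume to flag a learning path heading toward a poor grade.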

Keywords: data mining, e-learning, grade prediction, machine learning, student learning path

Procedia PDF Downloads 123
29625 Impact of Collieries on Groundwater in Damodar River Basin

Authors: Rajkumar Ghosh

Abstract:

The industrialization of coal mining and related activities has a significant impact on groundwater in the surrounding areas of the Damodar River. The Damodar River basin, located in eastern India, is known as the "Ruhr of India" due to its abundant coal reserves and extensive coal mining and industrial operations. One of the major consequences of collieries on groundwater is the contamination of water sources. Coal mining activities often involve the excavation and extraction of coal through underground or open-pit mining methods. These processes can release various pollutants and chemicals into the groundwater, including heavy metals, acid mine drainage, and other toxic substances. As a result, the quality of groundwater in the Damodar River region has deteriorated, making it unsuitable for drinking, irrigation, and other purposes. The high concentration of heavy metals, such as arsenic, lead, and mercury, in the groundwater has posed severe health risks to the local population. Prolonged exposure to contaminated water can lead to various health problems, including skin diseases, respiratory issues, and even long-term ailments like cancer. The contamination has also affected the aquatic ecosystem, harming fish populations and other organisms dependent on the river's water. Moreover, the excessive extraction of groundwater for industrial processes, including coal washing and cooling systems, has resulted in a decline in the water table and depletion of aquifers. This has led to water scarcity and reduced availability of water for agricultural activities, impacting the livelihoods of farmers in the region. Efforts have been made to mitigate these issues through the implementation of regulations and improved industrial practices. However, the historical legacy of coal industrialization continues to impact the groundwater in the Damodar River area. 
Remediation measures, such as the installation of water treatment plants and the promotion of sustainable mining practices, are essential to restore the quality of groundwater and ensure the well-being of the affected communities. In conclusion, coal industrialization in the Damodar River surroundings has had a detrimental impact on groundwater. This research focuses on soil subsidence induced by the over-exploitation of groundwater for dewatering open-pit coal mines. Soil degradation happens in arid and semi-arid regions as a result of land subsidence in coal mining regions, which reduces soil fertility. Depletion of aquifers, contamination, and water scarcity are some of the key challenges resulting from these activities. It is crucial to prioritize sustainable mining practices, environmental conservation, and the provision of clean drinking water to mitigate the long-lasting effects of collieries on the groundwater resources in the region.

Keywords: coal mining, groundwater, soil subsidence, water table, Damodar River

Procedia PDF Downloads 82
29624 A Comparative Study of GTC and PSP Algorithms for Mining Sequential Patterns Embedded in Database with Time Constraints

Authors: Safa Adi

Abstract:

This paper considers the problem of mining sequential patterns embedded in a database while handling the time constraints defined in the GSP algorithm (a level-wise algorithm). We compare two previous approaches, GTC and PSP, which resume the general principles of GSP. Furthermore, this paper discusses the PG-hybrid algorithm, which combines PSP and GTC. The results show that PSP and GTC are more efficient than GSP; on the other hand, the GTC algorithm performs better than PSP. The PG-hybrid algorithm uses the PSP algorithm for the first two passes over the database and the GTC approach for the following scans. Experiments show that the hybrid approach is very efficient for short, frequent sequences.
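Support counting under a GSP-style maxGap time constraint can be sketched as follows for a two-item pattern; the data sequences are invented, and real GSP/PSP/GTC implementations use candidate generation and tree structures rather than this brute-force scan:

```python
# Each client sequence is a list of (timestamp, item) pairs.
sequences = [
    [(1, "a"), (3, "b"), (9, "c")],
    [(1, "a"), (8, "b")],
    [(2, "a"), (4, "b")],
]

def supports(pattern, seqs, max_gap):
    """Count sequences containing x followed by y with time gap <= max_gap."""
    x, y = pattern
    count = 0
    for seq in seqs:
        ok = any(
            ix < iy and item_x == x and item_y == y and ty - tx <= max_gap
            for ix, (tx, item_x) in enumerate(seq)
            for iy, (ty, item_y) in enumerate(seq)
        )
        count += ok
    return count

print(supports(("a", "b"), sequences, max_gap=3))  # 2: the gap of 7 is rejected
```

This illustrates why time constraints change the mining problem: the second sequence contains "a then b" but violates maxGap, so the pattern's support drops from 3 to 2.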

Keywords: database, GTC algorithm, PSP algorithm, sequential patterns, time constraints

Procedia PDF Downloads 390
29623 Research on ARQ Transmission Technique in Mars Detection Telecommunications System

Authors: Zhongfei Cai, Hui He, Changsheng Li

Abstract:

This paper studies the automatic repeat request (ARQ) transmission technique in a Mars detection telecommunications system. An ARQ method applied to the Proximity-1 space link protocol is proposed. In order to ensure efficient and reliable data transmission, this ARQ method combines the characteristics of different ARQ schemes. Considering the Mars detection communication environment, this paper analyzes the saturation throughput rate, packet dropping probability, average delay, and energy efficiency of different ARQ algorithms. Combining these results with the theory of ARQ transmission, an ARQ transmission scheme for a Mars detection telecommunications system is established. The simulation results show that this algorithm achieves excellent saturation throughput rate and energy efficiency with low complexity.
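One building block of such an analysis, the mean number of transmissions per frame under a simple stop-and-wait ARQ with frame error probability p, can be sketched by Monte Carlo; the parameters are illustrative, and the paper's hybrid scheme is more elaborate:

```python
import random

random.seed(7)

def mean_transmissions(p, frames=20000):
    """Average sends per frame when each send is lost with probability p.

    Theory: the number of sends is geometric, so the mean is 1 / (1 - p),
    which directly bounds the saturation throughput of the link.
    """
    total = 0
    for _ in range(frames):
        sends = 1
        while random.random() < p:   # frame lost, retransmit
            sends += 1
        total += sends
    return total / frames

p = 0.2
print(round(mean_transmissions(p), 2))  # close to 1 / (1 - p) = 1.25
```

On a long-delay Mars link, each of those retransmissions also incurs a round-trip wait, which is why delay and energy efficiency are analyzed alongside throughput.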

Keywords: ARQ, Mars, CCSDS, Proximity-1, deep space

Procedia PDF Downloads 340
29622 A Bayesian Classification System for Facilitating an Institutional Risk Profile Definition

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the easy creation and classification of institutional risk profiles, supporting endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to support the setup of the most important risk factors. Subsequently, risk profiles employ a risk factor classifier and associated configurations to support digital preservation experts with a semi-automatic estimation of the endangerment group for file format risk profiles. Our goal is to make use of an expert knowledge base, acquired through a digital preservation survey, in order to detect preservation risks for a particular institution. Another contribution is support for the visualisation of risk factors along a required dimension of analysis. Using the naive Bayes method, the decision support system recommends to an expert the matching risk profile group for the previously selected institutional risk profile. The proposed methods improve the visibility of risk factor values and the quality of the digital preservation process. The presented approach is designed to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and the values of file format risk profiles. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector. The goal is to visualise particular dimensions of this vector for analysis by an expert and to define its profile group. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.

Keywords: linked open data, information integration, digital libraries, data mining

Procedia PDF Downloads 428
29621 Survival Data with Incomplete Missing Categorical Covariates

Authors: Madaki Umar Yusuf, Mohd Rizam B. Abubakar

Abstract:

Survival censored data with incomplete covariate data are a common occurrence in many studies in which the outcome is survival time. When the missing covariates are categorical, a useful technique for obtaining parameter estimates is the EM algorithm by the method of weights. The survival outcome for the class of generalized linear models is applied, and this method requires the estimation of the parameters of the distribution of the covariates. In this paper, we consider clinical trial data with five covariates, four of which have some missing values, which clearly shows that the data were censored.

Keywords: EM algorithm, incomplete categorical covariates, ignorable missing data, missing at random (MAR), Weibull Distribution

Procedia PDF Downloads 406
29620 Field Trial of Resin-Based Composite Materials for the Treatment of Surface Collapses Associated with Former Shallow Coal Mining

Authors: Philip T. Broughton, Mark P. Bettney, Isla L. Smail

Abstract:

Effective treatment of ground instability is essential when managing the impacts associated with historic mining. A field trial was undertaken by the Coal Authority to investigate the geotechnical performance and potential use of composite materials comprising resin and fill or stone to safely treat surface collapses, such as crown-holes, associated with shallow mining. Test pits were loosely filled with various granular fill materials. The fill material was injected with commercially available silicate and polyurethane resin foam products. In situ and laboratory testing was undertaken to assess the geotechnical properties of the resultant composite materials. The test pits were subsequently excavated to assess resin permeation. Drilling and resin injection were easiest through clean limestone fill materials. Recycled building waste fill material proved difficult to inject with resin; this material is thus considered unsuitable for use in resin composites. Incomplete resin permeation in several of the test pits created irregular ‘blocks’ of composite. Injected resin foams significantly improve the stiffness and resistance (strength) of the uncompacted fill material. The stiffness of the treated fill material appears to be a function of the stone particle size, its associated compaction characteristics (under loose tipping) and the proportion of resin foam matrix. The type of fill material is more critical than the type of resin to the geotechnical properties of the composite materials. Resin composites can effectively support typical design imposed loads. Compared to other traditional treatment options, such as cement grouting, the use of resin composites is potentially less disruptive, particularly for sites with limited access, and thus likely to achieve significant reinstatement cost savings. The use of resin composites is considered a suitable option for the future treatment of shallow mining collapses.

Keywords: composite material, ground improvement, mining legacy, resin

Procedia PDF Downloads 355
29619 Algorithmic Obligations: Proactive Liability for AI-Generated Content and Copyright Compliance

Authors: Aleksandra Czubek

Abstract:

As AI systems increasingly shape content creation, existing copyright frameworks face significant challenges in determining liability for AI-generated outputs. Current legal discussions largely focus on who bears responsibility for infringing works, be it developers, users, or entities benefiting from AI outputs. This paper introduces a novel concept of algorithmic obligations, proposing that AI developers be subject to proactive duties that ensure their models prevent copyright infringement before it occurs. Building on principles of obligations law traditionally applied to human actors, the paper suggests a shift from reactive enforcement to proactive legal requirements. AI developers would be legally mandated to incorporate copyright-aware mechanisms within their systems, turning optional safeguards into enforceable standards. These obligations could vary in implementation across international, EU, UK, and U.S. legal frameworks, creating a multi-jurisdictional approach to copyright compliance. This paper explores how the EU’s existing copyright framework, exemplified by the Copyright Directive (2019/790), could evolve to impose a duty of foresight on AI developers, compelling them to embed mechanisms that prevent infringing outputs. By drawing parallels to GDPR’s “data protection by design,” a similar principle could be applied to copyright law, where AI models are designed to minimize copyright risks. In the UK, post-Brexit text and data mining exemptions are seen as pro-innovation but pose risks to copyright protections. This paper proposes a balanced approach, introducing algorithmic obligations to complement these exemptions. AI systems benefiting from text and data mining provisions should integrate safeguards that flag potential copyright violations in real time, ensuring both innovation and protection. In the U.S., where copyright law focuses on human-centric works, this paper suggests an evolution toward algorithmic due diligence. 
AI developers would have a duty similar to product liability, ensuring that their systems do not produce infringing outputs, even if the outputs themselves cannot be copyrighted. This framework introduces a shift from post-infringement remedies to preventive legal structures, where developers actively mitigate risks. The paper also breaks new ground by addressing obligations surrounding the training data of large language models (LLMs). Currently, training data is often treated under exceptions such as the EU’s text and data mining provisions or U.S. fair use. However, this paper proposes a proactive framework where developers are obligated to verify and document the legal status of their training data, ensuring it is licensed or otherwise cleared for use. In conclusion, this paper advocates for an obligations-centered model that shifts AI-related copyright law from reactive litigation to proactive design. By holding AI developers to a heightened standard of care, this approach aims to prevent infringement at its source, addressing both the outputs of AI systems and the training processes that underlie them.

Keywords: ip, technology, copyright, data, infringement, comparative analysis

Procedia PDF Downloads 20
29618 Research on Spatial Distribution of Service Facilities Based on Innovation Function: A Case Study of Zhejiang University Zijin Co-Maker Town

Authors: Zhang Yuqi

Abstract:

Service facilities are boosters for the cultivation and development of innovative functions in innovation cluster areas. At the same time, reasonable service facility planning can better link the internal functional blocks. This paper takes Zhejiang University Zijin Co-Maker Town as the research object and, based on a combination of network data mining, field research and verification, together with the needs of its internal innovative groups, studies the distribution characteristics and existing problems of service facilities, then proposes targeted planning suggestions. The main conclusions are as follows: (1) From the perspective of provision, the town is rich in general life-supporting services but lacks targeted and distinctive service facilities for innovative groups; (2) From the perspective of scale structure, small-scale street shops are the main business form, and there is a lack of a large-scale service center; (3) From the perspective of spatial structure, the service facility layout of each functional block is too fragmented to fit the 'aggregation-distribution' characteristics of innovation and entrepreneurial activities; (4) Service facility planning should be guided by the goal of fostering innovation and entrepreneurship and should meet the actual needs of the innovation and entrepreneurial groups.

Keywords: the cultivation of innovative function, Zhejiang University Zijin Co-Maker Town, service facilities, network data mining, space optimization advice

Procedia PDF Downloads 117
29617 Electrospray Deposition Technique of Dye Molecules in the Vacuum

Authors: Nouf Alharbi

Abstract:

The electrospray deposition technique has become an important method that enables fragile, non-volatile molecules to be deposited in situ in high vacuum environments. It is considered one of the ways to close the gap between basic surface science and molecular engineering. This paper also discusses one of the most important techniques developed to help further develop and characterise electrospray deposition: providing data collected using an image charge detection instrument. Image charge detection mass spectrometry (CDMS) is used to measure the speed and charge distributions of the molecular ions. In addition, SIMION simulations of the energies and masses of the molecular ions through the system are included in order to refine the mass-selection process.
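As a rough illustration of the CDMS principle (not the instrument used in this work), an ion accelerated through a potential V acquires kinetic energy q*V, so a measured charge and speed determine the mass; the numeric values below are invented:

```python
# Physical constants (CODATA values).
ELEMENTARY_CHARGE = 1.602176634e-19   # C
ATOMIC_MASS_UNIT = 1.66053906660e-27  # kg

def ion_mass(charge_e, accel_voltage, speed):
    """Mass in kg from the energy balance q*V = 0.5*m*v**2.

    charge_e      -- measured image charge, in units of the elementary charge
    accel_voltage -- electrostatic acceleration potential in volts (assumed known)
    speed         -- measured ion speed in m/s
    """
    q = charge_e * ELEMENTARY_CHARGE
    return 2.0 * q * accel_voltage / speed**2

# Invented example: a 150-charge ion accelerated through 120 V, moving at 350 m/s.
m = ion_mass(150, 120.0, 350.0)
print(f"mass = {m:.3e} kg = {m / ATOMIC_MASS_UNIT:.3e} u")
```

Real CDMS analysis involves pulse-shape fitting and calibration; this only shows the kinematic relation behind the speed and charge measurement.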

Keywords: charge, deposition, electrospray, image, ions, molecules, SIMION

Procedia PDF Downloads 133
29616 Application of IF Rough Data on Knowledge Towards Malaria of Rural Tribal Communities in Tripura

Authors: Chhaya Gangwal, R. N. Bhaumik, Shishir Kumar

Abstract:

Handling uncertainty and impreciseness of knowledge appears to be a challenging task in information systems. Intuitionistic fuzzy (IF) sets and rough set theory enhance databases by allowing the management of uncertainty and impreciseness. This paper presents a new efficient query optimization technique for multi-valued or imprecise IF rough databases. The usefulness of this technique is illustrated on malaria knowledge from the rural tribal communities of Tripura, where most of the information is multi-valued and imprecise. Queries about malaria knowledge are then executed in SQL Server to make the implementation of IF rough data querying simpler.
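A minimal sketch of IF-style selection, assuming each imprecise attribute value is stored as a (membership, non-membership) pair with hesitation pi = 1 - mu - nu; the records and thresholds below are hypothetical, not the Tripura survey data:

```python
# Hypothetical household records: "uses_bednet" carries an intuitionistic
# fuzzy pair (membership mu, non-membership nu); the hesitation margin
# captures "don't know" answers common in field surveys.
records = [
    {"village": "A", "uses_bednet": (0.8, 0.1)},
    {"village": "B", "uses_bednet": (0.3, 0.6)},
    {"village": "C", "uses_bednet": (0.5, 0.2)},
]

def hesitation(pair):
    """Hesitation margin pi = 1 - mu - nu of an IF pair."""
    mu, nu = pair
    return 1.0 - mu - nu

def select(rows, attr, min_mu, max_nu):
    """IF-style selection: keep rows whose evidence for attr is strong enough
    (membership at least min_mu) and whose evidence against it is weak enough
    (non-membership at most max_nu)."""
    return [r for r in rows
            if r[attr][0] >= min_mu and r[attr][1] <= max_nu]

print([r["village"] for r in select(records, "uses_bednet", 0.5, 0.3)])
# -> ['A', 'C']
```

The paper's technique additionally exploits rough lower/upper approximations for query optimization; this fragment only shows how IF pairs enter a selection predicate.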

Keywords: intuitionistic fuzzy set, rough set, relational database, IF rough relational database

Procedia PDF Downloads 446
29615 Treatment of Cyanide Effluents with Platinum Impregnated on Mg-Al Layered Hydroxides

Authors: María R. Contreras, Diana Endara

Abstract:

Cyanide leaching is the most widely used technology in the gold mining industry, and it produces large amounts of effluents requiring treatment. In Ecuador, the development of the gold mining industry has increased, causing significant environmental impacts due to the heavy use of cyanide; it is estimated that 10 g of extracted gold generates 7000 liters of water contaminated with 300 mg/L of free cyanide. The most common treatment methods used nowadays, such as peroxodisulfuric acid, ozonation and H₂O₂, are expensive and present disadvantages. Heterogeneous catalysts have been developed to treat this contaminant. Layered double hydroxides (LDHs) have received much attention due to their wide applications, for example as catalyst supports. In this study, a Mg-Al LDH was synthesized by the coprecipitation method and platinum was then impregnated on it in order to enhance its catalytic activity. Two methods of impregnation were used: the first, incipient wetness impregnation; the second, continuous agitation of the LDH in contact with a chloroplatinic acid solution for 24 h. The impregnated support was analyzed by X-ray diffraction, FTIR and SEM. Finally, the oxidation of the cyanide ion was performed by preparing synthetic solutions of sodium cyanide (NaCN) with an initial concentration of 500 mg/L at pH 10.5 and an air flow of 180 NL/h. After 8 hours of treatment, 80% oxidation of the cyanide ion was achieved.

Keywords: catalysis, cyanide, LDHs, mining

Procedia PDF Downloads 146
29614 A QoS Aware Cluster Based Routing Algorithm for Wireless Mesh Network Using LZW Lossless Compression

Authors: J. S. Saini, P. P. K. Sandhu

Abstract:

The multi-hop nature of wireless mesh networks and the rapid growth of throughput demands have led to multi-channel, multi-radio structures in mesh networks, but co-channel interference reduces the total throughput, specifically in multi-hop networks. Quality of Service (QoS) refers to a broad collection of networking technologies and techniques that guarantee the ability of a network to provide desired services with predictable results. QoS can be directed at a network interface, towards a specific server or router's performance, or at specific applications. Due to interference among transmissions, QoS routing in multi-hop wireless networks is a formidable task; in a multi-channel wireless network, two transmissions using the same channel may interfere with each other. This paper considers the Destination Sequenced Distance Vector (DSDV) routing protocol to locate a secure and optimised path. The proposed technique also utilizes Lempel–Ziv–Welch (LZW) based lossless data compression and intra-cluster data aggregation to enhance communication between the source and the destination. Clustering aggregates multiple packets and locates a single route using the clusters, improving intra-cluster data aggregation. The LZW-based lossless data compression reduces the data packet size and hence consumes less energy, thus increasing the network QoS. The MATLAB tool has been used to evaluate the effectiveness of the proposed technique. The comparative analysis has shown that the proposed technique outperforms the existing techniques.
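The LZW compression named above is a standard dictionary-based algorithm: the encoder grows a table of previously seen substrings and emits one code per longest match. A minimal sketch (the paper's MATLAB implementation is not reproduced here):

```python
def lzw_compress(data: str) -> list[int]:
    """Standard LZW: extend the current phrase while it is in the table;
    on a miss, emit the phrase's code and register the extended phrase."""
    table = {chr(i): i for i in range(256)}   # seed with single bytes
    next_code = 256
    w, out = "", []
    for ch in data:
        wc = w + ch
        if wc in table:
            w = wc
        else:
            out.append(table[w])
            table[wc] = next_code
            next_code += 1
            w = ch
    if w:
        out.append(table[w])
    return out

def lzw_decompress(codes: list[int]) -> str:
    """Inverse mapping; rebuilds the table on the fly."""
    table = {i: chr(i) for i in range(256)}
    next_code = 256
    w = table[codes[0]]
    out = [w]
    for code in codes[1:]:
        # The "KwKwK" special case: the code may refer to the phrase
        # currently being defined, i.e. w plus its own first character.
        entry = table[code] if code in table else w + w[0]
        out.append(entry)
        table[next_code] = w + entry[0]
        next_code += 1
        w = entry
    return "".join(out)

codes = lzw_compress("TOBEORNOTTOBEORTOBEORNOT")
print(codes, lzw_decompress(codes))
```

In the routing context, the cluster head would compress the aggregated payload before forwarding, shrinking packets on the classic repetitive-text case above from 24 symbols to fewer codes.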

Keywords: WMNS, QOS, flooding, collision avoidance, LZW, congestion control

Procedia PDF Downloads 340
29613 In-situ Oxygen Enrichment for UCG

Authors: Adesola O. Orimoloye, Edward Gobina

Abstract:

Membrane separation technology is still considered an emerging technology in the mining sector and does not yet have the widespread acceptance that it enjoys in other industrial sectors. Underground Coal Gasification (UCG), wherein coal is converted to gas in situ, is a safer alternative to conventional mining that retains all pollutants underground, making the process environmentally friendly. In-situ combustion of coal for power generation allows access to more of the physical global coal resource than would be included in current economically recoverable reserve estimates. Where mining is no longer taking place, for economic or geological reasons, controlled gasification permits in-situ exploitation of the coal seams, reacting the coal to form a synthesis gas. The oxygen supply stage is one of the most expensive parts of any gasification project, but the use of membranes is a potentially attractive approach for producing oxygen-enriched air. In this study, a variety of cost-effective membrane materials giving optimal oxygen concentrations in the range of interest were designed and tested at diverse operating conditions. An oxygen-enriched atmosphere improves the combustion temperature, but a decline is observed if the oxygen concentration exceeds the optimum. The experimental results also cover the preparation method, apparatus and performance of the fabricated membrane.

Keywords: membranes, oxygen-enrichment, gasification, coal

Procedia PDF Downloads 325
29612 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models

Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah

Abstract:

In the present work, we develop a technique for estimating the volatility of financial time series using stochastic differential equations. Taking the daily closing prices from developed and emergent stock markets as the basis, we argue that incorporating stochastic volatility into the time-varying parameter estimation significantly improves forecasting performance via maximum likelihood estimation. The technique captures the long-memory behavior of the data sets and yields one-step-ahead predicted log-volatility with ±2 standard errors despite the observed noise following a Normal mixture distribution, because the financial data studied are not fully Gaussian. The Ornstein-Uhlenbeck process followed in this work also simulates the financial time series well, and the good convergence properties of the estimation algorithm make it suitable for large data sets.
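A minimal sketch of the estimation idea, assuming the exact AR(1) discretisation of the OU process dX = theta*(mu - X)dt + sigma*dW; the closed-form fit below is a standard conditional MLE on simulated data, not the authors' full stochastic volatility algorithm:

```python
import math
import random

def simulate_ou(theta, mu, sigma, x0, dt, n, seed=1):
    """Exact-discretisation simulation of dX = theta*(mu - X)dt + sigma*dW."""
    rng = random.Random(seed)
    a = math.exp(-theta * dt)
    sd = sigma * math.sqrt((1 - a * a) / (2 * theta))
    x = [x0]
    for _ in range(n):
        x.append(mu + (x[-1] - mu) * a + rng.gauss(0.0, sd))
    return x

def fit_ou(x, dt):
    """Closed-form conditional MLE via the implied AR(1):
    X[t+1] = c + a*X[t] + eps, then map (c, a, var_eps) back to OU."""
    n = len(x) - 1
    sx = sum(x[:-1]); sy = sum(x[1:])
    sxx = sum(v * v for v in x[:-1])
    sxy = sum(x[i] * x[i + 1] for i in range(n))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    c = (sy - a * sx) / n
    theta = -math.log(a) / dt
    mu = c / (1 - a)
    resid_var = sum((x[i + 1] - c - a * x[i]) ** 2 for i in range(n)) / n
    sigma = math.sqrt(resid_var * 2 * theta / (1 - a * a))
    return theta, mu, sigma

xs = simulate_ou(theta=2.0, mu=0.5, sigma=0.3, x0=0.0, dt=0.01, n=20000, seed=7)
print(fit_ou(xs, 0.01))  # estimates should be near (2.0, 0.5, 0.3)
```

For log-volatility of real returns, the observations would replace the simulated path, and the Normal-mixture noise mentioned above would motivate the more elaborate likelihood used in the paper.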

Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model

Procedia PDF Downloads 242
29611 Beyond Voluntary Corporate Social Responsibility: Examining the Impact of the New Mandatory Community Development Agreement in the Mining Sector of Sierra Leone

Authors: Wusu Conteh

Abstract:

Since the 1990s, neo-liberalization has become a global agenda. The free market ushered in an unprecedented drive by Multinational Corporations (MNCs) to secure mineral rights in resource-rich countries. Several governments in the Global South implemented liberalized mining policies with support from the International Financial Institutions (IFIs). MNCs have maintained that voluntary Corporate Social Responsibility (CSR) has engendered socio-economic development in mining-affected communities. However, most resource-rich countries are struggling to transform their resources into sustainable socio-economic development; they are trapped in what has been widely described as the ‘resource curse.’ In an attempt to address this resource conundrum, the African Mining Vision (AMV) of 2009 developed a model of resource governance. The advent of the AMV has engendered the introduction of mandatory community development agreements (CDAs) into the legal frameworks of many countries in Africa. In 2009, Sierra Leone enacted the Mines and Minerals Act, which obligates mining companies to invest in primary host communities. The study employs interview and field observation techniques to explicate the dynamics of the CDA program. A total of 25 respondents (government officials, NGOs/CSOs and community stakeholders) were interviewed. The study focuses on a case study of the Sierra Rutile CDA program in Sierra Leone. Extant scholarly works have extensively explored the resource curse and voluntary CSR, but there are limited studies of the mandatory CDA and its impact on socio-economic development in mining-affected communities. Thus, the purpose of this study is to explicate the impact of the CDA in Sierra Leone. Using the theory of change helps to understand how the availability of mandatory funds can empower communities to take an active part in decision making related to the development of the communities.
The results show that the CDA has engendered a predictable fund for community development. It has also empowered ordinary members of the community to determine the development program. However, the CDA has created a new ground for contestations between the pre-existing local governance structure (traditional authority) and the newly created community development committee (CDC) that is headed by an ordinary member of the community.

Keywords: community development agreement, impact, mandatory, participation

Procedia PDF Downloads 125
29610 Artificial Neural Networks with Decision Trees for Diagnosis Issues

Authors: Y. Kourd, D. Lefebvre, N. Guersi

Abstract:

This paper presents a new fault detection and isolation (FDI) technique applied to industrial systems. The technique is based on neural network models of fault-free and faulty behaviors (NNFMs). NNFMs are used for residual generation, while a decision tree architecture is used for residual evaluation. The decision tree is built with data collected from the NNFM outputs and is used to isolate detectable faults depending on computed thresholds. Each part of the tree corresponds to a specific residual. With the decision tree, it becomes possible to take the appropriate decision regarding the actual process behavior by evaluating a small number of residuals. In comparison to the usual systematic evaluation of all residuals, the proposed technique requires less computational effort and can be used for online diagnosis. An application example is presented to illustrate and confirm the effectiveness and accuracy of the proposed approach.
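The residual-evaluation idea can be sketched with two hypothetical residuals and hand-set thresholds; in the paper, the fault-free predictions would come from the trained NNFMs and the thresholds from the learned tree:

```python
# Invented example: r1 is the residual of a temperature model, r2 of a flow
# model. The tree isolates the fault by thresholding only the residuals it
# needs along one root-to-leaf path, instead of evaluating every residual.
THRESH_R1 = 0.5   # hypothetical threshold on the temperature residual
THRESH_R2 = 0.8   # hypothetical threshold on the flow residual

def residuals(measured, predicted_fault_free):
    """Absolute deviation between plant measurements and model predictions."""
    return [abs(m - p) for m, p in zip(measured, predicted_fault_free)]

def isolate(r):
    """Two-level decision tree over the residual vector r = [r1, r2]."""
    r1, r2 = r
    if r1 <= THRESH_R1:                       # root node tests r1 only
        return "no fault" if r2 <= THRESH_R2 else "flow sensor fault"
    return "temperature sensor fault" if r2 <= THRESH_R2 else "actuator fault"

# Measured [21.3, 4.1] vs fault-free predictions [21.1, 5.2]:
print(isolate(residuals([21.3, 4.1], [21.1, 5.2])))  # -> flow sensor fault
```

With more residuals, the saving over systematic evaluation grows: each diagnosis visits only the residuals on one path through the tree.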

Keywords: neural networks, decision trees, diagnosis, behaviors

Procedia PDF Downloads 507
29609 Increasing the Capacity of Plant Bottlenecks by Using of Improving the Ratio of Mean Time between Failures to Mean Time to Repair

Authors: Jalal Soleimannejad, Mohammad Asadizeidabadi, Mahmoud Koorki, Mojtaba Azarpira

Abstract:

A significant percentage of production costs is maintenance costs, and analysis of maintenance costs can help achieve greater productivity and competitiveness. With this in mind, the maintenance of machines and installations is considered an essential part of organizational functions, and applying effective strategies creates significant added value in manufacturing activities. Organizations are trying to achieve performance levels on a global scale, with emphasis on creating competitive advantage, by methods such as RCM (Reliability-Centered Maintenance) and TPM (Total Productive Maintenance). In this study, increasing the capacity of the Concentration Plant of Golgohar Iron Ore Mining & Industrial Company (GEG) was examined using reliability and maintainability analyses. The results of this research showed that instead of increasing the number of machines (in order to solve the bottleneck problems), improving reliability and maintainability solves bottleneck problems in the best way. The data set of the Concentration Plant of GEG was applied and analyzed as a case study.
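The MTBF/MTTR ratio in the title drives the inherent availability A = MTBF / (MTBF + MTTR), and hence the effective capacity of a bottleneck machine; a sketch with invented numbers (not GEG data):

```python
def availability(mtbf_hours, mttr_hours):
    """Inherent availability A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def effective_capacity(nominal_tph, mtbf_hours, mttr_hours):
    """Throughput after discounting downtime at the bottleneck machine."""
    return nominal_tph * availability(mtbf_hours, mttr_hours)

# Illustrative numbers: halving MTTR raises the bottleneck's effective
# capacity without buying a second machine, which is the paper's point.
print(effective_capacity(1000, 90, 10))  # 900.0 t/h
print(effective_capacity(1000, 90, 5))   # ~947.4 t/h
```

The same arithmetic shows why improving the MTBF-to-MTTR ratio can be cheaper than duplicating equipment: capacity scales linearly with availability, and availability responds to both failure frequency (MTBF) and repair speed (MTTR).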

Keywords: bottleneck, golgohar iron ore mining & industrial company, maintainability, maintenance costs, reliability

Procedia PDF Downloads 365
29608 Gold, Power, Protest, Examining How Digital Media and PGIS are Used to Protest the Mining Industry in Colombia

Authors: Doug Specht

Abstract:

This research project sought to explore the links between digital media, PGIS and social movement organisations in Tolima, Colombia. The primary aim of the research was to examine how knowledge is created and disseminated through digital media and GIS in the region, and whether the infrastructure exists to allow for this. The second strand was to ascertain whether this has had a significant impact on the way grassroots movements work and produce collective actions. The third element is a hypothesis about how digital media and PGIS could play a larger role in activist activities, particularly in reference to the extractive industries. Three theoretical strands have been brought together to provide a basis for this research, namely (a) the politics of knowledge, (b) spatial management and inclusion, and (c) digital media and political engagement. Quantitative data relating to digital media and mobile internet use was collated alongside qualitative data relating to the likelihood of using digital media in activist campaigns, with particular attention given to grassroots movements working against extractive industries in the Tolima region of Colombia. Through interviews, surveys and GIS analysis, it has been possible to build a picture of online activism and the role of PPGIS within protest movements in the region of Tolima, Colombia. Results show a gap between the desire of social movements to use digital media and the skills and finances required to implement programs that utilise it. Maps and GIS are generally reserved for legal cases rather than for informing the lay person. However, it became apparent that the combination of digital/social media and PPGIS could play a significant role in supporting the work of grassroots movements.

Keywords: PGIS, GIS, social media, digital media, mining, Colombia, social movements, protest

Procedia PDF Downloads 427