Search results for: data mining applications and discovery
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30840


29310 The Fundamental Research and Industrial Application on CO₂+O₂ in-situ Leaching Process in China

Authors: Lixin Zhao, Genmao Zhou

Abstract:

Traditional acid in-situ leaching (ISL) is not suitable for sandstone uranium deposits with low permeability and a high content of carbonate minerals, because calcium sulfate precipitates block the formation. Another factor affecting acid ISL is that pyrite in the ore reacts with the oxidant, producing large amounts of sulfate ions that accelerate calcium sulfate precipitation and consume large amounts of oxidant. Owing to advantages such as lower chemical reagent consumption and reduced groundwater pollution, the CO₂+O₂ in-situ leaching method has become an important research area in uranium mining. China is the second country in the world to adopt CO₂+O₂ ISL in industrial uranium production. The results show that CO₂+O₂ ISL in China has been successfully developed: the reaction principle, technical process, well field design and drilling engineering, uranium-bearing solution processing, etc., have been fully studied. At the current stage, several uranium mines use the CO₂+O₂ ISL method to extract uranium from ore-bearing aquifers. The industrial application and development potential of the CO₂+O₂ ISL method in China are summarized. By using CO₂+O₂ neutral leaching technology, the problem of calcium carbonate and calcium sulfate precipitation during uranium mining has been solved. By reasonably regulating the amounts of CO₂ and O₂, the relevant ions and hydrochemical conditions can be kept within limits that avoid calcium sulfate and calcium carbonate precipitation. On this premise, the demands of CO₂+O₂ uranium leaching are met to the maximum extent: uranium is leached effectively while carbonate and sulfate precipitation is avoided, enabling industrial development of sandstone-type uranium deposits.

Keywords: CO₂+O₂ ISL, industrial production, well field layout, uranium processing

Procedia PDF Downloads 178
29309 Simulation Data Summarization Based on Spatial Histograms

Authors: Jing Zhao, Yoshiharu Ishikawa, Chuan Xiao, Kento Sugiura

Abstract:

In order to analyze large-scale scientific data, research on data exploration and visualization has gained popularity. In this paper, we focus on the exploration and visualization of scientific simulation data and define a spatial V-Optimal histogram for data summarization. We propose histogram construction algorithms based on a general binary hierarchical partitioning as well as a more specific one, the l-grid partitioning: an optimal algorithm and a heuristic algorithm, targeting effective data summarization and efficient data visualization in scientific data analysis. To verify the effectiveness and efficiency of the proposed methods, we conduct experiments on massive evacuation simulation data.
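The V-Optimal criterion the abstract refers to can be illustrated in one dimension: choose bucket boundaries that minimize the total squared deviation of values from their bucket means. A minimal dynamic-programming sketch (the paper's spatial and hierarchical partitionings generalize this idea; the function names here are illustrative, not the authors'):

```python
def sse(prefix, prefix_sq, i, j):
    # Sum of squared errors of values[i:j] around their mean, in O(1)
    # via prefix sums: sum(v^2) - (sum v)^2 / n.
    n = j - i
    s = prefix[j] - prefix[i]
    sq = prefix_sq[j] - prefix_sq[i]
    return sq - s * s / n

def v_optimal_histogram(values, k):
    """Partition `values` into k contiguous buckets minimizing total SSE."""
    n = len(values)
    prefix = [0.0] * (n + 1)
    prefix_sq = [0.0] * (n + 1)
    for i, v in enumerate(values):
        prefix[i + 1] = prefix[i] + v
        prefix_sq[i + 1] = prefix_sq[i] + v * v
    INF = float("inf")
    # cost[b][j]: best error covering the first j values with b buckets.
    cost = [[INF] * (n + 1) for _ in range(k + 1)]
    cut = [[0] * (n + 1) for _ in range(k + 1)]
    cost[0][0] = 0.0
    for b in range(1, k + 1):
        for j in range(b, n + 1):
            for i in range(b - 1, j):
                c = cost[b - 1][i] + sse(prefix, prefix_sq, i, j)
                if c < cost[b][j]:
                    cost[b][j] = c
                    cut[b][j] = i
    # Recover bucket right-boundaries by walking the cut table backwards.
    bounds, j = [], n
    for b in range(k, 0, -1):
        bounds.append(j)
        j = cut[b][j]
    return list(reversed(bounds)), cost[k][n]
```

The heuristic algorithms the paper proposes trade this O(kn²) optimum for speed on large simulation datasets.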

Keywords: simulation data, data summarization, spatial histograms, exploration, visualization

Procedia PDF Downloads 180
29308 Enhancing Information Technologies with AI: Unlocking Efficiency, Scalability, and Innovation

Authors: Abdal-Hafeez Alhussein

Abstract:

Artificial Intelligence (AI) has become a transformative force in the field of information technologies, reshaping how data is processed, analyzed, and utilized across various domains. This paper explores the multifaceted applications of AI within information technology, focusing on three key areas: automation, scalability, and data-driven decision-making. We delve into how AI-powered automation is optimizing operational efficiency in IT infrastructures, from automated network management to self-healing systems that reduce downtime and enhance performance. Scalability, another critical aspect, is addressed through AI’s role in cloud computing and distributed systems, enabling the seamless handling of increasing data loads and user demands. Additionally, the paper highlights the use of AI in cybersecurity, where real-time threat detection and adaptive response mechanisms significantly improve resilience against sophisticated cyberattacks. In the realm of data analytics, AI models—especially machine learning and natural language processing—are driving innovation by enabling more precise predictions, automated insights extraction, and enhanced user experiences. The paper concludes with a discussion on the ethical implications of AI in information technologies, underscoring the importance of transparency, fairness, and responsible AI use. It also offers insights into future trends, emphasizing the potential of AI to further revolutionize the IT landscape by integrating with emerging technologies like quantum computing and IoT.

Keywords: artificial intelligence, information technology, automation, scalability

Procedia PDF Downloads 20
29307 Synthesis of Size-Tunable and Stable Iron Nanoparticles for Cancer Treatment

Authors: Ambika Selvaraj

Abstract:

Magnetic iron oxide nanoparticles (IO) smaller than 20 nm (superparamagnetic) have become a promising tool in cancer therapy and in integrated nanodevices for cancer detection and screening. The main obstacles are particle heterogeneity and cost, which can be overcome by producing monodispersed nanoparticles through an economical approach. We have successfully synthesized IO smaller than 7 nm by a low-temperature controlled technique, in which Fe⁰ is sandwiched between a stabilizer and Fe²⁺. Size analysis showed excellent size control, from 31 nm at 33 °C to 6.8 nm at 10 °C. The resulting monodispersed IO were found to be stable for more than 50 reuses, proving their applicability in biomedical applications.

Keywords: low temperature synthesis, hybrid iron nanoparticles, cancer therapy, biomedical applications

Procedia PDF Downloads 346
29306 Performance Evaluation of Distributed Deep Learning Frameworks in Cloud Environment

Authors: Shuen-Tai Wang, Fang-An Kuo, Chau-Yi Chou, Yu-Bin Fang

Abstract:

2016 became the year of the artificial intelligence explosion. AI technologies have matured to the point that most well-known tech giants are investing heavily to expand their AI capabilities. Machine learning is the science of getting computers to act without being explicitly programmed, and deep learning is a subset of machine learning that uses deep neural networks to learn features directly from data. Deep learning enables many machine learning applications that expand the field of AI. At present, deep learning frameworks are widely deployed on servers for deep learning applications in both academia and industry. There are many standard processes and algorithms for training deep neural networks, but the performance of different frameworks can differ. In this paper, we evaluate the running performance of two state-of-the-art distributed deep learning frameworks that parallelize training across multiple GPUs and multiple nodes in our cloud environment. We evaluate the training performance of the frameworks with the ResNet-50 convolutional neural network and analyze the factors that account for the performance differences between the two distributed frameworks. Through this experimental analysis, we identify overheads that could be further optimized. The main contribution is that the evaluation results provide directions for further optimization in both performance tuning and algorithmic design.
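The synchronous data-parallel training that such frameworks implement reduces, at each step, to averaging gradients across workers; the benchmark's headline number is then how close multi-GPU throughput gets to linear scaling. A toy sketch of both ideas (illustrative only, not either framework's actual API):

```python
def allreduce_average(worker_grads):
    """Average per-parameter gradients across workers, as in synchronous
    data-parallel training: each worker computes gradients on its own
    mini-batch shard, then all workers apply the same averaged update."""
    n_workers = len(worker_grads)
    n_params = len(worker_grads[0])
    return [sum(w[p] for w in worker_grads) / n_workers
            for p in range(n_params)]

def scaling_efficiency(throughput_n_gpus, throughput_1_gpu, n_gpus):
    """Fraction of ideal linear speedup achieved; communication and
    synchronization overheads push this below 1.0."""
    return throughput_n_gpus / (n_gpus * throughput_1_gpu)
```

The overheads the paper identifies are exactly what separates measured `scaling_efficiency` from 1.0.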

Keywords: artificial intelligence, machine learning, deep learning, convolutional neural networks

Procedia PDF Downloads 213
29305 Frequent-Pattern Tree Algorithm Application to S&P and Equity Indexes

Authors: E. Younsi, H. Andriamboavonjy, A. David, S. Dokou, B. Lemrabet

Abstract:

Software and time optimization are very important factors in financial markets, which are highly competitive fields, and the emergence of new computing tools further sharpens the challenge. In this context, any improvement to the technical indicators that generate buy or sell signals is a major issue, and many tools have been created to make them more effective. This concern with efficiency leads the present paper to seek the most effective (and most innovative) way of achieving the largest improvement in these indicators. The approach consists of attaching a signature to frequent market configurations by applying a frequent-pattern extraction method, which is well suited to optimizing investment strategies. The goal of the proposed trading algorithm is to find the most accurate signatures, using a back-testing procedure applied to technical indicators to improve their performance. The problem is then to determine the signatures which, combined with an indicator, outperform that indicator alone. For this task, the FP-Tree algorithm was chosen, as it appears to be the most efficient algorithm available.
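The frequent-pattern extraction step can be illustrated with a brute-force miner that returns the same frequent itemsets FP-growth would, just without the FP-tree's efficiency (the items here are stand-in symbols for market configurations, not the paper's actual indicator signatures):

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Enumerate all itemsets whose support (fraction of transactions
    containing them) meets min_support. FP-growth computes the same
    result without candidate generation, via the FP-tree."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    result = {}
    for size in range(1, len(items) + 1):
        found = False
        for cand in combinations(items, size):
            count = sum(1 for t in transactions if set(cand) <= set(t))
            if count / n >= min_support:
                result[cand] = count / n
                found = True
        if not found:
            break  # Apriori property: no superset of an infrequent set is frequent
    return result
```

In the paper's setting, each "transaction" would be a window of market states, and the mined itemsets become candidate signatures to back-test alongside an indicator.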

Keywords: quantitative analysis, back-testing, computational models, apriori algorithm, pattern recognition, data mining, FP-tree

Procedia PDF Downloads 365
29304 Pregnant Women in Substance Abuse: Transition of Characteristics and Mining of Association from TEDS-A 2011 to 2018

Authors: Md Tareq Ferdous Khan, Shrabanti Mazumder, MB Rao

Abstract:

Background: Substance use during pregnancy is a longstanding public health problem that results in severe consequences for pregnant women and fetuses. Methods: Eight annual datasets (2011-2018) on pregnant women's admissions were extracted from TEDS-A. Distributions of sociodemographic, substance abuse, and clinical characteristics were constructed and compared over the years for trends using the Cochran-Armitage test. Market basket analysis was used to mine associations among polysubstance abuse. Results: Over the years, pregnant women's admissions as a percentage of total and of female admissions remained stable, with total annual admissions ranging from 1.54 million to about 2 million and a female share of 33.30% to 35.61%. Pregnant women aged 21-29, with 12 or more years of education, of white race, unemployed, and holding independent living status are among the most vulnerable. Concerns remain over the significant number of polysubstance users, young age at first use, frequency of daily use, and records of prior admissions (60%). Trends in the primary substances of abuse show a significant rise in heroin (66%) and methamphetamine (46%) over the years, although the latest year shows a considerable downturn. On the other hand, significant decreasing patterns are evident for alcohol (43%), marijuana or hashish (24%), cocaine or crack (23%), other opiates or synthetics (36%), and benzodiazepines (29%). Basket analysis reveals patterns of co-occurrence of substances that are consistent over the years. Conclusions: This comprehensive study can serve as a reference for identifying the most vulnerable groups based on their characteristics and for addressing the most hazardous substances based on their evidence of co-occurrence.
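Market basket analysis scores co-occurrence with support, confidence, and lift. A minimal pairwise-rule miner sketching that computation (the transactions below are hypothetical examples, not TEDS-A records):

```python
from collections import Counter
from itertools import combinations

def association_rules(transactions, min_support, min_confidence):
    """Mine pairwise rules A -> B, reporting (A, B, support, confidence, lift).
    support = P(A and B); confidence = P(B | A); lift = confidence / P(B)."""
    n = len(transactions)
    single = Counter()
    pair = Counter()
    for t in transactions:
        items = sorted(set(t))
        single.update(items)
        pair.update(frozenset(p) for p in combinations(items, 2))
    rules = []
    for p, count in pair.items():
        support = count / n
        if support < min_support:
            continue
        a, b = sorted(p)
        for x, y in ((a, b), (b, a)):
            conf = count / single[x]
            if conf >= min_confidence:
                lift = conf / (single[y] / n)
                rules.append((x, y, support, conf, lift))
    return rules
```

A lift above 1 indicates the pair co-occurs more often than independence would predict, which is the signal the study uses to flag hazardous substance combinations.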

Keywords: basket analysis, pregnant women, substance abuse, trend analysis

Procedia PDF Downloads 198
29303 Preparation and Characterizations of Hydroxyapatite-Sodium Alginate Nanocomposites for Biomedical Applications

Authors: Friday Godwin Okibe, Christian Chinweuba Onoyima, Edith Bolanle Agbaji, Victor Olatunji Ajibola

Abstract:

Polymer-inorganic nanocomposites are presently making an impact in diverse areas, particularly the biomedical sciences. In this research, a hydroxyapatite-sodium alginate nanocomposite was prepared and characterized, with emphasis on the influence of sodium alginate on its properties. An in situ wet chemical precipitation method was used for the preparation. The prepared nanocomposite was characterized by Fourier transform infrared spectroscopy (FTIR), scanning electron microscopy (SEM) with image analysis, and X-ray diffraction (XRD). The FTIR study shows peaks characteristic of hydroxyapatite and confirms formation of the nanocomposite via chemical interaction between sodium alginate and hydroxyapatite. Image analysis shows the nanocomposites to have irregular morphologies that do not change significantly with increasing sodium alginate addition, while particle size decreases with increasing sodium alginate addition (359.46 nm to 109.98 nm). From the XRD data, both the crystallite size and the degree of crystallinity also decreased with increasing sodium alginate composition (32.36 nm to 9.47 nm and 72.87% to 1.82%, respectively), while the microstrain and specific surface area increased (0.0041 to 0.0139 and 58.99 to 201.58 m²/g, respectively). The results show that the formulation with 50 wt% sodium alginate (HASA-50%wt) possesses exceptional characteristics for biomedical applications such as drug delivery.
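Crystallite sizes like those quoted above are conventionally obtained from XRD peak broadening via the Scherrer equation, D = Kλ/(β cos θ). A sketch assuming Cu Kα radiation and a shape factor of 0.9 (the abstract does not state the instrument settings, so these defaults are assumptions):

```python
import math

def scherrer_size(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, k=0.9):
    """Crystallite size in nm from XRD peak broadening (Scherrer equation).
    fwhm_deg: peak full width at half maximum in degrees 2-theta.
    two_theta_deg: peak position in degrees 2-theta.
    wavelength_nm defaults to Cu K-alpha; k is the shape factor."""
    beta = math.radians(fwhm_deg)            # FWHM converted to radians
    theta = math.radians(two_theta_deg / 2)  # Bragg angle
    return k * wavelength_nm / (beta * math.cos(theta))
```

For instance, a 0.5° FWHM on the main hydroxyapatite reflection near 2θ = 31.8° gives a size in the mid-teens of nanometers, the same order as the values reported.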

Keywords: nanocomposite, sodium alginate, hydroxyapatite, biomedical, FTIR, XRD, SEM

Procedia PDF Downloads 333
29302 A Paradigm Shift towards Personalized and Scalable Product Development and Lifecycle Management Systems in the Aerospace Industry

Authors: David E. Culler, Noah D. Anderson

Abstract:

Integrated systems for product design, manufacturing, and lifecycle management are difficult to implement and customize. Commercial software vendors, including CAD/CAM and third-party PDM/PLM developers, create user interfaces and functionality that allow their products to be applied across many industries. The result is that systems become overloaded with functionality, difficult to navigate, and full of terminology that is unfamiliar to engineers and production personnel. For example, manufacturers of automotive, aeronautical, electronics, and household products use similar but distinct methods and processes. Furthermore, each company tends to have its own preferred tools and programs for controlling work and information flow and for connecting design, planning, and manufacturing processes to business applications. This paper presents a methodology and a case study that address these issues and suggest that, in the future, more companies will develop personalized applications that fit the natural way their business operates. A functioning system has been implemented at a highly competitive U.S. aerospace tooling and component supplier that works with many prominent aircraft manufacturers around the world, including The Boeing Company, Airbus, Embraer, and Bombardier Aerospace. During the last three years, the program has produced significant benefits, such as the automatic creation and management of component and assembly designs (parametric models and drawings), the extensive use of lightweight 3D data, and changes to the way projects are executed from beginning to end. CATIA (CAD/CAE/CAM) and a variety of programs developed in C#, VB.Net, HTML, and SQL make up the current system. The web-based platform facilitates collaborative work across multiple sites around the world and improves communications with customers and suppliers.
This work demonstrates that the creative use of Application Programming Interface (API) utilities, libraries, and methods is a key to automating many time-consuming tasks and linking applications together.

Keywords: PDM, PLM, collaboration, CAD/CAM, scalable systems

Procedia PDF Downloads 177
29301 Polydimethylsiloxane Applications in Interferometric Optical Fiber Sensors

Authors: Zeenat Parveen, Ashiq Hussain

Abstract:

This review covers applications of PDMS (polydimethylsiloxane) materials for enhancing the performance of optical fiber sensors in acousto-ultrasonic and mechanical measurements, current sensing, and interferometric optical fiber sensing. We discuss the basic working principle of fiber optic sensing technology, the various types of optical fiber, and PDMS as a coating material for increasing performance. Optical fiber sensing methods detect dynamic strain signals, including general sound and acoustic signals, high-frequency (ultrasonic/ultrasound) signals, and other signals such as acoustic emission and impact-induced dynamic strain. Optical fiber sensors have industrial and civil engineering applications in mechanical measurement, sometimes requiring different sensor configurations and parameters. Optical fiber current sensors are based on the Faraday effect, which gives them better performance than conventional current transformers. Recent advances and cost reductions have stimulated interest in optical fiber sensing, and optical techniques are also applied in material measurement. Fiber optic interferometers are used to sense various physical parameters, including temperature, pressure, and refractive index; the four main types are the Fabry-Pérot, Mach-Zehnder, Michelson, and Sagnac interferometers. The paper concludes with directions for future work on fiber optic sensors.

Keywords: fiber optic sensing, PDMS materials, acoustic, ultrasound, current sensor, mechanical measurements

Procedia PDF Downloads 392
29300 Limosilactobacillus fermentum from Buffalo Milk Is Suitable for Potential Biotechnological Process Development

Authors: Sergio D’Ambrosio, Azza Dobous, Chiara Schiraldi, Donatella Cimini

Abstract:

Probiotics are living microorganisms that confer beneficial effects when consumed. Lactic acid bacteria and bifidobacteria are among the most representative strains assessed as probiotics and exploited as food supplements. Numerous studies have demonstrated their potential as therapeutic candidates for a variety of conditions (restoring gut flora, lowering cholesterol, enhancing the immune response, and anti-inflammatory and anti-oxidation activities). These beneficial actions are also due to biomolecules produced by probiotics, such as exopolysaccharides (EPSs), which exhibit many beneficial properties, including antimicrobial, antitumor, anti-biofilm, antiviral, and immunomodulatory activities. Limosilactobacillus fermentum is a widely studied probiotic; however, few data are available on the development of fermentation and downstream processes for the large-scale production of viable biomass, or on purification processes for EPSs, at industrial scale. For this purpose, an L. fermentum strain was isolated from buffalo milk and used as a test case for biotechnological process development. The strain was able to produce up to 10⁹ CFU/mL on a (glucose-based) semi-defined medium free of animal-derived raw materials, up to the pilot scale (150 L), with improved results compared to commonly used, though industrially unsuitable, media rich in casein and beef extract. Biomass concentration via microfiltration on hollow fibers and subsequent spray-drying allowed recovery of about 5.7 × 10¹⁰ CFU/g of powder of viable cells, indicating the strain's resistance to harsh processing conditions. Overall, these data demonstrate the possibility of obtaining and maintaining adequate levels of viable L. fermentum cells using a simple approach that is potentially suitable for industrial development. A downstream EPS purification protocol based on ultrafiltration, precipitation, and activated charcoal treatment yielded recovered polysaccharides with a purity of about 70-80%.

Keywords: probiotics, fermentation, exopolysaccharides (EPSs), purification

Procedia PDF Downloads 86
29299 Isolation Enhancement of Compact Dual-Band Printed Multiple Input Multiple Output Antenna for WLAN Applications

Authors: Adham M. Salah, Tariq A. Nagem, Raed A. Abd-Alhameed, James M. Noras

Abstract:

Recently, the demand for wireless communications systems covering more than one frequency band (multi-band) with high data rates has increased for both fixed and mobile services. Multiple Input Multiple Output (MIMO) technology is one of the significant solutions for attaining these requirements and for achieving the maximum channel capacity of wireless communications systems. The main issue with MIMO antennas, especially in portable devices, is the compact space available, which limits the physical separation between the radiating elements. This degrades the performance of MIMO antennas by increasing the mutual coupling between the radiating elements: the closer the elements, the stronger the coupling. This paper presents a low-profile dual-band (2×1) MIMO antenna that works at 2.4 GHz, 5.3 GHz, and 5.8 GHz for wireless local area network (WLAN) applications. A neutralization line (NL) technique is used to enhance isolation, introducing a strip line with a length of λg/4 at the isolation frequency (2.4 GHz) between the radiating elements. The overall dimensions of the antenna are 33.5 × 36 × 1.6 mm³. The fabricated prototype shows good agreement between the simulated and measured results. The antenna impedance bandwidths are 2.38–2.75 GHz and 4.4–6 GHz for the lower and upper bands respectively; the reflection coefficient and mutual coupling are better than -25 dB in both bands. The MIMO antenna performance characteristics are reported in terms of the scattering parameters, envelope correlation coefficient (ECC), total active reflection coefficient, capacity loss, antenna gain, and radiation patterns. Analysis of these characteristics indicates that the design is appropriate for WLAN terminal applications.
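The envelope correlation coefficient reported for MIMO antennas like this one is commonly computed from the S-parameters using the standard closed-form expression for a two-port, lossless antenna (Blanch et al.). A sketch with illustrative values (not this antenna's measurements):

```python
def ecc_from_sparams(s11, s21, s12, s22):
    """Envelope correlation coefficient of a 2-port MIMO antenna from
    complex S-parameters, assuming a lossless antenna:
    |S11* S12 + S21* S22|^2 / ((1-|S11|^2-|S21|^2)(1-|S22|^2-|S12|^2))."""
    num = abs(s11.conjugate() * s12 + s21.conjugate() * s22) ** 2
    den = ((1 - abs(s11) ** 2 - abs(s21) ** 2)
           * (1 - abs(s22) ** 2 - abs(s12) ** 2))
    return num / den
```

With the well-matched, well-isolated values this design reports (S-parameters better than -25 dB), the ECC comes out orders of magnitude below the 0.5 threshold usually required for MIMO diversity.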

Keywords: ECC, neutralization line, MIMO antenna, multi-band, mutual coupling, WLAN

Procedia PDF Downloads 137
29298 Efficient Fuzzy Classified Cryptographic Model for Intelligent Encryption Technique towards E-Banking XML Transactions

Authors: Maher Aburrous, Adel Khelifi, Manar Abu Talib

Abstract:

Transactions performed by financial institutions on a daily basis require XML encryption on a large scale. Fully encrypting large volumes of messages causes both performance and resource issues. In this paper, a novel approach is presented for securing financial XML transactions using classification data mining (DM) algorithms. Our strategy defines the complete process of classifying XML transactions using a set of classification algorithms; the classified XML documents are then processed at a later stage using element-wise encryption. Classification algorithms were used to identify the XML transaction rules and factors in order to classify the message content and fetch the important elements within it. We implemented four classification algorithms to determine the importance-level value within each XML document. Classified content is processed using element-wise encryption for selected parts with "High", "Medium", or "Low" importance-level values. Element-wise encryption is performed using the AES symmetric encryption algorithm and a proposed modified AES algorithm that reduces computational overhead: the SubBytes and ShiftRows steps remain as in the original AES, while the MixColumns operation is replaced by a 128-bit permutation followed by the AddRoundKey operation. An implementation was conducted using a dataset fetched from an e-banking service to demonstrate system functionality and efficiency. The results showed a clear improvement in the processing time for encrypting XML documents.

Keywords: XML transaction, encryption, Advanced Encryption Standard (AES), XML classification, e-banking security, fuzzy classification, cryptography, intelligent encryption

Procedia PDF Downloads 413
29297 Concurrent Engineering Challenges and Resolution Mechanisms from Quality Perspectives

Authors: Grmanesh Gidey Kahsay

Abstract:

In modern engineering applications, quality is defined in two ways. The first is that quality is the parameter that measures how well a product's or service's characteristics meet and satisfy pre-stated or fundamental needs (reliability, durability, serviceability). The second is that quality means a product or service free of defects or deficiencies. The American Society for Quality (ASQ) describes quality as the pursuit of optimal solutions to confirm success and ensure accountability for a product's or service's requirements and expectations. This article focuses on quality engineering tools in modern industrial applications. Quality engineering is the field of engineering that deals with the principles, techniques, models, and applications that guarantee the quality of a product or service. Encompassing all activities needed to analyze a product's design and development, quality engineering emphasizes how to ensure that products and services are designed and developed to meet consumers' requirements. This paper introduces quality tools such as quality systems, auditing, product design, and process control. The findings present approaches that aim to improve quality engineering proficiency and effectiveness by introducing essential quality techniques and tools in selected industries.

Keywords: essential quality tools, quality systems and models, quality management systems, quality assurance

Procedia PDF Downloads 159
29296 A Semiparametric Approach to Estimate the Mode of Continuous Multivariate Data

Authors: Tiee-Jian Wu, Chih-Yuan Hsu

Abstract:

Mode estimation is an important task because it has applications to data from a wide variety of sources. We propose a semi-parametric approach to estimate the mode of an unknown continuous multivariate density function. Our approach is based on a weighted average of a parametric density estimate using the Box-Cox transform and a non-parametric kernel density estimate. Our semi-parametric mode estimate improves on both the parametric and non-parametric mode estimates: it solves the non-consistency problem of parametric mode estimates (at large sample sizes) and reduces the variability of non-parametric mode estimates (at small sample sizes). The performance of our method at practical sample sizes is demonstrated by simulation examples and two real examples from the fields of climatology and image recognition.
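The weighted-average idea can be sketched in one dimension: take the mode of a normal fit on Box-Cox-transformed data (the parametric piece), the argmax of a Gaussian kernel density estimate (the non-parametric piece), and combine them. The transform parameter and weight below are illustrative fixed choices, not the paper's data-driven multivariate procedure:

```python
import math

def boxcox(x, lam):
    return (x ** lam - 1) / lam if lam != 0 else math.log(x)

def inv_boxcox(y, lam):
    return (lam * y + 1) ** (1 / lam) if lam != 0 else math.exp(y)

def kde_mode(data, bandwidth, grid=400):
    """Argmax of a Gaussian kernel density estimate over a uniform grid."""
    lo, hi = min(data), max(data)
    best_x, best_d = lo, -1.0
    for i in range(grid + 1):
        x = lo + (hi - lo) * i / grid
        d = sum(math.exp(-0.5 * ((x - v) / bandwidth) ** 2) for v in data)
        if d > best_d:
            best_x, best_d = x, d
    return best_x

def semiparametric_mode(data, lam=0.5, bandwidth=None, weight=0.5):
    """Weighted average of a Box-Cox-parametric mode and a kernel mode
    (1-D sketch of the paper's multivariate idea; requires positive data)."""
    n = len(data)
    if bandwidth is None:
        mean = sum(data) / n
        sd = (sum((v - mean) ** 2 for v in data) / n) ** 0.5
        bandwidth = 1.06 * sd * n ** (-1 / 5)  # Silverman's rule of thumb
    transformed = [boxcox(v, lam) for v in data]
    # Mode of a normal fitted on the transformed scale is its mean.
    parametric = inv_boxcox(sum(transformed) / n, lam)
    nonparametric = kde_mode(data, bandwidth)
    return weight * parametric + (1 - weight) * nonparametric
```

On symmetric data both components agree; the interesting regime is skewed data, where the blend damps the kernel estimate's small-sample variability.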

Keywords: Box-Cox transform, density estimation, mode seeking, semiparametric method

Procedia PDF Downloads 286
29295 Expression of PGC-1 Alpha Isoforms in Response to Eccentric and Concentric Resistance Training in Healthy Subjects

Authors: Pejman Taghibeikzadehbadr

Abstract:

Background and Aim: PGC-1 alpha is a transcription factor that was first detected in brown adipose tissue. Since its discovery, PGC-1 alpha has been known to facilitate beneficial adaptations such as mitochondrial biogenesis and increased angiogenesis in skeletal muscle following aerobic exercise. The purpose of this study was therefore to investigate the expression of PGC-1 alpha isoforms in response to eccentric and concentric resistance training in healthy subjects. Materials and Methods: Ten healthy men were randomly divided into two groups (5 in the eccentric group, 5 in the concentric group). The isokinetic contraction protocols comprised eccentric and concentric knee extension at maximum effort and an angular velocity of 60 degrees per second, with the torques assigned to each subject chosen to match the workload between the two protocols. Contractions consisted of a maximum of 12 sets of 10 repetitions for the right leg, with a rest period of 30 seconds between sets. At the beginning and end of the study, biopsies of the vastus lateralis muscle were taken, at both distal and proximal sites of the lateral thigh. To evaluate the expression of the PGC1α-1 and PGC1α-4 genes, tissue from each group was analyzed using the real-time PCR technique. Data were analyzed using the dependent t-test and analysis of covariance, with SPSS 21 and Excel 2013. Results: Within-group changes in PGC1α-1 after one session of activity were not significant in either the eccentric (p = 0.168) or the concentric (p = 0.959) group, and the between-group comparison showed no difference (p = 0.681). Within-group changes in PGC1α-4 after one session were significant in both the eccentric (p = 0.012) and the concentric (p = 0.02) group, while the between-group comparison again showed no difference (p = 0.362). Conclusion: The absence of significant changes in PGC1α-1 suggests that the exercise load was insufficient to stimulate its increase, and the acute responses observed here are mixed enough that the question of contraction-type-specific adaptation needs further study.

Keywords: eccentric contraction, concentric contraction, PGC1α-1 and PGC1α-4, human subject

Procedia PDF Downloads 80
29294 Brief Review of the Self-Tightening, Left-Handed Thread

Authors: Robert S. Giachetti, Emanuele Grossi

Abstract:

Loosening of bolted joints in rotating machines can adversely affect their performance, cause mechanical damage, and lead to injuries. In this paper, two potential loosening phenomena in rotating applications are discussed. First, ‘precession,’ is governed by thread/nut contact forces, while the second is based on inertial effects of the fastened assembly. These mechanisms are reviewed within the context of historical usage of left-handed fasteners in rotating machines which appears absent in the literature and common machine design texts. Historically, to prevent loosening of wheel nuts, vehicle manufacturers have used right-handed and left-handed threads on different sides of the vehicle, but most modern vehicles have abandoned this custom and only use right-handed, tapered lug nuts on all sides of the vehicle. Other classical machines such as the bicycle continue to use different handed threads on each side while other machines such as, bench grinders, circular saws and brush cutters still use left-handed threads to fasten rotating components. Despite the continued use of left-handed fasteners, the rationale and analysis of left-handed threads to mitigate self-loosening of fasteners in rotating applications is not commonly, if at all, discussed in the literature or design textbooks. Without scientific literature to support these design selections, these implementations may be the result of experimental findings or aged institutional knowledge. Based on a review of rotating applications, historical documents and mechanical design references, a formal study of the paradoxical nature of left-handed threads in various applications is merited.

Keywords: rotating machinery, self-loosening fasteners, wheel fastening, vibration loosening

Procedia PDF Downloads 137
29293 Re-Evaluation of Field X Located in Northern Lake Albert Basin to Refine the Structural Interpretation

Authors: Calorine Twebaze, Jesca Balinga

Abstract:

Field X is located on the Eastern shores of L. Albert, Uganda, on the rift flank where the gross sedimentary fill is typically less than 2,000m. The field was discovered in 2006 and encountered about 20.4m of net pay across three (3) stratigraphic intervals within the discovery well. The field covers an area of 3 km2, with the structural configuration comprising a 3-way dip-closed hanging wall anticline that seals against the basement to the southeast along the bounding fault. Field X had been mapped on reprocessed 3D seismic data, which was originally acquired in 2007 and reprocessed in 2013. The seismic data quality is good across the field, and reprocessing work reduced the uncertainty in the location of the bounding fault and enhanced the lateral continuity of reservoir reflectors. The current study was a re-evaluation of Field X to refine fault interpretation and understand the structural uncertainties associated with the field. The seismic data, and three (3) wells datasets were used during the study. The evaluation followed standard workflows using Petrel software and structural attribute analysis. The process spanned from seismic- -well tie, structural interpretation, and structural uncertainty analysis. Analysis of three (3) well ties generated for the 3 wells provided a geophysical interpretation that was consistent with geological picks. The generated time-depth curves showed a general increase in velocity with burial depth. However, separation in curve trends observed below 1100m was mainly attributed to minimal lateral variation in velocity between the wells. In addition to Attribute analysis, three velocity modeling approaches were evaluated, including the Time-Depth Curve, Vo+ kZ, and Average Velocity Method. The generated models were calibrated at well locations using well tops to obtain the best velocity model for Field X. 
The time-depth method produced the most reliable depth surfaces, with good structural coherence between the TWT and depth maps and minimal error of 2 to 5 m at well locations. Both the NNE-SSW rift border fault and the minor faults in the existing interpretation were re-evaluated. In addition, the new interpretation delineated a previously unmapped E-W trending fault in the northern part of the field. This fault was interpreted at all stratigraphic levels, and thus propagates from the basement to the surface and is active today. It was also noted that the field as a whole is sparsely faulted, with faulting concentrated in its deeper part. The major structural uncertainties defined included: 1) the time horizons, due to reduced data quality especially in the deeper parts of the structure, for which an error equal to one-third of the reflection time thickness was assumed; 2) check-shot analysis, which showed varying velocities within the wells and thus varying depth values for each well; and 3) the very few average-velocity points available from the limited wells, which produced a pessimistic average velocity model.
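The V0 + kZ approach mentioned above assumes velocity increasing linearly with depth, V(z) = V0 + kZ, which integrates to a closed-form time-depth relation. A minimal sketch of that conversion (the v0 and k values are illustrative defaults, not those of Field X):

```python
import math

def depth_from_twt(twt_s, v0=1600.0, k=0.4):
    """Two-way travel time (s) to depth (m) under V(z) = v0 + k*z.
    Integrating dz/dt = v0 + k*z over one-way time gives
    z = (v0/k) * (exp(k*t) - 1)."""
    t_oneway = twt_s / 2.0
    return (v0 / k) * (math.exp(k * t_oneway) - 1.0)

def twt_from_depth(z_m, v0=1600.0, k=0.4):
    """Inverse relation: depth (m) back to two-way time (s)."""
    return 2.0 * math.log(1.0 + k * z_m / v0) / k
```

Calibration at well locations then amounts to choosing v0 and k that minimize the misfit between these depths and the well tops.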

Keywords: 3D seismic data interpretation, structural uncertainties, attribute analysis, velocity modelling approaches

Procedia PDF Downloads 61
29292 The Influence of Audio on Perceived Quality of Segmentation

Authors: Silvio Ricardo Rodrigues Sanches, Bianca Cogo Barbosa, Beatriz Regina Brum, Cléber Gimenez Corrêa

Abstract:

To evaluate the quality of a segmentation algorithm, researchers use subjective or objective metrics. Although subjective metrics are more accurate than objective ones, objective metrics do not require user feedback each time an algorithm is tested; they require subjective experiments only during their development. Such subjective experiments typically display to users videos (generated from frames with segmentation errors) that simulate the environment of an application domain, and this user feedback is crucial for defining the metric. In the subjective experiments used to develop some state-of-the-art metrics for testing segmentation algorithms, the videos displayed did not contain audio. Yet audio is an essential component in applications such as videoconferencing and augmented reality. If audio influences the user's perception, using only silent videos in subjective experiments can compromise the efficiency of an objective metric generated from those experiments. This work aims to identify whether audio influences the user's perception of segmentation quality in background substitution applications with audio. The proposed approach used a subjective method based on formal video quality assessment methods. The results showed that audio does influence the quality of segmentation perceived by a user.
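Formal subjective assessments of this kind typically summarize each test condition by a mean opinion score (MOS) with a confidence interval; a minimal sketch of that aggregation (the normal-approximation 95% interval is illustrative, not necessarily the authors' exact protocol):

```python
import math

def mos_with_ci(scores):
    """Mean opinion score and 95% confidence interval for one test
    condition (e.g. one video, with or without audio), given a list of
    per-user ratings on a fixed scale."""
    n = len(scores)
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / (n - 1)  # sample variance
    half = 1.96 * math.sqrt(var / n)  # normal approximation
    return mean, (mean - half, mean + half)
```

Comparing the intervals of the audio and no-audio conditions is one simple way to check whether the perceived quality differs.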

Keywords: background substitution, influence of audio, segmentation evaluation, segmentation quality

Procedia PDF Downloads 120
29291 Multivariate Analysis of Spectroscopic Data for Agriculture Applications

Authors: Asmaa M. Hussein, Amr Wassal, Ahmed Farouk Al-Sadek, A. F. Abd El-Rahman

Abstract:

In this study, a multivariate analysis of potato spectroscopic data is presented to detect the presence or absence of brown rot disease. Near-infrared (NIR) spectroscopy (1,350-2,500 nm) combined with multivariate analysis was used as a rapid, non-destructive technique for the detection of brown rot disease in potatoes. Spectral measurements were performed on 565 samples, chosen randomly at the infection site on the potato slice. In this study, 254 infected and 311 uninfected (brown rot-free) samples were analyzed using different advanced statistical analysis techniques. The discrimination performance of different multivariate analysis techniques, including classification, pre-processing, and dimension reduction, was compared. A random forest classifier applied with different pre-processing techniques to the raw spectra performed best, achieving a total classification accuracy of 98.7% in discriminating infected potatoes from controls.
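The abstract does not name the pre-processing techniques used; one common step for NIR spectra is standard normal variate (SNV) correction, shown here purely as an illustration of what spectral pre-processing looks like before classification:

```python
import math

def snv(spectrum):
    """Standard normal variate correction: centre and scale each
    spectrum by its own mean and standard deviation, removing additive
    baseline and multiplicative scatter effects."""
    n = len(spectrum)
    mean = sum(spectrum) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in spectrum) / (n - 1))
    return [(x - mean) / sd for x in spectrum]
```

Each corrected spectrum then has zero mean and unit variance, so a classifier sees shape differences rather than overall intensity shifts.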

Keywords: brown rot disease, NIR spectroscopy, potato, random forest

Procedia PDF Downloads 192
29290 A Machine Learning Pipeline for Real-Time Activity Detection on Low Computational Power Devices for Metaverse Applications

Authors: Amit Kumar, Amanpreet Chander, Ashish Sahani

Abstract:

This paper presents our recent work on real-time human activity detection based on the MediaPipe pipeline and machine learning algorithms. The proposed system can detect human activities including running, jumping, squatting, bending to the left or right, and standing still. This is a robust solution for developing yoga, dance, metaverse, and fitness applications that check the correctness of a pose without any additional monitor such as a personal trainer. MediaPipe offers an open-source, cross-platform solution that utilizes a two-step detector-tracker ML pipeline for live detection of key landmarks on the body, which can be used for motion data collection. The prediction of real-time poses uses a variety of machine learning techniques and different types of analysis. Without relying primarily on powerful desktop environments for inference, our method achieves real-time performance on the majority of contemporary mobile phones, desktops/laptops, in Python, or even on the web. Experimental results show that our method outperforms existing methods in terms of accuracy and real-time capability, achieving an accuracy of 99.92% on the testing datasets.
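Downstream of the landmark tracker, activity detection reduces to classifying geometric features of the (x, y) landmark coordinates. A toy sketch of one such feature, the knee joint angle, with a fixed-threshold rule standing in for the paper's learned classifiers (the threshold and rule are illustrative only):

```python
import math

def joint_angle(a, b, c):
    """Angle at b (degrees) formed by points a-b-c, each an (x, y)
    landmark such as MediaPipe's normalized pose coordinates."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

def classify_leg_pose(hip, knee, ankle, squat_threshold=120.0):
    """Call the pose 'squatting' when the knee angle is below the
    threshold, else 'standing'.  A real system feeds such features to
    an ML classifier rather than a hand-set rule."""
    return "squatting" if joint_angle(hip, knee, ankle) < squat_threshold else "standing"
```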

Keywords: human activity detection, MediaPipe, machine learning, metaverse applications

Procedia PDF Downloads 181
29289 An Investigation Enhancing E-Voting Application Performance

Authors: Aditya Verma

Abstract:

E-voting built on a blockchain provides a distributed system in which the data are present on each node in the network and are reliable and secure thanks to the ledger's immutability. This work compares various blockchain consensus algorithms previously used for e-voting applications on the basis of performance and node scalability, selects the optimal one, and improves on one such previous implementation by proposing solutions for the loopholes of the optimally performing consensus algorithm in our chosen application, e-voting.
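The immutability property the abstract relies on comes from hash chaining: each block commits to its predecessor's hash, so altering any recorded vote invalidates every later hash. A toy ledger illustrating just that property (not the paper's consensus implementation):

```python
import hashlib
import json

def make_block(prev_hash, votes):
    """Build a block whose hash covers both its votes and the previous
    block's hash, chaining the ledger together."""
    payload = json.dumps({"prev": prev_hash, "votes": votes}, sort_keys=True)
    return {"prev": prev_hash, "votes": votes,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def chain_is_valid(chain):
    """Recompute every hash and check each block links to its predecessor."""
    prev = "0" * 64
    for block in chain:
        payload = json.dumps({"prev": block["prev"], "votes": block["votes"]},
                             sort_keys=True)
        if block["prev"] != prev:
            return False
        if block["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev = block["hash"]
    return True
```

Consensus algorithms such as BFT variants then decide which of these blocks all honest nodes append next.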

Keywords: blockchain, parallel BFT, consensus algorithms, performance

Procedia PDF Downloads 169
29288 Using Electrical Impedance Tomography to Control a Robot

Authors: Shayan Rezvanigilkolaei, Shayesteh Vefaghnematollahi

Abstract:

Electrical impedance tomography (EIT) is a non-invasive imaging technique suitable for medical applications. This paper describes an electrical impedance tomography device able to navigate a robotic arm to manipulate a target object. The design of the device includes hardware and software sections to perform the medical imaging and to control the robotic arm. In the hardware section, an image is formed by 16 electrodes located around a container; this image is used to navigate a 3-DOF robotic arm to the exact location of the target object. The data set for the impedance imaging is obtained through repeated current injections and voltage measurements between all electrode pairs. After the calculations necessary to obtain the impedance, the information is transmitted to a computer. These data are fed into MATLAB, which is interfaced with EIDORS (Electrical Impedance Tomography Reconstruction Software) to reconstruct the image from the acquired data. In the next step, the coordinates of the center of the target object are calculated with the MATLAB Image Processing Toolbox (IPT). Finally, these coordinates are used to calculate the angles of each joint of the robotic arm, and the arm moves to the desired tissue on the user's command.
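The center-finding step amounts to taking the centroid of the high-conductivity region in the reconstructed image. A pure-Python stand-in for that IPT step, operating on a 2-D image given as nested lists (threshold and values are illustrative):

```python
def centroid(image, threshold=0.5):
    """Centroid (row, col) of pixels above threshold in a 2-D
    reconstructed conductivity image; returns None if no pixel
    exceeds the threshold."""
    total = r_sum = c_sum = 0.0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            if v > threshold:
                total += 1
                r_sum += r
                c_sum += c
    if total == 0:
        return None
    return (r_sum / total, c_sum / total)
```

The returned coordinates would then be mapped into the robot's workspace before solving for the three joint angles.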

Keywords: electrical impedance tomography, EIT, surgeon robot, image processing of electrical impedance tomography

Procedia PDF Downloads 278
29287 Analyzing Current Transformers Saturation Characteristics for Different Connected Burden Using LabVIEW Data Acquisition Tool

Authors: D. Subedi, S. Pradhan

Abstract:

Current transformers (CTs) are an integral part of the power system because they provide a proportional, safe amount of current for protection and measurement applications. However, when the power system experiences an abnormal situation leading to a huge current flow, this huge current is proportionally injected into the protection and metering circuits. Since protection and metering equipment is designed to withstand only a certain amount of current for a certain time, these high currents pose a risk to personnel and equipment. During such instances, therefore, the CT saturation characteristics have a large influence on the safety of both personnel and equipment, and on the reliability of the protection and metering system. This paper shows the effect of burden on the accuracy limiting factor (ALF) / instrument security factor (ISF) of current transformers, and also the change in the saturation characteristics of the CTs. The response of the CT to varying levels of overcurrent at different connected burdens is captured using the LabVIEW data acquisition software. Analysis is performed on the real-time data gathered with LabVIEW, and the variation of the CT saturation characteristics with changes in burden is discussed.
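The burden dependence of the ALF can be sketched with the common first-order relation: the effective ALF scales with the ratio of the rated to the actual total secondary loop resistance. This is a textbook approximation, not the paper's measured data, and the parameter values below are illustrative:

```python
def effective_alf(alf_rated, r_ct, burden_rated_va, burden_actual_va, i_sec=5.0):
    """Approximate effective accuracy limiting factor when the connected
    burden differs from the rated burden.  r_ct is the CT secondary
    winding resistance (ohms); burdens are in VA at rated secondary
    current i_sec (A), converted to ohms via R = VA / i_sec**2."""
    r_rated = burden_rated_va / i_sec ** 2
    r_actual = burden_actual_va / i_sec ** 2
    return alf_rated * (r_ct + r_rated) / (r_ct + r_actual)
```

The sketch captures the trend the paper investigates: a lighter connected burden raises the effective ALF, so the CT saturates at a higher overcurrent.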

Keywords: accuracy limiting factor, burden, current transformer, instrument security factor, saturation characteristics

Procedia PDF Downloads 419
29286 Facilitating Written Biology Assessment in Large-Enrollment Courses Using Machine Learning

Authors: Luanna B. Prevost, Kelli Carter, Margaurete Romero, Kirsti Martinez

Abstract:

Writing is an essential scientific practice, yet in several countries increasing university science class sizes limit the use of written assessments. Written assessments allow students to demonstrate their learning in their own words and permit faculty to evaluate students' understanding; however, the time and resources required to grade them prohibit their use in large-enrollment science courses. This study examined the use of machine learning algorithms to automatically analyze student writing and provide timely feedback to faculty about students' writing in biology. Written responses to questions about matter and energy transformation were collected from large-enrollment undergraduate introductory biology classrooms and analyzed using the LightSide text mining and classification software. Cohen's kappa was used to measure agreement between the LightSide models and human raters. Predictive models achieved agreement with human coding of 0.7 Cohen's kappa or greater. The models captured that, when writing about matter-energy transformation at the ecosystem level, students focused primarily on the concepts of heat loss, recycling of matter, and conservation of matter and energy. Models were also produced to capture writing about processes such as decomposition and biochemical cycling. The models created in this study can provide automatic feedback about students' understanding of these concepts to biology faculty who wish to use formative written assessments in larger-enrollment biology classes but lack the time or personnel for manual grading.
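The acceptance criterion above (kappa of 0.7 or greater against human raters) uses Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal implementation for two label sequences:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length label sequences:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement from each rater's marginal label frequencies
    p_exp = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1 - p_exp)
```

Here `rater_a` would hold the model's codes and `rater_b` the human rater's codes for the same set of student responses.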

Keywords: machine learning, written assessment, biology education, text mining

Procedia PDF Downloads 282
29285 Characterization of Tailings From Traditional Panning of Alluvial Gold Ore (A Case Study of Ilesa - Southwestern Nigeria Goldfield Tailings Dumps)

Authors: Olaniyi Awe, Adelana R. Adetunji, Abraham Adeleke

Abstract:

Field observation revealed extensive artisanal gold mining activity in the Ilesa gold belt of southwestern Nigeria. The likelihood of alluvial and lode gold deposits in commercial quantities around this location is very high, as many resident artisanal miners have been mining and trading alluvial gold ore in the area for decades and continue to do so. Their main process for recovering solid gold from the ore is gravity concentration using the conventional panning method. This method is simple to learn and recovers gold from alluvial ore quickly, but its effectiveness rests on rules of thumb and the artisanal miners' experience in handling the panning tool while processing the ore. Samples from five alluvial gold ore tailings dumps were collected and studied. The samples were subjected to particle size analysis and to mineralogical and elemental characterization using X-ray diffraction (XRD) and particle-induced X-ray emission (PIXE), respectively. The results showed that the tailings consist mainly of quartz in association with albite, plagioclase, mica, gold, calcite, and sulphide minerals. The elemental composition analysis revealed a gold concentration of 15 ppm in the -90 micron particle size fraction of one of the tailings dumps investigated. These results are significant. It is recommended that heaps of panning tailings be reprocessed using other gold recovery methods, such as shaking tables, flotation, and controlled cyanidation, which can efficiently recover the fine gold particles previously lost to the panning tailings. The tailings sites should also be well controlled and monitored so that these heavy minerals do not find their way into surrounding streams and rivers and cause health hazards.

Keywords: gold ore, panning, PIXE, tailings, XRD

Procedia PDF Downloads 92
29284 The Effect of Naringenin on the Apoptosis in T47D Cell Line of Breast Cancer

Authors: AliAkbar Hafezi, Jahanbakhsh Asadi, Majid Shahbazi, Alijan Tabarraei, Nader Mansour Samaei, Hamed Sheibak, Roghaye Gharaei

Abstract:

Background: Breast cancer is the most common cancer in women, and in most cancer cells apoptosis is blocked. Given the importance of apoptosis in cancer cell death and the role of different genes in its induction or inhibition, the search for compounds that can initiate apoptosis in tumor cells is regarded as a new strategy in anticancer drug discovery. The aim of this study was to investigate the effect of naringenin (NGEN) on apoptosis in the T47D breast cancer cell line. Materials and Methods: In this in vitro experimental study, the T47D breast cancer cell line was selected as the sample. The cells were treated with 20, 200, and 1000 µM naringenin for 24, 48, and 72 hours. The transcription levels of genes involved in apoptosis, including Bcl-2, Bax, Caspase 3, Caspase 8, Caspase 9, P53, PARP-1, and FAS, were then assessed using real-time PCR. The collected data were analyzed using IBM SPSS Statistics 24.0. Results: Naringenin at doses of 20, 200, and 1000 µM at all three time points (24, 48, and 72 hours) increased the expression of Caspase 3, P53, PARP-1, and FAS, reduced the expression of Bcl-2, and increased the Bax/Bcl-2 ratio; nevertheless, at none of the studied doses and times did it have a significant effect on the expression of Bax, Caspase 8, or Caspase 9. Conclusion: This study indicates that naringenin can reduce the growth of some cancer cells and cause their death through increased apoptosis and decreased expression of the anti-apoptotic Bcl-2 gene, resulting in the induction of apoptosis via both the intrinsic and extrinsic pathways.

Keywords: apoptosis, breast cancer, naringenin, T47D cell line

Procedia PDF Downloads 54
29283 Molecular Characterization of Polyploid Bamboo (Dendrocalamus hamiltonii) Using Microsatellite Markers

Authors: Rajendra K. Meena, Maneesh S. Bhandari, Santan Barthwal, Harish S. Ginwal

Abstract:

Microsatellite markers are among the most valuable tools for the characterization of plant genetic resources and for population genetic analysis. Since they are codominant, allelic markers, their use in polyploid species has remained doubtful. The current study shows that, despite losing the advantage of codominance, microsatellite markers are still a powerful tool for genotyping polyploid species because of the availability of a large number of reproducible alleles per locus. This was studied by genotyping 19 subpopulations of Dendrocalamus hamiltonii (a hexaploid bamboo species) with 17 polymorphic simple sequence repeat (SSR) primer pairs. Among these, ten primers gave the typical banding pattern of a microsatellite marker as expected in diploid species, but the remaining 7 gave an unusual pattern, i.e., more than two bands per locus per genotype. In such cases, genotyping data are generally analyzed by treating the markers as dominant. In the current study, the data were analyzed both ways: first, all 17 primers were scored as non-allelic (dominant) data and analyzed; later, the ten primers giving standard banding patterns were analyzed as allelic data, and the results were compared. The UPGMA clustering and genetic structure showed that the results obtained with both data sets are very similar, with slight variation, and therefore SSR markers can be used to characterize polyploid species by treating them as dominant markers. The study widens the scope of SSR marker applications and is useful to researchers dealing with polyploid species.
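Scoring a codominant SSR as dominant means recording each band only as present or absent per sample. A sketch of that conversion from per-sample allele calls to a 0/1 matrix ready for UPGMA clustering (the band sizes are hypothetical):

```python
def to_dominant_matrix(genotypes):
    """Convert per-sample SSR allele calls at one locus (each sample a
    set of observed band sizes) into a 0/1 presence/absence matrix,
    i.e. score the codominant marker as dominant.  Returns the matrix
    and the ordered list of alleles (columns)."""
    all_alleles = sorted({a for sample in genotypes for a in sample})
    matrix = [[1 if a in sample else 0 for a in all_alleles]
              for sample in genotypes]
    return matrix, all_alleles
```

This is why primers yielding more than two bands per genotype remain usable: every reproducible band simply becomes one binary character.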

Keywords: microsatellite markers, Dendrocalamus hamiltonii, dominant and codominant, polyploids

Procedia PDF Downloads 146
29282 Machine Learning and Internet of Thing for Smart-Hydrology of the Mantaro River Basin

Authors: Julio Jesus Salazar, Julio Jesus De Lama

Abstract:

The fundamental objective of hydrological studies applied to engineering is to determine the statistically consistent volumes or water flows that, in each case, allow us to size or design the elements or structures needed to effectively manage and develop a river basin. To determine these values, there are several ways of working within the framework of traditional hydrology: (1) study each of the factors that influence the hydrological cycle, (2) study the historical behavior of the hydrology of the area, (3) study the historical behavior of hydrologically similar zones, and (4) other studies (rain simulators or experimental basins). This range of studies in a given basin is varied and complex, and collecting the data in real time is difficult. In this complex setting, the study of the variables can only be mastered by collecting and transmitting data to decision centers through the Internet of things and artificial intelligence. Thus, this research work implemented the learning project for the sub-basin of the Shullcas river in the Andean basin of the Mantaro river in Peru. The sensor firmware to collect and communicate hydrological parameter data was programmed and tested in similar basins in the European Union. The machine learning application was programmed to choose the algorithms that best determine the rainfall-runoff relationship captured in the different polygons of the sub-basin. Tests were carried out in the mountains of Europe and in the sub-basins of the Shullcas river (Huancayo) and the Yauli river (Jauja), at altitudes close to 5,000 m a.s.l., leading to the following conclusions: to guarantee correct communication, the distance between devices should not exceed 15 km. 
To minimize the energy consumption of the devices and avoid collisions between packets, distances should lie between 5 and 10 km; in this way the transmission power can be reduced and a higher bit rate used. If the communication elements of the devices in the network (Internet of things) installed in the basin do not have good line of sight between them, the distance should be reduced to the range of 1-3 km. The energy efficiency of the Atmel microcontrollers present in Arduino boards is not adequate to meet the system's autonomy requirements. To increase the autonomy of the system, it is recommended to use low-consumption parts, such as ultra-low-power ARM Cortex-M microcontrollers, together with high-efficiency DC-DC converters. The machine learning system has begun learning the Shullcas system to generate the best hydrology of the sub-basin. This will keep improving as the machine learning models and the data entering the big-data store are reconciled every second, providing each application of the complex system with the best estimates of the determined flows.
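The rainfall-runoff relationship the ML stage selects among algorithms for can be baselined with a simple ordinary least-squares fit per polygon; a sketch under that assumption (a linear model is only a baseline, and the data values here are hypothetical):

```python
def fit_runoff_model(rainfall, runoff):
    """Ordinary least-squares fit of runoff ~ a * rainfall + b for one
    polygon of the sub-basin, from paired observations."""
    n = len(rainfall)
    mx = sum(rainfall) / n
    my = sum(runoff) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(rainfall, runoff))
    sxx = sum((x - mx) ** 2 for x in rainfall)
    a = sxy / sxx          # slope: runoff response per unit rainfall
    b = my - a * mx        # intercept: baseflow-like offset
    return a, b
```

A learning system would compare such a baseline against nonlinear models on the incoming sensor stream and keep whichever predicts held-out flows best.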

Keywords: hydrology, internet of things, machine learning, river basin

Procedia PDF Downloads 163
29281 Factors That Contribute to Noise Induced Hearing Loss Amongst Employees at the Platinum Mine in Limpopo Province, South Africa

Authors: Livhuwani Muthelo, R. N. Malema, T. M. Mothiba

Abstract:

Long-term exposure to excessive noise in the mining industry increases the risk of noise-induced hearing loss (NIHL), with consequences for employees' health, productivity, and overall quality of life. Objective: The objective of this study was to investigate the factors that contribute to noise-induced hearing loss amongst employees at the platinum mine in Limpopo Province, South Africa. Study method: A qualitative, phenomenological, exploratory, descriptive, contextual design was applied in order to explore and describe the contributory factors. Purposive non-probability sampling was used to select 10 male employees who were diagnosed with NIHL in 2014 across four mine shafts, and 10 managers involved in a hearing conservation programme. The data were collected using semi-structured one-on-one interviews and analyzed qualitatively following Tesch's approach. Results: The following themes emerged: experiences and challenges faced by employees in the work environment, hearing protective device factors, and management and leadership factors. Hearing loss was caused by only partial application of the guidelines, policies, and procedures of the Department of Minerals and Energy. Conclusion: The study results indicate that although guidelines, policies, and procedures are available, failure to implement even one element will affect the development and maintenance of employees' hearing. It is recommended that mine management apply the guidelines, policies, and procedures and promptly repair broken hearing protective devices.

Keywords: employees, factors, noise induced hearing loss, noise exposure

Procedia PDF Downloads 129