Search results for: spectroscopy data analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 42285

40875 Enhanced Optical and Electrical Properties of P-Type AgBiS₂ Energy Harvesting Materials as an Absorber of Solar Cell by Copper Doping

Authors: Yasaman Tabari-Saadi, Kaiwen Sun, Jialiang Huang, Martin Green, Xiaojing Hao

Abstract:

The optical and electrical properties of the p-type AgBiS₂ absorber material have been improved by copper doping on silver sites. X-ray diffraction (XRD) and X-ray photoelectron spectroscopy (XPS) analyses suggest that complete solid solutions of Ag₁₋ₓCuₓBiS₂ thin films have been formed. The carrier concentration of the pure AgBiS₂ thin film deposited by the chemical process is 4.5×10¹⁴ cm⁻³, and copper doping increases the carrier concentration while the material remains a p-type semiconductor. Copper doping directly changed the absorption coefficient and increased the optical band gap (~1.5 eV), which makes the material a promising absorber for thin-film solar cell applications.

Keywords: copper doped, AgBiS₂, thin-film solar cell, carrier concentration, p-type semiconductor

Procedia PDF Downloads 112
40874 Uncertainty and Optimization Analysis Using PETREL RE

Authors: Ankur Sachan

Abstract:

The ability to make quick yet intelligent and value-added decisions to develop new fields has always been of great significance. In situations where the capital expenses and subsurface risk are high, carefully analyzing the inherent uncertainties in the reservoir and how they impact the predicted hydrocarbon accumulation and production becomes a daunting task. The problem is compounded in offshore environments, especially in the presence of heavy oils and disconnected sands, where the margin for error is small. Uncertainty refers to the degree to which the data set may be in error or stray from the predicted values. Understanding and quantifying the uncertainties in the reservoir model is important when estimating the reserves. Uncertainty parameters can be geophysical, geological, petrophysical, etc., and identifying these parameters is necessary to carry out the uncertainty analysis. With so many uncertainties working at different scales, it becomes essential to have a consistent and efficient way of incorporating them into our analysis. Ranking the uncertainties based on their impact on reserves helps to prioritize and guide future data gathering and uncertainty reduction efforts. Assigning probabilistic ranges to key uncertainties also enables the computation of probabilistic reserves. With this in mind, this paper, with the help of the uncertainty and optimization workflow in Petrel RE, shows how the most influential uncertainties can be determined efficiently and how much impact they have on the reservoir model, thus helping to determine a cost-effective and accurate model of the reservoir.
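
The general idea of ranking uncertainties and computing probabilistic reserves can be illustrated outside Petrel RE. The following is a minimal Monte Carlo sketch, assuming a simple volumetric reserve formula and hypothetical triangular ranges for the uncertainty parameters; it is not the paper's workflow, only the underlying principle.

```python
# Monte Carlo sketch: sample hypothetical reservoir parameters, compute a
# volumetric reserve estimate, then rank the parameters by their impact.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical uncertainty ranges (triangular: low, mode, high).
grv = rng.triangular(80e6, 100e6, 130e6, n)   # gross rock volume, m^3
ntg = rng.triangular(0.50, 0.70, 0.85, n)     # net-to-gross ratio
phi = rng.triangular(0.12, 0.18, 0.24, n)     # porosity
sw  = rng.triangular(0.20, 0.30, 0.45, n)     # water saturation
bo  = 1.2                                     # formation volume factor (fixed)

stoiip = grv * ntg * phi * (1.0 - sw) / bo    # stock-tank oil initially in place

# Probabilistic reserves (P90 = 10th percentile, the conservative estimate)
# and a simple tornado-style ranking by absolute correlation with STOIIP.
p90, p50, p10 = np.percentile(stoiip, [10, 50, 90])
impact = {name: abs(np.corrcoef(v, stoiip)[0, 1])
          for name, v in {"GRV": grv, "NTG": ntg, "PHI": phi, "SW": sw}.items()}
print(f"P90={p90:.3e}  P50={p50:.3e}  P10={p10:.3e}")
print(sorted(impact.items(), key=lambda kv: -kv[1]))
```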

Keywords: uncertainty, reservoir model, parameters, optimization analysis

Procedia PDF Downloads 620
40873 Comparative Analysis of the Computer Methods' Usage for Calculation of Hydrocarbon Reserves in the Baltic Sea

Authors: Pavel Shcherban, Vlad Golovanov

Abstract:

Nowadays, the depletion of hydrocarbon deposits on the land of the Kaliningrad region leads to active geological exploration and development of oil and natural gas reserves in the southeastern part of the Baltic Sea. LLC 'Lukoil-Kaliningradmorneft' is implementing a comprehensive program for the development of the region's shelf in 2014-2023. Due to the heterogeneity of reservoir rocks in various open fields, as well as ambiguous conclusions on the contours of deposits, additional geological prospecting and refinement of the recoverable oil reserves are carried out. The key element is the use of an effective technique of computer stock modeling at the first stage of processing of the received data. The following step uses this information for cluster analysis, which makes it possible to optimize the field development approaches. The article analyzes the effectiveness of various methods for reserves' calculation and computer modelling of offshore hydrocarbon fields. Cluster analysis makes it possible to measure the influence of the obtained data on the development of a technical and economic model for mining deposits. The relationship between the accuracy of the calculation of recoverable reserves and the need to modernize the existing mining infrastructure, as well as to optimize the scheme of opening and developing oil deposits, is examined.

Keywords: cluster analysis, computer modelling of deposits, correction of the feasibility study, offshore hydrocarbon fields

Procedia PDF Downloads 160
40872 Geospatial Data Complexity in Electronic Airport Layout Plan

Authors: Shyam Parhi

Abstract:

The Airports GIS program collects airport data, validates and verifies it, and stores it in a specific database. Airports GIS allows authorized users to submit changes to airport data. The verified data are used to develop several engineering applications. One of these applications is the electronic Airport Layout Plan (eALP), whose primary aim is to move from a paper to a digital form of the ALP. The first phase of development of the eALP was completed recently, and it was tested for a few pilot-program airports across different regions. We conducted a gap analysis and noticed that a lot of development work is needed to fine-tune at least six mandatory sheets of the eALP. It is important to note that a significant amount of programming is needed to move from out-of-the-box ArcGIS to a much more customized ArcGIS, which will be discussed. The ArcGIS viewer's capability to display essential features like runways, taxiways, or the perpendicular distance between them will be discussed. An enterprise-level workflow which incorporates the coordination process among different lines of business will be highlighted.

Keywords: geospatial data, geology, geographic information systems, aviation

Procedia PDF Downloads 407
40871 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets

Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe

Abstract:

Data are the primary asset of biomedical researchers, and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools to analyze such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g., supercomputers, GPU clusters), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain. Much attention has been given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for research subjects that reside in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research, leveraging existing high-performance computing resources and analysis techniques currently available or being developed. It builds these into The Ark, an open-source web-based system designed to manage medical data. SPARK provides a next-generation biomedical data management solution based upon a novel micro-service architecture and Big Data technologies. The system serves to demonstrate the applicability of micro-service architectures for the development of high-performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as inserts (e.g., importing a GWAS dataset) and the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating non-relational NoSQL databases, whose development has been driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets and enabling cutting-edge analysis approaches that have previously been out of reach for many medical researchers.

Keywords: biomedical research, genomics, information systems, software

Procedia PDF Downloads 261
40870 The Establishment and Application of TRACE/FRAPTRAN Model for Kuosheng Nuclear Power Plant

Authors: S. W. Chen, W. K. Lin, J. R. Wang, C. Shih, H. T. Lin, H. C. Chang, W. Y. Li

Abstract:

Kuosheng nuclear power plant (NPP) is a BWR/6 type NPP located on the northern coast of Taiwan. First, a Kuosheng NPP TRACE model was developed in this research, and startup test data were used to assess the system response of the model. Second, an over-pressurization transient analysis of the Kuosheng NPP TRACE model was performed. In addition, to confirm the mechanical properties and integrity of the fuel rods, a FRAPTRAN analysis was also performed in this study.

Keywords: TRACE, safety analysis, BWR/6, FRAPTRAN

Procedia PDF Downloads 553
40869 Infra Red Laser Induced Ablation of Graphene Based Polymer Nanocomposites

Authors: Jadranka Blazhevska Gilev

Abstract:

IR laser-induced ablation of poly(butyl acrylate-methyl methacrylate/hydroxyethyl methacrylate)/reduced graphene oxide (p(BA/MMA/HEMA)/rGO) was examined with 0.5, 0.75 and 1 wt% reduced graphene oxide content relative to the polymer. The irradiation was performed with a TEA (transversely excited atmosphere) CO₂ laser using an incident fluence of 15-20 J/cm² and a repetition frequency of 1 Hz, in an evacuated (10⁻³ Pa) Pyrex spherical vessel. Thin deposited nanocomposite films with a large specific area were obtained using different substrates. The properties of the films deposited on these substrates were evaluated by thermogravimetric analysis (TGA), Fourier transform infrared (FTIR) and Raman spectroscopy, and SEM microscopy. A homogeneous distribution of graphene sheets was observed in the SEM images, making the polymer/rGO deposit an ideal candidate for SERS applications. SERS measurements were performed using Rhodamine 6G as a probe molecule on the Ag/p(BA/MMA/HEMA)/rGO substrate.

Keywords: laser ablation, reduced graphene oxide, polymer/rGO nanocomposites, thin deposited film

Procedia PDF Downloads 187
40868 Analytical Modelling of Surface Roughness during Compacted Graphite Iron Milling Using Ceramic Inserts

Authors: Ş. Karabulut, A. Güllü, A. Güldaş, R. Gürbüz

Abstract:

This study investigates the effects of the lead angle and chip thickness variation on surface roughness during the machining of compacted graphite iron using ceramic cutting tools under dry cutting conditions. Analytical models were developed for predicting the surface roughness values of the specimens after the face milling process. Experimental data was collected and imported to the artificial neural network model. A multilayer perceptron model was used with the back propagation algorithm employing the input parameters of lead angle, cutting speed and feed rate in connection with chip thickness. Furthermore, analysis of variance was employed to determine the effects of the cutting parameters on surface roughness. Artificial neural network and regression analysis were used to predict surface roughness. The values thus predicted were compared with the collected experimental data, and the corresponding percentage error was computed. Analysis results revealed that the lead angle is the dominant factor affecting surface roughness. Experimental results indicated an improvement in the surface roughness value with decreasing lead angle value from 88° to 45°.
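
The modeling approach described above can be sketched in a few lines. The following is a minimal example, assuming scikit-learn's multilayer perceptron and synthetic stand-in milling data (the cutting ranges and the roughness relation below are hypothetical, not the paper's measurements).

```python
# ANN surface-roughness sketch: MLP regressor mapping (lead angle, cutting
# speed, feed rate) to Ra, with percentage error on a held-out split.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200
lead_angle = rng.uniform(45, 88, n)        # degrees
cutting_speed = rng.uniform(150, 400, n)   # m/min
feed_rate = rng.uniform(0.05, 0.25, n)     # mm/tooth

# Hypothetical roughness: grows with lead angle and feed, plus noise.
ra = (0.01 * lead_angle + 6.0 * feed_rate - 0.001 * cutting_speed
      + rng.normal(0, 0.05, n))

X = np.column_stack([lead_angle, cutting_speed, feed_rate])
X_tr, X_te, y_tr, y_te = train_test_split(X, ra, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16),
                                   max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print(f"mean % error = {100 * np.mean(np.abs(pred - y_te) / y_te):.1f}%")
```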

Keywords: CGI, milling, surface roughness, ANN, regression, modeling, analysis

Procedia PDF Downloads 441
40867 Application of Association Rule Using Apriori Algorithm for Analysis of Industrial Accidents in 2013-2014 in Indonesia

Authors: Triano Nurhikmat

Abstract:

Along with the progress of science and technology, the development of the industrialized world in Indonesia has taken place very rapidly. This has accelerated the industrialization of Indonesian society, with diverse companies and workplaces being established. The development of industry is tied to worker activity, and these work activities carry the possibility of an accident affecting either the workers or a construction project. Causes of industrial accidents include electrical faults, faulty work procedures, and technical errors. The association rule method is one of the main techniques in data mining and is the most common form used in finding patterns in data collections. This research investigates the associations among the incidences of industrial accidents. Using association rule analysis, patterns were obtained from two-item large itemsets combining the factors of industrial accidents: for accidents in West Jakarta caused by electrical damage, the rule has support = 0.2 and confidence = 1, and the reverse pattern has support = 0.2 and confidence = 0.75.
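
A minimal sketch of this kind of Apriori mining is shown below, using the mlxtend library (one common implementation choice, not necessarily the authors') on hypothetical one-hot accident records; the factor names and rows are illustrative only.

```python
# Mine accident-factor association rules with Apriori on one-hot records.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Each row is one accident record; columns are hypothetical factors/locations.
records = pd.DataFrame([
    {"electrical_damage": True,  "west_jakarta": True,  "procedure_error": False},
    {"electrical_damage": True,  "west_jakarta": True,  "procedure_error": True},
    {"electrical_damage": False, "west_jakarta": True,  "procedure_error": False},
    {"electrical_damage": False, "west_jakarta": False, "procedure_error": True},
    {"electrical_damage": True,  "west_jakarta": True,  "procedure_error": False},
])

# Frequent itemsets with support >= 0.2, then rules filtered by confidence.
itemsets = apriori(records, min_support=0.2, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.75)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```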

Keywords: association rule, data mining, industrial accidents, rules

Procedia PDF Downloads 286
40866 Zinc (II) Complexes of Nitrogen, Oxygen and Sulfur Coordination Modes: Synthesis, Spectral Studies and Antibacterial Activities

Authors: Ayodele Odularu, Peter Ajibade, Albert Bolhuis

Abstract:

This study aimed at assessing the antibacterial activities of four zinc(II) complexes. Zinc(II) complexes with nitrogen, oxygen and sulfur coordination modes were synthesized using a direct substitution reaction. The characterization techniques involved physicochemical properties (molar conductivity) and spectroscopic techniques. The molar conductivity indicated the non-electrolytic nature of the zinc(II) complexes. The spectral studies of the zinc(II) complexes were done using electronic spectroscopy (UV-Vis) and Fourier transform infrared spectroscopy (FT-IR). Spectral data from the spectroscopic studies confirmed the coordination of the mixed ligands with the zinc(II) ion. The antibacterial activities of the zinc(II) complexes all supported Overtone's concept and Tweedy's chelation theory for the bacterial strains S. aureus MRSA252 and E. coli MC4100, because the zones of inhibition were greater than those of the corresponding ligands. In summary, the zinc(II) complexes ZEPY, ZE1PH, ZE1PY and ZE135PY all show potential for antibacterial activity.

Keywords: antibacterial activities, spectral studies, syntheses, zinc(II) complexes

Procedia PDF Downloads 267
40865 Synthesis, Spectral Characterization and Photocatalytic Applications of Graphene Oxide Nanocomposite with Copper Doped Zinc Oxide

Authors: Humaira Khan, Mohsin Javed, Sammia Shahid

Abstract:

The reinforced photocatalytic activity of graphene oxide (GO) composites with ZnO nanoparticles and copper-doped ZnO nanoparticles was studied by synthesizing ZnO and copper-doped ZnO nanoparticles via a co-precipitation method. Zinc acetate and copper acetate were used as precursors, whereas graphene oxide was prepared from pre-oxidized graphite in the presence of H₂O₂. The supernatant was collected carefully and showed high-quality single-layer GO, characterized by FTIR (Fourier transform infrared spectroscopy), TEM (transmission electron microscopy), SEM (scanning electron microscopy), XRD (X-ray diffraction analysis) and EDS (energy dispersive spectrometry). The degradation of methylene blue as a standard pollutant under UV-visible irradiation was used to assess the photocatalytic activity of the dopants. It can be concluded that the narrowing of the optical band gap caused by compositing the Cu-doped nanoparticles with GO enhances the photocatalytic activity.

Keywords: degradation, graphene oxide, photocatalysis, ZnO nanoparticles and copper-doped ZnO nanoparticles

Procedia PDF Downloads 203
40864 A Fluorescent Polymeric Boron Sensor

Authors: Soner Cubuk, Mirgul Kosif, M. Vezir Kahraman, Ece Kok Yetimoglu

Abstract:

Boron is an essential trace element for the completion of the life cycle of organisms. Suitable methods for the determination of boron have been proposed, including acid-base titrimetry, inductively coupled plasma emission spectroscopy, flame atomic absorption spectrometry, and spectrophotometry. However, these methods have disadvantages such as long analysis times, the requirement for corrosive media such as concentrated sulfuric acid, and multi-step, time-consuming sample preparation procedures. In this study, a selective and reusable fluorescent sensor for boron based on glycosyloxyethyl methacrylate was prepared by photopolymerization. The response characteristics, such as response time, pH, linear range, and limit of detection, were systematically investigated. The excitation/emission maxima of the membrane were at 378/423 nm, respectively. The approximate response time was measured as 50 s. In addition, the sensor had a very low limit of detection of 0.3 ppb. The sensor was successfully used for the determination of boron in water samples with satisfactory results.

Keywords: boron, fluorescence, photopolymerization, polymeric sensor

Procedia PDF Downloads 276
40863 Air Quality Analysis Using Machine Learning Models Under Python Environment

Authors: Salahaeddine Sbai

Abstract:

Air quality analysis using machine learning models is a method employed to assess and predict air pollution levels. This approach leverages the capabilities of machine learning algorithms to analyze vast amounts of air quality data and extract valuable insights. By training these models on historical air quality data, they can learn patterns and relationships between various factors such as weather conditions, pollutant emissions, and geographical features. The trained models can then be used to predict air quality levels in real-time or forecast future pollution levels. This application of machine learning in air quality analysis enables policymakers, environmental agencies, and the general public to make informed decisions regarding health, environmental impact, and mitigation strategies. By understanding the factors influencing air quality, interventions can be implemented to reduce pollution levels, mitigate health risks, and enhance overall air quality management. Climate change is having significant impacts on Morocco, affecting various aspects of the country's environment, economy, and society. In this study, we use several machine learning models in a Python environment to predict and analyze air quality change over northern Morocco and to evaluate the impact of climate change on agriculture.

Keywords: air quality, machine learning models, pollution, pollutant emissions

Procedia PDF Downloads 84
40862 Effect of Li-excess on Electrochemical Performance of Ni-rich LiNi₀.₉Co₀.₀₉Mn₀.₀₉O₂ Cathode Materials for Li-ion Batteries

Authors: Eyob Belew Abebe

Abstract:

Nickel-rich layered oxide cathode materials with a Ni content of ≥ 90% have great potential for use in next-generation lithium-ion batteries (LIBs) due to their high energy densities and relatively low cost. They suffer, however, from poor cycling performance and rate capability, significantly hampering their widespread applicability. In this study, we synthesized a Ni-rich precursor through a co-precipitation method and added different amounts of excess Li to the precursors using a solid-state method to obtain sintered Li₁₊ₓ(Ni₀.₉Co₀.₀₅Mn₀.₀₅)₁₋ₓO₂ (denoted L1+x-NCM; x = 0.00, 0.02, 0.04, 0.06, and 0.08) transition metal (TM) oxide cathode materials. The L1+x-NCM cathode with a Li excess of 4% exhibited a discharge capacity of ca. 216.17 mAh g⁻¹ at 2.7-4.3 V and 0.1C, and retained 95.7% of its initial discharge capacity (ca. 181.39 mAh g⁻¹) after 100 cycles of 1C charge/discharge, the best performance compared with the stoichiometric material (x = 0, Li:TM = 1:1). Furthermore, a high rate capability of ca. 162.92 mAh g⁻¹ at 10C showed that the 4% Li excess optimized the electrochemical performance relative to the other Li-excess samples. Ex/in-situ X-ray diffraction, scanning electron microscopy, and X-ray photoelectron spectroscopy revealed that the 4% Li excess in the Ni-rich NCM90 cathode material (i) decreased the Li⁺/Ni²⁺ disorder by increasing the content of Ni³⁺ in the TM slab, (ii) increased the crystallinity, and (iii) accelerated Li⁺ ion transport by widening the Li slab. Furthermore, electrochemical impedance spectroscopy and cyclic voltammetry confirmed that the appropriate Li excess lowered the electrochemical impedance and improved the reversibility of the electrochemical reaction. Therefore, our results reveal that NCM90 cathode materials featuring an optimal Li excess are potential candidates for use in next-generation Li-ion batteries.

Keywords: LiNi₀.₉Co₀.₀₉Mn₀.₀₉O₂, li-excess, cation mixing, structure change, cycle stability, electrochemical properties

Procedia PDF Downloads 160
40861 Trend Analysis of Africa’s Entrepreneurial Framework Conditions

Authors: Sheng-Hung Chen, Grace Mmametena Mahlangu, Hui-Cheng Wang

Abstract:

This study aims to explore the trends of the Entrepreneurial Framework Conditions (EFCs) in the five African regions. The Global Entrepreneurship Monitor (GEM) is the primary source of data. The data were organized into a panel (2000-2021) and obtained from the National Expert Survey (NES) databases as harmonized by the GEM. The methodology used is descriptive, relying mainly on charts and tables, in line with the approach used by the GEM. The NES is administered to experts in each country, and the GEM collects entrepreneurship data specific to each country, providing information about entrepreneurial ecosystems and their impact on entrepreneurship. The secondary source is the literature review. This study focuses on the following GEM indicators: Financing for Entrepreneurs, Government Support and Policies, Taxes and Bureaucracy, Government Programs, Basic School Entrepreneurial Education and Training, Post-School Entrepreneurial Education and Training, R&D Transfer, Commercial and Professional Infrastructure, Internal Market Dynamics, Internal Market Openness, Physical and Service Infrastructure, and Cultural and Social Norms, based on the GEM Report 2020/21. The limitation of the study is the lack of updated data from some countries: countries have to fund their own regional studies, and African countries do not regularly participate due to a lack of resources.

Keywords: trend analysis, entrepreneurial framework conditions (EFCs), African region, government programs

Procedia PDF Downloads 56
40860 Analysis of Airborne Data Using Range Migration Algorithm for the Spotlight Mode of Synthetic Aperture Radar

Authors: Peter Joseph Basil Morris, Chhabi Nigam, S. Ramakrishnan, P. Radhakrishna

Abstract:

This paper brings out the analysis of airborne Synthetic Aperture Radar (SAR) data using the Range Migration Algorithm (RMA) for the spotlight mode of operation. Unlike the polar format algorithm (PFA), space-variant defocusing and geometric distortion effects are mitigated in RMA since it does not assume that the illuminating wavefronts are planar. This facilitates the use of RMA for imaging scenarios involving severe differential range curvatures, enabling the imaging of larger scenes at fine resolution and at shorter ranges with low center frequencies. The RMA algorithm for the spotlight mode of SAR is analyzed in this paper using airborne data. Pre-processing operations, viz., range deskew and motion compensation to a line, are performed on the raw data before being fed to the RMA component. The various stages of the RMA, viz., 2D matched filtering, along-track Fourier transform and Stolt interpolation, are analyzed to find the performance limits and the dependence of the resolution of the final image on the imaging geometry. The ability of RMA to compensate for severe differential range curvatures in the two-dimensional spatial frequency domain is also illustrated in this paper.

Keywords: range migration algorithm, spotlight SAR, synthetic aperture radar, matched filtering, Stolt interpolation

Procedia PDF Downloads 233
40859 Electron Density Discrepancy Analysis of Energy Metabolism Coenzymes

Authors: Alan Luo, Hunter N. B. Moseley

Abstract:

Many macromolecular structure entries in the Protein Data Bank (PDB) have a range of regional (localized) quality issues, whether derived from X-ray crystallography, nuclear magnetic resonance (NMR) spectroscopy, or other experimental approaches. However, most PDB entries are judged by global quality metrics like R-factor, R-free, and resolution for X-ray crystallography, or backbone phi-psi distribution statistics and average restraint violations for NMR. Regional quality is often ignored when PDB entries are re-used for a variety of structurally based analyses. The binding of ligands, especially ligands involved in energy metabolism, is of particular interest in many structurally focused protein studies. Using a regional quality metric that provides chemically interpretable information from electron density maps, a significant number of outliers in regional structural quality were detected across X-ray crystallographic PDB entries for proteins bound to biochemically critical ligands. In this study, a series of analyses was performed to evaluate both specific and general potential factors that could promote these outliers. In particular, these potential factors were the minimum distance to a metal ion, the minimum distance to a crystal contact, and the isotropic atomic b-factor. To evaluate these potential factors, Fisher's exact tests were performed, using regional quality criteria of outlier (top 1%, 2.5%, 5%, or 10%) versus non-outlier compared to a potential factor metric above versus below a certain cutoff. The results revealed a consistent general effect from region-specific normalized b-factors but no specific effect from metal ion contact distances and only a very weak effect from crystal contact distance as compared to the b-factor results. These findings indicate that no single specific potential factor explains a majority of the outlier ligand-bound regions, implying that human error is likely as important as these other factors. Thus, all factors, including human error, should be considered when regions of low structural quality are detected. Also, the downstream re-use of protein structures for studying ligand-bound conformations should screen the regional quality of the binding sites. Doing so prevents misinterpretation due to the presence of structural uncertainty or flaws in regions of interest.
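
The contingency-table test described above is straightforward to reproduce. The following is a minimal sketch, assuming scipy and hypothetical per-region data (the quality metric, b-factor values, and cutoffs below are stand-ins, not the study's data).

```python
# Fisher's exact test: outlier status vs. factor above/below a cutoff.
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(0)
n = 500  # hypothetical number of ligand-bound regions

# Hypothetical regional quality metric and normalized b-factor per region,
# constructed with a weak association for illustration.
quality = rng.normal(size=n)
b_factor = rng.normal(size=n) + 0.5 * quality

is_outlier = quality > np.percentile(quality, 95)   # top 5% as outliers
high_b = b_factor > np.percentile(b_factor, 75)     # factor above cutoff

# 2x2 contingency table: rows = outlier/non-outlier, cols = high/low factor.
table = np.array([
    [np.sum(is_outlier & high_b),  np.sum(is_outlier & ~high_b)],
    [np.sum(~is_outlier & high_b), np.sum(~is_outlier & ~high_b)],
])
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(table, odds_ratio, p_value)
```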

Keywords: biomacromolecular structure, coenzyme, electron density discrepancy analysis, x-ray crystallography

Procedia PDF Downloads 121
40858 Identification of CLV for Online Shoppers Using RFM Matrix: A Case Based on Features of B2C Architecture

Authors: Riktesh Srivastava

Abstract:

Online shopping has undergone an astonishing evolution in the last few years, and it is now apparent that B2C architecture is becoming a progressively important channel even for traditional brick-and-mortar traders. In this competition, knowing customers and predicting their behavior are extremely important. More importantly, when any customer logs onto the B2C architecture, the traces of their buying patterns can be stored and used for future predictions. Such a prediction is called Customer Lifetime Value (CLV). Earlier, Net Present Value was used for this purpose; however, it ignores two important aspects of B2C architecture, market risks and the large volume of customer data. Here, RFM (Recency, Frequency and Monetary value) is used to estimate the CLV, and as the term exemplifies, market risks are well sheltered. Big data analysis is also covered by RFM, which enables real exploration of the big data and leads to better estimation of future cash flow from customers. In the present paper, 6 factors (collected from varied sources) are used to determine what attracts customers to the B2C architecture. For these 6 factors, RFM is computed for 3 years (2013, 2014 and 2015). CLV and revenue are the two parameters defined using RFM analysis, which gives a clear picture of the future predictions.
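
The basic RFM computation can be sketched in a few lines of pandas. The following is a minimal example on a hypothetical transaction log; the column names, scoring scheme (rank-based 1-5 scores), and composite RFM code are illustrative assumptions, not the paper's exact procedure.

```python
# RFM scoring sketch: aggregate transactions per customer, then score.
import numpy as np
import pandas as pd

orders = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3],
    "order_date": pd.to_datetime(
        ["2015-01-05", "2015-06-20", "2015-03-14",
         "2015-02-01", "2015-05-11", "2015-06-30"]),
    "amount": [120.0, 80.0, 35.0, 60.0, 90.0, 150.0],
})
snapshot = orders["order_date"].max() + pd.Timedelta(days=1)

rfm = orders.groupby("customer_id").agg(
    recency=("order_date", lambda d: (snapshot - d.max()).days),
    frequency=("order_date", "count"),
    monetary=("amount", "sum"),
)

def score(s, reverse=False):
    # Percentile-rank each value, then map to an integer score 1..5.
    pct = s.rank(pct=True, ascending=not reverse)
    return np.ceil(pct * 5).astype(int)

rfm["R"] = score(rfm["recency"], reverse=True)  # recent purchases score high
rfm["F"] = score(rfm["frequency"])
rfm["M"] = score(rfm["monetary"])
rfm["RFM"] = rfm["R"] * 100 + rfm["F"] * 10 + rfm["M"]
print(rfm)
```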

Keywords: CLV, RFM, revenue, recency, frequency, monetary value

Procedia PDF Downloads 212
40857 Spectral Re-Evaluation of the Magnetic Basement Depth over Yola Arm of Upper Benue Trough Nigeria Using Aeromagnetic Data

Authors: Emberga Terhemb Opara Alexander, Selemo Alexader, Onyekwuru Samuel

Abstract:

Aeromagnetic data have been used to re-evaluate parts of the Upper Benue Trough, Nigeria, using the spectral analysis technique in order to appraise the mineral accumulation potential of the area. The regional field was separated with a first-order polynomial using the Polyfit program. The residual data were subdivided into 24 spectral blocks using the OASIS MONTAJ software. Two prominent magnetic depth source layers were identified. The deeper source depth values obtained range from 1.56 km to 2.92 km, with an average depth of 2.37 km taken as the magnetic basement depth, while for the shallower sources the depth values range from -1.17 km to 0.98 km, with an average depth of 0.55 km. The shallow depth source is attributed to the volcanic rocks that intruded the sedimentary formation, which could possibly be responsible for the mineralization found in parts of the study area.
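
The depth estimate in this kind of spectral analysis typically comes from the slope of the radially averaged log power spectrum, with ln E(k) ≈ c - 4πhk for wavenumber k in cycles per km, so h = -slope/(4π). The following is a minimal sketch of that fit, assuming numpy and a hypothetical synthetic spectrum in place of the study's 24 spectral blocks.

```python
# Spectral depth estimation: fit a line to the low-wavenumber segment of
# the radially averaged log power spectrum and convert slope to depth.
import numpy as np

# Hypothetical radially averaged spectrum; wavenumber in cycles/km.
k = np.linspace(0.02, 0.5, 25)
true_depth_km = 2.4
ln_power = (-4 * np.pi * true_depth_km * k
            + np.random.default_rng(1).normal(0, 0.1, k.size))

# Deep sources dominate the low-wavenumber segment; fit a line there.
low = k < 0.15
slope_deep, _ = np.polyfit(k[low], ln_power[low], 1)
depth_deep = -slope_deep / (4 * np.pi)  # h = -slope / (4*pi)
print(f"estimated basement depth ~ {depth_deep:.2f} km")
```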

Keywords: spectral analysis, Upper Benue Trough, magnetic basement depth, aeromagnetic

Procedia PDF Downloads 439
40856 Evaluation of Machine Learning Algorithms and Ensemble Methods for Prediction of Students’ Graduation

Authors: Soha A. Bahanshal, Vaibhav Verdhan, Bayong Kim

Abstract:

Graduation rates at six-year colleges are becoming a more essential indicator for incoming freshmen and for university rankings. Predicting student graduation is extremely beneficial to schools and has huge potential for targeted intervention. It is important for educational institutions since it enables the development of strategic plans that will assist or improve students' performance in achieving their degrees on time (GOT). Machine learning techniques offer a first step and a helping hand in extracting useful information from these data and gaining insights into the prediction of students' progress and performance. Data analysis and visualization techniques are applied to understand and interpret the data. The data used for the analysis contain science-major students who graduated within 6 years as of the academic year 2017-2018. This analysis can be used to predict the graduation of students in the next academic year. Different predictive models, such as logistic regression, decision trees, support vector machines, random forest, naïve Bayes, and KNeighborsClassifier, are applied to predict whether a student will graduate. These classifiers were evaluated with 5-fold cross-validation, and their performance was compared based on accuracy. The results indicated that the ensemble classifier achieves better accuracy, about 91.12%. This GOT prediction model would hopefully be useful to university administration and academics in developing measures for assisting and boosting students' academic performance and ensuring they graduate on time.
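
The comparison procedure described above is easy to reproduce. The following is a minimal scikit-learn sketch with 5-fold cross-validation; since the study's dataset is not public, synthetic stand-in data and a soft-voting ensemble are used as assumptions.

```python
# Compare individual classifiers and an ensemble by 5-fold CV accuracy.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the graduation dataset (binary: GOT or not).
X, y = make_classification(n_samples=1000, n_features=12, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "tree": DecisionTreeClassifier(random_state=0),
    "svm": SVC(probability=True, random_state=0),
    "forest": RandomForestClassifier(random_state=0),
    "nb": GaussianNB(),
    "knn": KNeighborsClassifier(),
}
# Soft-voting ensemble built from the individual classifiers above.
models["ensemble"] = VotingClassifier(list(models.items()), voting="soft")

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold accuracy
    print(f"{name:9s} {scores.mean():.3f}")
```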

Keywords: prediction, decision trees, machine learning, support vector machine, ensemble model, student graduation, GOT graduate on time

Procedia PDF Downloads 65
40855 Optimizing Pediatric Pneumonia Diagnosis with Lightweight MobileNetV2 and VAE-GAN Techniques in Chest X-Ray Analysis

Authors: Shriya Shukla, Lachin Fernando

Abstract:

Pneumonia, a leading cause of mortality in young children globally, presents significant diagnostic challenges, particularly in resource-limited settings. This study presents an approach to diagnosing pediatric pneumonia using Chest X-Ray (CXR) images, employing a lightweight MobileNetV2 model enhanced with synthetic data augmentation. Addressing the challenge of dataset scarcity and imbalance, the study used a Variational Autoencoder-Generative Adversarial Network (VAE-GAN) to generate synthetic CXR images, improving the representation of normal cases in the pediatric dataset. This approach not only addresses the issues of data imbalance and scarcity prevalent in medical imaging but also provides a more accessible and reliable diagnostic tool for early pneumonia detection. The augmented data improved the model’s accuracy and generalization, achieving an overall accuracy of 95% in pneumonia detection. These findings highlight the efficacy of the MobileNetV2 model, offering a computationally efficient yet robust solution well-suited for resource-constrained environments such as mobile health applications. This study demonstrates the potential of synthetic data augmentation in enhancing medical image analysis for critical conditions like pediatric pneumonia.
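
A minimal transfer-learning sketch in the spirit of this approach is shown below, using tf.keras MobileNetV2. The directory path, image size, and training settings are hypothetical, and the VAE-GAN augmentation is assumed to have been applied offline to the training folder; this is an illustration, not the authors' exact pipeline.

```python
# MobileNetV2 transfer learning for binary CXR classification.
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze ImageNet features for the first training phase

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1.0),  # MobileNetV2 expects [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # pneumonia vs. normal
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Hypothetical directory of CXR images, augmented offline with VAE-GAN samples.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "cxr_train/", image_size=(224, 224), batch_size=32, label_mode="binary")
model.fit(train_ds, epochs=5)
```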

Keywords: pneumonia, MobileNetV2, image classification, GAN, VAE, deep learning

Procedia PDF Downloads 76
40854 Spatially Random Sampling for Retail Food Risk Factors Study

Authors: Guilan Huang

Abstract:

In 2013 and 2014, the U.S. Food and Drug Administration (FDA) collected data from selected fast food and full-service restaurants to track changes in the occurrence of foodborne illness risk factors. This paper discusses how we customized a spatially random sampling method by considering the financial position and availability of FDA resources, and how we enriched the restaurant data with location information. Location information for restaurants provides the opportunity to quantitatively determine random samples within non-governmental units (e.g., 240 kilometers around each data collector). Spatial analysis can also optimize data collectors' work plans and resource allocation. A spatial analytics and processing platform helped us handle the challenges of spatially random sampling. Our method fits the FDA's ability to pinpoint features of foodservice establishments and reduced both the time and expense of data collection.
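
The core operation, drawing a random sample of establishments within a radius of each data collector, can be sketched with a great-circle distance filter. The following is a minimal example assuming hypothetical coordinates; only the 240 km radius comes from the description above.

```python
# Spatially random sampling: filter restaurants within 240 km of a
# collector, then draw a simple random sample from the candidates.
import math
import random

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

collector = (38.9, -77.0)  # hypothetical data-collector location
restaurants = [("A", 39.2, -76.6), ("B", 40.7, -74.0), ("C", 38.8, -77.1)]

in_range = [r for r in restaurants
            if haversine_km(collector[0], collector[1], r[1], r[2]) <= 240.0]
sample = random.sample(in_range, k=min(2, len(in_range)))
print(sample)
```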

Keywords: geospatial technology, restaurant, retail food risk factor study, spatially random sampling

Procedia PDF Downloads 345
40853 The Quality Assessment of Seismic Reflection Survey Data Using Statistical Analysis: A Case Study of Fort Abbas Area, Cholistan Desert, Pakistan

Authors: U. Waqas, M. F. Ahmed, A. Mehmood, M. A. Rashid

Abstract:

In geophysical exploration surveys, the quality of acquired data holds significant importance before executing the data processing and interpretation phases. In this study, 2D seismic reflection survey data of the Fort Abbas area, Cholistan Desert, Pakistan, were taken as a test case in order to assess their quality on a statistical basis using the normalized root mean square error (NRMSE), Cronbach's alpha test (α), and null hypothesis tests (t-test and F-test). The analysis challenged the quality of the acquired data and highlighted significant errors in the acquired database. The study area is known to be flat, tectonically quiet, and rich in oil and gas reserves; however, subsurface 3D modeling and contouring using the acquired database revealed a high degree of structural complexity and intense folding. The NRMSE showed a high percentage of residuals between the observed and predicted cases. The outcomes of hypothesis testing also indicated the bias and erraticism of the acquired database, and the low estimated value of alpha (α) in Cronbach's alpha test confirmed its poor reliability. A very low-quality database requires extensive static correction or, in some cases, data reacquisition, which is usually not economically feasible. The outcomes of this study could be used to assess the quality of large databases and could further serve as a guideline for establishing database quality assessment models to support more informed decisions in hydrocarbon exploration.
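
The two quality statistics named above can be computed directly. The following is a minimal numpy sketch on hypothetical measurements; note that NRMSE normalization conventions vary (range-based here), which is an assumption.

```python
# NRMSE and Cronbach's alpha for data-quality assessment.
import numpy as np

def nrmse(observed, predicted):
    """RMSE normalized by the observed range, as a percentage."""
    rmse = np.sqrt(np.mean((observed - predicted) ** 2))
    return 100.0 * rmse / (observed.max() - observed.min())

def cronbach_alpha(items):
    """items: 2D array, rows = observations, columns = repeated measures."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
obs = rng.normal(0, 1, 200)
pred = obs + rng.normal(0, 0.4, 200)  # hypothetical predicted values
print(f"NRMSE = {nrmse(obs, pred):.1f}%")

# Hypothetical repeated measurements sharing a common row effect.
items = rng.normal(0, 0.3, (50, 6)) + rng.normal(0, 1, (50, 1))
print(f"alpha = {cronbach_alpha(items):.2f}")
```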

Keywords: Data quality, Null hypothesis, Seismic lines, Seismic reflection survey

Procedia PDF Downloads 153
40852 The Impact of Data Science on Geography: A Review

Authors: Roberto Machado

Abstract:

We conducted a systematic review using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses methodology, analyzing 2,996 studies and synthesizing 41 of them to explore the evolution of data science and its integration into geography. By employing optimization algorithms, we accelerated the review process, significantly enhancing the efficiency and precision of literature selection. Our findings indicate that data science has developed over five decades, facing challenges such as the diversified integration of data and the need for advanced statistical and computational skills. In geography, the integration of data science underscores the importance of interdisciplinary collaboration and methodological innovation. Techniques like large-scale spatial data analysis and predictive algorithms show promise in natural disaster management and transportation route optimization, enabling faster and more effective responses. These advancements highlight the transformative potential of data science in geography, providing tools and methodologies to address complex spatial problems. The relevance of this study lies in the use of optimization algorithms in systematic reviews and the demonstrated need for deeper integration of data science into geography. Key contributions include identifying specific challenges in combining diverse spatial data and the necessity for advanced computational skills. Examples of connections between these two fields encompass significant improvements in natural disaster management and transportation efficiency, promoting more effective and sustainable environmental solutions with a positive societal impact.

Keywords: data science, geography, systematic review, optimization algorithms, supervised learning

Procedia PDF Downloads 9
40851 Bluetooth Communication Protocol Study for Multi-Sensor Applications

Authors: Joao Garretto, R. J. Yarwood, Vamsi Borra, Frank Li

Abstract:

Bluetooth Low Energy (BLE) has emerged as one of the main wireless communication technologies used in low-power electronics, such as wearables, beacons, and Internet of Things (IoT) devices. BLE's energy efficiency, smart-mobile interoperability, and over-the-air (OTA) capabilities are essential features for ultralow-power devices, which are usually designed with size and cost constraints. Most current research regarding the power analysis of BLE devices focuses on the theoretical aspects of the advertising and scanning cycles, with most results being presented in the form of mathematical models and computer software simulations. Such computer modeling and simulations are important for the comprehension of the technology, but hardware measurement is essential for understanding how BLE devices behave in real operation. In addition, recent literature focuses mostly on the BLE technology itself, leaving possible applications and their analysis out of scope. In this paper, a coin-cell battery-powered BLE data acquisition device, with a 4-in-1 sensor and one accelerometer, is proposed and evaluated with respect to its power consumption. First, the device is evaluated in advertising mode with the sensors turned off completely; next, power is analyzed with each sensor individually turned on and transmitting data; finally, power consumption is evaluated with both sensors on and broadcasting data to a mobile phone. The results presented in this paper are real-time measurements of the electrical current consumption of the BLE device, where the energy levels demonstrated are matched to the BLE behavior and sensor activity.

Keywords: bluetooth low energy, power analysis, BLE advertising cycle, wireless sensor node

Procedia PDF Downloads 79
40850 An Investigation of Differential Item and Test Functioning of Scholastic Aptitude Test 2011 (SWUSAT 2011)

Authors: Ruangdech Sirikit

Abstract:

The purposes of this study were to analyze the differential item functioning and differential test functioning of the SWUSAT aptitude test, classified by sex. The data used in this research are secondary data from the Srinakharinwirot University Scholastic Aptitude Test 2011 (SWUSAT 2011). The SWUSAT consists of four subtests: verbal ability, number ability, reasoning ability, and spatial ability. The data analysis was carried out in two steps. The first step was analyzing descriptive statistics. In the second step, differential item functioning (DIF) and differential test functioning (DTF) were analyzed using the DIFAS program. The research results were as follows. For all 10 tests in 2011, sex was the characteristic for which DIF was found in every test. The percentage of items showing DIF ranged from 10% to 46.67%. In 4 tests most of the flagged items favored the female group, in 3 tests most favored the male group, and in 3 tests the numbers of items favoring each group were equal. For differential test functioning (DTF), 8 tests showed small DIF effect variance.

Keywords: differential item functioning, differential test functioning, SWUSAT, aptitude test

Procedia PDF Downloads 598
40849 Trend Analysis of Rainfall: A Climate Change Paradigm

Authors: Shyamli Singh, Ishupinder Kaur, Vinod K. Sharma

Abstract:

Climate change refers to the change in climate over an extended period of time. Climate has varied throughout Earth's history, but anthropogenic activities have accelerated the rate of change, making it a global issue. The increase in greenhouse gas emissions is causing global warming and climate change related issues at an alarming rate, and increasing temperature results in climate variability across the globe. Changes in rainfall patterns, intensity, and extreme events are some of the impacts of climate change. Rainfall variability refers to the degree to which rainfall patterns vary over a region (spatial) or through a time period (temporal). Temporal rainfall variability can be directly or indirectly linked to climate change, and such variability increases the vulnerability of communities to climate change. With increasing urbanization and unplanned development, air quality is also deteriorating. This paper mainly focuses on rainfall variability due to increasing levels of greenhouse gases. Sixty-five years (1951-2015) of rainfall data for the Safdarjung station in Delhi were collected from the Indian Meteorological Department and analyzed using the Mann-Kendall test for time-series data. The Mann-Kendall test is a statistical tool that helps detect trends in a given data set, and the magnitude of the trend can be measured through Sen's slope estimator. The data were analyzed monthly, seasonally, and yearly across the 65-year period. The monthly rainfall data for this period do not follow any increasing or decreasing trend. The monsoon season shows no increasing trend, but there was an increasing trend in the pre-monsoon season. Hence, the actual rainfall differs from the normal trend of the rainfall. Through this analysis, it can be projected that pre-monsoon rainfall will increase relative to the actual monsoon season. Pre-monsoon rainfall causes a cooling effect and results in a drier monsoon season. This will increase the vulnerability of communities to climate change and also affect related developmental activities.
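
The Mann-Kendall test and Sen's slope estimator used above are simple to implement. The following is a minimal numpy/scipy sketch on a hypothetical rainfall series (the no-ties variance formula is assumed; real rainfall data with tied values needs the tie correction).

```python
# Mann-Kendall trend test and Sen's slope on an annual rainfall series.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S statistic: sum of signs over all pairs (j > i).
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance assuming no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))            # two-sided p-value
    return s, z, p

def sens_slope(x):
    x = np.asarray(x, dtype=float)
    slopes = [(x[j] - x[i]) / (j - i)
              for i in range(len(x) - 1) for j in range(i + 1, len(x))]
    return np.median(slopes)  # median pairwise slope

# Hypothetical 65-year annual rainfall series with an upward trend.
rain = 600 + 3.0 * np.arange(65) + np.random.default_rng(2).normal(0, 80, 65)
s, z, p = mann_kendall(rain)
print(f"S={s:.0f}, Z={z:.2f}, p={p:.3f}, Sen's slope={sens_slope(rain):.2f} mm/yr")
```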

Keywords: greenhouse gases, Mann-Kendall test, rainfall variability, Sen's slope

Procedia PDF Downloads 195
40848 An Extended Inverse Pareto Distribution, with Applications

Authors: Abdel Hadi Ebraheim

Abstract:

This paper introduces a new extension of the inverse Pareto distribution in the framework of the Marshall-Olkin (1997) family of distributions. The model is capable of fitting various shapes of aging and failure data. The statistical properties of the new model are discussed, and several methods are used to estimate the parameters involved. Explicit expressions are derived for different types of moments of value in reliability analysis. Besides, the order statistics of samples from the new proposed model have been studied. Finally, the usefulness of the new model for modeling reliability data is illustrated using two real data sets together with a simulation study.
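
For readers unfamiliar with the construction, the Marshall-Olkin (1997) transform applied to a baseline survival function is sketched below, together with one common parameterization of the inverse Pareto CDF; the paper's exact parameterization may differ.

```latex
% Marshall-Olkin transform of a baseline survival function \bar{F}:
\[
  \bar G(x;\alpha) \;=\; \frac{\alpha\,\bar F(x)}{1-(1-\alpha)\,\bar F(x)},
  \qquad \alpha > 0 .
\]
% One common inverse Pareto parameterization (assumed here for illustration):
\[
  F(x) \;=\; \Bigl(\frac{x}{x+\theta}\Bigr)^{\tau},
  \qquad x>0,\ \ \theta,\tau>0,
  \qquad \bar F(x) = 1 - F(x).
\]
```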

Keywords: Pareto distribution, Marshall-Olkin, reliability, hazard functions, moments, estimation

Procedia PDF Downloads 75
40847 Investigating the Effects of Data Transformations on a Bi-Dimensional Chi-Square Test

Authors: Alexandru George Vaduva, Adriana Vlad, Bogdan Badea

Abstract:

In this research, we conduct a Monte Carlo analysis of a two-dimensional χ² test, which is used to determine the minimum distance required for independent sampling in the context of chaotic signals. We investigate the impact on the χ² test of transforming initial data sets from any probability distribution into new signals with a uniform distribution, using the Spearman rank transformation. This transformation removes the randomness of the data pairs, and as a result, the observed distribution of χ² test values differs from the expected distribution. We propose a solution to this problem and evaluate it using another chaotic signal.
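
A minimal sketch of the setup is shown below, assuming numpy/scipy: a logistic-map series stands in for the chaotic signal, lagged pairs are rank-transformed to approximately uniform marginals, and independence is tested on a 2D contingency table. The lag, bin count, and series length are illustrative assumptions.

```python
# 2D chi-square independence test on rank-transformed chaotic samples.
import numpy as np
from scipy.stats import chi2_contingency, rankdata

# Hypothetical chaotic source: logistic map x_{n+1} = 4 x_n (1 - x_n).
x = np.empty(20000)
x[0] = 0.3141
for n in range(1, x.size):
    x[n] = 4.0 * x[n - 1] * (1.0 - x[n - 1])

lag = 15  # candidate minimum sampling distance
a, b = x[:-lag], x[lag:]

# Rank transform to (0, 1): empirical probability integral transform.
u = rankdata(a) / (a.size + 1)
v = rankdata(b) / (b.size + 1)

# 2D contingency table over a uniform grid, then the chi-square test.
table, _, _ = np.histogram2d(u, v, bins=10, range=[[0, 1], [0, 1]])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.3f}")
```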

Keywords: chaotic signals, logistic map, Pearson's test, chi-square test, bivariate distribution, statistical independence

Procedia PDF Downloads 87
40846 Nanoparticle Based Green Inhibitor for Corrosion Protection of Zinc in Acidic Medium

Authors: Neha Parekh, Divya Ladha, Poonam Wadhwani, Nisha Shah

Abstract:

Nanoscaled materials have attracted tremendous interest as corrosion inhibitors due to their high surface area on metal surfaces. It is well known that zinc oxide nanoparticles have higher reactivity towards aqueous acidic solutions. This work presents a new method to incorporate zinc oxide nanoparticles with white sesame seed extract (nano-green inhibitor) for corrosion protection of zinc in acidic medium. The morphology of the zinc oxide nanoparticles was investigated by TEM and DLS. The corrosion inhibition efficiency of the green inhibitor and the nano-green inhibitor was determined by gravimetric and electrochemical impedance spectroscopy (EIS) methods. Gravimetric measurements suggested that the nano-green inhibitor is more effective than the green inhibitor. Furthermore, with increasing temperature, the inhibition efficiency increases for both inhibitors. In addition, it was established that the Temkin adsorption isotherm fits the experimental data well for both inhibitors. The effect of temperature and the Temkin adsorption isotherm revealed a chemisorption mechanism occurring in the system. The activation energy (Ea) and other thermodynamic parameters for the inhibition process were calculated. The EIS data showed that charge transfer controls the corrosion process. The surface morphology of the zinc specimens in the absence and presence of the green inhibitor and the nano-green inhibitor was examined using scanning electron microscopy (SEM) and atomic force microscopy (AFM). The outcomes indicated the formation of a protective layer on the zinc specimen.
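
For reference, the relations typically used in such gravimetric analyses are summarized below: surface coverage from inhibition efficiency, one common form of the Temkin isotherm (the exact form and heterogeneity factor f are assumptions here), and the Arrhenius relation from which the activation energy is obtained.

```latex
% Surface coverage and a common Temkin-isotherm form used in corrosion work:
\[
  \theta = \frac{\mathrm{IE}\%}{100}, \qquad
  K_{\mathrm{ads}}\,C = e^{f\theta} \quad \text{(Temkin)} .
\]
% Arrhenius relation: E_a from the slope of ln(CR) versus 1/T:
\[
  CR = A \exp\!\left(-\frac{E_a}{RT}\right)
  \;\Longrightarrow\;
  \ln CR = \ln A - \frac{E_a}{RT}.
\]
```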

Keywords: corrosion, green inhibitor, nanoparticles, zinc

Procedia PDF Downloads 436