Search results for: atomic data
25391 Exact Energy Spectrum and Expectation Values of the Inverse Square Root Potential Model
Authors: Benedict Ita, Peter Okoi
Abstract:
In this work, the concept of the extended Nikiforov-Uvarov technique is discussed and employed to obtain the exact bound-state energy eigenvalues and the corresponding normalized eigenfunctions of the inverse square root potential. With expressions for the exact energy eigenvalues and corresponding eigenfunctions, expressions for the expectation values of the inverse separation-squared, the kinetic energy, and the momentum-squared of the potential are presented using the Hellmann-Feynman theorem. For visualization, algorithms written and implemented in the Python language are used to generate tables and plots of the energy eigenvalues and some expectation values for l-states. The results obtained here may find suitable applications in areas such as atomic and molecular physics, chemical physics, nuclear physics, and solid-state physics.
Keywords: Schrodinger equation, Nikiforov-Uvarov method, inverse square root potential, diatomic molecules, Python programming, Hellmann-Feynman theorem, second order differential equation, matrix algebra
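For context, the Hellmann-Feynman theorem referred to above is conventionally stated as below; the particular parameter choices (differentiating with respect to the reduced mass or the angular-momentum quantum number to reach the momentum-squared and inverse-separation-squared expectation values) are the standard textbook usage, quoted here for illustration rather than taken from the paper.

```latex
\frac{\partial E_{n\ell}}{\partial \lambda}
   = \Big\langle \psi_{n\ell} \Big| \frac{\partial \hat H}{\partial \lambda} \Big| \psi_{n\ell} \Big\rangle,
\qquad
\frac{\partial E_{n\ell}}{\partial \mu} = -\frac{\langle \hat p^{2}\rangle}{2\mu^{2}},
\qquad
\frac{\partial E_{n\ell}}{\partial \ell} = \frac{\hbar^{2}(2\ell+1)}{2\mu}\,\langle r^{-2}\rangle .
```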
Procedia PDF Downloads 24
25390 An Empirical Study of the Impacts of Big Data on Firm Performance
Authors: Thuan Nguyen
Abstract:
In the present time, data is to the data-driven, knowledge-based economy what oil was to the industrial age hundreds of years ago. Data is everywhere in vast volumes! Big data analytics is expected to help firms not only improve performance efficiently but also completely transform how they run their business. However, employing the emergent technology successfully is not easy, and assessing the role of big data in improving firm performance is even harder. There has been a lack of studies examining the impact of big data analytics on organizational performance. This study aimed to fill that gap. The present study suggested using firms' intellectual capital as a proxy for big data in evaluating its impact on organizational performance. The study employed the Value Added Intellectual Coefficient method to measure firm intellectual capital via its three main components: human capital efficiency, structural capital efficiency, and capital employed efficiency, and then used the structural equation modeling technique to model the data and test the models. The financial fundamental and market data of 100 randomly selected publicly listed firms were collected. The results of the tests showed that only human capital efficiency had a significant positive impact on firm profitability, which highlights the prominent human role in the impact of big data technology.
Keywords: big data, big data analytics, intellectual capital, organizational performance, value added intellectual coefficient
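As a rough illustration of the Value Added Intellectual Coefficient components named above, the sketch below follows Pulic's commonly cited formulation; the function, its inputs, and the example figures are hypothetical and are not taken from the study's sample of 100 firms.

```python
# Minimal sketch of the VAIC components (assumed standard Pulic formulation,
# not the authors' exact implementation).

def vaic_components(output, input_cost, human_capital, capital_employed):
    """Return (HCE, SCE, CEE, VAIC) for one firm-year.

    output           -- total revenue
    input_cost       -- all expenses except employee costs
    human_capital    -- total employee costs (HC)
    capital_employed -- book value of net assets (CE)
    """
    value_added = output - input_cost                  # VA
    structural_capital = value_added - human_capital   # SC
    hce = value_added / human_capital                  # human capital efficiency
    sce = structural_capital / value_added             # structural capital efficiency
    cee = value_added / capital_employed               # capital employed efficiency
    return hce, sce, cee, hce + sce + cee

# Hypothetical example firm (figures in millions):
hce, sce, cee, vaic = vaic_components(output=500, input_cost=300,
                                      human_capital=120, capital_employed=800)
print(f"HCE={hce:.2f}  SCE={sce:.2f}  CEE={cee:.2f}  VAIC={vaic:.2f}")
```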
Procedia PDF Downloads 246
25389 Automated Test Data Generation for Some Types of Algorithm
Authors: Hitesh Tahbildar
Abstract:
The cost of test data generation for a program is computationally very high. In the general case, no algorithm has been found to generate test data for all types of algorithms, and the cost of generating test data differs across algorithm types. To date, the emphasis has been on generating test data for different types of programming constructs rather than for different types of algorithms. Test data generation methods have been implemented to find heuristics for different types of algorithms. Some algorithm types, including divide and conquer, backtracking, the greedy approach, and dynamic programming, have been tested to find the minimum cost of test data generation. Our experimental results suggest that some of these algorithm types can be used as a necessary condition for selecting heuristics, while programming constructs are a sufficient condition for selecting our heuristics. Finally, we recommend the different heuristics for test data generation to be selected for different types of algorithms.
Keywords: longest path, saturation point, lmax, kL, kS
Procedia PDF Downloads 408
25388 The Perspective on Data Collection Instruments for Younger Learners
Authors: Hatice Kübra Koç
Abstract:
For academia, collecting reliable and valid data is one of the most significant issues for researchers. However, the procedure is not the same for all target groups: when collecting data from teenagers, young adults, or adults, researchers can use common data collection tools such as questionnaires, interviews, and semi-structured interviews; yet, for young learners and very young ones, such reliable and valid data collection tools cannot be easily designed or applied by researchers. In this study, firstly, common data collection tools are examined for the ‘very young’ and ‘young learners’ participant groups, since the quality and efficiency of an academic study rests mainly on a valid and correct data collection and data analysis procedure. Secondly, two different data collection instruments for very young and young learners are presented, and their efficacy is discussed. Finally, a suggested data collection tool, a performance-based questionnaire, specifically developed for the ‘very young’ and ‘young learners’ participant groups in the field of teaching English to young learners as a foreign language, is presented in this study. The design procedure and the suggested items/factors for the proposed data collection tool are given at the end of the study to help researchers who work with young and very young learners.
Keywords: data collection instruments, performance-based questionnaire, young learners, very young learners
Procedia PDF Downloads 94
25387 Generating Swarm Satellite Data Using Long Short-Term Memory and Generative Adversarial Networks for the Detection of Seismic Precursors
Authors: Yaxin Bi
Abstract:
Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in the fields of geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), generative models tailored to time-series data, to generate synthetic time series based on Swarm satellite data, which will be used for detecting seismic anomalies. The LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled to generate synthetic data, often producing non-informative values, although they were able to capture the data distribution of the time series. These findings highlight both the promise and the challenges associated with applying deep learning techniques to generate synthetic data, underscoring the potential of deep learning for generating synthetic electromagnetic satellite data.
Keywords: LSTM, GAN, earthquake, synthetic data, generative AI, seismic precursors
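To make the LSTM route concrete, a minimal sketch of a next-step LSTM predictor used autoregressively to generate a synthetic series is shown below. The sine-wave "signal" is purely illustrative and stands in for a Swarm magnetic-field channel; the network size, window width, and training budget are placeholder assumptions, not the study's configuration.

```python
# Minimal sketch: LSTM next-step prediction, then autoregressive generation of
# synthetic time-series data. Illustrative only; not the authors' model.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy 1-D series (placeholder for a Swarm satellite channel).
t = torch.linspace(0, 20 * torch.pi, 2000)
series = torch.sin(t) + 0.05 * torch.randn_like(t)

def make_windows(x, width=50):
    """Slice a 1-D series into (input window, next value) training pairs."""
    xs = torch.stack([x[i:i + width] for i in range(len(x) - width)])
    ys = x[width:]
    return xs.unsqueeze(-1), ys.unsqueeze(-1)   # (N, width, 1), (N, 1)

X, y = make_windows(series)

class NextStepLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])          # predict the next value

model = NextStepLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):                            # short demo training loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# Autoregressive generation: feed predictions back in to build synthetic data.
window = series[:50].reshape(1, 50, 1)
synthetic = []
with torch.no_grad():
    for _ in range(200):
        nxt = model(window)
        synthetic.append(nxt.item())
        window = torch.cat([window[:, 1:, :], nxt.reshape(1, 1, 1)], dim=1)
print(synthetic[:5])
```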
Procedia PDF Downloads 34
25386 Generation of Quasi-Measurement Data for On-Line Process Data Analysis
Authors: Hyun-Woo Cho
Abstract:
To ensure the safety of a manufacturing process, one should quickly identify an assignable cause of a fault on an on-line basis. To this end, many statistical techniques, including linear and nonlinear methods, have been frequently utilized. However, such methods suffer from a major problem of small sample size, which is mostly attributed to the characteristics of the empirical models used as reference models. This work presents a new method to overcome the insufficiency of measurement data in monitoring and diagnosis tasks. Quasi-measurement data are generated from existing data based on two indices, similarity and importance. The performance of the method is demonstrated using a real data set. The results show that the presented method is able to handle the insufficiency problem successfully. In addition, it is shown to be quite efficient in terms of computational speed and memory usage, so on-line implementation of the method for monitoring and diagnosis purposes is straightforward.
Keywords: data analysis, diagnosis, monitoring, process data, quality control
Procedia PDF Downloads 483
25385 Determination of Mineral Elements in Some Coarse Grains Used as Staple Food in Kano, Nigeria
Authors: M. I. Mohammed, U. M. Ahmad
Abstract:
Analyses of mineral elements were carried out on some coarse grains used as staple food in Kano. The levels of magnesium, calcium, manganese, iron, copper and zinc were determined using an atomic absorption spectrophotometer (AAS), and those of sodium and potassium were obtained using a flame photometer (FES). The results of the study show that the mean levels of the mineral elements ranged from 62.50±0.55 - 84.82±0.74 mg/kg for sodium, 73.33±0.35 - 317±0.10 mg/kg for magnesium, 89.22±0.26 - 193.33±0.19 mg/kg for potassium, 70.00±0.52 - 186.67±0.29 mg/kg for calcium, 1.00±0.11 - 20.50±1.30 mg/kg for manganese, 25.00±0.11 - 80.50±0.36 mg/kg for iron, 4.00±0.08 - 13.00±0.24 mg/kg for copper and 15.00±0.34 - 50.50±0.24 mg/kg for zinc. There was a significant difference (p < 0.05) in the levels of sodium, potassium and calcium, whereas no significant difference (p > 0.05) occurred in the levels of magnesium, manganese, copper and zinc. In comparison with the Recommended Daily Allowances of essential and trace metals set by international standards organizations, the coarse grains analysed in this work contribute little to essential and trace element requirements.
Keywords: mineral elements, coarse grains, staple food, Kano, Nigeria
Procedia PDF Downloads 277
25384 Heavy Metals in Selected Infant Milk Formula
Authors: Suad M. Abuzariba, M. Gazette
Abstract:
To test for the presence of toxic heavy metals, specifically Arsenic, Lead, and Mercury, in formula milk available in Misrata city in northern Libya for infants aged 6-12 months, 30 samples of milk formula imported into Libyan markets were analysed by atomic absorption spectrophotometry to assess their heavy metal contamination. The concentrations of Hg, As and Pb in the milk formula samples were in the ranges 0.002-1.37, 1.62-0.04-2.16 and 0.15-0.65, respectively; when the results were compared with Libyan and WHO standards, they were within the limits for toxic heavy metals. The study assessed the presence or absence of the toxic heavy metals (Lead, Arsenic, and Mercury) in selected infant formula milk and whether their levels were within or beyond the standards set by the WHO. Of the three infant formulas tested, all were negative for Arsenic and Lead, while two of the three tested positive for Mercury, with levels of 0.6333 ppm and 0.8333 ppm. The levels of Mercury obtained from the two infant formulas, expressed in parts per million (ppm), were above the Provisional Tolerable Weekly Intake of total Mercury, which is 0.005 ppm, as set by the FAO, WHO, and JECFA.
Keywords: heavy metals, milk formula, Libya, toxic
Procedia PDF Downloads 510
25383 Sanction Influences and Reconstruction Strategies for Iran Oil Market in Post-Sanctions
Authors: Mehrdad HassanZadeh Dugoori, Iman Mohammadali Tajrishi
Abstract:
Since Iran's nuclear program became public in 2002, the International Atomic Energy Agency (IAEA) has been unable to confirm Tehran's assertions that its nuclear activities are exclusively for peaceful purposes and that it has not sought to develop nuclear weapons. The United Nations Security Council has adopted six resolutions since 2006 requiring Iran to stop enriching uranium - which can be used for civilian purposes but also to build nuclear bombs, a path Iran has never pursued as a strategy - and to co-operate with the IAEA. Four resolutions have included progressively expansive sanctions to persuade Tehran to comply. The US and EU have imposed additional sanctions on Iranian oil exports and banks since 2012. In this article we reassess the dimensions of the sanctions on Iran and their influences. Then, in light of the agreement reached between the P5+1 and Iran on 15 July 2015, we propose reconstruction strategies for Iran's oil export markets and an operational program for selling one million barrels of crude oil per day. These strategies are the conclusions of focus groups and brainstorming sessions with Iran's oil and gas managers, analysed through content analysis.
Keywords: post-sanction, oil market, reconstruction, marketing, strategy
Procedia PDF Downloads 457
25382 Emerging Technology for Business Intelligence Applications
Authors: Hsien-Tsen Wang
Abstract:
Business Intelligence (BI) has long helped organizations make informed decisions based on data-driven insights and gain competitive advantages in the marketplace. In the past two decades, businesses witnessed not only the dramatically increasing volume and heterogeneity of business data but also the emergence of new technologies, such as Artificial Intelligence (AI), Semantic Web (SW), Cloud Computing, and Big Data. It is plausible that the convergence of these technologies would bring more value out of business data by establishing linked data frameworks and connecting in ways that enable advanced analytics and improved data utilization. In this paper, we first review and summarize current BI applications and methodology. Emerging technologies that can be integrated into BI applications are then discussed. Finally, we conclude with a proposed synergy framework that aims at achieving a more flexible, scalable, and intelligent BI solution.
Keywords: business intelligence, artificial intelligence, semantic web, big data, cloud computing
Procedia PDF Downloads 98
25381 Using Equipment Telemetry Data for Condition-Based Maintenance Decisions
Authors: John Q. Todd
Abstract:
Given that modern equipment can provide comprehensive health, status, and error-condition data via built-in sensors, maintenance organizations have a new and valuable source of insight to take advantage of. This presentation will show what these data payloads might look like and how they can be filtered, visualized, calculated into metrics, used for machine learning, and used to generate alerts for further action.
Keywords: condition based maintenance, equipment data, metrics, alerts
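A bare-bones sketch of that telemetry-to-alert flow is shown below: filter raw readings, compute a rolling health metric, and raise a condition-based maintenance alert when a threshold is crossed. The field names, the vibration metric, and the threshold are hypothetical, not taken from any particular equipment vendor or from the presentation.

```python
# Illustrative telemetry pipeline: filter payloads, compute a rolling metric,
# and flag alerts. Placeholder field names and threshold.
from collections import deque

def rolling_mean(window):
    return sum(window) / len(window)

def process_telemetry(readings, window_size=5, vib_limit=7.0):
    """Yield (timestamp, metric, alert) tuples from raw telemetry payloads."""
    window = deque(maxlen=window_size)
    for payload in readings:
        if payload.get("status") == "error":       # filter out faulted samples
            continue
        window.append(payload["vibration_mm_s"])
        metric = rolling_mean(window)
        alert = metric > vib_limit                  # simple threshold rule
        yield payload["ts"], round(metric, 2), alert

# Hypothetical payloads as they might arrive from a built-in sensor:
sample = [
    {"ts": 1, "status": "ok", "vibration_mm_s": 3.1},
    {"ts": 2, "status": "ok", "vibration_mm_s": 6.8},
    {"ts": 3, "status": "error", "vibration_mm_s": 0.0},
    {"ts": 4, "status": "ok", "vibration_mm_s": 9.4},
    {"ts": 5, "status": "ok", "vibration_mm_s": 9.9},
]
for ts, metric, alert in process_telemetry(sample):
    print(ts, metric, "ALERT" if alert else "ok")
```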
Procedia PDF Downloads 189
25380 Electron Density Analysis and Nonlinear Optical Properties of Zwitterionic Compound
Authors: A. Chouaih, N. Benhalima, N. Boukabcha, R. Rahmani, F. Hamzaoui
Abstract:
Zwitterionic compounds have received the interest of chemists and physicists due to their applications as nonlinear optical materials. Recently, zwitterionic compounds exhibiting high nonlinear optical activity have been investigated. In this context, the molecular electron charge density distribution of the title compound is described accurately using the multipolar model of Hansen and Coppens. The net atomic charges and the molecular dipole moment have been determined in order to understand the nature of inter- and intramolecular charge transfer. The study reveals the nature of the intermolecular interactions, including charge transfer and hydrogen bonds, in the title compound. In this crystal, the molecules form dimers via intermolecular hydrogen bonds. The dimers are further linked by C–H...O hydrogen bonds into chains along the c crystallographic axis. This study has also allowed us to determine various nonlinear optical properties, such as the molecular electrostatic potential, polarizability, and hyperpolarizability, of the title compound.
Keywords: organic compounds, polarizability, hyperpolarizability, dipole moment
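For reference, the Hansen-Coppens multipolar model mentioned above is conventionally written in the following textbook form, with the molecular dipole moment estimated from the refined net atomic charges in the simplest point-charge picture; the expressions are quoted for context, and the population and contraction parameters shown are generic symbols rather than the paper's refined values.

```latex
\rho_{\text{atom}}(\mathbf r)
  = P_c\,\rho_{\text{core}}(r)
  + P_v\,\kappa^{3}\rho_{\text{valence}}(\kappa r)
  + \sum_{l=0}^{l_{\max}} \kappa'^{3} R_l(\kappa' r)
      \sum_{m=0}^{l} P_{lm\pm}\, d_{lm\pm}(\theta,\varphi),
\qquad
\boldsymbol{\mu} \approx \sum_i q_i\,\mathbf r_i .
```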
Procedia PDF Downloads 418
25379 Analogy in Microclimatic Parameters, Chemometric and Phytonutrient Profiles of Cultivated and Wild Ecotypes of Origanum vulgare L., across Kashmir Himalaya
Authors: Sumira Jan, Javid Iqbal Mir, Desh Beer Singh, Anil Sharma, Shafia Zaffar Faktoo
Abstract:
Background and Aims: Climatic and edaphic factors immensely influence crop quality and proper development. Despite its economic potential, Himalayan oregano has not been subjected to phytonutrient and chemometric evaluation, and studies of its relationship with environmental conditions are scarce. The central objective of this research was to investigate microclimatic variation among wild and cultivated populations located along a microclimatic gradient in the north-western Himalaya, Kashmir, and to analyse whether such disparity was related to diverse climatic and edaphic conditions. Methods: Micrometeorological measurements were taken, and atomic absorption spectroscopy was carried out for micro-elemental analysis of the soil. HPLC was carried out to estimate variation in phytonutrients and phytochemicals. Results: Geographic variation in phytonutrients was observed between cultivated and wild populations and among populations within regions. Cultivated populations exhibited comparatively lower phytonutrient values than wild populations. Moreover, our results showed higher vegetative growth of O. vulgare L. where higher pH (6-7), elevated organic carbon (2.42%), high nitrogen (97.41 kg/ha), manganese (10-12 ppm) and zinc contents (0.39-0.50) produced higher phytonutrients. HPLC data for phytonutrients such as quercetin, beta-carotene, ascorbic acid, arbutin and catechin revealed a direct relationship with UV-B flux (r2=0.82) and potassium (r2=0.97), displaying a parallel relationship with phytonutrient value. Conclusions: Catechin was found to be the predominant phytonutrient among all populations, with a maximum accumulation of 163.8 ppm, whereas quercetin exhibited a lower value. Maximum arbutin (53.42 ppm) and quercetin (2.87 ppm) accumulated in plants thriving under intense, high UV-B flux. Minimum variation was demonstrated by beta-carotene and ascorbic acid.
Keywords: phytonutrient, ascorbic acid, beta carotene, quercetin, catechin
Procedia PDF Downloads 268
25378 Simulation of 1D Dielectric Barrier Discharge in Argon Mixtures
Authors: Lucas Wilman Crispim, Patrícia Hallack, Maikel Ballester
Abstract:
This work aims at modeling electric discharges in gas mixtures. The mathematical model mimics the ignition process in a commercial spark plug when a high voltage is applied to the plug terminals. A longitudinal, one-dimensional Cartesian domain is chosen as the simulation region. Energy and mass transfer are considered in a macroscopic fluid representation, while energy transfer in molecular collisions and chemical reactions is treated at the microscopic level. The macroscopic model is represented by a set of uncoupled partial differential equations. Microscopic effects are studied within a discrete model for electronic and molecular collisions in the framework of ZDPlasKin, a plasma modeling numerical tool. The BOLSIG+ solver is employed to solve the electron Boltzmann equation. An operator splitting technique is used to separate the microscopic and macroscopic models. The simulated gas is a mixture of neutral, excited and ionized atomic argon. The spatial and temporal evolution of these species and of the temperature is presented and discussed.
Keywords: CFD, electronic discharge, ignition, spark plug
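To illustrate the operator-splitting idea mentioned above, the sketch below performs a first-order (Lie) splitting of a macroscopic transport step and a microscopic chemistry step within each time step. The grid, rate constants, and the single toy ionization/recombination reaction are placeholders, not the argon kinetics handled by ZDPlasKin/BOLSIG+.

```python
# Minimal Lie operator-splitting sketch: per time step, advance the macroscopic
# diffusion part, then the local chemistry part. Illustrative values only.
import numpy as np

nx, dx, dt = 100, 1e-4, 1e-9          # 1-D grid and time step (placeholders)
D = 1e-2                               # diffusion coefficient (placeholder)
k_ion, k_rec = 1e3, 1e-13              # toy rate constants, not real argon data

n_e = np.full(nx, 1e14)                # electron density profile
n_e[nx // 2] = 1e16                    # seed a perturbation mid-domain

def transport_step(n, dt):
    """Explicit finite-difference diffusion step (macroscopic part, periodic BC)."""
    lap = (np.roll(n, 1) - 2 * n + np.roll(n, -1)) / dx**2
    return n + dt * D * lap

def chemistry_step(n, dt):
    """Local source/sink step (microscopic part): ionization minus recombination."""
    return n + dt * (k_ion * n - k_rec * n**2)

for step in range(1000):               # split each step: transport, then chemistry
    n_e = transport_step(n_e, dt)
    n_e = chemistry_step(n_e, dt)

print(f"peak electron density after splitting loop: {n_e.max():.3e} m^-3")
```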
Procedia PDF Downloads 162
25377 An Inviscid Compressible Flow Solver Based on Unstructured OpenFOAM Mesh Format
Authors: Utkan Caliskan
Abstract:
Two types of numerical codes based on the finite volume method are developed in order to solve the compressible Euler equations and simulate the flow through a forward-facing step channel. Both algorithms use the AUSM+-up (Advection Upstream Splitting Method) scheme for flux splitting and a two-stage Runge-Kutta scheme for time stepping. In this study, the flux calculations differentiate the algorithm based on the OpenFOAM mesh format, called the 'face-based' algorithm, from the basic algorithm, called the 'element-based' algorithm. The face-based algorithm avoids redundant flux computations and is also more flexible with hybrid grids. Moreover, some of OpenFOAM’s preprocessing utilities can be used on the mesh. Parallelization of the face-based algorithm, for which atomic operations are needed due to the shared-memory model, is also presented. For several mesh sizes, a 2.13x speed-up is obtained with the face-based approach over the element-based approach.
Keywords: cell centered finite volume method, compressible Euler equations, OpenFOAM mesh format, OpenMP
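To make the 'face-based' versus 'element-based' distinction concrete, the fragment below sketches a face-based residual assembly for a cell-centered finite-volume scheme: each interior face is visited once and its flux is scattered to both adjacent cells, following the owner/neighbour convention of the OpenFOAM mesh format. The connectivity arrays are made up for illustration, and the flux function is a plain average rather than AUSM+-up.

```python
# Face-based residual assembly sketch for a cell-centered finite-volume solver.
# Each face stores its owner and neighbour cell, so every flux is computed once
# and added/subtracted to the two adjacent cells, avoiding the duplicated
# face-flux evaluations of an element-based loop. Placeholder numerics only.
import numpy as np

n_cells = 4
# Hypothetical connectivity: (owner cell, neighbour cell, face area) per interior face.
faces = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0)]

u = np.array([1.0, 0.8, 0.5, 0.2])      # cell-centered state (toy scalar field)
residual = np.zeros(n_cells)

def face_flux(u_owner, u_neighbour):
    """Placeholder numerical flux (central average); AUSM+-up would go here."""
    return 0.5 * (u_owner + u_neighbour)

for owner, neigh, area in faces:         # single sweep over faces
    f = face_flux(u[owner], u[neigh]) * area
    residual[owner] -= f                 # flux leaves the owner cell
    residual[neigh] += f                 # ...and enters the neighbour cell

print(residual)
# In a shared-memory (OpenMP-style) parallel face loop, these two scatter
# updates are the accumulations that would require atomic operations.
```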
Procedia PDF Downloads 320
25376 Ethics Can Enable Open Source Data Research
Authors: Dragana Calic
Abstract:
The openness, availability and sheer volume of big data have provided what some regard as an invaluable and rich dataset. Researchers, businesses, advertising agencies and medical institutions, to name only a few, collect, share, and analyze these data to enable their processes and decision making. However, there are important ethical considerations associated with the use of big data. The rapidly evolving nature of online technologies has overtaken the many legislative, privacy, and ethical frameworks and principles that exist. For example, should we obtain consent to use people’s online data, and under what circumstances can privacy considerations be overridden? Current guidance on how to appropriately and ethically handle big data is inconsistent. Consequently, this paper focuses on two quite distinct but related ethical considerations that are at the core of the use of big data for research purposes: empowering the producers of data and empowering researchers who want to study big data. The first consideration focuses on informed consent, which is at the core of empowering producers of data. In this paper, we discuss some of the complexities associated with informed consent and consider studies of producers’ perceptions to inform research ethics guidelines and practice. The second consideration focuses on the researcher. Similarly, we explore studies that focus on researchers’ perceptions and experiences.
Keywords: big data, ethics, producers’ perceptions, researchers’ perceptions
Procedia PDF Downloads 286
25375 Hybrid Reliability-Similarity-Based Approach for Supervised Machine Learning
Authors: Walid Cherif
Abstract:
Data mining has seen big advances over recent years because of the spread of the internet, which generates a tremendous volume of data every day, and also because of the immense advances in the technologies that facilitate the analysis of these data. In particular, classification techniques are a subdomain of data mining that determines to which group each data instance belongs within a given dataset. They are used to classify data into different classes according to desired criteria. Generally, a classification technique is either statistical or machine-learning based, and each type of technique has its own limits. Nowadays, data are becoming increasingly heterogeneous; consequently, current classification techniques are encountering many difficulties. This paper defines new measure functions to quantify the resemblance between instances and then combines them in a new approach that differs from existing algorithms in its reliability computations. Results of the proposed approach exceeded those of the most common classification techniques, with an F-measure exceeding 97% on the Iris dataset.
Keywords: data mining, knowledge discovery, machine learning, similarity measurement, supervised classification
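As a hedged illustration of the evaluation setup described above, the sketch below scores a generic similarity-based classifier with the F-measure on the Iris dataset; the Gaussian similarity and class-average voting are one possible choice of "measure function" and are not the authors' reliability computation.

```python
# Generic similarity-based classifier evaluated with the F-measure on Iris.
# Illustrative only; not the proposed reliability-similarity approach.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

def similarity(a, b):
    """Gaussian similarity in feature space (one possible measure function)."""
    return np.exp(-np.linalg.norm(a - b) ** 2)

# Classify each test instance by the class whose training instances are, on
# average, the most similar to it.
classes = np.unique(y_train)
preds = []
for x in X_test:
    scores = [np.mean([similarity(x, t) for t in X_train[y_train == c]])
              for c in classes]
    preds.append(classes[int(np.argmax(scores))])

print("macro F-measure:", round(f1_score(y_test, preds, average="macro"), 3))
```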
Procedia PDF Downloads 465
25374 Seismic Data Scaling: Uncertainties, Potential and Applications in Workstation Interpretation
Authors: Ankur Mundhra, Shubhadeep Chakraborty, Y. R. Singh, Vishal Das
Abstract:
Seismic data scaling affects the dynamic range of the data, and with the present-day lower cost of storage and higher reliability of hard-disk data, scaling is not generally suggested. However, when dealing with data of different vintages, which may have been processed in 16 bits or even 8 bits and need to be processed together with available 32-bit data, scaling is performed. Scaling also amplifies low-amplitude events in the deeper region, which otherwise disappear because high-amplitude shallow events saturate the amplitude scale. We have focused on the significance of scaling data to aid interpretation. This study elucidates a proper seismic loading procedure for workstations that does not rely on the default preset parameters available in most software suites. Differences in, and the distribution of, amplitude values at different depths in the seismic data are probed in this exercise. Proper loading parameters are identified, and the associated steps that need to be taken care of while loading data are explained. Finally, the exercise interprets the uncertainties that might arise when correlating scaled and unscaled versions of seismic data with synthetics. As the seismic well tie correlates the seismic reflection events with well markers, it is used in our study to identify regions that are enhanced and/or affected by the scaling parameter(s).
Keywords: clipping, compression, resolution, seismic scaling
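A small numerical illustration of the dynamic-range point above: when a trace containing a strong shallow event and a weak deep event is scaled into 8-bit integers, the deep event is quantized away, while 16- and 32-bit representations preserve it. The amplitudes below are synthetic and only meant to mimic the effect.

```python
# Toy demonstration of bit-depth scaling and loss of low-amplitude deep events.
import numpy as np

trace = np.zeros(1000, dtype=np.float32)
trace[100] = 1.0e4        # high-amplitude shallow event
trace[800] = 2.0          # low-amplitude deep event

def to_int(trace, bits):
    """Scale a trace into a signed integer range of the given bit depth."""
    max_int = 2 ** (bits - 1) - 1
    scale = max_int / np.abs(trace).max()
    return np.round(trace * scale).astype(np.int64), scale

for bits in (8, 16, 32):
    scaled, scale = to_int(trace, bits)
    deep = scaled[800] / scale          # recover the deep event after scaling
    print(f"{bits:2d}-bit: recovered deep amplitude = {deep:.3f}")
# Expected behaviour: the 8-bit version rounds the deep event to zero, whereas
# the 16- and 32-bit representations keep it.
```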
Procedia PDF Downloads 471
25373 Clinical Study of the Prunus dulcis (Almond) Shell Extract on Tinea capitis Infection
Authors: Nasreen Thebo, W. Shaikh, A. J. Laghari, P. Nangni
Abstract:
Prunus dulcis (almond) shell extract is demonstrated for its biomedical applications. The shell extract was prepared by the Soxhlet method and further characterized by UV-Visible spectrophotometry, atomic absorption spectrophotometry (AAS), FTIR and GC-MS techniques. In this study, the antifungal activity of the almond shell extract was observed against clinically isolated pathogenic fungi by the strip method. The antioxidant potential of the crude shell extract was evaluated using the DPPH (2,2-diphenyl-1-picrylhydrazyl) radical scavenging system. The possibility of short-term therapy was only 20 days. The total antioxidant activity varied from 94.38 to 95.49%, and the total phenolic content was found to be 4.455 mg/g in the almond shell extract. Finally, the results indicate a great therapeutic potential against Tinea capitis infection of the scalp. This study of the shell extract provides scientific evidence of clinical efficacy; the extract was found to be useful in the treatment of dermatologic disorders and can without doubt be recommended for patenting.
Keywords: Tinea capitis, DPPH, FTIR, GC-MS, therapeutic treatment
Procedia PDF Downloads 380
25372 Biosorption of Lead (II) from Aqueous Solution Using Marine Algae Chlorella pyrenoidosa
Authors: Pramod Kumar, A. V. N. Swamy, C. V. Sowjanya, C. V. Ramachandra Murthy
Abstract:
Biosorption is one of the effective methods for the removal of heavy metal ions from aqueous solutions. Results are presented showing the sorption of Pb(II) from solution by biomass of the commonly available marine alga Chlorella sp. The ability of the marine alga Chlorella pyrenoidosa to remove the heavy metal ion Pb(II) from aqueous solutions has been studied in this work. The biosorption properties of the biosorbent, such as equilibrium agitation time, optimum pH, temperature and initial solute concentration, were investigated with respect to metal uptake, and the respective profiles are shown. The maximum metal uptake was found to be 10.27 mg/g. To achieve this metal uptake, the optimum conditions were found to be an equilibrium agitation time of 30 min, an optimum pH of 4.6, and an initial solute concentration of 60 ppm. The lead concentration was determined by atomic absorption spectrometry. The process was found to be well fitted by pseudo-second-order kinetics.
Keywords: biosorption, heavy metal ions, agitation time, metal uptake, aqueous solution
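For reference, the pseudo-second-order kinetic model mentioned above is conventionally written as follows, where q_t and q_e are the amounts adsorbed at time t and at equilibrium and k_2 is the rate constant; this is the standard form, quoted here for context rather than from the paper's fitted parameters.

```latex
\frac{dq_t}{dt} = k_2\,(q_e - q_t)^{2}
\quad\Longrightarrow\quad
\frac{t}{q_t} = \frac{1}{k_2\,q_e^{2}} + \frac{t}{q_e}.
```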
Procedia PDF Downloads 373
25371 A Platform to Screen Targeting Molecules of Ligand-EGFR Interactions
Authors: Wei-Ting Kuo, Feng-Huei Lin
Abstract:
The epidermal growth factor receptor (EGFR) is often constitutively stimulated in cancer owing to the binding of ligands such as epidermal growth factor (EGF), so it is necessary to investigate the interaction between EGFR and targeting biomolecules that compete with ligand binding. This study focuses on the binding affinity and adhesion force of two targeting products, an anti-EGFR monoclonal antibody (mAb) and peptide A, to EGFR, in comparison with EGF. Surface plasmon resonance (SPR) was used to obtain the equilibrium dissociation constant to evaluate the binding affinity. Atomic force microscopy (AFM) was performed to detect the adhesion force. The results showed that the binding affinity of the mAb to EGFR was higher than that of EGF to EGFR, while that of peptide A to EGFR was the lowest. The adhesion force between EGFR and the mAb was higher than that of EGF, while that of peptide A to EGFR was the lowest. From these studies, we can conclude that the mAb had better adhesion force and binding affinity to EGFR than EGF and peptide A. SPR and AFM can readily confirm the interaction between a receptor and a targeting ligand, providing a platform to screen ligands for receptor targeting and drug delivery.
Keywords: adhesion force, binding affinity, epidermal growth factor receptor, target molecule
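For context, the equilibrium dissociation constant obtained from SPR relates the kinetic rate constants as below; a smaller K_D corresponds to a higher binding affinity, consistent with the ranking reported above. This is the standard relation, not a value from the study.

```latex
K_D \;=\; \frac{k_{\text{off}}}{k_{\text{on}}}
      \;=\; \frac{[\mathrm{EGFR}]\,[\mathrm{L}]}{[\mathrm{EGFR{\cdot}L}]}.
```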
Procedia PDF Downloads 433
25370 Assessment of Pollution Cd, Pb and As in Rice Cultivation in Savadkooh
Authors: Ghazal Banitahmasb, Nazanin Khakipour
Abstract:
More than 90 percent of the world's rice is produced and consumed in Asia. Heavy metal contamination of soil and water environments is a serious and growing problem: toxins from human activities pollute soils to the point that permissible metal concentrations are exceeded. This study was carried out on 7 samples of rice cultivated in Savadkooh in Mazandaran province and on the soils in which they were grown. The amounts of the heavy metals Arsenic, Lead and Cadmium were measured by atomic absorption. The test results showed that the highest amount of Lead, 0.768 ppm, was in the rice strain Tarom A; the maximum amount of Cadmium, 0.09 ppm, was in the rice strain Hashemi B; and the highest level of Arsenic, 0.39 ppm, was in red Tarom. According to the results obtained in this study, Arsenic, Cadmium and Lead can be found in all rice grown in Savadkooh city, but the measured values are less than those specified in the national standard, and its consumption is safe for consumers. These results also indicate a positive and significant correlation between the studied heavy metals in the soil and in the rice strains grown there; as the amount of heavy metals in the soil increases, the amount of these metals in the crops grown on it also increases.
Keywords: heavy metals, Oryza sativa L., soil pollution, Savadkooh
Procedia PDF Downloads 415
25369 An Experimental Study on the Effect of Heat Input on the Weld Efficiency of TIG-MIG Hybrid Welding of Type-304 Austenitic Stainless Steel
Authors: Emmanuel Ogundimu, Esther Akinlabi, Mutiu Erinosho
Abstract:
Welding is described as the process of joining metals so that bonding can be created as a result of inter-atomic penetration. This study investigated the influence of heat input on the efficiency of welded joints of 304 stainless steel. Three welded joints were made from two similar 304 stainless steel plates of thickness 6 mm. The tensile results obtained showed that the maximum average tensile strength of 672 MPa was possessed by sample A1, which had the lowest heat input. It was discovered that the tensile strength, % elongation and weld joint efficiency decreased with increasing heat input into the weld. The average % elongation for all the samples ranged from 28.4% to 36.5%. Sample A1 had the highest joint efficiency of 94.5%. An optimum welding current of 190 A for TIG-MIG hybrid welding of type-304 austenitic stainless steel can therefore be recommended for advanced technological applications such as aircraft manufacturing, the nuclear industry, the automobile industry, and the processing industry.
Keywords: microhardness, microstructure, tensile, MIG welding, process, shear stress, TIG welding, TIG-MIG welding
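Weld joint efficiency as quoted above is conventionally defined as the ratio of the tensile strength of the welded joint to that of the base metal. Under this definition, sample A1's 94.5% efficiency and 672 MPa joint strength would imply a base-metal tensile strength of roughly 711 MPa; this figure is inferred here for illustration and is not stated in the abstract.

```latex
\eta_{\text{joint}} \;=\; \frac{\sigma_{\text{UTS, weld}}}{\sigma_{\text{UTS, base metal}}} \times 100\% .
```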
Procedia PDF Downloads 200
25368 Association of Social Data as a Tool to Support Government Decision Making
Authors: Diego Rodrigues, Marcelo Lisboa, Elismar Batista, Marcos Dias
Abstract:
Based on data on child labor, this work raises questions about how to understand and locate the factors that make up child labor rates and which properties are important for analyzing these cases. Using data mining techniques to discover valid patterns in Brazilian social databases, data on child labor in the State of Tocantins (located in the north of Brazil, with a territory of 277,000 km2 comprising 139 counties) were evaluated. This work aims to detect factors that are deterministic for the practice of child labor and their relationships with financial, educational, regional and social indicators, generating information that is not explicit in the government database and thus enabling better monitoring and updating of policies for this purpose.
Keywords: social data, government decision making, association of social data, data mining
Procedia PDF Downloads 371
25367 A Particle Filter-Based Data Assimilation Method for Discrete Event Simulation
Authors: Zhi Zhu, Boquan Zhang, Tian Jing, Jingjing Li, Tao Wang
Abstract:
Data assimilation is a hybrid model- and data-driven method that dynamically fuses new observation data with a numerical model to iteratively approach the real system state. It is widely used in state prediction and parameter inference for continuous systems. Because of the non-linearity and non-Gaussianity of discrete event systems, the traditional Kalman filter, which is based on linear and Gaussian assumptions, cannot perform data assimilation for such systems, so the particle filter has gradually become a technical approach for discrete event simulation data assimilation. Hence, we propose a particle filter-based discrete event simulation data assimilation method and take an unmanned aerial vehicle (UAV) maintenance service system as a proof of concept for simulation experiments. The experimental results showed that the filtered state data are closer to the real state of the system, which verifies the effectiveness of the proposed method. This research can provide a reference framework for the data assimilation process of other complex nonlinear systems, such as discrete-time and agent-based simulations.
Keywords: discrete event simulation, data assimilation, particle filter, model and data-driven
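A minimal bootstrap particle filter sketch of the assimilation loop described above follows: predict the particles with the (nonlinear) model, weight them by the likelihood of the new observation, and resample. The scalar toy state-transition function is a stand-in for one advance of a discrete event simulation, not the authors' UAV maintenance system.

```python
# Bootstrap particle filter sketch for data assimilation on a toy nonlinear model.
import numpy as np

rng = np.random.default_rng(0)
n_particles, n_steps = 500, 50
process_std, obs_std = 0.5, 1.0

def model_step(x):
    """Toy nonlinear state transition (placeholder for one simulation advance)."""
    return 0.9 * x + 2.0 * np.sin(0.1 * x)

# Generate a synthetic "true" trajectory and noisy observations of it.
truth, observations = [], []
x_true = 5.0
for _ in range(n_steps):
    x_true = model_step(x_true) + rng.normal(0, process_std)
    truth.append(x_true)
    observations.append(x_true + rng.normal(0, obs_std))

# Bootstrap particle filter loop.
particles = rng.normal(5.0, 2.0, n_particles)
estimates = []
for z in observations:
    # 1. Predict: run every particle through the model plus process noise.
    particles = model_step(particles) + rng.normal(0, process_std, n_particles)
    # 2. Update: weight particles by the Gaussian observation likelihood.
    weights = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
    weights /= weights.sum()
    # 3. Estimate and resample (multinomial resampling).
    estimates.append(np.sum(weights * particles))
    particles = rng.choice(particles, size=n_particles, p=weights)

rmse = np.sqrt(np.mean((np.array(estimates) - np.array(truth)) ** 2))
print(f"filter RMSE vs. truth: {rmse:.3f} (observation noise std = {obs_std})")
```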
Procedia PDF Downloads 20
25366 Soil Degradation Resulting from Migration of Ion Leachate in Gosa Dumpsite, Abuja
Authors: S. Ebisintei, M. A. Olutoye, A. S. Kovo, U. G. Akpan
Abstract:
The effect of soil degradation due to ion leachate migration was investigated using a dumpsite located in the Idu industrial area of Abuja. This was done to assess the health and environmental pollution consequences, for inhabitants around the settlement, of heavy metal concentrations in the soil. Soil samples collected from the four cardinal points and at the center of the site during the dry and wet seasons were pretreated and digested, and the heavy metal concentrations present were analyzed using an atomic absorption spectrophotometer. The concentrations of Pb, Cu, Mn, Ni and Cr were determined, as they were for a control sample obtained 300 m away from the dumpsite. Water samples were collected from three wells to test the physiochemical properties of pH, COD, BOD, DO, hardness, conductivity, and alkalinity. The results showed a significant difference in the concentrations of toxic heavy metals at the dumpsite compared with the control sample. A mathematical model was developed to predict the heavy metal concentrations beyond the sampling point. The results indicate that the metal concentrations in both the dry and wet seasons were above the standards set by the WHO and SON. The trend, if unrestrained, portends danger to human life and reduces agricultural productivity and sustainability.
Keywords: soil degradation, ion leachate, productivity, environment, sustainability
Procedia PDF Downloads 347
25365 Outlier Detection in Stock Market Data using Tukey Method and Wavelet Transform
Authors: Sadam Alwadi
Abstract:
Outlier values are a problem that frequently occurs in the data observation or recording process. Thus, the need for data imputation has become an essential matter. In this work, the methods described in prior work are used to detect outlier values in a collection of stock market data. In order to implement the detection and find solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). The Tukey and Maximum Overlapping Discrete Wavelet Transform (MODWT) methods are used to detect and impute the outlier values.
Keywords: outlier values, imputation, stock market data, detecting, estimation
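For illustration, the Tukey (boxplot) fences referred to above are sketched below, together with a simple median-based imputation of the flagged values. The closing-price series is synthetic; the ASE data and the MODWT stage of the paper's approach are not reproduced.

```python
# Tukey-fence outlier detection with a simple median imputation. Synthetic data.
import numpy as np

rng = np.random.default_rng(1)
prices = rng.normal(2.5, 0.1, 200)        # stand-in for daily closing prices
prices[[20, 75, 150]] = [5.0, 0.2, 4.1]   # inject artificial outliers

q1, q3 = np.percentile(prices, [25, 75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr   # Tukey fences

outliers = (prices < lower) | (prices > upper)
print("flagged indices:", np.flatnonzero(outliers))

# Impute flagged values with the median of the non-outlying observations.
cleaned = prices.copy()
cleaned[outliers] = np.median(prices[~outliers])
```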
Procedia PDF Downloads 82
25364 PEINS: A Generic Compression Scheme Using Probabilistic Encoding and Irrational Number Storage
Authors: P. Jayashree, S. Rajkumar
Abstract:
With social networks and smart devices generating a multitude of data, effective data management is the need of the hour for networks and cloud applications. Some applications need effective storage while others need effective communication over networks, and data reduction comes as a handy solution to meet both requirements. Most data compression techniques are based on data statistics and may result in either lossy or lossless data reduction. Though lossy reduction produces better compression ratios than lossless methods, many applications require data accuracy and minute details to be preserved. A variety of data compression algorithms exist in the literature for different forms of data, such as text, image, and multimedia data. In the proposed work, a generic progressive compression algorithm based on probabilistic encoding, called PEINS, is projected as an enhancement over the irrational number storage coding technique to cater to the storage issues of increasing data volumes as a cost-effective solution, which also offers data security as a secondary outcome to some extent. The proposed work reveals cost effectiveness in terms of a better compression ratio with no deterioration in compression time.
Keywords: compression ratio, generic compression, irrational number storage, probabilistic encoding
Procedia PDF Downloads 296
25363 IoT Device Cost Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework
Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe
Abstract:
This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway and presents the design of the framework for the data privacy model and the data analytics framework for real-time analysis using a machine learning method. The paper begins with the system analysis, the system architecture and its component design, as well as the overall system operations. The results obtained from this study on the data privacy model show that when two or more data privacy models are combined, we tend to have stronger privacy for our data, and that the fog storage gateway has several advantages over traditional cloud storage: our results show that fog has reduced latency/delay, lower bandwidth consumption, and lower energy usage when compared with cloud storage; therefore, fog storage will help to lessen excessive cost. This paper dwells mainly on the system descriptions; the researchers focused on the research design and the framework design for the data privacy model, data storage, and real-time analytics. This paper also shows the major system components and their framework specifications. Lastly, the overall research system architecture is shown, with its structure and its interrelationships.
Keywords: IoT, fog, cloud, data analysis, data privacy
Procedia PDF Downloads 100
25362 Physical Properties of Nano-Sized Poly-N-Isopropylacrylamide Hydrogels
Authors: Esra Alveroglu Durucu, Kenan Koc
Abstract:
In this study, we synthesized and characterized nano-sized poly-N-isopropylacrylamide (PNIPAM) hydrogels. N-isopropylacrylamide (NIPAM) micro- and macrogels are known as thermosensitive colloidal structures, and they respond to changes in environmental conditions such as temperature and pH. Here, nano-sized gels were synthesized via the precipitation copolymerization method. N,N-methylenebisacrylamide (BIS) and ammonium persulfate (APS) were used as crosslinker and initiator, respectively. 8-Hydroxypyrene-1,3,6-trisulfonic acid (pyranine, Py) molecules were used to tune the particle size and thus the physical properties of the nano-sized hydrogels. Fluorescence spectroscopy, atomic force microscopy and light scattering methods were used to characterize the synthesized hydrogels. The results show that the gel size decreased from 550 to 140 nm with an increasing amount of the ionic molecule, due to the electrostatic behavior of the ionic side groups of pyranine. Light scattering experiments demonstrate that the lower critical solution temperature (LCST) of the gels shifts to a lower temperature with decreasing gel size, due to the hydrophobicity–hydrophilicity balance of the polymer chains.
Keywords: hydrogels, lower critical solution temperature, nanogels, poly(n-isopropylacrylamide)
Procedia PDF Downloads 247