Search results for: calibration data requirements
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26413


25453 The Perspective on Data Collection Instruments for Younger Learners

Authors: Hatice Kübra Koç

Abstract:

For academia, collecting reliable and valid data is one of the most significant issues for researchers. However, the procedure is not the same for all target groups: when collecting data from teenagers, young adults, or adults, researchers can use common instruments such as questionnaires, interviews, and semi-structured interviews, yet for young and very young learners such reliable and valid data collection tools cannot be easily designed or applied. In this study, common data collection tools are first examined for the ‘very young’ and ‘young learner’ participant groups, since the quality and efficiency of an academic study rest mainly on valid and correct data collection and data analysis procedures. Secondly, two data collection instruments for very young and young learners are presented and their efficacy is discussed. Finally, a suggested data collection tool, a performance-based questionnaire developed specifically for the ‘very young’ and ‘young learner’ participant groups in the field of teaching English to young learners as a foreign language, is presented. The design procedure and suggested items/factors for this tool are given at the end of the study to help researchers who work with young and very young learners.

Keywords: data collection instruments, performance-based questionnaire, young learners, very young learners

Procedia PDF Downloads 69
25452 Generating Swarm Satellite Data Using Long Short-Term Memory and Generative Adversarial Networks for the Detection of Seismic Precursors

Authors: Yaxin Bi

Abstract:

Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), generative models tailored to time-series data, to generate synthetic time series from Swarm satellite data for use in detecting seismic anomalies. The LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled, often producing non-informative values, although they were able to capture the distribution of the time series. These findings highlight both the promise and the challenges of applying deep learning to synthetic data generation, underscoring its potential for generating synthetic electromagnetic satellite data.
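The abstract does not publish code, but a minimal sketch of the data side of the LSTM approach may help: framing a satellite time series as (window, next value) pairs, then rolling a trained predictor forward on its own outputs to produce a synthetic continuation. The window length and the stand-in model below are illustrative assumptions, not the authors' settings.

```python
# Sketch (not the authors' code): supervised framing of a time series
# plus a teacher-free roll-out, the two generic steps around an LSTM
# used for synthetic-series generation. Window length 4 is assumed.

def make_windows(series, window=4):
    """Return (inputs, targets): each input is a sliding window and
    each target is the value that follows it."""
    inputs, targets = [], []
    for i in range(len(series) - window):
        inputs.append(series[i:i + window])
        targets.append(series[i + window])
    return inputs, targets

def roll_out(model, seed, steps):
    """Generate a synthetic continuation by repeatedly predicting the
    next value and feeding it back in."""
    history = list(seed)
    for _ in range(steps):
        history.append(model(history[-len(seed):]))
    return history[len(seed):]

# Toy demonstration with a mean-predictor standing in for a trained LSTM.
series = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
X, y = make_windows(series, window=4)
mean_model = lambda window: sum(window) / len(window)
synthetic = roll_out(mean_model, seed=series[-4:], steps=3)
```

In a real run the `mean_model` placeholder would be replaced by the trained LSTM's one-step prediction.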

Keywords: LSTM, GAN, earthquake, synthetic data, generative AI, seismic precursors

Procedia PDF Downloads 14
25451 Vulnerability Assessment of Groundwater Quality Deterioration Using PMWIN Model

Authors: A. Shakoor, M. Arshad

Abstract:

The utilization of groundwater resources for irrigation has increased significantly during the last two decades due to constrained canal water supplies. More than 70% of the farmers in Punjab, Pakistan, depend directly or indirectly on groundwater to meet their crop water demands, and this unchecked paradigm shift has resulted in aquifer depletion and deterioration. Therefore, comprehensive research was carried out in central Punjab, Pakistan, on the spatiotemporal variation in groundwater level and quality. Processing MODFLOW for Windows (PMWIN) and the MT3D solute transport model were used to predict groundwater level and quality through 2030. A comprehensive data set of aquifer lithology, canal network, groundwater level, groundwater salinity, evapotranspiration, groundwater abstraction, recharge, etc. was used in developing the PMWIN model. The model was successfully calibrated and validated with respect to groundwater level for the periods 2003 to 2007 and 2008 to 2012, respectively. The coefficient of determination (R²) and model efficiency (MEF) for the calibration and validation periods were 0.89 and 0.98, respectively, indicating a high level of agreement between calculated and measured data. For the solute transport model (MT3D), measured advection and dispersion parameters were used. The model was then run for future scenarios up to 2030, assuming no major change in climate and a gradually increasing groundwater abstraction rate. The predicted results revealed that groundwater would decline by 0.0131 to 1.68 m/year during 2013 to 2030, with the maximum decline on the lower side of the study area, where the canal network is sparse. This lowering of the groundwater level is likely to increase tubewell installation and pumping costs.
Similarly, the predicted total dissolved solids (TDS) of the groundwater would increase by 6.88 to 69.88 mg/L/year during 2013 to 2030, with the maximum increase again on the lower side. By 2030, good-quality water would decrease by 21.4%, while marginal- and hazardous-quality water would increase by 19.28% and 2%, respectively. The simulated results indicated that the salinity of the study area had increased due to the intrusion of salts. This deterioration of groundwater quality would cause soil salinity and ultimately reduce crop productivity. It was concluded from the model predictions that groundwater quality deteriorated with the depth of the water table, i.e., TDS increased with declining groundwater level. It is recommended that agronomic and engineering practices such as land leveling, rainwater harvesting, skimming wells, and aquifer storage and recovery (ASR) wells be integrated to improve groundwater management for higher crop production in salt-affected soils.
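The abstract reports R² and MEF values of 0.89 and 0.98 for calibration and validation. A short sketch of how these two fit statistics are typically computed may be useful; here MEF is assumed to be the Nash-Sutcliffe efficiency, the usual "model efficiency" measure in groundwater modelling, and the head data are illustrative, not the study's observations.

```python
# Sketch of the two calibration statistics named in the abstract,
# computed on illustrative observed/simulated groundwater heads.

def r_squared(observed, simulated):
    """Square of the Pearson correlation between observed and simulated."""
    n = len(observed)
    mo = sum(observed) / n
    ms = sum(simulated) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(observed, simulated))
    vo = sum((o - mo) ** 2 for o in observed)
    vs = sum((s - ms) ** 2 for s in simulated)
    return cov * cov / (vo * vs)

def nash_sutcliffe(observed, simulated):
    """1 minus residual variance over variance of observations (MEF)."""
    mo = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mo) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

obs = [10.2, 10.0, 9.7, 9.5, 9.1]   # illustrative measured heads (m)
sim = [10.1, 10.0, 9.8, 9.4, 9.2]   # illustrative simulated heads (m)
```

A perfect simulation gives 1.0 for both statistics; values near 1, as reported in the abstract, indicate close agreement.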

Keywords: groundwater quality, groundwater management, PMWIN, MT3D model

Procedia PDF Downloads 363
25450 Estimating Marine Tidal Power Potential in Kenya

Authors: Lucy Patricia Onundo, Wilfred Njoroge Mwema

Abstract:

Rapidly diminishing fossil fuel reserves, their exorbitant cost, and the increasingly apparent contribution of fossil fuels to climate change are a wake-up call to explore renewable energy. Wind, biofuel, and solar power have already become staples of the Kenyan electricity mix. The potential of electric power generation from marine tidal currents is enormous, with oceans covering more than 70% of the earth. However, marine tidal energy in Kenya has yet to be studied thoroughly, despite its promising, cyclic, reliable, and predictable nature and the vast energy contained within it. The high load factors resulting from the fluid properties and the predictable resource characteristics make marine currents particularly attractive for power generation and advantageous compared with other renewables. Global-level resource assessments, oceanographic literature, and data have been compiled in an analysis of the technology-specific requirements of tidal energy technologies and the physical resources. Temporal variations in resource intensity as well as differences among small-scale applications are considered.
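The resource estimates the abstract refers to ultimately rest on the standard kinetic power density of a current, P = ½ρAv³. A small sketch makes the cubic dependence on current speed concrete; the numbers below are illustrative, not Kenyan measurements.

```python
# Sketch: kinetic power in a marine current through a swept area,
# P = 0.5 * rho * A * v**3, optionally derated by turbine efficiency.
# Values are illustrative assumptions, not site data.

RHO_SEAWATER = 1025.0  # kg/m^3, typical seawater density

def tidal_power_watts(area_m2, speed_m_s, efficiency=1.0):
    return 0.5 * RHO_SEAWATER * area_m2 * speed_m_s ** 3 * efficiency

# The same 100 m^2 rotor in a 2.5 m/s spring flow vs a 1 m/s neap flow:
p_spring = tidal_power_watts(100.0, 2.5)
p_neap = tidal_power_watts(100.0, 1.0)
```

Because power scales with the cube of speed, a 2.5x faster current carries about 15.6x the power, which is why temporal variation in resource intensity matters so much for siting.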

Keywords: tidal power, renewable energy, energy assessment, Kenya

Procedia PDF Downloads 548
25449 Evaluating the Use of Swedish by-Product Foundry Sand in Asphalt Mixtures

Authors: Dina Kuttah

Abstract:

It is well known that recycling by-product materials saves natural resources, reduces by-product volumes, and reduces the need for virgin materials. The steel industry produces a myriad of metal components for industrial chains, which in turn generates discarded mineral sand molds. Although these sands are clean before use, after casting they may contain contaminants; therefore, huge quantities of excess by-product foundry sand (BFS) end up occupying large volumes in landfills. In Sweden, approximately 200,000 tonnes of excess BFS are landfilled. The transportation and construction industries have the greatest potential for reusing by-products because they use vast quantities of earthen materials annually. Accordingly, experimental work was undertaken to evaluate the possible use of BFS from two Swedish foundries in a conventional Swedish asphalt mixture. The experimental procedure focused on the dosage, environmental, and technical properties of the same mixture type, ABT 11, with the same bitumen (160/220) but with different proportions of the conventional fine sand replaced by the two BFS. The environmental requirements, in addition to the technical requirements, namely void ratio, static indirect tensile strength ratio, and resilient modulus before and after moisture-induced sensitivity tests, were investigated in the current study. The test results demonstrated that BFS from both foundries can be incorporated in the selected asphalt mixture at specified replacement proportions of the conventional fine sand fraction 0-2 mm, as discussed in the paper.

Keywords: asphalt mixtures, by-product foundry sand, indirect tensile strength, moisture induced sensitivity tests, resilient modulus

Procedia PDF Downloads 123
25448 Critical Success Factors Quality Requirement Change Management

Authors: Jamshed Ahmad, Abdul Wahid Khan, Javed Ali Khan

Abstract:

Managing software quality requirement change is a difficult task in software engineering. Avoiding incoming changes results in user dissatisfaction, while accommodating too many requirement changes may delay product delivery. Poor requirements management is often considered the primary cause of software failure, and it becomes more challenging in global software outsourcing. Addressing success factors in quality requirement change management is needed today due to frequent change requests from end-users. In this research study, success factors are identified and scrutinized with the help of a systematic literature review (SLR). In total, 16 success factors were identified that significantly impact software quality requirement change management. The findings show that Proper Requirement Change Management, Rapid Delivery, Quality Software Product, Access to Market, Project Management, Skills and Methodologies, Low Cost/Effort Estimation, Clear Plan and Road Map, Agile Processes, Low Labor Cost, User Satisfaction, Communication/Close Coordination, Proper Scheduling and Time Constraints, Frequent Technological Changes, Robust Model, and Geographical Distribution/Cultural Differences are the key factors that influence software quality requirement change. The recognized success factors were validated with the help of various research methods, i.e., case studies, interviews, surveys, and experiments, and were then analyzed by continent, database, company size, and time period. Based on these findings, requirement changes can be managed more effectively.

Keywords: global software development, requirement engineering, systematic literature review, success factors

Procedia PDF Downloads 186
25447 Generation of Quasi-Measurement Data for On-Line Process Data Analysis

Authors: Hyun-Woo Cho

Abstract:

To ensure the safety of a manufacturing process, one should quickly identify an assignable cause of a fault on an on-line basis. To this end, many statistical techniques, including linear and nonlinear methods, have been utilized. However, such methods suffer from a major problem of small sample size, mostly attributable to the characteristics of the empirical models used as reference models. This work presents a new method to overcome the insufficiency of measurement data in monitoring and diagnosis tasks. Quasi-measurement data are generated from existing data based on two indices: similarity and importance. The performance of the method is demonstrated using a real data set. The results show that the presented method handles the insufficiency problem successfully. In addition, it is quite efficient in terms of computational speed and memory usage, so on-line implementation for monitoring and diagnosis purposes is straightforward.
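The abstract states that quasi-measurements are generated from existing data via similarity and importance indices, but does not give the formulas. One plausible reading, offered here only as a hedged sketch, is to form each quasi-sample as a weighted blend of existing samples, with weights combining closeness to a reference point and a per-sample importance score.

```python
# Sketch of one plausible reading of the abstract's idea (the paper's
# exact indices are not published): each quasi-measurement is a
# weighted average of existing samples, weighted by an inverse-distance
# similarity to a reference point times an assumed importance score.

def similarity(a, b):
    """Inverse-distance similarity between two measurement vectors."""
    d2 = sum((x - y) ** 2 for x, y in zip(a, b))
    return 1.0 / (1.0 + d2)

def quasi_sample(reference, samples, importance):
    """Blend existing samples into one quasi-measurement."""
    weights = [similarity(reference, s) * w
               for s, w in zip(samples, importance)]
    total = sum(weights)
    dim = len(reference)
    return [sum(w * s[i] for w, s in zip(weights, samples)) / total
            for i in range(dim)]

samples = [[1.0, 2.0], [1.2, 2.1], [5.0, 9.0]]   # illustrative data
importance = [1.0, 1.0, 0.1]                      # assumed scores
new_point = quasi_sample([1.1, 2.0], samples, importance)
```

The generated point lands near the similar, important samples and is barely pulled toward the distant low-importance one, which is the qualitative behavior the two indices are meant to produce.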

Keywords: data analysis, diagnosis, monitoring, process data, quality control

Procedia PDF Downloads 463
25446 Durability of Reinforced Concrete Structure on Very High Aggressive Environment: A Case Study

Authors: Karla Peitl Miller, Leomar Bravin Porto, Kaitto Correa Fraga, Nataniele Eler Mendes

Abstract:

This paper presents the evaluation and study of a real reinforced concrete structure, a fertilizer storage building at a Vale port in Brazil, which has recently been under refurbishment. The data shared and discussed here show how wrong project-concept choices, combined with a very highly aggressive environment, led to fast degradation, resulting in a hazardous condition and a large, expensive repair needed to guarantee minimum performance and service life. All the steps covered are also shown and discussed, from the first signs of pathological manifestations to the drawing up of the complete revitalization and repair plan. The conclusions of the work make explicit the importance of professional technical qualification, of minimum requirements for design and structural reforms, and, above all, of continuous inspection and diagnostic engineering work.

Keywords: durability, reinforced concrete repair, structural inspection, diagnostic engineering

Procedia PDF Downloads 121
25445 Progress in Accuracy, Reliability and Safety in Firedamp Detection

Authors: José Luis Lorenzo Bayona, Ljiljana Medic-Pejic, Isabel Amez Arenillas, Blanca Castells Somoza

Abstract:

This communication presents the results of a study carried out by the Official Laboratory J. M. Madariaga (LOM) of the Polytechnic University of Madrid to analyze the reliability of methane detection systems used in underground mining. Poor firedamp control at work can cause anything from production stoppages to fatal accidents, and since there is currently a great variety of equipment with different functional characteristics, a study is needed to indicate which measurement principles offer the highest degree of confidence. For the project, a series of fixed, transportable, and portable methane detectors with different measurement principles were selected and subjected to laboratory tests following the methods described in the applicable regulations. The test equipment was that usually used in the certification and calibration of these devices, subject to the LOM quality system, and the tests were carried out on detectors available on the market. The conclusions establish the main advantages and disadvantages of the equipment according to the measurement principle used: catalytic combustion, interferometry, and infrared absorption.

Keywords: ATEX standards, gas detector, methane meter, mining safety

Procedia PDF Downloads 124
25444 Study of Parking Demand for Offices – Case Study: Kolkata

Authors: Sanghamitra Roy

Abstract:

In recent times, India has experienced a phenomenal rise in the number of registered vehicles and vehicular trips, particularly intra-city trips, in most of its urban areas. The increase in vehicle ownership and use has increased parking demand immensely, and accommodating it is now a matter of great concern. Most cities do not have adequate off-street parking facilities, forcing people to park on the streets. This has resulted in decreased carrying capacity, decreased traffic speed, increased congestion, and increased environmental problems. While an integrated multi-modal transportation system is the answer to such problems, parking issues will continue to exist. In Kolkata, only 6.4% of the land is devoted to roads. The consequences of this huge crunch in road space, coupled with increased parking demand, are severe, particularly in the CBD and major commercial areas, making the role of off-street parking facilities in Kolkata even more critical. To meaningfully address parking issues, it is important to identify the factors that influence parking demand so that demand can be assessed and comprehensive parking policies and plans for the city can be formulated. This paper aims at identifying the factors that contribute to parking demand for offices in Kolkata and their degree of correlation with parking demand. The study is limited to home-to-work trips located within the Kolkata Municipal Corporation (KMC), where parking-related issues are most pronounced. The data for the study are collected through personal interviews, questionnaires, and direct observations from offices across the wards of the KMC. SPSS is used for classification and analysis of the data. The findings of this study will help in re-assessing the parking requirements specified in the Kolkata Municipal Corporation Building Rules as a step towards alleviating parking-related issues in the city.
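The "degree of correlation" the abstract measures is typically Pearson's r, the bivariate correlation SPSS reports. A small sketch of that computation, applied to one hypothetical factor (office floor area) against parking demand, illustrates the analysis; the figures are invented, not Kolkata survey data.

```python
# Sketch: Pearson's r between a hypothetical influencing factor and
# parking demand. Data are illustrative, not the study's observations.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

floor_area_sqm = [500, 800, 1200, 1500, 2100]   # hypothetical offices
parking_demand = [12, 20, 27, 36, 50]           # hypothetical car spaces
r = pearson_r(floor_area_sqm, parking_demand)
```

An r close to 1 would mark floor area as a strong candidate factor; the study repeats this for each candidate factor to rank their influence.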

Keywords: building rules, office spaces, parking demand, urbanization

Procedia PDF Downloads 305
25443 Emerging Technology for Business Intelligence Applications

Authors: Hsien-Tsen Wang

Abstract:

Business Intelligence (BI) has long helped organizations make informed decisions based on data-driven insights and gain competitive advantages in the marketplace. In the past two decades, businesses witnessed not only the dramatically increasing volume and heterogeneity of business data but also the emergence of new technologies, such as Artificial Intelligence (AI), Semantic Web (SW), Cloud Computing, and Big Data. It is plausible that the convergence of these technologies would bring more value out of business data by establishing linked data frameworks and connecting in ways that enable advanced analytics and improved data utilization. In this paper, we first review and summarize current BI applications and methodology. Emerging technologies that can be integrated into BI applications are then discussed. Finally, we conclude with a proposed synergy framework that aims at achieving a more flexible, scalable, and intelligent BI solution.

Keywords: business intelligence, artificial intelligence, semantic web, big data, cloud computing

Procedia PDF Downloads 77
25442 Using Equipment Telemetry Data for Condition-Based Maintenance Decisions

Authors: John Q. Todd

Abstract:

Given that modern equipment can provide comprehensive health, status, and error condition data via built-in sensors, maintenance organizations have a new and valuable source of insight to take advantage of. This presentation will expose what these data payloads might look like and how they can be filtered, visualized, calculated into metrics, used for machine learning, and generate alerts for further action.
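The filter, metric, and alert chain the abstract describes can be sketched in a few lines; the field names, window size, and the 80 °C threshold below are assumptions for illustration, not from any particular equipment payload.

```python
# Sketch: turning raw telemetry readings into a rolling metric and a
# threshold alert. Window size and threshold are assumed values.
from collections import deque

def rolling_mean_alerts(readings, window=3, threshold=80.0):
    """Yield (timestamp, rolling mean, alert flag) per reading."""
    buf = deque(maxlen=window)
    out = []
    for ts, temp in readings:
        buf.append(temp)
        mean = sum(buf) / len(buf)
        out.append((ts, round(mean, 2), mean > threshold))
    return out

# Hypothetical temperature telemetry: (timestamp, degrees C)
telemetry = [(0, 70.0), (1, 75.0), (2, 82.0), (3, 90.0), (4, 95.0)]
results = rolling_mean_alerts(telemetry)
```

The rolling mean smooths single-sample spikes, so the alert fires only once the sustained trend crosses the threshold, a common condition-based maintenance pattern.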

Keywords: condition based maintenance, equipment data, metrics, alerts

Procedia PDF Downloads 167
25441 Combination of Electrochemical Impedance Spectroscopy and Electromembrane Extraction for the Determination of Zolpidem Using Modified Screen-Printed Electrode

Authors: Ali Naeemy, Mir Ghasem Hoseini

Abstract:

In this study, for the first time, an analytical method was developed and validated that combines electrochemical impedance spectroscopy and electromembrane extraction (EIS-EME) on a Vulcan/polypyrrole nanocomposite modified screen-printed electrode (PPY–VU/SPE) for accurately quantifying zolpidem. The EME parameters were optimized, including solvent composition, voltage, pH adjustments, and extraction time. Zolpidem was transferred from a donor solution (pH 5) to an acceptor solution (pH 13) using a hollow fiber with 1-octanol as the membrane, driven by a 60 V voltage for 25 minutes, ensuring precise and selective extraction. Compared with the SPE, VU/SPE, and PPY/SPE, the PPY–VU/SPE was much more efficient for zolpidem oxidation. Calibration curves with good linearity were obtained in the concentration range of 2-75 µmol L⁻¹ using EIS-EME, with a detection limit of 0.5 µmol L⁻¹. Finally, EIS-EME with the PPY–VU/SPE was successfully used to determine zolpidem in tablet dosage form, urine, and plasma samples.
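The linearity and detection-limit figures in the abstract come from a least-squares calibration fit. A sketch of that arithmetic follows, using the common ICH convention LOD = 3.3·s/slope (the abstract does not state which formula the authors used); the concentration/response pairs are illustrative.

```python
# Sketch of a calibration-curve fit and a detection-limit estimate
# (ICH convention LOD = 3.3 * residual std / slope; an assumption,
# as the paper's exact formula is not given). Data are illustrative.
import math

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def detection_limit(x, y):
    slope, intercept = fit_line(x, y)
    resid = [b - (slope * a + intercept) for a, b in zip(x, y)]
    s = math.sqrt(sum(r * r for r in resid) / (len(x) - 2))
    return 3.3 * s / slope

conc = [2.0, 10.0, 25.0, 50.0, 75.0]      # umol/L, illustrative
signal = [0.41, 2.02, 5.05, 9.98, 15.1]   # instrument response
slope, intercept = fit_line(conc, signal)
lod = detection_limit(conc, signal)
```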

Keywords: electrochemical impedance spectroscopy, electromembrane extraction, screen printed electrode, zolpidem

Procedia PDF Downloads 19
25440 Ethics Can Enable Open Source Data Research

Authors: Dragana Calic

Abstract:

The openness, availability, and sheer volume of big data have provided what some regard as an invaluable and rich dataset. Researchers, businesses, advertising agencies, and medical institutions, to name only a few, collect, share, and analyze these data to enable their processes and decision-making. However, there are important ethical considerations associated with the use of big data. The rapidly evolving nature of online technologies has overtaken many of the existing legislative, privacy, and ethical frameworks and principles. For example, should we obtain consent to use people’s online data, and under what circumstances can privacy considerations be overridden? Current guidance on how to appropriately and ethically handle big data is inconsistent. Consequently, this paper focuses on two distinct but related ethical considerations at the core of the use of big data for research purposes: empowering the producers of data and empowering researchers who want to study big data. The first consideration focuses on informed consent, which is at the core of empowering producers of data. We discuss some of the complexities associated with informed consent and consider studies of producers’ perceptions to inform research ethics guidelines and practice. The second consideration focuses on the researcher; similarly, we explore studies of researchers’ perceptions and experiences.

Keywords: big data, ethics, producers’ perceptions, researchers’ perceptions

Procedia PDF Downloads 272
25439 Meta Model for Optimum Design Objective Function of Steel Frames Subjected to Seismic Loads

Authors: Salah R. Al Zaidee, Ali S. Mahdi

Abstract:

Except for simple problems of statically determinate structures, optimum design problems in structural engineering have implicit objective functions, where structural analysis and design are required within each search loop. With these implicit functions, the structural engineer is usually forced to write his or her own computer code for analysis, design, and searching for the optimum among many feasible candidates, and cannot take advantage of available software for structural analysis, design, and optimum searching. A meta-model is a regression model used to transform an implicit objective function into an explicit one, which in turn decouples the structural analysis and design processes from the optimum search. With the meta-model, well-known software for structural analysis and design can be used in sequence with optimization software. In this paper, a meta-model has been used to develop an explicit objective function for plane steel frames subjected to dead, live, and seismic forces. The frame topology is assumed predefined based on architectural and functional requirements. Column and beam sections and different connection details are the main design variables in this study. Columns and beams are grouped to reduce the number of design variables and to make the problem similar to that adopted in engineering practice. Data for the implicit objective function were generated from analysis and assessment of many design proposals with CSI SAP software. These data were then used in SPSS software to develop a pure quadratic nonlinear regression model for the explicit objective function. Good correlations, with coefficients R² in the range of 0.88 to 0.99, were noted between the original implicit functions and the corresponding explicit functions generated with the meta-model.
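The meta-model idea can be sketched on a single design variable: sample (design, response) pairs from an external analysis run, then fit a pure quadratic surrogate by least squares. This is a one-variable illustration under assumed data, not the paper's multi-variable SPSS model.

```python
# Sketch of the meta-model step: fit y = b0 + b1*x + b2*x^2 to sampled
# (design variable, cost) pairs via the normal equations, giving an
# explicit surrogate objective. Data are illustrative.

def solve3(A, b):
    """Gauss-Jordan elimination for a 3x3 system."""
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_quadratic(x, y):
    """Least-squares b0, b1, b2 from the normal equations."""
    n = len(x)
    s = lambda p: sum(v ** p for v in x)
    A = [[n, s(1), s(2)], [s(1), s(2), s(3)], [s(2), s(3), s(4)]]
    b = [sum(y),
         sum(v * w for v, w in zip(x, y)),
         sum(v * v * w for v, w in zip(x, y))]
    return solve3(A, b)

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0 + 3.0 * v + 0.5 * v * v for v in x]   # exactly quadratic data
b0, b1, b2 = fit_quadratic(x, y)
```

Once fitted, the explicit polynomial replaces the structural analysis inside the search loop, which is the decoupling the abstract describes.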

Keywords: meta-model, objective function, steel frames, seismic analysis, design

Procedia PDF Downloads 227
25438 Applying Failure Modes and Effect Analysis Concept in a Global Software Development Process

Authors: Camilo Souza, Lidia Melo, Fernanda Terra, Francisco Caio, Marcelo Reis

Abstract:

SIDIA is a research and development (R&D) institute that takes part in Samsung’s global software development (GSD) process. SIDIA’s Model Team (MT) is part of Samsung’s Mobile Division Area, which is responsible for the development of the Android releases embedded in Samsung mobile devices. Basically, in this software development process, the kickoff occurs in some strategic countries (e.g., South Korea), where software requirements are applied and the initial software tests are performed. When the software achieves a more mature level, a new branch is derived, and development continues in subsidiaries in other strategic countries (e.g., SIDIA-Brazil). Even in the newly created branches, there are several interactions between developers of different nationalities in order to fix bugs reported during test activities, apply specific requirements from partners, and develop new features. Although the GSD strategy contributes to improving software development, it also introduces several challenges. In this paper, we share initial results on the application of the failure modes and effects analysis (FMEA) concept in the software development process followed by SIDIA’s Model Team. The main goal was to identify and mitigate potential process failures through the application of recommended actions. The initial results show that the FMEA concept allows us to identify the potential failures in our GSD process and to propose corrective actions to mitigate them. Finally, FMEA encouraged members of different teams to take actions that contribute to improving our GSD process.
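The core FMEA bookkeeping is ranking failure modes by Risk Priority Number, RPN = severity × occurrence × detection, each typically scored 1-10. A sketch follows; the failure modes and scores are hypothetical, not SIDIA's actual FMEA worksheet.

```python
# Sketch of standard FMEA prioritization: compute RPN for each failure
# mode and rank, highest risk first. Modes and scores are hypothetical.

def rpn(severity, occurrence, detection):
    """Risk Priority Number, each factor scored 1-10."""
    return severity * occurrence * detection

failure_modes = [
    ("requirement misread across sites", 8, 5, 6),
    ("merge conflict in derived branch", 5, 7, 3),
    ("untested partner-specific feature", 9, 3, 7),
]

ranked = sorted(
    ((name, rpn(s, o, d)) for name, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
```

Recommended actions then target the top of the list first, and the RPNs are recomputed after mitigation to verify the risk actually dropped.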

Keywords: global software development, potential failures, FMEA, recommended actions

Procedia PDF Downloads 208
25437 Hybrid Reliability-Similarity-Based Approach for Supervised Machine Learning

Authors: Walid Cherif

Abstract:

Data mining has seen great advances over recent years because of the spread of the Internet, which generates a tremendous volume of data every day, and because of immense advances in the technologies that facilitate the analysis of these data. In particular, classification techniques are a subdomain of data mining that determines to which group each data instance belongs within a given dataset. They are used to classify data into different classes according to desired criteria. Generally, a classification technique is either statistical or machine learning based, and each type has its own limits. Nowadays, data are becoming increasingly heterogeneous; consequently, current classification techniques encounter many difficulties. This paper defines new measure functions to quantify the resemblance between instances and then combines them in a new approach that differs from existing algorithms in its reliability computations. Results of the proposed approach exceeded most common classification techniques, with an F-measure exceeding 97% on the iris dataset.
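The paper's exact measure functions are not given in the abstract, but the family of similarity-based classification it builds on can be sketched: assign each instance the class of its most similar training instance under some resemblance measure. The inverse-distance similarity and toy data below are assumptions for illustration.

```python
# Sketch of similarity-based supervised classification (not the paper's
# exact measure functions): label each instance with the class of its
# most similar training instance.

def similarity(a, b):
    """Inverse-distance resemblance between two instances."""
    return 1.0 / (1.0 + sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(instance, train_X, train_y):
    """Label of the most similar training instance."""
    best = max(range(len(train_X)),
               key=lambda i: similarity(instance, train_X[i]))
    return train_y[best]

# Toy two-class data in the spirit of the iris experiment.
train_X = [[1.0, 1.0], [1.2, 0.9], [5.0, 5.0], [5.1, 4.8]]
train_y = ["A", "A", "B", "B"]
pred = classify([1.1, 1.0], train_X, train_y)
```

The paper's contribution, per the abstract, is in combining several such measures and weighting them by reliability rather than using a single similarity as done here.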

Keywords: data mining, knowledge discovery, machine learning, similarity measurement, supervised classification

Procedia PDF Downloads 452
25436 Seismic Data Scaling: Uncertainties, Potential and Applications in Workstation Interpretation

Authors: Ankur Mundhra, Shubhadeep Chakraborty, Y. R. Singh, Vishal Das

Abstract:

Seismic data scaling affects the dynamic range of the data, and with present-day low storage costs and high hard-disk reliability, scaling is generally not recommended. However, when dealing with data of different vintages, which may have been processed in 16 bits or even 8 bits and need to be processed together with 32-bit data, scaling is performed. Scaling also amplifies low-amplitude events in deeper regions that would otherwise disappear because high-amplitude shallow events saturate the amplitude scale. We have focused on the significance of scaling data to aid interpretation. This study elucidates a proper seismic loading procedure in workstations that does not rely on the default preset parameters available in most software suites. Differences in, and the distribution of, amplitude values at different depths are probed in this exercise. Proper loading parameters are identified, and the associated steps to be taken while loading data are explained. Finally, the exercise interprets the uncertainties that can arise when correlating scaled and unscaled versions of seismic data with synthetics. As a seismic well tie correlates seismic reflection events with well markers, it is used in our study to identify regions that are enhanced and/or affected by the scaling parameter(s).
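Why bit depth matters can be shown with a tiny quantization sketch: scaling float amplitudes into a signed 8-bit range crushes weak deep events to zero, while 16 bits preserves them. The amplitudes and full-scale value are illustrative, not trace data from the study.

```python
# Sketch: quantizing float amplitudes to signed 8-bit vs 16-bit ranges.
# A weak deep event survives at 16 bits but rounds to zero at 8 bits.
# Amplitudes and full_scale are illustrative assumptions.

def quantize(samples, bits, full_scale):
    """Scale to the signed integer range of `bits` and clip."""
    top = 2 ** (bits - 1) - 1
    out = []
    for s in samples:
        q = round(s / full_scale * top)
        out.append(max(-top - 1, min(top, q)))
    return out

amps = [0.9, 0.0005, -0.3]   # strong shallow, weak deep, moderate event
q8 = quantize(amps, 8, full_scale=1.0)
q16 = quantize(amps, 16, full_scale=1.0)
```

If the full scale is instead set from a strong shallow event, everything deeper occupies only a few of the available levels, which is exactly the dynamic-range loss that motivates careful scaling before loading.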

Keywords: clipping, compression, resolution, seismic scaling

Procedia PDF Downloads 454
25435 Optimization and Energy Management of Hybrid Standalone Energy System

Authors: T. M. Tawfik, M. A. Badr, E. Y. El-Kady, O. E. Abdellatif

Abstract:

Electric power shortage is a serious problem in remote rural communities in Egypt. Over the past few years, electrification of remote communities, including efficient on-site energy resource utilization, has made considerable progress. Remote communities are usually fed from diesel generator (DG) networks because they need reliable energy and cheap fresh water. The main objective of this paper is to design an optimal, economical power supply from a hybrid standalone energy system (HSES) as an alternative energy source. It covers the energy requirements of a reverse osmosis desalination unit (DU) located at the National Research Centre farm in Noubarya, Egypt. The proposed system consists of PV panels, wind turbines (WT), batteries, and a DG as backup, supplying a DU load of 105.6 kWh/day with a 6.6 kW peak load operating 16 hours a day. The optimization objective for the HSES is to select the suitable size of each system component and a control strategy that provide a reliable, efficient, and cost-effective system, using net present cost (NPC) as the criterion. Harmonizing different energy sources, energy storage, and load requirements is a difficult and challenging task; thus, the performance of the various available configurations is investigated economically and technically using iHOGA software, which is based on a genetic algorithm (GA). The achieved optimum configuration is further modified by optimizing the energy extracted from renewable sources. Effectively minimizing the energy used to charge the battery ensures that most of the generated energy directly supplies the demand, increasing the utilization of the generated energy.
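The ranking criterion, net present cost, is simple enough to sketch: capital plus the sum of discounted annual costs over the project life. The cost figures and the 8% discount rate below are illustrative assumptions, not the study's inputs.

```python
# Sketch of the NPC criterion used to rank configurations:
# NPC = capital + sum of annual costs discounted over the project life.
# All figures and the discount rate are illustrative assumptions.

def net_present_cost(capital, annual_cost, years, rate):
    return capital + sum(annual_cost / (1.0 + rate) ** t
                         for t in range(1, years + 1))

# Compare a diesel-only baseline with a hypothetical PV/wind/battery mix.
npc_diesel = net_present_cost(capital=20000, annual_cost=15000,
                              years=20, rate=0.08)
npc_hybrid = net_present_cost(capital=90000, annual_cost=4000,
                              years=20, rate=0.08)
cheaper = "hybrid" if npc_hybrid < npc_diesel else "diesel"
```

Under these assumed numbers the hybrid's higher capital cost is more than repaid by its lower discounted fuel and maintenance costs, which is the kind of trade-off the GA in iHOGA searches over.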

Keywords: energy management, hybrid system, renewable energy, remote area, optimization

Procedia PDF Downloads 187
25434 Association of Social Data as a Tool to Support Government Decision Making

Authors: Diego Rodrigues, Marcelo Lisboa, Elismar Batista, Marcos Dias

Abstract:

Based on data on child labor, this work raises questions about how to understand and locate the factors that make up child labor rates, and which properties are important for analyzing these cases. Data mining techniques for discovering valid patterns in Brazilian social databases were applied to data on child labor in the State of Tocantins (located in the north of Brazil, with a territory of 277,000 km² comprising 139 counties). This work aims to detect factors that are deterministic for the practice of child labor and their relationships with financial, educational, regional, and social indicators, generating information that is not explicit in the government database and thus enabling better monitoring and updating of policies for this purpose.
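The pattern-discovery the abstract describes usually reduces to association-rule arithmetic: the support and confidence of rules such as "indicator => child labor" over county records. A sketch follows; the records and item names are hypothetical, not Tocantins data.

```python
# Sketch of association-rule measures over county records: support of
# an itemset and confidence of "antecedent => consequent". The records
# and indicator names are hypothetical.

def support(records, items):
    """Fraction of records containing all the given items."""
    hits = sum(1 for r in records if items <= r)
    return hits / len(records)

def confidence(records, antecedent, consequent):
    """P(consequent | antecedent) estimated from the records."""
    return (support(records, antecedent | consequent)
            / support(records, antecedent))

records = [
    {"low_school_attendance", "rural", "child_labor"},
    {"low_school_attendance", "child_labor"},
    {"rural"},
    {"low_school_attendance", "rural", "child_labor"},
]
conf = confidence(records, {"low_school_attendance"}, {"child_labor"})
```

High-confidence, adequately supported rules are the "information not explicit in the database" the abstract refers to; the deterministic factors are the antecedents of the strongest rules.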

Keywords: social data, government decision making, association of social data, data mining

Procedia PDF Downloads 352
25433 Outlier Detection in Stock Market Data Using Tukey Method and Wavelet Transform

Authors: Sadam Alwadi

Abstract:

Outlier values are a problem that frequently occurs in the data observation or recording process; thus, data imputation has become an essential matter. This work makes use of the methods described in prior work to detect outlier values in a collection of stock market data. In order to implement the detection and find solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). The Tukey method and the Maximal Overlap Discrete Wavelet Transform (MODWT) are used to detect and impute the outlier values.
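The Tukey method flags points outside the fences Q1 − 1.5·IQR and Q3 + 1.5·IQR. A minimal sketch on invented closing prices (not ASE data) looks like this:

```python
# Tukey's fences for outlier detection; the 1.5 * IQR rule is the standard
# Tukey criterion, but the price series below is invented illustration data.
import statistics

def tukey_outliers(values, k=1.5):
    q1, _, q3 = statistics.quantiles(values, n=4)   # quartiles
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lower or v > upper]

prices = [10.2, 10.4, 10.3, 10.5, 10.1, 25.0, 10.3, 10.2]
print(tukey_outliers(prices))   # the 25.0 spike lies above the upper fence
```

A detected outlier could then be imputed, e.g. from a wavelet-smoothed reconstruction of the series, which is the role MODWT plays in the paper.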

Keywords: outlier values, imputation, stock market data, detection, estimation

Procedia PDF Downloads 71
25432 Integration GIS–SCADA Power Systems to Enclosure Air Dispersion Model

Authors: Ibrahim Shaker, Amr El Hossany, Moustafa Osman, Mohamed El Raey

Abstract:

This paper explores an integration model between a GIS–SCADA system and an enclosure quantification model to assess the impact of fail-safe events. There are real demands to identify spatial objects and improve control system performance. The employed methodology predicts electro-mechanical operations and the corresponding time of environmental incident variations. Open processing, as an object systems technology, is presented for integrating the enclosure database with minimal memory size and computation time via connectivity drivers such as ODBC/JDBC during the main stages of the GIS–SCADA connection. The function of the Geographic Information System is to manipulate power distribution in response to developing issues. In other words, GIS–SCADA systems integration requires numerical process objects to enable system model calibration and estimation of demands, determination of past events for analysis, and prediction of emergency situations for response training.

Keywords: air dispersion model, environmental management, SCADA systems, GIS system, integration power system

Procedia PDF Downloads 350
25431 A Dual-Polarized Wideband Probe for Near-Field Antenna Measurement

Authors: K. S. Sruthi

Abstract:

Antennas are one of the most important parts of a communication chain. They are used for both communication and calibration purposes. New developments in probe technologies have enabled near-field probes with much larger bandwidth. The objective of this paper is to design, simulate, and fabricate a dual-polarized wideband inverted quad-ridged horn antenna that can be used as a measurement probe for near-field measurements. The inverted quad-ridged horn antenna probe not only provides measurement over a much wider range but also provides dual-polarization measurement, thus enabling antenna developers to measure UWB, UHF, and VHF antennas more precisely and at lower cost. The antenna is designed to meet characteristics such as high gain, light weight, and linear polarization with suppressed side lobes for near-field measurement applications. The proposed antenna is simulated with commercially available packages such as Ansoft HFSS. The antenna gives a moderate gain over the operating range while delivering a wide bandwidth.

Keywords: near-field antenna measurement, inverted quad-ridge horn antenna, wideband antennas, dual-polarized antennas, Ansoft HFSS

Procedia PDF Downloads 407
25430 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework

Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe

Abstract:

This paper focuses on a cost-effective storage architecture using a fog-and-cloud data storage gateway, and presents the design of a framework for the data privacy model and for real-time data analytics using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results obtained on the data privacy model show that combining two or more privacy models yields stronger protection for the data. The fog storage gateway also has several advantages over traditional cloud storage: the results show reduced latency, lower bandwidth consumption, and lower energy usage compared with cloud storage, so fog storage helps to lessen excessive cost. The paper dwells mainly on the system descriptions; the researchers focused on the research design and the framework design for the data privacy model, data storage, and real-time analytics. The major system components and their framework specifications are presented, and finally the overall system architecture, its structure, and its interrelationships are shown.

Keywords: IoT, fog, cloud, data analysis, data privacy

Procedia PDF Downloads 82
25429 Analysis of Erosion Quantity on Application of Conservation Techniques in Ci Liwung Hulu Watershed

Authors: Zaenal Mutaqin

Abstract:

The level of erosion that occurs in an upstream watershed leads to limited infiltration, land degradation, and the silting of rivers and estuaries. One of the watersheds that has been degraded by land use is the upstream Ci Liwung watershed. The high degradation occurring there is indicated by the higher rate of erosion in the region, especially in agricultural areas. In this case, agricultural cultivation refers to agricultural land to which conservation techniques have been applied. This study determines the quantity of erosion by reviewing Hydrologic Response Units (HRU) on agricultural cultivation land within the upstream Ci Liwung watershed, using the Soil and Water Assessment Tool (SWAT). The conservation techniques applied are terracing, agroforestry, and gulud terraces. It was concluded that agroforestry shows the best (lowest) erosion value compared with the other conservation techniques, with an erosion contribution of 25.22 tonnes/ha/year. The calibration results between the modeled and observed discharge flows, with R² = 0.9014 and NS = 0.79, indicate that the model is acceptable and feasible to apply to the Ci Liwung Hulu watershed.
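The NS value quoted for the calibration is the Nash-Sutcliffe efficiency, which compares the squared model error against the variance of the observations (NS = 1 means a perfect fit, NS ≤ 0 means the model is no better than the observed mean). The discharge values below are invented for illustration; the formula is the standard definition:

```python
# Nash-Sutcliffe efficiency, as used to judge SWAT discharge calibration.
# Observed/simulated discharge values are hypothetical, not the study's data.

def nash_sutcliffe(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - err / var

obs = [3.1, 4.0, 5.2, 4.8, 3.9]   # observed discharge (e.g. m3/s)
sim = [3.0, 4.2, 5.0, 4.9, 4.1]   # simulated discharge
print(round(nash_sutcliffe(obs, sim), 3))
```

Values around 0.79, as reported here, are conventionally read as a good hydrological fit.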

Keywords: conservation, erosion, SWAT analysis, watershed

Procedia PDF Downloads 276
25428 Estimation of Morbidity Level of Industrial Labour Conditions at Zestafoni Ferroalloy Plant

Authors: M. Turmanauli, T. Todua, O. Gvaberidze, R. Javakhadze, N. Chkhaidze, N. Khatiashvili

Abstract:

Background: Mining processes have a significant influence on human health and quality of life. In recent years, events in Georgia have been reflected in industrial working processes; in particular, the minimal requirements of labor safety, workplace hygiene standards, and the regime of work and rest are not observed. This situation is often caused by a lack of responsibility, awareness, and knowledge among both workers and employers. The control and protection of working conditions have worsened in many industries. Materials and Methods: To evaluate the current situation, a prospective epidemiological study using the face-to-face interview method was conducted at the Georgian “Manganese Zestafoni Ferroalloy Plant” in 2011-2013. 65.7% of employees (1,428 bulletins) were surveyed, and the incidence rates of temporary disability days were studied. Results: The average length of a single episode of temporary disability was studied, both by sex group and for the whole cohort. According to the classes of harmfulness, the following results were received: Class 2.0-10.3%; 3.1-12.4%; 3.2-35.1%; 3.3-12.1%; 3.4-17.6%; 4.0-12.5%. Among the employees, 47.5% and 83.1% were tobacco and alcohol consumers, respectively. By age group and years of work, data for employees aged ≥50 and with ≥21 years of work prevailed. The obtained data revealed a morbidity rate that increased with age and years of work. Diseases of the bone and articular system and connective tissue, aggravation of chronic respiratory diseases, ischemic heart disease, hypertension, and cerebral blood discirculation were the leading diseases. High morbidity prevalence was observed in workplaces with labor conditions that were unsatisfactory from the hygienic point of view.
Conclusion: According to the received data, the causes of morbidity are the following: unsafe labor conditions; incomplete preventive medical examinations (preliminary and periodic); lack of access to appropriate health care services; and deficiencies in the gathering, recording, and analysis of morbidity data. This epidemiological study was conducted at the JSC “Manganese Ferro Alloy Plant” under the State program “Prevention of Occupational Diseases” (program code 35 03 02 05).

Keywords: occupational health, mining process, morbidity level, cerebral blood discirculation

Procedia PDF Downloads 415
25427 Comparison of Selected Pier-Scour Equations for Wide Piers Using Field Data

Authors: Nordila Ahmad, Thamer Mohammad, Bruce W. Melville, Zuliziana Suif

Abstract:

Current methods for predicting local scour at wide bridge piers were developed on the basis of laboratory studies, and very limited scour predictions have been tested against field data. A laboratory wide-pier scour equation from previous findings is compared here with field data. A wide range of field data was used, consisting of both live-bed and clear-water scour. A method for assessing the quality of the data was developed and applied to the data set. Three other wide-pier scour equations from the literature were used to compare the performance of each predictive method. The best-performing scour equation was examined using statistical analysis. Comparisons of computed and observed scour depths indicate that the equation from the previous publication produced the smallest discrepancy ratio and RMSE value when compared with the large amount of laboratory and field data.
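The two comparison metrics named above are straightforward to state: RMSE measures the average magnitude of prediction error, and the discrepancy ratio is computed scour depth divided by observed scour depth (ideally 1). The depths below are invented illustration values, not the study's data set:

```python
# RMSE and discrepancy ratio for comparing computed vs. observed scour
# depths. Depth values (in metres) are hypothetical.
import math

def rmse(observed, computed):
    return math.sqrt(
        sum((o - c) ** 2 for o, c in zip(observed, computed)) / len(observed)
    )

def discrepancy_ratios(observed, computed):
    """Computed / observed depth for each pier; 1.0 is a perfect prediction."""
    return [c / o for o, c in zip(observed, computed)]

obs = [1.2, 2.5, 0.8, 1.9]    # observed scour depths
comp = [1.4, 2.2, 1.0, 2.1]   # depths computed by a candidate equation
print(round(rmse(obs, comp), 3))
print([round(r, 2) for r in discrepancy_ratios(obs, comp)])
```

Under this convention, the best-performing equation is the one whose discrepancy ratios cluster tightest around 1 with the smallest RMSE.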

Keywords: field data, local scour, scour equation, wide piers

Procedia PDF Downloads 389
25426 The Maximum Throughput Analysis of UAV Datalink 802.11b Protocol

Authors: Inkyu Kim, SangMan Moon

Abstract:

The IEEE 802.11b protocol provides up to an 11 Mbps data rate, whereas the aerospace industry seeks a higher-data-rate COTS data link system for UAVs. The total maximum throughput (TMT) and delay time have been studied by many researchers in past years. This paper provides the theoretical data throughput performance of a UAV formation-flight data link using existing 802.11b performance theory. We operate the UAV formation flight with more than 30 quadcopters using the 802.11b protocol. We predict that the achievable UAV formation size is bounded by the data link protocol's performance limitations.
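As a rough illustration of why 802.11b delivers well under 11 Mbps of useful throughput, the sketch below applies the usual DCF overhead accounting (DIFS, mean backoff, PLCP preamble, SIFS, ACK) with standard long-preamble timings. It models a single station with no collisions, so it is a simplification, not the paper's model:

```python
# Simplified single-station TMT model for IEEE 802.11b DCF.
# Timing constants are the standard long-preamble values; the overall
# model (no collisions, one sender) is an illustrative assumption.

def tmt_80211b(payload_bytes=1500, rate_mbps=11.0):
    slot, sifs, difs = 20e-6, 10e-6, 50e-6        # 802.11b timings (s)
    preamble = 192e-6                             # long PLCP preamble + header
    mean_backoff = 15.5 * slot                    # mean of CWmin = 31 slots
    ack_time = preamble + 14 * 8 / 2e6            # 14-byte ACK at 2 Mbps
    data_time = preamble + payload_bytes * 8 / (rate_mbps * 1e6)
    cycle = difs + mean_backoff + data_time + sifs + ack_time
    return payload_bytes * 8 / cycle / 1e6        # useful Mbps

print(round(tmt_80211b(), 2))   # well below the nominal 11 Mbps
```

With 30+ contending quadcopters, collisions and longer backoffs push the per-node share far lower still, which is the bound the abstract anticipates.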

Keywords: UAV datalink, UAV formation flight datalink, UAV WLAN datalink application, UAV IEEE 802.11b datalink application

Procedia PDF Downloads 372
25425 Methods for Distinction of Cattle Using Supervised Learning

Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl

Abstract:

Machine learning represents a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction based on models derived from existing data. The data can present identification patterns, which are used to classify it into groups. The result of the analysis is a pattern that can be used to identify a data set without needing the input data used to create the pattern. An important requirement in this process is careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the point of view of genetic diversity. In the case of missing pedigree information, other methods can be used for traceability of an animal's origin. The genetic diversity written in genetic data holds relatively useful information for identifying animals originating from individual countries. We can conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and for identifying an individual.
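A minimal supervised-learning sketch of the idea: learn a per-country pattern from labeled animals, then assign a new animal to the nearest learned pattern. A nearest-centroid classifier on toy numeric markers is shown below; the marker values, country labels, and tiny panel size are invented for illustration (real studies use large SNP panels):

```python
# Toy nearest-centroid classifier: assign an animal to a country of origin
# from numeric genetic markers. All data here are hypothetical.
import math

def centroid(rows):
    """Component-wise mean of a list of marker vectors."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def classify(sample, centroids):
    """Label of the centroid nearest to the sample (Euclidean distance)."""
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

training = {
    "Austria": [[0.9, 0.1, 0.8], [0.8, 0.2, 0.7]],
    "Slovakia": [[0.2, 0.9, 0.1], [0.3, 0.8, 0.2]],
}
centroids = {label: centroid(rows) for label, rows in training.items()}
print(classify([0.85, 0.15, 0.75], centroids))   # expected: Austria
```

The learned centroids play the role of the "pattern" in the abstract: once fitted, new animals can be traced without re-using the training data.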

Keywords: genetic data, Pinzgau cattle, supervised learning, machine learning

Procedia PDF Downloads 534
25424 Router 1X3 - RTL Design and Verification

Authors: Nidhi Gopal

Abstract:

Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines from one single network). This paper mainly emphasizes the study of the router device and its top-level architecture, and how the various sub-modules of the router, i.e., register, FIFO, FSM, and synchronizer, are synthesized, simulated, and finally connected to the top module.

Keywords: data packets, networking, router, routing

Procedia PDF Downloads 784