Search results for: quantity estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2797

2527 The Evaluation of the Performance of Different Filtering Approaches in Tracking Problem and the Effect of Noise Variance

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

The performance of different filtering approaches depends on the modeling of the dynamical system and on the algorithm structure. For modeling and smoothing the data, the evaluation of the posterior distribution in each filtering approach should be chosen carefully. In this paper, different filtering approaches, namely the Kalman filter, the extended Kalman filter (EKF), the unscented Kalman filter (UKF), the extended Kalman smoother (EKS), and the Rauch-Tung-Striebel (RTS) smoother, are simulated on several trajectory-tracking problems, and the accuracy and limitations of these approaches are explained. The probability of the model under the different filters is then compared, and finally the effect of the noise variance on the estimation is described with simulation results.
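
A minimal sketch of the kind of filter compared above: a plain one-dimensional constant-velocity Kalman filter run on a synthetic straight-line track, showing how the measurement-noise variance affects the estimate. The trajectory, noise levels, and model matrices are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def kalman_1d_track(zs, q=1e-3, r=1.0):
    """Constant-velocity Kalman filter on 1D position measurements zs (dt = 1)."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])               # we observe position only
    Q = q * np.eye(2)                        # process-noise covariance
    R = np.array([[r]])                      # measurement-noise variance
    x, P = np.zeros(2), np.eye(2)            # initial state [pos, vel] and covariance
    estimates = []
    for z in zs:
        x = F @ x                            # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                  # update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)

rng = np.random.default_rng(0)
true_pos = np.arange(50, dtype=float)        # straight-line trajectory
for noise_var in (0.1, 1.0, 10.0):           # effect of the measurement-noise variance
    zs = true_pos + rng.normal(0.0, np.sqrt(noise_var), size=50)
    est = kalman_1d_track(zs, r=noise_var)
    print(f"noise var {noise_var:5.1f} -> RMSE {np.sqrt(np.mean((est - true_pos) ** 2)):.3f}")
```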

Keywords: Gaussian approximation, Kalman smoother, parameter estimation, noise variance

Procedia PDF Downloads 414
2526 Genetic Algorithm and Multi Criteria Decision Making Approach for Compressive Sensing Based Direction of Arrival Estimation

Authors: Ekin Nurbaş

Abstract:

One of the essential challenges in array signal processing, which has drawn enormous research interest over the past several decades, is estimating the direction of arrival (DOA) of plane waves impinging on an array of sensors. In recent years, Compressive Sensing based DoA estimation methods have been proposed, and it has been found that Compressive Sensing (CS)-based algorithms achieve significant performance for DoA estimation even in scenarios with multiple coherent sources. On the other hand, the Genetic Algorithm, a method that provides a solution strategy inspired by natural selection, has been used in sparse representation problems in recent years and provides significant improvements in performance. With all of this in consideration, in this paper, a method that combines the Genetic Algorithm (GA) and Multi-Criteria Decision Making (MCDM) approaches for Direction of Arrival (DoA) estimation in the Compressive Sensing (CS) framework is proposed. In this method, we generate a multi-objective optimization problem by splitting the norm minimization and reconstruction loss minimization parts of the Compressive Sensing algorithm. With the help of the Genetic Algorithm, multiple non-dominated solutions are obtained for the defined multi-objective optimization problem. Among the Pareto-frontier solutions, the final solution is obtained using multiple MCDM methods. Moreover, the performance of the proposed method is compared with the CS-based methods in the literature.
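
As a hedged illustration of the split described above, the sketch below evaluates the two objectives (l1-norm sparsity and reconstruction loss) for candidate sparse spectra and filters the non-dominated ones; the array geometry, the toy signal, and the random "population" standing in for GA individuals are assumptions, and the GA and MCDM steps themselves are not shown.

```python
import numpy as np

def steering_matrix(n_sensors, grid_deg, spacing=0.5):
    """ULA steering matrix over a grid of candidate directions (half-wavelength spacing assumed)."""
    angles = np.deg2rad(np.asarray(grid_deg, dtype=float))
    k = np.arange(n_sensors)[:, None]
    return np.exp(-2j * np.pi * spacing * k * np.sin(angles)[None, :])

def objectives(x, A, y):
    """The two criteria split apart in the text: l1 norm and reconstruction loss."""
    return np.sum(np.abs(x)), np.linalg.norm(y - A @ x) ** 2

def pareto_front(points):
    """Indices of non-dominated points when both objectives are minimized."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        dominates_p = np.all(pts <= p, axis=1) & np.any(pts < p, axis=1)
        if not np.any(dominates_p):
            keep.append(i)
    return keep

rng = np.random.default_rng(0)
grid = np.arange(-90, 91, 2)
A = steering_matrix(8, grid)                 # 8-element array
x_true = np.zeros(len(grid), dtype=complex)
x_true[[40, 75]] = 1.0                       # two active directions on the grid
y = A @ x_true + 0.05 * (rng.normal(size=8) + 1j * rng.normal(size=8))
population = [x_true + 0.1 * rng.normal(size=len(grid)) for _ in range(20)]
print("non-dominated candidates:", pareto_front([objectives(x, A, y) for x in population]))
```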

Keywords: genetic algorithm, direction of arrival estimation, multi criteria decision making, compressive sensing

Procedia PDF Downloads 134
2525 Renewable Energy from Local Waste for Producing of Processed Agricultural Products

Authors: Ruedee Niyomrath, Somboon Sarasit, Chaisri Tharaswatpipat

Abstract:

This research aims to study the potential, in both quantity and quality, of local waste materials to be used as renewable energy for the production of processed agricultural products. The results of this study are useful to producers of agricultural products who wish to use locally available fuel, reduce production costs, and support conservation. The results show that Samut Songkhram is a small coastal province located in central Thailand and subdivided into three districts. About 80 percent of its population are farmers, and coconut is grown on about 50 percent of the cultivated area, so coconut productivity creates much of the province's primary value. Waste materials from coconut have the quantity and quality potential for processing biomass into charcoal as renewable energy for the production of processed agricultural products.

Keywords: waste, renewable energy, producing of product, processed agricultural products

Procedia PDF Downloads 425
2524 A Unified Deep Framework for Joint 3D Pose Estimation and Action Recognition from a Single Color Camera

Authors: Huy Hieu Pham, Houssam Salmane, Louahdi Khoudour, Alain Crouzil, Pablo Zegers, Sergio Velastin

Abstract:

We present a deep learning-based multitask framework for joint 3D human pose estimation and action recognition from color video sequences. Our approach proceeds in two stages. In the first, we run a real-time 2D pose detector to determine the precise pixel locations of important key points of the body. A two-stream neural network is then designed and trained to map the detected 2D keypoints into 3D poses. In the second, we deploy the Efficient Neural Architecture Search (ENAS) algorithm to find an optimal network architecture that is used for modeling the spatio-temporal evolution of the estimated 3D poses via an image-based intermediate representation and performing action recognition. Experiments on the Human3.6M, Microsoft Research Redmond (MSR) Action3D, and Stony Brook University (SBU) Kinect Interaction datasets verify the effectiveness of the proposed method on the targeted tasks. Moreover, we show that our method requires a low computational budget for training and inference.
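
The abstract does not detail what the two streams of the lifting network consume; the sketch below is one plausible arrangement, with one stream taking the (x, y) keypoint coordinates and the other the detector confidences, and all layer sizes chosen arbitrarily. It is a toy stand-in, not the authors' model.

```python
import torch
import torch.nn as nn

class TwoStreamLifter(nn.Module):
    """Hypothetical two-stream network lifting detected 2D keypoints to a 3D pose."""
    def __init__(self, n_joints=17, hidden=256):
        super().__init__()
        self.n_joints = n_joints
        self.coord_stream = nn.Sequential(nn.Linear(2 * n_joints, hidden), nn.ReLU(),
                                          nn.Linear(hidden, hidden), nn.ReLU())
        self.conf_stream = nn.Sequential(nn.Linear(n_joints, hidden), nn.ReLU())
        self.head = nn.Linear(2 * hidden, 3 * n_joints)

    def forward(self, kp_xy, kp_conf):
        # kp_xy: (batch, 2 * n_joints) pixel coordinates; kp_conf: (batch, n_joints) confidences
        h = torch.cat([self.coord_stream(kp_xy), self.conf_stream(kp_conf)], dim=1)
        return self.head(h).view(-1, self.n_joints, 3)

pose3d = TwoStreamLifter()(torch.randn(8, 34), torch.rand(8, 17))
print(pose3d.shape)   # torch.Size([8, 17, 3])
```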

Keywords: human action recognition, pose estimation, D-CNN, deep learning

Procedia PDF Downloads 126
2523 Functioning of Public Distribution System and Calories Intake in the State of Maharashtra

Authors: Balasaheb Bansode, L. Ladusingh

Abstract:

The public distribution system (PDS) is an important component of food security. It is a massive welfare program undertaken by the Government of India and, since India is a federal state, implemented by the state governments, for achieving multiple objectives such as eliminating hunger, reducing malnutrition, and making food consumption affordable. The program reaches the community level through various government agencies. The paper focuses on the accessibility of the PDS at the household level and on how the present policy framework results in exclusion and inclusion errors. It explores the sanctioned food grain quantity received at the household level under the ration cards differentiated according to the income criterion, and it also highlights the types of corruption in food distribution generated by the PDS. The data used are secondary data from the 68th round of the NSSO, conducted in 2012. Bivariate and multivariate techniques have been used to understand the functioning of the PDS and food consumption for this paper.

Keywords: calories intake, entitled food quantity, poverty alleviation through PDS, target error

Procedia PDF Downloads 313
2522 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Current real-estate value estimation, which is difficult for laymen, is usually performed by specialists. This paper presents an automated estimation process based on big data and machine-learning technology that calculates the influence of building conditions on real-estate price measurement. The present study analyzed actual building sales data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Based on that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for the derivation of machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyzes them and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
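
The weighting of building conditions described above can be illustrated with a toy least-squares fit on standardized features; the feature set, prices, and units below are invented for illustration, and the study itself used RapidMiner Studio rather than hand-written code.

```python
import numpy as np

# Toy sales records: [floor area (m2), floor level, building age (years)] -> price (million KRW)
X = np.array([[84, 10, 5], [59, 3, 12], [114, 20, 2], [84, 15, 8], [59, 7, 20]], dtype=float)
y = np.array([980, 620, 1450, 1010, 540], dtype=float)

Xs = (X - X.mean(axis=0)) / X.std(axis=0)            # standardize so the weights are comparable
A = np.column_stack([np.ones(len(Xs)), Xs])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)         # ordinary least squares

for name, w in zip(["intercept", "area", "floor", "age"], coef):
    print(f"{name:9s} weight = {w:8.2f}")            # larger |weight| = more influential factor
```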

Keywords: apartment complex, big data, life-cycle building value analysis, machine learning

Procedia PDF Downloads 357
2521 Bayesian Estimation under Different Loss Functions Using Gamma Prior for the Case of Exponential Distribution

Authors: Md. Rashidul Hasan, Atikur Rahman Baizid

Abstract:

The Bayesian estimation approach is a non-classical estimation technique in statistical inference and is very useful in real-world situations. The aim of this paper is to study the Bayes estimators of the parameter of the exponential distribution under different loss functions and to compare them with one another as well as with the classical estimator, the maximum likelihood estimator (MLE). In real life, we always try to minimize the loss, and we also want to gather some prior information (distribution) about the problem in order to solve it accurately. Here, the gamma prior is used as the prior distribution of the exponential parameter for finding the Bayes estimator. In our study, we also used different symmetric and asymmetric loss functions, such as the squared error loss function, the quadratic loss function, the modified linear exponential (MLINEX) loss function, and the non-linear exponential (NLINEX) loss function. Finally, the mean square error (MSE) of the estimators is obtained and presented graphically.
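
A sketch of the setup, assuming the exponential density θe^(−θx) and a gamma prior with shape α and rate β, and the MLINEX loss w[(θ̂/θ)^c − c ln(θ̂/θ) − 1]; the paper's exact parameterization and loss constants may differ.

```latex
\begin{align*}
  \pi(\theta \mid x) &\propto \theta^{\,n+\alpha-1}\, e^{-\theta\left(\beta+\sum_i x_i\right)}
    \quad\Longrightarrow\quad \theta \mid x \sim \mathrm{Gamma}\!\left(\alpha+n,\; \beta+\textstyle\sum_i x_i\right),\\
  \hat\theta_{\mathrm{MLE}} &= \frac{n}{\sum_i x_i}, \qquad
  \hat\theta_{\mathrm{SE}} = \mathbb{E}[\theta \mid x] = \frac{\alpha+n}{\beta+\sum_i x_i},\\
  \hat\theta_{\mathrm{MLINEX}} &= \bigl(\mathbb{E}[\theta^{-c} \mid x]\bigr)^{-1/c}
    = \frac{1}{\beta+\sum_i x_i}\left[\frac{\Gamma(\alpha+n)}{\Gamma(\alpha+n-c)}\right]^{1/c},
    \qquad \alpha+n > c.
\end{align*}
```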

Keywords: Bayes estimator, maximum likelihood estimator (MLE), modified linear exponential (MLINEX) loss function, Squared Error (SE) loss function, non-linear exponential (NLINEX) loss function

Procedia PDF Downloads 372
2520 Improvement of Direct Torque and Flux Control of Dual Stator Induction Motor Drive Using Intelligent Techniques

Authors: Kouzi Katia

Abstract:

This paper proposes a Direct Torque Control (DTC) algorithm for a Dual Stator Induction Motor (DSIM) drive using two intelligent techniques: an Artificial Neural Network (ANN) replaces the switching-table selector block of conventional DTC, and a Mamdani Fuzzy Logic Controller (FLC) is used for stator resistance estimation. The fuzzy estimation method is based on an online stator resistance correction driven by the stator current estimation error and its variation; the fuzzy logic controller gives the future stator resistance increment at its output. The main advantages of the suggested control algorithm are to reduce the hardware complexity of conventional selectors, to avoid the drive instability that may occur in certain situations, and to ensure tracking of the actual value of the stator resistance. The effectiveness of the technique and the improvement of the whole system performance are proved by simulation results.
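
A rough sketch of the resistance-correction idea: two normalized inputs (current-estimation error and its change) drive a small rule base whose weighted output is the resistance increment. For brevity this uses singleton outputs (a Sugeno-style simplification of the Mamdani controller described above), and the membership sets, rule table, and increment size are all illustrative assumptions.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    return max(min((x - a) / (b - a) if b != a else 1.0,
                   (c - x) / (c - b) if c != b else 1.0), 0.0)

def fuzzy_rs_increment(err, d_err):
    """Stator-resistance increment from the current-estimation error and its change (both in [-1, 1])."""
    sets = {"N": (-1.0, -1.0, 0.0), "Z": (-0.5, 0.0, 0.5), "P": (0.0, 1.0, 1.0)}
    out = {"decrease": -0.01, "hold": 0.0, "increase": 0.01}     # ohms per step (assumed)
    rules = {("N", "N"): "decrease", ("N", "Z"): "decrease", ("N", "P"): "hold",
             ("Z", "N"): "decrease", ("Z", "Z"): "hold",     ("Z", "P"): "increase",
             ("P", "N"): "hold",     ("P", "Z"): "increase", ("P", "P"): "increase"}
    num = den = 0.0
    for (e_lab, de_lab), action in rules.items():
        w = min(tri(err, *sets[e_lab]), tri(d_err, *sets[de_lab]))   # rule firing strength
        num += w * out[action]
        den += w
    return num / den if den > 0 else 0.0

print(fuzzy_rs_increment(0.6, 0.2))   # positive error -> the resistance estimate is raised
```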

Keywords: artificial neural network, direct torque control, dual stator induction motor, fuzzy logic estimator, switching table

Procedia PDF Downloads 327
2519 Robustified Asymmetric Logistic Regression Model for Global Fish Stock Assessment

Authors: Osamu Komori, Shinto Eguchi, Hiroshi Okamura, Momoko Ichinokawa

Abstract:

Long time-series data on population assessments are essential for global ecosystem assessment because the temporal change of biomass in such a database properly reflects the status of the global ecosystem. However, the available assessment data usually have limited sample sizes, and the ratio of populations with low abundance of biomass (collapsed) to those with high abundance (non-collapsed) is highly imbalanced. To allow for the imbalance and uncertainty involved in the ecological data, we propose a binary regression model with mixed effects for inferring ecosystem status through an asymmetric logistic model. In the estimation equation, we observe that the weights for the non-collapsed populations are relatively reduced, which in turn puts more importance on the small number of observations of collapsed populations. Moreover, we extend the asymmetric logistic regression model using a propensity score to allow for the sample biases observed in the labeled and unlabeled datasets. This robustifies the estimation procedure and improves the model fit.
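
The down-weighting of the abundant (non-collapsed) class can be illustrated with a weighted logistic negative log-likelihood; the weights, the synthetic data, and the plain logistic link below are simplifications for illustration and are not the paper's asymmetric logistic model or its propensity-score extension.

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def weighted_logistic_nll(beta, X, y, w_collapsed=1.0, w_noncollapsed=0.3):
    """Negative log-likelihood in which the abundant (y = 0, non-collapsed) class is down-weighted."""
    p = sigmoid(X @ beta)
    w = np.where(y == 1, w_collapsed, w_noncollapsed)
    eps = 1e-12
    return -np.sum(w * (y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)))

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])
y = (rng.random(200) < sigmoid(X @ np.array([-2.0, 1.0, 0.5]))).astype(float)   # imbalanced labels
fit = minimize(weighted_logistic_nll, x0=np.zeros(3), args=(X, y))
print(fit.x)   # fitted coefficients
```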

Keywords: double robust estimation, ecological binary data, mixed effect logistic regression model, propensity score

Procedia PDF Downloads 250
2518 Carbohydrate Intake Estimation in Type I Diabetic Patients Described by UVA/Padova Model

Authors: David A. Padilla, Rodolfo Villamizar

Abstract:

In recent years, closed-loop control strategies have been developed in order to establish a healthy glucose profile in type 1 diabetes mellitus (T1DM) patients. However, the controller itself is unable to define a suitable reference trajectory for glucose. In this paper, a control strategy is proposed in which the shape of the reference trajectory is generated based on the amount of carbohydrates present during the digestive process, owing to the effect of carbohydrate intake. Since no sensor exists to measure the amount of carbohydrates consumed, an estimator is proposed. Thus, this paper presents the entire process of designing a carbohydrate estimator, which allows estimating the disturbance for a model predictive controller (MPC) in a T1DM patient; the estimate is used to establish a reference profile and to improve the response of the controller by providing information on the ingested carbohydrates. The dynamics of the diabetic model used are given by the equations of the UVA/Padova model of the T1DMS simulator; the system was developed and simulated in Simulink, taking into account the noise and limitations of the actuators of the glucose control system.

Keywords: estimation, glucose control, predictive controller, MPC, UVA/Padova

Procedia PDF Downloads 243
2517 Rainwater Harvesting for Household Consumption in Rural Demonstration Sites of Nong Khai Province, Thailand

Authors: Shotiros Protong

Abstract:

In recent years, Thailand has been affected by the climate change phenomenon, which is clearly seen in the shifting of the seasons. Violent storms, heavy rains, floods, and droughts have been found in several areas, and during a long dry period the water supply is not adequate in drought-affected areas. It is also well known that there has been a significant decrease in rainwater use for household consumption in rural areas of Thailand. Rainwater harvesting is the practice of collecting and storing rainwater in storage tanks before it is lost as surface run-off. Rooftop rainwater harvesting is used to provide drinking water, domestic water, and water for livestock. Rainwater harvesting in households is an alternative that allows people to readily prepare water resources for their own consumption during the drought season, can help mitigate flooding of floodplains, and may also reduce demand on basins and wells. It also improves the availability of potable water, as rainwater is substantially free of salts. Applying rainwater harvesting in rural water systems provides a substantial benefit for both water supply and wastewater subsystems by reducing the need for clean water in water distribution systems, generating less storm water in sewer systems, and reducing the storm water runoff that pollutes freshwater bodies. The combination of rainwater quality and rainfall quantity is used to determine whether rainwater harvesting for household consumption is safe and adequate for survival. The rainwater quality analysis is compared with the drinking water standard. In terms of rainfall quantity, the observed rainfall data are interpolated in GIS software (version 10.5), presented as maps for the period 1980 to 2020, and used to assess the annual yield for household consumption.
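
For the annual-yield side of the assessment, a commonly used back-of-the-envelope relation is that 1 mm of rain falling on 1 m² of roof yields about 1 litre, reduced by a runoff coefficient. The sketch below uses that relation; the roof area, rainfall depth, and coefficient are illustrative assumptions, not values from the study.

```python
def annual_yield_litres(annual_rainfall_mm, roof_area_m2, runoff_coeff=0.8):
    """Rooftop harvesting potential: rainfall depth x catchment area x runoff coefficient."""
    return annual_rainfall_mm * roof_area_m2 * runoff_coeff

# e.g. a hypothetical 1,500 mm/year falling on a 120 m2 roof
print(f"{annual_yield_litres(1500, 120):,.0f} litres/year")
```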

Keywords: rainwater harvesting, drinking water standard, annual yield, rainfall quantity

Procedia PDF Downloads 146
2516 Estimation of Missing Values in Aggregate Level Spatial Data

Authors: Amitha Puranik, V. S. Binu, Seena Biju

Abstract:

Missing data is a common problem in spatial analysis, especially at the aggregate level. Missingness can occur either in a covariate, in the response variable, or in both at a given location. Many missing data techniques are available to estimate missing values, but not all of these methods can be applied to spatial data, since the data are autocorrelated. Hence there is a need to develop a method that estimates the missing values in both the response variable and the covariates in spatial data by taking the spatial autocorrelation into account. The present study aims to develop a model to estimate the missing data points at the aggregate level in spatial data by accounting for (a) spatial autocorrelation of the response variable, (b) spatial autocorrelation of the covariates, and (c) correlation between the covariates and the response variable. Estimating the missing values of spatial data requires a model that explicitly accounts for the spatial autocorrelation. The proposed model not only accounts for spatial autocorrelation but also utilizes the correlation that exists between covariates, within covariates, and between the response variable and the covariates. The precise estimation of the missing data points in spatial data will result in increased precision of the estimated effects of independent variables on the response variable in spatial regression analysis.
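
A minimal sketch of the idea of combining spatial autocorrelation with covariate information when filling a missing response value: an inverse-distance-weighted neighbour mean is blended with an ordinary least-squares prediction from the covariates. The blend weight and the toy data are assumptions, and the sketch is far simpler than the model proposed in the abstract.

```python
import numpy as np

def impute_spatial(coords, X, y, alpha=0.5):
    """Fill missing y values by blending an inverse-distance weighted neighbour mean (spatial
    autocorrelation) with an OLS prediction from the covariates (covariate correlation)."""
    y = y.astype(float).copy()
    obs = ~np.isnan(y)
    A = np.column_stack([np.ones(obs.sum()), X[obs]])          # OLS fit on observed locations
    beta, *_ = np.linalg.lstsq(A, y[obs], rcond=None)
    for i in np.where(~obs)[0]:
        d = np.linalg.norm(coords[obs] - coords[i], axis=1)
        w = 1.0 / (d + 1e-9)
        spatial = np.sum(w * y[obs]) / np.sum(w)               # neighbour-weighted mean
        covariate = np.r_[1.0, X[i]] @ beta                    # regression prediction
        y[i] = alpha * spatial + (1 - alpha) * covariate
    return y

coords = np.array([[0, 0], [0, 1], [1, 0], [1, 1], [2, 2]], dtype=float)
X = np.array([[1.0], [1.2], [0.9], [1.1], [2.0]])
y = np.array([10.0, 11.0, 9.5, np.nan, 20.0])
print(impute_spatial(coords, X, y))
```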

Keywords: spatial regression, missing data estimation, spatial autocorrelation, simulation analysis

Procedia PDF Downloads 361
2515 Relevancy Measures of Errors in Displacements of Finite Elements Analysis Results

Authors: A. B. Bolkhir, A. Elshafie, T. K. Yousif

Abstract:

This paper highlights methods of error estimation for finite element analysis (FEA) results. It indicates that the discretization error can be reduced by performing finite element analysis with successively finer meshes, or corrected by extrapolating response predictions from an orderly sequence of relatively low degree-of-freedom analysis results (Richardson extrapolation). In addition, the paper eliminates the round-off error by running the code at higher precision. The paper provides applications to finite element analysis results and draws conclusions based on the results of applying these methods of error estimation.
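
The extrapolation mentioned above (Richardson extrapolation, assuming monotonic convergence) can be written in a few lines; the displacement values, mesh sizes, and the assumed convergence rate p = 2 below are illustrative.

```python
def richardson_extrapolate(u_coarse, u_fine, h_coarse, h_fine, p=2):
    """Estimate the mesh-independent displacement and the remaining discretization error
    from two FE solutions, assuming monotonic convergence at rate p."""
    r = h_coarse / h_fine                          # mesh refinement ratio
    u_exact = u_fine + (u_fine - u_coarse) / (r**p - 1)
    return u_exact, u_exact - u_fine               # extrapolated value, fine-mesh error estimate

u_star, err = richardson_extrapolate(u_coarse=1.250, u_fine=1.265, h_coarse=0.02, h_fine=0.01)
print(f"extrapolated displacement = {u_star:.4f}, estimated fine-mesh error = {err:.4f}")
```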

Keywords: finite element analysis (FEA), discretization error, round-off error, mesh refinement, richardson extrapolation, monotonic convergence

Procedia PDF Downloads 472
2514 Production Process of Coconut-Shell Product in Amphawa District

Authors: Wannee Sutthachaidee

Abstract:

This study of the production process of coconut-shell products in Amphawa, Samut Songkhram Province, aims to examine the pattern of the coconut-shell production process by focusing on its three main stages: the inbound logistics process, the production process, and the outbound process. The research yielded four main results. Firstly, most coconut-shell product manufacturers are single-owner businesses, the quantity of finished product is quite low, and the main labor group is local people. Secondly, the production process can be divided into four stages: pre-production, production, packaging, and distribution. Thirdly, problems can be found in each of the three logistics processes of the coconut shell, but the stage with the most problems is the production process, because it requires skilled labor and the quantity of available labor does not match the demand from customers. Lastly, factors affecting the production process of the coconut shell can be found in almost every stage, such as production design, packaging design, supply sourcing, and distribution management.

Keywords: production process, coconut-shell product, Amphawa District, inbound logistics process

Procedia PDF Downloads 502
2513 Estimation of Cholesterol Level in Different Brands of Vegetable Oils in Iraq

Authors: Mohammed Idaan Hassan Al-Majidi

Abstract:

An analysis of twenty-one assorted brands of vegetable oils in Babylon, Iraq, reveals varying levels of cholesterol content. Cholesterol was found to be present in most of the oil brands sampled using three standard methods. Cholesterol was detected in seventeen of the vegetable oil brands at concentrations of less than 1 mg/ml, while seven of the oil brands had cholesterol concentrations ranging between 1-4 mg/ml. Low iodine values were obtained in four of the vegetable oil brands, and three of them had high acid values. High-performance liquid chromatography (HPLC) confirmed the presence of cholesterol at varying concentrations in all the oil brands and gave the lowest detectable cholesterol values among the methods. The Laser brand, made from rapeseed, had the highest cholesterol concentration of 3.2 mg/ml, while the Grand brand, made from groundnuts, had the lowest concentration (0.12 mg/ml) in the HPLC analysis. The Liebermann-Burchard method showed that the Gino brand, from palm kernel, had the lowest concentration of cholesterol (3.86 mg/ml ± 0.032), and the highest concentration of 3.996 mg/ml ± 0.0404 was obtained in the sesame seed oil brand. This report is important in view of the health implications of cholesterol in our diets. Consequently, we have been able to show that there is no cholesterol-free oil on the market, contrary to what is shown on vegetable oil brand labels. Therefore, companies producing and marketing vegetable oils are enjoined to desist from misleading the public by labeling their products as "cholesterol free". They should indicate the amount of cholesterol present in the vegetable oil, no matter how small the quantity may be.

Keywords: vegetable oils, heart diseases, Liebermann-Burchard, cholesterol

Procedia PDF Downloads 235
2512 Improved Multi-Channel Separation Algorithm for Satellite-Based Automatic Identification System Signals Based on Artificial Bee Colony and Adaptive Moment Estimation

Authors: Peng Li, Luan Wang, Haifeng Fei, Renhong Xie, Yibin Rui, Shanhong Guo

Abstract:

Applications of the satellite-based automatic identification system (S-AIS) pave the way for wide-range maritime traffic monitoring and management. However, the satellite's field of view covers multiple AIS self-organizing networks, which leads to collisions of AIS signals from different cells. The contribution of this work is to propose an improved multi-channel blind source separation algorithm based on the Artificial Bee Colony (ABC) algorithm and advanced stochastic optimization to separate the mixed AIS signals. The proposed approach adopts a modified ABC algorithm to obtain an optimized initial separating matrix, which can expedite the initialization bias correction, and utilizes Adaptive Moment Estimation (Adam) to update the separating matrix by adjusting the learning rate for each parameter dynamically. Simulation results show that the algorithm can speed up convergence and lead to better separation accuracy.
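
The Adam update of the separating matrix can be sketched as follows; the cost function, its gradient, and the ABC initialization are outside the sketch, and the hyperparameters are the usual defaults rather than values from the paper.

```python
import numpy as np

def adam_step(W, grad, state, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update of the separating matrix W given the gradient of the separation cost."""
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad                   # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad**2                # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1**t)                        # bias corrections
    v_hat = v / (1 - b2**t)
    W = W - lr * m_hat / (np.sqrt(v_hat) + eps)    # per-parameter adaptive step
    return W, (m, v, t)

W = np.eye(3)                                      # e.g. a 3x3 separating matrix (ABC-initialized in the paper)
state = (np.zeros_like(W), np.zeros_like(W), 0)    # (first moment, second moment, step count)
```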

Keywords: satellite-based automatic identification system, blind source separation, artificial bee colony, adaptive moment estimation

Procedia PDF Downloads 167
2511 Effect of Low Plastic Clay Quantity on Behavioral Characteristics of Loose Sand

Authors: Roza Rahbari

Abstract:

After the Niigata earthquake in Japan in 1964, liquefaction and its related hazards came into sharp focus. Most research has so far been carried out on clean sands and silty sands in order to study the effects of fine particles, confinement pressure, density, and so on. However, because of the misconception that the cohesion of clay prevents liquefaction in sand, studies on clayey sands have not been taken seriously. Nevertheless, several liquefaction events occurred in clayey sands in recent years and led to the necessity of more studies in this field. The studies carried out so far have focused on high-plasticity clays. In this paper, the effect of low-plasticity clays on the behavioral characteristics of sands is discussed. Thus, triaxial tests were carried out on clean sands and on clayey sands with different percentages of added clay. Specimens were compacted at various densities to study the effect of the quantity of clay at various densities as well. Based on the findings, the amount of clay greatly affects the behavior of the sand and leads to substantial changes in the peak bearing capacity and steady-state values.

Keywords: liquefaction, clay, sand, triaxial, monotonic, failure

Procedia PDF Downloads 228
2510 A Theorem Related to Sample Moments and Two Types of Moment-Based Density Estimates

Authors: Serge B. Provost

Abstract:

Numerous statistical inference and modeling methodologies are based on sample moments rather than the actual observations. A result justifying the validity of this approach is introduced. More specifically, it will be established that given the first n moments of a sample of size n, one can recover the original n sample points. This implies that a sample of size n and its first associated n moments contain precisely the same amount of information. However, it is efficient to make use of a limited number of initial moments as most of the relevant distributional information is included in them. Two types of density estimation techniques that rely on such moments will be discussed. The first one expresses a density estimate as the product of a suitable base density and a polynomial adjustment whose coefficients are determined by equating the moments of the density estimate to the sample moments. The second one assumes that the derivative of the logarithm of a density function can be represented as a rational function. This gives rise to a system of linear equations involving sample moments, the density estimate is then obtained by solving a differential equation. Unlike kernel density estimation, these methodologies are ideally suited to model ‘big data’ as they only require a limited number of moments, irrespective of the sample size. What is more, they produce simple closed form expressions that are amenable to algebraic manipulations. They also turn out to be more accurate as will be shown in several illustrative examples.
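
A minimal sketch of the first technique (base density times a polynomial adjustment whose coefficients are fixed by moment matching), using a normal base density and a degree-4 adjustment; the base density, degree, and test data are assumptions.

```python
import numpy as np
from math import comb, factorial

def normal_moment(k, mu, sigma):
    """E[X^k] for X ~ N(mu, sigma^2), via the binomial expansion around the mean."""
    total = 0.0
    for j in range(0, k + 1, 2):                   # odd central moments vanish
        dfact = factorial(j) // (2 ** (j // 2) * factorial(j // 2))   # (j - 1)!!
        total += comb(k, j) * dfact * sigma**j * mu ** (k - j)
    return total

def moment_based_density(sample, degree=4):
    """Density estimate = normal base density x degree-d polynomial, with the polynomial
    coefficients chosen so the estimate reproduces the first d sample moments."""
    mu, sigma = np.mean(sample), np.std(sample)
    m = np.array([np.mean(sample**k) for k in range(degree + 1)])            # sample moments
    M = np.array([[normal_moment(j + k, mu, sigma) for j in range(degree + 1)]
                  for k in range(degree + 1)])                               # base-density moments
    c = np.linalg.solve(M, m)                                                # adjustment coefficients
    base = lambda x: np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return lambda x: base(x) * np.polyval(c[::-1], x)

data = np.random.default_rng(0).gamma(shape=3.0, scale=1.0, size=500)
f_hat = moment_based_density(data, degree=4)
print(f_hat(np.array([1.0, 3.0, 6.0])))
```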

Keywords: density estimation, log-density, polynomial adjustments, sample moments

Procedia PDF Downloads 145
2509 A Method for Reduction of Association Rules in Data Mining

Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa

Abstract:

The use of association rule algorithms within data mining is recognized as being of great value in knowledge discovery in databases. Very often, the number of rules generated is high, sometimes even in databases with small volume, so the success of the analysis of results can be hampered by this quantity. The purpose of this research is to present a method for reducing the quantity of rules generated with association algorithms. Therefore, a computational algorithm was developed with the use of the Weka Application Programming Interface, which allows the execution of the method on different types of databases. After the development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, where even the worst case presented a gain of more than 50%, considering the concepts of support, confidence, and lift as measures. This study concluded that the proposed model is feasible and quite interesting, contributing to the analysis of the results of association rules generated from the use of algorithms.
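
The kind of reduction evaluated with support, confidence, and lift can be illustrated as follows: weak rules are dropped by thresholds, and a rule is treated as redundant when a more general rule with the same consequent already reaches at least the same confidence. The thresholds and redundancy criterion are illustrative, not the exact method of the paper (which runs through the Weka API).

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    antecedent: frozenset
    consequent: frozenset
    support: float
    confidence: float
    lift: float

def reduce_rules(rules, min_support=0.1, min_confidence=0.6, min_lift=1.0):
    """Keep strong rules, then drop rules made redundant by a more general rule."""
    strong = [r for r in rules
              if r.support >= min_support and r.confidence >= min_confidence and r.lift >= min_lift]
    kept = []
    for r in strong:
        redundant = any(o is not r and o.consequent == r.consequent
                        and o.antecedent < r.antecedent and o.confidence >= r.confidence
                        for o in strong)
        if not redundant:
            kept.append(r)
    return kept

rules = [Rule(frozenset({"milk"}), frozenset({"bread"}), 0.30, 0.75, 1.4),
         Rule(frozenset({"milk", "eggs"}), frozenset({"bread"}), 0.12, 0.70, 1.3),
         Rule(frozenset({"beer"}), frozenset({"diapers"}), 0.05, 0.80, 2.0)]
print(len(reduce_rules(rules)))   # 1: the second rule is redundant, the third is below min_support
```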

Keywords: data mining, association rules, rules reduction, artificial intelligence

Procedia PDF Downloads 145
2508 Optimal Sensing Technique for Estimating Stress Distribution of 2-D Steel Frame Structure Using Genetic Algorithm

Authors: Jun Su Park, Byung Kwan Oh, Jin Woo Hwang, Yousok Kim, Hyo Seon Park

Abstract:

For structural safety, the maximum stress calculated from the stress distribution of a structure is widely used. The stress distribution can be estimated from the deformed shape of the structure obtained from measurement. Although the estimation of stress is strongly affected by the location and number of sensing points, most studies have conducted stress estimation without a reasonable basis for the sensing plan, such as the location and number of sensors. In this paper, an optimal sensing technique for estimating the stress distribution is proposed. The technique determines the optimal location and number of sensing points for a 2-D frame structure while minimizing the error in the stress distribution between the analytical model and the estimate obtained by cubic smoothing splines, using a genetic algorithm. To verify the proposed method, the optimal sensor measurement technique is applied in simulation tests on a 2-D steel frame structure. The simulation tests are performed under various loading scenarios. Through these tests, the optimal sensing plan for the structure is suggested and verified.
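
The fitness that such a genetic algorithm would minimize can be sketched for a toy structure: simulated readings at candidate sensor locations are interpolated with a cubic spline and compared against the full analytical deflection. The simply supported beam, the loading, and the use of an interpolating (rather than smoothing) spline are simplifying assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def placement_error(sensor_x, x_full, w_full):
    """Fitness of a candidate sensor layout: RMSE between the spline reconstructed from the
    sensor readings and the full analytical deflected shape (a GA would minimize this)."""
    sensor_x = np.sort(np.asarray(sensor_x, dtype=float))
    w_sensor = np.interp(sensor_x, x_full, w_full)        # simulated sensor readings
    spline = CubicSpline(sensor_x, w_sensor)
    return np.sqrt(np.mean((spline(x_full) - w_full) ** 2))

# Simply supported beam under uniform load (analytical deflection) as a toy structure
L, q, EI = 6.0, 10.0, 2.0e4
x = np.linspace(0.0, L, 200)
w = q * x * (L**3 - 2 * L * x**2 + x**3) / (24 * EI)

print(placement_error([0.0, 1.5, 3.0, 4.5, 6.0], x, w))   # evenly spread sensors
print(placement_error([0.0, 0.5, 1.0, 5.0, 6.0], x, w))   # uneven layout -> larger error
```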

Keywords: genetic algorithm, optimal sensing, optimizing sensor placements, steel frame structure

Procedia PDF Downloads 517
2507 Efficiency, Effectiveness, and Technological Change in Armed Forces: Indonesian Case

Authors: Citra Pertiwi, Muhammad Fikruzzaman Rahawarin

Abstract:

The Government of Indonesia has committed to increasing its national defense budget to 1.5 percent of GDP. However, an increased budget is not necessarily allocated efficiently and effectively. Using Data Envelopment Analysis (DEA), the operational units of the Indonesian Armed Forces are considered as a proxy to measure those two aspects. A bootstrap technique is used as well to reduce uncertainty in the estimation. Additionally, technological change is measured as a non-stationary component. Nearly half of the units are estimated as fully efficient, while less than a third are considered effective. Longer and larger sets of data might increase the robustness of the estimation in the future.

Keywords: bootstrap, effectiveness, efficiency, DEA, military, Malmquist, technological change

Procedia PDF Downloads 291
2506 The Underestimate of the Annual Maximum Rainfall Depths Due to Coarse Time Resolution Data

Authors: Renato Morbidelli, Carla Saltalippi, Alessia Flammini, Tommaso Picciafuoco, Corrado Corradini

Abstract:

A considerable part of the rainfall data used in hydrological practice is available in aggregated form over constant time intervals. This can produce undesirable effects, like the underestimate of the annual maximum rainfall depth, Hd, associated with a given duration, d, which is the basic quantity in the development of rainfall depth-duration-frequency relationships and in determining whether climate change is affecting extreme event intensities and frequencies. The errors in the evaluation of Hd from data characterized by a coarse temporal aggregation, ta, and a procedure to reduce the non-homogeneity of the Hd series are investigated here. Our results indicate that: 1) in the worst conditions, for d=ta, the estimation of a single Hd value can be affected by an underestimation error up to 50%, while the average underestimation error for a series with at least 15-20 Hd values is less than or equal to 16.7%; 2) the underestimation error values follow an exponential probability density function; 3) each very long time series of Hd contains many underestimated values; 4) relationships between the non-dimensional ratio ta/d and the average underestimate of Hd, derived from continuous rainfall data observed at many stations of Central Italy, may overcome this issue; 5) these relationships should make it possible to improve the Hd estimates and the associated depth-duration-frequency curves, at least in areas with similar climatic conditions.
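
The mechanism behind the underestimate can be reproduced on a synthetic record: the true annual maximum depth uses a sliding window of duration d, whereas aggregated data only allow fixed, non-overlapping blocks of length ta. The synthetic hourly record below is purely illustrative; it is not the Central Italy data used in the study.

```python
import numpy as np

def max_depth_sliding(rain, d):
    """Annual maximum depth over any window of d steps (continuous record)."""
    csum = np.concatenate([[0.0], np.cumsum(rain)])
    return np.max(csum[d:] - csum[:-d])

def max_depth_aggregated(rain, d):
    """Maximum depth when only totals over fixed, non-overlapping d-step blocks are available."""
    n = len(rain) // d * d
    return np.max(rain[:n].reshape(-1, d).sum(axis=1))

rng = np.random.default_rng(0)
rain = rng.exponential(0.2, size=365 * 24) * (rng.random(365 * 24) < 0.1)   # synthetic hourly record (mm)
for d in (1, 3, 6, 24):                                                     # d = ta, the worst case
    hd_true, hd_agg = max_depth_sliding(rain, d), max_depth_aggregated(rain, d)
    print(f"d = {d:2d} h: sliding {hd_true:6.2f} mm, aggregated {hd_agg:6.2f} mm, "
          f"underestimate {(1 - hd_agg / hd_true) * 100:4.1f}%")
```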

Keywords: central Italy, extreme events, rainfall data, underestimation errors

Procedia PDF Downloads 171
2505 Qualitative and Quantitative Traits of Processed Farmed Fish in N. W. Greece

Authors: Cosmas Nathanailides, Fotini Kakali, Kostas Karipoglou

Abstract:

The filleting yield and the chemical composition of farmed sea bass (Dicentrarchus labrax), rainbow trout (Oncorhynchus mykiss), and meagre (Argyrosomus regius) were investigated in farmed fish in NW Greece. The results provide an estimate of the quantity of fish required to produce one kilogram of fillet weight, an estimate required for the operational management of fish processing companies. Furthermore, the ratio of feed input required to produce one kilogram of fish fillet (FFCR) is presented for the first time as a useful indicator of the ecological footprint of consuming farmed fish. The lowest lipid content appeared in meagre (1.7%) and the highest in trout (4.91%). The lowest fillet yield and fillet-yield feed conversion ratio were found in meagre (FY = 42.17%, FFCR = 2.48), while the best fillet yield (FY = 53.8%) and FFCR (2.10) were exhibited by farmed rainbow trout. This research has been co-financed by the European Union (European Social Fund, ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARCHIMEDES III. Investing in knowledge society through the European Social Fund.
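
The fillet-level feed conversion ratio follows from the whole-fish feed conversion ratio and the filleting yield: feed per kilogram of fillet equals FCR divided by the yield fraction. The whole-fish FCR below is back-calculated from the trout figures quoted in the abstract and is shown only to illustrate the arithmetic.

```python
def fillet_feed_conversion(fcr, fillet_yield):
    """Feed (kg) needed per kg of fillet: whole-fish FCR divided by the filleting yield (fraction)."""
    return fcr / fillet_yield

# Trout: fillet yield 53.8% and FFCR about 2.10 imply a whole-fish FCR of roughly 2.10 * 0.538 = 1.13
print(fillet_feed_conversion(1.13, 0.538))   # ~2.10
```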

Keywords: farmed fish, flesh quality, filleting yield, lipid

Procedia PDF Downloads 294
2504 The Modelling of Real Time Series Data

Authors: Valeria Bondarenko

Abstract:

We propose algorithms for the estimation of the parameters of fractional Brownian motion (fBm), namely the volatility and the Hurst exponent, and for the approximation of random time series by functionals of fBm. We prove the consistency of the estimators that constitute these algorithms and establish the optimality of the forecast of the approximated time series. The adequacy of the estimation, approximation, and forecasting algorithms is demonstrated by numerical experiments. During software development, a system with a hierarchical structure was created. A comparative analysis of the proposed algorithms with other methods gives evidence of the advantage of the approximation method. The results can be used to develop methods for the analysis and modeling of time series describing economic, physical, biological, and other processes.
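
One simple way to estimate the Hurst exponent of a path (not necessarily the estimator proposed in the paper) uses the scaling of increment variances, Var[X(t+τ) − X(t)] ∝ τ^(2H):

```python
import numpy as np

def estimate_hurst(x, lags=range(2, 20)):
    """Estimate H from the slope of log Var[X(t+lag) - X(t)] against log lag."""
    lags = np.asarray(list(lags))
    v = np.array([np.var(x[lag:] - x[:-lag]) for lag in lags])
    slope, _ = np.polyfit(np.log(lags), np.log(v), 1)
    return slope / 2.0

# Sanity check on ordinary Brownian motion: H should come out close to 0.5
rng = np.random.default_rng(0)
bm = np.cumsum(rng.normal(size=10_000))
print(estimate_hurst(bm))
```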

Keywords: mathematical model, random process, Wiener process, fractional Brownian motion

Procedia PDF Downloads 340
2503 Enhancing Athlete Training using Real Time Pose Estimation with Neural Networks

Authors: Jeh Patel, Chandrahas Paidi, Ahmed Hambaba

Abstract:

Traditional methods for analyzing athlete movement often lack the detail and immediacy required for optimal training. This project aims to address this limitation by developing a real-time human pose estimation system specifically designed to enhance athlete training across various sports. This system leverages the power of convolutional neural networks (CNNs) to provide a comprehensive and immediate analysis of an athlete's movement patterns during training sessions. The core architecture utilizes dilated convolutions to capture crucial long-range dependencies within video frames, combined with a robust encoder-decoder architecture to further refine pose estimation accuracy. This capability is essential for precise joint localization across the diverse range of athletic poses encountered in different sports. Furthermore, by quantifying movement efficiency, power output, and range of motion, the system provides data-driven insights that can be used to optimize training programs. Pose estimation data analysis can also be used to develop personalized training plans that target specific weaknesses identified in an athlete's movement patterns. To overcome the limitations posed by outdoor environments, the project employs strategies such as multi-camera configurations or depth sensing techniques, which can enhance pose estimation accuracy in challenging lighting and occlusion scenarios. A dataset was collected from the Martin Luther King labs at San Jose State University. The system is evaluated through a series of tests that measure its efficiency and accuracy in real-world scenarios. Results indicate a high level of precision in recognizing different poses, substantiating the potential of this technology in practical applications. Challenges such as enhancing the system's ability to operate in varied environmental conditions and further expanding the dataset for training were identified and discussed. Future work will refine the model's adaptability and incorporate haptic feedback to enhance the interactivity and richness of the user experience. This project demonstrates the feasibility of an advanced pose detection model and lays the groundwork for future innovations in assistive enhancement technologies.
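
A minimal sketch of the dilated-convolution encoder-decoder idea, producing one heatmap per joint; the channel sizes, dilation rates, and joint count are assumptions, and the sketch is far smaller than the system described above.

```python
import torch
import torch.nn as nn

class DilatedPoseNet(nn.Module):
    """Toy encoder-decoder: dilated convolutions enlarge the receptive field, and the decoder
    upsamples back to per-joint heatmaps."""
    def __init__(self, n_joints=17):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, padding=2, dilation=2), nn.ReLU(),   # long-range context
            nn.Conv2d(128, 128, 3, padding=4, dilation=4), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, n_joints, 1),                                 # one heatmap per joint
        )

    def forward(self, img):                        # img: (batch, 3, H, W)
        return self.decoder(self.encoder(img))     # (batch, n_joints, H, W)

heatmaps = DilatedPoseNet()(torch.randn(1, 3, 256, 256))
print(heatmaps.shape)   # torch.Size([1, 17, 256, 256])
```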

Keywords: computer vision, deep learning, human pose estimation, U-NET, CNN

Procedia PDF Downloads 15
2502 Early Detection of Kidney Failure by Using a Distinct Technique for Sweat Analysis

Authors: Saba. T. Suliman, Alaa. H. Osman, Sara. T. Ahmed, Zeinab. A. Mustafa, Akram. I. Omara, Banazier. A. Ibraheem

Abstract:

Diagnosis from sweat is one of the emerging methods whereby sweat can be used to identify many diseases in the human body, since sweat contains many elements that help in the diagnostic process. In this research, we analyzed sweat samples using a colorimeter device to identify kidney failure in its various stages. This analysis is a non-invasive method, as the sample is collected from outside the body and then analyzed. A high quantity of urea in the blood, and consequently in the sweat, indicates kidney failure, and experimentally we found that the amount of urea for males differs from that for females, with a noticeable increase for males in both normal and pathological cases. In this research, we took many samples from a normal group not suffering from renal failure and from another group suffering from the disease in order to compare the urea percentage, and after analysis we found that the urea percentage is high in people with kidney failure, with an accuracy of 85%.

Keywords: sweat analysis, kidney failure, urea, non-invasive, eccrine glands, mineral composition, sweat test

Procedia PDF Downloads 14
2501 Analysis of Erosion Quantity on Application of Conservation Techniques in Ci Liwung Hulu Watershed

Authors: Zaenal Mutaqin

Abstract:

The erosion that occurs in the upstream part of a watershed leads to limited infiltration, land degradation, and the silting up of rivers and estuaries. One watershed that has been degraded by land use is the upstream Ci Liwung watershed (DA Ci Liwung Hulu). The high degradation occurring there is indicated by the higher rate of erosion in the region, especially in agricultural areas; here, agricultural cultivation refers to agricultural land to which conservation techniques have been applied. This study determines the quantity of erosion by reviewing the Hydrologic Response Units (HRUs) of agricultural cultivation land in the upstream Ci Liwung watershed using the Soil and Water Assessment Tool (SWAT). The conservation techniques applied are terracing, agroforestry, and gulud terraces. It was concluded that the agroforestry conservation technique shows the best (lowest) erosion value compared with the other conservation techniques, with an erosion contribution of 25.22 tonnes/ha/year. The calibration of the modeled discharge against observations, with R² = 0.9014 and NS = 0.79, indicates that the model is acceptable and applicable to the Ci Liwung Hulu watershed.
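
The Nash-Sutcliffe efficiency used to judge the discharge calibration is computed as follows; the discharge values in the example are illustrative, not the study's data.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is no better than the mean."""
    obs, sim = np.asarray(obs, dtype=float), np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = [12.1, 18.4, 25.0, 30.2, 22.5, 15.3]   # observed monthly discharge (illustrative)
sim = [11.5, 19.0, 24.1, 31.0, 21.8, 16.0]   # simulated discharge
print(f"NS = {nash_sutcliffe(obs, sim):.2f}")
```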

Keywords: conservation, erosion, SWAT analysis, watershed

Procedia PDF Downloads 273
2500 High Secure Data Hiding Using Cropping Image and Least Significant Bit Steganography

Authors: Khalid A. Al-Afandy, El-Sayyed El-Rabaie, Osama Salah, Ahmed El-Mhalaway

Abstract:

This paper presents a highly secure data hiding technique using image cropping and Least Significant Bit (LSB) steganography. Crops at predefined secret coordinates are extracted from the cover image. The secret text message is divided into sections, and the number of sections equals the number of image crops. Each section of the secret text message is embedded into an image crop in a secret sequence using the LSB technique; the embedding is done in the color channels of the cover image. The stego image is obtained by reassembling the image and the stego crops. The results of the technique are compared with other state-of-the-art techniques. Evaluation is based on visual inspection to detect any degradation of the stego image, the difficulty of extracting the embedded data by any unauthorized viewer, the Peak Signal-to-Noise Ratio (PSNR) of the stego image, and the CPU time of the embedding algorithm. Experimental results confirm that the proposed technique is more secure than the traditional techniques.
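
A bare-bones sketch of the LSB embedding and extraction step on a single crop (one bit per color value); the crop selection from secret coordinates, the embedding sequence, and the reassembly of the stego image described above are not shown.

```python
import numpy as np

def embed_lsb(crop, message):
    """Embed message bytes into the least significant bits of a uint8 image crop."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = crop.reshape(-1).copy()
    if bits.size > flat.size:
        raise ValueError("crop too small for this message section")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits      # clear the LSB, then set it to the bit
    return flat.reshape(crop.shape)

def extract_lsb(stego_crop, n_bytes):
    """Read back n_bytes from the least significant bits of the crop."""
    bits = (stego_crop.reshape(-1)[:8 * n_bytes] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

crop = np.random.default_rng(0).integers(0, 256, size=(32, 32, 3), dtype=np.uint8)
stego = embed_lsb(crop, b"secret section")
print(extract_lsb(stego, len(b"secret section")))   # b'secret section'
```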

Keywords: steganography, stego, LSB, crop

Procedia PDF Downloads 252
2499 Internet of Things Based AquaSwach Water Purifier

Authors: Karthiyayini J., Arpita Chowdary Vantipalli, Darshana Sailu Tanti, Malvika Ravi Kudari, Krtin Kannan

Abstract:

This paper builds on existing work on smart water quality monitoring (SWQM) and presents an IoT (Internet of Things) based framework, which we call AquaSwach, that supports continuous estimation of water conditions based on physical parameters, i.e., temperature, pH, electrical conductivity, and turbidity, and estimates water purity each time you drink water. Six sensors are connected to an Arduino Mega in a discrete way to sense the water parameters. The data extracted from the sensors are transmitted to a desktop application developed on the .NET platform and compared with the WHO (World Health Organization) standard values.
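
The desktop-side comparison against guideline values can be sketched as a simple range check; the limits below are placeholders in the style of drinking-water guidelines, not the official WHO table, and the parameter names are assumptions.

```python
# Placeholder guideline-style limits (turbidity in NTU, conductivity in uS/cm) -- not official WHO values
LIMITS = {"ph": (6.5, 8.5), "turbidity": (0.0, 5.0), "conductivity": (0.0, 1500.0), "temperature": (0.0, 35.0)}

def check_sample(reading):
    """Return, for each measured parameter, whether the reading falls inside its allowed range."""
    return {k: lo <= reading[k] <= hi for k, (lo, hi) in LIMITS.items() if k in reading}

print(check_sample({"ph": 7.2, "turbidity": 6.1, "conductivity": 800, "temperature": 27}))
```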

Keywords: AquaSwach, IoT, WHO, water quality

Procedia PDF Downloads 197
2498 Long Term Examination of the Profitability Estimation Focused on Benefits

Authors: Stephan Printz, Kristina Lahl, René Vossen, Sabina Jeschke

Abstract:

Strategic investment decisions are characterized by high innovation potential and long-term effects on the competitiveness of enterprises. Due to the uncertainty and risks involved in this complex decision-making process, the need arises for well-structured support activities. A class of methods that considers both cost and long-term added value is cost-benefit effectiveness estimation. One such method is the "profitability estimation focused on benefits" (PEFB) method developed at the Institute of Management Cybernetics at RWTH Aachen University. The method copes with the challenges associated with strategic investment decisions by integrating long-term non-monetary aspects whilst also mapping the chronological sequence of an investment within the organization's target system. Thus, this method is characterized as a holistic approach for the evaluation of the costs and benefits of an investment. This participation-oriented method was applied to business environments in many workshops. The results of the workshops are a library of more than 96 cost aspects and 122 benefit aspects. These aspects are preprocessed and comparatively analyzed with regard to their alignment to a series of risk levels. For the first time, an accumulation and a distribution of cost and benefit aspects regarding their impact and probability of occurrence are given. The results give evidence that the PEFB method combines precise measures of financial accounting with the incorporation of benefits. Finally, the results constitute the basis for using information technology and data science for decision support when applying the PEFB method.

Keywords: cost-benefit analysis, multi-criteria decision, profitability estimation focused on benefits, risk and uncertainty analysis

Procedia PDF Downloads 429