Search results for: mixture exponential (hyperexponential) distribution
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6499


6259 Facility Anomaly Detection with Gaussian Mixture Model

Authors: Sunghoon Park, Hank Kim, Jinwon An, Sungzoon Cho

Abstract:

The Internet of Things makes it possible to collect data from facilities, which can then be used to monitor them and even predict malfunctions in advance. Conventional quality control methods focus on setting a normal range for a sensor value, defined between a lower control limit and an upper control limit, and declaring as an anomaly anything that falls outside it. However, interactions among sensor values are ignored, leading to suboptimal performance. We propose a multivariate approach that takes many sensor values into account at the same time. In particular, a Gaussian Mixture Model is used, trained to maximize the likelihood using the Expectation-Maximization algorithm. The number of Gaussian component distributions is determined by the Bayesian Information Criterion. The negative log-likelihood is used as an anomaly score. The usage scenario is as follows: for each instance of sensor values from a facility, an anomaly score is computed; if it is larger than a threshold, an alarm goes off and a human expert intervenes and checks the system. Real-world data from a building energy system were used to test the model.
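A minimal sketch of the scoring pipeline described above (EM-fitted Gaussian Mixture Model, BIC-based choice of the number of components, negative log-likelihood as the anomaly score). The synthetic training data, the five-sensor dimensionality, and the 99th-percentile alarm threshold are illustrative assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_gmm_by_bic(X, max_components=10, seed=0):
    """Fit GMMs with 1..max_components and keep the one with the lowest BIC."""
    best, best_bic = None, np.inf
    for k in range(1, max_components + 1):
        gmm = GaussianMixture(n_components=k, covariance_type="full",
                              random_state=seed).fit(X)
        bic = gmm.bic(X)
        if bic < best_bic:
            best, best_bic = gmm, bic
    return best

# X_train: sensor vectors collected during normal operation (hypothetical data here)
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 5))
gmm = fit_gmm_by_bic(X_train)

# Anomaly score = negative log-likelihood; raise an alarm when it exceeds a threshold
threshold = np.percentile(-gmm.score_samples(X_train), 99)
x_new = rng.normal(size=(1, 5))
score = -gmm.score_samples(x_new)[0]
if score > threshold:
    print("alarm: anomaly score", score)
```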

Keywords: facility anomaly detection, gaussian mixture model, anomaly score, expectation maximization algorithm

Procedia PDF Downloads 246
6258 Mechanical Properties of Cement Slurry by Partial Substitution of Industrial Waste Natural Pozzolans

Authors: R. Ziaie Moayed, S. P. Emadoleslami Oskoei, S. D. Beladi Mousavi, A. Taleb Beydokhti

Abstract:

There have been many reports in recent years of the destructive effects of cement on the environment. In the present research, an attempt has been made to reduce these destructive effects by partially replacing cement with silica fume as the binder. The study aims to improve the mechanical properties of cement slurry by using a waste material from a glass production factory, located in Qazvin city of Iran, whose accumulated volume has become an environmental threat. The chemical analysis of the waste material indicates that it contains about 94% SiO2 and Al2O3 and has a structure close to that of silica fume. A particle grain size test was also performed on the waste. Then, the unconfined compressive strength of the slurry was tested by preparing mixtures of water and binder with different percentages of cement and silica fume. The water-to-binder ratio of the mixture is 1:3, and the curing process lasts 28 days. It was found that the sample with equal proportions of cement and silica fume had an unconfined compressive strength of about 300 kg/cm2. Moreover, the slurry sample made of pure cement showed brittle fracture, whereas the fracture of the cement-silica fume slurry mixture is flexible and the structure of the specimen remains coherent after fracture. Therefore, considering the flexibility achieved by this replacement, the waste can be used to stabilize soils with cracking potential.

Keywords: cement replacement, cement slurry, environmental threat, natural pozzolan, silica fume, waste material

Procedia PDF Downloads 105
6257 Temperature Distribution Control for Baby Incubator System Using Arduino AT Mega 2560

Authors: W. Widhiada, D. N. K. P. Negara, P. A. Suryawan

Abstract:

Technological advances in the field of health are very important, especially for the safety of babies. Many premature infant deaths are caused by poorly managed health facilities; most of these deaths are caused by bacteria when the temperature around the baby is not kept normal. For this reason, incubator equipment is important, especially the control of the temperature inside the incubator. On/off control is used to regulate the temperature distribution in the incubator so that the desired temperature of 36 °C remains stable. The authors observed and analyzed the data to determine the temperature distribution in the incubator using MATLAB/Simulink. The output temperature distribution reaches 36 °C in 400 seconds using an Arduino AT Mega 2560. This incubator is able to keep the ambient temperature and the baby's body temperature within normal limits and to keep the air humidity within the limit values required for an infant incubator.
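A minimal sketch of the on/off control loop on a first-order thermal model, for illustration only; the plant time constant, heater gain, and hysteresis band are assumed values, not the incubator parameters identified in the paper.

```python
# On/off (bang-bang) temperature control with hysteresis on a first-order plant.
dt, t_end = 1.0, 600.0                   # time step and horizon, s
T_amb, T_set, hyst = 25.0, 36.0, 0.25    # ambient, setpoint, half-band, deg C
tau, K = 120.0, 15.0                     # assumed plant time constant (s) and heater gain (deg C)

T, heater = T_amb, False
for _ in range(int(t_end / dt)):
    if T < T_set - hyst:                 # switch heater on below the band
        heater = True
    elif T > T_set + hyst:               # switch heater off above the band
        heater = False
    u = K if heater else 0.0
    T += dt * (T_amb + u - T) / tau      # first-order plant: dT/dt = (T_amb + u - T)/tau
print(f"temperature after {t_end:.0f} s: {T:.2f} deg C")
```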

Keywords: on/off control, distribution temperature, Arduino AT 2560, baby incubator

Procedia PDF Downloads 463
6256 The Normal-Generalized Hyperbolic Secant Distribution: Properties and Applications

Authors: Hazem M. Al-Mofleh

Abstract:

In this paper, a new four-parameter univariate continuous distribution called the Normal-Generalized Hyperbolic Secant (NGHS) distribution is defined and studied. Some general and structural distributional properties are investigated and discussed, including central and non-central n-th moments, incomplete moments, quantile and generating functions, the hazard function, Rényi and Shannon entropies, shape behavior (right-skewed, left-skewed, and symmetric), modality regions (unimodal and bimodal), and maximum likelihood (ML) estimators of the parameters. Finally, two real data sets are used to demonstrate empirically the flexibility and strength of the new distribution.
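The NGHS density itself is not reproduced here. As an illustration of the numerical maximum likelihood step only, the sketch below fits the two-parameter (location-scale) hyperbolic secant baseline available in scipy to a hypothetical sample; a four-parameter density such as the NGHS would be handled the same way once its log-density is coded.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(1)
data = stats.hypsecant.rvs(loc=2.0, scale=1.5, size=500, random_state=rng)  # hypothetical sample

# Built-in ML fit of the hyperbolic secant baseline (location, scale).
loc_hat, scale_hat = stats.hypsecant.fit(data)
print(f"MLE: loc = {loc_hat:.3f}, scale = {scale_hat:.3f}")

# The same idea for a custom density: minimize the negative log-likelihood.
nll = lambda p: -np.sum(stats.hypsecant.logpdf(data, loc=p[0], scale=abs(p[1])))
res = minimize(nll, x0=[0.0, 1.0], method="Nelder-Mead")
print("numerical MLE:", res.x.round(3))
```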

Keywords: bimodality, estimation, hazard function, moments, Shannon’s entropy

Procedia PDF Downloads 311
6255 Effect of Mixture of Flaxseed and Pumpkin Seeds Powder on Hypercholesterolemia

Authors: Zahra Ashraf

Abstract:

Flax and pumpkin seeds are a rich source of unsaturated fatty acids, antioxidants, and fiber, known to have anti-atherogenic properties. Hypercholesterolemia is a state characterized by an elevated level of cholesterol in the blood. This research was designed to study the effect of a flax and pumpkin seed powder mixture on hypercholesterolemia and body weight. Rats were selected as a representative model for humans. Thirty male albino rats were divided into three groups: a control group, a CD-chol group (control diet + cholesterol) fed 1.5% cholesterol, and an FP-chol group (flaxseed and pumpkin seed powder + cholesterol) fed 1.5% cholesterol. Flax and pumpkin seed powders were mixed at a proportion of 5:1 (omega-3 to omega-6). Blood samples were collected to examine the lipid profile, and body weight was also measured. The data were subjected to analysis of variance. In the CD-chol group, body weight, plasma total cholesterol (TC), triacylglycerols (TG), and plasma LDL-C increased significantly, with a decrease in plasma HDL (good cholesterol). In the FP-chol group, lipid parameters and body weights decreased significantly, with an increase in HDL and a decrease in LDL (bad cholesterol). The mean values of body weight, total cholesterol, triglycerides, low-density lipoprotein, and high-density lipoprotein in the FP-chol group were 240.66±11.35 g, 59.60±2.20 mg/dl, 50.20±1.79 mg/dl, 36.20±1.62 mg/dl, and 36.40±2.20 mg/dl, respectively. The flaxseed and pumpkin seed powder mixture reduced body weight, serum cholesterol, low-density lipoprotein, and triglycerides, while a significant increase was shown in high-density lipoprotein, when given to hypercholesterolemic rats. Our results suggest that the flax and pumpkin seed mixture has hypocholesterolemic effects, probably mediated by the polyunsaturated fatty acids (omega-3 and omega-6) present in the seed mixture.

Keywords: hypercholesterolemia, omega 3 and omega 6 fatty acids, cardiovascular diseases

Procedia PDF Downloads 400
6254 An Association Model to Correlate the Experimentally Determined Mixture Solubilities of Methyl 10-Undecenoate with Methyl Ricinoleate in Supercritical Carbon Dioxide

Authors: V. Mani Rathnam, Giridhar Madras

Abstract:

Fossil fuels are depleting rapidly as the demand for energy and its allied chemicals continuously increases in the modern world. Therefore, sustainable renewable energy sources based on non-edible oils are being explored as a viable option, as they do not compete with food commodities. Oils such as castor oil are rich in fatty acids and thus can be used for the synthesis of biodiesel, bio-lubricants, and many other fine industrial chemicals. There are several processes available for the synthesis of different chemicals obtained from castor oil. One such process is the transesterification of castor oil, which results in a mixture of fatty acid methyl esters. The main products of this reaction are methyl ricinoleate and methyl 10-undecenoate. To separate these compounds, supercritical carbon dioxide (SCCO₂) was used as a green solvent. SCCO₂ was chosen as a solvent due to its easy availability, non-toxicity, non-flammability, and low cost. In order to design any separation process, the preliminary requirement is solubility or phase equilibrium data. Therefore, the solubility of a mixture of methyl ricinoleate with methyl 10-undecenoate in SCCO₂ was determined in the present study. The temperature and pressure ranges selected for the investigation were T = 313 K to 333 K and P = 10 MPa to 18 MPa. It was observed that the solubility (mol·mol⁻¹) of methyl 10-undecenoate varied from 2.44 x 10⁻³ to 8.42 x 10⁻³, whereas it varied from 0.203 x 10⁻³ to 6.28 x 10⁻³ for methyl ricinoleate within the chosen operating conditions. These solubilities followed retrograde behavior (characterized by a decrease in solubility with increasing temperature) throughout the range of investigated operating conditions. An association theory model, coupled with regular solution theory for the activity coefficients, was developed in the present study. The deviation of this model from the experimental data can be quantified using the average absolute relative deviation (AARD). The AARD% values for the present compounds are 4.69 and 8.08 for methyl 10-undecenoate and methyl ricinoleate, respectively, in a mixture of methyl ricinoleate and methyl 10-undecenoate. A maximum solubility enhancement of 32% was observed for methyl ricinoleate in the mixture. The highest selectivity of SCCO₂ was observed to be 12 for methyl 10-undecenoate in the mixture.
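The AARD quoted above has the standard definition AARD% = (100/N) Σ |y_exp - y_calc| / y_exp. A short helper illustrates the calculation; the numbers are made up, not the measured solubilities.

```python
import numpy as np

def aard_percent(y_exp, y_calc):
    """Average absolute relative deviation between experiment and model, in percent."""
    y_exp, y_calc = np.asarray(y_exp, float), np.asarray(y_calc, float)
    return 100.0 * np.mean(np.abs(y_exp - y_calc) / y_exp)

# Illustrative values only (mol/mol), not the reported solubility data.
y_exp  = [2.44e-3, 4.10e-3, 6.05e-3, 8.42e-3]
y_calc = [2.55e-3, 3.90e-3, 6.30e-3, 8.10e-3]
print(f"AARD = {aard_percent(y_exp, y_calc):.2f} %")
```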

Keywords: association theory, liquid mixtures, solubilities, supercritical carbon dioxide

Procedia PDF Downloads 111
6253 Brake Force Distribution in Passenger Cars

Authors: Boukhris Lahouari, Bouchetara Mostefa

Abstract:

The active safety of a vehicle is mainly influenced by the properties of the installed braking system. With the increase in road traffic density and travel speeds, increasingly stringent requirements are placed on the vehicle's behaviour during braking. The achievable decelerations are limited physically by the coefficient of friction between the tires and the ground. It follows that an optimized distribution of braking forces becomes necessary for better use of the available friction. This objective can only be achieved if sufficient knowledge is available on the theory of vehicle dynamics during braking and on current standards for the approval of braking systems. This knowledge facilitates the development of a braking force calculation algorithm that enables an optimized distribution of braking forces to be achieved. Operating safety is conditioned by the requirements of efficiency, progressiveness, regularity, and fidelity of a braking system, without neglecting the recommendations imposed by the legislator.
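As a worked illustration of why the braking force split depends on deceleration, the ideal front/rear distribution follows from the dynamic axle loads; the vehicle mass, wheelbase, and centre-of-gravity data below are assumed for the sketch, not taken from the paper.

```python
# Ideal front/rear brake force distribution from dynamic axle loads.
g = 9.81              # m/s^2
m = 1400.0            # assumed vehicle mass, kg
L = 2.6               # assumed wheelbase, m
l_f, l_r = 1.1, 1.5   # CG to front / rear axle, m (l_f + l_r = L)
h = 0.55              # assumed CG height, m

def ideal_distribution(z):
    """Front and rear brake forces (N) at relative deceleration z = a/g."""
    F_total = m * g * z
    F_front = F_total * (l_r + z * h) / L   # front axle load grows with deceleration
    F_rear = F_total * (l_f - z * h) / L    # rear axle unloads with deceleration
    return F_front, F_rear

for z in (0.2, 0.5, 0.8):
    F_f, F_r = ideal_distribution(z)
    print(f"z = {z:.1f}: front = {F_f:7.0f} N, rear = {F_r:7.0f} N, front share = {F_f / (F_f + F_r):.2f}")
```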

Keywords: brake force distribution, distribution diagram, friction coefficient, brake by wire

Procedia PDF Downloads 52
6252 A Segmentation Method for Grayscale Images Based on the Firefly Algorithm and the Gaussian Mixture Model

Authors: Donatella Giuliani

Abstract:

In this research, we propose an unsupervised grayscale image segmentation method based on a combination of the Firefly Algorithm and the Gaussian Mixture Model. First, the Firefly Algorithm is applied in a histogram-based search for cluster means. The Firefly Algorithm is a stochastic global optimization technique based on the flashing characteristics of fireflies; in this context, it is used to determine the number of clusters and the related cluster means in a histogram-based segmentation approach. These means are then used in the initialization step for the parameter estimation of a Gaussian Mixture Model. The parametric probability density function of a Gaussian Mixture Model is represented as a weighted sum of Gaussian component densities, whose parameters are evaluated by applying the iterative Expectation-Maximization technique. The coefficients of the linear superposition of Gaussians can be thought of as prior probabilities of each component. Applying Bayes' rule, the posterior probabilities of the grayscale intensities are evaluated, and their maxima are used to assign each pixel to a cluster according to its gray-level value. The proposed approach appears fairly solid and reliable even when applied to complex grayscale images. The validation has been performed using different standard measures, namely the Root Mean Square Error (RMSE), the Structural Content (SC), the Normalized Correlation Coefficient (NK), and the Davies-Bouldin (DB) index. The achieved results strongly confirm the robustness of this grayscale segmentation method based on a metaheuristic algorithm. Another noteworthy advantage of this methodology is the use of the maxima of the responsibilities for pixel assignment, which implies a consistent reduction of the computational cost.
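A minimal sketch of the mixture-model part of the pipeline on gray-level values: EM fitting, then pixel assignment by the maximum posterior responsibility. The image is synthetic, the number of components is fixed by hand, and k-means initialization stands in for the Firefly-derived means, so this illustrates the GMM step only.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic grayscale image with three intensity populations (stand-in for a real image).
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(60, 10, 3000),
                      rng.normal(130, 12, 3000),
                      rng.normal(200, 8, 3000)]).reshape(90, 100).clip(0, 255)

K = 3  # in the paper, the number of clusters and initial means come from the Firefly Algorithm
gmm = GaussianMixture(n_components=K, random_state=0).fit(img.reshape(-1, 1))

# Each pixel goes to the component with the maximum posterior responsibility (Bayes rule).
labels = gmm.predict(img.reshape(-1, 1)).reshape(img.shape)
print("component means:", np.sort(gmm.means_.ravel()).round(1))
print("pixels per segment:", np.bincount(labels.ravel()))
```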

Keywords: clustering images, firefly algorithm, Gaussian mixture model, metaheuristic algorithm, image segmentation

Procedia PDF Downloads 192
6251 Influence of Processing Parameters on the Reliability of Sieving as a Particle Size Distribution Measurement

Authors: Eseldin Keleb

Abstract:

In the pharmaceutical industry, particle size distribution is an important parameter for the characterization of pharmaceutical powders. The powder flowability, reactivity, and compatibility, which have a decisive impact on the final product, are determined by particle size and size distribution. Therefore, the aim of this study was to evaluate the influence of processing parameters on particle size distribution measurements. Different size fractions of α-lactose monohydrate and 5% polyvinylpyrrolidone were prepared by wet granulation and were used for the preparation of samples. The influence of sieve load (50, 100, 150, 200, 250, 300, and 350 g), processing time (5, 10, and 15 min), sample size ratios (high percentages of small and large particles), type of disturbance (vibration and shaking), and process reproducibility have been investigated. The results showed that a sieve load of 50 g produces the best separation; a further increase in sample weight resulted in incomplete separation even after extending the processing time to 15 min. Sieving with vibration was faster and more efficient than shaking. Between-day reproducibility showed that particle size distribution measurements are reproducible. However, for samples containing 70% fines or 70% large particles, processed at the optimized parameters, incomplete separation was always observed. These results indicate that sieving reliability is highly influenced by the particle size distribution of the sample, and care must be taken with samples whose particle size distribution is skewed.

Keywords: sieving, reliability, particle size distribution, processing parameters

Procedia PDF Downloads 582
6250 The Bernstein Expansion for Exponentials in Taylor Functions: Approximation of Fixed Points

Authors: Tareq Hamadneh, Jochen Merker, Hassan Al-Zoubi

Abstract:

Bernstein's expansion for exponentials in Taylor functions provides lower and upper optimization values for the range of the original function. These values converge to the original function if the degree is elevated or the domain subdivided. A Taylor polynomial can be applied so that the exponential is represented by a polynomial of finite degree over a given domain. The Bernstein basis has two main properties: its sum equals 1, and each basis function is positive for all x ∈ (0, 1). In this work, we prove the existence of fixed points for exponential functions in a given domain using the Bernstein optimization values. The Bernstein basis of finite degree T over a domain D is non-negative. Any polynomial p of degree t can be expanded into the Bernstein form of maximum degree t ≤ T, where only the Bernstein coefficients need to be computed in order to optimize the original polynomial. The main property is that p(x) is enclosed by the minimum and maximum Bernstein coefficients (the Bernstein bounds). If this bound is contained in the given domain, then we say that p(x) has fixed points in the same domain.
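A minimal sketch of the bounding step, under the usual convention of the unit domain [0, 1]: the polynomial's power-basis coefficients are converted to Bernstein coefficients with b_j = Σ_{i≤j} [C(j,i)/C(n,i)] a_i, and their minimum and maximum enclose the range of p(x). Degree elevation, subdivision, and the fixed-point test itself are not shown; the degree-4 Taylor polynomial of exp(x) is only an example.

```python
import numpy as np
from math import comb, factorial

def bernstein_coefficients(a):
    """Bernstein coefficients on [0, 1] of p(x) = sum_i a[i] * x**i (same degree n)."""
    n = len(a) - 1
    return [sum(comb(j, i) / comb(n, i) * a[i] for i in range(j + 1))
            for j in range(n + 1)]

# Example: degree-4 Taylor polynomial of exp(x) about 0, restricted to [0, 1].
a = [1.0 / factorial(i) for i in range(5)]
b = bernstein_coefficients(a)
print("Bernstein enclosure on [0,1]:", round(min(b), 4), "<= p(x) <=", round(max(b), 4))

# Sanity check against a dense sampling of the polynomial.
x = np.linspace(0.0, 1.0, 1001)
p = np.polyval(a[::-1], x)
print("sampled range:", round(p.min(), 4), "to", round(p.max(), 4))
```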

Keywords: Bernstein polynomials, stability of control functions, numerical optimization, Taylor function

Procedia PDF Downloads 108
6249 Dynamics of a Reaction-Diffusion Problem Modeling Two Predators Competing for a Prey

Authors: Owolabi Kolade Matthew

Abstract:

In this work, we investigate both analytically and numerically the dynamics of a model comprising a three-species system. We analyze the linear stability of stationary solutions in the one-dimensional multi-species system modeling the interactions of two predators and one prey species. The stability analysis has many implications for understanding the various spatiotemporal and chaotic behaviors of the species in the spatial domain. The analysis establishes the possibility for the three interacting species to coexist harmoniously; this is achieved by combining local and global analyses to determine the global dynamics of the system. In the presence of diffusion, a viable exponential time differencing method is applied to the multi-species nonlinear time-dependent partial differential equations to address the points and queries that may naturally arise. The scheme is described in detail and justified by a number of computational experiments.
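To make the time-stepping idea concrete, here is a first-order exponential time differencing (ETD1) scheme for a single-species reaction-diffusion equation (Fisher-KPP) on a periodic domain, solved spectrally. This is a toy stand-in for the multi-species predator-prey system of the paper; the domain size, diffusivity, and step size are arbitrary.

```python
import numpy as np

# ETD1 for u_t = D u_xx + u(1 - u) on a periodic 1-D domain.
N, Lx, D, h, steps = 256, 50.0, 0.1, 0.05, 2000
x = np.linspace(0.0, Lx, N, endpoint=False)
k = 2.0 * np.pi * np.fft.fftfreq(N, d=Lx / N)   # spectral wavenumbers
L = -D * k**2                                    # linear (diffusion) operator in Fourier space

E = np.exp(L * h)                                # exact integrating factor over one step
# phi_1(Lh) = (exp(Lh) - 1)/(Lh), with the k = 0 mode handled by its limit value 1
phi1 = np.where(L == 0.0, 1.0, (E - 1.0) / np.where(L == 0.0, 1.0, L * h))

u = 0.5 + 0.1 * np.cos(2.0 * np.pi * x / Lx)     # smooth initial condition
for _ in range(steps):
    Nu = u * (1.0 - u)                           # nonlinear reaction term
    u = np.real(np.fft.ifft(E * np.fft.fft(u) + h * phi1 * np.fft.fft(Nu)))

print(f"u stays in [{u.min():.3f}, {u.max():.3f}] after {steps} steps")
```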

Keywords: asymptotically stable, coexistence, exponential time differencing method, global and local stability, predator-prey model, nonlinear, reaction-diffusion system

Procedia PDF Downloads 389
6248 Examining the Relationship between Chi-Square Test Statistics and Skewness of Weibull Distribution: Simulation Study

Authors: Rafida M. Elobaid

Abstract:

Most of the literature on goodness-of-fit tests tries to provide a theoretical basis for studying empirical distribution functions. Examples of such goodness-of-fit tests are the Kolmogorov-Smirnov and Cramér-von Mises type tests. However, most of the literature has not focused in detail on the relationship between the values of the test statistics and skewness or kurtosis. The aim of this study is to investigate the behavior of the χ2 test statistic as the skewness of a right-skewed distribution varies. A simulation study is conducted in which random numbers are generated from the Weibull distribution. For fixed sample sizes, different levels of skewness are considered, and the corresponding values of the χ2 test statistic are calculated. Using different sample sizes, the results show an inverse relationship between the value of the χ2 test statistic and the level of skewness for the Weibull distribution, i.e., the value of the χ2 test statistic decreases as the skewness increases. The results also show that with large values of skewness we are more confident that the data follow the assumed distribution. The nonparametric Kendall τ test is used to confirm these results.
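A minimal sketch of the simulation pipeline: draw Weibull samples over a grid of shape parameters (skewness falls as the shape parameter rises), compute a χ2 goodness-of-fit statistic against the assumed Weibull using equal-probability bins, and correlate the statistic with skewness via Kendall τ. The shape grid, sample size, and number of bins are illustrative and do not reproduce the paper's exact design.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n, n_bins = 500, 10
shapes = [0.8, 1.0, 1.5, 2.0, 3.0, 5.0]          # Weibull shape parameters

skews, chi2_vals = [], []
for c in shapes:
    sample = stats.weibull_min.rvs(c, size=n, random_state=rng)
    skews.append(float(stats.weibull_min.stats(c, moments="s")))
    # Probability-integral transform, then equal-probability bins under Weibull(c).
    u = stats.weibull_min.cdf(sample, c)
    observed, _ = np.histogram(u, bins=n_bins, range=(0.0, 1.0))
    expected = np.full(n_bins, n / n_bins)
    chi2_vals.append(stats.chisquare(observed, expected).statistic)

tau, p = stats.kendalltau(skews, chi2_vals)
for c, s, x2 in zip(shapes, skews, chi2_vals):
    print(f"shape={c:4.1f}  skewness={s:6.3f}  chi2={x2:7.3f}")
print(f"Kendall tau between skewness and the chi-square statistic: {tau:.3f} (p = {p:.3f})")
```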

Keywords: goodness-of-fit test, chi-square test, simulation, continuous right skewed distributions

Procedia PDF Downloads 381
6247 An Exponential Field Path Planning Method for Mobile Robots Integrated with Visual Perception

Authors: Magdy Roman, Mostafa Shoeib, Mostafa Rostom

Abstract:

Global vision, whether provided by overhead fixed cameras, on-board aerial vehicle cameras, or satellite images, can always provide detailed information on the environment around mobile robots. In this paper, an intelligent vision-based method of path planning and obstacle avoidance for mobile robots is presented. The method integrates visual perception with a newly proposed field-based path-planning method to overcome common path-planning problems such as local minima, unreachable destinations, and unnecessarily lengthy paths around obstacles. The method proposes an exponential angle deviation field around each obstacle that affects the orientation of a nearby robot. As the robot heads toward the goal point, obstacles are classified into right and left groups, and a deviation angle is exponentially added to or subtracted from the orientation of the robot. The exponential field parameters are chosen based on the Lyapunov stability criterion to guarantee robot convergence to the destination. The proposed method uses the obstacles' shape and location, extracted from the global vision system, through a collision prediction mechanism to decide whether to activate or deactivate an obstacle's field. In addition, a search mechanism is developed, in case the robot or the goal point is trapped among obstacles, to find a suitable exit or entrance. The proposed algorithm is validated both in simulation and through experiments. The algorithm shows effectiveness in obstacle avoidance and destination convergence, overcoming common path planning problems found in classical methods.

Keywords: path planning, collision avoidance, convergence, computer vision, mobile robots

Procedia PDF Downloads 162
6246 Fluoride as Obturating Material in Primary Teeth

Authors: Syed Ameer Haider Jafri

Abstract:

The primary goal of root canal treatment in deciduous teeth is to eliminate infection and to retain the tooth in a functional state until it is physiologically exfoliated and replaced by its permanent successor. Important requisites of a root canal filling material for primary teeth are that it should resorb at a rate similar to that of the roots of the primary tooth, be harmless to the periapical tissue and to the permanent tooth germ, resorb readily if pushed beyond the apex, be antiseptic and radio-opaque, not shrink, adhere to the canal walls, not discolor the tooth, and be easy to place and remove if required at any stage. The presently available, commonly used obturating materials for primary teeth are zinc oxide eugenol, calcium hydroxide, and iodoform-based pastes. None of these materials so far meets the ideal requirements of a root canal filling material. In search of an ideal obturating material, this study was planned, in which a mixture of calcium hydroxide, zinc oxide, and sodium fluoride and a mixture of calcium hydroxide and sodium fluoride were compared clinically and radiographically with calcium hydroxide for the obturation of the root canals of 75 carious exposed primary mandibular second molars of 59 children aged 4-9 years. All three materials showed good results, but after a follow-up of 9 months the mixture of calcium hydroxide, two percent sodium fluoride, and zinc oxide powder closely followed the resorption of the root; the mixture of calcium hydroxide and two percent sodium fluoride followed root resorption in the beginning, but later the majority of cases showed faster resorption; whereas calcium hydroxide started depleting from the canal from the beginning, even as early as 3 months. Thus, the mixture of calcium hydroxide, two percent sodium fluoride, and zinc oxide was found to be the best obturating material for primary teeth.

Keywords: obturating material, primary teeth, root canal treatment, success rate

Procedia PDF Downloads 278
6245 Investigation of the Brake Force Distribution in Passenger Cars

Authors: Boukhris Lahouari, Bouchetara Mostefa

Abstract:

The active safety of a vehicle is mainly influenced by the properties of the installed braking system. With the increase in road traffic density and travel speeds, increasingly stringent requirements are placed on the vehicle's behaviour during braking. The achievable decelerations are limited physically by the coefficient of friction between the tires and the ground. It follows that an optimized distribution of braking forces becomes necessary for better use of the available friction. This objective can only be achieved if sufficient knowledge is available on the theory of vehicle dynamics during braking and on current standards for the approval of braking systems. This will facilitate the development of a braking force calculation algorithm that enables an optimized distribution of braking forces to be achieved. Operating safety is conditioned by the requirements of efficiency, progressiveness, regularity, and fidelity of a braking system, without neglecting the recommendations imposed by the legislator.

Keywords: brake force distribution, distribution diagram, friction coefficient, brake by wire

Procedia PDF Downloads 53
6244 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data

Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar

Abstract:

It is difficult to make predictions, especially about the future, and making accurate predictions is not always easy. However, better predictions remain the foundation of all science; therefore, the development of accurate, robust, and reliable forecasting methods is very important. Numerous forecasting methods have been proposed and studied in the literature. There are still two dominant major forecasting methods, Box-Jenkins ARIMA and Exponential Smoothing (ES), and new methods are still derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing is still one of the most practically relevant forecasting methods available due to its simplicity, robustness, and accuracy as an automatic forecasting procedure, especially in the famous M-Competitions. Despite their success and widespread use in many areas, ES models have some shortcomings that negatively affect the accuracy of forecasts. Therefore, a new forecasting method, called the ATA method, is proposed in this study to cope with these shortcomings. This new method is obtained from traditional ES models by modifying the smoothing parameters; therefore, both methods have similar structural forms, and ATA can easily be adapted to all of the individual ES models. However, ATA has many advantages due to its innovative new weighting scheme. In this paper, the focus is on modeling the trend component and handling seasonality patterns by utilizing classical decomposition. Therefore, the ATA method is expanded to higher order ES methods for additive, multiplicative, additive damped, and multiplicative damped trend components. The proposed models are called ATA trended models, and their predictive performances are compared to their counterpart ES models on the M3 competition data set, since it is still the most recent and comprehensive time-series data collection available. It is shown that the models outperform their counterparts in almost all settings, and when model selection is carried out amongst these trended models, ATA outperforms all of the competitors in the M3-competition for both short-term and long-term forecasting horizons when the models' forecasting accuracies are compared based on popular error metrics.
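For orientation, the sketch below implements the classical Holt (additive-trend) exponential smoothing recursion that the ATA trended models modify through their alternative weighting scheme; it is not the ATA method itself, and the series and smoothing constants are made up.

```python
import numpy as np

def holt_linear(y, alpha=0.5, beta=0.3, horizon=5):
    """Classical additive-trend (Holt) exponential smoothing with fixed smoothing constants."""
    level, trend = y[0], y[1] - y[0]
    fitted = [level + trend]                      # one-step-ahead forecasts
    for t in range(1, len(y)):
        prev_level = level
        level = alpha * y[t] + (1.0 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1.0 - beta) * trend
        fitted.append(level + trend)
    forecasts = [level + h * trend for h in range(1, horizon + 1)]
    return np.array(fitted), np.array(forecasts)

y = np.array([10.0, 12.1, 13.9, 16.2, 18.0, 20.3, 21.9, 24.2])   # hypothetical trended series
fitted, fc = holt_linear(y)
print("h = 1..5 forecasts:", fc.round(2))
```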

Keywords: accuracy, exponential smoothing, forecasting, initial value

Procedia PDF Downloads 157
6243 Basic Calibration and Normalization Techniques for Time Domain Reflectometry Measurements

Authors: Shagufta Tabassum

Abstract:

The study of the dielectric properties of a binary mixture of liquids is very useful for understanding the liquid structure, molecular interactions, dynamics, and kinematics of the mixture. Time-domain reflectometry (TDR) is a powerful tool for studying the cooperative and molecular dynamics of H-bonded systems. In this paper, we discuss the basic calibration and normalization procedure for time-domain reflectometry measurements. Our approach is to explain the different types of errors that occur during TDR measurements and how these errors can be eliminated or minimized.

Keywords: time domain reflectometry measurement technique, cable and connector loss, oscilloscope loss, normalization technique

Procedia PDF Downloads 180
6242 Genetic Change in Escherichia coli KJ122 That Improved Succinate Production from an Equal Mixture of Xylose and Glucose

Authors: Apichai Sawisit, Sirima Suvarnakuta Jantama, Sunthorn Kanchanatawee, Lonnie O. Ingram, Kaemwich Jantama

Abstract:

Escherichia coli KJ122 was engineered to produce succinate from glucose using the wild-type GalP for glucose uptake instead of the native phosphotransferase system (ptsI mutation). This strain ferments 10% (w/v) xylose poorly. Mutants were selected by serial transfers in AM1 mineral salts medium with 10% (w/v) xylose. The evolved mutants exhibited a similar improvement in the co-fermentation of an equal mixture of xylose and glucose. One of these, AS1600a, produced 84.26±1.37 g/L succinate, equivalent to that produced by the parent (KJ122) strain from 10% glucose (85.46±1.78 g/L). AS1600a was sequenced and found to contain a mutation in galactose permease (GalP, G236D). Expressing the mutated galP* gene in KJ122ΔgalP reproduced the xylose utilization phenotype of the mutant AS1600a. The strains AS1600a and KJ122ΔgalP (pLOI5746; galP*) also co-fermented a mixture of glucose, xylose, arabinose, and galactose in sugarcane bagasse hydrolysate for succinate production.

Keywords: xylose, furfural, succinate, sugarcane bagasse, E. coli

Procedia PDF Downloads 364
6241 Reaction Rate Behavior of a Methane-Air Mixture over a Platinum Catalyst in a Single Channel Catalytic Reactor

Authors: Doo Ki Lee, Kumaresh Selvakumar, Man Young Kim

Abstract:

Catalytic combustion is an environmentally friendly technique for burning fuels in gas turbines. In this paper, the behavior of the surface reaction rate in catalytic combustion is studied with respect to the heterogeneous oxidation of a methane-air mixture in a catalytic reactor. A plug flow reactor (PFR), a simplified single catalytic channel, assists in investigating the catalytic combustion phenomenon over the Pt catalyst by promoting the desired chemical reactions. The numerical simulation with multi-step elementary surface reactions is governed by the availability of free surface sites on the catalytic surface, and the catalytic combustion characteristics are thereby demonstrated by examining the rate of reaction for a lean fuel mixture. Further, two different surface reaction mechanisms are adopted and their surface reaction rates compared to identify the controlling heterogeneous reaction for better fuel conversion. The performance of the platinum catalyst under heterogeneous reaction is analyzed at the same temperature condition, where the catalyst with the higher kinetic rate of reaction has the maximum catalytic activity for enhanced catalytic combustion of methane.

Keywords: catalytic combustion, heterogeneous reaction, plug flow reactor, surface reaction rate

Procedia PDF Downloads 246
6240 Reliability and Availability Analysis of Satellite Data Reception System using Reliability Modeling

Authors: Ch. Sridevi, S. P. Shailender Kumar, B. Gurudayal, A. Chalapathi Rao, K. Koteswara Rao, P. Srinivasulu

Abstract:

System reliability and availability evaluation plays a crucial role in ensuring the seamless operation of complex satellite data reception systems with consistent performance over long periods. This paper presents a novel approach using a case study on one of the antenna systems at a satellite data reception ground station in India. The methodology involves analyzing the system's components, their failure rates, and the system architecture; generating a logical reliability block diagram model; and estimating the reliability of the system from the component-level mean times between failures, assuming an exponential distribution, to derive a baseline estimate of the system's reliability. The model is then validated against system-level field failure data collected from the operational satellite data reception systems, which includes the failures that occurred, failure times, failure criticality, and repair times, using statistical techniques such as median rank, regression, and Weibull analysis to extract meaningful insights regarding failure patterns and the practical reliability of the system and to assess the accuracy of the developed reliability model. The study mainly focused on the identification of critical units within the system, which are prone to failures and have a significant impact on overall performance, and brought out a reliability model of the identified critical unit. This model takes into account the interdependencies among system components and their impact on overall system reliability, provides valuable insights into the performance of the system to understand its improvement or degradation over time, and will be a vital input for arriving at an optimized design for future development. It also provides a plug-and-play framework to understand the effect on system performance of any upgrades or new designs of the unit. It helps in effective planning and in formulating contingency plans to address potential system failures, ensuring the continuity of operations. Furthermore, to instill confidence in system users, the duration for which the system can operate continuously with the desired level of 3-sigma reliability was estimated, which turned out to be a vital input to the maintenance plan. System availability and station availability were also assessed by considering clash and non-clash scenarios to determine the overall system performance and potential bottlenecks. Overall, this paper establishes a comprehensive methodology for the reliability and availability analysis of complex satellite data reception systems. The results derived from this approach facilitate effective planning of contingency measures and provide users with confidence in system performance, enabling decision-makers to make informed choices about system maintenance, upgrades, and replacements. It also aids in identifying critical units, assessing system availability in various scenarios, minimizing downtime, and optimizing resource allocation.
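A minimal sketch of the baseline step described above: component failure rates from assumed MTBF values, an exponential series-system model, and the continuous operating duration that still meets a 3-sigma reliability target. The unit names and MTBF figures are placeholders, not data from the ground station.

```python
import math

# Assumed component MTBFs (hours) for a series reliability block diagram.
mtbf_hours = {"antenna_servo": 20000.0, "feed_lna": 50000.0,
              "down_converter": 30000.0, "demodulator": 40000.0}

lam_system = sum(1.0 / m for m in mtbf_hours.values())   # series-system failure rate (1/h)
print(f"system MTBF ~ {1.0 / lam_system:,.0f} h")

def reliability(t_hours):
    """R(t) = exp(-lambda * t) for the exponential series system."""
    return math.exp(-lam_system * t_hours)

print(f"R(1000 h) = {reliability(1000.0):.4f}")

# Continuous operating duration that keeps reliability at or above a 3-sigma target.
target = 0.9973
t_max = -math.log(target) / lam_system
print(f"duration with R(t) >= {target}: {t_max:.1f} h")
```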

Keywords: exponential distribution, reliability modeling, reliability block diagram, satellite data reception system, system availability, weibull analysis

Procedia PDF Downloads 55
6239 A Two Phase VNS Algorithm for the Combined Production Routing Problem

Authors: Nejah Ben Mabrouk, Bassem Jarboui, Habib Chabchoub

Abstract:

Production and distribution planning is the most important part of supply chain management. In this paper, an NP-hard production-distribution problem for one product over a multi-period horizon is investigated. The aim is to minimize the sum of the costs of three items, production setups, inventories, and distribution, while determining, for each period, the amount produced, the inventory levels, and the delivery trips. To solve this difficult problem, we propose a two-phase approach based on Variable Neighbourhood Search (VNS). The heuristic is tested on 90 randomly generated instances from the literature, with 20 periods and 50, 100, or 200 customers. Computational results show that our approach outperforms existing solution procedures available in the literature.
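As a reference point for the metaheuristic, here is a generic basic-VNS skeleton (shake in neighbourhood k, local search, restart from k = 1 on improvement), applied to a toy permutation problem; it is not the two-phase production-routing algorithm of the paper.

```python
import random

def vns(initial, cost, shake, local_search, k_max=3, max_iters=200, seed=0):
    """Basic Variable Neighbourhood Search skeleton."""
    random.seed(seed)
    best, best_cost = initial, cost(initial)
    for _ in range(max_iters):
        k = 1
        while k <= k_max:
            candidate = local_search(shake(best, k), cost)
            c = cost(candidate)
            if c < best_cost:
                best, best_cost, k = candidate, c, 1   # improvement: restart neighbourhoods
            else:
                k += 1                                  # no improvement: widen the neighbourhood
    return best, best_cost

# Toy problem: minimise the number of out-of-order adjacent pairs in a permutation.
def cost(p):
    return sum(p[i] > p[i + 1] for i in range(len(p) - 1))

def shake(p, k):
    q = p[:]
    for _ in range(k):                                  # k random swaps define neighbourhood N_k
        i, j = random.sample(range(len(q)), 2)
        q[i], q[j] = q[j], q[i]
    return q

def local_search(p, cost):
    improved = True
    while improved:
        improved = False
        for i in range(len(p) - 1):
            q = p[:]
            q[i], q[i + 1] = q[i + 1], q[i]
            if cost(q) < cost(p):
                p, improved = q, True
    return p

best, best_cost = vns(list(range(10))[::-1], cost, shake, local_search)
print("best permutation:", best, "cost:", best_cost)
```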

Keywords: logistic, production, distribution, variable neighbourhood search

Procedia PDF Downloads 306
6238 A Mutually Exclusive Task Generation Method Based on Data Augmentation

Authors: Haojie Wang, Xun Li, Rui Yin

Abstract:

In order to address memorization overfitting in the meta-learning MAML algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution in the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex tasks. The experiments show that the method of generating mutually exclusive tasks can effectively solve the memorization overfitting in the meta-learning MAML algorithm.

Keywords: data augmentation, mutex task generation, meta-learning, text classification

Procedia PDF Downloads 69
6237 A Unification and Relativistic Correction for Boltzmann’s Law

Authors: Lloyd G. Allred

Abstract:

The distribution of velocities of particles in a plasma is a well-understood discipline of plasma physics. Boltzmann's law and the Maxwell-Boltzmann distribution describe the distribution of velocity of a particle in a plasma as a function of mass and temperature. Particles with the same mass tend to have the same velocity. By expressing the same law in terms of energy alone, the author obtains a distribution independent of mass. In summary, for particles in a plasma, the energies tend to equalize, independent of the masses of the individual particles. For high-energy plasma, the original law predicts velocities greater than the speed of light. If one uses Einstein's formula for energy (E = mc²), then a relativistic correction is not required.

Keywords: cosmology, EMP, plasma physics, relativity

Procedia PDF Downloads 196
6236 Development of Value Based Planning Methodology Incorporating Risk Assessment for Power Distribution Network

Authors: Asnawi Mohd Busrah, Au Mau Teng, Tan Chin Hooi, Lau Chee Chong

Abstract:

This paper describes a value-based planning (VBP) methodology incorporating risk assessment as an enhanced and more practical approach to evaluating distribution network projects in Peninsular Malaysia. Assessment indicators associated with economics, performance, and risk are formulated to evaluate distribution projects and quantify their benefits against the investment. The developed methodology is implemented in a web-based software tool customized to capture investment and network data, compute the assessment indicators, and rank the proposed projects according to their benefits. The value-based planning approach addresses economic factors in the power distribution planning assessment, so as to minimize the cost of the solution to the power utility while at the same time providing maximum benefits to customers.

Keywords: value based planning, distribution network, value of lost load (VoLL), energy not served (ENS)

Procedia PDF Downloads 459
6235 Comparison between Continuous Genetic Algorithms and Particle Swarm Optimization for Distribution Network Reconfiguration

Authors: Linh Nguyen Tung, Anh Truong Viet, Nghien Nguyen Ba, Chuong Trinh Trong

Abstract:

This paper proposes a reconfiguration methodology based on a continuous genetic algorithm (CGA) and particle swarm optimization (PSO) for minimizing active power loss and voltage deviation. Both algorithms are adapted using graph theory to generate feasible individuals, and a modified crossover is used for the continuous variables of the CGA. To demonstrate the performance and effectiveness of the proposed methods, a comparative analysis of CGA and PSO for network reconfiguration on the 33-node and 119-bus radial distribution systems is presented. The simulation results show that both CGA and PSO can be used for distribution network reconfiguration and that CGA outperformed PSO, with a significantly higher success rate in finding the optimal distribution network configuration.

Keywords: distribution network reconfiguration, particle swarm optimization, continuous genetic algorithm, power loss reduction, voltage deviation

Procedia PDF Downloads 154
6234 Bioactive Compounds Characterization of Cereal-based Porridge Enriched with Cirina Forda

Authors: Kunle Oni

Abstract:

This study investigated the bioactivity potentials of porridge from yellow maize and malted sorghum enriched with Cirina forda. All the samples were analyzed using standard methods. The results showed that the highest values of 217.03 μmol TEAC/100 g, 43.3 mmol Fe2+/100 g, and 35.56% for DPPH, FRAP, and TBARS, respectively, were recorded in sample 50FYM+20MS+30CF, while the lowest values of 146.10 μmol TEAC/100 g, 20.18±0.11 mmol Fe2+/100 g, and 13.25% for DPPH, FRAP, and TBARS were recorded in the control sample. The oxalate and tannin contents were lowest in sample 50FYM+20MS+30CF, but oxalate was highest in the control sample while tannin was highest in sample 60FYM+20MS+20CF. The phytate content was highest in the 60FYM+20MS+20CF mixture (2.32 mg/100 g) and lowest in the control (100% FYM) porridge (2.20 mg/100 g). The results also showed that the total phenolic content was highest in the 60FYM+20MS+20CF mixture (318.28 mg GAE/100 g) and lowest in the 50FYM+30MS+20CF mixture (264.18 mg GAE/100 g). For total flavonoid content, the 50FYM+20MS+30CF mixture had the highest content (189.31 mg RE/100 g) and the 60FYM+20MS+20CF mixture the lowest (90.10 mg RE/100 g). The enrichment of the porridge with C. forda increased the concentration of various bioactive compounds compared to the control sample. The identified compounds include cinnamic acid methyl ester, 10-methyl-E-11-tridecen-1-ol propionate, methaqualone, 3-(2-hydroxy-6-methylphenyl)-4(3H)-quinazolinone, and oleic acid.

Keywords: bioactive compounds, characterization, cereal-based porridge, Cirina forda

Procedia PDF Downloads 26
6233 A Mutually Exclusive Task Generation Method Based on Data Augmentation

Authors: Haojie Wang, Xun Li, Rui Yin

Abstract:

In order to address memorization overfitting in the model-agnostic meta-learning (MAML) algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels so that the generated mutex task is inconsistent with the data distribution in the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to an exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex tasks. The experiments show that the method of generating mutually exclusive tasks can effectively solve the memorization overfitting in the meta-learning MAML algorithm.

Keywords: mutex task generation, data augmentation, meta-learning, text classification

Procedia PDF Downloads 104
6232 Designing an Intelligent Voltage Instability System in Power Distribution Systems in the Philippines Using IEEE 14 Bus Test System

Authors: Pocholo Rodriguez, Anne Bernadine Ocampo, Ian Benedict Chan, Janric Micah Gray

Abstract:

The state of an electric power system may be classified as either stable or unstable. The borderline of stability is any condition for which a slight change in an unfavourable direction of any pertinent quantity will cause instability. Voltage instability in power distribution systems can lead to voltage collapse and thus power blackouts. The researchers present an intelligent system, using the back-propagation algorithm, that can detect voltage instability from the output voltage of a power distribution system and classify the system as stable or unstable. The researchers' work uses the parameters involved in voltage instability as inputs to the neural network for training and testing, which can provide faster detection and monitoring of the power distribution system.

Keywords: back-propagation algorithm, load instability, neural network, power distribution system

Procedia PDF Downloads 405
6231 A Comparative Study of Photo and Electro-Fenton Reactions Efficiency in Degradation of Cationic Dyes Mixture

Authors: S. Bouafia Chergui, Nihal Oturan, Hussein Khalaf, Mehmet A. Oturan

Abstract:

The aim of this work was to compare the degradation of a mixture of three cationic dyes in aqueous solution by advanced oxidation processes (electro-Fenton and photo-Fenton). These processes are based on the in situ production of the hydroxyl radical, a very strong oxidant, which allows the degradation of organic pollutants until their mineralization into CO2 and H2O. Under optimal operating conditions, the evolution of total organic carbon (TOC) and the electrical energy efficiency have been investigated for the two processes.

Keywords: photo-fenton, electro-fenton, energy efficiency, water treatment

Procedia PDF Downloads 477
6230 Flow-Through Supercritical Installation for Producing Biodiesel Fuel

Authors: Y. A. Shapovalov, F. M. Gumerov, M. K. Nauryzbaev, S. V. Mazanov, R. A. Usmanov, A. V. Klinov, L. K. Safiullina, S. A. Soshin

Abstract:

A flow-through installation was designed and manufactured for the transesterification of fatty acid triglycerides and the production of biodiesel fuel under supercritical fluid conditions. Transesterification of rapeseed oil with ethanol was carried out while varying two parameters, temperature and the alcohol/oil ratio, at a constant pressure of 19 MPa. The kinetics of the yield of fatty acid ethyl esters (FAEE) was determined in the temperature range of 320-380 °C at alcohol/oil molar ratios of 6:1-20:1. The content of the formed FAEE was determined from a correlation with the kinematic viscosity of the resulting biodiesel fuel. The maximum FAEE yield (about 90%) was obtained within 30 min at an ethanol/oil molar ratio of 12:1 and a temperature of 380 °C. In studying the transesterification of triglycerides, a kinetic model of an isothermal flow reactor was used, and the reaction order realized in the flow reactor was determined. First-order kinetics were confirmed by data on the FAEE conversion during the reaction at different temperatures and molar ratios of the initial reagents (ethanol/oil). Using the Arrhenius equation, the effective rate constants of the transesterification reaction were calculated at different reaction temperatures. In addition, based on the experimental data, the activation energy and the pre-exponential factor of the transesterification reaction were determined.
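The Arrhenius step mentioned above amounts to a linear regression of ln k on 1/T, from which the activation energy and pre-exponential factor follow; the rate constants below are illustrative placeholders, not the measured values.

```python
import numpy as np

R = 8.314                                    # gas constant, J/(mol*K)
T = np.array([593.0, 623.0, 653.0])          # 320, 350, 380 deg C expressed in kelvin
k = np.array([0.020, 0.045, 0.095])          # assumed effective first-order constants, 1/min

# ln k = ln A - Ea/(R*T): the slope gives Ea, the intercept gives the pre-exponential factor.
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R
A = np.exp(intercept)
print(f"Ea ~ {Ea / 1000:.1f} kJ/mol, A ~ {A:.3e} 1/min")
```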

Keywords: biodiesel, fatty acid esters, supercritical fluid technology, transesterification

Procedia PDF Downloads 84