Search results for: total capacity algorithm

14813 One-Dimensional Model for Positive Displacement Pump with Cavitation Algorithm

Authors: Francesco Rizzuto, Matthew Stickland, Stephan Hannot

Abstract:

Simulating a complete positive displacement pump system with commercial Computational Fluid Dynamics (CFD) software results in an enormous computational effort due to the complexity of the pump system. This drawback restricts its use to a specific part of the pump in any one simulation. This research focuses on developing an algorithm that provides results in good agreement with experimental data without that computational effort. The compressible equations are solved with an explicit algorithm. A comparison is presented between the finite volume (FV) method with a slope-limited Monotonic Upwind Scheme for Conservation Laws (MUSCL) and experimental results. The source term for cavitation and friction is introduced into the algorithm with a splitting strategy and solved with a 4th-order Runge-Kutta scheme (RK4). Different pumps are modeled and analyzed to evaluate the flexibility of the code. The simulation required minimal computation time and resources without compromising the accuracy of the simulation results. Therefore, this algorithm demonstrates the feasibility of pressure pulsation simulation as a design tool for industrial purposes.
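The abstract does not give the discretisation details, but the ingredients it names – a slope-limited MUSCL reconstruction for the convective part and an RK4 update for the split-off source term – can be illustrated with a short sketch. The minmod limiter and the placeholder relaxation source term below are assumptions for illustration, not the authors' formulation.

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter used in MUSCL reconstruction."""
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def muscl_faces(u):
    """Slope-limited left/right states at cell faces (illustrative, 1-D, uniform grid)."""
    du_left = np.diff(u, prepend=u[0])    # backward differences
    du_right = np.diff(u, append=u[-1])   # forward differences
    slope = minmod(du_left, du_right)
    u_right_face = u + 0.5 * slope        # state extrapolated to the right face of each cell
    u_left_face = u - 0.5 * slope         # state extrapolated to the left face of each cell
    return u_left_face, u_right_face

def rk4_step(y, t, dt, source):
    """Classical 4th-order Runge-Kutta update for the split source term dy/dt = source(t, y)."""
    k1 = source(t, y)
    k2 = source(t + 0.5 * dt, y + 0.5 * dt * k1)
    k3 = source(t + 0.5 * dt, y + 0.5 * dt * k2)
    k4 = source(t + dt, y + dt * k3)
    return y + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Hypothetical placeholder source term: relaxation of pressure toward a vapour level p_v.
def cavitation_source(t, p, p_v=2.3e3, tau=1e-3):
    return -(p - p_v) / tau
```

In an operator-splitting loop, a convective update built from `muscl_faces` and the `rk4_step` source update would be applied in sequence at every time step.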

Keywords: cavitation, diaphragm, DVCM, finite volume, MUSCL, positive displacement pump

Procedia PDF Downloads 129
14812 Improving Temporal Correlations in Empirical Orthogonal Function Expansions for Data Interpolating Empirical Orthogonal Function Algorithm

Authors: Ping Bo, Meng Yunshan

Abstract:

Satellite-derived sea surface temperature (SST) is a key parameter for many operational and scientific applications. However, a disadvantage of SST data is the high percentage of missing values, which is mainly caused by cloud coverage. The Data Interpolating Empirical Orthogonal Function (DINEOF) algorithm is an EOF-based technique for reconstructing the missing data and has been widely used in the oceanographic field. The reconstruction of SST images within a long time series using DINEOF can introduce large discontinuities; one solution to this problem is to filter the temporal covariance matrix to reduce the spurious variability. Building on previous research, an algorithm is presented in this paper to improve the temporal correlations in the EOF expansion. As in previous work, a filter, such as a Laplacian filter, is applied to the temporal covariance matrix, but the presented algorithm also takes the temporal relationship between two consecutive images into account when setting the filter weights: for example, two images from the same season are more likely to be correlated than two images from different seasons, so the latter pair is weighted less in the filter. The presented approach is tested on the monthly nighttime 4-km Advanced Very High Resolution Radiometer (AVHRR) Pathfinder SST for the long-term period spanning from 1989 to 2006. The results obtained from the presented algorithm are compared to those from the original DINEOF algorithm without filtering and from the DINEOF algorithm with filtering but without taking the temporal relationship into account.
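The exact weighting scheme is not spelled out in the abstract; the sketch below only illustrates the stated idea, a Laplacian-type smoother applied along the time axis of the temporal covariance matrix in which a pair of consecutive images from different seasons is down-weighted relative to a pair from the same season. The weighting function `season_weight` and the filter strength `alpha` are assumptions.

```python
import numpy as np

def season_weight(month_i, month_j):
    """Down-weight pairs of images from different seasons (hypothetical weighting)."""
    d = min(abs(month_i - month_j), 12 - abs(month_i - month_j))  # cyclic month distance, 0..6
    return 1.0 / (1.0 + d)  # same month -> 1.0, opposite season -> ~0.14

def filter_temporal_covariance(C, months, alpha=0.1):
    """Weighted three-point (Laplacian-type) smoother along the time axis of the
    temporal covariance matrix C (T x T); months[t] is the calendar month (1..12)."""
    T = C.shape[0]
    C_f = C.copy()
    for t in range(1, T - 1):
        w_prev = season_weight(months[t], months[t - 1])
        w_next = season_weight(months[t], months[t + 1])
        C_f[t, :] = C[t, :] + alpha * (w_prev * (C[t - 1, :] - C[t, :])
                                       + w_next * (C[t + 1, :] - C[t, :]))
    return 0.5 * (C_f + C_f.T)  # keep the filtered matrix symmetric
```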

Keywords: data interpolating empirical orthogonal function, image reconstruction, sea surface temperature, temporal filter

Procedia PDF Downloads 298
14811 Multi-Objective Optimization in Carbon Abatement Technology Cycles (CAT) and Related Areas: Survey, Developments and Prospects

Authors: Hameed Rukayat Opeyemi, Pericles Pilidis, Pagone Emanuele

Abstract:

An incremental increase in performance can yield an immense reduction in the operating and capital expenses of a power generation system. Therefore, studies are constantly being carried out to improve both conventional and novel power cycles. Globally, power producers are continually researching ways to minimize emissions and to reduce the total cost rate of power plants. A substantial number of low-carbon cycle technologies have been suggested and studied; however, they all have their limitations and financial implications. In the area of carbon abatement in power plants, three major objectives conflict: the cost rate of the plant, the power output, and the environmental impact, since an increase in one of these parameters directly affects the others. This poses a multi-objective problem, and it is paramount to be able to discern the point at which improving one objective begins to degrade another; hence the need for a Pareto-based optimization algorithm. A Pareto-based optimization algorithm finds those points beyond which improving one objective influences another objective negatively, and stops there. Its application helps the user, operator, or designer make an informed decision. This paper sheds more light on the areas in which multi-objective optimization has been applied to carbon abatement technologies in the last five years, together with developments and prospects.
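As a minimal illustration of the Pareto idea discussed here, the sketch below extracts the non-dominated set from a list of candidate designs evaluated on the three conflicting objectives named in the abstract (cost rate and environmental impact to be minimised, power output to be maximised). The candidate design tuples are hypothetical.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better in at least one.
    Each design is (cost_rate, power_output, environmental_impact);
    cost and impact are minimised, power output is maximised."""
    no_worse = a[0] <= b[0] and a[1] >= b[1] and a[2] <= b[2]
    strictly_better = a[0] < b[0] or a[1] > b[1] or a[2] < b[2]
    return no_worse and strictly_better

def pareto_front(designs):
    """Return the designs not dominated by any other design."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other is not d)]

# Hypothetical candidate plant designs: (cost rate, power output, environmental impact)
candidates = [(1.0, 100.0, 0.90), (1.2, 120.0, 0.80), (1.1, 95.0, 0.95), (1.5, 121.0, 0.79)]
print(pareto_front(candidates))   # the third design is dominated by the first and drops out
```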

Keywords: gas turbine, low carbon technology, pareto optimal, multi-objective optimization

Procedia PDF Downloads 769
14810 Multi-Level Priority Based Task Scheduling Algorithm for Workflows in Cloud Environment

Authors: Anju Bala, Inderveer Chana

Abstract:

Task scheduling is the key concern for the execution of performance-driven workflow applications. As efficient scheduling can have a major impact on system performance, task scheduling is often used to assign requests to resources in an efficient way based on cloud resource characteristics. In this paper, a priority-based task scheduling algorithm is proposed that prioritizes tasks based on the length of their instructions. The proposed scheduling approach prioritizes the tasks of cloud applications according to limits set by six sigma control charts based on dynamic threshold values. Further, the proposed algorithm has been validated through the CloudSim toolkit. The experimental results demonstrate that the proposed algorithm is effective for handling multiple task lists from workflows and considerably reduces makespan and execution time.
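The abstract describes prioritising tasks by instruction length against control-chart limits with dynamic thresholds. A minimal sketch of that idea is shown below; the three-level banding and the ±1σ cut-offs are assumptions for illustration, not the authors' exact rule.

```python
import statistics

def priority_levels(task_lengths):
    """Assign a priority level to each task from control-chart style limits
    computed dynamically from the current batch of task instruction lengths."""
    mean = statistics.mean(task_lengths)
    sigma = statistics.pstdev(task_lengths) or 1.0  # guard against identical lengths
    levels = []
    for length in task_lengths:
        z = (length - mean) / sigma
        if z > 1.0:
            levels.append(1)   # longest tasks -> highest priority band
        elif z < -1.0:
            levels.append(3)   # shortest tasks -> lowest priority band
        else:
            levels.append(2)   # within the control limits
    return levels

# Hypothetical instruction lengths (MI) of tasks in a workflow
print(priority_levels([12000, 4000, 45000, 800, 16000]))
```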

Keywords: cloud computing, priority based scheduling, task scheduling, VM allocation

Procedia PDF Downloads 493
14809 Chemical Composition, in vitro Antioxidant Activity and Gas Chromatography–Mass Spectrometry Analysis of Essential Oil and Extracts of Ruta chalpensis Aerial Parts Growing in the Tunisian Sahara

Authors: Samir Falhi, Neji Gharsallah, Adel Kadri

Abstract:

Ruta chalpensis L. is a medicinal plant of the family Rutaceae that has been used as an important traditional remedy in the Mediterranean basin for the treatment of many diseases. The current study was devoted to investigating and evaluating the chemical composition, total phenolic, flavonoid and tannin contents, and in vitro antioxidant activities of ethyl acetate, ethanol and hydroalcoholic extracts and of the essential oil from the aerial parts of Ruta chalpensis from the Tunisian Sahara. Total phenolic, flavonoid and tannin contents of the extracts ranged from 40.39 ± 1.87 to 75.13 ± 1.22 mg of GAE/g, from 22.62 ± 1.55 to 27.51 ± 1.04 mg of QE/g, and from 5.56 ± 1.32 to 10.89 ± 1.10 mg of CE/g, respectively. Results showed that the highest antioxidant activity was determined for the ethanol extract, with an IC50 value of 26.23 ± 0.91 µg/mL in the 2,2-diphenyl-1-picrylhydrazyl assay, and for the hydroalcoholic extract, with an EC50 value of 412.95 ± 6.57 µg/mL and 105.52 ± 2.45 mg of α-tocopherol/g in the ferric reducing antioxidant power and total antioxidant capacity assays, respectively. Furthermore, Gas Chromatography–Mass Spectrometry (GC-MS) analysis of the essential oil led to the identification of 20 compounds representing 98.96% of the total composition. The major components of the essential oil were 2-undecanone (39.13%), 2-nonanone (25.04%), 1-nonene (13.81%), and α-limonene (7.72%). Spectral data from Fourier-transform infrared spectroscopy (FT-IR) analysis of the extracts revealed the presence of functional groups such as C=O, C–O, –OH, and C–H, which confirmed their richness in polyphenols and biologically active functional groups. These results show that Ruta chalpensis could be a potential natural source of antioxidants for use in food and nutraceutical applications.

Keywords: antioxidant, FT-IR analysis, GC-MS analysis, phytochemicals contents, Ruta chalpensis

Procedia PDF Downloads 120
14808 Evaluating Acid Buffering Capacity of Sewage Sludge Barrier for Inhibiting Remobilization of Heavy Metals in Tailing Impoundment

Authors: Huyuan Zhang, Yi Chen

Abstract:

Compacted sewage sludge has been proven feasible as a barrier material for tailing impoundments because of its low permeability and its retardation of heavy metals. The long-term penetration of acid mine drainage, however, would acidify the barrier system and result in remobilization of previously immobilized heavy metal pollutants. In this study, the effect of decreasing pH on the mobility of three typical heavy metals (Zn, Pb, and Cu) is investigated by acid titration tests on sewage sludge under various conditions. The remobilization of heavy metals is discussed based on the acid buffering capacity of the sewage sludge-leachate system. Test results indicate that heavy metals are dramatically released when the pH decreases below 6.2, and the released amounts follow the order Zn > Cu > Pb. The acid buffering capacity of sewage sludge decreases with the solid-liquid ratio but increases with the anaerobic incubation time, and it is mainly governed by the dissolution of the contained carbonates and organic matter. These results reveal that the sewage sludge possesses enough acid buffering capacity to consume the protons within the acid mine drainage. Thus, this study suggests that an explosive remobilization of heavy metals is not expected from a long-term perspective.

Keywords: acid buffering capacity, barrier, heavy metals, remobilization, sewage sludge

Procedia PDF Downloads 297
14807 A Coordinate-Based Heuristic Route Search Algorithm for Delivery Truck Routing Problem

Authors: Ahmed Tarek, Ahmed Alveed

Abstract:

The vehicle routing problem is a well-known research avenue in computing. Modern vehicle routing is increasingly focused on GPS-based coordinate systems, as state-of-the-art vehicle and trucking systems are equipped with digital navigation. In this paper, a new two-dimensional coordinate-based algorithm for addressing the vehicle routing problem in a supply chain network is proposed and explored, and the algorithm is compared with other available and recently devised heuristics. For the algorithms discussed, which include the proposed coordinate-based search heuristic, the advantages and disadvantages associated with each heuristic are explored. The proposed algorithm is studied from the standpoint of a small supermarket chain delivery network that supplies its stores in four different states along the East Coast and is trying to optimize its trucking delivery cost. Minimizing the delivery cost of the supply network is important to ensure the chain's business success.
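The paper's own coordinate-based heuristic is not specified in the abstract; the sketch below shows only the simplest member of that family, a nearest-neighbour route built directly from two-dimensional store coordinates, as a baseline against which such heuristics are typically compared. The depot and store coordinates are hypothetical.

```python
import math

def nearest_neighbour_route(depot, stores):
    """Greedy route: start at the depot and repeatedly visit the closest unvisited store."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    route, current = [depot], depot
    remaining = list(stores)
    while remaining:
        nxt = min(remaining, key=lambda s: dist(current, s))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    route.append(depot)  # return to the depot
    total = sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
    return route, total

# Hypothetical depot and store coordinates (e.g., projected GPS positions in km)
route, cost = nearest_neighbour_route((0.0, 0.0), [(5.0, 2.0), (1.0, 7.0), (6.0, 6.0)])
print(route, round(cost, 2))
```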

Keywords: coordinate-based optimal routing, Hamiltonian Circuit, heuristic algorithm, traveling salesman problem, vehicle routing problem

Procedia PDF Downloads 125
14806 A Weighted K-Medoids Clustering Algorithm for Effective Stability in Vehicular Ad Hoc Networks

Authors: Rejab Hajlaoui, Tarek Moulahi, Hervé Guyennet

Abstract:

In a highway scenario, vehicle speeds can exceed 120 km/h. Therefore, any vehicle can enter or leave the network within a very short time. This mobility adversely affects network connectivity and decreases the lifetime of all established links. To ensure effective stability in vehicular ad hoc networks with minimal broadcast storms, we have developed a weighted algorithm based on the k-medoids clustering algorithm (WKCA). The number of clusters and the initial cluster heads are not selected randomly as usual, but are chosen considering the available transmission range and the size of the environment. Then, to ensure optimal assignment of nodes to clusters in both k-medoids phases, the combined weight of each node is computed according to additional metrics, including direction, relative speed, and proximity. Empirical results show that, in addition to the convergence speed that characterizes the k-medoids algorithm, our proposed model outperforms both the AODV-Clustering and OLSR-Clustering protocols under different densities and velocities in terms of end-to-end delay, packet delivery ratio, and throughput.
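The precise weighting formula of WKCA is not given in the abstract; the sketch below only illustrates the stated ingredients, a combined node weight built from direction, relative speed, and proximity, being used to bias the assignment of vehicles to cluster heads. The coefficients and the linear combination are assumptions.

```python
import math

def combined_weight(node, head, w_dir=0.4, w_speed=0.3, w_prox=0.3, tx_range=300.0):
    """Combined suitability weight of assigning `node` to cluster head `head`.
    node/head: dicts with 'pos' (x, y) in metres, 'speed' (m/s) and 'heading' (radians)."""
    dx = head["pos"][0] - node["pos"][0]
    dy = head["pos"][1] - node["pos"][1]
    distance = math.hypot(dx, dy)
    proximity = max(0.0, 1.0 - distance / tx_range)                        # 1 when co-located
    direction = 0.5 * (1.0 + math.cos(node["heading"] - head["heading"]))  # 1 when parallel
    rel_speed = 1.0 / (1.0 + abs(node["speed"] - head["speed"]))           # 1 when equal speeds
    return w_dir * direction + w_speed * rel_speed + w_prox * proximity

def assign_to_heads(nodes, heads):
    """Assign each vehicle to the cluster head with the largest combined weight."""
    return {i: max(range(len(heads)), key=lambda h: combined_weight(n, heads[h]))
            for i, n in enumerate(nodes)}

# Hypothetical usage:
# nodes = [{"pos": (10.0, 5.0), "speed": 27.0, "heading": 0.0}, ...]
# heads = [{"pos": (50.0, 5.0), "speed": 30.0, "heading": 0.0}, ...]
# print(assign_to_heads(nodes, heads))
```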

Keywords: communication, clustering algorithm, k-medoids, sensor, vehicular ad hoc network

Procedia PDF Downloads 213
14805 Sweepline Algorithm for Voronoi Diagram of Polygonal Sites

Authors: Dmitry A. Koptelov, Leonid M. Mestetskiy

Abstract:

The Voronoi diagram (VD) of a finite set of disjoint simple polygons, called sites, is a partition of the plane into loci (one locus for each site) – regions consisting of the points that are closer to a given site than to all others. A set of polygons is a universal model for many applications in engineering, geoinformatics, design, computer vision, and graphics. Construction of the VD of polygons is usually done by reduction to the task of constructing the VD of segments, for which there are efficient O(n log n) algorithms for n segments. The reduction also includes preprocessing – constructing segments from the polygons' sides – and postprocessing – constructing each polygon's locus by merging the loci of its sides. This approach does not take into account two specific properties of the resulting segment sites. Firstly, all these segments are connected in pairs at the vertices of the polygons. Secondly, the interior of the polygon lies on one side of each segment, and the polygon is obviously included in its own locus. Using these properties in the VD construction algorithm is a resource for reducing computations. This article proposes an algorithm for the direct construction of the VD of polygonal sites. The algorithm is based on the sweepline paradigm, which allows these properties to be exploited effectively. The solution is again based on a reduction. Preprocessing constructs a set of sites from the vertices and edges of the polygons. Each site is given an orientation such that the interior of the polygon lies to its left. The proposed algorithm then constructs the VD for the set of oriented sites with the sweepline paradigm. Postprocessing selects the edges of this VD formed by the centers of empty circles touching different polygons. The improvement in efficiency of the proposed sweepline algorithm compared with the general Fortune algorithm is achieved through the following fundamental solutions: 1. The algorithm constructs only those VD edges which lie outside the polygons; the concept of oriented sites makes it possible to avoid constructing VD edges located inside the polygons. 2. The list of events in the sweepline algorithm has a special property: the majority of events are connected with "medium" polygon vertices, where one incident polygon side lies behind the sweepline and the other in front of it. The proposed algorithm processes such events in constant time rather than in logarithmic time, as in the general Fortune algorithm. The proposed algorithm is fully implemented and tested on a large number of examples. The high reliability and efficiency of the algorithm are also confirmed by computational experiments with complex sets of several thousand polygons. It should be noted that, despite the considerable time that has passed since the publication of Fortune's algorithm in 1986, a full-scale implementation of this algorithm for an arbitrary set of segment sites has not been made. The proposed algorithm fills this gap for an important special case – a set of sites formed by polygons.
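The preprocessing step described above, turning each polygon into oriented sites so that the polygon interior lies to the left of every directed edge, can be sketched as follows. The sketch assumes the polygons are given as simple vertex lists and uses the signed area to normalise them to counter-clockwise order; the data structures are illustrative, not the authors'.

```python
def signed_area(poly):
    """Twice the signed area of a simple polygon given as a list of (x, y) vertices."""
    return sum(poly[i][0] * poly[(i + 1) % len(poly)][1]
               - poly[(i + 1) % len(poly)][0] * poly[i][1]
               for i in range(len(poly)))

def oriented_sites(poly):
    """Vertices and directed edges of the polygon, oriented so the interior is on the left.
    For a counter-clockwise polygon the interior lies to the left of each directed edge."""
    if signed_area(poly) < 0:          # clockwise input -> reverse to counter-clockwise
        poly = poly[::-1]
    vertices = list(poly)
    edges = [(poly[i], poly[(i + 1) % len(poly)]) for i in range(len(poly))]
    return vertices, edges

# Example: a clockwise square is normalised so its interior is left of every directed edge
verts, edges = oriented_sites([(0, 0), (0, 1), (1, 1), (1, 0)])
print(edges)
```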

Keywords: Voronoi diagram, sweepline, polygon sites, Fortune's algorithm, segment sites

Procedia PDF Downloads 157
14804 Integrated Process Modelling of a Thermophilic Biogas Plant

Authors: Obiora E. Anisiji, Jeremiah L. Chukwuneke, Chinonso H. Achebe, Paul C. Okolie

Abstract:

This work develops a mathematical model of a biogas plant from a mechanistic point of view, for urban clean-energy requirements. It aims at integrating thermodynamics, which deals with the direction in which a process occurs, and biochemical kinetics, which provides an understanding of the rates of biochemical reactions. The mathematical formulation of the proposed gas plant follows the fundamental principles of thermodynamics, and further analyses were carried out to develop an algorithm for evaluating the plant performance, preferably in terms of daily production capacity. In addition, the capacity of the plant is estimated for a given cycle of operation and presented as time histories. A nominal 1500 m³ biogas plant was studied and its performance efficiency evaluated. It was observed that the rate of biogas production is essentially a function of the enthalpy ratio, the reactor temperature, pH, substrate concentration, rate of degradation of the biomass, and the accumulation of matter in the system due to bacterial growth. The results of this study conform to a very large extent with reported empirical data from existing plants, and further model validations were conducted against classical records found in the literature.

Keywords: anaerobic digestion, biogas plant, biogas production, bio-reactor, energy, fermentation, rate of production, temperature, therm

Procedia PDF Downloads 410
14803 Sustainability Innovation Capacity Building Framework for UN Sustainable Development Goals

Authors: C. Park, H. Lee, Y-J. Lee

Abstract:

Aim: This study aims to present the Sustainability Innovation Capacity Building Framework (SICBF) to enable the wider public to achieve the UN Sustainable Development Goals (UN SDGs) for a sustainable future. The intrinsically interwoven nature of sustainability requires systematic approaches to attain it. However, there is a lack of an effective capacity building framework that enables a systematic implementation approach for the UN SDGs. The SICBF illustrates six core components and their dynamics: 1. Momentum creation; 2. Exposure to diverse worldviews; 3. Serendipity/Eureka moment; 4. Creative problem solving; 5. Individual empowerment; 6. Systems thinking. Method: First, a structured literature review was used to synthesise existing studies of sustainability competencies and generic innovation competencies. Secondly, the conceptual framework based on the literature findings was tested with participant survey and interview data collected from four MAKEathon events. The interview analysis and event observation data were used to further refine and validate the conceptual framework. Contributions: The scientific contribution of this study is to pave the way for an SDG-specific capacity building framework that caters to the need for systematic approaches, allowing the wider public to tackle the seemingly intractable sustainable development goals. The framework will aid sustainable development academics, educators, and practitioners in understanding the dynamics of how capacity building can be facilitated.

Keywords: capacity building, sustainability innovation, sustainable development, systems thinking, UN SDGs

Procedia PDF Downloads 53
14802 Phytoremediation Potential of Polypogon monspeliensis L. in Detoxification of Petroleum-Contaminated Soils

Authors: Mozhgan Farzami Sepehr, Farhad Nourozi

Abstract:

In a greenhouse study, the decontamination capacity of the species Polypogon monspeliensis was evaluated for the detoxification of petroleum-polluted soils caused by the sewage and waste materials of the Tehran Petroleum Refinery. For this purpose, the amount of total oil and grease was measured before and 45 days after transplanting one-month-old seedlings into the soils of five different treatments, in which pollution-free agricultural soil and contaminated soil were mixed at weight ratios of 1:9 (10%), 2:8 (20%), 3:7 (30%), 4:6 (40%), and 5:5 (50%), and compared with the amounts obtained from a control treatment without vegetation but with the same concentration of pollution. The findings demonstrate that the maximum reduction in petroleum content, as much as 84.85 percent, was obtained in the 10% treatment containing the plant. The increases in shoot height in the 10% and 20% treatments, as well as in root dry and fresh weight in the 10%, 20%, and 30% treatments, suggest that greater activity of the plant's rhizosphere microorganisms in these treatments led to improved growth of the plant organs compared to the treatments without pollution.

Keywords: phytoremediation, total oil and grease, rhizosphere, microorganisms, petroleum-contaminated soil

Procedia PDF Downloads 385
14801 Experimental Investigations on Ultimate Bearing Capacity of Soft Soil Improved by a Group of End-Bearing Columns

Authors: Mamata Mohanty, J. T. Shahu

Abstract:

In-situ deep mixing is an effective ground improvement technique that involves columnar inclusions in soft ground to increase its bearing capacity and reduce settlement. The first part of the study presents the results of unconfined compression tests on cement-admixed clay prepared at different cement contents and subjected to varying curing periods. It is found that cement content is the prime factor controlling the strength of the cement-admixed clay. Besides cement content, the curing period is an important parameter that adds to the strength of cement-admixed clay. An increase in cement content leads to a significant increase in Unconfined Compressive Strength (UCS) values, especially at cement contents greater than 8%. The second part of the study investigated the bearing capacity of clay ground improved by a group of end-bearing columns using model tests under plane-strain conditions. This part mainly focuses on examining the effect of cement content on the ultimate bearing capacity and failure stress of the improved clay ground. The study shows that the bearing capacity of the improved ground increases significantly with increasing cement content of the soil-cement columns. A considerable increase in the stiffness of the model ground and in the failure stress was also observed with increasing cement content.

Keywords: bearing capacity, cement content, curing time, unconfined compressive strength, undrained shear strength

Procedia PDF Downloads 158
14800 Real Time Video Based Smoke Detection Using Double Optical Flow Estimation

Authors: Anton Stadler, Thorsten Ike

Abstract:

In this paper, we present a video-based smoke detection algorithm based on TV-L1 optical flow estimation. The main part of the algorithm is an accumulating system for the motion angles and the upward motion speed of the flow field. We optimized the usage of TV-L1 flow estimation for the detection of smoke with very low smoke density. Therefore, we use adapted flow parameters and estimate the flow field on difference images. We show in theory and in evaluation that this significantly improves the performance of smoke detection. We evaluate the smoke detection algorithm using videos with different smoke densities and different backgrounds. We show that smoke detection is very reliable in varying scenarios. Furthermore, we verify that our algorithm is very robust towards crowded-scene disturbance videos.
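The sketch below is not the authors' detector; it only illustrates the stated pipeline of estimating dense optical flow on difference images and accumulating upward motion. OpenCV's Farnebäck flow is used as a stand-in for the TV-L1 estimator, and the accumulation thresholds are assumptions.

```python
import cv2
import numpy as np

def upward_motion_score(frames, diff_step=2):
    """Accumulate upward motion estimated on difference images of grayscale frames."""
    diffs = [cv2.absdiff(frames[i], frames[i - diff_step])
             for i in range(diff_step, len(frames))]
    score = 0.0
    for prev, curr in zip(diffs[:-1], diffs[1:]):
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        fx, fy = flow[..., 0], flow[..., 1]
        magnitude = np.hypot(fx, fy)
        upward = (fy < -0.5) & (magnitude > 0.5)   # image y grows downwards
        score += float(magnitude[upward].sum())
    return score

# Hypothetical usage: flag possible smoke if the accumulated upward motion exceeds a tuned threshold
# frames = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in video_frames]
# if upward_motion_score(frames) > THRESHOLD: print("possible smoke")
```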

Keywords: low density, optical flow, upward smoke motion, video based smoke detection

Procedia PDF Downloads 331
14799 Anomaly Detection Based on Fuzzy K-Mode Clustering for Categorical Data

Authors: Murat Yazici

Abstract:

Anomalies are irregularities found in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies in data has been a subject of study within the statistics field since the 1800s. Over time, a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis can be used to detect anomalies. It is the process of grouping data into clusters whose members are as similar as possible while the clusters themselves are as dissimilar from each other as possible. Many of the traditional clustering algorithms have limitations in dealing with data sets containing categorical attributes. To detect anomalies in categorical data, a fuzzy clustering approach can be used to advantage. The fuzzy k-modes (FKM) clustering algorithm, one of the fuzzy clustering approaches and an extension of the k-means algorithm, has been reported for clustering datasets with categorical values. It is a form of soft clustering: each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated data sets by using the FKM clustering algorithm. A significant aspect of the study is that, in contrast to numerous anomaly detection algorithms, the FKM clustering algorithm allows anomalies to be determined together with their degree of abnormality. According to the results, the FKM clustering algorithm showed good performance in the anomaly detection of data containing both a single anomaly and multiple anomalies.
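As a compact illustration of the idea, soft memberships over categorical data and an abnormality degree derived from them, the sketch below implements a simplified fuzzy k-modes iteration with simple matching dissimilarity. It is an assumption-laden sketch (deterministic initialisation, fixed fuzzifier m, abnormality defined as one minus the largest membership), not the algorithm exactly as used in the paper.

```python
from collections import Counter

def dissimilarity(x, mode):
    """Simple matching dissimilarity between two categorical records."""
    return sum(a != b for a, b in zip(x, mode))

def fuzzy_k_modes(data, k=2, m=1.5, iters=20):
    """Simplified fuzzy k-modes: soft memberships of categorical records in k clusters."""
    modes = [data[i] for i in range(k)]            # deterministic initialisation for this sketch
    for _ in range(iters):
        U = []                                     # U[i][j]: membership of record i in cluster j
        for x in data:
            d = [max(dissimilarity(x, mode), 1e-9) for mode in modes]
            U.append([1.0 / sum((d[j] / d[l]) ** (1.0 / (m - 1.0)) for l in range(k))
                      for j in range(k)])
        for j in range(k):                         # update each mode attribute by a weighted vote
            new_mode = []
            for a in range(len(data[0])):
                votes = Counter()
                for i, x in enumerate(data):
                    votes[x[a]] += U[i][j] ** m
                new_mode.append(votes.most_common(1)[0][0])
            modes[j] = tuple(new_mode)
    return modes, U

def abnormality(U):
    """Abnormality degree of each record: 1 minus its strongest cluster membership."""
    return [1.0 - max(u) for u in U]

data = [("red", "small"), ("blue", "large"), ("red", "small"),
        ("blue", "large"), ("green", "tiny")]      # the last record is the intended anomaly
modes, U = fuzzy_k_modes(data)
print([round(a, 3) for a in abnormality(U)])       # the anomaly receives the highest degree
```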

Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data

Procedia PDF Downloads 29
14798 An Image Segmentation Algorithm for Gradient Target Based on Mean-Shift and Dictionary Learning

Authors: Yanwen Li, Shuguo Xie

Abstract:

In electromagnetic imaging, because the system is diffraction-limited, pixel values change slowly near the edges of image targets and also vary with location within the same target. Using traditional digital image segmentation methods to segment electromagnetic gradient images can therefore result in many errors. To address this issue, this paper proposes a novel image segmentation and extraction algorithm based on Mean-Shift and dictionary learning. Firstly, the preliminary segmentation results from the adaptive-bandwidth Mean-Shift algorithm are expanded, merged and extracted. Then the overlap rate of the extracted image blocks is checked before determining a segmentation region with a single complete target. Lastly, the gradient edge of the extracted targets is recovered and reconstructed by using a dictionary-learning algorithm, and the final segmentation results, which are very close to the gradient target in the original image, are obtained. Both the experimental results and the simulated results show that the segmentation results are very accurate. The Dice coefficients are improved by 70% to 80% compared with the Mean-Shift-only method.
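The Dice coefficient quoted in the evaluation measures the overlap between a computed segmentation mask and the reference mask. A minimal sketch of that metric is shown below; the masks are hypothetical.

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice overlap between two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom else 1.0

# Hypothetical 4x4 masks
pred = np.array([[0, 1, 1, 0]] * 4)
truth = np.array([[0, 1, 1, 1]] * 4)
print(dice_coefficient(pred, truth))   # 0.8
```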

Keywords: gradient image, segmentation and extraction, mean-shift algorithm, dictionary learning

Procedia PDF Downloads 244
14797 Effect of Fermentation Time on Some Functional Properties of Moringa (Moringa oleifera) Seed Flour

Authors: Ocheme B. Ocheme, Omobolanle O. Oloyede, S. James, Eleojo V. Akpa

Abstract:

The effect of fermentation time on some functional properties of Moringa (Moringa oleifera) seed flour was examined. Fermentation, an effective processing method used to improve the nutritional quality of plant foods, tends to affect the characteristics of food components and their behaviour in food systems, just like other processing methods; hence the need for this study. Moringa seeds were fermented naturally by soaking them in potable water and allowing them to stand for 12, 24, 48 and 72 hours. At the end of fermentation, the seeds were oven-dried at 60°C for 12 hours and then milled into flour. Flour obtained from unfermented seeds served as the control, giving a total of five flour samples. The functional properties were analyzed using standard methods. Fermentation significantly (p<0.05) increased the water holding capacity of Moringa seed flour from 0.86 g/g to 2.31 g/g; the highest value was observed after 48 hours of fermentation. The same trend was observed for oil absorption capacity, with values between 0.87 and 1.91 g/g. Flour from unfermented Moringa seeds had a bulk density of 0.60 g/cm³, which was significantly (p<0.05) higher than the bulk densities of flours from seeds fermented for 12, 24 and 48 hours. Fermentation significantly (p<0.05) decreased the dispersibility of Moringa seed flours from 36% to 21, 24, 29 and 20% after 12, 24, 48 and 72 hours of fermentation, respectively. The flours' emulsifying capacities increased significantly (p<0.05) with increasing fermentation time, with values between 50 and 68%. The flour obtained from seeds fermented for 12 hours had a significantly (p<0.05) higher foaming capacity of 16%, while the flours obtained from seeds fermented for 0, 24 and 72 hours had the lowest foaming capacities of 9%. Flours from seeds fermented for 12 and 48 hours had better functional properties than flours from seeds fermented for 24 and 72 hours.

Keywords: fermentation, flour, functional properties, Moringa

Procedia PDF Downloads 652
14796 Sequential Covering Algorithm for Nondifferentiable Global Optimization Problem and Applications

Authors: Mohamed Rahal, Djaouida Guetta

Abstract:

In this paper, the one-dimensional unconstrained global optimization problem for continuous functions satisfying a Hölder condition is considered. We extend the sequential covering algorithm (SCA) for Lipschitz functions to a large class of Hölder functions. The convergence of the method is studied, and the algorithm can be applied to systems of nonlinear equations. Finally, some numerical examples are presented to illustrate the efficiency of the proposed approach.
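The abstract does not reproduce the algorithm itself; the sketch below shows only the basic covering idea for a Hölder function, choosing a uniform covering fine enough that the Hölder bound H·(δ/2)^α guarantees the sampled minimum is within ε of the global minimum, rather than the adaptive SCA variant studied in the paper. The test function, H, and α are assumptions.

```python
import math

def covering_minimize(f, a, b, H, alpha, eps):
    """Approximate the global minimum of f on [a, b] to accuracy eps, assuming
    |f(x) - f(y)| <= H * |x - y|**alpha (Hölder condition)."""
    delta = 2.0 * (eps / H) ** (1.0 / alpha)   # ensures H * (delta / 2)**alpha <= eps
    n = max(1, math.ceil((b - a) / delta))
    best_x, best_f = None, math.inf
    for i in range(n):
        x = a + (i + 0.5) * (b - a) / n        # centre of the i-th subinterval
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f                      # best_f is within eps of the global minimum

# Hypothetical test: a Hölder-continuous function with alpha = 0.5
f = lambda x: math.sqrt(abs(x - 0.7)) + 0.1 * x
print(covering_minimize(f, 0.0, 2.0, H=1.5, alpha=0.5, eps=1e-2))
```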

Keywords: global optimization, Hölder functions, sequential covering method, systems of nonlinear equations

Procedia PDF Downloads 343
14795 Study of Adaptive Filtering Algorithms and the Equalization of Radio Mobile Channel

Authors: Said Elkassimi, Said Safi, B. Manaut

Abstract:

This paper presents a study of three elements: equalization of the transmission channel with the zero-forcing (ZF) and minimum mean square error (MMSE) criteria, application to the BRAN A channel, and the adaptive filtering algorithms LMS and RLS used to estimate the parameters of the equalizer filter, i.e., to track the channel estimate, follow the temporal variations of the channel, and reduce the error in the transmitted signal. The performance of the equalizer with the ZF and MMSE criteria is examined in the noise-free case, together with a comparison of the performance of the LMS and RLS algorithms.
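As a minimal illustration of the adaptive part of this study, the sketch below shows an LMS update for a linear equalizer trained on a known sequence; the channel taps, filter length, and step size are assumptions and do not correspond to the BRAN A channel used in the paper.

```python
import numpy as np

def lms_equalizer(received, desired, n_taps=7, mu=0.01):
    """Train a linear equalizer with the LMS rule w <- w + mu * e * x."""
    w = np.zeros(n_taps)
    errors = []
    for n in range(n_taps - 1, len(received)):
        x = received[n - n_taps + 1:n + 1][::-1]   # current and past received samples
        y = np.dot(w, x)                           # equalizer output
        e = desired[n] - y                         # error against the known training symbol
        w = w + mu * e * x                         # LMS coefficient update
        errors.append(e ** 2)
    return w, errors

# Hypothetical example: BPSK symbols through a simple 3-tap channel plus noise (not BRAN A)
rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=2000)
channel = np.array([1.0, 0.4, 0.2])
received = np.convolve(symbols, channel)[:len(symbols)]
received = received + 0.01 * rng.standard_normal(len(symbols))
w, errors = lms_equalizer(received, symbols)
print(round(float(np.mean(errors[-200:])), 4))     # residual squared error after adaptation
```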

Keywords: adaptive filtering, equalizer, LMS, RLS, BRAN A, Proakis (B), MMSE, ZF

Procedia PDF Downloads 298
14794 Sync Consensus Algorithm: Trying to Reach an Agreement at Full Speed

Authors: Yuri Zinchenko

Abstract:

Recently, distributed storage systems have been used more and more in various aspects of everyday life. They provide such necessary properties as scalability, fault tolerance, durability, and others. At the same time, not only reliable but also fast data storage remains one of the most pressing issues in this area. That brings us to the consensus algorithm as one of the most important components, one that has a great impact on the functionality of a distributed system. This paper is the result of an analysis of several well-known consensus algorithms, such as Paxos and Raft. The algorithm it offers, called Sync, promotes, but does not insist on, simultaneous writing to the nodes (which positively affects the overall writing speed) and tries to minimize the system's inactive time. This allows nodes to reach agreement on the system state in a shorter period, which is a critical factor for distributed systems. Also, when developing Sync, a lot of attention was paid to such criteria as simplicity and intuitiveness, the importance of which is difficult to overestimate.

Keywords: sync, consensus algorithm, distributed system, leader-based, synchronization.

Procedia PDF Downloads 42
14793 Kinoform Optimisation Using Gerchberg-Saxton Iterative Algorithm

Authors: M. Al-Shamery, R. Young, P. Birch, C. Chatwin

Abstract:

Computer Generated Holography (CGH) is employed to create digitally defined coherent wavefronts. A CGH can be created using different techniques, such as a detour-phase technique or direct phase modulation to create a kinoform. The detour-phase technique was one of the first techniques used to generate holograms digitally. The disadvantage of this technique is that the reconstructed image often has poor quality due to the limited dynamic range it is possible to record using a medium with reasonable spatial resolution. The kinoform (phase-only hologram) is an alternative technique. In this method, the phase of the original wavefront is recorded but the amplitude is constrained to be constant. The original object does not need to exist physically, so the kinoform can be used to reconstruct an almost arbitrary wavefront. However, the image reconstructed by this technique contains high levels of noise and is not identical to the reference image. To improve the reconstruction quality of the kinoform, iterative techniques such as the Gerchberg-Saxton (GS) algorithm are employed. In this paper, the GS algorithm is described for the optimisation of a kinoform used for the reconstruction of a complex wavefront. Iterations of the GS algorithm are applied to determine the phase at a plane (with a known amplitude distribution, which is often taken as uniform) that satisfies given phase and amplitude constraints in a corresponding Fourier plane. The GS algorithm can be used in this way to enhance the reconstruction quality of the kinoform. Different images are employed as the reference object and their kinoforms are synthesised using the GS algorithm. The quality of the reconstructed images is quantified to demonstrate the enhanced reconstruction quality achieved by using this method.
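The GS iteration described above can be sketched compactly with FFTs: at each step the field is propagated between the kinoform plane and the image plane, the phase is kept, and the amplitude is replaced by the respective constraint (uniform amplitude at the kinoform, the target amplitude at the image). The random initial phase, iteration count, and square test target below are typical illustrative choices, not the paper's settings.

```python
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=50, seed=0):
    """Return a phase-only kinoform whose Fourier transform approximates target_amplitude."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amplitude.shape)
    field = np.exp(1j * phase)                                   # uniform amplitude, random phase
    for _ in range(iterations):
        image = np.fft.fft2(field)                               # propagate to the image plane
        image = target_amplitude * np.exp(1j * np.angle(image))  # impose the target amplitude
        field = np.fft.ifft2(image)                              # back to the kinoform plane
        field = np.exp(1j * np.angle(field))                     # impose unit (phase-only) amplitude
    return np.angle(field)                                       # the kinoform phase

# Hypothetical target: a bright square on a dark background
target = np.zeros((128, 128))
target[48:80, 48:80] = 1.0
kinoform = gerchberg_saxton(target)
reconstruction = np.abs(np.fft.fft2(np.exp(1j * kinoform)))      # resembles the target up to scale
```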

Keywords: computer generated holography, digital holography, Gerchberg-Saxton algorithm, kinoform

Procedia PDF Downloads 506
14792 Strengthening of Bridges by Additional Prestressing

Authors: A. Bouhaloufa, T. Kadri, S. Zouaoui, A. Belhacene

Abstract:

To obtain more durable bridges, it is important to maintain existing structures rather than investing only in new ones. Instead of demolishing old bridges and replacing them with new ones, we must preserve and upgrade them using better methods of diagnosis, auscultation and repair. The interest of this work is to increase the bearing capacity of damaged bridges by additional prestressing, a type of reinforcement whose use is growing continuously. In addition to excellent static strength, prestressing also has a very high resistance to fatigue, so it is well suited to solving the problem of bearing capacity failure in bridges. This failure often arises from the growth of overloads in both quantity and severity: daily traffic has increased and become very complex, while its constituents have advanced in weight and speed, so that almost all old bridges have become unable to support this traffic and remain vulnerable to these problems. The main purpose of this work includes the following three aspects: determination of the main pathologies and factors affecting the deterioration of bridges in Algeria; evaluation of the bearing capacity of bridges; and a proposal of technical reinforcement to improve the bearing capacity of a degraded structure.

Keywords: bridges, repair, auscultation, diagnosis, pathology, additional prestressing

Procedia PDF Downloads 589
14791 Automated Test Data Generation for Some Types of Algorithms

Authors: Hitesh Tahbildar

Abstract:

The cost of test data generation for a program is computationally very high. In the general case, no algorithm has been found that generates test data for all types of algorithms, and the cost of generating test data differs between algorithm types. To date, the emphasis has been on generating test data for different types of programming constructs rather than for different types of algorithms. Test data generation methods have been implemented to find heuristics for different types of algorithms. Several algorithm types, including divide and conquer, backtracking, the greedy approach, and dynamic programming, have been tested to find the minimum cost of test data generation. Our experimental results indicate that some of these algorithm types can be used as a necessary condition for selecting heuristics, while programming constructs are a sufficient condition for selecting our heuristics. Finally, we recommend which heuristics for test data generation should be selected for different types of algorithms.

Keywords: longest path, saturation point, lmax, kL, kS

Procedia PDF Downloads 382
14790 Review of Sulfur Unit Capacity Expansion Options

Authors: Avinashkumar Karre

Abstract:

The sulfur recovery unit, most commonly known as the Claus process, is a very significant gas desulfurization process unit in the refining and gas industries. Exploration of new natural gas fields, refining of high-sulfur crude oils, and recent crude expansion projects are driving capacity expansion of the Claus unit for many companies around the world. In refineries, the sulfur recovery units take acid gas from amine regeneration units and sour water strippers, converting hydrogen sulfide to elemental sulfur using the Claus process. The Claus process is hydraulically limited by mass flow rate. Reducing the pressure drop across control valves, flow meters, lines, knock-out drums, and packing improves the capacity. Oxygen enrichment helps improve the capacity by removing nitrogen; this is commonly done on capacity expansion projects. Typical upgrades required due to oxygen enrichment are new burners, new refractory in the thermal reactor, resizing of the first condenser, instrumentation changes, and steam/condensate heat integration. Some other capacity expansion options typically considered are a tail gas compressor, replacing the air blower with one of higher head, minimization of hydrocarbons in the feed, water removal, and ammonia removal. Capacity-related upgrades in the sulfur recovery unit also require changes in the tail gas treatment unit; typical changes include improvements to the quench tower duty, packing area upgrades in the quench and absorber towers, and increased amine circulation flow rates.

Keywords: Claus process, oxygen enrichment, sulfur recovery unit, tail gas treatment unit

Procedia PDF Downloads 102
14789 Optimal Decisions for Personalized Products with Demand Information Updating and Limited Capacity

Authors: Meimei Zheng

Abstract:

Product personalization can not only bring new profits to companies but also provide a direction for their long-term development. However, the characteristics of personalized products cause some new problems. This paper investigates how companies make decisions on the supply of personalized products when facing different customer attitudes to personalized products and services, constraints due to limited capacity, and updates of personalized demand information. This study provides optimal decisions for companies developing personalized markets, thereby promoting business transformation and improving business competitiveness.

Keywords: demand forecast updating, limited capacity, personalized products, optimization

Procedia PDF Downloads 234
14788 The Effect of an Infill on the Bearing Capacity and Stiffness of Infilled Frames

Authors: Goran Baloevic, Jure Radnic, Nikola Grgic

Abstract:

The use of frames with masonry or panel infill is common in engineering practice. In these cases, the frame is often considered to be the primary structure, while the infill is considered to be a secondary structure. In past calculations, the infill was rarely included in the design of frame structures in terms of their bearing capacity and safety. Recent calculations of such structures necessarily include the effect of the infill, since it contributes to the stiffness and bearing capacity of the overall system, especially under horizontal loads. In certain cases, if the infill is not included in the seismic design of frame structures, the result can be lower design safety. However, since different configurations of the infill over the building's height are possible, the contribution of such infill to the overall bearing capacity can be lower while the seismic forces on the building are increased due to the greater stiffness of the structure. So far, much experimental and numerical research on the behavior of infilled frames under horizontal static forces and earthquakes has been performed. In this paper, several masonry-infilled concrete and steel frames under horizontal static forces and earthquake loading are analysed. The shake-table experimental results and the numerical results are compared in terms of the bearing capacity of bare and infilled frames. Herein, the stiffness of the frames and infill was varied, with different positions of the infill and different types of openings. Cases with positive and negative effects of the infill on the bearing capacity of the frames were considered. Finally, the main conclusions and recommendations for the practical application and design of masonry-infilled concrete and steel frames are given.

Keywords: bearing capacity, infilled frame, numerical model, shake table

Procedia PDF Downloads 440
14787 Total Chromatic Number of Δ-Claw-Free 3-Degenerated Graphs

Authors: Wongsakorn Charoenpanitseri

Abstract:

The total chromatic number χ"(G) of a graph G is the minimum number of colors needed to color the elements (vertices and edges) of G such that no incident or adjacent pair of elements receives the same color. Let G be a graph with maximum degree Δ(G). Consider a total coloring of G and focus on a vertex of maximum degree: this vertex needs one color, and the Δ(G) edges incident to it need Δ(G) further distinct colors. Coloring all vertices and all edges of G therefore requires at least Δ(G) + 1 colors; that is, χ"(G) is at least Δ(G) + 1. However, no graph G whose total chromatic number is greater than Δ(G) + 2 has been found. The Total Coloring Conjecture states that for every graph G, χ"(G) is at most Δ(G) + 2. In this paper, we prove the Total Coloring Conjecture for Δ-claw-free 3-degenerated graphs. That is, we prove that the total chromatic number of every Δ-claw-free 3-degenerated graph is at most Δ(G) + 2.
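The two bounds discussed above can be written in one line; a minimal LaTeX rendering is:

```latex
\Delta(G) + 1 \;\le\; \chi''(G) \;\le\; \Delta(G) + 2
```

The left inequality is the counting argument at a maximum-degree vertex; the right inequality is the conjectured bound, established in this paper for Δ-claw-free 3-degenerated graphs.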

Keywords: total colorings, total chromatic number, 3-degenerated, claw-free

Procedia PDF Downloads 157
14786 In the Study of CO₂ Capacity Performance of Different Frothing Agents through Process Simulation

Authors: Muhammad Idrees, Masroor Abro, Sikandar Almani

Abstract:

The increasing CO₂ concentration in the atmosphere is presently regarded as one of the major challenges faced by the modern world. The average atmospheric CO₂ concentration reached its highest value of 414.72 ppm in 2021, as reported at the Conference of the Parties (COP26). This study focuses on (i) a comparative study of MEA, NaOH, acetic acid, and Na₂CO₃ in terms of their CO₂ capture performance, (ii) the significance of adding various frothing agents to achieve an improved absorption capacity for Na₂CO₃, and (iii) the overall economic evaluation of the process with the help of Aspen Plus. The results obtained suggest that the addition of frothing agents significantly increased the absorption rate of dilute sodium carbonate, from 45% to 99.9%. The effects of temperature, pressure and the flow rates of the liquid and flue gas streams on the CO₂ absorption capacity were also investigated. It was found that the absorption capacity of Na₂CO₃ decreased with increasing liquid stream temperature and with decreasing liquid stream flow rate and gas stream pressure.

Keywords: CO₂, absorbents, frothing agents, process simulation

Procedia PDF Downloads 52
14785 Understanding Social Networks in Community's Coping Capacity with Floods: A Case Study of a Community in Cambodia

Authors: Ourn Vimoil, Kallaya Suntornvongsagul

Abstract:

Cambodia is considered one of the most disaster-prone countries in Southeast Asia, and most of its natural disasters are related to floods. As a developing country, Cambodia faces significant impacts from floods, such as environmental, social, and economic losses. Using data from focus group discussions and field surveys with villagers in Ba Baong commune, Prey Veng province, Cambodia, this research examines the roles of social networks in raising the community's capacity to cope with floods. The findings indicate that social capital plays crucial roles in three stages of flooding, namely preparedness, response, and recovery, in overcoming the crisis. People shared information and resources and extended assistance to one another in order to adapt to floods. The study encourages policy makers and national and international agencies working on this issue to pay attention to social networks as one factor that can accelerate flood coping capacity at the community level.

Keywords: social network, community, coping capacity, flood, Cambodia

Procedia PDF Downloads 343
14784 Transformer Design Optimization Using Artificial Intelligence Techniques

Authors: Zakir Husain

Abstract:

The main objective of a power transformer design optimization problem is to minimize the total overall cost and/or mass of the winding and core material while satisfying all constraints imposed by the standards and the transformer user's requirements. The constraints include appropriate limits on winding fill factor, temperature rise, efficiency, no-load current and voltage regulation. The design optimization task is therefore to find a constrained minimum-cost and/or minimum-mass solution by optimally setting the parameters, geometry and required magnetic properties of the transformer. In this paper, the above design problems are formulated using a genetic algorithm (GA) and simulated annealing (SA) on the MATLAB platform. The importance of the presented approach stems from two main features. First, the proposed technique provides a reliable and efficient solution for the problem of design optimization with several variables. Second, the obtained solution is guaranteed to be a global optimum. This paper also includes a demonstration of the application of the genetic programming (GP) technique to transformer design.
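The abstract formulates the design task as constrained cost/mass minimisation solved with GA and SA. The sketch below shows only the generic GA skeleton of such a formulation: a penalised objective, tournament selection, uniform crossover and mutation over a bounded parameter vector. The two-variable toy objective and the penalty form are assumptions, not the paper's transformer model.

```python
import random

def penalised_cost(x):
    """Toy stand-in for the transformer cost: quadratic cost plus a constraint penalty."""
    cost = (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2
    penalty = 1e3 * max(0.0, x[0] + x[1] - 4.0) ** 2   # hypothetical constraint x0 + x1 <= 4
    return cost + penalty

def genetic_minimise(f, bounds, pop_size=40, generations=100, p_mut=0.2, seed=1):
    """Minimise f over box-bounded variables with a simple genetic algorithm."""
    random.seed(seed)
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)                       # tournament of two for parent 1
            p1 = a if f(a) < f(b) else b
            c, d = random.sample(pop, 2)                       # tournament of two for parent 2
            p2 = c if f(c) < f(d) else d
            child = [random.choice(pair) for pair in zip(p1, p2)]   # uniform crossover
            if random.random() < p_mut:                        # Gaussian mutation, clipped to bounds
                i = random.randrange(dim)
                lo, hi = bounds[i]
                child[i] = min(hi, max(lo, child[i] + random.gauss(0.0, 0.1 * (hi - lo))))
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=f)

best = genetic_minimise(penalised_cost, bounds=[(0.0, 5.0), (0.0, 5.0)])
print([round(v, 3) for v in best], round(penalised_cost(best), 4))
```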

Keywords: optimization, power transformer, genetic algorithm (GA), simulated annealing technique (SA)

Procedia PDF Downloads 558