Search results for: dimension reducing distribution load flow algorithm
16778 Distribution of Synechococcus and Prochlorococcus in Southeastern Coast of Peninsular Malaysia
Authors: Roswati Md. Amin, Nurul Asmera Mudiman, Muhammad Faisal Abd. Rahman, Md-Suffian Idris, Noor Hazwani Mohd Azmi
Abstract:
The distribution of picophytoplankton from two genera, Synechococcus and Prochlorococcus, in surface water (0.5 m) was observed from the coastal to the offshore area of the southeastern coast of Peninsular Malaysia during a six-day cruise in August 2014, during the southwest monsoon. The picophytoplankton were divided into two size fractions (0.7-2.7 μm and <0.7 μm) by filtering through GF/D (2.7 μm) and GF/F (0.7 μm) filter papers and counted using a flow cytometer. Synechococcus and Prochlorococcus contributed more to the 0.7-2.7 μm size range (ca. 90% and 95%, respectively) than to the <0.7 μm fraction (ca. 10% and 5%, respectively). Synechococcus (>52%) dominated the total picophytoplankton relative to Prochlorococcus (<26%) for both size fractions along the southeastern coast of Peninsular Malaysia. The total density (<2.7 μm) of Synechococcus ranged between 1.72 × 10⁴ and 12.57 × 10⁴ cells ml⁻¹, while Prochlorococcus varied from 1.50 × 10⁴ to 8.62 × 10⁴ cells ml⁻¹. The abundance of both Synechococcus and Prochlorococcus showed a decreasing trend from the coast to offshore.
Keywords: Peninsular Malaysia, prochlorococcus, South China Sea, synechococcus
Procedia PDF Downloads 316
16777 Analysis of CO₂ Two-Phase Ejector with Taguchi and ANOVA Optimization and Refrigerant Selection with Enviro Economic Concerns by TOPSIS Analysis
Authors: Karima Megdouli, Bourhan Tachtouch
Abstract:
Ejector refrigeration cycles offer an alternative to conventional systems for producing cold from low-temperature heat. In this article, a thermodynamic model is presented. This model has the advantage of simplifying the calculation algorithm and describes the complex double-throttling mechanism that occurs in the ejector. The model assumptions and calculation algorithm are presented first, the impact of each efficiency is evaluated, and validation is performed on several data sets. The ejector model is then used to simulate a refrigeration ejector system (RES) to confirm its robustness and suitability for predicting thermodynamic cycle performance. A Taguchi and ANOVA optimization is carried out on the RES. TOPSIS analysis is applied to select the optimum refrigerants, taking into account cost, safety, environmental and enviro-economic concerns along with thermophysical properties.
Keywords: ejector, velocity distribution, shock circle, Taguchi and ANOVA optimization, TOPSIS analysis
Procedia PDF Downloads 89
16776 Three-Dimensional CFD Modeling of Flow Field and Scouring around Bridge Piers
Authors: P. Deepak Kumar, P. R. Maiti
Abstract:
In recent years, sediment scour near bridge piers and abutments has been a serious problem causing nationwide concern, because it has resulted in more bridge failures than any other cause. Scour is the formation of a scour hole around a structure mounted on and embedded in an erodible channel bed, due to the erosion of soil by flowing water. The formation of the scour hole depends upon the shape and size of the pier, the depth of flow, the angle of attack of the flow, and the sediment characteristics. The flow characteristics around these structures change because the man-made obstruction in the natural flow path alters the kinetic energy of the flow. Excessive scour affects the stability of the foundation of the structure through removal of the bed material, and the accurate estimation of scour depth around a bridge pier is very difficult. The foundations of bridge piers therefore have to be taken deeper to provide the anchorage length required for stability. In this study, simulations using a three-dimensional Computational Fluid Dynamics (CFD) model were conducted to examine the mechanism of scour around a cylindrical pier. The flow characteristics around these structures are presented for different flow conditions, the mechanism of the scouring phenomenon and the formation of the vortex and its consequent effect are discussed for a straight channel, and an effort is made towards estimating the scour depth around bridge piers under different flow conditions.
Keywords: bridge pier, computational fluid dynamics, multigrid, pier shape, scour
Procedia PDF Downloads 296
16775 Numerical Simulation of Magnetohydrodynamic (MHD) Blood Flow in a Stenosed Artery
Authors: Sreeparna Majee, G. C. Shit
Abstract:
Unsteady blood flow through stenosed arteries has been numerically investigated to obtain an idea of the physiological blood flow pattern in diseased arteries. The blood is treated as a Newtonian fluid and the arterial wall is considered to be rigid, with plaque deposited in its lumen. For direct numerical simulation, a vorticity-stream function formulation has been adopted to solve the problem using an implicit finite difference method, namely the well-known Peaceman-Rachford Alternating Direction Implicit (ADI) scheme. The effects of the magnetic parameter and Reynolds number on velocity and wall shear stress are studied and presented quantitatively over the entire arterial segment. The streamlines have been plotted to understand the flow pattern in the stenosed artery, which shows significant alterations downstream of the stenosis in the presence of a magnetic field. The results show that there are only nominal changes in the flow pattern when the magnetic field strength is increased up to 8 T, which can be of remarkable use for MRI machines.
Keywords: magnetohydrodynamics, blood flow, stenosis, energy dissipation
Procedia PDF Downloads 275
16774 Experimental Study of the Dynamics of Sediments in Natural Channels in a Non-Stationary Flow Regime
Authors: Fourar Ali, Fourar Fatima Zohra
Abstract:
Knowledge of sediment characteristics is fundamental to understanding their sedimentary functioning: the sedimentation, settlement, and erosion processes of cohesive sediments are controlled by complex interactions between physical, chemical, and biological factors. Sediment transport is of primary importance in river hydraulics and river engineering. Indeed, the displacement of sediments can lead to lasting modifications of the bed in terms of its elevation, slope and roughness. The protection of a bank, for example, is likely to initiate a local incision of the river bed, which, in turn, can lead to the subsidence of the bank. Flows in the natural environment generally occur with heterogeneous boundary conditions, because of the distribution of the roughnesses of the fixed or mobile bottoms and the large deformations of the free surface, especially for flows of shallow depth over an irregular bottom. Bedforms significantly influence flow resistance: the arrangement of particles lining the bottom of the stream bed or experimental channel generates waveforms of different sizes that lead to changes in roughness and consequently to spatial variability in the turbulent characteristics of the flow. The study, which focuses on the laws of friction in alluvial beds, aims to analyze the characteristics of the flows and of the materials constituting natural channels. Experimental results were obtained by simulating these flows over a rough bottom in an experimental channel at the Hydraulics Laboratory of the University of Batna 2. The system of equations governing the problem is solved using the programs CLIPPER.5 and ACP.
Keywords: free surface flow, heterogeneous sand, moving bottom bed, friction coefficient, bottom roughness
Procedia PDF Downloads 90
16773 Evaluating Reliability Indices in 3 Critical Feeders at Lorestan Electric Power Distribution Company
Authors: Atefeh Pourshafie, Homayoun Bakhtiari
Abstract:
The main task of power distribution companies is to supply the power required by customers at an acceptable level of quality and reliability. Some key performance indicators for electric power distribution companies are those evaluating the continuity of supply within the network. More than other problems, power outages (due to lightning, flood, fire, earthquake, etc.) challenge the economy and business, and end users expect a reliable power supply. Reliability indices are evaluated on an annual basis by the specialized holding company Tavanir (the power production, transmission and distribution company of Iran). Evaluation of reliability indices is essential for distribution companies and, with regard to their privatization, it will be of particular importance to evaluate these indices and to plan for their improvement in the not too distant future. The IEEE-1366 standard defines many indices; however, the most common reliability indices are SAIFI, SAIDI and CAIDI. These indices describe the duration and frequency of blackouts in the reporting period (annual or any desired timeframe). This paper calculates the reliability indices for three sample feeders in the Lorestan Electric Power Distribution Company and defines the threshold values over a ten-month period. Finally, strategies are introduced to reach the threshold values in order to increase customer satisfaction.
Keywords: power, distribution network, reliability, outage
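The three indices named above have standard IEEE-1366 definitions and can be computed directly from a feeder's outage records. A minimal sketch follows; the feeder size and outage events are made up for illustration.

```python
# IEEE-1366 indices from a feeder's outage log:
#   SAIFI = total customer interruptions / customers served
#   SAIDI = total customer interruption hours / customers served
#   CAIDI = SAIDI / SAIFI (average restoration time per interruption)
def reliability_indices(outages, customers_served):
    # outages: (customers_interrupted, duration_hours) per sustained interruption
    total_interruptions = sum(n for n, _ in outages)
    total_customer_hours = sum(n * h for n, h in outages)
    saifi = total_interruptions / customers_served
    saidi = total_customer_hours / customers_served
    caidi = saidi / saifi if saifi else 0.0
    return saifi, saidi, caidi

# Hypothetical feeder: 10,000 customers, three outage events in the period
events = [(4000, 2.0), (1500, 0.5), (10000, 1.0)]
saifi, saidi, caidi = reliability_indices(events, 10_000)   # 1.55, 1.875, ~1.21
```

Threshold checking, as in the paper, then reduces to comparing each computed index against the target value for the reporting period.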
Procedia PDF Downloads 472
16772 An Efficient Subcarrier Scheduling Algorithm for Downlink OFDMA-Based Wireless Broadband Networks
Authors: Hassen Hamouda, Mohamed Ouwais Kabaou, Med Salim Bouhlel
Abstract:
The growth of wireless technology has made opportunistic scheduling a widespread theme in recent research. Providing high system throughput without reducing fairness of allocation is becoming a very challenging task, and a suitable policy for resource allocation among users is of crucial importance. This study focuses on scheduling multiple streaming flows on the downlink of a WiMAX system based on orthogonal frequency division multiple access (OFDMA). In this paper, we take the first step in formulating and analyzing this problem scrupulously. As a result, we propose a new scheduling scheme based on the Round Robin (RR) algorithm. Because of its non-opportunistic process, RR does not take radio conditions into account and consequently degrades both system throughput and multi-user diversity. Our contribution, called MORRA (Modified Round Robin Opportunistic Algorithm), proposes a solution to this issue. MORRA not only exploits the concept of an opportunistic scheduler but also takes two other parameters into account in the allocation process: a courtesy coefficient (CC) and the buffer occupancy (BO). Performance evaluation shows that this well-balanced scheme outperforms both the RR and MaxSNR schedulers and demonstrates that a choice between system throughput and fairness is not required.
Keywords: OFDMA, opportunistic scheduling, fairness hierarchy, courtesy coefficient, buffer occupancy
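The abstract does not give MORRA's exact weighting rules, but the general idea of augmenting round robin with channel state, buffer occupancy and a courtesy coefficient can be sketched as follows. The priority product (SNR × BO × CC) and the 0.5 decay factor are illustrative assumptions, not the paper's algorithm.

```python
# Opportunistic round-robin sketch: each subcarrier goes to the user with the
# best weighted claim; the courtesy coefficient (CC) decays after each grant
# so that recently served users yield to the others (fairness pressure).
def allocate_subcarriers(users, n_subcarriers):
    cc = {u: 1.0 for u in users}               # courtesy coefficient per user
    alloc = {u: [] for u in users}
    for sc in range(n_subcarriers):
        # priority: channel gain on this subcarrier x backlog x courtesy
        best = max(users, key=lambda u: users[u]["snr"][sc] * users[u]["bo"] * cc[u])
        alloc[best].append(sc)
        cc[best] *= 0.5                        # a served user yields next time
    return alloc

# Two users, four subcarriers: per-subcarrier SNR and buffer occupancy (bits)
users = {
    "A": {"snr": [9, 1, 8, 2], "bo": 1000},
    "B": {"snr": [3, 7, 2, 6], "bo": 4000},
}
alloc = allocate_subcarriers(users, 4)
```

Even in this toy run, user A still receives the subcarrier on which its channel is strongest, while the backlogged user B takes the rest; this is the throughput/fairness trade the abstract describes.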
Procedia PDF Downloads 300
16771 Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models
Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig
Abstract:
This study applied traffic speed and occupancy data to develop clustering models that identify different traffic conditions. These models are based on the Dirichlet Process Mixture of Generalized Linear regression (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data aggregated at a 15-minute interval from the Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified: free-flow, transitional, and congested. Results of the DML revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states; its influence on the free-flow and congested states was estimated to be higher than on the transitional condition in both the evening and morning peak periods. Estimation of the critical speed thresholds using CR revealed that 47 mph and 48 mph are the speed thresholds for the congested and transitional traffic conditions during the morning and evening peak hours, respectively. Free-flow speed thresholds for the morning and evening peak hours were estimated at 64 mph and 66 mph, respectively. The proposed approaches will facilitate accurate detection and prediction of traffic congestion and the development of effective countermeasures.
Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection
Procedia PDF Downloads 294
16770 Numerical Investigation on Load Bearing Capacity of Pervious Concrete Piles as an Alternative to Granular Columns
Authors: Ashkan Shafee, Masoud Ghodrati, Ahmad Fahimifar
Abstract:
Pervious concrete combines considerable permeability with adequate strength, which makes it very beneficial in pavement construction and also in ground improvement projects. In this paper, a single pervious concrete pile subjected to vertical and lateral loading is analysed using a verified three-dimensional finite element code. A parametric study was carried out in order to investigate the load bearing capacity of a single unreinforced pervious concrete pile in saturated soft soil and to gain insight into the failure mechanism of this rather new soil improvement technique. The results show that the concrete damaged plasticity constitutive model can accurately simulate the highly brittle nature of pervious concrete and, considering the computed vertical and horizontal load bearing capacities, some suggestions are made for ground improvement projects.
Keywords: concrete damaged plasticity, ground improvement, load-bearing capacity, pervious concrete pile
Procedia PDF Downloads 229
16769 Experimental Investigation for Reducing Emissions in Maritime Industry
Authors: Mahmoud Ashraf Farouk
Abstract:
Shipping is the most important mode of transportation in global logistics; at present, more than two-thirds of total worldwide trade volume is carried by ship. Ships used for marine transportation are fitted with large-power diesel engines whose exhaust contains nitrogen oxides (NOx), sulfur oxides (SOx), carbon dioxide (CO₂), particulate matter (PM10), hydrocarbons (HC) and carbon monoxide (CO), the most dangerous contaminants found in exhaust gas from ships. Ships emitting large amounts of exhaust gas have become a significant cause of air pollution in coastal areas, harbors and oceans. Therefore, the International Maritime Organization (IMO) has established rules to reduce these emissions. This experiment measures the exhaust gases emitted from the main engine of the ship Aida IV running on marine diesel oil (MDO). The measurement is taken with a Sensonic2000 device at 85% load, which is the main sailing load. Moreover, the paper studies different emission reduction technologies: an alternative fuel, liquefied natural gas (LNG), applied to the system, and a reduction technology, selective catalytic reduction, added to the marine diesel oil system (MDO+SCR). The experiment calculated the amounts of NOx, SOx, CO₂, PM10, HC and CO because they have the greatest effect on the environment. The reduction technologies are applied to the same ship engine at the same load. Finally, the study found that MDO+SCR is the more efficient technology for Aida IV as a training and supply ship, owing to its low consumption and the fact that no engine modification is needed: the SCR system is simply added to the exhaust line, which is easy and cheap, and the differences in emissions between the two options are not large.
Keywords: marine, emissions, reduction, shipping
Procedia PDF Downloads 76
16768 Frequency Analysis Using Multiple Parameter Probability Distributions for Rainfall to Determine Suitable Probability Distribution in Pakistan
Authors: Tasir Khan, Yejuan Wang
Abstract:
The study of extreme rainfall events is very important for flood management in river basins and the design of water conservancy infrastructure. Evaluation of the quantiles of annual maximum rainfall (AMRF) is required in different environmental fields, agricultural operations, renewable energy sources, climatology, and the design of different structures. Therefore, an AMRF analysis was performed at different stations in Pakistan. Multiple probability distributions, namely log-normal (LN), generalized extreme value (GEV), Gumbel (max), and Pearson type 3 (P3), were used to find the most appropriate distribution at each station. The L-moments method was used to estimate the distribution parameters. The Anderson-Darling, Kolmogorov-Smirnov, and chi-square tests showed that two distributions, namely Gumbel (max) and LN, were the most appropriate. The quantile estimate of a multi-parameter probability distribution characterizes extreme rainfall at a specific location and is therefore important for decision-makers and planners who design and construct different structures. This result provides an indication of the consequences of these multi-parameter distributions for site studies, peak flow prediction, and the design of hydrological maps. Therefore, this finding can support hydraulic structure design and flood management.
Keywords: RAMSE, multiple frequency analysis, annual maximum rainfall, L-moments
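For the Gumbel (max) case named above, the L-moments fit has a closed form: the second L-moment gives the scale via λ₂ = β ln 2, and the first gives the location via λ₁ = μ + γβ (γ the Euler constant). A minimal sketch, with a synthetic annual-maximum sample standing in for station data:

```python
import math
import random

# Sample L-moments l1, l2 via probability-weighted moments (ascending order stats)
def sample_l_moments(data):
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((j - 1) / (n - 1) * x[j - 1] for j in range(2, n + 1)) / n
    return b0, 2 * b1 - b0          # l1 (location-like), l2 (scale-like)

# Gumbel (max): l1 = mu + gamma*beta, l2 = beta*ln(2)
def fit_gumbel_lmom(data):
    l1, l2 = sample_l_moments(data)
    beta = l2 / math.log(2)
    mu = l1 - 0.5772156649015329 * beta   # Euler-Mascheroni constant
    return mu, beta

# Synthetic AMRF sample drawn from a known Gumbel(mu=50, beta=12) via inverse CDF
random.seed(0)
mu_true, beta_true = 50.0, 12.0
sample = [mu_true - beta_true * math.log(-math.log(random.random()))
          for _ in range(20000)]
mu_hat, beta_hat = fit_gumbel_lmom(sample)
```

With the parameters in hand, a T-year quantile follows from the Gumbel inverse CDF, x(T) = μ − β ln(−ln(1 − 1/T)).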
Procedia PDF Downloads 81
16767 Improving Cryptographically Generated Address Algorithm in IPv6 Secure Neighbor Discovery Protocol through Trust Management
Authors: M. Moslehpour, S. Khorsandi
Abstract:
As the transition to widespread use of IPv6 addresses has gained momentum, IPv6 has been shown to be vulnerable to certain security attacks, such as those targeting the Neighbor Discovery Protocol (NDP), which provides the address resolution functionality in IPv6. To protect this protocol, Secure Neighbor Discovery (SEND) was introduced. This protocol uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography as a defense against threats to the integrity and identity of NDP. Although SEND protects NDP against attacks, it is computationally intensive due to the Hash2 condition in CGA. To improve CGA computation speed, we parallelized the CGA generation process and used the available resources in a trusted network. Furthermore, we focused on the influence of malicious nodes on the overall load of non-malicious ones in the network. According to the evaluation results, malicious nodes have an adverse impact on the average CGA generation time and on the average number of tries. We utilized a trust management system capable of detecting and isolating malicious nodes, to remove possible incentives for malicious behavior. We have demonstrated the effectiveness of the trust management system in detecting malicious nodes and hence improving overall system performance.
Keywords: CGA, ICMPv6, IPv6, malicious node, modifier, NDP, overall load, SEND, trust management
Procedia PDF Downloads 184
16766 Unsteady Numerical Analysis of Sediment Erosion Affected High Head Francis Turbine
Authors: Saroj Gautam, Ram Lama, Hari Prasad Neopane, Sailesh Chitrakar, Biraj Singh Thapa, Baoshan Zhu
Abstract:
Sediment carried along with the water in the rivers of South Asia erodes turbine components. The erosion of turbine components is influenced by the nature of the fluid flow around the components of typical turbine types. This paper numerically examines two high head Francis turbine cases with the same speed number. The numerical investigation involves both steady-state and transient analysis of the numerical models developed for both cases. Furthermore, the influence of leakage flow from the clearance gap of the guide vanes is examined and compared with the no-leakage case. In both cases, the leakage flow adds pressure pulsation to the rotor-stator interaction in the turbine runner. Leakage flow was also found to be a major contributor to the sediment erosion in these turbines.
Keywords: sediment erosion, Francis turbine, leakage flow, rotor stator interaction
Procedia PDF Downloads 185
16765 A Study of Secondary Particle Production from Carbon Ion Beam for Radiotherapy
Authors: Shaikah Alsubayae, Gianluigi Casse, Carlos Chavez, Jon Taylor, Alan Taylor, Mohammad Alsulimane
Abstract:
Achieving precise radiotherapy through carbon therapy necessitates accurate monitoring of the radiation dose distribution within the patient's body. This process is pivotal for targeted tumor treatment, minimizing harm to healthy tissues, and enhancing overall treatment effectiveness while reducing the risk of side effects. In our investigation, we adopted a methodological approach to monitor secondary proton doses in carbon therapy using Monte Carlo (MC) simulations. Initially, Geant4 simulations were employed to extract the initial positions of secondary particles generated during interactions between carbon ions and water, including protons, gamma rays, alpha particles, neutrons, and tritons. Subsequently, we explored the relationship between the carbon ion beam and these secondary particles. Interaction vertex imaging (IVI) proves valuable for monitoring dose distribution during carbon therapy, providing information about secondary particle locations and abundances, particularly for protons. The IVI method relies on charged particles produced during ion fragmentation to recover range information by reconstructing particle trajectories back to their point of origin, known as the vertex. In the context of carbon ion therapy, our simulation results indicated a strong correlation between some secondary particles and the range of the carbon ions. However, challenges arose due to the elongated geometry of the target, which hinders the straightforward transmission of forward-generated protons; the limited number of protons that did emerge predominantly originated from points close to the target entrance. Fragment (proton) trajectories were approximated as straight lines, and a beam back-projection algorithm, utilizing interaction positions recorded in Si detectors, was developed to reconstruct the vertices. The analysis revealed a correlation between the reconstructed and actual positions.
Keywords: radiotherapy, carbon therapy, monitor secondary proton doses, interaction vertex imaging
Procedia PDF Downloads 78
16764 Image Compression on Region of Interest Based on SPIHT Algorithm
Authors: Sudeepti Dayal, Neelesh Gupta
Abstract:
Image compression is used to reduce the size of a file without degrading the quality of the image to an objectionable level. The reduction in file size allows more images to be stored in a given amount of storage space and also reduces the time necessary for images to be transferred. The storage of medical images is a heavily researched area at present. To store a medical image, it is divided into two kinds of region: regions of interest and non-regions of interest. The best way to store an image is to compress it in such a way that no important information is lost. Compression can be done in two ways, namely lossy and lossless compression, and several compression algorithms exist for each. In this paper, two algorithms are used: the discrete cosine transform, applied to the non-regions of interest (lossy), and the discrete wavelet transform, applied to the regions of interest (lossless). The paper introduces the SPIHT (set partitioning in hierarchical trees) algorithm, which is applied to the wavelet transform to obtain a good compression ratio so that an image can be stored efficiently.
Keywords: compression ratio, DWT, SPIHT, DCT
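The ROI split described above can be illustrated with a toy coder: the ROI is compressed losslessly, while the background is coarsely quantized first (lossy) and then compressed. Quantization plus deflate stand in here for the DWT/SPIHT and DCT coders of the paper; only the lossless-ROI versus lossy-background split, and its effect on compression ratio, is the point.

```python
import random
import zlib

# ROI bytes are deflated as-is (lossless, exactly recoverable); background
# bytes are quantized to quant_step levels before deflation (lossy).
def compress_with_roi(pixels, roi_mask, quant_step=32):
    roi = bytes(p for p, m in zip(pixels, roi_mask) if m)
    bg = bytes((p // quant_step) * quant_step for p, m in zip(pixels, roi_mask) if not m)
    return zlib.compress(roi, 9), zlib.compress(bg, 9)

random.seed(1)
pixels = [random.randrange(256) for _ in range(4096)]   # toy 64x64 8-bit image
roi_mask = [i < 1024 for i in range(4096)]              # first quarter is the ROI
roi_c, bg_c = compress_with_roi(pixels, roi_mask)
ratio_roi = 1024 / len(roi_c)   # near 1: random ROI bytes barely compress
ratio_bg = 3072 / len(bg_c)     # well above 1: quantized background compresses
```

Decompressing `roi_c` returns the ROI bytes bit-exactly, which is the clinical requirement the abstract states for regions of interest.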
Procedia PDF Downloads 349
16763 Material Flow Modeling in Friction Stir Welding of AA6061-T6 Alloy and Study of the Effect of Process Parameters
Authors: B. SahaRoy, T. Medhi, S. C. Saha
Abstract:
To understand the friction stir welding process, it is very important to know the nature of the material flow in and around the tool. The process combines thermal and mechanical work, i.e., it is a coupled thermo-mechanical process. Numerical simulations are essential in order to obtain complete knowledge of the process as well as the physics underlying it. In the present work, a model-based approach is adopted to study the material flow. A thermo-mechanically based CFD model is developed using a finite element package, Comsol Multiphysics, and a fluid flow analysis is performed. The model simultaneously predicts the shear strain fields, shear strain rates and shear stress over the entire workpiece for the given conditions. The flow fields shown by the streamline plot give an idea of the material flow. The variation of the dynamic viscosity, velocity field and shear strain fields with various welding parameters is studied. Finally, the results obtained for the above-mentioned conditions are discussed elaborately and conclusions drawn.
Keywords: AA6061-T6, CFD modelling, friction stir welding, material flow
Procedia PDF Downloads 521
16762 The Practice of Low Flow Anesthesia to Reduce Carbon Footprints Sustainability Project
Authors: Ahmed Eid, Amita Gupta
Abstract:
Background: Medical gases are estimated to contribute 5% of the carbon footprint produced by hospitals. Desflurane has the largest impact, but the impact of all agents increases significantly when they are used with an N₂O admixture. Under the Climate Change Act 2008, we must reduce our carbon emissions by 80% of the 1990 baseline by 2050; NHS carbon emissions were reduced by 18.5% between 2007 and 2017. The NHS Long Term Plan has outlined measures to achieve this objective, including a 2% reduction by transforming anaesthetic practices. Fresh gas flow (FGF) is an important variable that determines the utilization of inhalational agents and can be tightly controlled by the anaesthetist. Aims and objectives: environmental safety, and identification of areas of high N₂O and anaesthetic agent use across the St Helier operating theatres, with a view to improving current practice. Methods: Data were collected from the St Helier operating theatres and retrieved daily from Care Station 650 anaesthetic machines; 60 cases were included in the sample. The data collected comprised the average flow rate, the amount and type of agent used, the type and duration of surgery, and the total amounts of air, O₂ and N₂O used. The AAGBI impact anaesthesia calculator was used to determine the amount of CO₂ produced and the cost per hour for every patient. Reminder emails to staff emphasized the significance of low-flow anaesthesia, departmental meeting presentations aimed to heighten awareness of LFA, and distribution of AAGBI calculator QR codes in all theatres enabled calculation of volatile anaesthetic consumption and CO₂e after each case, facilitating informed environmental impact assessment. Results: A significant reduction in the flow rates used was observed in the second sample; flow rates between 0 and 1 L were used in 60% of cases, representing a great reduction in the consumption of volatile anaesthetics and in CO₂e. By using LFA we can save money but, most importantly, we can make our practice much greener and help save the planet.
Keywords: low flow anesthesia, sustainability project, N₂O, CO₂e
Procedia PDF Downloads 68
16761 A Mixed Method Approach for Modeling Entry Capacity at Rotary Intersections
Authors: Antonio Pratelli, Lorenzo Brocchini, Reginald Roy Souleyrette
Abstract:
A rotary is a traffic circle intersection where vehicles entering from the branches give priority to the circulating flow. Vehicles entering the intersection from converging roads move around the central island and weave out of the circle into their desired exiting branch. This creates merging and diverging conflicts between any entry and its successive exit, i.e., a section. Therefore, rotary capacity models are usually based on the weaving of the different movements in any section of the circle, and the maximum rate of flow is then related to each weaving section of the rotary. Nevertheless, the single-section capacity value does not lead to the typical performance characteristics of the intersection, such as the average entry delay, which is directly linked to its level of service. From another point of view, modern roundabout capacity models are based on limiting the flow entering from a single entrance by the amount of flow circulating in front of the entrance itself, and they generally also lead to a performance evaluation. This paper aims to incorporate a modern roundabout capacity model into an old rotary capacity method in order to obtain from the latter the single-entry capacity and, ultimately, the related performance indicators. Put simply, the main objective is to calculate the average delay at each roundabout entrance so as to apply the most common Highway Capacity Manual (HCM) criteria. The paper is organized as follows: first, the rotary and roundabout capacity models are sketched and a brief introduction is given to the model combination technique, with some practical instances. The following section summarizes the old TRRL rotary capacity model and the most recent HCM 7th edition modern roundabout capacity model. Then, the two models are combined through an iteration-based algorithm, specially set up and linked to the concept of roundabout total capacity, i.e., the value reached under a traffic flow pattern leading to the simultaneous congestion of all roundabout entrances. The solution is the average delay at each entrance of the rotary, from which its respective level of service is estimated. In view of further experimental applications, at this research stage a collection of existing rotary intersections operating with the priority-to-circle rule has already been started, both in the US and in Italy. The rotaries have been selected by direct inspection of aerial photos through a map viewer, namely Google Earth, and each instance has been recorded by location, urban or rural setting, and its main geometrical patterns. Finally, concluding remarks are drawn and a discussion of some further research developments is opened.
Keywords: mixed methods, old rotary and modern roundabout capacity models, total capacity algorithm, level of service estimation
Procedia PDF Downloads 86
16760 Forward Stable Computation of Roots of Real Polynomials with Only Real Distinct Roots
Authors: Nevena Jakovčević Stor, Ivan Slapničar
Abstract:
Any polynomial can be expressed as the characteristic polynomial of a complex symmetric arrowhead matrix. This expression is not unique. If the polynomial is real with only real distinct roots, the matrix can be chosen to be real. By using an accurate, forward stable algorithm for computing the eigenvalues of real symmetric arrowhead matrices, we derive a forward stable algorithm for computing the roots of such polynomials in O(n²) operations. The algorithm computes each root to almost full accuracy. In some cases, the algorithm invokes extended precision routines, but only in the non-iterative part. Our examples include numerically difficult problems, like the well-known Wilkinson polynomials. Our algorithm compares favorably to other methods for polynomial root-finding, like MPSolve or Newton's method.
Keywords: roots of polynomials, eigenvalue decomposition, arrowhead matrix, high relative accuracy
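The representation underlying the method, a real polynomial with distinct real roots written as the characteristic polynomial of a real symmetric arrowhead matrix, can be verified with a short construction: pick poles that strictly interlace the target roots, then the arrow entries and tip follow in closed form. The specific roots and poles below are illustrative, not taken from the paper.

```python
import math

# Arrowhead matrix A = [[diag(d), z], [z^T, alpha]] with prescribed char. poly:
#   det(A - x I) = (alpha - x) * prod(d_i - x) - sum_i z_i^2 * prod_{j!=i}(d_j - x)
# Given roots r_1 < ... < r_n and interlacing poles d_i, setting x = d_i yields
# z_i^2 in closed form, and matching the trace fixes alpha.
def arrowhead_from_roots(roots, poles):
    z = []
    for i, d in enumerate(poles):
        num = -math.prod(r - d for r in roots)
        den = math.prod(dj - d for j, dj in enumerate(poles) if j != i)
        z.append(math.sqrt(num / den))      # interlacing keeps num/den > 0
    alpha = sum(roots) - sum(poles)         # trace(A) = sum of eigenvalues
    return z, alpha

def char_poly(lam, poles, z, alpha):
    head = (alpha - lam) * math.prod(d - lam for d in poles)
    tail = sum(zi**2 * math.prod(dj - lam for j, dj in enumerate(poles) if j != i)
               for i, zi in enumerate(z))
    return head - tail

roots = [1.0, 2.0, 3.0, 4.0]
poles = [1.5, 2.5, 3.5]                     # strictly interlace the roots
z, alpha = arrowhead_from_roots(roots, poles)
residuals = [abs(char_poly(r, poles, z, alpha)) for r in roots]   # all ~0
```

The paper's contribution is the converse direction: given the polynomial, compute the arrowhead eigenvalues (its roots) forward stably; the construction above only demonstrates that the arrowhead representation exists and is real.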
Procedia PDF Downloads 418
16759 Polymer Mixing in the Cavity Transfer Mixer
Authors: Giovanna Grosso, Martien A. Hulsen, Arash Sarhangi Fard, Andrew Overend, Patrick. D. Anderson
Abstract:
In many industrial applications, and in particular in the polymer industry, the quality of mixing between different materials is fundamental to guarantee the desired properties of the finished products. However, properly modelling and understanding polymer mixing often presents noticeable difficulties because of the variety and complexity of the physical phenomena involved. This is the case for the Cavity Transfer Mixer (CTM), for which a clear understanding of the mixing mechanisms is still missing, as are clear guidelines for system optimization. This device, invented and patented by Gale at Rapra Technology Limited, is an add-on to be mounted downstream of existing extruders in order to improve distributive mixing. It consists of two concentric cylinders, the rotor and stator, both provided with staggered rows of hemispherical cavities. The inner cylinder (rotor) rotates while the outer (stator) remains still; at the same time, the pressure load imposed upstream pushes the fluid through the CTM. Mixing processes are driven by the flow field generated by the complex interaction between the moving geometry, the imposed pressure load and the rheology of the fluid. In such a context, the present work proposes a complete and accurate three-dimensional model of the CTM and presents the results of a broad range of simulations assessing the impact on mixing of several geometrical and operating parameters, among them the number of cavities per row, the number of rows, the size of the mixer, the rheology of the fluid and the ratio between the rotation speed and the fluid throughput. The model is composed of a flow part and a mixing part: a finite element solver computes the transient velocity field, which is used in the mapping method implementation in order to simulate the evolution of the concentration field. The results of the simulations are summarized in guidelines for the optimization of the device.
Keywords: mixing, non-Newtonian fluids, polymers, rheology
Procedia PDF Downloads 379
16758 A Hybrid Pareto-Based Swarm Optimization Algorithm for the Multi-Objective Flexible Job Shop Scheduling Problems
Authors: Aydin Teymourifar, Gurkan Ozturk
Abstract:
In this paper, a new hybrid particle swarm optimization algorithm is proposed for the multi-objective flexible job shop scheduling problem, which is an important and hard combinatorial problem. The Pareto approach is used for solving the multi-objective problem. Several new local search heuristics, based on the critical block concept, are integrated into the algorithm to enhance its performance. The algorithm is compared with recently published multi-objective algorithms on benchmarks selected from the literature. Several metrics are used for quantifying the performance and comparing the achieved solutions. The algorithms are also compared under the weighted summation of objectives approach. The proposed algorithm finds Pareto solutions more efficiently than the compared algorithms, in less computational time.
Keywords: swarm-based optimization, local search, Pareto optimality, flexible job shop scheduling, multi-objective optimization
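The Pareto approach named in this abstract rests on a dominance comparison between objective vectors. A minimal sketch of minimization dominance and non-dominated filtering (illustrative only, not the authors' hybrid PSO; the two-objective example data is made up):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated objective vectors."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# Hypothetical two-objective values, e.g. (makespan, total workload):
points = [(5, 9), (6, 7), (7, 8), (4, 12), (6, 6)]
front = pareto_front(points)
```

In a Pareto-based swarm algorithm, a filter like this maintains the external archive of non-dominated schedules that the comparison metrics are computed on.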
Procedia PDF Downloads 369
16757 Development of an Efficient Algorithm for Cessna Citation X Speed Optimization in Cruise
Authors: Georges Ghazi, Marc-Henry Devillers, Ruxandra M. Botez
Abstract:
Aircraft flight trajectory optimization has been identified as a promising solution for reducing both airline costs and the aviation net carbon footprint. Nowadays, this role is mainly attributed to the flight management system, an onboard multi-purpose computer responsible for providing the crew with the optimized flight plan from one destination to the next. To accomplish this function, the flight management system uses a variety of look-up tables to compute the optimal speed and altitude for each flight regime instantly. Because the cruise is the longest segment of a typical flight, the proposed algorithm focuses on minimizing fuel consumption for this flight phase. In this paper, a complete methodology to estimate the aircraft performance and subsequently compute the optimal speed in cruise is presented. Results showed that the obtained performance database was accurate enough to predict the flight costs associated with the cruise phase.
Keywords: Cessna Citation X, cruise speed optimization, flight cost, cost index, golden section search
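The golden section search listed in the keywords is a standard derivative-free method for minimizing a unimodal cost over a speed interval. A hedged sketch, with a made-up quadratic-plus-inverse cost standing in for the paper's performance database (the constants and speed bounds are placeholders, not Citation X data):

```python
import math

INV_PHI = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618

def golden_section_min(f, lo, hi, tol=1e-5):
    """Minimize a unimodal function f on [lo, hi] by golden section search."""
    a, b = lo, hi
    c = b - INV_PHI * (b - a)
    d = a + INV_PHI * (b - a)
    while b - a > tol:
        if f(c) < f(d):       # minimum lies in [a, d]
            b, d = d, c
            c = b - INV_PHI * (b - a)
        else:                 # minimum lies in [c, b]
            a, c = c, d
            d = a + INV_PHI * (b - a)
    return (a + b) / 2

# Hypothetical cruise-cost trade-off: fuel burn rises with speed,
# time-related cost falls with speed (cost-index style).
cost = lambda v: 0.01 * v**2 + 500000.0 / v
v_opt = golden_section_min(cost, 200.0, 500.0)
```

The bracketing interval shrinks by the golden ratio each iteration, so the search needs only one new cost evaluation per step, which suits look-up-table-driven cost models.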
Procedia PDF Downloads 292
16756 Customer Satisfaction on Reliability Dimension of Service Quality in Indian Higher Education
Authors: Rajasekhar Mamilla, G. Janardhana, G. Anjan Babu
Abstract:
The present research analyses students' satisfaction with university performance regarding the reliability dimension, i.e., the ability of professors and staff to perform the promised services with quality to students, in the post-graduate courses offered by Sri Venkateswara University in India. The research proceeds from the notion that the student compares the perceived performance with prior expectations; customer satisfaction is seen as the outcome of this comparison. The sample respondents were selected using the stratified random sampling technique and administered the survey schedule. Statistical techniques such as factor analysis, the t-test and correlation analysis were used to accomplish the respective objectives of the study.
Keywords: satisfaction, reliability, service quality, customer
Procedia PDF Downloads 549
16755 Model for Introducing Products to New Customers through Decision Tree Using Algorithm C4.5 (J-48)
Authors: Komol Phaisarn, Anuphan Suttimarn, Vitchanan Keawtong, Kittisak Thongyoun, Chaiyos Jamsawang
Abstract:
This article analyzes insurance information containing data on customers' decisions when purchasing a life insurance pay package. The data were analyzed in order to present new customers with the Life Insurance Perfect Pay package that meets their needs as closely as possible. The basic insurance pay package data were collected for data mining, thus reducing the scattering of information. The data were then classified to build a decision model, or decision tree, using algorithm C4.5 (J-48). For the classification, WEKA tools were used to form the model, and testing datasets were used to test the decision tree for decision accuracy. Validation of this model showed an accurate prediction rate of 68.43%, with 31.25% errors. The same set of data was then tested with other models, i.e. Naive Bayes and ZeroR. The results showed that the J-48 method predicted more accurately. The researchers therefore applied the decision tree in a program used to introduce the product to new customers, supporting customers' decision making in purchasing the insurance package that meets their needs as closely as possible.
Keywords: decision tree, data mining, customers, life insurance pay package
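J-48 is WEKA's implementation of C4.5, whose split criterion is the gain ratio (information gain normalized by split information). A minimal sketch of that criterion on hypothetical categorical customer attributes (illustrative toy data, not the paper's insurance dataset):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(rows, attr_idx, labels):
    """C4.5 split criterion for a categorical attribute:
    information gain divided by the split information."""
    n = len(rows)
    partitions = {}
    for row, y in zip(rows, labels):
        partitions.setdefault(row[attr_idx], []).append(y)
    gain = entropy(labels) - sum(len(p) / n * entropy(p) for p in partitions.values())
    split_info = -sum((len(p) / n) * math.log2(len(p) / n) for p in partitions.values())
    return gain / split_info if split_info > 0 else 0.0

# Hypothetical attributes (age group, income) vs. purchase decision:
rows = [("young", "low"), ("young", "high"), ("old", "low"), ("old", "high")]
labels = ["no", "yes", "no", "yes"]
# Here income separates the classes perfectly, age group not at all.
```

C4.5 greedily splits on the attribute with the highest gain ratio at each node, then prunes; the normalization penalizes attributes with many distinct values.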
Procedia PDF Downloads 428
16754 A Study of Traditional Mode in the Framework of Sustainable Urban Transportation
Authors: Juanita, B. Kombaitan, Iwan Pratoyo Kusumantoro
Abstract:
The traditional mode is a non-motorized vehicle powered by humans or animals. The objective of the study was to define a strategy for using traditional modes within the framework of sustainable urban transport in support of urban tourism activities. The study does not include modes modified to use engine power, such as the motorized tricycle often called 'bentor' in Indonesia. The use of the non-motorized traditional mode in Indonesia has begun to shift, and it is being displaced by engine-powered replacements. One effort to bring back the traditional mode is through tourism activities. Strategies for the use of traditional modes within the framework of sustainable urban transport are viewed along three dimensions: social, economic and environmental. The social dimension relates to accessibility and livability; the economic dimension to the traditional mode's ability to promote products and tourist attractions; and the environmental dimension to users' needs with respect to safety and comfort. The traditional mode is rarely noticed by policy makers, and public opinion on its use needs attention. The involvement of stakeholders and the community in policy making is needed to develop sustainable traditional-mode strategies in support of urban tourism activities.
Keywords: traditional mode, sustainable, urban, transportation
Procedia PDF Downloads 265
16753 Aerodynamic Sound from a Sawtooth Plate with Different Thickness
Authors: Siti Ruhliah Lizarose Samion, Mohamed Sukri Mat Ali
Abstract:
The effect of sawtooth plate thickness on the aerodynamic noise generated in a flow at a Reynolds number of 150 is numerically investigated. Two plate thicknesses (h_thick = 0.2D and h_thin = 0.02D) are considered. Flow simulations are carried out using direct numerical simulation, whereas the aerodynamic noise radiated from the flow is calculated using Curle's equation. It is found that the flow behavior of the thin sawtooth plate, consisting of counter-rotating vortices, is more complex than that of the thick plate. This explains well the sound generated in both plate cases. The sound generated by the thin plate is approximately 0.5 dB lower than that of the thick plate. The findings of the current study provide a better understanding of the flow and noise behavior of edge serrations through the case of a sawtooth plate.
Keywords: aerodynamic sound, bluff body, sawtooth plate, Curle analogy
Procedia PDF Downloads 436
16752 Heat Transfer Correlations for Exhaust Gas Flow
Authors: Fatih Kantas
Abstract:
Exhaust systems are key heat sources in ground vehicles. Understanding heat transfer in exhaust systems requires identifying the parameters that effectively govern it. In this study, over 20 Nusselt number correlations are investigated. The study shows the advantages and disadvantages of the various Nusselt numbers over different ranges of Re, Pr, and pulsating-flow amplitude and frequency. Convective augmentation factors (CAFs) are also defined to correct the standard Nusselt number for the geometry and location of the exhaust system. Finally, an optimum Nusselt number and convective augmentation factors are recommended according to Re, Pr, pulsating-flow amplitude and frequency, and the geometry and location of the exhaust system.
Keywords: exhaust gas flow, heat transfer correlation, Nusselt, Prandtl, pulsating flow
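As an illustration of the kind of correlation surveyed here, the classic Dittus-Boelter Nusselt correlation with an assumed CAF multiplier. The CAF value below is a placeholder, not one of the paper's recommended factors:

```python
def nusselt_dittus_boelter(re, pr, heating=False):
    """Dittus-Boelter correlation for fully developed turbulent pipe flow,
    valid roughly for Re > 1e4 and 0.6 < Pr < 160.
    Exhaust gas is hotter than the pipe wall (the fluid is being cooled),
    so the Prandtl exponent is 0.3 rather than 0.4."""
    n = 0.4 if heating else 0.3
    return 0.023 * re**0.8 * pr**n

def corrected_nu(re, pr, caf):
    """Apply a convective augmentation factor (CAF) to the standard
    correlation; a real CAF would come from the geometry and location
    data discussed in the abstract."""
    return caf * nusselt_dittus_boelter(re, pr)

# Illustrative exhaust-gas conditions (placeholder values):
nu = corrected_nu(re=50_000, pr=0.7, caf=1.3)
```

A multiplicative CAF of this form keeps the base correlation's Re and Pr scaling while absorbing geometry and pulsation effects into one factor.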
Procedia PDF Downloads 355
16751 Effects of the Non-Newtonian Viscosity of Blood on Flow Field in a Constricted Artery with a Porous Plaque
Authors: Maedeh Shojaeizadeh, Amirreza Yeganegi
Abstract:
Nowadays, many people lose their lives to cardiovascular diseases. Inappropriate eating habits and lack of exercise accelerate the deposition of fatty substances on the inner surface of blood arteries. This abnormal lump disturbs the uniform blood flow and reduces oxygen delivery to active organs. This work presents a numerical simulation of non-Newtonian blood flow in a stenosed vessel. The vessel is modelled as a two-dimensional channel and the plaque region as a homogeneous porous medium. To simulate the blood flow around the stenosis, we solve the coupled Cauchy momentum, Darcy, continuity and energy equations with a C++ code. The results show that the power-law index (n) plays an important role in flow separation and in the size of the eddy at the downstream edge of the plaque. It is also observed that with increasing n, the temperature discontinuity and the likelihood of vessel rupture decline.
Keywords: blood flow, computational fluid dynamics, porosity, power-law fluid
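The power-law index n enters through the Ostwald-de Waele apparent-viscosity model for non-Newtonian fluids. A minimal sketch with illustrative shear-thinning constants (placeholder values, not the paper's fitted blood parameters):

```python
def apparent_viscosity(shear_rate, k=0.035, n=0.6):
    """Power-law (Ostwald-de Waele) model: mu_app = K * gamma_dot**(n - 1).
    n < 1 gives shear-thinning behaviour, as is typical for blood;
    k and n here are illustrative, not fitted values."""
    return k * shear_rate ** (n - 1)

# Shear-thinning: apparent viscosity drops as the shear rate rises,
# which is why n influences separation downstream of the plaque.
mu_low_shear = apparent_viscosity(10.0)
mu_high_shear = apparent_viscosity(1000.0)
```

For n = 1 the model reduces to a Newtonian fluid with constant viscosity K, which is the usual baseline the non-Newtonian results are compared against.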
Procedia PDF Downloads 459
16750 On the Optimality Assessment of Nano-Particle Size Spectrometry and Its Association to the Entropy Concept
Authors: A. Shaygani, R. Saifi, M. S. Saidi, M. Sani
Abstract:
The particle size distribution, the most important characteristic of aerosols, is obtained through electrical characterization techniques. The dynamics of charged nano-particles under the influence of the electric field in an electrical mobility spectrometer (EMS) reveals the size distribution of these particles. The accuracy of this measurement is influenced by the flow conditions, geometry, electric field and particle charging process, and therefore by the transfer function (transfer matrix) of the instrument. In this work, a wire-cylinder corona charger was designed, and the combined field-diffusion charging of injected poly-disperse aerosol particles was numerically simulated as a prerequisite for the study of a multi-channel EMS. The result, a cloud of particles with a non-uniform charge distribution, was introduced into the EMS. The flow pattern and electric field in the EMS were simulated using computational fluid dynamics (CFD) to obtain the particle trajectories in the device and thus to calculate the signal reported by each electrometer. Based on the output signals (resulting from the bombardment of particles and the transfer of their charges as currents), we propose a modification to the size of the detecting rings (which are connected to the electrometers) in order to evaluate particle size distributions more accurately. Based on the system's capability to transfer information about the size distribution of the injected particles, we propose a benchmark for assessing the optimality of the design. This method applies the concept of von Neumann entropy and borrows the definition of entropy from information theory (Shannon entropy) to measure optimality. Entropy, in Shannon's sense, is the 'average amount of information contained in an event, sample or character extracted from a data stream'.
Evaluating the responses (signals) obtained with various configurations of detecting rings, the configuration that gave the best predictions of the size distributions of the injected particles was the modified one. It was also the one with the maximum amount of entropy, and a reasonable consistency was observed between the accuracy of the predictions and the entropy content of each configuration. In this method, the entropy is extracted from the transfer matrix of the instrument for each configuration. Finally, various clouds of particles were introduced into the simulations, and the predicted size distributions were compared with the exact ones.
Keywords: aerosol nano-particle, CFD, electrical mobility spectrometer, von Neumann entropy
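The Shannon-entropy benchmark can be illustrated on a normalized response vector: a flat signal spreads information across all channels (maximum entropy), while a peaked signal concentrates it in one. A minimal sketch; the paper itself extracts entropy from the instrument's transfer matrix, which is not reproduced here:

```python
import math

def shannon_entropy(signal):
    """Shannon entropy (in bits) of a non-negative signal vector,
    normalized to a probability distribution."""
    total = sum(signal)
    probs = [s / total for s in signal if s > 0]
    return -sum(p * math.log2(p) for p in probs)

# Flat response over four channels vs. a sharply peaked one:
flat = shannon_entropy([1, 1, 1, 1])    # maximal: log2(4) = 2 bits
peaked = shannon_entropy([97, 1, 1, 1])
```

Under this benchmark, a detecting-ring configuration whose response carries more entropy transfers more information about the injected size distribution, which is the consistency the abstract reports.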
Procedia PDF Downloads 343
16749 An Improved Method to Compute Sparse Graphs for Traveling Salesman Problem
Authors: Y. Wang
Abstract:
The traveling salesman problem (TSP) is NP-hard in combinatorial optimization. Research shows that algorithms for the TSP on sparse graphs have shorter computation times than those working on complete graphs. We present an improved iterative algorithm that computes sparse graphs for the TSP from frequency graphs computed with frequency quadrilaterals. The iterative algorithm is enhanced by adjusting two of its parameters. The computation time of the algorithm is O(C·Nmax·n²), where C is the number of iterations, Nmax is the maximum number of frequency quadrilaterals containing each edge and n is the scale of the TSP. The experimental results showed that the computed sparse graphs generally have fewer than 5n edges for most of these Euclidean instances. Moreover, the maximum and minimum vertex degrees in the sparse graphs do not differ much. Thus, the computation time of methods that solve the TSP on these sparse graphs will be greatly reduced.
Keywords: frequency quadrilateral, iterative algorithm, sparse graph, traveling salesman problem
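For comparison, a common baseline for building a sparse TSP candidate set is k-nearest-neighbour pruning. The sketch below uses that simpler rule, not the frequency-quadrilateral method of the abstract, but it yields an edge set of at most k·n edges, in the same spirit as the reported graphs with fewer than 5n edges (the point set is made up):

```python
import math

def sparse_candidate_edges(points, k=5):
    """Keep each city's k nearest neighbours as candidate TSP edges.
    This is a standard pruning baseline, not the frequency-quadrilateral
    method of the abstract; edges are stored as sorted index pairs so
    (i, j) and (j, i) are not duplicated."""
    edges = set()
    for i, (xi, yi) in enumerate(points):
        dists = sorted(
            (math.hypot(xi - xj, yi - yj), j)
            for j, (xj, yj) in enumerate(points) if j != i
        )
        for _, j in dists[:k]:
            edges.add((min(i, j), max(i, j)))
    return edges

# Hypothetical 20-city Euclidean instance:
pts = [(x * 1.0, (x * 7 % 5) * 1.0) for x in range(20)]
cand = sparse_candidate_edges(pts, k=5)
```

A TSP solver restricted to such a candidate set examines O(k·n) edges instead of the complete graph's n(n-1)/2, which is where the reported speed-up on sparse graphs comes from.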
Procedia PDF Downloads 233