Search results for: multipoint optimal minimum entropy deconvolution
4871 Optimal Site Selection for Temporary Housing regarding Disaster Management Case Study: Tehran Municipality (No.6)
Authors: Ghazaleh Monazami Tehrani, Zhamak Monazami Tehrani, Raziyeh Hadavand
Abstract:
Optimal site selection for temporary housing is one of the most important issues in crisis management. In this research, district six of Tehran, a city with a high frequency and wide geographical distribution of earthquakes, has been selected as a case study for positioning temporary housing after a probable earthquake. To achieve this goal, the study identifies and evaluates the distribution of candidate locations according to criteria such as compatible and incompatible urban land uses, using GIS and AHP. The results of this study show that the most susceptible parts of this district lie in its center. According to the maps, the north-eastern part of Kordestan and the Shaheed Gomnam intersection possess the highest pixel values in terms of areal extent; these places are therefore recommended as optimal sites for the construction of an emergency evacuation base.
Keywords: optimal site selection, temporary housing, crisis management, AHP, GIS
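A minimal sketch of the AHP weighting step used in such GIS-based multi-criteria site selection (the pairwise comparison matrix below is assumed for illustration, not taken from the study):

```python
import numpy as np

# Assumed pairwise comparison matrix for three siting criteria
# (e.g., land-use compatibility, distance to hazards, accessibility).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP priority weights: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio check (random index RI = 0.58 for a 3x3 matrix).
lam_max = eigvals.real[k]
ci = (lam_max - len(A)) / (len(A) - 1)
cr = ci / 0.58
print("weights:", weights.round(3), "CR:", round(cr, 3))
```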
Procedia PDF Downloads 258
4870 Conservativeness of Probabilistic Constrained Optimal Control Method for Unknown Probability Distribution
Authors: Tomoaki Hashimoto
Abstract:
In recent decades, probabilistic constrained optimal control problems have attracted much attention in many research fields. Although probabilistic constraints are generally intractable in an optimization problem, several tractable methods have been proposed to handle them. In most methods, probabilistic constraints are reduced to deterministic constraints that are tractable in an optimization problem. However, there is a gap between the transformed deterministic constraints in the cases of known and unknown probability distributions. This paper examines the conservativeness of the probabilistic constrained optimization method when the probability distribution is unknown. The objective of this paper is to provide a quantitative assessment of the conservatism of the tractable constraints in probabilistic constrained optimization with an unknown probability distribution.
Keywords: optimal control, stochastic systems, discrete time systems, probabilistic constraints
Procedia PDF Downloads 581
4869 Analytical Solutions of Time Space Fractional, Advection-Dispersion and Whitham-Broer-Kaup Equations
Authors: Muhammad Danish Khan, Imran Naeem, Mudassar Imran
Abstract:
In this article, we study the time-space Fractional Advection-Dispersion (FADE) equation and the time-space Fractional Whitham-Broer-Kaup (FWBK) equation, which have a significant role in hydrology. We introduce suitable transformations to convert the fractional-order derivatives to integer-order derivatives, and as a result these equations transform into Partial Differential Equations (PDEs). Then the Lie symmetries and corresponding optimal systems of the resulting PDEs are derived. The symmetry reductions and exact invariant solutions based on the optimal system are investigated, and these constitute the exact solutions of the original fractional differential equations.
Keywords: modified Riemann-Liouville fractional derivative, lie-symmetries, optimal system, invariant solutions
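For background, one commonly used form of the modified Riemann-Liouville fractional derivative (often attributed to Jumarie; stated here from general background, not quoted from the paper) is, for 0 < α < 1:

```latex
D_t^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)} \,\frac{d}{dt}
\int_0^{t} (t-\xi)^{-\alpha}\bigl(f(\xi)-f(0)\bigr)\, d\xi ,
\qquad 0 < \alpha < 1 .
```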
Procedia PDF Downloads 433
4868 Impact of Population Size on Symmetric Travelling Salesman Problem Efficiency
Authors: Wafa' Alsharafat, Suhila Farhan Abu-Owida
Abstract:
Genetic algorithm (GA) is a powerful evolutionary search technique that has been used successfully to solve and optimize problems in different research areas, and it is considered one of the optimization methods used to solve the Travelling Salesman Problem (TSP). The feasibility of GA in finding a TSP solution depends, in general, on the GA operators: the encoding method, population size, and termination criteria. In particular, crossover and its probability play a significant role in finding possible solutions for the Symmetric TSP (STSP). In addition, the crossover should be chosen and enhanced with a view to reaching an optimal, or at least near-optimal, solution. In this paper, we highlight the use of a modified crossover method called modified sequential constructive crossover and its impact on reaching an optimal solution. To justify the relevance of a parameter value in solving the TSP, a set of comparative analyses is conducted on different crossover methods and parameter values.
Keywords: genetic algorithm, crossover, mutation, TSP
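A minimal GA sketch for the symmetric TSP, using the standard ordered crossover as a stand-in for the modified sequential constructive crossover discussed above (city coordinates and GA parameters are assumed):

```python
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def ordered_crossover(p1, p2):
    # Copy a slice from parent 1, fill the remaining cities in parent-2 order.
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [c for c in p2 if c not in child]
    for i in range(len(p1)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def swap_mutation(tour, rate=0.05):
    tour = tour[:]
    if random.random() < rate:
        i, j = random.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]
    return tour

def ga_tsp(dist, pop_size=50, generations=200):
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: tour_length(t, dist))
        elite = pop[:pop_size // 2]                      # truncation selection
        children = [swap_mutation(ordered_crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=lambda t: tour_length(t, dist))

if __name__ == "__main__":
    random.seed(1)
    cities = [(random.random(), random.random()) for _ in range(15)]
    dist = [[((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 for (x2, y2) in cities]
            for (x1, y1) in cities]
    best = ga_tsp(dist)
    print("best tour length:", round(tour_length(best, dist), 3))
```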
Procedia PDF Downloads 229
4867 Structural Optimization Using Catenary and Other Natural Shapes
Authors: Mitchell Gohnert
Abstract:
This paper reviews some fundamental concepts of structural optimization focused on the shape of the structure. Bending stresses produce high peak stresses at each face of the member, and therefore substantially more material is required to resist bending. The shape of the structure has a profound effect on stress levels, and stress may be reduced dramatically by simply changing the shape to accommodate the natural stress flow. The main objective of structural optimization is to direct the thrust line along the axis of the member. Optimal shapes include the catenary arch or dome, triangular shapes, and columns. If the shape of the structure matches the natural flow of stress, the optimal shape has been found. Structures, however, must resist multiple load patterns. An optimal shape is still possible by ensuring that the thrust lines fall within the middle third of the member.
Keywords: optimization, natural structures, shells, catenary, domes, arches
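For reference, the catenary mentioned above is described by the standard relation (background, not quoted from the paper); a uniformly loaded arch whose axis follows the inverted catenary carries the thrust along its axis:

```latex
y(x) = a \cosh\!\left(\frac{x}{a}\right), \qquad a = \frac{H}{w},
```

where H is the horizontal component of the thrust and w the self-weight per unit length of the member.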
Procedia PDF Downloads 45
4866 Optimal Sizing and Placement of Distributed Generators for Profit Maximization Using Firefly Algorithm
Authors: Engy Adel Mohamed, Yasser Gamal-Eldin Hegazy
Abstract:
This paper presents a firefly-based algorithm for optimal sizing and allocation of distributed generators for profit maximization. The distributed generators in the proposed algorithm are of photovoltaic and combined heat and power technologies. Combined heat and power distributed generators are modeled as voltage-controlled nodes, while photovoltaic distributed generators are modeled as constant-power nodes. The proposed algorithm is implemented in the MATLAB environment and tested on the unbalanced IEEE 37-node feeder. The results show the effectiveness of the proposed algorithm in the optimal selection of distributed generator size and site in order to maximize the total system profit.
Keywords: distributed generators, firefly algorithm, IEEE 37-node feeder, profit maximization
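A minimal sketch of the standard firefly update rule on a generic objective (the paper's DG sizing/placement objective and power-flow model are not reproduced; all parameters below are assumed):

```python
import numpy as np

def firefly_minimize(f, bounds, n_fireflies=20, iters=100,
                     beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(n_fireflies, len(bounds)))
    intensity = np.array([f(xi) for xi in x])
    for _ in range(iters):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if intensity[j] < intensity[i]:        # firefly j is "brighter" (lower cost)
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(len(bounds)) - 0.5)
                    x[i] = np.clip(x[i], lo, hi)
                    intensity[i] = f(x[i])
    best = np.argmin(intensity)
    return x[best], intensity[best]

# Toy usage: two decision variables (e.g., a size and a placement index surrogate).
loss = lambda z: (z[0] - 0.6) ** 2 + (z[1] - 1.2) ** 2
print(firefly_minimize(loss, [(0, 1), (0, 2)]))
```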
Procedia PDF Downloads 443
4865 Modeling and Optimal Control of Hybrid Unmanned Aerial Vehicles with Wind Disturbance
Authors: Sunsoo Kim, Niladri Das, Raktim Bhattacharya
Abstract:
This paper addresses the modeling and control of a six-degree-of-freedom unmanned aerial vehicle capable of vertical take-off and landing in the presence of wind disturbances. We design a hybrid vehicle that combines the benefits of both fixed-wing and rotary-wing UAVs. A non-linear model for the hybrid vehicle is rapidly built, combining rigid-body dynamics, wing aerodynamics, and the dynamics of the motor and propeller. Further, we design an H₂ optimal controller to make the UAV robust to wind disturbances and compare its results against those of proportional-integral-derivative and linear-quadratic-regulator-based control. Our proposed controller results in better performance in terms of root mean squared errors and time responses during two scenarios: hover and level flight.
Keywords: hybrid UAVs, VTOL, aircraft modeling, H2 optimal control, wind disturbances
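A minimal sketch of the linear-quadratic-regulator baseline mentioned for comparison, on a generic double-integrator plant with assumed weights (the paper's UAV model and H₂ synthesis are not reproduced):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Assumed toy plant: double integrator (position, velocity) driven by a force input.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # state weights (assumed)
R = np.array([[0.1]])      # control weight (assumed)

# Solve the continuous-time algebraic Riccati equation and form the LQR gain u = -K x.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.inv(R) @ B.T @ P
print("LQR gain K =", K)
```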
Procedia PDF Downloads 158
4864 Vector Quantization Based on Vector Difference Scheme for Image Enhancement
Authors: Biji Jacob
Abstract:
The vector quantization algorithm, which uses a minimum-distance calculation for codebook generation, performs a time-consuming calculation on each pixel value, leading to high computational complexity. The codebook is updated by comparing the distance of each vector to its centroid vector as a measure of closeness. In this paper, vector quantization is modified based on a vector difference algorithm for image enhancement purposes. In the proposed scheme, the vector differences between the vectors are considered as the new-generation vectors, or new codebook vectors. The codebook is updated by comparing each new-generation vector against a threshold value, retaining the one with the minimum error with respect to the parent vector. The minimum error decides the fitness of each newly generated vector. Thus, the codebook is generated in an adaptive manner, and the fitness value is used to suppress the degraded portion of the image, thereby leading to enhancement of the image through the adaptive searching capability of vector quantization via the vector difference algorithm. Experimental results show that the vector difference scheme efficiently modifies the vector quantization algorithm for enhancing the image, with peak signal-to-noise ratio (PSNR), mean square error (MSE), and Euclidean distance (E_dist) as the performance parameters.
Keywords: codebook, image enhancement, vector difference, vector quantization
Procedia PDF Downloads 268
4863 A Review on Robot Trajectory Optimization and Process Validation through off-Line Programming in Virtual Environment Using Robcad
Authors: Ashwini Umale
Abstract:
Trajectory planning and optimization is a fundamental problem in articulated robotics. It is often viewed as a two-phase problem: initial feasible path planning around obstacles and subsequent optimization of a trajectory satisfying dynamical constraints. An optimized trajectory of a multi-axis robot is important and directly influences the performance of the executed task. Optimal is defined here as the minimum time to transition from the current speed to the set speed. Optimization of the trajectory in a virtual environment explores the most suitable way to transfer robot motion from the virtual environment to the real environment. This paper aims to review research on trajectory optimization in a virtual environment using the simulation software Robcad. Improvements are expected in trajectory optimization to generate smooth and collision-free trajectories while minimizing the overall robot cycle time.
Keywords: trajectory optimization, forward kinematics and reverse kinematics, dynamic constraints, robcad simulation software
Procedia PDF Downloads 505
4862 Improvement of Ride Comfort of Turning Electric Vehicle Using Optimal Speed Control
Authors: Yingyi Zhou, Tohru Kawabe
Abstract:
With the spread of EVs (Electric Vehicles), ride comfort has been gaining a lot of attention. The influence of lateral acceleration, as well as longitudinal acceleration, is important for improving the ride comfort of EVs, especially when the vehicle is turning. Therefore, this paper proposes a practical optimal speed control method to greatly improve ride comfort in the vehicle turning situation. To construct this method, an effective criterion that can appropriately evaluate the deterioration of ride comfort is derived. The method can reduce the influence of both longitudinal and lateral speed changes to provide a comfortable ride. Several simulation results show that the method can prevent aggravation of ride comfort by suppressing the influence of longitudinal speed change in the turning situation. Hence, the effectiveness of the method is confirmed.
Keywords: electric vehicle, speed control, ride comfort, optimal control theory, driving support system
Procedia PDF Downloads 215
4861 Microwave Imaging by Application of Information Theory Criteria in MUSIC Algorithm
Authors: Majid Pourahmadi
Abstract:
The performance of the time-reversal MUSIC algorithm degrades dramatically in the presence of strong noise and multiple scattering (i.e., when scatterers are close to each other). This is due to errors in determining the number of scatterers. The present paper provides a new approach to alleviate this problem using an information-theoretic criterion referred to as the minimum description length (MDL). The merits of the novel approach are confirmed by numerical examples. The results indicate that time-reversal MUSIC yields accurate estimates of the target locations even with considerable noise and multiple scattering in the received signals.
Keywords: microwave imaging, time reversal, MUSIC algorithm, minimum description length (MDL)
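A small sketch of the minimum description length criterion in its usual Wax-Kailath form, applied to the eigenvalues of a simulated array covariance matrix (illustrative background only; the paper's time-reversal MUSIC setup is not reproduced):

```python
import numpy as np

def mdl_num_sources(eigvals, n_snapshots):
    """Wax-Kailath MDL estimate of the number of sources/scatterers."""
    lam = np.sort(eigvals)[::-1]
    p = len(lam)
    costs = []
    for k in range(p):
        tail = lam[k:]
        geo = np.exp(np.mean(np.log(tail)))          # geometric mean of noise eigenvalues
        ari = np.mean(tail)                          # arithmetic mean of noise eigenvalues
        loglik = -n_snapshots * (p - k) * np.log(geo / ari)
        penalty = 0.5 * k * (2 * p - k) * np.log(n_snapshots)
        costs.append(loglik + penalty)
    return int(np.argmin(costs))

# Simulated check: 8-element array, 2 sources in unit noise, 200 snapshots.
rng = np.random.default_rng(0)
N, p, n_src = 200, 8, 2
S = rng.standard_normal((n_src, N)) * 3.0
A = rng.standard_normal((p, n_src))
X = A @ S + rng.standard_normal((p, N))
R = X @ X.T / N
print(mdl_num_sources(np.linalg.eigvalsh(R), N))     # expected output: 2
```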
Procedia PDF Downloads 339
4860 Hydro-Mechanical Behavior of a Tuff and Calcareous Sand Mixture for Use in Pavement in Arid Region
Authors: I. Goual, M. S. Goual, M. K. Gueddouda, Taïbi Saïd, Abou-Bekr Nabil, A. Ferhat
Abstract:
The aim of this paper is to study the hydro-mechanical behavior of a tuff and calcareous sand mixture. A first experimental phase was carried out in order to find the optimal mixture; it showed that the material composed of 80% tuff and 20% calcareous sand provides the maximum mechanical strength. The second experimental phase concerns the study of the drying-wetting behavior of the optimal mixture, carried out on slurry samples and on samples compacted at the MPO. The experimental results allow the deduction of the parameters necessary for predicting the moisture-related hydro-mechanical behavior of pavements formulated from tuff and calcareous sand mixtures. This optimal mixture satisfies the regulatory requirements and hence constitutes a good local eco-material, abundantly available, for the design of pavements.
Keywords: tuff, sandy calcareous, road engineering, hydro mechanical behaviour, suction
Procedia PDF Downloads 442
4859 Household Wealth and Portfolio Choice When Tail Events Are Salient
Authors: Carlson Murray, Ali Lazrak
Abstract:
Robust experimental evidence of systematic violations of expected utility (EU) establishes that individuals facing risk overweight utility from low-probability gains and losses when making choices. These findings motivated the development of models of preferences with probability weighting functions, such as rank dependent utility (RDU). We solve for the optimal investing strategy of an RDU investor in a dynamic binomial setting, from which we derive implications for investing behavior. We show that, relative to EU investors with constant relative risk aversion, commonly measured probability weighting functions produce optimal RDU terminal wealth with significant downside protection and upside exposure. We additionally find that, in contrast to EU investors, RDU investors optimally choose a portfolio that contains fair bets providing payoffs that can be interpreted as lottery outcomes or exposure to idiosyncratic returns. In a calibrated version of the model, we calculate that RDU investors would be willing to pay 5% of their initial wealth for the freedom to trade away from an optimal EU wealth allocation. The dynamic trading strategy that supports the optimal wealth allocation implies portfolio weights that are independent of initial wealth but requires a higher risky share after good stock return histories. Optimal trading also implies the possibility of non-participation when historical returns are poor. Our model fills a gap in the literature by providing new quantitative and qualitative predictions that can be tested experimentally or using data on household wealth and portfolio choice.
Keywords: behavioral finance, probability weighting, portfolio choice
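A small numerical sketch of rank dependent utility for a two-outcome gamble with an inverse-S probability weighting function of the Tversky-Kahneman form; the parameter values are assumed for illustration and are not the paper's calibration:

```python
import numpy as np

def tk_weight(p, gamma=0.65):
    # Tversky-Kahneman inverse-S probability weighting function.
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def crra_utility(w, rho=2.0):
    return np.log(w) if rho == 1.0 else (w**(1 - rho) - 1) / (1 - rho)

def rdu_binary(w_good, w_bad, p_good, gamma=0.65, rho=2.0):
    """RDU of a gamble paying w_good with probability p_good, else w_bad (w_good >= w_bad)."""
    pi_good = tk_weight(p_good, gamma)     # decision weight on the best-ranked outcome
    pi_bad = 1.0 - pi_good
    return pi_good * crra_utility(w_good, rho) + pi_bad * crra_utility(w_bad, rho)

# Overweighting of a low-probability gain: compare the decision weight and RDU vs EU at p = 0.05.
p = 0.05
print("decision weight:", round(tk_weight(p), 3), "vs objective p:", p)
print("RDU:", round(rdu_binary(2.0, 0.9, p), 4),
      "EU:", round(p * crra_utility(2.0) + (1 - p) * crra_utility(0.9), 4))
```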
Procedia PDF Downloads 420
4858 Use of Linear Programming for Optimal Production in a Production Line in Saudi Food Co.
Authors: Qasim M. Kriri
Abstract:
A few Saudi Arabian production companies face financial profit issues to this day. This work presents a linear integer programming model that solves a production problem of a Saudi food company in Saudi Arabia. An optimal solution to the above-mentioned problem is obtained by linear programming. In this regard, the main purpose of this project is to maximize profit. The linear programming technique has been used to derive the maximum profit from the production of natural juice at Saudi Food Co. The production operations of the company were formulated, and optimal results were found using Lindo software, which employed sensitivity analysis and parametric linear programming in developing the linear program. In addition, when the parameter values are increased, the value of the objective function increases.
Keywords: parameter linear programming, objective function, sensitivity analysis, optimize profit
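A minimal sketch of such a profit-maximizing linear program using SciPy (the products, profits, and resource limits below are assumed for illustration and are not the Saudi Food Co. data):

```python
from scipy.optimize import linprog

# Maximize profit 4*x1 + 3*x2 (currency units per unit of two juice products)
# subject to machine-time and raw-material constraints (all values assumed).
c = [-4.0, -3.0]                 # linprog minimizes, so profits are negated
A_ub = [[2.0, 1.0],              # machine hours per unit of each product
        [1.0, 2.0]]              # kg of fruit concentrate per unit
b_ub = [100.0, 80.0]             # available machine hours / kg of concentrate

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
print("production plan:", res.x, "maximum profit:", -res.fun)
```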
Procedia PDF Downloads 206
4857 Effect of Testing Device Calibration on Liquid Limit Assessment
Authors: M. O. Bayram, H. B. Gencdal, N. O. Fercan, B. Basbug
Abstract:
The liquid limit, which is used as a measure of soil strength, can be determined by the Casagrande and fall-cone testing methods. The two methods diverge from each other mainly in terms of operator dependency. The Casagrande method, applied according to the ASTM D4318-17 standard, may give misleading results, especially if the calibration process is not performed well. To reveal the effect of calibrating the drop height and the amount of soil paste placed in the Casagrande cup, a series of tests was carried out by the multipoint method as specified in the ASTM standard. The tests include combinations of 6 mm, 8 mm, 10 mm, and 12 mm drop heights with under-filled, half-filled, and fully filled Casagrande cups, using kaolinite samples. It was observed that during successive tests the drop height of the cup deteriorated; hence the device was recalibrated before and after each test to ensure the accuracy of the results. Moreover, the tests with under-filled and fully filled samples at higher drop heights revealed lower liquid limit values than those at lower drop heights. For the half-filled samples, it was clearly seen that the liquid limit values did not change at all as the drop height increased, which illustrates the role of the standard specifications.
Keywords: calibration, casagrande cup method, drop height, kaolinite, liquid limit, placing form
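A small sketch of the multipoint (flow-curve) reduction itself: water content is regressed against the logarithm of the blow count and the liquid limit is read at 25 blows (the trial data below are assumed, not the study's measurements):

```python
import numpy as np

# Assumed multipoint trial data: (blow count N, water content %).
blows = np.array([17, 22, 28, 34], dtype=float)
water_content = np.array([46.5, 44.8, 43.2, 42.0])

# Flow curve: w = a*log10(N) + b; the liquid limit is the water content at N = 25 blows.
a, b = np.polyfit(np.log10(blows), water_content, 1)
liquid_limit = a * np.log10(25.0) + b
print(f"flow index = {abs(a):.1f}, liquid limit = {liquid_limit:.1f} %")
```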
Procedia PDF Downloads 161
4856 Improvement of the Q-System Using the Rock Engineering System: A Case Study of Water Conveyor Tunnel of Azad Dam
Authors: Sahand Golmohammadi, Sana Hosseini Shirazi
Abstract:
Because the status and mechanical parameters of discontinuities in the rock mass are included in the calculations, various methods of rock engineering classification are often used as a starting point for the design of different types of structures. The Q-system is one of the most frequently used methods for stability analysis and determination of support systems of underground structures in rock, including tunnels. In this method, six main parameters of the rock mass are required, namely the rock quality designation (RQD), joint set number (Jn), joint roughness number (Jr), joint alteration number (Ja), joint water parameter (Jw) and stress reduction factor (SRF). In this regard, in order to achieve a reasonable and optimal design, identifying the parameters that govern the stability of such structures is one of the most important goals and most necessary actions in rock engineering. It is therefore necessary to study the relationships between the parameters of a system, how they interact with each other and, ultimately, the whole system. In this research, an attempt has been made to determine the most effective parameters (key parameters) among the six rock mass parameters of the Q-system using the rock engineering system (RES) method, in order to improve the relationships between the parameters in the calculation of the Q value. The RES method is, in fact, a method by which one can determine the degree of cause and effect of a system's parameters by constructing an interaction matrix. In this research, the geomechanical data collected from the water conveyor tunnel of Azad Dam were used to build the interaction matrix of the Q-system. For this purpose, instead of using the conventional methods, which are always accompanied by defects such as uncertainty, the Q-system interaction matrix is coded using a technique that is essentially a statistical analysis of the data, determining the correlation coefficients between them. In this way, the effect of each parameter on the system is evaluated with greater certainty. The results of this study show that the resulting interaction matrix provides a reasonable estimate of the effective parameters in the Q-system. Among the six parameters of the Q-system, the SRF and Jr parameters have the maximum and minimum influence on the system, respectively, while the RQD and Jw parameters are the most and the least influenced by the system, respectively. Therefore, by developing this method, a more accurate rock mass classification relation can be obtained by weighting the required parameters in the Q-system.
Keywords: Q-system, rock engineering system, statistical analysis, rock mass, tunnel
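Two small reference pieces related to the above, sketched under assumptions: the standard Q-value formula and a correlation-based coding of an interaction matrix from simulated parameter logs (the tunnel's geomechanical data are not reproduced):

```python
import numpy as np

def q_value(rqd, jn, jr, ja, jw, srf):
    # Barton's Q-system rating: Q = (RQD/Jn) * (Jr/Ja) * (Jw/SRF)
    return (rqd / jn) * (jr / ja) * (jw / srf)

# Correlation-based coding of an interaction matrix (simulated station logs of the six
# Q parameters; note that a pure correlation coding gives a symmetric matrix, so the
# row/column sums are used here simply as interaction-intensity weights).
rng = np.random.default_rng(0)
logs = rng.standard_normal((50, 6))                  # 50 stations x 6 parameters
names = ["RQD", "Jn", "Jr", "Ja", "Jw", "SRF"]
corr = np.corrcoef(logs, rowvar=False)
interaction = np.abs(corr) - np.eye(6)               # off-diagonal influence strengths
weights = interaction.sum(axis=1)
weights /= weights.sum()
for name, w in zip(names, weights):
    print(f"{name:>4}: weight = {w:.3f}")

print("Q example:", round(q_value(rqd=75, jn=9, jr=1.5, ja=2, jw=1, srf=2.5), 2))
```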
Procedia PDF Downloads 73
4855 Optimal Placement of Phasor Measurement Units Using Gravitational Search Method
Authors: Satyendra Pratap Singh, S. P. Singh
Abstract:
This paper presents a methodology using the Gravitational Search Algorithm for optimal placement of Phasor Measurement Units (PMUs) in order to achieve complete observability of the power system. The objective of the proposed algorithm is to minimize the total number of PMUs at the power system buses, which in turn minimizes the installation cost of the PMUs. In this algorithm, the searcher agents are a collection of masses that interact with each other according to Newton’s laws of gravity and motion. This new Gravitational Search Algorithm based method has been applied to the IEEE 14-bus, IEEE 30-bus and IEEE 118-bus test systems. Case studies reveal that the proposed method finds the optimal number of PMUs with better observability.
Keywords: gravitational search algorithm (GSA), law of motion, law of gravity, observability, phasor measurement unit
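A small sketch of the underlying observability model (a PMU at a bus observes that bus and its neighbors), using a simple greedy cover as a baseline stand-in for the gravitational search optimization; the 7-bus topology below is assumed and is not one of the IEEE test systems:

```python
import numpy as np

# Assumed 7-bus adjacency (1 = line between buses; diagonal 1 so a PMU observes its own bus).
A = np.array([
    [1, 1, 0, 0, 0, 0, 1],
    [1, 1, 1, 0, 0, 0, 0],
    [0, 1, 1, 1, 0, 0, 0],
    [0, 0, 1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1, 1, 0],
    [0, 0, 0, 0, 1, 1, 1],
    [1, 0, 0, 0, 0, 1, 1],
])

def greedy_pmu_placement(adj):
    n = adj.shape[0]
    unobserved = set(range(n))
    placed = []
    while unobserved:
        # Place the next PMU at the bus covering the most still-unobserved buses.
        gains = [len(unobserved & set(np.nonzero(adj[i])[0])) for i in range(n)]
        best = int(np.argmax(gains))
        placed.append(best)
        unobserved -= set(np.nonzero(adj[best])[0])
    return placed

print("PMU buses:", greedy_pmu_placement(A))
```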
Procedia PDF Downloads 508
4854 Wheeled Robot Stable Braking Process under Asymmetric Traction Coefficients
Authors: Boguslaw Schreyer
Abstract:
During the wheeled robot’s braking process, extra dynamic vertical forces act on all wheels: left, right, front and rear. Those forces are directed downward on the front wheels and upward on the rear wheels. In order to maximize the deceleration, and therefore minimize the braking time and braking distance, we need to calculate a correct torque distribution: the front braking torque should be increased and the rear torque decreased. At the same time, we need to provide better transversal stability. In the simple case of all adhesion coefficients being the same under all wheels, this torque distribution may secure optimal (maximal) control of the robot braking process, ensuring the minimum braking distance and minimum braking time, while the transversal stability remains relatively good. At all times, we monitor the transversal acceleration: in the case of transversal movement, we stop the braking process and re-apply the braking torque after a defined period of time. If we calculate the torque values correctly, we may keep the traction coefficient under the front and rear wheels close to its maximum. Also, in order to provide optimal braking control, we need to calculate the timing of the braking torque application and the timing of its release. The braking torques should be released shortly after the wheels pass the maximum of the traction coefficient (while wheel slip increases) and applied again after the wheels pass the maximum of the traction coefficient (while the slip decreases). A correct braking torque distribution ensures that the front and rear wheels pass this maximum at the same time, which guarantees optimal deceleration control and therefore minimum braking time. In order to calculate a correct torque distribution, the control unit should receive input signals of the rear torque value (which changes independently), the robot’s deceleration, and the values of the vertical front and rear forces. In order to calculate the timing of torque application and release, more signals are needed: the speed of the robot and the angular speed and angular deceleration of the wheels. In the case of different adhesion coefficients under the left and right wheels, but the same under each side’s pair of wheels (the same under the right wheels and the same under the left wheels), the select-low (SL) and select-high (SH) methods are applied. The SL method is suggested if transversal stability is more important than braking efficiency. For the robot, braking efficiency is often more important; therefore, the SH method is applied with some control of the transversal stability. In the case that all adhesion coefficients are different under all wheels, the front-rear torque distribution is maintained as in all previous cases; however, the timing of the braking torque application and release is controlled by the lowest adhesion coefficient of the rear wheels. The Lagrange equations have been used to describe the robot dynamics. Matlab has been used to simulate the wheeled robot braking process, and, in conclusion, the braking methods have been selected.
Keywords: wheeled robots, braking, traction coefficient, asymmetric
Procedia PDF Downloads 165
4853 Non-Uniform Filter Banks-Based Minimum Distance to Riemannian Mean Classification in Motor Imagery Brain-Computer Interface
Authors: Ping Tan, Xiaomeng Su, Yi Shen
Abstract:
The motion intention in the motor imagery brain-computer interface is identified by classifying the event-related desynchronization (ERD) and event-related synchronization (ERS) characteristics of the sensorimotor rhythm (SMR) in EEG signals. When the subject imagines different limbs or different body parts moving, the rhythm components and bandwidth change, and this varies from person to person. Finding the effective sensorimotor frequency band of a subject is therefore directly related to the classification accuracy of the brain-computer interface. To solve this problem, this paper proposes a Minimum Distance to Riemannian Mean classification method based on non-uniform filter banks. During the training phase, the EEG signals are first decomposed into multiple signals of different bandwidths using multiple band-pass filters; then the spatial covariance characteristics of each frequency-band signal are computed as the feature vectors. These feature vectors are classified by the MDRM (Minimum Distance to Riemannian Mean) method, and cross-validation is employed to obtain the effective sensorimotor frequency bands. During the test phase, the test signals are filtered by the band-pass filters of the effective sensorimotor frequency bands, and the extracted spatial covariance feature vectors are classified using MDRM. Experiments on the BCI Competition IV 2a dataset show that the proposed method is superior to other classification methods.
Keywords: non-uniform filter banks, motor imagery, brain-computer interface, minimum distance to Riemannian mean
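A minimal sketch of the MDRM step on spatial covariance features, using the affine-invariant Riemannian distance; for brevity the class means are approximated by a log-Euclidean average rather than the true Riemannian mean, and the filter-bank stage and real EEG data are not reproduced:

```python
import numpy as np
from scipy.linalg import sqrtm, logm, expm, inv

def spatial_cov(trial):
    """Trial: channels x samples EEG segment -> regularized spatial covariance."""
    c = trial @ trial.T / trial.shape[1]
    return c + 1e-6 * np.eye(c.shape[0])

def riemann_dist(a, b):
    """Affine-invariant Riemannian distance between SPD matrices."""
    s = inv(sqrtm(a))
    return np.linalg.norm(logm(s @ b @ s), "fro")

def logeuclid_mean(covs):
    # Log-Euclidean average as a simple stand-in for the true Riemannian mean.
    return expm(np.mean([logm(c) for c in covs], axis=0))

def mdrm_predict(test_cov, class_means):
    dists = {label: riemann_dist(test_cov, mean) for label, mean in class_means.items()}
    return min(dists, key=dists.get)

# Toy usage with random "EEG" trials for two classes of different power.
rng = np.random.default_rng(0)
trials = {lab: [spatial_cov(rng.standard_normal((8, 250)) * (1 + lab)) for _ in range(10)]
          for lab in (0, 1)}
means = {lab: logeuclid_mean(cs) for lab, cs in trials.items()}
test = spatial_cov(rng.standard_normal((8, 250)) * 2)
print("predicted class:", mdrm_predict(test, means))
```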
Procedia PDF Downloads 126
4852 Application of Simulation of Discrete Events in Resource Management of Massive Concreting
Authors: Mohammad Amin Hamedirad, Seyed Javad Vaziri Kang Olyaei
Abstract:
Project planning and control are among the most critical issues in the management of construction projects. Traditional methods of project planning and control, such as the critical path method or the Gantt chart, are not widely used for planning projects with discrete and repetitive activities, and one of the problems of project managers is planning the implementation process and optimally allocating its resources. Massive concreting is also a project with discrete and repetitive activities. This study uses the concept of discrete-event simulation to manage resources, which includes finding the optimal number of resources considering various limitations, such as limitations of machinery, equipment and human resources, and even technical, time and implementation limitations, using analysis of the resource consumption rate, the project completion time, and the critical points of the implementation process. For this purpose, the concept of discrete-event simulation has been used to model the different stages of implementation. After reviewing the various scenarios, the optimal allocation for each resource is finally determined so as to reach the maximum utilization rate and also to reduce the project completion time or its cost, according to the existing constraints. The results showed that with the optimal allocation of resources, the project completion time can be reduced by 90%, and the resulting costs can be reduced by up to 49%. Thus, allocating the optimal number of project resources using this method will reduce the project's time and cost.
Keywords: simulation, massive concreting, discrete event simulation, resource management
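A minimal discrete-event sketch with SimPy showing how a limited pump fleet constrains successive pours (the durations and resource counts are assumed; the study's project data are not reproduced):

```python
import simpy

POUR_HOURS = 4          # assumed duration of one concreting block
N_BLOCKS = 10           # assumed number of repetitive pours
                        # number of pumps is the resource level under evaluation

def pour(env, name, pumps, log):
    with pumps.request() as req:
        yield req                       # wait for a free pump
        yield env.timeout(POUR_HOURS)   # pouring in progress
        log.append((name, env.now))

def run(n_pumps):
    env = simpy.Environment()
    pumps = simpy.Resource(env, capacity=n_pumps)
    log = []
    for i in range(N_BLOCKS):
        env.process(pour(env, f"block-{i}", pumps, log))
    env.run()
    return env.now, log

for n in (1, 2, 3):
    makespan, _ = run(n)
    print(f"{n} pump(s): completion time = {makespan} h")
```

Scanning the makespan over candidate resource levels is the same scenario-review idea described above: the smallest fleet whose completion time meets the schedule constraint is retained.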
Procedia PDF Downloads 149
4851 A Lifetime-Enhancing Monitoring Node Distribution Using Minimum Spanning Tree in Mobile Ad Hoc Networks
Authors: Sungchul Ha, Hyunwoo Kim
Abstract:
In mobile ad hoc networks, all nodes have only limited resources and calculation ability. Therefore, a communication topology with a long lifetime is good for all nodes in mobile ad hoc networks. There is a variety of research on security problems in wireless ad hoc networks, and many existing studies try to design efficient security schemes to reduce network power consumption and enhance network lifetime. Because a new node can join the network at any time, wireless ad hoc networks are exposed to various threats and can be destroyed by attacks. Resource consumption is absolutely necessary to secure networks, but excessive resource consumption can be a critical problem for network lifetime. This paper focuses on efficient monitoring node distribution to enhance network lifetime in wireless ad hoc networks. Since wireless ad hoc networks cannot use the centralized infrastructure and security systems of wired networks, a new, special IDS scheme is necessary. The scheme should not only cover all nodes in a network but also enhance the network lifetime. In this paper, we propose an efficient IDS node distribution scheme using the minimum spanning tree (MST) method. The simulation results show that the proposed algorithm has superior performance in comparison with existing algorithms.
Keywords: MANETs, IDS, power control, minimum spanning tree
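A minimal sketch of the minimum-spanning-tree step over a random geometric network (the paper's IDS coverage rules and energy model are not reproduced; node positions and radio range are assumed):

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

# Random node positions; edge weight = distance if within radio range, else 0 (no edge).
rng = np.random.default_rng(0)
pos = rng.random((12, 2))
dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
radio_range = 0.5
adj = np.where(dist <= radio_range, dist, 0.0)

mst = minimum_spanning_tree(adj).toarray()
edges = [(int(i), int(j)) for i, j in zip(*np.nonzero(mst))]
degree = np.zeros(len(pos), dtype=int)
for i, j in edges:
    degree[i] += 1
    degree[j] += 1

# High-degree MST nodes are natural candidates for monitoring/IDS duty in this sketch.
print("MST edges:", edges)
print("candidate monitoring nodes (degree >= 2):", np.nonzero(degree >= 2)[0].tolist())
```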
Procedia PDF Downloads 373
4850 Comprehensive Analysis of Electrohysterography Signal Features in Term and Preterm Labor
Authors: Zhihui Liu, Dongmei Hao, Qian Qiu, Yang An, Lin Yang, Song Zhang, Yimin Yang, Xuwen Li, Dingchang Zheng
Abstract:
Premature birth, defined as birth before 37 completed weeks of gestation, is a leading cause of neonatal morbidity and mortality and has long-term adverse consequences for health. It has recently been reported that the worldwide preterm birth rate is around 10%. The existing measurement techniques for diagnosing preterm delivery include the tocodynamometer, ultrasound and fetal fibronectin. However, they are subjective or suffer from high measurement variability and inaccurate diagnosis and prediction of preterm labor. Electrohysterography (EHG), a method based on recording uterine electrical activity with electrodes attached to the maternal abdomen, is a promising method to assess uterine activity and diagnose preterm labor. The purpose of this study is to analyze the differences in EHG signal features between term labor and preterm labor. A free-access database was used with 300 signals acquired in two groups of pregnant women who delivered at term (262 cases) and preterm (38 cases). Among them, EHG signals from 38 term and 38 preterm labors were preprocessed with band-pass Butterworth filters of 0.08–4 Hz. Then, EHG signal features were extracted, comprising classical time-domain descriptors including the root mean square and zero-crossing number, spectral parameters including the peak frequency, mean frequency and median frequency, wavelet packet coefficients, autoregression (AR) model coefficients, and nonlinear measures including the maximal Lyapunov exponent, sample entropy and correlation dimension. Their statistical significance for recognition of the two groups of recordings was provided. The results showed that the mean frequency of preterm labor was significantly smaller than that of term labor (p < 0.05). Five coefficients of the AR model showed significant differences between term labor and preterm labor. The maximal Lyapunov exponent of early preterm (time of recording < the 26th week of gestation) was significantly smaller than that of early term. The sample entropy of late preterm (time of recording > the 26th week of gestation) was significantly smaller than that of late term. There was no significant difference in the other features between the term labor and preterm labor groups. Any future work regarding classification should therefore focus on using multiple techniques, with the mean frequency, AR coefficients, maximal Lyapunov exponent and sample entropy being among the prime candidates. Even if these methods are not yet useful for clinical practice, they do provide the most promising indicators for preterm labor.
Keywords: electrohysterogram, feature, preterm labor, term labor
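A small sketch computing a few of the listed features (RMS, zero-crossing count, median frequency, a naive sample entropy) on a synthetic signal; the sampling rate and signal are assumed, and the database preprocessing is not reproduced:

```python
import numpy as np
from scipy.signal import welch

def rms(x):
    return np.sqrt(np.mean(x ** 2))

def zero_crossings(x):
    s = np.signbit(x).astype(int)
    return int(np.sum(np.abs(np.diff(s))))

def median_frequency(x, fs):
    f, pxx = welch(x, fs=fs, nperseg=1024)
    cum = np.cumsum(pxx)
    return f[np.searchsorted(cum, cum[-1] / 2)]

def sample_entropy(x, m=2, r_factor=0.2):
    """Naive O(n^2) SampEn; r is a fraction of the signal standard deviation."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=-1)
        return (np.sum(d <= r) - len(templ)) / 2      # matched pairs, excluding self-matches
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

fs = 20.0                                             # Hz (assumed sampling rate)
t = np.arange(0, 60, 1 / fs)
sig = np.sin(2 * np.pi * 0.4 * t) + 0.3 * np.random.default_rng(0).standard_normal(t.size)
print(rms(sig), zero_crossings(sig), median_frequency(sig, fs),
      round(sample_entropy(sig[:400]), 3))
```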
Procedia PDF Downloads 572
4849 Analysis of Radiation-Induced Liver Disease (RILD) and Evaluation of Relationship between Therapeutic Activity and Liver Clearance Rate with Tc-99m-Mebrofenin in Yttrium-90 Microspheres Treatment
Authors: H. Tanyildizi, M. Abuqebitah, I. Cavdar, M. Demir, L. Kabasakal
Abstract:
Aim: Whole-liver radiation has a modest benefit in the treatment of unresectable hepatic metastases, but the radiation doses must be kept under control; otherwise, RILD complications may arise. In this study, we aimed to calculate the maximum permissible activity (MPA) and the critical organ absorbed doses with the MIRD methodology, to evaluate tumour doses for treatment response and whole-liver doses for RILD, and additionally to find the optimal liver function test. Materials and Methods: This study includes 29 patients who attended our nuclear medicine department for Y-90 microspheres treatment. 10 mCi of Tc-99m MAA was administered intravenously to the patients for dosimetry. After the injection, whole-body SPECT/CT images were taken within one hour. Taking the minimum therapeutic tumour dose to be 120 Gy, the activities were calculated with the MIRD methodology considering the volumetric tumour/liver ratio. A sub-working group of 11 patients was created randomly, and the liver clearance rate with Tc-99m-Mebrofenin was calculated according to the Ekman formalism. Results: The volumetric tumour/liver ratios were between 33-66% (maximum tolerable dose (MTD) 48-52 Gy) for 4 patients and less than 33% (MTD 72 Gy) for 25 patients. According to these results, the average activity, mean liver dose and mean tumour dose were found to be 1793.9±1.46 MBq, 32.86±0.19 Gy, and 138.26±0.40 Gy, respectively. RILD was not observed in any patient. In the sub-working group, the relationships between bilirubin, albumin, INR (which indicate the presence of liver disease and its degree), liver clearance with Tc-99m-Mebrofenin, and the calculated activity amounts were found to be r=0.49, r=0.27, r=0.43 and r=0.57, respectively. Discussion: The minimum tumour dose was taken as 120 Gy for a positive dose-response relation. If the volumetric tumour/liver ratio was > 66%, the dose was 30 Gy; if the ratio was 33-66%, the dose was escalated to 48 Gy; and if the ratio was < 33%, the dose was 72 Gy. These dose limitations did not produce RILD. Clearance measurement with Mebrofenin was concluded to be the best method to determine liver function. Therefore, the liver clearance rate with Tc-99m-Mebrofenin should be considered in the calculation of yttrium-90 microsphere dosimetry.
Keywords: clearance, dosimetry, liver, RILD
Procedia PDF Downloads 440
4848 Optimal Design of Redundant Hybrid Manipulator for Minimum Singularity
Authors: Arash Rahmani, Ahmad Ghanbari, Abbas Baghernezhad, Babak Safaei
Abstract:
In the design of parallel manipulators, the mean value of a dexterity measure over the workspace volume is usually considered as the objective function to be used in optimization algorithms. The mentioned indexes in a hybrid parallel manipulator (HPM) are quite complicated to evaluate owing to the infinite number of solutions for every point within the workspace of redundant manipulators. In this paper, spatial isotropic design axioms are extended as a well-known method for the optimum design of manipulators. An upper limit for the isotropy measure of the HPM is calculated, and instead of computing and minimizing the isotropy measure, minimizing the obtained limit is considered. To this end, two different objective functions are suggested, which are obtained from the objective functions of the constituent modules. Finally, by using a genetic algorithm (GA), the best geometric parameters for a specific hybrid parallel robot, which is composed of two modified Gough-Stewart platforms (MGSP), are obtained.
Keywords: hybrid manipulator, spatial isotropy, genetic algorithm, optimum design
Procedia PDF Downloads 337
4847 Analyzing the Results of Buildings Energy Audit by Using Grey Set Theory
Authors: Tooraj Karimi, Mohammadreza Sadeghi Moghadam
Abstract:
Grey set theory has the advantage of using fewer data to analyze many factors, and it is therefore more appropriate for system study than traditional statistical regression, which requires massive data, a normal distribution in the data, and few variant factors. In this paper, grey clustering and the entropy of the coefficient vector of grey evaluations are used to analyze the energy consumption of buildings of the Oil Ministry in Tehran. In fact, this article analyzes the results of energy audit reports and defines the most favorable characteristic of the system, which is the energy consumption of buildings, and the most favorable factors affecting this characteristic, in order to modify and improve them. According to the results of the model, ‘the real Building Load Coefficient’ has been selected as the most important system characteristic, and ‘uncontrolled area of the building’ has been diagnosed as the most favorable factor with the greatest effect on the energy consumption of a building. Grey clustering in this study has been used for two purposes: first, all the building variables related to the energy audit are clustered into two main groups of indicators, and the number of variables is reduced; second, grey clustering with variable weights is used to classify all buildings into three categories named ‘no standard deviation’, ‘low standard deviation’ and ‘non-standard’. The entropy of the coefficient vector of grey evaluations is calculated to investigate the greyness of the results. It shows that among the 38 buildings surveyed in terms of energy consumption, 3 cases are in the standard group, 24 cases are in the ‘low standard deviation’ group, and 11 buildings are completely non-standard. In addition, the clustering greyness of 13 buildings is less than 0.5, and the average uncertainty of the clustering results is 66%.
Keywords: energy audit, grey set theory, grey incidence matrixes, grey clustering, Iran oil ministry
Procedia PDF Downloads 374
4846 Efficient Broadcasting in Wireless Sensor Networks
Authors: Min Kyung An, Hyuk Cho
Abstract:
In this paper, we study the Minimum Latency Broadcast Scheduling (MLBS) problem in wireless sensor networks (WSNs). The main issue of the MLBS problem is to compute schedules with the minimum number of timeslots such that a base station can broadcast data to all other sensor nodes with no collisions. Unlike existing works that utilize traditional omni-directional WSNs, we target directional WSNs, where nodes can collaboratively determine and orient their antenna directions. We first develop a 7-approximation algorithm for directional WSNs. Our ratio is currently the best, to the best of our knowledge. We then validate the performance of the proposed algorithm through simulation.
Keywords: broadcast, collision-free, directional antenna, approximation, wireless sensor networks
Procedia PDF Downloads 347
4845 Fast Robust Switching Control Scheme for PWR-Type Nuclear Power Plants
Authors: Piyush V. Surjagade, Jiamei Deng, Paul Doney, S. R. Shimjith, A. John Arul
Abstract:
In sophisticated and complex systems such as nuclear power plants, maintaining the system's stability in the presence of uncertainties and disturbances and obtaining a fast dynamic response are the most challenging problems. Thus, to ensure the satisfactory and safe operation of nuclear power plants, this work proposes a new fast, robust, optimal switching control strategy for pressurized-water-reactor-type nuclear power plants. The proposed control strategy guarantees a substantial degree of robustness, a fast dynamic response over the entire operational envelope, and optimal performance during the nominal operation of the plant. To improve the robustness, obtain a fast dynamic response, and make the system optimal, a bank of controllers is designed. Various controllers, such as a baseline proportional-integral-derivative controller, an optimal linear quadratic Gaussian controller, and a robust adaptive L1 controller, are designed to perform distinct tasks in specific situations. At any instant of time, the most suitable controller from the bank is selected by the switching logic unit, which designates the controller by monitoring the health of the nuclear power plant and its transients. The proposed switching control strategy optimizes the overall performance and increases operational safety and efficiency. Simulation studies performed considering various uncertainties and disturbances demonstrate the applicability and effectiveness of the proposed switching control strategy over some conventional control techniques.
Keywords: switching control, robust control, optimal control, nuclear power control
Procedia PDF Downloads 137
4844 On the Optimality of Blocked Main Effects Plans
Authors: Rita SahaRay, Ganesh Dutta
Abstract:
In this article, experimental situations are considered where a main effects plan is to be used to study m two-level factors using n runs partitioned into b blocks, not necessarily of the same size. Assuming the block sizes to be even for all blocks, for the case n ≡ 2 (mod 4), optimal designs are obtained with respect to the type 1 and type 2 optimality criteria in the class of designs providing estimation of all main effects orthogonal to the block effects. In practice, such orthogonal estimation of main effects is often a desirable condition. In the wider class of all available m two-level even-sized blocked main effects plans, where the factors do not occur at high and low levels equally often in each block, E-optimal designs are also characterized. Simple construction methods for these optimal designs, based on Hadamard matrices and the Kronecker product, are presented.
Keywords: design matrix, Hadamard matrix, Kronecker product, type 1 criteria, type 2 criteria
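A small sketch of the Sylvester/Kronecker-product construction of Hadamard matrices that underlies such two-level plans (the paper's specific blocked constructions are not reproduced):

```python
import numpy as np

H2 = np.array([[1, 1],
               [1, -1]])

def hadamard(n):
    """Sylvester construction: H_{2k} = H_2 (Kronecker) H_k, for n a power of two."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.kron(H2, H)
    return H

H8 = hadamard(8)
# Orthogonality check: H H^T = n I, so columns 2..n can serve as two-level factor columns.
print(np.array_equal(H8 @ H8.T, 8 * np.eye(8, dtype=int)))
```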
Procedia PDF Downloads 366
4843 Performance Comparison of Microcontroller-Based Optimum Controller for Fruit Drying System
Authors: Umar Salisu
Abstract:
This research presents the development of a hot-air tomato drying system. To provide more efficient and continuous temperature control, a microcontroller-based optimal controller was developed. The system is based on a power control principle to achieve smooth power variations depending on a feedback temperature signal of the process. An LM35 temperature sensor and an LM399 differential comparator were used to measure the temperature. The mathematical model of the system was developed, and the optimal controller was designed, simulated, and compared with the transient response of a PID controller. A controlled environment suitable for fruit drying is created within a closed chamber in a three-step process. First, infrared light is used internally to preheat the fruit to speedily remove the water content inside the fruit for fast drying. Second, hot air at a specified temperature is blown inside the chamber to maintain the humidity below a specified level and to exhaust the humid air from the chamber. Third, the microcontroller disconnects the power to the chamber after the moisture content of the fruit has been reduced to a minimum. Experiments were conducted with 1 kg of fresh tomatoes at three different temperatures (40, 50 and 60 °C) at a constant relative humidity of 30% RH. The results obtained indicate that the system significantly reduces the drying time without affecting the quality of the fruit. In the context of temperature control, the results showed that the response of the optimal controller has zero overshoot, whereas the PID controller response overshoots by about 30% of the set-point. Another performance metric used is the rise time; the optimal controller rose without any delay, while the PID controller was delayed by more than 50 s. It can be argued that the optimal controller's performance is preferable to that of the PID controller, since it does not overshoot and it starts in good time.
Keywords: drying, microcontroller, optimum controller, PID controller
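A small sketch comparing the step response of a first-order thermal plant under PID control, illustrating the overshoot metric discussed above (the plant and PID gains are assumed and are not the paper's chamber model or controller):

```python
import numpy as np

# Assumed first-order thermal plant: tau * dT/dt = -T + K*u (temperature relative to ambient).
K, tau, dt = 2.0, 120.0, 0.5
setpoint, t_end = 50.0, 1500.0

def simulate_pid(kp, ki, kd):
    T, integ, prev_err = 0.0, 0.0, setpoint
    history = []
    for _ in np.arange(0.0, t_end, dt):
        err = setpoint - T
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = np.clip(kp * err + ki * integ + kd * deriv, 0.0, 100.0)  # heater power limit
        T += dt * (-T + K * u) / tau
        prev_err = err
        history.append(T)
    return np.array(history)

resp = simulate_pid(kp=2.0, ki=0.05, kd=1.0)
overshoot = 100.0 * (resp.max() - setpoint) / setpoint
print(f"peak = {resp.max():.1f} °C, overshoot = {overshoot:.1f} %")
```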
Procedia PDF Downloads 301
4842 Numerical Investigation of the Transverse Instability in Radiation Pressure Acceleration
Authors: F. Q. Shao, W. Q. Wang, Y. Yin, T. P. Yu, D. B. Zou, J. M. Ouyang
Abstract:
The Radiation Pressure Acceleration (RPA) mechanism is very promising in laser-driven ion acceleration because of its high laser-to-ion energy conversion efficiency. Although some experiments have shown the characteristics of RPA, the ion energy is quite limited: the ion energy obtained in experiments is only several MeV/u, which is much lower than the theoretical prediction. One possible limiting factor is the transverse instability incited in the RPA process. The transverse instability is basically considered to be the Rayleigh-Taylor (RT) instability, a kind of interfacial instability that occurs when a light fluid pushes against a heavy fluid. Multi-dimensional particle-in-cell (PIC) simulations show that the onset of the transverse instability will destroy the acceleration process and broaden the energy spectrum of fast ions during RPA-dominated ion acceleration. Evidence of the RT instability driven by radiation pressure has been observed in a laser-foil interaction experiment in a typical RPA regime, and the dominant scale of the RT instability is close to the laser wavelength. The development of the transverse instability in radiation-pressure-acceleration-dominated laser-foil interaction is numerically examined by two-dimensional particle-in-cell simulations. When a laser interacts with a foil with a modulated surface, the instability is quickly incited and develops. The linear growth and saturation of the transverse instability are observed, and the growth rate is numerically diagnosed. In order to optimize the interaction parameters, a method based on information entropy is put forward to describe the chaotic degree of the transverse instability. With moderate modulation, the transverse instability shows a low chaotic degree, and a quasi-monoenergetic proton beam is produced.
Keywords: information entropy, radiation pressure acceleration, Rayleigh-Taylor instability, transverse instability
Procedia PDF Downloads 346