Search results for: smoothness priors
Paper Count: 42

42 Comparison of Detrending Methods in Spectral Analysis of Heart Rate Variability

Authors: Liping Li, Changchun Liu, Ke Li, Chengyu Liu

Abstract:

Non-stationary trends in R-R interval series are considered a main factor that can strongly influence spectral analysis, and removing them is recommended for reliable results. In this study, three detrending methods, the smoothness priors approach, the wavelet method and empirical mode decomposition, were compared on artificial R-R interval series with four types of simulated trends. The Lomb-Scargle periodogram was used for spectral analysis of the R-R interval series. Results indicated that the wavelet method showed a better overall performance than the other two methods and was also less time-consuming. It was therefore selected for spectral analysis of real R-R interval series from thirty-seven healthy subjects, where significant decreases were found in the low frequency band (19.94±5.87%) and in the ratio (18.97±5.78%) (p<0.001). The wavelet method is thus recommended as an optimal choice for this application.
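
A minimal sketch of the smoothness priors detrender the study compares (the regularized least-squares formulation usually attributed to Tarvainen et al., with a second-order difference operator); the series and the smoothing parameter below are illustrative, not taken from the paper.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def smoothness_priors_detrend(rr, lam=300.0):
    """Remove the low-frequency trend: trend = (I + lam^2 D2'D2)^-1 rr,
    where D2 is the second-order difference operator."""
    n = len(rr)
    D2 = sparse.diags([1.0, -2.0, 1.0], offsets=[0, 1, 2],
                      shape=(n - 2, n), format="csc")
    I = sparse.identity(n, format="csc")
    trend = spsolve(I + lam**2 * (D2.T @ D2), rr)
    return rr - trend

# Toy R-R series: stationary variability plus a slow drift
rr = np.random.default_rng(0).normal(0.8, 0.05, 500) + np.linspace(0.0, 0.3, 500)
detrended = smoothness_priors_detrend(rr)
print(detrended.std(), rr.std())  # drift removed, short-term variability kept
```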

Keywords: empirical mode decomposition, heart rate variability, signal detrending, smoothness priors, wavelet

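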
41 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data

Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park

Abstract:

We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. We consider an efficient computational method that can be used to obtain approximate posteriors for the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution over the various parameters and the latent function by using variational Bayesian approximations and importance sampling, and then derive the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to a synthetic multivariate dataset in order to verify its performance. Experimental results show that our model is more accurate than the other approximation methods.
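
For orientation only, a sketch of off-the-shelf multiclass Gaussian process classification on synthetic data; note that scikit-learn's classifier uses a Laplace approximation in a one-vs-rest scheme, not the authors' variational Bayes / importance-sampling method, and the dataset below is synthetic.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=5, n_informative=4,
                           n_redundant=0, n_classes=3, n_clusters_per_class=1,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(1.0)).fit(Xtr, ytr)
print("accuracy:", gpc.score(Xte, yte))
print("class posteriors:", gpc.predict_proba(Xte[:2]))  # predictive probabilities
```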

Keywords: Multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence.

40 A Comparison Study of Fabric Objective Measurement (FOM) Using KES-FB and PhabrOmeter System on Warp Knitted Fabrics Handle – Smoothness, Stiffness and Softness

Authors: Ka-Yan Yim, Chi-Wai Kan

Abstract:

This paper conducts a comparison study using KES-FB and PhabrOmeter to measure the hand properties of 58 selected warp knitted fabrics. Fabric samples were selected and measured by both KES-FB and PhabrOmeter. Results show differences between the two measurement methods. Smoothness and stiffness values obtained by KES-FB were found to be significantly correlated with the PhabrOmeter results (p = 0.003 and 0.022, respectively), while softness values did not show significant correlation between the two methods (p = 0.828). This disagreement implies that the two methods, with their different measurement principles, face limitations when applied to warp knitted fabrics. Subjective measurement methods and further studies are suggested in order to investigate the mechanisms of fabric hand perception more deeply.
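
The significance tests reported above are plain correlation tests; a minimal sketch with made-up smoothness scores (the paper's raw measurements are not reproduced here):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
kes_smoothness = rng.normal(5.0, 1.0, 58)                      # hypothetical KES-FB values
phab_smoothness = 0.6 * kes_smoothness + rng.normal(0, 1, 58)  # hypothetical PhabrOmeter values
r, p = pearsonr(kes_smoothness, phab_smoothness)
print(f"r = {r:.3f}, p = {p:.3f}")  # correlation is significant when p < 0.05
```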

Keywords: Fabric hand, fabric objective measurement, KES-FB, PhabrOmeter.

39 Effect of Progressive Type-I Right Censoring on Bayesian Statistical Inference of Simple Step–Stress Acceleration Life Testing Plan under Weibull Life Distribution

Authors: Saleem Z. Ramadan

Abstract:

This paper discusses the effects of progressive Type-I right censoring on the design of a simple step-stress accelerated life test, using a Bayesian approach for Weibull life products under the assumption of a cumulative exposure model. The optimization criterion is to minimize the expected pre-posterior variance of the Pth percentile time to failure. The design variables are the stress changing time and the stress value of the first step. A comparison between conventional and progressive Type-I right censoring is provided. The results show that progressive Type-I right censoring reduces the cost of testing at the expense of test precision when the sample size is small, and that using strong priors or a large sample size reduces the sensitivity of the test precision to the censoring proportion. Hence, progressive Type-I right censoring is recommended in these cases, as it reduces the cost of the test without greatly affecting its precision. The results also show that the choice of direct or indirect priors affects the precision of the test.
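
For reference, the quantity whose pre-posterior variance the design criterion targets is a Weibull percentile time; a hedged sketch with made-up parameter values:

```python
import numpy as np

def weibull_percentile(p, shape, scale):
    """Pth percentile of a Weibull life distribution:
    t_p = scale * (-ln(1 - p))**(1/shape)."""
    return scale * (-np.log(1.0 - p)) ** (1.0 / shape)

print(weibull_percentile(0.10, shape=1.5, scale=1000.0))  # illustrative B10 life
```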

Keywords: Reliability, Accelerated life testing, Cumulative exposure model, Bayesian estimation, Progressive Type-I censoring, Weibull distribution.

38 Shape Restoration of the Left Ventricle

Authors: May-Ling Tan, Yi Su, Chi-Wan Lim, Liang Zhong, Ru-San Tan

Abstract:

This paper describes an automatic algorithm to restore the shape of three-dimensional (3D) left ventricle (LV) models created from magnetic resonance imaging (MRI) data using a geometry-driven optimization approach. Our basic premise is to restore the LV shape such that the LV epicardial surface is smooth after the restoration. A geometrical measure known as the minimum principal curvature (κ2) is used to assess the smoothness of the LV and to construct the objective function of a two-step optimization process. The objective of the optimization is to achieve a smooth epicardial shape by iterative in-plane translation of the MRI slices; quantitatively, this yields a minimum sum of the magnitudes of κ2 where κ2 is negative. A limited-memory quasi-Newton algorithm, L-BFGS-B, is used to solve the optimization problem. We tested our algorithm on an in vitro theoretical LV model and on 10 in vivo patient-specific models containing significant motion artifacts. The results show that our method is able to automatically restore the shape of LV models to smoothness without altering the general shape of the model. The magnitudes of the in-plane translations are also consistent with existing registration techniques and experimental findings.
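
A sketch of the optimization step only, assuming a stand-in roughness cost (sum of squared differences between adjacent slice positions) in place of the paper's minimum-principal-curvature measure; it shows bound-constrained L-BFGS-B over in-plane slice translations:

```python
import numpy as np
from scipy.optimize import minimize

n_slices = 12
observed = np.random.default_rng(1).normal(0.0, 2.0, (n_slices, 2))  # toy motion artifacts

def roughness(shifts):
    """Stand-in smoothness cost: penalize jagged stacking of slices."""
    pos = observed + shifts.reshape(n_slices, 2)
    return np.sum(np.diff(pos, axis=0) ** 2)

res = minimize(roughness, np.zeros(2 * n_slices), method="L-BFGS-B",
               bounds=[(-5.0, 5.0)] * (2 * n_slices))  # limit translation magnitude
print(res.fun, res.x[:4])
```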

Keywords: Magnetic Resonance Imaging, Left Ventricle, Shape Restoration, Principal Curvature, Optimization

37 Integrating Context Priors into a Decision Tree Classification Scheme

Authors: Kasim Terzic, Bernd Neumann

Abstract:

Scene interpretation systems need to match (often ambiguous) low-level input data to concepts from a high-level ontology. In many domains, these decisions are uncertain and benefit greatly from proper context. This paper demonstrates the use of decision trees for estimating class probabilities for regions described by feature vectors, and shows how context can be introduced in order to improve the matching performance.
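
A hedged sketch of the general idea, not the authors' scheme: per-region class probabilities from a decision tree, reweighted by a context prior via Bayes' rule on synthetic data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X, y)

likelihood = tree.predict_proba(X[:1])[0]   # tree's class probabilities for a region
context_prior = np.array([0.6, 0.3, 0.1])   # hypothetical scene-context prior
posterior = likelihood * context_prior
posterior /= posterior.sum()                # renormalize
print(posterior)
```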

Keywords: Classification, Decision Trees, Interpretation, Vision

36 Image Magnification Using Adaptive Interpolation by Pixel Level Data-Dependent Geometrical Shapes

Authors: Muhammad Sajjad, Naveed Khattak, Noman Jafri

Abstract:

The world has entered the 21st century: computer graphics and digital camera technology are prevalent, and high-resolution displays and printers are available. High-resolution images are therefore needed to produce high-quality display images and prints, but since high-resolution images are not usually provided, the original images must be magnified. One common difficulty with previous magnification techniques is preserving details, i.e. edges, while at the same time smoothing the data so as not to introduce spurious artefacts; a definitive solution is still an open issue. In this paper, image magnification using adaptive interpolation by pixel-level data-dependent geometrical shapes is proposed, which takes into account information about the edges (sharp luminance variations) and the smoothness of the image. It calculates a threshold, classifies interpolation regions in the form of geometrical shapes, and then assigns suitable values to the undefined pixels inside each interpolation region while preserving the sharp luminance variations and smoothness at the same time. The results of the proposed technique have been compared qualitatively and quantitatively with five other techniques. The qualitative results show that the proposed method clearly outperforms nearest neighbour (NN), bilinear (BL) and bicubic (BC) interpolation; the quantitative results are competitive and consistent with NN, BL, BC and the others.
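
For orientation only, the baseline interpolators the proposed method is compared against, applied to a toy image; the shape-adaptive scheme itself is not reproduced here:

```python
import numpy as np
from scipy.ndimage import zoom

img = np.random.default_rng(2).integers(0, 256, (64, 64)).astype(float)  # toy image
nn = zoom(img, 2, order=0)   # nearest neighbour
bl = zoom(img, 2, order=1)   # bilinear
bc = zoom(img, 2, order=3)   # bicubic
print(nn.shape, bl.shape, bc.shape)  # each 128x128
```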

Keywords: Adaptive, digital image processing, image magnification, interpolation, geometrical shapes, qualitative & quantitative analysis.

35 On the Parameter of the Burr Type X under Bayesian Principles

Authors: T. N. Sindhu, M. Aslam

Abstract:

A comprehensive Bayesian analysis has been carried out, in the context of informative and non-informative priors, for the shape parameter of the Burr type X distribution under different symmetric and asymmetric loss functions. Elicitation of the hyperparameters through a prior predictive approach is also discussed. We also derive expressions for the posterior predictive distributions, predictive intervals and credible intervals. As an illustration, these estimators are compared through a simulation study.
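
One standard conjugate case, as a hedged sketch: with a Gamma(a, b) prior on the Burr X shape parameter θ, the posterior given a complete sample is Gamma(n + a, b + T) with T = -Σ log(1 - exp(-x_i²)), so the Bayes estimate under squared error loss is the posterior mean. The hyperparameters below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)
theta_true, n = 2.0, 100
# Burr X sampling by inversion of F(x) = (1 - exp(-x^2))**theta
u = rng.uniform(size=n)
x = np.sqrt(-np.log(1.0 - u ** (1.0 / theta_true)))

a, b = 1.0, 1.0                        # Gamma hyperparameters (illustrative)
T = -np.sum(np.log(1.0 - np.exp(-x**2)))
posterior_mean = (n + a) / (b + T)     # Bayes estimate under squared error loss
print(posterior_mean)
```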

Keywords: Credible Intervals, Loss Functions, Posterior Predictive Distributions, Predictive Intervals.

34 Beta-spline Surface Fitting to Multi-slice Images

Authors: Normi Abdul Hadi, Arsmah Ibrahim, Fatimah Yahya, Jamaludin Md. Ali

Abstract:

The Beta-spline is built on G2 continuity, which guarantees the smoothness of the curves and surfaces generated with it. This curve is normally preferred for object design rather than reconstruction. This study, however, employs the Beta-spline in reconstructing a 3-dimensional G2 model of the Stanford Rabbit. The original data consist of multi-slice binary images of the rabbit. The result is then compared with related works using other techniques.
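
SciPy has no Beta-spline basis, so as a rough substitute the sketch below fits a smooth bicubic B-spline surface to a toy grid of slice samples, only to illustrate surface fitting to stacked-slice data:

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

z_slices = np.linspace(0, 1, 20)           # slice positions (toy)
angles = np.linspace(0, 2 * np.pi, 30)     # contour parameter (toy)
radii = 1.0 + 0.1 * np.sin(3 * angles)[None, :] * np.cos(np.pi * z_slices)[:, None]
surf = RectBivariateSpline(z_slices, angles, radii, kx=3, ky=3, s=0.01)
print(surf(0.5, np.pi))  # evaluate the fitted surface at a point
```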

Keywords: Beta-spline, multi-slice image, rectangular surface, 3D reconstruction

33 Generating Arabic Fonts Using Rational Cubic Ball Functions

Authors: Fakharuddin Ibrahim, Jamaludin Md. Ali, Ahmad Ramli

Abstract:

In this paper, we discuss data interpolation using the rational cubic Ball curve. To generate a curve with satisfactory smoothness, the curve segments must be connected with a certain amount of continuity; the continuity considered here is G1, and the conditions imposed are known as the G1 Hermite conditions. A simple application of the proposed method is to generate an Arabic font satisfying the required continuity.
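
A hedged sketch of evaluating a rational cubic Ball curve, assuming the cubic Ball basis (1-t)², 2t(1-t)², 2t²(1-t), t² (a partition of unity), with weights providing the rational form; the control points and weights below are toy values.

```python
import numpy as np

def rational_ball(t, pts, w):
    """Evaluate sum_i w_i B_i(t) P_i / sum_i w_i B_i(t) on the cubic Ball basis."""
    t = np.asarray(t, dtype=float)
    B = np.column_stack([(1 - t) ** 2, 2 * t * (1 - t) ** 2,
                         2 * t ** 2 * (1 - t), t ** 2])
    wB = B * w                                     # weighted basis
    return (wB @ pts) / wB.sum(axis=1, keepdims=True)

pts = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 2.0], [4.0, 0.0]])  # toy control points
w = np.array([1.0, 2.0, 2.0, 1.0])                                # toy weights
print(rational_ball(np.linspace(0.0, 1.0, 5), pts, w))
```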

Keywords: Continuity, data interpolation, Hermite condition, rational Ball curve.

32 The Effects of RCA Clean Variables on Particle Removal Efficiency

Authors: Siti Kudnie Sahari, Jane Chai Hai Sing, Khairuddin Ab. Hamid

Abstract:

Shrunken patterning for integrated device manufacturing requires surface cleanliness and surface smoothness in wet chemical processing [1]. It is necessary to control all process parameters tightly, especially for the common RCA cleaning technique (SC-1 and SC-2) [2]. In this paper the characteristics and effects of the surface preparation parameters are discussed. The properties of RCA wet chemical processing in silicon technology depend on the processing time, temperature, concentration and megasonic power of SC-1 and the QDR. An improvement of wafer surface preparation through enhanced variables of the wet chemical cleaning process is proposed.

Keywords: RCA, SC-1, SC-2, QDR

31 Desalination of Salt Water by Collision with Surface Coated with Nano Particles

Authors: Hesham Muhammad Ibrahim

Abstract:

This paper introduces and proves a new concept: salt dissolved in water exists as very tiny solid sodium chloride particles of nano-scale volume. From this point of view, salt water can be desalinated by collision with a special surface characterized by nano-level smoothness, high rigidity and high hardness, under appropriate conditions of launching the water as a thin laminar flow at a suitable speed and angle of incidence.

Keywords: Desalination by collision, nano coating, water desalination, water repellent surface.

30 Evaluation of the Effect of Rotor Solidity on the Performance of a H-Darrieus Turbine Adopting a Blade Element-Momentum Algorithm

Authors: G. Bedon, M. Raciti Castelli, E. Benini

Abstract:

The present study aims to evaluate the effect of rotor solidity, expressed in terms of chord length for a given rotor diameter, on the performance of a small vertical-axis Darrieus wind turbine. The proposed work focuses on both power production and rotor power coefficient, considering also the structural constraints deriving from the centrifugal forces due to the rotor angular velocity. The smoothness of the resulting power curves has also been investigated, in order to evaluate the controllability of the corresponding rotor architectures.
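
In miniature, the quantity being swept; one common convention for an H-rotor is σ = Nc/R, though conventions vary, and the numbers below are toy values:

```python
# Rotor solidity over a range of chord lengths (toy values)
N, R = 3, 0.515                      # number of blades, rotor radius [m]
for c in (0.05, 0.10, 0.15):         # chord lengths [m]
    print(f"c = {c:.2f} m -> solidity = {N * c / R:.3f}")
```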

Keywords: Vertical axis wind turbine, Darrieus, solidity, Blade Element-Momentum

29 On Bayesian Analysis of Failure Rate under Topp Leone Distribution using Complete and Censored Samples

Authors: N. Feroze, M. Aslam

Abstract:

The article is concerned with the analysis of the failure rate (shape parameter) of the Topp Leone distribution in a Bayesian framework. Different loss functions and a couple of noninformative priors have been assumed for posterior estimation, and the posterior predictive distributions have also been derived. A simulation study has been carried out to compare the performance of the different estimators, and a real-life example is used to illustrate the applicability of the results. The findings of the study suggest that the precautionary loss function based on the Jeffreys prior and singly Type-II censored samples can effectively be employed to obtain the Bayes estimate of the failure rate under the Topp Leone distribution.
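
A hedged sketch of the complete-sample case (the paper also treats Type-II censoring): with the Jeffreys prior p(θ) ∝ 1/θ, the Topp Leone posterior is Gamma(n, T) with T = -Σ log(x_i(2-x_i)), and the Bayes estimator under the precautionary loss (θ̂-θ)²/θ̂ is the square root of the posterior second moment.

```python
import numpy as np

rng = np.random.default_rng(4)
theta_true, n = 1.5, 80
u = rng.uniform(size=n)
x = 1.0 - np.sqrt(1.0 - u ** (1.0 / theta_true))  # inversion of F(x) = (x(2-x))^theta

T = -np.sum(np.log(x * (2.0 - x)))
bayes_precautionary = np.sqrt(n * (n + 1)) / T    # sqrt of Gamma(n, T) second moment
print(bayes_precautionary)
```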

Keywords: loss functions, type II censoring, posterior distribution, Bayes estimators.

28 Motion Detection Techniques Using Optical Flow

Authors: A. A. Shafie, Fadhlan Hafiz, M. H. Ali

Abstract:

Motion detection is very important in image processing, and one way of detecting motion is using optical flow. Optical flow cannot be computed locally, since only one independent measurement is available from the image sequence at a point, while the flow velocity has two components; a second constraint is needed. The method used for finding the optical flow in this project assumes that the apparent velocity of the brightness pattern varies smoothly almost everywhere in the image. This technique is then used to develop software capable of four types of motion detection. The software can also highlight the motion region, quantify the motion level and count the number of objects. Many objects, such as vehicles and humans, can be recognized in video streams by applying the optical flow technique.
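
A compact sketch of the classic smoothness-constrained (Horn-Schunck-style) optical flow the abstract describes; the kernels and the weight alpha are conventional defaults, not taken from the paper.

```python
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
    im1 = np.asarray(im1, float); im2 = np.asarray(im2, float)
    kx = np.array([[-1.0, 1.0], [-1.0, 1.0]]) * 0.25
    ky = np.array([[-1.0, -1.0], [1.0, 1.0]]) * 0.25
    Ix = convolve(im1, kx) + convolve(im2, kx)     # spatial gradients
    Iy = convolve(im1, ky) + convolve(im2, ky)
    It = convolve(im1, np.full((2, 2), -0.25)) + convolve(im2, np.full((2, 2), 0.25))
    avg = np.array([[1.0, 2.0, 1.0], [2.0, 0.0, 2.0], [1.0, 2.0, 1.0]]) / 12.0
    u = np.zeros_like(im1); v = np.zeros_like(im1)
    for _ in range(n_iter):
        u_bar, v_bar = convolve(u, avg), convolve(v, avg)
        grad = (Ix * u_bar + Iy * v_bar + It) / (alpha**2 + Ix**2 + Iy**2)
        u, v = u_bar - Ix * grad, v_bar - Iy * grad  # smoothness-constrained update
    return u, v

f1 = np.random.default_rng(1).normal(size=(32, 32))
u, v = horn_schunck(f1, np.roll(f1, 1, axis=1))  # frame shifted 1 px horizontally
print(u.mean(), v.mean())                        # dominant horizontal component
```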

Keywords: Background modeling, Motion detection, Optical flow, Velocity smoothness constant, motion trajectories.

27 Color Image Segmentation using Adaptive Spatial Gaussian Mixture Model

Authors: M. Sujaritha, S. Annadurai

Abstract:

An adaptive spatial Gaussian mixture model is proposed for clustering-based color image segmentation. A new clustering objective function which incorporates spatial information is introduced in the Bayesian framework. The weighting parameter controlling the importance of spatial information is made adaptive to the image content to augment smoothness towards piecewise-homogeneous regions and diminish the edge-blurring effect, hence the name adaptive spatial finite mixture model. The proposed approach is compared with the spatially variant finite mixture model for pixel labeling. Experimental results on synthetic images and the Berkeley dataset demonstrate that the proposed method improves segmentation and can be employed in different practical image content understanding applications.
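
As a non-spatial baseline only (the paper's adaptive spatial weighting is omitted), clustering pixel colours with a plain Gaussian mixture:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
img = rng.uniform(0, 1, (32, 32, 3))           # toy RGB image
pixels = img.reshape(-1, 3)
labels = GaussianMixture(n_components=4, random_state=0).fit_predict(pixels)
segmentation = labels.reshape(32, 32)          # per-pixel cluster labels
print(np.bincount(labels))
```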

Keywords: Adaptive, spatial, mixture model, segmentation, color.

26 Inferences on Compound Rayleigh Parameters with Progressively Type-II Censored Samples

Authors: Abdullah Y. Al-Hossain

Abstract:

This paper considers inference under progressive Type-II censoring with a compound Rayleigh failure time distribution. The maximum likelihood (ML) and Bayes methods are used for estimating the unknown parameters as well as some lifetime parameters, namely the reliability and hazard functions. We obtain Bayes estimators using conjugate priors for the shape and scale parameters. When both parameters are unknown, closed-form expressions for the Bayes estimators cannot be obtained, so we use Lindley's approximation to compute the Bayes estimates. Another Bayes estimator is obtained based on a continuous-discrete joint prior for the unknown parameters. An example with real data is discussed to illustrate the proposed method. Finally, we compare these estimators with the maximum likelihood estimators using a Monte Carlo simulation study.
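
A hedged sketch of one building block: simulating a progressively Type-II censored sample from a compound Rayleigh distribution via the standard uniform transformation often credited to Balakrishnan and Sandhu; the censoring scheme and parameters below are toy values.

```python
import numpy as np

def progressive_type2_uniform(scheme, rng):
    """Ordered uniform progressively Type-II censored sample for scheme R_1..R_m."""
    m = len(scheme)
    u = rng.uniform(size=m)
    v = u ** (1.0 / (np.arange(1, m + 1) + np.cumsum(scheme[::-1])))
    return 1.0 - np.cumprod(v[::-1])

def compound_rayleigh_ppf(u, alpha, beta):
    # S(t) = (beta / (beta + t^2))^alpha  =>  invert F = 1 - S
    return np.sqrt(beta * ((1.0 - u) ** (-1.0 / alpha) - 1.0))

rng = np.random.default_rng(6)
scheme = np.array([2, 0, 1, 0, 2])   # R_i: units removed at each observed failure
t = compound_rayleigh_ppf(progressive_type2_uniform(scheme, rng), alpha=2.0, beta=1.5)
print(t)
```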

Keywords: Progressive type II censoring, compound Rayleigh failure time distribution, maximum likelihood estimation, Bayes estimation, Lindley's approximation method, Monte Carlo simulation.

25 Comparison of Bayesian and Regression Schemes to Model Public Health Services

Authors: Sotirios Raptis

Abstract:

Bayesian reasoning (BR) and linear (auto)regression (AR/LR) can predict one data source from priors or other data and can thus link social service demands across cohorts, whereas considering each service in isolation (self-prediction) ignores context and may lead to service misuse. The paper advocates that BR with binomial (BD) or normal (ND) models, or raw data (.D), used as probabilistic updates, can be compared to AR/LR to link services in Scotland and reduce cost by sharing healthcare (HC) resources. Clustering and cross-correlation, along with BR, LR and AR, can better predict demand. Insurance companies and policymakers can link such services; examples include those offered to the elderly and to low-income people, smoking-related services linked to mental health services, and epidemiological weight in children. 22 service packs published by Public Health Services (PHS) Scotland and the Scottish Government (SG) from 1981 to 2019 are used, broken into 110 year-series (factors) and joined using LR, AR and BR. Principal component analysis found 11 significant factors, while C-means (CM) clustering gave five major clusters.
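
An orientation sketch only, on synthetic numbers (the PHS/SG service-pack data are not reproduced), with k-means standing in for C-means: principal components, clustering, and a linked linear-regression prediction.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
series = rng.normal(size=(110, 39))                 # 110 factors x 39 years (toy)
components = PCA(n_components=11).fit_transform(series)
clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(components)
target, predictors = series[0], series[1:11].T      # link one service to ten others
print(LinearRegression().fit(predictors, target).score(predictors, target))
print(np.bincount(clusters))                        # cluster sizes
```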

Keywords: Bayesian probability, cohorts, data frames, regression, services, prediction.

24 Fuzzy Sliding Mode Speed Controller for a Vector Controlled Induction Motor

Authors: S. Massoum, A. Bentaallah, A. Massoum, F. Benaimeche, P. Wira, A. Meroufel

Abstract:

This paper presents a fuzzy sliding mode speed controller for a vector-controlled induction machine (IM) fed by a PWM voltage source inverter. The sliding-mode-based fuzzy control method is developed to achieve fast response, good disturbance rejection and good decoupling. The problem with sliding mode control is the high-frequency switching around the sliding surface. The FSMC combines the robustness of sliding mode control (SMC) with the smoothness of fuzzy logic (FL): to reduce the torque fluctuations (chattering), the sign function used in conventional SMC is substituted with a fuzzy logic algorithm. The proposed algorithm was simulated in Matlab/Simulink, and the simulation results show that the control scheme is robust and that the chattering problem is solved.
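
The chattering fix in miniature: replace sign(s) in the reaching law with a smooth bounded function. A tanh stands in here for the paper's fuzzy inference, and the gains are toy values.

```python
import numpy as np

def smc_control(s, k=5.0, phi=0.1, smooth=True):
    """Reaching-law term: -k*sign(s) chatters; -k*tanh(s/phi) is smooth."""
    return -k * np.tanh(s / phi) if smooth else -k * np.sign(s)

for s in (-0.5, -0.01, 0.0, 0.01, 0.5):     # sliding-variable samples
    print(s, smc_control(s), smc_control(s, smooth=False))
```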

Keywords: IM, FOC, FLC, SMC, and FSMC.

23 Parametric Transition as a Spiral Curve and Its Application in Spur Gear Tooth with FEA

Authors: S. H. Yahaya, J. M. Ali, T. A. Abdullah

Abstract:

This paper focuses on the C-shaped transition curve, designed using the concept of circle to circle, where one circle lies inside the other. The degree of smoothness employed is curvature continuity. The function used in designing the C-curve is a Bézier-like cubic function, which has low degree, is flexible for the interactive design of curves and surfaces, and has a shape parameter that is used to control the C-shape. Once the C-shaped curve design is completed, the curve is applied to the design of a spur gear tooth. The tooth design is then analyzed using finite element analysis (FEA) to establish the applicability of the tooth design and of the chosen gear material. In this research, cast iron with 4.5% carbon, ASTM A-48, is selected as the gear material.

Keywords: Bézier-like cubic function, Curvature continuity, C-shaped transition curve, Spur gear tooth.

22 A Thought on Exotic Statistical Distributions

Authors: R. K. Sinha

Abstract:

Statistical distributions are used to model the nature of various types of data sets. Although these distributions are mostly uni-modal, it is quite common to see multiple modes in the observed distribution of the underlying variables, which makes precise modeling unrealistic. Observed data may lack smoothness not necessarily because of randomness; non-randomness can also produce zigzag curves, oscillations, humps, etc. The present paper argues that trigonometric functions, which have not so far been used in the probability functions of distributions, have the potential to capture such behavior if incorporated in the distribution appropriately. A simple distribution involving trigonometric functions, named the Sinoform distribution, is illustrated in the paper with a data set. The paper demonstrates the capacity of trigonometric functions to make statistical distributions exotic: it is possible to have multiple modes, oscillations and zigzag curves in the density, which may be suitable for explaining the underlying nature of a given data set.
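
Illustrative only, not the paper's Sinoform density (its exact form is not given here): a normal density modulated by (1 + a·cos(kx)) and renormalized numerically shows how a trigonometric factor produces oscillation and multiple modes.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

a, k = 0.8, 4.0  # modulation depth and frequency (hypothetical)
unnorm = lambda x: norm.pdf(x) * (1.0 + a * np.cos(k * x))
Z, _ = quad(unnorm, -10.0, 10.0)       # effectively the whole support
density = lambda x: unnorm(x) / Z
x = np.linspace(-3, 3, 7)
print(np.round([density(t) for t in x], 4))  # oscillating, multi-modal values
```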

Keywords: Exotic Statistical Distributions, Kurtosis, Mixture Distributions, Multi-modal

21 Edge Detection in Digital Images Using Fuzzy Logic Technique

Authors: Abdallah A. Alshennawy, Ayman A. Aly

Abstract:

The fuzzy technique is an operator introduced in order to simulate, at a mathematical level, the compensatory behavior in the process of decision making or subjective evaluation. This paper introduces such operators through a computer vision application: a novel method based on a fuzzy logic reasoning strategy is proposed for edge detection in digital images without determining a threshold value. The proposed approach begins by segmenting the images into regions using a floating 3x3 binary matrix; the edge pixels are then mapped to a range of values distinct from each other. The results of the proposed method for different captured images are compared to those obtained with the linear Sobel operator to assess robustness. The method consistently yields smooth, straight edges for straight lines and good roundness for curved lines; at the same time, corners become sharper and can be defined easily.
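
As a baseline only, the Sobel edge map the fuzzy method is compared against (the fuzzy rule base itself is not reproduced), on a toy image:

```python
import numpy as np
from scipy import ndimage

img = np.zeros((32, 32)); img[8:24, 8:24] = 1.0   # toy square
gx = ndimage.sobel(img, axis=1)
gy = ndimage.sobel(img, axis=0)
edges = np.hypot(gx, gy) > 1.0                    # fixed-threshold edge map
print(edges.sum(), "edge pixels")
```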

Keywords: Fuzzy logic, Edge detection, Image processing, computer vision, Mechanical parts, Measurement.

20 A New Heuristic Approach to Solving U-shape Assembly Line Balancing Problems Type-1

Authors: M. Fathi, M. J. Alvarez, V. Rodríguez

Abstract:

Assembly line balancing is a very important issue in mass production systems due to production cost. Although many studies have been done on this topic, assembly line balancing problems are so complex that they are categorized as NP-hard, and researchers strongly recommend using heuristic methods. This paper presents a new heuristic approach called the critical task method (CTM) for solving U-shape assembly line balancing problems. The performance of the proposed heuristic is tested on a number of test problems and compared with 12 other heuristics available in the literature, confirming its superior performance. Furthermore, to prove the efficiency of the proposed CTM, the objectives are extended to minimizing the number of workstations (or, equivalently, maximizing line efficiency) and minimizing the smoothness index. Finally, it is shown that the proposed heuristic is more efficient than the others for solving the U-shape assembly line balancing problem.
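
The two extended objectives in miniature, using one common definition of the smoothness index, SI = sqrt(Σ(ST_max - ST_i)²) over station times ST_i (toy numbers below):

```python
import numpy as np

station_times = np.array([9.0, 7.5, 8.0, 6.5])   # workload per station (toy)
cycle_time = 10.0
efficiency = station_times.sum() / (len(station_times) * cycle_time)
si = np.sqrt(np.sum((station_times.max() - station_times) ** 2))
print(f"efficiency = {efficiency:.2%}, smoothness index = {si:.3f}")
```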

Keywords: Critical task method, Heuristic, Line balancing problem, U-shape

19 Comparison between Minimum Direct and Indirect Jerks of Linear Dynamic Systems

Authors: Tawiwat Veeraklaew, Nathasit Phathana-im, Songkit Heama

Abstract:

Both minimum energy consumption and smoothness, quantified as a function of jerk, are generally needed in many dynamic systems, such as automobiles and pick-and-place robot manipulators that handle fragile equipment. Nevertheless, most researchers consider either minimum energy consumption or minimum jerk alone. This research paper proposes a simple yet very interesting relationship between the minimum direct and indirect jerk approaches in designing time-dependent systems, yielding an alternative optimal solution. Extremal solutions for the cost functions of direct and indirect jerk are found using dynamic optimization methods together with numerical approximation, which allows us to simulate and compare, visually and statistically, the time histories of the control inputs produced by the minimum direct and indirect jerk designs. By considering the minimum indirect jerk problem, the numerical solution becomes much easier and yields results similar to the minimum direct jerk problem.
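
For reference, the classic minimum-jerk rest-to-rest profile: with τ = t/T, the trajectory x(t) = x0 + (xf - x0)(10τ³ - 15τ⁴ + 6τ⁵) minimizes the integral of squared jerk under rest-to-rest boundary conditions.

```python
import numpy as np

def min_jerk(t, T, x0, xf):
    """Minimum-jerk position profile from x0 at t=0 to xf at t=T."""
    tau = np.clip(t / T, 0.0, 1.0)
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

t = np.linspace(0.0, 2.0, 5)
print(min_jerk(t, T=2.0, x0=0.0, xf=1.0))  # smooth 0 -> 1 motion
```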

Keywords: Optimization, Dynamic, Linear Systems, Jerks.

18 Combining Minimum Energy and Minimum Direct Jerk of Linear Dynamic Systems

Authors: V. Tawiwat, P. Jumnong

Abstract:

Both minimum energy consumption and smoothness, quantified as a function of jerk, are generally needed in many dynamic systems, such as automobiles and pick-and-place robot manipulators that handle fragile equipment. Nevertheless, most researchers consider either minimum energy consumption or minimum jerk alone. This research paper proposes a simple yet very interesting approach that combines the minimum energy and minimum direct jerk criteria in designing time-dependent systems, yielding an alternative optimal solution. Extremal solutions for the cost functions of minimum energy, minimum jerk, and their combination are found using dynamic optimization methods together with numerical approximation, which allows us to simulate and compare, visually and statistically, the time histories of the state inputs produced by the combined minimum energy and jerk designs. The numerical solutions of the minimum direct jerk problem and the combined energy-jerk problem are exactly the same; the solutions of the minimum energy problem alone are similar, especially in terms of tendency.

Keywords: Optimization, Dynamic, Linear Systems, Jerks.

17 Globally Convergent Edge-preserving Reconstruction with Contour-line Smoothing

Authors: Marc C. Robini, Pierre-Jean Viverge, Yuemin Zhu, Jianhua Luo

Abstract:

The standard approach to image reconstruction is to stabilize the problem by including an edge-preserving roughness penalty in addition to faithfulness to the data. However, this methodology produces noisy object boundaries and creates a staircase effect. The existing attempts to favor the formation of smooth contour lines take the edge field explicitly into account; they either are computationally expensive or produce disappointing results. In this paper, we propose to incorporate the smoothness of the edge field in an implicit way by means of an additional penalty term defined in the wavelet domain. We also derive an efficient half-quadratic algorithm to solve the resulting optimization problem, including the case when the data fidelity term is non-quadratic and the cost function is nonconvex. Numerical experiments show that our technique preserves edge sharpness while smoothing contour lines; it produces visually pleasing reconstructions which are quantitatively better than those obtained without wavelet-domain constraints.
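
A small half-quadratic sketch in one dimension, with φ(t) = sqrt(δ² + t²) and without the paper's wavelet-domain term: alternate between closed-form weights and a linear solve.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def half_quadratic_1d(y, lam=2.0, delta=0.05, n_iter=30):
    """Edge-preserving smoothing: min ||x-y||^2/2 + lam*sum sqrt(d^2+Dx^2)."""
    n = len(y)
    D = sparse.diags([-np.ones(n - 1), np.ones(n - 1)], [0, 1],
                     shape=(n - 1, n), format="csc")   # finite differences
    x = y.copy()
    for _ in range(n_iter):
        d = D @ x
        w = 1.0 / np.sqrt(delta**2 + d**2)             # half-quadratic weights
        W = sparse.diags(w)
        x = spsolve(sparse.identity(n, format="csc") + lam * D.T @ W @ D, y)
    return x

y = np.r_[np.zeros(50), np.ones(50)] + np.random.default_rng(8).normal(0, 0.1, 100)
print(np.round(half_quadratic_1d(y)[45:55], 2))        # edge preserved, noise smoothed
```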

16 Risk Factors in a Road Construction Site

Authors: V. R. Gannapathy, S. K. Subramaniam, A. B. Mohamad Diah, M. K. Suaidi, A. H. Hamidon

Abstract:

The picture of a perfect road construction site is one that uses conventional vertical road signs and a flagman to optimize traffic flow with minimum hassle to the public. Previous research has been carried out by the Department of Occupational Safety and Health (DOSH) and the Ministry of Works to further enhance the smoothness of traffic operations and, in particular, safety within work zones. This paper highlights the hazardous zones in road construction or road maintenance sites. Most cases show that the flagman is at high risk of fatal accidents within the work zone. Various measures have been taken by both the authorities and contractors to overcome such tragedies, yet it is impossible to eliminate the flagman entirely, since using one is considered best practice. By implementing new technologies that automate traffic flow at road construction sites, the flagman can be made unnecessary. The intelligent traffic light system is designed to solve the problems that contribute to hazards at road construction sites and to be in line with road safety regulations.

Keywords: Intelligent Traffic Light, Critical Zones, Safety Regulation, Flagman

15 3D Liver Segmentation from CT Images Using a Level Set Method Based on a Shape and Intensity Distribution Prior

Authors: Nuseiba M. Altarawneh, Suhuai Luo, Brian Regan, Guijin Tang

Abstract:

Liver segmentation from medical images poses more challenges than analogous segmentations of other organs. This contribution introduces a liver segmentation method for a series of computed tomography images. Overall, we present a novel method for segmenting the liver by coupling density matching with shape priors. Density matching signifies a tracking method which operates by maximizing the Bhattacharyya similarity measure between the photometric distribution of an estimated image region and a model photometric distribution. Density matching controls the direction of the evolution process and slows down the evolving contour in regions with weak edges. The shape prior improves the robustness of density matching and discourages the evolving contour from exceeding the liver's boundaries in regions with weak edges. The model is implemented using a modified distance regularized level set (DRLS) model. The experimental results show that the method achieves satisfactory results; compared with the original DRLS model, the proposed model is more effective in addressing the over-segmentation problem. Finally, we gauge the performance of our model using metrics comprising accuracy, sensitivity, and specificity.
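
The similarity that drives the density-matching term, in isolation: the Bhattacharyya coefficient BC(p, q) = Σ sqrt(p·q) between normalized histograms (toy histograms below).

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two histograms; 1.0 = identical."""
    p = p / p.sum(); q = q / q.sum()
    return np.sum(np.sqrt(p * q))

region = np.array([5, 20, 40, 25, 10], float)   # histogram inside the contour (toy)
model = np.array([4, 18, 45, 23, 10], float)    # model photometric distribution (toy)
print(bhattacharyya(region, model))
```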

Keywords: Bhattacharyya distance, distance regularized level set (DRLS) model, liver segmentation, level set method.

14 Design of 3-Step Skew BLAC Motor for Better Performance in Electric Power Steering System

Authors:

Abstract:

In electric power steering (EPS), spoke-type brushless AC (BLAC) motors offer distinct advantages over other electric motor types in terms of torque smoothness, reliability and efficiency. This paper deals with the shape optimization of a spoke-type BLAC motor in order to reduce cogging torque, examining 3-step rotor skewing, rotor core edge optimization and rotor overlap length. The methods were applied to existing machine designs and their performance was calculated using finite-element analysis (FEA). Prototypes of the machine designs were constructed and experimental results obtained. It is shown that the FEA predicted the cogging torque to be markedly reduced by these methods.
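
Why a 3-step skew suppresses cogging, in one toy calculation: three rotor segments offset by a third of the cogging period cancel the fundamental cogging harmonic (an idealized sinusoidal cogging waveform is assumed).

```python
import numpy as np

theta = np.linspace(0, 2 * np.pi, 1000)        # one cogging period (electrical)
segment = lambda shift: np.sin(theta + shift)  # idealized per-segment cogging torque
total = sum(segment(k * 2 * np.pi / 3) for k in range(3)) / 3.0
print(np.abs(total).max())   # ~0: fundamental cancelled by the 3-step skew
```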

Keywords: EPS, 3-Step skewing, spoke type BLAC, cogging torque, FEA, optimization.

13 On the Reduction of Side Effects in Tomography

Authors: V. Masilamani, C. Vanniarajan, Kamala Krithivasan

Abstract:

As computed tomography (CT) normally requires hundreds of projections to reconstruct an image, patients are exposed to more X-ray energy, which may cause side effects such as cancer; CT requires many projections for good-quality reconstruction even when the variability of the particles in the object is very low. In this paper, low variability of the particles in an object is exploited to obtain good-quality reconstruction. Though the reconstructed image and the original image have the same projections, in general they need not be identical; if a priori information about the image is known in addition to the projections, a good-quality reconstruction is possible. This paper shows, through experimental results, why conventional algorithms fail to reconstruct from a few projections, and gives an efficient polynomial-time algorithm that reconstructs a bi-level image from its row and column projections, together with a known subimage of the unknown image and smoothness constraints, by reducing the reconstruction problem to an integral max-flow problem. The paper also discusses necessary and sufficient conditions for uniqueness, and the extension of 2D bi-level image reconstruction to 3D.
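
As a classical reference point (the paper's max-flow formulation with smoothness and subimage constraints is more general), a Gale-Ryser-style greedy reconstruction of a binary matrix from its row and column sums:

```python
import numpy as np

def reconstruct_binary(row_sums, col_sums):
    """Fill each row's ones into the columns with largest remaining demand."""
    col = np.array(col_sums, dtype=int)
    A = np.zeros((len(row_sums), len(col)), dtype=int)
    for i, r in enumerate(row_sums):
        cols = np.argsort(-col, kind="stable")[:r]   # fullest remaining columns
        A[i, cols] = 1
        col[cols] -= 1
    if (col != 0).any():
        raise ValueError("projections are inconsistent")
    return A

A = reconstruct_binary([2, 3, 1], [2, 2, 1, 1])
print(A, A.sum(axis=1), A.sum(axis=0))   # matches both projections
```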

Keywords: Discrete Tomography, Image Reconstruction, Projection, Computed Tomography, Integral Max Flow Problem, Smooth Binary Image.
