Search results for: Two step method
Paper Count: 8701

8521 Comparison of the DC/DC-Converters for Fuel Cell Applications

Authors: Oleksandr Krykunov

Abstract:

The source voltage of a high-power fuel cell shows a strong load dependence at comparatively low voltage levels. In order to provide a voltage of 750 V on the DC-link for feeding electrical energy into the mains via a three-phase inverter, a step-up converter with a large step-up ratio is required. The output voltage of this DC/DC-converter must remain stable during variations of the load current and of the fuel cell voltage. This paper presents the methods and results of calculating the efficiency and the realization expense of DC/DC-converter circuits that meet these requirements.

Keywords: DC/DC-converter, calculation, efficiency, fuel cell.

8520 Selecting the Best Sub-Region for Indexing Images in the Case of Weak Segmentation Based on Local Color Histograms

Authors: Mawloud Mosbah, Bachir Boucheham

Abstract:

The color histogram is considered the oldest method used by CBIR systems for indexing images. Global histograms, however, do not include spatial information; this is why later techniques have attempted to overcome this limitation by involving a segmentation task as a preprocessing step. Weak segmentation is employed by local histograms, while other methods, such as the CCV (Color Coherence Vector), are based on strong segmentation. Indexation based on local histograms consists of splitting the image into N overlapping blocks or sub-regions and then computing the histogram of each block. Measuring the dissimilarity between two images is reduced, as a consequence, to computing the distances between the N local histograms of both images, yielding N*N values; generally, the lowest value is used to rank images, which means that the lowest value designates the sub-region used to index the images of the collection being queried. In this paper, we examine the local histogram indexation method in order to compare its results against those given by the global histogram. We also address another noteworthy issue when relying on local histograms, namely which value, among the N*N values, to trust when comparing images; in other words, on which of the sub-regions we should base the image index. Based on the results achieved here, it seems that relying on local histograms, which imposes an extra overhead on the system by involving another preprocessing step, namely segmentation, does not necessarily produce better results. In addition, we propose some ideas for selecting the local histogram used to encode the image, rather than relying on the local histogram having the lowest distance to the query histograms.
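
As a rough illustration of the indexing scheme described above (a sketch with hypothetical helper names, non-overlapping blocks for brevity, and the Euclidean distance named in the keywords), the code below computes one histogram per sub-region and reduces the N*N pairwise distances to their minimum:

```python
import numpy as np

def local_histograms(image, n_blocks=4, bins=8):
    """Split an RGB image into n_blocks x n_blocks sub-regions and
    return one normalized color histogram per sub-region."""
    h, w, _ = image.shape
    hists = []
    for i in range(n_blocks):
        for j in range(n_blocks):
            block = image[i * h // n_blocks:(i + 1) * h // n_blocks,
                          j * w // n_blocks:(j + 1) * w // n_blocks]
            hist, _ = np.histogramdd(block.reshape(-1, 3),
                                     bins=(bins, bins, bins),
                                     range=((0, 256),) * 3)
            hists.append(hist.ravel() / hist.sum())
    return np.array(hists)

def dissimilarity(hists_a, hists_b):
    """Euclidean distances between all pairs of local histograms
    (the N*N values); the lowest value is used to rank images."""
    d = np.linalg.norm(hists_a[:, None, :] - hists_b[None, :, :], axis=2)
    return d.min()
```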

Keywords: CBIR, Color Global Histogram, Color Local Histogram, Weak Segmentation, Euclidean Distance.

8519 Segmentation of Ascending and Descending Aorta in CTA Images

Authors: H. Özkan

Abstract:

In this study, a new and fast algorithm for Ascending Aorta (AscA) and Descending Aorta (DesA) segmentation is presented using Computed Tomography Angiography images. This process is quite important, especially for the detection of aortic plaques, aneurysms, calcification or stenosis. The applied method is carried out in four steps. In the first step, lung segmentation is achieved. In the second, the Mediastinum Region (MR) is detected for use in the segmentation. In the third, an optimal threshold is applied to the images and components outside the MR are removed. Lastly, the identification and segmentation of the AscA and DesA are carried out. The performance of the applied method is found quite satisfactory by radiologists, and it provides medically adequate results for surgery.

Keywords: Ascending aorta (AscA), Descending aorta (DesA), Computed tomography angiography (CTA), Computer aided detection (CAD), Segmentation

8518 Specifying a Timestamp-based Protocol For Multi-step Transactions Using LTL

Authors: Rafat Alshorman, Walter Hussak

Abstract:

Most concurrent transactional protocols consider serializability as the correctness criterion for transaction execution. Usually, the proof of serializability relies on mathematical proofs for a fixed finite number of transactions. In this paper, we introduce a protocol to deal with an infinite number of transactions which are iterated infinitely often. We specify the serializability of the transactions and the protocol using a specification language based on temporal logics. It is worthwhile using temporal logics such as LTL (Linear-time Temporal Logic) to specify transactions, in order to gain fully automatic verification by using model checkers.
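
To give a flavour of such specifications (illustrative LTL patterns only, not the authors' actual formulas), a two-step transaction T_i with steps s_i^1 and s_i^2 might be constrained as follows:

```latex
% Every started transaction eventually completes its second step:
\[ \Box\,\bigl( s_i^1 \rightarrow \Diamond\, s_i^2 \bigr) \]
% No conflicting step of T_j occurs between the two steps of T_i:
\[ \Box\,\bigl( s_i^1 \rightarrow ( \neg s_j^1 \;\mathcal{U}\; s_i^2 ) \bigr) \]
```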

Keywords: Multi-step transactions, LTL specifications, Model Checking.

8517 Multiresolution Approach to Subpixel Registration by Linear Approximation of PSF

Authors: Erol Seke, Kemal Özkan

Abstract:

Linear approximation of the point spread function (PSF) is a new method for determining subpixel translations between images. The problem with the original algorithm is its inability to determine translations larger than 1 pixel. In this paper, a multiresolution technique is proposed to deal with this problem. Its performance is evaluated by comparison with two other well-known registration methods. In the proposed technique, the images are downsampled in order to have a wider view. By progressively decreasing the downsampling rate up to the initial resolution and using the linear approximation technique at each step, the algorithm is able to determine translations of several pixels at subpixel accuracy.
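
The pyramid logic might look as follows (a minimal sketch: the FFT cross-correlation peak stands in for the PSF-based subpixel estimator, so only the coarse-to-fine mechanism is shown):

```python
import numpy as np

def estimate_shift(a, b):
    """Placeholder for the PSF-based subpixel estimator: returns the
    integer shift s such that np.roll(b, s, axis=(0, 1)) best aligns b
    with a (FFT cross-correlation peak, for illustration only)."""
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b)))
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

def multiresolution_shift(a, b, levels=3):
    """Coarse-to-fine loop: a low-resolution pass captures translations
    of several pixels, then each finer level refines the residual."""
    total = np.zeros(2, dtype=int)
    for lvl in range(levels, -1, -1):
        f = 2 ** lvl
        dy, dx = estimate_shift(a[::f, ::f], b[::f, ::f])  # coarse estimate
        shift = np.array([dy * f, dx * f])
        b = np.roll(b, tuple(shift), axis=(0, 1))          # compensate, refine
        total += shift
    return total  # subpixel refinement would replace the integer estimator
```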

Keywords: Point Spread Function, Subpixel translation, Superresolution, Multiresolution approach.

8516 A Monte Carlo Method to Data Stream Analysis

Authors: Kittisak Kerdprasop, Nittaya Kerdprasop, Pairote Sattayatham

Abstract:

Data stream analysis is the process of computing various summaries and derived values from large amounts of data which are continuously generated at a rapid rate. The nature of a stream does not allow revisiting each data element. Furthermore, data processing must be fast enough to produce timely analysis results. These requirements impose constraints on the design of the algorithms, which must balance correctness against timely responses. Several techniques have been proposed over the past few years to address these challenges. They can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of the data or a smaller transformed representation, whereas the task-oriented scheme solves the problem directly via approximation techniques. We propose a hybrid approach to tackle the data stream analysis problem: the data stream is statistically transformed to a smaller size, and its characteristics are computationally approximated. We adopt a Monte Carlo method in the approximation step. The data reduction is performed horizontally and vertically through our EMR sampling method. The proposed method is analyzed by a series of experiments. We apply our algorithm to clustering and classification tasks to evaluate the utility of our approach.
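
The EMR sampling method itself is not detailed in the abstract; as a generic stand-in, the sketch below uses standard reservoir sampling to draw a fixed-size uniform sample from a one-pass stream, over which a Monte Carlo estimate of a stream statistic can then be computed:

```python
import random

def reservoir_sample(stream, k, seed=0):
    """One-pass uniform sample of k elements from a stream of unknown
    length (standard reservoir sampling; a stand-in for the paper's
    EMR sampling, which the abstract does not specify)."""
    rng = random.Random(seed)
    reservoir = []
    for i, x in enumerate(stream):
        if i < k:
            reservoir.append(x)
        else:
            j = rng.randint(0, i)   # keep x with probability k / (i + 1)
            if j < k:
                reservoir[j] = x
    return reservoir

# Monte Carlo estimate of a stream statistic from the sample:
sample = reservoir_sample((x * x for x in range(1_000_000)), k=1_000)
print(sum(sample) / len(sample))   # approximates the stream's mean
```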

Keywords: Data Stream, Monte Carlo, Sampling, Density Estimation.

8515 Parallelization of Protein Sequence Similarity Algorithms Using Remote Method Invocation

Authors: Mubarak Saif Mohsen, Zurinahni Zainol, Rosalina Abdul Salam, Wahidah Husain

Abstract:

One of the major problems in the genomics field is performing sequence comparison on DNA and protein sequences. Executing sequence comparison on DNA and protein data is a computationally intensive task. Sequence comparison is the basic step for all protein sequence similarity algorithms. Parallel computing is an attractive solution to provide the computational power needed to speed up the lengthy process of sequence comparison. Our main research goal is to enhance the protein sequence algorithm using the dynamic programming method. In our approach, we parallelize the dynamic programming algorithm using a multithreaded program to perform the sequence comparison, and we have also developed a protein database distributed among many PCs using Remote Method Invocation (RMI). As a result, we show how different sizes of protein sequence data and the computation of the scoring matrix of these protein sequences on different numbers of processors affect the processing time and speed, as opposed to sequential processing.
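
A minimal sketch of the parallelization idea (process-level parallelism over a local sequence list stands in for the paper's multithreading and RMI-distributed database; the scoring function is a plain Needleman-Wunsch dynamic program):

```python
from multiprocessing import Pool

def nw_score(a, b, match=1, mismatch=-1, gap=-2):
    """Needleman-Wunsch global alignment score via dynamic programming
    (two-row variant to keep memory linear in the sequence length)."""
    prev = [j * gap for j in range(len(b) + 1)]
    for i, ca in enumerate(a, 1):
        cur = [i * gap]
        for j, cb in enumerate(b, 1):
            s = match if ca == cb else mismatch
            cur.append(max(prev[j - 1] + s, prev[j] + gap, cur[-1] + gap))
        prev = cur
    return prev[-1]

def rank_database(query, database, workers=4):
    """Score the query against every database sequence in parallel."""
    with Pool(workers) as pool:
        scores = pool.starmap(nw_score, [(query, s) for s in database])
    return sorted(zip(scores, database), reverse=True)

if __name__ == "__main__":
    db = ["MKTAYIAKQR", "MKTAYIAKQA", "GGGGGGGG"]   # toy protein strings
    print(rank_database("MKTAYIAKQR", db))
```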

Keywords: Protein sequence algorithm, dynamic programming algorithm, multithread

8514 Multidimensional Compromise Programming Evaluation of Digital Commerce Websites

Authors: C. Ardil

Abstract:

Multidimensional compromise programming evaluation of digital commerce websites is essential not only to obtain recommendations for improvement, but also to make comparisons with global business competitors. This research provides a multidimensional decision making model that prioritizes the objective criteria weights of various commerce websites using a multidimensional compromise solution. Evaluation of digital commerce website quality can be considered a complex information system structure including qualitative and quantitative factors in a multicriteria decision making problem. The proposed multicriteria decision making approach mainly consists of three sequential steps. In the first step, three major evaluation criteria are characterized for the website ranking problem. In the second step, the identified critical criteria are weighted using the standard deviation procedure. In the third step, multidimensional compromise programming is applied to rank the digital commerce websites.
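
A minimal numeric sketch of steps two and three (hypothetical scores; the exact normalization and distance metric are assumptions):

```python
import numpy as np

# Hypothetical decision matrix: rows = websites, columns = criteria
# (all treated as benefit-type criteria for simplicity).
X = np.array([[7.0, 8.5, 6.0],
              [6.5, 7.0, 9.0],
              [8.0, 6.0, 7.5]])

# Step 2: objective weights via the standard deviation procedure:
# normalize each criterion, then weight by its relative dispersion.
N = X / X.max(axis=0)
w = N.std(axis=0) / N.std(axis=0).sum()

# Step 3: a compromise-programming ranking: weighted distance of each
# alternative to the ideal point (p = 2 metric assumed here).
ideal = N.max(axis=0)
dist = np.sqrt(((w * (ideal - N)) ** 2).sum(axis=1))
print(np.argsort(dist))   # best website first (smallest distance)
```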

Keywords: Standard deviation, commerce website, website evaluation, multicriteria decision making, multicriteria compromise programming, website quality, multidimensional decision analysis.

8513 A Partially Accelerated Life Test Planning with Competing Risks and Linear Degradation Path under Tampered Failure Rate Model

Authors: Fariba Azizi, Firoozeh Haghighi, Viliam Makis

Abstract:

In this paper, we propose a method to model the relationship between failure time and degradation for a simple step-stress test where the underlying degradation path is linear and different causes of failure are possible. It is assumed that the intensity function depends only on the degradation value. No assumptions are made about the distribution of the failure times. A simple step-stress test is used to shorten the failure time of products, and a tampered failure rate (TFR) model is proposed to describe the effect of the changing stress on the intensities. We assume that some of the products that fail during the test have a cause of failure that is only known to belong to a certain subset of all possible failures. This case is known as masking. In the presence of masking, the maximum likelihood estimates (MLEs) of the model parameters are obtained through an expectation-maximization (EM) algorithm by treating the causes of failure as missing values. The effect of incomplete information on the estimation of the parameters is studied through a Monte Carlo simulation. Finally, a real example is analyzed to illustrate the application of the proposed methods.

Keywords: Expectation-maximization (EM) algorithm, cause of failure, intensity, linear degradation path, masked data, reliability function.

8512 An Eighth Order Backward Differentiation Formula with Continuous Coefficients for Stiff Ordinary Differential Equations

Authors: Olusheye Akinfenwa, Samuel Jator, Nianmin Yoa

Abstract:

A block backward differentiation formula of uniform order eight is proposed for solving first-order stiff initial value problems (IVPs). The conventional 8-step Backward Differentiation Formula (BDF) and additional methods are obtained from the same continuous scheme and assembled into a block matrix equation, which is applied to provide the solutions of IVPs on non-overlapping intervals. The stability analysis indicates that the method is L0-stable. Numerical results obtained using the proposed new block form show that it is attractive for the solution of stiff problems and compares favourably with existing methods.
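
As a much smaller illustration of the backward-differentiation idea (a sketch, not the paper's order-eight block scheme), the two-step BDF2 formula y_{n+2} - (4/3)y_{n+1} + (1/3)y_n = (2/3)h f_{n+2} applied to the stiff test problem y' = lam*y can be solved for the new value directly:

```python
# BDF2 on the stiff test problem y' = lam * y, y(0) = 1 (illustration
# of the backward-differentiation idea, not the paper's 8-step block form).
lam, h, n_steps = -1000.0, 0.01, 100
y = [1.0, 1.0]          # y0 and a crude startup value y1 = y0
for _ in range(n_steps):
    # implicit step: y2 - (4/3) y1 + (1/3) y0 = (2/3) h lam y2
    y2 = ((4 / 3) * y[-1] - (1 / 3) * y[-2]) / (1 - (2 / 3) * h * lam)
    y.append(y2)
print(y[-1])   # decays toward 0 without the blow-up explicit methods show
```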

Keywords: Stiff IVPs, System of ODEs, Backward differentiation formulas, Block methods, Stability.

8511 Adaptive Score Normalization: A Novel Approach for Multimodal Biometric Systems

Authors: Anouar Ben Khalifa, Sami Gazzah, Najoua Essoukri BenAmara

Abstract:

Multimodal biometric systems integrate the data presented by multiple biometric sources, hence offering better performance than systems based on a single biometric modality. Although the coupling of biometric systems can be done at different levels, fusion at the score level is the most common, since it has been proven more effective than the other fusion levels. However, the scores from different modalities are generally heterogeneous, so a normalization step is needed to transform these scores into a common domain before combining them. In this paper, we study the performance of several normalization techniques with various fusion methods in the context of merging three unimodal systems based on the face, the palmprint and the fingerprint. We also propose a new adaptive normalization method that takes into account the distributions of client scores and impostor scores. Experiments conducted on a database of 100 people show that the performance of a multimodal system depends on the choice of the normalization method and the fusion technique. The proposed normalization method gave the best results.
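
The sketch below shows three classical score normalizations with simple sum-rule fusion (the paper's adaptive method is not reproduced; the tanh form here is a simplified variant using the sample mean and standard deviation):

```python
import numpy as np

def min_max(s):
    """Map scores to [0, 1]."""
    return (s - s.min()) / (s.max() - s.min())

def z_score(s):
    """Center and scale by the standard deviation."""
    return (s - s.mean()) / s.std()

def tanh_norm(s):
    """Simplified tanh estimator: robust mapping into (0, 1)."""
    return 0.5 * (np.tanh(0.01 * (s - s.mean()) / s.std()) + 1)

# Sum-rule fusion of hypothetical face / palmprint / fingerprint scores
face, palm, finger = (np.random.rand(5) * c for c in (100, 1, 10))
fused = min_max(face) + min_max(palm) + min_max(finger)
print(fused.argmax())   # best-matching identity under the sum rule
```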

Keywords: Multibiometrics, Fusion, Score level, Score normalization, Adaptive normalization.

8510 New Features for Specific JPEG Steganalysis

Authors: Johann Barbier, Eric Filiol, Kichenakoumar Mayoura

Abstract:

We present in this paper a new approach to specific JPEG steganalysis and propose studying the statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both and also control the alteration of the compressed data. We have noticed a deviation of the entropy of the compressed data after a first embedding. This deviation is greater when the image is a cover medium than when the image is a stego image. To observe this deviation, we defined new statistical features and combined them with the Multiple Embedding Method. This approach is motivated by the avalanche criterion of the JPEG lossless compression step. This criterion makes possible the design of detectors whose detection rates are independent of the payload. Finally, we designed a Fisher discriminant based classifier for the well-known steganographic algorithms Outguess, F5 and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work at low embedding rates (< 10^-5) and, according to the avalanche criterion of the RLE and Huffman compression step, its efficiency is independent of the quantity of hidden information.
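
The entropy feature at the heart of this approach can be illustrated as follows (a toy sketch: zlib stands in for JPEG's RLE/Huffman stage, and the "embedding" is an arbitrary bit-flip pattern):

```python
import zlib
from collections import Counter
from math import log2

def byte_entropy(data: bytes) -> float:
    """Shannon entropy (bits per byte) of a byte stream."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * log2(c / n) for c in counts.values())

# Compare entropy of compressed data before and after a (re-)embedding;
# zlib is only a stand-in for JPEG's lossless compression step.
cover = bytes(range(256)) * 64
stego = bytes((b ^ 1) if i % 97 == 0 else b for i, b in enumerate(cover))
print(byte_entropy(zlib.compress(cover)), byte_entropy(zlib.compress(stego)))
```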

Keywords: Compressed frequency domain, Fisher discriminant, specific JPEG steganalysis.

8509 Vapor Bubble Dynamics in Upward Subcooled Flow Boiling During Void Evolution

Authors: Rouhollah Ahmadi, Tatsuya Ueno, Tomio Okawa

Abstract:

Bubble generation was observed using a high-speed camera in subcooled flow boiling at low void fraction. A constant heat flux was applied on one side of an upward rectangular channel to form the heated test channel. Water as the working fluid was injected step by step, from high subcooling to near saturation temperature, to investigate bubble behavior during void development. Experiments were performed at two different pressure conditions, close to 2 bar and 4 bar. It was observed that at high subcooling, when boiling commenced, bubbles departed from their nucleation origins and slid along the heated surface. In an observation window, the mean bubble release frequency f_b,mean, the number of nucleation sites N_s and the mean bubble volume V_b,mean were measured in each step of the experiments to investigate the wall vaporization rate. It was found that in the proximity of the point of net vapor generation (PNVG) the vaporization rate increased significantly compared with the condensation rate, which remained low.

Keywords: Subcooled flow boiling, Bubble dynamics, Void fraction, Sliding bubble.

8508 Using Simulation Modeling Approach to Predict USMLE Steps 1 and 2 Performances

Authors: Chau-Kuang Chen, John Hughes, Jr., A. Dexter Samuels

Abstract:

The prediction models for the United States Medical Licensing Examination (USMLE) Steps 1 and 2 performances were constructed by the Monte Carlo simulation modeling approach via linear regression. The purpose of this study was to build robust simulation models to accurately identify the most important predictors and yield valid range estimations of the Steps 1 and 2 scores. The application of the simulation modeling approach was deemed an effective way of predicting student performances on licensure examinations. Also, sensitivity analysis (a/k/a what-if analysis) in the simulation models was used to predict the magnitudes of the Steps 1 and 2 scores affected by changes in the National Board of Medical Examiners (NBME) Basic Science Subject Board scores. In addition, the study results indicated that the Medical College Admission Test (MCAT) Verbal Reasoning score and the Step 1 score were significant predictors of the Step 2 performance. Hence, institutions could screen qualified student applicants for interviews and document the effectiveness of their basic science education programs based on the simulation results.
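
A minimal sketch of the simulation idea (hypothetical data; the real models use institutional predictors): fit a linear regression, then propagate the residual uncertainty by Monte Carlo to obtain a range estimate rather than a point prediction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predictor (e.g., an admission-test score) and outcome
x = rng.normal(30, 3, 200)
y = 150 + 2.5 * x + rng.normal(0, 8, 200)

# Fit the linear regression and estimate the residual spread
b1, b0 = np.polyfit(x, y, 1)          # slope, intercept
resid_sd = np.std(y - (b0 + b1 * x), ddof=2)

# Monte Carlo simulation: sample the residual noise to get a range
# estimate of the predicted score for a new applicant (x = 32)
sims = b0 + b1 * 32 + rng.normal(0, resid_sd, 10_000)
print(np.percentile(sims, [2.5, 97.5]))   # simulated 95% range
```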

Keywords: Prediction Model, Sensitivity Analysis, Simulation Method, USMLE.

8507 Feature Selection Methods for an Improved SVM Classifier

Authors: Daniel Morariu, Lucian N. Vintan, Volker Tresp

Abstract:

Text categorization is the problem of classifying text documents into a set of predefined classes. After a preprocessing step, the documents are typically represented as large sparse vectors. When training classifiers on large collections of documents, both the time and memory restrictions can be quite prohibitive. This justifies the application of feature selection methods to reduce the dimensionality of the document-representation vector. In this paper, three feature selection methods are evaluated: Random Selection, Information Gain (IG) and Support Vector Machine feature selection (called SVM_FS). We show that the best results were obtained with the SVM_FS method for a relatively small dimension of the feature vector. We also present a novel method to better correlate the SVM kernel's parameters (polynomial or Gaussian kernel).
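
One common reading of SVM-based feature selection, sketched below on hypothetical data (the authors' exact SVM_FS procedure may differ): train a linear SVM and keep the features with the largest absolute weights:

```python
import numpy as np
from sklearn.svm import LinearSVC

# Hypothetical sparse document-term data: 200 docs, 1000 features
rng = np.random.default_rng(0)
X = (rng.random((200, 1000)) < 0.05).astype(float)
y = rng.integers(0, 2, 200)

# SVM-based feature selection: rank features by |weight| in a linear SVM
svm = LinearSVC(C=1.0, dual=False, max_iter=5000).fit(X, y)
top = np.argsort(np.abs(svm.coef_[0]))[::-1][:100]   # 100 best features
X_reduced = X[:, top]                                # reduced representation
print(X_reduced.shape)
```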

Keywords: Feature Selection, Learning with Kernels, Support Vector Machine, Classification.

8506 An Improved Learning Algorithm based on the Conjugate Gradient Method for Back Propagation Neural Networks

Authors: N. M. Nawi, M. R. Ransing, R. S. Ransing

Abstract:

The conjugate gradient optimization algorithm usually used for nonlinear least squares is presented and is combined with the modified back propagation algorithm, yielding a new fast-training multilayer perceptron (MLP) algorithm (CGFR/AG). The approach presented in the paper consists of three steps: (1) modification of the standard back propagation algorithm by introducing a gain variation term in the activation function, (2) calculation of the gradient descent on the error with respect to the weights and gain values, and (3) determination of the new search direction by exploiting the information calculated by gradient descent in step (2) as well as the previous search direction. The proposed method improves the training efficiency of the back propagation algorithm by adaptively modifying the initial search direction. The performance of the proposed method is demonstrated by comparison with the conjugate gradient algorithm from the neural network toolbox on the chosen benchmark. The results show that the number of iterations required by the proposed method to converge is less than 20% of what is required by the standard conjugate gradient and neural network toolbox algorithms.
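
The search-direction update in step (3) follows the classical nonlinear conjugate gradient pattern; the sketch below shows it with the Fletcher-Reeves coefficient on a simple quadratic (the paper's gain-variation term is not included):

```python
import numpy as np

def fletcher_reeves_quadratic(A, b, w, iters=20):
    """Conjugate gradient with the Fletcher-Reeves beta on the quadratic
    1/2 w^T A w - b^T w (illustrative of the search-direction update;
    the paper's version adds an adaptive gain term for MLP training)."""
    g = A @ w - b
    d = -g
    for _ in range(iters):
        alpha = -(g @ d) / (d @ A @ d)    # exact line search on a quadratic
        w = w + alpha * d
        g_new = A @ w - b
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d             # combine with previous direction
        g = g_new
    return w

A = np.diag([1.0, 10.0])                  # toy stand-in for an error surface
print(fletcher_reeves_quadratic(A, np.zeros(2), np.array([5.0, 5.0])))
```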

Keywords: Back-propagation, activation function, conjugate gradient, search direction, gain variation.

8505 Semi-Automatic Method to Assist Expert for Association Rules Validation

Authors: Amdouni Hamida, Gammoudi Mohamed Mohsen

Abstract:

In order to help the expert validate association rules extracted from data, several quality measures have been proposed in the literature. We distinguish two categories: objective and subjective measures. The first depends on a fixed threshold and on the quality of the data from which the rules are extracted. The second consists of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high, so manually reviewing the rules is a very hard task. To solve this problem, we propose, in this paper, a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers), (ii) we use the generated classifiers for data classification, and (iii) we visualize association rules with their classification quality to give the expert an overview and to assist him during the validation process.

Keywords: Association rules, Rule-based classification, Classification quality, Validation.

8504 Design of 3-Step Skew BLAC Motor for Better Performance in Electric Power Steering System

Abstract:

In Electric Power Steering (EPS), spoke type Brushless AC (BLAC) motors offer distinct advantages over other electric motor types in terms of torque smoothness, reliability and efficiency. This paper deals with the shape optimization of a spoke type BLAC motor in order to reduce cogging torque. It examines 3-step skewing of the rotor angle, optimization of the rotor core edge and of the rotor overlap length for reducing cogging torque in the spoke type BLAC motor. The methods were applied to existing machine designs and their performance was calculated using finite-element analysis (FEA). Prototypes of the machine designs were constructed and experimental results obtained. It is shown that the FEA predicted the cogging torque to be considerably reduced by these methods.

Keywords: EPS, 3-Step skewing, spoke type BLAC, cogging torque, FEA, optimization.

8503 Despeckling of Synthetic Aperture Radar Images Using Inner Product Spaces in Undecimated Wavelet Domain

Authors: Syed Musharaf Ali, Muhammad Younus Javed, Naveed Sarfraz Khattak, Athar Mohsin, UmarFarooq

Abstract:

This paper introduces effective speckle reduction for synthetic aperture radar (SAR) images using inner product spaces in the undecimated wavelet domain. There are two major areas of the projection-onto-span algorithm where improvement can be made. The first is the use of the undecimated wavelet transform instead of the discrete wavelet transform. The second is the use of a smoothing filter, namely a directional smoothing filter, as an additional step. The proposed method does not need any noise estimation or thresholding technique. Moreover, it gives good results on both single-polarimetric and fully polarimetric SAR images.

Keywords: Directional Smoothing, Inner product, Length of vector, Undecimated wavelet transformation.

8502 Robust Detection of R-Wave Using Wavelet Technique

Authors: Awadhesh Pachauri, Manabendra Bhuyan

Abstract:

The electrocardiogram (ECG) is considered to be the backbone of cardiology. An ECG is composed of P, QRS and T waves, and information related to cardiac diseases can be extracted from the intervals and amplitudes of these waves. The first step in extracting ECG features is the accurate detection of the R peaks in the QRS complex. We have developed a robust R-wave detector using wavelets. The wavelets used for detection are the Daubechies and symmetric wavelets. The method does not require any preprocessing; it only needs correct ECG recordings for the detection. The database has been collected from the MIT-BIH arrhythmia database and the signals from Lead II have been analyzed. MATLAB 7.0 has been used to develop the algorithm. The ECG signal under test is decomposed to the required level using the selected wavelet, and the detail coefficient d4 is selected based on energy, frequency and cross-correlation analysis of the decomposition structure of the ECG signal. The robustness of the method is apparent from the obtained results.
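
A sketch of this pipeline using common Python signal libraries (parameter choices are illustrative, not the paper's exact ones):

```python
import numpy as np
import pywt
from scipy.signal import find_peaks

def detect_r_peaks(ecg, fs=360, wavelet="db4", level=4):
    """Wavelet-based R-peak detection sketch: reconstruct the signal
    from the d4 detail band (as in the abstract) and locate its peaks."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    # zero all bands except the level-4 detail coefficients (d4)
    kept = [np.zeros_like(c) for c in coeffs]
    kept[1] = coeffs[1]                    # coeffs = [a4, d4, d3, d2, d1]
    d4 = pywt.waverec(kept, wavelet)[: len(ecg)]
    # peaks must exceed a fraction of the maximum and be > 0.25 s apart
    peaks, _ = find_peaks(np.abs(d4), height=0.4 * np.abs(d4).max(),
                          distance=int(0.25 * fs))
    return peaks
```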

Keywords: ECG, P-QRS-T waves, Wavelet Transform, Hard Thresholding, R-wave Detection.

8501 New Adaptive Linear Discriminant Analysis for Face Recognition with SVM

Authors: Mehdi Ghayoumi

Abstract:

We have applied a new accelerated algorithm for linear discriminant analysis (LDA) to face recognition with a support vector machine. The new algorithm has the advantage of an optimal selection of the step size. The gradient descent method and the new algorithm have been implemented in software and evaluated on the Yale face database B. The eigenfaces of these approaches have been used to train a KNN classifier. The recognition rate with the new algorithm is compared with that of gradient descent.

Keywords: LDA, adaptive, SVM, face recognition.

8500 Development of Fuzzy Logic Control Ontology for E-Learning

Authors: Muhammad Sollehhuddin A. Jalil, Mohd Ibrahim Shapiai, Rubiyah Yusof

Abstract:

Nowadays, ontology is common in many areas like artificial intelligence, bioinformatics, e-commerce, education and many more, and it is one of the focus areas in the field of Information Retrieval. The purpose of an ontology is to describe a conceptual representation of concepts and their relationships within a particular domain. In other words, an ontology provides a common vocabulary for anyone who needs to share information in the domain. There are ontology domains in various fields, covering engineering and non-engineering knowledge; however, only a few ontologies are available for engineering knowledge, and fuzzy logic as engineering knowledge is still not available as an ontology domain. In general, fuzzy logic requires step-by-step guidelines and instructions for lab experiments. In this study, we present a domain ontology for Fuzzy Logic Control (FLC) knowledge. We build a Table of Contents (ToC) with the middle-out strategy based on the Uschold and King method to develop the FLC ontology. The proposed framework is developed using Protégé as the ontology tool. Protégé's ontology reasoner, known as the Pellet reasoner, is then used to validate the presented framework. The presented framework offers better performance based on consistency and classification parameter indices. In general, this ontology can provide a platform to anyone who needs to understand FLC knowledge.

Keywords: Engineering knowledge, fuzzy logic control ontology, ontology development, table of contents.

8499 Making Data Structures and Algorithms more Understandable by Programming Sudoku the Human Way

Authors: Roelien Goede

Abstract:

Data Structures and Algorithms is a module in most Computer Science or Information Technology curricula, and it is one of the modules most students identify as being difficult. This paper demonstrates how programming a solution for Sudoku can make abstract concepts more concrete. The paper relates concepts of a typical Data Structures and Algorithms module to a step-by-step solution for Sudoku, programmed in a human-oriented as opposed to a computer-oriented way.
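
As an example of the human-style techniques such a solution builds on, the sketch below implements the classic "naked single" rule, exercising the sets and nested lists a Data Structures module covers:

```python
def naked_singles(grid):
    """One human-style Sudoku step: fill every cell whose row, column
    and 3x3 box together leave exactly one candidate digit, repeating
    until no further cell can be filled this way. grid is a 9x9 list
    of lists with 0 for empty cells."""
    progress = True
    while progress:
        progress = False
        for r in range(9):
            for c in range(9):
                if grid[r][c]:
                    continue
                seen = set(grid[r]) | {grid[i][c] for i in range(9)}
                br, bc = 3 * (r // 3), 3 * (c // 3)
                seen |= {grid[i][j] for i in range(br, br + 3)
                                    for j in range(bc, bc + 3)}
                candidates = set(range(1, 10)) - seen
                if len(candidates) == 1:       # a "naked single"
                    grid[r][c] = candidates.pop()
                    progress = True
    return grid
```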

Keywords: Data Structures, Algorithms, Sudoku, Object-Oriented Programming, Programming Teaching, Education.

8498 How to Integrate Sustainability in Technological Degrees: Robotics at UPC

Authors: Antoni Grau, Yolanda Bolea, Alberto Sanfeliu

Abstract:

Embedding sustainability in technological curricula has become a crucial factor for educating engineers with competences in sustainability. The Technical University of Catalonia (UPC) designed, in 2008, the Sustainable Technology Excellence Program STEP 2015 in order to ensure successful sustainability embedding. This program takes advantage of the opportunity offered by the redesign of all Bachelor and Master Degrees in Spain by 2010 under the European Higher Education Area framework. The STEP program goals are: to design compulsory courses in each degree; to develop the conceptual base and identify reference models in sustainability for all specialties at UPC; to create an internal interdisciplinary network of faculty from all the schools; to initiate new transdisciplinary research activities in technology-sustainability-education; to spread the know-how attained; to achieve international scientific excellence in technology-sustainability-education; and to graduate the first engineers/architects of the new EHEA bachelors with sustainability as a generic competence. Specifically, in this paper the authors explain their experience in leading the STEP program, and two examples are presented: the Industrial Robotics subject and the curriculum for the School of Architecture.

Keywords: Sustainability, curricula improvement, robotics.

8497 User-Friendly Task Creation Using a CAD Integrated Robotic System on a Real Workcell

Authors: Alireza Changizi, Arash Rezaei, Jamal Muhammad, Jyrki Latokartano, Minna Lanz

Abstract:

Offline programming (OLP) is a new method in robot programming, widely used in industry nowadays, which is a simulation-based method that can produce robot motion code according to a virtual world built in simulation software. In this project, Delmia V5 is used as the simulation software. First, the work cell components were modelled in CATIA V5, imported into a process file in Delmia and placed roughly to form the virtual work cell. Then a robot was added to the work cell from the Delmia library. The work cell was calibrated against the real-world work cell to produce accurate code. Tool calibration is the first step of the calibration scheme, after which the work cell equipment can be calibrated using the six-point calibration method. The generated code then needs to be reformed to match the related controller's code instructions. At the last stage, I/Os were set to accomplish cooperation between the robots and synchronize their motion. The presented results show the feasibility of the method and its effect on production line efficiency. Finally, the positive and negative points of the implementation are discussed.

Keywords: Component, robotic, automated, production, offline programming, CAD.

8496 A Computer Aided Detection (CAD) System for Microcalcifications in Mammograms - MammoScan μCaD

Authors: Kjersti Engan, Thor Ole Gulsrud, Karl Fredrik Fretheim, Barbro Furebotten Iversen, Liv Eriksen

Abstract:

Clusters of microcalcifications in mammograms are an important sign of breast cancer. This paper presents a complete Computer Aided Detection (CAD) scheme for automatic detection of clustered microcalcifications in digital mammograms. The proposed system, MammoScan μCaD, consists of three main steps. Firstly, all potential microcalcifications are detected using a feature extraction method, VarMet, and adaptive thresholding; this will also give a number of false detections. The goal of the second step, Classifier level 1, is to remove everything but microcalcifications. The last step, Classifier level 2, uses learned dictionaries and sparse representations as a texture classification technique to distinguish single, benign microcalcifications from clustered microcalcifications, and in addition removes some remaining false detections. The system is trained and tested on true digital data from Stavanger University Hospital, and the results are evaluated by radiologists. The overall results are promising, with a sensitivity > 90% and a low false detection rate (approx. 1 unwanted detection per image, or 0.3 false detections per image).

Keywords: mammogram, microcalcifications, detection, CAD, MammoScan μCaD, VarMet, dictionary learning, texture, FTCM, classification, adaptive thresholding

8495 Research of a Multistep Method Applied to Numerical Solution of Volterra Integro-Differential Equation

Authors: M.Imanova, G.Mehdiyeva, V.Ibrahimov

Abstract:

The solution of some practical problems is reduced to the solution of integro-differential equations. For the numerical solution of such equations, quadrature methods, or their combination with multistep or one-step methods, are basically used. The quadrature methods are applied mainly to the calculation of the integral participating in the right-hand side of the integro-differential equation. As this integral is of Volterra type, it is obvious that when it is replaced by an integral sum, the upper limit of the sum depends on the current point at which the values of the integral are defined. Thus we obtain an integral sum with a variable boundary, which is hard to work with. Therefore, a multistep method with constant coefficients, which is free from the noted shortcoming, is presented, together with a way of finding its coefficients.
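
In generic form (a sketch of the standard notation, not necessarily the authors' exact scheme), the problem and a k-step method with constant coefficients read:

```latex
% Volterra integro-differential equation (initial value problem):
\[ y'(x) = f\bigl(x, y(x)\bigr) + \int_{x_0}^{x} K\bigl(x, s, y(s)\bigr)\,ds, \qquad y(x_0) = y_0 . \]
% A k-step method with constant coefficients \alpha_i, \beta_i:
\[ \sum_{i=0}^{k} \alpha_i\, y_{n+i} = h \sum_{i=0}^{k} \beta_i\, y'_{n+i}, \qquad n = 0, 1, \dots, N - k . \]
```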

Keywords: Volterra integro-differential equations, multistep methods, finite-difference methods, initial value problem.

8494 Digital Library Evaluation by SWARA-WASPAS Method

Authors: Mehmet Yörükoğlu, Serhat Aydın

Abstract:

Since the advent of the manuscript, mechanical methods for storing, transferring and using information have evolved into digital methods over time. In this process, libraries, the centers of information, have also become digitized and accessible from anywhere in the world at any time, taking on a structure with no physical boundaries. In this context, some criteria for the information obtained from digital libraries have become more important for users. This paper evaluates, from different perspectives, the user criteria that make a digital library more useful. The Step-Wise Weight Assessment Ratio Analysis-Weighted Aggregated Sum Product Assessment (SWARA-WASPAS) method, with its flexibility and easy calculation steps, is used for the evaluation of the digital library criteria. Three digital libraries are evaluated by information technology experts according to five conflicting main criteria: 'interface design', 'effects on users', 'services', 'user engagement' and 'context'. Finally, the alternatives are ranked in descending order.
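
A minimal numeric sketch of the WASPAS aggregation (hypothetical scores and weights; the SWARA weighting procedure itself is not shown):

```python
import numpy as np

# Hypothetical normalized decision matrix: rows = digital libraries,
# columns = the five criteria from the abstract (benefit type, in [0, 1]).
X = np.array([[0.9, 0.7, 0.8, 0.6, 0.7],
              [0.8, 0.9, 0.6, 0.7, 0.8],
              [0.7, 0.8, 0.9, 0.8, 0.6]])
w = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # e.g., SWARA-derived weights

# WASPAS: convex combination of the weighted-sum (WSM) and
# weighted-product (WPM) scores, with lambda = 0.5 as is common.
wsm = (X * w).sum(axis=1)
wpm = (X ** w).prod(axis=1)
q = 0.5 * wsm + 0.5 * wpm
print(np.argsort(q)[::-1])   # libraries ranked best to worst
```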

Keywords: Digital library, multi criteria decision making, SWARA-WASPAS method.

8493 A Robust and Efficient Segmentation Method Applied for Cardiac Left Ventricle with Abnormal Shapes

Authors: Peifei Zhu, Zisheng Li, Yasuki Kakishita, Mayumi Suzuki, Tomoaki Chono

Abstract:

Segmentation of the left ventricle (LV) from cardiac ultrasound images provides a quantitative functional analysis of the heart to diagnose disease. The Active Shape Model (ASM) is widely used for LV segmentation, but it suffers from the drawback that the initialization of the shape model may not be sufficiently close to the target, especially when dealing with the abnormal shapes found in disease. In this work, a two-step framework is improved to achieve fast and efficient LV segmentation. First, a robust and efficient detector based on a Hough forest localizes cardiac feature points. These feature points are used to predict the initial fitting of the LV shape model. Second, the ASM is applied to further fit the LV shape model to the cardiac ultrasound image. With the robust initialization, the ASM is able to achieve more accurate segmentation. The performance of the proposed method is evaluated on a dataset of 810 cardiac ultrasound images, mostly of abnormal shapes. The proposed method is compared with several combinations of the ASM and existing initialization methods. Our experimental results demonstrate that the accuracy of the proposed method for feature point detection for initialization was 40% higher than that of the existing methods. Moreover, the proposed method significantly reduces the number of necessary ASM fitting loops and thus speeds up the whole segmentation process. Therefore, the proposed method is able to achieve more accurate and efficient segmentation results and is applicable to unusual shapes of the heart in cardiac diseases, such as left atrial enlargement.

Keywords: Hough forest, active shape model, segmentation, cardiac left ventricle.

8492 Effect of Cooling Coherent Nozzle Orientation on the Machinability of Ti-6Al-4V in Step Shoulder Milling

Authors: Salah Gariani, Islam Shyha, Osama Elgadi, Khaled Jegandi

Abstract:

In this work, a coherent round cooling nozzle was developed and the impact of nozzle placement (i.e. nozzle angle and stand-off/impinging distance) on the machinability of Ti-6Al-4V was evaluated. Key process measures were cutting force, workpiece temperature, tool wear, burr formation and average surface roughness (Ra). Experimental results showed that a nozzle position at a 15° angle in the feed direction and at 45°/60° against the feed direction assisted in minimising workpiece temperature. Stand-off distances of 55 and 75 mm are also necessary to control burr formation, workpiece temperature and Ra, but the coherent nozzle orientation has no statistically significant impact on the mean values of cutting force and tool wear. It can be concluded that the stand-off distance is substantially more significant than the nozzle angle when step shoulder milling Ti-6Al-4V using vegetable oil-based cutting fluid.

Keywords: Coherent round nozzle, step shoulder milling, Ti-6Al-4V, vegetable oil-based cutting fluid.
