Search results for: improved analytic hierarchy process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6801


6441 An Intelligent Text Independent Speaker Identification Using VQ-GMM Model Based Multiple Classifier System

Authors: Cheima Ben Soltane, Ittansa Yonas Kelbesa

Abstract:

Speaker Identification (SI) is the task of establishing the identity of an individual based on his/her voice characteristics. The SI task is typically achieved by two-stage signal processing: training and testing. The training process calculates speaker-specific feature parameters from the speech and generates speaker models accordingly. In the testing phase, speech samples from unknown speakers are compared with the models and classified. Even though the performance of speaker identification systems has improved due to recent advances in speech processing techniques, there is still a need for improvement. In this paper, a Closed-Set Text-Independent Speaker Identification system (CISI) based on a Multiple Classifier System (MCS) is proposed, using Mel Frequency Cepstrum Coefficients (MFCC) for feature extraction and a suitable combination of Vector Quantization (VQ) and the Gaussian Mixture Model (GMM) together with the Expectation Maximization (EM) algorithm for speaker modeling. The use of a Voice Activity Detector (VAD) with a hybrid approach based on Short Time Energy (STE) and statistical modeling of background noise in the pre-processing step of the feature extraction yields a better and more robust automatic speaker identification system. In addition, using the Linde-Buzo-Gray (LBG) clustering algorithm to initialize the GMM parameters estimated in the EM step improved the convergence rate and system performance. A relative index is also used as a confidence measure in case of contradiction between the GMM and VQ identification results. Simulations carried out on the voxforge.org speech database using MATLAB highlight the efficacy of the proposed method compared to earlier work.
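As a rough illustration of the GMM branch of such a system, the sketch below trains one Gaussian mixture per speaker on MFCC features and identifies an unknown sample by maximum average log-likelihood. It is a minimal Python analogue under stated assumptions (the paper's implementation is in MATLAB); the VAD pre-processing, VQ codebooks and confidence-based combination are omitted, and the training-data layout is assumed.

```python
# Minimal sketch of the GMM branch of a closed-set speaker ID system:
# MFCC features per speaker, one GaussianMixture (EM-fitted) per speaker,
# identification by maximum average log-likelihood.
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def extract_mfcc(wav_path, n_mfcc=13):
    y, sr = librosa.load(wav_path, sr=None)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T   # frames x coefficients

def train_speaker_models(train_files, n_components=16):
    # train_files: dict mapping speaker id -> list of wav paths (assumed layout)
    models = {}
    for speaker, paths in train_files.items():
        feats = np.vstack([extract_mfcc(p) for p in paths])
        models[speaker] = GaussianMixture(n_components=n_components,
                                          covariance_type="diag",
                                          max_iter=200, random_state=0).fit(feats)
    return models

def identify(wav_path, models):
    feats = extract_mfcc(wav_path)
    # score() returns the average per-frame log-likelihood under each speaker model
    scores = {spk: gmm.score(feats) for spk, gmm in models.items()}
    return max(scores, key=scores.get)
```

Note that GaussianMixture uses a k-means-style initialization by default, which plays a role similar to the LBG initialization discussed in the abstract.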

Keywords: Feature Extraction, Speaker Modeling, Feature Matching, Mel Frequency Cepstrum Coefficient (MFCC), Gaussian mixture model (GMM), Vector Quantization (VQ), Linde-Buzo-Gray (LBG), Expectation Maximization (EM), pre-processing, Voice Activity Detection (VAD), Short Time Energy (STE), Background Noise Statistical Modeling, Closed-Set Text-Independent Speaker Identification System (CISI).

6440 Short-Term Load Forecasting Based on Variational Mode Decomposition and Least Square Support Vector Machine

Authors: Jiangyong Liu, Xiangxiang Xu, Bote Luo, Xiaoxue Luo, Jiang Zhu, Lingzhi Yi

Abstract:

To address the non-linearity and high randomness of the original power load sequence, which degrade power load forecasting accuracy, a short-term load forecasting method is proposed. The method is based on the least square support vector machine (LSSVM) optimized by an improved sparrow search algorithm, combined with variational mode decomposition, as proposed in this paper. The variational mode decomposition technique decomposes the raw power load data into a series of intrinsic mode function components, which reduces the complexity and instability of the raw data while overcoming modal confounding; the proposed improved sparrow search algorithm addresses the difficulty of selecting the learning parameters of the LSSVM. Finally, comparison experiments show that the method can effectively improve prediction accuracy.
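A minimal sketch of the decompose-then-forecast pipeline is shown below. It is not the authors' code: `vmd` is a placeholder for a variational mode decomposition routine returning the component series, scikit-learn's RBF-kernel SVR stands in for the LSSVM, and the improved sparrow search tuning step is omitted (hyperparameters are simply fixed).

```python
# Sketch of the decompose-then-forecast pipeline: one regressor per
# decomposed component, final forecast = sum of component forecasts.
import numpy as np
from sklearn.svm import SVR

def make_supervised(series, lags=24):
    X = np.array([series[i - lags:i] for i in range(lags, len(series))])
    y = series[lags:]
    return X, y

def forecast_load(load, vmd, n_modes=5, lags=24):
    modes = vmd(load, n_modes)          # assumed: returns n_modes component series
    split = int(0.8 * (len(load) - lags))
    forecast = None
    for imf in modes:                   # one SVR (LSSVM stand-in) per component
        X, y = make_supervised(imf, lags)
        model = SVR(kernel="rbf", C=10.0, gamma="scale").fit(X[:split], y[:split])
        pred = model.predict(X[split:])
        forecast = pred if forecast is None else forecast + pred
    return forecast                     # reconstructed load forecast on the test span
```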

Keywords: Load forecasting, variational mode decomposition, improved sparrow search algorithm, least square support vector machine.

6439 Improved Multi-Objective Particle Swarm Optimization Applied to Design Problem

Authors: Kapse Swapnil, K. Shankar

Abstract:

Aiming at minimizing the weight and deflection of a cantilever beam subject to maximum stress and maximum deflection constraints, Multi-Objective Particle Swarm Optimization (MOPSO) with utopia-point-based local search is implemented. The utopia point is used to guide the search towards the Pareto optimal set. The elite candidates obtained during the iterations are stored in an archive according to non-dominated sorting, and the archive is truncated by removing candidates with the least crowding distance. Local search is also performed on the elite candidates, and the most diverse particle is selected as the global best. The method is applied to standard test functions, and it is observed that the improved algorithm gives better convergence and diversity than NSGA-II in fewer iterations. Implementation on a practical structural problem shows that the improved algorithm converges in 5 to 6 iterations with better diversity, improving the cantilever beam design by an average of 0.78% in weight and 9.28% in deflection compared to NSGA-II.
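The archive-truncation rule mentioned above can be illustrated with the standard crowding-distance measure; the NumPy sketch below computes it and keeps the most diverse archive members. It is an illustrative fragment, not the authors' full MOPSO with utopia-point local search.

```python
# Crowding-distance-based archive truncation: members with the smallest
# crowding distance are dropped first when the archive exceeds its size limit.
import numpy as np

def crowding_distance(objs):
    """objs: (n, m) array of objective values for n non-dominated solutions."""
    n, m = objs.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(objs[:, j])
        span = objs[order[-1], j] - objs[order[0], j] or 1.0
        dist[order[0]] = dist[order[-1]] = np.inf   # always keep boundary solutions
        for k in range(1, n - 1):
            dist[order[k]] += (objs[order[k + 1], j] - objs[order[k - 1], j]) / span
    return dist

def truncate_archive(objs, max_size):
    """Return indices of the max_size most diverse archive members."""
    return np.argsort(-crowding_distance(objs))[:max_size]
```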

Keywords: Utopia point, multi-objective particle swarm optimization, local search, cantilever beam.

6438 Detection of Black Holes in MANET Using Collaborative Watchdog with Fuzzy Logic

Authors: Y. Harold Robinson, M. Rajaram, E. Golden Julie, S. Balaji

Abstract:

A mobile ad hoc network (MANET) is a self-configuring network of mobile nodes connected without wires. A fuzzy-logic-based collaborative watchdog approach is used to reduce the detection time of misbehaving nodes and increase overall trustworthiness. This methodology improves secure and efficient routing by detecting black hole attacks. The simulation results show that this method improves energy efficiency, reduces delay, and improves the overall performance of black hole attack detection in MANETs.
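To illustrate how a fuzzy inference step can turn monitored behavior into a suspicion score for the collaborative watchdog, the sketch below combines two inputs (packet-drop ratio and observed forwarding rate) through triangular membership functions and a small rule base. The inputs, membership functions, rules and threshold are assumptions made for the example, not the authors' design.

```python
# Illustrative fuzzy inference for flagging a suspected black hole node.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def suspicion(drop_ratio, forward_rate):
    drop_high = tri(drop_ratio, 0.4, 1.0, 1.6)     # drop ratio above ~0.4 is increasingly "high"
    fwd_low   = tri(forward_rate, -0.6, 0.0, 0.6)  # forwarding rate below ~0.6 is increasingly "low"
    # Rules: high drops AND low forwarding -> malicious; either alone -> suspicious
    malicious  = min(drop_high, fwd_low)
    suspicious = max(drop_high, fwd_low)
    # Weighted-average defuzzification over {benign: 0, suspicious: 0.5, malicious: 1}
    weights   = [1.0 - suspicious, suspicious - malicious, malicious]
    centroids = [0.0, 0.5, 1.0]
    total = sum(weights) or 1.0
    return sum(w * c for w, c in zip(weights, centroids)) / total

# A node whose collaboratively shared suspicion exceeds a threshold (e.g. 0.6)
# would be reported as a black hole by the watchdog.
```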

Keywords: MANET, collaborative watchdog, fuzzy logic, AODV.

6437 Salinity on Survival and Early Development of Biofuel Feedstock Crops

Authors: Vincent M. Russo

Abstract:

Salinity level may affect the early development of biofuel feedstock crops. The biofuel feedstock crops canola (Brassica napus L.), sorghum [Sorghum bicolor (L.) Moench], and sunflower (Helianthus annuus L.), and the potential feedstock crop sweet corn (Zea mays L.), were planted in media in pots and treated with aqueous solutions of 0, 0.1, 0.5 and 1.0 M NaCl once at: 1) planting; 2) 7-10 days after planting; or 3) first true leaf expansion. An additional treatment (4), comprising one-half strength of the 0.1, 0.5 and 1.0 M solutions (0.05, 0.25 and 0.5 M at each application), was applied at first true leaf expansion and four days later. Survival of most crops decreased below 90% above 0.5 M; survival of canola decreased above 0.1 M. Application timing had little effect on crop survival. For canola, root fresh and dry weights improved when application was at plant emergence; for sorghum, top and root fresh weights improved when the split application was used. When application was at planting, root dry weight was improved over most other applications. Sunflower top fresh weight was among the highest when saline solutions were split, and top dry weight was among the highest when application was at plant emergence. Sweet corn root fresh weight was improved when the split application was used or application was at planting. Sweet corn root dry weight was highest when application was at planting or plant emergence. Even at high salinity rates, survival rates greater than might be expected occurred. Plants that survived appear able to adjust to saline conditions during the early stages of development.

Keywords: Canola, Development, Sorghum, Sunflower, Sweetcorn, Survival

6436 A Simplified and Effective Algorithm Used to Mine Similar Processes: An Illustrated Example

Authors: Min-Hsun Kuo, Yun-Shiow Chen

Abstract:

The running logs of a process hold valuable information about its executed activity behavior and generated activity logic structure. These informative logs can be extracted, analyzed, and utilized to improve the efficiency of the process's execution. One of the techniques used to accomplish such process improvement is called process mining. Mining similar processes is one such improvement task in process mining. Rather than directly mining similar processes using a single comparison coefficient or a complicated fitness function, this paper presents a simplified heuristic process mining algorithm with two similarity comparisons that relatively conform the activity logic sequences (traces) of the mined processes to those of a normalized (regularized) one. The goal of relative process conformance is to find which of the mined processes match the required activity sequences and relationships, so that the mined processes can be applied to process improvement where necessary and sufficient. One similarity is defined by the number of similar activity sequences existing in different processes; the other expresses the degree of similar (identical) activity sequences among the conforming processes. Since these two similarities are defined with respect to typical behavior (activity sequences) occurring in an entire process, common problems of other process conformance techniques, such as the inappropriateness of absolute comparison and the inability to elicit intrinsic information, can be avoided by the relative process comparison presented in this paper. To demonstrate the potential of the proposed algorithm, a numerical example is illustrated.
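The two comparisons can be illustrated with a simple trace representation based on directly-follows activity pairs; the sketch below gives one count-style similarity and one degree-style similarity. These definitions are simplifications for illustration, not the paper's exact formulas.

```python
# Illustrative trace-similarity measures; a trace is a list of activity labels.
def bigrams(trace):
    """Directly-follows pairs, i.e. length-2 activity sequences in a trace."""
    return {(a, b) for a, b in zip(trace, trace[1:])}

def shared_sequence_count(process, reference):
    """Similarity 1: number of reference activity sequences also found in `process`."""
    return len(bigrams(process) & bigrams(reference))

def sequence_overlap_degree(process, reference):
    """Similarity 2: fraction of the reference's sequences reproduced by `process`."""
    ref = bigrams(reference)
    return len(bigrams(process) & ref) / len(ref) if ref else 1.0

# Example: compare a mined trace against a normalized reference trace.
reference = ["receive", "check", "approve", "archive"]
candidate = ["receive", "check", "reject", "archive"]
print(shared_sequence_count(candidate, reference))    # 1 shared directly-follows pair
print(sequence_overlap_degree(candidate, reference))  # ~0.33
```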

Keywords: process mining, process similarity, artificial intelligence, process conformance.

6435 Energy Recovery Soft Switching Improved Efficiency Half Bridge Inverter for Electronic Ballast Applications

Authors: A. Yazdanpanah Goharrizi

Abstract:

An improved topology of a voltage-fed quasi-resonant soft-switching LCrCdc series-parallel half bridge inverter with constant frequency for electronic ballast applications is proposed in this paper. This new topology introduces a low-cost solution to reduce switching losses and circuit rating to achieve a high-efficiency ballast. The effect of switching losses on ballast efficiency is discussed from an experimental point of view. In this discussion, an improved topology which accomplishes soft-switching operation over a wide power regulation range is proposed. The proposed structure uses a reverse recovery diode to provide better operation for the ballast system. A symmetrical pulse width modulation (PWM) control scheme is implemented to regulate a wide range of output power. Simulation results are verified against experimental measurements obtained from a ballast-lamp laboratory prototype. Different load conditions are provided in order to clarify the performance of the proposed converter.

Keywords: Electronic ballast, pulse width modulation (PWM), reverse recovery diode, soft switching.

6434 Conceptual Method for Flexible Business Process Modeling

Authors: Adla Bentellis, Zizette Boufaïda

Abstract:

Nowadays, the pace of business change is such that, increasingly, new functionality has to be realized and reliably installed in a matter of days, or even hours. Consequently, more and more business processes are prone to continuous change. The objective of the research in progress is to use the MAP model in a conceptual modeling method for flexible and adaptive business processes. This method can be used to capture the flexibility dimensions of a business process; it takes inspiration from the modularity concept in the object-oriented paradigm to establish a hierarchical construction of the BP model. Its intent is to provide flexible modeling that allows companies to quickly adapt their business processes.

Keywords: Business Process, Business process modeling, flexibility, MAP Model.

6433 Effects of Milling Process Parameters on Cutting Forces and Surface Roughness When Finishing Ti6Al4V Produced by Electron Beam Melting

Authors: Abdulmajeed Dabwan, Saqib Anwar, Ali Al-Samhan

Abstract:

Electron Beam Melting (EBM) is a metal powder bed-based Additive Manufacturing (AM) technology, which uses computer-controlled electron beams to create fully dense, three-dimensional, near-net-shaped parts from metal powder. It gives the ability to produce complex parts directly from a computer-aided design (CAD) model, without tools and dies, and with a variety of materials. However, the surface finish quality of the EBM process is a limitation in meeting the performance requirements of additively manufactured components. The aim of this study is to investigate the cutting forces induced during milling of Ti6Al4V produced by EBM, as well as the quality of the milled surfaces. The effects of cutting speed and radial depth of cut on the cutting forces, surface roughness, and surface morphology were investigated. The results indicated that the resultant cutting force was proportional to the cutting speed at all cutting conditions, while the surface roughness improved significantly with increasing cutting speed and radial depth of cut.

Keywords: Electron beam melting, additive manufacturing, Ti6Al4V, surface morphology.

6432 Linking Business Process Models and System Models Based on Business Process Modelling

Authors: Faisal A. Aburub

Abstract:

Organizations today need to invest in software in order to run their businesses, and to achieve the organizations' objectives the software should be in line with the business processes. This research presents an approach for linking process models and system models. In particular, the new approach aims to synthesize a sequence diagram based on a role activity diagram (RAD) model. The approach includes four steps: create a business process model using RAD, identify computerized activities, identify entities in the sequence diagram, and identify messages in the sequence diagram. The new approach has been validated using the student registration process at the University of Petra as a case study. Further research is required to validate the new approach using different domains.

Keywords: Business process modelling, system models, role activity diagrams, sequence diagrams.

6431 A Particle Swarm Optimal Control Method for DC Motor by Considering Energy Consumption

Authors: Yingjie Zhang, Ming Li, Ying Zhang, Jing Zhang, Zuolei Hu

Abstract:

In the actual start-up process of DC motors, the DC drive system often faces a conflict between energy consumption and acceleration performance. To resolve this conflict, this paper proposes a comprehensive performance index in which an energy consumption index is added to the classical control performance index for the DC motor starting process. Taking the comprehensive performance index as the cost function, a particle swarm optimization algorithm is designed to optimize the comprehensive performance. Simulations on the optimization of the comprehensive performance of the DC motor are then conducted, with the weight coefficient of the energy consumption index properly designed. The simulation results show that, as the weight of energy consumption increased, the energy efficiency was significantly improved at the expense of a slight sacrifice in the speed-of-response indicators with the comprehensive performance index method. The energy efficiency increased from 63.18% to 68.48% and the response time was simultaneously reduced from 0.2875 s to 0.1736 s compared with a traditional proportional-integral-derivative controller.
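A toy version of the composite-cost idea is sketched below: a particle swarm searches PI gains for a simple first-order motor speed model, minimizing an ITAE tracking term plus a weighted energy term. The motor parameters, cost weights and PSO settings are illustrative assumptions, not the authors' values.

```python
# Composite cost J = ITAE + w * energy, minimized over PI gains by a basic PSO.
import numpy as np

def simulate(kp, ki, w_energy, dt=1e-3, t_end=0.5, ref=100.0):
    speed, integ, J_err, J_energy = 0.0, 0.0, 0.0, 0.0
    for k in range(int(t_end / dt)):
        err = ref - speed
        integ += err * dt
        u = np.clip(kp * err + ki * integ, 0.0, 240.0)    # bounded armature voltage
        speed += dt * (-speed / 0.05 + 20.0 * u / 0.05)   # toy first-order motor dynamics
        J_err += (k * dt) * abs(err) * dt                 # ITAE tracking term
        J_energy += u * u * dt                            # energy-consumption term
    return J_err + w_energy * J_energy

def pso(cost, n=20, iters=50, bounds=(0.01, 5.0)):
    rng = np.random.default_rng(0)
    x = rng.uniform(*bounds, size=(n, 2)); v = np.zeros_like(x)
    pbest, pcost = x.copy(), np.array([cost(*p) for p in x])
    for _ in range(iters):
        g = pbest[pcost.argmin()]                         # global best gains so far
        v = 0.7 * v + 1.5 * rng.random((n, 2)) * (pbest - x) + 1.5 * rng.random((n, 2)) * (g - x)
        x = np.clip(x + v, *bounds)
        c = np.array([cost(*p) for p in x])
        better = c < pcost
        pbest[better], pcost[better] = x[better], c[better]
    return pbest[pcost.argmin()], pcost.min()

gains, J = pso(lambda kp, ki: simulate(kp, ki, w_energy=1e-4))
```

Raising `w_energy` trades response speed for energy saving, which is exactly the weighting effect the abstract describes.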

Keywords: Comprehensive performance index, energy consumption, acceleration performance, particle swarm optimal control.

6430 Adaptive Subchannel Allocation for MC-CDMA System

Authors: Cuiran Li, Jianli Xie, Chengshu Li

Abstract:

Multicarrier code-division multiple access (MC-CDMA) is an effective technique owing to its multiple-access capability, robustness against fading, and ability to mitigate inter-symbol interference (ISI). In this paper, we propose an improved multicarrier CDMA system with adaptive subchannel allocation. We analyze the performance of the proposed system in a frequency-selective fading environment with narrowband interference present, and compare it with that of parallel transmission over many subchannels (namely, the conventional MC-CDMA scheme) and with a DS-CDMA system. Simulation results show that when the adaptive subchannel allocation scheme is used in a conventional multicarrier CDMA system, the performance is greatly improved.
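The core of an adaptive allocation scheme can be illustrated as selecting the subcarriers with the best estimated channel conditions, skipping those hit by deep fades or narrowband interference. The top-N-by-estimated-SNR rule in the sketch below is an illustrative assumption, not the paper's exact criterion.

```python
# Toy adaptive subchannel allocation over a Rayleigh-faded multicarrier channel.
import numpy as np

def allocate_subchannels(channel_gains, interference_power, noise_power, n_used):
    """Return indices of the n_used subcarriers with the highest estimated SNR."""
    snr = np.abs(channel_gains) ** 2 / (noise_power + interference_power)
    return np.argsort(snr)[::-1][:n_used]

rng = np.random.default_rng(1)
n_sub = 64
h = (rng.normal(size=n_sub) + 1j * rng.normal(size=n_sub)) / np.sqrt(2)  # Rayleigh fading
interference = np.zeros(n_sub)
interference[20:24] = 10.0            # narrowband interferer over a few subcarriers
chosen = allocate_subchannels(h, interference, noise_power=0.1, n_used=32)
```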

Keywords: MC-CDMA, Rayleigh fading, narrowband interference, channel estimation.

6429 Effect of Coffee Grounds on Physical and Heating Value Properties of Sugarcane Bagasse Pellets

Authors: K. Rattawan, W. Intagun, W. Kanoksilapatham

Abstract:

The objective of this research is to study the effect of coffee grounds on the physical and heating value properties of sugarcane bagasse pellets. The coffee grounds were tested as an additive for the pelletizing process of bagasse pellets. Pelletizing was performed using a flat-die pellet mill machine. The moisture content of the raw materials was controlled at 10-13%. The die temperature range during the process was 75-80 °C. Physical characteristics (bulk density and durability) of the bagasse pellets and of pellets with 1-5% coffee grounds were determined following the standard assigned by the Pellet Fuel Institute (PFI). The results revealed increasing values of 648 ± 3.4, 659 ± 3.1, 679 ± 3.3 and 685 ± 3.1 kg/m³ for pellet bulk density, and 98.7 ± 0.11, 99.2 ± 0.26, 99.3 ± 0.19 and 99.4 ± 0.07% for pellet durability, respectively. In addition, the heating values of the coffee-ground-supplemented pellets (15.9 ± 1.16, 17.0 ± 1.23 and 18.8 ± 1.34 MJ/kg) were improved compared to the non-supplemented control (14.9 ± 1.14 MJ/kg). The results indicated that both the bulk density and durability of the bagasse pellets increased with increasing proportion of the coffee grounds additive.

Keywords: Bagasse, coffee grounds, pelletizing, heating value, sugar cane bagasse.

6428 Unified Structured Process for Health Analytics

Authors: Supunmali Ahangama, Danny Chiang Choon Poo

Abstract:

Health analytics (HA) is used in healthcare systems for effective decision making, management and planning of healthcare and related activities. However, user resistance, the unique nature of medical data content and structure (including heterogeneous and unstructured data), and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of outcomes depends on the skills and the domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management, and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose an HA process model with features from the Rational Unified Process (RUP) model and agile methodology.

Keywords: Agile methodology, health analytics, unified process model, UML.

6427 Enhanced Data Access Control of Cooperative Environment used for DMU Based Design

Authors: Wei Lifan, Zhang Huaiyu, Yang Yunbin, Li Jia

Abstract:

Analysis of the digital design process based on digital mockup (DMU) indicates that a distributed cooperative supporting environment is a foundational condition for adopting a DMU-based design approach. Data access authorization is the first concern because of the value and sensitivity of the data for the enterprise. Access control for administrators is often rather weak compared with that for business users. The authors therefore established an enhanced system to prevent administrators from accessing the engineering data through indirect means and without authorization. Thus, data security is improved.

Keywords: access control, DMU, PLM, virtual prototype.

6426 Improved Wi-Fi Backscatter System for Multi-to-Multi Communication

Authors: Chang-Bin Ha, Yong-Jun Kim, Dong-Hyun Ha, Hyoung-Kyu Song

Abstract:

The conventional Wi-Fi backscatter system can only process one-to-one communication between the Wi-Fi reader and the Wi-Fi tag. To improve the throughput of the conventional system, this paper proposes a multi-to-multi communication system. In the proposed system, the interference caused by multi-to-multi communication is effectively cancelled by orthogonal multiple access based on the identification code of each tag. Although the procedure for multi-to-multi communication generates overhead, the overhead is insignificant for the entire communication procedure because the procedure is handled by the Wi-Fi protocol. The numerical results confirm that the throughput of the proposed system increases nearly proportionally with the number of tags that simultaneously participate in communication.
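The orthogonal-code separation idea can be illustrated with Walsh-Hadamard rows as tag identification codes: simultaneous tag signals superpose at the reader and are recovered by despreading with each tag's code. The code length and the mapping of tags to rows below are illustrative assumptions.

```python
# Toy separation of simultaneous tag transmissions via orthogonal codes.
import numpy as np
from scipy.linalg import hadamard

codes = hadamard(8)                      # 8 mutually orthogonal +/-1 codes
tag_bits = {1: +1, 3: -1, 5: +1}         # bits sent by three simultaneously active tags

# Superposition of the code-spread tag signals plus channel noise
received = sum(bit * codes[tag] for tag, bit in tag_bits.items()).astype(float)
received += np.random.default_rng(0).normal(scale=0.2, size=8)

# The reader despreads with each tag's code; orthogonality cancels the other tags
for tag in tag_bits:
    decision = np.sign(codes[tag] @ received / len(received))
    print(tag, int(decision))
```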

Keywords: Backscatter, Multi-to-multi communication, Orthogonality, Wi-Fi.

6425 An Improved Tie Force Method for Progressive Collapse Resistance of Precast Concrete Cross Wall Structures

Authors: M. Tohidi, J. Yang, C. Baniotopoulos

Abstract:

Progressive collapse of buildings typically occurs when abnormal loading conditions cause local damage, which leads to a chain reaction of failure and ultimately catastrophic collapse. The tie force (TF) method is one of the main design approaches for progressive collapse. As the TF method is a simplified method, further investigation of its reliability is necessary. This study aims to develop an improved TF method to design cross wall structures against progressive collapse. To this end, the pullout behavior of strands in grout was first analyzed; then, by considering the tie force-slip relationship in the friction stage together with the catenary action mechanism, a comprehensive analytical method was developed. The reliability of this approach is verified by the experimental results of concrete block pullout tests and full-scale floor-to-floor joint tests undertaken by the Portland Cement Association (PCA). Discrepancies in the tie force between the analytical results and codified specifications suggest a deficiency in the TF method; hence, an improved model based on the analytical results is proposed to address this concern.


Keywords: Cross wall, progressive collapse, tie force method, catenary, analytical.

6424 An Improved K-Means Algorithm for Gene Expression Data Clustering

Authors: Billel Kenidra, Mohamed Benmohammed

Abstract:

Data mining techniques used in the field of clustering are a subject of active research and assist in biological pattern recognition and the extraction of new knowledge from raw data. Clustering means the act of partitioning an unlabeled dataset into groups of similar objects. Each group, called a cluster, consists of objects that are similar to one another and dissimilar to objects of other groups. Several clustering methods are based on partitional clustering. This category attempts to directly decompose the dataset into a set of disjoint clusters, leading to an integer number of clusters that optimizes a given criterion function. The criterion function may emphasize a local or a global structure of the data, and its optimization is an iterative relocation procedure. The K-Means algorithm is one of the most widely used partitional clustering techniques. Since K-Means is extremely sensitive to the initial choice of centers, and a poor choice may lead to a local optimum that is quite inferior to the global optimum, we propose a strategy to initialize the K-Means centers. The improved K-Means algorithm is compared with the original K-Means, and the results show that the efficiency is significantly improved.
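One common way to reduce K-Means' sensitivity to its starting centers is distance-based seeding; the sketch below uses a farthest-point initialization as an example of such a strategy. It illustrates the general idea only and is not the authors' specific initialization method.

```python
# Farthest-point seeding for K-Means: pick each new center as the point
# farthest from all centers chosen so far, then run standard K-Means.
import numpy as np
from sklearn.cluster import KMeans

def farthest_point_init(X, k, rng=np.random.default_rng(0)):
    centers = [X[rng.integers(len(X))]]            # first center chosen at random
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])            # farthest point from current centers
    return np.array(centers)

# Usage on a gene-expression-like matrix (rows = genes, columns = samples)
X = np.random.default_rng(1).normal(size=(500, 40))
init = farthest_point_init(X, k=6)
labels = KMeans(n_clusters=6, init=init, n_init=1).fit_predict(X)
```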

Keywords: Microarray data mining, biological pattern recognition, partitional clustering, k-means algorithm, centroid initialization.

6423 Author's Approach to the Problem of Correctional Speech Therapy with Children Suffering from Alalia

Authors: Е. V. Kutsina, S. A. Tarasova

Abstract:

In this article, we present a methodology which enables preschool and primary school unlanguaged children to remember words, phrases and texts with the help of graphic signs: letters, syllables and words. Reading becomes a support for the child's speech development. Teaching is based on the principle "from simple to complex": "a letter - a syllable - a word - a sentence - a text." The availability of multi-level texts allows this methodology to be used for working with children who have different levels of speech development.

Keywords: Alalia, analytic-synthetic method, development of coherent speech, formation of vocabulary, learning to read, sentence formation, three-level stories, unlanguaged children.

6422 Combined Effect of Heat Stimulation and Delayed Addition of Superplasticizer with Slag on Fresh and Hardened Property of Mortar

Authors: Faraidoon Rahmanzai, Mizuki Takigawa, Yu Bomura, Shigeyuki Date

Abstract:

To obtain high quality and the essential workability of mortar, different types of superplasticizers are used. Superplasticizers are chemical admixtures used in the mix to improve the fluidity of mortar. Many factors influence how the superplasticizer disperses the cement particles in the mortar. The nature and amount of cement replaced by slag, the mixing procedure, the delayed addition time, and the heat stimulation of the superplasticizer all affect the fluidity of the cementitious material. In this experiment, the superplasticizers were heated for 1 hour at 60 °C in a thermostatic chamber. Furthermore, the effect of the delayed addition time of the heat-stimulated superplasticizers (SP) was analyzed. This method was applied to two types of polycarboxylic-acid-based ether SP (a precast-type superplasticizer (SP2) and a ready-mix-type superplasticizer (SP1)) in combination with a partial replacement of normal Portland cement with blast furnace slag (BFS) at a 30% w/c ratio. The fluidity, air content, fresh density, and compressive strength at 7 and 28 days were studied. The results indicate that the delayed addition time and the heat stimulation technique improved the flow and air content, decreased the density, and slightly decreased the compressive strength of the mortar. Moreover, increasing the amount of slag improved the flow of the mortar and decreased the effect of the external temperature of the SP on the flow. In comparison, the flow of mortar was improved with a 5-minute delay for both kinds of SP, but SP1 improved the flow in all conditions. Most importantly, the transition points for both types of SP appear to be the same, at about 5 ± 1 min. The optimum addition time of SP to mortar should therefore be in this period.

Keywords: Combined effect, delayed addition, heat stimulation, flow of mortar.

6421 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to be able to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating the source of the problem, and then either correcting the problem or determining a way to work around it. It provides a layer that is easy to integrate into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In this paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
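The signal-handling idea can be illustrated with a short Python analogue: install handlers for catchable signals and dump a per-thread report instead of attaching a conventional debugger, with faulthandler covering the fatal low-level signals. The real DAQ Debugger is a C++/Qt component; the report file name and contents here are illustrative.

```python
# Minimal analogue of signal-based crash reporting: on selected signals,
# append a report with the stack trace of every thread to a file.
import datetime
import faulthandler
import signal
import sys
import threading
import traceback

REPORT_FILE = "daq_debug_report.txt"   # illustrative report location

def write_report(signum, frame):
    with open(REPORT_FILE, "a") as f:
        f.write(f"\n=== signal {signum} at {datetime.datetime.now()} ===\n")
        for tid, stack in sys._current_frames().items():   # one stack per thread
            f.write(f"--- thread {tid} ---\n")
            f.write("".join(traceback.format_stack(stack)))

for sig in (signal.SIGTERM, signal.SIGUSR1):               # catchable signals
    signal.signal(sig, write_report)

# SIGSEGV/SIGFPE/SIGABRT cannot be handled safely in pure Python;
# faulthandler dumps low-level tracebacks for them into the same file.
faulthandler.enable(open(REPORT_FILE, "a"))
```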

Keywords: DAQ debugger, data acquisition system, FPGA, system signals, Qt framework.

6420 Introducing Fast Robot Roller Hemming Process in Automotive Industry

Authors: Babak Saboori, Behzad Saboori, Johan S. Carlson, Rikard Söderberg

Abstract:

As product life cycles become shorter every day, having flexible manufacturing processes seems ever more demanding for any company. In the assembly of closures, i.e., the opening parts of a car body, the hemming process is the one which needs most attention. This paper focuses on the robot roller hemming process and how to reduce its cycle time by introducing a fast roller hemming process. The robot roller hemming process for the tailgate of the Saab 93 SportCombi model is investigated as a case study. By applying task separation, robot coordination, and robot cell configuration principles to the roller hemming process, three alternatives are proposed and developed, and a remarkable reduction in cycle time is achieved [1].

Keywords: Cell configuration, cycle time, robot coordination, roller hemming.

6419 Production of Spherical Ag/ZnO Nanocomposite Particles for Photocatalytic Applications

Authors: K. B. Dermenci, B. Ebin, S. Gürmen

Abstract:

Noble metal participation in nanostructured semiconductor catalysts has drawn much interest because of their improved properties. Recently, many researchers have reported that Ag incorporation in TiO2, CuO and ZnO semiconductors shows improved photocatalytic and optical properties. In this research, Ag/ZnO nanocomposite particles were prepared by the Ultrasonic Spray Pyrolysis (USP) method. 0.1 M silver and zinc nitrate aqueous solutions were used as precursor solutions. The Ag:Zn atomic ratio of the solution was selected as 1:1. Experiments were carried out under a constant air flow of 400 mL/min at a furnace temperature of 800 °C. Particles were characterized by X-Ray Diffraction (XRD), Scanning Electron Microscopy (SEM) and Energy Dispersive Spectroscopy (EDS). The crystallite sizes of Ag and ZnO in the composite particles are 24.6 nm and 19.7 nm, respectively. Although the spherical nanocomposite particles are in the range of 300-800 nm, they are formed by the aggregation of primary particles in the range of 20-60 nm.

Keywords: Ag/ZnO nanocatalysts, Nanotechnology, USP

6418 Detection of Keypoint in Press-Fit Curve Based on Convolutional Neural Network

Authors: Shoujia Fang, Guoqing Ding, Xin Chen

Abstract:

The quality of press-fit assembly is closely related to the reliability and safety of the product. This paper proposes a keypoint detection method based on a convolutional neural network to improve the accuracy of keypoint detection in press-fit curves, providing an auxiliary basis for judging the quality of press-fit assembly. The press-fit curve is a curve of press-fit force versus displacement. Since both the force data and the displacement data are time series, a one-dimensional convolutional neural network is used to process the press-fit curve. After the obtained press-fit data are filtered, a multi-layer one-dimensional convolutional neural network automatically learns the press-fit curve features, which are then sent to a multi-layer perceptron that finally outputs the keypoint of the curve. We used data from press-fit assembly equipment in the actual production process to train the CNN model, and different data from the same equipment to evaluate the detection performance. Compared with existing research results, the detection performance was significantly improved. This method can provide a reliable basis for the judgment of press-fit quality.
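A sketch of a 1-D CNN feature extractor followed by a multi-layer perceptron regressing the keypoint position is given below (PyTorch). The two-channel force/displacement input layout, layer sizes and curve length are illustrative assumptions, not the authors' exact architecture.

```python
# 1-D CNN feature extractor + MLP head predicting a keypoint position per curve.
import torch
import torch.nn as nn

class PressFitKeypointNet(nn.Module):
    def __init__(self, curve_len=512):
        super().__init__()
        self.features = nn.Sequential(           # multi-layer 1-D convolutions
            nn.Conv1d(2, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.head = nn.Sequential(               # multi-layer perceptron
            nn.Flatten(),
            nn.Linear(64 * (curve_len // 8), 128), nn.ReLU(),
            nn.Linear(128, 1),                   # normalized keypoint position
        )

    def forward(self, x):                        # x: (batch, 2, curve_len)
        return self.head(self.features(x))

model = PressFitKeypointNet()
curves = torch.randn(8, 2, 512)                  # 8 filtered force/displacement curves
keypoints = model(curves)                        # (8, 1) predicted keypoint locations
```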

Keywords: Keypoint detection, curve feature, convolutional neural network, press-fit assembly.

6417 Enhancing Seamless Communication Through a User Co-Designed Wearable Device

Authors: A. Marcengo, A. Rapp, E. Guercio

Abstract:

This work describes the process of developing services and applications for seamless communication within a Telecom Italia long-term research project whose central aim is the design of a wearable communication device. In particular, the objective was to design a wrist phone integrated into people's everyday life in full transparency. The methodology used to design the wristwatch was developed through several successive steps, also involving the Personas Layering Framework. The data collected in these phases were very useful for designing an improved version of the first two wrist phone concepts, changing aspects related to the four critical points expressed by the users.

Keywords: Design, Interaction, User Centred Design, Wristphone.

6416 Predicting Extrusion Process Parameters Using Neural Networks

Authors: Sachin Man Bajimaya, SangChul Park, Gi-Nam Wang

Abstract:

The objective of this paper is to estimate realistic principal extrusion process parameters by means of an artificial neural network. Conventionally, finite element analysis is used to derive process parameters. However, finite element analysis of the extrusion model does not consider the manufacturing process constraints in its modeling. Therefore, the process parameters obtained through such an analysis remain highly theoretical. Alternatively, process development in industrial extrusion is to a great extent based on trial and error and often involves full-size experiments, which are both expensive and time-consuming. Artificial-neural-network-based estimation of the extrusion process parameters prior to plant execution helps to make the actual extrusion operation more efficient because more realistic parameters may be obtained, and thus it bridges the gap between simulation and the real manufacturing execution system. In this work, a suitable neural network is designed and trained using an appropriate learning algorithm. The trained network is then used to predict the manufacturing process parameters.
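The sketch below shows an ANN of this kind with scikit-learn's MLPRegressor, mapping assumed inputs (billet temperature, ram speed, extrusion ratio) to an assumed target (required extrusion force) on synthetic data; in practice the network would be trained on finite element or plant records, and the choice of features and target is the modeller's.

```python
# Small feed-forward ANN mapping extrusion inputs to a process parameter.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(400, 500, 200),      # billet temperature [deg C] (assumed input)
    rng.uniform(1, 10, 200),         # ram speed [mm/s] (assumed input)
    rng.uniform(10, 60, 200),        # extrusion ratio [-] (assumed input)
])
# Synthetic stand-in for the target; real training data would come from FE runs or the plant
y = 5.0 * X[:, 2] - 0.8 * X[:, 0] + 20.0 * X[:, 1] + rng.normal(0, 10, 200)

scaler = StandardScaler().fit(X)
ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
ann.fit(scaler.transform(X), y)
predicted_force = ann.predict(scaler.transform([[450.0, 5.0, 30.0]]))
```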

Keywords: Artificial Neural Network (ANN), Indirect Extrusion, Finite Element Analysis, MES.

6415 An Overview of the Factors Affecting Microbial-Induced Calcite Precipitation and its Potential Application in Soil Improvement

Authors: Wei-Soon Ng, Min-Lee Lee, Siew-Ling Hii

Abstract:

Microbial-induced calcite precipitation (MICP) is a relatively green and sustainable soil improvement technique. It utilizes a biochemical process that exists naturally in soil to improve the engineering properties of soils. The calcite precipitation process is enhanced by injecting a high concentration of urease-positive bacteria and reagents into the soil. The main objective of this paper is to provide an overview of the factors affecting MICP in soil. Several factors were identified, including nutrients, bacteria type, geometric compatibility of bacteria, bacteria cell concentration, fixation and distribution of bacteria in soil, temperature, reagent concentration, pH, and injection method. These factors were found to be essential for promoting successful MICP soil treatment. Furthermore, a preliminary laboratory test was carried out to investigate the potential application of the technique in improving the shear strength and impermeability of a residual soil specimen. The results showed that both the shear strength and impermeability of the residual soil improved significantly upon MICP treatment. The improvement increased with increasing soil density.

Keywords: Bacteria, biocementation, bioclogging, calcite precipitation, soil improvement.

6414 Treatment of Cutting Oily-Wastewater by Sono Fenton Process: Experimental Approach and Combined Process

Authors: P. Painmanakul, T. Chintateerachai, S. Lertlapwasin, N. Rojvilavan, T. Chalermsinsuwan, N. Chawaloesphonsiya, O. Larpparisudthi

Abstract:

Conventional coagulation, advanced oxidation processes (AOPs), and the combined process were evaluated and compared for their suitability to treat stabilized cutting-oil wastewater. An efficiency of 90% was obtained from coagulation at an Al2(SO4)3 dosage of 150 mg/L and pH 7. The efficiencies of the AOPs for a 30-minute oxidation time were 10% for acoustic oxidation, 12% for acoustic oxidation with hydrogen peroxide, 76% for the Fenton process, and 92% for the sono-Fenton process. The highest oil removal efficiency among the AOPs required a large amount of chemicals. Therefore, the AOPs were studied as a post-treatment after the conventional separation process. The efficiency was considerable, as the effluent COD met the standard required for industrial wastewater discharge with less chemical and energy consumption.


Keywords: Cutting oily-wastewater, Advance oxidation process, Sono-Fenton, Combined process.

6413 An Improved Model for Prediction of the Effective Thermal Conductivity of Nanofluids

Authors: K. Abbaspoursani, M. Allahyari, M. Rahmani

Abstract:

Thermal conductivity is an important characteristic of a nanofluid in laminar flow heat transfer. This paper presents an improved model for the prediction of the effective thermal conductivity of nanofluids based on dimensionless groups. The model expresses the thermal conductivity of a nanofluid as a function of the thermal conductivities of the solid and the liquid, their volume fractions, and the particle size. The proposed model includes a parameter which accounts for the interfacial shell, Brownian motion, and particle aggregation. The model is validated against experimental results for TiO2-water and Al2O3-water nanofluids.

Keywords: Critical particle size, nanofluid, model, and thermal conductivity.

6412 A New Heuristic Algorithm for the Classical Symmetric Traveling Salesman Problem

Authors: S. B. Liu, K. M. Ng, H. L. Ong

Abstract:

This paper presents a new heuristic algorithm for the classical symmetric traveling salesman problem (TSP). The idea of the algorithm is to cut a TSP tour into overlapping blocks and then improve each block separately. It is conjectured that the chance of improving a good solution by moving a node to a position far away from its original one is small. By doing an intensive search in each block, it is possible to further improve a TSP tour that cannot be improved by other local search methods. To test the performance of the proposed algorithm, computational experiments are carried out on benchmark problem instances. The computational results show that the algorithm proposed in this paper is efficient for solving TSPs.
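The overlapped-block idea can be sketched as slicing the tour into overlapping index windows and intensively improving each window while positions outside it stay fixed; plain 2-opt is used below as a stand-in for the paper's block-improvement search, and the block size and overlap values are illustrative.

```python
# Block-wise tour improvement: 2-opt restricted to overlapping index windows.
import numpy as np

def tour_length(tour, dist):
    return sum(dist[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt_block(tour, dist, start, end):
    """Improve positions tour[start:end] by 2-opt moves; the rest stays untouched."""
    improved = True
    while improved:
        improved = False
        for i in range(start + 1, end - 1):
            for j in range(i + 1, end):
                a, b, c, d = tour[i - 1], tour[i], tour[j], tour[(j + 1) % len(tour)]
                if dist[a, c] + dist[b, d] < dist[a, b] + dist[c, d]:
                    tour[i:j + 1] = tour[i:j + 1][::-1]   # reverse the improving segment
                    improved = True
    return tour

def blockwise_improve(tour, dist, block=40, overlap=10):
    for start in range(0, len(tour) - overlap, block - overlap):
        tour = two_opt_block(tour, dist, start, min(start + block, len(tour)))
    return tour

# Usage on random points: arbitrary starting tour, then block-wise 2-opt.
pts = np.random.default_rng(0).random((200, 2))
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
tour = list(range(200))
print(tour_length(tour, dist))      # length before block improvement
tour = blockwise_improve(tour, dist)
print(tour_length(tour, dist))      # length after block improvement
```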

Keywords: Local search, overlapped neighborhood, traveling salesman problem.
