Search results for: Algorithms decision tree
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3107

857 Modified Data Mining Approach for Defective Diagnosis in Hard Disk Drive Industry

Authors: S. Soommat, S. Patamatamkul, T. Prempridi, M. Sritulyachot, P. Ineure, S. Yimman

Abstract:

As the slider process in the hard disk drive industry becomes more complex, defective diagnosis for yield improvement becomes more complicated and time-consuming. Manufacturing data analysis with a data mining approach is widely used to solve this problem. The existing mining approach, which combines K-means clustering, the machine-oriented Kruskal-Wallis test and the multivariate chart, has been applied to defective diagnosis, but it is still a semiautomatic diagnosis system. This article aims to modify the algorithm so that the existing approach supports automatic decisions. Based on the research framework, the new approach performs the diagnosis automatically and helps engineers find the defective factors about 50% faster than the existing approach.

Keywords: Slider process, Defective diagnosis and Data mining.
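
As an illustration only (not the authors' implementation), the following sketch pairs scikit-learn's KMeans with SciPy's Kruskal-Wallis test on synthetic slider-like data; the measurement model, number of machines and decision rule are assumptions.

```python
# Illustrative sketch (not the paper's implementation): cluster parts into
# nominal/suspect groups with K-means, then use a machine-oriented
# Kruskal-Wallis test to flag machines that drive the suspect cluster.
import numpy as np
from sklearn.cluster import KMeans
from scipy.stats import kruskal

rng = np.random.default_rng(0)
n_parts, n_machines = 600, 5
machine = rng.integers(0, n_machines, n_parts)
# Assumed defect signature: machine 3 shifts the measured parameter upward.
measurement = rng.normal(0.0, 1.0, n_parts) + np.where(machine == 3, 2.0, 0.0)

# Step 1: K-means separates parts into two clusters (nominal vs. suspect).
cluster = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    measurement.reshape(-1, 1))

# Step 2: Kruskal-Wallis across machines on the measured parameter;
# a small p-value suggests a machine-related defective factor.
groups = [measurement[machine == m] for m in range(n_machines)]
stat, p_value = kruskal(*groups)
print(f"Kruskal-Wallis H={stat:.2f}, p={p_value:.3g}")

# Step 3 (assumed automatic decision rule): rank machines by how often
# their parts land in the smaller, "suspect" cluster.
suspect_id = np.argmin(np.bincount(cluster))
rates = [(m, float(np.mean(cluster[machine == m] == suspect_id)))
         for m in range(n_machines)]
print(sorted(rates, key=lambda t: -t[1]))
```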

856 Environmental Issues Related to Nuclear Desalination

Authors: V. Anastasov, I. Khamis

Abstract:

The paper presents an overview of environmental issues that may be expected with nuclear desalination. The analysis of coupling nuclear power with desalination plants indicates that adverse marine impacts can be mitigated with alternative intake designs or cooling systems. The atmospheric impact of desalination may be greatly reduced through the coupling with nuclear power, while maximizing the socio-economic benefit for both processes. The potential for tritium contamination of the desalinated water was reviewed. Experience with the systems and practices related to the radiological quality of the product water shows no examples of cross-contamination. Furthermore, the indicators for the public acceptance of nuclear desalination, as one of the most important sustainability aspects of any such large project, show a positive trend. From the data collected, a conclusion is made that nuclear desalination should be supported by decision-makers.

Keywords: Environmental impacts, nuclear desalination, public acceptance, tritium.

855 Fifth Order Variable Step Block Backward Differentiation Formulae for Solving Stiff ODEs

Authors: S.A.M. Yatim, Z.B. Ibrahim, K.I. Othman, F. Ismail

Abstract:

Implicit block methods based on the backward differentiation formulae (BDF) for the solution of stiff initial value problems (IVPs) using variable step size are derived. We construct variable step size block methods that store all the coefficients of the method, with a simplified strategy for controlling the step size, with the intention of optimizing the performance in terms of precision and computation time. The strategy involves keeping the step size constant, halving it, or increasing it to 1.9 times the previous step size. The decision to change the step size is determined by the local truncation error (LTE). Numerical results are provided to support the enhancement of the method.

Keywords: Backward differentiation formulae, block backward differentiation formulae, stiff ordinary differential equation, variable step size.
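
A minimal sketch of the constant/halve/increase-by-1.9 step-size rule driven by a local truncation error estimate; the tolerance values and the toy LTE sequence below are assumptions for illustration, not the authors' block BDF formulae.

```python
# Illustrative step-size controller (TOL and the toy LTE sequence are
# assumptions); the real method derives the LTE from the block BDF scheme.
def next_step_size(h, lte, tol=1e-6, growth=1.9):
    """Return the step size for the next block given the current LTE."""
    if lte > tol:                 # step rejected: halve the step size
        return 0.5 * h
    if lte < 0.1 * tol:           # error comfortably small: grow by 1.9x
        return growth * h
    return h                      # otherwise keep the step size constant

# Usage example with a fabricated sequence of LTE estimates.
h = 0.01
for lte in [5e-7, 2e-8, 3e-6, 8e-7]:
    h = next_step_size(h, lte)
    print(f"LTE={lte:.1e} -> next h={h:.4g}")
```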

854 A Two Level Load Balancing Approach for Cloud Environment

Authors: Anurag Jain, Rajneesh Kumar

Abstract:

Cloud computing is the outcome of the rapid growth of the Internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is the major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation and task migration. Parameters used to analyze the performance of a load balancing approach are response time, cost, data processing time and throughput. This paper demonstrates a two-level load balancer approach that combines the join-idle-queue and join-shortest-queue approaches. The authors have used the CloudAnalyst simulator to test the proposed two-level load balancer approach. The results are analyzed and compared with existing algorithms and, as observed, the proposed work is one step ahead of existing techniques.

Keywords: Cloud Analyst, Cloud Computing, Join Idle Queue, Join Shortest Queue, Load balancing, Task Scheduling.
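
A minimal sketch of a two-level dispatcher that first applies join-idle-queue and falls back to join-shortest-queue when no server is idle; the server model and task stream are illustrative assumptions, not the CloudAnalyst configuration used by the authors.

```python
# Illustrative two-level dispatcher: level 1 = join-idle-queue (prefer an
# idle server), level 2 = join-shortest-queue when no server is idle.
from collections import deque
import random

class TwoLevelBalancer:
    def __init__(self, n_servers):
        self.queues = [deque() for _ in range(n_servers)]
        self.idle = set(range(n_servers))          # servers with empty queues

    def dispatch(self, task):
        if self.idle:                              # level 1: join idle queue
            server = self.idle.pop()
        else:                                      # level 2: join shortest queue
            server = min(range(len(self.queues)), key=lambda i: len(self.queues[i]))
        self.queues[server].append(task)
        return server

    def complete_one(self, server):
        if self.queues[server]:
            self.queues[server].popleft()
        if not self.queues[server]:
            self.idle.add(server)

# Usage: dispatch a burst of tasks, then let one server finish a task.
lb = TwoLevelBalancer(n_servers=4)
assignments = [lb.dispatch(f"task-{i}") for i in range(10)]
print(assignments)
lb.complete_one(random.choice(assignments))
```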

853 The Conceptual Design Model of an Automated Supermarket

Authors: Sathya Narayanan V., Sidharth P., Sanal Kumar V. R.

Abstract:

The success of any retail business is predisposed by its swift response and its knack for understanding the constraints and requirements of customers. In this paper a conceptual design model of an automated customer-friendly supermarket is proposed. In this model, 10-sided, space-saving, regular-polygon-shaped gravity shelves are designed for goods storage, and effective customer-specific algorithms are built in for quick automatic delivery of the randomly listed goods. The algorithm is developed with two main objectives, viz., delivery time and priority. To meet these objectives, the randomly listed items are reorganized according to the critical path of the robotic arm specific to the identified shop and its layout, and the items are categorized according to the demand, shape, size, similarity and nature of the product for an efficient pick-up, packing and delivery process. We conjecture that the proposed automated supermarket model reduces business operating costs with much customer satisfaction, warranting a win-win situation.

Keywords: Automated Supermarket, Electronic Shopping, Polygon-shaped Rack, Shortest Path Algorithm for Shopping.

852 Performances Comparison of Neural Architectures for On-Line Speed Estimation in Sensorless IM Drives

Authors: K. Sedhuraman, S. Himavathi, A. Muthuramalingam

Abstract:

The performance of a sensorless controlled induction motor drive depends on the accuracy of the estimated speed. Conventional estimation techniques, being mathematically complex, require more execution time, resulting in poor dynamic response. The nonlinear mapping capability and powerful learning algorithms of neural networks provide a promising alternative for on-line speed estimation. The on-line speed estimator requires the NN model to be accurate, simple in design, structurally compact and computationally less complex to ensure faster execution and effective control in real-time implementation. This in turn depends, to a large extent, on the type of neural architecture. This paper investigates three types of neural architectures for on-line speed estimation, and their performance is compared in terms of accuracy, structural compactness, computational complexity and execution time. The suitable neural architecture for on-line speed estimation is identified and the promising results obtained are presented.

Keywords: Sensorless IM drives, rotor speed estimators, artificial neural network, feed-forward architecture, single neuron cascaded architecture.

851 A Frugal Bidding Procedure for Replicating WWW Content

Authors: Samee Ullah Khan, C. Ardil

Abstract:

Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, to certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit since they are driven by different goals such as to minimize their communication costs, latency, etc. In this paper, we will use game theoretical techniques and in particular auctions to identify a bidding mechanism that encapsulates the selfishness of the agents, while having a controlling hand over them. In essence, the proposed game theory based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize the overall performance. A bidding mechanism asks how one can design systems so that agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality, while maintaining fast execution time. The comparisons are recorded against some well known techniques such as greedy, branch and bound, game theoretical auctions and genetic algorithms.

Keywords: Internet, data content replication, static allocation, mechanism design, equilibrium.
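
A toy sketch in the spirit of auction-based allocation: each site bids its estimated saving from hosting a replica and a second-price rule picks the winner; the bid values and payment rule are illustrative assumptions, not the mechanism designed in the paper.

```python
# Toy sealed-bid (second-price) allocation of one object replica among sites.
# The bid model and payment rule are illustrative assumptions only.
def allocate_replica(bids):
    """bids: dict site -> bid (estimated saving from hosting the replica).
    Returns (winning_site, payment) under a second-price rule."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _best = ranked[0]
    payment = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, payment

bids = {"site_A": 12.0, "site_B": 9.5, "site_C": 14.2}   # hypothetical savings
winner, price = allocate_replica(bids)
print(f"object replicated at {winner}, charged {price}")
```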

850 Face Recognition Using Discrete Orthogonal Hahn Moments

Authors: Fatima Akhmedova, Simon Liao

Abstract:

One of the most critical decision points in the design of a face recognition system is the choice of an appropriate face representation. Effective feature descriptors are expected to convey sufficient, invariant and non-redundant facial information. In this work we propose a set of Hahn moments as a new approach to feature description. Hahn moments have been widely used in image analysis due to their invariance, non-redundancy and ability to extract features both globally and locally. To assess the applicability of Hahn moments to face recognition, we conduct two experiments on the Olivetti Research Laboratory (ORL) database and the University of Notre Dame (UND) X1 biometric collection. Fusion of the global features with the features from local facial regions is used as the input to a conventional k-NN classifier. The method reaches an accuracy of 93% of correctly recognized subjects for the ORL database and 94% for the UND database.

Keywords: Face Recognition, Hahn moments, Recognition-by-parts, Time-lapse.

849 Low Power and Less Area Architecture for Integer Motion Estimation

Authors: C. Hisham, K. Komal, Amit K. Mishra

Abstract:

The full search block matching algorithm is widely used for hardware implementation of motion estimators in video compression algorithms. In this paper we propose a new architecture, which consists of a 2D parallel processing unit and a 1D unit, both working in parallel. The proposed architecture reduces both data access power and computational power, which are the main causes of power consumption in integer motion estimation. It also completes the operations with nearly the same number of clock cycles as a 2D systolic array architecture. In this work the sum of absolute differences (SAD), the most repeated operation in block matching, is calculated in two steps. The first step is to calculate the SAD for alternate rows using the 2D parallel unit. If the SAD calculated by the parallel unit is less than the stored minimum SAD, the SAD of the remaining rows is calculated by the 1D unit. Early termination, which stops avoidable computations, has been achieved with the help of the alternate rows method proposed in this paper and by finding a low initial SAD value based on motion vector prediction. Data reuse has been applied to the reference blocks in the same search area, which significantly reduces memory accesses.

Keywords: Sum of absolute difference, high speed DSP.
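
A minimal NumPy sketch of the two-step SAD idea: the SAD of alternate rows is computed first and the remaining rows are skipped whenever that partial sum already exceeds the best SAD found so far; the block size, search range and synthetic frames are assumptions.

```python
# Illustrative two-step SAD with early termination on alternate rows.
# Block size, search range and the random frames are assumptions.
import numpy as np

def two_step_sad(cur_block, ref, top, left, best_sad):
    """Partial SAD on even rows first; finish only if still promising."""
    cand = ref[top:top + cur_block.shape[0], left:left + cur_block.shape[1]]
    partial = np.abs(cur_block[::2] - cand[::2]).sum()   # step 1: alternate rows
    if partial >= best_sad:                              # early termination
        return None
    return partial + np.abs(cur_block[1::2] - cand[1::2]).sum()  # step 2

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (64, 64)).astype(np.int64)
cur = ref[20:36, 20:36] + rng.integers(-3, 4, (16, 16))  # noisy shifted block

best_sad, best_mv = np.inf, None
for dy in range(0, 49):
    for dx in range(0, 49):
        sad = two_step_sad(cur, ref, dy, dx, best_sad)
        if sad is not None and sad < best_sad:
            best_sad, best_mv = sad, (dy, dx)
print("best motion vector:", best_mv, "SAD:", best_sad)
```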

848 Development of Fake News Model Using Machine Learning through Natural Language Processing

Authors: Sajjad Ahmed, Knut Hinkelmann, Flavio Corradini

Abstract:

Fake news detection research is still at an early stage, as this is a relatively new phenomenon that has recently raised interest in society. Machine learning helps to solve complex problems and to build AI systems nowadays, especially in cases where we have tacit knowledge or knowledge that is not known. We used machine learning algorithms for the identification of fake news; we applied three classifiers: Passive Aggressive, Naïve Bayes, and Support Vector Machine. Simple classification is not completely correct in fake news detection because classification methods are not specialized for fake news. With the integration of machine learning and text-based processing, we can detect fake news and build classifiers that can classify the news data. Text classification mainly focuses on extracting various features of text and then incorporating those features into classification. The big challenge in this area is the lack of an efficient way to differentiate between fake and non-fake news due to the unavailability of corpora. We applied three different machine learning classifiers on two publicly available datasets. Experimental analysis based on the existing dataset indicates very encouraging and improved performance.

Keywords: Fake news detection, types of fake news, machine learning, natural language processing, classification techniques.
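
A minimal scikit-learn sketch of the three classifiers named in the abstract applied to TF-IDF features; the tiny inline texts and labels are placeholders, not the publicly available datasets used by the authors.

```python
# Illustrative comparison of the three classifiers on TF-IDF features.
# The toy texts/labels below are placeholders, not the paper's datasets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import PassiveAggressiveClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

texts = ["aliens endorse candidate", "city council passes budget",
         "miracle cure hidden by doctors", "team wins league final",
         "celebrity secretly a robot", "new bridge opens downtown"] * 20
labels = [1, 0, 1, 0, 1, 0] * 20          # 1 = fake, 0 = real (toy labels)

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.25, random_state=0, stratify=labels)

vec = TfidfVectorizer()
Xtr, Xte = vec.fit_transform(X_train), vec.transform(X_test)

for clf in (PassiveAggressiveClassifier(max_iter=1000, random_state=0),
            MultinomialNB(),
            LinearSVC()):
    clf.fit(Xtr, y_train)
    acc = accuracy_score(y_test, clf.predict(Xte))
    print(f"{clf.__class__.__name__}: accuracy={acc:.2f}")
```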

847 In Search of Bauman’s Moral Impulse in Shadow Factories of China

Authors: Akram Hatami, Naser Firoozi, Vesa Puhakka

Abstract:

Ethics and responsibility are rapidly becoming a distinguishing feature of organizations. In this paper, we analyze ethics and responsibility in shadow factories in China. We engage with Bauman’s moral impulse perspective because his idea can contextualize ethics and responsibility. Moral impulse is a feeling of a selfless, infinite and unconditional responsibility towards, and care for, Others. We analyze a case study from a secondary data source because, for such a critical phenomenon as business ethics in shadow factories, collecting primary data is difficult, since they are unregistered factories. We argue that there has not been enough attention given to ethics and responsibility in shadow factories in China. Our main goal is to demonstrate that considering the Other, most importantly the employees, in ethical decision-making is a simple instruction that goes beyond the narrow version of ethics defined by ethical codes and rules.

Keywords: Moral impulse, responsibility, shadow factories, the other.

846 An Optimal Bayesian Maintenance Policy for a Partially Observable System Subject to Two Failure Modes

Authors: Akram Khaleghei Ghosheh Balagh, Viliam Makis, Leila Jafari

Abstract:

In this paper, we present a new maintenance model for a partially observable system subject to two failure modes, namely a catastrophic failure and a failure due to the system degradation. The system is subject to condition monitoring and the degradation process is described by a hidden Markov model. A cost-optimal Bayesian control policy is developed for maintaining the system. The control problem is formulated in the semi-Markov decision process framework. An effective computational algorithm is developed, illustrated by a numerical example.

Keywords: Partially observable system, hidden Markov model, competing risks, multivariate Bayesian control.

845 Modeling and Dynamics Analysis for Intelligent Skid-Steering Vehicle Based on Trucksim-Simulink

Authors: Yansong Zhang, Xueyuan Li, Junjie Zhou, Xufeng Yin, Shihua Yuan, Shuxian Liu

Abstract:

Aiming at the verification of control algorithms for skid-steering vehicles, a simulation model of a 6×6 electric skid-steering unmanned vehicle was established based on Trucksim and Simulink. The original transmission and steering mechanisms of Trucksim were removed, and the electric skid-steering model and a closed-loop controller for vehicle speed and yaw rate were built in Simulink. The simulation results are compared with those obtained from theoretical formulas. The results show that the predicted tire mechanics and vehicle kinematics of the Trucksim-Simulink simulation model are close to the theoretical results. Therefore, it can be used as an effective approach to study the dynamic performance and control algorithms of skid-steering vehicles. In this paper, a motion control method based on feedforward control is also designed. The simulation results show that the feedforward control strategy makes the vehicle follow the target yaw rate more quickly and accurately, which gives the vehicle more maneuverability.

Keywords: Skid-steering, Trucksim-Simulink, feedforward control, dynamics.
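
A toy sketch of feedforward-plus-feedback yaw-rate tracking on an assumed first-order yaw model; the plant parameters and gains are illustrative and do not reproduce the Trucksim-Simulink vehicle model.

```python
# Toy yaw-rate tracking with feedforward + proportional feedback on a
# first-order yaw model (assumed plant and gains, for illustration only).
dt, T = 0.01, 3.0
tau, K = 0.4, 1.2            # assumed yaw dynamics: tau*r_dot + r = K*u
kp = 2.0                      # feedback gain

def simulate(use_feedforward):
    r, errors = 0.0, []
    for k in range(int(T / dt)):
        r_ref = 0.3 if k * dt > 0.5 else 0.0       # step in target yaw rate
        u_ff = r_ref / K if use_feedforward else 0.0
        u = u_ff + kp * (r_ref - r)                # feedforward + feedback
        r += dt * (-r + K * u) / tau               # Euler-integrate the plant
        errors.append(abs(r_ref - r))
    return sum(errors) / len(errors)               # mean tracking error

print("mean error, feedback only   :", round(simulate(False), 4))
print("mean error, with feedforward:", round(simulate(True), 4))
```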

844 Breast Skin-Line Estimation and Breast Segmentation in Mammograms using Fast-Marching Method

Authors: Roshan Dharshana Yapa, Koichi Harada

Abstract:

Breast skin-line estimation and breast segmentation are important pre-processing steps in mammogram image processing and computer-aided diagnosis of breast cancer. Limiting the area to be processed to a specific target region in an image increases the accuracy and efficiency of processing algorithms. In this paper we present a new algorithm for skin-line estimation and breast segmentation using the fast marching method. Fast marching is a partial-differential-equation-based numerical technique for tracking the evolution of interfaces. We have introduced some modifications to the traditional fast marching method, specifically to improve the accuracy of skin-line estimation and breast tissue segmentation. The proposed modifications ensure that the evolving front stops near the desired boundary. We have evaluated the performance of the algorithm using 100 mammogram images taken from the mini-MIAS database. The results obtained from the experimental evaluation indicate that this algorithm covers 98.6% of the ground truth breast region and that the accuracy of the segmentation is 99.1%. The algorithm is also capable of partially extracting the nipple when it is visible in the profile.

Keywords: Mammogram, fast marching method, mathematical morphology.

843 Coordination on Agrifood Supply Chain

Authors: Martha Liliana Reina Usuga, Wilson Adarme Jaimes, Oscar Eduardo Suarez

Abstract:

A coordinated supply chain presents major challenges for the different actors involved in it, because each agent responds to individual interests. The paper presents a framework based on the reviewed literature regarding the system's decision structure and the nature of demand. It then characterizes an agri-food supply chain in the Central Region of Colombia, which corresponds to a decentralized distribution system with stochastic demand. Finally, the paper recommends coordinating the chain based on shared information and on mechanisms for each agent, such as a vendor-managed inventory (VMI) strategy for the farmer-buyer relationship, an information system for farmers, and contracts for transportation service providers.

Keywords: Agri-food supply chain, Coordination mechanisms, Decentralized distribution system, Supply chain coordination.

842 A Real-Time Rendering based on Efficient Updating of Static Objects Buffer

Authors: Youngjae Chun, Kyoungsu Oh

Abstract:

Real-time 3D applications have to guarantee interactive rendering speed. There is a restriction on the number of polygons that can be rendered due to the performance of the graphics hardware or the graphics algorithms. Generally, rendering performance increases drastically when handling only the dynamic 3D models, which are much fewer than the static ones. Since the shapes and colors of static objects don't change when the viewing direction is fixed, this information can be reused. We render huge numbers of polygons that cannot be handled by conventional rendering techniques in real time by using a static object image and merging it with the rendering result of the dynamic objects. Performance decreases as a consequence of updating the static object image, which includes removing a static object that starts to move and re-rendering the other static objects overlapped by the moving ones. Based on the visibility of the object beginning to move, we can skip the updating process. As a result, we enhance rendering performance and reduce differences in rendering speed between frames. The proposed method renders a total of 200,000,000 polygons, of which 500,000 are dynamic and the rest are static, at about 100 frames per second.

Keywords: Occlusion query, Real-time rendering, Temporal coherence.
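
A minimal NumPy sketch of merging a cached static-object image with the current dynamic render using per-pixel depth; the synthetic buffers and the compositing rule stand in for real render targets and are assumptions, not the paper's occlusion-query pipeline.

```python
# Illustrative depth-based merge of a cached static image with the current
# dynamic render. The synthetic buffers stand in for real render targets.
import numpy as np

H, W = 4, 6
static_color = np.full((H, W), 100, dtype=np.uint8)   # cached static objects
static_depth = np.full((H, W), 0.8)                   # their depth buffer

dynamic_color = np.zeros((H, W), dtype=np.uint8)      # this frame's dynamic pass
dynamic_depth = np.full((H, W), np.inf)               # inf = nothing drawn
dynamic_color[1:3, 2:4] = 255                         # a moving object
dynamic_depth[1:3, 2:4] = 0.5                         # closer than static scene

# Composite: take the dynamic pixel wherever it is closer to the camera.
closer = dynamic_depth < static_depth
frame = np.where(closer, dynamic_color, static_color)
print(frame)
```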

841 Distributed Estimation Using an Improved Incremental Distributed LMS Algorithm

Authors: Amir Rastegarnia, Mohammad Ali Tinati, Azam Khalili

Abstract:

In this paper we consider the problem of distributed adaptive estimation in wireless sensor networks for two different observation noise conditions. In the first case, we assume that there are some sensors with high observation noise variance (noisy sensors) in the network. In the second case, different observation noise variances are assumed among the sensors, which is closer to a real scenario. In both cases, an initial estimate of each sensor's observation noise is obtained. For the first case, we show that when there are such sensors in the network, the performance of conventional distributed adaptive estimation algorithms such as the incremental distributed least mean square (IDLMS) algorithm drastically decreases. In addition, detecting and ignoring these sensors leads to better estimation performance. In the next step, we propose a simple algorithm to detect these noisy sensors and modify the IDLMS algorithm to deal with them. For the second case, we propose a new algorithm in which the step-size parameter is adjusted for each sensor according to its observation noise variance. As the simulation results show, the proposed methods outperform the IDLMS algorithm under the same conditions.

Keywords: Distributed estimation, sensor networks, adaptive filter, IDLMS.
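
A minimal sketch of an incremental LMS pass around a ring of sensors in which each node's step size is scaled down with its pre-estimated observation-noise variance; the network size, signal model and scaling rule are assumptions, not the authors' exact algorithm.

```python
# Illustrative incremental LMS over a ring of sensors; each node uses a
# step size inversely related to its estimated noise variance (assumption).
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_taps, n_iters = 8, 4, 300
w_true = rng.normal(size=n_taps)
noise_var = np.array([0.01] * 6 + [1.0, 2.0])       # two noisy sensors
mu = 0.05 / (1.0 + noise_var / noise_var.min())     # per-sensor step size

w = np.zeros(n_taps)
for _ in range(n_iters):
    for k in range(n_sensors):                      # incremental (ring) pass
        u = rng.normal(size=n_taps)                 # regressor at sensor k
        d = u @ w_true + rng.normal(scale=np.sqrt(noise_var[k]))
        e = d - u @ w
        w = w + mu[k] * e * u                       # local LMS update
print("estimation error (dB):", 10 * np.log10(np.sum((w - w_true) ** 2)))
```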

840 Using Simulation for Prediction of Units Movements in Case of Communication Failure

Authors: J. Hodicky, P. Frantis

Abstract:

The Command and Control (C2) system and its interface, the Common Operational Picture (COP), are the main means that support the commander in the decision-making process. The COP contains information about friendly and enemy unit positions. Friendly positions are gathered via the tactical network. In the case of a tactical network failure, the information about the units is not available. A tactical simulator can be used as a tool capable of predicting the movements of units with respect to terrain features. The article deals with an experiment based on the Czech C2 system, which in the case of lost connectivity is fed by the VR Forces simulator. The article analyzes the maximum time interval in which the position created by the simulator is still usable and truthful for the commander in real time.

Keywords: command and control system, movement prediction, simulation

839 Order Optimization of a Telecommunication Distribution Center through Service Lead Time

Authors: Tamás Hartványi, Ferenc Tóth

Abstract:

The performance of a European telecommunication distribution center is measured by service lead time and quality. The operation model is CTO (customized to order), namely a high-mix customization of telecommunication network equipment and parts. The CTO operation comprises material receiving, warehousing, and network and server assembly, configured to order based on customer specifications. The variety of products and orders does not support a mass production structure. One of the success factors in satisfying customers is a proper aggregate planning method for the operation, in order to have optimized human resources and highly efficient asset utilization. The research investigates several methods to find a proper way to build an order book simulation, where the practical optimization problem may contain thousands of variables, and the simulation running times of the developed algorithms were taken into account with high importance. Two operations research models were developed: in the first, customer demand is given in orders and there is no changeover time; in the second, customer demands are given for product types and the changeover time is constant.

Keywords: CTO, aggregated planning, demand simulation, changeover time.

838 A Fast Replica Placement Methodology for Large-scale Distributed Computing Systems

Authors: Samee Ullah Khan, C. Ardil

Abstract:

Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, to certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit since they are driven by different goals such as to minimize their communication costs, latency, etc. In this paper, we will use game theoretical techniques and in particular auctions to identify a bidding mechanism that encapsulates the selfishness of the agents, while having a controlling hand over them. In essence, the proposed game theory based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize the overall performance. A bidding mechanism asks how one can design systems so that agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality, while maintaining fast execution time. The comparisons are recorded against some well known techniques such as greedy, branch and bound, game theoretical auctions and genetic algorithms.

Keywords: Data replication, auctions, static allocation, pricing.

837 Performance Evaluation of a Minimum Mean Square Error-Based Physical Sidelink Share Channel Receiver under Fading Channel

Authors: Yang Fu, Jaime Rodrigo Navarro, Jose F. Monserrat, Faiza Bouchmal, Oscar Carrasco Quilis

Abstract:

Cellular Vehicle to Everything (C-V2X) is considered a promising solution for future autonomous driving. From Release 16 to Release 17, the Third Generation Partnership Project (3GPP) has introduced the definitions and services for 5G New Radio (NR) V2X. Since establishing a simulator for C-V2X communications is an essential preliminary step to achieve reliable and stable communication links, this paper proposes a complete framework of a link-level simulator based on the 3GPP specifications for the Physical Sidelink Share Channel (PSSCH) of the 5G NR Physical Layer (PHY). In this framework, several algorithms in the receiver part, i.e., sliding window in channel estimation and Minimum Mean Square Error (MMSE)-based equalization, are developed. Finally, the performance of the developed PSSCH receiver is validated through extensive simulations under different assumptions.

Keywords: C-V2X, 5G NR, PSSCH, link-level simulator, channel estimation, MMSE equalization.
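
A minimal sketch of per-subcarrier MMSE equalization of the kind named in the abstract, on a synthetic Rayleigh channel with QPSK symbols; the channel, noise level and symbol mapping are assumptions, and no PSSCH frame structure is modeled.

```python
# Illustrative per-subcarrier MMSE equalization: x_hat = H* y / (|H|^2 + N0).
# Channel, noise level and QPSK symbols are assumptions; no PSSCH structure.
import numpy as np

rng = np.random.default_rng(0)
n_sc, n0 = 64, 0.1                                   # subcarriers, noise power
bits = rng.integers(0, 2, (n_sc, 2))
x = ((1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])) / np.sqrt(2)  # QPSK

h = (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc)) / np.sqrt(2)  # Rayleigh
y = h * x + np.sqrt(n0 / 2) * (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc))

x_mmse = np.conj(h) * y / (np.abs(h) ** 2 + n0)      # MMSE equalizer
x_zf = y / h                                         # zero-forcing, for contrast

print("MMSE MSE:", np.mean(np.abs(x_mmse - x) ** 2))
print("ZF   MSE:", np.mean(np.abs(x_zf - x) ** 2))
```

The zero-forcing line is included only for contrast: on deeply faded subcarriers the MMSE weights shrink the estimate toward zero instead of amplifying the noise.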

836 Partially Knowing of Least Support Orthogonal Matching Pursuit (PKLS-OMP) for Recovering Signal

Authors: Israa Sh. Tawfic, Sema Koc Kayhan

Abstract:

Given a large sparse signal, the aim is to reconstruct the signal precisely and accurately from as few measurements as possible. Although this seems possible in theory, the difficulty lies in building an algorithm that achieves both accuracy and efficiency in the reconstruction. This paper proposes a new, proven method to reconstruct sparse signals that takes the Least Support Orthogonal Matching Pursuit (LS-OMP) method and merges it with the theory of Partial Knowing Support (PSK), giving a new method called Partially Knowing of Least Support Orthogonal Matching Pursuit (PKLS-OMP). The new methods depend on a greedy algorithm to compute the support, which depends on the number of iterations; to make it faster, PKLS-OMP adds the idea of partially known support to the algorithm. The method recovers the original signal efficiently, simply, and accurately if the sampling matrix satisfies the Restricted Isometry Property (RIP). Simulation results also show that it outperforms many algorithms, especially for compressible signals.

Keywords: Compressed sensing, Least Support Orthogonal Matching Pursuit, Partial Knowing Support, Restricted Isometry Property, signal reconstruction.
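
A minimal NumPy sketch of orthogonal matching pursuit seeded with a partially known support, in the spirit of PKLS-OMP; the problem sizes, the known-support fraction and the stopping rule are assumptions, not the authors' exact procedure.

```python
# Illustrative OMP seeded with a partially known support (assumed sizes and
# stopping rule; this is not the exact PKLS-OMP procedure from the paper).
import numpy as np

def omp_partial_support(A, y, k, known_support):
    support = list(known_support)                    # start from known indices
    while len(support) < k:
        x_s = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
        r = y - A[:, support] @ x_s                  # residual
        corr = np.abs(A.T @ r)
        corr[support] = -np.inf                      # do not reselect indices
        support.append(int(np.argmax(corr)))
    x = np.zeros(A.shape[1])
    x[support] = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
    return x

rng = np.random.default_rng(0)
m, n, k = 40, 120, 6
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
idx = rng.choice(n, k, replace=False)
x_true[idx] = rng.normal(size=k)
y = A @ x_true

x_hat = omp_partial_support(A, y, k, known_support=idx[:2])  # 2 indices known
print("recovery error:", np.linalg.norm(x_hat - x_true))
```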

835 Implementation of a Web-Based Wireless ECG Measuring and Recording System

Authors: Onder Yakut, Serdar Solak, Emine Dogru Bolat

Abstract:

Measuring the electrocardiogram (ECG) signal is an essential process for the diagnosis of heart diseases. The ECG signal carries information about how well the heart performs its functions. In medical diagnosis and treatment systems, decision support systems processing the ECG signal are being developed for the use of clinicians during medical examination. In this study, a modular wireless ECG (WECG) measuring and recording system using a single board computer and the e-Health sensor platform is developed. In this modular system, the ECG signal is first taken from the body surface by the electrodes, then filtered and converted to digital form. It is then recorded to the health database using Wi-Fi communication technology. Real-time access to the ECG data is provided through the internet via the developed web interface.

Keywords: ECG, e-Health sensor shield, Raspberry Pi, Wi-Fi technology.

834 A Structural Support Vector Machine Approach for Biometric Recognition

Authors: Vishal Awasthi, Atul Kumar Agnihotri

Abstract:

The face is a non-intrusive, strong biometric for the identification of original and dummy faces produced by different artificial means. Face recognition is extremely important in the contexts of computer vision, psychology, surveillance, pattern recognition, neural networks and content-based video processing. The availability of a widespread face database is crucial to test the performance of these face recognition algorithms. The openly available face databases include face images with a wide range of poses, illumination, gestures and face occlusions, but there is no dummy face database accessible in the public domain. This paper presents a face detection algorithm based on image segmentation in terms of distance from a fixed point and on template matching methods. The proposed work uses the most appropriate number of nodal points, resulting in the most appropriate outcomes in terms of face recognition and detection. The time taken to identify and extract distinctive facial features is improved in the range of 90 to 110 seconds, with an efficiency increment of 3%.

Keywords: Face recognition, Principal Component Analysis, PCA, Linear Discriminant Analysis, LDA, Improved Support Vector Machine, iSVM, elastic bunch mapping technique.

833 Quick Sequential Search Algorithm Used to Decode High-Frequency Matrices

Authors: Mohammed M. Siddeq, Mohammed H. Rasheed, Omar M. Salih, Marcos A. Rodrigues

Abstract:

This research proposes a data encoding and decoding method based on the Matrix Minimization algorithm. The algorithm is applied to high-frequency coefficients for compression/encoding. It starts by converting every three coefficients into a single value; this is accomplished using three different keys. The decoding/decompression uses a search method called the Quick Sequential Search (QSS) Decoding Algorithm, presented in this research, which is based on sequential search to recover the exact coefficients. In the next step, the decoded data are saved in an auxiliary array. The basic idea behind the auxiliary array is to save all possible decoded coefficients; this is because another algorithm, such as conventional sequential search, could retrieve encoded/compressed data independently from the proposed algorithm. The experimental results show that our proposed decoding algorithm retrieves the original data faster than conventional sequential search algorithms.

Keywords: Matrix Minimization Algorithm, Decoding Sequential Search Algorithm, image compression, Discrete Cosine Transform, Discrete Wavelet Transform.
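
A toy version of the encode-three-coefficients-with-three-keys idea and a sequential-search decoder over a bounded coefficient range; the key values and coefficient range are assumptions chosen so the mapping is unique, not the paper's Matrix Minimization or QSS algorithm.

```python
# Toy version of the encode/decode idea: three coefficients are folded into
# one value with three keys, and decoding searches sequentially for the
# triple that reproduces that value. Keys and ranges are assumptions.
from itertools import product

K1, K2, K3 = 400, 20, 1               # assumed keys; chosen so triples map uniquely
COEFF_RANGE = range(-9, 10)           # assumed bounded high-frequency coefficients

def encode(a, b, c):
    return K1 * a + K2 * b + K3 * c

def decode_sequential(value):
    """Sequential search over candidate triples until the value is reproduced."""
    for a, b, c in product(COEFF_RANGE, repeat=3):
        if encode(a, b, c) == value:
            return a, b, c
    return None

original = (3, -7, 5)
v = encode(*original)
print("encoded:", v, "-> decoded:", decode_sequential(v))
```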

832 An Optimization Model of CMMI-Based Software Project Risk Response Planning

Authors: Chun-guang Pan, Ying-wu Chen

Abstract:

Risk response planning is important for software project risk management (SPRM). In CMMI, risk management is placed at the third capability maturity level, which provides a framework for software project risk identification, assessment, planning and control. However, CMMI-based SPRM currently lacks quantitative supporting tools, especially during the process of implementing software project risk planning. In this paper, an economic optimization model for selecting risk reduction actions in the software project risk response planning phase is presented. Furthermore, an example taken from the Chinese software industry is used to verify the application of this method. The research provides a risk decision method for project risk managers that can be used in the implementation of CMMI-based SPRM.

Keywords: Software project, risk management, CMMI, risk response planning.

831 Interactive Fuzzy Multi-objective Programming in Land Re-organisational Planning for Sustainable Rural Development

Authors: Bijaya Krushna Mangaraj, Deepak Kumar Das

Abstract:

Sustainability in a rural production system can only be achieved if it can suitably satisfy local requirements as well as outside demand as times change. With the increased pressure from the food sector in a globalised world, the agrarian economy needs to re-organise its cultivable land system to be compatible with new management practices as well as the multiple needs of various stakeholders and the changing resource scenario. An attempt has been made to transform this problem into a multi-objective decision-making problem considering various objectives, resource constraints and conditional constraints. An interactive fuzzy multi-objective programming approach has been used for this purpose, with a case study in the Indian context to demonstrate the validity of the method.

Keywords: Land re-organisation, Crop planning, Multi-objective Decision-Making, Fuzzy Goal Programming.

830 Gabriel-constrained Parametric Surface Triangulation

Authors: Oscar E. Ruiz, Carlos Cadavid, Juan G. Lalinde, Ricardo Serrano, Guillermo Peris-Fajarnes

Abstract:

The Boundary Representation of a 3D manifold contains FACES (connected subsets of a parametric surface S: R² → R³). In many science and engineering applications it is cumbersome and algebraically difficult to deal with the polynomial set and constraints (LOOPs) representing the FACE. For this reason, a Piecewise Linear (PL) approximation of the FACE is needed, which is usually represented in terms of triangles (i.e. 2-simplices). Solving the problem of FACE triangulation requires producing quality triangles which are: (i) independent of the arguments of S, (ii) sensitive to the local curvatures, (iii) compliant with the boundaries of the FACE, and (iv) topologically compatible with the triangles of the neighboring FACEs. In the existing literature there are no guarantees for point (iii). This article contributes to the topic of triangulations conforming to the boundaries of the FACE by applying the concept of the parameter-independent Gabriel complex, which improves the correctness of the triangulation regarding aspects (iii) and (iv). In addition, the article applies the geometric concept of the tangent ball to a surface at a point to address points (i) and (ii). Additional research is needed in algorithms that (i) take advantage of the concepts presented in the proposed heuristic algorithm and (ii) can be proved correct.

Keywords: surface triangulation, conforming triangulation, surface sampling, Gabriel complex.
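
A minimal NumPy sketch of the Gabriel test underlying such complexes: an edge pq is kept only if no other sample lies strictly inside the ball having pq as its diameter; the random planar point set is purely illustrative.

```python
# Illustrative Gabriel test: edge (p, q) is Gabriel if no other sample lies
# strictly inside the ball whose diameter is the segment pq.
import numpy as np

def is_gabriel_edge(points, i, j, eps=1e-12):
    p, q = points[i], points[j]
    center = 0.5 * (p + q)
    radius2 = 0.25 * np.sum((p - q) ** 2)
    for k, r in enumerate(points):
        if k in (i, j):
            continue
        if np.sum((r - center) ** 2) < radius2 - eps:
            return False                      # some sample intrudes into the ball
    return True

rng = np.random.default_rng(0)
pts = rng.random((12, 2))                     # random planar samples
gabriel_edges = [(i, j) for i in range(len(pts)) for j in range(i + 1, len(pts))
                 if is_gabriel_edge(pts, i, j)]
print(f"{len(gabriel_edges)} Gabriel edges out of {len(pts) * (len(pts) - 1) // 2} pairs")
```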

829 A Comprehensive Key Performance Indicators Dashboard for Emergency Medical Services

Authors: G. Feletti, D. Tedesco, P. Trucco

Abstract:

The present study aims to develop a dashboard of Key Performance Indicators (KPI) to enhance information and predictive capabilities in Emergency Medical Services (EMS) systems, supporting both operational and strategic decisions of different actors. The employed research methodology consists of a first phase of review of the technical-scientific literature concerning the indicators currently in use for the performance measurement of EMS. It emerges that current studies focus on two distinct areas and independent objectives: the ambulance service, a fundamental component of pre-hospital health treatment, and the patient care in the Emergency Department (ED). Conversely, the perspective proposed by this study is to consider an integrated view of the ambulance service process and the ED process, both essential to ensure high quality of care and patient safety. Thus, the proposal covers the end-to-end healthcare service process and, as such, allows considering the interconnection between the two EMS processes, the pre-hospital and hospital ones, connected by the assignment of the patient to a specific ED. In this way, it is possible to optimize the entire patient management. Therefore, attention is paid even to EMS aspects that in current literature tend to be neglected or underestimated. In particular, the integration of the two processes enables to evaluate the advantage of an ED selection decision having visibility on EDs’ saturation status and therefore considering, besides the distance, the available resources and the expected waiting times. Starting from a critical review of the KPIs proposed in extant literature, the design of the dashboard was carried out: the high number of analyzed KPIs was reduced by first eliminating the ones not in line with the aim of the study and then the ones supporting a similar functionality. The KPIs finally selected were tested on a realistic dataset, which led us to exclude additional indicators due to unavailability of data required for their computation. The final dashboard, which was discussed and validated by experts in the field, includes a variety of KPIs able to support operational and planning decisions, early warning, and citizens’ awareness of ED accessibility in real time. The association of each KPI to the EMS phase it refers to enabled the design of a well-balanced dashboard, covering both efficiency and effectiveness performance objectives of the entire EMS process. Indeed, just the initial phases related to the interconnection between ambulance service and patient care are covered by traditional KPIs. Future developments could be directed to building a hierarchical dashboard, composed of a high-level minimal set of KPIs for measuring the basic performance of the EMS system, at an aggregate level, and lower levels of KPIs that bring additional and more detailed information on specific performance dimensions or EMS phases.

Keywords: Emergency Medical Services, Key Performance Indicators, Dashboard, Decision Support.

828 Adaptive Motion Estimator Based on Variable Block Size Scheme

Authors: S. Dhahri, A. Zitouni, H. Chaouch, R. Tourki

Abstract:

This paper presents an adaptive motion estimator that can be dynamically reconfigured with the best algorithm depending on the variation in the nature of the video during the lifetime of a running application. The 4 Step Search (4SS) and the Gradient Search (GS) algorithms are integrated in the estimator in order to be used for rapid and slow video sequences, respectively. The Full Search Block Matching (FSBM) algorithm has also been integrated in order to be used for video sequences that are not real-time oriented. In order to efficiently reduce the computational cost while achieving better visual quality with low power cost, the proposed motion estimator is based on a Variable Block Size (VBS) scheme that uses only the 16x16, 16x8, 8x16 and 8x8 modes. Experimental results show that the adaptive motion estimator achieves better results in terms of Peak Signal to Noise Ratio (PSNR), computational cost, FPGA occupied area, and dissipated power relative to the most popular variable block size schemes presented in the literature.

Keywords: H.264, Configurable Motion Estimator, Variable Block Size, PSNR, Dissipated power.
