Search results for: Data Management in Cloud
7617 A Novel Compression Algorithm for Electrocardiogram Signals based on Wavelet Transform and SPIHT
Authors: Sana Ktata, Kaïs Ouni, Noureddine Ellouze
Abstract:
An electrocardiogram (ECG) data compression algorithm is needed that reduces the amount of data to be transmitted, stored and analyzed without losing the clinical information content. A wavelet ECG data codec based on the Set Partitioning In Hierarchical Trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm has achieved notable success in still image coding. We modified the algorithm for the one-dimensional (1-D) case and applied it to the compression of ECG data. This compression method achieves a small percent root mean square difference (PRD) and a high compression ratio with low implementation complexity. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. Compression ratios of up to 48:1 for ECG signals lead to acceptable results for visual inspection.
Keywords: Discrete Wavelet Transform, ECG compression, SPIHT.
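As a rough illustration of the two figures of merit quoted in this abstract, the sketch below computes the percent root mean square difference (PRD) and the compression ratio from an original and a reconstructed record; it uses generic textbook definitions (the un-normalised PRD variant is an assumption) and is not the authors' codec.

```python
import numpy as np

def prd(original, reconstructed):
    """Percent root-mean-square difference between an original and a reconstructed ECG record."""
    original = np.asarray(original, dtype=float)
    reconstructed = np.asarray(reconstructed, dtype=float)
    return 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2) / np.sum(original ** 2))

def compression_ratio(original_bits, compressed_bits):
    """Ratio of bits in the raw record to bits in the coded bitstream (e.g. 48:1)."""
    return original_bits / compressed_bits
```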
7616 A Software Tool Design for Cerebral Infarction of MR Images
Authors: Kyoung-Jong Park, Woong-Gi Jeon, Hee-Cheol Kim, Dong-Eog Kim, Heung-Kook Choi
Abstract:
A brain MR imaging-based clinical research and analysis system was built, targeting the development of a large-scale dataset. The general clinical data available were used for building the large-scale data. A region-growing algorithm was used to select the lesion ROI during the registration step, and the Mesh-warp algorithm was implemented for matching. Matching errors were corrected individually for accuracy. In addition, large ROI research data can be accumulated using our compression method, which suggests correct decision criteria for the research results. The experimental groups were age, sex, MR type, patient ID and smoking, which can easily be queried. The result data were visualized as overlapped images using a color table and analyzed with a statistical package. The utilization of this system was evaluated on chronic ischemic damage in patients with acute cerebral infarction. The location associated with the neurologic disability index was in the central portion facing the lateral ventricle, where the corona radiata was found. Finally, system reliability was measured by both inter-user and intra-user registration correlation.
Keywords: Software tool design, Cerebral infarction, Brain MR image, Registration
7615 Factors Affecting General Practitioners’ Transfer of Specialized Self-Care Knowledge to Patients
Authors: Weidong Xia, Malgorzata Kolotylo, Xuan Tan
Abstract:
This study examines the key factors that influence general practitioners’ learning and transfer of specialized arthritis knowledge and self-care techniques to patients during normal patient visits. Drawing on the theory of planned behavior and using matched survey data collected from general practitioners before and after training sessions provided by specialized orthopedic physicians, the study suggests that the general practitioner’s intention to use and transfer learned knowledge was influenced mainly by intrinsic motivation, organizational learning culture and absorptive capacity, but was not influenced by extrinsic motivation. The results provide both theoretical and practical implications.
Keywords: Empirical study, healthcare knowledge management, patient self-care, physician knowledge transfer.
7614 Common Sense Leadership in the Example of Turkish Political Leader Devlet Bahçeli
Authors: B. Gültekin, T. Gültekin
Abstract:
Peace diplomacy is the most important international tool to maintain peace all over the world. This study consists of three parts. In the first part, the leadership of Devlet Bahçeli, leader of the Nationalist Movement Party, will be introduced as a tool of peace communication and peace management. Also in this part, peace communication will be explained through the peace leadership traits of Devlet Bahçeli, who is one of the efficient political leaders representing the concepts of compromise and agreement on different sides of politics. The second part of the study aims to analyze Devlet Bahçeli’s leadership within the frame of peace communication, and the final part is about creating an original public communication model for public diplomacy based on Devlet Bahçeli as an example. As a result, the main purpose of this study is to develop an original peace communication model including peace modules, peace management projects, and original dialogue procedures and protocols exhibited in the policies of Devlet Bahçeli. The political leadership represented by Devlet Bahçeli inspires political leaders to provide peace communication. In this study, the principles and policies of the peace leadership of Devlet Bahçeli will be explained as an original model on a peace communication platform.
Keywords: Dialogue management, public diplomacy, peace diplomacy, peace leadership.
7613 A Performance Analysis of Different Scheduling Schemes in WiMAX
Authors: A. Youseef
Abstract:
IEEE 802.16 (WiMAX) aims to provide high-speed wireless access with wide-range coverage. The base station (BS) and the subscriber station (SS) are the main parts of WiMAX. WiMAX uses either Point-to-Multipoint (PMP) or mesh topologies. In the PMP mode, the SSs connect to the BS to gain access to the network. However, in the mesh mode, the SSs connect to each other to gain access to the BS. The main components of QoS management in the 802.16 standard are admission control, buffer management and packet scheduling. In this paper, we use QualNet 5.0.2 to study the performance of different scheduling schemes, such as WFQ, SCFQ, RR and SP, as the number of SSs increases. We find that as the number of SSs increases, the average jitter and average end-to-end delay increase and the throughput is reduced.
Keywords: WiMAX, Scheduling Scheme, QoS, QualNet.
7612 Data Collection with Bounded-Sized Messages in Wireless Sensor Networks
Authors: Min Kyung An
Abstract:
In this paper, we study the data collection problem in Wireless Sensor Networks (WSNs) under two interference models: the graph model and the more realistic physical interference model, known as the Signal-to-Interference-plus-Noise Ratio (SINR) model. The main issue of the problem is to compute schedules with the minimum number of timeslots, that is, to compute minimum-latency schedules, such that data from every node can be collected at a sink node without any collision or interference. While existing works studied the problem with unit-sized and unbounded-sized message models, we investigate the problem with the bounded-sized message model and introduce a constant-factor approximation algorithm. To the best of our knowledge, our result is the first for the data collection problem with the bounded-sized message model in both interference models.
Keywords: Data collection, collision-free, interference-free, physical interference model, SINR, approximation, bounded-sized message model, wireless sensor networks, WSN.
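For reference, the physical interference model named above is usually stated as follows (a standard formulation quoted for context, not a detail taken from this paper): a transmission from node $u$ to node $v$ in timeslot $t$ succeeds only if

$$\mathrm{SINR}(u,v) = \frac{P \cdot d(u,v)^{-\alpha}}{N_0 + \sum_{w \in S_t \setminus \{u\}} P \cdot d(w,v)^{-\alpha}} \ge \beta,$$

where $P$ is the transmit power, $d(\cdot,\cdot)$ the distance between nodes, $\alpha$ the path-loss exponent, $N_0$ the ambient noise, $S_t$ the set of nodes transmitting concurrently in slot $t$, and $\beta$ the hardware threshold.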
7611 New Data Reuse Adaptive Filters with Noise Constraint
Authors: Young-Seok Choi
Abstract:
We present a new framework of the data-reusing (DR) adaptive algorithms by incorporating a constraint on noise, referred to as a noise constraint. The motivation behind this work is that the use of the statistical knowledge of the channel noise can contribute toward improving the convergence performance of an adaptive filter in identifying a noisy linear finite impulse response (FIR) channel. By incorporating the noise constraint into the cost function of the DR adaptive algorithms, the noise constrained DR (NC-DR) adaptive algorithms are derived. Experimental results clearly indicate their superior performance over the conventional DR ones.
Keywords: Adaptive filter, data-reusing, least-mean square (LMS), affine projection (AP), noise constraint.
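For context, the baseline least-mean-square (LMS) update that data-reusing and noise-constrained variants build on is sketched below for FIR channel identification; the noise-constrained versions additionally exploit the assumed noise variance in the cost function, which is not reproduced here. This is a generic sketch, not the authors' NC-DR algorithm.

```python
import numpy as np

def lms_identify(x, d, num_taps, mu=0.01):
    """Baseline LMS identification of an FIR channel: adapt w so that w^T x_n tracks d_n."""
    w = np.zeros(num_taps)
    for n in range(num_taps, len(x)):
        x_n = x[n - num_taps:n][::-1]   # most recent input samples first
        e = d[n] - np.dot(w, x_n)       # a priori estimation error
        w = w + mu * e * x_n            # stochastic-gradient update
    return w
```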
7610 Mining Genes Relations in Microarray Data Combined with Ontology in Colon Cancer Automated Diagnosis System
Authors: A. Gruzdz, A. Ihnatowicz, J. Siddiqi, B. Akhgar
Abstract:
The MATCH project [1] entails the development of an automatic diagnosis system that aims to support treatment of colon cancer by discovering mutations that occur in tumour suppressor genes (TSGs) and contribute to the development of cancerous tumours. The system is based on a) colon cancer clinical data and b) biological information derived by data mining techniques from genomic and proteomic sources. The core mining module will consist of popular, well-tested hybrid feature extraction methods and new combined algorithms designed especially for the project. Elements of rough sets, evolutionary computing, cluster analysis, self-organizing maps and association rules will be used to discover the annotations between genes and their influence on tumours [2]-[11]. The methods used to process the data have to address its high complexity, potential inconsistency and missing values. They must integrate all the useful information necessary to answer the expert's question. For this purpose, the system has to learn from data, or allow a domain specialist to interactively specify, the part of the knowledge structure it needs to answer a given query. The program should also take into account the importance/rank of the particular parts of the data it analyses and adjust the algorithms used accordingly.
Keywords: Bioinformatics, gene expression, ontology, self-organizing maps.
7609 On Methodologies for Analysing Sickness Absence Data: An Insight into a New Method
Authors: Xiaoshu Lu, Päivi Leino-Arjas, Kustaa Piha, Akseli Aittomäki, Peppiina Saastamoinen, Ossi Rahkonen, Eero Lahelma
Abstract:
Sickness absence represents a major economic and social issue. Analysis of sick leave data is a recurrent challenge to analysts because of the complexity of the data structure, which is often time dependent, highly skewed and clumped at zero. Ignoring these features when making statistical inference is likely to be inefficient and misguided. Traditional approaches do not address these problems. In this study, we discuss model methodologies in terms of statistical techniques for addressing the difficulties with sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism, using as a working example a large registration dataset from the Helsinki Health Study of municipal employees in Finland during the period 1990-1999. We present a comparative study on model selection and a critical analysis of the temporal trends and the occurrence and degree of long-term sickness absences among municipal employees. The strengths of this working example include the large sample size over a long follow-up period, providing strong evidence in support of the new model. Our main goal is to propose a way to select an appropriate model, to introduce a new methodology for analysing sickness absence data, and to demonstrate the model's applicability to complicated longitudinal data.
Keywords: Sickness absence, longitudinal data, methodologies, mix-distribution model.
7608 Phase Jitter Transfer in High Speed Data Links
Authors: Tsunwai Gary Yip
Abstract:
Phase locked loops in 10 Gb/s and faster data links are low phase noise devices. Characterization of their phase jitter transfer functions is difficult because the intrinsic noise of the PLLs is comparable to the phase noise of the reference clock signal. The problem is solved by using a linear model to account for the intrinsic noise. This study also introduces a novel technique for measuring the transfer function. It involves the use of the reference clock as a source of wideband excitation, in contrast to the commonly used sinusoidal excitations at discrete frequencies. The data reported here include the intrinsic noise of a PLL for 10 Gb/s links and the jitter transfer function of a PLL for 12.8 Gb/s links. The measured transfer function suggests that the PLL responded like a second-order linear system to a low-noise reference clock.
Keywords: Intrinsic phase noise, jitter in data links, PLL jitter transfer function, high-speed clocking in electronic circuits.
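For reference, the second-order linear response mentioned in the last sentence is commonly written as the closed-loop jitter transfer function below (a textbook form quoted for context rather than taken from the measurements reported here):

$$H(s) = \frac{\theta_{\mathrm{out}}(s)}{\theta_{\mathrm{in}}(s)} = \frac{2\zeta\omega_n s + \omega_n^2}{s^2 + 2\zeta\omega_n s + \omega_n^2},$$

where $\omega_n$ is the loop natural frequency and $\zeta$ the damping factor; jitter peaking near $\omega_n$ increases as $\zeta$ decreases.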
7607 A Review in Recent Development of Network Threats and Security Measures
Authors: Roza Dastres, Mohsen Soori
Abstract:
Networks are vulnerable due to their basic function of facilitating remote access and data communication. The information in networks needs to be kept secure in order to provide an effective communication and sharing medium in the web of data. Because of the challenges and threats to data in networks, network security is one of the most important considerations in information technology infrastructures. As a result, security measures are applied in the network in order to decrease the probability of the secured data being accessed by hackers. The purpose of network security is to protect the network and its components from unauthorized access and abuse in order to provide a safe and secured communication medium for users. In the present research work, a review of recent developments in network threats and security measures is presented, and future research directions are suggested. Different attacks on networks and the security measures against them are discussed in order to increase security in the web of data. New ideas in network security systems can then be proposed by analyzing the published papers, in order to move the research field forward.
Keywords: Network threats, network security, security measures, firewalls.
7606 Lean Environmental Management Integration System (LEMIS) Framework Development
Authors: Puvanasvaran, A. P., Suresh V., N. Norazlin
Abstract:
The Lean Environmental Management Integration System (LEMIS) framework is an integration between lean core elements and ISO 14001. Curiosity about the relationship between continuous improvement and the sustainability of lean implementation motivated this study toward LEMIS. The characteristics of the ISO 14001 standard clauses and the core elements of lean principles are explored from past studies and literature reviews. A survey was carried out on ISO 14001-certified companies to examine continual improvement through implementation of the ISO 14001 standard. The study found a significant and positive relationship between the lean principles of value, value stream, flow, pull and perfection and the ISO 14001 requirements. LEMIS is significant for supporting continuous improvement and sustainability. The integration system can be implemented in any manufacturing company. It raises awareness of why organizations need to sustain their environmental management system, while the lean principles can be adopted to streamline the daily activities of the company. The study showed that there is no sacrifice or trade-off between lean principles and ISO 14001 requirements. The framework developed in the study can be further simplified in the future, especially the method of crossing each sub-requirement of the ISO 14001 standard with the core elements of lean principles.
Keywords: LEMIS, ISO 14001, integration, framework.
7605 Imposing Speed Constraints on Arrival Flights: Case Study for Changi Airport
Authors: S. Aneeka, S.M. Phyoe, R. Guo, Z.W. Zhong
Abstract:
Arrival flights tend to spend long waiting times at holding stacks if the arrival airport is congested. However, the waiting time spent in the air in the vicinity of the arrival airport may be reduced if the delays are distributed to the cruising phase of the arrival flights by means of speed control. Here, a case study was conducted for the flights arriving at Changi Airport. The flights that were assigned holdings were simulated to fly at a reduced speed during the cruising phase. As the study involves a single airport and is limited to imposing speed constraints on arrivals within 200 NM of its location, the simulation setup in this study could be considered an application of the Extended Arrival Management (E-AMAN) technique, which is proven to result in considerable fuel savings and more efficient management of delays. The objective of this experiment was to quantify the benefits of imposing cruise speed constraints on arrivals at Changi Airport and to assess the effects on controllers’ workload. The simulation results indicated considerable fuel savings, reduced aircraft emissions and reduced controller workload.
Keywords: Aircraft emissions, air traffic flow management, controller workload, fuel consumption.
7604 Battery Grading Algorithm in 2nd-Life Repurposing Li-ion Battery System
Authors: Ya Lv, Benjamin Ong Wei Lin, Wanli Niu, Benjamin Seah Chin Tat
Abstract:
This article presents a methodology that improves the reliability and cyclability of a 2nd-life Li-ion battery system repurposed as an energy storage system (ESS). Most 2nd-life retired battery systems on the market have a module/pack-level state of health (SOH) indicator, which is used to guide the appropriate depth of discharge (DOD) in ESS applications. Because cell-level SOH indication is lacking, the different degradation behaviors among cells cannot be identified when the pack reaches retired status; in the end, considering end-of-life (EOL) loss and pack-level DOD, the repurposed ESS has to be oversized by more than 1.5 times to meet the application requirements of reliability and cyclability. The proposed battery grading algorithm, using a non-invasive methodology, is able to detect outlier cells based on historical voltage data and to calculate cell-level historical maximum temperature data using a semi-analytic methodology. In this way, each individual battery cell in the 2nd-life battery system can be graded in terms of SOH on the basis of its historical voltage fluctuation and estimated historical maximum temperature variation. These grades will have corresponding DOD grades in the application of the repurposed ESS to enhance system reliability and cyclability. In all, the introduced battery grading algorithm is non-invasive, compatible with all kinds of retired Li-ion battery systems that lack cell-level SOH indication, and can potentially be embedded into battery management software for preventive maintenance and real-time cyclability optimization.
Keywords: Battery grading algorithm, 2nd-life repurposing battery system, semi-analytic methodology, reliability and cyclability.
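As a rough illustration of the outlier-cell detection step described above, the sketch below flags cells whose historical voltage spread relative to the pack is unusually large; the median/MAD rule and the voltage-log layout are illustrative assumptions, not the authors' grading algorithm.

```python
import numpy as np

def flag_outlier_cells(voltage_history, mad_k=3.5):
    """voltage_history: array of shape (num_samples, num_cells) of logged cell voltages.
    A cell is flagged when its average deviation from the pack median is an outlier."""
    deviation = voltage_history - np.median(voltage_history, axis=1, keepdims=True)
    per_cell = np.mean(np.abs(deviation), axis=0)    # average spread of each cell vs. the pack
    med = np.median(per_cell)
    mad = np.median(np.abs(per_cell - med)) + 1e-12
    return np.where(per_cell - med > mad_k * 1.4826 * mad)[0]   # indices of suspect cells
```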
7603 A Review on Soft Computing Technique in Intrusion Detection System
Authors: Noor Suhana Sulaiman, Rohani Abu Bakar, Norrozila Sulaiman
Abstract:
An Intrusion Detection System is significant in network security. It detects and identifies intrusion behavior or intrusion attempts in a computer system by monitoring and analyzing the network packets in real time. In recent years, intelligent algorithms applied in intrusion detection systems (IDS) have been an increasing concern with the rapid growth of network security. IDS data involve a huge amount of data containing irrelevant and redundant features, causing slow training and testing, higher resource consumption and poor detection rates. Since the amount of audit data that an IDS needs to examine is very large even for a small network, classification by hand is impossible. Hence, the primary objective of this review is to survey the techniques applied prior to the classification process that suit IDS data.
Keywords: Intrusion Detection System, security, soft computing, classification.
7602 Gamma Glutamyl Transferase and Lactate Dehydrogenase as Biochemical Markers of Severity of Preeclampsia
Authors: S. M. Munde, N. R. Hazari, A. P. Thorat, S. B. Gaikwad, V. S. Hatolkar
Abstract:
This study was conducted to examine the possible role of serum gamma-glutamyltransferase (GGT) and lactate dehydrogenase (LDH) in the prediction of the severity of preeclampsia. The study group comprised 40 preeclamptic cases (22 mild and 18 severe) and 40 healthy normotensive pregnant controls. Serum samples of all the cases were assayed for GGT and LDH. Demographic, hemodynamic and laboratory data as well as serum GGT and LDH levels were compared among the three groups.
The results indicated that severe preeclamptic cases had significantly increased levels of serum GGT and LDH. The symptoms in severe preeclamptic women were significantly increased in patients with GGT > 70 IU/L and LDH > 800 IU/L. Elevated levels of serum GGT and LDH can be used as biochemical markers that reflect the severity of preeclampsia and are useful for the management of preeclampsia to decrease maternal and fetal morbidity and mortality.
Keywords: Severe Preeclampsia, GGT, LDH.
7601 Modeling and Simulation of Acoustic Link Using Mackenzie Propagation Speed Equation
Authors: Christhu Raj M. R., Rajeev Sukumaran
Abstract:
Underwater acoustic networks have attracted great attention in the last few years because of their numerous applications. A high data rate can be achieved by efficiently modeling the physical layer in the network protocol stack. In the acoustic medium, the propagation speed of acoustic waves depends on parameters such as temperature, salinity, density, and depth. Acoustic propagation speed cannot be modeled using standard empirical formulas such as the Urick and Thorp descriptions. In this paper, we have modeled the acoustic channel using real-time temperature, salinity, and speed data from the Bay of Bengal (Indian coastal region). We have modeled the acoustic channel by using the Mackenzie speed equation and real-time data obtained from the National Institute of Oceanography and Technology. It is found that the acoustic propagation speed varies between 1503 m/s and 1544 m/s as temperature and depth differ. The simulation results show that temperature, salinity and depth play a major role in acoustic propagation, and the data rate increases with appropriate data sets substituted into the simulated model.
Keywords: Underwater Acoustics, Mackenzie Speed Equation, Temperature, Salinity.
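For orientation, a minimal sketch of the nine-term Mackenzie (1981) sound-speed equation referred to above is given below; the coefficients and validity ranges (roughly 2-30 °C, 25-40 ppt salinity, 0-8000 m depth) are the commonly cited ones rather than values taken from this paper.

```python
def mackenzie_sound_speed(T, S, D):
    """Sound speed in seawater (m/s) from the nine-term Mackenzie (1981) equation.

    T: temperature in degrees Celsius, S: salinity in parts per thousand, D: depth in metres.
    """
    return (1448.96 + 4.591 * T - 5.304e-2 * T**2 + 2.374e-4 * T**3
            + 1.340 * (S - 35) + 1.630e-2 * D + 1.675e-7 * D**2
            - 1.025e-2 * T * (S - 35) - 7.139e-13 * T * D**3)

# Example: warm, shallow tropical water gives a speed near the upper end of the range quoted above.
print(round(mackenzie_sound_speed(T=28.0, S=33.0, D=50.0), 1))   # ~1539.8 m/s
```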
7600 Isobaric Vapor-Liquid Equilibrium Data for Binary Mixture of 2-Methyltetrahydrofuran and Cumene
Authors: V. K. Rattan, Baljinder K. Gill, Seema Kapoor
Abstract:
Isobaric vapor-liquid equilibrium measurements are reported for the binary mixture of 2-Methyltetrahydrofuran and Cumene at 97.3 kPa. The data were obtained using a vapor-recirculating type (modified Othmer's) equilibrium still. The mixture shows a slight negative deviation from ideality. The system does not form an azeotrope. The experimental data obtained in this study are thermodynamically consistent according to the Herington test. The activity coefficients have been satisfactorily correlated by means of the Margules and NRTL equations. The excess Gibbs free energy has been calculated from the experimental data. The activity coefficients have also been obtained by the UNIFAC group contribution method.
Keywords: Binary mixture, 2-Methyltetrahydrofuran, Cumene, Vapor-liquid equilibrium, UNIFAC, Excess Gibbs free energy.
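For reference, the two-parameter Margules correlation mentioned above has the standard form (quoted as a textbook relation, not the paper's fitted parameters):

$$\ln\gamma_1 = x_2^2\left[A_{12} + 2\,(A_{21}-A_{12})\,x_1\right], \qquad \ln\gamma_2 = x_1^2\left[A_{21} + 2\,(A_{12}-A_{21})\,x_2\right],$$

with the excess Gibbs free energy recovered from $G^E/RT = x_1\ln\gamma_1 + x_2\ln\gamma_2$.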
7599 Evaluating the Innovation Ability of Manufacturing Resources
Authors: M.F. Zaeh, G. Reinhart, U. Lindemann, F. Karl, W. Biedermann
Abstract:
Due to today's turbulent environment, manufacturing resources, particularly in assembly, must be reconfigured frequently. These reconfigurations are caused by various, partly cyclic, influencing factors. Hence, it is important to evaluate the innovation ability of manufacturing resources - the capability of resources to implement innovations quickly and efficiently without large expense. For this purpose, a new methodology is presented in this article. Within the methodology, design structure matrices and graph theory are used. The results of the methodology include different indices to evaluate the innovation ability of the manufacturing resources. Due to the cyclicity of the influencing factors, the methodology can be used to synchronize the realization of adaptations.
Keywords: Changeability, Cycle Management, Design Structure Matrices, Graph Theory, Manufacturing Resource Planning, Production Management.
7598 A Keyword-Based Filtering Technique of Document-Centric XML using NFA Representation
Authors: Changwoo Byun, Kyounghan Lee, Seog Park
Abstract:
XML is becoming a de facto standard for online data exchange. Existing XML filtering techniques based on a publish/subscribe model are focused on highly structured data marked up with XML tags. These techniques are efficient in filtering documents of data-centric XML but are not effective in filtering the element contents of document-centric XML. In this paper, we propose an extended XPath specification which includes a special matching character '%', used in the LIKE operation of SQL, in order to overcome the difficulty of writing queries that adequately filter element contents with the previous XPath specification. We also present a novel technique for filtering a collection of document-centric XML documents, called Pfilter, which is able to exploit the extended XPath specification. We show several performance studies on efficiency and scalability using the multi-query processing time (MQPT).
Keywords: XML Data Stream, Document-centric XML, Filtering Technique, Value-based Predicates.
7597 A Mixture Model of Two Different Distributions Approach to the Analysis of Heterogeneous Survival Data
Authors: Ülkü Erişoğlu, Murat Erişoğlu, Hamza Erol
Abstract:
In this paper we propose a mixture of two different distributions, such as Exponential-Gamma, Exponential-Weibull and Gamma-Weibull, to model heterogeneous survival data. Various properties of the proposed mixtures of two different distributions are discussed. Maximum likelihood estimates of the parameters are obtained using the EM algorithm. An illustrative example based on real data is also given.
Keywords: Exponential-Gamma, Exponential-Weibull, Gamma-Weibull, EM Algorithm, Survival Analysis.
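For context, a two-component mixture of the kind proposed here has a density of the standard form below, and the E-step of the EM algorithm assigns each observation a posterior membership weight (standard relations, not the paper's specific derivations):

$$f(t) = p\, f_1(t;\theta_1) + (1-p)\, f_2(t;\theta_2), \qquad \gamma_i = \frac{p\, f_1(t_i;\theta_1)}{p\, f_1(t_i;\theta_1) + (1-p)\, f_2(t_i;\theta_2)},$$

where, for an Exponential-Gamma mixture, $f_1$ is an exponential density and $f_2$ a gamma density; the M-step re-estimates $p$ as the mean of the $\gamma_i$ and updates $\theta_1, \theta_2$ by weighted maximum likelihood.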
7596 Overview of Operational Risk Management Methods
Authors: Milan Rippel, Pert Teplý
Abstract:
Operational risk has become one of the most discussed topics in the financial industry in recent years. The reasons for this attention can be attributed to higher investments in information systems and technology, the increasing wave of mergers and acquisitions, and the emergence of new financial instruments. In addition, the New Basel Capital Accord (known as Basel II) demands a capital requirement for operational risk and further motivates financial institutions to more precisely measure and manage this type of risk. The aim of this paper is to shed light on the main characteristics of operational risk management and commonly applied methods: scenario analysis, key risk indicators, risk control self-assessment and the loss distribution approach.
Keywords: Operational risk, economic capital, key risk indicators, loss distribution approach.
7595 Motion Recognition Based On Fuzzy WP Feature Extraction Approach
Authors: Keun-Chang Kwak
Abstract:
This paper is concerned with motion recognition based on a fuzzy WP (Wavelet Packet) feature extraction approach applied to Vicon physical data sets. For this purpose, we use an efficient fuzzy mutual-information-based WP transform for feature extraction. This method estimates the required mutual information using a novel approach based on fuzzy membership functions. The physical action data set includes 10 normal and 10 aggressive physical actions that measure human activity. The data have been collected from 10 subjects using the Vicon 3D tracker. The experiments consider running, sitting, and walking as physical activity motions among the various activities. The experimental results revealed that the presented feature extraction approach shows good recognition performance.
Keywords: Motion recognition, fuzzy wavelet packet, Vicon physical data.
7594 Classification and Analysis of Risks in Software Engineering
Authors: Hooman Hoodat, Hassan Rashidi
Abstract:
Despite the various methods that exist in software risk management, software projects have a high rate of failure. As the complexity and size of projects increase, managing software development becomes more difficult. In these projects the need for more analysis and risk assessment is vital. In this paper, a classification of software risks is specified. Then relations between these risks are presented using a risk tree structure. Analysis and assessment of these risks are done using probabilistic calculations. This analysis helps with the qualitative and quantitative assessment of the risk of failure. Moreover, it can support the software risk management process. This classification and risk tree structure can be applied in software tools.
Keywords: Risk analysis, risk assessment, risk classification, risk tree.
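As a small illustration of the probabilistic calculations such a risk tree involves, independent child risks combine as products under AND gates and as complements of products under OR gates; the gate structure and probabilities below are hypothetical, not taken from the paper.

```python
def and_gate(probs):
    """Failure occurs only if every child risk occurs (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Failure occurs if at least one child risk occurs (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: failure if (requirements OR staffing) risk materialises AND testing is inadequate.
print(and_gate([or_gate([0.2, 0.1]), 0.3]))   # 0.084
```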
7593 DCBOR: A Density Clustering Based on Outlier Removal
Authors: A. M. Fahim, G. Saake, A. M. Salem, F. A. Torkey, M. A. Ramadan
Abstract:
Data clustering is an important data exploration technique with many applications in data mining. We present an enhanced version of the well-known single-link clustering algorithm, referred to as DCBOR. The proposed algorithm alleviates the chain effect by removing the outliers from the given dataset, so it provides outlier detection and data clustering simultaneously. The algorithm does not need to update the distance matrix, since it depends on merging the k-nearest objects in one step and the cluster continues to grow as long as possible under a specified condition. The algorithm therefore consists of two phases: in the first phase, it removes the outliers from the input dataset; in the second phase, it performs the clustering process. The algorithm discovers clusters of different shapes, sizes and densities and requires only one input parameter, a threshold for outlier points whose value ranges from 0 to 1. The algorithm supports the user in determining an appropriate value for it. We have tested this algorithm on different datasets containing outliers and clusters connected by chains of density points, and the algorithm discovers the correct clusters. The results of our experiments demonstrate the effectiveness and efficiency of DCBOR.
Keywords: Data Clustering, Clustering Algorithms, Handling Noise, Arbitrary Shape of Clusters.
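To make the two-phase idea above concrete, here is a minimal sketch in the spirit of the description: outliers are removed by a density (k-nearest-neighbour distance) threshold and the remaining points are merged single-link style. The quantile-based threshold and the neighbour count are illustrative assumptions, not the authors' exact DCBOR procedure.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.cluster.hierarchy import linkage, fcluster

def dcbor_like(points, k=4, outlier_fraction=0.05, cut_distance=1.0):
    """Phase 1: drop points whose k-nearest-neighbour distance is largest (treated as outliers).
    Phase 2: single-link clustering of the remaining points, cut at cut_distance."""
    d = cdist(points, points)
    knn_dist = np.sort(d, axis=1)[:, k]                 # distance to the k-th nearest neighbour
    threshold = np.quantile(knn_dist, 1.0 - outlier_fraction)
    keep = knn_dist <= threshold                        # phase 1: remove low-density points
    core = points[keep]
    labels = fcluster(linkage(core, method="single"), t=cut_distance, criterion="distance")
    return core, labels, points[~keep]                  # clusters plus the removed outliers
```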
7592 Alternative to M-Estimates in Multisensor Data Fusion
Authors: Nga-Viet Nguyen, Georgy Shevlyakov, Vladimir Shin
Abstract:
We consider the problem of multisensor data fusion under non-Gaussian channel noise. The advanced M-estimates are known to be a robust solution while trading off some accuracy. In order to improve the estimation accuracy while maintaining equivalent robustness, a two-stage robust fusion algorithm is proposed using preliminary rejection of outliers followed by an optimal linear fusion. The numerical experiments show that the proposed algorithm is equivalent to the M-estimates in the case of uncorrelated local estimates and significantly outperforms the M-estimates when local estimates are correlated.
Keywords: Data fusion, estimation, robustness, M-estimates.
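A minimal sketch of the two-stage idea described above is given below, with the preliminary outlier rejection done by a median/MAD rule and the second stage as an inverse-variance weighted linear fusion; both choices are illustrative assumptions rather than the authors' exact algorithm.

```python
import numpy as np

def two_stage_fusion(estimates, variances, mad_k=3.0):
    """Stage 1: reject local estimates far from the median (median/MAD rule).
    Stage 2: optimal linear (inverse-variance weighted) fusion of the survivors."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    med = np.median(estimates)
    mad = np.median(np.abs(estimates - med)) + 1e-12
    keep = np.abs(estimates - med) <= mad_k * 1.4826 * mad   # 1.4826 scales MAD to a std-dev estimate
    w = 1.0 / variances[keep]
    return np.sum(w * estimates[keep]) / np.sum(w)

# Example: five local estimates of the same quantity, one corrupted by impulsive channel noise.
print(two_stage_fusion([10.1, 9.8, 10.3, 25.0, 10.0], [0.5, 0.4, 0.6, 0.5, 0.3]))   # ~10.0
```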
7591 Data Structures and Algorithms of Intelligent Web-Based System for Modular Design
Authors: Ivan C. Mustakerov, Daniela I. Borissova
Abstract:
In recent years, new product development has become more competitive and globalized, and the design phase is critical to product success. The concept of modularity can provide the necessary foundation for organizations to design products that can respond rapidly to market needs. The paper describes the data structures and algorithms of an intelligent Web-based system for modular design, taking into account module compatibility relationships and given design requirements. The system intelligence is realized by the developed algorithms for the choice of modules, reflecting all system restrictions and requirements. The proposed data structure and algorithms are illustrated by a case study of personal computer configuration. The applicability of the proposed approach is tested through a prototype of the Web-based system.
Keywords: Data structures, algorithms, intelligent web-based system, modular design.
7590 Self Watermarking based on Visual Cryptography
Authors: Mahmoud A. Hassan, Mohammed A. Khalili
Abstract:
We propose a simple watermarking method based on visual cryptography. The method is based on selecting specific pixels from the original image instead of the random selection of pixels as in Hwang's paper [1]. Verification information is generated and used to verify the ownership of the image without the need to embed the watermark pattern into the original digital data. Experimental results show that the proposed method can recover the watermark pattern from the marked data even if some changes are made to the original digital data.
Keywords: Watermarking, visual cryptography, visual threshold.
7589 Identifying E-Learning Components at North-West University, Mafikeng Campus
Authors: Sylvia Tumelo Nthutang, Nehemiah Mavetera
Abstract:
Educational institutions are under pressure from their competitors. Regulators and community groups need educational institutions to adopt appropriate business and organizational practices. Globally, educational institutions now use e-learning as the preferred teaching and learning approach. E-learning is becoming the center of attention for learning institutions, educational systems and software vendors. North-West University (NWU) currently uses eFundi, a Learning Management System (LMS). An LMS comprises the information systems and procedures that add value to students' learning and support the learning material in text or multimedia files. With various e-learning tools, students are able to access all the materials related to a course as electronic copies. The study was tasked with identifying the e-learning components at the NWU Mafikeng campus. A quantitative research methodology was used for data collection and descriptive statistics for data analysis. Activity Theory (AT) was used as the theory guiding the study. AT outlines the limitations of e-learning at the macro-organizational level (plans, guiding principles, campus-wide solutions) and the micro-organizational level (daily functioning practice, collaborative transformation, specific adaptation). In a technological environment, AT gives people an opportunity to shift from concentrating on computers as an area of concern to understanding that technology is part of human activities. The findings identified the university's current IT tools and knowledge of e-learning elements. It was recommended that the university consider buying computer resources that consume less power and practice e-learning effectively.
Keywords: E-learning, information and communication technology, teaching, and virtual learning environment.
7588 Integration of Seismic and Seismological Data Interpretation for Subsurface Structure Identification
Authors: Iftikhar Ahmed Satti, Wan Ismail Wan Yusoff
Abstract:
The structural interpretation of a part of the eastern Potwar (Missa Keswal) has been carried out with the available seismological, seismic and well data. The seismological data contain both source parameters and fault plane solution (FPS) parameters, and the seismic data contain ten seismic lines that were re-interpreted using well data. The structural interpretation depicts two broad fault sets, namely thrust and back-thrust faults. Together, these faults give rise to pop-up structures in the study area and are also responsible for many structural traps and for the seismicity. Seismic interpretation includes time and depth contour maps of the Chorgali Formation, while seismological interpretation includes focal mechanism solutions (FMS), depth, frequency and magnitude bar graphs, and renewal of the seismotectonic map. The focal mechanism solutions surrounding the study area are correlated with the different geological and structural maps of the area to determine the nature of the subsurface faults. Results of the structural interpretation from both seismic and seismological data show good correlation. It is hoped that the present work will help in better understanding the variations in subsurface structure and can be a useful tool for earthquake prediction, oil field planning and reservoir monitoring.
Keywords: Focal mechanism solution (FMS), Fault plane solution (FPS), Reservoir monitoring, Earthquake prediction.