Search results for: Corpus-based approach
3681 Parallel-computing Approach for FFT Implementation on Digital Signal Processor (DSP)
Authors: Yi-Pin Hsu, Shin-Yu Lin
Abstract:
An efficient parallel structure on a digital signal processor can improve algorithm performance. The butterfly structure plays an important role in the fast Fourier transform (FFT) because its symmetric form is well suited to hardware implementation. Although the structure is symmetric, performance is reduced by the data-dependent flow characteristic. Even though recent research known as the novel memory reference reduction method (NMRRM) for FFT focuses on reducing memory references to twiddle factors, the data-dependent property still exists. In this paper, we propose a parallel-computing approach for FFT implementation on a digital signal processor (DSP) that is based on a data-independent property while still retaining low memory reference. The proposed method combines the final two steps of the NMRRM FFT to form a novel data-independent structure, which makes it well suited to multi-operation-unit digital signal processors and dual-core systems. We have applied the proposed method to a low-memory-reference radix-2 FFT algorithm on a TI TMS320C64x DSP. Experimental results show the method reduces clock cycles by 33.8% compared with the NMRRM FFT implementation while keeping the low-memory-reference property.
Keywords: Parallel-computing, FFT, low-memory reference, TIDSP.
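For readers unfamiliar with the butterfly structure referred to above, the following NumPy sketch shows a plain iterative radix-2 FFT; the butterflies within each stage are data-independent, which is the property the abstract exploits. This is a generic textbook implementation, not the authors' NMRRM-based DSP code, and the test vector is arbitrary.

```python
import numpy as np

def radix2_fft(x):
    """Iterative radix-2 decimation-in-time FFT (length must be a power of two)."""
    x = np.asarray(x, dtype=complex)
    n = x.size
    if n & (n - 1):
        raise ValueError("length must be a power of two")
    # Bit-reversal permutation.
    j = 0
    for i in range(1, n):
        bit = n >> 1
        while j & bit:
            j ^= bit
            bit >>= 1
        j |= bit
        if i < j:
            x[i], x[j] = x[j], x[i]
    # Butterfly stages: within a stage the butterflies are data-independent,
    # so the inner loop can be distributed over parallel operation units.
    size = 2
    while size <= n:
        half = size // 2
        w_step = np.exp(-2j * np.pi / size)
        for start in range(0, n, size):
            w = 1.0
            for k in range(half):
                a = x[start + k]
                b = x[start + k + half] * w
                x[start + k] = a + b
                x[start + k + half] = a - b
                w *= w_step
        size *= 2
    return x

print(np.allclose(radix2_fft(np.arange(8.0)), np.fft.fft(np.arange(8.0))))  # True
```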
3680 Building Information Modeling-Based Approach for Automatic Quantity Take-off and Cost Estimation
Authors: Lo Kar Yin, Law Ka Mei
Abstract:
Architectural, engineering, construction and operations (AECO) industry practitioners have adapted well to the dynamic construction market through the fundamental training of their disciplines. Further prompted by the pandemic since 2019, great strides have been taken in virtual environments, and the best possible collaboration is sought with project teams without boundaries. Using a Building Information Modeling-based approach and qualitative analysis, this paper reviews the quantity take-off (QTO) and cost estimation process through modeling techniques, in liaison with suppliers, fabricators, subcontractors, contractors, designers, consultants and service providers in the construction industry value chain, for automatic project cost budgeting, project cost control and cost evaluation of design options for in-situ reinforced-concrete construction and Modular Integrated Construction (MiC) at the design stage, and for variation of works and cash flow/spending analysis at the construction stage as far as practicable, with a view to sharing the findings for enhancing mutual trust and co-operation among AECO industry practitioners. The aim is to foster development through a common prototype of the design-and-build project delivery method under NEC4 Engineering and Construction Contract (ECC) Options A and C.
Keywords: Building Information Modeling, cost estimation, quantity take-off, modeling techniques.
3679 A Mapping Approach of Code Generation for ARINC653-Based Avionics Software
Authors: Lu Zou, Dianfu MA, Ying Wang, Xianqi Zhao
Abstract:
Avionics software architecture has transitioned from a federated architecture to integrated modular avionics (IMA). ARINC 653 (Avionics Application Standard Software Interface) is a software specification for space and time partitioning in safety-critical avionics real-time operating systems. Methods have been proposed to transform abstract avionics application logic into an executable model, but with little consideration of the code-generation input and output models specific to the ARINC 653 platform or of the inner-task synchronous dynamic interaction order. In this paper, we propose an AADL-based model-driven design methodology for automatically generating a Cµ executable model on the ARINC 653 platform from an ARINC 653 architecture, defined as AADL653, in order to facilitate the development of avionics software built on an ARINC 653 operating system. The paper presents the mapping rules between AADL653 elements and elements of the Cµ language, defines the code-generation rules and designs an automatic Cµ code generator. A case study illustrates the approach, followed by related work and directions for future research.
Keywords: IMA, ARINC 653, AADL653, code generation.
3678 Optimizing Dialogue Strategy Learning Using Learning Automata
Authors: G. Kumaravelan, R. Sivakumar
Abstract:
Modeling the behavior of dialogue management in the design of a spoken dialogue system using statistical methodologies is currently a growing research area. This paper presents work on developing an adaptive learning approach to optimize dialogue strategy. At the core of our system is a method that formalizes dialogue management as sequential decision making under uncertainty whose underlying probabilistic structure is a Markov chain. Researchers have mostly focused on model-free algorithms for automating the design of dialogue management using machine learning techniques such as reinforcement learning, but model-free algorithms face the dilemma of exploration versus exploitation. Hence we present a model-based online policy learning algorithm using interconnected learning automata for optimizing dialogue strategy. The proposed algorithm derives an optimal policy that prescribes which action should be taken in each state of the conversation so as to maximize the expected total reward and attain the goal, and it incorporates sound exploration and exploitation in its updates to improve the naturalness of human-computer interaction. We test the proposed approach using the PARADISE evaluation framework on access to a railway information system.
Keywords: Dialogue management, learning automata, reinforcement learning, spoken dialogue system.
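As background to the learning-automata component, below is a minimal sketch of a linear reward-inaction automaton choosing among four actions in a toy environment. The reward probabilities, learning rate and action set are invented for illustration; the interconnected automata and dialogue state space of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def lri_update(p, action, reward, a=0.1):
    """Linear reward-inaction update of the action-probability vector."""
    if reward:                       # reinforce the chosen action only on success
        p = (1 - a) * p
        p[action] += a
    return p

# Toy environment: action 2 succeeds 80% of the time, the others 30%.
success = np.array([0.3, 0.3, 0.8, 0.3])
p = np.full(4, 0.25)
for _ in range(2000):
    action = rng.choice(4, p=p)
    p = lri_update(p, action, rng.random() < success[action])
print(p.round(3))   # probability mass concentrates on action 2
```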
3677 Modelling and Simulation of Cascaded H-Bridge Multilevel Single Source Inverter Using PSIM
Authors: Gaddafi S. Shehu, T. Yalcinoz, Abdullahi B. Kunya
Abstract:
Multilevel inverters such as flying-capacitor, diode-clamped, and cascaded H-bridge inverters are very popular, particularly in medium- and high-power applications. This paper focuses on a cascaded H-bridge module using a single direct current (DC) source to generate an 11-level output voltage. The approach reduces the number of switches and gate drivers in comparison with the conventional method, and the proposed topology produces more accurate results with an isolation transformer at high switching frequency. Different modulation techniques can be used for the multilevel inverter, but this work features selective harmonic elimination (SHE). This modulation approach reduces the number of carriers, reduces switching losses and total harmonic distortion (THD), and thereby improves power quality (PQ). Based on the simulation results obtained, SHE is able to eliminate the selected harmonics by chopping the fundamental output waveform. The performance of the proposed cascaded multilevel inverter is evaluated using the PSIM simulation package, and a THD of 0.94% is obtained.
Keywords: Cascaded H-bridge Multilevel Inverter, Power Quality, Selective Harmonic Elimination.
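To make the SHE idea concrete, the sketch below builds an 11-level, quarter-wave-symmetric staircase from five switching angles and estimates its THD with an FFT. The angles are illustrative placeholders, not solutions of the SHE equations, so the 0.94% figure from the PSIM study is not expected to be reproduced.

```python
import numpy as np

angles = np.deg2rad([6.6, 18.9, 27.2, 45.1, 62.2])   # illustrative switching angles only
N = 4096
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)

v = np.zeros(N)
for a in angles:                               # each H-bridge contributes one staircase step
    v += ((theta > a) & (theta < np.pi - a)).astype(float)
    v -= ((theta > np.pi + a) & (theta < 2 * np.pi - a)).astype(float)

spec = np.abs(np.fft.rfft(v)) / (N / 2)        # harmonic amplitudes of the output voltage
fund = spec[1]                                 # fundamental component
thd = np.sqrt(np.sum(spec[2:50] ** 2)) / fund  # THD up to the 49th harmonic
print(f"THD = {100 * thd:.2f} %")
```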
3676 A New Stability Analysis and Stabilization of Discrete-Time Switched Linear Systems Using Vector Norms Approach
Authors: Marwen Kermani, Anis Sakly, Faouzi M'sahli
Abstract:
In this paper, we investigate a new stability analysis for discrete-time switched linear systems based on comparison, the overvaluing principle, the application of the Borne-Gentina criterion and the Kotelyanski conditions. These stability conditions, derived from vector norms, correspond to a vector Lyapunov function. The switched system to be controlled is represented in companion form. A comparison system relative to a regular vector norm is used in order to obtain the simple arrow form of the state matrix, which allows a suitable application of the Borne-Gentina criterion for establishing sufficient conditions for global asymptotic stability. The proposed approach can provide a constructive solution to the state and static output feedback stabilization problems.
Keywords: Discrete-time switched linear systems, Global asymptotic stability, Vector norms, Borne-Gentina criterion, Arrow form state matrix, Arbitrary switching, State feedback controller, Static output feedback controller.
3675 Riemannian Manifolds for Brain Extraction on Multi-modal Resonance Magnetic Images
Authors: Mohamed Gouskir, Belaid Bouikhalene, Hicham Aissaoui, Benachir Elhadadi
Abstract:
In this paper, we present an application of Riemannian geometry to processing non-Euclidean image data. We consider the image as residing in a Riemannian manifold in order to develop a new method for brain edge detection and brain extraction. Automating this process is challenging due to the high diversity in the appearance of brain tissue among different patients and sequences. The main contribution of this paper is the use of an edge-based anisotropic diffusion tensor for the segmentation task, integrating both image edge geometry and the Riemannian manifold (geodesics, metric tensor) to regularize the converging contour and extract complex anatomical structures. We check the accuracy of the segmentation results on simulated brain MRI scans of single T1-weighted, T2-weighted and proton density sequences. We validate our approach using two different databases: the BrainWeb database and the MRI Multiple Sclerosis Database (MRI MS DB). We have compared our approach, qualitatively and quantitatively, with well-known brain extraction algorithms, and show that applying Riemannian manifolds to medical image analysis improves brain extraction results, in real time, outperforming standard techniques.
Keywords: Riemannian manifolds, Riemannian tensor, brain segmentation, non-Euclidean data, brain extraction.
3674 Protein Secondary Structure Prediction Using Parallelized Rule Induction from Coverings
Authors: Leong Lee, Cyriac Kandoth, Jennifer L. Leopold, Ronald L. Frank
Abstract:
Protein 3D structure prediction has always been an important research area in bioinformatics. In particular, the prediction of secondary structure has been a well-studied research topic. Despite the recent breakthrough of combining multiple sequence alignment information and artificial intelligence algorithms to predict protein secondary structure, the Q3 accuracy of various computational prediction algorithms has rarely exceeded 75%. In a previous paper [1], this research team presented a rule-based method called RT-RICO (Relaxed Threshold Rule Induction from Coverings) to predict protein secondary structure. The average Q3 accuracy on the sample datasets using RT-RICO was 80.3%, an improvement over comparable computational methods. Although this demonstrated that RT-RICO might be a promising approach for predicting secondary structure, the algorithm's computational complexity and program running time limited its use. Herein, a parallelized implementation of a slightly modified RT-RICO approach is presented. This new version of the algorithm facilitated the testing of a much larger dataset of 396 protein domains [2]. Parallelized RT-RICO achieved a Q3 score of 74.6%, which is higher than the consensus prediction accuracy of 72.9% that was achieved for the same test dataset by a combination of four secondary structure prediction methods [2].
Keywords: Data mining, protein secondary structure prediction, parallelization.
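The Q3 score cited throughout the abstract is simply the per-residue three-state (helix/strand/coil) accuracy; a minimal helper is sketched below with made-up sequences, independent of the RT-RICO rule-induction machinery itself.

```python
def q3_accuracy(predicted, observed):
    """Q3: fraction of residues whose 3-state label (H/E/C) is predicted correctly."""
    assert len(predicted) == len(observed)
    hits = sum(p == o for p, o in zip(predicted, observed))
    return hits / len(observed)

print(q3_accuracy("HHHEECCCHH", "HHHEECCCCC"))  # 0.8
```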
3673 Study on the Impact of Size and Position of the Shear Field in Determining the Shear Modulus of Glulam Beam Using Photogrammetry Approach
Authors: Niaz Gharavi, Hexin Zhang
Abstract:
The shear modulus of a timber beam can be determined using either the torsion test or the shear field test method. The shear field test method is based on measuring the shear distortion of the beam in the zone of constant transverse load in the standardized four-point bending test. The current code of practice advises using two metallic arms as an instrument to measure the diagonal displacement of the constructed square. The size and position of this square might influence the determination of the shear modulus. This study investigated the effect of the size and position of the square in the shear field test method. A binocular stereo vision system was employed to determine the 3D displacement of a grid of target points. Six glue-laminated beams were produced and tested. Analysis of variance (ANOVA) was performed on the acquired data to evaluate the significance of the size and position effects of the square. The results show that the size of the square has a noticeable influence on the value of the shear modulus, while the position of the square within the area of constant shear force does not affect the measured mean shear modulus.
Keywords: Shear field test method, structural-sized test, shear modulus of glulam beam, photogrammetry approach.
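The significance test mentioned above can be reproduced in outline with a one-way ANOVA; the sketch below uses hypothetical shear-modulus values grouped by square size, not the measured glulam data, so the numbers are illustrative only.

```python
import numpy as np
from scipy import stats

# Hypothetical shear-modulus measurements (MPa) grouped by the size of the
# constructing square used in the shear field test; values are invented.
small  = np.array([652, 661, 648, 655, 659, 650])
medium = np.array([668, 672, 665, 670, 674, 669])
large  = np.array([683, 679, 688, 685, 681, 690])

f_stat, p_value = stats.f_oneway(small, medium, large)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) would indicate a significant size effect,
# consistent with the finding that square size influences the measured modulus.
```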
3672 Fighter Aircraft Selection Using Fuzzy Preference Optimization Programming (POP)
Authors: C. Ardil
Abstract:
The Turkish Air Force needs to acquire a sixth-generation fighter aircraft in order to maintain its air superiority and dominance against its rivals under the risks posed by global geopolitical opportunities and threats. Accordingly, five evaluation criteria were determined to evaluate the sixth-generation fighter aircraft alternatives and to select the best one. A new fuzzy preference optimization programming (POP) method is proposed to systematically select the best sixth-generation fighter aircraft in an uncertain environment. The POP technique considers both quantitative and qualitative evaluation criteria. To demonstrate the applicability and effectiveness of the proposed approach, it is applied to a multiple criteria decision-making problem for evaluating and selecting sixth-generation fighter aircraft. The results of the fuzzy POP method are compared with those of the fuzzy TOPSIS approach for validation. According to the comparative analysis, the fuzzy POP and fuzzy TOPSIS methods yield the same results, which demonstrates the applicability of the fuzzy POP technique to the sixth-generation fighter selection problem.
Keywords: Fighter aircraft selection, sixth-generation fighter aircraft, fuzzy decision process, multiple criteria decision making, preference optimization programming, POP, TOPSIS, Kizilelma, MIUS, fuzzy set theory
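For comparison purposes, the following sketch implements plain crisp TOPSIS, the benchmark method named in the abstract, on hypothetical alternative scores and weights; it is not the authors' fuzzy POP procedure, and the numbers are invented.

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Crisp TOPSIS: rank alternatives (rows) against criteria (columns)."""
    norm = scores / np.linalg.norm(scores, axis=0)           # vector normalisation
    v = norm * weights                                        # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))   # positive ideal solution
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))   # negative ideal solution
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                            # closeness coefficient

# Hypothetical fighter alternatives scored on five benefit-type criteria.
scores  = np.array([[7, 8, 6, 9, 7],
                    [8, 7, 7, 8, 8],
                    [6, 9, 8, 7, 6]], dtype=float)
weights = np.array([0.3, 0.2, 0.2, 0.2, 0.1])
benefit = np.array([True] * 5)
print(topsis(scores, weights, benefit))   # higher closeness = better alternative
```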
3671 IMLFQ Scheduling Algorithm with Combinational Fault Tolerant Method
Authors: MohammadReza EffatParvar, Akbar Bemana, Mehdi EffatParvar
Abstract:
Scheduling algorithms are used in operating systems to optimize processor usage. One of the most efficient scheduling algorithms is the Multi-Level Feedback Queue (MLFQ) algorithm, which uses several queues with different quanta. The most important weakness of this method is the inability to determine the optimal number of queues and the quantum of each queue, a weakness addressed by the IMLFQ scheduling algorithm. The number of queues and the quantum of each queue directly affect the response time. In this paper, we review the IMLFQ algorithm for solving these problems and minimizing the response time. In this algorithm, a recurrent neural network is utilized to find both the number of queues and the optimal quantum of each queue. In addition, to prevent probable faults in computing processes' response times, a new fault-tolerant approach based on combinational software redundancy is presented. The experimental results show that the IMLFQ algorithm yields better response times than other scheduling algorithms, and that the fault-tolerant mechanism further improves IMLFQ performance.
Keywords: IMLFQ, fault tolerance, scheduling, queue, recurrent neural network.
3670 Face Localization and Recognition in Varied Expressions and Illumination
Authors: Hui-Yu Huang, Shih-Hang Hsu
Abstract:
In this paper, we propose a robust scheme for face alignment and recognition under various influences. For face representation, illumination and variable expression are important factors that affect the accuracy of facial localization and face recognition. To address these factors, we propose a robust approach consisting of two phases. In the first phase, face images are preprocessed by means of the proposed illumination normalization method, and the locations of facial features are fitted more efficiently and quickly based on the proposed image blending. In addition, based on template matching, we further improve the active shape model (termed IASM) to locate the face shape more precisely, which raises the recognition rate in the next phase. The second phase performs feature extraction using principal component analysis and face recognition using support vector machine classifiers. The results show that the proposed method achieves good facial localization and face recognition under varied illumination and local distortion.
Keywords: Gabor filter, improved active shape model (IASM), principal component analysis (PCA), face alignment, face recognition, support vector machine (SVM)
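The second phase (PCA feature extraction followed by an SVM classifier) can be sketched with scikit-learn as below; the illumination normalization and IASM alignment steps of the first phase are omitted, and the public LFW dataset stands in for the authors' data, so results will differ.

```python
from sklearn.datasets import fetch_lfw_people
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

faces = fetch_lfw_people(min_faces_per_person=60)            # downloads a public face dataset
X_tr, X_te, y_tr, y_te = train_test_split(
    faces.data, faces.target, test_size=0.25, random_state=0)

# PCA for feature extraction, then an RBF-kernel SVM for recognition.
model = make_pipeline(PCA(n_components=100, whiten=True, random_state=0),
                      SVC(kernel="rbf", C=10, gamma="scale"))
model.fit(X_tr, y_tr)
print(f"test accuracy: {model.score(X_te, y_te):.3f}")
```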
3669 Enhancing Experiential Learning in a Smart Flipped Classroom: A Case Study
Authors: Fahri Benli, Sitalakshmi Venkatraman, Ye Wei, Fiona Wahr
Abstract:
A flipped classroom, a form of blended learning, shifts the focus from a teacher-centered approach to a learner-centered approach. However, not all learners are ready to take on the active role of knowledge and skill acquisition that a flipped classroom requires, and many continue to dwell in a passive mode of learning. This challenges educators to design, scaffold and facilitate in-class activities so that students have active learning experiences in a flipped classroom environment. Experiential learning theories have been employed by educators in physical classrooms in the past, based on the principle that knowledge can be actively developed through direct experience. However, with the recent growth of online teaching, there are inherent limitations in designing and simulating experiential learning activities for an online environment. In this paper, we explore how experiential learning can be enhanced using smart digital tools employed in a flipped classroom within a higher education setting. As a case study, we present the use of smart online collaborative tools to enhance an experiential learning activity for teaching the higher-order cognitive concepts of business process modeling.
Keywords: Experiential learning, flipped classroom, smart software tools, online learning, higher-order learning attributes.
3668 Hardware Implementation of Local Binary Pattern Based Two-Bit Transform Motion Estimation
Authors: Seda Yavuz, Anıl Çelebi, Aysun Taşyapı Çelebi, Oğuzhan Urhan
Abstract:
Nowadays, demand for devices capable of real-time video transmission is ever-increasing, and high-resolution video has made efficient compression techniques an essential component of capturing and transmitting video data. Motion estimation has a critical role in encoding raw video, and various motion estimation methods have been introduced to compress video efficiently. Motion estimation methods based on low bit-depth representations simplify the computation of the matching criterion and thus yield a small hardware footprint. In this paper, a hardware implementation of a two-bit-transform-based low-complexity motion estimation method using the local binary pattern approach is proposed. Image frames are represented at two-bit depth instead of full depth by using the local binary pattern as the binarization approach, and the binarization part of the hardware architecture is explained in detail. Experimental results demonstrate the differences between the proposed hardware architecture and the architectures of well-known low-complexity motion estimation methods in terms of important aspects such as resource utilization, energy and power consumption.
Keywords: Binarization, hardware architecture, local binary pattern, motion estimation, two-bit transform.
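A simplified view of the binarization step is sketched below: each pixel is reduced to two bits by comparison against local statistics, and block matching is done with XOR counts. The thresholds chosen here (local mean, and local mean plus standard deviation) are illustrative assumptions, not the exact LBP-based rule of the proposed architecture.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def two_bit_transform(frame, win=9):
    """Simplified 2-bit representation: compare each pixel with its local mean
    and with local mean + local standard deviation (illustrative thresholds)."""
    frame = frame.astype(np.float32)
    mean = uniform_filter(frame, win)
    var = uniform_filter(frame ** 2, win) - mean ** 2
    std = np.sqrt(np.maximum(var, 0))
    b0 = frame > mean                 # first bit-plane
    b1 = frame > mean + std           # second bit-plane
    return b0, b1

def matching_cost(ref_bits, cur_bits):
    """Number of non-matching bits (XOR count), a typical low-bit-depth criterion."""
    return sum(np.count_nonzero(r ^ c) for r, c in zip(ref_bits, cur_bits))

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64))
cur = np.roll(ref, (2, 3), axis=(0, 1))        # a pure translation of the reference block
print(matching_cost(two_bit_transform(ref), two_bit_transform(cur)))
```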
3667 A Simulation-Optimization Approach to Control Production, Subcontracting and Maintenance Decisions for a Deteriorating Production System
Authors: Héctor Rivera-Gómez, Eva Selene Hernández-Gress, Oscar Montaño-Arango, Jose Ramon Corona-Armenta
Abstract:
This research studies the joint production, maintenance and subcontracting control policy for an unreliable, deteriorating manufacturing system. Production activities are controlled by a derivation of the hedging point policy, and because the system is subject to deterioration, its capacity to satisfy product demand is progressively reduced. Multiple deterioration effects are considered, reflected mainly in the quality of the parts produced and the reliability of the machine. Subcontracting is available as support to satisfy product demand, and overhaul maintenance can be conducted to reduce the effects of deterioration. The main objective of the research is to determine simultaneously the production, maintenance and subcontracting rates that minimize the total incurred cost. A stochastic dynamic programming model is developed and solved through a simulation-based approach composed of statistical analysis and optimization with the response surface methodology. The obtained results highlight the strong interactions between production, deterioration and quality, which justify the development of an integrated model. A numerical example and a sensitivity analysis are presented to validate our results.
Keywords: Deterioration, simulation, subcontracting, production planning.
3666 Evaluating Contractors in Construction Projects by Multi-Criteria Decision Making and Supply Chain Approach
Authors: Sara Najiazarpour, Mahsa Najiazarpour
Abstract:
Contracting projects suffer from many problems in their performance. At each project stage and for different reasons, these problems affect cost, time, and quality. Hence, in order to increase efficiency and performance at all levels of the chain, a supply chain management approach requires coordination from the beginning of a project to its end (project handover). Contractor selection is the foremost part of construction projects; in this multi-criteria decision-making problem, the best contractor is determined by expert judgment, different variables, and their priorities. In this paper, numerous criteria for selecting the best contractor were collected from adept experts, and the 16 criteria with the highest frequency were included in a questionnaire that was distributed to the experts. Cronbach's alpha coefficient was used to check reliability, and then, based on the Borda function, the important criteria were selected and categorized into four main groups: environmental factors and physical equipment, past performance and technical expertise, affordability, and standards. The criteria were then normalized and evaluated with the PROMETHEE method, and finally the best alternative was selected. A case study was carried out, and the best contractor was selected based on all criteria and their priorities.
Keywords: Evaluation and selecting contractors, project development, supply chain management, multi-criteria decision-making.
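As an outline of the outranking step, the sketch below computes PROMETHEE II net flows with the simple "usual" preference function over hypothetical contractor scores on the four criteria groups; the questionnaire weighting, Borda screening and normalization details from the study are not reproduced.

```python
import numpy as np

def promethee_ii(scores, weights):
    """PROMETHEE II net outranking flows with the 'usual' preference function
    (criteria assumed benefit-type; higher score = better)."""
    n = scores.shape[0]
    pref = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                # weighted preference of alternative i over alternative j
                pref[i, j] = np.sum(weights * (scores[i] > scores[j]))
    phi_plus = pref.sum(axis=1) / (n - 1)      # positive (leaving) flow
    phi_minus = pref.sum(axis=0) / (n - 1)     # negative (entering) flow
    return phi_plus - phi_minus                # net flow: higher = better contractor

# Hypothetical contractor scores on the four aggregated criteria groups.
scores = np.array([[7, 8, 6, 9],
                   [8, 6, 7, 7],
                   [6, 9, 8, 6]], dtype=float)
weights = np.array([0.3, 0.3, 0.2, 0.2])
print(promethee_ii(scores, weights))
```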
3665 A New Classification of Risk-Reduction Options to Improve the Risk-Reduction Readiness of the Railway Industry
Authors: Eberechi Weli, Michael Todinov
Abstract:
The gap between the selection of risk-reduction options in the railway industry and the task of their effective implementation results in compromised safety and substantial losses. Effective risk management must integrate the evaluation phases with the implementation phase. This paper proposes an essential categorisation of risk-reduction measures that best addresses a standard railway industry portfolio. By categorising the risk-reduction options into design, operational, procedural and technical options, the efforts of the implementation facilitators (people, processes and supporting systems) are systematically harmonised. The classification is based on an integration of fundamental principles of risk reduction in the railway industry with the systems engineering approach.
This paper argues that the use of such a classification approach is an attribute of organisations possessing a superior level of risk-reduction readiness. The integration of the proposed rational classification structure provides a solid ground for effective risk reduction.
Keywords: Cost effectiveness, organisational readiness, risk reduction, railway, system engineering.
3664 Strengthening the HCI Approaches in the Software Development Process
Authors: Rogayah A. Majid, Nor Laila Md. Noor, Wan Adilah Wan Adnan
Abstract:
User-Centered Design (UCD), Usability Engineering (UE) and Participatory Design (PD) are the common Human-Computer Interaction (HCI) approaches practiced in the software development process, focusing on issues and matters concerning user involvement. These approaches, however, overlook the organizational perspective of HCI integration within the software development organization. The Management Information Systems (MIS) perspective of HCI takes a managerial and organizational context in viewing the effectiveness of integrating HCI into the software development process. Human-Centered Design (HCD), which encompasses all human aspects including aesthetics and ergonomics, is claimed to provide a better way of strengthening the HCI approaches and thereby the software development process. To determine the effectiveness of HCD in the software development process, this paper presents the findings of a content analysis of HCI approaches, viewing those approaches as a technology that integrates user requirements, ranging from top management to the other stakeholders in the software development process. The findings show that HCD is an approach that emphasizes humans, tools and knowledge in strengthening the HCI approaches, and thereby the software development process, in the quest to produce sustainable, usable and useful software products.
Keywords: Human-Centered Design (HCD), Management Information Systems (MIS), Participatory Design (PD), User-Centered Design (UCD), Usability Engineering (UE)
3663 Fuzzy Inference System for Determining Collision Risk of Ship in Madura Strait Using Automatic Identification System
Authors: Emmy Pratiwi, Ketut B. Artana, A. A. B. Dinariyana
Abstract:
Madura Strait is considered one of the busiest shipping channels in Indonesia. The high vessel traffic density in Madura Strait poses a serious threat to navigational safety in the area, namely ship collision. This study is an attempt to enhance the safety of marine traffic. A fuzzy inference system (FIS) is proposed to calculate the collision risk of ships. Collision risk is evaluated based on the ship domain, the distance to the closest point of approach (DCPA), and the time to the closest point of approach (TCPA). Data were collected using the Automatic Identification System (AIS). This study considers several ship domain models to characterize marine traffic in the waterways. Each encounter within the ship domain is analyzed to obtain the level of collision risk. The resulting risk levels can be used as guidance for avoiding accidents, provide a brief description of traffic safety in Madura Strait, and improve navigational safety in the area.
Keywords: Automatic identification system, collision risk, DCPA, fuzzy inference system, TCPA.
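The DCPA and TCPA inputs to the fuzzy inference system follow directly from relative position and velocity; a small sketch is given below with invented vessel states. The fuzzy membership functions and ship-domain models themselves are not shown.

```python
import numpy as np

def dcpa_tcpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Distance and time to the closest point of approach between two vessels.
    Positions in nautical miles, velocities in knots -> TCPA in hours."""
    r = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)   # relative position
    v = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)   # relative velocity
    if np.allclose(v, 0):
        return np.linalg.norm(r), np.inf
    tcpa = -np.dot(r, v) / np.dot(v, v)        # time minimizing the separation distance
    dcpa = np.linalg.norm(r + v * tcpa)        # separation at that time
    return dcpa, tcpa

# Own ship heading north at 12 kn; target 4 nmi east, 6 nmi north, heading west at 8 kn.
dcpa, tcpa = dcpa_tcpa((0, 0), (0, 12), (4, 6), (-8, 0))
print(f"DCPA = {dcpa:.2f} nmi, TCPA = {tcpa:.2f} h")   # 0.00 nmi, 0.50 h: collision course
```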
3662 Decision Analysis Module for Excel
Authors: Radomir Perzina, Jaroslav Ramik
Abstract:
The Analytic Hierarchy Process (AHP) is a frequently used approach for solving decision-making problems. A wide range of software programs utilize this approach, but their main disadvantages are that they are relatively expensive and do not show intermediate calculations. This work introduces a Microsoft Excel add-in called DAME – Decision Analysis Module for Excel. Compared to other computer programs, DAME is free, can work with scenarios or multiple decision makers, and displays intermediate calculations. Users can structure their decision models into three levels: scenarios/users, criteria and variants. Items at all levels can be evaluated either by weights or by pair-wise comparisons. Three different methods are provided for evaluating the weights of the criteria, the variants and the scenarios: Saaty's method, the Geometric Mean Method and Fuller's Triangle Method. Multiplicative and additive syntheses are supported. The proposed software package is demonstrated on a couple of illustrative examples of real-life decision problems.
Keywords: Analytic hierarchy process, multi-criteria decision making, pair-wise comparisons, Microsoft Excel, Scenarios.
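One of the supported weighting schemes, the Geometric Mean Method, can be sketched as follows for a single pairwise-comparison matrix; the matrix values are illustrative, and DAME's scenario handling and syntheses are not reproduced.

```python
import numpy as np

def ahp_weights(pairwise):
    """Geometric Mean Method: priority vector and consistency ratio of a
    pairwise-comparison matrix (Saaty's random index tabulated up to n = 5)."""
    A = np.asarray(pairwise, float)
    n = A.shape[0]
    w = np.prod(A, axis=1) ** (1.0 / n)        # row geometric means
    w /= w.sum()                               # normalised priority vector
    lam = np.mean(A @ w / w)                   # approximation of the principal eigenvalue
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
    cr = (lam - n) / (n - 1) / ri if ri else 0.0
    return w, cr

A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
print(w.round(3), f"CR = {cr:.3f}")            # CR < 0.1 indicates acceptable consistency
```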
3661 Identity Management in Virtual Worlds Based on Biometrics Watermarking
Authors: S. Bader, N. Essoukri Ben Amara
Abstract:
With technological development and the rise of virtual worlds, these spaces are becoming more and more attractive to cybercriminals hidden behind avatars and fictitious identities. Since access to these spaces is not restricted or controlled, some impostors take advantage of this to gain unauthorized access and engage in cybercrime. This paper proposes an identity management approach for securing access to virtual worlds. The major purpose of the suggested solution is to provide a strong security mechanism that protects the virtual identities represented by avatars, so that only legitimate users, through their corresponding avatars, are allowed to access the platform resources. Access is controlled by integrating an authentication process based on biometrics. During registration, a user fingerprint is enrolled and then encrypted into a watermark using a cancelable and non-invertible algorithm for its protection. After a user personalizes their representative character, the biometric mark is embedded into the avatar through a watermarking procedure. The authenticity of the avatar identity is verified when it requests authorization for access. We have evaluated the proposed approach on a dataset of avatars from various virtual worlds and registered promising performance results in terms of authentication accuracy and acceptance and rejection rates.
Keywords: Identity management, security, biometric authentication and authorization, avatar, virtual world.
3660 Mixtures of Monotone Networks for Prediction
Authors: Marina Velikova, Hennie Daniels, Ad Feelders
Abstract:
In many data mining applications, it is known a priori that the target function should satisfy certain constraints imposed by, for example, economic theory or a human decision-maker. In this paper we consider partially monotone prediction problems, where the target variable depends monotonically on some of the input variables but not on all of them. We propose a novel method to construct prediction models in which monotone dependences with respect to some of the input variables are preserved by construction. Our method belongs to the class of mixture models; the basic idea is to convolute monotone neural networks with weight (kernel) functions to make predictions. Using simulations and real case studies, we demonstrate the application of our method. To obtain a sound assessment of the performance of our approach, we use standard neural networks with weight decay and partially monotone linear models as benchmark methods for comparison. The results show that our approach outperforms partially monotone linear models in terms of accuracy. Furthermore, the incorporation of partial monotonicity constraints not only leads to models that are in accordance with the decision maker's expertise, but also reduces the model variance considerably in comparison to standard neural networks with weight decay.
Keywords: Mixture models, monotone neural networks, partially monotone models, partially monotone problems.
3659 Analysis of Vocal Fold Vibrations from High-Speed Digital Images Based On Dynamic Time Warping
Authors: A. I. A. Rahman, Sh-Hussain Salleh, K. Ahmad, K. Anuar
Abstract:
Analysis of vocal fold vibration is essential for understanding the mechanism of voice production and for improving the clinical assessment of voice disorders. This paper presents a dynamic time warping (DTW) based approach to analyze and objectively classify vocal fold vibration patterns. The proposed technique was designed and implemented on a glottal area waveform (GAW) extracted from high-speed laryngeal images by delineating the glottal edges in each image frame. Feature extraction from the GAW was performed using linear predictive coding (LPC). Several types of voice reference templates, obtained from simulations of clear, breathy, fry, pressed and hyperfunctional voice productions, were used. The patterns of the reference templates were first verified using the analytic signal generated through the Hilbert transform of the GAW. Samples from normal speakers' voice recordings were then used to evaluate and test the effectiveness of this approach. Classification of the voice patterns using LPC and DTW achieved an accuracy of 81%.
Keywords: Dynamic Time Warping, Glottal Area Waveform, Linear Predictive Coding, High-Speed Laryngeal Images, Hilbert Transform.
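A bare-bones DTW distance, the core of the classification step, is sketched below on synthetic curves standing in for GAW-derived feature sequences; the LPC front end and the clinical reference templates are not included.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D feature sequences."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

template = np.sin(np.linspace(0, 2 * np.pi, 40))        # stands in for a reference pattern
sample   = np.sin(np.linspace(0, 2 * np.pi, 55)) * 0.9  # same shape, different length/amplitude
print(dtw_distance(template, sample))
```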
3658 Formal Analysis of a Public-Key Algorithm
Authors: Markus Kaiser, Johannes Buchmann
Abstract:
In this article, a formal specification and verification of the Rabin public-key scheme in a formal proof system is presented. The idea is to combine the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. A major objective of this article is the presentation of the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Moreover, we explicate a computer-proven formalization of correctness as well as a computer verification of security properties using a straightforward computation model in Isabelle/HOL. The analysis uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness, as well as efficient computer proofs of security properties, is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. Consequently, we obtain reliable proofs with a minimal error rate that augment the used database, which provides a formal basis for further computer-proof constructions in this area.
Keywords: Public-key encryption, Rabin public-key scheme, formal proof system, higher-order logic, formal verification.
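For orientation, a toy textbook version of the Rabin scheme itself is sketched below: encryption squares the message modulo n, and decryption recovers the four square roots via the Chinese Remainder Theorem. It uses tiny primes for readability and only illustrates the underlying algorithm, not the Isabelle/HOL formalization described above.

```python
# Toy Rabin public-key scheme (textbook form, tiny primes for illustration only).
p, q = 7, 11                 # p ≡ q ≡ 3 (mod 4); real deployments use large primes
n = p * q

def encrypt(m):
    return (m * m) % n

def decrypt(c):
    """Return the four square roots of c modulo n via the Chinese Remainder Theorem."""
    mp = pow(c, (p + 1) // 4, p)      # square root of c modulo p
    mq = pow(c, (q + 1) // 4, q)      # square root of c modulo q
    yp = pow(p, -1, q)                # p^{-1} mod q
    yq = pow(q, -1, p)                # q^{-1} mod p
    r1 = (yq * q * mp + yp * p * mq) % n
    r2 = n - r1
    r3 = (yq * q * mp - yp * p * mq) % n
    r4 = n - r3
    return sorted({r1, r2, r3, r4})

c = encrypt(24)
print(c, decrypt(c))         # the plaintext 24 is among the four recovered roots
```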
3657 Reinforcement Learning-Based Coexistence Interference Management in Wireless Body Area Networks
Authors: Izaz Ahmad, Farhatullah, Shahbaz Ali, Farhad Ali, Faiza, Hazrat Junaid, Farhan Zaid
Abstract:
Current trends in remote health monitoring that seek to capitalize on Internet of Things applications have raised the need for efficient, interference-free communication in Wireless Body Area Network (WBAN) scenarios. Co-existence interference in WBANs aggravates the over-congested radio bands, thereby requiring efficient Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) strategies and improved interference management. Existing solutions approach interference problems with simplistic heuristics. The scope of this research article is to investigate reinforcement learning for efficient interference management in co-existing scenarios, with an emphasis on homogeneous interference. The aim of this paper is to propose a smart CSMA/CA mechanism based on reinforcement learning, called QIM-MAC, that effectively uses sense slots with minimal interference. Simulation results across several scenarios show that the proposed approach maximizes average network throughput and packet delivery ratio while minimizing packet loss ratio, energy consumption and average delay.
Keywords: WBAN, IEEE 802.15.4 Standard, CAP Super-frame, Q-Learning.
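The reinforcement-learning core of such a scheme can be sketched as an epsilon-greedy agent learning which sense slots are interference-free; the interference probabilities and parameters below are invented, and the QIM-MAC super-frame and back-off mechanics are not modelled.

```python
import numpy as np

rng = np.random.default_rng(1)
n_slots = 8                                   # candidate sense/transmission slots
q = np.zeros(n_slots)
alpha, eps = 0.1, 0.1                         # learning rate and exploration rate

# Toy channel model: some slots suffer heavy co-existence interference (invented values).
busy_prob = np.array([0.8, 0.7, 0.2, 0.1, 0.6, 0.3, 0.9, 0.15])

for _ in range(5000):
    slot = rng.integers(n_slots) if rng.random() < eps else int(np.argmax(q))
    reward = 1.0 if rng.random() > busy_prob[slot] else -1.0   # success only if slot is free
    q[slot] += alpha * (reward - q[slot])     # stateless Q-update (no successor-state term)

print(q.round(2))                             # high values identify low-interference slots
```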
3656 A Portable Cognitive Tool for Engagement Level and Activity Identification
Authors: T. Teo, S. W. Lye, Y. F. Li, Z. Zakaria
Abstract:
Wearable devices such as electroencephalography (EEG) headsets hold immense potential for monitoring and assessing a person's task engagement, especially at remote or online sites. Research into their use in measuring an individual's cognitive state while performing task activities is therefore expected to increase. Despite the growing body of EEG research into a person's brain activity, key challenges remain in adopting EEG for real-time operations. These include limited portability, long preparation time, high channel dimensionality, intrusiveness, and the level of accuracy in acquiring neurological data. This paper proposes an approach using 4-6 EEG channels to determine the cognitive state of a subject undertaking a set of passive and active monitoring tasks, using air traffic controller (ATC) dynamic tasks as a proxy. The work found that, using the developed channel-reduction and identifier algorithm, a trend adherence of 89.1% can be obtained between a commercially available 14-channel Emotiv EPOC+ brain-computer interface (BCI) EEG headset and a carefully selected reduced set of 4-6 channels. The approach can also identify different levels of engagement activity, ranging from general monitoring to ad hoc and repeated active monitoring activities involving information search, extraction, and memory.
Keywords: Neurophysiology, monitoring, EEG, outliers, electroencephalography.
3655 Machine Learning for Music Aesthetic Annotation Using MIDI Format: A Harmony-Based Classification Approach
Authors: Lin Yang, Zhian Mi, Jiacheng Xiao, Rong Li
Abstract:
Swimming with the tide of deep learning, the field of music information retrieval (MIR) has experienced parallel development, and a sheer variety of feature-learning models has been applied to music classification and tagging tasks. Among these learning techniques, deep convolutional neural networks (CNNs) have been widely used, with better performance than traditional approaches, especially in music genre classification and prediction. Regarding music recommendation, however, there is a large semantic gap between the audio genres and the various aspects of a song that influence user preference. In our study, aiming to bridge this gap, we construct an automatic music aesthetic annotation model based on the MIDI format for better comparison and measurement of the similarity between music pieces by way of harmonic analysis. We use the matrix of qualification converted from MIDI files as input to train two different classifiers, a support vector machine (SVM) and a decision tree (DT). Experimental results on a tag prediction task show that both learning algorithms are capable of extracting high-level properties from music information in an end-to-end manner. The proposed model helps to learn audience taste, so that the resulting recommendations are likely to appeal to a niche consumer.
Keywords: Harmonic analysis, machine learning, music classification and tagging, MIDI.
3654 Detecting Email Forgery using Random Forests and Naïve Bayes Classifiers
Authors: Emad E. Abdallah, A. F. Otoom, Arwa Saqer, Ola Abu-Aisheh, Diana Omari, Ghadeer Salem
Abstract:
As email communication has no consistent authentication procedure to ensure authenticity, we present an investigative analysis approach for detecting forged emails based on Random Forests and Naïve Bayes classifiers. Instead of investigating the email headers, we use the body content to extract a unique writing style for each of the possible suspects. Our approach consists of four main steps: (1) the cybercrime investigator extracts effective features, including structural, lexical, linguistic, and syntactic evidence, from previous emails of all possible suspects; (2) the extracted feature vectors are normalized to increase the accuracy rate; (3) the normalized features are used to train the learning engine; (4) upon receiving the anonymous email M, the feature extraction process is applied to produce its feature vector, and the machine learning classifiers assign the email to the suspect whose writing style most closely matches M. Experimental results on real data sets show the improved performance of the proposed method and its ability to identify authors with a very limited number of features.
Keywords: Digital investigation, cybercrimes, email forensics, anonymous emails, writing style, authorship analysis.
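A compact scikit-learn sketch of the train/predict loop is shown below, using character n-grams as a stand-in for the structural, lexical, linguistic and syntactic features described above; the email texts and suspect labels are fabricated for illustration only.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Fabricated training emails labelled by suspect; a real investigation would add
# richer stylometric features than word/character n-grams.
emails = ["please find attached the quarterly report thanks",
          "hey man lets grab lunch tmrw lol",
          "kind regards, i will revert with the figures shortly",
          "yo did u see the game last nite"]
authors = ["suspect_A", "suspect_B", "suspect_A", "suspect_B"]

for clf in (RandomForestClassifier(n_estimators=200, random_state=0), MultinomialNB()):
    model = make_pipeline(TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)), clf)
    model.fit(emails, authors)
    print(type(clf).__name__, model.predict(["regards, attached are the figures"]))
```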
3653 Analytical Authentication of Butter Using Fourier Transform Infrared Spectroscopy Coupled with Chemometrics
Authors: M. Bodner, M. Scampicchio
Abstract:
Fourier transform infrared (FT-IR) spectroscopy coupled with chemometrics was used to distinguish between butter samples and non-butter samples, and quantification of the margarine content in adulterated butter samples was investigated. The fingerprint region (1400-800 cm–1) was used to develop unsupervised pattern recognition (principal component analysis, PCA), supervised modeling (soft independent modelling by class analogy, SIMCA), classification (partial least squares discriminant analysis, PLS-DA) and regression (partial least squares regression, PLS-R) models. PCA of the fingerprint region shows a clustering of the two sample types. All samples were classified into their rightful class by the SIMCA approach; however, nine adulterated samples (between 1% and 30% w/w of margarine) were classified as belonging to both the butter class and the non-butter class. In the two-class PLS-DA model (R2 = 0.73, Root Mean Square Error of Prediction, RMSEP = 0.26% w/w), sensitivity was 71.4% and the positive predictive value (PPV) 100%; its threshold was calculated at 7% w/w of margarine in adulterated butter samples. Finally, a PLS-R model (R2 = 0.84, RMSEP = 16.54%) was developed. PLS-DA proved a suitable classification tool and PLS-R a proper quantification approach. The results demonstrate that FT-IR spectroscopy combined with PLS-R can be used as a rapid, simple and safe method to distinguish pure butter samples from adulterated ones and to determine the degree of margarine adulteration in butter samples.
Keywords: Adulterated butter, margarine, PCA, PLS-DA, PLS-R, SIMCA.
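The PCA/PLS-R part of the workflow can be sketched with scikit-learn on synthetic stand-in spectra (SIMCA and PLS-DA are omitted); the wavenumbers, band shapes and adulteration levels below are invented, so the reported RMSEP and thresholds will not be reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic stand-ins for FT-IR fingerprint spectra (1400-800 cm-1): a "butter" band,
# a "margarine" band, and mixtures with known adulteration levels (all invented).
wavenumbers = np.linspace(1400, 800, 300)
butter = np.exp(-((wavenumbers - 1160) / 40) ** 2)
margarine = np.exp(-((wavenumbers - 1100) / 60) ** 2)
levels = rng.uniform(0, 30, 40)                         # % margarine (w/w)
X = np.array([(1 - f / 100) * butter + (f / 100) * margarine for f in levels])
X += rng.normal(0, 0.01, X.shape)                       # instrument noise

pca = PCA(n_components=2).fit(X)                        # unsupervised pattern recognition
print("explained variance ratios:", pca.explained_variance_ratio_.round(3))

pls = PLSRegression(n_components=3).fit(X, levels)      # quantification model (PLS-R)
rmse = float(np.sqrt(np.mean((pls.predict(X).ravel() - levels) ** 2)))
print(f"PLS-R calibration RMSE = {rmse:.2f} % w/w")
```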
3652 A Novel Approach for Protein Classification Using Fourier Transform
Authors: A. F. Ali, D. M. Shawky
Abstract:
Discovering new biological knowledge from high-throughput biological data is a major challenge for bioinformatics today. To address this challenge, we developed a new approach for protein classification. Proteins that are evolutionarily, and thereby functionally, related are said to belong to the same classification. Identifying protein classification is of fundamental importance for documenting the diversity of the known protein universe. It also provides a means to determine the functional roles of newly discovered protein sequences. Our goal is to predict the functional classification of novel protein sequences based on a set of features extracted from each protein sequence. The proposed technique uses datasets extracted from the Structural Classification of Proteins (SCOP) database and a set of spectral-domain features based on the fast Fourier transform (FFT). The proposed classifier uses a multilayer back-propagation (MLBP) neural network for protein classification. The maximum classification accuracy is about 91% when applying the classifier to the full four levels of the SCOP database, but reaches a maximum of 96% when the classification is limited to the family level. The classification results reveal that the spectral domain contains information that can be used for classification with high accuracy. In addition, the results emphasize that sequence similarity measures are of great importance, especially at the family level.
Keywords: Bioinformatics, Artificial Neural Networks, Protein Sequence Analysis, Feature Extraction.
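The spectral-feature idea can be sketched as follows: residues are mapped to numeric values, the sequence is Fourier-transformed, and low-order magnitudes feed a small neural network. The EIIP-style residue values, toy sequences and class labels are illustrative assumptions, and the scikit-learn MLP stands in for the MLBP network and SCOP data used in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# EIIP-style numeric encoding for a few residues (illustrative values); a real system
# would cover all 20 amino acids with published physico-chemical indices.
EIIP = {"A": 0.0373, "G": 0.0050, "L": 0.0000, "K": 0.0371, "S": 0.0829,
        "V": 0.0057, "E": 0.0058, "D": 0.1263, "R": 0.0959, "T": 0.0941}

def fft_features(seq, n_coeff=16):
    """Map a protein sequence to a fixed-length spectral feature vector."""
    signal = np.array([EIIP.get(res, 0.0) for res in seq])
    spectrum = np.abs(np.fft.rfft(signal, n=256))      # zero-pad to a common length
    return spectrum[:n_coeff]

# Hypothetical toy dataset: sequences labelled with fictitious classes.
seqs   = ["AGLKSV" * 10, "EDRTAG" * 10, "AGLKSV" * 8 + "EDRT" * 3, "EDRTAG" * 9 + "AG"]
labels = ["class_a", "class_b", "class_a", "class_b"]

X = np.array([fft_features(s) for s in seqs])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X, labels)
print(clf.predict([fft_features("AGLKSVAGLKSV")]))
```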