Search results for: Steganalysis Heuristic approach.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5129

3599 Clustering Categorical Data Using the K-Means Algorithm and the Attribute’s Relative Frequency

Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami

Abstract:

Clustering is a well-known data mining technique used in pattern recognition and information retrieval. The dataset to be clustered may contain either categorical or numeric data, and each type of data has its own dedicated clustering algorithm: k-means for numeric datasets and k-modes for categorical datasets. A recurring problem in data mining applications is the clustering of categorical datasets, which are very common in practice. One way to cluster categorical values is to transform the categorical attributes into numeric measures and apply the k-means algorithm directly instead of k-modes. In this paper, we experiment with such an approach, transforming the categorical values into numeric ones using the relative frequency of each modality within its attribute. The proposed approach is compared with a previous method based on transforming the categorical datasets into binary values, and the scalability and accuracy of the two methods are evaluated experimentally. The obtained results show that our proposed method outperforms the binary method in all cases.
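As an informal illustration of the encoding idea described in this abstract (not the authors' code), the sketch below replaces each categorical value by the relative frequency of its modality within the attribute and then applies standard k-means; the toy data, the cluster count and the use of scikit-learn are assumptions.

```python
# Sketch: encode categorical values by the relative frequency of each modality,
# then cluster with standard k-means. Toy data and cluster count are made up.
from collections import Counter
import numpy as np
from sklearn.cluster import KMeans

rows = [
    ["red",   "small", "metal"],
    ["red",   "large", "wood"],
    ["blue",  "small", "metal"],
    ["green", "small", "metal"],
    ["blue",  "large", "wood"],
]

def relative_frequency_encode(rows):
    """Map each categorical value to the relative frequency of its modality within its attribute."""
    data = np.asarray(rows, dtype=object)
    encoded = np.zeros(data.shape, dtype=float)
    for j in range(data.shape[1]):
        counts = Counter(data[:, j])
        total = data.shape[0]
        for i in range(data.shape[0]):
            encoded[i, j] = counts[data[i, j]] / total
    return encoded

X = relative_frequency_encode(rows)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```

One design caveat of this kind of frequency encoding is that two different modalities with the same relative frequency map to the same number, so some categorical information can be lost.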

Keywords: Clustering, k-means, categorical datasets, pattern recognition, unsupervised learning, knowledge discovery.

3598 Optimization Approach to Estimate Hammerstein–Wiener Nonlinear Blocks in Presence of Noise and Disturbance

Authors: Leili Esmaeilani, Jafar Ghaisari, Mohsen Ahmadian

Abstract:

The Hammerstein–Wiener model is a block-oriented model in which a linear dynamic system is surrounded by two static nonlinearities at its input and output; it can be used to model various processes. This paper presents an optimization-based method for analysing the identification of Hammerstein–Wiener systems. The method reformulates the identification problem, solves it as a constrained quadratic problem, and analyses its solutions. In formulating the problem, the effects of adding noise to both the input and output signals of the nonlinear blocks, and of adding a disturbance to the linear block, on the resulting equations are discussed. In addition, a parametric form of the matrix operations that reduces the size of the equations is presented. To analyse the possible solutions of the resulting system of equations, a method is presented that reduces the gap between the number of equations and the number of unknown variables by formulating and importing existing knowledge about the nonlinear functions. The obtained equations are applied to an example H–W system to validate the results and illustrate the proposed method.

Keywords: Identification, Hammerstein-Wiener, optimization, quantization.

3597 Distributional Semantics Approach to Thai Word Sense Disambiguation

Authors: Sunee Pongpinigpinyo, Wanchai Rivepiboon

Abstract:

Word sense disambiguation is one of the most important open problems in natural language processing applications such as information retrieval and machine translation. Several strategies can be employed to resolve word ambiguity with a reasonable degree of accuracy: knowledge-based, corpus-based, and hybrid approaches. This paper focuses on the corpus-based strategy, which employs an unsupervised learning method for disambiguation. We report our investigation of Latent Semantic Indexing (LSI), an information retrieval and unsupervised learning technique, applied to the task of Thai noun and verb word sense disambiguation. Latent Semantic Indexing has been shown to be efficient and effective for information retrieval. For the purposes of this research, we report experiments on two Thai polysemous words, /hua4/ and /kep1/, used as representatives of Thai nouns and verbs respectively. The results of these experiments demonstrate the effectiveness, and indicate the potential, of applying vector-based distributional information measures to semantic disambiguation.
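For readers unfamiliar with LSI, here is a minimal sketch (not from the paper) that builds a small term-context matrix and projects it into a latent space with a truncated SVD; the toy English contexts, the choice of k = 2 and the scikit-learn vectorizer stand in for the Thai corpus and the study's actual setup.

```python
# Sketch: Latent Semantic Indexing as a truncated SVD of a term-context matrix.
# The toy "contexts" stand in for sentences surrounding an ambiguous word.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

contexts = [
    "bank river water fishing",
    "bank money loan interest",
    "river water flood rain",
    "money interest account loan",
]

X = CountVectorizer().fit_transform(contexts).toarray().astype(float)

# Truncated SVD: keep k latent dimensions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vectors = U[:, :k] * s[:k]          # each context in the latent space

# Contexts that share a latent "sense" end up close together.
print(np.round(cosine_similarity(doc_vectors), 2))
```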

Keywords: Distributional semantics, Latent Semantic Indexing, natural language processing, polysemous words, unsupervised learning, word sense disambiguation.

3596 A Complexity-Based Approach in Image Compression using Neural Networks

Authors: Hadi Veisi, Mansour Jamzad

Abstract:

In this paper we present an adaptive method for image compression that is based on the complexity level of the image. The basic compressor/de-compressor structure of this method is a multilayer perceptron artificial neural network. In the adaptive approach, different back-propagation artificial neural networks are used as compressor and de-compressor: the image is divided into blocks, the complexity of each block is computed, and one network is selected for each block according to its complexity value. Three complexity measures, called Entropy, Activity and Pattern-based, are used to determine the level of complexity in image blocks, and their ability to estimate complexity is evaluated and compared. In training and evaluation, each image block is assigned to a network based on its complexity value. Best-SNR is an alternative way of selecting the compressor network for image blocks in the evaluation phase: it chooses whichever trained network yields the best SNR when compressing the input image block. In our evaluations, the best results are obtained when overlapping of the blocks is allowed and the compressor network is chosen based on Best-SNR. In this case, the results demonstrate the superiority of this method compared with previous similar works and with JPEG standard coding.
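As a rough illustration of one of the three complexity measures named above, the snippet below computes the entropy of fixed-size image blocks; the 8x8 block size, the synthetic image and the histogram settings are arbitrary choices, not the paper's.

```python
# Sketch: entropy-based complexity of image blocks, one of the measures the
# abstract mentions. Block size and the stand-in image are arbitrary.
import numpy as np

def block_entropy(block, bins=256):
    """Shannon entropy (bits) of the grey-level histogram of a block."""
    hist, _ = np.histogram(block, bins=bins, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def block_complexities(image, block=8):
    h, w = image.shape
    scores = {}
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            scores[(y, x)] = block_entropy(image[y:y + block, x:x + block])
    return scores

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(32, 32)).astype(np.uint8)   # stand-in image
for pos, e in list(block_complexities(img).items())[:4]:
    print(pos, round(e, 2))
```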

Keywords: Adaptive image compression, Image complexity, Multi-layer perceptron neural network, JPEG Standard, PSNR.

3595 Fighter Aircraft Evaluation and Selection Process Based on Triangular Fuzzy Numbers in Multiple Criteria Decision Making Analysis Using the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS)

Authors: C. Ardil

Abstract:

This article presents a multiple criteria evaluation approach to uncertainty, vagueness, and imprecision analysis for ranking alternatives with fuzzy data for decision making using the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). The fighter aircraft evaluation and selection decision making problem is modeled in a fuzzy environment with triangular fuzzy numbers. The fuzzy decision information related to the fighter aircraft selection problem is taken into account in ordering the alternatives and selecting the best candidate. The basic fuzzy TOPSIS procedure steps transform fuzzy decision matrices into matrices of alternatives evaluated according to all decision criteria. A practical numerical example illustrates the proposed approach to the fighter aircraft selection problem.
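A minimal sketch of the final ranking step of fuzzy TOPSIS with triangular fuzzy numbers is given below, assuming the ratings are already weighted and normalized; the criteria, aircraft labels and numbers are hypothetical, and the full procedure in the paper contains more steps than shown here.

```python
# Sketch: the distance / closeness-coefficient step of fuzzy TOPSIS with
# triangular fuzzy numbers (TFNs). Ratings are assumed already weighted and
# normalized to [0, 1]; alternatives and values below are made up.
import math

def tfn_distance(a, b):
    """Vertex distance between two TFNs a = (l, m, u) and b = (l, m, u)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3.0)

def closeness(alternative, fpis=(1.0, 1.0, 1.0), fnis=(0.0, 0.0, 0.0)):
    """Closeness coefficient of one alternative (a list of TFNs, one per criterion)."""
    d_plus = sum(tfn_distance(r, fpis) for r in alternative)
    d_minus = sum(tfn_distance(r, fnis) for r in alternative)
    return d_minus / (d_plus + d_minus)

# Hypothetical weighted, normalized ratings for three aircraft on two criteria.
alternatives = {
    "A1": [(0.6, 0.8, 1.0), (0.4, 0.6, 0.8)],
    "A2": [(0.2, 0.4, 0.6), (0.6, 0.8, 1.0)],
    "A3": [(0.4, 0.6, 0.8), (0.2, 0.4, 0.6)],
}
for name, ratings in sorted(alternatives.items(), key=lambda kv: -closeness(kv[1])):
    print(name, round(closeness(ratings), 3))
```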

Keywords: triangular fuzzy number (TFN), multiple criteria decision making analysis, decision making, aircraft selection, MCDMA, fuzzy TOPSIS

3594 Investigation of the Physical Computing in Computational Thinking Practices, Computer Programming Concepts and Self-Efficacy for Crosscutting Ideas in STEM Content Environments

Authors: Sarantos Psycharis

Abstract:

Physical Computing, as an instructional model, is applied in the framework of Engineering Pedagogy to teach “transversal/cross-cutting ideas” in a STEM content approach. Labview and Arduino were used to connect the physical world with real data in the framework of the so-called Computational Experiment. Tertiary prospective engineering educators were engaged during their course, and Computational Thinking (CT) concepts were registered before and after the intervention, across didactic activities, using validated questionnaires on the relationship between self-efficacy, computer programming, and CT concepts when a STEM content epistemology is implemented in alignment with the Computational Pedagogy model. Results show a significant change in students’ responses regarding self-efficacy for CT before and after the instruction. Results also indicate a significant relation between the responses to the different CT concepts/practices. According to the findings, a STEM content epistemology combined with Physical Computing should be a good candidate as a learning and teaching approach in university settings that enhances students’ engagement in CT concepts/practices.

Keywords: STEM, computational thinking, physical computing, Arduino, Labview, self-efficacy.

3593 The Islamic Hadiths on Female Circumcision and the Symbolism of Solomon’s Temple

Authors: Richard L. Worthington

Abstract:

Female ‘circumcision’ (FGC/FGM) in Islam is based primarily upon the ‘hadiths,’ which are the sayings of Muhammad. While it is usual to attack such hadiths in order to stop female ‘circumcision,’ those practicing it merely react against such an attack. However, there is a new approach, called ‘Temple Theology,’ which reads religious stories in the light of how the rituals and politics of Solomon’s temple were encoded in those stories. For example, one hadith tells us not to cut severely in circumcising a woman. However, the Menorah lampstand was symbolized as a woman, and so ‘circumcising’ a woman could be re-interpreted as merely referring to trimming the wicks of the lamps. Similarly, another hadith mentions that when a man is within the four parts of a woman (her arms and legs), the couple should bathe because their circumcised parts have met (implying that the woman was circumcised). However, the bronze ‘Sea’ basin of Solomon’s temple, used for immersion (‘bathing’), had four sides, implying that the circumcised parts relate to temple symbolism. The hadiths relating to the fitra – Islamic practices which include circumcision – and to Hagar being circumcised by Sarah are likewise interpreted. This approach implies that the hadiths can be respected without giving them a literal interpretation. In this way, it is hoped that those devout Muslims who defend female ‘circumcision’ can re-evaluate their position in a positive way from within their own tradition, as opposed to being seemingly hounded by non-Muslims.

Keywords: Female circumcision, Fitra, Hadith, Temple theology.

3592 Business Intelligence for N=1 Analytics using Hybrid Intelligent System Approach

Authors: Rajendra M Sonar

Abstract:

The future of business intelligence (BI) is to integrate intelligence into operational systems that work in real time, analyzing small chunks of data on a continuous basis as required. This is a move away from the traditional approach of doing analysis on an ad-hoc basis, or sporadically, in a passive and off-line mode over huge amounts of data. Various AI techniques such as expert systems, case-based reasoning and neural networks play an important role in building business intelligent systems. Since BI involves various tasks and models various types of problems, hybrid intelligent techniques can be a better choice. Intelligent systems accessible through web services make it easier to integrate them into existing operational systems and so add intelligence to every business process. They can be built to be invoked in a modular and distributed way and to work in real time. The functionality of such systems can be extended to accept external inputs in formats such as RSS. In this paper, we describe a framework that uses effective combinations of these techniques, is accessible through web services, and works in real time. We have successfully developed various prototype systems and completed a few commercial deployments in the area of personalization and recommendation on mobile and websites.

Keywords: Business Intelligence, Customer Relationship Management, Hybrid Intelligent Systems, Personalization and Recommendation (P&R), Recommender Systems.

3591 Parallel-computing Approach for FFT Implementation on Digital Signal Processor (DSP)

Authors: Yi-Pin Hsu, Shin-Yu Lin

Abstract:

An efficient parallel formulation on a digital signal processor can improve an algorithm's performance. The butterfly structure plays an important role in the fast Fourier transform (FFT), because its symmetric form is suitable for hardware implementation. Although the structure is symmetric, performance is reduced by the data-dependent flow characteristic. Even though recent research on novel memory reference reduction methods (NMRRM) for the FFT focuses on reducing memory references for the twiddle factors, the data-dependent property still exists. In this paper, we propose a parallel-computing approach for FFT implementation on a digital signal processor (DSP) which is based on a data-independent property and still retains the low-memory-reference property. The proposed method combines the final two steps of the NMRRM FFT into a novel data-independent structure, which is well suited to multi-operation-unit digital signal processors and dual-core systems. We have applied the proposed method to a low-memory-reference radix-2 FFT algorithm on a TI TMS320C64x DSP. Experimental results show the method reduces clock cycles by 33.8% compared with the NMRRM FFT implementation while keeping the low-memory-reference property.
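For context, the butterfly structure mentioned above is the core of the textbook radix-2 decimation-in-time FFT sketched below; the paper's memory-reference reduction and DSP-specific parallelization are not reproduced here.

```python
# Sketch: textbook radix-2 decimation-in-time FFT, showing the butterfly
# structure the abstract refers to (not the paper's optimized DSP version).
import cmath

def fft_radix2(x):
    """Recursive radix-2 FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft_radix2(x[0::2])
    odd = fft_radix2(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle            # butterfly: upper output
        out[k + n // 2] = even[k] - twiddle   # butterfly: lower output
    return out

signal = [1, 2, 3, 4, 0, 0, 0, 0]
print([round(abs(v), 3) for v in fft_radix2(signal)])
```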

Keywords: Parallel-computing, FFT, low-memory reference, TI DSP.

3590 Building Information Modeling-Based Approach for Automatic Quantity Take-off and Cost Estimation

Authors: Lo Kar Yin, Law Ka Mei

Abstract:

Architectural, engineering, construction and operations (AECO) industry practitioners have been adapting well to the dynamic construction market on the strength of the fundamental training of their disciplines. Further triggered by the pandemic since 2019, great steps have been taken in the virtual environment, and the best possible collaboration is sought with project teams without boundaries. Adopting a Building Information Modeling-based approach and qualitative analysis, this paper reviews the quantity take-off (QTO) and cost estimation process through modeling techniques, in liaison with suppliers, fabricators, subcontractors, contractors, designers, consultants and service providers in the construction industry value chain. The aim is automatic project cost budgeting, project cost control, and cost evaluation of design options for in-situ reinforced-concrete construction and Modular Integrated Construction (MiC) at the design stage, as well as variation of works and cash flow/spending analysis at the construction stage as far as practicable, with a view to sharing the findings to enhance mutual trust and co-operation among AECO industry practitioners. The intention is to foster development through a common prototype of the design-and-build project delivery method under NEC4 Engineering and Construction Contract (ECC) Options A and C.

Keywords: Building Information Modeling, cost estimation, quantity take-off, modeling techniques.

3589 A Mapping Approach of Code Generation for Arinc653-Based Avionics Software

Authors: Lu Zou, Dianfu MA, Ying Wang, Xianqi Zhao

Abstract:

Avionics software architecture has transitioned from a federated architecture to integrated modular avionics (IMA). ARINC 653 (Avionics Application Standard Software Interface) is a software specification for space and time partitioning in safety-critical avionics real-time operating systems. Methods to transform abstract avionics application logic into an executable model have been proposed, but with little consideration of the code-generation input and output models specific to the ARINC 653 platform and of the inner-task synchronous dynamic interaction order. In this paper, we propose an AADL-based model-driven design methodology for automatically generating a Cµ executable model on the ARINC 653 platform from an ARINC 653 architecture defined as AADL653, in order to facilitate the development of avionics software built on an ARINC 653 OS. The paper presents the mapping rules between AADL653 elements and the elements of the Cµ language, defines the code-generation rules, and designs an automatic Cµ code generator. We then use a case study to illustrate our approach. Finally, we discuss related work and future research directions.

Keywords: IMA, ARINC653, AADL653, code generation.

3588 Optimizing Dialogue Strategy Learning Using Learning Automata

Authors: G. Kumaravelan, R. Sivakumar

Abstract:

Modeling the behavior of dialogue management in the design of a spoken dialogue system using statistical methodologies is currently a growing research area. This paper presents work on developing an adaptive learning approach to optimize dialogue strategy. At the core of our system is a method that formalizes dialogue management as sequential decision making under uncertainty whose underlying probabilistic structure is a Markov chain. Researchers have mostly focused on model-free algorithms for automating the design of dialogue management using machine learning techniques such as reinforcement learning, but model-free algorithms face a dilemma in balancing exploration against exploitation. Hence, we present a model-based online policy learning algorithm using interconnected learning automata for optimizing dialogue strategy. The proposed algorithm is capable of deriving an optimal policy that prescribes what action should be taken in various states of the conversation so as to maximize the expected total reward for attaining the goal, and it incorporates good exploration and exploitation in its updates to improve the naturalness of human-computer interaction. We test the proposed approach using the most sophisticated evaluation framework, PARADISE, on access to the railway information system.
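The interconnected automata and the PARADISE evaluation are beyond a short example, but a single variable-structure learning automaton with the classic linear reward-inaction update looks roughly like the sketch below; the three actions, the reward signal and the learning rate are placeholders, not the paper's setup.

```python
# Sketch: a single variable-structure learning automaton using the linear
# reward-inaction (L_RI) update. Reward signal and learning rate are toy
# placeholders; the paper uses interconnected automata over dialogue states.
import random

class LearningAutomaton:
    def __init__(self, n_actions, rate=0.1):
        self.p = [1.0 / n_actions] * n_actions   # action probabilities
        self.rate = rate

    def choose(self):
        return random.choices(range(len(self.p)), weights=self.p)[0]

    def update(self, action, reward):
        """L_RI: move probability mass toward `action` only when rewarded."""
        if reward <= 0:
            return                                # inaction on penalty
        for i in range(len(self.p)):
            if i == action:
                self.p[i] += self.rate * (1.0 - self.p[i])
            else:
                self.p[i] -= self.rate * self.p[i]

automaton = LearningAutomaton(n_actions=3)
for _ in range(200):                              # toy environment: action 2 is best
    a = automaton.choose()
    automaton.update(a, reward=1 if a == 2 else 0)
print([round(p, 2) for p in automaton.p])
```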

Keywords: Dialogue management, Learning automata, Reinforcement learning, Spoken dialogue system

3587 Modelling and Simulation of Cascaded H-Bridge Multilevel Single Source Inverter Using PSIM

Authors: Gaddafi S. Shehu, T. Yalcinoz, Abdullahi B. Kunya

Abstract:

Multilevel inverters such as flying-capacitor, diode-clamped, and cascaded H-bridge inverters are very popular, particularly in medium- and high-power applications. This paper focuses on a cascaded H-bridge module using a single direct current (DC) source in order to generate an 11-level output voltage. The novel approach reduces the number of switches and gate drivers in comparison with a conventional method. The anticipated topology produces a more accurate result with an isolation transformer at high switching frequency. Different modulation techniques can be used for the multilevel inverter, but this work features the modulation technique known as selective harmonic elimination (SHE). This modulation approach reduces the number of carriers, with a reduction in switching losses and total harmonic distortion (THD), thereby increasing power quality (PQ). Based on the simulation result obtained, it appears SHE has the ability to eliminate selected harmonics by chopping off the fundamental output component. The performance evaluation of the proposed cascaded multilevel inverter is performed using the PSIM simulation package, and a THD of 0.94% is obtained.

Keywords: Cascaded H-bridge Multilevel Inverter, Power Quality, Selective Harmonic Elimination.

3586 A New Stability Analysis and Stabilization of Discrete-Time Switched Linear Systems Using Vector Norms Approach

Authors: Marwen Kermani, Anis Sakly, Faouzi M'sahli

Abstract:

In this paper, we investigate a new stability analysis for discrete-time switched linear systems based on comparison, the overvaluing principle, the application of the Borne-Gentina criterion and the Kotelyanski conditions. These stability conditions, issued from vector norms, correspond to a vector Lyapunov function. The switched system to be controlled is represented in the companion form. A comparison system relative to a regular vector norm is used in order to obtain the simple arrow form of the state matrix, which allows a suitable use of the Borne-Gentina criterion for establishing sufficient conditions for global asymptotic stability. The proposed approach can be a constructive solution to the state and static output feedback stabilization problems.

Keywords: Discrete-time switched linear systems, Global asymptotic stability, Vector norms, Borne-Gentina criterion, Arrow form state matrix, Arbitrary switching, State feedback controller, Static output feedback controller.

3585 Riemannian Manifolds for Brain Extraction on Multi-modal Resonance Magnetic Images

Authors: Mohamed Gouskir, Belaid Bouikhalene, Hicham Aissaoui, Benachir Elhadadi

Abstract:

In this paper, we present an application of Riemannian geometry to processing non-Euclidean image data. We consider the image as residing in a Riemannian manifold in order to develop a new method for brain edge detection and brain extraction. Automating this process is a challenge due to the high diversity in the appearance of brain tissue among different patients and sequences. The main contribution of this paper is the use of an edge-based anisotropic diffusion tensor for the segmentation task, integrating both image edge geometry and the Riemannian manifold (geodesics, metric tensor) to regularize the converging contour and extract complex anatomical structures. We check the accuracy of the segmentation results on simulated brain MRI scans of single T1-weighted, T2-weighted and proton density sequences. We validate our approach using two different databases: the BrainWeb database and the MRI Multiple Sclerosis Database (MRI MS DB). We have compared our approach, qualitatively and quantitatively, with well-known brain extraction algorithms. We show that applying Riemannian manifolds to medical image analysis improves brain extraction results, in real time, outperforming the standard techniques.

Keywords: Riemannian manifolds, Riemannian Tensor, Brain Segmentation, Non-Euclidean data, Brain Extraction.

3584 Protein Secondary Structure Prediction Using Parallelized Rule Induction from Coverings

Authors: Leong Lee, Cyriac Kandoth, Jennifer L. Leopold, Ronald L. Frank

Abstract:

Protein 3D structure prediction has always been an important research area in bioinformatics. In particular, the prediction of secondary structure has been a well-studied research topic. Despite the recent breakthrough of combining multiple sequence alignment information and artificial intelligence algorithms to predict protein secondary structure, the Q3 accuracy of various computational prediction algorithms has rarely exceeded 75%. In a previous paper [1], this research team presented a rule-based method called RT-RICO (Relaxed Threshold Rule Induction from Coverings) to predict protein secondary structure. The average Q3 accuracy on the sample datasets using RT-RICO was 80.3%, an improvement over comparable computational methods. Although this demonstrated that RT-RICO might be a promising approach for predicting secondary structure, the algorithm's computational complexity and program running time limited its use. Herein a parallelized implementation of a slightly modified RT-RICO approach is presented. This new version of the algorithm facilitated the testing of a much larger dataset of 396 protein domains [2]. Parallelized RT-RICO achieved a Q3 score of 74.6%, which is higher than the consensus prediction accuracy of 72.9% that was achieved for the same test dataset by a combination of four secondary structure prediction methods [2].
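For reference, Q3 is simply the fraction of residues whose predicted secondary-structure class (helix, strand, coil) matches the observed class; a small sketch of that computation (not of RT-RICO itself) follows, with made-up sequences.

```python
# Sketch: Q3 accuracy = fraction of residues whose predicted class (H, E, C)
# matches the observed class. The sequences below are made up.
def q3_accuracy(predicted, observed):
    assert len(predicted) == len(observed)
    correct = sum(p == o for p, o in zip(predicted, observed))
    return correct / len(observed)

observed  = "CCHHHHHCCEEEECC"
predicted = "CCHHHHCCCEEEHCC"
print(f"Q3 = {q3_accuracy(predicted, observed):.1%}")
```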

Keywords: data mining, protein secondary structure prediction, parallelization.

3583 Study on the Impact of Size and Position of the Shear Field in Determining the Shear Modulus of Glulam Beam Using Photogrammetry Approach

Authors: Niaz Gharavi, Hexin Zhang

Abstract:

The shear modulus of a timber beam can be determined using the torsion test or the shear field test method. The shear field test method is based on measuring the shear distortion of the beam in the zone with constant transverse load in the standardized four-point bending test. The current code of practice advises using two metallic arms as an instrument to measure the diagonal displacement of the constructed square. The size and the position of this square might influence the shear modulus determination. This study aimed to investigate the effects of the size and the position of the square in the shear field test method. A binocular stereo vision system was employed to determine the 3D displacement of a grid of target points. Six glue-laminated beams were produced and tested. Analysis of Variance (ANOVA) was performed on the acquired data to evaluate the significance of the size effect and the position effect of the square. The results show that the size of the square has a noticeable influence on the value of the shear modulus, while the position of the square within the area with constant shear force does not affect the measured mean shear modulus.

Keywords: Shear field test method, structural-sized test, shear modulus of Glulam beam, photogrammetry approach.

3582 Fighter Aircraft Selection Using Fuzzy Preference Optimization Programming (POP)

Authors: C. Ardil

Abstract:

The Turkish Air Force needs to acquire a sixth-generation fighter aircraft in order to maintain its air superiority and dominance against its rivals under the risks posed by global geopolitical opportunities and threats. Accordingly, five evaluation criteria were determined to evaluate the sixth-generation fighter aircraft alternatives and to select the best one. Systematically, a new fuzzy preference optimization programming (POP) method is proposed to select the best sixth-generation fighter aircraft in an uncertain environment. The POP technique considers both quantitative and qualitative evaluation criteria. To demonstrate the applicability and effectiveness of the proposed approach, it is applied to a multiple criteria decision-making problem to evaluate and select sixth-generation fighter aircraft. The results of the fuzzy POP method are compared with the results of the fuzzy TOPSIS approach to validate it. According to the comparative analysis, the fuzzy POP and fuzzy TOPSIS methods give the same results. This demonstrates the applicability of the fuzzy POP technique to the sixth-generation fighter selection problem.

Keywords: Fighter aircraft selection, sixth-generation fighter aircraft, fuzzy decision process, multiple criteria decision making, preference optimization programming, POP, TOPSIS, Kizilelma, MIUS, fuzzy set theory

3581 IMLFQ Scheduling Algorithm with Combinational Fault Tolerant Method

Authors: MohammadReza EffatParvar, Akbar Bemana, Mehdi EffatParvar

Abstract:

Scheduling algorithms are used in operating systems to optimize the usage of processors. One of the most efficient scheduling algorithms is the Multi-Layer Feedback Queue (MLFQ) algorithm, which uses several queues with different quanta. The most important weakness of this method is the inability to determine the optimal number of queues and the quantum of each queue, a weakness that has been addressed in the IMLFQ scheduling algorithm. The number of queues and the quantum of each queue affect the response time directly. In this paper, we review the IMLFQ algorithm for solving these problems and minimizing the response time. In this algorithm, a recurrent neural network is utilized to find both the number of queues and the optimal quantum of each queue. Also, in order to prevent probable faults in the computation of processes' response times, a new fault-tolerant approach is presented, which uses combinational software redundancy. The experimental results show that using the IMLFQ algorithm results in better response times in comparison with other scheduling algorithms, and that the fault-tolerant mechanism further improves IMLFQ performance.
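To make the structure being tuned concrete, the toy sketch below runs a multi-layer feedback queue with hard-coded quanta; in IMLFQ the number of queues and the quanta would instead be chosen by the recurrent neural network, and new arrivals and the fault-tolerance mechanism are not modelled.

```python
# Sketch: a toy multi-layer feedback queue with fixed quanta. IMLFQ's point is
# to tune the number of queues and the quanta; here they are simply hard-coded
# to illustrate the underlying structure.
from collections import deque

def mlfq(burst_times, quanta=(4, 8, 16)):
    """Return completion times for processes given CPU burst times."""
    queues = [deque() for _ in quanta]
    remaining = dict(enumerate(burst_times))
    for pid in remaining:
        queues[0].append(pid)                      # all processes start at top priority
    clock, finish = 0, {}
    while remaining:
        for level, q in enumerate(queues):
            if q:
                pid = q.popleft()
                run = min(quanta[level], remaining[pid])
                clock += run
                remaining[pid] -= run
                if remaining[pid] == 0:
                    finish[pid] = clock
                    del remaining[pid]
                else:                              # demote to a lower-priority queue
                    queues[min(level + 1, len(queues) - 1)].append(pid)
                break
    return finish

print(mlfq([3, 10, 25]))
```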

Keywords: IMLFQ, Fault Tolerant, Scheduling, Queue, Recurrent Neural Network.

3580 Face Localization and Recognition in Varied Expressions and Illumination

Authors: Hui-Yu Huang, Shih-Hang Hsu

Abstract:

In this paper, we propose a robust scheme for face alignment and recognition under various influences. For face representation, illumination and variable expressions are important factors, affecting in particular the accuracy of facial localization and face recognition. In order to address these factors, we propose a robust approach consisting of two phases. In the first phase, face images are preprocessed by means of the proposed illumination normalization method, and the locations of facial features can be fitted more efficiently and quickly based on the proposed image blending. In addition, based on template matching, we further improve the active shape model (called IASM) to locate the face shape more precisely, which increases the recognition rate in the next phase. The second phase performs feature extraction using principal component analysis and face recognition using support vector machine classifiers. The results show that the proposed method achieves good facial localization and face recognition under varied illumination and local distortion.

Keywords: Gabor filter, improved active shape model (IASM), principal component analysis (PCA), face alignment, face recognition, support vector machine (SVM)

3579 Enhancing Experiential Learning in a Smart Flipped Classroom: A Case Study

Authors: Fahri Benli, Sitalakshmi Venkatraman, Ye Wei, Fiona Wahr

Abstract:

A flipped classroom, a form of blended learning, shifts the focus from a teacher-centered approach to a learner-centered approach. However, not all learners are ready to take on the active role of knowledge and skill acquisition that a flipped classroom demands, and some remain in a passive mode of learning. This challenges educators in designing, scaffolding and facilitating in-class activities for students to have active learning experiences in a flipped classroom environment. Experiential learning theories have been employed by educators in physical classrooms in the past, based on the principle that knowledge can be actively developed through direct experience. However, with more online teaching witnessed recently, there are inherent limitations in designing and simulating experiential learning activities for an online environment. In this paper, we explore enhancing experiential learning using smart digital tools that can be employed in a flipped classroom within a higher education setting. As a case study, we present the use of smart collaborative tools online to enhance the experiential learning activity for teaching the higher-order cognitive concepts of business process modeling.

Keywords: Experiential learning, flipped classroom, smart software tools, online learning, higher-order learning attributes.

3578 Hardware Implementation of Local Binary Pattern Based Two-Bit Transform Motion Estimation

Authors: Seda Yavuz, Anıl Çelebi, Aysun Taşyapı Çelebi, Oğuzhan Urhan

Abstract:

Nowadays, demand for devices capable of real-time video transmission is ever-increasing, and high-resolution video has made efficient video compression techniques an essential component of capturing and transmitting video data. Motion estimation has a critical role in encoding raw video; hence, various motion estimation methods have been introduced to compress video efficiently. Low bit-depth representation based motion estimation methods facilitate the computation of matching criteria and thus provide a small hardware footprint. In this paper, a hardware implementation of a two-bit-transform-based low-complexity motion estimation method using a local binary pattern approach is proposed. Image frames are represented in two-bit depth instead of full depth by making use of the local binary pattern as a binarization approach, and the binarization part of the hardware architecture is explained in detail. Experimental results demonstrate the difference between the proposed hardware architecture and the architectures of well-known low-complexity motion estimation methods in terms of important aspects such as resource utilization and energy and power consumption.
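As a rough feel for the binarization step (not the paper's exact two-bit transform or its hardware pipeline), the sketch below produces one bit-plane by comparing each pixel of a block with the mean of its 3x3 neighbourhood; the block size and the random test block are arbitrary.

```python
# Sketch: a local-binary-pattern style binarization of an image block, to give
# a feel for low bit-depth representation. The exact two-bit transform used in
# the paper (and its hardware pipeline) is not reproduced here.
import numpy as np

def lbp_bitplane(block):
    """1 where a pixel exceeds the mean of its 3x3 neighbourhood, else 0."""
    padded = np.pad(block.astype(float), 1, mode="edge")
    h, w = block.shape
    neigh_mean = np.zeros_like(block, dtype=float)
    for y in range(h):
        for x in range(w):
            win = padded[y:y + 3, x:x + 3]
            neigh_mean[y, x] = (win.sum() - padded[y + 1, x + 1]) / 8.0
    return (block > neigh_mean).astype(np.uint8)

rng = np.random.default_rng(1)
block = rng.integers(0, 256, size=(8, 8)).astype(np.uint8)   # stand-in image block
print(lbp_bitplane(block))
```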

Keywords: Binarization, hardware architecture, local binary pattern, motion estimation, two-bit transform.

3577 A Simulation-Optimization Approach to Control Production, Subcontracting and Maintenance Decisions for a Deteriorating Production System

Authors: Héctor Rivera-Gómez, Eva Selene Hernández-Gress, Oscar Montaño-Arango, Jose Ramon Corona-Armenta

Abstract:

This research studies the joint production, maintenance and subcontracting control policy for an unreliable, deteriorating manufacturing system. Production activities are controlled by a derivation of the hedging point policy, and since the system is subject to deterioration, its capacity to satisfy product demand is progressively reduced. Multiple deterioration effects are considered, reflected mainly in the quality of the parts produced and the reliability of the machine. Subcontracting is available as support to satisfy product demand; in addition, overhaul maintenance can be conducted to reduce the effects of deterioration. The main objective of the research is to determine simultaneously the production, maintenance and subcontracting rates that minimize the total incurred cost. A stochastic dynamic programming model is developed and solved through a simulation-based approach composed of statistical analysis and optimization with the response surface methodology. The obtained results highlight the strong interactions between production, deterioration and quality, which justify the development of an integrated model. A numerical example and a sensitivity analysis are presented to validate our results.

Keywords: Deterioration, simulation, subcontracting, production planning.

3576 A New Classification of Risk-Reduction Options to Improve the Risk-Reduction Readiness of the Railway Industry

Authors: Eberechi Weli, Michael Todinov

Abstract:

The gap between the selection of risk-reduction options in the railway industry and the task of their effective implementation results in compromised safety and substantial losses. Effective risk management must necessarily integrate the evaluation phases with the implementation phase. This paper proposes an essential categorisation of risk-reduction measures that best addresses a standard railway industry portfolio. By categorising the risk-reduction options into design, operational, procedural and technical options, it is guaranteed that the efforts of the implementation facilitators (people, processes and supporting systems) are systematically harmonised. The classification is based on an integration of fundamental principles of risk reduction in the railway industry with the systems engineering approach.

This paper argues that the use of a similar classification approach is an attribute of organisations possessing a superior level of risk-reduction readiness. The integration of the proposed rational classification structure provides a solid ground for effective risk reduction.

Keywords: Cost effectiveness, organisational readiness, risk reduction, railway, system engineering.

3575 Strengthening the HCI Approaches in the Software Development Process

Authors: Rogayah A. Majid, Nor Laila Md. Noor, Wan Adilah Wan Adnan

Abstract:

User-Centered Design (UCD), Usability Engineering (UE) and Participatory Design (PD) are the common Human-Computer Interaction (HCI) approaches practiced in the software development process, focusing on issues and matters concerning user involvement. They overlook the organizational perspective of HCI integration within the software development organization. The Management Information Systems (MIS) perspective of HCI takes a managerial and organizational context in viewing the effectiveness of integrating HCI into the software development process. Human-Centered Design (HCD), which encompasses all of the human aspects including the aesthetic and the ergonomic, is claimed to provide a better approach for strengthening the HCI approaches and thereby the software development process. In determining the effectiveness of HCD in the software development process, this paper presents the findings of a content analysis of HCI approaches, viewing those approaches as a technology that integrates user requirements from top management down to the other stakeholders in the software development process. The findings show that HCD is an approach that emphasizes humans, tools and knowledge in strengthening the HCI approaches, and in turn the software development process, in the quest to produce sustainable, usable and useful software products.

Keywords: Human-Centered Design (HCD), Management Information Systems (MIS), Participatory Design (PD), User-Centered Design (UCD), Usability Engineering (UE)

3574 Fuzzy Inference System for Determining Collision Risk of Ship in Madura Strait Using Automatic Identification System

Authors: Emmy Pratiwi, Ketut B. Artana, A. A. B. Dinariyana

Abstract:

Madura Strait is considered one of the busiest shipping channels in Indonesia. The high vessel traffic density in Madura Strait poses a serious threat to navigational safety in this area, namely ship collision. This study is an attempt to enhance the safety of marine traffic. A fuzzy inference system (FIS) is proposed to calculate the collision risk of ships. Collision risk is evaluated based on the ship domain, the Distance to Closest Point of Approach (DCPA), and the Time to Closest Point of Approach (TCPA). Data were collected by utilizing the Automatic Identification System (AIS). This study considers several ship domain models to characterize the marine traffic in the waterways. Each encounter within the ship domain is analyzed to obtain the level of collision risk. The resulting risk levels of ships can be used as guidance to avoid accidents, provide a brief description of traffic safety in Madura Strait, and improve navigational safety in the area.
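For readers unfamiliar with the two kinematic indicators, the sketch below computes DCPA and TCPA from relative position and velocity in a flat x/y frame; real AIS data would first need projection of latitude/longitude and conversion of course/speed, and the fuzzy inference stage is not shown. The example positions and velocities are made up.

```python
# Sketch: DCPA and TCPA between own ship and a target from position and
# velocity vectors in a flat x/y frame (metres and metres per second).
import numpy as np

def cpa(own_pos, own_vel, tgt_pos, tgt_vel):
    r = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)   # relative position
    v = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)   # relative velocity
    if np.allclose(v, 0):
        return float(np.linalg.norm(r)), float("inf")             # no relative motion
    tcpa = -float(r @ v) / float(v @ v)          # time to closest point of approach
    dcpa = float(np.linalg.norm(r + v * tcpa))   # distance at that time
    return dcpa, tcpa

dcpa, tcpa = cpa(own_pos=(0, 0), own_vel=(5, 0), tgt_pos=(2000, 1000), tgt_vel=(-4, -1))
print(f"DCPA = {dcpa:.0f} m, TCPA = {tcpa:.0f} s")
```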

Keywords: Automatic identification system, collision risk, DCPA, fuzzy inference system, TCPA.

3573 Decision Analysis Module for Excel

Authors: Radomir Perzina, Jaroslav Ramik

Abstract:

The Analytic Hierarchy Process is a frequently used approach for solving decision making problems, and there exists a wide range of software programs utilizing it. Their main disadvantages are that they are relatively expensive and do not show intermediate calculations. This work introduces a Microsoft Excel add-in called DAME – Decision Analysis Module for Excel. Compared to other computer programs, DAME is free, can work with scenarios or multiple decision makers, and displays intermediate calculations. Users can structure their decision models into three levels – scenarios/users, criteria and variants. Items on all levels can be evaluated either by weights or by pair-wise comparisons. Three different methods are provided for evaluating the weights of the criteria, the variants and the scenarios – Saaty’s Method, the Geometric Mean Method and Fuller’s Triangle Method. Multiplicative and additive syntheses are supported. The proposed software package is demonstrated on a couple of illustrative examples of real-life decision problems.
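As an informal illustration of one of the three weighting methods listed above, the snippet below derives priority weights from a pairwise comparison matrix with the geometric mean method; the 3x3 matrix is a made-up example, not output from DAME.

```python
# Sketch: priority weights from a reciprocal pairwise comparison matrix using
# the geometric mean (row geometric means, normalized to sum to 1).
import numpy as np

def geometric_mean_weights(pairwise):
    """Row geometric means of a reciprocal comparison matrix, normalized to 1."""
    A = np.asarray(pairwise, dtype=float)
    gm = A.prod(axis=1) ** (1.0 / A.shape[1])
    return gm / gm.sum()

criteria = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
print(np.round(geometric_mean_weights(criteria), 3))
```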

Keywords: Analytic hierarchy process, multi-criteria decision making, pair-wise comparisons, Microsoft Excel, Scenarios.

3572 Identity Management in Virtual Worlds Based on Biometrics Watermarking

Authors: S. Bader, N. Essoukri Ben Amara

Abstract:

With technological development and the rise of virtual worlds, these spaces are becoming more and more attractive to cybercriminals, hidden behind avatars and fictitious identities. Since access to these spaces is not restricted or controlled, some impostors take advantage of this to gain unauthorized access and engage in cybercrime. This paper proposes an identity management approach for securing access to virtual worlds. The major purpose of the suggested solution is to install a strong security mechanism to protect the virtual identities represented by avatars, so that only legitimate users, through their corresponding avatars, are allowed to access the platform resources. Access is controlled by integrating an authentication process based on biometrics. During registration, a user fingerprint is enrolled and then encrypted into a watermark using a cancelable and non-invertible algorithm for its protection. After a user personalizes their representative character, the biometric mark is embedded into the avatar through a watermarking procedure. The authenticity of the avatar identity is verified when it requests authorization for access. We have evaluated the proposed approach on a dataset of avatars from various virtual worlds, and we have registered promising performance results in terms of authentication accuracy and acceptance and rejection rates.

Keywords: Identity management, security, biometrics authentication and authorization, avatar, virtual world.

3571 Mixtures of Monotone Networks for Prediction

Authors: Marina Velikova, Hennie Daniels, Ad Feelders

Abstract:

In many data mining applications, it is known a priori that the target function should satisfy certain constraints imposed by, for example, economic theory or a human decision maker. In this paper we consider partially monotone prediction problems, where the target variable depends monotonically on some of the input variables but not on all. We propose a novel method to construct prediction models in which monotone dependences with respect to some of the input variables are preserved by construction. Our method belongs to the class of mixture models. The basic idea is to convolve monotone neural networks with weight (kernel) functions to make predictions. Using simulation and real case studies, we demonstrate the application of our method. To obtain a sound assessment of the performance of our approach, we use standard neural networks with weight decay and partially monotone linear models as benchmark methods for comparison. The results show that our approach outperforms partially monotone linear models in terms of accuracy. Furthermore, the incorporation of partial monotonicity constraints not only leads to models that are in accordance with the decision maker's expertise, but also reduces the model variance considerably in comparison to standard neural networks with weight decay.

Keywords: mixture models, monotone neural networks, partially monotone models, partially monotone problems.

3570 Analysis of Vocal Fold Vibrations from High-Speed Digital Images Based On Dynamic Time Warping

Authors: A. I. A. Rahman, Sh-Hussain Salleh, K. Ahmad, K. Anuar

Abstract:

Analysis of vocal fold vibration is essential for understanding the mechanism of voice production and for improving the clinical assessment of voice disorders. This paper presents a Dynamic Time Warping (DTW) based approach to analyze and objectively classify vocal fold vibration patterns. The proposed technique was designed and implemented on a Glottal Area Waveform (GAW) extracted from high-speed laryngeal images by delineating the glottal edges in each image frame. Feature extraction from the GAW was performed using Linear Predictive Coding (LPC). Several types of voice reference templates, from simulations of clear, breathy, fry, pressed and hyperfunctional voice productions, were used. The patterns of the reference templates were first verified using the analytic signal generated through the Hilbert transformation of the GAW. Samples from normal speakers’ voice recordings were then used to evaluate and test the effectiveness of this approach. The classification of the voice patterns using LPC and DTW gave an accuracy of 81%.
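A bare-bones version of the matching step is sketched below: the classic dynamic time warping distance between a sample feature sequence and a reference template. The one-dimensional toy sequences are stand-ins for the LPC feature tracks used in the paper.

```python
# Sketch: classic dynamic time warping distance between two 1-D feature
# sequences, the kind of matching used to compare a sample against a template.
import numpy as np

def dtw_distance(a, b):
    """DTW with local cost |a[i] - b[j]| and the standard 3-way recursion."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

template = [0.0, 0.4, 0.9, 0.4, 0.0]           # e.g. one idealized glottal cycle
sample   = [0.0, 0.1, 0.5, 0.8, 0.9, 0.5, 0.1]
print(round(dtw_distance(sample, template), 3))
```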

Keywords: Dynamic Time Warping, Glottal Area Waveform, Linear Predictive Coding, High-Speed Laryngeal Images, Hilbert Transform.
