Search results for: discrete event system
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8876

8546 Adaptive Digital Watermarking Integrating Fuzzy Inference HVS Perceptual Model

Authors: Sherin M. Youssef, Ahmed Abouelfarag, Noha M. Ghatwary

Abstract:

An adaptive fuzzy inference perceptual model is proposed for the watermarking of digital images. The model depends on the human visual characteristics of image sub-regions in the frequency multi-resolution wavelet domain. In the proposed model, a multi-variable fuzzy-based architecture has been designed to produce a perceptual membership degree both for the candidate embedding sub-regions and for the watermark embedding strength factor. Benchmark images of different sizes, carrying watermarks of different sizes, have been applied to the model. Several experimental attacks, such as JPEG compression, noise and rotation, have been applied to test the robustness of the scheme. In addition, the model has been compared with different watermarking schemes. The proposed model showed robustness to attacks while at the same time achieving a high level of imperceptibility.
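
As an illustration of the kind of perceptual weighting described above (not the authors' implementation; the single-level Haar DWT, the membership breakpoints and the strength range are assumptions), a minimal Python sketch that maps local luminance and texture of wavelet sub-regions to an embedding strength through triangular fuzzy memberships could look like this:

import numpy as np
import pywt

def tri(x, a, b, c):
    # Triangular fuzzy membership with breakpoints a <= b <= c.
    return float(np.clip(min((x - a) / (b - a + 1e-9),
                             (c - x) / (c - b + 1e-9)), 0.0, 1.0))

def embedding_strength(image, block=8):
    # One-level DWT: LL carries luminance, the detail bands carry texture.
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), "haar")
    detail = np.abs(LH) + np.abs(HL) + np.abs(HH)
    rows, cols = LL.shape[0] // block, LL.shape[1] // block
    strength = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            sl = np.s_[i*block:(i+1)*block, j*block:(j+1)*block]
            lum = LL[sl].mean() / 512.0     # LL of a Haar dwt2 is roughly 2x the local mean
            tex = detail[sl].std() / 64.0   # crude local texture measure
            # Illustrative rule: mid luminance and high texture can hide a stronger mark.
            strength[i, j] = 0.02 + 0.08 * min(tri(lum, 0.2, 0.5, 0.8),
                                               tri(tex, 0.1, 0.6, 1.2))
    return strength

img = np.random.randint(0, 256, (128, 128))
print(embedding_strength(img).round(3))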

Keywords: Watermarking, The human visual system (HVS), Fuzzy Inference System (FIS), Local Binary Pattern (LBP), Discrete Wavelet Transform (DWT).

8545 Unpacking Chilean Preservice Teachers’ Beliefs on Practicum Experiences through Digital Stories

Authors: Claudio Díaz, Mabel Ortiz

Abstract:

An EFL teacher education programme in Chile takes five years to train a future teacher of English. Preservice teachers are prepared to attain an advanced level of English and to teach the language from 5th to 12th grade in the Chilean educational system. In the context of their first EFL Methodology course in year four, preservice teachers have to create a five-minute digital story that starts from a critical incident they have experienced as teachers-to-be during their observations or interventions in the schools. A critical incident can be defined as a happening, a specific incident or event, either observed by them or involving them. The happening sparks their thinking and may make them subsequently think differently about the particular event. When they create their digital stories, preservice teachers put technology, teaching practice and theory together to narrate a story that is complemented by still images, moving images, text, sound effects and music. The story should be told as a personal narrative which explains the critical incident. This presentation will focus on the creation process of 50 Chilean preservice teachers' digital stories, highlighting the critical incidents from which they started their stories. It will also unpack preservice teachers' beliefs and reflections when approaching their teaching practices in schools. These beliefs will be coded and categorized through content analysis to evidence preservice teachers' most rooted conceptions about English teaching and learning in Chilean schools. The findings seem to indicate that preservice teachers' beliefs are strongly mediated by contextual and affective factors.

Keywords: Beliefs, Digital stories, Preservice teachers, Practicum.

8544 Modeling of Crude Oil Blending via Discrete-Time Neural Networks

Authors: Xiaoou Li, Wen Yu

Abstract:

Crude oil blending is an important unit operation in the petroleum refining industry. A good model for the blending system is beneficial for supervising the operation, predicting the export petroleum quality and realizing model-based optimal control. Since the blending cannot follow the ideal mixing rule in practice, we propose a static neural network to approximate the blending properties. Using the dead-zone approach, we propose a new robust learning algorithm and give a theoretical analysis. Real crude oil blending data are used to illustrate the neuro-modeling approach.

Keywords: Neural networks, modeling, stability, crude oil.

8543 Some Discrete Propositions in IVSs

Authors: A. Pouhassani

Abstract:

The aim of this paper is to exhibit some properties of the local topologies of an IVS. We also introduce the ISG structure as an interesting structure of semigroups in IVSs.

Keywords: IVS, ISG, Local topology, Lebesgue number, Lindelof theorem

8542 Robust Camera Calibration using Discrete Optimization

Authors: Stephan Rupp, Matthias Elter, Michael Breitung, Walter Zink, Christian Küblbeck

Abstract:

Camera calibration is an indispensable step for augmented reality or image guided applications where quantitative information should be derived from the images. Usually, a camera calibration is obtained by taking images of a special calibration object and extracting the image coordinates of the projected calibration marks, enabling the calculation of the projection from the 3D world coordinates to the 2D image coordinates. Thus, such a procedure exhibits typical steps, including feature point localization in the acquired images, camera model fitting, correction of the distortion introduced by the optics and, finally, an optimization of the model's parameters. In this paper we propose to extend this list by a further step concerning the identification of the optimal subset of images yielding the smallest overall calibration error. For this, we present a Monte Carlo based algorithm along with a deterministic extension that automatically determines the images yielding an optimal calibration. Finally, we present results proving that the calibration can be significantly improved by automated image selection.
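
To make the image-selection step concrete, here is a minimal sketch of the Monte Carlo subset search and a deterministic (greedy swap) extension, assuming the calibration error of a subset can be evaluated by some function; the toy error function below is only a stand-in for re-running the calibration:

import random

def monte_carlo_select(images, error_fn, subset_size, trials=500, seed=0):
    # Randomly sample subsets and keep the one with the smallest calibration error.
    rng = random.Random(seed)
    best_subset, best_err = None, float("inf")
    for _ in range(trials):
        subset = rng.sample(images, subset_size)
        err = error_fn(subset)
        if err < best_err:
            best_subset, best_err = subset, err
    return best_subset, best_err

def greedy_refine(images, subset, error_fn):
    # Deterministic extension: swap unused images in whenever this lowers the error.
    best, best_err = list(subset), error_fn(subset)
    for cand in images:
        if cand in best:
            continue
        for i in range(len(best)):
            trial = best[:i] + [cand] + best[i + 1:]
            err = error_fn(trial)
            if err < best_err:
                best, best_err = trial, err
    return best, best_err

# Toy stand-in: each "image" carries a synthetic residual; the subset error is their mean.
images = list(range(20))
residuals = {i: random.Random(i).uniform(0.2, 2.0) for i in images}
error_fn = lambda s: sum(residuals[i] for i in s) / len(s)
subset, err = monte_carlo_select(images, error_fn, subset_size=8)
print(greedy_refine(images, subset, error_fn))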

Keywords: Camera Calibration, Discrete Optimization, Monte Carlo Method.

8541 Information content of Islamic Private Debt Announcement: Evidence from Malaysia

Authors: Sahar Modirzadehbami, Gholamreza Mansourfar

Abstract:

Different types of Islamic debts have been increasingly utilized as a preferred means of debt funding by Malaysian private firms in recent years. This study examines the impact of Islamic debt announcements on private firms' stock returns. Our sample includes forty-five companies listed on Bursa Malaysia that issued Islamic debts during 2005 to 2008. The abnormal returns and cumulative average abnormal returns are calculated and tested using standard event study methodology. The results show that a significant, negative abnormal return occurs one day before the announcement date. This negative abnormal return represents market participants' adverse attitude toward Islamic private debt announcements during the research period.
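
For reference, the standard event-study quantities that such an analysis relies on can be written as follows (a textbook market-model formulation, not reproduced from the paper):

AR_{i,t} = R_{i,t} - (\hat{\alpha}_i + \hat{\beta}_i R_{m,t}),
AAR_t = \frac{1}{N} \sum_{i=1}^{N} AR_{i,t},
CAAR_{(t_1, t_2)} = \sum_{t=t_1}^{t_2} AAR_t,

where R_{i,t} is the return of issuer i on day t, R_{m,t} is the market return, and the abnormal returns are averaged over the N announcements and cumulated over the event window.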

Keywords: Announcement effect, Event study, Islamic debts, Malaysia, Sukuk

8540 Customer Value Creation by CRM System in Electronic Device Companies

Authors: Hideki Kobayashi, Hiroshi Osada

Abstract:

The service industry accounts for about 70% of Japan's GDP, and the importance of service innovation has been widely pointed out. The importance of system use and support services is increasing in the information system business, which is one of these service industries. However, because the system is not used enough, the purpose for which a CRM system was originally intended often cannot be achieved. To promote the use of the system, an effective service method is needed. It is thought that building a service model and clarifying the success factors are necessary to improve the operation service of the CRM system. In this research, a model of the operation service in the CRM system is made.

Keywords: Information system, Operation service, Service innovation, Solution

8539 Creation and Annihilation of Spacetime Elements

Authors: Dnyanesh P. Mathur, Gregory L. Slater

Abstract:

Gravitation and the expansion of the universe at a large scale are generally regarded as two completely distinct phenomena. Yet, in the General Theory of Relativity (GR), they both manifest as 'curvature' of spacetime. We propose a hypothesis which treats these two 'curvature-producing' phenomena as aspects of an underlying process. This process treats spacetime itself as composed of discrete units (Plancktons) and is 'dynamic' in the sense that these elements of spacetime are continually being both created and annihilated. It is these two complementary processes of Planckton creation and Planckton annihilation which manifest themselves as 'cosmic expansion' on the one hand and as 'gravitational attraction' on the other. The Planckton hypothesis treats spacetime as a perfect fluid in the same manner as the co-moving frame of reference of the Friedmann equations and the Gullstrand-Painleve metric; i.e., the Planckton hypothesis replaces 'curvature' of spacetime by the 'flow' of Plancktons (spacetime). Here we discuss how this perspective may allow a unified description of both cosmological and gravitational acceleration, as well as providing a mechanism for inducing an irreducible action at every point, associated with the creation and annihilation of Plancktons, which could be identified as the zero point energy.
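
For context, the Gullstrand-Painleve line element mentioned above can be written in the 'river' form that makes the flow picture explicit (a standard result of General Relativity with G = c = 1, not taken from the paper):

ds^2 = -dt^2 + \left(dr + \sqrt{2M/r}\, dt\right)^2 + r^2 d\Omega^2
     = -\left(1 - 2M/r\right) dt^2 + 2\sqrt{2M/r}\, dt\, dr + dr^2 + r^2 d\Omega^2,

in which space can be pictured as flowing radially inward at the Newtonian escape velocity \sqrt{2M/r}; this is the kind of flow description the abstract refers to.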

Keywords: Discrete spacetime, spacetime flow, zero point energy, dark energy.

8538 Implementation of the Outputs of Computer Simulation to Support Decision-Making Processes

Authors: Jiří Barta

Abstract:

At the present time, awareness, education, computer simulation and information systems protection are very serious and relevant topics. The article deals with the perspectives and possibilities of implementing emergency or natural hazard threats into the system which is developed for communication among members of crisis management staffs. The Czech Hydro-Meteorological Institute, with its System of Integrated Warning Service, represents the largest usable base of information. National information systems are connected to foreign systems, especially to the flooding emergency systems of neighboring countries, the systems of the European Union and of international organizations of which the Czech Republic is a member. Using the outputs of particular information systems and computer simulations on a single communication interface of the information system for communication among members of crisis management staff, and setting up site interoperability in the network, will lead to time savings in decision-making processes when solving extraordinary events and crisis situations. Faster management of an extraordinary event or a crisis situation will bring positive effects and minimize the impact of negative effects on the environment.

Keywords: Computer simulation, communication, continuity, critical infrastructure, information systems, safety.

8537 An Impairment Sensitive and Reliable SR-ARQ Mechanism for Unreliable Feedback in GPRS

Authors: Mansab Ali, Muhammad Khalid Khan

Abstract:

The advances in wireless communication have opened unlimited horizons, but there are some challenges as well. The natural air medium between the MS (Mobile Station) and the BS (Base Station) is beyond human control and produces channel impairments. The impact of natural conditions on the air medium is the biggest issue in wireless communication. Natural conditions make reliability more cumbersome; here, reliability refers to the efficient recovery of lost or erroneous data. The SR-ARQ (Selective Repeat-Automatic Repeat Request) protocol is a de facto standard for any wireless technology at the air interface with its standard reliability features. Our focus in this research is on the reliability of the control or feedback signal of the SR-ARQ protocol. The proposed mechanism, RSR-ARQ (Reliable SR-ARQ), is an enhancement of the SR-ARQ protocol that ensures the reliability of the control signals through a channel-impairment-sensitive mechanism. We have modeled the system under a two-state discrete-time Markov channel. The simulation results demonstrate better recovery of lost or erroneous data, which will increase the overall system performance.
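
A minimal sketch of the channel model referred to above, a two-state discrete-time Markov (Gilbert-Elliott style) channel, is given below; the transition and loss probabilities are illustrative assumptions, not values from the paper:

import random

def simulate_channel(n_slots, p_gb=0.05, p_bg=0.30,
                     loss_good=0.01, loss_bad=0.50, seed=1):
    # Two-state discrete-time Markov channel: p_gb = P(Good -> Bad),
    # p_bg = P(Bad -> Good); each state has its own frame/feedback loss probability.
    rng = random.Random(seed)
    state, losses = "G", []
    for _ in range(n_slots):
        p_loss = loss_good if state == "G" else loss_bad
        losses.append(rng.random() < p_loss)
        if state == "G" and rng.random() < p_gb:    # state transition for the next slot
            state = "B"
        elif state == "B" and rng.random() < p_bg:
            state = "G"
    return losses

losses = simulate_channel(10000)
print("overall loss rate:", sum(losses) / len(losses))

An ARQ simulation would draw both data-frame and feedback losses from such a channel, which is what makes the reliability of the control signal worth protecting.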

Keywords: ISR-ARQ, MAA, RSR-ARQ, SAA.

8536 Effect of Scalping on the Mechanical Behavior of Coarse Soils

Authors: Nadine Ali Hassan, Ngoc Son Nguyen, Didier Marot, Fateh Bendahmane

Abstract:

This paper presents a study of the effect of scalping methods on the mechanical properties of coarse soils by resorting to numerical simulations based on the discrete element method (DEM) and experimental triaxial tests. Two reconstitution methods are used, designated as the scalping method and the substitution method. Triaxial compression tests are first simulated on a granular material with a gap-graded particle size distribution by using the DEM. We study the effect of these reconstitution methods on the stress-strain behavior of coarse soils with different fine contents and with different ways to control the densities of the scalped and substituted materials. Experimental triaxial tests are performed on original mixtures of sands and gravels with different fine contents and on their corresponding scalped and substituted samples. Numerical results are qualitatively compared to experimental ones. Agreements and discrepancies between these results are also discussed.

Keywords: Coarse soils, scalping, substitution, discrete element method, triaxial test.

8535 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to be able to read out data at the average maximum rate of 1.5 GB/s of the experiment. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating the source of the problem, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on the process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all necessary information important for a deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in a detailed way.
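
The idea of hooking system signals to emit a report, instead of attaching a conventional debugger, can be illustrated with a small sketch; the iFDAQ itself is written in C++/Qt, so the Python below is only an assumed analogy, not the actual tool:

import signal
import sys
import traceback
import datetime

def write_report(signum, frame):
    # Dump a timestamped report with the current stack when a fatal signal
    # arrives, then exit; a real tool would add much more process state.
    name = signal.Signals(signum).name
    with open(f"daq_report_{name}.txt", "w") as f:
        f.write(f"{datetime.datetime.now()}  received {name}\n")
        f.write("".join(traceback.format_stack(frame)))
    sys.exit(1)

# Register the handler for signals that would normally abort the process;
# the process then continues its normal event loop with no performance impact.
for sig in (signal.SIGTERM, signal.SIGABRT, signal.SIGSEGV):
    signal.signal(sig, write_report)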

Keywords: DAQ debugger, data acquisition system, FPGA, system signals, Qt framework.

8534 Attack Detection through Image Adaptive Self Embedding Watermarking

Authors: S. Shefali, S. M. Deshpande, S. G. Tamhankar

Abstract:

Nowadays, a significant number of commercial and governmental organisations like museums, cultural organisations, libraries and commercial enterprises invest intensively in new technologies for image digitisation, digital libraries, image archiving and retrieval. Hence, image authorisation, authentication and security have become a prime need. In this paper, we present a semi-fragile watermarking scheme for color images. The method converts the host image into the YIQ color space, followed by application of the orthogonal dual domains of the DCT and DWT transforms. The DCT helps to separate relevant from irrelevant image content to generate salient image features. The DWT has excellent spatial localisation, which aids spatial tamper characterisation. Thus, an image-adaptive watermark is generated based on image features, which allows the sharp detection of microscopic changes and the localisation of modifications in the image. Further, the scheme utilises a multipurpose watermark consisting of a soft authenticator watermark and a chrominance watermark, which has been proved fragile to predefined processing such as intentional fabrication or forgery of the image, and robust to other incidental attacks caused in the communication channel.

Keywords: Cryptography, Discrete Cosine Transform (DCT), Discrete Wavelet Transform (DWT), Watermarking.

8533 The Evaluation of Event Sport Tourism on Regional Economic Development

Authors: Huei-Wen Lin, Huei-Fu Lu

Abstract:

Event sport tourism (EST) has become an especially important economic sector around the world. As its magnitude continues to grow, attracting more tourists, media and investment for the host community, many local areas/regions and states have identified the expenditures made by visitors as a potential source of economic or employment growth. The main purposes of this study are to investigate stakeholders' insights into the features of hosting EST and its use as a regional development strategy. Continuing the focus of previous literature on the regional development and economic benefits of hosting EST, a total of five semi-structured interview questions are designed, and a thematic analysis is conducted with eight key sport and tourism decision makers in Atlanta during July to August 2016. Through these in-depth interviews, the study will contribute to a better understanding of stakeholders' decision-making, identifying benefits and constraints as well as ways of leveraging the impacts of hosting EST. These findings provide stakeholders' perspectives on hosting EST and its use as a reference for regional development in emerging sport tourism markets in the US. Additionally, this study examines key considerations and issues that affect, and are critical to, a reliable understanding of the economic impacts of hosting EST on regional development, and it will be able to benefit future management authorities (i.e., governments and communities) in their sport tourism development endeavors in defining and hosting successful EST. Furthermore, the insights gained from the qualitative analysis could help other cities/regions analyze the economic impacts of hosting EST and use it as an instrument of city development strategy.

Keywords: Event sport tourism, regional economic development, thematic analysis, stakeholder.

8532 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach

Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar

Abstract:

Uncontrolled growth of abnormal cells in the lung, in the form of a tumor, can be either benign (non-cancerous) or malignant (cancerous). Patients with Lung Cancer (LC) have an average life expectancy of five years; early diagnosis, detection and prediction reduce the need for risky invasive surgery and increase the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI) are common modalities for earlier detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE) for image enhancement gives the best results. The lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region property measurements (area, perimeter, diameter, centroid and eccentricity) are taken for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; feature extraction provides the Region of Interest (ROI) given as input to the classifier. Two levels of classification are employed: K-Nearest Neighbor (KNN) is used to determine the patient condition as normal or abnormal, while an Artificial Neural Network (ANN) is used to identify the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technique shows encouraging results for real-time information and online detection in future research.
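
To make the texture-classification step concrete, a minimal sketch is given below: a hand-rolled GLCM produces contrast, homogeneity and energy features that feed scikit-learn's k-nearest-neighbor classifier. The quantization, offsets, feature set and neighbor count are illustrative assumptions, and the random arrays stand in for segmented lung ROIs:

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def glcm(img, dx=1, dy=0, levels=16):
    # Gray-level co-occurrence matrix for one pixel offset, after quantization.
    q = (img.astype(float) / (img.max() + 1e-9) * (levels - 1)).astype(int)
    P = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[q[y, x], q[y + dy, x + dx]] += 1
    return P / P.sum()

def texture_features(img):
    # Contrast, homogeneity and energy derived from the GLCM (the ROI descriptor).
    P = glcm(img)
    i, j = np.indices(P.shape)
    return [np.sum(P * (i - j) ** 2),
            np.sum(P / (1.0 + np.abs(i - j))),
            np.sum(P ** 2)]

rng = np.random.default_rng(0)
X = [texture_features(rng.integers(0, 256, (32, 32))) for _ in range(20)]
y = [0] * 10 + [1] * 10                      # 0 = normal, 1 = abnormal (toy labels)
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(knn.predict([texture_features(rng.integers(0, 256, (32, 32)))]))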

Keywords: ANN, DWT, GLCM, KNN, ROI, artificial neural networks, discrete wavelet transform, gray-level co-occurrence matrix, k-nearest neighbor, region of interest.

8531 The Impact of Subsequent Stock Market Liberalization on the Integration of Stock Markets in ASEAN-4 + South Korea

Authors: Noor Azryani Auzairy, Rubi Ahmad

Abstract:

To strengthen the capital market, there is a need to integrate the capital markets within the region by removing legal or informal restrictions, specifically through stock market liberalization. Thus, this paper investigates the effects of subsequent stock market liberalization on stock market integration in four ASEAN countries (Malaysia, Indonesia, Thailand, Singapore) and South Korea from 1997 to 2007. The correlation between stock market liberalization and stock market integration is examined by analyzing the stock prices and returns within the region and in comparison with the world MSCI index. The event study method is used with windows of ±12 months and T-7 + T. The results show that subsequent stock market liberalization generally gives minor positive effects on stock returns, except for one or two countries. The subsequent liberalization also integrates the markets in both the short run and the long run.

Keywords: ASEAN, event method, stock market integration, stock market liberalization.

8530 Detection and Correction of Ectopic Beats for HRV Analysis Applying Discrete Wavelet Transforms

Authors: Desmond B. Keenan

Abstract:

The clinical usefulness of heart rate variability is limited by the range of Holter monitoring software available. These software algorithms require a normal sinus rhythm to accurately acquire heart rate variability (HRV) measures in the frequency domain. Premature ventricular contractions (PVCs), more commonly referred to as ectopic beats and frequent in heart failure, hinder this analysis and introduce ambiguity. This investigation demonstrates an algorithm to automatically detect ectopic beats by analyzing discrete wavelet transform coefficients. Two techniques for filtering and replacing the ectopic beats in the RR signal are compared. One technique applies wavelet hard thresholding and the other applies linear interpolation to replace ectopic cycles. The results demonstrate, through simulation and signals acquired from a 24-hour ambulatory recorder, that these techniques can accurately detect PVCs and remove the noise and leakage effects produced by ectopic cycles, retaining smooth spectra with minimal error.
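
A minimal sketch of the detection-and-correction idea on an RR-interval series is shown below; the wavelet, decomposition depth, outlier criterion and the synthetic signal are all assumptions made for illustration, not the paper's settings:

import numpy as np
import pywt

def detect_ectopic(rr, wavelet="db4", level=3, k=4.0):
    # Flag RR intervals whose finest-scale detail coefficients are outliers.
    d1 = pywt.wavedec(rr, wavelet, level=level)[-1]
    thr = k * np.median(np.abs(d1)) / 0.6745
    mask = np.zeros(len(rr), dtype=bool)
    for i, c in enumerate(d1):
        if abs(c) > thr:
            mask[min(2 * i, len(rr) - 1)] = True       # a level-1 coefficient spans
            mask[min(2 * i + 1, len(rr) - 1)] = True   # roughly two RR samples
    return mask

def correct_by_interpolation(rr, mask):
    # Replace flagged intervals by linear interpolation over the normal beats.
    idx = np.arange(len(rr))
    return np.interp(idx, idx[~mask], rr[~mask])

rr = 0.8 + 0.02 * np.random.default_rng(0).standard_normal(256)   # synthetic RR series (s)
rr[60], rr[61] = 0.45, 1.15                 # premature beat plus compensatory pause
mask = detect_ectopic(rr)
clean = correct_by_interpolation(rr, mask)
print(mask.sum(), "intervals corrected")

The alternative mentioned in the abstract, wavelet hard thresholding, would instead zero the offending coefficients before reconstruction rather than interpolating the RR samples.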

Keywords: Heart rate variability, vagal tone, sympathetic, parasympathetic, wavelets, ectopic beats, spectral analysis.

8529 Automatic Flood Prediction Using Rainfall Runoff Model in Moravian-Silesian Region

Authors: B. Sir, M. Podhoranyi, S. Kuchar, T. Kocyan

Abstract:

Rainfall-runoff models play an important role in hydrological predictions. However, the model is only one part of the process of creating a flood prediction. The aim of this paper is to show the process of a successful prediction for the flood event of May 15 to May 18, 2014. The prediction was performed by the rainfall-runoff model HEC-HMS, one of the models computed within the Floreon+ system. The paper briefly evaluates the results of the automatic hydrologic prediction on the Olše river catchment and its gages Český Těšín and Věřňovice.

Keywords: Flood, HEC-HMS, Prediction, Rainfall-Runoff.

8528 Dynamics of Functional Composition of a Brazilian Tropical Forest in Response to Drought Stress

Authors: Theodore N.S. Karfakis, Anna Andrade

Abstract:

The aim of this study was to examine the dynamics of the functional composition of a non-flooded Amazonian forest in response to drought stress, in terms of diameter growth, recruitment and mortality. The survey was carried out in the continuous forest of the Biological Dynamics of Forest Fragments Project, 90 km outside the city of Manaus, state of Amazonas, Brazil. All stems >10 cm dbh were identified to species level and monitored in 18 one-hectare permanent sample plots from 1981 to 2004. For the statistical analysis, all species were aggregated into three ecological guilds. Two distinct drought events occurred, in 1983 and 1997. The results showed that earlier successional species performed better than later successional ones. The response was significant for both events, but for the 1997 event it was more pronounced, possibly because that event fell in the middle of the dry period rather than the wet period, as the 1983 event did.

Keywords: Brazil, functional composition, drought, Amazonian non-flooded forest.

8527 Efficient System for Speech Recognition using General Regression Neural Network

Authors: Abderrahmane Amrouche, Jean Michel Rouvaen

Abstract:

In this paper we present an efficient system for speaker-independent speech recognition based on a neural network approach. The proposed architecture comprises two phases: a preprocessing phase, which consists of segmental normalization and feature extraction, and a classification phase, which uses neural networks based on nonparametric density estimation, namely the general regression neural network (GRNN). The relative performances of the proposed model are compared to similar recognition systems based on the Multilayer Perceptron (MLP), the Recurrent Neural Network (RNN) and the well-known Discrete Hidden Markov Model (HMM-VQ), which we have also implemented. Experimental results obtained with Arabic digits have shown that the use of nonparametric density estimation with an appropriate smoothing factor (spread) improves the generalization power of the neural network. The word error rate (WER) is reduced significantly over the baseline HMM method. GRNN computation is a successful alternative to the other neural networks and the DHMM.
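
A minimal numpy sketch of a general regression neural network (a Nadaraya-Watson style kernel estimator with a single smoothing factor) is given below; the spread value and the toy 2-D data are assumptions, whereas the real system operates on extracted speech features:

import numpy as np

class GRNN:
    # Kernel-weighted average of the training targets with one smoothing factor sigma.
    def __init__(self, sigma=0.5):
        self.sigma = sigma

    def fit(self, X, Y):
        self.X, self.Y = np.asarray(X, float), np.asarray(Y, float)
        return self

    def predict(self, X):
        out = []
        for x in np.asarray(X, float):
            d2 = np.sum((self.X - x) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * self.sigma ** 2))
            out.append(w @ self.Y / (w.sum() + 1e-12))
        return np.array(out)

# Toy use: one-hot targets turn the regression into a classifier via argmax.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2)) + np.repeat([[0, 0], [3, 3]], 20, axis=0)
Y = np.repeat(np.eye(2), 20, axis=0)
model = GRNN(sigma=0.8).fit(X, Y)
print(model.predict([[0.1, -0.2], [2.9, 3.1]]).argmax(axis=1))

The smoothing factor plays the role of the 'spread' mentioned above: too small a value memorizes the training patterns, too large a value blurs the classes together.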

Keywords: Speech Recognition, General Regression Neural Network, Hidden Markov Model, Recurrent Neural Network, Arabic Digits.

8526 Spectral Entropy Employment in Speech Enhancement based on Wavelet Packet

Authors: Talbi Mourad, Salhi Lotfi, Chérif Adnen

Abstract:

In this work, we are interested in developing a speech denoising tool using the discrete wavelet packet transform (DWPT). This speech denoising tool will be employed for applications in recognition, coding and synthesis. For noise reduction, instead of applying the classical thresholding technique, some wavelet packet nodes are set to zero and the others are thresholded. To estimate the non-stationary noise level, we employ the spectral entropy. A comparison of our proposed technique with classical denoising methods based on thresholding and spectral subtraction is made in order to evaluate our approach. The experimental implementation uses speech signals corrupted by two sorts of noise, white and Volvo noise. The results obtained from listening tests show that our proposed technique is better than spectral subtraction. The results obtained from SNR computation show the superiority of our technique when compared to the classical thresholding method using the modified hard thresholding function based on the u-law algorithm.
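
A minimal sketch of the node-selection idea using PyWavelets is given below; the wavelet, depth, entropy threshold and thresholding rule are assumptions chosen for illustration:

import numpy as np
import pywt

def spectral_entropy(x, eps=1e-12):
    # Shannon entropy of the normalized power spectrum of x.
    p = np.abs(np.fft.rfft(x)) ** 2
    p = p / (p.sum() + eps)
    return -np.sum(p * np.log2(p + eps))

def wp_denoise(signal, wavelet="db8", level=4, entropy_thr=5.0):
    # Zero the wavelet packet nodes whose spectral entropy marks them as noise-like,
    # soft-threshold the remaining nodes, then reconstruct.
    wp = pywt.WaveletPacket(signal, wavelet, maxlevel=level)
    for node in wp.get_level(level, order="natural"):
        d = node.data
        if spectral_entropy(d) > entropy_thr:
            node.data = np.zeros_like(d)
        else:
            thr = np.median(np.abs(d)) / 0.6745 * np.sqrt(2 * np.log(len(d)))
            node.data = pywt.threshold(d, thr, mode="soft")
    return wp.reconstruct(update=False)

t = np.linspace(0, 1, 4096)
noisy = np.sin(2 * np.pi * 440 * t) + 0.5 * np.random.default_rng(0).standard_normal(t.size)
print(wp_denoise(noisy).shape)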

Keywords: Enhancement, spectral subtraction, SNR, discrete wavelet packet transform, spectral entropy, histogram

8525 Sensor Network Based Emergency Response and Navigation Support Architecture

Authors: Dilusha Weeraddana, Ashanie Gunathillake, Samiru Gayan

Abstract:

In an emergency, combining Wireless Sensor Network data with the knowledge gathered from various other information sources and navigation algorithms could help safely guide people to a building exit while avoiding the risky areas. This paper presents an emergency response and navigation support architecture for data gathering, knowledge manipulation and navigational support in an emergency situation. In the normal state, the system monitors the environment. When an emergency event is detected, the system sends messages to first responders and immediately separates the risky areas from the safe areas to establish escape paths. The main functionalities of the system include gathering data from a wireless sensor network deployed in a multi-story indoor environment, processing it with information available in a knowledge base, and sharing the decisions made with first responders and people in the building. The proposed architecture will act to reduce the risk of losing human lives by evacuating people much faster and with the least congestion in an emergency environment.

Keywords: Emergency response, Firefighters, Navigation, Wireless sensor network.

8524 Bond Graph and Bayesian Networks for Reliable Diagnosis

Authors: Abdelaziz Zaidi, Belkacem Ould Bouamama, Moncef Tagina

Abstract:

The Bond Graph, as a unified multidisciplinary tool, is widely used not only for dynamic modelling but also for Fault Detection and Isolation, because of its structural and causal properties. A binary Fault Signature Matrix is systematically generated, but making the final binary decision is not always feasible because of the problems revealed by such a method. The purpose of this paper is to introduce a methodology for improving the classical binary method of decision-making, so that unknown and identical failure signatures can be treated to improve robustness. This approach consists of associating the evaluated residuals and the components' reliability data to build a Hybrid Bayesian Network. This network is used in two distinct inference procedures: one for the continuous part and the other for the discrete part. The continuous nodes of the network are the prior probabilities of the component failures, which are used by the inference procedure on the discrete part to compute the posterior probabilities of the failures. The developed methodology is applied to a real steam generator pilot process.

Keywords: Redundancy relations, decision-making, Bond Graph, reliability, Bayesian Networks.

8523 Optimal Path Planning under Priori Information in Stochastic, Time-varying Networks

Authors: Siliang Wang, Minghui Wang, Jun Hu

Abstract:

A novel path planning approach is presented to solve the optimal path problem in stochastic, time-varying networks under a priori traffic information. Most existing studies make use of dynamic programming to find the optimal path. However, those methods are proved to be unable to obtain the global optimal value; moreover, how to design efficient algorithms is another challenge. This paper employs a decision-theoretic framework for defining the optimal path: for a given source S and destination D in an urban transit network, we seek an S-D path of lowest expected travel time where the link travel times are discrete random variables. To address the deficiencies of dynamic programming methods, such as the curse of dimensionality and the violation of the optimality principle, an integer programming model is built to realize the assignment of discrete travel time variables to arcs. Simultaneously, pruning techniques are also applied to reduce the computational complexity of the algorithm. The final experiments show the feasibility of the novel approach.

Keywords: pruning method, stochastic, time-varying networks, optimal path planning.

8522 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

Modern experiments in high energy physics impose great demands on the reliability, efficiency and data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to be able to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments; it provides a network-transparent inter-process communication layer. Using the high-performance and modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes in the iFDAQ during the 2016 run. From the software point of view, it might be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the possibilities of debugging, the online monitoring of communication among processes via the DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in a detailed way. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.

Keywords: Data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP.

8521 A Normalization-based Robust Image Watermarking Scheme Using SVD and DCT

Authors: Say Wei Foo, Qi Dong

Abstract:

Digital watermarking is one of the techniques for copyright protection. In this paper, a normalization-based robust image watermarking scheme which encompasses singular value decomposition (SVD) and discrete cosine transform (DCT) techniques is proposed. In the proposed scheme, the host image is first normalized to a standard form and divided into non-overlapping image blocks. SVD is applied to each block. By concatenating the first singular values (SV) of adjacent blocks of the normalized image, an SV block is obtained. DCT is then carried out on the SV blocks to produce SVD-DCT blocks. A watermark bit is embedded in the high-frequency band of an SVD-DCT block by imposing a particular relationship between two pseudo-randomly selected DCT coefficients. An adaptive frequency mask is used to adjust the local watermark embedding strength. Watermark extraction involves mainly the inverse process. The watermark extraction method is blind and efficient. Experimental results show that the quality degradation of the watermarked image caused by the embedded watermark is visually transparent. Results also show that the proposed scheme is robust against various image processing operations and geometric attacks.
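
A minimal sketch of the core embedding step is given below: collect the first singular value of each block, apply a 2-D DCT to groups of these values, and encode one bit by enforcing an order relation between two selected coefficients. The block size, coefficient positions, grouping and margin are assumptions for illustration; image normalization, the adaptive frequency mask and the pseudo-random coefficient selection are omitted:

import numpy as np
from scipy.fft import dct, idct

def first_singular_values(image, block=8):
    # First singular value of each non-overlapping block, as a 2-D array.
    h, w = (s // block for s in image.shape)
    sv = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            blk = image[i*block:(i+1)*block, j*block:(j+1)*block].astype(float)
            sv[i, j] = np.linalg.svd(blk, compute_uv=False)[0]
    return sv

def embed_bits(sv_map, bits, pos=((2, 3), (3, 2)), margin=2.0):
    # Embed one bit per 8x8 group of singular values via a DCT-domain order constraint.
    sv = sv_map.copy()
    groups = sv.reshape(-1, 8, 8)[:len(bits)]
    for g, bit in zip(groups, bits):
        C = dct(dct(g, axis=0, norm="ortho"), axis=1, norm="ortho")
        a, b = pos
        lo, hi = sorted((C[a], C[b]))
        C[a], C[b] = (hi + margin, lo) if bit else (lo, hi + margin)   # bit 1: C[a] > C[b]
        g[:] = idct(idct(C, axis=1, norm="ortho"), axis=0, norm="ortho")
    return sv

img = np.random.default_rng(0).integers(0, 256, (512, 512))
sv = first_singular_values(img)          # a 64 x 64 map of first singular values
marked = embed_bits(sv, bits=[1, 0, 1, 1])
print(marked.shape)

Extraction would recompute the DCT of the corresponding groups and read each bit from the sign of C[a] - C[b], which is what makes such a detector blind.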

Keywords: Image watermarking, Image normalization, Singular value decomposition, Discrete cosine transform, Robustness.

8520 An Event Based Approach to Extract the Run Time Execution Path of BPEL Process for Monitoring QoS in the Cloud

Authors: Rima Grati, Khouloud Boukadi, Hanene Ben-Abdallah

Abstract:

Due to the dynamic nature of the Cloud, continuous monitoring of QoS requirements is necessary to manage the Cloud computing environment. The process of QoS monitoring and SLA violation detection consists of collecting low- and high-level information pertinent to the service, analyzing the collected information, and taking corrective actions when SLA violations are detected. In this paper, we detail the architecture and the implementation of the first step of this process. More specifically, we propose an event-based approach to obtain run-time information of services developed as BPEL processes. By catching particular events (i.e., the low-level information), our approach recognizes the run-time execution path of a monitored service and uses the BPEL execution patterns to compute the QoS of the composite service (i.e., the high-level information).

Keywords: Monitoring of Web service composition, Cloud environment, Run-time extraction of execution path of BPEL.

8519 State of the Art: A Study on Fall Detection

Authors: Goh Yongli, Ooi Shih Yin, Pang Ying Han

Abstract:

Unintentional falls have been rife throughout the ages and are a common cause of serious or critical injuries, especially among the elderly. Fortunately, owing to the recent rapid advancement in technology, fall detection systems have been made possible, enabling the detection of falling events for the elderly, monitoring of the patient and, consequently, emergency support in the event of a fall. This paper presents a review of three main categories of fall detection techniques, covering the years 2005 to 2010. The paper focuses on discussing these techniques, along with a summary and conclusions.

Keywords: State of the art, fall detection, wearable devices, ambient analyser, motion detection.

8518 Numerical Study of Airfoils Aerodynamic Performance in Heavy Rain Environment

Authors: M. Ismail, Cao Yihua, Zhao Ming, Abu Bakar

Abstract:

Heavy rainfall greatly affects the aerodynamic performance of aircraft, and many aircraft accidents have been caused by the degradation of aerodynamic efficiency in heavy rain. In this paper we study the effects of heavy rain on the aerodynamic efficiency of the cambered NACA 64-210 and the symmetric NACA 0012 airfoils. Our results show a significant increase in drag and decrease in lift. We used the preprocessing software Gridgen for the creation of the geometry and mesh, Fluent as the solver and Tecplot as the postprocessor. Discrete Phase Modeling (DPM) is used to model the rain particles with a two-phase flow approach. The rain particles are assumed to be inert. Both airfoils showed a significant decrease in lift and increase in drag in the simulated rain environment. The most significant difference between the two airfoils was that the NACA 64-210 showed more sensitivity than the NACA 0012 to the liquid water content (LWC). We believe that the results shown in this paper will be useful for designers of commercial aircraft and UAVs, and will be helpful for training pilots to control airplanes in heavy rain.

Keywords: airfoil, discrete phase modeling, heavy rain, Reynolds

8517 Quick Sequential Search Algorithm Used to Decode High-Frequency Matrices

Authors: Mohammed M. Siddeq, Mohammed H. Rasheed, Omar M. Salih, Marcos A. Rodrigues

Abstract:

This research proposes a data encoding and decoding method based on the Matrix Minimization algorithm. This algorithm is applied to the high-frequency coefficients for compression/encoding. The algorithm starts by converting every three coefficients into a single value; this is accomplished using three different keys. The decoding/decompression uses a search method called the QSS (Quick Sequential Search) Decoding Algorithm, presented in this research, which is based on sequential search to recover the exact coefficients. In the next step, the decoded data are saved in an auxiliary array. The basic idea behind the auxiliary array is to save all possible decoded coefficients; this is because another algorithm, such as conventional sequential search, could retrieve the encoded/compressed data independently from the proposed algorithm. The experimental results showed that our proposed decoding algorithm retrieves the original data faster than conventional sequential search algorithms.
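
A minimal sketch of the three-coefficients-to-one-value idea and a brute-force sequential search that recovers them is given below; the key values and the coefficient range are assumptions made for illustration, not the keys used in the paper:

import itertools

K1, K2, K3 = 10000, 100, 1          # illustrative keys, chosen so every triplet maps uniquely
RANGE = range(-20, 21)              # assumed range of quantized high-frequency coefficients

def encode_triplet(a, b, c):
    # Collapse three coefficients into a single value using the three keys.
    return K1 * a + K2 * b + K3 * c

def qss_decode(value):
    # Sequential-search decoding: scan candidate triplets until one reproduces the value.
    for a, b, c in itertools.product(RANGE, repeat=3):
        if encode_triplet(a, b, c) == value:
            return a, b, c
    raise ValueError("no matching triplet found")

coeffs = (3, -7, 12)
v = encode_triplet(*coeffs)
print(v, "->", qss_decode(v))

The auxiliary array described above would simply pre-store the decoded triplet for every value that can occur, so that repeated look-ups avoid re-running the search.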

Keywords: Matrix Minimization Algorithm, Decoding Sequential Search Algorithm, image compression, Discrete Cosine Transform, Discrete Wavelet Transform.
