Search results for: reconstruction entropy.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 382

262 Medical Image Registration by Minimizing Divergence Measure Based on Tsallis Entropy

Authors: Shaoyan Sun, Liwei Zhang, Chonghui Guo

Abstract:

As the use of registration packages spreads, the number of aligned image pairs in image databases (aligned either manually or automatically) increases dramatically. These image pairs can serve as a set of training data, and, correspondingly, the images that are to be registered serve as testing data. In this paper, a novel medical image registration method is proposed, based on a priori knowledge of the expected joint intensity distribution estimated from pre-aligned training images. The goal of the registration is to find the optimal transformation such that the distance between the observed joint intensity distribution obtained from the testing image pair and the expected joint intensity distribution obtained from the corresponding training image pair is minimized. The distance is measured using a divergence measure based on Tsallis entropy. Experimental results show that, compared with the widely used Shannon mutual information as well as Tsallis mutual information, the proposed method is computationally more efficient without sacrificing registration accuracy.

Keywords: Multimodality images, image registration, Shannon entropy, Tsallis entropy, mutual information, Powell optimization.
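
As a rough illustration of the distance involved, the sketch below computes a Tsallis relative entropy between two joint intensity histograms in Python. The abstract does not give the exact divergence, entropic index, or optimizer settings used in the paper, so the function names, q = 1.5, and the 64-bin histogram here are assumptions for a minimal example only.

import numpy as np

def joint_histogram(img_a, img_b, bins=64):
    # 2-D joint intensity histogram of two aligned images, normalised to a distribution.
    h, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    return h / h.sum()

def tsallis_divergence(p, r, q=1.5, eps=1e-12):
    # Tsallis relative entropy D_q(p || r); reduces to the Kullback-Leibler
    # divergence as q -> 1. p is the observed joint distribution, r the expected one.
    p = np.asarray(p, dtype=float).ravel() + eps
    r = np.asarray(r, dtype=float).ravel() + eps
    p, r = p / p.sum(), r / r.sum()
    return (np.sum(p**q * r**(1.0 - q)) - 1.0) / (q - 1.0)

# Registration would then search (e.g. with Powell's method, as in the keywords) for the
# transform that minimises tsallis_divergence(observed_joint, expected_joint).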

261 1/Sigma Term Weighting Scheme for Sentiment Analysis

Authors: Hanan Alshaher, Jinsheng Xu

Abstract:

Large amounts of data on the web can provide valuable information. For example, product reviews help business owners measure customer satisfaction. Sentiment analysis classifies texts into two polarities: positive and negative. This paper examines movie reviews and tweets using a new term weighting scheme, called one-over-sigma (1/sigma), on benchmark datasets for sentiment classification. The proposed method aims to improve the performance of sentiment classification. The results show that 1/sigma is more accurate than popular term weighting schemes. To verify whether entropy reflects the discriminating power of terms, we also report a comparison of entropy values for the different term weighting schemes.

Keywords: Sentiment analysis, term weighting scheme, 1/sigma.

260 Phenomenological and Theoretical Analysis of Relativistic Temperature Transformation and Relativistic Entropy

Authors: Marko Popovic

Abstract:

There are three possible ways in which the Special Theory of Relativity (STR) can affect a thermodynamic system. Planck and Einstein treated the process as isobaric, whereas Ott regarded it as adiabatic. However, several logical arguments indicate that the process is isothermal. Our phenomenological consideration demonstrates that temperature is invariant under the Lorentz transformation. In that case the process is isothermal, so volume and pressure are Lorentz covariant. If the process is isothermal, Boyle's law is Lorentz invariant. Likewise, the equilibrium constant, Gibbs energy, activation energy, enthalpy, entropy, and the extent of reaction become Lorentz invariant.

Keywords: STR, relativistic temperature transformation, Boyle's law, equilibrium constant, Gibbs energy.
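
For reference, the three temperature transformations at issue can be written compactly in LaTeX as follows, with \gamma = 1/\sqrt{1 - v^2/c^2}; the abstract argues for the third, invariant case:

T'_{\mathrm{Planck\text{-}Einstein}} = \frac{T}{\gamma} = T\sqrt{1 - \frac{v^2}{c^2}}, \qquad
T'_{\mathrm{Ott}} = \gamma T, \qquad
T'_{\mathrm{isothermal}} = T .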

259 Conjugate Mixed Convection Heat Transfer and Entropy Generation of Cu-Water Nanofluid in an Enclosure with Thick Wavy Bottom Wall

Authors: Sanjib Kr Pal, S. Bhattacharyya

Abstract:

Mixed convection of Cu-water nanofluid in an enclosure with a thick wavy bottom wall has been investigated numerically. A co-ordinate transformation method is used to transform the computational domain into an orthogonal co-ordinate system. The governing equations in the computational domain are solved through a pressure-correction based iterative algorithm. The fluid flow and heat transfer characteristics are analyzed for a wide range of Richardson number (0.1 ≤ Ri ≤ 5), nanoparticle volume concentration (0.0 ≤ ϕ ≤ 0.2), amplitude (0.0 ≤ α ≤ 0.1) of the wavy thick bottom wall, and wave number (ω) at a fixed Reynolds number. The results show that the heat transfer rate increases markedly when nanoparticles are added. The heat transfer rate depends on the wavy-wall amplitude and wave number, and it decreases with increasing Richardson number for a fixed amplitude and wave number. The Bejan number and the entropy generation are determined to analyze the thermodynamic optimization of the mixed convection.

Keywords: Entropy generation, mixed convection, conjugate heat transfer, numerical, nanofluid, wall waviness.
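
For readers unfamiliar with the Bejan number used above, it is conventionally defined as the ratio of heat-transfer entropy generation to total entropy generation,

\mathrm{Be} = \frac{S_{\mathrm{gen,heat}}}{S_{\mathrm{gen,heat}} + S_{\mathrm{gen,friction}}},

so Be > 0.5 indicates that thermal irreversibility dominates over fluid-friction irreversibility.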

258 Iterative Image Reconstruction for Sparse-View Computed Tomography via Total Variation Regularization and Dictionary Learning

Authors: XianYu Zhao, JinXu Guo

Abstract:

Recently, low-dose computed tomography (CT) has become highly desirable due to increasing attention to the potential risks of excessive radiation. For low-dose CT imaging, ensuring image quality while reducing radiation dose is a major challenge. To facilitate low-dose CT imaging, we propose an improved statistical iterative reconstruction scheme based on the Penalized Weighted Least Squares (PWLS) criterion combined with total variation (TV) minimization and sparse dictionary learning (DL) to improve reconstruction performance. We call this method "PWLS-TV-DL". To evaluate the PWLS-TV-DL method, we performed experiments on digital phantoms and physical phantoms, respectively. The experimental results show that our method is superior to other methods in both image quality and computational efficiency, which confirms its potential for low-dose CT imaging.

Keywords: Low dose computed tomography, penalized weighted least squares, total variation, dictionary learning.
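
The abstract does not spell out the objective, but a generic PWLS cost with TV and dictionary-learning penalties, of the kind the acronym PWLS-TV-DL suggests, has the form (the exact weighting and constraints in the paper may differ):

\hat{\mu} = \arg\min_{\mu}\; (y - A\mu)^{\mathsf T}\,\Sigma^{-1}\,(y - A\mu)
\;+\; \beta_{\mathrm{TV}}\,\|\mu\|_{\mathrm{TV}}
\;+\; \beta_{\mathrm{DL}} \sum_{j} \|R_j \mu - D\alpha_j\|_2^2,

where y is the measured sinogram, A the system matrix, \Sigma a diagonal matrix of statistical weights, R_j extracts the j-th image patch, D is the learned dictionary, and \alpha_j are the sparse coefficients.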

257 Highly Accurate Target Motion Compensation Using Entropy Function Minimization

Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani

Abstract:

One of the defects of stepped-frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range cell shift, false peaks, Signal to Noise Ratio (SNR) reduction, and range profile spreading because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Thus, compensation of Target Motion Parameter (TMP) effects should be employed. In this paper, such a method for estimating the TMPs (velocity and acceleration), and consequently eliminating or suppressing their unwanted effects on the HRRP, based on entropy minimization is proposed. The method is carried out in two major steps: in the first step, a discrete search over the whole acceleration-velocity lattice, within a specified interval, seeks a coarse minimum of the entropy function. In the second step, a 1-D search over velocity is performed around that minimum along several constant-acceleration lines, in order to refine the accuracy of the minimum found in the first step. The provided simulation results demonstrate the effectiveness of the proposed method.

Keywords: ATR, HRRP, motion compensation, SFW, TMP.
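
A minimal sketch of the two-step search in Python is given below. The phase-compensation step itself depends on the stepped-frequency waveform parameters, so the compensate function is only a hypothetical placeholder, and the grid spacings and refinement factor are assumptions rather than values from the paper.

import numpy as np

def profile_entropy(profile, eps=1e-12):
    # Shannon entropy of the normalised power of a range profile; a well-focused
    # HRRP concentrates energy in few cells and therefore has low entropy.
    p = np.abs(profile)**2
    p = p / (p.sum() + eps)
    return -np.sum(p * np.log(p + eps))

def estimate_tmp(raw, compensate, velocities, accelerations, refine=10):
    # Step 1: coarse search over the whole velocity-acceleration lattice.
    _, v0, a0 = min((profile_entropy(compensate(raw, v, a)), v, a)
                    for v in velocities for a in accelerations)
    # Step 2: finer 1-D velocity search along a few constant-acceleration lines
    # around the coarse minimum.
    dv = (velocities[1] - velocities[0]) / refine
    da = accelerations[1] - accelerations[0]
    best = min((profile_entropy(compensate(raw, v0 + k * dv, a)), v0 + k * dv, a)
               for a in (a0 - da, a0, a0 + da)
               for k in range(-refine, refine + 1))
    return best[1], best[2]   # estimated (velocity, acceleration)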

256 Recent Trends in Nonlinear Methods of HRV Analysis: A Review

Authors: Ramesh K. Sunkaria

Abstract:

The linear methods of heart rate variability (HRV) analysis, such as non-parametric methods (e.g. fast Fourier transform analysis) and parametric methods (e.g. autoregressive modeling), have become established non-invasive tools for marking cardiac health, but their sensitivity and specificity were found to be lower than expected, with a positive predictive value <30%. This may be because the RR-interval series is treated as stationary and re-sampled prior to analysis, whereas in fact it is not. This paper reviews the non-linear methods of HRV analysis, such as correlation dimension, largest Lyapunov exponent, power law slope, fractal analysis, detrended fluctuation analysis, and complexity measures, which are currently becoming popular as they use the actual RR-interval series. These methods are expected to provide highly accurate cardiac health prognosis.

Keywords: chaos, nonlinear dynamics, sample entropy, approximate entropy, detrended fluctuation analysis.
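
Of the nonlinear measures named in the keywords, sample entropy is easy to sketch; the brute-force implementation below follows the usual definition SampEn = -ln(A/B). The parameter choices m = 2 and r = 0.2 times the standard deviation are common conventions, not values taken from this review.

import numpy as np

def sample_entropy(rr, m=2, r=0.2):
    # SampEn(m, r): negative log of the conditional probability that RR templates
    # matching for m beats (within r * std, Chebyshev distance) also match for m + 1.
    x = np.asarray(rr, dtype=float)
    tol = r * np.std(x)

    def match_count(length):
        t = np.array([x[i:i + length] for i in range(len(x) - length)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return np.sum(d <= tol) - len(t)          # exclude self-matches

    B = match_count(m)
    A = match_count(m + 1)
    return -np.log(A / B)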

255 Data-driven Multiscale Tsallis Complexity: Application to EEG Analysis

Authors: Young-Seok Choi

Abstract:

This work proposes data-driven multiscale quantitative measures to reveal the underlying complexity of the electroencephalogram (EEG), applied to a rodent model of hypoxic-ischemic brain injury and recovery. Because real EEG recordings are nonlinear and non-stationary over different frequencies or scales, an approach more suitable than the conventional single-scale tools is needed for analyzing EEG data. Here, we present a new framework of complexity measures that considers changing dynamics over multiple oscillatory scales. The proposed multiscale complexity is obtained by calculating entropies of the probability distributions of the intrinsic mode functions extracted by the empirical mode decomposition (EMD) of the EEG. To quantify EEG recordings of a rat model of hypoxic-ischemic brain injury following cardiac arrest, the multiscale version of Tsallis entropy is examined. To validate the proposed complexity measure, actual EEG recordings from rats (n=9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Experimental results demonstrate that the use of the multiscale Tsallis entropy leads to better discrimination of the injury levels and improved correlations with the neurological deficit evaluation 72 hours after cardiac arrest, suggesting an effective metric as a prognostic tool.

Keywords: Electroencephalogram (EEG), multiscale complexity, empirical mode decomposition, Tsallis entropy.
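
The measure can be sketched as follows: decompose the EEG segment into intrinsic mode functions with an EMD implementation of choice, then compute the Tsallis entropy of each IMF's amplitude distribution. The entropic index q, the bin count, and the external EMD step below are assumptions, not values from the paper.

import numpy as np

def tsallis_entropy(signal, q=2.0, bins=64, eps=1e-12):
    # Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1) of the amplitude histogram.
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / (counts.sum() + eps)
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def multiscale_tsallis(imfs, q=2.0):
    # One complexity value per intrinsic mode function (i.e. per oscillatory scale);
    # `imfs` is the list of IMFs obtained from an EMD of the EEG segment.
    return [tsallis_entropy(imf, q=q) for imf in imfs]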

254 The Impact of Post-Disaster Relocation on Community Solidarity: The Case of Post-Disaster Reconstruction after Typhoon Morakot in Taiwan

Authors: Tsung-Hsi Fu, Wan-I Lin, Jyh-Cherng Shieh

Abstract:

Typhoon Morakot hit Taiwan in 2009 and caused severe damage. The government employed a compulsory relocation strategy for post-disaster reconstruction. This study analyzes the impact of this strategy on community solidarity. It employs multiple data collection methods, including semi-structured interviews, secondary data, and documentation. The results indicate that the government's strategy for distributing housing has led to conflicts within the communities. In addition, the relocation process has stimulated tensions between victims of the disaster and those residents whose lands were chosen as new sites for relocation. The government's strategy of "collective relocation" also worsened community integration. Furthermore, the fact that a permanent housing community may accommodate people from different places poses challenges for the development of new interpersonal relations in the communities. This study concludes by emphasizing the importance of bringing social, economic, and cultural aspects into consideration in post-disaster relocation.

Keywords: community solidarity, permanent housing, post-disaster reconstruction, relocation.

253 Video Coding Algorithm for Video Sequences with Abrupt Luminance Change

Authors: Sang Hyun Kim

Abstract:

In this paper, a fast motion compensation algorithm is proposed that improves coding efficiency for video sequences with brightness variations. We also propose a cross entropy measure between histograms of two frames to detect brightness variations. The framewise brightness variation parameters, a multiplier and an offset field for image intensity, are estimated and compensated. Simulation results show that the proposed method yields a higher peak signal to noise ratio (PSNR) compared with the conventional method, with a greatly reduced computational load, when the video scene contains illumination changes.

Keywords: Motion estimation, Fast motion compensation, Brightness variation compensation, Brightness change detection, Cross entropy.
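
A minimal version of the brightness-change detector described above can be sketched as the cross entropy between the intensity histograms of consecutive frames; the bin count and any decision threshold are assumptions here, and the estimation of the multiplier and offset field is omitted.

import numpy as np

def cross_entropy(frame_a, frame_b, bins=256, eps=1e-12):
    # Cross entropy H(p, q) = -sum p log q between the intensity histograms of two
    # frames; an unusually large value signals an abrupt luminance change.
    p, _ = np.histogram(frame_a, bins=bins, range=(0, 256))
    q, _ = np.histogram(frame_b, bins=bins, range=(0, 256))
    p = p / (p.sum() + eps)
    q = q / (q.sum() + eps)
    return -np.sum(p * np.log(q + eps))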

252 Performance Analysis of Reconstruction Algorithms in Diffuse Optical Tomography

Authors: K. Uma Maheswari, S. Sathiyamoorthy, G. Lakshmi

Abstract:

Diffuse Optical Tomography (DOT) is a non-invasive imaging modality used in clinical diagnosis for early detection of carcinoma cells in brain tissue. It is a form of optical tomography that produces a reconstructed image of human soft tissue using near-infrared light. It comprises two steps, called the forward model and the inverse model. The forward model describes light propagation in a biological medium. The inverse model uses the scattered light to recover the optical parameters of human tissue. DOT suffers from severe ill-posedness due to its incomplete measurement data, so accurate analysis of this modality is very complicated. To overcome this problem, optical properties of the soft tissue such as the absorption coefficient, scattering coefficient, and optical flux are processed by the standard regularization technique of Levenberg-Marquardt regularization. The reconstruction algorithms Split Bregman and Gradient Projection for Sparse Reconstruction (GPSR) are used to reconstruct the image of human soft tissue for tumour detection. Among these algorithms, the Split Bregman method provides better performance than the GPSR algorithm. The parameters signal to noise ratio (SNR), contrast to noise ratio (CNR), relative error (RE), and CPU time for reconstructing images are analyzed to assess performance.

Keywords: Diffuse optical tomography, ill-posedness, Levenberg-Marquardt method, Split Bregman, gradient projection for sparse reconstruction.
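
The Levenberg-Marquardt regularization mentioned above iteratively updates the optical-parameter estimate \mu via the standard damped Gauss-Newton step

\mu_{k+1} = \mu_k + \left(J_k^{\mathsf T} J_k + \lambda_k I\right)^{-1} J_k^{\mathsf T}\,\bigl(y - F(\mu_k)\bigr),

where F is the forward model, J_k its Jacobian at \mu_k, y the boundary measurements, and \lambda_k the damping (regularization) parameter that tempers the ill-posedness.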

251 Super Resolution Blind Reconstruction of Low Resolution Images using Wavelets based Fusion

Authors: Liyakathunisa, V. K. Ananthashayana

Abstract:

Crucial information barely visible to the human eye is often embedded in a series of low resolution images taken of the same scene. Super resolution reconstruction is the process of combining several low resolution images into a single higher resolution image. The ideal algorithm should be fast and should add sharpness and details, both at edges and in regions, without adding artifacts. In this paper we propose a super resolution blind reconstruction technique for linearly degraded images. In our proposed technique the algorithm is divided into three parts: image registration, wavelet-based fusion, and image restoration. Three low resolution images are considered, which may be sub-pixel shifted, rotated, blurred, or noisy. The sub-pixel shifted images are registered using an affine transformation model, a wavelet-based fusion is performed, and the noise is removed using soft thresholding. Our proposed technique reduces blocking artifacts, smooths the edges, and is able to restore high frequency details in an image. The technique is efficient and computationally fast, with a clear prospect of real-time implementation.

Keywords: Affine Transforms, Denoising, DWT, Fusion, Image registration.

250 Mathematical Reconstruction of an Object Image Using X-Ray Interferometric Fourier Holography Method

Authors: M. K. Balyan

Abstract:

The main principles of the X-ray interferometric Fourier holography method are discussed. The object image is reconstructed by the mathematical method of Fourier transformation. Three methods are presented: the approximation method, the iteration method, and the step-by-step method. As an example, reconstruction of the complex amplitude transmission coefficient of a beryllium wire is considered. The results reconstructed by the three presented methods are compared. The best results are obtained by means of the step-by-step method.

Keywords: Dynamical diffraction, hologram, object image, X-ray holography.

249 MRI Reconstruction Using Discrete Fourier Transform: A tutorial

Authors: Abiodun M. Aibinu, Momoh J. E. Salami, Amir A. Shafie, Athaur Rahman Najeeb

Abstract:

The use of the Inverse Discrete Fourier Transform (IDFT), implemented in the form of the Inverse Fast Fourier Transform (IFFT), is one of the standard methods of reconstructing Magnetic Resonance Imaging (MRI) from uniformly sampled k-space data. In this tutorial, three of the major problems associated with the use of the IFFT in MRI reconstruction are highlighted. The tutorial also gives a brief introduction to MRI physics, the MRI system from an instrumentation point of view, the k-space signal, and the process of IDFT and IFFT for one- and two-dimensional (1D and 2D) data.

Keywords: Discrete Fourier Transform (DFT), K-space Data, Magnetic Resonance (MR), Spin, Windows.
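
For uniformly sampled Cartesian k-space, the reconstruction described here reduces to a few NumPy calls; the sketch below assumes a 2-D, single-coil k-space array and returns the magnitude image.

import numpy as np

def ifft_reconstruction(kspace):
    # Shift DC to the array corner, apply the 2-D inverse FFT, shift back to the
    # centre, and take the magnitude to form the MR image.
    image = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(kspace)))
    return np.abs(image)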

248 Exact Image Super-Resolution for Pure Translational Motion and Shift-Invariant Blur

Authors: Fatih Kara, Cabir Vural

Abstract:

In this work, a special case of the image super-resolution problem, where the only type of motion is global translational motion and the blurs are shift-invariant, is investigated. The necessary conditions for exact reconstruction of the original image using finite impulse-response reconstruction filters are developed. Given that the conditions are satisfied, a method for exact super-resolution is presented and some simulation results are shown.

Keywords: Image processing, image super-resolution, finite impulse-response filters, existence-uniqueness conditions.

247 Differentiation of Heart Rate Time Series from Electroencephalogram and Noise

Authors: V. I. Thajudin Ahamed, P. Dhanasekaran, Paul Joseph K.

Abstract:

Analysis of heart rate variability (HRV) has become a popular non-invasive tool for assessing the activities of the autonomic nervous system. Most of the methods were borrowed from techniques used for time series analysis. Currently used methods are time domain, frequency domain, geometrical, and fractal methods. A new technique, which searches for pattern repeatability in a time series, is proposed for quantifying heart rate (HR) time series. This set of indices, termed the pattern repeatability measure and the pattern repeatability ratio, is able to distinguish HR data clearly from noise and electroencephalogram (EEG) signals. The results of analysis using these measures give an insight into the fundamental difference between the composition of HR time series and that of EEG and noise.

Keywords: Approximate entropy, heart rate variability, noise, pattern repeatability, and sample entropy.

246 5iD Viewer - Observation of Fish School Behaviour in Labyrinths and Use of Semantic and Syntactic Entropy for School Structure Definition

Authors: Dalibor Štys, Dalibor Štys Jr., Jana Pečenková, Kryštof M. Štys, Maryia Chkalova, Petr Kouba, Aliaksandr Pautsina, Denis Durniev, Tomáš Náhlík, Petr Císař

Abstract:

This article reports the construction and some properties of the 5iD viewer, a system that simultaneously records five views of a given experimental object. Properties of the system are demonstrated on the analysis of fish schooling behaviour. A method of instrument calibration that allows for the inclusion of image distortion is demonstrated, and a method of distance assessment for the case in which only two opposite cameras are available is proposed and partly tested. Finally, we demonstrate how the state trajectory of the behaviour of the fish school may be constructed from the entropy of the system.

Keywords: 3D positioning, school behavior, distance calibration, space vision, space distortion.

245 An Optimized Design of Non-uniform Filterbank

Authors: Ram Kumar Soni, Alok Jain, Rajiv Saxena

Abstract:

The tree-structured approach to the non-uniform filterbank (NUFB) is normally used for perfect reconstruction (PR). PR is not always feasible due to certain limitations, i.e., constraints in selecting design parameters, design complexity, and the fact that the output is sometimes severely affected by aliasing error if the necessary and sufficient conditions for PR are not satisfied exactly. Therefore, there has been general interest among researchers in near perfect reconstruction (NPR). In this work, an optimized tree-structure technique is used for the design of an NPR non-uniform filterbank. Window functions of the Blackman family are used to design the prototype FIR filter. A single-variable linear optimization is used to minimize the amplitude distortion. The main feature of the proposed design is its simplicity combined with its linear-phase property.

Keywords: Tree structure, NUFB, QMF, NPR.

244 Research of Linear Camera Calibration Based on Planar Pattern

Authors: Jin Sun, Hongbin Gu

Abstract:

An important step in three-dimensional reconstruction and computer vision is camera calibration, whose objective is to estimate the intrinsic and extrinsic parameters of each camera. In this paper, two linear methods based on different planes are given. In both methods, a general plane is used to replace the calibration object with very good precision. In the first method, after the camera undergoes five translational movements and pictures of the orthogonal planes are taken, a set of linear constraints on the camera intrinsic parameters is derived by means of the homography matrix. The second method obtains all camera parameters by taking only one picture of a circle of given radius. Experiments on simulated data and real images indicate that our method is reasonable and is a good supplement to camera calibration.

Keywords: camera calibration, 3D reconstruction, computer vision.

243 Absent Theaters: A Virtual Reconstruction from Memories

Authors: P. Castillo Muñoz, A. Lara Ramírez

Abstract:

Absent Theaters is a project that virtually reconstructs three theaters that existed in the twentieth century and were demolished in the city of Medellin, Colombia: Circo España, Bolívar, and Junín. Virtual reconstruction is used as a starting point for conversations with those who, in their childhood and youth, experienced the cultural spaces that formed a whole generation. Around 100 people who witnessed these theaters were interviewed. The oral history work was carried out by presenting virtual reconstructions of the theaters' interiors to the interviewees through virtual reality glasses. The voices of people between 60 and 103 years old were used to transmit knowledge to new generations about the importance of theaters as essential places for the city, as spaces that generate social relations and knowledge of other cultures. Oral accounts of events and of the historical and social context of the city were mixed with archive images and animations of the architectural transformations of these places, with the purpose of compiling a collective discourse around cultural activities, heritage, and the memory of Medellin.

Keywords: Culture, heritage, oral history, theaters, virtual reality.

242 Entropy based Expeditive Methodology for Rating Curves Assessment

Authors: D. Mirauda, M. Greco, P. Moscarelli

Abstract:

River flow forecasting is crucial for improving management policies aimed at the proper use of water resources, as well as for combining prevention and defense actions against environmental degradation. The difficulties encountered during field activities encourage the development and implementation of operational computation and measurement methods that reduce the time needed for data acquisition and processing while maintaining a good level of accuracy. Therefore, the aim of the present work is to test a new entropy-based expeditive methodology for the evaluation of rating curves on three gauged sections with different geometric and morphological characteristics. The methodology requires the choice of only three verticals along the measurement section and the sampling of only the maximum velocity. The results underline that in most conditions the rating curves obtained can replace those built with classical methodologies, thus simplifying the procedures of data monitoring and calculation.

Keywords: gauged station, entropic approach, expeditive methodology, rating curves.
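
Entropy-based expeditive methods of this kind generally rest on Chiu's entropy velocity relation, which links the cross-sectional mean velocity to the sampled maximum velocity through an entropy parameter M (the exact formulation adopted in the paper may differ):

\frac{U_{\mathrm{mean}}}{u_{\max}} = \Phi(M) = \frac{e^{M}}{e^{M}-1} - \frac{1}{M},
\qquad Q \simeq \Phi(M)\, u_{\max}\, A,

where A is the flow area, so sampling only u_max on a few verticals suffices to estimate the discharge and hence the rating curve.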

241 Digital Watermarking Based on Visual Cryptography and Histogram

Authors: R. Rama Kishore, Sunesh

Abstract:

Nowadays, robust and secure watermarking algorithms and their optimization have become the need of the hour. A watermarking algorithm is presented to achieve copyright protection for the owner based on visual cryptography, the histogram shape property, and entropy. In this scheme, both the host image and the watermark are preprocessed: the host image with a Butterworth filter, and the watermark with visual cryptography. Applying visual cryptography to the watermark generates two shares. One share is used for embedding the watermark, and the other is used for resolving any dispute with the aid of a trusted authority. Use of the histogram shape makes the process more robust against geometric and signal processing attacks. The combination of visual cryptography, the Butterworth filter, histogram, and entropy makes the algorithm more robust and imperceptible while protecting the owner's copyright.

Keywords: Butterworth filter, digital watermarking, histogram, visual cryptography.

240 An Earth Mover’s Distance Algorithm Based DDoS Detection Mechanism in SDN

Authors: Yang Zhou, Kangfeng Zheng, Wei Ni, Ren Ping Liu

Abstract:

Software-defined networking (SDN) provides a solution for a scalable network framework with decoupled control and data planes. However, this architecture also invites a particular distributed denial-of-service (DDoS) attack that can affect or even overwhelm the SDN network. The DDoS attack detection problem has to date mostly been researched as an entropy comparison problem; however, such approaches do not fully exploit SDN, and their results are not accurate. In this paper, we propose a DDoS attack detection method that interprets DDoS detection as a signature matching problem and is formulated as an Earth Mover's Distance (EMD) model. Considering feasibility and accuracy, we further propose to define the cost function of the EMD as a generalized Kullback-Leibler divergence. Simulation results show that our proposed method can detect DDoS attacks by comparing EMD values with those computed in the attack-free case. Moreover, our method can significantly increase the true positive rate of detection.

Keywords: DDoS detection, EMD, relative entropy, SDN.
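
The paper's EMD uses a generalized Kullback-Leibler cost; purely as an illustrative stand-in, the sketch below compares a tested traffic feature distribution against an attack-free baseline with the stock 1-D Earth Mover's Distance and the KL divergence from SciPy. The feature choice, binning, and any alarm threshold are assumptions, not details from the paper.

import numpy as np
from scipy.stats import wasserstein_distance
from scipy.special import rel_entr

def traffic_distance(baseline_counts, observed_counts, eps=1e-12):
    # Normalise per-flow feature counts (e.g. packets per destination) into
    # distributions, then report both EMD and KL divergence against the baseline.
    p = np.asarray(baseline_counts, float) + eps
    q = np.asarray(observed_counts, float) + eps
    p, q = p / p.sum(), q / q.sum()
    bins = np.arange(len(p))
    emd = wasserstein_distance(bins, bins, p, q)
    kl = rel_entr(q, p).sum()
    return emd, kl   # values far above attack-free runs suggest a DDoS attack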

239 Risk Assessment of Building Information Modelling Adoption in Construction Projects

Authors: Amirhossein Karamoozian, Desheng Wu, Behzad Abbasnejad

Abstract:

Building information modelling (BIM) is a new technology to enhance the efficiency of project management in the construction industry. In addition to the potential benefits of this useful technology, there are various risks and obstacles to applying it in construction projects. In this study, a decision-making approach is presented for risk assessment of BIM adoption in construction projects. Various risk factors of applying BIM during different phases of the project lifecycle are identified with the help of the Delphi method, experts' opinions, and the related literature. Afterward, Shannon's entropy and Fuzzy TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) are applied to derive priorities for the identified risk factors. Results indicate that lack of knowledge among professional engineers about workflows in BIM and conflicts of opinion between different stakeholders are the risk factors with the highest priority.

Keywords: Risk, BIM, Shannon’s entropy, Fuzzy TOPSIS, construction projects.
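
The Shannon entropy weighting step can be sketched as follows: given a decision matrix whose rows are the identified risk factors and whose columns are evaluation criteria, criteria whose scores vary more across the risk factors have lower entropy and receive larger weights before being fed to Fuzzy TOPSIS. The matrix layout and normalisation below are the usual conventions of the entropy weight method, not details taken from the paper.

import numpy as np

def shannon_entropy_weights(decision_matrix):
    # Columns = criteria, rows = alternatives (risk factors). Returns one weight
    # per criterion via the standard entropy weight method.
    X = np.asarray(decision_matrix, dtype=float)
    P = X / X.sum(axis=0, keepdims=True)                 # column-wise normalisation
    k = 1.0 / np.log(X.shape[0])
    E = -k * np.sum(P * np.log(P + 1e-12), axis=0)       # entropy per criterion
    d = 1.0 - E                                          # degree of diversification
    return d / d.sum()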

238 Subjective Assessment about Super Resolution Image Resolution

Authors: Seiichi Gohshi, Hiroyuki Sekiguchi, Yoshiyasu Shimizu, Takeshi Ikenaga

Abstract:

Super resolution (SR) technologies are now being applied to video to improve resolution, and some TV sets are already equipped with SR functions. However, it is not known whether super resolution image reconstruction (SRR) for TV really works. Super resolution with non-linear signal processing (SRNL) has recently been proposed. SRR and SRNL are the only methods for processing video signals in real time. The results from subjective assessments of SRR and SRNL are described in this paper. SRR video was produced in simulations with quarter-precision motion vectors and 100 iterations, which are ideal conditions for SRR. We found that the image quality of SRNL is better than that of SRR even though SRR was processed under these ideal conditions.

Keywords: Super Resolution Image Reconstruction, Super Resolution with Non-Linear Signal Processing, Subjective Assessment, Image Quality

237 Synthesis of Wavelet Filters using Wavelet Neural Networks

Authors: Wajdi Bellil, Chokri Ben Amar, Adel M. Alimi

Abstract:

An application of Beta wavelet networks to synthesize high-pass and low-pass wavelet filters is investigated in this work. A Beta wavelet network is constructed using a parametric function called the Beta function in order to solve a nonlinear approximation problem. We combine filter design theory with wavelet network approximation to synthesize perfect reconstruction filters. The filter order is given by the number of neurons in the hidden layer of the neural network. In this paper we use only the first derivative of the Beta function to illustrate the proposed design procedure and exhibit its performance.

Keywords: Beta wavelets, Wavenet, multiresolution analysis, perfect filter reconstruction, salient point detect, repeatability.

236 A Nonoblivious Image Watermarking System Based on Singular Value Decomposition and Texture Segmentation

Authors: Soroosh Rezazadeh, Mehran Yazdi

Abstract:

In this paper, a robust digital image watermarking scheme for copyright protection applications using the singular value decomposition (SVD) is proposed. In this scheme, an entropy masking model has been applied to the host image for texture segmentation. Moreover, the local luminance and textures of the host image are considered in the watermark embedding procedure to increase the robustness of the watermarking scheme. In contrast to existing SVD-based watermarking systems, which have been designed to embed visual watermarks, our system uses a pseudo-random sequence as the watermark. We have tested the performance of our method using a wide variety of image processing attacks on different test images. A comparison is made between the results of our proposed algorithm and those of a wavelet-based method to demonstrate the superior performance of our algorithm.

Keywords: Watermarking, copyright protection, singular value decomposition, entropy masking, texture segmentation.

235 Real-Time Episodic Memory Construction for Optimal Action Selection in Cognitive Robotics

Authors: Deon de Jager, Yahya Zweiri, Dimitrios Makris

Abstract:

The three most important components in the cognitive architecture for cognitive robotics are memory representation, memory recall, and action selection performed by the executive. In this paper, action selection, performed by the executive, is defined as a memory quantification and optimization process. The methodology describes the real-time construction of episodic memory through semantic memory optimization. The optimization is performed by set-based particle swarm optimization, using an adaptive entropy memory quantification approach for fitness evaluation. The performance of the approach is experimentally evaluated by simulation, where a UAV is tasked with the collection and delivery of a medical package. The experiments show that the UAV dynamically uses the episodic memory to autonomously control its velocity while successfully completing its mission.

Keywords: Cognitive robotics, semantic memory, episodic memory, maximum entropy principle, particle swarm optimization.

234 Applied Actuator Fault Accommodation in Flight Control Systems Using Fault Reconstruction Based FDD and SMC Reconfiguration

Authors: A. Ghodbane, M. Saad, J.-F. Boland, C. Thibeault

Abstract:

Historically, actuator redundancy was used to deal with faults occurring suddenly in flight systems. This technique was generally expensive, time consuming, and involved increased weight and space in the system. Therefore, nowadays, on-line fault diagnosis and accommodation of actuators play a major role in the design of avionic systems. These approaches, known as Fault Tolerant Flight Control systems (FTFCs), are able to adapt to such sudden faults while keeping avionics systems lighter and less expensive. In this paper, an FTFC system based on the geometric approach and a Reconfigurable Flight Control (RFC) scheme is presented. The geometric approach is used for cosmic-ray fault reconstruction, while Sliding Mode Control (SMC) based on Lyapunov stability theory is designed for the reconfiguration of the controller in order to compensate for the fault effect. Matlab®/Simulink® simulations are performed to illustrate the effectiveness and robustness of the proposed flight control system against faulty actuator signals caused by cosmic rays. The results demonstrate the successful real-time implementation of the proposed FTFC system on a non-linear 6-DOF aircraft model.

Keywords: Actuators’ faults, Fault detection and diagnosis, Fault tolerant flight control, Sliding mode control, Geometric approach for fault reconstruction, Lyapunov stability.

233 The Cardiac Diagnostic Prediction Applied to a Designed Holter

Authors: Leonardo Juan Ramírez López, Javier Oswaldo Rodriguez Velasquez

Abstract:

We have designed a Holter that measures the heart's activity for over 24 hours, implemented a prediction methodology, and generated alarms as well as indicators for patients and treating physicians. Various diagnostic advances have been achieved in clinical cardiology thanks to Holter implementation; however, their interpretation has largely been conditioned on clinical analysis and measurements adjusted to diverse population characteristics, turning it into a subjective examination. Such interpretation, moreover, requires validation through vast population studies that, in turn, have not achieved the ultimate goal: mortality prediction. Given this context, our Insight Research Group developed a mathematical methodology that assesses cardiac dynamics through entropy and probability, creating a numerical and geometrical attractor which allows quantifying the normalcy of chronic and acute disease as well as the evolution between such states, and our Tigum Research Group developed a Holter device with 12 channels and advanced computer software. This approach has been shown in different contexts to achieve 100% sensitivity and specificity.

Keywords: Entropy, mathematical, prediction, cardiac, holter, attractor.
