Search results for: Gray Code.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 825

495 Numerical Analysis of the Influence of Airfoil Asymmetry on VAWT Performance

Authors: Marco Raciti Castelli, Giulia Simioni, Ernesto Benini

Abstract:

This paper presents a model for the evaluation of the energy performance and aerodynamic forces acting on a small three-bladed vertical axis Darrieus wind turbine as a function of blade chord curvature with respect to the rotor axis. The adopted survey methodology is based on an analytical code coupled to a solid modeling software, capable of generating the desired blade geometry from the blade design geometric parameters, which is linked to a finite volume CFD code for the calculation of rotor performance. After describing and validating the model with experimental data, the results of numerical simulations are presented on the basis of two different blade profile architectures, characterized respectively by a straight chord and by a curved one whose chord radius equals the rotor external circumference. A CFD campaign of analysis is completed for three candidate blade airfoil sections: the recently developed DU 06-W-200 cambered blade profile, a classical symmetrical NACA 0021, and a cambered airfoil derived from the latter, with a chord radius equal to the rotor external circumference. The effects of blade chord curvature on the angle of attack and on the blade tangential and normal forces are first investigated; the overall rotor torque and power are then analyzed as a function of blade azimuthal position, yielding a numerical quantification of the influence of blade camber on overall rotor performance.

Keywords: VAWT, NACA 0021, DU 06-W-200, cambered airfoil

494 A Comparison of Real Valued Transforms for Image Compression

Authors: Shivali D. Kulkarni, Ameya K. Naik, Nitin S. Nagori

Abstract:

In this paper, we present simulation results for the application of a bandwidth-efficient algorithm (the mapping algorithm) to an image transmission system. This system considers three different real-valued transforms to generate energy-compact coefficients. Results are first presented for gray-scale and color image transmission in the absence of noise; the system performs best when the discrete cosine transform is used. The performance of the system is also dominated more by the size of the transform block than by the number of coefficients transmitted or the number of bits used to represent each coefficient. Similar results are obtained in the presence of additive white Gaussian noise: varying values of the bit error rate have little or no impact on the performance of the algorithm. Optimum results are obtained with an 8x8 transform block, transmitting 15 coefficients from each block using 8 bits per coefficient.
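
As a rough illustration of the best-performing configuration reported above, the following Python sketch applies an 8x8 DCT, keeps 15 coefficients per block, and quantizes each to 8 bits. The choice of the 15 largest-magnitude coefficients (rather than, say, a zigzag scan) and the uniform quantizer are assumptions, not the paper's exact scheme.

```python
# A minimal sketch (assumed parameters) of the reported best setting:
# 8x8 DCT blocks, 15 retained coefficients, 8 bits per coefficient.
import numpy as np
from scipy.fft import dctn, idctn

def compress_block(block, n_keep=15, n_bits=8):
    """Transform an 8x8 block, keep the n_keep largest-magnitude DCT
    coefficients, quantize each to n_bits, and return the reconstruction."""
    coeffs = dctn(block.astype(float), norm='ortho')
    flat = coeffs.ravel()
    keep = np.argsort(np.abs(flat))[::-1][:n_keep]   # retained coefficient indices
    kept = np.zeros_like(flat)
    kept[keep] = flat[keep]
    # Uniform n_bits quantization of the retained values (an assumption).
    vmax = np.abs(kept).max() or 1.0
    q = np.round(kept / vmax * (2**(n_bits - 1) - 1))
    dq = q / (2**(n_bits - 1) - 1) * vmax
    return idctn(dq.reshape(block.shape), norm='ortho')

rng = np.random.default_rng(0)
block = rng.integers(0, 256, (8, 8))
rec = compress_block(block)
mse = np.mean((block - rec) ** 2)
print("PSNR (dB):", 10 * np.log10(255**2 / mse))
```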

Keywords: Additive white Gaussian noise channel, mapping algorithm, peak signal to noise ratio, transform encoding.

493 Filtering and Reconstruction System for Gray Forensic Images

Authors: Ahd Aljarf, Saad Amin

Abstract:

Images are an important source of information used as evidence during any investigation process, so their clarity and accuracy are of the utmost importance. Images are vulnerable to losing blocks and to added noise, introduced either through later alteration or when the image was first captured; a high-performance image processing system and its implementation are therefore very important from a forensic point of view. This paper focuses on improving the quality of forensic images. For various reasons, packets that store image data can be corrupted, damaged, or lost because of noise; for example, sending an image through a wireless channel can cause loss of bits. Such errors generally degrade the visual display quality of forensic images. Two image problems are covered: noise and lost blocks. More generally, information transmitted through any communication channel may be altered from its original state or lose important data due to channel noise. A system is therefore introduced to improve the quality and clarity of forensic images.

Keywords: Image Filtering, Image Reconstruction, Image Processing, Forensic Images.

492 Physio-mechanical Properties of Aluminium Metal Matrix Composites Reinforced with Al2O3 and SiC

Authors: D. Sujan, Z. Oo, M. E. Rahman, M. A. Maleque, C. K. Tan

Abstract:

Particulate reinforced metal matrix composites (MMCs) are potential materials for various applications due to their advantageous physical and mechanical properties. This paper presents a study on the performance of stir-cast Al2O3- and SiC-reinforced metal matrix composite materials. The results indicate that the composite materials exhibit improved physical and mechanical properties, such as a low coefficient of thermal expansion, high ultimate tensile strength, high impact strength, and high hardness. It has been found that with increasing weight percentage of reinforcement particles in the aluminium matrix, the new material exhibits a lower wear rate under abrasive wear. Being much lighter than conventional gray cast iron, the Al-Al2O3 and Al-SiC composites could be potential green materials for applications in the automobile industry, for instance in car disc brake rotors.

Keywords: Metal Matrix Composite, Strength to Weight Ratio, Wear Rate

491 The MUST ADS Concept

Authors: J-B. Clavel, N. Thiollière, B. Mouginot

Abstract:

The presented work is motivated by a French law regarding nuclear waste management. A new conceptual Accelerator Driven System (ADS) designed for Minor Actinide (MA) transmutation has been assessed by numerical simulation. The MUltiple Spallation Target (MUST) ADS combines high thermal power (up to 1.4 GWth) and high specific power. A 30 mA, 1 GeV proton beam is divided into three secondary beams transmitted onto three liquid lead-bismuth spallation targets. Neutronic and thermal-hydraulic simulations have been performed with the code MURE, based on the Monte Carlo transport code MCNPX. A methodology has been developed to define the characteristics of the MUST ADS concept according to a specific transmutation scenario. The reference scenario is based on an MA flux (neptunium, americium and curium) coming from European Pressurized Reactors (EPR), and a plutonium multi-reprocessing strategy is accounted for. The MUST ADS reference concept is a sodium cooled fast reactor. The MA fuel at equilibrium is mixed with an MgO inert matrix to limit the core reactivity and improve the fuel thermal conductivity. The fuel is irradiated over five years; five years of cooling and two years for fuel fabrication are taken into account. The MUST ADS reference concept burns about 50% of the initial MA inventory during a complete cycle. In terms of mass, up to 570 kg/year are transmuted in one concept. The methodology used to design the MUST ADS and to calculate the fuel composition at equilibrium is described precisely in the paper. A detailed fuel evolution analysis is performed, and the reference scenario is compared to a scenario where only americium transmutation is performed.

Keywords: Accelerator Driven System, double strata scenario, minor actinides, MUST, transmutation.

490 Performance Analysis of Digital Signal Processors Using SMV Benchmark

Authors: Erh-Wen Hu, Cyril S. Ku, Andrew T. Russo, Bogong Su, Jian Wang

Abstract:

Unlike general-purpose processors, digital signal processors (DSP processors) are strongly application-dependent. To meet the needs of diverse applications, a wide variety of DSP processors based on architectures ranging from the traditional to VLIW have been introduced to the market over the years. The functionality, performance, and cost of these processors vary over a wide range. In selecting a processor that meets the design criteria for an application, processor performance is usually the major concern for digital signal processing (DSP) application developers. Performance data are also essential for designers of DSP processors to improve their designs. Consequently, several DSP performance benchmarks have been proposed over the past decade or so. However, none of these benchmarks seems to have included recent new DSP applications. In this paper, we use a new benchmark that we recently developed to compare the performance of popular DSP processors from Texas Instruments and StarCore. The new benchmark is based on the Selectable Mode Vocoder (SMV), a speech-coding program from recent third-generation (3G) wireless voice applications. All benchmark kernels are compiled by the compilers of the respective DSP processors and run on their simulators. The weighted arithmetic mean of clock cycles and the arithmetic mean of code size are used to compare the performance of five DSP processors. In addition, we studied how the performance of a processor is affected by code structure, processor architecture features, and compiler optimization. The extensive experimental data gathered, analyzed, and presented in this paper should help DSP processor and compiler designers meet their specific design goals.

Keywords: digital signal processors, DSP benchmark, instruction level parallelism, modified cyclomatic complexity, performance analysis.

489 Stability of Concrete Moment Resisting Frames in View of Current Codes Requirements

Authors: Mahmoud A. Mahmoud, Ashraf Osman

Abstract:

In this study, the different approaches currently followed by design codes to assess the stability of buildings utilizing the concrete moment resisting frame structural system are evaluated. For this purpose, a parametric study was performed. It involved analyzing a group of concrete moment resisting frames having different slenderness ratios (height/width ratios), designed for different lateral-to-vertical load ratios, and constructed from ordinary reinforced concrete and high strength concrete, checking stability and overall buckling using both code approaches and computer buckling analysis. The objectives were to examine the influence of these parameters, which are directly linked to the frames' lateral stiffness, on building stability, and to evaluate the code approaches in view of the buckling analysis results. Based on this study, it was concluded that the buildings most susceptible to instability and magnification of second order effects are those having high aspect ratios (height/width), low lateral-to-vertical load ratios, and construction materials of high strength. In addition, the study showed that the instability limits imposed by codes are mainly mathematical limits intended to ensure reliable analysis rather than physical ones, and that they are in general conservative. It has also been shown that the upper limit set by one of the codes, that the second order moment for structural elements should be limited to 1.4 times the first order moment, is not justified; instead, the overall story check is more reliable.

Keywords: Buckling, lateral stability, p-delta, second order.

488 Methods of Geodesic Distance in Two-Dimensional Face Recognition

Authors: Rachid Ahdid, Said Safi, Bouzid Manaut

Abstract:

In this paper, we present a comparative study of three methods for 2D face recognition: Iso-Geodesic Curves (IGC), Geodesic Distance (GD) and Geodesic-Intensity Histogram (GIH). These approaches are based on computing geodesic distances between points of the facial surface and between facial curves. In this study, the gray-level image is represented as a 2D surface in a 3D space, with the third coordinate proportional to the intensity values of the pixels. In the classification step, we use Neural Networks (NN), K-Nearest Neighbor (KNN) and Support Vector Machines (SVM). The images used in our experiments are from two well-known face image databases, ORL and YaleB. The ORL database was used to evaluate the performance of the methods under conditions where the pose and sample size are varied, and the YaleB database was used to examine the performance of the systems when facial expressions and lighting are varied.
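
The surface embedding described above lends itself to a simple approximation: treat each pixel as a point (x, y, k·I(x, y)) and run Dijkstra's algorithm on the 8-connected pixel graph. The Python sketch below follows this idea; the scale factor k and the graph-based approximation are our assumptions, not the authors' implementation.

```python
# A minimal sketch (not the authors' code): a gray image is treated as a
# surface (x, y, k*I(x,y)) and the geodesic distance between two pixels is
# approximated by Dijkstra's shortest path on the 8-connected pixel graph.
import heapq
import numpy as np

def geodesic_distance(img, src, dst, k=1.0):
    h, w = img.shape
    z = k * img.astype(float)        # third coordinate: scaled intensity
    dist = np.full((h, w), np.inf)
    dist[src] = 0.0
    pq = [(0.0, src)]
    while pq:
        d, (y, x) = heapq.heappop(pq)
        if (y, x) == dst:
            return d
        if d > dist[y, x]:
            continue                 # stale queue entry
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dy == dx == 0:
                    continue
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    # Euclidean step length on the embedded surface.
                    step = np.hypot(np.hypot(dy, dx), z[ny, nx] - z[y, x])
                    if d + step < dist[ny, nx]:
                        dist[ny, nx] = d + step
                        heapq.heappush(pq, (d + step, (ny, nx)))
    return dist[dst]

img = np.tile(np.arange(16, dtype=float), (16, 1))  # toy intensity ramp
print(geodesic_distance(img, (0, 0), (15, 15), k=0.5))
```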

Keywords: 2D face recognition, Geodesic distance, Iso-Geodesic Curves, Geodesic-Intensity Histogram, facial surface, Neural Networks, K-Nearest Neighbor, Support Vector Machines.

487 Effect of Coal on Engineering Properties in Building Materials: Opportunity to Manufacturing Insulating Bricks

Authors: Bachir Chemani, Halima Chemani

Abstract:

The objective of this study is to investigate the effect of adding coal in order to obtain an insulating ceramic product. Mixtures were prepared with four different mass compositions, consisting of gray clay, yellow clay, and coal. Analyses are performed on local raw materials with coal added as an additive. The coal content varies from 5 to 20% by weight, with coal particle sizes ranging from 0.25 mm to 1.60 mm.

Initially, the natural moisture content of each raw material was determined at 105°C in a laboratory oven. The influence of low coal content on absorption, apparent density, shrinkage, and compressive strength was then evaluated. The experimental results showed that an optimized composition could be obtained by adding 10% coal by weight, leading to insulating ceramic products with a water absorption of 9.40%, a density of 1.88 g/cm3 and a compressive strength of 35.46 MPa. The results show that coal, when mixed with traditional raw materials, meets the conditions to be used as an additive in the production of lightweight ceramic products.

Keywords: Clay, coal, resistance to compression, insulating bricks.

486 Smart Surveillance using PDA

Authors: Basem Mustafa Abd. Amer, Syed Abdul Rahman Al-Attas

Abstract:

The aim of this research is to develop a fast and reliable surveillance system based on a personal digital assistant (PDA) device, extending to the PDA the moving object detection capability already available on personal computers, and to compare the performance of Background Subtraction (BS) and Temporal Frame Differencing (TFD) techniques to determine which is more suitable for the PDA platform. To reduce noise and prepare frames for moving object detection, each frame is first converted to a gray-scale representation and then smoothed using a Gaussian low pass filter. Two moving object detection schemes, BS and TFD, have been analyzed. The background frame is updated using an Infinite Impulse Response (IIR) filter so that it adapts to varying illumination conditions and geometry settings. To reduce the effect of noise pixels resulting from frame differencing, morphological erosion and dilation filters are applied. This research found that the TFD technique is more suitable than BS for motion detection in terms of speed: on average, TFD is approximately 170 ms faster than the BS technique.
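
A minimal Python sketch of the two schemes compared above, with background subtraction using the IIR background update the abstract describes; the threshold and IIR weight values are assumed, and the Gaussian smoothing and morphological steps are omitted for brevity.

```python
# Contrast TFD (consecutive-frame differencing) with BS (difference against
# an IIR-updated background) on gray-scale frames. ALPHA and THRESH are
# assumed values, not the paper's.
import numpy as np

ALPHA = 0.05   # IIR update weight (assumed)
THRESH = 25    # difference threshold (assumed)

def tfd_mask(prev_frame, frame):
    """TFD: threshold the absolute difference of consecutive frames."""
    return np.abs(frame.astype(int) - prev_frame.astype(int)) > THRESH

def bs_step(background, frame):
    """BS: threshold against the background, then update it with an IIR
    filter so it adapts to slow illumination changes."""
    mask = np.abs(frame.astype(int) - background.astype(int)) > THRESH
    background = (1 - ALPHA) * background + ALPHA * frame   # IIR update
    return mask, background

rng = np.random.default_rng(1)
frames = [rng.integers(0, 256, (120, 160)).astype(float) for _ in range(3)]
bg = frames[0].copy()
for prev, cur in zip(frames, frames[1:]):
    m_tfd = tfd_mask(prev, cur)
    m_bs, bg = bs_step(bg, cur)
    print(m_tfd.sum(), m_bs.sum())   # moving-pixel counts per scheme
```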

Keywords: Surveillance, PDA, Motion Detection, Image Processing, Background Subtraction.

485 Bridging Quantitative and Qualitative of Glaucoma Detection

Authors: Noor Elaiza Abdul Khalid, Noorhayati Mohamed Noor, Zamalia Mahmud, Saadiah Yahya, and Norharyati Md Ariff

Abstract:

Glaucoma diagnosis involves extracting three features of the fundus image: the optic cup, the optic disc and the vasculature. Present manual diagnosis is expensive, tedious and time consuming, and a number of studies have been conducted to automate this process. However, the variability between the diagnostic capability of an automated system and that of an ophthalmologist has yet to be established. This paper discusses the efficiency of, and variability between, ophthalmologist opinion and a digital technique, thresholding. The efficiency and variability measures are based on image quality grading: poor, satisfactory or good. The images are separated into four channels: gray, red, green and blue. Three ophthalmologists graded the images based on image quality. The images are then thresholded using multithresholding and graded in the same way, and the grades from the ophthalmologists and from thresholding are compared. The results show only a small variability between the results of the ophthalmologists and the digital thresholding.
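
For illustration, a small Python sketch of the channel separation and multithresholding step described above; the gray-scale weights are the usual luma coefficients, and the two cut points are purely illustrative assumptions.

```python
# Split an RGB fundus image into the four channels used in the paper and
# apply a simple multithreshold. The cut points (85, 170) are assumed.
import numpy as np

def channels(rgb):
    """Return the gray, red, green and blue channels of an RGB image."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    gray = 0.299 * r + 0.587 * g + 0.114 * b   # standard luma weights
    return {"gray": gray, "red": r, "green": g, "blue": b}

def multithreshold(channel, cuts=(85, 170)):
    """Map intensities into len(cuts)+1 classes; cut points are illustrative."""
    return np.digitize(channel, cuts)

rng = np.random.default_rng(2)
img = rng.integers(0, 256, (64, 64, 3)).astype(float)  # stand-in fundus image
for name, ch in channels(img).items():
    labels = multithreshold(ch)
    print(name, np.bincount(labels.ravel(), minlength=3))  # class sizes
```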

Keywords: Digital Fundus Image, Glaucoma Detection, Multithresholding, Segmentation.

484 Numerical Simulation of Free Surface Water Wave for the Flow around NACA 0012 Hydrofoil and Wigley Hull Using VOF Method

Authors: Saadia Adjali, Omar Imine, Mohammed Aounallah, Mustapha Belkadi

Abstract:

Steady three-dimensional and two-dimensional free surface waves generated by moving bodies are presented. The flow problem to be simulated is rich in complexity and poses many modeling challenges because of the existence of breaking waves around the ship hull and because of the interaction of the two-phase flow with the turbulent boundary layer. The results of several simulations are reported. The first study was performed for a NACA 0012 hydrofoil with different meshes; this section is analyzed at h/c = 1.0345 in 2D. In the second simulation, a mathematically defined Wigley hull form is used to investigate the application of a commercial CFD code to the prediction of the total resistance and its components from the tangential and normal forces on the wetted hull surface. The computed resistance and wave profiles are used to estimate the total resistance coefficient of the Wigley hull advancing in calm water under steady conditions. The commercial CFD software FLUENT version 12 is used for the computations in the present study, and the computational grid is generated with GAMBIT 2.3.26. The k-ω SST shear stress transport model is used for turbulence modeling, and the volume of fluid technique is employed to simulate the free-surface motion. The second order upwind scheme is used to discretize the convection terms in the momentum transport equations, and the modified HRIC scheme is used for the VOF discretization. The results obtained compare well with the experimental data.

Keywords: Free surface flows, Breaking waves, Boundary layer, Wigley hull, Volume of fluid.

482 PeliGRIFF: A Parallel DEM-DLM/FD Method for DNS of Particulate Flows with Collisions

Authors: Anthony Wachs, Guillaume Vinay, Gilles Ferrer, Jacques Kouakou, Calin Dan, Laurence Girolami

Abstract:

An original Direct Numerical Simulation (DNS) method to tackle the problem of particulate flows at moderate to high concentration and finite Reynolds number is presented. Our method is built on the framework established by Glowinski and his coworkers [1], in the sense that we use their Distributed Lagrange Multiplier/Fictitious Domain (DLM/FD) formulation and their operator-splitting idea, but differs in the treatment of particle collisions. The novelty of our contribution lies in replacing the simple artificial repulsive force collision model usually employed in the literature with an efficient Discrete Element Method (DEM) granular solver. The DEM solver enables us to consider particles of arbitrary (at least convex) shape and to account for actual contacts, in the sense that particles really touch each other, in contrast with a repulsive force model. We recently upgraded our serial code, GRIFF [2], to full MPI capabilities. Our new code, PeliGRIFF, is developed within the full MPI open source platform PELICANS [3]. The new MPI capabilities of PeliGRIFF open new perspectives in the study of particulate flows and significantly increase the number of particles that can be considered in a full DNS approach: O(100000) in 2D and O(10000) in 3D. Results on the 2D/3D sedimentation/fluidization of isometric polygonal/polyhedral particles with collisions are presented.

Keywords: Particulate flow, distributed Lagrange multiplier/fictitious domain method, discrete element method, polygonal shape, sedimentation, distributed computing, MPI.

481 Stability Analysis of a Tricore

Authors: C. M. De Marco Muscat-Fenech, A.M. Grech La Rosa

Abstract:

The application of stability theory has led to detailed studies of different types of vessels; however, the shortage of information relating to multihull vessels demanded further investigation. This study shows that the position of the hulls has a very influential effect on both the transverse and the longitudinal stability of the tricore. The HSC stability code is applied for the optimisation of the hull configurations. Such optimisation criteria would undoubtedly aid the performance of the vessel for both commercial and leisure purposes.

Keywords: Stability, Multihull, Tricore

480 Performance Evaluation of Low Density Parity Check Codes

Authors: Othman O. Khalifa, Sheroz khan, Mohamad Zaid, Muhamad Nawawi

Abstract:

This paper presents a study of one of the most widely used error correcting codes, Low Density Parity Check (LDPC) codes. The discussion concentrates on regular LDPC codes, in particular the regular binary LDPC codes introduced by Gallager.
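
For concreteness, the following Python sketch builds a regular parity-check matrix by Gallager's classical construction, the code family the abstract discusses; the specific parameters (n, wc, wr) are illustrative assumptions.

```python
# A minimal sketch (not from the paper) of Gallager's construction of a
# regular (wc, wr) LDPC parity-check matrix: wc column-permuted copies of a
# band of consecutive ones are stacked vertically.
import numpy as np

def gallager_ldpc(n, wc, wr, seed=0):
    """Regular LDPC parity-check matrix: every column has wc ones and every
    row has wr ones. n must be divisible by wr."""
    assert n % wr == 0
    rng = np.random.default_rng(seed)
    rows = n // wr
    band = np.zeros((rows, n), dtype=int)
    for i in range(rows):
        band[i, i * wr:(i + 1) * wr] = 1          # first band: consecutive ones
    bands = [band] + [band[:, rng.permutation(n)] for _ in range(wc - 1)]
    return np.vstack(bands)                        # stack wc permuted bands

H = gallager_ldpc(n=20, wc=3, wr=4)
# Shape (15, 20); every column weight is 3, every row weight is 4.
print(H.shape, H.sum(axis=0).min(), H.sum(axis=1).min())
```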

Keywords: LDPC, channel coding.

479 Numerical Investigation of Nozzle Shape Effect on Shock Wave in Natural Gas Processing

Authors: Esam I. Jassim, Mohamed M. Awad

Abstract:

Natural gas flow contains undesirable solid particles, liquid condensate, and/or oil droplets, and requires reliable removal equipment to perform filtration. Recent natural gas processing applications demand compact and reliable process equipment. Since conventional means are complicated in design, poor in efficiency, and lacking in robustness, the supersonic nozzle has been introduced as an alternative means of meeting such demands. A 3-D convergent-divergent nozzle is simulated using a commercial code for nozzle pressure ratios (NPR) varying from 1.2 to 2. Six different nozzle shapes are numerically examined to locate the position of the shock wave, as this spot can be considered a benchmark for particle separation. Rectangular, triangular, circular, elliptical, pentagonal, and hexagonal nozzles, all with the same cross-sectional area, are simulated using the Fluent code. The simple one-dimensional inviscid theory does not describe the actual features of the fluid flow precisely, as it ignores the impact of the nozzle configuration on the flow properties. The CFD simulation results, however, show that the nozzle geometry influences the flow structure, including the location of the shock wave. The CFD analysis predicts shock appearance when p01/pa > 1.2 for almost all geometries, located at low area ratios (Ae/At). The simulation results show that, at relatively small NPR, the shock wave in the elliptical nozzle lies farthest from the throat; as NPR increases, the hexagonal nozzle becomes the farthest. The numerical results are compared with available experimental data and show good agreement in terms of shock location and flow structure.

Keywords: CFD, Particle Separation, Shock wave, Supersonic Nozzle.

478 Effects of Gamma Radiation on Tomato Leafminer, Tuta absoluta (Meyrick) (Lepidoptera: Gelechiidae)

Authors: Akın Kuyulu, Hanife Genç

Abstract:

The present study aimed to evaluate the impact of gamma radiation on the tomato leafminer at different biological stages. A laboratory colony of the tomato leafminer was used to set up the experiments. Different biological stages of the insects (eggs, 4th instars and pupae) were irradiated with Cobalt-60 at doses of 0 (control), 100 Gray (Gy), 200 Gy, 300 Gy and 400 Gy in a Cos-44HH-N source, at a dose rate of 480 Gy/h. After irradiation, the eggs were incubated until hatching, and the mature larvae were reared to complete their development. Adult emergence from irradiated pupae was also evaluated. The results showed no egg hatching at any of the tested irradiation doses. Although the pupation percentages of irradiated mature larvae were 54%, 15% and 8% at doses of 100 Gy, 200 Gy and 300 Gy respectively, no adults emerged from irradiated mature larvae. On the other hand, adult emergence was observed from irradiated pupae; it decreased as the radiation dose increased, and the emerging adults showed malformations. Male and female individuals were outcrossed with laboratory-reared adults, and fecundity was correlated with radiation dose.

Keywords: Irradiation, tomato, tomato leafminer, Tuta absoluta.

477 Data Hiding by Vector Quantization in Color Image

Authors: Yung-Gi Wu

Abstract:

With the growth of computers and networks, digital data can be spread anywhere in the world quickly. Digital data can also be copied or tampered with easily, so security has become an important topic in the protection of digital data. A digital watermark is a method of protecting the ownership of digital data, although embedding the watermark inevitably affects quality. In this paper, Vector Quantization (VQ) is used to embed a watermark in an image for the purpose of data hiding. This kind of watermarking is invisible: users will not be conscious of the existence of the embedded watermark even though the watermarked image differs slightly from the original. Since VQ carries a heavy computational burden, we adopt a fast VQ encoding scheme using partial distortion search (PDS) and a mean approximation scheme to speed up the data hiding process. The watermarks hidden in the image may be gray-scale, bi-level or color images; text can also be embedded as a watermark. To test the robustness of the system, we use Photoshop to apply sharpening, cropping and alteration and check whether the extracted watermark is still recognizable. Experimental results demonstrate that the proposed system can resist these three kinds of tampering in general cases.
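
A minimal Python sketch of the partial distortion search (PDS) speed-up named above, not the authors' code: the accumulation of squared error for a candidate codeword is abandoned as soon as it exceeds the best distortion found so far, which prunes most of the computation without changing the result.

```python
# PDS early termination in VQ encoding: abandon a codeword as soon as its
# partial squared-error sum exceeds the current best distortion.
import numpy as np

def pds_encode(vector, codebook):
    """Return the index of the nearest codeword, with PDS early termination."""
    best_idx, best_dist = 0, float("inf")
    for idx, codeword in enumerate(codebook):
        dist = 0.0
        for v, c in zip(vector, codeword):
            dist += (v - c) ** 2
            if dist >= best_dist:      # partial distortion already too large
                break
        else:                          # completed the sum: new best codeword
            best_idx, best_dist = idx, dist
    return best_idx

rng = np.random.default_rng(3)
codebook = rng.random((256, 16))       # 256 codewords of dimension 16 (assumed sizes)
block = rng.random(16)
print(pds_encode(block, codebook))
```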

Keywords: Data hiding, vector quantization, watermark.

476 A Novel Prostate Segmentation Algorithm in TRUS Images

Authors: Ali Rafiee, Ahad Salimi, Ali Reza Roosta

Abstract:

Prostate cancer is one of the most frequent cancers in men and a major cause of mortality in most countries. Many diagnostic and treatment procedures for prostate disease require accurate detection of the prostate boundaries in transrectal ultrasound (TRUS) images. This is a challenging and difficult task due to weak prostate boundaries, speckle noise and the short range of gray levels. In this paper, a novel method for automatic prostate segmentation in TRUS images is presented. The method involves preprocessing (edge preserving noise reduction and smoothing) followed by segmentation. Speckle reduction is achieved using a stick filter, and a top-hat transform is implemented for smoothing. A feed-forward neural network and the local binary pattern are used together to find a point inside the prostate. Finally, the prostate boundary is extracted from this interior point using an active contour algorithm. A number of experiments were conducted to validate the method, and the results showed that the new algorithm extracts the prostate boundary with an MSE of less than 4.6% relative to the boundary traced manually by physicians.
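
As an illustration of the smoothing step named above, the following Python sketch applies a white top-hat transform with standard morphology operators; the 5x5 structuring element size is an assumption, and the stick filter, neural network and active contour stages are not reproduced here.

```python
# A minimal sketch (standard operators, not the authors' code) of a white
# top-hat on a speckle-reduced TRUS image: the top-hat isolates small bright
# structures, and subtracting it smooths them away.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(5)
trus = rng.integers(0, 256, (64, 64)).astype(float)   # stand-in TRUS image

tophat = ndimage.white_tophat(trus, size=(5, 5))      # 5x5 structure: assumed
smoothed = trus - tophat                              # equals the opening
print(smoothed.min(), smoothed.max())
```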

Keywords: Prostate segmentation, stick filter, neural network, active contour.

475 Gray Level Image Encryption

Authors: Roza Afarin, Saeed Mozaffari

Abstract:

The aim of this paper is image encryption using a Genetic Algorithm (GA). The proposed encryption method consists of two phases. In the modification phase, pixel locations are altered to reduce the correlation among adjacent pixels; pixel values are then changed in the diffusion phase to encrypt the input image. Both phases are performed by a GA with binary chromosomes. For the modification phase, the binary patterns are generated by the Local Binary Pattern (LBP) operator, while for the diffusion phase the binary chromosomes are obtained by Bit Plane Slicing (BPS). The initial population of the GA consists of the rows and columns of the input image. Instead of subjective selection of parents from this initial population, a random generator with a predefined key is utilized; this makes it possible to decrypt the coded image and reconstruct the initial input image. The fitness function is defined as the average number of 0-to-1 transitions in the LBP image for the modification phase and as histogram uniformity for the diffusion phase. The randomness of the encrypted image is measured by entropy, correlation coefficients and histogram analysis. Experimental results show that the proposed method is fast enough to be used effectively for image encryption.
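
A short Python sketch of the standard 8-neighbour LBP operator, which we assume matches the operator used above to generate the binary chromosomes of the modification phase.

```python
# Standard 8-neighbour LBP (assumed to match the paper's operator): each
# interior pixel becomes an 8-bit code, one bit per neighbour >= the centre.
import numpy as np

def lbp(img):
    p = img.astype(int)
    c = p[1:-1, 1:-1]                       # centre pixels
    neighbours = [p[:-2, :-2], p[:-2, 1:-1], p[:-2, 2:], p[1:-1, 2:],
                  p[2:, 2:], p[2:, 1:-1], p[2:, :-2], p[1:-1, :-2]]
    code = np.zeros_like(c)
    for bit, nb in enumerate(neighbours):   # one bit per neighbour comparison
        code |= (nb >= c).astype(int) << bit
    return code

rng = np.random.default_rng(4)
gray = rng.integers(0, 256, (8, 8))
print(lbp(gray))                            # 6x6 array of 8-bit LBP codes
```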

Keywords: Correlation coefficients, Genetic algorithm, Image encryption, Image entropy.

472 The Application of Hadamard Matrixes in the SNR Enhancement of Optical Time-Domain Reflectometry (OTDR)

Authors: Mingyu Zhong, Yi Xie

Abstract:

Results in one field necessarily give insight into others, and all have much potential for scientific and technological application. The Hadamard-transform technique, once applied to spectrometry, also has a use in the SNR enhancement of OTDR. In this report, a new set of codes (Simplex codes) is discussed, and the origin of the additional SNR gain is indicated.
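
A minimal Python sketch (not from the report) of how a Simplex-code matrix S is conventionally derived from a Sylvester Hadamard matrix: drop the first row and column and map +1 to 0 and -1 to 1. Probing with the rows of S and averaging is what produces the additional SNR gain discussed above.

```python
# Simplex (S-matrix) probe patterns from an order-n Sylvester Hadamard matrix.
import numpy as np
from scipy.linalg import hadamard

def simplex_matrix(n):
    """S-matrix of order n-1 from the order-n Hadamard matrix
    (n must be a power of two)."""
    H = hadamard(n)
    return ((1 - H[1:, 1:]) // 2).astype(int)   # map +1 -> 0, -1 -> 1

S = simplex_matrix(8)
print(S)                       # 7x7 matrix of 0/1 probe patterns
print(S.sum(axis=1))           # each row fires n/2 = 4 of the 7 slots
```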

Keywords: Hadamard transform, matrices, averaging, optical time-domain reflectometry (OTDR).

473 Establishment and Evaluation of Information System for Chemotherapy Care

Authors: Yi-Ting Liu, Pei-Ying Wen

Abstract:

In order to improve the overall safety of chemotherapy, a safety-protecting net was established for the whole process, from prescribing by physicians and transcribing by nurses, through dispensing by pharmacists, to administering by nurses. The information system was used to check and monitor the whole process of administration, and the related sheets were computerized to simplify the paperwork.

Keywords: Chemotherapy, Bar Code Medication Administration (BCMA), Medication Safety.

472 A Study of RSCMAC Enhanced GPS Dynamic Positioning

Authors: Ching-Tsan Chiang, Sheng-Jie Yang, Jing-Kai Huang

Abstract:

The purpose of this research is to develop and apply the RSCMAC to enhance the dynamic accuracy of the Global Positioning System (GPS). GPS devices provide accurate positioning, speed detection and a highly precise time standard for over 98% of the earth's surface. The overall operation of the Global Positioning System includes 24 GPS satellites in space; signal transmission comprising two carrier frequencies (Link 1 and Link 2) and two sets of pseudorandom codes (C/A code and P code); and on-earth monitoring stations or client GPS receivers. With only four satellites, the client's position and elevation can be determined rapidly, and the more satellites that are receivable, the more accurately the position can be decoded. The standard positioning accuracy of simplified GPS receivers has greatly increased, but owing to satellite clock error, tropospheric delay and ionospheric delay, current measurement accuracy is at the level of 5-15 m. To increase dynamic GPS positioning accuracy, most researchers rely on an inertial navigation system (INS) or on the installation of other sensors or maps for assistance. This research instead exploits the RSCMAC advantages of fast learning, assured learning convergence and the capability of solving time-related dynamic system problems, together with a static positioning calibration structure, to improve GPS dynamic accuracy. The improvement is achieved by using the RSCMAC with GPS receivers to collect dynamic error data for error prediction, and then using the predicted error to correct the GPS dynamic positioning data. The ultimate purpose of this research is to reduce the dynamic positioning error of cheap GPS receivers, so that their economic benefits are enhanced while their accuracy is increased.

Keywords: Dynamic Error, GPS, Prediction, RSCMAC.

471 CO2 Emission and Cost Optimization of Reinforced Concrete Frame Designed by Performance Based Design Approach

Authors: Jin Woo Hwang, Byung Kwan Oh, Yousok Kim, Hyo Seon Park

Abstract:

As the greenhouse effect has been recognized as a serious environmental problem, interest in carbon dioxide (CO2) emissions, which comprise the major part of greenhouse gas (GHG) emissions, has increased recently. Since the construction industry accounts for a relatively large portion of total worldwide CO2 emissions, extensive studies on reducing CO2 emissions in the construction and operation of buildings have been carried out since the 2000s. Performance based design (PBD) methodology based on nonlinear analysis has likewise developed strongly since the 1994 Northridge Earthquake, because structural engineers recognized that the prescriptive code based design approach cannot address inelastic earthquake responses directly or assess the performance of a building exactly. Although CO2 emissions and the PBD approach are both rising issues in the construction industry and structural engineering, little research has considered the two issues simultaneously. The objective of this study is therefore to minimize, at the structural design stage and with respect to the structural materials, the CO2 emissions and cost of a building designed by the PBD approach. A four-story, four-span reinforced concrete building was optimally designed, using the non-dominated sorting genetic algorithm II (NSGA-II), to minimize its CO2 emissions and cost while satisfying a specific seismic performance target (collapse prevention under the maximum considered earthquake) and prescriptive code regulations. The optimized design shows that minimal CO2 emissions and cost are achievable while the specified seismic performance is satisfied. The methodology proposed in this paper can therefore be used to reduce both the CO2 emissions and the cost of buildings designed by the PBD approach.

Keywords: CO2 emissions, performance based design, optimization, sustainable design.

470 Evaluating Refactoring with a Quality Index

Authors: Crt Gerlec, Marjan Hericko

Abstract:

The aim of every software product is to achieve an appropriate level of software quality. Developers and designers try to produce readable, reliable, maintainable, reusable and testable code, and several approaches have been utilized to help achieve these goals. In this paper, refactoring is evaluated using a quality index composed of different metric sets, each describing a particular quality aspect.

Keywords: Refactoring, Software Metrics, Software Quality, Quality Index, Agile methodologies

469 Accelerating GLA with an M-Tree

Authors: Olli Luoma, Johannes Tuikkala, Olli Nevalainen

Abstract:

In this paper, we propose a novel improvement to the Generalized Lloyd Algorithm (GLA). Our algorithm makes use of an M-tree index built on the codebook, which makes it possible to reduce the number of distance computations when searching for the nearest codewords. Our method does not impose the use of any specific distance function; it works with any metric distance, making it more general than many other fast GLA variants. Finally, we present the positive results of our performance experiments.
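
For reference, a naive Python sketch of one GLA iteration with full search; the nearest-codeword assignment step is exactly the set of distance computations that the proposed M-tree index prunes. The data and codebook sizes are illustrative assumptions.

```python
# One Lloyd iteration with naive full search (not the paper's M-tree search):
# assign each vector to its nearest codeword, then recentre each codeword.
import numpy as np

def gla_step(data, codebook):
    # Squared distances between every vector and every codeword: this is the
    # step an M-tree index accelerates.
    d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    nearest = d2.argmin(axis=1)
    new_cb = codebook.copy()
    for k in range(len(codebook)):
        members = data[nearest == k]
        if len(members):
            new_cb[k] = members.mean(axis=0)   # centroid update
    return new_cb, nearest

rng = np.random.default_rng(6)
data = rng.random((500, 8))                    # assumed training set
cb = data[rng.choice(500, 16, replace=False)]  # assumed codebook size 16
for _ in range(5):
    cb, assign = gla_step(data, cb)
print(np.bincount(assign, minlength=16))       # cluster sizes after 5 steps
```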

Keywords: Clustering, GLA, M-Tree, Vector Quantization.

468 Study of the Elastic Scattering of 16O, 14N and 12C on the Nucleus of 27Al at Different Energies near the Coulomb Barrier

Authors: N. Amangeldi, N. Burtebayev, Sh. Hamada, A. Amar

Abstract:

The angular distributions for the elastic scattering of 16O, 14N and 12C on 27Al were measured at an energy of 1.75 MeV/nucleon. The optical potential code SPIVAL was used in this work to analyze the experimental results. Good agreement between the experimental and theoretical results was obtained.

Keywords: 27Al(16O, 16O)27Al, SPIVAL, 27Al(14N, 14N)27Al, 27Al(12C, 12C)27Al, Elastic Scattering, Optical Potential Codes.

467 QR Technology to Automate Health Condition Detection Payment System: A Case Study in Schools of the Kingdom of Saudi Arabia

Authors: Amjad Alsulami, Farah Albishri, Kholod Alzubidi, Lama Almehemadi, Salma Elhag

Abstract:

Food allergy is a common and rising problem among children, and many students have their first allergic reaction at school; one such reaction, anaphylaxis, can be fatal. This study found that several schools' processes lacked safety regulations and information on how to handle allergy issues and chronic diseases such as diabetes, and that students were not supervised or monitored during the cafeteria purchasing process. Academic institutions make no obvious effort to prevent students from purchasing food that contains allergens or that negatively impacts the health of students who suffer from chronic diseases. The stability of students' health must be maintained, because it greatly affects their performance and educational achievement. To address this issue, this paper uses business process reengineering to propose automating the whole food-purchasing process, which will aid in detecting and avoiding allergic occurrences and in preventing side effects from foods that conflict with students' health. This may be achieved by designing a smart card with an embedded QR code that reveals which foods cause an allergic reaction in a given student. A survey was distributed to determine and examine how cafeterias handle allergic children and whether any management policy is applied in the schools; its findings indicate that integrating QR technology into the food purchasing process would improve health condition detection. Families agreed that the suggested solution would be advantageous because it ensures that their children avoid eating disallowed food. Moreover, analysis and simulation of the as-is process and the proposed process demonstrate an improvement in both quality and time.

Keywords: QR code, smart card, food allergies, business process reengineering, health condition detection.

466 Mean Codeword Lengths and Their Correspondence with Entropy Measures

Authors: R. K. Tuli

Abstract:

The objective of the present communication is to develop new genuine exponentiated mean codeword lengths and to study in depth the problem of correspondence between well-known measures of entropy and mean codeword lengths. With the help of some standard measures of entropy, we illustrate such a correspondence. In the literature, one comes across many inequalities that are frequently used in information theory; keeping this in mind, we develop such inequalities via a coding theory approach.
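
As background for the correspondence studied above, the standard inequalities can be stated as follows (a sketch in LaTeX of classical results; the specific exponentiated lengths developed in the paper are not reproduced here):

```latex
% For a uniquely decipherable code over a D-letter alphabet with codeword
% lengths l_i and source probabilities p_i, Kraft's inequality
% \sum_i D^{-l_i} \le 1 yields the noiseless-coding bound on the ordinary
% mean codeword length:
\[
  L = \sum_{i} p_i\, l_i \;\ge\; H(P) = -\sum_{i} p_i \log_D p_i .
\]
% Campbell's exponentiated mean codeword length of order t > 0,
\[
  L(t) = \frac{1}{t}\,\log_D \Bigl( \sum_i p_i\, D^{\,t\, l_i} \Bigr),
\]
% is likewise bounded below by the Renyi entropy of order \alpha = 1/(1+t).
```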

Keywords: Codeword, Code alphabet, Uniquely decipherable code, Mean codeword length, Uncertainty, Noiseless channel.
