Search results for: octree compression techniques

2736 Exergy Analysis of Vapour Compression Refrigeration System Using R507A, R134a, R114, R22 and R717

Authors: Ali Dinarveis

Abstract:

This paper compares the energy and exergy efficiencies of vapour compression refrigeration systems using refrigerants from different groups. Five refrigerants, R507A, R134a, R114, R22 and R717, are studied. The EES program is used to solve the thermodynamic equations, and the results of the analysis are presented graphically. Based on the results, the energy and exergy efficiencies for R717 are higher than those of the other refrigerants. Both efficiencies decrease with increasing condensing temperature and decreasing evaporating temperature.

Keywords: Energy, exergy, refrigeration, temperature, thermodynamic.
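
As a worked illustration of the second-law (exergy) efficiency used in such comparisons, the sketch below computes it as the ratio of the cycle COP to the reversible (Carnot) COP between the evaporating and condensing temperatures; the temperatures and COP value are illustrative placeholders, not the paper's EES results.

```python
# Minimal sketch: second-law (exergy) efficiency of a vapour compression cycle,
# assuming it is defined as COP divided by the reversible (Carnot) COP between
# the evaporating and condensing temperatures. Refrigerant-specific property
# data (as obtained from EES in the paper) are not reproduced; the COP below
# is purely illustrative.

def carnot_cop(t_evap_k: float, t_cond_k: float) -> float:
    """Reversible COP of a refrigeration cycle between two temperatures [K]."""
    return t_evap_k / (t_cond_k - t_evap_k)

def exergy_efficiency(cop_actual: float, t_evap_k: float, t_cond_k: float) -> float:
    """Second-law efficiency = actual COP / reversible COP."""
    return cop_actual / carnot_cop(t_evap_k, t_cond_k)

if __name__ == "__main__":
    t_evap, t_cond = 263.15, 313.15   # -10 degC evaporator, 40 degC condenser
    cop = 2.8                         # hypothetical cycle COP
    print(f"Carnot COP       : {carnot_cop(t_evap, t_cond):.2f}")
    print(f"Exergy efficiency: {exergy_efficiency(cop, t_evap, t_cond):.2%}")
```

Raising the condensing temperature in this relation lowers the Carnot COP and, for a fixed actual COP, the exergy efficiency, consistent with the trend reported above.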

2735 Map Matching Performance under Various Similarity Metrics for Heterogeneous Robot Teams

Authors: M. C. Akay, A. Aybakan, H. Temeltas

Abstract:

Aerial and ground robots offer different advantages in different missions. Aerial robots can move quickly and obtain a wide view of an area, but they cannot carry heavy payloads. Unmanned ground vehicles (UGVs), on the other hand, move slowly but can carry heavier payloads than unmanned aerial vehicles (UAVs). In this context, we investigate the performance of various similarity metrics for building a common map for a Heterogeneous Robot Team (HRT) in complex environments. Local 3D maps of the environment are gathered using Lidar odometry and octree mapping. To obtain a common map for the HRT, information-theoretic similarity metrics are exploited. All of these similarity metrics gave accurate results within acceptable simulation times and can be used in different types of applications. For a heterogeneous multi-robot team, these methods can be used to match different types of maps.

Keywords: Common maps, heterogeneous robot team, map matching, information-theoretic similarity metrics.
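
As an illustration of one information-theoretic similarity metric of the kind exploited above, the sketch below computes the mutual information between two co-registered occupancy maps (e.g. 2D projections of the octree maps built by the aerial and ground robots); the map arrays, the 32-bin discretisation and the synthetic data are assumptions, not the authors' implementation.

```python
import numpy as np

def mutual_information(map_a: np.ndarray, map_b: np.ndarray, bins: int = 32) -> float:
    """Mutual information (in bits) between two co-registered occupancy maps."""
    hist, _, _ = np.histogram2d(map_a.ravel(), map_b.ravel(), bins=bins)
    pxy = hist / hist.sum()                 # joint distribution of occupancy values
    px = pxy.sum(axis=1, keepdims=True)     # marginal of map_a
    py = pxy.sum(axis=0, keepdims=True)     # marginal of map_b
    nz = pxy > 0                            # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Example: two noisy views of the same area score higher than unrelated maps.
rng = np.random.default_rng(0)
ground_truth = rng.random((64, 64))
uav_map = np.clip(ground_truth + 0.05 * rng.standard_normal((64, 64)), 0, 1)
print(mutual_information(ground_truth, uav_map))
```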

2734 Experimental Studies of Spiral-Confined HSCFST Columns under Uni-Axial Compression

Authors: Mianheng Lai, Johnny Ching Ming Ho, Hoat Joen Pam

Abstract:

Concrete-filled steel tube (CFST) columns are becoming increasingly popular owing to the superior behaviour contributed by composite action. However, this composite action cannot be fully developed because of the different dilation properties of the steel tube and the concrete: during initial compression, de-bonding occurs between the constituent materials. As a result, the strength, initial stiffness and ductility of CFST columns reduce significantly. To resolve this problem, external confinement in the form of spirals is proposed to improve the interface bonding. In this paper, a total of 14 CFST columns in-filled with high-strength and ultra-high-strength concrete were fabricated and tested under uni-axial compression. From the experimental results, it can be concluded that the proposed spirals improve the strength, initial stiffness, ductility and interface bonding of CFST columns by restraining the lateral expansion of the steel tube and core concrete. Moreover, the failure modes of the confined core concrete change because of the strong confinement provided by the spirals.

Keywords: Concrete-filled-steel-tube, confinement, failure mode, high-strength concrete, spirals.

2733 REDUCER – An Architectural Design Pattern for Reducing Large and Noisy Data Sets

Authors: Apkar Salatian

Abstract:

To relieve the burden of reasoning on a point-to-point basis, many domains need large and noisy data sets reduced to trends for qualitative reasoning. In this paper we propose and describe a new architectural design pattern called REDUCER for reducing large and noisy data sets, which can be tailored to particular situations. REDUCER consists of two consecutive processes: Filter, which takes the original data and removes outliers, inconsistencies or noise; and Compression, which takes the filtered data and derives trends in the data. In this seminal article we also show how REDUCER has been applied successfully to three different case studies.

Keywords: Design Pattern, filtering, compression.
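
A minimal sketch of the two REDUCER stages, assuming a rolling-median Filter and a fixed-window piecewise-linear Compression; the pattern itself is technique-agnostic, so both concrete choices (including the window and segment lengths) are illustrative only.

```python
import numpy as np

def filter_stage(signal: np.ndarray, window: int = 5) -> np.ndarray:
    """Filter: remove outliers/noise with a rolling median."""
    pad = window // 2
    padded = np.pad(signal, pad, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(len(signal))])

def compression_stage(signal: np.ndarray, segment: int = 20):
    """Compression: derive trends as one (start, slope, intercept) per segment."""
    trends = []
    for start in range(0, len(signal), segment):
        chunk = signal[start:start + segment]
        x = np.arange(len(chunk))
        slope, intercept = np.polyfit(x, chunk, deg=1)
        trends.append((start, slope, intercept))
    return trends

raw = np.sin(np.linspace(0, 6, 200)) + 0.3 * np.random.default_rng(1).standard_normal(200)
print(compression_stage(filter_stage(raw))[:3])   # first few derived trends
```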

2732 Improving the Convergence of the Backpropagation Algorithm Using Local Adaptive Techniques

Authors: Z. Zainuddin, N. Mahat, Y. Abu Hassan

Abstract:

Since the presentation of the backpropagation algorithm, a vast variety of improvements to the technique for training feed-forward neural networks have been proposed. This article focuses on two classes of acceleration techniques. The first, known as Local Adaptive Techniques, relies only on weight-specific information, such as the temporal behaviour of the partial derivative with respect to the current weight. The second, known as Dynamic Adaptation Methods, dynamically adapts the momentum factor, α, and the learning rate, η, with respect to the iteration number or the gradient. Some of the most popular learning algorithms are described. These techniques have been implemented and tested on several problems and measured in terms of gradient and error function evaluations and percentage of success. Numerical evidence shows that these techniques improve the convergence of the backpropagation algorithm.

Keywords: Backpropagation, Dynamic Adaptation Methods, Local Adaptive Techniques, Neural networks.
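
A minimal sketch of one local adaptive technique of the kind surveyed: a delta-bar-delta-style rule that grows a weight's individual learning rate when successive gradients agree in sign and shrinks it when they disagree. The constants kappa and phi and the quadratic toy objective are assumptions for illustration.

```python
import numpy as np

def adapt_learning_rates(grad, prev_grad, rates, kappa=0.01, phi=0.5):
    """Per-weight learning-rate update based on gradient sign agreement."""
    same_sign = np.sign(grad) == np.sign(prev_grad)
    return np.where(same_sign, rates + kappa, rates * phi)

# Toy loss: 0.5 * ||w - target||^2, so the gradient is simply w - target.
target = np.array([1.0, -2.0, 0.5])
w = np.zeros(3)
rates = np.full(3, 0.05)
prev_grad = np.zeros(3)
for step in range(50):
    grad = w - target
    rates = adapt_learning_rates(grad, prev_grad, rates)
    w -= rates * grad
    prev_grad = grad
print(w)   # approaches the target as the per-weight rates grow
```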

2731 Time Compression in Engineer-to-Order Industry: A Case Study of a Norwegian Shipbuilding Industry

Authors: Tarek Fatouh, Chehab Elbelehy, Alaa Abdelsalam, Eman Elakkad, Alaa Abdelshafie

Abstract:

This paper explores the possibility of time compression in engineer-to-order production networks. A case study research method is used in a Norwegian shipbuilding project by applying the value stream mapping lean tool, with total cycle time as the unit of analysis. The analysis revealed the time deviations of the planned tasks in one of the processes of the shipbuilding project. The authors then developed a future-state map by removing time wastes from the value stream.

Keywords: Engineer to order, total cycle time, value stream mapping, shipbuilding.

2730 Comparison of Valuation Techniques for Bone Age Assessment

Authors: N. Olarte L, A. Rubiano F, A. Mejía F.

Abstract:

This comparison of evaluation techniques for bone age assessment was carried out by the Telemedicine Research Group of the Military University (TIGUM) as a preliminary step towards the design and development of a system for processing hand and wrist radiological images of children aged 0-6 years for bone age assessment. The techniques considered in this paper have been the most widely used and statistically significant ones for decades. Although the current project initially targets children in this limited age range, this comparison and evaluation of techniques will help in the future to expand the scope of the bone age assessment system by implementing more techniques and tools and performing deeper analysis.

Keywords: Atlas, Bone Age Assessment, Hand and Wrist Radiograph, Image Processing.

2729 Techniques for Video Mosaicing

Authors: P.Saravanan, Narayanan .C.K., P.V.S.S Prakash, Prabhakara Rao .G.V

Abstract:

Video mosaicing is the stitching of selected frames of a video by estimating the camera motion between frames and thereby registering successive frames to arrive at the mosaic. Different techniques have been proposed in the literature for video mosaicing. Despite the large number of papers dealing with mosaic generation, only a few authors have investigated the conditions under which these techniques produce good estimates of the motion parameters. In this paper, these techniques are studied on different videos and the reasons for their failures are identified. We propose algorithms that incorporate outlier removal for better estimation of the motion parameters.

Keywords: Motion parameters, Outlier removal algorithms, Registering, and Video Mosaicing.
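
A minimal sketch of outlier-robust inter-frame motion estimation using ORB features and RANSAC in OpenCV; it is not the specific algorithm proposed above, only an illustration of how outlier removal stabilises the homography used to register frames when building a mosaic.

```python
import cv2
import numpy as np

def estimate_motion(frame_a, frame_b):
    """Estimate the inter-frame homography between two grayscale frames."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_a, des_a = orb.detectAndCompute(frame_a, None)
    kp_b, des_b = orb.detectAndCompute(frame_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    src = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # RANSAC rejects mismatched keypoints (outliers) before fitting the homography.
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H, int(inlier_mask.sum())

# Usage: H warps frame_a into frame_b's coordinates when stitching the mosaic,
# e.g. cv2.warpPerspective(frame_a, H, (width, height)).
```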

2728 Production of Ultra-Low Temperature by the Vapor Compression Refrigeration Cycles with Environment Friendly Working Fluids

Authors: Sameh Frikha, Mohamed Salah Abid

Abstract:

We investigate the performance of an integrated cascade (IC) refrigeration system that uses environment-friendly zeotropic mixtures. Computations have been carried out by varying the pressure levels at the evaporator and the condenser of the system. The effects of the refrigerant mass flow rate on the coefficient of performance (COP) are presented. We show that the integrated cascade system produces ultra-low temperatures in the evaporator using an environment-friendly zeotropic mixture.

Keywords: Coefficient of Performance, Environment friendly zeotropic mixture, Integrated cascade, Ultra low temperature, Vapor compression refrigeration cycles.

2727 An Experimental Comparison of Unsupervised Learning Techniques for Face Recognition

Authors: Dinesh Kumar, C.S. Rai, Shakti Kumar

Abstract:

Face recognition has always been a fascinating research area. It has drawn the attention of many researchers because of its various potential applications, such as security systems, entertainment and criminal identification. Many supervised and unsupervised learning techniques have been reported so far. Principal Component Analysis (PCA), Self-Organizing Maps (SOM) and Independent Component Analysis (ICA) are three of the unsupervised techniques proposed by different researchers for face recognition. This paper proposes the integration of two of these techniques, SOM and PCA, for dimensionality reduction and feature selection. Simulation results show that, although the individual techniques SOM and PCA give excellent performance on their own, their combination can also be utilized for face recognition. Experimental results also indicate that, for the given face database and classifier used, SOM performs better than the other unsupervised learning techniques. A comparison of two proposed SOM methodologies, local and global processing, shows the superiority of the latter, but at the cost of more computation time.

Keywords: Face Recognition, Principal Component Analysis, Self Organizing Maps, Independent Component Analysis.
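
A minimal sketch of the PCA (eigenfaces) stage for unsupervised dimensionality reduction followed by nearest-neighbour matching; the SOM stage, the face database and the classifier used in the paper are not reproduced, and the synthetic data below are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def build_face_recognizer(train_images, train_labels, n_components=50):
    """train_images: (n_samples, h*w) flattened grayscale faces."""
    pca = PCA(n_components=n_components, whiten=True).fit(train_images)
    features = pca.transform(train_images)          # project onto eigenfaces
    clf = KNeighborsClassifier(n_neighbors=1).fit(features, train_labels)
    return pca, clf

def recognize(pca, clf, image):
    return clf.predict(pca.transform(image.reshape(1, -1)))[0]

# Placeholder data: 60 random "faces" of 32x32 pixels with 6 identities.
rng = np.random.default_rng(0)
faces = rng.random((60, 32 * 32))
labels = rng.integers(0, 6, 60)
pca, clf = build_face_recognizer(faces, labels)
print(recognize(pca, clf, faces[0]))   # matches labels[0]
```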

2726 Nonlinear Transformation of Laser Generated Ultrasonic Pulses in Geomaterials

Authors: Elena B. Cherepetskaya, Alexander A. Karabutov, Natalia B. Podymova, Ivan Sas

Abstract:

The nonlinear evolution of broadband ultrasonic pulses passed through rock specimens is studied using the apparatus “GEOSCAN-02M”. Ultrasonic pulses are excited by the pulses of a Q-switched Nd:YAG laser with a duration of 10 ns and an energy of 260 mJ; this energy can be reduced to 20 mJ using light filters. The laser beam radius does not exceed 5 mm. As a result of the absorption of the laser pulse in a special material, the optoacoustic generator, longitudinal ultrasonic pulses are excited with a duration of 100 ns and a maximum pressure amplitude of 10 MPa. The immersion technique is used to measure the parameters of these ultrasonic pulses after they pass through a specimen; the immersion liquid is distilled water. The reference pulse passed through the cell with water has compression and rarefaction phases, and the amplitude of the rarefaction phase is five times lower than that of the compression phase. The spectral range of the reference pulse reaches 10 MHz. Cube-shaped specimens of Karelian gabbro with a rib length of 3 cm are studied. The ultimate strength of the specimens under uniaxial compression is (300±10) MPa. As the reference pulse passes through an area of the specimen without cracks, the compression phase decreases and the rarefaction phase increases due to diffraction and scattering of ultrasound, so the ratio of these phases becomes 2.3:1. After preloading, some horizontal cracks appear in the specimens. Their location is found by one-sided scanning of the specimen using backward-mode detection of the ultrasonic pulses reflected from the structural defects. Computer processing of these signals yields images of the cross-sections of the specimens with cracks. When the reference pulse amplitude is increased from 0.1 MPa to 5 MPa, the nonlinear transformation of the ultrasonic pulse passed through the specimen with horizontal cracks results in a 2.5-fold decrease of the amplitude of the rarefaction phase and a 2.1-fold increase of its duration. When the reference pulse amplitude is increased from 5 MPa to 10 MPa, time splitting of the phases is observed for the bipolar pulse passed through the specimen: the compression and rarefaction phases propagate with different velocities. These features of powerful broadband ultrasonic pulses passed through rock specimens can be described by the Preisach-Mayergoyz hysteresis model and can be used for the location of cracks in optically opaque materials.

Keywords: Cracks, geological materials, nonlinear evolution of ultrasonic pulses, rock.

2725 Comparative Analysis and Evaluation of Software Vulnerabilities Testing Techniques

Authors: Khalid Alnafjan, Tazar Hussain, Hanif Ullah, Zia ul haq Paracha

Abstract:

Software and applications are subject to serious and damaging security threats, and these threats are increasing as a result of the growing number of potential vulnerabilities. Security testing is an indispensable process for validating software security requirements and identifying security-related vulnerabilities. In this paper we analyze and compare different available vulnerability testing techniques against predefined criteria using the analytic hierarchy process (AHP). We selected five testing techniques: source code analysis, fault code injection, robustness testing, stress testing and penetration testing. These techniques have been evaluated against five criteria: cost, thoroughness, ease of use, effectiveness and efficiency. The outcome of the study helps researchers, testers and developers understand the effectiveness of each technique in its respective domain. The study also compares the inner working of the testing techniques against the selected criteria to achieve optimum testing results.

Keywords: Software Security, Security Testing, Testing techniques, vulnerability, AHP.
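
A minimal sketch of the AHP step used to rank alternatives: the priority vector is the principal eigenvector of a pairwise comparison matrix and the consistency ratio checks the judgements; the example matrix compares only three illustrative criteria and is not the paper's elicited data.

```python
import numpy as np

# Saaty's random index values for small matrices (used in the consistency ratio).
RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_priorities(pairwise: np.ndarray):
    """Return the priority weights and the consistency ratio (CR < 0.1 is acceptable)."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = int(np.argmax(eigvals.real))
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    n = pairwise.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)       # consistency index
    cr = ci / RANDOM_INDEX[n]                  # consistency ratio
    return weights, cr

# Illustrative pairwise comparison of three criteria (cost, thoroughness, ease of use).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights, cr = ahp_priorities(A)
print(weights, cr)
```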

2724 A High Bitrate Information Hiding Algorithm for Video in Video

Authors: Wang Shou-Dao, Xiao Chuang-Bai, Lin Yu

Abstract:

In high-bitrate information hiding techniques, one bit is embedded within each 4 × 4 Discrete Cosine Transform (DCT) coefficient block by means of vector quantization, and the hidden bit can then be effectively extracted at the receiving end. In this paper, high-bitrate information hiding algorithms are summarized and a video-in-video scheme is implemented. Experimental results show that a host video carrying a large amount of auxiliary information suffers little visual quality decline: the luma Peak Signal-to-Noise Ratio (PSNR) of the host video degrades by only 0.22 dB on average, while the hidden information has a high survival rate and remains highly robust to H.264/AVC compression, with an average Bit Error Rate (BER) of 0.015%.

Keywords: Information Hiding, Embed, Quantification, Extract.
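
A minimal sketch of hiding one bit in a 4 × 4 DCT block; the paper uses vector quantization, whereas this illustration uses simpler scalar quantization index modulation (QIM) on a single mid-frequency coefficient, so the embedding rule, step size and coefficient position are assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn

STEP, POS = 8.0, (2, 1)   # quantization step and mid-frequency coefficient position

def embed_bit(block: np.ndarray, bit: int) -> np.ndarray:
    """Embed one bit into a 4x4 pixel block via the parity of a quantized DCT coefficient."""
    coeffs = dctn(block, norm="ortho")
    q = np.round(coeffs[POS] / STEP)
    if int(q) % 2 != bit:          # force the parity of the quantized index
        q += 1
    coeffs[POS] = q * STEP
    return idctn(coeffs, norm="ortho")

def extract_bit(block: np.ndarray) -> int:
    coeffs = dctn(block, norm="ortho")
    return int(np.round(coeffs[POS] / STEP)) % 2

block = np.random.default_rng(2).integers(0, 255, (4, 4)).astype(float)
stego = embed_bit(block, 1)
print(extract_bit(stego))   # -> 1
```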

2723 Sub-Impact Phenomenon of Elasto-Plastic Free-Free Beam during a Strike

Authors: H. Rong, X. C. Yin, J. Yang, Y. N. Shen

Abstract:

Based on Rayleigh beam theory, the sub-impacts of an elasto-plastic free-free beam struck horizontally by a round-nosed rigid mass are simulated by the finite difference method together with impact-separation conditions. In order to obtain the sub-impact force, a uniaxial compression elastic-plastic contact model is employed to analyze the local deformation field in the contact zone. It is found that the horizontal impact is a complicated process comprising a sequence of elastic-plastic sub-impacts, and that there are two sub-zones of sub-impact. In addition, it is found that the elastic energy of the free-free beam is better suited to the Poisson collision hypothesis for explaining the compression and recovery processes.

Keywords: beam, sub-impact, elastic-plastic deformation, finite difference method.

2722 Improving Image Quality in Remote Sensing Satellites using Channel Coding

Authors: H. M. Behairy, M. S. Khorsheed

Abstract:

Among the factors that characterize satellite communication channels is their high bit error rate. We present a system for still image transmission over noisy satellite channels. The system couples image compression with error control codes to improve the received image quality while maintaining its bandwidth requirements. The proposed system is tested using high-resolution satellite imagery simulated over a Rician fading channel. Evaluation results show an improvement in overall system performance, including image quality and bandwidth requirements, compared to similar systems with different coding schemes.

Keywords: Image Transmission, Image Compression, Channel Coding, Error-Control Coding, DCT, Convolution Codes, Viterbi Algorithm, PCGC.

2721 Effect of Clustering on Energy Efficiency and Network Lifetime in Wireless Sensor Networks

Authors: Prakash G L, Chaitra K Meti, Poojitha K, Divya R.K.

Abstract:

A Wireless Sensor Network is a multi-hop, self-configuring wireless network consisting of sensor nodes. The deployment of wireless sensor networks in many application areas, e.g., aggregation services, requires self-organization of the network nodes into clusters. An efficient way to enhance the lifetime of the system is to partition the network into distinct clusters, each with a high-energy node as cluster head. The node clustering techniques that have appeared in the literature fall roughly into two families: those based on the construction of a dominating set and those based solely on energy considerations. Energy-optimized cluster formation for a set of randomly scattered wireless sensors is presented, in which sensors within a cluster are expected to communicate with the cluster head only. The energy constraints and limited computing resources of the sensor nodes present the major challenges in gathering the data. In this paper we propose a framework for studying how partially correlated data affect the performance of clustering algorithms. The total energy consumption and network lifetime can be analyzed by combining random geometry techniques and rate-distortion theory. We also present the relation between compression distortion and data correlation.

Keywords: Clusters, multi hop, random geometry, rate distortion.
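
A minimal sketch of energy-aware cluster-head election of the kind discussed: nodes with more residual energy are more likely to become cluster heads, and the remaining nodes join the nearest head. The 5% head fraction and the random field layout are illustrative assumptions, not the proposed framework.

```python
import numpy as np

def elect_cluster_heads(positions, energies, head_fraction=0.05, rng=None):
    """Elect cluster heads with probability proportional to residual energy."""
    if rng is None:
        rng = np.random.default_rng()
    n = len(positions)
    k = max(1, int(head_fraction * n))
    prob = energies / energies.sum()             # bias election toward high-energy nodes
    heads = rng.choice(n, size=k, replace=False, p=prob)
    # Each node joins the closest cluster head (heads join themselves).
    dists = np.linalg.norm(positions[:, None, :] - positions[heads][None, :, :], axis=2)
    assignment = heads[np.argmin(dists, axis=1)]
    return heads, assignment

rng = np.random.default_rng(3)
pos = rng.random((100, 2)) * 100.0               # 100 nodes in a 100 m x 100 m field
energy = rng.uniform(0.5, 2.0, 100)              # residual energy per node (illustrative)
heads, assignment = elect_cluster_heads(pos, energy, rng=rng)
print(len(heads), assignment[:10])
```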

2720 Fabrication of Tissue Engineering Scaffolds Using Rapid Prototyping Techniques

Authors: Osama A. Abdelaal, Saied M. Darwish

Abstract:

Rapid prototyping (RP) techniques are a group of advanced manufacturing processes that can produce custom-made objects directly from computer data such as Computer Aided Design (CAD), Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) data. Using RP fabrication techniques, constructs with controllable and complex internal architecture and appropriate mechanical properties can be achieved. One attractive and promising use of RP techniques is tissue engineering (TE) scaffold fabrication. A tissue engineering scaffold is a 3D construct that acts as a template for tissue regeneration. Although several conventional techniques, such as solvent casting and gas foaming, are utilized in scaffold fabrication, these processes show poor interconnectivity and uncontrollable porosity in the produced scaffolds. RP techniques have therefore become the best alternative fabrication methods for TE scaffolds. This paper reviews the current state of the art in tissue engineering scaffold fabrication using advanced RP processes, as well as the current limitations and future trends of RP scaffold fabrication techniques.

Keywords: Biomanufacturing, Rapid prototyping, Solid Freeform Fabrication, Scaffold Fabrication, Tissue Engineering.

2719 Image Steganography Using Least Significant Bit Technique

Authors: Preeti Kumari, Ridhi Kapoor

Abstract:

Security is the most important issue in any communication today. Steganography is the process of hiding important data within other data, such as text, audio, video or images, with the aim of providing availability, confidentiality, integrity and authenticity of the data. A steganographic technique embeds hidden content in unremarkable cover media so as not to arouse the suspicion of eavesdroppers, third parties or hackers. Many compression, encryption, decryption and embedding methods are used in digital image steganography. Compression introduces noise into the image; to sustain this noise, the LSB insertion technique is used. The performance of the proposed embedding system with respect to the security of the secret message and its robustness is discussed. We also demonstrate the maximum steganography capacity and visual distortion.

Keywords: Steganography, LSB, encoding, information hiding, color image.
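
A minimal sketch of the LSB insertion step itself: each message bit replaces the least significant bit of one pixel value. The flat bit ordering and the absence of the encryption and compression stages mentioned above are simplifications for illustration.

```python
import numpy as np

def embed_lsb(cover: np.ndarray, bits: np.ndarray) -> np.ndarray:
    """Replace the LSB of the first len(bits) pixel values with the message bits."""
    flat = cover.flatten()                            # flatten() returns a copy
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bits: int) -> np.ndarray:
    return stego.flatten()[:n_bits] & 1

cover = np.random.default_rng(4).integers(0, 256, (64, 64, 3), dtype=np.uint8)
message = np.frombuffer(b"hello", dtype=np.uint8)
bits = np.unpackbits(message)
stego = embed_lsb(cover, bits)
print(np.packbits(extract_lsb(stego, bits.size)).tobytes())   # b'hello'
```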

2718 Effect of Using Stone Cutting Waste on the Compression Strength and Slump Characteristics of Concrete

Authors: Kamel K. Alzboon, Khalid N.Mahasneh

Abstract:

The aim of this work is to study the possible use of stone cutting sludge waste in concrete production, which would reduce both the environmental impact and the production cost. Slurry sludge obtained from the Samara factory (Jordan) was used as a source of water in concrete production. The physico-chemical and mineralogical characterization of the sludge was carried out to identify its major components and compare it with the typical sand used to produce concrete. Sample analysis showed that 96% of the slurry sludge volume is water, so it should be considered an important source of water. Results indicated that the use of slurry sludge as a water source in concrete production has an insignificant effect on compression strength, while it has a sharp effect on the slump values. Using slurry sludge at 25% of the total water content produced successful concrete samples in both slump and compression tests. To clarify the slurry sludge, a settling process can be used to remove the suspended solids; a settling period of 30 min achieved 99% removal efficiency. The clarified water is suitable for use in concrete mixes, which reduces water consumption, conserves water resources, increases profit, reduces operating cost and protects the environment. Additionally, the dry sludge could be used in the mix design instead of fine materials with sizes < 160 µm. This application could conserve natural materials and solve the environmental and economic problems caused by sludge accumulation.

Keywords: Concrete, recycle, sludge, slurry waste, stone cutting waste, waste.

2717 Optimal Image Compression Based on Sign and Magnitude Coding of Wavelet Coefficients

Authors: Mbainaibeye Jérôme, Noureddine Ellouze

Abstract:

The wavelet transform is a very powerful tool for image compression. One of its advantages is that it provides both spatial and frequency localization of image energy. However, wavelet transform coefficients are defined by both a magnitude and a sign. While algorithms exist for efficiently coding the magnitude of the transform coefficients, they are not efficient for coding their sign. It is generally assumed that there is no compression gain to be obtained from coding the sign, and only recently have some authors begun to investigate the sign of wavelet coefficients in image coding. Some authors have assumed that the sign information bit of a wavelet coefficient may be encoded with an estimated probability of 0.5; the same assumption concerns the refinement information bit. In this paper, we propose a new method for Separate Sign Coding (SSC) of wavelet image coefficients. The sign and the magnitude of wavelet image coefficients are examined to obtain their online probabilities. We use scalar quantization, in which the information on whether the wavelet coefficient belongs to the lower or to the upper sub-interval of the uncertainty interval is also examined. We show that the sign information and the refinement information may be encoded with a probability of approximately 0.5 only after about five bit planes. Two maps are entropy encoded separately: the sign map and the magnitude map. The refinement information on whether the wavelet coefficient belongs to the lower or to the upper sub-interval of the uncertainty interval is also entropy encoded. An algorithm is developed and simulations are performed on three standard greyscale images: Lena, Barbara and Cameraman. Five scales are used with the biorthogonal 9/7 wavelet filter bank. The obtained results are compared to the JPEG2000 standard in terms of peak signal-to-noise ratio (PSNR) for the three images and in terms of subjective (visual) quality. It is shown that the proposed method outperforms JPEG2000. The proposed method is also compared to other codecs in the literature and is shown to be very successful in terms of PSNR.

Keywords: Image compression, wavelet transform, sign coding, magnitude coding.
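
A minimal sketch of splitting wavelet detail coefficients into separate sign and magnitude maps and estimating the empirical sign probability, as a starting point for separate sign coding; PyWavelets' 'bior4.4' filter is used here as the 9/7 biorthogonal bank, and the entropy coder itself is not reproduced.

```python
import numpy as np
import pywt

def sign_magnitude_maps(image: np.ndarray, levels: int = 5):
    """Split the detail coefficients of a 5-scale decomposition into sign and magnitude maps."""
    coeffs = pywt.wavedec2(image, "bior4.4", level=levels)
    detail = np.concatenate([band.ravel() for lvl in coeffs[1:] for band in lvl])
    sign_map = np.sign(detail).astype(np.int8)
    magnitude_map = np.abs(detail)
    significant = magnitude_map > 0
    p_positive = (sign_map[significant] > 0).mean()   # empirical sign probability
    return sign_map, magnitude_map, p_positive

image = np.random.default_rng(5).random((256, 256))
_, _, p = sign_magnitude_maps(image)
print(f"P(sign = +) over significant coefficients: {p:.3f}")
```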

2716 A Comparative Study of Image Segmentation Algorithms

Authors: Mehdi Hosseinzadeh, Parisa Khoshvaght

Abstract:

In some applications, such as image recognition or compression, segmentation refers to the process of partitioning a digital image into multiple segments. Image segmentation is typically used to locate objects and boundaries (lines, curves, etc.) in images; it classifies or clusters an image into several parts (regions) according to image features, for example, pixel values or frequency response. More precisely, image segmentation is the process of assigning a label to every pixel in an image such that pixels with the same label share certain visual characteristics. The result of image segmentation is a set of segments that collectively cover the entire image, or a set of contours extracted from the image. Several image segmentation algorithms have been proposed to segment an image before recognition or compression. Many image segmentation algorithms now exist and are extensively applied in science and daily life. According to their segmentation method, they can be approximately categorized into region-based segmentation, data clustering, and edge-based segmentation. In this paper, we give a study of several popular image segmentation algorithms that are available.

Keywords: Image Segmentation, hierarchical segmentation, partitional segmentation, density estimation.
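
A minimal sketch of one data-clustering segmentation approach from the categories surveyed: k-means on pixel colour values. The number of clusters and the use of RGB features without a spatial term are illustrative choices, not a recommendation from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_segment(image: np.ndarray, n_segments: int = 4) -> np.ndarray:
    """image: (H, W, 3) array; returns an (H, W) label map."""
    h, w, c = image.shape
    pixels = image.reshape(-1, c).astype(float)          # one feature vector per pixel
    labels = KMeans(n_clusters=n_segments, n_init=10, random_state=0).fit_predict(pixels)
    return labels.reshape(h, w)

image = np.random.default_rng(6).integers(0, 256, (64, 64, 3))
print(np.unique(kmeans_segment(image)))                  # segment labels 0..3
```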

2715 Influence of Densification Process and Material Properties on Final Briquettes Quality from Fast-Growing Willows

Authors: Peter Križan, Juraj Beniak, Ľubomír Šooš, Miloš Matúš

Abstract:

Biomass treatment through densification is a very suitable and helpful technology prior to effective energy recovery. The densification of biomass is significantly influenced by various technological and material variables, which are ultimately reflected in the quality of the final solid biofuel. The paper deals with experimental research on the relationship between technological and material variables during the densification of fast-growing trees, namely fast-growing willows. The main goal of the presented experimental research is to determine the relationship between compression pressure and raw material particle size from the point of view of the final briquette density. The experimental research was carried out using single-axis densification. The impact of particle size, in interaction with compression pressure and stabilization time, on the quality properties of the briquettes was determined. The interaction of these variables affects the final quality of the solid biofuel (briquettes). From the points of view of briquette production and of densification machine design, it is very important to understand the mutual interaction of these variables on final briquette quality. The experimental findings presented here show the importance of the mentioned variables during the densification process.

Keywords: Briquettes density, densification, particle size, compression pressure, stabilization time.

2714 Designing of Full Adder Using Low Power Techniques

Authors: Shashank Gautam

Abstract:

This paper applies techniques such as MTCMOS, power gating, dual stack, GALEOR and LECTOR to reduce leakage power. A full adder has been designed using these techniques, and its power dissipation is calculated and compared with that of a full adder in standard CMOS logic. Simulation results show that the proposed techniques are effective in reducing power dissipation and increasing the operating speed of the circuits to a large extent.

Keywords: Low Power, MT CMOS, Galeor, Lector, Power Gating, Dual Stack, Full Adder.

2713 Optimal Data Compression and Filtering: The Case of Infinite Signal Sets

Authors: Anatoli Torokhti, Phil Howlett

Abstract:

We present a theory for optimal filtering of infinite sets of random signals. There are several new distinctive features of the proposed approach. First, we provide a single optimal filter for processing any signal from a given infinite signal set. Second, the filter is presented in the special form of a sum with p terms where each term is represented as a combination of three operations. Each operation is a special stage of the filtering aimed at facilitating the associated numerical work. Third, an iterative scheme is implemented into the filter structure to provide an improvement in the filter performance at each step of the scheme. The final step of the scheme concerns signal compression and decompression. This step is based on the solution of a new rank-constrained matrix approximation problem. The solution to the matrix problem is described in this paper. A rigorous error analysis is given for the new filter.

Keywords: stochastic signals, optimization problems in signal processing.
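
For context, the sketch below shows the classical rank-constrained matrix approximation (Eckart-Young truncated SVD) that underlies compression by rank reduction; the paper's own rank-constrained problem is more general and is not reproduced here.

```python
import numpy as np

def best_rank_r_approximation(A: np.ndarray, r: int) -> np.ndarray:
    """Best rank-r approximation of A in the Frobenius norm (truncated SVD)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]      # keep only the r leading components

A = np.random.default_rng(7).standard_normal((50, 30))
A_r = best_rank_r_approximation(A, r=5)
print(np.linalg.matrix_rank(A_r), np.linalg.norm(A - A_r))
```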

2712 Generic Filtering of Infinite Sets of Stochastic Signals

Authors: Anatoli Torokhti, Phil Howlett

Abstract:

A theory for optimal filtering of infinite sets of random signals is presented. There are several new distinctive features of the proposed approach. First, a single optimal filter for processing any signal from a given infinite signal set is provided. Second, the filter is presented in the special form of a sum with p terms where each term is represented as a combination of three operations. Each operation is a special stage of the filtering aimed at facilitating the associated numerical work. Third, an iterative scheme is implemented into the filter structure to provide an improvement in the filter performance at each step of the scheme. The final step of the scheme concerns signal compression and decompression. This step is based on the solution of a new rank-constrained matrix approximation problem. The solution to the matrix problem is described in this paper. A rigorous error analysis is given for the new filter.

Keywords: Optimal filtering, data compression, stochastic signals.

2711 Comparative Study of Different Enhancement Techniques for Computed Tomography Images

Authors: C. G. Jinimole, A. Harsha

Abstract:

One of the key problems in the analysis of Computed Tomography (CT) images is their poor contrast. Image enhancement can be used to improve the visual clarity and quality of the images or to provide a better transformed representation for further processing. Contrast enhancement is one of the accepted methods of image enhancement in various medical applications, and it helps to visualize and extract details of brain infarctions, tumors and cancers from CT images. This paper presents a comparison study of five contrast enhancement techniques suitable for CT images: power law transformation, logarithmic transformation, histogram equalization, contrast stretching and Laplacian transformation. All these techniques are compared with each other to find out which enhancement provides the better contrast for CT images. For the comparison of the techniques, the parameters Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE) are used. Logarithmic transformation provided the clearest and best-quality image compared to all other techniques studied and achieved the highest PSNR. The comparison concludes with the better approach for future research, especially for mapping abnormalities from CT images resulting from brain injuries.

Keywords: Computed tomography, enhancement techniques, increasing contrast, PSNR and MSE.
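
A minimal sketch of the logarithmic transformation singled out above, together with the PSNR measure used to score the techniques; the scaling constant and the synthetic low-contrast input are illustrative assumptions.

```python
import numpy as np

def log_transform(image: np.ndarray) -> np.ndarray:
    """Logarithmic transformation: s = c * log(1 + r), scaled to the 0-255 range."""
    c = 255.0 / np.log(1.0 + image.max())
    return (c * np.log1p(image.astype(float))).astype(np.uint8)

def psnr(reference: np.ndarray, test: np.ndarray) -> float:
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)

# Synthetic dark, low-contrast slice standing in for a CT image.
ct_slice = np.random.default_rng(8).integers(0, 90, (128, 128)).astype(np.uint8)
enhanced = log_transform(ct_slice)
print(psnr(ct_slice, enhanced))
```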

2710 Characterization of 3D Printed Re-Entrant Chiral Auxetic Geometries

Authors: Tatheer Zahra

Abstract:

Auxetic materials have counterintuitive properties due to their re-entrant geometry, which enables them to possess a Negative Poisson's Ratio (NPR). These materials have better energy absorption and shock resistance capabilities than conventional positive Poisson's ratio materials, and the re-entrant geometry can conveniently be created through 3D printing. This paper investigates the mechanical properties of 3D printed chiral auxetic geometries of various sizes. Small-scale samples were printed using an ordinary 3D printer and were tested under compression and tension to ascertain their strength and deformation characteristics. A maximum NPR of -9 was obtained under compression and tension. The re-entrant chiral cell size is shown to affect the mechanical properties of the re-entrant chiral auxetics.

Keywords: Auxetic materials, 3D printing, Negative Poisson’s Ratio, re-entrant chiral auxetics.

2709 A New Method for Contour Approximation Using Basic Ramer Idea

Authors: Ali Abdrhman Ukasha

Abstract:

This paper presents two new efficient algorithms for contour approximation, which are compared with the Ramer (good quality), Triangle (faster) and Trapezoid (fastest) methods, briefly described in this work. The Cartesian coordinates of an input contour are processed in such a manner that the contour is finally represented by a set of selected vertices of its edge. The main idea of the analyzed contour compression procedures is presented. For comparison, the mean square error and signal-to-noise ratio criteria are used, and the computational time of the analyzed methods is estimated from the number of numerical operations. Experimental results are reported in terms of image quality, compression ratio and speed. The main advantage of the analyzed algorithm is the small number of arithmetic operations compared to the existing algorithms.

Keywords: Polygonal approximation, Ramer, Triangle and Trapezoid methods.
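
A minimal sketch of the basic Ramer (Ramer-Douglas-Peucker) idea that the proposed methods build on: recursively keep the contour point farthest from the chord joining the segment endpoints whenever that distance exceeds a tolerance; the tolerance and the synthetic contour are illustrative.

```python
import numpy as np

def ramer(points: np.ndarray, eps: float) -> np.ndarray:
    """points: (N, 2) open polyline; returns the retained vertices."""
    if len(points) < 3:
        return points
    start, end = points[0], points[-1]
    chord = end - start
    norm = np.linalg.norm(chord)
    if norm == 0:
        dists = np.linalg.norm(points - start, axis=1)
    else:
        # Perpendicular distance of every point to the start-end chord (2D cross product).
        dists = np.abs(chord[0] * (points[:, 1] - start[1])
                       - chord[1] * (points[:, 0] - start[0])) / norm
    idx = int(np.argmax(dists))
    if dists[idx] <= eps:
        return np.array([start, end])            # chord is a good enough approximation
    left = ramer(points[: idx + 1], eps)
    right = ramer(points[idx:], eps)
    return np.vstack([left[:-1], right])         # drop the duplicated split point

theta = np.linspace(0, np.pi, 200)
contour = np.column_stack([np.cos(theta), np.sin(theta)])
print(len(ramer(contour, eps=0.01)))             # far fewer vertices than 200
```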

2708 An Approach to Physical Performance Analysis for Judo

Authors: Stefano Frassinelli, Alessandro Niccolai, Riccardo E. Zich

Abstract:

Sport performance analysis is a technique that is becoming more important for athletes of every level each year. Many techniques have been developed to measure and analyse efficiently the performance of athletes in some sports, but in combat sports these techniques often reach their limits, due to the high interaction between the two opponents during the competition. In this paper the problem is framed, the physical performance measurement problem is analysed, and three different techniques to manage it are presented. All the techniques have been used to analyse the performance of 22 high-level judo athletes.

Keywords: Sport performance, physical performance, judo, performance coefficients.

2707 Audio Watermarking Based on Compression-expansion Technique

Authors: Say Wei Foo, Qi Dong

Abstract:

A novel robust audio watermarking scheme is proposed in this paper. In the proposed scheme, the host audio signal is segmented into frames, and two consecutive frames are assessed to determine whether they are suitable to represent a watermark bit. If so, a frequency transform is performed on these two frames, and the compression-expansion technique is adopted to generate a distortion over them; this distortion represents one watermark bit. A psychoacoustic model is applied to calculate the local auditory mask to ensure that the distortion is not audible. The watermarking schemes for mono and stereo audio signals are designed differently. A correlation-based detection method is used to detect the distortion and extract the embedded watermark bits. The experimental results show that the quality degradation caused by the embedded watermarks is perceptually transparent and that the proposed schemes are very robust against different types of attacks.

Keywords: Audio watermarking, Compression-expansion, Stereo signals, Robustness.
