Search results for: Fully spatial signal processing
961 Advanced Stochastic Models for Partially Developed Speckle
Authors: Jihad S. Daba (Jean-Pierre Dubois), Philip Jreije
Abstract:
Speckled images arise when coherent microwave, optical, and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object or target induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise is complicated by the nature of the noise and is not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves a partially developed speckle model in which an underlying Poisson point process modulates a Gram-Charlier series of Laguerre weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form. It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise as demonstrated by the Central Limit Theorem.
Keywords: Doubly stochastic filtered process, Poisson point process, segmentation, speckle, ultrasound
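As a rough illustration of the limiting behaviour described above (not the authors' closed-form derivation), the following Python sketch simulates the intensity of a resolution cell as a filtered Poisson sum of unit random phasors and compares the tail of its distribution to the exponential limit as the mean number of scatterers grows; the unit-amplitude scatterer model and the cell count are illustrative assumptions.

```python
import numpy as np

def speckle_intensity(mean_scatterers, n_cells, rng):
    """Simulate intensity in n_cells resolution cells, each containing a
    Poisson-distributed number of scatterers with random phases."""
    counts = rng.poisson(mean_scatterers, size=n_cells)
    intensities = np.empty(n_cells)
    for i, n in enumerate(counts):
        # Each scatterer contributes a unit-amplitude phasor with uniform phase
        # (illustrative choice; the paper uses a more general amplitude model).
        phases = rng.uniform(0.0, 2.0 * np.pi, size=n)
        field = np.sum(np.exp(1j * phases))
        intensities[i] = np.abs(field) ** 2
    return intensities

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for mu in (2, 10, 100):  # mean number of scatterers per resolution cell
        I = speckle_intensity(mu, n_cells=20000, rng=rng)
        I = I / I.mean()  # normalize so the exponential limit has unit mean
        # For fully developed speckle, P(I > t) = exp(-t); compare empirically.
        t = 2.0
        print(f"mu={mu:4d}  P(I>2) empirical={np.mean(I > t):.3f}  "
              f"exponential limit={np.exp(-t):.3f}")
```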
960 A Pipelined FSBM Hardware Architecture for HTDV-H.26x
Authors: H. Loukil, A. Ben Atitallah, F. Ghozzi, M. A. Ben Ayed, N. Masmoudi
Abstract:
In MPEG and H.26x standards, motion estimation is used to eliminate temporal redundancy. Given that the motion estimation stage is very complex in terms of computational effort, a hardware implementation on a re-configurable circuit is crucial for the requirements of different real-time multimedia applications. In this paper, we present a hardware architecture for motion estimation based on the "Full Search Block Matching" (FSBM) algorithm. This architecture presents minimum latency, maximum throughput and full utilization of hardware resources such as embedded memory blocks, and combines both pipelining and parallel processing techniques. Our design is described in VHDL, verified by simulation and implemented in a Stratix II EP2S130F1020C4 FPGA circuit. The experimental results show that the optimum operating clock frequency of the proposed design is 89 MHz, which achieves 160M pixels/sec.
Keywords: SAD, FSBM, Hardware Implementation, FPGA.
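The hardware details are not reproduced here, but the computation that such an architecture parallelizes can be stated compactly. The Python sketch below is a minimal software reference for full-search block matching with the SAD criterion; the block size and search range are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def fsbm_motion_vector(ref, cur, x, y, block=16, search=8):
    """Full-search block matching: return the (dy, dx) displacement that
    minimizes the SAD between the current block at (y, x) and candidate
    blocks in the reference frame within +/- search pixels."""
    h, w = cur.shape
    cur_blk = cur[y:y + block, x:x + block].astype(np.int32)
    best_sad, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ry, rx = y + dy, x + dx
            if ry < 0 or rx < 0 or ry + block > h or rx + block > w:
                continue  # candidate block falls outside the reference frame
            ref_blk = ref[ry:ry + block, rx:rx + block].astype(np.int32)
            sad = int(np.abs(cur_blk - ref_blk).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv, best_sad

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    cur = np.roll(ref, shift=(3, -2), axis=(0, 1))  # known global motion
    print(fsbm_motion_vector(ref, cur, x=24, y=24))  # expect (dy, dx) = (-3, 2)
```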
959 Delay Preserving Substructures in Wireless Networks Using Edge Difference between a Graph and its Square Graph
Authors: T. N. Janakiraman, J. Janet Lourds Rani
Abstract:
In practice, wireless networks have the property that signal strength attenuates with distance from the base station, so quality of service can be improved if nodes two hops away are also considered. In this paper, we propose a procedure to identify delay preserving substructures for a given wireless ad-hoc network using a new graph operation G^2 - E(G) = G* (the edge difference of the square of a given graph and the original graph). This operation helps to analyze some induced substructures which preserve delay in communication among them. The operation G* on a given graph induces a graph in which the 1-hop neighbors of any node are at 2-hop distance in the original network. In this paper, we also identify some delay preserving substructures in G*: (i) a set of nodes which are mutually at 2-hop distance in G forms a clique in G*; (ii) a set of nodes which forms an odd cycle C2k+1 in G forms an odd cycle in G*, while a set of nodes which forms an even cycle C2k in G forms two disjoint companion cycles (of the same parity, odd/even) of length k in G*; (iii) every path of length 2k+1 or 2k in G induces two disjoint paths of length k in G*; and (iv) a set of nodes in G* which induces a maximal connected subgraph with radius 1 identifies a substructure with radius equal to 2 and diameter at most 4 in G. The above delay preserving substructures behave as good clusters in the original network.
Keywords: Clique, cycles, delay preserving substructures, maximal connected subgraph.
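A small computational illustration of the operation described above can be written with the networkx library (a sketch assuming the network is given as a simple undirected graph; this is not the authors' code). It forms the square graph, removes the original edges, and checks the cycle observation on example cycles.

```python
import networkx as nx

def edge_difference_square(G):
    """Return G* = G^2 - E(G): the square of G with the original edges removed,
    so two nodes are adjacent in G* exactly when they are at distance 2 in G."""
    G2 = nx.power(G, 2)          # square graph: edges between nodes at distance <= 2
    return nx.difference(G2, G)  # keep only the edges of G2 that are not in G

if __name__ == "__main__":
    # Example: an odd cycle C5 should map to a single (odd) cycle in G*,
    # while an even cycle C6 should split into two disjoint cycles of length 3.
    for n in (5, 6):
        G = nx.cycle_graph(n)
        G_star = edge_difference_square(G)
        components = [len(c) for c in nx.connected_components(G_star)]
        print(f"C{n}: components of G* have sizes {sorted(components)}")
```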
958 Consistent Modeling of Functional Dependencies along with World Knowledge
Authors: Sven Rebhan, Nils Einecke, Julian Eggert
Abstract:
In this paper we propose a method for vision systems to consistently represent functional dependencies between different visual routines along with relational short- and long-term knowledge about the world. Here the visual routines are bound to visual properties of objects stored in the memory of the system. Furthermore, the functional dependencies between the visual routines are seen as a graph that also belongs to the object's structure. This graph is parsed in the course of acquiring a visual property of an object to automatically resolve the dependencies of the bound visual routines. Using this representation, the system is able to dynamically rearrange the processing order while keeping its functionality. Additionally, the system is able to estimate the overall computational cost of a certain action. We also show that the system can efficiently use this structure to incorporate already acquired knowledge and thus reduce the computational demand.
Keywords: Adaptive systems, Knowledge representation, Machine vision, Systems engineering.
957 Analysis and Measuring Surface Roughness of Nonwovens Using Machine Vision Method
Authors: Dariush Semnani, Javad Yekrang, Hossein Ghayoor
Abstract:
The measurement of friction properties of textiles and fabrics using the Kawabata Evaluation System (KES) is constrained to the surface friction factor of the fabric, and no other data are generated; this research was therefore conducted to gain information about surface roughness in relation to the surface friction factor. To assess the roughness properties of light nonwovens, a 3-dimensional model of a surface was simulated with regular sinusoidal waves through it as an ideal surface. A new factor was defined, namely the Surface Roughness Factor, by comparing the roughness properties of the simulated surface and real specimens. The relation between the proposed factor and the friction factor of the specimens was analyzed by regression, and results showed a meaningful correlation between them. It can be inferred that the new factor can be used as an acceptable criterion for evaluating the roughness properties of light nonwoven fabrics.
Keywords: Surface roughness, Nonwoven, Machine vision, Image processing.
956 On a Pitch Duration Technique for Prosody Control
Authors: JongKuk Kim, HernSoo Hahn, Uei-Joong Yoo, MyungJin Bae
Abstract:
In this paper, we propose a method of altering duration in the frequency domain that controls prosody in real time after pitch alteration. A method for freely altering duration within the prosody information could be used in several fields, such as pronunciation correction for persons with speech impediments or language study. The pitch alteration used to control prosody is performed by the PSOLA synthesis method, which is a time-domain processing method. In this work, however, the duration of the pitch-altered speech is changed in the frequency domain: we altered the duration using a duration alteration method based on the Fast Fourier Transform. Consequently, the intelligibility when both pitch and duration are controlled shows a slight decrease compared with the case when only pitch is changed, but the proposed algorithm obtained a higher MOS score for naturalness.
Keywords: PSOLA, Pitch Alteration, Duration Control.
955 Unsteady Laminar Boundary Layer Forced Flow in the Region of the Stagnation Point on a Stretching Flat Sheet
Authors: A. T. Eswara
Abstract:
This paper analyses the unsteady, two-dimensional stagnation point flow of an incompressible viscous fluid over a flat sheet when the flow is started impulsively from rest and, at the same time, the sheet is suddenly stretched in its own plane with a velocity proportional to the distance from the stagnation point. The partial differential equations governing the laminar boundary layer forced convection flow are non-dimensionalised using semi-similar transformations and then solved numerically using an implicit finite-difference scheme known as the Keller-box method. Results pertaining to the flow and heat transfer characteristics are computed for all dimensionless times, uniformly valid in the whole spatial region without any numerical difficulties. Analytical solutions are also obtained for both small and large times, respectively representing the initial unsteady and the final steady state flow and heat transfer. Numerical results indicate that the velocity ratio parameter has a significant effect on skin friction and the heat transfer rate at the surface. Furthermore, it is shown that there is a smooth transition from the initial unsteady state flow (small time solution) to the final steady state (large time solution).
Keywords: Forced flow, Keller-box method, Stagnation point, Stretching flat sheet, Unsteady laminar boundary layer, Velocity ratio parameter.
954 Particle Image Velocimetry for Measuring Water Flow Velocity
Authors: King Kuok Kuok, Po Chan Chiu
Abstract:
Floods are natural phenomena which may turn into disasters causing widespread damage, health problems and even deaths. Nowadays, floods have become more serious and more frequent due to climatic changes. During flooding, discharge measurements can still be taken by standing on a bridge across the river using a portable measurement instrument. However, it is too dangerous to get near the river, especially during high floods. Therefore, this study employs Particle Image Velocimetry (PIV) as a tool to measure the surface flow velocity. PIV is an image processing technique that tracks the movement of water from one point to another. The PIV codes are developed using Matlab. In this study, 18 ping pong balls were scattered over the surface of the drain and images were taken with a digital SLR camera. The images obtained were analyzed using the PIV code. Results show that PIV is able to produce the flow velocity by analyzing the series of images captured.
Keywords: Particle Image Velocimetry, flow velocity, surface flow.
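The core of a PIV analysis is locating the displacement of a tracer pattern between two frames, usually via cross-correlation of interrogation windows. The sketch below is a minimal, generic Python version of that step (FFT-based cross-correlation on synthetic frames); it is not the authors' Matlab code, and the frame interval and pixel scale are illustrative assumptions.

```python
import numpy as np

def window_displacement(win_a, win_b):
    """Estimate the integer-pixel displacement of the pattern in win_b relative
    to win_a using FFT-based cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map indices above the half-size to negative displacements (circular shift).
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    frame1 = rng.random((64, 64))
    frame2 = np.roll(frame1, shift=(4, 7), axis=(0, 1))  # known displacement
    dy, dx = window_displacement(frame1, frame2)
    dt, metres_per_pixel = 1.0 / 30.0, 0.01   # assumed frame interval and scale
    speed = np.hypot(dy, dx) * metres_per_pixel / dt
    print(f"displacement = ({dy}, {dx}) px, surface speed ~ {speed:.2f} m/s")
```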
953 Variability of Hydrological Modeling of the Blue Nile
Authors: Abeer Samy, Oliver C. Saavedra Valeriano, Abdelazim Negm
Abstract:
The Blue Nile Basin is the most important tributary of the Nile River. Egypt and Sudan are almost entirely dependent on water originating from the Blue Nile. This multi-dependency creates conflicts among the three countries Egypt, Sudan, and Ethiopia, making the management of these conflicts an international issue. A good assessment of the water resources of the Blue Nile is important to help in managing such conflicts. Hydrological models are a good tool for such assessment. This paper presents a critical review of the nature and variability of the climate and hydrology of the Blue Nile Basin as a first step in using hydrological modeling to assess the water resources of the Blue Nile. Several attempts have been made to develop basin-scale hydrological modeling of the Blue Nile. Lumped and semi-distributed models use averages of meteorological inputs and watershed characteristics in hydrological simulation to analyze runoff for flood control and water resource management. Distributed models include the temporal and spatial variability of catchment conditions and meteorological inputs to allow a better representation of the hydrological process. The main challenge for all the models used to assess the water resources of the basin is the shortage of the data needed for model calibration and validation. It is recommended to use distributed models, for their higher accuracy, to cope with the great variability and complexity of the Blue Nile Basin, and to collect sufficient data to enable more sophisticated and accurate hydrological modeling.
Keywords: Blue Nile Basin, Climate Change, Hydrological Modeling, Watershed.
952 Exploring Inter-Relationships between Events to Identify Strategic Technological Competencies: A Combined Approach
Authors: Cláudio Santos, Madalena Araújo, Nuno Correia
Abstract:
The inherent complexity of today's business environments is forcing organizations to be attentive to dynamics on several fronts. Therefore, the management of technological innovation is continually faced with uncertainty about the future. These issues lead to a need for a systemic perspective, able to analyze the consequences of interactions between different factors. The field of technology foresight has proposed methods and tools to deal with this broader perspective. In an attempt to provide a method to analyze the complex interactions between events in several areas, departing from the identification of the most strategic competencies, this paper presents a methodology based on the Delphi method and Quality Function Deployment. This methodology is applied in a sheet metal processing equipment manufacturer as a case study.
Keywords: Competencies, Delphi Method, Quality Function Deployment, Technology Foresight.
951 Discrete Polyphase Matched Filtering-based Soft Timing Estimation for Mobile Wireless Systems
Authors: Thomas O. Olwal, Michael A. van Wyk, Barend J. van Wyk
Abstract:
In this paper we present a soft timing phase estimation (STPE) method for wireless mobile receivers operating at low signal-to-noise ratios (SNRs). Discrete Polyphase Matched (DPM) filters, a log-maximum a posteriori probability (MAP) algorithm and/or a Soft-Output Viterbi Algorithm (SOVA) are combined to derive a new timing recovery (TR) scheme. We apply this scheme to a wireless cellular communication system model that comprises a raised cosine filter (RCF) and a bit-interleaved turbo-coded multi-level modulation (BITMM) scheme, and the channel is assumed to be memoryless. Furthermore, no clock signals are transmitted to the receiver, contrary to classical data-aided (DA) models. This new model ensures that both the bandwidth and the power of the communication system are conserved. However, the computational complexity of ideal turbo synchronization is increased by 50%. Several simulation tests of bit error rate (BER) and block error rate (BLER) versus low SNR reveal that the proposed iterative soft timing recovery (ISTR) scheme outperforms conventional schemes.
Keywords: discrete polyphase matched filters, maximum likelihood estimators, soft timing phase estimation, wireless mobile systems.
950 A Normalization-based Robust Watermarking Scheme Using Zernike Moments
Authors: Say Wei Foo, Qi Dong
Abstract:
Digital watermarking has become an important technique for copyright protection, but its robustness against attacks remains a major problem. In this paper, we propose a normalization-based robust image watermarking scheme. In the proposed scheme, the original host image is first normalized to a standard form. The Zernike transform is then applied to the normalized image to calculate Zernike moments. Dither modulation is adopted to quantize the magnitudes of the Zernike moments according to the watermark bit stream. The watermark extraction method is blind. Security analysis and false alarm analysis are then performed. The quality degradation of the watermarked image caused by the embedded watermark is visually transparent. Experimental results show that the proposed scheme has very high robustness against various image processing operations and geometric attacks.
Keywords: Image watermarking, Image normalization, Zernike moments, Robustness.
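The dither-modulation step mentioned above is a form of quantization index modulation (QIM). The following Python sketch shows the generic embed/extract rule applied to a single moment magnitude; the quantization step and noise level are illustrative assumptions, and the sketch omits the image normalization and Zernike-moment computation themselves.

```python
import numpy as np

def qim_embed(magnitude, bit, step=4.0):
    """Embed one watermark bit into a Zernike-moment magnitude by quantizing it
    onto one of two interleaved lattices (dither modulation)."""
    dither = 0.0 if bit == 0 else step / 2.0
    return np.round((magnitude - dither) / step) * step + dither

def qim_extract(magnitude, step=4.0):
    """Blind extraction: decide which lattice the received magnitude is closer to."""
    d0 = abs(magnitude - qim_embed(magnitude, 0, step))
    d1 = abs(magnitude - qim_embed(magnitude, 1, step))
    return 0 if d0 <= d1 else 1

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    bits = rng.integers(0, 2, size=16)
    moments = rng.uniform(10.0, 60.0, size=16)        # stand-ins for |Zernike| values
    marked = np.array([qim_embed(m, b) for m, b in zip(moments, bits)])
    noisy = marked + rng.normal(0.0, 0.5, size=16)    # mild attack / distortion
    recovered = np.array([qim_extract(m) for m in noisy])
    print("bit errors:", int(np.sum(recovered != bits)))
```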
949 Highly Accurate Target Motion Compensation Using Entropy Function Minimization
Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani
Abstract:
One of the shortcomings of stepped-frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range cell shift, false peaks, Signal-to-Noise Ratio (SNR) reduction and range profile spreading, because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Thus, compensation for the effects of the Target Motion Parameters (TMPs) should be employed. In this paper, such a method for estimating the TMPs (velocity and acceleration) and consequently eliminating or suppressing their unwanted effects on the HRRP, based on entropy minimization, is proposed. The method is carried out in two major steps: in the first step, a discrete search over the whole acceleration-velocity lattice, within a specific interval, seeks a coarse minimum point of the entropy function. In the second step, a 1-D search over velocity is done along the locus of the minimum for several constant-acceleration lines, in order to enhance the accuracy of the minimum point found in the first step. The provided simulation results demonstrate the effectiveness of the proposed method.
Keywords: ATR, HRRP, motion compensation, SFW, TMP.
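As a generic illustration of the two-step search strategy (coarse 2-D grid, then a 1-D velocity refinement along constant-acceleration lines), the sketch below minimizes a stand-in entropy function over a velocity-acceleration lattice; in a real system the entropy of the motion-compensated HRRP would replace the toy objective, and all grid limits are illustrative assumptions.

```python
import numpy as np

def toy_entropy(v, a, v_true=120.0, a_true=4.0):
    """Stand-in for the entropy of the motion-compensated range profile:
    smooth, with a single minimum at the true (velocity, acceleration)."""
    return np.log1p((v - v_true) ** 2 + 25.0 * (a - a_true) ** 2)

def two_step_search(entropy, v_range=(0, 300), a_range=(-10, 10)):
    # Step 1: coarse discrete search over the whole acceleration-velocity lattice.
    v_grid = np.linspace(*v_range, 31)
    a_grid = np.linspace(*a_range, 21)
    coarse = [(entropy(v, a), v, a) for v in v_grid for a in a_grid]
    _, v0, a0 = min(coarse)
    # Step 2: finer 1-D searches over velocity along a few constant-acceleration
    # lines near the coarse minimum, keeping the overall best point.
    best = (entropy(v0, a0), v0, a0)
    for a in np.linspace(a0 - 1.0, a0 + 1.0, 5):
        for v in np.linspace(v0 - 10.0, v0 + 10.0, 201):
            e = entropy(v, a)
            if e < best[0]:
                best = (e, v, a)
    return best

if __name__ == "__main__":
    e, v, a = two_step_search(toy_entropy)
    print(f"estimated velocity ~ {v:.1f}, acceleration ~ {a:.2f}, entropy {e:.4f}")
```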
948 Design and Simulation of Heartbeat Measurement System Using Arduino Microcontroller in Proteus
Authors: Muhibul H. Bhuyan, Mafujul Hasan
Abstract:
If a person can monitor his/her heart rate regularly, then he/she can detect heart disease early and thus enjoy a longer life span. Therefore, this disease should be taken seriously, and many health care devices and monitoring systems are being designed to keep track of it. This work reports the design and simulation of an Arduino microcontroller based heart rate measurement and monitoring system in the Proteus environment. Clipping sensors were utilized to sense the heart rate of an individual from the finger tips. The sensor is a digital device that uses mainly an infrared (IR) transmitter (an IR LED) and a receiver (an IR photo-transistor or IR photo-detector). When the heart pumps blood and circulates it through the blood vessels of the body, the change in blood pressure modulates the light from the transmitter, which is reflected back to the receiver accordingly. The reflected signals are then processed inside the microcontroller through software written in assembly language; the heart rate (HR) in beats per minute (bpm) is determined from the detected signal over a duration of 10 seconds and displayed in bpm on the LCD screen in digital format. The designed system was simulated for several persons of varying ages, for example, infants, adults and active athletes. Simulation results were found to be very satisfactory.
Keywords: Heart rate measurement, design, simulation, Proteus, Arduino Uno microcontroller.
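The rate calculation itself is simple: count the pulses detected over a 10-second window and scale to beats per minute. The Python sketch below emulates that logic on a synthetic pulse waveform (threshold crossing with a refractory period); the threshold, sampling rate and signal shape are illustrative assumptions, not the Arduino firmware.

```python
import numpy as np

def count_beats(signal, threshold, fs, refractory_s=0.3):
    """Count upward threshold crossings, ignoring crossings that occur within
    a refractory period of the previous beat (debouncing)."""
    beats, last_beat_t = 0, -np.inf
    for i in range(1, len(signal)):
        t = i / fs
        if signal[i - 1] < threshold <= signal[i] and t - last_beat_t > refractory_s:
            beats += 1
            last_beat_t = t
    return beats

if __name__ == "__main__":
    fs, window_s, true_bpm = 100, 10, 72          # sampling rate, window, test rate
    t = np.arange(0, window_s, 1.0 / fs)
    # Synthetic photoplethysmography-like signal: a rectified sinusoid plus noise.
    pulse = np.clip(np.sin(2 * np.pi * (true_bpm / 60.0) * t), 0, None)
    pulse += np.random.default_rng(4).normal(0, 0.05, size=t.size)
    beats = count_beats(pulse, threshold=0.5, fs=fs)
    print(f"beats in {window_s} s = {beats}, heart rate = {beats * 60 // window_s} bpm")
```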
947 Towards Growing Self-Organizing Neural Networks with Fixed Dimensionality
Authors: Guojian Cheng, Tianshi Liu, Jiaxin Han, Zheng Wang
Abstract:
Competitive learning is an adaptive process in which the neurons in a neural network gradually become sensitive to different input pattern clusters. The basic idea behind Kohonen's Self-Organizing Feature Maps (SOFM) is competitive learning. SOFM can generate mappings from high-dimensional signal spaces to lower-dimensional topological structures. The main features of this kind of mapping are topology preservation, feature mapping and probability distribution approximation of input patterns. To overcome some limitations of SOFM, e.g., a fixed number of neural units and a topology of fixed dimensionality, Growing Self-Organizing Neural Networks (GSONN) can be used. A GSONN can change its topological structure during learning: it grows by learning and shrinks by forgetting. To speed up training and convergence, a new variant of GSONN, twin growing cell structures (TGCS), is presented here. This paper first gives an introduction to competitive learning, SOFM and its variants. Then, we discuss some GSONN with fixed dimensionality, which include growing cell structures, their variants and the authors' model, TGCS. The paper ends with a comparison of testing results and conclusions.
Keywords: Artificial neural networks, Competitive learning, Growing cell structures, Self-organizing feature maps.
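For readers unfamiliar with the competitive learning rule that both SOFM and the growing variants build on, the sketch below implements a minimal Kohonen-style update on a small fixed 2-D map (the fixed grid size and decay schedules are assumptions for illustration; the growing cell structures and TGCS logic discussed in the paper are not reproduced here).

```python
import numpy as np

def train_sofm(data, grid=(6, 6), epochs=30, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Self-Organizing Feature Map: each grid unit has a weight vector;
    the best-matching unit and its neighbours are pulled towards each input."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = step / n_steps
            lr = lr0 * (1.0 - frac)              # decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 0.5  # decaying neighbourhood radius
            # Best-matching unit: the unit whose weight vector is closest to x.
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            # Neighbourhood function on the grid (topology-preserving update).
            grid_d2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            influence = np.exp(-grid_d2 / (2.0 * sigma ** 2))
            weights += lr * influence[..., None] * (x - weights)
            step += 1
    return weights

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    # Two clusters in 2-D; after training, map units should specialize to them.
    data = np.vstack([rng.normal(0.2, 0.05, (100, 2)), rng.normal(0.8, 0.05, (100, 2))])
    w = train_sofm(data)
    print("weight range per dimension:", w.reshape(-1, 2).min(0), w.reshape(-1, 2).max(0))
```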
946 Food Safety Aspects of Pesticide Residues in Spice Paprika
Authors: Sz. Klátyik, B. Darvas, M. Mörtl, M. Ottucsák, E. Takács, H. Bánáti, L. Simon, G. Gyurcsó, A. Székács
Abstract:
Environmental and health safety of condiments used for spicing food products in food processing or by culinary means receives relatively low attention, even though possible contamination of spices may affect food quality and safety. Contamination surveys mostly focus on microbial contaminants or their secondary metabolites, mycotoxins. Chemical contaminants, particularly pesticide residues, however, are clearly substantial factors for certain condiments in the Capsicum family, including spice paprika and chilli. To assess food safety and support the quality of the Hungaricum product spice paprika, the pesticide residue status of spice paprika and chilli is assessed on the basis of reported pesticide contamination cases and non-compliances in the Rapid Alert System for Food and Feed of the European Union since 1998.
Keywords: Spice paprika, Capsicum, pesticide residues, RASFF.
945 Hardware Prototyping of an Efficient Encryption Engine
Authors: Muhammad I. Ibrahimy, Mamun B.I. Reaz, Khandaker Asaduzzaman, Sazzad Hussain
Abstract:
An approach to developing an FPGA implementation of a flexible-key RSA encryption engine that can be used as a standard device in secured communication systems is presented. The VHDL model of this RSA encryption engine has the unique characteristic of supporting multiple key sizes, and thus can easily be fitted into systems that require different levels of security. A simple nested-loop addition and subtraction scheme has been used to implement the RSA operation. This has made the processing time faster and used a comparatively small amount of space in the FPGA. The hardware design is targeted at the Altera STRATIX II device, and it is determined that the flexible-key RSA encryption engine is best suited to the device named EP2S30F484C3. The RSA encryption implementation uses 13,779 logic elements and achieves a clock frequency of 17.77 MHz. It has been verified that this RSA encryption engine can perform 32-bit, 256-bit and 1024-bit encryption operations in less than 41.585 us, 531.515 us and 790.61 us, respectively.
Keywords: RSA, FPGA, Communication, Security, VHDL.
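The nested-loop addition and subtraction approach amounts to building modular multiplication from additions (with subtraction of the modulus for reduction) and modular exponentiation from repeated squaring of that primitive. The Python sketch below mirrors that structure as a software reference model only; it is far from the VHDL design and uses a toy key for illustration.

```python
def mod_add(a, b, n):
    """Modular addition using subtraction for reduction (no division)."""
    s = a + b
    if s >= n:
        s -= n
    return s

def mod_mul(a, b, n):
    """Shift-and-add modular multiplication: a*b mod n built only from
    modular additions, scanning the bits of b."""
    result = 0
    addend = a % n
    while b > 0:
        if b & 1:
            result = mod_add(result, addend, n)
        addend = mod_add(addend, addend, n)  # doubling is an addition too
        b >>= 1
    return result

def mod_exp(base, exp, n):
    """Square-and-multiply exponentiation on top of mod_mul (RSA core operation)."""
    result = 1 % n
    base %= n
    while exp > 0:
        if exp & 1:
            result = mod_mul(result, base, n)
        base = mod_mul(base, base, n)
        exp >>= 1
    return result

if __name__ == "__main__":
    # Toy RSA key (illustrative only): p=61, q=53, n=3233, e=17, d=2753.
    n, e, d = 3233, 17, 2753
    message = 65
    cipher = mod_exp(message, e, n)
    assert mod_exp(cipher, d, n) == message
    print("ciphertext:", cipher)
```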
944 Laser Ultrasonic Imaging Based on Synthetic Aperture Focusing Technique Algorithm
Authors: Sundara Subramanian Karuppasamy, Che Hua Yang
Abstract:
In this work, the laser ultrasound technique has been used for analyzing and imaging inner defects in metal blocks. To detect defects in blocks, researchers have traditionally used piezoelectric transducers for the generation and reception of ultrasonic signals. These transducers can be configured into sparse and phased arrays, but both configurations have drawbacks, including the requirement of many transducers, time-consuming calculations, limited bandwidth and confined image resolution. Here, we focus on a non-contact method for generating and receiving the ultrasound to examine inner defects in aluminum blocks. A Q-switched pulsed laser has been used for the generation, and the reception is done using a Laser Doppler Vibrometer (LDV). Based on the Doppler effect, the LDV provides a rapid and high spatial resolution way of sensing ultrasonic waves. From the LDV, a series of scanning points is selected which serves as the phased array elements. A side-drilled hole of 10 mm diameter with a depth of 25 mm has been introduced, and the defect is interrogated by the linear array of scanning points obtained from the LDV. With the aid of the Synthetic Aperture Focusing Technique (SAFT) algorithm, based on the time-shifting principle, the inspected images are generated from the A-scan data acquired from the 1-D linear phased array elements. Thus the defect can be precisely detected with good resolution.
Keywords: Laser ultrasonics, linear phased array, nondestructive testing, synthetic aperture focusing technique, ultrasonic imaging.
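The time-shifting principle behind SAFT can be illustrated with a short delay-and-sum reconstruction: for each image pixel, the expected delay to every scanning point is computed and the corresponding A-scan samples are summed. The Python sketch below does this on synthetic A-scans containing a single point reflector; the geometry, wave speed and the simple round-trip delay model are illustrative assumptions rather than the paper's exact processing chain.

```python
import numpy as np

def saft_image(ascans, element_x, fs, c, x_grid, z_grid):
    """Delay-and-sum SAFT: for every pixel, sum the A-scan samples taken at the
    round-trip delay from each array element to that pixel and back."""
    image = np.zeros((len(z_grid), len(x_grid)))
    for iz, z in enumerate(z_grid):
        for ix, x in enumerate(x_grid):
            total = 0.0
            for k, xe in enumerate(element_x):
                delay = 2.0 * np.hypot(x - xe, z) / c       # round-trip time
                sample = int(round(delay * fs))
                if 0 <= sample < ascans.shape[1]:
                    total += ascans[k, sample]
            image[iz, ix] = total
    return image

if __name__ == "__main__":
    c, fs = 6300.0, 50e6                     # assumed wave speed (m/s), sampling rate (Hz)
    element_x = np.linspace(0.0, 0.02, 21)   # 21 scanning points over 20 mm
    defect = (0.010, 0.025)                  # point reflector at x=10 mm, z=25 mm
    n_samples = 1024
    ascans = np.zeros((len(element_x), n_samples))
    for k, xe in enumerate(element_x):       # synthetic echoes from the reflector
        delay = 2.0 * np.hypot(defect[0] - xe, defect[1]) / c
        ascans[k, int(round(delay * fs))] = 1.0
    x_grid = np.linspace(0.0, 0.02, 41)
    z_grid = np.linspace(0.015, 0.035, 41)
    img = saft_image(ascans, element_x, fs, c, x_grid, z_grid)
    iz, ix = np.unravel_index(np.argmax(img), img.shape)
    print(f"peak at x = {x_grid[ix]*1000:.1f} mm, z = {z_grid[iz]*1000:.1f} mm")
```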
943 Efficient CT Image Volume Rendering for Diagnosis
Authors: HaeNa Lee, Sun K. Yoo
Abstract:
Volume rendering is widely used in medical CT image visualization. Applying 3D image visualization to diagnosis applications can require accurate volume rendering with high resolution. Interpolation is important in medical image processing applications such as image compression or volume resampling. However, it can distort the original image data because of edge blurring or blocking effects when image enhancement procedures are applied. In this paper, we propose an adaptive tension control method that exploits gradient information to achieve high resolution medical image enhancement in volume visualization, where the restored images are as similar to the original images as possible. The experimental results show that the proposed method can improve image quality through the efficacy of the adaptive tension control.
Keywords: Tension control, Interpolation, Ray-casting, Medical imaging analysis.
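As background for the interpolation being tuned, the sketch below implements a 1-D cardinal (Hermite) spline whose tension parameter is raised where the local gradient is large, which damps overshoot near edges; the mapping from gradient to tension is an illustrative assumption, and the sketch does not reproduce the paper's ray-casting pipeline.

```python
import numpy as np

def adaptive_tension(samples, k):
    """Illustrative rule: more tension (stiffer, closer to linear) where the
    local gradient magnitude is large, to limit overshoot near edges."""
    grad = abs(samples[min(k + 1, len(samples) - 1)] - samples[max(k - 1, 0)]) / 2.0
    return float(np.clip(grad / (grad + 10.0), 0.0, 0.9))

def tension_spline_interp(samples, t_values):
    """Cardinal (Hermite) spline interpolation of 1-D samples with a per-sample
    tension derived from local gradient information."""
    samples = np.asarray(samples, dtype=float)
    n = len(samples)
    out = np.empty(len(t_values))
    for i, t in enumerate(t_values):
        k = int(np.clip(np.floor(t), 0, n - 2))
        u = t - k
        p0, p1 = samples[k], samples[k + 1]
        pm1, p2 = samples[max(k - 1, 0)], samples[min(k + 2, n - 1)]
        c0, c1 = adaptive_tension(samples, k), adaptive_tension(samples, k + 1)
        m0 = (1.0 - c0) * (p1 - pm1) / 2.0     # tangent at the left sample
        m1 = (1.0 - c1) * (p2 - p0) / 2.0      # tangent at the right sample
        h00 = 2 * u**3 - 3 * u**2 + 1          # Hermite basis functions
        h10 = u**3 - 2 * u**2 + u
        h01 = -2 * u**3 + 3 * u**2
        h11 = u**3 - u**2
        out[i] = h00 * p0 + h10 * m0 + h01 * p1 + h11 * m1
    return out

if __name__ == "__main__":
    profile = np.array([10, 10, 10, 10, 200, 200, 200, 200], dtype=float)  # sharp edge
    t = np.linspace(0, len(profile) - 1, 50)
    values = tension_spline_interp(profile, t)
    print(f"interpolated range: [{values.min():.1f}, {values.max():.1f}]")
```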
942 Finite Element Approach to Evaluate Time Dependent Shear Behavior of Connections in Hybrid Steel-PC Girder under Sustained Loading
Authors: Mohammad Najmol Haque, Takeshi Maki, Jun Sasaki
Abstract:
Headed stud shear connections are widely used in the junction or embedded zone of hybrid girders to achieve full composite action with continuity, sustaining the steel-concrete interfacial tensile and shear forces. In Japan, Japan Road Association (JRA) specifications are used for hybrid girder design, and they utilize a much lower stud capacity than the American Institute of Steel Construction (AISC) specifications, the Japan Society of Civil Engineers (JSCE) specifications and the EURO code. Because a low design shear strength is considered in the design of the connections, the time dependent shear behavior due to sustained external loading is not considered and has not been fully studied. In this study, a finite element approach was used to evaluate the time dependent shear behavior of the headed studs used as connections at the junction. This study clarified how sustained loading distinctively changes the interfacial shear of the connections over time, which was sensitive to the loading history, the positions of the flanges, neighboring studs, the positions of the prestress bars and reinforcing bars, the concrete strength, etc., and it also identified a shear influence area. The stud strength was also confirmed through pushout tests. The outcomes of the study may provide an important basis and reference data for designing connections of hybrid girders with enhanced stud capacity, with due consideration of their long-term shear behavior.
Keywords: Finite element approach, hybrid girder, headed stud shear connections, sustained loading, time dependent shear behavior.
941 Spatial Planning as an Approach to Achieve Sustainable Development in Historic Cities
Authors: Mohammad Ali Abdi, Sima Mehdizadegan Namin
Abstract:
Sustainable development is a concept which originated with the Brundtland Commission. Although the concept was born with environmental aspects, it rapidly penetrated all areas, turning into a dominant view of planning. Concern for future generations, especially when talking about heritage, has a long history. Each approach, with all of its characteristics, illustrates differences in planning; hence planning always reflects the dominant ideas of its age. This paper studies sustainable development in planning for historical cities, with the aim of finding ways to deal with heritage in planning for historical cities in Iran. Through this, it will be illustrated how the challenges between the sustainability concept and heritage can be resolved in planning. Consequently, the paper emphasizes: sustainable development in city planning; trends regarding heritage; and the challenges of planning for historical cities in Iran. For the first two issues, a documentary method covering the sustainable development and heritage literature is used. As the next step, focusing on Iranian historical cities requires considering the urban planning and management structure and identifying the main challenges related to heritage, so the challenges regarding heritage are analyzed. As a result, it is illustrated that the key issue in such planning is active conservation, which improves and uses the potential of heritage while guaranteeing its continued conservation. By emphasizing the planning system in Iran, it becomes obvious that some reforms are needed in this system and in its way of relating to heritage. The main weakness in planning for historical cities in Iran is the lack of independent city management. Without this factor, achieving active conservation as the main factor of sustainable development is not possible.
Keywords: Active conservation, city planning, heritage, sustainable development.
940 Automatic Classification of Initial Categories of Alzheimer's Disease from Structural MRI Phase Images: A Comparison of PSVM, KNN and ANN Methods
Authors: Ahsan Bin Tufail, Ali Abidi, Adil Masood Siddiqui, Muhammad Shahzad Younis
Abstract:
An early and accurate detection of Alzheimer's disease (AD) is an important stage in the treatment of individuals suffering from AD. We present an approach based on the use of structural magnetic resonance imaging (sMRI) phase images to distinguish between normal controls (NC), mild cognitive impairment (MCI) and AD patients with a clinical dementia rating (CDR) of 1. The independent component analysis (ICA) technique is used for extracting useful features, which form the inputs to support vector machine (SVM), K-nearest neighbour (kNN) and multilayer artificial neural network (ANN) classifiers to discriminate between the three classes. The obtained results are encouraging in terms of classification accuracy and effectively ascertain the usefulness of phase images for the classification of different stages of Alzheimer's disease.
Keywords: Biomedical image processing, classification algorithms, feature extraction, statistical learning.
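A generic software analogue of the described pipeline (feature extraction with ICA followed by SVM, kNN and ANN classifiers) can be put together with scikit-learn. The sketch below uses synthetic data in place of sMRI phase images and default-ish hyperparameters, so it only illustrates the structure of the comparison, not the paper's experimental setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import FastICA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for vectorized sMRI phase-image features with 3 classes
# (NC / MCI / AD); the real study extracts features from phase images.
X, y = make_classification(n_samples=300, n_features=200, n_informative=30,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0, stratify=y)

classifiers = {
    "SVM": SVC(kernel="rbf", C=1.0),
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "ANN": MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000, random_state=0),
}

for name, clf in classifiers.items():
    # ICA reduces the feature vectors to a small set of independent components,
    # which then feed the classifier (mirroring the structure described above).
    model = make_pipeline(StandardScaler(), FastICA(n_components=20, random_state=0), clf)
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```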
939 Adhesion Problematic for Novel Non-Crimp Fabric and Surface Modification of Carbon-Fibres Using Oxy-Fluorination
Authors: Iris Käppler, Paul Matthäi, Chokri Cherif
Abstract:
In the field of technical textiles, Non-Crimp Fabrics (NCF) are increasingly used. In general, NCF exhibit excellent load bearing properties, but the manufacturing process leaves some disadvantages which have to be reduced. To this end, a novel technique for processing NCF was developed that substitutes the binding thread with an adhesive. This stitch-free method requires a new manufacturing concept as well as new basic methods to verify the adhesion of the glue to fibres and textiles. To improve the adhesion properties and the wettability of carbon fibres by the adhesive, oxy-fluorination was used. The modification of carbon fibres by oxy-fluorination was investigated via scanning electron microscopy, X-ray photoelectron spectroscopy and single fibre tensiometry. Special tensile tests were developed to determine the maximum force required for detachment.
Keywords: Non-Crimp Fabric, adhesive, stitch-free, high-performance fibre.
938 A New Image Psychovisual Coding Quality Measurement based Region of Interest
Authors: M. Nahid, A. Bajit, A. Tamtaoui, E. H. Bouyakhf
Abstract:
To model the human visual system (HVS) in the region of interest, we propose a new objective metric adapted to wavelet foveation-based image compression quality measurement, which exploits a foveation setup filter implemented in the DWT domain, based especially on the point and region of fixation of the human eye. This model is then used to predict the visible differences between an original and a compressed image with respect to this region and yields an adapted, local error measure by removing all peripheral errors. The technique, which we call foveation wavelet visible difference prediction (FWVDP), is demonstrated on a number of noisy images, all of which have the same local peak signal-to-noise ratio (PSNR) but visibly different errors. We show that the FWVDP reliably predicts the fixation areas of interest where error is masked, due to high image contrast, and the areas where the error is visible, due to low image contrast. The paper also suggests ways in which the FWVDP can be used to determine a visually optimal quantization strategy for foveation-based wavelet coefficients and to produce a quantitative local measure of image quality.
Keywords: Human Visual System, Image Quality, Image Compression, foveation wavelet, region of interest (ROI).
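The idea of weighting errors by their distance from the point of fixation can be illustrated independently of the wavelet machinery. The Python sketch below computes a foveation-weighted squared-error measure between two images, with a Gaussian fall-off around the fixation point standing in for the foveation filter; the fall-off width and the use of a pixel-domain weight (rather than DWT-domain filtering) are simplifying assumptions.

```python
import numpy as np

def foveated_mse(original, distorted, fixation, sigma):
    """Weighted MSE that emphasizes errors near the fixation point and
    progressively discounts peripheral errors."""
    h, w = original.shape
    yy, xx = np.mgrid[0:h, 0:w]
    d2 = (yy - fixation[0]) ** 2 + (xx - fixation[1]) ** 2
    weight = np.exp(-d2 / (2.0 * sigma ** 2))        # foveation weighting
    err2 = (original.astype(float) - distorted.astype(float)) ** 2
    return float(np.sum(weight * err2) / np.sum(weight))

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    original = rng.integers(0, 256, size=(128, 128)).astype(float)
    # Same error energy in both cases, placed near vs. far from fixation.
    near, far = original.copy(), original.copy()
    near[60:68, 60:68] += 40.0   # distortion close to the fixation point
    far[4:12, 4:12] += 40.0      # distortion in the periphery
    fixation = (64, 64)
    print("foveated MSE, error near fixation:",
          round(foveated_mse(original, near, fixation, 20), 2))
    print("foveated MSE, error in periphery: ",
          round(foveated_mse(original, far, fixation, 20), 2))
```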
937 Landfill Failure Mobility Analysis: A Probabilistic Approach
Authors: Ali Jahanfar, Brajesh Dubey, Bahram Gharabaghi, Saber Bayat Movahed
Abstract:
The ever increasing population growth of major urban centers and the environmental challenges in siting new landfills have resulted in a growing trend towards the design of mega-landfills, some with extraordinary heights and dangerously steep slopes. Landfill failure mobility risk analysis is one of the most uncertain types of dynamic rheology models, due to the very large inherent variabilities in the shear strength properties of the heterogeneous solid waste material. The waste flows of three historic dumpsite failures and two landfill failures were back-analyzed using run-out modeling with the DAN-W model. The travel distances of the waste flow during the landfill failures were calculated by taking into account the variability in the material shear strength properties. The probability distribution functions for the shear strength properties of the waste material were grouped into four major classes, based on waste material compaction (landfills versus dumpsites) and on the quantity (high versus low) of high shear strength waste materials such as wood, metal, plastic, paper and cardboard in the waste. This paper presents a probabilistic method for estimating the spatial extent of waste avalanches after a potential landfill failure, to create maps of vulnerability scores that inform property owners and residents of the level of risk.
Keywords: Landfill failure, waste flow, Voellmy rheology, friction coefficient, waste compaction and type.
936 QoS Improvement Using Intelligent Algorithm under Dynamic Tropical Weather for Earth-Space Satellite Applications
Authors: Joseph S. Ojo, Vincent A. Akpan, Oladayo G. Ajileye, Olalekan L. Ojo
Abstract:
In this paper, an intelligent algorithm (IA) that is capable of adapting to dynamic tropical weather conditions is proposed, based on fuzzy logic techniques. The IA effectively interacts with the quality of service (QoS) criteria, irrespective of the dynamic tropical weather, to achieve improvement in the satellite links. To achieve this, an adaptive network-based fuzzy inference system (ANFIS) has been adopted. The algorithm is capable of interacting with the weather fluctuations to generate appropriate improvement of the satellite QoS for efficient services to the customers. Five years (2012-2016) of rainfall rate time series data with one-minute integration time have been used to derive fading based on the ITU-R P.618-12 propagation models. The data were obtained from measurements undertaken by the Communication Research Group (CRG), Physics Department, Federal University of Technology, Akure, Nigeria. The rain attenuation and signal-to-noise ratio (SNR) were derived for frequencies between Ku- and V-band and for the propagation angle, with respect to different transmitting powers. The simulated results show a substantial reduction in SNR, especially for applications in the area of digital video broadcast second-generation coding and modulation satellite networks.
Keywords: Fuzzy logic, intelligent algorithm, Nigeria, QoS, satellite applications, tropical weather.
935 Stability Optimization of Functionally Graded Pipes Conveying Fluid
Authors: Karam Y. Maalawi, Hanan E.M EL-Sayed
Abstract:
This paper presents an exact analytical model for optimizing the stability of thin-walled, composite, functionally graded pipes conveying fluid. The critical flow velocity at which divergence occurs is maximized for a specified total structural mass in order to ensure the economic feasibility of the attained optimum designs. The composition of the material of construction is optimized by defining the spatial distribution of the volume fractions of the material constituents using piecewise variations along the pipe length. The major aim is to tailor the material distribution in the axial direction so as to avoid the occurrence of divergence instability without the penalty of increased structural mass. Three types of boundary conditions have been examined, namely Hinged-Hinged, Clamped-Hinged and Clamped-Clamped pipelines. The resulting optimization problem has been formulated as a nonlinear mathematical programming problem solved by invoking the MATLAB optimization toolbox routines, which implement the constrained function minimization routine "fmincon" interacting with the associated eigenvalue problem routines. In fact, the proposed mathematical models have succeeded in maximizing the critical flow velocity without a mass penalty, producing efficient and economic designs with enhanced stability characteristics compared with the baseline designs.
Keywords: Functionally graded materials, pipe flow, optimum design, fluid-structure interaction.
934 An Optimization of Machine Parameters for Modified Horizontal Boring Tool Using Taguchi Method
Authors: Thirasak Panyaphirawat, Pairoj Sapsmarnwong, Teeratas Pornyungyuen
Abstract:
This paper presents the findings of an experimental investigation of important machining parameters for a horizontal boring tool modified to mount on a horizontal lathe machine to bore an over-length workpiece. In order to verify the usability of the modified tool, a design of experiments based on the Taguchi method is performed. The parameters investigated are spindle speed, feed rate, depth of cut and length of workpiece. A Taguchi L9 orthogonal array is selected for four factors at three levels in order to minimize the surface roughness (Ra and Rz) of S45C steel tubes. Signal-to-noise ratio analysis and analysis of variance (ANOVA) are performed to study the effect of these parameters and to optimize the machine setting for the best surface finish. The controlled factors with the most effect are, in order, depth of cut, spindle speed, length of workpiece, and feed rate. A confirmation test is performed to verify the optimal setting obtained from the Taguchi method, and the result is satisfactory.
Keywords: Design of Experiment, Taguchi Design, Optimization, Analysis of Variance, Machining Parameters, Horizontal Boring Tool.
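The S/N analysis step can be sketched generically: for a smaller-the-better response such as surface roughness, the signal-to-noise ratio of each run is S/N = -10·log10(mean(y²)), and averaging S/N per factor level indicates the preferred setting. The Python example below applies this to a made-up L9 response table; the roughness values are illustrative, not the paper's measurements.

```python
import numpy as np

# Taguchi L9 orthogonal array: 9 runs, 4 factors, 3 levels (coded 0, 1, 2).
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])
factors = ["spindle speed", "feed rate", "depth of cut", "workpiece length"]

# Illustrative Ra measurements (two replicates per run), in micrometres.
ra = np.array([
    [1.9, 2.1], [2.4, 2.6], [3.0, 3.2],
    [2.2, 2.0], [2.8, 2.9], [3.5, 3.4],
    [2.5, 2.6], [3.1, 3.0], [3.8, 3.9],
])

# Smaller-the-better S/N ratio for each run.
sn = -10.0 * np.log10(np.mean(ra ** 2, axis=1))

for j, name in enumerate(factors):
    level_means = [sn[L9[:, j] == level].mean() for level in range(3)]
    best = int(np.argmax(level_means))   # higher S/N is better
    span = max(level_means) - min(level_means)
    print(f"{name:17s} best level = {best}  (S/N range {span:.2f} dB)")
```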
933 Optimization of Process Parameters of Pressure Die Casting using Taguchi Methodology
Authors: Satish Kumar, Arun Kumar Gupta, Pankaj Chandna
Abstract:
The present work analyses different parameters of pressure die casting to minimize casting defects. Pressure die casting is usually applied for the casting of aluminium alloys. A good surface finish with the required tolerances and dimensional accuracy can be achieved by optimization of controllable process parameters such as solidification time, molten metal temperature, filling time, injection pressure and plunger velocity. Moreover, by selection of optimum process parameters, pressure die casting defects such as porosity, insufficient spread of molten material, flash, etc. are also minimized. Therefore, a pressure die cast component, a carburetor housing of aluminium alloy (Al2Si2O5), has been considered. The effects of the selected process parameters on casting defects, and the subsequent setting of the parameters and their levels, have been determined by Taguchi's parameter design approach. The experiments have been performed as per the combinations of levels of the different process parameters suggested by an L18 orthogonal array. Analyses of variance have been performed for the mean and the signal-to-noise ratio to estimate the percent contribution of the different process parameters. A confidence interval has also been estimated at the 95% confidence level, and three confirmation experiments have been performed to validate the optimum levels of the different parameters. Overall, a 2.352% reduction in defects has been observed with the help of the suggested optimum process parameters.
Keywords: Aluminium Casting, Pressure Die Casting, Taguchi Methodology, Design of Experiments
932 MJPEG Real-Time Transmission in Industrial Environments Using a CBR Channel
Authors: J. Silvestre, L. Almeida, R. Marau, P. Pedreiras
Abstract:
Currently, there are many local area industrial networks that can give guaranteed bandwidth to synchronous traffic, in particular by providing CBR (Constant Bit Rate) channels, which allow improved bandwidth management. Some of these networks operate over Ethernet, delivering channels with enough capacity, especially when compression is used, to integrate multimedia traffic into industrial monitoring and image processing applications with many sources. In such industrial environments, where low latency is an essential requirement, JPEG is an adequate compression technique, but it generates VBR (Variable Bit Rate) traffic. Transmitting VBR traffic over CBR channels is inefficient, and current solutions to this problem significantly increase the latency or further degrade the quality. In this paper, an R(q) model is used which allows on-line calculation of the JPEG quantization factor. We obtained increased quality and a lower requirement for the CBR channel, with a reduced number of discarded frames and better use of the channel bandwidth.
Keywords: Industrial Networks, Multimedia.
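The practical effect of such a rate control is to pick, for each frame, the largest JPEG quality factor whose encoded size still fits the per-frame CBR budget. The sketch below does that with a plain bisection over the Pillow quality setting on a synthetic frame, as a simple stand-in for the model-based on-line calculation described in the paper; the channel rate and frame rate are illustrative assumptions.

```python
import io

import numpy as np
from PIL import Image

def encoded_size(img, quality):
    """Return the JPEG-encoded size in bytes for a given quality factor."""
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    return buf.getbuffer().nbytes

def quality_for_budget(img, budget_bytes, q_min=5, q_max=95):
    """Largest quality factor whose encoded frame fits the per-frame CBR budget,
    or None if even the lowest quality does not fit (frame would be discarded)."""
    if encoded_size(img, q_min) > budget_bytes:
        return None
    lo, hi, best = q_min, q_max, q_min
    while lo <= hi:
        mid = (lo + hi) // 2
        if encoded_size(img, mid) <= budget_bytes:
            best, lo = mid, mid + 1
        else:
            hi = mid - 1
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    base = np.tile(np.linspace(0, 255, 640), (480, 1))          # smooth gradient
    noise = rng.normal(0, 10, size=(480, 640))                  # camera-like noise
    frame = Image.fromarray(np.clip(base + noise, 0, 255).astype(np.uint8))
    channel_bps, fps = 2_000_000, 25            # assumed CBR channel and frame rate
    budget = channel_bps // 8 // fps            # bytes available per frame
    q = quality_for_budget(frame, budget)
    if q is None:
        print("frame exceeds the budget even at minimum quality; it would be discarded")
    else:
        print(f"budget {budget} bytes/frame -> quality {q}, "
              f"size {encoded_size(frame, q)} bytes")
```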