Search results for: high resolution array processing techniques
8789 A Grid Synchronization Method Based on Adaptive Notch Filter for SPV System with Modified MPPT
Authors: Priyanka Chaudhary, M. Rizwan
Abstract:
This paper presents a grid synchronization technique based on an adaptive notch filter for an SPV (Solar Photovoltaic) system along with MPPT (Maximum Power Point Tracking) techniques. An efficient grid synchronization technique offers proficient detection of the various components of the grid signal, such as phase and frequency. It also acts as a barrier against harmonics and other disturbances in the grid signal. A reference phase signal synchronized with the grid voltage is provided by the grid synchronization technique to make the system comply with grid codes and power quality standards. Hence, the grid synchronization unit plays an important role in grid connected SPV systems. As the output of the PV array fluctuates with meteorological parameters such as irradiance, temperature, and wind, MPPT control is required to track the maximum power point of the PV array and maintain a constant DC voltage at the VSC (Voltage Source Converter) input. In this work, a variable step size P&O (Perturb and Observe) MPPT technique with a DC/DC boost converter has been used at the first stage of the system. This algorithm divides the dPpv/dVpv curve of the PV panel into three separate zones, i.e., zone 0, zone 1 and zone 2. A fine tracking step size is used in zone 0, while zone 1 and zone 2 require a large step size in order to obtain a high tracking speed. Further, an adaptive notch filter based control technique is proposed for the VSC in the PV generation system. The adaptive notch filter (ANF) approach is used to synchronize the interfaced PV system with the grid, maintaining the amplitude, phase and frequency parameters as well as improving power quality. This technique offers compensation of harmonic currents and reactive power with both linear and nonlinear loads. To maintain a constant DC link voltage, a PI controller is also implemented and presented in this paper. The complete system has been designed, developed and simulated using the SimPowerSystems and Simulink toolboxes of MATLAB. The performance analysis of the three phase grid connected solar photovoltaic system has been carried out on the basis of various parameters such as PV output power, PV voltage, PV current, DC link voltage, PCC (Point of Common Coupling) voltage, grid voltage, grid current, voltage source converter current, and power supplied by the voltage source converter. The results obtained from the proposed system are found satisfactory.
Keywords: Solar photovoltaic systems, MPPT, voltage source converter, grid synchronization technique.
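To make the three-zone idea concrete, here is a minimal Python sketch of a variable step size P&O update. The zone boundary and the step sizes are illustrative assumptions; the abstract does not state the values used.

```python
def variable_step_po(v_pv, p_pv, v_prev, p_prev, v_ref,
                     zone0_limit=2.0,        # |dPpv/dVpv| below this -> zone 0 (assumed)
                     fine=0.1, coarse=1.0):  # step sizes in volts (assumed)
    """Return the next PV voltage reference for the boost converter stage."""
    dv, dp = v_pv - v_prev, p_pv - p_prev
    slope = dp / dv if dv != 0 else 0.0          # local dPpv/dVpv estimate
    # Zone 0 (near the MPP): fine step; zones 1 and 2 (far away): coarse step.
    step = fine if abs(slope) < zone0_limit else coarse
    # Classic P&O rule: keep perturbing in the same direction if power rose.
    direction = 1.0 if dp * dv > 0 else -1.0
    return v_ref + direction * step
```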
8788 ISC – Intelligent Subspace Clustering, A Density Based Clustering Approach for High Dimensional Dataset
Authors: Sunita Jahirabadkar, Parag Kulkarni
Abstract:
Many real-world data sets consist of a very high dimensional feature space. Most clustering techniques use the distance or similarity between objects as a measure to build clusters. But in high dimensional spaces, distances between points become relatively uniform. In such cases, density based approaches may give better results. Subspace clustering algorithms automatically identify lower dimensional subspaces of the higher dimensional feature space in which clusters exist. In this paper, we propose a new clustering algorithm, ISC – Intelligent Subspace Clustering, which tries to overcome three major limitations of the existing state-of-the-art techniques. ISC determines input parameters such as the ε-distance at various levels of subspace clustering, which helps in finding meaningful clusters; a uniform-parameter approach is not suitable for different kinds of databases. ISC implements dynamic and adaptive determination of meaningful clustering parameters based on a hierarchical filtering approach. The third and most important feature of ISC is its capability for incremental learning and dynamic inclusion and exclusion of subspaces, which leads to better cluster formation.
Keywords: Density based clustering, high dimensional data, subspace clustering, dynamic parameter setting.
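The ISC algorithm itself (its hierarchical filtering and incremental learning) is the authors' contribution and is not specified in the abstract. The underlying idea, density-based clustering run in candidate subspaces with an ε adapted to each subspace, can be sketched as follows; the nearest-neighbor ε heuristic here is a generic stand-in, not the paper's method.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_subspaces(X, subspaces, min_samples=5):
    """Run DBSCAN in each candidate subspace (list of column-index tuples),
    with eps adapted per subspace via a nearest-neighbor distance heuristic."""
    results = {}
    for dims in subspaces:
        Xs = X[:, list(dims)]
        # Pairwise distances within this subspace (fine for sketch-sized data).
        d = np.sqrt(((Xs[:, None, :] - Xs[None, :, :]) ** 2).sum(-1))
        np.fill_diagonal(d, np.inf)
        eps = max(2.0 * np.median(d.min(axis=1)), 1e-9)  # crude adaptive eps
        results[tuple(dims)] = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(Xs)
    return results
```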
8787 Artificial Intelligence Techniques for Controlling Spacecraft Power System
Authors: Hanaa T. El-Madany, Faten H. Fahmy, Ninet M. A. El-Rahman, Hassen T. Dorrah
Abstract:
Advancements in the field of artificial intelligence (AI) made during this decade have forever changed the way we look at automating spacecraft subsystems, including the electrical power system. AI has been used to solve complicated practical problems in various areas and is becoming more and more popular nowadays. In this paper, mathematical modeling and a MATLAB-SIMULINK model for the different components of the spacecraft power system are presented. Also, a control system, which includes either a Neural Network Controller (NNC) or a Fuzzy Logic Controller (FLC), is developed for achieving coordination between the components of the spacecraft power system as well as for controlling the energy flows. The performance of the spacecraft power system is evaluated by comparing the two control systems using the NNC and the FLC.
Keywords: Spacecraft, Neural network, Fuzzy logic control, Photovoltaic array.
8786 A Monte Carlo Method to Data Stream Analysis
Authors: Kittisak Kerdprasop, Nittaya Kerdprasop, Pairote Sattayatham
Abstract:
Data stream analysis is the process of computing various summaries and derived values from large amounts of data which are continuously generated at a rapid rate. The nature of a stream does not allow a revisit of each data element. Furthermore, data processing must be fast to produce timely analysis results. These requirements impose constraints on the design of the algorithms to balance correctness against timely responses. Several techniques have been proposed over the past few years to address these challenges. These techniques can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of data or a smaller transformed representation, whereas the task-oriented scheme solves the problem directly via approximation techniques. We propose a hybrid approach to tackle the data stream analysis problem. The data stream is both statistically transformed to a smaller size and computationally approximated in its characteristics. We adopt a Monte Carlo method in the approximation step. The data reduction is performed horizontally and vertically through our EMR sampling method. The proposed method is analyzed by a series of experiments. We apply our algorithm to clustering and classification tasks to evaluate the utility of our approach.
Keywords: Data Stream, Monte Carlo, Sampling, Density Estimation.
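The EMR sampling method is the authors' own and is not detailed in the abstract. As a generic illustration of one-pass Monte Carlo sampling of a stream, classical reservoir sampling (Vitter's Algorithm R) keeps a uniform random sample while touching each element exactly once:

```python
import random

def reservoir_sample(stream, k):
    """Keep a uniform random sample of size k from a stream of unknown
    length, visiting each element exactly once (Vitter's Algorithm R)."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)       # fill the reservoir first
        else:
            j = random.randint(0, i)     # inclusive bounds
            if j < k:
                reservoir[j] = item      # replace with decreasing probability
    return reservoir
```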
8785 Ontologies for Complex Event Processing
Authors: Irina Astrova, Arne Koschel, Jan Lukanowski, Jose Luis Munoz Martinez, Valerij Procenko, Marc Schaaf
Abstract:
In this paper, five ontologies that include event concepts are described. The paper provides an overview and comparison of existing event models. The main criteria for comparison are that it should be possible to model events with extent in time and location and with participating objects; however, other factors are taken into account as well. The paper also shows an example of using the ontologies in complex event processing.
Keywords: Ontologies, events, complex event processing (CEP).
8784 DocPro: A Framework for Processing Semantic and Layout Information in Business Documents
Authors: Ming-Jen Huang, Chun-Fang Huang, Chiching Wei
Abstract:
With the recent advances in deep neural networks, we observe new applications of NLP (natural language processing) and CV (computer vision) powered by deep neural networks for processing business documents. However, creating a real-world document processing system requires integrating several NLP and CV tasks rather than treating them separately. There is a need for a unified approach to processing documents containing textual and graphical elements with rich formats, diverse layout arrangements, and distinct semantics. In this paper, a framework that fulfills this unified approach is presented. The framework includes a representation model definition for holding the information generated by the various tasks, and specifications defining the coordination between these tasks. The framework is a blueprint for building a system that can process documents with rich formats, styles, and multiple types of elements. The flexible and lightweight design of the framework can help build systems for diverse business scenarios, such as contract monitoring and reviewing.
Keywords: Document processing, framework, formal definition, machine learning.
8783 Genetic-Based Multi Resolution Noisy Color Image Segmentation
Authors: Raghad Jawad Ahmed
Abstract:
Segmentation of a color image composed of different kinds of regions can be a hard problem, in particular the computation of exact texture fields and the decision on the optimum number of segmentation areas when the image contains similar and/or non-stationary texture fields. A novel neighborhood-based segmentation approach is proposed. A genetic algorithm is used in the proposed segment-pass optimization process. In this pass, an energy function, which is defined based on Markov Random Fields, is minimized. In this paper we use an adaptive threshold estimation method for image thresholding in the wavelet domain based on generalized Gaussian distribution (GGD) modeling of subband coefficients. This method, called NormalShrink, is computationally efficient and adaptive because the parameters required for estimating the threshold depend on the subband data energy used in the pre-stage of segmentation. A quadtree is employed to implement the multiresolution framework, which enables the use of different strategies at different resolution levels and hence accelerates the computation. The experimental results using the proposed segmentation approach are very encouraging.
Keywords: Color image segmentation, Genetic algorithm, Markov random field, Scale space filter.
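A minimal sketch of NormalShrink soft-thresholding in the wavelet domain, following the published NormalShrink threshold T = β·σ̂n²/σ̂y with β = sqrt(ln(Lk/J)); the 'db4' wavelet and the number of levels here are assumptions, not the paper's settings.

```python
import numpy as np
import pywt

def normal_shrink_denoise(img, wavelet='db4', levels=3):
    coeffs = pywt.wavedec2(img, wavelet, level=levels)
    # Robust noise estimate from the finest diagonal subband.
    sigma_n = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    out = [coeffs[0]]
    for (cH, cV, cD) in coeffs[1:]:
        beta = np.sqrt(np.log(cH.size / levels))   # beta = sqrt(ln(L_k / J))
        bands = []
        for c in (cH, cV, cD):
            t = beta * sigma_n ** 2 / (c.std() + 1e-12)  # NormalShrink threshold
            bands.append(pywt.threshold(c, t, mode='soft'))
        out.append(tuple(bands))
    return pywt.waverec2(out, wavelet)
```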
8782 Improved Feature Processing for Iris Biometric Authentication System
Authors: Somnath Dey, Debasis Samanta
Abstract:
Iris-based biometric authentication is gaining importance in recent times. Iris biometric processing, however, is a complex process and computationally very expensive. In the overall processing of iris biometrics in an iris-based biometric authentication system, feature processing is an important task. In feature processing, we extract iris features, which are ultimately used in matching. Since there is a large number of iris features and computational time increases as the number of features increases, it is a challenge to develop an iris processing system with as few features as possible without compromising correctness. In this paper, we address this issue and present an approach to the feature extraction and feature matching process. We apply the Daubechies D4 wavelet with 4 levels to extract features from iris images. These features are encoded with 2 bits by quantizing into 4 quantization levels. With our proposed approach it is possible to represent an iris template with only 304 bits, whereas existing approaches require as many as 1024 bits. In addition, we assign different weights to different iris regions when comparing two iris templates, which significantly increases the accuracy. Further, we match the iris templates based on a weighted similarity measure. Experimental results on several iris databases substantiate the efficacy of our approach.
Keywords: Iris recognition, biometric, feature processing, pattern recognition, pattern matching.
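A minimal sketch of the described feature pipeline: a 4-level Daubechies D4 decomposition followed by 2-bit quantization and a weighted similarity measure. Note that in pywt the 4-tap Daubechies D4 filter is named 'db2'; the quantization edges and region weights below are illustrative assumptions.

```python
import numpy as np
import pywt

def iris_template(strip):
    """strip: normalized iris image. 4-level Daubechies D4 decomposition
    ('db2' in pywt is the 4-tap D4 filter), then 2-bit quantization."""
    coeffs = pywt.wavedec2(strip, 'db2', level=4)
    feats = coeffs[0].ravel()                          # coarsest subband
    edges = np.quantile(feats, [0.25, 0.5, 0.75])      # 4 levels -> 2 bits each
    return np.digitize(feats, edges).astype(np.uint8)  # codes 0..3

def weighted_similarity(t1, t2, weights=None):
    """Weighted fraction of matching codes between two templates."""
    weights = np.ones(len(t1)) if weights is None else np.asarray(weights, float)
    return float(((t1 == t2) * weights).sum() / weights.sum())
```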
8781 Structural Sustainability Techniques for RC High Rise Buildings
Authors: Mohamed A. Azab
Abstract:
Over the early years of the 21st century, cities throughout the Middle East, particularly in the Gulf region, have expanded more rapidly than ever before. Given the presence of a large volume of high-rise buildings all over the region, the local authorities aim to set a new standard for sustainable development, with an integrated approach to maintain a balance between economy, quality, environmental protection and safety of life. In the very near future, sustainability will become a mandatory requirement in all building projects. It is well known that structural design engineers have so far had no key role in building sustainability, and LEED (Leadership in Energy and Environmental Design) has looked almost exclusively at environmental components and material specifications. The objective of this paper is to focus on and establish groundwork for sustainability techniques and applications related to RC high-rise building design from the structural point of view. A set of recommendations related to local conditions, structural modeling and analysis is given, and some helpful suggestions for the structural design team's work are addressed. This paper attempts to help structural engineers in identifying building sustainability design, in order to meet local needs and achieve alternative solutions at an early stage of project design.
Keywords: Building, Design, High-rise, Middle East, Structural, Sustainability.
8780 Improvement in Properties of Ni-Cr-Mo-V Steel through Process Control
Authors: Arnab Majumdar, Sanjoy Sadhukhan
Abstract:
Although gun barrel steels are an important class of materials from a defense viewpoint, the available literature is very limited. In the present work, an IF grade Ni-Cr-Mo-V high strength low alloy steel is produced via an Electric Earth Furnace-ESR route. The ingot was hot forged to the desired dimensions with a reduction ratio of 70-75%, followed by homogenization, hardening and tempering treatment. Sample chemistry, NMIR, and macro- and microstructural analyses were carried out. Mechanical properties, including tensile strength, impact toughness, and fracture toughness, were studied. Ultrasonic testing was done to identify internal flaws. The existing high strength low alloy Ni-Cr-Mo-V steel shows improved properties under the modified processing route and heat treatment schedule in comparison to the properties reported earlier for the manufacturing of gun barrels. The improved properties suggest that higher explosive loads can be withstood with the same amount of steel in gun barrel applications.
Keywords: Gun barrel steels, IF grade, physical properties, thermal and mechanical processing, mechanical properties, ultrasonic testing.
8779 Characterization of Fabricated A 384.1-MgO Based Metal Matrix Composite and Optimization of Tensile Strength using Taguchi Techniques
Authors: Nripjit, Anand K Tyagi, Nirmal Singh
Abstract:
The present work deals consecutively with the synthesis and characterization of composites, with Al alloy A 384.1 as the matrix and 5% MgO as the main reinforcement of the metal matrix composite. Of practical significance is the low cost processing route for the fabrication of Al alloy A 384.1 composites, given the operational difficulties of presently available manufacturing processes based on liquid manipulation methods. As with all new developments, a complete understanding of the influence of processing variables on the final quality of the product is required. The composites were characterized comprehensively, including specific heat measurement of the material with the aid of thermographs. Products were evaluated with respect to relative particle size and mechanical behavior under tensile loading. Furthermore, the Taguchi technique was employed to determine the experimental optimum, owing to the effectiveness of this approach.
Keywords: MMC, Thermographs, Tensile strength, Taguchi technique, Optimal parameters.
8778 Six Sigma-Based Optimization of Shrinkage Accuracy in Injection Molding Processes
Authors: Sky Chou, Joseph C. Chen
Abstract:
This paper focuses on using six sigma methodologies to reach the desired shrinkage of a manufactured high-density polyethylene (HDPE) part produced by an injection molding machine. It presents a case study where the correct shrinkage is required to reduce or eliminate defects and to improve the process capability indices Cp and Cpk for an injection molding process. To improve this process and keep the product within specifications, the six sigma methodology, namely the define, measure, analyze, improve, and control (DMAIC) approach, was implemented in this study. The six sigma approach was paired with the Taguchi methodology to identify the optimized processing parameters that keep the shrinkage rate within the specifications set by the customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors are the cooling time, melt temperature, holding time, and metering stroke. The noise factor is the difference between material brand 1 and material brand 2. After the confirmation run was completed, measurements verified that the new parameter settings are optimal. With the new settings, the process capability index improved dramatically. The purpose of this study is to show that the six sigma and Taguchi methodologies can be efficiently used to determine the important factors that improve the process capability index of the injection molding process.
Keywords: Injection molding, shrinkage, six sigma, Taguchi parameter design.
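A minimal sketch of the Taguchi analysis step: the standard L9(3^4) orthogonal array, a nominal-the-best S/N ratio (appropriate for hitting a target shrinkage), and main-effect ranking. The factor names follow the abstract; the S/N numbers are placeholders, not measured data.

```python
import numpy as np

# Standard L9(3^4) orthogonal array, levels coded 0..2; columns map to
# cooling time, melt temperature, holding time, metering stroke.
L9 = np.array([[0,0,0,0],[0,1,1,1],[0,2,2,2],
               [1,0,1,2],[1,1,2,0],[1,2,0,1],
               [2,0,2,1],[2,1,0,2],[2,2,1,0]])

def sn_nominal_the_best(y):
    """S/N = 10*log10(mean^2 / variance) over replicate measurements."""
    y = np.asarray(y, float)
    return 10 * np.log10(y.mean() ** 2 / y.var(ddof=1))

def main_effects(sn, col):
    """Mean S/N at each level of one factor column."""
    return [sn[L9[:, col] == lvl].mean() for lvl in range(3)]

# sn would come from replicate shrinkage measurements per run via
# sn_nominal_the_best; the values below are placeholders only.
sn = np.array([24.1, 23.5, 22.9, 25.0, 24.4, 23.1, 24.8, 23.9, 22.7])
factors = ['cooling time', 'melt temperature', 'holding time', 'metering stroke']
best_levels = {f: int(np.argmax(main_effects(sn, i))) for i, f in enumerate(factors)}
```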
8777 Review of Surface Electromyogram Signals: Its Analysis and Applications
Authors: Anjana Goen, D. C. Tiwari
Abstract:
Electromyography (EMG) is the study of muscle function through analysis of the electrical activity produced by muscles. This electrical activity, which is displayed in the form of a signal, is the result of neuromuscular activation associated with muscle contraction. The most common techniques of EMG signal recording use surface and needle/wire electrodes, where the latter are usually used when deep muscles are of interest. This paper focuses on the surface electromyogram (SEMG) signal. During SEMG recording, several problems have to be countered, such as noise, motion artifacts and signal instability. Thus, various signal processing techniques have been implemented to produce a reliable signal for analysis. The SEMG signal finds broad application, particularly in the biomedical field. It has been analyzed and studied for various purposes, such as neuromuscular disease, enhancement of muscular function and human-computer interfaces.
Keywords: Evolvable hardware (EHW), Functional Electrical Stimulation (FES), Hidden Markov Model (HMM), Hjorth Time Domain (HTD).
8776 Comparison of Pore Space Features by Thin Sections and X-Ray Microtomography
Authors: H. Alves, J. T. Assis, M. Geraldes, I. Lima, R. T. Lopes
Abstract:
Microtomographic images and thin section (TS) images were analyzed and compared with respect to parameters of geological interest such as porosity and its distribution along the samples. The results show that microtomography (CT) analysis, although limited by its resolution, provides interesting information about the distribution of porosity (homogeneous or not) and can also quantify the connected and non-connected pores, i.e., the total porosity. TS analysis has no limitation concerning resolution, but is limited by the experimental data available, in the form of a few glass sheets for analysis, and can give information only about the connected pores, i.e., the effective porosity. The two methods have their own virtues and flaws, but when paired together they complement one another, making for a more reliable and complete analysis.
Keywords: Microtomography, petrographical microscopy, sediments, thin sections.
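The distinction between total and effective porosity drawn above can be computed directly from a segmented CT volume: total porosity is the pore-voxel fraction, while effective porosity counts only pore components connected to the sample boundary. A minimal sketch, assuming a boolean 3D array with True for pore voxels:

```python
import numpy as np
from scipy import ndimage

def porosities(pores):
    """pores: 3D boolean array, True = pore voxel."""
    total = pores.mean()                      # total porosity (CT-style)
    labels, _ = ndimage.label(pores)          # face-connected pore components
    boundary = np.zeros_like(pores, dtype=bool)
    boundary[0], boundary[-1] = True, True
    boundary[:, 0], boundary[:, -1] = True, True
    boundary[:, :, 0], boundary[:, :, -1] = True, True
    open_ids = np.unique(labels[boundary & pores])
    open_ids = open_ids[open_ids > 0]         # drop the background label 0
    effective = np.isin(labels, open_ids).mean()  # connected (effective) porosity
    return total, effective
```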
8775 Classification of Right and Left-Hand Movement Using Multi-Resolution Analysis Method
Authors: Nebi Gedik
Abstract:
The aim of brain-computer interface studies on electroencephalogram (EEG) signals containing motor imagery is to extract effective features that provide the highest possible classification accuracy for the detection of the desired motor movement. However, achieving this goal is difficult, as the most suitable frequency band and time frame vary from subject to subject. In this study, the classification success of two-feature data obtained from raw EEG signals and from the coefficients of a multi-resolution analysis method applied to the EEG signals was analyzed comparatively. The method was applied to the signals of several EEG channels (C3, Cz and C4) from the EEG data set of the publicly available BCI competition III.
Keywords: Motor imagery, EEG, wave atom transform, k-NN.
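A minimal sketch of the feature-plus-classifier workflow, using a wavelet multiresolution decomposition as a stand-in for the wave atom transform named in the keywords (which has no widely available Python implementation); the channel names follow the abstract, and the other parameters are assumptions.

```python
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

def mra_features(epoch, wavelet='db4', level=4):
    """epoch: (n_channels, n_samples), e.g. rows for C3, Cz, C4.
    Returns log-energy of each multiresolution subband per channel."""
    feats = []
    for ch in epoch:
        for c in pywt.wavedec(ch, wavelet, level=level):
            feats.append(np.log(np.sum(c ** 2) + 1e-12))
    return np.array(feats)

# Usage sketch: X_train is a list of (3, n_samples) arrays,
# y_train holds labels 0 = left hand, 1 = right hand.
# clf = KNeighborsClassifier(n_neighbors=5).fit(
#     [mra_features(e) for e in X_train], y_train)
```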
8774 A 2D-3D Hybrid Vision System for Robotic Manipulation of Randomly Oriented Objects
Authors: Moulay A. Akhloufi
Abstract:
This paper presents a new vision technique for robotic manipulation of randomly oriented objects in industrial applications. The proposed approach uses 2D and 3D vision for efficiently extracting the 3D pose of an object in the presence of multiple randomly positioned objects. 2D vision permits quick selection of the objects of interest for 3D processing with a new modified ICP algorithm (FaR-ICP), thus reducing the processing time significantly. The extracted 3D pose is then sent to the robot manipulator for picking. The tests show that the proposed system achieves high performance.
Keywords: 3D vision, Hand-Eye calibration, robot visual servoing, random bin picking.
8773 A Probabilistic Optimization Approach for a Gas Processing Plant under Uncertain Feed Conditions and Product Requirements
Authors: G. Mesfin, M. Shuhaimi
Abstract:
This paper proposes a new optimization technique for a gas processing plant under uncertain feed and product flows. The problem is first formulated using a continuous linear deterministic approach. Subsequently, single and joint chance constraint models for a steady state process with time-dependent uncertainties are developed. The solution approach is based on converting the probabilistic problems into their equivalent deterministic form and solving them at different confidence levels. A case study of a real plant operation has been used to effectively implement the proposed model. The optimization results indicate that prior decisions have to be made for the operating plant under uncertain feed and product flows by satisfying all the constraints at the 95% confidence level for the single chance-constrained case and the 85% confidence level for the joint chance-constrained case.
Keywords: Butane, Feed composition, LPG, Product specification, Propane.
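The conversion of a probabilistic constraint into its deterministic equivalent, which the abstract alludes to, takes the following standard form for a single chance constraint with a normally distributed right-hand side (symbols are generic, not the paper's notation):

```latex
% Single chance constraint with b ~ N(mu_b, sigma_b^2) and confidence alpha:
\Pr\!\left(a^{\top}x \le b\right) \ge \alpha
\;\Longleftrightarrow\;
a^{\top}x \le \mu_b - z_{\alpha}\,\sigma_b ,
\qquad z_{\alpha} = \Phi^{-1}(\alpha)
% e.g. z_{0.95} \approx 1.645, matching the paper's 95% single-constraint level.
```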
8772 An Image Enhancement Method Based on Curvelet Transform for CBCT-Images
Authors: Shahriar Farzam, Maryam Rastgarpour
Abstract:
Image denoising plays an extremely important role in digital image processing. Curvelet-based enhancement of clinical images has developed rapidly in recent years. In this paper, we present a method for image contrast enhancement of cone beam CT (CBCT) images based on fast discrete curvelet transforms (FDCT) that work through the Unequally Spaced Fast Fourier Transform (USFFT). These transforms return a table of curvelet transform coefficients indexed by a scale parameter, an orientation and a spatial location. Accordingly, the coefficients obtained from FDCT-USFFT can be modified in order to enhance contrast in an image. Our proposed method first applies a two-dimensional mathematical transform, namely the FDCT through the unequally spaced fast Fourier transform, to the input image and then applies thresholding to the curvelet coefficients to enhance the CBCT images. Consequently, applying the unequally spaced fast Fourier transform leads to an accurate reconstruction of the image with high resolution. The experimental results indicate that the performance of the proposed method is superior to existing ones in terms of Peak Signal to Noise Ratio (PSNR) and Effective Measure of Enhancement (EME).
Keywords: Curvelet transform, image enhancement, CBCT, image denoising.
8771 Retrieving Extended High Dynamic Range from Digital Negative Image - An Experiment on Architectural Photo Imaging
Authors: See Zi Siang, Khairul Hazrin Hashim, Harold Thwaites, Lee Xia Sheng, Ooi Wooi Har
Abstract:
The paper explores the development and optimization of a method and apparatus for retrieving extended high dynamic range from a digital negative image. Architectural photo imaging can benefit from the high dynamic range imaging (HDRI) technique for preserving and presenting sufficient luminance in shadow and highlight clipping areas of the image. The HDRI technique that requires multiple exposure images as the source for HDRI rendering may not be time-efficient during the acquisition process and post-processing stage, considering its numerous potential imaging variables and the technical limitations of the multiple exposure process. This paper explores an experimental method and apparatus that aims to expand the dynamic range from a digital negative image in an HDRI environment. The method and apparatus explored are based on a single-source RAW image acquisition for use in HDRI post-processing. Optimization is catered for in order to avoid and minimize the conventional HDRI photographic errors caused by different physical conditions during the photographing process and by the misalignment of multiple exposed image sequences. The study observes the characteristics and capabilities of the RAW image format, used as a digital negative, for the retrieval of extended high dynamic range in an HDRI environment.
Keywords: High Dynamic Range Image, Photography Workflow Optimization, Digital Negative Image, Architectural Image
8770 Global Security Using Human Face Understanding under Vision Ubiquitous Architecture System
Abstract:
Different methods based on biometric algorithms are presented for eigenface detection, including face recognition, identification and verification. The theme of this research is to manage the critical processing stages (accuracy, speed, security and monitoring) of face activities with the flexibility of searching and editing the secure authorized database. In this paper we implement techniques such as eigenface vector reduction, using texture and shape vectors to remove complexity, while a density matching score with Face Boundary Fixation (FBF) extracts the most likely characteristics in the processed media content. We examine the development and performance efficiency of the database by applying our algorithms in both the recognition and the detection phases. Our results show gains in accuracy and security, with better performance than a number of previous approaches in all the above processes.
Keywords: Ubiquitous architecture, verification, identification, recognition.
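A minimal sketch of the classical eigenfaces computation (PCA via SVD) that underlies eigenface-based recognition, with matching by nearest neighbor in eigenspace; the number of components and the gallery format are assumptions.

```python
import numpy as np

def fit_eigenfaces(gallery, k=20):
    """gallery: (n_images, h*w) flattened grayscale faces."""
    mean = gallery.mean(axis=0)
    U, S, Vt = np.linalg.svd(gallery - mean, full_matrices=False)
    basis = Vt[:k]                       # top-k eigenfaces
    coords = (gallery - mean) @ basis.T  # gallery projected into eigenspace
    return mean, basis, coords

def match(face, mean, basis, coords):
    """Return index of the closest gallery face in eigenspace."""
    q = (face - mean) @ basis.T
    return int(np.argmin(np.linalg.norm(coords - q, axis=1)))
```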
8769 Modeling of Water Erosion in the M'Goun Watershed Using OpenGIS Software
Authors: M. Khal, Ab. Algouti, A. Algouti
Abstract:
Water erosion is the major cause of the erosion that shapes the earth's surface. Modeling water erosion requires the use of GIS software and programs, whether commercial (closed source) or open source. The very high prices of commercial GIS licenses motivate users and researchers to find open source software as relevant and applicable as proprietary GIS. The objective of this study is the modeling of water erosion and the hydrogeological and morphophysical characterization of the Oued M'Goun watershed (southern flank of the Central High Atlas), carried out with free GIS programs. Very pertinent results are obtained by executing tasks and algorithms in a simple and easy way. Thus, the various geoscientific and geostatistical analyses of a digital elevation model (SRTM, 30 m resolution), combined with the processing and interpretation of satellite imagery, allowed us to characterize the studied region and to map the areas most vulnerable to water erosion.
Keywords: Central High-Atlas, hydrogeology, M’Goun watershed, OpenGIS, water erosion.
8768 Fabrication of Tissue Engineering Scaffolds Using Rapid Prototyping Techniques
Authors: Osama A. Abdelaal, Saied M. Darwish
Abstract:
Rapid prototyping (RP) techniques are a group of advanced manufacturing processes that can produce custom made objects directly from computer data such as Computer Aided Design (CAD), Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) data. Using RP fabrication techniques, constructs with controllable and complex internal architecture and appropriate mechanical properties can be achieved. One attractive and promising use of RP techniques is tissue engineering (TE) scaffold fabrication. A tissue engineering scaffold is a 3D construct that acts as a template for tissue regeneration. Although several conventional techniques, such as solvent casting and gas foaming, are utilized in scaffold fabrication, these processes show poor interconnectivity and uncontrollable porosity in the produced scaffolds. Thus, RP techniques have become the best alternative fabrication methods for TE scaffolds. This paper reviews the current state of the art in tissue engineering scaffold fabrication using advanced RP processes, as well as the current limitations and future trends of scaffold fabrication with RP techniques.
Keywords: Biomanufacturing, Rapid prototyping, Solid Freeform Fabrication, Scaffold Fabrication, Tissue Engineering.
8767 Nutritional Potential and Functionality of Whey Powder Influenced by Different Processing Temperature and Storage
Authors: Zarmina Gillani, Nuzhat Huma, Aysha Sameen, Mulazim Hussain Bukhari
Abstract:
Whey is an excellent food ingredient owing to its high nutritive value and its functional properties. However, the composition of whey varies depending on the composition of the milk, the processing conditions, the processing method, and its whey protein content. The aim of this study was to prepare a whey powder from raw whey and to determine the influence of different processing temperatures (160 and 180 °C) on its physicochemical and functional properties during 180 days of storage, and on whey protein denaturation. Results have shown that temperature significantly (P < 0.05) affects the pH, acidity, non-protein nitrogen (NPN), protein, total soluble solids, fat and lactose contents. Significantly (P < 0.05) higher foaming capacity (FC), foam stability (FS), and whey protein nitrogen index (WPNI), and a lower turbidity and solubility index (SI), were observed in whey powder processed at 160 °C compared to whey powder processed at 180 °C. During the 180 days of storage, slow but progressive changes were noticed in the physicochemical and functional properties of the whey powder. Reverse phase HPLC analysis revealed a significant (P < 0.05) effect of temperature on the whey protein contents. Denaturation of β-lactoglobulin is followed by that of α-lactalbumin, casein glycomacropeptide (CMP/GMP), and bovine serum albumin (BSA).
Keywords: Whey powder, temperature, denaturation, reverse phase – HPLC.
8766 Computing the Loop Bound in Iterative Data Flow Graphs Using Natural Token Flow
Authors: Ali Shatnawi
Abstract:
Signal processing applications which are iterative in nature are best represented by data flow graphs (DFGs). In these applications, the maximum sampling frequency is dependent on the topology of the DFG, the cyclic dependencies in particular. The determination of the iteration bound, which is the reciprocal of the maximum sampling frequency, is critical in the process of hardware implementation of signal processing applications. In this paper, a novel technique to compute the iteration bound is proposed. This technique differs from all previously proposed techniques in the sense that it is based on the natural flow of tokens into the DFG rather than the topology of the graph. The proposed algorithm has lower run-time complexity than all known algorithms. The performance of the proposed algorithm is illustrated through an analytical analysis of the time complexity, as well as through simulation of some benchmark problems.
Keywords: Data flow graph, Iteration period bound, Rate-optimal scheduling, Recursive DSP algorithms.
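For contrast with the paper's token-flow method (not detailed in the abstract), the classical cycle-based definition of the iteration bound, the maximum over all loops of total node computation time divided by the number of delays, can be sketched directly:

```python
# Classical iteration bound: T_inf = max over loops (node times / delay count).
# The paper's token-flow algorithm avoids this explicit cycle enumeration.
import networkx as nx

def iteration_bound(node_time, edges):
    """node_time: {node: computation time}; edges: (u, v, delays) triples."""
    G = nx.DiGraph()
    for u, v, d in edges:
        G.add_edge(u, v, delays=d)
    bound = 0.0
    for cycle in nx.simple_cycles(G):
        t = sum(node_time[n] for n in cycle)
        w = sum(G[u][v]['delays']
                for u, v in zip(cycle, cycle[1:] + cycle[:1]))
        if w > 0:                      # a valid DFG loop has >= 1 delay
            bound = max(bound, t / w)
    return bound

# Example: two-node loop, times 1 and 2, one delay -> iteration bound 3.
# iteration_bound({'a': 1, 'b': 2}, [('a', 'b', 0), ('b', 'a', 1)])
```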
8765 An Efficient Energy Adaptive Hybrid Error Correction Technique for Underwater Wireless Sensor Networks
Authors: Ammar Elyas Babiker, M. Nordin B. Zakaria, Hassan Yosif, Samir B. Ibrahim
Abstract:
Variable channel conditions in underwater networks, and variable distances between sensors due to water currents, lead to a variable bit error rate (BER). This variability in BER has great effects on the energy efficiency of the error correction techniques used. In this paper an efficient energy adaptive hybrid error correction technique (AHECT) is proposed. AHECT adaptively changes the error correction technique from pure retransmission (ARQ) in low-BER cases to a hybrid technique with variable encoding rates (ARQ & FEC) in high-BER cases. An adaptation algorithm is proposed that depends on a precalculated packet acceptance rate (PAR) look-up table, the current BER, the packet size and the error correction technique in use. Based on this adaptation algorithm, a periodic 3-bit feedback is added to the acknowledgment packet to state which error correction technique is suitable for the current channel conditions and distance. Comparative studies were done between this technique and other techniques, and the results show that AHECT is more energy efficient and has a higher probability of success than all those techniques.
Keywords: Underwater communication, wireless sensor networks, error correction technique, energy efficiency.
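A minimal sketch of the switching idea: pure ARQ at low BER, hybrid ARQ+FEC with progressively stronger coding as BER grows. The BER thresholds, coding rates, and PAR values below are hypothetical placeholders (the abstract does not give the actual values), and a fixed threshold rule stands in for the paper's PAR-table-driven decision.

```python
# Placeholder PAR look-ups; in the paper these are precalculated per
# scheme, BER and packet size. Values here are illustrative only.
PAR = {'ARQ': 0.97, 'FEC_3_4': 0.92, 'FEC_1_2': 0.85}

def choose_scheme(ber):
    """Map the current BER to a 3-bit feedback code and a scheme name."""
    if ber < 1e-5:
        return 0b000, 'ARQ'        # low BER: retransmission only
    elif ber < 1e-3:
        return 0b001, 'FEC_3_4'    # moderate BER: hybrid, rate-3/4 FEC
    else:
        return 0b010, 'FEC_1_2'    # high BER: hybrid, rate-1/2 FEC
```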
8764 The Design of a Die for the Processing of Aluminum through Equal Channel Angular Pressing
Authors: P. G. F. Siqueira, N. G. S. Almeida, P. M. A. Stemler, P. R. Cetlin, M. T. P. Aguilar
Abstract:
The processing of metals through Equal Channel Angular Pressing (ECAP) leads to their remarkable strengthening. ECAP dies control the amount of strain imposed on the material through their geometry, especially through the angle between the die channels, and thus the evolution of the microstructural and mechanical properties of the material. The present study describes the design of an ECAP die whose utilization and maintenance are facilitated, and which also controls the eventual undesired flow of the material during processing. The proposed design was validated through numerical simulation procedures using commercial software. The die was manufactured according to the present design and tested. Tests using aluminum alloys indicated that the die is also suitable for the processing of higher strength alloys.
Keywords: ECAP, mechanical design, numerical methods, SPD.
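The strain imposed per pass, which the die geometry controls, is usually quantified with the standard formula from the ECAP literature (Iwahashi et al.), quoted here for context rather than taken from this paper:

```latex
% Equivalent strain after N ECAP passes, with channel angle \Phi and
% outer corner angle \Psi (Iwahashi et al.):
\varepsilon_N = \frac{N}{\sqrt{3}}
  \left[\, 2\cot\!\left(\frac{\Phi + \Psi}{2}\right)
        + \Psi \operatorname{cosec}\!\left(\frac{\Phi + \Psi}{2}\right) \right]
% For \Phi = 90^\circ and \Psi = 0^\circ this gives about 1.15 per pass.
```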
8763 Influence of Model Hydrometeor Form on Probability of Discharge Initiation from Artificial Charged Water Aerosol Cloud
Authors: A. G. Temnikov, O. S. Belova, L. L. Chernensky, T. K. Gerastenok, N. Y. Lysov, A. V. Orlov, D. S. Zhuravkova
Abstract:
Hypotheses of lightning initiation on arrays of large hydrometeors are under consideration. There is no agreement about which form of hydrometeor could be best for lightning initiation from a thundercloud. Artificial charged water aerosol clouds of positive or negative polarity can help investigate the possible influence of hydrometeor form on the peculiarities and probability of lightning discharge initiation between the thundercloud and the ground. Artificial charged aerosol clouds that can create electric field strengths in the range of 5-6 kV/cm to 16-18 kV/cm have been used in the experiments. An array of model hydrometeors of volume or plate form was placed near the bottom cloud boundary. It was established that different kinds of discharge could be initiated in the presence of the model hydrometeor array, from cloud discharges up to diffuse and channel discharges between the charged cloud and the ground. It was found that the form of the model hydrometeors could significantly influence channel discharge initiation from an artificial charged aerosol cloud of negative or positive polarity, respectively. Analysis and generalization of the experimental results have shown that the maximal probability of channel discharge initiation and stimulated propagation was observed for an artificial charged cloud of positive polarity when arrays of model hydrometeors of cylinder-of-revolution form were used. At the same time, for artificial charged clouds of negative polarity, the application of a model hydrometeor array of plate rhombus form provided the maximal probability of channel discharge formation between the charged cloud and the ground. The established influence of the form of the model hydrometeors on channel discharge initiation from the artificial charged water aerosol cloud, and on its subsequent successful propagation, has been related to the different character of positive and negative streamer and volume leader development on the model hydrometeor array near the bottom boundary of the charged cloud. The experimental results show the potentially important role of the form of large hail particles precipitated in a thundercloud in discharge initiation.
Keywords: Cloud and channel discharges, hydrometeor form, lightning initiation, negative and positive artificial charged aerosol cloud.
8762 A New High Speed Neural Model for Fast Character Recognition Using Cross Correlation and Matrix Decomposition
Authors: Hazem M. El-Bakry
Abstract:
Neural processors have shown good results for detecting a certain character in a given input matrix. In this paper, a new idea to speed up the operation of neural processors for character detection is presented. Such processors are designed based on cross correlation in the frequency domain between the input matrix and the weights of the neural networks. This approach is developed to reduce the computation steps required by these faster neural networks for the searching process. The principle of the divide and conquer strategy is applied through image decomposition. Each image is divided into small sub-images, and then each one is tested separately using a single faster neural processor. Furthermore, faster character detection is obtained by using parallel processing techniques to test the resulting sub-images at the same time using the same number of faster neural networks. In contrast to using only faster neural processors, the speed up ratio increases with the size of the input image when using faster neural processors and image decomposition. Moreover, the problem of local sub-image normalization in the frequency domain is solved. The effect of image normalization on the speed up ratio of character detection is discussed. Simulation results show that local sub-image normalization through weight normalization is faster than sub-image normalization in the spatial domain. The overall speed up ratio of the detection process is increased, as the normalization of weights is done off line.
Keywords: Fast Character Detection, Neural Processors, Cross Correlation, Image Normalization, Parallel Processing.
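The core operation, cross-correlation in the frequency domain between the input matrix and the network weights, can be sketched as follows; the sub-image decomposition and parallel testing described above are omitted for brevity, and the mean-subtraction stands in loosely for the weight normalization discussed.

```python
import numpy as np

def correlate_fft(image, template):
    """Cross-correlation scores of a template at every image position."""
    H, W = image.shape
    F_img = np.fft.fft2(image)
    F_tpl = np.fft.fft2(template, s=(H, W))   # zero-pad template to image size
    return np.real(np.fft.ifft2(F_img * np.conj(F_tpl)))

def detect(image, template, thresh):
    tpl = template - template.mean()          # crude normalization stand-in
    scores = correlate_fft(image - image.mean(), tpl)
    return np.argwhere(scores > thresh)       # candidate top-left positions
```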
8761 Modern Vibration Signal Processing Techniques for Vehicle Gearbox Fault Diagnosis
Authors: Mohamed El Morsy, Gabriela Achtenová
Abstract:
This paper presents modern vibration signal processing techniques for vehicle gearbox fault diagnosis via wavelet analysis and the Squared Envelope (SE) technique. Wavelet analysis is regarded as a powerful tool for the detection of sudden changes in non-stationary signals. The Squared Envelope (SE) technique has been extensively used for rolling bearing diagnostics. In the present work, a scheme using the Squared Envelope technique for early detection of gear tooth pitting is presented. The pitting defect is manufactured on the tooth side of a fifth speed gear on the intermediate shaft of a vehicle gearbox. The objective is to supplement the current techniques of gearbox fault diagnosis based on the raw vibration and order-tracked signals. The test stand is equipped with three dynamometers; the input dynamometer serves as the internal combustion engine, and the output dynamometers introduce the load on the flanges of the output joint shafts. The gearbox used for the experimental measurements is of the type most commonly used in modern small to mid-sized passenger cars with transversely mounted powertrain and front wheel drive: a five-speed gearbox with final drive gear and front wheel differential. The results show that the proposed methods are effective for detecting and diagnosing localized gear faults at an early stage under different operating conditions, and are more sensitive and robust than current gear diagnostic techniques.
Keywords: Wavelet analysis, Squared Envelope, gear faults.
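A minimal sketch of the squared envelope spectrum via the analytic (Hilbert) signal, as commonly applied in envelope-based diagnostics; the band-pass pre-filter and its corner frequencies are assumed preprocessing choices, not the paper's values.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def squared_envelope_spectrum(x, fs, band=(1000.0, 3000.0)):
    """x: vibration signal, fs: sampling rate in Hz. Fault frequencies
    (and their harmonics) appear as peaks in the returned spectrum."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype='band')
    xf = filtfilt(b, a, x)                     # isolate the resonance band
    env2 = np.abs(hilbert(xf)) ** 2            # squared envelope
    env2 -= env2.mean()                        # drop the DC component
    spec = np.abs(np.fft.rfft(env2)) / len(env2)
    freqs = np.fft.rfftfreq(len(env2), 1 / fs)
    return freqs, spec
```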
8760 An Improved Preprocessing for Biosonar Target Classification
Authors: Turgay Temel, John Hallam
Abstract:
An improved processing scheme for biosonar signal processing in a cochlea model is proposed and examined. It is compared to conventional models using a modified discriminant analysis, and both are tested. Their performances are evaluated with echo data captured from natural targets (trees). Results indicate that the phase characteristics of the low-pass filters employed in the echo processing have a significant effect on class separability for these data.
Keywords: Cochlea model, discriminant analysis, neuro-spike coding, classification.