Search results for: LCL filter
264 Reducing Environmental Impact of Olive Oil Production in Sakaka City Using Combined Chemical, Physical, and Biological Treatment
Authors: Abdullah Alhajoj, Bassam Alowaiesh
Abstract:
This work aims to reduce the risks of discharging olive mill waste directly into the environment without treatment in Sakaka City, KSA. The organic loads, expressed by chemical oxygen demand (COD) and biological oxygen demand (BOD), of the produced wastewater (OMWW) as well as the solid waste (OMW) were evaluated. The wastes emitted from the three-phase centrifuge decanters were found to be higher than those emitted from the two-phase centrifuge decanters. The olive mill wastewater (OMWW) was treated using advanced oxidation combined with filtration treatment. The results indicated that the concentrations of COD, BOD, TSS, oil and grease, and phenol were reduced by complex sand filtration from 72150, 21660, 10256, 36430, and 1470 mg/l to 980, 421, 58, 68, and 0.35 mg/l for three-phase OMWW, and from 150562, 17955, 15325, 19658, and 2153 mg/l to 1050, 501, 29, 0.75, and 0.29 mg/l, respectively. Using a modified trickling filter (packed with the necks of waste plastic bottles), the concentrations of the previously mentioned parameters were reduced to 1190, 570, 55, 0.85, and 0.3 mg/l, respectively. This work supports the application of such treatment techniques for reducing the environmental threats of olive mill waste effluents in Saudi Arabia.
Keywords: two-phase, three-phase, olive mill, olive oil, waste treatment, filtration, advanced oxidation, waste plastic bottles
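The percentage removal implied by these figures can be checked with a one-line calculation; a minimal sketch in Python, using the reported three-phase COD and BOD values:

```python
def removal_efficiency(c_in, c_out):
    """Percent removal of a pollutant: (C_in - C_out) / C_in * 100."""
    return (c_in - c_out) / c_in * 100.0

# Reported three-phase OMWW values (mg/l) before/after complex sand filtration
cod = removal_efficiency(72150, 980)   # COD removal, about 98.6%
bod = removal_efficiency(21660, 421)   # BOD removal, about 98.1%
print(round(cod, 1), round(bod, 1))
```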
Procedia PDF Downloads 153
263 Treatment of Grey Water from Different Restaurants in FUTA Using Fungi
Authors: F. A. Ogundolie, F. Okogue, D. V. Adegunloye
Abstract:
Greywater samples were obtained from three restaurants in the Federal University of Technology, Akure, coded SSR, MGR, and GGR. Fungal isolates obtained include Rhizopus stolonifer, Aspergillus niger, Mucor mucedo, Aspergillus flavus, and Saccharomyces cerevisiae. Of these isolates, R. stolonifer, A. niger, and A. flavus showed significant degradation ability on greywater and were used for this research. A simple bioreactor was constructed using a biodegradation process for the purification of the wastewater samples. The wastewater underwent primary treatment; secondary treatment involved the introduction of the isolated organisms into the wastewater sample; and tertiary treatment involved the use of a filter candle and a sand bed filtration process to achieve the end product without the use of chemicals. A. niger brought about significant reductions in both the bacterial load and the fungal load of the greywater samples of the three respective restaurants, with reductions of 1.29 × 10^8 to 1.57 × 10^2 cfu/ml, 1.04 × 10^8 to 1.12 × 10^2 cfu/ml, and 1.72 × 10^8 to 1.60 × 10^2 cfu/ml in bacterial load for SSR, MGR, and GGR, respectively, and reductions of 2.01 × 10^4 to 1.2 × 10^1, 1.72 × 10^4 to 1.1 × 10^1, and 2.50 × 10^4 to 1.5 × 10^1 in fungal load for SSR, MGR, and GGR, respectively. These degradation results showed that A. niger was probably more potent in the degradation of organic matter and hence could be used in the treatment of wastewater.
Keywords: Aspergillus niger, greywater, bacteria, fungi, microbial load, bioreactor, biodegradation, purification, organic matter, filtration
262 Application of the Global Optimization Techniques to the Optical Thin Film Design
Authors: D. Li
Abstract:
Optical thin films are used in a wide variety of optical components, and many software tools have been developed for multilayer thin film design. The available software packages for designing thin film structures may not provide optimum designs. Normally, almost all current software programs obtain their final designs either by optimizing a starting guess or by a technique, which may or may not involve a pseudorandom process, that gives different answers every time, depending upon the initial conditions. With the increasing power of personal computers, functional methods for the optimization and synthesis of optical multilayer systems have been developed, such as DGL Optimization, Simulated Annealing, Genetic Algorithms, Needle Optimization, Inductive Optimization, and Flip-Flop Optimization. Among these, DGL Optimization has proved its efficiency in optical thin film design. The application of the DGL optimization technique to the design of optical coatings is presented. A DGL optimization technique is provided, and its main features are discussed. Guidelines on the application of the DGL optimization technique to various types of design problems are given. The innovative global optimization strategies used in a software tool, OnlyFilm, to optimize multilayer thin film designs through different filter designs are outlined. OnlyFilm is a powerful, versatile, and user-friendly thin film software package on the market, combining optimization and synthesis design capabilities with powerful analytical tools for optical thin film designers. It is also the only thin film design software that offers a true global optimization function.
Keywords: optical coatings, optimization, design software, thin film design
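Simulated Annealing, one of the global methods listed above, can be illustrated on a toy one-dimensional objective. The sketch below is a generic textbook version; the objective function, step size, and cooling schedule are illustrative assumptions, not OnlyFilm's DGL algorithm:

```python
import math, random

def simulated_annealing(f, x0, t0=1.0, cooling=0.995, steps=5000, seed=1):
    """Minimize f: accept worse moves with probability exp(-delta/T),
    so the search can escape local minima while T is still high."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(steps):
        cand = x + rng.uniform(-0.1, 0.1)   # small random perturbation
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                         # geometric cooling schedule
    return best_x, best_f

# Toy multimodal objective standing in for a thin-film merit function
f = lambda x: (x - 2.0) ** 2 + 0.5 * math.sin(5 * x)
x, fx = simulated_annealing(f, x0=-3.0)
```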
261 An Enhanced SAR-Based Tsunami Detection System
Authors: Jean-Pierre Dubois, Jihad S. Daba, H. Karam, J. Abdallah
Abstract:
Tsunami early detection and warning systems have proved to be of utmost importance, especially after the destructive tsunami that hit Japan in March 2011. Such systems are crucial to inform the authorities of any risk of a tsunami and of the degree of its danger in order to make the right decisions and notify the public of the actions they need to take to save their lives. The purpose of this research is to enhance existing tsunami detection and warning systems. We first propose an automated and miniaturized model of an early tsunami detection and warning system. The operation of the tsunami warning system is simulated using the data acquisition toolbox of Matlab, with measurements acquired from specified internet pages due to the lack of the required real-life seismic and hydrologic sensors, and a graphical user interface is built for the system. In the second phase of this work, we implement various satellite image filtering schemes to enhance the acquired synthetic aperture radar images of the tsunami-affected region, which are masked by speckle noise. This enables us to conduct a post-tsunami damage extent study and calculate the percentage damage. We conclude by proposing improvements to the telecommunication infrastructure of existing tsunami warning systems through a migration to IP-based networks and fiber optic links.
Keywords: detection, GIS, GSN, GTS, GPS, speckle noise, synthetic aperture radar, tsunami, Wiener filter
260 Cheese Production at Low Temperatures Using Probiotic L. casei ATCC 393 and Rennin Enzyme Entrapped in Tubular Cellulose
Authors: Eleftheria Barouni, Antonia Terpou, Maria Kanellaki, Argyro Bekatorou, Athanasios A. Koutinas
Abstract:
The aim of the present work was to evaluate the production of cheese using a composite filter of tubular cellulose (TC) with (a) entrapped rennin enzyme and (b) immobilized L. casei together with the entrapped enzyme. Tubular cellulose was prepared from sawdust after lignin removal with 1% NaOH. The biocatalysts were thermally dried at 38 °C and used for milk coagulation. The effect of temperature (5, 20, and 37 °C) on the pH kinetics of milk coagulation was examined for the first dried biocatalyst. The optimum temperature (37 °C) of the first biocatalyst was used for milk coagulation with the second biocatalyst, prepared by entrapment of both rennin enzyme and probiotic lactic acid bacteria in order to introduce a sour taste to the cheeses. This co-biocatalyst was used for milk coagulation, and samples were studied regarding its effect on lactic acid formation and the correlation with taste test results in the cheeses. For both biocatalysts, samples were analyzed for total acidity and lactic acid formation by HPLC. The quality of the produced cheeses was examined through the determination of volatile compounds by SPME GC/MS analysis. Preliminary taste tests and microbiological analyses were performed, and the results encourage further research regarding scale-up.
Keywords: tubular cellulose, Lactobacillus casei, rennin enzyme, cheese production
259 Plasma Treatment of Poppy and Flax Seeds in Fluidized Bed Reactor
Authors: Jakub Perner, Jindrich Matousek, Hana Malinska
Abstract:
Adverse environmental conditions at planting (especially water shortage) can lead to a reduced germination rate of seeds. Plasma treatment is one of the possibilities that can address this problem. Such treatment can increase the germination rate of seeds and make sprouts grow faster due to increased wettability of the seed surface or a disrupted seed coat. This can enhance oxygen and water transport into the seed and improve germination. Poppy and flax seeds were treated in a fluidized bed reactor, with discharge power ranging from 10 to 40 W. The working gas was air at a pressure of 100 Pa. Poppy seeds were then planted in Petri dishes on 7 layers of filter paper saturated with water, and the number of germinated seeds was observed from 3 to 6 days after planting. Every plasma-treated sample showed an improved germination rate compared to untreated seeds (75.5%) six days after planting. Samples treated in the 40 W discharge had the highest germination rate (81.2%). The contact angle of water on treated poppy seeds decreased from 85° (untreated) to 30-35° (treated). Untreated flax seeds have a germination rate over 98%; therefore, the weight of the seeds was taken as a measure of successful germination. Treated flax seeds had a slightly higher weight than untreated ones. Also, the contact angle of water decreased from 99° (untreated) to 65-73° (treated); therefore, the treatment of both species is considered successful.
Keywords: flax, germination, plasma treatment, poppy
258 Diagnosis and Analysis of Automated Liver and Tumor Segmentation on CT
Authors: R. R. Ramsheeja, R. Sreeraj
Abstract:
A wide range of medical imaging modalities is available nowadays for viewing the internal structures of the human body, such as the liver, brain, and kidneys. Computed Tomography (CT) is one of the most significant of these modalities. In this paper, CT liver images are used to study automatic computer-aided techniques for calculating the volume of a liver tumor. A segmentation method for detecting the tumor from the CT scan is proposed: a Gaussian filter is used for denoising the liver image, and an adaptive thresholding algorithm is used for segmentation. A multiple region-of-interest (ROI) based method may help to characterize different features and has a significant impact on classification performance. Due to the characteristics of liver tumor lesions, feature selection presents inherent difficulties. For better performance, a novel system is introduced in which multiple-ROI-based feature selection and classification are performed. Obtaining relevant features for the Support Vector Machine (SVM) classifier is important for good generalization performance. The proposed system helps to improve classification performance while significantly reducing the number of features used. The diagnosis of liver cancer from computed tomography images is very difficult in nature, and early detection of liver tumors is very helpful for saving human life.
Keywords: computed tomography (CT), multiple region of interest (ROI), feature values, segmentation, SVM classification
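The denoise-then-segment pipeline described (Gaussian filtering followed by adaptive thresholding) can be sketched on a synthetic image. The mean-based local threshold below is one common variant, chosen as an assumption since the abstract does not specify the exact thresholding rule:

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def smooth(image, sigma=1.0):
    """Separable Gaussian blur via 1D convolutions along rows then columns."""
    k = gaussian_kernel1d(sigma, radius=int(3 * sigma))
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, image)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)

def adaptive_threshold(image, block=5, offset=0.0):
    """Mark pixels brighter than the mean of their (block x block) neighborhood
    by more than `offset` (the offset suppresses spurious border detections)."""
    pad = block // 2
    padded = np.pad(image, pad, mode="edge")
    mask = np.zeros(image.shape, dtype=bool)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            local = padded[i:i + block, j:j + block]
            mask[i, j] = image[i, j] > local.mean() + offset
    return mask

# Synthetic "lesion": a bright square on a darker background
img = np.zeros((32, 32)) + 10.0
img[12:20, 12:20] = 100.0
seg = adaptive_threshold(smooth(img, sigma=1.0), block=11, offset=5.0)
```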
257 Brain Tumor Segmentation Based on Minimum Spanning Tree
Authors: Simeon Mayala, Ida Herdlevær, Jonas Bull Haugsøen, Shamundeeswari Anandan, Sonia Gavasso, Morten Brun
Abstract:
In this paper, we propose a minimum spanning tree-based method for segmenting brain tumors. The proposed method performs interactive segmentation based on the minimum spanning tree without tuning parameters. The steps involve preprocessing, building a graph, constructing a minimum spanning tree, and a newly implemented way of interactively segmenting the region of interest. In the preprocessing step, a Gaussian filter is applied to the 2D images to remove noise. Then, the pixel neighbor graph is weighted by intensity differences, and the corresponding minimum spanning tree is constructed. The image is loaded in an interactive window for segmenting the tumor. The region of interest and the background are selected by clicking to split the minimum spanning tree into two trees: one representing the region of interest and the other representing the background. Finally, the segmentation given by the two trees is visualized. The proposed method was tested by segmenting two different 2D brain T1-weighted magnetic resonance image data sets. The comparison between our results and the gold-standard segmentation confirmed the validity of the minimum spanning tree approach. The proposed method is simple to implement, and the results indicate that it is accurate and efficient.
Keywords: brain tumor, brain tumor segmentation, minimum spanning tree, segmentation, image processing
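The core idea, weighting a pixel-neighbor graph by intensity differences, building its minimum spanning tree, and splitting it into a foreground and a background tree from two seed clicks, can be sketched as follows. Kruskal's algorithm and the rule of cutting the heaviest edge on the seed-to-seed path are standard choices assumed here, since the abstract does not spell out the splitting rule:

```python
from collections import defaultdict, deque

def mst_segment(image, seed_fg, seed_bg):
    """Segment a 2D intensity grid: build the minimum spanning tree of the
    4-neighbor pixel graph (edge weight = intensity difference), then remove
    the heaviest edge on the MST path between the two seed pixels."""
    h, w = len(image), len(image[0])
    nodes = [(i, j) for i in range(h) for j in range(w)]
    edges = []
    for i, j in nodes:
        for ni, nj in ((i + 1, j), (i, j + 1)):
            if ni < h and nj < w:
                edges.append((abs(image[i][j] - image[ni][nj]), (i, j), (ni, nj)))
    # Kruskal's algorithm with union-find (path halving)
    parent = {n: n for n in nodes}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = defaultdict(list)
    for wgt, a, b in sorted(edges):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
            tree[a].append((b, wgt))
            tree[b].append((a, wgt))
    # BFS from the foreground seed to find the MST path to the background seed
    prev = {seed_fg: None}
    q = deque([seed_fg])
    while q:
        u = q.popleft()
        for v, wgt in tree[u]:
            if v not in prev:
                prev[v] = (u, wgt)
                q.append(v)
    path = []
    v = seed_bg
    while prev[v] is not None:
        u, wgt = prev[v]
        path.append((wgt, u, v))
        v = u
    # Cut the heaviest path edge, then flood-fill the foreground tree
    _, cu, cv = max(path)
    tree[cu] = [(n, wg) for n, wg in tree[cu] if n != cv]
    tree[cv] = [(n, wg) for n, wg in tree[cv] if n != cu]
    label, q = {seed_fg}, deque([seed_fg])
    while q:
        u = q.popleft()
        for v, _ in tree[u]:
            if v not in label:
                label.add(v)
                q.append(v)
    return label

img = [[0, 0, 0, 9, 9],
       [0, 0, 0, 9, 9],
       [0, 0, 0, 9, 9]]
fg = mst_segment(img, seed_fg=(0, 4), seed_bg=(0, 0))  # bright region
```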
256 Real-World PM, PN and NOx Emission Differences among DOC+CDPF Retrofit Diesel-, Diesel-, and Natural Gas-Fueled Buses
Authors: Zhiwen Yang, Jingyuan Li, Zhenkai Xie, Jian Ling, Jiguang Wang, Mengliang Li
Abstract:
To reflect the effects of different emission control strategies, such as retrofitting the after-treatment system or replacing diesel with natural gas-fueled vehicles, on the particle number (PN), particle mass (PM), and nitrogen oxides (NOx) emissions of urban buses, a portable emission measurement system (PEMS) was employed herein to conduct real-world driving emission measurements on a China IV diesel bus retrofitted with a diesel oxidation catalytic converter (DOC) and a catalyzed diesel particulate filter (CDPF), a China IV diesel bus, and a China V natural gas bus. The results show that both tested diesel buses possess marked advantages in NOx emission control when compared to the lean-burn natural gas bus, which was equipped without any NOx after-treatment system. As to PN and PM, only the DOC+CDPF retrofitted diesel bus exhibits substantial emission control benefits relative to the natural gas bus, and especially to the normal diesel bus. Meanwhile, the differences in PM and PN emissions between the retrofitted and normal diesel buses generally increase with increasing vehicle-specific power (VSP). Furthermore, the differences in PM emissions, especially those in the higher VSP ranges, are more significant than those in PN. In addition, the peak PN particle size of the retrofitted diesel bus (32 nm) was significantly lower than that of the normal diesel bus (100 nm). These phenomena indicate that the CDPF retrofit can effectively reduce diesel bus exhaust particle emissions, especially those with large particle sizes.
Keywords: CDPF, diesel, natural gas, real-world emissions
255 A 1.57 GHz Mixer Design for a GPS Receiver
Authors: Hamd Ahmed
Abstract:
During the Persian Gulf War in 1991, the coalition forces were surprised when they were shot at by friendly forces in the Iraqi desert. It was obvious that they had been misled due to the lack of proper guidance technology, resulting in unnecessary loss of life and bloodshed. This unforeseen incident, along with many others, led the US Department of Defense to open up GPS. In the very beginning, this technology was for military use, but now it is widely used and increasingly popular among the public due to its high accuracy and immeasurable significance. The GPS system consists of three segments: the space segment (the satellites), the control segment (ground control), and the user segment (the receiver). This project is about designing a 1.57 GHz mixer for a triple-conversion GPS receiver. The GPS front-end is based on the superheterodyne receiver, which improves selectivity and image-frequency rejection; the main principle of the superheterodyne receiver, however, depends on the mixer. Many different types of mixers (single-balanced, single-ended, double-balanced) can be used with a GPS receiver, depending on the required specifications. This research project provides an overview of the GPS system and details of the basic architecture of the GPS receiver. The basic emphasis of this report is on investigating the general concept of the mixer circuit and some terms related to the mixer along with their definitions, and on presenting the types of mixers; it then gives some advantages of using a single-balanced mixer and its applications. The focus of this report is on how to design a mixer for a GPS receiver and on discussing the simulation results.
Keywords: GPS, RF filter, heterodyne, mixer
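The frequency-translation principle that every mixer type above relies on, multiplying the RF signal by the local oscillator to produce components at the sum and difference frequencies, can be demonstrated numerically. The frequencies below are scaled down for illustration; this is not the paper's circuit:

```python
import numpy as np

# Ideal multiplying mixer: sin(2*pi*f_rf*t) * sin(2*pi*f_lo*t) contains, by the
# product-to-sum identity, components at f_rf - f_lo (the IF) and f_rf + f_lo.
fs, n = 1000.0, 1000            # sample rate (Hz) and number of samples
f_rf, f_lo = 157.0, 140.0       # scaled-down stand-ins for RF and LO
t = np.arange(n) / fs
mixed = np.sin(2 * np.pi * f_rf * t) * np.sin(2 * np.pi * f_lo * t)

spectrum = np.abs(np.fft.rfft(mixed))
peaks = np.sort(np.argsort(spectrum)[-2:] * fs / n)  # two strongest bins (Hz)
```

With 1 Hz bin resolution, the two dominant bins land exactly at 17 Hz (the difference, i.e. the IF) and 297 Hz (the sum).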
254 Tensor Deep Stacking Neural Networks and Bilinear Mapping Based Speech Emotion Classification Using Facial Electromyography
Authors: P. S. Jagadeesh Kumar, Yang Yung, Wenli Hu
Abstract:
Speech emotion classification is a dominant research field aimed at finding a robust and fast classifier appropriate for different real-life applications. This effort concentrates on classifying different emotions from speech signals based on features related to pitch, formants, energy contours, jitter, shimmer, and spectral, perceptual, and temporal characteristics. Tensor deep stacking neural networks were used to examine the factors that influence the classification success rate. Facial electromyography signals were collected under several forms of focus in a controlled atmosphere by means of audio-visual stimuli. The facial electromyography signals were pre-processed using a moving average filter, and a set of arithmetical features was extracted. The extracted features were mapped onto consistent emotions using bilinear mapping. With facial electromyography signals, a database comprising diverse emotions can be built with suitable fine-tuning of features and training data. A success rate of 92% can be attained without increasing the system complexity or the computation time for classifying diverse emotional states.
Keywords: speech emotion classification, tensor deep stacking neural networks, facial electromyography, bilinear mapping, audio-visual stimuli
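The moving average pre-processing step mentioned above can be sketched in a few lines; the window length is an illustrative assumption:

```python
def moving_average(signal, window=3):
    """Smooth a 1D signal by averaging each sample with its neighbors:
    a simple FIR low-pass commonly used to pre-process EMG recordings.
    Edge samples average over the in-bounds part of the window."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

noisy = [1.0, 9.0, 1.0, 9.0, 1.0]
smoothed = moving_average(noisy)   # high-frequency alternation is attenuated
```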
253 Causes Analysis of Vacuum Consolidation Failure to Soft Foundation Filled by Newly Dredged Mud
Authors: Bao Shu-Feng, Lou Yan, Dong Zhi-Liang, Mo Hai-Hong, Chen Ping-Shan
Abstract:
For soft foundations filled with newly dredged mud and improved by the Vacuum Preloading Technology (VPT), the soil strength increased only a little, the effectively improved depth was small, and the ground bearing capacity remained low. To analyze the causes in depth, several comparative single-well model experiments of VPT were conducted in the laboratory. It was concluded that: (1) serious clogging and poor drainage performance in the vertical drains were mainly caused by the high content of fine soil particles and strongly hydrophilic minerals in the dredged mud, by the too-fast loading rate at the early stage of vacuum preloading (namely, rapidly reaching -80 kPa), and by the too-small characteristic opening size of the filter of the existing vertical drains; (2) the drainage efficiency of the drainage system was commonly reduced, which in turn weakened the vacuum pressure in the soil and the soil improvement effect, by the greater partial loss and friction loss of vacuum pressure caused by the larger curvature of the vertical drains and the larger transfer resistance of vacuum pressure in the horizontal drain.
Keywords: newly dredged mud, single-well model experiments of vacuum preloading technology, poor drainage performance of vertical drains, poor soil improvement effect, causes analysis
252 Hydrogen Sulfide Removal from Biogas Using Biofilm on Packed Bed of Salak Fruit Seeds
Authors: Retno A. S. Lestari, Wahyudi B. Sediawan, Siti Syamsiah, Sarto
Abstract:
Sulfur-oxidizing bacteria were isolated and then grown on snake fruit (salak) seeds, forming a biofilm, and their performance in sulfide removal was experimentally observed. The seeds were used as packing material in a cylindrical tube, and biological treatment of hydrogen sulfide from biogas was investigated using the biofilm on this packed bed. Biogas containing 27,9512 ppm of hydrogen sulfide was passed through the bed, and the hydrogen sulfide concentrations in the outlet at various times were analyzed. A set of simple kinetic models for the rate of sulfide removal and bacterial growth is proposed. The axial sulfide concentration gradient in the flowing liquid is assumed to be at steady state. Meanwhile, the biofilm grows on the surface of the seeds, and the oxidation takes place in the biofilm; since the biofilm is very thin, the sulfide concentration in the biofilm is assumed to be uniform. The simultaneous ordinary differential equations obtained were then solved numerically using the Runge-Kutta method. The accuracy of the proposed model was tested by comparing the calculation results with the experimental data obtained. It turned out that the proposed model can describe the removal of sulfide using a biofilter in a packed bed, and the values of the parameters were obtained by curve fitting. The biofilter could remove 89.83% of the inlet hydrogen sulfide from biogas for 2.5 h, at an optimum loading of 8.33 ml/h.
Keywords: sulfur-oxidizing bacteria, snake fruit seeds, biofilm, packing material, biogas
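The numerical scheme named above, the classical fourth-order Runge-Kutta method, can be sketched on a toy two-variable system. The kinetics below are a hypothetical stand-in (sulfide consumed in proportion to sulfide and biomass, with a yield coefficient), not the paper's fitted model:

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for the system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Hypothetical toy kinetics (NOT the paper's model): sulfide S is consumed at
# rate k_s*S*X, and biomass X grows on it with yield y_x.
k_s, y_x = 0.5, 0.1
def kinetics(t, y):
    s, x = y
    return [-k_s * s * x, y_x * k_s * s * x]

t, h, y = 0.0, 0.01, [1.0, 1.0]   # initial sulfide and biomass
while t < 5.0:
    y = rk4_step(kinetics, t, y, h)
    t += h
```

Because RK4 steps are linear combinations of the stage derivatives, the linear invariant X + y_x*S is conserved to machine precision, which is a handy sanity check on the implementation.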
251 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences
Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng
Abstract:
Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during the interaction, a Kalman filter was used to retain a complete trajectory for each human object. Finally, motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved classification accuracies of 80% for interaction events and 95% for non-interaction events, respectively. In summary, we have explored the idea of a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated in an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatching, fighting, etc.).
Keywords: motion detection, motion tracking, trajectory analysis, video surveillance
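The occlusion handling described, carrying a trajectory through frames with no detection by running only the Kalman predict step, can be sketched with a one-dimensional constant-velocity filter. The process and measurement noise values are illustrative assumptions:

```python
def kalman_1d(zs, q=1e-3, r=1.0, dt=1.0):
    """Constant-velocity Kalman filter tracking position from noisy
    measurements. When a measurement is None (occlusion), only the predict
    step runs, so the trajectory is carried forward by the velocity estimate."""
    x, v = zs[0], 0.0                   # state: position and velocity
    p = [[1.0, 0.0], [0.0, 1.0]]        # 2x2 state covariance
    track = []
    for z in zs:
        # Predict: x' = x + dt*v, P' = F P F^T + Q with F = [[1, dt], [0, 1]]
        x, v = x + dt * v, v
        p = [[p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + q,
              p[0][1] + dt * p[1][1]],
             [p[1][0] + dt * p[1][1], p[1][1] + q]]
        # Update with H = [1, 0]; skipped while the object is occluded
        if z is not None:
            s = p[0][0] + r
            kx, kv = p[0][0] / s, p[1][0] / s
            innov = z - x
            x, v = x + kx * innov, v + kv * innov
            p = [[(1 - kx) * p[0][0], (1 - kx) * p[0][1]],
                 [p[1][0] - kv * p[0][0], p[1][1] - kv * p[0][1]]]
        track.append(x)
    return track

# Object moving at roughly 1 unit/frame, briefly occluded at frames 5-6
zs = [0.0, 1.1, 1.9, 3.0, 4.1, None, None, 7.0, 8.1, 9.0]
track = kalman_1d(zs)   # complete trajectory despite the missing detections
```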
250 Segmentation of the Liver and Spleen From Abdominal CT Images Using Watershed Approach
Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid
Abstract:
Segmentation is an important step in the processing and interpretation of medical images. In this paper, we focus on the segmentation of the liver and spleen from abdominal computed tomography (CT) images. The importance of our study comes from the fact that the segmentation of regions of interest (ROIs) from CT images is usually a difficult task: the gray level of the ROI is similar to that of other organs, and the ROI is connected to the ribs, heart, kidneys, etc. Our proposed method is based on anatomical information and mathematical morphology tools used in the image processing field. First, we remove the surrounding connected organs and tissues by applying morphological filters; this step makes the extraction of the regions of interest easier. The second step consists of improving the quality of the image gradient; here we propose a method that reduces gradient deficiencies by applying spatial filters followed by morphological filters. Thereafter, we proceed to the segmentation of the liver and spleen. To validate the proposed segmentation technique, we have tested it on several images. Our segmentation approach is evaluated by comparing our results with the manual segmentation performed by an expert, and the experimental results are described in the last part of this work. The system has been evaluated by computing the sensitivity and specificity between the semi-automatically segmented (liver and spleen) contour and the contour traced manually by radiological experts.
Keywords: CT images, liver and spleen segmentation, anisotropic diffusion filter, morphological filters, watershed algorithm
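The morphological filtering used in the first step can be illustrated with a minimal binary opening (erosion followed by dilation), which removes structures smaller than the structuring element; the 4-neighbor cross element below is an assumption:

```python
def binary_dilate(img):
    """Set a pixel if it or any 4-neighbor is set (cross structuring element)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            for di, dj in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w and img[ni][nj]:
                    out[i][j] = 1
    return out

def binary_erode(img):
    """Keep a pixel only if it and all in-bounds 4-neighbors are set."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    nbrs = ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1))
    for i in range(h):
        for j in range(w):
            out[i][j] = int(all(
                img[i + di][j + dj]
                for di, dj in nbrs
                if 0 <= i + di < h and 0 <= j + dj < w))
    return out

def morphological_open(img):
    """Opening = erosion then dilation: removes specks smaller than the element."""
    return binary_dilate(binary_erode(img))

# A 3x3 blob plus an isolated speck; opening removes the speck, keeps the blob core
img = [[0, 0, 0, 0, 0, 0],
       [0, 1, 1, 1, 0, 0],
       [0, 1, 1, 1, 0, 0],
       [0, 1, 1, 1, 0, 0],
       [0, 0, 0, 0, 0, 1]]
opened = morphological_open(img)
```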
249 An Energy Efficient Spectrum Shaping Scheme for Substrate Integrated Waveguides Based on Spread Reshaping Code
Authors: Yu Zhao, Rainer Gruenheid, Gerhard Bauch
Abstract:
In the microwave and millimeter-wave transmission region, the substrate-integrated waveguide (SIW) is a very promising candidate for the development of circuits and components, facilitating transmission at data rates in excess of 200 Gbit/s. An SIW mimics a rectangular waveguide by approximating the closed sidewalls with a via fence. This structure suppresses the low-frequency components and makes the channel of the SIW a bandpass or high-pass filter. This channel characteristic impedes conventional baseband transmission using the non-return-to-zero (NRZ) pulse shaping scheme. Therefore, mixers are commonly proposed as carrier modulators and demodulators in order to facilitate passband transmission. However, carrier modulation is not an energy-efficient solution, because modulation and demodulation at high frequencies consume a lot of energy. For the first time to our knowledge, this paper proposes a spectrum shaping scheme of low complexity for the SIW channel, namely the spread reshaping code. It aims at matching the spectrum of the transmit signal to the channel frequency response, facilitating transmission through the SIW channel while avoiding carrier modulation; in some cases, it does not even need equalization. Simulations reveal a good performance of this scheme: eye opening is achieved without any equalization or modulation for the respective transmission channels.
Keywords: bandpass channel, eye-opening, switching frequency, substrate-integrated waveguide, spectrum shaping scheme, spread reshaping code
248 Screening and Optimization of Pretreatments for Rice Straw and Their Utilization for Bioethanol Production Using Developed Yeast Strain
Authors: Ganesh Dattatraya Saratale, Min Kyu Oh
Abstract:
Rice straw is one of the most abundant lignocellulosic waste materials, with an annual world production of about 731 Mt. This study addresses the effective utilization of this waste biomass for biofuel production. We present a comparative assessment of numerous pretreatment strategies for rice straw, comprising the major physical, chemical, and physicochemical methods. Among the different methods employed, alkaline pretreatment in combination with sodium chlorite/acetic acid delignification was found to be an efficient pretreatment, with significant improvement in the enzymatic digestibility of rice straw. A cellulase dose of 20 filter paper units (FPU) released a maximum of 63.21 g/L of reducing sugar from rice straw, with a 94.45% hydrolysis yield and a 64.64% glucose yield. The effects of the different pretreatment methods on biomass structure and complexity were investigated by FTIR, XRD, and SEM analytical techniques. Finally, the enzymatic hydrolysate of rice straw was used for ethanol production with the developed Saccharomyces cerevisiae SR8 strain, which enabled efficient fermentation of xylose and glucose and gave higher ethanol production. The development of bioethanol production from lignocellulosic waste biomass is thus a generic, applicable methodology and has great implications for using 'green raw materials' and producing 'green products', much needed today.
Keywords: rice straw, pretreatment, enzymatic hydrolysis, FPU, Saccharomyces cerevisiae SR8, ethanol fermentation
247 Performance Analysis of 5G for Low Latency Transmission Based on Universal Filtered Multi-Carrier Technique and Interleave Division Multiple Access
Authors: A. Asgharzadeh, M. Maroufi
Abstract:
The 5G mobile communication system has drawn more and more attention. The 5G system needs to provide three different types of services: enhanced mobile broadband (eMBB), massive machine-type communication (mMTC), and ultra-reliable and low-latency communication (URLLC). Universal Filtered Multi-Carrier (UFMC), Filter Bank Multicarrier (FBMC), and filtered Orthogonal Frequency Division Multiplexing (f-OFDM) are well-known candidate waveforms for the coming 5G system. Machine-to-machine (M2M) communications are one of the essential applications in 5G and involve the exchange of concise messages with very short latency. In UFMC systems, the subcarriers are grouped into subbands, whereas in f-OFDM a single subband covers the entire band; in FBMC, a subband includes only one subcarrier, so the number of subbands equals the number of subcarriers. This paper mainly discusses the performance of UFMC with different parameters for the UFMC system. The paper also shows that UFMC is the best choice, outperforming OFDM in all cases and FBMC in the case of very short packets, while performing similarly for long sequences with channel estimation techniques for Interleave Division Multiple Access (IDMA) systems.
Keywords: universal filtered multi-carrier technique, UFMC, interleave division multiple access, IDMA, fifth generation, subband
246 Encapsulated Rennin Enzyme in Nano and Micro Tubular Cellulose/Starch Gel Composite for Milk Coagulation
Authors: Eleftheria Barouni, Theano Petsi, Argyro Bekatorou, Dionysos Kolliopoulos, Dimitrios Vasileiou, Panayiotis Panas, Maria Kanellaki, Athanasios A. Koutinas
Abstract:
The aim of the present work was the production and use of a composite filter (TC/starch) containing rennin enzyme, in a continuous system and in successive fermentation batches (SFB), for milk coagulation, in order to compare the operational stability of both systems and the cheese production cost. Tubular cellulose (TC) was produced after removal of lignin from lignocellulosic biomass using several procedures, e.g., alkaline treatment [1], and starch gel was added for the reduction of the TC tube dimensions to the micro- and nano-range [2]. Four immobilized biocatalysts were prepared using different ways of enzyme entrapment: (1) TC/rennin (rennin entrapped in the tubes of TC), (2) TC/SG-rennin (rennin entrapped in the tubes of the composite), (3) TC-SG/rennin (rennin entrapped in the layer of starch gel), and (4) TC/rennin-SG/rennin (rennin entrapped both in the tubes of the TC and in the layer of starch gel). First, these immobilized biocatalysts were examined in ten SFB regarding the coagulation time and their activity. All the above immobilized biocatalysts remained active, and the coagulation time ranged from 90 to 480, 120 to 480, 330 to 510, and 270 to 540 min for (1), (2), (3), and (4), respectively. The quality of the cheese was examined through the determination of volatile compounds by SPME GC/MS analysis. These results encouraged us to study a continuous milk coagulation system. Even though immobilized biocatalyst (1) gave a lower coagulation time, we used immobilized biocatalyst (2) in the continuous system, and the results were promising.
Keywords: tubular cellulose, starch gel, composite biocatalyst, rennin, milk coagulation
Procedia PDF Downloads 326
245 A Palmprint Identification System Based Multi-Layer Perceptron
Authors: David P. Tantua, Abdulkader Helwan
Abstract:
Biometrics has recently been used in human identification systems, exploiting biological traits such as fingerprints and iris scans. Identification systems based on biometrics show great efficiency and accuracy in such human identification applications. However, these systems have so far been based on image processing techniques only, which may decrease the efficiency of such applications. Thus, this paper aims to develop a human palmprint identification system using a multi-layer perceptron neural network, which has the capability to learn using the backpropagation learning algorithm. The developed system uses images obtained from a public database available on the internet (CASIA). The processing pipeline is as follows: image filtering using a median filter, image adjustment, image skeletonizing, edge detection using the Canny operator to extract features, and removal of unwanted components of the image. The second phase feeds the processed images into a neural network classifier, which adaptively learns and creates a class for each different image. 100 different images are used for training the system. Since this is an identification system, it should be tested with the same images; therefore, the same 100 images are used for testing it, and any image outside the training set should be unrecognized. The experimental results show that the developed system has a high accuracy of 100%, and it can be implemented in real-life applications. Keywords: biometrics, biological traits, multi-layer perceptron neural network, image skeletonizing, edge detection using canny operator
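A minimal sketch of two stages of the pipeline described above: a 3x3 median filter for denoising, and a small multi-layer perceptron trained with backpropagation to assign one class per image. All sizes, learning rates, and the toy feature vectors are illustrative assumptions, not the CASIA setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def median_filter3(img):
    """3x3 median filter; border pixels are left unchanged."""
    out = img.copy()
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            out[i, j] = np.median(img[i-1:i+2, j-1:j+2])
    return out

# toy "palmprint": a thick principal line plus one salt-noise spike
img = np.zeros((8, 8))
img[3:5, :] = 1.0
img[1, 1] = 5.0
den = median_filter3(img)           # spike removed, line preserved

# minimal one-hidden-layer MLP with backpropagation, one class per image
X = rng.normal(size=(4, 16))        # 4 feature vectors (stand-ins for images)
Y = np.eye(4)                       # identification: one class per image
W1 = rng.normal(scale=0.5, size=(16, 8))
W2 = rng.normal(scale=0.5, size=(8, 4))
for _ in range(1000):
    H = np.tanh(X @ W1)
    Z = H @ W2
    P = np.exp(Z - Z.max(1, keepdims=True))
    P /= P.sum(1, keepdims=True)                  # softmax output
    dZ = (P - Y) / len(X)                         # cross-entropy gradient
    dW1 = X.T @ ((dZ @ W2.T) * (1 - H**2))        # backpropagate through tanh
    W2 -= 0.5 * (H.T @ dZ)
    W1 -= 0.5 * dW1

Zf = np.tanh(X @ W1) @ W2
P = np.exp(Zf - Zf.max(1, keepdims=True))
P /= P.sum(1, keepdims=True)
acc = float((P.argmax(1) == Y.argmax(1)).mean())  # train = test set, as above
```

Testing on the training images, as the abstract does, necessarily measures memorization rather than generalization, which is why the toy also reaches 100%.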
Procedia PDF Downloads 371
244 Typology of Customers in Fitness Centres
Authors: Josef Voracek, Jan Sima
Abstract:
The main purpose of our study is to identify the basic types of fitness customers. This paper aims to create a specific customer typology in today's fitness centres in the region of Prague. Our suggested typology of Prague fitness centre customers is based on answers to the questions: What are the customers like, what are their preferences, and what kinds of services do they use more often in Prague fitness centres? These are the main aspects of the presented typology. A survey was conducted on a sample of 1004 respondents from 48 fitness centres during May 2012. We used questionnaires and latent class analysis (LCA) for the assessment and interpretation of the data. Gender was the main filter criterion: the sample contained 522 males and 482 females. We identified 6 segments of typical customers, three male and three female. Each segment is shaped primarily by the age of the customers, from which further characteristics follow, such as education, income, and marital status. Male segments use mainly the main workout area, whilst female segments use a much wider range of the services offered, for example group exercises, personal training, and cardio theatres. The LCA method was found to be the most suitable tool, because cluster analysis is very limited in the forms and numbers of variables and indicators. Models with 3 latent classes for each gender are optimal, as demonstrated by entropy indices and matrices of the likelihood of class membership. A probable weak point of the survey is the selection of fitness centres, because the market in Prague is very specific. Keywords: customer, fitness, latent class analysis, typology
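Latent class analysis of the kind used above can be sketched as an EM fit of a finite mixture over binary indicators. The fragment below is a hypothetical two-class toy: the synthetic service-usage indicators and all parameters are assumptions, and the study's actual models used three classes per gender and many more variables.

```python
import numpy as np

rng = np.random.default_rng(7)

# synthetic binary service-usage indicators, e.g. (weights room, group
# classes, personal training), for two hypothetical customer types
n = 600
type_a = rng.random(n) < 0.5
p_true = np.where(type_a[:, None], [0.9, 0.2, 0.1], [0.2, 0.8, 0.7])
X = (rng.random((n, 3)) < p_true).astype(float)

# EM for a 2-class latent class model with Bernoulli indicators
K = 2
pi = np.full(K, 1.0 / K)                    # class sizes
p = rng.uniform(0.3, 0.7, size=(K, 3))      # item-response probabilities
for _ in range(200):
    # E-step: posterior class membership (the "likelihood of membership")
    like = np.prod(p[None]**X[:, None] * (1 - p[None])**(1 - X[:, None]), axis=2)
    post = like * pi
    post /= post.sum(1, keepdims=True)
    # M-step: re-estimate class sizes and response profiles
    pi = post.mean(0)
    p = (post.T @ X) / post.sum(0)[:, None]

hi = p[np.argmax(p[:, 0])]   # recovered class that uses the weights room heavily
lo = p[np.argmin(p[:, 0])]
```

The posterior matrix `post` is exactly the "matrix of the likelihood of the membership to the classes" the abstract mentions, and class-number selection would compare entropy across fits with different K.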
Procedia PDF Downloads 216
243 Estimation of Normalized Glandular Doses Using a Three-Layer Mammographic Phantom
Authors: Kuan-Jen Lai, Fang-Yi Lin, Shang-Rong Huang, Yun-Zheng Zeng, Po-Chieh Hsu, Jay Wu
Abstract:
The normalized glandular dose (DgN) is used to estimate the energy deposited by mammography in clinical practice. Monte Carlo simulations frequently use a uniformly mixed phantom for calculating the conversion factor. However, breast tissues are not uniformly distributed, leading to errors in the conversion factor estimation. This study constructed a three-layer phantom to estimate the normalized glandular dose more accurately. The MCNP code (Monte Carlo N-Particle code) was used to create the geometric structure. We simulated three target/filter combinations (Mo/Mo, Mo/Rh, Rh/Rh), six voltages (25-35 kVp), six HVL parameters, and nine breast phantom thicknesses (2-10 cm) for the three-layer mammographic phantom. The conversion factors for 25%, 50% and 75% glandularity were calculated. The error of the conversion factors compared with the results of the American College of Radiology (ACR) was within 6%; for Rh/Rh, the difference was within 9%. The difference between 50% average glandularity and the uniform phantom was 7.1% to -6.7% for the Mo/Mo combination at a voltage of 27 kVp, a half value layer of 0.34 mmAl, and a breast thickness of 4 cm. According to the simulation results, regression analysis showed that the three-layer mammographic phantom at 0%-100% glandularity can be used to accurately calculate the conversion factors. The difference in glandular tissue distribution leads to errors in the conversion factor calculation; the three-layer mammographic phantom can provide more accurate estimates of glandular dose in clinical practice. Keywords: Monte Carlo simulation, mammography, normalized glandular dose, glandularity
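The idea of tallying dose per layer rather than in a uniform mix can be illustrated with a toy Monte Carlo sketch. Everything below is an illustrative stand-in, not MCNP physics: the attenuation coefficients are made up (not NIST data), the beam is monoenergetic, and each interaction is assumed to deposit its full energy locally.

```python
import numpy as np

rng = np.random.default_rng(2)

# hypothetical three-layer phantom: adipose layers sandwiching glandular tissue
# (name, thickness in cm, linear attenuation coefficient in 1/cm; made-up values)
layers = [("adipose", 0.5, 0.6), ("glandular", 3.0, 0.8), ("adipose", 0.5, 0.6)]

N = 100_000                 # photon histories
E0 = 20.0                   # keV, monoenergetic beam (illustrative)

alive = np.ones(N, dtype=bool)
glandular_kev = 0.0
for name, t, mu in layers:
    path = rng.exponential(1.0 / mu, size=N)   # free path sampled from exp(mu)
    absorbed = alive & (path < t)              # photon interacts inside this layer
    if name == "glandular":
        glandular_kev += absorbed.sum() * E0   # crude: full local deposition
    alive &= ~absorbed                         # survivors reach the next layer

# DgN-like quantity: fraction of incident energy deposited in the gland
dgn_like = glandular_kev / (N * E0)
```

Swapping this layered geometry for a single slab with a glandularity-weighted mean coefficient changes `dgn_like`, which is the uniform-phantom error the study quantifies; a real calculation would also transport scattered photons and use spectrum-resolved cross sections.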
Procedia PDF Downloads 189
242 The Optimization of an Industrial Recycling Line: Improving the Durability of Recycled Polyethyene Blends
Authors: Alae Lamtai, Said Elkoun, Hniya Kharmoudi, Mathieu Robert, Carl Diez
Abstract:
This study applies Taguchi's design of experiments methodology and grey relational analysis (GRA) for multi-objective optimization of an industrial recycling line. The line consists mainly of a mono- and a twin-screw extruder and a filtration system. Experiments were performed according to an L₁₆ standard orthogonal array based on five process parameters: mono-screw design, screw speed of the mono- and twin-screw extruders, melt pump pressure, and filter mesh size. The objective of this optimization is to improve the durability of the polyethylene (PE) blend by decreasing the loss of stress crack resistance (SCR), measured by the Notched Crack Ligament Stress (NCLS) and Unnotched Crack Ligament Stress (UCLS) tests, while increasing the gain in Izod impact strength of the PE blend before and after recycling. Based on GRA, the optimal setting of the process parameters was identified, and the results indicated that the mono-screw design and the screw speeds of both the mono and twin-screw extruders significantly impact the mechanical properties of the recycled PE blend. Keywords: Taguchi, recycling line, polyethylene, stress crack resistance, Izod impact strength, grey relational analysis
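Grey relational analysis itself is a short, well-defined computation: normalize each response (smaller-the-better for the SCR loss, larger-the-better for the impact gain), form deviation sequences, convert them to grey relational coefficients, and average into a grade per run. The response values below are invented for illustration and are not the study's L₁₆ results.

```python
import numpy as np

# hypothetical responses for 4 runs: [SCR loss %, Izod impact gain %]
data = np.array([
    [12.0, 5.0],
    [ 8.0, 9.0],
    [15.0, 3.0],
    [ 6.0, 7.0],
])
kind = ["smaller", "larger"]   # SCR loss: smaller-the-better; gain: larger
zeta = 0.5                     # distinguishing coefficient (customary value)

# step 1: min-max normalisation so the ideal value is 1 for every response
norm = np.empty_like(data)
for j, k in enumerate(kind):
    col = data[:, j]
    if k == "larger":
        norm[:, j] = (col - col.min()) / (col.max() - col.min())
    else:
        norm[:, j] = (col.max() - col) / (col.max() - col.min())

# step 2: deviation from ideal, grey relational coefficients, grade per run
delta = 1.0 - norm
grc = (delta.min() + zeta*delta.max()) / (delta + zeta*delta.max())
grade = grc.mean(axis=1)       # equal weights across the two responses
best_run = int(np.argmax(grade))
```

The run with the highest grade is taken as the best compromise across all objectives, and averaging grades per factor level then identifies the optimal parameter setting, which is how the mono-screw design and screw speeds were ranked as dominant.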
Procedia PDF Downloads 83
241 PAPR Reduction of FBMC Using Sliding Window Tone Reservation Active Constellation Extension Technique
Authors: S. Anuradha, V. Sandeep Kumar
Abstract:
The high Peak-to-Average Power Ratio (PAPR) in Filter Bank Multicarrier with Offset Quadrature Amplitude Modulation (FBMC-OQAM) can significantly reduce power efficiency and performance. In this paper, we address the problem of PAPR reduction for FBMC-OQAM systems using the Tone Reservation (TR) technique. Due to the overlapping structure of FBMC-OQAM signals, directly applying the TR schemes of OFDM systems to FBMC-OQAM systems is not effective. We improve the TR technique by employing a sliding window with Active Constellation Extension for the PAPR reduction of FBMC-OQAM signals, called the sliding window tone reservation Active Constellation Extension (SW-TRACE) technique. The proposed SW-TRACE technique uses the peak reduction tones (PRTs) of several consecutive data blocks to cancel the peaks of the FBMC-OQAM signal inside a window, while dynamically extending outer constellation points in active (data-carrying) channels, within margin-preserving constraints, in order to minimize the peak magnitude. Analysis and simulation results are compared with the existing TR technique for the FBMC-OQAM system. The proposed SW-TRACE method has better PAPR performance and lower computational complexity. Keywords: FBMC-OQAM, peak-to-average power ratio, sliding window, tone reservation Active Constellation Extension
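The TR constraint, namely that the peak-cancelling signal may occupy only reserved tones while data tones stay untouched, can be sketched for a plain OFDM symbol. The paper's contribution lies in extending this to overlapping FBMC-OQAM blocks with ACE, which this toy does not attempt; the tone layout, clipping threshold, and iteration count below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 256
prt = np.arange(0, N, 16)                      # reserved peak-reduction tones
data_idx = np.setdiff1d(np.arange(N), prt)

bits = rng.integers(0, 2, size=(2, data_idx.size))
X = np.zeros(N, dtype=complex)
X[data_idx] = ((2*bits[0] - 1) + 1j*(2*bits[1] - 1)) / np.sqrt(2)

def papr_db(x):
    return 10*np.log10(np.max(np.abs(x)**2) / np.mean(np.abs(x)**2))

x = np.fft.ifft(X)
before = papr_db(x)

# iterative clip-and-restrict: project each clipping correction onto the PRTs
x_tr = x.copy()
A = 1.5 * np.sqrt(np.mean(np.abs(x)**2))       # clipping threshold (assumed)
for _ in range(20):
    mag = np.maximum(np.abs(x_tr), 1e-12)
    clipped = np.where(np.abs(x_tr) > A, A * x_tr / mag, x_tr)
    C = np.fft.fft(clipped - x_tr)
    C[data_idx] = 0.0                          # TR constraint: data tones untouched
    x_tr = x_tr + np.fft.ifft(C)
after = papr_db(x_tr)
```

Because the correction lives only on the PRTs, the receiver simply discards those tones; the data spectrum is provably unchanged, which the first assertion below checks exactly.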
Procedia PDF Downloads 447
240 Effect Analysis of an Improved Adaptive Speech Noise Reduction Algorithm in Online Communication Scenarios
Authors: Xingxing Peng
Abstract:
With the development of society, online communication scenarios such as teleconferencing and online education are increasingly common. In conference communication, voice quality is a very important component, and noise may greatly reduce the communication experience of the participants. Therefore, voice noise reduction has an important impact on scenarios such as voice calls. This research focuses on the key technologies of the sound transmission process, with the purpose of preserving audio quality as far as possible so that the listener hears clearer and smoother sound. To address the problem that traditional speech enhancement algorithms perform poorly on non-stationary noise, an adaptive speech noise reduction algorithm is studied in this paper. Traditional noise estimation methods are mainly suited to stationary noise. We study the spectral characteristics of different noise types, especially non-stationary burst noise, and design a noise estimator module to handle non-stationary noise. Noise features are extracted from non-speech segments, and the noise estimation module is adjusted in real time according to the noise characteristics. This adaptive algorithm can enhance speech according to different noise characteristics, improving the performance of traditional algorithms on non-stationary noise and achieving a better enhancement effect. The experimental results show that the proposed algorithm is effective and adapts better to different types of noise, yielding a better speech enhancement effect. Keywords: speech noise reduction, speech enhancement, self-adaptation, Wiener filter algorithm
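The Wiener-filter core of such an enhancement scheme can be sketched in a few lines: estimate the noise power spectrum from non-speech material, then apply a per-bin gain snr/(snr+1) to each frame. Everything below is an illustrative assumption: a sinusoid stands in for speech, the frame length is arbitrary, and the noise frames are given rather than detected by an adaptive estimator.

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 8000
t = np.arange(fs) / fs
clean = np.sin(2*np.pi*500*t)               # stand-in for a speech signal
noisy = clean + 0.5*rng.normal(size=fs)

frame = 256
trim = (fs // frame) * frame

# noise PSD from "non-speech" segments; an adaptive estimator would update
# this in real time as the abstract describes
noise_only = 0.5*rng.normal(size=4*frame)
noise_psd = np.mean(np.abs(np.fft.rfft(noise_only.reshape(4, frame)))**2, axis=0)

out = np.zeros(trim)
for k in range(0, trim, frame):
    spec = np.fft.rfft(noisy[k:k+frame])
    post_snr = np.abs(spec)**2 / noise_psd
    snr_hat = np.maximum(post_snr - 1.0, 0.0)   # crude a-priori SNR estimate
    gain = snr_hat / (snr_hat + 1.0)            # Wiener gain per frequency bin
    out[k:k+frame] = np.fft.irfft(gain * spec, frame)

def snr_db(ref, sig):
    return 10*np.log10(np.sum(ref**2) / np.sum((ref - sig)**2))

improvement = snr_db(clean[:trim], out) - snr_db(clean[:trim], noisy[:trim])
```

Replacing the fixed `noise_psd` with one that tracks non-speech segments frame by frame is precisely where the paper's adaptive, burst-noise-aware estimator would plug in.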
Procedia PDF Downloads 57
239 Assessment of Escherichia coli along Nakibiso Stream in Mbale Municipality, Uganda
Authors: Abdul Walusansa
Abstract:
The aim of this study was to assess the level of microbial pollution along the Nakibiso stream. The study was carried out in the polluted waters of the Nakibiso stream, originating from Mbale municipality and running through the ADRA Estates to the Namatala wetlands in eastern Uganda. Four sites along the stream were selected based on the activities in their vicinity. A total of 120 samples were collected in sterile bottles from the four sampling locations during the wet and dry seasons of 2011. The samples were taken to the National Water and Sewerage Corporation laboratory for analysis. The membrane filter technique was used to test for Escherichia coli. Nitrogen, phosphorus, pH, dissolved oxygen, electrical conductivity, total suspended solids, turbidity and temperature were also measured. Results for nitrogen and phosphorus for sites 1, 2, 3 and 4 were 1.8, 8.8, 7.7 and 13.8 NH4-N mg/L, and 1.8, 2.1, 1.8 and 2.3 PO4-P mg/L, respectively. Based on these results, it was estimated that farmers use 115 and 24 kg/acre of nitrogen and phosphorus, respectively, per month. Taking the results for nitrogen, the same amount of nutrients in artificial fertilizers would cost $88, which shows that reuse of wastewater has potential in terms of nutrients. The E. coli results for sites 1, 2, 3 and 4 were 1.1 × 10⁷, 9.1 × 10⁵, 7.4 × 10⁵, and 3.4 × 10⁵, respectively. E. coli counts hence decreased downstream, with statistically significant variation between sites 1 and 4; site 1 had the highest mean E. coli counts. Bacterial contamination was significantly higher during the dry season, when more water was needed for irrigation. Although the water had potential for reuse in farming, bacterial contamination during both seasons was higher than the 10³ FC/100 ml recommended by WHO for unrestricted agriculture. Keywords: E. coli, nitrogen, phosphorus, water reuse, waste water
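From the reported counts, the overall attenuation between the most and least contaminated sites can be expressed as a log reduction and checked against the WHO guideline. This is a back-of-envelope calculation using the abstract's own figures, nothing more.

```python
import math

# E. coli counts (FC/100 ml) at site 1 (upstream, worst) and site 4 (downstream)
site1 = 1.1e7
site4 = 3.4e5

# natural downstream attenuation expressed as an order-of-magnitude reduction
log_reduction = math.log10(site1 / site4)

# WHO guideline for unrestricted agriculture cited in the abstract
who_limit = 1e3  # FC/100 ml
meets_who = site4 <= who_limit
```

About 1.5 log units of natural attenuation are achieved along the stream, yet the downstream count still exceeds the 10³ FC/100 ml guideline by more than two orders of magnitude, which is the paper's argument against unrestricted irrigation reuse.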
Procedia PDF Downloads 247
238 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm
Authors: Anuradha Chug, Sunali Gandhi
Abstract:
Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Because there are rarely enough resources to re-execute all test cases in a time-constrained environment, efforts are ongoing to generate test data automatically, without human effort. Many search-based techniques have been proposed to generate efficient, effective, and optimized test data so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses that exist in the software or product. Inspired by the natural food-searching behavior of bats, the current study employed a meta-heuristic, search-based bat algorithm for optimizing the test data on the basis of certain parameters without compromising their effectiveness. Mathematical functions are also applied to effectively filter out redundant test data. As many as 50 Java programs were used to check the effectiveness of the proposed test data generation, and it was found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when the results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing (HC), and Ant Colony Optimization (ACO). The output of this study would be useful to testers, as they can achieve 100% path coverage with a minimum number of test cases. Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm
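A minimal bat algorithm follows the standard update rules: a random pulse frequency draws each bat toward the current best solution, a loudness-gated acceptance replaces worse solutions, and an occasional random walk around the best solution refines it. The sphere function below is a placeholder objective; in test-data generation the fitness would instead score the branch or path coverage achieved by a candidate test input, and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def fitness(x):
    # placeholder objective; a coverage-based score would go here
    return float(np.sum(x**2))

n, d, iters = 20, 5, 300
f_lo, f_hi = 0.0, 2.0               # pulse-frequency range
alpha, gamma = 0.9, 0.9             # loudness decay / pulse-rate growth

x = rng.uniform(-5, 5, size=(n, d))
v = np.zeros((n, d))
A = np.ones(n)                      # loudness per bat
r = np.full(n, 0.5)                 # pulse emission rate per bat
fit = np.array([fitness(xi) for xi in x])
best = x[np.argmin(fit)].copy()
best_fit = float(fit.min())

for t in range(iters):
    for i in range(n):
        freq = f_lo + (f_hi - f_lo) * rng.random()
        v[i] += (x[i] - best) * freq            # pull toward the global best
        cand = x[i] + v[i]
        if rng.random() > r[i]:                 # local random walk near the best
            cand = best + 0.1 * A.mean() * rng.normal(size=d)
        f = fitness(cand)
        if f <= fit[i] and rng.random() < A[i]:  # accept, then get quieter
            x[i], fit[i] = cand, f
            A[i] *= alpha
            r[i] = 0.5 * (1 - np.exp(-gamma * (t + 1)))
        if f < best_fit:
            best, best_fit = cand.copy(), f
```

For test-suite optimization the continuous positions would typically be mapped to test-case selections (for instance by thresholding), with the fitness rewarding coverage and penalizing suite size.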
Procedia PDF Downloads 380
237 Accurate Positioning Method of Indoor Plastering Robot Based on Line Laser
Authors: Guanqiao Wang, Hongyang Yu
Abstract:
There is a lot of repetitive work in the traditional construction industry, and replacing such manual tasks with robots can significantly improve production efficiency. Therefore, robots appear more and more frequently in the construction industry. Navigation and positioning are very important tasks for construction robots, and the accuracy requirements for positioning are very high. Traditional indoor robots mainly use radio-frequency or vision methods for positioning. Compared with ordinary robots, an indoor plastering robot needs to be positioned closer to the wall for wall plastering, so the required positioning accuracy is higher; the large error of traditional navigation and positioning methods means the robot may move without an exact position, so that the wall cannot be plastered or the plastering error is large. A new positioning method is proposed that is assisted by line lasers and uses image-processing-based positioning to refine the traditional positioning. In actual work, filtering, edge detection, the Hough transform and other operations are performed on the images captured by the camera. Each time the position of the laser line is found, it is compared with a standard value, and the robot is moved or rotated accordingly to complete the positioning. The experimental results show that the actual positioning error is reduced to less than 0.5 mm by this accurate positioning method. Keywords: indoor plastering robot, navigation, precise positioning, line laser, image processing
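The laser-line comparison step can be sketched as: detect the line's pixel coordinates, fit a line through them (least squares here, where the paper uses a Hough transform), and convert the offset from the reference column into a motion correction. The pixel scale, reference column, and offsets below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# synthetic detection: the laser line should sit at column 320 when the robot
# stands at the reference distance; a positioning offset shifts the whole line
true_offset_px = 12.5
rows = np.arange(100.0)
cols = 320.0 + true_offset_px + rng.normal(0.0, 0.8, size=rows.size)

# least-squares line fit col = a*row + b; for a single straight laser line a
# Hough transform would vote for essentially the same (a, b)
a, b = np.polyfit(rows, cols, 1)
measured_col = a * rows.mean() + b         # line position at mid-image

MM_PER_PX = 0.1                            # assumed camera calibration factor
correction_mm = (measured_col - 320.0) * MM_PER_PX   # move command for the robot
```

Averaging over the whole detected line is what suppresses per-pixel detection noise: with 100 rows of 0.8 px noise, the fitted position is good to roughly a hundredth of a millimetre at this scale, comfortably inside the 0.5 mm target; the fitted slope additionally gives the rotation correction.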
Procedia PDF Downloads 148
236 A Spatial Repetitive Controller Applied to an Aeroelastic Model for Wind Turbines
Authors: Riccardo Fratini, Riccardo Santini, Jacopo Serafini, Massimo Gennaretti, Stefano Panzieri
Abstract:
This paper presents a nonlinear differential model of a three-bladed horizontal axis wind turbine (HAWT) suited for control applications. It is based on an 8-dof lumped-parameter structural dynamics model coupled with quasi-steady sectional aerodynamics. In particular, using the Euler-Lagrange equation (energetic variation approach), the authors derive, and successively validate, such a model. For the derivation of the aerodynamic model, Greenberg's theory, an extension of the theory proposed by Theodorsen to the case of thin airfoils undergoing pulsating flows, is used. Specifically, in this work, the authors restricted that theory to the hypothesis of low perturbation reduced frequency k, which causes the lift deficiency function C(k) to be real and equal to 1. Furthermore, the expressions of the aerodynamic loads are obtained using the quasi-steady strip theory (Hodges and Ormiston), as a function of the chordwise and normal components of the relative velocity between flow and airfoil, Ut and Up, their derivatives, and the section angular velocity ε˙. For the validation of the proposed model, the authors carried out open- and closed-loop simulations of a 5 MW HAWT, characterized by a radius R = 61.5 m and a mean chord c = 3 m, with a nominal angular velocity Ωn = 1.266 rad/s. The first analysis performed is the steady-state solution, where a uniform wind Vw = 11.4 m/s is considered and a collective pitch angle θ = 0.88◦ is imposed. During this step, the authors noticed that the proposed model is intrinsically periodic due to the effect of the wind and of the gravitational force. In order to reject this periodic trend in the model dynamics, the authors propose a collective repetitive control algorithm coupled with a PD controller.
In particular, when the reference command to be tracked and/or the disturbance to be rejected are periodic signals with a fixed period, repetitive control strategies can be applied, owing to their high precision, simple implementation, and the weak dependency of their performance on system parameters. The functional scheme of a repetitive controller is quite simple: given a periodic reference command, it is composed of a control block Crc(s) usually added to an existing feedback control system. The control block contains a free time-delay system e^(−τs) in a positive feedback loop and a low-pass filter q(s). It should be noticed that, while the time-delay term reduces the stability margin, the low-pass filter is added to ensure stability. It is worth noting that, in this work, the authors propose a phase shifting for the controller, and the delay system has been modified as e^(−(T−γk)s), where T is the period of the signal and γk is a phase shift of k samples of the same periodic signal. The phase-shifting technique is particularly useful in non-minimum-phase systems, such as flexible structures: using the phase shifting, the iterative algorithm can reach convergence also at high frequencies. Notice that, in our case study, the shift of k samples depends both on the rotor angular velocity Ω and on the rotor azimuth angle Ψ: we refer to this controller as a spatial repetitive controller. The collective repetitive controller has also been coupled with C(s) = PD(s), in order to dampen oscillations of the blades. The performance of the spatial repetitive controller is compared with an industrial PI controller. In particular, starting from a wind speed Vw = 11.4 m/s, the controller is asked to maintain the nominal angular velocity Ωn = 1.266 rad/s after an instantaneous increase of wind speed (Vw = 15 m/s).
Then, a purely periodic external disturbance is introduced in order to stress the capabilities of the repetitive controller. The results of the simulations show that, contrary to a simple PI controller, the spatial repetitive-PD controller has the capability to reject both external disturbances and the periodic trend in the model dynamics. Finally, the nominal value of the angular velocity is reached, in accordance with results obtained with commercial software for a turbine of the same type. Keywords: wind turbines, aeroelasticity, repetitive control, periodic systems
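The internal-model idea behind the controller, a one-period memory in positive feedback on the error, can be demonstrated on a toy discrete-time plant. The first-order plant, the gains, and the period below are assumptions and have nothing to do with the HAWT model; the stabilizing low-pass filter q(s) and the phase shift γk are omitted for brevity.

```python
import numpy as np

L = 40          # samples per disturbance period (the delay T in the e^(-Ts) block)
PERIODS = 40

def simulate(use_rc):
    """P feedback on a first-order plant, optionally augmented with a
    one-period repetitive feedforward learned from the previous period's error."""
    y = 0.0
    u_ff = np.zeros(L)                 # internal model: one period of memory
    es = np.zeros(L)
    for _ in range(PERIODS):
        for j in range(L):
            d = 0.5 * np.sin(2*np.pi*j/L)   # periodic disturbance
            e = 0.0 - y                      # regulate the output to zero
            u = 2.0*e + (u_ff[j] if use_rc else 0.0)
            y = 0.9*y + 0.1*(u + d)          # toy first-order plant
            es[j] = e
        if use_rc:
            u_ff += es   # one-period-delayed positive feedback on the error
    return float(np.mean(np.abs(es)))        # mean error over the final period

e_p = simulate(False)    # proportional feedback only: periodic error persists
e_rc = simulate(True)    # with the repetitive internal model: error decays
```

The proportional loop alone leaves a steady periodic error, while the repetitive memory drives the error at the disturbance period toward zero over successive periods, which mirrors the abstract's comparison between the industrial PI controller and the repetitive-PD scheme.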
Procedia PDF Downloads 249
235 A Gradient Orientation Based Efficient Linear Interpolation Method
Authors: S. Khan, A. Khan, Abdul R. Soomrani, Raja F. Zafar, A. Waqas, G. Akbar
Abstract:
This paper proposes a low-complexity image interpolation method. Image interpolation is used to convert a low-resolution video/image to a high-resolution one. The objective of a good interpolation method is to upscale an image in such a way that it provides good edge preservation at very low complexity, so that real-time processing of video frames is possible. However, low-complexity methods tend to provide real-time interpolation at the cost of blurring, jagging and other artifacts due to errors in slope calculation. Non-linear methods, on the other hand, provide better edge preservation, but at the cost of high complexity, and hence they are far from achieving real-time interpolation. The proposed method is a linear method that uses gradient orientation for slope calculation, unlike conventional linear methods that use the contrast of nearby pixels. Prewitt edge detection is applied to separate uniform regions from edges. Simple line averaging is applied to unknown pixels in uniform regions, whereas unknown edge pixels are interpolated after calculating slopes from the gradient orientations of neighboring known edge pixels. As a post-processing step, a bilateral filter is applied to the interpolated edge regions in order to enhance the interpolated edges. Keywords: edge detection, gradient orientation, image upscaling, linear interpolation, slope tracing
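The gradient-orientation step can be sketched with a Prewitt operator on a synthetic diagonal edge: the gradient points across the edge, so interpolation should average along the perpendicular edge direction rather than blindly down the column. The image and kernels here are illustrative, not the paper's full scheme.

```python
import numpy as np

def prewitt(img):
    """Prewitt gradients (border pixels left at zero)."""
    kx = np.array([[-1, 0, 1]] * 3, dtype=float)   # horizontal gradient kernel
    ky = kx.T                                       # vertical gradient kernel
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            win = img[i-1:i+2, j-1:j+2]
            gx[i, j] = np.sum(win * kx)
            gy[i, j] = np.sum(win * ky)
    return gx, gy

# synthetic 45-degree edge: intensity steps across the anti-diagonal
img = np.fromfunction(lambda i, j: (i + j >= 5).astype(float), (6, 6))
gx, gy = prewitt(img)
theta = np.degrees(np.arctan2(gy[2, 2], gx[2, 2]))  # gradient orientation

# the edge runs perpendicular to the gradient, so a new pixel "between"
# rows 2 and 3 is averaged along the diagonal edge direction ...
along_edge = 0.5 * (img[2, 3] + img[3, 2])      # stays sharp
# ... rather than straight down the column, which would blur the edge
across_edge = 0.5 * (img[2, 2] + img[3, 2])
```

The gradient orientation of 45 degrees identifies the edge direction at 135 degrees; averaging along it keeps the step crisp (1.0) where a plain vertical average would smear it (0.5), which is the blurring artifact the contrast-based linear methods suffer from.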
Procedia PDF Downloads 260