Search results for: hot processing windows
3136 Acacia mearnsii De Wild. - A New Scourge on Cork Oak Forests of El Kala National Park (North-Eastern Algeria)
Authors: Samir Chekchaki, Arifa Beddiar
Abstract:
Nowadays, more and more species are introduced outside their natural range. While most of them remain inconspicuous, some may adopt a much more dynamic behavior. Indeed, we have witnessed in recent decades the development of high forests of Acacia mearnsii in El Kala National Park. Introduced to supply nitrogen for industrial Eucalyptus plantations, this legume has become one of the most invasive species and one of the most costly in terms of forest management. It has crossed all barriers: it has acclimatized, naturalized, and then expanded through diverse landscapes; it has entered into competition with native species such as cork oak and altered ecosystem functioning. It is therefore interesting to analyze this new threat by relying on plants as bio-indicators for assessing biodiversity at different scales. We identified the species present in several plots distributed across a range of vegetation types subjected to different degrees of disturbance, using the Braun-Blanquet method. Fifty-six species were recorded, distributed among 48 genera and 29 families. The analysis of the relative frequency of species, correlated with relative abundance, clearly shows that Acacia mearnsii is marginalized. The ecological analysis of this biological invasion shows that disturbances of either natural or anthropogenic origin (fire, prolonged drought, cutting) are the factors that exacerbate invasion by opening invasion windows. The lifting of the lasting (and variable) physical dormancy of Acacia mearnsii seeds is ensured by thermal shock, in keeping with the species' heliophilous character.
Keywords: Acacia mearnsii De Wild, El Kala National Park, fire, invasive, vegetation
Procedia PDF Downloads 357
3135 Effect of Different Processing Methods on the Proximate, Functional, Sensory, and Nutritional Properties of Weaning Foods Formulated from Maize (Zea mays) and Soybean (Glycine max) Flour Blends
Authors: C. O. Agu, C. C. Okafor
Abstract:
Maize and soybean flours were produced using different methods of processing, which include fermentation (FWF), roasting (RWF), and malting (MWF). Products from the different methods were mixed in the ratio 60:40 maize/soybean, respectively. These composites, mixed with other ingredients such as sugar, vegetable oil, vanilla flavour, and vitamin mix, were analyzed for proximate composition and physical/functional, sensory, and nutritional properties. The results for protein content ranged between 6.25% and 16.65%, with sample RWF having the highest value. Crude fibre values ranged from 3.72 to 10.0%, carbohydrate from 58.98% to 64.2%, and ash from 1.27 to 2.45%. Physical and functional properties such as bulk density, wettability, and gelation capacity had values between 0.74 and 0.76 g/ml, 20.33 and 46.33 min, and 0.73 to 0.93 g/ml, respectively. For sensory quality, colour, flavour, taste, texture, and general acceptability were determined. In terms of colour and flavour, there was no significant difference (P > 0.05), while the values for taste ranged between 4.89 and 7.11, texture between 5.50 and 8.38, and general acceptability between 6.09 and 7.89. Nutritionally, there is no significant difference (P > 0.05) between sample RWF and the control in all parameters considered. Samples FWF and MWF showed significantly (P < 0.05) lower values in all parameters determined. In the light of the above findings, the roasting method is highly recommended in the production of weaning foods.
Keywords: fermentation, malting, ratio, roasting, wettability
Procedia PDF Downloads 304
3134 Processing of Input Material as a Way to Improve the Efficiency of the Glass Production Process
Authors: Joanna Rybicka-Łada, Magda Kosmal, Anna Kuśnierz
Abstract:
One of the main problems of the glass industry is the still high consumption of energy needed to produce glass mass, as well as the increase in the prices of fuels and raw materials. Therefore, comprehensive actions are taken to improve the entire production process. The key element of these activities, from filling the set (batch) to receiving the finished product, is the melting process, whose task is, among others, dissolving the components of the set, removing bubbles from the resulting melt, and obtaining a chemically homogeneous glass melt. This process consumes over 90% of the total energy needed in the production process. The processes occurring in the set during its conversion have a significant impact on the further stages and speed of the melting process and, thus, on its overall effectiveness. The speed of the reactions occurring and their course depend on the chemical nature of the raw materials, the degree of their fragmentation, and their thermal treatment, as well as on the form of the introduced set. An opportunity to minimize segregation and accelerate the conversion of glass sets may be the development of new technologies for preparing and dosing sets. The previously preferred traditional method of melting the set, based on mixing all glass raw materials together in loose form, can be replaced with a set in a thickened form; this solution avoids dust formation during filling and is available on the market. The aim of the project was to develop a glass set in a selectively or completely densified form and to examine the influence of set processing on the melting process and the properties of the glass.
Keywords: glass, melting process, glass set, raw materials
Procedia PDF Downloads 60
3133 Bird-Adapted Filter for Avian Species and Individual Identification Systems Improvement
Authors: Ladislav Ptacek, Jan Vanek, Jan Eisner, Alexandra Pruchova, Pavel Linhart, Ludek Muller, Dana Jirotkova
Abstract:
One of the essential steps of avian song processing is signal filtering. Currently, the standard methods of filtering are the Mel Bank Filter or a linear filter distribution. In this article, a new type of bank filter called the Bird-Adapted Filter is introduced, whereby the signal filtering is modifiable based upon a new mathematical description of audiograms for particular bird species or orders, which was named the Avian Audiogram Unified Equation. According to the method, filters may be deliberately distributed by frequency: the filters are more concentrated in bands of higher sensitivity, where more information is expected to be transmitted, and vice versa. Further, a comparison of various filters for automatic individual recognition of the chiffchaff (Phylloscopus collybita) is demonstrated. The average Equal Error Rate (EER) value for the linear bank filter was 16.23% and for the Mel Bank Filter 18.71%, while the Bird-Adapted Filter gave 14.29%, and the Bird-Adapted Filter with 1/3 modification 12.95%. This approach would be useful for practical use in automatic systems for avian species and individual identification. Since Bird-Adapted Filter filtration is based on the measured audiograms of particular species or orders, selecting the distribution according to the avian vocalization provides the most precise filter distribution to date.
Keywords: avian audiogram, bird individual identification, bird song processing, bird species recognition, filter bank
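The core idea, placing more filters where hearing sensitivity is higher, can be sketched by inverting the cumulative sensitivity curve. A minimal sketch follows; the Gaussian-in-log-frequency audiogram is a hypothetical stand-in for the paper's Avian Audiogram Unified Equation, which the abstract does not reproduce.

```python
import numpy as np

def audiogram_sensitivity(f):
    # Hypothetical audiogram: peak sensitivity near 4 kHz, falling off
    # toward the band edges (stand-in for the paper's unified equation).
    return np.exp(-((np.log(f) - np.log(4000.0)) ** 2) / 0.8)

def bird_adapted_centers(n_filters, f_lo=500.0, f_hi=10000.0):
    """Place filter centre frequencies with density proportional to
    hearing sensitivity by inverting the cumulative sensitivity curve:
    more filters land where the curve rises fastest."""
    f = np.linspace(f_lo, f_hi, 4096)
    cdf = np.cumsum(audiogram_sensitivity(f))
    cdf /= cdf[-1]                                  # normalise to [0, 1]
    targets = np.linspace(0.0, 1.0, n_filters + 2)[1:-1]
    return np.interp(targets, cdf, f)

if __name__ == "__main__":
    print(np.round(bird_adapted_centers(12)))       # centres cluster near 4 kHz
```

A linear bank would instead space the centres uniformly; the Mel bank spaces them on the Mel scale, which is tuned to human rather than avian hearing.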
Procedia PDF Downloads 387
3132 Adaptation of Projection Profile Algorithm for Skewed Handwritten Text Line Detection
Authors: Kayode A. Olaniyi, Tola. M. Osifeko, Adeola A. Ogunleye
Abstract:
Text line segmentation is an important step in document image processing. It represents a labeling process that assigns the same label, using a distance-metric probability, to spatially aligned units. Text line detection techniques have successfully been implemented mainly in printed documents. However, processing of handwritten texts, especially unconstrained documents, has remained a key problem. This is because unconstrained handwritten text lines are often not uniformly skewed; the spaces between text lines may not be obvious, complicated by the nature of handwriting and by overlapping ascenders and/or descenders of some characters. Hence, text line detection and segmentation represent a leading challenge in handwritten document image processing. Text line detection methods that rely on the traditional global projection profile of the text document cannot efficiently cope with the problem of variable skew angles between different text lines; hence, the formulation of a horizontal line as a separator is often not efficient. This paper presents a technique to segment a handwritten document into distinct lines of text. The proposed algorithm starts by partitioning the initial text image, across its width, into vertical strips of about 5% each. For each vertical strip, the histogram of horizontal runs is projected. We have worked with the assumption that text appearing within a single strip is almost parallel. The algorithm provides a sliding window through the first vertical strip on the left side of the page and runs through it to identify each new minimum corresponding to a valley in the projection profile. Each valley represents the starting point of an orientation line, and the ending point is the minimum point on the projection profile of the next vertical strip. The derived text lines traverse around any obstructing connected components by associating each of them with either the line above or below; the decision to associate such a connected component is made using the probability obtained from a distance metric. The technique outperforms the global projection profile for text line segmentation and is robust in handling skewed documents and those with lines running into each other.
Keywords: connected-component, projection-profile, segmentation, text-line
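The strip-wise projection step can be sketched compactly. A sketch under the assumption of a binarized page (ink = 1), with light smoothing before valley detection; chaining valleys across strips into piecewise separators and the connected-component assignment step are omitted.

```python
import numpy as np

def detect_line_valleys(binary_img, strip_frac=0.05):
    """For each ~5% vertical strip, project ink counts row-wise and
    return the rows where the smoothed profile bottoms out (valleys);
    each valley is a candidate text-line separator within that strip."""
    h, w = binary_img.shape
    strip_w = max(1, int(round(w * strip_frac)))
    kernel = np.ones(9) / 9.0                       # light smoothing
    results = []
    for x0 in range(0, w, strip_w):
        profile = binary_img[:, x0:x0 + strip_w].sum(axis=1).astype(float)
        smooth = np.convolve(profile, kernel, mode="same")
        valleys = [y for y in range(1, h - 1)
                   if smooth[y] < smooth[y - 1] and smooth[y] <= smooth[y + 1]]
        results.append((x0, valleys))
    return results

if __name__ == "__main__":
    page = np.zeros((120, 200), dtype=int)
    page[20:35, :] = 1                              # two fake text bands
    page[70:88, :] = 1
    print(detect_line_valleys(page)[0][1])          # rows between the bands
```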
Procedia PDF Downloads 124
3131 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach
Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh
Abstract:
This study presents a hybrid pre-processing approach, along with a conceptual model, to enhance the accuracy of river discharge prediction. In order to achieve this goal, the Ensemble Empirical Mode Decomposition algorithm (EEMD), the Discrete Wavelet Transform (DWT), and Mutual Information (MI) were employed as a hybrid pre-processing approach conjugated to a Least Square Support Vector Machine (LSSVM). A conceptual strategy, namely the multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein was capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI so that they could be employed in the multi-station model. In the proposed feature selection method, some useless sub-series were omitted to achieve better performance. Results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
Keywords: river stage-discharge process, LSSVM, discrete wavelet transform, Ensemble Empirical Mode Decomposition, multi-station modeling
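The decompose-select-learn pipeline can be sketched as follows. Assumptions: PyWavelets provides the DWT; the EEMD stage is omitted (its IMFs would simply be appended to the same feature pool); MI-based selection uses scikit-learn; and KernelRidge stands in for LSSVM (to which it is closely related), since scikit-learn ships no LSSVM.

```python
import numpy as np
import pywt                                         # PyWavelets
from sklearn.feature_selection import mutual_info_regression
from sklearn.kernel_ridge import KernelRidge        # LSSVM surrogate

def dwt_subseries(signal, wavelet="db4", level=3):
    """Decompose a discharge signal into DWT sub-series, each
    reconstructed back to signal length so they can be stacked as
    candidate features (EEMD IMFs would be appended the same way)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    subs = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        subs.append(pywt.waverec(kept, wavelet)[: len(signal)])
    return np.column_stack(subs)

def select_and_fit(X_subseries, y, keep=3):
    """MI-based feature selection over the sub-series, then a kernel
    model fitted on the survivors (useless sub-series are omitted)."""
    mi = mutual_info_regression(X_subseries, y, random_state=0)
    best = np.argsort(mi)[-keep:]
    model = KernelRidge(kernel="rbf", alpha=1.0).fit(X_subseries[:, best], y)
    return model, best

if __name__ == "__main__":
    t = np.arange(512)
    q = np.sin(2 * np.pi * t / 64) + 0.1 * np.random.default_rng(1).standard_normal(512)
    X = dwt_subseries(q[:-1])                       # predict next-step discharge
    model, picked = select_and_fit(X, q[1:])
    print("kept sub-series:", picked)
```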
Procedia PDF Downloads 175
3130 Gene Prediction in DNA Sequences Using an Ensemble Algorithm Based on Goertzel Algorithm and Anti-Notch Filter
Authors: Hamidreza Saberkari, Mousa Shamsi, Hossein Ahmadi, Saeed Vaali, Mohammad Hossein Sedaaghi
Abstract:
In recent years, using signal processing tools for accurate identification of protein coding regions has become a challenge in bioinformatics. Most genomic signal processing methods are based on the period-3 characteristic of the nucleotides in DNA strands; consequently, spectral analysis is applied to the numerical sequences of DNA to find the locations of periodic components. In this paper, a novel ensemble algorithm for gene selection in DNA sequences is presented, based on the combination of the Goertzel algorithm and an anti-notch filter (ANF). The proposed algorithm has many advantages compared to other conventional methods. Firstly, it identifies the protein coding regions more accurately, owing to the use of the Goertzel algorithm, which is tuned to the desired frequency. Secondly, a faster detection time is achieved. The proposed algorithm is applied to several genes, including genes available in the databases BG570 and HMR195, and its results are compared to other methods based on nucleotide-level evaluation criteria. Implementation results show the excellent performance of the proposed algorithm in identifying protein coding regions, specifically in the identification of small-scale gene areas.
Keywords: protein coding regions, period-3, anti-notch filter, Goertzel algorithm
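The Goertzel stage is standard and easy to sketch: evaluate the power of a single DFT bin, here the period-3 bin (frequency 2π/3, i.e., bin N/3), over a sliding window of a numerically mapped DNA strand. The binary A-indicator mapping and the window length are illustrative choices; the anti-notch filter branch of the ensemble is not shown.

```python
import numpy as np

def goertzel_power(x, freq_bin):
    """Goertzel recursion: power of one DFT bin without a full FFT."""
    n = len(x)
    w = 2.0 * np.pi * freq_bin / n
    coeff = 2.0 * np.cos(w)
    s_prev = s_prev2 = 0.0
    for sample in x:
        s = sample + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def period3_profile(dna, window=351, step=3):
    """Slide a window along a numerically mapped strand and evaluate the
    Goertzel power at the period-3 bin (window/3); peaks suggest coding
    regions. Binary A-indicator mapping is one common representation."""
    x = np.array([1.0 if b == "A" else 0.0 for b in dna])
    bin3 = window / 3.0
    return [goertzel_power(x[i:i + window], bin3)
            for i in range(0, len(x) - window + 1, step)]

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    seq = "".join(rng.choice(list("ACGT"), 900)) + "ATG" * 300  # coding-like tail
    prof = period3_profile(seq)
    print("max power at window", int(np.argmax(prof)))          # lands in the tail
```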
Procedia PDF Downloads 387
3129 An Enhanced MEIT Approach for Itemset Mining Using Levelwise Pruning
Authors: Tanvi P. Patel, Warish D. Patel
Abstract:
Association rule mining forms the core of data mining and is termed one of the well-known methodologies of data mining. The objective of mining is to find interesting correlations, frequent patterns, associations, or causal structures among sets of items in transaction databases or other data repositories. Hence, association rule mining is imperative to mine patterns and then generate rules from the obtained patterns. For efficient targeted query processing, finding frequent patterns, and itemset mining, an efficient way to generate an itemset tree structure is the Memory Efficient Itemset Tree (MEIT). The memory-efficient IT is efficient for storing itemsets but takes more time compared to the traditional IT. The proposed strategy generates maximal frequent itemsets from the memory-efficient itemset tree by using levelwise pruning. To this end, pre-pruning of items based on the minimum support count is first carried out, followed by itemset tree reconstruction. By producing maximal frequent itemsets, fewer patterns are generated, and the tree size is also reduced compared to MEIT. Therefore, the enhanced memory-efficient IT approach proposed here helps to optimize main-memory overhead as well as reduce processing time.
Keywords: association rule mining, itemset mining, itemset tree, MEIT, maximal frequent pattern
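The levelwise idea with minimum-support pre-pruning can be sketched without the tree machinery. In the sketch below, a flat Apriori-style candidate loop stands in for the itemset-tree traversal, so it illustrates the pruning logic only, not MEIT's memory layout.

```python
from itertools import chain

def maximal_frequent_itemsets(transactions, min_sup):
    """Levelwise mining with pre-pruning: items below min_sup are dropped
    before any candidate construction (the pre-pruning step), frequent
    sets are then grown level by level, and finally only the maximal
    ones (no frequent proper superset) are kept."""
    counts = {}
    for t in transactions:
        for item in t:
            counts[item] = counts.get(item, 0) + 1
    items = sorted(i for i, c in counts.items() if c >= min_sup)  # pre-pruning
    frequent, level = [], [frozenset([i]) for i in items]
    while level:
        frequent.extend(level)
        candidates = {a | b for a in level for b in level
                      if len(a | b) == len(a) + 1}
        level = [c for c in candidates
                 if sum(1 for t in transactions if c <= set(t)) >= min_sup]
    return [s for s in frequent if not any(s < t for t in frequent)]

if __name__ == "__main__":
    db = [["a", "b", "c"], ["a", "b"], ["a", "c"], ["b", "c"], ["a", "b", "c"]]
    print(maximal_frequent_itemsets(db, min_sup=3))   # the three frequent pairs
```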
Procedia PDF Downloads 371
3128 Roasting Degree of Cocoa Beans by Artificial Neural Network (ANN) Based Electronic Nose System and Gas Chromatography (GC)
Authors: Juzhong Tan, William Kerr
Abstract:
Roasting is one critical procedure in chocolate processing, where special flavors are developed, moisture content is decreased, and better processing properties are obtained. Therefore, determination of the roasting degree of cocoa beans is important for chocolate manufacturers to ensure the quality of chocolate products, and it also decides the commercial value of cocoa beans collected from cocoa farmers. Assessment of the roasting degree of cocoa beans currently relies on human specialists, who are sometimes biased, and on chemical analysis, which takes a long time and is inaccessible to many manufacturers and farmers. In this study, a self-made electronic nose system consisting of gas sensors (TGS 800 and 2000 series) was used to detect the gas generated by cocoa beans with different roasting degrees (0 min, 20 min, 30 min, and 40 min), and the signals collected by the gas sensors were used to train a three-layer ANN. Chemical analysis of the graded beans was performed with a traditional GC-MS system, and the contents of volatile chemical compounds were used to train another ANN as a reference to the ANN trained on electronic nose signals. Both trained ANNs were used to predict cocoa beans with different roasting degrees for validation. The best grading accuracy achieved by the ANN trained on electronic nose signals (using signals from TGS 813, 826, 820, 880, 830, 2620, 2602, and 2610) turned out to be 96.7%, whereas the GC-trained ANN achieved an accuracy of 83.8%.
Keywords: artificial neural network, cocoa bean, electronic nose, roasting
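A minimal sketch of the classification setup: synthetic rows stand in for steady-state responses of the eight listed sensors, labels are the four roasting times, and a small feed-forward network (one hidden layer, i.e., three layers counting input and output) is cross-validated. All data here are fabricated stand-ins; only the structure mirrors the described experiment.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in data: rows = bean samples, columns = responses of
# the 8 sensors (TGS 813, 826, 820, 880, 830, 2620, 2602, 2610);
# labels = roasting time in minutes.
rng = np.random.default_rng(3)
X = rng.normal(size=(120, 8)) + np.repeat(np.arange(4), 30)[:, None] * 0.8
y = np.repeat([0, 20, 30, 40], 30)

# A three-layer ANN comparable to the one trained on e-nose signals.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
print("CV accuracy: %.3f" % cross_val_score(net, X, y, cv=5).mean())
```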
Procedia PDF Downloads 234
3127 Analyzing the Effect of Materials’ Selection on Energy Saving and Carbon Footprint: A Case Study Simulation of Concrete Structure Building
Authors: M. Kouhirostamkolaei, M. Kouhirostami, M. Sam, J. Woo, A. T. Asutosh, J. Li, C. Kibert
Abstract:
Construction is one of the most energy-consuming activities in the urban environment and results in a significant amount of greenhouse gas emissions around the world; thus, the impact of the construction industry on global warming is undeniable. Reducing building energy consumption and mitigating carbon production can slow the rate of global warming. The purpose of this study is to determine the amount of energy consumption and carbon dioxide production during the operation phase and the impact of using new shells on energy saving and carbon footprint. A residential building with a reinforced concrete structure was selected in Babolsar, Iran. DesignBuilder software was used to simulate one year of building operation and to calculate the amount of carbon dioxide production and energy consumption in the operation phase of the building. The primary results show that the building uses 61,750 kWh of energy each year. Computer simulation analyzes the effect of changing the building shell (using XPS polystyrene and new electrochromic windows) as well as changing the type of lighting on the reduction of energy consumption and the subsequent carbon dioxide production. The results show that the amount of energy and carbon production during building operation is reduced by approximately 70% by applying the proposed changes; the changes reduce CO2e to 11,345 kg CO2/yr. The results of this study help designers and engineers to consider the material selection process as one of the most important stages of design for improving the energy performance of buildings.
Keywords: construction materials, green construction, energy simulation, carbon footprint, energy saving, concrete structure, DesignBuilder
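A back-of-envelope check of the reported figures; the ~70% reduction is applied uniformly to both energy and carbon, and the grid emission factor computed below is merely implied by the abstract's numbers, not stated in it.

```python
# Sanity-check arithmetic on the abstract's reported values.
baseline_energy = 61750                            # kWh/yr, simulated baseline
saving = 0.70                                      # reported reduction
retrofit_energy = baseline_energy * (1 - saving)   # ~18,525 kWh/yr after retrofit
retrofit_co2 = 11345                               # kg CO2e/yr, reported
baseline_co2 = retrofit_co2 / (1 - saving)         # ~37,817 kg CO2e/yr implied
implied_factor = baseline_co2 / baseline_energy    # implied, not stated
print(f"retrofit energy ~{retrofit_energy:,.0f} kWh/yr")
print(f"implied grid factor ~{implied_factor:.2f} kg CO2e/kWh")
```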
Procedia PDF Downloads 198
3126 Parametric Models of Facade Designs of High-Rise Residential Buildings
Authors: Yuchen Sharon Sung, Yingjui Tseng
Abstract:
High-rise residential buildings have become the most mainstream housing pattern in the world's metropolises under the current trend of urbanization. The facades of high-rise buildings are essential elements of the urban landscape. The skins of these facades are important media between the interior and exterior of high-rise buildings: they not only connect users and environments but also play an important functional and aesthetic role. This research studies the skins of high-rise residential buildings, using the methodology of shape grammar to find the rules that determine the combinations of facade patterns and analyzing the patterns' parameters using the software Grasshopper. We chose a number of facades of high-rise residential buildings as sources to discover the underlying rules and concepts of the generation of facade skins. This research also provides the rules that influence the composition of facade skins. The items of the facade skins, such as windows, balconies, walls, sun visors, and metal grilles, are treated as elements in the system of facade skins. The compositions of these elements are categorized and described by logical rules, and the types of high-rise building facade skins are modelled in Grasshopper. A variety of analyzed patterns can then be applied to other facade skins through this parametric mechanism. Using the patterns established in the models, researchers can analyze each single item to do more detailed tests, and architects can apply each of these items to construct facades for other buildings through various combinations and permutations. The goal of these models is to develop a mechanism to generate prototypes in order to facilitate the generation of various facade skins.
Keywords: facade skin, grasshopper, high-rise residential building, shape grammar
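The generative side of a shape grammar can be illustrated with a toy rule set: non-terminal bay symbols are rewritten until only facade elements (window, wall, balcony, grille, visor) remain, and a floors-by-bays grid is derived. The productions below are hypothetical, not the rule set extracted in the paper, and plain Python stands in for the Grasshopper definition.

```python
import random

RULES = {
    # Toy shape-grammar productions for one facade bay (hypothetical).
    "BAY":      [["WINDOW", "WALL"], ["BALCONY"], ["WINDOW", "SUNVISOR"]],
    "WINDOW":   [["window"]],
    "WALL":     [["wall"]],
    "BALCONY":  [["balcony", "grille"]],
    "SUNVISOR": [["window", "visor"]],
}

def derive(symbol, rng):
    """Rewrite non-terminals until only facade elements remain."""
    if symbol not in RULES:
        return [symbol]                    # terminal facade element
    out = []
    for s in rng.choice(RULES[symbol]):    # pick one production
        out.extend(derive(s, rng))
    return out

rng = random.Random(0)
facade = [[derive("BAY", rng) for _ in range(4)] for _ in range(6)]  # 6 floors x 4 bays
for floor in facade:
    print(" | ".join(" ".join(bay) for bay in floor))
```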
Procedia PDF Downloads 509
3125 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring
Authors: Zheng Wang, Zhenhong Li, Jon Mills
Abstract:
Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring of various geohazards, e.g., landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise from processing high temporal-resolution continuous GBSAR data, including the extreme cost of computational random-access memory (RAM), the delay of displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and processes continuous GBSAR images unit by unit; images within a window form a basic unit. By taking this strategy, the RAM requirement is reduced to only one unit of images, and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected, as the chain keeps temporarily-coherent pixels which are present only in certain units but not in the whole observation period. The chain supports real-time processing of continuous data, and the delay in creating displacement maps can be shortened without waiting for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on a stack of images from a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence. The temporal-averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially-coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the level of a few sub-millimetres are achieved in several applications (e.g., a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring in a wide range of scientific and practical applications.
Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring
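The memory-bounding trick of the first chain, holding only one unit (window) of acquisitions in RAM at a time, can be sketched with a generator. The mean phase of consecutive interferometric pairs below is a crude stand-in for the chain's real interferogram formation, coherence screening, and phase-to-displacement conversion.

```python
import numpy as np

def process_units(image_stream, unit_size=20):
    """Unit-by-unit processing: only one window of acquisitions is held
    in RAM at a time, so an arbitrarily long continuous campaign fits in
    bounded memory; one image is carried over to keep units connected."""
    unit = []
    for img in image_stream:
        unit.append(img)
        if len(unit) == unit_size:
            pairs = [b * np.conj(a) for a, b in zip(unit, unit[1:])]
            yield np.angle(np.mean(pairs, axis=0))  # per-unit phase proxy
            unit = unit[-1:]                        # overlap one acquisition

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    def stream(n=60, shape=(32, 32)):
        for k in range(n):                          # slow deformation ramp
            yield np.exp(1j * (0.001 * k + 0.01 * rng.standard_normal(shape)))
    for i, disp in enumerate(process_units(stream())):
        print(f"unit {i}: mean pairwise phase {disp.mean():+.4f} rad")
```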
Procedia PDF Downloads 161
3124 Development of the Food Market of the Republic of Kazakhstan in the Field of Milk Processing
Authors: Gulmira Zhakupova, Tamara Tultabayeva, Aknur Muldasheva, Assem Sagandyk
Abstract:
The development of technologies for producing products with increased biological value, based on the use of natural food raw materials, is an important task in the policy of the food market of the Republic of Kazakhstan. For Kazakhstan, livestock farming, in particular sheep farming, is the most ancient and developed industry and way of life. The history of the Kazakh people is largely connected with this type of agricultural production, with established traditions of using dairy products made from sheep's milk. Therefore, the development of new technologies for sheep's milk remains relevant. In addition, sheep milk products are one of the most promising areas for the development of food technology for therapeutic and prophylactic purposes, as a source of protein, immunoglobulins, minerals, vitamins, and other biologically active compounds. This article presents the results of research on milk processing technology. The objective of the study is to examine the possibilities of processing sheep milk and its role in human nutrition, and to present the results of research on improving the technology of sheep milk products. The studies were carried out on the basis of the sanitary and hygienic requirements for dairy products, in accordance with the following test methods. For the microbiological analysis, we used the method for identifying Salmonella bacteria (horizontal method for identifying, counting, and serotyping Salmonella) in a certain mass or volume of product. Nutritional value is a complex of properties of food products that meet human physiological needs for energy and basic nutrients. The protein mass fraction was determined by the Kjeldahl method. This method is based on the mineralization of a milk sample with concentrated sulfuric acid in the presence of an oxidizing agent, an inert salt (potassium sulfate), and a catalyst (copper sulfate); the amino groups of the protein are thereby converted into ammonium sulfate dissolved in sulfuric acid. The vitamin composition was determined by HPLC. To determine the content of mineral substances in the studied samples, the method of atomic absorption spectrophotometry was used. The study identified the technological parameters of sheep milk products and determined the prospects for research on them. Microbiological studies were used to determine the safety of the studied product; according to the results of the microbiological analysis, no deviations from the norm were identified, indicating the high safety of the products under study. In terms of nutritional value, the resulting products are high in protein, and data on a positive amino acid content were also obtained. The results will be used in the food industry and will serve as recommendations for manufacturers.
Keywords: dairy, milk processing, nutrition, colostrum
Procedia PDF Downloads 57
3123 Active Noise Cancellation in the Rectangular Enclosure Systems
Authors: D. Shakirah Shukor, A. Aminudin, Hashim U. A., Waziralilah N. Fathiah, T. Vikneshvaran
Abstract:
Interior noise control is essential to explore, as interior acoustic analysis is significant in systems such as automobiles, aircraft, air-handling systems, and diesel engine exhaust systems. In this research, experimental work was undertaken to actively cancel noise in a rectangular enclosure. The rectangular enclosure was fabricated with multiple speakers and microphones inside it. A software program using digital signal processing was implemented to evaluate the proposed method. Experimental work was conducted to obtain the acoustic behavior and characteristics of the rectangular enclosure and to perform noise cancellation based on active noise control in the low-frequency range. Noise is generated using multiple speakers inside the enclosure, and microphones are used for noise measurements. The technique for noise cancellation relies on the principle of destructive interference between two sound fields in the rectangular enclosure: one field is generated by the original or primary sound source, the other by a secondary sound source set up to interfere with, and cancel, that unwanted primary sound. At the end of this research, the results for output noise before and after cancellation are presented and discussed. On the basis of the findings presented in this research, active noise cancellation in a rectangular enclosure is worth exploring in order to improve noise control technologies.
Keywords: active noise control, digital signal processing, noise cancellation, rectangular enclosure
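The destructive-interference principle is usually realised with an adaptive filter. A single-channel LMS sketch follows; the secondary acoustic path (speaker to error microphone) is idealised as unity, whereas a practical enclosure system would use FxLMS with an estimated secondary-path model.

```python
import numpy as np

def lms_anc(reference, primary_at_error_mic, n_taps=32, mu=0.01):
    """Single-channel LMS active noise control sketch: the adaptive FIR
    filter drives the secondary speaker so that its field destructively
    interferes with the primary noise at the error microphone.
    Assumption: unity secondary path (idealised)."""
    w = np.zeros(n_taps)
    err = np.zeros(len(reference))
    for n in range(n_taps, len(reference)):
        x = reference[n - n_taps:n][::-1]
        anti_noise = w @ x                     # secondary-source output
        err[n] = primary_at_error_mic[n] - anti_noise
        w += mu * err[n] * x                   # adapt toward cancellation
    return err

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    t = np.arange(8000)
    noise = np.sin(2 * np.pi * 0.01 * t) + 0.05 * rng.standard_normal(8000)
    residual = lms_anc(noise, noise)
    print("residual power after convergence: %.5f"
          % np.mean(residual[-1000:] ** 2))    # far below the primary power
```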
Procedia PDF Downloads 272
3122 Exploration into Bio Inspired Computing Based on Spintronic Energy Efficiency Principles and Neuromorphic Speed Pathways
Authors: Anirudh Lahiri
Abstract:
Neuromorphic computing, inspired by the intricate operations of biological neural networks, offers a revolutionary approach to overcoming the limitations of traditional computing architectures. This research proposes the integration of spintronics with neuromorphic systems, aiming to enhance computational performance, scalability, and energy efficiency. Traditional computing systems, based on the Von Neumann architecture, struggle with scalability and efficiency due to the segregation of memory and processing functions. In contrast, the human brain exemplifies high efficiency and adaptability, processing vast amounts of information with minimal energy consumption. This project explores the use of spintronics, which utilizes the electron's spin rather than its charge, to create more energy-efficient computing systems. Spintronic devices, such as magnetic tunnel junctions (MTJs) manipulated through spin-transfer torque (STT) and spin-orbit torque (SOT), offer a promising pathway to reducing power consumption and enhancing the speed of data processing. The integration of these devices within a neuromorphic framework aims to replicate the efficiency and adaptability of biological systems. The research is structured into three phases: an exhaustive literature review to build a theoretical foundation, laboratory experiments to test and optimize the theoretical models, and iterative refinements based on experimental results to finalize the system. The initial phase focuses on understanding the current state of neuromorphic and spintronic technologies. The second phase involves practical experimentation with spintronic devices and the development of neuromorphic systems that mimic synaptic plasticity and other biological processes. The final phase focuses on refining the systems based on feedback from the testing phase and preparing the findings for publication. The expected contributions of this research are twofold. Firstly, it aims to significantly reduce the energy consumption of computational systems while maintaining or increasing processing speed, addressing a critical need in the field of computing. Secondly, it seeks to enhance the learning capabilities of neuromorphic systems, allowing them to adapt more dynamically to changing environmental inputs, thus better mimicking the human brain's functionality. The integration of spintronics with neuromorphic computing could revolutionize how computational systems are designed, making them more efficient, faster, and more adaptable. This research aligns with the ongoing pursuit of energy-efficient and scalable computing solutions, marking a significant step forward in the field of computational technology.
Keywords: material science, biological engineering, mechanical engineering, neuromorphic computing, spintronics, energy efficiency, computational scalability, synaptic plasticity
Procedia PDF Downloads 43
3121 Low Power Glitch Free Dual Output Coarse Digitally Controlled Delay Lines
Authors: K. Shaji Mon, P. R. John Sreenidhi
Abstract:
In deep-submicrometer CMOS processes, the time-domain resolution of a digital signal is becoming higher than the voltage resolution of analog signals. This observation is nowadays pushing toward a new circuit design paradigm in which traditional analog signal processing is expected to be progressively substituted by the processing of times in the digital domain. Within this novel paradigm, digitally controlled delay lines (DCDL) should play the role of digital-to-analog converters in traditional, analog-intensive circuits. Digital delay-locked loops are highly prevalent in integrated systems. The proposed paper addresses the glitches present in delay circuits, along with area, power dissipation, and signal integrity. The digitally controlled delay lines (DCDL) under study have been designed in a 90 nm CMOS technology with 6 metal layers (copper), strained SiGe, and a low-k dielectric. Simulation and synthesis results show that the novel circuits exhibit no glitches for the dual output coarse DCDL, dissipate less power, and consume less area compared to the glitch-free NAND-based DCDL.
Keywords: glitch free, NAND-based DCDL, CMOS, deep-submicrometer
Procedia PDF Downloads 245
3120 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University
Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat
Abstract:
Scientific researchers face challenges in the analysis of very large data sets, which are increasing at a noticeable rate in today's and tomorrow's technologies. Hadoop and Spark are software frameworks developed for this purpose, and the Hadoop framework is suitable for many different hardware platforms. In this research, a Scientific Linux cluster for Big Data analysis (SLBD) is presented. SLBD runs open source software with large computational capacity on a high performance cluster infrastructure. SLBD is composed of one cluster containing identical, commodity-grade computers interconnected via a small LAN. SLBD consists of a fast switch and Gigabit-Ethernet cards which connect four nodes. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster by using the MapReduce algorithm. The MapReduce algorithm divides the task into smaller tasks which are assigned to the network nodes; it then collects the results and forms the final result dataset. The SLBD clustering system allows fast and efficient processing of the large amounts of data resulting from different applications. SLBD also provides high performance, high throughput, high availability, expandability, and cluster scalability.
Keywords: big data platforms, cloudera manager, Hadoop, MapReduce
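A minimal single-process sketch of the MapReduce flow that Hadoop distributes across the cluster nodes: mappers emit key-value pairs per input split, a shuffle groups them by key, and reducers aggregate. Word counting is the conventional illustration; it is not tied to any specific SLBD workload.

```python
from collections import defaultdict
from itertools import chain

def map_phase(chunk):
    """Mapper: emit (key, 1) pairs for one split of the input."""
    return [(word, 1) for word in chunk.split()]

def reduce_phase(shuffled):
    """Reducer: combine all values that share a key."""
    return {key: sum(vals) for key, vals in shuffled.items()}

def mapreduce(chunks):
    # Shuffle: group intermediate pairs by key, as the framework would
    # do across the cluster nodes.
    shuffled = defaultdict(list)
    for key, val in chain.from_iterable(map(map_phase, chunks)):
        shuffled[key].append(val)
    return reduce_phase(shuffled)

if __name__ == "__main__":
    splits = ["big data big cluster", "data node data"]
    print(mapreduce(splits))    # {'big': 2, 'data': 3, 'cluster': 1, 'node': 1}
```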
Procedia PDF Downloads 358
3119 Sustainable Manufacturing of Concentrated Latex and Ribbed Smoked Sheets in Sri Lanka
Authors: Pasan Dunuwila, V. H. L. Rodrigo, Naohiro Goto
Abstract:
Sri Lanka is one of the largest natural rubber (NR) producers in the world, and the NR industry is a major foreign exchange earner. Among the locally manufactured NR products, concentrated latex (CL) and ribbed smoked sheets (RSS) hold a significant position. Furthermore, these products become the foundation for many products utilized by people all over the world (e.g., gloves, condoms, tires, etc.). Processing of CL and RSS costs a significant amount of material, energy, and workforce. Against this background, both manufacturing lines have been immensely challenged by waste, low productivity, lack of cost efficiency, the rising cost of production, and many environmental issues. To face these challenges, the adoption of sustainable manufacturing measures that use less energy, water, and materials and produce less waste is imperative. However, these sectors lack comprehensive studies that shed light on such measures and thoroughly discuss their improvement potential from both environmental and economic points of view. Therefore, based on a study of three CL and three RSS mills in Sri Lanka, this study deploys sustainable manufacturing techniques and tools to uncover the underlying potential to improve performance in the CL and RSS processing sectors. This study comprises three steps: 1. quantification of average material waste, economic losses, and greenhouse gas (GHG) emissions via material flow analysis (MFA), material flow cost accounting (MFCA), and life cycle assessment (LCA) in each manufacturing process; 2. identification of improvement options with the help of Pareto and what-if analyses, field interviews, and the existing literature; and 3. validation of the identified improvement options via the re-execution of MFA, MFCA, and LCA. With the help of this methodology, the economic and environmental hotspots and the degrees of improvement in both systems could be identified. Results highlighted that each process could be improved to have less waste, fewer monetary losses, lower manufacturing costs, and lower GHG emissions. Conclusively, the study's methodology and findings are believed to be beneficial for assuring sustainable growth not only in the Sri Lankan NR processing sector itself but also in the NR or any other industry rooted in other developing countries.
Keywords: concentrated latex, natural rubber, ribbed smoked sheets, Sri Lanka
Procedia PDF Downloads 261
3118 Signal Processing Techniques for Adaptive Beamforming with Robustness
Authors: Ju-Hong Lee, Ching-Wei Liao
Abstract:
Adaptive beamforming using an antenna array of sensors is useful in the process of adaptively detecting and preserving the presence of the desired signal while suppressing the interference and the background noise. For conventional adaptive array beamforming, we require prior information on either the impinging direction or the waveform of the desired signal to adapt the weights. The adaptive weights of an antenna array beamformer under a steered-beam constraint are calculated by minimizing the output power of the beamformer subject to the constraint that forces the beamformer to make a constant response in the steering direction. Hence, the performance of the beamformer is very sensitive to the accuracy of the steering operation. In the literature, it is well known that the performance of an adaptive beamformer is deteriorated by any steering angle error encountered in many practical applications, e.g., wireless communication systems with massive antennas deployed at the base station and user equipment. Hence, developing effective signal processing techniques to deal with the problem of steering angle error in array beamforming systems has become an important research topic. In this paper, we present an effective signal processing technique for constructing an adaptive beamformer that is robust against steering angle error. The proposed array beamformer adaptively estimates the actual direction of the desired signal by using the presumed steering vector and the received array data snapshots. Based on the presumed steering vector and a preset angle range for steering mismatch tolerance, we first create a matrix related to the direction vectors of the signal sources. Two projection matrices are generated from this matrix. The projection matrix associated with the desired signal information, together with the received array data, is utilized to iteratively estimate the actual direction vector of the desired signal. The estimated direction vector of the desired signal is then used to appropriately find the quiescent weight vector. The other projection matrix is set to be the signal blocking matrix required for performing adaptive beamforming. Accordingly, the proposed beamformer consists of adaptive quiescent weights and partially adaptive weights. Several computer simulation examples are provided for evaluating and comparing the proposed technique with existing robust techniques.
Keywords: adaptive beamforming, robustness, signal blocking, steering angle error
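For orientation, the baseline the paper builds on, minimum output power under a unit-response steering constraint, is sketched below as a standard MVDR beamformer on a uniform linear array. The paper's projection-based iterative re-estimation of the true steering vector is not reproduced; the presumed steering vector is used directly, which is exactly the mismatch-sensitive case the paper sets out to fix.

```python
import numpy as np

def steering_vector(theta_deg, n_elems, spacing=0.5):
    """Uniform linear array response for a plane wave from theta."""
    k = 2 * np.pi * spacing * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(n_elems))

def mvdr_weights(snapshots, presumed_theta, n_elems):
    """Minimum-variance weights under the steered-beam constraint:
    w = R^{-1} a / (a^H R^{-1} a)."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]
    R += 1e-3 * np.eye(n_elems)              # diagonal loading for stability
    a = steering_vector(presumed_theta, n_elems)
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj() @ Ri_a)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    N, T = 8, 400
    sig = steering_vector(10, N)[:, None] * rng.standard_normal(T)
    jam = steering_vector(-40, N)[:, None] * 3 * rng.standard_normal(T)
    x = sig + jam + 0.1 * (rng.standard_normal((N, T))
                           + 1j * rng.standard_normal((N, T)))
    w = mvdr_weights(x, presumed_theta=10, n_elems=N)
    print("response toward 10 deg: %.2f" % abs(w.conj() @ steering_vector(10, N)))
    print("response toward -40 deg: %.4f" % abs(w.conj() @ steering_vector(-40, N)))
```

With an exact presumed angle, the constrained response is 1 toward the signal and near 0 toward the jammer; perturbing presumed_theta by a few degrees shows the sensitivity the paper addresses.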
Procedia PDF Downloads 124
3117 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation
Authors: Aicha Majda, Abdelhamid El Hassani
Abstract:
Lung CT image segmentation is a prerequisite in lung CT image analysis. Most conventional methods need post-processing to deal with abnormal lung CT scans, such as those with lung nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm consists of directly comparing the pixel values of two neighboring regions, which is not accurate because this kind of metric is extremely sensitive to minor transformations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric. The boundary penalty term in the graph cut algorithm is defined based on a patch-based similarity measurement instead of the simple intensity measurement of the standard method. The weights between each pixel and its neighboring pixels are based on the new term thus obtained, and the graph is then created using these weights between its nodes. Finally, the segmentation is completed with the minimum-cut/max-flow algorithm. Experimental results show that the proposed method is very accurate and efficient and can directly provide explicit lung regions without any post-processing operations, in contrast to the standard method.
Keywords: graph cuts, lung CT scan, lung parenchyma segmentation, patch-based similarity metric
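The modified boundary term can be sketched as a weight function on neighbouring pixels: instead of the exponential of the squared intensity difference, use the exponential of the mean squared difference between the two patches centred on the pixels. The Gaussian form and the parameter values below are conventional choices, not necessarily the paper's; the min-cut/max-flow step itself is omitted.

```python
import numpy as np

def patch_weight(img, p, q, radius=2, sigma=10.0):
    """Boundary-penalty weight between neighbouring pixels p and q from
    patch dissimilarity rather than the single-pixel difference. Larger
    weight = stronger bond = costlier to cut p and q apart."""
    def patch(c):
        y, x = c
        return img[max(0, y - radius): y + radius + 1,
                   max(0, x - radius): x + radius + 1].astype(float)
    a, b = patch(p), patch(q)
    h, w = min(a.shape[0], b.shape[0]), min(a.shape[1], b.shape[1])
    d2 = np.mean((a[:h, :w] - b[:h, :w]) ** 2)      # patch dissimilarity
    return np.exp(-d2 / (2 * sigma ** 2))

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    ct = rng.normal(100, 2, size=(64, 64))          # synthetic "tissue"
    ct[:, 32:] += 80                                # sharp region boundary
    print("inside region  :", round(patch_weight(ct, (10, 10), (10, 11)), 3))
    print("across boundary:", round(patch_weight(ct, (10, 31), (10, 32)), 3))
```

The weight stays near 1 inside a homogeneous region and collapses toward 0 across the boundary, while isolated noisy pixels barely move the patch average, which is exactly the robustness argument of the paper.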
Procedia PDF Downloads 169
3116 Production of High Purity Cellulose Products from Sawdust Waste Material
Authors: Simiksha Balkissoon, Jerome Andrew, Bruce Sithole
Abstract:
Approximately half of the wood processed in the Forestry, Timber, Pulp and Paper (FTPP) sector is accumulated as waste. The concept of a "green economy" encourages industries to employ revolutionary, transformative technologies to eliminate waste generation by exploring the development of new value chains. The transition towards an almost paperless world, driven by the rise of digital media, has resulted in a decline in traditional paper markets, prompting the FTPP sector to reposition itself and expand its product offerings by unlocking the potential of value-adding opportunities from renewable resources such as wood to generate revenue and mitigate its environmental impact. The production of valuable products from wood waste such as sawdust has been extensively explored in recent years. Wood components such as lignin, cellulose, and hemicelluloses, which can be extracted selectively by chemical processing, are suitable candidates for producing numerous high-value products. In this study, a novel approach to producing high-value cellulose products, such as dissolving wood pulp (DWP), from sawdust was developed. DWP is a high-purity cellulose product used in several applications in the pharmaceutical, textile, food, and paint and coatings industries. The proposed approach demonstrates the potential to eliminate several complex processing stages, such as pulping and bleaching, which are associated with the traditional commercial processes for producing high-purity cellulose products such as DWP, making it less chemical-, energy-, and water-intensive. The developed process followed the path of experimentally designed lab tests evaluating typical processing conditions, such as residence time, chemical concentrations, liquid-to-solid ratios, and temperature, followed by the application of suitable purification steps. Characterization of the product from the initial stage was conducted using commercially available DWP grades as reference materials. The chemical characteristics of the products thus far have shown properties similar to commercial products, making the proposed process a promising and viable option for the production of DWP from sawdust.
Keywords: biomass, cellulose, chemical treatment, dissolving wood pulp
Procedia PDF Downloads 186
3115 Short Answer Grading Using Multi-Context Features
Authors: S. Sharan Sundar, Nithish B. Moudhgalya, Nidhi Bhandari, Vineeth Vijayaraghavan
Abstract:
Automatic Short Answer Grading is one of the prime applications of artificial intelligence in education. Several approaches have been explored over the years, involving the utilization of selective handcrafted features, graphical matching techniques, concept identification and mapping, complex deep frameworks, sentence embeddings, etc. However, keeping in mind the real-world application of the task, these solutions present a slight overhead in terms of computation and resources in achieving high performance. In this work, a simple and effective solution is proposed, making use of elemental features based on statistical and linguistic properties and word-based similarity measures in conjunction with tree-based classifiers and regressors. The results for classification tasks show improvements ranging from 1% to 30%, while the regression task shows a stark improvement of 35%. The authors attribute these improvements to the addition of multiple similarity scores, which provide an ensemble of scoring criteria to the models. The authors also believe the work reaffirms that classical natural language processing techniques and simple machine learning models can be used to achieve high results for short answer grading.
Keywords: artificial intelligence, intelligent systems, natural language processing, text mining
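The recipe, cheap similarity features fed to a tree ensemble, can be sketched in a few lines. The three features and the toy graded answers below are illustrative stand-ins, not the paper's feature set or data; a gradient-boosted regressor plays the tree-based model.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def features(reference, answer):
    """Elemental word-overlap similarity features (illustrative, not the
    paper's exact feature set)."""
    ref, ans = set(reference.lower().split()), set(answer.lower().split())
    overlap = len(ref & ans) / max(1, len(ref))             # recall-style
    jaccard = len(ref & ans) / max(1, len(ref | ans))
    length_ratio = min(len(ans), len(ref)) / max(1, max(len(ans), len(ref)))
    return [overlap, jaccard, length_ratio]

reference = "a stack is a last in first out data structure"
answers = ["last in first out structure",
           "a queue is first in first out",
           "stack pops the most recently pushed item",
           "i do not know"]
grades = [5.0, 1.5, 4.0, 0.0]                               # toy tutor grades

X = np.array([features(reference, a) for a in answers])
model = GradientBoostingRegressor(random_state=0).fit(X, grades)
print(model.predict([features(reference, "last in first out")]))
```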
Procedia PDF Downloads 133
3114 A Constructivist Grounded Theory Study on the Impact of Automation on People and Gardening
Authors: Hamilton V. Niculescu
Abstract:
Following a three-year study conducted with eighteen Irish people involved in growing vegetables in various community gardens around Dublin, Republic of Ireland, it was revealed that the addition of some automated features aimed at improving agricultural practices was regarded as potentially beneficial and as a great tool to closely monitor climate conditions inside the greenhouses. The participants were provided with a free custom-built mobile app through which they could remotely monitor and control features such as irrigation, air ventilation, and windows to ensure optimal growing conditions for vegetables growing inside purpose-built greenhouses. While the initial interest was generally high, within weeks the participants' level of interaction with the enclosures slowly declined. By employing a constructivist grounded theory methodology, following focus group discussions, in-depth semi-structured interviews, and observations, it was revealed that participants' trust in newer technologies, and renewables in particular, was low. There are various reasons for this, but because the participants in this study consist mainly of working-class people, it can be argued that lack of education and knowledge are the main barriers acting against the adoption of innovations. Consequently, it was revealed that most participants eventually decided to "set and forget" the systems in automatic working mode, indicating that the immediate effect of introducing people to assisting technologies also introduced some unintended consequences into their lifestyle. It is argued that this occurrence also indicates that people initially "read" newer technologies and only adopt those features that they find useful and less intrusive with regard to their current lifestyle.
Keywords: automation, communication, greenhouse, sustainable
Procedia PDF Downloads 119
3113 The Effect of Feedstock Powder Treatment / Processing on the Microstructure, Quality, and Performance of Thermally Sprayed Titanium Based Composite Coating
Authors: Asma Salman, Brian Gabbitas, Peng Cao, Deliang Zhang
Abstract:
The performance of a coating is strongly dependent upon its microstructure, which in turn is dependent on the characteristics of the feedstock powder. This study involves the evaluation of the performance of a titanium-based composite coating produced by the HVOF (high-velocity oxygen fuel) spraying method. The feedstock for making the composite coating was produced using high-energy mechanical milling of TiO2 and Al powders followed by a combustion reaction. The characteristics of the feedstock powder were improved by treating it with an organic binder. Two types of coatings were produced, using treated and untreated feedstock powders. The microstructures and characteristics of both types of coatings were studied, and their thermal shock resistance was assessed by dipping them into molten aluminum. The results of this study showed that feedstock treatment did not have a significant effect on the microstructure of the coatings. However, it did affect the uniformity, thickness, and surface roughness of the coating on the steel substrate. The coating produced from untreated feedstock showed better thermal shock resistance in molten aluminum compared with the one produced from PVA (polyvinyl alcohol)-treated feedstock.
Keywords: coating, feedstock, powder processing, thermal shock resistance, thermal spraying
Procedia PDF Downloads 272
3112 The Application of Lesson Study Model in Writing Review Text in Junior High School
Authors: Sulastriningsih Djumingin
Abstract:
This study has three objectives. First, it aims at describing the ability of second-grade students to write review text without applying the Lesson Study model at SMPN 18 Makassar. Second, it seeks to describe the ability of second-grade students to write review text by applying the Lesson Study model at SMPN 18 Makassar. Third, it aims at testing the effectiveness of the Lesson Study model in writing review text at SMPN 18 Makassar. This research was a true experimental design with a posttest-only group design involving two groups: one control class and one experimental class. The research population comprised all 250 second-grade students at SMPN 18 Makassar, distributed across 8 classes. The sampling technique was purposive sampling. The control class was VIII2, consisting of 30 students, while the experimental class was VIII8, consisting of 30 students. The research instruments were in the form of observation and tests. The collected data were analyzed using descriptive statistical techniques and inferential statistical techniques of the t-test type, processed using SPSS 21 for Windows. The results show that: (1) of the 30 students in the control class, only 14 (47%) students scored more than 7.5, categorized as inadequate; (2) in the experimental class, 26 (87%) students obtained a score of 7.5, categorized as adequate; (3) the Lesson Study model is effective to be applied in writing review text. The comparison of the ability of the control class and the experimental class indicates that the value of t-count is greater than the value of t-table (2.411 > 1.667). This means that the alternative hypothesis (H1) proposed by the researcher is accepted.
Keywords: application, lesson study, review text, writing
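The decision rule can be reproduced in a few lines. The score vectors below are synthetic stand-ins for the two classes of 30 students (the abstract does not give raw scores); note that the critical value for df = 58 at the 0.05 level reproduces the reported t-table figure of about 1.667.

```python
import numpy as np
from scipy import stats

# Synthetic posttest scores standing in for the two classes of 30.
rng = np.random.default_rng(7)
control = rng.normal(7.2, 1.0, 30)
experimental = rng.normal(7.9, 1.0, 30)

t_count, _ = stats.ttest_ind(experimental, control)
t_table = stats.t.ppf(0.95, df=58)        # one-tailed, alpha = 0.05 -> ~1.667
print(f"t-count = {t_count:.3f}, t-table = {t_table:.3f}, "
      f"reject H0: {t_count > t_table}")
```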
Procedia PDF Downloads 202
3111 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations hold structured and unstructured information in different formats, sources, and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level, these organizations present some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as in the absence of operational capabilities to tackle these kinds of projects. Data warehouses and their applications are considered non-proprietary tools, which are of great interest to business intelligence, since they are the repository basis for creating models or patterns (behavior of customers, suppliers, products, social networks, and genomics) and facilitate corporate decision-making and research. The following paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP, and AUP. It also draws on object-relational models, spatial data models, and the baseline of data modeling under UML and Big Data, seeking in this way to deliver an agile methodology for the development of data warehouses that is simple and easy to apply. The methodology naturally takes into account the application of processes for information analysis, visualization, and data mining, particularly for pattern generation and for models derived from the structured fact objects.
Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse
Procedia PDF Downloads 409
3110 Impact of Agricultural Waste Utilization and Management on the Environment
Authors: Ravi Kumar
Abstract:
Agricultural wastes are the non-product outcomes of agricultural processing whose monetary value is less than the cost of their collection, transportation, and processing. When such agricultural waste is not properly disposed of, it may damage the natural environment and cause detrimental pollution of the atmosphere. Agricultural development and intensive farming methods usually result in wastes that remarkably affect rural environments in particular and the global environment in general. Agricultural waste has latent toxicity to human beings, animals, and plants through various direct and indirect outlets. The present paper explores the various activities that result in agricultural waste and the routes by which agricultural waste can be utilized in a manageable manner to reduce its adverse impact on the environment. Presently, the agricultural waste management system for ecological agriculture and sustainable development has emerged as a crucial issue for policymakers. There is an urgent need to consider agricultural wastes as prospective resources rather than undesirables in order to avoid the transmission and contamination of water, land, and air resources. Waste management includes the disposal and treatment of waste with a view to eliminating its threats by modifying the waste to reduce the microbial load. The study concludes that proper waste utilization and management will facilitate the purification and development of the ecosystem and provide feasible biofuel resources. Such utilization and management of these wastes for agricultural production may reduce their accumulation and further reduce environmental pollution by improving environmental health.
Keywords: agricultural waste, utilization, management, environment, health
Procedia PDF Downloads 96
3109 Effects of Climate Change and Livelihood Diversification on Gendered Productivity Gap of Farmers in Northern Regions of Ghana
Authors: William Adzawla
Abstract:
In the midst of climate variability and change, the role of gender in ensuring food production remains vital. Therefore, this study analysed gendered productivity among maize farmers and the effects of climate change and variability, as well as livelihood diversification, on the gendered productivity gap. The study involved a total of 619 farmers selected through a multistage sampling procedure. The data were analysed using the Oaxaca-Blinder decomposition model. From the results, there is a significant productivity gap of 58.8% between male and female heads and of 44.8% between male heads and female spouses. About 87.47% and 98.08% of these variations in gendered productivity, respectively, were explained by resource endowment. While livelihood diversification significantly influenced gendered productivity through both endowment and coefficient effects, climate variables significantly affected the productivity gap only through coefficient effects. The study concluded that there is a substantial gendered productivity gap among farmers, due particularly to differences in endowment. Generally, there is high potential for reducing gendered productivity gaps by providing equal diversification opportunities and reducing females' vulnerability to climate change. Among the livelihood activities, off-farm activities such as agro-processing and shea butter processing should be promoted. Similarly, the adoption of on-farm adaptation strategies should be promoted among the farmers.
Keywords: climate change and variability, gender, livelihood diversification, Oaxaca-Blinder decomposition, productivity gap
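For reference, the twofold Oaxaca-Blinder decomposition that separates the endowment and coefficient effects reported above can be written as follows (standard form; the paper's exact specification, e.g., the choice of reference coefficients, may differ):

```latex
\bar{Y}_m - \bar{Y}_f =
  \underbrace{(\bar{X}_m - \bar{X}_f)'\hat{\beta}_m}_{\text{endowment effect}}
+ \underbrace{\bar{X}_f'(\hat{\beta}_m - \hat{\beta}_f)}_{\text{coefficient effect}}
```

where \bar{Y} denotes mean (log) maize productivity, \bar{X} the mean vector of endowments (inputs, plot characteristics, climate, and diversification variables), and \hat{\beta} the estimated returns for the male (m) and female (f) groups.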
Procedia PDF Downloads 170
3108 Flexible Integration of Airbag Weakening Lines in Interior Components: Airbag Weakening with Jenoptik Laser Technology
Authors: Markus Remm, Sebastian Dienert
Abstract:
Vehicle interiors are changing not only in terms of design and functionality but also due to new driving situations in which, for example, autonomous operating modes are possible. Flexible seating positions are changing the requirements for passive safety system behavior and location in the interior of a vehicle. With fully autonomous driving, the driver can, for example, leave the position behind the steering wheel and take a seated position facing backward. Since autonomous and non-autonomous vehicles will share the same road network for the foreseeable future, accidents cannot be avoided, which makes the use of passive safety systems indispensable. With JENOPTIK-VOTAN® A technology, the trend towards flexible predetermined airbag weakening lines is enabled. With the help of laser beams, the predetermined weakening lines are introduced from the back side of the components so that they are absolutely invisible. This machining process is sensor-controlled and guarantees that a small residual wall thickness remains, for the best quality and reliability of airbag weakening lines. Due to the wide processing range of the laser, the processing of almost all materials is possible. A CO₂ laser is used for many plastics, natural fiber materials, foams, foils, and material composites. A femtosecond laser is used for natural materials and textiles that are very heat-sensitive; this laser type has extremely short laser pulses with very high energy densities. Supported by high-precision and fast movement of the laser beam by a laser scanner system, so-called cold ablation is enabled to predetermine weakening lines layer by layer until the desired residual wall thickness remains. In that way, for example, genuine leather can be processed in a material-friendly and process-reliable manner without design implications for the component's A-side. Passive safety in the vehicle is increased through the interaction of modern airbag technology and high-precision laser airbag weakening. The JENOPTIK-VOTAN® A product family has represented this for more than 25 years and is pointing the way to the future with new and innovative technologies.
Keywords: design freedom, interior material processing, laser technology, passive safety
Procedia PDF Downloads 121
3107 Potential of Salvia sclarea L. for Phytoremediation of Soils Contaminated with Heavy Metals
Authors: Violina R. Angelova, Radka V. Ivanova, Givko M. Todorov, Krasimir I. Ivanov
Abstract:
A field study was conducted to evaluate the efficacy of Salvia sclarea L. for phytoremediation of contaminated soils. The experiment was performed on agricultural fields contaminated by the Non-Ferrous-Metal Works near Plovdiv, Bulgaria. The content of heavy metals in different parts of Salvia sclarea L. (roots, stems, leaves, and inflorescences) was determined by ICP. The essential oil of Salvia sclarea L. was obtained by steam distillation under laboratory conditions, analyzed for heavy metals, and its chemical composition was determined. Salvia sclarea L. is a plant which is tolerant to heavy metals and can be grown on contaminated soils. Based on the obtained results and using the most common criteria, Salvia sclarea L. can be classified as a Pb hyperaccumulator and a Cd and Zn accumulator; therefore, this plant has suitable potential for the phytoremediation of heavy metal contaminated soils. Also favorable is the fact that heavy metals influence neither the development of Salvia sclarea L. nor the quality and quantity of the essential oil. For clary sage oil obtained from the processing of clary sage grown on highly contaminated soils, the key odour-determining ingredients meet the quality requirements of the European Pharmacopoeia and BS ISO 7609 regarding Bulgarian clary sage oil and/or have values that are close to the limits of these standards. The possibility of further industrial processing will make Salvia sclarea L. an economically interesting crop for farmers applying phytoextraction technology.
Keywords: clary sage, heavy metals, phytoremediation, polluted soils
Procedia PDF Downloads 216