Search results for: quadrature processing
3040 Wastewater from the Food Industry: Characteristics and Possibilities of Sediments on the Basis of the Dairy Industry
Authors: Monika Gałwa-Widera, Anna Kwarciak–Kozłowska, Lucyna Sławik-Dembiczak
Abstract:
The management of sewage sludge from small and medium-sized wastewater treatment plants is a vital issue, addressed both by scholars and by those directly involved in wastewater treatment and sludge management. According to the Law on Waste, the generator of waste is responsible for processing it so that the resulting product has a minimal impact on the environment. Small and medium-sized wastewater treatment plants have to deal with sludge management technologies that are far from the drying and incineration of sewage sludge, so other technologies can be used. One of them is the composting of sewage sludge, a method of processing and disposal that handles the sludge effectively. By composting, a product can be obtained that contains significant amounts of organic matter with assessable fertilizing qualities. Modifications to the process carried out in biological reactors allow a wholesome product to be obtained more rapidly. The research presented and discussed in this publication concerns assisting the composting process of sewage sludge and a biomass structural material in the proportions of 35% biomass, 55% sludge, and 10% structural material, using a method that involves re-inoculating the composting batch with leachate from the composting process treated by physical methods.
Keywords: biomass, composting, industry, sewage sludge
Procedia PDF Downloads 440
3039 Indium-Gallium-Zinc Oxide Photosynaptic Device with Alkylated Graphene Oxide for Optoelectronic Spike Processing
Authors: Seyong Oh, Jin-Hong Park
Abstract:
Recently, neuromorphic computing based on brain-inspired artificial neural networks (ANNs) has attracted a huge amount of research interest due to its technological ability to facilitate massively parallel, low-energy-consuming, and event-driven computing. In particular, research on artificial synapses that imitate the biological synapses responsible for human information processing and memory is in the spotlight. Here, we demonstrate a photosynaptic device wherein the synaptic weight is governed by a mixed spike consisting of a voltage spike and a light spike. Compared to the device operated only by the voltage spike, ∆G in the proposed photosynaptic device significantly increased from -2.32 nS to 5.95 nS with no degradation of nonlinearity (NL) (potentiation/depression values changed from 4.24/8 to 5/8). Furthermore, the Modified National Institute of Standards and Technology (MNIST) digit pattern recognition rates improved from 36% and 49% to 50% and 62% in ANNs consisting of the synaptic devices with 20 and 100 weight states, respectively. We expect that photosynaptic device technology driven by optoelectronic spikes will play an important role in implementing neuromorphic computing systems in the future.
Keywords: optoelectronic synapse, IGZO (Indium-Gallium-Zinc Oxide) photosynaptic device, optoelectronic spiking process, neuromorphic computing
Procedia PDF Downloads 174
3038 Corpus-Based Neural Machine Translation: Empirical Study Multilingual Corpus for Machine Translation of Opaque Idioms - Cloud AutoML Platform
Authors: Khadija Refouh
Abstract:
Culture-bound expressions have been a bottleneck for Natural Language Processing (NLP) and comprehension, especially in the case of machine translation (MT). In the last decade, the field of machine translation has greatly advanced. Neural machine translation (NMT) has recently achieved considerable improvement in translation quality and has outperformed previous traditional translation systems in many language pairs. NMT is an Artificial Intelligence (AI) approach that applies deep neural networks to language processing. Despite this development, there remain some serious challenges facing NMT when translating culture-bound expressions, especially for low-resource language pairs such as Arabic-English and Arabic-French, which is not the case with well-established language pairs such as English-French. Machine translation of opaque idioms from English into French is likely to be more accurate than translating them from English into Arabic. For example, the Google Translate application translated the sentence “What a bad weather! It rains cats and dogs.” to “يا له من طقس سيء! تمطر القطط والكلاب” in the target language Arabic, which is an inaccurate literal translation. The translation of the same sentence into the target language French was “Quel mauvais temps! Il pleut des cordes.”, where the Google Translate application used the accurate corresponding French idiom. This paper aims to perform NMT experiments towards better translation of opaque idioms using a high-quality, clean multilingual corpus. This corpus will be collected analytically from human-generated idiom translations. AutoML Translation, a Google neural machine translation platform, is used as a custom translation model to improve the translation of opaque idioms. The automatic evaluation of the custom model will be compared to Google NMT using the Bilingual Evaluation Understudy (BLEU) score. BLEU is an algorithm for evaluating the quality of text which has been machine-translated from one natural language to another. Human evaluation is integrated to test the reliability of the BLEU score. The researcher will examine syntactical, lexical, and semantic features using Halliday's functional theory.
Keywords: multilingual corpora, natural language processing (NLP), neural machine translation (NMT), opaque idioms
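As an aside on the evaluation step mentioned above, BLEU can be computed with off-the-shelf tooling; the minimal sketch below uses hypothetical tokenised sentences and NLTK's smoothing as an assumption, and is not the authors' evaluation pipeline:

```python
# Illustrative sketch only: scoring a hypothetical candidate translation against a
# human reference with BLEU, as used to compare a custom model with baseline NMT.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

references = [
    "quel mauvais temps ! il pleut des cordes .".split(),   # human idiomatic reference (assumed)
]
candidate = "quel mauvais temps ! il pleut des chats et des chiens .".split()  # literal MT output (assumed)

smooth = SmoothingFunction().method1  # avoids zero scores on short sentences
score = sentence_bleu(references, candidate,
                      weights=(0.25, 0.25, 0.25, 0.25),
                      smoothing_function=smooth)
print(f"BLEU: {score:.3f}")
```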
Procedia PDF Downloads 149
3037 Multiscale Process Modeling of Ceramic Matrix Composites
Authors: Marianna Maiaru, Gregory M. Odegard, Josh Kemppainen, Ivan Gallegos, Michael Olaya
Abstract:
Ceramic matrix composites (CMCs) are typically used in applications that require long-term mechanical integrity at elevated temperatures. CMCs are usually fabricated using a polymer precursor that is initially polymerized in situ with fiber reinforcement, followed by a series of cycles of pyrolysis to transform the polymer matrix into a rigid glass or ceramic. The pyrolysis step typically generates volatile gases, which creates porosity within the polymer matrix phase of the composite. Subsequent cycles of monomer infusion, polymerization, and pyrolysis are often used to reduce the porosity and thus increase the durability of the composite. Because of the significant expense of such iterative processing cycles, new generations of CMCs with improved durability and manufacturability are difficult and expensive to develop using standard Edisonian approaches. The goal of this research is to develop a computational process-modeling-based approach that can be used to design the next generation of CMC materials with optimized material and processing parameters for maximum strength and efficient manufacturing. The process modeling incorporates computational modeling tools, including molecular dynamics (MD), to simulate the material at multiple length scales. Results from MD simulation are used to inform the continuum-level models to link molecular-level characteristics (material structure, temperature) to bulk-level performance (strength, residual stresses). Processing parameters are optimized such that process-induced residual stresses are minimized and laminate strength is maximized. The multiscale process modeling method developed with this research can play a key role in the development of future CMCs for high-temperature and high-strength applications. By combining multiscale computational tools and process modeling, new manufacturing parameters can be established for optimal fabrication and performance of CMCs for a wide range of applications.
Keywords: digital engineering, finite elements, manufacturing, molecular dynamics
Procedia PDF Downloads 98
3036 Sunspot Cycles: Illuminating Humanity's Mysteries
Authors: Aghamusa Azizov
Abstract:
This study investigates the correlation between solar activity and sentiment in news media coverage, using a large-scale dataset of solar activity since 1750 and over 15 million articles from "The New York Times" dating from 1851 onwards. Employing Pearson's correlation coefficient and multiple Natural Language Processing (NLP) tools (TextBlob, Vader, and DistilBERT), the research examines the extent to which fluctuations in solar phenomena are reflected in the sentiment of historical news narratives. The findings reveal that the correlation between solar activity and media sentiment is generally negligible, suggesting a weak influence of solar patterns on the portrayal of events in news media. Notably, a moderate positive correlation was observed between the sentiments derived from TextBlob and Vader, indicating consistency across NLP tools. The analysis provides insights into the historical impact of solar activity on human affairs and highlights the importance of using multiple analytical methods to understand complex relationships in large datasets. The study contributes to the broader understanding of how extraterrestrial factors may intersect with media-reported events and underlines the intricate nature of interdisciplinary research in the data science and historical domains.
Keywords: solar activity correlation, media sentiment analysis, natural language processing, historical event patterns
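A minimal sketch of the correlation step described above, assuming placeholder yearly data rather than the actual solar series and New York Times corpus; TextBlob polarity stands in for the full TextBlob/Vader/DistilBERT comparison:

```python
# Hedged sketch: Pearson correlation between yearly solar activity and mean article
# sentiment. All inputs below are hypothetical placeholders, not the study's data.
import numpy as np
from scipy.stats import pearsonr
from textblob import TextBlob

articles_by_year = {                       # hypothetical article snippets per year
    1851: ["A calm year with steady trade and fair harvests."],
    1852: ["Storms and losses troubled the shipping lines."],
    1853: ["New railways opened to great public enthusiasm."],
    1854: ["War news dominated the columns for months."],
}
sunspots_by_year = {1851: 64.5, 1852: 54.2, 1853: 39.0, 1854: 20.6}  # placeholder counts

years = sorted(set(articles_by_year) & set(sunspots_by_year))
sentiment = np.array([np.mean([TextBlob(t).sentiment.polarity
                               for t in articles_by_year[y]]) for y in years])
activity = np.array([sunspots_by_year[y] for y in years])

r, p_value = pearsonr(activity, sentiment)
print(f"Pearson r = {r:.3f}, p = {p_value:.3f}")
```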
Procedia PDF Downloads 78
3035 Performance of an Improved Fluidized System for Processing Green Tea
Authors: Nickson Kipng’etich Lang’at, Thomas Thoruwa, John Abraham, John Wanyoko
Abstract:
Green tea is made from the top two leaves and buds of a shrub, Camellia sinensis, of the family Theaceae and the order Theales. The green tea leaves are picked and immediately sent to be dried or steamed to prevent fermentation. The fluid bed drying technique is a common method used in drying green tea because of its ease of design and construction and the fluidization of fine tea particles. Major problems in this method are a significant loss of the chemical content and green appearance of the leaf, retention of high moisture content in the leaves, and bed channeling and defluidization. The energy associated with the drying technology has been shown to be a vital factor in determining the quality of green tea. As part of the implementation, a prototype dryer was built that facilitated a sequence of operations involving steaming, cooling, pre-drying and final drying. The major findings of the project were in terms of the quality characteristics of the tea leaves and the energy consumption during processing. The optimal design achieved a moisture content of 4.2 ± 0.84%. At the optimum drying temperature of 100 °C, the specific energy consumption was 1697.8 kJ·kg⁻¹ and the evaporation rate was 4.272 × 10⁻⁴ kg·m⁻²·s⁻¹. The energy consumption in a fluidized system can be further reduced by focusing on energy-saving designs.
Keywords: evaporation rate, fluid bed dryer, maceration, specific energy consumption
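The two performance figures quoted above follow from simple energy and mass balances; the sketch below uses hypothetical measured quantities, not the study's data:

```python
# Hedged sketch of the performance metrics mentioned in the abstract; the input
# values are assumed measurements for illustration only.
energy_input_kj = 2037.4        # total heater + fan energy supplied during drying (assumed)
water_evaporated_kg = 1.2       # mass of moisture removed from the leaves (assumed)
bed_area_m2 = 0.25              # fluidized-bed cross-sectional area (assumed)
drying_time_s = 2.5 * 3600      # drying duration (assumed)

specific_energy_consumption = energy_input_kj / water_evaporated_kg     # kJ per kg of water removed
evaporation_rate = water_evaporated_kg / (bed_area_m2 * drying_time_s)  # kg m^-2 s^-1

print(f"SEC = {specific_energy_consumption:.1f} kJ/kg")
print(f"Evaporation rate = {evaporation_rate:.3e} kg m^-2 s^-1")
```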
Procedia PDF Downloads 313
3034 Retrofitting Cement Plants with Oxyfuel Technology for Carbon Capture
Authors: Peloriadi Konstantina, Fakis Dimitris, Grammelis Panagiotis
Abstract:
Methods for carbon capture and storage (CCS) can play a key role in the reduction of industrial CO₂ emissions, especially in the cement industry, which accounts for 7% of global emissions. Cement industries around the world have committed to addressing this problem by reaching carbon neutrality by the year 2050. The aim of the work to be presented was to contribute to the decarbonization strategy by integrating first-generation oxyfuel technology into cement production plants. This technology has been shown to improve fuel efficiency while providing one of the most cost-effective solutions when compared to other capture methods. A validated simulation of the cement plant was thus used as a basis to develop an oxyfuel-retrofitted cement process. The process model for the oxyfuel technology is developed in the ASPEN (Advanced System for Process Engineering) PLUS™ simulation software. This process consists of an Air Separation Unit (ASU), an oxyfuel cement plant with coal and alternative solid fuel (ASF) as feedstock, and a carbon dioxide processing unit (CPU). A detailed description and analysis of the CPU will be presented, including the findings of a literature review and simulation results regarding the effects of flue gas impurities during operation. Acknowledgment: This research has been conducted in the framework of the EU-funded AC2OCEM project, which investigates first- and second-generation oxyfuel concepts.
Keywords: oxyfuel technology, carbon capture and storage, CO₂ processing unit, cement, aspen plus
Procedia PDF Downloads 193
3033 A Simple Device for Characterizing High Power Electron Beams for Welding
Authors: Aman Kaur, Colin Ribton, Wamadeva Balachandaran
Abstract:
Electron beam welding, due to its inherent advantages, is being extensively used for material processing where high precision is required. Especially in the aerospace and nuclear industries, there are high quality requirements and the cost of materials and processes is very high, which makes it very important to ensure that the beam quality is maintained and checked prior to carrying out the welds. Although the processes in these industries are highly controlled, even minor changes in the operating parameters of the electron gun can produce variations in the beam quality large enough to result in poor welding. To measure the beam quality, a simple device has been designed that can be used at high powers. The device consists of two slits, in the x and y axes, which collect a small portion of the beam current when the beam is deflected over the slits. The signals received from the device are processed in data acquisition hardware and the dedicated software developed for the device. The device has been used in controlled laboratory environments to analyse the signals and their relationship to weld quality by varying the focus current. The results showed matching trends in the weld dimensions and the beam characteristics. Further experimental work is being carried out to determine the ability of the device and signal processing software to detect subtle changes in the beam quality and to relate these to the physical weld quality indicators.
Keywords: electron beam welding, beam quality, high power, weld quality indicators
Procedia PDF Downloads 324
3032 Timescape-Based Panoramic View for Historic Landmarks
Authors: H. Ali, A. Whitehead
Abstract:
Providing a panoramic view of famous landmarks around the world offers artistic and historic value for historians, tourists, and researchers. Exploring the history of famous landmarks by presenting a comprehensive view of a temporal panorama, merged with geographical and historical information, presents a unique challenge of dealing with images that span a long period, from the 1800s up to the present. This work presents the concept of a temporal panorama through a timeline display of aligned historic and modern images for many famous landmarks. Utilization of this panorama requires a collection of hundreds of thousands of landmark images from the Internet, comprising historic images and modern images of the digital age. These images have to be classified for subset selection to keep the more suitable images that chronologically document a landmark’s history. Processing of historic images captured using older analog technology under various capture conditions represents a big challenge when they have to be used with modern digital images. Successful processing of historic images to prepare them for the next steps of temporal panorama creation represents an active contribution to cultural heritage preservation through the fulfillment of one of UNESCO's goals in preserving and displaying famous worldwide landmarks.
Keywords: cultural heritage, image registration, image subset selection, registered image similarity, temporal panorama, timescapes
Procedia PDF Downloads 165
3031 A Study to Assess the Energy Saving Potential and Economic Analysis of an Agro Based Industry in Karnataka, India
Authors: Sangamesh G. Sakri, Akash N. Patil, Sadashivappa M. Kotli
Abstract:
Agro-based industries in India are considered micro, small and medium enterprises (MSMEs). In India, MSMEs contribute approximately 8 percent of the country’s GDP, 42 percent of the manufacturing output and 40 percent of exports. Toor dal (scientific name Cajanus cajan, commonly known as yellow gram or pigeon pea) is the second largest pulse crop in India, accounting for about 20% of total pulse production. The toor dal milling industry in India is one of the major agro-processing industries in the country. Most of the dal mills are concentrated in pulse producing areas, which are spread all over the country. In Karnataka state, Gulbarga is a district where toor dal is the main crop and is grown extensively. There are more than 500 dal mills in and around the Gulbarga district to process dal. However, the majority of these dal milling units use traditional methods of processing which are energy and capital intensive. There exists a huge energy saving potential in these mills. An energy audit is conducted on a dal mill in Gulbarga to understand the energy consumption pattern and assess the energy saving potential, and an economic analysis is conducted to identify energy conservation opportunities.
Keywords: conservation, demand side management, load curve, toor dal
Procedia PDF Downloads 272
3030 Development of Cathode for Hybrid Zinc Ion Supercapacitor Using Secondary Marigold Floral Waste for Green Energy Application
Authors: Syali Pradhan, Neetu Jha
Abstract:
The marigold flower is used in religious places for offering and decoration purposes every day. The flowers are discarded near trees or in aquatic bodies. This floral waste can be used for extracting dyes or oils. Still, the secondary waste that remains after processing needs to be addressed. This research aims to provide green and clean power using the secondary floral waste available after processing. The carbonization of floral waste produces a carbon material with high surface area and more active sites for reaction. Hybrid supercapacitors are more stable, offer an improved operating temperature range, and use less toxic materials compared to batteries. They provide enhanced energy density compared to supercapacitors. Hence, a hybrid supercapacitor designed using waste material would be more practicable for future energy applications. Here, we present the utilization of carbonized floral waste as a supercapacitor electrode material. After carbonization, this material becomes graphitized and shows high surface area and optimum porosity along with high conductivity. Hence, this material has been tested as a cathode material for a high-performance zinc-storage hybrid supercapacitor. High energy storage along with high stability has been obtained using this cathodic waste material as the electrode.
Keywords: marigold, flower waste, energy storage, cathode, supercapacitor
Procedia PDF Downloads 74
3029 Enhancing Seawater Desalination Efficiency with Combined Reverse Osmosis and Vibratory Shear-Enhanced Processing for Higher Conversion Rates and Reduced Energy Consumption
Authors: Reda Askouri, Mohamed Moussetad, Rhma Adhiri
Abstract:
Reverse osmosis (RO) is one of the most widely used techniques for seawater desalination. However, the conversion rate of this method is generally limited to 35-45% due to the high-pressure capacity of the membranes. Additionally, the specific energy consumption (SEC) for seawater desalination is high, necessitating energy recovery systems to minimise energy consumption. This study aims to enhance the performance of seawater desalination by combining RO with a vibratory shear-enhanced processing (VSEP) technique. The RO unit in this study comprises two stages, each powered by a hydraulic turbocharger that increases the pressure in both stages. The concentrate from the second stage is then directly processed by VSEP technology. The results demonstrate that the permeate water obtained exhibits high quality and that the conversion rate is significantly increased, reaching high percentages with low SEC. Furthermore, the high concentration of total solids in the concentrate allows for potential exploitation within the environmental protection framework. By valorising the concentrated waste, it’s possible to reduce the environmental impact while increasing the overall efficiency of the desalination process.
Keywords: specific energy consumption, vibratory shear enhanced process, environmental challenge, water recovery
Procedia PDF Downloads 12
3028 Effect of Curing Temperature on the Textural and Rheological of Gelatine-SDS Hydrogels
Authors: Virginia Martin Torrejon, Binjie Wu
Abstract:
Gelatine is a protein biopolymer obtained from the partial hydrolysis of animal tissues which contain collagen, the primary structural component in connective tissue. Gelatine hydrogels have attracted considerable research in recent years as an alternative to synthetic materials due to their outstanding gelling properties, biocompatibility and compostability. Surfactants, such as sodium dodecyl sulfate (SDS), are often used in hydrogel solutions as surface modifiers or solubility enhancers, and their incorporation can influence the hydrogel’s viscoelastic properties and, in turn, its processing and applications. The literature usually focuses on studying the impact of formulation parameters (e.g., gelatine content, gelatine strength, additives incorporation) on gelatine hydrogel properties, but processing parameters, such as curing temperature, are commonly overlooked. For example, some authors have reported a decrease in gel strength at lower curing temperatures, but there is a lack of research on the systematic viscoelastic characterisation of high-strength gelatine and gelatine-SDS systems over a wide range of curing temperatures. This knowledge is essential to meet and adjust the technological requirements of different applications (e.g., viscosity, setting time, gel strength or melting/gelling temperature). This work investigated the effect of curing temperature (10, 15, 20, 23, 25 and 30 °C) on the elastic modulus (G’) and melting temperature of high-strength gelatine-SDS hydrogels, at 10 wt% and 20 wt% gelatine contents, by small-amplitude oscillatory shear rheology coupled with Fourier transform infrared spectroscopy. It also correlates the gel strength obtained by rheological measurements with the gel strength measured by texture analysis. Gelatine and gelatine-SDS hydrogels’ rheological behaviour strongly depended on the curing temperature, and their gel strength and melting temperature can be slightly modified to adjust them to given processing and application needs. Lower curing temperatures led to gelatine and gelatine-SDS hydrogels with a considerably higher storage modulus. However, their melting temperatures were lower than those of gels cured at higher temperatures, which had lower gel strength. This effect was more considerable at longer timescales. This behaviour is attributed to the development of thermally resistant structures in the lower-strength gels cured at higher temperatures.
Keywords: gelatine gelation kinetics, gelatine-SDS interactions, gelatine-surfactant hydrogels, melting and gelling temperature of gelatine gels, rheology of gelatine hydrogels
Procedia PDF Downloads 101
3027 Empirical Investigation of Gender Differences in Information Processing Style, Tinkering, and Self-Efficacy for Robot Tele-Operation
Authors: Dilruba Showkat, Cindy Grimm
Abstract:
As robots become more ubiquitous, it is important for us to understand how different groups of people respond to possible ways of interacting with them. In this study, we focused on gender differences while users were tele-operating a humanoid robot that was physically co-located with them. We investigated three factors during the human-robot interaction: (1) information processing strategy, (2) self-efficacy, and (3) tinkering or exploratory behavior. The experimental results show that the information on how to use the robot was processed comprehensively by the female participants, whereas males processed it selectively (p < 0.001). Males were more confident when using the robot than females (p = 0.0002). Males tinkered more with the robot than females (p = 0.0021). We found that tinkering was positively correlated (p = 0.0068) with task success and negatively correlated (p = 0.0032) with task completion time. Tinkering might have resulted in greater task success and lower task completion time for males. Findings from this research can be used for making design decisions for robots. Our results show the importance of accounting for gender differences when developing interfaces for interacting with robots and open new research directions.
Keywords: humanoid robots, tele-operation, gender differences, human-robot interaction
Procedia PDF Downloads 167
3026 Effect of Different Processing Methods on the Proximate, Functional, Sensory, and Nutritional Properties of Weaning Foods Formulated from Maize (Zea mays) and Soybean (Glycine max) Flour Blends
Authors: C. O. Agu, C. C. Okafor
Abstract:
Maize and soybean flours were produced using different methods of processing, which include fermentation (FWF), roasting (RWF) and malting (MWF). Products from the different methods were mixed in the ratio 60:40 maize/soybean, respectively. These composites, mixed with other ingredients such as sugar, vegetable oil, vanilla flavour and vitamin mix, were analyzed for proximate composition and physical/functional, sensory and nutritional properties. The results for the protein content ranged between 6.25% and 16.65%, with sample RWF having the highest value. Crude fibre values ranged from 3.72% to 10.0%, carbohydrate from 58.98% to 64.2%, and ash from 1.27% to 2.45%. Physical and functional properties such as bulk density, wettability and gelation capacity had values between 0.74 and 0.76 g/ml, 20.33 and 46.33 min, and 0.73 and 0.93 g/ml, respectively. For sensory quality, colour, flavour, taste, texture and general acceptability were determined. In terms of colour and flavour there was no significant difference (P < 0.05), while the values for taste ranged between 4.89 and 7.11, texture from 5.50 to 8.38, and general acceptability from 6.09 to 7.89. Nutritionally, there is no significant difference (P < 0.05) between sample RWF and the control in all parameters considered. Samples FWF and MWF showed significantly (P < 0.5) lower values in all parameters determined. In the light of the above findings, the roasting method is highly recommended in the production of weaning foods.
Keywords: fermentation, malting, ratio, roasting, wettability
Procedia PDF Downloads 304
3025 Processing of Input Material as a Way to Improve the Efficiency of the Glass Production Process
Authors: Joanna Rybicka-Łada, Magda Kosmal, Anna Kuśnierz
Abstract:
One of the main problems of the glass industry is the still high consumption of energy needed to produce the glass mass, as well as the increase in the prices of fuels and raw materials. Therefore, comprehensive actions are taken to improve the entire production process. The key element of these activities, starting from filling the set to receiving the finished product, is the melting process, whose task is, among others, to dissolve the components of the set, remove bubbles from the resulting melt, and obtain a chemically homogeneous glass melt. This process consumes over 90% of the total energy needed in the production process. The processes occurring in the set during its conversion have a significant impact on the further stages and speed of the melting process and, thus, on its overall effectiveness. The speed of the reactions occurring and their course depend on the chemical nature of the raw materials, the degree of their fragmentation and thermal treatment, as well as the form of the introduced set. An opportunity to minimize segregation and accelerate the conversion of glass sets may be the development of new technologies for preparing and dosing sets. The previously preferred traditional method of melting the set, based on mixing all glass raw materials together in loose form, can be replaced with a set in a thickened form. Such a solution avoids dust formation during filling and is already available on the market. The aim of the project was to develop a glass set in a selectively or completely densified form and to examine the influence of set processing on the melting process and the properties of the glass.
Keywords: glass, melting process, glass set, raw materials
Procedia PDF Downloads 60
3024 Bird-Adapted Filter for Avian Species and Individual Identification Systems Improvement
Authors: Ladislav Ptacek, Jan Vanek, Jan Eisner, Alexandra Pruchova, Pavel Linhart, Ludek Muller, Dana Jirotkova
Abstract:
One of the essential steps of avian song processing is signal filtering. Currently, the standard methods of filtering are the Mel Bank Filter or a linear filter distribution. In this article, a new type of bank filter called the Bird-Adapted Filter is introduced, in which the signal filtering is modifiable, based upon a new mathematical description of audiograms for a particular bird species or order, which was named the Avian Audiogram Unified Equation. According to the method, filters may be deliberately distributed by frequency. The filters are more concentrated in bands of higher sensitivity, where more information is expected to be transmitted, and vice versa. Further, a comparison of various filters for the automatic individual recognition of the chiffchaff (Phylloscopus collybita) is demonstrated. The average Equal Error Rate (EER) was 16.23% for the linear bank filter and 18.71% for the Mel Bank Filter, while the Bird-Adapted Filter gave 14.29% and the Bird-Adapted Filter with 1/3 modification gave 12.95%. This approach would be useful for practical use in automatic systems for avian species and individual identification. Since the Bird-Adapted Filter filtration is based on the measured audiograms of particular species or orders, selecting the distribution according to the avian vocalization provides the most precise filter distribution to date.
Keywords: avian audiogram, bird individual identification, bird song processing, bird species recognition, filter bank
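The Avian Audiogram Unified Equation itself is not given in the abstract, so the sketch below only illustrates the stated placement principle (more filters where hearing sensitivity is higher) using a hypothetical audiogram and quantile-based spacing; it is not the authors' Bird-Adapted Filter:

```python
# Hedged illustration of audiogram-driven filter placement. Centre frequencies are
# spaced by the cumulative sensitivity, so more filters fall where the (hypothetical)
# audiogram is most sensitive.
import numpy as np

freqs_hz = np.linspace(500, 8000, 200)                    # analysis band (assumed)
sensitivity = np.exp(-((freqs_hz - 3500) / 1500.0) ** 2)  # hypothetical audiogram peaking near 3.5 kHz

cdf = np.cumsum(sensitivity)
cdf /= cdf[-1]

n_filters = 20
targets = np.linspace(0, 1, n_filters + 2)[1:-1]          # interior quantiles
centre_freqs = np.interp(targets, cdf, freqs_hz)          # inverse-CDF mapping

print(np.round(centre_freqs, 1))  # spacing is denser around the sensitivity peak
```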
Procedia PDF Downloads 387
3023 Adaptation of Projection Profile Algorithm for Skewed Handwritten Text Line Detection
Authors: Kayode A. Olaniyi, Tola. M. Osifeko, Adeola A. Ogunleye
Abstract:
Text line segmentation is an important step in document image processing. It represents a labeling process that assigns the same label, using a distance-metric probability, to spatially aligned units. Text line detection techniques have been implemented successfully mainly on printed documents. However, processing handwritten text, especially unconstrained documents, has remained a key problem. This is because unconstrained handwritten text lines are often not uniformly skewed. The spaces between text lines may not be obvious, complicated by the nature of handwriting and the overlapping ascenders and/or descenders of some characters. Hence, text line detection and segmentation represent a leading challenge in handwritten document image processing. Text line detection methods that rely on the traditional global projection profile of the text document cannot efficiently cope with the problem of variable skew angles between different text lines. Hence, the formulation of a horizontal line as a separator is often not efficient. This paper presents a technique to segment a handwritten document into distinct lines of text. The proposed algorithm starts by partitioning the initial text image, across its width, into vertical strips of about 5% each. At each vertical strip, the histogram of horizontal runs is projected. We have worked with the assumption that text lines appearing within a single strip are almost parallel to each other. The developed algorithm provides a sliding window through the first vertical strip on the left side of the page. It runs through to identify the new minimum corresponding to a valley in the projection profile. Each valley represents the starting point of an orientation line, and the ending point is the minimum point on the projection profile of the next vertical strip. The derived text lines traverse around any obstructing handwritten connected components by associating them with either the line above or below. The decision to associate such a connected component is made using the probability obtained from a distance-metric decision. The technique outperforms the global projection profile for text line segmentation, and it is robust enough to handle skewed documents and those with lines running into each other.
Keywords: connected-component, projection-profile, segmentation, text-line
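A minimal sketch of the strip-wise projection step described above, assuming a binarised page image with text pixels equal to 1; valley detection is simplified and the final association of obstructing components is omitted:

```python
# Hedged sketch of strip-wise horizontal projection for skewed text-line detection.
import numpy as np
from scipy.signal import argrelextrema

def strip_valleys(binary_img, strip_frac=0.05):
    """Return, per vertical strip (~5% of page width), the rows of local minima
    in the horizontal projection profile - candidate text-line separators."""
    h, w = binary_img.shape
    strip_w = max(1, int(w * strip_frac))
    valleys_per_strip = []
    for x0 in range(0, w, strip_w):
        strip = binary_img[:, x0:x0 + strip_w]
        profile = strip.sum(axis=1)                          # horizontal projection of this strip
        minima = argrelextrema(profile, np.less_equal, order=10)[0]
        valleys_per_strip.append(minima)
    return valleys_per_strip

# usage with a hypothetical image (text pixels = 1, background = 0):
# page = (cv2.imread("page.png", 0) < 128).astype(np.uint8)
# separators = strip_valleys(page)
```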
Procedia PDF Downloads 124
3022 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach
Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh
Abstract:
This study presents a hybrid pre-processing approach, along with a conceptual model, to enhance the accuracy of river discharge prediction. In order to achieve this goal, the Ensemble Empirical Mode Decomposition algorithm (EEMD), the Discrete Wavelet Transform (DWT) and Mutual Information (MI) were employed as a hybrid pre-processing approach conjugated to a Least Square Support Vector Machine (LSSVM). A conceptual strategy, namely a multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein was capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed for the decomposed sub-series using MI to be employed in the multi-station model. In the proposed feature selection method, some useless sub-series were omitted to achieve better performance. Results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
Keywords: river stage-discharge process, LSSVM, discrete wavelet transform, ensemble empirical mode decomposition, multi-station modeling
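A compressed, non-authoritative sketch of the pre-processing idea, assuming a placeholder discharge series: DWT sub-series are extracted, mutual information selects the informative ones, and scikit-learn's SVR stands in for LSSVM (which is not part of scikit-learn); the EEMD stage is omitted for brevity:

```python
# Hedged sketch of the DWT + mutual-information + kernel-regression chain.
import numpy as np
import pywt
from sklearn.feature_selection import mutual_info_regression
from sklearn.svm import SVR

rng = np.random.default_rng(0)
discharge = rng.random(512)                       # placeholder daily discharge series

# 1) Decompose the series into sub-series with a discrete wavelet transform.
coeffs = pywt.wavedec(discharge, "db4", level=3)
subseries = []
for i in range(len(coeffs)):
    keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    subseries.append(pywt.waverec(keep, "db4")[:len(discharge)])

X = np.column_stack(subseries)[:-1]               # sub-series values at time t
y = discharge[1:]                                 # discharge at time t+1

# 2) Keep only the sub-series sharing the most information with the target.
mi = mutual_info_regression(X, y)
X_sel = X[:, mi > np.median(mi)]

# 3) Fit the regressor (stand-in for LSSVM) on the selected features.
model = SVR(kernel="rbf").fit(X_sel, y)
```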
Procedia PDF Downloads 175
3021 Gene Prediction in DNA Sequences Using an Ensemble Algorithm Based on Goertzel Algorithm and Anti-Notch Filter
Authors: Hamidreza Saberkari, Mousa Shamsi, Hossein Ahmadi, Saeed Vaali, MohammadHossein Sedaaghi
Abstract:
In recent years, using signal processing tools for the accurate identification of protein coding regions has become a challenge in bioinformatics. Most genomic signal processing methods are based on the period-3 characteristic of the nucleotides in DNA strands, and consequently, spectral analysis is applied to the numerical sequences of DNA to find the location of periodic components. In this paper, a novel ensemble algorithm for gene selection in DNA sequences is presented, which is based on the combination of the Goertzel algorithm and an anti-notch filter (ANF). The proposed algorithm has many advantages when compared to other conventional methods. Firstly, it identifies the protein coding regions more accurately due to the use of the Goertzel algorithm, which is tuned at the desired frequency. Secondly, a faster detection time is achieved. The proposed algorithm is applied to several genes, including genes available in the databases BG570 and HMR195, and its results are compared to other methods based on nucleotide-level evaluation criteria. Implementation results show the excellent performance of the proposed algorithm in identifying protein coding regions, specifically in the identification of small-scale gene areas.
Keywords: protein coding regions, period-3, anti-notch filter, Goertzel algorithm
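A hedged sketch of the Goertzel stage tuned to the period-3 frequency (f = 1/3) on Voss-mapped DNA; the window length is an assumption, and the anti-notch filter and ensemble combination are not reproduced:

```python
# Hedged sketch: Goertzel power at f = 1/3 over sliding windows of a DNA sequence
# mapped to four binary (Voss) indicator sequences. Peaks suggest exon regions.
import math

def goertzel_power(samples, freq):
    """Standard Goertzel recursion returning |X(freq)|^2 for one window."""
    n = len(samples)
    k = int(0.5 + n * freq)
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def period3_profile(dna, window=351, step=3):
    """Sum the Goertzel powers of the four indicator sequences, per window."""
    profile = []
    for start in range(0, len(dna) - window + 1, step):
        win = dna[start:start + window]
        power = sum(goertzel_power([1.0 if b == base else 0.0 for b in win], 1.0 / 3.0)
                    for base in "ACGT")
        profile.append(power)
    return profile

# usage with a hypothetical sequence: period3_profile("ATGGCCATTGTAATGGGCCGC..." )
```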
Procedia PDF Downloads 387
3020 An Enhanced MEIT Approach for Itemset Mining Using Levelwise Pruning
Authors: Tanvi P. Patel, Warish D. Patel
Abstract:
Association rule mining forms the core of data mining and is one of its best-known methodologies. The objective of mining is to find interesting correlations, frequent patterns, associations or causal structures among sets of items in transaction databases or other data repositories. Hence, association rule mining is imperative to mine patterns and then generate rules from the obtained patterns. For efficient targeted query processing, frequent pattern finding and itemset mining, an efficient way to generate an itemset tree structure is the Memory Efficient Itemset Tree (MEIT). The memory-efficient itemset tree is efficient for storing itemsets but takes more time compared to the traditional itemset tree. The proposed strategy generates maximal frequent itemsets from the memory-efficient itemset tree by using levelwise pruning. To do so, pre-pruning of items based on a minimum support count is first carried out, followed by itemset tree reconstruction. By keeping only maximal frequent itemsets, fewer patterns are generated and the tree size is also reduced compared to MEIT. Therefore, the enhanced memory-efficient itemset tree approach proposed here helps to optimize main memory overhead as well as reduce processing time.
Keywords: association rule mining, itemset mining, itemset tree, MEIT, maximal frequent pattern
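A small, hedged sketch of the pre-pruning step described above (items below the minimum support count are removed from transactions before the itemset tree is rebuilt); the transactions and threshold are hypothetical, and the MEIT structure itself is not shown:

```python
# Hedged sketch of minimum-support pre-pruning before itemset-tree reconstruction.
from collections import Counter

transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer"},
    {"milk", "diapers", "beer", "cola"},
    {"bread", "milk", "diapers", "beer"},
]
min_support_count = 2  # assumed threshold

# 1) Count item frequencies across all transactions.
counts = Counter(item for t in transactions for item in t)

# 2) Drop infrequent items so the rebuilt tree is smaller and faster to mine.
frequent = {item for item, c in counts.items() if c >= min_support_count}
pruned = [sorted(t & frequent) for t in transactions]

print(pruned)  # e.g. [['bread', 'milk'], ['beer', 'bread', 'diapers'], ...]
```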
Procedia PDF Downloads 372
3019 Roasting Degree of Cocoa Beans by Artificial Neural Network (ANN) Based Electronic Nose System and Gas Chromatography (GC)
Authors: Juzhong Tan, William Kerr
Abstract:
Roasting is a critical procedure in chocolate processing, in which special flavors are developed, moisture content is decreased, and better processing properties are developed. Therefore, determination of the roasting degree of cocoa beans is important for chocolate manufacturers to ensure the quality of chocolate products, and it also decides the commercial value of cocoa beans collected from cocoa farmers. The assessment of the roasting degree of cocoa beans currently relies on human specialists, who sometimes are biased, and on chemical analysis, which takes a long time and is inaccessible to many manufacturers and farmers. In this study, a self-made electronic nose system consisting of gas sensors (TGS 800 and 2000 series) was used to detect the gas generated by cocoa beans with different roasting degrees (0 min, 20 min, 30 min, and 40 min), and the signals collected by the gas sensors were used to train a three-layer ANN. Chemical analysis of the graded beans was performed with a traditional GC-MS system, and the contents of volatile chemical compounds were used to train another ANN as a reference to the ANN trained on electronic nose signals. Both trained ANNs were used to predict cocoa beans with different roasting degrees for validation. The best grading accuracy achieved by the ANN trained on electronic nose signals (using signals from TGS 813, 826, 820, 880, 830, 2620, 2602 and 2610) turned out to be 96.7%, whereas the GC-trained ANN achieved an accuracy of 83.8%.
Keywords: artificial neural network, cocoa bean, electronic nose, roasting
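A hedged sketch of the classification stage, assuming placeholder sensor features rather than the study's readings; a single-hidden-layer network in scikit-learn stands in for the three-layer ANN:

```python
# Hedged sketch: a small neural network mapping gas-sensor features to roasting degree.
# The feature matrix below is random placeholder data, so the printed accuracy is
# only illustrative of the workflow, not of the reported results.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.random((120, 8))                 # 8 sensor channels (e.g. TGS 813/826/... , assumed)
y = np.repeat([0, 20, 30, 40], 30)       # roasting time labels in minutes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
clf.fit(X_train, y_train)
print(f"grading accuracy: {clf.score(X_test, y_test):.2f}")
```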
Procedia PDF Downloads 234
3018 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring
Authors: Zheng Wang, Zhenhong Li, Jon Mills
Abstract:
Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring towards various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise from processing high temporal-resolution continuous GBSAR data, including the extreme cost of computational random-access-memory (RAM), the delay of displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and it processes continuous GBSAR images unit by unit. Images within a window form a basic unit. By taking this strategy, the RAM requirement is reduced to only one unit of images and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected as it keeps temporarily-coherent pixels which are present only in some certain units but not in the whole observation period. The chain supports real-time processing of the continuous data and the delay of creating displacement maps can be shortened without waiting for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on a stack of images in a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence. The temporal-averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially-coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the level of a few sub-millimetres are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring of a wide range of scientific and practical applications.
Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring
Procedia PDF Downloads 161
3017 Development of the Food Market of the Republic of Kazakhstan in the Field of Milk Processing
Authors: Gulmira Zhakupova, Tamara Tultabayeva, Aknur Muldasheva, Assem Sagandyk
Abstract:
The development of technologies and the production of products with increased biological value, based on the use of natural food raw materials, are important tasks in the food market policy of the Republic of Kazakhstan. For Kazakhstan, livestock farming, in particular sheep farming, is the most ancient and developed industry and way of life. The history of the Kazakh people is largely connected with this type of agricultural production, with established traditions of using dairy products made from sheep's milk. Therefore, the development of new technologies for sheep’s milk remains relevant. In addition, one of the most promising areas for the development of food technology for therapeutic and prophylactic purposes is sheep milk products, as a source of protein, immunoglobulins, minerals, vitamins, and other biologically active compounds. This article presents the results of research on milk processing technology. The objective of the study is to examine the possibilities of processing sheep milk and its role in human nutrition, as well as the results of research to improve the technology of sheep milk products. The studies were carried out on the basis of sanitary and hygienic requirements for dairy products, in accordance with the following test methods. For the microbiological analysis, we used the method for identifying Salmonella bacteria (the horizontal method for identifying, counting, and serotyping Salmonella) in a certain mass or volume of product. Nutritional value is a complex of properties of food products that meet human physiological needs for energy and basic nutrients. The protein mass fraction was determined by the Kjeldahl method. This method is based on the mineralization of a milk sample with concentrated sulfuric acid in the presence of an oxidizing agent, an inert salt (potassium sulfate) and a catalyst (copper sulfate). In this case, the amino groups of the protein are converted into ammonium sulfate dissolved in sulfuric acid. The vitamin composition was determined by HPLC. To determine the content of mineral substances in the studied samples, the method of atomic absorption spectrophotometry was used. The study identified the technological parameters of sheep milk products and determined the prospects for research on sheep milk products. Microbiological studies were used to determine the safety of the studied product. According to the results of the microbiological analysis, no deviations from the norm were identified, which indicates the high safety of the products under study. In terms of nutritional value, the resulting products are high in protein. Data on the positive content of amino acids were also obtained. The results obtained will be used in the food industry and will serve as recommendations for manufacturers.
Keywords: dairy, milk processing, nutrition, colostrum
Procedia PDF Downloads 57
3016 Active Noise Cancellation in the Rectangular Enclosure Systems
Authors: D. Shakirah Shukor, A. Aminudin, Hashim U. A., Waziralilah N. Fathiah, T. Vikneshvaran
Abstract:
Interior noise control is essential to explore because interior acoustic analysis is significant in systems such as automobiles, aircraft, air-handling systems and diesel engine exhaust systems. In this research, experimental work was undertaken to actively cancel noise in a rectangular enclosure. The rectangular enclosure was fabricated with multiple speakers and microphones inside it. A software program using digital signal processing was implemented to evaluate the proposed method. Experimental work was conducted to obtain the acoustic behavior and characteristics of the rectangular enclosure and the noise cancellation based on active noise control in the low-frequency range. Noise is generated by using multiple speakers inside the enclosure, and microphones are used for noise measurements. The technique for noise cancellation relies on the principle of destructive interference between two sound fields in the rectangular enclosure. One field is generated by the original or primary sound source; the other by a secondary sound source set up to interfere with, and cancel, that unwanted primary sound. At the end of this research, the results for the output noise before and after cancellation are presented and discussed. On the basis of the findings presented in this research, active noise cancellation in a rectangular enclosure is worth exploring in order to improve noise control technologies.
Keywords: active noise control, digital signal processing, noise cancellation, rectangular enclosure
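The destructive-interference principle described above is commonly realised with an adaptive filter driving the secondary source; the minimal LMS sketch below uses synthetic tonal noise and omits the secondary-path model (so it is not the full filtered-x algorithm typically used in enclosures):

```python
# Hedged sketch of adaptive noise cancellation by destructive interference, using a
# plain LMS update on a synthetic low-frequency tone.
import numpy as np

fs, duration = 8000, 2.0
t = np.arange(int(fs * duration)) / fs
reference = np.sin(2 * np.pi * 120 * t)              # reference of the primary noise source
primary = 0.8 * np.sin(2 * np.pi * 120 * t + 0.6)    # noise reaching the error microphone

taps, mu = 32, 0.01
w = np.zeros(taps)
error = np.zeros_like(primary)
for n in range(taps, len(t)):
    x_vec = reference[n - taps:n][::-1]
    anti_noise = w @ x_vec                            # secondary-speaker output
    error[n] = primary[n] - anti_noise                # residual at the error microphone
    w += 2 * mu * error[n] * x_vec                    # LMS weight update

print(f"residual power: {np.mean(primary[taps:]**2):.3f} -> {np.mean(error[-fs:]**2):.5f}")
```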
Procedia PDF Downloads 272
3015 Exploration into Bio Inspired Computing Based on Spintronic Energy Efficiency Principles and Neuromorphic Speed Pathways
Authors: Anirudh Lahiri
Abstract:
Neuromorphic computing, inspired by the intricate operations of biological neural networks, offers a revolutionary approach to overcoming the limitations of traditional computing architectures. This research proposes the integration of spintronics with neuromorphic systems, aiming to enhance computational performance, scalability, and energy efficiency. Traditional computing systems, based on the Von Neumann architecture, struggle with scalability and efficiency due to the segregation of memory and processing functions. In contrast, the human brain exemplifies high efficiency and adaptability, processing vast amounts of information with minimal energy consumption. This project explores the use of spintronics, which utilizes the electron's spin rather than its charge, to create more energy-efficient computing systems. Spintronic devices, such as magnetic tunnel junctions (MTJs) manipulated through spin-transfer torque (STT) and spin-orbit torque (SOT), offer a promising pathway to reducing power consumption and enhancing the speed of data processing. The integration of these devices within a neuromorphic framework aims to replicate the efficiency and adaptability of biological systems. The research is structured into three phases: an exhaustive literature review to build a theoretical foundation, laboratory experiments to test and optimize the theoretical models, and iterative refinements based on experimental results to finalize the system. The initial phase focuses on understanding the current state of neuromorphic and spintronic technologies. The second phase involves practical experimentation with spintronic devices and the development of neuromorphic systems that mimic synaptic plasticity and other biological processes. The final phase focuses on refining the systems based on feedback from the testing phase and preparing the findings for publication. The expected contributions of this research are twofold. Firstly, it aims to significantly reduce the energy consumption of computational systems while maintaining or increasing processing speed, addressing a critical need in the field of computing. Secondly, it seeks to enhance the learning capabilities of neuromorphic systems, allowing them to adapt more dynamically to changing environmental inputs, thus better mimicking the human brain's functionality. The integration of spintronics with neuromorphic computing could revolutionize how computational systems are designed, making them more efficient, faster, and more adaptable. This research aligns with the ongoing pursuit of energy-efficient and scalable computing solutions, marking a significant step forward in the field of computational technology.
Keywords: material science, biological engineering, mechanical engineering, neuromorphic computing, spintronics, energy efficiency, computational scalability, synaptic plasticity
Procedia PDF Downloads 45
3014 A Mixed Finite Element Formulation for Functionally Graded Micro-Beam Resting on Two-Parameter Elastic Foundation
Authors: Cagri Mollamahmutoglu, Aykut Levent, Ali Mercan
Abstract:
Micro-beams are one of the most common components of nano-electromechanical systems (NEMS) and micro-electromechanical systems (MEMS). For this reason, the static bending, buckling, and free vibration analysis of micro-beams has been the subject of many studies. In addition, micro-beams restrained with elastic-type foundations have been of particular interest. In the analysis of microstructures, closed-form solutions are proposed when available, but most of the time solutions are based on numerical methods due to the complex nature of the resulting differential equations. Thus, a robust and efficient solution method is of great importance. In this study, a mixed finite element formulation is obtained for a functionally graded Timoshenko micro-beam resting on a two-parameter elastic foundation. In the formulation, modified couple stress theory is utilized for the micro-scale effects. The equation of motion and boundary conditions are derived according to Hamilton’s principle. A functional, derived through a scientific procedure based on the Gateaux differential, is proposed for the bending and buckling analysis; it is equivalent to the governing equations and boundary conditions. The most important advantage of the formulation is that the mixed finite element formulation allows the usage of C₀-type continuous shape functions. Thus, shear locking is avoided in a built-in manner. Also, the element matrices are sparsely populated and can be easily calculated with closed-form integration. In this framework, results concerning the effects of the micro-scale length parameter, power-law parameter, aspect ratio and coefficients of a partially or fully continuous elastic foundation on the static bending, buckling, and free vibration response of the FG micro-beam under various boundary conditions are presented and compared with the existing literature. The performance characteristics of the presented formulation were evaluated against other numerical methods such as the generalized differential quadrature method (GDQM). It was found that similar convergence characteristics were obtained with less computational burden. Moreover, the formulation also includes a direct calculation of the micro-scale-related contributions to the structural response.
Keywords: micro-beam, functionally graded materials, two-parameter elastic foundation, mixed finite element method
Procedia PDF Downloads 162
3013 Low Power Glitch Free Dual Output Coarse Digitally Controlled Delay Lines
Authors: K. Shaji Mon, P. R. John Sreenidhi
Abstract:
In deep-submicrometer CMOS processes, the time-domain resolution of a digital signal is becoming higher than the voltage resolution of analog signals. This observation is nowadays pushing toward a new circuit design paradigm in which traditional analog signal processing is expected to be progressively substituted by the processing of time in the digital domain. Within this novel paradigm, digitally controlled delay lines (DCDL) should play the role of digital-to-analog converters in traditional, analog-intensive circuits. Digital delay-locked loops are highly prevalent in integrated systems. The proposed paper addresses the glitches present in delay circuits, along with area, power dissipation and signal integrity. The digitally controlled delay lines (DCDL) under study have been designed in a 90 nm CMOS technology with six copper metal layers, strained SiGe and a low-k dielectric. Simulation and synthesis results show that the novel dual-output coarse DCDL circuits exhibit no glitches, dissipate less power and consume less area compared to the glitch-free NAND-based DCDL.
Keywords: glitch free, NAND-based DCDL, CMOS, deep-submicrometer
Procedia PDF Downloads 245
3012 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University
Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat
Abstract:
Scientific researchers face challenges in the analysis of very large data sets, which are increasing at a noticeable rate in today’s and tomorrow’s technologies. Hadoop and Spark are software frameworks developed for this type of processing. The Hadoop framework is suitable for many different hardware platforms. In this research, a Scientific Linux cluster for Big Data analysis (SLBD) is presented. SLBD runs open source software with large computational capacity on a high-performance cluster infrastructure. SLBD is composed of one cluster containing identical, commodity-grade computers interconnected via a small LAN. SLBD consists of a fast switch and Gigabit-Ethernet cards which connect four nodes. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster by using the MapReduce algorithm. The MapReduce algorithm divides a job into smaller tasks which are assigned to the cluster nodes. The algorithm then collects the results and forms the final result dataset. The SLBD clustering system allows fast and efficient processing of the large amounts of data resulting from different applications. SLBD also provides high performance, high throughput, high availability, expandability and cluster scalability.
Keywords: big data platforms, Cloudera Manager, Hadoop, MapReduce
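As an illustration of the MapReduce pattern described above, a Hadoop Streaming-style mapper and reducer are sketched below; word count is the standard toy job, not a workload from the paper:

```python
# Hedged illustration of MapReduce: the mapper emits key/value pairs for its input
# split, the framework shuffles them by key, and the reducer aggregates per key.
import sys
from itertools import groupby

def mapper(lines):
    """Run on each node against its split of the input."""
    for line in lines:
        for word in line.strip().split():
            yield word.lower(), 1

def reducer(pairs):
    """Receives pairs grouped (sorted) by key after the shuffle phase."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    counts = dict(reducer(mapper(sys.stdin)))
    for word, total in sorted(counts.items()):
        print(f"{word}\t{total}")
```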
Procedia PDF Downloads 359
3011 Sustainable Manufacturing of Concentrated Latex and Ribbed Smoked Sheets in Sri Lanka
Authors: Pasan Dunuwila, V. H. L. Rodrigo, Naohiro Goto
Abstract:
Sri Lanka is one of the largest natural rubber (NR) producers in the world, and the NR industry is a major foreign exchange earner. Among the locally manufactured NR products, concentrated latex (CL) and ribbed smoked sheets (RSS) hold a significant position. Furthermore, these products become the foundation for many products utilized by people all over the world (e.g., gloves, condoms, tires, etc.). Processing of CL and RSS costs a significant amount of material, energy, and workforce. Against this background, both manufacturing lines have been immensely challenged by waste, low productivity, lack of cost efficiency, the rising cost of production, and many environmental issues. To face the above challenges, the adoption of sustainable manufacturing measures that use less energy, water and materials, and produce less waste, is imperative. However, these sectors lack comprehensive studies that shed light on such measures and thoroughly discuss their improvement potential from both environmental and economic points of view. Therefore, based on a study of three CL and three RSS mills in Sri Lanka, this study deploys sustainable manufacturing techniques and tools to uncover the underlying potential to improve performance in the CL and RSS processing sectors. This study comprises three steps: 1. quantification of average material waste, economic losses, and greenhouse gas (GHG) emissions via material flow analysis (MFA), material flow cost accounting (MFCA), and life cycle assessment (LCA) in each manufacturing process; 2. identification of improvement options with the help of Pareto and what-if analyses, field interviews, and the existing literature; and 3. validation of the identified improvement options via the re-execution of MFA, MFCA, and LCA. With the help of this methodology, the economic and environmental hotspots and the degrees of improvement in both systems could be identified. Results highlighted that each process could be improved to produce less waste and incur lower monetary losses, manufacturing costs, and GHG emissions. In conclusion, the study's methodology and findings are believed to be beneficial for assuring sustainable growth not only in the Sri Lankan NR processing sector itself but also in the NR or any other industry rooted in other developing countries.
Keywords: concentrated latex, natural rubber, ribbed smoked sheets, Sri Lanka
Procedia PDF Downloads 261