Search results for: processing programs
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6104

4814 Effect of Different Processing Methods on the Proximate, Functional, Sensory, and Nutritional Properties of Weaning Foods Formulated from Maize (Zea mays) and Soybean (Glycine max) Flour Blends

Authors: C. O. Agu, C. C. Okafor

Abstract:

Maize and soybean flours were produced using different processing methods: fermentation (FWF), roasting (RWF), and malting (MWF). Products from the different methods were mixed in the ratio 60:40 maize/soybean, respectively. These composites, mixed with other ingredients such as sugar, vegetable oil, vanilla flavour, and vitamin mix, were analyzed for proximate composition, physical/functional, sensory, and nutritional properties. Protein content ranged between 6.25% and 16.65%, with sample RWF having the highest value. Crude fibre values ranged from 3.72% to 10.0%, carbohydrate from 58.98% to 64.2%, and ash from 1.27% to 2.45%. Physical and functional properties such as bulk density, wettability, and gelation capacity had values between 0.74 and 0.76 g/ml, 20.33 and 46.33 min, and 0.73 and 0.93 g/ml, respectively. For sensory quality, colour, flavour, taste, texture, and general acceptability were determined. In terms of colour and flavour there was no significant difference (p > 0.05), while the values for taste ranged between 4.89 and 7.11, texture from 5.50 to 8.38, and general acceptability from 6.09 to 7.89. Nutritionally, there was no significant difference (p > 0.05) between sample RWF and the control in all parameters considered. Samples FWF and MWF showed significantly (p < 0.05) lower values in all parameters determined. In the light of the above findings, the roasting method is highly recommended for the production of weaning foods.

Keywords: fermentation, malting, ratio, roasting, wettability

Procedia PDF Downloads 304
4813 Processing of Input Material as a Way to Improve the Efficiency of the Glass Production Process

Authors: Joanna Rybicka-Łada, Magda Kosmal, Anna Kuśnierz

Abstract:

One of the main problems of the glass industry is the still-high consumption of energy needed to produce glass mass, as well as the rising prices of fuels and raw materials. Therefore, comprehensive actions are taken to improve the entire production process. The key element of these activities, from filling the set to receiving the finished product, is the melting process, whose tasks include dissolving the components of the set, removing bubbles from the resulting melt, and obtaining a chemically homogeneous glass melt. This process consumes over 90% of the total energy needed in the production process. The processes occurring in the set during its conversion have a significant impact on the further stages and speed of the melting process and, thus, on its overall effectiveness. The speed of the reactions and their course depend on the chemical nature of the raw materials, the degree of their fragmentation, their thermal treatment, and the form of the introduced set. An opportunity to minimize segregation and accelerate the conversion of glass sets may be the development of new technologies for preparing and dosing sets. The traditionally preferred method of melting the set, based on mixing all glass raw materials together in loose form, can be replaced with a set in a thickened form; this solution avoids dust formation during filling and is already available on the market. The aim of the project was to develop a glass set in a selectively or completely densified form and to examine the influence of set processing on the melting process and the properties of the glass.

Keywords: glass, melting process, glass set, raw materials

Procedia PDF Downloads 60
4812 Frailty Patterns in the US and Implications for Long-Term Care

Authors: Joelle Fong

Abstract:

Older persons are at the greatest risk of becoming frail. As survival to age 80 and beyond continues to increase, the health and frailty of older Americans have garnered much recent attention among policy makers and healthcare administrators. This paper examines patterns in old-age frailty within a multistate actuarial model that characterizes the stochastic process of biological ageing. Using aggregate population-level U.S. mortality data, we implement a stochastic ageing model to examine cohort trends and gender differences in frailty distributions for older Americans born 1865-1894. The stochastic ageing model, which draws from the fields of actuarial science and gerontology, is well established in the literature. The implications for public health insurance programs are also discussed. Our results suggest that, on average, women tend to be frailer than men at older ages, and they reveal useful insights about the magnitude of the male-female differential at critical age points. Specifically, we note that the frailty statuses of males and females are quite comparable from ages 65 to 80. Beyond age 80, however, the frailty levels start to diverge considerably, implying that women move more quickly into worse states of health than men. Tracking average frailty by gender over 30 successive birth cohorts, we also find that frailty levels for both genders follow a distinct peak-and-trough pattern. For instance, frailty among 85-year-old American survivors increased in 1954-1963, decreased in 1964-1971, and again started to increase in 1972-1979. A number of factors may have accounted for these cohort differences, including differences in cohort life histories, differences in disease prevalence, differences in lifestyle and behavior, differential access to medical advances, and changes in environmental risk factors over time. We conclude with a discussion of the implications of our findings for spending on long-term care programs within the broader health insurance system.
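The paper's calibrated multistate model is not reproduced in the abstract; the idea of tracking average frailty among survivors can nonetheless be sketched with a toy discrete-time Markov ageing process. All transition and mortality rates below are illustrative assumptions, not the author's estimates:

```python
import random

def simulate_cohort(n, lam, base_mort, mort_slope, max_age=100, seed=0):
    """Toy discrete-time multistate ageing process.

    Each survivor occupies a frailty state k = 1, 2, 3, ...; each year they
    die with probability base_mort + mort_slope * k and otherwise advance
    one state with probability lam.  All rates are illustrative and not
    calibrated to the US cohort data used in the paper.
    Returns the mean frailty state of survivors at each age from 65.
    """
    rng = random.Random(seed)
    states = [1] * n                      # cohort enters state 1 at age 65
    mean_frailty = {}
    for age in range(65, max_age + 1):
        survivors = []
        for k in states:
            if rng.random() < min(1.0, base_mort + mort_slope * k):
                continue                  # death removes the individual
            if rng.random() < lam:
                k += 1                    # progression to a frailer state
            survivors.append(k)
        states = survivors
        if states:
            mean_frailty[age] = sum(states) / len(states)
    return mean_frailty

# Illustrative gender contrast: a faster state-progression rate for women,
# a higher baseline mortality for men (assumed values, demonstration only).
women = simulate_cohort(20000, lam=0.30, base_mort=0.010, mort_slope=0.004)
men = simulate_cohort(20000, lam=0.25, base_mort=0.020, mort_slope=0.004)
```

Under these assumed rates the simulated female cohort accumulates frailty faster than the male cohort, mimicking the divergence after age 80 reported above.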

Keywords: actuarial modeling, cohort analysis, frail elderly, health

Procedia PDF Downloads 244
4811 Bird-Adapted Filter for Avian Species and Individual Identification Systems Improvement

Authors: Ladislav Ptacek, Jan Vanek, Jan Eisner, Alexandra Pruchova, Pavel Linhart, Ludek Muller, Dana Jirotkova

Abstract:

One of the essential steps of avian song processing is signal filtering. Currently, the standard methods of filtering are the Mel Bank Filter or a linear filter distribution. In this article, a new type of bank filter called the Bird-Adapted Filter is introduced, whereby the signal filtering is modifiable based upon a new mathematical description of audiograms for a particular bird species or order, named the Avian Audiogram Unified Equation. Under this method, filters may be deliberately distributed by frequency: the filters are more concentrated in bands of higher hearing sensitivity, where more information is expected to be transmitted, and vice versa. Further, a comparison of various filters for automatic individual recognition of the chiffchaff (Phylloscopus collybita) is demonstrated. The average Equal Error Rate (EER) was 16.23% for the linear bank filter and 18.71% for the Mel Bank Filter, while the Bird-Adapted Filter gave 14.29% and the Bird-Adapted Filter with 1/3 modification gave 12.95%. This approach would be useful in practical automatic systems for avian species and individual identification. Since Bird-Adapted Filter filtration is based on the measured audiograms of particular species or orders, selecting the distribution according to the avian vocalization provides the most precise filter distribution to date.
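The Avian Audiogram Unified Equation is not reproduced in this abstract, so the sketch below substitutes an assumed Gaussian sensitivity peak; it only illustrates the placement idea, i.e. distributing filter centre frequencies so that each filter covers an equal share of cumulative audiogram sensitivity:

```python
import numpy as np

def audiogram_weighted_centers(sensitivity, freqs, n_filters):
    """Place filter-bank centre frequencies so that each filter spans an
    equal share of cumulative audiogram sensitivity.

    `sensitivity` is a nonnegative weighting over `freqs` (higher where the
    bird hears better), so filters crowd into high-sensitivity bands.
    """
    w = np.asarray(sensitivity, dtype=float)
    cdf = np.cumsum(w)
    cdf = cdf / cdf[-1]
    # centre of the i-th of n equal-probability bands (inverse-CDF placement)
    targets = (np.arange(n_filters) + 0.5) / n_filters
    return np.interp(targets, cdf, freqs)

# Stand-in audiogram: a Gaussian sensitivity peak near 4 kHz is assumed,
# since the paper's audiogram equation is not given in the abstract.
freqs = np.linspace(100, 10000, 1000)
sens = np.exp(-0.5 * ((freqs - 4000) / 1500) ** 2)
centers = audiogram_weighted_centers(sens, freqs, 16)
```

With this weighting the centre frequencies cluster around the assumed 4 kHz sensitivity peak, analogous to how the Bird-Adapted Filter concentrates filters in a species' most sensitive bands.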

Keywords: avian audiogram, bird individual identification, bird song processing, bird species recognition, filter bank

Procedia PDF Downloads 387
4810 Predictors of Lost to Follow-Up among HIV Patients Attending Anti-Retroviral Therapy Treatment Centers in Nigeria

Authors: Oluwasina Folajinmi, Kate Ssamulla, Penninah Lutung, Daniel Reijer

Abstract:

Background: Despite the well-verified benefits of anti-retroviral therapy (ART) in prolonging life expectancy, loss to follow-up (LTFU) presents a challenge to the success of ART programs in resource-limited countries like Nigeria. In several studies of ART programs in developing countries, researchers have reported a high rate of LTFU among patients receiving care and treatment at ART treatment centers. This study seeks to determine the causes of LTFU among HIV clients. Method: A descriptive cross-sectional study focused on a population of 9,280 persons living with HIV/AIDS who were enrolled in nine treatment centers in Nigeria (both pre-ART and ART patients were included). Of the total population, 1,752 (18.9%) were found to be LTFU. From this group we randomly selected 1,200 clients (68.5%), whose patient information was retrieved from a database. Data on demographics, CD4 counts, and causes of LTFU were analyzed and summarized. Results: Of the 1,200 LTFU clients selected, 462 (38.5%) were on ART, of whom 341 (73.8%) had a CD4 level < 500 cells/µL, while 738 (61.5%) on pre-ART had a CD4 level > 500 cells/µL. Records included a telephone number for 675 clients (56.1%). The majority of clients, 731 (60.9%), lived no more than 25 km from the ART center. A majority were female (926, or 77.2%), while 274 (22.8%) were male. 675 clients (56.1%) were reportedly traced via telephone and home address. For 326 clients (27.2%) the phone numbers were not reachable, and 173 (14.4%) of the telephone numbers were incomplete. 71 clients (5.9%) had relocated due to communal crises, and expert client trackers reported that some patients could not afford transportation to ART centers. Conclusion: This study shows that low health education levels, poverty, relocation, and lack of reliable phone contact were major predictors of LTFU. Periodic updates of home addresses and telephone contacts (including at least two next of kin), phone text messages, and home visits may improve follow-up. Early and consistent tracking of missed appointments is crucial. Creation of more decentralized ART centres is needed to avoid long travel distances.

Keywords: anti-retroviral therapy, HIV/AIDS, predictors, lost to follow up

Procedia PDF Downloads 304
4809 Adaptation of Projection Profile Algorithm for Skewed Handwritten Text Line Detection

Authors: Kayode A. Olaniyi, Tola. M. Osifeko, Adeola A. Ogunleye

Abstract:

Text line segmentation is an important step in document image processing. It represents a labeling process that assigns the same label, using a distance-metric probability, to spatially aligned units. Text line detection techniques have been implemented successfully mainly in printed documents. However, processing of handwritten texts, especially unconstrained documents, has remained a key problem. This is because unconstrained handwritten text lines are often not uniformly skewed. The spaces between text lines may not be obvious, complicated by the nature of handwriting and by overlapping ascenders and/or descenders of some characters. Hence, text line detection and segmentation represents a leading challenge in handwritten document image processing. Text line detection methods that rely on the traditional global projection profile of the text document cannot efficiently cope with the problem of variable skew angles between different text lines; hence, the formulation of a horizontal line as a separator is often not effective. This paper presents a technique to segment a handwritten document into distinct lines of text. The proposed algorithm starts by partitioning the initial text image across its width into vertical strips of about 5% each. For each vertical strip, the histogram of horizontal runs is projected. We have worked with the assumption that text lines appearing within a single strip are almost parallel to each other. The algorithm provides a sliding window through the first vertical strip on the left side of the page and runs through it to identify each new minimum corresponding to a valley in the projection profile. Each valley represents the starting point of an orientation line, and the ending point is the minimum point on the projection profile of the next vertical strip. The derived text lines traverse around any obstructing connected component of handwriting by associating it to either the line above or below; the decision to associate such a connected component is made by the probability obtained from a distance-metric decision. The technique outperforms the global projection profile for text line segmentation, and it is robust in handling skewed documents and those with lines running into each other.
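The per-strip projection step can be sketched as follows; the strip width matches the paper's 5%, but the smoothing window is an assumption, and the valley-linking across strips is not shown:

```python
import numpy as np

def strip_valleys(binary_img, strip_frac=0.05, smooth=9):
    """Per-strip horizontal projection profile with valley detection.

    `binary_img` is a 2-D array with ink pixels = 1.  The page is cut into
    vertical strips of about strip_frac of its width (5% as in the paper);
    within each strip the row-wise ink histogram is smoothed and its local
    minima (valleys) are returned as candidate line separators.
    """
    h, w = binary_img.shape
    strip_w = max(1, int(round(w * strip_frac)))
    kernel = np.ones(smooth) / smooth
    valleys_per_strip = []
    for x0 in range(0, w, strip_w):
        profile = binary_img[:, x0:x0 + strip_w].sum(axis=1).astype(float)
        profile = np.convolve(profile, kernel, mode="same")  # smooth rows
        # interior local minima of the smoothed profile
        v = [y for y in range(1, h - 1)
             if profile[y] <= profile[y - 1] and profile[y] < profile[y + 1]]
        valleys_per_strip.append(v)
    return valleys_per_strip
```

Each strip's valley list would then be chained to the neighbouring strip's valleys to form the skew-following separator lines described above.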

Keywords: connected-component, projection-profile, segmentation, text-line

Procedia PDF Downloads 124
4808 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach

Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh

Abstract:

This study presents a hybrid pre-processing approach, along with a conceptual model, to enhance the accuracy of river discharge prediction. To achieve this goal, the Ensemble Empirical Mode Decomposition algorithm (EEMD), the Discrete Wavelet Transform (DWT), and Mutual Information (MI) were employed as a hybrid pre-processing approach conjugated to a Least Squares Support Vector Machine (LSSVM). A conceptual strategy, namely the multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein was capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI, to be employed in the multi-station model. In the proposed feature selection method, uninformative sub-series were omitted to achieve better performance. Results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
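The EEMD and DWT decompositions themselves are typically obtained from libraries such as PyEMD and PyWavelets; the sketch below shows only the MI-based sub-series selection step, using a simple histogram MI estimate (the bin count is an assumption):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of the mutual information (nats) between x and y."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def select_subseries(subseries, target, keep):
    """Rank decomposed sub-series (e.g. EEMD IMFs or DWT details) by their
    mutual information with the target discharge and keep the `keep` best."""
    scores = [mutual_information(s, target) for s in subseries]
    order = np.argsort(scores)[::-1]
    return sorted(order[:keep].tolist())
```

Sub-series whose MI with the observed discharge is low are the "useless" components omitted before the LSSVM stage.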

Keywords: river stage-discharge process, LSSVM, discrete wavelet transform, ensemble empirical mode decomposition, multi-station modeling

Procedia PDF Downloads 175
4807 Gene Prediction in DNA Sequences Using an Ensemble Algorithm Based on Goertzel Algorithm and Anti-Notch Filter

Authors: Hamidreza Saberkari, Mousa Shamsi, Hossein Ahmadi, Saeed Vaali, MohammadHossein Sedaaghi

Abstract:

In recent years, using signal processing tools for accurate identification of protein coding regions has become a challenge in bioinformatics. Most genomic signal processing methods are based on the period-3 characteristic of the nucleotides in DNA strands; consequently, spectral analysis is applied to the numerical sequences of DNA to find the location of periodic components. In this paper, a novel ensemble algorithm for gene selection in DNA sequences is presented, based on the combination of the Goertzel algorithm and an anti-notch filter (ANF). The proposed algorithm has many advantages compared to other conventional methods. Firstly, it identifies protein coding regions more accurately, owing to the Goertzel algorithm being tuned to the desired frequency. Secondly, faster detection time is achieved. The proposed algorithm is applied to several genes, including genes available in the BG570 and HMR195 databases, and the results are compared to other methods based on nucleotide-level evaluation criteria. Implementation results show the excellent performance of the proposed algorithm in identifying protein coding regions, specifically in the identification of small-scale gene areas.
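The period-3 property can be probed directly with the Goertzel recursion tuned to f = 1/3 cycles/sample; a minimal sketch over the four binary (Voss) indicator sequences follows. The paper's anti-notch filter stage and the ensemble combination are not shown:

```python
import math

def goertzel_power(x, freq):
    """Goertzel recursion: spectral power of sequence x at normalized
    frequency `freq` (cycles per sample)."""
    coeff = 2.0 * math.cos(2.0 * math.pi * freq)
    s_prev, s_prev2 = 0.0, 0.0
    for sample in x:
        s = sample + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

def period3_power(dna):
    """Total Goertzel power at f = 1/3 over the four binary (Voss)
    indicator sequences of a DNA string; protein coding regions tend to
    show a pronounced peak at this frequency."""
    total = 0.0
    for base in "ACGT":
        indicator = [1.0 if b == base else 0.0 for b in dna]
        total += goertzel_power(indicator, 1.0 / 3.0)
    return total
```

Sliding this measure along a sequence yields the spectral profile on which coding/non-coding decisions are made; a strongly period-3 segment scores far higher than a shuffled segment of the same composition.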

Keywords: protein coding regions, period-3, anti-notch filter, Goertzel algorithm

Procedia PDF Downloads 387
4806 An Enhanced MEIT Approach for Itemset Mining Using Levelwise Pruning

Authors: Tanvi P. Patel, Warish D. Patel

Abstract:

Association rule mining forms the core of data mining and is termed one of its well-known methodologies. The objective of mining is to find interesting correlations, frequent patterns, associations, or causal structures among sets of items in transaction databases or other data repositories. Hence, association rule mining is imperative to mine patterns and then generate rules from the obtained patterns. For efficient targeted query processing, frequent pattern finding, and itemset mining, an efficient way to store itemsets is a tree structure named the Memory Efficient Itemset Tree (MEIT). The memory efficient IT is efficient for storing itemsets but takes more time compared to the traditional IT. The proposed strategy generates maximal frequent itemsets from the memory efficient itemset tree by using levelwise pruning. To this end, pre-pruning of items based on a minimum support count is first carried out, followed by itemset tree reconstruction. By producing maximal frequent itemsets, fewer patterns are generated, and the tree size is also reduced compared to MEIT. Therefore, the enhanced approach to the memory efficient IT proposed here helps to optimize main memory overhead as well as reduce processing time.
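The MEIT structure itself is not reproduced here; the sketch below shows only the levelwise mining with support pre-pruning and the final maximality filter, operating on a plain transaction list rather than the itemset tree:

```python
from itertools import combinations

def maximal_frequent_itemsets(transactions, min_support):
    """Levelwise (Apriori-style) mining that keeps only *maximal* frequent
    itemsets, i.e. frequent sets with no frequent proper superset.

    Items below min_support are pre-pruned before any candidate
    generation, mirroring the pre-pruning step described above.
    """
    tx = [frozenset(t) for t in transactions]

    def support(s):
        return sum(1 for t in tx if s <= t)

    items = sorted({i for t in tx for i in t})
    frequent = [frozenset([i]) for i in items
                if support(frozenset([i])) >= min_support]   # pre-pruning
    all_frequent = set(frequent)
    while frequent:
        nxt = set()
        for a, b in combinations(frequent, 2):
            cand = a | b
            if len(cand) == len(a) + 1 and support(cand) >= min_support:
                nxt.add(cand)
        frequent = sorted(nxt, key=sorted)
        all_frequent |= nxt
    # drop every frequent set that has a frequent proper superset
    return sorted((set(s) for s in all_frequent
                   if not any(s < t for t in all_frequent)),
                  key=lambda s: (len(s), sorted(s)))
```

Because only maximal sets are reported, the output (and the corresponding tree) is far smaller than the full set of frequent itemsets while losing no frequency information.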

Keywords: association rule mining, itemset mining, itemset tree, meit, maximal frequent pattern

Procedia PDF Downloads 371
4805 The Effectiveness of the South African Government Theory of Expanded Public Works Program: Infrastructure

Authors: Siziwe Monica Zuma

Abstract:

The Expanded Public Works Program (EPWP) is an instrument that the South African Government uses to reduce unemployment and poverty and to stimulate economic growth. However, due to the limited budget and programs in the EPWP, the program has had challenges in reducing unemployment and poverty and in stimulating economic growth. The EPWP Vuk’uphile program has had positive outcomes in developing Black emerging contractors, enabling them to participate in the mainstream economy far better than before the EPWP program was introduced. The skills component of the program, particularly EPWP Infrastructure, which is the most funded program under the EPWP, has had limited success in transferring appropriate skills to ensure labour participants can penetrate the labour market after participating in the EPWP. Education and skills are important attributes that can contribute to labour absorption; however, the EPWP, particularly the infrastructure program, needs to strengthen skills development over a longer period of time, suggested to be a year, with multiple skills relevant to the labour market. Longer and more sustained employment provides a safety net and reduces poverty better than short-term employment. The EPWP program can be expanded in the infrastructure sector, focusing on rural infrastructure, agricultural infrastructure, and infrastructure-related components like property, ownership, management, and other services. These can stimulate the Economic Sector Infrastructure of the EPWP and offer longer-term and more sustained employment, rural enterprise development, and further employment.

Keywords: Expanded Public Works Program (EPWP), Vuk’uphile, youth, Public Works Programs (PWP), Infrastructure Sector of EPWP (EPWP Infrastructure)

Procedia PDF Downloads 218
4804 Roasting Degree of Cocoa Beans by Artificial Neural Network (ANN) Based Electronic Nose System and Gas Chromatography (GC)

Authors: Juzhong Tan, William Kerr

Abstract:

Roasting is a critical procedure in chocolate processing, in which special flavors are developed, moisture content is decreased, and better processing properties are obtained. Determination of the roasting degree of cocoa beans is therefore important for chocolate manufacturers to ensure the quality of chocolate products, and it also determines the commercial value of cocoa beans collected from cocoa farmers. Assessment of the roasting degree of cocoa beans currently relies on human specialists, who are sometimes biased, and on chemical analysis, which takes a long time and is inaccessible to many manufacturers and farmers. In this study, a self-made electronic nose system consisting of gas sensors (TGS 800 and 2000 series) was used to detect the gas generated by cocoa beans of different roasting degrees (0, 20, 30, and 40 min), and the signals collected by the gas sensors were used to train a three-layer ANN. Chemical analysis of the graded beans was performed with a traditional GC-MS system, and the contents of volatile chemical compounds were used to train another ANN as a reference to the ANN trained on electronic nose signals. Both trained ANNs were used to predict cocoa beans of different roasting degrees for validation. The best grading accuracy achieved by the ANN trained on electronic nose signals (using signals from TGS 813, 826, 820, 880, 830, 2620, 2602, and 2610) was 96.7%, whereas the GC-trained ANN achieved an accuracy of 83.8%.
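A minimal stand-in for the sensor-to-roast-class mapping is a small feed-forward network; everything below (layer size, learning rate, and especially the simulated sensor readings) is illustrative, since the paper's TGS data and network details are not given in the abstract:

```python
import numpy as np

def train_enose_mlp(X, y, hidden=16, epochs=800, lr=0.1, seed=0):
    """One-hidden-layer network (tanh + softmax) trained by batch gradient
    descent; a sketch standing in for the paper's three-layer ANN."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    k = int(y.max()) + 1
    W1 = rng.normal(0.0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, k)); b2 = np.zeros(k)
    Y = np.eye(k)[y]
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)
        Z = H @ W2 + b2
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)        # softmax probabilities
        G = (P - Y) / n                          # output-layer gradient
        GH = (G @ W2.T) * (1.0 - H ** 2)         # backprop through tanh
        W2 -= lr * (H.T @ G);  b2 -= lr * G.sum(axis=0)
        W1 -= lr * (X.T @ GH); b1 -= lr * GH.sum(axis=0)
    return W1, b1, W2, b2

def predict(params, X):
    W1, b1, W2, b2 = params
    return np.argmax(np.tanh(X @ W1 + b1) @ W2 + b2, axis=1)

# Entirely simulated 'sensor array' data: 4 roast classes, 8 sensors with
# class-specific mean responses (real TGS responses are not public).
rng = np.random.default_rng(1)
centers = rng.normal(0.0, 2.0, (4, 8))
labels = np.repeat(np.arange(4), 50)
readings = centers[labels] + rng.normal(0.0, 0.3, (200, 8))
readings = (readings - readings.mean(0)) / readings.std(0)   # standardize
params = train_enose_mlp(readings, labels)
accuracy = float(np.mean(predict(params, readings) == labels))
```

On such cleanly separated synthetic classes the network grades nearly perfectly; the real task is harder because sensor responses of adjacent roast degrees overlap.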

Keywords: artificial neural network, cocoa bean, electronic nose, roasting

Procedia PDF Downloads 234
4803 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring

Authors: Zheng Wang, Zhenhong Li, Jon Mills

Abstract:

Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring of various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise in processing high temporal-resolution continuous GBSAR data, including the extreme computational random-access memory (RAM) cost, the delay in producing displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and processes continuous GBSAR images unit by unit, where the images within a window form a basic unit. With this strategy, the RAM requirement is reduced to only one unit of images, and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected because the chain keeps temporarily coherent pixels, which are present only in certain units rather than throughout the whole observation period. The chain supports real-time processing of continuous data, and the delay in creating displacement maps can be shortened without waiting for the entire dataset. The other chain measures deformation between discontinuous campaigns. Temporal averaging is carried out on the stack of images from a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence. The temporally averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the level of a few sub-millimetres are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring in a wide range of scientific and practical applications.
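The core arithmetic of the discontinuous chain (coherent temporal averaging within a campaign, an interferogram between campaign averages, and phase-to-displacement scaling) can be sketched as below. The 17.4 mm Ku-band wavelength and the sign convention are assumptions, and real processing additionally needs coregistration, filtering, and phase unwrapping:

```python
import numpy as np

def temporal_average(slc_stack):
    """Coherently average a stack of complex SLC images acquired within one
    discontinuous campaign to raise the signal-to-noise ratio."""
    return np.mean(slc_stack, axis=0)

def campaign_interferogram(stack_a, stack_b):
    """Interferometric phase between the temporal averages of two campaigns
    (coregistration and unwrapping are omitted from this sketch)."""
    a = temporal_average(stack_a)
    b = temporal_average(stack_b)
    return np.angle(a * np.conj(b))

def phase_to_displacement(unwrapped_phase, wavelength=0.0174):
    """Line-of-sight displacement from unwrapped phase.  The Ku-band
    wavelength (17.4 mm) and the sign convention (motion towards the radar
    gives negative phase) are assumptions that vary between processors."""
    return -wavelength / (4.0 * np.pi) * unwrapped_phase
```

Averaging N images within a campaign suppresses the per-pixel phase noise by roughly the square root of N, which is what makes sub-millimetre displacement series feasible between sparse campaigns.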

Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring

Procedia PDF Downloads 161
4802 Selecting Special Education as a Career: A Qualitative Study of Motivating Factors for Special Education Teachers

Authors: Jennifer Duffy, Liz Fleming

Abstract:

Teacher shortage in special education is an American educational problem. With the implementation of the No Child Left Behind Act (2001) and the Individuals with Disabilities Education Improvement Act (2004), there has been an increase in the number of students requiring special education services. Consequently, there has been an influx of hiring for special education teachers. However, the historic challenge of hiring certified special education teachers has been intensified by the profession's increasing number of positions to fill. Efforts to improve recruitment and entry into the field must be informed by an understanding of the factors that initially inspire special education teachers to choose this career pathway. Hence, an understanding of the reasons why teachers select special education as a profession is needed. The purpose of this study was to explore the personal, academic, and professional motivations that lead to the selection of special education as a career choice. Using a grounded theory approach, this investigation examined the factors that were most instrumental in influencing applicants to select special education as a career. Over one hundred de-identified applications to Bay Path University's graduate special education programs from 2014 to 2017 were qualitatively analyzed. Grounded coding was used to discover themes that emerged in applicants' admissions essays explaining why they were pursuing a career in special education. The central themes most influential in applicants' selection of special education as a career trajectory were (a) personal/familial connections to disability, (b) meaningful paraprofessional experiences working with disabled children, (c) aptitudes for teaching, and (d) finding personal rewards and professional fulfillment by advocating for vulnerable children. Implications of these findings include educating family members of children with disabilities about possible career tracks in special education, designing programs for paraprofessionals to become certified teachers, exposing prospective teacher candidates to the field of special education, and recruiting professionals from the human services field who seek to improve the quality of life and educational opportunities for children with special needs.

Keywords: career choice, professional pathways to teaching children with disabilities, special education, teacher recruitment

Procedia PDF Downloads 295
4801 Development of the Food Market of the Republic of Kazakhstan in the Field of Milk Processing

Authors: Gulmira Zhakupova, Tamara Tultabayeva, Aknur Muldasheva, Assem Sagandyk

Abstract:

The development of technologies for producing products with increased biological value, based on the use of natural food raw materials, is an important task in the food market policy of the Republic of Kazakhstan. For Kazakhstan, livestock farming, in particular sheep farming, is an ancient and well-developed industry and way of life. The history of the Kazakh people is strongly connected with this type of agricultural production, with established traditions of using dairy products made from sheep's milk. Therefore, the development of new technologies for sheep's milk remains relevant. In addition, one of the most promising areas in the development of food technology for therapeutic and prophylactic purposes is sheep milk products as a source of protein, immunoglobulins, minerals, vitamins, and other biologically active compounds. This article presents the results of research on milk processing technology. The objective of the study is to examine the possibilities of processing sheep milk and its role in human nutrition, together with the results of research to improve the technology of sheep milk products. The studies were carried out on the basis of sanitary and hygienic requirements for dairy products, in accordance with the following test methods. For microbiological analysis, we used the method for detecting Salmonella bacteria (the horizontal method for detecting, counting, and serotyping Salmonella) in a defined mass or volume of product. Nutritional value is the complex of properties of food products that meet human physiological needs for energy and basic nutrients. The protein mass fraction was determined by the Kjeldahl method, which is based on the mineralization of a milk sample with concentrated sulfuric acid in the presence of an oxidizing agent, an inert salt (potassium sulfate), and a catalyst (copper sulfate); the amino groups of the protein are thereby converted into ammonium sulfate dissolved in sulfuric acid. The vitamin composition was determined by HPLC. To determine the content of mineral substances in the studied samples, atomic absorption spectrophotometry was used. The study identified the technological parameters of sheep milk products and determined the prospects for further research on them. Microbiological studies were used to determine the safety of the studied product; according to the results of the microbiological analysis, no deviations from the norm were identified, indicating the high safety of the products under study. In terms of nutritional value, the resulting products are high in protein, and favorable data on the amino acid content were also obtained. The results will be used in the food industry and will serve as recommendations for manufacturers.
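The arithmetic behind the Kjeldahl protein figure can be made explicit; the titration constant 1.4007 and the milk nitrogen-to-protein factor 6.38 are the conventional values, assumed here since the abstract does not state which were used:

```python
def kjeldahl_nitrogen(ml_acid, ml_blank, acid_normality, sample_g):
    """Percent nitrogen from the titration step:
    %N = 1.4007 * (V_sample - V_blank) * N_acid / sample mass (g)."""
    return 1.4007 * (ml_acid - ml_blank) * acid_normality / sample_g

def kjeldahl_protein(percent_nitrogen, factor=6.38):
    """Crude protein from Kjeldahl nitrogen; 6.38 is the conventional
    nitrogen-to-protein factor for milk (6.25 for most other foods)."""
    return percent_nitrogen * factor

# Hypothetical run: 5 g sample, 20.0 mL of 0.1 N acid against a 0.5 mL blank.
n_pct = kjeldahl_nitrogen(20.0, 0.5, 0.1, 5.0)
protein_pct = kjeldahl_protein(n_pct)
```

The choice of factor matters: using the generic 6.25 instead of 6.38 understates milk protein by about 2%, which is why dairy standards specify the factor explicitly.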

Keywords: dairy, milk processing, nutrition, colostrum

Procedia PDF Downloads 57
4800 Active Noise Cancellation in the Rectangular Enclosure Systems

Authors: D. Shakirah Shukor, A. Aminudin, Hashim U. A., Waziralilah N. Fathiah, T. Vikneshvaran

Abstract:

Interior noise control is essential to explore, as interior acoustic analysis is significant in systems such as automobiles, aircraft, air-handling systems, and diesel engine exhaust systems. In this research, experimental work was undertaken to cancel active noise in a rectangular enclosure. The rectangular enclosure was fabricated with multiple speakers and microphones inside. A software program using digital signal processing was implemented to evaluate the proposed method. Experimental work was conducted to obtain the acoustic behavior and characteristics of the rectangular enclosure and to achieve noise cancellation based on active noise control in the low-frequency range. Noise was generated using multiple speakers inside the enclosure, and microphones were used for noise measurements. The technique for noise cancellation relies on the principle of destructive interference between two sound fields in the rectangular enclosure: one field is generated by the original or primary sound source, the other by a secondary sound source set up to interfere with, and cancel, the unwanted primary sound. At the end of this research, the output noise before and after cancellation is presented and discussed. On the basis of the findings presented in this research, active noise cancellation in rectangular enclosures is worth exploring in order to improve noise control technologies.
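The destructive-interference principle can be sketched with a single-channel LMS adaptive canceller. This is plain LMS only; a real enclosure controller would need the filtered-x variant (FxLMS) to model the secondary acoustic path, and the filter length and step size below are illustrative:

```python
import numpy as np

def lms_anc(reference, primary, taps=32, mu=0.01):
    """Single-channel LMS adaptive canceller.

    The adaptive filter drives the secondary source with y[n] = w . x[n],
    so the error-microphone signal e[n] = primary[n] - y[n] is minimized,
    i.e. the secondary field destructively interferes with the primary one.
    """
    w = np.zeros(taps)
    x_buf = np.zeros(taps)
    e = np.zeros(len(primary))
    for n in range(len(primary)):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = reference[n]
        y = w @ x_buf                  # anti-noise sample
        e[n] = primary[n] - y          # residual at the error microphone
        w += 2.0 * mu * e[n] * x_buf   # LMS weight update
    return e, w
```

For a low-frequency tone the filter converges within a few hundred samples and the residual at the error microphone collapses to a small fraction of the primary noise power, which is the before/after contrast the experiment measures.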

Keywords: active noise control, digital signal processing, noise cancellation, rectangular enclosure

Procedia PDF Downloads 272
4799 Exploration into Bio Inspired Computing Based on Spintronic Energy Efficiency Principles and Neuromorphic Speed Pathways

Authors: Anirudh Lahiri

Abstract:

Neuromorphic computing, inspired by the intricate operations of biological neural networks, offers a revolutionary approach to overcoming the limitations of traditional computing architectures. This research proposes the integration of spintronics with neuromorphic systems, aiming to enhance computational performance, scalability, and energy efficiency. Traditional computing systems, based on the Von Neumann architecture, struggle with scalability and efficiency due to the segregation of memory and processing functions. In contrast, the human brain exemplifies high efficiency and adaptability, processing vast amounts of information with minimal energy consumption. This project explores the use of spintronics, which utilizes the electron's spin rather than its charge, to create more energy-efficient computing systems. Spintronic devices, such as magnetic tunnel junctions (MTJs) manipulated through spin-transfer torque (STT) and spin-orbit torque (SOT), offer a promising pathway to reducing power consumption and enhancing the speed of data processing. The integration of these devices within a neuromorphic framework aims to replicate the efficiency and adaptability of biological systems. The research is structured into three phases: an exhaustive literature review to build a theoretical foundation, laboratory experiments to test and optimize the theoretical models, and iterative refinements based on experimental results to finalize the system. The initial phase focuses on understanding the current state of neuromorphic and spintronic technologies. The second phase involves practical experimentation with spintronic devices and the development of neuromorphic systems that mimic synaptic plasticity and other biological processes. The final phase focuses on refining the systems based on feedback from the testing phase and preparing the findings for publication. The expected contributions of this research are twofold. 
Firstly, it aims to significantly reduce the energy consumption of computational systems while maintaining or increasing processing speed, addressing a critical need in the field of computing. Secondly, it seeks to enhance the learning capabilities of neuromorphic systems, allowing them to adapt more dynamically to changing environmental inputs, thus better mimicking the human brain's functionality. The integration of spintronics with neuromorphic computing could revolutionize how computational systems are designed, making them more efficient, faster, and more adaptable. This research aligns with the ongoing pursuit of energy-efficient and scalable computing solutions, marking a significant step forward in the field of computational technology.

Keywords: material science, biological engineering, mechanical engineering, neuromorphic computing, spintronics, energy efficiency, computational scalability, synaptic plasticity

Procedia PDF Downloads 43
4798 From Modeling of Data Structures towards Automatic Programs Generating

Authors: Valentin P. Velikov

Abstract:

Automatic program generation saves time and human resources and yields syntactically clear, logically correct modules. Fourth-generation programming languages allow the data and the processes of the subject area to be described graphically, and from that description a frame of the corresponding information system is obtained. The application can be separated into an interface and business logic; interactive generation of the required system therefore relies either on an already existing toolkit or on the creation of a new one.
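The idea of generating a program frame from a model of the subject area's data can be sketched in a few lines; the `Customer` model and `generate_class` helper are hypothetical illustrations, not part of the authors' toolkit.

```python
# A declarative model of the subject area's data is turned into
# syntactically correct, immediately usable program text.
MODEL = {"name": "Customer", "fields": ["id", "name", "email"]}

def generate_class(model):
    """Emit source code for a record class from the data-structure model."""
    args = ", ".join(model["fields"])
    body = "\n".join(f"        self.{f} = {f}" for f in model["fields"])
    return (f"class {model['name']}:\n"
            f"    def __init__(self, {args}):\n"
            f"{body}\n")

source = generate_class(MODEL)
namespace = {}
exec(source, namespace)        # the generated module is immediately usable
customer = namespace["Customer"](1, "Ada", "ada@example.org")
print(customer.name)           # Ada
```

A full 4GL toolkit would generate the dialog frames and business logic from the same model; this sketch shows only the data-structure half of that pipeline.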

Keywords: computer science, graphical user interface, user dialog interface, dialog frames, data modeling, subject area modeling

Procedia PDF Downloads 305
4797 Low Power Glitch Free Dual Output Coarse Digitally Controlled Delay Lines

Authors: K. Shaji Mon, P. R. John Sreenidhi

Abstract:

In deep-submicrometer CMOS processes, the time-domain resolution of a digital signal is becoming higher than the voltage resolution of analog signals. This observation is pushing circuit design toward a new paradigm in which traditional analog signal processing is progressively replaced by the processing of time intervals in the digital domain. Within this paradigm, digitally controlled delay lines (DCDLs) play the role that digital-to-analog converters play in traditional, analog-intensive circuits, and digital delay-locked loops are highly prevalent in integrated systems. This paper addresses the glitches present in delay circuits, along with area, power dissipation, and signal integrity. The DCDLs under study were designed in a 90 nm CMOS technology with six copper metal layers, strained SiGe, and a low-k dielectric. Simulation and synthesis results show that the novel circuits exhibit no glitches for the dual-output coarse DCDL while dissipating less power and consuming less area than the glitch-free NAND-based DCDL.

Keywords: glitch free, NAND-based DCDL, CMOS, deep-submicrometer

Procedia PDF Downloads 245
4796 From Cascade to Cluster School Model of Teachers’ Professional Development Training Programme: Nigerian Experience, Ondo State: A Case Study

Authors: Oloruntegbe Kunle Oke, Alake Ese Monica, Odutuyi Olubu Musili

Abstract:

This research explores the differing effectiveness of cascade and cluster models in professional development programs for educators in Ondo State, Nigeria. The cascade model emphasizes a top-down approach, in which training is cascaded from expert trainers to successive levels of teachers; the cluster model, a bottom-up approach, fosters collaborative learning among teachers within specific clusters. Through a review of the literature and empirical study of implementations of the cascade model in two academic sessions, followed by the cluster model in another two, the study examined their effectiveness in terms of teacher development, productivity, and students' achievements. The study also drew a comparative analysis of the strengths and weaknesses associated with each model, considering factors such as scalability, cost-effectiveness, adaptability to various contexts, and sustainability. 2,500 teachers from Ondo State primary schools participated in the cascade model, receiving intensive training in five zones for a week each in two academic sessions; 1,980 and 1,663 teachers in 52 and 34 clusters took part in the first and second cluster sessions, respectively. The cascade programs were designed around one week of rigorous training of teachers by facilitators, while the cluster model was made up of four components (sit-in observation, a need-based assessment workshop, pre-cluster meetings, and the actual cluster meetings, in addition to sensitization) and took place one day a week for ten weeks. A validated Cluster Impact Survey Instrument (CISI) and a Teacher's Assessment Questionnaire (TAQ) were administered to ascertain the effectiveness of the models during and after implementation. The findings from the literature detailed the specific effectiveness, strengths, and limitations of each approach, especially the potential for inconsistency and resistance to change. The data collected revealed the superiority of the cluster model.
Responses to the TAQ showed content knowledge and skill updates under both models, though these were more sustained in the cluster model. Overall, the study contributes to the ongoing discourse on effective strategies for improving teacher training and enhancing student outcomes, offering practical recommendations for the development and implementation of future professional development projects.

Keywords: cascade model, cluster model, teachers’ development, productivity, students’ achievement

Procedia PDF Downloads 41
4795 On In vitro Durum Wheat Isolated Microspore Culture

Authors: Zelikha Labbani

Abstract:

Since its first demonstration in 1964 by Guha and Maheshwari in India, working on Datura innoxia Mill., in vitro androgenesis has become the method of choice for producing doubled haploids in many species. In durum wheat, however, doubled haploid breeding programs have remained limited owing to the low yield of androgenetic embryos and the difficulty of converting them into fertile green plants. We describe here an efficient method for inducing embryos and regenerating green plants directly from isolated microspores of durum wheat.

Keywords: durum wheat, haploid embryos, on in vitro, pretreatment

Procedia PDF Downloads 355
4794 The Benefits of Regional Brand for Companies

Authors: H. Starzyczna, M. Stoklasa, K. Matusinska

Abstract:

This article deals with the benefits of regional brands for companies in the Czech Republic. The research focused on identifying both the expected and the actual benefits of regional brands for companies. Data were obtained by a questionnaire survey and analysed in IBM SPSS, using a representative sample of 204 companies. The analysis disclosed the benefits that companies expect a regional brand to bring; the actual benefits, however, fall considerably short of these expectations. Statistical testing of the hypotheses revealed that the benefits depend on the region of origin, a finding that surprised both the authors and the regional coordinators.

Keywords: brand, regional brands, product protective branding programs, brand benefits

Procedia PDF Downloads 346
4793 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University

Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat

Abstract:

Scientific researchers face the analysis of very large data sets that are growing at a noticeable rate in today's and tomorrow's technologies. Hadoop and Spark are frameworks developed for this purpose, and the Hadoop framework is suitable for many different hardware platforms. In this research, a scientific Linux cluster for big data analysis (SLBD) is presented. SLBD runs open source software with large computational capacity on a high-performance cluster infrastructure. It consists of a single cluster of identical, commodity-grade computers interconnected via a small LAN: a fast switch and Gigabit Ethernet cards connect four nodes. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster by using the MapReduce algorithm, which divides a task into smaller tasks assigned to the network nodes and then collects the partial results to form the final result data set. The SLBD clustering system allows fast and efficient processing of the large amounts of data produced by different applications, and provides high performance, high throughput, high availability, expandability, and cluster scalability.
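The MapReduce flow described above (split the task, assign pieces to nodes, collect partial results) can be sketched in a few lines; the word-count job and function names below are illustrative and are not part of SLBD itself.

```python
from collections import defaultdict

def map_phase(chunk):
    """Map: emit (key, 1) for every word in one node's input split."""
    return [(word, 1) for word in chunk.split()]

def shuffle(pairs):
    """Shuffle: group intermediate values by key across all nodes."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: merge each key's values into the final result set."""
    return {key: sum(values) for key, values in groups.items()}

chunks = ["big data big cluster", "data cluster data"]  # one split per node
mapped = [pair for chunk in chunks for pair in map_phase(chunk)]
result = reduce_phase(shuffle(mapped))
print(result)  # {'big': 2, 'data': 3, 'cluster': 2}
```

In Hadoop the map and reduce phases run on different cluster nodes and the shuffle moves data over the network; the logical structure is the same.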

Keywords: big data platforms, cloudera manager, Hadoop, MapReduce

Procedia PDF Downloads 358
4792 Correlation between the Larvae Density (Diptera: Culicidae) and Physicochemical Characteristics of Habitats in Mazandaran Province, Northern Iran

Authors: Seyed Hassan Nikookar, Mahmoud Fazeli-Dinan, Seyyed Payman Ziapour, Ahmad-Ali Enayati

Abstract:

Background: Mosquitoes seek out all kinds of aquatic habitats for laying eggs, and the characteristics of these water habitats are important in determining whether a mosquito can survive and successfully complete its developmental stages. Physicochemical factors can therefore play an important role in vector control programs. This study investigated whether physicochemical factors that differ between habitats affect larval density in Mazandaran province. Methods: Larvae were collected with a standard dipper of up to 350 ml for 15-20 minutes from fixed habitats in 16 villages of 30 townships, and the specimens were identified using morphological keys. Water samples were collected during larval collection and evaluated for temperature (°C), acidity (pH), turbidity (NTU), electrical conductivity (μS/cm), alkalinity (mg/l), total hardness (mg/l), nitrate (mg/l), chloride (mg/l), phosphate (mg/l), and sulfate (mg/l) in the selected habitats using standard methods. The Spearman correlation coefficient was used to analyse the data. Results: In total, 7566 mosquito larvae of three genera and 15 species were collected from the fixed habitats. Cx. pipiens was the dominant species, except in the villages of Tileno, Zavat, Asad Abad, and Shah Mansur Mahale, where An. maculipennis and Cx. torrentium predominated. Turbidity in Karat Koti, chloride in Al Tappeh, nitrate, phosphate, and sulfate in Chalmardi, and electrical conductivity, alkalinity, and total hardness in Komishan were significantly higher than in the other villages (P < 0.05). There were significant positive correlations between Cx. pipiens and electrical conductivity, alkalinity, total hardness, and chloride, and between Cx. tritaeniorhynchus and chloride, whereas a significant negative correlation was observed between sulfate and Cx. perexiguus.
Conclusion: The correlations observed between physicochemical factors and larval density support the idea that these parameters affect the breeding activity of mosquitoes, and manipulating such factors could facilitate larval control programs.
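The Spearman analysis used here can be reproduced with a small self-contained routine; the conductivity and larval-count figures below are invented for illustration and are not data from the study.

```python
def ranks(values):
    """Rank observations (1-based), averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Illustrative (not measured): conductivity vs. larvae per dip.
conductivity = [250, 400, 520, 610, 700]
larvae = [3, 5, 9, 8, 14]
print(round(spearman(conductivity, larvae), 2))  # 0.9: strong positive
```

Because it works on ranks, Spearman's rho detects monotone associations such as those reported between conductivity, hardness, and Cx. pipiens density without assuming linearity.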

Keywords: anopheles, culex, culiseta, physicochemical, habitats, larvae density, correlation

Procedia PDF Downloads 265
4791 Sustainable Manufacturing of Concentrated Latex and Ribbed Smoked Sheets in Sri Lanka

Authors: Pasan Dunuwila, V. H. L. Rodrigo, Naohiro Goto

Abstract:

Sri Lanka is one of the largest natural rubber (NR) producers in the world, and the NR industry is a major foreign exchange earner for the country. Among locally manufactured NR products, concentrated latex (CL) and ribbed smoked sheets (RSS) hold a significant position; furthermore, these products are the foundation for many goods used all over the world (e.g., gloves, condoms, tires). Processing CL and RSS consumes significant amounts of material, energy, and labour. Against this background, both manufacturing lines have been greatly challenged by waste, low productivity, a lack of cost efficiency, the rising cost of production, and many environmental issues. To face these challenges, the adoption of sustainable manufacturing measures that use less energy, water, and material, and that produce less waste, is imperative. However, these sectors lack comprehensive studies that shed light on such measures and thoroughly discuss their improvement potential from both environmental and economic points of view. Therefore, based on a study of three CL and three RSS mills in Sri Lanka, this work deploys sustainable manufacturing techniques and tools to uncover the underlying potential for improving performance in the CL and RSS processing sectors. The study comprises three steps: 1. quantification of average material waste, economic losses, and greenhouse gas (GHG) emissions via material flow analysis (MFA), material flow cost accounting (MFCA), and life cycle assessment (LCA) in each manufacturing process; 2. identification of improvement options with the help of Pareto and what-if analyses, field interviews, and the existing literature; and 3. validation of the identified improvement options via re-execution of the MFA, MFCA, and LCA. With the help of this methodology, the economic and environmental hotspots, and the degrees of improvement possible in both systems, could be identified.
The results highlighted that each process could be improved to produce less waste, smaller monetary losses, lower manufacturing costs, and lower GHG emissions. Conclusively, the study's methodology and findings should be beneficial for assuring sustainable growth, not only in the Sri Lankan NR processing sector itself but also in NR or other industries rooted in other developing countries.
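As a rough illustration of the MFCA step (assigning input costs to product and waste streams in proportion to the mass each carries), the following sketch uses invented figures; it is not the study's actual cost model.

```python
# MFCA idea: split the cost of inputs between product and waste streams
# by mass, making the money bound up in waste explicit.
def mfca_allocate(input_mass_kg, product_mass_kg, total_cost):
    product_share = product_mass_kg / input_mass_kg
    return {
        "product_cost": round(total_cost * product_share, 2),
        "waste_cost": round(total_cost * (1 - product_share), 2),
        "waste_fraction": round(1 - product_share, 3),
    }

# Illustrative: 1000 kg of field latex in, 850 kg leaving in product streams.
balance = mfca_allocate(1000.0, 850.0, total_cost=500.0)
print(balance["waste_cost"])  # cost carried by the 15% waste stream
```

Conventional cost accounting hides this waste cost inside the product cost; making it visible is what lets Pareto analysis rank the improvement options.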

Keywords: concentrated latex, natural rubber, ribbed smoked sheets, Sri Lanka

Procedia PDF Downloads 261
4790 Signal Processing Techniques for Adaptive Beamforming with Robustness

Authors: Ju-Hong Lee, Ching-Wei Liao

Abstract:

Adaptive beamforming using an antenna array of sensors is useful for adaptively detecting and preserving the desired signal while suppressing interference and background noise. Conventional adaptive array beamforming requires prior knowledge of either the impinging direction or the waveform of the desired signal in order to adapt the weights. The adaptive weights of an antenna array beamformer under a steered-beam constraint are calculated by minimizing the output power of the beamformer subject to the constraint that forces the beamformer to present a constant response in the steering direction. Hence, the performance of the beamformer is very sensitive to the accuracy of the steering operation. It is well known in the literature that the performance of an adaptive beamformer deteriorates under any steering angle error encountered in practical applications, e.g., wireless communication systems with massive antennas deployed at the base station and user equipment. Developing effective signal processing techniques to deal with steering angle error in array beamforming systems has therefore become an important line of research. In this paper, we present an effective signal processing technique for constructing an adaptive beamformer that is robust against steering angle error. The proposed beamformer adaptively estimates the actual direction of the desired signal by using the presumed steering vector and the received array data snapshots. Based on the presumed steering vector and a preset angle range for steering mismatch tolerance, we first create a matrix related to the direction vectors of the signal sources. Two projection matrices are generated from this matrix. The projection matrix associated with the desired signal information, together with the received array data, is used to iteratively estimate the actual direction vector of the desired signal.
The estimated direction vector of the desired signal is then used to find an appropriate quiescent weight vector. The other projection matrix serves as the signal blocking matrix required for performing adaptive beamforming. Accordingly, the proposed beamformer consists of adaptive quiescent weights and partially adaptive weights. Several computer simulation examples are provided to evaluate the proposed technique and compare it with existing robust techniques.
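The steered-beam constraint (unit response in the presumed direction) can be illustrated with a minimal delay-and-sum sketch; this is not the proposed robust algorithm, and the array geometry and angles below are assumptions for illustration.

```python
import cmath
import math

# Uniform linear array, half-wavelength spacing (assumed geometry).
def steering_vector(theta_deg, n_sensors, spacing_wl=0.5):
    """Phase progression across the array for a plane wave from theta."""
    psi = 2 * math.pi * spacing_wl * math.sin(math.radians(theta_deg))
    return [cmath.exp(1j * psi * k) for k in range(n_sensors)]

def response(weights, theta_deg, n_sensors):
    """Magnitude of the beamformer output for a unit plane wave."""
    a = steering_vector(theta_deg, n_sensors)
    return abs(sum(w.conjugate() * ak for w, ak in zip(weights, a)))

n = 8
steer = 20.0                                      # presumed direction
w = [ak / n for ak in steering_vector(steer, n)]  # quiescent weights
print(round(response(w, steer, n), 3))   # 1.0 in the look direction
print(response(w, 60.0, n) < 0.2)        # attenuated off-beam
```

When the true arrival angle differs from `steer`, the desired signal falls off this mainlobe and an adaptive beamformer may cancel it; the paper's projection-based direction estimate is what repairs that mismatch.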

Keywords: adaptive beamforming, robustness, signal blocking, steering angle error

Procedia PDF Downloads 124
4789 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation

Authors: Aicha Majda, Abdelhamid El Hassani

Abstract:

Lung CT image segmentation is a prerequisite for lung CT image analysis, and most conventional methods need post-processing to deal with abnormal lung CT scans containing lung nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm directly compares the pixel values of two neighbouring regions, which is not accurate because this kind of metric is extremely sensitive to minor perturbations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric: the boundary penalty term in the graph cut algorithm is defined using patch-based similarity measurements instead of the simple intensity measurements of the standard method. The weights between each pixel and its neighbouring pixels are derived from this new term, and the graph is then created using these weights between its nodes. Finally, the segmentation is completed with the minimum cut/max-flow algorithm. Experimental results show that the proposed method is accurate and efficient, and can directly provide explicit lung regions without any post-processing operations, in contrast to the standard method.
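The difference between a pixel-wise boundary term and a patch-based one can be seen in a tiny example; the 5x5 toy image and helper names below are illustrative, not the paper's exact metric.

```python
# Toy image: flat intensity 10 with a single noisy pixel of value 90.
def patch(image, row, col, radius=1):
    """Flatten the (2*radius+1)^2 neighbourhood around (row, col)."""
    return [image[r][c]
            for r in range(row - radius, row + radius + 1)
            for c in range(col - radius, col + radius + 1)]

def pixel_distance(image, p, q):
    """Standard boundary term: squared difference of two pixel values."""
    return (image[p[0]][p[1]] - image[q[0]][q[1]]) ** 2

def patch_distance(image, p, q, radius=1):
    """Patch-based term: mean squared difference over whole patches."""
    a, b = patch(image, *p, radius), patch(image, *q, radius)
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

img = [[10] * 5 for _ in range(5)]
img[2][2] = 90                       # isolated noise spike
noisy, neighbour = (2, 2), (2, 1)
print(pixel_distance(img, noisy, neighbour))            # 6400
print(round(patch_distance(img, noisy, neighbour), 1))  # much smaller
```

Averaging over the patch dilutes the single outlier, so the edge weights fed to the min-cut/max-flow step are far less distorted by noise than pixel-wise differences.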

Keywords: graph cuts, lung CT scan, lung parenchyma segmentation, patch-based similarity metric

Procedia PDF Downloads 169
4788 Production of High Purity Cellulose Products from Sawdust Waste Material

Authors: Simiksha Balkissoon, Jerome Andrew, Bruce Sithole

Abstract:

Approximately half of the wood processed in the forestry, timber, pulp and paper (FTPP) sector accumulates as waste. The concept of a "green economy" encourages industries to employ revolutionary, transformative technologies to eliminate waste generation by exploring the development of new value chains. The transition towards an almost paperless world, driven by the rise of digital media, has resulted in a decline in traditional paper markets, prompting the FTPP sector to reposition itself and expand its product offerings by unlocking value-adding opportunities from renewable resources such as wood, both to generate revenue and to mitigate its environmental impact. The production of valuable products from wood waste such as sawdust has been extensively explored in recent years. Wood components such as lignin, cellulose, and hemicelluloses, which can be extracted selectively by chemical processing, are suitable candidates for producing numerous high-value products. In this study, a novel approach was developed to produce high-value cellulose products, such as dissolving wood pulp (DWP), from sawdust. DWP is a high-purity cellulose product used in applications across the pharmaceutical, textile, food, and paint and coatings industries. The proposed approach has the potential to eliminate several complex processing stages, such as pulping and bleaching, that are associated with traditional commercial processes for high-purity cellulose products such as DWP, making it less chemical-, energy-, and water-intensive. The developed process followed a path of experimentally designed laboratory tests evaluating typical processing conditions such as residence time, chemical concentrations, liquid-to-solid ratios, and temperature, followed by the application of suitable purification steps. The product from the initial stage was characterized using commercially available DWP grades as reference materials.
The chemical characteristics of the products obtained so far are similar to those of the commercial products, making the proposed process a promising and viable option for producing DWP from sawdust.

Keywords: biomass, cellulose, chemical treatment, dissolving wood pulp

Procedia PDF Downloads 185
4787 Short Answer Grading Using Multi-Context Features

Authors: S. Sharan Sundar, Nithish B. Moudhgalya, Nidhi Bhandari, Vineeth Vijayaraghavan

Abstract:

Automatic short answer grading is one of the prime applications of artificial intelligence in education. Several approaches have been explored over the years, involving selective handcrafted features, graphical matching techniques, concept identification and mapping, complex deep frameworks, sentence embeddings, and more. However, with the real-world application of the task in mind, these solutions carry some overhead in computation and resources to achieve high performance. In this work, a simple and effective solution is proposed that uses elemental features based on statistical and linguistic properties and word-based similarity measures, in conjunction with tree-based classifiers and regressors. The results for the classification tasks show improvements ranging from 1% to 30%, while the regression task shows a stark improvement of 35%. The authors attribute these improvements to the addition of multiple similarity scores, which provide an ensemble of scoring criteria to the models. The authors also believe this work demonstrates that classical natural language processing techniques and simple machine learning models can achieve high results for short answer grading.
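A few elemental word-based similarity features of the kind described above can be sketched as follows; the feature names and example answers are illustrative, and in the proposed approach a tree-based model would consume several such scores at once.

```python
# Elemental word-overlap features between a reference and a student answer.
def tokens(text):
    return [w.strip(".,;:").lower() for w in text.split()]

def overlap_features(reference, answer):
    ref, ans = set(tokens(reference)), set(tokens(answer))
    inter = ref & ans
    return {
        "jaccard": len(inter) / len(ref | ans),
        "recall": len(inter) / len(ref),      # reference terms covered
        "length_ratio": len(ans) / max(len(ref), 1),
    }

reference = "Photosynthesis converts light energy into chemical energy"
good = "Plants use photosynthesis to convert light energy into chemical energy"
poor = "It is a process in plants"

f_good = overlap_features(reference, good)
f_poor = overlap_features(reference, poor)
print(round(f_good["recall"], 2))   # most reference terms recovered
print(round(f_poor["recall"], 2))   # almost none
```

Each feature is cheap to compute, which is exactly the low-overhead property the work argues for relative to deep frameworks.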

Keywords: artificial intelligence, intelligent systems, natural language processing, text mining

Procedia PDF Downloads 133
4786 The Effect of 'Teachers Teaching Teachers' Professional Development Course on Teachers’ Achievement and Classroom Practices

Authors: Nuri Balta, Ali Eryilmaz

Abstract:

High-quality teachers are the key to improving student learning; without professional development of teachers, improvement in student success is difficult and incomplete. This study offers an in-service training course model for the professional development (PD) of teachers entitled "teachers teaching teachers" (TTT). The basic premise of the PD program designed for this study was primarily to increase the subject matter knowledge of high school physics teachers. The TTT course, consisting of three-hour workshops, lasted for seven weeks, with seventeen teachers taking part in the program to varying degrees. The effect of the TTT program on teachers' knowledge improvement was examined through the modern physics unit (MPU). The participating teachers taught the unit to one of their grade ten classes beforehand and taught another, equivalent class two months later; they were observed in their classes both before and after the TTT program. The teachers were divided into a placebo group and a treatment group. The Solomon four-group design aims to eliminate the possible effect of a pre-test; in this study, a similar design was used to eliminate the effect of the prior teaching. The placebo group teachers taught both of their classes as usual, while the treatment group teachers took part in the TTT program between the two teachings. The class observation results showed that the TTT program increased teachers' knowledge and skills in teaching the MPU. Further, participation in the TTT program led teachers to teach the MPU in accordance with the requirements of the curriculum. To detect any change in the participating teachers' own success, an achievement test was administered to them. A large effect size (dCohen = .93) was calculated for the effect of the TTT program on the treatment group teachers' achievement.
The results suggest that staff developers should consider including topics attractive to teachers in in-service training programs (a) to help teachers practice teaching the new topics and (b) to increase the participation rate. During the TTT courses, it was observed that teachers could not conclude some discussions or explain some concepts. It is now clear that teachers need support, especially when discussing counterintuitive concepts such as those of modern physics. For this reason, it is recommended that content-focused PD programs be conducted under the guidance of a scholarly coach.
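The reported effect size can be checked against the standard pooled-variance formula for Cohen's d; the achievement-test scores below are invented for illustration and are not the study's data.

```python
# Cohen's d for two independent groups, pooled standard deviation.
def cohens_d(group_a, group_b):
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (ma - mb) / pooled_sd

# Invented scores for illustration.
treatment = [78, 82, 85, 90, 75]
placebo = [70, 74, 72, 79, 68]
print(round(cohens_d(treatment, placebo), 2))  # a large effect (> 0.8)
```

By Cohen's conventional benchmarks (0.2 small, 0.5 medium, 0.8 large), the study's d = .93 indicates a large treatment effect.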

Keywords: high school physics, in-service training course, modern physics unit, teacher professional development

Procedia PDF Downloads 196
4785 The Effect of Feedstock Powder Treatment / Processing on the Microstructure, Quality, and Performance of Thermally Sprayed Titanium Based Composite Coating

Authors: Asma Salman, Brian Gabbitas, Peng Cao, Deliang Zhang

Abstract:

The performance of a coating is strongly dependent upon its microstructure, which in turn depends on the characteristics of the feedstock powder. This study evaluates the production and performance of a titanium-based composite coating produced by the HVOF (high-velocity oxygen fuel) spraying method. The feedstock for the composite coating was produced by high-energy mechanical milling of TiO2 and Al powders followed by a combustion reaction, and its characteristics were improved by treating it with an organic binder. Two types of coatings were produced, using treated and untreated feedstock powders. The microstructures and characteristics of both types of coatings were studied, and their thermal shock resistance was assessed by dipping into molten aluminum. The results showed that feedstock treatment did not have a significant effect on the microstructure of the coatings; however, it did affect the uniformity, thickness, and surface roughness of the coating on the steel substrate. A coating produced from untreated feedstock showed better thermal shock resistance in molten aluminum than one produced from feedstock treated with PVA (polyvinyl alcohol).

Keywords: coating, feedstock, powder processing, thermal shock resistance, thermally spraying

Procedia PDF Downloads 272