Search results for: speech signal processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5546

1496 A Review on Medical Image Registration Techniques

Authors: Shadrack Mambo, Karim Djouani, Yskandar Hamam, Barend van Wyk, Patrick Siarry

Abstract:

This paper discusses current trends in medical image registration techniques and addresses the need to provide a solid theoretical foundation for research endeavours. A methodological analysis and synthesis of quality literature was carried out, providing a platform for developing a sound foundation for research in this field, which is crucial for understanding the existing levels of knowledge. Research on medical image registration techniques assists clinical and medical practitioners in the diagnosis of tumours and lesions in anatomical organs, thereby enabling fast and accurate curative treatment of patients. Out of these considerations, the aim of this paper is to enhance the scientific community’s understanding of the current status of research in medical image registration techniques and to communicate the contribution of this research to the field of image processing. The gaps identified in current techniques can be closed by the use of artificial neural networks, learning systems designed to minimise an error function. The paper also suggests several areas of future research in image registration.

Keywords: image registration techniques, medical images, neural networks, optimisation, transformation

Procedia PDF Downloads 172
1495 Unveiling the Detailed Turn Off-On Mechanism of Carbon Dots to Different Sized MnO₂ Nanosensor for Selective Detection of Glutathione

Authors: Neeraj Neeraj, Soumen Basu, Banibrata Maity

Abstract:

Glutathione (GSH) is one of the most important low-molecular-weight biomolecules, supporting various cellular functions such as gene regulation, xenobiotic metabolism, preservation of intracellular redox activity, and signal transduction. The detection of GSH therefore demands highly selective and sensitive techniques. Herein, a rapid fluorometric nanosensor is designed by combining carbon dots (Cdots) and MnO₂ nanoparticles of different sizes for the detection of GSH. A bottom-up approach, i.e., the microwave method, was used to prepare water-soluble and highly fluorescent Cdots from ascorbic acid as a precursor. MnO₂ nanospheres of different sizes (large, medium, and small) were prepared by varying the concentration ratio of methionine and KMnO₄ at room temperature, which was confirmed by HRTEM analysis. Successive addition of MnO₂ nanospheres to Cdots results in fluorescence quenching. From the fluorescence intensity data, Stern-Volmer quenching constants (KS-V) were evaluated. Fluorescence intensity and lifetime analysis showed that the degree of fluorescence quenching of Cdots followed the order: large > medium > small. Moreover, fluorescence recovery studies performed in the presence of GSH showed that the turn-on response followed the same order, i.e., large > medium > small, which was also confirmed by quantum yield and lifetime studies. The limits of detection (LOD) of GSH in the presence of Cdots@different-sized MnO₂ nanospheres were also evaluated. The LOD values were in the μM range and lowest in the case of large MnO₂ nanospheres. The separation distance (d) between the Cdots and the surface of the different MnO₂ nanospheres was determined; the d values increase with the size of the MnO₂ nanospheres. In summary, the synthesized Cdots@MnO₂ nanocomposites act as a rapid, simple, economical, and environmentally friendly nanosensor for the detection of GSH.
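
The Stern-Volmer constant mentioned above is conventionally obtained as the slope of F₀/F plotted against quencher concentration (F₀/F = 1 + K_SV·[Q]). The sketch below illustrates that fit with made-up intensity values, not the study's data:

```python
import numpy as np

# Hypothetical quencher concentrations (M) and measured fluorescence intensities
conc = np.array([0.0, 2e-6, 4e-6, 6e-6, 8e-6])       # [MnO2] added
F = np.array([1000.0, 830.0, 710.0, 615.0, 545.0])    # Cdot emission intensity
F0 = F[0]

# Stern-Volmer: F0/F = 1 + K_SV * [Q]  ->  slope of the linear fit gives K_SV
slope, intercept = np.polyfit(conc, F0 / F, 1)
print(f"K_SV ≈ {slope:.3e} M^-1 (intercept {intercept:.2f}, ideally ~1)")
```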

Keywords: carbon dots, fluorescence, glutathione, MnO₂ nanospheres, turn off-on

Procedia PDF Downloads 145
1494 The Comparative Electroencephalogram Study: Children with Autistic Spectrum Disorder and Healthy Children Evaluate Classical Music in Different Ways

Authors: Galina Portnova, Kseniya Gladun

Abstract:

Twenty-seven children with ASD (average age 6.13 years, average CARS score 32.41) and 25 healthy children (average age 6.35 years) participated in our EEG experiment. Six types of musical stimulation were presented, including Gluck, Javier-Naida, Kenny G, Chopin, and other classical compositions. Children with autism showed an orientation reaction to the music and gave behavioural responses to different types of music; some of them were able to rate the stimuli on scales. The participants were instructed to remain calm. Brain electrical activity was recorded using a 19-channel EEG recording device, 'Encephalan' (Russia, Taganrog). EEG epochs lasting 150 s were analyzed using the EEGLab plugin for MatLab (MathWorks Inc.). For EEG analysis we used the Fast Fourier Transform (FFT) and analyzed peak alpha frequency (PAF), correlation dimension D2, and stability of rhythms. To express the dynamics of desynchronization of different rhythms, we calculated the envelope of the EEG signal, using the whole frequency range and a set of small narrowband filters, by means of the Hilbert transform. Healthy children showed similar EEG spectral changes during musical stimulation and were able to describe the feelings induced by the musical fragments. The exception was the ‘Chopin. Prelude’ fragment (no. 6): this musical fragment induced different subjective feelings, behavioural reactions, and EEG spectral changes in children with ASD and healthy children. The correlation dimension D2 was significantly lower in children with ASD than in healthy children during musical stimulation. The Hilbert envelope frequency was reduced in both groups during musical compositions 1, 3, 5, and 6 compared to the background. During musical fragments 2 and 4 (terrible), a lower Hilbert envelope frequency was observed only in children with ASD and correlated with the severity of the disease. Alpha peak frequency during this musical composition was lower than the background in healthy children and, conversely, higher in children with ASD.
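
A minimal sketch of two of the measures named above, peak alpha frequency from an FFT spectrum and the Hilbert envelope of a narrowband-filtered signal, computed on a synthetic one-channel trace rather than the study's recordings (sampling rate and filter band are assumptions):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250                                   # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # toy signal

# Peak alpha frequency (PAF): frequency of the spectral maximum in the 8-13 Hz band
freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
power = np.abs(np.fft.rfft(eeg)) ** 2
alpha = (freqs >= 8) & (freqs <= 13)
paf = freqs[alpha][np.argmax(power[alpha])]

# Hilbert envelope of the alpha-band-filtered signal
b, a = butter(4, [8 / (fs / 2), 13 / (fs / 2)], btype="band")
envelope = np.abs(hilbert(filtfilt(b, a, eeg)))

print(f"PAF = {paf:.2f} Hz, mean alpha envelope = {envelope.mean():.3f}")
```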

Keywords: electroencephalogram (EEG), emotional perception, ASD, musical perception, Childhood Autism Rating Scale (CARS)

Procedia PDF Downloads 278
1493 Impact of Modifying the Surface Materials on the Radiative Heat Transfer Phenomenon

Authors: Arkadiusz Urzędowski, Dorota Wójcicka-Migasiuk, Andrzej Sachajdak, Magdalena Paśnikowska-Łukaszuk

Abstract:

Due to the impact of climate change and the need to reduce greenhouse gases, the demand for low-carbon and sustainable construction has increased. In this work, it is investigated how the texture of surface building materials and the radiative heat transfer phenomenon in flat multilayer partitions are correlated. Attempts to measure surface emissivity are made; however, the trustworthiness of the measurement results remains a concern, since sensor size and thickness are common problems. This paper presents an experimental method for studying surface emissivity using self-constructed thermal sensors and thermal imaging. The surfaces of the building materials were modified by mechanical and chemical treatment, reducing their emissivity. To examine the shaped material surfaces and map their three-dimensional structure, scanning profilometry was used in the laboratory. A comparison of the laboratory results with 3D computational fluid dynamics analysis shows that a change in the surface coverage of materials affects radiative heat transport between layers. Motivated by recent advancements in variational inference, the publication evaluates the potential of a dedicated data processing approach: with properly constructed temperature sensors, the influence of surface emissivity on radiation and heat transport in the entire partition can be determined.
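
As a rough numerical illustration (not the authors' model), the standard two-gray-surface formula shows how lowering emissivity reduces net radiative exchange between two parallel layers; the temperatures below are assumed values:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiative_flux(t1_k, t2_k, eps1, eps2):
    """Net radiative heat flux (W/m^2) between two large parallel gray surfaces."""
    return SIGMA * (t1_k**4 - t2_k**4) / (1 / eps1 + 1 / eps2 - 1)

# Assumed facing-layer temperatures of 20 °C and 10 °C
for eps in (0.9, 0.5, 0.1):  # progressively lower emissivity after surface treatment
    q = radiative_flux(293.15, 283.15, eps, eps)
    print(f"emissivity {eps:.1f}: q = {q:.1f} W/m^2")
```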

Keywords: heat transfer, surface roughness, surface emissivity, radiation

Procedia PDF Downloads 87
1492 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights

Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan

Abstract:

The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. To effectively analyze huge datasets, efficient NoSQL databases are needed. The analysis of post-COVID-19 health and well-being outcomes and the evaluation of the effectiveness of government efforts during the pandemic are made possible by this research’s integration of several datasets, which cuts down on query processing time and creates predictive visual artifacts. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Effective data retrieval and analysis are made possible by spreading the datasets into a sharded database and indexing the individual shards. The key goal is the analysis of connections between governmental activities, poverty levels, and post-pandemic well-being. We evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilising advanced data analysis and visualisations. The findings provide relevant data that supports the advancement of the UN sustainable development goals, future pandemic preparedness, and evidence-based decision-making. This study shows how Big Data and NoSQL databases may be used to address problems in global health.
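
A hedged sketch of the indexing and sharding set-up described above, using pymongo against a sharded cluster; the database, collection, and field names are illustrative placeholders, not those of the study's datasets:

```python
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")  # should point at a mongos router
db = client["pandemic_analytics"]                  # hypothetical database name

# Index the fields most often used in queries to cut scan time
db.health_outcomes.create_index([("country", ASCENDING), ("date", ASCENDING)])

# Distribute the collection across shards on a hashed key for an even data spread
client.admin.command("enableSharding", "pandemic_analytics")
client.admin.command(
    "shardCollection", "pandemic_analytics.health_outcomes",
    key={"country": "hashed"},
)
```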

Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well-being

Procedia PDF Downloads 64
1491 Evaluating Effectiveness of Training and Development Corporate Programs: The Russian Agribusiness Context

Authors: Ekaterina Tikhonova

Abstract:

This research aims to evaluate the effectiveness of training and development (T&D) using the example of two T&D programs for top executive management run in 2012 and 2015-2016 at Komos Group. The study was commissioned to research the effectiveness of two similar corporate T&D programs (within one company) in two periods of time (2012 and 2015-2016) by evaluating the programs’ effectiveness with Kirkpatrick’s four-level model and by calculating ROI with Phillips’ formula as an instrument for measuring T&D programs. The research investigates the correlation of two figures: the calculated ROI and the rating percentage scale for ROI implementation (Wagle’s scale). The study includes an assessment of 360-degree feedback (Kirkpatrick’s model) and Phillips’ ROI methodology, which provides a step-by-step process for collecting, summarizing, and processing the data. The data are collected from the company accounting data, the HR budgets, MCFO, and the company annual reports for the research periods. All analyzed data and reports are organized and presented in the form of tables, charts, and graphs. The paper also gives a brief description of some constraints of the research. After the ROI calculation, the study reveals that ROI falls within the average implementation range (65% to 75%) on Wagle’s scale, which can be considered a positive outcome. The paper also gives some recommendations on how to use ROI in practice and describes the main benefits of ROI implementation.
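
Phillips' ROI formula referenced above is simply net program benefits over fully loaded program costs, expressed as a percentage. A small sketch with placeholder figures (not Komos Group data):

```python
def phillips_roi(program_benefits: float, program_costs: float) -> float:
    """Phillips' ROI (%) = (net program benefits / program costs) * 100."""
    return (program_benefits - program_costs) / program_costs * 100

# Placeholder figures for one program cycle
roi = phillips_roi(program_benefits=1_750_000, program_costs=1_000_000)
print(f"ROI = {roi:.0f}%")  # 75% -> upper end of the 'average implementation' band
```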

Keywords: ROI, organizational performance, efficacy of T&D program, employee performance

Procedia PDF Downloads 247
1490 A Moving Target: Causative Factors for Geographic Variation in a Handed Flower

Authors: Celeste De Kock, Bruce Anderson, Corneile Minnaar

Abstract:

Geographic variation in the floral morphology of a flower species has often been assumed to result from co-variation in the availability of regionally specific functional pollinator types, giving rise to plant ecotypes that are adapted to the morphology of the main pollinator types in that area. Wachendorfia paniculata is a geographically variable enantiostylous (handed) flower, with preliminary observations suggesting that differences in pollinator community composition might be driving differences in the degree of herkogamy (spatial separation of the stigma and anthers on the same flower) across its geographic range. This study aimed to determine whether pollinator-related variables such as visitation rate and pollinator type could explain differences in floral morphology seen in different populations. To assess pollinator community composition, pollinator visitation rates, the degree of herkogamy, and flower size, flowers from 13 populations were observed and measured across the Western Cape, South Africa. Multiple regression analyses indicated that pollinator-related variables had no significant effect on the degree of herkogamy between sites. However, the degree of herkogamy was strongly negatively associated with the time of measurement. It remains possible that pollinators have affected the development of herkogamy throughout the evolutionary timeline of different W. paniculata populations, but not necessarily to the fine-scale degree predicted for this study. Annual fluctuations in pollinator community composition, paired with recent disturbances such as urbanization and the overabundance of artificially introduced honeybee hives, might also cause the signal of pollinator adaptation to be lost. Surprisingly, differences in herkogamy between populations could largely be explained by the time of day at which flowers were measured, suggesting a significant narrowing of the distance between reproductive parts throughout the day. We propose that this floral movement could be an adaptation to ensure pollination when pollinator visitation to a flower was not sufficient earlier in the day, a possibility that will be explored in subsequent studies.

Keywords: enantiostyly, floral movement, geographic variation, ecotypes

Procedia PDF Downloads 273
1489 Optimization of Ultrasound Assisted Extraction and Characterization of Functional Properties of Dietary Fiber from Oat Cultivar S2000

Authors: Muhammad Suhail Ibrahim, Muhammad Nadeem, Waseem Khalid, Ammara Ainee, Taleeha Roheen, Sadaf Javaria, Aftab Ahmed, Hira Fatima, Mian Nadeem Riaz, Muhammad Zubair Khalid, Isam A. Mohamed Ahmed J, Moneera O. Aljobair

Abstract:

This study was executed to explore the efficacy of ultrasound-assisted extraction of dietary fiber from oat cultivar S2000. The extraction variables (time, temperature, and amplitude) were optimized using response surface methodology (RSM) with a Box-Behnken design (BBD). The effects of time, temperature, and amplitude were studied at three levels. It was observed that time and temperature exerted more impact on extraction efficiency than amplitude. The highest yields of total dietary fiber (TDF), soluble dietary fiber (SDF), and insoluble dietary fiber (IDF) fractions were observed under ultrasound processing for 20 min at 40 °C with 80% amplitude. Characterization of the extracted dietary fiber showed that it had better crystallinity, thermal properties, and a good fibrous structure. It also showed better functional properties compared to traditionally extracted dietary fiber. Furthermore, dietary fibers from oats may offer high-value utilization and expanded comprehensive use in functional food and nutraceutical development.
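
A minimal sketch of the three-factor Box-Behnken design underlying the RSM optimization, generated in coded (-1, 0, +1) levels; the actual ranges for time, temperature, and amplitude are those of the study and are not assumed here:

```python
from itertools import combinations

def box_behnken(n_factors=3, n_center=3):
    """Coded runs of a Box-Behnken design: all +/-1 pairs with remaining factors at 0."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * n_factors
                run[i], run[j] = a, b
                runs.append(run)
    runs += [[0] * n_factors] * n_center  # center-point replicates
    return runs

# Factors: time, temperature, amplitude -> 12 edge runs + 3 center points = 15 runs
for run in box_behnken():
    print(run)
```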

Keywords: extraction, ultrasonication, response surface methodology, Box-Behnken design

Procedia PDF Downloads 40
1488 Role of mHealth in Effective Response to Disaster

Authors: Mohammad H. Yarmohamadian, Reza Safdari, Nahid Tavakoli

Abstract:

In recent years, many countries have suffered various natural disasters. Disaster response continues to face challenges in the health care sector in all countries. Information and communication management is a significant challenge at a disaster scene. During the last decades, rapid advances in information technology have made it possible to manage information effectively and improve communication in health care settings. Information technology is a vital solution for effective response to disasters and emergencies: if an efficient ICT-based health information system is available, it will be highly valuable in such situations. In particular, mobile technology represents a computing infrastructure that is accessible, convenient, inexpensive, and easy to use. Most projects have not yet reached the deployment stage, but evaluation exercises show that mHealth should allow faster processing and transport of patients, improved accuracy of triage, and better monitoring of unattended patients at a disaster scene. Since there is a high prevalence of cell phones among the world population, health care providers and managers are expected to take measures to apply this technology to improve patient safety and public health in disasters. At present, there are challenges in the utilization of mHealth in disasters, such as structural and financial issues in our country. In this paper, we discuss the benefits and challenges of mHealth technology in disaster settings, considering connectivity, usability, intelligibility, communication, and teaching, for implementing this technology in disaster response.

Keywords: information technology, mHealth, disaster, effective response

Procedia PDF Downloads 434
1487 Balancing the Need for Closure: A Requirement for Effective Mood Development in Flow

Authors: Cristian Andrei Nica

Abstract:

The state of flow relies on cognitive elements that sustain openness to information processing in order to promote goal attainment. However, the need for closure may create mental constraints, which can impact affectivity levels. This study aims to observe the extent to which need for closure moderates the interaction between flow and affectivity, taking into account the mediating role of mood repair motivation in the interaction between need for closure and affectivity. Using a non-experimental, correlational design, n=73 participants (n=18 men and n=55 women), aged 19-64 years (M=28.02, SD=9.22), completed the Positive Affectivity-Negative Affectivity Schedule, the Need for Closure Scale-Revised, the mood repair items, and an adapted version of the Flow State Scale-2 in order to assess the trait aspects of flow. Results show that need for closure significantly moderates the flow-affectivity process, while the tolerance of ambiguity sub-scale is positively associated with negative affectivity and negatively with positive affectivity. At the same time, mood repair motivation significantly mediates the interaction between need for closure and positive affectivity, whereas the mediation process for negative affectivity is not significant. Need for closure should be considered when promoting the development of positive emotions, as the motivation to repair one’s mood mediates the interaction between need for closure and positive affectivity. According to this study, flow can trigger positive emotions when the person is willing to engage in mood regulation strategies and approach meaningful experiences with an open mind.

Keywords: flow, mood regulation, mood repair motivation, need for closure, negative affectivity, positive affectivity

Procedia PDF Downloads 119
1486 A Model for Teaching Arabic Grammar in Light of the Common European Framework of Reference for Languages

Authors: Erfan Abdeldaim Mohamed Ahmed Abdalla

Abstract:

The complexity of Arabic grammar poses challenges for learners, particularly in relation to its arrangement, classification, abundance, and bifurcation. This challenge results from the contextual factors that gave rise to the grammatical rules in question, as well as from the pedagogical approach employed at the time, which was tailored to the needs of learners during that particular historical period. Consequently, modern-day students encounter this same obstacle. This requires a thorough examination of the arrangement and categorization of Arabic grammatical rules based on particular criteria, as well as an assessment of their objectives. Additionally, it is necessary to identify the prevalent and renowned grammatical rules, as well as those that are infrequently encountered, obscure, and disregarded. This paper presents a compilation of grammatical rules that require arrangement and categorization in accordance with the standards outlined in the Common European Framework of Reference for Languages (CEFR). In addition to facilitating comprehension of the curriculum, accommodating learners' requirements, and establishing the fundamental competencies for achieving proficiency in Arabic, it is imperative to ascertain the conventions that language learners need in alignment with explicitly delineated benchmarks such as the CEFR criteria. The aim of this study is to reduce the quantity of grammatical rules typically presented to non-native Arabic speakers in Arabic textbooks. This reduction is expected to enhance learners' motivation to continue their Arabic language acquisition and to approach the proficiency level of native speakers. The proliferation and complexity of rules evident in Arabic textbooks designed for non-native speakers are noteworthy. The inadequate organisation and delivery of the material create the impression that grammar is being imparted to a student with the intention of memorising "Alfiyyat-Ibn-Malik." Consequently, the sequence of instruction in grammatical rules was altered, with rules originally intended for later instruction being presented first and those intended for earlier instruction being presented subsequently. Students often focus on learning grammatical rules that are not strictly required while neglecting the rules that are commonly used in everyday speech and writing. Non-Arab students are taught chapters of Arabic grammar that are infrequently used in Arabic literature and may be a topic of debate among grammarians. These findings are derived from the statistical analysis and investigations conducted by the researcher, which will be disclosed in due course of the research. To instruct non-Arabic speakers in grammatical rules, it is imperative to discern the most prevalent grammatical structures in grammar manuals and linguistic literature (the study sample). The present proposal suggests the allocation of grammatical structures across linguistic levels, taking into account the guidelines of the CEFR, as well as the grammatical structures necessary for non-Arabic-speaking learners to produce modern, cohesive, and comprehensible language.

Keywords: grammar, Arabic, functional, framework, problems, standards, statistical, popularity, analysis

Procedia PDF Downloads 81
1485 Predictive Maintenance of Industrial Shredders: Efficient Operation through Real-Time Monitoring Using Statistical Machine Learning

Authors: Federico Pittino, Thomas Arnold

Abstract:

The shredding of waste materials is a key step in the recycling process towards the circular economy. Industrial shredders for waste processing operate in very harsh conditions, leading to the need for frequent maintenance of critical components. Maintenance optimization is also particularly important for increasing the machine’s efficiency, thereby reducing operational costs. In this work, a monitoring system has been developed and deployed on an industrial shredder located at a waste recycling plant in Austria. The machine has been monitored for one year, and methods for predictive maintenance have been developed for two key components: the cutting knives and the drive belt. The large amount of collected data is leveraged by statistical machine learning techniques, which do not require very detailed knowledge of the machine or its live operating conditions. The results show that, despite the wide range of operating conditions, a reliable estimate of the optimal time for maintenance can be derived. Moreover, the trade-off between the cost of maintenance and the increase in power consumption due to the wear state of the monitored components is investigated. This work proves the benefits of a real-time monitoring system for the efficient operation of industrial shredders.
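
A hedged sketch of the statistical-machine-learning idea: regress component wear on routine sensor features and flag when a maintenance threshold is crossed. The feature names, model choice, and threshold below are illustrative assumptions, not the deployed system:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical hourly features: motor current, vibration RMS, material throughput
X = rng.normal(size=(1000, 3))
wear = 0.5 * X[:, 0] + 0.3 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=1000)

# Learn the wear indicator from historical data
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, wear)

# Score the latest 24-hour operating window and trigger maintenance above a threshold
latest = rng.normal(size=(24, 3))
predicted_wear = model.predict(latest).mean()
print("schedule knife/belt maintenance" if predicted_wear > 0.4 else "keep running")
```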

Keywords: predictive maintenance, circular economy, industrial shredder, cost optimization, statistical machine learning

Procedia PDF Downloads 118
1484 Developing Writing Skills of Learners with Persistent Literacy Difficulties through the Explicit Teaching of Grammar in Context: Action Research in a Welsh Secondary School

Authors: Jean Ware, Susan W. Jones

Abstract:

Background: The benefits of grammar instruction in the teaching of writing are contested in most English-speaking countries. A majority of Anglophone countries abandoned the teaching of grammar in the 1950s based on the conclusion that it had no positive impact on learners’ development of reading, writing, and language. Although the decontextualised teaching of grammar is not helpful in improving writing, a curriculum that embeds grammar in a meaningful way can help learners develop their understanding of the mechanisms of language. Although British learners are generally not taught grammar rules explicitly, learners in schools in France, the Netherlands, and Germany are taught explicitly about the structure of their own language. Exposing learners to grammatical analysis can help them develop their understanding of language: if learners are taught that each part of speech has an identified role in the sentence, then rather than having to memorise lists of words or spelling patterns, they can focus on determining each word or phrase’s task in the sentence. These processes of categorisation and deduction are higher-order thinking skills. Considering definitions of dyslexia available in Great Britain, the explicit teaching of grammar in context could help learners with persistent literacy difficulties. Indeed, learners with dyslexia often develop strengths in problem solving; the teaching of grammar could, therefore, help them develop their understanding of language by using analytical and logical thinking. Aims: This study aims at gaining a further understanding of how the explicit teaching of grammar in context can benefit learners with persistent literacy difficulties. The project is designed to identify ways of adapting existing grammar-focused teaching materials so that learners with specific learning difficulties such as dyslexia can use them to further develop their writing skills. It intends to improve educational practice through action, analysis, and reflection. Research Design/Methods: The project, therefore, uses an action research design and multiple sources of evidence. The data collection tools used were standardised test data, teacher assessment data, semi-structured interviews, learners’ before and after attempts at a writing task at the beginning and end of the cycle, documentary data, and lesson observation carried out by a specialist teacher. Existing teaching materials were adapted for use with five Year 9 learners who had experienced persistent literacy difficulties from primary school onwards. The initial adaptations included reducing the amount of content to be taught in each lesson and pre-teaching some of the metalanguage needed. Findings: Learners’ before and after attempts at the writing task were scored by a colleague who did not know the order of the attempts. All five learners’ scores were higher on the second writing task. Learners reported that they had enjoyed the teaching approach. They also made suggestions to be included in the second cycle, as did the colleague who carried out the observations. Conclusions: Although this is a very small exploratory study, these results suggest that adapting grammar-focused teaching materials shows promise for helping learners with persistent literacy difficulties develop their writing skills.

Keywords: explicit teaching of grammar in context, literacy acquisition, persistent literacy difficulties, writing skills

Procedia PDF Downloads 153
1483 Engineering of Reagentless Fluorescence Biosensors Based on Single-Chain Antibody Fragments

Authors: Christian Fercher, Jiaul Islam, Simon R. Corrie

Abstract:

Fluorescence-based immunodiagnostics are an emerging field in biosensor development and exhibit several advantages over traditional detection methods. While various affinity biosensors have been developed to generate a fluorescence signal upon sensing varying concentrations of analytes, reagentless, reversible, and continuous monitoring of complex biological samples remains challenging. Here, we aimed to genetically engineer biosensors based on single-chain antibody fragments (scFv) that are site-specifically labeled with environmentally sensitive fluorescent unnatural amino acids (UAA). A rational design approach resulted in quantifiable analyte-dependent changes in peak fluorescence emission wavelength and enabled antigen detection in vitro. Incorporation of a polarity indicator within the topological neighborhood of the antigen-binding interface generated a titratable wavelength blueshift with nanomolar detection limits. In order to ensure continuous analyte monitoring, scFv candidates with fast binding and dissociation kinetics were selected from a genetic library employing a high-throughput phage display and affinity screening approach. Initial rankings were further refined towards rapid dissociation kinetics using bio-layer interferometry (BLI) and surface plasmon resonance (SPR). The most promising candidates were expressed, purified to homogeneity, and tested for their potential to detect biomarkers in a continuous microfluidic-based assay. Variations of dissociation kinetics within an order of magnitude were achieved without compromising the specificity of the antibody fragments. This approach is generally applicable to numerous antibody/antigen combinations and currently awaits integration in a wide range of assay platforms for one-step protein quantification.
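
Dissociation rate constants of the kind ranked above by BLI and SPR are typically obtained by fitting a single-exponential decay to the dissociation-phase signal. A small sketch on synthetic data (the time base, amplitude, and rate are made-up values):

```python
import numpy as np
from scipy.optimize import curve_fit

def dissociation(t, r0, k_off):
    """Single-exponential decay of the biosensor response during the dissociation phase."""
    return r0 * np.exp(-k_off * t)

t = np.linspace(0, 300, 60)                                 # seconds
signal = dissociation(t, 1.0, 0.02) + np.random.normal(0, 0.01, t.size)

(r0, k_off), _ = curve_fit(dissociation, t, signal, p0=(1.0, 0.01))
print(f"k_off ≈ {k_off:.3e} s^-1, dissociation half-life ≈ {np.log(2) / k_off:.0f} s")
```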

Keywords: antibody engineering, biosensor, phage display, unnatural amino acids

Procedia PDF Downloads 139
1482 Stability of Novel Peptides (Linusorbs) in Flaxseed Meal Fortified Gluten-Free Bread

Authors: Youn Young Shim, Martin J. T. Reaney

Abstract:

Flaxseed meal is rich in water-soluble gums and, as such, can improve texture in gluten-free products. Flaxseed bioactive antioxidant peptides, linusorbs (LOs, a.k.a. cyclolinopeptides), are a class of molecules that may contribute health-promoting effects. The effects of dough preparation, baking, and storage on the stability of flaxseed-derived LOs in doughs and baked products are unknown. Gluten-free (GF) bread dough and bread were prepared with flaxseed meal, and the LO content was determined in the flaxseed meal, the bread flour containing the flaxseed meal, the bread dough, and the bread. The LO contents during storage (0, 1, 2, and 4 weeks) at different temperatures (−18 °C, 4 °C, and 22−23 °C) were determined by high-performance liquid chromatography-diode array detection (HPLC-DAD). The content of oxidized LOs such as [1–9-NαC],[1(Rs, Ss)-MetO]-linusorb B2 (LO14) remained substantially constant in flaxseed meal and in flour produced from flaxseed meal under all conditions for up to 4 weeks. However, during GF bread production LOs decreased. Due to microbial contamination, dough could not be stored at either 4 or 21 °C, and bread could only be stored for one week at 21 °C. Up to 4 weeks of storage was possible for bread and dough at −18 °C and for bread at 4 °C without loss of LOs. The LOs change mostly through processing and less so during storage. The concentrations of reduced LOs in flour and meal were much higher than those measured in dough and bread, and there was not a corresponding increase in oxidized LOs. The LOs in flaxseed meal-fortified bread were stable for products stored at low temperatures. This is the first study of the impact of baking conditions on LO content and quality.

Keywords: flaxseed, stability, gluten-free, antioxidant

Procedia PDF Downloads 83
1481 Sliding Mode Power System Stabilizer for Synchronous Generator Stability Improvement

Authors: J. Ritonja, R. Brezovnik, M. Petrun, B. Polajžer

Abstract:

Many modern synchronous generators in power systems are extremely weakly damped. The reasons are cost optimization in machine construction and the introduction of additional control equipment into power systems. Oscillations of synchronous generators and the related stability problems of power systems are harmful and can lead to failures in operation and to damage. The only practical solution to increase damping of the unwanted oscillations is the implementation of power system stabilizers. Power system stabilizers generate an additional control signal which changes the synchronous generator field excitation voltage. Modern power system stabilizers are integrated into the static excitation systems of synchronous generators. Available commercial power system stabilizers are based on linear control theory. Due to the nonlinear dynamics of the synchronous generator, current stabilizers do not assure optimal damping of the synchronous generator’s oscillations in the entire operating range. For that reason, the use of robust power system stabilizers suitable for the entire operating range is reasonable. There are numerous robust techniques applicable to power system stabilizers. In this paper, the use of sliding mode control for synchronous generator stability improvement is studied. On the basis of sliding mode theory, a robust power system stabilizer was developed. The main advantages of the sliding mode controller are simple realization of the control algorithm, robustness to parameter variations, and elimination of disturbances. The advantage of the proposed sliding mode controller over a conventional linear controller was tested for damping of the synchronous generator oscillations in the entire operating range. The obtained results show improved damping in the entire operating range of the synchronous generator and an increase in power system stability. The proposed study contributes to progress in the development of advanced stabilizers, which will replace conventional linear stabilizers and improve damping of synchronous generators.
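
A toy illustration of the sliding-mode idea behind such a stabilizer (a first-order sliding surface on the deviation error with a signum switching law), applied here to a generic lightly damped second-order oscillator rather than the authors' synchronous-generator model; all gains and parameters are assumed for the sketch:

```python
import numpy as np

# Generic lightly damped plant: x'' + 2*zeta*wn*x' + wn^2*x = u
wn, zeta, dt = 2 * np.pi, 0.02, 1e-3
c, k_sw = 5.0, 100.0                     # sliding-surface slope and switching gain

x, v = 1.0, 0.0                          # initial deviation and its derivative
for _ in range(int(5 / dt)):             # simulate 5 s with simple Euler steps
    s = c * x + v                        # sliding surface s = c*e + e_dot
    u = -k_sw * np.sign(s)               # switching control law drives s -> 0
    a = u - 2 * zeta * wn * v - wn**2 * x
    v += a * dt
    x += v * dt

print(f"residual deviation after 5 s: {abs(x):.4f}")
```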

Keywords: control theory, power system stabilizer, robust control, sliding mode control, stability, synchronous generator

Procedia PDF Downloads 217
1480 Natural Fibre Composite Structural Sections for Residential Stud Wall Applications

Authors: Mike R. Bambach

Abstract:

Increasing awareness of environmental concerns is driving a move towards more sustainable structural products for the built environment. Natural fibres such as flax, jute, and hemp have recently been considered for fibre-resin composites, with a major motivation for their implementation being their notable sustainability attributes. While recent decades have seen substantial interest in the use of such natural fibres in composite materials, much of this research has focused on the materials aspects, including fibre processing techniques, composite fabrication methodologies, matrix materials, and their effects on the mechanical properties. The present study experimentally investigates the compression strength of structural channel sections of flax, jute, and hemp, with a particular focus on their suitability for residential stud wall applications. The section geometry is optimised for maximum strength via the introduction of complex stiffeners in the webs and flanges. Experimental results on both natural fibre composite channel sections and typical steel and timber residential wall studs are compared. The geometrically optimised natural fibre composite channels are shown to have compression capacities suitable for residential wall stud applications, identifying them as a potentially viable alternative to traditional building materials in such applications, and potentially in other light structural applications.

Keywords: channel sections, natural fibre composites, residential stud walls, structural composites

Procedia PDF Downloads 309
1479 Preparation and Characterization of Iron/Titanium-Pillared Clays

Authors: Rezala Houria, Valverde Jose Luis, Romero Amaya, Molinari Alessandra, Maldotti Andrea

Abstract:

The escalation of oil prices in 1973 confronted the oil industry with the problem of how to maximize the processing of crude oil, especially the heavy fractions, to give gasoline components. Strong impetus was thus given to the development of catalysts with relatively large pore sizes, able to deal with larger molecules than the existing molecular sieves, and with good thermal and hydrothermal stability. The oil embargo of 1973 therefore acted as a stimulus for the investigation and development of pillared clays. Iron-doped titania-pillared montmorillonite clays were prepared using bentonite from deposits of Maghnia in western Algeria. The preparation method consists of different steps: purification of the raw bentonite, preparation of a pillaring agent solution, and exchange of the cations located between the clay layers with the previously formed iron/titanium solution. The characterization of this material was carried out by X-ray fluorescence spectrometry, X-ray diffraction, textural measurements by the BET method, inductively coupled plasma atomic emission spectroscopy, diffuse reflectance UV-visible spectroscopy, temperature-programmed desorption of ammonia, and atomic absorption. This new material was investigated as a photocatalyst for selective oxygenation of liquid alkylaromatics such as toluene, para-xylene, and ortho-xylene, and its photocatalytic properties were compared with those of titanium-pillared clays.

Keywords: iron doping, montmorillonite clays, pillared clays, oil industry

Procedia PDF Downloads 300
1478 A Verification Intellectual Property for Multi-Flow Rate Control on Any Single Flow Bus Functional Model

Authors: Pawamana Ramachandra, Jitesh Gupta, Saranga P. Pogula

Abstract:

In the verification of high-volume and complex packet-processing IPs, finer control of flow management aspects (for example, rate, bits/sec, etc.) per flow class (or a virtual channel or a software thread) is needed. When Software/Universal Verification Methodology (UVM) thread arbitration is left to the simulator (e.g., Verilog Compiler Simulator (VCS) or the Incisive Enterprise Simulator core simulation engine (NCSIM)), it is hard to predict the resulting distribution of bandwidth produced by the simulator's thread arbitration. In many cases, the patterns desired in a test scenario may not be accomplished, as the simulator might give a different distribution than what was required. This can lead to missing multiple traffic scenarios, specifically deadlock- and starvation-related ones. We invented a component (namely, the Flow Manager Verification IP) that intervenes between the application (test case) and the protocol VIP (with UVM sequencer) to control the bandwidth per thread/virtual channel/flow. The Flow Manager has knobs visible to the UVM sequence/test to configure the required distribution of rate per thread/virtual channel/flow. This works seamlessly and produces rate stimuli that further harness the Design Under Test (DUT) with inputs that are asymmetric compared to the programmed bandwidth/Quality of Service (QoS) distributions in the DUT.
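
The Flow Manager itself is a UVM component; the following Python-only sketch merely illustrates the underlying idea of deficit-based selection of which flow may send next so that each flow converges to its configured bandwidth share. Class, flow, and knob names are invented for illustration:

```python
import random

class FlowManager:
    """Pick the next flow to transmit so long-run bandwidth matches configured weights."""

    def __init__(self, weights):
        self.weights = dict(weights)          # flow id -> target bandwidth share
        self.sent = {f: 0 for f in weights}   # bytes granted so far per flow

    def next_flow(self, packet_size=64):
        total = sum(self.sent.values()) + packet_size
        # Grant the flow that is furthest below its target share (largest deficit)
        flow = min(self.weights, key=lambda f: self.sent[f] / total - self.weights[f])
        self.sent[flow] += packet_size
        return flow

fm = FlowManager({"vc0": 0.5, "vc1": 0.3, "vc2": 0.2})
for _ in range(10_000):
    fm.next_flow(packet_size=random.choice([64, 128, 256]))
print({f: round(b / sum(fm.sent.values()), 3) for f, b in fm.sent.items()})
```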

Keywords: flow manager, UVM sequencer, rated traffic generation, quality of service

Procedia PDF Downloads 92
1477 DNA Methylation Changes in Response to Ocean Acidification at the Time of Larval Metamorphosis in the Edible Oyster, Crassostrea hongkongensis

Authors: Yong-Kian Lim, Khan Cheung, Xin Dang, Steven Roberts, Xiaotong Wang, Vengatesen Thiyagarajan

Abstract:

The unprecedented rate of increase in CO₂ levels in the ocean and the subsequent changes in the carbonate system, including decreased pH, known as ocean acidification (OA), are predicted to disrupt not only the calcification process but also several other physiological and developmental processes in a variety of marine organisms, including edible oysters. Nonetheless, not all species are vulnerable to these OA threats; e.g., some species may be able to cope with OA stress using environmentally induced modifications of gene and protein expression. For example, external environmental stressors, including OA, can influence the addition and removal of methyl groups through an epigenetic modification (e.g., DNA methylation) process to turn gene expression “on or off” as part of a rapid adaptive mechanism to cope with OA. In this study, the above hypothesis was tested by examining the effect of OA, using decreased pH 7.4 as a proxy, on the DNA methylation pattern of an endemic and commercially important estuary oyster species, Crassostrea hongkongensis, at the time of larval habitat selection and metamorphosis. Larval growth rate did not differ between the control pH 8.1 and the treatment pH 7.4. The metamorphosis rate of the pediveliger larvae was higher at pH 7.4 than in the control at pH 8.1; however, over one-third of the larvae raised at pH 7.4 failed to attach to an optimal substrate as defined by biofilm presence. During larval development, a total of 130 genes were differentially methylated across the two treatments. The differential methylation in the larval genes may have partially accounted for the higher metamorphosis success rate at the decreased pH of 7.4, coupled with poor substratum selection ability. Differentially methylated loci were concentrated in exon regions and appear to be associated with cytoskeletal and signal transduction, oxidative stress, metabolic processes, and larval metamorphosis, which implies a high potential of C. hongkongensis larvae to acclimate and adapt through non-genetic means to OA threats within a single generation.

Keywords: adaptive plasticity, DNA methylation, larval metamorphosis, ocean acidification

Procedia PDF Downloads 135
1476 Camptothecin Promotes ROS-Mediated G2/M Phase Cell Cycle Arrest, Resulting from Autophagy-Mediated Cytoprotection

Authors: Rajapaksha Gedara Prasad Tharanga Jayasooriya, Matharage Gayani Dilshara, Yung Hyun Choi, Gi-Young Kim

Abstract:

Camptothecin (CPT) is a quinoline alkaloid that inhibits DNA topoisomerase I and induces cytotoxicity in a variety of cancer cell lines. We previously showed that CPT effectively inhibited invasion of prostate cancer cells and that combined treatment with subtoxic doses of CPT and TNF-related apoptosis-inducing ligand (TRAIL) potentially enhanced apoptosis in a caspase-dependent manner in hepatoma cancer cells. Here, we found that treatment with CPT caused an irreversible cell cycle arrest in the G2/M phase. CPT-induced cell cycle arrest was associated with a decrease in the protein level of cell division cycle 25C (Cdc25C) and increased levels of cyclin B and p21. The CPT-induced decrease in Cdc25C was blocked in the presence of the proteasome inhibitor MG132, which thus reversed the cell cycle arrest. In addition, the CPT-induced increase in phosphorylation of Cdc25C resulted from activation of checkpoint kinase 2 (Chk2), which was associated with phosphorylation of ataxia telangiectasia-mutated. Interestingly, the CPT-induced G2/M phase cell cycle arrest is reactive oxygen species (ROS) dependent, as the ROS inhibitors NAC and GSH reversed the CPT-induced cell cycle arrest. These results were further confirmed by using transient knockdown of nuclear factor-erythroid 2-related factor 2 (Nrf2), since it regulates the production of ROS. Our data reveal that treatment with siNrf2 increased the ROS level and further enhanced the CPT-induced G2/M phase cell cycle arrest. Our data also indicate that CPT enhanced cell cycle arrest through the extracellular signal-regulated kinase (ERK) and c-Jun N-terminal kinase (JNK) pathways. Inhibitors of ERK and JNK further decreased Cdc25C expression and the protein expression of p21 and cyclin B. These findings indicate that Chk2-mediated phosphorylation of Cdc25C plays a major role in G2/M arrest by CPT.

Keywords: camptothecin, cell cycle, checkpoint kinase 2, nuclear factor-erythroid 2-related factor 2, reactive oxygen species

Procedia PDF Downloads 429
1475 Process Optimization for Albanian Crude Oil Characterization

Authors: Xhaklina Cani, Ilirjan Malollari, Ismet Beqiraj, Lorina Lici

Abstract:

Oil characterization is an essential step in the design, simulation, and optimization of refining facilities. To achieve optimal crude selection and processing decisions, a refiner must have accurate information regarding crude oil quality. This includes the crude oil TBP curve as the main input for correct operation of refinery crude oil atmospheric distillation plants. Crude oil is typically characterized on the basis of a distillation assay. This procedure is reasonably well defined and is based on representing the mixture of actual components that boil within a boiling point interval by hypothetical components that boil at the average boiling temperature of the interval. The crude oil assay typically includes TBP distillation according to ASTM D-2892, which can characterize the part of the oil that boils up to a 400 °C atmospheric equivalent boiling point. To model the yield curves obtained by physical distillation, it is necessary to compare the differences between the model and the experimental data. Most commercial simulators use a different number of components and pseudo-components to represent crude oil. Laboratory tests include distillations, vapor pressures, flash points, pour points, cetane numbers, octane numbers, densities, and viscosities. The aim of the study is the construction of true boiling point curves for different crude oil resources in Albania and the comparison of the differences between the model and the experimental data for optimal characterization of the crude oil.
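
A small sketch of how a TBP curve from an ASTM D-2892 assay can be represented and interpolated to assign average boiling points to pseudo-components; the data points below are illustrative, not the Albanian crude assays:

```python
import numpy as np

# Illustrative TBP assay: cumulative volume % distilled vs. boiling temperature (°C)
vol_pct = np.array([0, 10, 20, 30, 40, 50, 60, 70])
tbp_c   = np.array([35, 95, 150, 205, 255, 300, 345, 390])

# Average boiling point of each 10 %-wide pseudo-component (mid-volume interpolation)
cuts = np.arange(5, 70, 10)                      # 5, 15, ..., 65 vol%
pseudo_tbp = np.interp(cuts, vol_pct, tbp_c)
for v, t in zip(cuts, pseudo_tbp):
    print(f"pseudo-component at {int(v):2d} vol%: Tb ≈ {t:.0f} °C")
```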

Keywords: TBP distillation curves, crude oil, optimization, simulation

Procedia PDF Downloads 299
1474 Implementation Association Rule Method in Determining the Layout of Qita Supermarket as a Strategy in the Competitive Retail Industry in Indonesia

Authors: Dwipa Rizki Utama, Hanief Ibrahim

Abstract:

The development of the retail industry in Indonesia is very fast, and various strategies have been undertaken to boost customer satisfaction and purchase productivity in order to increase profit; one of these is the layout strategy. The purpose of this study is to determine the layout of Qita supermarket, a retailer in Indonesia, in order to improve customer satisfaction and to maximize the rate of product sales as a whole, so that infrequently purchased products will also be purchased. This research uses a literature study method and one of the data mining methods, association rules, as applied in market basket analysis. Of 160 records, 100 remained after data pre-processing, from which the distribution of departments and the 26 departments corresponding to the previous layout were obtained. From those data, the association rule method allows customer behavior when purchasing items simultaneously to be studied, so that the layout of the supermarket can be determined based on customer behavior. Using the RapidMiner software with a minimum support of 25% and a minimum confidence of 30%, the results showed that the 14th department is purchased at the same time as department 10, the 21st department at the same time as department 13, the 15th department at the same time as department 12, the 14th department at the same time as department 12, and the 10th department at the same time as department 14. From those results, a better supermarket layout than the previous one can be arranged.
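
A minimal sketch of the support/confidence computation behind such department pairings; the toy transactions below are made up (the study mined its data in RapidMiner with 25% minimum support and 30% minimum confidence):

```python
from itertools import combinations
from collections import Counter

# Toy market baskets: each set holds the department IDs bought in one transaction
baskets = [{10, 14}, {10, 14, 21}, {12, 14, 15}, {13, 21}, {10, 14}, {12, 15}]
min_support, min_confidence = 0.25, 0.30

item_count = Counter(i for b in baskets for i in b)
pair_count = Counter(p for b in baskets for p in combinations(sorted(b), 2))

n = len(baskets)
for (a, b), cnt in pair_count.items():
    support = cnt / n
    for x, y in ((a, b), (b, a)):                 # candidate rule x -> y
        confidence = cnt / item_count[x]
        if support >= min_support and confidence >= min_confidence:
            print(f"dept {x} -> dept {y}: support {support:.2f}, confidence {confidence:.2f}")
```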

Keywords: industry retail, strategy, association rule, supermarket

Procedia PDF Downloads 183
1473 Design of a Photovoltaic Power Generation System Based on Artificial Intelligence and Internet of Things

Authors: Wei Hu, Wenguang Chen, Chong Dong

Abstract:

In order to improve the efficiency and safety of photovoltaic power generation devices, this photovoltaic power generation system combines Artificial Intelligence (AI) and the Internet of Things (IoT) to control sun-tracking photovoltaic generation devices, improving power generation efficiency, and then to manage the conversion of the generated energy. The system uses an artificial intelligence control terminal; the power generation device executive end uses the Linux system, with an Exynos4412 as the CPU. The power generating device collects sun image information through a Sony CCD. After several power generating devices feed the data back to the CPU for processing, the CPUs send the data to the artificial intelligence control terminal through the Internet. The control terminal integrates the executive terminal information, time information, and environmental information to decide whether to generate electricity normally and then whether to feed the converted electrical energy into the grid or store it in the battery pack. When the power generation environment is abnormal, the control terminal authorizes the protection strategy, the power generation device executive terminal stops power generation and enters a self-protection posture, and at the same time, the control terminal synchronizes the data with the cloud. Overall, the system is more intelligent, more adaptive, and has a longer life.

Keywords: photo-voltaic power generation, the pursuit of light, artificial intelligence, internet of things, photovoltaic array, power management

Procedia PDF Downloads 120
1472 A Transformer-Based Question Answering Framework for Software Contract Risk Assessment

Authors: Qisheng Hu, Jianglei Han, Yue Yang, My Hoa Ha

Abstract:

When a company is considering purchasing software for commercial use, contract risk assessment is critical to identify risks and mitigate potential adverse business impact, e.g., security, financial, and regulatory risks. Contract risk assessment requires reviewers with specialized knowledge and time to evaluate the legal documents manually. Specifically, validating contracts for a software vendor requires the following steps: manual screening, interpreting legal documents, and extracting risk-prone segments. To automate the process, we proposed a framework to assist legal contract risk identification, leveraging pre-trained deep learning models and natural language processing techniques. Given a set of pre-defined risk evaluation problems, our framework utilizes pre-trained transformer-based question-answering models to identify risk-prone sections in a contract. The question-answering model encodes the concatenated question-contract text and predicts the start and end positions for clause extraction. Due to the limited labelled dataset for training, we leveraged transfer learning by fine-tuning the models on the CUAD dataset. On a dataset comprising 287 contract documents and 2000 labelled samples, our best model achieved an F1 score of 0.687.
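
A hedged sketch of the extractive question-answering step using the Hugging Face transformers pipeline; the model name, question, and contract snippet are placeholders (the study fine-tuned its own models on CUAD):

```python
from transformers import pipeline

# A generic extractive QA model stands in here for the fine-tuned contract model
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

contract = (
    "The Supplier shall indemnify the Customer against all losses arising from "
    "any breach of data protection law. Liability is capped at the fees paid in "
    "the preceding twelve months."
)
question = "What is the limitation of liability?"

# The pipeline returns the answer span (start/end offsets) and a confidence score
answer = qa(question=question, context=contract)
print(answer["answer"], f"(span {answer['start']}-{answer['end']}, score {answer['score']:.2f})")
```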

Keywords: contract risk assessment, NLP, transfer learning, question answering

Procedia PDF Downloads 127
1471 Continuous Measurement of Spatial Exposure Based on Visual Perception in Three-Dimensional Space

Authors: Nanjiang Chen

Abstract:

In the backdrop of expanding urban landscapes, accurately assessing spatial openness is critical. Traditional visibility analysis methods grapple with discretization errors and inefficiencies, creating a gap in truly capturing the human experience of space. Addressing these gaps, this paper introduces a distinct continuous visibility algorithm, a leap in measuring urban spaces from a human-centric perspective. This study presents a methodological breakthrough by applying this algorithm to urban visibility analysis. Unlike conventional approaches, this technique allows for a continuous range of visibility assessment, closely mirroring human visual perception. By eliminating the need for predefined subdivisions in ray casting, it offers a more accurate and efficient tool for urban planners and architects. The proposed algorithm not only reduces computational errors but also demonstrates faster processing capabilities, validated through a case study in Beijing's urban setting. Its key distinction lies in its potential to benefit a broad spectrum of stakeholders, ranging from urban developers to public policymakers, aiding in the creation of urban spaces that prioritize visual openness and quality of life. This advancement in urban analysis methods could lead to more inclusive, comfortable, and well-integrated urban environments, enhancing the spatial experience for communities worldwide.

Keywords: visual openness, spatial continuity, ray-tracing algorithms, urban computation

Procedia PDF Downloads 38
1470 Perceptions of Teachers toward Inclusive Education Focus on Hearing Impairment

Authors: Chalise Kiran

Abstract:

The prime idea of inclusive education is to mainstream every child in education. However, implementation is challenging when there are gaps between policy and practice, and even more so when children have disabilities. Attention is generally given to the policy gap, but the problem does not always lie with policy: proper practice can be the real challenge in countries like Nepal. In shaping practice, teachers' perceptions toward inclusive education play a vital role. Nepal categorizes disability into seven types (physical, visual, hearing, vision/hearing, speech, mental, and multiple); of these, hearing impairment is the focus of this study. Given the limited research on children with disabilities and the scarcity of research on children with hearing impairment (CWHI) and their education in Nepal, this study is a pioneering effort to identify the problems and challenges of CWHI-focused inclusive education in schools, including the gaps and barriers to its proper implementation. Philosophically, the paradigm of the study is post-positivism. Within this post-positivist worldview, a quantitative approach is used, combining description of the situation with inferential analysis of relationships, in line with a natural model of objective reality. The data were collected through an individual survey of the teachers and head teachers of 35 schools in Nepal. The survey questionnaire was prepared and filled in by respondents from schools where CWHI study, across 20 districts in the seven provinces of Nepal. Through these considerations, perceptions of CWHI-focused inclusive education were explored. The data were analyzed using both descriptive and inferential tools: a Likert-scale-based analysis for the descriptive part, and the chi-square test to establish whether there were significant relationships between the dependent and independent variables. The descriptive analysis showed that the majority of teachers hold positive perceptions toward implementing CWHI-focused inclusive education and toward CWHI-focused inclusive education itself, although there are problems and challenges. The study identified the major challenges and problems categorically. Some of them are: a large number of students in a single class; availability of only generic textbooks for CWHI and no availability of textbooks to all students; little opportunity for teachers to acquire knowledge on CWHI; an inadequate number of teachers in the schools; no flexibility in the curriculum; a weak information system in schools; no availability of an educational counsellor; disaster-prone students; no child abuse control strategy; no disabled-friendly schools; no free health check-up facility; and no participation of the students in school activities and in child clubs. By and large, teachers' age, gender, years of experience, position, employment status, and own disability showed no statistically significant relationship with the successful implementation of CWHI-focused inclusive education or with perceptions of CWHI-focused inclusive education in schools. However, in some cases the null hypothesis was rejected, while in others it was fully retained. The study suggests policy implications, implications for educational authorities, and implications for teachers and parents, categorically.

Keywords: children with hearing impairment, disability, inclusive education, perception

Procedia PDF Downloads 110
1469 Brain-Computer Interfaces That Use Electroencephalography

Authors: Arda Ozkurt, Ozlem Bozkurt

Abstract:

Brain-computer interfaces (BCIs) are devices that output commands by interpreting data collected from the brain. Electroencephalography (EEG) is a non-invasive method of measuring the brain's electrical activity. Since it was invented by Hans Berger in 1929, it has led to many neurological discoveries and has become one of the essential non-invasive measuring methods. Although it has low spatial resolution (it can only detect when a group of neurons fires at the same time), it is a non-invasive method, making it easy to use without posing significant risks. In EEG, electrodes are placed on the scalp, and the voltage difference between a minimum of two electrodes is recorded, which is then used to accomplish the intended task. The recordings of EEGs include, but are not limited to, the currents along dendrites from synapses to the soma, the action potentials along the axons connecting neurons, and the currents through the synaptic clefts connecting axons with dendrites. However, as a non-invasive method, there are some sources of noise that may affect the reliability of EEG signals. For instance, noise from the EEG equipment, the leads, and signals coming from the subject, such as heart activity or muscle movements, affects the signals detected by the electrodes. However, new techniques have been developed to differentiate between those signals and the intended ones. Furthermore, an EEG device alone is not enough to analyze the data from the brain for use in BCI applications. Because the EEG signal is very complex, artificial intelligence algorithms are required to analyze it. These algorithms convert complex data into meaningful and useful information that neuroscientists can use to design BCI devices. Even though invasive BCIs are needed for neurological diseases which require highly precise data, non-invasive BCIs such as EEG-based systems are used in many cases to help disabled people or to ease people's lives by helping them with basic tasks. For example, EEG can be used to detect an impending seizure in epilepsy patients, and a BCI device can then help prevent the seizure. Overall, EEG is a commonly used non-invasive BCI technique that has helped develop BCIs and will continue to be used to collect data to ease people's lives as more BCI techniques are developed in the future.

Keywords: BCI, EEG, non-invasive, spatial resolution

Procedia PDF Downloads 69
1468 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes

Authors: Sky Chou, Joseph C. Chen

Abstract:

This paper focuses on using Six Sigma methodologies to improve the surface roughness of a part produced on a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum must be reduced to eliminate defects and to improve the process capability indices Cp and Cpk for a CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied in this study to improve the process, reduce defects, and ultimately reduce costs. The Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that led to the targeted surface roughness specified by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness; the noise factor is the difference between the old cutting tool and the new cutting tool. The confirmation run with the optimal parameters confirmed that the new parameter settings are correct and improved the process capability index. This study shows that the Taguchi-based Six Sigma approach can be efficiently used to phase out defects and improve the process capability index of the CNC milling process.
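
Since lower surface roughness is better, Taguchi analysis of such an L9 experiment typically uses the smaller-the-better signal-to-noise ratio, S/N = -10·log10(mean(y²)), computed per run across the noise replicates. A small sketch with illustrative roughness values, not the study's measurements:

```python
import numpy as np

# Hypothetical Ra measurements (µm) for each L9 run, two noise replicates per run
# (e.g., old cutting tool vs. new cutting tool)
ra = np.array([
    [1.8, 2.0], [1.5, 1.7], [1.2, 1.3],
    [1.6, 1.8], [1.1, 1.2], [0.9, 1.0],
    [1.4, 1.5], [1.0, 1.1], [0.8, 0.9],
])

# Smaller-the-better S/N ratio for each of the nine runs
sn = -10 * np.log10((ra ** 2).mean(axis=1))
best_run = int(np.argmax(sn)) + 1
print(np.round(sn, 2), f"-> run {best_run} gives the highest S/N (lowest roughness)")
```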

Keywords: CNC machining, six sigma, surface roughness, Taguchi methodology

Procedia PDF Downloads 239
1467 Assessment of Microbiological Feed Safety from Serbian Market from 2013 to 2017

Authors: Danijela Vuković, Radovan Čobanović, Milorad Plačkić

Abstract:

The expansion of the population drives increased consumption of animal meat, whose quality is directly affected by the quality of the feed the animals are given. The selection of raw materials, hygiene during the technological process, various hydrothermal treatments, methods of mixing, etc., all influence the quality of feed. Monitoring of feed is very important for obtaining information about feed quality and for the possible prevention of animal diseases, which can lead to various disease outbreaks in humans. In this study, parameters of feed safety were monitored. Accordingly, the goal of this study was to evaluate the microbiological safety of feed (feedstuffs and complete mixtures). The total number of analyzed samples was 4399. The analyzed feed samples were collected in various retail shops and feed factories during a period of 44 months (from January 2013 until September 2017). Samples were analyzed for Salmonella spp. and Clostridium perfringens in a quantity of 50 g according to Serbian regulations. All microorganisms were tested according to ISO methodology: Salmonella spp. ISO 6579:2002 and Clostridium perfringens ISO 7937:2004. Out of 4399 analyzed feed samples, 97.5% were satisfactory and 2.5% unsatisfactory concerning Salmonella spp. As far as Clostridium perfringens is concerned, 100% of the analyzed samples were satisfactory. The obtained results suggest that the technological processing of feed in Serbia is at a high level when it comes to safety and hygiene of the products, but there are still possibilities for progress and improvement, which can only be achieved through permanent monitoring of feed.

Keywords: microbiology, safety, hygiene, feed

Procedia PDF Downloads 296