Search results for: biologically inspired algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4215

2175 A Literature Review about Responsible Third Cycle Supervision

Authors: Johanna Lundqvist

Abstract:

Third cycle supervision is a multifaceted and complex task for supervisors in higher education. It progresses over several years and is affected by several proximal and distal factors. It can result in positive learning outcomes for doctoral students and high-quality publications. However, not all doctoral students thrive during their doctoral studies, nor do they all complete their studies. This is problematic both for the individuals themselves and for society at large: doctoral students are valuable and important in current research, future research and higher education. The aim of this literature review is to elucidate what responsible third cycle supervision can include and what it can look like in practice. The question posed is as follows: according to recent literature, what characterises responsible third cycle supervision in which doctoral students can thrive and develop their research knowledge and skills? A literature review was conducted, and the data gathered from the literature regarding responsible third cycle supervision were analysed by means of a thematic analysis. The analysis was inspired by the notion of responsible inclusion outlined by David Mitchell. In this study, the term literature refers to research articles and regulations. The preliminary results show that responsible third cycle supervision is associated with a number of interplaying factors (themes). These are as follows: committed supervisors and doctoral students; a clear vision and research problem; an individual study plan; adequate resources; interaction processes and constructive feedback; creativity; cultural awareness; respect and research ethics; systematic quality work and improvement efforts; focus on overall third cycle learning goals; and focus on research presentations and publications. Thus, responsible third cycle supervision can occur if these factors are realized in practice. This literature review is of relevance to evaluators, researchers, and management in higher education, as well as third cycle supervisors.

Keywords: doctoral student, higher education, third cycle supervisors, third cycle programmes

Procedia PDF Downloads 137
2174 The Targeting Logic of Terrorist Groups in the Sahel

Authors: Mathieu Bere

Abstract:

Al-Qaeda and Islamic State-affiliated groups such as Jama'at Nusrat al-Islam wal-Muslimin (JNIM) and the Islamic State-Greater Sahara faction, which is now part of the Boko Haram splinter group Islamic State in West Africa, were responsible, between 2018 and 2020, for at least 1,333 violent incidents against both military and civilian targets, including the assassination and kidnapping for ransom of Western citizens in Mali, Burkina Faso and Niger, the central Sahel. Protecting civilians from the terrorist violence that is now spreading from the Sahel to the coastal countries of West Africa has been very challenging, mainly because of the many unknowns that surround the perpetrators. To contribute to better protection of civilians in the region, this paper aims to shed light on the motivations and targeting logic of jihadist perpetrators of terrorist violence against civilians in the central Sahel region. To that end, it draws on relevant secondary data retrieved from datasets, the media, and the existing literature, as well as on primary data collected through interviews and surveys in Burkina Faso. An analysis of the data with the support of qualitative and statistical analysis software shows that military and rational strategic motives, more than purely ideological or religious motives, have been the main drivers of the terrorist violence that has strategically targeted government symbols and representatives as well as local leaders in the central Sahel. Behind this targeting logic, the jihadist grand strategy emerges: wiping out the Western-inspired legal, educational and governance system in order to replace it with an Islamic, sharia-based political, legal, and educational system.

Keywords: terrorism, jihadism, Sahel, targeting logic

Procedia PDF Downloads 87
2173 Sustainable Renovation of Cultural Buildings Case Study: Red Bay National Historic Site, Canada

Authors: Richard Briginshaw, Hana Alaojeli, Javaria Ahmad, Hamza Gaffar, Nourtan Murad

Abstract:

Sustainable renovations to cultural buildings and sites require a high level of competency in the sometimes conflicting areas of social/historical demands, environmental concerns, and the programmatic and technical requirements of the project. A detailed analysis of the existing site, building and client program are critical to reveal both challenges and opportunities. This forms the starting point for the design process – empirical explorations that search for a balanced and inspired architectural solution to the project. The Red Bay National Historic Site on the Labrador Coast of eastern Canada is a challenging project to explore and resolve these ideas. Originally the site of a 16ᵗʰ century whaling station occupied by Basque sailors from France and Spain, visitors now experience this history at the interpretive center, along with the unique geography, climate, local culture and vernacular architecture of the area. Working with our client, Parks Canada, the project called for significant alterations and expansion to the existing facility due to an increase in the number of annual visitors. Sustainable aspects of the design are focused on sensitive site development, passive energy strategies such as building orientation and building envelope efficiency, active renewable energy systems, carefully considered material selections, water efficiency, and interiors that respond to human comfort and a unique visitor experience.

Keywords: sustainability, renovations and expansion, cultural project, architectural design, green building

Procedia PDF Downloads 168
2172 Digital Athena – Contemporary Commentaries and Greek Mythology Explored through 3D Printing

Authors: Rose Lastovicka, Bernard Guy, Diana Burton

Abstract:

Greek myth and art acted as tools to think with, and as a lens through which to explore complex topics, functioning much as a form of social media does today. In particular, coins were a form of propaganda communicating the wealth and power of the city-states they originated from as they circulated from person to person. From this starting point, how can 3D printing technologies be applied to infuse ancient forms with contemporary commentaries and so promote discussion? The digital reconstruction of artifacts is a topic that has been researched by various groups all over the globe. Yet the exploration of Greek myth through artifacts infused with contemporary issues remains unexamined in this medium. Using the Stratasys J750 3D printer, a multi-material, full-colour 3D printer, a series of coins inspired by ancient Greek currency and myth was created to present commentaries on the adversities surrounding individuals in the LGBT+ community. Using the J750 as the medium of expression allows complete control and precision over the models, creating complex, high-resolution iconography. The coins are printed in a hard, translucent material with coloured 3D visuals embedded into the coin, to be viewed at close range by the audience. These coins as commentaries present an avenue for wider understanding by drawing perspectives not only from sources concerned with the contemporary LGBT+ community but also from sources exploring ancient homosexuality and its perception and regulation in antiquity. By displaying what are usually points of contention between anti- and pro-LGBT+ parties, this visual medium opens up a discussion to both parties, suggesting that heritage can play a vital interpretative role in the contemporary world.

Keywords: 3D printing, design, Greek mythology, LGBT+ community

Procedia PDF Downloads 116
2171 A Study on the Different Components of a Typical Back-Scattered Chipless RFID Tag Reflection

Authors: Fatemeh Babaeian, Nemai Chandra Karmakar

Abstract:

Chipless RFID is a wireless system for tracking and identification which uses passive tags for encoding data. The advantage of using a chipless RFID tag is having a planar tag which is printable on different low-cost materials like paper and plastic. The printed tag can be attached to different items at the labelling level. Since the price of a chipless RFID tag can be as low as a fraction of a cent, this technology has the potential to compete with conventional optical barcode labels. However, due to the passive structure of the tag, data processing of the reflection signal is a crucial challenge. The captured reflected signal from a tag attached to an item consists of different components: the reflection from the reader antenna, the reflection from the item, the structural mode RCS component of the tag and the antenna mode RCS of the tag. All these components are summed up in both the time and frequency domains. The reflection from the item and the structural mode RCS component can distort or saturate the frequency domain signal and cause difficulties in extracting the desired component, which is the antenna mode RCS. Therefore, it is necessary to study the reflection of the tag in both the time and frequency domains to better understand the nature of the captured chipless RFID signal. Further benefits of this study are finding an optimised encoding technique at the tag design level and finding the best algorithm for processing the chipless RFID signal at the decoding level. In this paper, the reflection from a typical backscattered chipless RFID tag with six resonances is analysed, and the different components of the signal are separated in both the time and frequency domains. Moreover, the time domain signal corresponding to each resonator of the tag is studied. The data for this processing were captured from simulation in CST Microwave Studio 2017. The outcome of this study is an understanding of the different components of a measured signal in a chipless RFID system and the discovery of a research gap: the need for an optimum detection algorithm for tag ID extraction.
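
Time gating is one common way to suppress the early structural-mode return so that the later antenna-mode ringing can be inspected on its own. The sketch below is purely illustrative: it uses a synthetic one-dimensional response with invented timing and amplitudes, not the CST-simulated signals analysed in the paper.

```python
import math

def synth_response(n=512, dt=0.05):
    """Synthetic backscatter: an early broadband spike (structural mode)
    plus slowly decaying sinusoidal ringing (antenna mode resonance)."""
    sig = []
    for k in range(n):
        t = k * dt
        structural = math.exp(-((t - 0.5) / 0.1) ** 2)  # early, non-resonant spike
        antenna = (0.4 * math.exp(-0.2 * t) * math.sin(2 * math.pi * 1.3 * t)
                   if t > 1.0 else 0.0)                 # late resonant ringing
        sig.append(structural + antenna)
    return sig

def time_gate(sig, dt, t_start):
    """Zero out samples before t_start to suppress the structural-mode return."""
    return [s if k * dt >= t_start else 0.0 for k, s in enumerate(sig)]

sig = synth_response()
gated = time_gate(sig, 0.05, 1.0)
# the early structural-mode energy is removed; the antenna-mode ringing survives
early_energy = sum(abs(s) for s in gated[:20])
late_energy = sum(abs(s) for s in gated[40:200])
```

A Fourier transform of the gated signal would then expose the resonant dips/peaks used for tag ID extraction; the gate position and the signal model here are assumptions for illustration only.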

Keywords: antenna mode RCS, chipless RFID tag, resonance, structural mode RCS

Procedia PDF Downloads 200
2170 A Radiomics Approach to Predict the Evolution of Prostate Imaging Reporting and Data System Score 3/5 Prostate Areas in Multiparametric Magnetic Resonance

Authors: Natascha C. D'Amico, Enzo Grossi, Giovanni Valbusa, Ala Malasevschi, Gianpiero Cardone, Sergio Papa

Abstract:

Purpose: To characterize, through a radiomic approach, the nature of areas classified PI-RADS (Prostate Imaging Reporting and Data System) 3/5, recognized in multiparametric prostate magnetic resonance with T2-weighted (T2w), diffusion and perfusion sequences with paramagnetic contrast. Methods and Materials: 24 cases undergoing multiparametric prostate MR and biopsy were admitted to this pilot study. The clinical outcome of the PI-RADS 3/5 areas was established through biopsy, which found 8 malignant tumours. The analysed images were acquired with a Philips Achieva 1.5T scanner with a CE-T2-weighted sequence in the axial plane. Semi-automatic tumour segmentation was carried out on the MR images using the 3D Slicer image analysis software. 45 shape-based, intensity-based and texture-based features were extracted and represented the input for preprocessing. An evolutionary algorithm (a TWIST system based on the KNN algorithm) was used to subdivide the dataset into training and testing sets and to select the features yielding the maximal amount of information. After this pre-processing, 20 input variables were selected and different machine learning systems were used to develop a predictive model based on a training/testing crossover procedure. Results: The best machine learning system (a three-layer feed-forward neural network) obtained a global accuracy of 90% (80% sensitivity and 100% specificity) with an area under the ROC curve of 0.82. Conclusion: Machine learning systems coupled with radiomics show promising potential in distinguishing benign from malignant tumours in PI-RADS 3/5 areas.
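
The TWIST preprocessing described above is KNN-based. As a purely illustrative sketch (not the authors' TWIST implementation, and with an invented toy dataset), the underlying idea of selecting informative features can be written as greedy forward selection driven by leave-one-out 1-NN accuracy:

```python
def loo_knn_acc(X, y, feats):
    """Leave-one-out accuracy of a 1-NN classifier restricted to `feats`."""
    correct = 0
    for i in range(len(X)):
        best, best_d = None, float("inf")
        for j in range(len(X)):
            if i == j:
                continue
            d = sum((X[i][f] - X[j][f]) ** 2 for f in feats)
            if d < best_d:
                best, best_d = y[j], d
        correct += (best == y[i])
    return correct / len(X)

def greedy_select(X, y, n_keep):
    """Forward-select the features that most improve LOO 1-NN accuracy."""
    chosen = []
    for _ in range(n_keep):
        scored = [(loo_knn_acc(X, y, chosen + [f]), f)
                  for f in range(len(X[0])) if f not in chosen]
        _, f = max(scored)
        chosen.append(f)
    return chosen

# toy radiomic features: feature 0 separates the classes, feature 1 is noise
X = [[0.1, 5.0], [0.2, 1.0], [0.15, 3.0], [0.9, 4.0], [1.0, 0.5], [0.85, 2.5]]
y = [0, 0, 0, 1, 1, 1]
selected = greedy_select(X, y, 1)  # keeps the discriminative feature
```

A real radiomics pipeline would of course work on the 45 extracted features and use a proper train/test split, as the abstract describes.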

Keywords: machine learning, MR prostate, PI-Rads 3, radiomics

Procedia PDF Downloads 188
2169 University Under Terror: Explaining the Deviant Activities in the Tribhuvan University

Authors: Tek Nath Subedi

Abstract:

Violent activities that previously erupted in specific societies owing to unequal, oppressive and unjust social practices have crossed out of their usual places of occurrence and entered peace-zones such as schools and universities. The recent history of universities in Nepal is marked by violent activities of students targeting professors and other university officials. A student in a university is supposed to engage in scholarly activities and cooperate with teachers so as to achieve an academic degree and enhance knowledge. Besides, the relationship between a teacher and a student is supposed to be most sacred; it is explained as the relationship between deity and devotee on spiritual grounds, as carer and child on moral grounds, and as mutually dependent in a functionalist view. Above all, the interdependence between teacher and student is illustrated as two sides of the same coin. But the targeted attacks on teachers and the degrading relationship between the university and students raise serious questions about what makes a student violent and why academia is terrorized. Therefore, this research aims to find the nexus of students' deviant activities to other social, political, economic and cultural areas of their lives. To answer a few analytical questions, this paper draws on in-depth interviews based on snowball sampling, inquiring of some professors and officials how and why they are terrorized. Following the same research method, it also reaches some students who have witnessed or been part of such terror and asks them why violence has become recurrent in the university and how the university can be converted into a peace-zone. This research hypothesized that the terror-inspired activities carried out by students in the university are an extended version of their deviant activities outside the university premises. However, it was found that the deviant activities of students were connected in particular to their culture and political identity.

Keywords: deviance, relationship, student, teacher, terror, university

Procedia PDF Downloads 65
2168 Logical-Probabilistic Modeling of the Reliability of Complex Systems

Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia

Abstract:

The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, on the basis of which a web application for structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research, and evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of the structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, and it formed the basis of our application. The reliability assessment process included the following stages, which were reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in the form of a disjunctive normal form (DNF); 4) transformation of the DNF into an orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of the logical elements in the ODNF with probabilistic elements, yielding a reliability estimation polynomial and a quantitative reliability value; 6) calculation of the “weights” of the elements of the system. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems, research, and the design of systems with optimal structure are carried out.
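
Stages 2 through 5 above can be illustrated on a toy system. The sketch below is an assumption-laden simplification, not the web application's algorithm: it computes the reliability of a monotone system directly from its minimal path sets via inclusion-exclusion, which is one route to the same polynomial that orthogonalization (ODNF) produces.

```python
from itertools import combinations

def system_reliability(min_paths, p):
    """Reliability of a monotone system from its minimal path sets.
    min_paths: list of sets of element ids (shortest paths of successful
    functioning); p: dict mapping element id -> reliability.
    Uses inclusion-exclusion over the path events."""
    total = 0.0
    for r in range(1, len(min_paths) + 1):
        for combo in combinations(min_paths, r):
            union = set().union(*combo)     # elements that must all work
            term = 1.0
            for e in union:
                term *= p[e]
            total += (-1) ** (r + 1) * term
    return total

# two redundant branches in parallel, each a series pair of elements
paths = [{1, 2}, {3, 4}]
p = {1: 0.9, 2: 0.9, 3: 0.9, 4: 0.9}
R = system_reliability(paths, p)  # 0.81 + 0.81 - 0.81 * 0.81 = 0.9639
```

Inclusion-exclusion is exponential in the number of paths, which is exactly why the paper's orthogonalization and computer support matter for large systems.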

Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, “weights” of elements

Procedia PDF Downloads 66
2167 Replicating Brain’s Resting State Functional Connectivity Network Using a Multi-Factor Hub-Based Model

Authors: B. L. Ho, L. Shi, D. F. Wang, V. C. T. Mok

Abstract:

The brain’s functional connectivity, while temporally non-stationary, does express consistency at a macro spatial level. The study of stable resting state connectivity patterns hence provides opportunities for the identification of diseases if such stability is severely perturbed. A mathematical model replicating the brain’s spatial connections is useful for understanding the brain’s representative geometry and complements the empirical model where it falls short. Empirical computations tend to involve large matrices and become infeasible with fine parcellation; the proposed analytical model has no such computational problems. To improve replicability, data from 92 subjects were obtained from two open sources. The proposed methodology, inspired by financial theory, uses multivariate regression to find the relationship of every cortical region of interest (ROI) with a set of pre-identified hubs. These hubs act as representatives for the entire cortical surface. A variance-covariance framework of all ROIs is then built on these relationships to link up all the ROIs. The result is a high level of match between model and empirical correlations, in the range of 0.59 to 0.66 after adjusting for sample size, an increase of almost forty percent. More significantly, the model framework provides an intuitive way to delineate between systemic drivers and idiosyncratic noise while reducing dimensions by more than 30-fold, hence providing a way to conduct attribution analysis. Due to its analytical nature and simple structure, the model is useful as a standalone toolkit for network dependency analysis or as a module for other mathematical models.
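
The hub-regression idea can be sketched with a single hub and two ROIs. This toy example (deterministic invented data, ordinary least squares, one factor rather than the authors' multi-factor model) shows how estimated hub loadings imply a model covariance between ROIs, separating the systemic component from idiosyncratic noise:

```python
def ols_beta(x, y):
    """OLS slope of y on x: beta = cov(x, y) / var(x)."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# one "hub" time series and two ROIs driven by it (deterministic toy data)
hub = [0.0, 1.0, -1.0, 2.0, -2.0, 0.5, -0.5, 1.5]
roi_a = [2.0 * h + 0.1 for h in hub]   # systemic loading +2.0
roi_b = [-1.0 * h for h in hub]        # systemic loading -1.0

beta_a = ols_beta(hub, roi_a)
beta_b = ols_beta(hub, roi_b)
var_hub = sum((h - sum(hub) / len(hub)) ** 2 for h in hub) / len(hub)

# covariance between the two ROIs implied by their shared hub exposure;
# anything left over in the empirical covariance would be idiosyncratic
model_cov = beta_a * beta_b * var_hub
```

With several hubs this becomes a multivariate regression per ROI and a loadings-matrix factorization of the variance-covariance framework, which is where the 30-fold dimension reduction comes from.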

Keywords: functional magnetic resonance imaging, multivariate regression, network hubs, resting state functional connectivity

Procedia PDF Downloads 151
2166 Determination of Optimum Parameters for Thermal Stress Distribution in Composite Plate Containing a Triangular Cutout by Optimization Method

Authors: Mohammad Hossein Bayati Chaleshtari, Hadi Khoramishad

Abstract:

Minimizing the stress concentration around a triangular cutout in infinite perforated plates subjected to a uniform heat flux, which induces thermal stresses, is an important consideration in engineering design. Furthermore, understanding the parameters that affect stress concentration, and selecting them properly, enables the designer to achieve a reliable design. In thermal stress analysis, the parameters that affect the stress distribution around a cutout in orthotropic materials include the fiber angle, the flux angle, and the bluntness and rotation angle of the cutout. This paper examines the effect of these parameters on the thermal stress analysis of infinite perforated plates with a central triangular cutout. The least amount of thermal stress around the triangular cutout is sought using a novel swarm intelligence optimization technique called the dragonfly optimizer, inspired by the lifestyle and hunting behavior of dragonflies in nature. In this study, using two-dimensional thermoelastic theory and based on Lekhnitskii's complex variable technique, the stress analysis of an orthotropic infinite plate with a circular cutout under a uniform heat flux was extended to a plate containing a quasi-triangular cutout in the thermal steady state condition. To achieve this goal, a conformal mapping function was used to map the infinite plate containing a quasi-triangular cutout onto the outside of a unit circle. The plate is under uniform heat flux at infinity; Neumann boundary conditions and a thermally insulated condition at the edge of the cutout were considered.
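
As a heavily simplified illustration of the optimization step only: the sketch below runs a plain stochastic search (not the dragonfly algorithm) over a made-up surrogate objective in the fiber-angle/cutout-rotation plane. Both the objective and the bounds are invented; the paper's actual objective comes from the thermoelastic solution.

```python
import math
import random

random.seed(0)  # reproducible search

def surrogate_scf(fiber_deg, rot_deg):
    """Toy stand-in for a stress concentration factor as a function of
    fiber angle and cutout rotation (NOT the paper's thermoelastic model).
    Global minimum value is 2.0."""
    return (3.0
            + math.cos(math.radians(2 * (fiber_deg - 30)))
            + 0.5 * math.sin(math.radians(3 * rot_deg)) ** 2)

def random_search(obj, bounds, iters=2000):
    """Minimal population-free stochastic minimization over box bounds."""
    best_x, best_f = None, float("inf")
    for _ in range(iters):
        x = [random.uniform(lo, hi) for lo, hi in bounds]
        f = obj(*x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

best_angles, best_scf = random_search(surrogate_scf, [(0, 180), (0, 120)])
```

A swarm method such as the dragonfly optimizer replaces the independent random draws with a population whose members attract, align and separate, typically converging far faster on the same kind of objective.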

Keywords: infinite perforated plate, complex variable method, thermal stress, optimization method

Procedia PDF Downloads 147
2165 Category-Base Theory of the Optimum Signal Approximation Clarifying the Importance of Parallel Worlds in the Recognition of Human and Application to Secure Signal Communication with Feedback

Authors: Takuro Kida, Yuichi Kida

Abstract:

We mathematically present the basis of a new class of algorithms that treats a historical cause of continuous discrimination in the world, as well as its solution, by introducing the new concept of a parallel world that includes an invisible set of errors as its companion. With respect to a matrix operator filter bank in which the matrix operator analysis filter bank H and the matrix operator sampling filter bank S are given, we first introduce a detailed algorithm to derive the optimum matrix operator synthesis filter bank Z that simultaneously minimizes all the worst-case measures of the matrix operator error signals E(ω) = F(ω) − Y(ω) between the matrix operator input signals F(ω) and the matrix operator output signals Y(ω) of the filter bank. Further, feedback is introduced into the above approximation theory, and it is shown that introducing conversations with feedback is not automatically superior to the accumulation of existing knowledge of signal prediction. Secondly, the concept of a category from the field of mathematics is applied to the above optimum signal approximation, and it is indicated that the category-based approximation theory applies to a set-theoretic consideration of human recognition. Based on this discussion, it is shown why the narrow perception that tends to create isolation shows an apparent advantage in the short term and why such narrow thinking often becomes intimate with discriminatory action in a human group. Throughout these considerations, it is argued that, in order to abolish easy and intimate discriminatory behavior, it is important to create a parallel world of conception in which we share the set of invisible error signals, including the words and the consciousness of both worlds.
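
In the abstract's notation, the worst-case optimization can be read as follows. This is our interpretation of the symbols H, S, Z, F, Y and E as a relative-error minimax problem, not necessarily the authors' exact formulation:

```latex
\begin{aligned}
Y(\omega) &= Z(\omega)\,S(\omega)\,H(\omega)\,F(\omega),\\
E(\omega) &= F(\omega) - Y(\omega),\\
Z^{*} &= \arg\min_{Z}\ \sup_{F \neq 0}\,
  \frac{\lVert F(\omega) - Z(\omega)S(\omega)H(\omega)F(\omega)\rVert}
       {\lVert F(\omega)\rVert}.
\end{aligned}
```

That is, the synthesis bank Z is chosen so that the worst-case error over all admissible inputs F is as small as possible.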

Keywords: signal prediction, pseudo inverse matrix, artificial intelligence, conditional optimization

Procedia PDF Downloads 156
2164 Novel Bioinspired Design to Capture Smoky CO2 by Reactive Absorption with Aqueous Scrubber

Authors: J. E. O. Hernandez

Abstract:

In the next 20 years, energy production by burning fuels will increase, and so will the atmospheric concentration of CO2 and its well-known threats to life on Earth. The technologies available for capturing CO2 are still dubious, and this keeps fostering interest in bio-inspired approaches. The leading one is the application of carbonic anhydrase (CA), a superfast biocatalyst able to convert up to one million molecules of CO2 into carbonates in water. However, natural CA underperforms when applied to real smoky CO2 in chimneys and, so far, the efforts to create superior CAs in the lab rely on screening methods running under pristine conditions at the micro level, which are far from resembling those in chimneys. For the evolution of man-made enzymes, selection rather than screening would be ideal, but this is challenging because of the need for a suitable artificial environment that is also sustainable for our society. Herein we present the stepwise design and construction of a bioprocess (from bench scale to semi-pilot) for evolutionary selection experiments. In this bioprocess, reaction and absorption took place simultaneously at atmospheric pressure in a spray tower. The scrubbing solution was fed countercurrently by reusing municipal pressure, and it was mainly prepared with water, carbonic anhydrase and calcium chloride. This bioprocess allowed for the enzymatic carbonation of smoky CO2, the reuse of process water, and the recovery of solid carbonates without cooling of the smoke, pretreatments, amine solvents or compression of CO2. The average yield of solid carbonates was 0.54 g min⁻¹, or 12-fold the amount produced in serum bottles at lab bench scale. This bioprocess could be used as a tailor-made environment for driving the selection of superior CAs. The bioprocess and its matched CA could be used sustainably to reduce global warming caused by CO2 emissions from exhausts.

Keywords: biological carbon capture and sequestration, carbonic anhydrase, directed evolution, global warming

Procedia PDF Downloads 193
2163 Low Energy Technology for Leachate Valorisation

Authors: Jesús M. Martín, Francisco Corona, Dolores Hidalgo

Abstract:

Landfills present long-term threats to soil, air, groundwater and surface water due to the formation of greenhouse gases (methane and carbon dioxide) and of leachate from decomposing garbage. The composition of leachate differs from site to site and also within a landfill, and it alters with time (over weeks to years) since the landfilled waste remains biologically highly active. Mainly, the composition of the leachate depends on factors such as the characteristics of the waste, the moisture content, climatic conditions, the degree of compaction and the age of the landfill. Therefore, leachate composition cannot be generalized, and the traditional treatment models should be adapted in each case. Although leachate composition is highly variable, what different leachates have in common are hazardous constituents and their potential eco-toxicological effects on human health and on terrestrial ecosystems. Since leachates have distinct compositions, each landfill or dumping site represents a different type of risk to its environment. Nevertheless, leachates always exhibit high organic concentration, high conductivity, heavy metals and ammonia nitrogen. Leachate can affect the current and future quality of water bodies due to uncontrolled infiltrations. Therefore, the control and treatment of leachate is one of the biggest issues in the design and management of urban solid waste treatment plants and landfills. This work presents a treatment model that will be carried out "in-situ" using a cost-effective novel technology that combines solar evaporation/condensation with forward osmosis. The plant is powered by renewable energies (solar energy, biomass and residual heat), which minimizes the carbon footprint of the process. The final effluent quality is very high, allowing reuse (preferred) or discharge into watercourses. In the particular case of this work, the final effluents will be reused for cleaning and gardening purposes. A minor semi-solid residual stream is also generated in the process. Due to its special composition (rich in metals and inorganic elements), this stream will be valorized in ceramic industries to improve the characteristics of the final products.

Keywords: forward osmosis, landfills, leachate valorization, solar evaporation

Procedia PDF Downloads 202
2162 An Analysis on Clustering Based Gene Selection and Classification for Gene Expression Data

Authors: K. Sathishkumar, V. Thiagarasu

Abstract:

Due to recent advances in DNA microarray technology, it is now feasible to obtain gene expression profiles of tissue samples at relatively low cost. Many scientists around the world take advantage of this gene profiling to characterize complex biological circumstances and diseases. Microarray techniques used in genome-wide gene expression and genome mutation analysis help scientists and physicians to understand pathophysiological mechanisms, to make diagnoses and prognoses, and to choose treatment plans. DNA microarray technology has now made it possible to simultaneously monitor the expression levels of thousands of genes during important biological processes and across collections of related samples. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenge of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. A first step toward addressing this challenge is the use of clustering techniques, which is essential in the data mining process to reveal natural structures and identify interesting patterns in the underlying data. This work presents an analysis of several clustering algorithms proposed to deal with gene expression data effectively. Existing approaches such as the Support Vector Machine (SVM), the K-means algorithm and evolutionary algorithms are analyzed thoroughly to identify their advantages and limitations. A performance evaluation of the existing algorithms is carried out to determine the best approach. In order to improve the classification performance of the best approach in terms of accuracy, convergence behavior and processing time, a hybrid clustering-based optimization approach is proposed.
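
As a reference point for the clustering step, Lloyd's k-means, one of the algorithms analysed, can be sketched in plain Python on toy expression profiles (the data and deterministic initialisation below are invented for illustration):

```python
def kmeans(points, k, iters=20):
    """Plain Lloyd's k-means on expression profiles (lists of floats),
    with deterministic initialisation from the first k points."""
    centroids = [list(p) for p in points[:k]]
    for _ in range(iters):
        # assignment step: each profile joins its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # update step: centroids move to their cluster means
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = [sum(vals) / len(cl) for vals in zip(*cl)]
    labels = []
    for p in points:
        d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
        labels.append(d.index(min(d)))
    return labels, centroids

# toy expression profiles: two genes up-regulated, two down-regulated
profiles = [[2.1, 2.0], [1.9, 2.2], [-2.0, -1.8], [-2.2, -2.1]]
labels, _ = kmeans(profiles, 2)
```

Hybrid approaches of the kind the abstract proposes typically wrap such a base clusterer in an optimization loop that tunes the initialisation or the feature subset.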

Keywords: microarray technology, gene expression data, clustering, gene selection

Procedia PDF Downloads 323
2161 An Exponential Field Path Planning Method for Mobile Robots Integrated with Visual Perception

Authors: Magdy Roman, Mostafa Shoeib, Mostafa Rostom

Abstract:

Global vision, whether provided by overhead fixed cameras, on-board aerial vehicle cameras, or satellite images, can provide detailed information on the environment around mobile robots. In this paper, an intelligent vision-based method of path planning and obstacle avoidance for mobile robots is presented. The method integrates visual perception with a newly proposed field-based path-planning method to overcome common path-planning problems such as local minima, unreachable destinations and unnecessarily lengthy paths around obstacles. The method proposes an exponential angle deviation field around each obstacle that affects the orientation of a nearby robot. As the robot heads toward the goal point, obstacles are classified into right and left groups, and a deviation angle is exponentially added to or subtracted from the orientation of the robot. The exponential field parameters are chosen based on the Lyapunov stability criterion to guarantee robot convergence to the destination. The proposed method uses the obstacles' shape and location, extracted from the global vision system, through a collision prediction mechanism to decide whether to activate or deactivate an obstacle's field. In addition, a search mechanism is developed for cases where the robot or the goal point is trapped among obstacles, to find a suitable exit or entrance. The proposed algorithm is validated both in simulation and through experiments. The algorithm shows effectiveness in obstacle avoidance and destination convergence, overcoming common path-planning problems found in classical methods.
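
The core idea of an exponentially decaying angular deviation per obstacle can be sketched as follows. The gains, decay rate and toy scene are invented for illustration and are not the paper's Lyapunov-derived parameters:

```python
import math

def step_heading(robot, goal, obstacles, k=1.2, decay=1.0):
    """Heading toward the goal plus one exponential deviation term per
    obstacle: nearer obstacles bend the heading more; the sign depends on
    which side of the robot-goal line the obstacle lies."""
    base = math.atan2(goal[1] - robot[1], goal[0] - robot[0])
    dev = 0.0
    for ox, oy in obstacles:
        d = math.hypot(ox - robot[0], oy - robot[1])
        # cross-product sign: obstacle left of the robot->goal line => steer right
        side = ((goal[0] - robot[0]) * (oy - robot[1])
                - (goal[1] - robot[1]) * (ox - robot[0]))
        dev += -math.copysign(k * math.exp(-decay * d), side)
    return base + dev

def simulate(start, goal, obstacles, speed=0.2, steps=400):
    """Integrate the heading field until the goal is reached or steps run out."""
    pos = list(start)
    for _ in range(steps):
        if math.hypot(goal[0] - pos[0], goal[1] - pos[1]) < 0.3:
            return pos, True
        h = step_heading(pos, goal, obstacles)
        pos[0] += speed * math.cos(h)
        pos[1] += speed * math.sin(h)
    return pos, False

pos, reached = simulate((0.0, 0.0), (6.0, 0.0), [(3.0, 0.4)])
```

Because the deviation vanishes exponentially with distance, the heading reverts to the pure goal direction away from obstacles, which is the intuition behind the convergence guarantee.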

Keywords: path planning, collision avoidance, convergence, computer vision, mobile robots

Procedia PDF Downloads 195
2160 Biodegradation of Carbamazepine and Diclofenac by Bacterial Strain Labrys Portucalensis

Authors: V. S. Bessa, I. S. Moreira, S. Murgolo, C. Piccirillo, G. Mascolo, P. M. L. Castro

Abstract:

The occurrence of pharmaceuticals in the environment has been a topic of increasing concern. Pharmaceuticals are not completely mineralized in the human body and are released into sewage systems, as the pharmaceuticals themselves and as their biologically active metabolites, through excretion as well as through improper elimination and disposal. Conventional wastewater treatment plants (WWTPs) are not designed to remove these emerging pollutants, and they are thus released into the environment. The antiepileptic drug carbamazepine (CBZ) and the non-steroidal anti-inflammatory diclofenac (DCF) are two widely used pharmaceuticals, frequently detected in water bodies, including rivers and groundwater, in concentrations ranging from ng L⁻¹ to mg L⁻¹. These two compounds have been classified as medium- to high-risk pollutants in WWTP effluents and surface waters. Furthermore, CBZ has been suggested as a molecular marker of wastewater contamination in surface water and groundwater, and the European Union included DCF in the watch list of substances to be monitored. In the present study, the biodegradation of CBZ and DCF by the bacterial strain Labrys portucalensis F11, a strain able to degrade other pharmaceutical compounds, was assessed; tests were performed with each compound as the single carbon and energy source, as well as in the presence of 5.9 mM sodium acetate. In assays supplemented with 2.0 and 4.0 µM of CBZ, the compound was no longer detected in the bulk medium after 24 hours and 5 days, respectively. Complete degradation was achieved in 21 days for 11.0 µM and in 23 days for 21.0 µM. For the highest concentration tested (43.0 µM), 95% degradation was achieved in 30 days. Supplementation with acetate increased the degradation rate of CBZ for all tested concentrations. In the case of DCF supplied as the single carbon source, approximately 70% of the DCF (1.7, 3.3, 8.4, 17.5 and 34.0 µM) was degraded in 30 days. Complete degradation was achieved in the presence of acetate for all tested concentrations, at higher degradation rates. The detection of intermediates produced during DCF biodegradation was performed by UPLC-QTOF/MS/MS, which allowed the identification of a range of metabolites. Stoichiometric liberation of chlorine occurred, and no metabolites were detected at the end of the biodegradation assays, suggesting complete mineralization of DCF. The strain Labrys portucalensis F11 proved able to degrade these two top-priority environmental contaminants and may be potentially useful for biotechnological applications and environmental remediation.

Keywords: biodegradation, carbamazepine, diclofenac, pharmaceuticals

Procedia PDF Downloads 273
2159 A Multi-Modal Virtual Walkthrough of the Virtual Past and Present Based on Panoramic View, Crowd Simulation and Acoustic Heritage on Mobile Platform

Authors: Lim Chen Kim, Tan Kian Lam, Chan Yi Chee

Abstract:

This research presents a multi-modal simulation for the reconstruction of the past and the construction of the present in digital cultural heritage on a mobile platform. To bring present life into the simulation, the virtual environment is generated through a presented scheme for rapid and efficient construction of a 360° panoramic view. Then, an acoustical heritage model and a crowd model are incorporated into the 360° panoramic view. For the reconstruction of past life, a crowd is simulated and rendered in an old trading port. The keystone of this research, however, is a virtual walkthrough that shows virtual present life in 2D and virtual past life in 3D, both in an environment of virtual heritage sites in George Town, through a mobile device. Firstly, the 2D crowd is modelled and simulated using OpenGL ES 1.1 on the mobile platform. The 2D crowd is used to portray present life in the 360° panoramic view of a virtual heritage environment based on an extension of Newtonian laws. Secondly, the 2D crowd is animated and rendered into 3D with improved variety and incorporated into the virtual past life using the Unity3D game engine. The behaviours of the 3D models are then simulated based on an enhancement of the classical Boid algorithm. Finally, a demonstration system is developed and integrated with the models, techniques and algorithms of this research. The virtual walkthrough is demonstrated to a group of respondents and evaluated through user-centred evaluation by navigating around the demonstration system. The results of the questionnaire-based evaluation show that the presented virtual walkthrough has been successfully deployed through a multi-modal simulation and that such a virtual walkthrough would be particularly useful in virtual tour and virtual museum applications.
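The 3D behaviours above build on the classical Boid model, in which each agent steers by three local rules: separation, alignment, and cohesion. A minimal sketch of one simulation step follows; the neighbourhood radius and rule weights are illustrative values, not those of the paper's enhanced variant.

```python
import numpy as np

def boid_step(pos, vel, radius=2.0, w_sep=0.05, w_ali=0.05, w_coh=0.01, dt=1.0):
    """One update of the classical Boid rules: separation, alignment, cohesion.

    pos, vel: (n, 2) arrays of agent positions and velocities.
    """
    new_vel = vel.copy()
    for i in range(len(pos)):
        dist = np.linalg.norm(pos - pos[i], axis=1)
        mask = (dist < radius) & (dist > 0)           # neighbours of agent i
        if not mask.any():
            continue                                  # isolated agent keeps its velocity
        sep = (pos[i] - pos[mask]).sum(axis=0)        # steer away from neighbours
        ali = vel[mask].mean(axis=0) - vel[i]         # match neighbours' heading
        coh = pos[mask].mean(axis=0) - pos[i]         # steer toward local centre
        new_vel[i] = vel[i] + w_sep * sep + w_ali * ali + w_coh * coh
    return pos + new_vel * dt, new_vel
```

Calling `boid_step` repeatedly advances the flock; enhanced variants such as the one described above typically add obstacle avoidance and goal-seeking terms on top of these three rules.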

Keywords: Boid Algorithm, Crowd Simulation, Mobile Platform, Newtonian Laws, Virtual Heritage

Procedia PDF Downloads 277
2158 Heliport Remote Safeguard System Based on Real-Time Stereovision 3D Reconstruction Algorithm

Authors: Ł. Morawiński, C. Jasiński, M. Jurkiewicz, S. Bou Habib, M. Bondyra

Abstract:

With the development of optics, electronics, and computers, vision systems are increasingly used in various areas of life, science, and industry, and they have a huge number of applications: quality control, object detection, data reading (e.g., QR codes), and more. A large share of them serves measurement purposes, and some make it possible to obtain a 3D reconstruction of the tested objects or measurement areas. 3D reconstruction algorithms are mostly based on creating depth maps from data that can be acquired by active or passive methods. Due to the specific requirements of airfield technology, only passive methods are applicable, because other systems already operating on the site could be blinded across most spectral bands. Furthermore, the reconstruction is required to work over long distances, ranging from hundreds of meters to tens of kilometers, with low loss of accuracy even in harsh conditions such as fog, rain, or snow. In response to those requirements, HRESS (Heliport REmote Safeguard System) was developed, whose main part is a rotational head with a two-camera stereovision rig gathering images around the head in 360 degrees, along with stereovision 3D reconstruction and point cloud combination. The sub-pixel analysis introduced in the HRESS system makes it possible to obtain an increased distance measurement resolution and an accuracy of about 3% for distances over one kilometer. Ultimately, this leads to more accurate and reliable measurement data in the form of a point cloud. Moreover, the program algorithm introduces operations enabling the filtering of erroneously collected data in the point cloud. All activities on the programming, mechanical and optical sides are aimed at obtaining the most accurate 3D reconstruction of the environment in the measurement area.
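The core geometry behind such a stereovision rig is the disparity-to-depth relation for a rectified camera pair, which also explains why sub-pixel matching improves long-range resolution. A hedged sketch (the focal length, baseline, and disparity step below are illustrative numbers, not HRESS parameters):

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def depth_resolution(depth_m, focal_px, baseline_m, disp_step_px=0.25):
    """Smallest resolvable depth change for a given disparity step:
    dZ ~ Z^2 / (f * B) * dd."""
    return depth_m ** 2 / (focal_px * baseline_m) * disp_step_px
```

Because the depth error grows with Z², halving the disparity step through sub-pixel analysis halves the resolvable depth change at any given range, which is what makes kilometre-scale accuracy feasible with a compact baseline.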

Keywords: airfield monitoring, artificial intelligence, stereovision, 3D reconstruction

Procedia PDF Downloads 125
2157 Local Directional Encoded Derivative Binary Pattern Based Coral Image Classification Using Weighted Distance Gray Wolf Optimization Algorithm

Authors: Annalakshmi G., Sakthivel Murugan S.

Abstract:

This paper presents a local directional encoded derivative binary pattern (LDEDBP) feature extraction method that can be applied to the classification of submarine coral reef images. The classification of coral reef images using texture features is difficult due to the dissimilarities among class samples. In coral reef image classification, texture features are extracted using the proposed method, LDEDBP. The proposed approach extracts the complete structural arrangement of the local region using the local binary pattern (LBP) and also extracts edge information using the local directional pattern (LDP) from the edge responses available in a particular region, thereby achieving an extra discriminative feature value. Typically, the LDP extracts edge details in all eight directions. Integrating edge responses with the local binary pattern yields a more robust texture descriptor than the other descriptors used in texture feature extraction methods. Finally, the proposed technique is applied to an extreme learning machine (ELM) with a meta-heuristic algorithm known as the weighted distance grey wolf optimizer (WDGWO) to optimize the input weights and biases of single-hidden-layer feed-forward neural networks (SLFN). In the empirical results, ELM-WDGWO demonstrated better performance in terms of accuracy on all coral datasets, namely RSMAS, EILAT, EILAT2, and MLC, compared with other state-of-the-art algorithms, achieving the highest overall classification accuracy of 94%.
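The LBP building block mentioned above thresholds each 3×3 neighbourhood at its centre pixel and packs the eight comparisons into one byte. A minimal sketch of the basic LBP operator only (not the full LDEDBP descriptor, whose derivative and directional encoding the paper adds on top):

```python
import numpy as np

def lbp_code(patch):
    """8-bit local binary pattern code for the centre pixel of a 3x3 patch:
    each neighbour contributes a 1 where it is >= the centre value."""
    centre = patch[1, 1]
    neighbours = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
                  patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]  # clockwise
    return sum(int(v >= centre) << i for i, v in enumerate(neighbours))
```

Sliding this operator over an image and histogramming the codes gives the texture feature vector that descriptors of this family feed to a classifier.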

Keywords: feature extraction, local directional pattern, ELM classifier, GWO optimization

Procedia PDF Downloads 163
2156 The Role of Semi Open Spaces on Exploitation of Wind-Driven Ventilation

Authors: Paria Saadatjoo

Abstract:

Given that HVAC systems are among the main producers of carbon dioxide, developing ways to reduce dependence on these systems and make use of natural resources is essential to achieving environmentally friendly buildings. A major part of a building's potential for using natural energy resources depends on its physical features, so architectural decisions taken at the first step of the design process can significantly influence the building's energy efficiency. Implementing semi-open spaces in solid apartment blocks, inspired by the courtyard concept of ancient buildings, is a passive cooling strategy currently enjoying great popularity. However, analysing these features and their effect on wind behaviour at the initial design steps is a difficult task for architects. The main objective of this research was to investigate the influence of the semi-open-to-closed space ratio on airflow patterns in and around midrise buildings and to identify the best ratio for harnessing natural ventilation. The main strategy of this paper was semi-experimental, and the research methodology was descriptive statistics. In the first step, six models with various open-to-closed space ratios were created by changing the terrace area. These forms were then transferred to CFD software to calculate primary indicators of natural ventilation potential such as the wind force coefficient, airflow rate, and age-of-air distribution. The investigations indicated that modifying the terrace area, and thus the open-to-closed space ratio, influenced the wind force coefficient, airflow rate, and age-of-air distribution.
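Before running full CFD, such wind-driven ventilation indicators can be roughly estimated with simple opening-flow relations. A hedged sketch using the common Q = Cv·A·U estimate (the effectiveness value Cv, opening area, and room volume below are illustrative assumptions, not values from this study):

```python
def wind_driven_flow(opening_area_m2, wind_speed_ms, effectiveness=0.55):
    """Volumetric airflow through an opening, Q = Cv * A * U.
    Cv is an assumed opening effectiveness (roughly 0.5-0.6 for
    near-perpendicular wind in common design guidance)."""
    return effectiveness * opening_area_m2 * wind_speed_ms

def air_changes_per_hour(flow_m3s, room_volume_m3):
    """Convert a volumetric flow (m3/s) into air changes per hour (ACH)."""
    return flow_m3s * 3600.0 / room_volume_m3
```

For example, a 2 m² terrace opening in a 3 m/s wind yields about 3.3 m³/s, i.e., roughly 99 ACH for a 120 m³ flat; CFD then refines such first-pass numbers with the actual pressure field and geometry.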

Keywords: natural ventilation, wind, midrise, open space, energy

Procedia PDF Downloads 170
2155 The Design and Implementation of an Enhanced 2D Mesh Switch

Authors: Manel Langar, Riad Bourguiba, Jaouhar Mouine

Abstract:

In this paper, we propose the design and implementation of an enhanced wormhole virtual-channel on-chip router. It is the heart of a mesh NoC using the XY deterministic routing algorithm. It is characterized by a simple virtual channel allocation strategy that reduces the area and complexity of connections without affecting performance. We implemented our router on a Tezzaron process to validate its performance. This router is a basic element that will later be used to design a 3D mesh NoC.
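XY deterministic routing, which the router implements, resolves the X offset completely before the Y offset; forbidding the Y-to-X turn is what keeps the mesh deadlock-free without extra virtual channels for routing. A minimal sketch of the route computation on node coordinates (the grid layout is illustrative):

```python
def xy_route(src, dst):
    """XY deterministic routing on a 2D mesh NoC: fully resolve the X offset,
    then the Y offset. Returns the list of nodes the packet traverses."""
    (x, y), (dx, dy) = src, dst
    path = [(x, y)]
    while x != dx:                      # X dimension first
        x += 1 if dx > x else -1
        path.append((x, y))
    while y != dy:                      # then Y dimension
        y += 1 if dy > y else -1
        path.append((x, y))
    return path
```

For example, `xy_route((0, 0), (2, 1))` traverses two eastward hops before the single northward hop; in hardware, the same decision is taken one hop at a time in each router's routing unit.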

Keywords: NoC, mesh, router, 3D NoC

Procedia PDF Downloads 568
2154 Investigating the Algorithm to Maintain a Constant Speed in the Wankel Engine

Authors: Adam Majczak, Michał Bialy, Zbigniew Czyż, Zdzislaw Kaminski

Abstract:

Increasingly stringent emission standards for passenger cars require us to find alternative drives. The share of electric vehicles in new car sales increases every year, yet their performance and, above all, range cannot yet compete with those of cars with a traditional internal combustion engine. Battery recharging lasts hours, which is hard to accept compared with the time needed to refill a fuel tank. Therefore, ways to reduce the adverse features of cars equipped only with electric motors are being sought. One of the methods is a combination of an electric motor as the main source of power and a small internal combustion engine as an electricity generator. This type of drive enables an electric vehicle to achieve a radically increased range and low emissions of toxic substances. For several years, leading automotive manufacturers like Mazda and Audi, together with the best companies in the automotive industry, e.g., AVL, have developed electric drive systems capable of recharging themselves while driving, known as range extenders. The electricity generator is powered by a Wankel engine, which had seemed to pass into history. This small, lightweight engine with a rotating piston and a very low vibration level turned out to be an excellent source in such applications. Its operation as an energy source for a generator almost entirely eliminates its disadvantages, such as high fuel consumption, high emission of toxic substances, and the short lifetime typical of its traditional application. Operating the engine at a constant rotational speed enables a significant increase in its lifetime, and its small external dimensions make it possible to build compact modules to drive even small urban cars like the Audi A1 or the Mazda 2. The algorithm to maintain a constant speed was investigated on an engine dynamometer with an eddy current brake and the necessary measuring apparatus. The research object was the Aixro XR50 rotary engine with the electronic power supply developed at the Lublin University of Technology. The load torque of the engine was altered during the research by means of the eddy current brake, capable of applying any number of load cycles. The parameters recorded included speed and torque as well as the position of the throttle in the inlet system. Increasing and decreasing the load did not significantly change engine speed, which means that the control algorithm parameters were correctly selected. This work has been financed by the Polish Ministry of Science and Higher Education.
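The abstract does not spell out the control law; a constant-speed governor of this kind is often realised as a simple discrete PI loop acting on the throttle. A hedged sketch (the gains, sample time, and throttle limits are illustrative assumptions, not the XR50 calibration):

```python
class SpeedGovernor:
    """Discrete PI controller that trims throttle to hold a speed setpoint.
    Gains and limits here are illustrative, not the values tuned on the rig."""

    def __init__(self, setpoint_rpm, kp=0.002, ki=0.0005, dt=0.01):
        self.sp, self.kp, self.ki, self.dt = setpoint_rpm, kp, ki, dt
        self.integral = 0.0

    def update(self, measured_rpm):
        """One control step: return a throttle command in [0, 1]."""
        err = self.sp - measured_rpm
        self.integral += err * self.dt          # accumulate tracking error
        throttle = self.kp * err + self.ki * self.integral
        return min(max(throttle, 0.0), 1.0)     # clamp to actuator range
```

When a load step from the brake drops the speed, the error term opens the throttle immediately, while the integral term removes the steady-state offset, which is consistent with the observation that load changes left the speed nearly unchanged.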

Keywords: electric vehicle, power generator, range extender, Wankel engine

Procedia PDF Downloads 157
2153 Layer-Level Feature Aggregation Network for Effective Semantic Segmentation of Fine-Resolution Remote Sensing Images

Authors: Wambugu Naftaly, Ruisheng Wang, Zhijun Wang

Abstract:

Models based on convolutional neural networks (CNNs), in conjunction with Transformers, have excelled in semantic segmentation, a fundamental task for intelligent Earth observation using remote sensing (RS) imagery. Nonetheless, tokenization in the Transformer model undermines object structures and neglects inner-patch local information, whereas CNNs are unable to model global semantics due to the limitations inherent in their local convolutional properties. The integration of the two methodologies facilitates effective global-local feature aggregation and interaction, potentially enhancing segmentation results. Inspired by the merits of CNNs and Transformers, we introduce a layer-level feature aggregation network (LLFA-Net) to address the semantic segmentation of fine-resolution remote sensing (FRRS) images for land cover classification. The simple yet efficient system employs a transposed unit that hierarchically utilizes dense high-level semantics and sufficient spatial information from various encoder layers through a layer-level feature aggregation module (LLFAM) and models global contexts using structured Transformer blocks. Furthermore, the decoder aggregates the resulting features to generate a rich semantic representation. Extensive experiments on two public land cover datasets demonstrate that our proposed framework exhibits competitive performance relative to the most recent frameworks in semantic segmentation.

Keywords: land cover mapping, semantic segmentation, remote sensing, vision transformer networks, deep learning

Procedia PDF Downloads 0
2152 Ensemble Machine Learning Approach for Estimating Missing Data from CO₂ Time Series

Authors: Atbin Mahabbati, Jason Beringer, Matthias Leopold

Abstract:

To address the global challenges of climate and environmental change, there is a need to quantify and reduce uncertainties in environmental data, including observations of carbon, water, and energy. Global eddy covariance flux tower networks (FLUXNET) and their regional counterparts (i.e., OzFlux, AmeriFlux, China Flux, etc.) were established in the late 1990s and early 2000s to address this demand. Despite the capability of eddy covariance in validating process modelling analyses, field surveys, and remote sensing assessments, there are some serious concerns regarding the challenges associated with the technique, e.g., data gaps and uncertainties. To address these concerns, this research developed an ensemble model to fill the data gaps of CO₂ flux, avoiding the limitations of using a single algorithm and therefore reducing the error and uncertainties associated with the gap-filling process. In this study, the data of five towers in the OzFlux network (Alice Springs Mulga, Calperum, Gingin, Howard Springs and Tumbarumba) during 2013 were used to develop an ensemble machine learning model, using five feedforward neural networks (FFNN) with different structures combined with an eXtreme Gradient Boosting (XGB) algorithm. The former, the FFNNs, provided the primary estimations in the first layer, while the latter, XGB, used the outputs of the first layer as its input to provide the final estimations of CO₂ flux. The introduced model showed slight superiority over each single FFNN and the XGB when each of these two methods was used individually, with overall RMSEs of 2.64, 2.91, and 3.54 g C m⁻² yr⁻¹, respectively (3.54 provided by the best FFNN). The most significant improvement occurred in the estimation of the extreme diurnal values (during midday and sunrise), as well as nocturnal estimations, which are generally considered among the most challenging parts of CO₂ flux gap-filling. The towers, as well as seasonality, showed different levels of sensitivity to the improvements provided by the ensemble model. For instance, Tumbarumba showed more sensitivity than Calperum, where the differences between the ensemble model on the one hand and the FFNNs and XGB on the other were the smallest of all five sites. Moreover, the performance difference between the ensemble model and its components used individually was more significant during the warm season (Jan, Feb, Mar, Oct, Nov, and Dec) than the cold season (Apr, May, Jun, Jul, Aug, and Sep) due to the higher amount of photosynthesis, which led to a larger range of CO₂ exchange. In conclusion, the introduced ensemble model slightly improved the accuracy of CO₂ flux gap-filling and the robustness of the model. Therefore, ensemble machine learning models are potentially capable of improving data estimation and regression outcomes when there seems to be no more room for improvement with a single algorithm.
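The two-layer structure described above is a stacking ensemble: several first-layer regressors produce intermediate predictions, and a second-layer model learns how to combine them. A minimal sketch with closed-form ridge regressors standing in for both the FFNNs and XGB (the model choice, subspace size, and regularisation are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

def fit_ridge(X, y, lam=1e-3):
    """Closed-form ridge regression weights (stand-in for one base model)."""
    A = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)

def stacked_fill(X_train, y_train, X_gap, n_models=5, seed=0):
    """Two-layer stacking: base regressors on random feature subsets feed a
    meta-regressor that produces the final gap-fill estimate."""
    rng = np.random.default_rng(seed)
    base_tr, base_gap = [], []
    for _ in range(n_models):
        cols = rng.choice(X_train.shape[1],
                          size=max(1, X_train.shape[1] // 2), replace=False)
        w = fit_ridge(X_train[:, cols], y_train)
        base_tr.append(X_train[:, cols] @ w)    # first-layer outputs (train)
        base_gap.append(X_gap[:, cols] @ w)     # first-layer outputs (gaps)
    P_tr = np.column_stack(base_tr)
    P_gap = np.column_stack(base_gap)
    w_meta = fit_ridge(P_tr, y_train)           # second layer combines them
    return P_gap @ w_meta
```

In the paper's setting, the base models would be the five FFNNs, the meta-learner XGB, and the inputs the meteorological drivers available at each tower.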

Keywords: carbon flux, Eddy covariance, extreme gradient boosting, gap-filling comparison, hybrid model, OzFlux network

Procedia PDF Downloads 139
2151 Feature Evaluation Based on Random Subspace and Multiple-K Ensemble

Authors: Jaehong Yu, Seoung Bum Kim

Abstract:

Clustering analysis can facilitate the extraction of intrinsic patterns in a dataset and reveal its natural groupings without requiring class information. For effective clustering analysis in high-dimensional datasets, unsupervised dimensionality reduction is an important task. Unsupervised dimensionality reduction can generally be achieved by feature extraction or feature selection. In many situations, feature selection methods are more appropriate than feature extraction methods because of their clear interpretation with respect to the original features. Unsupervised feature selection can be categorized into feature subset selection and feature ranking methods; we focus on unsupervised feature ranking methods, which evaluate features based on importance scores. Recently, several unsupervised feature ranking methods have been developed based on ensemble approaches to achieve higher accuracy and stability. However, most ensemble-based feature ranking methods require the true number of clusters. Furthermore, these algorithms evaluate feature importance depending on the ensemble clustering solution, and they produce undesirable evaluation results if the clustering solutions are inaccurate. To address these limitations, we propose an ensemble-based feature ranking method with random subspace and multiple-k ensemble (FRRM). The proposed FRRM algorithm evaluates the importance of each feature within random subspace ensembles, and all evaluation results are combined into ensemble importance scores. Moreover, FRRM does not require determining the true number of clusters in advance, thanks to the multiple-k ensemble idea. Experiments on various benchmark datasets were conducted to examine the properties of the proposed FRRM algorithm and to compare its performance with that of existing feature ranking methods. The experimental results demonstrate that the proposed FRRM outperformed the competitors.
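The idea can be illustrated end to end: cluster many random feature subspaces with several values of k, score each participating feature by how well the clustering separates it, and average the scores so that no single k or subspace dominates. A simplified sketch (the separability score and bare-bones k-means below are illustrative stand-ins for the paper's FRRM formulation):

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Bare-bones k-means returning cluster labels."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centres) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centres[j] = X[labels == j].mean(axis=0)
    return labels

def subspace_multik_scores(X, ks=(2, 3, 4), n_subspaces=10, seed=0):
    """Average each feature's between/within variance ratio over clusterings
    of random subspaces with multiple k values (simplified ranking sketch)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    scores, counts = np.zeros(p), np.zeros(p)
    for b in range(n_subspaces):
        cols = rng.choice(p, size=max(2, p // 2), replace=False)
        for k in ks:
            labels = kmeans(X[:, cols], k, seed=seed + b)
            for f in cols:
                overall = X[:, f].var() + 1e-12
                within = np.mean([X[labels == j, f].var()
                                  for j in range(k) if (labels == j).sum() > 1])
                scores[f] += 1.0 - within / overall   # high = well separated
                counts[f] += 1
    return scores / np.maximum(counts, 1)
```

Averaging over many (subspace, k) pairs is what removes the need to fix the true number of clusters and damps the effect of any single poor clustering solution.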

Keywords: clustering analysis, multiple-k ensemble, random subspace-based feature evaluation, unsupervised feature ranking

Procedia PDF Downloads 339
2150 Modified Weibull Approach for Bridge Deterioration Modelling

Authors: Niroshan K. Walgama Wellalage, Tieling Zhang, Richard Dwight

Abstract:

State-based Markov deterioration models (SMDM) sometimes fail to find accurate transition probability matrix (TPM) values and hence lead to invalid future condition predictions or incorrect average deterioration rates, mainly due to drawbacks of existing nonlinear-optimization-based algorithms and/or the subjective function types used for regression analysis. Furthermore, a set of separate functions of age for each condition state cannot be directly derived using a Markov model for a given bridge element group, although this is of interest to industrial partners. This paper presents a new approach for generating homogeneous SMDM model output, namely the Modified Weibull approach, which consists of a set of appropriate functions to describe the percentage condition prediction of bridge elements in each state. These functions are combined with a Bayesian approach and Metropolis-Hastings algorithm (MHA) based Markov chain Monte Carlo (MCMC) simulation for quantifying the uncertainty in model parameter estimates. In this study, factors contributing to rail bridge deterioration were identified. The inspection data for 1,000 Australian railway bridges over 15 years were reviewed and filtered accordingly based on real operational experience. A network-level deterioration model for a typical bridge element group was developed using the proposed Modified Weibull approach. The condition state predictions obtained from this method were validated using statistical hypothesis tests with a test data set. Results show that the proposed model is able not only to predict network-level conditions accurately but also to capture the model uncertainties within a given confidence interval.
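The MCMC machinery referenced above rests on the Metropolis-Hastings rule: propose a move, then accept it with probability min(1, posterior ratio). A minimal random-walk sketch for a one-dimensional parameter (in the paper this would target the Modified Weibull parameters; the proposal step size here is illustrative):

```python
import math, random

def metropolis_hastings(log_post, x0, n_samples=5000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: draws from a target distribution given
    only its unnormalised log-posterior."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        cand = x + rng.gauss(0.0, step)          # symmetric Gaussian proposal
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:  # accept w.p. min(1, ratio)
            x, lp = cand, lp_cand
        samples.append(x)                        # rejected moves repeat x
    return samples
```

Passing, e.g., `lambda x: -0.5 * x * x` samples a standard normal; for the bridge model, `log_post` would be the Weibull-based likelihood of the inspection data plus the parameter priors, and the retained samples yield the credible intervals on predicted conditions.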

Keywords: bridge deterioration modelling, modified Weibull approach, MCMC, Metropolis-Hastings algorithm, Bayesian approach, Markov deterioration models

Procedia PDF Downloads 727
2149 Visual Thinking Routines: A Mixed Methods Approach Applied to Student Teachers at the American University in Dubai

Authors: Alain Gholam

Abstract:

Visual thinking routines are principles based on several theories, approaches, and strategies. Such routines promote thinking skills, call for collaboration and the sharing of ideas, and, above all, make thinking and learning visible. Visual thinking routines were implemented in the teaching methodology graduate course at the American University in Dubai. The study used mixed methods and was guided by the following two research questions: 1) To what extent do visual thinking routines inspire learning in the classroom and make time for students’ questions, contributions, and thinking? 2) How do visual thinking routines inspire learning in the classroom and make time for students’ questions, contributions, and thinking? Eight student teachers enrolled in the teaching methodology course at the American University in Dubai (Spring 2017) participated in the study. First, they completed a survey that measured to what degree they believed visual thinking routines inspired learning in the classroom and made time for students’ questions, contributions, and thinking. To build on the results from the quantitative phase, the student teachers were next involved in a qualitative data collection phase, where they answered the question: How do visual thinking routines inspire learning in the classroom and make time for students’ questions, contributions, and thinking? Results revealed that the implementation of visual thinking routines in the classroom strongly inspires learning and makes time for students’ questions, contributions, and thinking. In addition, the student teachers explained how visual thinking routines allow for organization, variety, thinking, and documentation. As with all original, new, and unique resources, visual thinking routines are not free of challenges. To make the most of this useful and valued resource, educators need to comprehend, model, and spread awareness of effective ways of using such routines in the classroom. It is crucial that such routines become part of the curriculum to allow for and document students’ questions, contributions, and thinking.

Keywords: classroom display, student engagement, thinking classroom, visual thinking routines

Procedia PDF Downloads 228
2148 Design of Identification Based Adaptive Control for Fermentation Process in Bioreactor

Authors: J. Ritonja

Abstract:

Biochemical technology has been developing extremely fast since the middle of the last century. The main reason for such development is the requirement for large-scale production of high-quality biologically manufactured products such as pharmaceuticals, foods, and beverages. The impact of the biochemical industry on the world economy is enormous, and its great importance also results in intensive development of the scientific disciplines relevant to biochemical technology. In addition to developments in the fields of biology and chemistry, which make it possible to understand complex biochemical processes, development in the field of control theory and applications is also very important. In this paper, control of a biochemical reactor for milk fermentation was studied. During the fermentation process, the biophysical quantities must be precisely controlled to obtain a high-quality product. To control these quantities, the bioreactor’s stirring drive and/or heating system can be used. Available commercial biochemical reactors are equipped with open-loop or conventional linear closed-loop control systems. Due to the substantial parameter variations and the partial nonlinearity of the biochemical process, the results obtained with these control systems are not satisfactory. To improve the fermentation process, a self-tuning adaptive control system was proposed. The use of self-tuning adaptive control is suggested because the parameter variations of the studied biochemical process are, in most cases, very slow. To determine the linearized mathematical model of the fermentation process, the recursive least squares identification method was used. Based on the obtained mathematical model, a linear quadratic regulator was tuned. The parameter identification and the controller synthesis are executed on-line and adapt the controller’s parameters to the fermentation process dynamics during operation. The use of the proposed combination represents an original solution for the control of the milk fermentation process. The purpose of the paper is to contribute to the progress of control systems for biochemical reactors. The proposed adaptive control system was tested thoroughly, and the obtained results show that it ensures much better tracking of the reference signal than a conventional linear control system with fixed control parameters.
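The on-line identification step can be sketched with the standard recursive least squares update with a forgetting factor, which tracks slowly varying process parameters sample by sample. A minimal version (the forgetting factor and initial covariance are illustrative values, not those tuned for the bioreactor):

```python
import numpy as np

class RecursiveLeastSquares:
    """RLS identification of y_k = phi_k . theta with forgetting factor lam."""

    def __init__(self, n_params, lam=0.98, p0=1000.0):
        self.theta = np.zeros(n_params)     # current parameter estimate
        self.P = np.eye(n_params) * p0      # estimate covariance
        self.lam = lam                      # forgetting factor (< 1 tracks drift)

    def update(self, phi, y):
        """Incorporate one regressor/measurement pair and return the estimate."""
        phi = np.asarray(phi, dtype=float)
        Pphi = self.P @ phi
        gain = Pphi / (self.lam + phi @ Pphi)
        self.theta = self.theta + gain * (y - phi @ self.theta)
        self.P = (self.P - np.outer(gain, Pphi)) / self.lam
        return self.theta
```

At each sampling instant the regressor phi (past inputs and outputs) and the new measurement y refine theta, and the refreshed linearized model is then used to retune the linear quadratic regulator.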

Keywords: adaptive control, biochemical reactor, linear quadratic regulator, recursive least squares identification

Procedia PDF Downloads 125
2147 Graphical Theoretical Construction of Discrete-Time Share Price Paths from Matroids

Authors: Min Wang, Sergey Utev

Abstract:

The lessons from the 2007-09 global financial crisis have driven scientific research that considers the design of new methodologies and financial models for the global market. The quantum mechanics approach was introduced into unpredictable stock market modelling. One famous quantum tool is the Feynman path integral method, which was used to model insurance risk by Tamturk and Utev and adapted to formalize path-dependent option pricing by Hao and Utev. This research is based on the path-dependent calculation method motivated by the Feynman path integral method. Path calculation can be studied in two ways: one is labelling, the other is computational. Labelling is part of the representation of objects, and generating functions can provide many different ways of representing share price paths. In this paper, recent work on the graph-theoretical construction of individual share price paths via matroids is presented. Firstly, the theory of matroids is reviewed, the relationship between lattice path matroids and Tutte polynomials is studied, and a way to connect points in lattice path matroids and Tutte polynomials is suggested. Secondly, it is found that a general binary tree can be validly constructed from a connected lattice path matroid, rather than from a general lattice path matroid. Lastly, it is shown that share price paths can be represented via a general binary tree, and an algorithm is developed to construct share price paths from general binary trees. A relationship is also provided between lattice integer points and Tutte polynomials of a transversal matroid. Using this connection together with the algorithm, a share price path can be constructed from a given connected lattice path matroid.
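The last step, reading a share price path off a binary tree, can be illustrated with the simplest labelling, in which each branch of the tree multiplies the price by a fixed up or down factor, so the 2ⁿ leaves of a depth-n tree enumerate all discrete-time paths. A toy sketch (the up/down factors and the binary-word labelling are illustrative, not the matroid construction itself):

```python
from itertools import product

def price_path(s0, moves, u=1.1, d=0.9):
    """Turn a binary word (1 = up, 0 = down) into a discrete-time price path."""
    path = [s0]
    for m in moves:
        path.append(path[-1] * (u if m else d))
    return path

def all_paths(s0, n_steps, u=1.1, d=0.9):
    """Enumerate every price path of length n_steps: one per leaf of the
    depth-n binary tree of up/down decisions."""
    return [price_path(s0, word, u, d) for word in product((0, 1), repeat=n_steps)]
```

The matroid construction in the paper replaces this brute-force enumeration: the connected lattice path matroid singles out which binary words (lattice paths) are admissible, and its Tutte polynomial counts them.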

Keywords: combinatorial construction, graphical representation, matroid, path calculation, share price, Tutte polynomial

Procedia PDF Downloads 138
2146 Antioxidant Activity and Microbiological Quality of Functional Bread Enriched with Morus Alba Leaf Extract during Storage

Authors: Joanna Kobus-Cisowska, Daria Szymanowska, Piotr Szulc, Oskar Szczepaniak, Marcin Dziedzinski, Szymon Byczkiewicz

Abstract:

A wide range of food products is offered on the market. However, increasing consumer awareness of the impact of food on health is driving a growing interest in enriched products. Cereal products are an important element of the human daily diet. In the literature, no data were found on the impact of Morus alba preparations on the content of active ingredients and the properties of wholemeal bread. Mulberry leaves (Morus alba L.) are a rich source of bioactive compounds with multidirectional antioxidant activity, which means that they can be a component of new foods that prevent disease or support therapy and improve patients’ health. The aim of the study was to assess the impact of the addition of white mulberry leaf extract on the antioxidant activity of bread. It has been shown that bread can be a carrier of biologically active substances from mulberry leaves, because the addition of mulberry at a sensorially acceptable level, while meeting microbiological requirements, significantly increased the content of bioactive ingredients and the antioxidant activity of the bread. The addition of aqueous mulberry leaf extract to bread increased the levels of flavonols and phenolic acids, in particular protocatechuic, chlorogenic, gallic and caffeic acids as well as isoquercetin and rutin, and also increased the antioxidant potential, which remained microbiologically stable during 5 days of storage. It has also been shown that the addition of Morus alba preparations has a statistically significant effect on anti-radical activity. In addition, there were no differences in activity in the DPPH and ABTS assays between post-storage samples, which means that the compounds responsible for the anti-radical activity present in the bread were not inactivated during storage. It was found that the tested bread was characterized by high microbiological purity, as indicated by the results of the analyses performed for titers of indicator microorganisms and the absence of pathogens. In the tested products, from the moment of production and throughout the entire storage period, no undesirable microflora was found, which proves their safety and guarantees microbiological stability during storage.

Keywords: antioxidants, bread, extract, quality

Procedia PDF Downloads 174