Search results for: sensory processing sensitivity
1764 Anthropometric Data Variation within Gari-Frying Population
Authors: T. M. Samuel, O. O. Aremu, I. O. Ismaila, L. I. Onu, B. O. Adetifa, S. E. Adegbite, O. O. Olokoshe
Abstract:
The imperative of anthropometry in designing to fit cannot be overemphasized. Of essence is the variability of measurements within the population for which data are collected. In this paper, anthropometric data were collected for the design of a gari-frying facility so that the work system would fit the gari-frying population in the Southwestern states of Nigeria comprising Lagos, Ogun, Oyo, Osun, Ondo, and Ekiti. Twenty-seven body dimensions were measured among 120 gari-frying processors. Statistical analysis was performed using the SPSS package to determine the mean, standard deviation, minimum value, maximum value and percentiles (2nd, 5th, 25th, 50th, 75th, 95th, and 98th) of the different anthropometric parameters. A one-sample t-test was conducted to determine the variation within the population. The 50th percentiles of some of the anthropometric parameters were compared with those from other populations in the literature. The correlation between the workers' age and body anthropometry was also investigated. The mean weight, height, shoulder height (sitting), eye height (standing) and eye height (sitting) are 63.37 kg, 1.57 m, 0.55 m, 1.45 m, and 0.67 m, respectively. Results also show a high correlation with other populations and a statistically significant difference in variability of data within the population in all the body dimensions measured. With a mean age of 42.36 years, the results show that age would be a poor indicator for estimating the anthropometry of this population. Keywords: anthropometry, cassava processing, design to fit, gari-frying, workstation design
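A minimal Python sketch of the statistics described above (descriptive values, percentiles, and a one-sample t-test), for readers who want to reproduce the analysis outside SPSS; the sample values and the reference stature are invented placeholders, not the study's data.

# Illustrative sketch (not the study's SPSS workflow or data): descriptive statistics,
# percentiles and a one-sample t-test for one anthropometric variable.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
stature_m = rng.normal(loc=1.57, scale=0.06, size=120)  # hypothetical sample of 120 statures

percentiles = np.percentile(stature_m, [2, 5, 25, 50, 75, 95, 98])
print("mean:", stature_m.mean(), "sd:", stature_m.std(ddof=1))
print("min/max:", stature_m.min(), stature_m.max())
print("percentiles (2nd..98th):", percentiles)

# One-sample t-test of the sample against an assumed reference stature
t_stat, p_value = stats.ttest_1samp(stature_m, popmean=1.60)
print("t =", t_stat, "p =", p_value)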
Procedia PDF Downloads 253
1763 Marble Powder’s Effect on Permeability and Mechanical Properties of Concrete
Authors: Shams Ul Khaliq, Khan Shahzada, Bashir Alam, Fawad Bilal, Mushtaq Zeb, Faizan Akbar
Abstract:
Marble industry contributes its fair share in environmental deterioration, producing voluminous amounts of mud and other excess residues obtained from marble and granite processing, polluting soil, water and air. Reusing these products in other products will not just prevent our environment from polluting but also help with economy. In this research, an attempt has been made to study the expediency of waste Marble Powder (MP) in concrete production. Various laboratory tests were performed to investigate permeability, physical and mechanical properties, such as slump, compressive strength, split tensile test, etc. Concrete test samples were fabricated with varying MP content (replacing 5-30% cement), furnished from two different sources. 5% replacement of marble dust caused 6% and 12% decrease in compressive and tensile strength respectively. These parameters gradually decreased with increasing MP content up to 30%. Most optimum results were obtained with 10% replacement. Improvement in consistency and permeability were noticed. The permeability was improved with increasing MP proportion up to 10% without substantial decrease in compressive strength. Obtained results revealed that MP as an alternative to cement in concrete production is a viable option considering its economic and environment friendly implications.Keywords: marble powder, strength, permeability, consistency, environment
Procedia PDF Downloads 333
1762 Mobile Devices and E-Learning Systems as a Cost-Effective Alternative for Digitizing Paper Quizzes and Questionnaires in Social Work
Authors: K. Myška, L. Pilařová
Abstract:
The article deals with possibilities of using cheap mobile devices with the combination of free or open source software tools as an alternative to professional hardware and software equipment. Especially in social work, it is important to find cheap yet functional solution that can compete with complex but expensive solutions for digitizing paper materials. Our research was focused on the analysis of cheap and affordable solutions for digitizing the most frequently used paper materials that are being commonly used by terrain workers in social work. We used comparative analysis as a research method. Social workers need to process data from paper forms quite often. It is still more affordable, time and cost-effective to use paper forms to get feedback in many cases. Collecting data from paper quizzes and questionnaires can be done with the help of professional scanners and software. These technologies are very powerful and have advanced options for digitizing and processing digitized data, but are also very expensive. According to results of our study, the combination of open source software and mobile phone or cheap scanner can be considered as a cost-effective alternative to professional equipment.Keywords: digitalization, e-learning, mobile devices, questionnaire
Procedia PDF Downloads 151
1761 Estimation of Ribb Dam Catchment Sediment Yield and Reservoir Effective Life Using Soil and Water Assessment Tool Model and Empirical Methods
Authors: Getalem E. Haylia
Abstract:
The Ribb dam is one of the irrigation projects in the Upper Blue Nile basin, Ethiopia, to irrigate the Fogera plain. Reservoir sedimentation is a major problem because it reduces the useful reservoir capacity by the accumulation of sediments coming from the watersheds. Estimates of sediment yield are needed for studies of reservoir sedimentation and planning of soil and water conservation measures. The objective of this study was to simulate the Ribb dam catchment sediment yield using SWAT model and to estimate Ribb reservoir effective life according to trap efficiency methods. The Ribb dam catchment is found in North Western part of Ethiopia highlands, and it belongs to the upper Blue Nile and Lake Tana basins. Soil and Water Assessment Tool (SWAT) was selected to simulate flow and sediment yield in the Ribb dam catchment. The model sensitivity, calibration, and validation analysis at Ambo Bahir site were performed with Sequential Uncertainty Fitting (SUFI-2). The flow data at this site was obtained by transforming the Lower Ribb gauge station (2002-2013) flow data using Area Ratio Method. The sediment load was derived based on the sediment concentration yield curve of Ambo site. Stream flow results showed that the Nash-Sutcliffe efficiency coefficient (NSE) was 0.81 and the coefficient of determination (R²) was 0.86 in calibration period (2004-2010) and, 0.74 and 0.77 in validation period (2011-2013), respectively. Using the same periods, the NS and R² for the sediment load calibration were 0.85 and 0.79 and, for the validation, it became 0.83 and 0.78, respectively. The simulated average daily flow rate and sediment yield generated from Ribb dam watershed were 3.38 m³/s and 1772.96 tons/km²/yr, respectively. The effective life of Ribb reservoir was estimated using the developed empirical methods of the Brune (1953), Churchill (1948) and Brown (1958) methods and found to be 30, 38 and 29 years respectively. To conclude, massive sediment comes from the steep slope agricultural areas, and approximately 98-100% of this incoming annual sediment loads have been trapped by the Ribb reservoir. In Ribb catchment, as well as reservoir systematic and thorough consideration of technical, social, environmental, and catchment managements and practices should be made to lengthen the useful life of Ribb reservoir.Keywords: catchment, reservoir effective life, reservoir sedimentation, Ribb, sediment yield, SWAT model
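For reference, the two goodness-of-fit measures quoted above (NSE and R²) can be computed as in the following sketch; the observed/simulated arrays are illustrative placeholders, not the Ribb catchment series.

# Minimal sketch of the calibration metrics: Nash-Sutcliffe efficiency and R²
# from observed vs. simulated flow series (placeholder numbers).
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def coefficient_of_determination(obs, sim):
    r = np.corrcoef(obs, sim)[0, 1]
    return r ** 2

obs = [3.1, 2.8, 4.0, 5.2, 3.9]   # observed flows (m³/s), illustrative
sim = [3.0, 3.0, 3.8, 5.0, 4.1]   # simulated flows (m³/s), illustrative
print("NSE:", nash_sutcliffe(obs, sim))
print("R² :", coefficient_of_determination(obs, sim))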
Procedia PDF Downloads 187
1760 Cardiokey: A Binary and Multi-Class Machine Learning Approach to Identify Individuals Using Electrocardiographic Signals on Wearable Devices
Authors: S. Chami, J. Chauvin, T. Demarest, Stan Ng, M. Straus, W. Jahner
Abstract:
Biometric tools such as fingerprint and iris are widely used in industry to protect critical assets. However, their vulnerability and lack of robustness raise several concerns about the protection of highly critical assets. Biometrics based on electrocardiographic (ECG) signals is a robust identification tool. However, most state-of-the-art techniques have worked on clinical signals, which are of high quality and less noisy than those extracted from wearable devices such as a smartwatch. In this paper, we present a complete machine learning pipeline that identifies people using ECG extracted from an off-person device. An off-person device is a wearable device that is not used in a medical context, such as a smartwatch. In addition, one of the main challenges of ECG biometrics is the variability of the ECG across different persons and different situations. To address this issue, we propose two approaches: a per-person classifier and a one-for-all classifier. The first approach builds a binary classifier to distinguish one person from all others. The second approach builds a multi-class classifier that distinguishes the selected set of individuals from non-selected individuals (others). In the preliminary results, the binary classifier obtained an accuracy of 90% on balanced data. The second approach reported a log loss of 0.05 as a multi-class score. Keywords: biometrics, electrocardiographic, machine learning, signals processing
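A hedged scikit-learn sketch of the two classifier set-ups described (a per-person binary model and a one-for-all multi-class model); the synthetic feature matrix, the logistic-regression model and the six hypothetical subjects are assumptions made for illustration, not the authors' pipeline.

# Sketch of the two set-ups on synthetic ECG-derived feature vectors.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, log_loss
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 20))          # placeholder ECG-derived features
y = rng.integers(0, 6, size=600)        # 6 hypothetical individuals

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

# (1) Per-person binary classifier: person 0 vs. everyone else
binary = LogisticRegression(max_iter=1000).fit(X_tr, (y_tr == 0).astype(int))
print("binary accuracy:", accuracy_score((y_te == 0).astype(int), binary.predict(X_te)))

# (2) One-for-all multi-class classifier over the selected set of individuals
multi = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("multi-class log loss:", log_loss(y_te, multi.predict_proba(X_te), labels=multi.classes_))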
Procedia PDF Downloads 142
1759 Indoor Real-Time Positioning and Mapping Based on Manhattan Hypothesis Optimization
Authors: Linhang Zhu, Hongyu Zhu, Jiahe Liu
Abstract:
This paper investigated a method of indoor real-time positioning and mapping based on the Manhattan world assumption. In indoor environments, relying solely on feature matching techniques or other geometric algorithms for sensor pose estimation inevitably resulted in cumulative errors, posing a significant challenge to indoor positioning. To address this issue, we adopt the Manhattan world hypothesis to optimize the camera pose algorithm based on feature matching, which improves the accuracy of camera pose estimation. A special processing method was applied to image data frames that conformed to the Manhattan world assumption. When similar data frames appeared subsequently, this could be used to eliminate drift in sensor pose estimation, thereby reducing cumulative errors in estimation and optimizing mapping and positioning. Through experimental verification, it is found that our method achieves high-precision real-time positioning in indoor environments and successfully generates maps of indoor environments. This provides effective technical support for applications such as indoor navigation and robot control.Keywords: Manhattan world hypothesis, real-time positioning and mapping, feature matching, loopback detection
Procedia PDF Downloads 61
1758 A Hybrid Expert System for Generating Stock Trading Signals
Authors: Hosein Hamisheh Bahar, Mohammad Hossein Fazel Zarandi, Akbar Esfahanipour
Abstract:
In this paper, a hybrid expert system is developed using fuzzy genetic network programming with reinforcement learning (GNP-RL). In this system, the frame-based structure uses the trading rules extracted by GNP. These rules are extracted using technical indices of the stock prices in the training time period. In developing this system, we applied fuzzy node transition and decision making in both the processing and judgment nodes of GNP-RL. Consequently, this method not only increased the accuracy of node transition and decision making in the GNP's nodes, but also extended the GNP's binary signals to ternary trading signals. In other words, in our proposed Fuzzy GNP-RL model, a No Trade signal is added to the conventional Buy and Sell signals. Finally, the obtained rules are used in a frame-based system implemented in Kappa-PC software. This trading system has been used to generate trading signals for ten companies listed on the Tehran Stock Exchange (TSE). The simulation results in the testing time period show that the developed system performs more favorably than the Buy and Hold strategy. Keywords: fuzzy genetic network programming, hybrid expert system, technical trading signal, Tehran stock exchange
Procedia PDF Downloads 332
1757 Self-Attention Mechanism for Target Hiding Based on Satellite Images
Authors: Hao Yuan, Yongjian Shen, Xiangjun He, Yuheng Li, Zhouzhou Zhang, Pengyu Zhang, Minkang Cai
Abstract:
Remote sensing data can provide support for decision-making in disaster assessment or disaster relief. The traditional processing methods for sensitive targets in remote sensing mapping are mainly based on manual retrieval and image editing tools, which are inefficient. Methods based on deep learning for sensitive target hiding are faster and more flexible, but they have disadvantages in training time and computational cost. This paper proposes a target hiding model, Self Attention (SA) Deepfill, which uses self-attention modules to replace part of the gated convolution layers in image inpainting. Through this change, the computational load of the model becomes smaller and its performance improves. The paper also adds free-form masks to the model's training to enhance its generality. Experiments on an open remote sensing dataset demonstrate the efficiency of our method. Moreover, experimental comparison shows that the proposed method can be trained for longer without over-fitting. Finally, compared with existing methods, the proposed model has a lower computational weight and better performance. Keywords: remote sensing mapping, image inpainting, self-attention mechanism, target hiding
Procedia PDF Downloads 136
1756 Experimental Characterization of Composite Material with Non Contacting Methods
Authors: Nikolaos Papadakis, Constantinos Condaxakis, Konstantinos Savvakis
Abstract:
The aim of this paper is to determine the elastic properties (elastic modulus and Poisson ratio) of a composite material based on non-contacting imaging methods. More specifically, the significantly reduced cost of digital cameras has made reliable, low-cost strain measurement possible. The open-source platform Ncorr, which implements the method of digital image correlation (DIC), is used in this paper. Measuring strain with digital image correlation involves random speckle preparation on the surface of the gauge area, image acquisition, and post-processing of the image correlation to obtain the displacement and strain fields on the surface under study. Technical issues relating to the quality of the obtainable results are discussed. [0]8 fabric glass/epoxy composite specimens were prepared and tested at different orientations: 0°, 30°, 45°, 60°, and 90°. Each test was recorded with the camera at a constant frame rate and under constant lighting conditions. The recorded images were processed with the image processing software. The parameters of the test are reported. The strain map obtained from the Ncorr strain measurement is validated by a) comparing the elastic properties with expected values from classical laminate theory, and b) finite element analysis. Keywords: composites, Ncorr, strain map, videoextensometry
Procedia PDF Downloads 144
1755 Extraction of Essential Oil and Pectin from Lime and Waste Technology Development
Authors: Wilaisri Limphapayom
Abstract:
Lime is one of the economically important crops produced in Thailand. The objective of this research is to increase its utilization in food and cosmetics. The extraction of essential oil and pectin from lime (Citrus aurantifolia (Christm & Panz) Swing) has been studied. The essential oil was extracted by hydro-distillation, with yields ranging from 1.72-2.20%. When analyzed by the GC-MS method, the essential oil was composed of alpha-pinene, beta-pinene, D-limonene, camphene, a-phellandrene, g-terpinene, a-ocimene, O-cymene, 2-carene, linalool, trans-ocimenol, geraniol, citral, isogeraniol, verbinol, and others. Pectin was extracted from lime waste, the boiled water remaining after essential oil extraction. Pectin yields of 40.11-65.81 g/100 g of lime peel were found. The best extraction condition, giving a higher yield, was ethanol extraction. The results of this study are satisfactory for improving the lime processing system for added value. The present study also focused on lime powder production as a source of vitamin C (ascorbic acid) and on the potential of lime waste as a source of essential oil and pectin. Lime powder was produced with a spray dryer. Lime juice with maltodextrin DE 10 at two different levels, 30 and 50% w/w, was sprayed at a 150 degrees Celsius inlet air temperature and a 90 degrees Celsius outlet temperature. Lime powder with 50% maltodextrin gave the most desirable quality product. This product has a vitamin C content of 25 mg/100 g (w/w). Keywords: extraction, pectin, essential oil, lime
Procedia PDF Downloads 299
1754 Network Word Discovery Framework Based on Sentence Semantic Vector Similarity
Authors: Ganfeng Yu, Yuefeng Ma, Shanliang Yang
Abstract:
Word discovery is a key problem in text information retrieval technology. Methods for new word discovery tend to be closely tied to words because they generally obtain new word results by analyzing words. With the popularity of social networks, individual netizens and online self-media have generated a variety of network texts for the convenience of online life, including network words that are far from standard Chinese expression. How to detect network words is one of the important goals in the field of text information retrieval today. In this paper, we integrate a word embedding model and clustering methods to propose a network word discovery framework based on sentence semantic similarity (S³-NWD) that detects network words effectively from the corpus. This framework constructs sentence semantic vectors through a distributed representation model, uses the similarity of sentence semantic vectors to determine the semantic relationship between sentences, and finally realizes network word discovery by means of semantic replacement between sentences. The experiments verify that the framework not only discovers network words rapidly but also recovers the standard word meaning of the discovered network words, which reflects the effectiveness of our work. Keywords: text information retrieval, natural language processing, new word discovery, information extraction
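A minimal sketch of the sentence-similarity step underlying the framework; the random vectors stand in for embeddings from a distributed representation model, and the similarity threshold is an arbitrary illustrative choice.

# Cosine similarity between placeholder sentence vectors, then a simple
# threshold to flag semantically related sentence pairs.
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

rng = np.random.default_rng(2)
sentences = ["sent_a", "sent_b", "sent_c", "sent_d"]       # placeholder corpus
vectors = {s: rng.normal(size=128) for s in sentences}      # stand-in embeddings

threshold = 0.2
pairs = [(a, b, cosine(vectors[a], vectors[b]))
         for i, a in enumerate(sentences) for b in sentences[i + 1:]]
related = [(a, b, sim) for a, b, sim in pairs if sim >= threshold]
print(related)  # candidate semantically related sentence pairs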
Procedia PDF Downloads 95
1753 Soil Quality Response to Long-Term Intensive Resources Management and Soil Texture
Authors: Dalia Feiziene, Virginijus Feiza, Agne Putramentaite, Jonas Volungevicius, Kristina Amaleviciute, Sarunas Antanaitis
Abstract:
The investigations on soil conservation are one of the most important topics in modern agronomy. Soil management practices have great influence on soil physico-chemical quality and GHG emission. Research objective: To reveal the sensitivity and vitality of soils with different texture to long-term antropogenisation on Cambisol in Central Lithuania and to compare them with not antropogenised soil resources. Methods: Two long-term field experiments (loam on loam; sandy loam on loam) with different management intensity were estimated. Disturbed and undisturbed soil samples were collected from 5-10, 15-20 and 30-35 cm depths. Soil available P and K contents were determined by ammonium lactate extraction, total N by the dry combustion method, SOC content by Tyurin titrimetric (classical) method, texture by pipette method. In undisturbed core samples soil pore volume distribution, plant available water (PAW) content were determined. A closed chamber method was applied to quantify soil respiration (SR). Results: Long-term resources management changed soil quality. In soil with loam texture, within 0-10, 10-20 and 30-35 cm soil layers, significantly higher PAW, SOC and mesoporosity (MsP) were under no-tillage (NT) than under conventional tillage (CT). However, total porosity (TP) under NT was significantly higher only in 0-10 cm layer. MsP acted as dominant factor for N, P and K accumulation in adequate layers. P content in all soil layers was higher under NT than in CT. N and K contents were significantly higher than under CT only in 0-10 cm layer. In soil with sandy loam texture, significant increase in SOC, PAW, MsP, N, P and K under NT was only in 0-10 cm layer. TP under NT was significantly lower in all layers. PAW acted as strong dominant factor for N, P, K accumulation. The higher PAW the higher NPK contents were determined. NT did not secure chemical quality within deeper layers than CT. Long-term application of mineral fertilisers significantly increased SOC and soil NPK contents primarily in top-soil. Enlarged fertilization determined the significantly higher leaching of nutrients to deeper soil layers (CT) and increased hazards of top-soil pollution. Straw returning significantly increased SOC and NPK accumulation in top-soil. The SR on sandy loam was significantly higher than on loam. At dry weather conditions, on loam SR was higher in NT than in CT, on sandy loam SR was higher in CT than in NT. NPK fertilizers promoted significantly higher SR in both dry and wet year, but suppressed SR on sandy loam during usual year. Not antropogenised soil had similar SOC and NPK distribution within 0-35 cm layer and depended on genesis of soil profile horizons.Keywords: fertilizers, long-term experiments, soil texture, soil tillage, straw
Procedia PDF Downloads 299
1752 Variable Renewable Energy Droughts in the Power Sector – A Model-based Analysis and Implications in the European Context
Authors: Martin Kittel, Alexander Roth
Abstract:
The continuous integration of variable renewable energy sources (VRE) in the power sector is required for decarbonizing the European economy. Power sectors become increasingly exposed to weather variability, as the availability of VRE, i.e., mainly wind and solar photovoltaic, is not persistent. Extreme events, e.g., long-lasting periods of scarce VRE availability (‘VRE droughts’), challenge the reliability of supply. Properly accounting for the severity of VRE droughts is crucial for designing a resilient renewable European power sector. Energy system modeling is used to identify such a design. Our analysis reveals the sensitivity of the optimal design of the European power sector towards VRE droughts. We analyze how VRE droughts impact optimal power sector investments, especially in generation and flexibility capacity. We draw upon work that systematically identifies VRE drought patterns in Europe in terms of frequency, duration, and seasonality, as well as the cross-regional and cross-technological correlation of most extreme drought periods. Based on their analysis, the authors provide a selection of relevant historical weather years representing different grades of VRE drought severity. These weather years will serve as input for the capacity expansion model for the European power sector used in this analysis (DIETER). We additionally conduct robustness checks varying policy-relevant assumptions on capacity expansion limits, interconnections, and level of sector coupling. Preliminary results illustrate how an imprudent selection of weather years may cause underestimating the severity of VRE droughts, flawing modeling insights concerning the need for flexibility. Sub-optimal European power sector designs vulnerable to extreme weather can result. Using relevant weather years that appropriately represent extreme weather events, our analysis identifies a resilient design of the European power sector. Although the scope of this work is limited to the European power sector, we are confident that our insights apply to other regions of the world with similar weather patterns. Many energy system studies still rely on one or a limited number of sometimes arbitrarily chosen weather years. We argue that the deliberate selection of relevant weather years is imperative for robust modeling results.Keywords: energy systems, numerical optimization, variable renewable energy sources, energy drought, flexibility
Procedia PDF Downloads 72
1751 Waste Management in a Hot Laboratory of Japan Atomic Energy Agency – 3: Volume Reduction and Stabilization of Solid Waste
Authors: Masaumi Nakahara, Sou Watanabe, Hiromichi Ogi, Atsuhiro Shibata, Kazunori Nomura
Abstract:
In the Japan Atomic Energy Agency, three types of experimental research, advanced reactor fuel reprocessing, radioactive waste disposal, and nuclear fuel cycle technology, have been carried out at the Chemical Processing Facility. The facility has generated high level radioactive liquid and solid wastes in hot cells. The high level radioactive solid waste is divided into three main categories, a flammable waste, a non-flammable waste, and a solid reagent waste. A plastic product is categorized into the flammable waste and molten with a heating mantle. The non-flammable waste is cut with a band saw machine for reducing the volume. Among the solid reagent waste, a used adsorbent after the experiments is heated, and an extractant is decomposed for its stabilization. All high level radioactive solid wastes in the hot cells are packed in a high level radioactive solid waste can. The high level radioactive solid waste can is transported to the 2nd High Active Solid Waste Storage in the Tokai Reprocessing Plant in the Japan Atomic Energy Agency.Keywords: high level radioactive solid waste, advanced reactor fuel reprocessing, radioactive waste disposal, nuclear fuel cycle technology
Procedia PDF Downloads 159
1750 Study of Durability of Porous Polymer Materials, Glass-Fiber-Reinforced Polyurethane Foam (R-PUF) in MarkIII Containment Membrane System
Authors: Florent Cerdan, Anne-Gaëlle Denay, Annette Roy, Jean-Claude Grandidier, Éric Laine
Abstract:
The insulation of the MarkIII membrane of Liquefied Natural Gas Carriers (LNGC) consists of a load-bearing system made of panels of reinforced polyurethane foam (R-PUF). During shipping, the cargo containment is potentially subject to risk events such as water leakage through the ballast tank wall. The aim of the present work is to further develop the understanding of water transfer mechanisms and the effect of water on the properties of R-PUF. This multi-scale approach contributes to improving durability. Macroscale/mesoscale: firstly, the gravimetric technique was used to characterize, at room temperature, the water transfer mechanisms and diffusion kinetics in the R-PUF. The solubility follows a first, fast-growing kinetic stage connected to water absorption by the micro-porosity, and then evolves slowly and linearly; this second stage is connected to molecular diffusion and the dissolution of water in the dense polyurethane membranes. Secondly, to improve the understanding of the transfer mechanism, the evolution of the buoyant force was studied. It allowed the identification of the effect of the balance between the total and partial pressures of the gas mixture contained at the pore surfaces. Mesoscale/microscale: differential scanning calorimetry (DSC) and dynamic mechanical analysis (DMA) were used to investigate the hydration of the hard and soft segments of the polyurethane matrix, with the purpose of identifying the sensitivity of these two phases. It was shown that the glass transition temperatures shift towards lower temperatures when the water solubility increases. These observations point to a plasticization of the polymer matrix. Microscale: Fourier transform infrared (FTIR) spectroscopy was used to characterize the functional groups at the edge, the center and mid-way through the sample as a function of the duration of submersion. The more water there is in the material, the more water binds to the urethane groups and, more specifically, to the amide groups. The C=O urethane peak shifts to lower frequencies quickly before 24 hours of submersion and then more slowly; the intensity of the peak decreases more gradually after that. Keywords: porous materials, water sorption, glass transition temperature, DSC, DMA, FTIR, transfer mechanisms
Procedia PDF Downloads 529
1749 Examining the Extent and Magnitude of Food Security amongst Rural Farming Households in Nigeria
Authors: Ajibade T., Omotesho O. A., Ayinde O. E, Ajibade E. T., Muhammad-Lawal A.
Abstract:
This study was carried out to examine the extent and magnitude of food security amongst farming rural households in Nigeria. Data used for this study was collected from a total of two hundred and forty rural farming households using a two-stage random sampling technique. The main tools of analysis for this study include descriptive statistics and a constructed food security index using the identification and aggregation procedure. The headcount ratio in this study reveals that 71% of individuals in the study area were food secure with an average per capita calorie and protein availability of 4,213.92kcal and 99.98g respectively. The aggregated household daily calorie availability and daily protein availability per capita were 3,634.57kcal and 84.08g respectively which happens to be above the food security line of 2,470kcal and 65g used in this study. The food insecure households fell short of the minimum daily per capita calorie and protein requirement by 2.1% and 24.9%. The study revealed that the area is food insecure due to unequal distribution of the available food amongst the sampled population. The study recommends that the households should empower themselves financially in order to enhance their ability to afford the food during both on and off seasons. Also, processing and storage of farm produce should be enhanced in order to improve on availability throughout the year.Keywords: farming household, food security, identification and aggregation, food security index
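A small sketch of the identification step behind the headcount ratio, flagging households against the 2,470 kcal per-capita line used in the study; the household figures below are invented for illustration, not the survey data.

# Identification-and-aggregation sketch: a household (and its members) is food
# secure if daily per-capita calorie availability meets the calorie line.
calorie_line = 2470  # kcal per capita per day

households = [
    {"members": 5, "daily_kcal": 14200},
    {"members": 3, "daily_kcal": 6200},
    {"members": 6, "daily_kcal": 17800},
]

secure_people = total_people = 0
for h in households:
    per_capita = h["daily_kcal"] / h["members"]
    total_people += h["members"]
    if per_capita >= calorie_line:
        secure_people += h["members"]

print("headcount ratio (food secure):", secure_people / total_people)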
Procedia PDF Downloads 291
1748 Intrathecal: Not Intravenous Administration of Evans Blue Reduces Pain Behavior in Neuropathic Rats
Authors: Kun Hua O., Dong Woon Kim, Won Hyung Lee
Abstract:
Introduction: Neuropathic pain induced by spinal or peripheral nerve injury is highly resistant to common painkillers, nerve blocks, and other pain management approaches. Recently, several new therapeutic drug candidates have been developed to control neuropathic pain. In this study, we used the spinal nerve L5 ligation (SNL) model to investigate the ability of intrathecal or intravenous Evans blue to decrease pain behavior and to study the relationship between Evans blue and the neural structure of pain transmission. Method: Neuropathic pain (allodynia) of the left hind paw was induced by unilateral SNL in Sprague-Dawley rats(n=10) in each group. Evans blue (5, 15, 50μg/10μl) or phosphate buffer saline(PBS,10μl) was injected intrathecally at 3days post-ligation or intravenously(1mg/200 μl) 3days and 5days post-ligation . Mechanical sensitivity was assessed using Von Frey filaments at 3 days post-ligation and at 2 hours, days 1, 2, 3, 5,7 after intrathecal Evans blue injection, and on days 2, 4, 7, and 11 at 14 days after intravenous injection. In the intrathecal group, microglia and glutaminergic neurons in the dorsal horn and VNUT(vesicular nucleotide transporter) in the dorsal root ganglia were tested to evaluate co-staining with Evans blue. The experimental procedures were performed in accordance with the animal care guideline of the Korean Academy of Medical Science(Animal ethic committee of Chungnam National University Hospital: CNUH-014-A0005-1). Results: Tight ligation of the L5 spinal nerve induced allodynia in the left hind paw 3 days post-ligation. Intrathecal Evans blue most significantly(P<0.001) alleviated allodynia at 2 days after intrathecal, but not an intravenous injection. Glutaminergic neurons in the dorsal horn and VNUT in the dorsal root ganglia were co-stained with Evans blue. On the other hand, microglia in the dorsal horn were partially co-stained with Evans blue. Conclusion: We confirmed that Evans blue might have an analgesic effect through the central nervous system, not another system in neuropathic pain of the SNL animal model. These results suggest Evans blue may be a potential new drug for the treatment of chronic pain. This research was supported by the National Research Foundation of Korea (NRF-2020R1A2C100757512), funded by the Ministry of Education.Keywords: neuropathic pain, Evas blue, intrathecal, intravenous
Procedia PDF Downloads 94
1747 Utilization of Standard Paediatric Observation Chart to Evaluate Infants under Six Months Presenting with Non-Specific Complaints
Authors: Michael Zhang, Nicholas Marriage, Valerie Astle, Marie-Louise Ratican, Jonathan Ash, Haddijatou Hughes
Abstract:
Objective: Young infants are often brought to the Emergency Department (ED) with a variety of complaints, some of them are non-specific and present as a diagnostic challenge to the attending clinician. Whilst invasive investigations such as blood tests and lumbar puncture are necessary in some cases to exclude serious infections, some basic clinical tools in additional to thorough clinical history can be useful to assess the risks of serious conditions in these young infants. This study aimed to examine the utilization of one of clinical tools in this regard. Methods: This retrospective observational study examined the medical records of infants under 6 months presenting to a mixed urban ED between January 2013 and December 2014. The infants deemed to have non-specific complaints or diagnoses by the emergency clinicians were selected for analysis. The ones with clear systemic diagnoses were excluded. Among all relevant clinical information and investigation results, utilization of Standard Paediatric Observation Chart (SPOC) was particularly scrutinized in these medical records. This specific chart was developed by the expert clinicians in local health department. It categorizes important clinical signs into some color-coded zones as a visual cue for serious implication of some abnormalities. An infant is regarded as SPOC positive when fulfills 1 red zone or 2 yellow zones criteria, and the attending clinician would be prompted to investigate and treat for potential serious conditions accordingly. Results: Eight hundred and thirty-five infants met the inclusion criteria for this project. The ones admitted to the hospital for further management were more likely to have SPOC positive criteria than the discharged infants (Odds ratio: 12.26, 95% CI: 8.04 – 18.69). Similarly, Sepsis alert criteria on SPOC were positive in a higher percentage of patients with serious infections (56.52%) in comparison to those with mild conditions (15.89%) (p < 0.001). The SPOC sepsis criteria had a sensitivity of 56.5% (95% CI: 47.0% - 65.7%) and a moderate specificity of 84.1% (95% CI: 80.8% - 87.0%) to identify serious infections. Applying to this infant population, with a 17.4% prevalence of serious infection, the positive predictive value was only 42.8% (95% CI: 36.9% - 49.0%). However, the negative predictive value was high at 90.2% (95% CI: 88.1% - 91.9%). Conclusions: Standard Paediatric Observation Chart has been applied as a useful clinical tool in the clinical practice to help identify and manage young sick infants in ED effectively.Keywords: clinical tool, infants, non-specific complaints, Standard Paediatric Observation Chart
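The screening statistics reported above follow from a standard 2x2 table; the sketch below shows the calculation with placeholder counts, not the study's data.

# Sensitivity, specificity, PPV and NPV from a 2x2 table of SPOC-positive/negative
# vs. serious/mild infection (illustrative counts only).
tp, fp = 52, 85    # SPOC positive: serious infection / mild condition
fn, tn = 40, 450   # SPOC negative: serious infection / mild condition

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f} "
      f"PPV={ppv:.3f} NPV={npv:.3f}")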
Procedia PDF Downloads 252
1746 Emotional Artificial Intelligence and the Right to Privacy
Authors: Emine Akar
Abstract:
The majority of privacy-related regulation has traditionally focused on concepts that are perceived to be well-understood or easily describable, such as certain categories of data and personal information or images. In the past century, such regulation appeared reasonably suitable for its purposes. However, technologies such as AI, combined with ever-increasing capabilities to collect, process, and store “big data”, not only require calibration of these traditional understandings but may require re-thinking of entire categories of privacy law. In the presentation, it will be explained, against the background of various emerging technologies under the umbrella term “emotional artificial intelligence”, why modern privacy law will need to embrace human emotions as potentially private subject matter. This argument can be made on a jurisprudential level, given that human emotions can plausibly be accommodated within the various concepts that are traditionally regarded as the underlying foundation of privacy protection, such as, for example, dignity, autonomy, and liberal values. However, the practical reasons for regarding human emotions as potentially private subject matter are perhaps more important (and very likely more convincing from the perspective of regulators). In that respect, it should be regarded as alarming that, according to most projections, the usefulness of emotional data to governments and, particularly, private companies will not only lead to radically increased processing and analysing of such data but, concerningly, to an exponential growth in the collection of such data. In light of this, it is also necessity to discuss options for how regulators could address this emerging threat.Keywords: AI, privacy law, data protection, big data
Procedia PDF Downloads 88
1745 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows
Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid
Abstract:
Modeling dam-break flows over non-flat beds requires an accurate representation of the topography which is the main source of uncertainty in the model. Therefore, developing robust and accurate techniques for reconstructing topography in this class of problems would reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure the bed topography. In practice, experimental work in hydraulics may be very demanding in both time and cost. Meanwhile, computational hydraulics have served as an alternative for laboratory and field experiments. Unlike the forward problem, the inverse problem is used to identify the bed parameters from the given experimental data. In this case, the shallow water equations used for modeling the hydraulics need to be rearranged in a way that the model parameters can be evaluated from measured data. However, this approach is not always possible and it suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme. In the first stage, the forward problem is solved to determine the measurable parameters from known data. In the second stage, the adaptive control Ensemble Kalman Filter is implemented to combine the optimality of observation data in order to obtain the accurate estimation of the topography. The main features of this method are on one hand, the ability to solve for different complex geometries with no need for any rearrangements in the original model to rewrite it in an explicit form. On the other hand, its achievement of strong stability for simulations of flows in different regimes containing shocks or discontinuities over any geometry. Numerical results are presented for a dam-break flow problem over non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated using different numbers of loops, sensitivity parameters, initial samples and location of observations. The obtained results demonstrate high reliability and accuracy of the proposed techniques.Keywords: erodible beds, finite element method, finite volume method, nonlinear elasticity, shallow water equations, stresses in soil
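A highly simplified sketch of an ensemble Kalman analysis step of the kind combined with the forward shallow-water solver in this approach; the linear observation operator, noise levels and dimensions are all assumptions made for illustration, not the authors' model.

# Ensemble Kalman analysis step: an ensemble of bed-elevation vectors is corrected
# using perturbed free-surface observations and a placeholder observation operator H.
import numpy as np

rng = np.random.default_rng(3)
n_ens, n_bed, n_obs = 50, 20, 5

bed_ens = rng.normal(0.0, 0.1, size=(n_ens, n_bed))   # ensemble of bed estimates
H = rng.normal(size=(n_obs, n_bed)) / n_bed            # placeholder observation operator
obs = rng.normal(0.0, 0.05, size=n_obs)                # free-surface measurements
obs_err = 0.01

mean = bed_ens.mean(axis=0)
A = bed_ens - mean                                      # ensemble anomalies
P = A.T @ A / (n_ens - 1)                               # sample covariance

S = H @ P @ H.T + obs_err**2 * np.eye(n_obs)
K = P @ H.T @ np.linalg.solve(S, np.eye(n_obs))         # Kalman gain
for i in range(n_ens):
    perturbed_obs = obs + rng.normal(0.0, obs_err, size=n_obs)
    bed_ens[i] += K @ (perturbed_obs - H @ bed_ens[i])

print("posterior bed estimate (first entries):", bed_ens.mean(axis=0)[:5])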
Procedia PDF Downloads 130
1744 B Spline Finite Element Method for Drifted Space Fractional Tempered Diffusion Equation
Authors: Ayan Chakraborty, BV. Rathish Kumar
Abstract:
Of late, many models in viscoelasticity, signal processing or anomalous diffusion are formulated in fractional calculus. Tempered fractional calculus is a generalization of fractional calculus, and in the last few years several important partial differential equations occurring in different fields of science have been reconsidered in these terms, such as diffusion-wave equations, the Schrödinger equation and so on. In the present paper, a time-dependent tempered fractional diffusion equation of order γ ∈ (0,1) with a forcing function is considered. Existence, uniqueness, stability, and regularity of the solution have been proved. Crank-Nicolson discretization is used in the time direction, and a B-spline finite element approximation is implemented. Generally, B-spline bases are useful for representing the geometry of a finite element model and for interfacing with a finite element analysis program. Utilizing this technique, an a priori space-time estimate in finite element analysis has been derived, and we prove that the convergence order is O(h² + T²), where h is the spatial step size and T is the time step. A couple of numerical examples are presented to confirm the accuracy of the theoretical results. Finally, we conclude that the studied method is useful for solving tempered fractional diffusion equations. Keywords: B-spline finite element, error estimates, Gronwall's lemma, stability, tempered fractional
Procedia PDF Downloads 192
1743 The Evolution of Amazon Alexa: From Voice Assistant to Smart Home Hub
Authors: Abrar Abuzaid, Maha Alaaeddine, Haya Alesayi
Abstract:
This project is centered around understanding the usage and impact of Alexa, Amazon's popular virtual assistant, in everyday life. Alexa, known for its integration into devices like Amazon Echo, offers functionalities such as voice interaction, media control, providing real-time information, and managing smart home devices. Our primary focus is to conduct a straightforward survey aimed at uncovering how people use Alexa in their daily routines. We plan to reach out to a wide range of individuals to get a diverse perspective on how Alexa is being utilized for various tasks, the frequency and context of its use, and the overall user experience. The survey will explore the most common uses of Alexa, its impact on daily life, features that users find most beneficial, and improvements they are looking for. This project is not just about collecting data but also about understanding the real-world applications of a technology like Alexa and how it fits into different lifestyles. By examining the responses, we aim to gain a practical understanding of Alexa's role in homes and possibly in workplaces. This project will provide insights into user satisfaction and areas where Alexa could be enhanced to meet the evolving needs of its users. It’s a step towards connecting technology with everyday life, making it more accessible and user-friendlyKeywords: Amazon Alexa, artificial intelligence, smart speaker, natural language processing
Procedia PDF Downloads 62
1742 Evaluation of the Incorporation of Modified Starch in Puff Pastry Dough by Mixolab Rheological Analysis
Authors: Alejandra Castillo-Arias, Carlos A. Fuenmayor, Carlos M. Zuluaga-Domínguez
Abstract:
The connection between health and nutrition has driven the food industry to explore healthier and more sustainable alternatives. Key strategies to enhance nutritional quality and extend shelf life include reducing saturated fats and incorporating natural ingredients. One area of focus is the use of modified starch in baked goods, which has attracted significant interest in food science and industry due to its functional benefits. Modified starches are commonly used for their gelling, thickening, and water-retention properties. Derived from sources like waxy corn, potatoes, tapioca, or rice, these polysaccharides improve thermal stability and resistance to dough. The use of modified starch enhances the texture and structure of baked goods, which is crucial for consumer acceptance. In this study, it was evaluated the effects of modified starch inclusion on dough used for puff pastry elaboration, measured with Mixolab analysis. This technique assesses flour quality by examining its behavior under varying conditions, providing a comprehensive profile of its baking properties. The analysis included measurements of water absorption capacity, dough development time, dough stability, softening, final consistency, and starch gelatinization. Each of these parameters offers insights into how the flour will perform during baking and the quality of the final product. The performance of wheat flour with varying levels of modified starch inclusion (10%, 20%, 30%, and 40%) was evaluated through Mixolab analysis, with a control sample consisting of 100% wheat flour. Water absorption, gluten content, and retrogradation indices were analyzed to understand how modified starch affects dough properties. The results showed that the inclusion of modified starch increased the absorption index, especially at levels above 30%, indicating a dough with better handling qualities and potentially improved texture in the final baked product. However, the reduction in wheat flour resulted in a lower kneading index, affecting dough strength. Conversely, incorporating more than 20% modified starch reduced the retrogradation index, indicating improved stability and resistance to crystallization after cooling. Additionally, the modified starch improved the gluten index, contributing to better dough elasticity and stability, providing good structural support and resistance to deformation during mixing and baking. As expected, the control sample exhibited a higher amylase index, due to the presence of enzymes in wheat flour. However, this is of low concern in puff pastry dough, as amylase activity is more relevant in fermented doughs, which is not the case here. Overall, the use of modified starch in puff pastry enhanced product quality by improving texture, structure, and shelf life, particularly when used at levels between 30% and 40%. This research underscores the potential of modified starches to address health concerns associated with traditional starches and to contribute to the development of higher-quality, consumer-friendly baked products. Furthermore, the findings suggest that modified starches could play a pivotal role in future innovations within the baking industry, particularly in products aiming to balance healthfulness with sensory appeal. 
By incorporating modified starch into their formulations, bakeries can meet the growing demand for healthier, more sustainable products while maintaining the indulgent qualities that consumers expect from baked goods.Keywords: baking quality, dough properties, modified starch, puff pastry
Procedia PDF Downloads 22
1741 Comparing UV-based and O₃-based AOPs for Removal of Emerging Contaminants from Food Processing Digestate Sludge
Authors: N. Moradi, C. M. Lopez-Vazquez, H. Garcia Hernandez, F. Rubio Rincon, D. Brdanovic, Mark van Loosdrecht
Abstract:
Advanced oxidation processes have been widely used for disinfection, removal of residual organic material, and removal of emerging contaminants from drinking water and wastewater. Yet, the application of these technologies to sludge treatment processes has not gained enough attention, mostly owing to the complexity of the sludge matrix. In this research, ozone and UV/H₂O₂ treatment were applied for the removal of emerging contaminants from a digestate supernatant. The removal of the following compounds was assessed: (i) salicylic acid (SA) (a surrogate of non-steroidal anti-inflammatory drugs (NSAIDs)), and (ii) sulfamethoxazole (SMX), sulfamethazine (SMN), and tetracycline (TCN) (the most frequent human and animal antibiotics). The ozone treatment was carried out in a plexiglass bubble column reactor with a capacity of 2.7 L; the system was equipped with a stirrer and a gas diffuser. The UV and UV/H₂O₂ treatments were done using an LED set-up (PearlLab beam device) dosing H₂O₂. In the ozone treatment evaluations, 95% of the three antibiotics were removed during the first 20 min of exposure time, while an SA removal of 91% occurred after 8 hours of exposure time. In the UV treatment evaluations, when adding the optimum dose of hydrogen peroxide (H₂O₂:COD molar ratio of 0.634), 36% of SA, 82% of TCN, and more than 90% of both SMX and SMN were removed after 8 hours of exposure time. This study concluded that O₃ was more effective than UV/H₂O₂ in removing emerging contaminants from the digestate supernatant. Keywords: digestate sludge, emerging contaminants, ozone, UV-AOP
Procedia PDF Downloads 102
1740 Occupational Exposure and Contamination to Antineoplastic Drugs of Healthcare Professionals in Mauritania
Authors: Antoine Villa, Moustapha Mohamedou, Florence Pilliere, Catherine Verdun-Esquer, Mathieu Molimard, Mohamed Sidatt Cheikh El Moustaph, Mireille Canal-Raffin
Abstract:
Context: In Mauritania, the activity of the National Center of Oncology (NCO) has steadily risen leading to an increase in the handling of antineoplastic drugs (AD) by healthcare professionals. In this context, the AD contamination of those professionals is a major concern for occupational physicians. It has been evaluated using biological monitoring of occupational exposure (BMOE). Methods: The intervention took place in 2015, in 2 care units, and evaluated nurses preparing and/or infusing AD and agents in charge of hygiene. Participants provided a single urine sample, at the end of the week, at the end of their shift. Five molecules were sought using specific high sensitivity methods (UHPLC-MS/MS) with very low limits of quantification (LOQ) (cyclophosphamide (CP), Ifosfamide (IF), methotrexate (MTX): 2.5ng/L; doxorubicin (Doxo): 10ng/L; α-fluoro-β-alanine (FBAL, 5-FU metabolite): 20ng/L). A healthcare worker was considered as 'contaminated' when an AD was detected at a urine concentration equal to or greater than the LOQ of the analytical method or at trace concentration. Results: Twelve persons participated (6 nurses, 6 agents in charge of hygiene). Twelve urine samples were collected and analyzed. The percentage of contamination was 66.6% for all participants (n=8/12), 100% for nurses (6/6) and 33% for agents in charge of hygiene (2/6). In 62.5% (n=5/8) of the contaminated workers, two to four of the AD were detected in the urine. CP was found in the urine of all contaminated workers. FBAL was found in four, MTX in three and Doxo in one. Only IF was not detected. Urinary concentrations (all drugs combined) ranged from 3 to 844 ng/L for nurses and from 3 to 44 ng/L for agents in charge of hygiene. The median urinary concentrations were 87 ng/L, 15.1 ng/L and 4.4 ng/L for FBAL, CP and MTX, respectively. The Doxo urinary concentration was found 218ng/L. Discussion: There is no current biological exposure index for the interpretation of AD contamination. The contamination of these healthcare professionals is therefore established by the detection of one or more AD in urine. These urinary contaminations are higher than the LOQ of the analytical methods, which must be as low as possible. Given the danger of AD, the implementation of corrective measures is essential for the staff. Biological monitoring of occupational exposure is the most reliable process to identify groups at risk, tracing insufficiently controlled exposures and as an alarm signal. These results show the necessity to educate professionals about the risks of handling AD and/or to care for treated patients.Keywords: antineoplastic drugs, Mauritania, biological monitoring of occupational exposure, contamination
Procedia PDF Downloads 316
1739 Morphology Operation and Discrete Wavelet Transform for Blood Vessels Segmentation in Retina Fundus
Authors: Rita Magdalena, N. K. Caecar Pratiwi, Yunendah Nur Fuadah, Sofia Saidah, Bima Sakti
Abstract:
Vessel segmentation of the retinal fundus is important in the biomedical sciences for diagnosing ailments related to the eye. Segmentation can help medical experts diagnose the state of a retinal fundus image. Therefore, in this study, we designed software in MATLAB that enables the segmentation of the retinal blood vessels on retinal fundus images. There are two main steps in the segmentation process. The first step is image preprocessing, which aims to improve the quality of the image so that it can be optimally segmented. The second step is image segmentation, which performs the extraction that retrieves the retina's blood vessels from the eye fundus image. The image segmentation methods analyzed in this study are Morphology Operation, Discrete Wavelet Transform and a combination of both. The data used in this project comprise 40 retinal images and 40 manually segmented images. After several testing scenarios, the average accuracy for the Morphology Operation method is 88.46%, while for the Discrete Wavelet Transform it is 89.28%. By combining the two methods mentioned above, the average accuracy increased to 89.53%. The result of this study is an image processing system that can segment the blood vessels in the retinal fundus with high accuracy and low computation time. Keywords: discrete wavelet transform, fundus retina, morphology operation, segmentation, vessel
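A hedged scikit-image sketch of the morphology-based branch (vessel enhancement by black top-hat followed by Otsu thresholding) on a synthetic image; it is not the authors' MATLAB implementation, and the structuring-element size is an arbitrary choice.

# Morphology-based vessel enhancement: dark elongated structures are highlighted
# with a black top-hat, then thresholded into a binary vessel map.
import numpy as np
from skimage.morphology import black_tophat, disk
from skimage.filters import threshold_otsu

rng = np.random.default_rng(4)
fundus_green = rng.normal(0.6, 0.02, size=(128, 128))   # stand-in green channel
fundus_green[60:68, :] -= 0.2                            # a fake dark "vessel" stripe

enhanced = black_tophat(fundus_green, disk(7))           # highlights dark structures
mask = enhanced > threshold_otsu(enhanced)               # binary vessel map
print("segmented pixels:", int(mask.sum()))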
Procedia PDF Downloads 195
1738 Modeling Bessel Beams and Their Discrete Superpositions from the Generalized Lorenz-Mie Theory to Calculate Optical Forces over Spherical Dielectric Particles
Authors: Leonardo A. Ambrosio, Carlos. H. Silva Santos, Ivan E. L. Rodrigues, Ayumi K. de Campos, Leandro A. Machado
Abstract:
In this work, we propose an algorithm developed under Python language for the modeling of ordinary scalar Bessel beams and their discrete superpositions and subsequent calculation of optical forces exerted over dielectric spherical particles. The mathematical formalism, based on the generalized Lorenz-Mie theory, is implemented in Python for its large number of free mathematical (as SciPy and NumPy), data visualization (Matplotlib and PyJamas) and multiprocessing libraries. We also propose an approach, provided by a synchronized Software as Service (SaaS) in cloud computing, to develop a user interface embedded on a mobile application, thus providing users with the necessary means to easily introduce desired unknowns and parameters and see the graphical outcomes of the simulations right at their mobile devices. Initially proposed as a free Android-based application, such an App enables data post-processing in cloud-based architectures and visualization of results, figures and numerical tables.Keywords: Bessel Beams and Frozen Waves, Generalized Lorenz-Mie Theory, Numerical Methods, optical forces
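A short SciPy sketch of the ordinary scalar zero-order Bessel beam the paper models, psi(rho, z) = J0(k_rho*rho)*exp(i*k_z*z); the wavelength and axicon angle are illustrative values, and the generalized Lorenz-Mie optical-force calculation itself is not reproduced here.

# Evaluate a scalar zero-order Bessel beam profile along the radial coordinate.
import numpy as np
from scipy.special import jv

wavelength = 1064e-9                 # m, illustrative
k = 2 * np.pi / wavelength
axicon_angle = np.deg2rad(1.0)       # illustrative
k_rho, k_z = k * np.sin(axicon_angle), k * np.cos(axicon_angle)

rho = np.linspace(0.0, 50e-6, 200)   # radial coordinate (m)
z = 0.0
field = jv(0, k_rho * rho) * np.exp(1j * k_z * z)
intensity = np.abs(field) ** 2
print("on-axis intensity:", intensity[0])
print("first minimum near rho =", rho[np.argmin(intensity[:100])])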
Procedia PDF Downloads 380
1737 Effect of Thermal Pretreatment on Functional Properties of Chicken Protein Hydrolysate
Authors: Nutnicha Wongpadungkiat, Suwit Siriwatanayotin, Aluck Thipayarat, Punchira Vongsawasdi, Chotika Viriyarattanasak
Abstract:
Chicken products are a major export of Thailand. With dramatically increasing consumption of chicken products worldwide, the chicken meat processing industry generates abundant waste. Recently, much research into the development of value-added products from the chicken meat industry has focused on the production of protein hydrolysate, utilized as a food ingredient for the human diet and animal feed. The present study aimed to determine the effect of thermal pre-treatment on the functional properties of chicken protein hydrolysate. Chicken breasts were heated at 40, 60, 80 and 100ºC prior to hydrolysis by Alcalase at 60ºC and pH 8 for 4 hr. The hydrolysate was freeze-dried and subsequently used for assessment of its functional properties and molecular weight by gel electrophoresis (SDS-PAGE). The results show that increasing the pre-treatment temperature increased oil holding capacity and emulsion stability while decreasing antioxidant activity and water holding capacity. The SDS-PAGE analysis showed evidence of protein aggregation in the hydrolysate treated at higher pre-treatment temperatures. These results suggest a connection between the molecular weight of the hydrolysate and its functional properties. Keywords: chicken protein hydrolysate, enzymatic hydrolysis, thermal pretreatment, functional properties
Procedia PDF Downloads 270
1736 General Time-Dependent Sequenced Route Queries in Road Networks
Authors: Mohammad Hossein Ahmadi, Vahid Haghighatdoost
Abstract:
Spatial databases have been an active area of research over years. In this paper, we study how to answer the General Time-Dependent Sequenced Route queries. Given the origin and destination of a user over a time-dependent road network graph, an ordered list of categories of interests and a departure time interval, our goal is to find the minimum travel time path along with the best departure time that minimizes the total travel time from the source location to the given destination passing through a sequence of points of interests belonging to each of the specified categories of interest. The challenge of this problem is the added complexity to the optimal sequenced route queries, where we assume that first the road network is time dependent, and secondly the user defines a departure time interval instead of one single departure time instance. For processing general time-dependent sequenced route queries, we propose two solutions as Discrete-Time and Continuous-Time Sequenced Route approaches, finding approximate and exact solutions, respectively. Our proposed approaches traverse the road network based on A*-search paradigm equipped with an efficient heuristic function, for shrinking the search space. Extensive experiments are conducted to verify the efficiency of our proposed approaches.Keywords: trip planning, time dependent, sequenced route query, road networks
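A toy Python sketch of time-dependent earliest-arrival routing of the kind these queries build on (an A*-style search with a zero heuristic, i.e. Dijkstra); the tiny graph, the rush-hour travel-time functions and the omission of the category sequence are simplifications for illustration only.

# Time-dependent earliest-arrival search: edge travel times depend on the
# departure time at the edge's tail.
import heapq

# travel_time[(u, v)] is a function of departure time t (minutes since midnight)
travel_time = {
    ("s", "a"): lambda t: 10 + 5 * (8 * 60 <= t <= 9 * 60),   # slower in rush hour
    ("a", "d"): lambda t: 7,
    ("s", "b"): lambda t: 12,
    ("b", "d"): lambda t: 6 + 4 * (8 * 60 <= t <= 9 * 60),
}
graph = {}
for (u, v) in travel_time:
    graph.setdefault(u, []).append(v)

def earliest_arrival(source, target, depart):
    best = {source: depart}
    heap = [(depart, source)]
    while heap:
        t, u = heapq.heappop(heap)
        if u == target:
            return t
        if t > best.get(u, float("inf")):
            continue
        for v in graph.get(u, []):
            arrival = t + travel_time[(u, v)](t)
            if arrival < best.get(v, float("inf")):
                best[v] = arrival
                heapq.heappush(heap, (arrival, v))
    return None

print(earliest_arrival("s", "d", depart=8 * 60 + 30))  # departing 08:30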
Procedia PDF Downloads 321
1735 Biotech Processes to Recover Valuable Fraction from Buffalo Whey Usable in Probiotic Growth, Cosmeceutical, Nutraceutical and Food Industries
Authors: Alberto Alfano, Sergio D’ambrosio, Darshankumar Parecha, Donatella Cimini, Chiara Schiraldi.
Abstract:
The main objective of this study regards the setup of an efficient small-scale platform for the conversion of local renewable waste materials, such as whey, into added-value products, thereby reducing environmental impact and costs deriving from the disposal of processing waste products. The buffalo milk whey derived from the cheese-making process, called second cheese whey, is the main by-product of the dairy industry. Whey is the main and most polluting by-product obtained from cheese manufacturing consisting of lactose, lactic acid, proteins, and salts, making whey an added-value product. In Italy, and in particular, in the Campania region, soft cheese production needs a large volume of liquid waste, especially during late spring and summer. This project is part of a circular economy perspective focused on the conversion of potentially polluting and difficult to purify waste into a resource to be exploited, and it embodies the concept of the three “R”: reduce, recycle, and reuse. Special focus was paid to the production of health-promoting biomolecules and biopolymers, which may be exploited in different segments of the food and pharmaceutical industries. These biomolecules may be recovered through appropriate processes and reused in an attempt to obtain added value products. So, ultrafiltration and nanofiltration processes were performed to fractionate bioactive components starting from buffalo milk whey. In this direction, the present study focused on the implementation of a downstream process that converts waste generated from food and food processing industries into added value products with potential applications. Owing to innovative downstream and biotechnological processes, rather than a waste product may be considered a resource to obtain high added value products, such as food supplements (probiotics), cosmeceuticals, biopolymers, and recyclable purified water. Besides targeting gastrointestinal disorders, probiotics such as Lactobacilli have been reported to improve immunomodulation and protection of the host against infections caused by viral and bacterial pathogens. Interestingly, also inactivated microbial (probiotic) cells and their metabolic products, indicated as parabiotic and postbiotics, respectively, have a crucial role and act as mediators in the modulation of the host’s immune function. To boost the production of biomass (both viable and/or heat inactivated cells) and/or the synthesis of growth-related postbiotics, such as EPS, efficient and sustainable fermentation processes are necessary. Based on a “zero-waste” approach, wastes generated from local industries can be recovered and recycled to develop sustainable biotechnological processes to obtain probiotics as well as post and parabiotic, to be tested as bioactive compounds against gastrointestinal disorders. The results have shown it was possible to recover an ultrafiltration retentate with suitable characteristics to be used in skin dehydration, to perform films (i.e., packaging for food industries), or as a wound repair agent and a nanofiltration retentate to recover lactic acid and carbon sources (e.g., lactose, glucose..) used for microbial cultivation. On the side, the last goal is to obtain purified water that can be reused throughout the process. In fact, water reclamation and reuse provide a unique and viable opportunity to augment traditional water supplies, a key issue nowadays.Keywords: biotech process, downstream process, probiotic growth, from waste to product, buffalo whey
Procedia PDF Downloads 69