Search results for: Bruno Francois
115 Analysis of the Presence of Alkylglycerols by Gas Chromatography in Ostrich Oil
Authors: Luana N. Cardozo, Debora A. S. Coutinho, Fabiola Lagher, Bruno J. G. Silva, Ivonilce Venture, Mainara Tesser, Graciela Venera
Abstract:
Ostrich oil is used as food in Brazil, and it has been the subject of scientific research because it contains essential fatty acids (Omega 3, 6, 7, and 9), which provide benefits to human health. Alkylglycerols are lipid ethers consisting of a saturated or unsaturated hydrocarbon chain joined by an ether-type bond to one of the glycerol hydroxyls. It is known that supplementation with alkylglycerols can act significantly on the functioning of immune system cells, both in pathological situations and in homeostasis. Objective: Analyze the presence of alkylglycerols in ostrich oil. Methods: The ostrich oil was bought from an industry that manufactures the product for sale as food, located in Mirante da Serra, northern Brazil. The samples were sent for analysis to the chemistry department of the Federal University of Paraná, where they were analyzed by gas chromatography. Results: The analysis of the ostrich oil showed alkylglycerols with a peak area of 514505154. By comparison, shark liver oil shows a peak area of 26190196, and the difference between the two is highly significant. Conclusion: The importance of alkylglycerol supplementation for the immune system is known. The analysis of the results made it possible to verify the presence of alkylglycerols in the ostrich oil, at a level five times higher than in shark liver oil, which until now was considered the largest food source but has been surpassed by ostrich oil. The present study emphasizes that ostrich oil can be considered a food source of alkylglycerols and may play a promising role in the immune system because it contains this substance, but further studies are needed to prove its action in the body.
Keywords: ostrich oil, nutritional composition, alkylglycerols, food
114 Local Differential Privacy-Based Data-Sharing Scheme for Smart Utilities
Authors: Veniamin Boiarkin, Bruno Bogaz Zarpelão, Muttukrishnan Rajarajan
Abstract:
The manufacturing sector is a vital component of most economies, which makes it the target of a large number of cyberattacks, and disruption in operation may lead to significant economic consequences. Adversaries aim to disrupt the production processes of manufacturing companies, gain financial advantages, and steal intellectual property by getting unauthorised access to sensitive data. Access to sensitive data helps organisations to enhance their production and management processes. However, the majority of the existing data-sharing mechanisms are either susceptible to different cyber attacks or heavy in terms of computation overhead. In this paper, a privacy-preserving data-sharing scheme for smart utilities is proposed. First, a customer’s privacy adjustment mechanism is proposed to make sure that end-users have control over their privacy, which is required by the latest government regulations, such as the General Data Protection Regulation. Secondly, a local differential privacy-based mechanism is proposed to ensure the privacy of the end-users by hiding real data based on the end-user preferences. The proposed scheme may be applied to different industrial control systems, whereas in this study, it is validated for energy utility use cases consisting of smart, intelligent devices. The results show that the proposed scheme may guarantee the required level of privacy with an expected relative error in utility.
Keywords: data-sharing, local differential privacy, manufacturing, privacy-preserving mechanism, smart utility
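The abstract does not specify the perturbation mechanism itself. As a minimal sketch of the local differential privacy idea it builds on, the Laplace mechanism below adds user-side noise scaled to a per-user privacy budget epsilon; the readings, budgets, and sensitivity are illustrative assumptions, not the paper's scheme:
```python
import numpy as np

def ldp_perturb(reading, epsilon, sensitivity=1.0):
    """Laplace mechanism: add noise calibrated to the user's privacy budget.

    A smaller epsilon (stricter privacy preference) means more noise.
    """
    return reading + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# Hypothetical smart-meter readings (kWh) and per-user privacy preferences.
rng = np.random.default_rng(0)
readings = rng.uniform(0.1, 3.0, size=10_000)
epsilons = rng.choice([0.5, 1.0, 2.0], size=readings.size)  # user-adjustable

noisy = np.array([ldp_perturb(r, e) for r, e in zip(readings, epsilons)])

# The utility aggregates only noisy data; the noise averages out over users.
true_total, est_total = readings.sum(), noisy.sum()
print(f"relative error in the aggregate: {abs(est_total - true_total) / true_total:.4%}")
```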
113 Haematological Responses on Amateur Cycling Stages Race
Authors: Renato André S. Silva, Nana L. F. Sampaio, Carlos J. G. Cruz, Bruno Vianna, Flávio O. Pires
Abstract:
Multiple-stage bicycle races place high physiological loads on professional cyclists. Such demands can lead to immunosuppression and health problems. However, in this type of competition, little is known about the physiological effects on amateur athletes, who generally receive less medical support. Thus, this study analyzes the hematological effects of a multiple-stage bicycle race on amateur cyclists. Seven Brazilian national amateur cyclists (34 ± 4.21 years) underwent a laboratory test to evaluate VO2Max (69.89 ± 7.43 ml⋅kg⁻¹⋅min⁻¹). Six days later, these volunteers raced in the Tour of Goiás, riding five races in four days (435 km) of competition. Arterial blood samples were collected one day before and one day after the competition. The Kolmogorov-Smirnov test was used to evaluate the data distribution and the Wilcoxon test to compare the two collection moments (p < 0.05). The results show: red cells ↓ 7.8% (5.1 ± 0.28 vs 4.7 ± 0.37 10⁶/mm³, p = 0.01); hemoglobin ↓ 7.9% (15.1 ± 0.31 vs 13.9 ± 0.27 g/dL, p = 0.01); leukocytes ↑ 9.5% (4946 ± 553 vs 5416 ± 1075 /mm³, p = 0.17); platelets ↓ 7.0% (200.2 ± 51.5 vs 186.1 ± 39.5 /mm³, p = 0.01); LDH ↑ 11% (164.4 ± 28.5 vs 182.5 ± 20.5 U/L, p = 0.17); CK ↑ 13.5% (290.7 ± 206.1 vs 330.1 ± 90.5 U/L, p = 0.39); CK-MB ↑ 2% (15.7 ± 3.9 vs 20.1 ± 2.9 U/L, p = 0.06); cortisol ↓ 13.5% (12.1 ± 2.4 vs 9.9 ± 1.9 μg/dL, p = 0.01); total testosterone ↓ 7% (453.6 ± 120.1 vs 421.7 ± 74.3 ng/dL, p = 0.12); IGF-1 ↓ 15.1% (213.8 ± 18.8 vs 181.5 ± 34.7 ng/mL, p = 0.04). This indicates significant reductions in O₂ transport capacity, in the capacity to respond to vascular injury, and a likely reduction of skeletal muscle anabolism, along with maintenance and/or slight elevation of immune function, glucose and lipid metabolism, and myocardial damage markers. Therefore, the results suggest that no abnormal health effect was identified among the athletes after participating in the Tour of Goiás.
Keywords: cycling, health effects, cycling stages races, haematology
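The statistical procedure described (paired comparison of pre- and post-race values with the Wilcoxon test at p < 0.05) can be sketched in a few lines of Python; the hemoglobin values below are made up for illustration, not the study's data:
```python
from scipy.stats import wilcoxon

# Hypothetical paired hemoglobin values (g/dL) for 7 cyclists, pre/post race.
pre  = [15.0, 15.3, 14.9, 15.2, 15.5, 14.8, 15.1]
post = [13.8, 14.1, 13.7, 14.0, 14.2, 13.6, 13.9]

stat, p = wilcoxon(pre, post)  # non-parametric test for paired samples
print(f"Wilcoxon statistic = {stat}, p = {p:.3f}, significant: {p < 0.05}")
```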
112 Selectivity Mechanism of Cobalt Precipitation by an Imidazole Linker From an Old Battery Solution
Authors: Anna-Caroline Lavergne-Bril, Jean-François Colin, David Peralta, Pascale Maldivi
Abstract:
Cobalt is a critical material, widely used in Li-ion batteries. Due to the planned electrification of European vehicles, cobalt needs are expanding, and resources are limited. To meet the coming needs in cobalt, it is necessary to develop new, efficient ways to recycle it. One of the biggest sources comes from old electric vehicle batteries (batteries sold in 2019: 500 000 tons of waste to be). A closed-loop process of cobalt recycling has been developed, and this presentation aims to present the selectivity mechanism of cobalt over manganese and nickel in solution. Cobalt precipitation as a ZIF material (zeolitic imidazolate framework) from a starting solution composed of equimolar nickel, manganese and cobalt is studied. A 2-MeIm (2-methylimidazole) linker is introduced into a multimetallic Ni, Mn, Co solution, and the resulting ZIF-67 is 100% pure Co among its metallic centers. The selectivity of Co over Ni is studied experimentally, and DFT modelling calculations are conducted to understand the geometry of the ligand-metal-solvent complexes in solution. The selectivity of Co over Mn is studied experimentally, and DFT modelling calculations are conducted to understand the link between the pKa of the ligand and the precipitation of Mn impurities within the final material. These calculations open the way to other ligands being used in the same process, with more efficiency. Experimental materials are synthesized from bimetallic (Ni²⁺/Co²⁺, Mn²⁺/Co²⁺, Mn²⁺/Ni²⁺) solutions. Their crystallographic structure is analysed by X-ray diffraction (Brüker AXS D8 diffractometer, Cu anticathode). Morphology is studied by scanning electron microscopy, using a LEO 1530 FE-SEM microscope. The chemical analysis is performed using ICP-OES (Agilent Technologies 700 series ICP-OES). The modelling calculations are DFT (density functional theory) calculations, using B3LYP, conducted with Orca 4.2.
Keywords: MOFs, ZIFs, recycling, closed-loop, cobalt, li-ion batteries
111 Structural Morphing on High Performance Composite Hydrofoil to Postpone Cavitation
Authors: Fatiha Mohammed Arab, Benoit Augier, Francois Deniset, Pascal Casari, Jacques Andre Astolfi
Abstract:
For top high-performance foiling yachts, cavitation is often a limiting factor for take-off and top speed. This work investigates solutions to delay the onset of cavitation thanks to structural morphing. The structural morphing is based on compliant leading and trailing edges, with an effect similar to flaps. It is shown here that the commonly accepted effect of flaps regarding the control of lift and drag forces can also be used to postpone the inception of cavitation. A numerical and experimental study is conducted in order to assess the effect of the geometric parameters of hydrofoils on their hydrodynamic performance and on cavitation inception. The effect of a 70% trailing edge flap and a 30% leading edge flap on a NACA 0012 is investigated using the Xfoil software at a constant Reynolds number of 10⁶. The simulations were carried out for a range of flap deflections and various angles of attack. The results showed that the lift coefficient increases with increasing flap deflection, but also with increasing angle of attack, and that the flaps enlarge the cavitation bucket. To evaluate the accuracy of the Xfoil software, a 2D flow analysis over a NACA 0012 with leading and trailing edge flaps was performed with the Fluent software. The results of the two methods are in good agreement. To validate the numerical approach, a passive adaptive composite model was built and tested in the hydrodynamic tunnel at the Research Institute of the French Naval Academy. The model shows the ability to reproduce the effect of flaps through leading and trailing edge structural morphing under hydrodynamic loading.
Keywords: cavitation, flaps, hydrofoil, panel method, xfoil
110 Experimental Set-up for the Thermo-Hydric Study of a Wood Chips Bed Crossed by an Air Flow
Authors: Dimitri Bigot, Bruno Malet-Damour, Jérôme Vigneron
Abstract:
Many studies have been made about using bio-based materials in buildings. The goal is to reduce a building's environmental footprint by analyzing its life cycle, which can lead to lower carbon emissions or energy consumption. A previous work proposed to numerically study the feasibility of using wood chips to regulate relative humidity inside a building. It showed the capability of a wood chips bed to regulate humidity inside the building and to improve thermal comfort, and so potentially to reduce building energy consumption. However, it also showed that some physical parameters of the wood chips must be identified to validate the proposed model and the associated results. This paper presents an experimental set-up able to study such a wood chips bed under different solicitations. It consists of a simple duct filled with wood chips and crossed by an air flow with variable temperature and relative humidity. Its main objective is to study the thermal behaviour of the wood chips bed by controlling the temperature and relative humidity of the air that enters it and by observing the same parameters at the output. First, the experimental set-up is described in the light of previous results. A focus is made on the particular properties that have to be characterized. Then some case studies are presented in relation to the previous results in order to identify the key physical properties. Finally, the feasibility of the proposed technology is discussed, and some model validation paths are given.
Keywords: wood chips bed, experimental set-up, bio-based material, desiccant, relative humidity, water content, thermal behaviour, air treatment
109 Construction and Validation of a Hybrid Lumbar Spine Model for the Fast Evaluation of Intradiscal Pressure and Mobility
Authors: Dicko Ali Hamadi, Tong-Yette Nicolas, Gilles Benjamin, Faure Francois, Palombi Olivier
Abstract:
A novel hybrid model of the lumbar spine, allowing fast static and dynamic simulations of the disc pressure and the spine mobility, is introduced in this work. Our contribution is to combine rigid bodies, deformable finite elements, articular constraints, and springs into a unique model of the spine. Each vertebra is represented by a rigid body controlling a surface mesh to model contacts on the facet joints and the spinous process. The discs are modeled using a heterogeneous tetrahedral finite element model. The facet joints are represented as elastic joints with six degrees of freedom, while the ligaments are modeled using non-linear one-dimensional elastic elements. The challenge we tackle is to make these different models interact efficiently while respecting the principles of anatomy and mechanics. The mobility, the intradiscal pressure, the facet joint force and the instantaneous center of rotation of the lumbar spine are validated against experimental and theoretical results from the literature in flexion, extension, lateral bending and axial rotation. Our hybrid model greatly simplifies the modeling task and dramatically accelerates the simulation of the pressure within the discs, as well as the evaluation of the range of motion and the instantaneous centers of rotation, without penalizing precision. These results suggest that for some types of biomechanical simulations, simplified models allow far easier modeling and faster simulations compared to usual full-FEM approaches, without any loss of accuracy.
Keywords: hybrid, modeling, fast simulation, lumbar spine
108 Hijabs, Burqas and Burqinis: Freedom of Religious Expression In The French Public Sphere
Authors: John Tate
Abstract:
In 2004, the French Parliament banned the “hijab” in public schools, and in 2010 it prohibited the “burqa” and “niqab” in “public places.” The result was a “secular” outcome involving the removal of these garments, often identified with Islamic religious and cultural practice, from the French public sphere. Yet in 2016, the French local council bans on the “burqini” were overruled by France’s highest administrative court, the Conseil d’État, allowing for their retention in the public sphere. Unlike the burqa and hijab bans, the burqini bans produced significant divisions at the highest echelons of the French political class, with the Prime Minister, Manuel Valls, and the President, François Hollande, finding themselves at odds on the issue. This article seeks to achieve four aims: (a) to explain the contrary outcomes between key French state institutions, such as the Conseil d’État and the French Parliament, concerning the hijab and burqa bans, and the Conseil d’État and French local councils, concerning the burqini bans; (b) to do so by identifying two qualitatively distinct, and at times incompatible, conceptions of laïcité, present within official French public discourse, and applied by these French state institutions to underwrite these respective outcomes; (c) to explain why, given these contrary conceptions of laïcité and these contrary outcomes, the widespread identification of laïcité with “secularism” is both misleading and inaccurate; and (d) to provide an explanation of why senior members of the French political class were divided on the burqini bans when they were not divided on the nation-wide prohibitions of the hijab in public schools and the burqa in public places. In regard to this last question, the article asks why the burqini was “different”.
Keywords: liberalism, republicanism, laïcité, citizenship
107 Enhancement of Light Extraction of Luminescent Coating by Nanostructuring
Authors: Aubry Martin, Nehed Amara, Jeff Nyalosaso, Audrey Potdevin, François Réveret, Michel Langlet, Genevieve Chadeyron
Abstract:
Energy-saving lighting devices based on Light-Emitting Diodes (LEDs) combine a semiconductor chip emitting in the ultraviolet or blue wavelength region with one or more phosphor(s) deposited in the form of coatings. The most common ones combine a blue LED with the yellow phosphor Y₃Al₅O₁₂:Ce³⁺ (YAG:Ce) and a red phosphor. Even if these devices are characterized by satisfying photometric parameters (Color Rendering Index, Color Temperature) and good luminous efficiencies, further improvements can be made to enhance light extraction efficiency (increase in phosphor forward emission). One of the possible strategies is to pattern the phosphor coatings. Here, we have worked on different ways to nanostructure the coating surface. On the one hand, we used colloidal lithography combined with the Langmuir-Blodgett technique to directly pattern the surface of YAG:Tb³⁺ sol-gel derived coatings, YAG:Tb³⁺ being used as a model phosphor. On the other hand, we achieved composite architectures combining YAG:Ce coatings and ZnO nanowires. Structural, morphological and optical properties of both systems have been studied and compared to flat YAG coatings. In both cases, nanostructuring brought a significant enhancement of the photoluminescence properties under UV or blue radiation. In particular, angle-resolved photoluminescence measurements have shown that nanostructuring modifies the photon path within the coatings, with a better extraction of the guided modes. These two strategies have the advantage of being versatile and applicable to any phosphor synthesizable by the sol-gel technique. They thus appear as promising ways to enhance the luminescence efficiencies of both phosphor coatings and the optical devices into which they are incorporated, such as LED-based lighting or safety devices.
Keywords: phosphor coatings, nanostructuring, light extraction, ZnO nanowires, colloidal lithography, LED devices
106 Normalized Enterprises Architectures: Portugal's Public Procurement System Application
Authors: Tiago Sampaio, André Vasconcelos, Bruno Fragoso
Abstract:
The Normalized Systems Theory, which is designed to be applied to software architectures, provides a set of theorems, elements and rules with the purpose of enabling evolution in information systems, as well as ensuring that they are ready for change. In order to make that possible, this work’s solution is to apply the Normalized Systems Theory to the domain of enterprise architectures, using ArchiMate. This application is achieved through the adaptation of the elements of this theory, making them artifacts of the modeling language. The theorems are applied through the identification of the viewpoints to be used in the architectures, as well as the transformation of the theory’s encapsulation rules into architectural rules. This way, it is possible to create normalized enterprise architectures, thus fulfilling the needs and requirements of the business. This solution was demonstrated using the Portuguese Public Procurement System. The Portuguese government aims to make this system as fair as possible, allowing every organization to have the same business opportunities. The aim is for every economic operator to have access to all public tenders, which are published in any of the 6 existing platforms, independently of where they are registered. In order to make this possible, we applied our solution to the construction of two different architectures, which are able to fulfil the requirements of the Portuguese government. One of those architectures, TO-BE A, has a Message Broker that performs the communication between the platforms. The other, TO-BE B, represents the scenario in which the platforms communicate with each other directly. Apart from these two architectures, we also represent the AS-IS architecture that demonstrates the current behavior of the Public Procurement System. Our evaluation is based on a comparison between the AS-IS and the TO-BE architectures regarding the fulfillment of the rules and theorems of the Normalized Systems Theory and some quality metrics.
Keywords: archimate, architecture, broker, enterprise, evolvable systems, interoperability, normalized architectures, normalized systems, normalized systems theory, platforms
105 Tailoring of ECSS Standard for Space Qualification Test of CubeSat Nano-Satellite
Authors: B. Tiseo, V. Quaranta, G. Bruno, G. Sisinni
Abstract:
There is an increasing demand for nano-satellite development among universities, small companies, and emerging countries. Low cost and fast delivery are the main advantages of this class of satellites, achieved by the extensive use of commercial-off-the-shelf components. On the other side, the loss of reliability and the poor success rate are limiting the use of nano-satellites to educational and technology demonstration missions rather than commercial purposes. The standardization of nano-satellite environmental testing by tailoring the existing test standards for medium/large satellites is then a crucial step for their market growth. Thus, it is fundamental to find the right trade-off between the improvement of reliability and the need to keep the low-cost/fast-delivery advantages. This is even more essential for satellites of the CubeSat family. Such miniaturized and standardized satellites have a 10 cm cubic form and a mass of no more than 1.33 kilograms per unit (1U). For this class of nano-satellites, the qualification process is mandatory to reduce the risk of failure during a space mission. This paper reports the description and results of the space qualification test campaign performed on EnduroSat's CubeSat nano-satellite and modules. Mechanical and environmental tests have been carried out step by step: from the testing of the single subsystems up to the assembled CubeSat nano-satellite. Functional tests have been performed during the whole test campaign to verify the functionalities of the systems. The test durations and levels have been selected by tailoring the European Space Agency standard ECSS-E-ST-10-03C and GEVS: GSFC-STD-7000A.
Keywords: CubeSat, nano-satellite, shock, testing, vibration
104 Co-payment Strategies for Chronic Medications: A Qualitative and Comparative Analysis at European Level
Authors: Pedro M. Abreu, Bruno R. Mendes
Abstract:
The management of pharmacotherapy and the process of dispensing medicines are becoming critical in clinical pharmacy due to the increase in the incidence and prevalence of chronic diseases, the complexity and customization of therapeutic regimens, the introduction of innovative and more expensive medicines, the unbalanced relation between expenditure and revenue, as well as the lack of rationalization associated with medication use. For these reasons, co-payments emerged in Europe in the 1970s and have been applied over the past few years in healthcare. Co-payments lead to a rationing and rationalization of users' access to healthcare services and products and, simultaneously, to a qualification and improvement of those services and products for the end-user. This analysis, covering hospital practices in particular and co-payment strategies in general, was carried out across all the European regions and identified four reference countries that apply this tool repeatedly and with different approaches. The structure, content and adaptation of European co-payments were analyzed through 7 qualitative attributes and 19 performance indicators, and the results were expressed in a scorecard, allowing the conclusion that the German models (total scores of 68.2% and 63.6% for the two elected co-payments) can achieve greater compliance and effectiveness, the English models (total score of 50%) can be more accessible, and the French models (total score of 50%) can be more adequate to the socio-economic and legal framework. Other European models did not show the same quality and/or performance, so they were not taken as a standard for the future design of co-payment strategies. In this sense, co-payments can be seen as a strategy not only to moderate the consumption of healthcare products and services, but especially to improve them, as well as a strategy to increase the value that the end-user assigns to these services and products, such as medicines.
Keywords: clinical pharmacy, co-payments, healthcare, medicines
103 Parameter and Loss Effect Analysis of Beta Stirling Cycle Refrigerating Machine
Authors: Muluken Z. Getie, Francois Lanzetta, Sylvie Begot, Bimrew T. Admassu
Abstract:
This study is aimed at the numerical analysis of the effects of the phase angle and of losses (shuttle heat loss and gas leakage to the crankcase) that could have an impact on the pressure and temperature of the working fluid of a β-type Stirling cycle refrigerating machine. First, the developed numerical model incorporates into the ideal adiabatic analysis the shuttle heat transfer (heat loss from the compression space to the expansion space) and the gas leakage from the working space to the buffer space in the crankcase. The other losses, which may not have a direct effect on the temperature and pressure of the working fluid, are incorporated through a simplified analysis. The model is then validated by converting it into an engine model and comparing it with literature results for the GPU-3 engine. After validating the model against other engine models and experimental results, the effects of the phase angle, shuttle heat loss and gas leakage on the temperature, pressure, and performance (power requirement, cooling capacity and coefficient of performance) of the refrigerating machine were analyzed, considering the FEMTO 60 Stirling engine as a case study. Shuttle heat loss has a greater effect on the temperature of the working gas; gas leakage to the crankcase has more effect on the pressure of the working spaces; hence both have a considerable impact on the performance of the Stirling cycle refrigerating machine. The optimum coefficient of performance lies between phase angles of 90°-95°, and the optimum cooling capacity between phase angles of 95°-98°.
Keywords: beta configuration, engine model, moderate cooling, stirling refrigerator, validation
102 Integration of an Augmented Reality System for the Visualization of the HRMAS NMR Analysis of Brain Biopsy Specimens Using the Brainlab Cranial Navigation System
Authors: Abdelkrim Belhaoua, Jean-Pierre Radoux, Mariana Kuras, Vincent Récamier, Martial Piotto, Karim Elbayed, François Proust, Izzie Namer
Abstract:
This paper proposes an augmented reality system dedicated to neurosurgery in order to assist the surgeon during an operation. This work is part of the ExtempoRMN project (funded by Bpifrance), which aims at analyzing, during a surgical operation, the metabolic content of tumoral brain biopsy specimens by HRMAS NMR. Patients affected by a brain tumor (glioma) frequently need to undergo an operation in order to remove the tumoral mass. During the operation, the neurosurgeon removes biopsy specimens using image-guided surgery. The biopsy specimens removed are then sent for HRMAS NMR analysis in order to obtain a better diagnosis and prognosis. Image-guided surgery refers to the use of MRI images and a computer to precisely locate and target a lesion (abnormal tissue) within the brain. This is performed using preoperative MRI images and the Brainlab neuro-navigation system. With the patient's MRI images loaded on the Brainlab Cranial neuro-navigation system in the operating theater, surgeons can better identify their approach before making an incision. The Brainlab neuro-navigation tool tracks the position of the instruments in real time and displays it on the patient's MRI data. The results of the biopsy analysis by ¹H HRMAS NMR are then sent back to the operating theater and superimposed on the 3D localization system directly on the MRI images. The method we have developed to communicate between the HRMAS NMR analysis software and Brainlab makes use of a combination of C++, VTK and the Insight Toolkit, using the OpenIGTLink protocol.
Keywords: neuro-navigation, augmented reality, biopsy, Brainlab, HR-MAS NMR
101 Influences of Separation of the Boundary Layer on the Reservoir Pressure in the Shock Tube
Authors: Bruno Coelho Lima, Joao F.A. Martos, Paulo G. P. Toro, Israel S. Rego
Abstract:
The shock tube is a ground facility widely used in aerospace and aeronautics science and technology for studies of gas dynamics and chemical-physical processes in gases at high temperature, explosions, and the dynamic calibration of pressure sensors. A shock tube in its simplest form comprises two tubes of equal cross-section separated by a diaphragm. The diaphragm's function is to separate the two reservoirs at different pressures. The reservoir containing high pressure is called the driver; the low-pressure reservoir is called the driven section. When the diaphragm is broken by the pressure difference, a normal, non-stationary shock wave (named the incident shock wave) forms at the diaphragm location and propagates toward the closed end of the driven section. When this shock wave reaches the closed end of the driven section, it is completely reflected. The reflected shock wave then interacts with the boundary layer that was created by the flow induced by the incident shock wave's passage, and this interaction forces the separation of the boundary layer. The aim of this paper is to analyze the influence of the separation of the boundary layer on the reservoir pressure in the shock tube. A comparison among CFD (Computational Fluid Dynamics), experimental tests and analytical analysis was performed. For the analytical analysis, some routines were created in Python; for the numerical simulations (CFD), Ansys Fluent was used; and the experimental tests were performed in the T1 shock tube located at IEAv (Institute of Advanced Studies).
Keywords: boundary layer separation, moving shock wave, shock tube, transient simulation
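The analytical routines are not listed in the abstract; as a sketch of the kind of calculation involved, the classical ideal shock-tube relation below (assuming identical gases in the driver and driven sections, for brevity) recovers the incident-shock Mach number Ms from the diaphragm pressure ratio p4/p1:
```python
import numpy as np
from scipy.optimize import brentq

def p4_over_p1(Ms, gamma=1.4, a1_over_a4=1.0):
    """Ideal shock-tube equation for identical gases in driver and driven."""
    p2p1 = 1 + 2 * gamma / (gamma + 1) * (Ms**2 - 1)        # normal-shock jump
    bracket = 1 - (gamma - 1) / (gamma + 1) * a1_over_a4 * (Ms - 1 / Ms)
    return p2p1 * bracket ** (-2 * gamma / (gamma - 1))      # expansion fan

# Solve for Ms given a diaphragm pressure ratio of 10 (illustrative value).
Ms = brentq(lambda m: p4_over_p1(m) - 10.0, 1.001, 5.0)
print(f"incident shock Mach number: Ms = {Ms:.3f}")
```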
100 Multivariate Data Analysis for Automatic Atrial Fibrillation Detection
Authors: Zouhair Haddi, Stephane Delliaux, Jean-Francois Pons, Ismail Kechaf, Jean-Claude De Haro, Mustapha Ouladsine
Abstract:
Atrial fibrillation (AF) is considered the most common cardiac arrhythmia and a major public health burden associated with significant morbidity and mortality. Nowadays, telemedical approaches targeting cardiac outpatients place AF among the most challenging medical issues. The automatic, early, and fast detection of AF is still a major concern for healthcare professionals. Several algorithms based on univariate analysis have been developed to detect atrial fibrillation. However, the published results do not show satisfactory classification accuracy. This work aimed at resolving this shortcoming by proposing multivariate data analysis methods for automatic AF detection. Four publicly accessible sets of clinical data (AF Termination Challenge Database, MIT-BIH AF, Normal Sinus Rhythm RR Interval Database, and MIT-BIH Normal Sinus Rhythm Databases) were used for assessment. All time series were segmented into 1-min RR-interval windows, and then four specific features were calculated. Two pattern recognition methods, i.e., Principal Component Analysis (PCA) and a Learning Vector Quantization (LVQ) neural network, were used to develop classification models. PCA, as a feature reduction method, was employed to find important features to discriminate between AF and normal sinus rhythm. Despite its very simple structure, the results show that the LVQ model performs better on the analyzed databases than existing algorithms do, with high sensitivity and specificity (99.19% and 99.39%, respectively). The proposed AF detection holds several interesting properties and can be implemented with just a few arithmetical operations, which makes it a suitable choice for telecare applications.
Keywords: atrial fibrillation, multivariate data analysis, automatic detection, telemedicine
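The exact four features and network settings are not given in the abstract; the sketch below illustrates the pipeline shape described (PCA feature reduction followed by a minimal LVQ1 classifier) on synthetic stand-in data:
```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Hypothetical 4-feature vectors per 1-min RR window (e.g. mean RR, SDNN,
# RMSSD, pNN50); two classes: 0 = normal sinus rhythm, 1 = AF.
X = np.vstack([rng.normal(0, 1, (200, 4)), rng.normal(1.5, 1, (200, 4))])
y = np.array([0] * 200 + [1] * 200)

Xp = PCA(n_components=2).fit_transform(X)        # feature reduction step

# Minimal LVQ1: one prototype per class, pulled toward same-class samples
# and pushed away from other-class samples.
protos = np.array([Xp[y == c].mean(axis=0) for c in (0, 1)])
lr = 0.05
for _ in range(20):
    for xi, yi in zip(Xp, y):
        k = np.argmin(((protos - xi) ** 2).sum(axis=1))  # nearest prototype
        protos[k] += lr * (xi - protos[k]) * (1 if k == yi else -1)

pred = np.argmin(((Xp[:, None, :] - protos) ** 2).sum(axis=2), axis=1)
print(f"training accuracy: {(pred == y).mean():.2%}")
```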
99 Enhancing Solar Fuel Production by CO₂ Photoreduction Using Transition Metal Oxide Catalysts in Reactors Prepared by Additive Manufacturing
Authors: Renata De Toledo Cintra, Bruno Ramos, Douglas Gouvêa
Abstract:
There is a huge global concern about the emission of greenhouse gases, the consequent environmental problems, and the increase in the average temperature of the planet, caused mainly by fossil fuels, of which petroleum derivatives represent a large part. One of the main greenhouse gases, in terms of volume, is CO₂. Recovering a part of this gas through chemical reactions that use sunlight as an energy source, and even producing renewable fuel (such as ethane, methane or ethanol, among others), is a great opportunity. The process of artificial photosynthesis, through the conversion of CO₂ and H₂O into organic products and oxygen using a metallic oxide catalyst under incident sunlight, is one of the promising solutions; therefore, this research is of great relevance. For this reaction to take place efficiently, an optimized reactor was developed through simulation and prior analysis so that the geometry of the internal channel offers an efficient route and allows the reaction to happen in a controlled and optimized way, in continuous flow and with the least possible resistance. The prototype of this reactor can be made of different materials, such as polymers, ceramics and metals, and manufactured through different processes, such as additive manufacturing (3D printing) or CNC machining, among others. To carry out the photocatalysis in the reactors, different types of catalysts will be used, such as ZnO deposited by spray pyrolysis on the lighting window, modified ZnO, TiO₂ and modified TiO₂, among others, aiming to increase the production of organic molecules with the lowest possible energy input.
Keywords: artificial photosynthesis, CO₂ reduction, photocatalysis, photoreactor design, 3D printed reactors, solar fuels
98 Estimation of Exhaust and Non-Exhaust Particulate Matter Emissions' Share from On-Road Vehicles in Addis Ababa City
Authors: Solomon Neway Jida, Jean-Francois Hetet, Pascal Chesse
Abstract:
Vehicular emission is the key source of air pollution in the urban environment, including both fine particles (PM2.5) and coarse particulate matter (PM10). Particulate matter emissions from road traffic comprise emissions from the exhaust tailpipe and emissions due to wear and tear of vehicle parts such as the brakes, tires and clutch, and the re-suspension of dust (non-exhaust emissions). This study estimates the share of these two sources of particulate emissions from on-road vehicles in the Addis Ababa municipality, Ethiopia. To calculate the shares, two methods were applied: the exhaust-tailpipe emissions were calculated using the European emission inventory Tier II method, and the non-exhaust emissions (vehicle tire wear, brake wear, and road surface wear) using the Tier I method. The results show that of the total traffic-related particulate emissions in the city, 63% are emitted from vehicle exhaust and the remaining 37% from non-exhaust sources. Annual road transport exhaust emissions account for around 2394 tons of particles from all vehicle categories. Of the total yearly non-exhaust particulate matter emissions, tire and brake wear account for around 65% and road-surface wear for 35%. Furthermore, vehicle tire and brake wear were responsible for annual emissions of 584.8 tons of coarse particles (PM10) and 314.4 tons of fine particles (PM2.5) in the city, whereas surface wear emissions were responsible for around 313.7 tons of PM10 and 169.9 tons of PM2.5. This suggests that non-exhaust sources might be as significant as exhaust sources and make a considerable contribution to the impact on air quality.
Keywords: Addis Ababa, automotive emission, emission estimation, particulate matters
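Both Tier methods reduce to the same inventory arithmetic: emissions = activity data × emission factor, summed over vehicle categories. A minimal sketch follows; the fleet sizes and factors are illustrative placeholders, not the study's data or the official EMEP/EEA factors:
```python
# Tier-style inventory sketch: emissions = activity x emission factor.
fleet = {
    # category: (number of vehicles, km driven per vehicle per year)
    "passenger_car": (300_000, 12_000),
    "minibus":       (40_000, 30_000),
    "truck":         (25_000, 25_000),
}
ef_exhaust_pm10 = {"passenger_car": 0.03, "minibus": 0.09, "truck": 0.20}  # g/km
ef_wear_pm10    = {"passenger_car": 0.02, "minibus": 0.05, "truck": 0.10}  # g/km

def tonnes(ef):
    """Sum grams over all categories and convert to tonnes (1 t = 1e6 g)."""
    return sum(n * km * ef[c] for c, (n, km) in fleet.items()) / 1e6

ex, wear = tonnes(ef_exhaust_pm10), tonnes(ef_wear_pm10)
print(f"exhaust share: {ex / (ex + wear):.0%}, non-exhaust share: {wear / (ex + wear):.0%}")
```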
97 Mechanical Behavior of Corroded RC Beams Strengthened by NSM CFRP Rods
Authors: Belal Almassri, Amjad Kreit, Firas Al Mahmoud, Raoul François
Abstract:
Corrosion of steel in reinforced concrete leads to several major defects. Firstly, the reduction in the cross-sectional area of the reinforcement and in its ductility results in premature bar failure. Secondly, the expansion of the corrosion products causes concrete cracking and steel–concrete bond deterioration and also affects the bending stiffness of the reinforced concrete members, causing a reduction in the overall load-bearing capacity of the reinforced concrete beams. This paper investigates the validity of a repair technique using Near Surface Mounted (NSM) carbon-fibre-reinforced polymer (CFRP) rods to restore the mechanical performance of corrosion-damaged RC beams. In the NSM technique, the CFRP rods are placed inside pre-cut grooves and are bonded to the concrete with epoxy adhesive. Experimental results were obtained on two beams, both 3 m long and repaired in bending only: a corroded beam that had been exposed to natural corrosion for 25 years and a control beam. Each beam was repaired with one 6-mm-diameter NSM CFRP rod. The beams were tested in a three-point bending test up to failure. Overall stiffness and crack maps were studied before and after the repair. Ultimate capacity, ductility and failure mode were also reviewed. Finally, some comparisons were made between repaired and non-repaired beams in order to assess the effectiveness of the NSM technique. The experimental results showed that the NSM technique improved the overall characteristics (ultimate load capacity and stiffness) of the control and corroded beams and restored sufficient ductility to the repaired corroded elements, thus restoring the safety margin, despite the non-classical mode of failure that occurred in the corroded beam, with separation of the concrete cover due to corrosion products.
Keywords: carbon fibre, corrosion, strength, mechanical testing
96 In-Silico Fusion of Bacillus Licheniformis Chitin Deacetylase with Chitin Binding Domains from Chitinases
Authors: Keyur Raval, Steffen Krohn, Bruno Moerschbacher
Abstract:
Chitin, the biopolymer of N-acetylglucosamine, is the most abundant biopolymer on the planet after cellulose. Industrially, chitin is isolated and purified from the shell residues of shrimps. A deacetylated derivative of chitin, i.e., chitosan, has more market value and applications owing to its solubility and overall cationic charge compared to the parent polymer. On an industrial scale, this deacetylation is performed chemically using alkalis like sodium hydroxide. This reaction is hazardous to the environment owing to its negative impact on the marine ecosystem. A greener option for this process is the enzymatic one. In nature, native chitin is converted to chitosan by chitin deacetylase (CDA). This enzymatic conversion on the industrial scale is, however, hampered by the crystallinity of chitin: the enzymatic action requires the substrate, i.e., chitin, to be soluble, which is technically difficult and an energy-consuming process. In this project, we wanted to address this shortcoming of CDA. To this end, we have modeled a fusion protein combining CDA and an auxiliary protein, the main interest being to increase the accessibility of the enzyme to crystalline chitin. Similar fusion work with chitinases had improved their catalytic ability towards insoluble chitin. In the first step, suitable fusion partners were searched for in the Protein Data Bank (PDB), where the domain architectures were examined. The next step was to create models of the fused product using various in silico techniques. The models were created with MODELLER and evaluated for properties such as the energy or the impairment of the binding sites. A fusion PCR has been designed based on the linker sequences generated by MODELLER, and the construct will be tested for its activity towards insoluble chitin.
Keywords: chitin deacetylase, modeling, chitin binding domain, chitinases
95 Meet Automotive Software Safety and Security Standards Expectations More Quickly
Authors: Jean-François Pouilly
Abstract:
This study addresses the growing complexity of embedded systems and the critical need for secure, reliable software. Traditional cybersecurity testing methods, often conducted late in the development cycle, struggle to keep pace. This talk explores how formal methods, integrated with advanced analysis tools, empower C/C++ developers to: 1) proactively address vulnerabilities and bugs: formal methods and abstract interpretation techniques identify potential weaknesses early in the development process, reducing the reliance on penetration and fuzz testing in later stages; 2) streamline development by focusing on the bugs that matter: with close to no false positives and flaws caught earlier, the need for rework and retesting is minimized, leading to faster development cycles, improved efficiency and cost savings; 3) enhance software dependability: combining static analysis using abstract interpretation, with full context sensitivity and hardware memory awareness, allows for a more comprehensive understanding of potential vulnerabilities, leading to more dependable and secure software. This approach aligns with industry best practices (ISO 26262 or ISO 21434) and empowers C/C++ developers to deliver robust, secure embedded systems that meet the demands of today's and tomorrow's applications. We will illustrate this approach with the TrustInSoft analyzer to show how it accelerates verification for complex cases, reduces user fatigue, and improves developer efficiency, cost-effectiveness, and software cybersecurity. In summary, integrating formal methods and sound analyzers enhances software reliability and cybersecurity, streamlining development in an increasingly complex environment.
Keywords: safety, cybersecurity, ISO 26262, ISO 21434, formal methods
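As a toy illustration of the abstract interpretation idea mentioned (written in Python rather than the C/C++ setting, and in no way TrustInSoft's implementation), an interval domain can bound every possible run of a computation in a single pass:
```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        ps = [a * b for a in (self.lo, self.hi) for b in (other.lo, other.hi)]
        return Interval(min(ps), max(ps))

# Abstractly evaluate y = x * x + x for ALL x in [-3, 5] in one pass.
x = Interval(-3, 5)
y = x * x + x
print(y)  # a sound over-approximation of every concrete result

# A checker can now prove, e.g., that indexing buf[y] with len(buf) > y.hi
# can never overflow, for every possible input -- no fuzzing needed.
```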
94 New Photosensitizers Encapsulated within Arene-Ruthenium Complexes Active in Photodynamic Therapy: Intracellular Signaling and Evaluation in Colorectal Cancer Models
Authors: Suzan Ghaddar, Aline Pinon, Manuel Gallardo-Villagran, Mona Diab-Assaf, Bruno Therrien, Bertrand Liagre
Abstract:
Colorectal cancer (CRC) is the third most common cancer and exhibits a consistently rising incidence worldwide. Despite notable advancements in CRC treatment, frequent side effects and the development of therapy resistance persistently challenge current approaches. Ultimately, innovations in focal therapies remain imperative to enhance patients' overall quality of life. Photodynamic therapy (PDT) emerges as a promising treatment modality, clinically used for the treatment of various cancer types. It relies on the use of photosensitive molecules called photosensitizers (PS), which are photoactivated after accumulation in cancer cells, to induce the production of reactive oxygen species (ROS) that cause cancer cell death. Among commonly used metal-based drugs in cancer therapy, ruthenium (Ru) possesses favorable attributes that underpin its selectivity towards cancer cells and render it suitable for anti-cancer drug design. In vitro studies using distinct arene-Ru complexes encapsulating a porphin PS are conducted on the human HCT116 and HT-29 colorectal cancer cell lines. These studies encompass the evaluation of the antiproliferative effect, ROS production, apoptosis, cell cycle progression, molecular localization, and protein expression. Preliminary results indicated that these complexes exert significant photocytotoxicity on the studied colorectal cancer cell lines, marking them as promising candidates for anti-cancer agents.
Keywords: colorectal cancer, photodynamic therapy, photosensitizers, arene-ruthenium complexes, apoptosis
93 Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control
Authors: Oliver Ohneiser, Francesca De Crescenzio, Gianluca Di Flumeri, Jan Kraemer, Bruno Berberian, Sara Bagassi, Nicolina Sciaraffa, Pietro Aricò, Gianluca Borghini, Fabio Babiloni
Abstract:
An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks compared to today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and reduced situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator and 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording, such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO's active work in front of the human machine interface (HMI). The derived vigilance level and attention focus trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC, as well as the functionalities of the final TE and of the VAC's two sub-components.
Keywords: automation, human factors, air traffic controller, MINIMA, OOTL (Out-Of-The-Loop), EEG (Electroencephalography), HMI (Human Machine Interface)
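The VAC's internals are not detailed in the abstract; one common EEG-based vigilance proxy, sketched below on a synthetic signal, is the (theta + alpha)/beta band-power ratio, which an adaptive system could threshold to trigger task-environment changes. The band choices and threshold logic here are assumptions, not MINIMA's algorithm:
```python
import numpy as np
from scipy.signal import welch

fs = 256                        # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
# Synthetic one-channel "EEG": alpha (10 Hz) plus beta (20 Hz) plus noise.
rng = np.random.default_rng(0)
eeg = (2 * np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 20 * t)
       + 0.5 * rng.normal(size=t.size))

f, psd = welch(eeg, fs=fs, nperseg=2 * fs)

def band_power(lo, hi):
    mask = (f >= lo) & (f < hi)
    return psd[mask].sum() * (f[1] - f[0])   # rectangle-rule integration

theta, alpha, beta = band_power(4, 8), band_power(8, 13), band_power(13, 30)
vigilance_index = (theta + alpha) / beta     # higher value -> lower arousal
print(f"(theta+alpha)/beta = {vigilance_index:.2f}")
```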
92 On Elastic Anisotropy of Fused Filament Fabricated Acrylonitrile Butadiene Styrene Structures
Authors: Joseph Marae Djouda, Ashraf Kasmi, François Hild
Abstract:
Fused filament fabrication is one of the most widespread additive manufacturing techniques because of its low-cost implementation. Its initial development was based on part fabrication with thermoplastic materials. The influence of manufacturing parameters such as the filament orientation through the nozzle, the deposited layer thickness, or the deposition speed on the mechanical properties of the parts has been widely investigated experimentally. Remarkable variations of the anisotropy as a function of the filament path during the fabrication process have been recorded. However, constitutive models describing the mechanical properties are still lacking. In this study, integrated digital image correlation (I-DIC) is used for the identification of the mechanical constitutive parameters of two configurations of ABS samples: +/-45° and a so-called "oriented deposition", in which the filament was deposited so as to follow the principal strain of the sample. An identification scheme based on reducing the gap between simulation and experiment, directly from images recorded on a single sample (single edge notched tension specimen), is developed. Macroscopic and mesoscopic analyses are conducted from images recorded on both sample surfaces during the tensile test. Elastic and elastoplastic models in isotropic and orthotropic frameworks have been established. It appears that, independently of the sample configuration (filament orientation during fabrication), the isotropic elastoplastic model gives a correct description of the behavior of the samples. It is worth noting that this model needs fewer constitutive parameters than the orthotropic elastoplastic model. This leads to the conclusion that the anisotropy of the architectured 3D-printed ABS parts can be neglected when establishing the macroscopic behavior description.
Keywords: elastic anisotropy, fused filament fabrication, acrylonitrile butadiene styrene, I-DIC identification
91 Influence of Thermal Damage on the Mechanical Strength of Trimmed CFRP
Authors: Guillaume Mullier, Jean François Chatelain
Abstract:
Carbon Fiber Reinforced Plastics (CFRPs) are widely used for advanced applications, in particular in the aerospace, automotive and wind energy industries. Once cured to near net shape, CFRP parts need several finishing operations such as trimming, milling or drilling in order to accommodate fastening hardware and meet the final dimensions. The present research aims to study the effect of the cutting temperature in trimming on the mechanical strength of high-performance CFRP laminates used for aeronautic applications. The cutting temperature is of great importance when dealing with trimming of CFRP. Temperatures higher than the glass-transition temperature (Tg) of the resin matrix are highly undesirable: they cause degradation of the matrix in the trimmed edge area, which can severely affect the mechanical performance of the entire component. In this study, a 9.50 mm diameter CVD diamond-coated carbide tool with six flutes was used to trim 24-ply CFRP laminates. A cutting speed of 300 m/min and a feed rate of 1140 mm/min were used in the experiments. The tool was heated prior to trimming using a blowtorch, for temperatures ranging from 20°C to 300°C. The temperature at the cutting edge was measured using embedded K-type thermocouples. Samples trimmed at different cutting temperatures, below and above Tg, were mechanically tested using three-point bending and short-beam loading configurations. New cutting tools as well as worn cutting tools were used for the experiments. The experiments with the new tools could not establish any correlation between the length of cut, the cutting temperature and the mechanical performance: the mechanical strength was constant, regardless of the cutting temperature. However, for worn tools, producing cutting temperatures rising up to 450°C, thermal damage of the resin was observed. The mechanical tests showed a reduced mean resistance in the short-beam configuration, while the resistance in three-point bending decreased as the cutting temperature increased.
Keywords: composites, trimming, thermal damage, surface quality
90 Uptake of Copper by Dead Biomass of Burkholderia cenocepacia Isolated from a Metal Mine in Pará, Brazil
Authors: Ingrid R. Avanzi, Marcela dos P. G. Baltazar, Louise H. Gracioso, Luciana J. Gimenes, Bruno Karolski, Elen A. Perpetuo, Claudio Augusto Oller do Nascimento
Abstract:
In this study, a natural process using a biological system was developed for the uptake, and possible removal from wastewater, of copper by dead biomass of the strain Burkholderia cenocepacia. Dead and live biomass of Burkholderia cenocepacia was used to analyze the equilibrium and kinetics of copper biosorption by this strain as a function of pH. Living biomass exhibited the highest copper biosorption capacity, 50 mg g−1, which was achieved within 5 hours of contact, at pH 7.0, a temperature of 30°C, and an agitation speed of 150 rpm. The dead biomass of Burkholderia cenocepacia may be considered the basis of an efficient bioprocess, being fast and low-cost for copper recovery, and also a potential nano-adsorbent of this metal ion in wastewater bioremediation processes.
Keywords: biosorption, dead biomass, biotechnology, copper recovery
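A standard way to quantify the equilibrium part of such a biosorption study is to fit an isotherm model; the sketch below fits a Langmuir isotherm with SciPy to hypothetical data points, since the study's actual equilibrium values are not given in the abstract:
```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)."""
    return qmax * KL * Ce / (1 + KL * Ce)

# Hypothetical equilibrium data: Ce (mg/L) vs qe (mg Cu per g biomass).
Ce = np.array([5, 10, 25, 50, 100, 200], dtype=float)
qe = np.array([12, 21, 33, 41, 46, 49], dtype=float)

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=(50, 0.05))
print(f"qmax = {qmax:.1f} mg/g, KL = {KL:.3f} L/mg")
```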
89 Optimization of Quercus cerris Bark Liquefaction
Authors: Luísa P. Cruz-Lopes, Hugo Costa e Silva, Idalina Domingos, José Ferreira, Luís Teixeira de Lemos, Bruno Esteves
Abstract:
The liquefaction of cork-based tree barks has attracted increasing interest due to its potential innovation for the lumber and wood industries. In this particular study, the bark of Quercus cerris (Turkish oak) is used due to its appreciable amount of cork tissue, although of inferior quality compared to the cork provided by other Quercus trees. This study aims to optimize the alkaline-catalyzed liquefaction conditions with regard to several parameters. To better comprehend the chemical characteristics of the bark of Quercus cerris, a complete chemical analysis was performed. The liquefaction process was performed in a double-jacket reactor heated with oil, using glycerol and a mixture of glycerol/ethylene glycol as solvents and potassium hydroxide as a catalyst, and varying the temperature, liquefaction time and granulometry. Due to the low liquefaction efficiency resulting from the first experimental procedures, a study was made of different washing techniques after the filtration process, using methanol and methanol/water. The chemical analysis showed that the bark of Quercus cerris is mostly composed of suberin (ca. 30%) and lignin (ca. 24%) as well as hemicelluloses insoluble in hot water (ca. 23%). At the liquefaction stage, the conditions that led to the highest yields were using the glycerol/ethylene glycol mixture as reagent with a time and temperature of 120 minutes and 200 ºC, respectively. Using a granulometry of <80 mesh leads to better results, even if this parameter barely influences the liquefaction efficiency. Regarding the filtration stage, washing the residue with methanol and then distilled water leads to a considerable increase in the final liquefaction percentages, which proves that this procedure is effective at recovering the liquefied suberin content and lignocellulosic fraction.
Keywords: liquefaction, Quercus cerris, polyalcohol liquefaction, temperature
88 Developing an Edutainment Game for Children with ADHD Based on SAwD and VCIA Model
Authors: Bruno Gontijo Batista
Abstract:
This paper analyzes how the Socially Aware Design (SAwD) and the Value-oriented and Culturally Informed Approach (VCIA) design models can be used to develop an edutainment game for children with Attention Deficit Hyperactivity Disorder (ADHD). The SAwD approach seeks a design that considers new dimensions in human-computer interaction, such as culture, aesthetics, and the emotional and social aspects of the user's everyday experience. From this perspective, the game development was based on the VCIA model, including the users in the design process through participatory methodologies and considering their behavioral patterns, culture, and values. This is because values, beliefs, and behavioral patterns influence how technology is understood and used and the way it impacts people's lives. This model can be applied at different stages of design, from explaining the problem and organizing the requirements to the evaluation of the prototype and the final solution. Thus, this paper aims to understand how this model can be used in the development of an edutainment game for children with ADHD. In the area of education and learning, children with ADHD have difficulties both in behavior and in school performance, as they are easily distracted, which is reflected both in classes and on tests. Therefore, they must perform tasks that are exciting or interesting for them, since once the pleasure center in the brain is activated, it reinforces the attention center, leaving the child more relaxed and focused. In this context, serious games have been used as part of the treatment of ADHD in children, aiming to improve focus and attention, to stimulate concentration, and to serve as a tool for improving learning in areas such as math and reading, combining education and entertainment (edutainment). As a result of the research, an edutainment game prototype for a mobile platform was developed in a participatory way, applying the VCIA model, for children between 8 and 12 years old.
Keywords: ADHD, edutainment, SAwD, VCIA
87 Predicting Stem Borer Density in Maize Using RapidEye Data and Generalized Linear Models
Authors: Elfatih M. Abdel-Rahman, Tobias Landmann, Richard Kyalo, George Ong’amo, Bruno Le Ru
Abstract:
Maize (Zea mays L.) is a major staple food crop in Africa, particularly in the eastern region of the continent. The maize growing area in Africa spans over 25 million ha, and 84% of rural households in Africa cultivate maize, mainly as a means to generate food and income. Average maize yields in Sub-Saharan Africa are 1.4 t/ha, as compared to a global average of 2.5–3.9 t/ha, due to biotic and abiotic constraints. Amongst the biotic production constraints in Africa, stem borers are the most injurious. In East Africa, yield losses due to stem borers are currently estimated at between 12% and 40% of the total production. The objective of the present study was therefore to predict stem borer larvae density in maize fields using RapidEye reflectance data and generalized linear models (GLMs). RapidEye images were captured for a test site in Kenya (Machakos) in January and in February 2015. Stem borer larva numbers were modeled using GLMs assuming Poisson (Po) and negative binomial (NB) error distributions with a logarithmic link. Root mean square error (RMSE) and ratio of prediction to deviation (RPD) statistics were employed to assess the models' performance using a leave-one-out cross-validation approach. Results showed that NB models outperformed Po ones at all study sites. RMSE and RPD ranged between 0.95 and 2.70, and between 2.39 and 6.81, respectively. Overall, all models performed similarly whether the January or the February image data were used. We conclude that reflectance data from RapidEye can be used to estimate stem borer larvae density. The developed models could improve decision-making regarding the control of maize stem borers using various integrated pest management (IPM) protocols.
Keywords: maize, stem borers, density, RapidEye, GLM
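The modelling step described (Poisson and negative binomial GLMs with a log link, assessed by leave-one-out RMSE and RPD) can be sketched with statsmodels; the reflectance matrix and counts below are simulated placeholders, and RPD is computed here as the ratio of the response's standard deviation to the RMSE:
```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)

# Hypothetical data: 5 RapidEye band reflectances (X) vs. larvae counts (y).
X = sm.add_constant(rng.uniform(0, 1, size=(60, 5)))
y = rng.poisson(np.exp(X @ np.array([0.5, 1.0, -0.8, 0.6, 0.3, -0.4])))

def loo_rmse(family):
    errs = []
    for i in range(len(y)):                      # leave-one-out CV
        keep = np.arange(len(y)) != i
        fit = sm.GLM(y[keep], X[keep], family=family).fit()
        errs.append(y[i] - fit.predict(X[i:i + 1])[0])
    return np.sqrt(np.mean(np.square(errs)))

rmse_po = loo_rmse(sm.families.Poisson())        # log link is the default
rmse_nb = loo_rmse(sm.families.NegativeBinomial())
rpd = y.std() / rmse_nb
print(f"Poisson RMSE={rmse_po:.2f}, NB RMSE={rmse_nb:.2f}, NB RPD={rpd:.2f}")
```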
86 Effect of Gum Extracts on the Textural and Bread-Making Properties of a Composite Flour Based on Sour Cassava Starch (Manihot esculenta), Peanut (Arachis hypogaea) and Cowpea Flour (Vigna unguiculata)
Authors: Marie Madeleine Nanga Ndjang, Julie Mathilde Klang, Edwin M. Mmutlane, Derek Tantoh Ndinteh, Eugenie Kayitesi, Francois Ngoufack Zambou
Abstract:
Gluten intolerance and the unavailability of wheat flour in some parts of the world have led to the development of gluten-free bread. However, gluten-free bread generally has a low specific volume, and to remedy this, the use of hydrocolloids and bases has proved very successful. Thus, the present study aims to determine the optimal proportions of gum extract of Triumfetta pentandra and sodium bicarbonate in the breadmaking of a composite flour based on sour cassava starch, peanut, and cowpea flour. To achieve this, a Box-Behnken design was used, the variables being the amount of gum extract, the amount of bicarbonate, and the amount of water. The responses evaluated were the specific volume and the textural properties (hardness, cohesiveness, consistency, elasticity, and masticability). The specific volume was determined according to standard AACC methods and the textural properties with a texture analyzer. It appears from this analysis that the specific volume is positively influenced by the incorporation of gum extract, bicarbonate, and water. The hardness, consistency, and plasticity increased with the incorporation rate of gum extract but decreased with the incorporation rate of bicarbonate and water. On the other hand, cohesiveness and elasticity increased with the incorporation rate of bicarbonate and water but decreased with the incorporation of gum extract. The optimal proportions of gum extract, bicarbonate, and water are 0.28, 1.99, and 112.5, respectively. This results in a specific volume of 1.51, a hardness of 38.51, a cohesiveness of 0.88, a consistency of 32.86, an elasticity of 5.57, and a masticability of 162.35. Thus, this analysis suggests that gum extracts and sodium bicarbonate can be used to improve the quality of gluten-free bread.
Keywords: Box-Behnken design, bread-making, gums, texture properties, specific volume
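A three-factor Box-Behnken design like the one used (factors: gum extract, bicarbonate, water) is easy to generate; the sketch below builds the coded design and decodes it to physical levels, with the low/high levels chosen purely for illustration since the abstract does not list them:
```python
from itertools import combinations

def box_behnken(k, centers=3):
    """Coded (-1, 0, +1) Box-Behnken design for k factors."""
    runs = []
    for i, j in combinations(range(k), 2):      # vary two factors at a time
        runs += [[a if m == i else b if m == j else 0 for m in range(k)]
                 for a in (-1, 1) for b in (-1, 1)]
    return runs + [[0] * k for _ in range(centers)]  # center points

# Illustrative low/high levels: gum extract (%), bicarbonate (%), water (mL).
levels = {"gum": (0.1, 0.3), "bicarbonate": (1.0, 2.0), "water": (100, 125)}

def decode(run):
    """Map coded values to physical levels: midpoint plus coded half-range."""
    return [(lo + hi) / 2 + c * (hi - lo) / 2
            for c, (lo, hi) in zip(run, levels.values())]

for run in box_behnken(3):                      # 12 edge runs + 3 centers
    print(run, "->", [round(v, 2) for v in decode(run)])
```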