Search results for: Ky Serge Stephane
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 63

63 Gender-Specific Vulnerability on Climate Change and Food Security Status - A Catchment Approach on Agroforestry Systems - A Multi-Country Case Study

Authors: Zerihun Yohannes Amare, Bernhard Freyer, Ky Serge Stephane, Ouéda Adama, Blessing Mudombi, Jean Nzuma, Mekonen Getachew Abebe, Adane Tesfaye, Birtukan Atinkut Asmare, Tesfahun Asmamaw Kassie

Abstract:

The study was conducted in Ethiopia (Zege Catchment, ZC), Zimbabwe (Upper Save Catchment, USC), and Burkina Faso (Nakambe Catchment, NC). The study utilized a quantitative approach with 180 participants, complemented by qualitative methods, including 33 key informant interviews and 6 focus group discussions. Households in ZC (58%), NC (55%), and USC (40%) do not cover their household food consumption from crop production; they rely heavily on perennial cash crops rather than annual crop production. Exposure indicators in ZC (0.758), USC (0.774), and NC (0.944), and sensitivity indicators in ZC (0.849) and NC (0.937), show statistically significant and high correlations with vulnerability. In the USC, adaptive capacity (0.746) and exposure (0.774) are also statistically significant and highly correlated with vulnerability. Vulnerability levels of the NC are very high (0.75; 0.85 for female and 0.65 for male participants) compared to the USC (0.66; 0.69 female and 0.61 male) and ZC (0.47; 0.34 female and 0.58 male). Female-headed households had a statistically significantly lower vulnerability index than male-headed households in ZC, while male-headed households had a statistically significantly lower vulnerability index than female-headed households in USC and NC. The reason is that land certification in ZC (80%) is higher than in the USC (10%) and NC (8%). Agroforestry practice variables across the study catchments made statistically significant contributions to households' adaptive capacity. We conclude that agroforestry practices have substantial benefits in increasing women's adaptive capacity and reducing their vulnerability to climate change and food insecurity.

Keywords: climate change vulnerability, agroforestry, gender, food security, Sub-Saharan Africa

Procedia PDF Downloads 57
62 Direct Measurements of the Electrocaloric Effect in Solid Ferroelectric Materials via Thermoreflectance

Authors: Layla Farhat, Mathieu Bardoux, Stéphane Longuemart, Ziad Herro, Abdelhak Hadj Sahraoui

Abstract:

The electrocaloric (EC) effect refers to the isothermal entropy or adiabatic temperature change of a dielectric material induced by an external electric field. This phenomenon has been largely ignored for applications because only modest EC effects (2.6

Keywords: electrocaloric effect, thermoreflectance, ferroelectricity, cooling system

Procedia PDF Downloads 152
61 An Extension of the Generalized Extreme Value Distribution

Authors: Serge Provost, Abdous Saboor

Abstract:

A q-analogue of the generalized extreme value distribution which includes the Gumbel distribution is introduced. The additional parameter q allows for increased modeling flexibility. The resulting distribution can have a finite, semi-infinite or infinite support. It can also produce several types of hazard rate functions. The model parameters are determined by making use of the method of maximum likelihood. It will be shown that it compares favourably to three related distributions in connection with the modeling of a certain hydrological data set.
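To make the fitting step concrete, here is a minimal sketch for the classical Gumbel special case only; it uses method-of-moments estimates rather than the maximum-likelihood fit of the q-analogue described in the abstract, and the parameter values are illustrative.

```python
import math
import random
import statistics

EULER_GAMMA = 0.5772156649015329

def fit_gumbel_moments(sample):
    """Method-of-moments estimates of the Gumbel location and scale:
    scale = s * sqrt(6) / pi, loc = mean - gamma * scale."""
    scale = statistics.stdev(sample) * math.sqrt(6) / math.pi
    loc = statistics.fmean(sample) - EULER_GAMMA * scale
    return loc, scale

def gumbel_sample(loc, scale, n, rng):
    # inverse-CDF sampling: x = loc - scale * log(-log(U)), U ~ Uniform(0, 1)
    return [loc - scale * math.log(-math.log(rng.random())) for _ in range(n)]

rng = random.Random(42)
data = gumbel_sample(loc=10.0, scale=2.0, n=5000, rng=rng)
loc_hat, scale_hat = fit_gumbel_moments(data)
```

A maximum-likelihood fit of the full q-analogue would replace the closed-form moment estimates with a numerical optimization of the log-likelihood over the three parameters.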

Keywords: extreme value theory, generalized extreme value distribution, goodness-of-fit statistics, Gumbel distribution

Procedia PDF Downloads 310
60 A Methodology for Characterising the Tail Behaviour of a Distribution

Authors: Serge Provost, Yishan Zang

Abstract:

Following a review of various approaches that are utilized for classifying the tail behavior of a distribution, an easily implementable methodology that relies on an arctangent transformation is presented. The classification criterion is actually based on the difference between two specific quantiles of the transformed distribution. The resulting categories enable one to classify distributional tails as distinctly short, short, nearly medium, medium, extended medium and somewhat long, provided that at least two moments exist. Distributions possessing a single moment are said to be long tailed, while those failing to have any finite moments are classified as having an extremely long tail. Several illustrative examples will be presented.
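The idea behind the transformation can be illustrated as follows. This is one plausible reading of the criterion (measuring how close an upper quantile of the arctan-transformed sample sits to the endpoint pi/2), not the paper's exact quantile pair:

```python
import math
import random

def arctan_tail_gap(sample, p=0.99):
    """Distance between the p-th quantile of the arctan-transformed sample
    and the upper endpoint pi/2. The arctan map sends the whole real line
    into (-pi/2, pi/2); a heavier right tail pushes mass toward pi/2,
    shrinking this gap."""
    t = sorted(math.atan(x) for x in sample)
    q = t[min(len(t) - 1, int(p * len(t)))]
    return math.pi / 2 - q

rng = random.Random(0)
light = [rng.gauss(0.0, 1.0) for _ in range(100_000)]  # normal: medium tail
# standard Cauchy via inverse CDF: no finite moments, extremely long tail
heavy = [math.tan(math.pi * (rng.random() - 0.5)) for _ in range(100_000)]
```

Thresholds on such a statistic would then delimit the categories (distinctly short through extremely long) named in the abstract.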

Keywords: arctangent transformation, tail classification, heavy-tailed distributions, distributional moments

Procedia PDF Downloads 88
59 On Modeling Data Sets by Means of a Modified Saddlepoint Approximation

Authors: Serge B. Provost, Yishan Zhang

Abstract:

A moment-based adjustment to the saddlepoint approximation is introduced in the context of density estimation. First applied to univariate distributions, this methodology is extended to the bivariate case. It then entails estimating the density function associated with each marginal distribution by means of the saddlepoint approximation and applying a bivariate adjustment to the product of the resulting density estimates. The connection to the distribution of empirical copulas will be pointed out. In addition, a novel approach is proposed for estimating the support of a distribution. As these results solely rely on sample moments and empirical cumulant-generating functions, they are particularly well suited for modeling massive data sets. Several illustrative applications will be presented.

Keywords: empirical cumulant-generating function, endpoints identification, saddlepoint approximation, sample moments, density estimation

Procedia PDF Downloads 124
58 The Assessment of Bilingual Students: How Bilingual Can It Really Be?

Authors: Serge Lacroix

Abstract:

The proposed study looks at the psychoeducational assessment of bilingual students, in English and French in this case. It provides an opportunity to examine the language of assessment and, specifically, how certain tests can be administered in one language and others in another. It also questions the validity of the test scores that are obtained, as well as the quality and generalizability of the conclusions that can be drawn. Bilingualism and multiculturalism, although in constant expansion, are not considered in norms development and remain poorly understood factors when they are at play in the context of a psychoeducational assessment. Student placement, diagnoses, and accurate measures of intelligence and achievement are all impacted by the quality of the assessment procedure. The same is true for questionnaires administered to parents and self-reports completed by bilingual students, who, more often than not, are assessed in a language that is not their primary one or are compared to monolinguals not facing the same challenges or possessing the same skills. Results show that students, when offered the opportunity to work in a bilingual fashion, choose to do so in a significant proportion. Recommendations will be offered to support educators aiming to expand their skills when working with multilingual students in an assessment context.

Keywords: psychoeducational assessment, bilingualism, multiculturalism, intelligence, achievement

Procedia PDF Downloads 424
57 AI-Driven Strategies for Sustainable Electronics Repair: A Case Study in Energy Efficiency

Authors: Badiy Elmabrouk, Abdelhamid Boujarif, Zhiguo Zeng, Stephane Borrel, Robert Heidsieck

Abstract:

In an era where sustainability is paramount, this paper introduces a machine learning-driven testing protocol to accurately predict diode failures, merging reliability engineering with failure physics to enhance the efficiency of repair operations. Our approach refines the burn-in process, significantly curtailing its duration, which not only conserves energy but also elevates productivity and mitigates component wear. A case study from GE HealthCare’s repair center vividly demonstrates the method’s effectiveness, achieving high accuracy in predicting diode failures and a substantial decrease in energy consumption that translates to an annual reduction of 6.5 tons of CO2 emissions. This advancement sets a benchmark for environmentally conscious practices in the electronics repair sector.

Keywords: maintenance, burn-in, failure physics, reliability testing

Procedia PDF Downloads 21
56 Corrosion Behavior of Induced Stress Duplex Stainless Steel in Chloride Environment

Authors: Serge Mudinga Lemika, Samuel Olukayode Akinwamide, Aribo Sunday, Babatunde Abiodun Obadele, Peter Apata Olubambi

Abstract:

The use of duplex stainless steel has become predominant in applications where excellent corrosion resistance is of utmost importance. The corrosion behavior of duplex stainless steel induced with varying stress in chloride media was studied. Characterization of as-received 2205 duplex stainless steel was carried out to reveal its structure and properties. A tensile sample produced from the duplex stainless steel was first subjected to a tensile test to obtain the yield strength. Stresses corresponding to various percentages (20, 40, 60 and 80%) of the yield strength were then induced in the DSS samples. Corrosion tests were carried out in magnesium chloride solution at room temperature. Morphologies of cracks observed with optical and scanning electron microscopy showed that samples induced with higher stress had their austenite and ferrite grains affected by pitting.

Keywords: duplex stainless steel, hardness, nanoceramics, spark plasma sintering

Procedia PDF Downloads 269
55 Nonparametric Copula Approximations

Authors: Serge Provost, Yishan Zang

Abstract:

Copulas are currently utilized in finance, reliability theory, machine learning, signal processing, geodesy, hydrology and biostatistics, among several other fields of scientific investigation. It follows from Sklar's theorem that the joint distribution function of a multidimensional random vector can be expressed in terms of its associated copula and marginals. Since marginal distributions can easily be determined by making use of a variety of techniques, we address the problem of securing the distribution of the copula. This will be done by using several approaches. For example, we will obtain bivariate least-squares approximations of the empirical copulas, modify the kernel density estimation technique and propose a criterion for selecting appropriate bandwidths, differentiate linearized empirical copulas, secure Bernstein polynomial approximations of suitable degrees, and apply a corollary to Sklar's result. Illustrative examples involving actual observations will be presented. The proposed methodologies will also be applied to a sample generated from a known copula distribution in order to validate their effectiveness.
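The empirical copula that the approximations above start from is simple to compute: replace each coordinate by its normalized rank and count. A minimal sketch (the bandwidth selection, Bernstein, and least-squares refinements of the abstract are not reproduced):

```python
import bisect
import random

def empirical_copula(u, v, pairs):
    """Empirical copula C_n(u, v): the fraction of observations whose
    normalized ranks fall at or below (u, v)."""
    n = len(pairs)
    xs = sorted(x for x, _ in pairs)
    ys = sorted(y for _, y in pairs)
    rank = lambda srt, val: bisect.bisect_right(srt, val) / n
    return sum(1 for x, y in pairs
               if rank(xs, x) <= u and rank(ys, y) <= v) / n

rng = random.Random(1)
# comonotone data (Y = X): the true copula is the upper bound M(u, v) = min(u, v)
comono = [(x, x) for x in (rng.random() for _ in range(2000))]
# independent data: the true copula is Pi(u, v) = u * v
indep = [(rng.random(), rng.random()) for _ in range(2000)]
```

Evaluating it on samples from known copulas, as the last sentence of the abstract suggests, gives values close to min(u, v) for comonotone data and to u*v for independent data.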

Keywords: copulas, Bernstein polynomial approximation, least-squares polynomial approximation, kernel density estimation, density approximation

Procedia PDF Downloads 41
54 Neural Network Based Compressor Flow Estimator in an Aircraft Vapor Cycle System

Authors: Justin Reverdi, Sixin Zhang, Serge Gratton, Said Aoues, Thomas Pellegrini

Abstract:

In vapor cycle systems, the flow sensor plays a key role for different monitoring and control purposes. However, physical sensors can be expensive, inaccurate, heavy, cumbersome, or highly sensitive to vibrations, which is especially problematic when embedded into an aircraft. The conception of a virtual sensor based on other standard sensors is a good alternative. In this paper, a data-driven model using a convolutional neural network is proposed to estimate the flow of the compressor. To fit the model to our dataset, we tested different loss functions. We show in our application that a Dynamic Time Warping based loss function called DILATE leads to better dynamical performance than the vanilla mean squared error (MSE) loss function. DILATE allows choosing a trade-off between static and dynamic performance.
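The motivation for a warping-based loss can be seen with classic (hard) dynamic time warping; DILATE itself uses a differentiable, soft variant with separate shape and temporal terms, which this sketch does not reproduce:

```python
def dtw(a, b):
    """Classic dynamic-time-warping distance with squared pointwise cost,
    computed by the standard O(n*m) dynamic program."""
    n, m = len(a), len(b)
    inf = float("inf")
    dp = [[inf] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            dp[i][j] = cost + min(dp[i - 1][j],      # insertion
                                  dp[i][j - 1],      # deletion
                                  dp[i - 1][j - 1])  # match
    return dp[n][m]

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

# a step response and the same step delayed by two samples
target = [0.0] * 5 + [1.0] * 5
shifted = [0.0] * 7 + [1.0] * 3
```

For the delayed step, MSE reports a nonzero error purely because of the time shift, while DTW can warp the axis and reports zero: this is the dynamical behavior the abstract refers to.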

Keywords: deep learning, dynamic time warping, vapor cycle system, virtual sensor

Procedia PDF Downloads 117
53 A Simulated Scenario of WikiGIS to Support the Iteration and Traceability Management of the Geodesign Process

Authors: Wided Batita, Stéphane Roche, Claude Caron

Abstract:

Geodesign is an emergent term related to a new and complex process. Hence, its tools, technologies and platforms need to be rethought in order to efficiently achieve its goals. A few tools have emerged since 2010, such as CommunityViz, GeoPlanner, etc. In the era of Web 2.0 and collaboration, WikiGIS has been proposed as a new category of tools. In this paper, we present WikiGIS functionalities dealing mainly with iteration and traceability management to support the collaboration of the geodesign process. Actually, WikiGIS is built on GeoWeb 2.0 technologies —and primarily on wiki— and aims at managing the tracking of participants’ editing. This paper focuses on a simplified simulation to illustrate the strength of WikiGIS in the management of traceability and in the access to history in a geodesign process. Indeed, a cartographic user interface has been implemented, and then a hypothetical use case has been imagined as a proof of concept.

Keywords: geodesign, history, traceability, tracking of participants’ editing, WikiGIS

Procedia PDF Downloads 212
52 Segmentation of Gray Scale Images of Dropwise Condensation on Textured Surfaces

Authors: Helene Martin, Solmaz Boroomandi Barati, Jean-Charles Pinoli, Stephane Valette, Yann Gavet

Abstract:

In the present work, we developed an image processing algorithm to measure water droplet characteristics during dropwise condensation on pillared surfaces. The main problem in this process is the similarity in shape and size between the water droplets and the pillars. The developed method divides droplets into four main groups based on their size and applies the corresponding algorithm to segment each group. These algorithms generate binary images of droplets based on both their geometrical and intensity properties. The information related to droplet evolution over time, including mean radius and drop number per unit area, is then extracted from the binary images. The developed image processing algorithm is verified using manual detection and applied to two different sets of images corresponding to two kinds of pillared surfaces.

Keywords: dropwise condensation, textured surface, image processing, watershed

Procedia PDF Downloads 192
51 Application of the MOOD Technique to the Steady-State Euler Equations

Authors: Gaspar J. Machado, Stéphane Clain, Raphael Loubère

Abstract:

The goal of the present work is to numerically study steady-state nonlinear hyperbolic equations in the context of the finite volume framework. We consider the unidimensional Burgers' equation as the reference case for the scalar situation and the unidimensional Euler equations for the vectorial situation. We consider two approaches to solve the nonlinear equations: a time-marching algorithm and a direct steady-state approach. We first develop the necessary and sufficient conditions to obtain the existence and uniqueness of the solution. We treat regular examples as well as solutions with a steady shock, and, to provide very-high-order finite volume approximations, we implement a method based on the MOOD technology (Multi-dimensional Optimal Order Detection). The main ingredient consists in using an 'a posteriori' limiting strategy to eliminate non-physical oscillations deriving from the Gibbs phenomenon while keeping high accuracy for the smooth part.
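The scalar reference case can be sketched with a plain first-order Godunov scheme marched to steady state; this baseline has none of the very-high-order reconstruction or a-posteriori MOOD limiting of the paper, and the boundary states are chosen here so that a stationary shock forms:

```python
def godunov_flux(ul, ur):
    # exact Riemann-solver flux for Burgers' equation, f(u) = u**2 / 2
    return max(0.5 * max(ul, 0.0) ** 2, 0.5 * min(ur, 0.0) ** 2)

def burgers_steady(n=100, steps=4000):
    """First-order finite volume time-marching on x in [0, 1] with inflow
    u = 1 on the left and u = -1 on the right; the solution converges to a
    steady shock in the middle of the domain."""
    dx = 1.0 / n
    dt = 0.4 * dx  # CFL condition, since max |u| = 1 throughout
    u = [1.0 - 2.0 * (i + 0.5) * dx for i in range(n)]  # smooth initial data
    for _ in range(steps):
        g = [1.0] + u + [-1.0]  # ghost cells carrying the boundary states
        f = [godunov_flux(g[i], g[i + 1]) for i in range(n + 1)]
        u = [u[i] - dt / dx * (f[i + 1] - f[i]) for i in range(n)]
    return u

u = burgers_steady()
```

Being monotone, this scheme is oscillation-free but only first-order accurate; the MOOD strategy of the abstract recovers high order away from the shock while falling back to such a robust scheme where an 'a posteriori' check detects trouble.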

Keywords: Euler equations, finite volume, MOOD, steady-state

Procedia PDF Downloads 244
50 Preparation of Nanocomposites Based on Biodegradable Polycaprolactone by Melt Mixture

Authors: Mohamed Amine Zenasni, Bahia Meroufel, André Merlin, Said Benfarhi, Stéphane Molina, Béatrice George

Abstract:

The introduction of nano-fillers into the polymer field has led to the creation of nanocomposites, which are starting a new revolution in the world of materials. Nanocomposites are similar to traditional composites of a polymer and a filler, but with at least one nanoscopic dimension. In our project, we worked with nanocomposites of a biodegradable polymer, polycaprolactone, combined with a nano-clay (Maghnite) and with different nano-organo-clays. These nanocomposites have been prepared by the melt mixture method. The advantages of this polymer are its degradability and biocompatibility. A study was made of the relationship between the development, microstructure and physicochemical properties of nanocomposites, comparing clays modified with 3-aminopropyltriethoxysilane (APTES) and hexadecyltrimethylammonium bromide (CTAB) with untreated clays. The melt mixture method is among the most suitable methods for obtaining the better dispersion known as exfoliation.

Keywords: nanocomposite, biodegradable, polycaprolactone, maghnite, melt mixture, APTES, CTAB

Procedia PDF Downloads 400
49 Uncertainty Analysis of a Hardware in Loop Setup for Testing Products Related to Building Technology

Authors: Balasundaram Prasaant, Ploix Stephane, Delinchant Benoit, Muresan Cristian

Abstract:

Hardware in Loop (HIL) testing is done to test and validate a particular product, especially in building technology. When it comes to building technology, it is particularly important to test products for their efficiency. The test rig in the HIL simulator may contribute some uncertainty to the measured efficiency. The uncertainties include physical uncertainties and scenario-based uncertainties. In this paper, a simple uncertainty analysis framework for an HIL setup is shown, considering only the physical uncertainties. The entire modeling of the HIL setup is done in Dymola. The uncertainty sources are chosen based on available knowledge of the components and also on expert knowledge. For the propagation of uncertainty, Monte Carlo simulation is used since it is reliable and easy to use. In this article, it is shown how an HIL setup can be modeled and how uncertainty propagation can be performed on it. Such an approach is not common in building energy analysis.
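Monte Carlo propagation reduces to sampling each uncertain input from its assumed distribution and collecting output statistics. A minimal sketch with a toy efficiency model and made-up sensor tolerances (not the authors' Dymola model or measured values):

```python
import random
import statistics

def efficiency(flow_kg_s, delta_t_k, power_w, cp=4186.0):
    """Toy thermal efficiency: useful heat over electrical input
    (an illustrative stand-in for the modeled test rig)."""
    return flow_kg_s * cp * delta_t_k / power_w

def propagate(n, rng):
    """Monte Carlo propagation: draw each uncertain input from its assumed
    distribution and summarize the distribution of the output."""
    out = []
    for _ in range(n):
        flow = rng.gauss(0.020, 0.001)   # kg/s, hypothetical sensor tolerance
        d_t = rng.gauss(5.0, 0.2)        # K
        power = rng.gauss(500.0, 10.0)   # W
        out.append(efficiency(flow, d_t, power))
    return statistics.fmean(out), statistics.stdev(out)

rng = random.Random(7)
mean_eff, std_eff = propagate(20_000, rng)
```

The output standard deviation quantifies how much of the measured efficiency spread is attributable to the rig's physical uncertainties rather than to the product under test.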

Keywords: energy in buildings, hardware in loop testing, modelica modelling, Monte Carlo simulation, uncertainty propagation

Procedia PDF Downloads 105
48 Characterization of the Queuine Salvage Pathway From Bacteria in the Human Parasite Entamoeba Histolytica

Authors: Lotem Sarid, Meirav Trebicz-Geffen, Serge Ankri

Abstract:

Queuosine (Q) is a naturally occurring modified nucleoside that occurs in the first position of transfer RNA anticodons such as Asp, Asn, His, and Tyr. As eukaryotes lack pathways to synthesize queuine, the nucleobase of queuosine, they must obtain it from their diet or gut microbiota. Our previous work investigated the effects of queuine on the physiology of the eukaryotic parasite Entamoeba histolytica and defined the enzyme EhTGT responsible for its incorporation into tRNA. To the best of our knowledge, it is unknown how E. histolytica salvages Q from gut bacteria. We used N-acryloyl-3-aminophenylboronic acid (APB) PAGE analysis to demonstrate that E. histolytica trophozoites can salvage queuine from Q or E. coli K12 but not from the modified E. coli QueC strain, which cannot produce queuine. Next, we examined the role of EhDUF2419, a protein with homology to DNA glycosylase, as a queuine salvage enzyme in E. histolytica. When EhDUF2419 expression is silenced, it inhibits the conversion of Q to queuine, resulting in a decrease in Q-tRNA levels. We also observed that Q protects control trophozoites from oxidative stress (OS), but not siEhDUF2419 trophozoites. Overall, our data reveal that EhDUF2419 is central to the salvage of queuine from bacteria and to the resistance of the parasite to OS.

Keywords: entamoeba histolytica, epitranscriptomics, gut microbiota, queuine, queuosine, response to oxidative stress, tRNA modification

Procedia PDF Downloads 88
47 CNN-Based Compressor Mass Flow Estimator in Industrial Aircraft Vapor Cycle System

Authors: Justin Reverdi, Sixin Zhang, Saïd Aoues, Fabrice Gamboa, Serge Gratton, Thomas Pellegrini

Abstract:

In vapor cycle systems, the mass flow sensor plays a key role for different monitoring and control purposes. However, physical sensors can be inaccurate, heavy, cumbersome, expensive, or highly sensitive to vibrations, which is especially problematic when embedded into an aircraft. The conception of a virtual sensor, based on other standard sensors, is a good alternative. This paper has two main objectives. Firstly, a data-driven model using a convolutional neural network is proposed to estimate the mass flow of the compressor. We show that it significantly outperforms the standard polynomial regression model (thermodynamic maps) in terms of the standard MSE metric and engineer performance metrics. Secondly, a semi-automatic segmentation method is proposed to compute the engineer performance metrics for real datasets, as the standard MSE metric may pose risks in analyzing the dynamic behavior of vapor cycle systems.

Keywords: deep learning, convolutional neural network, vapor cycle system, virtual sensor

Procedia PDF Downloads 19
46 A Theorem Related to Sample Moments and Two Types of Moment-Based Density Estimates

Authors: Serge B. Provost

Abstract:

Numerous statistical inference and modeling methodologies are based on sample moments rather than the actual observations. A result justifying the validity of this approach is introduced. More specifically, it will be established that given the first n moments of a sample of size n, one can recover the original n sample points. This implies that a sample of size n and its first associated n moments contain precisely the same amount of information. However, it is efficient to make use of a limited number of initial moments as most of the relevant distributional information is included in them. Two types of density estimation techniques that rely on such moments will be discussed. The first one expresses a density estimate as the product of a suitable base density and a polynomial adjustment whose coefficients are determined by equating the moments of the density estimate to the sample moments. The second one assumes that the derivative of the logarithm of a density function can be represented as a rational function. This gives rise to a system of linear equations involving sample moments, the density estimate is then obtained by solving a differential equation. Unlike kernel density estimation, these methodologies are ideally suited to model ‘big data’ as they only require a limited number of moments, irrespective of the sample size. What is more, they produce simple closed form expressions that are amenable to algebraic manipulations. They also turn out to be more accurate as will be shown in several illustrative examples.
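The recovery claim can be made concrete: the first n power sums (n times the sample moments) determine, via Newton's identities, the elementary symmetric functions, hence a monic polynomial whose roots are exactly the original sample points. A small sketch with an arbitrary three-point sample:

```python
def power_sums(sample, n):
    # k-th sample moment times the sample size, for k = 1..n
    return [sum(x ** k for x in sample) for k in range(1, n + 1)]

def elementary_symmetric(p):
    """Newton's identities: recover e_1..e_n from the power sums p_1..p_n,
    using e_k = (1/k) * sum_{i=1..k} (-1)^(i-1) e_{k-i} p_i."""
    e = [1.0]
    for k in range(1, len(p) + 1):
        s = sum((-1) ** (i - 1) * e[k - i] * p[i - 1] for i in range(1, k + 1))
        e.append(s / k)
    return e[1:]

sample = [1.5, -2.0, 4.0]
e1, e2, e3 = elementary_symmetric(power_sums(sample, 3))

def recovered_poly(x):
    # monic cubic whose three roots are exactly the original sample points
    return x ** 3 - e1 * x ** 2 + e2 * x - e3
```

Finding the roots of this polynomial returns the sample, confirming that the n points and their first n moments carry the same information; the density estimates described in the abstract then deliberately use far fewer moments than n.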

Keywords: density estimation, log-density, polynomial adjustments, sample moments

Procedia PDF Downloads 127
45 Solvent Extraction in Ionic Liquids: Structuration and Aggregation Effects on Extraction Mechanisms

Authors: Sandrine Dourdain, Cesar Lopez, Tamir Sukhbaatar, Guilhem Arrachart, Stephane Pellet-Rostaing

Abstract:

A promising challenge in solvent extraction is to replace the conventional organic solvents with ionic liquids (IL). Depending on the extraction systems, these new solvents show better efficiency than the conventional ones. Although some assumptions based on ion exchange have been proposed in the literature, these properties are not predictable because the involved mechanisms are still poorly understood. It is well established that the mechanisms underlying solvent extraction processes are based not only on the molecular chelation of the extractant molecules but also on their ability to form supra-molecular aggregates due to their amphiphilic nature. It is therefore essential to evaluate how IL affect the aggregation properties of the extractant molecules. Our aim is to evaluate the influence of IL structure and polarity on solvent extraction mechanisms by looking at the aggregation of the extractant molecules in IL. We compare extractant systems that are well characterized in common solvents and show, thanks to SAXS and SANS measurements, that in the absence of IL ion exchange mechanisms, extraction properties are related to aggregation.

Keywords: solvent extraction in Ionic liquid, aggregation, Ionic liquids structure, SAXS, SANS

Procedia PDF Downloads 123
44 N-Heptane as Model Molecule for Cracking Catalyst Evaluation to Improve the Yield of Ethylene and Propylene

Authors: Tony K. Joseph, Balasubramanian Vathilingam, Stephane Morin

Abstract:

Currently, refiners around the world are focused on improving the yield of light olefins (propylene and ethylene), as both are very prominent raw materials for producing a wide spectrum of polymeric materials such as polyethylene and polypropylene. Hence, it is desirable to increase the yield of light olefins via selective cracking of heavy oil fractions. In this study, zeolite grown on SiC was used as the catalyst for the model cracking reaction of n-heptane. The catalytic cracking of n-heptane was performed in a fixed bed reactor (12 mm i.d.) at three different temperatures (425, 450 and 475 °C) and at atmospheric pressure. A carrier gas (N₂) was mixed with n-heptane in a ratio of 90:10 (N₂:n-heptane), and the gaseous mixture was introduced into the fixed bed reactor. Various flow rates of reactants were tested to increase the yield of ethylene and propylene. For comparison purposes, a commercial zeolite was also tested in addition to the zeolite on SiC. The products were analyzed using an Agilent gas chromatograph (GC-9860) equipped with a flame ionization detector (FID). The GC is connected online with the reactor, and all the cracking tests were successfully reproduced. The complete catalytic evaluation results will be presented at the conference.

Keywords: cracking, catalyst, evaluation, ethylene, heptane, propylene

Procedia PDF Downloads 106
43 Estimating Occupancy in Residential Context Using Bayesian Networks for Energy Management

Authors: Manar Amayri, Hussain Kazimi, Quoc-Dung Ngo, Stephane Ploix

Abstract:

A general approach is proposed to determine occupant behaviour (occupancy and activity) in residential buildings and to use these estimates for improved energy management. Occupant behaviour is modelled with a Bayesian Network in an unsupervised manner. This algorithm makes use of domain knowledge gathered via questionnaires and recorded sensor data for motion detection, power, and hot water consumption, as well as indoor CO₂ concentration. Two case studies are presented which show the real-world applicability of estimating occupant behaviour in this way. Furthermore, experiments integrating occupancy estimation and hot water production control show that energy efficiency can be increased by roughly 5% over known optimal control techniques and more than 25% over rule-based control while maintaining the same occupant comfort standards. The efficiency gains are strongly correlated with occupant behaviour and the accuracy of the occupancy estimates.
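The core inference step can be illustrated with the simplest possible Bayesian network, a naive-Bayes update of P(occupied) from independent binary sensor readings; the likelihood numbers below are illustrative, not the paper's learned network:

```python
def posterior_occupied(prior, likelihoods, readings):
    """Naive-Bayes posterior of occupancy given binary sensor readings.
    likelihoods[s] = (P(s fires | occupied), P(s fires | empty))."""
    p_occ, p_emp = prior, 1.0 - prior
    for sensor, fired in readings.items():
        l_occ, l_emp = likelihoods[sensor]
        if not fired:
            l_occ, l_emp = 1.0 - l_occ, 1.0 - l_emp
        p_occ *= l_occ
        p_emp *= l_emp
    return p_occ / (p_occ + p_emp)  # normalize over the two hypotheses

likelihoods = {"motion": (0.70, 0.05),      # hypothetical sensor models
               "co2_high": (0.60, 0.10),
               "power_high": (0.80, 0.30)}
p = posterior_occupied(0.3, likelihoods,
                       {"motion": True, "co2_high": True, "power_high": False})
```

The paper's network additionally encodes dependencies between sensors and activities and learns its parameters without labels, but the posterior update driving the energy-management decisions has this same shape.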

Keywords: energy, management, control, optimization, Bayesian methods, learning theory, sensor networks, knowledge modelling and knowledge based systems, artificial intelligence, buildings

Procedia PDF Downloads 343
42 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings

Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Different literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers. The occurrence of data gaps has also not been given adequate attention in academia. The proposed methodology comprises automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are considered to be regular time series. However, in reality, sensor values are not uniformly sampled. So, the issue to solve is: beyond which delay does each sensor become faulty? The use of time series is required for the detection of abnormalities in the delays. The efficiency of the method is evaluated on measurements obtained from a real power plant: an office at Grenoble Institute of Technology equipped with 30 sensors.

Keywords: building system, time series, diagnosis, outliers, delay, data gap

Procedia PDF Downloads 217
41 Energy-Efficient Internet of Things Communications: A Comparative Study of Long-Term Evolution for Machines and Narrowband Internet of Things Technologies

Authors: Nassim Labdaoui, Fabienne Nouvel, Stéphane Dutertre

Abstract:

The Internet of Things (IoT) is emerging as a crucial communication technology for the future. Many solutions have been proposed, and among them, licensed operators have put forward LTE-M and NB-IoT. However, implementing these technologies requires a good understanding of the device energy requirements, which can vary depending on the coverage conditions. In this paper, we investigate the power consumption of LTE-M and NB-IoT devices using Ublox SARA-R422S modules based on relevant standards from two French operators. The measurements were conducted under different coverage conditions, and we also present an empirical consumption model based on the different states of the radio modem as per the RRC protocol specifications. Our findings indicate that these technologies can achieve a five-year operational battery life under certain conditions. Moreover, we conclude that the size of transmitted data does not have a significant impact on the total power consumption of the device under favorable coverage conditions. However, it can quickly influence the battery life of the device under harsh coverage conditions. Overall, this paper offers insights into the power consumption of LTE-M and NB-IoT devices and provides useful information for those considering the use of these technologies.
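The coverage effect on battery life can be sketched with a basic average-current model; all figures below are illustrative placeholders, not the measured SARA-R422S values from the paper:

```python
def battery_life_years(capacity_mah, sleep_ua, tx_ma, tx_seconds_per_day):
    """Average-current estimate of battery life for a duty-cycled IoT node:
    the device sleeps at sleep_ua and draws tx_ma while the modem is active."""
    day = 86_400.0
    avg_ma = (sleep_ua / 1000.0 * (day - tx_seconds_per_day)
              + tx_ma * tx_seconds_per_day) / day
    return capacity_mah / avg_ma / (24.0 * 365.0)

# favorable coverage: a short active window per daily uplink
good = battery_life_years(2400.0, sleep_ua=3.0, tx_ma=100.0,
                          tx_seconds_per_day=60.0)
# harsh coverage: repetitions and retransmissions stretch the active window
harsh = battery_life_years(2400.0, sleep_ua=3.0, tx_ma=100.0,
                           tx_seconds_per_day=600.0)
```

Even with identical payloads, stretching the active window by a factor of ten collapses the battery life by nearly as much, which matches the abstract's observation that data size matters mainly under harsh coverage.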

Keywords: internet of things, LTE-M, NB-IoT, MQTT, cellular IoT, power consumption

Procedia PDF Downloads 100
40 Improving Student Programming Skills in Introductory Computer and Data Science Courses Using Generative AI

Authors: Genady Grabarnik, Serge Yaskolko

Abstract:

Generative Artificial Intelligence (AI) has significantly expanded its applicability with the incorporation of Large Language Models (LLMs) and has become a technology that promises to automate some areas that were previously very difficult to automate. The paper describes the introduction of generative AI into introductory computer and data science courses and an analysis of the effects of this introduction. Generative AI is incorporated into the educational process in two ways. For the instructors, we create templates of prompts for the generation of tasks and for grading of the students' work, including feedback on the submitted assignments. For the students, we introduce basic prompt engineering, which in turn is used for generating test cases based on descriptions of the problems, generating code snippets for single-block programming problems, and partitioning average-complexity programming problems into such blocks. The above-mentioned classes are run using Large Language Models, and feedback from instructors and students and course outcomes are collected. The analysis shows a statistically significant positive effect and preference by both stakeholders.
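An instructor-side prompt template of the kind described might look as follows; the wording is a hypothetical example, not the authors' actual template:

```python
def task_prompt(topic: str, difficulty: str, n_cases: int) -> str:
    """Build a reusable prompt for generating a programming exercise.
    (Hypothetical wording for illustration only.)"""
    return (
        f"Write a {difficulty}-level programming exercise on {topic}. "
        f"Include: (1) a problem statement, (2) a reference solution, and "
        f"(3) {n_cases} test cases given as (input, expected output) pairs."
    )

prompt = task_prompt("list comprehensions", "beginner", 3)
```

Templating the prompts, rather than writing them ad hoc, is what makes the task-generation and grading workflow repeatable across assignments and course offerings.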

Keywords: introductory computer and data science education, generative AI, large language models, application of LLMS to computer and data science education

39 Influence of the Coarse-Graining Method on a DEM-CFD Simulation of a Pilot-Scale Gas Fluidized Bed

Authors: Theo Ndereyimana, Yann Dufresne, Micael Boulet, Stephane Moreau

Abstract:

The DEM (Discrete Element Method) is widely used in industry to simulate large-scale particle flows; in a fluidized bed, for instance, it allows the trajectory of every particle to be predicted. One of the main limits of the DEM is its computational cost. The CGM (Coarse-Graining Method) has been developed to tackle this issue: the particle size is increased and, by this means, the number of particles is decreased. The method leads to a reduction of the collision frequency due to the reduced number of particles, which affects multiple characteristics of the particle motion and, when DEM is coupled with CFD (Computational Fluid Dynamics), of the fluid flow. The main characteristic that is impacted is the energy dissipation of the system; to recover this dissipation, an ADM (Additional Dissipative Mechanism) can be added to the model. The objective of the present work is to observe the influence of the choice of ADM and of the coarse-graining factor on the numerical results. These results are compared with experimental results from a fluidized bed and with a numerical model of the same fluidized bed that does not use the CGM. The numerical model represents a 3D cylindrical fluidized bed containing 9.6 million Geldart B-type particles in a bubbling regime.
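A commonly used coarse-graining rule, sketched below, groups k³ real particles into one parcel of k times the diameter, which conserves total mass while cutting the particle count by k³. This is the generic CGM scaling, not necessarily the exact scheme or ADM used in the paper.

```python
def coarse_grain(n_particles: int, diameter: float, k: float):
    """Generic CGM scaling: one parcel stands in for k**3 real particles.

    With unchanged material density, a parcel of diameter k*d carries
    k**3 times the mass of a real particle, so total mass is conserved:
    (N / k**3) parcels * (k**3 * m) = N * m.
    """
    n_parcels = int(n_particles / k**3)
    parcel_diameter = k * diameter
    return n_parcels, parcel_diameter
```

For the bed described in the abstract (9.6M particles), a coarse-graining factor of 4 would reduce the simulation to 150,000 parcels, which is why the choice of factor, and of the dissipation correction, matters so much for accuracy.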

Keywords: additional dissipative mechanism, coarse-graining, discrete element method, fluidized bed

38 Model of a Context-Aware Middleware for Mobile Workers

Authors: Esraa Moustafa, Gaetan Rey, Stephane Lavirotte, Jean-Yves Tigli

Abstract:

With the development of the Internet of Things and the Web of Things, computing becomes more pervasive, invisible, and present everywhere. In our environment, we are surrounded by multiple devices that deliver (web) services meeting the needs of users. However, the mobility of these devices, as well as of the users, has important repercussions that challenge the software design of these applications, because the variability of the environment cannot be anticipated at design time. It is therefore desirable to discover the environment dynamically and to adapt the application during its execution to new contextual conditions. We propose a model of a context-aware middleware that addresses this issue through a monitoring service capable of reasoning, together with observation channels that compute the context at runtime. The monitoring service evaluates the predefined X-Query predicates in the context manager and uses Prolog to deduce the services needed in response. An independent observation channel for each predicate is then dynamically generated by the monitoring service, depending on the current state of the environment. Each channel sends its result directly to the context manager, which computes the context from all the predicates' results while preserving the reactivity of the self-adaptive system.
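The monitoring-service / observation-channel pattern described above can be sketched as follows. The paper evaluates X-Query predicates and reasons with Prolog; in this sketch, plain Python callables stand in for both, and all class and predicate names are illustrative only.

```python
# Minimal sketch: one channel per predicate, results pushed to the manager.

class ObservationChannel:
    def __init__(self, name, predicate, manager):
        self.name, self.predicate, self.manager = name, predicate, manager

    def observe(self, environment):
        # Evaluate the predicate against the current environment and
        # send the result directly to the context manager.
        self.manager.report(self.name, self.predicate(environment))

class ContextManager:
    def __init__(self):
        self.results = {}

    def report(self, name, value):
        self.results[name] = value

    def context(self):
        # The context is derived from all predicate results together.
        return {name for name, value in self.results.items() if value}

class MonitoringService:
    def __init__(self, predicates):
        self.manager = ContextManager()
        # One independent channel is generated per predicate.
        self.channels = [ObservationChannel(n, p, self.manager)
                         for n, p in predicates.items()]

    def tick(self, environment):
        for channel in self.channels:
            channel.observe(environment)
        return self.manager.context()
```

Because each channel reports independently, a slow predicate does not block the others, which is the property the abstract refers to as preserving the reactivity of the self-adaptive system.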

Keywords: auto-adaptation, context-awareness, middleware, reasoning engine

37 Mechanical Characterization of Porcine Skin with the Finite Element Method Based Inverse Optimization Approach

Authors: Djamel Remache, Serge Dos Santos, Michael Cliez, Michel Gratton, Patrick Chabrand, Jean-Marie Rossi, Jean-Louis Milan

Abstract:

Skin tissue is an inhomogeneous and anisotropic material. Uniaxial tensile testing is one of the primary techniques for the mechanical characterization of skin at large scales. To predict the mechanical behavior of materials, direct or inverse analytical approaches are often used; however, for an inhomogeneous and anisotropic material such as skin tissue, analytical approaches cannot provide solutions, and numerical simulation becomes necessary. In this work, the uniaxial tensile test and an FEM (finite element method) based inverse method were used to identify the anisotropic mechanical properties of porcine skin tissue. The uniaxial tensile experiments were performed on an Instron 8800 tensile machine. The uniaxial tensile test was simulated with the FEM, and the inverse optimization approach (or inverse calibration) was then used to identify the mechanical properties of the samples. Experimental results were compared to the finite element solutions. The comparison showed that the finite element model predictions of the mechanical behavior of the tested skin samples correlated well with the experimental results.
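The inverse-calibration loop has a simple structure: guess material parameters, run the forward simulation, compare with the experimental curve, and update the guess. The sketch below illustrates that loop under strong simplifying assumptions: a one-line analytic model (uniaxial linear elasticity, F = E·A·ε) stands in for the full anisotropic FEM solve, and a single modulus is fitted by golden-section search rather than the paper's actual optimizer.

```python
# Hedged sketch of inverse calibration; the forward model is a stand-in
# for the FEM simulation, and E is the only parameter identified.

def forward_model(E, area, strains):
    """Stand-in for the FEM solve: uniaxial linear elasticity."""
    return [E * area * s for s in strains]

def sse(E, area, strains, forces):
    """Sum-of-squares misfit between model and experimental forces."""
    return sum((f_m - f_e) ** 2
               for f_m, f_e in zip(forward_model(E, area, strains), forces))

def calibrate(area, strains, forces, lo=1e3, hi=1e7, iters=60):
    """Golden-section search for the modulus minimizing the misfit."""
    phi = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if sse(c, area, strains, forces) < sse(d, area, strains, forces):
            b = d
        else:
            a = c
    return (a + b) / 2
```

In the real study, each misfit evaluation requires a full FEM simulation, which is why the choice of optimizer and the number of identified parameters dominate the cost of the calibration.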

Keywords: mechanical skin tissue behavior, uniaxial tensile test, finite element analysis, inverse optimization approach

36 Characterization of Organic Matter in Spodosol Amazonian by Fluorescence Spectroscopy

Authors: Amanda M. Tadini, Houssam Hajjoul, Gustavo Nicolodelli, Stéphane Mounier, Célia R. Montes, Débora M. B. P. Milori

Abstract:

Soil organic matter (SOM) plays an important role in maintaining soil productivity and in promoting biological diversity. The main components of SOM are the humic substances, which can be fractionated according to their solubility into humic acids (HA), fulvic acids (FA), and humin (HU). Determining the chemical properties of organic matter, as well as its interaction with metallic species, is an important tool for understanding the structure of the humic fractions. Fluorescence spectroscopy has been studied as a source of information about what is happening at the molecular level in these compounds. In particular, the soils of the Amazon region belong to one of the planet's most important ecosystems. The aim of this study is to understand the molecular and structural composition of HA samples from an Amazonian Spodosol using the fluorescence Excitation-Emission Matrix (EEM) and Time-Resolved Fluorescence Spectroscopy (TRFS). The results showed that the HA samples contained two fluorescent components, one with a more complex structure and the other with a simpler structure, which was also seen in the TRFS data through the evaluation of each sample's fluorescence lifetime. Studies of this nature are therefore important because they evaluate the molecular and structural characteristics of the humic fractions in a region considered one of the most important in the world, the Amazon.
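To make the lifetime evaluation concrete: in TRFS, a fluorescence lifetime is extracted by fitting the measured decay curve. Real analyses deconvolve the instrument response and typically fit multi-exponential decays; the sketch below is a deliberately simplified mono-exponential fit, I(t) = A·exp(-t/τ), estimated by log-linear least squares.

```python
import math

# Simplified stand-in for TRFS lifetime analysis: a mono-exponential
# decay fitted via linear regression on ln I(t) = ln A - t / tau.

def fit_lifetime(times, intensities):
    """Return the lifetime tau, in the same unit as `times`."""
    logs = [math.log(i) for i in intensities]  # intensities must be > 0
    n = len(times)
    t_mean = sum(times) / n
    l_mean = sum(logs) / n
    slope = (sum((t - t_mean) * (l - l_mean) for t, l in zip(times, logs))
             / sum((t - t_mean) ** 2 for t in times))
    return -1.0 / slope
```

A more complex humic-acid component would typically show a different lifetime than a simpler one, which is how the TRFS data corroborates the two-component picture from the EEM analysis.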

Keywords: Amazonian soil, characterization, fluorescence, humic acid, lifetime

35 Yawning Computing Using Bayesian Networks

Authors: Serge Tshibangu, Turgay Celik, Zenzo Ncube

Abstract:

Road crashes kill over a million people every year and leave millions more injured or permanently disabled. Various annual reports reveal that the percentage of fatal crashes due to fatigue or to drivers falling asleep comes directly after the percentage of fatal crashes due to intoxicated drivers, and is higher than the combined percentage of fatal crashes due to illegal/unsafe U-turns and illegal/unsafe reversing. Although a relatively small percentage of police reports on road accidents highlights drowsiness and fatigue, the importance of these factors is greater than we might think, hidden by the undercounting of such events. Some scenarios show that these factors are significant in accidents with killed and injured people, hence the need for an automatic driver fatigue detection system to considerably reduce the number of accidents owing to fatigue. This research approaches the driver fatigue detection problem in an innovative way by combining cues collected from temporal analysis of both the drivers' faces and the environment. Monotony in the driving environment is correlated with visual symptoms of fatigue on drivers' faces to achieve fatigue detection. Optical and infrared (IR) sensors are used to analyze the monotony of the driving environment and to detect the visual symptoms of fatigue on the human face. Internal cues from the drivers' faces and external cues from the environment are then combined using machine learning algorithms to automatically detect fatigue.
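The cue-fusion idea can be illustrated with a toy Bayesian model: internal cues (from the face) and external cues (from the environment) each shift the posterior probability of fatigue. The conditional probabilities below are invented for the example; the paper's actual network structure and parameters are not given in the abstract, and the cues are treated here as conditionally independent (a naive-Bayes simplification).

```python
# Toy naive-Bayes fusion of fatigue cues; all CPT values are invented.

PRIOR_FATIGUE = 0.1
# cue -> (P(cue observed | fatigued), P(cue observed | alert))
CPT = {
    "yawning":         (0.70, 0.05),  # internal cue from the face
    "slow_blinks":     (0.60, 0.10),  # internal cue from the face
    "monotonous_road": (0.80, 0.40),  # external cue from the environment
}

def p_fatigue(observed_cues):
    """Posterior P(fatigue | cues) assuming conditional independence."""
    p_f, p_a = PRIOR_FATIGUE, 1.0 - PRIOR_FATIGUE
    for cue, (p_given_f, p_given_a) in CPT.items():
        if cue in observed_cues:
            p_f *= p_given_f
            p_a *= p_given_a
        else:
            p_f *= 1.0 - p_given_f
            p_a *= 1.0 - p_given_a
    return p_f / (p_f + p_a)
```

The point of the fusion is visible even in this toy: a monotonous road alone is weak evidence, but combined with yawning and slow blinks the posterior rises sharply, which is the behavior a fielded detector needs.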

Keywords: intelligent transportation systems, bayesian networks, yawning computing, machine learning algorithms

34 Performing Diagnosis in Building with Partially Valid Heterogeneous Tests

Authors: Houda Najeh, Mahendra Pratap Singh, Stéphane Ploix, Antoine Caucheteux, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to different kinds of faults and human misbehaviors. Energy efficiency and user comfort are directly affected by abnormalities in building operation. The available fault diagnosis tools and methodologies rely mainly on rules or on pure model-based approaches, where it is assumed that a model- or rule-based test can be applied to any situation without taking the actual testing context into account. Contextual tests with a validity domain could greatly simplify the design of detection tests. The main objective of this paper is to take test validity into account when validating the test model, considering non-modeled events such as occupancy, weather conditions, and door and window openings, and integrating expert knowledge of the state of the system. The concept of heterogeneous tests is combined with test validity to generate fault diagnoses. A combination of rule-based, range-based, and model-based tests, known as heterogeneous tests, is proposed to reduce the modeling complexity. The calculation of logical diagnoses, drawing on artificial intelligence, provides a global explanation consistent with the test results. An application example, an office setting at the Grenoble Institute of Technology, shows the efficiency of the proposed technique.
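The validity-filtered diagnosis step can be sketched as follows. Each test carries a validity predicate over the current context, a pass/fail verdict, and the set of components it involves; a failed valid test implicates at least one of its components, and candidate diagnoses are the minimal-cardinality hitting sets of the failed tests' component sets. This is a generic consistency-based reading of the approach; the paper's actual logical machinery may differ, and the example components and tests are invented.

```python
from itertools import combinations

# Sketch: only tests whose validity predicate holds in the current
# context contribute; diagnoses are minimal-cardinality hitting sets.

def diagnoses(tests, context):
    conflicts = [set(t["components"]) for t in tests
                 if t["valid"](context) and not t["passed"]]
    if not conflicts:
        return [set()]  # no valid failed test: the no-fault diagnosis
    components = sorted(set().union(*conflicts))
    for size in range(1, len(components) + 1):
        found = [set(cand) for cand in combinations(components, size)
                 if all(set(cand) & conflict for conflict in conflicts)]
        if found:
            return found  # smallest fault sets explaining every failure
    return []
```

Note how validity changes the answer: a test that is invalid in the current context (say, a heater test while a window is open) is simply excluded, so it can neither implicate nor exonerate its components.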

Keywords: heterogeneous tests, validity, building system, sensor grids, sensor fault, diagnosis, fault detection and isolation
