Search results for: Stéphane Roche
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 67

67 Performance of the Aptima® HIV-1 Quant Dx Assay on the Panther System

Authors: Siobhan O’Shea, Sangeetha Vijaysri Nair, Hee Cheol Kim, Charles Thomas Nugent, Cheuk Yan William Tong, Sam Douthwaite, Andrew Worlock

Abstract:

The Aptima® HIV-1 Quant Dx Assay is a fully automated assay on the Panther system, based on transcription-mediated amplification and real-time detection technologies. It is intended for monitoring HIV-1 viral load in plasma specimens and for detecting HIV-1 in plasma and serum specimens. Nine hundred and seventy-nine specimens selected at random from routine testing at St Thomas’ Hospital, London, were anonymised and used to compare the performance of the Aptima HIV-1 Quant Dx assay and the Roche COBAS® AmpliPrep/COBAS® TaqMan® HIV-1 Test, v2.0. Two hundred and thirty-four specimens gave quantitative HIV-1 viral load results in both assays. The quantitative results reported by the Aptima assay were comparable to those reported by the Roche COBAS AmpliPrep/COBAS TaqMan HIV-1 Test, v2.0, with a linear regression slope of 1.04 and an intercept of -0.097. The Aptima assay detected HIV-1 in more samples than the Roche assay. This was not due to a lack of specificity of the Aptima assay, which gave 99.83% specificity on testing plasma specimens from 600 HIV-1-negative individuals. To understand the reason for this higher detection rate, low-level panels made from the HIV-1 3rd International Standard (NIBSC 10/152) and clinical samples of various subtypes were tested side by side in both assays. The Aptima assay was more sensitive than the Roche assay. The good sensitivity, specificity and agreement with other commercial assays make the HIV-1 Quant Dx Assay appropriate for both viral load monitoring and the detection of HIV-1 infection.
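As a rough illustration of the assay comparison above, the slope and intercept of an ordinary least-squares fit between paired log₁₀ viral-load results can be computed as follows. This is a minimal sketch with made-up values synthesised from the reported relationship; it does not reproduce the study's data or regression method.

```python
# Hypothetical paired log10 viral loads (IU/mL); made-up values, not the study's data
roche = [2.1, 2.8, 3.5, 4.2, 4.9, 5.6]
# Synthesise Aptima results that follow the reported relationship exactly
aptima = [1.04 * x - 0.097 for x in roche]

n = len(roche)
mean_x = sum(roche) / n
mean_y = sum(aptima) / n
# Ordinary least-squares slope and intercept of Aptima on Roche results
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(roche, aptima)) / \
        sum((x - mean_x) ** 2 for x in roche)
intercept = mean_y - slope * mean_x
print(f"slope = {slope:.2f}, intercept = {intercept:.3f}")
```

A slope near 1 and an intercept near 0, as reported in the abstract, indicate close quantitative agreement between the two assays.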

Keywords: HIV viral load, Aptima, Roche, Panther system

Procedia PDF Downloads 338
66 A Simulated Scenario of WikiGIS to Support the Iteration and Traceability Management of the Geodesign Process

Authors: Wided Batita, Stéphane Roche, Claude Caron

Abstract:

Geodesign is an emergent term describing a new and complex process, and it therefore requires rethinking the tools, technologies and platforms used to achieve its goals efficiently. A few tools have emerged since 2010, such as CommunityViz and GeoPlanner. In the era of Web 2.0 and collaboration, WikiGIS has been proposed as a new category of tools. In this paper, we present the WikiGIS functionalities dealing mainly with iteration and traceability management to support the collaboration of the Geodesign process. WikiGIS is built on GeoWeb 2.0 technologies (and primarily on wiki) and aims at tracking participants’ editing. This paper focuses on a simplified simulation to illustrate the strength of WikiGIS in managing traceability and in providing access to history in a Geodesign process. A cartographic user interface has been implemented, and a hypothetical use case is presented as a proof of concept.

Keywords: geodesign, history, traceability, tracking of participants’ editing, WikiGIS

Procedia PDF Downloads 211
65 Comparison of Different DNA Extraction Platforms with FFPE tissue

Authors: Wang Yanping Karen, Mohd Rafeah Siti, Park MI Kyoung

Abstract:

Formalin-fixed paraffin-embedded (FFPE) tissue is important in oncological diagnostics. This method of preserving tissues enables them to be stored easily at ambient temperature for a long time, reducing sample wastage and making FFPE cost-effective. However, extracting DNA from FFPE tissue is a challenge, as the purified DNA is often highly cross-linked, fragmented and degraded, which causes problems for many downstream processes. In this study, we compare the DNA extraction efficiency of One BioMed’s Xceler8 automated platform with commercially available extraction kits (Qiagen and Roche). The FFPE tissue slices were subjected to deparaffinization and pretreatment, followed by DNA extraction on the three platforms. DNA quantity was determined by real-time PCR (Bio-Rad CFX) and gel electrophoresis. The amount of DNA extracted with One BioMed’s X8 platform was found to be comparable with that from the two manual extraction kits.

Keywords: DNA extraction, FFPE tissue, Qiagen, Roche, One BioMed X8

Procedia PDF Downloads 65
64 Direct Measurements of the Electrocaloric Effect in Solid Ferroelectric Materials via Thermoreflectance

Authors: Layla Farhat, Mathieu Bardoux, Stéphane Longuemart, Ziad Herro, Abdelhak Hadj Sahraoui

Abstract:

The electrocaloric (EC) effect refers to the isothermal entropy or adiabatic temperature change of a dielectric material induced by an external electric field. This phenomenon was long ignored for applications because only modest EC effects (2.6

Keywords: electrocaloric effect, thermoreflectance, ferroelectricity, cooling system

Procedia PDF Downloads 150
63 Cut-Off of CMV Cobas® Taqman® (CAP/CTM Roche®) for Introduction of Ganciclovir Pre-Emptive Therapy in Allogeneic Hematopoietic Stem Cell Transplant Recipients

Authors: B. B. S. Pereira, M. O. Souza, L. P. Zanetti, L. C. S. Oliveira, J. R. P. Moreno, M. P. Souza, V. R. Colturato, C. M. Machado

Abstract:

Background: The introduction of prophylactic or preemptive therapies has effectively decreased CMV mortality rates after hematopoietic stem cell transplantation (HSCT). CMV antigenemia (pp65) and quantitative PCR are methods currently approved for CMV surveillance in pre-emptive strategies. Commercial assays are preferred, as cut-off levels defined by in-house assays may vary among different protocols and in general show low reproducibility. Moreover, comparison of published data among different centers is only possible if international quantification standards are included in the assays. Recently, the World Health Organization (WHO) established the first international standard for CMV detection. The real-time PCR COBAS AmpliPrep/COBAS TaqMan (CAP/CTM) (Roche®) assay was developed using the WHO standard for CMV quantification. However, the cut-off for the introduction of antivirals has not yet been determined. Methods: We conducted a retrospective study to determine: 1) the sensitivity and specificity of the new CMV CAP/CTM test in comparison with pp65 antigenemia for detecting episodes of CMV infection/reactivation, and 2) the viral load cut-off for the introduction of ganciclovir (GCV). pp65 antigenemia was performed, and the corresponding plasma samples were stored at -20°C for later CMV detection by CAP/CTM. The tests were compared by kappa index. The appearance of positive antigenemia was considered the state variable for determining the CMV viral load cut-off by ROC curve. Statistical analysis was performed using SPSS software version 19 (SPSS, Chicago, IL, USA). Results: Thirty-eight patients were included and followed from August 2014 through May 2015. The antigenemia test detected 53 episodes of CMV infection in 34 patients (89.5%), while CAP/CTM detected 37 episodes in 33 patients (86.8%). AG and PCR results were compared in 431 samples, and the kappa index was 30.9%. The median time to first AG detection was 42 days (range 28-140), while CAP/CTM detected CMV a median of 7 days earlier (34 days, range 7-110). The optimum cut-off value of CMV DNA for detecting positive antigenemia was 34.25 IU/mL, with 88.2% sensitivity, 100% specificity and an AUC of 0.91. This cut-off value is below the limit of detection and quantification of the equipment, which is 56 IU/mL. According to the CMV recurrence definition, 16 episodes of CMV recurrence were detected by antigenemia (47.1%) and 4 (12.1%) by CAP/CTM. The duration of viremia as detected by antigenemia was shorter (60.5% of episodes lasted ≤ 7 days) than by CAP/CTM (57.9% of episodes lasted 15 days or more). These data suggest that using antigenemia to define the duration of GCV therapy might prompt early interruption of the antiviral, which may favor CMV reactivation. CAP/CTM PCR could possibly provide safer guidance on the duration of GCV therapy. As prolonged treatment may increase the risk of toxicity, this hypothesis should be confirmed in prospective trials. Conclusions: Even though the Roche CAP/CTM showed good qualitative correlation with the antigenemia technique, the fully automated CAP/CTM did not demonstrate increased sensitivity. A cut-off value below the limit of detection and quantification may result in delayed introduction of pre-emptive therapy.
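The ROC-based cut-off described above can be sketched as a search for the threshold maximising Youden's J (sensitivity + specificity - 1). The viral loads and antigenemia labels below are illustrative values, not the study's data.

```python
# Illustrative viral loads (IU/mL) paired with antigenemia status (1 = positive);
# the values are made up, not the study's data.
samples = [(10, 0), (20, 0), (25, 0), (30, 0), (35, 1), (40, 1), (60, 1), (80, 1)]

def youden_cutoff(samples):
    """Return the cut-off maximising Youden's J = sensitivity + specificity - 1."""
    positives = [v for v, y in samples if y == 1]
    negatives = [v for v, y in samples if y == 0]
    best_cut, best_j = None, -1.0
    for cut in sorted({v for v, _ in samples}):
        sensitivity = sum(v >= cut for v in positives) / len(positives)
        specificity = sum(v < cut for v in negatives) / len(negatives)
        j = sensitivity + specificity - 1.0
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

print(youden_cutoff(samples))
```

In practice such a cut-off is read off a full ROC curve, as done in the study with SPSS.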

Keywords: antigenemia, CMV COBAS/TAQMAN, cytomegalovirus, antiviral cut-off

Procedia PDF Downloads 160
62 Establishment and Aging Process Analysis in Dermal Fibroblast Cell Culture of Green Turtle (Chelonia mydas)

Authors: Yemima Dani Riani, Anggraini Barlian

Abstract:

The green turtle (Chelonia mydas) is a well-known long-lived turtle whose age can reach 100 years. Senescence in the green turtle is an interesting process to study because, until now, no clear explanation of senescence at the cellular or molecular level has been established for this species. Since 1999, the green turtle has been listed as an endangered species; hence, the establishment of a fibroblast skin cell culture of the green turtle may provide material for future studies of senescence. One common marker used for detecting senescence is telomere shortening. Reduced activity of telomerase, the reverse transcriptase that adds the TTAGGG DNA sequence to telomere ends, may also cause senescence. The purposes of this research were to establish and identify a green turtle fibroblast skin cell culture and to compare telomere length and telomerase activity between passages 5 and 14. The primary cell culture was made with the primary explant method and cultured in Leibovitz-15 (Sigma) supplemented with 10% Fetal Bovine Serum (Sigma) and 100 U/mL Penicillin/Streptomycin (Sigma) at 30 ± 1°C. Cells were identified with Rabbit Anti-Vimentin Polyclonal Antibody (Abcam) and Goat Polyclonal Antibody (Abcam) using a confocal microscope (Zeiss LSM 170). Telomere length was obtained using the TeloTAGGG Telomere Length Assay (Roche), while telomerase activity was obtained using the TeloTAGGG Telomerase PCR ELISAPlus (Roche). The primary cell culture from green turtle skin had fibroblastic morphology, and an immunocytochemistry test with vimentin antibody proved that the culture was fibroblast cells. Measurements showed that telomere length and telomerase activity at passage 14 were greater than at passage 5. However, based on morphology, the green turtle fibroblast skin cell culture showed a senescent morphology. Based on the analysis of telomere length and telomerase activity, it is suspected that the fibroblast skin cell culture of the green turtle does not undergo aging through telomere shortening.

Keywords: cell culture, chelonia mydas, telomerase, telomere, senescence

Procedia PDF Downloads 393
61 Thyroid Dysfunction in Patients with Chronic Hemodialysis

Authors: Benghezel Hichem

Abstract:

Thyroid dysfunction in hemodialysis subjects is represented mainly by hypothyroidism. The objective of our work was to determine the thyroid profile of our hemodialysis patients and to highlight the prevalence of different thyroid disorders. Methods: This is a retrospective, monocentric study performed over 2 months (February and March 2013) on 42 hemodialysis patients (11 male and 31 female). We assayed the thyroid hormones thyrotropin (TSH), free thyroxine (FT4) and free triiodothyronine (FT3) by a chemiluminescence immunoassay method on a cobas 6000 analyzer (Roche Diagnostics). Results: The prevalence of biological hypothyroidism was 18%: 7% of patients had isolated high TSH (mean ± SD 9.44 ± 6.29); 5% had high TSH with low FT4 (mean ± SD 8.18 ± 0.53 for TSH and 9.69 ± 0.22 for FT4); one patient had high TSH with low FT4 and FT3; and 4% had a low T3 syndrome (mean ± SD 3.93 ± 0.3 for FT3). We also noted that 5% of patients had hyperthyroidism with collapsed TSH (mean ± SD 0.017 ± 0.001). Conclusion: Biological hypothyroidism is a common endocrine disorder in chronic hemodialysis.

Keywords: hypothyroidism, hemodialysis, thyrotropin, free thyroxine, triiodothyronine

Procedia PDF Downloads 392
60 Comparison of Nucleic Acid Extraction Platforms On Tissue Samples

Authors: Siti Rafeah Md Rafei, Karen Wang Yanping, Park Mi Kyoung

Abstract:

Tissue samples are a precious resource for molecular studies and for disease identification with molecular assays, namely real-time PCR (qPCR). It is critical to establish the nucleic acid extraction method that best yields PCR-amplifiable genomic DNA. Furthermore, automated nucleic acid extraction is an appealing alternative to labor-intensive manual methods. Operational complexity, defined as the number of steps required to obtain an extracted sample, is one of the criteria in the comparison. Here we compare One BioMed’s automated X8 platform with the commercially available manual kits from QIAGEN (QIAamp Mini Kit) and Roche. We extracted DNA from rat fresh-frozen tissue from different types of organs. After tissue pre-treatment, the sample was added to One BioMed’s X8 pre-filled cartridge and to the QIAGEN QIAamp column, respectively. We found that the results after subjecting the eluates to real-time PCR on a Bio-Rad CFX are comparable.

Keywords: DNA extraction, frozen tissue, PCR, qPCR, rat

Procedia PDF Downloads 116
59 AI-Driven Strategies for Sustainable Electronics Repair: A Case Study in Energy Efficiency

Authors: Badiy Elmabrouk, Abdelhamid Boujarif, Zhiguo Zeng, Stephane Borrel, Robert Heidsieck

Abstract:

In an era where sustainability is paramount, this paper introduces a machine-learning-driven testing protocol that accurately predicts diode failures, merging reliability engineering with failure physics to enhance the efficiency of repair operations. Our approach refines the burn-in process, significantly curtailing its duration, which not only conserves energy but also raises productivity and mitigates component wear. A case study from GE HealthCare’s repair center demonstrates the method’s effectiveness, recording a high diode-failure prediction rate and a substantial decrease in energy consumption that translates to an annual reduction of 6.5 tons of CO2 emissions. This advancement sets a benchmark for environmentally conscious practices in the electronics repair sector.

Keywords: maintenance, burn-in, failure physics, reliability testing

Procedia PDF Downloads 17
58 Rapid Detection of MBL Genes by SYBR Green Based Real-Time PCR

Authors: Taru Singh, Shukla Das, V. G. Ramachandran

Abstract:

Objectives: To develop a SYBR Green-based real-time PCR assay to detect carbapenemase (NDM, IMP) genes in E. coli. Methods: A total of 40 E. coli isolates from stool samples were tested. Six had previously been characterized as resistant to carbapenems and documented by PCR; the remaining 34 isolates had previously tested susceptible to carbapenems and were negative for these genes. Bacterial RNA was extracted using a manual method. Real-time PCR was performed on the Light Cycler III 480 instrument (Roche) with specific primers for each carbapenemase target. Results: Each of the two carbapenemase genes tested presented a different melting curve after PCR amplification. Melting temperature (Tm) analysis identified the amplicons as follows: blaIMP type (Tm 82.18°C) and blaNDM-1 (Tm 78.8°C). No amplification was detected among the negative samples. The results showed 100% concordance with the previously identified genotypes. Conclusions: The new assay was able to detect the presence of two different carbapenemase gene types by real-time PCR.
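Assigning an amplicon to a target by its melting temperature, as described above, can be sketched as a nearest-Tm lookup. The reported Tm values come from the abstract; the tolerance window is an assumed illustration, not part of the published assay.

```python
# Reported melting temperatures (°C) for the two targets (from the abstract)
TM_SIGNATURES = {"blaIMP": 82.18, "blaNDM-1": 78.8}

def call_gene(tm, tol=0.5):
    """Assign an amplicon to the carbapenemase gene with the nearest Tm,
    or return None if no reference Tm lies within the tolerance."""
    gene, ref = min(TM_SIGNATURES.items(), key=lambda kv: abs(kv[1] - tm))
    return gene if abs(ref - tm) <= tol else None

print(call_gene(82.3))   # close to the blaIMP signature
print(call_gene(70.0))   # no target within tolerance: primer-dimer or artifact
```

Real melt-curve software fits the negative derivative of fluorescence to locate Tm peaks; this sketch only shows the final classification step.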

Keywords: resistance, b-lactamases, E. coli, real-time PCR

Procedia PDF Downloads 376
57 Segmentation of Gray Scale Images of Dropwise Condensation on Textured Surfaces

Authors: Helene Martin, Solmaz Boroomandi Barati, Jean-Charles Pinoli, Stephane Valette, Yann Gavet

Abstract:

In the present work, we developed an image processing algorithm to measure the characteristics of water droplets during dropwise condensation on pillared surfaces. The main difficulty in this process is the similarity in shape and size between the water droplets and the pillars. The developed method divides droplets into four main groups based on their size and applies a corresponding algorithm to segment each group. These algorithms generate binary images of droplets based on both their geometrical and intensity properties. Information on the evolution of the droplets over time, including the mean radius and the number of drops per unit area, is then extracted from the binary images. The image processing algorithm is verified against manual detection and applied to two different sets of images corresponding to two kinds of pillared surfaces.

Keywords: dropwise condensation, textured surface, image processing, watershed

Procedia PDF Downloads 188
56 Application of the MOOD Technique to the Steady-State Euler Equations

Authors: Gaspar J. Machado, Stéphane Clain, Raphael Loubère

Abstract:

The goal of the present work is to numerically study steady-state nonlinear hyperbolic equations in the finite volume framework. We consider the unidimensional Burgers' equation as the reference scalar case and the unidimensional Euler equations for the vectorial case. Two approaches are used to solve the nonlinear equations: a time-marching algorithm and a direct steady-state approach. We first develop necessary and sufficient conditions for the existence and uniqueness of the solution. We treat regular examples as well as solutions with a steady shock, and to provide very-high-order finite volume approximations, we implement a method based on the MOOD (Multi-dimensional Optimal Order Detection) technology. The main ingredient is an 'a posteriori' limiting strategy that eliminates the non-physical oscillations deriving from the Gibbs phenomenon while keeping high accuracy in the smooth part of the solution.
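As a toy illustration of the time-marching approach on the scalar case (a first-order Godunov scheme, not the authors' very-high-order MOOD method), the sketch below evolves Burgers' equation toward a steady state containing a stationary shock. Grid size, time step and boundary states are assumed for illustration.

```python
def godunov_flux(ul, ur):
    """Exact Godunov numerical flux for Burgers' flux f(u) = u^2 / 2."""
    if ul <= ur:                         # rarefaction: minimise f over [ul, ur]
        if ul <= 0.0 <= ur:
            return 0.0
        return min(ul * ul, ur * ur) / 2.0
    return max(ul * ul, ur * ur) / 2.0   # shock: maximise f over [ur, ul]

n, dx, dt = 100, 1.0 / 100, 0.004        # CFL number = dt/dx * max|u| = 0.4
# Riemann data u = 1 (left), u = -1 (right): a stationary shock at x = 0.5
u = [1.0 if (i + 0.5) * dx < 0.5 else -1.0 for i in range(n)]
for _ in range(2000):                    # time marching toward the steady state
    inner = [godunov_flux(u[i], u[i + 1]) for i in range(n - 1)]
    # Dirichlet boundaries u = 1 (left) and u = -1 (right) via ghost states
    f = [godunov_flux(1.0, u[0])] + inner + [godunov_flux(u[-1], -1.0)]
    u = [u[i] - dt / dx * (f[i + 1] - f[i]) for i in range(n)]
```

The first-order scheme captures the steady shock without oscillation but is only first-order accurate in smooth regions; MOOD-type limiting is what allows very-high-order accuracy away from the shock.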

Keywords: Euler equations, finite volume, MOOD, steady-state

Procedia PDF Downloads 242
55 Preparation of Nanocomposites Based on Biodegradable Polycaprolactone by Melt Mixture

Authors: Mohamed Amine Zenasni, Bahia Meroufel, André Merlin, Said Benfarhi, Stéphane Molina, Béatrice George

Abstract:

The introduction of nano-fillers into the polymer field led to the creation of nanocomposites, which are starting a new revolution in the world of materials. Nanocomposites are similar to traditional composites, a blend of polymer and filler, but with at least one nanoscopic dimension. In our project, we worked with nanocomposites of a biodegradable polymer, polycaprolactone, combined with a nano-clay (Maghnite) and with different nano-organo-clays. These nanocomposites were prepared by the melt mixture method. The advantages of this polymer are its degradability and biocompatibility. We studied the relationship between the processing, microstructure and physicochemical properties of nanocomposites prepared with untreated clays and with clays modified with 3-aminopropyltriethoxysilane (APTES) and hexadecyltrimethylammonium bromide (CTAB). The melt mixture method is among the most suitable methods for obtaining good dispersion, known as exfoliation.

Keywords: nanocomposite, biodegradable, polycaprolactone, maghnite, melt mixture, APTES, CTAB

Procedia PDF Downloads 397
54 Uncertainty Analysis of a Hardware in Loop Setup for Testing Products Related to Building Technology

Authors: Balasundaram Prasaant, Ploix Stephane, Delinchant Benoit, Muresan Cristian

Abstract:

Hardware in Loop (HIL) testing is done to test and validate a product, and in building technology it is especially important to test products for their efficiency. The test rig in the HIL simulator may contribute some uncertainty to the measured efficiency, including both physical and scenario-based uncertainties. In this paper, a simple uncertainty analysis framework for an HIL setup is presented, considering only the physical uncertainties. The entire HIL setup is modeled in Dymola. The uncertain sources are identified from available knowledge of the components and from expert knowledge. Monte Carlo simulation is used for the propagation of uncertainty, since it is reliable and easy to use. This article shows how an HIL setup can be modeled and how uncertainty propagation can be performed on it. Such an approach is not common in building energy analysis.
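The Monte Carlo propagation step can be sketched as follows. The efficiency model (heat out over heat in) and the uncertainty magnitudes are assumptions for illustration, not values from the paper.

```python
import random

random.seed(42)  # reproducible sampling

N = 100_000
samples = []
for _ in range(N):
    # Hypothetical Gaussian measurement uncertainties on the rig's heat flows
    q_in = random.gauss(10.0, 0.2)    # kW supplied to the test rig
    q_out = random.gauss(9.0, 0.3)    # kW recovered from the product
    samples.append(q_out / q_in)      # measured efficiency for this draw

mean = sum(samples) / N
std = (sum((s - mean) ** 2 for s in samples) / (N - 1)) ** 0.5
print(f"efficiency = {mean:.3f} +/- {std:.3f}")
```

The spread of the output distribution is the uncertainty the test rig contributes to the measured efficiency; in the paper the same sampling idea is applied to the full Dymola model rather than a closed-form ratio.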

Keywords: energy in buildings, hardware in loop testing, modelica modelling, Monte Carlo simulation, uncertainty propagation

Procedia PDF Downloads 102
53 Solvent Extraction in Ionic Liquids: Structuration and Aggregation Effects on Extraction Mechanisms

Authors: Sandrine Dourdain, Cesar Lopez, Tamir Sukhbaatar, Guilhem Arrachart, Stephane Pellet-Rostaing

Abstract:

A promising challenge in solvent extraction is to replace the conventional organic solvents with ionic liquids (IL). Depending on the extraction system, these new solvents show better efficiency than the conventional ones. Although some assumptions based on ion exchange have been proposed in the literature, these properties are not predictable because the mechanisms involved are still poorly understood. It is well established that the mechanisms underlying solvent extraction are based not only on the molecular chelation of the extractant molecules but also on their ability to form supra-molecular aggregates due to their amphiphilic nature. It is therefore essential to evaluate how ILs affect the aggregation properties of the extractant molecules. Our aim is to evaluate the influence of IL structure and polarity on solvent extraction mechanisms by looking at the aggregation of the extractant molecules in ILs. We compare extractant systems that are well characterized in common solvents and show, through SAXS and SANS measurements, that in the absence of IL ion-exchange mechanisms, the extraction properties are related to aggregation.

Keywords: solvent extraction in Ionic liquid, aggregation, Ionic liquids structure, SAXS, SANS

Procedia PDF Downloads 122
52 N-Heptane as Model Molecule for Cracking Catalyst Evaluation to Improve the Yield of Ethylene and Propylene

Authors: Tony K. Joseph, Balasubramanian Vathilingam, Stephane Morin

Abstract:

Currently, refiners around the world are focused on improving the yield of light olefins (propylene and ethylene), as both are very prominent raw materials for a wide spectrum of polymeric materials such as polyethylene and polypropylene. It is therefore desirable to increase the yield of light olefins via selective cracking of heavy oil fractions. In this study, zeolite grown on SiC was used as the catalyst for the model cracking reaction of n-heptane. The catalytic cracking of n-heptane was performed in a fixed-bed reactor (12 mm i.d.) at three temperatures (425, 450 and 475 °C) and at atmospheric pressure. The carrier gas (N₂) was mixed with n-heptane at a ratio of 90:10 (N₂:n-heptane), and the gaseous mixture was introduced into the fixed-bed reactor. Various reactant flow rates were tested to increase the yield of ethylene and propylene. For comparison, a commercial zeolite was tested in addition to the zeolite on SiC. The products were analyzed using an Agilent gas chromatograph (GC-9860) equipped with a flame ionization detector (FID). The GC is connected online with the reactor, and all the cracking tests were successfully reproduced. The full catalytic evaluation results will be presented during the conference.

Keywords: cracking, catalyst, evaluation, ethylene, heptane, propylene

Procedia PDF Downloads 104
51 Estimating Occupancy in Residential Context Using Bayesian Networks for Energy Management

Authors: Manar Amayri, Hussain Kazimi, Quoc-Dung Ngo, Stephane Ploix

Abstract:

A general approach is proposed to determine occupant behavior (occupancy and activity) in residential buildings and to use these estimates for improved energy management. Occupant behaviour is modelled with a Bayesian network in an unsupervised manner. The algorithm makes use of domain knowledge gathered via questionnaires and of recorded sensor data for motion detection, power and hot water consumption, as well as indoor CO₂ concentration. Two case studies are presented that show the real-world applicability of estimating occupant behaviour in this way. Furthermore, experiments integrating occupancy estimation and hot water production control show that energy efficiency can be increased by roughly 5% over known optimal control techniques and by more than 25% over rule-based control while maintaining the same occupant comfort standards. The efficiency gains are strongly correlated with occupant behaviour and with the accuracy of the occupancy estimates.
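A drastically simplified, hypothetical version of such an estimator illustrates the inference step: a naive-Bayes model (a degenerate Bayesian network) combining two binary sensor readings. The probability tables and prior below are made up, not taken from the study.

```python
# Hypothetical conditional probability tables P(reading is high | occupancy);
# the numbers are illustrative, not taken from the study.
P_CO2_HIGH = {True: 0.8, False: 0.1}     # CO2 concentration sensor
P_POWER_HIGH = {True: 0.7, False: 0.2}   # power consumption sensor
PRIOR_OCCUPIED = 0.4

def posterior_occupied(co2_high, power_high):
    """Posterior P(occupied | evidence), assuming conditionally independent sensors."""
    def joint(occ):
        p = PRIOR_OCCUPIED if occ else 1.0 - PRIOR_OCCUPIED
        for table, obs in ((P_CO2_HIGH, co2_high), (P_POWER_HIGH, power_high)):
            p *= table[occ] if obs else 1.0 - table[occ]
        return p
    return joint(True) / (joint(True) + joint(False))

print(posterior_occupied(True, True))    # both sensors high: strong evidence
print(posterior_occupied(False, False))  # both low: room likely empty
```

The paper's network is richer (motion, hot water, activity nodes, unsupervised learning of the tables), but the posterior computation follows the same Bayes-rule pattern.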

Keywords: energy, management, control, optimization, Bayesian methods, learning theory, sensor networks, knowledge modelling and knowledge based systems, artificial intelligence, buildings

Procedia PDF Downloads 342
50 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings

Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Several literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers; occurrences of data gaps have not been given adequate attention in academia either. The proposed methodology comprises automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are assumed to form regular time series; in reality, however, sensor values are not uniformly sampled. The issue to solve is therefore: beyond which delay has a sensor become faulty? Time series are required to detect abnormalities in these delays. The efficiency of the method is evaluated on measurements obtained from a real platform: an office at Grenoble Institute of Technology equipped with 30 sensors.
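One simple way to set a gap threshold automatically, sketched below, is to compare each inter-sample delay to a multiple of the sensor's median delay. This is an illustration of the general idea, not the authors' exact thresholding method; the factor `k` and the timestamps are assumptions.

```python
def detect_gaps(timestamps, k=3.0):
    """Flag sampling gaps longer than k times the median inter-sample delay."""
    delays = [t2 - t1 for t1, t2 in zip(timestamps, timestamps[1:])]
    median = sorted(delays)[len(delays) // 2]
    threshold = k * median
    gaps = [(t1, t2) for t1, t2 in zip(timestamps, timestamps[1:])
            if t2 - t1 > threshold]
    return threshold, gaps

# A sensor sampled roughly every 10 s, with one long outage
ts = [0, 10, 20, 30, 95, 105, 115]
print(detect_gaps(ts))
```

Because the threshold is derived per sensor from its own sampling history, the same rule can be applied across a heterogeneous sensor set without manual tuning.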

Keywords: building system, time series, diagnosis, outliers, delay, data gap

Procedia PDF Downloads 217
49 Energy-Efficient Internet of Things Communications: A Comparative Study of Long-Term Evolution for Machines and Narrowband Internet of Things Technologies

Authors: Nassim Labdaoui, Fabienne Nouvel, Stéphane Dutertre

Abstract:

The Internet of Things (IoT) is emerging as a crucial communication technology for the future. Many solutions have been proposed, and among them, licensed operators have put forward LTE-M and NB-IoT. However, implementing these technologies requires a good understanding of device energy requirements, which can vary depending on the coverage conditions. In this paper, we investigate the power consumption of LTE-M and NB-IoT devices using u-blox SARA-R422S modules, based on relevant standards from two French operators. The measurements were conducted under different coverage conditions, and we also present an empirical consumption model based on the different states of the radio modem as per the RRC protocol specifications. Our findings indicate that these technologies can achieve a five-year operational battery life under certain conditions. Moreover, we conclude that the size of the transmitted data does not have a significant impact on the total power consumption of the device under favorable coverage conditions, but it can quickly influence the battery life of the device under harsh coverage conditions. Overall, this paper offers insights into the power consumption of LTE-M and NB-IoT devices and provides useful information for those considering these technologies.
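A state-based consumption model of the kind mentioned above can be reduced, at its simplest, to a sleep/active duty-cycle calculation. The battery capacity, currents and daily active time below are hypothetical figures for illustration, not the paper's measurements.

```python
def battery_life_years(capacity_mah, sleep_ua, active_ma, active_s_per_day):
    """Estimate battery life from a simple sleep/active two-state model."""
    active_mah = active_ma * active_s_per_day / 3600.0          # daily active draw
    sleep_mah = (sleep_ua / 1000.0) * (86400 - active_s_per_day) / 3600.0
    return capacity_mah / (active_mah + sleep_mah) / 365.0

# Hypothetical NB-IoT device: 5 Ah battery, 5 uA sleep, 100 mA active for 60 s/day
print(f"{battery_life_years(5000, 5, 100, 60):.1f} years")
```

A full model as described in the abstract would distinguish the RRC states (e.g. connected, idle with eDRX, PSM) with a current and dwell time per state, but the life estimate is obtained the same way: capacity divided by the time-weighted average draw.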

Keywords: internet of things, LTE-M, NB-IoT, MQTT, cellular IoT, power consumption

Procedia PDF Downloads 98
48 Influence of the Coarse-Graining Method on a DEM-CFD Simulation of a Pilot-Scale Gas Fluidized Bed

Authors: Theo Ndereyimana, Yann Dufresne, Micael Boulet, Stephane Moreau

Abstract:

The DEM (Discrete Element Method) is widely used in industry to simulate large-scale particle flows; in a fluidized bed, for instance, it allows the trajectory of every particle to be predicted. One of the main limits of the DEM is the computational time. The CGM (Coarse-Graining Method) has been developed to tackle this issue: the idea is to increase the particle size and thereby decrease the number of particles. The method leads to a reduction of the collision frequency due to the reduced number of particles, which impacts multiple characteristics of the particle movement and of the fluid flow when DEM is coupled with CFD (Computational Fluid Dynamics). The main characteristic impacted is the energy dissipation of the system; to regain the dissipation, an ADM (Additional Dissipative Mechanism) can be added to the model. The objective of the current work is to observe the influence of the choice of ADM and of the coarse-graining factor on the numerical results. These results are compared with experimental results from a fluidized bed and with a numerical model of the same fluidized bed that does not use the CGM. The numerical model is that of a 3D cylindrical fluidized bed with 9.6M Geldart B-type particles in a bubbling regime.
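The basic bookkeeping behind the coarse-graining factor can be sketched as follows: parcels grow by the factor while the population shrinks cubically so that total mass is conserved. The 9.6M particle count comes from the abstract; the particle diameter and factor of 4 are assumed for illustration.

```python
def coarse_grain(n_particles, d_particle, factor):
    """Coarse-grain a DEM population: parcel diameter grows by the factor l,
    and the particle count shrinks by l**3 so that total mass is conserved."""
    return n_particles / factor**3, d_particle * factor

# 9.6M Geldart B particles (abstract) with an assumed 0.5 mm diameter, l = 4
parcels, d_parcel = coarse_grain(9_600_000, 0.5e-3, 4)
print(parcels, d_parcel)
```

The cubic reduction in count is exactly what cuts the collision frequency, and hence the dissipated collision energy, that the ADM is introduced to restore.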

Keywords: additive dissipative mechanism, coarse-graining, discrete element method, fluidized bed

Procedia PDF Downloads 30
47 Model of a Context-Aware Middleware for Mobile Workers

Authors: Esraa Moustafa, Gaetan Rey, Stephane Lavirotte, Jean-Yves Tigli

Abstract:

With the development of the Internet of Things and the Web of Things, computing becomes more pervasive, invisible and present everywhere. Indeed, in our environment we are surrounded by multiple devices that deliver (web) services meeting the needs of users. However, the mobility of these devices, like that of the users, has important repercussions that challenge the software design of these applications, because the variability of the environment cannot be anticipated at design time. It is therefore interesting to discover the environment dynamically and adapt the application during its execution to the new contextual conditions. We propose a model of a context-aware middleware that addresses this issue through a monitoring service capable of reasoning and observation channels capable of calculating the context at runtime. The monitoring service evaluates the pre-defined X-Query predicates in the context manager and uses Prolog to deduce the services needed in response. An independent observation channel for each different predicate is then dynamically generated by the monitoring service depending on the current state of the environment. Each channel sends its result directly to the context manager, which consequently calculates the context based on all the predicates’ results while preserving the reactivity of the self-adaptive system.

Keywords: auto-adaptation, context-awareness, middleware, reasoning engine

Procedia PDF Downloads 211
46 Immersive Block Scheduling in Higher Education: A Case Study in Curriculum Reform and Increased Student Success

Authors: Thomas Roche, Erica Wilson, Elizabeth Goode

Abstract:

Universities across the globe are considering how to effect meaningful change in their higher education (HE) delivery in the face of increasingly diverse student cohorts and shifting student learning preferences. This paper reports on a descriptive case study of whole-of-institution curriculum reform at one regional Australian university, where more traditional 13-week semesters were replaced with a 6-week immersive block model drawing on active learning pedagogy. Based on a synthesis of literature in best practice HE pedagogy and principles, the case study draws on student performance data and senior management staff interviews (N = 5) to outline the key changes necessary for successful HE transformation to deliver increased student pass rates and retention. The findings from this case study indicate that an institutional transformation to an immersive block model requires both a considered change in institutional policy and process as well as the appropriate resourcing of roles, governance committees, technical solutions, and, importantly, communities of practice. Implications for practice at higher education institutions considering reforming their curriculum model are also discussed.

Keywords: student retention, immersive scheduling, block model, curriculum reform, active learning, higher education pedagogy, higher education policy

Procedia PDF Downloads 30
45 An Analysis of a Relational Frame Skills Training Intervention to Increase General Intelligence in Early Childhood

Authors: Ian M. Grey, Bryan Roche, Anna Dillon, Justin Thomas, Sarah Cassidy, Dylan Colbert, Ian Stewart

Abstract:

This paper presents findings from a study conducted in two schools in Abu Dhabi. The hypothesis is that teaching young children to derive various relations between stimuli leads to increases in full-scale IQ scores of typically developing children. In the experimental group, sixteen 6-7-year-old children were exposed over six weeks to an intensive training intervention designed specifically for their age group. This training intervention, presented on a tablet, aimed to improve their understanding of the relations Same, Opposite, Different, contextual control over the concept of Sameness and Difference, and purely arbitrary derived relational responding for Sameness and Difference. In the control group, sixteen 6-7-year-old children interacted with KIBO robotics over six weeks. KIBO purports to improve cognitive skills through engagement with STEAM activities. Increases in full-scale IQ were recorded for most children in the experimental group, while no increases in full-scale IQ were recorded for the control group. These findings support the hypothesis that relational skills underlie many aspects of general cognitive ability.

Keywords: early childhood, derived relational responding, intelligence, relational frame theory, relational skills

Procedia PDF Downloads 154
44 Vitamin D Deficiency and Insufficiency in Postmenopausal Women with Obesity

Authors: Vladyslav Povoroznyuk, Anna Musiienko, Nataliia Dzerovych, Roksolana Povoroznyuk, Oksana Ivanyk

Abstract:

Vitamin D deficiency and insufficiency constitute a pandemic of the 21st century. Patients with obesity have lower vitamin D levels, but the literature data are contradictory. The purpose of this study is to investigate vitamin D deficiency and insufficiency in postmenopausal women with obesity. We examined 1007 women aged 50-89 years. Mean age was 65.74±8.61 years; mean height was 1.61±0.07 m; mean weight was 70.65±13.50 kg; mean body mass index (BMI) was 27.27±4.86 kg/m2; and mean serum 25(OH)D level was 26.00±12.00 nmol/l. The women were divided into six groups depending on BMI: group I, 338 women with normal body weight; group II, 16 women with insufficient body weight; group III, 382 women with excessive body weight; group IV, 199 women with class I obesity; group V, 60 women with class II obesity; and group VI, 12 women with class III obesity. Serum 25(OH)D was measured by an electrochemiluminescence method on an Elecsys 2010 analyzer (Roche Diagnostics, Germany) with cobas test systems. Of the examined women, 34.4% had vitamin D deficiency and 31.4% insufficiency. Women with class I obesity (23.60±10.24 ng/ml) and class II obesity (22.38±10.34 ng/ml) had significantly lower 25(OH)D levels than women with normal body weight (28.24±12.99 ng/ml), p=0.00003. In women with obesity, BMI significantly influences vitamin D level, and this influence does not depend on the season.
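As a toy illustration of how such status categories are assigned, here is a short Python sketch. The abstract does not state the study's cut-offs, so the commonly used thresholds of 20 ng/ml (deficiency) and 30 ng/ml (sufficiency) are assumed here.

```python
# Illustrative only: thresholds are assumed (deficiency < 20 ng/ml,
# insufficiency 20-29 ng/ml), not taken from the study itself.

def vitamin_d_status(level_ng_ml):
    if level_ng_ml < 20:
        return "deficiency"
    if level_ng_ml < 30:
        return "insufficiency"
    return "sufficiency"

# Mean 25(OH)D of the class I obesity group reported in the abstract:
print(vitamin_d_status(23.60))  # insufficiency
```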

Keywords: obesity, body mass index, vitamin D deficiency, vitamin D insufficiency, postmenopausal women, age

Procedia PDF Downloads 140
43 Characterization of Organic Matter in Spodosol Amazonian by Fluorescence Spectroscopy

Authors: Amanda M. Tadini, Houssam Hajjoul, Gustavo Nicolodelli, Stéphane Mounier, Célia R. Montes, Débora M. B. P. Milori

Abstract:

Soil organic matter (SOM) plays an important role in maintaining soil productivity and promoting biological diversity. The main components of SOM are the humic substances, which can be fractionated according to their solubility into humic acids (HA), fulvic acids (FA), and humin (HU). Determining the chemical properties of organic matter, as well as its interaction with metallic species, is an important tool for understanding the structure of the humic fractions. Fluorescence spectroscopy has been studied as a source of information about what happens at the molecular level in these compounds. In particular, the soils of the Amazon region form an important ecosystem of the planet. The aim of this study is to understand the molecular and structural composition of HA samples from an Amazonian Spodosol using the fluorescence excitation-emission matrix (EEM) and time-resolved fluorescence spectroscopy (TRFS). The HA samples exhibited two fluorescent components, one with a more complex structure and one with a simpler structure, a distinction also seen in TRFS through the evaluation of each sample's fluorescence lifetime. Studies of this nature are thus important because they evaluate the molecular and structural characteristics of the humic fractions of a region considered one of the most important in the world, the Amazon.

Keywords: Amazonian soil, characterization, fluorescence, humic acid, lifetime

Procedia PDF Downloads 564
42 ACTN3 Genotype Association with Motoric Performance of Roma Children

Authors: J. Bernasovska, I. Boronova, J. Poracova, M. Mydlarova Blascakova, V. Szabadosova, P. Ruzbarsky, E. Petrejcikova, I. Bernasovsky

Abstract:

The paper presents the results of molecular genetic analysis in sports research, with special emphasis on using genetic information to diagnose motoric predispositions in Roma boys from East Slovakia. The ability to move is a basic characteristic of all living organisms. Phenotypes are influenced by a combination of genetic and environmental factors. Genetic tests differ in principle from traditional motoric tests, because the DNA of an individual does not change during life. The aim of the presented study was to examine motion abilities and to determine the frequency of the ACTN3 gene R577X polymorphism in Roma children. Genotype data were obtained from 138 Roma and 155 Slovak boys aged 7 to 15 years. The children's physical performance level was investigated in association with their genotype. Biological material for genetic analyses comprised buccal swab samples. Genotypes were determined using real-time high-resolution melting PCR (Rotor-Gene 6000, Corbett, and LightCycler 480, Roche). The software allows reports of any analysis to be created, showing the details of the specific analysis, normalized and differential graphs, and sample information. Roma children in the analyzed group lagged behind non-Roma children of the same age in all compared tests. The distribution of R and X alleles in Roma children differed from controls: the frequency of the XX genotype was 9.26%, RX 46.33%, and RR 44.41%. The XX genotype frequency of 9.26% is comparable to that of an Indian population. Data were analyzed with the ANOVA test.
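The allele frequencies follow from the reported genotype distribution (XX 9.26%, RX 46.33%, RR 44.41%) by the standard gene-counting rule: each homozygote carries two copies of its allele, each heterozygote one of each. A short Python check:

```python
# Standard allele-frequency calculation from genotype proportions.

def allele_frequencies(p_rr, p_rx, p_xx):
    # Homozygotes contribute a full share of their allele,
    # heterozygotes half a share of each.
    freq_r = p_rr + p_rx / 2
    freq_x = p_xx + p_rx / 2
    return freq_r, freq_x

freq_r, freq_x = allele_frequencies(0.4441, 0.4633, 0.0926)
print(f"{freq_r:.3f} {freq_x:.3f}")  # 0.676 0.324
```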

Keywords: ACTN3 gene, R577X polymorphism, Roma children, sport performance, Slovakia

Procedia PDF Downloads 306
41 Performing Diagnosis in Building with Partially Valid Heterogeneous Tests

Authors: Houda Najeh, Mahendra Pratap Singh, Stéphane Ploix, Antoine Caucheteux, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to different kinds of faults and human misbehaviors. Energy efficiency and user comfort are directly affected by abnormalities in building operation. The available fault diagnosis tools and methodologies rely particularly on rules or pure model-based approaches, assuming that a model- or rule-based test can be applied to any situation without taking the actual testing context into account. Contextual tests with validity domains could greatly simplify the design of detection tests. The main objective of this paper is to take test validity into account when validating the test model, considering non-modeled events such as occupancy, weather conditions, and door and window openings, and integrating the expert's knowledge of the state of the system. The concept of heterogeneous tests is combined with test validity to generate fault diagnoses. A combination of rule-, range-, and model-based tests, known as heterogeneous tests, is proposed to reduce modeling complexity. The calculation of logical diagnoses, drawn from artificial intelligence, provides a global explanation consistent with the test results. An application example, an office setting at the Grenoble Institute of Technology, shows the efficiency of the proposed technique.
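The idea of validity-gated tests can be sketched as follows. This is a minimal illustration under our own assumptions: validity domains are reduced to predicates on the context, and the test name, threshold, and scenario are invented for the example.

```python
# Hypothetical sketch: a test only contributes to the diagnosis when the
# current context lies inside its validity domain.

class Test:
    def __init__(self, name, is_valid, check):
        self.name = name
        self.is_valid = is_valid  # validity domain: context -> bool
        self.check = check        # the test itself: observations -> bool (True = pass)

def run_tests(tests, context, observations):
    """Return the names of valid tests that failed (fault indicators)."""
    failed = []
    for t in tests:
        if not t.is_valid(context):
            continue  # outside the validity domain: result not trustworthy
        if not t.check(observations):
            failed.append(t.name)
    return failed

# Example: a temperature-model test only valid when windows are closed
# (an open window is a non-modeled event that invalidates the model).
tests = [
    Test("temp_model",
         is_valid=lambda ctx: not ctx["window_open"],
         check=lambda obs: abs(obs["measured_temp"] - obs["predicted_temp"]) < 2.0),
]
print(run_tests(tests, {"window_open": False},
                {"measured_temp": 26.0, "predicted_temp": 21.0}))  # ['temp_model']
print(run_tests(tests, {"window_open": True},
                {"measured_temp": 26.0, "predicted_temp": 21.0}))  # []
```

The second call shows the point of validity domains: the same discrepancy raises no alarm when the context (open window) puts the model-based test outside its domain.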

Keywords: heterogeneous tests, validity, building system, sensor grids, sensor fault, diagnosis, fault detection and isolation

Procedia PDF Downloads 257
40 An Eulerian Method for Fluid-Structure Interaction Simulation Applied to Wave Damping by Elastic Structures

Authors: Julien Deborde, Thomas Milcent, Stéphane Glockner, Pierre Lubin

Abstract:

A fully Eulerian method is developed to solve the problem of fluid-elastic structure interactions based on a one-fluid method. The interface between the fluid and the elastic structure is captured by a level set function, advected by the fluid velocity and solved with a WENO5 scheme. The elastic deformations are computed in an Eulerian framework thanks to the backward characteristics. We use the neo-Hookean or Mooney-Rivlin hyperelastic models, and the elastic forces are incorporated as a source term in the incompressible Navier-Stokes equations. The velocity/pressure coupling is solved with a pressure-correction method, and the equations are discretized by finite volume schemes on a Cartesian grid. The main difficulty is that large deformations in the fluid cause numerical instabilities. To avoid these problems, we use a re-initialization process for the level set and linear extrapolation of the backward characteristics. First, we verify and validate our approach on several test cases, including the FSI benchmark proposed by Turek. Next, we apply this method to study the wave damping phenomenon, a means of reducing the impact of waves on the coastline. So far, to our knowledge, only simulations with rigid or one-dimensional elastic structures have been studied in the literature. We propose to place elastic structures on the seabed, and we present results where 50% of the wave energy is absorbed.

Keywords: damping wave, Eulerian formulation, finite volume, fluid structure interaction, hyperelastic material

Procedia PDF Downloads 287
39 Python Implementation for S1000D Applicability Depended Processing Model - SALERNO

Authors: Theresia El Khoury, Georges Badr, Amir Hajjam El Hassani, Stéphane N’Guyen Van Ky

Abstract:

The widespread adoption of machine learning and artificial intelligence across different domains can be attributed to the digitization of data over several decades, resulting in vast amounts of data of many types and structures. Data processing and preparation thus turn out to be a crucial stage. However, applying these techniques to S1000D standard-based data poses a challenge due to its complexity and the need to preserve logical information. This paper describes SALERNO, an S1000D AppLicability dEpended pRocessiNg mOdel. This Python-based model analyzes and converts XML S1000D-based files into a simpler data format that can be used in machine learning techniques while preserving the logic and relationships in the files. The model parses the files in a given folder, filters them, and extracts the required information to be saved in appropriate data frames and Excel sheets. Its main idea is to group the extracted information by applicability. In addition, it extracts the full text by replacing internal and external references while maintaining the relationships between files, as well as the necessary requirements. The resulting files can then be saved in databases and used in different models. Documents in both English and French were tested, and special characters were decoded. Updates to the technical manuals were taken into consideration as well. The model was tested on different versions of S1000D, and the results demonstrated its ability to handle applicability, requirements, references, and relationships effectively across all files and at different levels.
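The grouping-by-applicability idea can be sketched with Python's standard XML tooling. The element and attribute names below are placeholders, not the real S1000D schema, and this is not the SALERNO implementation itself.

```python
# Hypothetical sketch: collect text fragments from an XML file and group
# them by an applicability attribute. "para" and "applicRef" are invented
# placeholder names, not actual S1000D element/attribute names.
import xml.etree.ElementTree as ET
from collections import defaultdict

SAMPLE = """
<dmodule>
  <para applicRef="config-A">Step valid for configuration A.</para>
  <para applicRef="config-B">Step valid for configuration B.</para>
  <para applicRef="config-A">Another configuration-A step.</para>
</dmodule>
"""

def group_by_applicability(xml_text):
    root = ET.fromstring(xml_text)
    groups = defaultdict(list)
    for elem in root.iter("para"):
        groups[elem.get("applicRef")].append(elem.text)
    return dict(groups)

groups = group_by_applicability(SAMPLE)
print(sorted(groups))           # ['config-A', 'config-B']
print(len(groups["config-A"]))  # 2
```

In the actual model, each such group would then be flattened into a data frame row set, with internal and external references resolved to full text.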

Keywords: aeronautics, big data, data processing, machine learning, S1000D

Procedia PDF Downloads 80
38 Physicochemical Characterization of Coastal Aerosols over the Mediterranean: Comparison with Weather Research and Forecasting-Chem Simulations

Authors: Stephane Laussac, Jacques Piazzola, Gilles Tedeschi

Abstract:

Estimating the impact of atmospheric aerosols on climate evolution is an important scientific challenge. A major source of particles is the ocean, through the generation of sea-spray aerosols. In coastal areas, marine aerosols can affect air quality through their ability to interact chemically and physically with other aerosol species and gases. The integration of accurate sea-spray emission terms in modeling studies is then required. However, it was found that sea-spray concentrations are not represented with the necessary accuracy in some situations, particularly at short fetch. In this study, the WRF-Chem model was implemented over a north-western Mediterranean coastal region. WRF-Chem is the Weather Research and Forecasting (WRF) model online-coupled with chemistry for the investigation of regional-scale air quality; it simulates the emission, transport, mixing, and chemical transformation of trace gases and aerosols simultaneously with the meteorology. One of the objectives was to test the ability of the WRF-Chem model to represent the fine details of the coastal geography and to provide accurate predictions of sea-spray evolution for different fetches, together with the anthropogenic aerosols. To assess the performance of the model, we compare the model predictions, obtained with a local emission inventory, against the physicochemical analysis of aerosol concentrations measured for different wind directions on the island of Porquerolles, located 10 km south of the French Riviera.

Keywords: sea-spray aerosols, coastal areas, sea-spray concentrations, short fetch, WRF-Chem model

Procedia PDF Downloads 162