Search results for: Stéphane Roche
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14

14 A Simulated Scenario of WikiGIS to Support the Iteration and Traceability Management of the Geodesign Process

Authors: Wided Batita, Stéphane Roche, Claude Caron

Abstract:

Geodesign is an emergent term related to a new and complex process. Hence, it requires rethinking tools, technologies and platforms in order to efficiently achieve its goals. A few tools have emerged since 2010, such as CommunityViz and GeoPlanner. In the era of Web 2.0 and collaboration, WikiGIS has been proposed as a new category of tools. In this paper, we present WikiGIS functionalities dealing mainly with iteration and traceability management to support collaboration in the Geodesign process. WikiGIS is built on GeoWeb 2.0 technologies, primarily wiki, and aims at managing the tracking of participants' editing. This paper focuses on a simplified simulation to illustrate the strength of WikiGIS in the management of traceability and in the access to history in a Geodesign process. A cartographic user interface has been implemented, and a hypothetical use case is presented as a proof of concept.
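
As a rough illustration of the kind of traceability described here (not the WikiGIS implementation itself), the Python sketch below uses hypothetical names such as GeoFeature and Edit to record each participant's design iteration on a geographic feature and to give access to past states.

    import datetime
    from dataclasses import dataclass, field

    @dataclass
    class Edit:
        author: str                    # participant who made the edit
        timestamp: datetime.datetime   # when the edit was made
        geometry: list                 # coordinates proposed by this iteration
        comment: str = ""              # rationale attached to the edit

    @dataclass
    class GeoFeature:
        feature_id: str
        history: list = field(default_factory=list)   # ordered trace of edits

        def edit(self, author, geometry, comment=""):
            """Append a new design iteration instead of overwriting the old one."""
            self.history.append(Edit(author, datetime.datetime.now(), geometry, comment))

        def state_at(self, index):
            """Replay the history up to a given iteration (access to past states)."""
            return self.history[index].geometry if self.history else None

    # Hypothetical usage: two participants iterating on the same parcel boundary.
    parcel = GeoFeature("parcel-42")
    parcel.edit("alice", [(0, 0), (0, 10), (10, 10), (10, 0)], "initial sketch")
    parcel.edit("bob",   [(0, 0), (0, 12), (10, 12), (10, 0)], "extend northward")
    print(len(parcel.history), parcel.state_at(0))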

Keywords: Geodesign, history, traceability, tracking of participants’ editing, WikiGIS.

PDF Downloads: 933
13 Performance of the Aptima® HIV-1 Quant Dx Assay on the Panther System

Authors: Siobhan O’Shea, Sangeetha Vijaysri Nair, Hee Cheol Kim, Charles Thomas Nugent, Cheuk Yan William Tong, Sam Douthwaite, Andrew Worlock

Abstract:

The Aptima® HIV-1 Quant Dx Assay is a fully automated assay on the Panther system, based on Transcription-Mediated Amplification and real-time detection technologies. The assay is intended for monitoring HIV-1 viral load in plasma specimens and for the detection of HIV-1 in plasma and serum specimens. Nine hundred and seventy-nine specimens selected at random from routine testing at St Thomas’ Hospital, London, were anonymised and used to compare the performance of the Aptima HIV-1 Quant Dx assay and the Roche COBAS® AmpliPrep/COBAS® TaqMan® HIV-1 Test, v2.0. Two hundred and thirty-four specimens gave quantitative HIV-1 viral load results in both assays. The quantitative results reported by the Aptima assay were comparable to those reported by the Roche COBAS AmpliPrep/COBAS TaqMan HIV-1 Test, v2.0, with a linear regression slope of 1.04 and an intercept of -0.097. The Aptima assay detected HIV-1 in more samples than the COBAS assay. This was not due to a lack of specificity of the Aptima assay, because the assay gave 99.83% specificity on testing plasma specimens from 600 HIV-1 negative individuals. To understand the reason for this higher detection rate, a side-by-side comparison of low-level panels made from the HIV-1 3rd international standard (NIBSC 10/152) and clinical samples of various subtypes was performed in both assays. The Aptima assay was more sensitive than the COBAS assay. The good sensitivity, specificity and agreement with other commercial assays make the HIV-1 Quant Dx Assay appropriate for both viral load monitoring and detection of HIV-1 infections.
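
A minimal sketch of the kind of agreement analysis reported above: ordinary least-squares regression of paired log10 viral loads, which is how a slope of 1.04 and an intercept of -0.097 would typically be obtained. The values below are made-up placeholders, not the study data.

    import numpy as np

    # Hypothetical paired log10 viral loads (copies/mL); not the study data.
    aptima = np.array([2.1, 3.4, 4.0, 4.8, 5.5, 6.1])
    cobas  = np.array([2.0, 3.3, 4.1, 4.7, 5.6, 6.0])

    # Fit Aptima = slope * COBAS + intercept by ordinary least squares.
    slope, intercept = np.polyfit(cobas, aptima, deg=1)
    print(f"slope={slope:.3f}, intercept={intercept:.3f}")

    # Mean paired difference as a simple additional agreement check.
    print("mean log10 difference:", np.mean(aptima - cobas))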

Keywords: HIV viral load, Aptima, Roche, Panther system.

PDF Downloads: 3217
12 A Security Analysis for Home Gateway Architectures

Authors: Pierre Parrend, Stephane Frenot

Abstract:

Providing services at home has become, over the last few years, a very dynamic and promising technological domain. It is likely to enable the wide dissemination of secure and automated living environments. We propose a methodology for identifying threats to Services at Home delivery systems, as well as a threat analysis of a multi-provider Home Gateway architecture. This methodology is based on a dichotomous positive/preventive study of the target system: it aims at identifying both what the system must do and what it must not do. This approach complements existing methods with a synthetic view of potential security flaws, thus enabling suitable measures to be taken into account. Security implications of the evolution of a given system become easier to deal with. A prototype is built based on the conclusions of this analysis.

Keywords: Security requirements, Connected Home, OSGi, Software Components.

PDF Downloads: 1607
11 Communication Behaviors as Predictors of Long-Term Dyadic Adjustment: Personality as a Moderator

Authors: Ariane Lazaridès, Claude Bélanger, Stéphane Sabourin

Abstract:

In this longitudinal study, we examined the moderating role of personality in the relationship between communication behaviors and long-term dyadic adjustment. A sample of 82 couples completed the NEO Five-Factor Inventory and the Dyadic Adjustment Scale. These couples were also videotaped during a 15-minute problem-solving discussion. Approximately 2.5 years later, these couples completed the Dyadic Adjustment Scale again. Results show that the personality of both men and women moderates the relationship between the partner's communication behaviors and the individual's long-term dyadic adjustment. Women's openness and men's extraversion moderate the relationship between some communication behaviors and long-term dyadic adjustment.
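
Moderation of this kind is usually tested with an interaction term in a regression model. The sketch below (statsmodels, hypothetical column names and values, not the authors' analysis) shows the general form: the product term partner_comm:openness carries the moderation effect.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: one row per individual, with the partner's communication
    # score, own personality trait (e.g. openness), and adjustment 2.5 years later.
    df = pd.DataFrame({
        "partner_comm": [3.2, 4.1, 2.5, 3.8, 4.5, 2.9, 3.6, 4.0],
        "openness":     [2.8, 3.9, 2.2, 3.5, 4.2, 2.6, 3.1, 3.7],
        "adjustment":   [98, 112, 90, 105, 120, 95, 102, 110],
    })

    # "a ~ b * c" expands to main effects plus the interaction (moderation) term.
    model = smf.ols("adjustment ~ partner_comm * openness", data=df).fit()
    print(model.params)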

Keywords: Communication Behavior, Couples, Dyadic Adjustment, Personality.

PDF Downloads: 1482
10 Revisiting the Concept of Risk Analysis within the Context of Geospatial Database Design: A Collaborative Framework

Authors: J. Grira, Y. Bédard, S. Roche

Abstract:

The aim of this research is to design a collaborative framework that integrates risk analysis activities into the geospatial database design (GDD) process. Risk analysis is rarely undertaken iteratively as part of present GDD methods, in conformance with requirement engineering (RE) guidelines and risk standards. Accordingly, when risk analysis is performed during GDD, some foreseeable risks may be overlooked and may not reach the output specifications, especially when user intentions are not systematically collected. This may lead to ill-defined requirements and, ultimately, to higher risks of geospatial data misuse. The adopted approach consists of 1) reviewing the risk analysis process within the scope of RE and GDD, 2) analyzing the challenges of risk analysis within the context of GDD, and 3) presenting the components of a risk-based collaborative framework that improves the collection of the intended/forbidden usages of the data and helps geo-IT experts discover implicit requirements and risks.

Keywords: Collaborative risk analysis, intention of use, Geospatial database design, Geospatial data misuse.

PDF Downloads: 1666
9 Segmentation of Gray Scale Images of Dropwise Condensation on Textured Surfaces

Authors: Helene Martin, Solmaz Boroomandi Barati, Jean-Charles Pinoli, Stephane Valette, Yann Gavet

Abstract:

In the present work, we developed an image processing algorithm to measure water droplet characteristics during dropwise condensation on pillared surfaces. The main difficulty in this process is the similarity in shape and size between the water droplets and the pillars. The developed method divides droplets into four main groups based on their size and applies the corresponding algorithm to segment each group. These algorithms generate binary images of droplets based on both their geometrical and intensity properties. The information related to droplet evolution over time, including mean radius and number of drops per unit area, is then extracted from the binary images. The developed image processing algorithm is verified against manual detection and applied to two different sets of images corresponding to two kinds of pillared surfaces.
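
A rough sketch of the kind of intensity-plus-geometry segmentation described above, using scikit-image: global thresholding followed by a watershed on the distance transform. It assumes a grayscale image in which droplets appear brighter than the background; the parameter values are illustrative, not the authors'.

    import numpy as np
    from scipy import ndimage as ndi
    from skimage import feature, filters, measure, segmentation

    def segment_droplets(gray):
        """Binary segmentation of bright droplets, then watershed splitting of touching ones."""
        binary = gray > filters.threshold_otsu(gray)      # intensity criterion
        distance = ndi.distance_transform_edt(binary)     # geometric criterion
        peaks = feature.peak_local_max(distance, min_distance=5)
        markers = np.zeros_like(gray, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        labels = segmentation.watershed(-distance, markers, mask=binary)
        props = measure.regionprops(labels)
        radii = [np.sqrt(p.area / np.pi) for p in props]   # equivalent radius per droplet
        return labels, radii                               # droplet count = labels.max()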

Keywords: Dropwise condensation, textured surface, image processing, watershed.

PDF Downloads: 691
8 Uncertainty Analysis of a Hardware in Loop Setup for Testing Products Related to Building Technology

Authors: Balasundaram Prasaant, Ploix Stephane, Delinchant Benoit, Muresan Cristian

Abstract:

Hardware in Loop (HIL) testing is done to test and validate a particular product, especially in building technology. In building technology, it is particularly important to test products for their efficiency. The test rig in the HIL simulator may contribute some uncertainty to the measured efficiency. The uncertainties include physical uncertainties and scenario-based uncertainties. In this paper, a simple uncertainty analysis framework for an HIL setup is presented, considering only the physical uncertainties. The entire modeling of the HIL setup is done in Dymola. The uncertain sources are identified based on available knowledge of the components and on expert knowledge. For the propagation of uncertainty, Monte Carlo simulation is used, since it is reliable and easy to use. This article shows how an HIL setup can be modeled and how uncertainty propagation can be performed on it. Such an approach is not common in building energy analysis.
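
The general Monte Carlo propagation pattern used here can be sketched in a few lines: draw the uncertain physical parameters from assumed distributions, evaluate the model for each draw, and read the spread of the measured efficiency. The toy model and the distributions below are illustrative assumptions, not the Dymola HIL model.

    import numpy as np

    rng = np.random.default_rng(0)
    n_samples = 10_000

    def measured_efficiency(flow_rate, delta_t, electrical_power):
        """Toy stand-in for the HIL test-rig model: thermal output / electrical input."""
        c_p = 4186.0                                # J/(kg.K), specific heat of water
        return flow_rate * c_p * delta_t / electrical_power

    # Uncertain inputs drawn from assumed (hypothetical) sensor uncertainties.
    flow  = rng.normal(0.05, 0.002, n_samples)      # kg/s, flow-meter uncertainty
    dT    = rng.normal(5.0, 0.1, n_samples)         # K, temperature-sensor uncertainty
    power = rng.normal(1100.0, 20.0, n_samples)     # W, power-meter uncertainty

    eff = measured_efficiency(flow, dT, power)
    print(f"efficiency = {eff.mean():.3f} +/- {eff.std():.3f} (1 sigma)")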

Keywords: Energy in Buildings, Hardware in Loop, Modelica (Dymola), Monte Carlo Simulation, Uncertainty Propagation.

PDF Downloads: 575
7 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings

Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Various literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers. Occurrences of data gaps have likewise not been given adequate attention in the literature. The proposed methodology provides automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are assumed to be regular time series; in reality, however, sensor values are not uniformly sampled. The issue to solve is therefore: beyond which delay should each sensor be considered faulty? The use of time series is required for the detection of abnormalities in these delays. The efficiency of the method is evaluated on measurements obtained from a real building: an office at the Grenoble Institute of Technology equipped with 30 sensors.
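
The core question raised above, after what delay a sensor should be declared faulty, can be sketched as follows: estimate each sensor's typical sampling interval and flag intervals that exceed an automatically derived threshold. The threshold rule (a multiple of the median interval) is an illustrative assumption, not the authors' method.

    import pandas as pd

    def detect_gaps(timestamps, k=5.0):
        """Flag data gaps where the sampling interval exceeds k times the median interval."""
        t = pd.to_datetime(pd.Series(timestamps)).sort_values().reset_index(drop=True)
        intervals = t.diff().dt.total_seconds().iloc[1:]
        threshold = k * intervals.median()          # per-sensor automatic threshold
        gap_starts = t.iloc[1:][intervals.values > threshold]
        return threshold, list(gap_starts)

    # Hypothetical irregularly sampled sensor: a 2-hour hole in 10-minute data.
    stamps = pd.date_range("2024-01-01 00:00", periods=30, freq="10min").tolist()
    stamps = stamps[:10] + stamps[22:]              # remove 2 hours of readings
    thr, gaps = detect_gaps(stamps)
    print(f"threshold = {thr:.0f} s, gaps start at: {gaps}")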

Keywords: Building system, time series, diagnosis, outliers, delay, data gap.

PDF Downloads: 903
6 Learning to Order Terms: Supervised Interestingness Measures in Terminology Extraction

Authors: Jérôme Azé, Mathieu Roche, Yves Kodratoff, Michèle Sebag

Abstract:

Term extraction, a key data preparation step in text mining, extracts the terms, i.e. relevant collocations of words, attached to specific concepts (e.g. genetic-algorithms and decision-trees are terms associated with the concept “Machine Learning”). In this paper, the task of extracting interesting collocations is achieved through a supervised learning algorithm, exploiting a few collocations manually labelled as interesting/not interesting. From these examples, the ROGER algorithm learns a numerical function inducing a ranking on the collocations. This ranking is optimized using genetic algorithms, maximizing the trade-off between the false positive and true positive rates (Area Under the ROC Curve). This approach uses a particular representation for the word collocations, namely the vector of values of the standard statistical interestingness measures attached to the collocation. As this representation is general (across corpora and natural languages), generality tests were performed by applying the ranking function learned from an English corpus in biology to a French corpus of curricula vitae, and vice versa, showing good robustness of the approach compared to the state-of-the-art Support Vector Machine (SVM).
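
A toy version of the idea described above: represent each collocation by its vector of statistical interestingness measures, learn a linear scoring function, and use the Area Under the ROC Curve as the fitness being maximised. A crude random search stands in for the ROGER genetic algorithm; the data are synthetic.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # Hypothetical data: 200 collocations, 5 interestingness measures each,
    # with a binary expert label (interesting / not interesting).
    X = rng.normal(size=(200, 5))
    y = (X @ np.array([1.0, 0.5, 0.0, -0.3, 0.2]) + rng.normal(0, 1, 200)) > 0

    def auc_fitness(weights):
        """Rank collocations by a linear score and measure ranking quality by AUC."""
        return roc_auc_score(y, X @ weights)

    # Crude random search as a stand-in for the evolutionary optimisation of weights.
    best_w, best_auc = None, 0.0
    for _ in range(2000):
        w = rng.normal(size=5)
        auc = auc_fitness(w)
        if auc > best_auc:
            best_w, best_auc = w, auc
    print(f"best AUC on the labelled examples: {best_auc:.3f}")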

Keywords: Text-mining, Terminology Extraction, Evolutionary algorithm, ROC Curve.

PDF Downloads: 1658
5 Vitamin D Deficiency and Insufficiency in Postmenopausal Women with Obesity

Authors: Vladyslav Povoroznyuk, Anna Musiienko, Nataliia Dzerovych, Roksolana Povoroznyuk, Oksana Ivanyk

Abstract:

Vitamin D deficiency and insufficiency constitute a pandemic of the 21st century. Patients with obesity have lower levels of vitamin D, but the literature data are contradictory. The purpose of this study is to investigate vitamin D deficiency and insufficiency in postmenopausal women with obesity. We examined 1007 women aged 50-89 years. Mean age was 65.74±8.61 years; mean height was 1.61±0.07 m; mean weight was 70.65±13.50 kg; mean body mass index was 27.27±4.86 kg/m², and mean serum 25(OH)D level was 26.00±12.00 nmol/l. The women were divided into the following six groups depending on body mass index: group I – 338 women with normal body weight, group II – 16 women with insufficient body weight, group III – 382 women with excessive body weight, group IV – 199 women with obesity of class I, group V – 60 women with obesity of class II, and group VI – 12 women with obesity of class III. The level of 25(OH)D in serum was measured by an electrochemiluminescent method on an Elecsys 2010 analyzer (Roche Diagnostics, Germany) with cobas test systems. 34.4% of the examined women have vitamin D deficiency and 31.4% have insufficiency. Women with obesity of class I (23.60±10.24 ng/ml) and obesity of class II (22.38±10.34 ng/ml) had significantly lower levels of 25(OH)D compared to women with normal body weight (28.24±12.99 ng/ml), p=0.00003. In women with obesity, BMI significantly influences vitamin D level, and this influence does not depend on the season.
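
The group comparison reported above (lower 25(OH)D in obesity classes I and II than in normal-weight women) is a standard one-way analysis; a sketch with simulated values matching the reported group means is given below. The cut-offs of 20 and 30 ng/ml are common conventions assumed here, not stated in the abstract.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Hypothetical serum 25(OH)D values (ng/ml) per BMI group, not the study data.
    normal_weight = rng.normal(28.2, 13.0, 338)
    obesity_i     = rng.normal(23.6, 10.2, 199)
    obesity_ii    = rng.normal(22.4, 10.3, 60)

    f_stat, p_value = stats.f_oneway(normal_weight, obesity_i, obesity_ii)
    print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.5f}")

    # Share of women below assumed deficiency/insufficiency cut-offs (20 and 30 ng/ml).
    all_values = np.concatenate([normal_weight, obesity_i, obesity_ii])
    print("deficiency (<20 ng/ml):", round(float(np.mean(all_values < 20)), 3))
    print("insufficiency (20-30 ng/ml):",
          round(float(np.mean((all_values >= 20) & (all_values < 30))), 3))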

Keywords: Obesity, body mass index, vitamin D deficiency/insufficiency, postmenopausal women, age.

PDF Downloads: 1059
4 Characterization of Organic Matter in Spodosol Amazonian by Fluorescence Spectroscopy

Authors: Amanda M. Tadini, Houssam Hajjoul, Gustavo Nicolodelli, Stéphane Mounier, Célia R. Montes, Débora M. B. P. Milori

Abstract:

Soil organic matter (SOM) plays an important role in maintaining soil productivity and in promoting biological diversity. The main components of SOM are the humic substances, which can be fractionated according to their solubility into humic acids (HA), fulvic acids (FA) and humin (HU). The determination of the chemical properties of organic matter, as well as of its interaction with metallic species, is an important tool for understanding the structure of the humic fractions. Fluorescence spectroscopy has been studied as a source of information about what is happening at the molecular level in these compounds. In particular, the soils of the Amazon region are an important ecosystem of the planet. The aim of this study is to understand the molecular and structural composition of HA samples from an Amazonian Spodosol using the fluorescence Excitation-Emission Matrix (EEM) and Time-Resolved Fluorescence Spectroscopy (TRFS). The results showed that the HA samples present two fluorescent components: one with a more complex structure and the other with a simpler structure, which was also seen in TRFS through the evaluation of each sample's lifetime. Studies of this nature are thus important because they evaluate the molecular and structural characteristics of the humic fractions of a region considered one of the most important in the world, the Amazon.
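
Fluorescence lifetimes of the kind evaluated in TRFS are typically obtained by fitting a multi-exponential decay. A minimal two-component fit is sketched below with scipy on a synthetic decay; the amplitudes and lifetimes are illustrative only, not values from the study.

    import numpy as np
    from scipy.optimize import curve_fit

    def biexponential(t, a1, tau1, a2, tau2):
        """Two fluorescent components, each with its own amplitude and lifetime (ns)."""
        return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

    # Synthetic decay: a fast component and a slower one, plus a little noise.
    t = np.linspace(0, 50, 500)                     # ns
    signal = biexponential(t, 0.7, 1.5, 0.3, 8.0)
    signal += np.random.default_rng(2).normal(0, 0.005, t.size)

    popt, _ = curve_fit(biexponential, t, signal, p0=[0.5, 1.0, 0.5, 5.0])
    print("fitted lifetimes (ns):", popt[1], popt[3])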

Keywords: Amazonian soil, characterization, fluorescence, humic acid, lifetime.

PDF Downloads: 1122
3 Optimization of Kinematics for Birds and UAVs Using Evolutionary Algorithms

Authors: Mohamed Hamdaoui, Jean-Baptiste Mouret, Stephane Doncieux, Pierre Sagaut

Abstract:

The aim of this work is to present a multi-objective optimization method to find maximum-efficiency kinematics for a flapping-wing unmanned aerial vehicle. We restricted our study to rectangular wings with the same profile along the span and to harmonic dihedral motion. It is assumed that the birdlike aerial vehicle (whose span and surface area were fixed to 1 m and 0.15 m², respectively) is in horizontal, mechanically balanced motion at fixed speed. We used two flight physics models to describe the vehicle's aerodynamic performance, namely DeLaurier's model, which has been used in many studies dealing with flapping wings, and the model proposed by Dae-Kwan et al. Then, a constrained multi-objective optimization of the propulsive efficiency is performed using a recent evolutionary multi-objective algorithm called ε-MOEA. Firstly, we show that feasible solutions (i.e. solutions that fulfil the imposed constraints) can be obtained using the model of Dae-Kwan et al. Secondly, we highlight that a single-objective optimization approach (for example the weighted sum method) can also give solutions as good as the multi-objective one, which nevertheless offers the advantage of directly generating the set of best trade-offs. Finally, we show that DeLaurier's model does not yield feasible solutions.
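
The contrast drawn above between a weighted-sum single-objective run and a Pareto set can be illustrated on a toy bi-objective problem (not the flapping-wing model). The sketch below extracts the non-dominated trade-offs from a set of candidate kinematics and compares them with a single weighted-sum compromise.

    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical candidates: each row is (negative propulsive efficiency, required power),
    # both to be minimised; stand-ins for the real aerodynamic objectives.
    objectives = rng.uniform(0.0, 1.0, size=(300, 2))

    def pareto_front(points):
        """Keep only the non-dominated points (no other point at least as good on both
        objectives and strictly better on one)."""
        keep = []
        for i, p in enumerate(points):
            dominated = np.any(np.all(points <= p, axis=1) & np.any(points < p, axis=1))
            if not dominated:
                keep.append(i)
        return points[keep]

    front = pareto_front(objectives)
    print(f"{len(front)} non-dominated trade-offs out of {len(objectives)} candidates")

    # A weighted-sum run returns a single compromise instead of the whole front.
    weights = np.array([0.5, 0.5])
    best_single = objectives[np.argmin(objectives @ weights)]
    print("weighted-sum solution:", best_single)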

Keywords: Flight physics, evolutionary algorithm, optimization, Pareto surface.

PDF Downloads: 1645
2 ACTN3 Genotype Association with Motoric Performance of Roma Children

Authors: J. Bernasovska, I. Boronova, J. Poracova, M. Mydlarova Blascakova, V. Szabadosova, P. Ruzbarsky, E. Petrejcikova, I. Bernasovsky

Abstract:

The paper presents the results of molecular genetic analysis in sports research, with special emphasis on the use of genetic information in diagnosing motoric predispositions in Roma boys from East Slovakia. The ability to move is a basic characteristic of all living organisms. Phenotypes are influenced by a combination of genetic and environmental factors. Genetic tests differ in principle from traditional motoric tests, because the DNA of an individual does not change during life. The aim of the presented study was to examine motion abilities and to determine the frequency of the ACTN3 (R577X) gene variants in Roma children. Genotype data were obtained from 138 Roma and 155 Slovak boys aged 7 to 15 years. Children were investigated for physical performance level in association with their genotype. The biological material for genetic analyses comprised buccal swab samples. Genotypes were determined using a real-time high-resolution melting PCR method (Rotor-Gene 6000, Corbett, and LightCycler 480, Roche). The software allows the creation of reports for any analysis, showing information on the specific analysis, normalized and differential graphs, and sample details. The Roma children of the analyzed group lagged behind non-Roma children of the same age in all the compared tests. The distribution of R and X alleles in Roma children differed from that of the controls. The frequency of the XX genotype was 9.26%, RX 46.33% and RR 44.41%; the XX frequency is comparable to that of an Indian population. Data were analyzed with the ANOVA test.
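
From the genotype distribution reported above, the R and X allele frequencies follow by simple counting; a quick check is sketched below (genotype frequencies taken from the abstract, the Hardy-Weinberg comparison is an added illustration).

    # Genotype frequencies of ACTN3 R577X reported for the Roma boys (fractions of 1).
    genotype_freq = {"RR": 0.4441, "RX": 0.4633, "XX": 0.0926}

    # Allele frequencies: RR carries two R alleles, RX one of each, XX two X alleles.
    freq_R = genotype_freq["RR"] + 0.5 * genotype_freq["RX"]
    freq_X = genotype_freq["XX"] + 0.5 * genotype_freq["RX"]
    print(f"R allele: {freq_R:.3f}, X allele: {freq_X:.3f}")

    # Hardy-Weinberg expectation for comparison with the observed genotype distribution.
    expected = {"RR": freq_R**2, "RX": 2 * freq_R * freq_X, "XX": freq_X**2}
    print({k: round(v, 3) for k, v in expected.items()})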

Keywords: ACTN3 gene, R577X polymorphism, Roma children, Slovakia, sports performance.

PDF Downloads: 1206
1 Performing Diagnosis in Building with Partially Valid Heterogeneous Tests

Authors: Houda Najeh, Mahendra Pratap Singh, Stéphane Ploix, Antoine Caucheteux, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to different kinds of faults and human misbehaviors. Energy efficiency and user comfort are directly affected by abnormalities in building operation. The available fault diagnosis tools and methodologies rely mainly on rules or pure model-based approaches, assuming that a model- or rule-based test can be applied to any situation without taking the actual testing context into account. Contextual tests with a validity domain could considerably reduce the design effort for detection tests. The main objective of this paper is to consider fault validity when validating the test model, taking into account non-modeled events such as occupancy, weather conditions, door and window openings, and integrating the expert's knowledge of the state of the system. The concept of heterogeneous tests is combined with test validity to generate fault diagnoses. A combination of rule-, range- and model-based tests, known as heterogeneous tests, is proposed to reduce modeling complexity. The calculation of logical diagnoses, drawing on artificial intelligence, provides a global explanation consistent with the test results. An application example shows the efficiency of the proposed technique: an office setting at the Grenoble Institute of Technology.
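
A minimal sketch of the consistency-based reasoning hinted at above: each valid test that fails implicates the set of components it depends on (a conflict), and the logical diagnoses are the minimal component sets hitting every conflict. The test and component names below are toy assumptions, not the authors' building model.

    from itertools import combinations

    # Each failed, valid test implicates the components it depends on (a conflict set).
    conflicts = [
        {"co2_sensor", "occupancy_model"},      # rule-based test failed
        {"co2_sensor", "window_contact"},       # range-based test failed
    ]
    components = set().union(*conflicts)

    def is_diagnosis(candidate):
        """A candidate fault set explains the symptoms if it intersects every conflict."""
        return all(candidate & c for c in conflicts)

    # Minimal diagnoses: the smallest component sets hitting all conflicts.
    diagnoses, size = [], 1
    while not diagnoses and size <= len(components):
        diagnoses = [set(c) for c in combinations(components, size) if is_diagnosis(set(c))]
        size += 1
    print("minimal diagnoses:", diagnoses)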

Keywords: Heterogeneous tests, validity, building system, sensor grids, sensor fault, diagnosis, fault detection and isolation.

PDF Downloads: 651