Search results for: Direct approach
4131 A Hybrid Distributed Vision System for Robot Localization
Authors: Hsiang-Wen Hsieh, Chin-Chia Wu, Hung-Hsiu Yu, Shu-Fan Liu
Abstract:
Localization is one of the critical issues in the field of robot navigation. With an accurate estimate of the robot pose, robots are capable of navigating the environment autonomously and efficiently. In this paper, a hybrid Distributed Vision System (DVS) for robot localization is presented. The presented approach integrates odometry data from the robot with images captured by overhead cameras installed in the environment to reduce the likelihood of failed localization caused by illumination effects, accumulated encoder errors, and low-quality range data. An odometry-based motion model is applied to predict robot poses, and the robot images captured by the overhead cameras are then used to update the pose estimates with an HSV histogram-based measurement model. Experimental results show that the presented approach can localize robots in a global world coordinate system with localization errors within 100 mm.
Keywords: Distributed Vision System, Localization, Measurement model, Motion model.
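The predict-then-update cycle this abstract describes can be sketched as a minimal particle filter. Everything below is an illustrative assumption rather than the authors' implementation: the 2-D motion and measurement models, the noise levels, and all function names are invented for the sketch, and the Gaussian camera likelihood stands in for their HSV histogram-based measurement model.

```python
import math
import random

random.seed(0)

def predict(particles, dx, dy, noise=5.0):
    """Propagate each particle with the odometry motion model plus Gaussian noise."""
    return [(x + dx + random.gauss(0, noise),
             y + dy + random.gauss(0, noise)) for x, y in particles]

def weight(particles, cam_x, cam_y, sigma=50.0):
    """Weight particles by agreement with the overhead-camera observation
    (a Gaussian stand-in for the HSV histogram likelihood)."""
    w = [math.exp(-((x - cam_x) ** 2 + (y - cam_y) ** 2) / (2 * sigma ** 2))
         for x, y in particles]
    s = sum(w)
    return [wi / s for wi in w]

def estimate(particles, weights):
    """Weighted mean pose estimate in world coordinates (mm)."""
    x = sum(w * p[0] for w, p in zip(weights, particles))
    y = sum(w * p[1] for w, p in zip(weights, particles))
    return x, y

particles = [(random.gauss(0, 100), random.gauss(0, 100)) for _ in range(500)]
particles = predict(particles, dx=100, dy=0)      # odometry says: moved 100 mm in x
weights = weight(particles, cam_x=100, cam_y=0)   # overhead camera agrees
print(estimate(particles, weights))
```

Fusing the two sources this way is what lets either sensor correct the other: odometry drift is pulled back by the camera update, and a bad camera frame is smoothed by the motion prior.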
4130 Quantifying the Stability of Software Systems via Simulation in Dependency Networks
Authors: Weifeng Pan
Abstract:
The stability of a software system is one of the most important quality attributes affecting the maintenance effort. Many techniques have been proposed to support the analysis of software stability at the architecture, file, and class level of software systems, but little effort has been made at the feature (i.e., method and attribute) level, and the assumptions underlying the existing techniques often do not match practice. Considering that, in this paper, we present a novel metric, Stability of Software (SoS), to measure the stability of object-oriented software systems by simulating software change propagation in feature-level software dependency networks. The approach is evaluated by case studies on eight open source Java programs, using different software structures (one employing design patterns versus one that does not) for the same object-oriented program. The results of the case studies validate the effectiveness of the proposed metric. The approach has been fully automated by a tool written in Java.
Keywords: Software stability, change propagation, design pattern, software maintenance, object-oriented (OO) software.
4129 New Hybrid Method to Model Extreme Rainfalls
Authors: Y. Laaroussi, Z. Guennoun, A. Amar
Abstract:
Modeling and forecasting the dynamics of rainfall occurrences constitute a major topic that has been treated extensively by statisticians, hydrologists, climatologists, and many other groups of scientists. In this context, we propose, in the present paper, a new hybrid method that combines Extreme Value and fractal theories. We illustrate the use of our methodology on transformed Emberger Index series constructed from data recorded in Oujda (Morocco). The index is treated first by the Peaks Over Threshold (POT) approach to identify excess observations over an optimal threshold u. In the second step, we consider the resulting excesses as a fractal object embedded in the one-dimensional time axis. We identify the fractal dimension by box counting. We discuss the description of the rainfall data sets under the Generalized Pareto Distribution, as assured by Extreme Value Theory (EVT). We show that, despite the appropriateness of the return periods given by the POT approach, introducing the fractal dimension provides more accurate interpretation results, which can improve the understanding of rainfall occurrences.
Keywords: Extreme value theory, fractal dimensions, Peaks Over Threshold, rainfall occurrences.
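The box-counting step can be illustrated with a short sketch: cover the set of exceedance times with boxes of shrinking width ε, and estimate the dimension as the slope of log N(ε) versus log(1/ε). The point set, box sizes, and function names below are illustrative assumptions, not the paper's data.

```python
import math

def box_count(times, eps):
    """Number of boxes of width eps needed to cover the 1-D point set."""
    return len({int(t // eps) for t in times})

def box_dimension(times, sizes):
    """Estimate the box-counting dimension as the least-squares slope of
    log N(eps) against log(1/eps)."""
    xs = [math.log(1.0 / e) for e in sizes]
    ys = [math.log(box_count(times, e)) for e in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

# A dense set of exceedance times filling [0, 1] should give a dimension near 1;
# sparse, clustered exceedances would give a fractional dimension below 1.
times = [i / 1000.0 for i in range(1000)]
print(box_dimension(times, [0.5, 0.25, 0.125, 0.0625]))
```

For real POT excesses, `times` would be the time stamps of observations above the optimal threshold u, and the fitted slope characterizes how those exceedances cluster in time.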
4128 Family Functionality in Mexican Children with Congenital and Non-Congenital Deafness
Authors: D. Estrella, A. Silva, R. Zapata, H. Rubio
Abstract:
A total of 100 primary caregivers (mothers, fathers, grandparents) with at least one child or grandchild diagnosed with congenital bilateral profound deafness were assessed in order to evaluate the functionality of families with a deaf member, who was evaluated by specialists in audiology, molecular biology, genetics, and psychology. After confirmation of the clinical diagnosis, DNA from the patients and parents was analyzed in search of the 35delG deletion of the GJB2 gene to determine who carried the mutation. All primary caregivers were provided psychological support, regardless of whether or not they had the mutation, and the family APGAR test was applied before and afterwards. All parents and grandparents were informed of the results of the genetic analysis during the psychological intervention. The family APGAR, after psychological and genetic counseling, showed that 14% perceived their families as functional, 62% as moderately functional, and 24% as dysfunctional. This shows the importance of psychological support in family functionality, which has a direct impact on the quality of life of these families.
Keywords: Deafness, psychological support, family, adaptation to disability.
4127 Quantum-Like Approach for Deriving a Theory Describing the Concept of Interpretation
Authors: Yehuda Roth
Abstract:
In quantum theory, a system’s time evolution is predictable unless an observer performs a measurement, as the measurement process can randomize the system. This randomness appears when the measuring device does not accurately describe the measured item, i.e., when the states characterizing the measuring device appear as a superposition of those being measured. When such a mismatch occurs, the measured data randomly collapse into a single eigenstate of the measuring device. This scenario resembles the interpretation process, in which the observer does not experience an objective reality but interprets it based on preliminary descriptions initially ingrained into his/her mind. This distinction is the motivation for the present study, in which the collapse scenario is regarded as part of the interpretation process of the observer. By adopting the formalism of quantum theory, we present a complete mathematical approach that describes the interpretation process. We demonstrate this process by applying the proposed interpretation formalism to the ambiguous image "My wife and mother-in-law" to identify whether the woman in the picture is young or old.
Keywords: Interpretation, ambiguous images, data reception, state matching, classification, determination.
4126 Sperm Whale Signal Analysis: Comparison using the Auto Regressive model and the Daubechies 15 Wavelets Transform
Authors: Olivier Adam, Maciej Lopatka, Christophe Laplanche, Jean-François Motsch
Abstract:
This article presents the results of using a parametric approach and a Wavelet Transform to analyse signals emitted by the sperm whale. The extraction of intrinsic characteristics of these unique signals emitted by marine mammals remains a difficult exercise for various reasons: firstly, they are non-stationary signals, and secondly, these signals are obstructed by interfering background noise. In this article, we compare the advantages and disadvantages of both methods: Auto Regressive models and the Wavelet Transform. These approaches serve as an alternative to the commonly used estimators based on the Fourier Transform, for which the hypotheses necessary for its application are, in certain cases, not sufficiently proven. These modern approaches provide effective results, particularly for the periodic tracking of the signal's characteristics, notably when the signal-to-noise ratio negatively affects signal tracking. Our objectives are twofold. Our first goal is to identify the animal through its acoustic signature. This includes recognition of the marine mammal species and ultimately of the individual animal (within the species). The second is much more ambitious and directly involves the intervention of cetologists to study the sounds emitted by marine mammals in an effort to characterize their behaviour. We are working on an approach based on recordings of marine mammal signals, analysed using the Wavelet Transform. This article explores the reasons for using this approach. In addition, thanks to the use of new processors, these algorithms, once heavy in calculation time, can be integrated in a real-time system.
Keywords: Autoregressive model, Daubechies wavelet, Fourier Transform, marine mammals, signal processing, spectrogram, sperm whale, Wavelet Transform.
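As a sketch of the parametric (Auto Regressive) side of the comparison, the code below estimates AR(2) coefficients from the Yule-Walker equations on a synthetic signal. The model order, the synthetic process, and all names are illustrative assumptions, not the authors' setup; real click recordings would call for higher orders and windowed, segment-by-segment estimation.

```python
import math
import random

def autocorr(x, lag):
    """Normalized sample autocorrelation at the given lag."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + lag] - m) for i in range(n - lag))
    den = sum((xi - m) ** 2 for xi in x)
    return num / den

def yule_walker_ar2(x):
    """Solve the 2x2 Yule-Walker system for AR(2) coefficients a1, a2:
       [ 1  r1 ] [a1]   [r1]
       [ r1  1 ] [a2] = [r2]"""
    r1, r2 = autocorr(x, 1), autocorr(x, 2)
    det = 1 - r1 * r1
    a1 = (r1 - r1 * r2) / det
    a2 = (r2 - r1 * r1) / det
    return a1, a2

# Synthetic AR(2) process: x[t] = 0.6 x[t-1] - 0.2 x[t-2] + white noise.
random.seed(1)
x = [0.0, 0.0]
for _ in range(5000):
    x.append(0.6 * x[-1] - 0.2 * x[-2] + random.gauss(0, 1))
a1, a2 = yule_walker_ar2(x)
print(round(a1, 2), round(a2, 2))
```

The recovered coefficients (near 0.6 and -0.2) form the compact "acoustic signature" that a parametric model offers, which is what the article weighs against the wavelet decomposition.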
4125 Computer Proven Correctness of the Rabin Public-Key Scheme
Authors: Johannes Buchmann, Markus Kaiser
Abstract:
We describe a formal specification and verification of the Rabin public-key scheme in the formal proof system Isabelle/HOL. The idea is to combine the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. The analysis presented uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. This yields the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Consequently, we get reliable proofs, with a minimal error rate, that augment the used database. This provides a formal basis for more computer proof constructions in this area.
Keywords: Public-key encryption, Rabin public-key scheme, formal proof system, higher-order logic, formal verification.
4124 Precombining Adaptive LMMSE Detection for DS-CDMA Systems in Time Varying Channels: Non Blind and Blind Approaches
Authors: M. D. Kokate, T. R. Sontakke, P. W. Wani
Abstract:
This paper deals with an adaptive multiuser detector for direct-sequence code division multiple-access (DS-CDMA) systems. A modified receiver, precombining LMMSE, is considered in a time-varying channel environment. Detector updating is performed with two criteria: mean square error (MSE) estimation and the minimum output energy (MOE) optimization technique. The adaptive implementation issues of these two schemes are quite different. The MSE criterion updates the filter weights by minimizing the error between the data vector and the adaptive vector. The MOE criterion, together with a canonical representation of the detector, results in a constrained optimization problem. Even though the canonical representation is very complicated under time-varying channels, it is analyzed under the assumption of an average power profile of the multipath replicas of the user of interest. The performance of both schemes is studied for practical SNR conditions. Results show that at low SNR the MSE-based precombining LMMSE is better than the blind precombining LMMSE, whereas at higher SNR the MOE scheme outperforms it.
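The MSE-criterion update described above, minimizing the error between the data and the adaptive filter output, can be sketched as a stochastic-gradient (LMS-style) adaptation. The toy two-user chip model, the step size, and all names below are illustrative assumptions, not the paper's receiver.

```python
import random

random.seed(2)

def lms_detect(received, training, mu=0.05, taps=4):
    """Adapt detector weights by stochastic gradient descent on the MSE
    between the known training symbol and the filter output."""
    w = [0.0] * taps
    for r, d in zip(received, training):
        y = sum(wi * ri for wi, ri in zip(w, r))
        e = d - y                                   # estimation error
        w = [wi + mu * e * ri for wi, ri in zip(w, r)]
    return w

# Toy scenario: the desired user's symbol leaks onto chip 1 and an
# interfering user's symbol leaks onto chip 0; chips 2-3 carry only noise.
training, received = [], []
for _ in range(2000):
    d = random.choice([-1.0, 1.0])       # desired user's symbol
    i = random.choice([-1.0, 1.0])       # interfering user's symbol
    r = [d + 0.3 * i, i + 0.3 * d,
         random.gauss(0, 0.1), random.gauss(0, 0.1)]
    training.append(d)
    received.append(r)
w = lms_detect(received, training)
print([round(wi, 2) for wi in w])
```

The adapted weights approach the Wiener solution (about [1.1, -0.33, 0, 0] for this toy model): the detector amplifies the desired chip, subtracts the interferer's leakage, and ignores the noise-only taps. The blind MOE variant reaches a similar operating point without the training sequence, at the cost of the constrained optimization noted above.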
4123 Modified Naïve Bayes Based Prediction Modeling for Crop Yield Prediction
Authors: Kefaya Qaddoum
Abstract:
Most greenhouse growers desire a predictable amount of yield in order to accurately meet market requirements. The purpose of this paper is to model a simple but often satisfactory supervised classification method. The original naive Bayes has a serious weakness: it retains redundant predictors. In this paper, a regularization technique is used to obtain a computationally efficient classifier based on naive Bayes. The suggested construction, using an L1 penalty, is capable of clearing out redundant predictors; a modification of the LARS algorithm is devised to solve the resulting problem, making the method applicable to a wide range of data. In the experimental section, a study is conducted to examine the effect of redundant and irrelevant predictors and to test the method on a WSG data set of tomato yields, where there are many more predictors than data points and predicting weekly yield is the goal of the approach. Finally, the modified approach is compared with several naive Bayes variants and other classification algorithms (SVM and kNN), and is shown to perform fairly well.
Keywords: Tomato yield prediction, naive Bayes, redundancy.
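The redundancy weakness of the original naive Bayes can be demonstrated with a minimal sketch: because naive Bayes multiplies per-predictor likelihoods, an exact copy of a predictor doubles its contribution to the class score, which is the double-counting the proposed L1 penalty is meant to suppress. The Gaussian class model and all values below are illustrative assumptions, not the paper's data.

```python
import math

def gauss_logpdf(x, mu, sigma):
    """Log-density of a Gaussian, used as the per-predictor likelihood."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) \
           - (x - mu) ** 2 / (2 * sigma ** 2)

def nb_log_score(x, prior, feats):
    """Naive Bayes log-score: log prior plus independent per-predictor terms."""
    return math.log(prior) + sum(gauss_logpdf(xi, mu, sd)
                                 for xi, (mu, sd) in zip(x, feats))

# One informative predictor (class means 0 vs 2), then an exact copy of it.
single = [(0.0, 1.0)], [(2.0, 1.0)]
doubled = [(0.0, 1.0)] * 2, [(2.0, 1.0)] * 2

gap_single = nb_log_score([1.2], 0.5, single[1]) \
           - nb_log_score([1.2], 0.5, single[0])
gap_doubled = nb_log_score([1.2, 1.2], 0.5, doubled[1]) \
            - nb_log_score([1.2, 1.2], 0.5, doubled[0])
print(gap_single, gap_doubled)   # the redundant copy doubles the log-odds margin
```

With many correlated greenhouse sensors, this inflation makes the classifier overconfident; an L1-penalized fit would instead drive the weight of the duplicated predictor to zero.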
4122 EU Families and Adolescents Quit Tobacco Focus Group Analysis in Hungary
Authors: Szilvia Gergely Seuss, Mihaela Nistor, Lilla Csáky, Péter Molnár
Abstract:
In the frame of the European Union project entitled EU-Families and Adolescents Quit Tobacco (www.eufaqt.eu), a focus group analysis was carried out in Hungary to acquire qualitative information on attitudes towards smoking in groups of adolescents, parents, and educators, respectively. It served to identify methods for smoking prevention and intervention with a family approach. The results explored the role of the family in smoking behaviour. Teachers do not feel responsible for the prevention or cessation of smoking. Adolescents are not aware of the addictive effect of cigarettes. The water pipe is popular among adolescents; therefore, more information on its harmful effects needs to be disseminated. We outlined the requirement for professionals to provide interventions. The partnership of the EU-FAQT project has worked out antismoking interventions for adolescents and their families, conducted by psychologists, to ensure skill development to prevent and quit tobacco.
Keywords: Smoking of adolescents, family approach, focus group analysis, water pipe.
4121 Quantification of the Variables of the Information Model for the Use of School Terminology from 1884 to 2014 in Dalmatia
Authors: V. Vidučić, T. Brešan Ančić, M. Tomelić Ćurlin
Abstract:
Prior to quantifying the variables of the information model for using school terminology in Croatia's region of Dalmatia from 1884 to 2014, the most relevant model variables had to be determined: historical circumstances, standard of living, education system, linguistic situation, and media. The research findings show that there was no significant transfer of the 1884 school terms into 1949 usage; likewise, the 1949 school terms were not widely used in 2014. On the other hand, the research revealed that the meaning of school terms changed over the decades. The quantification of the variables will serve as the groundwork for creating an information model for using school terminology in Dalmatia from 1884 to 2014 and for defining direct growth rates in further research.
Keywords: Education system, historical circumstances, linguistic situation, media, school terminology, standard of living.
4120 2D and 3D Unsteady Simulation of the Heat Transfer in the Sample during Heat Treatment by Moving Heat Source
Authors: Z. Veselý, M. Honner, J. Mach
Abstract:
The aim of the performed work is to establish 2D and 3D models of the direct unsteady problem of sample heat treatment by a moving source, employing a computer model based on the finite element method. A complex boundary condition on the heat-loaded sample surface is the essential feature of the problem. The computer model describes heat treatment of the sample as the heat source moves over the sample surface. The 2D problem of the sample cross-section serves as the basic model, and possibilities for extending it from 2D to 3D are discussed. The effect of adding the third model dimension on the temperature distribution in the sample is shown, and the influence of various model parameters on the sample temperatures is compared. The influence of heat source motion on the depth of material heat treatment is shown for several velocities of the movement. The presented computer model is prepared for utilization in the laser treatment of machine parts.
Keywords: Computer simulation, unsteady model, heat treatment, complex boundary condition, moving heat source.
4119 Impacts of the Courtyard with Glazed Roof on House Winter Thermal Conditions
Authors: Bin Su
Abstract:
The 'wind-rain' house has a courtyard with a glazed roof, which allows more direct sunlight to come into indoor spaces during the winter. The glazed roof can be partially opened or closed, under automatic control, to provide natural ventilation in order to adjust indoor thermal conditions, and the roof area can be shaded with reflective insulation materials during the summer. Two field studies evaluating the indoor thermal conditions of two 'wind-rain' houses were carried out by the author in 2009 and 2010. Indoor and outdoor air temperature and relative humidity adjacent to the floor and ceiling of the two sample houses were continuously measured at 15-minute intervals, 24 hours a day, during the winter months. Based on the field study data, this study investigates the relationships between building design and the indoor thermal conditions of the 'wind-rain' house, to improve future house designs for thermal comfort and energy efficiency.
Keywords: Courtyard, house design, indoor thermal comfort, 'wind-rain' house.
4118 Multi-Disciplinary Optimisation Methodology for Aircraft Load Prediction
Authors: Sudhir Kumar Tiwari
Abstract:
The paper demonstrates a methodology that can be used at an early design stage of any conventional aircraft. This research assesses the feasibility of deriving a methodology for aircraft load estimation during the various design phases of a transport-category aircraft by utilizing commercial finite element analysis software, which may yield significant time savings. The early design phase has limited data, and quickly changing configurations result in a large number of load cases to handle. It is useful to idealize the aircraft as a connection of beams, which can be modelled very accurately using finite element analysis (beam elements). This research explores the correct approach to idealizing an aircraft using beam elements. FEM techniques such as inertia relief were studied for implementation during the course of the work. The correct boundary condition technique is envisaged for the generation of shear force, bending moment, and torque diagrams for the aircraft. Possible applications of this approach within the aircraft design process have been investigated.
Keywords: Multi-disciplinary optimization, aircraft load, finite element analysis, Stick Model.
4117 A Black-box Approach for Response Quality Evaluation of Conversational Agent Systems
Authors: Ong Sing Goh, C. Ardil, Wilson Wong, Chun Che Fung
Abstract:
The evaluation of conversational agents, or chatterbot question answering systems, is a major research area that needs much attention. Before the rise of domain-oriented conversational agents based on natural language understanding and reasoning, evaluation was never a problem, as information retrieval-based metrics were readily available for use. However, when chatterbots began to become more domain-specific, evaluation became a real issue. This is especially true when understanding and reasoning are required to cater for a wider variety of questions and, at the same time, to achieve high-quality responses. This paper discusses the inappropriateness of the existing measures for response quality evaluation, and the call for new standard measures and related considerations is brought forward. As a short-term solution for evaluating the response quality of conversational agents, and to demonstrate the challenges in evaluating systems of a different nature, this research proposes a black-box approach using observation, a classification scheme, and a scoring mechanism to assess and rank three example systems: AnswerBus, START, and AINI.
Keywords: Evaluation, conversational agents, response quality, chatterbots.
4116 Design of Static Synchronous Series Compensator Based Damping Controller Employing Real Coded Genetic Algorithm
Authors: S. C. Swain, A. K. Balirsingh, S. Mahapatra, S. Panda
Abstract:
This paper presents a systematic approach for designing Static Synchronous Series Compensator (SSSC) based supplementary damping controllers for damping low frequency oscillations in a single-machine infinite-bus power system. The design problem of the proposed controller is formulated as an optimization problem, and an RCGA is employed to search for the optimal controller parameters. By minimizing a time-domain objective function involving the deviation in the oscillatory rotor speed of the generator, the stability performance of the system is improved. Simulation results are presented and compared with a conventional method of tuning the damping controller parameters to show the effectiveness and robustness of the proposed design approach.
Keywords: Low frequency oscillations, phase compensation technique, Real Coded Genetic Algorithm, single-machine infinite-bus power system, Static Synchronous Series Compensator.
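The RCGA search over controller parameters can be sketched as follows. The one-dimensional quadratic objective stands in for the time-domain speed-deviation criterion, and the operators (tournament-style selection, blend crossover, Gaussian mutation, elitism) and all names are illustrative assumptions rather than the authors' exact algorithm; a real design would evaluate each candidate gain by simulating the power system.

```python
import random

random.seed(3)

def rcga_minimize(objective, bounds, pop_size=30, generations=60):
    """Real-coded GA: real-valued chromosomes, blend crossover,
    Gaussian mutation, and elitism; minimizes the given objective."""
    lo, hi = bounds
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=objective)
        nxt = scored[:2]                               # elitism: keep the best two
        while len(nxt) < pop_size:
            a, b = random.sample(scored[:10], 2)       # pick among the fittest
            child = a + random.random() * (b - a)      # blend crossover
            child += random.gauss(0, 0.1 * (hi - lo))  # Gaussian mutation
            nxt.append(min(max(child, lo), hi))        # respect parameter bounds
        pop = nxt
    return min(pop, key=objective)

# Illustrative objective: a quadratic bowl standing in for the integrated
# rotor-speed-deviation criterion, with its minimum at a gain of K = 2.5.
best = rcga_minimize(lambda k: (k - 2.5) ** 2, bounds=(0.0, 10.0))
print(round(best, 2))
```

Real-coded chromosomes avoid the binary encode/decode step of a classical GA, which is why they suit continuous controller gains such as those of the SSSC damping controller.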
4115 DocPro: A Framework for Processing Semantic and Layout Information in Business Documents
Authors: Ming-Jen Huang, Chun-Fang Huang, Chiching Wei
Abstract:
With the recent advance of the deep neural network, we observe new applications of NLP (natural language processing) and CV (computer vision) powered by deep neural networks for processing business documents. However, creating a real-world document processing system needs to integrate several NLP and CV tasks, rather than treating them separately. There is a need to have a unified approach for processing documents containing textual and graphical elements with rich formats, diverse layout arrangement, and distinct semantics. In this paper, a framework that fulfills this unified approach is presented. The framework includes a representation model definition for holding the information generated by various tasks and specifications defining the coordination between these tasks. The framework is a blueprint for building a system that can process documents with rich formats, styles, and multiple types of elements. The flexible and lightweight design of the framework can help build a system for diverse business scenarios, such as contract monitoring and reviewing.
Keywords: Document processing, framework, formal definition, machine learning.
4114 The Future of Hospitals: A Systematic Review in the Field of Architectural Design with a Disruptive Research and Development Approach
Authors: María Araya Léon, Ainoa Abella, Aura Murillo, Ricardo Guasch, Laura Clèries
Abstract:
This article aims to examine scientific theory framed within the term hospitals of the future from a multidisciplinary and cross-sectional perspective, to understand the connection that the various cross-sectional areas we studied have with architectural spaces, and to determine the future outlook of the works examined and how they can be classified into the categories of need/solution, evolution/revolution, collective/individual, and preventive/corrective. The changes currently taking place within the context of healthcare demonstrate how important these projects are and the need for companies to face future changes. A systematic review has been carried out, focused on what the hospitals of the future will be like in relation to the elements that form part of their use, design, and architectural space experience, using the WOS database from 2016 to 2019. The large number of works about sensing & big data and the scarce amount related to the area of materials is worth highlighting. Furthermore, no growth concerning future issues is envisaged over time. Regarding classifications, the articles we reviewed address evolutionary and collective solutions more, while preventive and corrective solutions were found at a similar level. Although our research focused on the future of hospitals, there is little evidence representing this approach. We also detected that, given the relevance of research on how the built environment influences human health and well-being, such studies should be promoted within the context of healthcare. This article makes it possible to find evidence on the future perspective from within the domain of hospital architecture, in order to create bridges between the productive sector of architecture and scientific theory. This will make it possible to detect R&D opportunities in each analyzed cross-section.
Keywords: Hospitals, trends, architectural space, disruptive approach.
4113 Valorization of Residues from Forest Industry for the Generation of Energy
Authors: M. A. Amezcua-Allieri, E. Torres, J. A. Zermeño Eguía-Lis, M. Magdaleno, L. A. Melgarejo, E. Palmerín, A. Rosas, D. López, J. Aburto
Abstract:
The use of biomass to produce renewable energy is one way to reduce the impact of energy production. As with any other energy resource, there are limitations to biomass use, and it must compete not only with fossil fuels but also with other renewable energy sources such as solar or wind energy. Combustion is currently the most efficient and widely used waste-to-energy process in areas where direct use of biomass is possible, without the need for large transfers of raw material. Many industrial facilities can use agricultural or forestry waste (straw, chips, bagasse, etc.) in their thermal systems without major transformations or adjustments to the furnace feed, making this waste an attractive and cost-effective option in terms of availability, access, and cost. In spite of these advantages and benefits, environmental considerations (emission of gases and particulate matter) are decisive for its use for energy purposes. This paper describes the valorization of residues from the forest industry to generate energy, using a case study.
Keywords: Bioenergy, forest waste, life-cycle assessment, waste-to-energy, electricity.
4112 Effective Communication with the Czech Customers 50+ in the Financial Market
Authors: K. Matušínská, H. Starzyczná, M. Stoklasa
Abstract:
The paper deals with finding and describing effective forms of marketing communication relating to the segment 50+ in the financial market in the Czech Republic. The segment 50+ can be seen as a great marketing potential for the future, but unfortunately the Czech financial institutions have not yet reacted sufficiently to this fact and have not prepared appropriate marketing programs for this customer segment. Demographic aging is a fundamental characteristic of current European population evolution, but the prospect of further population aging is more noticeable in the Czech Republic. This paper is based on data from one part of a primary marketing research study. The paper determines the basic problem areas, defines marketing communication in the financial market, and states the primary research problem, hypothesis, and research methodology. Finally, a suitable marketing communication approach for the selected sub-segment aged 50-60 is proposed according to the marketing research findings.
Keywords: Population aging in the Czech Republic, segment 50+, financial services, marketing communication, marketing research, marketing communication approach.
4111 VLSI Design of 2-D Discrete Wavelet Transform for Area-Efficient and High-Speed Image Computing
Authors: Mountassar Maamoun, Mehdi Neggazi, Abdelhamid Meraghni, Daoud Berkani
Abstract:
This paper presents a VLSI design approach for high-speed, real-time 2-D Discrete Wavelet Transform computing. The proposed architecture, based on a new and fast convolution approach, reduces the hardware complexity and shortens the critical path to the multiplier delay. Furthermore, an advanced two-dimensional (2-D) discrete wavelet transform (DWT) implementation, with an efficient memory area, is designed to produce one output in every clock cycle. As a result, a very high speed is attained. The system is verified, using JPEG2000 coefficient filters, on a Xilinx Virtex-II Field Programmable Gate Array (FPGA) device without accessing any external memory. The resulting computing rate is up to 270 Msamples/s, and the (9,7) 2-D wavelet filter uses only 18 kb of memory (16 kb of first-in-first-out memory) with a 256×256 image size. In this way, the developed design requires reduced memory and provides very high-speed processing as well as high PSNR quality.
Keywords: Discrete Wavelet Transform (DWT), fast convolution, FPGA, VLSI.
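The separable row-then-column structure that such a 2-D DWT architecture exploits can be sketched in a few lines. For brevity, the sketch uses the one-level Haar filter instead of the (9,7) JPEG2000 filter pair used in the paper; the function names and the toy image are illustrative assumptions.

```python
def haar_1d(row):
    """One level of the 1-D Haar transform: averages, then differences."""
    avg = [(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]
    dif = [(row[i] - row[i + 1]) / 2 for i in range(0, len(row), 2)]
    return avg + dif

def dwt_2d(image):
    """Separable 2-D DWT: filter all rows, then all columns."""
    rows = [haar_1d(r) for r in image]
    cols = [haar_1d(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

# A piecewise-constant image: all detail subbands come out exactly zero,
# leaving only the 2x2 low-pass (LL) approximation in the top-left corner.
image = [[1, 1, 2, 2],
         [1, 1, 2, 2],
         [3, 3, 4, 4],
         [3, 3, 4, 4]]
print(dwt_2d(image))
```

A hardware pipeline computes the same row/column passes with line buffers instead of a full transpose, which is why the paper's design needs only a small on-chip FIFO memory per image row rather than external frame storage.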
4110 Removal of Pharmaceutical Compounds by a Sequential Treatment of Ozonation Followed by Fenton Process: Influence of the Water Matrix
Authors: Almudena Aguinaco, Olga Gimeno, Fernando J. Beltrán, Juan José P. Sagasti
Abstract:
A sequential treatment of ozonation followed by a Fenton or photo-Fenton process, using black light lamps (365 nm) in the latter case, has been applied to remove a mixture of pharmaceutical compounds and the generated by-products, both in ultrapure water and in secondary treated wastewater. The scientific-technological innovation of this study stems from the in situ generation of hydrogen peroxide from the direct ozonation of pharmaceuticals, which can later be used in the application of the Fenton and photo-Fenton processes. The compounds selected as models were sulfamethoxazole and acetaminophen. It should be remarked that the use of a second process is necessary as a result of the low mineralization yield reached by the exclusive application of ozone. Therefore, the influence of the water matrix has been studied in terms of hydrogen peroxide concentration, individual compound concentrations, and total organic carbon removed. Moreover, the concentration of different iron species in solution has been measured.
Keywords: Fenton, photo-Fenton, ozone, pharmaceutical compounds, hydrogen peroxide, water treatment.
4109 Semi-automatic Construction of Ontology-based CBR System for Knowledge Integration
Authors: Junjie Gao, Guishi Deng
Abstract:
In order to integrate knowledge in heterogeneous case-based reasoning (CBR) systems, the ontology-based CBR system has become a hot topic. To solve the problems facing ontology-based CBR systems, for example, nonstandard architecture, deficient reuse of knowledge in legacy CBR systems, and difficult ontology construction, we propose a novel approach for semi-automatically constructing an ontology-based CBR system whose architecture is based on a two-layer ontology. Domain knowledge implied in legacy case bases can be mapped automatically from relational database schemas and knowledge items to a relevant OWL local ontology by a mapping algorithm with low time complexity. By concept clustering based on formal concept analysis, and by computing a concept equivalence measure and a concept inclusion measure, suggestions for enriching or amending the concept hierarchy of the OWL local ontologies are made automatically, which can aid designers in achieving semi-automatic construction of the OWL domain ontology. The approach is validated by an application example.
Keywords: OWL ontology, case-based reasoning, formal concept analysis, knowledge integration.
4108 Object Detection Based on Plane Segmentation and Features Matching for a Service Robot
Authors: António J. R. Neves, Rui Garcia, Paulo Dias, Alina Trifan
Abstract:
With the aging of the world population and the continuous growth in technology, service robots are increasingly explored as alternatives to healthcare givers or as personal assistants for elderly or disabled people. Any service robot should be capable of interacting with its human companion, receiving commands, navigating through the environment, either known or unknown, and recognizing objects. This paper proposes an approach to object recognition for a service robot based on depth information and color images. We present a study of two of the most used methods for object detection, in which 3D data are used to detect the position of the objects to be classified on horizontal surfaces. Since most objects of interest accessible to service robots lie on such surfaces, the proposed 3D segmentation reduces processing time and simplifies the scene for object recognition. The first approach to object recognition is based on color histograms, while the second is based on the SIFT and SURF feature descriptors. We present comparative experimental results obtained with a real service robot.
Keywords: Service robot, object recognition, 3D sensors, plane segmentation.
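The color-histogram branch of such a recognizer reduces to comparing the hue distribution of a detected object against stored models. A minimal sketch, using numpy instead of OpenCV and synthetic hue data (the object names and hue values are invented for illustration):

```python
import numpy as np

def hue_histogram(hue_pixels, bins=16):
    """Normalized hue histogram of an object region (hue in [0, 180), OpenCV-style range)."""
    hist, _ = np.histogram(hue_pixels, bins=bins, range=(0, 180))
    return hist / max(hist.sum(), 1)

def intersection(h1, h2):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint ones."""
    return float(np.minimum(h1, h2).sum())

rng = np.random.default_rng(0)
red_mug = rng.normal(10, 3, 500) % 180    # model object: hues clustered near red
blue_cup = rng.normal(120, 3, 500) % 180  # model object: hues clustered near blue
query = rng.normal(11, 3, 500) % 180      # detected object on the table, reddish

h_query = hue_histogram(query)
best = max([("red_mug", red_mug), ("blue_cup", blue_cup)],
           key=lambda kv: intersection(h_query, hue_histogram(kv[1])))
print(best[0])  # the reddish query matches the red model
```

In the paper's pipeline the plane segmentation would first isolate the object pixels, and only those pixels feed the histogram, which is what keeps this method fast.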
4107 Enhancing Rural Agricultural Value Chains through Electric Mobility Services in Ethiopia
Authors: Clemens Pizzinini, Philipp Rosner, David Ziegler, Markus Lienkamp
Abstract:
Transportation is a constituent part of most supply and value chains in modern economies. Smallholder farmers in rural Ethiopia face severe challenges along their supply and value chains; in particular, suitable, affordable, and available transport services are in high demand. To develop context-specific technical solutions, a problem-to-solution methodology based on interaction with the technology is developed. With this approach, we fill the gap between proven transportation assessment frameworks and general user-centered techniques. Central to our approach is an electric test vehicle that is deployed in rural supply and value chains for research, development, and testing. Based on our objective and the derived methodological requirements, a set of existing methods is selected. Local partners are integrated into an organizational framework that executes major parts of this research endeavour in Arsi Zone, Oromia Region, Ethiopia.
Keywords: Agricultural value chain, participatory methods, agile methods, sub-Saharan Africa, Ethiopia, electric vehicle, transport service.
4106 Robot Movement Using the Trust Region Policy Optimization
Authors: Romisaa Ali
Abstract:
The policy gradient approach is a subset of Deep Reinforcement Learning (DRL), which combines Deep Neural Networks (DNNs) with Reinforcement Learning (RL). This approach finds the optimal policy for robot movement based on the experience the robot gains from interaction with its environment. Unlike previous policy gradient algorithms, which were unable to handle the two types of error introduced by the DNN model through over- or underestimation, namely variance and bias, this algorithm is capable of handling both. This article discusses the state-of-the-art (SOTA) policy gradient technique, Trust Region Policy Optimization (TRPO), applying it in various environments and comparing it with another policy gradient method, Proximal Policy Optimization (PPO), to explain their robust optimization. Experience data are gathered during various training phases after observing the impact of hyper-parameters on neural network performance.
Keywords: Deep neural networks, deep reinforcement learning, Proximal Policy Optimization, state-of-the-art, trust region policy optimization.
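The contrast between the two methods is easiest to see in PPO's clipped surrogate objective, which replaces TRPO's explicit KL-divergence trust region with a simple clip on the probability ratio. A minimal numpy sketch (the numbers are illustrative, not from the paper's experiments):

```python
import numpy as np

def ppo_clip_objective(ratio, advantage, eps=0.2):
    """PPO's clipped surrogate: mean of min(r*A, clip(r, 1-eps, 1+eps)*A).
    Clipping removes the incentive to push the policy ratio outside
    [1-eps, 1+eps], a first-order stand-in for TRPO's KL trust region."""
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1 - eps, 1 + eps) * advantage
    return np.minimum(unclipped, clipped).mean()

# A ratio far above 1+eps with positive advantage is capped,
# so a single large update cannot drag the policy too far:
obj = ppo_clip_objective(np.array([2.0]), np.array([1.0]))
print(obj)  # 1.2, not 2.0
```

TRPO instead solves a constrained optimization (maximize the unclipped surrogate subject to a mean-KL bound), which is more robust but requires second-order machinery; the clip is PPO's cheaper approximation of the same trust-region idea.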
4105 Towards End-To-End Disease Prediction from Raw Metagenomic Data
Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker
Abstract:
Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to analysis. Such data contain millions of short sequences read from the fragmented DNA and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time-consuming, and they rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training deep neural networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most such methods use the concept of word and sentence embeddings, which create a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper we present metagenome2vec, an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction.
Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied directly to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
Keywords: Metagenomics, phenotype prediction, deep learning, embeddings, multiple instance learning.
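Step (i), building the k-mer vocabulary, is the tokenization stage on which the embeddings are learned. A minimal sketch, assuming overlapping k-mers with stride 1 (the reads and k value are illustrative, not from the paper's datasets):

```python
def kmer_tokens(read, k=4):
    """Split a DNA read into overlapping k-mers, the 'words' of the sequence."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

def build_vocab(reads, k=4):
    """Assign each distinct k-mer an integer id, ready for an embedding lookup table."""
    vocab = {}
    for read in reads:
        for kmer in kmer_tokens(read, k):
            vocab.setdefault(kmer, len(vocab))
    return vocab

reads = ["ACGTACGT", "TTACGTAA"]  # two toy reads
vocab = build_vocab(reads)
ids = [vocab[kmer] for kmer in kmer_tokens(reads[0])]
print(ids)  # [0, 1, 2, 3, 0] -- 'ACGT' recurs, so its id 0 repeats
```

The id sequences would then feed a word-embedding model to produce the numerical k-mer representations, and read embeddings (step ii) are composed from them.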
4104 Degraded Document Analysis and Extraction of Original Text Document: An Approach without Optical Character Recognition
Authors: L. Hamsaveni, Navya Prakash, Suresha
Abstract:
Document image analysis recognizes text and graphics in documents acquired as images. This paper adopts an approach to degraded document image analysis without Optical Character Recognition (OCR). The technique involves document imaging methods such as image fusing and Speeded Up Robust Features (SURF) detection to identify and extract the degraded regions from a set of document images, yielding an original document with complete information. If the captured degraded document image is skewed, it must be deskewed before further processing. The YCbCr image storage format is used as a tool to convert grayscale images to the RGB image format. The presented algorithm is tested on various types of degraded documents, such as printed documents, handwritten documents, old script documents, and handwritten image sketches in documents. The purpose of this research is to recover an original document from a given set of degraded documents of the same source.
Keywords: Grayscale image format, image fusing, SURF detection, YCbCr image format.
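The image-fusing idea can be sketched with one simple fusion rule: take the per-pixel maximum over aligned copies, so a dark stain present in one copy is replaced by the cleaner (lighter) pixel from another. This is an assumed rule for illustration; the paper's actual fusion (guided by SURF-matched regions) may differ.

```python
import numpy as np

def fuse_max(images):
    """Per-pixel maximum over aligned degraded copies of the same document.
    A dark blemish in one copy is overwritten by the lighter, undamaged
    pixel from another copy. One simple fusion rule among several."""
    return np.maximum.reduce(images)

# Two 2x2 grayscale copies, each with a stain (low value) in a different place:
a = np.array([[200,  30], [200, 200]], dtype=np.uint8)  # stain at (0, 1)
b = np.array([[200, 200], [ 40, 200]], dtype=np.uint8)  # stain at (1, 0)
fused = fuse_max([a, b])
print(fused.tolist())  # [[200, 200], [200, 200]] -- both stains removed
```

In practice the copies must first be deskewed and registered (here SURF keypoint matching would supply the alignment) before any per-pixel rule is meaningful.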
4103 3D Guidance of Unmanned Aerial Vehicles Using Sliding Mode Approach
Authors: M. Zamurad Shah, M. Kemal Özgören, Raza Samar
Abstract:
This paper presents a 3D guidance scheme for Unmanned Aerial Vehicles (UAVs). The proposed guidance scheme is based on the sliding mode approach using nonlinear sliding manifolds. Generalized 3D kinematic equations are considered during the design process to cater for the coupling between longitudinal and lateral motions. A sliding-mode-based guidance scheme is then derived for the multiple-input multiple-output (MIMO) system using the proposed nonlinear manifolds. Instead of traditional sliding surfaces, nonlinear sliding surfaces are proposed for performance and stability in all flight conditions. In the reaching-phase control inputs, the bang-bang terms with signum functions are accompanied by proportional terms to reduce the chattering amplitudes. The proposed 3D guidance scheme is implemented on a six-degrees-of-freedom (6-DOF) simulation of a UAV, and simulation results are presented for different 3D trajectories with and without disturbances.
Keywords: Unmanned Aerial Vehicles, Sliding mode control, 3D Guidance, Path following, trajectory tracking, nonlinear sliding manifolds.
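The chattering-reduction idea in the reaching law can be illustrated with a scalar toy model: the sliding variable evolves as s' = -k*sign(s) - c*s, where the proportional term -c*s does most of the work far from the surface, allowing a small bang-bang gain k. The gains and step size below are assumptions for illustration, not the paper's design values.

```python
import numpy as np

def reach(s0, k=0.5, c=2.0, dt=0.01, steps=400):
    """Euler simulation of the reaching law s' = -k*sign(s) - c*s.
    The signum term guarantees finite-time arrival at the surface; the
    proportional term keeps k small, so residual chattering near s = 0
    has amplitude on the order of k*dt rather than a large bang-bang swing."""
    s = s0
    for _ in range(steps):
        s += dt * (-k * np.sign(s) - c * s)
    return s

final = reach(5.0)
print(abs(final) < 0.05)  # True: s has converged into a small band around 0
```

With the pure bang-bang law (c = 0) the same k would either reach slowly or, if enlarged, chatter with much larger amplitude, which is the trade-off the combined reaching law avoids.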
4102 Low Cost Technique for Measuring Luminance in Biological Systems
Abstract:
In this work, the relationship between the melanin content of a tissue and the subsequent absorption of light through that tissue was determined using a digital camera. The technique proved to be simple, cost-effective, efficient, and reliable. Tissue phantom samples were created using milk and soy sauce to simulate the optical properties of melanin in human tissue; increasing the concentration of soy sauce in the milk corresponds to an increase in an individual's melanin content. Two methods were employed to measure the light transmitted through the sample. The first was direct measurement of the transmitted intensity using a conventional lux meter. The second involved calibrating an ordinary digital camera and using image analysis software to calculate the intensity transmitted through the phantom. The results from these methods were then compared graphically with the theoretical relationship between the intensity of transmitted light and the concentration of absorbers in the sample. Conclusions were then drawn about the effectiveness and efficiency of these low-cost methods.
Keywords: Tissue phantoms, scattering coefficient, albedo, low-cost method.
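The theoretical curve the measurements are compared against is the Beer-Lambert law, I = I0 * exp(-mu * c * d): transmitted intensity falls exponentially with absorber concentration. A short sketch (the attenuation coefficient and path length below are invented for illustration, not the phantom's measured values):

```python
import math

def transmitted(i0, mu, conc, path_mm):
    """Beer-Lambert law: I = I0 * exp(-mu * c * d).
    i0: incident intensity; mu: attenuation coefficient per unit
    concentration and length; conc: absorber concentration; path_mm: path length."""
    return i0 * math.exp(-mu * conc * path_mm)

# A useful consequence: doubling the absorber concentration squares
# the transmittance ratio I/I0, which is easy to check on a log plot.
i0 = 1000.0
t1 = transmitted(i0, mu=0.08, conc=1.0, path_mm=10) / i0
t2 = transmitted(i0, mu=0.08, conc=2.0, path_mm=10) / i0
print(abs(t2 - t1 ** 2) < 1e-12)  # True
```

Plotting log(I/I0) against concentration gives a straight line of slope -mu*d, which is the relationship both the lux-meter and calibrated-camera readings were fitted against.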