Search results for: human detection and identification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13293


12153 Localization of Radioactive Sources with a Mobile Radiation Detection System using Profit Functions

Authors: Luís Miguel Cabeça Marques, Alberto Manuel Martinho Vale, José Pedro Miragaia Trancoso Vaz, Ana Sofia Baptista Fernandes, Rui Alexandre de Barros Coito, Tiago Miguel Prates da Costa

Abstract:

The detection and localization of hidden radioactive sources are of significant importance in countering the illicit traffic of Special Nuclear Materials (SNM) and other radioactive sources and materials. Radiation portal monitors are commonly used at airports, seaports, and international land borders for inspecting cargo and vehicles. However, such equipment can be expensive and is not available at all checkpoints. Consequently, the localization of SNM and other radioactive sources often relies on handheld equipment, which can be time-consuming. The current study presents the advantages of real-time analysis of gamma-ray count rate data from a mobile radiation detection system, based on simulated data and field tests. The incorporation of profit functions and decision criteria to optimize the detection system's path significantly enhances the radiation field information and reduces survey time during cargo inspection. For source position estimation, a maximum likelihood estimation algorithm is employed, and confidence intervals are derived using the Fisher information. The study also explores the impact of uncertainties, baselines, and thresholds on the performance of the profit function. The proposed detection system, utilizing a plastic scintillator with silicon photomultiplier sensors, boasts several benefits, including cost-effectiveness, high geometric efficiency, compactness, and lightweight design. This versatility allows for seamless integration into any mobile platform, be it air, land, maritime, or hybrid, and it can also serve as a handheld device. Furthermore, the integration of the detection system into drones, particularly multirotors, and its affordability enable the automation of the source search and a substantial reduction in survey time, particularly when deploying a fleet of drones. 
While the primary focus is on inspecting maritime container cargo, the methodologies explored in this research can be applied to the inspection of other infrastructures, such as nuclear facilities or vehicles.
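As an illustration of the estimation step described above (not the authors' implementation), the following sketch fits a 1-D source position to Poisson count-rate data collected along a detector path by maximum likelihood, and derives an approximate confidence interval from the observed Fisher information. The source strength, background baseline, and detector geometry are hypothetical values chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

x_det = np.linspace(0.0, 10.0, 50)      # detector positions along the path (m)
x_true, A, b = 6.3, 500.0, 5.0          # true source position, strength, background

def expected_counts(x_src):
    # inverse-square law plus a constant background baseline
    return A / ((x_det - x_src) ** 2 + 1.0) + b

counts = rng.poisson(expected_counts(x_true))

def log_like(x_src):
    lam = expected_counts(x_src)
    return np.sum(counts * np.log(lam) - lam)

# grid-search maximum likelihood estimate of the source position
grid = np.linspace(0.0, 10.0, 2001)
x_hat = grid[np.argmax([log_like(x) for x in grid])]

# Fisher information via a numerical second derivative of the log-likelihood
h = 1e-3
fisher = -(log_like(x_hat + h) - 2 * log_like(x_hat) + log_like(x_hat - h)) / h**2
ci_half_width = 1.96 / np.sqrt(fisher)   # approximate 95% confidence interval

print(f"estimate: {x_hat:.2f} m, 95% CI: +/- {ci_half_width:.2f} m")
```

With stronger sources or longer dwell times the likelihood sharpens and the Fisher-based interval narrows accordingly.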

Keywords: plastic scintillators, profit functions, path planning, gamma-ray detection, source localization, mobile radiation detection system, security scenario

Procedia PDF Downloads 102
12152 User Requirements Study in Order to Improve the Quality of Social Robots for Dementia Patients

Authors: Konrad Rejdak

Abstract:

Introduction: Neurodegenerative diseases are frequently accompanied by loss of functional independence and unwanted changes in social relationships and economic circumstances. The achievements of social robots to date are being projected to improve the multidimensional quality of life of people with cognitive impairment and others. Objectives: Identification of particular human needs in the context of the changes occurring in the course of neurodegenerative diseases. Methods: Based on 110 surveys of medical staff, patients, and caregivers performed at the Medical University of Lublin, we prioritized the users' needs as high, medium, or low. The surveys covered four aspects: user acceptance, functional requirements, the design of the robotic assistant, and preferred types of human-robot interaction. Results: We received completed questionnaires: 50 from medical staff, 30 from caregivers, and 30 from potential users. Over 90% of the respondents in each of the three groups accepted a robotic assistant as a potential caregiver. The highest-priority functional capability of the assistive technology was handling emergencies in a private home, such as recognizing life-threatening situations and reminding the user about medication intake. With reference to the design of the robotic assistant, the majority of respondents preferred an anthropomorphic appearance with a positive, emotionally expressive face. The most important types of human-robot interaction were a voice-operated system and a touchscreen. Conclusion: The results of our study may contribute to a better understanding of the system and user requirements for the development of a service robot intended to support patients with dementia.

Keywords: assistant robot, dementia, long term care, patients

Procedia PDF Downloads 149
12151 Location Tracking of Human Using Mobile Robot and Wireless Sensor Networks

Authors: Muazzam A. Khan

Abstract:

To mitigate the consequences of dangerous environmental disasters, robots are being recognized as good candidates to step in as human rescuers. Robots equipped with advanced sensors have been gaining the interest of many researchers in rescue applications. In a distributed wireless robot system, the main objective of a rescue system is to track the location of the target continuously. This paper provides a novel idea for tracking and locating humans in a disaster area using a stereo vision system and ZigBee technology. The system recursively predicts and updates the 3D coordinates of a human in the robot's camera coordinate system, which makes the system cost-effective. The system is built on a ZigBee network, which has many advantages, such as low power consumption, self-healing, low data rates, and low cost.
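The core geometric step behind such a system is stereo triangulation: recovering the 3D coordinates of a tracked person in the camera coordinate frame from the disparity between the left and right images. A minimal sketch follows; the focal length, baseline, and pixel coordinates are hypothetical example values, not parameters from the paper.

```python
def stereo_to_3d(x_left, x_right, y, focal_px, baseline_m):
    """Return (X, Y, Z) in metres in the left-camera coordinate system."""
    disparity = x_left - x_right          # pixels; positive for points in front
    if disparity <= 0:
        raise ValueError("point at infinity or invalid correspondence")
    Z = focal_px * baseline_m / disparity # depth from similar triangles
    X = x_left * Z / focal_px             # back-project image coordinates
    Y = y * Z / focal_px
    return X, Y, Z

# example: 700 px focal length, 0.12 m baseline, 35 px disparity -> Z = 2.4 m
X, Y, Z = stereo_to_3d(x_left=70.0, x_right=35.0, y=14.0,
                       focal_px=700.0, baseline_m=0.12)
print(round(Z, 2))  # 2.4
```

In a full tracker these per-frame measurements would feed the recursive predict-and-update loop (e.g. a Kalman filter) the abstract alludes to.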

Keywords: stereo vision, segmentation, classification, human tracking, ZigBee module

Procedia PDF Downloads 487
12150 Open-Source YOLO CV For Detection of Dust on Solar PV Surface

Authors: Jeewan Rai, Kinzang, Yeshi Jigme Choden

Abstract:

Accumulation of dust on solar panels impacts their overall efficiency and the amount of energy they produce. While various techniques exist for detecting dust in order to schedule cleaning, many of these methods use MATLAB image processing tools and other licensed software, which can be financially burdensome. This study investigates the efficiency of a free, open-source computer vision library using the YOLO algorithm. The proposed approach has been tested on images of solar panels with varying dust levels through an experimental setup. The experimental findings illustrate the effectiveness of the YOLO-based image classification method and the overall dust detection approach, which achieved 90% accuracy in distinguishing between clean and dusty panels. This open-source solution provides a cost-effective and accessible alternative to commercial image processing tools, offering a path to optimizing solar panel maintenance and enhancing energy production.
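The paper trains YOLO on labelled panel images, which requires the dataset and trained weights. As a hedged stand-in that runs without either, this sketch illustrates the underlying binary clean/dusty decision on synthetic images using a crude heuristic: dust tends to wash out local contrast, lowering pixel variance. The threshold and the synthetic "panels" are illustrative only and are not part of the study's method.

```python
import numpy as np

rng = np.random.default_rng(1)

clean_panel = rng.normal(loc=120.0, scale=30.0, size=(64, 64))  # high-contrast cells
dusty_panel = rng.normal(loc=150.0, scale=8.0, size=(64, 64))   # washed-out by dust

def classify_panel(img, variance_threshold=400.0):
    """Label an image 'dusty' when its pixel variance falls below the threshold."""
    return "dusty" if np.var(img) < variance_threshold else "clean"

print(classify_panel(clean_panel), classify_panel(dusty_panel))
```

A learned detector such as YOLO replaces this hand-set threshold with features learned from labelled examples, which is what yields the reported 90% accuracy.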

Keywords: YOLO, openCV, dust detection, solar panels, computer vision, image processing

Procedia PDF Downloads 14
12149 An Overview on the Effectiveness of Brand Mascot and Celebrity Endorsement

Authors: Isari Pairoa, Proud Arunrangsiwed

Abstract:

Celebrity and brand mascot endorsement have been explored for more than three decades. Both types of endorser can effectively transfer their reputation to a corporate image and can influence customers to purchase the product. However, little is known about the mediators between the level of endorsement and its effect on buying behavior. The objective of the current study is to identify the gap in previous studies and to seek possible mediators. It was found that consumers' memory and identification are the mediators between source credibility and endorsement effect. A future study should confirm the model of endorsement established in the current study.

Keywords: product endorsement, memory, identification theory, source credibility, unintentional effect

Procedia PDF Downloads 221
12148 The Context of Human Rights in a Poverty-Stricken Africa: A Reflection

Authors: Ugwu Chukwuka E.

Abstract:

The African context of the human rights instruments recognized today can be traced to Africa's relationship with the Western world. A significant preponderance of these instruments is found in both colonial and post-colonial statutes: colonial laws, post-colonial legal documents such as constitutions, and Africa's adherence to relevant international instruments on human rights such as the Universal Declaration of Human Rights (1948) and the African Charter on Human and Peoples' Rights (1981). In spite of all the human rights instruments present in the African continent, this paper contends that these Western-oriented notions of human rights emphasize rights that hardly meet the current needs of contemporary African citizens. Adopting a historical research methodology, this study interrogates the dynamics of the African poverty context in relation to the implementation of human rights instruments on the continent. In this vein, using human rights and poverty scenarios from one Anglophone (Uganda) and one Francophone (Senegal) country in Africa, the study hypothesizes that the majority of Africans are not in a historical condition for the realization of these rights. The rationale for this claim is that the present generation of ordinary Africans is inundated with extensive powerlessness, ignorance, disease, hunger, and overall poverty, which erodes their interest in these rights instruments. In contrast, the few Africans who do enjoy these rights on the continent hardly need these instruments, as their power and resource base already secures them. The paper concludes that the efforts of African states and stakeholders in African affairs should concentrate significantly on alleviating the present historical poverty of Africans, which, when attended to, will enhance the realization of human rights on the continent.

Keywords: Africa, human rights, poverty, western world

Procedia PDF Downloads 434
12147 Design of an Ensemble Learning Behavior Anomaly Detection Framework

Authors: Abdoulaye Diop, Nahid Emad, Thierry Winter, Mohamed Hilia

Abstract:

Data asset protection is a crucial issue in the cybersecurity field. Companies use logical access control tools to vault their information assets and protect them against external threats, but they lack solutions to counter insider threats. Nowadays, insider threats are the most significant concern of security analysts. They are mainly individuals with legitimate access to a company's information systems who use their rights with malicious intent. In several fields, behavior anomaly detection is the method used by cyber specialists to effectively counter the threat of malicious user activity. In this paper, we present a step toward the construction of a user and entity behavior analysis framework by proposing a behavior anomaly detection model. This model combines machine learning classification techniques and graph-based methods, relying on linear algebra and parallel computing techniques. We show the utility of an ensemble learning approach in this context and present test results for several detection methods on a representative access control dataset. Some of the explored classifiers achieve up to 99% accuracy.
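A minimal sketch of the ensemble idea described above: several weak behaviour classifiers, each scoring user sessions independently, are fused by majority vote into a single anomaly decision. The session features, thresholds, and synthetic data are hypothetical and stand in for the paper's learned classifiers.

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic session features: [logins_per_hour, bytes_out_MB, off_hours_ratio]
normal = rng.normal([3, 10, 0.1], [1, 3, 0.05], size=(50, 3))
insider = rng.normal([9, 80, 0.7], [1, 10, 0.1], size=(5, 3))
sessions = np.vstack([normal, insider])

# three simple rule-based "classifiers" voting anomalous (1) or normal (0)
votes = np.stack([
    (sessions[:, 0] > 6).astype(int),    # unusually many logins
    (sessions[:, 1] > 40).astype(int),   # large outbound transfer
    (sessions[:, 2] > 0.4).astype(int),  # mostly off-hours activity
])
anomalous = votes.sum(axis=0) >= 2       # majority vote fuses the ensemble

print(f"{anomalous.sum()} of {len(sessions)} sessions flagged")
```

The fusion step is what gives the ensemble its robustness: a single noisy rule cannot flag a session on its own.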

Keywords: cybersecurity, data protection, access control, insider threat, user behavior analysis, ensemble learning, high performance computing

Procedia PDF Downloads 118
12146 Theoretical Aspects and Practical Approach in the Research of the Human Capital of Student Volunteer Community

Authors: Kalinina Anatasiia, Pevnaya Mariya

Abstract:

The article covers the theoretical basis of research on student volunteering, identifies the defining characteristics of student volunteering as a social community, and classifies human capital indicators of student volunteers. It also presents the results of a study of 450 student volunteers in Russia concerning the correlation between international volunteering and indicators of the human capital of youth. Findings include a comparison of the human capital characteristics of "potential" and "real" international student volunteers. Factor analysis revealed two categories of active students.

Keywords: human capital, international volunteering, student volunteering, social community, youth volunteering, youth politics

Procedia PDF Downloads 549
12145 Gene Names Identity Recognition Using Siamese Network for Biomedical Publications

Authors: Micheal Olaolu Arowolo, Muhammad Azam, Fei He, Mihail Popescu, Dong Xu

Abstract:

As the quantity of biological articles rises, so does the number of biological pathway figures. Each pathway figure shows gene names and relationships. Annotating pathway diagrams manually is time-consuming, and while advanced image understanding models could speed up curation, they need to be more precise. Biological pathway figures carry rich information, and the first step in understanding them is to recognize gene names automatically. Classical optical character recognition methods have been employed for gene name recognition, but they are not optimized for literature mining data. This study devised a method to recognize the image bounding box of a gene name using deep Siamese neural network models with ResNet, DenseNet, and Inception architectures; the approach outperforms existing methods, achieving about 84% accuracy.
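A toy sketch of the Siamese idea: two inputs pass through the *same* embedding network, and the distance between their embeddings decides whether they depict the same gene name. The "network" here is a single fixed random linear layer with a tanh nonlinearity; a real model would use the ResNet/DenseNet/Inception backbones the abstract mentions, trained on labelled pairs.

```python
import numpy as np

rng = np.random.default_rng(3)
W = rng.normal(size=(16, 64))            # shared embedding weights

def embed(x):
    return np.tanh(W @ x)                # identical weights for both branches

def same_gene(x1, x2, threshold=1.0):
    """Declare a match when the embedding distance is below the threshold."""
    return np.linalg.norm(embed(x1) - embed(x2)) < threshold

anchor = rng.normal(size=64)             # e.g. features of one gene-name crop
positive = anchor + rng.normal(scale=0.01, size=64)  # slightly distorted copy
negative = rng.normal(size=64)           # features of an unrelated crop

print(same_gene(anchor, positive), same_gene(anchor, negative))
```

Training would adjust the shared weights (e.g. with a contrastive loss) so that genuine pairs land close together and impostor pairs far apart.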

Keywords: biological pathway, gene identification, object detection, Siamese network

Procedia PDF Downloads 277
12144 Risk Assessment on New Bio-Composite Materials Made from Water Resource Recovery

Authors: Arianna Nativio, Zoran Kapelan, Jan Peter van der Hoek

Abstract:

Bio-composite materials are becoming increasingly popular in various applications, such as the automotive industry. Bio-composite materials are usually made from natural resources recovered from plants; now, a new type of bio-composite material has begun to be produced in the Netherlands. This material is made from resources recovered from drinking water treatment (calcite), wastewater treatment (cellulose), and surface water management (aquatic plants). Surface water, raw drinking water, and wastewater can be contaminated with pathogens and chemical compounds. Therefore, it would be valuable to develop a framework to assess, monitor, and control the potential risks. Indeed, the goal is to define the major risks, in terms of human health, material quality, and the environment, associated with the production and application of these new materials. This study describes the general risk assessment framework, starting with a qualitative risk assessment. The qualitative risk analysis was carried out using the HAZOP methodology for the hazard identification phase. The HAZOP methodology is logical and structured and able to identify hazards in the first stage of design, when hazards and associated risks are not well known. The identified hazards were analyzed to define the potential associated risks, which were then evaluated using qualitative Event Tree Analysis (ETA). ETA is a logical methodology used to define the consequences of a specific hazardous incident by evaluating the failure modes of safety barriers and the dangerous intermediate events that lead to the final scenario (risk). This paper shows the effectiveness of combining the HAZOP and qualitative ETA methodologies for hazard identification and risk mapping. Key risks were then identified, and a quantitative framework was developed based on the types of risks identified, such as QMRA and QCRA. These two models were applied to assess human health risks due to the presence of pathogens and chemical compounds, such as heavy metals, in the bio-composite materials. Due to such contamination, the bio-composite product might, during its application, release toxic substances into the environment, leading to a negative environmental impact. Therefore, leaching tests are planned to simulate the application of these materials in the environment and evaluate the potential leaching of inorganic substances, assessing the environmental risk.

Keywords: bio-composite, risk assessment, water reuse, resource recovery

Procedia PDF Downloads 104
12143 Fault Detection and Isolation in Sensors and Actuators of Wind Turbines

Authors: Shahrokh Barati, Reza Ramezani

Abstract:

Due to countries' growing attention to renewable energy production, the demand for energy from renewable sources has gone up; among renewable energy sources, wind energy has shown the fastest growth in recent years. In this regard, to increase the availability of wind turbines, the use of a Fault Detection and Isolation (FDI) system is necessary. Wind turbines are subject to various faults, such as sensor faults, actuator faults, network connection faults, mechanical faults, and faults in the generator subsystem. Although sensors and actuators account for a large number of faults in wind turbines, they have been discussed less in the literature. Therefore, in this work, we focus our attention on designing a sensor and actuator fault detection and isolation algorithm and a fault-tolerant control system (FTCS) for wind turbines. The aim of this research is to propose a comprehensive fault detection and isolation system for the sensors and actuators of wind turbines based on data-driven approaches. To achieve this goal, features of the measurable signals of a real wind turbine are extracted under all operating conditions. The next step is feature selection among the extracted features: features that lead to maximum class separation are selected, the resulting classifiers are implemented in parallel, and their results are fused together. To maximize the reliability of the fault decision, the property of fault repeatability is used.
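A hedged sketch of the feature-selection step described above: rank candidate features by a Fisher-style separation score (between-class distance over within-class spread) and keep the most separating ones. The synthetic "healthy" vs "faulty" turbine features are illustrative only, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(4)

healthy = rng.normal([0.0, 5.0, 1.0, 2.0], 1.0, size=(100, 4))
faulty = rng.normal([0.1, 9.0, 1.0, 6.0], 1.0, size=(100, 4))  # features 1 and 3 separate

def fisher_score(a, b):
    """Per-feature separation: squared mean gap over pooled variance."""
    return (a.mean(axis=0) - b.mean(axis=0)) ** 2 / (a.var(axis=0) + b.var(axis=0))

scores = fisher_score(healthy, faulty)
selected = np.argsort(scores)[::-1][:2]          # keep the 2 most separating features
print("selected feature indices:", sorted(selected.tolist()))
```

The selected features would then feed the parallel classifiers whose outputs are fused, as the abstract describes.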

Keywords: FDI, wind turbines, sensors and actuators faults, renewable energy

Procedia PDF Downloads 395
12142 Performing Diagnosis in Building with Partially Valid Heterogeneous Tests

Authors: Houda Najeh, Mahendra Pratap Singh, Stéphane Ploix, Antoine Caucheteux, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to different kinds of faults and human misbehavior. Energy efficiency and user comfort are directly affected by abnormalities in building operation. The available fault diagnosis tools and methodologies rely mainly on rules or pure model-based approaches, and it is assumed that a model- or rule-based test can be applied to any situation without taking the actual testing context into account. Contextual tests with validity domains could considerably simplify the design of detection tests. The main objective of this paper is to consider test validity when validating the test model, taking into account non-modeled events such as occupancy, weather conditions, and door and window openings, and integrating the expert's knowledge of the state of the system. The concept of heterogeneous tests is combined with test validity to generate fault diagnoses. A combination of rule-, range-, and model-based tests, known as heterogeneous tests, is proposed to reduce the modeling complexity. The calculation of logical diagnoses, drawing on artificial intelligence, provides a global explanation consistent with the test results. An application example, an office setting at the Grenoble Institute of Technology, shows the efficiency of the proposed technique.

Keywords: heterogeneous tests, validity, building system, sensor grids, sensor fault, diagnosis, fault detection and isolation

Procedia PDF Downloads 284
12141 Curbing Cybercrime by Application of Internet Users’ Identification System (IUIS) in Nigeria

Authors: K. Alese Boniface, K. Adu Michael

Abstract:

Cybercrime, alongside traditional crime, is now becoming a big challenge in Nigeria. The inability to identify perpetrators is one of the reasons for the growing menace. This paper proposes a design for monitoring internet users' activities in order to curb cybercrime. It requires redefining the operations of Internet Service Providers (ISPs), which would now mandate users to be authenticated before accessing the internet. In implementing this work, which can be adapted to a larger scale, a virtual router application is developed and configured to mimic a real router device. A sign-up portal is developed to allow users to register with the ISP. The portal asks for identification information, including bio-data and government-issued identification data such as the National Identity Card number. A unique username and password are chosen by the user to enable access to the internet; these are used to link the user to the Internet Protocol (IP) address of any system he uses on the internet, thereby associating him with any criminal act related to that IP address at that particular time. Questions such as "What happens when another user knows the password and uses it to commit a crime?" and other pertinent issues are addressed.

Keywords: cybercrime, sign-up portal, internet service provider (ISP), internet protocol address (IP address)

Procedia PDF Downloads 272
12140 A Resource Based View: Perspective on Acquired Human Resource towards Competitive Advantage

Authors: Monia Hassan Abdulrahman

Abstract:

The resource-based view is built on many theories and diverse perspectives. We extend this view with an emphasis on human resources, addressing the tools required to sustain competitive advantage. Drawing on several theories and judgments, assumptions were established to determine whether resource possession alone suffices for the sustainability of competitive advantage, or whether further accommodations are required for better performance. New practices were identified in terms of the resources used in firms; these practices were applied to human resources in particular, and results were developed in line with the stated assumptions. The results drew attention to the significance of practices that enhance human resources, which bear the core responsibility of maintaining the resource-based view for an organization and lead the way to gaining competitive advantage.

Keywords: competitive advantage, resource based value, human resources, strategic management

Procedia PDF Downloads 385
12139 Optimization of Hate Speech and Abusive Language Detection on Indonesian-language Twitter using Genetic Algorithms

Authors: Rikson Gultom

Abstract:

Hate speech and abusive language on social media are difficult to detect; usually they are detected only after they have gone viral in cyberspace, which is of course too late for prevention. An early detection system with reasonably good accuracy is needed to reduce the conflicts in society caused by social media posts that attack individuals, groups, and governments in Indonesia. The purpose of this study is to find an early detection model for the Twitter social network, using the machine learning method with the highest accuracy among those studied. In this study, the Support Vector Machine (SVM), Naïve Bayes (NB), and Random Forest Decision Tree (RFDT) methods were compared with the Support Vector Machine with genetic algorithm (SVM-GA), Naïve Bayes with genetic algorithm (NB-GA), and Random Forest Decision Tree with genetic algorithm (RFDT-GA). The study produced a comparison table of the accuracy of the hate speech and abusive language detection models, presented a graph of the accuracy of the six algorithms developed on the Indonesian-language Twitter dataset, and identified the best model with the highest accuracy.
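A minimal genetic-algorithm sketch of the optimization idea above. Instead of tuning a real SVM (which would need the Twitter dataset), the fitness here is a stand-in "validation accuracy" curve with a known best hyperparameter at C = 3.0; the population size, selection scheme, and mutation rate are all illustrative choices.

```python
import random

random.seed(5)

def fitness(c):                          # pretend validation accuracy, peaking at c = 3.0
    return 1.0 / (1.0 + (c - 3.0) ** 2)

population = [random.uniform(0.0, 10.0) for _ in range(20)]
for _ in range(40):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]            # selection: keep the fittest half
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        child = (a + b) / 2.0            # crossover: average two parents
        child += random.gauss(0.0, 0.1)  # mutation: small random nudge
        children.append(child)
    population = parents + children

best = max(population, key=fitness)
print(f"best hyperparameter: {best:.2f}")
```

In the study's SVM-GA, NB-GA, and RFDT-GA variants, the fitness function would be the cross-validated accuracy of the classifier on the labelled tweets.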

Keywords: abusive language, hate speech, machine learning, optimization, social media

Procedia PDF Downloads 122
12138 Verification of Space System Dynamics Using the MATLAB Identification Toolbox in Space Qualification Test

Authors: Yuri V. Kim

Abstract:

This article presents a new approach to the functional testing of space systems (SS). It can be considered a generic test, usable for the wide class of SS that, from the point of view of system dynamics and control, may be described by ordinary differential equations. The suggested methodology is based on a semi-natural experiment: a laboratory stand that doesn't require complicated, precise, and expensive technological control-verification equipment. Nevertheless, it allows the system to be tested as a whole, as a fully assembled unit, during Assembling, Integration and Testing (AIT) activities, involving the system hardware (HW) and software (SW). The test physically stimulates the system inputs (sensors) and outputs (actuators) and requires recording their outputs in real time. The data is then transferred to a laboratory PC, where it is post-processed by the MATLAB/Simulink Identification Toolbox. This allows the system dynamics to be estimated experimentally, in the form of estimated system differential equations, and compared with the expected mathematical model previously verified by mathematical simulation during the design process.
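The identification step can be sketched in a few lines (the paper uses the MATLAB Identification Toolbox; this Python version is only an illustration of the principle): fit a discrete-time ARX model, y[k] = a1·y[k-1] + a2·y[k-2] + b1·u[k-1], to recorded input/output data by least squares, then compare the recovered coefficients with the design model. The "true" coefficients below are made-up example values.

```python
import numpy as np

rng = np.random.default_rng(6)

a1, a2, b1 = 1.5, -0.7, 0.5              # stable second-order system (design model)
u = rng.normal(size=500)                 # excitation applied to the system input
y = np.zeros(500)
for k in range(2, 500):
    y[k] = a1 * y[k-1] + a2 * y[k-2] + b1 * u[k-1]   # simulated recorded output

# regression matrix of past outputs and inputs for k = 2..499
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print("estimated [a1, a2, b1]:", np.round(theta, 3))
```

With real stand data the fit is no longer exact, and the residual between estimated and design coefficients becomes the verification criterion.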

Keywords: system dynamics, space system ground tests and space qualification, system dynamics identification, satellite attitude control, assembling, integration and testing

Procedia PDF Downloads 157
12137 System Identification of Timber Masonry Walls Using Shaking Table Test

Authors: Timir Baran Roy, Luis Guerreiro, Ashutosh Bagchi

Abstract:

Dynamic studies are important for the design, repair, and rehabilitation of structures. They have played an important role in characterizing the behavior of structures such as bridges, dams, and high-rise buildings. There has been substantial development in this area over the last few decades, especially in the field of dynamic identification techniques for structural systems. Frequency Domain Decomposition (FDD) and Time Domain Decomposition are the most commonly used methods to identify modal parameters such as natural frequency, modal damping, and mode shape. The focus of the present research is the dynamic characteristics of the typical timber masonry walls commonly used in Portugal. For that purpose, multi-storey structural prototypes of such walls have been tested on a seismic shaking table at the National Laboratory for Civil Engineering, Portugal (LNEC). Signal processing has been performed on the output response, collected from accelerometers during the shaking table experiment on the prototype. In the present work, signal processing of the output response, based on the input response, has been done in two ways: FDD and Stochastic Subspace Identification (SSI). To estimate the values of the modal parameters, algorithms for FDD are formulated, and parametric functions for SSI are computed. Finally, the estimates from both methods are compared to measure the accuracy of the two techniques.
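A compact sketch of the FDD step described above: form the cross-power spectral density (CSD) matrix of the response channels at each frequency line, take its singular value decomposition, and read the natural frequency off the peak of the first singular value. The 4 Hz synthetic "wall response" is illustrative only; a real implementation would average the CSD over windowed segments rather than use the rank-one estimate shown here.

```python
import numpy as np

rng = np.random.default_rng(7)

fs, n = 256, 4096
t = np.arange(n) / fs
f_true = 4.0                                      # dominant mode at 4 Hz
ch1 = np.sin(2 * np.pi * f_true * t) + 0.3 * rng.normal(size=n)
ch2 = 0.8 * np.sin(2 * np.pi * f_true * t + 0.2) + 0.3 * rng.normal(size=n)

Y = np.fft.rfft(np.stack([ch1, ch2]), axis=1)     # (channels, frequency lines)
freqs = np.fft.rfftfreq(n, 1 / fs)

# first singular value of the 2x2 CSD matrix at each frequency line
s1 = np.empty(len(freqs))
for i in range(len(freqs)):
    csd = np.outer(Y[:, i], np.conj(Y[:, i]))     # rank-one CSD estimate
    s1[i] = np.linalg.svd(csd, compute_uv=False)[0]

print(f"identified natural frequency: {freqs[np.argmax(s1)]:.2f} Hz")
```

The singular vector at the peak approximates the mode shape, which is how FDD recovers both frequency and shape from output-only data.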

Keywords: frequency domain decomposition (fdd), modal parameters, signal processing, stochastic subspace identification (ssi), time domain decomposition

Procedia PDF Downloads 260
12136 The Journey of a Malicious HTTP Request

Authors: M. Mansouri, P. Jaklitsch, E. Teiniker

Abstract:

SQL injection is a very popular kind of attack on web applications. Mechanisms such as intrusion detection systems exist to detect this attack, but these strategies often rely on techniques implemented at high layers of the application and do not consider the low level of system calls. The problem with considering only the high-level perspective is that an attacker can circumvent the detection tools using techniques such as URL encoding. One technique currently used for detecting low-level attacks on privileged processes is the tracing of system calls. System calls act as a single gate to the operating system (OS) kernel; they allow the critical data to be caught at an appropriate level of detail. Our basic assumption is that any type of application, be it a system service, utility program, or web application, "speaks" the language of system calls when having a conversation with the OS kernel. At this level, we can see the actual attack while it is happening. We conducted an experiment to demonstrate the suitability of system call analysis for detecting SQL injection, and we were able to detect the attack. We therefore conclude that system calls are not only powerful in detecting low-level attacks but also enable us to detect high-level attacks such as SQL injection.
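A small illustration of the evasion problem the abstract describes: a URL-encoded SQL injection payload slips past a naive high-level keyword filter, but the decoded string that ultimately reaches lower layers (and would appear in traced system-call arguments, e.g. write buffers) exposes the attack. The filter and marker list are deliberately simplistic stand-ins, not the paper's detector.

```python
from urllib.parse import unquote

SQL_MARKERS = ("' or ", "union select", "drop table", "--")

def naive_http_filter(raw_request):
    """Flag a request when it contains an obvious SQL-injection marker."""
    return any(m in raw_request.lower() for m in SQL_MARKERS)

payload = "id=1%27%20OR%20%271%27%3D%271"        # encodes: 1' OR '1'='1

blocked_at_http_layer = naive_http_filter(payload)          # encoding hides the attack
detected_after_decoding = naive_http_filter(unquote(payload))  # decoded form reveals it
print(blocked_at_http_layer, detected_after_decoding)
```

This is precisely why inspection at the system-call level, where the payload arrives decoded, catches attacks that high-layer filters miss.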

Keywords: Linux system calls, web attack detection, interception, SQL

Procedia PDF Downloads 351
12135 Analysis of the Unmanned Aerial Vehicles’ Incidents and Accidents: The Role of Human Factors

Authors: Jacob J. Shila, Xiaoyu O. Wu

Abstract:

As the applications of unmanned aerial vehicles (UAV) continue to increase across the world, it is critical to understand the factors that contribute to incidents and accidents associated with these systems. Given the variety of daily applications that could utilize the operations of the UAV (e.g., medical, security operations, construction activities, landscape activities), the main discussion has been how to safely incorporate the UAV into the national airspace system. The types of UAV incidents being reported range from near sightings by other pilots to actual collisions with aircraft or UAV. These incidents have the potential to impact the rest of aviation operations in a variety of ways, including human lives, liability costs, and delay costs. One of the largest causes of these incidents cited is the human factor; other causes cited include maintenance, aircraft, and others. This work investigates the key human factors associated with UAV incidents. To that end, the data related to UAV incidents that have occurred in the United States is both reviewed and analyzed to identify key human factors related to UAV incidents. The data utilized in this work is gathered from the Federal Aviation Administration (FAA) drone database. This study adopts the human factor analysis and classification system (HFACS) to identify key human factors that have contributed to some of the UAV failures to date. The uniqueness of this work is the incorporation of UAV incident data from a variety of applications and not just military data. In addition, identifying the specific human factors is crucial towards developing safety operational models and human factor guidelines for the UAV. The findings of these common human factors are also compared to similar studies in other countries to determine whether these factors are common internationally.

Keywords: human factors, incidents and accidents, safety, UAS, UAV

Procedia PDF Downloads 234
12134 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach

Authors: Gong Zhilin, Jing Yang, Jian Yin

Abstract:

The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The projected model encapsulates five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw data (input) is pre-processed to enhance its quality through the alleviation of missing data, noisy data, and null values. The pre-processed data is class-imbalanced in nature and is therefore handled with the K-means clustering-based SMOTE model. From the class-balanced data, the most relevant features are extracted: improved Principal Component Analysis (PCA) features, statistical features (mean, median, standard deviation), and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most optimal features are selected with the Self-improved Arithmetic Optimization Algorithm (SI-AOA), a conceptual improvement of the standard Arithmetic Optimization Algorithm. The detection stage uses deep learning models: Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and an optimized Quantum Deep Neural Network (QDNN). The LSTM and CNN are trained on the selected optimal features, and their outcomes enter as input to the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is also fine-tuned with the SI-AOA.
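A hedged sketch of the class-imbalance step: plain SMOTE interpolation between a minority (fraud) sample and one of its nearest minority neighbours. The paper's variant first clusters the minority class with K-means; that step is omitted here for brevity, and the toy transaction features are made up.

```python
import numpy as np

rng = np.random.default_rng(8)

fraud = rng.normal(loc=5.0, size=(10, 3))        # minority class (10 samples, 3 features)
n_synthetic = 40                                  # bring the minority count up to 50

def smote(minority, n_new, k=3):
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(minority))
        # k nearest minority neighbours of sample i (excluding itself)
        d = np.linalg.norm(minority - minority[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]
        j = rng.choice(neighbours)
        gap = rng.random()                        # interpolation coefficient in [0, 1)
        synthetic.append(minority[i] + gap * (minority[j] - minority[i]))
    return np.array(synthetic)

augmented = np.vstack([fraud, smote(fraud, n_synthetic)])
print(augmented.shape)  # (50, 3)
```

Because each synthetic point lies on a segment between two genuine fraud samples, the oversampled class stays inside the region the minority data already occupies.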

Keywords: credit card, data mining, fraud detection, money transactions

Procedia PDF Downloads 123
12133 Focus-Latent Dirichlet Allocation for Aspect-Level Opinion Mining

Authors: Mohsen Farhadloo, Majid Farhadloo

Abstract:

Aspect-level opinion mining, which aims at discovering aspects (aspect identification) and their corresponding ratings (sentiment identification) from customer reviews, has increasingly attracted the attention of researchers and practitioners, as it provides valuable insights about products and services from the customer's point of view. Instead of addressing aspect identification and sentiment identification in two separate steps, it is possible to identify both simultaneously. In recent years, many graphical models based on Latent Dirichlet Allocation (LDA) have been proposed to solve both aspect and sentiment identification in a single step. Although LDA models have been effective tools for the statistical analysis of document collections, they have shortcomings in addressing some unique characteristics of opinion mining. Our goal in this paper is to address one of the limitations of topic models to date: they fail to directly model the associations among topics. Indeed, in many text corpora it is natural to expect that subsets of the latent topics have higher probabilities. We propose a probabilistic graphical model, called focus-LDA, to better capture the associations among topics when applied to aspect-level opinion mining. Our experiments on real-life data sets demonstrate the improved effectiveness of the focus-LDA model in terms of the accuracy of the predictive distributions over held-out documents. Furthermore, we demonstrate qualitatively that the focus-LDA topic model provides a natural way of visualizing and exploring unstructured collections of textual data.
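As a point of reference for the LDA family the paper builds on, a minimal plain-LDA run on a toy review corpus might look like this (focus-LDA itself, which additionally models topic associations, is the authors' contribution and is not reproduced here):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy review corpus; real aspect-level corpora are far larger.
reviews = [
    "the battery life is great and charging is fast",
    "battery drains quickly, poor charging",
    "excellent screen, bright vivid display",
    "the display is dim and the screen scratches easily",
]

# Bag-of-words counts, then standard LDA with two latent topics
# (roughly "battery" vs "screen" aspects in this toy setting).
vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(reviews)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)  # rows are per-document topic mixtures
print(doc_topics.shape)
```
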

Keywords: aspect-level opinion mining, document modeling, Latent Dirichlet Allocation, LDA, sentiment analysis

Procedia PDF Downloads 91
12132 Fault Detection and Isolation of a Three-Tank System using Analytical Temporal Redundancy, Parity Space/Relation Based Residual Generation

Authors: A. T. Kuda, J. J. Dayya, A. Jimoh

Abstract:

This paper investigates a fault detection and isolation technique for measurement data sets from a three-tank system using analytical model-based temporal redundancy, with residuals generated via the parity equations/space approach. It also briefly outlines other approaches to model-based residual generation. The basic idea of parity-space residual generation under temporal redundancy is the dynamic relationship between sensor outputs and actuator inputs (the input-output model). The residuals are then used to detect whether or not the system is faulty and, when it is, to indicate the location of the fault. The method obtains good results, detecting and isolating faults from the considered measurement data sets generated from the system.
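The parity-relation idea, comparing measured sensor outputs against the analytical input-output model and thresholding the difference, can be sketched as follows; the tank-level data and threshold below are illustrative, not from the paper:

```python
import numpy as np

def parity_residual(y_measured, y_model, threshold=0.5):
    """Parity-relation residual: the difference between the sensor
    output and the value predicted by the analytical input-output
    model. Samples whose residual exceeds the threshold are flagged
    as faulty. The threshold value is illustrative."""
    r = np.asarray(y_measured, dtype=float) - np.asarray(y_model, dtype=float)
    faulty = np.abs(r) > threshold
    return r, faulty

# Hypothetical tank-level data: the third sample deviates (sensor fault).
y_meas = [1.00, 1.02, 2.10, 1.01]
y_mod = [1.00, 1.01, 1.03, 1.00]
r, flags = parity_residual(y_meas, y_mod)
print(flags)  # [False False  True False]
```
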

Keywords: fault detection, fault isolation, disturbing influences, system failure, parity equation/relation, structured parity equations

Procedia PDF Downloads 294
12131 Encoded Nanospheres for the Fast Ratiometric Detection of Cystic Fibrosis

Authors: Iván Castelló, Georgiana Stoica, Emilio Palomares, Fernando Bravo

Abstract:

We present herein two-colour-encoded silica nanospheres (2nanoSi) for the quantitative ratiometric fluorescence determination of trypsin in humans. The system proved to be a faster (minutes) method, with two times higher sensitivity than state-of-the-art biomarker-based sensors for cystic fibrosis (CF), allowing the quantification of trypsin concentrations over a wide range (0-350 mg/L). Furthermore, as trypsin is directly related to the development of cystic fibrosis, different human genotypes, i.e. healthy homozygotic (> 80 mg/L), CF homozygotic (< 50 mg/L), and heterozygotic (> 50 mg/L), can be distinguished using our 2nanoSi nanospheres.
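The genotype bands quoted above can be expressed as a simple threshold rule; note that reading the heterozygotic band as 50-80 mg/L is our interpretation of the stated "> 50 mg/L", since the healthy-homozygote band starts at 80 mg/L:

```python
def genotype_from_trypsin(conc_mg_per_l):
    """Genotype bands from the abstract: CF homozygote < 50 mg/L,
    healthy homozygote > 80 mg/L; the heterozygote band is read
    here as 50-80 mg/L (an interpretation, see lead-in)."""
    if conc_mg_per_l < 50:
        return "CF homozygote"
    if conc_mg_per_l > 80:
        return "healthy homozygote"
    return "heterozygote"

print(genotype_from_trypsin(120), genotype_from_trypsin(65), genotype_from_trypsin(30))
```
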

Keywords: cystic fibrosis, trypsin, quantum dots, biomarker, homozygote, heterozygote

Procedia PDF Downloads 477
12130 Human Lens Metabolome: A Combined LC-MS and NMR Study

Authors: Vadim V. Yanshole, Lyudmila V. Yanshole, Alexey S. Kiryutin, Timofey D. Verkhovod, Yuri P. Tsentalovich

Abstract:

Cataract, or clouding of the eye lens, is the leading cause of vision impairment in the world. The lens tissue has a very specific structure: it has no vascular system, and the lens proteins – crystallins – do not turn over throughout the lifespan. The protection of lens proteins is provided by metabolites that diffuse into the lens from the aqueous humor or are synthesized in the lens epithelial layer. Therefore, studying changes in the metabolite composition of a cataractous lens as compared to a normal lens may elucidate possible mechanisms of cataract formation. Quantitative metabolomic profiles of normal and cataractous human lenses were obtained with the combined use of high-frequency nuclear magnetic resonance (NMR) and ion-pairing high-performance liquid chromatography with high-resolution mass-spectrometric detection (LC-MS). The quantitative content of more than fifty metabolites has been determined in this work for normal aged and cataractous human lenses. The most abundant metabolites in the normal lens are myo-inositol, lactate, creatine, glutathione, glutamate, and glucose. For the majority of metabolites, the levels in the lens cortex and nucleus are similar, with a few exceptions including antioxidants and UV filters: the concentrations of glutathione, ascorbate, and NAD decrease in the lens nucleus relative to the cortex, while the levels of secondary UV filters, formed from primary UV filters in redox processes, increase. This confirms that the lens core is metabolically inert and that metabolic activity in the lens nucleus is mostly restricted to protection from the oxidative stress caused by UV irradiation, spontaneous UV-filter decomposition, or other factors. It was found that the metabolomic compositions of normal and age-matched cataractous human lenses differ significantly.
The content of the most important metabolites – antioxidants, UV filters, and osmolytes – in the cataractous nucleus is at least tenfold lower than in the normal nucleus. One may suppose that the majority of these metabolites are synthesized in the lens epithelial layer and that age-related cataractogenesis might originate from the dysfunction of the lens epithelial cells. Comprehensive quantitative metabolic profiles of the human eye lens have been acquired for the first time. The obtained data can be used for the analysis of changes in the lens chemical composition occurring with age and with cataract development.

Keywords: cataract, lens, NMR, LC-MS, metabolome

Procedia PDF Downloads 310
12129 Comparative Analysis of Feature Extraction and Classification Techniques

Authors: R. L. Ujjwal, Abhishek Jain

Abstract:

In the field of computer vision, most facial variations, such as identity, expression, emotion, and gender, have been extensively studied, whereas automatic age estimation has rarely been explored. As a human ages, the features of the face change. This paper provides a new comparative study of feature extraction algorithms (hybrid features using the Haar cascade and HOG) and classification algorithms (KNN and SVM) on a training dataset. Using these algorithms, we attempt to identify the best-performing classification algorithm; likewise, for the feature extraction stage, we compare features extracted with the Haar cascade and HOG. This work is carried out in the context of an age-group classification model.
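The HOG-and-SVM half of the comparison can be sketched with a simplified gradient-orientation histogram; a real pipeline would use a full HOG implementation with cell/block normalization (e.g. skimage.feature.hog) and real face images rather than the synthetic patches used here:

```python
import numpy as np
from sklearn.svm import SVC

def hog_like_features(img, bins=8):
    """Simplified gradient-orientation histogram in the spirit of HOG.
    Left unnormalized so overall edge strength also carries signal;
    a real HOG adds cell/block structure and normalization."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx) % np.pi  # unsigned orientation
    hist, _ = np.histogram(ang, bins=bins, range=(0, np.pi), weights=mag)
    return hist

# Synthetic two-class stand-in for age groups: smooth vs. textured patches.
rng = np.random.default_rng(0)
smooth = [rng.normal(0.5, 0.01, (16, 16)) for _ in range(10)]
textured = [rng.normal(0.5, 0.4, (16, 16)) for _ in range(10)]
X = np.array([hog_like_features(img) for img in smooth + textured])
y = [0] * 10 + [1] * 10

# Linear SVM classifier, as in the paper's SVM branch.
clf = SVC(kernel="linear").fit(X, y)
print(clf.score(X, y))
```
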

Keywords: computer vision, age group, face detection

Procedia PDF Downloads 362
12128 Bioaccumulation and Forensic Relevance of Gunshot Residue in Forensically Relevant Blowflies

Authors: Michaela Storen, Michelle Harvey, Xavier Conlan

Abstract:

Gun violence is increasing internationally at an unprecedented level, with firearms becoming a favoured means of violence against other individuals. This not only puts a strain on forensic scientists who attempt to determine the cause of death in circumstances where firearms have been involved, but also highlights the need for an alternative technique for identifying a gunshot wound when other established techniques have been exhausted. A corpse may be colonized by necrophagous insects following death, and this close association between the time of death and insect colonization makes entomological samples valuable evidence when remains become decomposed beyond toxicological utility. Entomotoxicology offers the potential for the identification of toxins in a decomposing corpse, with recent research uncovering its capability to detect gunshot residue (GSR). However, shortcomings of the limited literature on this topic have not been addressed: bioaccumulation, detection limits, and sensitivity to gunshots have not been considered thus far, leaving questions as to the applicability of this new technique in the forensic context. Larvae were placed on meat contaminated with GSR at different concentrations and compared to a control meat sample to establish the uptake of GSR by the larvae; bioaccumulation was assessed by placing the larvae on fresh, uncontaminated meat for a period of time before analysis using ICP-MS. The findings on Pb, Ba, and Sb at each stage of the lifecycle, and on bioaccumulation in the larvae, will be presented. In addition, throughout these experiments, larvae were washed once, twice, and three times to evaluate the effectiveness of existing entomological practices in removing external toxins from specimens prior to entomotoxicological analysis. Analysis of these larval washes will be presented.
By addressing these points, this research extends the utility of entomotoxicology in cause-of-death investigations and provides an additional source of evidence for forensic scientists in circumstances involving a gunshot wound on a corpse, in addition to assessing the effectiveness of current entomological collection protocols.

Keywords: bioaccumulation, chemistry, entomology, gunshot residue, toxicology

Procedia PDF Downloads 77
12127 A Spatio-Temporal Analysis and Change Detection of Wetlands in Diamond Harbour, West Bengal, India Using Normalized Difference Water Index

Authors: Lopita Pal, Suresh V. Madha

Abstract:

Wetlands are areas of marsh, fen, peatland, or water, whether natural or artificial, permanent or temporary, with water that is static or flowing, fresh, brackish, or salt, including areas of marine water whose depth at low tide does not exceed six metres. The rapidly expanding human population, large-scale changes in land use/land cover, burgeoning development projects, and improper use of watersheds have all caused a substantial decline of wetland resources in the world. Major degradation has resulted from agricultural, industrial, and urban development, leading to various types of pollution and hydrological perturbation. Regular fishing activities and unsustainable grazing of animals are degrading the wetlands at a slower pace. The paper focuses on the spatio-temporal change detection of the water-body area and the main cause of its depletion. The total area under study (22°19’87’’ N, 88°20’23’’ E) is a wetland region in West Bengal of 213 sq. km. The procedure uses the Normalized Difference Water Index (NDWI), computed from Landsat multi-spectral imagery, to detect the presence of surface water; the datasets for the years 2016, 2006, and 1996 have been compared. The result shows a sharp decline in the water-body area due to a rapid increase in agricultural practices and growing urbanization.
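The McFeeters NDWI used for this kind of analysis is (Green − NIR) / (Green + NIR), with positive values generally indicating surface water; a minimal sketch on toy reflectance bands (the study's actual Landsat scenes are not reproduced here):

```python
import numpy as np

def ndwi(green, nir):
    """McFeeters NDWI: (Green - NIR) / (Green + NIR).
    Positive values generally indicate surface water; the small
    epsilon guards against division by zero."""
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (green - nir) / (green + nir + 1e-12)

# Toy 2x2 reflectance bands: left column water-like (low NIR),
# right column vegetated (high NIR).
green = np.array([[0.30, 0.10], [0.28, 0.12]])
nir = np.array([[0.05, 0.40], [0.06, 0.35]])
mask = ndwi(green, nir) > 0  # water mask
print(mask)
```

Comparing the pixel counts of such masks across the 1996, 2006, and 2016 scenes yields the water-body area change the paper reports.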

Keywords: spatio-temporal change, NDWI, urbanization, wetland

Procedia PDF Downloads 276
12126 Silicon-Photonic-Sensor System for Botulinum Toxin Detection in Water

Authors: Binh T. T. Nguyen, Zhenyu Li, Eric Yap, Yi Zhang, Ai-Qun Liu

Abstract:

Silicon photonic sensors are an emerging class of analytical technologies that use the evanescent field to sensitively measure slight differences in the surrounding environment, with the wavelength shift induced by a local refractive-index change serving as the indicator. These devices can serve as sensors for a wide variety of chemical or biomolecular detection tasks in clinical and environmental fields. In our study, a system comprising a silicon-based micro-ring resonator, a microfluidic channel, and optical processing is designed and fabricated for biomolecule detection. The system is demonstrated to detect Clostridium botulinum type A neurotoxin (BoNT) in different water sources. BoNT is one of the most toxic substances known and is relatively easily obtained from a cultured bacterial source. The toxin is extremely lethal, with an LD50 of about 0.1 µg/70 kg intravenously, 1 µg/70 kg by inhalation, and 70 µg/kg orally. These factors make botulinum neurotoxins primary candidates as bioterrorism or biothreat agents, and they call for a sensing system that can detect BoNT quickly, with high sensitivity, and automatically. For BoNT detection, the silicon-based micro-ring resonator is modified with a linker for the immobilization of the anti-botulinum capture antibody, and an enzymatic reaction is employed to amplify the signal and hence gain sensitivity. As a result, a detection limit of 30 pg/mL is achieved by our silicon photonic sensor within a short period of 80 min. The sensor also shows high specificity versus other botulinum toxin types. In the future, by designing a multifunctional waveguide array with a fully automatic control system, it will be simple to simultaneously detect multiple biomolecules at low concentrations within a short period. The system has great potential for online, real-time, highly sensitive, label-free biomolecular rapid detection.
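The wavelength-shift indicator mentioned above follows, to first order, the standard ring-resonator relation Δλ ≈ λ·Δn_eff/n_g; the numbers below are illustrative assumptions, not the device's measured parameters:

```python
# First-order resonance-shift estimate for a ring resonator:
# delta_lambda ~= lambda_res * delta_n_eff / n_g.
# All values below are illustrative, not from the paper.
lambda_res = 1550e-9   # resonance wavelength (m), telecom C-band
n_g = 4.2              # group index of the silicon waveguide (assumed)
delta_n_eff = 1e-4     # effective-index change from antibody binding (assumed)

delta_lambda = lambda_res * delta_n_eff / n_g
print(f"{delta_lambda * 1e12:.1f} pm")  # ≈ 36.9 pm
```

Shifts of tens of picometres are resolvable with standard tunable-laser interrogation, which is why the enzymatic amplification step matters at pg/mL analyte levels.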

Keywords: biotoxin, photonic, ring resonator, sensor

Procedia PDF Downloads 111
12125 An Improved Convolution Deep Learning Model for Predicting Trip Mode Scheduling

Authors: Amin Nezarat, Naeime Seifadini

Abstract:

Trip mode selection is a behavioral characteristic of passengers with immense importance for travel demand analysis, transportation planning, and traffic management. Identification of the trip mode distribution allows transportation authorities to adopt appropriate strategies to reduce travel time, traffic, and air pollution. The majority of existing trip mode inference models operate on human-selected features and traditional machine learning algorithms. However, human-selected features are sensitive to changes in traffic and environmental conditions and susceptible to personal biases, which can make them inefficient. One way to overcome these problems is to use neural networks capable of extracting high-level features from raw input. In this study, a convolutional neural network (CNN) architecture is used to predict the trip mode distribution based on raw GPS trajectory data. The key innovation of this paper is the design of the CNN input layer layout, together with a normalization operation, in a way that is not only compatible with the CNN architecture but also represents the fundamental features of motion, including speed, acceleration, jerk, and bearing rate. The highest prediction accuracy achieved with the proposed configuration, for a convolutional neural network with batch normalization, is 85.26%.
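The motion features the input layer is designed to represent (speed, acceleration, jerk, bearing rate) can be derived from raw GPS fixes by successive differencing; a minimal sketch using a flat-earth approximation (the paper's exact preprocessing and normalization are not specified here):

```python
import numpy as np

def motion_features(lat, lon, t):
    """Speed, acceleration, jerk, and bearing rate from a GPS track,
    using a flat-earth approximation adequate for short urban trips."""
    R = 6_371_000.0  # mean Earth radius (m)
    lat, lon, t = map(np.asarray, (lat, lon, t))
    phi = np.radians(lat)
    dy = R * np.diff(phi)
    dx = R * np.cos(phi[:-1]) * np.diff(np.radians(lon))
    dt = np.diff(t).astype(float)
    speed = np.hypot(dx, dy) / dt
    accel = np.diff(speed) / dt[1:]
    jerk = np.diff(accel) / dt[2:]
    bearing = np.arctan2(dx, dy)          # heading of each segment
    bearing_rate = np.diff(bearing) / dt[1:]
    return speed, accel, jerk, bearing_rate

# Hypothetical 1 Hz samples moving due north at a steady ~5 m/s.
lat = [0.0, 0.000045, 0.000090, 0.000135]
lon = [0.0, 0.0, 0.0, 0.0]
speed, accel, jerk, br = motion_features(lat, lon, [0, 1, 2, 3])
print(speed.round(2))
```
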

Keywords: predicting, deep learning, neural network, urban trip

Procedia PDF Downloads 129
12124 Diversity Indices as a Tool for Evaluating Quality of Water Ways

Authors: Khadra Ahmed, Khaled Kheireldin

Abstract:

In this paper, we present a pedestrian detection descriptor called Fused Structure and Texture (FST) features, based on the combination of local phase information with texture features. Since the phase of a signal conveys more structural information than the magnitude, the phase congruency concept is used to capture structural features, while the Center-Symmetric Local Binary Pattern (CSLBP) approach is used to capture the texture information of the image. The dimensionless nature of phase congruency and the robustness of the CSLBP operator on flat images, as well as under blur and illumination changes, make the proposed descriptor more robust and less sensitive to light variations. The descriptor is formed by extracting the phase congruency and the CSLBP value of each pixel of the image with respect to its neighborhood. The histogram of the oriented phase and the histogram of the CSLBP values for local regions in the image are computed and concatenated to construct the FST descriptor. Several experiments were conducted on the INRIA and low-resolution DaimlerChrysler datasets to evaluate the detection performance of the pedestrian detection system based on the FST descriptor, with a linear Support Vector Machine (SVM) used to train the pedestrian classifier. These experiments showed that the proposed FST descriptor has better detection performance than a set of state-of-the-art feature extraction methodologies.
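The CSLBP operator compares the four center-symmetric pairs among the eight neighbors of a pixel, yielding a 4-bit code per pixel whose local histograms form the texture half of the descriptor; a minimal sketch for a single 3×3 neighborhood (the threshold value is illustrative):

```python
import numpy as np

def cslbp_code(patch, threshold=0.0):
    """Center-Symmetric LBP for the 3x3 neighborhood of the middle
    pixel: compare the four center-symmetric neighbor pairs and pack
    the comparisons into a 4-bit code (0-15)."""
    # The 8 neighbors in circular order starting at the top-left.
    n = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
         patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    code = 0
    for i in range(4):
        # Bit i is set when neighbor i exceeds its opposite neighbor.
        if n[i] - n[i + 4] > threshold:
            code |= 1 << i
    return code

# Only the top-left/bottom-right pair differs -> code 1.
patch = np.array([[9.0, 2.0, 1.0],
                  [3.0, 5.0, 3.0],
                  [1.0, 2.0, 1.0]])
print(cslbp_code(patch))  # 1
```
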

Keywords: planktons, diversity indices, water quality index, water ways

Procedia PDF Downloads 511