Search results for: feature detection
2406 Performance Analysis of Bluetooth Low Energy Mesh Routing Algorithm in Case of Disaster Prediction
Authors: Asmir Gogic, Aljo Mujcic, Sandra Ibric, Nermin Suljanovic
Abstract:
Ubiquity of natural disasters during the last few decades has raised serious questions about the prediction of such events and human safety. Every disaster, regardless of its magnitude, has a precursor manifested as a disruption of some environmental parameter such as temperature, humidity, pressure or vibration. In order to anticipate and monitor those changes, in this paper we propose an overall system for disaster prediction and monitoring based on a wireless sensor network (WSN). Furthermore, we introduce a modified and simplified WSN routing protocol built on top of the trickle routing algorithm. The routing algorithm was deployed using the Bluetooth Low Energy protocol in order to achieve low power consumption. Performance of the WSN was analyzed using a real-life system implementation. Estimates of WSN parameters such as battery lifetime, network size and packet delay are determined. Based on the performance of the WSN, the proposed system can be utilized for disaster monitoring and prediction due to its low power profile and mesh routing feature.
Keywords: bluetooth low energy, disaster prediction, mesh routing protocols, wireless sensor networks
Procedia PDF Downloads 385
2405 Acrylic Microspheres-Based Microbial Bio-Optode for Nitrite Ion Detection
Authors: Siti Nur Syazni Mohd Zuki, Tan Ling Ling, Nina Suhaity Azmi, Chong Kwok Feng, Lee Yook Heng
Abstract:
Nitrite (NO2-) ion is used prevalently as a preservative in processed meat. Elevated levels of nitrite are also found in edible bird’s nests (EBNs). Consumption of NO2- ion at levels above the health-based risk may cause cancer in humans. The spectrophotometric Griess test is the simplest established standard method for NO2- ion detection; however, it requires careful control of the pH of each reaction step and is susceptible to strong oxidants and dyeing interferences. Other traditional methods rely on laboratory-scale instruments such as GC-MS, HPLC and ion chromatography, which cannot give a real-time response. There is therefore a significant need for devices capable of measuring nitrite concentration in situ, rapidly and without reagents, sample pretreatment or extraction steps. Herein, we constructed a microspheres-based microbial optode for visual quantitation of NO2- ion. Raoultella planticola, a bacterium expressing NAD(P)H nitrite reductase (NiR) enzyme, was successfully isolated by microbial techniques from EBN collected from a local birdhouse. The whole cells and the lipophilic Nile Blue chromoionophore were physically adsorbed on the photocurable poly(n-butyl acrylate-N-acryloxysuccinimide) [poly(nBA-NAS)] microspheres, whilst the reduced coenzyme NAD(P)H was covalently immobilized on the succinimide-functionalized acrylic microspheres to produce a reagentless biosensing system. As the NiR enzyme catalyzes the oxidation of NAD(P)H to NAD(P)+, NO2- ion is reduced to ammonium hydroxide, and a colour change of the immobilized Nile Blue chromoionophore from blue to pink is perceived as a result of the deprotonation reaction increasing the local pH in the microspheres membrane. The microspheres-based optosensor was optimized with a reflectance spectrophotometer at 639 nm and pH 8. The resulting microbial bio-optode membrane could quantify NO2- ion at 0.1 ppm and had a linear response up to 400 ppm. Due to the large surface-area-to-mass ratio of the acrylic microspheres, efficient solid-state diffusional mass transfer of the substrate to the bio-recognition phase is achieved, with a steady-state response in as little as 5 min. The proposed optical microbial biosensor requires no sample pre-treatment step and possesses high stability, as the whole-cell biocatalyst protects the enzymes from interfering substances; hence it is suitable for measurements in contaminated samples.
Keywords: acrylic microspheres, microbial bio-optode, nitrite ion, reflectometric
Procedia PDF Downloads 448
2404 Study and Construction on Signalling System during Reverse Motion Due to Obstacle
Authors: S. M. Yasir Arafat
Abstract:
Driving models are needed by many researchers to improve traffic safety and to advance autonomous vehicle design. To be most useful, a driving model must state specifically what information is needed and how it is processed. We therefore developed an “Obstacle Avoidance and Detection Autonomous Car” based on sensor applications. The ever-increasing technological demands of today call for very complex systems, which in turn require highly sophisticated controllers to ensure that high performance can be achieved and maintained under adverse conditions. Based on a developed model of brake operation, the controller of the braking system has been designed. Its task is to control braking system operation more accurately than is currently the case.
Keywords: automobile, obstacle, safety, sensing
Procedia PDF Downloads 364
2403 VFX: Creativity or Cost Cutting? A Study of the Use of VFX in Hindi Cinema
Authors: Nidhi Patel, Amol Shinde, Amrin Moger
Abstract:
Mainstream Hindi cinema, also known as Bollywood, is the largest film-producing industry in India. The Indian film industry has undergone a sea change in the last few years. The industry has adapted to the latest technologies and creative manpower to improve visual and cinematic effects. These changes helped the industry to improve its creative output and ease the burden on production budgets. The research focuses on this very change, i.e. the use of VFX. There has been growing use of VFX in feature films. The primary focus is on how VFX can make a difference in the experience of watching a movie. The research examines the use of CGI/VFX in the narrative, which delivers a visually fulfilling film. It also focuses on the use of CGI/VFX as a cost-cutting tool. The research was exploratory in nature. It studied the industry’s evolution, the increment in VFX use by filmmakers and their intention to use it in their films. The researchers used a qualitative method for data collection: in-depth interviews with 10 artists from VFX studios in Mumbai were conducted. The findings reveal the way VFX is used in Hindi cinema by directors. The researchers learnt that VFX is majorly used as a tool to enhance creativity and provide the audience with a creative viewing experience.
Keywords: Bollywood, Hindi cinema, VFX, CGI, technology, creativity, cost cutting
Procedia PDF Downloads 359
2402 Characterization of the Dispersion Phenomenon in an Optical Biosensor
Authors: An-Shik Yang, Chin-Ting Kuo, Yung-Chun Yang, Wen-Hsin Hsieh, Chiang-Ho Cheng
Abstract:
Optical biosensors have become a powerful detection and analysis tool for wide-ranging applications in biomedical research, pharmaceuticals and environmental monitoring. This study carried out computational fluid dynamics (CFD)-based simulations to explore the dispersion phenomenon in the microchannel of an optical biosensor. The predicted time sequences of concentration contours were utilized to better understand the dispersion development occurring in different geometric shapes of microchannels. The simulation results showed the surface concentrations at the sensing probe (with the best performance of a grating coupler) with respect to time to appraise the dispersion effect and thereby identify the design configurations resulting in minimum dispersion.
Keywords: CFD simulations, dispersion, microfluidic, optical waveguide sensors
Procedia PDF Downloads 545
2401 Investigating (Im)Politeness Strategies in Email Communication: The Case of Algerian PhD Supervisees and Irish Supervisors
Authors: Zehor Ktitni
Abstract:
In pragmatics, politeness is regarded as a feature of paramount importance to successful interpersonal relationships. At the same time, emails have recently become one of the indispensable means of communication in educational settings. This research puts email communication at the core of the study and analyses it from a politeness perspective. More specifically, it endeavours to look closely at how the concept of (im)politeness is reflected in students’ emails. To this end, a corpus of Algerian supervisees’ email threads, exchanged with their Irish supervisors, was compiled. Leech’s model of politeness (2014) was selected as the main theoretical framework of this study, in addition to Brown and Levinson’s model (1987), one of the most influential models in the area of pragmatic politeness. Further, follow-up interviews are to be conducted with Algerian students to reinforce the results derived from the corpus. Initial findings suggest that Algerian PhD students’ emails tend to include more politeness markers than impoliteness ones; they make heavy use of academic titles when addressing their supervisors (Dr. or Prof.), and they rely on hedging devices in order to sound polite.
Keywords: politeness, email communication, corpus pragmatics, Algerian PhD supervisees, Irish supervisors
Procedia PDF Downloads 70
2400 Hate Speech Detection Using Deep Learning and Machine Learning Models
Authors: Nabil Shawkat, Jamil Saquer
Abstract:
Social media has accelerated our ability to engage with others and eliminated many communication barriers. On the other hand, the widespread use of social media has resulted in an increase in online hate speech, which has drastic impacts on vulnerable individuals and societies. It is therefore critical to detect hate speech to prevent innocent users and vulnerable communities from becoming victims of it. We investigate the performance of different deep learning and machine learning algorithms on three different datasets. Our results show that the BERT model gives the best performance among all the models, achieving an F1-score of 90.6% on one of the datasets and F1-scores of 89.7% and 88.2% on the other two.
Keywords: hate speech, machine learning, deep learning, abusive words, social media, text classification
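The F1-scores quoted above combine precision and recall into a single number; a minimal sketch of the metric (the confusion-matrix counts below are hypothetical, not the paper's data):

```python
def f1_score(tp, fp, fn):
    """F1 = harmonic mean of precision and recall, from raw counts."""
    precision = tp / (tp + fp)  # fraction of flagged posts that are truly hateful
    recall = tp / (tp + fn)     # fraction of hateful posts that were flagged
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for a hate-speech classifier on a test split
print(round(f1_score(tp=90, fp=10, fn=10), 3))  # precision = recall = 0.9, so 0.9
```

Because F1 is a harmonic mean, it penalizes classifiers that trade recall away for precision (or vice versa), which matters when hateful posts are a small minority class.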
Procedia PDF Downloads 136
2399 PCR Based DNA Analysis in Detecting P53 Mutation in Human Breast Cancer (MDA-468)
Authors: Debbarma Asis, Guha Chandan
Abstract:
Tumor Protein 53 (P53) is one of the tumor suppressor proteins. P53 regulates the cell cycle and thus conserves genome stability by preventing mutation. It is so named because it runs as a 53-kilodalton (kDa) protein on polyacrylamide gel electrophoresis, although its actual mass is 43.7 kDa. Experimental evidence has indicated that P53 cancer mutants lose tumor suppression activity and subsequently gain oncogenic activities that promote tumourigenesis. Tumor-specific DNA has recently been detected in the plasma of breast cancer patients. Detection of tumor-specific genetic material in cancer patients may provide a unique and valuable tumor marker for diagnosis and prognosis. The commercially available MDA-468 breast cancer cell line was used for the proposed study.
Keywords: tumor protein (P53), cancer mutants, MDA-468, tumor suppressor gene
Procedia PDF Downloads 480
2398 Bag of Words Representation Based on Fusing Two Color Local Descriptors and Building Multiple Dictionaries
Authors: Fatma Abdedayem
Abstract:
We propose an extension to the well-known Bag of Words (BOW) method, which has played a successful role in the field of image categorization. In essence, this method is based on representing an image with visual words. In this work, firstly, we extract features from images using Spatial Pyramid Representation (SPR) and two dissimilar color descriptors, opponent-SIFT and transformed-color-SIFT. Secondly, we fuse the color local features by joining the two histograms coming from these descriptors. Thirdly, after collecting all features, we generate multiple dictionaries from n random feature subsets, obtained by dividing all features into n random groups. Using these dictionaries separately, each image can be represented by n histograms, which are then concatenated horizontally to form the final histogram; this allows Multiple Dictionaries to be combined (MDBoW). In the final step, in order to classify images, we apply a Support Vector Machine (SVM) to the generated histograms. Experimentally, we have used two dissimilar image datasets to test our proposition: Caltech 256 and PASCAL VOC 2007.
Keywords: bag of words (BOW), color descriptors, multi-dictionaries, MDBoW
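The multi-dictionary histogram construction described above can be sketched in a few lines: quantize each feature against every dictionary, build one histogram per dictionary, then concatenate. The toy 2-D features and the two tiny "dictionaries" below are illustrative stand-ins for the SIFT-based codebooks used in the paper:

```python
def nearest(words, feat):
    """Index of the codeword closest (squared Euclidean) to a feature vector."""
    dists = [sum((a - b) ** 2 for a, b in zip(w, feat)) for w in words]
    return dists.index(min(dists))

def mdbow_histogram(features, dictionaries):
    """Concatenate one bag-of-words histogram per dictionary (MDBoW)."""
    hist = []
    for words in dictionaries:
        counts = [0] * len(words)
        for f in features:
            counts[nearest(words, f)] += 1  # hard-assign feature to codeword
        hist.extend(counts)
    return hist

# Toy 2-D features and two hypothetical dictionaries of 2 codewords each
feats = [(0.1, 0.2), (0.9, 0.8), (0.85, 0.9)]
dicts = [[(0.0, 0.0), (1.0, 1.0)], [(0.5, 0.0), (0.5, 1.0)]]
print(mdbow_histogram(feats, dicts))  # [1, 2, 1, 2]
```

The concatenated histogram is what would be fed to the SVM; in practice each dictionary comes from clustering (e.g. k-means) one of the n random feature subsets.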
Procedia PDF Downloads 297
2397 Rules in Policy Integration, Case Study: Victoria Catchment Management
Authors: Ratri Werdiningtyas, Yongping Wei, Andrew Western
Abstract:
This paper contributes to ongoing attempts at bringing together land, water and environmental policy in catchment management. A tension remains in defining the boundaries of policy integration. Much of Integrated Water Resource Management is valued as rhetorical policy: it is far from being achieved on the ground because the socio-ecological system has not been understood and developed into a complete and coherent problem representation. To clarify the features of integration, this article draws on institutional fit for public policy integration and uses these insights in an empirical setting to identify the mechanisms that can facilitate effective policy integration for catchment management. This research is based on the journey of Victoria’s government from 1890 to 2016. A total of 274 Victorian Acts related to land, water and environmental management published in this period have been investigated. Four conditions of integration have been identified in their co-evolution: (1) integration policy based on reserves, (2) integration policy based on authority interest, (3) policy based on integrated information and (4) policy based on coordinated resources, authority and information. Results suggest that policy coordination among policy instruments is superior to policy integration in the case of catchment management.
Keywords: catchment management, co-evolution, policy integration, phase
Procedia PDF Downloads 247
2396 Delivering Safer Clinical Trials; Using Electronic Healthcare Records (EHR) to Monitor, Detect and Report Adverse Events in Clinical Trials
Authors: Claire Williams
Abstract:
Randomised controlled trials (RCTs) of efficacy are still perceived as the gold standard for the generation of evidence, and whilst advances in data collection methods are well developed, this progress has not been matched for the reporting of adverse events (AEs). Assessment and reporting of AEs in clinical trials are fraught with human error and inefficiency and are extremely time- and resource-intensive. Recent research into the quality of AE reporting during clinical trials concluded that it is substandard and inconsistent. Investigators commonly send sponsors reports that are incorrectly categorised and lacking in critical information, which can complicate the detection of valid safety signals. In our presentation, we will describe an electronic data capture system designed to support clinical trial processes by reducing the resource burden on investigators, improving overall trial efficiencies, and making trials safer for patients. This proprietary technology was developed using expertise proven in the delivery of the world’s first prospective, phase 3b real-world trial, ‘The Salford Lung Study’, which enabled robust safety monitoring and reporting processes to be accomplished through remote monitoring of patients’ EHRs. This technology enables safety alerts that are pre-defined by the protocol to be detected from data extracted directly from the patient’s EHR. Based on study-specific criteria, created from the standard definition of a serious adverse event (SAE) and the safety profile of the medicinal product, the system alerts the investigator or study team to the safety alert. Each safety alert requires a clinical review by the investigator or delegate; examples of the types of alerts include hospital admission, death, hepatotoxicity, neutropenia, and acute renal failure. This is achieved in near real-time; safety alerts can be reviewed along with any additional information available to determine whether they meet the protocol-defined criteria for reporting or withdrawal. This active surveillance technology helps reduce the resource burden of the more traditional methods of AE detection for investigators and study teams and can help eliminate reporting bias. Integration of multiple healthcare data sources enables much more complete and accurate safety data to be collected as part of a trial and can also provide an opportunity to evaluate a drug’s safety profile long-term, in post-trial follow-up. By utilising this robust and proven method for safety monitoring and reporting, much higher-risk patient cohorts can be enrolled into trials, thus promoting inclusivity and diversity. Broadening eligibility criteria and adopting more inclusive recruitment practices in the later stages of drug development will increase the ability to understand the medicinal product’s risk-benefit profile across the patient population that is likely to use the product in clinical practice. Furthermore, this ground-breaking approach to AE detection not only provides sponsors with better-quality safety data for their products but also reduces the resource burden on the investigator and study teams. With the data taken directly from the source, trial costs are reduced, minimal data validation is required, and near real-time reporting enables safety concerns and signals to be detected more quickly than in a traditional RCT.
Keywords: more comprehensive and accurate safety data, near real-time safety alerts, reduced resource burden, safer trials
Procedia PDF Downloads 85
2395 Predictive Maintenance Based on Oil Analysis Applicable to Transportation Fleets
Authors: Israel Ibarra Solis, Juan Carlos Rodriguez Sierra, Ma. del Carmen Salazar Hernandez, Isis Rodriguez Sanchez, David Perez Guerrero
Abstract:
In the present paper we explain the analysis techniques used for the lubricating oil over a maintenance period of a city bus (Mercedes-Benz Boxer 40) serving the ‘R-24 route’ of line Coecillo Centro SA de CV in León, Guanajuato, to estimate the optimal time for the oil change. Using devices such as a rotational viscometer and an atomic absorption spectrometer, the incipient loss of the oil’s lubricating properties can be detected before the oil can no longer protect the mechanical components of diesel engines such as those of these buses. Timely detection of lost properties in the oil allows us to establish a preventive maintenance plan for the fleet.
Keywords: atomic absorption spectrometry, maintenance, predictive velocity rate, lubricating oils
Procedia PDF Downloads 569
2394 Adaptive Kalman Filter for Fault Diagnosis of Linear Parameter-Varying Systems
Authors: Rajamani Doraiswami, Lahouari Cheded
Abstract:
Fault diagnosis of Linear Parameter-Varying (LPV) systems using an adaptive Kalman filter is proposed. The LPV model comprises scheduling parameters and emulator parameters. The scheduling parameters are chosen such that they are capable of tracking variations in the system model resulting from changes in the operating regimes. The emulator parameters, on the other hand, simulate variations in the subsystems during the identification phase and have negligible effect during the operational phase. The nominal model and the influence vectors, which are the gradients of the feature vector with respect to the emulator parameters, are identified off-line from a number of emulator-parameter-perturbed experiments. A Kalman filter is designed using the identified nominal model. As the system varies, the Kalman filter model is adapted using the scheduling variables. The residual is employed for fault diagnosis. The proposed scheme is successfully evaluated on a simulated system as well as on a physical process control system.
Keywords: identification, linear parameter-varying systems, least-squares estimation, fault diagnosis, Kalman filter, emulators
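The residual-based diagnosis idea above can be illustrated with a scalar Kalman filter: each predict/update cycle produces an innovation (residual) that a fault-diagnosis scheme would monitor for anomalies. This is a minimal sketch with illustrative noise values, not the paper's LPV model:

```python
def kalman_step(x, P, z, A=1.0, H=1.0, Q=1e-4, R=0.01):
    """One predict/update cycle of a scalar Kalman filter.

    Returns the updated state estimate, its covariance, and the residual
    (innovation) that fault-diagnosis schemes monitor for anomalies.
    """
    x_pred = A * x                          # predict state
    P_pred = A * P * A + Q                  # predict covariance
    residual = z - H * x_pred               # innovation
    K = P_pred * H / (H * P_pred * H + R)   # Kalman gain
    x_new = x_pred + K * residual
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new, residual

x, P = 0.0, 1.0
for z in [0.9, 1.1, 1.0, 0.95]:  # noisy measurements of a constant near 1
    x, P, r = kalman_step(x, P, z)
print(round(x, 2))               # estimate has converged close to 1
```

In an adaptive LPV setting the model matrices (here the scalars A and H) would be recomputed from the scheduling variables at every step; a persistent growth in the residual then flags a fault.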
Procedia PDF Downloads 499
2393 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets
Authors: Ece Cigdem Mutlu, Burak Alakent
Abstract:
Maintaining the quality of manufactured products at a desired level depends on the stability of process dispersion and location parameters and on detecting perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring to monitor the quality of products and control process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of process dispersion and location, respectively, under the assumption of independent and normally distributed datasets. On the other hand, there is no guarantee that real-world data will be normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g. occasional outliers in the rational subgroups in the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in the detection of inferior products in Phase II. For more efficient application of control charts, robust estimators are required against contaminations that may exist in Phase I. In the current study, we present a simple approach to construct robust Xbar control charts using the average distance to the median, the Qn-estimator of scale and the M-estimator of scale with logistic psi-function in the estimation of the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator and the M-estimator of location with Huber and logistic psi-functions in the estimation of the process location parameter. Phase I efficiency of the proposed estimators and Phase II performance of Xbar charts constructed from these estimators are compared with the conventional mean and standard deviation statistics both under normality and against diffuse-localized and symmetric-asymmetric contaminations using 50,000 Monte Carlo simulations in MATLAB. Consequently, it is found that robust estimators yield parameter estimates with higher efficiency against all types of contaminations, and Xbar charts constructed using robust estimators have higher power in detecting disturbances, compared to conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups and employing different combinations of dispersion and location estimators on subgroups and individual observations are found to improve the performance of Xbar charts.
Keywords: average run length, M-estimators, quality control, robust estimators
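The contrast between conventional and robust location estimators is easy to demonstrate. A sketch of the Hodges-Lehmann estimator (the median of all pairwise Walsh averages) on a subgroup containing one gross outlier, with illustrative numbers rather than the study's simulated data:

```python
from itertools import combinations
from statistics import mean, median

def hodges_lehmann(xs):
    """Median of all pairwise (Walsh) averages: a robust location estimator."""
    walsh = [(a + b) / 2 for a, b in combinations(xs, 2)] + list(xs)
    return median(walsh)

# A tight rational subgroup contaminated by one gross outlier
data = [10.0, 10.1, 9.9, 10.2, 9.8, 50.0]
print(mean(data))            # pulled to ~16.7 by the single outlier
print(median(data))          # 10.05: barely affected
print(hodges_lehmann(data))  # 10.05: robust, and more efficient than the median under normality
```

This is the trade-off the abstract evaluates systematically: robust estimators sacrifice a little efficiency under clean normal data in exchange for stability when Phase I is contaminated.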
Procedia PDF Downloads 190
2392 Operating System Based Virtualization Models in Cloud Computing
Authors: Dev Ras Pandey, Bharat Mishra, S. K. Tripathi
Abstract:
Cloud computing is poised to transform the structure of businesses and learning by supplying real-time applications and providing immediate help for small to medium-sized businesses. The ability to run a hypervisor inside a virtual machine is an important feature of virtualization, called nested virtualization. In today’s growing field of information technology, many virtualization models are available that provide a convenient approach to implementation, but selecting a single model is difficult. This paper examines the applications of operating-system-based virtualization in cloud computing and identifies an appropriate model for different specifications and users’ requirements. In the present paper, the most popular models are selected, based on container- and hypervisor-based virtualization. The selected models were compared against a wide range of user requirements, such as number of CPUs, memory size, nested virtualization support, live migration and commercial support, and we identified the most suitable model of virtualization.
Keywords: virtualization, OS based virtualization, container based virtualization, hypervisor based virtualization
Procedia PDF Downloads 329
2391 Software Cloning and Agile Environment
Authors: Ravi Kumar, Dhrubajit Barman, Nomi Baruah
Abstract:
Software Cloning has grown into an active area in the software engineering research community, yielding numerous techniques, various tools and other methods for clone detection and removal. Copying and modifying a block of code is identified as cloning, as it is the most basic means of software reuse. Agile Software Development is an approach currently being used in various software projects, as it helps respond to the unpredictability of building software through incremental, iterative work cadences. Software Cloning has been introduced to the Agile Environment, and many Agile Software Development approaches use the concept of Software Cloning. This paper discusses the various Agile Software Development approaches. It also discusses the degree to which the Software Cloning concept is being introduced in these approaches.
Keywords: agile environment, refactoring, reuse, software cloning
Procedia PDF Downloads 531
2390 Automatic Segmentation of Lung Pleura Based on Curvature Analysis
Authors: Sasidhar B., Bhaskar Rao N., Ramesh Babu D. R., Ravi Shankar M.
Abstract:
Segmentation of the lung pleura is a preprocessing step in Computer-Aided Diagnosis (CAD) which helps reduce false positives in the detection of lung cancer. Existing methods fail to extract lung regions when nodules lie at the pleura of the lungs. In this paper, a new method is proposed which segments lung regions with nodules at the pleura based on curvature analysis and morphological operators. The proposed algorithm was tested on a dataset of 6 patients, consisting of 60 images from the Lung Image Database Consortium (LIDC), and the results are found to be satisfactory, with a 98.3% average overlap measure (AΩ).
Keywords: curvature analysis, image segmentation, morphological operators, thresholding
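Overlap measures such as the AΩ reported above are commonly computed as the ratio of the intersection to the union of the segmented and ground-truth regions; a sketch on flattened binary masks (toy data, not the LIDC images):

```python
def overlap_measure(a, b):
    """Jaccard-style overlap |A ∩ B| / |A ∪ B| for flattened binary masks."""
    inter = sum(x and y for x, y in zip(a, b))  # pixels in both masks
    union = sum(x or y for x, y in zip(a, b))   # pixels in either mask
    return inter / union

# Toy flattened 1-bit masks: ground truth vs. a segmentation result
truth  = [1, 1, 1, 1, 0, 0, 0, 0]
result = [1, 1, 1, 0, 1, 0, 0, 0]
print(overlap_measure(truth, result))  # 3 / 5 = 0.6
```

An overlap of 1.0 means the segmentation exactly matches the ground truth; the 98.3% average reported in the abstract indicates near-complete agreement across the 60 test images.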
Procedia PDF Downloads 596
2389 Development of a Pain Detector Using Microwave Radiometry Method
Authors: Nanditha Rajamani, Anirudhaa R. Rao, Divya Sriram
Abstract:
One of the greatest difficulties in treating patients with pain is the highly subjective nature of pain sensation. The measurement of pain intensity is primarily dependent on the patient’s report, often with little physical evidence to provide objective corroboration. This is further complicated by the fact that the few existing technologies, such as functional Magnetic Resonance Imaging (fMRI), are expensive. The need is thus clear and urgent for a reliable, non-invasive, non-painful, objective, readily adoptable and cost-efficient diagnostic platform that provides additional information to assist doctors in diagnosing these patients. Our idea of developing a pain detector was thus conceived to take the detection and diagnosis of chronic and acute pain a step further.
Keywords: pain sensor, microwave radiometry, pain sensation, fMRI
Procedia PDF Downloads 456
2388 Collaborative and Context-Aware Learning Approach Using Mobile Technology
Authors: Sameh Baccari, Mahmoud Neji
Abstract:
In recent years, rapid developments in mobile devices and wireless technologies have enabled new capabilities for the learning domain. This dimension facilitates people’s daily activities and shortens the distance between individuals. When these technologies are used in learning, a new paradigm emerges, giving birth to mobile learning. Because of the mobility feature, m-learning courses have to be adapted dynamically to the learner’s context. The main challenge in context-aware mobile learning is to develop an approach that builds the best learning resources according to dynamic learning situations. In this paper, we propose a context-aware mobile learning system called the Collaborative and Context-aware Mobile Learning System (CCMLS). It takes into account the requirements of mobility, collaboration and context-awareness. This system is based on semantic modeling of the learning context and the learning content. The adaptation part of this approach is made up of adaptation rules to propose and select relevant resources, learning partners and learning activities based not only on the user’s needs, but also on their current context.
Keywords: mobile learning, mobile technologies, context-awareness, collaboration, semantic web, adaptation engine, adaptation strategy, learning object, learning context
Procedia PDF Downloads 308
2387 Exploring the Feasibility of Introducing Particular Polyphenols into Cow Milk Naturally through Animal Feeding
Authors: Steve H. Y. Lee, Jeremy P. E. Spencer
Abstract:
The aim of the present study was to explore the feasibility of enriching cow milk with polyphenols via the addition of flavanone-rich citrus pulp to existing animal feed. Eight lactating Holstein cows were enrolled in the 4-week feeding study: 4 cows were fed the standard farm diet (control group), while the other 4 (treatment group) were fed the standard farm diet mixed with citrus pulp. Milk was collected twice a day, 3 times a week. The resulting milk yield and its macronutrient composition, as well as lactose content, were measured. The milk phenolic compounds were analysed using electrochemical detection (ECD).
Keywords: milk, polyphenol, animal feeding, lactating cows
Procedia PDF Downloads 299
2386 Semantic Data Schema Recognition
Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia
Abstract:
The subject covered in this paper aims at assisting users in their data-quality approach. The goal is to better extract, mix, interpret and reuse data. It deals with the semantic schema recognition of a data source. This enables the extraction of data semantics from all the available information, including the data and the metadata. Firstly, it consists of categorizing the data by assigning it to a category and possibly a sub-category, and secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and of the alternatives for correcting data. This approach allows automatic detection of a large number of syntactic and semantic anomalies.
Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies inter columns
Procedia PDF Downloads 418
2385 1-Butyl-2,3-Dimethylimidazolium Bis (Trifluoromethanesulfonyl) Imide and Titanium Oxide Based Voltammetric Sensor for the Quantification of Flunarizine Dihydrochloride in Solubilized Media
Authors: Rajeev Jain, Nimisha Jadon, Kshiti Singh
Abstract:
A glassy carbon electrode modified with titanium oxide nanoparticles and 1-butyl-2,3-dimethylimidazolium bis(trifluoromethanesulfonyl)imide (TiO2/IL/GCE) has been fabricated for electrochemical sensing of flunarizine dihydrochloride (FRH). The electrochemical properties and morphology of the prepared nanocomposite were studied by electrochemical impedance spectroscopy (EIS) and transmission electron microscopy (TEM). The response of the electrochemical sensor was found to be proportional to the concentration of FRH in the range from 0.5 µg mL⁻¹ to 16 µg mL⁻¹. The detection limit obtained was 0.03 µg mL⁻¹. The proposed method was also applied to the determination of FRH in a pharmaceutical formulation and in human serum, with good recoveries.
Keywords: flunarizine dihydrochloride, ionic liquid, nanoparticles, voltammetry, human serum
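Linear ranges and detection limits of the kind reported above are typically derived from a least-squares calibration line, with the detection limit estimated as 3σ of the blank divided by the slope. A sketch with entirely hypothetical calibration points and blank deviation, chosen only to fall within the stated 0.5-16 µg mL⁻¹ range:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for a calibration line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical calibration points: concentration (µg/mL) vs. peak current (µA)
conc    = [0.5, 2.0, 4.0, 8.0, 16.0]
current = [0.26, 1.01, 2.02, 3.98, 8.01]
slope, intercept = linear_fit(conc, current)

sigma_blank = 0.005            # assumed std. dev. of blank responses (µA)
lod = 3 * sigma_blank / slope  # common 3σ/slope detection-limit estimate
print(round(lod, 3))           # ≈ 0.03 µg/mL with these illustrative numbers
```

Unknown samples are then quantified by inverting the calibration line: concentration = (signal − intercept) / slope.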
Procedia PDF Downloads 329
2384 Spatial and Temporal Analysis of Forest Cover Change with Special Reference to Anthropogenic Activities in Kullu Valley, North-Western Indian Himalayan Region
Authors: Krisala Joshi, Sayanta Ghosh, Renu Lata, Jagdish C. Kuniyal
Abstract:
Throughout the world, monitoring and estimating the changing pattern of forests across diverse landscapes through remote sensing is instrumental in understanding the interactions of human activities and the ecological environment with the changing climate. Forest change detection using satellite imagery has emerged as an important means of gathering information on a regional scale. Kullu valley in Himachal Pradesh, India is situated in a transitional zone between the lesser and the greater Himalayas. It thus presents a typical rugged mountainous terrain with moderate to high altitude, varying from 1200 meters to over 6000 meters. Due to changes in agricultural cropping patterns, urbanization, industrialization, hydropower generation, climate change, tourism, and anthropogenic forest fire, it has undergone a tremendous transformation in forest cover in the past three decades. The loss and degradation of forest cover result in soil erosion, loss of biodiversity including damage to wildlife habitats, degradation of watershed areas, and deterioration of the overall quality of nature and life. The supervised classification of LANDSAT satellite data was performed to assess the changes in forest cover in Kullu valley over the years 2000 to 2020. The Normalized Burn Ratio (NBR) was calculated to discriminate between burned and unburned areas of the forest. Our study reveals that in Kullu valley, forest fire incidents, specifically those due to anthropogenic activities, have been on the rise each subsequent year. The main objective of the present study is, therefore, to estimate the change in the forest cover of Kullu valley and to address the various social aspects responsible for the anthropogenic forest fires, and also to assess their impact on the significant changes in the regional climatic factors, specifically temperature, humidity, and precipitation, over three decades, with the help of satellite imagery and ground data.
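The Normalized Burn Ratio is computed per pixel from the near-infrared (NIR) and shortwave-infrared (SWIR) bands as NBR = (NIR − SWIR) / (NIR + SWIR), and pre/post-fire differencing (dNBR) highlights burned areas. A minimal sketch with made-up reflectances, not the study's Landsat data:

```python
def nbr(nir, swir):
    """Normalized Burn Ratio for one pixel; burned areas give low or negative NBR."""
    denom = nir + swir
    return (nir - swir) / denom if denom else 0.0

# Made-up reflectances: healthy vegetation reflects strongly in NIR,
# while burned areas reflect relatively more in SWIR.
healthy = nbr(0.45, 0.15)   # high positive NBR
burned  = nbr(0.15, 0.35)   # negative NBR

# Change detection: dNBR = pre-fire NBR minus post-fire NBR.
dnbr = healthy - burned
print(round(healthy, 2), round(burned, 2), round(dnbr, 2))
```

Applied band-wise to co-registered Landsat scenes, the same formula yields the burned/unburned discrimination described above.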
The main outcome of the paper, we believe, will be helpful for the administration in making a quantitative assessment of forest cover changes due to anthropogenic activities and in devising long-term measures for creating awareness among the local people of the area.
Keywords: Anthropogenic Activities, Forest Change Detection, Normalized Burn Ratio (NBR), Supervised Classification
Procedia PDF Downloads 173
2383 Facial Expression Recognition Using Sparse Gaussian Conditional Random Field
Authors: Mohammadamin Abbasnejad
Abstract:
The analysis of expressions and the detection of facial Action Units (AUs) are very important tasks in the fields of computer vision and Human Computer Interaction (HCI) due to the wide range of applications in human life. Many works have been done during the past few years, each with its own advantages and disadvantages. In this work, we present a new model based on a Gaussian Conditional Random Field. We solve our objective problem using ADMM and show how well the proposed model works. We train and test our work on two facial expression datasets, CK+ and RU-FACS. Experimental evaluation shows that our proposed approach outperforms state-of-the-art expression recognition.
Keywords: Gaussian Conditional Random Field, ADMM, convergence, gradient descent
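The ADMM step can be illustrated on a simpler objective. Below is a minimal ADMM sketch for the lasso problem (not the paper's sparse Gaussian CRF objective, whose updates differ), showing the characteristic x-update / z-update / dual-update structure; all data are synthetic.

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """ADMM for min 0.5*||Ax - b||^2 + lam*||x||_1 (illustrative sketch)."""
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                      # scaled dual variable
    AtA_rhoI = A.T @ A + rho * np.eye(n)  # factor reused every x-update
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(AtA_rhoI, Atb + rho * (z - u))  # x-update
        z = soft_threshold(x + u, lam / rho)                # z-update
        u = u + x - z                                       # dual update
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.5, 1.0]            # sparse ground truth
b = A @ x_true
x_hat = admm_lasso(A, b, lam=0.1)
```

The sparse-GCRF case replaces the quadratic fit term with the CRF negative log-likelihood, but the splitting pattern is the same.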
Procedia PDF Downloads 356
2382 Size Reduction of Images Using Constraint Optimization Approach for Machine Communications
Authors: Chee Sun Won
Abstract:
This paper presents the size reduction of images for machine-to-machine communications. Here, the salient image regions to be preserved include the image patches of key-points such as corners and blobs. Based on a saliency map built from the key-points and their image patches, an axis-aligned grid-size optimization is proposed for the reduction of image size. To increase the size-reduction efficiency, the aspect-ratio constraint is relaxed in the constraint optimization framework. The proposed method yields higher matching accuracy after the size reduction than conventional content-aware image size-reduction methods.
Keywords: image compression, image matching, key-point detection and description, machine-to-machine communication
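As a hypothetical illustration of the idea (not the paper's actual constrained optimization), one can mark key-point patches in a saliency map and discard axis-aligned columns that carry no salient content:

```python
import numpy as np

# Toy sketch: keep grid columns that contain key-point patches, drop the rest.
# The key-point coordinates and grid size below are invented for illustration.
H, W, patch = 8, 12, 1
keypoints = [(2, 3), (5, 3), (4, 9)]          # (row, col) of detected key-points

saliency = np.zeros((H, W))
for r, c in keypoints:
    saliency[max(r - patch, 0):r + patch + 1,
             max(c - patch, 0):c + patch + 1] = 1.0  # mark the patch as salient

col_score = saliency.sum(axis=0)              # importance of each column
keep = np.where(col_score > 0)[0]             # drop columns with no salient pixels
reduced = saliency[:, keep]
print(saliency.shape, "->", reduced.shape)
```

The paper's method scores and shrinks grid strips rather than deleting them outright, but the column-wise saliency accounting is the common core.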
Procedia PDF Downloads 418
2381 KSVD-SVM Approach for Spontaneous Facial Expression Recognition
Authors: Dawood Al Chanti, Alice Caplier
Abstract:
Sparse representations of signals have received a great deal of attention in recent years. In this paper, the interest of using sparse representation as a means of performing sparse discriminative analysis between spontaneous facial expressions is demonstrated. An automatic facial expression recognition system is presented. It uses a KSVD-SVM approach made of three main stages: a pre-processing and feature extraction stage, which solves the problem of shared subspace distribution based on random projection theory to obtain low-dimensional discriminative and reconstructive features; a dictionary learning and sparse coding stage, which uses the KSVD model to learn discriminative under- or over-complete dictionaries for sparse coding; and finally a classification stage, which uses an SVM classifier for facial expression recognition. Our main concern is to be able to recognize non-basic affective states and non-acted expressions. Extensive experiments on the JAFFE database of static acted facial expressions, but also on the DynEmo database of dynamic spontaneous facial expressions, exhibit very good recognition rates.
Keywords: dictionary learning, random projection, pose and spontaneous facial expression, sparse representation
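The random-projection step in the first stage can be sketched as follows; the dimensions and data below are invented, and this shows only a generic Johnson-Lindenstrauss-style projection, not the paper's exact construction.

```python
import numpy as np

# Project high-dimensional features to a low dimension while approximately
# preserving pairwise distances (Johnson-Lindenstrauss lemma).
rng = np.random.default_rng(42)
d_high, d_low, n = 1000, 100, 20

X = rng.standard_normal((n, d_high))                        # hypothetical features
R = rng.standard_normal((d_high, d_low)) / np.sqrt(d_low)   # random projection matrix
Y = X @ R                                                   # low-dimensional features

# Compare one pairwise distance before and after projection.
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(round(proj / orig, 2))  # ratio close to 1
```

Because the projection is data-independent, it is cheap and avoids fitting a subspace shared across expression classes.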
Procedia PDF Downloads 305
2380 Risk Assessment of Lead Element in Red Peppers Collected from Marketplaces in Antalya, Southern Turkey
Authors: Serpil Kilic, Ihsan Burak Cam, Murat Kilic, Timur Tongur
Abstract:
Interest in lead (Pb) has considerably increased recently due to knowledge about the potential toxic effects of this element. Exposure to heavy metals above the acceptable limit affects human health. Indeed, Pb accumulates through food chains up to toxic concentrations and can therefore pose a potential threat to human health. A sensitive and reliable method for the determination of Pb in red pepper was improved in the present study. Samples (33 red pepper products of different brands) were purchased from different markets in Turkey. The selected method validation criteria (linearity, limit of detection, limit of quantification, recovery, and trueness) were demonstrated. Recovery values close to 100% showed adequate precision and accuracy for the analysis. According to the results of the red pepper analysis, Pb was determined in all of the tested samples at various concentrations. A Perkin-Elmer ELAN DRC-e model ICP-MS system was used for the detection of Pb. Organic red pepper was used to obtain a matrix for all method validation studies. The certified reference material, Fapas chili powder, was digested and analyzed together with the different sample batches. Three replicates from each sample were digested and analyzed. The results of the exposure levels were discussed considering the scientific opinions of the European Food Safety Authority (EFSA), the European Union's (EU) risk assessment source associated with food safety. The Target Hazard Quotient (THQ) was described by the United States Environmental Protection Agency (USEPA) for the calculation of potential health risks associated with long-term exposure to chemical pollutants. The THQ value incorporates the intake of the element, exposure frequency and duration, body weight, and the oral reference dose (RfD).
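The THQ described above follows the standard USEPA form THQ = (EF × ED × FIR × C) / (RfD × BW × AT). A sketch with illustrative placeholder values (not the study's measured data; the RfD shown is a value commonly used for Pb in THQ studies, not an official EPA RfD):

```python
def target_hazard_quotient(C, FIR, EF, ED, RfD, BW, AT):
    """USEPA Target Hazard Quotient.

    C   : metal concentration in the food (mg/kg)
    FIR : food ingestion rate (kg/day)
    EF  : exposure frequency (days/year)
    ED  : exposure duration (years)
    RfD : oral reference dose (mg/kg body weight/day)
    BW  : body weight (kg)
    AT  : averaging time for non-carcinogens (days)
    """
    return (EF * ED * FIR * C) / (RfD * BW * AT)

# Illustrative placeholder values, NOT the paper's data.
thq = target_hazard_quotient(
    C=0.05,         # mg Pb per kg red pepper
    FIR=0.005,      # 5 g of red pepper per day
    EF=365, ED=30,  # daily exposure over 30 years
    RfD=0.0035,     # mg/kg/day, commonly used for Pb in THQ literature
    BW=70,          # kg
    AT=365 * 30,    # days
)
print(thq < 1)  # at these values the population would be considered safe
```

With daily exposure (EF = 365) and AT = 365 × ED, the frequency and duration terms cancel, so the quotient reduces to the daily intake per kg body weight divided by the RfD.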
If the THQ value is lower than one, the exposed population is assumed to be safe, while 1 < THQ < 5 means that the exposed population is in a level-of-concern interval. In this study, the THQ of Pb was obtained as < 1. The results of the THQ calculations showed that the values were below one for all the samples tested, meaning the samples did not pose a health risk to the local population. This work was supported by The Scientific Research Projects Coordination Unit of Akdeniz University, Project Number: FBA-2017-2494.
Keywords: lead analyses, red pepper, risk assessment, daily exposure
Procedia PDF Downloads 167
2379 Scalable UI Test Automation for Large-scale Web Applications
Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani
Abstract:
This research mainly concerns optimizing UI test automation for large-scale web applications. The test target is the HHAexchange homecare management WEB application, which seamlessly connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionalities. This study focuses on user interface automation testing for the WEB application. The quality assurance team must execute many manual user interface test cases in the development process to confirm that there are no regression bugs. The team automated 346 test cases; the UI automation test execution time was over 17 hours. The business requirement was to reduce the execution time in order to release high-quality products quickly, and the quality assurance automation team modernized the test automation framework to optimize the execution time. The base of the WEB UI automation test environment is Selenium, and the test code is written in Python. Adopting a compiled language to write test code leads to an inefficient flow when introducing scalability into a traditional test automation environment, so a scripting language was adopted to introduce scalability into test automation efficiently. The scalability implementation is mainly built on AWS serverless technology, the Elastic Container Service. The definition of scalability here is the ability to automatically set up computers for test automation and to increase or decrease the number of computers running those tests. This means the scalable mechanism can run test cases in parallel, dramatically decreasing test execution time. Introducing scalable test automation is about more than just reducing test execution time: there is a possibility that some challenging bugs, such as race conditions, are detected, since test cases can be executed at the same time.
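The core of the scalable mechanism, running independent test cases in parallel workers, can be sketched with Python's standard library. This is a toy stand-in: the sleeping "test cases" below are invented, and in the real environment each worker would drive a Selenium browser session inside an ECS container.

```python
import concurrent.futures
import time

def run_test_case(name):
    """Stand-in for one UI test case; a real one would drive a browser."""
    time.sleep(0.1)                       # simulates Selenium interactions
    return (name, "passed")

cases = [f"test_case_{i}" for i in range(8)]

start = time.perf_counter()
with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
    results = dict(pool.map(run_test_case, cases))
parallel_time = time.perf_counter() - start

# Sequentially these eight cases would take ~0.8 s; in parallel, ~0.1 s.
print(f"{len(results)} cases in {parallel_time:.2f}s")
```

Scaling out to containers follows the same pattern, except the pool of workers is a fleet of machines that grows and shrinks with the test queue.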
If API and unit tests are implemented, test strategies can be adopted more efficiently for this scalability testing. However, in WEB applications, as a practical matter, API and unit testing cannot cover 100% of functional testing since they do not reach front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system and confirmed the optimization of test case execution time and the detection of a challenging bug. The study first describes the detailed architecture of the scalable test automation environment, then reports the actual reduction in execution time and an example of challenging issue detection.
Keywords: aws, elastic container service, scalability, serverless, ui automation test
Procedia PDF Downloads 107
2378 Bayesian Analysis of Change Point Problems Using Conditionally Specified Priors
Authors: Golnaz Shahtahmassebi, Jose Maria Sarabia
Abstract:
In this talk, we introduce a new class of conjugate prior distributions obtained from the conditional specification methodology. We illustrate the application of such distributions in Bayesian change point detection in Poisson processes. We obtain the posterior distribution of the model parameters using a general bivariate distribution with gamma conditionals. Simulation from the posterior is readily implemented using a Gibbs sampling algorithm; the Gibbs sampler can be implemented even when using conditional densities that are incompatible or compatible only with an improper joint density. The application of such methods is demonstrated using examples of simulated and real data.
Keywords: change point, Bayesian inference, Gibbs sampler, conditional specification, gamma conditional distributions
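As a concrete illustration of such a sampler, here is a minimal Gibbs sketch for a single change point in Poisson counts with independent gamma priors, a standard construction rather than the paper's more general gamma-conditionals model; the data are simulated.

```python
import math
import random

# Model: y_t ~ Poisson(lam1) for t <= tau, Poisson(lam2) for t > tau,
# with Gamma(a, b) priors on both rates (shape a, rate b).
random.seed(1)

# Simulated counts with a change point at t = 20 (rate 2 -> rate 8).
y = [2, 3, 1, 2, 2, 4, 1, 2, 3, 2, 1, 2, 3, 1, 2, 2, 3, 1, 2, 2,
     8, 7, 9, 6, 8, 10, 7, 9, 8, 6, 9, 8, 7, 10, 8, 9, 7, 8, 6, 9]
n, a, b = len(y), 1.0, 1.0

def sample_tau(lam1, lam2):
    """Discrete conditional: p(tau) proportional to the split's likelihood."""
    logp = []
    for tau in range(1, n):
        s1, s2 = sum(y[:tau]), sum(y[tau:])
        logp.append(s1 * math.log(lam1) - tau * lam1
                    + s2 * math.log(lam2) - (n - tau) * lam2)
    m = max(logp)                             # stabilize before exponentiating
    w = [math.exp(v - m) for v in logp]
    return random.choices(range(1, n), weights=w)[0]

tau, taus = 5, []
for it in range(2000):
    # Conjugate gamma updates: posterior Gamma(a + sum, b + length),
    # drawn via gammavariate(shape, scale) with scale = 1 / rate.
    lam1 = random.gammavariate(a + sum(y[:tau]), 1.0 / (b + tau))
    lam2 = random.gammavariate(a + sum(y[tau:]), 1.0 / (b + n - tau))
    tau = sample_tau(lam1, lam2)
    if it >= 500:                             # discard burn-in
        taus.append(tau)

tau_hat = sum(taus) / len(taus)
print(round(tau_hat))  # posterior mean near the true change point, 20
```

With gamma-conditional priors instead of independent gammas, only the two rate updates change; the alternating structure of the sampler is identical.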
Procedia PDF Downloads 189
2377 The Application of Hellomac Rockfall Alert System in Rockfall Barriers: An Explainer
Authors: Kinjal Parmar, Matteo Lelli
Abstract:
The use of IoT technology as a rockfall alert system is relatively new. This paper explains the potential of such an alert system, called HelloMac from Maccaferri, which gives transportation infrastructure asset owners a way to utilize their resources effectively in the detection of boulder impacts on rockfall barriers. This ensures a faster assessment of the impacted barrier and subsequently facilitates the implementation of remedial works in an effective and timely manner. In addition, HelloMac can also be integrated with another warning system to alert vehicle users of unseen dangers ahead. HelloMac is developed to work also in remote areas, where cell coverage is not available. The user is notified when a rockfall event occurs via mobile app, SMS, and email. By using such alert systems effectively, we can reduce the risk of rockfall hazard.
Keywords: rockfall, barrier, HelloMac, rockfall alert system
Procedia PDF Downloads 52