Search results for: Object detection
2152 Acrylic Microspheres-Based Microbial Bio-Optode for Nitrite Ion Detection
Authors: Siti Nur Syazni Mohd Zuki, Tan Ling Ling, Nina Suhaity Azmi, Chong Kwok Feng, Lee Yook Heng
Abstract:
Nitrite (NO2-) ion is used prevalently as a preservative in processed meat. Elevated levels of nitrite are also found in edible bird’s nests (EBNs). Consumption of NO2- ion at levels above the health-based risk may cause cancer in humans. The spectrophotometric Griess test is the simplest established standard method for NO2- ion detection; however, it requires careful control of the pH of each reaction step and is susceptible to interference from strong oxidants and dyes. Other traditional methods rely on laboratory-scale instruments such as GC-MS, HPLC and ion chromatography, which cannot give a real-time response. There is therefore a significant need for devices capable of measuring nitrite concentration in situ, rapidly, and without reagents, sample pretreatment or extraction steps. Herein, we constructed a microspheres-based microbial optode for visual quantitation of NO2- ion. Raoultella planticola, a bacterium expressing the NAD(P)H nitrite reductase (NiR) enzyme, was successfully extracted by microbial technique from EBN collected from a local birdhouse. The whole cells and the lipophilic Nile Blue chromoionophore were physically adsorbed onto the photocurable poly(n-butyl acrylate-N-acryloxysuccinimide) [poly(nBA-NAS)] microspheres, whilst the reduced coenzyme NAD(P)H was covalently immobilized on the succinimide-functionalized acrylic microspheres to produce a reagentless biosensing system. When the NiR enzyme catalyzes the oxidation of NAD(P)H to NAD(P)+, NO2- ion is reduced to ammonium hydroxide, and a colour change of the immobilized Nile Blue chromoionophore from blue to pink is perceived as the deprotonation reaction increases the local pH in the microsphere membrane. The microspheres-based optosensor was optimized with a reflectance spectrophotometer at 639 nm and pH 8. The resulting microbial bio-optode membrane could quantify NO2- ion down to 0.1 ppm and had a linear response up to 400 ppm.
Owing to the large surface-area-to-mass ratio of the acrylic microspheres, the membrane allows efficient solid-state diffusional mass transfer of the substrate to the bio-recognition phase and achieves a steady-state response in as little as 5 min. The proposed optical microbial biosensor requires no sample pre-treatment step and possesses high stability, as the whole-cell biocatalyst protects the enzymes from interfering substances; hence it is suitable for measurements in contaminated samples.
Keywords: acrylic microspheres, microbial bio-optode, nitrite ion, reflectometric
Procedia PDF Downloads 450
2151 Study and Construction on Signalling System during Reverse Motion Due to Obstacle
Authors: S. M. Yasir Arafat
Abstract:
Driving models are needed by many researchers to improve traffic safety and to advance autonomous vehicle design. To be most useful, a driving model must state specifically what information is needed and how it is processed. We therefore developed an “Obstacle Avoidance and Detection Autonomous Car” based on sensor applications. The ever-increasing technological demands of today call for very complex systems, which in turn require highly sophisticated controllers to ensure that high performance can be achieved and maintained under adverse conditions. Based on a developed model of brake operation, a controller for the braking system has been designed. Its task is to enable the braking system to be controlled more accurately than is currently the case.
Keywords: automobile, obstacle, safety, sensing
Procedia PDF Downloads 365
2150 Effect of Key Parameters on Performances of an Adsorption Solar Cooling Machine
Authors: Allouache Nadia
Abstract:
Solid adsorption cooling machines have been extensively studied recently. They constitute very attractive solutions for recovering large amounts of medium-temperature industrial waste heat and for using renewable energy sources such as solar energy. The technology of these machines can be developed through experimental studies and mathematical modelling. The latter method saves time and money because it is more flexible for simulating the variation of different parameters. Adsorption cooling machines consist essentially of an evaporator, a condenser and a reactor (the object of this work) containing a porous medium, which in our case is activated carbon reacting by adsorption with ammonia. The principle can be described as follows: when the adsorbent (at temperature T) is in exclusive contact with vapour of the adsorbate (at pressure P), an amount of adsorbate is trapped inside the micro-pores in an almost liquid state. This adsorbed mass m is a function of T and P according to a divariant equilibrium m = f(T, P). Moreover, at constant pressure, m decreases as T increases, and at constant adsorbed mass, P increases with T. This makes it possible to imagine an ideal refrigerating cycle consisting of a period of heating/desorption/condensation followed by a period of cooling/adsorption/evaporation. The effects of key parameters on the machine performance are analysed and discussed.
Keywords: activated carbon-ammoniac pair, effect of key parameters, numerical modeling, solar cooling machine
Procedia PDF Downloads 255
2149 Characterization of the Dispersion Phenomenon in an Optical Biosensor
Authors: An-Shik Yang, Chin-Ting Kuo, Yung-Chun Yang, Wen-Hsin Hsieh, Chiang-Ho Cheng
Abstract:
Optical biosensors have become a powerful detection and analysis tool for wide-ranging applications in biomedical research, pharmaceuticals and environmental monitoring. This study carried out computational fluid dynamics (CFD)-based simulations to explore the dispersion phenomenon in the microchannel of an optical biosensor. The predicted time sequences of concentration contours were used to better understand the dispersion development occurring in different geometric shapes of microchannels. The simulation results showed the surface concentrations at the sensing probe (with the best performance of a grating coupler) as a function of time to appraise the dispersion effect and thereby identify the design configurations resulting in minimum dispersion.
Keywords: CFD simulations, dispersion, microfluidic, optical waveguide sensors
Procedia PDF Downloads 546
2148 The Effect of Mood and Normative Conformity on Prosocial Behavior
Authors: Antoine Miguel Borromeo, Kristian Anthony Menez, Moira Louise Ordonez, David Carl Rabaya
Abstract:
This study aimed to test whether induced mood and normative conformity have any effect on prosocial behavior, which was operationalized as the willingness to donate to a non-government organization. The effect of current attitude towards the object of the prosocial behavior was also considered with a covariate test. Undergraduates taking an introductory psychology course (N = 132) at the University of the Philippines Diliman were asked how much money they were willing to donate after being shown a video about coral reef destruction and a website advocating saving the coral reefs. A 3 (induced mood: positive vs. fear and sadness vs. anger, contempt, and disgust) x 2 (normative conformity: presence vs. absence) between-subjects analysis of covariance was used. Prosocial behavior was measured by presenting a circumstance wherein participants were given money and asked whether they were willing to donate an amount to the non-government organization. The analysis of covariance revealed that the induced mood had no significant effect on prosocial behavior, F(2,125) = 0.654, p > 0.05. Normative conformity likewise had no significant effect, F(1,125) = 0.238, p > 0.05, nor did their interaction, F(2,125) = 1.580, p > 0.05. However, the covariate, current attitude towards corals, was significant, F(1,125) = 8.778, p < 0.05. From this, we speculate that people's inherent attitudes have a greater effect on prosocial behavior than temporary factors such as mood and conformity.
Keywords: attitude, induced mood, normative conformity, prosocial behavior
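The covariate-adjusted group comparison described above can be sketched in outline. The following is a simplified approximation, not the authors' exact ANCOVA: the outcome is residualized on the covariate and a one-way F-statistic is then computed on the residuals. All numbers below are made up for illustration.

```python
def residualize(x, y):
    """Remove the linear effect of covariate x from outcome y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    intercept = my - slope * mx
    return [b - (intercept + slope * a) for a, b in zip(x, y)]

def one_way_f(groups):
    """F statistic for a one-way between-subjects comparison."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((v - sum(g) / len(g)) ** 2 for v in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical data: attitude (covariate) drives donation; the three groups
# differ only because their attitude scores differ.
attitude = list(range(1, 13))
noise = [0.2, -0.2, 0.1, -0.1] * 3
donation = [2 * a + e for a, e in zip(attitude, noise)]
raw_groups = [donation[0:4], donation[4:8], donation[8:12]]
adj = residualize(attitude, donation)
adj_groups = [adj[0:4], adj[4:8], adj[8:12]]
f_raw = one_way_f(raw_groups)  # large: confounded by the covariate
f_adj = one_way_f(adj_groups)  # small: apparent group effect vanishes
```

After adjustment, the spurious group effect disappears, mirroring the study's finding that the covariate, not the manipulated factors, carried the effect.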
Procedia PDF Downloads 231
2147 Hate Speech Detection Using Deep Learning and Machine Learning Models
Authors: Nabil Shawkat, Jamil Saquer
Abstract:
Social media has accelerated our ability to engage with others and eliminated many communication barriers. On the other hand, the widespread use of social media resulted in an increase in online hate speech. This has drastic impacts on vulnerable individuals and societies. Therefore, it is critical to detect hate speech to prevent innocent users and vulnerable communities from becoming victims of hate speech. We investigate the performance of different deep learning and machine learning algorithms on three different datasets. Our results show that the BERT model gives the best performance among all the models by achieving an F1-score of 90.6% on one of the datasets and F1-scores of 89.7% and 88.2% on the other two datasets.
Keywords: hate speech, machine learning, deep learning, abusive words, social media, text classification
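For reference, the F1-score reported above is the harmonic mean of precision and recall for the positive (hate) class. A minimal, self-contained computation, not tied to the authors' datasets or models, looks like this:

```python
def f1_score(y_true, y_pred, positive=1):
    """F1 for one class: harmonic mean of precision and recall."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0
```

F1 is preferred over accuracy here because hate speech is typically a minority class in these datasets.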
Procedia PDF Downloads 139
2146 PCR Based DNA Analysis in Detecting P53 Mutation in Human Breast Cancer (MDA-468)
Authors: Debbarma Asis, Guha Chandan
Abstract:
Tumor Protein-53 (P53) is one of the tumor suppressor proteins. P53 regulates the cell cycle and conserves genomic stability by preventing mutation. It is so named because it runs as a 53-kilodalton (kDa) protein on polyacrylamide gel electrophoresis, although its actual mass is 43.7 kDa. Experimental evidence has indicated that P53 cancer mutants lose tumor suppression activity and subsequently gain oncogenic activities that promote tumourigenesis. Tumor-specific DNA has recently been detected in the plasma of breast cancer patients. Detection of tumor-specific genetic material in cancer patients may provide a unique and valuable tumor marker for diagnosis and prognosis. The commercially available MDA-468 breast cancer cell line was used for the proposed study.
Keywords: tumor protein (P53), cancer mutants, MDA-468, tumor suppressor gene
Procedia PDF Downloads 481
2145 Delivering Safer Clinical Trials; Using Electronic Healthcare Records (EHR) to Monitor, Detect and Report Adverse Events in Clinical Trials
Authors: Claire Williams
Abstract:
Randomised controlled trials (RCTs) of efficacy are still perceived as the gold standard for the generation of evidence, and whilst advances in data collection methods are well developed, this progress has not been matched in the reporting of adverse events (AEs). Assessment and reporting of AEs in clinical trials are fraught with human error and inefficiency and are extremely time- and resource-intensive. Recent research into the quality of AE reporting during clinical trials concluded that it is substandard and inconsistent: investigators commonly send sponsors reports that are incorrectly categorised and lacking in critical information, which can complicate the detection of valid safety signals. In our presentation, we will describe an electronic data capture system designed to support clinical trial processes by reducing the resource burden on investigators, improving overall trial efficiency, and making trials safer for patients. This proprietary technology was developed using expertise proven in the delivery of the world’s first prospective, phase 3b real-world trial, ‘The Salford Lung Study’, which enabled robust safety monitoring and reporting processes to be accomplished through remote monitoring of patients’ EHRs. The technology enables safety alerts that are pre-defined by the protocol to be detected from data extracted directly from the patient’s EHR. Based on study-specific criteria, created from the standard definition of a serious adverse event (SAE) and the safety profile of the medicinal product, the system raises a safety alert to the investigator or study team. Each safety alert requires a clinical review by the investigator or delegate; examples of alert types include hospital admission, death, hepatotoxicity, neutropenia, and acute renal failure.
This is achieved in near real-time; safety alerts can be reviewed along with any additional available information to determine whether they meet the protocol-defined criteria for reporting or withdrawal. This active surveillance technology helps reduce the resource burden of more traditional methods of AE detection for investigators and study teams and can help eliminate reporting bias. Integration of multiple healthcare data sources enables much more complete and accurate safety data to be collected as part of a trial and also provides an opportunity to evaluate a drug’s safety profile long-term, in post-trial follow-up. By utilising this robust and proven method of safety monitoring and reporting, much higher-risk patient cohorts can be enrolled into trials, thus promoting inclusivity and diversity. Broadening eligibility criteria and adopting more inclusive recruitment practices in the later stages of drug development will increase the ability to understand the medicinal product’s risk-benefit profile across the patient population that is likely to use the product in clinical practice. Furthermore, this ground-breaking approach to AE detection not only provides sponsors with better-quality safety data for their products but also reduces the resource burden on investigator and study teams. With the data taken directly from the source, trial costs are reduced, minimal data validation is required, and near real-time reporting enables safety concerns and signals to be detected more quickly than in a traditional RCT.
Keywords: more comprehensive and accurate safety data, near real-time safety alerts, reduced resource burden, safer trials
Procedia PDF Downloads 87
2144 Predictive Maintenance Based on Oil Analysis Applicable to Transportation Fleets
Authors: Israel Ibarra Solis, Juan Carlos Rodriguez Sierra, Ma. del Carmen Salazar Hernandez, Isis Rodriguez Sanchez, David Perez Guerrero
Abstract:
This paper explains the analysis techniques used for the lubricating oil during a maintenance period of a city bus (Mercedes Benz Boxer 40) serving the ‘R-24 route’ of line Coecillo Centro SA de CV in Leon, Guanajuato, in order to estimate the optimal time for the oil change. Using devices such as a rotational viscometer and an atomic absorption spectrometer, the incipient stage at which the oil loses its lubricating properties, and can therefore no longer protect the mechanical components of diesel engines such as those in these buses, can be detected. Timely detection of lost properties in the oil allows a preventive maintenance plan to be drawn up for the fleet.
Keywords: atomic absorption spectrometry, maintenance, predictive velocity rate, lubricating oils
Procedia PDF Downloads 570
2143 Research of Control System for Space Intelligent Robot Based on Vision Servo
Authors: Changchun Liang, Xiaodong Zhang, Xin Liu, Pengfei Sun
Abstract:
Space intelligent robotic systems are expected to play an increasingly important role in the future. On-orbit robotic servicing, whose key technologies are tracking and capturing, has become a research hotspot in recent years. In this paper, the authors propose a vision servo control system for target capturing. The robotic manipulator will be an intelligent robotic system with large-scale movement, functional agility, and autonomous ability, and it can be operated by astronauts in the space station or controlled by a ground operator in remote operation mode. To realize the autonomous movement and capture mission of the SRM, an autonomous programming strategy based on multi-camera vision fusion is designed, and the selection principle for object visual position and orientation measurement information is defined for better precision. A distributed control system hierarchy is designed, with reliability considered to guarantee the abilities of the control system. Finally, a ground experiment system is set up based on the concept of the robotic control system, and autonomous target-capturing experiments are conducted. The experimental results validate the proposed algorithm and demonstrate that the control system can fulfil the requirements of functionality, real-time performance and reliability.
Keywords: control system, on-orbital service, space robot, vision servo
Procedia PDF Downloads 419
2142 Investigation the Difference of Several Hormones Correlated to Reproduction between Infertile and Fertile Dairy Cows
Authors: Ali M. Mutlag, Yang Zhiqiang, Meng Jiaren, Zhang Jingyan, Li Jianxi
Abstract:
The objective of this study was to investigate several hormones correlated with reproduction, together with inhibin A, inhibin B and NO levels, in infertile dairy cows, in an attempt to illustrate the physiological causes of dairy cow infertility. Forty Holstein cows (21 infertile and 19 fertile) were used at the estrous phase of the cycle. The hormones FSH, LH, E2 and testosterone were measured using the ELISA method. Inhibin A and B were also estimated by ELISA, and nitric oxide was measured by the Griess reagent method. The results showed different hormone concentrations: FSH was significantly higher in the infertile cows than in the fertile cows (P<0.05), whereas LH and E2 showed a significant decrease in the infertile cows (P<0.05). No significant difference appeared in testosterone concentrations between the fertile and infertile cows (P>0.05). Both inhibins A and B showed significantly (P<0.05) decreased concentrations in the infertile cows, and NO also showed a clearly significant decrease (P<0.05) in the infertile cows. In conclusion, the present study confirmed the poor ovarian activity and reproductive disturbance of infertile cows in spite of apparent estrous signs. The study confirmed a positive correlation between the inhibins and NO in regulating ovarian physiology; these inhibins represent effective markers of dairy cow infertility.
Keywords: cows, inhibins A and B, infertility, nitric oxide (NO)
Procedia PDF Downloads 291
2141 Developing Integrated Model for Building Design and Evacuation Planning
Authors: Hao-Hsi Tseng, Hsin-Yun Lee
Abstract:
In the building design process, designers have to complete the spatial design and consider evacuation performance at the same time. It is usually difficult to combine the two planning processes, which results in a gap between spatial design and evacuation performance, so designers cannot arrive at an integrated optimal design solution. In addition, the evacuation routing models proposed by previous researchers differ from the practical evacuation decisions made in the field. On the other hand, more and more building design projects are executed with Building Information Modeling (BIM), in which the design content is formed within an object-oriented framework; thus, the integration of BIM and evacuation simulation can make a significant contribution for designers. This research therefore establishes a model that integrates spatial design and evacuation planning. The proposed model supports spatial design modifications and optimizes evacuation planning, allowing designers to complete the integrated design solution in BIM. Besides, this research improves the evacuation routing method to make the simulation results more practical. The proposed model will be applied in a building design project for evaluation and validation, where it will provide near-optimal design suggestions. By applying the proposed model, the integration and efficiency of the design process are improved, the evacuation plan is more useful, and the quality of the building spatial design will be better.
Keywords: building information modeling, evacuation, design, floor plan
Procedia PDF Downloads 456
2140 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets
Authors: Ece Cigdem Mutlu, Burak Alakent
Abstract:
Maintaining the quality of manufactured products at a desired level depends on the stability of the process dispersion and location parameters and on detecting perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring to monitor product quality and control the process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of process dispersion and location, respectively, under the assumption of independent and normally distributed datasets. On the other hand, there is no guarantee that real-world data will be normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g., occasional outliers in the rational subgroups in the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in the detection of inferior products in Phase II. For more efficient application of control charts, estimators that are robust against the contaminations that may exist in Phase I are required. In the current study, we present a simple approach to constructing robust Xbar control charts, using the average distance to the median, the Qn estimator of scale, and the M-estimator of scale with logistic psi-function to estimate the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator, and M-estimators of location with Huber and logistic psi-functions to estimate the process location parameter.
The Phase I efficiency of the proposed estimators and the Phase II performance of Xbar charts constructed from them are compared with the conventional mean and standard deviation statistics, both under normality and against diffuse-localized and symmetric-asymmetric contaminations, using 50,000 Monte Carlo simulations in MATLAB. We find that robust estimators yield parameter estimates with higher efficiency against all types of contaminations, and that Xbar charts constructed using robust estimators have higher power in detecting disturbances than conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups and employing different combinations of dispersion and location estimators on subgroups and individual observations are found to improve the performance of Xbar charts.
Keywords: average run length, M-estimators, quality control, robust estimators
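The construction above can be sketched minimally. This is an illustrative sketch, not the authors' implementation: it pairs the Hodges-Lehmann location estimator with the average distance to the median for scale, and omits the consistency constants that would be needed in practice to make the scale estimate unbiased for sigma.

```python
import statistics

def hodges_lehmann(xs):
    """Median of all pairwise Walsh averages (xi + xj) / 2, i <= j."""
    walsh = [(xs[i] + xs[j]) / 2
             for i in range(len(xs)) for j in range(i, len(xs))]
    return statistics.median(walsh)

def avg_dist_to_median(xs):
    """Average absolute distance to the subgroup median (robust scale)."""
    m = statistics.median(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

def robust_xbar_limits(subgroups, k=3.0):
    """Robust analogue of Xbar limits from Phase I rational subgroups.

    Consistency constants for the scale estimator are omitted here;
    in practice they would be calibrated for the subgroup size.
    """
    center = statistics.median(hodges_lehmann(g) for g in subgroups)
    sigma = statistics.median(avg_dist_to_median(g) for g in subgroups)
    n = len(subgroups[0])
    half = k * sigma / n ** 0.5
    return center - half, center, center + half
```

Because every ingredient is a median-based statistic, a single wild subgroup barely moves the limits, which is the property the abstract exploits.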
Procedia PDF Downloads 191
2139 Software Cloning and Agile Environment
Authors: Ravi Kumar, Dhrubajit Barman, Nomi Baruah
Abstract:
Software cloning has grown into an active area in the software engineering research community, yielding numerous techniques, tools and other methods for clone detection and removal. Copying and modifying a block of code is identified as cloning, as it is the most basic means of software reuse. Agile software development is an approach currently used in various software projects that helps respond to the unpredictability of building software through incremental, iterative work cadences. Software cloning has been introduced into agile environments, and many agile software development approaches use the concept of software cloning. This paper discusses various agile software development approaches and the degree to which the concept of software cloning is being introduced into them.
Keywords: agile environment, refactoring, reuse, software cloning
Procedia PDF Downloads 531
2138 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data
Authors: Ruchika Malhotra, Megha Khanna
Abstract:
The development of change prediction models can help software practitioners plan testing and inspection resources in the early phases of software development. However, a major challenge during the training of any classification model is the imbalanced nature of software quality data. A dataset with very few instances of the minority outcome category leads to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change-prone whereas the majority may be non-change-prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics when dealing with imbalanced data. To empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android data set and evaluates the performance of six different machine learning techniques. The results indicate extensive improvement in the performance of the classification models when resampling methods and robust performance measures are used.
Keywords: change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics
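One of the simplest resampling strategies in the family the abstract refers to is random oversampling of the minority class. The abstract does not specify which sampling methods were used, so the following is only a hedged sketch of the general idea:

```python
import random

def random_oversample(samples, labels, seed=42):
    """Duplicate minority-class examples until all classes are balanced."""
    rng = random.Random(seed)
    by_class = {}
    for s, l in zip(samples, labels):
        by_class.setdefault(l, []).append(s)
    target = max(len(v) for v in by_class.values())
    out_samples, out_labels = [], []
    for label, items in by_class.items():
        # top up smaller classes with randomly drawn duplicates
        resampled = items + [rng.choice(items)
                             for _ in range(target - len(items))]
        out_samples.extend(resampled)
        out_labels.extend([label] * len(resampled))
    return out_samples, out_labels
```

Oversampling is applied only to the training split; duplicating examples before the train/test split would leak information into evaluation.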
Procedia PDF Downloads 418
2137 Automatic Segmentation of Lung Pleura Based On Curvature Analysis
Authors: Sasidhar B., Bhaskar Rao N., Ramesh Babu D. R., Ravi Shankar M.
Abstract:
Segmentation of the lung pleura is a preprocessing step in computer-aided diagnosis (CAD) that helps reduce false positives in lung cancer detection. Existing methods fail to extract lung regions when nodules lie at the pleura of the lungs. In this paper, a new method is proposed that segments lung regions with nodules at the pleura based on curvature analysis and morphological operators. The proposed algorithm is tested on a six-patient dataset consisting of 60 images from the Lung Image Database Consortium (LIDC), and the results are found to be satisfactory, with a 98.3% average overlap measure (AΩ).
Keywords: curvature analysis, image segmentation, morphological operators, thresholding
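The abstract does not define the overlap measure AΩ; assuming it is the usual Jaccard-style ratio of intersection to union between the automatically segmented lung mask and a reference mask, it can be computed as:

```python
def overlap_measure(seg, ref):
    """Jaccard-style overlap: |seg ∩ ref| / |seg ∪ ref|.

    seg, ref: sets of (row, col) pixel coordinates labelled as lung.
    Two empty masks are treated as a perfect match.
    """
    union = len(seg | ref)
    return len(seg & ref) / union if union else 1.0
```

An average of this value over all 60 images would then give the 98.3% figure reported above.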
Procedia PDF Downloads 596
2136 Development of a Pain Detector Using Microwave Radiometry Method
Authors: Nanditha Rajamani, Anirudhaa R. Rao, Divya Sriram
Abstract:
One of the greatest difficulties in treating patients with pain is the highly subjective nature of pain sensation. The measurement of pain intensity is primarily dependent on the patient’s report, often with little physical evidence to provide objective corroboration. This is further complicated by the fact that the few existing technologies, such as functional magnetic resonance imaging (fMRI), are expensive. The need is thus clear and urgent for a reliable, non-invasive, non-painful, objective, readily adoptable and cost-efficient diagnostic platform that supplements the current regime with additional information to assist doctors in diagnosing these patients. Our idea of developing a pain detector was thus conceived to take the detection and diagnosis of chronic and acute pain a step further.
Keywords: pain sensor, microwave radiometry, pain sensation, fMRI
Procedia PDF Downloads 458
2135 Exploring the Feasibility of Introducing Particular Polyphenols into Cow Milk Naturally through Animal Feeding
Authors: Steve H. Y. Lee, Jeremy P. E. Spencer
Abstract:
The aim of the present study was to explore the feasibility of enriching cow milk with polyphenols by adding flavanone-rich citrus pulp to existing animal feed. Eight lactating Holstein cows were enrolled in the 4-week feeding study: four cows were fed the standard farm diet (control group), and the other four (treatment group) were fed the standard farm diet mixed with citrus pulp. Milk was collected twice a day, three times a week. The resulting milk yield and its macronutrient composition, as well as the lactose content, were measured. The milk phenolic compounds were analysed using electrochemical detection (ECD).
Keywords: milk, polyphenol, animal feeding, lactating cows
Procedia PDF Downloads 299
2134 Semantic Data Schema Recognition
Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia
Abstract:
The subject covered in this paper aims at assisting users in their data quality approach. The goal is to better extract, mix, interpret and reuse data. It deals with the semantic schema recognition of a data source, which enables the extraction of data semantics from all the available information, including the data and the metadata. Firstly, it consists of categorizing the data by assigning it to a category and possibly a sub-category, and secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and of the alternatives for correcting data. This approach allows automatic detection of a large number of syntactic and semantic anomalies.
Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies inter columns
Procedia PDF Downloads 418
2133 Factor Study Affecting Visual Awareness on Dynamic Object Monitoring
Authors: Terry Liang Khin Teo, Sun Woh Lye, Kai Lun Brendon Goh
Abstract:
As applied to dynamic monitoring situations, the prevailing approach to situation awareness (SA) assumes that the relevant areas of interest (AOI) must be perceived before that information can be processed further to affect decision-making and, thereafter, action. It is not entirely clear whether this is the case. This study investigates the monitoring of dynamic objects by matching eye fixations with the relevant AOIs in boundary-crossing scenarios; by this definition, a match is a fixation registered on the AOI. While many factors may affect monitoring characteristics, the traffic simulations in this study were designed to explore two: the number of inbound/outbound traffic transfers and the number of entry and/or exit points in a radar monitoring sector. These two factors were graded into five levels of difficulty ranging from low to high traffic flow numbers, and the combined permutations of the difficulty levels of the two factors yielded a total of thirty scenarios. The results showed that changes in the number of traffic transfers produced greater variation, with match rates ranging from 29% to 100%, compared with a range of 80% to 100% for the number of sector entry/exit points. The subsequent analysis is able to determine the types and combinations of traffic scenarios in which imperfect matching is likely to occur.
Keywords: air traffic simulation, eye-tracking, visual monitoring, focus attention
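The fixation-to-AOI matching described above reduces, at each point in time, to a point-in-rectangle test. The sketch below is illustrative only; the coordinates and the rectangular AOI shape are assumptions, not details from the study:

```python
def match_rate(fixations, aois):
    """Fraction of fixations landing inside at least one AOI.

    fixations: list of (x, y) gaze points.
    aois: list of (xmin, ymin, xmax, ymax) bounding boxes.
    """
    def in_aoi(point, box):
        x, y = point
        xmin, ymin, xmax, ymax = box
        return xmin <= x <= xmax and ymin <= y <= ymax

    if not fixations:
        return 0.0
    matched = sum(1 for f in fixations if any(in_aoi(f, b) for b in aois))
    return matched / len(fixations)
```

In a dynamic scenario the AOI boxes would be re-sampled at each fixation timestamp as the tracked aircraft move, which is what makes the matching non-trivial in practice.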
Procedia PDF Downloads 59
2132 1-Butyl-2,3-Dimethylimidazolium Bis (Trifluoromethanesulfonyl) Imide and Titanium Oxide Based Voltammetric Sensor for the Quantification of Flunarizine Dihydrochloride in Solubilized Media
Authors: Rajeev Jain, Nimisha Jadon, Kshiti Singh
Abstract:
A glassy carbon electrode modified with titanium oxide nanoparticles and 1-butyl-2,3-dimethylimidazolium bis(trifluoromethanesulfonyl)imide (TiO2/IL/GCE) has been fabricated for electrochemical sensing of flunarizine dihydrochloride (FRH). The electrochemical properties and morphology of the prepared nanocomposite were studied by electrochemical impedance spectroscopy (EIS) and transmission electron microscopy (TEM). The response of the electrochemical sensor was found to be proportional to the concentration of FRH in the range from 0.5 µg mL-1 to 16 µg mL-1, with a detection limit of 0.03 µg mL-1. The proposed method was also applied to the determination of FRH in a pharmaceutical formulation and in human serum, with good recoveries.
Keywords: flunarizine dihydrochloride, ionic liquid, nanoparticles, voltammetry, human serum
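A detection limit such as the 0.03 µg mL-1 quoted above is commonly derived from the calibration slope via the 3-sigma criterion (LOD = 3·s_blank / slope). The abstract does not state how the limit was derived, so the sketch below, with made-up calibration data, only illustrates that common convention:

```python
def linear_fit(conc, signal):
    """Ordinary least-squares calibration line: signal = slope*conc + intercept."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    slope = sum((c - mx) * (s - my) for c, s in zip(conc, signal)) / \
            sum((c - mx) ** 2 for c in conc)
    return slope, my - slope * mx

def detection_limit(blank_sd, slope):
    """3-sigma criterion: smallest concentration distinguishable from blank."""
    return 3 * blank_sd / slope

conc = [0.5, 2.0, 4.0, 8.0, 16.0]   # ug/mL, spanning the linear range above
signal = [2 * c + 1 for c in conc]  # idealized, noise-free response
slope, intercept = linear_fit(conc, signal)
lod = detection_limit(0.02, slope)  # hypothetical blank standard deviation
```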
Procedia PDF Downloads 334
2131 Unsupervised Learning of Spatiotemporally Coherent Metrics
Authors: Ross Goroshin, Joan Bruna, Jonathan Tompson, David Eigen, Yann LeCun
Abstract:
Current state-of-the-art classification and detection algorithms rely on supervised training. In this work we study unsupervised feature learning in the context of temporally coherent video data. We focus on feature learning from unlabeled video data, using the assumption that adjacent video frames contain semantically similar information. This assumption is exploited to train a convolutional pooling auto-encoder regularized by slowness and sparsity. We establish a connection between slow feature learning and metric learning, and show that the trained encoder can be used to define a more temporally and semantically coherent metric.
Keywords: machine learning, pattern clustering, pooling, classification
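The slowness and sparsity regularizers mentioned above can be illustrated with a minimal numeric sketch (plain Python, with the encoder itself omitted): the codes of adjacent frames are penalized for changing quickly, and each code is penalized for being dense. The weights alpha and beta and the example codes are arbitrary choices for illustration, not the paper's settings.

```python
def l1(values):
    """L1 norm of a feature vector (or generator of components)."""
    return sum(abs(v) for v in values)

def slowness_sparsity_penalty(z_t, z_t1, alpha=1.0, beta=0.1):
    """Penalty on the codes z_t, z_t1 of two adjacent video frames."""
    slowness = l1(a - b for a, b in zip(z_t, z_t1))  # codes should change slowly
    sparsity = l1(z_t) + l1(z_t1)                    # codes should be sparse
    return alpha * slowness + beta * sparsity

z_a = [0.0, 1.0, 0.0, 0.5]
z_b = [0.0, 0.9, 0.0, 0.5]   # adjacent frame: similar code, small penalty
z_c = [1.0, 0.0, 0.7, 0.0]   # unrelated frame: large penalty
penalty_near = slowness_sparsity_penalty(z_a, z_b)
penalty_far = slowness_sparsity_penalty(z_a, z_c)
```

Minimizing such a penalty alongside reconstruction loss pushes the encoder toward features that vary slowly across adjacent frames, which is what makes the learned metric temporally coherent.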
Procedia PDF Downloads 456
2130 Increasing Business Competitiveness in Georgia in Terms of Globalization
Authors: Badri Gechbaia, Levan Gvarishvili
Abstract:
Although many Georgian scientists have worked on the issue of business competitiveness, we think it is necessary to deepen the work in this sphere: to refine the methodology for estimating business competitiveness, to identify the main factors that define competitive advantages in business, to establish the interconnections between the level of business competitiveness and the quality of the state's involvement in international economic processes, and to define ways to raise business competitiveness and its role in upgrading the country's economic development. The introduction justifies the relevance of the studied topic and the thesis; it defines the survey subject, object, and goals with relevant objectives; the theoretical-methodological and informational-statistical base of the survey; what is new in the survey; and its theoretical and practical value. The study is an effort to raise public awareness of this issue. It analyzes the fundamental conditions for the efficient functioning of business in Georgia and identifies reserves for increasing its efficiency based on an assessment of the strengths and weaknesses of the business sector. Methods of system analysis, abstract logic, induction and deduction, synthesis and generalization, and positive, normative, and comparative analysis are used in the research process. Specific regularities of the impact of the globalization process on the determinants of business competitiveness are established, and the reasons for the level of business competitiveness in Georgia have been identified.
Keywords: competitiveness, methodology, Georgia, economy
Procedia PDF Downloads 115
2129 Spatial and Temporal Analysis of Forest Cover Change with Special Reference to Anthropogenic Activities in Kullu Valley, North-Western Indian Himalayan Region
Authors: Krisala Joshi, Sayanta Ghosh, Renu Lata, Jagdish C. Kuniyal
Abstract:
Throughout the world, monitoring and estimating the changing pattern of forests across diverse landscapes through remote sensing is instrumental in understanding the interactions of human activities and the ecological environment with the changing climate. Forest change detection using satellite imagery has emerged as an important means of gathering information on a regional scale. Kullu valley in Himachal Pradesh, India is situated in a transitional zone between the lesser and the greater Himalayas. It thus presents a typical rugged mountainous terrain of moderate to high altitude, varying from 1200 meters to over 6000 meters. Due to changes in agricultural cropping patterns, urbanization, industrialization, hydropower generation, climate change, tourism, and anthropogenic forest fire, it has undergone a tremendous transformation in forest cover in the past three decades. The loss and degradation of forest cover result in soil erosion, loss of biodiversity including damage to wildlife habitats, degradation of watershed areas, and deterioration of the overall quality of nature and life. Supervised classification of LANDSAT satellite data was performed to assess the changes in forest cover in Kullu valley over the years 2000 to 2020. The Normalized Burn Ratio (NBR) was calculated to discriminate between burned and unburned areas of the forest. Our study reveals that the number of forest fire incidents in Kullu valley, specifically those due to anthropogenic activities, has risen in each subsequent year. The main objective of the present study is, therefore, to estimate the change in the forest cover of Kullu valley and to address the various social aspects responsible for the anthropogenic forest fires, as well as to assess their impact on significant changes in the regional climatic factors, specifically temperature, humidity, and precipitation, over three decades with the help of satellite imagery and ground data.
The main outcome of the paper, we believe, will help the administration make a quantitative assessment of forest cover changes due to anthropogenic activities and devise long-term measures for creating awareness among the local people of the area.
Keywords: anthropogenic activities, forest change detection, normalized burn ratio (NBR), supervised classification
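As a point of reference, the Normalized Burn Ratio used above is computed per pixel from the near-infrared (NIR) and shortwave-infrared (SWIR) bands. The reflectance values in this sketch are illustrative, not taken from the Kullu imagery.

```python
def nbr(nir, swir):
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / (nir + swir)

pre_fire = nbr(nir=0.45, swir=0.15)    # healthy vegetation: high NBR
post_fire = nbr(nir=0.20, swir=0.40)   # burned patch: low (negative) NBR
dnbr = pre_fire - post_fire            # large positive dNBR suggests a burn
```

In practice the difference image dNBR = NBR_pre - NBR_post is thresholded to map burned area and burn severity across the scene.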
Procedia PDF Downloads 173
2128 Facial Expression Recognition Using Sparse Gaussian Conditional Random Field
Authors: Mohammadamin Abbasnejad
Abstract:
The analysis of expression and the detection of facial Action Units (AUs) are important tasks in computer vision and Human-Computer Interaction (HCI) due to their wide range of applications in human life. Many works during the past few years have their own advantages and disadvantages. In this work, we present a new model based on the Gaussian Conditional Random Field. We solve our objective problem using ADMM and show how well the proposed model works. We train and test our work on two facial expression datasets, CK+ and RU-FACS. Experimental evaluation shows that our proposed approach outperforms state-of-the-art expression recognition methods.
Keywords: Gaussian Conditional Random Field, ADMM, convergence, gradient descent
Procedia PDF Downloads 357
2127 Size Reduction of Images Using Constraint Optimization Approach for Machine Communications
Authors: Chee Sun Won
Abstract:
This paper presents the size reduction of images for machine-to-machine communications. Here, the salient image regions to be preserved include the image patches around key-points such as corners and blobs. Based on a saliency map built from the key-points and their image patches, an axis-aligned grid-size optimization is proposed for the reduction of image size. To increase the size-reduction efficiency, the aspect-ratio constraint is relaxed in the constraint optimization framework. The proposed method yields higher matching accuracy after size reduction than conventional content-aware image size-reduction methods.
Keywords: image compression, image matching, key-point detection and description, machine-to-machine communication
Procedia PDF Downloads 419
2126 Risk Assessment of Lead Element in Red Peppers Collected from Marketplaces in Antalya, Southern Turkey
Authors: Serpil Kilic, Ihsan Burak Cam, Murat Kilic, Timur Tongur
Abstract:
Interest in lead (Pb) has increased considerably in recent years owing to knowledge of the potential toxic effects of this element. Exposure to heavy metals above acceptable limits affects human health; indeed, Pb accumulates through food chains up to toxic concentrations and can therefore pose an adverse threat to human health. A sensitive and reliable method for the determination of Pb in red pepper was developed in the present study. Samples (33 red pepper products of different brands) were purchased from different markets in Turkey. The selected method validation criteria (linearity, limit of detection, limit of quantification, recovery, and trueness) were demonstrated; recovery values close to 100% showed adequate precision and accuracy for the analysis. According to the results, Pb was determined at various concentrations in all of the tested samples. A Perkin-Elmer ELAN DRC-e model ICP-MS system was used for detection of Pb. Organic red pepper was used as the matrix for all method validation studies. The certified reference material, Fapas chili powder, was digested and analyzed together with the different sample batches, and three replicates from each sample were digested and analyzed. The exposure levels of the elements were discussed in light of the scientific opinions of the European Food Safety Authority (EFSA), the European Union's (EU) risk assessment source for food safety. The Target Hazard Quotient (THQ), described by the United States Environmental Protection Agency (USEPA) for calculating potential health risks associated with long-term exposure to chemical pollutants, incorporates the intake of elements, exposure frequency and duration, body weight, and the oral reference dose (RfD).
If THQ < 1, the exposed population is assumed to be safe, while 1 < THQ < 5 means that the exposed population is in a level-of-concern interval. In this study, the THQ of Pb was found to be < 1. The THQ calculations showed values below one for all samples tested, meaning the samples did not pose a health risk to the local population. This work was supported by The Scientific Research Projects Coordination Unit of Akdeniz University, Project Number FBA-2017-2494.
Keywords: lead analyses, red pepper, risk assessment, daily exposure
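A hedged sketch of the USEPA-style THQ calculation referenced above follows the standard form THQ = (EF × ED × IR × C) / (RfD × BW × AT). All parameter values below, including the oral reference dose used for Pb, are illustrative assumptions rather than figures from the study.

```python
def thq(c_mg_per_kg, intake_kg_per_day, rfd_mg_per_kg_day,
        ef_days_per_year=365, ed_years=70, bw_kg=70):
    """Target Hazard Quotient: (EF * ED * IR * C) / (RfD * BW * AT)."""
    at_days = 365 * ed_years  # averaging time for non-carcinogenic effects
    exposure = ef_days_per_year * ed_years * intake_kg_per_day * c_mg_per_kg
    return exposure / (rfd_mg_per_kg_day * bw_kg * at_days)

# Illustrative inputs: 0.1 mg/kg Pb in red pepper, 5 g eaten per day, and
# 0.0035 mg/kg/day as a commonly assumed oral reference dose for Pb.
value = thq(c_mg_per_kg=0.1, intake_kg_per_day=0.005, rfd_mg_per_kg_day=0.0035)
```

With these assumed inputs the quotient falls well below one, the safe-population threshold described above.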
Procedia PDF Downloads 169
2125 Scalable UI Test Automation for Large-scale Web Applications
Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani
Abstract:
This research concerns optimizing UI test automation for large-scale web applications. The test target is the HHAexchange homecare management web application, which seamlessly connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionalities. This study focuses on user interface automation testing for that web application. The quality assurance team must execute many manual user interface test cases during development to confirm there are no regression bugs. The team automated 346 test cases, and the UI automation test execution time was over 17 hours. The business requirement was to reduce the execution time in order to release high-quality products quickly, so the quality assurance automation team modernized the test automation framework to optimize execution time. The base of the web UI automation test environment is Selenium, and the test code is written in Python. Adopting a compiled language for test code leads to an inefficient flow when introducing scalability into a traditional test automation environment, so a scripting language was adopted in order to introduce scalability efficiently. The scalability mechanism is implemented mainly with AWS serverless technology, the Elastic Container Service. Scalability here means the ability to automatically provision computers for test automation and to increase or decrease the number of computers running those tests, so that test cases can run in parallel and execution time decreases dramatically. Introducing scalable test automation does more than reduce test execution time: because test cases can be executed at the same time, some challenging bugs, such as race conditions, may also be detected.
If API and unit tests are implemented, test strategies can be adopted more efficiently alongside this scalability testing. In practice, however, API and unit testing cannot cover 100% of functional testing in web applications, since they do not exercise front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system and confirmed both the optimization of test case execution time and the detection of a challenging bug. The study first describes the detailed architecture of the scalable test automation environment, then reports the actual reduction in execution time and an example of challenging-issue detection.
Keywords: aws, elastic container service, scalability, serverless, ui automation test
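The core idea behind the elastic scaling described above, splitting the suite so that each container runs one shard of test cases in parallel, can be sketched as follows. The helper and case names are illustrative, not HHAexchange's actual framework.

```python
def shard_tests(test_cases, num_workers):
    """Round-robin split of a test suite across parallel workers."""
    shards = [[] for _ in range(num_workers)]
    for i, case in enumerate(test_cases):
        shards[i % num_workers].append(case)
    return shards

suite = [f"test_case_{i:03d}" for i in range(346)]  # the 346 automated cases
shards = shard_tests(suite, num_workers=20)
# With 20 containers, wall-clock time approaches the longest shard's runtime
# rather than the sum of all shards.
```

Each shard would then be handed to one container task, with the container count scaled up or down to trade cost against wall-clock time.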
Procedia PDF Downloads 108
2124 Bayesian Analysis of Change Point Problems Using Conditionally Specified Priors
Authors: Golnaz Shahtahmassebi, Jose Maria Sarabia
Abstract:
In this talk, we introduce a new class of conjugate prior distributions obtained from the conditional specification methodology. We illustrate the application of such distributions to Bayesian change point detection in Poisson processes. We obtain the posterior distribution of the model parameters using a general bivariate distribution with gamma conditionals. Simulation from the posterior is readily implemented using a Gibbs sampling algorithm, which can be applied even when the conditional densities are incompatible or compatible only with an improper joint density. The application of these methods is demonstrated using examples of simulated and real data.
Keywords: change point, Bayesian inference, Gibbs sampler, conditional specification, gamma conditional distributions
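A minimal Gibbs sampler for a Poisson change-point model is sketched below. Note the hedge: for simplicity it uses independent Gamma priors on the two rates rather than the conditionally specified bivariate gamma prior proposed in the talk, and the counts are simulated toy data.

```python
import math
import random

random.seed(0)

def gibbs_changepoint(y, iters=500, a=2.0, b=1.0):
    """Gibbs sampler: y[:tau] ~ Poisson(lam1), y[tau:] ~ Poisson(lam2)."""
    n = len(y)
    tau = n // 2
    samples = []
    for _ in range(iters):
        # Gamma full conditionals for the two Poisson rates
        lam1 = random.gammavariate(a + sum(y[:tau]), 1.0 / (b + tau))
        lam2 = random.gammavariate(a + sum(y[tau:]), 1.0 / (b + n - tau))
        # Discrete full conditional for the change point tau
        logw = [sum(y[:t]) * math.log(lam1) - t * lam1
                + sum(y[t:]) * math.log(lam2) - (n - t) * lam2
                for t in range(1, n)]
        m = max(logw)
        weights = [math.exp(v - m) for v in logw]
        tau = random.choices(range(1, n), weights=weights)[0]
        samples.append((lam1, lam2, tau))
    return samples

y = [1, 2, 1, 0, 2, 1, 8, 9, 7, 10, 8, 9]  # rate jumps after the sixth count
samples = gibbs_changepoint(y)
taus = [t for _, _, t in samples]
tau_mode = max(set(taus), key=taus.count)  # posterior mode of the change point
```

The conditionally specified prior of the talk would replace the two independent Gamma draws with draws from the bivariate gamma-conditional posterior, leaving the overall Gibbs structure unchanged.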
Procedia PDF Downloads 190
2123 Design and Optimization for a Compliant Gripper with Force Regulation Mechanism
Authors: Nhat Linh Ho, Thanh-Phong Dao, Shyh-Chour Huang, Hieu Giang Le
Abstract:
This paper presents the design and optimization of a compliant gripper. The gripper is constructed based on the concept of a compliant mechanism with flexure hinges. A passive force regulation mechanism, designed using planar springs, is presented to control the grasping force on a micro-sized object instead of using a force sensor. The gripper is expected to achieve a large range of displacement so as to handle objects of various sizes. First, the statics and dynamics of the gripper are investigated using finite element analysis in the ANSYS software. The design parameters of the gripper are then optimized via the Taguchi method: an orthogonal array L9 is used to establish an experimental matrix, and the signal-to-noise ratio is analyzed to find the optimal solution. Finally, response surface methodology is employed to model the relationship between the design parameters and the output displacement of the gripper, and the design-of-experiment method is used for sensitivity analysis to determine the effect of each parameter on the displacement. The results showed that the compliant gripper can move with a large displacement of 213.51 mm, and the force regulation mechanism is expected to be useful for high-precision positioning systems.
Keywords: flexure hinge, compliant mechanism, compliant gripper, force regulation mechanism, Taguchi method, response surface methodology, design of experiment
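The signal-to-noise analysis mentioned above can be illustrated with the Taguchi larger-the-better criterion, the appropriate choice when a large output displacement is desired. The repeated displacement readings per trial below are invented for illustration, not results from the paper.

```python
import math

def sn_larger_is_better(ys):
    """Taguchi larger-the-better S/N = -10 * log10(mean(1 / y^2))."""
    return -10 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

trial_a = [180.0, 185.0, 178.0]   # repeated displacement readings, mm
trial_b = [210.0, 213.5, 212.0]   # a second L9 trial's readings, mm
sn_a = sn_larger_is_better(trial_a)
sn_b = sn_larger_is_better(trial_b)  # higher S/N: preferred factor setting
```

In the L9 analysis, each factor level is scored by averaging the S/N ratios of the trials in which it appears, and the level with the highest mean S/N is selected.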
Procedia PDF Downloads 332