Search results for: false EIV
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 387

177 Highly Accurate Target Motion Compensation Using Entropy Function Minimization

Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani

Abstract:

One of the defects of stepped-frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range-cell shift, false peaks, Signal-to-Noise Ratio (SNR) reduction, and range-profile spreading, because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Compensation for the effects of the Target Motion Parameters (TMPs) is therefore required. In this paper, a method for estimating the TMPs (velocity and acceleration), and consequently eliminating or suppressing their unwanted effects on the HRRP, is proposed based on entropy minimization. The method is carried out in two major steps: in the first step, a discrete search over the whole acceleration-velocity lattice, within a specified interval, finds a coarse minimum of the entropy function. In the second step, a 1-D search over velocity is performed in the vicinity of that minimum along several constant-acceleration lines, in order to refine the minimum found in the first step. The provided simulation results demonstrate the effectiveness of the proposed method.
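
As a rough illustration of this two-step search, the sketch below minimizes HRRP entropy over a coarse velocity-acceleration grid and then refines the velocity along the best constant-acceleration line; the phase model and all radar parameters (carrier, frequency step, PRI) are illustrative assumptions, not the paper's values.

```python
import numpy as np

def hrrp_entropy(echo, v, a, f0=10e9, df=1e6, pri=1e-4, c=3e8):
    """Entropy of the HRRP after compensating a candidate (v, a).
    All radar parameters here are illustrative, not the paper's."""
    n = np.arange(len(echo))
    f = f0 + n * df                    # stepped carrier frequencies
    t = n * pri                        # pulse times across the burst
    phase = 4.0 * np.pi * f * (v * t + 0.5 * a * t**2) / c
    hrrp = np.abs(np.fft.ifft(echo * np.exp(1j * phase)))
    p = hrrp**2 / np.sum(hrrp**2)      # normalized power profile
    return -np.sum(p * np.log(p + 1e-12))

def coarse_to_fine(echo, v_grid, a_grid):
    # Step 1: discrete search over the whole acceleration-velocity lattice
    v0, a0 = min(((v, a) for v in v_grid for a in a_grid),
                 key=lambda va: hrrp_entropy(echo, *va))
    # Step 2: 1-D refinement in velocity along the constant-acceleration line
    v_fine = np.linspace(v0 - 1.0, v0 + 1.0, 201)
    return min(((v, a0) for v in v_fine),
               key=lambda va: hrrp_entropy(echo, *va))
```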

Keywords: automatic target recognition (ATR), high resolution range profile (HRRP), motion compensation, stepped frequency waveform technique (SFW), target motion parameters (TMPs)

Procedia PDF Downloads 152
176 Feature Based Unsupervised Intrusion Detection

Authors: Deeman Yousif Mahmood, Mohammed Abdullah Hussein

Abstract:

The goal of a network-based intrusion detection system is to classify network traffic activities into two major categories: normal and attack (intrusive) activities. Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS), using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps to improve the efficiency, performance, and prediction rate of the proposed approach. This paper applies the unsupervised K-means clustering algorithm with information gain (IG) for feature selection and reduction to build a network intrusion detection system. For our experimental analysis, we used the NSL-KDD dataset, a modified version of the KDDCup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a two-class classification (Normal, Attack) was implemented. The Weka framework, a Java-based open-source collection of machine learning algorithms for data mining tasks, was used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and it takes less learning time than using the full feature set of the dataset with the same algorithm.
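
A minimal sketch of the IG-plus-K-means pipeline in Python with scikit-learn (the paper itself uses Weka); the placeholder arrays stand in for the NSL-KDD features and labels, and mutual information serves as the information-gain score.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Placeholders for the 41 NSL-KDD features and binary labels (0 = normal,
# 1 = attack); loading and categorical encoding are omitted
X = np.random.rand(1000, 41)
y = np.random.randint(0, 2, 1000)

# Information gain (here: mutual information) ranks features; keep the top k
X_red = SelectKBest(mutual_info_classif, k=10).fit_transform(X, y)

# Two clusters for the two-class problem (Normal vs. Attack)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_red)
# Map each cluster to its majority training label for evaluation
cluster_label = np.array([np.bincount(y[km.labels_ == c]).argmax()
                          for c in range(2)])
y_pred = cluster_label[km.labels_]
```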

Keywords: information gain (IG), intrusion detection system (IDS), k-means clustering, Weka

Procedia PDF Downloads 296
175 Computational Identification of Signalling Pathways in Protein Interaction Networks

Authors: Angela U. Makolo, Temitayo A. Olagunju

Abstract:

The knowledge of signaling pathways is central to understanding the biological mechanisms of organisms, since it has been established that in eukaryotic organisms the number of signaling pathways determines the number of ways the organism can react to external stimuli. Signaling pathways are studied using protein interaction networks constructed from protein-protein interaction data obtained by high-throughput experimental procedures. However, these high-throughput methods are known to produce very high rates of false positive and false negative interactions. In order to construct a useful protein interaction network from this noisy data, computational methods are applied to validate the protein-protein interactions. In this study, a computational technique was designed to identify signaling pathways from a protein interaction network constructed using validated protein-protein interaction data. A weighted interaction graph of Saccharomyces cerevisiae (baker's yeast) was constructed, with proteins as nodes and interactions between them as edges. The weights were obtained using a Bayesian probabilistic network to estimate the posterior probability of interaction between two proteins, given gene expression measurements as biological evidence. Only interactions above a threshold were accepted for the network model. A pathway was formalized as a simple path in the interaction network from a starting protein to an ending protein of interest. We were able to identify some pathway segments, one of which signals the start of the process of meiosis in S. cerevisiae.
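
The path-finding step can be sketched as below with networkx; the edge weights, threshold, and protein names (drawn from the yeast meiosis pathway) are illustrative assumptions.

```python
import networkx as nx

# (protein, protein, posterior interaction probability); names drawn from the
# yeast meiosis pathway purely for illustration, weights are made up
edges = [("IME1", "UME6", 0.92), ("UME6", "IME2", 0.85),
         ("IME1", "RIM11", 0.40), ("RIM11", "IME2", 0.70)]
threshold = 0.5

G = nx.Graph()
G.add_weighted_edges_from((u, v, w) for u, v, w in edges if w >= threshold)

# A pathway is formalized as a simple path between a start and end protein
for path in nx.all_simple_paths(G, source="IME1", target="IME2", cutoff=5):
    print(path)
```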

Keywords: Bayesian networks, protein interaction networks, Saccharomyces cerevisiae, signalling pathways

Procedia PDF Downloads 543
174 The Effects of Self-Efficacy on Challenge and Threat States

Authors: Nadine Sammy, Mark Wilson, Samuel Vine

Abstract:

The Theory of Challenge and Threat States in Athletes (TCTSA) states that self-efficacy is an antecedent of challenge and threat. These states result from conscious and unconscious evaluations of situational demands and personal resources and are represented by both cognitive and physiological markers. Challenge is considered a more adaptive stress response as it is associated with a more efficient cardiovascular profile, as well as better performance and attention effects compared with threat. Self-efficacy is proposed to influence challenge/threat because an individual’s belief that they have the skills necessary to execute the courses of action required to succeed contributes to a perception that they can cope with the demands of the situation. This study experimentally examined the effects of self-efficacy on cardiovascular responses (challenge and threat), demand and resource evaluations, performance and attention under pressurised conditions. Forty-five university students were randomly assigned to either a control (n=15), low self-efficacy (n=15) or high self-efficacy (n=15) group and completed baseline and pressurised golf putting tasks. Self-efficacy was manipulated using false feedback adapted from previous studies. Measures of self-efficacy, cardiovascular reactivity, demand and resource evaluations, task performance and attention were recorded. The high self-efficacy group displayed more favourable cardiovascular reactivity, indicative of a challenge state, compared with the low self-efficacy group. The former group also reported high resource evaluations, but no task performance or attention effects were detected. These findings demonstrate that levels of self-efficacy influence cardiovascular reactivity and perceptions of resources under pressurised conditions.

Keywords: cardiovascular, challenge, performance, threat

Procedia PDF Downloads 232
173 Face Recognition Using Body-Worn Camera: Dataset and Baseline Algorithms

Authors: Ali Almadan, Anoop Krishnan, Ajita Rattani

Abstract:

Facial recognition is a widely adopted technology in surveillance, border control, healthcare, banking services, and lately in mobile user authentication, with Apple introducing the "Face ID" moniker with the iPhone X. A lot of research has been conducted on face recognition with datasets captured by surveillance cameras, DSLRs, and mobile devices. Recently, face recognition technology has also been deployed on body-worn cameras to keep officers safe, enable situational awareness, and provide evidence for trial. However, limited academic research has been conducted on this topic so far, and no publicly available dataset with a sufficient sample size exists. This paper aims to advance research on face recognition using body-worn cameras. To this end, the contribution of this work is two-fold: (1) collection of a dataset consisting of a total of 136,939 facial images of 102 subjects captured using body-worn cameras in indoor and daylight conditions, and (2) evaluation of various deep-learning architectures for face identification on the collected dataset. Experimental results suggest a maximum True Positive Rate (TPR) of 99.86% at a False Positive Rate (FPR) of 0.000, obtained by the SphereFace-based deep-learning architecture in the daylight condition. The collected dataset and the baseline algorithms will promote further research and development. A download link for the dataset and the algorithms is available by contacting the authors.
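
A small sketch of how a TPR-at-fixed-FPR operating point can be read off a ROC curve with scikit-learn; the scores and labels are toy values, not the paper's data.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Toy verification scores: 1 = genuine pair, 0 = impostor pair
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
scores = np.array([0.90, 0.20, 0.80, 0.70, 0.40, 0.10, 0.95, 0.30])

fpr, tpr, _ = roc_curve(y_true, scores)
target_fpr = 0.0
print("TPR at FPR <= %.3f: %.4f" % (target_fpr, tpr[fpr <= target_fpr].max()))
```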

Keywords: face recognition, body-worn cameras, deep learning, person identification

Procedia PDF Downloads 163
172 Application of Biosensors in Forensic Analysis

Authors: Shirin Jalili, Hadi Shirzad, Samaneh Nabavi, Somayeh Khanjani

Abstract:

Biosensors are ideal biological tools for rapid and sensitive initial screening and testing in forensic analysis, detecting suspicious components such as biological and chemical agents at crime scenes. The wide use of different biomolecules such as proteins, nucleic acids, microorganisms, antibodies, and enzymes makes this possible. These biosensors have great advantages such as rapidity, little sample manipulation, and high sensitivity; because of their stability, specificity, and low cost, they have also become a very important tool for forensic analysis and crime detection. At crime scenes, substances such as sexual assault samples, semen, saliva, fingerprints, and blood act as detection targets for biosensors. Moreover, successful fluid recovery via biosensor has the potential to yield a highly valuable source of genetic material, which is important in finding the suspect. Current biological fluid testing techniques, by contrast, are limited for the identification of body fluids: when used simultaneously, they often give false positive results, and these limitations can negatively affect the outcome of a case through missed or misinterpreted evidence. Biosensors enable criminal investigators to detect biological fluids with high sensitivity and without destroying the sample, through interaction with fluid-endogenous markers and other biological and chemical contamination at the crime scene. For this reason, biosensors are well suited to detecting the biological fluids found at crime scenes, which play an important role in identifying the suspect and solving the crime.

Keywords: biosensors, forensic analysis, biological fluid, crime detection

Procedia PDF Downloads 1117
171 Fake News Detection for Korean News Using Machine Learning Techniques

Authors: Tae-Uk Yun, Pullip Chung, Kee-Young Kwahk, Hyunchul Ahn

Abstract:

Fake news is defined as news articles that are intentionally and verifiably false and could mislead readers. The spread of fake news may provoke anxiety, chaos, fear, or irrational decisions among the public. Thus, detecting fake news and preventing its spread has become a very important issue in our society. However, due to the huge amount of fake news produced every day, it is almost impossible to identify it all by hand. In this context, researchers have tried over the past years to develop automated fake news detection using machine learning techniques. To the best of our knowledge, however, no prior study has proposed an automated fake news detection method for Korean news. In this study, we aim to detect Korean fake news using text mining and machine learning techniques. Our proposed method consists of two steps. In the first step, the news content to be analyzed is converted into quantified values using various text mining techniques (topic modeling, TF-IDF, and so on). In the second step, classifiers are trained using the values produced in step 1. As classifiers, machine learning techniques such as logistic regression, backpropagation networks, support vector machines, and deep neural networks can be applied. To validate the effectiveness of the proposed method, we collected about 200 short Korean news items from Seoul National University's FactCheck, which provides detailed analysis reports from 20 media outlets and links to source documents for each case. Using this dataset, we will identify which text features are important, as well as which classifiers are effective, in detecting Korean fake news.
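
A minimal sketch of step 1 and step 2 with scikit-learn, using TF-IDF features and logistic regression as one of the candidate classifiers; the toy documents stand in for the FactCheck corpus, and a real Korean pipeline would additionally need morpheme-level tokenization.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-ins for the FactCheck corpus (1 = fake, 0 = real); real Korean
# text would first be tokenized at the morpheme level
docs = ["claim text one", "claim text two", "claim text three", "claim four"]
labels = [1, 0, 1, 0]

# Step 1: quantify the text (TF-IDF); step 2: train a classifier
clf = make_pipeline(TfidfVectorizer(max_features=5000), LogisticRegression())
clf.fit(docs, labels)
print(clf.predict(["a new claim to verify"]))
```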

Keywords: fake news detection, Korean news, machine learning, text mining

Procedia PDF Downloads 275
170 The Relationship Between Beauty Bloggers and the Consumption Patterns of Female Followers: A Case Study on Instagram Pages of Beauty Bloggers

Authors: Reyhane Abdollahi

Abstract:

The beauty of appearance has been important in people's lives since the beginning of history. In every era, beauty has had a specific meaning, and individuals have represented the standards of beauty of their period. According to statistics, the beauty industry has experienced significant economic growth in recent decades, with projections indicating it will reach $583 billion by 2027. The emergence of social media, backed by technological advancements, has created a suitable platform for various beauty brands to engage in economic activities. It can be said that today, beauty bloggers represent the beauty standards of society, actively engaging on social media platforms such as Instagram. Beauty bloggers promote cosmetic and skin care products in front of the camera in their ideal state, utilizing their skills. Instagram, with its limited two-way communication between users and influencers, has also created a suitable environment for advertising. The aim of this research is to study the relationship between beauty bloggers and the consumption patterns of their female followers. The research was conducted through interviews with ten women over the age of 20 who have followed these pages for three years or more, and the findings were analyzed using qualitative content analysis. According to the findings, beauty bloggers encourage women to purchase cosmetic products by creating a sense of identification through sharing their experiences. Beauty bloggers generate a false sense of need for consumption among their audience by promoting beauty products. The feeling of inadequacy stemming from women's comparisons with bloggers who always appear beautiful leads women to try to imitate the consumption habits and appearance of these bloggers.

Keywords: beauty blogger, Instagram, beauty, consumption

Procedia PDF Downloads 9
169 Computer Countenanced Diagnosis of Skin Nodule Detection and Histogram Augmentation: Extracting System for Skin Cancer

Authors: S. Zith Dey Babu, S. Kour, S. Verma, C. Verma, V. Pathania, A. Agrawal, V. Chaudhary, A. Manoj Puthur, R. Goyal, A. Pal, T. Danti Dey, A. Kumar, K. Wadhwa, O. Ved

Abstract:

Background: Skin cancer has become a pressing issue in medical science, and this epidemic of skin lesions is taking a drastic toll on health and well-being across the global village. Methods: The extracted image of a skin tumor cannot be used directly for diagnosis, since the stored image contains irregularities. Our approach first locates the foreground of the extracted skin image, and an image-partitioning (segmentation) model is presented to sort out the disturbances in the picture. Results: After partitioning, feature extraction is performed using a genetic algorithm (GA); finally, classification is performed between the trained and test data to evaluate a large set of images, which helps doctors make the right prediction. To improve on the existing system, we set our objectives with an analysis: the efficiency of the selection process and the enriched histogram are essential in that respect. GA is applied to reduce the false-positive rate while maintaining accuracy. Conclusions: The objective of this work is to improve effectiveness, and GA accomplishes this task by bringing down the false-positive rate. The paper's contribution combines deep learning and medical image processing, which provides superior accuracy, and this type of processing offers reusability without errors.

Keywords: computer-aided system, detection, image segmentation, morphology

Procedia PDF Downloads 150
168 Beyond the Jingoism of “Infodemic” in the Use of Language: Prospects for a Better Nigeria

Authors: Anacletus Ogbunkwu

Abstract:

It is very disheartening that fake news and inaccurate information spread like wildfire, often faster than fact-based news and information. The peak of this anomaly is manifest in information management around the coronavirus pandemic, political and leadership information, ethnic bigotry, unwarranted panics, false alarms, religious fanaticism, business moguls' advertorials, comedies, etc. This ugly situation has left Nigeria and her citizens with emotional trauma, unguided agitations, incessant tribal conflicts, loss of life and property, widened disunity among Nigerian ethnic and religious groups, amplified insecurity, and election violence. Unfortunately, among the major drivers of this misinformation and conspiracy are official/government and private news agencies, gossip, comedians, and social media platforms such as Facebook, Twitter, WhatsApp, Instagram, and online news outlets. This paper therefore examines the impact of misinformation, here referred to as 'infodemic'. It also studies the epistemic effect of misinformation on the citizens of Nigeria in order to find ways of abating this anomaly for a better society. The methods of exposition and hermeneutics are used to gain an in-depth understanding of the details of the infodemic in Nigeria and to offer a philosophical analysis and interpretation of the data gathered, respectively. This paper concludes that misinformation, or fake news, has the perilous effect of epistemic mistrust on Nigeria and her citizens; hence, the infodemic is a cog in the wheel of national progress.

Keywords: Nigeria, infodemic, language, media, news, progress

Procedia PDF Downloads 118
167 Robust Heart Rate Estimation from Multiple Cardiovascular and Non-Cardiovascular Physiological Signals Using Signal Quality Indices and Kalman Filter

Authors: Shalini Rankawat, Mansi Rankawat, Rahul Dubey, Mazad Zaveri

Abstract:

Physiological signals such as the electrocardiogram (ECG) and arterial blood pressure (ABP) in the intensive care unit (ICU) are often seriously corrupted by noise, artifacts, and missing data, which lead to errors in heart rate (HR) estimation and to false alarms from ICU monitors. Clinical support in the ICU requires the most reliable heart rate estimation possible. Cardiac activity, because of its relatively high electrical energy, may introduce artifacts into electroencephalogram (EEG), electrooculogram (EOG), and electromyogram (EMG) recordings. This paper presents a robust heart rate estimation method based on detecting the R-peaks of ECG artifacts in EEG, EMG, and EOG signals, using an energy-based function and a novel Signal Quality Index (SQI) assessment technique. The SQIs of the physiological signals (EEG, EMG, and EOG) were obtained by correlating the nonlinear energy operator (Teager energy) of these signals with either the ECG or the ABP signal. HR is estimated from the ECG, ABP, EEG, EMG, and EOG signals by separate Kalman filters based on the individual SQIs. Data fusion of the HR estimates is then performed by weighting each estimate by its Kalman filter's SQI-modified innovations. The fused HR estimate is more accurate and robust than any of the individual estimates. The method was evaluated on the MIMIC II database of PhysioNet, containing bedside monitor data from ICU patients, and provides an accurate HR estimate even in the presence of noise and artifacts.
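
A minimal sketch of the Teager-Kaiser energy operator and a correlation-based SQI, as one plausible reading of the paper's SQI definition; the exact windowing and normalization are assumptions.

```python
import numpy as np

def teager_energy(x):
    """Discrete Teager-Kaiser energy operator:
    psi[n] = x[n]^2 - x[n-1] * x[n+1]."""
    x = np.asarray(x, dtype=float)
    return x[1:-1] ** 2 - x[:-2] * x[2:]

def sqi(channel, reference):
    """SQI as the correlation between the Teager energy of a channel
    (EEG/EMG/EOG) and of a reference (ECG or ABP); equal lengths assumed."""
    a, b = teager_energy(channel), teager_energy(reference)
    return abs(np.corrcoef(a, b)[0, 1])
```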

Keywords: ECG, ABP, EEG, EMG, EOG, ECG artifacts, Teager-Kaiser energy, heart rate, signal quality index, Kalman filter, data fusion

Procedia PDF Downloads 696
166 Origamic Forms: A New Realm in Improving Acoustical Environment

Authors: Mostafa Refat Ismail, Hazem Eldaly

Abstract:

Adapting architectural design to building function is increasingly necessary in contemporary practice, especially given the great progress in design methods and tools. This, in turn, requires great flexibility in design strategies, as well as a wider spectrum of space settings, to achieve the environment that particular activities demand. Acoustics is an essential factor influencing cognitive acts and behavior as well as, at the extreme end, physical well-being inside a space. The complexity of this constraint is compounded by the extended geometric dimensions of multipurpose halls, making acoustic adequacy a great concern that cannot easily be achieved for every purpose. To achieve a performance-oriented acoustic environment, various parametrically shaped false ceilings based on the origami folding notion are simulated. These parametric origami shapes can fold and unfold, forming an interactive structure that changes the acoustic environment according to the geometric shapes' positions and their changing exposed surface areas. The mobility of the facets in the origami surface spans the range from a completely plain surface to an unfolded element in which a considerable amount of absorption is added to the space. The behavior of the parametric origami shapes is modeled using a ray-tracing computer simulation package for various shape topologies. The conclusion shows a great variation in acoustical performance due to the variation in the folding faces of the origami surfaces, which causes different reflections and, consequently, large variations in decay curves.

Keywords: parametric, origami, acoustics, architecture

Procedia PDF Downloads 285
165 Resiliency in Fostering: A Qualitative Study of Highly Experienced Foster Parents

Authors: Ande Nesmith

Abstract:

There is an ongoing shortage of foster parents worldwide to take on a growing population of children in need of out-of-home care. Currently, resources are primarily aimed at recruitment rather than retention. Retention rates are extraordinarily low, especially in the first two years of fostering. Qualitative interviews with 19 foster parents averaging 20 years of service provided insight into the challenges they faced and how they overcame them. Thematic analysis of interview transcripts identified sources of stress and resiliency. Key stressors included lack of support and responsiveness from the children’s social workers, false maltreatment allegations, and secondary trauma from children’s destructive behaviors and emotional dysregulation. Resilient parents connected with other foster parents for support, engaged in creative problem-solving, recognized that positive feedback from children usually arrives years later, and through training, understood the neurobiological impact of trauma on child behavior. Recommendations include coordinating communication between the foster parent licensing agency social workers and the children’s social workers, creating foster parent support networks and mentoring, and continuous training on trauma including effective parenting strategies. Research is needed to determine whether these resilience indicators in fact lead to long-term retention. Policies should include a mechanism to develop a cohesive line of communication and connection between foster parents and the children’s social workers as well as their respective agencies.

Keywords: foster care stability, foster parent burnout, foster parent resiliency, foster parent retention, trauma-informed fostering

Procedia PDF Downloads 350
164 Development of Star Image Simulator for Star Tracker Algorithm Validation

Authors: Zoubida Mahi

Abstract:

A successful satellite mission in space requires a reliable attitude and orbit control system to command, control, and position the satellite in appropriate orbits. Several sensors are used for attitude control, such as magnetic sensors, earth sensors, horizon sensors, gyroscopes, and solar sensors. The star tracker is the most accurate of these sensors and is able to offer high-accuracy attitude control without the need for prior attitude information. There are mainly three approaches in star sensor research: digital simulation, hardware-in-the-loop simulation, and field tests of star observation. In the digital simulation approach, all processes are carried out in software, including star image simulation. Hence, it is necessary to develop star image simulation software that can reproduce real space environments and various star sensor configurations. In this paper, we present a new stellar image simulation tool used to test and validate star sensor algorithms; the developed tool allows the simulation of stellar images with several types of noise, such as background noise, Gaussian noise, Poisson noise, and multiplicative noise, and several scenarios that exist in space, such as the presence of the Moon, optical-system defects, stray illumination, and false objects. We also present a new star extraction algorithm based on a new centroid calculation method. We compared our algorithm with other star extraction algorithms from the literature, and the results obtained show the star extraction capability of the proposed algorithm.
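
A tiny sketch of the noise-injection and centroid steps: a synthetic star is corrupted with Poisson and Gaussian noise, and an intensity-weighted centroid is computed in a window around the brightest pixel. All sizes and noise levels are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[30:33, 40:43] = 500.0                      # synthetic star blob

noisy = rng.poisson(img + 10).astype(float)    # Poisson signal + background
noisy += rng.normal(0.0, 2.0, img.shape)       # additive Gaussian read noise

# Intensity-weighted centroid in a 7x7 window around the brightest pixel
y0, x0 = np.unravel_index(np.argmax(noisy), noisy.shape)
win = noisy[y0 - 3:y0 + 4, x0 - 3:x0 + 4]
ys, xs = np.mgrid[y0 - 3:y0 + 4, x0 - 3:x0 + 4]
cy, cx = (win * ys).sum() / win.sum(), (win * xs).sum() / win.sum()
print(cy, cx)   # close to the blob center (31, 41)
```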

Keywords: star tracker, star simulation, star detection, centroid, noise, scenario

Procedia PDF Downloads 96
163 A Caged Bird Set Free: The Women Saviors in Fae Myenne Ng's Steer Toward Rock

Authors: Hei Yuen Pak

Abstract:

Steer Toward Rock, Fae Myenne Ng's second novel after the national bestseller Bone, is often superficially read as a story of pessimism, a reading that underestimates the sophistication of Ng's portrayal. It is frequently summarized as a "heartbreaking novel of unrequited love" or "a story timeless and tragic"; yet Ng's novel conveys more than a mere sense of tragedy and heartbreak, offering instead an overflowing warmth and optimism. Ng is credited with "illuminating a part of U.S. history few are aware of" - the false identities established on paper relationships. Toward the end of the novel, this falsity enlightens the male protagonist, Jack Moon Szeto, to an ultimate realization of "truthfulness" to himself, under the escort of the female characters. This paper investigates how Ng's depiction subverts the traditional sex/gender system as well as the patriarchal savior stereotype. It mainly examines the characterization of, and the relations among, the four major characters: Jack Moon Szeto, Joice Qwan, Veda Qwan, and Ilin Cheung. Deploying the feminist theories of Kate Millett, Marilyn French, and Mary Daly, the first half of the essay elucidates the power relations between Jack and the three women, Joice, Veda, and Ilin, in terms of gender and sexuality. After analyzing these relations, the second half turns on how Jack, the male caged bird, is set free by the epiphany derived from the three female characters. With reference to Jean-Paul Sartre's and Simone de Beauvoir's existentialist perspectives, I argue that Jack is transformed from, in Sartre's terms, being-for-others to being-for-itself. Hence, the caged bird is set free by the women saviors.

Keywords: Fae Myenne Ng, gender and sexuality, feminism, power relations

Procedia PDF Downloads 572
162 Efficient Human Motion Detection Feature Set by Using Local Phase Quantization Method

Authors: Arwa Alzughaibi

Abstract:

Human motion detection is a challenging task due to a number of factors, including variable appearance, posture, and a wide range of illumination conditions and backgrounds. The first need of such a model is therefore a reliable feature set that can discriminate between a human and a non-human form with a fair amount of confidence, even under difficult conditions. With richer representations, the classification task becomes easier and improved results can be achieved. The aim of this paper is to investigate reliable and accurate human motion detection models that can detect human motion under varying illumination levels and backgrounds. Different feature sets are tried and tested, including Histogram of Oriented Gradients (HOG), the Deformable Parts Model (DPM), Locally Decorrelated Channel Features (LDCF), and Aggregate Channel Features (ACF). We propose an efficient and reliable human motion detection approach that combines Histogram of Oriented Gradients (HOG) and Local Phase Quantization (LPQ) as the feature set, and implements a search pruning algorithm based on optical flow to reduce the number of false positives. Experimental results show that combining the local phase quantization descriptor with the histogram of oriented gradients performs well over a large range of illumination conditions and backgrounds compared with state-of-the-art human detectors. The Area under the ROC Curve (AUC) of the proposed method reached 0.781 on the UCF dataset and 0.826 on the CDW dataset, indicating that it performs comparably better than the HOG, DPM, LDCF, and ACF methods.
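
A sketch of the HOG half of the descriptor with scikit-image; LPQ has no scikit-image implementation, so the lpq() call is left as a hypothetical placeholder for a separate implementation.

```python
import numpy as np
from skimage.feature import hog

# A grayscale detection window (e.g. 128x64 pixels), values in [0, 1]
window = np.random.rand(128, 64)

hog_vec = hog(window, orientations=9, pixels_per_cell=(8, 8),
              cells_per_block=(2, 2), block_norm="L2-Hys")

# lpq() is a placeholder: LPQ is not in scikit-image and would come from a
# separate implementation before concatenation with the HOG vector
# feature = np.concatenate([hog_vec, lpq(window)])
print(hog_vec.shape)
```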

Keywords: human motion detection, histograms of oriented gradient, local phase quantization

Procedia PDF Downloads 257
161 Chaotic Sequence Noise Reduction and Chaotic Recognition Rate Improvement Based on Improved Local Geometric Projection

Authors: Rubin Dan, Xingcai Wang, Ziyang Chen

Abstract:

A chaotic time series noise reduction method based on the fusion of the local projection method, the wavelet transform, and the particle swarm algorithm (referred to as the LW-PSO method) is proposed to address false recognition caused by noise when recognizing noisy chaotic time series. The method first uses phase space reconstruction to recover the characteristics of the original dynamical system and removes the noise subspace by selecting the neighborhood radius; it then uses the wavelet transform to remove the D1-D3 high-frequency components, retaining as much signal information as possible, while least-squares optimization is performed by the particle swarm algorithm. The Lorenz system containing 30% Gaussian white noise is simulated for verification, and the phase space, SNR value, RMSE value, and K value of the 0-1 test method before and after noise reduction are compared across the Schreiber method, the local projection method, the wavelet transform method, and the LW-PSO method, demonstrating that the LW-PSO method has a better noise reduction effect than the other three common methods. The method is also applied to a classical system to evaluate the noise reduction effect of the four methods and the identification of the original system, further verifying the superiority of the LW-PSO method. Finally, it is applied to the Chengdu rainfall chaotic sequence, and the results prove that the LW-PSO method can effectively reduce the noise and improve the chaos recognition rate.
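
The wavelet step (zeroing the D1-D3 detail coefficients) can be sketched with PyWavelets as below; the wavelet family, decomposition level, and test signal are illustrative assumptions.

```python
import numpy as np
import pywt

# Noisy test signal standing in for a chaotic sequence
t = np.linspace(0, 20 * np.pi, 4096)
x = np.sin(t) + 0.3 * np.random.randn(t.size)

# Decompose past level 3 and zero the D1-D3 (finest) detail coefficients
coeffs = pywt.wavedec(x, "db4", level=5)
for lvl in (-1, -2, -3):          # coeffs[-1] = D1, coeffs[-2] = D2, ...
    coeffs[lvl] = np.zeros_like(coeffs[lvl])
denoised = pywt.waverec(coeffs, "db4")
```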

Keywords: Schreiber noise reduction, wavelet transform, particle swarm optimization, 0-1 test method, chaotic sequence denoising

Procedia PDF Downloads 199
160 Infrastructure Change Monitoring Using Multitemporal Multispectral Satellite Images

Authors: U. Datta

Abstract:

The main objective of this study is to find a suitable approach to monitoring land infrastructure growth over a period of time using multispectral satellite images. Bi-temporal change detection methods cannot capture the continuous change occurring over a long period of time. To achieve this objective, the approach used here estimates a statistical model from a series of multispectral images acquired over a long period, assuming no considerable change occurs during that period, and then compares it with multispectral image data obtained at a later time. The change is estimated pixel-wise. A statistical composite hypothesis technique is used for pixel-based change detection in a defined region: the generalized likelihood ratio test (GLRT) detects changed pixels from the probabilistically estimated model of the corresponding pixel. Changed pixels are detected under the assumption that the images have been co-registered prior to estimation; to minimize residual co-registration error, the 8-neighborhood pixels around the pixel under test are also considered. Multispectral images from Sentinel-2 and Landsat-8 acquired between 2015 and 2018 are used for this purpose. There are several challenges in this method. First and foremost is obtaining a sufficiently large number of datasets for multivariate distribution modeling, since many images must be discarded due to cloud coverage. In addition, imperfect modeling leads to a high probability of false alarms. The overall conclusion that can be drawn from this work is that the probabilistic method described in this paper has given promising results, which merit further pursuit.
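
A minimal per-pixel sketch of the idea: under a diagonal-covariance Gaussian model estimated from the change-free stack, the GLRT statistic reduces to a sum of squared standardized band deviations. Array shapes, the threshold, and the diagonal-covariance simplification are assumptions, and the 8-neighborhood step is omitted.

```python
import numpy as np

# stack: T co-registered, change-free acquisitions (T, H, W, B);
# new: a later acquisition (H, W, B); all shapes are illustrative
T, H, W, B = 24, 100, 100, 4
stack = np.random.randn(T, H, W, B)
new = np.random.randn(H, W, B) + 2.0 * (np.random.rand(H, W, B) > 0.99)

mu = stack.mean(axis=0)
var = stack.var(axis=0) + 1e-6     # per-pixel, per-band Gaussian model

# Under a diagonal-covariance Gaussian, the GLRT statistic reduces to a
# sum of squared standardized deviations over the spectral bands
stat = (((new - mu) ** 2) / var).sum(axis=-1)
changed = stat > 30.0              # threshold from a chi-square tail / target FAR
print(changed.sum(), "pixels flagged")
```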

Keywords: co-registration, GLRT, infrastructure growth, multispectral, multitemporal, pixel-based change detection

Procedia PDF Downloads 135
159 Computer Aided Diagnosis Bringing Changes in Breast Cancer Detection

Authors: Devadrita Dey Sarkar

Abstract:

Regardless of the many technological advances of the past decade, increased training and experience, and the obvious benefits of uniform standards, the false-negative rate in screening mammography remains unacceptably high. A computer-aided neural network classification of regions of suspicion (ROS) on digitized mammograms is presented in this paper, employing features extracted by a new technique based on independent component analysis. CAD is a concept that considers the roles of physicians and computers equally, whereas automated computer diagnosis relies on computer algorithms only. With CAD, the performance of the computer does not have to be comparable to or better than that of physicians, but needs to be complementary to it. In fact, a large number of CAD systems have been employed to assist physicians in the early detection of breast cancers on mammograms. A CAD scheme that makes use of lateral breast images has the potential to improve overall performance in the detection of breast lumps. Because breast lumps can be detected reliably by computer on lateral breast mammograms, radiologists' accuracy in detecting breast lumps would be improved by the use of CAD, and thus early diagnosis of breast cancer would become possible. In the future, many CAD schemes could be assembled as packages and implemented as part of PACS. For example, a package for breast CAD might include the computerized detection of breast nodules as well as the computerized classification of benign and malignant nodules. To assist in differential diagnosis, these CAD systems would make it possible to search for and retrieve images (or lesions), providing a reliable and useful method for quantifying the similarity of a pair of images for visual comparison by radiologists.

Keywords: CAD (computer-aided diagnosis), lesions, neural network, ROS (region of suspicion)

Procedia PDF Downloads 456
158 Evaluation of Firearm Injury Syndromic Surveillance in Utah

Authors: E. Bennion, A. Acharya, S. Barnes, D. Ferrell, S. Luckett-Cole, G. Mower, J. Nelson, Y. Nguyen

Abstract:

Objective: This study aimed to evaluate the validity of a firearm injury query in the Early Notification of Community-based Epidemics syndromic surveillance system. Syndromic surveillance data are used at the Utah Department of Health for early detection of and rapid response to unusually high rates of violence and injury, among other health outcomes. The query of interest was defined by the Centers for Disease Control and Prevention and used chief complaint and discharge diagnosis codes to capture initial emergency department encounters for firearm injury of all intents. Design: Two epidemiologists manually reviewed electronic health records of emergency department visits captured by the query from April-May 2020, compared results, and sent conflicting determinations to two arbiters. Results: Of the 85 unique records captured, 67 were deemed probable, 19 were ruled out, and two were undetermined, resulting in a positive predictive value of 75.3%. Common reasons for false positives included non-initial encounters and misleading keywords. Conclusion: Improving the validity of syndromic surveillance data would better inform outbreak response decisions made by state and local health departments. The firearm injury definition could be refined to exclude non-initial encounters by negating words such as “last month,” “last week,” and “aftercare”; and to exclude non-firearm injury by negating words such as “pellet gun,” “air gun,” “nail gun,” “bullet bike,” and “exit wound” when a firearm is not mentioned.
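
A toy sketch of the proposed keyword negation, filtering chief-complaint text before it matches a firearm pattern; the negation terms come from the abstract, while the firearm regex is our illustrative stand-in, not the CDC definition.

```python
import re

# Negation terms proposed in the abstract; the firearm pattern itself is an
# assumed stand-in for the CDC chief-complaint/diagnosis-code definition
NEGATE = ["last month", "last week", "aftercare",
          "pellet gun", "air gun", "nail gun", "bullet bike", "exit wound"]
FIREARM = re.compile(r"\b(gsw|gunshot|firearm|shot)\b", re.IGNORECASE)

def flag_firearm_injury(chief_complaint: str) -> bool:
    text = chief_complaint.lower()
    if any(term in text for term in NEGATE):
        return False                     # likely non-initial or non-firearm
    return bool(FIREARM.search(text))

print(flag_firearm_injury("GSW to left leg"))            # True
print(flag_firearm_injury("nail gun injury to hand"))    # False
```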

Keywords: evaluation, health information system, firearm injury, syndromic surveillance

Procedia PDF Downloads 166
157 Organotin (IV) Based Complexes as Promiscuous Antibacterials: Synthesis in vitro, in Silico Pharmacokinetic, and Docking Studies

Authors: Wajid Rehman, Sirajul Haq, Bakhtiar Muhammad, Syed Fahad Hassan, Amin Badshah, Muhammad Waseem, Fazal Rahim, Obaid-Ur-Rahman Abid, Farzana Latif Ansari, Umer Rashid

Abstract:

Five novel triorganotin (IV) compounds have been synthesized and characterized. The tin atom is penta-coordinated, assuming a trigonal-bipyramidal geometry. Using in silico-derived parameters, the objective of our study is to design and synthesize promiscuous antibacterials potent enough to combat resistance. Among the various synthesized organotin (IV) complexes, compound 5 was found to be a potent antibacterial agent against various bacterial strains. Further lead optimization of drug-like properties was evaluated through in silico predictions. Data mining and computational analysis were utilized to characterize compound promiscuity in order to reduce the drug attrition rate in antibacterial design. Xanthine oxidase and human glucose-6-phosphatase were found to be the only true positive off-target hits by the ChEMBL database and others utilizing the similarity ensemble approach. Propensity towards the a-3 receptor, human macrophage migration factor, and thiazolidinedione were found to be false positive off-targets, with E-value > 10^-4, for compounds 1, 3, and 4. Furthermore, the positive drug-drug interaction of compound 1 as a uricosuric was validated by all databases and by docked protein targets with sequence similarity and compositional matrix alignment via BLAST software. The promiscuity of compound 5 was further confirmed by in silico binding to different antibacterial targets.

Keywords: antibacterial activity, drug promiscuity, ADMET prediction, metallo-pharmaceutical, antimicrobial resistance

Procedia PDF Downloads 503
156 Design Study on a Contactless Material Feeding Device for Electro Conductive Workpieces

Authors: Oliver Commichau, Richard Krimm, Bernd-Arno Behrens

Abstract:

A growing demand on the production rate of modern presses leads to higher stroke rates. Commonly used material feeding devices for presses, such as grippers and roll-feeding systems, can achieve high stroke rates only with high gripping forces to avoid stick-slip. These forces are limited by the sensitivity of the workpiece surfaces: stick-slip leads to scratches on the surface and to false positioning of the workpiece. In this paper, a new contactless feeding device is presented that develops higher feeding forces without damaging the surface of the workpiece through gripping forces. It is based on the principle of the linear induction motor: a primary part creates a magnetic field and induces eddy currents in the electrically conductive material, and a Lorentz force acts on the workpiece in the feeding direction as the mutual reaction between the eddy currents and the magnetic induction. In this study, the FEA model of this approach is shown. Calculations with this model were used to identify the influence of various design parameters on the performance of the feeder, thus showing the promising capabilities and limits of this technology. In order to validate the study, a prototype of the feeding device was built, and an experimental setup was used to measure the pulling forces and placement accuracy of the experimental feeder, giving an outlook on a potential industrial application of this approach.
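
In equation form, the feeding force is the volume integral of the Lorentz force density over the induced eddy currents (an illustrative statement of the principle, not the paper's FEA formulation):

```latex
% Lorentz force density on the induced eddy currents and the resulting net
% feeding force (illustrative principle, not the paper's FEA formulation)
\mathbf{f} = \mathbf{J} \times \mathbf{B}, \qquad
\mathbf{F}_{\mathrm{feed}} = \int_{V} \mathbf{J}(\mathbf{r}) \times \mathbf{B}(\mathbf{r}) \,\mathrm{d}V
```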

Keywords: conductive material, contactless feeding, linear induction, Lorentz force

Procedia PDF Downloads 179
155 Prediction of Solanum Lycopersicum Genome Encoded microRNAs Targeting Tomato Spotted Wilt Virus

Authors: Muhammad Shahzad Iqbal, Zobia Sarwar, Salah-ud-Din

Abstract:

Tomato spotted wilt virus (TSWV) belongs to the genus Tospovirus (family Bunyaviridae). It is one of the most devastating pathogens of tomato (Solanum lycopersicum) and heavily damages crop yields around the globe each year. In this study, we retrieved 329 mature miRNA sequences from two microRNA databases (miRBase and miRSoldb) and checked for putative target sites in the downloaded genome sequence of TSWV. A consensus of three miRNA target prediction tools (RNA22, miRanda, and psRNATarget) was used to screen out false-positive microRNA target sites in the TSWV genome. These tools identify target sites by calculating minimum free energy (mfe), site complementarity, minimum folding energy, and other microRNA-mRNA binding factors. The R language was used to plot the predicted target-site data. All genes having possible target sites for different miRNAs were screened by building a consensus table. Of the 329 mature miRNAs evaluated by the three algorithms, only eight miRNAs met all the criteria/threshold specifications. MC-Fold and MC-Sym were used to predict three-dimensional structures of the miRNAs, which were further analyzed in UCSF Chimera to visualize the structural and conformational changes before and after microRNA-mRNA interaction. The results of the current study show that the eight predicted miRNAs could be evaluated further by in vitro experiments to develop TSWV-resistant transgenic tomato plants in the future.
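
The consensus screening step amounts to intersecting the per-tool prediction sets, as in the sketch below; the (miRNA, target) pairs shown are placeholders, with TSWV gene names used purely for illustration.

```python
# Per-tool (miRNA, target) predictions; a real pipeline would parse RNA22,
# miRanda and psRNATarget outputs into sets like these (TSWV gene symbols
# such as N, NSs, L are used purely for illustration)
rna22       = {("miR-1", "N"), ("miR-2", "NSs"), ("miR-3", "L")}
miranda     = {("miR-1", "N"), ("miR-2", "NSs"), ("miR-4", "GP")}
psrnatarget = {("miR-1", "N"), ("miR-2", "NSs"), ("miR-5", "L")}

# Keep only sites all three algorithms agree on, screening out likely
# false positives
consensus = rna22 & miranda & psrnatarget
print(sorted(consensus))   # [('miR-1', 'N'), ('miR-2', 'NSs')]
```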

Keywords: tomato spotted wilt virus (TSWV), Solanum lycopersicum, plant virus, miRNAs, microRNA target prediction, mRNA

Procedia PDF Downloads 155
154 Virtual Reality as a Method in Transformative Learning: A Strategy to Reduce Implicit Bias

Authors: Cory A. Logston

Abstract:

It is imperative that researchers continue to explore every transformative strategy to increase empathy and awareness of racial bias. Racism is a social and political concept that uses stereotypical ideology to highlight racial inequities, and everyone carries biases they may not be aware of toward disparate out-groups. There is some form of racism in every profession; doctors, lawyers, and teachers are not immune. There have been numerous successful and unsuccessful strategies to motivate and transform an individual's unconscious biased attitudes. One method designed to induce a transformative experience and identify implicit bias is virtual reality (VR), a technology designed to transport the user into a three-dimensional environment. In the virtual reality simulation used here, the viewer is immersed in a realistic interactive video, taking on the perspective of a Black man and experiencing discrimination in various life circumstances from childhood into adulthood: for instance, prejudice felt in school, encounters with the police as an adolescent, and false accusations in the workplace. Current research suggests that an immersive VR simulation can enhance self-awareness and become a transformative learning experience. This study uses virtual reality immersion and transformative learning theory to create empathy and identify unintentional racial bias. Participants, White teachers, will experience a VR immersion to create awareness and identify implicit biases regarding Black students. The desired outcome is a springboard for participants to reconceptualize their own implicit bias. Virtual reality is gaining traction in the research world and promises to be an effective tool in the transformative learning process.

Keywords: empathy, implicit bias, transformative learning, virtual reality

Procedia PDF Downloads 194
153 Identifying Issues of Corporate Governance and the Effect on Organizational Performance

Authors: Abiodun Oluwaseun Ibude

Abstract:

Every now and then we hear of companies closing down their operations due to unethical practices such as overstating the company's balance sheet, concealing company debt, embezzling company funds, declaring false profits, and so on. This has led to the liquidation of companies and the loss of shareholders' investments, as well as the interests of other stakeholders. As a result of these ugly trends, there is a need to put in place a formidable mechanism to ensure that business activities are conducted in a healthy manner. It should also promote good ethics and ensure that the interests of stakeholders and the objectives of the organization are achieved within the confines of the law, wherein the law provides criminal penalties for the falsification of documents and for other irregularities. Against this background, it becomes imperative to take steps to stop this menace and face the challenges ahead, which calls for the practice of good governance. The purpose of this study is to identify the various components of corporate governance and determine their impact on the performance of established organizations. A survey method with a questionnaire was applied to collect data for this study, which were then analyzed using correlation coefficient statistics to generate findings, draw conclusions, and make recommendations. The research found that there are systems within organizations, apart from regulatory agencies, that ensure effective control of activities, promote accountability, and support operational efficiency. However, some members of organizations fail to make use of corporate governance, which impacts negatively on organizational performance. In conclusion, good corporate governance will not be achieved unless there is openness, honesty, transparency, accountability, and fairness.

Keywords: corporate governance, formidable mechanism, company’s balance sheet, stakeholders

Procedia PDF Downloads 115
152 Development of an Implicit Physical Influence Upwind Scheme for Cell-Centered Finite Volume Method

Authors: Shidvash Vakilipour, Masoud Mohammadi, Rouzbeh Riazi, Scott Ormiston, Kimia Amiri, Sahar Barati

Abstract:

An essential component of a finite volume method (FVM) is the advection scheme, which estimates values on the cell faces from the calculated values at the nodes or cell centers. The most widely used advection schemes are upwind schemes, which have been developed in FVM on various kinds of structured and unstructured grids. In this research, the physical influence scheme (PIS) is developed for a cell-centered FVM that uses an implicit coupled solver. Results are compared with the exponential differencing scheme (EDS) and the skew upwind differencing scheme (SUDS). The accuracy of these schemes is evaluated for lid-driven cavity flow at Re = 1000, 3200, and 5000 and for backward-facing step flow at Re = 800. Simulations show considerable differences between the results of the EDS scheme and the benchmarks, especially for the lid-driven cavity flow at high Reynolds numbers; these differences are caused by false diffusion. Comparing the SUDS and PIS schemes shows relatively close results for the backward-facing step flow and different results for the lid-driven cavity flow. The poor results of SUDS in the lid-driven cavity flow can be related to its lack of sensitivity to the pressure difference between the cell face and the upwind points, which is critical for the prediction of such vortex-dominated flows.
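
For intuition on the false diffusion mentioned above, the following sketch advects a square pulse with the classic first-order upwind scheme in 1-D: the scheme is stable but smears the pulse, the hallmark of numerical (false) diffusion. This toy problem is ours, not the paper's coupled solver.

```python
import numpy as np

# First-order upwind for u_t + a u_x = 0 on a periodic 1-D grid
nx = 200
a = 1.0
dx = 1.0 / nx
dt = 0.4 * dx / a                 # CFL number 0.4, stable
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.where(np.abs(x - 0.3) < 0.05, 1.0, 0.0)   # square pulse

for _ in range(200):
    # Information travels from the left for a > 0, hence backward difference
    u = u - a * dt / dx * (u - np.roll(u, 1))

# The pulse is transported but visibly smeared: numerical (false) diffusion
print(u.max())   # noticeably below 1.0 after the smearing
```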

Keywords: cell-centered finite volume method, coupled solver, exponential differencing scheme (EDS), physical influence scheme (PIS), pressure weighted interpolation method (PWIM), skew upwind differencing scheme (SUDS)

Procedia PDF Downloads 284
151 Enzymatic Repair Prior To DNA Barcoding, Aspirations, and Restraints

Authors: Maxime Merheb, Rachel Matar

Abstract:

The retrieval of ancient DNA sequences, which in turn permits whole-genome sequencing from fossils, has improved extraordinarily in recent years thanks to sequencing technology and other methodological advances. Even so, the quest for ancient DNA is still obstructed by the damage inflicted on DNA, which accumulates after the death of a living organism. This damage falls into three main categories: (i) physical abnormalities such as strand breaks, which lead to the presence of short DNA fragments; (ii) modified bases (mainly cytosine deamination), which cause errors in the sequence due to the incorporation of a false nucleotide during DNA amplification; and (iii) DNA modifications referred to as blocking lesions, which halt PCR extension and thereby also affect the amplification and sequencing process. The issues arising from breakage and coding errors have clearly decreased in recent years: fast sequencing of short DNA fragments has been empowered by high-throughput sequencing platforms, and most coding errors were found to be the consequence of cytosine deamination, which can easily be removed from the DNA using enzymatic treatment. The methodology for repairing DNA sequences is still in development; it can basically be described as the process of reintroducing cytosine in place of uracil, a technique thus restricted to amplified DNA molecules. Eliminating every type of damage (particularly lesions that block PCR) still awaits complete repair methodologies, so DNA detection immediately after extraction is highly needed. Before investing resources in extensive, costly, and uncertain repair techniques, it is vital to distinguish between two possible situations: (i) no amplifiable DNA exists to begin with, so the sample is completely un-repairable; or (ii) the DNA is refractory to PCR and is worth repairing and amplifying. Hence, it is extremely important to develop a non-enzymatic technique to detect even the most degraded DNA.

Keywords: ancient DNA, DNA barcoding, enzymatic repair, PCR

Procedia PDF Downloads 400
150 Automated Fact-Checking by Incorporating Contextual Knowledge and Multi-Faceted Search

Authors: Wenbo Wang, Yi-Fang Brook Wu

Abstract:

The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means to address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study introduces a framework, SAC, which focuses on: 1) augmenting the representation of a claim by incorporating additional context from general-purpose, comprehensive, and authoritative data; 2) developing a search function to automatically select relevant, new, and credible references; and 3) focusing on the parts of the representations of a claim and its reference that are most relevant to the fact-checking task. The experimental results demonstrate that: 1) augmenting the representations of claims and references using a knowledge base, combined with the multi-head attention technique, improves fact-checking performance; and 2) SAC with auto-selected references outperforms existing fact-checking approaches that use manually selected references. Future directions of this study include (i) exploring knowledge graphs in Wikidata to dynamically augment the representations of claims and references without introducing too much noise, and (ii) exploring semantic relations in claims and references to further enhance fact-checking.
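
One way point 3 could look in code is cross-attention from claim tokens to reference tokens with multi-head attention, as in this PyTorch sketch; the dimensions and the exact fusion are assumptions, not the SAC architecture.

```python
import torch
import torch.nn as nn

d_model, n_heads = 256, 8
attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

# claim: (batch, claim_len, d_model) token representations of the claim;
# refs:  (batch, ref_len, d_model) representations of the selected reference
claim = torch.randn(2, 32, d_model)
refs = torch.randn(2, 128, d_model)

# Claim tokens attend over the reference, up-weighting the evidence spans
# most relevant to verifying the claim
fused, weights = attn(query=claim, key=refs, value=refs)
print(fused.shape)   # torch.Size([2, 32, 256])
```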

Keywords: fact checking, claim verification, deep learning, natural language processing

Procedia PDF Downloads 62
149 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things

Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker

Abstract:

Traditionally in sensor networks, and more recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect change in the stochastic process as quickly as possible with a tolerable false alarm rate. However, sensors may have different accuracies and sensitivity ranges, and they decay over time; as a result, the collected big time-series data contain uncertainties and are sometimes conflicting. In this study, we present a framework that takes advantage of the capabilities of Evidence Theory (a.k.a. the Dempster-Shafer and Dezert-Smarandache Theories) for representing and managing uncertainty and conflict, enabling fast change detection and effectively dealing with complementary hypotheses. Specifically, the Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and the related combination rules are applied to combine the mass values among all sensors. Furthermore, we applied the method to estimate the minimum number of sensors whose outputs need to be combined, so that computational efficiency could be improved. A cumulative sum (CUSUM) test is then applied to the ratio of pignistic probabilities to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
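
A minimal one-sided CUSUM sketch over per-sample statistics (e.g. log-likelihood ratios, or functions of the pignistic probability ratio as in the paper); the stopping threshold is an assumed tuning parameter.

```python
import numpy as np

def cusum_detect(scores, threshold=8.0):
    """One-sided CUSUM: accumulate per-sample evidence for a change and stop
    the first time the statistic exceeds the threshold. `scores` would be
    log-likelihood ratios or pignistic-probability-ratio statistics."""
    g = 0.0
    for n, s in enumerate(scores):
        g = max(0.0, g + s)
        if g > threshold:
            return n          # declare a change at sample n
    return None               # no change declared

rng = np.random.default_rng(1)
pre = rng.normal(-0.5, 1.0, 300)   # negative drift before the change
post = rng.normal(+0.5, 1.0, 300)  # positive drift after the change
print(cusum_detect(np.concatenate([pre, post])))
```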

Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data

Procedia PDF Downloads 334
148 Epilepsy Seizure Prediction by Effective Connectivity Estimation Using Granger Causality and Directed Transfer Function Analysis of Multi-Channel Electroencephalogram

Authors: Mona Hejazi, Ali Motie Nasrabadi

Abstract:

Epilepsy is a persistent neurological disorder that affects more than 50 million people worldwide. Hence, there is a need for an efficient prediction model that supports a correct diagnosis of epileptic seizures and an accurate prediction of their type. In this study, we consider how Effective Connectivity (EC) patterns obtained from intracranial electroencephalographic (EEG) recordings reveal information about the dynamics of the epileptic brain and can be used to predict imminent seizures, enabling patients (and caregivers) to take appropriate precautions. We use this formulation because we believe that effective connectivity begins to change near seizures, so seizures can be predicted from this feature. Results are reported on the standard Freiburg EEG dataset, which contains data from 21 patients suffering from medically intractable focal epilepsy. Six EEG channels from each patient are considered, and effective connectivity is estimated using the Directed Transfer Function (DTF) and Granger Causality (GC) methods. We concentrate on the standard deviation of effective connectivity over time, and feature changes in five brain frequency sub-bands (alpha, beta, theta, delta, and gamma) are compared. The performance obtained by the proposed scheme in predicting seizures is: an average prediction time of 50 minutes before seizure onset, a maximum sensitivity of approximately 80%, and a false positive rate of 0.33 FP/h. The DTF method is more suitable for predicting epileptic seizures, and in general the best results are observed in the gamma and beta sub-bands. The research in this paper is significantly helpful for clinical applications, especially for the development of online portable devices.
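
Pairwise Granger causality between two channels can be sketched with statsmodels as below; the synthetic lagged signals and the chosen lag are illustrative, not the Freiburg data.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
x = rng.standard_normal(500)                         # "driving" channel
y = np.roll(x, 2) + 0.5 * rng.standard_normal(500)   # lags x by 2 samples

# Column order matters: tests whether column 2 Granger-causes column 1
res = grangercausalitytests(np.column_stack([y, x]), maxlag=5, verbose=False)
f_stat, p_value = res[2][0]["ssr_ftest"][:2]         # F-test at lag 2
print(f"lag-2 Granger F = {f_stat:.1f}, p = {p_value:.3g}")
```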

Keywords: effective connectivity, Granger causality, directed transfer function, epilepsy seizure prediction, EEG

Procedia PDF Downloads 469