Search results for: false positives
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 418

148 Efficient Human Motion Detection Feature Set by Using Local Phase Quantization Method

Authors: Arwa Alzughaibi

Abstract:

Human motion detection is a challenging task due to a number of factors, including variable appearance, posture, and a wide range of illumination conditions and backgrounds. The first requirement of such a model is therefore a reliable feature set that can discriminate between a human and a non-human form with a fair amount of confidence, even under difficult conditions. With richer representations, the classification task becomes easier and improved results can be achieved. The aim of this paper is to investigate human motion detection models that remain accurate under varying illumination levels and backgrounds. Different feature sets are tried and tested, including Histogram of Oriented Gradients (HOG), Deformable Parts Model (DPM), Local Decorrelated Channel Features (LDCF), and Aggregate Channel Features (ACF). We propose an efficient and reliable human motion detection approach that combines HOG and Local Phase Quantization (LPQ) as the feature set and implements a search-pruning algorithm based on optical flow to reduce the number of false positives. Experimental results show that combining the LPQ descriptor with HOG performs well over a larger range of illumination conditions and backgrounds than state-of-the-art human detectors. The area under the ROC curve (AUC) of the proposed method reached 0.781 on the UCF dataset and 0.826 on the CDW dataset, indicating that it performs comparably better than the HOG, DPM, LDCF, and ACF methods.
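A minimal sketch of the LPQ half of the feature set described above (an illustrative simplification of the descriptor, not the authors' implementation; the window size and frequency choice are assumptions):

```python
import numpy as np

def lpq_descriptor(img, win=7):
    """Simplified Local Phase Quantization: compute a short-term Fourier
    transform at four low frequencies with a local window, quantize the
    signs of the real and imaginary parts into an 8-bit code per pixel,
    then histogram the codes over the image (256-bin descriptor)."""
    x = np.arange(win) - win // 2
    f = 1.0 / win                      # lowest non-zero frequency (assumed)
    w0 = np.ones(win)                  # uniform window, zero frequency
    w1 = np.exp(-2j * np.pi * f * x)   # complex exponential at frequency f

    def conv2_sep(image, row, col):
        # separable 2-D convolution via two 1-D convolutions
        tmp = np.apply_along_axis(lambda r: np.convolve(r, row, 'same'), 1, image)
        return np.apply_along_axis(lambda c: np.convolve(c, col, 'same'), 0, tmp)

    # STFT responses at frequencies (f,0), (0,f), (f,f), (f,-f)
    responses = [conv2_sep(img, w1, w0),
                 conv2_sep(img, w0, w1),
                 conv2_sep(img, w1, w1),
                 conv2_sep(img, w1, np.conj(w1))]

    # quantize the signs of real and imaginary parts into an 8-bit code
    code = np.zeros(img.shape, dtype=np.int32)
    for i, resp in enumerate(responses):
        code += (resp.real > 0).astype(np.int32) << (2 * i)
        code += (resp.imag > 0).astype(np.int32) << (2 * i + 1)

    hist = np.bincount(code.ravel(), minlength=256).astype(float)
    return hist / hist.sum()
```

In an approach like the one described, a histogram of this kind would be concatenated with a HOG descriptor (e.g., from scikit-image) before classification.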

Keywords: human motion detection, histograms of oriented gradients, local phase quantization

Procedia PDF Downloads 258
147 Chaotic Sequence Noise Reduction and Chaotic Recognition Rate Improvement Based on Improved Local Geometric Projection

Authors: Rubin Dan, Xingcai Wang, Ziyang Chen

Abstract:

A chaotic time-series noise reduction method based on the fusion of the local projection method, the wavelet transform, and particle swarm optimization (referred to as the LW-PSO method) is proposed to address false recognition caused by noise when identifying chaotic time series. The method first uses phase-space reconstruction to recover the characteristics of the original dynamical system and removes the noise subspace by selecting the neighborhood radius; it then uses the wavelet transform to remove the D1-D3 high-frequency components so as to retain as much signal information as possible, while least-squares optimization is performed by the particle swarm algorithm. The Lorenz system containing 30% Gaussian white noise is simulated and verified, and the phase space, SNR value, RMSE value, and the K value of the 0-1 test method before and after noise reduction are compared and analyzed for the Schreiber method, the local projection method, the wavelet transform method, and the LW-PSO method, which proves that the LW-PSO method has a better noise reduction effect than the other three common methods. The methods are also applied to a classical system to evaluate the noise reduction effect of the four methods and the identification of the original system, further verifying the superiority of the LW-PSO method. Finally, the method is applied to the Chengdu rainfall chaotic sequence, and the results prove that the LW-PSO method can effectively reduce noise and improve the chaos recognition rate.
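The K value of the 0-1 test mentioned above can be sketched as follows (a correlation-based variant of the Gottwald-Melbourne test under an assumed projection constant c; not the paper's exact implementation):

```python
import numpy as np

def zero_one_test(phi, c=1.7, ncut=None):
    """Gottwald-Melbourne 0-1 test for chaos: project the series onto
    translation variables (p, q), compute the mean-square displacement
    M(n), and return K = corr(n, M(n)).
    K near 1 suggests chaos; K near 0 suggests regular dynamics."""
    phi = np.asarray(phi, dtype=float)
    N = len(phi)
    ncut = ncut or N // 10
    j = np.arange(1, N + 1)
    p = np.cumsum(phi * np.cos(j * c))     # translation variable p
    q = np.cumsum(phi * np.sin(j * c))     # translation variable q
    # mean-square displacement for lags n = 1..ncut
    M = np.array([np.mean((p[n:] - p[:-n]) ** 2 + (q[n:] - q[:-n]) ** 2)
                  for n in range(1, ncut + 1)])
    n = np.arange(1, ncut + 1)
    return np.corrcoef(n, M)[0, 1]
```

For a chaotic series M(n) grows roughly linearly (K near 1); for a regular series M(n) stays bounded (K near 0).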

Keywords: Schreiber noise reduction, wavelet transform, particle swarm optimization, 0-1 test method, chaotic sequence denoising

Procedia PDF Downloads 199
146 Infrastructure Change Monitoring Using Multitemporal Multispectral Satellite Images

Authors: U. Datta

Abstract:

The main objective of this study is to find a suitable approach to monitoring land infrastructure growth over a period of time using multispectral satellite images. A bi-temporal change detection method cannot indicate continuous change occurring over a long period of time. To achieve the objective, the approach used here estimates a statistical model from a series of multispectral images acquired over a long period, assuming there is no considerable change during that time, and then compares it with multispectral images obtained at a later time. The change is estimated pixel-wise. A statistical composite-hypothesis technique is used for pixel-based change detection in a defined region: the generalized likelihood ratio test (GLRT) is used to detect changed pixels from the estimated probabilistic model of the corresponding pixel. The changed pixel is detected under the assumption that the images have been co-registered prior to estimation. To minimize errors due to co-registration, the 8-neighborhood pixels around the pixel under test are also considered. Multispectral images from Sentinel-2 and Landsat-8 from 2015 to 2018 are used for this purpose. There are several challenges in this method. The first and foremost is obtaining a sufficiently large number of datasets for multivariate distribution modelling, as a large number of images are always discarded due to cloud coverage. Moreover, imperfect modelling leads to a high probability of false alarms. The overall conclusion that can be drawn from this work is that the probabilistic method described in this paper has given some promising results, which need to be pursued further.
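A toy version of the pixel-wise test might look like the following (a Mahalanobis-distance stand-in for the GLRT statistic, assuming per-pixel Gaussian band models estimated from a co-registered "no-change" stack; the threshold value is an assumption):

```python
import numpy as np

def change_mask(stack, new_img, thresh=16.27):
    """Pixel-wise change detection sketch.
    stack:   (T, H, W, B) temporal stack of co-registered multispectral
             images assumed to contain no change.
    new_img: (H, W, B) later image to test.
    Flags pixels whose squared Mahalanobis distance to the per-pixel
    Gaussian model exceeds a chi-square threshold
    (default roughly chi2.ppf(0.999, df=3) for B = 3 bands)."""
    mu = stack.mean(axis=0)                       # per-pixel band means
    d = stack - mu                                # temporal deviations
    T, H, W, B = stack.shape
    # per-pixel band covariance, regularised for numerical stability
    cov = np.einsum('thwi,thwj->hwij', d, d) / (T - 1)
    cov += 1e-6 * np.eye(B)
    diff = new_img - mu                           # (H, W, B)
    sol = np.linalg.solve(cov, diff[..., None])   # stacked solves
    m2 = np.einsum('hwi,hwi->hw', diff, sol[..., 0])
    return m2 > thresh
```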

Keywords: co-registration, GLRT, infrastructure growth, multispectral, multitemporal, pixel-based change detection

Procedia PDF Downloads 135
145 Computer Aided Diagnosis Bringing Changes in Breast Cancer Detection

Authors: Devadrita Dey Sarkar

Abstract:

Regardless of the many technological advances of the past decade, increased training and experience, and the obvious benefits of uniform standards, the false-negative rate in screening mammography remains unacceptably high. A computer-aided neural network classification of regions of suspicion (ROS) on digitized mammograms is presented in this abstract, which employs features extracted by a new technique based on independent component analysis. CAD is a concept established by taking into account equally the roles of physicians and computers, whereas automated computer diagnosis is a concept based on computer algorithms only. With CAD, the performance of computers does not have to be comparable to or better than that of physicians, but needs to be complementary to it. In fact, a large number of CAD systems have been employed to assist physicians in the early detection of breast cancers on mammograms. A CAD scheme that makes use of lateral breast images has the potential to improve the overall performance in the detection of breast lumps. Because breast lumps can be detected reliably by computer on lateral breast mammograms, radiologists’ accuracy in the detection of breast lumps would be improved by the use of CAD, and thus early diagnosis of breast cancer would become possible. In the future, many CAD schemes could be assembled as packages and implemented as part of PACS. For example, the package for breast CAD may include the computerized detection of breast nodules as well as the computerized classification of benign and malignant nodules. To assist in the differential diagnosis, it would be possible to search for and retrieve images (or lesions) with these CAD systems, which would be a reliable and useful method for quantifying the similarity of a pair of images for visual comparison by radiologists.

Keywords: CAD (computer-aided diagnosis), lesions, neural network, ROS (region of suspicion)

Procedia PDF Downloads 456
144 Organotin (IV) Based Complexes as Promiscuous Antibacterials: Synthesis, in vitro, in Silico Pharmacokinetic, and Docking Studies

Authors: Wajid Rehman, Sirajul Haq, Bakhtiar Muhammad, Syed Fahad Hassan, Amin Badshah, Muhammad Waseem, Fazal Rahim, Obaid-Ur-Rahman Abid, Farzana Latif Ansari, Umer Rashid

Abstract:

Five novel triorganotin (IV) compounds have been synthesized and characterized. The tin atom is penta-coordinated, assuming a trigonal-bipyramidal geometry. Using in silico derived parameters, the objective of our study is to design and synthesize promiscuous antibacterials potent enough to combat resistance. Among the various synthesized organotin (IV) complexes, compound 5 was found to be a potent antibacterial agent against various bacterial strains. Further lead optimization of drug-like properties was evaluated through in silico predictions. Data mining and computational analysis were utilized to characterize the compound promiscuity phenomenon, so as to reduce the drug attrition rate in designing antibacterials. Xanthine oxidase and human glucose-6-phosphatase were found to be the only true-positive off-target hits by the ChEMBL database and others utilizing the similarity ensemble approach. Propensity towards the a-3 receptor, human macrophage migration factor, and thiazolidinedione were found to be false-positive off-targets, with E-value > 10^-4, for compounds 1, 3, and 4. Further, the positive drug-drug interaction of compound 1 as a uricosuric was validated by all databases and docked protein targets with sequence similarity and compositional matrix alignment via the BLAST software. The promiscuity of compound 5 was further confirmed by in silico binding to different antibacterial targets.

Keywords: antibacterial activity, drug promiscuity, ADMET prediction, metallo-pharmaceutical, antimicrobial resistance

Procedia PDF Downloads 504
143 Design Study on a Contactless Material Feeding Device for Electro Conductive Workpieces

Authors: Oliver Commichau, Richard Krimm, Bernd-Arno Behrens

Abstract:

Growing demands on the production rates of modern presses lead to higher stroke rates. Commonly used material feeding devices for presses, such as grippers and roll-feeding systems, can only achieve high stroke rates together with high gripping forces to avoid stick-slip. These forces are limited by the sensitivity of the workpiece surfaces: stick-slip leads to scratches on the surface and to false positioning of the workpiece. In this paper, a new contactless feeding device is presented, which develops a higher feeding force without damaging the surface of the workpiece through gripping forces. It is based on the principle of the linear induction motor: a primary part creates a magnetic field and induces eddy currents in the electrically conductive material, and a Lorentz force acts on the workpiece in the feeding direction as the mutual reaction between the eddy currents and the magnetic induction. In this study, an FEA model of this approach is shown. The calculations with this model were used to identify the influence of various design parameters on the performance of the feeder, showing the promising capabilities and limits of this technology. In order to validate the study, a prototype of the feeding device was built. An experimental setup was used to measure the pulling forces and placement accuracy of the experimental feeder in order to give an outlook on a potential industrial application of this approach.

Keywords: conductive material, contactless feeding, linear induction, Lorentz-Force

Procedia PDF Downloads 179
142 Prediction of Solanum Lycopersicum Genome Encoded microRNAs Targeting Tomato Spotted Wilt Virus

Authors: Muhammad Shahzad Iqbal, Zobia Sarwar, Salah-ud-Din

Abstract:

Tomato spotted wilt virus (TSWV) belongs to the genus Tospovirus (family Bunyaviridae). It is one of the most devastating pathogens of tomato (Solanum lycopersicum) and heavily damages crop yields around the globe each year. In this study, we retrieved 329 mature miRNA sequences from two microRNA databases (miRBase and miRSoldb) and searched for putative target sites in the downloaded genome sequence of TSWV. A consensus of three miRNA target prediction tools (RNA22, miRanda, and psRNATarget) was used to screen out false-positive microRNA target sites in the TSWV genome. These tools score candidate target sites using minimum free energy (MFE), site complementarity, minimum folding energy, and other microRNA-mRNA binding factors. The R language was used to plot the predicted target-site data. All genes with possible target sites for the different miRNAs were screened by building a consensus table. Of the 329 mature miRNAs evaluated by the three algorithms, only eight met all the criteria/threshold specifications. MC-Fold and MC-Sym were used to predict the three-dimensional structures of the miRNAs, which were further analyzed in UCSF Chimera to visualize the structural and conformational changes before and after microRNA-mRNA interactions. The results of the current study show that the eight predicted miRNAs could be further evaluated by in vitro experiments to develop TSWV-resistant transgenic tomato plants in the future.
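The consensus-table screening step can be illustrated with toy data (the miRNA names and site positions below are hypothetical, not the study's actual tool outputs):

```python
# Hypothetical per-tool outputs: each dict maps a miRNA ID to the set
# of predicted target positions in the viral genome.
rna22 = {'miR-a': {101, 250}, 'miR-b': {560}, 'miR-c': {77}}
miranda = {'miR-a': {101, 250}, 'miR-b': {560, 900}, 'miR-d': {42}}
psrnatarget = {'miR-a': {101}, 'miR-b': {560}}

def consensus(*tools):
    """Keep only miRNA/site pairs predicted by every tool: the screening
    idea used above to discard likely false-positive target sites."""
    shared = set.intersection(*(set(t) for t in tools))
    return {m: set.intersection(*(t[m] for t in tools)) for m in shared}

hits = consensus(rna22, miranda, psrnatarget)
# only sites agreed on by all three tools survive
```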

Keywords: tomato spotted wilt virus (TSWV), Solanum lycopersicum, plant virus, miRNAs, microRNA target prediction, mRNA

Procedia PDF Downloads 155
141 Virtual Reality as a Method in Transformative Learning: A Strategy to Reduce Implicit Bias

Authors: Cory A. Logston

Abstract:

It is imperative that researchers continue to explore every transformative strategy to increase empathy and awareness of racial bias. Racism is a social and political concept that uses stereotypical ideology to highlight racial inequities. Everyone has biases toward disparate out-groups that they may not be aware of. There is some form of racism in every profession; doctors, lawyers, and teachers are not immune. There have been numerous successful and unsuccessful strategies to motivate and transform an individual’s unconscious biased attitudes. One method designed to induce a transformative experience and identify implicit bias is virtual reality (VR), a technology designed to transport the user to a three-dimensional environment. In a virtual reality simulation, the viewer is immersed in a realistic interactive video, taking on the perspective of a Black man. The viewer, as the character, experiences discrimination in various life circumstances from childhood into adulthood: for instance, prejudice at school, encounters with the police as an adolescent, and false accusations in the workplace. Current research suggests that an immersive VR simulation can enhance self-awareness and become a transformative learning experience. This study uses virtual reality immersion and transformative learning theory to create empathy and identify unintentional racial bias. Participants, White teachers, will experience a VR immersion to create awareness and identify implicit biases regarding Black students. The desired outcome is a springboard for them to reconceptualize their own implicit bias. Virtual reality is gaining traction in the research world and promises to be an effective tool in the transformative learning process.

Keywords: empathy, implicit bias, transformative learning, virtual reality

Procedia PDF Downloads 194
140 Identifying Issues of Corporate Governance and the Effect on Organizational Performance

Authors: Abiodun Oluwaseun Ibude

Abstract:

Every now and then, we hear of companies closing down their operations due to unethical practices such as overstating the company’s balance sheet, concealing the company’s debt, embezzling the company’s funds, declaring false profits, and so on. This has led to the liquidation of companies and the loss of shareholders’ investments as well as the interests of other stakeholders. As a result of these ugly trends, there is a need to put in place a formidable mechanism that ensures that business activities are conducted in a healthy manner. It should also promote good ethics and ensure that the interests of stakeholders and the objectives of any organization are achieved within the confines of the law, wherein the law provides criminal penalties for the falsification of documents and for conducting other irregularities. Based on the foregoing, it becomes imperative to ensure that steps are taken to stop this menace and face the challenges ahead, which calls for the practice of good governance. The purpose of this study is to identify the various components of corporate governance and determine their impact on the performance of established organizations. A survey method using a questionnaire was applied to collect data for this study, which were later analyzed using correlation coefficient statistics to generate findings, draw conclusions, and make the necessary recommendations. The research revealed that there are systems within organizations, apart from regulatory agencies, that ensure effective control of activities, promote accountability, and support operational efficiency. However, some members of organizations fail to make use of corporate governance, which negatively impacts organizational performance. In conclusion, good corporate governance will not be achieved unless there is openness, honesty, transparency, accountability, and fairness.

Keywords: corporate governance, formidable mechanism, company’s balance sheet, stakeholders

Procedia PDF Downloads 116
139 Development of an Implicit Physical Influence Upwind Scheme for Cell-Centered Finite Volume Method

Authors: Shidvash Vakilipour, Masoud Mohammadi, Rouzbeh Riazi, Scott Ormiston, Kimia Amiri, Sahar Barati

Abstract:

An essential component of a finite volume method (FVM) is the advection scheme that estimates values on the cell faces based on the calculated values at the nodes or cell centers. The most widely used advection schemes are upwind schemes, which have been developed in FVM on various kinds of structured and unstructured grids. In this research, the physical influence scheme (PIS) is developed for a cell-centered FVM that uses an implicit coupled solver. Results are compared with the exponential differencing scheme (EDS) and the skew upwind differencing scheme (SUDS). The accuracy of these schemes is evaluated for a lid-driven cavity flow at Re = 1000, 3200, and 5000 and a backward-facing step flow at Re = 800. Simulations show considerable differences between the results of the EDS scheme and the benchmarks, especially for the lid-driven cavity flow at high Reynolds numbers; these differences occur due to false diffusion. Comparing the SUDS and PIS schemes shows relatively close results for the backward-facing step flow but different results for the lid-driven cavity flow. The poor results of SUDS in the lid-driven cavity flow can be related to its lack of sensitivity to the pressure difference between the cell face and the upwind points, which is critical for the prediction of such vortex-dominated flows.
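The false diffusion mentioned above is a generic property of first-order upwind discretizations and can be demonstrated in one dimension (an illustrative model scheme, not the PIS/EDS/SUDS schemes compared in the paper):

```python
import numpy as np

def upwind_advect(u0, c, dx, dt, steps):
    """First-order upwind update for u_t + c u_x = 0 (c > 0, periodic
    domain).  The scheme is stable for CFL = c*dt/dx <= 1 but smears
    sharp profiles: its truncation error acts like an artificial
    ('false') diffusion with coefficient c*dx*(1 - CFL)/2."""
    u = u0.copy()
    cfl = c * dt / dx
    for _ in range(steps):
        u = u - cfl * (u - np.roll(u, 1))   # backward (upwind) difference
    return u
```

The exact solution merely translates the profile; the upwind result is visibly smeared, which is the same mechanism that degrades the EDS results in vortex-dominated flows.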

Keywords: cell-centered finite volume method, coupled solver, exponential differencing scheme (EDS), physical influence scheme (PIS), pressure weighted interpolation method (PWIM), skew upwind differencing scheme (SUDS)

Procedia PDF Downloads 284
138 Enzymatic Repair Prior To DNA Barcoding, Aspirations, and Restraints

Authors: Maxime Merheb, Rachel Matar

Abstract:

The retrieval of ancient DNA sequences, which in turn permits entire-genome sequencing from fossils, has improved extraordinarily in recent years, thanks to sequencing technology and other methodological advances. Nevertheless, the quest for ancient DNA is still obstructed by the damage inflicted on DNA, which accumulates after the death of a living organism. This damage falls into three main categories: (i) physical abnormalities, such as strand breaks, which lead to the presence of short DNA fragments; (ii) modified bases (mainly cytosine deamination), which cause errors in the sequence due to the incorporation of an incorrect nucleotide during DNA amplification; and (iii) DNA modifications referred to as blocking lesions, which halt PCR extension and in turn also affect the amplification and sequencing process. The issues arising from breakage and coding errors have been significantly reduced in recent years: platforms for high-throughput sequencing have enabled fast sequencing of short DNA fragments, and most coding errors were found to be consequences of cytosine deamination, which can easily be removed from the DNA using enzymatic treatment. The methodology for repairing DNA sequences is still in development; it can basically be described as the process of reintroducing cytosine in place of uracil and is thus restricted to amplifiable DNA molecules. Eliminating every type of damage (particularly lesions that block PCR) still awaits complete repair methodologies, so DNA detection right after extraction is highly needed. Before investing resources in extensive, unreasonable, and uncertain repair techniques, it is vital to distinguish between two possible hypotheses: (i) the DNA does not exist in amplifiable form to begin with and is therefore completely unrepairable; (ii) the DNA is refractory to PCR and is worth repairing and amplifying. Hence, it is extremely important to develop a non-enzymatic technique to detect the most degraded DNA.

Keywords: ancient DNA, DNA barcoding, enzymatic repair, PCR

Procedia PDF Downloads 400
137 Automated Fact-Checking by Incorporating Contextual Knowledge and Multi-Faceted Search

Authors: Wenbo Wang, Yi-Fang Brook Wu

Abstract:

The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means to address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, the state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study introduces a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context from general-purpose, comprehensive, and authoritative data; 2) developing a search function to automatically select relevant, new, and credible references; 3) attending to the parts of the representations of a claim and its reference that are most relevant to the fact-checking task. The experimental results demonstrate that 1) augmenting the representations of claims and references through the use of a knowledge base, combined with the multi-head attention technique, improves fact-checking performance; 2) SAC with auto-selected references outperforms existing fact-checking approaches that use manually selected references. Future directions of this study include i) exploring knowledge graphs in Wikidata to dynamically augment the representations of claims and references without introducing too much noise, and ii) exploring semantic relations in claims and references to further enhance fact-checking.
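The multi-head attention technique referred to above builds on scaled dot-product attention, sketched here in plain NumPy (illustrative only; SAC's actual architecture is not specified in the abstract):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: weight the value vectors V by the
    softmax of query-key similarity, so the output for each query
    focuses on the most relevant entries, the mechanism used to focus
    on the parts of a claim and its reference that matter most."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # similarity, scaled
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)              # each row sums to 1
    return w @ V, w
```

Multi-head attention runs several such maps in parallel on learned projections of Q, K, and V and concatenates the results.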

Keywords: fact checking, claim verification, deep learning, natural language processing

Procedia PDF Downloads 62
136 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things

Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker

Abstract:

Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect change in the stochastic process as quickly as possible with a tolerable false alarm rate. However, sensors may have different accuracies and sensitivity ranges, and they decay over time. As a result, the big time-series data collected by the sensors will contain uncertainties, and sometimes the data are conflicting. In this study, we present a framework that takes advantage of the capability of evidence theory (a.k.a. Dempster-Shafer and Dezert-Smarandache theories) to represent and manage uncertainty and conflict for fast change detection, and to deal effectively with complementary hypotheses. Specifically, the Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and the related combination rules are applied to combine the mass values among all sensors. Furthermore, we applied the method to estimate the minimum number of sensors that need to be combined, so that computational efficiency could be improved. A cumulative sum (CUSUM) test is then applied to the ratio of pignistic probabilities to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
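The combination step can be illustrated with Dempster's rule for two sources (a generic sketch over a hypothetical change/no-change frame, not the paper's sensor data):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over the
    same frame, given as dicts mapping frozenset hypotheses to mass.
    Conflicting mass (empty intersections) is discarded and the
    remaining mass is renormalised."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully disagree")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical sensors reporting on the frame {C (change), N (no change)};
# frozenset({'C', 'N'}) carries the mass assigned to "don't know".
s1 = {frozenset({'C'}): 0.6, frozenset({'C', 'N'}): 0.4}
s2 = {frozenset({'C'}): 0.7, frozenset({'N'}): 0.2, frozenset({'C', 'N'}): 0.1}
fused = dempster_combine(s1, s2)
```

Two sources that mostly agree on "change" reinforce each other, so the fused mass on C exceeds either source's individual mass.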

Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data

Procedia PDF Downloads 334
135 Epilepsy Seizure Prediction by Effective Connectivity Estimation Using Granger Causality and Directed Transfer Function Analysis of Multi-Channel Electroencephalogram

Authors: Mona Hejazi, Ali Motie Nasrabadi

Abstract:

Epilepsy is a persistent neurological disorder that affects more than 50 million people worldwide. Hence, there is a need for an efficient prediction model for making a correct diagnosis of an epileptic seizure and an accurate prediction of its type. In this study, we consider how the effective connectivity (EC) patterns obtained from intracranial electroencephalographic (EEG) recordings reveal information about the dynamics of the epileptic brain and can be used to predict imminent seizures, which will enable patients (and caregivers) to take appropriate precautions. We use this feature because effective connectivity begins to change near seizure onset, so seizures can be predicted from it. Results are reported on the standard Freiburg EEG dataset, which contains data from 21 patients suffering from medically intractable focal epilepsy. Six channels of EEG from each patient are considered, and effective connectivity is estimated using the directed transfer function (DTF) and Granger causality (GC) methods. We concentrate on the standard deviation of effective connectivity over time, and feature changes in five brain frequency sub-bands (alpha, beta, theta, delta, and gamma) are compared. The performance obtained by the proposed scheme in predicting seizures is: an average prediction time of 50 minutes before seizure onset, a maximum sensitivity of approximately 80%, and a false positive rate of 0.33 FP/h. The DTF method is the more suitable for predicting epileptic seizures, and in general the best results are observed in the gamma and beta sub-bands. This research is significantly helpful for clinical applications, especially for the exploitation of online portable devices.
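The Granger causality estimation referred to above can be sketched for the bivariate case (a least-squares variance-ratio formulation; the model order and synthetic data are assumptions, not the study's EEG pipeline):

```python
import numpy as np

def granger_gc(x, y, p=2):
    """Bivariate Granger causality sketch: does the past of y improve
    the linear prediction of x?  Returns log(var_restricted / var_full);
    values well above 0 suggest y Granger-causes x."""
    n = len(x)
    lags = np.arange(p, n)
    past_x = np.stack([x[t - p:t] for t in lags])   # (N, p) own history
    past_y = np.stack([y[t - p:t] for t in lags])   # (N, p) other channel
    target = x[p:]

    def resid_var(design):
        A = np.column_stack([design, np.ones(len(design))])  # + intercept
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        r = target - A @ coef
        return np.mean(r ** 2)

    v_restricted = resid_var(past_x)                      # x-history only
    v_full = resid_var(np.hstack([past_x, past_y]))       # + y-history
    return np.log(v_restricted / v_full)
```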

Keywords: effective connectivity, Granger causality, directed transfer function, epilepsy seizure prediction, EEG

Procedia PDF Downloads 469
134 Didactics of Literature within the Brechtian Theatre in Edward Albee's Who's Afraid of Virginia Woolf? and Ernest Lehman's Screenplay Adaptation from an Audiovisual Perspective

Authors: Angel Mauricio Castillo

Abstract:

Theatrical performances and music dramas, as they were known in the mid-nineteenth century, provided the audience with a complete immersion in the feelings of the characters through poetry, music, and other artistic representations that create a false sense of reality. Against this background, a novel, non-cathartic form of representation on stage some eighty years later is significant because it constitutes the antithesis of the common creations of the period and originates in the separation of the elements as a dominant. As for the basic methodology, the resulting sense of defamiliarization, a near translation of the German word Verfremdung, will be referred to throughout this work as the V-effect (also known as the ‘alienation effect’) and embodies the performing techniques that enable the audience to watch a play while fully aware of its nature. A play might sometimes present the audience with a constant reminder that it is only a play; therefore, all elements are introduced to provoke dissimilar reactions and opinions. A major finding of the study is the strong correlation between Hegel, Marx, and Brecht, as it discloses how the didactics of literature have influenced not only Brecht’s productions but also every educational context in which these ideas are intertwined. The result is a new dialectical process, that is to say, a new thesis that creates independent thinking skills on the part of the audience. This model therefore opposes the Hegelian formula thesis-antithesis-synthesis in that the synthesis in the Brechtian theatre will inevitably fall into the category of a different thesis within an enlightening type of discourse. The confronting ideas of illusion versus reality will create a new dialectical thesis instead of resulting in a synthesis.

Keywords: Brechtian theatre, didactics, literature, education

Procedia PDF Downloads 179
133 Early Gastric Cancer Prediction from Diet and Epidemiological Data Using Machine Learning in Mizoram Population

Authors: Brindha Senthil Kumar, Payel Chakraborty, Senthil Kumar Nachimuthu, Arindam Maitra, Prem Nath

Abstract:

Gastric cancer is predominantly caused by demographic and dietary factors, more so than other cancer types. The aim of this study is to predict early gastric cancer (EGC) from diet and lifestyle factors using supervised machine learning algorithms. For this study, 160 healthy individuals and 80 cases who had been followed for 3 years (2016-2019) at Civil Hospital, Aizawl, Mizoram, were selected. A dataset containing 11 features that are core risk factors for gastric cancer was extracted. The supervised machine learning algorithms Logistic Regression, Naive Bayes, Support Vector Machine (SVM), Multilayer Perceptron, and Random Forest were used to analyze the dataset in Python Jupyter Notebook Version 3. The classification results were evaluated using the metrics minimum false positives, Brier score, accuracy, precision, recall, F1 score, and the receiver operating characteristic (ROC) curve. With respect to accuracy (in percent) and Brier score, the analysis showed Naive Bayes - 88, 0.11; Random Forest - 83, 0.16; SVM - 77, 0.22; Logistic Regression - 75, 0.25; and Multilayer Perceptron - 72, 0.27. The Naive Bayes algorithm performs best, with very low false-positive rates, a low Brier score, and good accuracy. The Naive Bayes classification results in predicting EGC are very satisfactory using only diet and lifestyle factors, which will be very helpful for physicians to educate patients and the public, so that gastric cancer mortality can be reduced or avoided through this knowledge-mining work.
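As an illustration of the best-performing classifier and the Brier score metric, here is a from-scratch Gaussian Naive Bayes on synthetic data (the data and features are hypothetical, standing in for the study's patient dataset and library classifiers):

```python
import numpy as np

def gnb_fit(X, y):
    """Gaussian Naive Bayes: per-class feature means, variances,
    and class priors, for binary labels {0, 1}."""
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        stats[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return stats

def gnb_prob(stats, X):
    """Posterior P(class = 1 | x) via per-class Gaussian log-likelihoods."""
    logp = []
    for _c, (mu, var, prior) in sorted(stats.items()):
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (X - mu) ** 2 / var, axis=1)
        logp.append(ll + np.log(prior))
    logp = np.stack(logp, axis=1)
    logp -= logp.max(axis=1, keepdims=True)      # numerical stability
    p = np.exp(logp)
    p /= p.sum(axis=1, keepdims=True)
    return p[:, 1]

def brier(y_true, p):
    """Brier score: mean squared error of predicted probabilities
    (lower is better, 0 is perfect)."""
    return np.mean((p - y_true) ** 2)
```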

Keywords: early gastric cancer, machine learning, diet, lifestyle characteristics

Procedia PDF Downloads 161
132 YOLO-IR: Infrared Small Object Detection in High Noise Images

Authors: Yufeng Li, Yinan Ma, Jing Wu, Chengnian Long

Abstract:

Infrared object detection aims at separating small and dim targets from cluttered backgrounds, and its capabilities extend beyond the limits of visible light, making it invaluable in a wide range of applications for improving safety, security, efficiency, and functionality. However, existing methods are usually sensitive to noise in the input infrared image, leading to a decrease in target detection accuracy and an increase in the false alarm rate in high-noise environments. To address this issue, an infrared small target detection algorithm called YOLO-IR is proposed in this paper to improve robustness to high infrared noise. To address the problem that high noise significantly reduces the clarity and reliability of target features in infrared images, we design a soft-threshold coordinate attention mechanism to improve the model’s ability to extract target features and its robustness to noise. Since the noise may overwhelm the local details of the target, resulting in the loss of small target features during depth down-sampling, we propose a deep and shallow feature fusion neck to improve detection accuracy. In addition, because generalized Intersection over Union (IoU)-based loss functions may be sensitive to noise and lead to unstable training in high-noise environments, we introduce a Wasserstein-distance-based loss function to improve the training of the model. The experimental results show that YOLO-IR achieves a 5.0% improvement in recall and a 6.6% improvement in F1-score over existing state-of-the-art models.
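The soft-thresholding operation at the core of a soft-threshold attention mechanism can be sketched as follows (only the shrinkage operator itself; the attention wiring and YOLO-IR's exact design are not specified in the abstract):

```python
import numpy as np

def soft_threshold(x, tau):
    """Soft-thresholding (shrinkage) operator: shrink every value toward
    zero by tau and zero out anything whose magnitude falls below tau.
    In a denoising attention block this suppresses small, noise-dominated
    feature responses while preserving strong target responses."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)
```

In practice the threshold tau is typically learned per channel by the attention branch rather than fixed.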

Keywords: infrared small target detection, high noise, robustness, soft-threshold coordinate attention, feature fusion

Procedia PDF Downloads 74
131 Evaluating the Implementation of a Quality Management System in the COVID-19 Diagnostic Laboratory of a Tertiary Care Hospital in Delhi

Authors: Sukriti Sabharwal, Sonali Bhattar, Shikhar Saxena

Abstract:

Introduction: The COVID-19 molecular diagnostic laboratory is the cornerstone of COVID-19 diagnosis, as the patient’s treatment and management protocol depend on the molecular results. For this purpose, it is extremely important that the laboratory generating these results adheres to quality management processes to increase the accuracy and validity of the reports generated. We started our own molecular diagnostic setup at the onset of the pandemic and therefore conducted this study to generate our quality management data and help us improve on our weak points. Materials and Methods: A total of 14561 samples were evaluated by the retrospective observational method. The quality variables analysed were classified into pre-analytical, analytical, and post-analytical variables, and the results are presented as percentages. Results: Among the pre-analytical variables, sample leaking was the most common cause of rejection of samples (134/14561, 0.92%), followed by non-generation of SRF ID (76/14561, 0.52%) and non-compliance with triple packaging (44/14561, 0.3%). The other pre-analytical aspects assessed were incomplete patient identification (17/14561, 0.11%), insufficient quantity of sample (12/14561, 0.08%), missing forms/samples (7/14561, 0.04%), samples in the wrong vials/empty VTM tubes (5/14561, 0.03%), and LIMS entry not done (2/14561, 0.01%). We were unable to obtain internal quality control in 0.37% of samples (55/14561). We also experienced two incidents of cross-contamination among the samples, resulting in false-positive results. Among the post-analytical factors, a total of 0.07% of samples (11/14561) could not be dispatched within the stipulated time frame. Conclusion: Adherence to quality control processes is foremost for the smooth running of any diagnostic laboratory, especially those involved in critical reporting.
Not only do the indicators help in keeping in check the laboratory parameters but they also allow comparison with other laboratories.
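Each quality indicator above is a simple proportion of affected samples over the 14561 total. A minimal sketch of that arithmetic, using a few of the reported counts:

```python
# Sketch of the arithmetic behind the quality indicators: each rate is
# (affected samples) / (total samples) * 100, rounded to two decimals.
total = 14561
counts = {
    "sample leaking": 134,
    "SRF ID not generated": 76,
    "triple packaging non-compliance": 44,
    "dispatch delay": 11,
}
rates = {k: round(v / total * 100, 2) for k, v in counts.items()}
print(rates["sample leaking"])        # 0.92, matching the reported 0.92%
print(rates["SRF ID not generated"])  # 0.52
```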

Keywords: laboratory quality management, COVID-19, molecular diagnostics, healthcare

Procedia PDF Downloads 165
130 Questioning the Relationship Between Young People and Fake News Through Their Use of Social Media

Authors: Marion Billard

Abstract:

This paper focuses on the real relationship between young people and fake news. Fake news is one of today’s main issues in the world of information and communication, and social media and its democratization have helped spread false information. According to traditional beliefs, young people are more inclined to believe what they read on social media; the individuals concerned, however, think they are better able to distinguish between real and fake news. This is attributed to their use of the internet and social media from an early age. During the 2016 American and 2017 French presidential campaigns, the term fake news was on everyone’s lips and became a real issue in the field of information. While young people informed themselves through newspapers or television until the beginning of the ’90s, Gen Z (people born between 1997 and 2010) has always been immersed in this world of fast communication. They have known how to use social media from a young age, and the internet holds few secrets for them. Today, despite sporadic use of traditional media, young people tend to turn to their smartphones and social networks such as Instagram or Twitter to stay abreast of the latest news. The growth of social media information has led to an “ambient journalism”, giving access to an endless quantity of information. Waking up in the morning, young people see short posts supplying the essentials of the news, mostly without many details. As a result, impressionable people are not able to distinguish between real media and “junk news” or fake news. This massive use of social media is probably explained by the inability of youngsters to find connections between the communication of traditional media and what they are living.
The question arises whether this over-confidence of young people in their ability to distinguish between accurate and fake news makes it more difficult for them to examine information critically. Their relationship with media and fake news is more complex than popular opinion suggests. Today’s young people are neither masters of the quest for information nor inherently the most impressionable public on social media.

Keywords: fake news, youngsters, social media, information, generation

Procedia PDF Downloads 161
129 Virtue, Truth, Freedom, And The History Of Philosophy

Authors: Ashley DelCorno

Abstract:

G. E. M. Anscombe’s 1958 essay Modern Moral Philosophy and the tradition of virtue ethics that followed have given rise to the restoration (or, more plainly, the resurrection) of Aristotle as something of an authority figure. Alasdair MacIntyre and Martha Nussbaum, for example, are proponents not just of Aristotle’s relevancy but also of his apparent implicit authority. That said, it is not clear that the schema imagined by virtue ethicists accurately describes moral life, or that it does not inadvertently work to impoverish genuine decision-making. If the label ‘virtue’ is categorically denied to some groups (while arbitrarily afforded to others), it can only turn on itself, thus rendering its own premise ridiculous. Likewise, as an inescapable feature of virtue ethics, Aristotelian binaries like ‘virtue/vice’ and ‘voluntary/involuntary’ offer up false dichotomies that may seriously compromise an agent’s ability to conceptualize choices that are truly free and rooted in meaningful criteria. Here, this topic is analyzed through a feminist lens predicated on the known paradoxes of patriarchy. The work of feminist theorists Jacqui Alexander, Katharine Angel, Simone de Beauvoir, bell hooks, Audre Lorde, Imani Perry, and Amia Srinivasan serves as an important set of guideposts, and the argument here is built from a key tenet of black feminist thought regarding scarcity and possibility. Above all, it is clear that though the philosophical tradition of virtue ethics presents itself as recovering the place of agency in ethics, its premises possess crippling limitations to achieving this goal. These include, most notably, virtue ethics’ binding analysis of history, its axiomatic attachment to obligatory clauses, its problematic reading-in of Aristotle, and its arbitrary commitment to predetermined and competitively patriarchal ideas of what counts as a virtue.

Keywords: feminist history, the limits of utopic imagination, curatorial creation, truth, virtue, freedom

Procedia PDF Downloads 83
128 Comparative Study of Flood Plain Protection Zone Determination Methodologies in Colombia, Spain and Canada

Authors: P. Chang, C. Lopez, C. Burbano

Abstract:

Flood protection zones are riparian buffers that are established to manage and mitigate the impact of flooding and, in turn, protect local populations. The purpose of this study was to evaluate the Guía Técnica de Criterios para el Acotamiento de las Rondas Hídricas in Colombia against international regulations in Canada and Spain, in order to determine its limitations and contribute to its improvement. The need to establish a specific corridor that allows for the dynamic development of a river is clear; however, limitations present in the Colombian Technical Guide are identified. The study shows that international regulations provide concepts similar to those used in Colombia, but additionally integrate aspects such as regionalization, which allows for a better characterization of the channel way, and incorporate the frequency of flooding and its probability of occurrence into the concept of risk when determining the protection zone. The case study analyzed in Dosquebradas, Risaralda, compared the application of the different standards through hydraulic modeling. It highlights that the current Colombian standard does not offer sufficient detail in its implementation phase, which leads to a false sense of security arising from inaccuracy and lack of data. Furthermore, the study demonstrates how the Colombian norm is ill-adapted to the conditions of Dosquebradas, typical of the Andes region, in both social and hydraulic aspects, and neither reduces the risk nor improves the protection of the population. We consider it pertinent to include risk estimation as an integral part of the methodology when establishing flood protection zones, considering the particularity of water systems, as they are characterized by a heterogeneous natural dynamic behavior.

Keywords: environmental corridor, flood zone determination, hydraulic domain, legislation flood protection zone

Procedia PDF Downloads 113
127 A Psychoanalytic Lens: Unmasked Layers of the Self among Post-Graduate Psychology Students in Surviving the COVID-19 Lockdown

Authors: Sharon Sibanda, Benny Motileng

Abstract:

The World Health Organisation (WHO) declared SARS-CoV-2 (COVID-19) a pandemic on the 11ᵗʰ of March 2020, with South Africa having recorded its first case on the 5ᵗʰ of March 2020. The rapidly spreading virus led the South African government to implement one of the strictest nationwide lockdowns globally, resulting in the closing of all institutions of higher learning effective March 18ᵗʰ 2020. This qualitative study primarily aimed to explore whether post-graduate psychology students were in a state of a depleted or cohesive self after the psychological isolation of the COVID-19 risk-adjusted level 5 lockdown. Semi-structured interviews from a qualitative interpretive approach with N=6 psychology post-graduate students facilitated a rich understanding of their intra-psychic experiences of the self. Thematic analysis of the interview data illuminated how students were forced into the self by the emotional isolation of hard lockdown, with core psychic conflict emerging that was often defended against through external self-object experiences. The findings also suggest that lockdown stripped this sample of psychology post-graduate students of their defensive escape from the inner self through external self-object distractions. The external self was stripped to the core of the internal self by the isolation of hard lockdown, thereby uncovering the psychic function of roles and defenses amalgamated throughout modern cultural consciousness that dictate self-functioning. The study suggests modelling reflexivity skills in the integration of internal and external self-experience dynamics as part of a training model for the continued personal and professional development of psychology students.

Keywords: COVID-19, fragmentation, self-object experience, true/false self

Procedia PDF Downloads 59
126 The Improved Therapeutic Effect of Trans-Cinnamaldehyde on Adipose-Derived Stem Cells without Chemical Induction

Authors: Karthyayani Rajamani, Yi-Chun Lin, Tung-Chou Wen, Jeanne Hsieh, Yi-Maun Subeq, Jen-Wei Liu, Po-Cheng Lin, Horng-Jyh Harn, Shinn-Zong Lin, Tzyy-Wen Chiou

Abstract:

Assuring cell quality is an essential parameter for the success of stem cell therapy, and the utilization of various components to improve this potential has been a primary goal of stem cell research. The aim of this study was not only to demonstrate the capacity of trans-cinnamaldehyde (TC) to reverse stress-induced senescence but also to improve the therapeutic abilities of stem cells. Because of their availability and promising application potential in regenerative medicine, adipose-derived stem cells (ADSCs) were chosen for the study. We found that H2O2 treatment resulted in the expression of senescence characteristics in the ADSCs, including a decreased proliferation rate, increased senescence-associated β-galactosidase (SA-β-gal) activity, decreased SIRT1 (silent mating type information regulation 2 homolog) expression, and decreased telomerase activity. However, TC treatment was sufficient to rescue or reduce the effects of H2O2 induction, ultimately leading to an increased proliferation rate, a decrease in the percentage of SA-β-gal-positive cells, upregulation of SIRT1 expression, and increased telomerase activity of the senescent ADSCs at the cellular level. Furthermore, when ADSCs were treated with TC without induction of senescence, all of the aforementioned improvements were also observed. Moreover, a chemically induced liver fibrosis animal model was used to evaluate the functionality of these rescued cells in vivo. Liver dysfunction was established by injecting 200 mg/kg thioacetamide (TAA) intraperitoneally into Wistar rats every third day for 60 days.
The experimental rats were separated into groups: a normal group (rats without TAA induction), a sham group (without ADSC transplantation), a positive control group (transplanted with normal ADSCs), an H2O2 group (transplanted with H2O2-induced senescent ADSCs), an H2O2+TC group (transplanted with ADSCs pretreated with H2O2 and then further treated with TC), and a TC group (ADSCs treated with TC without H2O2 treatment). In the transplantation groups, 1 × 10⁶ human ADSCs were introduced into each rat via direct liver injection. Based on the biochemical analysis and immunohistochemical staining results, it was determined that the therapeutic effects on liver fibrosis of the induced senescent ADSCs (H2O2 group) were not as significant as those exerted by the normal ADSCs (the positive control group). However, the H2O2+TC group showed significant reversal of liver damage compared to the H2O2 group 1 week post-transplantation. Furthermore, ADSCs treated with TC alone, without H2O2 exposure, performed better than all other groups. These data confirm that TC treatment has the potential to improve the therapeutic effect of ADSCs. It is therefore suggested that TC has potential applications in maintaining stem cell quality and could possibly aid in the treatment of senescence-related disorders.

Keywords: senescence, SIRT1, adipose derived stem cells, liver fibrosis

Procedia PDF Downloads 258
125 Analysis of Real Time Seismic Signal Dataset Using Machine Learning

Authors: Sujata Kulkarni, Udhav Bhosle, Vijaykumar T.

Abstract:

Due to the closeness between seismic signals and non-seismic signals, it is difficult to detect earthquakes using conventional methods. In order to distinguish between seismic events and non-seismic events depending on their amplitude, our study processes the data that come from seismic sensors. The authors suggest a robust noise suppression technique that makes use of a bandpass filter, an IIR Wiener filter, recursive short-term average/long-term average (STA/LTA), and Carl STA/LTA for event identification. The trigger ratio used in the proposed study to differentiate between seismic and non-seismic activity is determined. The proposed work focuses on significant feature extraction for machine learning-based seismic event detection. This serves as motivation for compiling a dataset of all features for the identification and forecasting of seismic signals. We place a focus on feature vector dimension reduction techniques due to the temporal complexity. The proposed notable features were experimentally tested using a machine learning model, and the results on unseen data are optimal. Finally, a demonstration using a hybrid dataset (captured by different sensors) shows how this model may also be employed in a real-time setting while lowering false alarm rates. The planned study is based on the examination of seismic signals obtained from both individual sensors and sensor networks (SN). Wideband seismic signals from the BSVK and CUKG station sensors, located near Basavakalyan, Karnataka, and at the Central University of Karnataka, respectively, make up the experimental dataset.
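The recursive STA/LTA trigger named above can be sketched as follows. The window lengths, threshold, and toy trace are illustrative assumptions, not the paper's settings: short- and long-term averages of signal energy are updated sample by sample, and an event is declared when their ratio crosses the trigger threshold.

```python
def recursive_sta_lta(samples, nsta, nlta):
    # Exponentially weighted short-term and long-term averages of energy.
    csta, clta = 1.0 / nsta, 1.0 / nlta
    sta = lta = 1e-9          # tiny seed avoids division by zero at start-up
    ratios = []
    for x in samples:
        energy = x * x
        sta += csta * (energy - sta)   # short-term average of energy
        lta += clta * (energy - lta)   # long-term average of energy
        ratios.append(sta / lta)
    return ratios

# quiet background followed by a burst (toy "event"); note the ratio is
# unreliable during the first ~nlta samples while the averages converge
trace = [0.1] * 200 + [2.0] * 50
ratios = recursive_sta_lta(trace, nsta=5, nlta=100)
threshold = 3.0
print(max(ratios[150:200]) < threshold)  # True: no trigger on settled background
print(max(ratios[200:]) > threshold)     # True: burst crosses the trigger ratio
```

Tuning nsta, nlta, and the threshold trades sensitivity against false alarms, which is exactly the trigger-ratio calibration the abstract describes.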

Keywords: Carl STA/LTA, features extraction, real time, dataset, machine learning, seismic detection

Procedia PDF Downloads 124
124 Developmental Psycholinguistic Approach to Conversational Skills - A Continuum of the Sensitivity to Gricean Maxims

Authors: Zsuzsanna Schnell, Francesca Ervas

Abstract:

Background: The experimental pragmatic study confirms a basic tenet of relevance-theoretic views in language philosophy. It draws up a developmental trajectory of the maxims, revealing the cognitive difficulty of their interpretation, their place relative to each other, and the order they may follow in development. A central claim of the present research is that social-cognitive skills play a significant role in inferential meaning construction: children passing the False Belief Test are significantly more successful in tasks measuring the recognition of the infringement of conversational maxims. Aims and method: Preschoolers’ conversational skills and pragmatic competence are examined in view of their mentalization skills. To do so, we use a set of linguistic tasks containing 5 short scenarios for each Gricean maxim. We measure preschoolers’ ToM performance with a first- and a second-order ToM task and compare participants’ ability to recognize the infringement of the Gricean maxims in view of their social-cognitive skills. Results: Findings suggest that Theory of Mind has a predictive force of 75% concerning the ability to follow Gricean maxims efficiently. ToM proved to be a significant factor in predicting the group’s performance and success rates in 3 out of 4 maxim infringement recognition tasks: in the Quantity, Relevance, and Manner conditions, but not in the Quality trial. Conclusions: The results confirm that children’s communicative competence in social contexts requires the development of higher-order social-cognitive reasoning, and reveal the cognitive effort needed for the recognition of the infringement of each maxim, yielding a continuum of their cognitive difficulty and a trajectory of development.

Keywords: maxim infringement recognition, social cognition, Gricean maxims, developmental pragmatics

Procedia PDF Downloads 7
123 Tc-99m MIBI Scintigraphy to Differentiate Malignant from Benign Lesions, Detected on Planar Bone Scan

Authors: Aniqa Jabeen

Abstract:

The aim of this study was to evaluate the effectiveness of Tc-99m MIBI (technetium-99m methoxy-isobutyl-isonitrile) scintigraphy in differentiating malignancies from benign lesions detected on planar bone scans. Materials and Methods: 59 patients with bone lesions were enrolled in the study. The scintigraphic findings were compared with the clinical, radiological, and histological findings. Each patient initially underwent a three-phase bone scan with Tc-99m MDP (methylene diphosphonate) and, if evidence of a lesion was found, then underwent dynamic and static MIBI scintigraphy after three to four days. The MDP and MIBI scans were evaluated visually and quantitatively. For quantitative analysis, count ratios of lesions to the contralateral normal side (L/C) were taken from regions of interest drawn on the scans. Student's t-test was applied to assess the significance of differences between benign and malignant lesions; a p-value < 0.05 was considered significant. Results: The MDP scans showed increased tracer uptake, but there was no significant difference between benign and malignant uptake of the radiotracer. However, a significant difference (p-value 0.015) in uptake was seen between malignant (L/C = 3.51 ± 1.02) and benign lesions (L/C = 2.50 ± 0.42) on MIBI scans. Three of thirty benign lesions did not show significant MIBI uptake. Seven malignant lesions appeared as false negatives. The specificity of the scan was 86.66% and its negative predictive value (NPV) was 81.25%, whereas the sensitivity of the scan was 79.31%. When axial metastases were excluded from the lesions, the sensitivity of the MIBI scan increased to 91.66% and the NPV increased to 92.85%. Conclusion: MIBI scintigraphy is useful in distinguishing malignant from benign lesions, and it also correctly identifies metastatic lesions. The negative predictive value of the scan points towards its ability to accurately diagnose the normal (benign) cases.
However, biopsy remains the gold standard and a definitive diagnostic modality in musculoskeletal tumors. MIBI scan provides useful information in preoperative assessment and in distinguishing between malignant and benign lesions.
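The sensitivity, specificity, and NPV quoted above follow from a standard confusion-matrix computation. A sketch with hypothetical counts chosen to be consistent with the reported percentages (the counts are illustrative, not taken from the paper's tables):

```python
def screening_metrics(tp, fn, tn, fp):
    # tp/fn/tn/fp: true positives, false negatives, true negatives,
    # false positives from comparing the scan against histology
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, npv

# hypothetical counts consistent with the quoted percentages
sens, spec, npv = screening_metrics(tp=23, fn=6, tn=26, fp=4)
print(round(sens * 100, 2))   # 79.31
print(round(spec * 100, 2))   # 86.67
print(round(npv * 100, 2))    # 81.25
```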

Keywords: benign, malignancies, MDP bone scan, MIBI scintigraphy

Procedia PDF Downloads 404
122 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models

Authors: Danielle Shackley, Yetunde Folajimi

Abstract:

As more people turn to the internet seeking health-related information, there is more risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns polarity scores to text, ranging from positive through neutral to negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing, reliably labeled health article headlines that were supplemented with health information collected about COVID-19 from social media sources. We started with data preprocessing and tested out various vectorization methods such as Count and TF-IDF vectorization. We implemented 3 Naive Bayes classifier models: Bernoulli, Multinomial, and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis; those same models were then reproduced with the feature added. We evaluated using precision and accuracy scores. The initial Bernoulli model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and stayed constant at 75.2% accuracy. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin; while there was no evidence of improvement in accuracy, we obtained a 1.9% improvement in the precision score with the Complement model. Future expansion of this work could include replicating the experiment process and substituting a deep learning neural network model for the Naive Bayes classifiers.
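As an illustration of the model family used above, here is a minimal multinomial Naive Bayes over bag-of-words counts with Laplace smoothing. The headlines and labels are toy stand-ins, not the study's dataset:

```python
import math
from collections import Counter, defaultdict

def train_nb(docs, labels):
    # Count class frequencies and per-class token frequencies.
    class_counts = Counter(labels)
    word_counts = defaultdict(Counter)
    vocab = set()
    for doc, y in zip(docs, labels):
        for tok in doc.split():
            word_counts[y][tok] += 1
            vocab.add(tok)
    return class_counts, word_counts, vocab

def predict_nb(doc, class_counts, word_counts, vocab):
    # Pick the class maximizing log prior + smoothed log likelihoods.
    n = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for y, cc in class_counts.items():
        lp = math.log(cc / n)
        total = sum(word_counts[y].values())
        for tok in doc.split():
            # Laplace-smoothed token likelihood
            lp += math.log((word_counts[y][tok] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = y, lp
    return best

docs = ["miracle cure covid", "vaccine safe study",
        "cure covid fast", "study shows safe"]
labels = ["fake", "real", "fake", "real"]
model = train_nb(docs, labels)
print(predict_nb("miracle cure", *model))   # fake
```

Adding a sentiment feature, as in the study, amounts to appending one more feature (the polarity label) to each headline's bag of words before training.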

Keywords: sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model

Procedia PDF Downloads 97
121 Developmental Psycholinguistic Approach to Conversational Skills: A Continuum of the Sensitivity to Gricean Maxims

Authors: Zsuzsanna Schnell, Francesca Ervas

Abstract:

Background: Our experimental pragmatic study confirms a basic tenet of relevance-theoretic views in language philosophy. It draws up a developmental trajectory of the maxims, revealing the cognitive difficulty of their interpretation, their place relative to each other, and the order they may follow in development. A central claim of the present research is that social-cognitive skills play a significant role in inferential meaning construction. Children passing the False Belief Test are significantly more successful in tasks measuring the recognition of the infringement of conversational maxims. Aims and method: We examine preschoolers' conversational and pragmatic competence in view of their mentalization skills. To do so, we use a measure of linguistic tasks containing 5 short scenarios for each Gricean maxim. We measure preschoolers’ ToM performance with a first- and second-order ToM task and compare participants’ ability to recognize the infringement of the Gricean maxims in view of their social-cognitive skills. Results: Findings suggest that Theory of Mind has a predictive force of 75% concerning the ability to follow Gricean maxims efficiently. ToM proved to be a significant factor in predicting the group’s performance and success rates in 3 out of 4 maxim infringement recognition tasks: in the Quantity, Relevance, and Manner conditions, but not in the Quality trial. Conclusions: Our results confirm that children’s communicative competence in social contexts requires the development of higher-order social-cognitive reasoning. They reveal the cognitive effort needed to recognize the infringement of each maxim, yielding a continuum of their cognitive difficulty and a trajectory of development.

Keywords: developmental pragmatics, social cognition, preschoolers, maxim infringement, Gricean pragmatics

Procedia PDF Downloads 32
120 Precious Gold and Diamond Accessories Versus False Fashion Diamond and Stained Accessories

Authors: Amira Yousef Mahrous Yousef

Abstract:

This paper examines fast fashion versus sustainable (slow) fashion among India-based consumers. The expression ‘fast fashion’ generally refers to low-cost clothing collections considered first-hand copies of luxury brands, sometimes used interchangeably with ‘mass fashion’, whereas slow fashion, or limited fashion, is considered more organic or eco-friendly. "Sustainable fashion is ethical fashion, and here the consumer is not just design conscious but also social-environment conscious". The paper deals with the desire of young Indian consumers for the luxury brands present in India, their understanding of sustainable fashion, and how to maintain an equilibrium between ever-newer fashion and styles on the one hand and fashion sustainability on the other. The green fashion market is growing rapidly as eco-friendly consumers are willing to expand their organic lifestyle to include clothing. With an increasing share of fashion consumers globally, Indian consumers are observed to consider social and environmental ethics while making purchasing decisions. While some research clearly identifies the efforts of responsible consumers towards green fashion, other research argues that fashion-oriented consumers who are sensitive towards the environment do not actively participate in supporting green fashion. This study aims to analyze the current perception of green fashion among Indian consumers. A small-scale exploratory study is conducted in which consumers’ perception of green fashion is examined, followed by an analysis of how this perception translates into purchase decision-making. This research paper gives insight into consumer awareness of green fashion and provides scope for the expansion of ethical fashion consumption.

Keywords: inclusions, temperature gradient, HPHT synthetic fibers, polyamide fibers, fiber volume, compressive strength, gold nano clusters, copper ions, wool keratin, fluorescence

Procedia PDF Downloads 49
119 Human Leukocyte Antigen Class 1 Phenotype Distribution and Analysis in Persons from Central Uganda with Active Tuberculosis and Latent Mycobacterium tuberculosis Infection

Authors: Helen K. Buteme, Rebecca Axelsson-Robertson, Moses L. Joloba, Henry W. Boom, Gunilla Kallenius, Markus Maeurer

Abstract:

Background: The Ugandan population is heavily affected by infectious diseases, and human leukocyte antigen (HLA) diversity plays a crucial role in the host-pathogen interaction, affecting the rates of disease acquisition and outcome. Identifying HLA class 1 alleles and determining which alleles are associated with tuberculosis (TB) outcomes would help in screening individuals in TB-endemic areas for susceptibility to TB and in predicting resistance or progression to TB, which would inevitably lead to better clinical management of TB. Aims: To determine the HLA class 1 phenotype distribution in a Ugandan TB cohort and to establish the relationship between these phenotypes and active and latent TB. Methods: Blood samples were drawn from 32 HIV-negative individuals with active TB and 45 HIV-negative individuals with latent MTB infection. DNA was extracted from the blood samples, and the DNA samples were HLA typed by the polymerase chain reaction-sequence-specific primer method. The allelic frequencies were determined by direct count. Results: HLA-A*02, A*01, A*74, A*30, B*15, B*58, C*07, C*03 and C*04 were the dominant phenotypes in this Ugandan cohort. There were differences in the distribution of HLA types between the individuals with active TB and the individuals with LTBI, with only the HLA-A*03 allele showing a statistically significant difference (p=0.0136). However, after FDR computation, the corresponding q-value is above the expected proportion of false discoveries (q-value 0.2176). Key findings: We identified a number of HLA class 1 alleles in a population from Central Uganda, which will enable us to carry out a functional characterization of CD8+ T-cell-mediated immune responses to MTB. Our results also suggest that there may be a positive association between the HLA-A*03 allele and TB, implying that individuals with the HLA-A*03 allele are at higher risk of developing active TB.
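The q-value reported above implies a Benjamini-Hochberg-style false discovery rate correction across the alleles tested. A sketch of that procedure, with the 0.0136 p-value from the text and otherwise illustrative p-values (so the resulting q-value, 0.068, comes from this toy list, not from the study's 0.2176):

```python
def bh_qvalues(pvals):
    # Benjamini-Hochberg q-values: scale each sorted p-value by m/rank,
    # then walk from the largest p-value down, enforcing monotonicity.
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    qvals = [0.0] * m
    running_min = 1.0
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        running_min = min(running_min, pvals[i] * m / rank)
        qvals[i] = running_min
    return qvals

# 0.0136 is the HLA-A*03 p-value from the text; the rest are illustrative
pvals = [0.0136, 0.21, 0.45, 0.62, 0.88]
qvals = bh_qvalues(pvals)
print(round(qvals[0], 3))   # 0.068
print(qvals[0] > 0.05)      # True: not significant after correction
```

This mirrors the abstract's conclusion: a nominally significant p-value can fail to survive FDR correction once the number of alleles tested is taken into account.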

Keywords: HLA, phenotype, tuberculosis, Uganda

Procedia PDF Downloads 404