Search results for: data encoding
25138 The Impact of CYP2C9 Gene Polymorphisms on Warfarin Dosing
Authors: Weaam Aldeeban, Majd Aljamali, Lama A. Youssef
Abstract:
Background & Objective: Warfarin is considered a problematic drug due to its narrow therapeutic window and wide inter-individual response variations, which are attributed to demographic, environmental, and genetic factors, particularly single nucleotide polymorphisms (SNPs) in the genes encoding VKORC1 and CYP2C9, involved in warfarin's mechanism of action and metabolism, respectively. The CYP2C9*2 (rs1799853) and CYP2C9*3 (rs1057910) alleles are linked to reduced enzyme activity; carriers of either or both alleles are classified as moderate or slow metabolizers and therefore exhibit higher sensitivity to warfarin compared with the wild type (CYP2C9*1*1). Our study aimed to assess the frequency of the *1, *2, and *3 alleles of the CYP2C9 gene in a cohort of Syrian patients receiving a maintenance dose of warfarin for different indications, the impact of genotypes on warfarin dosing, and the frequency of adverse effects (i.e., bleeding). Subjects & Methods: This retrospective cohort study encompassed 94 patients treated with warfarin. Patients' genotypes were identified by sequencing the polymerase chain reaction (PCR)-specific products of the gene encoding CYP2C9, and the effects on warfarin therapeutic outcomes were investigated. Results: Sequencing revealed that 43.6% of the study population carried the *2 and/or *3 SNPs. The mean weekly maintenance dose of warfarin was 37.42 ± 15.5 mg for patients with the wild-type allele (CYP2C9*1*1), whereas patients with one or both variants (*2 and/or *3) required a significantly lower dose (28.59 ± 11.58 mg) of warfarin (P = 0.015). A higher percentage (40.7%) of patients with allele *2 and/or *3 experienced hemorrhagic accidents compared with only 17.9% of patients with the wild type *1*1 (P = 0.04). Conclusions: Our study demonstrates an association between the *2 and *3 genotypes and higher sensitivity to warfarin as well as a tendency to bleed, which necessitates lowering the dose. These findings emphasize the significance of CYP2C9 genotyping prior to commencing warfarin therapy in order to achieve optimal and faster dose control and to ensure effectiveness and safety.
Keywords: warfarin, CYP2C9, polymorphisms, Syrian, hemorrhage
Procedia PDF Downloads 145
25137 Molecular Characterization of Major Isolated Organism Involved in Bovine Subclinical Mastitis
Authors: H. K. Ratre, M. Roy, S. Roy, M. S. Parmar, V. Bhagat
Abstract:
Mastitis is a common problem of dairy industries. Reduction in milk production and irreparable damage to the udder associated with the disease are common causes of culling of dairy cows. Milk from infected animals is not suitable for drinking or for making different milk products, so the disease has major economic importance in dairy cattle. The aims of this study were to investigate the bacteriological panorama in milk from udder quarters with subclinical mastitis and to carry out molecular characterization of the major isolated organisms from subclinical mastitis-affected cows in and around the Durg and Rajnandgaon districts of Chhattisgarh. Isolation and identification of bacteria from the milk samples of subclinical mastitis-affected cows were done by standard and routine culture procedures. A total of 78 isolates were obtained from cows, and among the various bacteria isolated, Staphylococcus spp. occupied the prime position with an occurrence rate of 51.282%. Other bacteria isolated include Streptococcus spp. (20.512%), Micrococcus spp. (14.102%), E. coli (8.974%), Klebsiella spp. (2.564%), Salmonella spp. (1.282%) and Proteus spp. (1.282%). Staphylococcus spp. was thus isolated as the major causative agent of subclinical mastitis in the studied area. Molecular characterization of Staphylococcus aureus isolates was done for genetic expression of the virulence genes 'nuc' (encoding the thermonuclease exoenzyme), coa and spa by PCR amplification of the respective genes in 25 Staphylococcus isolates. In the present study, 15 isolates (77.27%) out of 20 coagulase-positive isolates were found to be genotypically positive for 'nuc', whereas 20 isolates (52.63%) out of 38 CNS expressed the presence of the same virulence gene. Three Staphylococcus isolates were found to be genotypically positive for the coa gene; amplification of the coa gene yielded two different products of 627 and 710 bp. Amplification of the gene segment encoding the IgG-binding region of protein A (spa) revealed sizes of 220 and 253 bp in two Staphylococcus isolates. The X-region binding of the spa gene produced an amplicon of 315 bp in one Staphylococcal isolate. Staphylococcus aureus was found to be the major isolate (51.28%) responsible for causing subclinical mastitis in cows, and it also showed expression of the virulence genes nuc, coa and spa.
Keywords: mastitis, bacteria, characterization, expression, gene
Procedia PDF Downloads 213
25136 Rapid and Cheap Test for Detection of Streptococcus pyogenes and Streptococcus pneumoniae with Antibiotic Resistance Identification
Authors: Marta Skwarecka, Patrycja Bloch, Rafal Walkusz, Oliwia Urbanowicz, Grzegorz Zielinski, Sabina Zoledowska, Dawid Nidzworski
Abstract:
Upper respiratory tract infections are one of the most common reasons for visiting a general practitioner. Streptococci are the most common bacterial etiological factors in these infections. There are many different types of Streptococci, and infections vary in severity from mild throat infections to pneumonia. For example, S. pyogenes mainly contributes to acute pharyngitis, inflammation of the palatine tonsils and scarlet fever, whereas S. pneumoniae is responsible for several invasive diseases like sepsis, meningitis or pneumonia, with high mortality and dangerous complications. There are only a few diagnostic tests designed for detecting Streptococci from the infected throat of patients. However, they are mostly based on lateral flow techniques, and they are not used as a standard due to their low sensitivity. The diagnostic standard is to culture a patient's throat swab on semi-selective media in order to multiply the pure etiological agent of infection and subsequently to perform an antibiogram, which takes several days from the patient's visit to the clinic. Therefore, the aim of our studies is to develop and bring to market a point-of-care device for the rapid identification of Streptococcus pyogenes and Streptococcus pneumoniae with simultaneous identification of antibiotic resistance genes. In the course of our research, we successfully selected genes for species-level identification of Streptococci and genes encoding antibiotic resistance proteins. We have developed a reaction to amplify these genes, which allows detecting the presence of S. pyogenes or S. pneumoniae followed by testing their resistance to erythromycin, chloramphenicol and tetracycline. Detection of β-lactamase-encoding genes, which could protect Streptococci against antibiotics from the ampicillin group that are widely used in the treatment of this type of infection, is also under development. The test is carried out directly from the patient's swab, and the results are available 20 to 30 minutes after sample submission, so the test could be performed during the medical visit.
Keywords: antibiotic resistance, Streptococci, respiratory infections, diagnostic test
Procedia PDF Downloads 127
25135 The Use of Software and Internet Search Engines to Develop the Encoding and Decoding Skills of a Dyslexic Learner: A Case Study
Authors: Rabih Joseph Nabhan
Abstract:
This case study explores the impact of two major computer software programs, Learn to Speak English and Learn English Spelling and Pronunciation, and some Internet search engines such as Google on mending the decoding and spelling deficiency of Simon X, a dyslexic student. The improvement in decoding and spelling may result in better reading comprehension and composition writing. Some computer programs and Internet materials can help him regain the missing awareness and consequently restore his self-confidence and self-esteem. In addition, this study provides a systematic plan comprising a set of activities (four computer programs and Internet materials) which address the problem from the lowest to the highest levels of phoneme and phonological awareness. Four methods of data collection (accounts, observations, published tests, and interviews) create the triangulation needed to validly and reliably collect data before, during, and after the plan. The data collected are analyzed quantitatively, qualitatively, or with a combination of both. Tables and figures are utilized to provide a clear and uncomplicated illustration of some data. The improvement in the decoding, spelling, reading comprehension, and composition writing skills that occurred is demonstrated through authentic materials produced by the student under study. Such materials are a comparison between two sample passages written by the learner before and after the plan, a genuine computer chat conversation, and the scores of the academic year that followed the execution of the plan. Based on these results, the researcher recommends further studies on other Lebanese dyslexic learners using the computer to mend their language problems, in order to design a most reliable software program that can address this disability more efficiently and successfully.
Keywords: analysis, awareness, dyslexic, software
Procedia PDF Downloads 222
25134 Cross Attention Fusion for Dual-Stream Speech Emotion Recognition
Authors: Shaode Yu, Jiajian Meng, Bing Zhu, Hang Yu, Qiurui Sun
Abstract:
Speech emotion recognition (SER) aims to recognize human subjective emotions through in-depth analysis of audio data. How to comprehensively extract emotional information from speech audio and how to effectively fuse the extracted features remain challenging. This paper presents a dual-stream SER framework that embraces both full training and transfer learning of different networks for thorough feature encoding. Besides, a plug-and-play cross-attention fusion (CAF) module is implemented for the valid integration of the dual-stream encoder output. The effectiveness of the proposed CAF module is compared to three other fusion modules (feature summation, feature concatenation, and feature-wise linear modulation) on two databases (RAVDESS and IEMOCAP) using different dual-stream encoders (full training network: DPCNN or TextRCNN; transfer learning network: HuBERT or Wav2Vec2). Experimental results suggest that the CAF module can effectively reconcile conflicts between features from different encoders and outperforms the other three feature fusion modules on the SER task. In the future, the plug-and-play CAF module can be extended for multi-branch feature fusion, and the dual-stream SER framework can be widened for multi-stream data representation to improve recognition performance and generalization capacity.
Keywords: speech emotion recognition, cross-attention fusion, dual-stream, pre-trained
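For illustration, a minimal PyTorch sketch of a cross-attention fusion block in the spirit of the CAF module described above; the dimensions, projection layers, and mean pooling are assumptions, not the authors' implementation:

```python
# Minimal cross-attention fusion sketch for two encoder streams.
# Stream A could be a fully trained network (DPCNN/TextRCNN) and stream B a
# transfer-learning network (HuBERT/Wav2Vec2); all sizes are illustrative.
import torch
import torch.nn as nn

class CrossAttentionFusion(nn.Module):
    def __init__(self, dim_a: int, dim_b: int, dim: int = 256, heads: int = 4):
        super().__init__()
        self.proj_a = nn.Linear(dim_a, dim)  # map both streams to a shared width
        self.proj_b = nn.Linear(dim_b, dim)
        self.attn_ab = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.attn_ba = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, feats_a: torch.Tensor, feats_b: torch.Tensor) -> torch.Tensor:
        a = self.proj_a(feats_a)             # (batch, len_a, dim)
        b = self.proj_b(feats_b)             # (batch, len_b, dim)
        a2, _ = self.attn_ab(query=a, key=b, value=b)  # A attends to B
        b2, _ = self.attn_ba(query=b, key=a, value=a)  # B attends to A
        # pool each cross-attended sequence and concatenate for the classifier
        return torch.cat([a2.mean(dim=1), b2.mean(dim=1)], dim=-1)

caf = CrossAttentionFusion(dim_a=512, dim_b=768)
fused = caf(torch.randn(8, 120, 512), torch.randn(8, 99, 768))
print(fused.shape)  # torch.Size([8, 512])
```

Because each stream queries the other, features from one encoder are reweighted by their agreement with the opposite encoder, which is one way to reconcile conflicts between the two representations.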
Procedia PDF Downloads 73
25133 The Journey of a Malicious HTTP Request
Authors: M. Mansouri, P. Jaklitsch, E. Teiniker
Abstract:
SQL injection on web applications is a very popular kind of attack. There are mechanisms such as intrusion detection systems in order to detect this attack. These strategies often rely on techniques implemented at high layers of the application but do not consider the low level of system calls. The problem with only considering the high-level perspective is that an attacker can circumvent the detection tools using certain techniques such as URL encoding. One technique currently used for detecting low-level attacks on privileged processes is the tracing of system calls. System calls act as a single gate to the Operating System (OS) kernel; they allow catching the critical data at an appropriate level of detail. Our basic assumption is that any type of application, be it a system service, utility program or Web application, "speaks" the language of system calls when having a conversation with the OS kernel. At this level we can see the actual attack while it is happening. We conduct an experiment in order to demonstrate the suitability of system call analysis for detecting SQL injection, and we are able to detect the attack. We therefore conclude that system calls are not only powerful in detecting low-level attacks but that they also enable us to detect high-level attacks such as SQL injection.
Keywords: Linux system calls, web attack detection, interception, SQL
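As a hedged illustration of the idea (not the paper's interception mechanism), a recorded system call trace such as strace output can be scanned for SQL injection indicators inside read()/recvfrom() payloads; the regular expressions below are simplistic placeholders:

```python
# Scan an strace log for SQL-injection indicators in socket/file reads.
# Produce the log with e.g.: strace -f -e trace=read,recvfrom -s 4096 -o app.log ./app
# The patterns are deliberately naive; real detectors model syscall sequences.
import re
import sys

SYSCALL = re.compile(r'(?:^|\s)(read|recv|recvfrom)\(\d+,\s*"(.*)"')
SQLI = re.compile(r"(union\s+select|or\s+1=1|';--|%27|%20or%20)", re.IGNORECASE)

def scan(path: str) -> None:
    with open(path, errors="replace") as log:
        for lineno, line in enumerate(log, 1):
            m = SYSCALL.search(line)
            if m and SQLI.search(m.group(2)):
                print(f"line {lineno}: possible SQL injection in {m.group(1)} payload")

if __name__ == "__main__":
    scan(sys.argv[1])
```

Note that URL-encoded payloads (%27, %20) still appear verbatim in the read buffer, which is why the low-level view is harder to evade than an HTTP-layer filter.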
Procedia PDF Downloads 356
25132 Data Transformations in Data Envelopment Analysis
Authors: Mansour Mohammadpour
Abstract:
Data transformation refers to the modification of any point in a data set by a mathematical function. When applying transformations, the measurement scale of the data is modified. Data transformations are commonly employed to turn data into the appropriate form, which can serve various functions in the quantitative analysis of the data. This study investigates the use of data transformations in Data Envelopment Analysis (DEA). Although data transformations are important options for analysis, they fundamentally alter the nature of the variable, making the interpretation of the results somewhat more complex.
Keywords: data transformation, data envelopment analysis, undesirable data, negative data
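For example, one common transformation for negative or undesirable data is an affine translation that shifts each variable until it is strictly positive; this sketch is illustrative only, and whether the chosen DEA model is translation-invariant must be checked separately:

```python
# Affine translation of negative data before DEA (illustrative assumption).
import numpy as np

def translate_positive(X: np.ndarray, eps: float = 1.0) -> np.ndarray:
    """Shift each column with a non-positive minimum so its minimum becomes eps."""
    mins = X.min(axis=0)
    shift = np.where(mins <= 0, eps - mins, 0.0)
    return X + shift

outputs = np.array([[ 2.0, -3.5],
                    [ 4.0,  1.0],
                    [-1.0,  0.0]])
print(translate_positive(outputs))  # every column is now strictly positive
```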
Procedia PDF Downloads 19
25131 Performance Comparison of Non-Binary RA and QC-LDPC Codes
Abstract:
Repeat-Accumulate (RA) codes are a subclass of LDPC codes with fast encoder structures. In this paper, we consider a non-binary extension of binary LDPC codes over GF(q) and construct a non-binary RA code and a non-binary QC-LDPC code over GF(2^4): the non-binary RA codes are constructed with a linear encoding method, and the non-binary QC-LDPC codes with algebraic construction methods. The BER performance of the RA and QC-LDPC codes over GF(q) is then compared under BP decoding by simulation over Additive White Gaussian Noise (AWGN) channels.
Keywords: non-binary RA codes, QC-LDPC codes, performance comparison, BP algorithm
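The linear RA encoding structure (repeat, interleave, accumulate) can be shown with a binary sketch; the paper works over GF(2^4), so this binary simplification and the fixed pseudo-random interleaver are assumptions for exposition only:

```python
# Toy binary Repeat-Accumulate encoder: repeat q times, permute, running XOR.
import numpy as np

def ra_encode(bits: np.ndarray, q: int = 3, seed: int = 0) -> np.ndarray:
    repeated = np.repeat(bits, q)                  # 1) repetition code
    rng = np.random.default_rng(seed)              # 2) fixed interleaver
    interleaved = repeated[rng.permutation(repeated.size)]
    return np.bitwise_xor.accumulate(interleaved)  # 3) accumulator y_i = y_{i-1} XOR x_i

msg = np.array([1, 0, 1, 1], dtype=np.uint8)
print(ra_encode(msg))  # rate-1/3 codeword for this toy message
```

All three stages are linear, which is what makes the RA encoder fast compared with generic LDPC encoding via a dense generator matrix.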
Procedia PDF Downloads 374
25130 Exploring Affordable Care Practs in Nigeria’s Health Insurance Discourse
Authors: Emmanuel Chinaguh, Kehinde Adeosun
Abstract:
Nigerians die prematurely: life expectancy is 55.75 years, 17.45 years below the world average of 73.2 (Worldometer, 2020). This is due, among other factors, to the country's limited access to high-quality healthcare. To increase access to good and affordable healthcare services, the National Health Insurance Authority (NHIA) Bill 2022 – which repealed the National Health Insurance Scheme Act 2004 – was passed into law. Applying Jacob Mey's (2001) pragmatic act (pract) theory, this study explores how NHIA seeks to actualise these healthcare goals by characterising the general situational prototype, or pragmemes, and pragmatic acts in institutional communications. Data was sourced from the NHIA operational guidelines, which has 147 pages and four sections, and from shared posters on the NHIA Nigeria Twitter handle, which has 14,200 followers. Digital humanities tools, like AntConc and Voyant, were used in the data analysis for text encoding and data visualisation. This study identifies these discourse tokens in the data: advertisement and programmes, standards and accreditation, records and information, and offences and penalties. Advertisement and programmes pract facilitating, propagating, prospecting, advising and informing; standards and accreditation, and records and information, pract stating, informing and instructing; and offences and penalties pract stating and sanctioning. These practs combine to advance the goals of affordable care and universal accessibility to quality healthcare services. The pragmatic acts were marked by these pragmatic tools: shared situational knowledge (SSK), relevance (REL), reference (REF) and inference (INF). This paper adds to the understanding of health insurance discourse in Nigeria as a mediated social practice that promotes the health of Nigerians.
Keywords: affordable care, NHIA, Nigeria’s health insurance discourse, pragmatic acts
Procedia PDF Downloads 81
25129 A High Compression Ratio for a Lossless Image Compression Based on the Arithmetic Coding with the Sorted Run Length Coding: Meteosat Second Generation Image Compression
Authors: Cherifi Mehdi, Lahdir Mourad, Ameur Soltane
Abstract:
Image compression is at the heart of several multimedia techniques. It is used to reduce the number of bits required to represent an image. The Meteosat Second Generation (MSG) satellite allows the acquisition of 12 image files every 15 minutes, which results in large database sizes. In this paper, a novel image compression method based on arithmetic coding with Sorted Run Length Coding (SRLC) for MSG images is proposed. The SRLC finds runs of consecutive identical pixels in the original image and arranges them into a sorted run. The arithmetic coder then encodes the sorted data of the previous stage into a unique code word that represents the binary code stream in the sorted order, boosting the compression ratio. Through this article, we show that our method achieves better results in terms of compression ratio and bit rate than the method based on plain Run Length Coding (RLC) and arithmetic coding. Evaluation criteria such as the compression ratio and the bit rate confirm the efficiency of our image compression method.
Keywords: image compression, arithmetic coding, Run Length Coding, RLC, Sorted Run Length Coding, SRLC, Meteosat Second Generation, MSG
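A hedged sketch of the run-length stage: consecutive identical pixels are collapsed into (value, count) runs, which are then sorted before being handed to the arithmetic coder; sorting by descending run length is an assumption about the SRLC ordering:

```python
# Run-length coding of a flattened image followed by sorting of the runs.
import numpy as np

def sorted_rle(pixels: np.ndarray):
    flat = pixels.ravel()
    change = np.flatnonzero(np.diff(flat)) + 1       # indices where the value changes
    starts = np.concatenate(([0], change))
    counts = np.diff(np.concatenate((starts, [flat.size])))
    runs = list(zip(flat[starts].tolist(), counts.tolist()))
    return sorted(runs, key=lambda r: r[1], reverse=True)

img = np.array([[5, 5, 5, 7],
                [7, 7, 3, 3]], dtype=np.uint8)
print(sorted_rle(img))  # [(5, 3), (7, 3), (3, 2)]
```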
Procedia PDF Downloads 351
25128 Identification and Characterization of Polysaccharide Biosynthesis Protein (CAPD) of Enterococcus faecium
Authors: Liaqat Ali, Hubert E. Blum, Türkân Sakinc
Abstract:
Enterococcus faecium is an emerging multidrug-resistant nosocomial pathogen that has increased dramatically worldwide, causing bacteremia, endocarditis, urinary tract and surgical site infections in immunocompromised patients. The capsular polysaccharides that contribute to pathogenesis through evasion of the host innate immune system are also involved in hindering leukocyte killing of enterococci. The gene cluster (enterococcal polysaccharide antigen) of E. faecalis encodes homologues of many genes involved in polysaccharide biosynthesis. We identified two putative loci of 22 kb and 19 kb, which contained 11 genes encoding glycosyltransferases (GTFs); this was confirmed by genome comparison of already sequenced strains and showed no homology to known capsule genes or the epa locus. As polysaccharide-conjugate vaccines have rapidly emerged as a suitable strategy to combat different pathogenic bacteria, we investigated a polysaccharide biosynthesis CapD protein in E. faecium, which contains 336 amino acids and has a putative function in N-linked glycosylation. The deletion/knock-out capD mutant was constructed and complemented by the homologous recombination method and confirmed by PCR and sequencing. For further characterization and functional analysis, in vitro cell culture and an in vivo mouse infection model were used. Our ΔcapD mutant showed strong hydrophobicity, and all strains exhibited biofilm production. Subsequently, opsonic activity was tested in an opsonophagocytic assay; it was increased in the mutant compared with the complemented and wild-type strains, but a more than two-fold decrease in colonization and adherence to the surface of uroepithelial cells was seen. However, significantly higher bacterial colonization was observed for the capD mutant during animal bacteremia infection. Unlike other polysaccharide biosynthesis proteins, CapD does not seem to be a major virulence factor in enterococci, but further experiments and attention are needed to clarify its function, exact mechanism and involvement in the pathogenesis of enterococcal nosocomial infections, and eventually to develop a vaccine or targeted therapy.
Keywords: E. faecium, pathogenesis, polysaccharides, biofilm formation
Procedia PDF Downloads 331
25127 Object-Scene: Deep Convolutional Representation for Scene Classification
Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang
Abstract:
Traditional image classification is based on an encoding scheme (e.g. Fisher Vector, Vector of Locally Aggregated Descriptors) over low-level image features (e.g. SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNN) carry richer information but lack geometric invariance. For scene classification, there are scattered objects of different sizes, categories, layouts, numbers and so on. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in a different scale range. A scale-wise CNN adaption is reasonable, since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, which is then merged into a single vector by a post-processing step called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences. Hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted per scale. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories. Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which proves that the representation can be applied to other visual recognition tasks.
Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization
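As a reference for the encoding step, a compact Fisher Vector computation (first- and second-order statistics under a diagonal-covariance GMM) with the power and L2 normalization that scale-wise normalization builds on; the descriptor dimensions and GMM size are placeholders:

```python
# Fisher Vector over local descriptors with a diagonal-covariance GMM.
import numpy as np
from sklearn.mixture import GaussianMixture

def fisher_vector(descriptors: np.ndarray, gmm: GaussianMixture) -> np.ndarray:
    n, _ = descriptors.shape
    gamma = gmm.predict_proba(descriptors)                    # (n, k) posteriors
    w, mu, var = gmm.weights_, gmm.means_, gmm.covariances_   # diag covariances
    diff = (descriptors[:, None, :] - mu) / np.sqrt(var)      # (n, k, d)
    u = (gamma[..., None] * diff).sum(0) / (n * np.sqrt(w)[:, None])
    v = (gamma[..., None] * (diff**2 - 1)).sum(0) / (n * np.sqrt(2 * w)[:, None])
    fv = np.concatenate([u.ravel(), v.ravel()])
    fv = np.sign(fv) * np.sqrt(np.abs(fv))                    # power normalization
    return fv / (np.linalg.norm(fv) + 1e-12)                  # L2 normalization

local_feats = np.random.randn(500, 64)  # stand-in for dense CNN activations
gmm = GaussianMixture(n_components=8, covariance_type="diag").fit(local_feats)
print(fisher_vector(local_feats, gmm).shape)  # (2 * 8 * 64,) = (1024,)
```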
Procedia PDF Downloads 331
25126 The Co-Simulation Interface SystemC/Matlab Applied in JPEG and SDR Application
Authors: Walid Hassairi, Moncef Bousselmi, Mohamed Abid
Abstract:
Functional verification is a major part of today’s system design task. Several approaches are available for verification on a high abstraction level, where designs are often modeled using MATLAB/Simulink. However, the different approaches are a barrier to a unified verification flow. In this paper, we propose a co-simulation interface between SystemC and MATLAB/Simulink to enable functional verification of multi-abstraction-level designs. The resulting verification flow is tested on a JPEG compression algorithm. The required synchronization of both simulation environments, as well as data type conversion, is solved using the proposed co-simulation flow. We divided the JPEG encoder into two parts: the DCT, implemented in SystemC, represents the HW part; quantization and entropy encoding, implemented in MATLAB, represent the SW part. For communication and synchronization between these two parts we use an S-Function and the MATLAB engine in Simulink. With this research premise, this study introduces a new SystemC hardware implementation of the DCT. We compare the results of our co-simulation to a pure SW/SW simulation and observe a simulation time reduction of 88.15% for JPEG, while the design efficiency is 90% for SDR.
Keywords: hardware/software, co-design, co-simulation, SystemC, MATLAB, S-function, communication, synchronization
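For reference, the DCT stage that was moved into SystemC hardware corresponds to a 2-D type-II DCT of an 8x8 JPEG block, sketched here in Python with SciPy rather than in SystemC:

```python
# Orthonormal 2-D DCT-II of a level-shifted 8x8 block (the JPEG HW stage).
import numpy as np
from scipy.fftpack import dct

def dct2(block: np.ndarray) -> np.ndarray:
    return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

block = np.arange(64, dtype=float).reshape(8, 8) - 128.0  # level-shifted pixels
coeffs = dct2(block)
print(coeffs[0, 0])  # DC coefficient; quantization and entropy coding follow in SW
```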
Procedia PDF Downloads 401
25125 Artificial Intelligence in Bioscience: The Next Frontier
Authors: Parthiban Srinivasan
Abstract:
With recent advances in computational power and access to sufficient data in the biosciences, artificial intelligence methods are increasingly being used in drug discovery research. These methods are essentially a series of advanced statistics-based exercises that review the past to indicate the likely future. Our goal is to develop a model that accurately predicts biological activity and toxicity parameters for novel compounds. We have compiled a robust library of over 150,000 chemical compounds with different pharmacological properties from the literature and public domain databases. The compounds are stored in the simplified molecular-input line-entry system (SMILES), a commonly used text encoding for organic molecules. We utilize an automated process to generate an array of numerical descriptors (features) for each molecule, and redundant and irrelevant descriptors are eliminated iteratively. Our prediction engine is based on a portfolio of machine learning algorithms, and we found the Random Forest algorithm to be the better choice for this analysis. We captured the non-linear relationships in the data and formed a prediction model with reasonable accuracy by averaging across a large number of randomized decision trees. Our next step is to apply a deep neural network (DNN) algorithm to predict the biological activity and toxicity properties; we expect the DNN algorithm to give better results and improve the accuracy of the prediction. This presentation will review these prominent machine learning and deep learning methods, our implementation protocols, and the usefulness of these techniques in biomedical and health informatics.
Keywords: deep learning, drug discovery, health informatics, machine learning, toxicity prediction
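A minimal sketch of such a Random Forest prediction engine over numerical molecular descriptors; the descriptor matrix and labels below are random stand-ins, whereas in the study the descriptors are generated automatically from the SMILES strings:

```python
# Random Forest over per-compound descriptor vectors with cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X = np.random.rand(1000, 200)        # descriptor matrix (placeholder)
y = np.random.randint(0, 2, 1000)    # activity/toxicity labels (placeholder)

model = RandomForestClassifier(n_estimators=500, n_jobs=-1, random_state=0)
print(cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())
```

Averaging over many randomized trees is what captures the non-linear descriptor-activity relationships mentioned above.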
Procedia PDF Downloads 356
25124 Semirings of Graphs: An Approach Towards the Algebra of Graphs
Authors: Gete Umbrey, Saifur Rahman
Abstract:
Graphs are found to be among the most capable structures in computing, and their abstract structures have been applied in specific computations and algorithms, such as phase encoding controllers, processor microcontrollers, and the synthesis of CMOS switching networks. Motivated by these works, we develop an independent approach to study semiring structures and various properties by defining binary operations which, in fact, seem analogous to an existing definition in some sense, but with a different approach. This work focuses specifically on the construction of semigroup and semiring structures on the set of undirected graphs, and their properties are investigated therein. It is expected that the investigation done here may have interesting applications in theoretical computer science, networking and decision making, and also in the joining of two network systems.
Keywords: graphs, join and union of graphs, semiring, weighted graphs
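For orientation, these are the standard semiring axioms the constructed graph operations must satisfy, writing ⊕ and ⊗ for the two binary operations on undirected graphs (identifying them with graph union and join, as the keywords suggest, is an assumption here):

```latex
% Semiring axioms for (S, \oplus, \otimes) with identities 0 and 1:
\begin{align*}
&(a \oplus b) \oplus c = a \oplus (b \oplus c), \qquad a \oplus b = b \oplus a, \qquad a \oplus 0 = a,\\
&(a \otimes b) \otimes c = a \otimes (b \otimes c), \qquad a \otimes 1 = 1 \otimes a = a,\\
&a \otimes (b \oplus c) = (a \otimes b) \oplus (a \otimes c), \qquad
 (a \oplus b) \otimes c = (a \otimes c) \oplus (b \otimes c),\\
&a \otimes 0 = 0 \otimes a = 0.
\end{align*}
```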
Procedia PDF Downloads 148
25123 Data Compression in Ultrasonic Network Communication via Sparse Signal Processing
Authors: Beata Zima, Octavio A. Márquez Reyes, Masoud Mohammadgholiha, Jochen Moll, Luca de Marchi
Abstract:
This document presents an approach to using compressed sensing for signal encoding and information transfer within a guided-wave sensor network comprised of specially designed frequency steerable acoustic transducers (FSATs). Wave propagation in a damaged plate was simulated using the commercial FEM-based software COMSOL. Guided waves were excited by means of FSATs, characterized by the special shape of their electrodes, and modeled using PIC255 piezoelectric material. The special shape of the FSAT allows focusing wave energy in a certain direction according to the frequency components of its actuation signal, which makes a larger monitored area available. The process begins when an FSAT detects and records a reflection from damage in the structure; this signal is then encoded and prepared for transmission using a combined approach based on Compressed Sensing Matching Pursuit and Quadrature Amplitude Modulation (QAM). Once the signal is encoded into a binary stream, the information is transmitted between the nodes in the network. The message reaches the last node, where it is finally decoded and processed to be used for damage detection and localization purposes. The main aim of the investigation is to determine the location of detected damage using the reconstructed signals. The study demonstrates that the special steering capabilities of FSATs not only facilitate the detection of damage but also permit transmitting the damage information to a chosen area in a specific direction of the investigated structure.
Keywords: data compression, ultrasonic communication, guided waves, FEM analysis
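Only the sparse-coding step is sketched below: a plain matching-pursuit loop that greedily selects the dictionary atom most correlated with the residual. The random dictionary is an assumption, and the QAM transmission stage is not reproduced:

```python
# Greedy matching pursuit against a unit-norm dictionary.
import numpy as np

def matching_pursuit(y: np.ndarray, D: np.ndarray, n_iter: int = 10):
    """D has unit-norm columns (atoms); returns coefficients and residual."""
    residual = y.copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iter):
        corr = D.T @ residual
        k = np.argmax(np.abs(corr))        # best-matching atom
        coeffs[k] += corr[k]
        residual -= corr[k] * D[:, k]
    return coeffs, residual

D = np.random.randn(128, 512)
D /= np.linalg.norm(D, axis=0)             # normalize the atoms
y = 2.0 * D[:, 7] - 0.5 * D[:, 42]         # a 2-sparse test signal
coeffs, r = matching_pursuit(y, D, n_iter=5)
print(np.flatnonzero(np.round(coeffs, 3)), np.linalg.norm(r))
```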
Procedia PDF Downloads 123
25122 Negativization: A Focus Strategy in Basà Language
Authors: Imoh Philip
Abstract:
Basà is classified as belonging to the Kainji family, under the sub-phylum Western Kainji known as Rubasa (Basa Benue) (Croizier & Blench, 1992:32). Basà is an under-described language spoken in North-Central Nigeria. The language is characterized by subject-verb-object (henceforth SVO) as its canonical word order. Data for this work is sourced from the researcher's native intuition of the language, corroborated with careful observation of native speakers. This paper investigates a syntactic derivational strategy of information-structure encoding in Basà. It focuses on a negative operator as a strategy for focusing the constituent or clause that follows it while negativizing the whole proposition. Items that are not nouns have to undergo an obligatory nominalization process, either by affixation, modification or conversion, before they are moved to the preverbal position for these operations. The study discovers, and provides evidence showing, that different constituents in the sentence, such as the subject, direct object, indirect object, genitive, verb phrase, prepositional phrase, clause, ideophone, etc., can be focused with the same negativizing operator. The process is characterized by focusing the preverbal NP constituent alone, whereas the whole proposition is negated. The study can stimulate similar studies, or be replicated, in other languages.
Keywords: negation, focus, Basà, nominalization
Procedia PDF Downloads 594
25121 OCR/ICR Text Recognition Using ABBYY FineReader as an Example Text
Authors: A. R. Bagirzade, A. Sh. Najafova, S. M. Yessirkepova, E. S. Albert
Abstract:
This article describes a text recognition method based on Optical Character Recognition (OCR). The features of the OCR method were examined using the ABBYY FineReader program, and automatic text recognition in images is described. OCR is necessary because optical input devices can only transmit raster graphics as a result. Text recognition describes the task of recognizing letters shown as such, in order to identify them and assign them a numerical value in accordance with the usual text encoding (ASCII, Unicode). The peculiarity of this study, conducted by the authors using the example of ABBYY FineReader, is that it confirmed and showed in practice the improvement of digital text recognition platforms developed by Electronic Publication.
Keywords: ABBYY FineReader system, algorithm symbol recognition, OCR/ICR techniques, recognition technologies
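ABBYY FineReader is proprietary; as an open-source counterpart to the workflow described above, a minimal Python example using Tesseract OCR (the input file name is assumed):

```python
# Recognize text from a raster image and map it to Unicode characters.
from PIL import Image
import pytesseract  # pip install pytesseract; also requires the tesseract binary

image = Image.open("scanned_page.png")          # assumed input raster image
text = pytesseract.image_to_string(image, lang="eng")
print(text)                                     # recognized characters as text
```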
Procedia PDF Downloads 166
25120 Neurophysiology of Domain Specific Execution Costs of Grasping in Working Memory Phases
Authors: Rumeysa Gunduz, Dirk Koester, Thomas Schack
Abstract:
Previous behavioral studies have shown that working memory (WM) and manual actions share limited-capacity cognitive resources, which in turn results in execution costs of manual actions in WM. However, to the best of our knowledge, there is no study investigating the neurophysiology of these execution costs. The current study aims to fill this research gap by investigating the neurophysiology of the execution costs of grasping in WM phases (encoding, maintenance, retrieval), considering the verbal and visuospatial domains of WM. A WM-grasping dual-task paradigm was implemented to examine execution costs. The baseline single task required performing a verbal or visuospatial version of a WM task; the dual task required performing the WM task embedded in a high-precision grasp-to-place task. 30 participants were tested in a 2 (single vs. dual task) x 2 (visuospatial vs. verbal WM) within-subject design. Event-related potentials (ERPs) were extracted for each WM phase separately in the single and dual tasks. Memory performance for visuospatial WM, but not for verbal WM, was significantly lower in the dual task compared to the single task. Encoding-related ERPs in the single task revealed different ERPs for verbal WM and visuospatial WM at bilateral anterior sites and a right posterior site. In the dual task, the bilateral anterior difference disappeared due to bilaterally increased anterior negativities for visuospatial WM. Maintenance-related ERPs in the dual task revealed different ERPs for verbal WM and visuospatial WM at bilateral posterior sites; there was also anterior negativity for visuospatial WM. Retrieval-related ERPs in the single task revealed different ERPs for verbal WM and visuospatial WM at bilateral posterior sites, whereas in the dual task there was no difference between verbal WM and visuospatial WM. The behavioral and ERP findings suggest that the execution of grasping shares cognitive resources only with visuospatial WM, which in turn results in domain-specific execution costs. Moreover, the ERP findings suggest unique patterns of costs in each WM phase, which supports the idea that each WM phase reflects a separate cognitive process. This study contributes not only to the understanding of the cognitive principles of manual action control but also to the understanding of WM as an entity consisting of separate modalities and cognitive processes.
Keywords: dual task, grasping execution, neurophysiology, working memory domains, working memory phases
Procedia PDF Downloads 425
25119 Hardware Implementation of Local Binary Pattern Based Two-Bit Transform Motion Estimation
Authors: Seda Yavuz, Anıl Çelebi, Aysun Taşyapı Çelebi, Oğuzhan Urhan
Abstract:
Nowadays, demand for devices capable of real-time video transmission is ever-increasing, and high-resolution videos have made efficient video compression techniques an essential component for capturing and transmitting video data. Motion estimation has a critical role in encoding raw video; hence, various motion estimation methods have been introduced to compress video efficiently. Motion estimation methods based on low bit-depth representation facilitate the computation of matching criteria and thus provide a small hardware footprint. In this paper, a hardware implementation of a two-bit-transform-based low-complexity motion estimation method using a local binary pattern approach is proposed. Image frames are represented at two-bit depth instead of full depth by making use of the local binary pattern as the binarization approach, and the binarization part of the hardware architecture is explained in detail. Experimental results demonstrate the differences between the proposed hardware architecture and the architectures of well-known low-complexity motion estimation methods in terms of important aspects such as resource utilization and energy and power consumption.
Keywords: binarization, hardware architecture, local binary pattern, motion estimation, two-bit transform
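A simplified sketch of two-bit-transform matching: each frame is reduced to two bit planes, and candidate blocks are compared by the number of non-matching points (NNMP). The thresholds below (mean, mean plus one standard deviation) stand in for the paper's LBP-based binarization and are assumptions:

```python
# Two-bit representation and NNMP matching criterion (simplified).
import numpy as np

def two_bit_transform(frame: np.ndarray):
    mu, sigma = frame.mean(), frame.std()   # global statistics for brevity
    return frame > mu, frame > mu + sigma   # two binary planes = 2-bit depth

def nnmp(planes_a, planes_b) -> int:
    # a pixel mismatches if it differs in either bit plane
    return int(np.sum((planes_a[0] ^ planes_b[0]) | (planes_a[1] ^ planes_b[1])))

cur = np.random.randint(0, 256, (16, 16))
ref = np.roll(cur, shift=2, axis=1)         # simulate 2-pixel horizontal motion
print(nnmp(two_bit_transform(cur), two_bit_transform(ref)))
```

In hardware, the XOR/OR/popcount structure of NNMP replaces the adders and absolute differences of full-depth SAD, which is the source of the small footprint.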
Procedia PDF Downloads 309
25118 Processing Big Data: An Approach Using Feature Selection
Authors: Nikat Parveen, M. Ananthi
Abstract:
Big data is one of the emerging technologies that collects data from various sensors for use in many fields. Data retrieval is a major issue, as there is a need to extract exactly the data that are required. In this paper, a large data set is processed by using feature selection. Feature selection helps to choose the data that are actually needed to process and execute the task. The key value is what helps to point out the exact data available in the storage space. Here the available data is streamed, and R-Center is proposed to achieve this task.
Keywords: big data, key value, feature selection, retrieval, performance
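As a generic illustration of the feature selection step (the paper's R-Center method is not reproduced here), a univariate selector that keeps only the k most informative columns; the scoring function, k, and the data are assumptions:

```python
# Keep only the k columns most associated with the task labels.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

X = np.random.rand(10_000, 300)      # streamed sensor records (placeholder)
y = np.random.randint(0, 3, 10_000)  # task labels (placeholder)

selector = SelectKBest(score_func=f_classif, k=20).fit(X, y)
X_reduced = selector.transform(X)    # only the 20 selected columns remain
print(X_reduced.shape)               # (10000, 20)
```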
Procedia PDF Downloads 338
25117 A Web-Based Systems Immunology Toolkit Allowing the Visualization and Comparative Analysis of Publically Available Collective Data to Decipher Immune Regulation in Early Life
Authors: Mahbuba Rahman, Sabri Boughorbel, Scott Presnell, Charlie Quinn, Darawan Rinchai, Damien Chaussabel, Nico Marr
Abstract:
Collections of large-scale datasets made available in public repositories can be used to identify and fill gaps in biomedical knowledge. But first, these data need to be made readily accessible to researchers for analysis and interpretation. Here, a collection of transcriptome datasets was made available to investigate the functional programming of human hematopoietic cells in early life. Thirty-two datasets were retrieved from the NCBI Gene Expression Omnibus (GEO) and loaded into a custom, interactive web application called the Gene Expression Browser (GXB), designed for visualization and querying of integrated large-scale data. Multiple sample groupings and gene rank lists were created based on the study design and variables in each dataset. Web links to customized graphical views can be generated by users and subsequently used to present data graphically in manuscripts for publication. The GXB tool also enables browsing of a single gene across datasets, which can provide information on the role of a given molecule across biological systems. The dataset collection is available online. As a proof of principle, one of the datasets (GSE25087) was re-analyzed to identify genes that are differentially expressed by regulatory T cells in early life. Re-analysis of this dataset and a cross-study comparison using multiple other datasets in the above-mentioned collection revealed that PMCH, a gene encoding a precursor of melanin-concentrating hormone (MCH), a cyclic neuropeptide, is highly expressed in a variety of other hematopoietic cell types, including neonatal erythroid cells as well as plasmacytoid dendritic cells upon viral infection. Our findings suggest an as yet unrecognized role of MCH in immune regulation, thereby highlighting the unique potential of the curated dataset collection and a systems biology approach to generate new hypotheses which can be tested in future mechanistic studies.
Keywords: early-life, GEO datasets, PMCH, interactive query, systems biology
Procedia PDF Downloads 296
25116 Angular-Coordinate Driven Radial Tree Drawing
Authors: Farshad Ghassemi Toosi, Nikola S. Nikolov
Abstract:
We present a visualization technique for the radial drawing of trees consisting of two slightly different algorithms, both of which make use of node-link diagrams for visual encoding. This visualization creates clear drawings without edge crossings. One of the algorithms is suitable for real-time visualization of large trees, as it requires minimal recalculation of the layout if leaves are inserted into or removed from the tree, while the other algorithm makes better use of the drawing space. The algorithms are very similar and follow almost the same procedure, but with different parameters. Both algorithms assign angular coordinates to all nodes, which are then converted into 2D Cartesian coordinates for visualization. We present both algorithms and discuss how they compare to each other.
Keywords: radial drawing, visualization, algorithm, node-link diagrams
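A minimal sketch of the shared angular-coordinate procedure under assumed conventions: leaves receive evenly spaced angles in DFS order, an internal node receives the mean angle of its children, the radius grows with depth, and polar coordinates are converted to Cartesian:

```python
# Radial tree layout: angular coordinates first, then polar -> Cartesian.
import math

def radial_layout(tree: dict, root: str) -> dict:
    """tree maps node -> list of children; returns node -> (x, y)."""
    leaves, angle, pos = [], {}, {}

    def collect(node):  # DFS so each subtree spans a contiguous angular wedge
        if not tree.get(node):
            leaves.append(node)
        for child in tree.get(node, []):
            collect(child)

    collect(root)
    for i, leaf in enumerate(leaves):
        angle[leaf] = 2 * math.pi * i / len(leaves)

    def place(node, depth):
        for child in tree.get(node, []):
            place(child, depth + 1)
        if tree.get(node):  # internal node: mean angle of its children
            angle[node] = sum(angle[c] for c in tree[node]) / len(tree[node])
        pos[node] = (depth * math.cos(angle[node]), depth * math.sin(angle[node]))

    place(root, 0)
    return pos

tree = {"r": ["a", "b"], "a": ["a1", "a2"], "b": [], "a1": [], "a2": []}
print(radial_layout(tree, "r"))
```

Because each subtree occupies its own angular wedge, edges drawn this way cannot cross, and inserting or removing a leaf only perturbs the angles, which is what makes the real-time variant cheap to update.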
Procedia PDF Downloads 337
25115 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective
Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou
Abstract:
The analysis of geographic inequality relies heavily on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and link them to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, where only the overall status at the selected levels of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the outcomes of the analysis, well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan, based on spatial partition principles that consider homogeneity in the number of population and households. Compared to the traditional township units, the TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select an appropriate dissemination level for publishing statistical data. This paper compares the results of using the TGSC and the township unit, respectively, on mortality data and examines the spatial characteristics of the outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) ranges from 571 to 1757 per 100,000 persons, whereas the 2nd dissemination area (TGSC) shows greater variation, ranging from 0 to 2222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment). The management and analysis of the statistical data referring to the TGSC in this research is strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow is developed that consists of the processing of death certificates, the geocoding of street addresses, the quality assurance of geocoded results, the automatic calculation of statistical measures, the standardized encoding of measures, and the geo-visualization of statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of mortality data and justify the analyzed results. With a common statistical area framework like the TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways, helping to avoid wrong decision making.
Keywords: mortality map, spatial patterns, statistical area, variation
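For reference, the direct age-standardization behind an ASDR figure, sketched with made-up numbers: each age group's death rate is weighted by a standard population and scaled to 100,000 persons:

```python
# Direct age-standardized death rate (per 100,000), illustrative numbers only.
import numpy as np

def asdr(deaths, population, std_pop) -> float:
    rates = np.asarray(deaths) / np.asarray(population)  # rate per age group
    weights = np.asarray(std_pop) / np.sum(std_pop)      # standard structure
    return float(np.sum(rates * weights) * 100_000)

deaths     = [12, 85, 310]                 # young / middle / old (made up)
population = [40_000, 35_000, 15_000]
std_pop    = [45_000, 40_000, 15_000]      # reference population structure
print(round(asdr(deaths, population, std_pop)))  # deaths per 100,000
```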
Procedia PDF Downloads 258
25114 Identified Transcription Factors and Gene Regulation in Scent Biosynthesis in Ophrys Orchids
Authors: Chengwei Wang, Shuqing Xu, Philipp M. Schlüter
Abstract:
The genus Ophrys is remarkable for its mimicry, with the flower lip closely resembling pollinator females in a species-specific manner. Therefore, floral traits associated with pollinator attraction, especially scent, are suitable models for investigating the molecular basis of adaptation, speciation, and evolution. Within the two Ophrys species groups, O. sphegodes (S) and O. fusca (F), pollinator shifts among the same insect species have taken place. Preliminary data suggest that they involve a comparable hydrocarbon profile in the scent, which is mainly composed of alkanes and alkenes. Genes encoding stearoyl-acyl carrier protein desaturases (SAD), involved in alkene biosynthesis, have been identified in the S group. This study aims to investigate the control and parallel evolution of ecologically significant alkene production in Ophrys. Owing to the central role those SAD genes play in determining the positioning of the alkene double bonds, a detailed understanding of their functional mechanism and of regulatory aspects is of utmost importance. We have identified five transcription factors potentially related to SAD expression in O. sphegodes, which belong to the MYB, GTE, WRKY, and MADS families. Ultimately, our results will contribute to understanding the genes important in the regulatory control of floral scent synthesis.
Keywords: floral traits, transcription factors, biosynthesis, parallel evolution
Procedia PDF Downloads 98
25113 A User Interface for Easiest Way Image Encryption with Chaos
Authors: D. López-Mancilla, J. M. Roblero-Villa
Abstract:
Since 1990, research on chaotic dynamics has received considerable attention, particularly in light of potential applications of this phenomenon in secure communications. Data encryption using chaotic systems was reported in the 90's as a new approach for signal encoding that differs from the conventional methods, which use numerical algorithms as the encryption key. Algorithms for image encryption have received a lot of attention because of the need to secure image transmission in real time over the internet and wireless networks. Known algorithms for image encryption, like the Data Encryption Standard (DES), have the drawback of low efficiency when the image is large. Chaos-based encryption offers a new and efficient way to obtain fast and highly secure image encryption. In this work, a user interface for image encryption and a novel, very simple way to encrypt images using chaos are presented. The main idea is to reshape an image into an n-dimensional vector and combine it with a vector extracted from a chaotic system, in such a way that the image vector is hidden within the chaotic vector; the result is then reshaped back into an array with the original dimensions of the image. The security of the encrypted images is assessed using statistical analysis, and an optimization stage is used to improve the security of the image encryption while, at the same time, allowing the image to be accurately recovered. The user interface uses the algorithms designed for image encryption and allows reading an image from the hard drive or another external device. It encrypts the image in one of three encryption modes, given by three different chaotic systems that the user can choose. Once the image is encrypted, it is possible to inspect the security analysis and save the image to the hard disk. The main results of this study show that this simple encryption method, together with the optimization stage, achieves encryption security competitive with the more complicated encryption methods used in other works. In addition, the user interface allows encrypting an image with chaos and submitting it through any public communication channel, including the internet.
Keywords: image encryption, chaos, secure communications, user interface
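A hedged sketch of the core idea, with a logistic map standing in for the chaotic systems offered by the interface (the actual combination step and optimization stage are not reproduced): the image is flattened into a vector, a chaotic keystream of equal length is generated, and the two are combined by XOR, with r and x0 acting as the secret key:

```python
# Logistic-map keystream XORed with the flattened image; the same call decrypts.
import numpy as np

def logistic_keystream(n: int, x0: float = 0.7, r: float = 3.99) -> np.ndarray:
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)           # chaotic iteration
        xs[i] = x
    return (xs * 255).astype(np.uint8)  # quantize to a byte keystream

def encrypt(image: np.ndarray, x0: float = 0.7, r: float = 3.99) -> np.ndarray:
    flat = image.ravel()                # reshape the image into a vector
    key = logistic_keystream(flat.size, x0, r)
    return (flat ^ key).reshape(image.shape)

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
enc = encrypt(img)
assert np.array_equal(encrypt(enc), img)  # XOR is its own inverse
```

Sensitivity to x0 and r is what makes the keystream hard to reproduce without the exact key, and statistical tests (histogram flatness, adjacent-pixel correlation) are the usual way to assess the result.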
Procedia PDF Downloads 489
25112 Applications of Big Data in Education
Authors: Faisal Kalota
Abstract:
Big Data and analytics have gained huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA), which may allow academic institutions to better understand learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in education. Additionally, it discusses some of the concerns related to Big Data and current research trends. While Big Data can provide big benefits, it is important that institutions understand their own needs, infrastructure, resources, and limitations before jumping on the Big Data bandwagon.
Keywords: big data, learning analytics, analytics, big data in education, Hadoop
Procedia PDF Downloads 422
25111 Fruiting Body Specific Sc4 Hydrophobin Gene Plays a Role in Schizophyllum Commune Hyphal Attachment to Structured Glass Surfaces
Authors: Evans Iyamu
Abstract:
Genes encoding hydrophobins play distinct roles at different stages of the life cycle of fungi, and they foster hyphal attachment to surfaces. The hydrophobin Sc4 is known to provide a hydrophobic membrane lining of the gas channels within Schizophyllum commune fruiting bodies. Here, we cultivated the non-fruiting, monokaryotic S. commune 12-43 on glass surfaces, where growth could be verified by micrography. Differential gene expression profiling of nine hydrophobin genes and the hydrophobin-like sc15 gene by quantitative PCR showed significant up-regulation of sc4 when S. commune was attached to glass surfaces, which was also confirmed by RNA-Seq data analysis. Another silicate, namely quartz sand, was investigated, and induction of sc4 was seen as well. The up-regulation of the hydrophobin gene sc4 may indicate involvement in S. commune hyphal attachment to glass as well as quartz surfaces. We propose that the covering of hyphae by Sc4 allows for direct interaction with the hydrophobic surfaces of silicates and that the differential functions of specific hydrophobin genes depend on the surface interface involved. This study could help clarify the biological functions of hydrophobins in natural surroundings, including hydrophobic surface attachment. The analysis of growth on glass therefore serves as a basis for understanding the interaction of S. commune with glass surfaces, while providing the possibility to visualize the interaction microscopically.
Keywords: hydrophobin, structured glass surfaces, differential gene expression, quartz sand
Procedia PDF Downloads 117
25110 Speeding-up Gray-Scale FIC by Moments
Authors: Eman A. Al-Hilo, Hawraa H. Al-Waelly
Abstract:
In this work, a fractal image compression (FIC) technique is introduced, based on using moment features to index the zero-mean range-domain blocks. The moment features are used to speed up the IFS-matching stage: a moments-ratio descriptor filters the domain blocks and keeps only the blocks that are suitable to be IFS-matched with the tested range block. The results of tests conducted on the Lena and Cat images (256 pixels, 24 bits/pixel resolution) showed a minimum encoding time (0.89 sec for the Lena image and 0.78 sec for the Cat image) with appropriate PSNR (30.01 dB for the Lena image and 29.8 dB for the Cat image). The reduction in encoding time (ET) is about 12% for the Lena image and 67% for the Cat image.
Keywords: fractal gray level image, fractal compression technique, iterated function system, moments feature, zero-mean range-domain block
Procedia PDF Downloads 490
25109 Comparison of Existing Predictor and Development of Computational Method for S-Palmitoylation Site Identification in Arabidopsis Thaliana
Authors: Ayesha Sanjana Kawser Parsha
Abstract:
S-acylation is an irreversible bond in which cysteine residues are linked to the fatty acids palmitate (74%) or stearate (22%), either at the COOH or NH2 terminal, via a thioester linkage. Several experimental methods can be used to identify S-palmitoylation sites; however, since they require a lot of time, computational methods are becoming increasingly necessary. There are not many predictors, however, that can locate S-palmitoylation sites in Arabidopsis thaliana with sufficient accuracy, and this research is based on the importance of building a better prediction tool. To identify the type of machine learning algorithm that predicts this site more accurately for the experimental dataset, several prediction tools were examined in this research, including GPS PALM 6.0, pCysMod, GPS LIPID 1.0, CSS PALM 4.0, and NBA PALM. These analyses were conducted by constructing the receiver operating characteristic (ROC) plot and computing the area under the curve (AUC) score. An AI-driven, deep-learning-based prediction tool was then developed utilizing three kinds of sequence-based input data: the amino acid composition, the binary encoding profile, and autocorrelation features. The model was developed using five layers, two activation functions, and the associated parameters and hyperparameters. The model was built using various combinations of features, and after training and validation, it performed better when all the features were present, using the experimental dataset with 8- and 10-fold cross-validation. When testing the model with unseen and new data, such as the GPS PALM 6.0 plant data and the pCysMod mouse data, the model performed well, with an area under the curve score near 1. Comparing the 10-fold cross-validation AUC of the new model with the AUC scores of the established tools on their respective training sets demonstrates that this model outperforms the prior tools in predicting S-palmitoylation sites in the experimental dataset. The objective of this study is to develop a prediction tool for Arabidopsis thaliana that is more accurate than current tools, as measured by the area under the curve score. Both plant food production and immunological treatment targets can be managed by utilizing this method to forecast S-palmitoylation sites.
Keywords: S-palmitoylation, ROC plot, area under the curve, cross-validation score
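A sketch of the "binary encoding profile" input: one-hot encoding of the amino acids in a window centered on a candidate cysteine; the window length and the handling of unknown residues are assumptions for illustration:

```python
# One-hot (binary) encoding of a peptide window around a candidate site.
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
IDX = {aa: i for i, aa in enumerate(AA)}

def one_hot(window: str) -> np.ndarray:
    """Encode a peptide window as a flat binary vector of length len(window)*20."""
    vec = np.zeros((len(window), len(AA)), dtype=np.uint8)
    for pos, aa in enumerate(window):
        if aa in IDX:                 # unknown residues such as 'X' stay all-zero
            vec[pos, IDX[aa]] = 1
    return vec.ravel()

window = "GLKVSCAQWTL"                # candidate cysteine at the center
print(one_hot(window).shape)          # (11 * 20,) = (220,)
```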
Procedia PDF Downloads 72