Search results for: coding complexity metric mccabe
2492 Identifying Protein-Coding and Non-Coding Regions in Transcriptomes
Authors: Angela U. Makolo
Abstract:
Protein-coding and Non-coding regions determine the biology of a sequenced transcriptome. Research advances have shown that Non-coding regions are important in disease progression and clinical diagnosis. Existing bioinformatics tools have been targeted towards Protein-coding regions alone; therefore, there are challenges associated with gaining biological insights from transcriptome sequence data. These tools are also limited to computationally intensive sequence alignment, which is inadequate and less accurate for identifying both Protein-coding and Non-coding regions. Alignment-free techniques can overcome this limitation and identify both regions. Therefore, this study was designed to develop an efficient sequence alignment-free model for identifying both Protein-coding and Non-coding regions in sequenced transcriptomes. Feature grouping and randomization procedures were applied to the input transcriptomes (37,503 data points). Successive iterations were carried out to compute the gradient vector that converged the developed Protein-coding and Non-coding Region Identifier (PNRI) model to the approximate coefficient vector. The logistic regression algorithm was used with a sigmoid activation function. A parameter vector was estimated for every sample in the 37,503 data points in a bid to reduce the generalization error and cost. Maximum Likelihood Estimation (MLE) was used for parameter estimation by taking the log-likelihood of six features and combining them into a summation function. Dynamic thresholding was used to classify the Protein-coding and Non-coding regions, and the Receiver Operating Characteristic (ROC) curve was determined. The generalization performance of PNRI was determined in terms of F1 score, accuracy, sensitivity, and specificity, and its average generalization performance was determined using a benchmark of multi-species organisms. The generalization error for identifying Protein-coding and Non-coding regions decreased from 0.514 to 0.508 and then to 0.378 over three iterations. The cost (the difference between the predicted and the actual outcome) also decreased from 1.446 to 0.842 and then to 0.718 for the first, second, and third iterations, respectively. The iterations terminated at the 390th epoch with an error of 0.036 and a cost of 0.316. The computed elements of the parameter vector that maximized the objective function were 0.043, 0.519, 0.715, 0.878, 1.157, and 2.575. The PNRI gave an ROC of 0.97, indicating improved predictive ability, and identified both Protein-coding and Non-coding regions with an F1 score of 0.970, accuracy of 0.969, sensitivity of 0.966, and specificity of 0.973. Using 13 non-human multi-species model organisms, the average generalization performance of the traditional method was 74.4%, while that of the developed model was 85.2%, making the developed model better at identifying Protein-coding and Non-coding regions in transcriptomes. The developed Protein-coding and Non-coding Region Identifier model efficiently identified both transcriptomic regions and could be used in genome annotation and in the analysis of transcriptomes.
Keywords: sequence alignment-free model, dynamic thresholding classification, input randomization, genome annotation
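The training loop described above maps onto standard logistic regression machinery. A minimal sketch, assuming six hypothetical transcript features per sample and a fixed learning rate (neither is specified in the abstract):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_pnri(X, y, lr=0.1, epochs=390):
    """Logistic regression fit by gradient ascent on the log-likelihood.

    X : (n_samples, 6) feature matrix; y : 0/1 labels
    (coding = 1, non-coding = 0 is an assumption made here).
    """
    n, d = X.shape
    theta = np.zeros(d)                  # parameter vector to estimate
    for _ in range(epochs):
        p = sigmoid(X @ theta)           # predicted P(coding | features)
        grad = X.T @ (y - p) / n         # gradient of the log-likelihood
        theta += lr * grad
    return theta

def classify(X, theta, threshold=0.5):
    """Threshold the sigmoid output. The paper uses a dynamic threshold;
    the fixed 0.5 here is only a placeholder."""
    return (sigmoid(X @ theta) >= threshold).astype(int)
```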
Procedia PDF Downloads 68
2491 A Theoretical Study on Pain Assessment through Human Facial Expression
Authors: Mrinal Kanti Bhowmik, Debanjana Debnath Jr., Debotosh Bhattacharjee
Abstract:
Facial expression is an undeniable part of human manner. It is a significant channel for human communication and can be used to extract emotional features accurately. People in pain often show variations in facial expression that are readily observable to others. A core set of actions is likely to occur or to increase in intensity when people are in pain. To describe these changes in facial appearance, the Facial Action Coding System (FACS) was pioneered by Ekman and Friesen for human observers. According to Prkachin and Solomon, a set of such actions carries the bulk of the information about pain; on this basis, the Prkachin and Solomon Pain Intensity (PSPI) metric is defined. It is therefore important to note that facial expressions, being a behavioral source in communication media, provide an important opening into the issues of non-verbal communication in pain. People express their pain in many ways, and this pain behavior is the basis on which most inferences about pain are drawn in clinical and research settings. Hence, to understand the roles of different pain behaviors, it is essential to study their properties. For the past several years, studies have concentrated on the properties of one specific form of pain behavior, i.e., facial expression. This paper presents a comprehensive study of pain assessment that can model and estimate the intensity of pain a patient is suffering. It also reviews the historical background of different pain assessment techniques in the context of painful expressions. Different approaches incorporate FACS from psychological views and a pain intensity score using the PSPI metric in pain estimation. This paper presents an in-depth analysis of the different approaches used in pain estimation and the observations found for each technique. It also offers a brief study of the distinguishing features of real and fake pain. The necessity of the study therefore lies in the emerging field of painful face assessment in clinical settings.
Keywords: facial action coding system (FACS), pain, pain behavior, Prkachin and Solomon pain intensity (PSPI)
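The PSPI metric mentioned above is conventionally computed from FACS action-unit (AU) intensities as PSPI = AU4 + max(AU6, AU7) + max(AU9, AU10) + AU43. A minimal sketch of that computation (the AU intensity values below are hypothetical):

```python
def pspi(au4, au6, au7, au9, au10, au43):
    """Prkachin and Solomon Pain Intensity from FACS action units.

    au4  : brow lowering (intensity 0-5)
    au6  : cheek raising (0-5)
    au7  : eyelid tightening (0-5)
    au9  : nose wrinkling (0-5)
    au10 : upper-lip raising (0-5)
    au43 : eye closure (0 or 1)
    """
    return au4 + max(au6, au7) + max(au9, au10) + au43

# Hypothetical frame: moderate brow lowering, strong eyelid tightening
print(pspi(au4=2, au6=1, au7=4, au9=0, au10=1, au43=0))  # -> 7
```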
Procedia PDF Downloads 348
2490 Network Coding with Buffer Scheme in Multicast for Broadband Wireless Network
Authors: Gunasekaran Raja, Ramkumar Jayaraman, Rajakumar Arul, Kottilingam Kottursamy
Abstract:
Broadband Wireless Network (BWN) is a promising technology nowadays due to the increased number of smartphones. The proposed buffering scheme using network coding considers reliability and a proper degree distribution in a Worldwide Interoperability for Microwave Access (WiMAX) multi-hop network. Using network coding, transmission is performed in a secure way, which helps improve throughput and reduces packet loss in the multicast network. At the outset, improved network coding is proposed for a multicast wireless mesh network. Considering the problem of performance overhead, the degree distribution guides the buffering decision in the encoding/decoding process. Consequently, BuS (Buffer Scheme) based on network coding is proposed for the multi-hop network. Here the encoding process introduces a buffer for temporary storage to transmit packets with a proper degree distribution. The simulation results depend on the number of packets received in the encoding/decoding with a proper degree distribution using the buffering scheme.
Keywords: encoding and decoding, buffer, network coding, degree distribution, broadband wireless networks, multicast
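The encoding step can be illustrated in general terms: buffer incoming packets and emit XOR combinations whose degree is drawn from a chosen degree distribution. The distribution and buffer policy below are placeholders for illustration, not the BuS design itself:

```python
import random
from functools import reduce

def encode_from_buffer(buffer_pkts, degree_dist):
    """Emit one coded packet: draw a degree d from the distribution,
    then XOR d randomly chosen buffered packets together."""
    degrees, probs = zip(*degree_dist.items())
    d = random.choices(degrees, weights=probs)[0]
    chosen = random.sample(range(len(buffer_pkts)), k=min(d, len(buffer_pkts)))
    payload = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)),
                     (buffer_pkts[i] for i in chosen))
    return chosen, payload   # the receiver needs the indices to decode

# Hypothetical buffer of four fixed-size packets and a toy degree distribution
buf = [bytes([i] * 8) for i in range(1, 5)]
ids, coded = encode_from_buffer(buf, {1: 0.3, 2: 0.5, 3: 0.2})
print(ids, coded.hex())
```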
Procedia PDF Downloads 411
2489 A Multimodal Approach to Improve the Performance of Biometric System
Authors: Chander Kant, Arun Kumar
Abstract:
Biometric systems automatically recognize an individual based on his/her physiological and behavioral characteristics. There are also some traits, like weight, age, and height, that may not provide reliable user recognition because of their common and temporary nature. These traits are called soft biometric traits. Although soft biometric traits lack the permanence to uniquely and reliably identify an individual, they provide some beneficial evidence about the user's identity and may improve system performance. In this paper, we propose an approach for integrating soft biometrics with fingerprint and face to improve the performance of a personal authentication system. In our approach, we propose a combined architecture of three different sensors to elevate system performance. The approach includes soft biometric, fingerprint, and face traits. We have also demonstrated the efficiency of the proposed system regarding FAR (False Acceptance Ratio) and total response time, with the help of MUBI (Multimodal Biometrics Integration) software.
Keywords: FAR, minutiae point, multimodal biometrics, primary biometric, soft biometric
Procedia PDF Downloads 349
2488 Towards a Goal-Question-Metric Based Approach to Assess Social Sustainability of Software Systems
Authors: Rahma Amri, Narjès Bellamine Ben Saoud
Abstract:
Sustainable development, or sustainability, is one of the most urgent issues in current debate across almost all domains. In particular, the significant way software pervades our lives should place it at the center of sustainability concerns. The social aspects of sustainability have not been well studied in the context of software systems; this is still an immature research field that needs more interest from the researchers' community. This paper presents a Goal-Question-Metric based approach to assess the social sustainability of software systems. The approach is based on a generic social sustainability model taken from the social sciences.
Keywords: software assessment approach, social sustainability, goal-question-metric paradigm, software project metrics
Procedia PDF Downloads 397
2487 Conceptualising Project Complexity in Ghana’s Construction Industry: A Qualitative Study
Authors: Kwasi Dartey-Baah, Mias De Klerk
Abstract:
Project complexity has been cited as one of the essential areas of project management. It can be observed from environmental, social, technological, and organisational viewpoints, and its handling is critical to project success. Although conceptualised in various industries, this paper seeks to ascertain the meaning and understanding of project complexity within the Ghanaian construction industry based on three dimensions of complexity (faith, fact, and interaction) using experts' opinions. Taking the form of a focus group discussion, the paper sought to gain an in-depth understanding of project complexity issues in Ghana's construction industry. The method obtained data from experts (a purposively selected group) comprising project leaders and project management academics. The findings indicated that the experts broadly agreed with the complexity items but offered varied reasons for their agreement. In the composite assessment of the complexity dimensions (faith, fact, and interaction), it emerged that there was some agreement with the complexity dimensions of fact and interaction within Ghana's construction industry. On the other hand, for the dimension of complexity by faith, it was noted that experts in Ghana's construction industry construed complexity by faith not as the absence of evidence but as evidence that hinges on at least one member of the project team. It is expected that further research on project complexity will focus on other industries to enhance knowledge of the subject within the field of project management.
Keywords: project complexity, complexity by faith, complexity by fact, complexity by interaction, construction industry, Ghana
Procedia PDF Downloads 161
2486 A Guide to User-Friendly Bash Prompt: Adding Natural Language Processing Plus Bash Explanation to the Command Interface
Authors: Teh Kean Kheng, Low Soon Yee, Burra Venkata Durga Kumar
Abstract:
In 2022, as the world becomes increasingly computer-related, more individuals are attempting to study coding, on their own or in school, because they have discovered the value of learning to code and the benefits it will provide them. But learning coding is difficult for most people; even senior programmers with a decade of experience still need help from online sources while coding. The reason is that coding is not like talking to other people: it has a specific syntax to make the computer understand what we want it to do, so coding will be hard for people who have had no prior contact with the field. If a user wants to learn bash code with the bash prompt, it will be harder still, because the bash prompt is just an empty box waiting for the user to tell the computer what to do; without referring to the internet, a new user will not know what can be done with the prompt. From this, we can conclude that the bash prompt is not user-friendly for new users who are learning bash code. Our goal in writing this paper is to propose a user-friendly bash prompt in Ubuntu OS using Artificial Intelligence (AI) to lower the threshold of learning bash code, letting users write and learn bash code with their own words and concepts.
Keywords: user-friendly, bash code, artificial intelligence, threshold, semantic similarity, lexical similarity
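One inexpensive way to approximate the lexical-similarity component named in the keywords is to fuzzy-match a user's plain-English request against a table of known commands. A toy sketch; the command table and matching cutoff are invented for illustration:

```python
import difflib

# Hypothetical mapping from natural-language intents to bash commands
INTENTS = {
    "list files in this folder": "ls -la",
    "show current directory": "pwd",
    "search for text in files": "grep -rn PATTERN .",
    "show disk usage": "df -h",
}

def suggest(query, cutoff=0.4):
    """Return the bash command whose intent is lexically closest to the query."""
    match = difflib.get_close_matches(query, INTENTS.keys(), n=1, cutoff=cutoff)
    return INTENTS[match[0]] if match else None

print(suggest("list the files here"))  # fuzzy lexical match -> 'ls -la'
```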
Procedia PDF Downloads 143
2485 Polymorphic Positions, Haplotypes, and Mutations Detected in the Mitochondrial DNA Coding Region by Sanger Sequence Technique
Authors: Imad H. Hameed, Mohammad A. Jebor, Ammera J. Omer
Abstract:
The aim of this research is to study the mitochondrial coding region using the Sanger sequencing technique and to establish the degree of variation characteristic of a fragment. FTA® Technology (FTA™ paper DNA extraction) was utilized to extract DNA. A portion of the coding region encompassing positions 11719–12384 was amplified in accordance with the Anderson reference sequence. PCR products were purified by EZ-10 spin column, then sequenced and detected using the ABI 3730xL DNA Analyzer. Five new polymorphic positions, 11741, 11756, 11878, 11887, and 12133, are described; they may be suitable markers for identification purposes in the future. The calculated values D = 0.95 and RMP = 0.048 of the genetic diversity should be understood as high in the context of the coding function of the analysed DNA fragment. A relatively high gene diversity and a relatively low random match probability were observed in the Iraqi population. The obtained data can be used to identify variable nucleotide positions characterized by frequent occurrence, which is most promising for various identifications.
Keywords: coding region, Iraq, mitochondrial DNA, polymorphic positions, Sanger technique
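The two summary statistics quoted (D = 0.95, RMP = 0.048) follow standard forensic formulas: the random match probability is the sum of squared haplotype frequencies, RMP = Σpᵢ², and Nei's gene diversity is D = n(1 − Σpᵢ²)/(n − 1). A minimal sketch with hypothetical haplotype counts:

```python
from collections import Counter

def diversity_and_rmp(haplotypes):
    """Nei's gene diversity D and random match probability RMP
    from a list of observed haplotypes (one entry per individual)."""
    n = len(haplotypes)
    counts = Counter(haplotypes)
    sum_p2 = sum((c / n) ** 2 for c in counts.values())
    rmp = sum_p2
    d = n * (1 - sum_p2) / (n - 1)
    return d, rmp

# Hypothetical sample: haplotype labels for ten individuals
sample = ["H1", "H2", "H3", "H1", "H4", "H5", "H6", "H7", "H8", "H9"]
print(diversity_and_rmp(sample))
```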
Procedia PDF Downloads 437
2484 An Inventory Management Model to Manage the Stock Level for Irregular Demand Items
Authors: Riccardo Patriarca, Giulio Di Gravio, Francesco Costantino, Massimo Tronci
Abstract:
An accurate inventory management policy plays a crucial role in several high-availability sectors. In these sectors, due to the high cost of spares and backorders, an (S-1, S) replenishment policy is necessary for high-availability items. The policy enables the shipment of a substitute item any time the inventory size decreases by one. This policy can be modelled following the Multi-Echelon Technique for Recoverable Item Control (METRIC). METRIC is a system-based technique that allows defining the optimum stock level in a multi-echelon network, adopting measures in line with the decision-maker's perspective. METRIC defines an availability-cost function with inventory costs and required service levels, using as inputs data about the demand trend, the supply and maintenance characteristics of the network, and the budget/availability constraints. The traditional METRIC relies on the hypothesis that a Poisson distribution represents well the demand distribution of items with a low failure rate. In this research, we explore the effects of using a Poisson distribution to model the demand of low-failure-rate items characterized by an irregular demand trend. This characteristic of demand is not included in the traditional METRIC formulation, leading to the need to revise it. Using the CV (Coefficient of Variation) and ADI (Average inter-Demand Interval) classification, we define the inherent flaws of Poisson-based METRIC for irregular demand items and define an innovative ad hoc distribution which can better fit irregular demands. This distribution allows defining proper stock levels to reduce stocking and backorder costs due to the high irregularity in the demand trend. A case study in the aviation domain clarifies the benefits of this innovative METRIC approach.
Keywords: METRIC, inventory management, irregular demand, spare parts
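Under the Poisson demand assumption, the core METRIC computation is the expected number of backorders at base stock level s, EBO(s) = Σ_{x>s} (x − s)·P(X = x), which drives the availability-cost trade-off. A minimal sketch, truncating the infinite sum; the demand rate, target, and unit cost are hypothetical:

```python
from math import exp

def expected_backorders(s, lam, tail=200):
    """EBO(s) for Poisson(lam) demand during resupply, truncated at `tail`."""
    ebo, p = 0.0, exp(-lam)      # p = P(X = 0)
    for x in range(1, tail):
        p *= lam / x             # P(X = x) from P(X = x - 1)
        if x > s:
            ebo += (x - s) * p
    return ebo

def cheapest_stock_for_target(lam, max_ebo, unit_cost):
    """Smallest base stock level meeting an EBO target, and its cost
    (a stand-in for METRIC's availability-cost optimisation)."""
    s = 0
    while expected_backorders(s, lam) > max_ebo:
        s += 1
    return s, s * unit_cost

print(cheapest_stock_for_target(lam=2.5, max_ebo=0.1, unit_cost=1000.0))
```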
Procedia PDF Downloads 348
2483 A Hybrid P2P Storage Scheme Based on Erasure Coding and Replication
Authors: Usman Mahmood, Khawaja M. U. Suleman
Abstract:
A peer-to-peer storage system faces challenges such as peer availability, data protection, and churn rate. To address these challenges, different redundancy, replacement, and repair schemes are used. This paper presents a hybrid redundancy scheme using replication and erasure coding. We calculate and compare the storage, access, and maintenance costs of our proposed scheme with those of existing redundancy schemes. For realistic peer behaviour, a trace of a live peer-to-peer system is used. The effects of different replication and repair schemes are also shown. The proposed hybrid scheme performs better than the existing double-coding hybrid scheme in all metrics and has an improved maintenance cost compared to hierarchical codes.
Keywords: erasure coding, P2P, redundancy, replication
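The storage-versus-availability trade-off underlying such comparisons is easy to make concrete: k-of-n erasure coding stores n/k times the object size and survives as long as any k fragments remain, while r-way replication stores r times the size and needs only one surviving copy. A sketch under the usual independent-peer-availability assumption (the availability value is hypothetical):

```python
from math import comb

def availability_erasure(n, k, p):
    """P(object recoverable) = P(at least k of n fragments online),
    assuming each peer is independently online with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def availability_replication(r, p):
    """P(at least one of r full replicas online)."""
    return 1 - (1 - p) ** r

p = 0.7  # hypothetical peer availability
print("3-replication :", availability_replication(3, p), "(storage overhead x3)")
print("(8,4) erasure :", availability_erasure(8, 4, p), "(storage overhead x2)")
```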
Procedia PDF Downloads 395
2482 The Complexity of Testing Cryptographic Devices on Input Faults
Authors: Alisher Ikramov, Gayrat Juraev
Abstract:
The production of logic devices faces the occurrence of faults during manufacturing. This work analyses the complexity of testing a special type of logic device on inverse, adhesion, and constant input faults. The focus of this work is on devices that implement cryptographic functions. The complexity values for the general case faults and for some frequently occurring subsets were determined and proved in this work. For a special case, when the length of the text block is equal to the length of the key block, the complexity of testing is proven to be asymptotically half the complexity of testing all logic devices on the same types of input faults.
Keywords: complexity, cryptographic devices, input faults, testing
Procedia PDF Downloads 226
2481 Low Complexity Deblocking Algorithm
Authors: Jagroop Singh Sidhu, Buta Singh
Abstract:
A low-complexity deblocking filter with three frequency-related modes (smooth mode, intermediate mode, and non-smooth mode for low-frequency, mid-frequency, and high-frequency regions, respectively) is proposed. The suggested approach requires zero additions, zero subtractions, zero multiplications (for the intermediate region), no divisions (for the non-smooth region), and no comparisons. The suggested method thus keeps computation low and is suitable for block-based image coding systems. A comparison of the average number of operations for the smooth, non-smooth, and intermediate regions (per pixel vector for each block) between the filter suggested by Chen and the proposed filter shows that the proposed filter keeps computation lower and is thus suitable for fast processing algorithms.
Keywords: blocking artifacts, computational complexity, non-smooth, intermediate, smooth
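A sketch of the mode-selection idea: classify the region around a block boundary as smooth, intermediate, or non-smooth from its local pixel activity, then apply filtering of matching strength. The activity measure and thresholds below are illustrative assumptions, not the filter proposed in the paper:

```python
def classify_boundary(pixels, t_low=2.0, t_high=8.0):
    """Label a 1-D run of pixels across a block edge by its activity,
    here the mean absolute difference of neighbouring samples."""
    diffs = [abs(a - b) for a, b in zip(pixels, pixels[1:])]
    activity = sum(diffs) / len(diffs)
    if activity < t_low:
        return "smooth"        # strong low-pass filtering is safe
    if activity < t_high:
        return "intermediate"  # light filtering
    return "non-smooth"        # likely a real edge: leave mostly untouched

print(classify_boundary([100, 101, 100, 99, 140, 141, 142, 143]))
```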
Procedia PDF Downloads 464
2480 Reduced Complexity of ML Detection Combined with DFE
Authors: Jae-Hyun Ro, Yong-Jun Kim, Chang-Bin Ha, Hyoung-Kyu Song
Abstract:
In multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) systems, many detection schemes have been developed to improve error performance and to reduce complexity. Maximum likelihood (ML) detection has optimal error performance, but it has very high complexity. Thus, this paper proposes a reduced-complexity ML detection combined with a decision feedback equalizer (DFE). The error performance of the proposed detection scheme is better than that of the conventional DFE, while its complexity is lower than that of conventional ML detection.
Keywords: detection, DFE, MIMO-OFDM, ML
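Exhaustive ML detection makes the complexity problem obvious: for N_t transmit antennas and constellation size M it scores all M^N_t candidate vectors. A brute-force sketch for a 2x2 QPSK system (the channel and noise values are hypothetical):

```python
import itertools
import numpy as np

QPSK = np.array([1+1j, 1-1j, -1+1j, -1-1j]) / np.sqrt(2)

def ml_detect(y, H):
    """Return the transmit vector minimising ||y - Hx||^2 over all
    QPSK candidates (4^2 = 16 vectors for two antennas)."""
    best, best_metric = None, np.inf
    for x in itertools.product(QPSK, repeat=H.shape[1]):
        x = np.array(x)
        m = np.linalg.norm(y - H @ x) ** 2
        if m < best_metric:
            best, best_metric = x, m
    return best

H = np.array([[0.9+0.1j, 0.3-0.2j],
              [0.2+0.4j, 1.1-0.1j]])          # hypothetical channel matrix
x_true = QPSK[[0, 3]]
y = H @ x_true + 0.05 * (np.random.randn(2) + 1j * np.random.randn(2))
print(ml_detect(y, H))
```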
Procedia PDF Downloads 610
2479 Experimental Options for the Role of Dynamic Torsion in General Relativity
Authors: Ivan Ravlich, Ivan Linscott, Sigrid Close
Abstract:
The experimental search for spin coupling in General Relativity via torsion has been inconclusive. In this work, further experimental avenues to test dynamic torsion are proposed and evaluated. In the extended theory, by relaxing the torsion free condition on the metric connection, general relativity is reformulated to relate the spin density of particles to a new quantity, the torsion tensor. In torsion theories, the spin tensor and torsion tensor are related in much the same way as the stress-energy tensor is related to the metric connection. Similarly, as the metric is the field associated with the metric connection, fields can be associated with the torsion tensor resulting in a field that is either propagating or static. Experimental searches for static torsion have thus far been inconclusive, and currently, there have been no experimental tests for propagating torsion. Experimental tests of propagating theories of torsion are proposed utilizing various spin densities of matter, such as interfaces in superconducting materials and plasmas. The experimental feasibility and observable bounds are estimated, and the most viable candidates are selected to pursue in detail in a future work.
Keywords: general relativity, gravitation, propagating torsion, spin density
Procedia PDF Downloads 231
2478 Extended Constraint Mask Based One-Bit Transform for Low-Complexity Fast Motion Estimation
Authors: Oğuzhan Urhan
Abstract:
In this paper, an improved motion estimation (ME) approach based on the weighted constrained one-bit transform is proposed for block-based ME employed in video encoders. Binary ME approaches utilize a low bit-depth representation of the original image frames with a Boolean exclusive-OR based, hardware-efficient matching criterion to decrease the computational burden of the ME stage. The weighted constrained one-bit transform (WC-1BT) based approach improves the performance of conventional C-1BT based ME by employing a 2-bit depth constraint mask instead of a 1-bit depth mask. In this work, the range of the constraint mask is further extended to increase the ME performance of the WC-1BT approach. Experiments reveal that the proposed method provides better ME accuracy compared to existing similar ME methods in the literature.
Keywords: fast motion estimation, low-complexity motion estimation, video coding
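The appeal of binary ME is that the matching criterion collapses to XOR-and-count. A sketch of a basic one-bit transform and its matching cost; the binarisation filter here is a plain local-mean filter for illustration, whereas C-1BT/WC-1BT use more elaborate filters and constraint masks:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def one_bit_transform(frame):
    """Binarise a frame: 1 where the pixel exceeds its local mean."""
    return (frame >= uniform_filter(frame.astype(float), size=9)).astype(np.uint8)

def binary_sad(block_a, block_b):
    """Number of mismatching bits: XOR then popcount. This replaces
    the additions/subtractions of ordinary SAD matching."""
    return int(np.count_nonzero(np.bitwise_xor(block_a, block_b)))

# Hypothetical 16x16 blocks from two binarised frames
rng = np.random.default_rng(0)
cur = one_bit_transform(rng.integers(0, 256, (16, 16)))
ref = one_bit_transform(rng.integers(0, 256, (16, 16)))
print(binary_sad(cur, ref))
```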
Procedia PDF Downloads 317
2477 A New Verification Based Congestion Control Scheme in Mobile Networks
Authors: P. K. Guha Thakurta, Shouvik Roy, Bhawana Raj
Abstract:
A congestion control scheme in mobile networks is proposed in this paper through a verification-based model. The model proposed in this work is represented through performance metrics such as buffer occupancy, latency, and packet loss rate. Based on pre-defined values, each metric is expressed in terms of three different states. A Markov chain based model for the proposed work is introduced to monitor the occurrence of the corresponding state transitions. Thus, an estimate of the network status is obtained in terms of the performance metrics. In addition, the improved performance of the proposed model over existing works is shown with experimental results.
Keywords: congestion, mobile networks, buffer, delay, call drop, Markov chain
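The three-state idea can be made concrete with a transition matrix over, say, {low, medium, high} buffer occupancy and its stationary distribution, which estimates the long-run network status. The transition probabilities below are invented for illustration:

```python
import numpy as np

# Hypothetical transitions between low / medium / high occupancy states
P = np.array([[0.80, 0.15, 0.05],
              [0.20, 0.60, 0.20],
              [0.05, 0.35, 0.60]])

def stationary(P):
    """Solve pi @ P = pi with sum(pi) = 1 via the leading eigenvector of P^T."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    return pi / pi.sum()

pi = stationary(P)
print(dict(zip(["low", "medium", "high"], pi.round(3))))
```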
Procedia PDF Downloads 443
2476 Software Engineering Revolution Driven by Complexity Science
Abstract:
This paper introduces a new software engineering paradigm based on complexity science, called NSE (Nonlinear Software Engineering paradigm). The purpose of establishing NSE is to help software development organizations double their productivity, halve their cost, and increase the quality of their products by several orders of magnitude simultaneously. NSE complies with the essential principles of complexity science. NSE brings revolutionary changes to almost all aspects of software engineering. NSE has been fully implemented with its support platform Panorama++.
Keywords: complexity science, software development, software engineering, software maintenance
Procedia PDF Downloads 265
2475 Quantitative Comparison Complexity and Robustness of Supply Chain Network Based on Different Configurations
Authors: Ahmadreza Rezaei, Qiong Liu
Abstract:
A supply chain network is built from its suppliers and its product architecture design. These networks are complex and vulnerable and may be exposed to disruption risks. Any supply chain network configuration has its own associated complexity and robustness, which can directly affect its efficiency, so it is necessary to evaluate any configuration by considering complexity and robustness together. However, there is a lack of research on this subject by which managers could evaluate their supply chain configurations and choose one with balanced complexity and robustness. In this study, we develop indicators to improve the robustness of the supply chain, using a framework to evaluate the relationships between the complexity and robustness of the supply chain network under different network configurations. This framework includes the investigation and analysis of quantitative indicators based on network characteristics. Moreover, an overall metric based on Shannon entropy is presented to evaluate network topological complexity. We then analyze the two factors, complexity and robustness, across supply chain configurations. Complexity and robustness are two integral components of a network that indicate its resistance under disruption, and it is necessary to attain a balanced level of both in network configurations. The proposed framework can be used on supply chain networks to improve efficiency.
Keywords: supply chain design, structural complexity, robustness, supply chain configuration, Shannon entropy
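As a concrete instance of an entropy-based complexity indicator, one common choice is the Shannon entropy of the network's degree distribution, H = −Σ p(k) log₂ p(k), where p(k) is the fraction of nodes with degree k. A sketch over a hypothetical supplier network (whether this is the exact metric used in the paper is not stated in the abstract):

```python
from collections import Counter
from math import log2

def degree_entropy(edges):
    """Shannon entropy of the degree distribution of an undirected network."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    n = len(deg)
    dist = Counter(deg.values())          # degree -> number of nodes
    return -sum((c / n) * log2(c / n) for c in dist.values())

# Hypothetical two-tier supply network (supplier, assembler) links
edges = [("S1", "A1"), ("S2", "A1"), ("S3", "A1"), ("S3", "A2"), ("S4", "A2")]
print(round(degree_entropy(edges), 3))
```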
Procedia PDF Downloads 10
2474 Probing Syntax Information in Word Representations with Deep Metric Learning
Authors: Bowen Ding, Yihao Kuang
Abstract:
In recent years, with the development of large-scale pre-trained language models, building vector representations of text through deep neural network models has become standard practice for natural language processing tasks. From the performance on downstream tasks, we know that the text representations constructed by these models contain linguistic information, but its encoding mode and extent are unclear. In this work, a structural probe is proposed to detect whether the vector representation produced by a deep neural network embeds a syntax tree. The probe is trained with a deep metric learning method, so that the distance between word vectors in the metric space it defines encodes the distance between words on the syntax tree, and the norm of a word vector encodes the depth of the word in the syntax tree. Experimental results on ELMo and BERT show that the syntax tree is encoded in their parameters and in the word representations they produce.
Keywords: deep metric learning, syntax tree probing, natural language processing, word representations
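A distance probe of this kind can be written in a few lines: a learned linear map B turns squared distances between word vectors into predicted tree distances, and the loss penalises their deviation from the gold parse distances (a Hewitt-and-Manning-style formulation, used here as a stand-in for the paper's exact objective; dimensions and data are hypothetical):

```python
import numpy as np

def probe_distances(H, B):
    """Predicted squared distances d(h_i, h_j) = ||B(h_i - h_j)||^2
    for all word pairs in a sentence of embeddings H (n_words, dim)."""
    proj = H @ B.T                               # (n_words, probe_dim)
    diff = proj[:, None, :] - proj[None, :, :]   # pairwise differences
    return (diff ** 2).sum(-1)

def probe_loss(H, B, tree_dist):
    """Mean |gold tree distance - predicted distance| over word pairs."""
    pred = probe_distances(H, B)
    return np.abs(tree_dist - pred).mean()

rng = np.random.default_rng(1)
H = rng.normal(size=(5, 768))          # hypothetical BERT-sized embeddings
B = rng.normal(size=(64, 768)) * 0.01  # probe parameters to be trained
tree = np.array([[0, 1, 2, 2, 3],      # hypothetical parse-tree distances
                 [1, 0, 1, 1, 2],
                 [2, 1, 0, 2, 3],
                 [2, 1, 2, 0, 1],
                 [3, 2, 3, 1, 0]])
print(probe_loss(H, B, tree))
```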
Procedia PDF Downloads 68
2473 Foliation and the First Law of Thermodynamics for the Kerr Newman Black Hole
Authors: Syed M. Jawwad Riaz
Abstract:
There has been much interest in exploring the thermodynamic properties at the horizon of a black hole geometry. It has been shown earlier, for different spacetimes, that the Einstein field equations at the horizon can be expressed as a first law of black hole thermodynamics. In this paper, considering r = constant slices for the Kerr-Newman black hole, it is shown that the Einstein field equations for the induced 3-metric of the hypersurface can be expressed in thermodynamic quantities under virtual displacements of the hypersurfaces. As expected, it is found that the field equations of the induced metric corresponding to the horizon can only be written as a first law of black hole thermodynamics. It should be mentioned that the procedure adopted here makes such results much easier to obtain, as one essentially deals with an (n − 1)-dimensional induced metric for an n-dimensional spacetime.
Keywords: black hole space-times, Einstein's field equation, foliation, hyper-surfaces
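For orientation, the first law recovered on the horizon in such analyses has the familiar Kerr-Newman form, stated below in geometric units as standard background rather than as the paper's own derivation:

```latex
% First law of black hole thermodynamics for Kerr-Newman,
% with T = \kappa / 2\pi and S = A / 4 (G = c = \hbar = 1):
dM \;=\; \frac{\kappa}{8\pi}\, dA \;+\; \Omega_H\, dJ \;+\; \Phi_H\, dQ
     \;=\; T\, dS \;+\; \Omega_H\, dJ \;+\; \Phi_H\, dQ
```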
Procedia PDF Downloads 347
2472 Performance Evaluation of One and Two Dimensional Prime Codes for Optical Code Division Multiple Access Systems
Authors: Gurjit Kaur, Neena Gupta
Abstract:
In this paper, we analyze and compare the performance of various coding schemes. The basic 1D prime sequence codes are unique in only one dimension, i.e., time slots, whereas 2D coding techniques are unique not only in their time slots but also in their wavelengths. In this research, we evaluate and compare, on a single platform, the performance of 1D and 2D coding techniques constructed using the prime sequence coding pattern for an Optical Code Division Multiple Access (OCDMA) system. The analysis shows that 2D prime codes support fewer active users than 1D codes, but they have a larger code family and are the most secure codes among those compared. The performance of all these codes is analyzed on the basis of the number of active users supported at a Bit Error Rate (BER) of 10⁻⁹.
Keywords: CDMA, OCDMA, BER, OOC, PC, EPC, MPC, 2-D PC/PC, λc, λa
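The 1D prime-sequence construction is short enough to state in code: for a prime p, codeword i (0 ≤ i < p) has length p², weight p, and one pulse in each block of p slots at offset i·j mod p. A sketch of that standard construction:

```python
def prime_codes(p):
    """Generate the p codewords of a 1-D prime code over GF(p).
    Codeword i places a pulse at slot j*p + (i*j mod p) for j = 0..p-1."""
    codes = []
    for i in range(p):
        word = [0] * (p * p)
        for j in range(p):
            word[j * p + (i * j) % p] = 1
        codes.append(word)
    return codes

for w in prime_codes(5):        # 5 codewords, length 25, weight 5
    print("".join(map(str, w)))
```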
Procedia PDF Downloads 337
2471 Gaze Behaviour of Individuals with and without Intellectual Disability for Nonaccidental and Metric Shape Properties
Authors: S. Haider, B. Bhushan
Abstract:
The eye gaze behaviour of individuals with and without intellectual disability is investigated in an eye tracking study in terms of sensitivity to nonaccidental (NAP) and metric (MP) shape properties. Total fixation time is used as an indirect measure of attention allocation. Studies have found mean reaction times for nonaccidental properties to be shorter than for metric properties when the MP and NAP differences were equalized. Methods: Twenty-five individuals with intellectual disability (mild and moderate levels of mental retardation) and twenty-seven normal individuals were compared on mean total fixation duration, accuracy level, and mean reaction time for mild NAP, extreme NAP, and metric properties of images. 2D images of cylinders were adapted and made into forced-choice match-to-sample tasks. A Tobii TX300 Eye Tracker was used to record total fixation duration, with data obtained from the Areas of Interest (AOI). Variable trial duration (total reaction time of each participant) and fixed trial duration (data taken at each second from one to fifteen seconds) were used for the analyses. The groups did not differ in terms of fixation times (fixed as well as variable) across any of the three image manipulations, but differed in terms of reaction time and accuracy. Normal individuals had longer reaction times than individuals with intellectual disability across all types of images. The groups also differed significantly on the accuracy measure across all image types, with normal individuals performing better on all three types of images. Mild NAP vs. metric differences: there was a significant difference between mild NAP and metric properties of images in terms of reaction times. Mild NAP images had significantly longer reaction times than metric images for normal individuals, but this difference was not found for individuals with intellectual disability. Mild NAP images had significantly better accuracy than metric images for both groups. In conclusion, the type of image manipulation did not result in differences in attention allocation for individuals with or without intellectual disability. Mild nonaccidental properties facilitate better accuracy than metric properties in both groups, but this advantage is seen only in the normal group in terms of mean reaction time.
Keywords: eye gaze fixations, eye movements, intellectual disability, stimulus properties
Procedia PDF Downloads 553
2470 Decision Trees Constructing Based on K-Means Clustering Algorithm
Authors: Loai Abdallah, Malik Yousef
Abstract:
A domain space for data should reflect the actual similarity between objects, since objects belonging to the same cluster usually share common traits even though their geometric distance might be relatively large. In general, the Euclidean distance between data points represented by a large number of features does not capture the actual relation between those points. In this study, we propose a new method to construct a different space, based on clustering, to form a new distance metric. The new distance space is based on ensemble clustering (EC): the EC distance space is defined by tracking the membership of the points over multiple runs of a clustering algorithm. Over this distance, we train a decision tree classifier (DT-EC). The results obtained by applying DT-EC on 10 datasets confirm our hypothesis that embedding the EC space as a distance metric improves performance.
Keywords: ensemble clustering, decision trees, classification, K nearest neighbors
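The EC distance can be sketched as a co-association matrix: run k-means many times, count how often each pair of points shares a cluster, and take one minus that fraction as the distance. A minimal version with scikit-learn; the number of runs and the range of k values are illustrative choices, not the paper's settings:

```python
import numpy as np
from sklearn.cluster import KMeans

def ec_distance(X, n_runs=50, k_range=(2, 8), seed=0):
    """Ensemble-clustering distance: 1 - fraction of runs in which
    two points land in the same k-means cluster."""
    rng = np.random.default_rng(seed)
    n = len(X)
    co = np.zeros((n, n))
    for _ in range(n_runs):
        k = int(rng.integers(k_range[0], k_range[1] + 1))
        labels = KMeans(n_clusters=k, n_init=5,
                        random_state=int(rng.integers(0, 10**6))).fit_predict(X)
        co += (labels[:, None] == labels[None, :])
    return 1.0 - co / n_runs

# Two hypothetical well-separated point clouds
X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 4])
D = ec_distance(X)
print(D[:3, :3].round(2))   # small distances within the same cloud
```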
Procedia PDF Downloads 191
2469 Whole Exome Sequencing Data Analysis of Rare Diseases: Non-Coding Variants and Copy Number Variations
Authors: S. Fahiminiya, J. Nadaf, F. Rauch, L. Jerome-Majewska, J. Majewski
Abstract:
Background: Sequencing of the protein-coding regions of the human genome (Whole Exome Sequencing; WES) has demonstrated great success in the identification of causal mutations for several rare genetic disorders in humans. Generally, most WES studies have focused on rare variants in coding exons and splice sites, where missense substitutions lead to alteration of the protein product. Although focusing on this category of variants has revealed the mystery behind many inherited genetic diseases in recent years, a subset of cases has remained inconclusive. Here, we present the results of our WES studies where analyzing only rare variants in coding regions was not conclusive, but further investigation revealed the involvement of non-coding variants and copy number variations (CNVs) in the etiology of the diseases. Methods: Whole exome sequencing was performed using our standard protocols at the Genome Quebec Innovation Center, Montreal, Canada. All bioinformatics analyses were done using an in-house WES pipeline. Results: To date, we have successfully identified several disease-causing mutations within gene coding regions (e.g., SCARF2: Van den Ende-Gupta syndrome and SNAP29: 22q11.2 deletion syndrome) using WES. In addition, we showed that variants in non-coding regions and CNVs also have important value and should not be ignored and/or filtered out during bioinformatics analysis of WES data. For instance, in patients with osteogenesis imperfecta type V and in patients with glucocorticoid deficiency, we identified variants in the 5'UTR resulting in the production of longer or truncated non-functional proteins. Furthermore, CNVs were identified as the main cause of disease in patients with metaphyseal dysplasia with maxillary hypoplasia and brachydactyly and in patients with osteogenesis imperfecta type VII. Conclusions: Our study highlights the importance of considering non-coding variants and CNVs during interpretation of WES data, as they can be the only cause of the disease under investigation.
Keywords: whole exome sequencing data, non-coding variants, copy number variations, rare diseases
Procedia PDF Downloads 419
2468 Performance Improvement of Cooperative Scheme in Wireless OFDM Systems
Authors: Ki-Ro Kim, Seung-Jun Yu, Hyoung-Kyu Song
Abstract:
Recently, wireless communication systems have been required to provide high quality and high bit rate data services. Researchers have studied various multiple antenna schemes to meet this demand. In practical applications, it is difficult to deploy multiple antennas because of limited size and cost, and cooperative diversity techniques have been proposed to overcome these limitations. Cooperative communications have been widely investigated to improve the performance of wireless communication. Among diversity schemes, space-time block codes have been widely studied for cooperative communication systems. In this paper, we propose a new cooperative scheme using pre-coding and space-time block coding. The proposed cooperative scheme provides improved error performance compared to a conventional cooperative scheme using space-time block coding alone.
Keywords: cooperative communication, space-time block coding, pre-coding
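The space-time block code most often used in such two-branch cooperative schemes is Alamouti's, in which two symbols are sent over two time slots as shown below. This sketch shows the encoder only; it does not reproduce the pre-coding step proposed in the paper:

```python
import numpy as np

def alamouti_encode(s1, s2):
    """Alamouti 2x2 space-time block: rows are time slots,
    columns are the two transmit antennas (or cooperating relays).
        slot 1:  s1          s2
        slot 2: -conj(s2)    conj(s1)
    """
    return np.array([[s1, s2],
                     [-np.conj(s2), np.conj(s1)]])

print(alamouti_encode(1 + 1j, 1 - 1j))
```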
Procedia PDF Downloads 360
2467 Fast Prediction Unit Partition Decision and Accelerating the Algorithm Using CUDA for Intra and Inter Prediction of HEVC
Authors: Qiang Zhang, Chun Yuan
Abstract:
Since the PU (Prediction Unit) decision process is the most time-consuming part of intra- and inter-frame coding in the emerging HEVC (High Efficiency Video Coding) standard, this paper proposes a fast PU decision algorithm and speeds it up using CUDA (Compute Unified Device Architecture). In intra-frame coding, the fast PU decision algorithm uses texture features to skip intra-frame prediction or to terminate intra-frame prediction for smaller PU sizes. In inter-frame coding, the fast PU decision algorithm makes use of the similarity between a PU's own two Nx2N-size motion vectors and the hierarchical structure of the CU (Coding Unit) partition to skip some PU partition modes, so as to reduce the number of motion estimations. The accelerated algorithm using CUDA builds on the fast PU decision algorithm and uses the GPU so that the motion search and the gradient computation can be performed in parallel. The proposed algorithm achieves up to 57% time saving compared to HM 10.0 with little rate-distortion loss (0.043 dB drop and 1.82% bitrate increase on average).
Keywords: HEVC, PU decision, inter prediction, intra prediction, CUDA, parallel
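A sketch of the intra-frame early-termination idea: if a block's texture is sufficiently flat, skip the finer PU partitions entirely. The variance measure and threshold below are placeholders for the texture features actually used in the paper:

```python
import numpy as np

def pu_candidates(block, flat_thresh=25.0):
    """Return the PU partition modes worth evaluating for one CU.
    Flat blocks keep only 2Nx2N; textured blocks test all partitions."""
    if np.var(block.astype(float)) < flat_thresh:
        return ["2Nx2N"]                       # early termination
    return ["2Nx2N", "2NxN", "Nx2N", "NxN"]    # full mode search

flat = np.full((16, 16), 128)
noisy = np.random.randint(0, 256, (16, 16))
print(pu_candidates(flat), pu_candidates(noisy))
```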
Procedia PDF Downloads 399
2466 The Relationship between Rhythmic Complexity and Listening Engagement as a Proxy for Perceptual Interest
Authors: Noah R. Fram
Abstract:
Although it has been confirmed by multiple studies, the inverted-U relationship between stimulus complexity and preference (liking) remains contentious. Research aimed at substantiating the model is largely reliant upon anecdotal self-assessments of subjects and basic measures of complexity, leaving potential confounds unresolved. This study attempts to address the topic by assessing listening time as a behavioral correlate of liking (with the assumption that engagement prolongs listening time) and by looking for latent factors underlying several measures of rhythmic complexity. Participants listened to groups of rhythms, stopping each one when they started to lose interest, and were asked to rate each rhythm in each group in terms of interest, complexity, and preference. Subjects were not informed that the time spent listening to each rhythm was the primary measure of interest. The hypothesis that listening time demonstrates the same inverted-U relationship with complexity as verbal reports of liking was confirmed using a variety of metrics for rhythmic complexity, including meter-dependent measures of syncopation and meter-independent measures of entropy.
Keywords: complexity, entropy, rhythm, syncopation
Procedia PDF Downloads 174
2465 Constant Dimension Codes via Generalized Coset Construction
Authors: Kanchan Singh, Sheo Kumar Singh
Abstract:
The fundamental problem of subspace coding is to determine the maximum possible cardinality Aq(n, d, k) of a set of k-dimensional subspaces of an n-dimensional vector space over Fq such that the subspace distance satisfies ds(W1, W2) ≥ d for any two distinct subspaces W1, W2 in the set. In this paper, we construct a new class of constant dimension codes (CDCs) by generalizing the coset construction and combining it with CDCs derived from the parallel linkage construction and the coset construction, with the aim of improving the lower bounds on Aq(n, d, k). We find a remarkable improvement in some of the lower bounds on Aq(n, d, k).
Keywords: constant dimension codes, rank metric codes, coset construction, parallel linkage construction
Procedia PDF Downloads 24
2464 Adaptive Few-Shot Deep Metric Learning
Authors: Wentian Shi, Daming Shi, Maysam Orouskhani, Feng Tian
Abstract:
Whereas the most prevalent deep learning methods currently require a large amount of data for training, few-shot learning tries to learn a model from limited data without extensive retraining. In this paper, we present a loss function based on the triplet loss for solving the few-shot problem using metric-based learning. Instead of setting the margin distance in the triplet loss as a constant number empirically, we propose an adaptive margin distance strategy to obtain an appropriate margin distance automatically. We implement the strategy in a deep siamese network for deep metric embedding, using an optimization approach that penalizes the worst case and rewards the best. Our experiments on image recognition and a co-segmentation model demonstrate that using the proposed triplet loss with adaptive margin distance can significantly improve performance.
Keywords: few-shot learning, triplet network, adaptive margin, deep learning
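The fixed-margin triplet loss being generalized here is L = max(0, d(a, p) − d(a, n) + m). A sketch in which the margin adapts to the batch by penalizing the worst case and rewarding the best, in the spirit of (though not necessarily identical to) the proposed strategy:

```python
import numpy as np

def triplet_loss_adaptive(anchor, pos, neg, alpha=0.5):
    """Triplet loss with a batch-adaptive margin.

    anchor, pos, neg : (batch, dim) embedding arrays.
    The margin is derived from the spread between the hardest (worst-case)
    and easiest (best-case) triplet gaps in the batch, an illustrative
    stand-in for the paper's adaptive-margin rule.
    """
    d_pos = np.linalg.norm(anchor - pos, axis=1)
    d_neg = np.linalg.norm(anchor - neg, axis=1)
    gaps = d_neg - d_pos
    margin = alpha * (gaps.max() - gaps.min())   # adapts per batch
    return np.maximum(0.0, d_pos - d_neg + margin).mean()

rng = np.random.default_rng(0)
a, p, n = (rng.normal(size=(8, 16)) for _ in range(3))
print(triplet_loss_adaptive(a, p, n))
```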
Procedia PDF Downloads 172
2463 Degree of Approximation by the (T.E^1) Means of Conjugate Fourier Series in the Hölder Metric
Authors: Kejal Khatri, Vishnu Narayan Mishra
Abstract:
We compute the degree of approximation of functions f̃ ∈ H_w, a new Banach space, using (T.E^1) summability means of the conjugate Fourier series. In this paper, we extend the results of Singh and Mahajan, which in turn generalize the result of Lal and Yadav. Some corollaries have also been deduced from our main theorem, along with particular cases.
Keywords: conjugate Fourier series, degree of approximation, Hölder metric, matrix summability, product summability
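For orientation, a Hölder-type space of this kind is typically defined as below; this is the standard textbook definition, stated as background rather than quoted from the paper:

```latex
% H_w: functions whose modulus of continuity is dominated by w,
% normed by the sup norm plus the w-weighted Hölder seminorm
H_w = \Big\{ f \in C_{2\pi} :
      \sup_{x \neq y} \frac{|f(x) - f(y)|}{w(|x - y|)} < \infty \Big\},
\qquad
\|f\|_w = \|f\|_C + \sup_{x \neq y} \frac{|f(x) - f(y)|}{w(|x - y|)}
```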
Procedia PDF Downloads 420