Search results for: Multidimensional Sequence Data
6940 Identification of Cellulose-Hydrolytic Thermophiles Isolated from Sg. Klah Hot Spring Based On 16S rDNA Gene Sequence
Authors: M. J. Norashirene, Y. Zakiah, S. Nurdiana, I. Nur Hilwani, M. H. Siti Khairiyah, M. J. Muhamad Arif
Abstract:
In this study, six slightly thermophilic bacterial isolates from the Sg. Klah hot spring, Malaysia were successfully obtained and designated M7T55D1, M7T55D2, M7T55D3, M7T53D1, M7T53D2 and M7T53D3. The isolates were screened for their cellulose-hydrolytic ability on carboxymethylcellulose agar medium. The isolated bacterial strains were identified morphologically, biochemically and molecularly with the aid of 16S rDNA sequencing. All of the bacteria showed optimum growth at a slightly alkaline pH of 7.5 and a temperature of 55°C. All strains were Gram-negative, non-spore-forming, strictly aerobic, catalase-positive and oxidase-positive, with the ability to produce thermostable cellulase. Based on BLASTn results, isolates M7T55D2 and M7T53D1 showed the highest sequence similarity (97%) to Tepidimonas ignava, while isolates M7T55D1, M7T55D3, M7T53D2 and M7T53D3 showed their closest similarity (97%-98%) to Tepidimonas thermarum. These cellulolytic thermophiles may have commercial potential for the production of valuable thermostable cellulase.
Keywords: Cellulase, Cellulolytic, Thermophiles, 16S rDNA Gene.
6939 Performance Analysis of the Subgroup Method for Collective I/O
Authors: Kwangho Cha, Hyeyoung Cho, Sungho Kim
Abstract:
As many scientific applications require large-scale data processing, the importance of parallel I/O has been increasingly recognized. Collective I/O is one of the notable features of parallel I/O and enables application programmers to handle their large data volumes easily. In this paper, we measure and analyze the performance of original collective I/O and of the subgroup method, a way of using MPI collective I/O more effectively. From the experimental results, we found that the subgroup method showed good performance for small data sizes.
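As an illustration of the subgroup idea, the sketch below splits MPI_COMM_WORLD into smaller communicators and issues the collective write per subgroup using mpi4py; the subgroup size, file name and data layout are illustrative assumptions rather than the configuration evaluated in the paper.

```python
# A minimal sketch of the subgroup method with mpi4py; group size, file name
# and data layout are assumptions, not the paper's experimental setup.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

GROUP_SIZE = 4                      # assumed subgroup size
color = rank // GROUP_SIZE          # ranks with the same color form one subgroup
subcomm = comm.Split(color, rank)   # collective I/O is issued per subgroup

data = np.full(1024, rank, dtype=np.int32)
offset = rank * data.nbytes         # each rank writes a disjoint block of the file

# Each subgroup opens the shared file and performs a collective write; only
# the processes inside one subgroup synchronise with each other.
fh = MPI.File.Open(subcomm, "output.dat",
                   MPI.MODE_WRONLY | MPI.MODE_CREATE)
fh.Write_at_all(offset, data)
fh.Close()
subcomm.Free()
```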
Keywords: Collective I/O, MPI, parallel file system.
6938 Statistical Analysis for Overdispersed Medical Count Data
Authors: Y. N. Phang, E. F. Loh
Abstract:
Many researchers have suggested the use of zero inflated Poisson (ZIP) and zero inflated negative binomial (ZINB) models in modeling overdispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. These studies indicate that ZIP and ZINB always provide a better fit than the standard Poisson and negative binomial models for overdispersed medical count data. In this study, we propose the use of zero inflated inverse trinomial (ZIIT), zero inflated Poisson inverse Gaussian (ZIPIG) and zero inflated strict arcsine (ZISA) models in modeling overdispersed medical count data. These proposed models are not widely used by researchers, especially in the medical field. The results show that the three suggested models can serve as alternatives for modeling overdispersed medical count data, which is supported by their application to a real-life medical data set. The inverse trinomial, Poisson inverse Gaussian and strict arcsine distributions are discrete distributions with a cubic variance function of the mean; therefore, ZIIT, ZIPIG and ZISA are able to accommodate data with excess zeros and very heavy tails. They are recommended for modeling overdispersed medical count data when ZIP and ZINB are inadequate.
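For reference, the baseline zero-inflated fits can be reproduced with statsmodels as sketched below; the ZIIT, ZIPIG and ZISA models proposed here are not available in standard libraries, so the sketch only shows the ZIP and ZINB comparison on simulated counts, not on the authors' medical data set.

```python
# A minimal sketch of the baseline zero-inflated models with statsmodels on
# simulated counts; the proposed ZIIT/ZIPIG/ZISA models are not shown because
# they are not implemented in standard libraries.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import (ZeroInflatedPoisson,
                                              ZeroInflatedNegativeBinomialP)

rng = np.random.default_rng(0)
n = 500
x = sm.add_constant(rng.normal(size=n))
counts = rng.poisson(lam=np.exp(0.5 + 0.3 * x[:, 1]))
counts[rng.random(n) < 0.3] = 0          # inject excess zeros

zip_fit = ZeroInflatedPoisson(counts, x, exog_infl=x).fit(disp=False)
zinb_fit = ZeroInflatedNegativeBinomialP(counts, x, exog_infl=x).fit(disp=False)

# Compare fits with AIC; a heavier-tailed alternative (e.g. ZIPIG) would be
# preferred when both of these remain inadequate.
print(zip_fit.aic, zinb_fit.aic)
```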
Keywords: Zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit.
6937 Student Satisfaction Data for Work Based Learners
Authors: Rosie Borup, Hanifa Shah
Abstract:
This paper aims to describe how student satisfaction is measured for work-based learners. These are non-traditional learners who conduct academic learning in the workplace, whose curricula typically have a high degree of negotiation, and whose motivations are directly related to their employers' needs as well as their own career ambitions. We argue that while increasing work-based learning (WBL) participation and the use of student satisfaction data (SSD) are both accepted as being of strategic importance to the higher education agenda, the use of WBL SSD is rarely examined. Lessons can be learned from comparing SSD across a range of WBL programmes, and increased visibility of this type of data will provide insight into ways to improve and develop this type of delivery. The key themes that emerged from the analysis of the interview data were: learners' profiles and needs, employers' drivers, academic staff drivers, organizational approach, tools for collecting data, and visibility of findings. The paper concludes with observations on best practice in the collection, analysis and use of WBL SSD, thus offering recommendations for both academic managers and practitioners.
Keywords: Student satisfaction data, work based learning, employer engagement, NSS.
6936 A Consistency Protocol Multi-Layer for Replicas Management in Large Scale Systems
Authors: Ghalem Belalem, Yahya Slimani
Abstract:
Large scale systems such as computational Grids are distributed computing infrastructures that can provide globally available network resources. The evolution of information processing systems in Data Grids is characterized by a strong decentralization of data across several sites, with the objective of ensuring the availability and reliability of the data in order to provide fault tolerance and scalability, which is only possible through replication techniques. Unfortunately, the use of these techniques has a high cost, because consistency must be maintained between the distributed data. Nevertheless, agreeing to live with certain imperfections can improve the performance of the system by increasing concurrency. In this paper, we propose a multi-layer protocol combining the pessimistic and optimistic approaches, conceived for data consistency maintenance in large scale systems. Our approach is based on a hierarchical representation model with three layers and serves a dual purpose: it first reduces response times compared to a completely pessimistic approach, and it then improves quality of service compared to an optimistic approach.
Keywords: Data Grid, replication, consistency, optimistic approach, pessimistic approach.
6935 Analysis of the AZF Region in Slovak Men with Azoospermia
Authors: J. Bernasovská, R. Lohajová Behulová, E. Petrejčiková, I. Boroňová, I. Bernasovský
Abstract:
Y chromosome microdeletions are the most common genetic cause of male infertility, and screening for these microdeletions in azoospermic or severely oligospermic men is now standard practice. Analysis of the Y chromosome in men with azoospermia or severe oligozoospermia has resulted in the identification of three regions in the euchromatic part of the long arm of the human Y chromosome (Yq11) that are frequently deleted in men with otherwise unexplained spermatogenic failure. PCR analysis of microdeletions in the AZFa, AZFb and AZFc regions of the human Y chromosome is an important screening tool. The aim of this study was to analyse the types of microdeletions in men with fertility disorders in Slovakia. We evaluated 227 patients with azoospermia and a normal karyotype. All patient samples were analyzed cytogenetically. The Devyser AZF set was used for PCR amplification of sequence-tagged sites (STS) in the AZFa, AZFb and AZFc regions of the Y chromosome. Fluorescently labeled primers for all markers were combined in one multiplex PCR reaction, and the ABI 3500xl genetic analyzer (Life Technologies) was used for automated visualization and identification of the STS markers. We found 13 cases of deletions in the AZF region (5.73%). Particular types of deletions were recorded in each of the AZFa, AZFb and AZFc regions; microdeletions in the AZFc region were the most frequent. The study confirmed that the percentage of microdeletions in the AZF region is low in Slovak azoospermic patients, but important from a prognostic point of view.
Keywords: AZF, male infertility, microdeletions, Y chromosome.
6934 Analysis of a Population of Diabetic Patients Databases with Classifiers
Authors: Murat Koklu, Yavuz Unal
Abstract:
Data mining is a technique for extracting information from data. It is the process of obtaining hidden information and turning it into qualified knowledge using statistical and artificial intelligence techniques. One of its application areas is medicine, where decision support systems for diagnosis can be built by uncovering meaningful information in medical data. In this study, a decision support system for the diagnosis of illness is built using data mining and three different artificial intelligence classifier algorithms, namely the Multilayer Perceptron, the Naive Bayes classifier and J.48. The Pima Indians dataset from the UCI Machine Learning Repository was used. This dataset includes urinary and blood test results of 768 patients, consisting of 8 different features. The obtained classification results were compared with previous studies, and suggestions for future work are presented.
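A comparable experiment can be set up with scikit-learn as sketched below; the local file name of the Pima data and the 10-fold evaluation are assumptions, and scikit-learn's CART decision tree stands in for WEKA's J.48 (C4.5) used in the study.

```python
# A minimal sketch comparing three classifiers on the Pima data; the CSV path
# is an assumed local copy of the UCI file (no header row), and CART replaces J.48.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

data = pd.read_csv("pima-indians-diabetes.csv", header=None)
X, y = data.iloc[:, :8].values, data.iloc[:, 8].values   # 8 features, 1 class label

classifiers = {
    "Multilayer Perceptron": MLPClassifier(max_iter=1000, random_state=0),
    "Naive Bayes": GaussianNB(),
    "Decision Tree (CART, stand-in for J.48)": DecisionTreeClassifier(random_state=0),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)            # 10-fold cross-validation accuracy
    print(f"{name}: {scores.mean():.3f}")
```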
Keywords: Artificial Intelligence, Classifiers, Data Mining, Diabetic Patients.
6933 Discovery of Time Series Event Patterns based on Time Constraints from Textual Data
Authors: Shigeaki Sakurai, Ken Ueno, Ryohei Orihara
Abstract:
This paper proposes a method that discovers time series event patterns from textual data with time information. The patterns are composed of sequences of events, where an event is characteristic content included in the textual data, such as a company name, an action, or an impression of a customer. The method introduces 7 types of time constraints based on the analysis of the textual data, and it evaluates these constraints when the frequency of a time series event pattern is calculated. We can flexibly define the time constraints for interesting combinations of events and can discover valid time series event patterns which satisfy these conditions. The paper applies the method to daily business reports collected by a sales force automation system and verifies its effectiveness through numerical experiments.
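To make the role of a time constraint concrete, the sketch below checks one assumed constraint type, a maximum gap between consecutive events, when deciding whether a report supports a candidate pattern; the seven constraint types defined in the paper are not reproduced. The frequency of a pattern would then be the number of reports for which such a check succeeds.

```python
# A minimal sketch of evaluating one assumed time constraint (maximum gap
# between consecutive events) over (event, timestamp) sequences.
from datetime import datetime, timedelta

def occurs_within(report, pattern, max_gap=timedelta(days=7)):
    """Greedy left-most matching: return True if the events of `pattern`
    appear in `report` in order, each consecutive pair at most `max_gap` apart."""
    idx, last_time = 0, None
    for event, time in report:                 # report: time-ordered (event, timestamp) pairs
        if event == pattern[idx]:
            if last_time is not None and time - last_time > max_gap:
                return False                   # constraint violated, pattern rejected
            last_time = time
            idx += 1
            if idx == len(pattern):
                return True
    return False

report = [("visit", datetime(2024, 1, 3)),
          ("complaint", datetime(2024, 1, 5)),
          ("order", datetime(2024, 1, 9))]
print(occurs_within(report, ["visit", "order"]))   # True: 6-day gap satisfies the constraint
```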
Keywords: Text mining, sequential mining, time constraints, daily business reports.
6932 A 3.125 Gb/s Clock and Data Recovery Circuit Using 1/4-Rate Technique
Authors: Il-Do Jeong, Hang-Geun Jeong
Abstract:
This paper describes the design and fabrication of a clock and data recovery (CDR) circuit. We propose a new clock and data recovery circuit based on a 1/4-rate frequency detector (QRFD). The proposed frequency detector helps reduce the VCO frequency and is thus advantageous for high speed applications. It achieves low-jitter operation and extends the pull-in range without using a reference clock. The proposed CDR was implemented using a 1/4-rate bang-bang type phase detector (PD) and a ring voltage controlled oscillator (VCO). The CDR circuit has been fabricated in a standard 0.18 μm CMOS technology. It occupies an active area of 1 × 1 mm² and consumes 90 mW from a single 1.8 V supply.
Keywords: Clock and data recovery, 1/4-rate frequency detector, 1/4-rate phase detector.
6931 Very High Speed Data Driven Dynamic NAND Gate at 22nm High K Metal Gate Strained Silicon Technology Node
Authors: Shobha Sharma, Amita Dev
Abstract:
Data-driven dynamic logic is a high speed dynamic circuit style with low area. The clock of the dynamic circuit is removed, and the data drive the circuit instead of the clock for precharging purposes. The data-driven dynamic NAND gate is given a static forward substrate bias of Vsupply/2, and alternatively the substrate bias is connected to the input data, resulting in a dynamic substrate bias. The dynamic substrate bias gives the shortest propagation delay, with a penalty on power dissipation. Propagation delay is reduced by 77.8% compared to the data-driven dynamic NAND with normal reverse substrate bias. The dynamically substrate-biased D3NAND's propagation delay is also reduced by 31.26% compared to the data-driven dynamic NAND gate with a static forward substrate bias of Vdd/2. This data-driven dynamic NAND gate with dynamic body biasing gives the highest speed with no area penalty and finds applications where a power penalty is acceptable. A combination of dynamic and static forward body bias can also be used, with reduced propagation delay compared to the statically forward-biased circuit and a comparable increase in average power. The simulations were done in HSPICE using the 22 nm high-k metal gate strained-silicon HP models from Arizona State University, USA.
Keywords: Data driven nand gate, dynamic substrate biasing, nand gate, static substrate biasing.
6930 Vector Space of the Extended Base-triplets over the Galois Field of five DNA Bases Alphabet
Authors: Robersy Sánchez, Ricardo Grau
Abstract:
A plausible architecture of an ancient genetic code is derived from an extended base-triplet vector space over the Galois field of the extended base alphabet {D, G, A, U, C}, where the letter D represents one or more hypothetical bases with unspecific pairing. We hypothesize that the high degeneracy of a primeval genetic code with five bases, together with the gradual origin and improvement of a primitive DNA repair system, could have made possible the transition from the ancient to the modern genetic code. Our results suggest that the Watson-Crick base pairing and the non-specific base pairing of the hypothetical ancestral base D, used to define the sum and product operations, are sufficient to determine the coding constraints of the primeval and the modern genetic code, as well as the transition from the former to the latter. Geometrical and algebraic properties of this vector space reveal that the present codon assignment of the standard genetic code could be induced from a primeval codon assignment. Besides, the Fourier spectrum of the extended DNA genome sequences derived from the multiple sequence alignment suggests that the so-called period-3 property of present coding DNA sequences could also have existed in ancient coding DNA sequences.
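To make the algebra concrete, the sketch below performs field arithmetic on the five-letter alphabet as elements of GF(5); the particular ordering D→0, A→1, C→2, G→3, U→4 is an illustrative assumption, not necessarily the assignment the authors use to encode Watson-Crick and D pairing.

```python
# A minimal sketch of sum and product over {D, G, A, U, C} viewed as GF(5);
# the letter-to-element mapping below is an assumption for illustration only.
ORDER = {"D": 0, "A": 1, "C": 2, "G": 3, "U": 4}
BASES = {v: k for k, v in ORDER.items()}

def base_sum(x, y):
    return BASES[(ORDER[x] + ORDER[y]) % 5]    # addition modulo 5 (GF(5) = Z_5)

def base_prod(x, y):
    return BASES[(ORDER[x] * ORDER[y]) % 5]    # multiplication modulo 5

def triplet_sum(t1, t2):
    """Componentwise sum of two codons, i.e. vector addition in GF(5)^3."""
    return "".join(base_sum(a, b) for a, b in zip(t1, t2))

print(base_sum("A", "U"), base_prod("G", "C"))   # elementwise field operations
print(triplet_sum("GAU", "CUD"))                 # extended base-triplet vectors
```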
Keywords: Genetic code vector space, primeval genetic code, power spectrum.
6929 Temperature Variation Effects on I-V Characteristics of Cu-Phthalocyanine based OFET
Authors: Q. Zafar, R. Akram, Kh.S. Karimov, T.A. Khan, M. Farooq, M.M. Tahir
Abstract:
In this study, we present the effect of elevated temperatures, from 300 K to 400 K, on the electrical properties of copper phthalocyanine (CuPc) based organic field effect transistors (OFETs). Thin films of the organic semiconductor CuPc (40 nm) and semitransparent Al (20 nm) were deposited in sequence by vacuum evaporation on a glass substrate with previously deposited Ag source and drain electrodes separated by a gap of 40 μm. Under the resistive mode of operation, where the gate was suspended, the drain current of this organic field effect transistor (OFET) was observed to increase with temperature. In the grounded-gate condition, the metal (aluminum)–semiconductor (copper phthalocyanine) Schottky junction dominated the output characteristics, and the device showed a switching effect from low to high conduction states, like a Zener diode, at higher bias voltages. The threshold voltage for this switching effect was found to be inversely proportional to temperature and shows an abrupt decrease beyond a knee temperature of 360 K. The change in dynamic resistance (Rd = dV/dI) with respect to temperature was observed to be -1%/K.
Keywords: Copper Phthalocyanine, Metal-Semiconductor Schottky Junction, Organic Field Effect Transistor, Switching effect, Temperature Sensor.
6928 Soft Computing based Retrieval System for Medical Applications
Authors: Pardeep Singh, Sanjay Sharma
Abstract:
With increasing data in medical databases, medical data retrieval is growing in popularity. Some of this analysis includes inducing propositional rules from databases using soft computing techniques and then using these rules in an expert system. Diagnostic rules and information on features are extracted from clinical databases on diseases of congenital anomaly. This paper explains the latest soft computing techniques, including adaptive techniques, which encompass an extensive group of methods that have been applied in the medical domain and that are used for the discovery of data dependencies, feature importance, patterns in sample data, and feature space dimensionality reduction. These approaches pave the way for new and interesting avenues of research in medical imaging and represent an important challenge for researchers.
Keywords: CBIR, GA, Rough sets, CBMIR, SVM.
6927 Secure Power Systems Against Malicious Cyber-Physical Data Attacks: Protection and Identification
Authors: Morteza Talebi, Jianan Wang, Zhihua Qu
Abstract:
The security of power systems against malicious cyber-physical data attacks has become an important issue. The adversary always attempts to manipulate the information structure of the power system and inject malicious data to deviate state variables while evading the existing detection techniques based on the residual test. The solutions proposed in the literature are capable of immunizing the power system against false data injection, but they might be too costly and physically impractical in an expansive distribution network. To this end, we define an algebraic condition for a trustworthy power system to evade malicious data injection. The proposed protection scheme secures the power system by deterministically reconfiguring the information structure and the corresponding residual test. More importantly, it does not require any physical effort at either the microgrid or network level. An identification scheme for finding the meters under attack is proposed as well. Eventually, the well-known IEEE 30-bus system is adopted to demonstrate the effectiveness of the proposed schemes.
Keywords: Algebraic Criterion, Malicious Cyber-Physical Data Injection, Protection and Identification, Trustworthy Power System.
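For context, the sketch below shows the standard residual test for DC state estimation that false data injection is designed to evade; the 3-measurement model and detection threshold are illustrative assumptions, and the paper's algebraic trustworthiness condition and reconfiguration scheme are not reproduced here.

```python
# A minimal numpy sketch of the residual test and of an "unobservable" attack
# a = H c that shifts the estimate without changing the residual.
import numpy as np

H = np.array([[1.0, 0.0],        # DC state estimation: z = H x + noise
              [0.0, 1.0],
              [1.0, -1.0]])
x_true = np.array([0.10, 0.05])
z = H @ x_true + 0.01 * np.random.default_rng(1).normal(size=3)

x_hat = np.linalg.lstsq(H, z, rcond=None)[0]      # least-squares state estimate
residual = np.linalg.norm(z - H @ x_hat)
print("attack suspected" if residual > 0.05 else "measurements accepted")

# The attack leaves the residual unchanged, which is why residual tests alone fail.
a = H @ np.array([0.02, -0.01])
x_bad = np.linalg.lstsq(H, z + a, rcond=None)[0]
print(np.isclose(np.linalg.norm(z + a - H @ x_bad), residual))
```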
6926 Study on the Effect of Bolt Locking Method on the Deformation of Bipolar Plate in PEMFC
Authors: Tao Chen, ShiHua Liu, JiWei Zhang
Abstract:
The assembly of a proton exchange membrane fuel cell (PEMFC) has a very important influence on its performance and efficiency. The various components of a PEMFC stack are usually locked and fixed by bolts. Locking the bolts causes deformation of the bipolar plate and the other components, which directly affects the degree of deformation of the integral parts of the PEMFC as well as its performance. This paper focuses on a three-cell PEMFC stack. Finite element simulation is used to investigate the deformation of the bipolar plate caused by the quantity and layout of bolts, the bolt locking pressure, and the bolt locking sequence. Finally, we conclude that the optimal combined packaging scheme for assembling the fuel cell stack uses a locking pressure of 3.8 MPa imposed on the stack, the type Ⅱ layout of four locking bolts, and the longitudinal locking method. The scheme was obtained by comparatively analyzing the overall displacement contour of the PEMFC stack, the absolute displacement curves of the bipolar plate along three given paths in the Z direction, and the polarization curve of the fuel cell. The research results are helpful for fuel cell stack assembly.
Keywords: Bipolar plate, deformation, finite element simulation, fuel cell, locking bolt.
6925 NSBS: Design of a Network Storage Backup System
Authors: Xinyan Zhang, Zhipeng Tan, Shan Fan
Abstract:
The first layer of defense against data loss is backup data. This paper implements an agent-based network backup system built on a tripartite construction of backup, server-storage and server-backup agents, with snapshots and a hierarchical index used in the NSBS. It separates the control commands from the data flow and balances the system load, thereby improving the efficiency of system backup and recovery. The test results show that the agent-based network backup system can effectively improve task-based concurrency and reasonably allocate network bandwidth; the performance cost of system backup is smaller, and data recovery efficiency is improved by 20%.
Keywords: Agent, network backup system, three architecture model, NSBS.
6924 Isolation and Identification of Diacylglycerol Acyltransferase Type-2 (DGAT2) Genes from Three Egyptian Olive Cultivars
Authors: Yahia I. Mohamed, Ahmed I. Marzouk, Mohamed A. Yacout
Abstract:
The aim of this work was to study the genetic basis for oil accumulation in olive fruit by tracking the DGAT2 (diacylglycerol acyltransferase type-2) gene in three olive cultivars of Egyptian origin, namely Toffahi, Hamed and Maraki, using molecular marker techniques and bioinformatics tools. The results illustrate that, firstly, a specific genomic band of the Maraki cultivar was identified as DGAT2 (diacylglycerol acyltransferase type-2) and is identical to this gene in Olea europaea with 100% similarity. Secondly, a differential genomic band of the Maraki cultivar produced by the RAPD fingerprinting technique reflected a predicted distinguishing sequence identified as DGAT2 (diacylglycerol acyltransferase type-2) in Fragaria vesca subsp. vesca with 76% sequence similarity. Thirdly and finally, a specific genomic band of the Hamed cultivar was identified as two fragments: (1) Olea europaea cultivar Koroneiki diacylglycerol acyltransferase type 2 mRNA, complete cds, with two matching regions at 99% similarity; or (2) predicted Fragaria vesca subsp. vesca diacylglycerol O-acyltransferase 2-like (LOC101313050) mRNA, with 86% similarity.
Keywords: Olea europaea, fingerprinting, Diacylglycerol acyltransferase type-2 (DGAT2).
6923 Estimating the Life-Distribution Parameters of Weibull-Life PV Systems Utilizing Non-Parametric Analysis
Authors: Saleem Z. Ramadan
Abstract:
In this paper, a model is proposed to determine the life-distribution parameters of the useful life region of PV systems, utilizing a combination of non-parametric and linear regression analysis of the failure data of these systems. Results showed that this method is dependable for analyzing failure time data for such reliable systems when the data are scarce.
Keywords: Masking, Bathtub model, reliability, non-parametric analysis, useful life.
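One common way to combine a non-parametric estimate with linear regression for Weibull parameters is median-rank regression, sketched below on simulated failure times; this is a generic illustration under assumed data, not necessarily the exact procedure of the paper.

```python
# A minimal sketch: non-parametric CDF via median ranks, then linear regression
# on the linearised Weibull CDF; the failure times are simulated, not PV data.
import numpy as np

failure_times = np.sort(np.array([3.1, 4.7, 5.9, 7.2, 8.8, 10.5, 12.9]))  # years, assumed
n = len(failure_times)
ranks = np.arange(1, n + 1)
F = (ranks - 0.3) / (n + 0.4)            # Bernard's median-rank approximation

# Weibull CDF: F(t) = 1 - exp(-(t/eta)^beta)  =>  ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta)
x = np.log(failure_times)
y = np.log(-np.log(1.0 - F))
beta, intercept = np.polyfit(x, y, 1)    # slope = shape parameter beta
eta = np.exp(-intercept / beta)          # scale parameter (characteristic life)
print(f"shape beta = {beta:.2f}, scale eta = {eta:.1f}")
```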
6922 Novel Security Strategy for Real Time Digital Videos
Authors: Prakash Devale, R. S. Prasad, Amol Dhumane, Pritesh Patil
Abstract:
Nowadays, the video data embedding approach is a very challenging and interesting task for keeping real-time video data secure. We can implement and use this technique in high-level applications. The rate-distortion of any image is not fixed, because the gain provided by accurate image frame segmentation is balanced by the inefficiency of coding objects of arbitrary shape, along with many factors, such as losses, that depend on both the coding scheme and the object structure. By using a rate controller in association with the encoder, one can dynamically adjust the target bitrate. This paper discusses keeping videos secure by mixing in signature data with negligible distortion of the original video, and keeping the steganographic video as close as possible to the quality of the original video. We propose a method for embedding the signature data into separate video frames by the use of the block Discrete Cosine Transform. These frames are then encoded using real-time H.264 encoding. After processing, recovery of the original video and the signature data at the receiver end is proposed.
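The sketch below illustrates the block-DCT embedding step on a single 8×8 block; the block size, chosen mid-frequency coefficient and quantisation step are illustrative assumptions, and the H.264 encoding and rate-control stages described above are not shown.

```python
# A minimal sketch of hiding one signature bit in a block DCT coefficient via
# quantisation-index modulation; parameters are assumptions for illustration.
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, coeff=(3, 2), step=12.0):
    """Embed one bit into an 8x8 block by forcing the parity of a quantised
    mid-frequency DCT coefficient."""
    c = dctn(block.astype(float), norm="ortho")
    q = np.round(c[coeff] / step)
    if int(q) % 2 != bit:                      # force coefficient parity to carry the bit
        q += 1
    c[coeff] = q * step
    return idctn(c, norm="ortho")

frame_block = np.random.default_rng(0).integers(0, 256, size=(8, 8))
watermarked = embed_bit(frame_block, bit=1)
print(np.abs(watermarked - frame_block).max())  # the introduced distortion stays small
```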
Keywords: Data Hiding, Digital Watermarking, video coding H.264, Rate Control, Block DCT.
6921 Web Search Engine Based Naming Procedure for Independent Topic
Authors: Takahiro Nishigaki, Takashi Onoda
Abstract:
In recent years, the amount of document data has been increasing with the spread of the Internet. Many methods have been studied for extracting topics from large document collections. We proposed Independent Topic Analysis (ITA) to extract topics that are independent of each other from large document data such as newspaper data. ITA extracts independent topics from the document data by using Independent Component Analysis. A topic produced by ITA is represented by a set of words. However, such a set of words can be quite different from the topic the user imagines. For example, the top five words with high independence for one topic are as follows: Topic1 = {"scor", "game", "lead", "quarter", "rebound"}. This topic is considered to represent "SPORTS", but the topic name "SPORTS" has to be attached by the user; ITA cannot name topics. Therefore, in this research, we propose a method that uses a web search engine to obtain topic names that are easy for people to understand from the word sets given by Independent Topic Analysis. In particular, we search for the set of topical words, and the title of the web page returned as the search result is taken as the topic name. We also apply the proposed method to several data sets and verify its effectiveness.
Keywords: Independent topic analysis, topic extraction, topic naming, web search engine.
6920 An Anomaly Detection Approach to Detect Unexpected Faults in Recordings from Test Drives
Authors: Andreas Theissler, Ian Dear
Abstract:
In the automotive industry, test drives are conducted during the development of new vehicle models or as part of the quality assurance of series-production vehicles. The communication on the in-vehicle network, data from external sensors, and internal data from the electronic control units are recorded by automotive data loggers during the test drives. The recordings are used for fault analysis. Since the resulting data volume is tremendous, manually analysing each recording in great detail is not feasible. This paper proposes using machine learning to support domain experts by preventing them from contemplating irrelevant data and instead pointing them to the relevant parts of the recordings. The underlying idea is to learn normal behaviour from available recordings, i.e. a training set, and then to autonomously detect unexpected deviations and report them as anomalies. The one-class support vector machine "support vector data description" is utilised to calculate distances of feature vectors. SVDDSUBSEQ is proposed as a novel approach that allows subsequences in multivariate time series data to be classified. The approach allows unexpected faults to be detected without modelling effort, as shown by experimental results on recordings from test drives.
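The sketch below captures the underlying idea with scikit-learn's OneClassSVM standing in for the SVDD formulation: train on windows of a fault-free recording and flag deviating subsequences of a test recording; the window length, step, signals and nu value are illustrative assumptions, not the SVDDSUBSEQ settings.

```python
# A minimal sketch: one-class SVM over flattened subsequences of a multivariate
# recording; OneClassSVM stands in for SVDD, data and parameters are assumed.
import numpy as np
from sklearn.svm import OneClassSVM

def sliding_windows(signal, width=20, step=5):
    """Flatten overlapping subsequences of a (time x channels) recording."""
    return np.array([signal[i:i + width].ravel()
                     for i in range(0, len(signal) - width + 1, step)])

rng = np.random.default_rng(0)
normal_drive = rng.normal(size=(1000, 3))           # training recording (fault-free)
test_drive = rng.normal(size=(300, 3))
test_drive[150:170, 1] += 6.0                        # inject an unexpected fault

model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
model.fit(sliding_windows(normal_drive))

flags = model.predict(sliding_windows(test_drive))   # -1 marks anomalous subsequences
print(np.where(flags == -1)[0])                      # window indices reported to the expert
```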
Keywords: Anomaly detection, fault detection, test drive analysis, machine learning.
6919 A Comparison Study of the Removal of Selected Pharmaceuticals in Waters by Chemical Oxidation Treatments
Authors: F. Javier Benitez, Juan Luis Acero, Francisco J. Real, Gloria Roldan, Francisco Casas
Abstract:
The degradation of selected pharmaceuticals in several water matrices was studied using several chemical treatments. The pharmaceuticals selected were the beta-blocker metoprolol, the nonsteroidal anti-inflammatory naproxen, the antibiotic amoxicillin, and the analgesic phenacetin; their degradation was conducted by using UV radiation alone, ozone, Fenton's reagent, a Fenton-like system, a photo-Fenton system, and combinations of UV radiation and ozone with H2O2, TiO2, Fe(II), and Fe(III). The water matrices, in addition to ultra-pure water, were a reservoir water, a groundwater, and two secondary effluents from two municipal WWTPs. The results reveal that the presence of any second oxidant enhanced the oxidation rates, with the UV/TiO2 and O3/TiO2 systems providing the highest degradation rates. It is also observed in most of the investigated oxidation systems that the degradation rate followed the sequence: amoxicillin > naproxen > metoprolol > phenacetin. Lower rates were obtained with the pharmaceuticals dissolved in natural waters and secondary effluents, due to the organic matter present, which consumes some of the oxidizing agents.
Keywords: Pharmaceuticals, UV radiation, ozone, advanced oxidation processes, water matrices, degradation rates.
6918 Suicide Conceptualization in Adolescents through Semantic Networks
Authors: K. P. Valdés García, E. I. Rodríguez Fonseca, L. G. Juárez Cantú
Abstract:
Suicide is a global, multidimensional and dynamic mental health problem that requires constant study for its understanding and prevention. When this phenomenon is researched, it is necessary to consider the different characteristics it may have because of individual and sociocultural variables; the importance of this consideration is related to the generation of effective treatments and interventions. Adolescents are a vulnerable population due to the characteristics of their developmental stage. The investigation was carried out with the objective of identifying and describing adolescents' conceptualization of suicide and, in this process, finding possible differences between men and women. The study was carried out in Saltillo, Coahuila, Mexico. The sample was composed of 418 volunteer students aged between 11 and 18 years. The ethical aspects of the research were reviewed and considered in all the processes of the investigation with the participants, their parents and the schools to which they belonged; psychological attention was offered to the participants, and preventive workshops were carried out in the educational institutions. Natural semantic networks were the instrument used, since this hybrid method allows the social concept of a phenomenon to be found and analyzed; in this case, the word suicide was used as the evocative stimulus, and participants were asked to evoke at least five and at most 10 words they thought were related to suicide, and then to rank them according to their closeness to the construct. The subsequent analysis was carried out with Excel, yielding the semantic weights, affective loads and the distances between each of the semantic fields established according to the words reported by the subjects. The results showed similarities in the conceptualization of suicide between adolescent men and women. Seven semantic fields were generated; the words were related in the discourse analysis to: 1) death, 2) possible triggering factors, 3) associated moods, 4) methods used to carry it out, 5) psychological symptomatology that could affect, 6) words associated with a rejection of suicide, and finally, 7) specific objects to carry it out. One of the necessary aspects to consider in the investigation of complex issues such as suicide is to have a diversity of instruments and techniques that are adjusted to the characteristics of the population and that allow the phenomena to be understood from social constructs and not only theoretically. The constant study of suicide is a pressing need; the loss of a life from emotional difficulties that can be addressed through psychiatric and psychological methods requires governments and professionals to pay attention and work with the at-risk population.
Keywords: Adolescents, semantic networks, speech analysis, suicide.
6917 Resilient Modulus and Deformation Responses of Waste Glass in Flexible Pavement System
Authors: M. Al-Saedi, A. Chegenizadeh, H. Nikraz
Abstract:
Experimental investigations were conducted to assess a layered structure of glass (G) - rock (R) blends under the impact of repeated loading. Laboratory tests included sieve analyses, the modified compaction test and the repeated load triaxial test (RLTT), conducted on different structures of stratified GR samples to meet the objectives of this study. Waste materials are essential components in the climate system and are also commonly used to minimise the need for natural materials in many countries. Glass is one of the most widely used groups of waste materials and has been used extensively in road applications. Glass of the full range of particle sizes and colours was collected and mixed at different ratios with natural rock material with the aim of using the blends in pavement layers. The whole subsurface specimen consists sequentially of a single layer of R and a layer of G-R blend. 12G/88R and 45G/55R mix ratios were employed in this research, the thickness of the G-R layer was varied, and the results were compared between the pure rock and the layered specimens. The relations of resilient modulus (Mr) and permanent deformation with sequence number are presented. During the earlier stages of the RLTT, the results indicated that the 45G/55R specimen shows higher moduli than the R specimen.
Keywords: Rock base course, layered structure, glass, resilient modulus.
6916 Decision Tree for Competing Risks Survival Probability in Breast Cancer Study
Authors: N. A. Ibrahim, A. Kudus, I. Daud, M. R. Abu Bakar
Abstract:
Competing risks survival data, which comprise more than one type of event, have been used in many applications, one of which is clinical studies (e.g. breast cancer studies). The decision tree method can be extended to competing risks survival data by modifying the split function so as to accommodate two or more risks that might be dependent on each other. Recently, researchers have constructed decision trees for recurrent survival time data using frailty and marginal modelling. We further extend the method to the case of competing risks. In this paper, we develop a decision tree method for competing risks survival time data based on proportional hazards for the subdistribution of competing risks. In particular, we grow a tree by using the deviance statistic. An application to breast cancer data is presented. Finally, to investigate the performance of the proposed method, simulation studies on the identification of the true group of observations were executed.
Keywords: Competing risks, Decision tree, Simulation, Subdistribution Proportional Hazard.
6915 Spatially Random Sampling for Retail Food Risk Factors Study
Authors: Guilan Huang
Abstract:
In 2013 and 2014, the U.S. Food and Drug Administration (FDA) collected data from selected fast food restaurants and full service restaurants for tracking changes in the occurrence of foodborne illness risk factors. This paper discusses how we customized a spatial random sampling method by considering the financial position and availability of FDA resources, and how we enriched the restaurant data with location information. Location information for restaurants provides the opportunity to quantitatively determine random sampling within non-governmental units (e.g. 240 kilometers around each data collector). Spatial analysis could also optimize data collectors' work plans and resource allocation. A spatial analytics and processing platform helped us handle the challenges of spatial random sampling. Our method fits the FDA's ability to pinpoint features of foodservice establishments, and it reduced both the time and expense of data collection.
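As a simplified illustration of drawing a spatial random sample within a fixed radius of a data collector, the sketch below filters candidate restaurants by great-circle distance and samples from the result; the coordinates, radius handling and sample size are illustrative assumptions, not FDA's actual sampling design.

```python
# A minimal sketch of radius-constrained spatial random sampling; all
# locations and the sample size are made up for illustration.
import math, random

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

restaurants = [("Rest A", 38.9, -77.0), ("Rest B", 39.3, -76.6), ("Rest C", 40.7, -74.0)]
collector = (38.95, -77.05)                      # assumed data-collector location

reachable = [r for r in restaurants
             if haversine_km(collector[0], collector[1], r[1], r[2]) <= 240.0]
random.seed(0)
sample = random.sample(reachable, k=min(2, len(reachable)))  # random draw inside the radius
print(sample)
```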
Keywords: Geospatial technology, restaurant, retail food risk factors study, spatial random sampling.
6914 Anisotropic Total Fractional Order Variation Model in Seismic Data Denoising
Authors: Jianwei Ma, Diriba Gemechu
Abstract:
In seismic data processing, attenuation of random noise is the basic step for improving the quality of data for the further application of seismic data in exploration and development in the gas and oil industries. The signal-to-noise ratio of the data also strongly determines the quality of seismic data. This factor affects the reliability as well as the accuracy of the seismic signal during interpretation for different purposes in different companies. To use seismic data for further application and interpretation, we need to improve the signal-to-noise ratio while attenuating random noise effectively. To improve the signal-to-noise ratio and attenuate seismic random noise while preserving important features and information of the seismic signals, we introduce an anisotropic total fractional order denoising algorithm. The anisotropic total fractional order variation model, defined in the space of fractional order bounded variation, is proposed as a regularization in seismic denoising. The split Bregman algorithm is employed to solve the minimization problem of the anisotropic total fractional order variation model, and the corresponding denoising algorithm for the proposed method is derived. We test the effectiveness of the proposed method on synthetic and real seismic data sets, and the denoised results are compared with F-X deconvolution and the non-local means denoising algorithm.
Keywords: Anisotropic total fractional order variation, fractional order bounded variation, seismic random noise attenuation, Split Bregman Algorithm.
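For orientation, a generic form of the anisotropic fractional order denoising model and its split Bregman iteration is sketched below; here the fractional order difference operators are written as ∇_x^α and ∇_y^α, and the exact weighting and discretization used in the paper may differ.

```latex
% Generic anisotropic total fractional order variation denoising model (sketch):
\min_{u}\ \|\nabla_x^{\alpha}u\|_{1}+\|\nabla_y^{\alpha}u\|_{1}+\frac{\mu}{2}\|u-f\|_{2}^{2}

% Split Bregman: introduce auxiliary variables d_j \approx \nabla_j^{\alpha}u,\ j\in\{x,y\}
u^{k+1}=\arg\min_{u}\ \frac{\mu}{2}\|u-f\|_{2}^{2}
  +\frac{\lambda}{2}\sum_{j\in\{x,y\}}\big\|d_{j}^{k}-\nabla_{j}^{\alpha}u-b_{j}^{k}\big\|_{2}^{2}

d_{j}^{k+1}=\operatorname{shrink}\!\left(\nabla_{j}^{\alpha}u^{k+1}+b_{j}^{k},\,1/\lambda\right),
\qquad
b_{j}^{k+1}=b_{j}^{k}+\nabla_{j}^{\alpha}u^{k+1}-d_{j}^{k+1}

% with the componentwise soft-thresholding (shrinkage) operator
\operatorname{shrink}(z,\gamma)=\operatorname{sign}(z)\,\max(|z|-\gamma,0)
```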
6913 Improving Academic Performance Prediction using Voting Technique in Data Mining
Authors: Ikmal Hisyam Mohamad Paris, Lilly Suriani Affendey, Norwati Mustapha
Abstract:
In this paper, we compare the accuracy of data mining methods for classifying students in order to predict a student's class grade. These predictions are most useful for identifying weak students and assisting management in taking remedial measures at early stages, so as to produce excellent graduates who will graduate with at least a second class upper. First, we examine the accuracy of single classifiers on our data set and choose the best one, and then we ensemble it with a weak classifier to produce a simple voting method. The results show that combining different classifiers outperforms single classifiers for predicting student performance.
Keywords: Classification, Data Mining, Prediction, Combination of Multiple Classifiers.
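The simple voting idea can be sketched with scikit-learn as below; the synthetic feature matrix stands in for the student records, and the particular base classifiers are illustrative assumptions, not the classifier pool evaluated in the paper.

```python
# A minimal sketch of majority voting versus a single classifier; data and
# base learners are assumptions for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=10, random_state=0)  # stand-in for student records

best_single = DecisionTreeClassifier(random_state=0)
voter = VotingClassifier(
    estimators=[("tree", DecisionTreeClassifier(random_state=0)),
                ("nb", GaussianNB()),
                ("knn", KNeighborsClassifier())],
    voting="hard")                      # majority vote over the base predictions

print("single:", cross_val_score(best_single, X, y, cv=10).mean())
print("voting:", cross_val_score(voter, X, y, cv=10).mean())
```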
6912 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas
Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards
Abstract:
Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, spatial analysis, etc. A laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data are mainly ground points (that represent the bare earth) and non-ground points (that represent buildings, trees, cars, etc.). Removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task, as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data. The presented filter utilizes a weight function to allocate a weight to each point of the data. Furthermore, unlike most methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and the performance of the method is stable for all the heavily forested data samples. The average root mean square error (RMSE) value is 0.35 m.
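The general filtering idea can be sketched with SciPy's bivariate smoothing spline: fit a surface, down-weight points lying well above it (likely canopy returns), refit, and keep points close to the final surface; the weight function, thresholds and iteration count below are illustrative assumptions, not the authors' exact algorithm.

```python
# A minimal sketch of iteratively reweighted spline filtering on a synthetic
# point cloud; thresholds, weights and iteration count are assumptions.
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

rng = np.random.default_rng(0)
x, y = rng.uniform(0, 100, 2000), rng.uniform(0, 100, 2000)
ground = 0.05 * x + 2.0 * np.sin(y / 15.0)                # synthetic terrain
z = ground + rng.normal(0, 0.1, 2000)
canopy = rng.random(2000) < 0.4
z[canopy] += rng.uniform(5, 25, canopy.sum())             # synthetic vegetation returns

weights = np.ones_like(z)
for _ in range(3):                                        # iterative reweighting
    spline = SmoothBivariateSpline(x, y, z, w=weights, s=len(z))
    residual = z - spline.ev(x, y)
    weights = np.where(residual > 1.0, 0.01, 1.0)         # penalise points above the surface

ground_mask = np.abs(z - spline.ev(x, y)) < 1.0           # points kept as bare earth
print(ground_mask.sum(), "terrain points retained")
```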
Keywords: Airborne laser scanning, digital terrain models, filtering, forested areas.
6911 Blockchain for IoT Security and Privacy in Healthcare Sector
Authors: Umair Shafique, Hafiz Usman Zia, Fiaz Majeed, Samina Naz, Javeria Ahmed, Maleeha Zainab
Abstract:
The Internet of Things (IoT) has been a hot topic for the last couple of years. This innovative technology has shown promising progress in various areas, and the world has witnessed exponential growth in multiple application domains. Researchers are working to investigate its capabilities and to harness its true potential. At the same time, however, IoT networks open up a new aspect of vulnerability and physical threats to data integrity, privacy, and confidentiality. This is due to centralized control, a data-silo approach to handling information, and a lack of standardization in IoT networks. As we know, blockchain is a new technology that involves creating secure distributed ledgers to store and communicate data. Some of its benefits include resiliency, integrity, anonymity, decentralization, and autonomous control. The potential for blockchain technology to provide the key to managing and controlling IoT has created a new wave of excitement around the idea of putting that data back into the hands of the end-users. In this manuscript, we propose a model that combines blockchain and IoT networks to address potential security and privacy issues in the healthcare domain, and we describe how various stakeholders will interact with the system.
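To make the ledger primitive concrete, the sketch below chains IoT readings by hash so that tampering with a stored record is detectable; consensus, access control and the healthcare stakeholder roles of the proposed model are outside the scope of this illustration, and the device payloads are made up.

```python
# A minimal sketch of a hash-chained ledger of IoT readings; this illustrates
# only the integrity property, not the full model proposed in the manuscript.
import hashlib, json, time

def block_hash(block):
    body = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(previous_hash, payload):
    block = {"timestamp": time.time(), "payload": payload, "prev": previous_hash}
    block["hash"] = block_hash(block)
    return block

def chain_is_valid(chain):
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):               # contents were altered
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:  # link to predecessor broken
            return False
    return True

genesis = make_block("0" * 64, {"device": "pulse-oximeter-01", "spo2": 97})
second = make_block(genesis["hash"], {"device": "pulse-oximeter-01", "spo2": 95})
chain = [genesis, second]
print(chain_is_valid(chain))          # True
chain[0]["payload"]["spo2"] = 99      # tamper with a stored IoT reading
print(chain_is_valid(chain))          # False: the integrity violation is detected
```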
Keywords: Internet of Things, IoT, blockchain, data integrity, authentication, data privacy.