Search results for: physiological data extraction

24708 Understanding the Nature of Blood Pressure as Metabolic Syndrome Component in Children

Authors: Mustafa M. Donma, Orkide Donma

Abstract:

Pediatric overweight and obesity need attention because they may lead to morbid obesity, which may progress to metabolic syndrome (MetS). Criteria used for the definition of adult MetS cannot be applied to pediatric MetS. Dynamic physiological changes that occur during childhood and adolescence require the evaluation of each parameter within age intervals. The aim of this study is to investigate the distribution of blood pressure (BP) values within diverse pediatric age intervals and the possible use and clinical utility of a recently introduced Diagnostic Obesity Notation Model Assessment Tension (DONMA tense) Index derived from systolic BP (SBP) and diastolic BP (DBP) as [(SBP+DBP)/200]. Such a formula may give a more integrative picture for the assessment of pediatric obesity and MetS because it uses both SBP and DBP. A total of 554 children aged 6-16 years participated in the study; the study population was divided into two groups by age. The first group comprised 280 cases aged 6-10 years (72-120 months), while those aged 10-16 years (121-192 months) constituted the second group. SBP, DBP and the combined formula [(SBP+DBP)/200] were evaluated. Each group was divided into seven subgroups with varying degrees of obesity and MetS criteria. Two clinical definitions of MetS were used: MetS3 (children with three major components) and MetS2 (children with two major components). The other groups were morbid obese (MO), obese (OB), overweight (OW), normal (N) and underweight (UW). Children were assigned to groups according to the age- and sex-based body mass index (BMI) percentile values tabulated by WHO. Data were evaluated by SPSS version 16, with p < 0.05 taken as statistically significant. The tension index was evaluated in the groups above and below 10 years of age. The index differed significantly between the N and MetS groups as well as between the OW and MetS groups (p = 0.001) above 120 months, whereas below 120 months significant differences existed between MetS3 and MetS2 (p = 0.003) as well as between MetS3 and MO (p = 0.001). Compared with the SBP and DBP values alone, the tension index enabled a more clear-cut separation between the groups. The tension index was capable of discriminating MetS3 from MetS2 among children aged 6-10 years; this was not possible in the older group, so the index was more informative for the younger group. This study also confirmed that the 130 mm Hg and 85 mm Hg cut-off points for SBP and DBP, respectively, are too high to serve as MetS criteria in children, because the mean tension index among MetS children was 1.00. This finding shows that much lower SBP and DBP cut-off points must be set for the diagnosis of pediatric MetS, especially for children under 10 years of age. The index may be recommended to discriminate MO, MetS2 and MetS3 in the 6-10 years age group, in which MetS diagnosis is problematic.
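
To make the index concrete, here is a minimal sketch, assuming the bracketed formula means (SBP + DBP) / 200 with both pressures in mm Hg; the function name is illustrative, not from the paper.

```python
def tension_index(sbp: float, dbp: float) -> float:
    """DONMA tension index from systolic and diastolic BP (mm Hg)."""
    return (sbp + dbp) / 200.0

# The adult MetS cut-offs of 130/85 mm Hg give an index of about 1.08,
# close to the mean of 1.00 reported for MetS children above, which is
# why the study argues these cut-offs are too high for children.
print(tension_index(130, 85))  # 1.075
```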

Keywords: blood pressure, children, index, metabolic syndrome, obesity

Procedia PDF Downloads 112
24707 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels

Authors: Joshua Buli, David Pietrowski, Samuel Britton

Abstract:

Processing SAR data usually requires constraints on extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground-plane projection, with or without terrain as a component, to present SAR data in an image domain closer to what a human observer would see and so ease interpretation. An alternative, computationally heavier method that makes use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.), the data are then range compressed, and lastly the contribution from each pulse is determined for each specific point in space by searching the time-history data for the reflectivity values of each pulse, summed over the entire collection. This yields a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing now allow this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase-history data size and 3D point cloud size. Backprojection processing algorithms are embarrassingly parallel, since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independently of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing accurate reflectivity representation of a scene. Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can serve as the basis for the backprojection technique. The technique minimizes interpolations and modifications of the raw data, maintaining maximum data integrity. This processing will allow SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
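
The per-point reflectivity sum described above can be sketched as follows; this is one common backprojection formulation (linear range interpolation plus a carrier phase correction) written as an illustrative assumption, not the authors' implementation.

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def backproject(points, antenna_pos, pulses, fs, fc):
    """points: (N, 3) voxels; antenna_pos: (P, 3) platform position per pulse;
    pulses: (P, S) range-compressed complex samples; fs: range sample rate;
    fc: carrier frequency. Returns (N,) complex reflectivity per voxel."""
    image = np.zeros(len(points), dtype=complex)
    sample_axis = np.arange(pulses.shape[1])
    for p in range(pulses.shape[0]):                   # pulses are independent,
        r = np.linalg.norm(points - antenna_pos[p], axis=1)
        idx = (2.0 * r / C) * fs                       # round-trip delay -> sample
        echo = np.interp(idx, sample_axis, pulses[p])  # range interpolation
        image += echo * np.exp(1j * 4.0 * np.pi * fc * r / C)  # phase correction
    return image  # ...so each (voxel, pulse) pair can map to its own GPU thread
```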

Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization

Procedia PDF Downloads 69
24706 Electro-Winning of Dilute Solution of Copper Metal from Sepon Mine, Lao PDR

Authors: S. Vasailor, C. Rattanakawin

Abstract:

Electro-winning of copper metal from dilute sulfate solution (13.7 g/L) was performed in a laboratory electrolytic cell with a stainless-steel cathode and a lead-alloy anode. The effects of various parameters, including cell voltage, electro-winning temperature and time, were studied in order to obtain an appropriate current efficiency of copper deposition. The highest efficiency, about 95%, was obtained at 3 V, 55°C and 3,600 s, respectively. Cathode copper of 95.5% Cu, analyzed by atomic absorption spectrometry, can be obtained under this single-stage winning condition. To increase the copper grade, solvent extraction should be used to raise the sulfate concentration, to around 50 g/L, prior to winning the cathode copper effectively.
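
For reference, current efficiency in electro-winning is conventionally computed from Faraday's law as the ratio of actual to theoretical copper deposited. The sketch below illustrates this with made-up input values; they are not measurements from the paper.

```python
F = 96485.0    # Faraday constant (C/mol)
M_CU = 63.55   # molar mass of Cu (g/mol)
N_E = 2        # electrons per Cu2+ ion reduced

def current_efficiency(mass_deposited_g, current_a, time_s):
    """Percent ratio of actual to theoretical (Faraday's law) Cu deposit."""
    theoretical_g = (current_a * time_s / F) * (M_CU / N_E)
    return 100.0 * mass_deposited_g / theoretical_g

# e.g. 1.12 g deposited at 1.0 A over the 3,600 s run mentioned above
print(current_efficiency(1.12, 1.0, 3600))  # ~94.5 %
```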

Keywords: copper metal, current efficiency, dilute sulfate solution, electro-winning

Procedia PDF Downloads 132
24705 Integration of Knowledge and Metadata for Complex Data Warehouses and Big Data

Authors: Jean Christian Ralaivao, Fabrice Razafindraibe, Hasina Rakotonirainy

Abstract:

This paper resumes work carried out in the field of complex data warehouses (DW) on the management and formalization of knowledge and metadata. It offers a methodological approach for integrating the two concepts, knowledge and metadata, within the framework of a complex DW architecture. The work combines the technique of knowledge representation by description logics with an extension of the Common Warehouse Metamodel (CWM) specifications, with expected gains in the performance of a complex DW. Three essential aspects of this work are expected: the representation of knowledge in description logics, the declination of this knowledge into consistent UML diagrams while respecting or extending the CWM specifications, and the use of XML as a pivot format. The field of application is broad but is best suited to systems with heterogeneous, complex and unstructured content that moreover require extensive (re)use of knowledge, such as medical data warehouses.

Keywords: data warehouse, description logics, integration, knowledge, metadata

Procedia PDF Downloads 131
24704 Data Analytics in Energy Management

Authors: Sanjivrao Katakam, Thanumoorthi I., Antony Gerald, Ratan Kulkarni, Shaju Nair

Abstract:

With increasing energy costs and their impact on business, sustainability has evolved from a social expectation to an economic imperative, and finding methods to reduce cost has become a critical directive for industry leaders. Effective energy management is the only way to cut costs. However, energy management has been a challenge because it requires a change in old habits and in legacy systems followed for decades. Industries today capture and store exorbitant volumes of energy and operational data, but they are unable to convert these structured and unstructured data sets into meaningful business intelligence. For quick decisions, organizations must learn to cope with large volumes of operational data in different formats. Energy analytics not only helps in extracting inferences from these data sets but is also instrumental in the transformation from old approaches to energy management to new ones, which in turn assists effective decision making for implementation. Organizations need an established corporate strategy for reducing operational costs through visibility and optimization of energy usage, and energy analytics plays a key role in the optimization of operations. The paper describes how energy data analytics is now used extensively in scenarios such as reducing operational costs, predicting energy demand, optimizing network efficiency, maintaining assets, and improving customer and device data insights. It also highlights how analytics helps transform the insights obtained from energy data into sustainable solutions. The paper draws on data from an array of segments such as the retail, transportation, and water sectors.

Keywords: energy analytics, energy management, operational data, business intelligence, optimization

Procedia PDF Downloads 358
24703 Efficient Frequent Itemset Mining Methods over Real-Time Spatial Big Data

Authors: Hamdi Sana, Emna Bouazizi, Sami Faiz

Abstract:

In recent years, there has been a huge increase in the use of spatio-temporal applications in which data and queries are continuously moving. The need to process real-time spatio-temporal data is therefore clear, and real-time stream data management has become a hot topic. The sliding window model and frequent itemset mining over dynamic data are among the most important problems in data mining, and the sliding window model is widely used for frequent itemset mining over data streams because of its emphasis on recent data and its bounded memory requirement. Existing methods use the traditional transaction-based sliding window model, in which the window size is a fixed number of transactions. This model implicitly assumes that transactions arrive at a constant rate, which is not the case in real-time applications, and using it in such applications endangers their performance. Based on these observations, this paper relaxes the notion of window size and proposes a timestamp-based sliding window model, sketched below. In the proposed frequent itemset mining algorithm, support conditions are used to differentiate frequent from infrequent patterns, and a tree is developed to incrementally maintain the essential information. A preliminary evaluation of the contribution shows quite promising results.
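
A minimal sketch of the contrast drawn above: in a timestamp-based window, transactions expire by age rather than by count, so variable arrival rates no longer distort the window. Class and method names are illustrative, and the paper's tree structure and support conditions are not reproduced here.

```python
from collections import deque

class TimestampWindow:
    """Sliding window keyed on transaction timestamps, not counts."""
    def __init__(self, span_seconds):
        self.span = span_seconds
        self.window = deque()                     # (timestamp, itemset) pairs

    def add(self, timestamp, itemset):
        self.window.append((timestamp, itemset))
        while self.window and timestamp - self.window[0][0] > self.span:
            self.window.popleft()                 # expire by age, not count

    def support(self, itemset):
        """Fraction of in-window transactions containing the itemset."""
        if not self.window:
            return 0.0
        hits = sum(1 for _, t in self.window if itemset <= t)
        return hits / len(self.window)

w = TimestampWindow(span_seconds=60.0)
w.add(0.0, frozenset({"a", "b"}))
w.add(45.0, frozenset({"a"}))
w.add(120.0, frozenset({"a", "c"}))   # evicts the t=0 and t=45 transactions
print(w.support(frozenset({"a"})))    # 1.0 within the current window
```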

Keywords: real-time spatial big data, frequent itemset, transaction-based sliding window model, timestamp-based sliding window model, weighted frequent patterns, tree, stream query

Procedia PDF Downloads 150
24702 Effect of Cooking Time, Seed-To-Water Ratio and Soaking Time on the Proximate Composition and Functional Properties of Tetracarpidium conophorum (Nigerian Walnut) Seeds

Authors: J. O. Idoko, C. N. Michael, T. O. Fasuan

Abstract:

This study investigated the effects of cooking time, seed-to-water ratio and soaking time on the proximate and functional properties of African walnut seed, using a Box-Behnken design with Response Surface Methodology (BBD-RSM), with a view to increasing its utilization in the food industry. African walnut seeds were sorted, washed, soaked, cooked, dehulled, sliced, dried and milled. Proximate analysis and functional properties of the samples were evaluated using standard procedures, and the data were analyzed using descriptive and inferential statistics. Quadratic models were obtained to predict the proximate and functional qualities as a function of cooking time, seed-to-water ratio and soaking time. The results showed that crude protein ranged between 11.80% and 23.50%, moisture content between 1.00% and 4.66%, ash content between 3.35% and 5.25%, crude fibre from 0.10% to 7.25%, and carbohydrate from 1.22% to 29.35%. For the functional properties, soluble protein ranged from 16.26% to 42.96%, viscosity from 23.43 mPas to 57 mPas, emulsifying capacity from 17.14% to 39.43%, and water absorption capacity from 232% to 297%. An increase in the volume of water used during cooking resulted in loss of water-soluble protein through leaching; the length of soaking time and the moisture content of the dried product were inversely related; ash content was inversely related to the cooking time and amount of water used; fat extraction was enhanced by an increase in soaking time; and increases in cooking and soaking times decreased the fibre content. These results indicate that African walnut could be used in several food formulations as a protein supplement and binder.
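
As context for the quadratic models mentioned above, the sketch below fits a full second-order (RSM-style) model of one response as a function of the three factors; the design points and response values are synthetic stand-ins, not the study's measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
# columns: cooking time (min), seed-to-water ratio, soaking time (h)
X = rng.uniform([10, 1, 6], [60, 5, 24], size=(15, 3))
y = 15 + 0.1 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.5, 15)  # e.g. protein %

quad = PolynomialFeatures(degree=2, include_bias=False)  # full second-order terms
model = LinearRegression().fit(quad.fit_transform(X), y)
print(model.coef_)   # linear, interaction, and squared-term coefficients
```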

Keywords: African walnut, functional properties, proximate analysis, response surface methodology

Procedia PDF Downloads 385
24701 The Extent of Big Data Analysis by the External Auditors

Authors: Iyad Ismail, Fathilatul Abdul Hamid

Abstract:

This research investigates the extent of big data analysis by external auditors. The paper adopts grounded theory as a framework for conducting a series of semi-structured interviews with eighteen external auditors. The findings describe the extent of big data availability and of big data analysis usage by external auditors in the Gaza Strip, Palestine. The study's outcomes point to a series of auditing procedures for improving external auditing techniques, leading to a high-quality audit process. The research is also valuable for auditing firms, giving insight into their mechanisms and identifying the most important strategies that help achieve competitive audit quality. The results aim to guide academic and professional auditing institutions in developing big data analysis techniques for external auditors. The paper provides appropriate information for the decision-making process and a source of future information affecting technological auditing.

Keywords: big data analysis, external auditors, audit reliance, internal audit function

Procedia PDF Downloads 61
24700 New Advanced Medical Software Technology Challenges and Evolution of the Regulatory Framework in Expert Software, Artificial Intelligence, and Machine Learning

Authors: Umamaheswari Shanmugam, Silvia Ronchi, Radu Vornicu

Abstract:

Software, artificial intelligence, and machine learning can improve healthcare through innovative and advanced technologies that are able to use the large amount and variety of data generated during healthcare services every day. Over 500 machine learning or other artificial intelligence medical devices have now received FDA clearance or approval, the first ones even preceding the year 2000. One of the big advantages of these new technologies is the ability to gain experience and knowledge from real-world use and to continuously improve their performance. Healthcare systems and institutions can benefit greatly, because advanced technologies improve both the efficiency and the efficacy of healthcare. Software as a medical device is stand-alone software intended to be used for one or more specific medical purposes: diagnosis, prevention, monitoring, prediction, prognosis, treatment or alleviation of a disease or other health conditions; replacing or modifying any part of a physiological or pathological process; or managing information received from in vitro specimens derived from the human body, provided the software does not achieve its principal intended action by pharmacological, immunological or metabolic means. Software qualified as a medical device must comply with the general safety and performance requirements applicable to medical devices. These requirements are necessary to ensure high performance and quality and to protect patients' safety. The evolution and continuous improvement of software used in healthcare must take into consideration the increase in regulatory requirements, which are becoming more complex in each market. The gap between these advanced technologies and the new regulations is the biggest challenge for medical device manufacturers. Regulatory requirements can be considered a market barrier, as they can delay or obstruct device approval, but they are necessary to ensure performance, quality and safety; at the same time, they can be a business opportunity if the manufacturer is able to define the appropriate regulatory strategy in advance. The abstract provides an overview of the current regulatory framework, the evolution of international requirements, and the standards applicable to medical device software in potential markets all over the world.

Keywords: artificial intelligence, machine learning, SaMD, regulatory, clinical evaluation, classification, international requirements, MDR, 510k, PMA, IMDRF, cyber security, health care systems

Procedia PDF Downloads 86
24699 A Model of Teacher Leadership in History Instruction

Authors: Poramatdha Chutimant

Abstract:

The objective of the research was to propose a model of teacher leadership in history instruction for utilization. Everett M. Rogers' Diffusion of Innovations Theory is applied as the theoretical framework. A qualitative method is used, with an interview protocol as the instrument for collecting primary data from best-practice teachers awarded by the Office of the National Education Commission (ONEC). Open-ended questions are used in the interview protocol in order to gather varied data. Information on the international context of history instruction serves as the secondary data used to support the summarizing process (content analysis). A dendrogram is key to interpreting and synthesizing the primary data, with the secondary data supporting explanation and elaboration. In-depth interviews with seven experts in the educational field are then used to collect information, with the final focal point of validating a draft model for future utilization.

Keywords: history study, nationalism, patriotism, responsible citizenship, teacher leadership

Procedia PDF Downloads 276
24698 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation

Authors: Mohammad Anwar, Shah Waliullah

Abstract:

This study investigated panel data regression models, using Bayesian and classical methods to study the impact of institutions on economic growth with data for 1990-2014, especially in developing countries. Under both the classical and the Bayesian methodology, two panel data models were estimated: common effects and fixed effects. For the Bayesian approach, prior information is used, with a normal-gamma prior for the panel data models. The analysis was done with the WinBUGS14 software. The estimated results showed that panel data models are valid models in the Bayesian methodology. In the Bayesian approach, all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all models, the fixed effect model is the best model in the Bayesian estimation of panel data models, as it was proved to have the lowest standard error compared to the other models.

Keywords: Bayesian approach, common effect, fixed effect, random effect, dynamic random effect model

Procedia PDF Downloads 65
24697 Viscoelastic Response of the Human Corneal Stroma Induced by Riboflavin/UVA Cross-Linking

Authors: C. Labate, M. P. De Santo, G. Lombardo, R. Barberi, M. Lombardo, N. M. Ziebarth

Abstract:

In the past decades, the importance of corneal biomechanics in the normal and pathological functions of the eye has gained credibility; indeed, the mechanical properties of biological tissues are essential to their physiological function. We are convinced that an improved understanding of the nanomechanics of corneal tissue is important for understanding the basic molecular interactions between collagen fibrils. Ultimately, this information will help in the development of new techniques to cure ocular diseases and in the development of biomimetic materials. Nanotechnology techniques are therefore powerful tools, and Atomic Force Microscopy in particular has demonstrated its ability to reliably characterize the biomechanics of biological tissues at either the micro- or the nano-level. In recent years, we have investigated the mechanical anisotropy of the human corneal stroma at both the tissue and the molecular level. In particular, we have focused on corneal cross-linking, an established procedure aimed at slowing down or halting the progression of the disease known as keratoconus. We have obtained the first evidence that riboflavin/UV-A corneal cross-linking induces both an increase of the elastic response and a decrease of the viscous response of the most anterior stroma at the scale of stromal molecular interactions.

Keywords: atomic force spectroscopy, corneal stroma, cross-linking, viscoelasticity

Procedia PDF Downloads 307
24696 Placebo Analgesia in Older Age: Evidence from Event-Related Potentials

Authors: Angelika Dierolf, K. Rischer, A. Gonzalez-Roldan, P. Montoya, F. Anton, M. Van der Meulen

Abstract:

Placebo analgesia is a powerful cognitive endogenous pain modulation mechanism with high relevance in pain treatment. Older people would benefit especially from non-pharmacological pain interventions, since this age group is disproportionately affected by acute and chronic pain, while pharmacological treatments are less suitable due to polypharmacy and age-related changes in drug metabolism. Although aging is known to affect neurobiological and physiological aspects of pain perception, such as changes in pain threshold and pain tolerance, its effects on cognitive pain modulation strategies, including placebo analgesia, have hardly been investigated so far. In the present study, we are assessing placebo analgesia in 35 older adults (60 years and older) and 35 younger adults (between 18 and 35 years). Acute pain was induced with short transdermal electrical pulses to the inner forearm, using a concentric stimulating electrode, with stimulation intensities individually adjusted to the participant's threshold. Next to the stimulation site, we applied sham transcutaneous electrical nerve stimulation (TENS). Participants were informed that sometimes the TENS device would be switched on (placebo condition) and sometimes switched off (control condition); in reality, it was always switched off. Participants received alternating blocks of painful stimuli in the placebo and control conditions and were asked to rate the intensity and unpleasantness of each stimulus on a visual analog scale (VAS). Pain-related evoked potentials were recorded with a 64-channel EEG. Preliminary results show a reduced placebo effect in older compared to younger adults in both the behavioral and the neurophysiological data. Older people experienced less subjective pain reduction under sham TENS treatment than younger adults, as evidenced by the VAS ratings. The N1 and P2 event-related potential components were generally reduced in the older group, and while younger adults showed a reduced N1 and P2 under sham TENS treatment, this reduction was considerably smaller in older people. This reduced placebo effect in the older group suggests that cognitive pain modulation is altered in aging and may at least partly explain why older adults experience more pain. Our results highlight the need for a better understanding of the efficacy of non-pharmacological pain treatments in older adults and of how these can be optimized to meet the specific requirements of this population.

Keywords: placebo analgesia, aging, acute pain, TENS, EEG

Procedia PDF Downloads 136
24695 Soul-Body Relationship in Medieval Islamic Thought – Analysis of Avicenna’s Psychology and Medicine with Implication to Mental Health

Authors: Yula Milshteyn

Abstract:

The present study focuses on the science of the "soul" in Islamic medieval psychology. The main objective is to analyze the concept of the "soul" in relation to "mental" disorders in the philosophical psychology and medical treatises of Ibn Sina, a Muslim Persian physician-philosopher known in the Western world as Avicenna (981-1037 CE). The examination concentrates on the nature of the soul, the relationship of the soul to the body, and the manifestation of health and sickness in soul and body. The analysis draws on Avicenna's Psychology (Kitab al-Najat, or The Book of Salvation), Remarks and Admonitions (Al-isharat wa al-tanbihat), and the medical treatise The Canon of Medicine (al-Qānūn fī al-Ṭibb). Avicenna's psychology of the soul is primarily based on Aristotelian and Neo-platonic paradigms. For Avicenna, the soul is a metaphysical, independent substance, which in modern terms implies the independence of human consciousness from the material body; the soul, however, is linked to the body and controls all its faculties or functions. In the specific case study of schizophrenia, it is suggested that the disorder pertains to both soul and body and can be characterized as a multi-faceted neurobiological, physiological, psychological and metaphysical-spiritual phenomenon.

Keywords: Avicenna, canon of the medicine, mental disorders, psychology, schizophrenia, soul-body

Procedia PDF Downloads 51
24694 Diagnosis of the Heart Rhythm Disorders by Using Hybrid Classifiers

Authors: Sule Yucelbas, Gulay Tezel, Cuneyt Yucelbas, Seral Ozsen

Abstract:

In this study, several heart rhythm disorders were identified from electrocardiography (ECG) data taken from the MIT-BIH arrhythmia database by extracting the required features and presenting them to artificial neural network (ANN), artificial immune system (AIS), artificial neural network based on artificial immune system (AIS-ANN), and particle swarm optimization based artificial neural network (PSO-ANN) classifier systems. The main purpose of the study is to evaluate the performance of the hybrid AIS-ANN and PSO-ANN classifiers with regard to ANN and AIS. For this purpose, RR-interval data were obtained for normal sinus rhythm (NSR), atrial premature contraction (APC), sinus arrhythmia (SA), ventricular trigeminy (VTI), ventricular tachycardia (VTK) and atrial fibrillation (AF). These data were formed into pairs (NSR-APC, NSR-SA, NSR-VTI, NSR-VTK and NSR-AF), a discrete wavelet transform was applied to each of the two groups in a pair, and, after data reduction, two different data sets with 9 and 27 features were obtained. Afterwards, the data were first randomly shuffled, and then 4-fold cross-validation was applied to create the training and testing data. The training and testing accuracy rates and training times were compared. As a result, the performance of the hybrid classification systems, AIS-ANN and PSO-ANN, was seen to be close to that of the ANN system, and the results of the hybrid systems were much better than those of AIS. However, ANN had a much shorter training time than the other systems; in terms of training time, ANN was followed by PSO-ANN, AIS-ANN and AIS, respectively. The features extracted from the data also affected the classification results significantly.
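
A hedged sketch of the ANN leg of such a pipeline: wavelet features from RR-interval sequences followed by 4-fold cross-validation. The wavelet, decomposition level, per-band statistics and synthetic data are all assumptions, and the paper's AIS and PSO hybrids are not reproduced.

```python
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

def dwt_features(rr, wavelet="db4", level=3):
    """Simple statistics of each DWT sub-band of an RR-interval sequence."""
    coeffs = pywt.wavedec(rr, wavelet, level=level)
    return np.array([s for c in coeffs
                     for s in (c.mean(), c.std(), np.abs(c).max())])

rng = np.random.default_rng(0)
signals = [rng.normal(0.8, 0.05, 256) for _ in range(40)]  # synthetic RR series
labels = rng.integers(0, 2, 40)                            # e.g. NSR vs. APC

X = np.array([dwt_features(s) for s in signals])
scores = cross_val_score(MLPClassifier(max_iter=2000, random_state=0),
                         X, labels, cv=4)
print("4-fold accuracy:", scores.mean())
```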

Keywords: AIS, ANN, ECG, hybrid classifiers, PSO

Procedia PDF Downloads 434
24693 Topic Modelling Using Latent Dirichlet Allocation and Latent Semantic Indexing on SA Telco Twitter Data

Authors: Phumelele Kubheka, Pius Owolawi, Gbolahan Aiyetoro

Abstract:

Twitter is one of the most popular social media platforms, where users can share their opinions on different subjects. As of 2010, the Twitter platform generates more than 12 terabytes of data daily, about 4.3 petabytes in a single year. For this reason, Twitter is a great source for big data mining. Many industries, such as telecommunication companies, can leverage the availability of Twitter data to better understand their markets and make appropriate business decisions. This study performs topic modeling on Twitter data using Latent Dirichlet Allocation (LDA), and the obtained results are benchmarked against another topic modeling technique, Latent Semantic Indexing (LSI). The study aims to retrieve topics from a Twitter dataset containing user tweets on South African telcos. The results show that LSI is much faster than LDA; however, LDA yields better results, with topic coherence higher by 8% for the best-performing model (Table 1). A higher topic coherence score indicates better performance of the model.
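
The LDA-versus-LSI comparison can be sketched with gensim as below; the toy corpus, parameters, and the u_mass coherence variant are illustrative assumptions (the real study used tweets about South African telcos).

```python
from gensim import corpora
from gensim.models import LdaModel, LsiModel
from gensim.models.coherencemodel import CoherenceModel

docs = [["network", "slow", "data", "bundle"],
        ["airtime", "purchase", "failed", "support"],
        ["coverage", "signal", "drop", "network"],
        ["billing", "error", "airtime", "refund"]]
dictionary = corpora.Dictionary(docs)
bow = [dictionary.doc2bow(d) for d in docs]   # bag-of-words corpus

lda = LdaModel(bow, num_topics=2, id2word=dictionary, random_state=0)
lsi = LsiModel(bow, num_topics=2, id2word=dictionary)

# topic coherence, the metric used above to compare the two models
print(CoherenceModel(model=lda, corpus=bow, dictionary=dictionary,
                     coherence="u_mass").get_coherence())
print(lsi.print_topics())
```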

Keywords: big data, latent Dirichlet allocation, latent semantic indexing, telco, topic modeling, twitter

Procedia PDF Downloads 145
24692 Enhance the Power of Sentiment Analysis

Authors: Yu Zhang, Pedro Desouza

Abstract:

Since big data has become substantially more accessible and manageable due to the development of powerful tools for dealing with unstructured data, people are eager to mine information from social media resources that could not be handled in the past. Sentiment analysis, as a novel branch of text mining, has in the last decade become increasingly important in marketing analysis, customer risk prediction and other fields. Scientists and researchers have undertaken significant work in creating and improving their sentiment models. In this paper, we present a concept of selecting appropriate classifiers based on the features and qualities of data sources, by comparing the performance of five classifiers on three popular social media data sources: Twitter, Amazon Customer Reviews, and Movie Reviews. We introduce a couple of innovative models that outperform traditional sentiment classifiers for these data sources, and provide insights on how to further improve the predictive power of sentiment analysis. The modelling and testing work was done in R and with Greenplum in-database analytic tools.
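
A generic sketch of this kind of classifier comparison (in Python with scikit-learn rather than the R/Greenplum stack used in the paper); the texts and labels are toy stand-ins for the Twitter, Amazon, and movie review sources.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC

texts = ["great phone, love it", "terrible battery, waste of money",
         "amazing picture quality", "awful support, never again",
         "works as expected, happy", "broke after one week"]
labels = np.array([1, 0, 1, 0, 1, 0])               # 1 = positive sentiment

X = TfidfVectorizer().fit_transform(texts)
for clf in (LogisticRegression(), MultinomialNB(), LinearSVC()):
    acc = cross_val_score(clf, X, labels, cv=2).mean()
    print(type(clf).__name__, round(acc, 2))        # accuracy per classifier
```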

Keywords: sentiment analysis, social media, Twitter, Amazon, data mining, machine learning, text mining

Procedia PDF Downloads 346
24691 Concentrations and History of Heavy Metals in Sediment Cores: Geochemistry and Geochronology Using 210Pb

Authors: F. Fernandes, C. Poleto

Abstract:

This paper aims to assess the concentrations of heavy metals and the isotopic composition of lead (210Pb) in different fractions of sediment produced in the watershed that feeds the Mãe d'água dam, thus characterizing the distribution of metals along the sedimentary column and drawing inferences about the urbanization process in the watershed. Sample collection was carried out in June 2014; eight sediment cores were sampled in the lake of the dam, extracted with a piston-core sampler. Trace metal concentrations were determined by conventional atomic absorption spectrophotometric methods, and the samples were subjected to radiochemical analysis of 210Po, from whose activity the 210Pb activity was obtained. The chronology was calculated using the constant rate of supply (CRS) model, with 210Pb used to estimate the sedimentation rate.
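
For background, the CRS model dates a depth from the unsupported 210Pb inventory remaining below it, t = (1/λ) ln(A(0)/A(z)). A rough sketch with illustrative numbers (not the paper's data):

```python
import numpy as np

LAMBDA_PB210 = np.log(2) / 22.3   # 210Pb decay constant (1/yr)

def crs_ages(unsupported_activity, slice_dry_mass):
    """unsupported_activity: Bq/kg per slice (top to bottom); slice_dry_mass:
    kg/m2 per slice. Returns the CRS age (yr) at the bottom of each slice."""
    inventory = unsupported_activity * slice_dry_mass   # Bq/m2 per slice
    total = inventory.sum()                             # A(0), whole core
    below = total - np.cumsum(inventory)                # A(z) below each slice
    return np.log(total / np.maximum(below, 1e-9)) / LAMBDA_PB210

ages = crs_ages(np.array([80.0, 60, 45, 30, 15]), np.full(5, 2.0))
print(np.round(ages, 1))   # ages grow with depth; the bottom value diverges
```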

Keywords: ²¹⁰Pb dating method, heavy metal, lakes urban, pollution history

Procedia PDF Downloads 293
24690 Real-Time Big-Data Warehouse a Next-Generation Enterprise Data Warehouse and Analysis Framework

Authors: Abbas Raza Ali

Abstract:

Big Data technology is gradually becoming a dire need of large enterprises, which generate massive amounts of offline and streaming data in both structured and unstructured formats on a daily basis. It is a challenging task to effectively extract useful insights from such large-scale datasets; sometimes it even becomes a technology constraint to manage a transactional data history of more than a few months. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested on a communication service provider producing massively large, complex streaming data in binary format. The communication industry is bound by regulators to manage the history of its subscribers' call records, where every action of a subscriber generates a record. Managing and analyzing transactional data also allows service providers to better understand their customers' behavior; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behaviour of subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated per subscriber. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. It has been applied to offload the service provider's existing Intelligent Network Mediation and relational Data Warehouse onto Big Data. The service provider has a subscriber base of over 50 million, with yearly growth of 7-10%. The end-to-end process, which involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations against a call (transformations), and aggregation of all the call records of a subscriber, takes no more than 10 minutes.

Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation

Procedia PDF Downloads 169
24689 Programming with Grammars

Authors: Peter M. Maurer

Abstract:

DGL is a context-free-grammar-based tool for generating random data. Many types of simulator input data require some computation to be placed in the proper format. For example, it might be necessary to generate ordered triples in which the third element is the sum of the first two, or to generate random numbers in some sorted order. Although DGL is universal in computational power, generating these types of data with it is extremely difficult. To overcome this problem, we have enhanced DGL with features that permit direct computation within the structure of a context-free grammar. The features have been implemented as special types of productions, preserving the context-free flavor of DGL specifications.
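
A toy sketch, in Python rather than DGL, of what such a computed production buys: an ordinary context-free expansion in which one production is replaced by code that computes the sum triples from the example above. All names are illustrative.

```python
import random

GRAMMAR = {
    "list": [["triple"], ["triple", ",", "list"]],  # ordinary CFG productions
}

def expand(symbol):
    if symbol == "triple":                   # a "computed" production:
        a, b = random.randint(0, 9), random.randint(0, 9)
        return f"({a},{b},{a + b})"          # third element is the sum
    if symbol in GRAMMAR:                    # ordinary CFG expansion
        return "".join(expand(s) for s in random.choice(GRAMMAR[symbol]))
    return symbol                            # terminal symbol

print(expand("list"))                        # e.g. (3,7,10),(1,4,5)
```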

Keywords: DGL, enhanced context-free grammars, programming constructs, random data generation

Procedia PDF Downloads 141
24688 A Model Architecture Transformation with Approach by Modeling: From UML to Multidimensional Schemas of Data Warehouses

Authors: Ouzayr Rabhi, Ibtissam Arrassen

Abstract:

To provide a complete analysis of the organization and to help decision-making, leaders need relevant data; Data Warehouses (DW) are designed to meet such needs. However, designing a DW is not trivial, and there is no formal method for deriving a multidimensional schema from heterogeneous databases. In this article, we present a model-driven approach to the design of data warehouses. We describe a multidimensional metamodel and specify a set of transformations starting from a Unified Modeling Language (UML) metamodel. In this approach, the UML metamodel and the multidimensional one are both considered platform-independent models (PIM). The first metamodel is mapped into the second through transformation rules carried out in the Query/View/Transformation (QVT) language. The proposal is validated by applying our approach to generating a multidimensional schema for a Balanced Scorecard (BSC) DW. We are particularly interested in the BSC perspectives, which are highly linked to the vision and the strategies of an organization.

Keywords: data warehouse, meta-model, model-driven architecture, transformation, UML

Procedia PDF Downloads 153
24687 Molecular Profiling of an Oleaginous Trebouxiophycean Alga Parachlorella kessleri Subjected to Nutrient Deprivation

Authors: Pannaga Pavan Jutur

Abstract:

Parachlorella kessleri, a marine unicellular green alga belonging to the class Trebouxiophyceae, accumulates large amounts of oil, i.e., lipids, under nutrient-deprived (-N, -P and -S) conditions. Understanding its metabolic imprints is important for elucidating the physiological mechanisms of lipid accumulation in this microalga subjected to nutrient deprivation. Metabolic and lipidomic profiles of P. kessleri under nutrient starvation (-N, -P and -S) were obtained using gas chromatography-mass spectrometry (GC-MS). Relative quantities of more than 100 metabolites were systematically compared across all three starvation conditions. Our results demonstrate that, in lipid metabolism, the quantities of neutral lipids increased significantly, followed by decreases in other metabolites involved in photosynthesis, nitrogen assimilation, and related processes. In conclusion, the metabolomic and lipidomic profiles identified a few common metabolites, such as citric acid, valine and trehalose, that play a significant role in the overproduction of oil by this microalga under nutrient deprivation. Understanding the entire system through untargeted metabolome profiling will lead to the identification of relevant metabolites involved in the biosynthesis and degradation of precursor molecules with potential for biofuel production, aiming towards the vision of tomorrow's bioenergy needs.

Keywords: algae, biofuels, nutrient stress, omics

Procedia PDF Downloads 269
24686 Apoptosis Pathway Targeted by Thymoquinone in MCF7 Breast Cancer Cell Line

Authors: M. Marjaneh, M. Y. Narazah, H. Shahrul

Abstract:

Array-based gene expression analysis is a powerful tool for profiling the expression of genes and generating information on the therapeutic effects of new anti-cancer compounds. The apoptotic effect of thymoquinone was studied in the MCF7 breast cancer cell line using gene expression profiling with a cDNA microarray. The purity and yield of the RNA samples were determined using the RNeasyPlus Mini kit, and the Agilent RNA 6000 Nano LabChip kit was used to evaluate the quantity of the RNA samples. An AffinityScript RT oligo-dT promoter primer was used to generate cDNA strands, and T7 RNA polymerase converted the cDNA to cRNA. The cRNA samples and human universal reference RNA were labelled with Cy-3-CTP and Cy-5-CTP, respectively. Feature Extraction and GeneSpring software were used to analyze the data. The single-experiment analysis revealed the involvement of 64 pathways with up-regulated genes and 78 pathways with down-regulated genes. The MAPK and p38-MAPK pathways were inhibited due to the up-regulation of the PTPRR gene, and the inhibition of p38-MAPK suggested up-regulation of the TGF-ß pathway. Inhibition of p38-MAPK caused up-regulation of TP53 and down-regulation of Bcl2, indicating involvement of the intrinsic apoptotic pathway. Down-regulation of the CARD16 gene, an adaptor molecule regulating CASP1, suggested necrosis-like programmed cell death and the involvement of caspases in apoptosis. Furthermore, down-regulation of the GPCR and EGF-EGFR signalling pathways suggested a reduction of ER. Involvement of the AhR pathway, which controls the cytochrome P450 and glucuronidation pathways, indicated the metabolism of thymoquinone. The findings showed differential expression of several genes in apoptosis pathways upon thymoquinone treatment in estrogen receptor-positive breast cancer cells.

Keywords: cDNA microarray, thymoquinone, CARD16, PTPRR, CASP10

Procedia PDF Downloads 343
24685 Secured Embedding of Patient’s Confidential Data in Electrocardiogram Using Chaotic Maps

Authors: Butta Singh

Abstract:

This paper presents a chaotic-map-based approach for secure embedding of a patient's confidential data in an electrocardiogram (ECG) signal. The chaotic map generates predefined locations through the use of selective control parameters, and a sample-value-difference method effectively hides the confidential data in ECG sample pairs at these predefined locations. Evaluation of the proposed method on all 48 records of the MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of the cover ECG. The imperceptibility of the secret data in the stego-ECG is evident through various statistical and clinical performance measures; the statistical metrics comprise the Percentage Root Mean Square Difference (PRD) and the Peak Signal-to-Noise Ratio (PSNR). A comparative analysis between the proposed method and existing approaches was also performed, and the results clearly demonstrated the superiority of the proposed method.
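
A hedged sketch of the location-selection step described above: a logistic map driven by secret control parameters yields pseudo-random, non-repeating sample indices. The specific map, its parameters, and the mapping to indices are assumptions; the sample-value-difference embedding itself is not reproduced.

```python
def logistic_locations(n_bits, n_samples, x0=0.7, r=3.99):
    """Generate n_bits distinct ECG sample indices from a logistic map;
    (x0, r) act as the secret control parameters."""
    x, locations, seen = x0, [], set()
    while len(locations) < n_bits:
        x = r * x * (1.0 - x)                # logistic map iteration
        idx = int(x * (n_samples - 2))       # chaotic state -> sample index
        if idx not in seen:                  # keep the sample pairs distinct
            seen.add(idx)
            locations.append(idx)
    return locations

print(logistic_locations(8, 1000))           # embedding positions for 8 bits
```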

Keywords: chaotic maps, ECG steganography, data embedding, electrocardiogram

Procedia PDF Downloads 184
24684 Detection Efficient Enterprises via Data Envelopment Analysis

Authors: S. Turkan

Abstract:

In this paper, Turkey's Top 500 Industrial Enterprises data for 2014 were analyzed by data envelopment analysis (DEA). DEA is used to detect efficient decision-making units, such as universities, hospitals or schools, on the basis of their inputs and outputs. The decision-making units in this study are enterprises. To detect efficient enterprises, financial ratios related to the productivity of the enterprises were determined as inputs and outputs. The efficient enterprises with foreign-weighted owned capital were detected via the super-efficiency model. According to the results, Mercedes-Benz is the most efficient such enterprise in Turkey.
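
For background, the standard input-oriented CCR efficiency score that DEA builds on can be computed as a small linear program; the sketch below uses toy enterprise data and omits the super-efficiency variant (which simply excludes the evaluated unit from the reference set).

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 3 enterprises, 2 inputs (e.g. assets, employees), 1 output
# (e.g. sales). All numbers are illustrative, not from the Top 500 dataset.
X = np.array([[20.0, 300], [35, 260], [40, 100]])   # n units x m inputs
Y = np.array([[1000.0], [800], [500]])              # n units x s outputs

def ccr_efficiency(k):
    """Input-oriented CCR score of unit k: min theta s.t. a reference mix
    produces >= y_k using <= theta * x_k (theta and lambdas >= 0)."""
    n = len(X)
    c = np.r_[1.0, np.zeros(n)]                     # minimize theta
    A_in = np.c_[-X[k], X.T]                        # X^T @ lam - theta*x_k <= 0
    A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]       # -Y^T @ lam <= -y_k
    res = linprog(c, A_ub=np.r_[A_in, A_out],
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[k]])
    return res.fun

for k in range(len(X)):
    print(k, round(ccr_efficiency(k), 3))           # unit 1 scores below 1 here
```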

Keywords: data envelopment analysis, super efficiency, logistic regression, financial ratios

Procedia PDF Downloads 318
24683 Study of the Toxic Activity of the Entomopathogenic Fungus Beauveria bassiana on the Wistar Rat Rattus norvegicus

Authors: F. Haddadj, S. Hamdi, A. Milla, S. Zenia, A. Smai, H. Saadi, F. Marniche, B. Doumandji-Mitiche

Abstract:

The use of a microorganism-based biopesticide on a large scale requires particular care, including safety with respect to the useful auxiliary fauna and to mammals, human beings among them. Due to its persistence in soil and its apparent human and animal safety, Beauveria bassiana is a cryptogam used for controlling pest organisms, particularly locusts, where its effectiveness has been proven. This fungus is also favored out of greater respect for biotic communities and the environment. Indeed, biopesticides have several environmental benefits: biodegradability, an activity and selectivity that decrease unintended effects on non-target species, and decreased resistance in some of them. It is in this sense that we contribute by presenting our work on the safety of B. bassiana with respect to mammals. For this purpose, we conducted a toxicological study of this fungal strain on the Wistar rat Rattus norvegicus, examining first its effect on weight gain and then performing histology on the target organ, the liver. After 20 days of treatment, the results of the toxicological studies showed that B. bassiana caused no change in the physiological state of the rats, their weight gain, behavior or diet, and histological sections of the liver revealed no disturbance of the organ.

Keywords: B. bassiana, entomopathogenic fungus, histology, Rattus norvegicus

Procedia PDF Downloads 235
24682 Imprecise Vowel Articulation in Down Syndrome: An Acoustic Study

Authors: Anitha Naittee Abraham, N. Sreedevi

Abstract:

Individuals with Down syndrome (DS) have relatively better expressive language compared to other individuals with intellectual disabilities, but reduced speech intelligibility is one of the major concerns for this group due to their anatomical and physiological differences. The study investigated the vowel articulation of Malayalam-speaking children with DS in the age range of 5-10 years. The vowel production of 10 children with DS was compared with that of typically developing children in the same age range. Vowels were extracted from 3 words with the corner vowels /a/, /i/ and /u/ in word-initial position, using Praat (version 5.3.23) software. The acoustic analysis was based on the vowel space area (VSA), the formant centralization ratio (FCR) and F2i/F2u. The findings revealed increased formant values for the control group, except for F2a and F2u. The experimental group also had higher FCR and lower VSA and F2i/F2u values, suggestive of imprecise vowel articulation due to restricted tongue movements. The results of the independent t-test revealed a significant difference in F1a, F2i, F2u, VSA, FCR and F2i/F2u values between the experimental and control groups. These findings support the fact that children with DS have imprecise vowel articulation that interferes with overall speech intelligibility. Hence, it is essential to target oromotor skills to enhance speech intelligibility, which in turn benefits these individuals in the social and vocational domains.
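
For reference, the two metrics can be computed directly from the corner-vowel formants; the sketch below assumes the standard triangle-area VSA and the Sapir et al. formant centralization ratio, with illustrative adult-like formant values rather than the study's data.

```python
def vsa(f1a, f2a, f1i, f2i, f1u, f2u):
    """Area of the /a/-/i/-/u/ triangle in F1-F2 space (Hz^2)."""
    return abs(f1i * (f2a - f2u) + f1a * (f2u - f2i) + f1u * (f2i - f2a)) / 2

def fcr(f1a, f2a, f1i, f2i, f1u, f2u):
    """Formant centralization ratio; higher = more centralized vowels."""
    return (f2u + f2a + f1i + f1u) / (f2i + f1a)

# illustrative adult-like formant values (Hz), not the study's measurements
print(vsa(730, 1090, 270, 2290, 300, 870))  # ~308,600 Hz^2
print(fcr(730, 1090, 270, 2290, 300, 870))  # ~0.84
```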

Keywords: Down syndrome, FCR, vowel articulation, vowel space

Procedia PDF Downloads 174
24681 Analysis of Rural Roads in Developing Countries Using Principal Component Analysis and Simple Average Technique in the Development of a Road Safety Performance Index

Authors: Muhammad Tufail, Jawad Hussain, Hammad Hussain, Imran Hafeez, Naveed Ahmad

Abstract:

A road safety performance index is a composite index that combines various indicators of road safety into a single number. Developing such an index from appropriate safety performance indicators is essential for enhancing road safety; however, road safety performance indices have not been given as much priority in developing countries as needed. The primary objective of this research is to develop a general Road Safety Performance Index (RSPI) for developing countries based on both road facilities and the behavior of road users. The secondary objectives include finding the critical inputs to the RSPI and identifying the better method of constructing the index. In this study, the RSPI is developed from four main safety performance indicators: protective systems (seat belt, helmet, etc.), the road (road width, signalized intersections, number of lanes, speed limit), the number of pedestrians, and the number of vehicles. Data on these four indicators were collected through an observation survey on a 20 km section of the National Highway N-125 near Taxila, Pakistan. For the development of the composite index, two methods are used: a) Principal Component Analysis (PCA) and b) the Equal Weighting (EW) method. PCA is used for extraction, weighting, and linear aggregation of the indicators to obtain a single value; an individual index score is calculated for each road section by multiplying the weights by the standardized values of each safety performance indicator. The Simple Average technique, in turn, is used for equal weighting and linear aggregation of the indicators. The road sections are ranked according to their RSPI scores under both methods. Comparing the two weighting methods, the PCA method is found to be much more reliable than the Simple Average technique.
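
A hedged sketch of the PCA weighting scheme described above: standardize the indicators per road section, weight them by the first component's loadings, and aggregate into one score, alongside the equal-weighting counterpart. The data are synthetic, not the N-125 survey values.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 4))            # 20 road sections x 4 indicators

Z = StandardScaler().fit_transform(X)   # standardize each indicator
pca = PCA(n_components=1).fit(Z)
weights = np.abs(pca.components_[0])
weights /= weights.sum()                # loadings -> normalized weights

rspi_pca = Z @ weights                  # PCA-weighted index per section
rspi_avg = Z.mean(axis=1)               # equal-weighting counterpart
print(np.argsort(-rspi_pca))            # sections ranked by PCA index score
```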

Keywords: indicators, aggregation, principle component analysis, weighting, index score

Procedia PDF Downloads 147
24680 Intelligent Process Data Mining for Monitoring for Fault-Free Operation of Industrial Processes

Authors: Hyun-Woo Cho

Abstract:

Real-time fault monitoring and diagnosis of large-scale production processes is helpful and necessary for operating industrial processes safely and efficiently while producing good final product quality. Unusual and abnormal events may have a serious impact on the process, such as malfunctions or breakdowns. This work utilizes process measurement data obtained on an on-line basis for the safe, fault-free operation of industrial processes. To this end, the proposed intelligent process data monitoring framework was evaluated on a simulated process. The monitoring scheme extracts the fault pattern in a reduced space for reliable data representation. The work compares linear and nonlinear techniques for the monitoring task and shows that the nonlinear technique produced more reliable monitoring results and outperforms linear methods. The adoption of the qualitative monitoring model also helps reduce the sensitivity of the fault pattern to noise.
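
One common reading of such reduced-space monitoring is a reconstruction-error (SPE-style) chart; the sketch below contrasts a linear and a nonlinear projection on synthetic data. The paper does not name its techniques, so both the models and the threshold here are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)
t = rng.uniform(-1, 1, (500, 2))
normal = np.c_[t, t[:, :1] ** 2] + rng.normal(0, 0.02, (500, 3))  # curved manifold
test = np.r_[normal[:3],                                    # 3 normal samples
             np.array([[0.5, 0.5, 2.0], [-0.8, 0.2, 1.5]])]  # 2 faulty samples

for model in (PCA(n_components=2),
              KernelPCA(n_components=2, kernel="rbf", fit_inverse_transform=True)):
    model.fit(normal)
    def spe(x, m=model):   # squared reconstruction error per sample
        return np.sum((x - m.inverse_transform(m.transform(x))) ** 2, axis=1)
    limit = np.percentile(spe(normal), 99)      # control limit from normal data
    print(type(model).__name__, spe(test) > limit)  # faults should read True
```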

Keywords: process data, data mining, process operation, real-time monitoring

Procedia PDF Downloads 630
24679 Urban Land Cover from GF-2 Satellite Images Using Object Based and Neural Network Classifications

Authors: Lamyaa Gamal El-Deen Taha, Ashraf Sharawi

Abstract:

China launched the satellite GF-2 in 2014. This study compares nearest neighbor object-based classification and neural network classification for the classification of the fused GF-2 image. First, rectification of the GF-2 image was performed; the two classification methods were then applied to the fused image and compared, and the overall classification accuracy and kappa index were calculated. The results indicate that nearest neighbor object-based classification is better than neural network classification for urban mapping.
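
The accuracy assessment step mentioned above is typically computed from reference versus classified labels, as sketched below; the labels here are synthetic placeholders, not the study's samples.

```python
import numpy as np
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(0)
reference = rng.integers(0, 4, 200)               # 4 urban land-cover classes
classified = np.where(rng.random(200) < 0.85,     # ~85 % simulated agreement
                      reference, rng.integers(0, 4, 200))

print("overall accuracy:", accuracy_score(reference, classified))
print("kappa index:", round(cohen_kappa_score(reference, classified), 3))
```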

Keywords: GF-2 images, feature extraction-rectification, nearest neighbour object based classification, segmentation algorithms, neural network classification, multilayer perceptron

Procedia PDF Downloads 383