Search results for: spatial and temporal data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26825

24305 Early Childhood Education: Teachers Ability to Assess

Authors: Ade Dwi Utami

Abstract:

Pedagogic competence is the basic competence teachers need to perform their tasks as educators. The ability to assess has become one of the demands within teachers' pedagogic competence, and it is related to curriculum instructions and applications. This research is aimed at obtaining data concerning teachers' ability to assess, comprising understanding assessment, determining assessment type, tools and procedure, conducting the assessment process, and using assessment result information. It uses an explanatory mixed-method design in which qualitative data are used to verify the quantitative data obtained through a survey. Quantitative data were collected by test, whereas qualitative data were collected by observation, interview and documentation. The analyzed data were then processed through a proportion study technique and categorized into high, medium and low. The results show that teachers' ability to assess can be grouped into three categories: high (2%), medium (4%) and low (94%). The data show that teachers' ability to assess is still relatively low: teachers lack knowledge and comprehension in assessment application. This is verified by the qualitative data, which show that teachers did not state which aspect was assessed in learning, did not record children's behavior, and did not use the resulting data as a consideration in designing a program. Teachers have assessment documents, yet these serve only as a means of completing teachers' administration for the certification program; thus, assessment documents were not used on the basis of acquired knowledge. This condition should prompt teacher-education institutions and the government to improve teachers' pedagogic competence, including the ability to assess.

Keywords: assessment, early childhood education, pedagogic competence, teachers

Procedia PDF Downloads 246
24304 Landcover Mapping Using Lidar Data and Aerial Image and Soil Fertility Degradation Assessment for Rice Production Area in Quezon, Nueva Ecija, Philippines

Authors: Eliza E. Camaso, Guiller B. Damian, Miguelito F. Isip, Ronaldo T. Alberto

Abstract:

Land-cover maps are important for many scientific, ecological and land management purposes, and during the last decades a rapid decrease in soil fertility has been observed due to land use practices such as rice cultivation. High-precision land-cover maps, which are important for economic management, are not yet available in the area. Remote sensing is a very suitable tool for accurate land-cover mapping and automatic land use and cover detection. The study not only provides high-precision land-cover maps but also estimates the extent of rice production area that has undergone chemical degradation due to fertility decline. Land cover was delineated and classified into pre-defined classes to achieve proper detection of features. After generating the land-cover map of this area of intensive rice cultivation, a soil fertility degradation assessment of the rice production area was created to assess the impact on soils used in agricultural production. Using simple spatial analysis functions in ArcGIS, the land-cover map of the Municipality of Quezon in Nueva Ecija, Philippines was overlaid on the fertility decline maps from the Land Degradation Assessment Philippines - Bureau of Soils and Water Management (LADA-Philippines-BSWM) to determine the area of rice crops where nitrogen, phosphorus, zinc and sulfur deficiencies were most likely induced by high dosages of urea and imbalanced N:P fertilization. The results show that 80.00% of the fallow area and 99.81% of the rice production area have high soil fertility decline.
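
As an illustration of the overlay step, the sketch below reproduces the idea with the open-source geopandas library rather than ArcGIS; the file names, column names ("class", "decline") and projection (UTM zone 51N) are assumptions, not the study's actual data.

```python
# Minimal sketch (not the authors' ArcGIS workflow) of overlaying a
# land-cover map with a fertility-decline map using geopandas.
import geopandas as gpd

landcover = gpd.read_file("quezon_landcover.shp").to_crs(epsg=32651)
fertility = gpd.read_file("lada_bswm_fertility_decline.shp").to_crs(epsg=32651)

# Intersect the two layers so each polygon carries both attributes
overlay = gpd.overlay(landcover, fertility, how="intersection")

# Share of each land-cover class falling in the "high decline" zone
overlay["area_ha"] = overlay.geometry.area / 10_000
high = overlay[overlay["decline"] == "high"]
share = (high.groupby("class")["area_ha"].sum()
         / overlay.groupby("class")["area_ha"].sum() * 100)
print(share)  # compare with the 80.00% / 99.81% figures reported above
```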

Keywords: aerial image, landcover, LiDAR, soil fertility degradation

Procedia PDF Downloads 252
24303 Statistical Analysis for Overdispersed Medical Count Data

Authors: Y. N. Phang, E. F. Loh

Abstract:

Many researchers have suggested the use of zero inflated Poisson (ZIP) and zero inflated negative binomial (ZINB) models in modeling over-dispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. These studies indicate that ZIP and ZINB always provide a better fit than the standard Poisson and negative binomial models in modeling over-dispersed medical count data. In this study, we propose the use of zero inflated inverse trinomial (ZIIT), zero inflated Poisson inverse Gaussian (ZIPIG) and zero inflated strict arcsine (ZISA) models in modeling over-dispersed medical count data. These proposed models are not widely used by researchers, especially in the medical field. The results show that the three suggested models can serve as alternatives in modeling over-dispersed medical count data, supported by their application to a real-life medical data set. Inverse trinomial, Poisson inverse Gaussian, and strict arcsine are discrete distributions with a cubic variance function of the mean; therefore, ZIIT, ZIPIG and ZISA are able to accommodate data with excess zeros and very heavy tails. They are recommended for modeling over-dispersed medical count data when ZIP and ZINB are inadequate.
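
For context, the ZIP and ZINB baselines referred to above can be fit with statsmodels, as in the hedged sketch below; the proposed ZIIT, ZIPIG and ZISA models require custom likelihoods and are not shown, and the data here are simulated stand-ins.

```python
# Hedged sketch of the ZIP/ZINB baselines on simulated zero-inflated counts.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import (
    ZeroInflatedPoisson, ZeroInflatedNegativeBinomialP)

rng = np.random.default_rng(0)
X = sm.add_constant(rng.normal(size=(500, 1)))
y = rng.poisson(2, size=500) * (rng.random(500) > 0.4)  # extra zeros

zip_fit = ZeroInflatedPoisson(y, X).fit(disp=0)
zinb_fit = ZeroInflatedNegativeBinomialP(y, X).fit(disp=0)
print("ZIP AIC:", zip_fit.aic, " ZINB AIC:", zinb_fit.aic)
# Lower AIC (plus a Pearson goodness-of-fit check) guides model choice.
```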

Keywords: zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit

Procedia PDF Downloads 542
24302 Monotone Rational Trigonometric Interpolation

Authors: Uzma Bashir, Jamaludin Md. Ali

Abstract:

This study is concerned with the visualization of monotone data using a piecewise C1 rational trigonometric interpolating scheme. Four positive shape parameters are incorporated in the structure of the rational trigonometric spline. Conditions on two of these parameters are derived to attain monotonicity of monotone data, while the other two are left free. Figures demonstrate that the proposed scheme produces graphically smooth monotone curves.
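
The authors' rational trigonometric spline is not publicly available, so the short sketch below illustrates the same shape-preserving goal with a standard C1 monotone interpolant (SciPy's PCHIP) on hypothetical monotone data.

```python
# A standard C1 shape-preserving interpolant as a point of comparison;
# this is NOT the paper's rational trigonometric scheme.
import numpy as np
from scipy.interpolate import PchipInterpolator

x = np.array([0.0, 1.0, 2.0, 5.0, 6.0])
y = np.array([0.0, 0.1, 0.5, 0.9, 1.0])   # monotone data

f = PchipInterpolator(x, y)
xs = np.linspace(0, 6, 200)
assert np.all(np.diff(f(xs)) >= -1e-12)   # monotonicity is preserved
```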

Keywords: trigonometric splines, monotone data, shape preserving, C1 monotone interpolant

Procedia PDF Downloads 271
24301 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels

Authors: Joshua Buli, David Pietrowski, Samuel Britton

Abstract:

Processing SAR data usually requires constraints on extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground plane projection, with or without terrain as a component, to better view SAR data in an image domain comparable to what a human would view and to ease interpretation. An alternate but computationally heavy method that makes use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched-filtering, motion compensation, etc.), the data is then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time history data for the reflectivity values for each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing have finally allowed this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase history data size and 3D point cloud size. Backprojection processing algorithms are embarrassingly parallel since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for accurate reflectivity representation of a scene. Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
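
A minimal NumPy stand-in for the per-voxel backprojection kernel described above is sketched below; the array names, bin spacing and phase convention are assumptions, and a real GPU version would map the per-voxel work to individual threads.

```python
# Toy time-domain backprojection sketch (NumPy stand-in for a GPU kernel).
import numpy as np

def backproject(rc, ant_pos, pts, r0, dr, wavelength):
    """rc: (n_pulses, n_bins) complex range-compressed pulses;
    ant_pos: (n_pulses, 3) antenna positions; pts: (n_pts, 3) voxels."""
    img = np.zeros(len(pts), dtype=complex)
    for p in range(rc.shape[0]):                      # one pulse at a time
        R = np.linalg.norm(pts - ant_pos[p], axis=1)  # slant range per voxel
        bins = ((R - r0) / dr).astype(int)            # nearest range bin
        ok = (bins >= 0) & (bins < rc.shape[1])
        # Remove the two-way propagation phase and accumulate
        img[ok] += rc[p, bins[ok]] * np.exp(1j * 4 * np.pi * R[ok] / wavelength)
    return img  # per-voxel complex reflectivity over the whole collection
```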

Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization

Procedia PDF Downloads 85
24300 Interplay of Power Management at Core and Server Level

Authors: Jörg Lenhardt, Wolfram Schiffmann, Jörg Keller

Abstract:

As the feature sizes of recent Complementary Metal Oxide Semiconductor (CMOS) devices decrease, the influence of static power increasingly dominates their energy consumption. Thus, power savings that benefit from Dynamic Voltage and Frequency Scaling (DVFS) are diminishing, and the temporary shutdown of cores or other microchip components becomes more worthwhile. A consequence of powering off unused parts of a chip is that the relative difference between idle and fully loaded power consumption is increased. That means future chips and whole server systems gain more power-saving potential through power-aware load balancing, whereas in former times this power-saving approach had only limited effect and thus was not widely adopted. While powering off complete servers has been used to save energy, it will be superfluous in many cases when cores can be powered down. An important advantage that comes with this is a largely reduced time to respond to increased computational demand. We include the above developments in a server power model and quantify the advantage. Our conclusion is that strategies from data centers on when to power off server systems might be used in the future at core level, while load balancing mechanisms previously used at core level might be used in the future at server level.
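
The flavor of such a server power model can be conveyed with a toy linear sketch like the one below; all wattages are illustrative assumptions, not measurements from the paper.

```python
# Back-of-the-envelope sketch of a linear server power model.
P_CORE_ACTIVE = 15.0   # W, fully loaded core (assumed)
P_CORE_IDLE   = 10.0   # W, idle but powered core: static power dominates
P_CORE_OFF    = 0.5    # W, power-gated core (assumed)
P_PLATFORM    = 60.0   # W, fans, memory, board (assumed)

def server_power(n_cores, n_busy, power_gating):
    idle = n_cores - n_busy
    p_idle = P_CORE_OFF if power_gating else P_CORE_IDLE
    return P_PLATFORM + n_busy * P_CORE_ACTIVE + idle * p_idle

# With power gating, the idle/full-load gap grows, so power-aware load
# balancing pays off much more than it used to:
print(server_power(16, 8, power_gating=False))  # 260.0 W
print(server_power(16, 8, power_gating=True))   # 184.0 W
```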

Keywords: power efficiency, static power consumption, dynamic power consumption, CMOS

Procedia PDF Downloads 221
24299 Integration of Knowledge and Metadata for Complex Data Warehouses and Big Data

Authors: Jean Christian Ralaivao, Fabrice Razafindraibe, Hasina Rakotonirainy

Abstract:

This document builds on work carried out in the field of complex data warehouses (DW) relating to the management and formalization of knowledge and metadata. It offers a methodological approach for integrating the two concepts, knowledge and metadata, within the framework of a complex DW architecture. The work considers the use of knowledge representation by description logics and the extension of the Common Warehouse Metamodel (CWM) specifications, which is expected to improve the performance of a complex DW. Three essential aspects of this work are the representation of knowledge in description logics, the translation of this knowledge into consistent UML diagrams while respecting or extending the CWM specifications, and the use of XML as a pivot. The field of application is large but will be adapted to systems with heterogeneous, complex and unstructured content that, moreover, require a great (re)use of knowledge, such as medical data warehouses.

Keywords: data warehouse, description logics, integration, knowledge, metadata

Procedia PDF Downloads 138
24298 Data Analytics in Energy Management

Authors: Sanjivrao Katakam, Thanumoorthi I., Antony Gerald, Ratan Kulkarni, Shaju Nair

Abstract:

With increasing energy costs and their impact on business, sustainability has evolved from a social expectation to an economic imperative. Therefore, finding methods to reduce cost has become a critical directive for industry leaders, and effective energy management is the only way to cut costs. However, energy management has been a challenge because it requires a change in old habits and legacy systems followed for decades. Today exorbitant volumes of energy and operational data are being captured and stored by industries, but they are unable to convert these structured and unstructured data sets into meaningful business intelligence. It must be noted that for quick decisions, organizations must learn to cope with large volumes of operational data in different formats. Energy analytics not only helps in extracting inferences from these data sets but is also instrumental in the transformation from old approaches to energy management to new ones, which in turn assists in effective decision making for implementation. Organizations require an established corporate strategy for reducing operational costs through visibility and optimization of energy usage, and energy analytics plays a key role in the optimization of operations. The paper describes how energy data analytics is extensively used today in different scenarios such as reducing operational costs, predicting energy demand, optimizing network efficiency, asset maintenance, improving customer insights and device data insights. The paper also highlights how analytics helps transform insights obtained from energy data into sustainable solutions. The paper utilizes data from an array of segments such as the retail, transportation, and water sectors.

Keywords: energy analytics, energy management, operational data, business intelligence, optimization

Procedia PDF Downloads 364
24297 The Extent of Big Data Analysis by the External Auditors

Authors: Iyad Ismail, Fathilatul Abdul Hamid

Abstract:

This research investigates the extent of big data analysis by external auditors. The paper adopts grounded theory as a framework for conducting a series of semi-structured interviews with eighteen external auditors. The findings describe the extent of big data availability and of big data analysis usage by external auditors in the Gaza Strip, Palestine. The study's outcomes lead to a series of auditing procedures for improving external auditing techniques, which in turn leads to a high-quality audit process. The research is also valuable for auditing firms, giving insight into their mechanisms and identifying the most important strategies that help achieve competitive audit quality. The results aim to guide academic and professional auditing institutions in developing big data analysis techniques for external auditors. The paper provides appropriate information for the decision-making process and a source of future information affecting technological auditing.

Keywords: big data analysis, external auditors, audit reliance, internal audit function

Procedia PDF Downloads 70
24296 A Model of Teacher Leadership in History Instruction

Authors: Poramatdha Chutimant

Abstract:

The objective of the research was to propose a model of teacher leadership in history instruction for utilization. Everett M. Rogers' Diffusion of Innovations Theory is applied as the theoretical framework. A qualitative method is used in the study, with an interview protocol as the instrument to collect primary data from best-practice teachers recognized by the Office of the National Education Commission (ONEC). Open-ended questions are used in the interview protocol in order to gather varied data. Information on the international context of history instruction serves as secondary data to support the summarizing process (content analysis). A dendrogram is the key to interpreting and synthesizing the primary data, with the secondary data serving as a supportive source for explanation and elaboration. In-depth interviews are used to collect information from seven experts in the educational field. The focal point is finally to validate the draft model in terms of future utilization.

Keywords: history study, nationalism, patriotism, responsible citizenship, teacher leadership

Procedia PDF Downloads 279
24295 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation

Authors: Mohammad Anwar, Shah Waliullah

Abstract:

This study investigated panel data regression models. The paper used Bayesian and classical methods to study the impact of institutions on economic growth using data from 1990-2014, especially in developing countries. Under both the classical and Bayesian methodologies, two panel data models were estimated: common effects and fixed effects. For the Bayesian approach, prior information is used in this paper, with a normal-gamma prior for the panel data models. The analysis was done with the WinBUGS14 software. The estimated results of the study showed that panel data models are valid models in the Bayesian methodology. In the Bayesian approach, all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all models, the fixed effect model is the best model in the Bayesian estimation of panel data models, as it proved to have the lowest standard error compared to the other models.
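
A hedged sketch of a comparable Bayesian fixed-effects panel model with a normal-gamma prior is shown below in PyMC (the study itself used WinBUGS14); the country count, time span and data are simulated stand-ins.

```python
# Hedged PyMC sketch: fixed effects via per-country intercepts, normal
# coefficient priors and a gamma prior on the error precision.
import numpy as np
import pymc as pm
import arviz as az

n_countries, T, k = 30, 25, 2
idx = np.repeat(np.arange(n_countries), T)
X = np.random.default_rng(0).normal(size=(n_countries * T, k))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1]
     + np.random.default_rng(1).normal(size=n_countries * T))

with pm.Model():
    alpha = pm.Normal("alpha", 0, 10, shape=n_countries)  # fixed effects
    beta = pm.Normal("beta", 0, 10, shape=k)
    tau = pm.Gamma("tau", 2.0, 0.5)                       # error precision
    mu = alpha[idx] + X @ beta
    pm.Normal("y", mu=mu, sigma=1 / pm.math.sqrt(tau), observed=y)
    trace = pm.sample(1000, tune=1000, progressbar=False)

print(az.summary(trace, var_names=["beta"]))  # posterior means and sds
```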

Keywords: Bayesian approach, common effect, fixed effect, random effect, Dynamic Random Effect Model

Procedia PDF Downloads 68
24294 Diagnosis of the Heart Rhythm Disorders by Using Hybrid Classifiers

Authors: Sule Yucelbas, Gulay Tezel, Cuneyt Yucelbas, Seral Ozsen

Abstract:

In this study, we attempt to identify some heart rhythm disorders from electrocardiography (ECG) data taken from the MIT-BIH arrhythmia database by extracting the required features and presenting them to artificial neural network (ANN), artificial immune system (AIS), artificial neural network based on artificial immune system (AIS-ANN) and particle swarm optimization based artificial neural network (PSO-ANN) classifier systems. The main purpose of this study is to evaluate the performance of the hybrid AIS-ANN and PSO-ANN classifiers with regard to ANN and AIS. For this purpose, RR-interval data were obtained for normal sinus rhythm (NSR), atrial premature contraction (APC), sinus arrhythmia (SA), ventricular trigeminy (VTI), ventricular tachycardia (VTK) and atrial fibrillation (AF). These data were arranged in pairs (NSR-APC, NSR-SA, NSR-VTI, NSR-VTK and NSR-AF), the discrete wavelet transform was applied to each of the two groups in a pair, and two different data sets with 9 and 27 features were obtained from each pair after data reduction. Afterwards, the data were first shuffled within themselves, and then the 4-fold cross validation method was applied to create the training and testing data. The training and testing accuracy rates and training times were compared with each other. As a result, the performance of the hybrid classification systems, AIS-ANN and PSO-ANN, was seen to be close to the performance of the ANN system, and the results of the hybrid systems were much better than AIS. However, ANN had a much shorter training time than the other systems; in terms of training times, ANN was followed by PSO-ANN, AIS-ANN and AIS, respectively. Also, the features extracted from the data affected the classification results significantly.
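
A simplified stand-in for the feature pipeline, wavelet features from RR-interval segments fed to a 4-fold cross-validated neural network, is sketched below; the AIS- and PSO-trained hybrids are not reproduced, and the data are simulated.

```python
# Sketch of DWT feature extraction plus a 4-fold cross-validated ANN.
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

def dwt_features(segment, wavelet="db4", level=3):
    coeffs = pywt.wavedec(segment, wavelet, level=level)
    # Summary statistics per sub-band as reduced features
    return np.array([f(c) for c in coeffs for f in (np.mean, np.std, np.max)])

rng = np.random.default_rng(1)
rr_segments = rng.normal(0.8, 0.1, size=(200, 64))   # stand-in RR data
labels = rng.integers(0, 2, size=200)                # e.g. NSR vs APC

X = np.array([dwt_features(s) for s in rr_segments])
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
print(cross_val_score(clf, X, labels, cv=4).mean())  # 4-fold CV accuracy
```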

Keywords: AIS, ANN, ECG, hybrid classifiers, PSO

Procedia PDF Downloads 442
24293 The Clinical Characteristics and Their Relationship with Sleep Disorders in Patients with Parkinson Disease Accompanied with Cognitive Impairment

Authors: Peng Guo

Abstract:

Objective: To investigate the clinical characteristics and video-polysomnography (v-PSG) changes in Parkinson disease (PD) patients with cognitive impairment (CI). Methods: Three hundred and ninety-four patients with PD were enrolled in Beijing Tiantan Hospital and, according to CI level, divided into PD without cognitive impairment (PD-NCI), PD with mild cognitive impairment (PD-MCI), and PD with dementia (PDD) groups. Patients' demographic data were collected, including gender, onset age, education level and duration. Cognitive function was evaluated by the Montreal Cognitive Assessment (MoCA) scale, and the overall cognitive function and cognitive domains of the three groups were compared. v-PSG was used to assess the sleep status of patients, and correlations between the MoCA scale and v-PSG results were analyzed in the PD-CI group. Results: 1. Of 394 cases of PD, 94 cases (23.86%) were in the PD-NCI group, 177 cases (44.92%) in the PD-MCI group, and 123 cases (31.22%) in the PDD group. 2. There was no significant difference in gender, age of onset, education level or duration among the PD-NCI, PD-MCI and PDD groups (P>0.05). 3. The total MoCA score decreased progressively across the PD-NCI, PD-MCI and PDD groups, and the scores of each cognitive domain in the MoCA scale also decreased significantly (P<0.05). 4. Compared with the PD-MCI group, the PDD group had lower total sleep time and lower sleep efficiency (P<0.05); compared with the PD-NCI group, the PDD group likewise had lower total sleep time and lower sleep efficiency (P<0.05). 5. The sleep efficiency of PD-CI patients was positively correlated with the total MoCA score and with the visual-spatial function, executive function, delayed recall and attention scores (P<0.05). Conclusions: The incidence of CI in PD patients was high; the cognitive function and cognitive domains of PD-CI patients were significantly impaired; and in patients with PD-CI, total sleep time and sleep efficiency decreased in relation to overall cognitive function and partial cognitive impairment.

Keywords: Parkinson disease, cognitive impairment, clinical characteristics, sleep disorders, video-polysomnography

Procedia PDF Downloads 30
24292 Topic Modelling Using Latent Dirichlet Allocation and Latent Semantic Indexing on SA Telco Twitter Data

Authors: Phumelele Kubheka, Pius Owolawi, Gbolahan Aiyetoro

Abstract:

Twitter is one of the most popular social media platforms, where users can share their opinions on different subjects. As of 2010, the Twitter platform generated more than 12 terabytes of data daily, roughly 4.3 petabytes in a single year, which makes Twitter a great source for big data mining. Many industries, such as telecommunication companies, can leverage the availability of Twitter data to better understand their markets and make appropriate business decisions. This study performs topic modeling on Twitter data using Latent Dirichlet Allocation (LDA), and the obtained results are benchmarked against another topic modeling technique, Latent Semantic Indexing (LSI). The study aims to retrieve topics from a Twitter dataset containing user tweets on South African Telcos. Results from this study show that LSI is much faster than LDA; however, LDA yields better results, with topic coherence higher by 8% for the best-performing model (Table 1). A higher topic coherence score indicates better performance of the model.
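
A minimal gensim sketch of the LDA-versus-LSI comparison is given below; the four-tweet corpus is a toy stand-in for the Telco dataset, and u_mass coherence is used here simply because it runs on such a tiny corpus.

```python
# Toy gensim sketch; `tweets` stands in for the tokenized Telco corpus.
from gensim.corpora import Dictionary
from gensim.models import CoherenceModel, LdaModel, LsiModel

tweets = [["network", "slow", "data"], ["great", "signal", "upgrade"],
          ["billing", "error", "support"], ["data", "bundle", "expensive"]]

dictionary = Dictionary(tweets)
corpus = [dictionary.doc2bow(t) for t in tweets]

lda = LdaModel(corpus, num_topics=2, id2word=dictionary, random_state=0)
lsi = LsiModel(corpus, num_topics=2, id2word=dictionary)

# u_mass topic coherence: higher (closer to 0) is better
for name, model in (("LDA", lda), ("LSI", lsi)):
    cm = CoherenceModel(model=model, corpus=corpus, coherence="u_mass")
    print(name, cm.get_coherence())
```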

Keywords: big data, latent Dirichlet allocation, latent semantic indexing, telco, topic modeling, twitter

Procedia PDF Downloads 150
24291 Assessing the Material Determinants of Cavity Polariton Relaxation using Angle-Resolved Photoluminescence Excitation Spectroscopy

Authors: Elizabeth O. Odewale, Sachithra T. Wanasinghe, Aaron S. Rury

Abstract:

Cavity polaritons form when molecular excitons strongly couple to photons in carefully constructed optical cavities. These polaritons, which are hybrid light-matter states possessing a unique combination of photonic and excitonic properties, present the opportunity to manipulate the properties of various semiconductor materials. The systematic manipulation of materials through polariton formation could potentially improve the functionalities of many optoelectronic devices such as lasers, light-emitting diodes, photon-based quantum computers, and solar cells. However, the prospects of leveraging polariton formation for novel devices and device operation depend on more complete connections between the properties of molecular chromophores, and the hybrid light-matter states they form, which remains an outstanding scientific goal. Specifically, for most optoelectronic applications, it is paramount to understand how polariton formation affects the spectra of light absorbed by molecules coupled strongly to cavity photons. An essential feature of a polariton state is its dispersive energy, which occurs due to the enhanced spatial delocalization of the polaritons relative to bare molecules. To leverage the spatial delocalization of cavity polaritons, angle-resolved photoluminescence excitation spectroscopy was employed in characterizing light emission from the polaritonic states. Using lasers of appropriate energies, the polariton branches were resonantly excited to understand how molecular light absorption changes under different strong light-matter coupling conditions. Since an excited state has a finite lifetime, the photon absorbed by the polariton decays non-radiatively into lower-lying molecular states, from which radiative relaxation to the ground state occurs. The resulting fluorescence is collected across several angles of excitation incidence. By modeling the behavior of the light emission observed from the lower-lying molecular state and combining this result with the output of angle-resolved transmission measurements, inferences are drawn about how the behavior of molecules changes when they form polaritons. These results show how the intrinsic properties of molecules, such as the excitonic lifetime, affect the rate at which the polaritonic states relax. While it is true that the lifetime of the photon mediates the rate of relaxation in a cavity, the results from this study provide evidence that the lifetime of the molecular exciton also limits the rate of polariton relaxation.

Keywords: fluorescence, molecules in cavities, optical cavity, photoluminescence excitation, spectroscopy, strong coupling

Procedia PDF Downloads 73
24290 Enhance the Power of Sentiment Analysis

Authors: Yu Zhang, Pedro Desouza

Abstract:

Since big data has become substantially more accessible and manageable due to the development of powerful tools for dealing with unstructured data, people are eager to mine information from social media resources that could not be handled in the past. Sentiment analysis, as a novel branch of text mining, has in the last decade become increasingly important in marketing analysis, customer risk prediction and other fields. Scientists and researchers have undertaken significant work in creating and improving their sentiment models. In this paper, we present a concept of selecting appropriate classifiers based on the features and qualities of data sources by comparing the performances of five classifiers with three popular social media data sources: Twitter, Amazon Customer Reviews, and Movie Reviews. We introduced a couple of innovative models that outperform traditional sentiment classifiers for these data sources, and provide insights on how to further improve the predictive power of sentiment analysis. The modelling and testing work was done in R and Greenplum in-database analytic tools.
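
The paper's modelling was done in R and Greenplum; the Python sketch below merely illustrates the idea of benchmarking several classifiers on the same text source, with toy reviews standing in for the real data sets.

```python
# Toy scikit-learn sketch of benchmarking classifiers per data source.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = ["great phone, love it", "terrible battery life", "works fine",
         "awful customer service", "excellent value", "waste of money"]
labels = [1, 0, 1, 0, 1, 0]   # 1 = positive sentiment

for clf in (MultinomialNB(), LogisticRegression(max_iter=1000)):
    pipe = make_pipeline(TfidfVectorizer(), clf)
    acc = cross_val_score(pipe, texts, labels, cv=2).mean()
    print(type(clf).__name__, round(acc, 2))
# Per-source scores indicate which classifier suits which data source.
```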

Keywords: sentiment analysis, social media, Twitter, Amazon, data mining, machine learning, text mining

Procedia PDF Downloads 353
24289 Elements of Successful Commercial Streets: A Socio-Spatial Analysis of Commercial Streets in Cairo

Authors: Toka Aly

Abstract:

Historically, marketplaces were the most important nodes and focal points of cities, where different activities took place. Commercial streets offer more than just spaces for shopping; they also offer choices for social activities and cultural exchange, and they are considered the backbone of a city's vibrancy and vitality. Despite that, public life in Cairo's commercial streets has deteriorated, with shopping activities becoming reliant mainly on 'planned formal places', chiefly privatized or indoor spaces like shopping malls. The main aim of this paper is to explore the key elements and tools for assessing the successfulness of commercial streets in Cairo. The methodology is a multiple-case-study approach based on assessing and analyzing the physical and social elements of a historical and a contemporary commercial street in Cairo: El Muiz Street and Baghdad Street. Data collection is based on personal observations, photographs, maps and street sections. Findings indicate that the key factors for analyzing commercial streets are those affecting the sensory experience, those affecting social behavior, and the general aspects that attract people. Findings also indicate that urban features have a clear influence on pedestrian shopping activities in both streets. Moreover, for a commercial street to be successful, shopping patterns must provide people with a quality public space offering easy navigation and accessibility, good visual continuity, and well-designed urban features and social gathering spaces. Outcomes of this study provide a useful background for urban designers in analyzing and assessing the successfulness of commercial streets, and will also help in understanding the different physical and social patterns of vending activities taking place in Cairo.

Keywords: activities, commercial street, marketplace, successful, vending

Procedia PDF Downloads 302
24288 Real-Time Big-Data Warehouse a Next-Generation Enterprise Data Warehouse and Analysis Framework

Authors: Abbas Raza Ali

Abstract:

Big Data technology is gradually becoming a dire need of large enterprises, which generate massive amounts of off-line and streaming data in both structured and unstructured formats on a daily basis. It is challenging to effectively extract useful insights from such large-scale datasets; sometimes it even becomes a technology constraint to manage a transactional data history of more than a few months. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested on a communication service provider producing massively large complex streaming data in binary format. The communication industry is bound by regulators to manage the history of their subscribers' call records, where every action of a subscriber generates a record. Managing and analyzing transactional data also allows service providers to better understand their customers' behavior; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behaviour of subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated at subscriber level. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. The framework has been applied to offload the existing Intelligent Network Mediation and relational Data Warehouse of the service provider onto Big Data. The service provider has a 50+ million subscriber base with yearly growth of 7-10%. The end-to-end process takes no more than 10 minutes, which involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations belonging to a call (transformations), and aggregation of all the call records of a subscriber.
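
The aggregation stage of such a pipeline might look like the hedged PySpark sketch below; the schema, paths and column names are assumptions, and the binary decoding and stitching logic is provider-specific and omitted.

```python
# Hedged PySpark sketch of the per-call and per-subscriber aggregations.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cdr-offload").getOrCreate()

# ASCII call detail records, one row per interrogation of a call
cdr = spark.read.csv("hdfs:///mediation/decoded_cdrs/",
                     header=True, inferSchema=True)

# Stitch all interrogations of a call, then aggregate per subscriber
calls = (cdr.groupBy("call_id", "msisdn")
            .agg(F.sum("duration_s").alias("call_duration")))
usage = (calls.groupBy("msisdn")
              .agg(F.count("*").alias("n_calls"),
                   F.sum("call_duration").alias("total_duration")))
usage.write.mode("overwrite").parquet("hdfs:///dw/subscriber_usage/")
```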

Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation

Procedia PDF Downloads 175
24287 Programming with Grammars

Authors: Peter M. Maurer

Abstract:

DGL is a context free grammar-based tool for generating random data. Many types of simulator input data require some computation to be placed in the proper format. For example, it might be necessary to generate ordered triples in which the third element is the sum of the first two elements, or it might be necessary to generate random numbers in some sorted order. Although DGL is universal in computational power, generating these types of data is extremely difficult. To overcome this problem, we have enhanced DGL to include features that permit direct computation within the structure of a context free grammar. The features have been implemented as special types of productions, preserving the context free flavor of DGL specifications.
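
DGL's concrete syntax is not reproduced here, but the plain-Python analogue below conveys the idea of grammar productions that compute as well as choose, using the two examples from the abstract.

```python
# Plain-Python analogue of "computation inside a grammar": productions
# may be ordinary alternatives or callables that compute values.
import random

def triple():
    # ordered triple whose third element is the sum of the first two,
    # the example from the abstract
    a, b = random.randint(0, 9), random.randint(0, 9)
    return f"({a}, {b}, {a + b})"

def sorted_numbers():
    # random numbers emitted in sorted order, the other example
    return " ".join(str(n) for n in sorted(random.sample(range(100), 3)))

# a "grammar" whose record production expands both computed nonterminals
grammar = {"record": lambda: f"{triple()} | {sorted_numbers()}"}
print(grammar["record"]())
```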

Keywords: DGL, Enhanced Context Free Grammars, Programming Constructs, Random Data Generation

Procedia PDF Downloads 147
24286 Two Dimensional Finite Element Model to Study Calcium Dynamics in Fibroblast Cell with Excess Buffer Approximation Involving ER Flux and SERCA Pump

Authors: Mansha Kotwani

Abstract:

Specific spatio-temporal calcium concentration patterns are required by fibroblasts to maintain their structure and functions; thus, calcium concentration is regulated in the cell at different levels during its various activities. The variations in cytosolic calcium concentration largely depend on the buffers present in the cytosol and on the influx of calcium into the cytosol from the ER through IP3 receptors (IP3Rs) or ryanodine receptors, followed by reuptake of calcium into the ER through the sarcoplasmic/endoplasmic reticulum calcium ATPase (SERCA) pump. In order to understand the mechanisms of wound repair, tissue remodeling and growth performed by fibroblasts, it is of crucial importance to understand how calcium concentration is regulated in fibroblasts. In this paper, a model has been developed to study calcium distribution in an NRK fibroblast in the presence of buffers and ER flux with the SERCA pump. The model has been developed for the two-dimensional unsteady-state case. Appropriate initial and boundary conditions have been framed in line with the physiology of the cell. The finite element technique has been employed to obtain the solution. The numerical results have been used to study the effect of buffers, ER flux and source amplitude on calcium distribution in the fibroblast cell.
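
For reference, a standard excess-buffer-approximation form of the governing equation, consistent with the abstract but not necessarily the paper's exact notation, is:

```latex
% EBA reaction-diffusion equation for cytosolic Ca2+ concentration C.
% [B]_inf: free buffer at rest, k^+: buffer on-rate, C_inf: resting Ca2+.
\frac{\partial C}{\partial t} \;=\; D_{\mathrm{Ca}} \nabla^{2} C
  \;-\; k^{+} [B]_{\infty}\,\bigl(C - C_{\infty}\bigr)
  \;+\; J_{\mathrm{IP_3R}} \;+\; J_{\mathrm{leak}}
  \;-\; \underbrace{\frac{V_{\max}\, C^{2}}{K_{m}^{2} + C^{2}}}_{J_{\mathrm{SERCA}}}
```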

Keywords: buffers, IP3R, ER flux, SERCA pump, source amplitude

Procedia PDF Downloads 243
24285 A Model Architecture Transformation with Approach by Modeling: From UML to Multidimensional Schemas of Data Warehouses

Authors: Ouzayr Rabhi, Ibtissam Arrassen

Abstract:

To provide a complete analysis of the organization and to help decision-making, leaders need relevant data; data warehouses (DW) are designed to meet such needs. However, designing a DW is not trivial, and there is no formal method to derive a multidimensional schema from heterogeneous databases. In this article, we present a model-driven approach to the design of data warehouses. We describe a multidimensional meta-model and specify a set of transformations starting from a Unified Modeling Language (UML) metamodel. In this approach, the UML metamodel and the multidimensional one are both considered platform-independent models (PIM). The first meta-model is mapped into the second through transformation rules carried out in the Query/View/Transformation (QVT) language. The proposal is validated by applying our approach to generate a multidimensional schema of a Balanced Scorecard (BSC) DW. We are interested in the BSC perspectives, which are highly linked to the vision and strategies of an organization.

Keywords: data warehouse, meta-model, model-driven architecture, transformation, UML

Procedia PDF Downloads 160
24284 Deep Supervision Based-Unet to Detect Buildings Changes from VHR Aerial Imagery

Authors: Shimaa Holail, Tamer Saleh, Xiongwu Xiao

Abstract:

Building change detection (BCD) from satellite imagery is an essential topic in urbanization monitoring, agricultural land management, and the updating of geospatial databases. Recently, deep learning based change detection methods have made significant progress and achieved impressive results. However, they remain insensitive to changes in buildings with complex spectral differences, and the extracted features are not discriminative enough, resulting in incomplete buildings and irregular boundaries. To overcome these problems, we propose in this paper a dual Siamese network based on the Unet model with the addition of a deep supervision strategy (DS). The network consists of a backbone (encoder) pre-trained on ImageNet, a fusion block, and feature pyramid networks (FPN) to enhance the step-by-step information of the changing regions and obtain a more accurate BCD map. To train the proposed method, we created a new dataset (EGY-BCD) of high-resolution, multi-temporal aerial images captured over New Cairo in Egypt. The experimental results show that the proposed method is effective and performs well on the EGY-BCD dataset, with an overall accuracy, F1-score, and mIoU of 91.6%, 80.1%, and 73.5%, respectively.
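
A compact PyTorch sketch of the dual Siamese idea (one shared encoder applied to both dates, fusion by feature difference) is given below; the paper's full Unet/FPN/deep-supervision architecture is much larger, and the channel sizes here are illustrative.

```python
# Minimal Siamese change-detection sketch, not the paper's full model.
import torch
import torch.nn as nn

def block(cin, cout):
    return nn.Sequential(nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(),
                         nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU())

class SiameseBCD(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = block(3, 32)            # shared weights for both dates
        self.head = nn.Conv2d(32, 1, 1)    # change / no-change logits

    def forward(self, t1, t2):
        f1, f2 = self.enc(t1), self.enc(t2)   # same encoder on both images
        return self.head(torch.abs(f1 - f2))  # fuse by feature difference

x1 = torch.rand(1, 3, 256, 256)            # pre-change image
x2 = torch.rand(1, 3, 256, 256)            # post-change image
print(SiameseBCD()(x1, x2).shape)          # torch.Size([1, 1, 256, 256])
```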

Keywords: building change detection, deep supervision, semantic segmentation, EGY-BCD dataset

Procedia PDF Downloads 120
24283 Humanising Hospital Retrofitting: The Case Study of Malaysia Public Hospitals

Authors: Nur Faridatull Syafinaz Ahmad Tajudin

Abstract:

A hospital is a setting where individuals who are ill or injured are treated and cared for by doctors and nurses; sanatoriums are settings where people can receive treatment and rest, particularly when recovering from a protracted illness. Hospitals are primarily designed to meet the needs of medical personnel by maximising their functionality and workflow, and they frequently do a poor job of determining patients' physical and emotional requirements and expectations. The literature on hospital design has recently focused more on the seeming need to 'humanise' medical facilities. Despite the popularity of this design objective, 'humanising' a space has hardly ever been defined or critically examined, and the term 'humanistic design' covers a broad range of design elements and designer interpretations. In reality, a hospital's layout and design may have a massive effect on how patients feel, experience care, and heal.

Keywords: hospital retrofitting, hospital design, humanising hospital, spatial design

Procedia PDF Downloads 120
24282 Secured Embedding of Patient’s Confidential Data in Electrocardiogram Using Chaotic Maps

Authors: Butta Singh

Abstract:

This paper presents a chaotic map based approach for secured embedding of patient’s confidential data in electrocardiogram (ECG) signal. The chaotic map generates predefined locations through the use of selective control parameters. The sample value difference method effectually hides the confidential data in ECG sample pairs at these predefined locations. Evaluation of proposed method on all 48 records of MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of cover ECG. The secret data imperceptibility in stego-ECG is evident through various statistical and clinical performance measures. Statistical metrics comprise of Percentage Root Mean Square Difference (PRD) and Peak Signal to Noise Ratio (PSNR). Further, a comparative analysis between proposed method and existing approaches was also performed. The results clearly demonstrated the superiority of proposed method.
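
The location-generation step can be sketched with a logistic map as below; the control parameter and seed act as the shared secret key, while the sample-value-difference embedding itself and the clinical metrics are not reproduced. All values are illustrative.

```python
# Sketch of key-driven embedding-location generation with a logistic map.
import numpy as np

def chaotic_locations(n_samples, n_bits, r=3.99, x0=0.6137):
    x, used, locs = x0, set(), []
    while len(locs) < n_bits:
        x = r * x * (1 - x)              # logistic map iteration
        idx = int(x * (n_samples - 2))   # index of an ECG sample pair
        if idx not in used:              # keep embedding locations distinct
            used.add(idx)
            locs.append(idx)
    return locs

ecg = np.random.normal(size=10_000)      # stand-in for an MIT-BIH record
locs = chaotic_locations(len(ecg), 16)
print(locs)  # a receiver with the same (r, x0) regenerates these locations
```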

Keywords: chaotic maps, ECG steganography, data embedding, electrocardiogram

Procedia PDF Downloads 195
24281 Moderating Effect of Different Social Supports on the Relationship between Workplace Bullying and Intention to Occupation Leave in Nurses

Authors: Chenchieh Chang

Abstract:

Objectives: This study had two objectives. First, it used affective events theory to investigate the relationship between workplace bullying and the intention to resign in nurses, a topic rarely explored in previous studies. Second, according to conservation of resources theory, individuals encountering work incidents will utilize the resources at their disposal to strengthen or weaken the effects of the incidents on them; such resources include social support from bosses, colleagues, family, and friends. To answer the question of whether different social supports exert distinct effects in alleviating the stress experienced by nurses, this study examined the moderating effects of different social supports on the relationship between workplace bullying and nurses' intention to resign. Method: This study was approved by an institutional review board (code number: 105070) and adopted purposive sampling to survey 911 full-time nurses. Results: Work-related bullying exerted a significant positive effect on the intention to resign, whereas bullying pertaining to interpersonal relationships and body-related bullying did not significantly affect the intention to resign. Support from supervisors strengthened the effect of work-related bullying on the intention to resign, whereas support from colleagues and family did not moderate that effect. Research Limitations/Implications: The self-reporting method and cross-sectional research design adopted in this study might have resulted in common method variance and limited the ability to make causal inferences. This study suggests that future studies obtain measures of predictor and criterion variables from different sources, or ensure a temporal, proximal, or psychological separation between predictor and criterion during data collection, to avoid common method bias. Practical Implications: First, businesses should establish a friendly work environment and prevent employees from encountering workplace bullying. Second, because social support cannot diminish the effect of workplace bullying on employees' intention to resign, businesses should offer other means of assistance. For example, business managers may introduce confidential systems for employees to report workplace bullying, or they may establish consultation centers where employees can properly express their thoughts and feelings about workplace bullying.

Keywords: workplace bullying, intention to occupation leave, social supports, nurses

Procedia PDF Downloads 114
24280 Detection Efficient Enterprises via Data Envelopment Analysis

Authors: S. Turkan

Abstract:

In this paper, data on Turkey's Top 500 Industrial Enterprises in 2014 were analyzed by data envelopment analysis (DEA). DEA is used to detect efficient decision-making units such as universities, hospitals and schools by using inputs and outputs; the decision-making units in this study are enterprises. To detect efficient enterprises, some financial ratios are determined as inputs and outputs; for this reason, financial indicators related to the productivity of enterprises are considered. The efficient enterprises with weighted foreign-owned capital are detected via the super efficiency model. According to the results, Mercedes-Benz is the most efficient enterprise with weighted foreign-owned capital in Turkey.
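
An input-oriented CCR super-efficiency score of the kind used here can be sketched with scipy.optimize.linprog as below; the two-input, one-output matrices are hypothetical stand-ins for the financial ratios.

```python
# Hedged sketch of input-oriented CCR super-efficiency via linear programming.
import numpy as np
from scipy.optimize import linprog

def super_efficiency(X, Y, k):
    """Super-efficiency of DMU k; row k is excluded from the reference set."""
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    ref = [j for j in range(n) if j != k]
    c = np.r_[1.0, np.zeros(len(ref))]          # minimize theta
    A_in = np.c_[-X[k][:, None], X[ref].T]      # sum(lam_j x_j) <= theta x_k
    A_out = np.c_[np.zeros((s, 1)), -Y[ref].T]  # sum(lam_j y_j) >= y_k
    res = linprog(c, A_ub=np.r_[A_in, A_out],
                  b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(0, None)] * (1 + len(ref)))
    return res.fun                              # score > 1: super-efficient

X = np.array([[4.0, 3.0], [7.0, 5.0], [8.0, 2.0]])  # inputs (e.g. ratios)
Y = np.array([[1.0], [1.0], [1.0]])                 # outputs
print(round(super_efficiency(X, Y, 0), 3))          # 1.75 for enterprise 0
```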

Keywords: data envelopment analysis, super efficiency, logistic regression, financial ratios

Procedia PDF Downloads 324
24279 Meningeal Hemangiopericytoma in an HIV-Positive Patient: A Case Report and Review of Literature

Authors: Roland Benedict Reyes, Marc Edsel Ayes, Regina Berba, Cybele Lara Abad

Abstract:

Background: Three AIDS-defining malignancies have been associated with the human immunodeficiency virus (HIV): Kaposi's sarcoma, non-Hodgkin's lymphoma, and cervical carcinoma. However, new cases of non-AIDS-defining malignancies have also been increasingly associated with HIV. One of these is a rare intracranial malignancy, meningeal hemangiopericytoma. Case Description: A 32-year-old HIV-positive male, not on highly active antiretroviral therapy, was admitted to our hospital due to generalized weakness and sudden-onset hearing loss. Cranial MRI revealed a temporal nodule with the following considerations: granuloma, meningioma or metastases. A craniotomy was performed and the mass excised. Biopsy results showed meningeal hemangiopericytoma. The patient was then started on antiretroviral therapy (Lamivudine, Tenofovir, and Efavirenz) and was discharged for radiation therapy and metastatic work-up as an outpatient. On follow-up seven months later, the metastatic work-up revealed multiple hepatic foci not previously documented, suggestive of metastasis short of biopsy confirmation. Conclusions: This case of an intracranial hemangiopericytoma in an HIV-positive patient is only the second case presented thus far, based on our systematic and extensive search of the literature.

Keywords: hemangiopericytoma, human immunodeficiency virus, meningeal hemangiopericytoma, neoplasm

Procedia PDF Downloads 463
24278 Intelligent Process Data Mining for Monitoring for Fault-Free Operation of Industrial Processes

Authors: Hyun-Woo Cho

Abstract:

Real-time fault monitoring and diagnosis of large-scale production processes is helpful and necessary for operating industrial processes safely and efficiently while producing good final product quality. Unusual and abnormal events may have a serious impact on the process, such as malfunctions or breakdowns. This work utilizes process measurement data obtained on-line for the safe and fault-free operation of industrial processes. To this end, the proposed intelligent process data monitoring framework was evaluated on a simulated process. The monitoring scheme extracts the fault pattern in a reduced space for reliable data representation. Moreover, this work compares linear and nonlinear techniques for the monitoring purpose. It is shown that the nonlinear technique produces more reliable monitoring results and outperforms linear methods. The adoption of the qualitative monitoring model helps to reduce the sensitivity of the fault pattern to noise.
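
A linear (PCA-based) monitoring baseline of the kind the nonlinear scheme is compared against can be sketched as below, using Hotelling's T² in the reduced space on simulated stand-in data.

```python
# Sketch of a linear monitoring baseline: PCA plus a T^2 control limit.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
normal = rng.normal(size=(500, 10))     # fault-free operating data
faulty = rng.normal(size=(50, 10))
faulty[:, -2:] += 4.0                   # simulated sensor fault

pca = PCA(n_components=3).fit(normal)

def t2(data):                           # Hotelling's T^2 in reduced space
    scores = pca.transform(data)
    return np.sum(scores**2 / pca.explained_variance_, axis=1)

limit = np.quantile(t2(normal), 0.99)   # empirical 99% control limit
print(f"fault detection rate: {(t2(faulty) > limit).mean():.0%}")
```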

Keywords: process data, data mining, process operation, real-time monitoring

Procedia PDF Downloads 640
24277 Artificial Neural Network Based Approach in Prediction of Potential Water Pollution Across Different Land-Use Patterns

Authors: M.Rüştü Karaman, İsmail İşeri, Kadir Saltalı, A.Reşit Brohi, Ayhan Horuz, Mümin Dizman

Abstract:

Considerable attention has recently been given to the environmental hazards caused by agricultural chemicals such as excess fertilizers. In this study, a neural network approach was investigated for the prediction of potential nitrate pollution across different land-use patterns, using a feedforward multilayered artificial neural network (ANN) model with proper training. Periodic concentrations of some anions, especially nitrate (NO3-), and cations were detected in drainage waters collected from drain pipes placed in an irrigated tomato field, an unirrigated wheat field, fallow land and pasture land. Soil samples were collected from the irrigated tomato field and the unirrigated wheat field on a grid system with 20 m x 20 m intervals. Site-specific nitrate concentrations in the soil samples were measured for ANN-based simulation of the nitrate leaching potential of the land profiles. In the application of the ANN model, a multilayered feedforward network was evaluated, and training, validation and testing data sets containing the measured soil nitrate values were prepared based on spatial variability. Based on the testing values, the optimal structure of 2-15-1 was obtained for the unirrigated field (R2 = 0.96, P < 0.01), while the optimal structure of 2-10-1 was obtained for the irrigated field (R2 = 0.96, P < 0.01). The results showed that the ANN model could be successfully used in predicting potential nitrate leaching levels for different land use patterns. However, for the most suitable results, the model should be calibrated by training with different NN structures depending on site-specific soil parameters and varied agricultural managements.
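
A hedged sketch of the reported 2-15-1 topology (two grid inputs, fifteen hidden neurons, one nitrate output) is shown below with scikit-learn; the grid coordinates and nitrate values are simulated stand-ins, not the study's measurements.

```python
# Sketch of a 2-15-1 feedforward network on simulated grid data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
XY = rng.uniform(0, 200, size=(100, 2))    # 20 m grid positions (m)
no3 = 5 + 0.02 * XY[:, 0] + 0.03 * XY[:, 1] + rng.normal(0, 0.5, 100)

X_tr, X_te, y_tr, y_te = train_test_split(XY, no3, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(15,),   # 2 inputs, 15 hidden, 1 output
                   max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)
print("R^2 on test grid points:", round(net.score(X_te, y_te), 2))
```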

Keywords: artificial intelligence, ANN, drainage water, nitrate pollution

Procedia PDF Downloads 310
24276 Analysis of the Extreme Hydrometeorological Events in the Theorical Hydraulic Potential and Streamflow Forecast

Authors: Sara Patricia Ibarra-Zavaleta, Rabindranarth Romero-Lopez, Rosario Langrave, Annie Poulin, Gerald Corzo, Mathias Glaus, Ricardo Vega-Azamar, Norma Angelica Oropeza

Abstract:

The progressive change in climatic conditions worldwide has increased the frequency and severity of extreme hydrometeorological events (EHE). Mexico is an example; it has been affected by EHE, suffering economic, social and environmental losses. The objective of this research was to apply a Canadian distributed hydrological model (DHM) to tropical conditions and to evaluate its capacity to predict flows in a basin in the central Gulf of Mexico. In addition, the DHM (once calibrated and validated) was used to calculate the theoretical hydraulic power and to evaluate its performance in predicting streamflow in the presence of an EHE. The results of the DHM show that the goodness-of-fit indicators between the observed and simulated flows are satisfactory in calibration (NSE = 0.83, RSR = 0.021 and BIAS = -4.3) and in temporal validation, which was assessed at two points: point one (NSE = 0.78, RSR = 0.113 and BIAS = 0.054) and point two (NSE = 0.825, RSR = 0.103 and BIAS = 0.063). The DHM showed its applicability in tropical environments and its ability to characterize the rainfall-runoff relationship in the study area. This work can serve as a tool for identifying vulnerabilities to floods and for the rational and sustainable management of water resources.
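
The goodness-of-fit statistics quoted above can be computed in the standard way (NSE, RSR and percent bias); the sketch below uses stand-in flow series, and the sign convention for BIAS is an assumption.

```python
# Standard streamflow goodness-of-fit metrics on stand-in data.
import numpy as np

def nse(obs, sim):                  # Nash-Sutcliffe efficiency
    return 1 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

def rsr(obs, sim):                  # RMSE / std of observations
    return np.sqrt(np.mean((obs - sim)**2)) / obs.std()

def pbias(obs, sim):                # percent bias; positive = underestimation
    return 100 * np.sum(obs - sim) / np.sum(obs)

obs = np.array([12.0, 30.0, 55.0, 40.0, 18.0])
sim = np.array([11.0, 33.0, 52.0, 41.0, 20.0])
print(nse(obs, sim), rsr(obs, sim), pbias(obs, sim))
```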

Keywords: HYDROTEL, hydraulic power, extreme hydrometeorological events, streamflow

Procedia PDF Downloads 341