Search results for: Data oriented modeling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9176


8936 Prediction of Reusability of Object Oriented Software Systems using Clustering Approach

Authors: Anju Shri, Parvinder S. Sandhu, Vikas Gupta, Sanyam Anand

Abstract:

In the literature, there are metrics for identifying the quality of reusable components, but a framework that uses these metrics to precisely predict the reusability of software components still needs to be worked out. If these reusability metrics are identified in the design phase, or even in the coding phase, they can help reduce rework by improving the quality of software component reuse and hence improve productivity through a probabilistic increase in the reuse level. Since the CK metric suite is the most widely used set of metrics for extracting structural features of object-oriented (OO) software, this study uses a tuned CK metric suite, i.e. WMC, DIT, NOC, CBO and LCOM, to obtain a structural analysis of OO-based software components. An algorithm is proposed in which the tuned metric values of an OO software component are given as inputs to a K-Means clustering system, and a decision tree is built under 10-fold cross validation to evaluate the component in terms of a linguistic reusability value. The developed reusability model has produced high-precision results as desired.
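
A minimal sketch of the pipeline described in this abstract, assuming a table of tuned CK metric values (WMC, DIT, NOC, CBO, LCOM) per component; the placeholder data, cluster count and linguistic labels are illustrative assumptions, not the paper's data set:

    # Cluster OO components on tuned CK metrics, then learn a decision tree that
    # maps the metrics (plus cluster id) to a linguistic reusability label.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 50, size=(200, 5))      # columns: WMC, DIT, NOC, CBO, LCOM
    clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    labels = np.array(["low", "medium", "high"])[clusters]   # placeholder linguistic labels

    tree = DecisionTreeClassifier(max_depth=4, random_state=0)
    scores = cross_val_score(tree, np.column_stack([X, clusters]), labels, cv=10)
    print("10-fold CV accuracy: %.3f" % scores.mean())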

Keywords: CK-Metric, Decision Tree, K-Means, Reusability.

PDF Downloads: 1877
8935 Retrospective Reconstruction of Time Series Data for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

The development, operation and maintenance of Integrated Waste Management Systems (IWMS) essentially affect the sustainability of every region, and the features of such systems have a great influence on all components of sustainability. In order to optimize the processes, a comprehensive mapping of the variables affecting the future efficiency of the system is needed, including an analysis of the interconnections among the components and modeling of their interactions. The planning of an IWMS is based fundamentally on technical and economic opportunities and the legal framework. Modeling the sustainability and operational effectiveness of a particular IWMS is not in the scope of the present research. The complexity of these systems and the large number of variables require a comprehensive approach to model the outcomes and future risks; such a method should be able to evaluate the logical framework of the factors composing the system and the interconnections between them. The authors of this paper studied the usability of the Fuzzy Cognitive Map (FCM) approach for modeling the future operation of IWMSs. The approach requires two input data sets. One is the connection matrix containing all the factors affecting the system in focus, together with all their interconnections. The other input data set is the time series, a retrospective reconstruction of the weights and roles of the factors. This paper introduces a novel method to develop the time series by content analysis.
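
For illustration, a minimal sketch of how a Fuzzy Cognitive Map propagates factor weights once the two inputs (the connection matrix and the reconstructed initial weights) are available; the 4-factor matrix, the sigmoid squashing function and the values are assumptions, not the paper's:

    import numpy as np

    def fcm_step(a, W):
        # One FCM update: each factor aggregates the weighted influence of the
        # others and is squashed back to [0, 1] with a sigmoid (assumed choice).
        return 1.0 / (1.0 + np.exp(-(a + a @ W)))

    W = np.array([[ 0.0, 0.6, -0.3,  0.0],     # illustrative connection matrix
                  [ 0.2, 0.0,  0.5, -0.4],     # (rows influence columns)
                  [ 0.0, 0.3,  0.0,  0.7],
                  [-0.1, 0.0,  0.2,  0.0]])
    a = np.array([0.5, 0.4, 0.7, 0.2])         # initial weights from the time series
    for _ in range(20):                        # iterate until the map settles
        a = fcm_step(a, W)
    print(np.round(a, 3))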

Keywords: Content analysis, factors, integrated waste management system, time series.

PDF Downloads: 1986
8934 Spike Sorting Method Using Exponential Autoregressive Modeling of Action Potentials

Authors: Sajjad Farashi

Abstract:

Neurons in the nervous system communicate with each other by producing electrical signals called spikes. To investigate the physiological function of the nervous system, it is essential to study the activity of neurons by detecting and sorting spikes in the recorded signal. In this paper, a method is proposed for the spike sorting problem based on nonlinear modeling of spikes using an exponential autoregressive model. A genetic algorithm is utilized for model parameter estimation, and selected model coefficients are used as features for sorting purposes. For optimal selection of the model coefficients, a self-organizing feature map is used. The results show that modeling spikes with a nonlinear autoregressive model outperforms its linear counterpart. The features extracted from the coefficients of the exponential autoregressive model are also better than wavelet-based features and yield more compact, well-separated clusters. In the case of spikes that differ only in small-scale structures, where principal component analysis fails to produce separated clouds in the feature space, the proposed method obtains well-separated clusters, which removes the need for complex classifiers.
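
For illustration, a minimal simulation of an exponential autoregressive (ExpAR) process of the kind used here to model spike waveforms; the order, coefficients and noise level are assumptions, and the genetic algorithm and self-organizing map stages of the paper are not reproduced:

    import numpy as np

    def expar_simulate(a, b, gamma, n, noise=0.01, seed=0):
        # ExpAR(p): x[t] = sum_i (a[i] + b[i]*exp(-gamma*x[t-1]**2)) * x[t-i] + e[t]
        rng = np.random.default_rng(seed)
        p = len(a)
        x = np.zeros(n)
        x[:p] = rng.normal(0, 0.1, p)
        for t in range(p, n):
            coeff = a + b * np.exp(-gamma * x[t - 1] ** 2)
            x[t] = coeff @ x[t - p:t][::-1] + rng.normal(0, noise)
        return x

    # Illustrative 2nd-order model; in the paper, coefficients estimated per spike
    # serve as the sorting features.
    x = expar_simulate(np.array([0.5, -0.3]), np.array([0.2, 0.1]), gamma=2.0, n=200)
    print(x[:5])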

Keywords: Exponential autoregressive model, Neural data, spike sorting, time series modeling.

PDF Downloads: 1729
8933 Modeling and Simulation of Practical Metamaterial Structures

Authors: Ridha Salhi, Mondher Labidi, Fethi Choubani

Abstract:

Metamaterials have attracted much attention in recent years because of their exquisite electromagnetic properties. In this paper, we present the modeling of three metamaterial structures by equivalent circuit models. We begin by modeling the SRR (Split Ring Resonator), then we model the HIS (High Impedance Surface), and finally we present the model of the CPW (Coplanar Waveguide). In order to validate the models, we compare the results obtained with the equivalent circuit models against numerical simulations.
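
In such equivalent circuit models the SRR is commonly reduced, to first order, to a series LC resonator; a small sketch of the resulting resonance frequency, with purely illustrative inductance and capacitance values:

    import math

    def srr_resonance(L_henry, C_farad):
        # Resonance of the series LC equivalent circuit: f0 = 1 / (2*pi*sqrt(L*C))
        return 1.0 / (2.0 * math.pi * math.sqrt(L_henry * C_farad))

    # Illustrative values (not from the paper): L = 10 nH, C = 0.1 pF -> f0 ~ 5 GHz
    print("f0 = %.2f GHz" % (srr_resonance(10e-9, 0.1e-12) / 1e9))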

Keywords: Metamaterials, SRR, HIS, CPW, IDC.

PDF Downloads: 1709
8932 Stator-Flux-Oriented Based Encoderless Direct Torque Control for Synchronous Reluctance Machines Using Sliding Mode Approach

Authors: J. Soltani, H. Abootorabi Zarchi, Gh. R. Arab Markadeh

Abstract:

In this paper, a sliding-mode torque and flux control is designed for an encoderless synchronous reluctance motor drive. The sliding-mode plus PI controllers are designed in the stator-flux field-oriented reference frame and are able to track the reference signals with minimal pulsation in steady state. In addition, these controllers give the drive system a fast dynamic response. The proposed control scheme is robust to parameter variations except for the stator resistance; to solve this problem, a simple estimator is used for on-line detection of this parameter. Moreover, the rotor position and speed are estimated by on-line computation of the stator-flux space vector. The effectiveness and capability of the proposed control approach are verified by both simulation and experimental results.

Keywords: Synchronous Reluctance Motor, Direct Torque and Flux Control, Sliding Mode, Field-Oriented Frame, Encoderless.

PDF Downloads: 2523
8931 A Java Based Discrete Event Simulation Library

Authors: Brahim Belattar, Abdelhabib Bourouis

Abstract:

This paper describes the important features of JAPROSIM, a free and open source simulation library implemented in the Java programming language. It provides a framework for building discrete event simulation models. The process interaction world view adopted by JAPROSIM is discussed, and we present the architecture and major components of the simulation library. A pedagogical example is given in order to illustrate how to use JAPROSIM to build discrete event simulation models. Further motivations are discussed and suggestions for improving our work are given.

Keywords: Discrete Event Simulation, Object-Oriented Simulation, JAPROSIM, Process Interaction Worldview, Java-based modeling and simulation.

PDF Downloads: 3755
8930 Urban Growth Analysis Using Multi-Temporal Satellite Images, Non-stationary Decomposition Methods and Stochastic Modeling

Authors: Ali Ben Abbes, Imed Riadh Farah, Vincent Barra

Abstract:

Remotely sensed data are a significant source for monitoring and updating land use/cover databases. Change detection in urban areas has become a subject of intensive research, and timely, accurate data on the spatio-temporal changes of urban areas are required. The data extracted from multi-temporal satellite images are usually non-stationary; in fact, the changes evolve in both time and space. This paper proposes a methodology for change detection in urban areas that combines a non-stationary decomposition method with stochastic modeling. The input of the methodology is a sequence of satellite images I1, I2, ..., In acquired at different periods (t = 1, 2, ..., n). First, a preprocessing of the multi-temporal satellite images (e.g. radiometric, atmospheric and geometric corrections) is applied. The systematic study of global urban expansion in our methodology can be approached in two ways. The first considers the urban area as a single object as opposed to non-urban areas (e.g. vegetation, bare soil and water); the objective is to extract the urban mask. The second aims to obtain more detailed knowledge of the urban area by distinguishing different types of tissue within it. In order to validate the approach, we used a database of Tres Cantos, Madrid (Spain), derived from Landsat over the period from January 2004 to July 2013 by collecting two frames per year at a spatial resolution of 25 meters. The obtained results show the effectiveness of our method.

Keywords: Multi-temporal satellite image, urban growth, Non-stationarity, stochastic modeling.

PDF Downloads: 1464
8929 Real-time Haptic Modeling and Simulation for Prosthetic Insertion

Authors: Catherine A. Todd, Fazel Naghdy

Abstract:

In this work, a surgical simulator is produced which enables a training otologist to conduct a virtual, real-time prosthetic insertion. The simulator provides the Ear, Nose and Throat surgeon with real-time visual and haptic responses during virtual cochlear implantation into a 3D model of the human Scala Tympani (ST). The parametric model is derived from measured data as published in the literature and accounts for human morphological variance, such as differences in cochlear shape, enabling patient-specific pre-operative assessment. Haptic modeling techniques use real physical data and insertion force measurements to develop a force model which mimics the physical behavior of an implant as it collides with the ST walls during an insertion. Output force profiles are acquired from the insertion studies conducted in this work to validate the haptic model. The simulator provides the user with real-time, quantitative insertion force information and the associated electrode position as the user inserts the virtual implant into the ST model. The information provided by this study may also be of use to implant manufacturers for design enhancements, as well as for training specialists in optimal force administration using the simulator. The paper reports on the methods for anatomical modeling and haptic algorithm development, with a focus on simulator design, development, optimization and validation. The techniques may be transferable to other medical applications that involve prosthetic device insertions where user vision is obstructed.

Keywords: Haptic modeling, medical device insertion, real-time visualization of prosthetic implantation, surgical simulation.

PDF Downloads: 1999
8928 Effects of Different Meteorological Variables on Reference Evapotranspiration Modeling: Application of Principal Component Analysis

Authors: Akinola Ikudayisi, Josiah Adeyemo

Abstract:

The correct estimation of reference evapotranspiration (ETₒ) is required for effective irrigation water resources planning and management. However, several variables must be considered while estimating and modeling ETₒ. This study therefore carries out a multivariate analysis of the correlated variables involved in the estimation and modeling of ETₒ at the Vaalharts irrigation scheme (VIS) in South Africa, using the Principal Component Analysis (PCA) technique. Weather and meteorological data between 1994 and 2014 were obtained from both the South African Weather Service (SAWS) and the Agricultural Research Council (ARC) for this study. Average monthly data of minimum and maximum temperature (°C), rainfall (mm), relative humidity (%), and wind speed (m/s) were the inputs to the PCA-based model, while ETₒ is the output. The PCA technique was adopted to extract the most important information from the dataset and to analyze the relationship between the five variables and ETₒ, in order to determine the most significant variables affecting ETₒ estimation at VIS. Two principal components, together accounting for 82.7% of the variance, were retained after the eigenvector extraction. The results of the two principal components were compared, and the model output shows that minimum temperature, maximum temperature and wind speed are the most important variables in ETₒ estimation and modeling at VIS. In other words, ETₒ increases with temperature and wind speed. Other variables such as rainfall and relative humidity are less important and cannot provide enough information about ETₒ estimation at VIS. The outcome of this study has helped to reduce the input dimensionality from five to the three most significant variables in ETₒ modelling at VIS, South Africa.
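
A minimal sketch of the PCA-based variable screening described above, with placeholder monthly records standing in for the Vaalharts data; the variable names follow the abstract, the numbers are random:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    variables = ["Tmin", "Tmax", "Rain", "RH", "Wind"]
    X = np.random.default_rng(1).normal(size=(240, 5))   # placeholder monthly records

    pca = PCA().fit(StandardScaler().fit_transform(X))
    print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
    # Retain the leading components (the paper keeps two, covering 82.7% of the
    # variance) and inspect the loadings to see which variables dominate them.
    for i, comp in enumerate(pca.components_[:2]):
        print(f"PC{i + 1} loadings:", dict(zip(variables, np.round(comp, 2))))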

Keywords: Irrigation, principal component analysis, reference evapotranspiration, Vaalharts.

PDF Downloads: 1017
8927 Improvement Approach on Rotor Time Constant Adaptation with Optimum Flux in IFOC for Induction Machines Drives

Authors: S. Grouni, R. Ibtiouen, M. Kidouche, O. Touhami

Abstract:

Induction machine models used for steady-state and transient analysis require machine parameters that are usually considered design parameters or data. Knowledge of the induction machine parameters is very important for Indirect Field Oriented Control (IFOC): a mismatched set of parameters will degrade the speed and torque control response. This paper presents an improved approach to rotor time constant adaptation in IFOC for Induction Machines (IM). Our approach aims to improve the estimation accuracy of the fundamental model for flux estimation. Based on a reduced-order IM model, the rotor fluxes and rotor time constant are estimated using only the stator currents and voltages. This reduced-order model offers many advantages for real-time identification of the IM parameters.

Keywords: Indirect Field Oriented Control (IFOC), Induction Machine (IM), Rotor Time Constant, Parameter Adaptation Approach, Optimum Rotor Flux.

PDF Downloads: 1664
8926 Comparison of Polynomial and Radial Basis Kernel Functions based SVR and MLR in Modeling Mass Transfer by Vertical and Inclined Multiple Plunging Jets

Authors: S. Deswal, M. Pal

Abstract:

Various computational techniques are presently used for modeling and analyzing environmental engineering data. In the present study, an intra-comparison of polynomial and radial basis kernel functions based on Support Vector Regression and, in turn, an inter-comparison with Multi Linear Regression has been attempted in modeling the mass transfer capacity of vertical (θ = 90°) and inclined multiple plunging jets (varying from 1 to 16 in number). The data set used in this study consists of four input parameters with a total of eighty-eight cases, forty-four each for vertical and inclined multiple plunging jets. For testing, tenfold cross validation was used. Correlation coefficient values of 0.971 and 0.981, along with corresponding root mean square error values of 0.0025 and 0.0020, were achieved using polynomial and radial basis kernel functions based Support Vector Regression respectively. The intra-comparison suggests improved performance by the radial basis function in comparison to polynomial kernel based Support Vector Regression. Further, an inter-comparison with Multi Linear Regression (correlation coefficient = 0.973 and root mean square error = 0.0024) reveals that radial basis kernel functions based Support Vector Regression performs better in modeling and estimating mass transfer by multiple plunging jets.
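
A minimal sketch of the intra-/inter-comparison described above (polynomial and RBF kernel SVR versus multiple linear regression under tenfold cross validation); the synthetic data and the hyperparameters are assumptions, not the study's:

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(88, 4))                       # 4 input parameters, 88 cases
    y = X @ np.array([0.4, 0.3, 0.2, 0.1]) + rng.normal(0, 0.02, 88)

    models = {"SVR (poly)": SVR(kernel="poly", degree=2, C=10.0, epsilon=0.001),
              "SVR (RBF)": SVR(kernel="rbf", C=10.0, epsilon=0.001),
              "MLR": LinearRegression()}
    for name, model in models.items():
        pred = cross_val_predict(model, X, y, cv=10)    # tenfold cross validation
        r = np.corrcoef(y, pred)[0, 1]
        rmse = mean_squared_error(y, pred) ** 0.5
        print(f"{name}: r = {r:.3f}, RMSE = {rmse:.4f}")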

Keywords: Mass transfer, multiple plunging jets, polynomial and radial basis kernel functions, Support Vector Regression.

PDF Downloads: 1383
8925 Understanding Physical Activity Behavior of Type 2 Diabetics Using the Theory of Planned Behavior and Structural Equation Modeling

Authors: D. O. Omondi, M. K. Walingo, G. M. Mbagaya, L. O. A. Othuon

Abstract:

Understanding the patient factors related to physical activity behavior is important in the management of Type 2 Diabetes. This study applied the Theory of Planned Behavior model to understand physical activity behavior among sampled Type 2 diabetics in Kenya. The study was conducted within the diabetic clinic at Kisii Level 5 Hospital and adopted a sequential mixed methods design beginning with a qualitative phase and ending with a quantitative phase. Qualitative data were analyzed using the grounded theory analysis method, and structural equation modeling using maximum likelihood was used to analyze the quantitative data. The common fit indices revealed that the Theory of Planned Behavior fitted the data acceptably well among the Type 2 diabetics and within physical activity behavior {χ² = 213, df = 84, n = 230, p = .061, χ²/df = 2.53; TLI = .97; CFI = .96; RMSEA (90% CI) = .073 (.029, .08)}. The theory proved to be useful in understanding physical activity behavior among Type 2 diabetics.

Keywords: Physical activity, Theory of Planned Behavior, Type 2 diabetes, Kenya.

PDF Downloads: 1936
8924 Designing Software Quality Measurement System for Telecommunication Industry Using Object-Oriented Technique

Authors: Nor Fazlina Iryani Abdul Hamid, Mohamad Khatim Hasan

Abstract:

A number of software quality measurement systems have been implemented over the past few years, but none of them focuses on the telecommunication industry. The software quality measurement system for the telecommunication industry is a system that calculates the quality value of the measured software with a complete focus on that industry. Before designing the system, quality factors, quality attributes and quality metrics were identified based on a literature review and a survey. Then, using the identified quality factors, attributes and metrics, a quality model for the telecommunication industry was constructed. Each identified quality metric has its own formula. The quality value of the software is measured based on the quality metrics and aggregated by referring to the quality model, and the software is classified into a quality level based on the Net Satisfaction Index (NSI). The system was designed using an object-oriented approach in a web-based environment. The existence of such a software quality measurement system is thus important to both developers and users in order to produce high quality software products for the telecommunication industry.
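
A heavily hedged sketch of the aggregation step: normalized metric scores are combined with weights taken from an assumed quality model and mapped to a quality level; the metric names, weights, thresholds and the stand-in for the NSI classification are illustrative placeholders, not the paper's:

    # Hypothetical aggregation: weighted sum of normalized metric scores in [0, 1],
    # followed by a threshold classification standing in for the NSI levels.
    metrics = {"defect_density": 0.80, "response_time": 0.70, "availability": 0.95}
    weights = {"defect_density": 0.4, "response_time": 0.3, "availability": 0.3}

    quality_value = sum(metrics[m] * weights[m] for m in metrics)
    level = "high" if quality_value >= 0.8 else "medium" if quality_value >= 0.5 else "low"
    print(f"quality value = {quality_value:.2f}, level = {level}")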

Keywords: Software Quality, Quality Measurement, Object-oriented Approach, Net Satisfaction Index.

PDF Downloads: 2408
8923 A 2D-3D Hybrid Vision System for Robotic Manipulation of Randomly Oriented Objects

Authors: Moulay A. Akhloufi

Abstract:

This paper presents a new vision technique for robotic manipulation of randomly oriented objects in industrial applications. The proposed approach uses 2D and 3D vision to efficiently extract the 3D pose of an object in the presence of multiple randomly positioned objects. 2D vision permits quick selection of the objects of interest for 3D processing with a new modified ICP algorithm (FaR-ICP), thus reducing the processing time significantly. The extracted 3D pose is then sent to the robot manipulator for picking. Tests show that the proposed system achieves high performance.
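
For illustration, a minimal point-to-point ICP iteration of the standard kind (nearest-neighbour correspondences plus an SVD-based rigid alignment); this is a generic sketch with placeholder point clouds, not the paper's modified FaR-ICP:

    import numpy as np
    from scipy.spatial import cKDTree

    def icp_step(src, dst):
        # Match each source point to its nearest destination point, then solve the
        # best rigid transform (Kabsch/SVD) and apply it to the source cloud.
        matched = dst[cKDTree(dst).query(src)[1]]
        src_c, dst_c = src.mean(axis=0), matched.mean(axis=0)
        U, _, Vt = np.linalg.svd((src - src_c).T @ (matched - dst_c))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:               # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return src @ R.T + (dst_c - R @ src_c)

    rng = np.random.default_rng(0)
    src = rng.uniform(size=(100, 3))
    c, s = np.cos(0.2), np.sin(0.2)
    Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    dst = src @ Rz.T + np.array([0.1, -0.05, 0.2])      # rotated + shifted copy
    for _ in range(20):
        src = icp_step(src, dst)
    print("mean residual:", np.linalg.norm(src - dst, axis=1).mean())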

Keywords: 3D vision, Hand-Eye calibration, robot visual servoing, random bin picking.

PDF Downloads: 1757
8922 3DARModeler: a 3D Modeling System in Augmented Reality Environment

Authors: Trien V. Do, Jong-Weon Lee

Abstract:

This paper describes a 3D modeling system in an Augmented Reality environment, named 3DARModeler. It can be considered a simple version of 3D Studio Max with the functions necessary for a modeling system, such as creating objects, applying textures, adding animation, estimating real light sources and casting shadows. The 3DARModeler introduces convenient and effective human-computer interaction for building 3D models by combining the traditional input method (mouse/keyboard) and a tangible input method (markers). It has the ability to align a new virtual object with the existing parts of a model. The 3DARModeler targets non-technical users; as such, they do not need much knowledge of computer graphics and modeling techniques. All they have to do is select basic objects, customize their attributes, and put them together to build a 3D model in a simple and intuitive way, as if they were doing it in the real world. Using the hierarchical modeling technique, users are able to group several basic objects and manage them as a unified, complex object. The system can also connect with other 3D systems by importing and exporting VRML/3ds Max files. A speech recognition module is included in the system to provide a flexible user interface.

Keywords: 3D Modeling, Augmented Reality, Geometric Modeling, Virtual Reality.

PDF Downloads: 2604
8921 Enabling Integration across Heterogeneous Care Networks

Authors: Federico Cabitza, Marco P. Locatelli, Marcello Sarini, Carla Simone

Abstract:

The paper shows how the CASMAS modeling language, and its associated pervasive computing architecture, can be used to facilitate continuity of care by providing members of patient-centered communities of care with support for cooperation and knowledge sharing through the use of electronic documents and digital devices. We consider a scenario of clearly fragmented care to show how proper mechanisms can be defined to facilitate a better integration of practices and information across heterogeneous care networks. The scenario is described in terms of architectural components and cooperation-oriented mechanisms that make the support reactive to the evolution of the context in which these communities operate.

Keywords: Pervasive Computing, Communities of Care, Heterogeneous Care Networks, Multi-Agent System.

PDF Downloads: 1319
8920 Towards Finite Element Modeling of the Acoustics of Human Head

Authors: Maciej Paszynski, Leszek Demkowicz, Jason Kurtz

Abstract:

In this paper, a new formulation for acoustics coupled with linear elasticity is presented. The primary objective of the work is to develop a three-dimensional hp-adaptive finite element method code destined for modeling the acoustics of the human head. The code will have numerous applications, e.g. in designing hearing protection devices for individuals working in high noise environments. The presented work is at a preliminary stage. The variational formulation has been implemented and tested on a sequence of meshes with concentric multi-layer spheres, with material data representing the tissue (the brain), the skull and the air. Thus, an efficient solver for coupled elasticity/acoustics problems has been developed and tested on high-contrast material data representing the human head.

Keywords: finite element method, acoustics, coupled problems, biomechanics

PDF Downloads: 1926
8919 Spatial Econometric Approaches for Count Data: An Overview and New Directions

Authors: Paula Simões, Isabel Natário

Abstract:

This paper reviews a number of theoretical aspects of implementing an explicitly spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches that are available to model data collected with reference to location in space, from classical spatial econometrics to the recent developments for modelling count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modelling and analysis of spatial data, in order to identify possible new directions for the processing of count data in a spatial hierarchical Bayesian econometric context.

Keywords: Spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data.

PDF Downloads: 2636
8918 Dynamic-Stochastic Influence Diagrams: Integrating Time-Slices IDs and Discrete Event Systems Modeling

Authors: Xin Zhao, Yin-fan Zhu, Wei-ping Wang, Qun Li

Abstract:

Influence Diagrams (IDs) are a kind of Probabilistic Belief Network used for graphical modeling. The usage of IDs can improve communication among field experts, modelers and decision makers by presenting the issue under discussion from a high-level point of view. This paper enhances the Time-Sliced Influence Diagrams (TSIDs, also called Dynamic IDs) formalism from a Discrete Event Systems Modeling and Simulation (DES M&S) perspective, for Exploratory Analysis (EA) modeling. The enhancements enable a modeler to specify the occurrence times of endogenous events dynamically, with stochastic sampling as the model runs, and to describe the inter-influences among them with variable nodes in dynamic situations that the existing TSIDs fail to capture. The new class of model is named Dynamic-Stochastic Influence Diagrams (DSIDs). The paper includes a description of the modeling formalism and the hierarchy of simulators implementing its simulation algorithm, and presents a case study to illustrate the enhancements.

Keywords: Time-sliced influence diagrams, discrete event systems, dynamic-stochastic influence diagrams, modeling formalism, simulation algorithm.

PDF Downloads: 1392
8917 Modeling and Simulation of Underwater Flexible Manipulator as Rayleigh Beam Using Bond Graph

Authors: Sumit Kumar, Sunil Kumar, Chandan Deep Singh

Abstract:

This paper presents the modeling and simulation of a flexible robot in an underwater environment. The underwater environment is a complete contrast to a ground or space environment: a robot in an underwater situation is subjected to various dynamic forces such as buoyancy, hydrostatic and hydrodynamic forces. The underwater robot is modeled as a Rayleigh beam, and the developed model further allows estimating the deflection of the tip in two directions. The complete dynamics of the underwater robot is analyzed, which is the main focus of this investigation; control of the robot trajectory is not discussed in this paper. Simulation is performed using the Symbol Shakti software.

Keywords: Bond graph modeling, dynamic modeling, Rayleigh beam, underwater robot.

PDF Downloads: 2969
8916 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures

Authors: Silvina Caíno-Lores, Jesús Carretero

Abstract:

Large scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large scale data-intensive applications are hindered by the high latencies that result from access to vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers and the overheads that arise from them. There are several techniques that attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies that have been reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, grouped into four main categories: application development, task scheduling, in-memory computing and storage platforms. Finally, the authors include a discussion on future research lines and synergies among the former techniques.

Keywords: Co-scheduling, data-centric, data-intensive, data locality, in-memory storage, large scale.

PDF Downloads: 1443
8915 Data Transformation Services (DTS): Creating Data Mart by Consolidating Multi-Source Enterprise Operational Data

Authors: J. D. D. Daniel, K. N. Goh, S. M. Yusop

Abstract:

Trends in business intelligence, e-commerce and remote access make it necessary and practical to store data in different ways on multiple systems with different operating systems. As businesses evolve and grow, they require an efficient computerized solution to perform data updates and to access data from diverse enterprise business applications. The objective of this paper is to demonstrate the capability of DTS [1] as a database solution for automatic data transfer and update in solving a business problem. The DTS package is developed for the sales of a variety of plants, eventually expanding into a commercial supply and landscaping business. Dimensional data modeling is used in the DTS package to extract, transform and load data from heterogeneous database systems such as MySQL, Microsoft Access and Oracle into a Data Mart residing in SQL Server. The data transfer from the various databases is scheduled to run automatically every quarter of the year to support efficient sales analysis. DTS is therefore an attractive solution for automatic data transfer and update that meets today's business needs.
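
The same extract-transform-load idea sketched outside of DTS, with in-memory SQLite engines standing in for the MySQL/Access/Oracle sources and the SQL Server mart so the sketch runs anywhere; the table and column names are hypothetical, and this is not the authors' DTS package:

    import pandas as pd
    from sqlalchemy import create_engine

    src_plants = create_engine("sqlite://")     # stand-in for the plant-sales system
    src_landscape = create_engine("sqlite://")  # stand-in for the landscaping system
    mart = create_engine("sqlite://")           # stand-in for the SQL Server data mart

    pd.DataFrame({"order_id": [1, 2], "qty": [3, 5], "unit_price": [10.0, 4.0],
                  "order_date": ["2024-01-10", "2024-02-03"]}).to_sql("orders", src_plants, index=False)
    pd.DataFrame({"order_id": [7], "qty": [2], "unit_price": [150.0],
                  "order_date": ["2024-01-22"]}).to_sql("orders", src_landscape, index=False)

    # Extract from both sources, transform into one conformed fact table, load the mart.
    fact = pd.concat([pd.read_sql("SELECT * FROM orders", e) for e in (src_plants, src_landscape)],
                     ignore_index=True)
    fact["revenue"] = fact["qty"] * fact["unit_price"]
    fact["order_month"] = pd.to_datetime(fact["order_date"]).dt.to_period("M").astype(str)
    fact.to_sql("fact_sales", mart, if_exists="append", index=False)
    print(pd.read_sql("SELECT order_month, SUM(revenue) AS revenue FROM fact_sales GROUP BY order_month", mart))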

Keywords: Data Transformation Services (DTS), Object Linking and Embedding Database (OLE DB), Data Mart, Online Analytical Processing (OLAP), Online Transactional Processing (OLTP).

PDF Downloads: 1981
8914 Static and Dynamic Complexity Analysis of Software Metrics

Authors: Kamaljit Kaur, Kirti Minhas, Neha Mehan, Namita Kakkar

Abstract:

Software complexity metrics are used to predict critical information about the reliability and maintainability of software systems. Object-oriented software development requires a different approach to software complexity metrics. Object-oriented software metrics can be broadly classified into static and dynamic metrics: static metrics give information at the code level, whereas dynamic metrics provide information about the actual runtime. In this paper we discuss the various complexity metrics and compare static and dynamic complexity.
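
A small sketch of one of the static metrics mentioned here, McCabe's cyclomatic complexity, approximated by counting decision points in a Python function's syntax tree; the set of counted node types is a simplifying assumption:

    import ast

    DECISION_NODES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                      ast.ExceptHandler, ast.IfExp)

    def cyclomatic_complexity(source: str) -> int:
        # McCabe's metric approximated as 1 + number of decision points in the code.
        return 1 + sum(isinstance(n, DECISION_NODES) for n in ast.walk(ast.parse(source)))

    sample = """
    def classify(x):
        if x < 0:
            return "negative"
        for i in range(x):
            if i % 2 == 0 and i > 2:
                return "even>2"
        return "other"
    """
    print(cyclomatic_complexity(sample))   # 1 + if + for + if + and = 5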

Keywords: Static Complexity, Dynamic Complexity, Halstead Metric, McCabe's Metric.

PDF Downloads: 3164
8913 Operational Risk – Scenario Analysis

Authors: Milan Rippel, Petr Teply

Abstract:

This paper focuses on operational risk measurement techniques and on economic capital estimation methods. A data sample of operational losses provided by an anonymous Central European bank is analyzed using several approaches: the Loss Distribution Approach and the scenario analysis method are considered. Custom plausible loss events defined in a particular scenario are merged with the original data sample, and their impact on the capital estimates and on the financial institution is evaluated. Two main questions are assessed – What is the most appropriate statistical method to measure and model the operational loss data distribution? and What is the impact of hypothetical plausible events on the financial institution? The g&h distribution was evaluated to be the most suitable one for operational risk modeling. The method based on the combination of historical loss event modeling and scenario analysis provides reasonable capital estimates and allows measuring the impact of extreme events on banking operations.
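
For illustration, a minimal sketch of combining a g&h severity distribution with Monte Carlo aggregation to obtain a high-quantile capital estimate; the g, h, location and scale parameters and the Poisson frequency are illustrative assumptions, not the bank's fitted values:

    import numpy as np

    def g_and_h(z, a=10.0, b=2.0, g=0.5, h=0.2):
        # Tukey g&h transform of standard normal draws z (g: skewness, h: heavy tails).
        return a + b * (np.expm1(g * z) / g) * np.exp(h * z ** 2 / 2.0)

    rng = np.random.default_rng(0)
    years = 50_000
    annual_losses = np.empty(years)
    for i in range(years):
        n = rng.poisson(25)                                 # assumed annual loss frequency
        annual_losses[i] = g_and_h(rng.standard_normal(n)).sum()

    # Economic capital proxy: 99.9% quantile of the simulated annual loss distribution.
    print(f"VaR 99.9%: {np.quantile(annual_losses, 0.999):.1f}")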

Keywords: operational risk, scenario analysis, economic capital, loss distribution approach, extreme value theory, stress testing

PDF Downloads: 2378
8912 Consequential Influences of Work-Induced Emotions on the Work-Induced Happiness of Frontline Workers in Finance-Oriented Firms

Authors: Mohammed-Aminu Sanda, Emmanuel K. Mawuena

Abstract:

Frontline workers performing client service duties in finance-oriented firms in most sub-Saharan African countries, such as Ghana, are known to be challenged in the conduct of their activities. The challenge is attributed to clients' continued demand for real-time services from such workers, despite the introduction of technological interventions to offset the situation. This has caused such frontline workers to experience increases in their work-induced emotions, with consequential effects on their work-induced happiness. This study, therefore, explored the effect of frontline workers' work-induced emotions on their work-induced happiness when providing tellering services to clients. A cross-sectional design and quantitative technique were used. Data were collected from a sample of 280 frontline workers using a questionnaire. Based on the analysis, it was found that an increase in the frontline workers' work-induced emotions, caused by their feelings of strain, burnout, frustration, and hard work, had a consequential effect on their work-induced happiness. This consequential effect was also found to be aggravated by the workers' sense of being stretched beyond their limits, being emotionally drained, and being used up by their work activities. It is concluded that frontline workers in finance-oriented firms can provide quality real-time services to clients without increases in their work-induced emotions, but with enhanced work-induced happiness, when the psychological and physiological emotional factors associated with the challenged work activities are understood and remedied. Management of the firms can use such understanding to redesign the activities of their frontline workers and improve the quality of their service delivery interactivity with clients.

Keywords: Client-service activity, finance industrial sector, frontline workers, work-induced emotion, work-induced happiness.

PDF Downloads: 701
8911 Coloured Reconfigurable Nets for Code Mobility Modeling

Authors: Kahloul Laid, Chaoui Allaoua

Abstract:

Code mobility technologies attract more and more developers and consumers. Numerous domains are concerned, many platforms have been developed and interesting applications have been realized. However, developing good software products requires modeling, analyzing and proving steps, and the choice of models and modeling languages is critical in these steps. Formal tools are powerful for the analysis and proof steps; however, the poorness of classical modeling languages for modeling mobility calls for new models. The objective of this paper is to provide a specific formalism, "Coloured Reconfigurable Nets", and to show how it is adequate for modeling different kinds of code mobility.

Keywords: Code mobility, modelling mobility, labelled reconfigurable nets, Coloured reconfigurable nets, mobile code design paradigms.

PDF Downloads: 1521
8910 Analysis and Prototyping of Biological Systems: the Abstract Biological Process Model

Authors: Antonio Di Leva, Roberto Berchi, Gianpiero Pescarmona, Michele Sonnessa

Abstract:

The aim of a biological model is to understand the integrated structure and behavior of complex biological systems as a function of the underlying molecular networks, in order to achieve simulation and forecasting of their operation. Although several approaches have been introduced to take into account structural and environment-related features, relatively little attention has been given to representing the behavior of biological systems. The Abstract Biological Process (ABP) model illustrated in this paper is an object-oriented model based on UML (the standard object-oriented language). Its main objective is to bring into focus the functional aspects of the biological system under analysis.

Keywords: Biological processes, system dynamics, system modeling, UML.

PDF Downloads: 1587
8909 Prediction of Dissolved Oxygen in Rivers Using a Wang-Mendel Method – Case Study of Au Sable River

Authors: Mahmoud R. Shaghaghian

Abstract:

The amount of dissolved oxygen in a river has a great direct effect on aquatic macroinvertebrates and thereby indirectly influences the region's ecosystem. In this paper, an attempt is made to predict dissolved oxygen in rivers by employing a simple fuzzy logic modeling approach, the Wang-Mendel method. This model uses only previous records to estimate upcoming values. For this purpose, daily and hourly records of eight stations in the Au Sable watershed in Michigan, United States, are employed for 12-year and 50-day periods respectively. Calculations indicate that for long-period prediction it is better to increase the input intervals, whereas for filling missing data it is advisable to decrease the interval. Increasing the partitioning of the input and output features influences accuracy only slightly but makes the model very time consuming, and increasing the number of input data acts in the same way as the number of partitions. A large amount of training data does not essentially improve accuracy, so an optimum training length should be selected.
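
A minimal sketch of the Wang-Mendel step of generating fuzzy rules from input-output records (uniform triangular partitions, one candidate rule per record, conflicts resolved by the highest rule degree); the partition count and the toy dissolved-oxygen series are assumptions:

    import numpy as np

    def tri_memberships(x, centers):
        # Memberships of x in triangular fuzzy sets on a uniform partition.
        width = centers[1] - centers[0]
        return np.clip(1.0 - np.abs(x - centers) / width, 0.0, 1.0)

    def wang_mendel(X, y, in_centers, out_centers):
        # One rule per sample (strongest set per variable); keep the rule with the
        # highest degree when antecedents conflict. Returns {antecedent: consequent}.
        rules, degrees = {}, {}
        for xi, yi in zip(X, y):
            ante, degree = [], 1.0
            for v, centers in zip(xi, in_centers):
                m = tri_memberships(v, centers)
                ante.append(int(m.argmax()))
                degree *= m.max()
            mo = tri_memberships(yi, out_centers)
            key = tuple(ante)
            if degree * mo.max() > degrees.get(key, 0.0):
                rules[key], degrees[key] = int(mo.argmax()), degree * mo.max()
        return rules

    # Toy series: predict the next dissolved-oxygen value from the two previous ones.
    series = 7.0 + np.sin(np.linspace(0, 12, 120)) + np.random.default_rng(0).normal(0, 0.1, 120)
    X, y = np.column_stack([series[:-2], series[1:-1]]), series[2:]
    centers = np.linspace(series.min(), series.max(), 7)    # 7 fuzzy sets per variable
    print(len(wang_mendel(X, y, [centers, centers], centers)), "rules generated")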

Keywords: Dissolved oxygen, Au Sable, fuzzy logic modeling, Wang Mendel.

PDF Downloads: 1840
8908 Kirchhoff’s Depth Migration over Heterogeneous Velocity Models with Ray Tracing Modeling Approach

Authors: Alok Kumar Routa, Priya Ranjan Mohanty

Abstract:

Complex seismic signatures are generated due to the complexity of the subsurface, which makes them difficult to interpret. In the present study, an attempt has been made to model a complex subsurface using the ray tracing modeling technique. In addition, for the imaging of these geological features, Kirchhoff's prestack depth migration is applied to the synthetic common shot gather dataset. It is found that Kirchhoff's migration technique, combined with the ray tracing modeling concept, has the flexibility to image various complex geologies and gives satisfactory results, with proper delineation of the reflectors at their true depth positions. The entire work has been carried out in the MATLAB environment.
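
A minimal constant-velocity sketch of the Kirchhoff summation idea for zero-offset data (each image point accumulates trace amplitudes along its diffraction traveltime curve); the velocity, geometry and synthetic diffractor are placeholders, and the paper's ray-traced traveltimes and prestack geometry are not reproduced:

    import numpy as np

    def kirchhoff_zero_offset(data, xr, dt, v, xi, zi):
        # Diffraction stack: image(x, z) = sum over receivers of the trace amplitude
        # at the two-way time of the straight ray from (x, z) to the receiver.
        nt = data.shape[0]
        image = np.zeros((len(zi), len(xi)))
        for ix, x in enumerate(xi):
            for iz, z in enumerate(zi):
                it = np.round(2.0 * np.sqrt((x - xr) ** 2 + z ** 2) / v / dt).astype(int)
                ok = it < nt
                image[iz, ix] = data[it[ok], np.nonzero(ok)[0]].sum()
        return image

    # Synthetic zero-offset section: one point diffractor at x = 500 m, z = 300 m.
    dt, v = 0.002, 2000.0
    xr = np.arange(0.0, 1000.0, 10.0)                       # receiver positions
    data = np.zeros((500, len(xr)))
    t_diff = 2.0 * np.sqrt((500.0 - xr) ** 2 + 300.0 ** 2) / v
    data[np.round(t_diff / dt).astype(int), np.arange(len(xr))] = 1.0

    image = kirchhoff_zero_offset(data, xr, dt, v,
                                  xi=np.arange(0.0, 1000.0, 10.0), zi=np.arange(0.0, 600.0, 10.0))
    print("image peak at (z, x) index:", np.unravel_index(image.argmax(), image.shape))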

Keywords: Kirchhoff’s migration, Prestack depth migration, Ray tracing modeling, Velocity model.

PDF Downloads: 1329
8907 Mining Multicity Urban Data for Sustainable Population Relocation

Authors: Xu Du, Aparna S. Varde

Abstract:

In this research, we propose to conduct diagnostic and predictive analysis of the key factors and consequences of urban population relocation. To achieve this goal, urban simulation models extract urban development trends, as land use change patterns, from a variety of data sources. The results are treated as part of urban big data, together with other information such as population change and economic conditions. Multiple data mining methods are deployed on this data to analyze nonlinear relationships between parameters. The result determines the driving forces of population relocation with respect to urban sprawl, urban sustainability and their related parameters. This work sets the stage for developing a comprehensive urban simulation model catering to specific questions by targeted users, and contributes towards achieving sustainability as a whole.

Keywords: Data Mining, Environmental Modeling, Sustainability, Urban Planning.

PDF Downloads: 1730