Search results for: Financial Decision Support Systems
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7231

4261 Development of Intelligent Time/Frequency Based Signal Detection Algorithm for Intrusion Detection System

Authors: Waqas Ahmed, S. Sajjad Haider Zaidi

Abstract:

Weak signal detection has been of crucial importance in engineering and scientific applications for the past couple of decades, with applications in areas such as wireless communication, radar, aerospace engineering, and control systems. Weak signal detection usually requires a phase-sensitive detector and a demodulation module to detect and analyze the signal. This article introduces an intrusion detection system (IDS) that can effectively detect a weak signal within a multiplexed signal. By carefully inspecting and analyzing the respective signal, the system can successfully indicate any peripheral intrusion. The IDS is a comprehensive and straightforward approach to detecting and analyzing signals that are weakened and garbled by a low signal-to-noise ratio (SNR). This approach is of significant importance in applications such as peripheral security systems.

Keywords: Data acquisition, fast frequency transforms, LabVIEW software, weak signal detection.

4260 Location Detection of Vehicular Accident Using Global Navigation Satellite Systems/Inertial Measurement Units Navigator

Authors: Neda Navidi, Rene Jr. Landry

Abstract:

Vehicle tracking and accident recognition are of interest to many industries, such as insurance and vehicle rental companies. The main goal of this paper is to detect the location of a car accident by combining two methods: Global Navigation Satellite Systems/Inertial Measurement Units (GNSS/IMU)-based navigation and a vehicle accident detection algorithm. Both are driven by a set of raw measurements obtained from a purpose-built integrated black box using GNSS and inertial sensors. A further concern of this paper is the definition of an accident detection algorithm based on vehicle jerk, used to identify the position of the accident. The results show that, even in GNSS blockage areas, the position of the accident can be detected by GNSS/INS integration with a 50% improvement compared to GNSS alone.
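
The abstract does not spell out the detection rule itself; as a minimal sketch of the jerk-based idea, one can differentiate the measured acceleration and flag samples whose jerk magnitude exceeds a threshold. The 100 Hz rate and the 60 m/s^3 threshold below are illustrative assumptions, not values from the paper.

    import numpy as np

    def detect_accident(accel, dt=0.01, jerk_threshold=60.0):
        """Flag candidate accident instants in an IMU acceleration trace.

        accel: acceleration magnitudes [m/s^2]; dt: sampling step [s] (100 Hz
        assumed); jerk_threshold: |da/dt| above which an event is flagged [m/s^3].
        """
        jerk = np.gradient(accel, dt)   # numerical time-derivative of acceleration
        return np.flatnonzero(np.abs(jerk) > jerk_threshold) * dt  # event times [s]

    # Usage: the GNSS/INS position solution at the returned times gives the
    # estimated accident location.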

Keywords: Driving behavior, integration, IMU, GNSS, monitoring, tracking.

4259 Memristor: A Promising Candidate for Neural Circuits in Neuromorphic Computing Systems

Authors: Juhi Faridi, Mohd. Ajmal Kafeel

Abstract:

Advancements in the field of Artificial Intelligence (AI) and in technology have led to an intelligent era. Neural networks, with computational power and learning ability similar to the brain's, are one of the key AI technologies. A neuromorphic computing system (NCS) consists of synaptic devices, neuronal circuits, and a neuromorphic architecture. Memristors are a promising candidate for NCSs, but the conductance behavior of synaptic and neuronal memristors needs to be studied thoroughly in order to connect the devices to the underlying neuroscience and computer science. Furthermore, more simulation work is needed to exploit existing device properties and to guide the development of future devices toward different performance requirements. This work aims to provide insight into building neuronal circuits from memristors in order to achieve a memristor-based NCS. We review the research conducted on memristor-based analog and digital circuits so as to motivate research on NCSs built from memristor-based neural circuits for advanced AI applications. This survey describes the key findings about memristors and their analog and digital circuits over the years, findings that can be utilized in implementing the neuronal circuits of an NCS, and is intended to help electronic circuit designers understand how research on memristors has progressed and how it can be applied to the neuronal circuits required for recent progress in NCSs.

Keywords: Analog circuits, digital circuits, memristors, neuromorphic computing systems.

4258 E-Learning Methodology Development Using Modeling

Authors: Sarma Cakula, Maija Sedleniece

Abstract:

Simulation and modeling programs are concerned with the construction of models for analyzing different perspectives and possibilities in a changing environment. The paper presents the theoretical justification and evaluation of a qualitative e-learning development model in the perspective of advancing modern technologies. The principles of qualitative e-learning in higher education, the productivity of the study process using modern technologies, different kinds of methods, and the future perspectives of e-learning in formal education are analyzed. A theoretically grounded and practically tested model for developing e-learning methods using different technologies for different types of classroom has been worked out; it can be used in a professor's decision-making process to choose the most effective e-learning methods.

Keywords: E-learning, modeling, E-learning methods development, personal knowledge management

4257 An Atomic-Domains-Based Approach for Attack Graph Generation

Authors: Fangfang Chen, Chunlu Wang, Zhihong Tian, Shuyuan Jin, Tianle Zhang

Abstract:

Attack graphs are an integral part of modeling an overview of network security. System administrators use attack graphs to determine how vulnerable their systems are and what security measures to deploy in their defense. Previous methods for attack graph generation (AGG) target the whole network at once, which makes the process complex and non-scalable. In this paper, we propose a simple and scalable approach to AGG that decomposes the whole network into atomic domains, each representing a host with a specific privilege; AGG is then achieved through communications among the atomic domains. Our approach simplifies the design process for the whole network and yields attack graphs that include each attack path for each host. When the network changes, only the operations of the corresponding atomic domains need to be carried out, which makes the process scalable.
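
The decomposition is defined formally in the paper; purely as an illustration of the idea, the sketch below represents each atomic domain as a (host, privilege) node and grows the attack graph by breadth-first propagation between domains. The exploit table is invented for the example.

    from collections import deque

    # Atomic domains are (host, privilege) pairs; an exploit usable from one
    # domain grants another domain. This table is a made-up example.
    exploits = {
        ("web", "user"): [("web", "root"), ("db", "user")],
        ("web", "root"): [("db", "user")],
        ("db", "user"):  [("db", "root")],
    }

    def generate_attack_graph(start):
        """Return the attack-graph edges reachable from the start domain."""
        edges, seen, queue = [], {start}, deque([start])
        while queue:
            domain = queue.popleft()
            for nxt in exploits.get(domain, []):
                edges.append((domain, nxt))
                if nxt not in seen:        # only newly reached domains are expanded,
                    seen.add(nxt)          # so a local network change only reruns
                    queue.append(nxt)      # the affected atomic domains
        return edges

    print(generate_attack_graph(("web", "user")))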

Keywords: Atomic domain, vulnerability, attack graph generation, computer security.

4256 Improvement of the Q-System Using the Rock Engineering System: A Case Study of Water Conveyor Tunnel of Azad Dam

Authors: S. Golmohammadi, M. Noorian Bidgoli

Abstract:

Because the status and mechanical parameters of discontinuities in the rock mass enter the calculations, rock engineering classification methods are often used as a starting point for the design of different types of structures. The Q-system is one of the most frequently used methods for stability analysis and for determining the support systems of underground structures in rock, including tunnels. The method requires six main rock mass parameters: the Rock Quality Designation (RQD), joint set number (Jn), joint roughness number (Jr), joint alteration number (Ja), joint water parameter (Jw) and Stress Reduction Factor (SRF). To achieve a reasonable and optimal design, identifying the parameters that govern the stability of such structures is one of the most important goals in rock engineering, so it is necessary to study the relationships between the parameters of a system, how they interact with each other and, ultimately, the whole system. In this research, we attempt to determine the most effective (key) parameters among the six Q-system parameters using the Rock Engineering System (RES) method, in order to improve the relationships between the parameters in the calculation of the Q value. The RES is a method by which the degree of cause and effect of a system's parameters can be determined by building an interaction matrix. Here, the geomechanical data collected from the water conveyor tunnel of Azad Dam were used to build the interaction matrix of the Q-system. Instead of conventional coding methods, which always carry defects such as uncertainty, the Q-system interaction matrix is coded using a statistical analysis of the data that determines the correlation coefficients between parameters, so the effect of each parameter on the system is evaluated with greater certainty. The results show that the resulting interaction matrix provides a reasonable estimate of the effective parameters in the Q-system: among the six parameters, SRF and Jr have, respectively, the greatest and least influence on the system (cause), while RQD and Jw are, respectively, the most and least influenced by the system (effect). By developing this method, a more accurate rock mass classification relation can be obtained by weighting the required parameters in the Q-system.
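
As a minimal sketch of coding an interaction matrix statistically (random placeholder data stand in for the Azad Dam geomechanical logs, and coding the off-diagonal entries as absolute correlation coefficients is an assumption of this illustration):

    import numpy as np

    params = ["RQD", "Jn", "Jr", "Ja", "Jw", "SRF"]
    rng = np.random.default_rng(0)
    data = rng.normal(size=(200, 6))   # placeholder for measured Q-system inputs

    corr = np.abs(np.corrcoef(data, rowvar=False))  # off-diagonal coding
    np.fill_diagonal(corr, 0.0)

    cause = corr.sum(axis=1)    # row sums: how strongly a parameter drives the system
    effect = corr.sum(axis=0)   # column sums: how strongly it is driven by the system
    weight = (cause + effect) / (cause + effect).sum()  # normalized parameter weights

    for p, c, e, w in zip(params, cause, effect, weight):
        print(f"{p:>3}: cause={c:.2f}  effect={e:.2f}  weight={w:.3f}")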

Keywords: Q-system, Rock Engineering System, statistical analysis, rock mass, tunnel.

4255 Parallel Double Splicing on Iso-Arrays

Authors: V. Masilamani, D.K. Sheena Christy, D.G. Thomas

Abstract:

Image synthesis is an important area of image processing, and various systems for synthesizing images have been proposed in the literature. In this paper, we propose a bio-inspired system for image synthesis and, to study its generating power, we define the class of languages generated by it. An image is treated here as an array, and a primitive called an iso-array is used to synthesize it; the operation is double splicing on iso-arrays. The double splicing operation is used in DNA computing, and we apply it here to image synthesis. The family of languages generated by the proposed self-restricted double splicing systems on iso-arrays is compared with the existing family of local iso-picture languages. Certain closure properties, such as union, concatenation and rotation, are studied for the family of languages generated by the proposed model.

Keywords: DNA computing, splicing system, iso-picture languages, iso-array double splicing system, iso-array self splicing.

4254 Prediction-Based Midterm Operation Planning for Energy Management of Exhibition Hall

Authors: Doseong Eom, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

Large exhibition halls require a great deal of energy to maintain a comfortable atmosphere for the visitors inside. One way of reducing the energy cost is to install thermal energy storage systems, so that thermal energy can be stored in the middle of the night when the energy price is low and used later when the price is high. To minimize the overall energy cost, however, we must be able to decide how much energy to store and during exactly which time periods. If future energy load and the corresponding cost can be foreseen, such decisions can be made reasonably. In this paper, we use machine learning techniques to obtain models for predicting weather conditions and the number of visitors on an hourly basis for the next day. Based on the energy load thus predicted, we build a cost-optimal daily operation plan for the thermal energy storage systems and the cooling and heating facilities through simulation-based optimization.
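
Neither the learned models nor the optimizer are specified beyond this summary; as a minimal sketch under stated assumptions, an hourly load can be predicted by least squares from weather and visitor-count features, and discharge hours read off the predicted peaks. The feature set, the synthetic data, and the simple storage rule are all illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    # Assumed feature layout: [outdoor temperature, humidity, visitor count].
    X = rng.uniform([0, 30, 0], [35, 90, 5000], size=(1000, 3))
    y = 50 + 4.0 * X[:, 0] + 0.2 * X[:, 1] + 0.05 * X[:, 2]  # synthetic load [kWh]

    A = np.c_[np.ones(len(X)), X]                 # linear stand-in for the learned model
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    tomorrow = rng.uniform([0, 30, 0], [35, 90, 5000], size=(24, 3))
    load = np.c_[np.ones(24), tomorrow] @ coef    # predicted hourly load for next day

    # Toy rule: charge storage at night, discharge at the 4 highest predicted hours
    # (the paper derives the plan by simulation-based optimization instead).
    print("discharge at hours:", sorted(np.argsort(load)[-4:].tolist()))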

Keywords: Building energy management, machine learning, simulation-based optimization, operation planning.

4253 The Performance of Disbursement Procedure of Public Works in Thailand

Authors: Israngkura Na Ayudhya B, Kunishima M.

Abstract:

This paper analyses the performance of the disbursement procedure for public works projects in Thailand. The results were summarized from contracts, submitted invoices, inspection dates, and copies of disbursement dates between clients and their main contractors, as well as from interviews with persons involved in central and local government projects in Thailand during 1994-2008. The data were collected to investigate how the disbursement procedure relates to disbursement performance during the construction period (planned contract duration against the actual execution date in each month). A graphical duration analysis of the projects illustrates significant disbursement patterns in each project. It was established that staff shortages, the financial stability of clients, bureaucracy, the method of disbursement, and the economic situation play a major role in the performance of disbursement to main contractors.

Keywords: Construction disbursement, Payment procedure, Public works

4252 Solid Circulation Rate and Gas Leakage Measurements in Interconnected Bubbling Fluidized Beds

Authors: Ho-Jung Ryu, Seung-Yong Lee, Young Cheol Park, Moon-Hee Park

Abstract:

Two-interconnected fluidized bed systems are widely used in various processes such as Fischer-Tropsch synthesis, hot gas desulfurization, CO2 capture-regeneration with dry sorbents, chemical-looping combustion, sorption-enhanced steam methane reforming, and chemical-looping hydrogen generation. However, most two-interconnected fluidized bed systems require a riser and/or a pneumatic transport line for solid conveying, and loopseals or seal-pots for gas sealing, recirculation of solids to the riser, and maintenance of the pressure balance. The riser (transport bed) is operated under high-velocity fluidization conditions, and the residence times of gas and solid in the riser are very short. If the reaction rate of the catalyst or sorbent is slow, the riser cannot ensure sufficient gas-solid contact time, and two bubbling beds, one for each reaction, must be used instead; in that case an additional riser must still be installed for solid circulation. Consequently, conventional two-interconnected fluidized bed systems are complex, large, and difficult to operate. To solve these problems, a novel two-interconnected fluidized bed system has been developed, consisting of two bubbling beds, solid injection nozzles, solid conveying lines, and downcomers. In this study, the effects of operating variables on the solid circulation rate and on gas leakage between the two beds were investigated in a cold-mode two-interconnected fluidized bed system. Moreover, long-term operation with continuous solid circulation for up to 60 hours was performed to check the feasibility of stable operation.

Keywords: Fluidized bed, Gas leakage, Long-term operation, Solid circulation.

4251 Risk Management Approach for a Secure and Performant Integration of Automated Drug Dispensing Systems in Hospitals

Authors: Hind Bouami, Patrick Millot

Abstract:

A medication dispensing system is a life-critical system whose failure may result in preventable adverse events, leading to longer patient stays in hospitals or to patient death. Automation has led to great improvements in life-critical systems, increasing safety, efficiency, and comfort. However, critical risks related to the complexity of medical organizations and to the integration of automated solutions can threaten the security and performance of drug dispensing. Knowledge about the system's complexity and about the human-machine parameters that must be controlled for the security and performance of automated equipment helps operators secure their automation process and optimize their system's reliability. In this context, this study documents operators' situation awareness of automation risks and of the parameters involved in automation security and performance. Our risk management approach was deployed in the pharmacy of the North Luxembourg hospital center, which has been equipped with automated drug dispensing systems since 2009. With more than 4 million euros of gains generated, the success of the North Luxembourg hospital center was enabled by management commitment, the pharmacy's involvement in the implementation and improvement of the automation project, and close collaboration between the pharmacy and the firm Sinteco to implement the innovation and organizational actions necessary for the secure and performant integration of the automated solutions. The actions implemented by the hospital and the parameters involved in the security and performance of the automated equipment's integration were analyzed: the parameters to control are human aspects (6.25%), technical aspects (50%), and human-machine interaction (43.75%). Implementing an anthropocentric analysis system before automation would have helped prevent the risks related to automation and optimize their control.

Keywords: Automated drug delivery systems, hospitals, human-centered automated system, risk management.

4250 The Impact of Metacognitive Knowledge and Experience on Top Management Team Diversity and Small to Medium Enterprises Performance

Authors: Jo Rhodes, Peter Lok, Zahra Sadeghinejad

Abstract:

The aim of this study is to determine the impact of metacognition on top management team members and on firm performance, based on full team integration. A survey of 1500 small to medium enterprises (SMEs) was initiated, and 140 responses were obtained (a response rate of 9%). The results showed that the different metacognitive abilities of managers (knowledge and experience) can enhance team decision-making and problem solving, resulting in greater firm performance. This is a significant finding for SMEs, because these organizations have small teams with owner leadership and an entrepreneurial orientation.

Keywords: Metacognition, behavioural integration, top management team, performance.

4249 Knowledge Audit Model for Requirement Elicitation Process

Authors: Laleh Taheri, Noraini C. Pa., Rusli Abdullah, Salfarina Abdullah

Abstract:

Knowledge plays an important role in the success of any organization. Software development organizations are highly knowledge-intensive, especially in their requirement elicitation process (REP). There are several problems with communicating and using knowledge in the REP, such as misunderstanding, being out of scope, conflicting information, and changing requirements; all of these occur when transmitting requirements knowledge during the REP. Several studies on the REP have attempted to solve these problems. Knowledge Audit (KA) approaches have been proposed for managing knowledge in human resources, finance, and manufacturing, but there is a lack of studies applying KA to the requirement elicitation process. Therefore, this paper proposes a KA model for the REP that supports the acquisition of good requirements.

Keywords: Knowledge Audit, Requirement Elicitation Process, KA Model, Knowledge in Requirement Elicitation.

4248 Migration from Commercial to In-House Developed Learning Management Systems

Authors: Lejla A. Bexheti, Visar S. Shehu, Adrian A. Besimi

Abstract:

Learning Management Systems (LMS) provide a learning environment offering a collection of e-learning tools in a package with a common interface and information sharing among the tools. South East European University's initial LMS experience was with the commercial LMS ANGEL. After three years of using ANGEL, it was decided, because of the very high expenses, to develop our own software. As part of the research project team for the in-house design and development of the new LMS, we first had to select the features that would cover our needs and comply with current trends in software development, and then design and develop the system. In this paper we present the process of in-house LMS development at South East European University, its architecture, conception and strengths, with special emphasis on the process of migration and integration with other enterprise applications.

Keywords: e-learning tools, LMS, migration, user feedback.

4247 Adjusted LOLE and EENS Indices for the Consideration of Load Excess Transfer in Power Systems Adequacy Studies

Authors: F. Vallée, J-F. Toubeau, Z. De Grève, J. Lobry

Abstract:

When evaluating the capacity of a generation park to cover the load in transmission systems, the traditional Loss of Load Expectation (LOLE) and Expected Energy Not Served (EENS) indices can be used. While those indices quantify the annual duration and severity of load non-covering situations, they do not take into account the fact that load excess is generally shifted from one penury state (hour or quarter of an hour) to the following one. In this paper, a sequential Monte Carlo framework is introduced in order to compute adjusted LOLE and EENS indices. In practice, those adapted indices account for the effect of load excess transfer on the global adequacy of a generation park, thus providing a more accurate evaluation of this quantity.
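
As a minimal sketch of the adjustment (the two-state generation model and the uniform load model are invented placeholders; the point is only that any uncovered load is carried into the next hour before the indices are accumulated):

    import numpy as np

    rng = np.random.default_rng(2)
    HOURS, YEARS = 8760, 100

    lole_h = eens_mwh = 0.0
    for _ in range(YEARS):
        gen = rng.choice([900.0, 600.0], size=HOURS, p=[0.95, 0.05])  # outage model [MW]
        load = 500.0 + 250.0 * rng.random(HOURS)                      # load model [MW]
        carry = 0.0                                # load excess shifted from last hour
        for g, l in zip(gen, load):
            deficit = l + carry - g
            if deficit > 0.0:
                lole_h += 1.0
                eens_mwh += deficit                # 1 h steps: MW deficit -> MWh
                carry = deficit                    # transfer the excess forward
            else:
                carry = 0.0

    print(f"adjusted LOLE = {lole_h / YEARS:.1f} h/yr, "
          f"adjusted EENS = {eens_mwh / YEARS:.0f} MWh/yr")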

Keywords: Expected Energy not Served, Loss of Load Expectation, Monte Carlo simulation, reliability, wind generation.

4246 Modeling and Simulation of Flow Shop Scheduling Problem through Petri Net Tools

Authors: Joselito Medina Marin, Norberto Hernández Romero, Juan Carlos Seck Tuoh Mora, Erick S. Martinez Gomez

Abstract:

The Flow Shop Scheduling Problem (FSSP) is a typical problem faced by production planning managers in Flexible Manufacturing Systems (FMS). It consists of finding the optimal schedule for a set of jobs processed on a set of machines or shared resources, where all jobs visit the machines in the same sequence. As in all scheduling problems, the makespan can be obtained by drawing a Gantt chart according to the operation order, among other alternatives. An FMS presenting the FSSP can thus be modeled by Petri nets (PNs), a powerful tool that has been used to model and analyze discrete-event systems, and the makespan can then be obtained by simulating the PN through token-game animation and the incidence matrix. In this work, we present an adaptive PN to obtain the makespan of the FSSP by applying PN analytical tools.
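
The incidence-matrix simulation rests on the standard Petri net state equation M' = M0 + C·u; a minimal sketch with an invented two-transition net:

    import numpy as np

    # Incidence matrix C (places x transitions) for a toy line:
    # t0 moves a job token p0 -> p1, t1 moves it p1 -> p2.
    C = np.array([[-1,  0],
                  [ 1, -1],
                  [ 0,  1]])

    M0 = np.array([1, 0, 0])   # initial marking: one job token in p0
    u = np.array([1, 1])       # firing-count vector: fire t0 once, then t1 once

    M = M0 + C @ u             # Petri net state equation
    print(M)                   # -> [0 0 1]: the job token has reached p2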

Keywords: Flow-shop scheduling problem, makespan, Petri nets, state equation.

4245 Low Cost IMU/GPS Integration Using Kalman Filtering for Land Vehicle Navigation Application

Authors: Othman Maklouf, Abdurazag Ghila, Ahmed Abdulla, Ameer Yousef

Abstract:

Land vehicle navigation technology is a subject of great interest today, and the Global Positioning System (GPS) is a common choice for positioning in such systems. However, GPS alone is incapable of providing continuous and reliable positioning, because of its inherent dependency on external electromagnetic signals. Inertial navigation uses inertial sensors to determine the position and orientation of a vehicle; as such, it has unbounded error growth, since the error accumulates at each step, and some form of external aiding is required to contain these errors. The availability of low-cost Micro-Electro-Mechanical-System (MEMS) inertial sensors now makes it feasible to develop an Inertial Navigation System (INS) using an inertial measurement unit (IMU) in conjunction with GPS to fulfill the demands of such systems. IMUs are typically very expensive; this INS, however, uses low-cost components. Unfortunately, with low cost comes low performance, which is the main reason for including GPS and Kalman filtering in the system. The aim of this paper is to develop a GPS/MEMS-INS integrated system able to provide a navigation solution with accuracy levels appropriate for land vehicle navigation. The primary equipment was a MEMS-based Crista IMU (from Cloud Cap Technology Inc.) and a Garmin GPS 18 PC (both receiver and antenna). The integration of GPS with INS is implemented using a Kalman filter in loosely coupled mode: the INS error states, together with the navigation states (position, velocity, and attitude) and other unknown parameters of interest, are estimated using GPS measurements. All important navigation equations are presented, along with discussion.
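
As a minimal sketch of loose coupling on a single axis (a position/velocity state predicted with the accelerometer and corrected by GPS position fixes; all rates, noise levels, and the constant-acceleration scenario are assumptions of this illustration, not parameters from the paper):

    import numpy as np

    dt = 0.1                                  # assumed IMU step [s]
    F = np.array([[1.0, dt], [0.0, 1.0]])     # state: [position, velocity]
    B = np.array([0.5 * dt**2, dt])           # accelerometer input mapping
    H = np.array([[1.0, 0.0]])                # GPS measures position only
    Q = np.diag([1e-4, 1e-3])                 # process noise (tuning assumption)
    R = np.array([[4.0]])                     # GPS position variance [m^2]

    x, P = np.zeros(2), np.eye(2)
    rng = np.random.default_rng(3)
    for k in range(100):
        x = F @ x + B * 0.2                   # INS prediction (0.2 m/s^2 input)
        P = F @ P @ F.T + Q
        if k % 10 == 0:                       # GPS arrives at a tenth of the IMU rate
            true_pos = 0.5 * 0.2 * ((k + 1) * dt) ** 2
            z = true_pos + rng.normal(0.0, 2.0)
            y = z - H @ x                     # innovation: GPS minus INS position
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
            x = x + K @ y                     # correct the drifting INS estimate
            P = (np.eye(2) - K @ H) @ P

    print(f"fused position {x[0]:.2f} m, velocity {x[1]:.2f} m/s")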

Keywords: GPS, IMU, Kalman Filter.

4244 A Robust Image Watermarking Scheme Using Image Moment Normalization

Authors: Latha Parameswaran, K. Anbumani

Abstract:

Multimedia security is a significantly important area of concern. A number of papers on robust digital watermarking have been presented, but no standards have been defined so far, so multimedia security remains an open problem. The aim of this paper is to design a robust image watermarking scheme that can withstand a varied set of attacks. The proposed scheme integrates image moment normalization, a content-dependent watermark, and the discrete wavelet transform. Moment normalization makes it possible to recover the watermark even under geometric attacks; content-dependent watermarks are a powerful means of authentication, as the data is watermarked with its own features; and discrete wavelet transforms are used because they describe image features well. The proposed scheme finds its place in validating identification cards and financial instruments.
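
The paper's exact normalization is not reproduced in the abstract; as a sketch of the underlying geometric moments, translation invariance comes from centering on the image centroid and scale invariance from dividing by a power of the zeroth moment:

    import numpy as np

    def normalized_moment(img, p, q):
        """Translation- and scale-invariant central moment eta_pq of a grayscale image."""
        ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
        m00 = img.sum()
        xbar, ybar = (xs * img).sum() / m00, (ys * img).sum() / m00    # centroid
        mu_pq = (((xs - xbar) ** p) * ((ys - ybar) ** q) * img).sum()  # central moment
        return mu_pq / m00 ** (1 + (p + q) / 2)                        # scale normalization

    img = np.zeros((64, 64))
    img[20:40, 10:50] = 1.0   # toy image: a bright rectangle
    print(normalized_moment(img, 2, 0), normalized_moment(img, 0, 2))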

Keywords: Watermarking, moments, wavelets, content-based, benchmarking.

4243 A DEA Model for Performance Evaluation in the Presence of Time Lag Effect

Authors: Yanshuang Zhang, Byungho Jeong

Abstract:

Data Envelopment Analysis (DEA) is a methodology that computes efficiency values for decision making units (DMUs) in a given period by comparing their outputs with their inputs. In many cases, there is a time lag between the consumption of inputs and the production of outputs, and for a long-term research project the production lead-time phenomenon is hard to avoid. This time lag effect should be considered when evaluating the performance of organizations. This paper suggests a model for calculating efficiency values in performance evaluation problems with time lag. In the experimental part, the proposed methods are compared with the CCR model and an existing time lag model, using the data set of the 21st Century Frontier R&D Program, a long-term national R&D program of Korea.
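
The time-lag formulation itself is not given in the abstract; for reference, the baseline CCR efficiency of one DMU can be computed with a small linear program in multiplier form (the three-DMU data set is invented):

    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])  # rows = DMUs, inputs
    Y = np.array([[1.0], [1.0], [1.5]])                 # outputs

    def ccr_efficiency(k):
        """Input-oriented CCR: max u'y_k  s.t.  v'x_k = 1,  u'Y_j - v'X_j <= 0."""
        n_in, n_out = X.shape[1], Y.shape[1]
        c = np.concatenate([np.zeros(n_in), -Y[k]])            # linprog minimizes
        A_ub = np.hstack([-X, Y])                              # u'y_j - v'x_j <= 0
        A_eq = np.concatenate([X[k], np.zeros(n_out)])[None]   # v'x_k = 1
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(X)), A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (n_in + n_out))
        return -res.fun                                        # efficiency in (0, 1]

    print([round(ccr_efficiency(k), 3) for k in range(len(X))])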

Keywords: DEA, Efficiency, Time Lag

4242 End-to-End Process to Automate a Batch Application

Authors: Nagmani Lnu

Abstract:

Quality engineering often refers to testing applications that have either a User Interface (UI) or an Application Programming Interface (API), and mature test practices, standards, and automation exist for UI and API testing. However, another kind of application is present in almost all industries that deal with data in bulk: the batch application, primarily an offline application that companies develop to process large data sets, often against multiple business rules. The challenge becomes more prominent when we try to automate batch testing. This paper describes the approach taken to test a batch application from the financial industry covering the payment settlement process (a critical use case in all kinds of FinTech companies), resulting in 100% test automation in both test creation and test execution. The same approach can be followed for other batch use cases to achieve higher efficiency in the testing process.

Keywords: Batch testing, batch test automation, batch test strategy, payments testing, payments settlement testing.

4241 Planning Rigid Body Motions and Optimal Control Problem on Lie Group SO(2, 1)

Authors: Nemat Abazari, Ilgin Sager

Abstract:

In this paper, smooth trajectories are computed in the Lie group SO(2, 1), treating motion planning as assigning a Frenet frame to the rigid body system and optimizing the cost function of the elastic energy spent to track a timelike curve in Minkowski space. A method is proposed to solve a motion planning problem that minimizes the integral of the Lorentz inner product of the Darboux vector of a timelike curve. The method uses the coordinate-free Maximum Principle of optimal control and leads into the theory of integrable Hamiltonian systems. The presence of several conserved quantities inherent in these Hamiltonian systems aids the explicit computation of the rigid body motions.
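
In symbols (reconstructed from the abstract's wording rather than taken from the paper), the quantity being minimized has the form

    \[
      J(\gamma) \;=\; \frac{1}{2}\int_{0}^{T} \langle W(t),\, W(t)\rangle_{L}\, \mathrm{d}t,
      \qquad
      \langle a, b\rangle_{L} = a_{1}b_{1} + a_{2}b_{2} - a_{3}b_{3},
    \]

where W(t) is the Darboux vector of the Frenet frame along the timelike curve gamma(t) and the second expression is the Lorentz inner product on Minkowski 3-space.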

Keywords: Optimal control, Hamiltonian vector field, Darboux vector, maximum principle, Lie group, rigid body motion, Lorentz metric.

4240 Process Optimization and Automation of Information Technology Services in a Heterogenic Digital Environment

Authors: Tasneem Halawani, Yamen Khateeb

Abstract:

With customers' ever-increasing expectations of fast service provisioning for all their business needs, information technology (IT) organizations, as business partners, have to cope with this demanding environment and deliver their services in the most effective and efficient way. The purpose of this paper is to identify optimization and automation opportunities for the most frequently requested IT services in a heterogenic digital environment with a widely spread customer base. In collaboration with systems, process, and subject matter experts (SMEs), the processes in scope were approached by analyzing four years of related historical data, identifying and surveying stakeholders, modeling the as-is processes, and studying systems integration and automation capabilities. This effort identified several pain areas, including standardization, unnecessary customer and IT involvement, manual steps, systems integration, and performance measurement. These pain areas were addressed by standardizing the top five requested IT services, eliminating or automating 43 steps, and utilizing a single platform for end-to-end process execution. In conclusion, optimizing IT service request processes in a heterogenic digital environment with a widely spread customer base is challenging yet achievable without compromising service quality or customers' added value. Further studies could measure the value of the eliminated or automated process steps to quantify the enhancement impact. Moreover, a similar approach can be applied to optimize other IT service requests, with a focus on business criticality.

Keywords: Automation, customer value, heterogenic, integration, IT services, optimization, processes.

4239 Comparison of Inter Cell Interference Coordination Approaches

Authors: Selma Sbit, Mohamed Bechir Dadi, Belgacem Chibani Rhaimi

Abstract:

This work compares various techniques used to mitigate Inter-Cell Interference (ICI) in Long Term Evolution (LTE) and LTE-Advanced systems by evaluating the performance of each. In mobile communication networks, systems are limited by ICI, particularly that caused by the deployment of small cells alongside conventional cells. Various mitigation techniques have therefore been proposed: Inter-Cell Interference Coordination (ICIC), enhanced Inter-Cell Interference Coordination (eICIC), and Coordinated Multi-Point transmission and reception (CoMP). This paper presents a comparative study of these strategies. It can be concluded that CoMP techniques improve the SINR and system capacity compared to ICIC and eICIC: in the simulations, the SINR reaches 15 dB at a distance of 0.5 km between the user equipment and the serving base station with CoMP, whereas it does not exceed 12 dB and 9 dB for the eICIC and ICIC approaches, respectively.

Keywords: 4th generation, interference, coordination, ICIC.

4238 Ecological Networks: From Structural Analysis to Synchronization

Authors: N. F. F. Ebecken, G. C. Pereira

Abstract:

Ecological systems are exposed to and influenced by various natural and anthropogenic disturbances, which produce various effects and states as the systems seek a state of global phase coherence, or stability and balance of their food webs. This research project addresses the development of a computational methodology for modeling plankton food webs. The use of algorithms to establish connections, the generation of representative fuzzy multigraphs, and the application of complex-network analysis techniques provide a set of tools for defining, analyzing and evaluating the community structure of coastal aquatic ecosystems, as well as for estimating possible external impacts on the networks. This study thus aims to develop computational systems and data models to assess how these ecological networks are structurally and functionally organized, to analyze the types and degree of compartmentalization and synchronization between the oscillatory, interconnected elements of the network, and to assess the influence of disturbances on the overall pattern of rhythmicity of the system.

Keywords: Ecological networks, plankton food webs, fuzzy multigraphs, dynamic of networks.

4237 Success Factors of Large Scale ERP Implementation in Thailand

Authors: Siriluck Rotchanakitumnuai

Abstract:

The objective of the study is to examine the determinants of success in large-scale ERP implementation. The results indicate that large-scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organizational readiness.

Keywords: large scale ERP, implementation success factors, Thailand

4236 Job Shop Scheduling: Classification, Constraints and Objective Functions

Authors: Majid Abdolrazzagh-Nezhad, Salwani Abdullah

Abstract:

The job-shop scheduling problem (JSSP) is an important decision problem facing those involved in industry, economics and management. It is a class of combinatorial optimization problem known to be NP-hard. JSSPs deal with a set of machines and a set of jobs with various predetermined routes through the machines, where the objective is to assemble a schedule of jobs that minimizes certain criteria such as makespan, maximum lateness, and total weighted tardiness. Over the past several decades, interest in meta-heuristic approaches to JSSPs has increased, owing to the ability of these approaches to generate solutions better than those generated by heuristics alone. This article provides the classifications, constraints and objective functions imposed on JSSPs that are available in the literature.

Keywords: Job-shop scheduling, classification, constraints, objective functions.

4235 Modeling Biology Inspired Reactive Agents Using X-machines

Authors: George Eleftherakis, Petros Kefalas, Anna Sotiriadou, Evangelos Kehris

Abstract:

Recent advances in both the testing and the verification of software based on formal specifications have reached a point where the ideas can be applied powerfully in the design of agent-based systems. Software engineering research has highlighted a number of important issues: the importance of the type of modeling technique used; the careful design of the model to enable powerful testing techniques to be used; the automated verification of the behavioural properties of the system; and the need for a mechanism to translate the formal models into executable software in a simple and transparent way. This paper introduces the X-machine formalism as a tool for modeling biology-inspired agents, proposing the use of the techniques built around X-machine models for the construction of effective and reliable agent-based software systems.

Keywords: Biology inspired agent, formal methods, x-machines.

4234 EML-Estimation of Multivariate t Copulas with Heuristic Optimization

Authors: Jin Zhang, Wing Lon Ng

Abstract:

In recent years, copulas have become very popular in financial research and actuarial science, as they are more flexible in modelling the co-movements and relationships of risk factors than the conventional linear (Pearson) correlation coefficient. However, precise estimation of the copula parameters is vital in order to correctly capture the (possibly nonlinear) dependence structure and joint tail events. In this study, we employ two optimization heuristics, namely Differential Evolution and Threshold Accepting, to tackle the parameter estimation of multivariate t distribution models in the EML (exact maximum likelihood) approach. Since the evolutionary optimizer does not rely on gradient search, the EML approach can be applied to the estimation of more complicated copula models, such as high-dimensional copulas. Our experimental study shows that the proposed method provides more robust and more accurate estimates than the IFM approach.
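
As a minimal sketch of the heuristic-EML idea in the bivariate case (scipy's differential_evolution stands in for the paper's optimizers, the data are simulated, and the closed-form bivariate t-copula log density is written out directly):

    import numpy as np
    from scipy.optimize import differential_evolution
    from scipy.special import gammaln
    from scipy.stats import t

    rng = np.random.default_rng(4)
    rho0, nu0 = 0.6, 5.0                        # true parameters for the simulation
    L = np.linalg.cholesky([[1.0, rho0], [rho0, 1.0]])
    z = rng.standard_normal((500, 2)) @ L.T
    w = nu0 / rng.chisquare(nu0, size=500)      # t = normal * sqrt(nu/chi2_nu)
    u = t.cdf(z * np.sqrt(w)[:, None], df=nu0)  # pseudo-observations in (0,1)^2

    def neg_loglik(theta):
        rho, nu = theta
        x = t.ppf(u, df=nu)                     # back to the t scale
        q = (x[:, 0]**2 - 2*rho*x[:, 0]*x[:, 1] + x[:, 1]**2) / (nu * (1 - rho**2))
        # bivariate t log density minus the two marginal t log densities
        log_c = (gammaln((nu + 2) / 2) + gammaln(nu / 2) - 2 * gammaln((nu + 1) / 2)
                 - 0.5 * np.log(1 - rho**2) - (nu + 2) / 2 * np.log1p(q)
                 + (nu + 1) / 2 * (np.log1p(x[:, 0]**2 / nu) + np.log1p(x[:, 1]**2 / nu)))
        return -log_c.sum()

    res = differential_evolution(neg_loglik, bounds=[(-0.99, 0.99), (2.1, 30.0)], seed=0)
    print("estimated (rho, nu):", res.x)        # should be close to (0.6, 5.0)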

Keywords: Copula Models, Student t Copula, Parameter Inference, Differential Evolution, Threshold Accepting.

4233 Sampling of Variables in Discrete-Event Simulation Using the Example of Inventory Evolutions in Job-Shop Systems Based on Deterministic and Non-Deterministic Data

Authors: Bernd Scholz-Reiter, Christian Toonen, Jan Topi Tervo, Dennis Lappe

Abstract:

Time series analysis often requires data that represent the evolution of an observed variable in equidistant time steps, and sampling is applied to collect such data. While continuous signals may be sampled, analyzed and reconstructed by applying Shannon's sampling theorem, time-discrete signals have to be dealt with differently. In this article, we consider the discrete-event simulation (DES) of job-shop systems and study the effects of different sampling rates on data quality, regarding the completeness and accuracy of reconstructed inventory evolutions; deterministic as well as non-deterministic behavior of system variables is discussed. Error curves are deployed to illustrate and discuss the impact of the sampling rate and to derive recommendations for its well-founded choice.
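
As a minimal sketch of an error curve (the piecewise-constant inventory trace is a made-up event sequence, and reconstruction between samples is by zero-order hold):

    import numpy as np

    rng = np.random.default_rng(5)
    event_times = np.sort(rng.uniform(0.0, 100.0, size=400))   # event instants
    inventory = np.cumsum(rng.choice([-1, 1], size=400))       # level after each event

    def level_at(times):
        """Inventory level at the given times (0 before the first event)."""
        idx = np.searchsorted(event_times, times, side="right") - 1
        return np.where(idx >= 0, inventory[np.clip(idx, 0, None)], 0)

    t_ref = np.linspace(0.0, 100.0, 10001)
    truth = level_at(t_ref)
    for dt in [0.1, 0.5, 1.0, 5.0]:                            # candidate sampling steps
        t_s = np.arange(0.0, 100.0 + dt, dt)
        recon = level_at(t_s)[np.searchsorted(t_s, t_ref, side="right") - 1]
        err = np.mean(np.abs(recon - truth))                   # one point of the error curve
        print(f"dt={dt:>4}: mean abs reconstruction error = {err:.3f}")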

Keywords: Discrete-event simulation, job-shop system, sampling rate.

4232 Hybrid Neural Network Methods for Lithology Identification in the Algerian Sahara

Authors: S. Chikhi, M. Batouche, H. Shout

Abstract:

In this paper, we combine a probabilistic neural method with radial-basis functions in order to construct the lithofacies of the wells DF01, DF02 and DF03, situated in the Triassic province of Algeria (Sahara). Lithofacies determination is a crucial problem in reservoir characterization. Our objective is to facilitate the experts' work in the geological domain and allow them to obtain quickly the structure and nature of the ground around the drilling. This study intends to design a tool that supports automatic deduction from numerical data. We used a probabilistic formalism to enhance the classification process initiated by a self-organizing map (SOM) procedure. From well-log data, our system produces the lithofacies of the concerned reservoir wells in a form easy to read by a geology expert, who identifies the potential for oil production at a given source and so forms the basis for estimating the financial returns and economic benefits.

Keywords: Classification, Lithofacies, Probabilistic formalism, Reservoir characterization, Well-log data.
