Search results for: Markovian Decision Process based Adaptive Scheduling
14685 Using a Trust-Based Environment Key for Mobile Agent Code Protection
Authors: Salima Hacini, Zahia Guessoum, Zizette Boufaïda
Abstract:
Human activities are increasingly based on the use of remote resources and services, and on the interaction between remotely located parties that may know little about each other. Mobile agents must be prepared to execute on different hosts with various environmental security conditions. The aim of this paper is to propose a trust-based mechanism to improve the security of mobile agents and allow their execution in various environments. Thus, an adaptive trust mechanism is proposed. It is based on the dynamic interaction between the agent and the environment. Information collected during the interaction enables the generation of an environment key. This key indicates the host's trust degree and permits the mobile agent to adapt its execution. Trust estimation is based on concrete parameter values. Thus, in case of distrust, the source of the problem can be located and an appropriate mobile agent behavior can be selected.
Keywords: Internet security, malicious host, mobile agent security, trust management.
14684 Performance of Compound Enhancement Algorithms on Dental Radiograph Images
Authors: S. A. Ahmad, M. N. Taib, N. E. A. Khalid, R. Ahmad, H. Taib
Abstract:
The purpose of this research is to compare original intra-oral digital dental radiograph images with images that are enhanced using a combination of image processing algorithms. Intra-oral digital dental radiograph images are often noisy, have blurred edges, and are low in contrast. A combination of sharpening and enhancement methods is used to overcome these problems. The three proposed compound algorithms are Sharp Adaptive Histogram Equalization (SAHE), Sharp Median Adaptive Histogram Equalization (SMAHE) and Sharp Contrast Limited Adaptive Histogram Equalization (SCLAHE). This paper presents an initial study of the perception of six dentists on the details of abnormal pathologies and the improvement of image quality in ten intra-oral radiographs. The research focuses on the detection of only three types of pathology: periapical radiolucency, widened periodontal ligament space and loss of lamina dura. The overall result shows that SCLAHE slightly improves the appearance of dental abnormalities over the original image and also outperforms the other two proposed compound algorithms.
Keywords: Intra-oral dental radiograph, histogram equalization, sharpening, CLAHE.
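As a rough illustration of the compound approach described above (sharpening followed by contrast-limited adaptive histogram equalization), the following sketch uses OpenCV; the kernel size, gain, clip limit and tile grid are illustrative assumptions, not the parameters used in the paper.

```python
import cv2
import numpy as np

def sclahe_like(gray: np.ndarray, gain: float = 1.5,
                clip_limit: float = 2.0, tiles: int = 8) -> np.ndarray:
    """Sharpen an 8-bit grayscale radiograph, then apply CLAHE.

    A minimal sketch of a SCLAHE-style compound enhancement; the parameter
    values are assumptions for illustration only.
    """
    # Unsharp masking: add a scaled high-frequency residual to the image.
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    sharpened = cv2.addWeighted(gray, 1.0 + gain, blurred, -gain, 0)

    # Contrast-limited adaptive histogram equalization on the sharpened image.
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=(tiles, tiles))
    return clahe.apply(sharpened)

# Usage: enhanced = sclahe_like(cv2.imread("radiograph.png", cv2.IMREAD_GRAYSCALE))
```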
14683 Performance Analysis of Search Medical Imaging Service on Cloud Storage Using Decision Trees
Authors: González A. Julio, Ramírez L. Leonardo, Puerta A. Gabriel
Abstract:
Telemedicine services use a large amount of data, most of which are diagnostic images in Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) formats. Metadata is generated from each related image to support its identification. This study presents the use of decision trees for the optimization of information search processes for diagnostic images hosted on a cloud server. To analyze the performance of the server, the following quality of service (QoS) metrics are evaluated: delay, bandwidth, jitter, latency and throughput, in five test scenarios for a total of 26 experiments during the loading and downloading of DICOM images, hosted by the telemedicine group server of the Universidad Militar Nueva Granada, Bogotá, Colombia. By applying decision trees as a data mining technique and comparing them with a sequential search, it was possible to evaluate the search times of diagnostic images on the server. The results show that, by using the metadata in decision trees, the search times are substantially improved, the computational resources are optimized and the request management of the telemedicine image service is improved. Based on the experiments carried out, search efficiency increased by 45% in relation to the sequential search, given that, when downloading a diagnostic image, false positives are avoided in the management and acquisition processes of said information. It is concluded that, for diagnostic image services in telemedicine, the decision tree technique guarantees accessibility and robustness in the acquisition and manipulation of medical images, improving diagnoses and medical procedures for patients.
Keywords: Cloud storage, decision trees, diagnostic image, search, telemedicine.
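The sketch below, using scikit-learn, illustrates how image metadata could drive a decision-tree lookup instead of a sequential scan; the metadata fields, records and partition labels are hypothetical placeholders, not the study's data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.preprocessing import OrdinalEncoder

# Hypothetical metadata records for stored DICOM studies:
# (modality, body part, series size in MB) -> storage partition label.
records = [("CT", "CHEST", 512), ("MR", "BRAIN", 256),
           ("CT", "ABDOMEN", 640), ("US", "HEART", 64)]
partitions = ["p0", "p1", "p0", "p2"]

encoder = OrdinalEncoder()
X_cat = encoder.fit_transform([[m, b] for m, b, _ in records])
X = np.hstack([X_cat, np.array([[s] for _, _, s in records])])

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, partitions)

def locate(modality: str, body_part: str, size_mb: int) -> str:
    """Predict which partition to search first, avoiding a full sequential scan."""
    q = np.hstack([encoder.transform([[modality, body_part]])[0], [size_mb]])
    return tree.predict([q])[0]

print(locate("CT", "CHEST", 500))  # expected to point at partition "p0"
```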
14682 A Decision Boundary based Discretization Technique using Resampling
Authors: Taimur Qureshi, Djamel A Zighed
Abstract:
Many supervised induction algorithms require discrete data, even though real data often comes in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy and understandability of induction models. Usually, discretization and other types of statistical processes are applied to subsets of the population, as the entire population is practically inaccessible. For this reason, we argue that a discretization performed on a sample of the population is only an estimate of the discretization of the entire population. Most of the existing discretization methods partition the attribute range into two or several intervals using a single cut point or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points, thus improving the discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is to observe whether the resampling technique can lead to better discretization points, which opens up a new paradigm for the construction of soft decision trees.
Keywords: Bootstrap, discretization, resampling, soft decision trees.
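A minimal sketch of the resampling idea: draw bootstrap samples of a continuous attribute, compute a candidate cut point on each resample (a simple misclassification-count rule stands in here for whatever supervised criterion is actually used), and keep the resulting set of candidates; the data and number of resamples are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def candidate_cut(values: np.ndarray, labels: np.ndarray) -> float:
    """Single cut point on one sample: the midpoint that best separates two
    classes by a simple misclassification count (a stand-in for entropy)."""
    order = np.argsort(values)
    v, y = values[order], labels[order]
    best_cut, best_err = v[0], len(y) + 1
    for i in range(1, len(v)):
        cut = 0.5 * (v[i - 1] + v[i])
        err = min(np.sum((v < cut) != (y == 0)), np.sum((v < cut) != (y == 1)))
        if err < best_err:
            best_cut, best_err = cut, err
    return best_cut

def bootstrap_cuts(values, labels, n_boot=200):
    """Candidate discretization points from bootstrap resamples."""
    values, labels = np.asarray(values, float), np.asarray(labels)
    n = len(values)
    cuts = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)          # sample with replacement
        cuts.append(candidate_cut(values[idx], labels[idx]))
    return np.array(cuts)

# Toy attribute with two overlapping classes; the spread of the bootstrap
# cut points estimates the uncertainty of the decision boundary.
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(2, 1, 50)])
y = np.array([0] * 50 + [1] * 50)
print(np.percentile(bootstrap_cuts(x, y), [5, 50, 95]))
```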
14681 Increasing the Efficiency of Rake Receivers for Ultra-Wideband Applications
Authors: Aimilia P. Doukeli, Athanasios S. Lioumpas, George K. Karagiannidis, Panayiotis V. Frangos, P. Takis Mathiopoulos
Abstract:
In diversity-rich environments, such as Ultra-Wideband (UWB) applications, the a priori determination of the number of strong diversity branches is difficult because of the considerably large number of diversity paths, which are characterized by a variety of power delay profiles (PDPs). Several Rake implementations have been proposed in the past in order to reduce the number of estimated and combined paths. To this aim, we introduce two adaptive Rake receivers, which combine a subset of the resolvable paths, considering simultaneously the quality of both the total combining output signal-to-noise ratio (SNR) and the individual SNR of each path. These schemes achieve better adaptation to channel conditions compared to other known receivers, without further increasing the complexity. Their performance is evaluated in different practical UWB channels, whose models are based on extensive propagation measurements. The proposed receivers compromise between the power consumption, complexity and performance gain for the additional paths, resulting in important savings in power and computational resources.
Keywords: Adaptive Rake receivers, diversity techniques, fading channels, UWB channel.
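A rough sketch of the kind of path-selection rule the abstract alludes to: paths are sorted by SNR and added to the combiner only while each path's individual SNR and its marginal contribution to the combined (maximal-ratio) SNR stay above thresholds; the thresholds and the linear-SNR addition model are illustrative assumptions, not the receivers proposed in the paper.

```python
import numpy as np

def select_rake_paths(path_snrs_db, min_path_snr_db=-5.0, min_gain_db=0.1):
    """Choose a subset of resolvable paths for combining.

    Assumes maximal-ratio combining, where the combined SNR is the sum of the
    per-path SNRs in linear scale; stops when a path is too weak or adds
    negligible gain to the combined output.
    """
    snrs = np.sort(np.asarray(path_snrs_db))[::-1]         # strongest first
    combined_lin = 0.0
    selected = []
    for snr_db in snrs:
        if snr_db < min_path_snr_db:
            break                                           # path itself too weak
        new_combined = combined_lin + 10 ** (snr_db / 10)
        prev_db = 10 * np.log10(combined_lin) if combined_lin > 0 else -np.inf
        if 10 * np.log10(new_combined) - prev_db < min_gain_db:
            break                                           # negligible marginal gain
        selected.append(snr_db)
        combined_lin = new_combined
    return selected, 10 * np.log10(combined_lin)

paths = [-2.0, 3.5, -11.0, 0.5, -7.5, 1.2]                  # per-path SNRs in dB
print(select_rake_paths(paths))
```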
14680 Application of Adaptive Neuro-Fuzzy Inference System in Smoothing Transition Autoregressive Models
Authors: E. Giovanis
Abstract:
In this paper, we propose and examine an Adaptive Neuro-Fuzzy Inference System (ANFIS) for Smoothing Transition Autoregressive (STAR) modeling. Because STAR models follow a fuzzy logic approach, fuzzy rules can be incorporated in the non-linear part, or other training or computational methods, such as the error backpropagation algorithm, can be applied instead of nonlinear least squares. Furthermore, additional fuzzy membership functions can be examined besides the logistic and exponential, such as the triangular, Gaussian and generalized bell functions, among others. We examine two macroeconomic variables of the US economy: the inflation rate and the six-month treasury bill interest rate.
Keywords: Forecasting, neuro-fuzzy, smoothing transition, time series.
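To make the STAR structure concrete, the sketch below evaluates a two-regime logistic STAR prediction, where a logistic transition function blends two autoregressive parts; the coefficients, transition variable and parameter values are purely illustrative, and in the paper an ANFIS handles the non-linear part instead of fixed transition parameters.

```python
import numpy as np

def logistic_transition(s, gamma, c):
    """G(s; gamma, c) in a logistic STAR model: smooth switch between regimes."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

def lstar_predict(y_lags, phi1, phi2, gamma, c, transition_var):
    """One-step LSTAR prediction:
    y_hat = phi1'x * (1 - G) + phi2'x * G, with x = [1, y_{t-1}, ..., y_{t-p}].
    """
    x = np.concatenate([[1.0], y_lags])
    g = logistic_transition(transition_var, gamma, c)
    return (1 - g) * (phi1 @ x) + g * (phi2 @ x)

# Illustrative two-lag example with invented coefficients.
phi1 = np.array([0.1, 0.6, 0.2])   # regime-1 coefficients (const, lag1, lag2)
phi2 = np.array([0.3, 0.2, 0.1])   # regime-2 coefficients
print(lstar_predict(np.array([1.5, 1.2]), phi1, phi2, gamma=4.0, c=1.0,
                    transition_var=1.5))
```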
14679 An Application of the Data Mining Methods with Decision Rule
Authors: Xun Ge, Jianhua Gong
Abstract:
Rankings of the output of China's main agricultural commodities in the world for 1978, 1980, 1990, 2000, 2006, 2007 and 2008 have been released in the United Nations FAO Database. Unfortunately, the ranking of the output of Chinese cotton lint in the world for 2008 was missing. This paper uses sequential data mining methods with decision rules to fill this gap. This new data mining method will help to further improve the United Nations FAO Database.
Keywords: Ranking, output of the main agricultural commodity, gross domestic product, decision table, information system, data mining, decision rule
14678 A Self Supervised Bi-directional Neural Network (BDSONN) Architecture for Object Extraction Guided by Beta Activation Function and Adaptive Fuzzy Context Sensitive Thresholding
Authors: Siddhartha Bhattacharyya, Paramartha Dutta, Ujjwal Maulik, Prashanta Kumar Nandi
Abstract:
A multilayer self-organizing neural network (MLSONN) architecture for binary object extraction, guided by a beta activation function and characterized by backpropagation of errors estimated from the linear indices of fuzziness of the network output states, is discussed. Since the MLSONN architecture is designed to operate in a single-point, fixed/uniform thresholding scenario, it does not take into account the heterogeneity of image information in the extraction process. The performance of the MLSONN architecture with representative values of the threshold parameters of the beta activation function employed is also studied. A three-layer bidirectional self-organizing neural network (BDSONN) architecture comprising fully connected neurons, for the extraction of objects from a noisy background and capable of incorporating the underlying image context heterogeneity through variable and adaptive thresholding, is proposed in this article. The input layer of the network architecture represents the fuzzy membership information of the image scene to be extracted. The second layer (the intermediate layer) and the final layer (the output layer) of the network architecture deal with the self-supervised object extraction task by bi-directional propagation of the network states. Each layer except the output layer is connected to the next layer following a neighborhood-based topology. The output layer neurons are, in turn, connected to the intermediate layer following a similar topology, thus forming a counter-propagating architecture with the intermediate layer. The novelty of the proposed architecture is that the assignment/updating of the inter-layer connection weights is done using the relative fuzzy membership values at the constituent neurons in the different network layers. Another interesting feature of the network lies in the fact that the processing capabilities of the intermediate and the output layer neurons are guided by a beta activation function, which uses image context sensitive adaptive thresholding arising out of the fuzzy cardinality estimates of the different network neighborhood fuzzy subsets, rather than resorting to fixed and single-point thresholding. An application of the proposed architecture for object extraction is demonstrated using a synthetic and a real-life image. The extraction efficiency of the proposed network architecture is evaluated by a proposed system transfer index characteristic of the network.
Keywords: Beta activation function, fuzzy cardinality, multilayer self organizing neural network, object extraction
14677 Quality Based Approach for Efficient Biologics Manufacturing
Authors: Takashi Kaminagayoshi, Shigeyuki Haruyama
Abstract:
To improve the manufacturing efficiency of biologics, such as antibody drugs, a quality engineering framework was designed. Within this framework, critical steps and parameters in the manufacturing process were studied. Identification of these critical steps and critical parameters allows a deeper understanding of manufacturing capabilities and suggests process control standards, based on actual manufacturing capabilities, to the process development department as part of a PDCA (plan-do-check-act) cycle. This cycle can be applied to each manufacturing process so that it can be standardized, reducing the time needed to establish each new process.
Keywords: Antibody drugs, biologics, manufacturing efficiency, PDCA cycle, quality engineering.
14676 Attack Detection through Image Adaptive Self Embedding Watermarking
Authors: S. Shefali, S. M. Deshpande, S. G. Tamhankar
Abstract:
Nowadays, a significant number of commercial and governmental organisations, such as museums, cultural organisations, libraries and commercial enterprises, invest intensively in new technologies for image digitization, digital libraries, image archiving and retrieval. Hence, image authorization, authentication and security have become a prime need. In this paper, we present a semi-fragile watermarking scheme for color images. The method converts the host image into the YIQ color space, followed by the application of the orthogonal dual domains of the DCT and DWT transforms. The DCT helps to separate relevant from irrelevant image content to generate salient image features. The DWT has excellent spatial localisation, which aids in spatial tamper characterisation. Thus, an image-adaptive watermark is generated based on image features, which allows the sharp detection of microscopic changes to locate modifications in the image. Further, the scheme utilises a multipurpose watermark consisting of a soft authenticator watermark and a chrominance watermark, which has been proved fragile to some predefined processing, such as intentional fabrication of the image or forgery, and robust to other incidental attacks caused in the communication channel.
Keywords: Cryptography, Discrete Cosine Transform (DCT), Discrete Wavelet Transform (DWT), Watermarking.
14675 Trajectory Estimation and Control of Vehicle using Neuro-Fuzzy Technique
Authors: B. Selma, S. Chouraqui
Abstract:
Nonlinear system identification is becoming an important tool that can be used to improve control performance. This paper describes the application of an adaptive neuro-fuzzy inference system (ANFIS) model for controlling a car. The vehicle must follow a predefined path by supervised learning. The backpropagation gradient descent method was used to train the ANFIS system. The performance of the ANFIS model was evaluated in terms of training performance and classification accuracy, and the results confirmed that the proposed ANFIS model has potential in controlling the nonlinear system.
Keywords: Adaptive neuro-fuzzy inference system (ANFIS), fuzzy logic, neural network, nonlinear system, control.
14674 Adaptive Non-linear Filtering Technique for Image Restoration
Authors: S. K. Satpathy, S. Panda, K. K. Nagwanshi, S. K. Nayak, C. Ardil
Abstract:
Removing noise from processed images is very important. Noise should be removed in such a way that important image information is preserved. A decision-based nonlinear algorithm for the elimination of band lines, drop lines, marks, band loss and impulses in images is presented in this paper. The algorithm performs two simultaneous operations, namely, detection of corrupted pixels and evaluation of new pixels to replace the corrupted ones. Removal of these artifacts is achieved without damaging edges and details. However, the restricted window size renders the median operation less effective whenever noise is excessive; in that case, the proposed algorithm automatically switches to mean filtering. The performance of the algorithm is analyzed in terms of Mean Square Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Signal-to-Noise Ratio Improvement (SNRI), Percentage of Noise Attenuated (PONA), and Percentage of Spoiled Pixels (POSP). It is compared with standard algorithms already in use, and the improved performance of the proposed algorithm is presented. The advantage of the proposed algorithm is that a single algorithm can replace several independent algorithms that are required for the removal of different artifacts.
Keywords: Filtering, decision-based algorithm, noise, image restoration.
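A minimal sketch of a decision-based switching filter in the spirit of this abstract: pixels flagged as corrupted (here by a simple salt-and-pepper test) are replaced by the median of uncorrupted neighbours, falling back to the local mean when too few clean neighbours are available; the detection rule, window size and fallback threshold are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def decision_based_filter(img: np.ndarray, win: int = 3, min_clean: int = 3) -> np.ndarray:
    """Restore an 8-bit grayscale image corrupted by impulses/drop lines.

    Corruption test: extreme values (0 or 255) are treated as corrupted.
    Replacement: median of clean neighbours; mean of the window when the
    neighbourhood itself is mostly corrupted (excessive noise).
    """
    img = img.astype(np.float64)
    out = img.copy()
    corrupted = (img == 0) | (img == 255)
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    pad_bad = np.pad(corrupted, pad, mode="reflect")
    for r, c in zip(*np.nonzero(corrupted)):
        window = padded[r:r + win, c:c + win]
        clean = window[~pad_bad[r:r + win, c:c + win]]
        if clean.size >= min_clean:
            out[r, c] = np.median(clean)      # median of uncorrupted neighbours
        else:
            out[r, c] = window.mean()         # excessive noise: switch to mean
    return out.astype(np.uint8)

# Usage: restored = decision_based_filter(noisy_gray_image)
```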
14673 Modeling of Supply Chains Delocalization Problems Taking into Account the New Financial Policies: Case of Multinational Firms Established in OECD Member Countries
Authors: Mouna Benfssahi, Zoubir El Felsoufi
Abstract:
For many enterprises, the delocalization of a part or the totality of their supply chain to low-cost countries is the best way to reduce costs and remain competitive in the growing globalized market. This new tendency is driven by logistics advantages, as well as financial and tax discounts offered by the host countries. The objective of this article is to examine the new financial challenges introduced by the Base Erosion and Profit Shifting (BEPS) project, published in 2015, and their impact on the delocalization decision. In fact, the strategy adopted by multinational firms for determining the transfer price (TP) of goods and services, as well as the shared amount of revenues and expenses, has a major impact upon group profit and may lead to divergent results. In order to obtain more profit, a coherent delocalization decision should be based on an evaluation of all the operational and financial characteristics associated with such a movement. Therefore, it is interesting to model these new constraints and integrate them into a more global decision model. The established model will make it possible to measure how much these financial constraints impact the delocalization decision and will give new, helpful directives to enterprise managers.
Keywords: Delocalization, intragroup transaction, multinational firms, optimization model, supply chain management, transfer pricing.
14672 Predicting Extrusion Process Parameters Using Neural Networks
Authors: Sachin Man Bajimaya, SangChul Park, Gi-Nam Wang
Abstract:
The objective of this paper is to estimate realistic principal extrusion process parameters by means of an artificial neural network. Conventionally, finite element analysis is used to derive process parameters. However, the finite element analysis of the extrusion model does not consider the manufacturing process constraints in its modeling. Therefore, the process parameters obtained through such an analysis remain highly theoretical. Alternatively, process development in industrial extrusion is to a great extent based on trial and error and often involves full-size experiments, which are both expensive and time-consuming. The artificial neural network-based estimation of the extrusion process parameters prior to plant execution helps to make the actual extrusion operation more efficient, because more realistic parameters may be obtained. It thus bridges the gap between simulation and the real manufacturing execution system. In this work, a suitable neural network is designed and trained using an appropriate learning algorithm. The trained network is then used to predict the manufacturing process parameters.
Keywords: Artificial Neural Network (ANN), indirect extrusion, finite element analysis, MES.
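As an illustration of this kind of surrogate model, the sketch below trains a small feed-forward network to map extrusion inputs to a process parameter using scikit-learn; the feature names, synthetic data and network size are hypothetical placeholders, not the setup used in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)

# Hypothetical training data: (billet temperature [°C], ram speed [mm/s],
# extrusion ratio) -> required ram pressure [MPa], generated synthetically here.
X = np.column_stack([rng.uniform(400, 500, 200),
                     rng.uniform(1, 10, 200),
                     rng.uniform(10, 60, 200)])
y = 0.5 * X[:, 2] - 0.2 * (X[:, 0] - 450) + 3.0 * X[:, 1] + rng.normal(0, 2, 200)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                                   random_state=0))
model.fit(X, y)

# Predict the process parameter for a new, unseen operating point.
print(model.predict([[460.0, 5.0, 35.0]]))
```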
14671 Selecting Stealth Aircraft Using Determinate Fuzzy Preference Programming in Multiple Criteria Decision Making
Authors: C. Ardil
Abstract:
This paper investigates the application of the determinate fuzzy preference programming method for a more nuanced and comprehensive evaluation of stealth aircraft. Traditional methods often struggle to incorporate subjective factors and uncertainties inherent in complex systems like stealth aircraft. Determinate fuzzy preference programming addresses this limitation by leveraging the strengths of determinate fuzzy sets. The proposed novel multiple criteria decision-making algorithm integrates these concepts to consider aspects and criteria influencing aircraft performance. This approach aims to provide a more holistic assessment by enabling decision-makers to observe positive and negative outranking flows simultaneously. By demonstrating the validity and effectiveness of this approach through a practical example of selecting a stealth aircraft, this paper aims to establish the determinate fuzzy preference programming method as a valuable tool for informed decision-making in this critical domain.
Keywords: Determinate fuzzy set, stealth aircraft selection, distance function, decision making, uncertainty, preference programming, MCDM.
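The abstract mentions observing positive and negative outranking flows simultaneously; the sketch below shows a generic PROMETHEE-style computation of such flows over a crisp decision matrix, not the determinate fuzzy preference programming method itself, and the scores and weights are invented.

```python
import numpy as np

def outranking_flows(decision_matrix: np.ndarray, weights: np.ndarray):
    """Positive and negative outranking flows for alternatives x criteria.

    Uses the simple 'usual' preference function P(d) = 1 if d > 0 else 0;
    benefit criteria are assumed (larger is better).
    """
    m = decision_matrix.shape[0]
    pi = np.zeros((m, m))                        # aggregated preference indices
    for a in range(m):
        for b in range(m):
            if a == b:
                continue
            diff = decision_matrix[a] - decision_matrix[b]
            pi[a, b] = np.sum(weights * (diff > 0))
    phi_plus = pi.sum(axis=1) / (m - 1)          # positive (leaving) flow
    phi_minus = pi.sum(axis=0) / (m - 1)         # negative (entering) flow
    return phi_plus, phi_minus, phi_plus - phi_minus

# Invented scores for three aircraft on four criteria, with invented weights.
scores = np.array([[0.8, 0.6, 0.7, 0.9],
                   [0.7, 0.9, 0.6, 0.8],
                   [0.9, 0.5, 0.8, 0.7]])
w = np.array([0.4, 0.2, 0.2, 0.2])
print(outranking_flows(scores, w))
```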
14670 Contrast Enhancement in Digital Images Using an Adaptive Unsharp Masking Method
Authors: Z. Mortezaie, H. Hassanpour, S. Asadi Amiri
Abstract:
Captured images may suffer from Gaussian blur due to poor lens focus or camera motion. Unsharp masking is a simple and effective technique for boosting image contrast and improving digital images suffering from Gaussian blur. The technique is based on sharpening object edges by appending the scaled high-frequency components of the image to the original. The quality of the enhanced image is highly dependent on the characteristics of both the high-frequency components and the scaling/gain factor. Since the quality of an image may not be the same throughout, we propose an adaptive unsharp masking method in this paper. In this method, the gain factor is computed, considering the gradient variations, for individual pixels of the image. Subjective and objective image quality assessments are used to compare the performance of the proposed method with both the classic and recently developed unsharp masking methods. The experimental results show that the proposed method performs better than the other existing methods.
Keywords: Unsharp masking, blur image, sub-region gradient, image enhancement.
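A minimal sketch of gradient-adaptive unsharp masking in the spirit of this abstract: the per-pixel gain is driven by the local gradient magnitude so that strong edges receive less boosting than smoother regions (the paper's exact gain law may differ); the filter sizes and gain range are assumptions.

```python
import numpy as np
from scipy import ndimage

def adaptive_unsharp(img: np.ndarray, g_min: float = 0.2, g_max: float = 2.0,
                     sigma: float = 2.0) -> np.ndarray:
    """Unsharp masking with a per-pixel gain driven by local gradient magnitude.

    Here the gain decreases on strong edges (to limit overshoot) and increases
    in smoother regions; the exact mapping used in the paper may differ.
    """
    img = img.astype(np.float64)
    low = ndimage.gaussian_filter(img, sigma)
    high = img - low                                   # high-frequency components

    grad_mag = ndimage.gaussian_gradient_magnitude(img, sigma=1.0)
    norm = grad_mag / (grad_mag.max() + 1e-12)         # 0 (flat) .. 1 (strong edge)
    gain = g_max - (g_max - g_min) * norm              # per-pixel gain map

    enhanced = img + gain * high
    return np.clip(enhanced, 0, 255).astype(np.uint8)

# Usage: out = adaptive_unsharp(gray_image_as_uint8_array)
```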
14669 A Hidden Markov Model for Modeling Pavement Deterioration under Incomplete Monitoring Data
Authors: Nam Lethanh, Bryan T. Adey
Abstract:
In this paper, the potential use of an exponential hidden Markov model to model a hidden pavement deterioration process, i.e. one that is not directly measurable, is investigated. It is assumed that the evolution of the physical condition, which is the hidden process, and the evolution of the values of pavement distress indicators can be adequately described using discrete condition states and modeled as Markov processes. It is also assumed that condition data can be collected by visual inspections over time and represented continuously using an exponential distribution. The advantage of using such a model in the decision-making process is illustrated through an empirical study using real-world data.
Keywords: Deterioration modeling, exponential distribution, hidden Markov model, pavement management.
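To make the model class concrete, the sketch below evaluates the likelihood of an observation sequence under a hidden Markov model with exponential emission densities via the forward algorithm; the transition matrix, rates and observations are invented numbers, not the paper's estimates.

```python
import numpy as np

def exponential_hmm_likelihood(obs, pi0, A, rates):
    """Forward algorithm for an HMM whose emissions are Exponential(rate).

    obs   : observed (continuous, non-negative) distress indicator values
    pi0   : initial distribution over discrete condition states
    A     : state transition matrix (rows sum to 1)
    rates : exponential emission rate per hidden state
    """
    obs, pi0, A, rates = map(np.asarray, (obs, pi0, A, rates))
    emit = lambda x: rates * np.exp(-rates * x)     # exponential densities per state
    alpha = pi0 * emit(obs[0])
    for x in obs[1:]:
        alpha = (alpha @ A) * emit(x)
    return alpha.sum()

# Three condition states (good, fair, poor); deterioration only moves "forward".
A = np.array([[0.8, 0.2, 0.0],
              [0.0, 0.7, 0.3],
              [0.0, 0.0, 1.0]])
rates = np.array([2.0, 1.0, 0.4])   # larger rate -> smaller expected distress value
print(exponential_hmm_likelihood([0.3, 0.8, 1.5, 2.1], [1.0, 0.0, 0.0], A, rates))
```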
14668 Linear Programming Application in Unit Commitment of Wind Farms with Considering Uncertainties
Authors: M. Esmaeeli Shahrakht, A. Kazemi
Abstract:
Due to the uncertainty of wind velocity, wind power generators do not have deterministic output power. Utilizing wind power generation and thermal power plants together creates new concerns for the operation engineers of power systems. In this paper, a model is presented to capture the uncertainty of load and generated wind power, which can be utilized in power system operation planning. The stochastic behavior of the parameters is simulated by generating scenarios that can be solved by a deterministic method. A mixed-integer linear programming method is used for solving the deterministic generation scheduling problem. The proposed approach is applied to a 12-unit test system including 10 thermal units and 2 wind farms. The results show the effectiveness of the piecewise linear model in unit commitment problems. Also, using linear programming causes a considerable reduction in calculation times and guarantees convergence to the global optimum. Neglecting the uncertainty of wind velocity leads to a higher cost assessment of the generation schedule.
Keywords: Load uncertainty, linear programming, scenario generation, unit commitment, wind farm.
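A small sketch of the scenario-based, mixed-integer linear formulation the abstract describes, using the PuLP modelling library: commitment decisions are shared across wind/load scenarios while dispatch adapts per scenario; the two-unit system, costs and scenario data are invented for illustration and much simpler than the 12-unit test system in the paper.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

# Invented data: 2 thermal units, 3 equally likely net-load scenarios (load - wind).
units = {"g1": {"pmin": 10, "pmax": 100, "cost": 20},
         "g2": {"pmin": 20, "pmax": 150, "cost": 35}}
scenarios = {"s1": 120, "s2": 160, "s3": 90}
prob_s = {s: 1.0 / len(scenarios) for s in scenarios}

model = LpProblem("scenario_unit_commitment", LpMinimize)
u = {g: LpVariable(f"u_{g}", cat=LpBinary) for g in units}              # commitment
p = {(g, s): LpVariable(f"p_{g}_{s}", lowBound=0)                        # dispatch
     for g in units for s in scenarios}

# Expected generation cost over scenarios.
model += lpSum(prob_s[s] * units[g]["cost"] * p[g, s]
               for g in units for s in scenarios)

for s, net_load in scenarios.items():
    model += lpSum(p[g, s] for g in units) >= net_load                   # balance
    for g in units:
        model += p[g, s] <= units[g]["pmax"] * u[g]                      # capacity
        model += p[g, s] >= units[g]["pmin"] * u[g]                      # min output

model.solve(PULP_CBC_CMD(msg=False))
print({g: int(u[g].value()) for g in units},
      {(g, s): p[g, s].value() for g in units for s in scenarios})
```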
14667 Manual Testing of Web Software Systems Supported by Direct Guidance of the Tester Based On Design Model
Authors: Karel Frajtak, Miroslav Bures, Ivan Jelinek
Abstract:
Software testing is an important stage of the software development cycle. The current testing process involves a tester and electronic documents with test case scenarios. In this paper, we focus on a new approach to the testing process using automated test case generation and tester guidance through the system based on a model of the system. Test case generation and model-based testing are not possible without a proper system model. We aim to provide better feedback from the testing process, thus eliminating unnecessary paperwork.
Keywords: Model based testing, test automation, test generating, tester support.
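A toy sketch of the model-based idea: if the web application is modelled as a graph of pages and actions, test case scenarios can be generated automatically as bounded paths through the model and then used to guide the tester step by step; the page names and transitions are made up.

```python
from collections import deque

# Hypothetical design model: page -> list of (action, next page).
model = {
    "login":     [("submit valid credentials", "dashboard"),
                  ("submit invalid credentials", "login")],
    "dashboard": [("open profile", "profile"), ("log out", "login")],
    "profile":   [("save changes", "dashboard")],
}

def generate_test_paths(model, start, max_steps=4):
    """Enumerate bounded action sequences (test cases) through the model."""
    paths = []
    queue = deque([(start, [])])
    while queue:
        page, steps = queue.popleft()
        if steps:
            paths.append(steps)
        if len(steps) >= max_steps:
            continue
        for action, nxt in model.get(page, []):
            queue.append((nxt, steps + [f"{page}: {action} -> {nxt}"]))
    return paths

for path in generate_test_paths(model, "login", max_steps=3):
    print(" / ".join(path))   # each line is one guided test scenario
```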
14666 Hybrid Methods for Optimisation of Weights in Spatial Multi-Criteria Evaluation Decision for Fire Risk and Hazard
Authors: I. Yakubu, D. Mireku-Gyimah, D. Asafo-Adjei
Abstract:
The challenge for everyone involved in preserving the ecosystem is to find creative ways to protect and restore the remaining ecosystems while accommodating and enhancing the country's social and economic well-being. Frequent fires of anthropogenic origin have been adversely affecting the ecosystems in many countries. Hence, adopting decision-making approaches such as Multi-Criteria Decision Making (MCDM) is appropriate, since it enhances the evaluation and analysis of the fire risk and hazard of the ecosystem. In this paper, fire risk and hazard data from the West Gonja area of Ghana were used in several methods (Analytic Hierarchy Process (AHP), Compromise Programming (CP) and Grey Relational Analysis (GRA)) for MCDM evaluation and analysis to determine the optimal weighting method for fire risk and hazard. Ranking of the land cover types was carried out using fire hazard, fire fighting capacity and response risk criteria. Pairwise comparison under the AHP was used to determine the weights of the various criteria. Weights for the sub-criteria were also obtained by the pairwise comparison method. The results were optimised using GRA and CP. The results from each method, hybrid GRA and CP, were compared, and it was established that all methods were satisfactory in terms of the optimisation of weights. The most optimal method for spatial multi-criteria evaluation was the hybrid GRA method. Thus, a hybrid AHP and GRA method is a more effective method for ranking alternatives in MCDM than the hybrid AHP and CP method.
Keywords: Compromise programming, grey relational analysis, spatial multi-criteria, weight optimisation.
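A brief sketch of the grey relational analysis step referred to above: criterion values are normalised, grey relational coefficients are computed against an ideal reference sequence, and the weighted grades rank the alternatives; the data and weights are illustrative, not the Ghana study values.

```python
import numpy as np

def grey_relational_grades(matrix: np.ndarray, weights: np.ndarray,
                           zeta: float = 0.5) -> np.ndarray:
    """Grey relational grades for alternatives (rows) over benefit criteria (cols)."""
    x = np.asarray(matrix, float)
    # Normalise each benefit criterion to [0, 1] (larger is better).
    norm = (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0) + 1e-12)
    ref = norm.max(axis=0)                      # ideal reference sequence
    delta = np.abs(ref - norm)                  # deviation sequences
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff @ weights                      # weighted grey relational grade

# Invented scores of four land cover types on three criteria
# (fire hazard, fire fighting capacity, response risk), with invented weights.
scores = np.array([[0.7, 0.4, 0.6],
                   [0.5, 0.8, 0.5],
                   [0.9, 0.3, 0.7],
                   [0.4, 0.6, 0.4]])
w = np.array([0.5, 0.3, 0.2])
print(grey_relational_grades(scores, w))        # higher grade = higher rank
```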
14665 Fast Algorithm of Infrared Point Target Detection in Fluctuant Background
Authors: Yang Weiping, Zhang Zhilong, Li Jicheng, Chen Zengping, He Jun
Abstract:
A background estimation approach using a small-window median filter is presented on the basis of analyzing the IR point target, noise and clutter models. After simplifying the two-dimensional filter, a simple method of adopting a one-dimensional median filter is illustrated to estimate the background according to the characteristics of the IR scanning system. An adaptive threshold is used to segment the background-cancelled image. Experimental results show that the algorithm achieves good performance and satisfies the requirement of real-time processing of large images.
Keywords: Point target, background estimation, median filter, adaptive threshold, target detection.
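A compact sketch of the pipeline outlined in the abstract: estimate the background row-wise with a one-dimensional median filter, subtract it, and segment target candidates with an adaptive mean-plus-k-sigma threshold; the window length and k are assumptions, not the paper's settings.

```python
import numpy as np
from scipy.ndimage import median_filter

def detect_point_targets(frame: np.ndarray, win: int = 9, k: float = 4.0):
    """Detect small bright targets in a fluctuant IR background.

    Background is estimated along each scan line with a 1-D median filter
    (size (1, win)); residuals above mean + k*std are flagged as targets.
    """
    frame = frame.astype(np.float64)
    background = median_filter(frame, size=(1, win), mode="nearest")
    residual = frame - background                      # background-cancelled image
    thresh = residual.mean() + k * residual.std()      # adaptive threshold
    return residual > thresh                           # boolean target mask

# Synthetic example: smooth background plus two point targets and noise.
rng = np.random.default_rng(0)
img = np.outer(np.linspace(50, 80, 64), np.ones(64)) + rng.normal(0, 1.0, (64, 64))
img[20, 30] += 25
img[45, 10] += 30
print(np.argwhere(detect_point_targets(img)))
```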
14664 Customers’ Priority to Implement SSTs Using AHP Analysis
Authors: Mohammad Jafariahangari, Marjan Habibi, Miresmaeil Mirnabibaboli, Mirza Hassan Hosseini
Abstract:
Self-service technologies (SSTs) make an important contribution to people's daily lives nowadays. However, the introduction of an SST does not necessarily lead to its usage. Therefore, this paper attempts to discover the most preferred SST from the customers' point of view. To fulfill this aim, the Analytical Hierarchy Process (AHP) was applied based on Saaty's questionnaire, which was administered to the customers of e-banking services located in Golestan province, northern Iran. This study used qualitative factors associated with consumers' intention to use SSTs to rank three SSTs: ATM, mobile banking and internet banking. The results showed that mobile banking obtains the highest weight from the consumers' point of view. This research can be useful both for managers and service providers and also for customers who intend to use e-banking.
Keywords: Analytical Hierarchy Process, Decision-making, E-banking, Iran, Self-service technologies.
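A minimal sketch of deriving AHP weights from a Saaty-style pairwise comparison matrix, using the normalised-geometric-mean approximation and a consistency-ratio check; the comparison values are invented, not the survey results.

```python
import numpy as np

# Random consistency index for n = 1..5 (Saaty's published values).
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

def ahp_weights(pairwise: np.ndarray):
    """Priority vector and consistency ratio for a reciprocal comparison matrix."""
    A = np.asarray(pairwise, float)
    n = A.shape[0]
    # Geometric-mean approximation of the principal eigenvector.
    w = np.prod(A, axis=1) ** (1.0 / n)
    w = w / w.sum()
    # Consistency: lambda_max estimated from A @ w ≈ lambda_max * w.
    lam_max = np.mean((A @ w) / w)
    ci = (lam_max - n) / (n - 1)
    cr = ci / RI[n] if RI[n] > 0 else 0.0
    return w, cr

# Invented pairwise judgments for three alternatives
# (ATM, mobile banking, internet banking) on a single criterion.
A = np.array([[1.0, 1 / 3, 1 / 2],
              [3.0, 1.0,   2.0],
              [2.0, 1 / 2, 1.0]])
weights, consistency_ratio = ahp_weights(A)
print(weights, consistency_ratio)   # CR below ~0.1 is usually considered acceptable
```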
14663 ANFIS Modeling of the Surface Roughness in Grinding Process
Authors: H. Baseri, G. Alinejad
Abstract:
The objective of this study is to design an adaptive neuro-fuzzy inference system (ANFIS) for the estimation of surface roughness in the grinding process. The data used have been generated from experimental observations when the wheel has been dressed using a rotary diamond disc dresser. The input parameters of the model are the dressing speed ratio, dressing depth and dresser cross-feed rate, and the output parameter is surface roughness. In the experimental procedure, the grinding conditions are kept constant and only the dressing conditions are varied. The comparison of the predicted values with the experimental data indicates that the ANFIS model performs better than the back-propagation neural network (BPNN) model presented by the authors in previous work for the estimation of surface roughness.
Keywords: Grinding, ANFIS, neural network, disc dressing.
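For readers unfamiliar with the ANFIS structure used here and in the other neuro-fuzzy entries above, the sketch below evaluates a tiny first-order Takagi-Sugeno fuzzy system of the kind ANFIS trains (Gaussian memberships, product firing strengths, weighted average of linear consequents); the rule parameters are invented, not trained values.

```python
import numpy as np

def gaussmf(x, c, sigma):
    """Gaussian membership function."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def sugeno_predict(x, rules):
    """First-order Sugeno inference: weighted average of linear consequents,
    with rule firing strengths given by the product of input memberships."""
    x = np.asarray(x, float)
    strengths, outputs = [], []
    for rule in rules:
        mu = np.prod([gaussmf(xi, c, s) for xi, (c, s) in zip(x, rule["mf"])])
        strengths.append(mu)
        outputs.append(rule["coef"] @ x + rule["bias"])
    strengths = np.array(strengths)
    return float(strengths @ np.array(outputs) / (strengths.sum() + 1e-12))

# Two invented rules over (dressing speed ratio, dressing depth, cross-feed rate);
# in ANFIS, the membership and consequent parameters would be learned from data.
rules = [
    {"mf": [(0.3, 0.2), (10.0, 5.0), (0.1, 0.05)],
     "coef": np.array([0.5, 0.01, 2.0]), "bias": 0.2},
    {"mf": [(0.8, 0.2), (30.0, 5.0), (0.3, 0.05)],
     "coef": np.array([0.2, 0.02, 1.0]), "bias": 0.5},
]
print(sugeno_predict([0.5, 20.0, 0.2], rules))   # predicted surface roughness
```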
14662 Combining Fuzzy Logic and Neural Networks in Modeling Landfill Gas Production
Authors: Mohamed Abdallah, Mostafa Warith, Roberto Narbaitz, Emil Petriu, Kevin Kennedy
Abstract:
The heterogeneity of solid waste characteristics, as well as the complex processes taking place within the landfill ecosystem, motivated the implementation of soft computing methodologies such as artificial neural networks (ANN), fuzzy logic (FL), and their combination. The present work uses a hybrid ANN-FL model that employs knowledge-based FL to describe the process qualitatively and implements the learning algorithm of ANN to optimize the model parameters. The model was developed to simulate and predict the landfill gas production at a given time based on operational parameters. The experimental data used were compiled from a lab-scale experiment that involved various operating scenarios. The developed model was validated and statistically analyzed using the F-test, linear regression between actual and predicted data, and mean squared error measures. Overall, the simulated landfill gas production rates demonstrated reasonable agreement with the actual data. The discussion focuses on the effect of the size of the training datasets and the number of training epochs.
Keywords: Adaptive neural fuzzy inference system (ANFIS), gas production, landfill
14661 Intelligent Dynamic Decision-making Model Using in Robot's Movement
Authors: Yufang Cheng, Hsiu-Hua Yang
Abstract:
This work develops a novel intelligent "model of dynamic decision-making" using a cell assemblies network architecture for robot movement. The "model of dynamic decision-making" simulates human decision-making and follows commands to make the correct decisions. The cell assemblies approach, consisting of fLIF neurons, was used to implement tasks for finding targets and avoiding obstacles. Experimental results show that the cell assemblies approach can be employed to efficiently complete target-finding and obstacle-avoidance tasks and can simulate human thinking and the mode of information transactions.
Keywords: Cell assemblies, fLIF, Hebbian learning rule.
14660 Evaluating Contractors in Construction Projects by Multi-Criteria Decision Making and Supply Chain Approach
Authors: Sara Najiazarpour, Mahsa Najiazarpour
Abstract:
There are many problems in contracting projects and their performance. At each project stage and for different reasons, these problems affect cost, time and quality. Hence, in order to increase efficiency and performance at all levels of the chain, and with a supply chain management approach, there should be coordination from the beginning of a project to its end (the handover of the project). Contractor selection is the foremost part of construction projects; in this multi-criteria decision-making problem, the best contractor is determined by expert judgment, different variables and their priorities. In this paper, for selecting the best contractor, numerous criteria were collected by consulting adept experts, and the 16 criteria with the highest frequency among them were considered for the questionnaire. This questionnaire was distributed among experts. Cronbach's alpha coefficient was used, and then, based on the Borda function, the important criteria were selected and categorized into four main criteria: environmental factors and physical equipment, past performance and technical expertise, affordability, and standards. Then, with the PROMETHEE method, the criteria were normalized and monitored, and finally the best alternative was selected. A case study was carried out, and the best contractor was selected based on all criteria and their priorities.
Keywords: Evaluation and selecting contractors, project development, supply chain management, multi-criteria decision-making.
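As a small illustration of the Borda step mentioned above, the sketch below aggregates several experts' rankings of candidate criteria with a Borda count to pick the most important ones; the criterion names and rankings are invented.

```python
from collections import defaultdict

# Invented expert rankings (most important first) of candidate selection criteria.
expert_rankings = [
    ["technical expertise", "past performance", "affordability", "standards"],
    ["past performance", "technical expertise", "standards", "affordability"],
    ["technical expertise", "affordability", "past performance", "standards"],
]

def borda_scores(rankings):
    """Borda count: a criterion ranked r-th out of n earns n - r points."""
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for position, criterion in enumerate(ranking):
            scores[criterion] += n - 1 - position
    return dict(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))

print(borda_scores(expert_rankings))   # the highest-scoring criteria are retained
```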
14659 A Two Level Load Balancing Approach for Cloud Environment
Authors: Anurag Jain, Rajneesh Kumar
Abstract:
Cloud computing is the outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation and task migration. Various parameters used to analyze the performance of a load balancing approach are response time, cost, data processing time and throughput. This paper demonstrates a two-level load balancer approach that combines the join-idle-queue and join-shortest-queue approaches. The authors have used the Cloud Analyst simulator to test the proposed two-level load balancer approach. The results are analyzed and compared with existing algorithms and, as observed, the proposed work is one step ahead of the existing techniques.
Keywords: Cloud Analyst, Cloud Computing, Join Idle Queue, Join Shortest Queue, Load balancing, Task Scheduling.
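A toy sketch of the two-level idea described above: the dispatcher first consults an idle queue (join-idle-queue); only if no server is idle does it fall back to join-shortest-queue. The server model and task handling are deliberately simplified and are not the paper's simulator setup.

```python
from collections import deque

class TwoLevelBalancer:
    """Level 1: join-idle-queue; level 2: join-shortest-queue fallback."""

    def __init__(self, n_servers: int):
        self.queues = [deque() for _ in range(n_servers)]
        self.idle = deque(range(n_servers))          # ids of currently idle servers

    def dispatch(self, task) -> int:
        if self.idle:                                # JIQ: an idle server exists
            sid = self.idle.popleft()
        else:                                        # JSQ: pick the shortest queue
            sid = min(range(len(self.queues)), key=lambda i: len(self.queues[i]))
        self.queues[sid].append(task)
        return sid

    def complete(self, sid: int):
        """Server sid finished a task; re-register it as idle if its queue empties."""
        self.queues[sid].popleft()
        if not self.queues[sid]:
            self.idle.append(sid)

balancer = TwoLevelBalancer(n_servers=3)
placements = [balancer.dispatch(f"task{i}") for i in range(6)]
print(placements)   # first 3 tasks go to idle servers, the rest to shortest queues
```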
14658 Decision Making using Maximization of Negret
Authors: José M. Merigó, Montserrat Casanovas
Abstract:
We analyze the problem of decision making under ignorance with regrets. Recently, Yager has developed a new method for decision making in which, instead of using regrets, he uses another type of transformation called negrets. Basically, the negret is considered the dual of the regret. We study this problem in detail and suggest the use of geometric aggregation operators in this method. To do this, we develop a different method for constructing the negret matrix in which all the values are positive. The main result obtained is that the model is now able to deal with negative numbers because of the transformation applied in the negret matrix. We further extend these results to another model, also developed by Yager, about mixing valuations and negrets. Unfortunately, in this case we are not able to deal with negative numbers because the valuations can be either positive or negative.
Keywords: Decision making, aggregation operators, negret, OWA operator, OWG operator.
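A rough numerical sketch of the kind of construction discussed: starting from a payoff matrix, a regret-style transformation is dualized into a "negret" matrix with strictly positive entries and then aggregated per alternative with an ordered weighted geometric (OWG) operator. The exact definitions in Yager's and the authors' papers may differ from this illustrative choice, and the payoffs and weights are invented.

```python
import numpy as np

def negret_matrix(payoffs: np.ndarray) -> np.ndarray:
    """A dual-of-regret style transformation (illustrative definition only):
    for each state (column), measure how far each payoff lies above the worst
    payoff in that state, shifted by 1 so all entries are strictly positive."""
    payoffs = np.asarray(payoffs, float)
    return payoffs - payoffs.min(axis=0) + 1.0

def owg(values: np.ndarray, weights: np.ndarray) -> float:
    """Ordered weighted geometric operator: weights applied to values sorted
    in descending order, aggregated as a weighted geometric mean."""
    ordered = np.sort(values)[::-1]
    return float(np.prod(ordered ** weights))

# Invented payoff matrix (alternatives x states of nature), possibly negative.
payoffs = np.array([[ 70, -20,  10],
                    [ 40,  30, -10],
                    [-10,  60,  20]])
weights = np.array([0.5, 0.3, 0.2])          # OWG weights summing to 1

scores = [owg(row, weights) for row in negret_matrix(payoffs)]
best = int(np.argmax(scores))                # maximization of the negret aggregate
print(scores, "-> choose alternative", best)
```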
14657 Economic Assessment Methodology to Support Decisions for Transport Infrastructure Development
Authors: Dimitrios J. Dimitriou
Abstract:
The decades after the end of the Second World War provide evidence that infrastructure investments contribute to economic development in terms of productivity and income growth. In order to boost productivity and increase competitiveness, the financing of large transport infrastructure projects is at the top of the agenda in the strategic planning process. Such a decision may take from a few days to some decades, and stakeholders as well as decision makers need tools in order to estimate the economic impact of such an investment on the national economy. The key question in such decisions is whether the effects caused by the new infrastructure are able to boost economic development on the one hand, and create new jobs and activities on the other. This paper reviews the estimation of the economic effects of mega transport infrastructure projects on the economy.
Keywords: Economic impact, transport infrastructure, strategic planning.
14656 Asymptotic Analysis of Instant Messaging Service with Relay Nodes
Authors: Muhammad T. Alam, Zheng Da Wu
Abstract:
In this paper, we provide complete end-to-end delay analyses, including the relay nodes, for instant messages. The Message Session Relay Protocol (MSRP) is used to provide congestion control for large messages in the Instant Messaging (IM) service. Large messages are broken into several chunks. These chunks may traverse a maximum of two relay nodes before reaching the destination, according to the IETF specification of the MSRP relay extensions. We discuss the current solutions for sending large instant messages and introduce a proposal to reduce message flows in the IM service. We consider a virtual traffic parameter, i.e., the relay nodes are stateless and non-blocking for scalability purposes. This type of relay node is also assumed to have an input at a constant bit rate. We provide a new scheduling policy that schedules chunks according to their previous node's delivery time stamp tags. Validation and analysis are shown for this scheduling policy. The performance analysis with the model introduced in this paper is simple and straightforward, which leads to reduced message flows in the IM service.
Keywords: Instant messaging, stateless, chunking, MSRP.
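A small sketch of the scheduling rule described in the abstract: at a relay, incoming chunks are ordered by the delivery time stamp tag set by the previous node, using a priority queue; the chunk contents and timestamps are invented for illustration.

```python
import heapq

class ChunkScheduler:
    """Forward message chunks in order of the previous node's delivery timestamp."""

    def __init__(self):
        self._heap = []
        self._seq = 0                      # tie-breaker for equal timestamps

    def enqueue(self, prev_delivery_ts: float, chunk: bytes):
        heapq.heappush(self._heap, (prev_delivery_ts, self._seq, chunk))
        self._seq += 1

    def next_chunk(self):
        """Pop the chunk with the earliest previous-node delivery timestamp."""
        if not self._heap:
            return None
        _, _, chunk = heapq.heappop(self._heap)
        return chunk

sched = ChunkScheduler()
sched.enqueue(10.250, b"chunk-B")          # timestamps are illustrative
sched.enqueue(10.120, b"chunk-A")
sched.enqueue(10.300, b"chunk-C")
print(sched.next_chunk(), sched.next_chunk(), sched.next_chunk())
```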