Search results for: deterministic
103 Maximizer of the Posterior Marginal Estimate for Noise Reduction of JPEG-compressed Image
Authors: Yohei Saika, Yuji Haraguchi
Abstract:
We constructed a method of noise reduction for JPEG-compressed images based on Bayesian inference using the maximizer of the posterior marginal (MPM) estimate. In this method, we tried the MPM estimate using two kinds of likelihood, both of which enhance grayscale images converted into JPEG-compressed images through lossy JPEG image compression. One is a deterministic model of the likelihood and the other is a probabilistic one expressed by the Gaussian distribution. Then, using Monte Carlo simulation for grayscale images, such as the 256-grayscale standard image "Lena" with 256 × 256 pixels, we examined the performance of the MPM estimate based on a performance measure using the mean square error. We clarified that the MPM estimate via the Gaussian probabilistic model of the likelihood is effective for reducing noise, such as blocking artifacts and mosquito noise, if the parameters are set appropriately. On the other hand, we found that the MPM estimate via the deterministic model of the likelihood is not effective for noise reduction due to the low acceptance ratio of the Metropolis algorithm.
Keywords: Noise reduction, JPEG-compressed image, Bayesian inference, maximizer of the posterior marginal estimate.
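The MPM estimation described above is carried out with Metropolis-type Monte Carlo sampling. The Python sketch below is a minimal illustration of a single-pixel Metropolis update under a Gaussian likelihood and a smoothness prior; the energy form and the parameters beta, sigma and step are illustrative assumptions, not the authors' exact model.
```python
import numpy as np

rng = np.random.default_rng(0)

def local_energy(img, y, i, j, value, beta, sigma):
    """Energy of one pixel: Gaussian likelihood term plus a smoothness prior
    over the 4-neighborhood (illustrative choice, not the paper's exact model)."""
    data_term = (value - y[i, j]) ** 2 / (2.0 * sigma ** 2)
    nbrs = [img[i - 1, j], img[i + 1, j], img[i, j - 1], img[i, j + 1]]
    prior_term = beta * sum((value - n) ** 2 for n in nbrs)
    return data_term + prior_term

def metropolis_sweep(img, y, beta=0.01, sigma=10.0, step=8):
    """One Metropolis sweep over the interior pixels of a grayscale image."""
    h, w = img.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            proposal = np.clip(img[i, j] + rng.integers(-step, step + 1), 0, 255)
            dE = (local_energy(img, y, i, j, proposal, beta, sigma)
                  - local_energy(img, y, i, j, img[i, j], beta, sigma))
            if dE <= 0 or rng.random() < np.exp(-dE):   # Metropolis acceptance rule
                img[i, j] = proposal
    return img

# usage: y is the noisy JPEG-decoded image; averaging samples approximates the MPM estimate
y = rng.integers(0, 256, size=(16, 16)).astype(float)
x = y.copy()
for _ in range(5):
    x = metropolis_sweep(x, y)
```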
102 AI-based Radio Resource and Transmission Opportunity Allocation for 5G-V2X HetNets: NR and NR-U Networks
Authors: Farshad Zeinali, Sajedeh Norouzi, Nader Mokari, Eduard A. Jorswieck
Abstract:
The capacity of fifth-generation (5G) vehicle-to-everything (V2X) networks poses significant challenges. To address this challenge, this paper utilizes New Radio (NR) and New Radio Unlicensed (NR-U) networks to develop a vehicular heterogeneous network (HetNet). We propose a framework, named joint BS assignment and resource allocation (JBSRA), for mobile V2X users and also consider coexistence schemes based on a flexible duty cycle (DC) mechanism for unlicensed bands. Our objective is to maximize the average throughput of vehicles while guaranteeing the throughput of WiFi users. In simulations based on deep reinforcement learning (DRL) algorithms such as deep deterministic policy gradient (DDPG) and deep Q network (DQN), our proposed framework outperforms existing solutions that rely on a fixed DC or on schemes without consideration of unlicensed bands.
Keywords: Vehicle-to-everything, resource allocation, BS assignment, new radio, new radio unlicensed, coexistence NR-U and WiFi, deep deterministic policy gradient, Deep Q-network, Duty cycle mechanism.
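The abstract above relies on value-based deep reinforcement learning (DQN). The snippet below is a minimal, tabular stand-in for that update rule (epsilon-greedy action choice plus a one-step TD target); the state/action sizes and hyperparameters are illustrative assumptions, and a real DQN replaces the lookup table with a neural network and a replay buffer.
```python
import numpy as np

rng = np.random.default_rng(1)
n_states, n_actions = 10, 4
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.99, 0.1   # illustrative hyperparameters

def choose_action(state):
    """Epsilon-greedy action selection, as used by DQN-style agents."""
    if rng.random() < eps:
        return int(rng.integers(n_actions))
    return int(np.argmax(Q[state]))

def td_update(s, a, r, s_next, done):
    """One-step temporal-difference target; DQN computes the same target
    with a (target) network instead of a lookup table."""
    target = r if done else r + gamma * np.max(Q[s_next])
    Q[s, a] += alpha * (target - Q[s, a])
```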
101 Reliability Based Investigation on the Choice of Characteristic Soil Properties
Authors: Jann-Eike Saathoff, Kirill Alexander Schmoor, Martin Achmus, Mauricio Terceros
Abstract:
By using partial factors of safety, uncertainties due to the inherent variability of the soil properties and loads are taken into account in the geotechnical design process. According to the reliability index concept in Eurocode-0 in conjunction with Eurocode-7, a minimum safety level of β = 3.8 for reliability class RC2 shall be established. The reliability of the system depends heavily on the choice of the prespecified safety factor and the choice of the characteristic soil properties. The safety factors stated in the standards are mainly based on experience. However, no generally accepted method for the calculation of a characteristic value exists within current design practice. In this study, a laterally loaded monopile is investigated and the influence of the chosen quantile values of the deterministic system, calculated with p-y springs, is presented. Monopiles are the most common foundation concept for offshore wind energy converters. Based on the calculations for non-cohesive soils, a recommendation for an appropriate quantile value for the necessary safety level according to the standards for a deterministic design is given.
Keywords: Asymptotic sampling, characteristic value, monopile foundation, probabilistic design, quantile values.
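The characteristic value discussed above is typically taken as a low quantile of the soil parameter distribution. The sketch below estimates such a quantile from test samples; the friction-angle data, the normal distribution and the 5% level are illustrative assumptions, not the study's prescription.
```python
import numpy as np

rng = np.random.default_rng(2)

# hypothetical friction-angle measurements from a site investigation (degrees)
phi_samples = rng.normal(loc=35.0, scale=2.5, size=40)

def characteristic_value(samples, quantile=0.05):
    """Characteristic value as a lower quantile of the measured soil property.
    The 5% quantile is a common convention; other quantiles can be compared
    against the target reliability index (e.g. beta = 3.8 for RC2)."""
    return float(np.quantile(samples, quantile))

for q in (0.01, 0.05, 0.10, 0.50):
    print(f"{q:.0%} quantile: {characteristic_value(phi_samples, q):.2f} deg")
```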
100 Young’s Modulus Variability: Influence on Masonry Vault Behavior
Authors: A. Zanaz, S. Yotte, F. Fouchal, A. Chateauneuf
Abstract:
This paper presents a methodology for probabilistic assessment of the bearing capacity and prediction of the failure mechanism of masonry vaults at the ultimate state, with consideration of the natural variability of the Young’s modulus of the stones. First, the computation model is explained. The failure mode corresponds to the four-hinge mechanism. Based on this consideration, the study of a vault composed of 16 segments is presented. The Young’s modulus of the segments is considered as a random variable defined by a mean value and a coefficient of variation. A relationship linking the vault bearing capacity to the variation of the voussoirs’ modulus is proposed. The most probable failure mechanisms, in addition to that observed in the deterministic case, are identified for each variability level, as well as their probability of occurrence. The results show that the probability of occurrence of the mechanism observed in the deterministic case decreases as the variability grows, while the number of other mechanisms and their probability of occurrence increase with the coefficient of variation of the Young’s modulus. This means that if a significant variation of the Young’s modulus of the segments is proven, taking it into account in computations becomes mandatory, both for determining the vault bearing capacity and for predicting its failure mechanism.
Keywords: Masonry, mechanism, probability, variability, vault.
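The probabilistic part of the study above amounts to sampling the 16 segment moduli from a mean and a coefficient of variation and counting how often each failure mechanism occurs. The sketch below shows that Monte Carlo bookkeeping; the normal distribution, the modulus values and the `mechanism_id` stand-in for the structural analysis are all hypothetical illustration choices, not the paper's model.
```python
import numpy as np

rng = np.random.default_rng(3)

def sample_moduli(mean_E, cov, n_segments=16, n_trials=10000):
    """Draw Young's moduli for the 16 voussoirs from a normal distribution
    defined by a mean value and a coefficient of variation (illustrative choice)."""
    sigma = cov * mean_E
    return rng.normal(mean_E, sigma, size=(n_trials, n_segments))

def mechanism_id(E_row):
    """Hypothetical stand-in for the structural analysis: the 'mechanism' is
    labelled by the indices of the two stiffest and two softest segments,
    mimicking how hinge positions depend on the modulus distribution."""
    order = np.argsort(E_row)
    return tuple(sorted(order[:2])) + tuple(sorted(order[-2:]))

E = sample_moduli(mean_E=20e9, cov=0.15)
ids, counts = np.unique([mechanism_id(row) for row in E], axis=0, return_counts=True)
probs = counts / counts.sum()
print("number of distinct mechanisms:", len(ids))
print("largest mechanism probability:", probs.max())
```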
99 A Novel Solution Methodology for Transit Route Network Design Problem
Authors: Ghada Moussa, Mamoud Owais
Abstract:
The Transit Route Network Design Problem (TrNDP) is the most important component of transit planning, on which the overall cost of the public transportation system highly depends. The main purpose of this study is to develop a novel solution methodology for the TrNDP that goes beyond previous traditional sophisticated approaches. The novelty of the solution methodology adopted in this paper lies in the deterministic operators used to construct bus routes. The deterministic manner of the TrNDP solution relies on using linear and integer mathematical formulations that can be solved exactly with their standard solvers. The solution methodology has been tested on Mandl’s benchmark network problem. The test results showed that the methodology developed in this research is able to improve the given network solution in terms of the number of constructed routes, direct transit service coverage, transfer directness and solution reliability. Although the set of routes resulting from the methodology could stand alone as a final efficient solution for the TrNDP, it could also be used as an initial solution for meta-heuristic procedures to approach the global optimum. Based on the presented methodology, a more robust network optimization tool can be produced for public transportation planning purposes.
Keywords: Integer programming, Transit route design, Transportation, Urban planning.
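The abstract above leans on integer formulations solved exactly with standard solvers. The toy model below is a sketch in that spirit: choose a minimum number of candidate routes so that every demand pair is served directly. It assumes the PuLP package with its bundled CBC solver is available, and the candidate routes, demand pairs and coverage sets are made-up illustration data, not Mandl's network.
```python
import pulp

# Toy route-selection integer program: minimize the number of routes used
# while covering every origin-destination demand pair directly.
routes = {
    "r1": {("A", "B"), ("B", "C")},
    "r2": {("B", "C"), ("C", "D")},
    "r3": {("A", "B"), ("C", "D")},
}
demand_pairs = [("A", "B"), ("B", "C"), ("C", "D")]

prob = pulp.LpProblem("transit_route_selection", pulp.LpMinimize)
x = {r: pulp.LpVariable(f"use_{r}", cat="Binary") for r in routes}

prob += pulp.lpSum(x.values())                      # objective: number of routes
for pair in demand_pairs:                           # every demand pair served directly
    prob += pulp.lpSum(x[r] for r in routes if pair in routes[r]) >= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([r for r in routes if x[r].value() == 1])
```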
98 Real-time Performance Study of EPA Periodic Data Transmission
Authors: Liu Ning, Zhong Chongquan, Teng Hongfei
Abstract:
EPA (Ethernet for Plant Automation) resolves the non-deterministic problem of standard Ethernet and accomplishes real-time communication by means of a micro-segment topology and a deterministic scheduling mechanism. This paper studies the real-time performance of EPA periodic data transmission from theoretical and experimental perspectives. By analyzing the information transmission characteristics and the EPA deterministic scheduling mechanism, five indicators that can be used to specify the real-time performance of EPA periodic data transmission are presented and investigated: delivery time, time synchronization accuracy, data-sending time offset accuracy, utilization percentage of the configured timeslice, and non-RTE bandwidth. On this basis, the test principles and test methods of the indicators are studied and some formulas for the real-time performance of an EPA system are derived. Furthermore, an experiment platform is developed to test the indicators of EPA periodic data transmission in a micro-segment. Based on the analysis and the experiment, methods to improve the real-time performance of EPA periodic data transmission are proposed, including optimizing the network structure, studying a self-adaptive adjustment method of the timeslice, and providing data-sending time offset accuracy for configuration.
Keywords: EPA system, Industrial Ethernet, Periodic data, Real-time performance
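Two of the indicators above, timeslice utilization and non-RTE bandwidth, reduce to simple arithmetic over the macrocycle schedule. The sketch below computes them for a made-up configuration; the link rate, macrocycle length, timeslice lengths and frame sizes are illustrative assumptions, not values from the paper.
```python
# Illustrative macrocycle bookkeeping for EPA-style deterministic scheduling.
link_rate_bps = 100e6          # assumed 100 Mbit/s link
macrocycle_s = 1e-3            # assumed 1 ms macrocycle
timeslices_s = [120e-6, 120e-6, 200e-6]    # configured timeslices of the RTE devices
periodic_frame_bits = [8000, 6400, 12000]  # periodic payload sent in each timeslice

for slot, bits in zip(timeslices_s, periodic_frame_bits):
    tx_time = bits / link_rate_bps
    print(f"timeslice utilization: {tx_time / slot:.1%}")

# whatever remains of the macrocycle after all configured timeslices
# is available for non-real-time (non-RTE) traffic
non_rte_share = 1.0 - sum(timeslices_s) / macrocycle_s
print(f"non-RTE bandwidth share: {non_rte_share:.1%}")
```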
97 Network-Constrained AC Unit Commitment under Uncertainty Using a Bender’s Decomposition Approach
Authors: B. Janani, S. Thiruvenkadam
Abstract:
In this work, the impact of adopting a stochastic approach to day-ahead Unit Commitment is evaluated. Comparisons between stochastic and deterministic Unit Commitment solutions are provided. The Unit Commitment model consists of the minimization of the total operation costs considering the units' technical constraints, such as ramping rates and minimum up and down times. Load shedding and wind power spilling are acceptable, but at inflated operational costs. The evaluation process consists of calculating the optimal unit commitment and verifying the fulfillment of the considered constraints. For the calculation of the optimal unit commitment, an algorithm based on Benders decomposition, namely on Dual Dynamic Programming, was developed. Two approaches were considered for the construction of stochastic solutions. Data related to wind power outputs from two different operational days are considered in the analysis. Stochastic and deterministic solutions are compared based on the actual measured wind power output on the operational day. Through a technique capable of finding representative wind power scenarios and their probabilities, the system can provide a more detailed analysis of the expected final operational cost.
Keywords: Benders’ decomposition, network constrained AC unit commitment, stochastic programming, wind power uncertainty.
96 Deep Reinforcement Learning Approach for Trading Automation in the Stock Market
Authors: Taylan Kabbani, Ekrem Duman
Abstract:
Deep Reinforcement Learning (DRL) algorithms can scale to previously intractable problems. The automation of profit generation in the stock market is possible using DRL by combining the financial asset price "prediction" step and the portfolio "allocation" step in one unified process, producing fully autonomous systems capable of interacting with their environment to make optimal decisions through trial and error. This work presents a DRL model to generate profitable trades in the stock market, effectively overcoming the limitations of supervised learning approaches. We formulate the trading problem as a Partially Observed Markov Decision Process (POMDP) model, considering the constraints imposed by the stock market, such as liquidity and transaction costs. We then solve the formulated POMDP problem using the Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithm and achieve a 2.68 Sharpe ratio on the test dataset. From the point of view of stock market forecasting and intelligent decision-making mechanisms, this paper demonstrates the superiority of DRL over other types of machine learning in financial markets and proves its credibility and advantages for strategic decision-making.
Keywords: Autonomous agent, deep reinforcement learning, MDP, sentiment analysis, stock market, technical indicators, twin delayed deep deterministic policy gradient.
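The Sharpe ratio quoted above is the usual mean-over-volatility measure of risk-adjusted return. The snippet below computes an annualized Sharpe ratio from a series of daily portfolio returns; the synthetic return series, the zero risk-free rate and the 252-day annualization factor are illustrative assumptions.
```python
import numpy as np

def sharpe_ratio(daily_returns, risk_free_daily=0.0, periods_per_year=252):
    """Annualized Sharpe ratio: mean excess return over its standard deviation,
    scaled by the square root of the number of trading periods per year."""
    excess = np.asarray(daily_returns) - risk_free_daily
    return np.sqrt(periods_per_year) * excess.mean() / excess.std(ddof=1)

rng = np.random.default_rng(4)
returns = rng.normal(0.0008, 0.01, size=252)   # hypothetical daily returns
print(f"Sharpe ratio: {sharpe_ratio(returns):.2f}")
```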
95 Characterizing Multivariate Thresholds in Industrial Engineering
Authors: Ali E. Abbas
Abstract:
This paper highlights some of the normative issues that may result from setting independent thresholds in risk analyses, particularly with safety regions. A second objective is to explain how such regions can be specified appropriately in a meaningful way. We start with a review of the importance of setting deterministic trade-offs among target requirements. We then show how to determine safety regions for risk analysis appropriately using utility functions.
Keywords: Decision analysis, thresholds, risk, reliability.
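The contrast drawn above is between accepting designs by independent per-attribute thresholds and accepting them by a utility-based region that allows trade-offs. The sketch below illustrates how the two criteria can disagree on the same points; the threshold values and the multiplicative utility form are illustrative assumptions, not the paper's functions.
```python
# Two performance attributes, e.g. a strength margin x1 and a temperature margin x2.
# Independent thresholds accept any point with x1 >= t1 and x2 >= t2, ignoring
# trade-offs; a utility function defines a safety region where one attribute
# can compensate for the other.
t1, t2 = 0.6, 0.5   # illustrative independent thresholds

def accept_independent(x1, x2):
    return (x1 >= t1) and (x2 >= t2)

def utility(x1, x2, k=0.7):
    """Simple multiplicative utility on attributes scaled to [0, 1]."""
    return k * x1 * x2 + (1 - k) * 0.5 * (x1 + x2)

def accept_utility(x1, x2, u_min=0.45):
    return utility(x1, x2) >= u_min

for point in [(0.65, 0.55), (0.9, 0.45), (0.55, 0.95)]:
    print(point, accept_independent(*point), accept_utility(*point))
```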
94 Soft Real-Time Fuzzy Task Scheduling for Multiprocessor Systems
Authors: Mahdi Hamzeh, Sied Mehdi Fakhraie, Caro Lucas
Abstract:
All practical real-time scheduling algorithms for multiprocessor systems present a trade-off between computational complexity and performance. In real-time systems, tasks have to be performed correctly and on time. Finding a minimal schedule in multiprocessor systems with real-time constraints is known to be NP-hard. Although some optimal algorithms have been employed in uni-processor systems, they fail when applied to multiprocessor systems. Practical scheduling algorithms for real-time systems do not have deterministic response times, yet deterministic timing behavior is an important parameter for system robustness analysis. The intrinsic uncertainty in dynamic real-time systems increases the difficulty of the scheduling problem. To alleviate these difficulties, we propose a fuzzy scheduling approach to arrange real-time periodic and non-periodic tasks in multiprocessor systems. Static and dynamic optimal scheduling algorithms fail under non-critical overload. In contrast, our approach balances the task loads of the processors successfully while considering starvation prevention and fairness, so that higher-priority tasks have a higher running probability. A simulation is conducted to evaluate the performance of the proposed approach. Experimental results show that the proposed fuzzy scheduler creates feasible schedules for homogeneous and heterogeneous tasks. It also considers task priorities, which yields higher system utilization and lower deadline miss times. According to the results, it performs very close to the optimal schedule of uni-processor systems.
Keywords: Computational complexity, Deadline, Feasible scheduling, Fuzzy scheduling, Priority, Real-time multiprocessor systems, Robustness, System utilization.
93 On Bounding Jayanti's Distributed Mutual Exclusion Algorithm
Authors: Awadhesh Kumar Singh
Abstract:
Jayanti's algorithm is one of the best known abortable mutual exclusion algorithms. This work is an attempt to overcome an already known limitation of the algorithm while preserving all of its important properties and elegance. The limitation is that the token number used to assign process identification numbers to new incoming processes is unbounded. We use a suitably adapted alternative data structure in order to completely eliminate the use of the token number in the algorithm.
Keywords: Abortable, deterministic, local spin, mutual exclusion.
92 A Logic Approach to Database Dynamic Updating
Authors: Daniel Stamate
Abstract:
We introduce a logic-based framework for database updating under constraints. In our framework, the constraints are represented as an instantiated extended logic program. When performing an update, database consistency may be violated. We provide an approach for maintaining database consistency and study the conditions under which the maintenance process is deterministic. We show that the complexity of the computations and decision problems presented in our framework is in each case polynomial time.
Keywords: Databases, knowledge bases, constraints, updates, minimal change, consistency.
91 On Deterministic Chaos: Disclosing the Missing Mathematics from the Lorenz-Haken Equations
Authors: Belkacem Meziane
Abstract:
The original 3D Lorenz-Haken equations, which describe laser dynamics, are converted into two second-order differential equations out of which the so far missing mathematics is extracted. Leaning on high-order trigonometry, important outcomes are obtained: a fundamental result attributes chaos to forbidden periodic solutions inside a precisely delimited region of the control parameter space that governs self-pulsing.
Keywords: chaos, Lorenz-Haken equations, laser dynamics, nonlinearities
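The Lorenz-Haken laser equations map onto the classical Lorenz system, so their self-pulsing and chaotic regimes can be explored numerically. The sketch below integrates the Lorenz form with a fixed-step RK4 scheme; the parameter values are the textbook chaotic set, not the paper's laser parameters, and the code is an illustration rather than the paper's analysis.
```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Classical Lorenz system, onto which the Lorenz-Haken laser equations map."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(f, state, dt):
    """One fixed-step fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([1.0, 1.0, 1.0])
trajectory = []
for _ in range(20000):
    state = rk4_step(lorenz, state, dt=0.005)
    trajectory.append(state)
trajectory = np.array(trajectory)
print("x range:", trajectory[:, 0].min(), trajectory[:, 0].max())
```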
90 How to Build and Evaluate a Solution Method: An Illustration for the Vehicle Routing Problem
Authors: Nicolas Zufferey
Abstract:
The vehicle routing problem (VRP) is a famous combinatorial optimization problem. Because of its well-known difficulty, metaheuristics are the most appropriate methods to tackle large and realistic instances. The goal of this paper is to highlight the key ideas for designing VRP metaheuristics according to the following criteria: efficiency, speed, robustness, and ability to take advantage of the problem structure. Such elements can obviously be used to build solution methods for other combinatorial optimization problems, at least in the deterministic field.
Keywords: Vehicle routing problem, Metaheuristics, Combinatorial optimization.
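Most VRP metaheuristics start from a cheap constructive solution and then improve it. The Python sketch below builds such an initial solution with a capacity-aware nearest-neighbor rule; the coordinates, demands and vehicle capacity are made-up illustration data, and the rule is a generic choice rather than the paper's method.
```python
import math

# made-up customers: (x, y, demand); depot at index 0 with zero demand
nodes = [(0, 0, 0), (2, 3, 4), (5, 1, 3), (6, 6, 5), (1, 7, 2), (8, 3, 6)]
capacity = 10
assert all(n[2] <= capacity for n in nodes)   # every customer fits in one vehicle

def dist(i, j):
    return math.hypot(nodes[i][0] - nodes[j][0], nodes[i][1] - nodes[j][1])

def nearest_neighbor_routes():
    """Greedy construction: repeatedly extend the current route to the closest
    unvisited customer that still fits in the vehicle, else start a new route."""
    unvisited = set(range(1, len(nodes)))
    routes = []
    while unvisited:
        route, load, current = [], 0, 0
        while True:
            candidates = [j for j in unvisited if load + nodes[j][2] <= capacity]
            if not candidates:
                break
            nxt = min(candidates, key=lambda j: dist(current, j))
            route.append(nxt)
            load += nodes[nxt][2]
            unvisited.discard(nxt)
            current = nxt
        routes.append(route)
    return routes

print(nearest_neighbor_routes())
```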
89 Statistical Analysis of First Order Plus Dead-time System using Operational Matrix
Authors: Pham Luu Trung Duong, Moonyong Lee
Abstract:
To increase the precision and reliability of automatic control systems, we have to take into account the random factors affecting the control system. Thus, an operational matrix technique is used for the statistical analysis of a first order plus time delay system with a uniform random parameter. Examples with deterministic and stochastic disturbances are considered to demonstrate the validity of the method. A comparison with the Monte Carlo method is made to show the computational effectiveness of the method.
Keywords: First order plus dead-time, Operational matrix, Statistical analysis, Walsh function.
88 A Comparison of Exact and Heuristic Approaches to Capital Budgeting
Authors: Jindřiška Šedová, Miloš Šeda
Abstract:
This paper summarizes and compares approaches to solving the knapsack problem and its known application in capital budgeting. The first approach uses deterministic methods and can be applied to small-size tasks with a single constraint. We can also apply commercial software systems such as the GAMS modelling system. However, because of the NP-completeness of the problem, more complex problem instances must be solved by means of heuristic techniques to achieve an approximation of the exact solution in a reasonable amount of time. We show the problem representation and parameter settings for a genetic algorithm framework.
Keywords: Capital budgeting, knapsack problem, GAMS, heuristic method, genetic algorithm.
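For small single-constraint instances like those mentioned above, capital budgeting is the classic 0/1 knapsack problem, which can be solved exactly by dynamic programming. The sketch below does exactly that; the project costs, profits and budget are made-up illustration data.
```python
def knapsack(costs, profits, budget):
    """Exact 0/1 knapsack via dynamic programming over the budget.
    Returns the best total profit and the chosen project indices."""
    n = len(costs)
    best = [[0] * (budget + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for b in range(budget + 1):
            best[i][b] = best[i - 1][b]
            if costs[i - 1] <= b:
                best[i][b] = max(best[i][b],
                                 best[i - 1][b - costs[i - 1]] + profits[i - 1])
    chosen, b = [], budget
    for i in range(n, 0, -1):          # trace back the selected projects
        if best[i][b] != best[i - 1][b]:
            chosen.append(i - 1)
            b -= costs[i - 1]
    return best[n][budget], sorted(chosen)

# hypothetical projects: investment costs, expected profits, budget of 10 units
print(knapsack(costs=[3, 4, 5, 2], profits=[6, 7, 9, 3], budget=10))
```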
87 DPSO Based SEPIC Converter in PV System under Partial Shading Condition
Authors: K. Divya, G. Sugumaran
Abstract:
This paper proposes an improved Maximum Power Point Tracking method for a photovoltaic system using the Deterministic Particle Swarm Optimization technique. This method has the ability to track the maximum power under varying environmental conditions, i.e. partial shading conditions. The advantage of this method is that the particles move within a restricted velocity range to achieve the maximum power. A SEPIC converter is employed to boost the voltage of the PV system. To evaluate the proposed method, a MATLAB simulation is carried out under partial shading conditions.
Keywords: DPSO, Partial shading condition, P&O, PV, SEPIC.
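The distinguishing feature mentioned above is the restricted (clamped) particle velocity. The sketch below shows a generic particle swarm step with velocity clamping on a one-dimensional toy power curve; the PV curve, swarm size and PSO coefficients are illustrative assumptions, not the paper's settings.
```python
import numpy as np

rng = np.random.default_rng(5)

def pv_power(duty):
    """Toy stand-in for the PV power vs. duty-cycle curve under partial shading:
    two local maxima, the global one near duty = 0.7."""
    return np.sin(3 * np.pi * duty) * 0.3 + np.exp(-((duty - 0.7) / 0.15) ** 2)

n, w, c1, c2, v_max = 6, 0.6, 1.5, 1.5, 0.05   # illustrative PSO settings
x = rng.uniform(0.0, 1.0, n)                   # duty-cycle positions
v = np.zeros(n)
pbest, pbest_val = x.copy(), pv_power(x)
gbest = pbest[np.argmax(pbest_val)]

for _ in range(50):
    r1, r2 = rng.random(n), rng.random(n)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    v = np.clip(v, -v_max, v_max)              # restricted (clamped) velocity
    x = np.clip(x + v, 0.0, 1.0)
    val = pv_power(x)
    improved = val > pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmax(pbest_val)]

print(f"duty cycle at tracked maximum power point: {gbest:.3f}")
```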
86 BIBD-s for (13, 5, 5), (16, 6, 5) and (21, 6, 4) Possessing Possibly an Automorphism of Order 3
Authors: Ivica Martinjak, Mario-Osvin Pavcevic
Abstract:
When trying to enumerate all BIBDs for given parameters, their natural solution space turns out to be huge and grows extremely fast with the number of points of the design. Therefore, constructive enumerations are often carried out by assuming additional constraints on the design's structure, automorphisms being the most frequently used ones. It remains a hard task to construct designs with a trivial automorphism group – those with no additional symmetry – although it is believed that most BIBDs belong to that case. In this paper, very many new designs with parameters 2-(13, 5, 5), 2-(16, 6, 5) and 2-(21, 6, 4) are constructed, assuming an action of an automorphism of order 3. Moreover, it was possible to construct millions of such designs with no non-trivial automorphisms.
Keywords: BIBD, incidence matrix, automorphism group, tactical decomposition, deterministic algorithm.
85 The Different Ways to Describe Regular Languages by Using Finite Automata and the Changing Algorithm Implementation
Authors: Abdulmajid Mukhtar Afat
Abstract:
This paper aims at introducing finite automata theory, the different ways to describe regular languages, and the creation of a program that implements the subset construction algorithm to convert a nondeterministic finite automaton (NFA) to a deterministic finite automaton (DFA). The program is written in the C++ programming language. It reads FA 5-tuples from a text file and then classifies the automaton as either a DFA or an NFA. For a DFA, the program reads a string w and decides whether or not it is accepted; if accepted, the program saves the tracking path and points it out. On the other hand, when the automaton is an NFA, the program converts it to a DFA so that it is easy to track and to decide whether w belongs to the regular language or not.
Keywords: Finite automata, subset construction, DFA, NFA.
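The core of the conversion described above is the subset construction. The sketch below (in Python, whereas the paper's implementation is in C++) converts a small NFA without epsilon-transitions into an equivalent DFA; the example automaton, which accepts strings over {0, 1} ending in "01", is illustration data.
```python
from itertools import chain

def subset_construction(nfa, start, alphabet):
    """NFA -> DFA by the subset construction (no epsilon-transitions assumed).
    nfa maps (state, symbol) -> set of next states; DFA states are frozensets."""
    start_set = frozenset([start])
    dfa, todo = {}, [start_set]
    while todo:
        current = todo.pop()
        if current in dfa:
            continue
        dfa[current] = {}
        for symbol in alphabet:
            nxt = frozenset(chain.from_iterable(nfa.get((q, symbol), ()) for q in current))
            dfa[current][symbol] = nxt
            if nxt not in dfa:
                todo.append(nxt)
    return dfa

def accepts(dfa, start_set, accepting, w):
    """Run the DFA on string w; accept if the final subset meets an NFA accepting state."""
    state = start_set
    for ch in w:
        state = dfa[state][ch]
    return bool(state & accepting)

# NFA accepting strings over {0, 1} that end in "01"
nfa = {("q0", "0"): {"q0", "q1"}, ("q0", "1"): {"q0"}, ("q1", "1"): {"q2"}}
dfa = subset_construction(nfa, "q0", alphabet="01")
print(accepts(dfa, frozenset(["q0"]), {"q2"}, "10101"))   # True
print(accepts(dfa, frozenset(["q0"]), {"q2"}, "0110"))    # False
```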
84 Generator of Hypotheses an Approach of Data Mining Based on Monotone Systems Theory
Authors: Rein Kuusik, Grete Lind
Abstract:
Generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts in a subset of facts. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic (not probabilistic). The search results are trees, a form that is easy to comprehend and interpret. Generator of hypotheses uses a very effective algorithm based on the theory of monotone systems (MS) named MONSA (MONotone System Algorithm).
Keywords: Data mining, monotone systems, pattern, rule.
83 Unscented Grid Filtering and Smoothing for Nonlinear Time Series Analysis
Authors: Nikolay Nikolaev, Evgueni Smirnov
Abstract:
This paper develops an unscented grid-based filter and a smoother for accurate nonlinear modeling and analysis of time series. The filter uses unscented deterministic sampling during both the time and measurement updating phases to approximate directly the distributions of the latent state variable. A complementary grid smoother is also developed to enable computation of the likelihood. This helps us to formulate an expectation maximisation algorithm for maximum likelihood estimation of the state noise and the observation noise. Empirical investigations show that the proposed unscented grid filter/smoother compares favourably to other similar filters on nonlinear estimation tasks.
82 Contour Estimation in Synthetic and Real Weld Defect Images based on Maximum Likelihood
Authors: M. Tridi, N. Nacereddine, N. Oucief
Abstract:
This paper describes a novel method for the automatic estimation of the contours of weld defects in radiography images. Generally, contour detection is the first operation applied in a visual recognition system. Our approach can be described as a region-based maximum likelihood formulation of parametric deformable contours. This formulation provides robustness against poor image quality and allows simultaneous estimation of the contour parameters together with other parameters of the model. Implementation is performed by a deterministic iterative algorithm with minimal user intervention. Results testify to the very good performance of the approach, especially on synthetic weld defect images.
Keywords: Contour, Gaussian, likelihood, Rayleigh.
81 Comparison of Material Constitutive Models Used in FEA of Low Volume Roads
Authors: Lenka Ševelová, Aleš Florian
Abstract:
Probabilistic models used in reliability analyses are an appropriate and progressive tool for analyzing the behavior of low volume roads. A necessary part of the probabilistic model is the deterministic model of structural behavior. The FE model of low volume roads is created in the ANSYS software. It is able to determine the state of stress and deformation at any point of the structure and thus generate the data required for the reliability analysis. The paper compares two material constitutive models used for modeling the unbound non-homogeneous materials used in low volume roads. The first model is the linear elastic model according to Hooke's theory (H model); the second one is the nonlinear elastic-plastic Drucker-Prager model (D-P model).
Keywords: FEA, FEM, geotechnical materials, low volume roads, material constitutive models, pavement.
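For the linear elastic (H) model mentioned above, stress follows from strain through the isotropic Hooke's law. The sketch below evaluates it with the Lamé constants computed from E and ν; the material values are illustrative, not those of the road layers in the paper.
```python
import numpy as np

def hooke_stress(strain, E, nu):
    """Isotropic linear elastic (Hooke) law: sigma = lambda*tr(eps)*I + 2*mu*eps,
    with Lame constants derived from Young's modulus E and Poisson's ratio nu."""
    lam = E * nu / ((1 + nu) * (1 - 2 * nu))
    mu = E / (2 * (1 + nu))
    return lam * np.trace(strain) * np.eye(3) + 2 * mu * strain

# illustrative values for an unbound granular layer
E, nu = 200e6, 0.30                       # Pa, dimensionless
strain = np.array([[1e-4, 0, 0],
                   [0, -3e-5, 0],
                   [0, 0, -3e-5]])        # small-strain tensor
print(hooke_stress(strain, E, nu) / 1e3)  # stresses in kPa
```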
80 Some Solid Transportation Models with Crisp and Rough Costs
Authors: Pradip Kundu, Samarjit Kar, Manoranjan Maiti
Abstract:
In this paper, some practical solid transportation models are formulated considering the per-trip capacity of each type of conveyance, with crisp and rough unit transportation costs. This is applicable to systems in which full vehicles, e.g. trucks or rail coaches, are booked for the transportation of products, so that the transportation cost is determined on the basis of full conveyances. The models with unit transportation costs as rough variables are transformed into deterministic forms using rough chance-constrained programming with the help of the trust measure. Numerical examples are provided to illustrate the proposed models in a crisp environment as well as with unit transportation costs as rough variables.
Keywords: Solid transportation problem, Rough set, Rough variable, Trust measure.
79 The Role of ICT for Income Inequality: The Model and the Simulations
Authors: Shoji Katagiri
Abstract:
This paper clarifies the relationship between ICT and income inequality. To do so, we develop a general equilibrium model with ICT investment, obtain the equilibrium solutions, and then simulate the model with these solutions for some OECD countries. As a result, we generally confirm that, during the corresponding periods, the relationship between ICT investment and income inequality is positive. In this model, an increase in the ratio of ICT investment to the aggregate investment in stock enhances capital's share of income and finally leads to income inequality, such as an increase in the share of the top decile income. Although we confirm the positive relationship between ICT investment and income inequality, the upward trend of that relationship depends on the parameter values used in the simulations, and these parameters do not deterministically fix the magnitudes of the calculated results.
Keywords: ICT, inequality, capital accumulation, technology.
78 The Problem of Using the Calculation of the Critical Path to Solver Instances of the Job Shop Scheduling Problem
Authors: Marco Antonio Cruz-Chávez, Juan Frausto-Solís, Fernando Ramos-Quintana
Abstract:
A procedure commonly used in the Job Shop Scheduling Problem (JSSP) to evaluate the neighborhood functions used by non-deterministic algorithms is the calculation of the critical path in a digraph. This paper presents an experimental study of the computational cost incurred when the critical path is calculated in solutions of large JSSP instances. The results indicate that if the critical path is used to generate neighborhoods in the meta-heuristics applied to the JSSP, an elevated computational cost exists, in spite of the fact that the calculation of the critical path in any digraph is of polynomial complexity.
Keywords: Job Shop, CPM, critical path, neighborhood, meta-heuristic.
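The critical path referred to above is the longest path through the weighted digraph of the schedule. The sketch below computes a longest path in a small weighted DAG via a topological order; the toy graph stands in for a JSSP solution digraph and is illustration data.
```python
from collections import defaultdict, deque

def critical_path(edges, n):
    """Longest (critical) path in a DAG with n nodes.
    edges: list of (u, v, weight). Returns the length and one critical path."""
    adj, indeg = defaultdict(list), [0] * n
    for u, v, w in edges:
        adj[u].append((v, w))
        indeg[v] += 1
    order, queue = [], deque(i for i in range(n) if indeg[i] == 0)
    while queue:                       # Kahn's topological sort
        u = queue.popleft()
        order.append(u)
        for v, _ in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    dist, pred = [0] * n, [None] * n
    for u in order:                    # relax edges in topological order
        for v, w in adj[u]:
            if dist[u] + w > dist[v]:
                dist[v], pred[v] = dist[u] + w, u
    end = max(range(n), key=lambda i: dist[i])
    path, node = [], end
    while node is not None:
        path.append(node)
        node = pred[node]
    return dist[end], path[::-1]

# toy operation graph: (from, to, processing time)
edges = [(0, 1, 3), (0, 2, 2), (1, 3, 4), (2, 3, 6), (3, 4, 1)]
print(critical_path(edges, n=5))       # (9, [0, 2, 3, 4])
```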
77 Bandwidth allocation in ATM Network for different QOS Requirements
Authors: H. El-Madbouly
Abstract:
For the future Broadband ISDN, Asynchronous Transfer Mode (ATM) is designed not only to support a wide range of traffic classes with diverse flow characteristics, but also to guarantee different quality of service (QoS) requirements. The QoS may be measured in terms of cell loss probability and maximum cell delay. In this paper, ATM networks in which the virtual path (VP) concept is implemented are considered. By applying the Markov deterministic process method, an efficient algorithm is derived to compute the minimum capacity required to satisfy the QoS requirements when multiple classes of on-off sources are multiplexed onto a single VP. Using this result, we then propose a simple algorithm to determine different combinations of VPs that optimize the total capacity required to satisfy the individual QoS requirements (loss-delay).
Keywords: Bandwidth allocation, quality of service, ATM network, virtual path.
76 Image Authenticity and Perceptual Optimization via Genetic Algorithm and a Dependence Neighborhood
Authors: Imran Usman, Asifullah Khan, Rafiullah Chamlawi, Abdul Majid
Abstract:
Information hiding for authenticating and verifying the content integrity of multimedia has been exploited extensively in the last decade. We propose the idea of using a genetic algorithm and non-deterministic dependence involving the un-watermarkable coefficients for digital image authentication. A genetic algorithm is used to intelligently select coefficients for watermarking in a DCT-based image authentication scheme, which also implicitly watermarks all the un-watermarkable coefficients in order to thwart different attacks. Experimental results show that such intelligent selection improves the imperceptibility of the watermarked image, and the implicit watermarking of all the coefficients improves security against attacks such as cover-up, vector quantization and transplantation.
Keywords: Digital watermarking, fragile watermarking, genetic algorithm, image authentication.
75 A Multi-objective Fuzzy Optimization Method of Resource Input Based on Genetic Algorithm
Abstract:
With the increasing complexity of engineering problems, traditional, single-objective and deterministic optimization methods cannot meet people's requirements. A multi-objective fuzzy optimization model of resource input is built for the M chlor-alkali chemical eco-industrial park in this paper. First, the model is transformed, using fuzzy theory, into a form that can be solved by a genetic algorithm. Then, a fitness function is constructed for the genetic algorithm. Finally, a numerical example is presented to show that the method, compared with the traditional single-objective optimization method, is more practical and efficient.
Keywords: Fitness function, genetic algorithm, multi-objective fuzzy optimization, satisfaction degree membership function.
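The keywords above point to the usual fuzzification step: each objective is mapped to a satisfaction degree through a membership function, and the genetic algorithm maximizes an aggregate of those degrees as its fitness. The sketch below shows one common choice (linear membership functions aggregated by the minimum operator); the objective bounds and the aggregation rule are illustrative assumptions, not the paper's exact formulation.
```python
def linear_membership(value, worst, best):
    """Satisfaction degree in [0, 1]: 0 at the worst acceptable value,
    1 at the best value, linear in between (works for either direction)."""
    if worst == best:
        return 1.0
    mu = (value - worst) / (best - worst)
    return max(0.0, min(1.0, mu))

def fuzzy_fitness(objective_values, bounds):
    """Aggregate satisfaction degrees with the min operator, a common
    conservative choice for multi-objective fuzzy optimization."""
    degrees = [linear_membership(v, worst, best)
               for v, (worst, best) in zip(objective_values, bounds)]
    return min(degrees)

# hypothetical solution: (cost to minimize, output to maximize, emissions to minimize)
bounds = [(120.0, 80.0), (50.0, 90.0), (30.0, 10.0)]   # (worst, best) per objective
print(fuzzy_fitness([95.0, 72.0, 18.0], bounds))       # 0.55
```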
74 Joint Use of Factor Analysis (FA) and Data Envelopment Analysis (DEA) for Ranking of Data Envelopment Analysis
Authors: Reza Nadimi, Fariborz Jolai
Abstract:
This article combines two techniques, data envelopment analysis (DEA) and factor analysis (FA), for data reduction in decision making units (DMUs). Data envelopment analysis, a popular linear programming technique, is useful for comparatively rating the operational efficiency of decision making units based on their deterministic (not necessarily stochastic) input-output data, while factor analysis has been proposed as a data reduction and classification technique that can be applied within DEA to reduce the input-output data. Numerical results reveal that the new approach shows good consistency in ranking with DEA.
Keywords: Effectiveness, decision making, data envelopment analysis, factor analysis.