Search results for: Economic dispatch problem
4274 Application of Transportation Linear Programming Algorithms to Cost Reduction in Nigeria Soft Drinks Industry
Authors: A. O. Salami
Abstract:
Transportation problems are primarily concerned with the optimal way in which products produced at different plants (supply origins) are transported to a number of warehouses or customers (demand destinations). The objective in a transportation problem is to fully satisfy the destination requirements within the operating production capacity constraints at the minimum possible cost. The objective of this study is to determine ways of minimizing transportation cost in order to maximize profit. Data were sourced from the records of the Distribution Department of 7-Up Bottling Company Plc., Ilorin, Kwara State, Nigeria. The data were computed and analyzed using three methods of solving the transportation problem. The result shows that the three methods produced the same total transportation cost amounting to N1,358,019, implying that any of the methods can be adopted by the company in transporting its final products to the wholesale dealers in order to minimize total transportation cost.
Keywords: Allocation problem, Cost Minimization, Distribution system, Resources utilization.
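As an illustration of the underlying model, the listing below is a minimal sketch (not the company's data) that poses a balanced transportation problem as a linear program and solves it with scipy.optimize.linprog; the plants, warehouses, costs, supplies and demands are hypothetical placeholders.

```python
# A minimal sketch: a balanced transportation problem posed as a linear program.
# All cost, supply and demand figures are hypothetical, not the paper's data.
import numpy as np
from scipy.optimize import linprog

# Hypothetical unit shipping costs: 2 plants (rows) x 3 warehouses (columns).
cost = np.array([[4.0, 6.0, 9.0],
                 [5.0, 3.0, 7.0]])
supply = np.array([120.0, 180.0])        # plant capacities
demand = np.array([100.0, 80.0, 120.0])  # warehouse requirements (balanced)

m, n = cost.shape
# Equality constraints: each plant ships exactly its supply,
# each warehouse receives exactly its demand.
A_eq = np.zeros((m + n, m * n))
for i in range(m):
    A_eq[i, i * n:(i + 1) * n] = 1.0     # row sums = supply
for j in range(n):
    A_eq[m + j, j::n] = 1.0              # column sums = demand
b_eq = np.concatenate([supply, demand])

res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
plan = res.x.reshape(m, n)
print("optimal shipping plan:\n", plan.round(1))
print("minimum total transportation cost:", round(res.fun, 2))
```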
4273 Finding Pareto Optimal Front for the Multi-Mode Time, Cost Quality Trade-off in Project Scheduling
Authors: H. Iranmanesh, M. R. Skandari, M. Allahverdiloo
Abstract:
Project managers are ultimately responsible for the overall characteristics of a project, i.e. they should deliver the project on time with minimum cost and with maximum quality. It is vital for any manager to decide a trade-off between these conflicting objectives, and they will benefit from any scientific decision support tool. Our work will try to determine optimal solutions (rather than a single optimal solution) from which the project manager will select his desirable choice to run the project. In this paper, the problem in project scheduling notated as (1,T|cpm,disc,mu|curve:quality,time,cost) will be studied. The problem is multi-objective and the purpose is finding the Pareto optimal front of time, cost and quality of a project (curve:quality,time,cost), whose activities belong to a start-to-finish activity relationship network (cpm) and can be done in different possible modes (mu) which are non-continuous or discrete (disc), each mode having a different cost, time and quality. The project is constrained by a non-renewable resource, i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic is developed to solve it, based on a version of the genetic algorithm specially adapted to multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.
Keywords: FastPGA, Multi-Execution Activity Mode, Pareto Optimality, Project Scheduling, Time-Cost-Quality Trade-Off.
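To illustrate the Pareto-front notion the abstract relies on, the following is a minimal sketch of non-dominated filtering over hypothetical (time, cost, quality) evaluations; FastPGA itself is not reproduced, and quality is negated so that all three objectives are minimized.

```python
# A minimal sketch of Pareto-front extraction over (time, cost, -quality) points.
# The candidate schedules are random stand-ins; FastPGA is not reproduced here.
import numpy as np

def pareto_front(points):
    """Return indices of non-dominated points (all objectives minimized)."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        # p is dominated if some other point is <= in all objectives and < in one.
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

rng = np.random.default_rng(0)
time_ = rng.uniform(20, 60, 200)       # project durations
cost = rng.uniform(1e4, 5e4, 200)      # project costs
quality = rng.uniform(0.5, 1.0, 200)   # higher is better, so negate it below
objs = np.column_stack([time_, cost, -quality])

front = pareto_front(objs)
print(f"{len(front)} non-dominated schedules out of {len(objs)}")
```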
4272 A Meta-Heuristic Algorithm for Set Covering Problem Based on Gravity
Authors: S. Raja Balachandar, K. Kannan
Abstract:
A new meta-heuristic approach called "Randomized Gravitational Emulation Search algorithm (RGES)" for solving large size set covering problems has been designed. This algorithm is founded upon introducing a randomization concept along with two of the four primary parameters, 'velocity' and 'gravity', in physics. A new heuristic operator is introduced in the domain of RGES to maintain feasibility specifically for the set covering problem to yield best solutions. The performance of this algorithm has been evaluated on a large set of benchmark problems from the OR-Library. Computational results showed that the randomized gravitational emulation search algorithm based heuristic is capable of producing high quality solutions. The performance of this heuristic when compared with other existing heuristic algorithms is found to be excellent in terms of solution quality.
Keywords: Set covering problem, velocity, gravitational force, Newton's law, meta heuristic, combinatorial optimization.
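For context, the listing below is a minimal sketch of the classical greedy heuristic for weighted set covering, the kind of constructive/repair step such meta-heuristics typically build on; the instance is randomly generated rather than an OR-Library benchmark, and RGES itself is not reproduced.

```python
# A minimal sketch: greedy heuristic for the weighted set covering problem.
# The instance is random (not an OR-Library benchmark); this is a baseline,
# not the RGES algorithm described in the abstract.
import random

random.seed(1)
n_elements, n_sets = 50, 30
universe = set(range(n_elements))
sets = [set(random.sample(range(n_elements), random.randint(3, 12))) for _ in range(n_sets)]
costs = [random.randint(1, 10) for _ in range(n_sets)]
for e in universe:                          # guarantee the instance is coverable
    sets[random.randrange(n_sets)].add(e)

covered, chosen = set(), []
while covered != universe:
    # Pick the set with the best cost per newly covered element.
    best = min(
        (j for j in range(n_sets) if sets[j] - covered),
        key=lambda j: costs[j] / len(sets[j] - covered),
    )
    chosen.append(best)
    covered |= sets[best]

print("sets chosen:", chosen)
print("total cost:", sum(costs[j] for j in chosen))
```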
4271 Usage of Military Spending, Debt Servicing and Growth for Dealing with Emergency Plan of Indian External Debt
Authors: Sahbi Farhani
Abstract:
This study investigates the relationship between external debt and military spending in the case of India over the period 1970–2012. In doing so, we have applied structural break unit root tests to examine the stationarity properties of the variables. The Auto-Regressive Distributed Lag (ARDL) bounds testing approach is used to test whether cointegration exists in the presence of structural breaks in the series. Our results indicate cointegration among external debt, military spending, debt servicing, and economic growth. Moreover, military spending and debt servicing add to external debt, while economic growth helps in lowering external debt. The Vector Error Correction Model (VECM) analysis and Granger causality test reveal that military spending and economic growth cause external debt. A feedback effect also exists between external debt and debt servicing in the case of India.
Keywords: External debt, military spending, ARDL approach, structural breaks, India.
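As a small illustration of one step in this kind of workflow, the sketch below runs a Granger causality test with statsmodels on simulated annual series that merely stand in for military spending and external debt; the ARDL bounds test and VECM estimation used in the paper are not reproduced.

```python
# A minimal sketch of the Granger-causality step on simulated annual series
# (stand-ins for military spending and external debt, not the paper's data).
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(42)
T = 43  # roughly the 1970-2012 span
spending = np.cumsum(rng.normal(0.5, 1.0, T))               # trending series
debt = 0.8 * np.roll(spending, 1) + rng.normal(0, 0.5, T)   # debt reacts to lagged spending
debt[0] = 0.0

df = pd.DataFrame({"debt": debt, "spending": spending})
# Tests whether the second column Granger-causes the first.
results = grangercausalitytests(df[["debt", "spending"]], maxlag=2)
for lag, (tests, _) in results.items():
    f_stat, p_value, _, _ = tests["ssr_ftest"]
    print(f"lag {lag}: F = {f_stat:.2f}, p = {p_value:.4f}")
```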
4270 A New Measurable Definition of Knowledge in New Growth Theory
Authors: Mohammad Ali Molaei
Abstract:
New Growth Theory helps us make sense of the ongoing shift from a resource-based economy to a knowledge-based economy. It underscores the point that the economic processes which create and diffuse new knowledge are critical to shaping the growth of nations, communities and individual firms. In all too many contributions to New (Endogenous) Growth Theory – though not in all – central reference is made to 'a stock of knowledge', a 'stock of ideas', etc., this variable featuring centre-stage in the analysis. Yet it is immediately apparent that this is far from being a crystal clear concept. The difficulty and uncertainty of capturing the value associated with knowledge is a real problem. The intent of this paper is to introduce new thinking and theorizing about knowledge and its measurability in New Growth Theory. Moreover, the study aims to synthesize various strands of the literature with a practical bearing on the knowledge concept. Through the institutional framework found within NGT, we can indirectly measure the knowledge concept. Institutions matter because they shape the environment for the production and employment of new knowledge.
Keywords: Institution Framework, Knowledge, New Growth Theory (NGT)
4269 Mathematical Programming on Multivariate Calibration Estimation in Stratified Sampling
Authors: Dinesh Rao, M.G.M. Khan, Sabiha Khan
Abstract:
Calibration estimation is a method of adjusting the original design weights to improve the survey estimates by using auxiliary information such as the known population total (or mean) of the auxiliary variables. A calibration estimator uses calibrated weights that are determined to minimize a given distance measure to the original design weights while satisfying a set of constraints related to the auxiliary information. In this paper, we propose a new multivariate calibration estimator for the population mean in the stratified sampling design, which incorporates information available for more than one auxiliary variable. The problem of determining the optimum calibrated weights is formulated as a Mathematical Programming Problem (MPP) that is solved using the Lagrange multiplier technique.
Keywords: Calibration estimation, Stratified sampling, Multivariate auxiliary information, Mathematical programming problem, Lagrange multiplier technique.
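To make the Lagrange-multiplier formulation concrete, the listing below is a minimal sketch of chi-square distance calibration (a GREG-type estimator) in a single stratum: design weights are adjusted in closed form so that the weighted auxiliary totals hit known benchmarks. The data, design weights and population totals are hypothetical, and this is not the authors' exact multivariate MPP.

```python
# A minimal sketch: calibration of design weights to known auxiliary totals
# using the chi-square distance, solved in closed form via Lagrange multipliers.
# Data, weights and auxiliary totals are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
n = 100                                   # sample size (one stratum, for brevity)
d = np.full(n, 10.0)                      # original design weights
X = np.column_stack([rng.normal(50, 5, n), rng.normal(20, 3, n)])  # 2 auxiliary variables
y = 3.0 + 0.4 * X[:, 0] + 0.9 * X[:, 1] + rng.normal(0, 1, n)      # study variable

t_x = np.array([50_500.0, 19_800.0])      # assumed known population totals of the auxiliaries

# Minimize sum_i (w_i - d_i)^2 / d_i  subject to  sum_i w_i x_i = t_x.
# Lagrange solution: w = d * (1 + x' lambda),  lambda = M^{-1} (t_x - sum_i d_i x_i).
M = (d[:, None] * X).T @ X                # sum_i d_i x_i x_i'
lam = np.linalg.solve(M, t_x - X.T @ d)
w = d * (1.0 + X @ lam)

print("calibration check:", X.T @ w, "vs targets", t_x)
print("calibrated estimate of the y-total:", round((w * y).sum(), 1))
```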
4268 Quality of Life: Expectations and Achievements of Middle Class in Kazakhstan
Authors: Nazym Shedenova, Aigul Beimisheva
Abstract:
The improvement of quality of life is the main visible integrated indicator of state well-being. More and more states pay attention to defining and achieving social standards of quality of life as a social-economic strategy of development. These standards are determined by state features and the complex of needs and interests of the individual, family and society. It still remains an open question what the "middle class" is in contemporary Kazakhstan. The appearance of new social standards of quality of life is an important indicator of its successful establishment. The middle class, as an agent of social, political and economic reforms, helps to improve the quality of life in the country. But if we consider the lower and middle strata of the middle class, we can see that high social expectations and real achievements are still significantly different. The article relies on sociological data collected during research on households' standards of living in Almaty city and Almaty region, and a case study of the cottage city "Jana Kuat".
Keywords: the quality of life, the social standards of life, the middle class of Kazakhstan, the economic behavior of households.
4267 Central Asia and Kazakhstan: In Search of Civic Identity
Authors: Elnura Assyltayeva, Zhanar Aldubasheva, Zhengisbek Tolen, Ziyakul Assyltayeva, Aliya Alimzhanova
Abstract:
Mankind has entered into an extremely complex and controversial stage of its development: the world is simultaneously organized and chaoticized, globalized and localized, combined and split. Analysts point out that globalization, as a process of strengthening the economic, cultural, financial and other ties of states, causes many problems. In the economic sphere it creates the danger of a growing gap between states; in the sphere of politics it leads to the weakening of the political power and influence of nation-states.
Keywords: Civic identity, globalization, identity crisis, cultural identity
4266 Inverse Problem Methodology for the Measurement of the Electromagnetic Parameters Using MLP Neural Network
Authors: T. Hacib, M. R. Mekideche, N. Ferkha
Abstract:
This paper presents an approach based on the use of a supervised feed-forward neural network, namely the multilayer perceptron (MLP), and the finite element method (FEM) to solve the inverse problem of parameter identification. The approach is used to identify unknown parameters of ferromagnetic materials. The methodology used in this study consists of simulating a large number of parameters in a material under test using the finite element method (FEM). Both variations in relative magnetic permeability and electrical conductivity of the material under test are considered. Then, the obtained results are used to generate a set of vectors for the training of the MLP neural network. Finally, the obtained neural network is used to evaluate a group of new materials, simulated by the FEM but not belonging to the original dataset. Noisy data, added to the probe measurements, are used to enhance the robustness of the method. The reached results demonstrate the efficiency of the proposed approach and encourage future work on this subject.
Keywords: Inverse problem, MLP neural network, parameter identification, FEM.
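The sketch below illustrates the same inverse-identification workflow with a cheap synthetic forward model standing in for the FEM simulations: an MLP is trained on (noisy signal, parameter) pairs and then evaluated on held-out simulated materials. The forward model, parameter ranges and network size are assumptions, and scikit-learn's MLPRegressor replaces the authors' exact network.

```python
# A minimal sketch of the inverse-identification workflow: a toy forward model
# stands in for the FEM, and an MLP learns the map from (noisy) probe signals
# back to (permeability, conductivity). All numbers are hypothetical.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def forward_model(mu_r, sigma):
    """Toy stand-in for the FEM probe response (8 'measurement' values)."""
    freqs = np.linspace(1.0, 8.0, 8)
    return (np.log10(mu_r)[:, None] * np.sin(freqs)
            + np.log10(sigma)[:, None] * np.cos(freqs))

n = 2000
mu_r = rng.uniform(50, 5000, n)           # relative magnetic permeability
sigma = rng.uniform(1e5, 1e7, n)          # electrical conductivity (S/m)
signals = forward_model(mu_r, sigma)
signals += rng.normal(0.0, 0.01 * np.abs(signals))   # measurement noise for robustness
targets = np.column_stack([np.log10(mu_r), np.log10(sigma)])

X_train, X_test, y_train, y_test = train_test_split(signals, targets,
                                                    test_size=0.2, random_state=0)
net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0))
net.fit(X_train, y_train)
print("R^2 on held-out simulated materials:", round(net.score(X_test, y_test), 3))
```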
4265 Upper Bound of the Generalized p-Value for the Behrens-Fisher Problem with a Known Ratio of Variances
Authors: Rada Somkhuean, Suparat Niwitpong, Sa-aat Niwitpong
Abstract:
This paper presents the generalized p-values for testing the Behrens-Fisher problem when the ratio of variances is known. We also derive a closed form expression for the upper bound of the proposed generalized p-value.
Keywords: Generalized p-value, hypothesis testing, ratio of variances, upper bound.
4264 Posture Stabilization of Kinematic Model of Differential Drive Robots via Lyapunov-Based Control Design
Abstract:
In this paper, the problem of posture stabilization for a kinematic model of differential drive robots is studied. A more complex model of the kinematics of differential drive robots is used for the design of the stabilizing control. This model is formulated in terms of the physical parameters of the system, such as the radius of the wheels, and the velocities of the wheels are its control inputs. In this paper, the framework of Lyapunov-based control design has been used to solve the posture stabilization problem for the comprehensive model of differential drive robots. The results of the simulations show that the devised controller successfully solves the posture regulation problem. Finally, robustness and performance of the controller have been studied under system parameter uncertainty.
Keywords: Differential drive robots, nonlinear control, Lyapunov-based control design, posture regulation.
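For illustration, the listing below is a minimal sketch of a common textbook Lyapunov-based posture (parking) controller for the unicycle kinematics, with the commanded linear and angular velocities mapped to wheel speeds through an assumed wheel radius and axle length. The gains, geometry and initial posture are hypothetical, and this is not necessarily the exact control law designed in the paper.

```python
# A minimal sketch of a polar-coordinate posture stabilizer for a differential
# drive robot. Gains satisfy the usual conditions k_rho > 0, k_beta < 0,
# k_alpha > k_rho; wheel radius r and axle length L are assumed values.
import numpy as np

r, L = 0.05, 0.30                        # wheel radius [m], axle length [m] (assumed)
k_rho, k_alpha, k_beta = 0.8, 2.5, -0.6

x, y, theta = -2.0, -0.5, 0.0            # initial posture; the goal posture is (0, 0, 0)
dt = 0.01
for _ in range(6000):
    rho = np.hypot(x, y)                 # distance to the goal
    if rho < 1e-3:
        break
    alpha = np.arctan2(-y, -x) - theta   # bearing of the goal in the robot frame
    alpha = np.arctan2(np.sin(alpha), np.cos(alpha))   # wrap to (-pi, pi]
    beta = -theta - alpha
    v = k_rho * rho                      # commanded linear velocity
    w = k_alpha * alpha + k_beta * beta  # commanded angular velocity
    # Wheel angular speeds: the physical control inputs of the kinematic model.
    w_right = (2.0 * v + w * L) / (2.0 * r)
    w_left = (2.0 * v - w * L) / (2.0 * r)
    # Integrate the unicycle kinematics driven by those wheel speeds.
    v_act = r * (w_right + w_left) / 2.0
    w_act = r * (w_right - w_left) / L
    x += v_act * np.cos(theta) * dt
    y += v_act * np.sin(theta) * dt
    theta += w_act * dt

print(f"final posture: x={x:.3f} m, y={y:.3f} m, theta={theta:.3f} rad")
```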
4263 Decision Maturity Framework: Introducing Maturity In Heuristic Search
Authors: Ayed Salman, Fawaz Al-Anzi, Aseel Al-Minayes
Abstract:
Heuristics-based search methodologies normally work by searching a problem space of possible solutions toward finding a "satisfactory" solution, based on "hints" estimated from problem-specific knowledge. Research communities use different types of methodologies. Unfortunately, most of the time these hints are immature and can hinder these methodologies through premature convergence. This is due to a decrease of diversity in the search space that leads to a total implosion and ultimately fitness stagnation of the population. In this paper, a novel Decision Maturity Framework (DMF) is introduced as a solution to this problem. The framework simply improves the decision on the direction of the search by letting hints mature sufficiently before using them. Ideas from this framework are injected into the particle swarm optimization methodology. Results were obtained under both static and dynamic environments. The results show that decision maturity prevents premature convergence to a high degree.
Keywords: Heuristic Search, hints, Particle Swarm Optimization, Decision Maturity Framework.
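For reference, the following is a minimal sketch of the plain global-best PSO loop into which, per the abstract, the Decision Maturity Framework would inject matured hints; the objective (a sphere function) and the inertia/acceleration coefficients are hypothetical, and the DMF itself is not reproduced.

```python
# A minimal sketch of a standard global-best PSO loop (the baseline methodology
# the DMF is injected into). Objective and coefficients are hypothetical.
import numpy as np

def objective(x):
    return np.sum(x**2, axis=1)          # sphere function, minimum at the origin

rng = np.random.default_rng(3)
n_particles, dim, iters = 30, 5, 200
w, c1, c2 = 0.72, 1.49, 1.49             # inertia and acceleration coefficients

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = objective(pos)
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best value found:", float(pbest_val.min()))
```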
4262 Two Iterative Algorithms to Compute the Bisymmetric Solution of the Matrix Equation $A_1X_1B_1 + A_2X_2B_2 + \cdots + A_lX_lB_l = C$
Authors: A. Tajaddini
Abstract:
In this paper, two matrix iterative methods are presented to solve the matrix equation $A_1X_1B_1 + A_2X_2B_2 + \cdots + A_lX_lB_l = C$, the minimum residual problem $\min_{X_i \in BR^{n_i \times n_i}} \left\| \sum_{i=1}^{l} A_iX_iB_i - C \right\|_F$ and the matrix nearness problem $\min_{[X_1, X_2, \ldots, X_l] \in S_E} \left\| [X_1, X_2, \ldots, X_l] - [\bar{X}_1, \bar{X}_2, \ldots, \bar{X}_l] \right\|_F$, where $BR^{n_i \times n_i}$ is the set of bisymmetric matrices and $S_E$ is the solution set of the above matrix equation or minimum residual problem. These matrix iterative methods have a faster convergence rate and higher accuracy than former methods. Paige's algorithms are used as the frame method for deriving these matrix iterative methods. A numerical example is used to illustrate the efficiency of these new methods.
Keywords: Bisymmetric matrices, Paige’s algorithms, Least square.
4261 The Household-Based Socio-Economic Index for Every District in Peninsular Malaysia
Authors: Nuzlinda Abdul Rahman, Syerrina Zakaria
Abstract:
Deprivation indices are widely used in public health studies. These indices are also referred to as indices of inequality or disadvantage. Even though many indices have been built before, it is considered less appropriate to apply existing indices to other countries or areas with different socio-economic conditions and geographical characteristics. The objective of this study is to construct an index based on the geographical and socio-economic factors in Peninsular Malaysia, defined as the weighted household-based deprivation index. This study employed variables based on household items, household facilities, school attendance and education level obtained from the Malaysia 2000 census report. Factor analysis is used to extract latent variables from the indicators, reducing the observable variables into a smaller number of components or factors. Based on the factor analysis, two extracted factors were selected, known as the Basic Household Amenities factor and the Middle-Class Household Item factor. It is observed that districts with lower index values are located in the less developed states such as Kelantan, Terengganu and Kedah, while areas with high index values are located in developed states such as Pulau Pinang, W.P. Kuala Lumpur and Selangor.
Keywords: Factor Analysis, Basic Household Amenities, Middle-Class Household Item, Socio-economic Index
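The sketch below illustrates the general recipe: simulate a few census-style household indicators, extract two latent factors with scikit-learn's FactorAnalysis, and combine the factor scores into a weighted index. The indicators, sample size and weights are hypothetical, not the 2000 census data or the authors' weighting.

```python
# A minimal sketch of building a household-based composite index from factor
# scores. All indicators, districts and weights are hypothetical.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(11)
n_districts = 80
basic = rng.normal(0, 1, n_districts)    # latent "basic amenities" level
middle = rng.normal(0, 1, n_districts)   # latent "middle-class items" level

indicators = np.column_stack([
    60 + 15 * basic + rng.normal(0, 3, n_districts),    # % with piped water
    55 + 12 * basic + rng.normal(0, 3, n_districts),    # % with electricity
    30 + 10 * middle + rng.normal(0, 3, n_districts),   # % owning a car
    40 + 11 * middle + rng.normal(0, 3, n_districts),   # % owning a computer
])

Z = StandardScaler().fit_transform(indicators)
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(Z)             # district scores on the two factors
# Note: factor signs are arbitrary; in practice align them with the loadings.

weights = np.array([0.6, 0.4])           # assumed weights for the two factors
index = scores @ weights
print("five districts with the lowest index values:", np.argsort(index)[:5])
```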
4260 A TRIZ-based Approach to Generation of Service-supporting Product Concepts
Authors: Seungkyum Kim, Yongtae Park
Abstract:
Recently, the business environment and customer needs have been changing rapidly; hence it is very difficult to fulfill sophisticated customer needs by product or service innovation alone. In practice, to cope with this problem, various manufacturing companies have developed services to combine with their products. Along with this, many academic studies on PSS (Product Service System), the integrated system of products and services, have been conducted from the viewpoint of manufacturers. On the other hand, service providers are also attempting to develop service-supporting products to increase their service competitiveness and provide differentiated value. However, there is a lack of research based on the service-centric point of view. Accordingly, this paper proposes a concept generation method for service-supporting product development from the service-centric point of view. The method is designed to be executed in five consecutive steps: situation analysis, problem definition, problem resolution, solution evaluation, and concept generation. In the proposed approach, tools of TRIZ (Theory of Inventive Problem Solving) such as the ISQ (Innovative Situation Questionnaire) and the 40 inventive principles are employed in order to define problems of the current services and solve them by generating service-supporting product concepts. This research contributes to the development of service-supporting products and service-centric PSSs.
Keywords: TRIZ, PSS (Product Service System), service-supporting product, concept generation
4258 Rural Connectivity Technologies Cost Analysis
Authors: F. Simba, L. Trojer, N.H. Mvungi, B.M. Mwinyiwiwa, E.M. Mjema
Abstract:
Rural areas of Tanzania are still disadvantaged in terms of the diffusion of IP-based services; this is due to a lack of Information and Communication Technology (ICT) infrastructure, especially a lack of connectivity. One of the reasons for connectivity problems in rural areas of Tanzania is the high cost of establishing the infrastructure for IP-based services [1-2]. However, the cost of connectivity varies from one technology to another and, at the same time, the cost also differs from one operator (service provider) to another within the country. This paper presents the development of a software system to calculate the cost of connectivity to rural areas of Tanzania. The system is developed to give easy access to connectivity costs across different technologies and operators. The development of the calculator follows the V-model software development lifecycle. The calculator is used to evaluate the economic viability of different technologies considered as potential candidates to provide rural connectivity. In this paper, the evaluation is based on the techno-economic analysis approach.
Keywords: rural, connectivity, cost, V-model, techno-economic analysis.
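The sketch below shows the kind of techno-economic comparison such a calculator performs: total cost of ownership (CAPEX plus discounted OPEX over the study period) and cost per subscriber for a few candidate technologies. All figures are hypothetical placeholders, not Tanzanian operator data.

```python
# A minimal sketch of a techno-economic comparison of rural connectivity options.
# CAPEX, OPEX, subscriber counts and the discount rate are hypothetical.
technologies = {
    #            CAPEX,   OPEX/year, subscribers served
    "WiMAX":    (120_000, 18_000,    1500),
    "VSAT":     ( 40_000, 30_000,     400),
    "3G/UMTS":  (200_000, 25_000,    3000),
}
years, discount_rate = 5, 0.10

for name, (capex, opex, subs) in technologies.items():
    discounted_opex = sum(opex / (1 + discount_rate) ** t for t in range(1, years + 1))
    tco = capex + discounted_opex   # total cost of ownership over the study period
    print(f"{name:8s} TCO = {tco:10,.0f}   cost per subscriber = {tco / subs:8,.1f}")
```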
4257 Constraint Based Frequent Pattern Mining Technique for Solving GCS Problem
Authors: G. M. Karthik, Ramachandra V. Pujeri
Abstract:
The Generalized Center String (GCS) problem is a generalization of the Common Approximate Substring and Common Substring problems. GCS is known to be NP-hard, and the difficulty lies in the explosion of potential candidates: the longest center string has to be found without knowing in advance, for any particular biological gene process, which sequences contain the motifs. GCS can be solved by frequent pattern mining techniques and is known to be fixed parameter tractable with respect to the input sequence length and symbol set size. Efficient methods known as the Bpriori algorithms can solve GCS with reasonable time/space complexity. The Bpriori 2 and Bpriori 3-2 algorithms have been proposed for center strings of any length and for any positions of all their instances in the input sequences. In this paper, we reduce the time/space complexity of the Bpriori algorithm by a Constraint Based Frequent Pattern mining (CBFP) technique which integrates the ideas of constraint based mining and FP-tree mining. The CBFP mining technique solves the GCS problem not only for center strings of any length, but also for the positions of all their mutated copies in the input sequences. The CBFP mining technique constructs a TRIE-like FP-tree to represent the mutated copies of center strings of any length, along with constraints to restrain the growth of the consensus tree. The complexity analysis of the CBFP mining technique and the Bpriori algorithm is carried out for the worst case and the average case. The correctness of the algorithm is shown by comparison with the Bpriori algorithm on artificial data.
Keywords: Constraint Based Mining, FP tree, Data mining, GCS problem, CBFP mining technique.
4256 A New Heuristic for Improving the Performance of Genetic Algorithm
Authors: Warattapop Chainate, Peeraya Thapatsuwan, Pupong Pongcharoen
Abstract:
The hybridisation of the genetic algorithm with heuristics has been shown to be an effective way to improve its performance. In this work, the genetic algorithm hybridised with four heuristics, including a new heuristic called neighbourhood improvement, was investigated through the classical travelling salesman problem. The experimental results showed that the proposed heuristic outperformed the other heuristics both in terms of the quality of the results obtained and the computational time.
Keywords: Genetic Algorithm, Hybridisation, Metaheuristics, Travelling Salesman Problem.
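For context, the listing below is a minimal sketch of a classical 2-opt local improvement of a TSP tour, the kind of neighbourhood heuristic commonly hybridised with a genetic algorithm; the cities are random, and this is not the paper's specific "neighbourhood improvement" operator.

```python
# A minimal sketch of 2-opt local improvement for a TSP tour on random cities.
import numpy as np

rng = np.random.default_rng(5)
cities = rng.random((30, 2))
dist = np.linalg.norm(cities[:, None] - cities[None, :], axis=2)

def tour_length(tour):
    return sum(dist[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour):
    """Repeatedly reverse segments while that shortens the tour."""
    tour = list(tour)
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 2):
            for j in range(i + 1, len(tour) - 1):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[j + 1]
                if dist[a, c] + dist[b, d] < dist[a, b] + dist[c, d]:
                    tour[i:j + 1] = reversed(tour[i:j + 1])
                    improved = True
    return tour

initial = list(range(len(cities)))
improved = two_opt(initial)
print("length before:", round(tour_length(initial), 3))
print("length after 2-opt:", round(tour_length(improved), 3))
```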
4255 Investigation and Congestion Management to Solve the Over-Load Problem of Shiraz Substation in FREC
Authors: M. Nayeripour, E. Azad, A. Roosta, T. Niknam
Abstract:
In this paper, the transformer over-load problem of the Shiraz substation in Fars Regional Electric Company (FREC) is investigated over a three-year planning period. Suggestions for using a phase shifting transformer (PST) and a unified power flow controller (UPFC) to solve this problem are examined in detail, and finally some economical and practical designs are given to solve the related problems. Practical considerations and the use of basic and fundamental concepts of power flow in transmission lines to find an economical design are the main advantages of this research. The simulation results of the integrated overall system with different designs are compared on economical and practical aspects with respect to over-load relief and loss reduction.
Keywords: Congestion management, Phase shifting transformer (PST), Unified power flow controller (UPFC), Transmission lines.
4254 Spatial Data Science for Data Driven Urban Planning: The Youth Economic Discomfort Index for Rome
Authors: Iacopo Testi, Diego Pajarito, Nicoletta Roberto, Carmen Greco
Abstract:
Today, a considerable segment of the world's population lives in urban areas, and this proportion will vastly increase in the next decades. Therefore, understanding the key trends in urbanization likely to unfold over the coming years is crucial to the implementation of sustainable urban strategies. In parallel, the daily amount of digital data produced will be expanding at an exponential rate during the following years. The analysis of various types of data sets and its derived applications have incredible potential across different crucial sectors such as healthcare, housing, transportation, energy, and education. Nevertheless, in city development, architects and urban planners appear to rely mostly on traditional and analogue techniques of data collection. This paper investigates the potential of the data science field, which appears to be a formidable resource to assist city managers in identifying strategies to enhance the social, economic, and environmental sustainability of our urban areas. The collection of different new layers of information would definitely enhance planners' capabilities to comprehend more in-depth urban phenomena such as gentrification, land use definition, mobility, or critical infrastructural issues. Specifically, the research correlates economic, commercial, demographic, and housing data with the purpose of defining the youth economic discomfort index. The statistical composite index provides insights regarding the economic disadvantage of citizens aged between 18 and 29 years, and the results clearly show that central urban zones are more disadvantaged than peripheral ones. The experimental set up selected the city of Rome as the testing ground of the whole investigation. The methodology aims at applying statistical and spatial analysis to construct a composite index supporting informed data-driven decisions for urban planning.
Keywords: Data science, spatial analysis, composite index, Rome, urban planning, youth economic discomfort index.
4253 A Reproduction of Boundary Conditions in Three-Dimensional Continuous Casting Problem
Authors: Iwona Nowak, Jacek Smolka, Andrzej J. Nowak
Abstract:
The paper discusses a 3D numerical solution of the inverse boundary problem for the continuous casting process of an alloy. The main goal of the analysis presented within the paper was to estimate the heat fluxes along the external surface of the ingot. Verified information on these fluxes is crucial for a good design of the mould, an effective cooling system and, generally, the whole caster. In the study, an enthalpy-porosity technique implemented in the Fluent package was used for modeling the solidification process. In this method, the phase change interface was determined on the basis of the liquid fraction approach. In the inverse procedure, sensitivity analysis was applied for retrieving the boundary conditions. A comparison of the measured and retrieved values showed a high accuracy of the computations. Additionally, the influence of the accuracy of the measurements on the estimated heat fluxes was also investigated.
Keywords: Boundary inverse problem, sensitivity analysis, continuous casting, numerical simulation.
4252 Hybrid Neural Network Methods for Lithology Identification in the Algerian Sahara
Authors: S. Chikhi, M. Batouche, H. Shout
Abstract:
In this paper, we combine a probabilistic neural method with radial basis functions in order to construct the lithofacies of the wells DF01, DF02 and DF03 situated in the Triassic province of Algeria (Sahara). Lithofacies identification is a crucial problem in reservoir characterization. Our objective is to facilitate the experts' work in the geological domain and to allow them to obtain quickly the structure and the nature of the lands around the drilling. This study intends to design a tool that supports automatic deduction from numerical data. We used a probabilistic formalism to enhance the classification process initiated by a Self-Organizing Map procedure. Our system derives lithofacies, from well-log data, of the concerned reservoir wells in a form easy to read by a geology expert, who identifies the potential for oil production at a given source and so forms the basis for estimating the financial returns and economic benefits.
Keywords: Classification, Lithofacies, Probabilistic formalism, Reservoir characterization, Well-log data.
4251 Solving Single Machine Total Weighted Tardiness Problem Using Gaussian Process Regression
Authors: Wanatchapong Kongkaew
Abstract:
This paper proposes an application of a probabilistic technique, namely Gaussian process regression, for estimating an optimal sequence for the single machine total weighted tardiness (SMTWT) scheduling problem. In this work, the Gaussian process regression (GPR) model is utilized to predict an optimal sequence of the SMTWT problem, and its solution is improved by using an iterated local search based on a simulated annealing scheme, called the GPRISA algorithm. The results show that the proposed GPRISA method achieves a very good performance and a reasonable trade-off between solution quality and time consumption. Moreover, in terms of deviation from the best-known solution, the proposed mechanism noticeably outperforms recently existing approaches.
Keywords: Gaussian process regression, iterated local search, simulated annealing, single machine total weighted tardiness.
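The listing below is a minimal sketch of the improvement side of such a scheme: the total weighted tardiness of a job sequence is evaluated and refined with random swap moves under a simulated annealing schedule. The jobs are random, the cooling parameters are assumed, and the GPR surrogate used in the paper to seed the sequence is not reproduced.

```python
# A minimal sketch of simulated-annealing improvement for the single machine
# total weighted tardiness (SMTWT) problem. Jobs, temperatures and cooling
# rate are hypothetical; the GPR seeding step is omitted.
import math
import random

random.seed(9)
n = 20
proc = [random.randint(1, 10) for _ in range(n)]     # processing times
due = [random.randint(5, 60) for _ in range(n)]      # due dates
weight = [random.randint(1, 5) for _ in range(n)]    # tardiness weights

def twt(seq):
    """Total weighted tardiness of a job sequence."""
    t, total = 0, 0
    for j in seq:
        t += proc[j]
        total += weight[j] * max(0, t - due[j])
    return total

seq = sorted(range(n), key=lambda j: due[j])          # earliest-due-date seed
best_val = twt(seq)
temp = 50.0
for _ in range(20000):
    i, j = random.sample(range(n), 2)
    cand = seq[:]
    cand[i], cand[j] = cand[j], cand[i]               # swap move
    delta = twt(cand) - twt(seq)
    if delta <= 0 or random.random() < math.exp(-delta / temp):
        seq = cand
        best_val = min(best_val, twt(seq))
    temp *= 0.9997                                    # geometric cooling

print("EDD weighted tardiness:", twt(sorted(range(n), key=lambda j: due[j])))
print("best weighted tardiness after annealing:", best_val)
```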
4250 Energy Consumption and Economic Growth in South Asian Countries: A Co-integrated Panel Analysis
Authors: S. Noor, M. W. Siddiqi
Abstract:
This study examines the causal link between energy use and economic growth for five South Asian countries over the period 1971-2006. Panel cointegration, ECM and FMOLS are applied for short and long run estimates. In the short run, unidirectional causality from per capita GDP to per capita energy consumption is found, but not vice versa. In the long run, a one percent increase in per capita energy consumption tends to decrease per capita GDP by 0.13 percent, i.e. energy use discourages economic growth. These short and long run relationships indicate an energy shortage crisis in South Asia due to increased energy use coupled with insufficient energy supply. Besides this, the estimated long run coefficient of the error term suggests that short term adjustments are driven by adjustment back to the long run equilibrium. Moreover, per capita energy consumption is responsive to adjustment back to equilibrium, which takes approximately 59 years. This indicates a long run feedback between both variables.
Keywords: Energy consumption, Income, Panel co-integration, Causality.
4249 Model Updating-Based Approach for Damage Prognosis in Frames via Modal Residual Force
Authors: Gholamreza Ghodrati Amiri, Mojtaba Jafarian Abyaneh, Ali Zare Hosseinzadeh
Abstract:
This paper presents an effective model updating strategy for damage localization and quantification in frames by defining the damage detection problem as an optimization issue. A generalized version of the Modal Residual Force (MRF) is employed for presenting a new damage-sensitive cost function. Then, the Grey Wolf Optimization (GWO) algorithm is utilized for solving the suggested inverse problem, and the global extrema are reported as damage detection results. The applicability of the presented method is investigated by studying different damage patterns on the benchmark problem of the IASC-ASCE, as well as a planar shear frame structure. The obtained results emphasize good performance of the method not only in noise-free cases, but also when the input data are contaminated with different levels of noise.
Keywords: Frame, grey wolf optimization algorithm, modal residual force, structural damage detection.
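For illustration, the following is a minimal sketch of the Grey Wolf Optimization loop (alpha/beta/delta guidance with a linearly decreasing coefficient). A generic sphere function stands in for the modal-residual-force cost function, which would require the frame's modal data; the swarm size, bounds and iteration count are assumptions.

```python
# A minimal sketch of the Grey Wolf Optimization (GWO) loop. The sphere function
# is a stand-in for the damage-sensitive MRF cost function of the paper.
import numpy as np

def cost(x):
    return np.sum(x**2, axis=-1)          # stand-in objective, minimum at zero damage

rng = np.random.default_rng(2)
n_wolves, dim, iters = 20, 8, 300         # dim ~ number of candidate damage ratios
lb, ub = 0.0, 1.0                         # damage severity bounded in [0, 1]
wolves = rng.uniform(lb, ub, (n_wolves, dim))

for t in range(iters):
    fitness = cost(wolves)
    order = np.argsort(fitness)
    alpha, beta, delta = wolves[order[0]], wolves[order[1]], wolves[order[2]]
    a = 2.0 - 2.0 * t / iters             # decreases linearly from 2 to 0
    X_new = np.zeros_like(wolves)
    for leader in (alpha, beta, delta):   # move toward the three best wolves
        r1, r2 = rng.random((2, n_wolves, dim))
        A = 2 * a * r1 - a
        C = 2 * r2
        D = np.abs(C * leader - wolves)
        X_new += leader - A * D
    wolves = np.clip(X_new / 3.0, lb, ub)

best = wolves[np.argmin(cost(wolves))]
print("best cost found:", float(cost(best)))
```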
4248 Strategy of Zakat Utilization for Productive Economic and Social Activity: A Case Study at Lembaga Amil Zakat, Infaq and Shodaqoh Yayasan Badan Wakaf Universitas Islam Indonesia
Authors: Krisnanda, Naili Qiyadatul Ulya
Abstract:
Utilization of zakat for productive economic and social activities can be considered an appropriate way to optimize the efficiency and major benefits of these community funds. As we know, there are not a few Muslims who desperately need help from zakat funds to improve their livelihoods and standard of living. In this case, optimizing the utilization of zakat funds can help the community, especially Muslims, to improve and prosper in their lives. Optimizing zakat funds for this purpose can not only motivate people to help others for the welfare and empowerment of the people but can also foster social solidarity between religious communities. The establishment of such social solidarity will reduce the impact of poverty and even eradicate it. This study was conducted to determine the strategy of zakat utilization through the Zakat Galang Berdikari program of the Zakat, Infaq, and Shodaqoh Institute of the Waqf Board Foundation of Universitas Islam Indonesia (LAZIS YBW UII), the indicators of LAZIS YBW UII's success in empowering zakat, and how zakat is managed at LAZIS YBW UII through the Zakat Galang Berdikari program, in order to determine the extent of zakat utilization in productive economic activities and to help less able people start independent businesses in Yogyakarta. This study used a qualitative approach and an empirical type of research. It used primary and secondary data, interviewing stakeholders according to set criteria and carrying out field observations and documentation, which were then analyzed carefully and presented in descriptive form. The result of this research is that the utilization of zakat funds in the Zakat Galang Berdikari program by LAZIS YBW UII is the right strategy to optimize zakat for productive economic and social activities in Yogyakarta.
Keywords: Zakat utilization, zakat funds, productive economic, LAZIS.
4247 A Meta-Heuristic Algorithm for Vertex Covering Problem Based on Gravity
Authors: S. Raja Balachandar, K. Kannan
Abstract:
A new meta-heuristic approach called "Randomized Gravitational Emulation Search algorithm (RGES)" for solving vertex covering problems has been designed. This algorithm is founded upon introducing a randomization concept along with two of the four primary parameters, 'velocity' and 'gravity', in physics. A new heuristic operator is introduced in the domain of RGES to maintain feasibility specifically for the vertex covering problem to yield best solutions. The performance of this algorithm has been evaluated on a large set of benchmark problems from the OR-Library. Computational results showed that the randomized gravitational emulation search algorithm based heuristic is capable of producing high quality solutions. The performance of this heuristic when compared with other existing heuristic algorithms is found to be excellent in terms of solution quality.
Keywords: Vertex covering Problem, Velocity, Gravitational Force, Newton's Law, Meta Heuristic, Combinatorial optimization.
4246 Proffering a Brand New Methodology to Resource Discovery in Grid based on Economic Criteria Using Learning Automata
Authors: Ali Sarhadi, Mohammad Reza Meybodi, Ali Yousefi
Abstract:
Resource discovery is one of the chief services of a grid. A new approach to discovering resources in a grid through learning automata is propounded in this article. The objective of the aforementioned resource-discovery service is to select a resource based upon the user's application and economic criteria, that is to say, opting for a resource which can accomplish the user's tasks in the most economic manner. This novel service is presented in two phases. We proffer an application-based categorization by means of an intelligent neural network: the user sets his or her application as the input vector of the network, and the output vector describes the appropriateness of each resource for the presented application. The most economical option among those put forward in the previous stage which can cope with the task in question is then picked out. The resource choice is carried out by means of the presented algorithm based upon learning automata.
Keywords: Resource discovery, learning automata, neural network, economic policy
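The listing below is a minimal sketch of the learning-automaton side of such a scheme: a linear reward-inaction (L_RI) automaton gradually shifts its selection probabilities toward the resource that most often completes tasks economically. The resource costs, reward threshold and learning rate are hypothetical, and the neural-network categorization stage described in the abstract is omitted.

```python
# A minimal sketch of a linear reward-inaction (L_RI) learning automaton choosing
# among grid resources. Costs, reward rule and learning rate are hypothetical.
import numpy as np

rng = np.random.default_rng(4)
n_resources = 4
true_cost = np.array([5.0, 3.0, 8.0, 4.0])     # unknown mean cost of each resource
p = np.full(n_resources, 1.0 / n_resources)    # action (selection) probabilities
a = 0.05                                       # reward (learning) rate

for _ in range(3000):
    choice = rng.choice(n_resources, p=p)
    observed = true_cost[choice] + rng.normal(0, 1.0)
    reward = observed < 4.5                    # "economic enough" threshold (assumed)
    if reward:                                 # L_RI: update only on reward
        p = (1 - a) * p
        p[choice] += a                         # p_choice <- p_choice + a * (1 - p_choice)
    p /= p.sum()                               # guard against rounding drift

print("selection probabilities:", p.round(3))
print("preferred resource:", int(p.argmax()), "(resource 1 has the lowest mean cost)")
```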
4245 Optimal Allocation Between Subprime Structured Mortgage Products and Treasuries
Authors: MP. Mulaudzi, MA. Petersen, J. Mukuddem-Petersen, IM. Schoeman, B. de Waal, JM. Manale
Abstract:
This conference paper discusses a risk allocation problem for subprime investing banks involving investment in subprime structured mortgage products (SMPs) and Treasuries. In order to solve this problem, we develop a Lévy process-based model of jump diffusion type for investment choice in subprime SMPs and Treasuries. This model incorporates subprime SMP losses for which credit default insurance in the form of credit default swaps (CDSs) can be purchased. In essence, we solve a mean swap-at-risk (SaR) optimization problem for investment which determines the optimal allocation between SMPs and Treasuries subject to credit risk protection via CDSs. In this regard, SaR is indicative of how much protection investors must purchase from swap protection sellers in order to cover possible losses from SMP default. Here, SaR is defined in terms of value-at-risk (VaR). Finally, we provide an analysis of the aforementioned optimization problem and its connections with the subprime mortgage crisis (SMC).
Keywords: Investors, Jump Diffusion Process, Structured Mortgage Products, Treasuries, Credit Risk, Credit Default Swaps, Tranching Risk, Counterparty Risk, Value-at-Risk, Swaps-at-Risk, Subprime Mortgage Crisis.