Search results for: problem structuring methods

20070 Selection of Optimal Reduced Feature Sets of Brain Signal Analysis Using Heuristically Optimized Deep Autoencoder

Authors: Souvik Phadikar, Nidul Sinha, Rajdeep Ghosh

Abstract:

In brainwave research using electroencephalogram (EEG) signals, finding the most relevant and effective feature set for identifying activities in the human brain remains a major challenge because of the random nature of the signals. The feature extraction method is a key issue in solving this problem. Finding features that give distinctive pictures for different activities and similar pictures for the same activity is very difficult, especially as the number of activities grows. Classification accuracy depends on the quality of this feature set. Furthermore, too many features lead to high computational complexity, while too few compromise performance. In this paper, a novel idea for selecting an optimal feature set using a heuristically optimized deep autoencoder is presented. Using various feature extraction methods, a vast number of features are extracted from the EEG signals and fed to the autoencoder deep neural network. The autoencoder encodes the input features into a small set of codes. To avoid the vanishing-gradient problem and the need for dataset normalization, a meta-heuristic search algorithm is used to minimize the mean square error (MSE) between the encoder input and the decoder output. To reduce the feature set into a smaller one, four hidden layers are used in the autoencoder network; hence it is called the Heuristically Optimized Deep Autoencoder (HO-DAE). In this method, no features are rejected; all the features are combined into the responses of the hidden layers. The results reveal that higher accuracy can be achieved using the optimal reduced features. The proposed HO-DAE is also compared with a regular autoencoder to test the performance of both. The performance of the proposed method is validated and compared with two other methods recently reported in the literature, which reveals that the proposed method is far better than both in terms of classification accuracy.
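
To make the idea concrete, the following is only a minimal sketch of a metaheuristically trained autoencoder: a single hidden layer (instead of the paper's four) whose weights are tuned by a simple (1+1) evolution strategy minimizing the reconstruction MSE, so no gradients or dataset normalization are required. All array sizes, the feature matrix, and the search settings are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): an autoencoder whose weights
# are tuned by a simple (1+1) evolution-strategy metaheuristic that minimizes the
# reconstruction MSE, instead of gradient-based training.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))          # stand-in for extracted EEG features
n_in, n_code = X.shape[1], 8            # compress 32 features into an 8-dim code

def unpack(theta):
    W1 = theta[:n_in * n_code].reshape(n_in, n_code)
    W2 = theta[n_in * n_code:].reshape(n_code, n_in)
    return W1, W2

def mse(theta):
    W1, W2 = unpack(theta)
    code = np.tanh(X @ W1)               # encoder
    recon = code @ W2                    # decoder
    return np.mean((X - recon) ** 2)

theta = rng.normal(scale=0.1, size=2 * n_in * n_code)
best = mse(theta)
for _ in range(2000):                    # metaheuristic search over the weights
    cand = theta + rng.normal(scale=0.05, size=theta.shape)
    f = mse(cand)
    if f < best:
        theta, best = cand, f

W1, _ = unpack(theta)
reduced_features = np.tanh(X @ W1)       # reduced feature set fed to a classifier
print(best, reduced_features.shape)
```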

Keywords: autoencoder, brainwave signal analysis, electroencephalogram, feature extraction, feature selection, optimization

Procedia PDF Downloads 96
20069 Route Planning for a Sharing System (Scooter Sharing-Public Transportation) with a Hybrid PSO-GA Optimization Approach

Authors: Mohammad Ali Farrokhpour

Abstract:

In the current decade, and within sustainable transportation systems, scooter sharing has attracted widespread attention as an environmentally friendly mode that can help develop public transportation. The combination of scooters and the subway in sustainable transportation systems can provide many opportunities for developing access to public transportation. Among the challenges that have arisen and sparked discussion about implementing a scooter-subway system to replace personal vehicles is the issue of routing in this system, which has been chosen as the main subject of the present paper. Thus, the present paper provides an account of routing in this system. Because routing involves multiple factors such as time, costs, traffic, green spaces, etc., the problem is a multi-objective NP-hard optimization problem. For this purpose, a hybrid PSO-GA optimization approach is put forward in the present paper so that the resulting answers are of higher accuracy and validity than those of standard optimization methods. The results obtained from modeling and solving the problem for the case study in MATLAB indicate the efficiency and desirability of the model and the proposed approach for solving it.
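
The abstract does not describe how PSO and GA are coupled, so the toy sketch below only assumes one common variant: a standard PSO swarm in which the worst particles are periodically replaced by GA-style crossover and mutation of personal bests. The cost function and every parameter are placeholders, not the paper's multi-objective route model.

```python
# Illustrative PSO-GA hybrid sketch: GA crossover/mutation applied to the worst
# PSO particles each iteration. Cost function and parameters are placeholders.
import numpy as np

rng = np.random.default_rng(1)
dim, n_particles = 10, 30

def route_cost(x):                        # placeholder multi-factor cost (time, cost, ...)
    return np.sum(x ** 2) + 0.5 * np.sum(np.abs(x))

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([route_cost(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(200):
    # PSO velocity/position update
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([route_cost(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()
    # GA step: recombine personal bests to replace the worst particles
    worst = np.argsort(f)[-5:]
    for i in worst:
        a, b = pbest[rng.integers(n_particles)], pbest[rng.integers(n_particles)]
        mask = rng.random(dim) < 0.5                            # uniform crossover
        pos[i] = np.where(mask, a, b) + rng.normal(0, 0.1, dim) # mutation

print(route_cost(gbest))
```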

Keywords: route planning, scooter sharing, public transportation, sharing system

Procedia PDF Downloads 61
20068 Second Order Optimality Conditions in Nonsmooth Analysis on Riemannian Manifolds

Authors: Seyedehsomayeh Hosseini

Abstract:

Much attention has been paid over centuries to understanding and solving the problem of minimization of functions. Compared to linear programming and nonlinear unconstrained optimization problems, nonlinear constrained optimization problems are much more difficult. Since the procedure of finding an optimizer is a search based on the local information of the constraints and the objective function, it is very important to develop techniques using geometric properties of the constraints and the objective function. In fact, differential geometry provides a powerful tool to characterize and analyze these geometric properties. Thus, there is clearly a link between the techniques of optimization on manifolds and standard constrained optimization approaches. Furthermore, there are manifolds that are not defined as constrained sets in R^n; an important example is the Grassmann manifold. Hence, to solve optimization problems on these spaces, intrinsic methods are used. In a nondifferentiable problem, the gradient information of the objective function generally cannot be used to determine the direction in which the function is decreasing. Therefore, techniques of nonsmooth analysis are needed to deal with such a problem. As a manifold, in general, does not have a linear structure, the usual techniques, which are often used in nonsmooth analysis on linear spaces, cannot be applied, and new techniques need to be developed. This paper presents necessary and sufficient conditions for a strict local minimum of extended real-valued, nonsmooth functions defined on Riemannian manifolds.
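
For orientation only, the classical smooth second-order sufficient condition in R^n that such results generalize can be written as follows; this is the standard Euclidean statement, not the paper's nonsmooth Riemannian theorem.

```latex
% Classical smooth second-order sufficient condition in R^n (Euclidean background):
\nabla f(x^*) = 0 \quad \text{and} \quad \nabla^2 f(x^*) \succ 0
\;\;\Longrightarrow\;\; x^* \text{ is a strict local minimizer of } f .
```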

Keywords: Riemannian manifolds, nonsmooth optimization, lower semicontinuous functions, subdifferential

Procedia PDF Downloads 347
20067 Using Thinking Blocks to Encourage the Use of Higher Order Thinking Skills among Students When Solving Problems on Fractions

Authors: Abdul Halim Abdullah, Nur Liyana Zainal Abidin, Mahani Mokhtar

Abstract:

Problem-solving is an activity which can encourage students to use Higher Order Thinking Skills (HOTS). Learning fractions can be challenging for students, since empirical evidence shows that students experience difficulties in solving fraction problems. However, visual methods can help students overcome these difficulties, since they help students make meaningful visual representations and link abstract concepts in mathematics. Therefore, the purpose of this study was to investigate whether there were any changes in students’ HOTS at the four highest levels when learning fractions by using Thinking Blocks. 54 students participated in a quasi-experiment using pre-tests and post-tests. Students were divided into two groups. The experimental group (n=32) received a treatment to improve the students’ HOTS, and the other group acted as the control group (n=22), which used a traditional method. Data were analysed using the Mann-Whitney test. The results indicated that in the post-test, students who used Thinking Blocks showed significant improvement in their HOTS level (p=0.000). In addition, the post-test results also showed that the students’ performance improved significantly at the four highest levels of HOTS, namely applying (p=0.001), analysing (p=0.000), evaluating (p=0.000), and creating (p=0.000). Therefore, it can be concluded that Thinking Blocks can effectively encourage students to use the four highest levels of HOTS, which consequently enables them to solve fraction problems successfully.
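
The hypothesis test named in the abstract can be reproduced with standard tools; the sketch below assumes made-up post-test scores for the two groups purely for illustration, not the study's data.

```python
# Sketch of the statistical test used in the study (illustrative scores only):
# a Mann-Whitney U test comparing post-test HOTS scores of the Thinking Blocks
# group against the control group.
from scipy.stats import mannwhitneyu

thinking_blocks = [78, 85, 90, 72, 88, 95, 81, 79, 92, 84]   # hypothetical post-test scores
control         = [65, 70, 58, 72, 61, 66, 69, 63, 75, 60]

stat, p_value = mannwhitneyu(thinking_blocks, control, alternative='two-sided')
print(f"U = {stat}, p = {p_value:.4f}")   # p < 0.05 would indicate a significant difference
```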

Keywords: Thinking Blocks, Higher Order Thinking Skills (HOTS), fractions, problem solving

Procedia PDF Downloads 253
20066 A Study on Computational Fluid Dynamics (CFD)-Based Design Optimization Techniques Using Multi-Objective Evolutionary Algorithms (MOEA)

Authors: Ahmed E. Hodaib, Mohamed A. Hashem

Abstract:

In engineering applications, a design has to be as close to optimal as possible for a defined case. The designer has to overcome many challenges in order to reach the optimal solution to a specific problem. This process is called optimization. Generally, there is always a function called the “objective function” that is to be maximized or minimized by choosing input parameters called “degrees of freedom” within an allowed domain called the “search space” and computing the values of the objective function for these input values. The task becomes more complex when there is more than one objective for the design. An example of a Multi-Objective Optimization Problem (MOP) is a structural design that aims to minimize weight and maximize strength. In such a case, the Pareto Optimal Frontier (POF) is used, which is a curve plotting the two objective functions for the best cases. At this point, the designer must make a decision and choose a point on the curve. Engineers use algorithms or iterative methods for optimization. In this paper, we discuss Evolutionary Algorithms (EA), which are widely used for multi-objective optimization problems due to their robustness, simplicity, and suitability for coupling and parallelization. Evolutionary algorithms are designed to converge to an optimal solution. An EA uses mechanisms inspired by Darwinian evolution principles. Technically, they belong to the family of trial-and-error problem solvers and can be considered global optimization methods with a stochastic character. The optimization is initialized by picking random solutions from the search space, and then the solution progresses towards the optimal point by using operators such as selection, combination, crossover, and/or mutation. These operators are applied to the old solutions, the “parents”, so that new sets of design variables called “children” appear. The process is repeated until the optimal solution to the problem is reached. Reliable and robust computational fluid dynamics solvers are nowadays commonly utilized in the design and analysis of various engineering systems, such as aircraft, turbomachinery, and automotive systems. Coupling of Computational Fluid Dynamics (CFD) and Multi-Objective Evolutionary Algorithms (MOEA) has become substantial in aerospace engineering applications, such as aerodynamic shape optimization and advanced turbomachinery design.
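
As a small illustration of the Pareto Optimal Frontier mentioned above (independent of any CFD coupling), the sketch below filters a set of candidate designs, evaluated on two objectives, down to the non-dominated ones; the design points are invented numbers.

```python
# Minimal sketch of the Pareto Optimal Frontier idea: given candidate designs
# evaluated on two objectives (both to be minimized here, e.g. weight and
# negative strength), keep only the non-dominated ones.
def pareto_front(points):
    """Return the non-dominated points (minimization of both objectives)."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return front

designs = [(2.0, -300.0), (3.5, -450.0), (2.5, -280.0), (5.0, -500.0), (4.0, -320.0)]
print(pareto_front(designs))   # the designer then picks one point on this frontier
```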

Keywords: mathematical optimization, multi-objective evolutionary algorithms "MOEA", computational fluid dynamics "CFD", aerodynamic shape optimization

Procedia PDF Downloads 237
20065 Six-Phase Tooth-Coil Winding Starter-Generator Embedded in Aerospace Engine

Authors: Flur R. Ismagilov, Vyacheslav E. Vavilov, Denis V. Gusakov

Abstract:

This paper is devoted to solving the problem of increasing the electrification of aircraft engines by installing a synchronous generator on the high-pressure shaft. Technical solutions to this problem proposed by various research centers are discussed. A design solution to the problem is proposed. To evaluate the effectiveness of the proposed cooling system, thermal analysis was carried out in ANSYS software.

Keywords: starter-generator, more electrical engine, aircraft engines, high pressure shaft, synchronous generator

Procedia PDF Downloads 233
20064 The Application of Pareto Local Search to the Single-Objective Quadratic Assignment Problem

Authors: Abdullah Alsheddy

Abstract:

This paper presents the employment of Pareto optimality as a strategy to help (single-objective) local search escape local optima. Instead of plain local search, Pareto local search is applied to solve the quadratic assignment problem, which is multi-objectivized by adding a helper objective. The additional objective is defined as a function of the primary one with augmented penalties that are dynamically updated.
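
A rough sketch of the general idea, under stated assumptions: a 2-opt style local search on a random QAP instance in which a neighbour is accepted whenever its (primary, helper) objective pair is not dominated by the current one. The helper objective here is only a placeholder penalty term; the paper defines it differently, with dynamically updated augmented penalties.

```python
# Sketch, not the paper's algorithm: swap-based local search on the QAP where a
# move is kept when the (primary, helper) pair is not dominated by the current one.
import random

random.seed(0)
n = 8
F = [[random.randint(0, 9) for _ in range(n)] for _ in range(n)]   # flows
D = [[random.randint(1, 9) for _ in range(n)] for _ in range(n)]   # distances

def qap_cost(perm):                       # primary objective
    return sum(F[i][j] * D[perm[i]][perm[j]] for i in range(n) for j in range(n))

def helper(perm, penalties):              # placeholder helper objective
    return qap_cost(perm) + sum(penalties[i][perm[i]] for i in range(n))

penalties = [[random.random() for _ in range(n)] for _ in range(n)]
perm = list(range(n))
current = (qap_cost(perm), helper(perm, penalties))

for _ in range(500):
    i, j = random.sample(range(n), 2)
    cand = perm[:]
    cand[i], cand[j] = cand[j], cand[i]   # swap two facilities
    obj = (qap_cost(cand), helper(cand, penalties))
    dominated = current[0] <= obj[0] and current[1] <= obj[1]
    if not dominated:                     # accept non-dominated neighbours
        perm, current = cand, obj

print(current[0])
```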

Keywords: Pareto optimization, multi-objectivization, quadratic assignment problem, local search

Procedia PDF Downloads 446
20063 An Algorithm for the Map Labeling Problem with Two Kinds of Priorities

Authors: Noboru Abe, Yoshinori Amai, Toshinori Nakatake, Sumio Masuda, Kazuaki Yamaguchi

Abstract:

We consider the problem of placing labels for points on a plane. For each point, its position, the size of its label, and a priority are given. Moreover, several candidate label positions are prespecified, and each of these label positions is assigned a priority. The objective of our problem is to maximize the total sum of the priorities of the placed labels and their points. By refining a labeling algorithm that can use these priorities, we propose a new heuristic algorithm which is more suitable for handling the assigned priorities.
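
A minimal sketch of a priority-driven greedy placement, assuming axis-aligned rectangular labels and invented priorities; the paper's refined heuristic is not reproduced here.

```python
# Hedged sketch of priority-driven greedy label placement: candidate rectangles
# are tried in decreasing combined priority and placed whenever they do not
# overlap an already placed label.
def overlaps(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

# Each candidate: (point_priority + position_priority, label rectangle (x, y, w, h))
candidates = [
    (9, (0, 0, 4, 2)),
    (7, (3, 1, 4, 2)),   # overlaps the first candidate
    (6, (5, 5, 3, 2)),
    (4, (0, 3, 2, 2)),
]

placed = []
for priority, rect in sorted(candidates, key=lambda c: -c[0]):
    if all(not overlaps(rect, r) for r in placed):
        placed.append(rect)

print(placed)   # the total placed priority is what the objective tries to maximize
```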

Keywords: map labeling, greedy algorithm, heuristic algorithm, priority

Procedia PDF Downloads 412
20062 Applying Neural Networks for Solving Record Linkage Problem via Fuzzy Description Logics

Authors: Mikheil Kalmakhelidze

Abstract:

The record linkage (RL) problem has become more and more important in recent years due to the growing interest in big data analysis. The problem can be formulated in a very simple way: given two entries a and b of a database, decide whether they represent the same object or not. There are two classical ways of solving the RL problem, deterministic and probabilistic. Using a simple Bayes classifier produces useful results in many cases, but sometimes the results turn out to be poor. In recent years, several successful approaches have been made towards solving specific RL problems with neural network algorithms, including the single-layer perceptron, multilayer backpropagation networks, etc. In our work, we model the RL problem for a specific dataset of student applications in fuzzy description logic (FDL), where the linkage of a specific pair (a,b) depends on the truth value of the corresponding formula A(a,b) in a canonical FDL model. As the main result, we build a neural network for deciding the truth value of FDL formulas in a canonical model and thus link the RL problem to machine learning. We apply the approach to a dataset with 10000 entries and also compare it to classical RL solving approaches. The results are more accurate than those of the standard probabilistic approach.

Keywords: description logic, fuzzy logic, neural networks, record linkage

Procedia PDF Downloads 252
20061 Rheological Study of Natural Sediments: Application in Filling of Estuaries

Authors: S. Serhal, Y. Melinge, D. Rangeard, F. Hage Chehadeh

Abstract:

The filling of estuaries is an international problem that can cause economic and environmental damage. This work aims to study the rheological structuring mechanisms of natural sedimentary liquid-solid mixtures in estuaries in order to better understand their filling. The estuary of the Rance river, located in Brittany, France, is particularly targeted by the study. The aim is to provide answers on the rheological behavior of natural sediments by detecting the structural factors influencing the rheological parameters, so that we can better understand the filling of estuarine areas and, especially, consider sustainable solutions for the ‘cleansing’ of these areas. The sediments were collected from the trap of Lyvet in the Rance estuary. This trap was created by the association COEUR (Comité Opérationnel des Elus et Usagers de la Rance) in 1996 in order to facilitate the cleansing of the estuary: it creates a privileged area for the deposition of sediments and consequently makes the cleansing of the estuary easier. We began our work with a preliminary study to establish the trend of the rheological behavior of the suspensions and to specify the dormant phase which precedes the onset of their biochemical reactivity. We then highlight the visco-plastic character at early age using a Kinexus rheometer with plate-plate geometry. This rheological behavior of the suspensions is represented by the Bingham model, using a dynamic yield stress and a viscosity which can be functions of the volume fraction, granular extent, and chemical reactivity. The evolution of the viscosity as a function of the solid volume fraction is modeled by the Krieger-Dougherty model. On the other hand, the analysis of the dynamic yield stress showed a fairly clear functional link with the solid volume fraction.
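
The two constitutive models named in the abstract have standard textbook forms, shown below as plain functions; the parameter values in the example calls are arbitrary, not the measured Rance-sediment values.

```python
# The two constitutive models named in the abstract, in their standard forms.
def bingham_stress(shear_rate, yield_stress, plastic_viscosity):
    """Bingham model: tau = tau_0 + mu_p * gamma_dot (valid once flow has started)."""
    return yield_stress + plastic_viscosity * shear_rate

def krieger_dougherty(phi, eta_suspending, phi_max=0.6, intrinsic_viscosity=2.5):
    """Krieger-Dougherty: eta = eta_s * (1 - phi/phi_max) ** (-[eta] * phi_max)."""
    return eta_suspending * (1.0 - phi / phi_max) ** (-intrinsic_viscosity * phi_max)

print(bingham_stress(shear_rate=10.0, yield_stress=5.0, plastic_viscosity=0.8))
print(krieger_dougherty(phi=0.3, eta_suspending=0.001))
```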

Keywords: estuaries, rheological behavior, sediments, Kinexus rheometer, Bingham model, viscosity, yield stress

Procedia PDF Downloads 137
20060 Main Tendencies of Youth Unemployment and the Regulation Mechanisms for Decreasing Its Rate in Georgia

Authors: Nino Paresashvili, Nino Abesadze

Abstract:

The modern world faces huge challenges. Globalization has changed the socio-economic conditions of many countries. The current processes in the global environment have a different impact on countries with different cultures. However, alleviation of poverty and improvement of living conditions are still the basic challenges for the majority of countries, because much of the population still lives below the official poverty threshold. It is very important to stimulate youth employment. In order to prepare young people for the labour market, it is essential to provide them with the appropriate professional skills and knowledge. It is necessary to plan efficient activities for decreasing the unemployment rate and to develop effective mechanisms for regulating the labour market. Such planning requires thorough study and analysis of the existing reality, as well as the development of corresponding mechanisms. Statistical analysis of unemployment is one of the main foundations for regulating the key mechanisms of the labour market. The corresponding statistical methods should be used in the study process: observation, data gathering, grouping, and calculation of generalized indicators. Unemployment is one of the most severe socioeconomic problems in Georgia. According to past as well as current statistics, unemployment rates have always been the most problematic issue for policy makers to resolve. Analytical work on the above-mentioned problem will be the basis for the next sustainable steps towards solving it. The results of the study showed that young people's choices are often driven neither by their inclinations and interests nor by labour market demand. That is why the wrong professional orientation of young people in most cases leads to their unemployment. At the same time, it was shown that there are a number of professions in the labour market with high demand because of a deficit of the appropriate specialists. To achieve healthy competitiveness in youth employment, it is necessary to formulate regional employment programs that take into account regional infrastructure specifics.

Keywords: unemployment, analysis, methods, tendencies, regulation mechanisms

Procedia PDF Downloads 360
20059 Loudspeaker Parameters Inverse Problem for Improving Sound Frequency Response Simulation

Authors: Y. T. Tsai, Jin H. Huang

Abstract:

The sound pressure level (SPL) of a moving-coil loudspeaker (MCL) is often simulated and analyzed using the lumped parameter model. However, the SPL of an MCL cannot be simulated precisely in the high-frequency region, because the effective cone area changes due to the geometry variation of the different mode shapes, which in turn affects the acoustic radiation mass and resistance. This paper presents an inverse method which is well suited to measuring the effective cone area at various frequency points and can estimate the MCL electroacoustic parameters simultaneously. The proposed inverse method comprises the direct problem, the adjoint problem, and the sensitivity problem, combined with the nonlinear conjugate gradient method. Estimated values from the inverse method are validated experimentally by comparison with the measured SPL curve. The results presented in this paper not only improve the accuracy of the lumped parameter model but also provide valuable information for loudspeaker cone design.
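
As a simplified illustration of the inverse-problem idea (not the paper's direct/adjoint/sensitivity formulation), the sketch below estimates two unknown lumped parameters of a toy forward model by minimizing the squared mismatch to synthetic "measured" data with SciPy's nonlinear conjugate gradient method.

```python
# Simplified inverse-problem sketch: fit unknown lumped parameters by minimizing
# the squared mismatch between a toy forward model and synthetic measurements,
# using SciPy's nonlinear conjugate gradient method.
import numpy as np
from scipy.optimize import minimize

freqs = np.linspace(100.0, 10000.0, 50)

def forward_model(params, f):
    # toy stand-in for the SPL model: an effective-area scale and a damping term
    area_scale, damping = params
    return area_scale * f / (f + damping)

true_params = np.array([2.0, 1500.0])
measured = forward_model(true_params, freqs) + np.random.default_rng(0).normal(0, 0.01, freqs.size)

def objective(params):
    return np.sum((forward_model(params, freqs) - measured) ** 2)

result = minimize(objective, x0=np.array([1.0, 500.0]), method='CG')
print(result.x)   # estimated parameters, close to the "true" ones
```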

Keywords: inverse problem, cone effective area, loudspeaker, nonlinear conjugate gradient method

Procedia PDF Downloads 286
20058 An Amphibious House for Flood Prone Areas in Godavari River Basin

Authors: Gangadhara Rao K.

Abstract:

In Andhra Pradesh, the flood problem has traditionally been confined to the flooding of smaller rivers. However, the drainage problem in the coastal delta zones has worsened, multiplying the destructive potential of cyclones and increasing flood hazards. As a result of floods, the people living in these areas are forced to leave their traditional lands in search of higher ground. This paper discusses the suitability of amphibious-housing techniques used in Bangladesh in the context of the Godavari river basin in Andhra Pradesh. The study considers the social, physical, and environmental conditions of the region. The methods for achieving this objective include the study of cases from both Bangladesh and Andhra Pradesh, comparison with the existing techniques, and adaptation to our requirements and context. If successful, we can adopt those techniques, which might help the people living in riverfront areas stay safe during floods without losing their traditional lands.

Keywords: amphibious, buoyancy, floating, architecture, flood resistant

Procedia PDF Downloads 149
20057 Low-Level Modeling for Optimal Train Routing and Scheduling in Busy Railway Stations

Authors: Quoc Khanh Dang, Thomas Bourdeaud’huy, Khaled Mesghouni, Armand Toguyéni

Abstract:

This paper studies a train routing and scheduling problem for busy railway stations. Our objective is to allow trains to be routed in dense areas that are reaching saturation. Unlike traditional methods that allocate all the resources needed to set up a route for a train and hold them until the route is freed, our work focuses on releasing resources as trains progress through the railway node. This technique allows a larger number of trains to be routed simultaneously in a railway node and thus reduces its saturation. To deal with this problem, this study proposes an abstract model and a mixed-integer linear programming formulation to solve it. The applicability of our method is illustrated on a didactic example.

Keywords: busy railway stations, mixed-integer linear programming, offline railway station management, train platforming, train routing, train scheduling

Procedia PDF Downloads 231
20056 A Hybrid Distributed Algorithm for Multi-Objective Dynamic Flexible Job Shop Scheduling Problem

Authors: Aydin Teymourifar, Gurkan Ozturk

Abstract:

In this paper, a hybrid distributed algorithm is suggested for the multi-objective dynamic flexible job shop scheduling problem. The proposed algorithm is high-level, in the sense that several algorithms search the space on different machines simultaneously; it is also a hybrid algorithm that takes advantage of artificial intelligence, evolutionary, and optimization methods. Distribution is done at different levels, and new approaches are used for the design of the algorithm. The Apache Spark and Hadoop frameworks have been used for the distribution of the algorithm. The Pareto optimality approach is used for solving the multi-objective benchmarks. The suggested algorithm, which is able to solve large-size problems in a short time, has been compared with successful algorithms from the literature. The results prove the high speed and efficiency of the algorithm.

Keywords: distributed algorithms, apache-spark, Hadoop, flexible dynamic job shop scheduling, multi-objective optimization

Procedia PDF Downloads 330
20055 ECO ROADS: A Solution to the Vehicular Pollution on Roads

Authors: Harshit Garg, Shakshi Gupta

Abstract:

One of the major problems in today's world is growing pollution, which lies at the root of most environmental problems. Looking at the statistics, one finds that most of this pollution is vehicular pollution, which accounts for more than 70% of the total and affects the environment and human health in proportion. Vehicles run on roads, so why not have roads that could adsorb that pollution, not only once but a number of times? Every problem has a solution that can be reached with state-of-the-art technology; that is, innovative ideas can turn technology into a solution to the problem of vehicular pollution on roads. Solving the problem up to a certain limit/percentage is formulated here under a new term, ECO ROADS.

Keywords: environment, pollution, roads, sustainability

Procedia PDF Downloads 534
20054 Approaching the Spatial Multi-Objective Land Use Planning Problems at Mountain Areas by a Hybrid Meta-Heuristic Optimization Technique

Authors: Konstantinos Tolidis

Abstract:

The mountains are amongst the most fragile environments in the world. The world's mountain areas cover 24% of the Earth's land surface and are home to 12% of the global population. A further 14% of the global population is estimated to live in the vicinity of their surrounding areas. As urbanization continues to increase in the world, the mountains are also key centers for recreation and tourism; their attraction is often heightened by their remarkably high levels of biodiversity. Because the features of mountain areas vary spatially (degree of development, human geography, socio-economic reality, relations of dependency and interaction with other areas and regions), spatial planning in these areas is a crucial process for preserving the natural, cultural, and human environment and one of the major processes of an integrated spatial policy. This research focuses on the spatial decision problem of land use allocation optimization, which is a common planning problem in mountain areas. Such decisions must be made not only on what to do and how much to do, but also on where to do it, which adds a whole extra class of decision variables to the problem once spatial optimization is considered. The utility of optimization as a normative tool for spatial problems is widely recognized. However, it is very difficult for planners to quantify the weights of the objectives, especially when these are related to mountain areas. Furthermore, land use allocation optimization problems in mountain areas must be addressed by taking into account not only the general development objectives but also the spatial objectives (e.g., compactness, compatibility, and accessibility). Therefore, the main research objective was to approach the land use allocation problem by utilizing a hybrid meta-heuristic optimization technique tailored to the spatial characteristics of mountain areas. The results indicate that the proposed methodological approach is very promising and useful both for generating land use alternatives for further consideration in land use allocation decision-making and for supporting spatial management plans in mountain areas.

Keywords: multiobjective land use allocation, mountain areas, spatial planning, spatial decision making, meta-heuristic methods

Procedia PDF Downloads 309
20053 Direct Blind Separation Methods for Convolutive Image Mixtures

Authors: Ahmed Hammed, Wady Naanaa

Abstract:

In this paper, we propose a general approach to deal with the problem of a convolutive mixture of images. We use a direct blind source separation method that adds only one non-statistically justified constraint describing the relationships between the different mixing matrices, with the aim of making the problem easy to solve. This method can be applied, provided that this constraint is known, to degraded documents affected by the overlapping of text patterns and images. Such degradation is due to chemical and physical reactions of the materials (paper, inks, ...) occurring during document aging, and to other unpredictable causes such as humidity, microorganism infestation, human handling, etc. We demonstrate that this problem corresponds to a convolutive mixture of images. Subsequently, we show the validation of our method through numerical examples. We can thus obtain clear images from unreadable ones, such as those caused by page superposition, a phenomenon frequently found in archival documents.

Keywords: blind source separation, convoluted mixture, degraded documents, text-patterns overlapping

Procedia PDF Downloads 304
20052 Preventing Corruption in Dubai: Governance, Contemporary Strategies and Systemic Flaws

Authors: Graham Brooks, Belaisha Bin Belaisha, Hakkyong Kim

Abstract:

Preventing and/or reducing corruption is a major international problem. This paper, however, specifically focuses on how organisations in Dubai are tackling the problem of money laundering. This research establishes that Dubai has a clear international anti-money laundering framework but suffers from some national weaknesses, such as diverse anti-money laundering working practices, a lack of communication and information sharing, and disparate organisational vested self-interest.

Keywords: corruption, governance, money laundering, prevention, strategies

Procedia PDF Downloads 258
20051 Comparison of Safety Factor Evaluation Methods for Buckling of High Strength Steel Welded Box Section Columns

Authors: Balazs Somodi, Balazs Kovesdi

Abstract:

In civil engineering research practice, the statistical evaluation of experimental and numerical investigations is an essential task in order to compare the experimental and numerical resistances for a specific structural problem with the resistances proposed by the standards. However, the standards and the international literature contain several different safety factor evaluation methods that can be used to check the necessary safety level (e.g., the 5% quantile level, the 2.3% quantile level, the 1‰ quantile level, the γM partial safety factor, the γM* partial safety factor, and the β reliability index). Moreover, in the international literature, different calculation methods can be found even for the same safety factor. In the present study, the flexural buckling resistance of high strength steel (HSS) welded closed sections is analyzed. The authors investigated the flexural buckling resistances of the analyzed columns by laboratory experiments. The safety levels of the obtained experimental resistances are calculated based on several safety approaches and compared with EN 1990. The results of the different safety approaches are compared and evaluated. Based on the evaluation, tendencies are identified and the differences between the statistical evaluation methods are explained.
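
As a hedged illustration of one of the listed checks (not the EN 1990 Annex D procedure), the sketch below computes the 5% fractile of a set of test resistances under a normal assumption and a crude partial-factor-style ratio; the resistance values and the assumed design resistance are invented.

```python
# Hedged sketch of one safety check: 5% quantile of experimental resistances under
# a normal assumption, plus a crude partial-factor-style ratio. Values are made up.
import statistics
from scipy.stats import norm

resistances = [512.0, 498.0, 530.0, 505.0, 521.0, 489.0, 515.0, 508.0]   # kN, illustrative

mean = statistics.mean(resistances)
std = statistics.stdev(resistances)                 # sample standard deviation
r_k = mean + norm.ppf(0.05) * std                   # 5% quantile (characteristic value)
r_d_target = 470.0                                  # assumed design resistance from a standard
gamma_m = r_k / r_d_target                          # implied partial-factor-style ratio

print(f"mean = {mean:.1f} kN, R_k (5% fractile) = {r_k:.1f} kN, gamma_M ~ {gamma_m:.3f}")
```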

Keywords: flexural buckling, high strength steel, partial safety factor, statistical evaluation

Procedia PDF Downloads 144
20050 An Improved Ant Colony Algorithm for Genome Rearrangements

Authors: Essam Al Daoud

Abstract:

Genome rearrangement is an important area in computational biology and bioinformatics. The basic problem in genome rearrangements is to compute the edit distance, i.e., the minimum number of operations needed to transform one genome into another. Unfortunately, the unsigned genome rearrangement problem is NP-hard. In this study, an improved ant colony optimization algorithm to approximate the edit distance is proposed. The main idea is to convert the unsigned permutation to a signed permutation and evaluate the ants by using the Kaplan algorithm. Two new operations are added to the standard ant colony algorithm: replacing the worst ants by re-sampling ants from a new probability distribution, and applying crossover operations to the best ants. The proposed algorithm is tested and compared with the improved breakpoint reversal sort algorithm using three datasets. The results indicate that the proposed algorithm achieves a better accuracy ratio than the previous methods.
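
For background on the benchmark mentioned above, the helper below counts the breakpoints of an unsigned permutation, the quantity that breakpoint-reversal-sort style bounds on the edit distance are built from; it is not the proposed ant colony algorithm.

```python
# Count breakpoints of an unsigned permutation: adjacent elements (with the frame
# 0 and n+1 added) that are not consecutive integers.
def breakpoints(perm):
    extended = [0] + list(perm) + [len(perm) + 1]
    return sum(1 for a, b in zip(extended, extended[1:]) if abs(a - b) != 1)

print(breakpoints([3, 1, 2, 4]))   # 0-3, 3-1 and 2-4 are breakpoints -> 3
```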

Keywords: ant colony algorithm, edit distance, genome breakpoint, genome rearrangement, reversal sort

Procedia PDF Downloads 327
20049 Upon One Smoothing Problem in Project Management

Authors: Dimitri Golenko-Ginzburg

Abstract:

A CPM network project with deterministic activity durations, in which activities require homogeneous resources with fixed capacities, is considered. The problem is to determine the optimal schedule of starting times for all network activities within their maximal allowable limits (in order not to exceed the network's critical time) so as to minimize the maximum resources required by the project at any point in time. In the case when a non-critical activity may start only at discrete moments separated by a pregiven time span, the problem becomes NP-complete, and an optimal solution may be obtained via a look-over algorithm. For the case when a look-over requires too much computational time, an approximate algorithm is suggested. The algorithm's performance ratio, i.e., the relative accuracy error, is determined. Experimentation has been undertaken to verify the suggested algorithm.
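
A toy sketch of the objective being smoothed (not the look-over or approximate algorithm): given activity start times, durations, and resource requirements, build the resource profile over time and take its peak. The schedule values are arbitrary.

```python
# Build the resource profile implied by a schedule and take its peak; shifting
# non-critical start times within their float changes this peak.
def peak_resource(schedule, horizon):
    """schedule: list of (start, duration, resource_requirement)."""
    profile = [0] * horizon
    for start, duration, req in schedule:
        for t in range(start, start + duration):
            profile[t] += req
    return max(profile), profile

schedule = [(0, 3, 2), (1, 4, 3), (4, 2, 2)]      # arbitrary example activities
peak, profile = peak_resource(schedule, horizon=8)
print(peak, profile)
```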

Keywords: resource smoothing problem, CPM network, lookover algorithm, lexicographical order, approximate algorithm, accuracy estimate

Procedia PDF Downloads 283
20048 Survey Paper on Graph Coloring Problem and Its Application

Authors: Prateek Chharia, Biswa Bhusan Ghosh

Abstract:

Graph coloring is one of the prominent concepts in graph theory. It can be defined as an assignment of colors to the various regions (or vertices) of a graph such that all the constraints are fulfilled. In this paper, various graph coloring approaches are described, such as greedy coloring, heuristic search for a maximum independent set, and graph coloring using an edge table. Graph coloring can be used in various real-world applications, such as student timetable generation, Sudoku as a graph coloring problem, and GSM phone networks.
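
A minimal sketch of the greedy coloring approach mentioned above: vertices are visited in order and each receives the smallest color not used by its already-colored neighbours; the example graph is arbitrary.

```python
# Greedy graph coloring: each vertex gets the smallest color not already used
# by its colored neighbours. The visiting order affects how many colors are used.
def greedy_coloring(adjacency):
    colors = {}
    for v in adjacency:
        used = {colors[u] for u in adjacency[v] if u in colors}
        color = 0
        while color in used:
            color += 1
        colors[v] = color
    return colors

graph = {                                   # e.g. exam slots, GSM frequencies, Sudoku cells
    'A': ['B', 'C'],
    'B': ['A', 'C', 'D'],
    'C': ['A', 'B'],
    'D': ['B'],
}
print(greedy_coloring(graph))    # {'A': 0, 'B': 1, 'C': 2, 'D': 0}
```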

Keywords: graph coloring, greedy coloring, heuristic search, edge table, sudoku as a graph coloring problem

Procedia PDF Downloads 519
20047 Measuring Science and Technology Innovation Capacity in Developing Countries: From a National Innovation System

Authors: Haeng A. Seo, Changseok Oh, Seung Jun Yoo

Abstract:

This study attempts to examine the disparities in S&T innovation capacity across 14 developing countries in order to discuss how to support specific features in national innovation systems. The sample includes East Asian, Middle Asian, Central American, and African countries. Here, we particularly focus on five dimensions (resources, activities, network, environment, and performance) with 37 indicators. They were derived as structuring components of the relevant diagnostic model, which encompasses the whole process of S&T innovation from the input of resources to the output of economically valuable results. For many developing nations, economic industries remain weaker than actual S&T capabilities, and relevant regulatory authorities may not exist. This paper will be helpful in providing basic evidence and setting directions towards better national S&T innovation capacities and national competitiveness.

Keywords: developing countries, measurement, NIS, S&T innovation capacity

Procedia PDF Downloads 262
20046 Wasting Human and Computer Resources

Authors: Mária Csernoch, Piroska Biró

Abstract:

The legends about “user-friendly” and “easy-to-use” birotical tools (computer-related office tools) have been spreading and misleading end-users. This approach has led to an extremely high number of incorrect documents, causing serious financial losses in the creating, modifying, and retrieving processes. Our research proved that there are at least two sources of this underachievement: (1) the lack of a definition of correctly edited and formatted documents. Consequently, end-users do not know whether their methods and results are correct or not. They are not aware of their ignorance, and that ignorance does not allow them to realize their lack of knowledge. (2) The end-users’ problem-solving methods. We have found that in non-traditional programming environments, end-users apply, almost exclusively, surface-approach metacognitive methods to carry out their computer-related activities, which have proved less effective than deep-approach methods. Based on these findings, we have developed deep-approach methods which are based on and adapted from traditional programming languages. In this study, we focus on the most popular type of birotical document, the text-based document. We provide a definition of correctly edited text and, based on this definition, adapt the debugging method known from programming. According to the method, before any real text editing, a thorough debugging of already existing texts and a categorization of errors are carried out. With this method, in advance of real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface-approach methods. The advantages of the method are that real text handling requires much fewer human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and that data retrieval is much more effective than from error-prone documents.

Keywords: deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources

Procedia PDF Downloads 367
20045 A General Variable Neighborhood Search Algorithm to Minimize Makespan of the Distributed Permutation Flowshop Scheduling Problem

Authors: G. M. Komaki, S. Mobin, E. Teymourian, S. Sheikh

Abstract:

This paper addresses minimizing the makespan of the distributed permutation flow shop scheduling problem. In this problem, there are several parallel identical factories or flowshops, each with a series of similar machines. Each job must be allocated to one of the factories, and all of the operations of a job must be performed in the allocated factory. This problem has recently gained attention, and due to its NP-hard nature, metaheuristic algorithms have been proposed to tackle it. The majority of the proposed algorithms require a large computational time, which is their main drawback. In this study, a general variable neighborhood search algorithm (GVNS) is proposed in which several time-saving schemes have been incorporated. The GVNS also uses a sophisticated method to change the shaking (perturbation) procedure depending on the progress of the incumbent solution, in order to prevent stagnation of the search. The performance of the proposed algorithm is compared to state-of-the-art algorithms on standard benchmark instances.
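
For background, the makespan such algorithms evaluate follows the standard permutation flowshop recursion C[m][j] = max(C[m-1][j], C[m][j-1]) + p[m][j]; the sketch below implements only this evaluation on invented data, not the distributed GVNS itself.

```python
# Standard permutation flowshop makespan evaluation for one factory.
def flowshop_makespan(processing_times, sequence):
    """processing_times[m][j]: time of job j on machine m; sequence: job order."""
    n_machines = len(processing_times)
    completion = [[0] * len(sequence) for _ in range(n_machines)]
    for m in range(n_machines):
        for pos, job in enumerate(sequence):
            prev_machine = completion[m - 1][pos] if m > 0 else 0
            prev_job = completion[m][pos - 1] if pos > 0 else 0
            completion[m][pos] = max(prev_machine, prev_job) + processing_times[m][job]
    return completion[-1][-1]

p = [[3, 2, 4], [2, 5, 1], [4, 1, 3]]     # 3 machines x 3 jobs, arbitrary times
print(flowshop_makespan(p, sequence=[0, 2, 1]))
# In the distributed variant, each job is first assigned to a factory, this makespan
# is evaluated per factory, and the overall makespan is the maximum over factories.
```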

Keywords: distributed permutation flow shop, scheduling, makespan, general variable neighborhood search algorithm

Procedia PDF Downloads 336
20044 Simulation Model of Induction Heating in COMSOL Multiphysics

Authors: K. Djellabi, M. E. H. Latreche

Abstract:

The induction heating phenomenon depends on various factors, making the problem highly nonlinear. The mathematical analysis of this problem is in most cases very difficult, and it is reduced to simple cases. Further knowledge of induction heating systems is generated in production environments, but these trial-and-error procedures are long and expensive. Numerical models of the induction heating problem are another approach to reducing the above-mentioned drawbacks. This paper deals with a simulation model of the induction heating problem, created in COMSOL Multiphysics. We present results of numerical simulations of the induction heating process in cylindrically shaped workpieces, in an inductor with four coils. The modeling of the induction heating process was carried out with COMSOL Multiphysics version 4.2a; for the study, we present the temperature charts.

Keywords: induction heating, electromagnetic field, inductor, numerical simulation, finite element

Procedia PDF Downloads 295
20043 Forecasting Residential Water Consumption in Hamilton, New Zealand

Authors: Farnaz Farhangi

Abstract:

Many people in New Zealand believe that access to water is inexhaustible, a belief that comes from a history of virtually unrestricted access to it. For a region like Hamilton, which is one of New Zealand's fastest growing cities, it is crucial for policy makers to know about future water consumption and the implementation of rules and regulations such as universal water metering. Hamilton residents use water freely, and they do not have any idea about how much water they use. Hence, one of the objectives of this research is to forecast water consumption using different methods. The residential water consumption time series exhibits seasonal and trend variations. Seasonality is the pattern caused by repeating events such as weather conditions in summer and winter, public holidays, etc. The problem with this seasonal fluctuation is that it dominates the other time series components and makes it difficult to determine other variations (such as the effect of educational campaigns, regulation, etc.) in the time series. Apart from seasonality, a stochastic trend is also combined with seasonality and has a different effect on the forecasting results. According to the forecasting literature, preprocessing (de-trending and de-seasonalization) is essential to obtain better forecasting results, while some other researchers argue that seasonally non-adjusted data should be used. Hence, I address the question: is preprocessing essential? A wide range of forecasting methods exists, with different pros and cons. In this research, I apply double seasonal ARIMA and an Artificial Neural Network (ANN), considering diverse elements such as seasonality and calendar effects (public and school holidays), and combine their results to find the best predicted values. The aim is to examine the results of the combined method (hybrid model) and the individual methods and to compare their accuracy and robustness. In order to use ARIMA, the data should be stationary. ANNs, in turn, have been applied successfully to forecasting seasonal and trended time series. Using a hybrid model is a way to improve the accuracy of the methods. Because water demand is dominated by different seasonal patterns, I combine different methods in order to capture their sensitivity to weather conditions, calendar effects, and other seasonal patterns. The advantage of this combination is the reduction of errors obtained by averaging the individual models. It is also useful when we are not sure about the accuracy of each forecasting model, and it can ease the problem of model selection. Using daily residential water consumption data from January 2000 to July 2015 in Hamilton, I show how predictions by the different methods vary. The ANN gives more accurate forecasting results than the other methods, and preprocessing is essential when seasonal time series are used. Using the hybrid model reduces the average forecasting errors and increases performance.
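
A hedged sketch of the hybrid idea on synthetic data: a seasonal ARIMA forecast and a small ANN forecast are simply averaged. statsmodels' SARIMAX handles a single seasonal period, so this only approximates the double seasonal model in the paper, and all orders, lags, and sizes are illustrative assumptions.

```python
# Hybrid forecast sketch on synthetic "daily demand" data: average a seasonal
# ARIMA forecast and an ANN forecast built from lagged values.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(400)
y = 200 + 20 * np.sin(2 * np.pi * t / 7) + 0.05 * t + rng.normal(0, 2, t.size)  # weekly cycle + trend

train, horizon = y[:-14], 14

# Seasonal ARIMA forecast (single weekly seasonality only)
sarima = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 1, 1, 7)).fit(disp=False)
f_arima = sarima.forecast(horizon)

# ANN forecast from lagged values, rolled forward recursively
lags = 14
Xl = np.array([train[i - lags:i] for i in range(lags, train.size)])
yl = train[lags:]
ann = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(Xl, yl)
history = list(train[-lags:])
f_ann = []
for _ in range(horizon):
    pred = ann.predict(np.array(history[-lags:]).reshape(1, -1))[0]
    f_ann.append(pred)
    history.append(pred)

hybrid = (np.asarray(f_arima) + np.asarray(f_ann)) / 2.0   # simple average of the two models
print(np.mean(np.abs(hybrid - y[-14:])))                    # mean absolute error on the hold-out
```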

Keywords: artificial neural network (ANN), double seasonal ARIMA, forecasting, hybrid model

Procedia PDF Downloads 310
20042 Arabic Lexicon Learning to Analyze Sentiment in Microblogs

Authors: Mahmoud B. Rokaya

Abstract:

The study of opinion mining and sentiment analysis includes the analysis of opinions, sentiments, evaluations, attitudes, and emotions. The rapid growth of social media, social networks, reviews, forum discussions, microblogs, and Twitter leads to a parallel growth in the field of sentiment analysis. The field of sentiment analysis tries to develop effective tools that make it possible to capture the trends of people. There are two approaches in the field: lexicon-based and corpus-based methods. A lexicon-based method uses a sentiment lexicon which includes sentiment words and phrases with assigned numeric scores. These scores reveal whether sentiment phrases are positive or negative, their intensity, and/or their emotional orientations. The creation of manual lexicons is hard, which brings the need for adaptive automated methods for generating a lexicon. The proposed method generates dynamic lexicons based on the corpus and then classifies text using these lexicons. In the proposed method, different approaches are combined to generate lexicons from text. The proposed method classifies tweets into five classes instead of just positive or negative classes. The sentiment classification problem is written as an optimization problem; finding optimal sentiment lexicons is the goal of the optimization process. The solution is produced based on mathematical programming approaches to find the best lexicon for classifying texts. A genetic algorithm was written to find the optimal lexicon. Then, a meta-level feature was extracted based on the optimal lexicon. The experiments were conducted on several datasets. The results, in terms of accuracy, recall, and F-measure, outperformed the state-of-the-art methods proposed in the literature on some of the datasets. A better understanding of the Arabic language and the culture of Arab Twitter users, and of the sentiment orientation of words in different contexts, can be achieved based on the sentiment lexicons proposed by the algorithm.

Keywords: social media, Twitter sentiment, sentiment analysis, lexicon, genetic algorithm, evolutionary computation

Procedia PDF Downloads 158
20041 On the Performance of Improvised Generalized M-Estimator in the Presence of High Leverage Collinearity Enhancing Observations

Authors: Habshah Midi, Mohammed A. Mohammed, Sohel Rana

Abstract:

Multicollinearity occurs when two or more independent variables in a multiple linear regression model are highly correlated. Ridge regression is the method commonly used to rectify this problem. However, ridge regression cannot handle multicollinearity that is caused by high leverage collinearity enhancing observations (HLCEOs). Since high leverage points (HLPs) are responsible for inducing multicollinearity, the effect of HLPs needs to be reduced by using a Generalized M (GM) estimator. The existing GM6 estimator is based on the Minimum Volume Ellipsoid (MVE), which tends to swamp some low leverage points. Hence, an improvised GM (MGM) estimator is presented to improve the precision of the GM6 estimator. A numerical example and a simulation study are presented to show how HLPs can cause multicollinearity. The numerical results show that our MGM estimator is the most efficient method compared to some existing methods.
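
For reference, the ridge estimator mentioned as the common remedy has the closed form beta = (X'X + kI)^(-1) X'y; the sketch below applies it to synthetic collinear data and is the baseline that GM-type estimators refine, not the proposed MGM estimator.

```python
# Ridge regression on synthetic, nearly collinear data: the closed-form estimator
# (X'X + kI)^(-1) X'y stabilizes coefficients that ordinary least squares leaves unstable.
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)        # nearly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(scale=0.5, size=n)

def ridge(X, y, k):
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

print(ridge(X, y, k=0.0))    # ordinary least squares: coefficients are unstable
print(ridge(X, y, k=1.0))    # ridge shrinkage stabilizes them
```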

Keywords: identification, high leverage points, multicollinearity, GM-estimator, DRGP, DFFITS

Procedia PDF Downloads 235