Search results for: conceptual data model
GeNS: A Biological Data Integration Platform
Authors: Joel Arrais, João E. Pereira, João Fernandes, José Luís Oliveira
Abstract:
The scientific achievements of molecular biology depend greatly on the capability of computational applications to analyze laboratory results. A comprehensive analysis of an experiment typically requires the simultaneous study of the obtained dataset together with data available in several distinct public databases. Nevertheless, developing centralized access to these distributed databases raises a set of challenges, such as: what is the best integration strategy, how to resolve nomenclature clashes, how to handle overlapping data across databases, and how to deal with huge datasets. In this paper we present GeNS, a system that uses a simple yet innovative approach to address several biological data integration issues. Compared with existing systems, the main advantages of GeNS are its maintenance simplicity and its coverage and scalability, in terms of the number of supported databases and data types. To support our claims we present the current use of GeNS in two concrete applications. GeNS currently contains more than 140 million biological relations and it can be publicly downloaded or remotely accessed through SOAP web services.
Keywords: Data integration, biological databases.
Comparison of ANN and Finite Element Model for the Prediction of Ultimate Load of Thin-Walled Steel Perforated Sections in Compression
Authors: Zhi-Jun Lu, Qi Lu, Meng Wu, Qian Xiang, Jun Gu
Abstract:
The analysis of perforated steel members is inherently a 3D problem, so the traditional analytical expressions for the ultimate load of thin-walled steel sections cannot be used for perforated steel member design. In this study, the finite element method (FEM) and an artificial neural network (ANN) were used to simulate the process of stub column tests based on specific codes. Results show that, compared with those of the FEM model, the ultimate load predictions obtained with the ANN technique were much closer to those obtained from the physical experiments. The ANN model is very promising for solving the hard problem of complex perforated steel sections.
Keywords: Artificial neural network, finite element method, perforated sections, thin-walled steel, ultimate load.
Bubble Point Pressures of CO2+Ethyl Palmitate by a Cubic Equation of State and the Wong-Sandler Mixing Rule
Authors: M. A. Sedghamiz, S. Raeissi
Abstract:
This study presents three different approaches to estimate bubble point pressures for the binary system of CO2 and the fatty acid ethyl ester ethyl palmitate. The first method involves the Peng-Robinson (PR) Equation of State (EoS) with the conventional Van der Waals mixing rule. The second approach involves the PR EoS together with the Wong-Sandler (WS) mixing rule, coupled with the UNIQUAC GE model. In order to model the bubble point pressures with this approach, the volume and area parameters for ethyl palmitate were estimated by the Hansen group contribution method. The last method involves the PR EoS combined with the Wong-Sandler mixing rule, but using NRTL as the GE model. Results using the Van der Waals mixing rule clearly indicated that this method has the largest errors among all three methods, in the range of 3.96-6.22%. The PR-WS-UNIQUAC method exhibited small errors, with average absolute deviations between 0.95% and 1.97%. The PR-WS-NRTL method led to the smallest errors, with average absolute deviations between 0.65% and 1.7%.
Keywords: Bubble pressure, Gibbs excess energy model, mixing rule, CO2 solubility, ethyl palmitate.
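The equations themselves are not reproduced in the abstract. For reference only, the standard Peng-Robinson EoS that all three approaches build on has the well-known 1976 form (the Wong-Sandler mixing rule and the GE models are not shown here):

```latex
P = \frac{RT}{V_m - b} - \frac{a\,\alpha(T)}{V_m^{2} + 2bV_m - b^{2}}, \qquad
a = 0.45724\,\frac{R^{2}T_c^{2}}{P_c}, \qquad
b = 0.07780\,\frac{RT_c}{P_c},
\\[4pt]
\alpha(T) = \Bigl[1 + \kappa\bigl(1-\sqrt{T/T_c}\bigr)\Bigr]^{2}, \qquad
\kappa = 0.37464 + 1.54226\,\omega - 0.26992\,\omega^{2}.
```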
A Modified Run Length Coding Technique for Test Data Compression Based on Multi-Level Selective Huffman Coding
Authors: C. Kalamani, K. Paramasivam
Abstract:
Test data compression is an efficient method for reducing the test application cost. The problem of reducing test data has been addressed by researchers from three different angles: test data compression, Built-In Self-Test (BIST), and test set compaction. The latter two methods can enhance fault coverage at the cost of hardware overhead. The drawback of the conventional methods is that they reduce test storage and test power, but when the test data contain redundant runs, no additional compression is applied. This paper presents a modified Run Length Coding (RLC) technique combined with Multi-Level Selective Huffman Coding (MLSHC) to reduce test data volume, test pattern delivery time, and power dissipation in scan test applications; when redundant runs are encountered, the run symbol is replaced with a short codeword. Experimental results show that the presented method not only improves test data compression but also reduces the overall test data volume compared to recent schemes. Experiments on the six largest ISCAS'89 benchmarks show that our method outperforms most known techniques.
Keywords: Modified run length coding, multilevel selective Huffman coding, built-in-self-test, modified selective Huffman coding, automatic test equipment.
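The abstract above describes replacing redundant runs in the scan data with short codewords. Purely as an illustration of the run-length stage (the multi-level selective Huffman stage and the actual codeword assignment used in the paper are not shown), a plain run-length encoder over a test bit-stream could look like the following sketch:

```python
def run_length_encode(bits: str):
    """Collapse a scan-test bit-stream into (symbol, run_length) pairs.

    In the paper's scheme, frequent runs would subsequently be mapped to
    short codewords by multi-level selective Huffman coding; that stage
    is not reproduced here.
    """
    runs = []
    i = 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((bits[i], j - i))
        i = j
    return runs

# Example: a test cube whose don't-care positions have been filled with 0s
print(run_length_encode("0000001111000000000011"))
# [('0', 6), ('1', 4), ('0', 10), ('1', 2)]
```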
EEIA: Energy Efficient Indexed Aggregation in Smart Wireless Sensor Networks
Authors: Mohamed Watfa, William Daher, Hisham Al Azar
Abstract:
The main idea behind in-network aggregation is that, rather than sending individual data items from sensors to sinks, multiple data items are aggregated as they are forwarded by the sensor network. Existing sensor network data aggregation techniques assume that the nodes are preprogrammed and send data to a central sink for offline querying and analysis. This approach has two major drawbacks. First, the system behavior is preprogrammed and cannot be modified on the fly. Second, the increased energy wastage due to the communication overhead reduces the overall system lifetime. Thus, energy conservation is of prime consideration in sensor network protocols in order to maximize the network's operational lifetime. In this paper, we give an energy efficient approach to query processing by implementing new optimization techniques applied to in-network aggregation. We first discuss earlier approaches to sensor data management and highlight their disadvantages. We then present our approach, "Energy Efficient Indexed Aggregation" (EEIA), and evaluate it through several simulations to prove its efficiency, competence and effectiveness.
Keywords: Sensor Networks, Data Base, Data Fusion, Aggregation, Indexing, Energy Efficiency
Simulating the Interaction between Groundwater and Brittle Failure in Open Pit Slopes
Authors: Janisse Vivas, Doug Stead, Davide Elmo, Charles Hunt
Abstract:
This paper presents the results of a study on the influence of varying percentages of rock bridges along a basal surface defining a biplanar failure mode. A pseudo-coupled hydromechanical brittle fracture analysis is adopted using the state-of-the-art code Slope Model. Model results show that rock bridge failure is strongly influenced by the incorporation of groundwater pressures. The models show that groundwater pressure can promote total failure of a 5% rock bridge along the basal surface. Once the percentage of rock bridges increases to 10% and 15%, the rock bridges are broken but full interconnection of the surface defining the basal surface of the biplanar mode does not occur. Increased damage is caused when the rock bridge is located at the daylighting end of the basal surface, in proximity to the blast damage zone. As expected, some cracking damage is experienced in the blast damage zone, where properties representing a good-quality controlled-damage blasting technique were assumed. Model results indicate a potential increase of permeability towards the blast damage zone.
Keywords: Slope model, lattice spring, blasting damage zone.
Uncertainty Propagation and Sensitivity Analysis During Calibration of an Integrated Land Use and Transport Model
Authors: Parikshit Dutta, Mathieu Saujot, Elise Arnaud, Benoit Lefevre, Emmanuel Prados
Abstract:
In this work, the propagation of uncertainty during the calibration process of TRANUS, an integrated land use and transport model (ILUTM), has been investigated. It has also been examined, through a sensitivity analysis, which input parameters affect the variation of the outputs the most. Moreover, a probabilistic verification methodology for the calibration process, which equates the observed and calculated production, has been proposed. The model chosen as an application is the model of the city of Grenoble, France. For sensitivity analysis and uncertainty propagation, the Monte Carlo method was employed, and a statistical hypothesis test was used for verification. The parameters of the induced demand function in TRANUS were assumed to be uncertain in the present case. It was found that, if TRANUS converges during calibration, then with high probability the calibration process is verified. Moreover, a weak correlation was found between the inputs and the outputs of the calibration process. The total effect of the inputs on the outputs was investigated, and the output variation was found to be dictated by only a few input parameters.
Keywords: Uncertainty propagation, sensitivity analysis, calibration under uncertainty, hypothesis testing, integrated land use and transport models, TRANUS, Grenoble.
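TRANUS and the Grenoble model cannot be reproduced here; the sketch below only illustrates the general Monte Carlo propagation and correlation-based sensitivity screening pattern that the abstract describes, with a made-up surrogate function standing in for a calibration run and assumed parameter ranges:

```python
import numpy as np

rng = np.random.default_rng(0)

def calibration_run(params):
    """Placeholder for one calibration run (hypothetical surrogate, not TRANUS)."""
    a, b, c = params
    return a * 2.0 + np.sin(b) + 0.1 * c + rng.normal(scale=0.05)

# Uncertain inputs, e.g. parameters of the induced demand function,
# sampled from assumed uniform ranges (ranges are illustrative only).
n_samples = 5000
inputs = rng.uniform(low=[0.5, 0.0, -1.0], high=[1.5, 3.14, 1.0],
                     size=(n_samples, 3))
outputs = np.array([calibration_run(p) for p in inputs])

# Propagated output uncertainty
print("output mean / std:", outputs.mean(), outputs.std())

# Simple sensitivity screening: correlation between each input and the output
for k, name in enumerate(["a", "b", "c"]):
    r = np.corrcoef(inputs[:, k], outputs)[0, 1]
    print(f"corr({name}, output) = {r:+.3f}")
```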
Comparison of Different Methods to Produce Fuzzy Tolerance Relations for Rainfall Data Classification in the Region of Central Greece
Authors: N. Samarinas, C. Evangelides, C. Vrekos
Abstract:
The aim of this paper is the comparison of three different methods for producing fuzzy tolerance relations for rainfall data classification. More specifically, the three methods are the correlation coefficient, the cosine amplitude, and the max-min method. The data were obtained from seven rainfall stations in the region of central Greece and refer to 20-year time series of average monthly rainfall height. Each method is used to express these data as a fuzzy relation. The resulting fuzzy tolerance relation is then transformed into an equivalence relation by max-min composition for all three methods. From the equivalence relation, the rainfall stations were categorized and classified according to the degree of confidence. The classification shows the similarities among the rainfall stations. Stations with high similarity can be utilized interchangeably in water resource management scenarios or to augment data from one to another. Due to the complexity of the calculations, it is important to find out which of the methods is computationally simpler and needs fewer compositions in order to give reliable results.
Keywords: Classification, fuzzy logic, tolerance relations, rainfall data.
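As a rough illustration of two of the steps named in the abstract, the cosine amplitude similarity and the max-min composition that turns a tolerance relation into an equivalence relation can be sketched as follows (the station data below are random placeholders, not the Greek rainfall series):

```python
import numpy as np

def cosine_amplitude(X):
    """Fuzzy tolerance relation r_ij = |x_i . x_j| / (||x_i|| * ||x_j||)."""
    G = X @ X.T
    norms = np.sqrt(np.diag(G))
    return np.abs(G) / np.outer(norms, norms)

def max_min_composition(R, S):
    """(R o S)_ij = max_k min(R_ik, S_kj)."""
    return np.max(np.minimum(R[:, :, None], S[None, :, :]), axis=1)

def to_equivalence(R, max_iter=20):
    """Compose the tolerance relation with itself until it becomes transitive."""
    for _ in range(max_iter):
        R2 = max_min_composition(R, R)
        if np.allclose(R2, R):
            return R2
        R = R2
    return R

X = np.random.rand(7, 240)   # 7 stations x 240 monthly values (placeholder data)
R = cosine_amplitude(X)
E = to_equivalence(R)
# Stations i and j fall into the same class at confidence level lam if E[i, j] >= lam
print((E >= 0.95).astype(int))
```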
Modeling the Saltatory Conduction in Myelinated Axons by Order Reduction
Authors: Ruxandra Barbulescu, Daniel Ioan, Gabriela Ciuprina
Abstract:
Saltatory conduction is the way the action potential is transmitted along a myelinated axon. The potential diffuses along the myelinated compartments and is regenerated at the Ranvier nodes due to the ion channels allowing flow across the membrane. For an efficient simulation of populations of neurons, it is important to use reduced order models for both the myelinated compartments and the Ranvier nodes, and to have control over their accuracy and inner parameters. The paper presents a reduced order model of this neural system which allows an efficient simulation method for saltatory conduction in myelinated axons. This model is obtained by concatenating reduced order linear models of 1D myelinated compartments and nonlinear 0D models of Ranvier nodes. The models for the myelinated compartments are selected from a series of spatially distributed models developed and hierarchized according to their modeling errors. The extracted model, described by a nonlinear PDE of hyperbolic type, is able to reproduce the saltatory conduction with acceptable accuracy and takes into account the finite propagation speed of the potential. Finally, this model is reduced again in order to make it suitable for inclusion in large-scale neural circuits.
Keywords: Saltatory conduction, action potential, myelinated compartments, nonlinear, Ranvier nodes, reduced order models, POD.
Granularity Analysis for Spatio-Temporal Web Sensors
Authors: Shun Hattori
Abstract:
In recent years, much research has been actively conducted to mine the exploding Web world, especially User Generated Content (UGC) such as weblogs, for knowledge about various phenomena and events in the physical world, and Web services based on the Web-mined knowledge have begun to be developed for the public. However, there are few detailed investigations of how accurately Web-mined data reflect physical-world data. It is problematic to utilize Web-mined data uncritically in public Web services without sufficiently ensuring their accuracy. Therefore, this paper introduces the simplest Web Sensor and a spatiotemporally-normalized Web Sensor to extract spatiotemporal data about a target phenomenon from weblogs searched by keyword(s) representing the target phenomenon, and tries to validate the potential and reliability of the Web-sensed spatiotemporal data through four kinds of granularity analyses of the correlation coefficients with temperature, rainfall, snowfall, and earthquake statistics per day and per region from the Japan Meteorological Agency as physical-world data: spatial granularity (a region's population density), temporal granularity (time period, e.g., per day vs. per week), representation granularity (e.g., "rain" vs. "heavy rain"), and media granularity (weblogs vs. microblogs such as Tweets).
Keywords: Granularity analysis, knowledge extraction, spatiotemporal data mining, Web credibility, Web mining, Web sensor.
Performance Evaluation of Discrete Fourier Transform Algorithm Based PMU for Wide Area Measurement System
Authors: Alpesh Adeshara, Rajendrasinh Jadeja, Praghnesh Bhatt
Abstract:
The implementation of advanced technologies requires sophisticated instruments that deal with the operation, control, restoration and protection of rapidly growing power system networks under normal and abnormal conditions. Presently, applications of the Phasor Measurement Unit (PMU) are widely found in real-time operation, monitoring, control and analysis of power system networks, as it eliminates various limitations of the supervisory control and data acquisition (SCADA) system conventionally used in power systems. PMU data are rapidly gaining importance for both online and offline analysis. The wide area measurement system (WAMS) has been developed as a new technology through the use of multiple PMUs in the power system. The present paper proposes a MATLAB-based PMU model using the Discrete Fourier Transform (DFT) algorithm and evaluates its operation under different contingencies. A two-bus system with PMUs forming a WAMS network is presented as a case study.
Keywords: DFT-Discrete Fourier Transform, GPS-Global Positioning System, PMU-Phasor Measurement Unit, WAMS-Wide Area Measurement System.
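The MATLAB model itself is not given in the abstract. As a minimal, hypothetical illustration of the full-cycle DFT phasor estimate that such a PMU typically computes, consider the following sketch (the sampling rate, nominal frequency and test signal are assumed values, not the paper's):

```python
import numpy as np

F_NOM = 50.0            # nominal system frequency in Hz (assumed)
SAMPLES_PER_CYCLE = 32  # samples per nominal cycle (assumed)

def dft_phasor(window):
    """Full-cycle DFT estimate of the fundamental phasor from one cycle of samples.

    Returns the RMS magnitude and the phase angle (radians) of the fundamental.
    """
    N = len(window)
    n = np.arange(N)
    # Fundamental-frequency Fourier coefficient, scaled to an RMS phasor
    phasor = (np.sqrt(2.0) / N) * np.sum(window * np.exp(-1j * 2 * np.pi * n / N))
    return np.abs(phasor), np.angle(phasor)

# Synthetic test signal: 230 V RMS at nominal frequency with a 30 degree phase shift
fs = F_NOM * SAMPLES_PER_CYCLE
t = np.arange(SAMPLES_PER_CYCLE) / fs
v = 230.0 * np.sqrt(2.0) * np.cos(2 * np.pi * F_NOM * t + np.deg2rad(30.0))

mag, ang = dft_phasor(v)
print(f"|V| = {mag:.1f} V rms, angle = {np.rad2deg(ang):.1f} deg")
```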
Morpho-Phonological Modelling in Natural Language Processing
Authors: Eleni Galiotou, Angela Ralli
Abstract:
In this paper we propose a computational model for the representation and processing of morpho-phonological phenomena in a natural language, like Modern Greek. We aim at a unified treatment of inflection, compounding, and word-internal phonological changes, in a model that is used for both analysis and generation. After discussing certain difficulties caused by well-known finite-state approaches, such as Koskenniemi's two-level model [7], when applied to a computational treatment of compounding, we argue that a morphology-based model provides a more adequate account of word-internal phenomena. Contrary to the finite-state approaches, which cannot handle hierarchical word constituency in a satisfactory way, we propose a unification-based word grammar, as the nucleus of our strategy, which takes into consideration word representations that are based on affixation and [stem stem] or [stem word] compounds. In our formalism, feature-passing operations are formulated with the use of the unification device, and phonological rules modeling the correspondence between lexical and surface forms apply at morpheme boundaries. In the paper, examples from Modern Greek illustrate our approach. Morpheme structures, stress, and morphologically conditioned phoneme changes are analyzed and generated in a principled way.
Keywords: Morpho-Phonology, Natural Language Processing.
The Framework of Termination Mechanism in Modern Emergency Management
Authors: Yannan Wu, An Chen, Yan Zhao
Abstract:
The termination mechanism is an indispensable part of the emergency management mechanism. Despite its importance in both theory and practice, it remains an almost unexplored research field. The concept of a termination mechanism is first proposed in this paper, and its design and implementation, which help guarantee the effectiveness and integrity of emergency management, are then discussed. Starting with an introduction to the problems caused by absent or incorrect termination, the essence of the termination mechanism is analyzed, a model based on Optimal Stopping Theory is constructed, and the termination index is given. The model can be applied to find the best termination time point. The termination decision should be considered not only in the termination stage but throughout the whole emergency management process, which makes it a dynamic decision-making process. Besides, the main subjects and the procedure of termination are illustrated after the termination time point is given. Some future work is discussed lastly.
Keywords: Emergency management, Termination Mechanism, Optimal Termination Model, Decision Making, Optimal Stopping Theory.
Optimal Trajectories for Highly Automated Driving
Authors: Christian Rathgeber, Franz Winkler, Xiaoyu Kang, Steffen Müller
Abstract:
In this contribution, two approaches for calculating optimal trajectories for highly automated vehicles are presented and compared. The first one is based on a non-linear vehicle model and is used for evaluation. The second one is based on a simplified model and can be implemented on a current ECU. In typical driving situations, both approaches show very similar results.
Keywords: Trajectory planning, direct method, indirect method, highly automated driving.
Arrival and Departure Scheduling at Hub Airports Considering Airlines Level
Authors: A. Nourmohammadzadeh, R. Tavakkoli-Moghaddam
Abstract:
As air traffic increases at a hub airport, some flights cannot land or depart at their preferred target time because the runways become occupied to near their capacity. This results in extra costs for both passengers and airlines because of lost connecting flights, longer waiting, more fuel consumption, rescheduling of crew members, etc. Hence, devising an appropriate scheduling method that determines a suitable runway and time for each flight, in order to use the hub capacity efficiently and minimize the related costs, is of great importance. In this paper, we present a mixed-integer zero-one model for scheduling a set of mixed landing and departing flights (whereas most previous studies considered only landings). Since flight cost is strongly affected by the airline level, we consider different airline categories in our model. The model has a single objective minimizing the sum of three terms, namely 1) the weighted deviation from targets, 2) the scheduled time of the last flight (i.e., the makespan), and 3) the workload imbalance on the runways. We solve 10 simulated instances of different sizes up to 30 flights and 4 runways. Optimal solutions are obtained in a reasonable time and are satisfactory in comparison with the traditional rule, namely First-Come-First-Served (FCFS), which is far from optimal in most cases.
Keywords: Arrival and departure scheduling, Airline level, Mixed-integer model
The Research Report of Employment Trends in Printing Industry for Prepress
Authors: Weera Chotithammaporn
Abstract:
This research aims to study employment trends in the printing industry for prepress, supported by the Suan Sunandha University Fund. The objectives of this research are to describe the employment trends for prepress in the Thai printing industry in Bangkok, the different personnel that prepress entrepreneurs need, and the problems of employment. The population of prepress entrepreneurs comprises about 100 organizations in the Bangkok area. Questionnaires were collected and analyzed with the SPSS program using the average percentage and standard deviation. This research is a multiple case study. The conceptual framework is developed on the basis of open systems theory. The results show that: 1. Most prepress entrepreneurs tend to choose employees of either sex, aged 25-29 years, with a bachelor's degree and 1-2 years of experience. 2. The main problems are understanding of the job, communication/relations, and understanding of new technology. 3. 57.8% of the prepress industry in Bangkok intends to hire within 1-3 years. This research suggests that: 1. The Thai prepress printing industry in Bangkok needs quality employees who are experts in printing technology. 2. Prepress entrepreneurs should establish agreements with universities to train employees. 3. Prepress entrepreneurs should support personnel in extending their knowledge.
Keywords: Printing Industry, Prepress.
Simulating Economic Order Quantity and Reorder Point Policy for a Repairable Items Inventory System
Authors: Mojahid F. Saeed Osman
Abstract:
A repairable items inventory system is a management tool used to incorporate all information concerning inventory levels and movements for repaired and new items. This paper presents the development of an effective simulation model for managing the inventory of repairable items in a production system where production lines send their faulty items to a repair shop, considering stochastic failure behavior and repair times. The developed model imitates the process of handling the on-hand inventory of repaired items and the replenishment of the inventory of new items using an Economic Order Quantity and Reorder Point ordering policy in a flexible and risk-free environment. We demonstrate the appropriateness and effectiveness of the proposed simulation model using an illustrative case problem. The developed simulation model can be used as a reliable tool for estimating a healthy on-hand inventory of new and repaired items, backordered items, and downtime due to unavailability of repaired items, and for validating and examining the Economic Order Quantity and Reorder Point ordering policy, which will be compared with other ordering strategies in future work.
Keywords: Inventory system, repairable items, simulation, maintenance, economic order quantity, reorder point.
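The simulation model is not reproduced in the abstract; as a small reference sketch of the ordering policy it exercises, the classical EOQ and reorder point formulas can be written as below (the demand, cost and lead-time figures are made-up examples, not the paper's case data):

```python
import math

def economic_order_quantity(annual_demand, order_cost, holding_cost):
    """Classical EOQ: order size that minimizes ordering plus holding cost."""
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

def reorder_point(daily_demand, lead_time_days, safety_stock=0.0):
    """Reorder when on-hand inventory drops to expected lead-time demand plus safety stock."""
    return daily_demand * lead_time_days + safety_stock

# Illustrative numbers only
D, S, H = 12000.0, 150.0, 2.5       # units/year, cost per order, holding cost/unit/year
eoq = economic_order_quantity(D, S, H)
rop = reorder_point(daily_demand=D / 365.0, lead_time_days=7, safety_stock=40)
print(f"EOQ = {eoq:.0f} units, reorder point = {rop:.0f} units")
```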
Periodic Solutions for a Delayed Population Model on Time Scales
Authors: Kejun Zhuang, Zhaohui Wen
Abstract:
This paper deals with a delayed single population model on time scales. With the assistance of coincidence degree theory, sufficient conditions for the existence of periodic solutions are obtained. Furthermore, better estimates for the bounds of the periodic solutions are established.
Keywords: Coincidence degree, continuation theorem, periodic solutions, time scales
A Design of the Infrastructure and Computer Network for Distance Education, Online Learning via New Media, E-Learning and Blended Learning
Authors: Sumitra Nuanmeesri
Abstract:
This research focuses on studying, analyzing and designing a model of the infrastructure and computer networks for distance education, online learning via new media, e-learning and blended learning. The information collected from the study and analysis process was evaluated by the index of item-objective congruence (IOC) by 9 specialists in order to design the model. The evaluation of the model by the sample of 9 specialists yielded a mean of 3.85. The results showed that the infrastructure and computer networks are designed appropriately to a great extent.
Keywords: Blended Learning, New Media, Infrastructure and Computer Network, Tele-Education, Online Learning.
Reliable Capacitated Facility Location Problem Considering Maximal Covering
Authors: Mehdi Seifbarghy, Sajjad Jalali, Seyed Habib A. Rahmati
Abstract:
This paper provides a framework that simultaneously incorporates reliability, as a measure of disruption in distribution systems, and partial covering theory, as a response to limited coverage radii and economic preferences, into the traditional literature on capacitated facility location problems. As a result, we develop a bi-objective model based on discrete scenarios for expected cost minimization and demand coverage maximization through a three-echelon supply chain network, by providing multiple capacity levels for the provider-side layers and imposing a gradual coverage function for the distribution centers (DCs). Additionally, besides aggregating the objectives to solve the model with the LINGO software, a branch of the LP-Metric method called the Min-Max approach is proposed, and different aspects of the corresponding model are explored.
Keywords: Reliability Cost, Partial Covering, LP-Metric
Environmental Management of the Tanning Industry's Supply Chain: An Integration Model from Lean Supply Chain, Green Supply Chain, Cleaner Production and ISO 14001:2004
Authors: N. Clavijo Buriticá, L. M. Correa López and J. R. Sánchez Rodríguez
Abstract:
The environmental impact caused by industries is an issue that, over the last 20 years, has become very important in social, economic and political terms in Colombia. In particular, the tanning process is extremely polluting because of ineffective treatment of, and regulations on, the dumping process and atmospheric emissions. Considering that, this investigation proposes a management model based on the integration of Lean Supply Chain, Green Supply Chain, Cleaner Production and ISO 14001:2004 that prioritizes the strategic components of the organizations. As a result, a management model will be obtained which provides a strategic perspective through a systemic approach to the tanning process. This will be achieved through the use of multicriteria decision tools, along with Quality Function Deployment and Fuzzy Logic. The strategic approach that the management model embraces, through the alignment of Lean Supply Chain, Green Supply Chain, Cleaner Production and ISO 14001:2004, is an integrated perspective that allows the tactical and operative elements to be framed gradually through the correct setting of the information flow, improving the decision-making process. In that way, Small and Medium Enterprises (SMEs) could improve their productivity and competitiveness and, as added value, minimize their environmental impact. This improvement is expected to be controlled through a dashboard that helps the organization measure its performance along the implementation of the model in its productive process.
Keywords: Integration, environmental impact, management, systemic organization.
Joint Training Offer Selection and Course Timetabling Problems: Models and Algorithms
Authors: Gianpaolo Ghiani, Emanuela Guerriero, Emanuele Manni, Alessandro Romano
Abstract:
In this article, we deal with a variant of the classical course timetabling problem that has a practical application in many areas of education. In particular, in this paper we are interested in high school remedial courses. The purpose of such courses is to provide under-prepared students with the skills necessary to succeed in their studies. A student might be under-prepared in an entire course, or only in a part of it. The limited availability of funds, as well as the limited amount of time and teachers at their disposal, often requires schools to choose which courses and/or which teaching units to activate. Thus, schools need to model the training offer and the related timetabling, with the goal of ensuring the highest possible teaching quality, while meeting the above-mentioned financial, time and resource constraints. Moreover, there are some prerequisites between the teaching units that must be satisfied. We first present a Mixed-Integer Programming (MIP) model to solve this problem to optimality. However, the presence of many peculiar constraints inevitably increases the complexity of the mathematical model. Thus, solving it with a general-purpose solver is feasible for small instances only, while solving real-life-sized instances of this model requires specific techniques or heuristic approaches. For this purpose, we also propose a heuristic approach, in which we make use of a fast constructive procedure to obtain a feasible solution. To assess our exact and heuristic approaches we perform extensive computational experiments on both real-life instances (obtained from a high school in Lecce, Italy) and randomly generated instances. Our tests show that the MIP model is never solved to optimality, with an average optimality gap of 57%. On the other hand, the heuristic algorithm is much faster (in about 50% of the considered instances it converges in approximately half of the time limit) and in many cases achieves an improvement on the objective function value obtained by the MIP model. Such an improvement ranges between 18% and 66%.
Keywords: Heuristic, MIP model, Remedial course, School, Timetabling.
Non-negative Principal Component Analysis for Face Recognition
Abstract:
Principal component analysis is often combined with state-of-the-art classification algorithms to recognize human faces. However, principal component analysis can only capture features contributing to the global characteristics of the data because it is a global feature selection algorithm. It misses features contributing to the local characteristics of the data because each principal component only contains some level of global characteristics of the data. In this study, we present a novel face recognition approach using non-negative principal component analysis, which adds a non-negativity constraint to improve data locality and contribute to elucidating latent data structures. Experiments are performed on the Cambridge ORL face database. We demonstrate the strong performance of the algorithm in recognizing human faces in comparison with PCA and NREMF approaches.
Keywords: classification, face recognition, non-negative principal component analysis (NPCA)
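The NPCA update rules are not given in the abstract; for orientation, the unconstrained PCA baseline that NPCA extends can be sketched as follows, with the non-negativity constraint on the basis only noted in a comment:

```python
import numpy as np

def pca(X, n_components):
    """Plain PCA via eigendecomposition of the sample covariance matrix."""
    Xc = X - X.mean(axis=0)                 # center the data
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:n_components]
    W = eigvecs[:, order]                   # basis vectors may contain negative entries
    return Xc @ W, W

# NPCA, as described in the abstract, additionally constrains the basis W to be
# non-negative (typically enforced with projected or multiplicative updates);
# that constrained optimization step is not reproduced here.
X = np.random.rand(100, 64)                 # e.g. 100 flattened face patches (placeholder)
scores, basis = pca(X, n_components=8)
print(scores.shape, basis.shape)            # (100, 8) (64, 8)
```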
Coupling Compensation of 6-DOF Parallel Robot Based on Screw Theory
Authors: Ming Cong, Yinghua Wu, Dong Liu, Haiying Wen, Junfa Yu
Abstract:
In order to improve control performance and eliminate steady-state error, a coupling compensation for a 6-DOF parallel robot is presented. Taking the dynamic load Tank Simulator as the research object, this paper analyzes the coupling of the 6-DOF parallel robot considering the degrees of freedom of the 6-DOF parallel manipulator. The coupling angle and coupling velocity are derived based on the inverse kinematics model. The study uses the combined mechanism-model method, which takes as its input a practical motion trajectory that considers the performance of the motion controller and motor. Experimental results show that the coupling compensation improves motion stability as well as accuracy. Moreover, it decreases the dither amplitude of the dynamic load Tank Simulator.
Keywords: coupling compensation, screw theory, parallel robot, mechanism-model combined motion
Adjusted Ratio and Regression Type Estimators for Estimation of Population Mean when some Observations are missing
Authors: Nuanpan Nangsue
Abstract:
Ratio and regression type estimators have been used by previous authors to estimate a population mean for the principal variable from samples in which both auxiliary x and principal y variable data are available. However, missing data are a common problem in statistical analyses with real data. Ratio and regression type estimators have also been used for imputing values of missing y data. In this paper, six new ratio and regression type estimators are proposed for imputing values for any missing y data and estimating a population mean for y from samples with missing x and/or y data. A simulation study has been conducted to compare the six ratio and regression type estimators with a previous estimator of Rueda. Two population sizes N = 1,000 and 5,000 have been considered, with sample sizes of 10% and 30%, and with correlation coefficients between population variables X and Y of 0.5 and 0.8. In the simulations, 10 and 40 percent of sample y values and 10 and 40 percent of sample x values were randomly designated as missing. The new ratio and regression type estimators give similar mean absolute percentage errors that are smaller than those of the Rueda estimator in all cases. The new estimators give a large reduction in errors for the case of 40% missing y values and a sampling fraction of 30%.
Keywords: Auxiliary variable, missing data, ratio and regression type estimators.
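The six proposed estimators are not specified in the abstract. Purely as a reminder of the classical building blocks they adjust, a simple ratio imputation for missing y values and a ratio-type estimator of the mean can be sketched as follows (the data are simulated placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated sample with auxiliary x fully observed and some y values missing
n = 200
x = rng.uniform(10, 50, size=n)
y = 2.0 * x + rng.normal(scale=5.0, size=n)
missing = rng.random(n) < 0.4                # 40% of y designated missing
y_obs = np.where(missing, np.nan, y)

# Classical ratio imputation: y_hat_i = (mean of observed y / mean of their x) * x_i
r_hat = np.nanmean(y_obs) / x[~missing].mean()
y_imputed = np.where(missing, r_hat * x, y_obs)

# Ratio-type estimator of the mean of y, using the mean of x as a stand-in
# for the known population mean of X
x_bar_pop = x.mean()
y_bar_ratio = np.nanmean(y_obs) * x_bar_pop / x[~missing].mean()

print("imputed-sample mean:", y_imputed.mean())
print("ratio estimator of mean:", y_bar_ratio)
```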
Modeling User Behaviour by Planning
Authors: Alfredo Milani, Silvia Suriani
Abstract:
A model of user behaviour based on automated planning is introduced in this work. The behaviour of users of web interactive systems can be described in terms of a planning domain encapsulating the timed action patterns that represent the intended user profile. User behaviour recognition is then posed as a planning problem where the goal is to parse a given sequence of user logs of the observed activities while reaching a final state. A general technique for transforming a timed finite state automata description of the behaviour into a numerical parameter planning model is introduced. Experimental results show that the performance of a planning-based behaviour model is effective and scalable for real world applications. A major advantage of the planning-based approach is that it represents the problems of plan recognition, plan synthesis and plan optimisation in a single automated reasoning framework.
Keywords: User behaviour, Timed Transition Automata, Automated Planning.
Efficient Implementation of Serial and Parallel Support Vector Machine Training with a Multi-Parameter Kernel for Large-Scale Data Mining
Authors: Tatjana Eitrich, Bruno Lang
Abstract:
This work deals with aspects of support vector learning for large-scale data mining tasks. Based on a decomposition algorithm that can be run in serial and parallel mode, we introduce a data transformation that allows the use of an expensive generalized kernel without additional costs. In order to speed up the decomposition algorithm, we analyze the problem of working set selection for large data sets and the influence of the working set sizes on the scalability of the parallel decomposition scheme. Our modifications and settings lead to improved support vector learning performance and thus allow the use of extensive parameter search methods to optimize classification accuracy.
Keywords: Support Vector Machines, Shared Memory Parallel Computing, Large Data
Maintenance Function's Performance Evaluation Using Adapted Balanced Scorecard Model
Authors: A. Bakhtiar, B. Purwanggono, N. Metasari
Abstract:
PT XYZ is a bottled drinking water company. To preserve the production resources owned by the company so that they can be utilized well, it has implemented a maintenance management system, which plays an important role in the company's profitability and is one of the factors influencing overall company performance. Yet, up to now, the company has never measured the maintenance activities' contribution to the company's performance. Performance evaluation is done according to an adapted Balanced Scorecard model fitted to the maintenance function context. This model includes six perspectives: innovation and growth, production, maintenance, environment, customer, and finance. Actual performance measurement is done through the Analytic Hierarchy Process and the Objective Matrix. From the research done, we conclude that the company's maintenance function is categorized as moderate performance. However, there are some indicators with high priority but low performance, namely: customer complaint rate, work lateness rate, and Return on Investment.
Keywords: Maintenance, performance, balanced scorecard, objective matrix.
Analysis of Attention to the Confucius Institute from Domestic and Foreign Mainstream Media
Authors: Wei Yang, Xiaohui Cui, Weiping Zhu, Liqun Liu
Abstract:
The rapid development of the Confucius Institute is attracting more and more attention from mainstream media around the world. Mainstream media play a large role in public information dissemination and public opinion. This study presents efforts to analyze the correlation and functional relationship between domestic and foreign mainstream media by analyzing the number of reports on the Confucius Institute. Three kinds of correlation calculation methods, the Pearson correlation coefficient (PCC), the Spearman correlation coefficient (SCC), and the Kendall rank correlation coefficient (KCC), were applied to analyze the correlations among mainstream media from three regions: the mainland of China; Hong Kong and Macao (the two special administrative regions of China, denoted as SARs); and overseas countries excluding China, such as the United States, England, and Canada. Further, the paper measures the functional relationships among the regions using a regression model. The experimental analyses found high correlations among mainstream media from the different regions. Additionally, we found that there is a linear relationship between the mainstream media of overseas countries and those of the SARs by analyzing the number of reports on the Confucius Institute, based on a data set obtained by crawling the websites of 106 mainstream media outlets during the years 2004 to 2014.
Keywords: Confucius Institute, correlation analysis, mainstream media, regression model.
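The report-count data set is not available here, but the three correlation measures and the regression step named in the abstract can be computed directly with SciPy and NumPy; the yearly counts below are invented placeholders used only to show the calls:

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr, kendalltau

# Hypothetical yearly report counts (2004-2014) for two regions; NOT the paper's data
mainland = [12, 18, 25, 40, 55, 61, 70, 82, 90, 101, 97]
overseas = [5, 9, 14, 22, 30, 41, 48, 60, 72, 80, 85]

pcc, p1 = pearsonr(mainland, overseas)
scc, p2 = spearmanr(mainland, overseas)
kcc, p3 = kendalltau(mainland, overseas)
print(f"PCC={pcc:.3f} (p={p1:.3g}), SCC={scc:.3f} (p={p2:.3g}), KCC={kcc:.3f} (p={p3:.3g})")

# A simple linear regression between the two series, in the spirit of the
# functional relationship mentioned in the abstract
slope, intercept = np.polyfit(overseas, mainland, deg=1)
print(f"mainland ~ {slope:.2f} * overseas + {intercept:.2f}")
```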
Meteorological Risk Assessment for Ships with Fuzzy Logic Designer
Authors: Ismail Karaca, Ridvan Saracoglu, Omer Soner
Abstract:
Fuzzy logic, an advanced method to support decision-making, is used by scientists in many disciplines. Fuzzy programming is a product of fuzzy logic, fuzzy rules, and implication. In marine science, fuzzy programming for ships is increasing dramatically alongside autonomous ship studies. In this paper, a program to support the decision-making process for ship navigation has been designed. The program is built with fuzzy logic and rules, taking marine accidents and expert opinions into account. After the program was designed, it was tested on 46 ship accidents reported by the Transportation Safety Investigation Center of Turkey. Wind speed, sea condition, visibility, and day/night ratio were used as input data. They were converted into a risk factor within the Fuzzy Logic Designer application using fuzzy rules set by marine experts. Finally, the experts' meteorological risk factor for each accident was compared with the program's risk factor, and the error rate was calculated. The main objective of this study is to improve the navigational safety of ships by using the advanced decision support model. According to the study results, fuzzy programming is a robust model that supports safe navigation.
Keywords: Calculation of risk factor, fuzzy logic, fuzzy programming for ship, safe navigation of ships.
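The actual rule base and membership functions were set by marine experts in the Fuzzy Logic Designer application and are not given in the abstract. The sketch below is a deliberately simplified, hand-rolled Mamdani-style stand-in with made-up breakpoints, using only two of the four inputs (wind speed and visibility):

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function on breakpoints a <= b <= c."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, None)

RISK_U = np.linspace(0.0, 10.0, 101)   # universe of discourse for the risk factor

def meteorological_risk(wind_kn, visibility_nm):
    """Toy Mamdani-style risk assessment with two inputs and two rules."""
    # Fuzzification (breakpoints are illustrative, not the experts' values)
    wind_high = trimf(wind_kn, 20, 40, 60)
    wind_low = trimf(wind_kn, -1, 0, 25)
    vis_poor = trimf(visibility_nm, -1, 0, 2)
    vis_good = trimf(visibility_nm, 1, 5, 10)

    # Rule base with min implication:
    #   R1: IF wind is high AND visibility is poor THEN risk is high
    #   R2: IF wind is low  AND visibility is good THEN risk is low
    high_clip = np.minimum(min(wind_high, vis_poor), trimf(RISK_U, 5, 10, 15))
    low_clip = np.minimum(min(wind_low, vis_good), trimf(RISK_U, -5, 0, 5))

    # Aggregation (max) and centroid defuzzification
    agg = np.maximum(high_clip, low_clip)
    return float((RISK_U * agg).sum() / (agg.sum() + 1e-12))

print(meteorological_risk(wind_kn=35, visibility_nm=1.0))   # roughly 8 on a 0-10 scale
```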