Search results for: Uncertainty sets
791 A Probabilistic View of the Spatial Pooler in Hierarchical Temporal Memory
Authors: Mackenzie Leake, Liyu Xia, Kamil Rocki, Wayne Imaino
Abstract:
In the Hierarchical Temporal Memory (HTM) paradigm, the effect of overlap between inputs on the activation of columns in the spatial pooler is studied. Numerical results suggest that similar inputs are represented by similar sets of columns and dissimilar inputs are represented by dissimilar sets of columns. It is shown that the spatial pooler produces these results under certain conditions on the connectivity and proximal thresholds. Following the discussion of how the threshold parameters are initialized, corresponding qualitative arguments about the learning dynamics of the spatial pooler are presented.
Keywords: Hierarchical Temporal Memory, HTM, Learning Algorithms, Machine Learning, Spatial Pooler.
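A minimal numerical sketch can illustrate the column-activation mechanism the abstract studies: each column draws a random set of proximal connections, its overlap with a binary input is counted, and columns whose overlap clears the proximal threshold compete for activation. All parameter values and names below are illustrative assumptions, not the authors' probabilistic model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_columns = 100, 50
connectivity = 0.3       # fraction of inputs each column connects to (assumed)
proximal_threshold = 6   # minimum overlap for a column to be eligible (assumed)

# Boolean connection matrix: row i lists the inputs column i synapses onto.
connections = rng.random((n_columns, n_inputs)) < connectivity

def active_columns(x, k=10):
    """Columns activated for binary input x: overlap score, threshold, then top-k."""
    overlap = connections.astype(int) @ x
    winners = np.argsort(overlap)[-k:]
    return {int(c) for c in winners if overlap[c] >= proximal_threshold}

x1 = (rng.random(n_inputs) < 0.2).astype(int)
x2 = x1.copy()
x2[rng.choice(n_inputs, 5, replace=False)] ^= 1   # a similar input: 5 bits flipped
a1, a2 = active_columns(x1), active_columns(x2)
print("shared fraction of active columns:", len(a1 & a2) / max(len(a1 | a2), 1))
```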
790 Robust Stabilization of Rotational Motion of Underwater Robots against Parameter Uncertainties
Authors: Riku Hayashida, Tomoaki Hashimoto
Abstract:
This paper provides a robust stabilization method for the rotational motion of underwater robots against parameter uncertainties. Underwater robots are expected to be used for various work assignments, and this large variety of applications motivates researchers to develop control systems and technologies for underwater robots. Several control methods have been proposed so far for the stabilization of the nominal system model of underwater robots with no parameter uncertainty. Parameter uncertainties are considered to be obstacles to the implementation of such nominal control methods for underwater robots. The objective of this study is to establish a robust stabilization method for the rotational motion of underwater robots against parameter uncertainties. The effectiveness of the proposed method is verified by numerical simulations.
Keywords: Robust control, stabilization method, underwater robot, parameter uncertainty.
789 Role of Director's Philosophical Approach in Cinematographic Expression
Authors: Sedat Cereci
Abstract:
The original idea for a feature film may come from a writer, a director, or a producer. The director is the person responsible for the creative aspects, both interpretive and technical, of a motion picture production. The director may be filmed discussing the project with his or her co-writers, members of the production staff, and the producer, and may be shown selecting locales or constructing sets. All these activities provide, of course, ways of externalizing the director's ideas about the film. A director sometimes pushes both the film image and the techniques of narration to new artistic limits, but the director's main responsibility is to lead the spectator to an original opinion through his or her philosophical approach. The director tries to find an artistic angle in every scene, turns the screenplay into an effective story, and sets the film on a spiritual and philosophical base.
Keywords: Director, role, film, approach, opinion.
788 Stating Best Commercialization Method: An Unanswered Question from Scholars and Practitioners
Authors: Saheed A. Gbadegeshin
Abstract:
Commercialization method is a means to make inventions available at the market for final consumption. It is described as an important tool for keeping business enterprises sustainable and for improving national economic growth. Thus, there are several scholarly publications on it, either presenting or testing different methods for commercialization. However, young entrepreneurs, technologists and scientists would like to know the best method to commercialize their innovations. Then, this question arises: What is the best commercialization method? To answer the question, a systematic literature review was conducted, and practitioners were interviewed. The literary results revealed that there are many methods, but new methods are needed to improve commercialization, especially during these times of economic crisis and political uncertainty. Similarly, the empirical results showed that there are several methods, but the best method is the one that reduces costs, reduces the risks associated with uncertainty, and improves customer participation and acceptability. Therefore, it was concluded that a new commercialization method is essential for today's high technologies, and such a method was presented.
Keywords: Commercialization method, high technology, lean start-up methodology, technology, knowledge.
787 Automated Knowledge Engineering
Authors: Sandeep Chandana, Rene V. Mayorga, Christine W. Chan
Abstract:
This article outlines the conceptualization and implementation of an intelligent system capable of extracting knowledge from databases. The use of hybridized features of both Rough and Fuzzy Set theory renders the developed system flexible in dealing with discrete as well as continuous datasets. A raw data set provided to the system is initially transformed into a computer-legible format, followed by pruning of the data set. The refined data set is then processed through various Rough Set operators, which enable the discovery of parameter relationships and interdependencies. The discovered knowledge is automatically transformed into a rule base expressed in Fuzzy terms. Two exemplary cancer repository datasets (for Breast and Lung Cancer) have been used to test and implement the proposed framework.
Keywords: Knowledge Extraction, Fuzzy Sets, Rough Sets, Neuro-Fuzzy Systems, Databases.
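The Rough Set operators this pipeline relies on reduce, at their core, to indiscernibility classes and the lower and upper approximations built from them. A minimal sketch of those two operators on a toy discrete decision table (attribute values and the target class are invented for illustration):

```python
from collections import defaultdict

# Toy decision table: object -> discrete attribute values.
table = {
    "o1": ("big", "smooth"),
    "o2": ("big", "smooth"),
    "o3": ("small", "rough"),
    "o4": ("small", "smooth"),
}
target = {"o1", "o3"}   # decision class X to approximate

# Indiscernibility: objects with identical attribute values fall in one class.
classes = defaultdict(set)
for obj, attrs in table.items():
    classes[attrs].add(obj)

# Lower approximation: classes wholly inside X (certain members of X).
lower = {o for c in classes.values() if c <= target for o in c}
# Upper approximation: classes intersecting X (possible members of X).
upper = {o for c in classes.values() if c & target for o in c}

print("lower:", lower)   # {'o3'}: o1 is indiscernible from o2, which lies outside X
print("upper:", upper)   # {'o1', 'o2', 'o3'}
```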
786 Assessment and Uncertainty Analysis of ROSA/LSTF Test on Pressurized Water Reactor 1.9% Vessel Upper Head Small-Break Loss-of-Coolant Accident
Authors: Takeshi Takeda
Abstract:
An experiment utilizing the ROSA/LSTF (rig of safety assessment/large-scale test facility) simulated a 1.9% vessel upper head small-break loss-of-coolant accident with an accident management (AM) measure under total failure of the high-pressure injection system of the emergency core cooling system in a pressurized water reactor. Steam generator (SG) secondary-side depressurization, the AM measure, was started by fully opening the relief valves in both SGs when the maximum core exit temperature rose to 623 K. A large increase took place in the cladding surface temperature of the simulated fuel rods on account of a late and slow response of the core exit thermocouples during core boil-off. The author analyzed the LSTF test by reference to the matrix of an integral effect test for the validation of a thermal-hydraulic system code. Problems remained in predicting the primary coolant distribution and the core exit temperature with the RELAP5/MOD3.3 code. The uncertainty analysis results of the RELAP5 code confirmed that the sample size chosen for the order statistics influences both the peak cladding temperature obtained with a 95% probability at a 95% confidence level and the Spearman's rank correlation coefficient.
Keywords: LSTF, LOCA, uncertainty analysis, RELAP5.
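The sample-size question the abstract raises is usually handled with Wilks' order-statistics formula; the abstract does not name it, so treat this as an assumed reading. The smallest number of code runs n such that the largest observed value bounds the 95th percentile with 95% confidence satisfies 1 - 0.95^n >= 0.95. A sketch, together with the Spearman rank correlation used to rank input importance (input and response data are invented):

```python
import numpy as np
from scipy.stats import spearmanr

def wilks_sample_size(prob=0.95, conf=0.95):
    """Smallest n with 1 - prob**n >= conf (first-order, one-sided tolerance limit)."""
    n = 1
    while 1 - prob**n < conf:
        n += 1
    return n

print(wilks_sample_size())  # 59, the classic 59-run rule for 95%/95%

# Spearman's rho between a sampled input parameter and the resulting PCT.
rng = np.random.default_rng(1)
gap_conductance = rng.uniform(0.5, 1.5, 59)                  # hypothetical input
pct = 1000 + 150 * gap_conductance + rng.normal(0, 20, 59)   # hypothetical response
rho, pval = spearmanr(gap_conductance, pct)
print(f"Spearman rho = {rho:.2f} (p = {pval:.1e})")
```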
785 Analyzing Periurban Fringe with Rough Set
Authors: Benedetto Manganelli, Beniamino Murgante
Abstract:
The distinction among urban, periurban and rural areas represents a classical example of uncertainty in land classification. Satellite images, geostatistical analysis and all kinds of spatial data are very useful in urban sprawl studies, but it is important to define precise rules for combining great amounts of data to build complex knowledge about the territory. Rough Set theory may be a useful method to employ in this field. It represents a different mathematical approach to uncertainty by capturing indiscernibility: two different phenomena can be indiscernible in some contexts and are classified in the same way when the available information about them is combined. This approach has been applied to a case study, comparing the results achieved with the Map Algebra technique and with Spatial Rough Sets. The case study area, Potenza Province, is particularly suitable for the application of this theory, because it includes 100 municipalities with different numbers of inhabitants and different morphologic features.
Keywords: Land Classification, Map Algebra, Periurban Fringe, Rough Set, Urban Planning, Urban Sprawl.
784 Improving Sales through Inventory Reduction: A Retail Chain Case Study
Authors: M. G. Mattos, J. E. Pécora Jr, T. A. Briso
Abstract:
Today's challenging business environment, with unpredictable demand and volatility, requires a supply chain strategy that handles uncertainty and risks in the right way. Even though inventory models have been explored previously, this paper seeks to apply these concepts to a practical situation. This study involves the inventory replenishment problem, applying techniques that are mainly based on mathematical assumptions and modeling. The primary goal is to improve the retailer's supply chain processes by taking store differences into account when setting the various target stock levels. Through an inventory review policy, picking-piece implementation and a minimum-exposure definition, we were able not only to reduce inventory but also to improve sales results. The inventory management theory from the literature review was then tested in a single case study regarding a particular department in one of the largest Latam retail chains.
Keywords: Inventory, distribution, retail, risk, safety stock, sales, uncertainty.
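The abstract does not give its target-stock formula, but a standard way to set store-specific safety stock under demand and lead-time uncertainty (an assumption here, not the authors' exact policy) combines a service-level z-factor with both sources of variance:

```python
from math import sqrt
from scipy.stats import norm

def safety_stock(service_level, mean_demand, sd_demand, mean_lead, sd_lead):
    """Safety stock with uncertain daily demand and uncertain lead time (days)."""
    z = norm.ppf(service_level)
    return z * sqrt(mean_lead * sd_demand**2 + (mean_demand * sd_lead)**2)

# Two hypothetical stores with different demand profiles get different targets.
for store, (mu_d, sd_d) in {"A": (120, 35), "B": (40, 25)}.items():
    ss = safety_stock(0.95, mu_d, sd_d, mean_lead=5, sd_lead=1.5)
    print(f"store {store}: safety stock ~ {ss:.0f} units, "
          f"target = {mu_d * 5 + ss:.0f} (cycle stock over the lead time + safety stock)")
```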
783 Reliability Analysis of Press Unit using Vague Set
Authors: S. P. Sharma, Monica Rani
Abstract:
In conventional reliability assessment, the reliability data of system components are treated as crisp values. The collected data carry some uncertainty due to errors by human beings or machines or from other sources. These uncertainty factors limit the understanding of system component failure because the data are incomplete. In these situations, we need to generalize classical methods to a fuzzy environment for studying and analyzing the systems of interest. Fuzzy set theory has been proposed to handle such vagueness by generalizing the notion of membership in a set. Essentially, in a Fuzzy Set (FS) each element is associated with a point value selected from the unit interval [0, 1], which is termed the grade of membership in the set. A Vague Set (VS), as well as an Intuitionistic Fuzzy Set (IFS), is a further generalization of an FS: instead of the point-based membership used in an FS, interval-based membership is used in a VS, which is more expressive in capturing the vagueness of data. In the present paper, vague set theory coupled with the conventional Lambda-Tau method is presented for the reliability analysis of repairable systems. The methodology uses Petri nets (PN) to model the system instead of a fault tree, because they allow efficient simultaneous generation of minimal cut and path sets. The presented method is illustrated with the press unit of a paper mill.
Keywords: Lambda-Tau methodology, Petri nets, repairable system, vague fuzzy set.
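The interval-based membership that a Vague Set assigns to component reliability can be propagated through series/parallel structure with elementary interval arithmetic. The sketch below is an illustrative assumption of this note, not the paper's Petri-net/Lambda-Tau procedure, and the component intervals are invented:

```python
def series(*intervals):
    """Reliability interval of independent components in series: product of bounds."""
    lo = hi = 1.0
    for a, b in intervals:
        lo, hi = lo * a, hi * b
    return lo, hi

def parallel(*intervals):
    """Reliability interval of a redundant pair/group: 1 - product of unreliabilities."""
    lo = hi = 1.0
    for a, b in intervals:
        lo, hi = lo * (1 - b), hi * (1 - a)
    return 1 - hi, 1 - lo

# Hypothetical vague reliabilities [t, 1 - f] of three press-unit components.
r1, r2, r3 = (0.90, 0.95), (0.85, 0.92), (0.80, 0.90)
print(series(r1, r2))                 # two components that must both work
print(series(parallel(r2, r3), r1))   # a redundant pair in series with r1
```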
782 The Comparisons of Average Outgoing Quality Limit between the MCSP-2-C and MCSP-C
Authors: P. Guayjarernpanishk, T. Mayureesawan
Abstract:
This paper presents a comparison of the average outgoing quality limit of the MCSP-2-C plan with that of the MCSP-C plan, from which MCSP-2-C has been developed. The parameters used in MCSP-2-C are: i (the clearance number), c (the acceptance number), m (the number of conforming units to be found before allowing c nonconforming units in the sampling inspection), and f1 and f2 (the sampling frequencies at levels 1 and 2, respectively). The average outgoing quality limit (AOQL) values from the two plans were compared, and we found that for all sets of i, r, and c values, MCSP-2-C gives higher values than MCSP-C. For all sets of i, r, and c values, the average outgoing quality values of MCSP-C and MCSP-2-C are similar when p is low or high but differ when p is moderate.
Keywords: average outgoing quality, average outgoing quality limit, continuous sampling plan.
781 Comparison of Response Surface Designs in a Spherical Region
Authors: Boonorm Chomtee, John J. Borkowski
Abstract:
The objective of the research is to study and compare response surface designs: central composite designs (CCD), Box-Behnken designs (BBD), small composite designs (SCD), hybrid designs, and uniform shell designs (USD) over sets of reduced models when the design is in a spherical region for 3 and 4 design variables. The two optimality criteria (D and G) are considered, for which larger values imply a better design. The comparison of the design optimality criteria of the response surface designs across the full second-order model and sets of reduced models for 3 and 4 factors, based on the two criteria, is presented.
Keywords: design optimality criteria, reduced models, response surface design, spherical design region.
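As a rough illustration of the D criterion (the abstract does not give the exact scaled form used in the paper, so the normalization |X'X|^(1/p)/n below is an assumption, and a two-factor CCD is used only to keep the sketch short), one can build the second-order model matrix of a design and score it:

```python
import numpy as np
from itertools import product

alpha = np.sqrt(2)  # axial distance placing axial points on the sphere for k = 2
factorial = list(product([-1.0, 1.0], repeat=2))
axial = [(alpha, 0.0), (-alpha, 0.0), (0.0, alpha), (0.0, -alpha)]
center = [(0.0, 0.0)] * 3
design = np.array(factorial + axial + center)

def model_matrix(d):
    """Full second-order model in two factors: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = d[:, 0], d[:, 1]
    return np.column_stack([np.ones(len(d)), x1, x2, x1 * x2, x1**2, x2**2])

X = model_matrix(design)
n, p = X.shape
d_score = np.linalg.det(X.T @ X) ** (1 / p) / n
print(f"n = {n}, p = {p}, D criterion = {d_score:.3f}")
```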
780 Data Envelopment Analysis under Uncertainty and Risk
Authors: P. Beraldi, M. E. Bruni
Abstract:
Data Envelopment Analysis (DEA) is one of the most widely used techniques for evaluating the relative efficiency of a set of homogeneous decision making units. Traditionally, it assumes that input and output variables are known in advance, ignoring the critical issue of data uncertainty. In this paper, we deal with the problem of efficiency evaluation under uncertain conditions by adopting the general framework of stochastic programming. We assume that output parameters are represented by discretely distributed random variables, and we propose two different models defined according to a risk-neutral and a risk-averse perspective. The models have been validated on a real case study concerning the evaluation of the technical efficiency of a sample of individual firms operating in the Italian leather manufacturing industry. Our findings show the validity of the proposed approach as an ex-ante evaluation technique by providing the decision maker with useful insights depending on his degree of risk aversion.
Keywords: DEA, Stochastic Programming, Ex-ante evaluation technique, Conditional Value at Risk.
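For readers unfamiliar with the deterministic baseline the paper extends, here is a minimal input-oriented CCR multiplier model solved as a linear program (the toy data are invented; the stochastic and CVaR variants of the paper are not reproduced):

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 4 DMUs, 2 inputs (rows of X), 1 output (rows of Y).
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])

def ccr_efficiency(k):
    """Efficiency of DMU k: max u'y_k s.t. v'x_k = 1, u'y_j - v'x_j <= 0, u, v >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[k], np.zeros(m)])             # maximize u'y_k
    A_ub = np.hstack([Y, -X])                            # u'y_j - v'x_j <= 0, all j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[k]])[None, :]  # v'x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun

for k in range(len(X)):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```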
779 A Serial Hierarchical Support Vector Machine and 2D Feature Sets Act for Brain DTI Segmentation
Authors: Mohammad Javadi
Abstract:
Serial hierarchical support vector machine (SHSVM) is proposed to discriminate three brain tissues: white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF). SHSVM takes a novel classification approach by repeating the hierarchical classification on the data set iteratively. It uses a Radial Basis Function (RBF) kernel with different tunings to obtain accurate results. As a second approach, segmentation was performed with the DAGSVM method. In this article, eight univariate features are extracted from the raw DTI data, and all the possible 2D feature sets are examined within the segmentation process. SHSVM succeeded in obtaining DSI values higher than 0.95 for all three tissues, which is higher than the DAGSVM results.
Keywords: Brain segmentation, DTI, hierarchical, SVM.
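A serial/hierarchical SVM can be pictured as a chain of binary RBF classifiers, each peeling off one tissue class before the next stage sees the remainder. A minimal two-stage sketch with scikit-learn on synthetic stand-in data (the DTI features, tuning, and iteration scheme of the paper are not reproduced):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_blobs

# Synthetic 2D "feature set" with three classes standing in for CSF/GM/WM.
Xf, y = make_blobs(n_samples=600, centers=3, cluster_std=1.2, random_state=0)

# Stage 1: CSF (class 0) vs rest; Stage 2: GM (class 1) vs WM (class 2).
stage1 = SVC(kernel="rbf", gamma=0.5, C=10).fit(Xf, y == 0)
mask = y != 0
stage2 = SVC(kernel="rbf", gamma=0.2, C=1).fit(Xf[mask], y[mask])

def predict(points):
    out = np.where(stage1.predict(points), 0, -1)   # stage 1 peels off class 0
    rest = out == -1
    out[rest] = stage2.predict(points[rest])        # stage 2 splits the remainder
    return out

print("training accuracy:", (predict(Xf) == y).mean())
```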
778 Improving Load Frequency Control of Multi-Area Power System by Considering Uncertainty by Using Optimized Type 2 Fuzzy PID Controller with the Harmony Search Algorithm
Authors: Mehrdad Mahmudizad, Roya Ahmadi Ahangar
Abstract:
This paper presents a method of designing type 2 fuzzy PID controllers in order to solve the Load Frequency Control (LFC) problem. The Harmony Search (HS) algorithm is used to tune the measurement factors and the uncertainty of the membership functions of the Interval Type 2 Fuzzy Proportional Integral Differential (IT2FPID) controllers in order to reduce the frequency deviation resulting from load oscillations. The simulation results show that the performance of the proposed IT2FPID LFC, in terms of error, settling time and robustness against different load oscillations, is better than that of PID and Type 1 Fuzzy Proportional Integral Differential (T1FPID) controllers.
Keywords: Load Frequency Control, Fuzzy-PID controller, Type 2 fuzzy system, Harmony Search algorithm.
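The HS loop itself is compact enough to sketch: a harmony memory of candidate gain vectors is improvised with memory-consideration, pitch-adjustment, and random-selection moves, keeping the best solutions. Below it tunes three PID-like gains on a stand-in quadratic objective (the actual LFC plant model of the paper is not reproduced, and all parameter values are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
lo, hi = np.zeros(3), np.full(3, 10.0)        # bounds for (Kp, Ki, Kd)
HMS, HMCR, PAR, BW, iters = 10, 0.9, 0.3, 0.5, 2000

def cost(g):
    """Stand-in objective; in the paper this would be the frequency-deviation error."""
    return np.sum((g - np.array([2.0, 0.5, 1.2])) ** 2)

memory = rng.uniform(lo, hi, (HMS, 3))
scores = np.apply_along_axis(cost, 1, memory)

for _ in range(iters):
    new = np.empty(3)
    for d in range(3):
        if rng.random() < HMCR:                 # memory consideration
            new[d] = memory[rng.integers(HMS), d]
            if rng.random() < PAR:              # pitch adjustment
                new[d] += BW * rng.uniform(-1, 1)
        else:                                   # random selection
            new[d] = rng.uniform(lo[d], hi[d])
    new = np.clip(new, lo, hi)
    worst = np.argmax(scores)
    if cost(new) < scores[worst]:               # replace the worst harmony
        memory[worst], scores[worst] = new, cost(new)

print("best gains:", memory[np.argmin(scores)].round(2))
```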
777 Two-Dimensional Modeling of Spent Nuclear Fuel Using FLUENT
Authors: Imane Khalil, Quinn Pratt
Abstract:
In a nuclear reactor, an array of fuel rods containing stacked uranium dioxide pellets clad with zircaloy is the heat source for a thermodynamic cycle of energy conversion from heat to electricity. After the fuel is used in a nuclear reactor, the assemblies are stored underwater in a spent nuclear fuel pool at the nuclear power plant while heat generation and radioactive decay rates decrease, before being placed in packages for dry storage or transportation. A computational model of a Boiling Water Reactor spent fuel assembly is built in FLUENT, the computational fluid dynamics package. Heat transfer simulations were performed on the two-dimensional 9x9 spent fuel assembly to predict the maximum cladding temperature for different inputs to the FLUENT model. Uncertainty quantification is used to predict the heat transfer and the maximum temperature profile inside the assembly.
Keywords: Spent nuclear fuel, conduction, heat transfer, uncertainty quantification.
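The FLUENT model itself cannot be reproduced here, but the kind of 2D conduction problem it solves can be sketched with a simple finite-difference iteration: a heated region inside a cooled boundary, solved for the steady temperature field and its maximum. The geometry, conductivity, and source values below are invented stand-ins:

```python
import numpy as np

n, h = 41, 0.01                  # grid points per side, spacing (m), assumed
k = 0.6                          # effective conductivity (W/m-K), assumed
q = np.zeros((n, n))
q[15:26, 15:26] = 4000.0         # decay-heat source in the central region (W/m^3)

T = np.full((n, n), 320.0)       # pool temperature (K) on boundary and initial guess
for _ in range(20000):           # Jacobi iteration for -k * laplacian(T) = q
    T[1:-1, 1:-1] = 0.25 * (T[2:, 1:-1] + T[:-2, 1:-1] +
                            T[1:-1, 2:] + T[1:-1, :-2] + h**2 * q[1:-1, 1:-1] / k)

print(f"max temperature: {T.max():.1f} K")
```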
776 Web Log Mining by an Improved AprioriAll Algorithm
Authors: Wang Tong, He Pi-lian
Abstract:
This paper sets forth the possibility and importance of applying Data Mining to Web log mining and points out some problems in conventional search engines. It then offers an improved algorithm based on the original AprioriAll algorithm, which has been widely used in Web log mining. The new algorithm adds the User ID property at every step of producing the candidate set and every step of scanning the database, using it to decide whether an item in the candidate set should be put into the large set that will be used to produce the next candidate set. Meanwhile, in order to reduce the number of database scans, the new algorithm exploits the Apriori property to limit the size of the candidate set as soon as it is produced. Test results show that the improved algorithm has lower time and space complexity, restrains noise better, and fits within the capacity of memory.
Keywords: Candidate Sets Pruning, Data Mining, Improved Algorithm, Noise Restrain, Web Log.
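The Apriori property the algorithm exploits says that every (k-1)-subset of a frequent k-itemset must itself be frequent, so candidates violating it can be pruned the moment they are generated. A minimal sketch of that generate-and-prune step (the session/User ID handling of the paper is omitted):

```python
from itertools import combinations

def generate_candidates(frequent_km1):
    """Join frequent (k-1)-itemsets, then prune by the Apriori property."""
    prev = set(frequent_km1)
    k = len(next(iter(prev))) + 1
    candidates = set()
    for a in prev:
        for b in prev:
            union = tuple(sorted(set(a) | set(b)))
            if len(union) == k and all(sub in prev
                                       for sub in combinations(union, k - 1)):
                candidates.add(union)
    return candidates

frequent_2 = [("a", "b"), ("a", "c"), ("b", "c"), ("b", "d")]
print(generate_candidates(frequent_2))
# {('a', 'b', 'c')} -- ('b', 'c', 'd') is pruned because ('c', 'd') is not frequent
```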
775 Socio-Spatial Resilience Strategic Planning Through Understanding Strategic Perspectives on Tehran and Bath
Authors: Aynaz Lotfata
Abstract:
The planning community has long been discussing emerging paradigms within planning theory in the face of the changing conditions of the world order. The paradigm shift concept was introduced by Thomas Kuhn in 1962, who claimed the necessity of shifts within the boundaries of scientific knowledge; following him, in 1970, Imre Lakatos also gave priority to the emergence of multi-paradigm societies [24]. Multi-paradigm is changing our predetermined lifeworld through uncertainties. Those uncertainties are reflected on two sides: the first is uncertainty as a concept of possibility and creativity in the public sphere, and the second is uncertainty as a risk. Therefore, it is necessary to apply a resilience planning approach so as to be more dynamic in controlling uncertainties, which have the potential to transfigure present definitions of time and space. In this way, the stability of the system can be achieved. Uncertainty is not only an outcome of worldwide changes but also a place-specific issue, i.e. it changes from continent to continent, country to country, and region to region. Therefore, applying strategic spatial planning with respect to the resilience principle contributes to controlling, grasping and internalizing uncertainties through place-specific strategies. In today's fast-changing world, the planning system should follow strategic spatial projects to manage multi-paradigm societies with adaptive capacities. Here, we have selected two cases to demonstrate this: 1. Tehran (Iran) from the Middle East, and 2. Bath (United Kingdom) from Europe. The study elaborates uncertainties and particularities in their strategic spatial planning processes in a comparative manner. Through the comparison, the study aims at assessing place-specific priorities in strategic planning. The approach is a two-way stream, in which the case cities from the extreme ends of the spectrum can learn from each other. The structure of this paper is, firstly, to compare the semi-periphery (Tehran) and core-periphery (Bath) cities, with a focus on revealing how they are equipped to face uncertainties according to their geographical locations and local particularities; and secondly, to address the key message: "Each locality requires its own strategic planning approach to be resilient."
Keywords: Adaptation, Relational Network, Socio-Spatial Strategic Resiliency, Uncertainty.
774 Metrology-Inspired Methods to Assess the Biases of Artificial Intelligence Systems
Authors: Belkacem Laimouche
Abstract:
With the field of Artificial Intelligence (AI) experiencing exponential growth, fueled by technological advancements that pave the way for increasingly innovative and promising applications, there is an escalating need to develop rigorous methods for assessing their performance in pursuit of transparency and equity. This article proposes a metrology-inspired statistical framework for evaluating bias and explainability in AI systems. Drawing from the principles of metrology, we propose a pioneering approach, using a concrete example, to evaluate the accuracy and precision of AI models, as well as to quantify the sources of measurement uncertainty that can lead to bias in their predictions. Furthermore, we explore a statistical approach for evaluating the explainability of AI systems based on their ability to provide interpretable and transparent explanations of their predictions.
Keywords: Artificial intelligence, metrology, measurement uncertainty, prediction error, bias, machine learning algorithms, probabilistic models, inter-laboratory comparison, data analysis, data reliability, bias impact assessment, bias measurement.
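In metrology terms, trueness (systematic bias) and precision (dispersion) are separate components of measurement error, and both can be computed per subgroup to surface bias in a model's predictions. A small sketch of that decomposition on invented data (the group structure and k=2 expanded-uncertainty convention are assumptions, not the paper's protocol):

```python
import numpy as np

rng = np.random.default_rng(3)
groups = np.repeat(["G1", "G2"], 500)
true_vals = rng.normal(50, 5, 1000)
# Hypothetical model: systematically overestimates for group G2.
pred = true_vals + rng.normal(0, 2, 1000) + np.where(groups == "G2", 3.0, 0.0)

for g in ["G1", "G2"]:
    err = pred[groups == g] - true_vals[groups == g]
    # Trueness ~ mean error (bias); precision ~ standard deviation of error.
    print(f"{g}: bias = {err.mean():+.2f}, precision (sd) = {err.std(ddof=1):.2f}, "
          f"expanded uncertainty (k=2) = {2 * err.std(ddof=1):.2f}")
```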
773 Properties and Approximation Distribution Reductions in Multigranulation Rough Set Model
Abstract:
Some properties of approximation sets are studied in the multi-granulation optimistic model in rough set theory using maximal compatible classes. The relationships between or among lower and upper approximations in single and multiple granulation are compared and discussed. By designing Boolean functions and discernibility matrices in incomplete information systems, the lower and upper approximation sets and reductions in multi-granulation environments can be found. Examples are used to confirm the correctness of the computation approach. The conclusions obtained are suitable for further investigation of multiple granulation rough set models.
Keywords: Incomplete information system, maximal compatible class, multi-granulation rough set model, reduction.
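Building on the single-granulation operators, the optimistic multi-granulation lower approximation admits x when at least one granulation's class of x fits inside the target set, while the upper approximation requires every granulation's class to touch it. A compact sketch under that standard definition (the universe and partitions are toy inventions):

```python
def block(x, partition):
    """Equivalence class of x in a partition (list of disjoint sets)."""
    return next(b for b in partition if x in b)

def optimistic_mg(universe, partitions, target):
    lower = {x for x in universe
             if any(block(x, P) <= target for P in partitions)}
    upper = {x for x in universe
             if all(block(x, P) & target for P in partitions)}
    return lower, upper

U = {1, 2, 3, 4, 5, 6}
P1 = [{1, 2}, {3, 4}, {5, 6}]   # granulation induced by attribute set R1 (toy)
P2 = [{1}, {2, 3}, {4, 5, 6}]   # granulation induced by attribute set R2 (toy)
print(optimistic_mg(U, [P1, P2], target={1, 3}))
# lower: {1}; upper: {1, 2, 3}
```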
772 Fuzzy Based Particle Swarm Optimization Routing Technique for Load Balancing in Wireless Sensor Networks
Authors: S. Balaji, E. Golden Julie, M. Rajaram, Y. Harold Robinson
Abstract:
Network lifetime improvement and uncertainty in multiple systems are central issues in wireless sensor network routing. This paper presents a fuzzy-based particle swarm optimization routing technique to improve network scalability. Notably, in the cluster formation procedure, a fuzzy-based system is used to handle uncertainty and network balancing. Cluster heads, chosen using the particle swarm optimization algorithm, play an important role in reducing energy consumption: each cluster head sends its information along data packets to the heads it is linked with. The simulation results show that the presented routing protocol can perform load balancing effectively and reduce the energy consumption of cluster heads.
Keywords: Wireless sensor networks, fuzzy logic, PSO, LEACH.
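The particle swarm update at the heart of such cluster-head selection is just two equations: a velocity pulled toward each particle's personal best and the global best, then a position step. A generic sketch minimizing a stand-in energy-cost function (the fuzzy clustering and WSN model of the paper are not reproduced; all constants are assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)

def energy_cost(pos):
    """Stand-in for a cluster-head placement cost; lower is better."""
    return np.sum((pos - 5.0) ** 2, axis=-1)

n_particles, dim, w, c1, c2 = 20, 2, 0.7, 1.5, 1.5
x = rng.uniform(0, 10, (n_particles, dim))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), energy_cost(x)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(200):
    r1, r2 = rng.random((2, n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = energy_cost(x)
    better = val < pbest_val
    pbest[better], pbest_val[better] = x[better], val[better]
    gbest = pbest[np.argmin(pbest_val)]

print("best position:", gbest.round(2), "cost:", float(energy_cost(gbest)))
```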
771 Uncertainty of the Brazilian Earth System Model for Solar Radiation
Authors: Elison Eduardo Jardim Bierhals, Claudineia Brazil, Deivid Pires, Rafael Haag, Elton Gimenez Rossini
Abstract:
This study evaluated the uncertainties involved in the solar radiation projections generated by the Brazilian Earth System Model (BESM) of the Weather and Climate Prediction Center (CPTEC), which belongs to the Coupled Model Intercomparison Project Phase 5 (CMIP5), with the aim of assessing the quality of the model's projections for solar radiation and thereby establishing the viability of its use. Two different scenarios elaborated by the Intergovernmental Panel on Climate Change (IPCC) were evaluated: RCP 4.5 (with more optimistic boundary conditions) and RCP 8.5 (with more pessimistic initial conditions). The methods used to verify the accuracy of the model were the Nash coefficient and the statistical bias, as these better represent such atmospheric patterns. BESM showed a tendency to overestimate solar radiation in most regions of the state of Rio Grande do Sul, and under the validation methods adopted by this study, it did not present satisfactory accuracy.
Keywords: Climate changes, projections, solar radiation, uncertainty.
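The two verification measures named here are quick to compute: the Nash-Sutcliffe efficiency compares model error against the variance of the observations (1 is perfect, 0 or below means no better than the observed mean), and bias is the mean difference. A sketch on invented series (not the study's data):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def bias(obs, sim):
    return np.mean(np.asarray(sim) - np.asarray(obs))

rng = np.random.default_rng(5)
obs = 200 + 80 * np.sin(np.linspace(0, 4 * np.pi, 48)) + rng.normal(0, 10, 48)
sim = obs * 1.08 + rng.normal(0, 15, 48)   # hypothetical over-estimating model

print(f"NSE = {nash_sutcliffe(obs, sim):.2f}, bias = {bias(obs, sim):+.1f} W/m^2")
```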
770 Comparison of Imputation Techniques for Efficient Prediction of Software Fault Proneness in Classes
Authors: Geeta Sikka, Arvinder Kaur Takkar, Moin Uddin
Abstract:
Missing data is a persistent problem in almost all areas of empirical research. Missing data must be treated very carefully, as data play a fundamental role in every analysis, and improper treatment can distort the analysis or generate biased results. In this paper, we compare and contrast various imputation techniques on data sets with missing values and make an empirical evaluation of these methods so as to construct quality software models. Our empirical study is based on two public NASA data sets, KC4 and KC1, whose actual data sets of 125 cases and 2107 cases, respectively, contain no missing values. The data sets are used to create Missing at Random (MAR) data. Listwise Deletion (LD), Mean Substitution (MS), interpolation, regression with an error term, and Expectation-Maximization (EM) approaches were used to compare the effects of the various techniques.
Keywords: Missing data, Imputation, Missing Data Techniques.
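Two of the simplest baselines in that list can be sketched directly: listwise deletion drops any case with a missing value, while mean substitution fills each hole with the column mean. The MAR masking below mimics the study's setup on invented data (the NASA KC1/KC4 features are not reproduced):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
full = pd.DataFrame({"loc": rng.integers(10, 500, 125).astype(float),
                     "complexity": rng.uniform(1, 30, 125)})

# Create MAR data: 'complexity' goes missing more often for small modules.
mar = full.copy()
p_missing = np.where(full["loc"] < 100, 0.4, 0.05)
mar.loc[rng.random(125) < p_missing, "complexity"] = np.nan

listwise = mar.dropna()               # Listwise Deletion (LD)
mean_sub = mar.fillna(mar.mean())     # Mean Substitution (MS)

for name, df in [("full", full), ("LD", listwise), ("MS", mean_sub)]:
    print(f"{name}: n = {len(df)}, mean complexity = {df['complexity'].mean():.2f}")
```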
769 Comanche – A Compiler-Driven I/O Management System
Authors: Wendy Zhang, Ernst L. Leiss, Huilin Ye
Abstract:
Most scientific programs have large input and output data sets that require out-of-core programming or the use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious; as a result, it is generally avoided. However, in many instances, VMM is not an effective approach because it often results in a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
Keywords: I/O Management, Out-of-core, Compiler, Tile mapping.
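The tile-at-a-time access pattern such a system generates can be imitated by hand: keep the array on disk and bring one block into memory at a time. A tiny sketch with numpy's memmap (this imitates the access pattern only; it is not Comanche's compiler analysis or runtime cache):

```python
import numpy as np

rows, cols, tile = 10_000, 1_000, 1_000   # a ~76 MiB float64 array, 1000-row tiles

# Create the on-disk data set once.
np.lib.format.open_memmap("data.npy", mode="w+", dtype=np.float64,
                          shape=(rows, cols))[:] = 1.0

data = np.load("data.npy", mmap_mode="r")   # nothing read into RAM yet
total = 0.0
for start in range(0, rows, tile):          # retrieve and process one tile at a time
    block = np.asarray(data[start:start + tile])  # only this tile is resident
    total += block.sum()

print(total)  # 10000000.0
```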
768 A New Approach of Fuzzy Methods for Evaluating of Hydrological Data
Authors: Nasser Shamskia, Seyyed Habib Rahmati, Hassan Haleh, Seyyedeh Hoda Rahmati
Abstract:
The main design criteria of most hydraulic constructions are essentially based on the runoff or discharge of water. Two important such measures are runoff and return period. Mostly, these measures are calculated or estimated from stochastic data. Another feature of hydrological data is their impreciseness. Therefore, in order to deal with uncertainty and impreciseness, a new fuzzy method of evaluating hydrological measures, based on Buckley's estimation method, is developed. The method introduces triangular fuzzy numbers for the different measures, in which both the uncertainty and the impreciseness concepts are considered. Besides, since another important consideration in most hydrological studies is the comparison of a measure during different months or years, a new fuzzy ranking method which is consistent with the special form of the proposed fuzzy numbers is also developed. Finally, to illustrate the methods more explicitly, the two algorithms are tested on one simple example and a real case study.
Keywords: Fuzzy Discharge, Fuzzy estimation, Fuzzy ranking method, Hydrological data.
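A triangular fuzzy number is fully described by its (left, peak, right) support, and alpha-cuts turn it into nested intervals, which is what makes comparison across months tractable. A minimal sketch of that representation (Buckley's estimator and the paper's ranking method are not reproduced; the discharge values are invented):

```python
from dataclasses import dataclass

@dataclass
class TriangularFuzzy:
    left: float    # smallest plausible value
    peak: float    # most plausible value (membership 1)
    right: float   # largest plausible value

    def alpha_cut(self, alpha):
        """Interval of values with membership >= alpha, for alpha in (0, 1]."""
        return (self.left + alpha * (self.peak - self.left),
                self.right - alpha * (self.right - self.peak))

    def centroid(self):
        """A simple defuzzified value, usable as a crude ranking index."""
        return (self.left + self.peak + self.right) / 3

june = TriangularFuzzy(80.0, 120.0, 190.0)   # hypothetical monthly discharge (m^3/s)
july = TriangularFuzzy(60.0, 100.0, 150.0)
print(june.alpha_cut(0.5))                   # (100.0, 155.0)
print("june > july:", june.centroid() > july.centroid())
```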
767 A Sociocybernetics Data Analysis Using Causality in Tourism Networks
Authors: M. Lloret-Climent, J. Nescolarde-Selva
Abstract:
The aim of this paper is to propose a mathematical model to determine invariant sets, set covering, orbits and, in particular, attractors in the set of tourism variables. The analysis was carried out based on a pre-designed algorithm, applying our interpretation of chaos theory developed in the context of General Systems Theory. This article sets out the causal relationships associated with tourist flows in order to enable the formulation of appropriate strategies. Our results can be applied to numerous cases; for example, in the analysis of tourist flows, these findings can be used to determine whether the behaviour of certain groups affects that of other groups and to analyse tourist behaviour in terms of the most relevant variables. Unlike statistical analyses that merely provide information on current data, our method uses orbit analysis to forecast, if attractors are found, the behaviour of tourist variables in the immediate future.
Keywords: Attractor, invariant set, orbits, tourist variables.
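On a finite set of variables, an orbit is simply the sequence of states reached by repeatedly applying the causal map, and an attractor is the cycle the orbit eventually falls into. A tiny sketch of detecting that cycle (the map below is an invented causal relation, not the paper's tourism data):

```python
def orbit_and_attractor(f, x0):
    """Iterate x -> f(x) from x0 on a finite state set; return (orbit, attractor)."""
    seen, orbit = {}, []
    x = x0
    while x not in seen:
        seen[x] = len(orbit)
        orbit.append(x)
        x = f(x)
    return orbit, orbit[seen[x]:]   # tail from the first repeated state = cycle

# Invented causal map over tourism-variable states A..E.
causal = {"A": "B", "B": "C", "C": "D", "D": "C", "E": "A"}
orbit, attractor = orbit_and_attractor(causal.get, "A")
print("orbit:", orbit)          # ['A', 'B', 'C', 'D']
print("attractor:", attractor)  # ['C', 'D'] -- a 2-cycle the system settles into
```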
766 The Use of Dynamically Optimised High Frequency Moving Average Strategies for Intraday Trading
Authors: Abdalla Kablan, Joseph Falzon
Abstract:
This paper is motivated by the aspect of uncertainty in financial decision making and by how artificial intelligence and soft computing, with their uncertainty-reducing aspects, can be used for algorithmic trading applications that trade at high frequency. This paper presents an optimized high frequency trading system that has been combined with various moving averages to produce a hybrid system that outperforms trading systems relying solely on moving averages. The paper optimizes an adaptive neuro-fuzzy inference system that takes both the price and its moving average as input, learns to predict price movements from training data consisting of intraday data, dynamically switches between the best performing moving averages, and decides when to buy or sell a certain currency at high frequency.
Keywords: Financial decision making, high frequency trading, adaptive neuro-fuzzy systems, moving average strategy.
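The dynamic-switching idea can be shown without the ANFIS component: keep several moving-average windows, score each on recent predictive performance, and route signals through the current best. A toy sketch on synthetic prices (the fuzzy inference layer of the paper is omitted, and all windows and scores are assumptions):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
price = pd.Series(100 + np.cumsum(rng.normal(0, 0.1, 2000)))

windows = [5, 20, 50]
mas = {w: price.rolling(w).mean() for w in windows}
ret_next = price.shift(-1) - price                 # next-step price change

def best_window(t, lookback=100):
    """Window whose crossover signal tracked returns best over the recent past."""
    scores = {}
    for w in windows:
        signal = np.sign(price - mas[w]).iloc[t - lookback:t]
        scores[w] = (signal * ret_next.iloc[t - lookback:t]).mean()
    return max(scores, key=scores.get)

pnl = 0.0
for t in range(200, len(price) - 1):
    w = best_window(t)                             # dynamically chosen MA
    pnl += np.sign(price.iloc[t] - mas[w].iloc[t]) * ret_next.iloc[t]
print(f"toy strategy PnL: {pnl:.2f}")
```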
765 Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC) within Operational Research (OR) with Sustainability and Phenomenology
Authors: Al-Salamin Hussain, Elias O. Tembe
Abstract:
Supply chain (SC) is an operational research (OR) approach and technique which acts as a catalyst within the central nervous system of business today. Without SC, any type of business is in the doldrums, hence entropy. SC is the lifeblood of business today because it is the pivotal hub which provides imperative competitive advantage. The paper presents a conceptual framework dubbed the Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC). The term homomorphic is derived from the abstract algebraic term homomorphism (same shape), which also embeds the following mathematical application sets: monomorphisms, isomorphisms, automorphisms, and endomorphisms. The HCEFSC is intertwined and integrated with wide and broad sets of elements.
Keywords: Automorphisms, Homomorphism, Monomorphisms, Supply Chain.
764 On the Noise Distance in Robust Fuzzy C-Means
Authors: M. G. C. A. Cimino, G. Frosini, B. Lazzerini, F. Marcelloni
Abstract:
In recent decades, a number of robust fuzzy clustering algorithms have been proposed to partition data sets affected by noise and outliers. Robust fuzzy C-means (robust-FCM) is certainly one of the best known among these algorithms. In robust-FCM, noise is modeled as a separate cluster characterized by a prototype that has a constant distance δ from all data points. The distance δ determines the boundary of the noise cluster and is therefore a critical parameter of the algorithm. Though some approaches have been proposed to automatically determine the most suitable δ for a specific application, an efficient and fully satisfactory solution does not exist to date. The aim of this paper is to propose a novel method to compute the optimal δ based on the analysis of the distribution of the percentage of objects assigned to the noise cluster in repeated executions of robust-FCM with decreasing values of δ. The extremely encouraging results obtained on some data sets found in the literature are shown and discussed.
Keywords: noise prototype, robust fuzzy clustering, robust fuzzy C-means.
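The role δ plays is visible in the membership update alone: the noise prototype enters the usual FCM denominator as one more "cluster" at fixed distance δ, so shrinking δ pulls more points into the noise cluster. A sketch of that computation and of scanning δ downward, in the spirit of the proposed method (cluster centers here are fixed by hand rather than learned, and the data are invented):

```python
import numpy as np

rng = np.random.default_rng(8)
pts = np.vstack([rng.normal(0, 0.5, (50, 2)),    # one tight cluster
                 rng.uniform(-6, 6, (10, 2))])   # scattered outliers
centers = np.array([[0.0, 0.0]])
m = 2.0                                          # fuzzifier

def noise_fraction(delta):
    """Fraction of points whose largest membership is to the noise cluster."""
    d = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)   # (n, c)
    d_all = np.hstack([d, np.full((len(pts), 1), delta)])               # noise col
    inv = d_all ** (-2 / (m - 1))
    u = inv / inv.sum(axis=1, keepdims=True)     # memberships incl. noise cluster
    return (u.argmax(axis=1) == d_all.shape[1] - 1).mean()

for delta in [4.0, 2.0, 1.0, 0.5]:
    print(f"delta = {delta}: {noise_fraction(delta):.0%} assigned to noise")
```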
763 Optimal Data Compression and Filtering: The Case of Infinite Signal Sets
Authors: Anatoli Torokhti, Phil Howlett
Abstract:
We present a theory for the optimal filtering of infinite sets of random signals. There are several new distinctive features of the proposed approach. First, we provide a single optimal filter for processing any signal from a given infinite signal set. Second, the filter is presented in the special form of a sum with p terms, where each term is represented as a combination of three operations. Each operation is a special stage of the filtering aimed at facilitating the associated numerical work. Third, an iterative scheme is implemented into the filter structure to provide an improvement in the filter performance at each step of the scheme. The final step of the scheme concerns signal compression and decompression. This step is based on the solution of a new rank-constrained matrix approximation problem. The solution to the matrix problem is described in this paper. A rigorous error analysis is given for the new filter.
Keywords: stochastic signals, optimization problems in signal processing.
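Rank-constrained matrix approximation in the Frobenius norm has a classical solution, the truncated SVD (Eckart-Young). Whether the paper's new constrained problem reduces to it is not stated in the abstract, but the classical case illustrates what compression to rank r means:

```python
import numpy as np

rng = np.random.default_rng(9)
A = rng.normal(size=(50, 30))

def best_rank_r(A, r):
    """Closest rank-r matrix to A in the Frobenius norm (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r] * s[:r] @ Vt[:r, :]

for r in (5, 15, 30):
    err = np.linalg.norm(A - best_rank_r(A, r)) / np.linalg.norm(A)
    print(f"rank {r}: relative error = {err:.3f}")
```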
762 Generic Filtering of Infinite Sets of Stochastic Signals
Authors: Anatoli Torokhti, Phil Howlett
Abstract:
A theory for the optimal filtering of infinite sets of random signals is presented. There are several new distinctive features of the proposed approach. First, a single optimal filter for processing any signal from a given infinite signal set is provided. Second, the filter is presented in the special form of a sum with p terms, where each term is represented as a combination of three operations. Each operation is a special stage of the filtering aimed at facilitating the associated numerical work. Third, an iterative scheme is implemented into the filter structure to provide an improvement in the filter performance at each step of the scheme. The final step of the scheme concerns signal compression and decompression. This step is based on the solution of a new rank-constrained matrix approximation problem. The solution to the matrix problem is described in this paper. A rigorous error analysis is given for the new filter.
Keywords: Optimal filtering, data compression, stochastic signals.