Search results for: statistical models of speech recognition
3409 Energy Performance of Buildings Due to Downscaled Seasonal Models
Authors: Anastasia K. Eleftheriadou, Athanasios Sfetsos, Nikolaos Gounaris
Abstract:
The current paper presents an extensive bottom-up framework for assessing the vulnerability of the building sector to climate change in terms of energy supply and demand. The research focuses on the application of downscaled seasonal models for estimating the energy performance of buildings in Greece. The ARW-WRF model has been set up and suitably parameterized to produce downscaled climatological fields for Greece, forced by the output of the CFSv2 model. The outer domain, D01/Europe, comprised 345 x 345 cells at a horizontal resolution of 20 x 20 km2, and the inner domain, D02/Greece, comprised 180 x 180 cells at a horizontal resolution of 5 x 5 km2. The model run was set up with a forecast horizon of 6 months, storing outputs on a six-hourly basis.
Keywords: Urban environment, vulnerability, climate change, energy performance, seasonal forecast models.
3408 Evaluation of the Mechanical Behavior of a Retaining Wall Structure on a Weathered Soil through Probabilistic Methods
Authors: P. V. S. Mascarenhas, B. C. P. Albuquerque, D. J. F. Campos, L. L. Almeida, V. R. Domingues, L. C. S. M. Ozelim
Abstract:
Retaining slope structures are increasingly considered in geotechnical engineering projects due to the extensive growth of urban areas. These kinds of engineering constructions may present instabilities over time and may require reinforcement or even rebuilding of the structure. In this context, statistical analysis is an important tool for decision making regarding retaining structures. This study addresses the failure probability of constructing a retaining wall over the debris of an old, collapsed one. The new wall will be approximately 350 m long and will be located on the margins of Lake Paranoá in Brasília, the capital of Brazil. The building process must also account for the use of the ruins as a caisson. A series of in situ and laboratory experiments defined the local soil strength parameters. A Standard Penetration Test (SPT) defined the in situ soil stratigraphy. The parameters obtained were also verified using soil data from a collection of master's and doctoral works from the University of Brasília, whose soils are similar to the local soil. Initial studies show that the concrete wall is the proper solution for this case, taking into account technical, economic, and deterministic analyses. On the other hand, in order to better analyze the statistical significance of the factors of safety obtained, a Monte Carlo analysis was performed for the concrete wall and two other initial solutions. A comparison between the statistical and risk results generated for the different solutions indicated that a gabion solution would better fit the financial and technical feasibility of the project.
Keywords: Economic analysis, probability of failure, retaining walls, statistical analysis.
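As an illustration of the Monte Carlo approach described in this abstract, the following sketch estimates a probability of failure by sampling soil strength parameters; the distributions, the limit-state function, and all numerical values are assumptions made only for demonstration, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

# Hypothetical soil strength parameters (illustrative values only)
cohesion = rng.lognormal(mean=np.log(20.0), sigma=0.25, size=n_sim)   # kPa
friction_angle = rng.normal(loc=30.0, scale=2.0, size=n_sim)          # degrees

# Placeholder limit-state: factor of safety as a simple function of the samples
resisting = cohesion * 1.5 + np.tan(np.radians(friction_angle)) * 100.0
driving = 120.0                                                        # assumed constant demand
fos = resisting / driving

p_failure = np.mean(fos < 1.0)          # fraction of simulations with FoS below 1
print(f"Estimated probability of failure: {p_failure:.4f}")
print(f"Mean factor of safety: {fos.mean():.2f}")
```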
3407 Inferring the Dynamics of “Hidden” Neurons from Electrophysiological Recordings
Authors: Valeri A. Makarov, Nazareth P. Castellanos
Abstract:
Statistical analysis of electrophysiological recordings obtained under, e.g., tactile stimulation frequently suggests the participation in the network dynamics of experimentally unobserved “hidden” neurons. Such interneurons, making synapses onto experimentally recorded neurons, may strongly alter their dynamical responses to the stimuli. We propose a mathematical method that formalizes this possibility and provides an algorithm for inferring the presence and dynamics of hidden neurons based on fitting the experimental data to spike trains generated by the network model. The model makes use of integrate-and-fire neurons “chemically” coupled through exponentially decaying synaptic currents. We test the method on simulated data and also provide an example of its application to an experimental recording from Dorsal Column Nuclei neurons of the rat under tactile stimulation of a hind limb.
Keywords: Integrate and fire neuron, neural network models, spike trains.
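A minimal sketch of the kind of network model the abstract describes: leaky integrate-and-fire neurons coupled through exponentially decaying synaptic currents, with a "hidden" unit driving an observed one. The two-neuron topology and all parameter values are illustrative assumptions, not the authors' fitted model.

```python
import numpy as np

dt, T = 0.1, 1000.0                       # time step and duration (ms)
tau_m, tau_s = 20.0, 5.0                  # membrane and synaptic time constants (ms)
v_th, v_reset = 1.0, 0.0                  # spike threshold and reset (arbitrary units)
w_hidden_to_obs = 0.6                     # assumed synaptic weight

rng = np.random.default_rng(0)
v = np.zeros(2)                           # membrane potentials: [hidden, observed]
i_syn = 0.0                               # exponentially decaying synaptic current onto the observed cell
spikes = [[], []]

for k in range(int(T / dt)):
    t = k * dt
    drive = np.array([0.07 + 0.02 * rng.standard_normal(), 0.03])    # external (stimulus-like) input
    i_syn -= dt * i_syn / tau_s                                       # synaptic current decay
    v += dt * (-v / tau_m + drive + np.array([0.0, i_syn]))
    for n in range(2):
        if v[n] >= v_th:                  # threshold crossing: record spike and reset
            spikes[n].append(t)
            v[n] = v_reset
            if n == 0:                    # a hidden-neuron spike injects current into the observed one
                i_syn += w_hidden_to_obs

print(f"hidden spikes: {len(spikes[0])}, observed spikes: {len(spikes[1])}")
```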
3406 Wind Fragility of Window Glass in 10-Story Apartment with Two Different Window Models
Authors: Viriyavudh Sim, WooYoung Jung
Abstract:
Damage due to high wind is not limited to load-resisting components such as beams and columns. The majority of damage is due to breaches in the building envelope, such as broken roofs, windows, and doors. In this paper, the wind fragility of window glass in a residential apartment building was determined to compare the difference between two window configuration models. The Monte Carlo simulation method was used to derive damage data, and analytical fragilities were constructed. The fragility of the window system showed that windows located in the leeward wall had a higher probability of failure, especially those close to the edge of the structure. Between the two window models, Model 2 had a higher probability of failure, owing to the number of panels in this configuration.
Keywords: Wind fragility, glass window, high rise apartment, Monte Carlo Simulation method.
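A sketch of how an analytical fragility curve of the kind mentioned above can be constructed: Monte Carlo samples of panel capacity are compared against wind demand levels to obtain failure fractions, and a lognormal fragility curve is fitted to them. The capacity statistics and wind speeds below are assumed values for illustration only.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
wind_speeds = np.arange(20, 81, 5)                    # m/s, assumed demand levels
capacity = rng.lognormal(np.log(50.0), 0.2, 10_000)   # assumed glass-panel capacity (m/s)

# empirical failure probability at each wind speed
p_fail = np.array([(capacity < v).mean() for v in wind_speeds])

# fit a lognormal fragility curve P(v) = Phi(ln(v / theta) / beta)
def fragility(v, theta, beta):
    return norm.cdf(np.log(v / theta) / beta)

(theta, beta), _ = curve_fit(fragility, wind_speeds, p_fail, p0=(50.0, 0.2))
print(f"median capacity theta = {theta:.1f} m/s, dispersion beta = {beta:.3f}")
```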
3405 The Data Mining usage in Production System Management
Authors: Pavel Vazan, Pavol Tanuska, Michal Kebisek
Abstract:
The paper presents the pilot results of a project oriented towards the use of data mining techniques and the knowledge discovered from production systems through them. This knowledge has been used in the management of these systems. Simulation models of manufacturing systems have been developed to obtain the necessary data about production. The authors have developed a way of storing the data obtained from the simulation models in a data warehouse. A data mining model has been created by using specific methods and selected techniques for defined problems of production system management. The new knowledge has been applied to the production management system and tested on simulation models of the production system. An important benefit of the project has been the proposal of a new methodology, focused on data mining from the databases that store operational data about the production process.
Keywords: data mining, data warehousing, management of production system, simulation
3404 Fingerprint Identification using Discretization Technique
Authors: W. Y. Leng, S. M. Shamsuddin
Abstract:
Fingerprint-based identification is a well-known biometric approach in the area of pattern recognition and has long been studied for its important role in forensic science, where it can help the government criminal justice community. In this paper, we propose a framework for identifying individuals by means of fingerprints. Differently from most conventional fingerprint identification frameworks, the extracted geometrical element features (GEFs) go through a discretization process. The intention of discretization in this study is to obtain individually unique features that reflect individual variability, in order to discriminate one person from another. Previously, discretization has been shown to give particularly efficient identification on English handwriting, with an accuracy of 99.9%, and on the discrimination of twins' handwriting, with an accuracy of 98%. Due to its high discriminative power, this method is adopted into this framework as an independent method for assessing the accuracy of fingerprint identification. Finally, the experimental results show that the identification accuracy of the proposed system using discretization is 100% for FVC2000, 93% for FVC2002, and 89.7% for FVC2004, which is much better than conventional or existing fingerprint identification systems (72% for FVC2000, 26% for FVC2002, and 32.8% for FVC2004). The results indicate that the discretization approach manages to boost classification effectively and therefore proves to be suitable for other biometric features besides handwriting and fingerprints.
Keywords: Discretization, fingerprint identification, geometrical features, pattern recognition
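A minimal sketch of the discretization idea applied to extracted feature vectors: continuous geometrical features are mapped to discrete bins per feature. The equal-width binning rule and the feature values are illustrative assumptions; the paper's actual discretization scheme may differ.

```python
import numpy as np

def discretize(features, n_bins=8):
    """Map each continuous feature column to integer bin indices (equal-width bins)."""
    lo = features.min(axis=0)
    hi = features.max(axis=0)
    width = (hi - lo) / n_bins
    width[width == 0] = 1.0                      # guard against constant columns
    codes = np.floor((features - lo) / width).astype(int)
    return np.clip(codes, 0, n_bins - 1)

# hypothetical geometrical element features for three fingerprints
gef = np.array([[12.3, 0.87, 45.0],
                [11.9, 0.91, 47.5],
                [15.2, 0.60, 30.1]])
print(discretize(gef))
```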
3403 Optimal Calculation of Partial Transmission Ratios of Four-Step Helical Gearboxes for Getting Minimal Gearbox Length
Authors: Vu Ngoc Pi
Abstract:
This paper presents a new study on the application of optimization and regression analysis techniques for the optimal calculation of the partial ratios of four-step helical gearboxes to obtain minimal gearbox length. In the paper, based on the moment equilibrium condition of a mechanical system including four gear units and their regular resistance condition, models for determining the partial ratios of the gearboxes are proposed. In particular, explicit models for calculating the partial ratios are proposed by using regression analysis. Using these models, the determination of the partial ratios is accurate and simple.
Keywords: Gearbox design, optimal design, helical gearbox, transmission ratio.
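The optimal split of a total transmission ratio into partial ratios can be posed as a constrained minimization, as sketched below. The length function used here is a deliberately simplified placeholder and the total ratio and bounds are assumed values; the paper derives its own explicit models from moment equilibrium and regression analysis, which are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

u_total = 100.0   # assumed total transmission ratio of the four-step gearbox

def gearbox_length(u):
    # Placeholder length model: grows with each partial ratio (illustrative only)
    return np.sum(1.0 + 0.5 * u ** 0.8)

cons = {"type": "eq", "fun": lambda u: np.prod(u) - u_total}   # u1*u2*u3*u4 = u_total
bounds = [(1.1, 8.0)] * 4                                      # assumed feasible range per stage
x0 = np.full(4, u_total ** 0.25)                               # start from an even split

res = minimize(gearbox_length, x0, bounds=bounds, constraints=cons)
print("partial ratios u1..u4:", np.round(res.x, 3), "product:", round(float(np.prod(res.x)), 2))
```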
3402 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation
Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski
Abstract:
Edgeworth approximation is one of the most important statistical methods, with a considerable contribution to reducing the standard deviations of the independent variables' coefficients in a quantile regression model. Such a model estimates the conditional median or other quantiles. In this paper, we apply these approximating statistical methods to an economic problem. We have created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products, using real data taken from a local business. A linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is why we used this model instead of linear regression. Our aim is to analyze in more detail the relation between the variables under study, the profit and the realized sales, and how to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods that we apply in our work are the Edgeworth approximation for independent and identically distributed (IID) cases, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and the results presented here identify the best approximating model of our study.
Keywords: Bootstrap, Edgeworth approximation, independent and identically distributed, quantile.
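A minimal sketch of the quantile regression and bootstrap steps described above, using statsmodels; the synthetic sales/profit data and the number of bootstrap replications are assumptions made only to show the mechanics, and the Edgeworth correction itself is not implemented here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 200
sales = rng.uniform(100, 1000, n)                        # hypothetical realized sales
profit = 0.3 * sales + rng.normal(0, 0.05 * sales, n)    # heteroscedastic noise
df = pd.DataFrame({"sales": sales, "profit": profit})

# median (q = 0.5) regression of profit on sales
fit = smf.quantreg("profit ~ sales", df).fit(q=0.5)
print(fit.params)

# bootstrap the slope to estimate its standard error
boot_slopes = []
for _ in range(500):
    sample = df.sample(n=n, replace=True)
    boot_slopes.append(smf.quantreg("profit ~ sales", sample).fit(q=0.5).params["sales"])
print("bootstrap SE of slope:", np.std(boot_slopes))
```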
3401 Continuous Threshold Prey Harvesting in Predator-Prey Models
Authors: Jonathan Bohn, Jorge Rebaza, Kaitlin Speer
Abstract:
The dynamics of a predator-prey model with continuous threshold policy harvesting functions on the prey is studied. Theoretical and numerical methods are used to investigate boundedness of solutions, existence of bionomic equilibria, and the stability properties of coexistence equilibrium points and periodic orbits. Several bifurcations as well as some heteroclinic orbits are computed.
Keywords: Predator-prey models, threshold harvesting, dynamical systems
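A sketch of a predator-prey system with a continuous threshold harvesting function on the prey, integrated numerically; the specific functional forms and all parameter values below are illustrative assumptions, not the exact model analyzed in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# assumed parameter values
r, K = 1.0, 10.0          # prey growth rate and carrying capacity
a, b, m = 0.5, 0.3, 0.4   # predation, conversion, and predator mortality
T, h, c = 4.0, 1.0, 1.0   # harvesting threshold, maximum rate, half-saturation

def harvest(x):
    """Continuous threshold policy: no harvesting below T, saturating above it."""
    return 0.0 if x <= T else h * (x - T) / (c + x - T)

def rhs(t, z):
    x, y = z                                  # prey, predator
    dx = r * x * (1 - x / K) - a * x * y - harvest(x)
    dy = b * a * x * y - m * y
    return [dx, dy]

sol = solve_ivp(rhs, (0, 200), [5.0, 2.0], max_step=0.1)
print("final state (prey, predator):", sol.y[:, -1].round(3))
```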
3400 Effect of Increasing Road Light Luminance on Night Driving Performance of Older Adults
Authors: Said M. Easa, Maureen J. Reed, Frank Russo, Essam Dabbour, Atif Mehmood, Kathryn Curtis
Abstract:
The main objective of this study was to determine if a minimal increase in road light level (luminance) could lead to improved driving performance among older adults. Older, middle-aged, and younger adults were tested in a driving simulator following vision and cognitive screening. Comparisons were made for the performance of simulated night driving under two road light conditions (0.6 and 2.5 cd/m2). At each light level, the effects of self-reported night driving avoidance were examined along with vision/cognitive performance. It was found that increasing the road light level from 0.6 cd/m2 to 2.5 cd/m2 resulted in improved recognition of signage on straight highway segments. The improvement depends on different driver-related factors such as vision and cognitive abilities, and confidence. On curved road sections, the results showed that drivers' performance worsened. It is concluded that while increasing road lighting may be helpful to older adults, especially for sign recognition, it may also result in increased driving confidence and thus reduced attention in some driving situations.
Keywords: Driving, older adults, night-time, road lighting, attention, simulation, curves, signs.
3399 Reliability of Digital FSO Links in Europe
Authors: Zdenek Kolka, Otakar Wilfert, Viera Biolkova
Abstract:
The paper deals with an analysis of visibility records collected from 210 European airports to obtain a realistic estimation of the availability of Free Space Optical (FSO) data links. Commercially available optical links usually operate in the 850 nm waveband, so the influence of the atmosphere on the optical beam is similar to its influence on visible light. Long-term visibility records therefore represent an invaluable source of data for estimating the quality of service of FSO links. The model used characterizes both the statistical properties of fade depths and the statistical properties of individual fade durations. Results are presented for Italy, France, and Germany.
Keywords: Computer networks, free-space optical links, meteorology, quality of service.
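A sketch of how long-term visibility records can be turned into an availability estimate for an FSO link: attenuation is derived from visibility via a Kruse-type relation and compared against an assumed link margin. The wavelength, margin, link length, and synthetic visibility data are illustrative assumptions, not the paper's model or airport data.

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic hourly visibility record in km (stand-in for airport visibility data)
visibility = np.clip(rng.lognormal(np.log(8.0), 0.9, 24 * 365), 0.05, 50.0)

wavelength_nm = 850.0
link_km = 1.0
margin_db = 20.0            # assumed link margin available for atmospheric loss

def specific_attenuation_db_per_km(v_km, lam_nm=wavelength_nm):
    """Kruse-type visibility-to-attenuation relation (one of several published variants)."""
    q = np.where(v_km > 50, 1.6, np.where(v_km > 6, 1.3, 0.585 * v_km ** (1 / 3)))
    return (17.0 / v_km) * (lam_nm / 550.0) ** (-q)

loss_db = specific_attenuation_db_per_km(visibility) * link_km
availability = np.mean(loss_db <= margin_db)
print(f"Estimated link availability: {availability * 100:.2f}% of hours")
```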
3398 Variability of Hydrological Modeling of the Blue Nile
Authors: Abeer Samy, Oliver C. Saavedra Valeriano, Abdelazim Negm
Abstract:
The Blue Nile Basin is the most important tributary of the Nile River. Egypt and Sudan are almost entirely dependent on water originating from the Blue Nile. This multi-dependency creates conflicts among the three countries Egypt, Sudan, and Ethiopia, making the management of these conflicts an international issue. A good assessment of the water resources of the Blue Nile is important to help manage such conflicts, and hydrological models are a good tool for such an assessment. This paper presents a critical review of the nature and variability of the climate and hydrology of the Blue Nile Basin as a first step towards using hydrological modeling to assess the water resources of the Blue Nile. Several attempts have been made to develop basin-scale hydrological models of the Blue Nile. Lumped and semi-distributed models used averages of meteorological inputs and watershed characteristics in hydrological simulation to analyze runoff for flood control and water resource management. Distributed models include the temporal and spatial variability of catchment conditions and meteorological inputs, allowing better representation of the hydrological processes. The main challenge for all of the models used to assess the water resources of the basin is the shortage of data needed for model calibration and validation. It is recommended to use distributed models, for their higher accuracy, to cope with the great variability and complexity of the Blue Nile Basin, and to collect sufficient data to enable more sophisticated and accurate hydrological modeling.
Keywords: Blue Nile Basin, Climate Change, Hydrological Modeling, Watershed.
3397 A Comparative Study of Force Prediction Models during Static Bending Stage for 3-Roller Cone Frustum Bending
Authors: Mahesh Chudasama, Harit Raval
Abstract:
Conical sections and shells of metal plates manufactured by the 3-roller conical bending process are widely used in industry. The process is completed by first bending the metal plates statically and then roller-bending them dynamically in sequence. An analytical model for the maximum bending force in the static bending stage is required for the optimum design of the machine. Analytical models assuming various stress conditions are considered, and these models are compared with respect to various parameters and reported in this paper. It is concluded from the study that for higher bottom roller inclination the shear stress greatly affects the static bending force, whereas for lower bottom roller inclination it can be neglected.
Keywords: Roller-bending, static-bending, stress-conditions, analytical-modeling.
3396 Measurement of Small PD-S in Compressed SF6(10%) - N2(90%) Gas Mixture
Authors: B. Rajesh Kamath, J. Sundara Rajan
Abstract:
Partial discharge measurement is a very important means of assessing the integrity of insulation systems in high voltage apparatus. In compressed gas insulation systems, floating particles can initiate partial discharge activities which adversely affect the working of the insulation. Partial discharges below the inception voltage also play a crucial role in damaging the integrity of the insulation over a period of time. This paper discusses the effect of loose and fixed copper and nichrome wire particles on the PD characteristics in SF6-N2 (10:90) gas mixtures at a pressure of 0.4 MPa. The partial discharge statistical parameters and their correlation to the observed results are discussed.
Keywords: Gas insulated transmission line, sulphur hexafluoride, metallic particles, partial discharge (PD), inception voltage (Vi), extinction voltage (Ve), PD statistical parameters.
3395 Prediction of Basic Wind Speed for Ayeyarwady
Authors: Chaw Su Mon
Abstract:
The paper presents a preliminary study on the modeling and estimation of basic wind speed (extreme wind gusts) for the consideration of vulnerability and the design of buildings in Ayeyarwady Region. The establishment of appropriate design wind speeds is a critical step towards the calculation of design wind loads for structures. In this paper, the extreme value analysis of this prediction work is based on the anemometer data (1970-2009) maintained by the Department of Meteorology and Hydrology of Pathein. Statistical and probabilistic approaches are used to derive formulas for estimating 3-second gusts from the recorded data (10-minute sustained mean wind speeds).
Keywords: Basic Wind Speed, Building, Gusts, Statistical and probabilistic approaches
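A sketch of the extreme value step: annual maximum wind speeds are fitted with a Gumbel distribution and a return-period value is read off. The synthetic annual maxima and the 3-second gust conversion factor are assumptions for illustration; the paper derives its own conversion from the recorded 10-minute means.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(5)
# stand-in for annual maximum 10-minute mean wind speeds, 1970-2009 (m/s)
annual_max_10min = rng.gumbel(loc=22.0, scale=3.5, size=40)

loc, scale = gumbel_r.fit(annual_max_10min)

return_period = 50.0                              # years
v_10min_50yr = gumbel_r.ppf(1.0 - 1.0 / return_period, loc, scale)

gust_factor = 1.5                                 # assumed 10-min mean to 3-s gust conversion
print(f"50-year 10-min mean: {v_10min_50yr:.1f} m/s, "
      f"basic (3-s gust) wind speed: {gust_factor * v_10min_50yr:.1f} m/s")
```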
3394 A Decision Boundary based Discretization Technique using Resampling
Authors: Taimur Qureshi, Djamel A Zighed
Abstract:
Many supervised induction algorithms require discrete data, even though real data often comes in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy, and understandability of the induction models. Usually, discretization and other types of statistical processes are applied to subsets of the population, as the entire population is practically inaccessible. For this reason, we argue that a discretization performed on a sample of the population is only an estimate of the discretization of the entire population. Most of the existing discretization methods partition the attribute range into two or several intervals using a single or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points, thus improving the discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is to observe whether the resampling technique can lead to better discretization points, which opens up a new paradigm for the construction of soft decision trees.
Keywords: Bootstrap, discretization, resampling, soft decision trees.
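A minimal sketch of the resampling idea: each bootstrap replicate of a labeled continuous attribute proposes a candidate cut point (here, the midpoint between the two class means), and the collection of candidates characterizes the uncertainty of the split. The class-mean-midpoint rule is a simplification standing in for whatever splitting criterion the actual method uses, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(11)
# hypothetical continuous attribute with two classes
x = np.concatenate([rng.normal(2.0, 1.0, 100), rng.normal(5.0, 1.0, 100)])
y = np.concatenate([np.zeros(100), np.ones(100)])

def cut_point(xs, ys):
    """Candidate discretization point: midpoint between the two class means."""
    return 0.5 * (xs[ys == 0].mean() + xs[ys == 1].mean())

n_boot = 1000
candidates = np.empty(n_boot)
for b in range(n_boot):
    idx = rng.integers(0, len(x), len(x))        # bootstrap resample with replacement
    candidates[b] = cut_point(x[idx], y[idx])

print(f"cut point: {candidates.mean():.3f} "
      f"(95% interval {np.percentile(candidates, 2.5):.3f}-{np.percentile(candidates, 97.5):.3f})")
```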
3393 Suitability of Black Box Approaches for the Reliability Assessment of Component-Based Software
Authors: Anjushi Verma, Tirthankar Gayen
Abstract:
Although reliability is an important attribute of quality, especially for mission-critical systems, there does not exist any versatile model even today for the reliability assessment of component-based software. The existing black box models are found to make various assumptions which may not always be realistic and may be quite contrary to the actual behaviour of software. They focus on observing the manner in which the system behaves without considering the structure of the system, the components composing the system, their interconnections, dependencies, usage frequencies, etc. As a result, the entropy (uncertainty) in assessment using these models is quite high. Though there are some models based on the operational profile, it sometimes becomes extremely difficult to obtain the exact operational profile concerned with a given operation. This paper discusses the drawbacks, deficiencies and limitations of black box approaches from the perspective of various authors and finally proposes a conceptual model for the reliability assessment of software.
Keywords: Black Box, faults, failure, software reliability.
3392 Operations Research Applications in Audit Planning and Scheduling
Authors: Abdel-Aziz M. Mohamed
Abstract:
This paper presents a state-of-the-art survey of the operations research models developed for internal audit planning. Two alternative approaches have been followed in the literature for audit planning: (1) identifying the optimal audit frequency; and (2) determining the optimal audit resource allocation. The first approach identifies the elapsed time between two successive audits, which can be presented as the optimal number of audits in a given planning horizon, or the optimal number of transactions after which an audit should be performed. It also includes the optimal audit schedule. The second approach determines the optimal allocation of audit frequency among all auditable units in the firm. In our review, we discuss both the deterministic and probabilistic models developed for audit planning. In addition, game theory models are reviewed to find the optimal auditing strategy based on the interactions between the auditors and the clients.
Keywords: Operations research applications, audit frequency, audit planning, audit-staff scheduling.
3391 The Effect of Damping Treatment for Noise Control on Offshore Platforms Using Statistical Energy Analysis
Authors: Ji Xi, Cheng Song Chin, Ehsan Mesbahi
Abstract:
Structure-borne noise is an important aspect of the offshore platform sound field. It can be generated directly by the mechanical forces induced by vibrating machinery, indirectly by the excitation of the structure, or by excitation from incident airborne noise. Therefore, limiting the transmission of vibration energy throughout the offshore platform is the key to controlling structure-borne noise. This is usually done by introducing damping treatment to the steel structures. Two types of damping treatment used onboard are presented. By conducting a Statistical Energy Analysis (SEA) simulation on a jack-up rig, the noise levels in the source room, the neighboring rooms, and remote living quarter cabins are compared before and after the damping treatments are applied. The results demonstrate that, in the rooms neighboring the source and in the living quarter area, there is a significant noise reduction with the damping treatment applied, whereas in the source room, where airborne sound predominates over structure-borne sound, the impact is not obvious. Conclusions on effective damping treatment for offshore platforms are drawn, which enable acoustic professionals to implement noise control during the design stage for offshore crews' hearing protection and habitability comfort.
Keywords: Statistical energy analysis, damping treatment, noise control, offshore platform.
3390 Improvement of Water Distillation Plant by Using Statistical Process Control System
Authors: Qasim Kriri, Harsh B. Desai
Abstract:
Water supply and sanitation in Saudi Arabia are characterized by both difficulties and accomplishments. One of the fundamental difficulties is water scarcity. In order to overcome water scarcity, significant investments have been undertaken in seawater desalination, water distribution, sewerage, and wastewater treatment. The purpose of Statistical Process Control (SPC) is to determine whether the performance of a process is maintaining an acceptable quality level (AQL). SPC is an analytical decision-making method. A fundamental tool in SPC is the control chart, which tracks the variability in the measurements of the product quality attributes. By using the appropriate chart, management can decide whether changes should be made in order to keep the process in control. The two most important quality factors in the distilled water which were taken into consideration were pH (potential of hydrogen) and TDS (total dissolved solids). Quality checks were done at three stages: (1) water at the source, (2) water after chemical treatment, and (3) water sent for packing. The upper specification limit, central line, and lower specification limit are taken as per Saudi water standards. The process capability to achieve the specifications set for the quality attributes of the Berain Water Factory is the focus of the proposed SPC system.
Keywords: Acceptable quality level, statistical quality control, control charts, process charts.
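A sketch of the basic control chart computation behind the approach described above: for a stream of pH measurements, the center line and three-sigma control limits are computed and out-of-control points flagged. The sample data and limits here are illustrative, not the Saudi standard specification limits used in the study, and the plain three-sigma rule is a simplification (an individuals chart would typically use the average moving range).

```python
import numpy as np

rng = np.random.default_rng(9)
ph = rng.normal(7.0, 0.08, 120)          # hypothetical pH readings from one stage
ph[40] = 7.45                            # inject an out-of-control reading

center = ph.mean()
sigma = ph.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma   # three-sigma control limits

out_of_control = np.where((ph > ucl) | (ph < lcl))[0]
print(f"CL = {center:.3f}, UCL = {ucl:.3f}, LCL = {lcl:.3f}")
print("out-of-control samples at indices:", out_of_control)
```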
3389 A CFD Analysis of Hydraulic Characteristics of the Rod Bundles in the BREST-OD-300 Wire-Spaced Fuel Assemblies
Authors: Dmitry V. Fomichev, Vladimir I. Solonin
Abstract:
This paper presents the findings from a numerical simulation of the flow in 37-rod fuel assembly models spaced by a double-wire trapezoidal wrapping, as applied to the BREST-OD-300 experimental nuclear reactor. Data on the static pressure distribution within the models, and equations for determining the fuel bundle flow friction factors, have been obtained. Recommendations are provided on using the turbulence closure models available in ANSYS Fluent. A comparative analysis has been performed against the existing empirical equations for determining the flow friction factors, and the fit between the calculated and experimental data has been shown.
An analysis of the experimental data and the results of the numerical simulation of the BREST-OD-300 fuel rod assembly hydrodynamic performance is presented.
Keywords: BREST-OD-300, wire spacers, fuel assembly, computational fluid dynamics.
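A sketch of how a bundle friction factor can be recovered from a computed axial static pressure drop via the Darcy-Weisbach relation; the hydraulic diameter, velocity, density, and pressure values below are placeholders, not results from the simulation.

```python
def friction_factor(dp_pa, length_m, d_hyd_m, rho, velocity):
    """Darcy friction factor from the axial static pressure drop (Darcy-Weisbach)."""
    return 2.0 * dp_pa * d_hyd_m / (length_m * rho * velocity ** 2)

# placeholder values for a rod-bundle subchannel (illustrative only)
f = friction_factor(dp_pa=40_000.0, length_m=1.1, d_hyd_m=0.0095, rho=10_500.0, velocity=1.8)
print(f"friction factor = {f:.4f}")
```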
3388 Mean Velocity Modeling of Open-Channel Flow with Submerged Rigid Vegetation
Authors: M. Morri, A. Soualmia, P. Belleudy
Abstract:
Vegetation affects the mean and turbulent flow structure and may increase flood risks and sediment transport. Therefore, it is important to develop analytical approaches for the bed shear stress on a vegetated bed, to predict the resistance caused by vegetation. In recent years, both experimental and numerical models have been developed to model the effects of submerged vegetation on open-channel flow. In this paper, different analytic models are compared and tested using deviation criteria to explore their capacity for predicting the mean velocity and to select a suitable one to be applied to real rivers. The comparison between data measured in a vegetated flume and the simulated mean velocities indicated good performance in the case of rigid vegetation; the Huthoff model shows the best agreement, with a high coefficient of determination (R2 = 80%) and the smallest error in the prediction of the average velocities.
Keywords: Analytic Models, Comparison, Mean Velocity, Vegetation.
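A sketch of the deviation-criteria comparison described above: predicted mean velocities from competing models are scored against measurements with the coefficient of determination and root-mean-square error. The measured and predicted values below are placeholders, not the flume data or the models compared in the paper.

```python
import numpy as np

def r_squared(measured, predicted):
    ss_res = np.sum((measured - predicted) ** 2)
    ss_tot = np.sum((measured - measured.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def rmse(measured, predicted):
    return np.sqrt(np.mean((measured - predicted) ** 2))

# placeholder mean velocities (m/s) above a vegetated bed
measured = np.array([0.12, 0.18, 0.25, 0.33, 0.40, 0.46])
models = {
    "Model A": np.array([0.10, 0.17, 0.27, 0.35, 0.43, 0.50]),
    "Model B": np.array([0.14, 0.19, 0.24, 0.31, 0.38, 0.44]),
}
for name, pred in models.items():
    print(f"{name}: R2 = {r_squared(measured, pred):.3f}, RMSE = {rmse(measured, pred):.3f} m/s")
```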
3387 Microgrid: Low Power Network Topology and Control
Authors: Amit Sachan
Abstract:
Network design and data modeling are the two significant research tasks needed to support power control of a microgrid using IEC 61850 data models and services. The current coverage areas of IEC 61850 include communication within substation automation systems, between substations, and to DERs. So, for LV microgrid power control using the IEC 61850 services to control smart electrical devices, we have to model those devices as IEC 61850 data models and design a network topology to support seamless communication among those devices. In addition, though IEC 61850 assists such modeling in part by providing several object models for common functions such as measurement, metering, and monitoring, there are still certain missing pieces for building a variety of functions for household appliances, such as tuning the temperature of an electric heater or a refrigerator.
Keywords: IEC 61850, RCMC, HCMC, DER Unit Controller.
3386 Empirical Statistical Modeling of Rainfall Prediction over Myanmar
Authors: Wint Thida Zaw, Thinn Thu Naing
Abstract:
One of the essential sectors of the Myanmar economy is agriculture, which is sensitive to climate variation. The most important climatic element which impacts the agriculture sector is rainfall, so rainfall prediction becomes an important issue in an agricultural country. Multi-variable polynomial regression (MPR) provides an effective way to describe complex nonlinear input-output relationships so that an outcome variable can be predicted from one or more predictors. In this paper, the modeling of monthly rainfall prediction over Myanmar is described in detail by applying the polynomial regression equation. The proposed model's results are compared to the results produced by a multiple linear regression (MLR) model. Experiments indicate that the prediction model based on MPR has higher accuracy than MLR.
Keywords: Polynomial regression, rainfall forecasting, statistical forecasting.
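A minimal sketch of the MPR-vs-MLR comparison, using polynomial feature expansion on top of ordinary least squares; the synthetic predictor/rainfall data and the polynomial degree are assumptions for illustration only, not the station data used in the paper.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(4)
# hypothetical monthly predictors (e.g., humidity, temperature index) and rainfall (mm)
X = rng.uniform(0, 1, (240, 2))
rainfall = 50 + 300 * X[:, 0] ** 2 + 80 * X[:, 0] * X[:, 1] + rng.normal(0, 10, 240)

mlr = LinearRegression().fit(X, rainfall)                                         # multiple linear regression
mpr = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, rainfall)  # polynomial regression

for name, model in [("MLR", mlr), ("MPR (degree 2)", mpr)]:
    pred = model.predict(X)
    print(f"{name}: RMSE = {mean_squared_error(rainfall, pred) ** 0.5:.2f} mm")
```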
3385 An Analysis of Classification of Imbalanced Datasets by Using Synthetic Minority Over-Sampling Technique
Authors: Ghada A. Alfattni
Abstract:
Analysing imbalanced datasets is one of the challenges that practitioners in the machine learning field face. Many studies have been carried out to determine the effectiveness of the synthetic minority over-sampling technique (SMOTE) in addressing this issue. The aim of this study was therefore to compare the effectiveness of SMOTE across different models on imbalanced datasets. Three classification models (logistic regression, support vector machine, and nearest neighbour) were tested with multiple datasets; then the same datasets were oversampled using SMOTE and applied again to the three models to compare the differences in performance. The results of the experiments show that the highest number of nearest neighbours gives lower error rates.
Keywords: Imbalanced datasets, SMOTE, machine learning, logistic regression, support vector machine, nearest neighbour.
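A sketch of the experimental setup described above, using imbalanced-learn's SMOTE with the three classifiers; the synthetic imbalanced dataset, the train/test split, and the metric are assumptions standing in for the paper's datasets and evaluation protocol.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import balanced_accuracy_score
from imblearn.over_sampling import SMOTE

X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
X_sm, y_sm = SMOTE(random_state=0).fit_resample(X_tr, y_tr)   # oversample the minority class

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "SVM": SVC(),
    "Nearest Neighbour": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in models.items():
    raw = balanced_accuracy_score(y_te, clf.fit(X_tr, y_tr).predict(X_te))
    smo = balanced_accuracy_score(y_te, clf.fit(X_sm, y_sm).predict(X_te))
    print(f"{name}: balanced accuracy {raw:.3f} (original) -> {smo:.3f} (SMOTE)")
```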
3384 An Approach to Task Modeling for User Interface Design
Authors: Costin Pribeanu
Abstract:
The model-based approach to user interface design relies on developing separate models capturing various aspects of users, tasks, the application domain, and presentation and dialog structures. This paper presents a task modeling approach for user interface design and aims at exploring the mappings between task, domain, and presentation models. The basic idea of our approach is to identify typical configurations in task and domain models and to investigate how they relate to each other. A special emphasis is put on application-specific functions and on mappings between domain objects and operational task structures. In this respect, we address two layers in task decomposition: a functional (planning) layer and an operational layer.
Keywords: task modeling, user interface design, unit tasks, basic tasks, operational task model.
3383 Modeling Exponential Growth Activity Using Technology: A Research with Bachelor of Business Administration Students
Authors: V. Vargas-Alejo, L. E. Montero-Moguel
Abstract:
Understanding the concept of function has been important in mathematics education for many years. In this study, the models built by a group of five business administration and accounting undergraduate students when carrying out a population growth activity are analyzed. The theoretical framework is the Models and Modeling Perspective. The results show how the students included tables, graphics, and algebraic representations in their models. Using technology was useful for interpreting, describing, and predicting the situation. The first model the students built to describe the situation was linear. After that, they modified and refined their ways of thinking and finally created an exponential growth model. Modeling the activity was useful for deepening understanding of mathematical concepts such as covariation, rate of change, and the exponential function, and also for differentiating between linear and exponential growth.
Keywords: Covariation reasoning, exponential function, modeling, representations.
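A short sketch contrasting a linear model with an exponential growth model on population-style data, echoing the progression from the first linear model to the refined exponential one; the data points and parameter guesses are illustrative assumptions, not the students' actual activity data.

```python
import numpy as np
from scipy.optimize import curve_fit

# hypothetical population counts observed at successive periods
t = np.arange(8)
population = np.array([100, 128, 165, 210, 270, 346, 442, 567], dtype=float)

linear = np.polyfit(t, population, 1)                                # first model: linear

def exp_growth(t, p0, r):
    return p0 * np.exp(r * t)

(p0, r), _ = curve_fit(exp_growth, t, population, p0=(100.0, 0.2))   # refined model: exponential

lin_pred = np.polyval(linear, t)
exp_pred = exp_growth(t, p0, r)
print(f"linear residual sum of squares:      {np.sum((population - lin_pred) ** 2):.0f}")
print(f"exponential residual sum of squares: {np.sum((population - exp_pred) ** 2):.0f}")
print(f"estimated growth rate r = {r:.3f} per period")
```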
3382 Agent-Based Simulation of Simulating Anticipatory Systems – Classification
Authors: Eugene Kindler
Abstract:
The present paper is oriented towards the classification and application of agent techniques in the simulation of anticipatory systems, namely those that use simulation models to aid anticipation. The main ideas are rooted in the fact that the best way to describe computer simulation models is the technique of describing the simulated system itself (with the translation into computer code provided automatically), and that the anticipation itself is often nested.
Keywords: Agents, Anticipatory systems, Discrete event simulation, Simula, Taxonomy.
3381 Comparison of the Thermal Characteristics of Induction Motor, Switched Reluctance Motor and Inset Permanent Magnet Motor for Electric Vehicle Application
Authors: Sadeep Sasidharan, T. B. Isha
Abstract:
Modern-day electric vehicles require compact motors with high torque/power density for electric propulsion, which necessitates proper thermal management of the electric motors. The main focus of this paper is to compare the steady-state thermal analysis of a conventional 20 kW 8/6 Switched Reluctance Motor (SRM) with that of an induction motor and an Inset Permanent Magnet (IPM) motor of the same rating. The goal is to develop proper thermal models of the three types of motors for finite element thermal analysis. JMAG software is used for the development and simulation of the thermal models. The results show that the induction motor is subjected to more heating than the SRM and IPM when used continuously for electric vehicle application.
Keywords: SRM, induction motor, IPM, thermal analysis, loss models, electric vehicles.
3380 An Improved K-Means Algorithm for Gene Expression Data Clustering
Authors: Billel Kenidra, Mohamed Benmohammed
Abstract:
Data mining techniques used in the field of clustering are a subject of active research and assist in biological pattern recognition and the extraction of new knowledge from raw data. Clustering means the act of partitioning an unlabeled dataset into groups of similar objects. Each group, called a cluster, consists of objects that are similar to one another and dissimilar to objects of other groups. Several clustering methods are based on partitional clustering. This category attempts to directly decompose the dataset into a set of disjoint clusters, leading to an integer number of clusters that optimizes a given criterion function. The criterion function may emphasize a local or a global structure of the data, and its optimization is an iterative relocation procedure. The K-Means algorithm is one of the most widely used partitional clustering techniques. Since K-Means is extremely sensitive to the initial choice of centers, and a poor choice of centers may lead to a local optimum that is quite inferior to the global optimum, we propose a strategy for initializing the K-Means centers. The improved K-Means algorithm is compared with the original K-Means, and the results show how significantly the efficiency has been improved.
Keywords: Microarray data mining, biological pattern recognition, partitional clustering, k-means algorithm, centroid initialization.
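A sketch of the general idea of improving K-Means through better center initialization: random starting centers are compared with a farthest-point seeding on synthetic expression-like data. The seeding rule shown is a generic strategy, not necessarily the specific initialization proposed in the paper, and the data are synthetic stand-ins for gene expression profiles.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=600, centers=5, n_features=20, random_state=0)  # stand-in for expression profiles

def farthest_point_init(X, k, rng):
    """Pick the first center at random, then choose each next center far from those already chosen."""
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = np.min([np.sum((X - c) ** 2, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d2)])
    return np.array(centers)

rng = np.random.default_rng(0)
random_km = KMeans(n_clusters=5, init="random", n_init=1, random_state=0).fit(X)
seeded_km = KMeans(n_clusters=5, init=farthest_point_init(X, 5, rng), n_init=1).fit(X)

print(f"inertia with random centers:         {random_km.inertia_:.1f}")
print(f"inertia with farthest-point seeding: {seeded_km.inertia_:.1f}")
```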