Search results for: the five-factor model of personality
15615 Mistuning in Radial Inflow Turbines
Authors: Valentina Futoryanova, Hugh Hunt
Abstract:
One of the common failure modes of diesel engine turbochargers is high cycle fatigue of the turbine wheel blades. Mistuning of the blades due to the casting process is believed to contribute to this failure mode. A laser vibrometer is used to characterize mistuning for a population of turbine wheels through analysis of the blade response to piezo-speaker-induced noise. The turbine wheel design under investigation is radial and is typically used in 6-12 L diesel engine applications. Amplitudes and resonance frequencies are reviewed and summarized. The study also includes test results for a paddle wheel that represents a perfectly tuned system and acts as a reference. A mass-spring model is developed for the paddle wheel, and the model's suitability is tested against the actual data. Randomization is applied to the stiffness matrix to model the mistuning effect in the turbine wheels. The experimental data are shown to be in good agreement with the model.
Keywords: vibration, radial turbines, mistuning, turbine blades, modal analysis, periodic structures, finite element
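A minimal numerical sketch of this kind of lumped-parameter analysis, assuming a cyclic mass-spring chain for the bladed wheel and a random perturbation of the blade stiffnesses to represent casting mistuning; the blade count and all parameter values are illustrative and not taken from the paper.

```python
import numpy as np

def natural_frequencies(n_blades=12, m=1.0, k_blade=1.0e4, k_couple=2.0e3,
                        mistuning_std=0.0, seed=0):
    """Eigenfrequencies (Hz) of a cyclic mass-spring model of a bladed wheel.

    Each blade is a mass attached to the hub by a spring k_blade and coupled to
    its neighbours by springs k_couple; mistuning_std is the relative standard
    deviation of the random blade-stiffness perturbation (zero = tuned wheel).
    """
    rng = np.random.default_rng(seed)
    k = k_blade * (1.0 + mistuning_std * rng.standard_normal(n_blades))
    K = np.zeros((n_blades, n_blades))
    for i in range(n_blades):
        K[i, i] = k[i] + 2.0 * k_couple
        K[i, (i - 1) % n_blades] -= k_couple
        K[i, (i + 1) % n_blades] -= k_couple
    M = m * np.eye(n_blades)
    eigvals = np.linalg.eigvalsh(np.linalg.solve(M, K))   # symmetric for M = m*I
    return np.sqrt(np.abs(eigvals)) / (2.0 * np.pi)

print("tuned:   ", np.round(natural_frequencies(), 2))
print("mistuned:", np.round(natural_frequencies(mistuning_std=0.05), 2))
```

The mistuned case splits the repeated eigenfrequencies of the perfectly tuned (cyclic) system, which is the qualitative effect the randomized stiffness matrix is meant to capture.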
Procedia PDF Downloads 432
15614 Long Term Love Relationships Analyzed as a Dynamic System with Random Variations
Authors: Nini Johana Marín Rodríguez, William Fernando Oquendo Patino
Abstract:
In this work, we model a coupled system in which we explore the effects of steady and random behavior on a linear system, as an extension of the classic Strogatz model. This is exemplified by modeling a couple's love dynamics as a linear system of two coupled differential equations and studying its stability for four types of lovers, chosen as CC='Cautious-Cautious', OO='Only other feelings', OP='Opposites' and RR='Romeo the Robot'. We explore the effects of, first, introducing saturation and, second, adding a random variation to one of the CC-type lovers, which shapes his character, in an attempt to model how this variability influences the dynamics between love and hate in a couple in a long-run relationship. This work could also be useful for modeling other kinds of systems where interactions can be modeled as linear systems with external or internal random influence. We find that the final results are not easy to predict and that a strong dependence on initial conditions appears, which is a signature of chaos.
Keywords: differential equations, dynamical systems, linear system, love dynamics
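A minimal sketch of a Strogatz-type coupled linear system with a random variation added to one lover's own-feeling coefficient; the coefficients, noise form, and initial conditions are illustrative assumptions, not the paper's values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# R'(t) = a*R + b*J,  J'(t) = c*R + d*J  (classic linear Strogatz-type couple)
a, b, c, d = -0.2, 0.8, -0.5, -0.1          # illustrative "cautious" parameters
noise_amp = 0.3                             # strength of the random variation
rng = np.random.default_rng(1)
t_noise = np.linspace(0.0, 50.0, 501)
noise = noise_amp * rng.standard_normal(t_noise.size)

def rhs(t, y):
    R, J = y
    eps = np.interp(t, t_noise, noise)      # random perturbation of R's character
    return [(a + eps) * R + b * J, c * R + d * J]

sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 0.5], max_step=0.1)
print("final (R, J):", sol.y[:, -1])
```

Re-running with slightly different initial conditions or noise seeds illustrates the sensitivity to initial conditions mentioned in the abstract.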
Procedia PDF Downloads 353
15613 Analysis of Users’ Behavior on Book Loan Log Based on Association Rule Mining
Authors: Kanyarat Bussaban, Kunyanuth Kularbphettong
Abstract:
This research aims to create a model for the analysis of student behavior in using library resources, based on data mining techniques, in the case of Suan Sunandha Rajabhat University. The model was created using association rules with the Apriori algorithm. Fourteen rules were found; when tested against the testing data set, their classification ability was 79.24 percent and the MSE was 22.91. The results showed that the user behavior model built with the association rule technique can be used to manage the library resources.
Keywords: behavior, data mining technique, Apriori algorithm, knowledge discovery
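A small pure-Python sketch of the level-wise Apriori idea (frequent itemsets above a minimum support, with candidates built only from frequent itemsets of the previous level); the toy transactions, item names, and support threshold are invented for illustration and are not taken from the paper's loan log.

```python
from itertools import chain

# Hypothetical loan records: the set of subject areas each student borrowed from
transactions = [
    {"databases", "data_mining", "statistics"},
    {"databases", "statistics"},
    {"data_mining", "statistics", "networks"},
    {"databases", "data_mining", "statistics"},
    {"databases", "networks"},
]
min_support = 0.4

def apriori(transactions, min_support):
    n = len(transactions)
    support = lambda itemset: sum(itemset <= t for t in transactions) / n
    # level 1: frequent single items
    level = [frozenset([i]) for i in set(chain.from_iterable(transactions))]
    level = [c for c in level if support(c) >= min_support]
    frequent = {c: support(c) for c in level}
    k = 2
    while level:
        # join frequent (k-1)-itemsets to form k-item candidates, then filter
        candidates = {a | b for a in level for b in level if len(a | b) == k}
        level = [c for c in candidates if support(c) >= min_support]
        frequent.update({c: support(c) for c in level})
        k += 1
    return frequent

for itemset, s in sorted(apriori(transactions, min_support).items(), key=lambda kv: -kv[1]):
    print(sorted(itemset), round(s, 2))
```

Association rules are then read off the frequent itemsets by computing confidence, e.g. support(A and B) / support(A) for a rule A -> B.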
Procedia PDF Downloads 404
15612 Single-Element Simulations of Wood Material in LS-DYNA
Authors: Ren Zuo Wang
Abstract:
In this paper, in order to investigate the behavior of wood structures, the nonlinear wood material model in LS-DYNA is adopted. Experiments on ancient wood structures are difficult and inefficient to conduct; hence, the LS-DYNA software can be used to simulate the nonlinear responses of ancient wood structures. LS-DYNA provides a material model called *MAT_WOOD (*MAT_143). This model is used to simulate the single-element response of wood subjected to tension and compression in the parallel and perpendicular material directions. Comparison of the exact solution with the numerical simulation results from LS-DYNA demonstrates the accuracy and efficiency of the proposed simulation method.
Keywords: LS-DYNA, wood structure, single-element simulations, MAT_143
Procedia PDF Downloads 653
15611 Text Mining of Twitter Data Using a Latent Dirichlet Allocation Topic Model and Sentiment Analysis
Authors: Sidi Yang, Haiyi Zhang
Abstract:
Twitter is a microblogging platform where millions of users share their attitudes, views, and opinions daily. Using a probabilistic Latent Dirichlet Allocation (LDA) topic model to discern the most popular topics in Twitter data is an effective way to analyze a large set of tweets and find a set of topics in a computationally efficient manner. Sentiment analysis provides an effective method to show the emotions and sentiments found in each tweet and an efficient way to summarize the results in a manner that is clearly understood. The primary goal of this paper is to explore text mining, extracting and analyzing useful information from unstructured text using two approaches, LDA topic modelling and sentiment analysis, by examining Twitter plain-text data in English. These two methods allow people to mine data more effectively and efficiently. The LDA topic model and sentiment analysis can also be applied to provide insights in business and scientific fields.
Keywords: text mining, Twitter, topic model, sentiment analysis
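A minimal sketch of the two-step pipeline using scikit-learn's LDA on a handful of invented example tweets, followed by a toy word-list sentiment score; a real study would use a full tweet corpus and a proper sentiment lexicon or classifier.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "love the new phone battery lasts all day",
    "terrible traffic again this morning so annoying",
    "great match tonight what a goal",
    "phone screen cracked awful build quality",
    "coffee and sunshine perfect start to the day",
    "the referee ruined the match bad decision",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)

terms = vectorizer.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[-5:][::-1]]
    print(f"topic {k}: {top}")

# Toy lexicon-based sentiment score (stand-in for a real lexicon or classifier)
positive = {"love", "great", "perfect", "goal", "sunshine"}
negative = {"terrible", "annoying", "awful", "cracked", "ruined", "bad"}
for tweet, topic in zip(tweets, doc_topics.argmax(axis=1)):
    words = set(tweet.split())
    score = len(words & positive) - len(words & negative)
    print(f"[topic {topic}] sentiment={score:+d}  {tweet}")
```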
Procedia PDF Downloads 179
15610 Analysis on the Need of Engineering Drawing and Feasibility Study on 3D Model Based Engineering Implementation
Authors: Parthasarathy J., Ramshankar C. S.
Abstract:
Engineering drawings these days play an important role in every part of an industry. By and large, engineering drawings influence every phase of the product development process. Traditionally, drawings are used for communication in industry because they are the clearest way to represent product manufacturing information. Until recently, manufacturing activities were driven by engineering data captured in 2D paper documents or digital representations of those documents. The need for engineering drawings is undeniable. Still, engineering drawings have the disadvantage of requiring re-entry of data throughout the manufacturing life cycle. This document-based approach is prone to errors and requires costly re-entry of data at every stage in the manufacturing life cycle. So there is a requirement to eliminate engineering drawings throughout the product development process and to implement 3D Model Based Engineering (3D MBE or 3D MBD). Adopting MBD appears to be the next logical step to continue reducing time-to-market and improving product quality. Ideally, by fully applying the MBD concept, the product definition will no longer rely on engineering drawings throughout the product lifecycle. This project addresses the need for engineering drawings and their influence in various parts of an industry, as well as the need to implement 3D Model Based Engineering, with its advantages and the technical barriers that must be overcome in order to implement it. This project also addresses the requirements of neutral formats and their realisation in order to implement the digital product definition principles in a light format. In order to prove the concepts of 3D Model Based Engineering, the screw jack body part is also demonstrated. At ZF Windpower Coimbatore Limited, 3D Model Based Definition is implemented for the torque arm (machining and casting), steel tube, pinion shaft, cover, and energy tube.
Keywords: engineering drawing, model based engineering (MBE), MBD, CAD
Procedia PDF Downloads 435
15609 A Bi-Objective Model to Address Simultaneous Formulation of Project Scheduling and Material Ordering
Authors: Babak H. Tabrizi, Seyed Farid Ghaderi
Abstract:
Concurrent planning of project scheduling and material ordering has been increasingly addressed within the last decades as an approach to improve project execution costs. Therefore, we consider the problem in this paper, aiming to maximize schedule quality robustness in addition to minimizing the relevant costs. In this regard, a bi-objective mathematical model is developed to formulate the problem. Moreover, it is possible to utilize the all-unit discount for materials purchasing. The problem is then solved by the ε-constraint method, and the Pareto front is obtained for a variety of robustness values. Finally, the applicability and efficiency of the proposed model are tested on different numerical instances.
Keywords: ε-constraint method, material ordering, project management, project scheduling
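A small sketch of the ε-constraint idea on a toy bi-objective linear program (not the paper's actual scheduling/ordering formulation): one objective is minimized while the other is bounded by ε, and sweeping ε traces an approximation of the Pareto front.

```python
import numpy as np
from scipy.optimize import linprog

# Toy bi-objective LP standing in for the robustness/cost trade-off:
#   minimize f1 = 3*x1 + x2   (say, a robustness penalty)
#   minimize f2 = x1 + 2*x2   (say, ordering cost)
#   subject to x1 + x2 >= 2, x >= 0
c_f1 = np.array([3.0, 1.0])
c_f2 = np.array([1.0, 2.0])
A_ub_base = [[-1.0, -1.0]]          # x1 + x2 >= 2 rewritten as <= form
b_ub_base = [-2.0]

pareto = []
for eps in np.linspace(2.0, 4.0, 9):            # sweep the bound on f2
    res = linprog(c_f1,
                  A_ub=A_ub_base + [list(c_f2)],  # add the constraint f2 <= eps
                  b_ub=b_ub_base + [eps],
                  bounds=[(0, None), (0, None)])
    if res.success:
        x = res.x
        pareto.append((float(c_f1 @ x), float(c_f2 @ x)))

for f1, f2 in pareto:
    print(f"f1={f1:6.3f}  f2={f2:6.3f}")
```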
Procedia PDF Downloads 295
15608 Estimation of Soil Moisture at High Resolution through Integration of Optical and Microwave Remote Sensing and Applications in Drought Analyses
Authors: Donglian Sun, Yu Li, Paul Houser, Xiwu Zhan
Abstract:
California experienced severe drought conditions in the past years. In this study, the drought conditions in California are analyzed using soil moisture anomalies derived from integrated optical and microwave satellite observations along with auxiliary land surface data. Based on the U.S. Drought Monitor (USDM) classifications, three typical drought conditions were selected for the analysis: extreme drought conditions in 2007 and 2013, severe drought conditions in 2004 and 2009, and normal conditions in 2005 and 2006. Drought is defined as a negative soil moisture anomaly. To estimate soil moisture at high spatial resolutions, three approaches are explored in this study: the universal triangle model, which estimates soil moisture from the Normalized Difference Vegetation Index (NDVI) and Land Surface Temperature (LST); the basic model, which estimates soil moisture under different conditions with auxiliary data such as precipitation, soil texture, topography, and surface types; and the refined model, which uses accumulated precipitation and its lagging effects. It is found that the basic model shows better agreement with the USDM classifications than the universal triangle model, while the refined model, using precipitation accumulated from the previous summer to the current time, demonstrates the closest agreement with the USDM patterns.
Keywords: soil moisture, high resolution, regional drought, analysis and monitoring
Procedia PDF Downloads 136
15607 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model
Authors: Chaudhuri Manoj Kumar Swain, Susmita Das
Abstract:
This paper presents a detailed procedure for predicting a path loss (PL) model and its application in estimating the coverage probability of a WiMAX network. For this, a hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis, and regression analysis are the phases of operation incorporated in this approach, and the importance of each of these phases is discussed. The procedure for collecting data such as the received signal strength indicator (RSSI) through an experimental set-up is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with regression techniques. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent as well as the coverage probability of the network are evaluated. This research work may significantly assist in the deployment and optimisation of any cellular network.
Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis
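A short sketch of the regression step on synthetic stand-in measurements, assuming the standard log-distance form PL(d) = PL(d0) + 10*n*log10(d/d0) plus log-normal shadowing; the fitted slope gives the PL exponent, and the shadowing spread yields a coverage probability estimate. All numbers here are invented for illustration, not the paper's drive-test data.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
d0 = 100.0                                   # reference distance (m)
d = np.linspace(100.0, 2000.0, 40)           # synthetic measurement distances (m)
true_n, pl_d0, shadow_db = 3.2, 70.0, 6.0    # assumed "ground truth" for the demo
pl_measured = pl_d0 + 10.0 * true_n * np.log10(d / d0) \
              + shadow_db * rng.standard_normal(d.size)

# Least-squares fit of PL(d) = PL(d0) + 10*n*log10(d/d0)
X = 10.0 * np.log10(d / d0)
n_hat, pl0_hat = np.polyfit(X, pl_measured, 1)
sigma = np.std(pl_measured - (pl0_hat + n_hat * X))
print(f"path loss exponent n = {n_hat:.2f}, PL(d0) = {pl0_hat:.1f} dB, shadowing std = {sigma:.1f} dB")

# Coverage probability at the cell edge: P[PL + shadowing <= maximum tolerable PL]
pl_max, d_edge = 110.0, 1500.0
pl_edge = pl0_hat + n_hat * 10.0 * np.log10(d_edge / d0)
print(f"coverage probability at {d_edge:.0f} m = {norm.cdf((pl_max - pl_edge) / sigma):.2f}")
```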
Procedia PDF Downloads 177
15606 Causes of Variation Orders in the Egyptian Construction Industry: Time and Cost Impacts
Authors: A. Samer Ezeldin, Jwanda M. El Sarag
Abstract:
Variation orders are of great importance in any construction project. Variation orders are defined as any change in the scope of works of a project, which can be an addition, omission, or even modification. This paper investigates the variation orders that occur during construction projects in Egypt. The literature review presents a comparison of the causes of variation orders among Egypt, Tanzania, Nigeria, Malaysia, and the United Kingdom. A classification of the occurrence of variation orders due to owner-related factors, consultant-related factors, and other factors is given in the literature review. These classified events that lead to variation orders were introduced in a survey of 19 events to observe their frequency of occurrence and their time and cost impacts. The survey data were obtained from 87 participants, including clients, consultants, and contractors, and a database of 42 scenarios was created. A model is then developed to assist project managers in predicting the frequency of variations, budgeting for any additional costs, and minimizing any delays that can take place. Two experts with more than 25 years of experience were given the model to verify that it was working effectively. The model was then validated on a residential compound completed in July 2016 to prove that it actually produces acceptable results.
Keywords: construction, cost impact, Egypt, time impact, variation orders
Procedia PDF Downloads 183
15605 Analysis of a Coupled Hydro-Sedimentological Numerical Model for the Western Tombolo of Giens
Authors: Yves Lacroix, Van Van Than, Didier Léandri, Pierre Liardet
Abstract:
The western Tombolo of the Giens peninsula in southern France, known as Almanarre beach, is subject to coastal erosion. We are trying to use computer simulation in order to propose solutions to stop this erosion. Our aim was first to determine the main factors of this erosion and to successfully apply a coupled hydro-sedimentological numerical model based on observations and measurements that have been performed on the site for decades. We have gathered all available information and data about waves, winds, currents, tides, bathymetry, the coastline, and sediments concerning the site. These have been divided into two sets: one devoted to calibrating a numerical model using the Mike 21 software, the other serving as a reference in order to numerically compare the present situation to what it could be if different types of underwater constructions were implemented. This paper presents the first part of the study: selecting and merging different sources into a coherent database, identifying the main erosion factors, and calibrating the coupled software model against the selected reference period. Our results provide a calibration of the numerical model with good fitting coefficients. They also show that the winter south-western storm events, combined with low-pressure weather conditions, constitute a major factor of erosion, mainly due to wave impact in the northern part of the Almanarre beach. The combined impact of current and wind is shown to be negligible.
Keywords: Almanarre beach, coastal erosion, hydro-sedimentological, numerical model
Procedia PDF Downloads 376
15604 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models
Authors: Rossella Arcucci, Luisa D'Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti
Abstract:
This work is the first building block in a rather wide research activity, in collaboration with the Euro Mediterranean Center for Climate Change, aimed at introducing scalable approaches in Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model, previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.
Keywords: data assimilation, GPU architectures, ocean models, parallel algorithm
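At its core, 3DVar minimizes the cost function J(x) = (x - xb)^T B^-1 (x - xb) + (Hx - y)^T R^-1 (Hx - y). The sketch below sets up a small dense instance with SciPy on synthetic data, without the domain decomposition or GPU porting that are the paper's actual contribution; all dimensions, error variances, and observation locations are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, m = 20, 8                                    # state size, number of observations
x_true = np.sin(np.linspace(0, 2 * np.pi, n))
x_b = x_true + 0.3 * rng.standard_normal(n)     # background (prior) state

obs_idx = np.arange(0, n, n // m)[:m]           # observe every few grid points
H = np.zeros((m, n))
H[np.arange(m), obs_idx] = 1.0
y = H @ x_true + 0.1 * rng.standard_normal(m)   # observations
B_inv = np.eye(n) / 0.3**2                      # (diagonal) background error covariance
R_inv = np.eye(m) / 0.1**2                      # observation error covariance

def cost_and_grad(x):
    dxb = x - x_b
    innov = H @ x - y
    J = 0.5 * dxb @ B_inv @ dxb + 0.5 * innov @ R_inv @ innov
    grad = B_inv @ dxb + H.T @ R_inv @ innov
    return J, grad

res = minimize(cost_and_grad, x_b, jac=True, method="L-BFGS-B")
print("background RMSE:", np.sqrt(np.mean((x_b - x_true) ** 2)))
print("analysis RMSE:  ", np.sqrt(np.mean((res.x - x_true) ** 2)))
```

The DD-DA idea is to split this minimization over subdomains so that each piece (and each CUDA kernel) works on a much smaller block of the state.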
Procedia PDF Downloads 412
15603 Giftedness Cloud Model: A Psychological and Ecological Vision of Giftedness Concept
Authors: Rimeyah H. S. Almutairi, Alaa Eldin A. Ayoub
Abstract:
The aim of this study was to identify empirical and theoretical studies that explored giftedness theories and identification, in order to assess and synthesize the mechanisms, outcomes, and impacts of gifted identification models. Thus, we sought to provide an evidence-informed answer to how current giftedness theories work and how effective they are, with the goal of developing a model that incorporates the advantages of existing models and avoids their disadvantages as much as possible. We conducted a systematic literature review (SLR). The disciplined analysis resulted in a final sample consisting of 30 appropriate studies. The results indicated that: (a) there is no uniform and consistent definition of giftedness; (b) researchers are using several inconsistent criteria to identify the gifted; and (c) the detection of talent is largely limited to early ages, and there is obvious neglect of adults. This study contributes to the development of the Giftedness Cloud Model (GCM), defined as a model that attempts to interpret giftedness within an interactive psychological and ecological framework. The GCM aims to help a talented individual reach the core of giftedness and manifest talent in creative productivity or invention. Besides that, the GCM suggests classifying giftedness into four levels: mastery, excellence, creative productivity, and manifestation. In addition, the GCM presents an idea for distinguishing between talent and giftedness.
Keywords: giftedness cloud model, talent, systematic literature review, giftedness concept
Procedia PDF Downloads 167
15602 Finite Element Simulation of RC Exterior Beam-Column Joints Using Damage Plasticity Model
Authors: A. M. Halahla, M. H. Baluch, M. K. Rahman, A. H. Al-Gadhib, M. N. Akhtar
Abstract:
In the present study, a 3D simulation of a typical exterior reinforced concrete (RC) beam-column joint (BCJ) strengthened with a carbon fiber-reinforced plastic (CFRP) sheet is carried out. Numerical investigations are performed using nonlinear finite element (FE) analysis in the commercial FE software ABAQUS, incorporating the concrete damage plasticity (CDP) model for the concrete response in compression and tension softening, a linear plastic model with isotropic hardening for the reinforcing steel, and a linear elastic lamina material model for the CFRP sheets. The numerical models developed in the present study are validated against the results obtained from experiments under monotonic loading using a hydraulic jack in displacement-control mode. The experimental program includes the casting of deficient BCJs loaded to failure, for both un-strengthened and strengthened joints. The failure mode and deformation response of the CFRP-strengthened and un-strengthened joints and the propagation of damage in the components of the BCJ are discussed. The finite element simulations are compared with the experimental results and are noted to yield reasonable agreement. The damage plasticity model was able to capture the ultimate load and the mode of failure of the beam-column joint with good accuracy.
Keywords: reinforced concrete, exterior beam-column joints, concrete damage plasticity model, computational simulation, 3-D finite element model
Procedia PDF Downloads 383
15601 An Interlock Model of Friction and Superlubricity
Authors: Azadeh Malekan, Shahin Rouhani
Abstract:
Superlubricity is a phenomenon where two surfaces in contact show negligible friction; this may be because the asperities of the two surfaces do not interlock. Two rough surfaces, when pressed against each other, can reach a configuration where the summits of the asperities of one surface lock into the valleys of the other surface. The amount of interlock depends on the geometry of the two surfaces. We suggest that the friction force may then be proportional to the amount of interlock; this explains superlubricity as the situation where there is little interlock. The friction force will then be directly proportional to the normal force, as it is related to the work necessary to lift the upper surface in order to clear the interlock. To investigate this model, we simulate the contact of two surfaces. In order to validate our model, we first investigate Amontons' law. Assuming that asperities retain their deformations on the time scale over which the top asperity moves across the lattice spacing, Amontons' law is observed. Structural superlubricity is examined under the hypothesis that the surfaces are very rigid and there is no deformation of the asperities. This may happen at small normal forces. When two identical surfaces come into contact and the top surface is rotated, we observe a peak in the friction force near the angle of orientation where the two surfaces can interlock.
Keywords: friction, Amontons' law, superlubricity, contact model
Procedia PDF Downloads 147
15600 Simultaneous versus Sequential Model in Foreign Entry
Authors: Patricia Heredia, Isabel Saz, Marta Fernández
Abstract:
This article proposes that the decision regarding exporting and the choice of export channel are nested and non-independent decisions. We assume that firms make two sequential decisions before arriving at their final choice: the decision to access foreign markets and the decision about the type of channel. This hierarchical perspective of the choices involved in the process is appealing for two reasons. First, it supports the idea that people have a limited analytical capacity; managers often break down a complex decision into a hierarchical process because this makes it more manageable. Second, it recognizes that important differences exist between entry modes. In light of the above, the objective of this study is to test different entry mode choice processes: independent decisions versus nested and non-independent decisions. To do this, the methodology estimates and compares the following two models: (i) a simultaneous single-stage model with three entry mode choices (using a multinomial logit model); and (ii) a two-stage model in which the export decision precedes the channel decision, using a sequential logit model. The study uses resource-based factors in determining these decision processes concerning internationalization, and it carries out an empirical analysis using a DOC Rioja sample of 177 firms. Using the Akaike and Schwarz information criteria, the empirical evidence supports the existence of a nested structure, where the decision about exporting precedes the export mode decision. The implications and contributions of the findings are discussed.
Keywords: sequential logit model, two-stage choice process, export mode, wine industry
Procedia PDF Downloads 29
15599 Hydraulic Analysis of Irrigation Approach Channel Using HEC-RAS Model
Authors: Muluegziabher Semagne Mekonnen
Abstract:
This study was intended to determine the irrigation water requirements and to evaluate canal hydraulics under steady-state conditions in order to improve the scheme performance of the Meki-Ziway irrigation project. The methodology used the CROPWAT 8.0 model to estimate the irrigation water requirements of the five major crops irrigated in the study area. The results showed that, for the existing and potential irrigation development areas of 2000 ha and 2599 ha, the crop water requirements were 3,339,200 and 4,339,090.4 m³, respectively. Hydraulic simulation models are fundamental tools for understanding the hydraulic flow characteristics of irrigation systems. In this study, a hydraulic analysis of the irrigation canals using the HEC-RAS model was conducted in the Meki-Ziway irrigation scheme. The HEC-RAS model was tested in terms of error estimation and used to determine the canal capacity potential.
Keywords: HEC-RAS, irrigation, hydraulics, canal reach, capacity
Procedia PDF Downloads 60
15598 Optimal Hedging of a Portfolio of European Options in an Extended Binomial Model under Proportional Transaction Costs
Authors: Norm Josephy, Lucy Kimball, Victoria Steblovskaya
Abstract:
Hedging of a portfolio of European options under proportional transaction costs is considered. Our discrete-time financial market model extends the binomial market model with transaction costs to the case where the underlying stock price ratios are distributed over a bounded interval rather than over a two-point set. An optimal hedging strategy is chosen from a set of admissible non-self-financing hedging strategies. Our approach to optimal hedging of a portfolio of options is based on a theoretical foundation that includes the determination of a no-arbitrage option price interval as well as properties of the non-self-financing strategies and their residuals. A computational algorithm for optimizing an investor-relevant criterion over the set of admissible non-self-financing hedging strategies is developed. The applicability of our approach is demonstrated using both simulated data and real market data.
Keywords: extended binomial model, non-self-financing hedging, optimization, proportional transaction costs
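For reference, a sketch of the plain Cox-Ross-Rubinstein binomial model that the paper's extended model generalizes (two-point price ratios, no transaction costs, self-financing delta hedge); the contract and market parameters below are illustrative.

```python
import numpy as np

def crr_european_call(S0, K, r, sigma, T, steps):
    """Standard Cox-Ross-Rubinstein binomial price and initial hedge ratio (delta)."""
    dt = T / steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)            # risk-neutral up-probability
    disc = np.exp(-r * dt)
    j = np.arange(steps + 1)
    values = np.maximum(S0 * u**j * d**(steps - j) - K, 0.0)   # terminal payoffs
    for step in range(steps, 0, -1):              # backward induction
        values = disc * (p * values[1:] + (1.0 - p) * values[:-1])
        if step == 2:                             # keep the two time-1 values for delta
            v_up, v_down = values[1], values[0]
    delta = (v_up - v_down) / (S0 * u - S0 * d)
    return values[0], delta

price, delta = crr_european_call(S0=100, K=100, r=0.03, sigma=0.2, T=1.0, steps=200)
print(f"price = {price:.4f}, initial hedge ratio (delta) = {delta:.4f}")
```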
Procedia PDF Downloads 252
15597 Proposing a Failure Criterion for Cohesionless Media Considering Cyclic Fabric Anisotropy
Authors: Ali Noorzad, Ehsan Badakhshan, Shima Zameni
Abstract:
The present paper is focused on a generalized failure criterion for geomaterials with cross-anisotropy. The cyclic behavior of a granular material primarily depends on the nature and arrangement of the constituent particles, and on the particle size and shape that affect fabric anisotropy. To account for the influence of loading directions on strength variations, an anisotropic variable in terms of the invariants of the stress and fabric tensors is introduced into the failure criterion. In an extension of the original CANAsand constitutive model, two concepts, namely the critical state and the compact state, play paramount roles, as all of the moduli and coefficients are related to these states. The applicability of the present model is evaluated through comparisons between the predicted and the measured results. All simulations have demonstrated that the proposed constitutive model is capable of modeling the cyclic behavior of sand with inherent anisotropy.
Keywords: fabric, cohesionless media, cyclic loading, critical state, compact state, CANAsand constitutive model
Procedia PDF Downloads 219
15596 An Overview of Domain Models of Urban Quantitative Analysis
Authors: Mohan Li
Abstract:
Nowadays, intelligent research technology is becoming more important than traditional research methods in urban research work, and its share will greatly increase in the next few decades. Frequently, such analysis work cannot be carried out without some software engineering knowledge, and domain models of urban research become necessary when applying software engineering knowledge to urban work. In many urban planning practice projects, building rational models, feeding in reliable data, and providing enough computation all provide indispensable assistance in producing good urban planning. Throughout the work process, domain models can optimize workflow design. At present, human beings have entered the era of big data. The amount of digital data generated by cities every day will increase at an exponential rate, and new data forms are constantly emerging. How to select a suitable data set from this massive amount of data, and how to manage and process it, have become abilities that more and more planners and urban researchers need to possess. This paper summarizes and makes predictions about the emergence of technologies and technological iterations that may affect urban research in the future, help discover urban problems, and support targeted sustainable urban strategies. These are summarized into seven major domain models: the urban and rural regional domain model, the urban ecological domain model, the urban industry domain model, the development dynamics domain model, the urban social and cultural domain model, the urban traffic domain model, and the urban space domain model. These seven domain models can be used to guide the construction of systematic urban research topics and help researchers organize a series of intelligent analytical tools, such as Python, R, GIS, etc. They make full use of quantitative spatial analysis, machine learning, and other technologies to achieve higher efficiency and accuracy in urban research, assisting people in making reasonable decisions.
Keywords: big data, domain model, urban planning, urban quantitative analysis, machine learning, workflow design
Procedia PDF Downloads 177
15595 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory
Authors: Chiung-Hui Chen
Abstract:
The Internet of Things (IoT) was designed for widespread convenience. With smart tags and the sensing network, a large quantity of dynamic information is immediately available in the IoT. Through internal communication and interaction, meaningful objects provide real-time services for users. Therefore, providing services with appropriate decision-making has become an essential issue. Based on the science of human behavior, this study employed an environment model to record the time sequences and locations of different behaviors and adopted the probability module of the hierarchical Hidden Markov Model for the inference. The statistical analysis was conducted to achieve the following objectives. First, define user behaviors and predict user behavior routes with the environment model to analyze user purposes. Second, construct the hierarchical Hidden Markov Model according to the logical framework, and establish the sequential intensity among behaviors to become acquainted with the use and activity fabric of the intelligent environment. Third, establish the intensity of the relation between objects and the probability of their being used; this indicator can describe the possible limitations of the mechanism. As the process is recorded in the system created in this study, these data can be reused to adjust the procedure of intelligent design services.
Keywords: behavior, big data, hierarchical hidden Markov model, intelligent object
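The probability module rests on standard HMM machinery. The sketch below shows the forward recursion for a flat (non-hierarchical) HMM with invented behaviour states, sensor observations, and probability tables, purely to illustrate how a sequence likelihood is computed; a hierarchical HMM stacks such models so that each top-level state emits its own sub-HMM.

```python
import numpy as np

# Hidden behaviour states and observable sensor events (all invented for illustration)
states = ["resting", "cooking", "working"]
observations = ["kitchen_sensor", "desk_sensor", "sofa_sensor"]

A = np.array([[0.7, 0.2, 0.1],        # state transition probabilities
              [0.3, 0.5, 0.2],
              [0.2, 0.2, 0.6]])
B = np.array([[0.1, 0.1, 0.8],        # P(observation | state), rows follow `states`
              [0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1]])
pi = np.array([0.5, 0.25, 0.25])      # initial state distribution

def forward(obs_indices):
    """Forward algorithm: likelihood of an observation sequence under the HMM."""
    alpha = pi * B[:, obs_indices[0]]
    for o in obs_indices[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

seq = [observations.index(o) for o in
       ["sofa_sensor", "kitchen_sensor", "kitchen_sensor", "desk_sensor"]]
print("sequence likelihood:", forward(seq))
```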
Procedia PDF Downloads 233
15594 Leverage Effect for Volatility with Generalized Laplace Error
Authors: Farrukh Javed, Krzysztof Podgórski
Abstract:
We propose a new model that accounts for the asymmetric response of volatility to positive ('good news') and negative ('bad news') shocks in economic time series, the so-called leverage effect. In the past, asymmetric powers of errors in conditionally heteroskedastic models have been used to capture this effect. Our model uses the gamma-difference representation of the generalized Laplace distributions, which efficiently models the asymmetry. It has one additional natural parameter, the shape, that is used instead of the power in asymmetric power models to capture the strength of a long-lasting effect of shocks. Some fundamental properties of the model are provided, including the formula for covariances and an explicit form for the conditional distribution of the 'bad' and 'good' news processes given the past, a property that is important for the statistical fitting of the model. Relevant features of volatility models are illustrated using S&P 500 historical data.
Keywords: heavy tails, volatility clustering, generalized asymmetric Laplace distribution, leverage effect, conditional heteroskedasticity, asymmetric power volatility, GARCH models
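A minimal sketch of the gamma-difference representation itself: a generalized (asymmetric) Laplace variable drawn as the difference of two gamma variables with a common shape parameter and different scales, with a quick check of the resulting skewness and heavy tails. The parameter values are illustrative, and the paper couples such errors with a conditional-volatility recursion rather than using them in isolation.

```python
import numpy as np

rng = np.random.default_rng(0)

def generalized_laplace(shape, scale_pos, scale_neg, size, rng):
    """Gamma-difference representation: X = G_plus - G_minus, common shape parameter."""
    return rng.gamma(shape, scale_pos, size) - rng.gamma(shape, scale_neg, size)

# A larger negative scale gives a heavier 'bad news' tail (negative skew)
x = generalized_laplace(shape=0.8, scale_pos=0.6, scale_neg=1.0, size=100_000, rng=rng)
skew = np.mean((x - x.mean()) ** 3) / x.std() ** 3
kurt = np.mean((x - x.mean()) ** 4) / x.std() ** 4
print(f"mean={x.mean():.3f}  skewness={skew:.3f}  excess kurtosis={kurt - 3:.3f}")
```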
Procedia PDF Downloads 385
15593 Analysis and Optimized Design of a Packaged Liquid Chiller
Authors: Saeed Farivar, Mohsen Kahrom
Abstract:
The purpose of this work is to develop a physical simulation model for studying the effect of various design parameters on the performance of packaged liquid chillers. This paper presents a steady-state model for predicting the performance of a packaged liquid chiller over a wide range of operating conditions. The model inputs are the inlet conditions and geometry; the model outputs include system performance variables such as power consumption, coefficient of performance (COP), and the states of the refrigerant through the refrigeration cycle. A computer model that simulates the steady-state cyclic performance of a vapor compression chiller is developed for the purpose of performing detailed physical design analysis of actual industrial chillers. The model can be used for optimizing design and for detailed energy efficiency analysis of packaged liquid chillers. The simulation model takes into account all chiller components, such as the compressor, the shell-and-tube condenser and evaporator heat exchangers, the thermostatic expansion valve, and the connection pipes and tubing, by thermo-hydraulic modeling of heat transfer, fluid flow, and thermodynamic processes in each of these components. To verify the validity of the developed model, a 7.5 USRT packaged liquid chiller is used, and a laboratory test stand for bringing the chiller to its standard steady-state performance condition is built. Experimental results obtained from testing the chiller under various load and temperature conditions are shown to be in good agreement with those obtained from simulating the performance of the chiller using the computer prediction model. An entropy-minimization-based optimization analysis is performed based on the developed analytical performance model of the chiller. The variation of design parameters in the construction of the shell-and-tube condenser and evaporator heat exchangers is studied using the developed performance and optimization analysis and simulation model, and a best-match condition between the physical design and construction of the chiller heat exchangers and its compressor is found to exist. It is expected that manufacturers of chillers and research organizations interested in developing energy-efficient design and analysis of compression chillers can take advantage of the presented study and its results.
Keywords: optimization, packaged liquid chiller, performance, simulation
Procedia PDF Downloads 278
15592 Replicating Brain’s Resting State Functional Connectivity Network Using a Multi-Factor Hub-Based Model
Authors: B. L. Ho, L. Shi, D. F. Wang, V. C. T. Mok
Abstract:
The brain's functional connectivity, while temporally non-stationary, does express consistency at a macro spatial level. The study of stable resting state connectivity patterns hence provides opportunities for the identification of diseases if such stability is severely perturbed. A mathematical model replicating the brain's spatial connections will be useful for understanding the brain's representative geometry and complements the empirical model where it falls short. Empirical computations tend to involve large matrices and become infeasible with fine parcellation. However, the proposed analytical model has no such computational problems. To improve replicability, data from 92 subjects are obtained from two open sources. The proposed methodology, inspired by financial theory, uses multivariate regression to find relationships of every cortical region of interest (ROI) with some pre-identified hubs. These hubs act as representatives for the entire cortical surface. A variance-covariance framework of all ROIs is then built based on these relationships to link up all the ROIs. The result is a high level of match between model and empirical correlations, in the range of 0.59 to 0.66 after adjusting for sample size; an increase of almost forty percent. More significantly, the model framework provides an intuitive way to delineate between systemic drivers and idiosyncratic noise while reducing dimensions by more than 30-fold, hence providing a way to conduct attribution analysis. Due to its analytical nature and simple structure, the model is useful as a standalone toolkit for network dependency analysis or as a module for other mathematical models.
Keywords: functional magnetic resonance imaging, multivariate regression, network hubs, resting state functional connectivity
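A small sketch of the hub idea on synthetic data, in the spirit of a financial factor model: regress every ROI time series on a few hub time series, then rebuild the ROI covariance from the fitted loadings (systematic part) plus a diagonal of residual variances (idiosyncratic part). The dimensions and signal structure below are invented for illustration, not the study's fMRI data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_time, n_roi, n_hub = 200, 30, 4

# Synthetic "resting state" data: each ROI is a mixture of hub signals plus noise
hubs = rng.standard_normal((n_time, n_hub))
loadings_true = rng.standard_normal((n_hub, n_roi))
rois = hubs @ loadings_true + 0.5 * rng.standard_normal((n_time, n_roi))

# Multivariate regression of every ROI on the hub time series
beta, *_ = np.linalg.lstsq(hubs, rois, rcond=None)       # shape (n_hub, n_roi)
residuals = rois - hubs @ beta
cov_hubs = np.cov(hubs, rowvar=False)

# Model covariance: systematic part from the hubs + diagonal idiosyncratic part
cov_model = beta.T @ cov_hubs @ beta + np.diag(residuals.var(axis=0))
corr_model = cov_model / np.sqrt(np.outer(np.diag(cov_model), np.diag(cov_model)))
corr_empirical = np.corrcoef(rois, rowvar=False)

iu = np.triu_indices(n_roi, k=1)
print("model vs empirical correlation match:",
      np.corrcoef(corr_model[iu], corr_empirical[iu])[0, 1].round(3))
```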
Procedia PDF Downloads 151
15591 From the Sharing Economy to Social Manufacturing: Analyzing Collaborative Service Networks in the Manufacturing Domain
Authors: Babak Mohajeri
Abstract:
In recent years, the conventional business model of ownership has shifted towards accessibility in a variety of markets. Two trends can be observed in the evolution of this rental-like business model. The first is the technological development that enables the emergence of new business models, which increasingly become agile and flexible. For example, Spotify, an online music streaming company, provides consumers access to millions of music tracks conveniently through a smartphone, tablet, or computer. Similarly, Car2Go, the car sharing company, provides its members with flexible access to nearby shared cars. The second trend is the increasing communication and connection via social networks. This trend enables a shift to peer-to-peer accessibility-based business models. Conventionally, companies provide their customers with access to the company's own products or services. In the peer-to-peer model, in contrast, companies facilitate access and connections across their customers, so that customers can use property, skills, competencies, or services owned by other customers. This is the so-called sharing economy business model. The aim of this study is to investigate a new and emerging type of sharing economy model in which the roles of customers and service providers may dramatically change. This new model is called Collaborative Service Networks. We propose a mechanism for the Collaborative Service Networks business model. Uber and Airbnb, two successful growing companies, have been selected for our case studies, and their business models are analyzed. Finally, we study the emergence of collaborative service networks in the manufacturing domain. Our findings point to a new manufacturing paradigm called social manufacturing.
Keywords: sharing economy, collaborative service networks, social manufacturing, manufacturing development
Procedia PDF Downloads 317
15590 Volatility Model with Markov Regime Switching to Forecast Baht/USD
Authors: Nop Sopipan
Abstract:
In this paper, we forecast the volatility of the Baht/USD exchange rate using Markov Regime Switching GARCH (MRS-GARCH) models. These models allow volatility to have different dynamics according to unobserved regime variables. The main purpose of this paper is to find out whether MRS-GARCH models are an improvement on GARCH-type models in terms of modeling and forecasting Baht/USD volatility. The MRS-GARCH is the best-performing model for Baht/USD volatility in the short term, but the GARCH model performs best in the long term.
Keywords: volatility, Markov regime switching, forecasting, Baht/USD
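For context, a sketch of the single-regime GARCH(1,1) recursion that MRS-GARCH extends by letting the parameters switch with an unobserved Markov regime; the parameter values are illustrative, and the returns are simulated rather than Baht/USD data.

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance: sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = np.var(returns)                    # common initialisation choice
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(0)
omega, alpha, beta = 1e-6, 0.08, 0.90              # illustrative parameter values

# Simulate a GARCH(1,1) return series
n = 1000
r = np.empty(n)
s2 = omega / (1 - alpha - beta)                    # unconditional variance as start
for t in range(n):
    r[t] = np.sqrt(s2) * rng.standard_normal()
    s2 = omega + alpha * r[t] ** 2 + beta * s2

sigma2 = garch11_variance(r, omega, alpha, beta)
forecast = omega + alpha * r[-1] ** 2 + beta * sigma2[-1]
print("one-step-ahead volatility forecast:", np.sqrt(forecast))
```

A regime-switching version would run one such recursion per regime and weight the forecasts by the filtered regime probabilities.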
Procedia PDF Downloads 302
15589 Implementation of Free-Field Boundary Condition for 2D Site Response Analysis in OpenSees
Authors: M. Eskandarighadi, C. R. McGann
Abstract:
It is observed from past earthquakes that local site conditions can significantly affect the strong ground motion characteristics experienced at a site. One-dimensional seismic site response analysis is the most common approach for investigating site response. This approach assumes that the soil is homogeneous and infinitely extended in the horizontal direction; therefore, tying the side boundaries together is one way to model this behavior, as the wave passage is assumed to be only vertical. However, 1D analysis cannot capture the 2D nature of wave propagation, soil heterogeneity, or 2D soil profiles with features such as inclined layer boundaries. In contrast, 2D seismic site response modeling can consider all of these factors to better understand local site effects on strong ground motions. Because of 2D wave propagation, and because the soil profiles on the two sides of the model may not be identical, a boundary condition is needed on each side that can minimize unwanted reflections from the edges of the model and input appropriate loading conditions. Ideally, the model size should be sufficiently large to minimize wave reflection; however, due to computational limitations, increasing the model size is impractical in some cases. Another approach is to employ free-field boundary conditions that take into account the free-field motion that would exist far from the model domain and apply it to the sides of the model. This research focuses on implementing free-field boundary conditions in OpenSees for 2D site response analysis. Comparisons are made between 1D models and 2D models with various boundary conditions, and details and limitations of the developed free-field boundary modeling approach are discussed.
Keywords: boundary condition, free-field, OpenSees, site response analysis, wave propagation
Procedia PDF Downloads 158
15588 Cuckoo Search Optimization for Black Scholes Option Pricing
Authors: Manas Shah
Abstract:
The Black-Scholes option pricing model is one of the most important concepts in the modern world of computational finance. However, its practical use can be challenging, as one of the input parameters must be estimated: the implied volatility of the underlying security. The more precisely these values are estimated, the more accurate the corresponding estimates of theoretical option prices will be. Here, we present a novel model based on Cuckoo Search Optimization (CS) which finds more precise estimates of implied volatility than Particle Swarm Optimization (PSO) and the Genetic Algorithm (GA).
Keywords: Black-Scholes model, cuckoo search optimization, particle swarm optimization, genetic algorithm
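A simplified sketch of the idea: recover the implied volatility by minimizing the squared gap between the Black-Scholes price and an observed market price with a basic cuckoo-search-style loop (random nests, heavy-tailed moves standing in for true Levy flights, and abandonment of the worst nests). This is not the paper's implementation, and all parameter values are illustrative.

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def implied_vol_cuckoo(market_price, S, K, T, r, n_nests=15, n_iter=200,
                       pa=0.25, bounds=(0.01, 2.0), seed=0):
    """Simplified 1-D cuckoo search over the volatility parameter."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    err = lambda sig: (bs_call(S, K, T, r, sig) - market_price) ** 2
    nests = rng.uniform(lo, hi, n_nests)
    fitness = np.array([err(s) for s in nests])
    for _ in range(n_iter):
        # heavy-tailed (Cauchy) step around each nest as a stand-in for a Levy flight
        trial = np.clip(nests + 0.05 * rng.standard_cauchy(n_nests), lo, hi)
        trial_fit = np.array([err(s) for s in trial])
        better = trial_fit < fitness
        nests[better], fitness[better] = trial[better], trial_fit[better]
        # abandon a fraction pa of the worst nests and rebuild them randomly
        worst = np.argsort(fitness)[-int(pa * n_nests):]
        nests[worst] = rng.uniform(lo, hi, worst.size)
        fitness[worst] = np.array([err(s) for s in nests[worst]])
    return nests[np.argmin(fitness)]

true_sigma = 0.25
price = bs_call(100, 100, 0.5, 0.02, true_sigma)
print("recovered implied vol:", round(float(implied_vol_cuckoo(price, 100, 100, 0.5, 0.02)), 4))
```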
Procedia PDF Downloads 453
15587 Solar Energy Applications in Seawater Distillation
Authors: Yousef Abdulaziz Almolhem
Abstract:
Geographically, most Arab countries are located in arid or semiarid regions. For this reason, most of these countries have adopted seawater desalination as a strategy to overcome this problem. For example, the water supply of the UAE, Kuwait, and Saudi Arabia comes almost 100% from seawater desalination plants. Many areas in Saudi Arabia and other countries in the world suffer from a lack of fresh water, which hinders their development, despite the availability of saline water and high solar radiation intensity. Furthermore, most developing countries do not have sufficient meteorological data to evaluate whether the solar radiation is enough for solar desalination. A mathematical model was developed to simulate and predict the thermal behavior of a solar still that uses direct solar energy for the distillation of seawater. Measurements were taken in the Environment and Natural Resources Department, Faculty of Agricultural and Food Sciences, King Faisal University, Saudi Arabia, in order to evaluate the present model. The simulation results obtained from this model were compared with the measured data. The main results of this research showed that there are slight differences between the measured and predicted values of the elements studied, resulting from changes in some factors considered constant in the model, such as sky clearness, wind velocity, and the salt concentration of the water in the basin of the solar still. It can be concluded that the present model can be used to estimate the average total solar radiation and the thermal behavior of the solar still in any area, with consideration given to the geographical location.
Keywords: mathematical model, sea water, distillation, solar radiation
Procedia PDF Downloads 283
15586 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks
Authors: Wang Yichen, Haruka Yamashita
Abstract:
In recent years, in the field of sports, decision making such as the selection of game members and game strategy, based on the analysis of accumulated sports data, has been widely attempted. In fact, in the NBA basketball league, where the world's highest-level players gather, teams analyze the data using various statistical techniques in order to win games. However, it is difficult to analyze the game data for each play, such as ball tracking or the motion of the players in the game, because the situation of the game changes rapidly and the structure of the data is complicated. Therefore, an analysis method for real-time game play data is proposed. In this research, we propose an analytical model for 'determining the optimal lineup composition' using real-time play data, which is considered to be difficult for all coaches. Because replacing the entire lineup is too complicated, the actual questions for the replacement of players are 'whether or not the lineup should be changed' and 'whether or not a Small Ball lineup should be adopted'. Therefore, we propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, we can accumulate scoring data for each play, which indicates a player's contribution to the game, and the scoring data can be considered as time series data. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, with an NN (Neural Network) model, which can analyze the situation on the field, to build the score prediction model. This model is capable of identifying the current optimal lineup for different situations. In this research, we collected all the accumulated NBA data from the 2019-2020 season. We then apply the method to the actual basketball play data to verify the reliability of the proposed model.
Keywords: recurrent neural network, players lineup, basketball data, decision making model
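A minimal sketch of such an RNN-plus-NN combination in PyTorch: a GRU encodes a short window of play-by-play features, a feed-forward branch encodes static lineup/situation features, and both feed a score-margin head that can be used to compare candidate lineups. The architecture, feature sizes, and training loop are illustrative assumptions, not the authors' model.

```python
import torch
import torch.nn as nn

class LineupScoreModel(nn.Module):
    """GRU over recent play-by-play features + MLP over static lineup features."""
    def __init__(self, play_dim=8, lineup_dim=12, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(play_dim, hidden, batch_first=True)
        self.lineup_net = nn.Sequential(nn.Linear(lineup_dim, hidden), nn.ReLU())
        self.head = nn.Linear(2 * hidden, 1)       # predicted score margin

    def forward(self, plays, lineup):
        _, h = self.rnn(plays)                     # h: (1, batch, hidden)
        z = torch.cat([h.squeeze(0), self.lineup_net(lineup)], dim=1)
        return self.head(z).squeeze(1)

# Toy batch: 16 game situations, 20 recent plays each, random stand-in features
plays = torch.randn(16, 20, 8)
lineup = torch.randn(16, 12)
target = torch.randn(16)

model = LineupScoreModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(5):                                 # a few illustrative training steps
    opt.zero_grad()
    loss = loss_fn(model(plays, lineup), target)
    loss.backward()
    opt.step()

# Compare candidate lineups (e.g. standard vs Small Ball) for the same situation
with torch.no_grad():
    standard, small_ball = torch.randn(1, 12), torch.randn(1, 12)
    situation = torch.randn(1, 20, 8)
    print(model(situation, standard).item(), model(situation, small_ball).item())
```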
Procedia PDF Downloads 133