Search results for: Coulomb modified Glauber model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18294

16104 Text Mining of Twitter Data Using a Latent Dirichlet Allocation Topic Model and Sentiment Analysis

Authors: Sidi Yang, Haiyi Zhang

Abstract:

Twitter is a microblogging platform where millions of users share their attitudes, views, and opinions daily. A probabilistic Latent Dirichlet Allocation (LDA) topic model can discern the most popular topics in Twitter data, providing a computationally efficient way to analyze a large set of tweets. Sentiment analysis complements this by quantifying the emotions expressed in each tweet and summarizing the results in a form that is easily understood. The primary goal of this paper is to explore text mining by extracting and analyzing useful information from unstructured English-language Twitter text using two approaches: LDA topic modelling and sentiment analysis. Together, these methods allow analysts to mine text data more effectively and efficiently, and they can also be applied to provide insight in business and scientific fields.
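
As an illustration of the two approaches, the following minimal Python sketch (not the authors' code; the example tweets, topic count, and choice of libraries are assumptions) fits an LDA topic model with scikit-learn and scores sentiment with NLTK's VADER lexicon:

```python
# Minimal sketch: LDA topic extraction plus lexicon-based sentiment
# scoring on a handful of invented example tweets.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from nltk.sentiment import SentimentIntensityAnalyzer  # needs nltk.download('vader_lexicon')

tweets = [
    "Loving the new phone, battery life is amazing",
    "Terrible service today, really disappointed",
    "Election results are trending everywhere tonight",
]

# Bag-of-words representation of the corpus.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(tweets)

# Fit a small LDA model and print the top words per topic.
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:]]
    print(f"topic {k}: {top}")

# Per-tweet sentiment via the VADER lexicon.
sia = SentimentIntensityAnalyzer()
for t in tweets:
    print(t, sia.polarity_scores(t)["compound"])
```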

Keywords: text mining, Twitter, topic model, sentiment analysis

Procedia PDF Downloads 157
16103 Analysis on the Need of Engineering Drawing and Feasibility Study on 3D Model Based Engineering Implementation

Authors: Parthasarathy J., Ramshankar C. S.

Abstract:

Engineering drawings play an important role in every part of an industry and are influential throughout every phase of the product development process. Traditionally, drawings have been the primary medium of communication in industry because they are the clearest way to represent product manufacturing information. Until recently, manufacturing activities were driven by engineering data captured in 2D paper documents or digital representations of those documents. However, this document-based approach is prone to errors and requires costly re-entry of data at every stage of the manufacturing life cycle. There is therefore a case for eliminating engineering drawings from the product development process and implementing 3D Model Based Engineering (3D MBE), also referred to as Model Based Definition (3D MBD). Adopting MBD appears to be the next logical step in reducing time-to-market and improving product quality: ideally, by fully applying the MBD concept, the product definition no longer relies on engineering drawings at any point in the product lifecycle. This project examines the role of engineering drawings in various parts of an industry, the case for implementing 3D Model Based Engineering, its advantages, and the technical barriers that must be overcome to implement it. It also addresses the requirement for neutral file formats that realise the digital product definition principles in a lightweight form. To prove the concepts of 3D Model Based Engineering, the approach is demonstrated on a screw jack body part. At ZF Windpower Coimbatore Limited, 3D Model Based Definition has been implemented for the Torque Arm (machining and casting), Steel tube, Pinion shaft, Cover, and Energy tube.

Keywords: engineering drawing, model-based engineering (MBE), model-based definition (MBD), CAD

Procedia PDF Downloads 412
16102 A Bi-Objective Model to Address Simultaneous Formulation of Project Scheduling and Material Ordering

Authors: Babak H. Tabrizi, Seyed Farid Ghaderi

Abstract:

Concurrent planning of project scheduling and material ordering has been increasingly addressed in recent decades as an approach to reducing project execution costs. We therefore consider this problem here, aiming to maximize the quality robustness of schedules while minimizing the relevant costs. A bi-objective mathematical model is developed to formulate the problem, in which all-unit discounts can be utilized for material purchasing. The problem is then solved by the ε-constraint method, and the Pareto front is obtained for a variety of robustness values. Finally, the applicability and efficiency of the proposed model are tested on different numerical instances.
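
For readers unfamiliar with the ε-constraint method, the sketch below illustrates the idea on a toy bi-objective linear program (purely illustrative; the paper's project-scheduling and material-ordering formulation is not reproduced here): one objective is minimized while the other is bounded by a sweep of ε values, tracing out Pareto points.

```python
# Epsilon-constraint sweep on a toy bi-objective LP (assumed example).
import numpy as np
from scipy.optimize import linprog

c1 = np.array([2.0, 1.0])        # first objective (e.g., cost)
c2 = np.array([1.0, 3.0])        # second objective, to be epsilon-constrained
A_ub = [[-1.0, -1.0]]            # feasibility: x1 + x2 >= 4
b_ub = [-4.0]

pareto = []
for eps in np.linspace(4.0, 12.0, 5):          # sweep the epsilon bound
    res = linprog(c1,
                  A_ub=A_ub + [list(c2)],      # add c2 . x <= eps
                  b_ub=b_ub + [eps],
                  bounds=[(0, None), (0, None)])
    if res.success:
        pareto.append((float(c1 @ res.x), float(c2 @ res.x)))

print(pareto)   # approximate Pareto front points
```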

Keywords: e-constraint method, material ordering, project management, project scheduling

Procedia PDF Downloads 278
16101 Estimation of Soil Moisture at High Resolution through Integration of Optical and Microwave Remote Sensing and Applications in Drought Analyses

Authors: Donglian Sun, Yu Li, Paul Houser, Xiwu Zhan

Abstract:

California experienced severe drought conditions in recent years. In this study, the drought conditions in California are analyzed using soil moisture anomalies derived from integrated optical and microwave satellite observations along with auxiliary land surface data. Based on the U.S. Drought Monitor (USDM) classifications, three typical conditions were selected for the analysis: extreme drought (2007 and 2013), severe drought (2004 and 2009), and normal conditions (2005 and 2006). Drought is defined as a negative soil moisture anomaly. To estimate soil moisture at high spatial resolution, three approaches are explored in this study: the universal triangle model, which estimates soil moisture from the Normalized Difference Vegetation Index (NDVI) and Land Surface Temperature (LST); a basic model, which adds auxiliary data such as precipitation, soil texture, topography, and surface type; and a refined model, which uses accumulated precipitation and its lagging effects. The basic model shows better agreement with the USDM classifications than the universal triangle model, while the refined model, using precipitation accumulated from the previous summer to the current time, demonstrates the closest agreement with the USDM patterns.
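
A minimal sketch of the universal triangle idea follows, regressing soil moisture on polynomial terms of NDVI and LST; the data and coefficients below are synthetic stand-ins, not the paper's retrievals:

```python
# Universal-triangle-style regression on synthetic NDVI/LST data.
import numpy as np

rng = np.random.default_rng(0)
ndvi = rng.uniform(0.1, 0.8, 200)
lst = rng.uniform(290, 320, 200)               # land surface temperature, K
sm_true = 0.4 - 0.3 * ndvi - 0.004 * (lst - 300) + rng.normal(0, 0.01, 200)

# Design matrix with terms NDVI^i * LST^j, i + j <= 2.
X = np.column_stack([np.ones_like(ndvi), ndvi, lst,
                     ndvi**2, lst**2, ndvi * lst])
coef, *_ = np.linalg.lstsq(X, sm_true, rcond=None)
sm_hat = X @ coef
print("RMSE:", float(np.sqrt(np.mean((sm_hat - sm_true) ** 2))))
```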

Keywords: soil moisture, high resolution, regional drought, analysis and monitoring

Procedia PDF Downloads 117
16100 Cadmium Telluride Quantum Dots (CdTe QDs)-Thymine Conjugate Based Fluorescence Biosensor for Sensitive Determination of Nucleobases/Nucleosides

Authors: Lucja Rodzik, Joanna Lewandowska-Lancucka, Michal Szuwarzynski, Krzysztof Szczubialka, Maria Nowakowska

Abstract:

The analysis of nucleobases is of great importance for bioscience, since their abnormal concentration in body fluids suggests deficiency and mutation of the immune system and is considered an important parameter for the diagnosis of various diseases. The conjugate presented here meets the need for an effective, selective, and highly sensitive sensor for nucleobase/nucleoside detection. A novel, highly fluorescent conjugate of cadmium telluride quantum dots (CdTe QDs) functionalized with thymine and stabilized with thioglycolic acid (TGA) has been developed and thoroughly characterized. Successful formation of the material was confirmed by elemental analysis and by UV-Vis, fluorescence, and FTIR spectroscopies. The crystalline structure of the product was characterized with the X-ray diffraction (XRD) method, and the composition of the CdTe QDs and their thymine conjugate was examined using X-ray photoelectron spectroscopy (XPS). The size of the CdTe-thymine conjugate was 3-6 nm, as demonstrated by atomic force microscopy (AFM) and high-resolution transmission electron microscopy (HRTEM) imaging. A fluorescence band at 540 nm on excitation at 351 nm was observed for these nanoparticles; its intensity increased with the amount of conjugated thymine, with no shift in position. Based on the fluorescence measurements, the CdTe-thymine conjugate interacted efficiently and selectively not only with adenine, the nucleobase complementary to thymine, but also with adenine-containing nucleosides and modified nucleosides, i.e., 5′-deoxy-5′-(methylthio)adenosine (MTA) and 2′-O-methyladenosine, urinary tumor markers that allow monitoring of disease progression. The applicability of the CdTe-thymine sensor to real samples was also investigated under simulated urine conditions. The high sensitivity and selectivity of the CdTe-thymine fluorescence towards adenine, adenosine, and modified adenosine suggest that the obtained conjugate can be useful in developing a biosensor for complementary nucleobase/nucleoside detection.

Keywords: CdTe quantum dots, conjugate, sensor, thymine

Procedia PDF Downloads 390
16099 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model

Authors: Chaudhuri Manoj Kumar Swain, Susmita Das

Abstract:

This paper presents a detailed procedure for predicting a path loss (PL) model and its application in estimating the coverage probability of a WiMAX network. A hybrid approach is followed to predict an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. The approach comprises three phases, data collection, statistical analysis, and regression analysis, and the importance of each phase is discussed. The procedure for collecting data such as the received signal strength indicator (RSSI) through an experimental set-up is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with regression techniques. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent as well as the coverage probability of the network are evaluated. This work may significantly assist in the deployment and optimisation of cellular networks.
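
As a sketch of the regression phase, the snippet below fits a log-distance path loss model to synthetic measurements and recovers the PL exponent; the model form and the data are assumptions, not the paper's drive-test measurements:

```python
# Estimate the path loss exponent n by linear regression of PL on
# 10*log10(distance), using synthetic shadowed measurements.
import numpy as np

rng = np.random.default_rng(1)
d = rng.uniform(50, 1000, 100)                 # distances in metres
pl0, n_true, d0 = 40.0, 3.2, 1.0               # reference PL and true exponent
pl = pl0 + 10 * n_true * np.log10(d / d0) + rng.normal(0, 4, d.size)

# Regression model: PL = pl0 + n * [10 log10(d/d0)].
A = np.column_stack([np.ones_like(d), 10 * np.log10(d / d0)])
(beta0, n_hat), *_ = np.linalg.lstsq(A, pl, rcond=None)
print(f"intercept = {beta0:.1f} dB, path loss exponent n = {n_hat:.2f}")
```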

Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis

Procedia PDF Downloads 152
16098 Causes of Variation Orders in the Egyptian Construction Industry: Time and Cost Impacts

Authors: A. Samer Ezeldin, Jwanda M. El Sarag

Abstract:

Variation orders are of great importance in any construction project. A variation order is defined as any change in the scope of works of a project, whether an addition, omission, or modification. This paper investigates the variation orders that occur during construction projects in Egypt. The literature review compares the causes of variation orders in Egypt, Tanzania, Nigeria, Malaysia, and the United Kingdom, and classifies the events that lead to variation orders into owner-related factors, consultant-related factors, and other factors. These classified events were introduced in a survey of 19 events to observe their frequency of occurrence and their time and cost impacts. The survey data were obtained from 87 participants, including clients, consultants, and contractors, and a database of 42 scenarios was created. A model was then developed to help project managers predict the frequency of variations, budget for any additional costs, and minimize any delays. Two experts, each with more than 25 years of experience, reviewed the model to verify that it works effectively, and the model was validated on a residential compound completed in July 2016 to show that it produces acceptable results.

Keywords: construction, cost impact, Egypt, time impact, variation orders

Procedia PDF Downloads 159
16097 Analysis of a Coupled Hydro-Sedimentological Numerical Model for the Western Tombolo of Giens

Authors: Yves Lacroix, Van Van Than, Didier Léandri, Pierre Liardet

Abstract:

The western tombolo of the Giens peninsula in southern France, known as Almanarre beach, is subject to coastal erosion. We use computer simulation to propose solutions to stop this erosion. Our aim was first to determine the main factors driving the erosion and then to apply a coupled hydro-sedimentological numerical model based on observations and measurements performed on the site over decades. We gathered all available information and data about waves, winds, currents, tides, bathymetry, the coastline, and sediments at the site. These were divided into two sets: one devoted to calibrating a numerical model using the Mike 21 software, the other serving as a reference for numerically comparing the present situation with what it could be if different types of underwater constructions were implemented. This paper presents the first part of the study: selecting and merging different sources into a coherent data basis, identifying the main erosion factors, and calibrating the coupled model against the selected reference period. The calibration of the numerical model yields good fitting coefficients. The results also show that winter south-western storm events combined with depressive weather conditions constitute a major erosion factor, mainly through wave impact in the northern part of Almanarre beach. By comparison, the combined impact of current and wind is negligible.

Keywords: Almanarre beach, coastal erosion, hydro-sedimentological, numerical model

Procedia PDF Downloads 361
16096 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models

Authors: Rossella Arcucci, Luisa D'Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti

Abstract:

This work is the first step in a wider research activity, in collaboration with the Euro-Mediterranean Center on Climate Change, aimed at introducing scalable approaches into Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source-code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.

Keywords: data assimilation, GPU architectures, ocean models, parallel algorithm

Procedia PDF Downloads 392
16095 Giftedness Cloud Model: A Psychological and Ecological Vision of Giftedness Concept

Authors: Rimeyah H. S. Almutairi, Alaa Eldin A. Ayoub

Abstract:

The aim of this study was to identify empirical and theoretical studies exploring giftedness theories and identification, in order to assess and synthesize the mechanisms, outcomes, and impacts of gifted identification models. We sought to provide an evidence-informed answer to how current giftedness theories work and how effective they are, with the goal of developing a model that incorporates the advantages of existing models while avoiding their disadvantages as far as possible. We conducted a systematic literature review (SLR), and the disciplined analysis resulted in a final sample of 30 appropriate studies. The results indicated that: (a) there is no uniform and consistent definition of giftedness; (b) researchers use several inconsistent criteria to identify the gifted; and (c) the identification of talent is largely limited to early ages, with obvious neglect of adults. This study contributes the Giftedness Cloud Model (GCM), defined as a model that attempts to interpret giftedness within an interactive psychological and ecological framework. The GCM aims to help a talented person reach the core of giftedness and manifest talent in creative productivity or invention. Besides that, the GCM suggests classifying giftedness into four levels: mastery, excellence, creative productivity, and manifestation. In addition, it presents an idea for distinguishing between talent and giftedness.

Keywords: giftedness cloud model, talent, systematic literature review, giftedness concept

Procedia PDF Downloads 150
16094 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton

Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani

Abstract:

Extreme values in a data set can occur due to unusual circumstances during observation. Such data can provide important information that other data cannot, so their existence needs further investigation. One method for obtaining extreme data is the block maxima method, and the distribution of extreme data sets taken with this method is called an extreme value distribution; here it is the Gumbel distribution with two parameters. The exact maximum-likelihood (ML) parameter estimates of the Gumbel distribution are difficult to determine analytically, so a numerical approach is necessary. The purpose of this study was to estimate the Gumbel distribution parameters with the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method, a numerical method for unconstrained nonlinear optimization that is suitable here because the Gumbel distribution function has a double-exponential form. The quasi-Newton BFGS method is a development of Newton's method, which uses the second derivative to calculate the change in parameter values at each iteration. Newton's method can be modified by adding a step length to guarantee convergence when the second derivative requires complex calculations; in the quasi-Newton BFGS method, the derivative information is instead updated at each iteration. The parameter estimation of the Gumbel distribution by this numerical approach proceeds by finding the parameter values that maximize the likelihood, which requires the gradient vector and the Hessian matrix. This research combines theory and application, drawing on several journals and textbooks, and yields the quasi-Newton BFGS algorithm together with estimates of the Gumbel distribution parameters. The estimation method is then applied to daily rainfall data from Purworejo District to estimate the distribution parameters. The results indicate that the intensity of the high rainfall that occurred in Purworejo District decreased and that the range of rainfall narrowed.
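
A minimal sketch of the estimation step follows, maximizing the Gumbel log-likelihood with SciPy's BFGS routine on synthetic data; the Purworejo rainfall series is not reproduced here:

```python
# Maximum-likelihood fit of a two-parameter Gumbel distribution using
# the BFGS quasi-Newton optimizer on synthetic block-maxima data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

x = gumbel_r.rvs(loc=50.0, scale=12.0, size=500, random_state=0)

def nll(theta):
    """Negative log-likelihood; the scale stays positive via log-param."""
    mu, log_beta = theta
    beta = np.exp(log_beta)
    z = (x - mu) / beta
    return np.sum(np.log(beta) + z + np.exp(-z))

res = minimize(nll, x0=[x.mean(), np.log(x.std())], method="BFGS")
mu_hat, beta_hat = res.x[0], float(np.exp(res.x[1]))
print(f"location = {mu_hat:.2f}, scale = {beta_hat:.2f}")
```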

Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton

Procedia PDF Downloads 304
16093 Finite Element Simulation of RC Exterior Beam-Column Joints Using Damage Plasticity Model

Authors: A. M. Halahla, M. H. Baluch, M. K. Rahman, A. H. Al-Gadhib, M. N. Akhtar

Abstract:

In the present study, a 3D simulation of a typical exterior reinforced concrete (RC) beam-column joint (BCJ) strengthened with carbon fiber-reinforced plastic (CFRP) sheet is carried out. Numerical investigations are performed using nonlinear finite element (FE) analysis incorporating the concrete damage plasticity (CDP) model in the commercial FE software ABAQUS: the concrete response is modeled in compression and with tension softening, the reinforcing steel as linear plastic with isotropic hardening, and the CFRP sheets as a linear elastic lamina material. The numerical models developed in the present study are validated against results obtained from experiments under monotonic loading using a hydraulic jack in displacement-control mode. The experimental program included casting deficient BCJs loaded to failure for both un-strengthened and strengthened cases. The failure mode and deformation response of CFRP-strengthened and un-strengthened joints and the propagation of damage in the components of the BCJ are discussed. The finite element simulations are compared with the experimental results and yield reasonable agreement: the damage plasticity model captures the ultimate load and the mode of failure of the beam-column joint with good accuracy.

Keywords: reinforced concrete, exterior beam-column joints, concrete damage plasticity model, computational simulation, 3-D finite element model

Procedia PDF Downloads 358
16092 An Interlock Model of Friction and Superlubricity

Authors: Azadeh Malekan, Shahin Rouhani

Abstract:

Superlubricity is a phenomenon in which two surfaces in contact show negligible friction; this may occur because the asperities of the two surfaces do not interlock. Two rough surfaces pressed against each other can settle into a configuration in which the summits of the asperities of one surface lock into the valleys of the other, and the amount of interlock depends on the geometry of the two surfaces. We suggest that the friction force is proportional to the amount of interlock, which explains superlubricity as the situation where there is little interlock. The friction force is then directly proportional to the normal force, since it is related to the work necessary to lift the upper surface clear of the interlock. To investigate this model, we simulate the contact of two surfaces. To validate the model, we first examine Amontons' law: assuming that asperities retain their deformations on the time scale over which the top asperity moves across the lattice spacing, Amontons' law is recovered. Structural superlubricity is examined under the hypothesis that the surfaces are very rigid and the asperities do not deform, which may happen at small normal forces. When two identical surfaces come into contact and the top surface is rotated, we observe a peak in the friction force near the orientation angle at which the two surfaces can interlock.

Keywords: friction, Amontons' law, superlubricity, contact model

Procedia PDF Downloads 132
16091 Hydraulic Analysis of Irrigation Approach Channel Using HEC-RAS Model

Authors: Muluegziabher Semagne Mekonnen

Abstract:

This study presents the irrigation water requirements and a steady-state hydraulic evaluation of the canals of the Meki-Ziway irrigation project, with the aim of improving scheme performance. The CROPWAT 8.0 model was used to estimate the irrigation water requirements of the five major crops irrigated in the study area. The results showed that for the existing and potential irrigation development areas of 2000 ha and 2599 ha, the crop water requirements were 3,339,200 and 4,339,090.4 m³, respectively. Hydraulic simulation models are fundamental tools for understanding the hydraulic flow characteristics of irrigation systems, and in this study the HEC-RAS model was applied to the canals of the Meki-Ziway irrigation scheme. The model was tested in terms of error estimation and used to determine the potential canal capacity.

Keywords: HEC-RAS, irrigation, hydraulics, canal reach, capacity

Procedia PDF Downloads 39
16090 Optimal Hedging of a Portfolio of European Options in an Extended Binomial Model under Proportional Transaction Costs

Authors: Norm Josephy, Lucy Kimball, Victoria Steblovskaya

Abstract:

Hedging of a portfolio of European options under proportional transaction costs is considered. Our discrete-time financial market model extends the binomial market model with transaction costs to the case where the underlying stock price ratios are distributed over a bounded interval rather than over a two-point set. An optimal hedging strategy is chosen from a set of admissible non-self-financing hedging strategies. Our approach to optimal hedging of a portfolio of options is based on a theoretical foundation that includes the determination of a no-arbitrage option price interval as well as properties of the non-self-financing strategies and their residuals. A computational algorithm for optimizing an investor-relevant criterion over the set of admissible non-self-financing hedging strategies is developed. The applicability of our approach is demonstrated using both simulated data and real market data.
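
For reference, the sketch below prices a European call in the standard two-point binomial model, the baseline that the paper extends to interval-distributed price ratios and transaction costs; all parameters are illustrative:

```python
# European call in the n-step Cox-Ross-Rubinstein binomial model
# (the classical two-point baseline, not the paper's extended model).
import numpy as np
from math import comb

def crr_call(S0, K, r, u, d, n):
    q = (np.exp(r) - d) / (u - d)                 # risk-neutral up-probability
    j = np.arange(n + 1)
    ST = S0 * u**j * d**(n - j)                   # terminal stock prices
    probs = np.array([comb(n, int(k)) for k in j]) * q**j * (1 - q)**(n - j)
    return np.exp(-r * n) * np.sum(probs * np.maximum(ST - K, 0.0))

print(crr_call(S0=100.0, K=100.0, r=0.001, u=1.01, d=0.99, n=50))
```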

Keywords: extended binomial model, non-self-financing hedging, optimization, proportional transaction costs

Procedia PDF Downloads 237
16089 Proposing a Failure Criterion for Cohesionless Media Considering Cyclic Fabric Anisotropy

Authors: Ali Noorzad, Ehsan Badakhshan, Shima Zameni

Abstract:

This paper presents a generalized failure criterion for geomaterials with cross-anisotropy. The cyclic behavior of a granular material depends primarily on the nature and arrangement of its constituent particles and on the particle size and shape, which determine the fabric anisotropy. To account for the influence of loading direction on strength variations, an anisotropic variable expressed in terms of the invariants of the stress and fabric tensors is introduced into the failure criterion. In an extension of the original CANAsand constitutive model, two concepts, the critical state and the compact state, play paramount roles, as all of the moduli and coefficients are related to these states. The applicability of the model is evaluated through comparisons between predicted and measured results. All simulations demonstrate that the proposed constitutive model is capable of modeling the cyclic behavior of sand with inherent anisotropy.

Keywords: fabric, cohesionless media, cyclic loading, critical state, compact state, CANAsand constitutive model

Procedia PDF Downloads 200
16088 Modeling Water Inequality and Water Security: The Role of Water Governance

Authors: Pius Babuna, Xiaohua Yang, Roberto Xavier Supe Tulcan, Bian Dehui, Mohammed Takase, Bismarck Yelfogle Guba, Chuanliang Han, Doris Abra Awudi, Meishui Lia

Abstract:

Water inequality, water security, and water governance are fundamental parameters affecting the sustainable use of water resources. Through policy formulation and decision-making, water governance determines both water security and water inequality. Where water inequality exists, water security is largely undermined through unsustainable water use practices that lead to pollution of water resources, conflicts, hoarding of water, and poor sanitation. The interconnectedness of water governance, water inequality, and water security has, however, not been investigated previously. This study modified the Gini coefficient and used a Logistic Growth of Water Resources (LGWR) model to assess water inequality and water security mathematically, and discusses the connecting role of water governance. We tested the validity of both models by calculating the actual water inequality and water security of Ghana, and we discuss the implications of water inequality for water security and the overarching role of water governance. The results show that regional water inequality is widespread in some parts of the country: the Volta region shows the highest water inequality (Gini index of 0.58), while the Central region shows the lowest (Gini index of 0.15). Water security is moderately sustainable, and the use of water resources is currently stress-free; this status is estimated to hold until 2132 ± 18, when Ghana will consume half of its current total water resources of 53.2 billion cubic meters. Effectively, water inequality is a threat to water security: it results in poverty and under-development, heightens tensions in water use, and causes instability. With proper water governance, water inequality can be eliminated by formulating and implementing approaches that engender equal allocation and sustainable use of water resources.
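
For concreteness, the snippet below computes a standard (unmodified) Gini coefficient from a hypothetical regional water-supply vector; the paper's modified coefficient and the LGWR model are not reproduced here:

```python
# Standard Gini coefficient via the Lorenz-curve formula.
import numpy as np

def gini(x):
    """Gini index of a non-negative 1-D array (0 = perfect equality)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

regional_water = [1.2, 0.8, 3.5, 0.4, 2.1]   # hypothetical per-capita supply
print(round(gini(regional_water), 2))
```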

Keywords: water inequality, water security, water governance, Gini coefficient, Moran index, water resources management

Procedia PDF Downloads 109
16087 An Overview of Domain Models of Urban Quantitative Analysis

Authors: Mohan Li

Abstract:

Intelligent research technologies are becoming more important than traditional methods in urban research, and their share will increase greatly over the next few decades. Much of this analytical work cannot be carried out without some software engineering knowledge, and domain models become necessary when applying that knowledge to urban work. In many urban planning projects, building rational models, feeding them reliable data, and providing sufficient computation all provide indispensable assistance in producing good urban planning, and throughout the work process domain models can optimize workflow design. Human beings have now entered the era of big data: the amount of digital data generated by cities every day increases at an exponential rate, and new data forms are constantly emerging. Selecting a suitable data set from this massive amount of data and managing and processing it has become an ability that more and more planners and urban researchers need to possess. This paper summarizes, and makes predictions about, the technologies and technological iterations that may affect urban research in the future, helping to discover urban problems and implement targeted sustainable urban strategies. These are summarized into seven domain models: the urban and rural regional domain model, urban ecological domain model, urban industry domain model, development dynamics domain model, urban social and cultural domain model, urban traffic domain model, and urban space domain model. These seven domain models can guide the construction of systematic urban research topics and help researchers organize a series of intelligent analytical tools, such as Python, R, and GIS, making full use of quantitative spatial analysis, machine learning, and other technologies to achieve higher efficiency and accuracy in urban research and to assist people in making reasonable decisions.

Keywords: big data, domain model, urban planning, urban quantitative analysis, machine learning, workflow design

Procedia PDF Downloads 160
16086 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory

Authors: Chiung-Hui Chen

Abstract:

The Internet of Things (IoT) was designed for widespread convenience. With smart tags and a sensing network, a large quantity of dynamic information is immediately available in the IoT, and through internal communication and interaction, meaningful objects provide real-time services for users. Services with appropriate decision-making have therefore become an essential issue. Based on the science of human behavior, this study employs an environment model to record the time sequences and locations of different behaviors and adopts the probability module of the hierarchical Hidden Markov Model for inference. Statistical analysis is conducted with three objectives. First, define user behaviors and predict user behavior routes with the environment model in order to analyze user purposes. Second, construct the hierarchical Hidden Markov Model according to the logic framework and establish the sequential intensity among behaviors, so as to understand the use and activity fabric of the intelligent environment. Third, establish the intensity of the relation between objects and the probability of their being used; this indicator can describe the possible limitations of the mechanism. As the process is recorded in the information system created in this study, these data can be reused to adjust the procedure of intelligent design services.
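
As background to the probability module, the sketch below runs a plain (non-hierarchical) HMM forward pass over a short sequence of sensed object uses; the states, observations, and probabilities are invented for illustration, and the paper's hierarchical structure is not reproduced:

```python
# Forward-algorithm likelihood of an observed activity sequence under
# a toy two-state HMM (invented parameters).
import numpy as np

A = np.array([[0.7, 0.3],            # hidden states: 0 = resting, 1 = active
              [0.4, 0.6]])
B = np.array([[0.8, 0.1, 0.1],       # P(observation | state)
              [0.1, 0.5, 0.4]])      # observations: 0 = sofa, 1 = kitchen, 2 = door
pi = np.array([0.6, 0.4])            # initial state distribution

obs = [0, 1, 2, 1]                   # a sensed sequence of object uses
alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]    # forward recursion
print("sequence likelihood:", float(alpha.sum()))
```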

Keywords: behavior, big data, hierarchical hidden Markov model, intelligent object

Procedia PDF Downloads 217
16085 Leverage Effect for Volatility with Generalized Laplace Error

Authors: Farrukh Javed, Krzysztof Podgórski

Abstract:

We propose a new model that accounts for the asymmetric response of volatility to positive ('good news') and negative ('bad news') shocks in economic time series, the so-called leverage effect. In the past, asymmetric powers of errors in conditionally heteroskedastic models have been used to capture this effect. Our model uses the gamma-difference representation of the generalized Laplace distributions, which efficiently models the asymmetry. It has one additional natural parameter, the shape, that is used instead of the power in asymmetric power models to capture the strength of a long-lasting effect of shocks. Some fundamental properties of the model are provided, including the formula for covariances and an explicit form for the conditional distribution of the 'bad' and 'good' news processes given the past, a property that is important for the statistical fitting of the model. Relevant features of volatility models are illustrated using S&P 500 historical data.
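
A minimal sketch of the gamma-difference representation: a generalized asymmetric Laplace sample is generated as the difference of two independent gamma variates sharing the shape parameter (the parameter values below are assumptions, not fitted values):

```python
# Gamma-difference construction of a generalized asymmetric Laplace sample.
import numpy as np

rng = np.random.default_rng(0)
tau, k1, k2 = 0.8, 1.5, 1.0        # shape and the two gamma scales (assumed)
g1 = rng.gamma(shape=tau, scale=k1, size=100_000)
g2 = rng.gamma(shape=tau, scale=k2, size=100_000)
x = g1 - g2                        # generalized asymmetric Laplace variate

m = x - x.mean()
print("variance:", round(float(x.var()), 3),
      "skewness:", round(float((m**3).mean() / x.std()**3), 3))
```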

Keywords: heavy tails, volatility clustering, generalized asymmetric Laplace distribution, leverage effect, conditional heteroskedasticity, asymmetric power volatility, GARCH models

Procedia PDF Downloads 369
16084 Analysis and Optimized Design of a Packaged Liquid Chiller

Authors: Saeed Farivar, Mohsen Kahrom

Abstract:

The purpose of this work is to develop a physical simulation model for studying the effect of various design parameters on the performance of packaged liquid chillers. This paper presents a steady-state model for predicting the performance of a packaged liquid chiller over a wide range of operating conditions. The model inputs are the inlet conditions and geometry; the outputs include system performance variables such as power consumption, coefficient of performance (COP), and the states of the refrigerant through the refrigeration cycle. A computer model that simulates the steady-state cyclic performance of a vapor compression chiller is developed for performing detailed physical design analysis of actual industrial chillers; it can be used for design optimization and for detailed energy-efficiency analysis of packaged liquid chillers. The simulation model accounts for all chiller components, namely the compressor, the shell-and-tube condenser and evaporator heat exchangers, the thermostatic expansion valve, and the connecting pipes and tubing, by thermo-hydraulic modeling of the heat transfer, fluid flow, and thermodynamic processes in each. To verify the validity of the developed model, a 7.5 USRT packaged liquid chiller is used, and a laboratory test stand for bringing the chiller to its standard steady-state performance condition is built. Experimental results obtained from testing the chiller under various load and temperature conditions are shown to be in good agreement with those obtained from the computer prediction model. An entropy-minimization-based optimization analysis is then performed on the basis of the developed analytical performance model. The variation of design parameters in the construction of the shell-and-tube condenser and evaporator heat exchangers is studied using the developed models, and a best-match condition between the physical design of the chiller heat exchangers and its compressor is found to exist. Manufacturers of chillers and research organizations interested in energy-efficient design and analysis of compression chillers may take advantage of this study and its results.

Keywords: optimization, packaged liquid chiller, performance, simulation

Procedia PDF Downloads 258
16083 Replicating Brain’s Resting State Functional Connectivity Network Using a Multi-Factor Hub-Based Model

Authors: B. L. Ho, L. Shi, D. F. Wang, V. C. T. Mok

Abstract:

The brain's functional connectivity, while temporally non-stationary, is consistent at a macro spatial level. The study of stable resting-state connectivity patterns therefore provides opportunities for identifying diseases in which such stability is severely perturbed. A mathematical model replicating the brain's spatial connections is useful for understanding the brain's representative geometry and complements the empirical model where it falls short: empirical computations tend to involve large matrices and become infeasible with fine parcellation, whereas the proposed analytical model has no such computational problems. To improve replicability, data from 92 subjects were obtained from two open sources. The proposed methodology, inspired by financial theory, uses multivariate regression to relate every cortical region of interest (ROI) to a set of pre-identified hubs, which act as representatives of the entire cortical surface. A variance-covariance framework over all ROIs is then built from these relationships to link up the ROIs. The result is a high level of agreement between model and empirical correlations, in the range of 0.59 to 0.66 after adjusting for sample size, an increase of almost forty percent. More significantly, the model framework provides an intuitive way to separate systemic drivers from idiosyncratic noise while reducing the dimensionality by more than 30-fold, thereby enabling attribution analysis. Due to its analytical nature and simple structure, the model is useful as a standalone toolkit for network dependency analysis or as a module within other mathematical models.
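
The sketch below illustrates the hub-based construction on synthetic data: each ROI series is regressed on a few hub series, and the model covariance is rebuilt as a hub-driven part plus idiosyncratic noise. The dimensions and data are invented, not the open-source fMRI data used in the paper:

```python
# Hub-regression covariance model on synthetic ROI time series.
import numpy as np

rng = np.random.default_rng(7)
T, n_hubs, n_roi = 300, 5, 60
hubs = rng.standard_normal((T, n_hubs))          # hub time series
B_true = rng.standard_normal((n_hubs, n_roi))
roi = hubs @ B_true + 0.5 * rng.standard_normal((T, n_roi))

# Multivariate regression of every ROI on the hubs.
B_hat, *_ = np.linalg.lstsq(hubs, roi, rcond=None)
resid_var = np.var(roi - hubs @ B_hat, axis=0)

# Model covariance: systematic (hub-driven) part + idiosyncratic noise.
cov_model = B_hat.T @ np.cov(hubs, rowvar=False) @ B_hat + np.diag(resid_var)
cov_emp = np.cov(roi, rowvar=False)
print("corr(model, empirical):",
      float(np.corrcoef(cov_model.ravel(), cov_emp.ravel())[0, 1]))
```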

Keywords: functional magnetic resonance imaging, multivariate regression, network hubs, resting state functional connectivity

Procedia PDF Downloads 134
16082 From the Sharing Economy to Social Manufacturing: Analyzing Collaborative Service Networks in the Manufacturing Domain

Authors: Babak Mohajeri

Abstract:

In recent years, the conventional ownership-based business model has shifted towards access-based models in a variety of markets. Two trends can be observed in the evolution of this rental-like business model. The first is the technological development that enables the emergence of new, increasingly agile and flexible business models: for example, Spotify, an online music streaming company, gives consumers access to millions of music tracks through a smartphone, tablet, or computer, and Car2Go, a car sharing company, offers its members flexible access to nearby shared cars. The second trend is the growth of communication and connection via social networks, which enables a shift to peer-to-peer access-based business models. Conventionally, companies provide customers access to the company's own products or services; in the peer-to-peer model, companies instead facilitate access and connections among customers so that they can use property, skills, competencies, or services owned by other customers. This is the so-called sharing economy business model. The aim of this study is to investigate a new and emerging type of sharing economy model in which the roles of customers and service providers may change dramatically. This new model is called Collaborative Service Networks, and we propose a mechanism for its business model. Uber and Airbnb, two successful, growing companies, are selected as case studies and their business models are analyzed. Finally, we study the emergence of collaborative service networks in the manufacturing domain; our findings point to a new manufacturing paradigm called social manufacturing.

Keywords: sharing economy, collaborative service networks, social manufacturing, manufacturing development

Procedia PDF Downloads 296
16081 Volatility Model with Markov Regime Switching to Forecast Baht/USD

Authors: Nop Sopipan

Abstract:

In this paper, we forecast the volatility of the Baht/USD exchange rate using Markov Regime Switching GARCH (MRS-GARCH) models, which allow volatility to have different dynamics according to unobserved regime variables. The main purpose of this paper is to determine whether MRS-GARCH models improve on GARCH-type models in modeling and forecasting Baht/USD volatility. We find that the MRS-GARCH model performs best for Baht/USD volatility in the short term, whereas the GARCH model performs best in the long term.
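
A minimal sketch of the regime-switching idea follows, simulating a two-regime Markov-switching GARCH(1,1) return series; the parameters are illustrative, not those fitted to Baht/USD in the paper:

```python
# Simulate a two-regime Markov-switching GARCH(1,1) return series.
import numpy as np

rng = np.random.default_rng(42)
P = np.array([[0.98, 0.02],          # regime transition probabilities
              [0.05, 0.95]])
omega = [0.01, 0.20]                 # regime-specific GARCH parameters
alpha = [0.05, 0.15]
beta = [0.90, 0.70]

T, s, h = 1000, 0, 0.1               # horizon, initial regime, initial variance
r = np.zeros(T)
for t in range(T):
    s = rng.choice(2, p=P[s])                        # draw next regime
    prev_sq = r[t - 1] ** 2 if t > 0 else 0.0
    h = omega[s] + alpha[s] * prev_sq + beta[s] * h  # conditional variance
    r[t] = np.sqrt(h) * rng.standard_normal()        # return draw

# Regime switching typically produces the fat tails seen in FX returns.
kurt = ((r - r.mean()) ** 4).mean() / r.var() ** 2
print("sample kurtosis:", round(float(kurt), 2))
```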

Keywords: volatility, Markov Regime Switching, forecasting, Baht/USD

Procedia PDF Downloads 289
16080 Implementation of Free-Field Boundary Condition for 2D Site Response Analysis in OpenSees

Authors: M. Eskandarighadi, C. R. McGann

Abstract:

Past earthquakes show that local site conditions can significantly affect the strong ground motion characteristics experienced at a site. One-dimensional seismic site response analysis is the most common approach for investigating site response. This approach assumes the soil is homogeneous and infinitely extended in the horizontal direction, so tying the side boundaries together is one way to model the behavior, as the wave passage is assumed to be purely vertical. However, 1D analysis cannot capture the 2D nature of wave propagation, soil heterogeneity, or 2D soil profiles with features such as inclined layer boundaries. In contrast, 2D seismic site response modeling can consider all of these factors for a better understanding of local site effects on strong ground motions. Because waves propagate in 2D and the soil profiles on the two sides of the model may not be identical, each side needs a boundary condition that minimizes unwanted reflections from the edges of the model and inputs appropriate loading conditions. Ideally, the model should be large enough to minimize wave reflection, but computational limitations can make increasing the model size impractical. Another approach is to employ free-field boundary conditions, which account for the free-field motion that would exist far from the model domain and apply it to the sides of the model. This research focuses on implementing free-field boundary conditions in OpenSees for 2D site response analysis. Comparisons are made between 1D models and 2D models with various boundary conditions, and the details and limitations of the developed free-field boundary modeling approach are discussed.

Keywords: boundary condition, free-field, OpenSees, site response analysis, wave propagation

Procedia PDF Downloads 130
16079 Cuckoo Search Optimization for Black Scholes Option Pricing

Authors: Manas Shah

Abstract:

The Black-Scholes option pricing model is one of the most important concepts in modern computational finance. However, its practical use can be challenging because one of its input parameters, the implied volatility of the underlying security, must be estimated; the more precisely it is estimated, the more accurate the corresponding estimates of theoretical option prices. Here, we present a novel model based on Cuckoo Search Optimization (CS) that finds more precise estimates of implied volatility than Particle Swarm Optimization (PSO) and Genetic Algorithms (GA).
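
A bare-bones sketch of the idea: recover implied volatility by minimizing the squared pricing error of the Black-Scholes formula with a simplified cuckoo-style search, in which heavy-tailed (Cauchy) steps stand in for Levy flights and the worst nests are abandoned each generation. All parameters are illustrative, not the paper's settings:

```python
# Implied volatility via a simplified cuckoo-style search.
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call (vectorized in sigma)."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

S, K, T, r = 100.0, 105.0, 0.5, 0.02
market_price = bs_call(S, K, T, r, 0.25)     # synthetic quote, true vol 0.25

rng = np.random.default_rng(0)
nests = rng.uniform(0.05, 1.0, 15)           # candidate volatilities
err = lambda v: (bs_call(S, K, T, r, v) - market_price) ** 2

for _ in range(200):
    # Heavy-tailed step from each nest, standing in for a Levy flight.
    trial = np.clip(nests + 0.1 * rng.standard_cauchy(15), 0.01, 2.0)
    improved = err(trial) < err(nests)       # keep improving moves
    nests[improved] = trial[improved]
    worst = np.argsort(err(nests))[-3:]      # abandon the worst nests
    nests[worst] = rng.uniform(0.05, 1.0, 3)

print("implied vol estimate:", round(float(nests[np.argmin(err(nests))]), 4))
```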

Keywords: Black-Scholes model, cuckoo search optimization, particle swarm optimization, genetic algorithm

Procedia PDF Downloads 433
16078 Solar Energy Applications in Seawater Distillation

Authors: Yousef Abdulaziz Almolhem

Abstract:

Geographically, most Arab countries lie in arid or semi-arid regions. For this reason, most of these countries have adopted seawater desalination as a strategy to overcome water scarcity; for example, the water supplies of the UAE, Kuwait, and Saudi Arabia come almost entirely from seawater desalination plants. Many areas in Saudi Arabia and elsewhere suffer from a lack of fresh water, which hinders their development, despite the availability of saline water and high solar radiation intensity. Furthermore, most developing countries do not have sufficient meteorological data to evaluate whether solar radiation can meet solar desalination needs. A mathematical model was developed to simulate and predict the thermal behavior of a solar still that uses direct solar energy for the distillation of seawater. To evaluate the model, measurements were taken in the Environment and Natural Resources Department, Faculty of Agricultural and Food Sciences, King Faisal University, Saudi Arabia, and the simulation results were compared with the measured data. The main results showed slight differences between the measured and predicted values, arising from the variation of factors treated as constants in the model, such as sky clearness, wind velocity, and the salt concentration of the water in the basin of the still. It can be concluded that the present model can be used to estimate the average total solar radiation and the thermal behavior of the solar still in any area, with consideration of the geographical location.

Keywords: mathematical model, sea water, distillation, solar radiation

Procedia PDF Downloads 267
16077 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, decision-making in sports, such as selecting the players in a game and the strategy of the game based on analysis of accumulated sports data, has been widely attempted. In the NBA, the basketball league where the world's highest-level players gather, teams analyze data using various statistical techniques to win games. However, analyzing play-by-play game data, such as ball tracking or the motion of players, is difficult because the situation of the game changes rapidly and the data structure is complicated, so an analysis method for real-time game play data is needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, a task that is considered difficult for all coaches. Because replacing the entire lineup is too complicated, the practical questions are whether the lineup should be changed and whether a Small Ball lineup should be adopted; we therefore propose an analytical model for the optimal player-selection problem based on Small Ball lineups. In basketball, scoring data accumulated for each play indicate a player's contribution to the game and can be treated as a time series. To compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a score prediction model capable of identifying the current optimal lineup for different situations. We collected NBA data from the 2019-2020 season and applied the method to actual basketball play data to verify the reliability of the proposed model.
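
As a sketch of the recurrent half of the proposed model, the snippet below runs a minimal untrained recurrent cell over a sequence of per-play features to produce a scalar score; the dimensions, features, and weights are invented, and the paper's trained RNN+NN architecture differs:

```python
# Minimal recurrent cell scoring a sequence of per-play lineup features
# (random weights; illustrates the forward pass only, no training).
import numpy as np

rng = np.random.default_rng(0)
T, d_in, d_h = 10, 6, 8                      # plays, features, hidden units
x = rng.standard_normal((T, d_in))           # per-play lineup features

Wx = rng.standard_normal((d_in, d_h)) * 0.3
Wh = rng.standard_normal((d_h, d_h)) * 0.3
w_out = rng.standard_normal(d_h) * 0.3

h = np.zeros(d_h)
for t in range(T):
    h = np.tanh(x[t] @ Wx + h @ Wh)          # recurrent state update
score = float(h @ w_out)                     # predicted scoring contribution
print(round(score, 3))
```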

Keywords: recurrent neural network, players lineup, basketball data, decision making model

Procedia PDF Downloads 114
16076 Elasto-Plastic Behavior of Rock during Temperature Drop

Authors: N. Reppas, Y. L. Gui, B. Wetenhall, C. T. Davie, J. Ma

Abstract:

A theoretical constitutive model describing the stress-strain behavior of rock subjected to different confining pressures is presented. A bounding surface plasticity model with hardening effects is proposed that includes the effect of temperature drop. The bounding surface is based on a mapping rule, and the temperature effect on the rock is controlled by Poisson's ratio. Validation of the results against available experimental data is also presented, and the relation between deviatoric stress and axial strain is illustrated at different temperatures to analyze the effect of temperature decrease on the stiffness of the material.

Keywords: bounding surface, cooling of rock, plasticity model, rock deformation, elasto-plastic behavior

Procedia PDF Downloads 116
16075 Application of Artificial Neural Network in Initiating Cleaning Of Photovoltaic Solar Panels

Authors: Mohamed Mokhtar, Mostafa F. Shaaban

Abstract:

Among the challenges facing solar photovoltaic (PV) systems in the United Arab Emirates (UAE), dust accumulation on solar panels is considered the most severe problem for the growth of solar power plants, as it significantly degrades panel output. Solar PV panels therefore have to be cleaned manually or using costly automated methods. This paper focuses on initiating cleaning actions only when required, to reduce maintenance costs: cleaning is triggered only when the dust level exceeds a threshold value. The amount of dust accumulated on the PV panels is estimated using an artificial neural network (ANN). Experiments were conducted to collect the data used to train the ANN model. The trained model is fed the output power from the solar panels, the ambient temperature, and the solar irradiance, and from these it estimates the amount of dust accumulated on the panels under those conditions. The model was tested on different case studies to confirm its accuracy.
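
A minimal sketch of the estimation step follows, training a small neural network to map (output power, ambient temperature, irradiance) to a dust level; the data are synthetic stand-ins for the paper's experimental measurements, and the toy PV relation and threshold are assumptions:

```python
# Small MLP estimating dust level from PV operating conditions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
n = 500
temp = rng.uniform(20, 45, n)                   # ambient temperature, deg C
irr = rng.uniform(200, 1000, n)                 # irradiance, W/m^2
dust = rng.uniform(0, 1, n)                     # normalized dust level
# Toy PV relation: output drops with dust and with heat above 25 deg C.
power = 0.2 * irr * (1 - 0.4 * dust) * (1 - 0.004 * (temp - 25))

X = np.column_stack([power, temp, irr])
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                     random_state=0).fit(X, dust)
print("R^2 on training data:", round(model.score(X, dust), 3))

# Trigger cleaning when the estimated dust level exceeds a threshold.
if model.predict(X[:1])[0] > 0.5:
    print("cleaning recommended")
```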

Keywords: machine learning, dust, PV panels, renewable energy

Procedia PDF Downloads 125