Search results for: time series models
22129 Analysis of Delamination in Drilling of Composite Materials
Authors: Navid Zarif Karimi, Hossein Heidary, Giangiacomo Minak, Mehdi Ahmadi
Abstract:
In this paper, an analytical model based on the mechanics of oblique cutting, linear elastic fracture mechanics (LEFM), and bending plate theory is presented to determine the critical feed rate causing delamination in the drilling of composite materials. Most of the models in this area use LEFM and bending plate theory; hence, they can only determine the critical thrust force, which is not a parameter that can be set directly on the machine. In this model, by adding oblique cutting mechanics to the previous models, the critical feed rate is determined. Also, instead of simplifying the loading condition, the actual thrust force induced by the chisel edge and cutting lips on the composite plate is modeled.
Keywords: composite material, delamination, drilling, thrust force
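The classical LEFM-plus-plate-bending baseline that such models extend (often attributed to Hocheng and Dharan) gives a closed-form critical thrust force. A minimal sketch is below; the laminate property values are illustrative assumptions, not figures from this paper:

```python
import math

def critical_thrust_force(G_IC, E, h, nu):
    """Classical LEFM + plate-bending critical thrust force:
    F_A = pi * sqrt(8 * G_IC * E * h**3 / (3 * (1 - nu**2)))
    G_IC: mode-I fracture toughness (J/m^2), E: modulus (Pa),
    h: uncut laminate thickness under the drill (m), nu: Poisson ratio."""
    return math.pi * math.sqrt(8.0 * G_IC * E * h**3 / (3.0 * (1.0 - nu**2)))

# Illustrative carbon/epoxy values (assumed for demonstration only)
F = critical_thrust_force(G_IC=200.0, E=70e9, h=0.5e-3, nu=0.3)
print(f"Critical thrust force: {F:.1f} N")
```

Any thrust force above this threshold is predicted to propagate delamination in the remaining uncut plies; the contribution of the paper is to relate this threshold back to a controllable feed rate.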
Procedia PDF Downloads 515
22128 BIM Modeling of Site and Existing Buildings: Case Study of ESTP Paris Campus
Authors: Rita Sassine, Yassine Hassani, Mohamad Al Omari, Stéphanie Guibert
Abstract:
Building Information Modelling (BIM) is the process of creating, managing, and centralizing information during the building lifecycle. BIM can be used throughout a construction project, from the initiation phase to the planning and execution phases and on to the maintenance and lifecycle management phase. For existing buildings, BIM can be used for specific applications such as lifecycle management. However, most existing buildings don’t have a BIM model. Creating a compatible BIM for existing buildings is very challenging. It requires special equipment for data capturing and effort to convert these data into a BIM model. The main difficulties in such projects are to define the data needed, the level of development (LOD), and the methodology to be adopted. In addition to managing information for an existing building, studying the impact of the built environment is a challenging topic. So, integrating the existing terrain that surrounds buildings into the digital model is essential to be able to run several simulations, such as flood simulation, energy simulation, etc. Replicating the physical model and updating its information in real time to create its Digital Twin (DT) is very important. The Digital Terrain Model (DTM) represents the ground surface of the terrain by a set of discrete points with unique height values over 2D points, based on a reference surface (e.g., mean sea level, geoid, or ellipsoid). In addition, information related to pavement materials, vegetation types and heights, and damaged surfaces can be integrated. Our aim in this study is to define the methodology to be used in order to provide a 3D BIM model for the site and the existing buildings, based on the case study of the “École Spéciale des Travaux Publics (ESTP Paris)” school of engineering campus. The property is located on a hilly site of 5 hectares and is composed of more than 20 buildings with a total area of 32,000 square meters and a height between 50 and 68 meters.
In this work, the campus precise levelling grid is computed according to the NGF-IGN69 altimetric system, and the grid control points are computed according to the RGF93 (Réseau Géodésique Français) – Lambert 93 French system with different methods: (i) land topographic surveying using a robotic total station, (ii) a GNSS (Global Navigation Satellite System) levelling grid in NRTK (Network Real Time Kinematic) mode, (iii) point clouds generated by laser scanning. These technologies allow the computation of multiple building parameters such as boundary limits, the number of floors, the georeferencing of the floors, the georeferencing of the four base corners of each building, etc. Once the input data are identified, the digital model of each building is created. The DTM is also modeled. The process of altimetric determination is complex and requires effort to collect and analyze multiple data formats. Since many technologies can be used to produce digital models, different file formats such as DraWinG (DWG), LASer (LAS), Comma-Separated Values (CSV), Industry Foundation Classes (IFC) and ReViT (RVT) will be generated. Checking the interoperability between BIM models is very important. In this work, all models are linked together and shared on the 3DEXPERIENCE collaborative platform.
Keywords: building information modeling, digital terrain model, existing buildings, interoperability
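As a small illustration of how a DTM assigns heights to 2D points, the sketch below interpolates scattered survey points with inverse-distance weighting; the coordinates are hypothetical, and IDW is a generic stand-in for the production surveying workflow described above:

```python
import numpy as np

def idw_height(points, query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted height at a 2D query point from
    scattered (x, y, z) survey points -- a minimal DTM interpolator."""
    pts = np.asarray(points, dtype=float)
    d = np.hypot(pts[:, 0] - query[0], pts[:, 1] - query[1])
    if np.any(d < eps):                  # query coincides with a survey point
        return float(pts[np.argmin(d), 2])
    w = 1.0 / d**power
    return float(np.sum(w * pts[:, 2]) / np.sum(w))

# Hypothetical levelled points (x, y, elevation in m) on a small grid
survey = [(0, 0, 50.0), (10, 0, 52.0), (0, 10, 51.0), (10, 10, 53.0)]
print(idw_height(survey, (5.0, 5.0)))   # ~51.5, mean of the corners by symmetry
```

A production DTM would typically triangulate the point cloud (TIN) or grid it; IDW is used here only because it fits in a few lines.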
Procedia PDF Downloads 112
22127 Application of Statistical Linearized Models for Investigations of Digital Dynamic Pulse-Frequency Control Systems
Authors: B. H. Aitchanov, Sh. K. Aitchanova, O. A. Baimuratov
Abstract:
This paper is focused on dynamic pulse-frequency modulation (DPFM) control systems. Currently, the control law based on DPFM control signals is widely used in direct digital control subsystems introduced in the automated control systems of technological processes. Statistical analysis of automatic control systems reduces to the construction of functional relationships between the statistical characteristics of the error processes and the input processes. Structural and dynamic Volterra models of digital pulse-frequency control systems can be used to develop methods for generating these dependencies, which differ in accuracy, in the amount of information required about the statistical characteristics of the input processes, and in the computational labor their use entails.
Keywords: digital dynamic pulse-frequency control systems, dynamic pulse-frequency modulation, control object, discrete filter, impulse device, microcontroller
Procedia PDF Downloads 495
22126 On the Influence of Thermal Radiation Upon Heat Transfer Characteristics of a Porous Media Under Local Thermal Non-Equilibrium Condition
Authors: Yasser Mahmoudi, Nader Karimi
Abstract:
The present work investigates numerically the effect of thermal radiation from the solid phase on the rate of heat transfer inside a porous medium. A forced convection heat transfer process within a pipe filled with a porous medium is considered. The Darcy-Brinkman-Forchheimer model is utilized to represent the fluid transport within the porous medium. A local thermal non-equilibrium (LTNE), two-equation model is used to represent the energy transport for the solid and fluid phases. The radiative heat transfer equation is solved by the discrete ordinate method (DOM) to compute the radiative heat flux in the porous medium. Two primary approaches (models A and B) are used to represent the boundary conditions for constant wall heat flux. The effects of radiative heat transfer on the Nusselt numbers of the two phases are examined by comparing the results obtained by the application of models A and B. The fluid Nusselt numbers calculated by the application of models A and B show that the Nusselt number obtained by model A for the radiative case is higher than that predicted for the non-radiative case. However, for model B the fluid Nusselt numbers obtained for the radiative and non-radiative cases are similar.
Keywords: porous media, local thermal non-equilibrium, forced convection heat transfer, thermal radiation, Discrete Ordinate Method (DOM)
Procedia PDF Downloads 325
22125 Simulation of the Flow in a Circular Vertical Spillway Using a Numerical Model
Authors: Mohammad Zamani, Ramin Mansouri
Abstract:
Spillways are among the most important hydraulic structures of dams, providing the stability of the dam and downstream areas at the time of flood. A circular vertical spillway with various inlet forms is very effective when there is not enough space for other spillway types. Hydraulic flow in a vertical circular spillway falls into three regimes: free, orifice, and under pressure (submerged). In this research, the hydraulic flow characteristics of a circular vertical spillway are investigated with a CFD model. The two-dimensional unsteady RANS equations were solved numerically using the Finite Volume Method. The PISO scheme was applied for the velocity-pressure coupling. The most widely used two-equation turbulence models, k-ε and k-ω, were chosen to model the Reynolds shear stress term. The power law scheme was used for the discretization of the momentum, k, ε, and ω equations. The VOF method (geometric reconstruction algorithm) was adopted for interface simulation. In this study, three types of computational grids (coarse, intermediate, and fine) were used to discretize the simulation domain. In order to simulate the flow, the k-ε (Standard, RNG, Realizable) and k-ω (Standard and SST) models were used. Also, in order to find the best wall function, two types, the standard wall function and the non-equilibrium wall function, were investigated. The laminar model did not produce satisfactory flow depth and velocity along the Morning-Glory spillway. The results of the most commonly used two-equation turbulence models (k-ε and k-ω) were identical. Furthermore, the standard wall function produced better results compared to the non-equilibrium wall function. Thus, for the other simulations, the standard k-ε model with the standard wall function was preferred. The comparison criterion in this study is the trajectory profile of the water jet.
The results show that the fine computational grid, the velocity inlet condition at the flow inlet boundary, and the pressure outlet condition at the boundaries in contact with air provide the best possible results. The standard wall function is chosen for the wall treatment, and the standard k-ε turbulence model is the most consistent with the experimental results. As the jet gets closer to the end of the basin, the difference between the numerical and experimental results increases. The mesh with 10602 nodes, the standard k-ε turbulence model, and the standard wall function provide the best results for modeling the flow in a vertical circular spillway. There was good agreement between numerical and experimental results in the upper and lower nappe profiles. In the study of water level over the crest and discharge, at low water levels the numerical results are in good agreement with the experimental ones, but as the water level increases, the difference between the numerical and experimental discharge grows. In the study of the flow coefficient, as the P/R ratio decreases, the difference between the numerical and experimental results increases.
Keywords: circular vertical, spillway, numerical model, boundary conditions
Procedia PDF Downloads 86
22124 Examination of How Smart Watches Influence the Market of Luxury Watches with Particular Regard to the Buying Reasons
Authors: Christopher Benedikt Jakob
Abstract:
In our current society, there is no need to look at a wristwatch to know the exact time: smartphones, the clock in the car, or the computer clock inform us about the time too. Yet over hundreds of years, luxury watches have held a fascination for human beings. Consumers buy watches that cost thousands of euros, although they could buy much cheaper watches that also fulfill the function of indicating the correct time. This shows that functional value plays only a minor role among the buying reasons for luxury watches. For a few years now, people have had an increased demand to track data such as their daily walking distance or their sleep. Smart watches enable consumers to obtain this information. There is a trend for people to optimise parts of their social life, and thus they get the impression that they are able to optimise themselves as human beings. With the help of smart watches, they are able to optimise parts of their productivity and to realise their targets at the same time. These smart watches are also offered as luxury models, and the question is: how will customers of traditional luxury watches react? Therefore, this study intends to answer the question of why people are willing to spend an enormous amount of money on the consumption of luxury watches. The self-expression model, the relationship basis model, the functional benefit representation model and the means-end theory are chosen as an appropriate methodology to find reasons why human beings purchase specific luxury watches and luxury smart watches. This evaluative approach further discusses these strategies concerning, for example, whether consumers buy luxury watches or smart watches to express the current self or the ideal self, and whether human beings make decisions based on expected results. The research critically evaluates the notion that relationships are compared on the basis of their advantages.
Luxury brands offer socio-emotional advantages such as the social function of identification, and the strong brand personality of luxury watches and luxury smart watches helps customers to structure and retrieve brand awareness, which simplifies the process of decision-making. One of the goals is to identify whether customers know why they like specific luxury watches and dislike others, although they are produced in the same country and sold at comparable prices. It is very obvious that the market for luxury watches, especially for luxury smart watches, is changing much faster than it has in the past. Therefore the research examines the market-changing parameters in detail.
Keywords: buying-behaviour, brand management, consumer, luxury watch, smart watch
Procedia PDF Downloads 210
22123 Marginalized Two-Part Joint Models for Generalized Gamma Family of Distributions
Authors: Mohadeseh Shojaei Shahrokhabadi, Ding-Geng (Din) Chen
Abstract:
Positive continuous outcomes with a substantial number of zero values and incomplete longitudinal follow-up are quite common in medical cost data. To jointly model semi-continuous longitudinal cost data and survival data and to provide marginalized covariate effect estimates, a marginalized two-part joint model (MTJM) has been developed for outcome variables with lognormal distributions. In this paper, we propose MTJM models for outcome variables from the generalized gamma (GG) family of distributions. The GG distribution constitutes a general family that includes nearly all of the most frequently used distributions, such as the gamma, exponential, Weibull, and log-normal. In the proposed MTJM-GG model, the conditional mean from a conventional two-part model with a three-parameter GG distribution is parameterized to provide the marginal interpretation for the regression coefficients. In addition, MTJM-gamma and MTJM-Weibull are developed as special cases of MTJM-GG. To illustrate the applicability of the MTJM-GG, we applied the model to a set of real electronic health record data recently collected in Iran, and we provide SAS code for the application. The simulation results showed that when the outcome distribution is unknown or misspecified, which is usually the case in real data sets, the MTJM-GG consistently outperforms the other models. The GG family of distributions facilitates estimating a model with improved fit over the MTJM-gamma, standard Weibull, or log-normal distributions.
Keywords: marginalized two-part model, zero-inflated, right-skewed, semi-continuous, generalized gamma
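To make the "general family" claim concrete, the sketch below implements the Stacy parameterization of the generalized gamma density in pure Python and checks that p = 1 collapses it to a gamma density; the parameter values are arbitrary illustrations, not estimates from the paper:

```python
import math

def gengamma_pdf(x, a, d, p):
    """Stacy generalized gamma density:
    f(x) = p / (a**d * Gamma(d/p)) * x**(d-1) * exp(-(x/a)**p), x > 0.
    p = 1 recovers the gamma; d = p recovers the Weibull."""
    if x <= 0:
        return 0.0
    log_f = (math.log(p) - d * math.log(a) - math.lgamma(d / p)
             + (d - 1.0) * math.log(x) - (x / a) ** p)
    return math.exp(log_f)

def gamma_pdf(x, shape, scale):
    """Ordinary gamma density, for the special-case check below."""
    return x**(shape - 1) * math.exp(-x / scale) / (math.gamma(shape) * scale**shape)

# p = 1 collapses the GG to a gamma with the same shape and scale
print(gengamma_pdf(2.0, a=1.5, d=2.0, p=1.0), gamma_pdf(2.0, 2.0, 1.5))
```

The log-space evaluation avoids overflow for large shape parameters, which matters when such densities sit inside a likelihood being maximized.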
Procedia PDF Downloads 176
22122 Parallelization of Random Accessible Progressive Streaming of Compressed 3D Models over Web
Authors: Aayushi Somani, Siba P. Samal
Abstract:
Three-dimensional (3D) meshes are data structures which store the geometric information of an object or scene, generally in the form of vertices and edges. Current laser scanning and other geometric data acquisition technologies capture high-resolution samples, which leads to high-resolution meshes. While high-resolution meshes give better-quality rendering and are hence often used, the processing as well as the storage of 3D meshes is currently resource-intensive. At the same time, web applications for data processing have become ubiquitous owing to their accessibility. For 3D meshes, the advancement of 3D web technologies, such as WebGL and WebVR, has enabled high-fidelity rendering of huge meshes. However, there is a gap in the ability to stream huge meshes to native client and browser applications due to high network latency. There is also an inherent delay in loading WebGL pages due to large and complex models. The focus of our work is to identify the challenges faced when such meshes are streamed into and processed on hand-held devices with their limited resources. One of the solutions conventionally used in the graphics community to alleviate resource limitations is mesh compression. Our work takes a two-step approach to random accessible progressive compression and its parallel implementation. The first step partitions the original mesh into multiple sub-meshes; we then invoke data parallelism on these sub-meshes for their compression. Subsequent threaded decompression logic is implemented inside the web browser engine by modifying the WebGL implementation in the Chromium open-source engine. This concept can be used to completely revolutionize the way e-commerce and virtual reality technology work on consumer electronic devices. The objects can be compressed on the server and transmitted over the network, and the progressive decompression can be performed on the client device and rendered.
Multiple views currently used on e-commerce sites for viewing the same product from different angles can be replaced by a single progressive model for a smoother user experience. The approach can also be used in WebVR for widely used activities such as virtual reality shopping, watching movies, and playing games. Our experiments and comparison with existing techniques show encouraging results in terms of latency (compressed size is ~10-15% of the original mesh), processing time (20-22% increase over the serial implementation) and quality of user experience in the web browser.
Keywords: 3D compression, 3D mesh, 3D web, chromium, client-server architecture, e-commerce, level of details, parallelization, progressive compression, WebGL, WebVR
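The partition-then-compress-in-parallel step can be sketched as follows. This is a toy stand-in, not the paper's coder: zlib on pickled vertex lists replaces a real progressive mesh codec, and the naive contiguous partition replaces a proper spatial partition:

```python
import concurrent.futures
import pickle
import zlib

def compress_submesh(submesh):
    """Compress one sub-mesh (here: a list of vertex tuples) independently,
    standing in for a real progressive mesh coder."""
    return zlib.compress(pickle.dumps(submesh), level=9)

def partition(vertices, n_parts):
    """Naive partition: split the vertex list into n contiguous chunks."""
    k = max(1, len(vertices) // n_parts)
    return [vertices[i:i + k] for i in range(0, len(vertices), k)]

vertices = [(i * 0.1, i * 0.2, i * 0.3) for i in range(10_000)]
parts = partition(vertices, 4)

# Data parallelism: each sub-mesh is compressed in its own worker
with concurrent.futures.ThreadPoolExecutor() as pool:
    blobs = list(pool.map(compress_submesh, parts))

restored = [pickle.loads(zlib.decompress(b)) for b in blobs]
assert [v for part in restored for v in part] == vertices  # lossless round trip
```

Threads suffice here because zlib releases the GIL during compression; CPU-bound pure-Python coders would want `ProcessPoolExecutor` instead. Because each blob decodes independently, any sub-mesh can be fetched and decompressed on its own, which is the essence of random-accessible streaming.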
Procedia PDF Downloads 170
22121 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach
Authors: Jerry Q. Cheng
Abstract:
Currently, in analyzing large-scale recurrent event data, there are many challenges, such as memory limitations and unscalable computing time. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data is randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained. Then a weighted method is proposed to combine these individual estimators into the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of the proposed method. The approach is applied to a large real dataset of repeated heart failure hospitalizations.
Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing
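A minimal sketch of the divide-and-conquer idea, using a toy exponential-rate MLE in place of a parametric frailty model; the weighting-by-subset-size scheme is an illustrative assumption, not the paper's weighting:

```python
import random

def subset_mle(xs):
    """MLE of the rate of an exponential sample: lambda_hat = n / sum(x)."""
    return len(xs) / sum(xs)

def divide_and_conquer_rate(data, n_subsets, seed=0):
    """Randomly partition the data, fit each subset, then combine the
    subset MLEs with weights proportional to subset size."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    k = len(shuffled) // n_subsets
    subsets = [shuffled[i * k:(i + 1) * k] for i in range(n_subsets)]
    estimates = [subset_mle(s) for s in subsets]
    total = sum(len(s) for s in subsets)
    return sum(len(s) / total * e for s, e in zip(subsets, estimates))

rng = random.Random(42)
data = [rng.expovariate(2.0) for _ in range(100_000)]
full = subset_mle(data)
combined = divide_and_conquer_rate(data, n_subsets=10)
print(full, combined)   # both close to the true rate 2.0
```

Each subset fit touches only a fraction of the data at a time, which is what relieves the memory and computing-time constraints; the asymptotic-equivalence result quoted above is what justifies trusting the combined estimator.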
Procedia PDF Downloads 166
22120 Spatial Analysis of Park and Ride Users’ Dynamic Accessibility to Train Station: A Case Study in Perth
Authors: Ting (Grace) Lin, Jianhong (Cecilia) Xia, Todd Robinson
Abstract:
Accessibility analysis, examining people’s ability to access facilities and destinations, is a fundamental assessment for transport planning, policy making, and social exclusion research. Dynamic accessibility, which measures accessibility in a real-time traffic environment, has become an advanced accessibility indicator in transport research. It is also a useful indicator for helping travelers understand the daily variability of travel time, assisting traffic engineers in monitoring traffic congestion, and ultimately developing effective strategies to mitigate congestion. This research drew on real-time traffic information by collecting travel time data at 15-minute intervals via the TomTom® API. A framework for measuring dynamic accessibility was then developed based on gravity theory and accessibility dichotomy theory through space and time interpolation. Finally, dynamic accessibility can be derived at any given time and location under this dynamic accessibility spatial analysis framework.
Keywords: dynamic accessibility, hot spot, transport research, TomTom® API
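A gravity-type accessibility score with time-dependent travel times can be sketched as below; the destinations, opportunity counts, and decay parameter beta are hypothetical, and the exponential decay is one common choice of impedance function rather than the one used in this study:

```python
import math

def gravity_accessibility(opportunities, travel_times, beta=0.1):
    """Gravity-type accessibility: A = sum_j O_j * exp(-beta * t_j),
    where t_j is the (time-dependent) travel time to destination j in minutes."""
    return sum(o * math.exp(-beta * t) for o, t in zip(opportunities, travel_times))

# Hypothetical park-and-ride example: one station, three destinations,
# with travel times sampled at different times of day.
opportunities = [500, 1200, 300]     # e.g. jobs reachable at each destination
peak_times = [25.0, 40.0, 15.0]      # minutes at 08:00
offpeak_times = [15.0, 28.0, 10.0]   # minutes at 11:00

print(gravity_accessibility(opportunities, peak_times))
print(gravity_accessibility(opportunities, offpeak_times))  # higher: less congestion
```

Evaluating the same score on each 15-minute travel-time snapshot yields an accessibility time series per location, which is exactly the quantity a dynamic analysis maps over the day.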
Procedia PDF Downloads 389
22119 Investigating the Effect of VR, Time Study and Ergonomics on the Design of Industrial Workstations
Authors: Aydin Azizi, Poorya Ghafoorpoor Yazdi
Abstract:
This paper reviews studies on ergonomics, virtual reality, and work measurement (time study) at industrial workstations, because each of these three techniques can be used to improve the design of workstations and task positions. The objective of this paper is to give an overall literature review of whether there is any relation between these three different techniques. It is therefore important to review the scientific studies to find a more effective way of improving workstation design. On the other hand, manufacturers have found that instead of using one of the approaches, utilizing the combination of these individual techniques is more effective in reducing cost and production time.
Keywords: ergonomics, time study, virtual reality, workplace
Procedia PDF Downloads 119
22118 Time Bound Parallel Processing of a Disaster Management Alert System Using Random Selection of Target Audience: Bangladesh Context
Authors: Hasan Al Bashar Abul Ulayee, AKM Saifun Nabi, MD Mesbah-Ul-Awal
Abstract:
Alert systems for disaster management are common nowadays and can play a vital role in reducing devastation and saving lives and costs. A timely alert can save thousands of human lives, help people take shelter, and protect other assets including livestock; above all, it helps people prepare for the situation and recover from it early. In a country like Bangladesh, with a population of more than 170 million that regularly faces different types of natural calamities and disasters, an early, timely alert is very effective, but implementing an alert system is challenging. The challenge comes from the time constraint of alerting such a huge population. Existing disaster management pre-alert methods are traditional, sequential, and non-selective, so their efficiency is not good enough. This paper describes a way to alert the maximum number of people within a short time bound using parallel processing as well as random selection of the target audience.
Keywords: alert system, Bangladesh, disaster management, parallel processing, SMS
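The random-selection-plus-parallel-dispatch idea can be sketched as follows; `send_sms` is a hypothetical stand-in for a real SMS gateway call, and the batch size and time budget are illustrative, not values from the paper:

```python
import concurrent.futures
import random

def send_sms(recipient):
    """Stand-in for a real SMS gateway call; returns the delivery record."""
    return (recipient, "queued")

def alert_in_batches(population, batch_size, time_budget_batches, seed=7):
    """Randomly pick target recipients, then dispatch each batch in parallel,
    so the most people are reached within a fixed time bound."""
    rng = random.Random(seed)
    n = min(batch_size * time_budget_batches, len(population))
    targets = rng.sample(population, n)      # random, duplicate-free selection
    sent = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=16) as pool:
        for i in range(0, n, batch_size):
            batch = targets[i:i + batch_size]
            sent.extend(pool.map(send_sms, batch))
    return sent

subscribers = [f"+8801{n:09d}" for n in range(5000)]
delivered = alert_in_batches(subscribers, batch_size=500, time_budget_batches=4)
print(len(delivered))  # 2000 recipients alerted within the time bound
```

Random sampling spreads the alerted subset across the whole population when not everyone can be reached in time, rather than exhausting the time budget on the front of a sequential list.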
Procedia PDF Downloads 470
22117 Joint Training Offer Selection and Course Timetabling Problems: Models and Algorithms
Authors: Gianpaolo Ghiani, Emanuela Guerriero, Emanuele Manni, Alessandro Romano
Abstract:
In this article, we deal with a variant of the classical course timetabling problem that has a practical application in many areas of education. In particular, in this paper we are interested in high school remedial courses. The purpose of such courses is to provide under-prepared students with the skills necessary to succeed in their studies. A student might be under-prepared in an entire course, or only in a part of it. The limited availability of funds, as well as the limited amount of time and teachers at their disposal, often requires schools to choose which courses and/or which teaching units to activate. Thus, schools need to model the training offer and the related timetabling, with the goal of ensuring the highest possible teaching quality while meeting the above-mentioned financial, time and resource constraints. Moreover, there are some prerequisites between the teaching units that must be satisfied. We first present a Mixed-Integer Programming (MIP) model to solve this problem to optimality. However, the presence of many peculiar constraints inevitably increases the complexity of the mathematical model. Thus, solving it through a general-purpose solver is feasible for small instances only, while solving real-life-sized instances of the model requires specific techniques or heuristic approaches. For this purpose, we also propose a heuristic approach, in which we make use of a fast constructive procedure to obtain a feasible solution. To assess our exact and heuristic approaches we perform extensive computational tests on both real-life instances (obtained from a high school in Lecce, Italy) and randomly generated instances. Our tests show that the MIP model is never solved to optimality, with an average optimality gap of 57%.
On the other hand, the heuristic algorithm is much faster (in about 50% of the considered instances it converges in approximately half of the time limit) and in many cases achieves an improvement on the objective function value obtained by the MIP model. Such an improvement ranges between 18% and 66%.
Keywords: heuristic, MIP model, remedial course, school, timetabling
Procedia PDF Downloads 605
22116 Temperature Fields in a Channel Partially-Filled by Porous Material with Internal Heat Generations: On Exact Solution
Authors: Yasser Mahmoudi, Nader Karimi
Abstract:
The present work examines analytically the effect of internal heat generation on the temperature fields in a channel partially filled with a porous material under the local thermal non-equilibrium condition. The Darcy-Brinkman model is used to represent the fluid transport through the porous material. Two fundamental models (models A and B) represent the thermal boundary conditions at the interface between the porous medium and the clear region. The governing equations of the problem are manipulated, and for each interface model, exact solutions for the solid and fluid temperature fields are developed. These solutions incorporate the porous material thickness, the Biot number, the fluid-to-solid thermal conductivity ratio and the Darcy number, as well as the non-dimensional energy terms in the fluid and solid, as parameters. Results show that, for either of the two models, under zero or negative heat generation (heat sink) and for any Darcy number, an increase in the porous thickness increases the amount of heat flux transferred to the porous region. The obtained results are applicable to the analysis of complex porous media incorporating internal heat generation, such as heat transfer enhancement (HTE), tumor ablation in biological tissues, and porous radiant burners (PRBs).
Keywords: porous media, local thermal non-equilibrium, forced convection, heat transfer, exact solution, internal heat generation
Procedia PDF Downloads 460
22115 Energy Performance of Buildings Due to Downscaled Seasonal Models
Authors: Anastasia K. Eleftheriadou, Athanasios Sfetsos, Nikolaos Gounaris
Abstract:
The present work examines the suitability of a seasonal forecasting model, downscaled to a very high spatial resolution, for assessing the energy performance and requirements of buildings. The developed model is applied to Greece with a forecast horizon of 5 months into the future. Greece, as a country in the middle of a financial crisis and facing serious societal challenges, is also very sensitive to climate change. The commonly used method for correlating climate change with buildings' energy consumption is the concept of Degree Days (DD). This method can be applied to heating and cooling systems for better management of environmental, economic and energy crises, and can be used as a medium-term (3-6 months) planning tool to predict building needs and the country’s requirements for residential energy use.
Keywords: downscaled seasonal models, degree days, energy performance
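The degree-day computation itself is straightforward; below is a minimal sketch with an assumed base temperature of 18 °C and a hypothetical week of forecast daily means (both are illustrative choices, not values from the study):

```python
def degree_days(daily_mean_temps, base=18.0):
    """Heating and cooling degree days from daily mean temperatures (deg C).
    HDD accumulates deficits below the base temperature, CDD the excesses above."""
    hdd = sum(max(0.0, base - t) for t in daily_mean_temps)
    cdd = sum(max(0.0, t - base) for t in daily_mean_temps)
    return hdd, cdd

# One hypothetical forecast week of daily means (deg C)
week = [12.0, 14.5, 17.0, 19.0, 21.5, 16.0, 13.0]
hdd, cdd = degree_days(week)
print(hdd, cdd)  # 17.5 4.5
```

Feeding a downscaled seasonal temperature forecast through this accumulation, grid cell by grid cell, is what turns the meteorological output into a heating/cooling demand indicator for buildings.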
Procedia PDF Downloads 453
22114 Expected Present Value of Losses in the Computation of Optimum Seismic Design Parameters
Authors: J. García-Pérez
Abstract:
An approach to compute optimum seismic design parameters is presented. It is based on the optimization of the expected present value of the total cost, which includes the initial cost of structures as well as the cost due to earthquakes. Different types of seismicity models are considered, including one for characteristic earthquakes. Uncertainties are included in some variables to observe their influence on the optimum values. Optimum seismic design coefficients are computed for three different structural types representing high-, medium- and low-rise buildings, located near and far from the seismic sources. Ordinary and important structures are considered in the analysis. The results show that both the seismicity models and the uncertainties in the variables have an important influence on the optimum values.
Keywords: importance factors, optimum parameters, seismic losses, seismic risk, total cost
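A toy version of the optimization can be sketched as below. The cost model (linear initial cost, exponentially decaying loss per event, Poisson occurrences discounted at rate r) and all parameter values are illustrative assumptions for demonstration, not the paper's seismicity or cost models:

```python
import numpy as np

def total_cost(c, C0=1.0, alpha=1.5, L0=5.0, beta=8.0, nu=0.05, r=0.05):
    """Expected present value of total cost vs. seismic design coefficient c:
    initial cost grows with c, while the discounted expected earthquake loss
    (Poisson events at rate nu, loss decaying as exp(-beta*c)) shrinks."""
    initial = C0 * (1.0 + alpha * c)
    epv_losses = nu * L0 * np.exp(-beta * c) / r
    return initial + epv_losses

# Grid search for the design coefficient minimizing the total expected cost
c_grid = np.linspace(0.0, 1.0, 2001)
costs = total_cost(c_grid)
c_opt = float(c_grid[np.argmin(costs)])
print(f"optimum design coefficient: {c_opt:.3f}")
```

The optimum sits where the marginal initial cost of extra strength equals the marginal reduction in discounted expected losses; richer seismicity models (e.g., characteristic earthquakes) change the loss term but not this trade-off structure.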
Procedia PDF Downloads 285
22113 Real Interest Rates and Real Returns of Agricultural Commodities in the Context of Quantitative Easing
Authors: Wei Yao, Constantinos Alexiou
Abstract:
In the existing literature, many studies have focused on the implementation and effectiveness of quantitative easing (QE) since 2008, but only a few have evaluated QE’s effect on commodity prices. In this context, following Frankel’s (1986) commodity price overshooting model, we study the dynamic covariation between the expected real interest rates and six agricultural commodities’ real returns over the period from 2000:1 to 2018 for the US economy. We use wavelet analysis to investigate the causal relationship and co-movement of the time series by calculating the coefficient of determination at different frequencies. We find that a) US unconventional monetary policy may cause more positive and significant covariation between the expected real interest rates and agricultural commodities’ real returns over short horizons; b) there is a lead-lag relationship that runs from agricultural commodities’ real returns to the expected real short-term interest rates over long horizons; and c) there is a lead-lag relationship from agricultural commodities’ real returns to the expected real long-term interest rates over short horizons. In the realm of monetary policy, we argue that QE may shift the negative relationship between most commodities’ real returns and the expected real interest rates to a positive one over a short horizon.
Keywords: QE, commodity price, interest rate, wavelet coherence
Procedia PDF Downloads 89
22112 Continuous-Time and Discrete-Time Singular Value Decomposition of an Impulse Response Function
Authors: Rogelio Luck, Yucheng Liu
Abstract:
This paper proposes the continuous-time singular value decomposition (SVD) of the impulse response function, a special kind of Green’s function e⁻⁽ᵗ⁻ᵀ⁾, in order to find a set of singular functions and singular values such that the convolutions of this function with the singular functions on a specified domain are the solutions to the inhomogeneous differential equations for those singular functions. A numerical example is presented to verify the proposed method. Besides the continuous-time SVD, a discrete-time SVD is also presented for the impulse response function, which is modeled using a Toeplitz matrix in the discrete system. The proposed method has broad applications in signal processing, dynamic system analysis, acoustic analysis, thermal analysis, as well as macroeconomic modeling.
Keywords: singular value decomposition, impulse response function, Green’s function, Toeplitz matrix, Hankel matrix
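The discrete-time construction can be sketched directly: sample the impulse response, build the lower-triangular Toeplitz matrix that represents discrete convolution with it, and take its SVD. The grid size and step below are arbitrary illustrative choices:

```python
import numpy as np

n, dt = 200, 0.05
t = np.arange(n) * dt
h = np.exp(-t)                       # impulse response e^{-(t - tau)} sampled at tau = 0

# Lower-triangular Toeplitz matrix H[i, j] = h[i - j], j <= i:
# applying H to an input sequence performs the discrete convolution.
H = np.zeros((n, n))
for i in range(n):
    H[i, :i + 1] = h[i::-1]

U, s, Vt = np.linalg.svd(H)
print(s[:3])                         # leading singular values
assert np.allclose(U @ np.diag(s) @ Vt, H)  # the SVD reconstructs the operator
```

The columns of `U` and rows of `Vt` are the discrete analogues of the singular functions: convolving the sampled kernel with a right singular vector yields the corresponding left singular vector scaled by its singular value.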
Procedia PDF Downloads 156
22111 Enhanced Flight Dynamics Model to Simulate the Aircraft Response to Gust Encounters
Authors: Castells Pau, Poetsch Christophe
Abstract:
The effect of gust and turbulence encounters on aircraft is a wide field of study that allows different approaches, from high-fidelity multidisciplinary simulations to more simplified models adapted to industrial applications. The typical main goal is to predict the gust loads on the aircraft in order to ensure a safe design and achieve certification. Another widely studied topic is gust load reduction through an active control law. The impact of gusts on aircraft handling qualities is of interest as well in the analysis of in-service events, so as to evaluate the aircraft response and the performance of the flight control laws. Traditionally, gust loads and handling qualities are addressed separately, with different models adapted to the specific needs of each discipline. In this paper, an assessment of the differences between the two models is presented, and a strategy to better account for the physics of gust encounters in a typical flight dynamics model is proposed, based on the model used for gust loads analysis. The applied corrections aim to capture the gust unsteady aerodynamics and propagation, as well as the effect of dynamic flexibility at low frequencies. Results from the gust loads model at different flight conditions and measurements from real events are used for validation. A possible extension of steady aerodynamic nonlinearities to the low-frequency range is also addressed. The proposed corrections provide meaningful means to evaluate the performance and possible adjustments of the flight control laws.
Keywords: flight dynamics, gust loads, handling qualities, unsteady aerodynamics
Procedia PDF Downloads 147
22110 Applying Participatory Design for the Reuse of Deserted Community Spaces
Authors: Wei-Chieh Yeh, Yung-Tang Shen
Abstract:
The concept of community building started in 1994 in Taiwan. After years of development, it fostered the notion of active local resident participation in community issues as co-operators rather than subordinates. Participatory design gives participants more control in the decision-making process, helps to reduce the friction caused by arguments, and assists in bringing different parties to consensus. This results in an increase in the efficiency of projects run in the community. Therefore, the participation of local residents is key to the success of community building. This study applied participatory design to develop plans for the reuse of deserted spaces in the community, from the first stage of brainstorming for design ideas and making creative models to be employed later, through to the final stage of construction. After conducting a series of participatory design activities, it aimed to integrate the different opinions of residents, develop a sense of belonging, and reach a consensus. Besides this, it also aimed at building the residents’ awareness of their responsibilities for the environment and related issues of sustainable development. By reviewing relevant literature and the history of related studies, the study formulated a theory. It took the “2012-2014 Changhua County Community Planner Counseling Program” as a case study to investigate the implementation process of participatory design. Research data were collected by document analysis, participant observation, and in-depth interviews. After examining the three elements of “Design Participation”, “Construction Participation”, and “Follow-up Maintenance Participation” in the case, the study reached a promising conclusion: maintenance work was carried out better than in comparable public works. Besides this, maintenance costs were lower. Moreover, the works that residents were involved in were more creative.
Most importantly, the community characteristics could be easily recognized. Keywords: participatory design, deserted space, community building, reuse
Procedia PDF Downloads 372
22109 Statistical and Artificial Neural Network Modeling of Suspended Sediment in the Mina River Watershed at Wadi El-Abtal Gauging Station (Northern Algeria)
Authors: Redhouane Ghernaout, Amira Fredj, Boualem Remini
Abstract:
Suspended sediment transport is a serious problem worldwide, but it is much more worrying in certain regions, as is the case in the Maghreb and more particularly in Algeria. It continues to take disturbing proportions in Northern Algeria due to the variability of rainfall in time and space and the constant deterioration of vegetation. Its prediction is essential in order to identify its intensity and define the necessary actions for its reduction. The purpose of this study is to analyze the suspended sediment concentration data measured at the Wadi El-Abtal hydrometric station. It also aims to find and highlight regressive power relationships which can explain the suspended solid flow by the measured liquid flow. The study further strives to find artificial neural network models linking the flow, month, and precipitation parameters with solid flow. The obtained results show that the power function of the solid transport rating curve and artificial neural network models are appropriate methods for analysing and estimating suspended sediment transport in Wadi Mina at the Wadi El-Abtal hydrometric station. They made it possible to identify, in a fairly conclusive manner, a neural network model with four input parameters: the liquid flow Q, the month, and the daily precipitation measured at the representative stations (Frenda 013002 and Ain El-Hadid 013004) of the watershed. The model thus obtained makes it possible to estimate (interpolate and extrapolate) the daily solid flows even beyond the period of observation of solid flows (1985/86 to 1999/00), given the availability of the average daily liquid flows and daily precipitation since 1953/1954. Keywords: suspended sediment, concentration, regression, liquid flow, solid flow, artificial neural network, modeling, Mina, Algeria
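The regressive power relationship mentioned above is the classical sediment rating curve Qs = a·Qᵇ; a minimal sketch of fitting it by least squares in log-log space is shown below, using synthetic illustrative flow/sediment pairs rather than the Wadi El-Abtal measurements:

```python
import numpy as np

# Fit the power-law rating curve Qs = a * Q**b by linear least squares in
# log-log space. The flow/sediment pairs are synthetic illustrative values.
Q  = np.array([0.5, 1.2, 3.0, 7.5, 15.0, 40.0])    # liquid flow (m^3/s)
Qs = np.array([0.02, 0.09, 0.5, 2.8, 9.0, 55.0])   # solid flow (kg/s)

b, log_a = np.polyfit(np.log(Q), np.log(Qs), 1)    # slope = b, intercept = ln(a)
a = np.exp(log_a)

def predict(q):
    """Estimated solid flow for a given liquid flow."""
    return a * q ** b
```

An artificial neural network with the paper's four inputs (liquid flow, month, and precipitation at the two stations) would replace this single-input regression while keeping the same fit-then-predict workflow.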
Procedia PDF Downloads 103
22108 Analysis of Network Performance Using Aspect of Quantum Cryptography
Authors: Nisarg A. Patel, Hiren B. Patel
Abstract:
Quantum cryptography is described as a point-to-point secure key generation technology that has emerged in recent times to provide absolute security. Researchers have started studying new innovative approaches to exploit the security of Quantum Key Distribution (QKD) for large-scale communication systems. A number of approaches and models for the utilization of QKD for secure communication have been developed. The uncertainty principle in quantum mechanics created a new paradigm for QKD. One of the approaches for the use of QKD involved network-fashioned security. The main goal was a point-to-point quantum network that exploited QKD technology for end-to-end network security via high-speed QKD. Other approaches and models equipped with QKD in network fashion have also been introduced in the literature. A different approach, which this paper deals with, is using QKD in existing protocols that are widely used on the Internet to enhance security, with the main objective of unconditional security. Our work is towards the analysis of QKD in mobile ad hoc networks (MANETs). Keywords: cryptography, networking, quantum, encryption and decryption
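As context for the QKD primitive the paper builds on, here is a minimal, idealized sketch of BB84 key sifting (no eavesdropper, no channel noise; the qubit count and seed are arbitrary illustrative choices):

```python
import random

def bb84_sift(n_qubits, seed=0):
    """Idealized BB84 sifting: keep only positions where bases match."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_qubits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_qubits)]  # 0=rectilinear, 1=diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n_qubits)]
    key_alice, key_bob = [], []
    for bit, ba, bb in zip(alice_bits, alice_bases, bob_bases):
        if ba == bb:                 # bases agree: Bob's outcome equals Alice's bit
            key_alice.append(bit)    # (ideal channel; mismatched bases are discarded)
            key_bob.append(bit)
    return key_alice, key_bob

key_a, key_b = bb84_sift(1000)       # roughly half the positions survive sifting
```

In a real channel, the parties would additionally compare a sample of the sifted key to estimate the error rate and detect eavesdropping before privacy amplification.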
Procedia PDF Downloads 184
22107 Data Mining Model for Predicting the Status of HIV Patients during Drug Regimen Change
Authors: Ermias A. Tegegn, Million Meshesha
Abstract:
Human Immunodeficiency Virus and Acquired Immunodeficiency Syndrome (HIV/AIDS) is a major cause of death in many African countries. Ethiopia is one of the most seriously affected countries in sub-Saharan Africa. Previously in Ethiopia, having HIV/AIDS was almost equivalent to a death sentence. With the introduction of Antiretroviral Therapy (ART), HIV/AIDS has become a chronic but manageable disease. The study focused on a data mining technique to predict the future living status of HIV/AIDS patients at the time of drug regimen change, when patients develop toxicity to the ART drug combination they are currently taking. The data were taken from the University of Gondar Hospital ART program database. A hybrid methodology is followed to explore the application of data mining on the ART program dataset. Data cleaning, handling of missing values, and data transformation were used for preprocessing the data. The WEKA 3.7.9 data mining tools, classification algorithms, and domain expertise are utilized as means to address the research problem. By using four different classification algorithms (i.e., the J48 classifier, PART rule induction, Naïve Bayes, and neural networks) and by adjusting their parameters, thirty-two models were built on the pre-processed University of Gondar ART program dataset. The performances of the models were evaluated using the standard metrics of accuracy, precision, recall, and F-measure. The most effective model to predict the status of HIV patients with drug regimen substitution is a pruned J48 decision tree with a classification accuracy of 98.01%. This study extracts interesting attributes such as Ever taking Cotrim, Ever taking TbRx, CD4 count, Age, Weight, and Gender in order to predict the status of drug regimen substitution. The outcome of this study can be used as an assistant tool for clinicians to help them make more appropriate drug regimen substitutions.
Future research directions are proposed to come up with an applicable system in the area of the study. Keywords: HIV drug regimen, data mining, hybrid methodology, predictive model
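For illustration of the splitting criterion behind the J48 (C4.5) classifier that performed best, the sketch below computes information gain for a toy binary attribute against a toy outcome; the attribute values and labels are hypothetical, not drawn from the Gondar dataset:

```python
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(attribute, labels):
    """Entropy reduction from splitting `labels` on `attribute` values."""
    n = len(labels)
    groups = {}
    for a, y in zip(attribute, labels):
        groups.setdefault(a, []).append(y)
    remainder = sum(len(ys) / n * entropy(ys) for ys in groups.values())
    return entropy(labels) - remainder

# Hypothetical toy data: a binary attribute vs. patient status after regimen change.
attr   = ["yes", "yes", "yes", "no", "no", "no"]
status = ["alive", "alive", "alive", "dead", "dead", "alive"]
gain = information_gain(attr, status)   # positive: the split is informative
```

J48 refines this criterion into the gain ratio and prunes the resulting tree, which is the kind of model that produced the reported 98.01% accuracy.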
Procedia PDF Downloads 142
22106 Optimization of Perfusion Distribution in Custom Vascular Stent-Grafts Through Patient-Specific CFD Models
Authors: Scott M. Black, Craig Maclean, Pauline Hall Barrientos, Konstantinos Ritos, Asimina Kazakidi
Abstract:
Aortic aneurysms and dissections are leading causes of death in cardiovascular disease. Both inevitably lead to hemodynamic instability without surgical intervention in the form of vascular stent-graft deployment. An accurate description of the aortic geometry and blood flow in patient-specific cases is vital for treatment planning and the long-term success of such grafts, as they must generate physiological branch perfusion and in-stent hemodynamics. The aim of this study was to create patient-specific computational fluid dynamics (CFD) models through a multi-modality, multi-dimensional approach with boundary condition optimization to predict branch flow rates and in-stent hemodynamics in custom stent-graft configurations. Three-dimensional (3D) thoracoabdominal aortae were reconstructed from four-dimensional flow-magnetic resonance imaging (4D Flow-MRI) and computed tomography (CT) medical images. The former employed a novel approach to generate and enhance vessel lumen contrast via through-plane velocity at discrete, user-defined cardiac time steps post hoc. To produce patient-specific boundary conditions (BCs), the aortic geometry was reduced to a one-dimensional (1D) model. Thereafter, a zero-dimensional (0D) 3-Element Windkessel model (3EWM) was coupled to each terminal branch to represent the distal vasculature. In this coupled 0D-1D model, the 3EWM parameters were optimized to yield branch flow waveforms which are representative of the 4D Flow-MRI-derived in-vivo data. Thereafter, a 0D-3D CFD model was created, utilizing the optimized 3EWM BCs and a 4D Flow-MRI-obtained inlet velocity profile. A sensitivity analysis of the effects of stent-graft configuration and BC parameters was then undertaken using multiple stent-graft configurations and a range of distal vasculature conditions. 4D Flow-MRI granted unparalleled visualization of blood flow throughout the cardiac cycle in both the pre- and post-surgical states.
Segmentation and reconstruction of healthy and stented regions from retrospective 4D Flow-MRI images also generated 3D models with geometries which were successfully validated against their CT-derived counterparts. The 0D-1D coupling efficiently captured branch flow and pressure waveforms, while the 0D-3D models also enabled 3D flow visualization and quantification of clinically relevant hemodynamic parameters for in-stent thrombosis and graft limb occlusion. It was apparent that changes in the 3EWM BC parameters had a pronounced effect on perfusion distribution and near-wall hemodynamics. Results show that the 3EWM parameters could be iteratively changed to simulate a range of graft limb diameters and distal vasculature conditions for a given stent-graft to determine the optimal configuration prior to surgery. To conclude, this study outlined a methodology to aid in the prediction of post-surgical branch perfusion and in-stent hemodynamics in patient-specific cases for the implementation of custom stent-grafts. Keywords: 4D flow-MRI, computational fluid dynamics, vascular stent-grafts, windkessel
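A minimal sketch of the 0D 3-Element Windkessel (3EWM) outlet model used as a boundary condition is given below, integrated with forward Euler; the impedance, resistance, compliance, and half-sine inflow are illustrative round numbers, not optimized patient-specific values:

```python
import math

# 3-Element Windkessel: characteristic impedance Zc, distal resistance Rd,
# compliance C, driven by an inflow Q(t). Governing ODE for inlet pressure P:
#   dP/dt = Zc*dQ/dt + (1 + Zc/Rd)*Q/C - P/(Rd*C)
Zc, Rd, C = 0.05, 1.0, 1.5       # mmHg*s/mL, mmHg*s/mL, mL/mmHg (illustrative)
T, dt = 1.0, 1e-4                # cardiac period (s), time step (s)

def inflow(t):
    """Half-sine systolic inflow (mL/s) over the first 0.3 s of each cycle."""
    tau = t % T
    return 400.0 * math.sin(math.pi * tau / 0.3) if tau < 0.3 else 0.0

P, t, pressures = 80.0, 0.0, []
for _ in range(int(10 * T / dt)):            # run 10 cycles toward periodicity
    Q = inflow(t)
    dQ = (inflow(t + dt) - Q) / dt           # finite-difference flow derivative
    P += dt * (Zc * dQ + (1 + Zc / Rd) * Q / C - P / (Rd * C))
    pressures.append(P)
    t += dt
```

In the coupled 0D-1D and 0D-3D models described above, the three parameters of each branch's 3EWM are the quantities iterated until the branch flow waveforms match the 4D Flow-MRI data.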
Procedia PDF Downloads 181
22105 Simulating the Hot Hand Phenomenon in Basketball with Bayesian Hidden Markov Models
Authors: Gabriel Calvo, Carmen Armero, Luigi Spezia
Abstract:
A basketball player is said to have a hot hand if his/her performance is better than expected in different periods of time. A way to deal with this phenomenon is to make use of latent variables, which can indicate whether the player is ‘on fire’ or not. This work aims to model the hot hand phenomenon through a Bayesian hidden Markov model (HMM) with two states (cold and hot) and two different probabilities of success depending on the corresponding hidden state. This task is illustrated through a comprehensive simulation study. The simulated data sets emulate the field goal attempts in an NBA season from different profile players. This model can be a powerful tool to assess the ‘streakiness’ of each player, and it provides information about the general performance of the players during the match. Finally, the Bayesian HMM allows computing the posterior probability of any type of streak. Keywords: Bernoulli trials, field goals, latent variables, posterior distribution
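A generative sketch of the two-state HMM can emulate a season of field-goal attempts; the transition probabilities and state-dependent success rates below are illustrative choices for a hypothetical player profile, not values from the paper's simulation study:

```python
import random

rng = random.Random(42)

# Hidden-state transitions and per-state Bernoulli success rates
# (illustrative values for a hypothetical player profile).
P_HOT = {"cold": 0.1, "hot": 0.8}        # P(next state = hot | current state)
P_SUCCESS = {"cold": 0.40, "hot": 0.65}  # P(made shot | state)

def simulate(n_attempts):
    """Generate hidden states and Bernoulli shot outcomes for one season."""
    state, states, shots = "cold", [], []
    for _ in range(n_attempts):
        state = "hot" if rng.random() < P_HOT[state] else "cold"
        states.append(state)
        shots.append(1 if rng.random() < P_SUCCESS[state] else 0)
    return states, shots

states, shots = simulate(5000)           # one simulated run of attempts
```

Bayesian inference on such data (e.g., via MCMC) recovers the posterior over the hidden cold/hot sequence, which is what allows computing the posterior probability of any type of streak.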
Procedia PDF Downloads 192
22104 Spectroscopic Study of a Eu-Complex Containing Hybrid Material
Authors: Y. A. R. Oliveira, M. A. Couto dos Santos, N. B. C. Júnior, S. J. L. Ribeiro, L. D. Carlos
Abstract:
The Eu(TTA)3(H2O)2 complex (TTA = thenoyltrifluoroacetone), pure (EuTTA) and incorporated in an organic-inorganic hybrid material (EuTTA-hyb), is revisited, this time from the point of view of the crystal field parameters (CFP) and the Judd-Ofelt intensity parameters (Ωλ). A detailed analysis of the emission spectra revealed that the EuTTA phase still remains in the hybrid phase. Sparkle Model calculations of the EuTTA ground-state geometry have been performed and satisfactorily compared to the X-ray structure. The observed weaker crystal field strength of the phase generated by the incorporation is promptly interpreted through the existing EXAFS results of the EuTTA-hyb structure. Satisfactory predictions of the CFP, of the 7F1 level splitting, and of the Ωλ were obtained in all cases by using the charge factors and polarizabilities as degrees of freedom of non-parametric models. Keywords: crystal field parameters, europium complexes, Judd-Ofelt intensity parameters
Procedia PDF Downloads 408
22103 Real-Time Aerial Marine Surveillance System for Safe Navigation
Authors: Vinesh Thiruchelvam, Umar Mumtaz Chowdry, Sathish Kumar Selvaperumal
Abstract:
The prime purpose of the project is to provide a sophisticated surveillance system specialized for port authorities in the maritime industry. Current aerial surveillance does not offer a wide field of view. The channels of communication are shared rather than exclusive, allowing for communication errors or disturbances, mainly due to traffic. The scope is to analyze the various aspects of real-time aerial and marine surveillance, which is one of the most important methods to ensure the domain security of sailors. The system will improve the real-time data obtained by the controller base station. The key implementation will be based on camera speed, angle, and adherence to a sustainable power utilization module. Keywords: SMS, real time, GUI, maritime industry
Procedia PDF Downloads 498
22102 A Study for the Effect of Fire Initiated Location on Evacuation Success Rate
Authors: Jin A Ryu, Hee Sun Kim
Abstract:
As the number of fire accidents gradually rises, many studies on evacuation have been reported. Previous studies have mostly focused on evaluating the safety of evacuation and the risk of fire in particular buildings. However, studies on the effects of various parameters on evacuation have rarely been conducted. Therefore, this paper aims at observing evacuation time under the effect of the fire initiation location. In this study, evacuation simulations are performed on a 5-floor building located in Seoul, South Korea, using the commercial program Fire Dynamics Simulator with Evacuation (FDS+EVAC). Only the fourth and fifth floors are modeled, with the assumption that fire starts in a room located on the fourth floor. The parameter for the evacuation simulations is the location of fire initiation, in order to observe evacuation time and safety. Results show that the closer the fire initiation location is to the exit, the more time is taken to evacuate. The case with the fire initiation location nearest to the exit has the lowest ratio of successful occupants to total occupants. In addition, for safety evaluation, the evacuation time calculated from the computer simulation model is compared with the tolerable evacuation time according to the code in Japan. As a result, all cases are completed within the tolerable evacuation time. This study allows predicting evacuation time under various fire conditions and can be used to evaluate evacuation appropriateness and the fire safety of buildings. Keywords: fire simulation, evacuation simulation, temperature, evacuation safety
Procedia PDF Downloads 350
22101 The Relationship Between Hourly Compensation and Unemployment Rate Using the Panel Data Regression Analysis
Authors: S. K. Ashiquer Rahman
Abstract:
The paper concentrates on the importance of hourly compensation, emphasizing its relationship with the unemployment rate. The unemployment rate and hourly compensation are two of the most important indicators for a nation. They are not merely statistics: they have profound effects on individuals, families, and the economy, and they are inversely related to one another. The unemployment rate will probably decline as hourly compensation in manufacturing rises, since higher compensation can reduce unemployment and improve job prospects. Increased hourly compensation in the manufacturing sector could therefore have a favorable effect on job-changing issues. Moreover, the relationship between hourly compensation and unemployment is complex and influenced by broader economic factors. In this paper, we use panel data regression models to evaluate the expected link between hourly compensation and the unemployment rate, in order to determine the effect of the former on the latter. We estimate the fixed effects model, evaluate the error components, and determine which model (the FEM or the ECM) is better by pooling all 60 observations. We then analyze the data by comparing three countries (the United States, Canada, and the United Kingdom) using panel data regression models. Finally, we provide results, analysis, and a summary of the extensive research on how hourly compensation affects the unemployment rate. Additionally, this paper offers relevant and useful information to help governments and the academic community use an econometric and social approach to lessen the effect of hourly compensation on the unemployment rate. Keywords: hourly compensation, unemployment rate, panel data regression models, dummy variables, random effects model, fixed effects model, linear regression model
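A minimal sketch of the fixed-effects (within) estimator on a synthetic three-country panel is shown below; the coefficient, sample sizes, and noise level are illustrative, not estimates from the US/Canada/UK data used in the paper:

```python
import numpy as np

# Panel model: y_it = beta * x_it + alpha_i + e_it, where alpha_i is a
# country fixed effect. Synthetic data for 3 countries over 20 years.
rng = np.random.default_rng(0)
beta_true = -0.8                                     # assumed true coefficient
x = rng.normal(10.0, 2.0, size=(3, 20))              # hourly compensation
alpha = np.array([5.0, 7.0, 9.0])[:, None]           # country fixed effects
y = beta_true * x + alpha + rng.normal(0.0, 0.3, x.shape)  # unemployment rate

# Within transformation: demeaning each country's series sweeps out alpha_i,
# so OLS on the demeaned data identifies beta.
x_dm = x - x.mean(axis=1, keepdims=True)
y_dm = y - y.mean(axis=1, keepdims=True)
beta_hat = (x_dm * y_dm).sum() / (x_dm ** 2).sum()
```

Comparing this fixed-effects fit against a random-effects (error-components) fit, for example with a Hausman test, is the FEM-versus-ECM model choice described above.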
Procedia PDF Downloads 81
22100 Flexural Strengthening of Steel Beams Using Fiber Reinforced Polymers
Authors: Sally Hosny, Mona G. Ibrahim, N. K. Hassan
Abstract:
Fiber reinforced polymer (FRP) is one of the most environmentally friendly methods for strengthening and retrofitting steel structures. The behaviour of flexurally strengthened steel I-beams using FRP was investigated. Finite element (FE) models were developed using ANSYS® as verification cases to simulate the experimental behaviour of using FRP strips to flexurally strengthen steel I-beams. Two experimental studies were selected for verification; the first examined the effect of different thicknesses and moduli of elasticity, while the second studied the effect of applying different carbon fiber reinforced polymer (CFRP) bond lengths. The proposed FE models were in good agreement with the experimental results in terms of failure modes, load-bearing capacities, and strain distribution on the CFRP strips. The verified FE models can be utilized to conduct a parametric study in which various widths (40, 50, 60, 70, and 80 mm), thicknesses (1.2, 2, and 4 mm), and lengths (1500, 1700, and 1800 mm) of CFRP were analyzed. The results clearly revealed that the load-bearing capacity was significantly increased (+7%) when the width and thickness were increased. However, the load-bearing capacity was only slightly affected by using longer CFRP strips. Moreover, applying glass fiber reinforced polymer (GFRP) strips of 1500 mm in length, 50 mm in width, and thicknesses of 1.2, 2, and 4 mm was also investigated. The load-bearing capacity of I-beams strengthened using GFRP is on average 8% less than with CFRP. Statistical analysis was conducted using Minitab®. Keywords: FRP, strengthened steel I-beams, flexural, FEM, ANSYS
Procedia PDF Downloads 280