Search results for: interlaminar damage model
16678 Research on an Air Pollution Spatiotemporal Forecast Model Based on LSTM
Authors: JingWei Yu, Hong Yang Yu
Abstract:
At present, increasingly serious air pollution in many Chinese cities has made people pay more attention to the air quality index (hereinafter referred to as AQI) of their living areas. In this situation, it is of great significance to predict air pollution in heavily polluted areas. In this paper, a spatiotemporal prediction model of PM2.5 concentration in Mianyang, Sichuan Province, is established based on the LSTM time series model. The model fully considers the temporal variability and spatial distribution characteristics of PM2.5 concentration: the air quality of a monitoring station is predicted from the air quality status, including AQI and meteorological data, of other nearby monitoring stations. The experimental results show that the method has good prediction accuracy, with a fitting degree to the measured data above 0.7, and can be applied to modeling and predicting the spatiotemporal distribution of regional PM2.5 concentration.
Keywords: LSTM, PM2.5, neural networks, spatio-temporal prediction
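The spatial correlation idea above, predicting one station from recent readings at nearby stations, can be sketched as a supervised window construction (station names, window length, and PM2.5 readings below are illustrative, not from the paper):

```python
# Sketch: build supervised (window -> next value) samples for one target
# station from multi-station PM2.5 series, the input shape an LSTM-style
# model would consume. All names and numbers are hypothetical.

def make_windows(series_by_station, target, window=3):
    """series_by_station: dict station -> list of PM2.5 readings.
    Returns (X, y): X[i] stacks the last `window` readings of every
    station; y[i] is the target station's next reading."""
    stations = sorted(series_by_station)
    n = len(series_by_station[target])
    X, y = [], []
    for t in range(window, n):
        X.append([series_by_station[s][t - window:t] for s in stations])
        y.append(series_by_station[target][t])
    return X, y

readings = {
    "station_A": [35, 40, 42, 50, 55],   # hypothetical PM2.5 values
    "station_B": [30, 33, 37, 45, 48],
}
X, y = make_windows(readings, target="station_A", window=3)
```

An LSTM would then be trained on (X, y) pairs of this shape; the sketch covers only the data preparation, not the network itself.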
Procedia PDF Downloads 137
16677 Investigation of the Excitotoxicity Pathways in Neuroblastoma Cells
Authors: Merve Colak, Gizem Donmez Yalcin
Abstract:
Glutamate performs many neurological functions in the central nervous system and is found at high concentrations in the brain. Increased levels of glutamate in the neuronal space are toxic, causing neuron damage and death; this is called glutamate-induced excitotoxicity. Excitotoxicity is among the causes of many neurological diseases, such as trauma, cerebral ischemia, epilepsy, Parkinson's disease, and Alzheimer's disease. Since neuroblastoma cells are known to be susceptible to excitotoxicity, we propose that excitotoxicity can be studied in these cells. It can be induced with kainic acid and, by measuring glutamate secretion, analyzed in neuroblastoma cells.
Keywords: glutamate, excitotoxicity, kainic acid, Sirt4
Procedia PDF Downloads 161
16676 Evaluation of CERES Wheat and Rice Model for Climatic Conditions in Haryana, India
Authors: Mamta Rana, K. K. Singh, Nisha Kumari
Abstract:
Simulation models, with their interacting soil-weather-plant-atmosphere system, are important tools for assessing crops under changing climatic conditions. The CERES-Wheat and CERES-Rice models (v4.6, DSSAT) were calibrated and evaluated for Haryana, India, one of the country's major wheat- and rice-producing states. The simulation runs were made under irrigated conditions with three N-P-K fertilizer application doses to estimate crop yield and other growth parameters, along with the phenological development of the crop. The genetic coefficients were derived by iteratively adjusting the coefficients that characterize the phenological processes of the wheat and rice crops until the best match was obtained between simulated and observed anthesis, physiological maturity, and final grain yield. The model was validated by plotting simulated LAI against remote-sensing-derived LAI; the remote-sensing LAI product offers spatially explicit, timely, and accurate crop assessment. For validating yield and yield components, the error percentage between observed and simulated data was calculated. The analysis shows that the model can simulate crop yield and yield components for the wheat and rice cultivars under different management practices. During validation, the error percentage was less than 10%, indicating the utility of the calibrated model for climate risk assessment in the selected region.
Keywords: simulation model, CERES-Wheat and Rice model, crop yield, genetic coefficient
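The yield validation criterion, an error percentage below 10% between observed and simulated values, can be written out directly (the yield figures below are hypothetical, not from the paper):

```python
def percent_error(observed, simulated):
    """Absolute percentage error between observed and simulated values."""
    return abs(observed - simulated) / observed * 100.0

# Hypothetical wheat grain yields (kg/ha)
obs, sim = 4200.0, 4000.0
err = percent_error(obs, sim)  # about 4.76, within the 10% threshold
```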
Procedia PDF Downloads 305
16675 Single Ended Primary Inductance Converter with Internal Model Controller
Authors: Fatih Suleyman Taskincan, Ahmet Karaarslan
Abstract:
This article presents the study and analysis of a Single Ended Primary Inductance Converter (SEPIC) for battery-charging applications intended for military use. This kind of converter is attractive because its output polarity is not reversed. Because the capacitors charge and discharge through an inductance, no current peaks occur in them, so the efficiency is high compared to buck-boost converters. In this study, the SEPIC is designed to operate with an Internal Model Controller (IMC). Traditional controllers such as the proportional-integral controller are not preferred because of their linear behavior; hence, an IMC is designed for this converter. This model-based controller provides more robustness and better set-point tracking, and it can be used for unstable processes whose dynamic operation conventional controllers cannot handle. The converter and its controller are simulated in the MATLAB/Simulink environment, and the results are presented and discussed.
Keywords: DC/DC converter, single ended primary inductance converter, SEPIC, internal model controller, IMC, switched mode power supply
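For the ideal SEPIC in continuous conduction mode, the textbook relation V_out = V_in * D / (1 - D) fixes the duty cycle for a given operating point; a small sketch (the 12 V input / 16.8 V output battery-charging point is an assumed example, not from the paper):

```python
def sepic_duty_cycle(v_in, v_out):
    """Ideal continuous-conduction-mode duty cycle of a SEPIC:
    V_out = V_in * D / (1 - D)  =>  D = V_out / (V_in + V_out)."""
    return v_out / (v_in + v_out)

# Hypothetical battery-charging operating point: 12 V in, 16.8 V out
D = sepic_duty_cycle(12.0, 16.8)   # about 0.583
```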
Procedia PDF Downloads 631
16674 Household Perspectives and Resistance to Preventive Relocation in Flood Prone Areas: A Case Study in the Polwatta River Basin, Southern Sri Lanka
Authors: Ishara Madusanka, So Morikawa
Abstract:
Natural disasters, particularly floods, pose severe challenges globally, affecting both developed and developing countries. In many regions, especially Asia, riverine floods are prevalent and devastating. Integrated flood management incorporates structural and non-structural measures, with preventive relocation emerging as a cost-effective and proactive strategy for areas repeatedly impacted by severe flooding. However, preventive relocation is often hindered by economic, psychological, social, and institutional barriers. This study investigates the factors influencing resistance to preventive relocation and evaluates the role of flood risk information in shaping relocation decisions through risk perception. A conceptual model was developed, incorporating variables such as Flood Risk Information (FRI), Place Attachment (PA), Good Living Conditions (GLC), and Adaptation to Flooding (ATF), with Flood Risk Perception (FRP) serving as a mediating variable. The research was conducted in Welipitiya in the Polwatta river basin, Matara district, Sri Lanka, a region experiencing recurrent flood damage. For this study, an experimental design involving a structured questionnaire survey was utilized, with 185 households participating. The treatment group received flood risk information, including flood risk maps and historical data, while the control group did not. Data were collected in 2023 and analyzed using independent sample t-tests and Partial Least Squares Structural Equation Modeling (PLS-SEM). PLS-SEM was chosen for its ability to model latent variables, handle complex relationships, and suitability for exploratory research. Multi-group Analysis (MGA) assessed variations across different flood risk areas. Findings indicate that flood risk information had a limited impact on flood risk perception and relocation decisions, though its effect was significant in specific high-risk areas. 
Place attachment was a significant factor influencing relocation decisions across the sample. One potential reason for the limited impact of flood risk information on relocation decisions could be the lack of specificity in the information provided. The results suggest that while flood risk information alone may not significantly influence relocation decisions, it is crucial in specific contexts. Future studies and practitioners should focus on providing more detailed risk information and on addressing psychological factors such as place attachment to enhance preventive relocation efforts.
Keywords: flood risk communication, flood risk perception, place attachment, preventive relocation, structural equation modeling
Procedia PDF Downloads 34
16673 Chronic Renal Failure Associated with Heavy Metal Contamination of Drinking Water in Hail, Kingdom of Saudi Arabia
Authors: Elsayed A. M. Shokr, A. Alhazemi, T. Naser, Talal A. Zuhair, Adel A. Zuhair, Ahmed N. Alshamary, Thamer A. Alanazi, Hosam A. Alanazi
Abstract:
The main threats to human health from heavy metals are associated with exposure to Pb, Cd, Cu, Mo, Zn, Ni, Mn, Co, and Cr, with intake of drinking water being the most important source in most populations. These metals have been extensively studied, and their effects on human health are regularly reviewed by international bodies such as the WHO. Heavy metals have been used by humans for thousands of years. Although several adverse health effects of heavy metals have long been known, exposure continues and is even increasing in some parts of the world, particularly in less developed countries, though emissions have declined in most developed countries over the last 100 years. This study identified a strong relationship between drinking water contaminated with heavy metals from some of the water stations in Hail, KSA, and chronic diseases such as renal failure, liver cirrhosis, and chronic anemia. These diseases are apparently related to drinking water contaminated with heavy metals such as Pb, Cd, Cu, Mo, Zn, Ni, Mn, Co, and Cr: renal failure is related to lead and cadmium, liver cirrhosis to copper and molybdenum, and chronic anemia to copper and cadmium. Recent data indicate that adverse health effects of cadmium exposure may occur at lower exposure levels than previously anticipated, primarily in the form of kidney damage but possibly also bone effects and fractures. The general population is primarily exposed to mercury via drinking water, a major source of methylmercury exposure, and via dental amalgam. Over the last century, exposure to lead, cadmium, zinc, iron, and arsenic has likewise occurred mainly via intake of drinking water, and long-term exposure to these metals in drinking water manifests primarily as kidney damage. 
Studies of these diseases suggest that their abnormal incidence in specific areas is related to toxic materials in the groundwater that have contaminated the drinking water of these areas.
Keywords: heavy metals, liver functions, kidney functions, chronic renal failure, Hail, renal, water
Procedia PDF Downloads 320
16672 Study of Inhibition of the End Effect Based on AR Model Prediction Combining Data Extension and Window Functions
Authors: Pan Hongxia, Wang Zhenhua
Abstract:
In this paper, the end effect that arises during empirical mode decomposition (EMD) is effectively suppressed by combining AR-model-based data extension with a window function method. Simulations on a synthetic signal show that the method achieves the desired effect; applying it to gearbox test data also gives good results and improves the calculation accuracy of subsequent data processing. This lays a good foundation for gearbox fault diagnosis under various working conditions.
Keywords: gearbox, fault diagnosis, AR model, end effect
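As a rough illustration of the AR-based data extension step (the paper's AR order and signals are not given; this sketch uses a least-squares AR(1) fit on a synthetic geometric series):

```python
def ar1_fit(x):
    """Least-squares AR(1) coefficient: x[t] ~ phi * x[t-1]."""
    num = sum(a * b for a, b in zip(x[:-1], x[1:]))
    den = sum(a * a for a in x[:-1])
    return num / den

def extend_ends(x, n_pad):
    """Extend the series at both ends with AR(1) forecasts, a simple
    stand-in for the AR-based data extension used against the EMD
    end effect. Order and padding length are illustrative; running the
    recursion backwards (division by phi) assumes phi is not tiny."""
    phi = ar1_fit(x)
    right, last = [], x[-1]
    for _ in range(n_pad):
        last = phi * last
        right.append(last)
    left, first = [], x[0]
    for _ in range(n_pad):
        first = first / phi
        left.append(first)
    return left[::-1] + list(x) + right

x = [1.0, 0.5, 0.25, 0.125]      # geometric series, true phi = 0.5
ext = extend_ends(x, 2)
```

A window function would then be applied to the extended series before EMD, so that decomposition errors concentrate in the padded ends, which are discarded afterwards.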
Procedia PDF Downloads 368
16671 Media Planning Decisions and Preferences through a Goal Programming Model: An Application to a Media Campaign for a Mature Product in Italy
Authors: Cinzia Colapinto, Davide La Torre
Abstract:
Goal Programming (GP) and its variants have been applied over the last decades to marketing and to specific marketing issues such as media scheduling problems. The concept of satisfaction functions has been widely utilized in GP models to explicitly integrate the Decision-Maker's preferences, which can be guided by the available information about the decision-making situation. A GP model with satisfaction functions for media planning decisions is proposed and then illustrated through a case study of a marketing/media campaign in the Italian market.
Keywords: goal programming, satisfaction functions, media planning, tourism management
Procedia PDF Downloads 401
16670 Measurement Tools of the Maturity Model for IT Service Outsourcing in Higher Education Institutions
Authors: Victoriano Valencia García, Luis Usero Aragonés, Eugenio J. Fernández Vicente
Abstract:
Nowadays, the successful implementation of ICTs is vital for almost any kind of organization. Good governance and ICT management are essential for delivering value, managing technological risks and resources, and measuring performance. In addition, outsourcing is a strategic IT service solution that complements the IT services provided internally in organizations. This paper proposes the measurement tools of a new holistic maturity model based on the standards ISO/IEC 20000 and ISO/IEC 38500 and on the frameworks and best practices of ITIL and COBIT, with a specific focus on IT outsourcing. These measurement tools allow independent validation and practical application in the field of higher education, using a questionnaire, metrics tables, and continuous improvement plan tables as part of the measurement process. Guidelines and standards are proposed in the model to facilitate adaptation to universities and to achieve excellence in the outsourcing of IT services.
Keywords: IT governance, IT management, IT services, outsourcing, maturity model, measurement tools
Procedia PDF Downloads 593
16669 Variability Management of Contextual Feature Model in Multi-Software Product Line
Authors: Muhammad Fezan Afzal, Asad Abbas, Imran Khan, Salma Imtiaz
Abstract:
The Software Product Line (SPL) paradigm is used to develop families of software products that share common and variable features. A feature model is a domain artifact of SPL that consists of common and variable features with predefined relationships and constraints. Multiple SPLs, such as those for mobile phones and tablets, contain many similar common and variable features. Reusing common and variable features across different SPL domains is a complex task because of the external relationships and constraints among features in the feature model. To increase the reusability of feature model resources from domain engineering, the commonality of features must be managed at the level of SPL application development. In this research, we propose an approach that combines multiple SPLs into a single domain and converts them to a common feature model. Extracting the common features from different feature models is more effective and reduces cost and time to market for application development. The proposed framework extracts features from multiple SPLs in three steps: 1) find the variation points, 2) find the constraints, and 3) combine the feature models into a single feature model on the basis of the variation points and constraints. This approach increases the reusability of features across multiple feature models; its impact is reduced development cost, shorter time to market, and more SPL products.
Keywords: software product line, feature model, variability management, multi-SPLs
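The three-step merge can be illustrated with plain set operations over hypothetical feature sets for two SPLs (feature names are invented for the example; real feature models also carry the cross-tree constraints collected in step 2):

```python
# Hypothetical feature sets for a phone SPL and a tablet SPL
phone  = {"call", "camera", "wifi", "sms"}
tablet = {"camera", "wifi", "stylus", "hdmi"}

# Step 1: variation points = features not shared by every SPL
variation_points = phone ^ tablet
# Step 2: cross-tree constraints would be collected here, e.g.
# "stylus requires touchscreen"; omitted in this sketch
constraints = {}
# Step 3: combined model = common core plus variation points
common = phone & tablet
combined = common | variation_points
```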
Procedia PDF Downloads 70
16668 Simulation Programs to Education of Crisis Management Members
Authors: Jiri Barta
Abstract:
This paper deals with simulation programs and technologies used in the educational process for members of crisis management. Risk analysis, simulation, preparation, and planning are among the main activities of crisis management workers. A correctly performed simulation of an emergency defines the extent of the danger; on this basis, it is possible to effectively prepare and plan measures to minimize damage. The paper focuses on simulation programs taught at the University of Defence. Implementing the outputs of simulation programs in the decision-making processes of crisis staffs is one of the main tasks of the research project.
Keywords: crisis management, continuity, critical infrastructure, dangerous substance, education, flood, simulation programs
Procedia PDF Downloads 465
16667 Use of Two-Dimensional Hydraulics Modeling for Design of Erosion Remedy
Authors: Ayoub El Bourtali, Abdessamed Najine, Amrou Moussa Benmoussa
Abstract:
One of the main goals of river engineering is river training, defined as controlling and predicting the behavior of a river: taking effective measures to eliminate related risks and thus improve the river system. In some rivers, the riverbed continues to erode and degrade, so equilibrium will never be reached. River geometric characteristics and riverbed erosion analysis are among the most complex but critical topics in river engineering and sediment hydraulics; riverbank erosion is a key response process in hydrodynamics and has a major impact on the ecological chain and on socio-economic processes. This study aims to integrate new computer technology that can analyze erosion and hydraulic problems through computer simulation and modeling. Choosing the right model remains a difficult and sensitive job for field engineers. This paper makes use of version 5.0.4 of the HEC-RAS model; the river section is selected according to the gauged station and the proximity of the adjustment. In this work, we demonstrate how 2D hydraulic modeling helped clarify the design and provided visualizations of depths and velocities at riverbanks and around advanced structures. The Hydrologic Engineering Center's River Analysis System (HEC-RAS) 2D model was used to create a hydraulic study of the erosion model. The geometric data were generated from a 12.5 m x 12.5 m resolution digital elevation model. In addition to showing eroded or overturned river sections, the model output also shows patterns of riverbank change, which can help reduce problems caused by erosion.
Keywords: 2D hydraulics model, erosion, floodplain, hydrodynamic, HEC-RAS, riverbed erosion, river morphology, resolution digital data, sediment
Procedia PDF Downloads 191
16666 Numerical Simulation of the Kurtosis Effect on the EHL Problem
Authors: S. Gao, S. Srirattayawong
Abstract:
In this study, a computational fluid dynamics (CFD) model has been developed for studying the effect of the surface roughness profile on the EHL problem. The cylinder contact geometry, meshing, and solution of the conservation of mass and momentum equations are carried out using the commercial software packages ICEM CFD and ANSYS Fluent. User-defined functions (UDFs) for the density, viscosity, and elastic deformation of the cylinders as functions of pressure and temperature have been defined for the CFD model. Three different surface roughness profiles are created and incorporated into the model. The developed CFD model can predict the characteristics of fluid flow and heat transfer in the EHL problem, including leading parameters such as the pressure distribution, minimum film thickness, viscosity, and density changes. The results show that the pressure profile at the center of the contact area relates directly to the roughness amplitude, and that a rough surface with a kurtosis value above 3 produces a more strongly fluctuating pressure distribution than the other cases.
Keywords: CFD, EHL, kurtosis, surface roughness
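The kurtosis threshold discussed above is the fourth standardized moment of the height profile; a minimal sketch with a hypothetical spiky profile (heights are invented for the example):

```python
def kurtosis(profile):
    """Population kurtosis of a surface-height profile; a Gaussian
    surface gives 3, spiky surfaces give values above 3."""
    n = len(profile)
    mean = sum(profile) / n
    m2 = sum((h - mean) ** 2 for h in profile) / n   # variance
    m4 = sum((h - mean) ** 4 for h in profile) / n   # fourth moment
    return m4 / (m2 * m2)

spiky = [0, 0, 0, 0, 10, 0, 0, 0, 0, -10, 0, 0]  # hypothetical heights
k = kurtosis(spiky)                               # 6.0, i.e. above 3
```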
Procedia PDF Downloads 320
16665 Risk of Fatal and Non-Fatal Coronary Heart Disease and Stroke Events among Adult Patients with Hypertension: Basic Markov Model Inputs for Evaluating Cost-Effectiveness of Hypertension Treatment: Systematic Review of Cohort Studies
Authors: Mende Mensa Sorato, Majid Davari, Abbas Kebriaeezadeh, Nizal Sarrafzadegan, Tamiru Shibru, Behzad Fatemi
Abstract:
Markov models, such as the cardiovascular disease (CVD) policy model based simulation, are used for evaluating the cost-effectiveness of hypertension treatment. Stroke, angina, myocardial infarction (MI), cardiac arrest, and all-cause mortality are included in this model. Hypertension is a risk factor for a number of vascular and cardiac complications and CVD outcomes. Objective: This systematic review was conducted to evaluate the comprehensiveness of this model across different regions globally. Methods: We searched articles written in English from PubMed/Medline, Ovid/Medline, Embase, Scopus, Web of Science, and Google Scholar with a systematic search query. Results: Thirteen cohort studies involving a total of 2,165,770 adults (1,666,554 with hypertension and 499,226 with treatment-resistant hypertension) were included in this scoping review. Hypertension is clearly associated with coronary heart disease (CHD) and stroke mortality, unstable angina, stable angina, MI, heart failure (HF), sudden cardiac death, transient ischemic attack, ischemic stroke, subarachnoid hemorrhage, intracranial hemorrhage, peripheral arterial disease (PAD), and abdominal aortic aneurysm (AAA). The association between HF and hypertension varies across regions. Treatment-resistant hypertension is associated with a higher relative risk of major cardiovascular events and all-cause mortality than non-resistant hypertension; however, it is not included in the previous CVD policy model. Conclusion: The CVD policy model can be used in most regions for evaluating the cost-effectiveness of hypertension treatment. However, hypertension is strongly associated with HF in Latin America, the Caribbean, Eastern Europe, and Sub-Saharan Africa; therefore, it is important to consider HF in the CVD policy model when evaluating the cost-effectiveness of hypertension treatment in these regions. 
We do not suggest including PAD and AAA in the CVD policy model for evaluating the cost-effectiveness of hypertension treatment, due to a lack of sufficient evidence. Researchers should consider the effect of treatment-resistant hypertension, either by including it in the basic model or when setting the model assumptions.
Keywords: cardiovascular disease policy model, cost-effectiveness analysis, hypertension, systematic review, twelve major cardiovascular events
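The CVD policy model's Markov structure can be sketched as a one-cycle cohort transition (states and annual probabilities below are illustrative placeholders, not values from the reviewed studies):

```python
# Minimal Markov cohort step for a CVD-style policy model.
STATES = ["well", "chd", "stroke", "dead"]
P = {                                  # rows sum to 1; all values invented
    "well":   {"well": 0.90, "chd": 0.05, "stroke": 0.03, "dead": 0.02},
    "chd":    {"well": 0.00, "chd": 0.85, "stroke": 0.05, "dead": 0.10},
    "stroke": {"well": 0.00, "chd": 0.05, "stroke": 0.80, "dead": 0.15},
    "dead":   {"well": 0.00, "chd": 0.00, "stroke": 0.00, "dead": 1.00},
}

def step(dist):
    """Advance the cohort distribution by one model cycle (one year)."""
    return {s: sum(dist[r] * P[r][s] for r in STATES) for s in STATES}

cohort = {"well": 1.0, "chd": 0.0, "stroke": 0.0, "dead": 0.0}
year1 = step(cohort)
```

Adding HF for the regions named above would mean adding an "hf" state and a corresponding row and column to the transition matrix.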
Procedia PDF Downloads 73
16664 Standard Resource Parameter Based Trust Model in Cloud Computing
Authors: Shyamlal Kumawat
Abstract:
Cloud computing is shifting how IT capital is utilized. It dynamically delivers convenient, on-demand access to shared pools of software resources, platforms, and hardware as a service through the internet, a model made possible by sophisticated automation, provisioning, and virtualization technologies. Users want the ability to access these services, including infrastructure resources, how and when they choose. To accommodate this shift in the consumption model, the technology has to deal with the security, compatibility, and trust issues associated with delivering that convenience to application business owners, developers, and users. Among these issues, trust has attracted extensive attention in cloud computing as a means to enhance security. This paper proposes a trusted computing approach, the Standard Resource Parameter Based Trust Model in Cloud Computing, for selecting appropriate cloud service providers. The direct trust of cloud entities is computed on the basis of past interaction evidence and sustained by present performance. Various SLA parameters between consumer and provider are considered in the trust computation and compliance process. Simulations are performed using the CloudSim framework, and the experimental results show that the proposed model is effective and extensible.
Keywords: cloud, IaaS, SaaS, PaaS
Procedia PDF Downloads 332
16663 Verification & Validation of Map Reduce Program Model for Parallel K-Medoid Algorithm on Hadoop Cluster
Authors: Trapti Sharma, Devesh Kumar Srivastava
Abstract:
This paper analyzes a MapReduce implementation and verifies and validates the MapReduce solution model for the parallel K-Medoid algorithm on a Hadoop cluster. MapReduce is a programming model that enables the processing of huge amounts of data in parallel on a large number of devices. It is especially well suited to static or moderately changing data sets and has gradually become the framework of choice for "big data". The MapReduce model allows systematic and rapid processing of large-scale data with a cluster of compute nodes. One of the primary concerns in Hadoop is how to minimize the completion time (i.e., makespan) of a set of MapReduce jobs. In this paper, we have verified and validated various MapReduce applications, including wordcount, grep, terasort, and the parallel K-Medoid clustering algorithm. We have found that as the number of nodes increases, the completion time decreases.
Keywords: Hadoop, MapReduce, K-Medoid, validation, verification
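The MapReduce programming model itself can be illustrated with a single-process wordcount, the first of the applications verified above (Hadoop distributes the same map/shuffle/reduce steps across cluster nodes):

```python
from itertools import groupby

def map_phase(document):
    """Mapper: emit a (word, 1) pair for every word in the document."""
    return [(w, 1) for w in document.split()]

def reduce_phase(pairs):
    """Shuffle/sort by key, then reduce: sum the counts per word."""
    pairs.sort(key=lambda kv: kv[0])
    return {k: sum(v for _, v in g)
            for k, g in groupby(pairs, key=lambda kv: kv[0])}

docs = ["big data big cluster", "data cluster data"]
intermediate = [kv for d in docs for kv in map_phase(d)]
counts = reduce_phase(intermediate)
```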
Procedia PDF Downloads 370
16662 An Energy-Efficient Model of Integrating Telehealth IoT Devices with Fog and Cloud Computing-Based Platform
Authors: Yunyong Guo, Sudhakar Ganti, Bryan Guo
Abstract:
The rapid growth of telehealth Internet of Things (IoT) devices has raised concerns about energy consumption and efficient data processing. This paper introduces an energy-efficient model that integrates telehealth IoT devices with a fog and cloud computing-based platform, offering a sustainable and robust solution to overcome these challenges. Our model employs fog computing as a localized data processing layer while leveraging cloud computing for resource-intensive tasks, significantly reducing energy consumption. We incorporate adaptive energy-saving strategies. Simulation analysis validates our approach's effectiveness in enhancing energy efficiency for telehealth IoT systems integrated with localized fog nodes and both private and public cloud infrastructures. Future research will focus on further optimization of the energy-saving model, exploring additional functional enhancements, and assessing its broader applicability in other healthcare and industry sectors.
Keywords: energy-efficient, fog computing, IoT, telehealth
Procedia PDF Downloads 88
16661 Petri Net Modeling and Simulation of a Call-Taxi System
Authors: T. Godwin
Abstract:
A call-taxi system is a type of taxi service in which a taxi is requested through a phone call or mobile app. The schematic functioning of a call-taxi system is modeled using a Petri net, which captures the conditions for a taxi to be assigned by a dispatcher to pick up a customer as well as the conditions for the taxi to be released by the customer. A Petri net is a graphical modeling tool used to understand sequences, concurrences, and confluences of activities in the working of discrete event systems; it uses tokens on a directed bipartite multigraph to simulate the activities of a system. The Petri net model is translated into a simulation model, and a call-taxi system is simulated. The simulation model helps in evaluating the operation of a call-taxi system based on fleet size as well as the operating policies for call-taxi assignment and empty call-taxi repositioning. The developed Petri-net-based simulation model can be used to decide the fleet size and the call-taxi assignment policies for a call-taxi system.
Keywords: call-taxi, discrete event system, petri net, simulation modeling
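The assignment and release conditions described above can be sketched as Petri-net token firing rules (place and transition names are illustrative, not taken from the paper):

```python
# A transition fires only when every input place holds a token;
# firing consumes one token per input place and adds one per output.
marking = {"idle_taxi": 2, "waiting_customer": 1, "busy_taxi": 0}

TRANSITIONS = {
    "assign":  {"in": ["idle_taxi", "waiting_customer"], "out": ["busy_taxi"]},
    "release": {"in": ["busy_taxi"], "out": ["idle_taxi"]},
}

def enabled(t, m):
    return all(m[p] >= 1 for p in TRANSITIONS[t]["in"])

def fire(t, m):
    assert enabled(t, m), f"transition {t} not enabled"
    for p in TRANSITIONS[t]["in"]:
        m[p] -= 1
    for p in TRANSITIONS[t]["out"]:
        m[p] += 1
    return m

fire("assign", marking)   # dispatcher assigns a taxi to the customer
```

The simulation model then repeatedly fires enabled transitions (with timing attached) to evaluate fleet size and assignment policies.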
Procedia PDF Downloads 427
16660 Modeling Waiting and Service Time for Patients: A Case Study of Matawale Health Centre, Zomba, Malawi
Authors: Moses Aron, Elias Mwakilama, Jimmy Namangale
Abstract:
Spending long periods in queues for a basic service remains a common challenge in most developing countries, including Malawi. In the health sector in particular, the Out-Patient Department (OPD) experiences long queues, which puts the lives of patients at risk. By using queuing analysis to understand the nature of the problems and the efficiency of service systems, such problems can be abated. Depending on the kind of service, the literature proposes different queuing models. However, rather than using the generalized models assumed in the literature, real-time case study data can give a deeper understanding of the particular problem's model and how that model varies from one day to another and from one case to another. This study uses data obtained from blood pressure (BP), pediatric, and general OPD cases at one urban health centre (HC) to investigate the average queuing time of patients within the system. It seeks to identify the proper queuing model by investigating the distribution functions of patients' arrival times, inter-arrival times, waiting times, and service times. Compared with the standard values set by the WHO, the study found that patients at this HC spend more time waiting than being served. On model investigation, different days presented different models, ranging from the commonly assumed M/M/1 and M/M/2 to M/Er/2. Through sensitivity analysis, the commonly assumed M/M/1 model failed to fit the data in general, whereas an M/Er/2 model fitted well. An M/Er/3 model performed well in terms of resource utilization, suggesting a need to increase medical personnel at this HC; however, an M/Er/4 model caused more idleness of human resources.
Keywords: health care, out-patient department, queuing model, sensitivity analysis
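For the commonly assumed M/M/1 baseline mentioned above, the steady-state measures follow closed-form formulas; a sketch with hypothetical OPD arrival and service rates (the Erlang-service M/Er/c variants the study settled on need numerical methods instead):

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 measures (requires lam < mu):
    utilization rho, mean number in system L, mean queue wait Wq."""
    rho = lam / mu              # server utilization
    L = rho / (1 - rho)         # mean number of patients in system
    Wq = rho / (mu - lam)       # mean waiting time in queue
    return rho, L, Wq

# Hypothetical OPD figures: 4 patients/hour arrive, 5 served/hour
rho, L, Wq = mm1_metrics(4.0, 5.0)   # rho = 0.8, L = 4, Wq = 0.8 h
```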
Procedia PDF Downloads 438
16659 An Improved Multiple Scattering Reflectance Model Based on Specular V-Cavity
Authors: Hongbin Yang, Mingxue Liao, Changwen Zheng, Mengyao Kong, Chaohui Liu
Abstract:
Microfacet-based reflection models are widely used to model light reflection from rough surfaces and have become the standard building block for describing specular surface components with varying roughness. Yet, while they possess many desirable properties and produce convincing results, their design ignores important sources of scattering, which can cause a significant loss of energy: they simulate only single scattering on the microfacets and ignore subsequent interactions. As the roughness increases, these interactions become more and more important, so a multiple-scattering microfacet model based on a specular V-cavity is presented for this important open problem. However, such a model spends much unnecessary rendering time if the same number of scattering events is used for surfaces of different roughness. In this paper, we design a geometric attenuation term G to compute the BRDF (bidirectional reflectance distribution function) of multiple scattering on rough surfaces, and we determine the number of scattering events by deterministic heuristics for surfaces of different roughness. As a result, our model produces object appearance similar to the state-of-the-art model with significantly improved rendering efficiency. Finally, we derive a multiple-scattering BRDF based on the original microfacet framework.
Keywords: bidirectional reflection distribution function, BRDF, geometric attenuation term, multiple scattering, V-cavity model
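The single-scattering V-cavity geometric attenuation term that the proposed multiple-scattering G extends is the classic Cook-Torrance form; a minimal sketch (the paper's own multiple-scattering term is not reproduced here, and the vectors are assumed normalized):

```python
# Cook-Torrance V-cavity geometric attenuation:
# G = min(1, 2(N.H)(N.V)/(V.H), 2(N.H)(N.L)/(V.H))
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def g_vcavity(n, l, v, h):
    nh, nv, nl, vh = dot(n, h), dot(n, v), dot(n, l), dot(v, h)
    return min(1.0, 2 * nh * nv / vh, 2 * nh * nl / vh)

n = (0.0, 0.0, 1.0)          # surface normal
v = (0.0, 0.0, 1.0)          # view direction (normal incidence)
l = (0.0, 0.0, 1.0)          # light direction
h = (0.0, 0.0, 1.0)          # half vector
G = g_vcavity(n, l, v, h)    # no shadowing/masking at normal incidence
```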
Procedia PDF Downloads 116
16658 Numerical Study on Parallel Rear-Spoiler on Super Cars
Authors: Anshul Ashu
Abstract:
Computers are applied to vehicle aerodynamics in two ways: Computational Fluid Dynamics (CFD) and Computer-Aided Flow Visualization (CAFV). CFD is chosen here because it presents results with computer graphics. Simulating the flow field around the vehicle is one of the important CFD applications; the flow field can be solved numerically using panel methods, the k-ε method, and direct simulation methods. A spoiler is a tool in vehicle aerodynamics used to minimize unfavorable aerodynamic effects around the vehicle, and a parallel spoiler is a set of two spoilers designed so as to effectively reduce drag. In this study, the standard k-ε model is applied to simplified versions of the Bugatti Veyron, Audi R8, and Porsche 911 to simulate the external flow field, with flow simulations run for variable Reynolds numbers. The simulations consist of three levels: first over the model without a rear spoiler, second over the model with a single rear spoiler, and third over the model with a parallel rear spoiler. The second and third levels vary the following parameters: the shape of the spoiler, the angle of attack, and the attachment position. A thorough analysis of the simulation results has been carried out, and a new parallel spoiler has been designed. It shows a modest improvement in vehicle aerodynamics, with a decrease in aerodynamic drag and lift, leading to better fuel economy and traction force for the model.
Keywords: drag, lift, flow simulation, spoiler
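Drag and lift reported by such simulations reduce to coefficients via F = 0.5 * rho * v^2 * C * A; a sketch with assumed values (density, speed, coefficient, and frontal area below are illustrative, not the paper's results):

```python
def aero_force(rho, v, c, area):
    """Aerodynamic drag or lift force: F = 0.5 * rho * v^2 * C * A."""
    return 0.5 * rho * v ** 2 * c * area

# Hypothetical values: air density 1.225 kg/m^3, 69.4 m/s (~250 km/h),
# drag coefficient 0.36, frontal area 2.0 m^2
drag = aero_force(1.225, 69.4, 0.36, 2.0)   # roughly 2.1 kN
```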
Procedia PDF Downloads 502
16657 Predicting Returns Volatilities and Correlations of Stock Indices Using Multivariate Conditional Autoregressive Range and Return Models
Authors: Shay Kee Tan, Kok Haur Ng, Jennifer So-Kuen Chan
Abstract:
This paper extends the conditional autoregressive range (CARR) model to a multivariate CARR (MCARR) model and further to a two-stage MCARR-return model to model and forecast the volatilities, correlations and returns of multiple financial assets. The first stage fits the scaled realised Parkinson volatility measures of the individual series and their pairwise sums of indices to the MCARR model to obtain in-sample estimates and forecasts of volatilities for these individual and pairwise-sum series. Covariances are then calculated to construct the fitted variance-covariance matrix of returns, which is input into the stage-two return model to capture the heteroskedasticity of the assets' returns. We investigate different choices of mean function to describe the volatility dynamics. Empirical applications are based on the Standard and Poor's 500, Dow Jones Industrial Average and Dow Jones United States Financial Service Indices. Results show that the stage-one MCARR models using asymmetric mean functions give better in-sample model fits than those based on symmetric mean functions. They also provide better out-of-sample volatility forecasts than CARR models under two robust loss functions, with the scaled realised open-to-close volatility measure as the proxy for the unobserved true volatility. We also find that stage-two return models with constant means and multivariate Student-t errors give better in-sample fits than the Baba, Engle, Kraft, and Kroner type of generalized autoregressive conditional heteroskedasticity (BEKK-GARCH) models. Estimates and forecasts of value-at-risk (VaR) and conditional VaR based on the best MCARR-return model for each asset are provided and checked with the Kupiec test to confirm the accuracy of the VaR forecasts.
Keywords: range-based volatility, correlation, multivariate CARR-return model, value-at-risk, conditional value-at-risk
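The stage-one inputs are scaled realised Parkinson volatility measures; the underlying Parkinson range-based estimator from daily highs and lows can be sketched as follows (the paper's scaling and realised aggregation details are omitted):

```python
import math

def parkinson_variance(highs, lows):
    """Parkinson (1980) range-based variance estimate.

    highs, lows: per-period high and low prices (same length).
    Uses var = (1 / (4 ln 2)) * mean( ln(H/L)^2 ).
    """
    k = 1.0 / (4.0 * math.log(2.0))
    terms = [k * math.log(h / l) ** 2 for h, l in zip(highs, lows)]
    return sum(terms) / len(terms)
```

The estimator uses the full intraday range rather than only close-to-close returns, which is why range-based models like CARR/MCARR can be more efficient volatility proxies.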
Procedia PDF Downloads 101
16656 Analysis of Aerodynamic Forces Acting on a Train Passing Through a Tornado
Authors: Masahiro Suzuki, Nobuyuki Okura
Abstract:
The crosswind effect on ground transportation has been extensively investigated for decades. The effect of tornadoes, however, has hardly been studied, despite the fact that even heavy ground vehicles, namely trains, have been overturned by tornadoes with casualties in the past. Therefore, the aerodynamic effects of a tornado on a train were studied by several approaches in this study. First, an experimental facility was developed to clarify the aerodynamic forces acting on a vehicle running through a tornado. Our experimental set-up consists of two apparatus: a tornado simulator and a moving-model rig. PIV measurements showed that the tornado simulator can generate a swirling flow field similar to those of natural tornadoes; the flow field has a maximum tangential velocity of 7.4 m/s and a vortex core radius of 96 mm. The moving-model rig runs a 1/40-scale model train, in single-car or three-car units, through the swirling flow at speeds of up to 4.3 m/s. The model car has 72 pressure ports on its surface to estimate the aerodynamic forces. The experimental results show that the aerodynamic forces vary in magnitude and direction depending on the location of the vehicle in the flow field. Second, the aerodynamic forces on the train were estimated using the Rankine vortex model, a simple tornado model widely used in civil engineering. The estimated aerodynamic forces on the middle car were in fairly good agreement with the experimental results. The effects of the vortex core radius and the path of the train on the aerodynamic forces were investigated using the Rankine vortex model. The results show that the side and lift forces increase as the vortex core radius increases, while the yawing moment is maximum when the core radius is 0.3875 times the car length. Third, a computational simulation was conducted to clarify the flow field around the train. The simulated results qualitatively agreed with the experimental ones.
Keywords: aerodynamic force, experimental method, tornado, train
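The Rankine vortex used for the force estimates has a simple closed-form tangential velocity profile: solid-body rotation inside the core and a free (potential) vortex outside. A minimal sketch, with defaults taken from the simulator parameters quoted above (7.4 m/s, 96 mm core radius):

```python
def rankine_tangential_velocity(r, v_max=7.4, r_core=0.096):
    """Tangential velocity of a Rankine vortex at radius r (m).

    Defaults match the tornado simulator in the abstract:
    v_max = 7.4 m/s at the core radius r_core = 96 mm.
    """
    if r <= r_core:
        return v_max * r / r_core   # solid-body rotation inside the core
    return v_max * r_core / r       # free (potential) vortex outside
```

The velocity peaks exactly at the core radius, which is why the yawing moment on the train depends so strongly on the ratio of core radius to car length.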
Procedia PDF Downloads 237
16655 Calibration Model of %Titratable Acidity (Citric Acid) for Intact Tomato by Transmittance SW-NIR Spectroscopy
Authors: K. Petcharaporn, S. Kumchoo
Abstract:
Acidity (citric acid) is one of the chemical contents that indicates the internal quality and maturity index of tomato. Titratable acidity (%TA) can be predicted by a non-destructive method using transmittance short-wavelength near-infrared (SW-NIR) spectroscopy in the wavelength range of 665-955 nm. A set of 167 tomato samples, divided into a training set of 117 tomatoes and a test set of 50 tomatoes, was used to establish the calibration model to predict and measure %TA by the partial least squares regression (PLSR) technique. The spectra were pretreated with MSC, which gave the optimal calibration model (R = 0.92, RMSEC = 0.03%), and this model achieved high accuracy for %TA prediction on the test set (R = 0.81, RMSEP = 0.05%). The prediction results on the test set show that the transmittance SW-NIR spectroscopy technique can be used as a non-destructive method for %TA prediction of tomatoes.
Keywords: tomato, quality, prediction, transmittance, titratable acidity, citric acid
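The quoted figures (R = 0.92, RMSEC = 0.03%) are standard calibration metrics; their computation from reference and predicted %TA values can be sketched as follows (the PLSR fit itself is omitted, and the arrays in the test are illustrative):

```python
import math

def rmse(actual, predicted):
    """Root mean square error between reference and predicted values."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

def pearson_r(actual, predicted):
    """Pearson correlation coefficient R between reference and predicted values."""
    n = len(actual)
    ma, mp = sum(actual) / n, sum(predicted) / n
    cov = sum((a - ma) * (p - mp) for a, p in zip(actual, predicted))
    sa = math.sqrt(sum((a - ma) ** 2 for a in actual))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    return cov / (sa * sp)
```

RMSEC is this RMSE computed on the training (calibration) set and RMSEP on the independent test set, which is why the two values in the abstract differ.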
Procedia PDF Downloads 273
16654 Computational Fluid Dynamics Analysis of Convergent–Divergent Nozzle and Comparison against Theoretical and Experimental Results
Authors: Stewart A. Keir, Faik A. Hamad
Abstract:
This study uses both analytical and experimental methods of analysis to examine the accuracy of Computational Fluid Dynamics (CFD) models that can then be used for more complex analyses, accurately representing more elaborate flow phenomena such as internal shockwaves and boundary layers. The geometry used in the analytical study and the CFD model is taken from the experimental rig. The analytical study is undertaken using isentropic and adiabatic relationships, and its output, the 'shockwave location tool', is created. The results from the analytical study are then used to redesign the experimental rig for more favorable placement of pressure taps and a much better representation of the shockwaves occurring in the divergent section of the nozzle. The CFD model is then optimized through the selection of different parameters, e.g. turbulence models (Spalart-Allmaras, realizable k-epsilon and standard k-omega), in order to develop an accurate, robust model. The results from the CFD model can then be compared directly to the experimental and analytical results in order to gauge the accuracy of each method of analysis. The CFD model is used to visualize the variation of parameters such as velocity/Mach number, pressure and turbulence across the shock, and the CFD results are used to investigate the interaction between the shock wave and the boundary layer. The validated model can then be used to modify nozzle designs, which may offer better performance and ease of manufacture and may present feasible improvements to existing high-speed flow applications.
Keywords: CFD, nozzle, fluent, gas dynamics, shock-wave
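The 'shockwave location tool' rests on the standard quasi-1D isentropic and normal-shock relations; the two central formulas can be sketched as follows (γ = 1.4 for air; inverting the area relation for Mach number at a given A/A* is the tool's root-finding step, omitted here):

```python
def area_ratio(mach, gamma=1.4):
    """Isentropic area ratio A/A* for quasi-1D nozzle flow at a given Mach number."""
    t = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * mach ** 2)
    return t ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))) / mach

def shock_pressure_ratio(mach1, gamma=1.4):
    """Static pressure ratio p2/p1 across a normal shock at upstream Mach mach1."""
    return 1.0 + (2.0 * gamma / (gamma + 1.0)) * (mach1 ** 2 - 1.0)
```

A/A* = 1 at the throat (M = 1) and grows in both the subsonic and supersonic branches, so locating the shock amounts to matching the supersonic branch upstream with the subsonic branch downstream at the measured back pressure.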
Procedia PDF Downloads 234
16653 Simulation-Based Validation of Safe Human-Robot-Collaboration
Authors: Titanilla Komenda
Abstract:
Human-machine-collaboration means a direct interaction between humans and machines to fulfil specific tasks. These so-called collaborative machines are used without fencing and interact with humans in predefined workspaces. Even though human-machine-collaboration enables flexible adaptation to variable degrees of freedom, industrial applications are rarely found. The reasons for this are not a lack of technical progress but rather limitations in the planning processes that must ensure safety for operators. Until now, humans and machines have mainly been considered separately in the planning process, focusing on ergonomics and on system performance, respectively. Within human-machine-collaboration, these aspects must not be seen in isolation from each other but need to be analysed in interaction. Furthermore, a simulation model is needed that can validate the system performance and ensure the operator's safety at any given time. Following on from this, a holistic simulation model is presented, enabling a simulative representation of collaborative tasks, including both humans and machines. The presented model includes not only a geometry and a motion model of interacting humans and machines but also a numerical behaviour model of humans as well as a Boolean probabilistic sensor model. With this, error scenarios can be simulated by validating system behaviour in unplanned situations. As these models can be defined on the basis of Failure Mode and Effects Analysis as well as probabilities of errors, their implementation in a collaborative model is discussed and evaluated regarding limitations and simulation times. The functionality of the model is shown on industrial applications by comparing simulation results with video data. The analysis shows the impact of considering human factors in the planning process, in contrast to only meeting system performance. In this sense, an optimisation function is presented that meets the trade-off between human and machine factors and aids in a successful and safe realisation of collaborative scenarios.
Keywords: human-machine-system, human-robot-collaboration, safety, simulation
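A Boolean probabilistic sensor model of the kind described can be exercised with a simple Monte Carlo draw; a minimal sketch, assuming a single per-cycle detection probability taken from an FMEA entry (the probability value, trial count and function name are illustrative):

```python
import random

def simulate_missed_detections(p_detect, n_trials, seed=0):
    """Estimate how often a Boolean sensor misses a human in the workspace.

    p_detect: probability the sensor reports 'human present' when one is there
              (e.g. taken from an FMEA failure-probability entry).
    Returns the empirical miss rate over n_trials independent cycles.
    """
    rng = random.Random(seed)  # fixed seed for reproducible error scenarios
    misses = sum(1 for _ in range(n_trials) if rng.random() > p_detect)
    return misses / n_trials
```

Running such draws inside the holistic simulation is what turns single FMEA probabilities into full error scenarios whose system-level consequences can be validated.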
Procedia PDF Downloads 361
16652 Investigating a Deterrence Function for Work Trips for Perth Metropolitan Area
Authors: Ali Raouli, Amin Chegenizadeh, Hamid Nikraz
Abstract:
The Perth metropolitan area and its surrounding regions have been expanding rapidly in recent decades, and this growth is expected to continue in the years to come. With this rapid growth and the resulting increase in population, consideration should be given to strategic planning and modelling for the future expansion of Perth. The accurate estimation of projected traffic volumes has always been a major concern for transport modellers and planners. The development of a reliable strategic transport model depends significantly on the input data and on the calibrated parameters that make the model reflect the existing situation. Trip distribution is the second step in four-step modelling (FSM) and is complex due to its behavioural nature. The gravity model is the most common method for trip distribution. The spatial separation between origin and destination (OD) zones is reflected in the gravity model by applying deterrence functions, which provide an opportunity to include people's behaviour in choosing their destinations based on the distance, time and cost of their journeys. Deterrence functions play an important role in distributing trips within a study area and simulate trip distances; they should therefore be calibrated for any particular strategic transport model to correctly reflect trip behaviour within the modelled area. This paper reviews the most common deterrence functions and proposes a calibrated deterrence function for work trips within the Perth Metropolitan Area, based on information obtained from the latest available household data and Perth and Region Travel Survey (PARTS) data. As part of this study, a four-step transport model has been developed for the Perth Metropolitan Area using EMME software to assist with the analysis and findings.
Keywords: deterrence function, four-step modelling, origin destination, transport model
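The role of a deterrence function inside the gravity model can be sketched as follows; this is a singly (production) constrained form with a negative-exponential deterrence, and the β value is an illustrative placeholder, not the calibrated Perth parameter:

```python
import math

def deterrence(cost, beta=0.1):
    """Negative-exponential deterrence f(c) = exp(-beta * c); beta is calibrated."""
    return math.exp(-beta * cost)

def distribute_trips(origins, destinations, costs, beta=0.1):
    """Singly (production) constrained gravity model.

    origins[i]: trips produced by zone i; destinations[j]: attraction of zone j;
    costs[i][j]: generalised travel cost from i to j.
    Returns the trip matrix T[i][j]; each row sums to origins[i].
    """
    trips = []
    for i, o in enumerate(origins):
        weights = [d * deterrence(costs[i][j], beta)
                   for j, d in enumerate(destinations)]
        total = sum(weights)
        trips.append([o * w / total for w in weights])
    return trips
```

Calibration then amounts to choosing β (or a different functional form) so that the modelled trip-length distribution matches the household-survey distribution.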
Procedia PDF Downloads 168
16651 Development of a Tilt-Rotor Aircraft Model Using System Identification Technique
Authors: Ferdinando Montemari, Antonio Vitale, Nicola Genito, Giovanni Cuciniello
Abstract:
The introduction of tilt-rotor aircraft into the existing civilian air transportation system will provide beneficial effects, owing to the tilt-rotor's capability to combine the characteristics of a helicopter and a fixed-wing aircraft in one vehicle. The availability of reliable tilt-rotor simulation models supports the development of such a vehicle. Indeed, simulation models are required to design automatic control systems that increase safety, reduce the pilot's workload and stress, and ensure the optimal aircraft configuration with respect to flight-envelope limits, especially during the most critical flight phases, such as conversion from helicopter to aircraft mode and vice versa. This article presents a process for building a simplified tilt-rotor simulation model derived from the analysis of flight data. The model aims to reproduce the complex dynamics of the tilt-rotor during the in-flight conversion phase. It uses a set of scheduled linear transfer functions to relate the autopilot reference inputs to the most relevant rigid-body state variables. The model also computes information about the rotor flapping dynamics, which is useful for evaluating the aircraft control margin in terms of rotor collective and cyclic commands. The rotor flapping model is derived through a mixed theoretical-empirical approach, which includes physical analytical equations (applicable to the helicopter configuration) and parametric corrective functions; the latter are introduced to best fit the actual rotor behaviour and balance the differences between helicopter and tilt-rotor in flight. Time-domain system identification from flight data is exploited to optimize the model structure and to estimate the model parameters. The presented model-building process was applied to simulated flight data of the ERICA tilt-rotor, generated using a high-fidelity simulation model implemented in the FlightLab environment. The validation of the obtained model was very satisfactory, confirming the validity of the proposed approach.
Keywords: flapping dynamics, flight dynamics, system identification, tilt-rotor modeling and simulation
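Time-domain identification of a linear transfer function reduces, in its simplest discrete form, to a least-squares fit; a minimal sketch for a first-order model y[k+1] = a*y[k] + b*u[k] (a stand-in for illustration, not the ERICA model structure, which schedules such fits over the conversion corridor):

```python
def identify_first_order(u, y):
    """Least-squares estimate of (a, b) in y[k+1] = a*y[k] + b*u[k].

    u, y: equal-length input and output sequences.
    Solves the 2x2 normal equations directly.
    """
    m = len(y) - 1
    s_yy = sum(y[k] * y[k] for k in range(m))
    s_yu = sum(y[k] * u[k] for k in range(m))
    s_uu = sum(u[k] * u[k] for k in range(m))
    r_y = sum(y[k + 1] * y[k] for k in range(m))
    r_u = sum(y[k + 1] * u[k] for k in range(m))
    det = s_yy * s_uu - s_yu ** 2
    a = (r_y * s_uu - r_u * s_yu) / det
    b = (r_u * s_yy - r_y * s_yu) / det
    return a, b
```

On noise-free data generated by such a model, the fit recovers the parameters exactly; with flight data, the residuals drive the choice of model structure, as in the abstract.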
Procedia PDF Downloads 201
16650 A Simple Finite Element Method for Glioma Tumor Growth Model with Density Dependent Diffusion
Authors: Shangerganesh Lingeshwaran
Abstract:
In this presentation, we perform numerical simulations of a reaction-diffusion equation with various nonlinear density-dependent diffusion operators and proliferation functions. The mathematical model, a parabolic partial differential equation, is considered in order to study the invasion of gliomas (the most common type of brain tumor) and to describe the growth of cancer cells and their response to treatment. The unknown quantity of the reaction-diffusion equation is the density of cancer cells, and the mathematical model is based on the proliferation and migration of glioma cells. A standard Galerkin finite element method is used to perform the numerical simulations of the model. Finally, important observations on each of the nonlinear diffusion and proliferation functions are presented with the help of the computational results.
Keywords: glioma invasion, nonlinear diffusion, reaction-diffusion, finite element method
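The authors discretise with a Galerkin FEM; as a simpler stand-in that exposes the same ingredients, one explicit finite-difference step of a 1D reaction-diffusion equation with a representative density-dependent diffusion D(u) = d0*u and logistic proliferation can be sketched as (coefficients are illustrative, not the paper's):

```python
def step(u, dx, dt, d0=0.01, rho=1.0):
    """One explicit step of u_t = d/dx( D(u) du/dx ) + rho*u*(1-u), D(u) = d0*u.

    u: cell-density samples on a uniform 1D grid; boundaries are held fixed.
    """
    n = len(u)
    new = u[:]
    for i in range(1, n - 1):
        d_right = d0 * 0.5 * (u[i] + u[i + 1])  # face-averaged diffusivity
        d_left = d0 * 0.5 * (u[i] + u[i - 1])
        flux = (d_right * (u[i + 1] - u[i])
                - d_left * (u[i] - u[i - 1])) / dx ** 2
        new[i] = u[i] + dt * (flux + rho * u[i] * (1.0 - u[i]))
    return new
```

The two model ingredients varied in the paper are visible here: the nonlinear diffusion coefficient D(u) inside the flux and the proliferation term rho*u*(1-u), either of which can be swapped for other nonlinear choices.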
Procedia PDF Downloads 233
16649 Improving the Employee Transfer Experience within an Organization
Authors: Drew Fockler
Abstract:
This research examines how to improve an employee's experience when transferring between departments within an organization. It includes a historical review of a Canadian retail organization; based on this review, gaps are identified between current and future visions, showing where problems with existing training and development practices need to be resolved to reduce front-line employee turnover within the organization. The strategies in this paper support leaders through the LEAD (Listen, Explore, Act and Develop) Change Management Model, which supports the change process. This research proposes three possible solutions for improving the experience of an employee transferring between departments. The best solution to this problem is creating a Training Manager position within the retail store. A Training Manager could support both employees and leadership with the training and development of staff who move between departments. Within this research, an implementation plan using the TransX Model was created. The TransX Model is a hybrid of Leader-Member Exchange Theory and Transformational Leadership Theory that facilitates organizational change by creating a common vision. Finally, this research provides next steps as well as future considerations to enhance the Training Manager role within an organization.
Keywords: employee transfers, employee engagement, human resources, employee induction, TransX model, lead change management model
Procedia PDF Downloads 77